AREF bypass capacitance on ATMega2560?

It seems the prices have come down in recent years, but still, the parts I have seen have no Flash. So you need to add in that cost. But the Sigma parts aren't really general purpose. They are good if you can make your app fit the DSP design, otherwise they aren't much use. I pursued them hard a few years ago until an FAE just threw in the towel and said I couldn't do my app on their part.

That makes no sense. There will always be some designs that a given part is a perfect fit for, but that doesn't mean different devices can't be compared. The question is what is the best fit for a given job. I am hearing some say that FPGAs aren't the best fit and I find they often are a better fit than an MCU. Much of it has to do with misinformation about what FPGAs can and can't do and what is required to make them run. Just read Joerg's post. Much of the stuff he objects to is specific to the individual devices he has worked with.

I have not found a big difference in software. The software is different, but those differences are not important. It all compiles my HDL fine (mostly because they often use the same third party tool vendors) and simulation just works these days.

I don't know what devices you work with, but the ones I use are easy to program.

The CPU is the easy part to port, the compiler handles that for you. It is the drivers for the I/O that is harder. Their libraries have to have compatible interfaces and every port is a port. With FPGAs, all you need to do to switch between brands is normally a new pin list and timing constraints. The HDL just compiles to suit the new device. It has been a while since I ported between brands but it would make sense if they provide tools to port the timing constraints. That is the only part that might be any work at all.
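To make that concrete, here is a minimal sketch of the portable part (the entity name, clock rate, and divider value are made up for illustration, not from any actual design). Nothing in the HDL names a device; the pin list and timing constraints live in a separate vendor-specific file, which is the only thing that changes when retargeting between brands.

```vhdl
-- Hypothetical vendor-neutral top level. This compiles unchanged for
-- any FPGA brand; only the separate constraint file binding clk and
-- led to package pins (and stating the clock period) is device-specific.
library ieee;
use ieee.std_logic_1164.all;

entity blinker is
  generic (DIVIDE : natural := 32_000_000);  -- assumed 32 MHz input clock
  port (
    clk : in  std_logic;
    led : out std_logic
  );
end entity;

architecture rtl of blinker is
  signal count : natural range 0 to DIVIDE - 1 := 0;
  signal state : std_logic := '0';
begin
  process (clk)
  begin
    if rising_edge(clk) then
      if count = DIVIDE - 1 then
        count <= 0;
        state <= not state;  -- toggle the LED once per second
      else
        count <= count + 1;
      end if;
    end if;
  end process;
  led <= state;
end architecture;
```

Porting this to a different brand would mean rewriting only the pin and clock constraints in the new vendor's format, then recompiling.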

Care to elaborate?

You can find a small number of DSPs with CD quality CODECs and the same for MCUs. I know, I did this search recently. I didn't find much and none that suited my other criteria. So the redo of my board will likely have another FPGA on it.

I would appreciate a list of the MCUs/DSPs which have stereo CD quality CODECs on chip. The Sigma parts from ADI don't count because their DSPs can *only* be used for certain coding like filters, not general purpose use.

You are mixing apples and oranges. One manufacturer has many different families of FPGAs, no? Some are huge power hungry devices that burn a hole in your board. Others are much lower power and don't burn a hole in your pocketbook either.

--

Rick
Reply to
rickman

Nonsense. I constantly look for MCU solutions for my designs.

I won't argue that. But I don't consider CPLDs in the same vein as FPGAs, but you are right, the distinction is blurring.

That is the sort of thinking that is just a pair of blinders. I don't care if the real estate is "expensive". I care about my system cost. Gates in an FPGA are very *inexpensive*. If I want to use them for a soft core CPU that is just as good a use as a USB or SPI interface.

Your opinion. I don't sell 100,000 quantities, so the prices I get at Digikey are often competitive with the other distis. Certainly they give you a ball park number for comparison purposes. The point is that with FPGAs, *no one* gives you a good price unless you get the manufacturer involved. That is one down side to FPGAs.

--

Rick
Reply to
rickman

At Digikey search on iCE40. That will give you a listing of all the parts. Looks like Digikey's prices are a little high, not just because they are Digikey. The $3 figure I gave is from quotes I got from Silicon Blue before they were sold through Digikey. Like I said, you need to get a quote on any FPGA to find the "real" price.

I don't think it is all about the "haggle". I think no small part is about doing what it takes to get the design in and deny the competition. When you get quotes from the Digikey web site they never even know you are looking at their parts. Think of it as a live auction vs. sealed bids. They prefer to be in the live auction.

--

Rick
Reply to
rickman

1k qty is $3.67 at Digikey:

formatting link

$3.02 with 12wks leadtime at Arrow:

formatting link

ROM is included.

It is a $3 part. See above.

Sure, that's why I wrote "powerful". The less powerful ones have never impressed me much but I have to admit that I haven't looked in the last five years. So, tell us, which FPGA can fully replace the above DSP for three bucks and where could one buy it off the shelf?

No, what I am saying is that it doesn't matter much which parts I use (my clients decide that anyhow) but that finding a programmer locally can be important. Some projects will not really get off the ground if the programmer isn't local. Or it can take forever.

I do disagree.

Well, if you are standing next to a huge roaring engine and this, that and the other subtle disturbance has to be ironed out it would be a major problem if the programmer is three time zones away. You don't have to believe me but that's how it is with some of my assignments.

Just like EMC jobs on large gear can simply not be done remotely. Which is why I bought the Signalhound analyzer (fits into carry-on).

That is undoubtedly true. Plus probably less in errata headaches.

[...]

So far the next project on the books that will contain logic is 2nd half of next year. It will need a 16-bit audio codec as well but we can use an external chip for that, PCM29xx series et cetera. Would (probably) even keep it external with a uC because the 16-bit ADCs on those aren't very quiet.

Mine are never less than 12-bit, often 16-bit.

Well, give us all here an example of an FPGA that can fully emulate a TMS320 and costs $3 :-)

I guess then I don't see your point. Flash is available on almost everything nowadays.

--
Regards, Joerg 

http://www.analogconsultants.com/
Reply to
Joerg

Which part number?

I don't see how the equivalent of a TMS320 or a big MSP430 could fit into one of these small Lattice devices.

I don't play those games. Or only once every 20-some years when buying a car, and then the car dealer may develop an ulcer when we are done.

Well, most design engineers don't. TTI lost a deal on capacitors last week because they wouldn't let me look at the details of a cap without entering some stupid captcha. Which as usual didn't work. I moved on and found it elsewhere. I simply do not have the time for such games.

--
Regards, Joerg 

http://www.analogconsultants.com/
Reply to
Joerg

Interjecting what little experience I have here;

MAX7000s are still available, and they were introduced in 1995;

formatting link
although I think they've finally gone out of stock (not that you'd want to use them, they also guzzle power like they're NMOS).

FLEX 10K FPGAs don't appear on their website, but they're still quite available. Can't seem to find when they were introduced? I'm seeing documents since at least 1996 referencing them.

At least between the big two (Altera that I know of, and I would assume Xilinx too), FPGAs look to have excellent support over time.

And VHDL doesn't age. The library blocks might, but I'd be willing to guess that those are synthesized as well, so as long as your toolchain supports whatever code format, you can migrate chips easily when they do finally die.
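A minimal sketch of what "the library blocks are synthesized as well" looks like in practice (the entity name and generics here are illustrative, not from any vendor library): a RAM described purely behaviorally, with no vendor primitive, which any mainstream toolchain should infer into its own block RAM. Code like this migrates between chips, and between vendors, untouched.

```vhdl
-- Inferred single-port RAM, no vendor primitives. The synthesizer maps
-- it to whatever block RAM the target device has. Defaults give
-- 512 x 16, a typical block RAM size; both are hypothetical choices.
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity inferred_ram is
  generic (
    ADDR_BITS : positive := 9;
    DATA_BITS : positive := 16
  );
  port (
    clk  : in  std_logic;
    we   : in  std_logic;
    addr : in  unsigned(ADDR_BITS - 1 downto 0);
    din  : in  std_logic_vector(DATA_BITS - 1 downto 0);
    dout : out std_logic_vector(DATA_BITS - 1 downto 0)
  );
end entity;

architecture rtl of inferred_ram is
  type ram_t is array (0 to 2**ADDR_BITS - 1)
    of std_logic_vector(DATA_BITS - 1 downto 0);
  signal ram : ram_t;
begin
  process (clk)
  begin
    if rising_edge(clk) then
      if we = '1' then
        ram(to_integer(addr)) <= din;
      end if;
      dout <= ram(to_integer(addr));  -- registered (synchronous) read
    end if;
  end process;
end architecture;
```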

And really, if you have to support something for over 20 years, it's probably time it does die and gets a redesign. Like that VAX or whatever it was NASA's still supporting.

Tim

--
Deep Friar: a very philosophical monk. 
Website: http://seventransistorlabs.com
Reply to
Tim Williams

Ok, sorry, forgot to say that I never use that manufacturer.

But the minute a footprint changes or you have to re-compile you are screwed in some heavily regulated markets.

Nope. Not if it's in the worlds of medical or aerospace. There you have a huge re-cert effort on your hands for changes. New layout? Back to the end of the line.

You also need to support very old legacy equipment with type-certified spare parts and for those few sales you really don't want a re-cert. Just think about the DC-3 which is still in service. Some of those are over 60 years old. Can be worse with elevators. We have a company near here that sometimes has to support elevators that are over a century old.

Sometimes changing is very time consuming. I recently learned that this is even the case for alarm systems. "If we even add as much as one capacitor for EMC we have to go through the whole insurer certification process again".

--
Regards, Joerg 

http://www.analogconsultants.com/
Reply to
Joerg

That is not always true (at least for medical equipment, no experience with aerospace). If the change is minor enough, it may be enough to write a rationale that explains the change and how it does not impact the function of the equipment. If the notified body agrees with the rationale, only a limited effort is required to re-cert.

Weird, I would expect a similar approach with a rationale or something would be enough.

--
Stef    (remove caps, dashes and .invalid from e-mail address to reply by mail) 

Reply to
Stef

Yes, and those are quite closely dependent on the amount of silicon used.

OK. That GA144 you mentioned has 144 CPU nodes made in a rather old process technology (0.18 micron, I guess 1990's vintage). They still manage to run the thing at 700+ MHz, keep power consumption to around 0.5 W with all CPUs running full speed, and sell it for $20 in small quantity. Can you do anything like that with an FPGA? What will it cost? How much power will it use? I'll accept the b16 as a comparable processor to a GA144 node. Bernd's paper mentions the b16 ran at 25 MHz in a Flex10K30E, roughly a 30-to-1 slowdown, power consumption not mentioned. But I don't know how the underlying silicon processes compare.
Reply to
Paul Rubin

I really doubt they would agree if a code re-compilation was required to make this work. With code and firmware they have become very careful because there have been too many mishaps.

Most of the time the notified bodies or even the FDA do not care much about the code, they care about your process. So then the onus is on the company, and there mostly on the VP of Quality Control. He or she will normally not take a re-compile lightly, or as something that can be brushed under the carpet as "not too risky".

It is the same with some hardware. I went through a whole re-cert once just because we had to switch the manufacturer for one little transformer.

The bottom line is that in the unlikely but possible situation where something bad happens you need to be prepared. Then there will be a barrage of requests for documents from the regression testing and all that. Woe to those who then don't have them.

There are many other markets with similar requirements. One of them is railroad electronics, especially for countries like Germany.

--
Regards, Joerg 

http://www.analogconsultants.com/
Reply to
Joerg

Nonsense. If you want to know how much water has collected in the basement because of a burst pipe, do you call the water authority to read the meter? No, you put a stick in the water and measure it. If you want to know a parameter, then measure that parameter, don't infer it from something only vaguely related.

You have a bias against soft cores because you want to analyze them in a meaningless way. How about analyzing them in the terms that you care about?

Like what exactly? Do 700 MIPS? Of course you can. An FPGA can be configured to run your algorithm more directly than any processor, and so it can get very low power.

BTW, you know the GA144 doesn't do 700 MIPS either. It is less than half that with most code. The GA144 isn't 0.5 Watts either, it is close to 1 Watt with all nodes running. It also doesn't cost $20 to use because it requires a *ton* of support devices: boot PROM, RAM, clock, 1.8 volt to *everything else* voltage translation, etc...

I actually considered using it in my board redesign. I might have to add a RAM chip to it, but all the clocks are external to the board anyway and there is already a low voltage power supply. So the main issue is the voltage translation which is partly dealt with currently since the current FPGA had to be buffered to some of the I/O for 5 volt logic. So the GA144 might do ok in that design. But then there is the reason I am doing a redesign... the FPGA is EOL. I don't have much confidence GA will be around in 10 years. Do you know of one major design win they have had?

You haven't told me what the design requirements are... how can I possibly give you a price?

How long is a piece of string?

You are trying to compare apples to horses. No, you can't use an FPGA to implement some existing processor and improve on cost, power or any other parameter. I never said you could. That would be like using a kitchen knife as a razor. It won't work so well and has little value. But if you have an application, it may well be easier to implement in an FPGA than in a GA144... in fact, I can almost guarantee that!

--

Rick
Reply to
rickman

Partly I suppose.

Or I could say that my projects so far all require a microcontroller anyway, and it seemed likely that a separate FPGA was always going to be more expensive than, say, choosing a faster CPU.

A STM32F4 can bitbang a PIO at 84MHz. (It can't do anything else then, but still...)

OK, thanks, will check them out.

--

John Devereux
Reply to
John Devereux

I won't pretend that an FPGA is the right solution for every task. But I think MCUs are often used because that is what the designer is used to and FPGAs aren't understood well enough to consider. Is "enough" RAM more than what a given FPGA has? I don't know, how much RAM do you really need? Most MCU projects I have worked on never had a realistic RAM estimate, it was all by the seat of the pants. The fact that code uses RAM makes it harder to estimate. FPGAs are a lot easier to design with in that regard. RAM quantities have to be known exactly. LUT counts have to be estimated though, so it's not totally different.

Or even an AVR... are you reading, Ulf? I think the requirements for MCUs are often overstated. Most of the sort of work I do could be done with an 8051 (ugh!), especially one of the higher performance devices, but I often don't have the real estate for a separate MCU unless I can treat it as an I/O expander.

Hell, I'd be ecstatic if they provided FPGAs in small enough packages so I can use a 32 pin QFN for an MCU and the same footprint for an FPGA. Well, Lattice *does* put an XO2 in a 32 QFN, but only 256 LUTs, which is not big enough for much. Why not 1 or 2 or 4 kLUT? For some reason FPGA vendors all think you need more I/O and fewer LUTs.

I got a freebie eval board for the iCE40 but haven't fired it up. I want to measure some power consumption numbers. The data sheets changed the static current spec a while back, well after the parts had been out and just after Lattice bought SiliconBlue, so I'm not sure what that was about. The 1 kLUT part went from around 40 uA to 100 uA quiescent current. The dynamic current is still very low though, single digit mA with the device full of 16-bit counters running at 32 MHz. But they seem to have removed that data when they changed data sheet formats.
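For what it's worth, a "device full of 16-bit counters" dynamic-power load of the kind described above could be sketched roughly like this (the entity, N_COUNTERS, and port names are made up for illustration; scale the generic to fill whatever part is being measured):

```vhdl
-- Hypothetical power-measurement load: a bank of free-running 16-bit
-- counters, all clocked at the test frequency (e.g. 32 MHz), XORed
-- together into an output so the synthesizer cannot trim any of them.
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity power_load is
  generic (N_COUNTERS : positive := 64);  -- pick to fill the device
  port (
    clk : in  std_logic;                       -- assumed 32 MHz
    q   : out std_logic_vector(15 downto 0)
  );
end entity;

architecture rtl of power_load is
  type cnt_array is array (0 to N_COUNTERS - 1) of unsigned(15 downto 0);
  signal cnt : cnt_array := (others => (others => '0'));
begin
  process (clk)
  begin
    if rising_edge(clk) then
      for i in cnt'range loop
        -- different increments so the counters don't toggle in lockstep
        cnt(i) <= cnt(i) + to_unsigned(i + 1, 16);
      end loop;
    end if;
  end process;

  -- XOR-reduce all counters to one visible output
  process (cnt)
    variable acc : unsigned(15 downto 0);
  begin
    acc := (others => '0');
    for i in cnt'range loop
      acc := acc xor cnt(i);
    end loop;
    q <= std_logic_vector(acc);
  end process;
end architecture;
```

Reading the supply current with and without clk running separates quiescent from dynamic draw.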

--

Rick
Reply to
rickman

Good grief. The issue wasn't to show YOU that YOUR application was better in a DSP. Like many FPGA weenies, you're trying to sell a part that has a niche market as the universal hammer.

Hammer, meet nail.

That is *NOT* what you're arguing. You're making the general case that FPGA >> DSP >> uC, which is just silly.

Hammer, meet nail.

Nonsense.

I have.

Like DSPs. I agree with him. FPGAs aren't in his future. You keep sugar-coating FPGAs and (erroneously) tearing down DSPs. Note that I'm more of an FPGA kind of guy than a DSP sort, but in this case Joerg is absolutely right. FPGAs only compete in small niche markets and those where money is no object.

The software is different in how it works, not what it does. That difference makes *NO* difference to the end result or the cost of the product. IOW, it's completely irrelevant. At one time it may have been important but only in so much as that much of it didn't work (making the hardware useless).

Pile on more sugar. You clearly don't work where time is money.

That's all included in the port. I'm talking from working hardware to working hardware (the target system not qualified, of course). There is only about 10% of the code that even has to be looked at.

Wrong. That's all included.

Bullshit! More sugar!

Oh, you never use libraries? Yet you (erroneously) add that cost into the DSP/uC bucket.

You've TOTALLY forgotten about simulation, for instance. That's a huge effort that you simply sweep under the rug.

Goal post shift added to the hammer.

Sigmas have them. I haven't looked for others.

The families all look the same and vary only in density and mix of memory, speed, MCU, DSP(hmm), and other features.

Good grief, you're arguing both sides.

Reply to
krw

You certainly don't look very hard. I keep looking for FPGA solutions and haven't found one yet. ;-)

The architecture is the same. They're the same. OK, let me ask the question(s) I've asked every one of the FPGA suppliers; Define FPGA. Define CPLD. They can't. It *IS* marketing.

Part cost ~= system cost. MCUs are so cheap that any soft core is useless. The development costs are a lot less, too. The tool chains for the embedded stuff suck.

Good grief. Do you have diabetes?

If you're buying no more than 1K pieces, you're sorta stuck with the DigiKeys of the world. I am for prototypes, though I build as many prototypes as I did in production at my last job. ;-)

Sure, I'll buy that but it just solidifies the fact that FPGAs really aren't mainstream components. They are a niche and probably always will be, unfortunately.

Reply to
krw

BTW, watch the TMS320 5000 series parts. The DMA is seriously broken if you're using the BSP (I2S/TDM interfaces). The McBSP sucks, too, but that's a different issue. Last I knew they had no intention of fixing I2S/TDM DMA, either.

Reply to
krw

Like that new kid on the block, the 8051?

Reply to
krw

I think what you are saying is that the MCU is a key part of your design and you use a lot of code in it. Ok, if your emphasis in on using a commercial MCU that will do the job. But unless your MCU needs are just too large for something that fits in an FPGA, you have it backwards in my opinion. Why have both when you can just use an FPGA?

I haven't gotten a quote on these parts since they were bought by Lattice. I'd appreciate a pricing update if you get one. They should be able to do a lot better than the Digikey price, I know Xilinx and Altera always do. Heck, the Digikey pricing for most FPGAs doesn't go above qty 1... if nothing else there should be some quantity price breaks.

--

Rick
Reply to
rickman

Good grief is right. You don't need to be rude. It isn't just my application, the Sigma parts are designed for a very limited set of DSP apps and even the development software limits how you design with them. They won't do the job of *most* DSP apps.

If you don't want to discuss engineering, then please spare me.

You are repeating yourself.

No one is tearing down DSPs. Can you just stick to the engineering and skip the drama?

Your statement is exactly the sort of "mis-information" I am talking about. At $3 I think you can use an FPGA in a low cost app. So your "money is no object" claim is just BS.

That is what I am saying. I find little difference in how the tools work. You write your HDL in your editor or their built-in editor, you simulate it using the free tool they provide and you compile to a bit stream that gets downloaded into the FPGA, Flash or RAM. No, the tools aren't going to look exactly the same, but they do the same job and work the same way. Most of the tool is actually third party anyway, except for the Xilinx in-house HDL compiler. But then most FPGA professionals (read as working for a company that has a few bucks) pay for the third party tools anyway.

Ok, very convincing argument. I have no idea what you are talking about. Downloading an FPGA is no different than an MCU. You either attach a cable for JTAG or your use the resources you designed into your target.

With FPGAs *none* of the code has to be looked at.

You have identical peripheral interface libraries for different brands of MCUs? Every timer function works the same, every SPI port sets up the same, every power controller is operated the same?

You are the consummate debater...

The only libraries I use are the HDL libraries which are standardized, like using stdio. I don't add the libraries into the MCU column, I'm not the one using the MCU.

What about it? I paid for a set of tools from Lattice. I had used the Modelsim simulator at work and was used to it. The Lattice tools said they came with the Modelsim simulator. But by the time I got the package it had the Active HDL simulator. I complained about this thinking I would have to learn a new simulator... but it was a no-op to switch. Even my Modelsim scripts ran under the AHDL simulator.

So what is your concern?

But Sigmas aren't general purpose DSPs and can't do most DSP jobs. They are designed to be filters, like in hearing aids.

We've been down the road before. You are not enjoyable to discuss things with. You get obnoxious and don't explain what you are talking about. Do you really want to have this discussion? I don't think I do.

--

Rick
Reply to
rickman

Yes, basically. "a lot" being only e.g. about 64k probably, not much for a MCU but would push the price up for an FPGA I think.

I'm pretty sure that a FPGA with enough RAM would be far too expensive (compared to the $3 200 MIPS CPU).

A M3 or M4 with attached FPGA + memories would be interesting, if it was at a reasonable price.

NXP have a M4 with attached M0 which sort of goes in that direction; the M0 does the more deterministic simple stuff, the M4 does the number crunching and runs the more complicated software.

[...]

Unfortunately I don't really have a live application, so would only be able to buy them as "education" at this stage.

--

John Devereux
Reply to
John Devereux
