Interesting article about FPGAs

A good read for anyone interested in FPGAs

http://www.embedded.com/showArticle.jhtml?articleID=15201141

--
jakab
Reply to
jakab tanko

That was a good read. Of course, anything that says FPGA designers will have a bright and glorious future is always nice ;-)

Thanks jakab

Reply to
Vinh Pham

From the article: "Antifuse also has some power consumption advantages over SRAM."

Can anybody explain that to me? Perhaps that refers to static current?

[The SRAM part of an SRAM FPGA doesn't change during normal operation, so the f part of C*V^2*f is 0.]
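
A quick back-of-the-envelope illustration of that point, in Python (all numbers invented for illustration, not datasheet values):

    # Classic CMOS switching power for one net: P = C * V^2 * f
    def dynamic_power(c_farads, v_volts, f_hertz):
        return c_farads * v_volts ** 2 * f_hertz

    logic_net  = dynamic_power(50e-15, 1.5, 100e6)  # user net toggling at 100 MHz
    config_bit = dynamic_power(50e-15, 1.5, 0)      # config SRAM never toggles, so f = 0
    print(logic_net, config_bit)                    # the config bit burns no dynamic power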

-- The suespammers.org mail server is located in California. So are all my other mailboxes. Please do not send unsolicited bulk e-mail or unsolicited commercial e-mail to my suespammers.org address or any of my other addresses. These are my opinions, not necessarily my employer's. I hate spam.

Reply to
Hal Murray

Hal,

Here's my best guess for why an anti-fuse FPGA could have lower dynamic power (I don't know if this is the case): If I recall correctly, anti-fuse FPGAs can do some forms of switching with direct metal-to-metal connection through anti-fuse vias. In an SRAM FPGA, all switching must be done through SRAM-controlled multiplexers & buffers. The additional switches in a given route increase the C and (maybe) add more crowbar current during switching. Also, many of the transistors in the muxing fabric are off during operation. So you've got more transistors (higher static power) and more dynamic power.
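
To make the shape of that capacitance argument concrete, here's a crude model in Python -- every value is an invented assumption, just to show the effect:

    # Crude model: switching energy scales with the total C hanging off a route.
    WIRE_C   = 100e-15  # assumed C of one metal segment (F)
    SWITCH_C = 30e-15   # assumed parasitic C per SRAM-controlled switch/buffer (F)
    FUSE_C   = 5e-15    # assumed parasitic C per programmed antifuse via (F)
    V = 1.5             # assumed core voltage (V)

    def energy_per_transition(segments, switch_points, c_per_switch):
        c_total = segments * WIRE_C + switch_points * c_per_switch
        return c_total * V ** 2

    sram = energy_per_transition(4, 4, SWITCH_C)
    fuse = energy_per_transition(4, 4, FUSE_C)
    print(f"SRAM route burns {sram / fuse:.2f}x the energy of the antifuse route")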

- Paul

Reply to
Paul Leventis

There's one thing Zeidman mentions that I feel very strongly about. In fact, I've been repeating a phrase very similar to the one he used in the article for about four years now: WE NEED BETTER TOOLS.

I sincerely think that FPGA manufacturers need to figure out a way to get out of the tool business and pour their resources into making better chips. Open it up to the masses. The first company to do that is bound to take a huge bite out of the others' market share simply because of all the tools that will be offered by the engineering community. There's probably a ton of very capable and creative guys out there who are ready, eager and able to create wonderful new --and very powerful-- tools with which to craft designs. Instead, we are gang-chained to the vision and approach offered by the vendors.

Well-meaning as they might be, I doubt that any real progress will come out of their shops. Just incremental improvements, but that's it. For example: Why is it that I can't use a farm of twenty PCs to compile a complex design quickly?

When your very survival as a business is predicated upon how well your product performs in a free market, the best tools tend to float to the top and the others fizzle. No FPGA manufacturer's survival, at the current stage of things, depends on the quality of their tools. As long as they are adequate, engineers make them work. What we end-users want are the chips, so we put up with whatever we are forced to use in order to put the chips on our boards. If a better tool appeared we'd probably drop the current offerings in a nanosecond.

--
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Martin Euredjian

To send private email:
0_0_0_0_@pacbell.net
where
"0_0_0_0_"  =  "martineu"






"jakab tanko"  wrote in message
news:bljsjs$4me$1@news.storm.ca...
> A good read for anyone interested in FPGAs
> http://www.embedded.com/showArticle.jhtml?articleID=15201141
> ---
> jakab
>
>
Reply to
Martin Euredjian

And if they are inadequate, engineers choose another vendor - so FPGA vendors are actually VERY dependent on tool quality.

It depends very much where in the tool chain you mean. For front-end, silicon-independent compilers/simulators, there is scope for any tool chain to create a connectivity file. What you need here is enough competent designers, with enough spare time....

For back-end tools, and some IP generators, that have to map closely to the silicon, it is not practical to third-party that. Tools would come out MUCH later, or the silicon would have to 'freeze' features.

If there are features you want to see, then talk with the FPGA vendors....

-jg

Reply to
Jim Granville

I don't think you got my point. Tools today are adequate. And that's it. Without competition the rate of progress is slow. So everyone deals with it and life goes on. But you only have one choice per vendor. So, as long as the pain/results threshold is on the correct side of the scale, vendors are safe and few are going to switch chips and redo complete designs just 'cause of the tools. In some companies, managers would let an engineer pop a blood vessel before incurring the additional cost of a redesign just 'cause the FPGA guy is complaining. If there were real competition, things would be different.

Take hierarchical floorplanning, for example. Badly needed. Nowhere in sight. Or more intelligent routers. Or RPMs that contain more than just placement. How about parallel processing (network-based, a bunch of cheap computers) to speed up compile runs that take too long? How about better design entry environments (the recent thread about editors is one example)? Or better documentation. Etc., etc.

Take something like Xilinx's Incremental flow. Sounds great, but if you instantiate a module more than once you can't use that approach on it. Why? And, depending on which documents you read, it doesn't work with submodules in a hierarchy, only modules instantiated in the top level. If this is the case, why? I have a fair degree of certainty that an independent developer would not take this approach, 'cause the competition would tear them to shreds.

An open market and real competition would no doubt result in a quantum leap in FPGA design tool quality. I have no doubts about this. The problem is that the critical bits of info are not available for anyone to even attempt to generate their own bit files.

Compare the FPGA world to the embedded/C world. Most microcontroller manufacturers' compilers stink. There are a number of great compilers and design environments out there available from various companies. Some are wonderful. Some are not. Others are free. And, of course, there are those that are too expensive. But, you have choices, and companies dedicated to that small segment of the business have proven that they can do a better job than the chip manufacturers.

--
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Martin Euredjian

To send private email:
0_0_0_0_@pacbell.net
where
"0_0_0_0_"  =  "martineu"
Reply to
Martin Euredjian

Boy, how long have you been in the business? The tools these days are excellent. That's not to say they can't be improved, because they can. I recall hand-routing FPGA designs and thinking I was in 7th heaven. But, then, I have done designs with hand-drawn schematics and wire-wrapped TTL logic (I had a drafting board & machine until a couple of years ago).

Reply to
Tom Seim
20+ years. I go back to hand-drawn schematics and wire-wrapped boards as well. Had lots of fun designing boards the size of pizza boxes with loads of TTL chips.

Don't be fooled by shiny GUIs. Today's tools are adequate. Not sure I'd call them good. I have a feeling they need to (and could/should) be substantially better. And I simply don't think that sole-sourcing is the way to go. Not any more. That's all.

-Martin

Reply to
Martin Euredjian
[Opinions below are my own]

Martin,

I must say, as an FPGA software developer I disagree with most of what you have to say. Altera (and I'm sure Xilinx too) has hundreds of software engineers banging on the software at any given time. Software can be a great differentiator -- in the past 10 years, there have been times when a company has pulled ahead on the basis of better software, or fallen behind due to buggy/unstable software. Most customers don't pick our chips until they've tried them out, and that means using the software. Once someone has committed to a given vendor's chips, they won't change chips just because of software. But a bad software experience can certainly influence a customer's future vendor decision!

Now, you can argue this means we've brought the software up to "adequate" levels. But then why would we be constantly adding new features and improving tool quality? We feel the competitive pressure -- even in a two-horse race you have to gallop as fast as you can.

To address a few of the features you point out as being in need:

- Hierarchical floorplanning. I believe that you can make hierarchical LogicLock constraints in Quartus. I'm not sure if that meets your needs, but if it doesn't you're welcome to build your own floorplanning tool and pass constraints into Quartus. There are 3rd party companies who have done this.

- More intelligent routers. What is lacking in current routers? The Quartus router is fully timing driven, and comes mighty close to the best-possible routes even in the presence of congestion. It inserts routing to resolve hold-time. What more do you want than a router that meets your timing requirements, fits your design even if badly congested, and automatically resolves hold-time issues?

- Parallel processing. It is not easy to make a fine-grained parallel algorithm. Let's ignore FPGA vendors for a second. Academics have been trying to make parallel CAD for FPGAs/ASICs work for years. I don't think anyone has come up with a good solution. The problem is that in placement (a large part of compile time), everything is related to everything else. So you have a lot of inter-process communication in order to keep the state of independent processes from going stale. You can do things like periodic updates, etc. but then the resulting placements could have much worse performance. What you can do with FPGA tools is farm off multiple compiles to different machines -- you could do a "seed sweep" to find the best placement & routing. Not ideal, but can be helpful at times.
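
For what it's worth, a seed sweep is easy to script yourself. A minimal sketch in Python -- the compile command and the timing-report format are hypothetical placeholders, so substitute whatever your tool chain actually provides:

    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    SEEDS = range(1, 11)  # ten placements from ten different starting seeds

    def compile_with_seed(seed):
        # "compile_design" is a stand-in for your vendor's command-line flow
        subprocess.run(["compile_design", "--seed", str(seed),
                        "--out", f"build_{seed}"], check=True)
        # Assume the first line of the (hypothetical) report holds Fmax in MHz
        with open(f"build_{seed}/timing.rpt") as f:
            return seed, float(f.readline())

    with ThreadPoolExecutor(max_workers=4) as pool:  # or farm out over many machines
        results = list(pool.map(compile_with_seed, SEEDS))

    best_seed, best_fmax = max(results, key=lambda r: r[1])
    print(f"seed {best_seed} wins at {best_fmax:.1f} MHz")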

- Design Entry Environments. Agreed -- and I believe there are a number of 3rd party tools for this. The inputs to FPGA tools are well documented so that 3rd party vendors can write their own HDL environments and/or do their own synthesis & placement.

I do believe that with more players in the tool market there would be more novel ideas. The 3rd party synthesis vendors have added all sorts of features to their tools and new tool flows that the FPGA vendors have yet to offer. The area where the most innovation is needed is in front-end tools & design entry. I would argue that back-end tools (i.e. placement, routing, and bit file generation) wouldn't benefit much from more documentation/open standards. There is a huge amount of complexity associated with creating a functional bit stream for a REAL FPGA design, and your average coder at home won't be able to contribute much to the process.

Just my $0.02 worth. This is an interesting thread...

- Paul Leventis, Altera Corp.

Reply to
Paul Leventis

I appreciate your opinion. And thanks for jumping in.

Somehow I have more faith in third parties doing a better job. Sorry, maybe I'm a bit close-minded here. There are blatant examples of this outside the FPGA domain. How many companies out there write software that is a million times better than anything MS can put out? Look at photo editing software. You have things like Photoshop at the top and hundreds of other approaches to choose from.

I guess my greater point is that we'll never know how much better things can get until the whole process is opened-up, as opposed to just front-end tools.

Sure. But I argue that if the design community has five or six different tools to choose from aside from yours, chances are you'll have: 1) a higher conversion rate (from your competition); 2) probably higher-performance tools than offered today; 3) more resources to develop chips as opposed to having huge software development departments.

I would think that most OEMs would decide which architecture would work best for the widget they are trying to build. The tools are just a means to an end: getting a working chip into their widget. I think history can prove that engineers will endure great pains to make whatever tools they have available to them work.

You bet! I think that this might be much less of an issue for larger corporations that can afford to have huge offshore teams working on designs for 1/5th the cost. Small to medium shops might not have that flexibility (or might feel very strongly about not screwing their local talent pool) and, therefore, foot the hidden bill generated by the inadequacies --whatever these may be-- of single-source tools.

Maybe I've come to the realization that highly efficient, tuned, and high-performance tools are much more important to the small to medium shops. There's a direct cost associated with the way these tools either help you or get in the way. As an example, I've seen $10K EDA (schematic/layout) packages so poor that your productivity goes right down the drain. I happen to know several large corporations that swear by some of these packages. That's because of the disconnect between the guy using the tools and the guy writing the checks.

But you still only have two horses. How many people tried to build a powered flying machine? People with lots of money and resources failed miserably. Yet, a couple of bicycle makers hell-bent to make it happen did. The fact that Altera and Xilinx have hundreds of people working on software is great, but that, to me at least, does not necessarily translate into tools that might incorporate the most innovative "out of the box" thinking.

I have yet to use Altera chips, so I can't comment on this.

If you read through the newsgroup archives you'll run into posts from those who understand the subject a heck of a lot more than I do. Some of the things I remember (and I've seen looking at FPGA Editor) are things like weird inter-CLB routing (using snaking routes instead of short high-speed routing resources). Or, how about more intelligent data path alignment tools?

There has to be a better approach. Some of these designs take hours to compile. Productivity goes right down the drain. No wonder large corps look for offshore design teams. So, what I'm saying is that you are also affecting my local economy, because it's simply too expensive to have whole teams on hold for several hours for each iteration.

Again, FPGA synthesis/placement/routing software is not my field, so I'm not the one to offer ideas on better ways to do it.

This might even go further to include better HDL's.

And that encapsulates my thinking. If these outside vendors had access to the complete process I bet a lot of interesting tools would come out.

I don't think that what I'm pushing for is in the realm of garage operators.

Thanks again,

--
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Martin Euredjian

To send private email:
0_0_0_0_@pacbell.net
where
"0_0_0_0_"  =  "martineu"
Reply to
Martin Euredjian

Well, I got you beat (I'm into my 35th year). It used to be that Xilinx didn't have design entry (you had to buy OrCAD). Now it's integrated, which is a much superior arrangement.

There are third party tools out there, but they can be expensive. I have just started to use Handel-C from Celoxica. It is very good at implementing highly parallel algorithms. We are using it for real-time image processing.

I don't think we have seen the end of tool development. I will give some feedback on Handel-C after I've had the opportunity to do a few designs.

Tom

Reply to
Tom Seim

I looked at Celoxica's tools two years ago and just couldn't afford the time/effort to consider the switch. I might look at them again next year.

--
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Martin Euredjian

To send private email:
0_0_0_0_@pacbell.net
where
"0_0_0_0_"  =  "martineu"
Reply to
Martin Euredjian

The FPGA suppliers don't have such large SW teams just for the fun of it. They do it because it is crucial for their business. ISTR a note recently that the SW teams are now larger than the HW teams.

If you want to see what 'SW here / HW over there' gave us, just go back 10-15 years.

I believe for the really big designs, you can do incremental designs. Altera called this (strangely) 'back annotate' IIRC.

If the change you have just done affects a large portion of logic, then a full respin is a good idea - but more often it affects a small section.

What IS a positive thing for all tool chains is the emergence of NIOS / MicroBlaze etc. - this means there are teams INSIDE the companies who USE the tools, and have a vested interest in just what you describe - the ability to keep optimised layouts, and faster iteration times. Each minor improvement can win them 'boasting rights for the fastest core'.

I have not tracked it closely, but I believe SW improvements alone have given significant performance gains for the NIOS/MicroBlaze core speeds. Silicon gains will always improve the numbers....

-jg

Reply to
Jim Granville

How are these teams organized? Where does innovative thinking (as opposed to sustaining or evolutionary thinking) fit into their infrastructure? Are there parallel teams competing to create the best possible tools? How many layers removed from actual users are these developers?

I'm not saying that they are not doing a decent job. But, what's out there to compare to? All I'm saying is that the history of humanity shows that there's always someone who, for some reason, can figure out how to do a better job. Until you open it up to the entrepreneurial masses we'll never know what we might be missing (or not). There are countless examples of small dedicated teams running circles around large ones. So, developer count alone does not tend to mean much.

I'm not sure what you mean by "?!". I don't need to know about Altera's offerings to know that hierarchical floorplanning is important. The tools I use don't offer this. I can't comment on Altera's tools 'cause I don't use them. For all I know, they might be wonderful.

In the Xilinx tool flow, if you took the time to write your own code to drive the tools outside of the GUI you could probably achieve a degree of parallel processing and something akin to hierarchical floorplanning. You'd have to use the modular design flow. With enough segmentation you could launch compilation runs for different modules on a bunch of machines on the network. Your code would have to wait for everyone to be done and then launch NGDBUILD to glue it all together. If the flow could do more than one hierarchical level this could be useful. Not having attempted this I'm not sure what problems one might run into, but it could be an interesting way to boost throughput.
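
Sketched in Python, with invented module names (and note the ngdbuild arguments are from memory -- check the modular design flow documentation before trusting them):

    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    MODULES = ["fifo_ctrl", "dsp_core", "io_bridge"]  # hypothetical module names

    def implement(module):
        # Placeholder per-module script wrapping that module's implementation run;
        # it could just as easily ssh to another machine on the network.
        subprocess.run(["./implement_module.sh", module], check=True)
        return module

    with ThreadPoolExecutor(max_workers=len(MODULES)) as pool:
        for done in pool.map(implement, MODULES):
            print(f"{done}: done")

    # Final assembly -- glue the implemented modules back together.
    # Exact ngdbuild arguments depend on your ISE version; verify them.
    subprocess.run(["ngdbuild", "-modular", "assemble", "top"], check=True)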

In general terms, as devices get faster, cheaper and larger (gate count), the whole idea of segmenting designs down to basic hierarchical pieces that get stitched together to form larger functional modules becomes more and more important. This, probably at the expense of packing the logic in as tightly as possible (meaning that you'd have to accept CLB waste you might not otherwise have accepted years ago). But you have to have control of all the branches and leaves of the tree, not just the first level.

--
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Martin Euredjian

To send private email:
0_0_0_0_@pacbell.net
where
"0_0_0_0_"  =  "martineu"
Reply to
Martin Euredjian
[Opinions below are my own]

Martin,

I agree that competition drives innovation, and that no matter how large the team, progress can stagnate. Why would an FPGA company take a risk on some radically new tool when there are a gazillion incremental improvements that still need to be implemented? Having outside parties developing tools can provide quantum leaps or totally fresh ways of approaching a problem.

However, the major motivator is probably going to be money. And the FPGA CAD business is a tough one; not only is the target market (of FPGA developers) relatively small (but growing quickly), they are for the most part highly cost sensitive. Not only do you need a good product, but you need a product that provides a significant advantage over the freely (or close to) provided tools from the vendors themselves. In some senses, by pricing their CAD tools so low, FPGA vendors have made it difficult for any third party company to survive, at least on the back-end of the flow.

In areas such as synthesis & simulation, there is a greater opportunity and market. Not only do the various 3rd party vendors offer good products, they provide users with some level of FPGA vendor-independence, which has value. Much of the development done on these tools can be leveraged across various FPGA families from different companies. And the expertise can be applied to the much larger (but hopefully shrinking ;-)) ASIC business.

There have been a handful of FPGA CAD start-ups over the years that have attacked various problems with some success. The few that I can think of offhand are ADT (now part of Magma), Hier Design, Right Track CAD (happily a part of Altera), and NeoCAD (part of Xilinx). So some innovation has been happening, though often development agreements were the means of obtaining the documentation & interfaces required.

Perhaps the best solution (as you suggest) would be for Altera/Xilinx to let the market decide by providing publicly documented interfaces to their tools. If nobody sees an advantage to writing such a tool, no one will. However, there is a cost associated with documenting and supporting such interfaces. And there is always the fear that more and more disclosure will aid competitors by providing insights into how both the hardware and software tools work. Does the good outweigh the bad? I dunno.

None of my arguments are new, or unique to FPGAs for that matter -- almost any open-source or open-standard debate eventually ends up circling around these issues!

Regards,

Paul Leventis Altera Corp

Reply to
Paul Leventis

As a couple of bicycle makers, we can tap into (for Xilinx anyway) pretty much every detail of the design with or without placement and routing information, using XDL format (is XDL still supported in the latest tool chain?), which is plain text and doesn't look too difficult to parse.

Therefore the way is open to us to develop hierarchical floorplanning or advanced routing tools, using XDL format, converting between NCD and XDL to go to and from the Xilinx toolchain.
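
To show how approachable it is, here's a minimal sketch in Python that skims the placed instances out of an XDL file. The "inst" line layout here is from memory, so verify it against real output (xdl -ncd2xdl design.ncd) before relying on it:

    import re

    # inst "name" "TYPE", placed TILE SITE ...   (layout from memory -- verify!)
    INST_RE = re.compile(r'^inst\s+"([^"]+)"\s+"([^"]+)"\s*,\s*placed\s+(\S+)\s+(\S+)')

    def placed_instances(path):
        with open(path) as f:
            for line in f:
                m = INST_RE.match(line)
                if m:
                    yield m.groups()  # (name, type, tile, site)

    for name, itype, tile, site in placed_instances("design.xdl"):
        print(f"{site:>14}  {itype:<8}  {name}")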

The last little step, the fairly mechanical conversion of NCD to bitstream, remains under Xilinx control for reasons/excuses (according to POV) that have been discussed here before. But that's not the "interesting" part of the problem, where performance can be won or lost.

So far, I haven't seen any great rush to use this facility, and (cough) haven't had/made/found any time to do so myself...

- Brian

Reply to
Brian Drummond

Hal,

Simple. Every memory cell is a drain (pun intended) as the devices are getting leakier and leakier. No memory cells: less leakage.
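
To put rough numbers on it (the per-cell figure below is an invented assumption, not a datasheet number):

    config_bits = 10_000_000            # SRAM configuration cells in a large FPGA
    leak_per_cell = 5e-9                # assumed static leakage per cell, in watts
    print(config_bits * leak_per_cell)  # ~0.05 W that an antifuse part avoids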

Reply to
Austin Lesea

Let's face it. The FPGA companies are really software companies that happen to have very expensive dongles :-)

--

--Ray Andraka, P.E.
President, the Andraka Consulting Group, Inc.

401/884-7930 Fax 401/884-7950 email snipped-for-privacy@andraka.com

"They that give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety." -Benjamin Franklin, 1759

Reply to
Ray Andraka
