Where is Open Source for FPGA development?

vhdl-mode is even better, IMHO. NTEmacs works really well under XP (it feels a little bit unixy, as all the paths get / in them instead of \). There's even a win32 installer now, I gather, which sets lots of defaults to the way you expect them to work in Windows.

The best bit is the "paste as testbench", which takes your entity and creates a testbench with all the signals wired up, a clock generator installed and a process ready for you to fill in. Now if only it would do the stimulus and error checking for me :-)
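For anyone who hasn't seen the feature, the generated skeleton looks roughly like this (the entity name and ports here are invented for illustration, not what vhdl-mode emits verbatim):

```vhdl
-- Hypothetical sketch of the kind of testbench skeleton
-- vhdl-mode's "paste as testbench" produces for a unit "counter".
library ieee;
use ieee.std_logic_1164.all;

entity counter_tb is
end entity counter_tb;

architecture bench of counter_tb is
  signal clk   : std_logic := '0';
  signal rst   : std_logic;
  signal count : std_logic_vector(7 downto 0);
begin
  -- Device under test, all ports wired to local signals
  dut : entity work.counter
    port map (clk => clk, rst => rst, count => count);

  -- Free-running clock generator
  clk <= not clk after 5 ns;

  -- Stimulus process left for the user to fill in
  stim : process
  begin
    rst <= '1';
    wait for 20 ns;
    rst <= '0';
    wait;  -- stimulus and error checking go here
  end process stim;
end architecture bench;
```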

In emacs, you can have ModelSim compile an individual file at the prod of a key, which gives you a very quick syntax check.

But if you make a trivial error in your design, won't the optimiser chuck lots of stuff out and make it look much better than it is? You'll never know until you simulate...

Cheers, Martin

--
martin.j.thompson@trw.com 
TRW Conekt - Consultancy in Engineering, Knowledge and Technology
http://www.conekt.net/electronics.html
Reply to
Martin Thompson

When the optimizer chucks out lots of stuff, it usually spits out truckloads of warnings about unconnected signals being trimmed, never changing ("replaced by logic") or equivalent to some other net. Some of these trimmed signals are expected (is there a simple and inexpensive way to quietly drop multiplier LSBs when doing fixed-point maths?) but the rest are good indicators of missing stuff.
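For what it's worth, one way to make the LSB drop explicit rather than leaving it to the trimmer is to slice the product yourself, e.g. with numeric_std (a sketch; entity and signal names are invented):

```vhdl
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity mult_trunc is
  port (
    a, b : in  signed(15 downto 0);
    y    : out signed(15 downto 0)   -- truncated fixed-point result
  );
end entity mult_trunc;

architecture rtl of mult_trunc is
  signal prod : signed(31 downto 0);
begin
  prod <= a * b;
  -- Keep only the upper half: the 16 LSBs are dropped on purpose,
  -- so nothing unexpected gets trimmed and no warnings are raised.
  y <= prod(31 downto 16);
end architecture rtl;
```

Since nothing ever reads the low bits, synthesis trims them silently instead of flagging them.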

Since I do not wish to spend much time on control logic during the approximation phase, I simply connect the N control signals to an N-bit PRNG, with a few IOBs blended in to prevent synthesis from simplifying away the dummy control logic and everything else after it. Once the data/processing pipeline is in place and looks good, I can start implementing and optimizing the real control logic using simulations. I usually try to save as much of the control logic as possible for last, since relatively minor data path refinements can have a major impact on the control logic and its optimization opportunities... I am lazy, so I try to avoid moving targets whenever possible.
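A minimal sketch of that trick, using a maximal-length 8-bit Fibonacci LFSR as the PRNG (port and entity names are invented; in practice you would also route a few bits to IOBs to keep synthesis honest):

```vhdl
library ieee;
use ieee.std_logic_1164.all;

entity ctrl_stub is
  port (
    clk  : in  std_logic;
    ctrl : out std_logic_vector(7 downto 0)  -- dummy control bus
  );
end entity ctrl_stub;

architecture rtl of ctrl_stub is
  -- Non-zero seed; an all-zero state would lock the LFSR up.
  signal lfsr : std_logic_vector(7 downto 0) := x"01";
begin
  process (clk)
  begin
    if rising_edge(clk) then
      -- Taps at bits 7, 5, 4, 3 give a maximal-length sequence.
      lfsr <= lfsr(6 downto 0) &
              (lfsr(7) xor lfsr(5) xor lfsr(4) xor lfsr(3));
    end if;
  end process;

  -- Drive the real control inputs (and a few IOBs) from the PRNG
  -- so the optimizer cannot prove anything constant downstream.
  ctrl <= lfsr;
end architecture rtl;
```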

Reply to
Daniel S.

In news: snipped-for-privacy@o5g2000hsb.googlegroups.com timestamped 27 Mar 2007 15:23:44 -0700, "Andy Peters" posted: "[..]

I take it that you don't simulate much, if at all ...

you should have used ModelSim FIRST !"

That is not a good enough excuse.

Reply to
Colin Paul Gloster

Yes, I get loads of warnings about dropped bits, due to my tendency to write code that is "easy to reuse 'cos the optimiser will drop the bits it needs to". That means important stuff can get lost in the morass!

That must be where we differ then - in most of my designs, it's the control logic that takes most of the effort, especially handling error cases. The actual "processing" isn't usually my problem...

Now, off to write code which will no doubt prove me wrong within an hour ;-)

Cheers, Martin

Reply to
Martin Thompson

Re-read my paragraph... I think you misread it: I fully agree that control logic is indeed the most complicated part of most designs. What I said is that I keep control for late/last to avoid having to recode it should any significant alterations in the pipeline become necessary due to timing/area/other constraints.

"Hey Martin, there's a new feature we added to the specs... you will have six more control bits and two extra pipeline stages to manage. Unfortunately for you, the extra stages and controls sit smack in the middle of the two spots that gave you so many headaches over the last week or two."

Of course, waiting until the data path is completed only reduces the likelihood of late changes ruining control efforts, it does not immunize against late changes... and it still requires a "disposable" (minimum effort) controller for preliminary data path testing.

Reply to
Daniel S.

OK, fair enough - I understand where you're coming from.

That'd be the difference then - most of my current FPGA is explicitly not data-path, so there are very few changes to the data-path that are likely to cause me headaches :-) I hope...

Indeed. How about we disallow late changes :-) That'd go down well!

Cheers, Martin

Reply to
Martin Thompson

Architecture/Feature freezes are usually a good thing... but changes can still be forced by external factors: peripheral chips can go obsolete, better options (updated parts, pricing changes, etc.) can come along, provisioning/IP contracts can fall through, FPGA sizing guesstimates can come out too tight or short, etc.

I worked for a couple of post-docs on a software-defined radio project some years ago. They had XC2V4000 devices they thought would be sufficient for their design, but they found out, about a year into the design phase, that even the TX pipeline would (barely) not fit on a single FPGA, and they ended up halving their design goals to give the RX side a decent chance of fitting in its own 2V4000. About a year after my contract ended, one of them showed me one of the first commercial derivatives of this research project, which ended up halved a second time to make everything (RX + TX + some previously external bits like the CPU/PPC405 and SERDES/MGT) fit in a single XC2VP70. Today, I suspect they would easily be able to pull off the initial specs with a single XC5VSX95T, assuming they would not mind going back to off-chip CPUs.

I remember the post-docs being really &%#%@'d that the largest 4VSX was undersized logic-wise, had such a ridiculous logic-to-DSP48 ratio and no PPC/MGT, while even the largest devices in the 4VFX family had way too few DSP48s, rendering both families completely useless for their application. I am certain these folks would be really happy if Xilinx decided to include a SINGLE (and no more than one) PPC405 core in all LX/SX FPGAs... an XC5VSX95T with an on-chip PPC405 would be a dream platform for them: an excellent logic-to-DSP48 ratio for their application, a couple of MGTs and one PPC405 in one package to reduce the external parts count.

When dealing with bleeding-edge projects, tables can turn in any direction at any time for all sorts of reasons. The post-docs planned their design hoping that the V4 would meet their needs by providing a resource balance similar to the V2P, but the preliminary V4 specs they received thoroughly terminated all such expectations.

You get to see lots of interesting things when riding with people who are at the edge of their respective domains.

Reply to
Daniel S.

Yes, I found it weird that one PPC wasn't in *all* the families.

Indeed!

Cheers, Martin

Reply to
Martin Thompson

Can you point me to that paper? If "GI" stands for the German "Gesellschaft für Informatik", then I am a member and have missed the paper anyway ;-)

Thanks, Torsten

Reply to
Torsten Landschoff

It was at "7. GI/ITG/GMM-Workshop Modellierung und Verifikation". I just picked up the proceedings and was looking for the paper, and realised that there were multiple papers trying to extend the synthesizable part of the language.

The paper that I had in mind used XSLT to transform VHDL to VHDL:

  • Oetjens, Gerlach, Rosenstiel: "Ein XML-basierter Ansatz zur flexiblen Darstellung und Transformation von Schaltungsbeschreibungen" (an XML-based approach to the flexible representation and transformation of circuit descriptions)

Then there was another paper about a more specialised case:

  • Jan Gutsche, H.-U. Post: "Erhöhung der Synthesegenauigkeit durch Sprachraumerweiterung synthesefähiger sequentieller VHDL-Beschreibungen" (improving synthesis accuracy by extending the language subset of synthesizable sequential VHDL descriptions)

Less practical but especially enlightening was this paper:

  • Meinrad Fiedler: "Ein Übersetzungsverfahren von Verilog-Kausalspezifikationen in Signalflankengraph-basierte Spezifikationen zum Entwurf asynchroner Schaltwerke" (a method for translating Verilog causal specifications into signal-transition-graph-based specifications for the design of asynchronous circuits)

What the author presents as a simple example does not look remotely similar to anything that could be synthesized using Synopsys DC. This shows that what is considered "synthesizable" is largely in the eye of the beholder.

always begin
  @(posedge c);
  if (s) begin
    y = 1;
    @(negedge c);
    y = 0;
  end else begin
    z = 1;
    @(negedge c);
    z = 0;
  end
end

Reply to
comp.arch.fpga
Reply to
Stephen Williams

Actually, many of the large open source projects are staffed full time by industry giants ... for example, many of the Linux developers have a day job maintaining Linux for their respective employers' hardware platforms ... at IBM, Sun, HP, etc. ... while also wearing hats as core developers in various distros as a paid job. Much of the integration and testing of distros is also funded day work, via contracts ... it's how Red Hat engineers get paid.

This is big business, not just volunteer work. The hardware interfaces into computers are not "just easy" pieces of code to develop and maintain, and they are loaded with heavy IP rights ... just like the internal chip interfaces of FPGAs. The difference is that software developers took those interfaces public with open source operating systems.

If it can work for operating systems and major distros ... then it can work in other industries where there is leadership in open source to obtain advantages for both the corporations and their customers. So far, Xilinx and Altera are not taking that lead ... a lead which could result in a significantly better tool set for the industry.

As I have suggested before ... where vendors are crying about not having enough funds to support their product lines (i.e. dropping support for entire product lines like the XC4K chips while new product is still on the shelf in distribution and production inventory), they could have handled that much better by turning those lines over to a customer/vendor open source partnership.

We hear frequently here that vendors can't support anything other than the top few dozen customers ... that changes with active open source industry partnerships led by industry-paid staff.

John

Reply to
fpga_toys

I'd guess the FPGA vendors are well aware of the advantages of the open-source model.

I think that the problem for them is twofold:

1/ for old chips, netlists could easily be reverse-engineered from the bitstream given the chip specs. This would be a major blow to some FPGA users.

2/ for newer chips, even with strong crypto and/or eeproms available on all chip lines to prevent bitstream reversing, the major problem would be to still allow selling closed IP components for the platform. There's no solution to this problem -- it simply is unsolvable from an information theory standpoint. The only thing possible would be to change the business model, and I guess this is not likely to happen anytime soon.

The change won't come from the most established FPGA vendor, that's for sure -- maybe the contender could do something about it, though...

JB

Reply to
jbnote

Security through obscurity has always made victims of the honest by limiting their options. It also slows, or prevents, the real security that develops from good vendor and customer partnerships.

Reply to
fpga_toys

I fully agree with this view, but that's beside the point. The business for FPGA IP is certainly inherently flawed, but it does exist, and there's no reason for the vendors to harm it themselves.

JB

Reply to
jbnote
