which commercial HDL-Simulator for FPGA?

I wonder if it's the XP core or the filesystem itself. There's a Windows XP port of ext2 available. It would be amusing if it were faster than NTFS.

So was your virus scanner on during the simulation :-)

The answer is to just use Linux.

--
/*  jhallen@world.std.com AB1GO */                        /* Joseph H. Allen */
int a[1817];main(z,p,q,r){for(p=80;q+p-80;p-=2*a[p])for(z=9;z--;)q=3&(r=time(0)
+r*57)/7,q=q?q-1?q-2?1-p%79?-1:0:p%79-77?1:0:p<1659?79:0:p>158?-79:0,q?!a[p+q*2
]?a[p+=a[p+=q]=q]=q:0:0;for(;q++-1817;)printf(q%79?"%c":"%c\n"," #"[!a[q-1]]);}
Reply to
Joseph H Allen

I haven't been able to get Icarus to work; it's not complete enough to run any of our testbenches. We aren't doing anything fancy. In fact, all of our code is strict Verilog-95; it's not even 2001.

Reply to
General Schvantzkopf

That's hardly convincing proof that all of the performance increase is due to the file system. In my experience, NFS does slow down ncsim a lot when I turn on waveform dumping, but without waveform dumping there is no noticeable performance difference after the design elaboration.

I'm not saying the filesystem isn't part of it, but for a long simulation with no data logging, the simulator spends 99% of its time doing no file I/O at all. Rather, I believe the following two factors play a bigger role in the speed difference:

  1. Context switching. Linux is very, very good at this. In a workstation environment where I/O interrupts arrive hundreds of times a second, a context switch happens every time the CPU switches from running one process to the next. What's good about the Linux kernel is that you can tune a lot of things: the number of interrupts, how frequently the kernel services them, and how pre-emptible the kernel is. A well-tuned batch server can be very fast. There's not so much help from XP. I believe Windows 2000 does have an option to choose between server and desktop mode, but I'm not sure what difference it makes.
  2. Memory management. Linux is again very, very good at this. Filesystem caching and virtual memory management work hand in hand. My 1 GB RAM workstation ran 99.9% of the time without touching the swap partition, whereas under Windows XP the same workstation constantly saw hard-drive thrashing, especially after running a very memory-intensive job.
--
Faster, faster, you fool, you fool!
		-- Bill Cosby
Reply to
Jason Zheng

I agree with rickman on the notion of a pure HDL hierarchy, but, like you, I also like to see structural views at all levels, including the top. However, I don't like to edit or to maintain graphical sources.

I let the Quartus RTL Viewer draw my schematics from my synthesis code alone. I can bring it up live to drill down module by module, or print out PDFs at any level, like this:

formatting link
formatting link

Yes, a case statement is easy to write, read, and sim. Drawing curvy arrows and attaching equations is fun once.

-- Mike Treseler

Reply to
Mike Treseler

I thought maybe that went without saying. It is the main reason I maintain an SE license. Not only is it faster on Linux (your numbers look about right to me), but I can take advantage of the ease of scripting make and vsim commands to do things like daily builds and verification from an svn repository.
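Something along these lines, driven from a makefile or cron job -- just the general idea, with library, file and testbench names as placeholders:

    # nightly.do -- run in batch mode as:  vsim -c -do nightly.do
    vlib work
    vcom -93 ../rtl/top.vhd ../tb/tb_top.vhd   ;# or vlog for Verilog sources
    vsim work.tb_top
    run -all
    quit -f

A makefile target that does an svn update and then runs the do-file is enough to turn that into a nightly regression.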

Thanks for the VMware info. I'm still old-school with two OptiPlex boxes and a KVM switch.

-- Mike Treseler

Reply to
Mike Treseler

It's not NFS that's the problem with the VMs; it's the virtual I/O performance. I looked at the effect of NFS alone by using a directory that was mounted on a second Linux machine connected to my test machine via gigabit Ethernet. The degradation of native NC over a true gigabit network was negligible, about the same as running it in a VM with a virtual disk or a shared disk, i.e. about 10%. Using a virtual NIC caused NC to go from about 8:14 to 18:37 and KVM to go from 7:42 to 38:36.

Regardless of the source of the I/O performance problems, the effect was dramatic, which is why I'm assuming that disk I/O is XP's problem. However, I'm willing to concede that this is just a guess; it could be any number of other factors, as many posters have pointed out. My original point was that if you are going to shell out for an expensive simulator like NC, VCS or Questa, you shouldn't cripple it by running it on Windows.
Reply to
General Schvantzkopf

I used the version that was in the F9 repositories, which is 0.9. Are the current snapshots significantly better than that one?

Reply to
General Schvantzkopf

My tests indicate that the virtual memory manager in Windows causes large simulations to run at 10% of the speed of a Linux/Unix/Solaris box.

Which validates your original point.

G.

Reply to
ghelbig

But simulations are not file-system I/O bound unless you're dumping trace files to disk. You typically load the simulation image into memory and run.
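For example, in simulator-console terms (Modelsim-style commands; the instance path is just an example):

    # memory-bound run: load the image and go, essentially no file I/O
    vsim work.tb_top
    run -all

    # same run while dumping a trace file: every value change goes to disk
    vsim work.tb_top
    vcd file trace.vcd
    vcd add -r /tb_top/dut/*
    run -all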

Petter

--
A: Because it messes up the order in which people normally read text.
Q: Why is top-posting such a bad thing?
A: Top-posting.
Q: What is the most annoying thing on usenet and in e-mail?
Reply to
Petter Gustad

We have display statements in this testbench. It's not nearly as much I/O as dumping a .trn file, but it's enough that slow disk I/O causes between a 3x and 5x slowdown, depending on how bad the I/O is.

Reply to
General Schvantzkopf

mean:

formatting link

I agree completely with the enhanced readability of a drawing at a high level. The details are not improved at all, but it is easy to see the large-scale connections in a drawing. I guess I just don't bother to use a schematic for that; I just make a block diagram to go with the HDL.

It is a shame that there is no standard way of representing drawings. This would help a lot with the other issues of version control, etc. But my preference would be to use software which would *produce* a drawing from the source code. Even if it required the user to draw the connection lines, it would be helpful to have a program that would create the symbols and keep them in sync with the HDL code for each module. I would find this useful even at lower levels. After all, a picture is worth a thousand words, right?

As long as we are talking about our "wish list", I would also like an editor that was smart enough to complete words and sentences in my HDL. There are any number of ways that a program can track what you are doing and try to anticipate your actions as you type. For example, if I am creating a clocked process and typing an assignment to a signal or variable, it would be nice to have the software know that it needs a definition and an initialization in the reset portion of the process. So as soon as I enter the assignment, it would take me to the appropriate spot for the definition and start it for me to complete, followed by the same for the initialization in the reset section of the process.

If I am typing a "with" statement, I want the software to see the word "with" and put the rest of the structure on the screen for me to fill in the blanks. I find all the typing to be tedious and error prone, not to mention that after all these years, I still don't have the syntax memorized and keep a small stack of books by my elbow.

Just think how nice it would be to have the editor add the appropriate conversion function when you type an assignment between incompatible signals. No error message telling you that you need to convert that integer to an unsigned, it just adds the conversion!

I hate to use a Microsoft product as an example of the "right" way to do anything, but the version of Word that I use does a pretty good job of completing words for me sometimes. Even though it is not always accurate, I have to admit that it does a pretty impressive job of spell checking and syntax checking, and that is with *English*, not a well-defined language like VHDL or Verilog. I can only imagine that it would be a much easier job to implement something similar for an HDL. (Spell checkers don't catch it when you type "and" instead of "an", though...)

Rick

Reply to
rickman

The Quartus RTL Viewer does that.

Emacs vhdl-mode completes words.

-- Mike Treseler

Reply to
Mike Treseler

Software C/C++ tools (and those for other programming languages, too) have been doing this for YEARS. Microsoft calls its solution "IntelliSense". It's in Developer Studio (VC++/VC#, VB.NET, etc.).

The first time you (successfully) compile the project's source code, the class/variable browser registers every user-defined identifier (typedef, class, function, struct, union, built-in types like int/float/char) with the editor.

Then, as you type source code, the browser pops up a 'helper' box. For function calls (including the C/C++/Windows standard library), it shows the argument names and their data types. I guess you could call it dynamic annotation. You can jump to the definition of the object under the cursor at any time. (For standard library calls, this is less useful -- most of the compiler header files are unreadable gibberish.)

Yeah, IntelliSense does this in many contexts. Though it sounds like you additionally want some form of auto-completion, combined with context-sensitive editing.

Reply to
SynopsysFPGAexpress

In my mind this might work for small designs, but the huge amount of signal logging slows down the simulation. I usually like to log just the subset of signals I need, which is the normal way Modelsim works. I never understood the way VCS liked to work; it felt so unintegrated (in the past, that is -- the new GUI is quite a good copy of the Modelsim GUI :))

At the beginning of the simulation just add "log -r /dut/interesting_module/*"; after that, you can add signals from that block to the viewer even after the simulation has finished.

And you can also open the waves from the GUI after the simulation, or open many different waves logged from different places and merge or compare them in the GUI (the dataset-open functionality, etc.).
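A minimal sketch of that flow, with placeholder paths and file names:

    # during the simulation
    vsim -wlf run1.wlf work.tb_top
    log -r /tb_top/dut/interesting_module/*
    run -all

    # afterwards, e.g. in another vsim session, to browse the logged data
    dataset open run1.wlf run1
    add wave run1:/tb_top/dut/interesting_module/*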

I have also seen Modelsim SE be much faster than VCS on some designs. Each design is a different beast in terms of simulation speed and which simulator is fastest. Unfortunately, none of the free simulators support mixed-language simulation, and almost all of the designs I see commercially are mixed-language, so it's quite hard to test the speed differences.

--Kim

Reply to
Kim Enkovaara

I've used this method for many years for large ASIC designs. It slows down the simulation, but I find it much more effective than running the simulation again. Also, it's more cost-effective to release the expensive simulation license and use the cheaper waveform viewer for debugging. You can even run the simulations during the night and have the VPD (TRN, SST, or whatever you prefer) files waiting for you the next morning.

I never understood the way Modelsim liked to work :-) I prefer DVE over Modelsim any day. I guess it's a matter of taste.

Petter

--
A: Because it messes up the order in which people normally read text.
Q: Why is top-posting such a bad thing?
A: Top-posting.
Q: What is the most annoying thing on usenet and in e-mail?
Reply to
Petter Gustad

For e/Specman and SystemVerilog testbench debugging, Cadence NCsim doesn't log dynamic objects to the TRN/SST file. So you pretty much have to do most debugging interactively (if you want to see SystemVerilog objects/queues/dynamic arrays, etc.), with the full simulator license checked out.

I'm not sure how that compares to Mentor Questasim or Synopsys VCS.

Reply to
SynopsysFPGAexpress

VCS/DVE (VPD dump files) supports SV datatypes.

Petter

--
A: Because it messes up the order in which people normally read text.
Q: Why is top-posting such a bad thing?
A: Top-posting.
Q: What is the most annoying thing on usenet and in e-mail?

Reply to
Petter Gustad
