I am currently learning VHDL step by step, but I noticed there are also
quite a lot of projects written in Verilog.
So, for somebody who knows the basics, what would be the best approach?
- First get a very good knowledge of VHDL and then start with Verilog,
- or learn VHDL and Verilog at the same time (as the two languages do
offer similar features).
Probably only you will know whether you prefer to write everything
in triplicate, or fall between the stepping stones.
Alternatively, if you model in MyHDL it can export either.
Well, my first interest is to understand the code written by others in
the many open-source projects out there.
Isn't it best first to learn (or at least to understand) the two "main"
languages and then move on to the alternatives?
I have learned VHDL pretty well and have used Verilog some. I prefer
VHDL mainly because that is what I know well. Some have a quite strong
opinion that Verilog is much more productive and useful. I won't argue
about it until I learn Verilog better.
If you are going to learn both and have a good learning book for each of
these languages, here is one you might want to include.
HDL Chip Design, Douglas J. Smith
It has many more examples than other books and many show the same thing
in VHDL and Verilog. A great way to see the differences. I only wish I
had a copy that was updated for VHDL 2008. VHDL 2008 is a *huge*
improvement over previous versions. I will never go back to limiting
myself to older versions.
I've done a lot of that; it's good for getting comfortable with RTL
and other novel aspects of hardware design. You did shake the
belief expressed in my intuitive comment above, so I checked a
project and attached a summary below; the exported VHDL is
almost double the size of the Verilog.
It's not like English vs Chinese! Whether you prefer the terse
or the verbose, the underlying RTL is what you need to grok. My
preference is the less extreme choice, MyHDL.
MyHDL is a fast, free modelling tool, with substantial support
from Python libraries for design and test, and it can export traces
to a waveform viewer. Downside: you need good testing to prove the
design source is valid and interpreted correctly. Upside: you should
be doing this anyway!
As for writing Verilog/MyHDL/VHDL, use code templates, your own
or from net or tool manuals. Enjoy the trip, this low-level
stuff still fascinates me.
Verilog compiles every piece of code, VHDL compiles none...
VHDL does very strong type checking and also has a verbose syntax which is
quite cumbersome, but at least, when the compiler does not complain, chances
are high that the code does what the designer intended.
- Make use of some VHDL-2008 features (like process(all))
- Understand that you do not need component declarations; you can use
inst: entity work.foobar(rtl) port map (...)
without a component declaration.
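To sketch those two points together (the entity foobar and all port names here are made up for illustration, not from any real project):

```vhdl
-- Hypothetical VHDL-2008 sketch: process(all) plus direct entity
-- instantiation, with no component declaration anywhere.
library ieee;
use ieee.std_logic_1164.all;

entity top is
  port (clk, a, b, sel : in  std_logic;
        y              : out std_logic);
end entity;

architecture rtl of top is
  signal mux_out : std_logic;
begin
  -- VHDL-2008: "all" makes the process sensitive to every signal
  -- it reads, so the sensitivity list can no longer fall out of date.
  comb : process(all)
  begin
    if sel = '1' then
      mux_out <= a;
    else
      mux_out <= b;
    end if;
  end process;

  -- Direct instantiation of entity work.foobar, architecture rtl;
  -- no component declaration is needed.
  u0 : entity work.foobar(rtl)
    port map (clk => clk, d => mux_out, q => y);
end architecture;
```

The direct-instantiation form also removes the classic failure mode where a component declaration and the actual entity drift out of sync.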
Verilog will save you some time writing code but can result in extremely
hard-to-find errors that turn out to be simply typos. At the least, a good
editor is a must, and I would suggest reading this:
Can't comment on MyHDL and SystemVerilog...
Yes, that is the point of strong typing. It should eliminate the class
of errors that are due to mistakes in cross type assignments. This is
something that you have to understand very thoroughly in order to use
Verilog because it won't warn you, it will just do something that may or
may not be what you want.
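A tiny illustration of that class of error (entity and signal names invented for the sketch): in VHDL a width-mismatched assignment is rejected at analysis time, and you have to state the resize explicitly, whereas the equivalent Verilog assignment would typically elaborate without complaint and silently keep only the low bits.

```vhdl
-- Hypothetical sketch: VHDL's strong typing forces the designer to
-- make a width change explicit instead of truncating silently.
library ieee;
use ieee.std_logic_1164.all;
use ieee.numeric_std.all;

entity widths is
  port (a : in  unsigned(7 downto 0);
        y : out unsigned(3 downto 0));
end entity;

architecture rtl of widths is
begin
  -- y <= a;                 -- rejected: 8-bit value, 4-bit target
  y <= resize(a, y'length);  -- legal: the truncation is now explicit
end architecture;
```

Whether the explicit resize is the right behavior is still the designer's problem, but at least the decision is visible in the source.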
This is where I am not sure which is preferable. I can write VHDL code
easily in spite of the "verboseness" of the language. I have yet to
even find a good book that tells you how to deal with these issues in
Verilog. I have asked about a good book to learn these issues from and
have been told in the Verilog group that there is none. Until I can
learn about this and be certain I am writing good code, I can't really
consider using Verilog professionally.
Wow. This is a pretty long paper about "gotchas" in Verilog... 63
pages! I'm not sure I want to work with it.
How about C/C++? There are many discussions of their
"characteristics"; I like the wryly amusing FQA.
Having first used C in 1981 (gulp), I now feel uncomfortable
using it for anything more than a hack. Can you "cast away
constness" at the moment? I remember reading endless discussions
in the early 90s about whether it should be allowed or forbidden
in the standard.