I'm in the process of designing and implementing a digital system on a Spartan-3 FPGA. Apart from the FPGA, I'm using two static RAMs and a CMOS image sensor in this design. Compared with other digital systems, this design is fairly simple, but I'm facing a lot of trouble when it comes to debugging. Currently I'm using ChipScope Pro for debugging, but the hassle of updating the connections and the lack of block RAM for signal storage have forced me to investigate other possibilities. Since the design involves a CMOS sensor and a RAM, I can't do a full system simulation either.
1) Does anyone know a better way of debugging?
2) Is it common to write a simulation model for external components (like the CMOS sensor) so that the whole system can be simulated?
I'm guessing at least one of the above should exist, as making a real-world complex digital system with the methods I'm currently using would be very time consuming and error prone.
Yes, it is common, and oftentimes very necessary, to simulate (HDL models of) external components. If you ever get into the ASIC world, or a large design within an FPGA, you will gain an appreciation for HDL simulation. Large FPGA designs can take hours to compile (synthesize, place, and route), which makes gate-level simulation a very inefficient way to do digital design. Can you imagine a 2-hour compile each time you made a change?! You could blow a whole day with just 4 changes!! Sometimes you can get third-party vendors to supply you with an HDL model of their part, which then allows you to simulate their component with your design. Our company has done this in the past with transceiver chips.
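To make this concrete, here is a minimal sketch of what such a behavioural model can look like, using a generic asynchronous SRAM (the port names, widths, and the 10 ns access time are illustrative, not from any particular datasheet -- always check against your part's timing):

```verilog
// Minimal behavioural model of a generic asynchronous SRAM.
// Parameters and timing are illustrative only.
module sram_model #(
    parameter ADDR_W = 18,
    parameter DATA_W = 16,
    parameter T_AA   = 10   // address access time, ns (check datasheet)
) (
    input  wire [ADDR_W-1:0] addr,
    inout  wire [DATA_W-1:0] data,
    input  wire              ce_n,   // chip enable, active low
    input  wire              we_n,   // write enable, active low
    input  wire              oe_n    // output enable, active low
);
    reg [DATA_W-1:0] mem [0:(1<<ADDR_W)-1];

    // Drive the bus only on reads, with a crude access delay;
    // otherwise leave it high-impedance.
    assign #(T_AA) data = (!ce_n && we_n && !oe_n) ? mem[addr]
                                                   : {DATA_W{1'bz}};

    // Latch the write on the rising edge of we_n (simplified:
    // no setup/hold checks).
    always @(posedge we_n)
        if (!ce_n) mem[addr] <= data;
endmodule
```

A real vendor model would add setup/hold checks and more accurate timing arcs, but even something this crude lets you simulate your memory controller against the whole design.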
Simulation is a learned art: meaning the more you do it, the better you get at it. Knowing how to stimulate your design and test the boundaries can be a challenging exercise, but one that can save you much work later on. If you skimp on simulation you'll usually pay for it later in debugging.
But if you're in the debugging phase, having a board connection (like a Mictor) to a logic analyzer is always helpful. The downside is that it takes pins to bring out the signals, but at least you're not limited by block RAM or the hassle of updating connections. One could even provide a programmable test port which would allow you to dynamically pick which signals come out of the debug port w/o having to re-compile your design.
Debugging can be extremely frustrating--I know because I've been there. But you can look forward to doing the Engineer Victory Dance (taken from Dilbert) when you find the problem :)
I've been using the following method for years: I create a debug 'bus' inside the FPGA. I use a mux or tri-state buffers to put internal signals onto 8 or 16 output pins. This allows me to monitor essential signals from different modules and track down where something goes wrong using a logic analyzer. Because the debug bus grows together with the design, I can always monitor any module connected to the debug bus in the FPGA without needing to re-compile.
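A minimal sketch of such a debug bus: a select register (written over whatever host interface exists, or even set by DIP switches) picks which group of internal taps appears on the debug pins. All the module and signal names here are made up for illustration.

```verilog
// Programmable debug 'bus' sketch: dbg_sel picks which group of
// internal signals drives the 8 debug pins, so new taps can be
// observed without re-compiling. Names are illustrative.
module debug_mux (
    input  wire       clk,
    input  wire [1:0] dbg_sel,      // set by host / switches
    input  wire [7:0] sensor_dbg,   // taps from the sensor interface
    input  wire [7:0] sram_dbg,     // taps from the SRAM controller
    input  wire [7:0] fsm_dbg,      // taps from the main state machine
    output reg  [7:0] dbg_pins      // routed to a header / Mictor
);
    always @(posedge clk)
        case (dbg_sel)
            2'd0:    dbg_pins <= sensor_dbg;
            2'd1:    dbg_pins <= sram_dbg;
            2'd2:    dbg_pins <= fsm_dbg;
            default: dbg_pins <= 8'h00;
        endcase
endmodule
```

Registering the outputs keeps the debug pins glitch-free at the cost of one cycle of latency, which a logic analyzer rarely cares about.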
If a microcontroller is connected to an FPGA I make sure the microcontroller can write/read data into memories or filters for test purposes. During development, the microcontroller is very handy for generating test patterns and verifying results.
I believe this is done very often but an error in the model will introduce an error in your design. You'll still need to verify the real-life timing on the external components with an oscilloscope or logic analyzer.
The SRAM simulation model is almost certainly available online.
For the sensor, I would expect to have to write a model. However, Micron supply RAM models on micron.com, so it's worth looking for sensor models if you're using one of theirs.
I'd use a constant array initialised with a test pattern for the sensor, though if I had a few hours to spare I'd write a reader for .bmp or .raw image files. Not strictly necessary but tends to impress the pointy haired types...
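Along those lines, here is a hedged sketch of a sensor data source that just streams a computed test pattern under a pixel clock with frame/line valid strobes. The geometry, signal names, and the absence of blanking intervals are all simplifications, not taken from any real sensor's datasheet:

```verilog
// Toy sensor model: streams a diagonal-ramp test pattern with
// frame-valid / line-valid strobes. No blanking, invented names --
// a real model would follow the sensor datasheet's timing.
module sensor_pattern #(
    parameter COLS = 64,
    parameter ROWS = 48
) (
    input  wire      pclk,
    input  wire      rst_n,
    output reg       fval,   // frame valid
    output reg       lval,   // line valid
    output reg [7:0] pix
);
    integer col, row;

    always @(posedge pclk or negedge rst_n) begin
        if (!rst_n) begin
            col  <= 0; row  <= 0;
            fval <= 0; lval <= 0; pix <= 0;
        end else begin
            fval <= 1;
            lval <= 1;
            pix  <= (col + row) & 8'hFF;   // simple diagonal ramp
            if (col == COLS-1) begin
                col <= 0;
                row <= (row == ROWS-1) ? 0 : row + 1;
            end else
                col <= col + 1;
        end
    end
endmodule
```

For the .bmp/.raw idea, a pragmatic middle ground is converting the image to a hex file offline and loading it into a memory array with `$readmemh` -- no PLI needed.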
This has been built into products from both Agilent and Tektronix to work with their logic analyzers. The Agilent product includes a mux and is tightly linked to the LA, so that when you switch the mux, the signal names show up on the LA screen.
Like simulation? Modelsim is a much more effective debug tool than any sort of in system debug tool for most problems.
Your reasoning here does not follow at all. Your system could have much more to it than just a CMOS sensor and a RAM and it could still be modelled effectively. In fact, as a general rule, the larger the system around the thing you're designing becomes, the more important it becomes to model those other things. In this particular case, it would appear that a system with a sensor, RAM and FPGA is enough to warrant a proper simulation model.
Yes, unless you're implementing the same thing that you've done before and feel confident that you can do it again without more detailed simulation AND the cost of fixing a problem is low (for the case when you're wrong). This is usually the case with an FPGA, but not at all the case when you implement in a custom ASIC.
Models for these external parts that make up the system that surround your design do not necessarily have to be the full blown model for the device. For example, an image sensor does not need to model the complete process of the conversion of photons into a data stream to be useful. It needs to only be as accurate as needed for you to debug (in a simulation environment) your design.
If the FPGA design is simply collecting the data, doing some processing and storing the results to memory, then that sensor model doesn't need to be anything more than a source of data. But the key thing you do is use the datasheet to model the handshaking that occurs in getting that data into the FPGA. These are the exact same datasheets that you were looking at when you did the FPGA design. Now look at them from the perspective of writing a functional model that implements this chip. Although there is the possibility of 'making the same mistake twice' when interpreting the datasheet, generally you'll find that when you're working on the model for the part you have a completely different perspective than when you were working on the FPGA design to interface to that part.
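As an illustration of modelling only the datasheet contract, suppose a (purely invented) handshake where data becomes valid one clock after `req` and must be held until `ack`. The model both produces that behaviour and complains when the FPGA side breaks the protocol, which is exactly where model-vs-design disagreements surface:

```verilog
// Sketch: model only the handshake contract, nothing else.
// The req/ack protocol here is invented for illustration.
module handshake_model (
    input  wire      clk,
    input  wire      req,
    input  wire      ack,
    output reg       valid,
    output reg [7:0] data
);
    initial begin valid = 0; data = 0; end

    always @(posedge clk) begin
        if (req && !valid) begin
            valid <= 1'b1;
            data  <= $random;   // any payload will do for a data source
        end else if (ack && valid) begin
            valid <= 1'b0;
        end
        // Protocol check: ack with no valid data is a design bug.
        if (ack && !valid)
            $display("%t: ERROR - ack asserted with no valid data", $time);
    end
endmodule
```

The `$display` check is the valuable part: it turns a silent protocol violation into a visible simulation message, long before the logic analyzer ever gets involved.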
Once you have that simulation model all up and running properly it's time to try it out in the real world. Depending on your skill you'll probably find some issues at that point where the real design still doesn't work. At that point, you pull out the in system debug tools, scope and logic analyzer, etc. not so much to debug the problem but to determine in what way your simulation model of the system is deficient and in need of enhancement. In other words, you're looking for the thing(s) that you missed or thought were not important or just didn't fully understand when you were taking those two views of the datasheet. Once you get a handle on what that is, you upgrade your model for the system to model this newly understood behaviour, leaving the design that you're working on unchanged. Once you upgrade your system model appropriately you should be seeing your design now fail in the same manner that it does in the real world. NOW you're ready to debug your design and then re-run the simulation to verify that the problem is fixed and THEN take it to the lab to see that it works in reality as well.
As you get better at it, you tend to get better at generating parts models in part because whenever you look at a datasheet for something you'll be using in a new design you'll immediately be viewing it from the perspective of how to interface to it (knowledge which you'll use to implement your design) and from the perspective of how to create an effective model.
Bottom line is that once you adopt this mindset you'll likely find that you're more effective debugging in the simulation environment, where you have immediate access to any internal signal you're interested in, than debugging in the real world, where access to internal signals usually comes at the cost of a design change (to bring out the internal signals) and an FPGA rebuild....and you still haven't fixed the problem, just brought out some signals....which could lead to bringing out more signals....and in the end, when you have the design change that actually fixes the problem, you still don't have a way to test this change in the simulation environment ahead of time to see if you got it right. If the problem is not easily re-creatable this can be a major drawback.
1) I agree with others: simulation, simulation, simulation, and .....debug. Before the days of things like ChipScope I used to reserve 16 ~ 32 spare pins and route them to headers; now the logic analyzer is your best friend. I don't recall having any design with full system simulation, but with only the CMOS sensor and static RAM, it's doable.
2) It depends on your time budget. By the way, I'm not familiar with CMOS sensors; what kind of interface/control signals do you have on the sensor? If most of it is something like an I2C interface then it's not worth writing a model for it.
Hi, thank you for all the information. OK, now I'm directed towards simulation. I've downloaded the simulation model for the RAM, but there is no model from the manufacturer for the CMOS sensor.
So my first task is to write a model for the sensor, but I'm not sure which way to go in writing this. My options are pure Verilog, or Verilog mixed with PLI (or something similar). The whole sensor is controlled over an I2C interface, so I will need to implement the I2C protocol as well. I'm wondering if there are any free models that implement I2C that I can use directly.
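Free I2C cores and models do exist (opencores.org has several), and reusing one is probably less work than growing your own. For a testbench, though, a heavily reduced slave model can be enough; the sketch below only handles single-byte register writes -- clock stretching, repeated start, reads, and stop detection are all ignored, and the device address 0x48 is made up:

```verilog
// Very reduced I2C slave sketch: accepts <start><dev addr><reg
// addr><data> write sequences and stores them in a register file.
// Many protocol features are deliberately ignored; this is a
// testbench aid, not a compliant I2C implementation.
module i2c_slave_model #(
    parameter [6:0] DEV_ADDR = 7'h48   // invented address
) (
    input wire scl,
    inout wire sda
);
    reg [7:0] shift;            // incoming bits
    reg [3:0] bitcnt;           // bits received in current byte
    reg [1:0] phase;            // 0: dev addr, 1: reg addr, 2: data
    reg [7:0] regfile [0:255];
    reg [7:0] reg_ptr;
    reg       sda_out;          // 0 = drive ACK low, 1 = release

    assign sda = sda_out ? 1'bz : 1'b0;  // needs a pullup in the TB

    initial begin sda_out = 1; bitcnt = 0; phase = 0; end

    // START: SDA falls while SCL is high.
    always @(negedge sda)
        if (scl) begin bitcnt <= 0; phase <= 0; sda_out <= 1; end

    // Sample data bits on rising SCL (skip the ACK bit slot).
    always @(posedge scl)
        if (bitcnt < 8) shift <= {shift[6:0], sda};

    // Drive/release ACK and advance state on falling SCL.
    always @(negedge scl) begin
        bitcnt <= bitcnt + 1;
        if (bitcnt == 7)
            sda_out <= 0;               // ACK the byte just received
        else if (bitcnt == 8) begin
            sda_out <= 1;               // release SDA after ACK
            bitcnt  <= 0;
            case (phase)                // note: ACKs even a wrong address
                0: if (shift[7:1] == DEV_ADDR) phase <= 1;
                1: begin reg_ptr <= shift; phase <= 2; end
                2: regfile[reg_ptr] <= shift;
            endcase
        end
    end
endmodule
```

For a sensor that is only *configured* over I2C, this level of fidelity is often all you need: the register file captures what your FPGA wrote, and the rest of the sensor model can read its settings from `regfile`. Pure Verilog should suffice here; PLI is only worth the trouble if you need to pull data in from files or host code that `$readmemh` can't handle.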