Greetings, people!
So lately, I have been working on my sigma-delta ADC, and before going any further, I wanted to have a base model WORKING (1st order, 3-bit digital output). The circuit is quite simple.
Input (x) -> Difference Amplifier -> Integrator (op-amp) -> Quantizer -> Encoder -> 3-bit Output (Y) -> Feedback DAC to the non-inverting pin of the Difference Amplifier (I am using an instrumentation amplifier to get better results, but that is not a necessity).
Now, the output is not what we want (that is a separate issue with the quantizer, which we can solve later), but the time LTspice takes to simulate the whole circuit is about 10 minutes, which is unacceptable. The transient directive is `.tran 0 100m 95m 1m uic` (100 ms stop time, start saving data at 95 ms, 1 ms maximum time step, skip the initial operating-point solution). These are the best transient settings I have found, and it still takes around 10 minutes; without them, the simulation time grows even higher.
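For reference, here is the directive as it sits on my schematic, with each field annotated the way I understand it from the LTspice help (`.tran <Tprint> <Tstop> <Tstart> <dTmax> [modifiers]`) — correct me if I have misread any of these:

```spice
* Transient analysis directive as placed on the schematic
.tran 0 100m 95m 1m uic
* 0    - Tprint (ignored by LTspice, kept for SPICE compatibility)
* 100m - Tstop: simulate up to 100 ms
* 95m  - Tstart: begin saving waveform data at 95 ms (keeps the .raw file small)
* 1m   - dTmax: maximum allowed time step of 1 ms
* uic  - use initial conditions, i.e. skip the initial DC operating-point solve
```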
Also, the CPU usage during simulation is 100% (meaning other programs are hard to run). The machine I am using is a Samsung Series 5 Ultrabook with an AMD A6-4455M (2 cores at 2.1 GHz each, I believe), but I don't think the CPU is the problem for a simple simulation like this. (BTW, I borrowed it from a friend until I get my new Dell XPS 13"... so don't blame me for buying an AMD.)
One more thing: I ran the simulation on my friend's Core 2 Duo desktop, and it takes almost the same time. Same on my friend's notebook with a 2nd-generation Core i3 processor. RAM is not the issue, though!
So, can anyone please tell me what the reasons are? Is the circuit not properly optimized? Am I giving the wrong transient parameters? Is this a CPU problem? Any advice would help, because I am relatively new to this SPICE simulation environment (only about a month of experience).
Thanks in advance :)