Ping: Kevin Aylward - Monte Carlo

Kevin, what are your thoughts on using Monte Carlo analysis on IC designs?

My opinion (not recently well-received :-) is that Monte Carlo analysis is what poor designers use to CYA when they do designs that rely on device parameters and component values, rather than doing proper ratiometric designs.

What say ye? ...Jim Thompson

--
| James E.Thompson                                 |    mens     |
| Analog Innovations                               |     et      |
| Analog/Mixed-Signal ASIC's and Discrete Systems  |    manus    |
| San Tan Valley, AZ 85142     Skype: skypeanalog  |             |
| Voice:(480)460-2350  Fax: Available upon request |  Brass Rat  |
| E-mail Icon at http://www.analog-innovations.com |    1962     |

I love to cook with wine.     Sometimes I even put it in the food.

Reply to
Jim Thompson

Seriously, I do not know how one "sets up" a Monte Carlo analysis. Does one take the equations that specify how a circuit works (like an op-amp) and solve them using part values that are random but within part spec? "Step" and repeat to get an idea of how it responds? If so, that sounds rather crude and perhaps not too useful.

Then again, use of R3/C5 or whatever seems worse to me.

KISS to help me understand. Thanks.
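[For what it's worth, the "random values within part spec, step and repeat" recipe described above is exactly the simple tolerance-analysis form of Monte Carlo. A minimal sketch in Python; the resistive divider and the 5% tolerance are assumptions purely for illustration.]

import random

def divider_gain(r1, r2):
    # Transfer ratio of a simple resistive divider.
    return r2 / (r1 + r2)

R1_NOM, R2_NOM, TOL = 10e3, 10e3, 0.05   # 10k/10k, 5% parts (assumed)

gains = []
for _ in range(10000):
    # Draw each part value randomly, but within its spec window.
    r1 = R1_NOM * random.uniform(1.0 - TOL, 1.0 + TOL)
    r2 = R2_NOM * random.uniform(1.0 - TOL, 1.0 + TOL)
    gains.append(divider_gain(r1, r2))

gains.sort()
print("min %.4f  median %.4f  max %.4f"
      % (gains[0], gains[len(gains) // 2], gains[-1]))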

Reply to
Robert Baer

If I may, I do occasionally use Monte-Carlo analyses and find them useful. The challenge really is to have a meaningful model set to work with. By meaningful, I mean models that reflect manufacturing variability in a realistic way, not just by arbitrarily skewing a few model parameters with no regard to the physical realizability of said skews. I am fortunate to work in an environment where Monte-Carlo modeling is taken seriously and is well-supported.

Monte-Carlo analysis is useful even with proper ratiometric designs. You may have the ratios right, but mismatches may lead to unacceptable variability or yield loss in production, and MC can show you this. Remember that matching goes as K/sqrt(area) in the standard Pelgrom model, but you don't usually know K explicitly. With good models, Monte-Carlo will show you roughly the value of K and its impact; this will guide you towards appropriate device sizing.
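[For a feel of how the Pelgrom scaling plays out, here is a toy sketch assuming the usual form sigma(dVth) = A_Vt/sqrt(W*L), with an invented A_Vt of 3 mV*um; real coefficients come from the foundry, not from this demo.]

import math, random

A_VT = 3.0e-3 * 1e-6   # Pelgrom coefficient, ~3 mV*um (invented, not a real PDK value)

def offset_sigma(w, l):
    # Standard Pelgrom mismatch: sigma(dVth) = A_Vt / sqrt(W*L)
    return A_VT / math.sqrt(w * l)

for w, l in [(1e-6, 1e-6), (4e-6, 4e-6)]:   # 1x1 um vs 4x4 um pair
    sigma = offset_sigma(w, l)
    # Monte Carlo: sample the input-referred offset of a diff pair.
    samples = [random.gauss(0.0, sigma) for _ in range(100000)]
    worst = max(abs(s) for s in samples)
    print("W=L=%gum: sigma=%.2f mV, worst of 100k samples=%.2f mV"
          % (w * 1e6, sigma * 1e3, worst * 1e3))

[Note the payoff: 16x the area buys only 4x better matching, which is exactly the sizing guidance the post describes.]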

The use of Monte-Carlo models _can_ be a simple CYA tool, but if you know what you're doing it's a valuable addition to the toolkit. The difference, as in many things, is in how you use that thing between your ears.

Reply to
Steve Goldstein

Not certain I completely understand, but what I envisioned was: consider a function of numerous variables. Those variables can create minima and maxima all through the function's space. How does one find a true maximum, or a true minimum, when one might have landed on a local one?

The solution: splatter the calculations by using randomly selected variable values, THEN compare results. The idea is that this technique is less likely to concentrate on a local minimum or maximum, and so actually finds the global one.

For a single variable you're not likely to need it, but with 50-100 variables it makes sense.
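[A toy illustration of that "splatter" idea, using an assumed one-dimensional test function with many local minima; the function and all the constants are for the demo only.]

import math, random

def f(x):
    # 1-D Rastrigin-style function: lots of local minima, global minimum at x = 0.
    return x * x + 10.0 * (1.0 - math.cos(2.0 * math.pi * x))

def local_descent(x, step=0.01, iters=5000):
    # Greedy descent: from one start point it only finds the nearest valley.
    for _ in range(iters):
        for trial in (x - step, x + step):
            if f(trial) < f(x):
                x = trial
    return x

# "Splatter" random start points over the space, descend from each,
# and keep the best; far less likely to get stuck in one local minimum.
starts = [random.uniform(-5.0, 5.0) for _ in range(50)]
best = min((local_descent(x0) for x0 in starts), key=f)
print("best x found: %.3f  f(x) = %.4f" % (best, f(best)))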

Reply to
RobertMacy


Just running simulations at randomized points in the parameter space is not, properly speaking, Monte Carlo.

MC is a technique for computing integrals that is most useful when (1) you have so many variables that doing a regular grid is prohibitive; (2) you're integrating the difference between the true integrand and some much simpler approximate model, which reduces the variance of the integral.
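[A toy sketch of point (2), the control-variate trick: sample the difference between the true integrand and a simpler model whose integral is known exactly. The integrand exp(x) and the model 1+x below are assumptions for the demo, not anything circuit-specific.]

import math, random

# Estimate I = integral of exp(x) over [0,1]; true value is e - 1.
# Control variate g(x) = 1 + x has the known integral 3/2, so we
# sample the *difference* exp(x) - g(x) and add 3/2 back at the end.
N = 100000
plain, diff = [], []
for _ in range(N):
    x = random.random()
    plain.append(math.exp(x))
    diff.append(math.exp(x) - (1.0 + x))

def stats(v):
    m = sum(v) / len(v)
    return m, sum((s - m) ** 2 for s in v) / len(v)

m1, v1 = stats(plain)
m2, v2 = stats(diff)
print("plain MC:        I ~ %.5f  (sample variance %.3e)" % (m1, v1))
print("control variate: I ~ %.5f  (sample variance %.3e)" % (m2 + 1.5, v2))

[Both estimators converge to the same answer, but the controlled one has several times less variance, so it needs correspondingly fewer samples for the same accuracy.]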

It was invented by Nicholas Metropolis, Stanislaw Ulam, and John von Neumann at Los Alamos.

What you're talking about is more like simulated annealing, which builds on the Metropolis sampling algorithm from roughly the same era.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC 
Optics, Electro-optics, Photonics, Analog Electronics 

160 North State Road #203 
Briarcliff Manor NY 10510 

hobbs at electrooptical dot net 
http://electrooptical.net
Reply to
Phil Hobbs


Here's a pretty good explanation of the method (a better explanation than the one in the PSpice manual, which is quite vague :-)....

...Jim Thompson

--
| James E.Thompson                                 |    mens     | 
| Analog Innovations                               |     et      | 
| Analog/Mixed-Signal ASIC's and Discrete Systems  |    manus    | 
| San Tan Valley, AZ 85142     Skype: skypeanalog  |             | 
| Voice:(480)460-2350  Fax: Available upon request |  Brass Rat  | 
| E-mail Icon at http://www.analog-innovations.com |    1962     | 
              
I love to cook with wine.     Sometimes I even put it in the food.
Reply to
Jim Thompson

Yep ;-) ...Jim Thompson

--
| James E.Thompson                                 |    mens     | 
| Analog Innovations                               |     et      | 
| Analog/Mixed-Signal ASIC's and Discrete Systems  |    manus    | 
| San Tan Valley, AZ 85142     Skype: skypeanalog  |             | 
| Voice:(480)460-2350  Fax: Available upon request |  Brass Rat  | 
| E-mail Icon at http://www.analog-innovations.com |    1962     | 
              
I love to cook with wine.     Sometimes I even put it in the food.
Reply to
Jim Thompson

Sure. If done right (and the models are worth anything) you can predict yields and bins from Monte-Carlo simulations. We were doing MC simulations over forty years ago. A coworker racked up 360/85 (a quite large mainframe) CPU weeks doing MC simulations on one driver design.
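[For what it's worth, the yield/bin bookkeeping on top of an MC run is simple arithmetic. A sketch, assuming the simulator has already produced a list of, say, input offset voltages; the Gaussian stand-in data and the bin limits are invented.]

import random

# Stand-in for MC circuit-sim output: input offset voltage in volts.
offsets = [random.gauss(0.0, 2e-3) for _ in range(20000)]

# Invented bin limits: grade A |Vos| < 1 mV, grade B < 3 mV, else reject.
bins = {"A": 0, "B": 0, "reject": 0}
for v in offsets:
    if abs(v) < 1e-3:
        bins["A"] += 1
    elif abs(v) < 3e-3:
        bins["B"] += 1
    else:
        bins["reject"] += 1

n = len(offsets)
for grade in ("A", "B", "reject"):
    print("%-7s %5.1f %%" % (grade, 100.0 * bins[grade] / n))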

Reply to
krw

I agree 100%. MC is a complete waste of time for, essentially, anything but dc offsets.

For mainstream IC design I never use MC, only WC (worst case corners) with combined parameter sweeps. I use a LOT of corners. I have had lots of chips, all coming out spot on with this approach. WC works. Period.
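[Outside any particular simulator, "worst case corners with combined parameter sweeps" amounts to an exhaustive product over the corner tables, something like the sketch below; the corner values and the gain expression are placeholders, not real PDK data.]

import itertools

# Placeholder corner tables; in practice these come from the PDK / datasheets.
corners = {
    "process": {"slow": 0.9, "nom": 1.0, "fast": 1.1},   # relative gm
    "temp_C":  {"cold": -40.0, "room": 27.0, "hot": 125.0},
    "vdd_V":   {"low": 1.62, "nom": 1.8, "high": 1.98},
}

def metric(proc, temp, vdd):
    # Stand-in for a real simulation: a made-up gain figure.
    return 100.0 * proc * (vdd / 1.8) * (1.0 - 0.001 * (temp - 27.0))

results = []
for combo in itertools.product(*(corners[k].items() for k in corners)):
    names = tuple(name for name, _ in combo)
    vals = tuple(val for _, val in combo)
    results.append((metric(*vals), names))

results.sort()
print("worst corner:", results[0])
print("best corner: ", results[-1])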

Kevin Aylward B.Sc.

SuperSpice

Reply to
Kevin Aylward

Yep. That's all it will get you.

Same here. LOTS of corners, LOTS of successful chips, like several HUNDRED! ...Jim Thompson

--
| James E.Thompson                                 |    mens     | 
| Analog Innovations                               |     et      | 
| Analog/Mixed-Signal ASIC's and Discrete Systems  |    manus    | 
| San Tan Valley, AZ 85142     Skype: skypeanalog  |             | 
| Voice:(480)460-2350  Fax: Available upon request |  Brass Rat  | 
| E-mail Icon at http://www.analog-innovations.com |    1962     | 
              
I love to cook with wine.     Sometimes I even put it in the food.
Reply to
Jim Thompson

Monte-Carlo done in the simplistic way described in the link is indeed pretty useless. The problem with most so-called Monte-Carlo simulation work is that things aren't as simple as most people make them out to be, and as a result a lot of CPU cycles are expended on useless work.

In the real world, transistor model parameters are linked, some more closely than others. You can't just arbitrarily vary one (BF, for example) while holding all the others fixed and expect physically meaningful results. You'll get output, all right, provided the simulation converges, and you'll see distributions that make you think you've done something useful, but they won't mirror the real world. This is the garbage-in, garbage-out rule.
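[A toy sketch of the difference: sampling two linked BJT parameters jointly versus independently. The choice of BF and IS as the linked pair, and the 0.8 correlation, are invented for illustration; real linkages come from measured test structures.]

import math, random

BF_MEAN, BF_SIGMA = 100.0, 15.0
LOG_IS_MEAN, LOG_IS_SIGMA = math.log(1e-15), 0.2
RHO = 0.8   # invented correlation between BF and log(IS)

def sample_correlated():
    # Two standard normals, correlation imposed by a 2x2 Cholesky by hand:
    # z2' = rho*z1 + sqrt(1 - rho^2)*z2.
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    z2c = RHO * z1 + math.sqrt(1.0 - RHO * RHO) * z2
    bf = BF_MEAN + BF_SIGMA * z1
    is_ = math.exp(LOG_IS_MEAN + LOG_IS_SIGMA * z2c)
    return bf, is_

def sample_independent():
    # The physically dubious version: each parameter skewed on its own.
    bf = random.gauss(BF_MEAN, BF_SIGMA)
    is_ = math.exp(random.gauss(LOG_IS_MEAN, LOG_IS_SIGMA))
    return bf, is_

# The marginal spreads match, but only one set reflects the linkage.
for name, fn in (("correlated", sample_correlated),
                 ("independent", sample_independent)):
    pts = [fn() for _ in range(5)]
    print(name, [("%.0f" % bf, "%.1e" % is_) for bf, is_ in pts])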

Meaningful Monte-Carlo analyses require models that take into account the linkages, with appropriate scaling, between transistor parameters. This requires both fluency with the model equations and a good background in the device physics. It's a lot of effort and investment. Absent this care, the results of transistor-level Monte-Carlo sims are only marginally useful.

Even with good models, blind use of Monte-Carlo in the belief that it will produce a more robust design generates a lot of meaningless data. If your basic op-amp design has 85 degrees of phase margin, why waste the time to determine that the expected production variation is 80 to 89 degrees? There's no value in that; the amplifier is going to be extremely stable either way. You could get just as useful an answer by simulating at just the slow/nom/fast corners (assuming that your slow and fast models are physically meaningful to begin with).

Monte-Carlo sims are most useful when applied to parameters that are known to be very sensitive to device variation and that could lead to large yield swings. One good example would be CMRR, which often depends crucially on component matching. This can be done by hand calculation for a discrete design, but if you're working on an IC and have Pelgrom models that accurately show spread versus device sizing, it can make all the difference when you're pushing hundreds or thousands of wafers through fab.
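[As a concrete sketch of the CMRR case: a four-resistor difference amp whose CMRR floor is set entirely by resistor mismatch. The topology is textbook; the 0.1% tolerance and sample count are assumptions for the demo.]

import math, random

# Four-resistor difference amp around an ideal op-amp:
#   Vout = V2 * R4/(R3+R4) * (R1+R2)/R1  -  V1 * R2/R1
# With perfect ratios (R4/R3 == R2/R1) the common-mode gain is zero;
# in practice resistor mismatch sets the CMRR.
NOM = {"R1": 10e3, "R2": 100e3, "R3": 10e3, "R4": 100e3}
TOL = 0.001   # 0.1% resistors (assumed)

def cmrr_db(r):
    a_dm = r["R2"] / r["R1"]   # differential gain (nominal approximation)
    a_cm = (r["R4"] / (r["R3"] + r["R4"])) * (
        (r["R1"] + r["R2"]) / r["R1"]) - r["R2"] / r["R1"]
    return 20.0 * math.log10(a_dm / max(abs(a_cm), 1e-12))

samples = sorted(
    cmrr_db({k: v * random.uniform(1 - TOL, 1 + TOL)
             for k, v in NOM.items()})
    for _ in range(20000))

# The low tail, not the median, is what matters for production yield.
print("median CMRR %.1f dB, 1%%-worst %.1f dB"
      % (samples[len(samples) // 2], samples[len(samples) // 100]))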

As I said in my first reply to this thread, I do occasionally use Monte-Carlo simulation, and find value in it. I don't use it where it doesn't make sense to do so, and I've never uncovered an unsuspected problem with MC sims, but it's quite valuable when I need to ensure high yields to a spec that I know is sensitive to production distributions.

As I also said before, I'm fortunate to work in a large company that understands the requirements for useful Monte-Carlo modeling and provides the support to create the models. These are not the sorts of models you're going to get in most PDKs; they have to be developed from measured results on a wide range of test structures designed specifically for obtaining this data. It's also important to separate device-to-device variability on a single wafer from wafer-to-wafer and lot-to-lot variability, to avoid needless overdesign. It takes a lot of investment to do this right.

In general I actually do agree with Jim and Kevin that a lot of Monte-Carlo simulation is a waste of time. But there are situations where MC is valuable and useful, provided one has both the right models and an understanding of when and where to use them. Unfortunately, management almost always ignores the former, and many engineers seem to lack the latter.

Reply to
Steve Goldstein
