LTspice, a great program, but that UI!

Don't know, I gave up board level design over 15 years ago :-)

However, if you have the filter as a Laplace transfer function, it's pretty trivial to input that into any XSpice-based simulator, including SS. XSpice directly supports Laplace rational numerators/denominators. I do include a set of Chebyshevs and so forth in SS.

I guess this question really illustrates how many "engineers" designing electronic products don't actually know much electronic engineering, i.e. how to pick up a filter cookbook and make a model.

I'm an engineer, and for maybe 12 years prior to learning anything about Spice internals, I was writing my own models for PSpice.

This is a misnomer/misconception if it implies others are not. It was a marketing ploy to claim uniqueness where there was none. Even the original 1985 PSpice had behavioural modelling. Spice3/XSpice is behavioural; it's the B source. I have created extensive behavioural models for SuperSpice. I use behavioural all the time.

Furthermore, the XSpice part has full, event-driven digital simulation, which speeds up analog logic tremendously, way faster than LTspice could possibly dream of. LTspice doesn't have mixed-mode at all.

I have some PLL behavioural examples in SuperSpice: one that uses my behavioural analog logic library, e.g. fast-simulating analog D-type counters, and a version that uses the XSpice mixed-mode engine's digital counters.
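For anyone who hasn't poked at the XSpice digital side, a minimal ngspice/XSpice-style sketch of a mixed-mode divide-by-two D-type, the kind of element those counters are built from, looks something like this. The node names, thresholds and delays are illustrative only, not the actual SuperSpice library models:

* mixed-mode divide-by-two sketch (illustrative only)
vclk clk_a 0 pulse(0 1 0 1n 1n 50n 100n)
* analog low level to hold set/reset inactive
vlow sr_a 0 dc 0
* bridge the analog clock and the set/reset level into the event-driven domain
aclk [clk_a] [clk_d] a2d
asr  [sr_a]  [sr_d]  a2d
* digital D-type; feeding data from NQ gives a divide-by-two
adiv2 nq_d clk_d sr_d sr_d q_d nq_d dff1
* bridge the digital output back to analog for plotting
aout [q_d] [q_a] d2a
.model a2d  adc_bridge(in_low=0.4 in_high=0.6)
.model d2a  dac_bridge(out_low=0 out_high=1)
.model dff1 d_dff(ic=0 clk_delay=1n rise_delay=1n fall_delay=1n)
.tran 1n 2u
.end

The flip-flop is only evaluated on events, not at every analog timestep, which is where the speed comes from.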

Nope. LTspice buys you maybe a factor of three in speed, purely from the design of its engine core; its behavioural bit has no speed advantage over XSpice at all.

I have SMPS behavioural examples in SuperSpice.

They run to full steady state on my Novatech i7 in 7 seconds. Yes, seven seconds. So it's totally viable to design with.

The bumf that went around in the late 90s when LTspice came out is pretty much redundant now. Computers are maybe 1,000 times faster.

Anyone who requires behavioural models for their Spice: Jim Thomson and I will be happy to provide such a consulting service, at very affordable rates :-)

My claim is that it doesn't sell anywhere near as much as some/many claim or think. In particular, one poster, whom the interested can look up in the thread, suggested $billions. LT does about $2B a year in total, so that claim was one for ROTFLMAO.

One-off stories don't mean much; overall principles do. LTspice is used by people to design in parts from ALL manufacturers, so it aids people in buying the competition. *Most* engineers, for *most* projects, DO check out alternatives, IMO...

I have a work colleague sitting next to me who has used LTspice for years, and still does, despite my urging him to use a real simulator. There is just no accounting for taste. Anyway, he builds pretty big designs in it. As far as I am aware, that has never led to the purchase of an LT part.

And for reference, to put this "behavioural" misconception to bed.

Spice3 Behavioural D type:

.SUBCKT D_XN !in out outn clk
* _SS_Symbol [C:\ProgramData\AnaSoft\SuperSpice\System\Behavioural.ssm] [D]
b1 mid1 0 i=-1e-3*(1-V(clk))*v(!in) -1e-3*V(clk)*v(mid2)
b3 mid2 0 v=0.5*(tanh(20*(v(mid1)-0.5)) + 1)
r1 mid1 0 1k
c1 mid1 0 10f
b4 mid3 0 i=-1e-3*V(clk)*v(mid2) -1e-3*(1-V(clk))*v(out)
b6 out 0 v=0.5*(tanh(20*(v(mid3)-0.5)) + 1)
r2 mid3 0 1k
c2 mid3 0 10f
b7 outn 0 v=1 - v(out)
.ends D_XN

Behavioural XSpice

Laplace Cheby:

.MODEL Chebychev_1DB_3 s_xfer(in_offset=0 gain=1 num_coeff=[0.491307] den_coeff=[1 0.988341 1.238409 0.491307] denormalized_freq=1K int_ic=[0.0 0.0])
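For completeness, hooking such a model up is a one-line A-device. The node names and the source/analysis lines here are purely illustrative:

* illustrative hookup for the s_xfer model above
vin filt_in 0 dc 0 ac 1
a1  filt_in filt_out Chebychev_1DB_3
.ac dec 50 10 100k

With denormalized_freq=1K, filt_out carries the 1 dB Chebyshev response with its corner around 1 kHz.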

You know where to find the rest...

-- Kevin Aylward - SuperSpice

Reply to
Kevin Aylward

That's what I've been saying. Low-volume is LT's bread and butter. They have a few interesting/unique parts that we use, but they're way too expensive for high-volume production. At 100/month I used several of their parts, but at 10K/month I haven't been able to justify one.

That group doesn't buy LT. That's the point!

So you have no idea of the number of (e.g.) power supplies available now. IOW, you're talking through your ass.

Utter bullshit.

Engineering includes economics.

You're obviously clueless.

Reply to
krw

LTC is selling about $1.5 billion a year. ADI is paying $14.8 billion to acquire them. Somebody is buying their stuff.

I was told that ADI will be moving their part models to LT Spice. That would be smart.

--

John Larkin         Highland Technology, Inc 

lunatic fringe electronics
Reply to
John Larkin

And thus somewhat disconnected from what circuit-designer types need. It's not like designing chips; it's consuming whatever the chip makers give me, figuring out the flaws and limitations, balancing that with cost and other constraints (like time), and, when I settle on something for production, hoping the maker doesn't yank it because it isn't selling enough.

I no longer care, the design is done, but for future reference: this still assumes that the chip maker provides the data, which by itself still doesn't mean the chip as a system will follow. If I hadn't had the LTC1068 model I might not have noticed that the bypass-pin capacitance had a huge effect on low-frequency pass-through, enough of an effect that I drove it actively to 1/2 supply.

How is using a manufacturer-supplied tool to design a circuit with that manufacturer's part bad designing? Seemed like the thing to do. Also, it in no way means I don't know how to design a filter. Try designing an ANR filter: it's a race between amplitude and phase, and all those fancy "known" techniques won't help... it's passive. Then make it work with crappy 10% ceramic caps (or worse, depending on offset voltage) and speakers and mics with +/-3 dB variation. Oh yeah, the requirements for the filter are unknown, but that's not even an issue. There's a way to do all that, and it works perfectly (enough), but it's so obvious that the PhD types usually don't get it.

OK

It was "gigabucks", which I read figuratively as meaning a lot; I didn't take it as a literal number. Thing is, if everyone who used LTspice bought just *one* high-dollar chip, that's likely millions right there, more than enough to pay a developer. I see absolutely nothing wrong with a chip maker providing free tools for their products (I wish more did that); it's not a bribe, it's interactive documentation, and I'm sure it sells them a lot of parts. Nobody knows precisely how much, because when I buy a chip they don't ask how I found it. But sales are sales, and I have no doubt LTspice adds sales.

??? what principle? that it's useful? You seem to think that if someone does something one way it means they can't do something else another way. I do all kinds of stuff from tubes to microcontrollers to analog micro crap and one thing I've learned is you don't just learn how to do something a particular way. If you do that then you're stuck doing it that way. No, I learned how to quickly learn how to do the thing I need to do, then move on to the next thing. Next project is going to be different anyway so it just wastes time to get all theoretical on something I might never need again, just document it file it away and if I need it again there it is.

For a lot of stuff I simply don't care how or why it works, I just care that it does work. No more than I care about what kind of current sources a chip uses, it's academic and doesn't change the outcome. I only care about stuff like that when something is wrong with the current solution.

Or when there's no solution... for the kinds of filters I need the most, poles and zeros and Laplace and biquads get me precisely nowhere.. all that fancy stuff, well, clips, makes noise, is too sensitive to component variation, and takes up too much room on the board. And in my head. If I need it I'll learn it, if not I don't care unless I'm just curious or something. There is much more to designing circuits than applying a bunch of fancy math - it's been my experience that the fancier the math, the greater the odds it's not going to work in the Real World. Nature and physics don't give a damn about human equations and real parts are never the value it says on the packaging.

They don't care. They're happy if it's used for other things. If they tried to lock it down then it wouldn't be as useful, not as popular, less eyeballs seeing LT all the time, fewer sales. Just as you suspect - it's a billboard. So what.. would be one thing if it was a scam but their chips work really well.

Didn't mean to imply that other Spice simulators aren't behavioral; the point was that LT took the time to make netlist and behavioral models for their parts, even complicated parts. I was under the impression that there were other optimizations for the SMPS models.

Real world SK lowpass something...

[ASCII schematic, flattened in the archive: a Sallen-Key lowpass built around a 2N5089 emitter follower biased at ~1/2 supply. Input through 150K then 150K to the base; 470p from the junction of the two resistors back to the output (emitter); 220p from the base to ground; 22K emitter resistor. F ~= 150K/45 ~= 3.3 kHz minus a bit.]

...the knobs are easier to turn. Anything much more complicated than that I'd rather use an app if I need a standard filter.
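In case the ASCII art gets mangled in transit, here's a rough plain-SPICE reading of it; the supply value, bias arrangement and transistor model are guesses on my part, just enough to make it simulate:

* follower-based Sallen-Key lowpass, roughly 3.3 kHz
Vcc vcc 0 dc 9
* the real circuit is AC-coupled and biased near 1/2 supply;
* a DC offset on the source stands in for that here
Vin in 0 dc 4.5 ac 1
R1 in a 150k
R2 a  b 150k
* positive-feedback cap from the first junction to the output
C1 a out 470p
C2 b 0   220p
Q1 vcc b out Q2N5089
RE out 0 22k
.model Q2N5089 npn(bf=400)
.ac dec 50 100 100k
.end

The emitter follower is close enough to a unity-gain buffer that the usual Sallen-Key estimate holds: f is about 1/(2*pi*sqrt(R1*R2*C1*C2)), around 3.3 kHz.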

Terry

Reply to
Terry Newton

It would also be a good idea to snag AD's current unencrypted models.

Cheers

Phil Hobbs

Reply to
pcdhobbs

If you need to design serious LC filters, the NuHertz software may be worth it. In a few minutes, it designed a filter that we'd spent days struggling with. It can use standard values and finite Qs and designs filter forms that most people have never heard of.

I recently designed a one-side-absorptive LC lowpass by pure fiddling in LT Spice. It has 5 parts, which is about the limit for a fiddlable design, or past the limit for some topologies. In my filter, the various causalities seem to be sort of orthogonal, so fiddling converged. In the other one, the NuHertz solution, things diverged; fiddling just kept making things worse.

Somebody should do research on the dynamics of fiddling. Maybe it's been done.

--

John Larkin         Highland Technology, Inc 

lunatic fringe electronics
Reply to
John Larkin

Don't mind Kevin, he's just throwing rocks because somebody said his baby picture was ugly. ;)

Cheers

Phil "Not much of a SPICE fan anyway" Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC 
Optics, Electro-optics, Photonics, Analog Electronics 

160 North State Road #203 
Briarcliff Manor NY 10510 

hobbs at electrooptical dot net 
http://electrooptical.net
Reply to
Phil Hobbs

They have. The net is that if you have N adjustments that are coupled at the 10% level, you can do it by hand. 30% requires an expert, and 50% is essentially impossible by hand unless you start out very close to the optimum. (And this works in both directions--it doesn't help to put one axis on a fine thread.)

I remember seeing GPIB-controlled screwdrivers used for adjusting microwave filters.

Filter tuning is especially exciting when things are very nonlinear, e.g. one time I had to tune up the RF coupling network in a plasma etcher that had a solid state PA. The plasma guys had redesigned the chuck, and the pi network wouldn't tune--they were way out of adjustment, and of course the match changes completely when the plasma strikes. With the horrible mismatch, they couldn't turn the RF power high enough to get the plasma to strike before the reverse-power sensor shut it down to avoid blowing up the final.

Once I got the plasma to strike at lower power, I found that one of the two caps tuned slowly and the other one very fast. It was one of those long narrow canyon problems the numerical analysts talk about, except that in this case there were rocks falling from the edges. ;)

Tuning by finite differences was the ticket--adjust the slow knob a fair way, then tweak up the fast knob, noting which slow direction made for the lower reflected power.

Once that was done, we put a pad capacitor in series with the fast control, and all was peace and concord.

For ordinary filters, the main pitfall is that it's often easy to tweak up a beautiful looking passband that's slightly too narrow. Anyone who has done that on a filter of 3 sections or more knows that you might as well start over at that point--the true optimum generally isn't anywhere nearby.

You want to tune to get the right number of bumps first, spread them out over the right bandwidth, and then tune for the right return loss shape.

A grid dip meter is helpful for getting the resonant frequencies of the individual sections right to start with. Most filter designs I'm familiar with are more sensitive to that than to the exact L and C.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC 
Optics, Electro-optics, Photonics, Analog Electronics 

160 North State Road #203 
Briarcliff Manor NY 10510 

hobbs at electrooptical dot net 
http://electrooptical.net
Reply to
Phil Hobbs

Well, we will leave it at that then.

-- Kevin Aylward - SuperSpice

Reply to
Kevin Aylward

> fiddling just kept making things worse.

Yes, it has. One approach is called genetic algorithms (GA); it's based on the Darwinian triple of random variation, replication and selection.

They pop up some really novel topologies, that work really well, apparently.

I think the PSpice optimiser does that as well, by other methods, like having goal functions and stuff that it twiddles towards. I don't use PSpice anymore, so Jim will have to answer that one.

-- Kevin Aylward - SuperSpice

Reply to
Kevin Aylward

Maybe I was lucky. There's nothing wrong with luck.

One advantage that I had was the option to decide, on an irrational basis, what "good enough" means. It was a tradeoff between pulse shape and reflection absorption, and "that looks nice to me" was the optimization goal. Fatigue can alter one's standard of "good enough."

I once designed a PWM-based DAC that had two trimpots to set gain and offset. They diverged so aggressively that nobody could adjust them. We wrote down an algorithm to tide us over until I redesigned it.

Some circuits might be optimum from a cost standpoint or something, but they are too difficult to optimize. More stages, or a different topology, or a buffer halfway through, might untangle things.

And some circuits are just not possible to draw in a visually appealing way. Time to redesign.

Rob here has optimized filters and other things by cycling through all reasonable 5% parts, simulating every combination, and picking the best one. That might run for a few days, generally Python and not Spice. Again, that works for things with a small number of components.

--

John Larkin         Highland Technology, Inc 
picosecond timing   precision measurement  

jlarkin att highlandtechnology dott com 
http://www.highlandtechnology.com
Reply to
John Larkin

I rarely if ever do that - speaker crossovers are about as close as I've gotten (years ago when hi fi was a thing). But I really admire the RF folk - things are totally different up there, where parasitics are as much a part of the circuit as parts. GHz especially, all those odd-shaped trace squiggles. That stuff must be a bear to simulate.

Actually, the dynamics of fiddling is very important to me, especially for those pesky ANR filters... because the headset is a physical system affected by many unknowns and variable conditions, the filter specs are also unknown. I don't know the exact requirements! Even if I did, the optimum response changes with just about everything.. not looking for optimum, looking for something that's good enough and is forgiving.

For math types, a situation like this is often deemed impossible, but the solution is actually very simple - knobs! Literally. Replace the resistors with trimmers and the capacitors with switches that select likely values, then turn the knobs until it works right. For our stuff we have no problem manually adjusting around 18 values, provided the selections are fairly restrained and each adjustment does something fairly specific and can be assigned a sensible label. Usually the one doing the adjusting doesn't know much about electronics, so I set it up so that with the knobs about halfway it should be in the ballpark; go from there.

Terry

Reply to
Terry Newton

It's kinda fun and I learned some things... and nobody found my accidental easter egg yet - hint never do head math at 1am when trying to lob the rocks back.... they tend to miss.

Terry

Reply to
Terry Newton

I really wouldn't assume too much about my design approach. Not being up on what SC (switched-capacitor) Spice models are available doesn't really translate into needing a lecture on board-level design.

As I noted, over 15 years I designed medical imaging front ends, audio power amps, pro audio mixing desks, telecom boards....

Got my first electronics kit aged around 11. Built quite a lot of stuff from 14 to 18, whence I went to uni.

Just curious, when did you build your first crystal set with high impedance headphones...?

Sure, having a good model is really useful. The "however", though, is that if LTspice did not exist, one would think they would be providing models for the existing Spices.

To analyse the worth of LTspice to LT, one needs to do an A/B comparison: with and without its existence, what would happen? If LTspice didn't exist, it does not mean you would not have been able to discover the effect you needed to discover. The models are independent of the Spice.

No idea how my comment translates to what you say here.

More plugging here.... I implemented a fair bit of filter design in SS, if you ever get around to trying it. It will design various passive and active low and high pass filters, and place the design on the schematic.

Butterworth, Bessel, Chebychev and Gaussian

Second screenshot from the bottom on the SuperSpice page.

Yeah, component variation. It's a big issue, even for board-level design, not just IC design.

That's why I said LTspice is useless for me. Sure, a lot of board-level design can get around modelling that in Spice, as board design can have access to 1% components, but still, having built-in Worst Case and Monte Carlo is really useful for all designs.

Sure. Again, you *assume* much about my approach to design. I am up on the math a tad, but I hardly use any math whatsoever when designing a chip. It's all Spice. That's what it does well: the math.

If you get the standard academic books, they will produce whacking big equations. The profs that wrote them say this is how you design. They are wrong. The equations are completely useless for design, and NO ONE does it that way. Too many variables and approximations to rationally deal with.

The idea is that you have to understand the principles: what changes in L and W do to speed, matching, accuracy and gain; what different topologies do, like cascode, folded cascode, Miller compensation, etc.; what ways there are to put poles and zeros in, say, an opamp feedback loop; and how they affect things such as PSRR.

You then do systematic parameter runs, and "intelligently" twiddle with the values, until you converge to the spec you find acceptable.

For example, for an LDO: maybe two comp caps and two comp resistors, which must be stable over a range of load caps from 0 to 10 uF and loads from 100 ohms to 1 Mohm, over all process corners, temperatures and power supply voltages. As I noted, maybe 100,000 total simulations.
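For the flavour of it, the kind of systematic run I mean looks roughly like the generic ngspice-style sketch below. This is not my actual flow, and SuperSpice drives this sort of thing through its own rerun-file mechanism; the device names and values are placeholders:

* sketch: rerun an AC analysis over a grid of load caps and load resistances
* (cload and rload stand for whatever load devices are in the deck)
.control
foreach cval 1e-12 1e-9 100e-9 10e-6
  foreach rval 100 10e3 1e6
    alter cload = $cval
    alter rload = $rval
    ac dec 20 1 10e6
  end
end
.endc

At each corner you would then measure the loop gain and phase margin and log it; multiply a handful of values by process corners, temperatures and supplies, and the 100,000-simulation figure arrives very quickly.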

And again, when one does Worst Case modelling in a Spice that supports it, one includes the variations that might be present on, say, the board. I might not know if the parasitic board capacitance is 5 pF or 10 pF, but I can put the min and max estimates into the models, press the button, and get a suite of results that will enable a decent first-pass design to be done.

They wrote marketing blurb to mislead the user. The SMPS models were, apparently, just a way to implement the few "digital" bits that would be done in any mixed-mode simulator with a genuine digital engine. A digital-mode J-K runs way faster than an analog implementation.

Mike did a brilliant job on the engine, but it's still an analog engine.

[quoted Sallen-Key schematic snipped]

Well, I could never be bothered to finish the Filter sets in SS, but it still can be useful as is.

There is a wad of freebie filter design stuff on the web today, anyway.

-- Kevin Aylward - SuperSpice

Reply to
Kevin Aylward

Well, computing isn't a good substitute for thought, but it sure is easier. ;)

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC 
Optics, Electro-optics, Photonics, Analog Electronics 

160 North State Road #203 
Briarcliff Manor NY 10510 

hobbs at electrooptical dot net 
http://electrooptical.net
Reply to
Phil Hobbs

Oh. I'm just an engineer.

--

John Larkin         Highland Technology, Inc 
picosecond timing   precision measurement  

jlarkin att highlandtechnology dott com 
http://www.highlandtechnology.com
Reply to
John Larkin

All the difficult problems pretty much need this twiddle approach. All the simple ways to make money were done a long time ago, like DOS 1.0.

To explain.

As previously noted, I design ASICs for "high performance" TCXOs and OCXOs.

Every single oscillator is calibrated by running through several temperature cycles, say -40 degC to +105 degC, i.e. banks of ovens/coolers.

The TCXOs have high-order Chebyshev polynomial function generators to generate the inverse control voltage to a VCXO to counter the xtal temperature variation, i.e. summing up a number of Chebyshev coefficients. No secrets here, so I can give the main details; all the basics were in patents 30 years ago.

The Chebyshevs are generated from Gilbert cells. Ideally, Gilbert cells are independent of temperature, but in practice they are not, and they are also not exact multipliers, due to other second-order effects.

However, all these variations don't matter; even the tempco of the bandgap reference that drives all the circuits doesn't matter. It doesn't matter because every transfer function over temperature, including the nonlinear temperature-to-voltage temp sensor, is measured in the calibration runs. So all the DACs controlling all of the coefficients are set to whatever they need to be for the actual final compensating curve to do what it needs to do. Typically, you can get from 20 ppm down to 50 ppb over temperature.
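As a rough sketch of the idea in symbols (my notation, not the ASIC's actual implementation), the compensation is a Chebyshev series in the measured temperature signal:

V_c(T) \approx \sum_{k=0}^{N} a_k \, T_k(x), \qquad x = (V_{temp}(T) - V_{mid}) / V_{span}

where the T_k are Chebyshev polynomials and the a_k are the DAC-set coefficients solved for in the temperature-cycle calibration. Because the whole chain, temp sensor included, sits inside that calibration, nothing in it has to be individually accurate or temperature-independent.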

All of this calibration, unlike your menial labour, is automated in software.

-- Kevin Aylward - SuperSpice

Reply to
Kevin Aylward

I don't mind f'ing around a bit to get things to work (and then finding the edges where they don't work). But I certainly want to understand, at least in my own hand-wavy way, why it works.

I guess I'm speaking of twiddling with solder and not code, but same sorta idea in my mind.... poking around is a good first strategy, if you don't quite know what is going on.

George H.

Reply to
George Herold

Speaking of UI, your site says the student version of SuperSpice has UI limitations. That seems odd, because usually apps have functional limitations. What is limited in the UI?

Reply to
Tom Del Rosso

Essentially, not a lot. "GUI functions" means the checkbox for parameter sweeps doesn't work. This is, of course, to limit some functionality to entice the user to upgrade. The astute user will notice that they can make a "rerun" file with the appropriate commands (described in the help) to run multi-component parameter sweeps, with an extra "include" file telling SS to use that rerun file.

Other than whatever produces the functional limitations described, the GUI behaves the same.

However... if someone sends the appropriate "I am very poor" begging email, I am liable to send the Pro version with only the non-commercial-use restriction. All money I get for SuperSpice goes to my 80-year-old mother.

If anyone wants the XSpice VC++ code, they can have that as well.

-- Kevin Aylward - SuperSpice

Reply to
Kevin Aylward
