DC motor design rules of thumb?

I asked this in basics, but didn't get any opinions. Trying it over here if you don't mind.

Anyone know of a site that describes DC motor tradeoffs? I know torque is proportional to current and turns, and there are practical problems with wire size and resistance that limit the number of turns, etc. Let's say

I have a 3-pole armature and a 7-pole armature wound with the same number of turns and wire size. Will they have the same torque? Which one will have greater rpm? I know the inductance goes up with turns. Seems like there should be a rule about the L/R time constant and rpm somewhere in the rule-of-thumb list. Some big hefty motors can't be PWM'd

faster than a kilohertz or so... is this because of large L? I have an 8.5 W motor that reads 12 ohms and 7 mH. Guess that's 1 amp of stall current at 12 volts... but that exceeds the continuous power rating. If it's rated at 8000 rpm, that's milliseconds per rev. Kind of clobbers my L/R time constant idea, huh?
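[Editor's note: here is a quick back-of-the-envelope check on those numbers, as a minimal Python sketch. It assumes a 12 V supply and uses the 12 ohm / 7 mH / 8.5 W / 8000 rpm figures quoted above.]

    # Back-of-the-envelope figures for the 8.5 W motor quoted above.
    V_SUPPLY = 12.0      # V, assumed supply voltage
    R = 12.0             # ohm, measured armature resistance
    L = 7e-3             # H, measured armature inductance
    P_RATED = 8.5        # W, continuous rating
    RPM_RATED = 8000.0   # rev/min

    i_stall = V_SUPPLY / R          # locked-rotor (stall) current
    p_stall = V_SUPPLY * i_stall    # copper dissipation at stall
    tau_e = L / R                   # electrical (L/R) time constant
    t_rev = 60.0 / RPM_RATED        # time for one revolution at rated speed

    print(f"Stall current:       {i_stall:.2f} A")     # ~1.0 A
    print(f"Stall dissipation:   {p_stall:.1f} W")     # ~12 W, above the 8.5 W rating
    print(f"L/R time constant:   {tau_e*1e3:.2f} ms")  # ~0.58 ms
    print(f"Time per revolution: {t_rev*1e3:.2f} ms")  # ~7.5 ms at 8000 rpm

So the L/R constant (~0.6 ms) is roughly an order of magnitude shorter than one revolution at rated speed; on this motor it says more about how the current responds to PWM edges than about the rpm.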

Reply to
BobG

Thanks for responding. The Curtis golf cart controllers PWM at 1.5 kHz or 15 kHz and switch back and forth. Wonder what the frequency selection criterion is? Any golfers out there? Do you hear the motor whine when it's slow or fast?

Reply to
BobG

That doesn't sound right. I can believe there might be a problem with insulation on some motors, but that has more to do with edge speed than with PWM frequency. Although I can see how the two might get confused.

There is an advantage to getting the switching frequency above 20 kHz or so, in that it gets out of the audible range. There's not much advantage to going much above that, since all you do is increase the switching losses, unless you have a low-inductance motor.
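[Editor's note: as a rough illustration of the inductance point, peak-to-peak current ripple in a PWM'd armature scales inversely with both L and the switching frequency. The sketch below assumes an ideal bridge and neglects armature resistance and back-EMF within one switching period; the 100 uH motor is a hypothetical low-inductance example for comparison.]

    # Rough PWM current-ripple estimate: during the on-time di/dt ~= V*(1-D)/L,
    # so ripple ~= V * D * (1 - D) / (L * f_pwm) for an ideal bridge,
    # neglecting R and back-EMF within one switching period.
    def ripple_pp(v_bus, inductance, f_pwm, duty=0.5):
        return v_bus * duty * (1.0 - duty) / (inductance * f_pwm)

    V_BUS = 12.0   # V, assumed bus voltage
    L_HI = 7e-3    # H, the 7 mH motor discussed above
    L_LO = 100e-6  # H, hypothetical low-inductance motor

    for f in (1.5e3, 15e3, 20e3, 40e3):
        print(f"{f/1e3:5.1f} kHz: "
              f"ripple = {ripple_pp(V_BUS, L_HI, f)*1e3:6.1f} mA (7 mH), "
              f"{ripple_pp(V_BUS, L_LO, f)*1e3:7.1f} mA (100 uH)")

The 7 mH motor's ripple is already modest at a few kHz, while the low-inductance motor is the case where pushing the frequency past audibility buys you something besides silence.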

If the power requirements get high enough you might have a problem getting the power stage to switch that fast but that's not a motor issue per se.

The only things I see limiting switching frequency are motor insulation, power stage limitations, and regulatory requirements on EMI emissions :)

Have I missed something?

Robert

Reply to
R Adsett

Actually, no. The lower frequency is to allow control during plugging (effectively driving the motor in reverse). The controller has a finite lower limit to the pulse width it can apply, and when that is applied at the default frequency the motor current becomes too large to control and possibly too large for the controller to handle without failing. By cutting the frequency the way they do, the minimum achievable duty cycle is cut to about a tenth of what it was, and control can be maintained during plugging.

The reason you hear it at the low frequency is that the controller must assume it is plugging until it can establish otherwise; anything else is unsafe. This happens on every direction change or power cycle, and on the simple Curtis controllers those are the same thing.

There are more subtle points to consider as well, but that's the gist of it, and they are not the only controller to do this.

It's nothing to do with the motor, but rather with control and power-section limitations. If you could provide a sufficiently fine-grained PWM you could plug at 15 kHz and would not need to cut back to 1.5 kHz.
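[Editor's note: to put numbers on the duty-cycle point, here is a small sketch. The 6.7 us minimum on-time is a hypothetical round number chosen for illustration, not a Curtis specification.]

    # How a fixed minimum on-time maps to duty cycle at the two PWM frequencies.
    # The 6.7 us minimum on-time is a hypothetical example value, not a Curtis spec.
    T_ON_MIN = 6.7e-6  # s, assumed smallest pulse the power stage can produce

    for f_pwm in (15e3, 1.5e3):
        period = 1.0 / f_pwm
        d_min = T_ON_MIN / period
        print(f"{f_pwm/1e3:5.1f} kHz: period = {period*1e6:6.1f} us, "
              f"minimum duty cycle = {d_min*100:.2f} %")

Same minimum pulse, but at 1.5 kHz it is a tenth of the duty cycle it was at 15 kHz, which is what keeps the plugging current controllable.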

I don't know of any reason armature RPM would determine PWM frequency.

Robert

Reply to
R Adsett

Armature rpm will determine what the best PWM freq is to use.

Reply to
Martin Riddle
