I'm working on a project that involves replacing some existing electronics with electronics I've designed. It controls the speed of brushed DC motors. Since the original equipment uses knowledge of the applied voltage and the motor current to infer back EMF, and therefore motor speed, that's what I've done too.
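In case it helps, here's a minimal sketch of the estimator I'm describing. The names and units are mine, not the original firmware's; I'm assuming r_cal is the calibrated winding resistance in ohms and ke_cal is the back-EMF constant in volts per RPM:

    /* Back-EMF speed estimate for a brushed DC motor.
     * The applied voltage minus the resistive drop is the back EMF,
     * and back EMF is proportional to speed through Ke.
     */
    float estimate_speed_rpm(float v_applied, float i_motor,
                             float r_cal, float ke_cal)
    {
        float back_emf = v_applied - i_motor * r_cal;  /* volts */
        return back_emf / ke_cal;                      /* RPM   */
    }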
My problem is that my equipment consistently senses the speed as faster than it really is, by a constant amount -- except when the motor is stalled, when the reading is correctly zero. So I end up servoing the motor speed to a figure that's 20-30 RPM higher than desired.
It's not just a miscalibration of the motor torque constant or the resistance -- if that were the case, the speed error would depend on the commanded speed or the applied torque. I can calibrate out any dependence on torque (by adjusting the calibrated motor resistance), and I can get a 1:1 correspondence between increments in the commanded speed and increments in the actual speed. But I'm left with this @#$% offset.
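For what it's worth, here's the error algebra I'm leaning on above (the symbols are mine: $V$ is applied voltage, $I$ is motor current, $R$ and $K_e$ are the true resistance and back-EMF constant, and the "cal" subscripts are my calibrated values):

    \omega_{est} = \frac{V - I R_{cal}}{K_{e,cal}}, \qquad
    \omega_{true} = \frac{V - I R}{K_e}

    \omega_{est} - \omega_{true}
      = \left(\frac{K_e}{K_{e,cal}} - 1\right)\omega_{true}
      + \frac{I\,(R - R_{cal})}{K_{e,cal}}

So an error in R_cal shows up as a current-dependent (i.e. torque-dependent) term, and an error in Ke_cal scales with speed. Neither term produces a constant offset, which is why I don't think this is a simple parameter miscalibration.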
Has anyone seen this? Anyone care to hazard a guess at what's going on?