I know just enough about electricity to ask stupid questions. This seems like a very friendly group, so let me try.
I bought a simple, cheap ceiling fan here in SE Asia; it has a speed control unit (Off-1-2-3). The speed control has four external wires: two to the fan, two to the house supply (phase and neutral).
After a lightning strike, the fan ran at maximum speed (3) when minimum speed (1) was selected! I replaced the little speed-control box; problem solved.
I'd wondered how such a speed control worked, and got my chance to find out when replacing it -- I could see the circuit, as well as a written diagram. It was very simple: at lower speeds, inductors were switched in series with the fan. (There was also a high-valued resistor.)
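To make my question concrete, here's a toy Python model of what I think the box is doing. Every number in it is invented (mains voltage, the motor's resistance and inductance, the switched-in inductances) -- nothing is measured from my actual fan, so please correct the model if it's wrong:

```python
# Toy model of the speed box as I read the diagram: at lower speeds an extra
# inductor sits in series with the fan motor. ALL component values are made up.
import math

MAINS_V = 230.0                 # rms volts (assumed typical mains here)
FREQ_HZ = 50.0                  # mains frequency (assumed)
OMEGA = 2 * math.pi * FREQ_HZ

# Crude single-phase motor model: winding resistance in series with its own inductance.
MOTOR_R = 600.0                 # ohms (made up)
MOTOR_L = 1.5                   # henries (made up)
motor_z = complex(MOTOR_R, OMEGA * MOTOR_L)

# Extra series inductance switched in at each speed setting (made-up values).
settings = {"3 (high)": 0.0, "2 (med)": 2.0, "1 (low)": 4.0}   # henries

for name, extra_l in settings.items():
    total_z = motor_z + complex(0.0, OMEGA * extra_l)   # motor plus added inductor
    i_rms = MAINS_V / abs(total_z)                       # rms current drawn from the mains
    v_motor = i_rms * abs(motor_z)                       # rms voltage left across the motor itself
    phi_deg = math.degrees(math.atan2(total_z.imag, total_z.real))   # phase angle seen by the mains
    print(f"speed {name}: {i_rms*1000:5.1f} mA, {v_motor:5.1f} V across motor, phase {phi_deg:4.1f} deg")
```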
I decided (wrongly?) that the inductors, in effect, delayed the current waveform, so that at low speed the fan was still getting the same volts and amperes as before, but fewer watts because of the phase difference. Does this make sense?
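If I've got the definitions right, the arithmetic I'm imagining looks like this (again, purely illustrative numbers, not measurements from my fan): the volt-amperes stay at V times I, while the watts shrink by the cosine of the phase angle:

```python
# Real power vs. apparent power for an assumed rms voltage and current,
# at a few phase angles between them. Numbers are illustrative only.
import math

V_RMS = 230.0            # rms volts (assumed)
I_RMS = 0.20             # rms amps (assumed)

for phi_deg in (0, 30, 60):
    phi = math.radians(phi_deg)
    apparent = V_RMS * I_RMS                     # volt-amperes (VA)
    real = V_RMS * I_RMS * math.cos(phi)         # watts (W)
    print(f"phi = {phi_deg:2d} deg: {apparent:5.1f} VA, {real:5.1f} W, power factor {math.cos(phi):.2f}")
```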
I did Google searches like "fan variable power inductor" and saw many ways to slow down a fan, but none of them seemed to be this way. (They spoke of $40 solutions, much more expensive than mine.)
The little puzzle got me thinking and Googling. I learned how "watt" and "volt-ampere" have different definitions. I guess the power company supplies watts but bills me for volt-amperes because they're easier to measure. The fan doesn't cost much to run, but I guess I'm billed at a higher rate (volt-amperes) than I actually consume (watts), right? If other appliances run concurrently, perhaps that would somehow "average" the current phase and minimize the volt-amperes wasted?
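To put numbers on that "averaging" hunch: as I understand it, the real powers (watts) and reactive powers (VARs) of concurrent loads add up separately, and the combined power factor falls out of the totals. The appliances and their power factors below are invented, just to see the effect:

```python
# Combining several loads: real powers and reactive powers add separately,
# and a big resistive load (the kettle) pulls the overall power factor toward 1.
# All loads and power factors below are invented.
import math

loads = [("ceiling fan", 40.0, 0.60),   # (name, real watts, power factor) -- made up
         ("fridge",     120.0, 0.75),
         ("kettle",    1800.0, 1.00)]

p_total = 0.0   # total real power, W
q_total = 0.0   # total reactive power, VAR
for name, p, pf in loads:
    phi = math.acos(pf)
    p_total += p
    q_total += p * math.tan(phi)     # reactive power contributed by this load

s_total = math.hypot(p_total, q_total)           # combined volt-amperes
print(f"total: {p_total:.0f} W, {s_total:.0f} VA, combined power factor {p_total/s_total:.2f}")
```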
I'm afraid I suffer from serious misconceptions and this whole post will seem silly....