Using electrostatic deflection requires a CRT with deflection plates. Such a tube would have (I believe) a thicker neck. Also, the output transistors would have to swing at least a couple hundred volts to deflect the beam.
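A quick back-of-the-envelope check (my numbers, not from any particular tube): for small angles the spot deflection from a pair of plates is roughly

$y \approx \dfrac{L D}{2 d V_a} V_d$

where $L$ is the plate length, $d$ the plate spacing, $D$ the plate-to-screen distance, $V_a$ the accelerating voltage, and $V_d$ the differential plate voltage. With scope-like values ($L$ = 2 cm, $d$ = 0.5 cm, $D$ = 30 cm, $V_a$ = 2 kV) that works out to about 0.3 mm of deflection per volt, so sweeping even a small screen edge to edge already takes a few hundred volts. A TV tube runs $V_a$ roughly ten times higher and needs a much wider scan angle, which only makes the required swing worse.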
My guess is that Sony et al. stick with magnetically deflected tubes because they've been the standard for 60 years. That's the kind of tube they build and the kind of deflection circuit they design.
However "correct" your theories might or might not be, they run up against industry practice. Electrostatic deflection is considered obsolete, at least with respect to video displays. And pretty soon CRTs will be obsolete with respect to video displays.
I have a Toshiba CZ-3299K HDTV that's about 12 years old. It has a 32" magnetically deflected CRT and runs at four times the standard NTSC horizontal rate (4 x 15.734 kHz, or about 63 kHz) without problems.
I don't think so. Try changing the screen brightness of a 'scope's CRT. Does the deflection change?
Again, I don't think so. If this were true, the image on any magnetically deflected CRT would show severe geometric distortion that varied with image brightness. It doesn't.
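For what it's worth, the standard small-angle result backs this up: an electron accelerated through $V_a$, passing through a deflection field of length $L$ and flux density $B$, is bent by roughly

$\theta \approx B L \sqrt{\dfrac{e}{2 m V_a}}$

Beam current appears nowhere in that expression. Each electron's path depends only on its charge, mass, speed, and the field; to first order (ignoring space-charge repulsion, which is negligible at CRT beam currents), changing the brightness changes how many electrons follow the path, not where the path goes.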
Are you sure? How can you change beam current without changing brightness?