Okay, this is kind of a dumb question, but here goes.
Let's say you want to convert your basic PC monitor from raster scan to vector operation.
You pull the yoke connector off the main board and find that it still runs okay, except that you've now burned a little spot of phosphor in the center of the screen (unless you were prescient enough to back off the brightness, contrast, and the three screen controls first).
Then you hook a DC supply up to the V and H yoke coils and find that it takes about 3 amps for full-screen deflection in the vertical direction, and about 10 amps in the horizontal.
(I think that's because the horizontal deflection has to happen much faster, therefore lower inductance in the H coil, therefore fewer turns and less field per amp.)
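If that reasoning is right, the scaling works out: deflection field goes as ampere-turns, N*I, while inductance goes as N^2, so a coil with a tenth the inductance has about 1/sqrt(10) the turns and needs about 3x the current for the same deflection. That's at least consistent with the 3 A vs. 10 A split I measured.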
So we need some pretty hefty drivers, many amps plus many volts if we want fast deflection.
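Rough numbers, with the inductance being a pure guess since I haven't measured it: if the H coil is somewhere around 100 uH and we want to slew the full -10 A to +10 A in 100 us, then V = L*dI/dt = 100e-6 * (20 / 100e-6) = 20 V, plus whatever the winding resistance drops. So something on the order of a big audio amplifier per axis.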
Not a huge problem so far.
But now we realize we don't want to burn the screen when drawing short or slow vectors, and we don't want the lines to be invisible when drawing long or fast ones.
So we need some sort of Z-axis brightness modulation, proportional somehow to the "writing rate", in inches per millisecond or thereabouts.
Anybody have any idea how to compute this on the fly, as it were?
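To make the question concrete, here's the sort of per-vector computation I'm imagining, just a C sketch; the DAC hook, the 14-inch screen size, and the scale constants are all made up:

    #include <math.h>
    #include <stdio.h>
    #include <stdint.h>

    /* Sketch of per-vector Z-axis scaling.  All the names and
     * constants here are made up for illustration -- there's no
     * real hardware behind them. */

    #define SCREEN_INCHES       14.0   /* full-scale deflection, a guess */
    #define Z_FULL_SCALE        255.0  /* pretend 8-bit Z DAC            */
    #define MAX_RATE_IN_PER_MS  84.0   /* fastest expected writing rate  */

    /* Placeholder for whatever actually loads the Z (intensity) DAC. */
    static void z_dac_write(uint8_t value)
    {
        printf("Z DAC <- %u\n", value);
    }

    /* dx, dy: vector size as a fraction of full-scale deflection (0..1)
     * dt_ms:  how long the beam takes to draw it, in milliseconds      */
    static void set_beam_brightness(double dx, double dy, double dt_ms)
    {
        double inches = hypot(dx, dy) * SCREEN_INCHES;
        double rate   = inches / dt_ms;          /* writing rate, in/ms */

        /* The beam dwells 1/rate per inch of trace, so making beam
         * current proportional to rate keeps line brightness roughly
         * constant regardless of how fast the vector is drawn.       */
        double z = (rate / MAX_RATE_IN_PER_MS) * Z_FULL_SCALE;

        if (z > Z_FULL_SCALE) z = Z_FULL_SCALE;  /* clamp fast vectors */
        if (z < 0.0)          z = 0.0;

        z_dac_write((uint8_t)(z + 0.5));
    }

    int main(void)
    {
        set_beam_brightness(1.0, 1.0, 0.167);  /* full diagonal: clamps bright */
        set_beam_brightness(0.1, 0.0, 0.167);  /* short vector, same time: dim */
        return 0;
    }

In practice the gun and phosphor aren't linear, so that linear scaling would probably end up as a lookup table tweaked by eye. And I suppose the all-analog equivalent would be differentiating the X and Y drive waveforms and summing their magnitudes into the Z amplifier.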
Ideally I'd like to be able to draw 6000 full-length vectors per second.
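That's about 167 us per vector, so if the tube really is in the 14-inch range, a full-screen vector means a peak writing rate of roughly 14 in / 0.167 ms, call it 84 in/ms, and slow vectors could easily sit two decades below that. So the Z modulation needs a fair bit of dynamic range.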
A quick web search didn't turn up anything promising.
Regards,
George