I'm quite shocked at what you're saying. Like me you must have worked with monitors, but I doubt much with alternative video sources.
Old CRT monitors are very forgiving on line frequency. You may have seen the picture size breathe with brightness, and perhaps a little with frequency. While the flyback time is fixed by the self-resonance you mention, the scanning period isn't, and the rate of change of the scanning magnetic field is largely determined by the power supply voltage across the inductive coils. You may also remember all the corrective inductors, some designed to saturate earlier than others, and the E-W modulation circuits to reduce pin-cushion and other related effects.
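To put a number on why the scan amplitude tracks supply voltage and scan period rather than any precise frequency: for an (idealised) deflection coil the current ramps at dI/dt = V/L. A quick sketch, with purely illustrative component values I've picked for the example:

```python
# Why picture width tracks supply voltage and scan period on a magnetically
# deflected CRT, assuming an idealised (lossless) deflection coil.
# All values below are hypothetical, chosen only to illustrate the scaling.
L = 1.5e-3        # deflection coil inductance, H (illustrative)
V = 24.0          # supply voltage across the coil, V (illustrative)
t_scan = 52e-6    # active scan period, s (roughly a PAL active line)

# For an inductor, dI/dt = V / L, so the deflection-current swing over
# one scan (and hence the picture width) is proportional to V * t_scan / L:
di = V / L * t_scan
print(f"Deflection current swing: {di:.3f} A")
```

The point being that a modest change in line period just changes the width slightly, exactly the "breathing" described above; nothing in this loop demands a precise frequency.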
In short, a monochrome monitor didn't give a jot about the precise frequency; after all, they were built to a price, with component tolerances to suit.
In PAL and NTSC systems things became very different: accuracy is now required to ensure colour lock in the monitor. Because of the fixed relationship between the colour subcarrier and the line rate, a video source would aim to be within a few ppm for broadcast and 100 ppm or so for most other applications.
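The subcarrier-to-line-rate relationship can be made concrete with the standard nominal figures (NTSC: subcarrier = 455/2 × line rate; PAL: subcarrier = 1135/4 × line rate + 25 Hz). A quick sketch of the arithmetic:

```python
# The colour standards lock the line rate to the subcarrier, which is why
# a colour source needs frequency accuracy a mono monitor never demanded.

# NTSC: f_sc = (455/2) * f_H, with f_sc nominally 3.579545 MHz
ntsc_fsc = 3_579_545.0
ntsc_fh = ntsc_fsc * 2 / 455
print(f"NTSC line rate: {ntsc_fh:.3f} Hz")

# PAL: f_sc = (1135/4) * f_H + 25 Hz, with f_H nominally 15625 Hz
pal_fh = 15_625.0
pal_fsc = (1135 / 4) * pal_fh + 25
print(f"PAL subcarrier: {pal_fsc:.2f} Hz")

# A 100 ppm tolerance on the NTSC line rate is under 2 Hz:
tol = ntsc_fh * 100e-6
print(f"100 ppm of the NTSC line rate: {tol:.2f} Hz")
```

So "100 ppm or so" really does mean holding the line rate to within a couple of hertz, a constraint that simply didn't exist for the mono monitors discussed above.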
I keep an open mind on whether a CGA card can make a mono monitor die. My instinct is that the monitor would have died at much the same time regardless of video source quality, especially if all the linearisation and E-W correction was still functioning as normal.