The TV in our teenagers' playroom has only an RF input, and the kids have a commodity-grade VCR and an entry-level DVD player hooked up to it.
The RF output from the VCR feeds the input of the TV, and the DVD player's video and audio outputs are connected to the corresponding inputs on the VCR.
All's well when playing video tapes on the VCR, but when playing a DVD the TV image "fades" or changes contrast slowly, with a cycle time of about 20 seconds. It's not bad enough to make you miss anything on the screen, but it's noticeably annoying.
I wrote it off as some weird kind of incompatibility anomaly, and since I don't have to watch DVDs in that room, I didn't bother trying to eliminate the problem.
Yesterday I was at the home of a friend who has a similar setup, but with different brands of equipment, and when he tried to show me a scene on a DVD, his TV image faded in and out the same way ours does.
That got me wondering whether that fading effect is a "well-known problem," and if it is, what's the easiest way to get things working "normally"?
Thanks guys,
Jeff