What I gleaned from the excellent answers in the original "VSWR Doesn't Matter?" thread is that high VSWR doesn't really matter in a lossless transmission-line environment between a transmitter's antenna tuner and the antenna: any reflected RF energy simply "bounces" back and forth between the tuner's output impedance and the antenna's input impedance until it is finally, completely radiated from the antenna without loss.
But then why does the concept of "mismatch loss" exist in reference to antennas? A quick calculation shows that if a transmitter outputs 100 watts into a TX antenna whose impedance produces a 10:1 VSWR -- over lossless transmission line -- the mismatch loss in this "lossless" system would be 4.81 dB (reflected power 66.9 watts, return loss 1.74 dB)!
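Just to make the arithmetic explicit, here's a quick Python sketch of the standard VSWR-to-mismatch-loss formulas (the function name and structure are mine, but the formulas are the textbook ones):

```python
import math

def mismatch_figures(vswr):
    """Reflection coefficient magnitude, reflected-power fraction,
    mismatch loss (dB), and return loss (dB) for a given VSWR."""
    gamma = (vswr - 1) / (vswr + 1)        # |Gamma|, reflection coefficient magnitude
    refl = gamma ** 2                      # fraction of incident power reflected
    ml_db = -10 * math.log10(1 - refl)     # mismatch loss in dB
    rl_db = -20 * math.log10(gamma)        # return loss in dB (positive by convention)
    return gamma, refl, ml_db, rl_db

gamma, refl, ml, rl = mismatch_figures(10)
print(f"|Gamma| = {gamma:.3f}")                    # 0.818
print(f"Reflected: {100 * refl:.1f} W of 100 W")   # 66.9 W
print(f"Mismatch loss: {ml:.2f} dB")               # 4.81 dB
print(f"Return loss: {rl:.2f} dB")                 # 1.74 dB
```

These figures match my numbers above for a 10:1 VSWR.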
Since mismatch loss is the "amount of power lost due to reflection", and acts as if "an attenuator with a value of the mismatch loss were placed in series with the transmission line", I would think that VSWR *definitely* matters, and not just on highly lossy lines either. But here again, I'm probably not seeing the entire picture. What am I missing?
Confused!
-Bill