A fellow Ham recently told me that the reflections from a high SWR cause the reflected modulated signal to interfere with the forward modulated signal and cause distortion...
If the transmitter sees a load impedance other than the value
for which it is designed, then the distortion level
may increase.
But the description of the mechanism is totally out to lunch.
It isn't the SWR on the line that causes the distortion: it is quite
possible to operate a line at high SWR and still have no additional
distortion. What matters is the load impedance that the transmitter
sees.
My job used to involve measuring the distortion levels of WiFi
transmitter chips at different load impedances. This becomes
particularly important with 256QAM (quadrature amplitude
modulation with 256 targets). Each "target" is a particular value
of both amplitude and phase, and represents an 8-bit value.
The standard measurement was Error Vector Magnitude (EVM)
at a particular power level, and over some range of load impedances.
EVM is measured by generating all the 256 possible target values
and calculating how close to the target the transmitted signal is.
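The measurement described above is easy to sketch in a few lines. This is a minimal illustration in my own notation (the function names and the impairment model are mine, not from any particular test suite): build the 256 ideal targets as a 16 x 16 grid of I/Q values, then take the RMS distance between what was transmitted and the targets.

```python
import numpy as np

def qam256_constellation():
    """The 256 ideal targets: a 16x16 grid of I/Q values,
    normalized to unit average power."""
    levels = np.arange(-15, 16, 2)                # -15, -13, ..., 15
    i, q = np.meshgrid(levels, levels)
    targets = (i + 1j * q).ravel()
    return targets / np.sqrt(np.mean(np.abs(targets) ** 2))

def evm_percent(measured, ideal):
    """RMS error vector magnitude, as a percentage of RMS signal level."""
    err = measured - ideal
    return 100 * np.sqrt(np.mean(np.abs(err) ** 2) /
                         np.mean(np.abs(ideal) ** 2))

ideal = qam256_constellation()

# A toy impairment: a little gain error plus a little noise, standing in
# for what a non-ideal load impedance does to the transmitted points.
rng = np.random.default_rng(0)
measured = 0.98 * ideal + 0.005 * (rng.standard_normal(256) +
                                   1j * rng.standard_normal(256))
print(f"EVM = {evm_percent(measured, ideal):.2f}%")
```

The compression effect mentioned below shows up in exactly this kind of plot: the outer (highest-amplitude) targets fall short, and the error vectors there grow.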
In the real world, EVM is how a device determines the reliability
of the received signal - when it gets too high, the devices switch
to a lower data rate that can tolerate more error, and/or higher
power.
We had to verify that every chip we shipped met the required
specifications. It was fascinating to see how the display of targets
vs. actual values shifted with load impedance and output power.
When the transmitted signal went into compression, we could
see the higher amplitude signals were short of their targets. Some
load impedances actually improved the EVM, but the requirement
was that it meet spec over a particular range of load impedance
values (perhaps corresponding to a 2 : 1 SWR, a 10 dB return loss,
or whatever).
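The "2 : 1 SWR" and "10 dB return loss" in that parenthetical are essentially the same spec, since return loss is just the reflection coefficient expressed in dB. A quick check (my own helper function, assuming voltage SWR):

```python
import math

def return_loss_db(swr):
    """Return loss in dB for a given voltage SWR."""
    gamma = (swr - 1) / (swr + 1)   # reflection coefficient magnitude
    return -20 * math.log10(gamma)

print(f"{return_loss_db(2.0):.2f} dB")   # ~9.54 dB, roughly the "10 dB" spec
```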
Now, this was a pretty extreme case, where phase and amplitude
were supposed to be precise values for high speed digital encoding.
With a ham SSB transmitter, there may be some increase in high
order IMD, depending on how hard the transmitter is driven and
the ALC action. That's not likely to be an issue for the station you
are talking to, but rather for those operating elsewhere in the band.
Details will depend greatly on the load impedance and output
power, as well as the transmitter design.
With a digital signal, turning down the output so the ALC isn't
active should keep the transmitter in its linear range, so it
shouldn't add any distortion.
I've carefully said "load impedance" in this explanation rather than
"SWR". It has nothing to do with the SWR on the feedline, just the
resulting impedance that the transmitter sees. For example, an
SWR of 2 : 1 might yield a load impedance of 100 + j0 ohms, or
25 + j0 ohms, or 75 - j35 ohms. The transmitter likely will
respond differently to each of those values: some might reduce
distortion, some may increase it, depending on the design.
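Those three example loads can be checked with a few lines of arithmetic. This sketch assumes a 50-ohm reference impedance (the usual ham case); all three work out to roughly 2 : 1:

```python
import math

def swr(z_load, z0=50.0):
    """Voltage SWR of a complex load on a line of characteristic
    impedance z0."""
    gamma = abs((z_load - z0) / (z_load + z0))   # reflection coefficient
    return (1 + gamma) / (1 - gamma)

for z in (100 + 0j, 25 + 0j, 75 - 35j):
    print(f"{z}: SWR = {swr(z):.2f}")
```

The point stands out in the code: wildly different complex impedances map to the same single SWR number, which is exactly why SWR alone can't predict the transmitter's behavior.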
If you have a tube amp with an adjustable output network,
tuning the amp for proper current and output power should
handle any issues with load impedance (assuming it is
within the range that the amp is capable of matching).
tl;dr
A non-optimal load impedance may increase the transmitter
distortion, but the SWR on the feedline itself has nothing to
do with it.