eHam Forums => Station Building => Topic started by: KC2ELS on January 06, 2004, 10:23:16 PM

Title: How to use oscilloscope to check signal quality?
Post by: KC2ELS on January 06, 2004, 10:23:16 PM
Does anyone have any pointers on how to use an oscilloscope to check signal quality?  I'd like to ensure that the signals I transmit (CW and SSB on HF, FM (voice and digital) on VHF) are clean and solid.

URLs with documented procedures deeply appreciated, as well as suggested vendors for probes and the like.

Title: How to use oscilloscope to check signal quality?
Post by: WB2WIK on January 07, 2004, 12:47:46 PM
John, this could be a really costly endeavor.

FM signals cannot be "monitored" on an oscilloscope at all, as the modulation occurs in the phase or frequency domain, not the amplitude domain -- you must use a spectrum analyzer for this.  For a commercial one, be prepared to spend at least $3000 for an older, used piece of gear.  Of course, a Service Monitor can do the job quite well: these are a signal generator, audio oscillator, receiver, discriminator and video display all built into one instrument (much more complex than just an oscilloscope), and they're about the handiest piece of gear a ham could own.  The problem is their cost, which is typically $10K and up.
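To see numerically why a time-domain scope is blind to FM, here's a minimal NumPy sketch (the sample rate, carrier, tone and deviation values are illustrative, not from this thread):

```python
import numpy as np

fs = 1_000_000                       # sample rate, Hz (illustrative)
t = np.arange(0, 0.02, 1 / fs)
fc, fm, dev = 100_000, 1_000, 5_000  # carrier, tone, deviation, Hz

# FM puts the information in the instantaneous frequency, not the amplitude
phase = 2 * np.pi * fc * t + (dev / fm) * np.sin(2 * np.pi * fm * t)
analytic = np.exp(1j * phase)   # complex representation of the FM signal
fm_sig = analytic.real          # the real waveform a 'scope would display
envelope = np.abs(analytic)     # instantaneous amplitude

# Time domain: the envelope is flat, so a 'scope shows a featureless carrier
print(f"envelope variation: {envelope.max() - envelope.min():.6f}")

# Frequency domain: a spectrum analyzer sees the carrier plus many sidebands
spec = np.abs(np.fft.rfft(fm_sig)) / len(t)
print(f"spectral components above 1% of unity: {(spec > 0.01).sum()}")
```

The envelope variation prints as essentially zero while the spectrum shows more than a dozen significant components -- all the modulation information lives in the frequency domain.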

For CW, AM and SSB, you can use a standard time-domain scope if it has sufficient bandwidth for the job, and you don't need any "probes."  Just an RF sampling connector and a piece of coax are required, and you can make the sampling connector from an ordinary coaxial "Tee" adapter.  But even with a great scope, all you'll see is the modulated RF envelope on these modes.  That tells absolutely nothing about distortion.  To measure distortion, you need to demodulate the signal using an extremely linear detector that introduces no distortion of its own, and then make a measurement of "all modulation products other than the single intended one" using a distortion analyzer or equivalent that has the ability to deeply notch the modulation frequency.  It's not a simple job.

Measuring intermodulation distortion, a key parameter for SSB work, again requires a spectrum analyzer as well as a way to modulate the transmitter with well-controlled, very-low-distortion signal sources (not your voice).
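As a numerical illustration of what the spectrum analyzer would reveal in a two-tone test -- this is a toy NumPy model, not a measurement procedure; the tone frequencies and the cubic nonlinearity coefficient are made up:

```python
import numpy as np

fs = 48_000
t = np.arange(0, 1.0, 1 / fs)        # 1 s of audio -> 1 Hz FFT bins
f1, f2 = 1_800, 2_000                # two-tone test frequencies, Hz

two_tone = 0.5 * np.sin(2 * np.pi * f1 * t) + 0.5 * np.sin(2 * np.pi * f2 * t)

# Toy model of a slightly nonlinear amplifier (coefficient is hypothetical)
out = two_tone + 0.05 * two_tone ** 3

spec = np.abs(np.fft.rfft(out)) / len(t)

def level_db(freq_hz):
    return 20 * np.log10(spec[freq_hz] + 1e-12)   # bins are 1 Hz wide here

tone_db = level_db(f1)
imd3_db = level_db(2 * f1 - f2)      # third-order product at 1600 Hz
print(f"IMD3 is {tone_db - imd3_db:.1f} dB below each tone")
```

The cubic term creates third-order products at 2f1-f2 and 2f2-f1 (1600 and 2200 Hz here), which fall inside the passband -- exactly the products an SSB two-tone IMD test is designed to expose.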

Because of these obstacles, the vast majority of hams don't even attempt to make such measurements and rely instead on things we can easily measure, such as amplifier linearity based on I/O slope.


Title: How to use oscilloscope to check signal quality?
Post by: KC2ELS on January 08, 2004, 12:26:44 AM
Now that you mention it, I can see that FM signals would need a frequency-domain device, while my oscilloscope is definitely a time-domain device.

I am a little nervous about connecting the output of my rig directly to the oscilloscope.  According to my MFJ, the rig generates 100W.  Shouldn't I have some serious attenuation in place so I don't zorch anything?

I was planning on using a two-tone generator to check for distortion, instead of my voice.  Wouldn't that work?

Title: How to use oscilloscope to check signal quality?
Post by: WB2WIK on January 08, 2004, 02:16:37 PM
You don't plug "100W" into your oscilloscope.  You connect the oscilloscope through a series resistor between a coaxial tap point and the 'scope's vertical input jack.  Most 'scopes have input Z of 1 to 10 Megohms, so they don't need "power" to drive them, only voltage.  Most 'scopes can withstand an input voltage to the vertical amp of at least 20Vpk, many will accept a lot more than that.  Depends on the scope!  If you're going to use the vertical amp (which is what the front panel input jack leads to), the 'scope's bandwidth must be substantially wider than your operating frequency (RF), otherwise the 'scope itself will create distortion and display it, and you won't know if the distortion you see is really "you," or your 'scope causing it.

Typically, for operation up to 30 MHz (all HF bands), you'd want to use a 'scope having a flat input response to at least 150 or 200 MHz.  If your 'scope doesn't have that high a vertical amp range, then you'd need to either find a different 'scope, or connect your signal directly to the vertical deflection plates of the 'scope's CRT, which usually involves a modification to the 'scope.

You can normally connect your transmitter to your 'scope as follows:

Use a coaxial "T" adapter on the output port of your transmitter.  Connect your normal antenna (or a dummy load) to one port of that "T"; it absorbs literally *all* the power you're transmitting.  Connect the other port of that "T" to the vertical input of your 'scope, using a small, inexpensive adapter that simply divides voltage.  If you have a 10x 'scope probe (usually creates 11 Megohms total input Z, including the 'scope itself and the probe), that should divide the voltage down enough that you can "hang" the probe directly onto the second port of the "T" adapter center pin (and ground, using the probe's ground clip).

At 100W, providing you're terminated in a 50 Ohm load (your antenna or dummy load), the peak RF voltage the 'scope probe will see is 100V.  That will be divided by 10, with a 10x probe, to be only 10V peak at the 'scope input.  I don't know of any commercial oscilloscope made that cannot easily handle a 10V peak signal.
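The arithmetic above is easy to verify: across a resistive load, Vpk = sqrt(2 * P * R).  A quick sketch (function name is just for illustration):

```python
import math

def peak_rf_voltage(power_w, load_ohms=50.0):
    """Peak RF voltage across a resistive load: Vpk = sqrt(2 * P * R)."""
    return math.sqrt(2 * power_w * load_ohms)

v_tap = peak_rf_voltage(100)    # 100 W into 50 ohms
v_scope = v_tap / 10            # after a 10x probe
print(f"{v_tap:.0f} V peak at the tap, {v_scope:.0f} V at the 'scope")
# -> 100 V peak at the tap, 10 V at the 'scope
```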

As for using tone modulation, that's cool, but you still need a spectrum analyzer to measure distortion -- a scope won't do anything other than show you the tone modulated RF envelope, which may look very pretty but cannot measure distortion.


Title: How to use oscilloscope to check signal quality?
Post by: K8AG on June 10, 2004, 01:50:55 PM

Unless you are designing equipment and doing development experiments, the best way to determine signal "quality" is to get on the air and ask.  Hams are notorious for having opinions and usually love to share them.

That was mine.

73, JP, K8AG