eHam

eHam Forums => Misc => Topic started by: NO2A on September 26, 2012, 01:00:18 PM



Title: radio sensitivity
Post by: NO2A on September 26, 2012, 01:00:18 PM
A typical Heath-era rig might have a sensitivity rating of 1mv; today's typical rigs are rated at .2mv. If you were using two rigs with those ratings with the same antenna, conditions, etc., what difference would that make as far as "S"-meter readings go? Approximately speaking.


Title: RE: radio sensitivity
Post by: G3RZP on September 27, 2012, 12:02:49 AM
I presume that by '1mv' you mean '1 microvolt' - mV usually means millivolt.

Supposing (and it's a BIG supposition) that the S meters have been calibrated for 50 microvolts equals S9 AND that the linearity is identical, then they should read the same. The difficulty is that the linearity is almost certainly different, so although the S9 readings may be the same, the other readings probably won't be.
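
To put rough numbers on the original question, here is a minimal sketch - assuming the common convention of S9 = 50 microvolts and 6 dB per S-unit, and a perfectly linear meter, neither of which holds for most real rigs:

```python
import math

def s_meter(signal_uv, s9_uv=50.0, db_per_s_unit=6.0):
    """Ideal S-meter reading for a signal of `signal_uv` microvolts,
    assuming S9 = 50 uV at the antenna and 6 dB per S-unit."""
    db_relative_to_s9 = 20 * math.log10(signal_uv / s9_uv)
    return 9 + db_relative_to_s9 / db_per_s_unit

print(round(s_meter(1.0), 1))   # 1 uV   -> about S3.3
print(round(s_meter(0.2), 1))   # 0.2 uV -> about S1.0
```

On that idealized scale the two sensitivity ratings differ by about 14 dB, a little over two S-units - but, as noted above, real meters diverge badly away from S9.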

As far as HF is concerned, most receivers are rather more sensitive than they need to be: this is especially so at 7 MHz and below, because of atmospheric and man-made noise - especially the latter in urban and even suburban areas. I live in the country and it's pretty quiet, but even so, on 10m I can use 3dB of attenuation in the antenna lead and still get a 4dB rise in noise when switching from a dummy load to a 4-element SteppIR - and even more if the beam is pointing at the sun when it's playing up.
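
The dummy-load test above is just noise powers adding; a hedged sketch of the arithmetic, with illustrative (not measured) levels, both expressed in dBm in the same bandwidth:

```python
import math

def antenna_noise_rise_db(rx_floor_dbm, external_noise_dbm):
    """dB rise when switching from a dummy load to the antenna: total
    noise (receiver + external, summed as powers) versus the
    receiver's own noise floor alone."""
    p_rx = 10 ** (rx_floor_dbm / 10.0)
    p_ext = 10 ** (external_noise_dbm / 10.0)
    return 10 * math.log10((p_rx + p_ext) / p_rx)

# External noise only 2 dB above the receiver floor still gives ~4 dB rise:
print(round(antenna_noise_rise_db(-130, -128), 1))
```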


Title: RE: radio sensitivity
Post by: AE4RV on September 27, 2012, 07:51:26 AM
This is why I don't fully trust the Sherwood list. The quality of DSP filtering and noise reduction is more important to me, which is something I need to hear for myself.


Title: RE: radio sensitivity
Post by: NO2A on September 27, 2012, 05:58:06 PM
Yes, I meant microvolt. Maybe measurement in dB would be a better indicator than an "S" unit. Noise levels have certainly gotten much worse than they used to be.


Title: RE: radio sensitivity
Post by: WB2WIK on September 27, 2012, 06:52:05 PM
It doesn't matter.

A 1940s era SX-28 has more than enough sensitivity to hear anything we can possibly detect on the HF bands, with an antenna connected.

What is far more important is how the signals are processed.  Receivers vary a lot in AGC characteristics and the ability to filter and noise blank.

"Sensitivity" for HF receivers is probably the least important parameter.


Title: RE: radio sensitivity
Post by: W4OP on September 27, 2012, 09:21:13 PM
Steve is spot on. Sensitivity is a non-issue. If you hear more noise when you connect your antenna, then you have enough sensitivity.
I get a kick out of hearing someone comment on how "quiet" a receiver is. That is most assuredly a result of how the AGC is set up, since our HF noise floor is not determined by the receiver, but rather by man-made, atmospheric and galactic noise.
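
A rough sketch of the arithmetic behind "enough sensitivity" (the noise figure and bandwidth here are illustrative, not any particular rig's):

```python
import math

def mds_dbm(noise_figure_db, bandwidth_hz):
    """Minimum discernible signal: thermal noise density (-174 dBm/Hz
    at 290 K) plus bandwidth in dB-Hz plus the receiver noise figure."""
    return -174.0 + 10 * math.log10(bandwidth_hz) + noise_figure_db

# Even a mediocre 15 dB noise figure in a 500 Hz CW bandwidth:
print(round(mds_dbm(15, 500)))   # about -132 dBm
```

Band noise on the lower HF bands is typically well above that level, which is why the antenna, not the receiver, sets the floor.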

Dale W4OP


Title: RE: radio sensitivity
Post by: TANAKASAN on September 28, 2012, 12:50:33 AM
A 'quiet' receiver could be sensitive or deaf as a post. Wire a shorted plug across your antenna connection, then turn the RF gain and audio gain to full; you can then tell whether you have a quiet receiver. Life gets really interesting when you do this test with the pre-amplifier switched in - some of them are truly horrible.

I normally use a variable attenuator from 160m-30m, nothing on 20m, and a 10dB low-noise amplifier on 30m and higher.

Tanakasan


Title: RE: radio sensitivity
Post by: G3RZP on September 29, 2012, 03:43:50 AM
Dale,

>If you hear more noise when you connect your antenna- then you have enough sensitivity.<

Not totally true, because especially on frequencies below 10 MHz, the rise in noise on connecting the antenna may be caused by multiple-signal intermodulation as well as by phase noise.

Higher sensitivity was a goal in the 1930s, when it WAS needed. Unfortunately, it has stayed on as marketing hype 70+ years later, possibly because of confusion with the noise-figure requirements at VHF/UHF.


Title: RE: radio sensitivity
Post by: NO2A on September 29, 2012, 02:04:26 PM
Listen to an HW-16 on 15m, then listen to any modern-day rig on the same band. Then tell me it's a non-issue.


Title: RE: radio sensitivity
Post by: W4OP on September 29, 2012, 06:32:09 PM
from G3RZP:
Not totally true, because especially on frequencies below 10 MHz, the rise in noise on connecting the antenna may be caused by multiple-signal intermodulation as well as by phase noise.

Higher sensitivity was a goal in the 1930s, when it WAS needed. Unfortunately, it has stayed on as marketing hype 70+ years later, possibly because of confusion with the noise-figure requirements at VHF/UHF.
--------------------
No argument there at all. I am well aware of phase noise and IM, although phase noise may have been less of an issue with good PTOs like those from Collins, Drake or Ten-Tec than it is with some of today's DDS stuff.
And, as you said, VHF/UHF is a totally different story. On my 23cm EME system, I can make use of LNAs with NF below 0.2dB.
73,
Dale W4OP


Title: RE: radio sensitivity
Post by: N8CMQ on September 29, 2012, 11:43:03 PM
When talking about sensitivity, there are a lot of standards.
I use hard microvolts: I add a 6 dB pad between the generator and the receiver.
Using 30% modulation at 1000 Hz, I look for a 6 dB drop in audio when I disable the modulation, then I read the microvolts off the generator. Most of the time, the sensitivity is 1 to 2 microvolts.
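
A minimal sketch of that bookkeeping, on my reading that 'hard' microvolts means the generator dial reading corrected for the 6 dB pad (the pad also stabilizes the source impedance the receiver sees):

```python
import math

def audio_change_db(v_modulated, v_unmodulated):
    """Audio level change (dB) when the 30% modulation is switched off."""
    return 20 * math.log10(v_modulated / v_unmodulated)

def level_at_receiver_uv(dial_uv, pad_db=6.0):
    """Signal actually reaching the receiver through the pad."""
    return dial_uv / (10 ** (pad_db / 20.0))

print(round(level_at_receiver_uv(1.0), 2))   # 1 uV on the dial -> ~0.5 uV delivered
```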


Title: RE: radio sensitivity
Post by: G3RZP on September 30, 2012, 04:33:28 AM
'CMQ,

Depends on what the generator is calibrated in!

Certainly in the UK, and I believe in much of Europe, generators were historically calibrated in terms of the EMF, i.e. the open-circuit voltage. The US, on the other hand, calibrated in terms of the actual voltage into the correct load, which is half the EMF.

Then the radar/ECM people came into the game. For them, it's actual power that matters, and so we found generators calibrated in dBm - the power that would be delivered into 50 ohms if the load were matched. In most cases at HF, receivers have quite a high input SWR, but provided the cables are short, the results are repeatable enough.
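
The conversions among those three conventions are straightforward; a sketch, assuming a 50-ohm system:

```python
import math

R_OHMS = 50.0

def pd_uv_to_dbm(pd_uv):
    """Power delivered to a matched 50-ohm load by `pd_uv` microvolts
    of actual (delivered) voltage."""
    v = pd_uv * 1e-6
    return 10 * math.log10((v * v / R_OHMS) / 1e-3)

def emf_uv_to_pd_uv(emf_uv):
    """EMF (open-circuit voltage) is twice the voltage delivered into
    a matched load."""
    return emf_uv / 2.0

print(round(pd_uv_to_dbm(1.0)))                    # 1 uV delivered -> about -107 dBm
print(round(pd_uv_to_dbm(emf_uv_to_pd_uv(1.0))))   # 1 uV EMF       -> about -113 dBm
```

So the same '1 microvolt' receiver is 6 dB better or worse depending on which convention the generator uses.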

Marine radio standards required on AM that switching off the 30% modulation gave a 10dB change in output, while the specifications were all for EMF voltages. Below 3.8 MHz, the dummy antennas were specified: for 1.6 to 3.8 MHz, it was 250pF in series with 10 ohms, so it was an interesting and lossy little network that presented 50 ohms to the generator and a source impedance of 10 ohms and 250pF in series to the rx. I can't remember the lower-frequency dummy antennas, though - it may have been 6 ohms and 350pF at 500kHz.
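
For illustration, the source impedance that 1.6-3.8 MHz dummy antenna presents to the receiver (a sketch, using the component values quoted above):

```python
import math

def dummy_antenna_z(f_hz, c_farads=250e-12, r_ohms=10.0):
    """Series R-C impedance of the 1.6-3.8 MHz marine dummy antenna."""
    xc = 1.0 / (2 * math.pi * f_hz * c_farads)
    return complex(r_ohms, -xc)

print(dummy_antenna_z(2e6))   # about 10 - j318 ohms at 2 MHz
```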

So with such a multiplicity of methods, the one being used has to be defined!

Incidentally, there's a good article in the latest QEX on antenna noise, although its reference to ITU-R Rec. P.372-7 is several years out of date: the current version is P.372-10.


Title: RE: radio sensitivity
Post by: N2EY on October 01, 2012, 10:22:52 AM
from NO2A:
Listen to an HW-16 on 15m, then listen to any modern-day rig on the same band. Then tell me it's a non-issue.
--------------------

Many HW-16s exhibit so-so receiver performance on 15 because there's inadequate first-mixer LO injection. Try 40 meters and hear the difference.

73 de Jim, N2EY


Title: RE: radio sensitivity
Post by: NO2A on October 01, 2012, 11:29:56 AM
I agree, Jim. That was my point. Many old rigs were horrible on 10 or 15m - hence the term "deaf as a post."


Title: RE: radio sensitivity
Post by: W4OP on October 01, 2012, 01:38:18 PM
I am seeing that exact situation on my Hallicrafters FPM-300 MK II. The VFO runs straight to the 9MHz mixer on 80M, while all other bands go through a het mixer with gain. The output into the 9MHz portion is exactly 3dB lower on 80M, and this shows up in the final PA output level.

Dale W4OP


Title: RE: radio sensitivity
Post by: N8CMQ on October 06, 2012, 10:33:37 PM
from G3RZP:
So with such a multiplicity of methods, the one being used has to be defined!
--------------------

That is the beauty of standards - there are so many to choose from...
My generators are calibrated in dBm and in microvolts into 50 ohms.
I also use a 6 dB pad for 'hard' microvolts.
When I do a sensitivity check, I switch the 30% modulation at 1000 Hz off and look for a 6 dB drop in the audio output level.
Most of the receivers I check come in at 1 microvolt for a 6 dB drop.