Author Topic: Power supply noise vs Phase noise (long)  (Read 677 times)
W4JCK
Member

Posts: 71
« on: August 18, 2016, 01:51:59 PM »


Eham may not be the right venue for this, but I'll ask anyway.
First, let me stipulate that I'm not any kind of RF expert or guru on modern receiver design.  My tenure in the electronics field was primarily in data acquisition and conversion.  So I'm an RF hobbyist like all of us who have a ham license.

The basis for my question: as techniques and test equipment advance, we're doing more analysis of the effects of various subsystems in our radios.  This is evidenced by the way the ARRL has changed some of its test methods, and is reflected in the much-ballyhooed Sherwood "list".  One aspect that gets a good deal of attention is noise.  We deal with noise in various forms as we use our HF radios.  Atmospheric noise tends to swamp a good bit of a radio's internal noise floor.  The area my question concerns is the oft-referenced phase noise.  In any wide-bandwidth system, noise levels affect the SNR of the system.  Now that we use A/D converters in our radios, noise can reduce the usable input dynamic range by degrading the SNR.
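To put some numbers on that last point, here is a quick sketch using the standard ideal-SNR and ENOB formulas for an ADC; the 16-bit converter and the 12 dB degradation figure are illustrative assumptions, not measurements of any particular rig:

```python
# Sketch: how added noise erodes an ADC's usable dynamic range.
# The ideal-SNR and ENOB formulas are the standard textbook ones;
# the example noise numbers are made up for illustration.
import math

def ideal_snr_db(bits: int) -> float:
    """Ideal quantization-limited SNR of an N-bit ADC (full-scale sine)."""
    return 6.02 * bits + 1.76

def enob(measured_snr_db: float) -> float:
    """Effective number of bits implied by a measured SNR."""
    return (measured_snr_db - 1.76) / 6.02

ideal = ideal_snr_db(16)        # ~98.1 dB for a 16-bit converter
# Suppose conducted supply noise degrades the measured SNR by 12 dB:
degraded = ideal - 12.0
print(f"ideal SNR: {ideal:.1f} dB, ENOB: {enob(ideal):.1f} bits")
print(f"noisy SNR: {degraded:.1f} dB, ENOB: {enob(degraded):.1f} bits")
```

In other words, every 6 dB of SNR lost to noise throws away roughly one bit of converter resolution.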

My question is simply this: what effect does power supply noise have on the phase noise of our HF radios?  I'm referring to conducted noise through the power supply leads.  This would include noise from the power source itself, in addition to any radiated noise picked up by the normally unshielded supply leads.

I've read various articles and papers regarding power supply noise and its effect on the phase noise of oscillators and synthesizers.  It's real and can be measured.  However, these tests were done on just the oscillator or synthesizer, not on a complete system.  Since we know, from various evaluations, that phase noise will affect the performance of our receivers and transmitters, how does the noise from our power source correlate?  Or perhaps it doesn't - I don't know, nor have I been able to find any such tests done on our HF radios.
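For a rough feel of the mechanism those papers describe, here is a sketch of how supply ripple frequency-modulates an oscillator through its "supply pushing" figure (Hz per volt).  The narrowband-FM sideband approximation is standard; the pushing figure and ripple amplitude below are illustrative guesses, not data for any real oscillator:

```python
# Sketch: the discrete sideband an oscillator grows when supply ripple
# frequency-modulates it via supply pushing (Hz per volt).
# Uses the narrowband-FM approximation: sideband ~ 20*log10(beta/2).
import math

def ripple_sideband_dbc(pushing_hz_per_v: float,
                        ripple_v_peak: float,
                        ripple_freq_hz: float) -> float:
    """Level of each FM sideband relative to the carrier, in dBc."""
    delta_f = pushing_hz_per_v * ripple_v_peak   # peak frequency deviation
    beta = delta_f / ripple_freq_hz              # FM modulation index
    return 20 * math.log10(beta / 2)             # valid for beta << 1

# e.g. 1 kHz/V pushing with 10 mV peak of 120 Hz ripple on the supply:
print(f"{ripple_sideband_dbc(1e3, 0.010, 120):.1f} dBc at +/-120 Hz")
```

Even with those modest assumed numbers the sidebands land only a few tens of dB below the carrier, which is why the oscillator-only measurements in the literature show supply noise so clearly.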

The ARRL used to use HP 62XX power supplies in their lab.  These are big, rack-sized, 50 lb linear supplies.  I'm sure they have probably upgraded by now to something of a similar ilk.  I would speculate that Sherwood uses something similar.  So we see the often-pristine numbers that come from testing with these types of power supplies.

I suspect that a small number of hams use such supplies.  I also suspect that the vast majority of hams do not.  One trend I've noticed, especially with some newer hams, is that after purchasing a new HF rig, they scour sites like eBay et al. for the cheapest power supply they can find.  This usually turns out to be some type of switcher, sometimes of dubious origin.  Most hams who have been around a while know the effects of a poorly designed and constructed switching power supply.  This is NOT an indictment of SMPSes.  They are here to stay and we're all having to get used to them.  A linear power supply can have noise also.  Poorly filtered supplies can produce significant amounts of 60 or 120 Hz ripple, in addition to noisy regulators and other sources of Gaussian noise.  Switching power supplies can, and often do, generate noise that is not as readily apparent as the familiar raspy, moving signal heard as we tune through a band.

So, how much of our performance is, or can be, compromised by a noisy supply?  It would be comforting to think that the manufacturers have taken care of that internally, but I believe it would be delusive to rely on that.  I would posit that the level of noise management in our radios is directly proportional to the cost.

So, has anyone done such testing?  If it's irrelevant, please take some time and explain why.

Thanks for reading.

W8JPF
Member

Posts: 13
« Reply #1 on: August 18, 2016, 05:18:01 PM »

Interesting post and thought-provoking questions.  I'll answer them by saying...



I don't know.



Keep reading and digging.  If you come up with some good answers, let us know.

Joe Fischer, W8JPF
AC7CW
Member

Posts: 642
« Reply #2 on: August 20, 2016, 04:36:28 PM »

I've de-noised a small-signal amplifier chain by stripping the insulation from the power bus traces, sliding various capacitance values along the buses, and experimenting with the capacitor placement while observing the noise at the output with a spectrum analyzer.  The power buses were more like striplines, and placement of the bypass caps was more important than their values in that case.  Quite possibly, if that last little bit of phase noise can be dealt with, it will be via "cut and try" engineering efforts.
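One reason placement can beat capacitance is that a real bypass capacitor behaves as a series RLC, with a self-resonant minimum and inductive behavior above it.  Here is a sketch of that impedance curve; the 1 nH ESL and 20 mOhm ESR are typical guesses, not values for any particular part:

```python
# Sketch: |Z| of a bypass capacitor modeled as series ESR + ESL + C.
# Below self-resonance it looks capacitive, above it inductive; lead
# and trace inductance (i.e., placement) dominates at RF.
import math

def bypass_impedance_ohms(f_hz: float, c_f: float,
                          esl_h: float = 1e-9,
                          esr_ohm: float = 0.02) -> float:
    """Magnitude of the series-RLC impedance at frequency f_hz."""
    reactance = 2 * math.pi * f_hz * esl_h - 1 / (2 * math.pi * f_hz * c_f)
    return math.hypot(esr_ohm, reactance)

# A "100 nF" cap with 1 nH of parasitic inductance:
for f in (1e6, 10e6, 100e6, 500e6):
    print(f"{f/1e6:6.0f} MHz: {bypass_impedance_ohms(f, 100e-9):.3f} ohm")
```

Past the self-resonant point (around 16 MHz for these assumed values) the capacitance value hardly matters and the parasitic inductance, set largely by placement and trace geometry, takes over, which matches the experimental result described above.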

K6AER
Member

Posts: 3979
« Reply #3 on: Yesterday at 07:39:04 PM »

Power supply noise is generally less than 1% of the DC output. Most radios have internal regulators, and as a result the master oscillators have phase noise 140 dBc down at a 10 kHz offset. The atmospheric noise level is still 30 dB above most receiver noise levels. Even radios made 50 years ago were more sensitive than the antenna baseband noise.
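The arithmetic behind that margin is the standard noise-floor formula, thermal floor = -174 dBm/Hz + 10*log10(BW) + noise figure.  A sketch, with an assumed 10 dB noise figure and an illustrative -110 dBm band-noise level:

```python
# Sketch: receiver noise floor (MDS) versus band noise.
# The 10 dB noise figure and -110 dBm band noise are assumptions
# for illustration, not measurements.
import math

def mds_dbm(bandwidth_hz: float, noise_figure_db: float) -> float:
    """Receiver noise floor for a given bandwidth and noise figure."""
    return -174 + 10 * math.log10(bandwidth_hz) + noise_figure_db

rx_floor = mds_dbm(500, 10)     # 500 Hz CW filter, 10 dB NF
band_noise = -110               # illustrative daytime band noise, dBm
print(f"receiver floor: {rx_floor:.1f} dBm")
print(f"margin below band noise: {band_noise - rx_floor:.1f} dB")
```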
« Last Edit: Today at 07:43:10 AM by K6AER »
W4JCK
Member

Posts: 71
« Reply #4 on: Today at 05:57:35 AM »

Quote from: K6AER
Power supply noise is generally less than 1% of the DC output. Most radios have internal regulators, and as a result the master oscillators have phase noise 140 dBc down at a 10 kHz offset. The atmospheric noise level is still 30 dB above most receiver noise levels. Even radios made 50 years ago were more sensitive than the antenna baseband noise.

I'm not quite sure I understand what you're saying.  As I noted in my original post, current transceiver testing is usually performed with lab-grade power supplies.  Most hams do not use these types of power supplies - that was the scope of my query - the effect of this wide distribution of power supply types and their associated noise levels.
As to atmospheric noise, this is not usually a facet of testing.  As to phase noise effects, these are independent of atmospheric effects.

K6AER
Member

Posts: 3979
« Reply #5 on: Today at 07:38:46 AM »

Power supply noise generally is not a problem because radios have internal regulators for the various stages. The only stage that does not have a sub-regulator is the final output stage. Any DC regulation noise will be masked by the actual modulation of the transmitter.

Oscillator phase noise will present itself in two ways. On transmit, it mixes with the modulation and frequency generation to increase the sideband noise of the transmitted signal.

On receive, it can mix with strong incoming signals and add to the effective receiver noise floor. Even then, the receiver's own noise floor is so far below what the receiver is getting from the antenna that it is not an issue. Typical receiver noise floors are -130 to -140 dBm, depending on bandwidth. The typical noise floor of the twenty meter band on a good day is -110 dBm. Any contribution to the receiver noise floor by a power supply is so minimal that it is not a factor in testing.
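The receive-side mechanism, usually called reciprocal mixing, can be sketched the same way: a strong off-channel signal mixes with LO phase noise to deposit noise in the passband.  The blocker level, phase-noise figure, and bandwidth below are illustrative assumptions:

```python
# Sketch: in-passband noise produced by LO phase noise on a blocker.
# noise = blocker level + L(f) at the offset + 10*log10(BW).
# All three input numbers are illustrative, not measured values.
import math

def reciprocal_mixing_noise_dbm(blocker_dbm: float,
                                lo_phase_noise_dbc_hz: float,
                                bandwidth_hz: float) -> float:
    """Noise power landed in the passband by reciprocal mixing."""
    return (blocker_dbm + lo_phase_noise_dbc_hz
            + 10 * math.log10(bandwidth_hz))

# -20 dBm blocker, LO noise of -140 dBc/Hz at that offset, 500 Hz BW:
print(f"{reciprocal_mixing_noise_dbm(-20, -140, 500):.1f} dBm")
```

With a -140 dBc/Hz LO the result sits well below a -110 dBm band-noise level, which is consistent with the point made above; a noticeably noisier LO would close that gap quickly.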
« Last Edit: Today at 07:45:23 AM by K6AER »