Author Topic: Amplifier Output Impedance  (Read 653 times)
WA2VTA
Member

Posts: 2
« on: June 14, 2002, 09:15:52 PM »

Can anyone tell me why a "pure" voltage source (0 ohms output impedance) is not used for RF amplifiers? (i.e. the final amplifiers of RF transmitters.) It seems to me that if they were used, the 6 dB loss incurred by the amplifier's output impedance could be nearly eliminated.

Perhaps there is something about amplifiers, transmission lines, or antennas that I am not understanding.

Also, how/why has the output impedance of these amplifiers come to be 50 ohms?

Thanks - appreciate it if you could shed some light!
AC5E
Member

Posts: 3585
« Reply #1 on: June 14, 2002, 10:00:42 PM »

Now there's a real can of worms! A really complete answer would require a couple of chapters of advanced EE text and that's beyond the scope of this forum. And even if I went that far I would bet a dollar to a moldy doughnut that someone will disagree.

However, let's see what I can do here, as simply as I can.

Amplifiers have a specified output impedance for several reasons, starting with the fact that maximum power transfer occurs when the impedance of the source matches the impedance of the load, whether that load is a transmission line/tuner/antenna or a dummy load.
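To put a number on that matched-load point, here is a minimal sketch (my own, with made-up values, not taken from any of the posts or rigs discussed here): a Thevenin source with a fixed 50 ohm internal resistance driving different loads, showing that the delivered power peaks when the load equals the source impedance.

```python
# Minimal sketch of the maximum power transfer point above.
# Assumed, arbitrary values: 100 V open-circuit source behind 50 ohms.
V_SOURCE = 100.0   # open-circuit source voltage, volts
R_SOURCE = 50.0    # internal (source) resistance, ohms

def load_power(r_load):
    """Power dissipated in the load for this fixed source."""
    i = V_SOURCE / (R_SOURCE + r_load)   # series circuit current
    return i * i * r_load

for r_load in (5, 25, 50, 100, 200):
    print(f"R_load = {r_load:>3} ohms -> P_load = {load_power(r_load):5.1f} W")
# Delivered power peaks at R_load = R_SOURCE = 50 ohms (50 W here); note
# that at that point the other 50 W is dissipated inside the source itself.
```

Whether a real PA actually behaves like such a Thevenin source is a separate (and much argued) question, which is part of the can of worms mentioned above.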

A practical antenna system has at least some impedance, practical transmission lines have a considerable amount of impedance, and commercially available lines run from 50 to 600 ohms. Transmission line impedance pretty well sets the design impedance of the transmitter/amplifier.

Furthermore, amplifier linearity is important if spurious emissions are to be avoided, and amplifiers are most linear when they see the design load, whatever that load may be.

A zero impedance source is probably not achievable, but a solid state transmitter APPROACHES that at one point in each cycle, the point where at least one transistor is fully conducting. And at other points in the cycle, when the transistors are essentially cut off, the impedance is very high.

This change in impedance over the cycle is one of the reasons solid state gear is almost invariably less spectrally pure than many of the homebrew rigs of the 1930s. A deliberate mismatch between output impedance and the load would only make matters worse.

Of course, tube rigs also have a cyclic shift in output impedance as the plate current changes, but since tube rigs have tank circuits this makes very little difference in practice. (This is a point of contention - but trust me)

And now we are at the point described in several songs, "The subject's interesting but the rhymes are mighty tough."  Or at least I have gone as far as I care to without mathematics.

73  Pete Allen  AC5E

WA2VTA
Member

Posts: 2
« Reply #2 on: June 15, 2002, 02:09:50 PM »

Thanks so much, Pete.

After reading your response, I sat down with pencil and paper to figure it out for myself.  I think I did...

If you think about it, an amplifier designed as a voltage source (e.g. an emitter follower) must still have an output impedance. The reason is that the output stage must drop the difference between the rail voltage and the load voltage at any given instant during the cycle, which implies an instantaneous series resistance in the output stage.

After thinking this through, the Maximum Power Transfer Theorem (output impedance = load impedance) begins to make more sense.  The amplifier is designed to safely source a specific voltage and current to a specific load impedance, which implies an equal source impedance.
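As a toy illustration of that series-drop reasoning (my own sketch, with assumed supply and load values, not a model of any particular rig), here is the "instantaneous series resistance" a single-rail follower stage would present while delivering a sine wave into a 50 ohm load, taken as the rail-to-load voltage difference divided by the load current at a few points in the cycle:

```python
import math

# Toy numbers, all assumed: a 13.8 V rail, a 10 V peak sine wave into 50 ohms.
V_RAIL = 13.8    # supply rail, volts
V_PEAK = 10.0    # peak output voltage across the load, volts
R_LOAD = 50.0    # load resistance, ohms

for deg in (10, 30, 60, 90):
    v_load = V_PEAK * math.sin(math.radians(deg))   # instantaneous load voltage
    i_load = v_load / R_LOAD                        # instantaneous load current
    r_drop = (V_RAIL - v_load) / i_load             # resistance implied by the drop
    print(f"{deg:2d} deg: v_load = {v_load:5.2f} V, "
          f"implied series R = {r_drop:6.1f} ohms")
# The implied resistance is never zero and swings through the cycle, which is
# the point made above: even a follower has to drop the rail-to-load difference
# somewhere, so it cannot look like an ideal (zero ohm) voltage source.
```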

Does it sound like I'm on the right track?

Thanks again.
AC5E
Member

Posts: 3585
« Reply #3 on: June 15, 2002, 06:15:19 PM »

Yes, Anthony, you are. Now think about this. Impedance is the AC counterpart of resistance: it consists of DC resistance plus reactance. Circuit impedances control the current and power flowing in an AC circuit (60 cycle line current and RF are both forms of AC) just as resistance controls the current in a DC circuit.
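A tiny worked example of that point (made-up values): the magnitude of a series impedance is |Z| = sqrt(R^2 + X^2), and for a fixed applied voltage the current climbs without limit as |Z| heads toward zero.

```python
import math

V_RMS = 100.0  # assumed applied RMS voltage, volts

# (R, X) pairs in ohms, from an ordinary load down toward "zero impedance"
for r, x in [(50.0, 0.0), (5.0, 5.0), (0.5, 0.1), (0.01, 0.0)]:
    z_mag = math.hypot(r, x)          # |Z| = sqrt(R^2 + X^2)
    i_rms = V_RMS / z_mag             # Ohm's law for AC magnitudes
    print(f"R = {r:5.2f}, X = {x:4.1f} -> |Z| = {z_mag:6.3f} ohms, "
          f"I = {i_rms:8.1f} A")
# 50 ohms draws 2 A; a hundredth of an ohm draws 10,000 A from the same
# 100 V source, which is the "black hole" problem described below.
```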

Now, zero ohms impedance implies no resistance and no reactance, so virtually no voltage need be applied for infinite current to flow. A resonant superconducting circuit might approach that, but you can't get there. And if you did, you would still be looking for a zero impedance transmission line and a matching antenna, or a matching network with a truly infinite step-up ratio. You can't go there either.

But if you did have zero impedance, all the powerhouses in the world couldn't feed it. It would essentially be a black hole. But you sure would "get out" for the few seconds it would take to put the lights out.

On the other hand, a very high impedance circuit would be essentially voltage fed: high voltage, very low current. Lots of insulation, small conductors, and usually, but not always, fairly low losses. And there are high impedance antennas, and fairly high impedance transmission lines. So a high impedance circuit would be good, except that it takes a high voltage RF source. We used to have those things, you know. Tubes.

1,000 volts at 100 mA is 100 watts, and there were a lot of tubes that were quite happy to operate very efficiently in such service. But I'm afraid tubes are pretty much a thing of the past. Sob!

So we have rigs with 50 ohm outputs because that fits our most popular and practical antennas, feasible feedlines, and matching units that work pretty well with the rest of our "standardized" station equipment, something we would not have if transmitters were made to work into something like 10 ohms or 5,000 ohms.
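To put rough numbers on that practical side (a sketch of my own, using the impedances mentioned above, plus 10,000 ohms to reproduce the 1,000 V at 100 mA tube example), here are the RMS voltage and current needed to deliver 100 W into each impedance:

```python
import math

POWER_W = 100.0   # target power, watts

# Impedances from the discussion above; 10,000 ohms matches the
# 1,000 V at 100 mA tube figure.
for z_ohms in (10, 50, 5000, 10000):
    v_rms = math.sqrt(POWER_W * z_ohms)   # from P = V^2 / Z
    i_rms = math.sqrt(POWER_W / z_ohms)   # from P = I^2 * Z
    print(f"{z_ohms:6d} ohms: {v_rms:7.1f} V RMS at {i_rms:5.3f} A RMS")
# 10 ohms needs more than 3 A; 5,000 ohms needs about 707 V. 50 ohms lands
# at a comfortable 71 V and 1.4 A, which suits both coax and modern finals.
```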

73  Pete Allen  AC5E

WB6BYU
Member

Posts: 13486
« Reply #4 on: June 17, 2002, 04:10:25 PM »

Why 50 ohms?

If you look at the formulas for the voltages, currents, and losses in a transmission line, you will find that the lowest losses occur when the impedance is around 75 ohms. The greatest power handling capability is at an impedance around 37.5 ohms. 50 ohms is a compromise between those two values. (This also explains why services such as cable TV, which are more concerned about losses than about power handling capability, use 75 ohm cable.)

Since 50 ohms is the standard feedline impedance, most rigs are designed to feed it. (Though the solid state rigs I have owned have seemed perfectly happy driving 75 ohms as well.)
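For anyone curious where those optima come from, here is a rough numerical sketch (mine, not from any handbook, assuming an air-dielectric coaxial line with a fixed outer-conductor diameter, conductor loss only, and a breakdown-voltage limit on peak power; the exact power-handling optimum shifts with the dielectric and with whether the limit is voltage or heating, so quoted figures vary):

```python
import math

D = 10.0  # inner diameter of the outer conductor, arbitrary units (held fixed)

def z0(d):
    """Characteristic impedance of an air-dielectric coax line, ohms."""
    return 60.0 * math.log(D / d)

def relative_loss(d):
    """Conductor (skin-effect) attenuation, up to a constant factor:
    alpha is proportional to (1/d + 1/D) / ln(D/d)."""
    return (1.0 / d + 1.0 / D) / math.log(D / d)

def relative_peak_power(d):
    """Breakdown-limited peak power, up to a constant factor:
    V_max is proportional to d * ln(D/d) and P = V^2 / (2 * Z0),
    which works out proportional to d^2 * ln(D/d)."""
    return d * d * math.log(D / d)

# Sweep the inner-conductor diameter and see where each figure of merit peaks.
diameters = [D * k / 1000.0 for k in range(1, 1000)]
lowest_loss = min(diameters, key=relative_loss)
highest_power = max(diameters, key=relative_peak_power)

print(f"Lowest loss near Z0 = {z0(lowest_loss):.0f} ohms")          # about 77 ohms
print(f"Highest peak power near Z0 = {z0(highest_power):.0f} ohms") # about 30 ohms
# 50 ohms sits between the two optima, which is the compromise described above.
```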