Friday, August 12, 2011

Analog and mixed signal design: Why 50 Ohm?

Has anyone wondered why we use 50 Ohms as the reference resistance in so many of our designs? Why does 50 Ohm seem to be the de facto standard? We normalize to 50 Ohm; we use 50 Ohm terminations in our oscilloscopes; we pick 50 Ohm as a convenient reference resistor. But how did this happen? Where did this 50 Ohm figure come from? We ran across an explanation that sounds reasonable enough and decided to post it to this blog. One story is that standard coaxial lines in England in the 1930s used a commonly available center conductor that happened to yield a 50 Ohm characteristic impedance. Others point out that the coaxial line impedance for minimum signal attenuation is about 77 Ohm, while the impedance for maximum power handling is around 30 Ohm, so 50 Ohm is a good compromise between the two performance parameters. So this is how 50 Ohm became a convenient impedance level!?
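As a rough numerical sketch of that compromise (assuming an air-dielectric coax and the approximate formula Z0 ≈ 60·ln(D/d), with the commonly quoted optimum diameter ratios for low loss and high power handling), the arithmetic looks something like this:

import math

# Approximate characteristic impedance of an air-dielectric coaxial line:
#   Z0 ~= 59.96 * ln(D/d)  [Ohm], D = shield inner diameter, d = center conductor diameter
def z0(ratio):
    return 59.96 * math.log(ratio)

# Attenuation is minimized near D/d ~= 3.59; peak power handling is maximized near D/d ~= 1.65
z_min_loss  = z0(3.59)   # roughly 77 Ohm
z_max_power = z0(1.65)   # roughly 30 Ohm

print(f"minimum-loss impedance:  {z_min_loss:.1f} Ohm")
print(f"maximum-power impedance: {z_max_power:.1f} Ohm")
print(f"geometric mean:          {math.sqrt(z_min_loss * z_max_power):.1f} Ohm")
print(f"arithmetic mean:         {(z_min_loss + z_max_power) / 2:.1f} Ohm")

The two averages land near 48 and 53 Ohm, so a round 50 Ohm sits comfortably between the low-loss and high-power optima.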
