Sunday, July 4, 2010

I would like to know the standard for receiver S unit measurements...

Q Mike, WD5GYG, asks, “I would like to know the standard for receiver S unit measurements. Some say that if the strength of a received signal increases by 3 dB, the S meter will show an increase of one S unit. Others say it takes a 6-dB increase to produce an increase of one S unit. Can you explain?”

A There is no “official” S-meter standard, but the de facto standard that has evolved over the years has been 6 dB per S unit. There is a vast gulf between theory and practice, though. Most radios do not adhere to this standard, or at least not very diligently. Some S meters are derisively referred to as “Guessmeters,” and having measured the performance of many of these circuits, I can say the nickname is well earned! The Collins S meters used a signal level of 50 μV to produce a reading of 9 on the S meter. Modern rigs indicate S9 at anywhere from 1 to 200 μV!
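As a minimal sketch of the de facto convention described above (6 dB per S unit, with S9 corresponding to 50 μV at the receiver input), the arithmetic behind an “ideal” calibrated S meter can be expressed in a few lines of Python. The function name and formatting choices here are illustrative, not part of any standard:

```python
import math

S9_MICROVOLTS = 50.0   # de facto S9 reference level (Collins convention)
DB_PER_S_UNIT = 6.0    # de facto standard: 6 dB per S unit

def s_reading(microvolts):
    """Convert a signal level in microvolts to an S-meter reading,
    assuming the 6 dB/S-unit convention with S9 = 50 uV.
    Voltage ratios convert to decibels as 20*log10(V1/V2)."""
    db_rel_s9 = 20.0 * math.log10(microvolts / S9_MICROVOLTS)
    if db_rel_s9 >= 0.5:
        # Signals above S9 are customarily reported as "S9 plus X dB"
        return f"S9+{db_rel_s9:.0f} dB"
    if db_rel_s9 >= 0:
        return "S9"
    # Below S9, each 6 dB drop is one S unit
    s_units = 9 + db_rel_s9 / DB_PER_S_UNIT
    return f"S{max(s_units, 0):.1f}"

# Halving the voltage (25 uV) is a 6 dB drop, i.e. one S unit below S9.
```

Note that halving the signal voltage is a 6 dB change, so by this convention it moves the needle exactly one S unit; a 3 dB change moves it only half an S unit.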

Most S meters simply indicate the AGC voltage (the stronger the signal, the more the gain has to be reduced, so the higher the AGC voltage), but this varies directly with the components used in the receiver design. It is possible to have a calibrated S meter, but to get consistent performance across the entire HF band for all signal levels would require a more expensive circuit—a cost that might make the equipment less price competitive in the marketplace. Besides, most hams use their S meters for relative comparisons of signals on the same band, not as devices to make absolute, accurate measurements.


From QST October 1999