Q Jim Brown, K5JAZ, asks, “I have a signal generator with a meter calibrated 0-10 μV. I am searching for the formulas and printed tables to convert μV to μW and dBm. Can you help?”
A Signal generators are usually calibrated to deliver their rated voltage into a specific value of resistive load. Most modern test equipment is calibrated in a 50-Ω system; equipment designed to test televisions and television systems is calibrated for a 75-Ω load. Some equipment, usually audio or telephone equipment, is calibrated into 600 Ω.
Let’s assume that your generator is calibrated for 50 Ω. What this means is that it will deliver the level indicated on the output-level control and multiplier when it is operated into a 50-Ω load.
With this assumption, you can use Ohm’s Law to convert from μV to μW. The formula P = E²/R works if the units are volts, watts and ohms. If you wish, you can convert the μV to volts, obtain the power in watts, then multiply that result by 1,000,000 to convert it to μW. In this case, we are assuming R to be 50 Ω.
To wrap that all into one formula, you can do the conversions all at once, using μV and ohms and obtaining a result in μW, by:
P(μW) = E(μV)² / (R × 1,000,000)
and, if you know R to be 50 Ω, you can use:
P(μW) = E(μV)² / 50,000,000
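The conversion above is easy to check with a few lines of code. Here is a minimal sketch (the function name and the 50-Ω default are my own choices, not anything standardized):

```python
def uv_to_uw(e_uv, r_ohms=50.0):
    """Convert a signal level in microvolts to power in microwatts.

    Applies P = E^2 / R, with the factor of 1,000,000 folding in the
    unit conversions so E can be entered directly in microvolts.
    """
    return (e_uv ** 2) / (r_ohms * 1_000_000)

# Full scale on a 0-10 uV meter, into a 50-ohm load:
print(uv_to_uw(10))  # 2e-06 uW (that is, 2 picowatts)
```

Note how small the result is: even 10 μV across 50 Ω is only 2 pW, which is why receiver-level signals are usually quoted in dBm rather than in watts.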
The term dBm means decibels relative to one milliwatt, so you can either convert the value in microwatts to milliwatts by dividing it by 1000, then use the formula:
dBm = 10 log10(P(mW)) where P = the actual power in milliwatts, or, to do it all in one motion, you can use the formula:
dBm = –30 + 10 log10(P(μW)) with P in microwatts, or
dBm = 30 + 10 log10(P(W)) with P in watts.
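The dBm formulas can be sketched the same way. This example (function names are mine) uses the microwatt version, with the –30 offset accounting for the factor of 1000 between microwatts and milliwatts:

```python
import math

def uw_to_dbm(p_uw):
    """Convert power in microwatts to dBm (decibels relative to 1 mW)."""
    return -30 + 10 * math.log10(p_uw)

# Sanity check: 1 mW = 1000 uW should come out to exactly 0 dBm.
print(uw_to_dbm(1000))  # 0.0

# The 10-uV (2 pW) example from above works out to about -87 dBm:
print(round(uw_to_dbm(2e-06), 1))  # -87.0
```

As a cross-check, 10 μV into 50 Ω giving roughly –87 dBm agrees with the commonly quoted figure of –107 dBm for a 1-μV signal in a 50-Ω system (a 20-dB difference for a 10× voltage change).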
From QST February 1999