Noise figure measurements of rtl-sdr dongles
Noise figure is a measure of how much noise a component introduces into a signal that passes through it. For a radio receiver, it determines how weak a radio signal the receiver can pick up before the signal is drowned in the receiver's own noise. For instance, in spectrum sensing, a low noise figure helps a lot when trying to detect hidden transmitters. To have some reference to compare my own receiver design with, I recently performed some noise figure measurements on an Ezcap DVB-T dongle.
Principles of noise measurements are nicely detailed in an application note from Agilent. Unfortunately, I don't have access to specialized noise measurement equipment. I do, however, have a calibrated Rohde & Schwarz SMBV vector signal generator at work. It can be used as both a continuous-wave source and a somewhat decent noise source, so I chose to measure the noise figure using both the Y-factor method and the twice-power method.
Both methods require measuring the power of the signal exiting the receiver. I implemented a power meter in GNU Radio using the flow graph shown below (GRC file; a rough Python equivalent is sketched after the settings list below). It measures true (RMS) signal power in a 200 kHz wide band that is offset by -500 kHz from the center frequency of the tuner. This is to exclude low-frequency noise from the measurement. A high level of noise around DC is characteristic of the direct-conversion tuner used in the Ezcap dongle.
The settings used for the RTL-SDR source block are:
- Sample rate 2.048 Msample/s,
- LO frequency 700.5 MHz (which puts the center of the 200 kHz measured band at 700 MHz).
I used GNU Radio release 3.7.5.1.
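A minimal Python version of this power meter might look like the sketch below. This is my rough equivalent of the flow graph, not the original GRC file: only the sample rate, LO frequency, band offset and bandwidth come from the settings above, while the filter taps, decimation and averaging length are my own guesses.

```python
#!/usr/bin/env python
# Rough sketch of the power meter flow graph (GNU Radio 3.7 era).
from gnuradio import gr, blocks, filter
from gnuradio.filter import firdes
import osmosdr
import time

class power_meter(gr.top_block):
    def __init__(self):
        gr.top_block.__init__(self)
        samp_rate = 2048000  # 2.048 Msample/s

        # RTL-SDR source tuned to 700.5 MHz (gain settings omitted here).
        src = osmosdr.source(args="rtl=0")
        src.set_sample_rate(samp_rate)
        src.set_center_freq(700.5e6)

        # Select a 200 kHz band offset by -500 kHz from the LO,
        # i.e. centered on 700.0 MHz, away from the noisy DC region.
        taps = firdes.low_pass(1.0, samp_rate, 100e3, 20e3)
        xlate = filter.freq_xlating_fir_filter_ccf(
            int(samp_rate / 256000), taps, -500e3, samp_rate)

        # True (RMS) power: |x|^2, moving average, conversion to dB.
        mag2 = blocks.complex_to_mag_squared()
        avg = blocks.moving_average_ff(65536, 1.0 / 65536, 4096)
        log = blocks.nlog10_ff(10, 1, 0)
        self.probe = blocks.probe_signal_f()

        self.connect(src, xlate, mag2, avg, log, self.probe)

if __name__ == "__main__":
    tb = power_meter()
    tb.start()
    while True:
        time.sleep(1)
        print("%.1f dB" % tb.probe.level())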
For the twice-power method, I set the signal generator to an unmodulated sine wave at 700 MHz and manually found the output power setting that caused a 3 dB change in the power meter reading. This is the minimum discernible signal (MDS):
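Working backwards from the 17.0 dB figure quoted later in this post, the measured MDS must have been around -104 dBm; with the 200 kHz measurement bandwidth and the standard twice-power relation this gives:

$$ NF = P_{MDS} - 10 \log_{10} B + 174\ \mathrm{dBm/Hz} = -104.0 - 53.0 + 174.0 = 17.0\ \mathrm{dB} $$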
For the Y-factor method, I used the arbitrary waveform function on the generator to produce Gaussian noise in a 50 MHz band centered on 700 MHz. Total power on the generator was -80 dBm. For such a setup, the excess noise ratio is:
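Assuming the usual ENR definition against the thermal floor $k T_0 = -174\ \mathrm{dBm/Hz}$, the generator's noise power spectral density is $P_{gen}/BW_{gen} = -80\ \mathrm{dBm} - 10 \log_{10}(50\ \mathrm{MHz}) = -157\ \mathrm{dBm/Hz}$, hence:

$$ \mathrm{ENR} = 10 \log_{10} \left( 10^{(-157.0 + 174.0)/10} - 1 \right) \approx 16.9\ \mathrm{dB} $$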
With the noise generator turned off, the power meter showed -41.5 dB. With the noise generator turned on, it showed -36.5 dB. This gives the following noise figure:
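The 5.0 dB difference between the two readings is the Y factor; with the 16.9 dB ENR from above (my reconstruction of the arithmetic, consistent with the figures quoted below):

$$ Y = 10^{5.0/10} \approx 3.16, \qquad NF = \mathrm{ENR} - 10 \log_{10}(Y - 1) \approx 16.9 - 3.3 = 13.6\ \mathrm{dB} $$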
These results are curious for several reasons.
First of all, the two methods should produce the same result, but in fact the resulting noise figures differ by 3.4 dB (a factor of around 2). My first suspect was an error somewhere in my calculations. The twice-power method, for example, is sensitive to the measurement bandwidth, which is a common source of errors in my experience. However, I have repeated these exact same measurements using a completely simulated receiver in GNU Radio (sketched below) and the same power meter (and hence the same 200 kHz filter). In simulation the two methods agree perfectly, which makes me think the error is not in my calculations.
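A minimal sketch of such a simulated check, under my assumptions: the "receiver" is just an adder summing a test signal with Gaussian noise of known power, in place of the rtl-sdr source. The amplitudes below are arbitrary placeholders.

```python
# Simulated receiver: CW test tone plus Gaussian "receiver" noise.
from gnuradio import gr, analog, blocks
import numpy

samp_rate = 2048000

tb = gr.top_block()

# Test signal in the measured band (-500 kHz offset, as in hardware).
cw = analog.sig_source_c(samp_rate, analog.GR_COS_WAVE, -500e3, 1e-3)

# The simulated receiver's own noise: complex Gaussian, known RMS.
noise = analog.noise_source_c(analog.GR_GAUSSIAN, 1e-2, 42)

adder = blocks.add_cc()
head = blocks.head(gr.sizeof_gr_complex, samp_rate)  # one second
sink = blocks.vector_sink_c()

tb.connect(cw, (adder, 0))
tb.connect(noise, (adder, 1))
tb.connect(adder, head, sink)
tb.run()

# Wideband power estimate in dB; the real check fed the same
# 200 kHz power meter flow graph instead of this shortcut.
x = numpy.array(sink.data())
print(10 * numpy.log10(numpy.mean(numpy.abs(x) ** 2)))
```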
Another suspect was the quality of the noise for the Y-factor method. This method is typically used with specialized (analog) noise sources, not a pseudo-random vector loaded into an arbitrary waveform generator. However, repeated measurements with different signal powers, pseudo-random cycle lengths and sampling rates are in very good agreement (less than 0.5 dB difference in resulting noise figure). I have also measured the spectral power density used in the ENR calculation (Pgen/BWgen) with a spectrum analyzer and that measurement agrees with the calculated figure to within 0.1 dB.
The above makes me think that both measurements are correct and that there is some physical process in the receiver causing this difference. There may be some automatic gain control somewhere that behaves differently for the two kinds of input. The crest factor, for instance, is significantly different between noise and continuous-wave inputs.
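As a rough illustration of how different the two inputs look to a peak-sensing circuit (my numbers, not a measurement): a sine wave has a fixed crest factor of

$$ CF = 20 \log_{10} \frac{\hat{x}}{x_{RMS}} = 20 \log_{10} \sqrt{2} \approx 3\ \mathrm{dB}, $$

while Gaussian noise from an arbitrary waveform generator is typically clipped at a crest factor somewhere around 10 to 12 dB, so the two signals present very different peak levels for the same average power.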
Update: Based on my later discovery that the noise power and ENR were not correct in my Y-factor calculation, it is likely that the 17.0 dB result is the more accurate one.
The second weird thing is the unusually large value. The noise figure is largely determined by the first stage, in this case the low-noise amplifier in the Elonics E4000 tuner integrated circuit. The datasheet specifies a noise figure around 4 dB, which is significantly lower than what I saw. It's not that far-fetched, though, that a cheap design like this would perform worse than the best case promoted in the datasheet. There might be a noisy power supply and interference from the USB bus, for instance.
The most elaborate existing characterization of the rtl-sdr DVB-T dongles I'm aware of was done in 2013 by HB9AJG. Among other things, it also includes measurements of the minimum discernible signal. For the E4000 tuner at 700 MHz, that document states a noise figure of 11.0 dB, which is somewhat lower than both of my measurements. However, I believe HB9AJG made several errors in their article; after accounting for them, their results nicely match mine for the twice-power method (I plan to write more on that in a future post).
In conclusion, even though the results look unusual, I can't find any concrete reason to doubt their accuracy. The noise figure of this particular receiver seems to be between 13.6 and 17.0 dB, which is not particularly good. It depends on what you want to do, of course, but in general these dongles do not work very well with weak signals.
A nice approach to measure noise figure!
Did you consider the two different concepts used in the rtl-sdr dongles? E4000 tuners use a zero-IF concept and R820T tuners use IF downmixing. Does an R820T-based dongle give the same uncorrelated results?
A third method to measure NF is the AM method: measuring the S/N in the voice channel for a defined AM-modulated signal.
NF is by far the best parameter to specify the sensitivity of a receiver/amplifier. However, it is difficult to measure.
q.e.d.