Noise versus temperature

11.07.2012 20:26

I mentioned energy detection and spectrum sensing before. The idea is that you try to detect whether any radio transmissions are present in your neighborhood by simply detecting the energy the transmitter emits into the electromagnetic field. This is the most basic way of detection - any signal must carry some energy. It's also the most robust, as detection doesn't depend on the sensor knowing any details about the transmission.
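
As a rough illustration only, a minimal energy detector over a block of complex baseband samples could look something like the sketch below. The function name, the calibration assumption and the threshold are mine, not something taken from the actual sensor firmware.

    import numpy as np

    def energy_detect(samples, threshold_dbm):
        """Decide whether a block of complex baseband samples contains a signal.

        samples       -- numpy array of complex samples (assumed calibrated so
                         that |x|^2 corresponds to power in milliwatts)
        threshold_dbm -- detection threshold in dBm
        """
        # Average power over the block.
        power_mw = np.mean(np.abs(samples)**2)
        power_dbm = 10.0 * np.log10(power_mw)

        # A transmission is declared present if the measured power
        # exceeds the threshold.
        return power_dbm > threshold_dbm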

There is a catch, of course. Parts of your own detector are unavoidably emitting radio frequency energy as well. Anything with a temperature above absolute zero has charged particles moving in random ways, and these emit their own electrical signals. This thermal glow effectively blinds you to weaker signals. By definition, the energy detector can't distinguish between a valid transmission and random noise.
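
To get a feel for the numbers involved, the thermal noise power collected in a given bandwidth follows from the standard kTB relation. This is just the textbook formula, not a model of this particular receiver:

    import math

    def thermal_noise_dbm(temperature_k, bandwidth_hz):
        """Thermal noise power in dBm for a matched resistive source (P = kTB)."""
        k = 1.380649e-23  # Boltzmann constant, J/K
        power_w = k * temperature_k * bandwidth_hz
        return 10.0 * math.log10(power_w / 1e-3)

    # Noise floor of a 1.7 MHz channel at room temperature (290 K):
    print(thermal_noise_dbm(290.0, 1.7e6))  # roughly -111.7 dBm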

SNE-CREWTV, spectrum sensor for UHF and VHF bands

With some statistics, however, you can still probabilistically detect signals that lie somewhat below the noise floor. The more accurately you can model your own noise, the weaker the signals you can detect and the higher the probability of correct detection. This is why I tried to find a good statistical model of the noise present in the spectrum sensing receiver I designed at the Institute.
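
To give an idea of what such a model buys you, here is a toy example of setting a detection threshold from a noise-only model for a chosen false alarm probability. This is only a sketch of the general constant-false-alarm-rate idea, with example numbers, not the model I actually ended up with:

    from scipy.stats import norm

    def detection_threshold(noise_mean_dbm, noise_std_db, p_false_alarm):
        """Threshold on measured power for a Gaussian noise-only model.

        With many samples averaged per measurement, the measured noise power
        in dB is approximately Gaussian, so the threshold for a given false
        alarm probability is just a quantile of that distribution.
        """
        return noise_mean_dbm + noise_std_db * norm.ppf(1.0 - p_false_alarm)

    # Example: noise floor at -95 dBm with 0.5 dB standard deviation,
    # 1% false alarm rate.
    print(detection_threshold(-95.0, 0.5, 0.01))  # about -93.8 dBm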

Perhaps not surprisingly, I found out that the characteristics of the noise slowly change with time, which complicates things a bit. In electronics, if some property is slowly changing, it's usually connected to changes in temperature. The TDA18219HN tuner I'm using has an integrated die temperature sensor, so it was a straightforward experiment to check how the average noise level changes as the silicon heats up or cools down.
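
The experiment itself amounts to logging pairs of die temperature and average power readings while the device warms up. In outline it could look like the sketch below, where read_die_temperature() and read_channel_power() are hypothetical helpers standing in for whatever interface the actual hardware driver exposes:

    import time
    import csv

    def log_noise_vs_temperature(read_die_temperature, read_channel_power,
                                 path="noise_vs_temp.csv", interval_s=10.0,
                                 duration_s=3600.0):
        """Record die temperature and average measured power while the device
        warms up. Both read_* arguments are callables supplied by the caller."""
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["time_s", "temperature_c", "power_dbm"])
            start = time.time()
            while time.time() - start < duration_s:
                t = time.time() - start
                writer.writerow([t, read_die_temperature(), read_channel_power()])
                time.sleep(interval_s)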

Measured power with no input signal versus die temperature

The graph above was measured with no input signal (with a terminator on the antenna connector), at 1.7 MHz channel bandwidth. It shows another surprise - the noise level actually decreases with temperature! This result is suspect, as noise usually increases with temperature. For instance, over this temperature range the thermal noise of the terminating resistor alone would increase by around 0.4 dB.
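
The 0.4 dB figure follows directly from the kTB relation: since thermal noise power is proportional to absolute temperature, the change is just the ratio of the end temperatures in dB. The 300 K to 330 K span below is my rough reading of the range covered by the graph:

    import math

    def thermal_noise_change_db(t1_k, t2_k):
        """Change in thermal noise power (kTB) when the source temperature
        moves from t1_k to t2_k, in dB. Bandwidth cancels out of the ratio."""
        return 10.0 * math.log10(t2_k / t1_k)

    # A warm-up from roughly 300 K to 330 K (about 27 degrees C to 57 degrees C):
    print(thermal_noise_change_db(300.0, 330.0))  # about +0.4 dB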

Obviously some other effect is at play here, so I repeated the same measurement, but this time with the receiver directly connected to a signal generator. I set the generator to a constant sine wave with an amplitude around 20 dB above the noise floor.

Measured power with -80 dBm CW input signal versus die temperature

As you can see, the measured power of this signal shows a similar decrease. So it's not the noise itself that decreases with temperature, but rather the sensitivity of the receiver.

At this moment it's hard to say which parts of the receiver are responsible for this. It might be the many amplification stages or the power detector itself. Also interesting is that the measured noise level is falling faster than the level of an external signal (approximately -2 dB versus -1.5 dB of change over the measured range). I would expect just the opposite, as the increase in thermal noise should counter the loss of sensitivity.
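
The slopes quoted above can be pulled out of the logged measurements with a simple linear fit of measured power in dB against die temperature, along these lines. The arrays here are made-up placeholders standing in for the two measurement runs, chosen only so the example roughly mirrors the quoted -2 dB and -1.5 dB changes:

    import numpy as np

    def power_slope_db_per_c(temperature_c, power_dbm):
        """Least-squares slope of measured power versus die temperature."""
        slope, intercept = np.polyfit(temperature_c, power_dbm, 1)
        return slope

    # Placeholder data; fitting both the noise-only and the CW-input runs
    # gives the two slopes to compare.
    temp = np.array([30.0, 40.0, 50.0, 60.0])
    noise = np.array([-94.0, -94.7, -95.4, -96.0])   # made-up noise-only run
    cw = np.array([-80.0, -80.5, -81.0, -81.5])      # made-up -80 dBm CW run
    print(power_slope_db_per_c(temp, noise), power_slope_db_per_c(temp, cw))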

In any case, most likely I can't do anything about this at the hardware level. The noise model will simply have to either compensate for the die temperature or only be valid after the receiver heats up.
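
A temperature compensation along these lines could be as simple as referring each measurement back to a reference die temperature using the fitted slope. Again, this is only a sketch under the assumption that the dependence is linear; the reference temperature and slope values are examples:

    def compensate_power_dbm(power_dbm, temperature_c,
                             slope_db_per_c, reference_c=50.0):
        """Refer a power measurement back to a reference die temperature,
        assuming a linear dependence of measured power on temperature."""
        return power_dbm - slope_db_per_c * (temperature_c - reference_c)

    # Example: a -96.0 dBm reading at 60 C with a -0.067 dB/C slope is
    # equivalent to about -95.3 dBm at the 50 C reference temperature.
    print(compensate_power_dbm(-96.0, 60.0, -0.067))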

Posted by Tomaž | Categories: Analog
