Phase noise in microcontroller ADCs

17.04.2017 19:44

In the past few years I designed a series of small VHF/UHF receivers. They are all based on TV tuner chips from NXP Semiconductors and the STM32 series of Cortex-M3 microcontrollers. The receivers were originally intended for experiments with TV whitespaces, such as detecting the presence of wireless microphones. However, European projects come and go, and so my work at the Institute has recently shifted towards ultra-narrowband technology. I keep using my old hardware though, partly because it is convenient and partly because it's interesting to find where its limitations are.

With ultra-narrowband, phase noise is often the defining characteristic of a system. Phase noise of a receiver is a measure of the short-term stability of its frequency reference. One of its effects is blooming of narrowband signals in the frequency domain. A common way to specify phase noise is in decibels relative to the carrier (dBc), at 1 Hz equivalent bandwidth at a certain offset from the carrier. This slide deck from Agilent nicely explains the concept.

Not surprisingly, my design has quite a lot of phase noise. This was not a concern when receiving wide band FM microphone signals. However, it turns out that it's not the RF part that is the culprit. Most of the phase noise in the system comes from the analog-to-digital converter in the ARM microcontroller that I use to sample the baseband signal. I investigated this using the same setup I used for my ADC undersampling measurement - in the following measurements, no RF circuits were involved.

This is what the spectrum of a 500 kHz CW signal looks like after being sampled at 2 Msample/s (using the interleaved dual mode of the ADC). The spectrum is calculated using an FFT of 2048 samples. Ideally there would only be a narrow spike representing a single frequency component; however, phase noise causes it to smear into a broad peak:
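For illustration, here is a minimal numpy sketch of this measurement with an ideal, jitter-free sampler (the sample rate, tone frequency and FFT length match the numbers above; with no sampling jitter, all of the signal power lands in a single FFT bin):

```python
import numpy as np

fs = 2e6    # sample rate: 2 Msample/s
f0 = 500e3  # CW test tone at 500 kHz
N = 2048    # FFT length used in the measurement

# Ideal, jitter-free sample instants.
t = np.arange(N) / fs
x = np.sin(2 * np.pi * f0 * t)

# Power spectrum in dB relative to the strongest bin.
spectrum = np.abs(np.fft.rfft(x)) ** 2
spectrum_db = 10 * np.log10(spectrum / spectrum.max() + 1e-30)

freqs = np.fft.rfftfreq(N, 1 / fs)
peak_bin = int(np.argmax(spectrum))
# 500 kHz falls exactly on bin 512, so the ideal spectrum is a single spike.
```

With a real, jittery sampling clock the same calculation produces the broadened peak shown in the plot.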

Measured spectrum of a CW signal.

From this, I drew the phase noise plot. This shows half of the dual sideband power, calculated at 1 Hz equivalent bandwidth and relative to the total signal power:
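In code, such a plot can be approximated by normalizing each sideband bin to the carrier power and correcting for the FFT resolution bandwidth. This is only a sketch of the procedure (the function name is mine; there is no windowing or averaging, and AM noise is not separated from phase noise):

```python
import numpy as np

def phase_noise_dbc_1hz(x, fs, n_fft=2048):
    """Rough single-sideband noise estimate from a sampled CW tone,
    in dBc at 1 Hz equivalent bandwidth, relative to carrier power."""
    spectrum = np.abs(np.fft.rfft(x[:n_fft])) ** 2
    carrier_bin = int(np.argmax(spectrum))
    carrier_power = spectrum[carrier_bin]

    rbw = fs / n_fft  # resolution bandwidth of one FFT bin, in Hz

    # Offsets of the upper-sideband bins from the carrier.
    offsets = np.arange(1, len(spectrum) - carrier_bin) * rbw

    # Bin power relative to the carrier, rescaled to 1 Hz bandwidth.
    l_f = 10 * np.log10(spectrum[carrier_bin + 1:] / carrier_power / rbw)
    return offsets, l_f
```

For noise that is symmetric around the carrier, one sideband carries half of the dual sideband power.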

Measured phase noise of the ADC.

At 10 kHz offset, this gives:

\mathcal{L}_{ADC}(10\mathrm{kHz}) = -77 \mathrm{dBc @ 1 Hz}

On the other hand, typical phase noise from the datasheet of the tuner chip I'm using is:

\mathcal{L}_{tuner}(10\mathrm{kHz}) = -93 \mathrm{dBc @ 1 Hz}

For comparison, the National Instruments USRP N210, another device I use daily, is only 3 dB better at 10 kHz (according to this knowledge base page):

\mathcal{L}_{USRP}(10\mathrm{kHz}) = -80 \mathrm{dBc @ 1 Hz}

Proper lab equipment is of course significantly better. The Rohde & Schwarz SMBV signal generator I used in this measurement has a specified phase noise of only -148 dBc at 20 kHz offset.

What causes this phase noise? The ADC in the microcontroller is driven by the system clock. The accuracy of this clock determines the accuracy of the sampling instants and, in turn, the phase noise in the digital signal at the output of the ADC. In my case, the system clock is derived from the high-speed internal (HSI) oscillator using the integrated PLL. The datasheet doesn't say anything about the oscillator itself, but it does say that the PLL cycle-to-cycle jitter is at most 300 ps.
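As a rough sanity check, the usual back-of-envelope relation for the SNR of an otherwise ideal sampler limited only by aperture jitter is:

\mathrm{SNR}_{jitter} = -20 \log_{10} \left( 2 \pi f_{in} \, t_{j,rms} \right)

A uniform ±150 ps jitter corresponds to an RMS jitter of 150/√3 ≈ 87 ps, which for a 500 kHz input gives roughly 71 dB. Note that this only bounds the total sideband power integrated over the whole Nyquist bandwidth; how that power is distributed over offset frequencies depends on how the jitter is correlated from sample to sample.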

Using a Monte Carlo simulation, I simulated the phase noise of a system where signal sampling has a random ±150 ps jitter with a uniform distribution. The results nicely fit the measurement. The shaded area below shows the range of 𝓛(f) observed in 5000 runs:
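A minimal sketch of such a simulation (per-sample jitter drawn independently from a uniform distribution; the number of runs is reduced here, and no windowing or averaging is applied):

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 2e6       # sample rate
f0 = 500e3     # CW tone frequency
N = 2048       # FFT length
T_J = 150e-12  # peak jitter: uniform on [-150 ps, +150 ps]
RUNS = 200     # Monte Carlo runs (the plot above used 5000)

l_f_runs = []
for _ in range(RUNS):
    # Nominal sample instants, each perturbed by an independent jitter draw.
    t = np.arange(N) / fs + rng.uniform(-T_J, T_J, N)
    x = np.sin(2 * np.pi * f0 * t)

    spectrum = np.abs(np.fft.rfft(x)) ** 2
    carrier = int(np.argmax(spectrum))
    rbw = fs / N
    # Sideband power relative to the carrier, at 1 Hz equivalent bandwidth.
    l_f_runs.append(10 * np.log10(spectrum[carrier + 1:] / spectrum[carrier] / rbw))

l_f_runs = np.asarray(l_f_runs)
lo, hi = l_f_runs.min(axis=0), l_f_runs.max(axis=0)  # shaded range of L(f)
```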

Simulated phase noise compared to measurement.

So it seems that the PLL is responsible for most of the phase noise. Unfortunately, it appears that I can't avoid using it. There is no way to run the integrated ADC from a separate external clock. I could run the whole system from a low-jitter high-speed external (HSE) clock without the PLL; however, the HSE input is limited to 25 MHz. This is quite low compared to my current system clock of 56 MHz and would only be feasible for significantly lower sample rates (which would require different analog anti-aliasing filters). External ADC triggering wouldn't help here either, since even with an external trigger the sample-and-hold circuit appears to still be driven by the ADC clock.

For some further reading on the topic, I recommend the Effect of Clock Jitter on High Speed ADCs design note from Linear Technology, which discusses phase noise from the perspective of serious ADCs, and the Phase Locked Loop section of STMicroelectronics application note AN2352.

Posted by Tomaž | Categories: Analog



I happened to stumble across the crystal section of the STM32F072 (which I use) yesterday. It allows crystals up to 32 MHz. Also, I expect a ×2 PLL to work better than a ×7 (which you seem to be using for 56 MHz, based on an 8 MHz external crystal / HSI).

Anyway, I wouldn't be surprised if your HSI is also contributing to the jitter you're seeing.

For an experiment, consider checking if you can feed the CPU a clock from an external oscillator. Some of my FPGA boards have a 50 MHz oscillator on board that might be available for a quick hack...

Beware also that the STM32 has a badly designed HSE oscillator, so it will not be a solution for you anyway.
See my simple measurements of HSE stability.
I don't know how good or bad the HSI is.

Hello there, great read! I was just wondering, why not use an external ADC chip?

Łukasz, the goal when I was designing this receiver was to make a small and cheap device. The integrated ADC was well within the original design requirements, so there was no point in using an external ADC. If I were to make a new device for ultra-narrowband sensing, I would probably choose a different approach.

Posted by Tomaž

The phase noise should be better at lower input frequencies, because the jitter will be a smaller portion of the actual period. It would be interesting to see the results at, say, 1 kHz...

Posted by AM

Hello Tomaž, I was wondering, could the problem be alleviated with an additional external sample-and-hold circuit driven from a clock source with better phase stability (say, another, proper PLL using the same frequency base as the MCU)? In a way, that would provide a fixed phase for sampling, keeping a sample "buffered" in the external hold circuit until the internal S&H takes it over, at some random time between two external sampling moments.

Posted by salec

I am also wondering what FFT window did you use in the first image? No window? If so, that might explain the apparent phase noise?

Posted by AM

It might be interesting to see how the PLL reference frequency affects the phase noise. Perhaps try different values for the M divider to use higher reference frequencies which will fall further outside the loop filter and result in less jitter.

Posted by emeb

salec, I guess you could do that. Synchronizing the external S&H with the internal ADC might be tricky. You would need to ensure that you are holding a valid sample at the moment the internal ADC takes a sample. I'm not sure how you could do that on an STM32.

AM, interesting point. The images above were drawn with no window (i.e. rectangular window). I haven't seen any discussion on the effect of the windowing function on phase noise measurements. I reran the calculations with the Hamming window. It does affect the 𝓛(f) values, both for the measured data and the simulations. However, I still believe that what I see here is due to jitter in the ADC clock. Simulated PLL jitter fits the measurement even when a windowing function is used. Also, simulation with no jitter produces significantly different values.
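For anyone who wants to try this quickly, here is a small sketch of the comparison. It deliberately places a tone between FFT bins to provoke spectral leakage (the exact tone offset and bin ranges are arbitrary choices of mine):

```python
import numpy as np

fs, n = 2e6, 2048
# Put the tone between FFT bins so that spectral leakage appears.
f0 = 500e3 + 0.37 * fs / n
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t)

def spectrum_db(x, window):
    xw = x * window / window.mean()  # correct for the window's coherent gain
    s = np.abs(np.fft.rfft(xw)) ** 2
    return 10 * np.log10(s / s.max() + 1e-30)

rect = spectrum_db(x, np.ones(n))
hamm = spectrum_db(x, np.hamming(n))

# Median leakage level far below the carrier (the carrier sits near bin 512).
far_rect = np.median(rect[30:200])
far_hamm = np.median(hamm[30:200])
```

The Hamming window trades a wider main lobe for much lower leakage far from the carrier, so it does change the apparent noise floor in exactly this kind of plot.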

This article discusses window functions in regard to ADC measurements, but doesn't mention phase noise:

Posted by Tomaž

A window vs. no window should make a huge difference. Think about it. I ran my own experiment, albeit with a completely different ADC, just to see the effect of windowing.

See the results below:

Posted by AM
