Frequency Measurement Guidelines for Oscillators



Every digital electronic device requires a reference clock, and oscillators are widely used to serve that purpose. Verifying the frequency characteristics of high-performance devices requires accurate frequency measurement. This document provides an overview of various frequency measurement methods and instruments and is intended to help users of ILSI MMD MEMS oscillators take accurate frequency measurements.

Common Frequency Measurement Issues

Measurements made with different frequency counters do not match

A discrepancy between measurement results made with different frequency counters may be caused by one or more of the following.

1) The two instruments use different frequency references. Base models of frequency counters are usually equipped with inexpensive TCXO-based frequency references with 1 to 5 ppm frequency stability and an aging rate of a few ppm per year. Any error in the frequency reference adds directly to the measurement results. Figure 1 illustrates how the measurement results of a frequency counter may differ when using an internal TCXO reference compared to an external high-precision rubidium reference. Refer to section 3 for frequency reference selection guidelines.


2) Gate times or instrument specifications are different. Different measurement results may appear when using the same frequency reference if the instruments use different gate times. Moreover, even if gate times and references are the same but instrument resolutions differ, the results may not match at short gate times. Refer to section 4 for details.
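The reference-error contribution described in item 1 can be estimated directly. Below is a minimal Python sketch; the 20 MHz input and the TCXO error values are illustrative and not tied to any specific instrument.

```python
# Sketch: how a frequency-reference error propagates into a counter reading.
# A counter times the input against its reference, so a reference that runs
# fast by some fraction makes the reading low by the same fraction.

def measured_frequency(true_freq_hz: float, ref_error_ppm: float) -> float:
    """Reading of a counter whose reference is off by ref_error_ppm."""
    return true_freq_hz / (1.0 + ref_error_ppm * 1e-6)

true_f = 20e6  # 20 MHz DUT (illustrative)
for ppm in (0.0, 1.0, 5.0):  # typical TCXO reference error range
    reading = measured_frequency(true_f, ppm)
    print(f"ref error {ppm:+.1f} ppm -> reading {reading:.4f} Hz "
          f"(offset {reading - true_f:+.4f} Hz)")
```

A 5 ppm reference error on a 20 MHz signal shifts the reading by about 100 Hz, far larger than the resolution of the counter itself.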

Frequency displayed by the frequency counter is much higher than expected

Poor signal integrity may inadvertently increase or even double the frequency measured by the frequency counter. This is often encountered in probing schemes where the instrument input is configured to a high impedance mode (for example, 1 MΩ). Section 6 discusses signal integrity impact on frequency measurement and provides probing recommendations.

Frequency counter measurements with different gate times do not match

The frequency measurement error of a frequency counter is inversely proportional to the gate time. As illustrated in Figure 4, the shorter the gate time, the larger the error. Refer to section 4 for details about frequency counters.
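The inverse relation above can be tabulated with a short Python sketch; the 20 ps single-shot resolution is a hypothetical instrument figure, not a quoted specification.

```python
# Sketch: fractional counter resolution vs gate time, assuming a
# single-shot time-interval resolution of 20 ps (hypothetical spec).

t_res = 20e-12  # s, time-stamping resolution

for gate in (1e-3, 10e-3, 100e-3, 1.0):  # gate times in seconds
    rel_err = t_res / gate               # fractional resolution ~ t_res / gate
    print(f"gate {gate*1e3:7.1f} ms -> resolution {rel_err*1e9:.3f} ppb")
```

Doubling the gate time halves the resolution limit, which is why short gates produce visibly mismatched readings.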

Oscilloscope frequency measurement shows large spread

Oscilloscopes take frequency measurements for every period of the input signal. Depending on the scope settings and capabilities, measurement results may be averaged over multiple captures or over all signal periods within a single capture. As discussed in section 5, a frequency measurement made on a single period is highly affected by signal period jitter and the oscilloscope's internal noise, causing the results to vary by thousands of ppm. Collecting thousands of samples and averaging them significantly reduces the error, but this method still doesn't provide the ppm-level accuracy that can easily be achieved with a frequency counter. Figure 2 shows an example of frequency measurement using a high-end oscilloscope.
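The effect of averaging can be simulated with a simple model. The sketch below assumes a 100 MHz clock and 10 ps rms of combined period jitter and scope noise; both figures are illustrative.

```python
import random

# Sketch: why single-period scope measurements spread widely and how
# averaging narrows them. Jitter/noise magnitude is illustrative.

random.seed(1)
f0 = 100e6           # 100 MHz clock
t0 = 1.0 / f0        # nominal period, 10 ns
sigma_t = 10e-12     # 10 ps rms period jitter + scope noise

def one_period_freq():
    """Frequency estimate from a single (noisy) period measurement."""
    return 1.0 / random.gauss(t0, sigma_t)

single = one_period_freq()
avg = sum(one_period_freq() for _ in range(10_000)) / 10_000

print(f"single period : {abs(single - f0) / f0 * 1e6:8.1f} ppm off")
print(f"10k-pt average: {abs(avg - f0) / f0 * 1e6:8.3f} ppm off")
```

A single period is typically off by roughly a thousand ppm in this model, while the 10,000-point average lands within tens of ppm, still far from the ppb-level results a counter with a long gate can deliver.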


Frequency Reference Selection


GPS-disciplined time bases offer several advantages over other references:
– Ensure that all remote locations effectively use the same reference, which leads to excellent correlation of frequency measurement results;
– Do not require calibration.

A GPS-disciplined time base is recommended for the most accurate measurements and the best correlation of results. A rubidium time base is also acceptable in most cases. All time bases except GPS-disciplined ones require periodic calibration.

Measurement Using a Frequency Counter

Frequency counters are designed to take accurate frequency measurements and are the preferred instrument for this task. The original frequency counters used a digital gate method, which was easy to implement, but its measurement error depended on the input frequency (refer to Appendix A).

Modern frequency counters use a reciprocal counting method [2]. With this method, the gate time is synchronous with the input signal, so the measurement error is limited to one reference clock cycle. For better resolution, the reference frequency is multiplied up to a reasonably high value. The main advantage of this approach is that the resolution is independent of the input frequency.
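Reciprocal counting can be sketched in a few lines. In the model below the gate is rounded to a whole number of input cycles and that interval is timed in reference-clock counts; the 10 GHz effective reference and the input frequency are illustrative assumptions.

```python
import math

# Sketch of reciprocal counting: count a whole number of input cycles,
# time that (input-synchronous) interval in reference-clock cycles, and
# divide. The floor() models the one-reference-cycle quantization.

def reciprocal_measure(f_in: float, f_ref: float, nominal_gate: float) -> float:
    n_in = round(f_in * nominal_gate)     # whole input cycles within the gate
    true_gate = n_in / f_in               # actual gate, synchronous with input
    n_ref = math.floor(f_ref * true_gate) # reference cycles counted (+/-1 count)
    return n_in * f_ref / n_ref           # estimate of f_in

# Illustrative: 10 GHz effective (multiplied) reference, 100 ms gate
f_est = reciprocal_measure(f_in=19_999_987.3, f_ref=10e9, nominal_gate=0.1)
print(f"measured: {f_est:.3f} Hz")
```

The relative quantization error is about one part in f_ref x gate regardless of the input frequency, which is the key advantage over the digital gate method.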

There are methods that further increase measurement resolution by time-stamping the start and stop edges of the input signal. This makes it possible to determine where those events occurred within the reference clock cycle (Figure 3). Modern frequency counters achieve 20 ps resolution or better [3].



Figure 5 represents measurement data that illustrate how reducing the gate time from 100 ms to 10 ms limits the resolution to 5 ppb (0.1 Hz) for a 20 MHz input signal.

Note: Some frequency counters take additional measurements within the gate time and use this information to improve measurement accuracy. Examples of such counters are the Agilent 53132A and 53230A. For these instruments, the inverse relation between measurement error and gate time described above may not hold exactly.

Frequency measurement error is dominated by two factors:

  1. Time base accuracy and stability
  2. The frequency counter’s time interval measurement error relative to the gate time

Selecting a higher-resolution frequency counter and increasing the gate time improves measurement accuracy, provided that an accurate time base is used. ILSI MMD recommends using at least a 100 ms gate time and a GPS-disciplined or rubidium time base. Refer to your instrument manual for detailed information on your frequency counter's accuracy and resolution.
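The two error factors above can be combined into a rough budget. The sketch assumes the terms are independent and add in root-sum-square fashion; the 20 ps resolution and the reference stability figure are illustrative, not instrument specifications.

```python
import math

# Sketch: rough frequency-measurement error budget combining counter
# resolution with time-base error. All numbers are illustrative.

gate = 0.1            # 100 ms gate, per the recommendation above
t_res = 20e-12        # counter single-shot time-interval resolution, s
ref_err_ppb = 0.05    # assumed GPS-disciplined reference error, ppb

counter_ppb = t_res / gate * 1e9                   # resolution term
total_ppb = math.hypot(counter_ppb, ref_err_ppb)   # RSS of independent terms
print(f"counter term: {counter_ppb:.2f} ppb, total: {total_ppb:.2f} ppb")
```

With a TCXO reference (thousands of ppb) the reference term would dominate completely, which is why the time base choice matters more than the counter model.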

Measurement Using a Digital Oscilloscope

Oscilloscopes are widely used for measuring parameters of a clock signal. This section discusses the limitations of digital oscilloscopes that restrict their effectiveness in frequency measurement.

Oscilloscope stamping accuracy and quantization noise

A digital oscilloscope converts an analog input signal to a digital one by acquiring a set of analog-to-digital converter readings that are equally spaced in time. To measure frequency, an oscilloscope detects the time instances of signal transitions using a threshold, usually set at 50% of the signal amplitude. The oscilloscope software interpolates between two points: one just before the signal crosses the threshold and one right after (see Figure 6). The accuracy of measuring the time instance at which the signal crosses the threshold depends on the oscilloscope's time-stamping accuracy and quantization noise: the time-stamping accuracy defines the errors of t1 and t2, and the quantization noise defines the errors of V1 and V2. Refer to section 4 of ILSI MMD application note AN10007 Clock Jitter and Measurement [4] for more information about the impact of oscilloscope quantization noise on timing measurements.
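The interpolation step can be written out explicitly. This is a linear-interpolation sketch between the two straddling samples; the sample times and voltages are made-up values.

```python
# Sketch: linear interpolation of the threshold crossing between the two
# ADC samples (t1, V1) and (t2, V2) that straddle it. Values illustrative.

def crossing_time(t1: float, v1: float, t2: float, v2: float,
                  threshold: float) -> float:
    """Estimate where a rising edge crosses `threshold` between samples."""
    return t1 + (threshold - v1) / (v2 - v1) * (t2 - t1)

# samples 0.1 ns apart around a 0.5 V threshold on a 1 V swing
t = crossing_time(t1=10.0e-9, v1=0.42, t2=10.1e-9, v2=0.61, threshold=0.5)
print(f"edge at {t * 1e9:.4f} ns")
```

Errors in t1/t2 (time stamping) and V1/V2 (quantization noise) propagate directly into this estimate, which is the error source discussed above.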


Single period measurement

Many oscilloscopes are capable of measuring only one clock period per waveform capture. The relative error of such measurement is fairly high and increases with input signal frequency. High frequency period jitter that is intrinsically present in the signal also adds significant error. Running multiple captures and averaging the data reduces the error to a certain instrument measurement limit. This, however, is time consuming and still doesn’t provide ppm-level accuracy.

Gate time and time base limitations

Modern high-performance digital oscilloscopes can run built-in measurements on all adjacent signal cycles acquired within a single capture, and they have very good time-stamping accuracy. Unfortunately, due to limited memory, only a very short time frame of the signal (typically up to 1 ms) can be captured at the maximum sampling rate. This effectively limits the maximum measurement gate time and, as a result, the measurement accuracy.

The primary design objective for an oscilloscope time base is low jitter, so its frequency stability is not very good. This can be corrected by using an external reference.

Probing the Signal

Signal integrity may impact frequency measurement even when the result is averaged over multiple periods. The number of cycles registered may be artificially high if signal integrity issues cause the signal to cross the measurement threshold at times other than the intended rise/fall events (Figure 7). This phenomenon is often called double triggering. When double triggering occurs, the measured frequency is higher than the actual signal frequency.


Figure 8 illustrates how long, unterminated wires connected to a high-impedance probe can cause signal integrity issues. The ringing in the screen capture is significant enough to cause double triggering.


Improper probing affects signal integrity and may cause double triggering. This condition causes the frequency measured by the instrument to be higher than expected and variation between measurements may be high.

To ensure good signal integrity, the impedance of the source, load and transmission line should match. For this purpose, source or load termination techniques should be used. The following examples illustrate signal probing using a 50-Ω coaxial cable and various termination options.

Figure 9 shows a signal waveform captured with source termination and 1 MΩ instrument termination. In this example, the output impedance of the DUT is 25 to 30 Ω so a 20-Ω resistor has been added in series with the output to match the 50-Ω cable impedance. The wave which is travelling across the transmission line reflects from the instrument’s high impedance input. The reflection is reduced by source termination, but the waveform still contains over- and undershoots. This method is not recommended because reflections are difficult to eliminate.


The preferred way to connect a signal under test to a frequency counter is shown in Figure 10. Terminating the instrument input with 50 Ω ensures good signal integrity, and a 1 kΩ series resistor isolates the DUT from the external load. This probing scheme has a 21:1 attenuation factor.
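The 21:1 figure follows from the resistive divider formed by the series resistor and the input termination:

```python
# Sketch: attenuation of the Figure 10 probing scheme - a 1 kOhm series
# resistor driving the counter's 50 Ohm input termination.

r_series = 1000.0  # isolation resistor, Ohms
r_term = 50.0      # instrument input termination, Ohms

attenuation = (r_series + r_term) / r_term  # voltage divider ratio
print(f"attenuation factor: {attenuation:.0f}:1")
```

The counter sees roughly 1/21 of the DUT swing, which is still ample for a counter input but should be kept in mind when setting trigger levels.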

Refer to ILSI MMD application note Probing Oscillator Output [5] for more information on probing techniques.


Appendix A. Digital Gate Frequency Measurement Method

Figure A.1 illustrates a block diagram of the direct digital gate frequency measurement method. A preconditioned input signal is applied to one input of a digital AND gate. The second input of the gate is driven from the time base oscillator through a configurable divider network. This provides a positive pulse of the required duration, called the gate time t. While the pulse is in a high state, the input signal passes through the AND gate and the number of its rising edges is counted by the counting register. Thus the number of cycles N of the input signal that fit within the known predefined period of time t is obtained. The input signal frequency is then calculated as f = N / t.
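The digital gate method can be sketched in a few lines of Python. The truncation to a whole edge count models the one-count gating ambiguity, and the example frequencies are illustrative; note how the relative error depends on the input frequency, which is the method's main drawback.

```python
# Sketch of the digital gate method: count rising edges of the input
# during a fixed gate derived from the time base, then divide. The last
# partial input cycle is lost, giving a +/-1-count ambiguity.

def digital_gate_measure(f_in: float, gate_time: float) -> float:
    n = int(f_in * gate_time)  # whole rising edges counted within the gate
    return n / gate_time       # f = N / t

gate = 1.0  # 1 s gate (illustrative)
for f_in in (1_000_000.4, 10_000_000.4):
    f_meas = digital_gate_measure(f_in, gate)
    print(f"{f_in:.1f} Hz -> {f_meas:.1f} Hz "
          f"(error {abs(f_meas - f_in) / f_in * 1e6:.3f} ppm)")
```

The same 0.4 Hz absolute ambiguity costs ten times more in relative terms at 1 MHz than at 10 MHz, whereas the reciprocal method of section 4 keeps the relative error constant.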