
Understanding ENOB (Effective Number of Bits) in Test and Measurement Equipment

19 November 2025

ENOB is a practical measure of ADC accuracy.

Advanced electronic systems are becoming faster, more compact, and more complex. Engineers working on next-generation wireless systems, automotive radar, medical imaging, and high-performance computing are constantly pushing the boundaries of what's possible. As a result, testing these systems is challenging: signals have higher frequencies, denser modulation, and are more sensitive to errors. This makes the accuracy, precision, and sensitivity of measurement instruments more important than ever.

Most test and measurement instruments—from oscilloscopes to digitizers and spectrum analyzers—use analog-to-digital converters (ADCs) to capture real-world signals. ADCs convert analog waveforms into the digital domain for measurement, analysis, and storage. However, there is a caveat: analog-to-digital conversion is never perfect. Each ADC introduces a certain level of error, and these different types of errors add up in complex ways. Device datasheets often span dozens of pages and contain specifications for noise, distortion, linearity, jitter, resolution, and much more. For engineers with varying levels of experience in test measurements, trying to distill all of this into a single definition—"How good is this device?"—can be quite challenging. That's why the IEEE developed the concept of the effective number of bits (ENOB)—a single, concise figure of merit that reflects the actual performance of a digitizer or oscilloscope.

What Is ENOB (Effective Number of Bits)?

ENOB (effective number of bits) characterizes the actual resolution of an ADC by converting the measured signal-to-noise and distortion ratio (SINAD) into the equivalent number of bits of an ideal converter. ENOB combines noise and distortion into a single, intuitive number, expressing the actual performance of a real converter as the equivalent resolution of an ideal N-bit ADC.

ENOB and ADC Resolution: Bits on Paper vs. Bits in Practice

It's easy to assume that the higher the resolution of an ADC, the better it is. It seems that a 12-bit oscilloscope is inevitably superior to an 8-bit one. In practice, this isn't always the case.

Resolution is the number of discrete steps an ADC can theoretically represent. For example, a 12-bit ADC has 4096 discrete levels, while an 8-bit one only has 256. But that's only part of the picture. In practice, noise, jitter, and distortion mean that the effective resolution is always lower than the nominal one.

This is where ENOB comes into play. If a 12-bit oscilloscope has an ENOB of 8, its performance roughly matches that of an ideal 8-bit converter. In other words, the "extra" four bits of resolution don't actually help signal measurement. Sometimes an 8-bit oscilloscope with higher ENOB at the frequencies you need can outperform a more expensive 10- or 12-bit instrument.
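The gap between nominal and effective resolution can be made concrete with a quick sketch (the 12-bit/ENOB-8 figures are the hypothetical example from the text, not measured data):

```python
def adc_codes(bits: float) -> float:
    """Number of output codes for a given (possibly fractional) bit depth."""
    return 2 ** bits

# Nominal vs. effective resolution for a hypothetical 12-bit scope with ENOB = 8
print(int(adc_codes(12)))  # 4096 nominal codes
print(int(adc_codes(8)))   # 256 effectively distinguishable levels
```

Noise and distortion smear the "extra" 3840 codes together, so they carry no additional measurement information.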




Figure 1 illustrates the unavoidable quantization error—the ±½ least significant bit (LSB) uncertainty—that defines the noise floor of an ideal digitizer. This sets the context for thinking about ENOB as a more informative way to describe real-world performance.
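The ±½ LSB quantization floor translates into a well-known best-case SNR for an ideal N-bit converter driven by a full-scale sine wave, 6.02·N + 1.76 dB. A minimal sketch:

```python
def ideal_snr_db(n_bits: int) -> float:
    """Quantization-noise-limited SNR of an ideal N-bit ADC
    driven by a full-scale sine wave: 6.02*N + 1.76 dB."""
    return 6.02 * n_bits + 1.76

print(round(ideal_snr_db(8), 2))   # 49.92 dB
print(round(ideal_snr_db(12), 2))  # 74.0 dB
```

Real converters always fall short of these ceilings, which is exactly the shortfall ENOB quantifies.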


Sources of Error (and Why ENOB Matters)

Real-world digitizers introduce error beyond the baseline ±½ LSB quantization noise floor. The sources of these errors are:

• Offset Error: a constant shift in the output code.

• Gain Error: improper scaling of the input signal.

• Nonlinearity: deviations from a perfectly linear transfer function.

• Aperture Uncertainty (Sampling Jitter): timing variations during the sampling process.

• Random Noise: thermal and other sources of unpredictable fluctuations.

• Digital errors: data loss due to metastability, missing codes, etc.




Aperture uncertainty, also known as sampling jitter, deserves special attention. It refers to the variation in the timing of samples inherent in the analog-to-digital conversion process. Although some sources use the term "sampling jitter" broadly to refer to any timing variations in a sampling system, "aperture uncertainty" refers specifically to the instability of the sampling mechanism within the converter itself.

Imagine sampling a sine wave at its zero-crossing point. For a slow signal, a small amount of jitter is practically insignificant. However, as frequency increases, the slope of the curve at the zero-crossing point steepens, and the slightest timing error turns into a significantly larger amplitude error.
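This frequency dependence can be estimated with the standard jitter-limited SNR relation, SNR = −20·log₁₀(2π·f·tⱼ). The 1 ps RMS jitter value below is a hypothetical illustration, not a specification of any particular instrument:

```python
import math

def jitter_limited_snr_db(f_in_hz: float, t_jitter_s: float) -> float:
    """Best-case SNR when RMS aperture jitter dominates:
    SNR = -20 * log10(2 * pi * f_in * t_jitter)."""
    return -20 * math.log10(2 * math.pi * f_in_hz * t_jitter_s)

# Hypothetical digitizer with 1 ps RMS aperture jitter
for f in (10e6, 100e6, 1e9):
    print(f"{f/1e6:7.0f} MHz -> {jitter_limited_snr_db(f, 1e-12):5.1f} dB")
```

At 10 MHz the jitter contribution is negligible, but at 1 GHz the same 1 ps of jitter caps SNR near 44 dB, regardless of how many bits the ADC has.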




Figure 3 illustrates the impact of jitter on measurements, especially at high frequencies. This is one of the key reasons why ENOB typically decreases with increasing bandwidth.

All of these sources of error reduce the usable resolution of a device. Rather than evaluating each error individually, ENOB combines most of them into a single figure of merit.


How ENOB is Calculated

The formal definition of ENOB is based on comparing the performance of a real ADC with an ideal one. The most common formula is:

ENOB = (SINAD – 1.76) / 6.02

Here, SINAD (signal-to-noise-and-distortion ratio, expressed in dB) is a measure of the "purity" of the digitized signal; the 1.76 dB term accounts for the quantization noise of an ideal converter driven by a full-scale sine wave; and the factor 6.02 converts decibels (base-10 logarithm) to bits (base-2 logarithm).

Simply put, ENOB measures how many effective bits your converter delivers in practice. For example, if an oscilloscope's ADC is nominally 12 bits, the effective number may only be 9.5 bits, equivalent to an ideal (theoretical) 9.5-bit device. ENOB thus provides a powerful method for comparing devices "on a single scale."


Measurement of ENOB in Practice

How do engineers measure ENOB? At a high level, the procedure for an oscilloscope is as follows.

1. Apply a high-quality sine wave signal. The signal source should be significantly cleaner than the digitizer under test to avoid masking the device's errors. To obtain meaningful results, use an amplitude as close to the full-scale input range as possible without clipping—typically around 90% of full scale for Tektronix oscilloscopes (as opposed to ~80% used by some other manufacturers). It should be noted that definitions of "full scale" vary: Tektronix and Rohde & Schwarz define it as 10 vertical divisions, while Keysight and Teledyne LeCroy use 8 divisions. Testing near full scale is important because errors such as sampling jitter and harmonic distortion increase with increasing signal amplitude.

2. Capture the waveform (the oscilloscope digitizes the signal).

3. Fit an ideal sine wave (a mathematically ideal sine wave is fitted to the digitized record).

4. Compute the residual (noise and distortion are extracted, and SINAD is calculated).

5. Convert SINAD to ENOB (the formula above gives the effective resolution).

These tests should be repeated at various frequencies in the range to plot the ENOB curve across the instrument's bandwidth, which allows for frequency-dependent effects such as sampling jitter, distortion, and interleaving errors to be accounted for. An important takeaway: ENOB is not a single number; it varies depending on frequency and signal conditions. This is why published ENOB charts are so useful when evaluating oscilloscopes.
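The fit-and-residual steps above can be sketched with a three-parameter sine fit at a known test frequency. This is a simplified illustration (function names and test values are invented for the example), not the exact algorithm any vendor uses:

```python
import numpy as np

def sinad_from_sine_fit(samples: np.ndarray, fs: float, f0: float) -> float:
    """Estimate SINAD (dB) via a three-parameter sine fit at known frequency f0.

    Everything left after subtracting the best-fit sine (random noise,
    harmonics, spurs) is treated as noise plus distortion."""
    t = np.arange(len(samples)) / fs
    # Least-squares fit of a*sin + b*cos + dc to the captured record
    basis = np.column_stack([np.sin(2 * np.pi * f0 * t),
                             np.cos(2 * np.pi * f0 * t),
                             np.ones_like(t)])
    coeffs, *_ = np.linalg.lstsq(basis, samples, rcond=None)
    fitted = basis @ coeffs
    signal_power = np.mean((fitted - coeffs[2]) ** 2)  # fitted sine, DC removed
    residual_power = np.mean((samples - fitted) ** 2)
    return 10 * np.log10(signal_power / residual_power)

# Synthetic check: a clean sine plus a small amount of Gaussian noise
rng = np.random.default_rng(0)
fs, f0, n = 1e6, 12_345.0, 8192
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f0 * t) + 1e-3 * rng.standard_normal(n)
sinad_db = sinad_from_sine_fit(x, fs, f0)  # roughly 57 dB at this noise level
```

Dividing the result through the ENOB formula then yields the effective bits at that test frequency; sweeping f0 across the bandwidth produces the ENOB curve.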


Why ENOB is Critical in Test and Measurement Equipment

ENOB is a critical metric for engineers and designers. Key reasons:

Clarity. Instead of examining dozens of individual error specifications, ENOB provides a single, comprehensive measure of digitizer accuracy.

Optimal Instrument Selection. A higher-bit instrument isn't always better. Sometimes a modest 8-bit oscilloscope with excellent ENOB is a wiser investment than a more expensive 12-bit alternative with poor ENOB at the frequencies you need.

Confidence in Results. Whether debugging a high-speed interface, analyzing a radar chirp signal, or verifying a medical waveform, ENOB reveals whether the instrument can truly "see" what's going on.

Storage and processing efficiency. If your system's ENOB is 7 bits, storing 16-bit data does nothing but waste disk space and slow down processing.

In other words, ENOB is a practical decision-making tool for engineers working in today's demanding environments.


The Tektronix Advantage

At Tektronix, ENOB is a key design priority. Our oscilloscopes and digitizers are designed to deliver high effective bits across wide bandwidths so you can confidently capture signals.

Tektronix instruments consistently deliver industry-leading ENOB values, giving engineers greater measurement margin and a clearer understanding of results. Equally important, Tektronix provides transparency into its ENOB measurement methods. Detailed application notes and videos clearly demonstrate how performance is characterized, ensuring that the data displayed accurately reflects real-world operation. For engineers balancing cost, performance, and reliability, this means no need to wonder whether an instrument is showing the full picture. With Tektronix, ENOB performance is transparent and reliable, which is critical when working on mission-critical designs.


ENOB as a Critical Metric

In the world of high-performance electronics, the number of bits on a datasheet isn't everything. Effective Number of Bits (ENOB) is a metric that reveals how many of those bits are actually usable, accounting for noise, jitter, and distortion.

By focusing on ENOB, engineers can make more informed instrument selection decisions, avoid being misled by purely nominal resolution specifications, and gain confidence in their measurements.

Tektronix oscilloscopes and digitizers excel at ENOB performance, delivering the clarity and accuracy needed to meet today's most demanding design challenges.
