Resolution, dynamic range and accuracy of A/D boards







F.4.1 Dynamic range

One of several considerations in determining the analog input requirements of an A/D board is the range of voltages that each channel is required to measure. The physical parameters to be measured, the type of sensor(s) used, and how they are connected determine the input voltage ranges required.

The input range specifications quoted by manufacturers of A/D boards refer to the minimum and maximum voltage levels that the A/D converter on the board can quantize.




Typically, a selection of input ranges is provided, either unipolar (e.g. 0 to 10 V), for measuring positive voltages only, or bipolar (e.g. -10 V to +10 V), for measuring both positive and negative voltages. This allows the user to match the input signal range to that of the A/D converter, taking into account the resolution of the A/D converter and the gain required of the input amplifier.

When considering the input range, only the dynamic range of the input signal needs to be taken into account. For example, consider a strain gauge in a Wheatstone bridge configuration. The input voltage to be read has a common mode component due to the excitation of the bridge, while the small differential voltage changes of interest are due to the change in strain gauge resistance. The common mode voltages don't provide any useful information and are greatly attenuated (almost eliminated) by using differential inputs and instrumentation amplifiers with high CMRR. Only the small differential voltage changes are amplified and converted by the A/D converter. The amplifier gain should therefore be selected so that the maximum differential voltage change expected at the input is amplified to cover as much of the input range of the A/D converter as possible.
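As a sketch, this gain-selection rule can be expressed in Python. The ±20 mV signal span, the function name, and the list of available gains are illustrative assumptions, not values from the text:

```python
# Sketch (assumed values): choosing amplifier gain so the amplified
# signal covers as much of the A/D converter's input range as possible.
def select_gain(max_signal_v, adc_full_scale_v, available_gains):
    """Return the largest available gain that keeps the amplified
    signal within the A/D converter's input range."""
    usable = [g for g in available_gains if max_signal_v * g <= adc_full_scale_v]
    return max(usable) if usable else min(available_gains)

# Bridge produces at most +/-20 mV; ADC input range is -10 V to +10 V.
gain = select_gain(max_signal_v=0.020, adc_full_scale_v=10.0,
                   available_gains=[1, 10, 100, 500, 1000])
print(gain)  # 500: amplifies +/-20 mV to +/-10 V, using the full ADC range
```

A gain of 1000 would amplify the signal to ±20 V and clip, so the rule settles on the largest gain that still fits.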




As only one of the allowable range settings can be selected at any time, typically by jumpers on the board itself, care should be taken in matching the input signal requirements where more than one channel is sampled. The selected input range must accurately measure the signal inputs from a number of channels, possibly from different sensors, and therefore potentially with different input voltage levels and signal ranges. The input range should therefore cover each channel's signal range with as little unused span as possible, giving the greatest number of usable data points and therefore the highest resolution and accuracy.

It should be noted that the input ranges specified don't necessarily refer to the maximum or minimum voltage levels that can be applied at any single input, or to the maximum allowable common mode voltage that can be applied to a differential input. These specifications relate more to the input amplifier. If there is any doubt, users should consult the board manufacturer.

F.4.2 Resolution

The resolution specification quoted by manufacturers of A/D boards refers to the resolution of the A/D converter used on the board. It is usually expressed as the number of bits the A/D converter uses to represent the analog input voltage (i.e. n-bit) or as a fraction of the maximum number of discrete levels that can be used to represent the analog signal (i.e. 1/2^n). The resolution implicitly defines the number of discrete ranges into which the full-scale voltage (FSV) input range can be divided to approximate an analog input voltage. A 12-bit A/D board can divide the input range into 2^12 = 4096 discrete levels, each 1/4096 the size of the input voltage range.

Together, the resolution, input range, and input amplifier gain available on the A/D board determine the smallest detectable voltage change in the input signal. For an ideal A/D board with a resolution of n bits, this is calculated using the formula:

smallest detectable change = input range / (amplifier gain × 2^n)

For example, on a 12-bit A/D board with a 0 V to +10 V input range and the amplifier gain set to 1, the smallest detectable voltage change would be 10/(1 × 4096) = 2.44 mV.

Therefore, each 2.44 mV change at the input would change the output of the A/D converter by 1 LSB, or 0x001h. 0 V would be represented by 0x000h, while the maximum voltage, represented by 0xFFFh, would be 9.9976 V. Due to the staircase nature of the ideal transfer characteristic of an A/D converter, a much smaller change in the input voltage can still result in a transition to the next digital output level, but this will not reliably be the case.

Changes smaller than 2.44 mV will therefore not be reliably detected. If the same 12-bit A/D converter is used to measure an input signal ranging from -10 V to +10 V, the smallest detectable voltage change increases to 4.88 mV. This value represents the voltage equivalent of 1 LSB of the full-scale voltage and, for A/D boards, is termed the code width.
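The code-width formula and the two worked values above can be checked with a short Python sketch (the function name is illustrative):

```python
# Sketch: code width (smallest reliably detectable change) for an ideal
# n-bit A/D board: input_range / (gain * 2**n).
def code_width(input_range_v, gain, n_bits):
    return input_range_v / (gain * 2 ** n_bits)

# 12-bit board, 0 V to +10 V range (10 V span), gain 1:
print(round(code_width(10.0, 1, 12) * 1000, 2))  # 2.44 (mV)
# Same board on a -10 V to +10 V range (20 V span):
print(round(code_width(20.0, 1, 12) * 1000, 2))  # 4.88 (mV)
```

Note that doubling the input span doubles the code width, which is why matching the signal to the narrowest usable range (or raising the gain) improves effective resolution.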

The resolution figure quoted only provides a guide to the smallest change that can be reliably distinguished by the A/D board, since the value calculated is based on the ideal performance of all components of the analog input circuitry. The effects of noise, non-linearities in the A/D converter, and errors in the other components of the analog input circuitry can mean that the true resolution of an A/D board is as much as 2 bits lower than the manufacturer's specification. This means that a 16-bit A/D board may be accurate to only 14 bits.

F.4.3 System accuracy

The system accuracy, or how closely the equivalent digital outputs match the incoming analog signal(s), is another very important criterion, especially where the analog signal contains a lot of information or where a small part of the signal range is to be examined in detail. As has been demonstrated, the functional components of the analog input circuitry of A/D boards (i.e. multiplexer, amplifier, sample-and-hold, and A/D converter) are not ideal. The practical performance limits and errors in each of these components influence the overall performance and accuracy of the system as a whole.

The specification known as system accuracy usually refers to the relative accuracy of the A/D board and indicates the worst-case deviation from the ideal straight-line transfer function. Relative accuracy is determined on an A/D board by applying a voltage at minus full scale to the input, converting this analog voltage to a digital code, increasing the voltage, and repeating these steps until the full input range of the board has been covered. The theoretical analog voltage that should cause each code transition is then subtracted from the analog input voltage that actually caused the transition; the maximum deviation from zero is the relative accuracy of the A/D board. Board manufacturers usually quote the system accuracy in terms of LSB, since an absolute voltage value would only have meaning relative to the selected input voltage range. For example, where n = 2, the system accuracy of a 12-bit A/D board is 2/4096 (± 0.048%), while for a 16-bit A/D board the accuracy is 2/65536 (± 0.003%).
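The LSB-to-percentage conversion in those examples can be verified with a small Python sketch (the function name is assumed):

```python
# Sketch: expressing a relative accuracy of +/- n LSB as a percentage
# of full scale: 100 * n / 2**bits.
def accuracy_percent(n_lsb, n_bits):
    return 100.0 * n_lsb / 2 ** n_bits

print(round(accuracy_percent(2, 12), 3))  # 0.049 (text quotes 0.048%, truncated)
print(round(accuracy_percent(2, 16), 3))  # 0.003
```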

The tendency of analog circuits to change characteristics, or drift, with time and temperature requires that A/D boards be periodically calibrated to maintain accuracy within the specified range. Manufacturers specify the offset voltage and gain accuracy to be adjustable within a range of ± n LSB. This means that where the input range of a 12-bit A/D converter is 0-10 V and the input is set to +1/2 LSB (i.e. +1.22 mV), the digital output would read no greater than 0x005h. This represents a maximum offset voltage adjustment of 5/4096 × 10 V = 12.2 mV. Where the input range of a 12-bit A/D converter is 0-10 V and the input is set to 3/2 LSB below full scale (i.e. +9.996 V), the digital output would read no less than 0xFFAh. For gain accuracy, this figure represents the maximum gain error.
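The offset-voltage arithmetic above can be sketched in Python (the function name is illustrative):

```python
# Sketch: converting a worst-case code error in LSB to a voltage,
# as in the 12-bit, 0-10 V offset example (code 0x005 at a +1/2 LSB input).
def lsb_to_volts(n_lsb, input_range_v, n_bits):
    return n_lsb * input_range_v / 2 ** n_bits

# 5 LSB offset error on a 12-bit, 0-10 V board:
print(round(lsb_to_volts(5, 10.0, 12) * 1000, 1))  # 12.2 (mV)
# 1/2 LSB on the same board:
print(round(lsb_to_volts(0.5, 10.0, 12) * 1000, 2))  # 1.22 (mV)
```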

Autocalibration, where the entire analog section of the board (multiplexer, amplifier, sample and hold), as well as the A/D converter, is automatically calibrated without user intervention, is provided on some A/D boards.

Several auto-calibration methods are used:

• Calibration is carried out automatically when a voltage reference is connected to the board.

• Calibration takes place as part of the conversion process.

• The accuracy on each input channel is checked for all available gain settings; a correction code for each channel/gain combination is stored, then recalled to dynamically compensate for drift in the hardware.




Updated: Tuesday, March 3, 2020 22:31 PST