Goals
• Be able to explain the meaning of the terms accuracy, precision, sensitivity, resolution, repeatability, rangeability, span and hysteresis
• Be able to make an appropriate selection of sensing devices for a particular process
• Describe the sensors used for measurement of temperature, pressure, flow and liquid level
• List the methods of minimizing the interference effects of noise on an instrumentation system

The definition of transducers and sensors
A transducer is a device that obtains information in the form of one or more physical quantities and converts it into an electrical output signal. A transducer consists of two principal parts: a primary measuring element, referred to as the sensor, and a transmitter unit responsible for producing an electrical output that has some known relationship to the physical measurement.

In more sophisticated units a third element, quite often microprocessor based, may be introduced between the sensor and the transmitter. Among other things, it linearizes the sensor signal and ranges the transducer to the required operational parameters.

Listing of common measured variables
In descending order of frequency of occurrence, the principal controlled variables in process control systems comprise temperature, pressure, flow and liquid level.
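To make the idea of "ranging" concrete, the sketch below shows one way a linear transmitter output might be converted to engineering units in software. It assumes an ideal 4-20 mA transmitter; the function name, the 4-20 mA convention and the example range values are illustrative assumptions, not the specification of any particular device.

def scale_to_engineering_units(current_ma, lower_range, upper_range):
    """Map a linear 4-20 mA transmitter signal onto its calibrated range.

    Assumes a perfectly linear transmitter: 4 mA corresponds to the
    lower range value and 20 mA to the upper range value.
    """
    span = upper_range - lower_range          # algebraic difference of the range values
    fraction = (current_ma - 4.0) / 16.0      # 0.0 at 4 mA, 1.0 at 20 mA
    return lower_range + fraction * span

# Hypothetical example: a temperature transmitter ranged 50-320 degC (span = 270 degC).
print(scale_to_engineering_units(12.0, 50.0, 320.0))  # mid-scale reading -> 185.0 degC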
We now list and describe these different types of transducers, ending with a methodology for selecting sensing devices.

The common characteristics of transducers
All transducers, irrespective of their measurement requirements, exhibit the same characteristics, such as range, span, etc. This section explains and demonstrates the interpretation of the most common of these characteristics.

Dynamic and static accuracy
The first and most common term, accuracy, is also the most misused and least understood. It is nearly always quoted as 'this instrument is ±X% accurate', when in fact it should be stated as 'this instrument is ±X% inaccurate'. In general, accuracy is best described as how close the measurement's indication is to the absolute or real value of the process variable. To obtain a clear understanding of this term, and of all the others associated with it, the term error should first be defined.

The definition of error in process control
Error means a mistake or transgression, and is the difference between a perfect measurement and what was actually measured at any point, time and direction of process movement in the process measuring range.

There are two types of accuracy: static or steady-state accuracy, and dynamic accuracy.
1. Static accuracy is the closeness of approach to the true value of the variable when that true value is constant.
2. Dynamic accuracy is the closeness of approach of the measurement when the true value is changing. Remember that a measurement lag occurs here: by the time the measurement reading has been acted upon, the actual physical measured quantity may well have changed.

In addition to the term accuracy, a sub-set of terms appears: precision, sensitivity, resolution, repeatability and rangeability, all of which have a relationship and association with the term error.

Precision
Precision is the closeness with which repeated measurements of the same variable agree under identical conditions. In process control, precision is more important than accuracy, i.e. it is usually preferable to measure a variable precisely than to have a high degree of absolute accuracy. The figure below illustrates the difference between these two properties of measurement. Using a fluid as an example, the dashed curve represents the actual or real temperature. The upper measurement illustrates a precise but inaccurate instrument, while the lower measurement illustrates an imprecise but more accurate instrument. The first instrument has the greater error; the latter has the greater drift. (Drift: an undesirable change in the output-to-input relationship over a period of time.)

++++Accuracy vs precision related to a typical temperature measurement: the actual temperature trace (dashed), a precise but inaccurate reading, and an imprecise but accurate reading.

Sensitivity
Generally, sensitivity is defined as the amount of change in the output signal from a transducer's transmitting element for a specified change in the input variable being measured, i.e. it is the ratio of the output signal change to the change in the measured variable: the steady-state gain of the element. So, the greater the output signal change from the transducer's transmitter for a given input change, the greater the sensitivity of the measuring element.
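As a numerical illustration of how these terms differ, the sketch below estimates static error (bias), precision and sensitivity from repeated readings. The readings, function names and values are hypothetical, chosen only to illustrate the definitions above.

from statistics import mean, stdev

def bias(readings, true_value):
    """Static (in)accuracy: how far the average reading sits from the true value."""
    return mean(readings) - true_value

def precision(readings):
    """Precision: the spread of repeated readings taken under identical
    conditions (a small standard deviation means a precise instrument)."""
    return stdev(readings)

def sensitivity(input_change, output_change):
    """Steady-state gain: output signal change per unit change in the measured variable."""
    return output_change / input_change

# Hypothetical example: ten readings of a bath held at a true 80.0 degC.
readings = [82.1, 82.0, 82.2, 82.1, 82.0, 82.1, 82.2, 82.1, 82.0, 82.1]
print(bias(readings, 80.0))    # ~ +2.1 degC offset: precise but inaccurate
print(precision(readings))     # ~ 0.07 degC spread: highly repeatable
# A thermocouple producing 5 uV per degC has a sensitivity of 5e-6 V/degC:
print(sensitivity(1.0, 5e-6))  # 5e-06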
Highly sensitive devices, such as thermistors, may change resistance by as much as 5% per °C, while devices with low sensitivity, such as thermocouples, may produce an output voltage that changes by only 5 µV (5 × 10^-6 V) per °C. A second kind of sensitivity important to measuring systems is defined as the smallest change in the measured variable that will produce a change in the output signal from the sensing element. In many physical systems, particularly those containing levers, linkages and mechanical parts, there is a tendency for these moving parts to stick and to have some free play. The result is that small input signals may not produce any detectable output signal. To attain high sensitivity, instruments need to be well designed and well constructed. The control system will then have the ability to respond to small changes in the controlled variable; this property is sometimes known as resolution.

Resolution
Precision is related to resolution, which is defined as the smallest change of input that results in a significant change in transducer output.

Repeatability
Repeatability is the closeness of agreement between a number of consecutive measurements of the output for the same value of input under identical operating conditions, approaching from the same direction, for full-range traverses. It is usually expressed as a percentage of span and does not include hysteresis.

Rangeability
This is the region between the stated upper and lower range values over which the quantity is measured. Unless otherwise stated, the input range is implied. Example: an instrument calibrated from 50 °C to 320 °C has a range of 50-320 °C.

Span
Span should not be confused with rangeability, although the same points of reference are used. Span is the algebraic difference between the upper and lower range values. Example: if the range is 50-320 °C, then the span is 320 - 50 = 270 °C.

Hysteresis
This is a dynamic measurement: it shows up as the dependence of the output value, for a given excursion of the input, on the history of prior excursions and the direction of the current traverse. Example: if the input to a system is moved from 0% up to 100% with the resultant output recorded, and the input is then returned to 0% with the output again recorded, the difference between the two recorded values over the 0% → 100% → 0% traverse gives the hysteresis of the system at each point in its range. Repeat tests must be done under identical conditions.
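The sketch below shows one way hysteresis might be quantified from such a test, assuming output readings were recorded at the same input points on a rising (0% → 100%) and a falling (100% → 0%) traverse. The data, function name and instrument range are invented for illustration.

def hysteresis_percent_of_span(rising, falling, span):
    """Largest rising-vs-falling output difference, expressed in percent of span.

    `rising` and `falling` hold the outputs recorded at the same input
    points on the upward and downward traverses respectively.
    """
    worst = max(abs(up - down) for up, down in zip(rising, falling))
    return 100.0 * worst / span

# Hypothetical traverse of a 50-320 degC instrument (span = 270 degC),
# sampled at the same five input points going up and then coming back down.
rising  = [50.0, 117.0, 184.5, 252.3, 320.0]
falling = [50.0, 118.4, 186.0, 253.5, 320.0]
print(hysteresis_percent_of_span(rising, falling, 270.0))  # ~0.56 % of span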
NEXT: Process measurement and transducers--part 2