The fundamental difficulty in establishing an absolute standard for temperature has already been mentioned in the introduction to this section: there is no practical way of establishing a convenient relationship that relates the temperature of a body to another measurable quantity expressible in primary standard units. Instead, it's necessary to use a series of reference calibration points for temperature that are very well defined. These points have been determined by research and international discussion and are published as the International Practical Temperature Scale (IPTS). They provide fixed, reproducible reference points for temperature in the form of freezing points and triple points of substances where the transition among solid, liquid, and gaseous states is sharply defined. The full set of defined points is:

• Triple point of hydrogen: −259.3467 °C
• Triple point of neon: −248.5939 °C
• Triple point of oxygen: −218.7916 °C
• Triple point of argon: −189.3442 °C
• Triple point of mercury: −38.8344 °C
• Triple point of water: +0.0100 °C
• Melting point of gallium: +29.7646 °C
• Freezing point of indium: +156.5985 °C
• Freezing point of tin: +231.928 °C
• Freezing point of zinc: +419.527 °C
• Freezing point of aluminum: +660.323 °C
• Freezing point of silver: +961.78 °C
• Freezing point of gold: +1064.18 °C
• Freezing point of copper: +1084.62 °C

For calibrating intermediate temperatures, interpolation between fixed points is carried out with one of the following reference instruments:

• a helium gas thermometer for temperatures below 24.6 K
• a platinum resistance thermometer for temperatures between 13.8 K and 961.8 °C
• a narrow-band radiation thermometer for temperatures above 961.8 °C

The triple point method of defining fixed points uses a triple point cell. The cell consists of a sealed cylindrical glass tube filled with a highly pure sample of the reference substance (e.g., mercury). This must be at least 99.9999% pure (such that contamination is less than one part per million). The cell has a well that allows insertion of the thermometer being calibrated, and a valve that allows the cell to be evacuated down to the required triple point pressure.

The freezing point method of defining fixed points uses an ingot of the reference metal (e.g., tin) that is better than 99.99% pure. The ingot is protected against oxidation inside a graphite crucible with a close-fitting lid. It's heated beyond its melting point and allowed to cool. If its temperature is monitored, an arrest period is observed in its cooling curve at the freezing point of the metal, where the temperature remains constant while the metal solidifies. The melting point method is similar but involves heating the material until it melts (this is only used for materials such as gallium whose melting point is defined more clearly than their freezing point). Electric resistance furnaces are available to carry out these procedures. Up to 1100 °C, a measurement uncertainty of less than ±0.5 °C is achievable.
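The arrest period in a cooling curve can be picked out from logged temperature data by looking for the interval where the rate of change of temperature falls close to zero. The following sketch illustrates the idea only; it is not part of any standard procedure, and the function name, thresholds, and synthetic data are illustrative assumptions.

```python
# Sketch: locate the freezing-point arrest (plateau) in a logged cooling curve.
# Illustrative only - thresholds and data format are assumptions, not a standard.

def find_arrest_temperature(times_s, temps_c, max_slope_c_per_s=0.002, min_points=10):
    """Return the mean temperature of the longest near-flat run in a cooling curve."""
    best_run, current_run = [], []
    for i in range(1, len(temps_c)):
        dt = times_s[i] - times_s[i - 1]
        slope = abs(temps_c[i] - temps_c[i - 1]) / dt
        if slope <= max_slope_c_per_s:
            current_run.append(temps_c[i])
        else:
            if len(current_run) > len(best_run):
                best_run = current_run
            current_run = []
    if len(current_run) > len(best_run):
        best_run = current_run
    if len(best_run) < min_points:
        return None  # no clear arrest period found
    return sum(best_run) / len(best_run)

# Example: crude synthetic cooling curve for a tin ingot (freezing point 231.928 degC)
times = list(range(0, 600, 5))
temps = [260 - 0.2 * t if t < 140 else (231.928 if t < 400 else 231.928 - 0.1 * (t - 400))
         for t in times]
print(find_arrest_temperature(times, temps))  # ~231.93
```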
The accuracy of temperature calibration procedures is fundamentally dependent on how accurately points on the IPTS can be reproduced. The present limits are:

1 K          0.3%
10 K         0.1%
100 K        0.005%
273.15 K     0.0001%
800 K        0.001%
1500 K       0.02%
4000 K       0.2%
10,000 K     6.7%

Reference Instruments and Special Calibration Equipment

The primary reference standard instrument for calibration at the top of the calibration chain is a helium gas thermometer, a platinum resistance thermometer, or a narrow-band radiation thermometer, according to the temperature range of the instrument being calibrated, as explained at the end of the last section. However, at lower levels within the calibration chain, almost any instrument from the list of instrument classes given in Section 1 may be used for workplace calibration duties in particular circumstances. Where used for such duties, the instrument would, of course, be one of high accuracy that is reserved solely for calibration work. The list of instruments suitable for workplace-level calibration therefore includes mercury-in-glass thermometers, base metal thermocouples (type K), noble metal thermocouples (types B, R, and S), platinum resistance thermometers, and radiation pyrometers. However, a subset of these is commonly preferred for most calibration operations. Up to 950 °C, the platinum resistance thermometer is often used as a reference standard. Above that temperature, up to about 1750 °C, a type S (platinum/rhodium-platinum) thermocouple is usually employed. Type K (chromel-alumel) thermocouples are also used as an alternative reference standard for temperature calibration up to 1000 °C.

Although no special types of instruments are needed for temperature calibration, the temperature of the environment within which one instrument is compared with another has to be controlled carefully. This requires purpose-designed equipment, which is available commercially from a number of manufacturers. For calibration of all temperature transducers other than radiation thermometers above a temperature of 20 °C, a furnace consisting of an electrically heated ceramic tube is commonly used. The temperature of such a furnace can typically be controlled within limits of ±2 °C over the range from 20 to 1600 °C. Below 20 °C, a stirred water bath is used to provide a constant reference temperature, and the same equipment can, in fact, be used for temperatures up to 100 °C. Similar stirred liquid baths containing oil or salts (potassium/sodium nitrate mixtures) can be used to provide reference temperatures up to 600 °C.
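As a quick reference, the choice of controlled-temperature environment described above can be summarized in a small lookup routine. This is a minimal sketch of the selection logic only; the function name and the exact boundary handling are illustrative assumptions based on the figures quoted in this section.

```python
# Sketch: choose a calibration environment from the temperature ranges quoted above.
# The function name and exact boundaries are illustrative assumptions.

def calibration_environment(temp_c, is_radiation_thermometer=False):
    if is_radiation_thermometer:
        return "black body radiation source (hot plate or black body cavity)"
    if temp_c < 20:
        return "stirred water bath"  # below 20 degC (also usable up to 100 degC)
    if temp_c <= 100:
        return "stirred water bath or electrically heated ceramic tube furnace"
    if temp_c <= 600:
        return "stirred oil/salt bath or ceramic tube furnace"
    if temp_c <= 1600:
        return "electrically heated ceramic tube furnace (controllable to about +/-2 degC)"
    return "outside the range of the standard furnace/bath equipment described here"

for t in (-10, 50, 450, 1200, 1800):
    print(t, "->", calibration_environment(t))
```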
For the calibration of radiation thermometers, a radiation source that approximates as closely as possible to black body behavior is required. The actual value of the emissivity of the source must be measured by a surface pyrometer. Some form of optical bench is also required so that the instruments being calibrated can be held firmly and aligned accurately. The simplest form of radiation source is a hot plate heated by an electrical element. The temperature of such devices can be controlled within limits of ±1 °C over the range from 0 to 650 °C, and the typical emissivity of the plate surface is 0.85. Type R noble metal thermocouples embedded in the plate are normally used as the reference instrument. A black body cavity provides a heat source with a much better emissivity. This can be constructed in various alternative forms according to the temperature range of the radiation thermometers to be calibrated, although a common feature is a blackened conical cavity with a cone angle of about 15°.

For calibrating low-temperature radiation pyrometers (measuring temperatures in the range of 20 to 200 °C), the black body cavity is maintained at a constant temperature (±0.5 °C) by immersing it in a liquid bath. The typical emissivity of a cavity heated in this way is 0.995. Water is suitable for the bath in the temperature range of 20–90 °C, and a silicone fluid is suitable for the range of 80–200 °C. Within these temperature ranges, a mercury-in-glass thermometer is commonly used as the standard reference calibration instrument, although a platinum resistance thermometer is used when better accuracy is required. Another form of black body cavity is one lined with a refractory material and heated by an electrical element. This gives a typical emissivity of 0.998 and is used for calibrating radiation pyrometers at higher temperatures. Within the range of 200–1200 °C, temperatures can be controlled within limits of ±0.5 °C, and a type R thermocouple is generally used as the reference instrument. In the higher range of 600–1600 °C, temperatures can be controlled within limits of ±1 °C, and a type B thermocouple (30% rhodium-platinum/6% rhodium-platinum) is normally used as the reference instrument. As an alternative to thermocouples, radiation thermometers can also be used as a standard within ±0.5 °C over the temperature range from 400 to 1250 °C. To provide reference temperatures above 1600 °C, a carbon cavity furnace is used. This consists of a graphite tube with a conical radiation cavity at its end. Temperatures up to 2600 °C can be maintained with an accuracy of ±5 °C. Narrow-band radiation thermometers are used as the reference standard instrument.

Again, the aforementioned equipment merely provides an environment in which radiation thermometers can be calibrated against some other reference standard instrument. To obtain an absolute reference standard of temperature, a fixed-point black body furnace is used. This has a radiation cavity consisting of a conical-ended cylinder that contains a crucible of 99.999% pure metal. If the temperature of the metal is monitored as it's heated up at a constant rate, an arrest period is observed at the melting point of the metal, where the temperature ceases to rise for a short time interval. Thus the melting point, and hence the temperature corresponding to the output reading of the monitoring instrument at that instant, is defined exactly. Measurement uncertainty is of the order of ±0.3 °C. The list of metals, and their melting points, was presented earlier at the beginning of Section 14.

In the calibration of radiation thermometers, knowledge of the emissivity of the hot plate or black body furnace used as the radiation source is essential. This is measured by special types of surface pyrometer. Such instruments contain a hemispherical, gold-plated surface supported on a telescopic arm that allows it to be brought into contact with the hot surface. The radiation emitted from a small hole in the hemisphere is independent of the surface emissivity of the measured body and is equal to that which would be emitted by the body if its emissivity value were unity. This radiation is measured by a thermopile with its cold junction at a controlled temperature. A black hemisphere is also provided with the instrument, which can be inserted to cover the gold surface. This allows the instrument to measure the normal radiation emission from the hot body, and so allows the surface emissivity to be calculated by comparing the two radiation measurements.
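The comparison described above reduces to a simple ratio if the thermopile output is assumed to be proportional to the received radiant flux and reflected ambient radiation is neglected. The sketch below shows that calculation; the proportionality assumption, the function name, and the sample values are illustrative only.

```python
# Sketch: estimate surface emissivity from the two surface-pyrometer readings
# described above. Assumes thermopile output is proportional to radiant flux
# and neglects reflected ambient radiation - an illustrative simplification.

def estimate_emissivity(normal_reading, blackbody_reading):
    """normal_reading: thermopile output with the black hemisphere fitted
    (actual emission from the surface).
    blackbody_reading: output with the gold hemisphere fitted (the emission the
    surface would produce if its emissivity were unity)."""
    if blackbody_reading <= 0:
        raise ValueError("black body reading must be positive")
    return normal_reading / blackbody_reading

# Example with made-up thermopile outputs (arbitrary units):
print(estimate_emissivity(0.85, 1.00))  # -> 0.85, typical of a hot plate surface
```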
Within this list of special equipment, mention must also be made of standard tungsten strip lamps, which are used to provide constant known temperatures in the calibration of optical pyrometers. The various versions of these provide a range of standard temperatures between 800 and 2300 °C to an accuracy of ±2 °C.

Calculating Frequency of Calibration Checks

The manner in which the appropriate frequency of calibration checks is determined for the various temperature-measuring instruments available was discussed in the instrument review presented earlier. The simplest instruments from a calibration point of view are liquid-in-glass thermometers. The only parameter able to change within these is the volume of the glass used in their construction. This changes only very slowly with time, and hence only infrequent (e.g., annual) calibration checks are required. The required frequency of calibration for all other instruments is either (a) dependent on the type of operating environment and the degree of exposure to it or (b) use related; in some cases, both of these factors are relevant.

Resistance thermometers and thermistors are examples of instruments where the drift in characteristics depends on the environment they are operated in and on the degree of protection they have from that environment. Devices such as gas thermometers and quartz thermometers suffer characteristics drift that is largely a function of how much they are used (or misused!), although in the case of quartz thermometers any drift is likely to be small and only infrequent calibration checks will be required. The remaining instruments suffer characteristics drift due to both environmental and use-related factors. These include bimetallic thermometers, thermocouples, thermopiles, and radiation thermometers. In the case of thermocouples and thermopiles, it must be remembered that error in the required characteristics is possible even when the instruments are new, as discussed in Section 1, and therefore their calibration must be checked before they are used.

As the factors responsible for characteristics drift vary from application to application, the required frequency of calibration checks can only be determined experimentally. The procedure for doing this is to start by checking the calibration of instruments used in new applications at very short intervals of time and then to progressively lengthen the interval between calibration checks until a significant deterioration in instrument characteristics is observed. The required calibration interval is then defined as the time interval predicted to elapse before the characteristics of the instrument have drifted to the limits allowable in that particular measurement application, as illustrated in the sketch below.

Working and reference standard instruments and ancillary equipment must also be calibrated periodically. An interval of two years is usually recommended between such calibration checks, although monthly checks are advised for the black body cavity furnaces used to provide standard reference temperatures in pyrometer calibration. Standard resistance thermometers and thermocouples may also need more frequent calibration checks if the conditions (especially of temperature) and frequency of use demand them.
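One simple way to turn the experimental procedure above into a number is to fit a trend to the errors recorded at successive calibration checks and extrapolate to the point at which the allowable limit would be reached. The sketch below does this with a straight-line fit; the assumption of linear drift, the function name, and the example figures are illustrative only.

```python
# Sketch: estimate a calibration interval from recorded drift, assuming the
# drift grows roughly linearly with time. Illustrative only - real drift may
# not be linear and engineering judgement is still required.

def estimated_interval_days(check_days, errors_c, allowable_error_c):
    """Least-squares straight-line fit of error vs. time, extrapolated to the
    day on which the allowable error would be reached."""
    n = len(check_days)
    mean_t = sum(check_days) / n
    mean_e = sum(errors_c) / n
    slope = sum((t - mean_t) * (e - mean_e) for t, e in zip(check_days, errors_c)) / \
            sum((t - mean_t) ** 2 for t in check_days)
    if slope <= 0:
        return None  # no measurable drift - lengthen the check interval further
    intercept = mean_e - slope * mean_t
    return (allowable_error_c - intercept) / slope

# Example: checks at 0, 30, 60, and 90 days showing slowly growing error (degC)
print(estimated_interval_days([0, 30, 60, 90], [0.05, 0.12, 0.21, 0.30], 1.0))
```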
Procedures for Calibration

The standard way of calibrating temperature transducers is to put them into a temperature-controlled environment together with a standard instrument or, in the case of radiation thermometers, to use a radiant heat source of controlled temperature and high emissivity. In either case, the controlled temperature must be measured by a standard instrument whose calibration is traceable to reference standards. This is a suitable method for most instruments in the calibration chain but is not necessarily appropriate, or even possible, for process instruments at the lower end of the chain.

In the case of many process instruments, their location and mode of fixing make it difficult, or sometimes impossible, to remove them to a laboratory for calibration checks to be carried out. In this event, it's standard practice to calibrate them in their normal operational position, using a reference instrument that is able to withstand whatever hostile environment may be present. If this practice is followed, it's imperative that the working standard instrument is checked regularly to ensure that it has not been contaminated. Such in situ calibration may also be required where process instruments have characteristics that are sensitive to the environment in which they work, so that they are calibrated under their usual operating conditions and are therefore accurate in normal use. However, the preferred way of dealing with this situation is to calibrate them in a laboratory with ambient conditions (of pressure, humidity, etc.) set up to mirror those of the normal operating environment. This alternative avoids having to subject reference calibration instruments to the harsh chemical environments commonly associated with manufacturing processes.

For instruments at the lower end of the calibration chain, that is, those measuring process variables, it's common practice to calibrate them against an instrument that is of the same type but of higher accuracy and reserved only for calibration duties. If a large number of different types of instruments have to be calibrated, however, this practice leads to the need to keep a large number of different calibration instruments. To avoid this, various reference instruments are available that can be used to calibrate all process instruments within a given temperature-measuring range. Examples are the liquid-in-glass thermometer (0 to +200 °C), the platinum resistance thermometer (−200 to +1000 °C), and the type S thermocouple (+600 to +1750 °C). The optical pyrometer is also often used as a reference instrument at this level for the calibration of other types of radiation thermometers.

For calibrating instruments further up the calibration chain, particular care is needed with regard to both the instruments used and the conditions they are used under. It's difficult and expensive to meet these conditions, and hence this function is subcontracted by most companies to specialist laboratories. The reference instruments used are the platinum resistance thermometer in the temperature range of −200 to +1000 °C, the platinum-platinum/10% rhodium (type S) thermocouple in the temperature range of +1000 to +1750 °C, and a narrow-band radiation thermometer at higher temperatures. An exception is optical pyrometers, which are calibrated as explained in the final paragraph of this section.
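The choice of workplace reference instrument by temperature range described above lends itself to a simple lookup, shown in the sketch below. The function name and the treatment of overlapping ranges are illustrative assumptions; the ranges themselves are those quoted in this section.

```python
# Sketch: pick a workplace reference instrument for a given calibration
# temperature, using the ranges quoted above. Where ranges overlap, the
# first match is returned - an arbitrary, illustrative choice.

def reference_instrument(temp_c):
    ranges = [
        (0, 200, "liquid-in-glass thermometer"),
        (-200, 1000, "platinum resistance thermometer"),
        (600, 1750, "type S (platinum/rhodium-platinum) thermocouple"),
    ]
    for low, high, name in ranges:
        if low <= temp_c <= high:
            return name
    return "narrow-band radiation thermometer (above the thermocouple range)"

for t in (-50, 150, 850, 1500, 2000):
    print(t, "->", reference_instrument(t))
```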
A particular note of caution must be made where platinum-rhodium thermocouples are used as a standard. These are very prone to contamination, and if they need to be handled at all, this should be done with very clean hands.

Before ending this section, it's appropriate to mention one or two special points concerning the calibration of thermocouples. The mode of construction of thermocouples means that their characteristics can be incorrect even when they are new, due to faults either in the homogeneity of the thermocouple materials or in the construction of the device. Therefore, calibration checks should be carried out on all new thermocouples before they are put into use. The procedure for this is to immerse both junctions of the thermocouple in an ice bath and measure its output with a high-accuracy digital voltmeter (±5 µV). Any output greater than 5 µV indicates a fault in the thermocouple material and/or its construction. After this check on brand-new thermocouples, the subsequent rate of change of thermoelectric characteristics with time is entirely dependent on the operating environment and the degree of exposure to it. Particularly relevant environmental factors are the type and concentration of trace metal elements and the temperature (the rate of contamination of thermocouple materials with trace elements of metals is a function of temperature). A suitable calibration frequency can therefore only be defined by practical experimentation, and this must be reviewed whenever the operating environment and conditions of use change. A final word of caution when calibrating thermocouples is to ensure that any source of electrical or magnetic fields is excluded, as these will induce erroneous voltages in the sensor.

Special comments are also relevant regarding the calibration of radiation thermometers. As well as normal accuracy checks, the long-term stability of a radiation thermometer must be verified by testing its output over a period that is one hour longer than the manufacturer's specified warm-up time. This shows up any components within the instrument that are suffering from temperature-induced characteristics drift. It's also necessary to calibrate radiation thermometers according to the emittance characteristic of the body whose temperature is being measured and according to the level of energy losses in the radiation path between the body and the measuring instrument. Such emissivity calibration must be carried out for every separate application that the instrument is used for, using a surface pyrometer.

Finally, it should be noted that the usual calibration procedure for optical pyrometers is to sight them on the filament of a tungsten strip lamp in which the current is measured accurately. This method of calibration can be used at temperatures up to 2500 °C. Alternatively, they can be calibrated against a standard radiation pyrometer.
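Returning to the new-thermocouple acceptance check described above, the pass/fail decision is simply a comparison of the ice-bath reading against the few-microvolt threshold. The sketch below captures that decision; the function name, threshold constant, and sample readings are illustrative assumptions.

```python
# Sketch: acceptance check for a new thermocouple with both junctions in an
# ice bath, as described above. Threshold and readings are illustrative.

FAULT_THRESHOLD_UV = 5.0  # microvolts; with both junctions at the same
                          # temperature, a sound thermocouple reads ~zero

def thermocouple_ok(ice_bath_reading_uv):
    """True if the ice-bath output is within the fault threshold."""
    return abs(ice_bath_reading_uv) <= FAULT_THRESHOLD_UV

# Example readings from three new thermocouples (microvolts):
for reading in (0.8, 4.2, 12.5):
    print(reading, "uV ->", "accept" if thermocouple_ok(reading)
          else "reject: suspected inhomogeneity or construction fault")
```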