Instrumentation Reference Guide--Contents and Introduction


Contents and Introduction (this page)

Part I The Automation Knowledge Base

1. The Automation Practicum

2. Basic Principles of Industrial Automation

3. Measurement Methods and Control Strategies

4. Simulation and Design Software

5. Security for Industrial Automation

Part II Mechanical Measurements

6. Measurement of Flow

7. Measurement of Viscosity

8. Measurement of Length

9. Measurement of Strain

10. Measurement of Level and Volume

11. Vibration

12. Measurement of Force

13. Measurement of Density

14. Measurement of Pressure

15. Measurement of Vacuum

16. Particle Sizing

17. Fiber Optics in Sensor Instrumentation

18. Nanotechnology for Sensors

19. Microprocessor-Based and Intelligent Transmitters

20. Industrial Wireless Technology and Planning

Part III Measurement of Temperature and Chemical Composition

21. Temperature Measurement

22. Chemical Analysis: Introduction

23. Chemical Analysis: Spectroscopy

24. Chemical Analysis: Electrochemical Techniques

25. Chemical Analysis: Gas Analysis

26. Chemical Analysis: Moisture Measurement

Part IV Electrical and Radiation Measurements

27. Electrical Measurements

28. Optical Measurements

29. Nuclear Instrumentation Technology

30. Measurements Employing Nuclear Techniques

31. Non-Destructive Testing

32. Noise Measurement

Part V Controllers, Actuators, and Final Control Elements

33. Field Controllers, Hardware and Software

34. Advanced Control for the Plant Floor

35. Batch Process Control

36. Applying Control Valves

Part VI Automation and Control Systems

37. Design and Construction of Instruments

38. Instrument Installation and Commissioning

39. Sampling

40. Telemetry

41. Display and Recording

42. Pneumatic Instrumentation

43. Reliability in Instrumentation and Control

44. Safety

45. EMC

46. etc.


Introduction

Instrumentation is not a clearly defined subject, having what might be called a "fuzzy frontier" with many other subjects.

Look for guides about it, and in most libraries you are liable to find them widely separated along the shelves, classified under several different headings. Instrumentation is barely recognized as a science or technology in its own right, which raises difficulties for writers in the field and indeed for would-be readers. We hope that what we offer here helps to clarify the subject.

A reference book should of course be there for people to refer to for the information they need. The spectrum is wide: students, instrument engineers, instrument users, and potential users who just want to explore possibilities. And the information needed in real life is a mixture of technical and commercial matters. So while the major part of the Instrumentation Reference Guide is a technical introduction to many facets of the subject, there is also a commercial part where manufacturers and so on are listed. Instrumentation is evolving, perhaps even faster than most technologies, emphasizing the importance of relevant research; we have tried to recognize that by facilitating contact with universities and other places spearheading development.

One frequent need is to find out where further information can be gained. We have catered for this with references, at the ends of sections, to more specialized guides.

Many people have come together to produce the Instrumentation Reference Guide, and thanks are due to them all: those who have written, those who have drawn, and those who have painstakingly checked facts. I should especially thank our staff, who produced order out of chaos in the compilation of long lists of names and addresses. Thanks should also go elsewhere in the Butterworth hierarchy for the original germ of the idea that this could be a good addition to their family of reference books. In a familiar tradition, I thank my wife for her tolerance and patience with time-consuming activities such as emailing, texting, IMing, telephoning, typing, and traveling--or at the least for limiting her natural intolerance and impatience of my excessive indulgence in them!

A modern view of the automation practitioner in the process industries

In the process industries, practitioners are now required to have knowledge and skills far outside the "instrumentation and control" area. Typically, automation practitioners have been required to be familiar with enterprise organization and integration, so that the instruments and control systems under their purview can easily transfer and receive needed information and instructions from anywhere throughout the extended enterprise. They have needed substantially more experience in programming and the use of computers, and an entirely new sub-discipline of automation has been created: industrial networking.

In fact, the very name of the profession has changed.

In 2008, the venerable ISA, founded as the Instrument Society of America, changed its official name to the International Society of Automation in recognition of this fact.

We hope that this guide and the guidance it provides will be of benefit to all practitioners of automation in the process industries.


Introduction

1. Techniques and applications

We can look at instrumentation work in two ways: by technique or by application. When we consider instrumentation by technique, we survey one scientific field, such as radioactivity or ultrasonics, and look at all the ways in which it can be used to make useful measurements. When we study instrumentation by application, we cover the various techniques for measuring a particular quantity. Under flowmetering, for instance, we look at many methods, including tracers, ultrasonics, and pressure measurement. This guide is mainly applications oriented, but in a few cases, notably pneumatics and the employment of nuclear technology, the technique has been the primary unifying theme.

2. Accuracy

The most important question in instrumentation is the accuracy with which a measurement is made. It is such a universal issue that we discuss it now as well as in the individual chapters to follow. Instrument engineers should be skeptical of accuracy claims, and they should hesitate to accept their own reasoning about the systems they have assembled. They should demand evidence, and preferably proof. Above all, they should be clear in their own minds about the level of accuracy needed to perform a job. Too much accuracy will unnecessarily increase costs; too little may cause performance errors that make the project unworkable.

Accuracy is important but complex. We must first distinguish between systematic and random errors in an instrument. Systematic error is the error inherent in the operation of the instrument, and calibrating can eliminate it. We discuss calibration in several later chapters. Calibration is the comparison of the reading of the instrument in question to a known standard and the maintenance of the evidentiary chain from that standard. We call this traceability.

The phrase random errors implies the action of probability. Some variations in readings, though clearly observed, are difficult to explain, but most random errors can be treated statistically without knowing their cause. In most cases it is assumed that the probability of error is such that errors in individual measurements have a normal distribution about the mean, which is zero if there is no systematic error.

This implies that we should quote errors in terms of the probability that the true value lies within a stated range: the wider the quoted range, the higher the probability that it contains the true value.
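
As a minimal illustration of this point (not drawn from the guide itself), assuming normally distributed errors, the probability that a reading falls within plus or minus k standard deviations of the mean can be computed from the error function; the familiar figures of roughly 68, 95, and 99.7 percent correspond to ranges of one, two, and three standard deviations.

```python
import math

def coverage_probability(k):
    """Probability that a normally distributed error falls within
    +/- k standard deviations of the mean (the quoted error range)."""
    return math.erf(k / math.sqrt(2.0))

for k in (1.0, 2.0, 3.0):
    print(f"+/-{k} sigma covers {coverage_probability(k):.1%} of readings")
```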

When we consider a measurement chain with several links, the worst-case and statistical approaches give increasingly different figures.

If we think in terms of what is merely possible, we must allow that the errors in each link can be extreme and in the same direction, calling for a simple addition when calculating the possible total error. This is improbable, however, so the "chain error" that corresponds to a given probability, e_c, is appreciably smaller. In fact, statistically,

e_c = √(e_1² + e_2² + …)

where e_1, e_2, and so on are the errors in the different links, each corresponding to the same probability as e_c.
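
A short sketch of this root-sum-square combination, with illustrative error values of our own choosing:

```python
import math

def chain_error(*link_errors):
    """Combine independent link errors statistically (root sum of squares);
    each input must be quoted at the same probability level as the result."""
    return math.sqrt(sum(e * e for e in link_errors))

# Three links with errors of 0.5, 0.3, and 0.2 units at the same confidence:
print(chain_error(0.5, 0.3, 0.2))   # ~0.62, versus 1.0 by simple addition
```

As the text notes, the statistical figure is appreciably smaller than the worst-case sum.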

We can think of influence quantities as the causes of random errors. Most devices that measure a physical quantity are influenced by other quantities. Even in the simple case of a tape measure, the tape itself is influenced by temperature, so a tape measure will give a false reading unless the influence is allowed for. Instruments should be as insensitive as possible to influence quantities, and users should be aware of them. The effects of these influence quantities can often be reduced by calibrating under conditions as close as possible to the live measurement application. Influence quantities can often be quite complicated. It might not only be the temperature that can affect the instrument but also the change in temperature; even the rate of change of the temperature can be the critical component of the influence quantity. To make it more complex still, we must also consider the differential between the temperatures of the various instruments that make up the system.
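
As a hedged illustration of correcting for an influence quantity, consider the tape-measure example: assuming a steel tape with a typical expansion coefficient and a stated calibration temperature (both values are ours, chosen for illustration), the reading can be corrected as follows.

```python
ALPHA_STEEL = 11.5e-6   # typical expansion coefficient of steel, per deg C (assumed)
T_CAL = 20.0            # assumed calibration temperature, deg C

def corrected_length(reading, temperature):
    """Correct a steel-tape reading for thermal expansion: a warm tape has
    stretched graduations and therefore under-reads the true length."""
    return reading * (1.0 + ALPHA_STEEL * (temperature - T_CAL))

print(corrected_length(50.0, 35.0))   # 50 m read at 35 deg C is really ~50.0086 m
```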

One particular factor that could be thought of as an influence quantity is the direction in which the quantity to be measured is changing. Many instruments give slightly different readings according to whether, as it changes, the particular value of interest is approached from above or below.

This phenomenon is called hysteresis.

If we assume that the instrument output is exactly proportional to the measured quantity, any discrepancy from that proportionality is called nonlinearity error. Nonlinearity error is the maximum departure of the true input/output curve from the idealized straight line approximating it.

It may be noted that this does not cover changes in incremental gain, the term used for the local slope of the input/output curve. Special cases of the accuracy of conversion from digital to analog signals, and vice versa, are discussed in Section 29. Calibration at sufficient intermediate points in the range of an instrument can cover systematic nonlinearity.
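
One way to quantify the nonlinearity error defined above, sketched here under the common "terminal line" convention (the straight line joining the end points; other references use a best-fit line, and the figures below are invented for illustration):

```python
def nonlinearity_error(inputs, outputs):
    """Maximum departure of the measured input/output curve from the
    straight line joining its end points (terminal-based nonlinearity)."""
    x0, y0 = inputs[0], outputs[0]
    slope = (outputs[-1] - y0) / (inputs[-1] - x0)
    return max(abs(y - (y0 + slope * (x - x0)))
               for x, y in zip(inputs, outputs))

# A slightly bowed response over a 0-10 range (illustrative figures):
print(nonlinearity_error([0, 2.5, 5, 7.5, 10], [0.0, 2.6, 5.15, 7.6, 10.0]))  # 0.15
```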

Microprocessor-based instrumentation has reduced the problem of systematic nonlinearity to a simple issue. Most modern instruments have the internal processing capability to do at least a multipoint breakpoint linearization. Many can even host and process complex linearization equations of third order or higher.
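
A minimal sketch of multipoint breakpoint linearization as described above, using a hypothetical calibration table (the function name and the table values are ours, invented for illustration):

```python
from bisect import bisect_right

def linearize(raw, table):
    """Map a raw sensor value to an engineering value by linear
    interpolation in a breakpoint table of (raw, engineering) pairs,
    sorted by raw value. Values beyond the table are extrapolated
    from the nearest segment."""
    raws = [r for r, _ in table]
    i = min(max(bisect_right(raws, raw), 1), len(table) - 1)
    (r0, e0), (r1, e1) = table[i - 1], table[i]
    return e0 + (e1 - e0) * (raw - r0) / (r1 - r0)

# Hypothetical five-point table for a nonlinear level sensor (volts -> percent):
table = [(0.0, 0.0), (1.2, 25.0), (2.1, 50.0), (3.3, 75.0), (4.0, 100.0)]
print(linearize(2.7, table))   # 62.5, interpolated between the 50% and 75% points
```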

Special terms used in the preceding discussion are defined in BS 5233, in several ANSI standards, and in the ISA Dictionary of Instrumentation, among numerous other references.

The general approach to errors that we have outlined is a statistical treatment of a static situation.

Communications theory emphasizes working frequencies and the time available, and this approach to error is gaining importance in instrumentation technology as instruments become more intelligent. Sensors connected to digital electronics suffer little or no error from electronic noise, but the most accurate results can still be expected from longer measurement times.
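
A small simulation of that last point, assuming white Gaussian sensor noise (the figures are ours): averaging n readings shrinks the standard error of the mean by a factor of √n, so spending longer on a measurement buys accuracy.

```python
import random
import statistics

TRUE_VALUE = 10.0    # assumed true value of the measured quantity
NOISE_SIGMA = 0.2    # assumed standard deviation of the sensor noise

def measure(n_samples):
    """Average n noisy readings; the standard error of the mean
    falls as 1/sqrt(n), so longer measurement times improve accuracy."""
    return statistics.mean(
        random.gauss(TRUE_VALUE, NOISE_SIGMA) for _ in range(n_samples)
    )

for n in (1, 16, 256):
    print(n, round(measure(n), 4))   # scatter shrinks roughly 4x per 16x samples
```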

Instrument engineers must be very wary of measuring the wrong thing! Even a highly accurate measurement of the wrong quantity may cause serious process upsets. Significantly for instruments used in control, an analogue of Heisenberg's uncertainty principle applies at the macro level as well as the subatomic: the operation of measurement can often disturb the quantity being measured.

This can happen in most fields: a flowmeter can obstruct flow and reduce the velocity to be measured, an over-large temperature sensor can cool the material studied, and a low-impedance voltmeter can reduce the potential it is monitoring. Part of the instrument engineer's task is to foresee and avoid errors resulting from the effect the instrument has on the system it is being used to study.

3. Environment

Instrument engineers must select their devices based on the environment in which they will be installed. In plants there will be extremes of temperature, vibration, dust, chemicals, and abuse. Instruments for use in plants are very different from those that are designed for laboratory use.

Two kinds of ill effects arise from badly selected instruments: false readings from exceptional values of influence quantities and the irreversible failure of the instrument itself.

Sometimes manufacturers specify limits to working conditions. Sometimes instrument engineers must make their own judgments. When working close to the limits of the working conditions of the equipment, a wise engineer de-rates the performance of the system or designs environmental mitigation.

Because instrumentation engineering is a practical discipline, a key feature of any system design must be the reliability of the equipment. Reliability is the likelihood of the instrument, or the system, continuing to work satisfactorily over long periods. We discuss reliability in depth in Chapter 43; it must always be taken into account in selecting instruments and designing systems for any application.
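
As one common first-order model of reliability (a sketch of our own, not the guide's treatment in Chapter 43), assuming a constant failure rate, the probability of surviving a period of service follows an exponential law:

```python
import math

def survival_probability(hours, mtbf_hours):
    """Probability an instrument is still working after 'hours' of service,
    assuming a constant failure rate (exponential reliability model)."""
    return math.exp(-hours / mtbf_hours)

# An instrument with an assumed 100,000-hour MTBF over a year of continuous use:
print(survival_probability(8760, 100_000))   # ~0.916
```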

4. Units

The introductory chapters of some books discuss which system of units is used therein. Fortunately, the question is becoming obsolete because SI units are adopted nearly everywhere, and certainly in this guide. For the United States and a few other areas where older units still have some currency, we have listed the relationships for the benefit of those who are more at home with the older expressions.
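
For readers converting between systems, a few of the standard relationships, sketched in code (the conversion factors are standard published values):

```python
PSI_TO_PA = 6894.757   # pound-force per square inch -> pascal
INCH_TO_M = 0.0254     # inch -> meter (exact by definition)

def fahrenheit_to_celsius(deg_f):
    """Convert a temperature from degrees Fahrenheit to degrees Celsius."""
    return (deg_f - 32.0) * 5.0 / 9.0

print(f"150 psi  = {150 * PSI_TO_PA / 1e5:.3f} bar")           # ~10.342 bar
print(f"12 in    = {12 * INCH_TO_M:.4f} m")                    # 0.3048 m
print(f"212 degF = {fahrenheit_to_celsius(212.0):.1f} degC")   # 100.0 degC
```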
