Basic Thermometry Concepts
An understanding of the basic concepts and features available in thermometry is helpful when you set out to shop for and compare different thermometers. Here is a brief review of the basic concepts you should understand and consider…
Reproducibility
The history of thermometry helps draw attention to some of the basic challenges in recording and using accurate temperature information. Adopting a universal scale (whether Fahrenheit, Celsius, Kelvin, Rankine or the more obscure Delisle, Réaumur or Rømer scales) makes the establishment of scientific standards possible, as well as the direct comparison of relative temperature data from place to place and instrument to instrument. It also hints at the importance of “reproducibility” in thermometry.
A thermometer measuring the ice point of water should read 32°F (or 0°C) every time you measure it, not 32°F (0°C) one time, 34°F (1°C) the next time and 30°F (-1°C) the next. That would defeat the purpose of a universal scale used to compare the relative temperatures of dissimilar materials and environments.
Reproducibility, accuracy and resolution are the foundations upon which all good thermometer technology is built, but they are three different things. Some expensive thermometers on the market today are quite precise and fairly accurate on occasion, yet not reliably reproducible. That means you may or may not have accurate temperature data, depending upon the performance of the thermometer at the particular time you take your measurement.
One common challenge to the reproducibility of thermometers is a phenomenon known as “hysteresis.” With hysteresis, the physical properties of an instrument, such as a thermometer probe, are temporarily changed by the process of taking a measurement. Thermometers exhibiting hysteresis will display different temperatures in the same material (say, an ice bath) over a short period of time and are therefore not “reproducible.”
This problem is common with “mechanical” thermometers like bimetal dial thermometers but can also affect “electronic” thermometers like instant-read digitals. An extended resting period, to allow the physical properties of the instrument to return to normal, can sometimes restore accuracy, but often only temporarily. Or, as with dial thermometers, they may have to be re-calibrated regularly.
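One practical way to test reproducibility yourself is the classic ice-bath check: take several readings in a properly made ice bath and examine the spread. Here is a minimal sketch of that check in Python; the sample readings and the ±1.0°F tolerance are illustrative assumptions, not a published specification.

```python
# Ice-bath reproducibility check: repeated readings of the same 32.0°F
# reference should cluster tightly around it and around each other.

def reproducibility_check(readings_f, reference_f=32.0, tolerance_f=1.0):
    """Return the spread, worst error, and pass/fail for a set of readings."""
    spread = max(readings_f) - min(readings_f)               # instrument scatter
    worst_error = max(abs(r - reference_f) for r in readings_f)
    passed = spread <= tolerance_f and worst_error <= tolerance_f
    return spread, worst_error, passed

# Hypothetical readings from a probe showing hysteresis-like scatter.
readings = [32.1, 33.8, 30.4, 32.0, 31.7]
spread, error, ok = reproducibility_check(readings)
print(f"spread={spread:.1f}°F, worst error={error:.1f}°F, pass={ok}")
```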
Accuracy
One of the most basic attributes of any thermometer is its accuracy, and because a thermometer is only as good as the temperatures it takes, accuracy is of the utmost importance.
The National Institute of Standards and Technology (NIST) provides a way for calibrated thermometers and their temperatures to be traceable to a national standard, thereby giving the user a guarantee of accuracy.
“Traceability” is characterized by several essential elements, including:
- An unbroken chain of comparisons going back to stated references acceptable to the parties, usually a national or international standard (e.g. NIST)
- Measurement uncertainty. The uncertainty of measurement for each step in the traceability chain must be calculated or estimated according to agreed methods and must be stated so that an overall uncertainty for the whole chain may be calculated or estimated. (Uncertainty is represented by a plus-or-minus sign in front of a number, e.g. ±0.7°F. A worked example of combining step uncertainties follows this list.)
- Documentation. Each step in the chain must be performed according to documented and generally acknowledged procedures, and the results must be recorded.
- Competence. The laboratories or bodies performing one or more steps in the chain must supply evidence for their technical competence.
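When the steps in a traceability chain are independent, standard metrology practice combines their individual uncertainties as a root sum of squares to arrive at the overall figure. Below is a minimal sketch of that calculation in Python; the three step values are made up purely for illustration.

```python
import math

# Root-sum-square combination of independent step uncertainties in a
# traceability chain (step values here are purely illustrative).
step_uncertainties_f = [0.5, 0.4, 0.2]   # hypothetical ± values per step
overall = math.sqrt(sum(u ** 2 for u in step_uncertainties_f))
print(f"overall chain uncertainty ≈ ±{overall:.2f}°F")   # ≈ ±0.67°F
```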
History
The level to which accuracy could be measured took a major leap forward with the invention of electronic thermometers and digital displays. Anyone older than Generation X can likely remember peering at the liquid in a glass tube thermometer, or even the dial on a bimetal thermometer, and trying to determine which hashmark the measurement was closest to.
Sophisticated industrial and scientific processes have come to depend upon very accurate temperature measurements. Slight increases or decreases in temperature can have profound effects upon the growth of bacteria, the pliability of plastics, the interaction of chemicals, the health of a patient, etc., and electronic thermometers with digital displays make it easy to determine temperature within a tenth of a degree or less.
Drift
The potential for instruments to lose accuracy over time is sometimes called “drift.” Drift in thermometers necessitates periodic calibration against standards. In the United States, NIST is responsible for setting the standards against which the accuracy of instruments is checked and reset.
Electronic thermometers with computer circuitry can sometimes perform very complex calculations, factoring in such things as the effect of ambient temperature on the thermometer’s own circuitry, to determine a measurement with greater accuracy and reproducibility. But by separating the temperature sensor (the probe) from the temperature calculator and display (the meter) into distinct devices, they also introduce the possibility of additional error.
With mechanical thermometers like liquid thermometers and dial thermometers, the display is directly manipulated by the physical properties of the temperature sensor itself (the expansion of the liquid or bimetal coil). Dial thermometers need frequent (weekly, if not daily) recalibration, but because the sensor and display are a single mechanism, only one calibration is needed at a time.
Electronic thermometers, particularly those that take interchangeable probes, may only need to be calibrated once a year (depending upon use), but both the probes and the meter should be calibrated for accuracy. Electronic thermometers and probes that are calibrated together can often mitigate the potential for composite errors. Such probe/meter combinations are said to be “system calibrated” (see Basic Thermometry Concepts: Regulations & Calibration). Accuracy can also vary over the full range of temperatures measured by a given thermometer (see Basic Thermometry Concepts: Range).
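To make the idea of correcting for drift concrete, here is a minimal sketch of a one-point offset adjustment derived from an ice-bath check. Real calibration procedures are more involved (and may adjust gain across the range as well); the numbers below are hypothetical.

```python
# One-point offset calibration against an ice bath (32.0°F reference).
# A drifted instrument reading 32.7°F in the bath has a +0.7°F offset;
# subtracting that offset corrects subsequent readings near that range.

ICE_POINT_F = 32.0

def offset_from_ice_bath(reading_in_bath_f):
    return reading_in_bath_f - ICE_POINT_F

def corrected(raw_reading_f, offset_f):
    return raw_reading_f - offset_f

offset = offset_from_ice_bath(32.7)          # hypothetical drifted instrument
print(round(corrected(161.2, offset), 1))    # 160.5 (°F, after correction)
```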
Resolution
Resolution refers to the smallest detectable increment of measurement on an instrument. A thermometer that displays temperature readings to the hundredth of a degree (e.g. 100.26°) has a greater resolution than one that only shows tenths of a degree (e.g. 100.3°) or whole degrees (e.g. 100°).
Although resolution is different from accuracy, the two should be thought of as going hand in hand. After all, a thermometer that is accurate to ±0.05° wouldn’t be half as useful if its resolution were only in tenths of a degree (e.g. 0.1°). Likewise, it could be misleading for a thermometer to show hundredths of a degree on its display if its traceable accuracy were only ±1°.
Sometimes the resolution of a given thermometer changes above or below a certain temperature. The ThermoWorks Precision Plus Thermometer, for example, has a resolution of 0.01° up to 199.9°F and a resolution of 0.1° above 199.9°F all the way up to its upper range specification (1562.0°F [850.0°C]). A thermometer that automatically adjusts its resolution at the critical temperature, like the Precision Plus, is said to be “auto-ranging.”
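To make auto-ranging concrete, the sketch below shows display logic that switches from 0.01° to 0.1° resolution at a changeover temperature. The 200°F threshold mirrors the Precision Plus example above, but the code itself is purely illustrative.

```python
# Auto-ranging display: finer resolution below a changeover temperature,
# coarser resolution above it (thresholds mirror the example in the text).

def format_reading(temp_f):
    if temp_f < 200.0:
        return f"{temp_f:.2f}°F"   # 0.01° resolution in the lower range
    return f"{temp_f:.1f}°F"       # 0.1° resolution in the upper range

print(format_reading(98.638))   # "98.64°F"
print(format_reading(451.07))   # "451.1°F"
```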
In rare cases, the resolution of a thermometer can be affected by the limitations of its digital display – older thermometer models often only had space to display three digits, so even though the thermometer and probe were precise to the tenth of a degree, beyond 99.9° or -99.9° only whole degrees were displayed (i.e. 100° or -100°).
Range
Resolution, accuracy and reproducibility form the foundation of thermometer technology, but there are other important considerations when choosing an instrument.
Range describes the upper and lower limits of a thermometer’s measurement scale. Different thermometer technologies tend to perform best in different ranges of measurement. Some specialize in very hot temperatures, some in very cold ones, and some have a broader range. It’s not uncommon for a thermometer to have different accuracy or resolution specifications at the center of its range than at its outer limits.
Read specification tables carefully. The better idea you have of the particular temperatures you’re likely to be measuring (say, for example, baking temps at 300-400°F), the more carefully you’ll be able to select a technology that performs best in that range.
Note: If you choose a meter and a probe set, be sure both the meter and the probe have their best performance in the range you are targeting.
Speed
Speed, or “response time,” is a very important consideration when choosing a thermometer. Some thermometer technologies are faster than others, and depending on the application, additional seconds (or fractions of a second) can make all the difference.
Response time can be affected by many factors, including the position of the sensor relative to the substance being measured, the mass of the sensor itself, the speed of the processor doing the calculations, the length of the wiring between the sensor and the processor, and the type of technology used.
In general, electronic thermometers are faster than mechanical thermometers (like liquid mercury or dial thermometers). Thermocouple sensors are faster than resistance technologies (like the thermistor or the RTD), and reduced tip probes are faster than standard-diameter probes (because the sensor is closer to the material being measured and the mass of the sensor is smaller and therefore more responsive to changes in temperature).
Time Constants
In technical catalogs and websites, including ThermoWorks.com, response time is often listed in increments called “time constants.” It can be a little confusing, but one time constant is the time it takes for a given instrument to register 63% of a change to a new temperature. To achieve a practical equivalent of a full reading, four more time constants are needed, for a total of five time constants to get an accurate temperature.
If a technical spec table lists a given probe as having a time constant of 0.5 seconds, you can expect to get a full reading with that probe at 2.5 seconds (or five times the listed time constant). This is important to remember so you can be sure you’re comparing apples to apples when weighing instruments of different brands or with differently stated specifications.
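The 63% figure comes from the exponential response of a first-order sensor: after one time constant the reading has covered 1 − 1/e ≈ 63.2% of the change, and after five it has covered about 99.3% – a practically full reading. A short sketch of that arithmetic, assuming the 0.5-second time constant from the example above:

```python
import math

# First-order sensor response to a step change in temperature:
# the fraction of the change covered after time t is 1 - exp(-t / tau).

def response_fraction(t_seconds, tau_seconds):
    return 1.0 - math.exp(-t_seconds / tau_seconds)

TAU = 0.5   # listed time constant in seconds (from the example above)
for n in range(1, 6):
    t = n * TAU
    print(f"{n} time constant(s) ({t:.1f}s): {response_fraction(t, TAU):.1%}")
# 1 tau ≈ 63.2%, 3 tau ≈ 95.0%, 5 tau ≈ 99.3%: a practically full reading.
```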
Commercial Claims
Commercial claims – like the one associated with the Super-Fast Thermapen (i.e. it reads to within 1°F of the final temperature of an ice bath within 3 seconds) – are full-reading claims. The technical response time of the Thermapen is thus 0.6 seconds, or 3 seconds divided by five. Don’t be misled by competitive thermometers that claim a “response time” of 3 seconds “like a Thermapen” when their actual time to a full reading is 15 seconds. Technical specs, like those on our probe selection pages, use time constants and will need to be multiplied by five to approximate the time for a full reading.
Reading Update Rate
Another number that can be misleading is the “reading update rate.” This number refers only to the frequency with which the digital processor of a thermometer samples the sensor. The Super-Fast Thermapen has an update rate of 0.5 seconds. That means the digital display will show changes in the temperature as measured by the sensor every half second, but it has nothing to do with the speed with which the sensor itself will adjust to the temperature of the material being measured.
Finally, as with accuracy, the “real” response time of a thermometer varies depending on the particular substance and the range of temperatures being measured. Spec tables give outside limits, not exact speeds.
It’s important to remember that – just as with accuracy – the total response time of a system (i.e. meter and probe) may well be the aggregate of the response times of the individual components (i.e. the meter response time plus the probe response time). That’s one of the things that makes integrated systems like the Super-Fast Thermapen and the ThermoWorks FoodCheck appealing – the response times listed are composite.
Recording Features
One of the most valuable features of today’s thermometers is the ability to view, record and manipulate the measurements that you take. Here are a few advanced features included in some of our more dynamic thermometers:
- Recording maximum and minimum (Max/Min) temperatures is a very helpful feature, particularly when trying to determine if a target has been kept within designated temperature boundaries over an extended period of time – as with data logging. Thermometers with Max/Min functionality display the highest and lowest temperatures encountered. Some mechanical thermometers do this with physical markers that get pushed up or down over time, but Max/Min is more common with electronic instruments. *Note that electronic instruments with Max/Min often do not have an auto-off feature, since turning an instrument off resets its Max/Min recordings.
- Hold is a feature that allows you to freeze a displayed measurement (usually a digital reading) for later consultation.
- Differential recordings (Diff.) display the result of subtracting the minimum temperature encountered from the maximum temperature encountered, showing the range of deviation over a span of time.
- Average temperature recording (Avg.) simply averages all the measurements encountered over a span of time. (A minimal code sketch of these recording calculations follows this list.)
- High and Low Alarms alert you (by blinking, beeping or even sending you an email or text message) when a measurement has gone above or below a certain preset temperature.
- Auto-off is a feature that shuts the instrument off after a specified amount of time to protect long-term battery life. Some units also come with the ability to disable this feature for more extended measurements.
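As a minimal sketch of how the recording calculations above relate to one another, the Python class below tracks Max/Min, Diff and Avg over a stream of readings. It is purely illustrative and does not represent any particular instrument’s firmware.

```python
# Running Max/Min, Diff (max - min), and Avg over a stream of readings.

class TemperatureLog:
    def __init__(self):
        self.readings = []

    def record(self, temp_f):
        self.readings.append(temp_f)

    def max_min(self):
        return max(self.readings), min(self.readings)

    def diff(self):
        hi, lo = self.max_min()
        return hi - lo

    def avg(self):
        return sum(self.readings) / len(self.readings)

log = TemperatureLog()
for t in [36.1, 38.4, 35.7, 40.2, 37.0]:   # hypothetical fridge temps, °F
    log.record(t)
print(log.max_min(), round(log.diff(), 1), round(log.avg(), 1))
```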
Thermometer Sensor Technologies
As we have discussed previously, there are thermometers that measure many different types of physical characteristics, but the five most common are liquid expansion devices, bi-metallic devices, resistance temperature devices (RTDs and thermistors), thermocouples and infrared radiation devices.
Bi-metals have dial displays. The dial is connected to a spring coil at the center of the probe. The spring coil is made of two different types of metal that expand in different (yet predictable) ways when exposed to heat. The expansion of the coil with heat pushes the needle on the dial. Bi-metal thermometers are very inexpensive but typically take several minutes to come to temperature. In addition, their entire metal coil has to be immersed in the material being measured (usually more than an inch or two deep) to get an accurate reading.
Liquid thermometers and bi-metals are mechanical thermometers that don’t need any electricity to function. Bi-metal thermometers also go out of calibration very easily and need to be re-calibrated weekly or even daily using an ice bath. Adjustments can be made using a simple screw that rewinds the metal coil.
Electronic thermometers (RTDs, thermistors and thermocouples) measure the effects of heat on electrical currents. Resistance devices (RTDs and thermistors) take advantage of the fact that electrical resistance changes with temperature along predictable curves.
Both the thermistor and its high-precision “standards thermometer” cousin – the RTD – measure the resistance of a resistor attached to an electronic circuit to calculate temperature. Thermistors typically use ceramic beads as resistors, while RTDs often use platinum (a highly stable metal) or other metal wires and films.
With thermistors (the common NTC variety), resistance decreases as temperature rises; with RTDs, resistance increases. Both thermistors and RTDs may have a higher degree of accuracy than thermocouples, but their range is limited by comparison and they are generally not as fast.
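To illustrate what a “predictable curve” means in practice, the sketch below converts a platinum RTD resistance to temperature using the common linear approximation R(T) ≈ R₀(1 + αT), with the standard PT100 values (R₀ = 100 Ω at 0°C, α ≈ 0.00385 per °C). Real instruments use more exact polynomial equations, so treat this as a rough sketch only.

```python
# Linear approximation of a PT100 platinum RTD near room temperature:
# R(T) ≈ R0 * (1 + ALPHA * T), with T in °C.

R0 = 100.0        # resistance in ohms at 0°C (the PT100 standard value)
ALPHA = 0.00385   # per °C, the standard platinum temperature coefficient

def pt100_temperature_c(resistance_ohms):
    """Invert R(T) = R0 * (1 + ALPHA * T) to recover T."""
    return (resistance_ohms / R0 - 1.0) / ALPHA

print(round(pt100_temperature_c(109.73), 1))   # ≈ 25.3°C
```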
Thermocouples work on the principle that when two different metals are joined in a circuit and there is a temperature difference along that circuit, a small voltage is generated. That voltage changes with variations in temperature in predictable ways. Common thermocouples weld together nickel-chromium and nickel-aluminum alloys (called Type K), copper and constantan (Type T) or iron and constantan (Type J) and place the weld at the very tip of the thermometer probe.
Since thermocouples only generate voltage when there is a difference in temperature along the circuit (and the reference temperature must be known to calculate a reading), thermocouples either have a “cold junction” where part of the circuit is held at the ice point (32°F/0°C) or use electronic “cold junction” compensation that measures the reference temperature and adjusts the calculation. Thermocouples can detect temperatures across wide ranges and are typically very fast.
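A rough sketch of how cold-junction compensation works: using the approximate Type K sensitivity of about 41 µV per °C (the true voltage-temperature relationship is a polynomial defined in NIST tables), the hot-junction temperature is the measured reference-junction temperature plus the thermocouple voltage divided by the sensitivity. The numbers below are illustrative only.

```python
# Simplified Type K thermocouple reading with cold-junction compensation.
# Real meters use NIST polynomial tables; 41 µV/°C is only a rough linear
# approximation of the Type K sensitivity.

SEEBECK_UV_PER_C = 41.0   # approximate Type K output per °C of difference

def hot_junction_c(measured_microvolts, cold_junction_c):
    """Add the voltage-derived temperature difference to the reference temp."""
    return cold_junction_c + measured_microvolts / SEEBECK_UV_PER_C

# The meter's internal sensor reads the cold junction at 23.0°C and the
# thermocouple produces 3157 µV: the hot junction is about 100°C.
print(round(hot_junction_c(3157.0, 23.0), 1))
```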