Measuring Plant Stress with an Infrared Thermometer

  • Hatfield J

Abstract

Infrared thermometers are rapid, reliable instruments for measuring foliage temperature. These instruments are relatively simple to use, with only a few considerations, e.g., field of view, target dimensions, and calibration. The foliage temperature can be incorporated into crop water stress indices that have been related to soil water availability and leaf water potential. The number of samples that need to be collected is relatively small. The infrared thermometer provides a technique for the remote detection of stress in all types of plants. Plant stress measurements with the hand-held infrared thermometer (IRT) have become increasingly popular in the last 10 years, following the introduction of the portable, battery-powered IRT. Digital displays of foliage temperature allow for quick and easy measurements. Possible sources of error and potential uses of the IRT to measure water stress are presented here. It is assumed that a reduction in soil water will result in stomatal closure and cause an increase in foliage temperature. In practice, any situation (e.g., disease, fertility, root pruning) that affects transpiration will affect foliage temperature.

Theory of operation

Infrared thermometers measure the energy emitted by the surface and focused onto the detector, rather than surface temperature as such. A complete theoretical treatment of infrared sensors and filters is given by Wolfe and Zissis (1978). The IRT detector is filtered to allow only a specific waveband, typically 8 to 14 µm, onto the detector. A 10.5- to 12.5-µm waveband is commonly used to avoid strong water absorption bands. This captured energy, E, is converted to temperature, T, via the Stefan-Boltzmann law, E = εσT⁴, where ε is the emissivity of the object and σ the Stefan-Boltzmann constant (5.67 × 10⁻⁸ W·m⁻²·K⁻⁴). The energy emitted is proportional to the fourth power of temperature and to the emissivity of the surface.
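The conversion in the Stefan-Boltzmann law, and the error from reading a non-blackbody as if ε = 1, can be sketched in a few lines. The function names and the 0.95 emissivity value are illustrative assumptions, and reflected background radiation is ignored for simplicity:

```python
# Sketch: Stefan-Boltzmann conversion E = eps * sigma * T**4 and its
# inverse. Reflected background radiation is ignored; real emissivity
# corrections also account for it.

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def emitted_energy(t_kelvin, emissivity=0.98):
    """Energy flux (W m^-2) emitted by a surface at t_kelvin."""
    return emissivity * SIGMA * t_kelvin ** 4

def surface_temperature(energy, emissivity=0.98):
    """Invert the law: recover surface temperature from measured flux."""
    return (energy / (emissivity * SIGMA)) ** 0.25

# A leaf at 300 K with emissivity 0.95, read as if it were a blackbody
# (emissivity 1.0), appears cooler than it really is:
e = emitted_energy(300.0, emissivity=0.95)
apparent = surface_temperature(e, emissivity=1.0)
print(round(300.0 - apparent, 2))  # prints 3.82 — an error of a few K
```

This is the magnitude of error behind the emissivity caution in the next section: an uncorrected emissivity of 0.9 to 0.99 shifts the apparent temperature by a few kelvin.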
Emissivity can be thought of as an "emittance efficiency factor"; for most biological objects it varies from 0.9 to 0.99 in the 10.5- to 12.5-µm waveband. Not correcting for emissivity can cause a 1 to 3K error. Most plants have emissivities of 0.97 to 0.99 in this waveband; however, a correction may be necessary if a particular method requires high accuracy of temperature measurement. Measurements of emissivity can be made using the methods described by Fuchs and Tanner (1966) and Idso et al. (1969). The amount of energy gathered by the detector depends on the field of view (FOV) of the instrument; however, the energy flux is normalized to a unit area. The FOV is the angle at the apex of a cone, with the apex being the detector. A large FOV will "see" more area, while a narrow FOV will "see" only a small spot. Infrared thermometer FOVs range from 0.1 to 50°; however, most IRTs used in plant stress research have a 4 or 15° FOV. The larger the FOV, the more area is represented by the average temperature. With a small FOV, one can measure a spot temperature; a 1° FOV may be used to measure fruit or flower temperature, where a well-defined target must be seen. It is important to understand that the distance the IRT is held from the object does not influence the surface temperature recorded, but does affect the size of the viewed area (Fig. 1). The viewed area of a 4° FOV IRT held perpendicular to the target increases as a function of the distance. O'Toole and Real (1984) discuss this problem in their description of IRT target size. The energy emitted by a surface and detected by the IRT is not influenced by the intervening atmosphere, because the atmosphere is nearly transparent in the wavebands to which the IRT is sensitive.

Calibration

Like any other instrument, the IRT requires calibration to provide accurate and reliable readings.
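The dependence of viewed area on distance follows directly from the cone geometry described above: the spot diameter is 2·d·tan(FOV/2) for an IRT held perpendicular to the target at distance d. A minimal sketch (the function name is invented for illustration):

```python
import math

def spot_diameter(distance_m, fov_deg):
    """Diameter (m) of the circular area viewed by an IRT held
    perpendicular to the target, from the cone geometry: the FOV is
    the apex angle of a cone whose apex is the detector."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)

# A 4-degree FOV instrument 1 m from the canopy sees a spot about
# 7 cm across; doubling the distance doubles the spot diameter,
# though the temperature reading itself does not change.
print(round(spot_diameter(1.0, 4.0) * 100, 1))   # prints 7.0 (cm)
print(round(spot_diameter(2.0, 15.0) * 100, 1))  # prints 52.7 (cm)
```

The second line shows why a wide 15° FOV at working distance can easily sweep in soil or neighboring plants, the error source taken up under "Principles of use."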
Several manufacturers of IRTs sell blackbody standards that can be used to check the stability of an IRT; however, these are not calibration devices. A recommended procedure to check the stability of an IRT is to compare the IRT reading with the blackbody standard before and after a series of field measurements. If there is a stability problem with the IRT, or if it is sensitive to ambient conditions, this comparison will detect these errors. Calibrations of the IRT are best made in controlled situations where the ambient temperature can be maintained relatively constant and the target temperature varied from 0 to 50C. Calibration standards range from blackbody cones positioned in controlled water baths to specifically designed units for IRT calibration (Wolfe and Zissis, 1978). A blackbody standard can be constructed from a thin metal cylinder, preferably copper, with a slanted, sealed end. The interior should be painted with a dull, flat black paint, and calibrated high-resolution temperature sensors should be attached to the outside surface where the IRT detects the temperature. The slanted end increases the area viewed by the IRT. Commercially available calibration standards use either parallel or concentric v-shaped grooves or intersecting conical cavities to increase the surface area of the plate. These plates are thermoelectrically controlled and typically hold the temperature to within 0.5C (Wolfe and Zissis, 1978). Calibration is done by comparing the IRT temperature reading to the temperature of the calibration standard. All IRTs should be calibrated once per year and, when in use, checked daily against a portable secondary standard to evaluate stability. Sadler and van Bavel (1982) describe a simple procedure for calibrating an IRT that would be appropriate for most researchers.
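A comparison of IRT readings against a standard over the 0 to 50C range is often summarized as a simple linear correction fitted by least squares. The sketch below uses fabricated paired readings purely for illustration, not the specific procedure of Sadler and van Bavel (1982):

```python
# Sketch of a simple calibration: fit IRT readings to blackbody-standard
# temperatures with ordinary least squares (no external libraries).
# The paired readings below are illustrative, not measured data.

def linear_fit(x, y):
    """Return slope and intercept of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Blackbody standard temperatures (C) and the IRT's displayed readings (C):
standard = [0.0, 10.0, 20.0, 30.0, 40.0, 50.0]
irt      = [0.8, 10.5, 20.3, 29.9, 39.6, 49.2]

slope, intercept = linear_fit(irt, standard)

def corrected(reading):
    """Apply the fitted calibration to a field reading (C)."""
    return slope * reading + intercept

print(round(corrected(25.0), 2))
```

Daily checks against a portable secondary standard then only need to confirm that readings have not drifted away from this fitted line.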
Principles of use

The primary precautions to take when using an IRT concern the FOV and the angle at which the IRT is positioned relative to the object (O'Toole and Real, 1984). A wide-FOV IRT, e.g., 15°, will view a large target area and could detect energy emitted by the soil, surrounding plants, or sky. These objects may be at temperatures different from that of the intended target, creating a source of error that depends on the magnitude of the temperature difference between the intended target and the other objects and on the relative area of the FOV they occupy. All of these extraneous energy sources influence the temperature displayed on the IRT. O'Toole and Real (1983) discussed the implications of FOV on the proper use of the

Fig. 1. Influence of IRT distance from the target on the area seen by the IRT when it is positioned perpendicular to the target, for two fields of view (FOV).
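Because the detector averages emitted energy rather than temperature, extraneous objects in the FOV bias the reading in proportion to the fourth powers of the component temperatures. A sketch of this mixing, assuming equal emissivities for simplicity (the function name and the area fractions are illustrative):

```python
# Sketch: how a cooler canopy and warmer soil combine within one FOV.
# The detector averages emitted energy, so the composite temperature
# mixes as the fourth power of the component temperatures
# (emissivities taken as equal here for simplicity).

def composite_temperature(temps_k, fractions):
    """Blackbody-equivalent temperature (K) of a mixed scene; fractions
    give the share of the viewed area occupied by each component."""
    energy = sum(f * t ** 4 for t, f in zip(temps_k, fractions))
    return energy ** 0.25

# Canopy at 298 K filling 80% of a wide FOV, sunlit soil at 318 K
# filling the remaining 20%:
t_mix = composite_temperature([298.0, 318.0], [0.8, 0.2])
print(round(t_mix - 298.0, 1))  # prints 4.3 — the soil biases the reading upward
```

A bias of this size is large compared with the canopy-to-air temperature differences used in water stress indices, which is why a narrow FOV or careful aiming matters for wide-FOV instruments.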

Hatfield, J. L. (2019). Measuring Plant Stress with an Infrared Thermometer. HortScience, 25(12), 1535–1538. https://doi.org/10.21273/hortsci.25.12.1535
