Best Practices for Body Temperature Measurement with Infrared Thermography: External Factors Affecting Accuracy

Saved in:
Detailed bibliography
Title: Best Practices for Body Temperature Measurement with Infrared Thermography: External Factors Affecting Accuracy
Authors: Siavash Mazdeyasna, Pejman Ghassemi, Quanzeng Wang
Source: Sensors, Vol 23, Iss 18, p 8011 (2023)
Publisher information: MDPI AG
Year of publication: 2023
Collection: Directory of Open Access Journals: DOAJ Articles
Subjects: elevated body temperature, infrared thermograph, thermography, ISO/TR 13154, viewing angle, external temperature reference source, Chemical technology, TP1-1185
Description: Infrared thermographs (IRTs) are commonly used during disease pandemics to screen individuals with elevated body temperature (EBT). To address the limited research on external factors affecting IRT accuracy, we conducted benchtop measurements and computer simulations with two IRTs, with or without an external temperature reference source (ETRS) for temperature compensation. The combination of an IRT and an ETRS forms a screening thermograph (ST). We investigated the effects of viewing angle (θ, 0–75°), ETRS set temperature (T_ETRS, 30–40 °C), ambient temperature (T_atm, 18–32 °C), relative humidity (RH, 15–80%), and working distance (d, 0.4–2.8 m). We found that STs exhibited higher accuracy than IRTs alone. Across the tested ranges of T_atm and RH, both IRTs exhibited absolute measurement errors of less than 0.97 °C, while both STs maintained absolute measurement errors of less than 0.12 °C. The optimal T_ETRS for EBT detection was 36–37 °C. When θ was below 30°, the two STs underestimated the calibration source (CS) temperature (T_CS) by less than 0.05 °C. The computer simulations showed absolute temperature differences of up to 0.28 °C and 0.04 °C between estimated and theoretical temperatures for IRTs and STs, respectively, for d of 0.2–3.0 m, T_atm of 15–35 °C, and RH of 5–95%. The results highlight the importance of precise calibration and environmental control for reliable temperature readings and suggest suitable ranges for these factors, aiming to enhance current standard documents and best practice guidelines. These insights enhance our understanding of IRT performance and its sensitivity to various factors, thereby facilitating the development of best practices for accurate EBT measurement.
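The abstract attributes the accuracy gain of a screening thermograph (ST) to temperature compensation against the external temperature reference source (ETRS), a blackbody held at a known set temperature within the camera's field of view. The sketch below is a minimal illustration of one common way such compensation is applied, a one-point offset correction; it is an assumption for illustration, not the authors' implementation, and the function name and numeric values are hypothetical.

```python
# Minimal sketch of ETRS-based compensation for a screening thermograph
# (ST = IRT + ETRS). Assumes a simple one-point offset correction: the
# IRT's error on the known-temperature ETRS is used to correct the
# apparent temperature of the subject (or calibration source).

def compensate_with_etrs(t_subject_apparent: float,
                         t_etrs_apparent: float,
                         t_etrs_set: float) -> float:
    """Return the offset-corrected subject temperature in degrees Celsius.

    t_subject_apparent : temperature the IRT reports for the subject
    t_etrs_apparent    : temperature the IRT reports for the ETRS blackbody
    t_etrs_set         : known set-point temperature of the ETRS (e.g. 36.5)
    """
    offset = t_etrs_set - t_etrs_apparent  # error observed on the reference
    return t_subject_apparent + offset


# Example: the IRT reads the ETRS (set to 36.5 C) as 36.1 C and a face as
# 37.0 C, so the corrected estimate is 37.4 C.
if __name__ == "__main__":
    print(compensate_with_etrs(37.0, 36.1, 36.5))
```

Because the ETRS appears in every frame, this kind of correction can track drift caused by ambient temperature, humidity, and working distance, which is consistent with the much smaller errors reported for STs than for IRTs alone.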
Document type: article in journal/newspaper
Language: English
Relation: https://www.mdpi.com/1424-8220/23/18/8011; https://doaj.org/toc/1424-8220; https://doaj.org/article/c545809863304c148c0f38f5aff6dccf
DOI: 10.3390/s23188011
Availability: https://doi.org/10.3390/s23188011
https://doaj.org/article/c545809863304c148c0f38f5aff6dccf
Accession number: edsbas.F756B4E1
Database: BASE