At least some embodiments disclosed herein relate to temperature measurement in general and more particularly but not limited to the detection of persons having fever.
Infrared radiation from a person corresponds to heat dissipation and temperature of the body of the person. Thus, thermal imaging of infrared radiation can be used to measure temperature.
There are different types of thermal imaging techniques. For example, U.S. Pat. No. 9,851,256, issued on Dec. 26, 2017 and entitled “Apparatus and method for electromagnetic radiation sensing”, discloses a thermal imaging device that uses micromechanical radiation sensing pixels to measure the intensity of infrared radiation in different locations of a thermal image. Such a thermal imaging device can have adjustable sensitivity and measurement range and can be utilized for human detection, fire detection, gas detection, temperature measurements, environmental monitoring, energy saving, behavior analysis, surveillance, information gathering and for human-machine interfaces, etc.
The embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings in which like references indicate similar elements.
The following description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding. However, in certain instances, well known or conventional details are not described in order to avoid obscuring the description. References to one or an embodiment in the present disclosure are not necessarily references to the same embodiment; and, such references mean at least one.
At least one embodiment disclosed herein includes a fever scanner that can be used to scan a person within a distance of 0.5 to 1.5 meters and determine whether the person has a fever. For example, the fever scanner can be positioned on a table in a reception area to scan a visitor. The scanner can be configured with an accuracy sufficient to determine whether the visitor has a fever corresponding to a typical symptom of an infectious disease, such as COVID-19, SARS, MERS, flu, etc. Fever can be detected without bringing the scanner in close proximity to the forehead of the visitor and thus avoid socially intrusive actions that can make the visitor uncomfortable.
The fever scanner can be implemented using a combination of a thermal camera and an optical camera. For example, a hybrid camera as disclosed in Prov. U.S. Pat. App. Ser. No. 62/871,660, filed Jul. 8, 2019 and entitled “Hybrid Cameras”, can be used, the entire disclosure of which is hereby incorporated herein by reference.
Such a fever scanner can be affordable, mass deployable, and plug and play, with an accuracy within half a degree Celsius (or Kelvin) in body temperature measurements, without requiring a reference blackbody calibration source.
The high accuracy can be achieved for measuring the body temperature of a person (e.g., visitor) being scanned at varying distances by using an empirical formula to correct the measurement obtained by a thermal camera based on the distance between the person and the scanner. The distance can be measured based on an optical image of the face of the visitor. For example, a correction factor can be added to the temperature measurement calculated based on the thermal image of the facial portion of the visitor. The correction factor can be an empirical function of a distance between the scanner and the visitor. In one implementation, the correction factor in Celsius or Kelvin is proportional to (e.g., equal to) the distance in meters.
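The distance-based correction described above can be sketched as follows. The coefficient of 1.0 degree Celsius per meter is an illustrative assumption drawn from the statement that the correction factor can be proportional to (e.g., equal to) the distance in meters; it is not a calibrated value.

```python
def corrected_temperature(measured_c: float, distance_m: float,
                          coeff_c_per_m: float = 1.0) -> float:
    """Apply a distance-based correction to a thermal-camera reading.

    The correction term is an empirical linear function of the
    scanner-to-subject distance. The default coefficient of 1.0 degree
    Celsius per meter is an assumption for illustration only.
    """
    return measured_c + coeff_c_per_m * distance_m

# A reading of 36.2 C taken at a distance of 1.0 m would be reported
# as approximately 37.2 C under this assumed coefficient.
```

In a deployed scanner, the coefficient (or a more general empirical function) would be fitted against reference measurements rather than assumed.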
The distance between the visitor and the scanner can be measured using an optical camera that captures the facial image of the visitor in visible light. The optical camera can have a resolution substantially higher than the resolution of the thermal camera. Thus, the image generated by the optical camera can be analyzed to determine the face size captured in the image. The face size captured in the image can be used to calculate a distance between the optical camera and the visitor and thus the distance between the scanner and the visitor.
Optionally, the distance can be measured using an alternative technique, such as an ultrasound sensor, a 3D depth camera, or another distance sensor.
For example, a microprocessor controller can be configured in the fever scanner to calculate the temperature of a visitor from the thermal image of the visitor, detect a face in an optical image of the visitor, calculate a distance between the visitor and the scanner/thermal camera, and correct the temperature calculated from the thermal image based on the distance. When the corrected temperature is above a threshold, the fever scanner can generate an alert.
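The controller flow described above can be sketched as one scan cycle. The callables `detect_face`, `estimate_distance`, and `measure_temperature` stand in for the scanner's face-detection, distance-measurement, and temperature modules; their names, the 38.0 degree Celsius threshold, and the 1.0 C/m correction coefficient are illustrative assumptions, not elements of the disclosure.

```python
def scan_visitor(optical_image, thermal_image, detect_face,
                 estimate_distance, measure_temperature,
                 threshold_c: float = 38.0):
    """One scan cycle: detect the face, estimate the distance, correct
    the thermal reading, and decide whether to raise a fever alert.

    Returns True when an alert should be generated, False when the
    corrected temperature is below the threshold, and None when no
    face is detected in the optical image.
    """
    face_box = detect_face(optical_image)
    if face_box is None:
        return None                      # no visitor in view; no reading
    distance_m = estimate_distance(face_box)
    raw_c = measure_temperature(thermal_image, face_box)
    corrected_c = raw_c + 1.0 * distance_m   # assumed linear correction
    return corrected_c > threshold_c         # True -> fever alert
```

Passing the modules in as callables mirrors the modular structure of the scanner (face detection, distance measurement, and temperature modules) without committing to any one implementation of them.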
In some implementations, the detected face in the optical image is used to select an area in the thermal image that corresponds to the face of the visitor to calculate the temperature of the visitor.
In some implementations, the thermal image sensor is configured with a resolution that is sufficient to estimate the distance between the visitor and the scanner. Thus, the optical camera can be omitted in such implementations.
Optionally, a display is presented to guide the visitor to a position for optimal temperature measurement. For example, the optical image of the visitor can be presented with an outline that identifies the expected boundary of the image of the head and shoulders of the visitor when the visitor is at an ideal position and/or distance from the scanner (e.g., 0.5 meter at the center of the view field of the scanner). When the optical image of the visitor only partially fills the outline, the outline superimposed on the optical image indicates that the visitor is off-center in the view field and/or too far from the scanner. Thus, the visitor can adjust his/her position for an improved measurement.
The fever scanner (101) of
The fever scanner (101) includes a face detection module (111) that identifies the face captured in the optical image (107). A distance measurement module (115) computes a distance between the scanner and the person. For example, an artificial neural network (ANN) can be trained to recognize the face/head portion in the optical image (107) and provide a distance between the scanner and the person having the face/head in the optical image (107).
For example, images of persons having different characteristics, with distances measured using another method (e.g., measuring tapes), can be used to train the ANN to predict the measured distances.
Alternatively, after the face detection module (111) identifies a face portion in the optical image (107) and/or its boundary, a size of the face portion can be calculated (e.g., based on a bounding box of the extracted face portion or an area measurement of the face portion in the optical image (107)). A formula can be used to convert the size to the distance between the face and the scanner.
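One such conversion formula is the pinhole-camera relation, sketched below. The focal length in pixels and the 0.15 m average real face width are illustrative assumptions; a deployed scanner would use calibrated values for its particular optical camera.

```python
def distance_from_face_width(face_width_px: float,
                             focal_length_px: float = 1000.0,
                             real_face_width_m: float = 0.15) -> float:
    """Estimate the subject distance from the width of the detected face.

    Uses the pinhole-camera relation distance = f * W / w, where f is the
    focal length in pixels, W an assumed real-world face width in meters,
    and w the face width measured in the image in pixels. Both default
    parameter values are assumptions for illustration.
    """
    return focal_length_px * real_face_width_m / face_width_px

# With these defaults, a 150-pixel-wide face implies a distance of
# about 1.0 m; a 300-pixel-wide face implies about 0.5 m.
```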
The thermal image (109) includes a corresponding facial portion of the person being scanned. In some configurations, the person is instructed to be positioned with a background having a temperature that is substantially lower than the body temperature of a person. Thus, the facial portion can be extracted to calculate a temperature of the person from the radiation intensity of the facial portion.
Optionally, the location of the facial portion in the optical image (107) can be used to identify the corresponding facial portion in the thermal image (109) to calculate a temperature of the person.
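Because the optical and thermal cameras can have very different resolutions, locating the facial portion in the thermal image (109) from its location in the optical image (107) can be done by rescaling the bounding box between the two pixel grids. The sketch below assumes the two cameras share an aligned field of view, so a pure resolution scaling suffices; any parallax between the two lenses is ignored here.

```python
def map_box_to_thermal(box, optical_size, thermal_size):
    """Scale a face bounding box from optical-image pixel coordinates
    to thermal-image pixel coordinates.

    box is (left, top, right, bottom) in optical pixels; optical_size
    and thermal_size are (width, height) of the respective images.
    Assumes aligned fields of view (parallax is ignored in this sketch).
    """
    ox, oy = optical_size
    tx, ty = thermal_size
    left, top, right, bottom = box
    return (round(left * tx / ox), round(top * ty / oy),
            round(right * tx / ox), round(bottom * ty / oy))

# Example: a face box in a 1920x1080 optical image mapped into an
# 80x60 thermal image.
thermal_box = map_box_to_thermal((640, 360, 1280, 720),
                                 (1920, 1080), (80, 60))
```

The infrared intensities inside the resulting thermal-image box can then be averaged (or their maximum taken) to compute the facial temperature.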
The temperature module (113) of the fever scanner (101) is configured to not only calculate the temperature based on the infrared radiation intensity in the thermal image (109), but also adjust the calculated temperature to include a distance-based correction (117). For example, the distance-based correction (117) can be computed from the distance between the face being scanned and the fever scanner (101) based on an empirical formula.
The fever scanner (101) can include an alert generator (123) that compares the output of the temperature module (113) with a threshold (121). When the face temperature is above the threshold (121), the alert generator (123) can provide an indication that fever is detected.
At least some of the computing modules (e.g., 111, 115, 113) in the fever scanner (101) can be implemented via a microprocessor or controller executing instructions. Alternatively, or in combination, some of the computing modules (e.g., 111, 113, 115) can be implemented via logic circuits (e.g., using a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)).
In some embodiments, the fever scanner (101) is enclosed within a housing and configured to be used as a standalone device. In other embodiments, the fever scanner (101) includes a communication port and/or a wireless connection that can be used to transmit the optical image (107) and the thermal image (109) to an external display device and/or an external data storage and processing location.
The user interface includes a panel (133) configured to display the optical image (107) captured by the optical camera (103) of a fever scanner (101) and another panel (135) configured to display the thermal image (109) captured by the thermal camera (105) of the fever scanner (101).
Further, the user interface includes an area (137) configured to present the operation status of the fever scanner (101) and another area (139) configured to present the temperature of a person being scanned.
At block 201, a fever scanner (e.g., 101) captures, using a thermal camera (105), a thermal image (109) of a person.
At block 203, the fever scanner (e.g., 101) measures, using a distance sensor, a distance between the person and the thermal camera (105) of the fever scanner.
For example, the distance sensor can include a 3D depth camera to measure the distance, or an ultrasound generator to determine the distance based on a round trip time of an ultrasound signal.
For example, the distance sensor can include an optical camera (103) configured to generate an optical image (107) of the person based on sensing light visible to human eyes and reflected from the person. The thermal camera is configured to generate the thermal image by sensing intensity of infrared radiation from the face, head and/or neck of the person.
At block 205, the fever scanner (e.g., 101) determines a first temperature from the thermal image (109).
At block 207, the fever scanner (e.g., 101) calculates a second temperature of the person based on the first temperature and the distance.
For example, the first temperature is based on the intensity of the infrared radiation; and the second temperature is calculated based on an empirical formula as a function of the distance. The empirical formula provides a difference between the first temperature and the second temperature; and the difference can be a linear function of the distance.
For example, after the optical camera (103) captures the optical image (107) of the person, a face detection module (111) recognizes a face portion of the person in the optical image and determines the distance based on the face portion.
For example, the face portion of the person in the optical image can be identified using an artificial neural network (ANN). A size of the face portion in the optical image (107) can be used to calculate the distance. Alternatively, the artificial neural network (ANN) can be trained to calculate the distance based on the size and characteristics of the face portion in the optical image (107). Thus, the optical image (107) can be used as an input to the artificial neural network to directly obtain the distance.
Optionally, the fever scanner (101) can have a user interface configured to provide an alert when the second temperature is above a threshold. The threshold can be adjusted to screen persons for a particular type of disease during an outbreak or pandemic. For example, the alert can be in the form of an audio signal (e.g., beep), or a visual indicator (e.g., flashing display of the second temperature).
Optionally, the user interface can be configured in a way as illustrated in
The fever scanner (101) can be enclosed within a housing adapted to position the fever scanner (101) at a fixed location facing a person (e.g., visitor) in the vicinity of the location.
A processor can be configured within the housing of the fever scanner (101) to perform the methods discussed above by executing instructions. The instructions can be stored in a non-transitory machine readable medium such that when the instructions are executed by the processor the fever scanner (101) performs the methods discussed above.
Optionally, the processor can be configured in a data processing system located outside of the housing of the fever scanner (101). A wired or wireless connection between the fever scanner (101) and the data processing system can be used to facilitate the computation discussed above. For example, the processor can be located in a personal computer or a server computer.
In one implementation, the resolution of the optical camera (103) is much greater than the resolution of the thermal camera (105). Thus, the optical image (107) can be used to identify a facial portion of the person and the corresponding portion in the thermal image (109) for an accurate determination of the first temperature. Alternatively, when the thermal camera (105) has sufficient resolution for recognition of the facial portion, the distance between the person and the fever scanner (101) can be measured based on the thermal image (109) instead of the optical image (107).
The present disclosure includes methods and apparatuses which perform the methods described above, including data processing systems which perform these methods, and computer readable media containing instructions which when executed on data processing systems cause the systems to perform these methods.
A typical data processing system can include an inter-connect (e.g., bus and system core logic), which interconnects a microprocessor(s) and memory. The microprocessor is typically coupled to cache memory.
The inter-connect interconnects the microprocessor(s) and the memory together and also interconnects them to input/output (I/O) device(s) via I/O controller(s). I/O devices can include a display device and/or peripheral devices, such as mice, keyboards, modems, network interfaces, printers, scanners, video cameras and other devices known in the art. In one embodiment, when the data processing system is a server system, some of the I/O devices, such as printers, scanners, mice, and/or keyboards, are optional.
The inter-connect can include one or more buses connected to one another through various bridges, controllers and/or adapters. In one embodiment the I/O controllers include a USB (Universal Serial Bus) adapter for controlling USB peripherals, and/or an IEEE-1394 bus adapter for controlling IEEE-1394 peripherals.
The memory can include one or more of: ROM (Read Only Memory), volatile RAM (Random Access Memory), and non-volatile memory, such as hard drive, flash memory, etc.
Volatile RAM is typically implemented as dynamic RAM (DRAM) which requires power continually in order to refresh or maintain the data in the memory. Non-volatile memory is typically a magnetic hard drive, a magneto-optical drive, an optical drive (e.g., a DVD RAM), or another type of memory system which maintains data even after power is removed from the system. The non-volatile memory can also be a random access memory.
The non-volatile memory can be a local device coupled directly to the rest of the components in the data processing system. A non-volatile memory that is remote from the system, such as a network storage device coupled to the data processing system through a network interface such as a modem or Ethernet interface, can also be used.
In the present disclosure, some functions and operations are described as being performed by or caused by software code to simplify description. However, such expressions are also used to specify that the functions result from execution of the code/instructions by a processor, such as a microprocessor.
Alternatively, or in combination, the functions and operations as described here can be implemented using special purpose circuitry, with or without software instructions, such as using Application-Specific Integrated Circuit (ASIC) or Field-Programmable Gate Array (FPGA). Embodiments can be implemented using hardwired circuitry without software instructions, or in combination with software instructions. Thus, the techniques are limited neither to any specific combination of hardware circuitry and software, nor to any particular source for the instructions executed by the data processing system.
While one embodiment can be implemented in fully functioning computers and computer systems, various embodiments are capable of being distributed as a computing product in a variety of forms and are capable of being applied regardless of the particular type of machine or computer-readable media used to actually effect the distribution.
At least some aspects disclosed can be embodied, at least in part, in software. That is, the techniques can be carried out in a computer system or other data processing system in response to its processor, such as a microprocessor, executing sequences of instructions contained in a memory, such as ROM, volatile RAM, non-volatile memory, cache or a remote storage device.
Routines executed to implement the embodiments can be implemented as part of an operating system or a specific application, component, program, object, module or sequence of instructions referred to as “computer programs.” The computer programs typically include one or more instructions set at various times in various memory and storage devices in a computer that, when read and executed by one or more processors in the computer, cause the computer to perform operations necessary to execute elements involving the various aspects.
A machine readable medium can be used to store software and data which when executed by a data processing system causes the system to perform various methods. The executable software and data can be stored in various places including for example ROM, volatile RAM, non-volatile memory and/or cache. Portions of this software and/or data can be stored in any one of these storage devices. Further, the data and instructions can be obtained from centralized servers or peer to peer networks. Different portions of the data and instructions can be obtained from different centralized servers and/or peer to peer networks at different times and in different communication sessions or in a same communication session. The data and instructions can be obtained in entirety prior to the execution of the applications. Alternatively, portions of the data and instructions can be obtained dynamically, just in time, when needed for execution. Thus, it is not required that the data and instructions be on a machine readable medium in entirety at a particular instance of time.
Examples of computer-readable media include but are not limited to non-transitory, recordable and non-recordable type media such as volatile and non-volatile memory devices, Read Only Memory (ROM), Random Access Memory (RAM), flash memory devices, floppy and other removable disks, magnetic disk storage media, optical storage media (e.g., Compact Disk Read-Only Memory (CD ROM), Digital Versatile Disks (DVDs), etc.), among others. The computer-readable media can store the instructions.
The instructions can also be embodied in digital and analog communication links for electrical, optical, acoustical or other forms of propagated signals, such as carrier waves, infrared signals, digital signals, etc. However, propagated signals, such as carrier waves, infrared signals, digital signals, etc. are not tangible machine readable media and are not configured to store instructions.
In general, a machine readable medium includes any mechanism that provides (i.e., stores and/or transmits) information in a form accessible by a machine (e.g., a computer, network device, personal digital assistant, manufacturing tool, any device with a set of one or more processors, etc.).
In various embodiments, hardwired circuitry can be used in combination with software instructions to implement the techniques. Thus, the techniques are neither limited to any specific combination of hardware circuitry and software nor to any particular source for the instructions executed by the data processing system.
The above description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding. However, in certain instances, well known or conventional details are not described in order to avoid obscuring the description. References to one or an embodiment in the present disclosure are not necessarily references to the same embodiment; and, such references mean at least one.
In the foregoing specification, the disclosure has been described with reference to specific exemplary embodiments thereof. It will be evident that various modifications can be made thereto without departing from the broader spirit and scope as set forth in the following claims. The specification and drawings are, accordingly, to be regarded in an illustrative sense rather than a restrictive sense.
The present application claims the benefit of the filing dates of Prov. U.S. Pat. App. Ser. No. 63/005,085, filed Apr. 3, 2020, and Prov. U.S. Pat. App. Ser. No. 63/006,005, filed Apr. 6, 2020, both entitled “Fever Detection”, the entire disclosures of which applications are hereby incorporated herein by reference. The present application relates to U.S. patent application Ser. No. 16/919,722, filed Jul. 2, 2020, published as U.S. Pat. App. Pub. No. 2021/0014396 on Jan. 14, 2021, and entitled “Hybrid Cameras,” the entire disclosure of which application is hereby incorporated herein by reference.
Number | Date | Country
---|---|---
63005085 | Apr 2020 | US
63006005 | Apr 2020 | US