Many different thermometers and temperature sensors exist for invasively determining the internal temperatures of persons. Detecting that a person has an elevated internal temperature may help determine whether the person has an infectious disease. Some temperature sensors, such as oral thermometers, must physically contact a person in order to determine the person's internal temperature. However, infectious diseases may be spread through physical contact with, or close proximity to, a person who has an infectious disease. Other temperature sensors do not contact individual persons and may be used to determine the temperatures of more than one person at a time. However, such non-contact temperature sensors are often less accurate than contact temperature sensors at determining the internal temperature of a person.
Systems and methods for non-invasively identifying individuals within a temperature screening area that have internal core temperatures outside of a range of approved temperatures are described herein. During a temperature screening process, individuals may enter the temperature screening area with different subject conditions and from different environments with different ambient conditions. The different environments with different ambient conditions may correspond to environments with different temperatures, humidity, and air flow (e.g., wind or air conditioning). In at least one example, the temperature screening area may comprise a hotel lobby and the different environments with different ambient conditions may include a first environment with a first temperature (e.g., a restaurant adjacent to the hotel lobby) and a second environment with a second temperature less than the first temperature (e.g., a cold-weather outdoor environment adjacent to the hotel lobby). The different subject conditions may correspond to individual-specific conditions such as the presence of facial obstructions (e.g., a face mask, scarf, or long hair covering a portion of an individual's face), the presence of sweat (e.g., sweat on an individual's face due to exercise prior to entering the temperature screening area), and an individual's circumstances with respect to their circadian rhythm or sleep-wake cycle (e.g., how long the individual has been awake). During a temperature screening process, the temperatures of skin surface regions of individuals (e.g., exposed facial regions) may be tracked over time using one or more thermal imaging devices (e.g., a thermal imaging camera) as the individuals are moving within the temperature screening area. 
The different temperatures of a particular skin surface region (e.g., a cheek region of an individual's face) over time may correspond with a temperature-time sequence (i.e., a temperature-vs-time sequence) for the particular skin surface region. Different temperature-time sequences for different skin surface regions of an individual may be generated as the individual moves around the temperature screening area.
One or more temperature-time sequences for one or more skin surface regions of an individual within the temperature screening area may be used to estimate an internal core temperature for the individual. In at least one example, a first temperature-time sequence for a first facial region (e.g., a forehead region of the individual's face) may be used to extrapolate an asymptotic surface temperature for the first facial region, and an internal core temperature for the individual may be determined by applying a location-dependent temperature offset to the asymptotic surface temperature for the first facial region. In another example, a first temperature-time sequence for a first facial region (e.g., a forehead region of the individual's face) may be used to determine an asymptotic surface temperature for the first facial region and a second temperature-time sequence for a second facial region (e.g., a cheek region of the individual's face) may be used to determine an asymptotic surface temperature for the second facial region. A first estimated internal core temperature for the individual may be determined via application of a first temperature offset to the asymptotic surface temperature for the first facial region and a second estimated internal core temperature for the individual may be determined via application of a second temperature offset to the asymptotic surface temperature for the second facial region. The first temperature offset and the second temperature offset may be acquired from a lookup table or database that stores mappings of temperature offsets for various exposed surface regions of an individual. In at least one embodiment, an internal core temperature for the individual may be calculated by averaging the first estimated internal core temperature and the second estimated internal core temperature. 
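By way of a non-limiting illustration, the offset-and-average core temperature estimate described above may be sketched as follows; the region names and offset values below are illustrative assumptions, not values prescribed by this description:

```python
# Hypothetical example offsets mapping skin surface regions to
# location-dependent temperature offsets (as from a lookup table).
REGION_TEMPERATURE_OFFSETS = {
    "forehead": 2.1,  # degrees F added to the asymptotic surface temperature
    "cheek": 3.4,
}

def estimate_core_temperature(asymptotic_surface_temps):
    """Apply a location-dependent offset to each region's asymptotic
    surface temperature and average the per-region core estimates."""
    estimates = [
        temp + REGION_TEMPERATURE_OFFSETS[region]
        for region, temp in asymptotic_surface_temps.items()
    ]
    return sum(estimates) / len(estimates)
```

For example, forehead and cheek asymptotic surface temperatures of 96.0° F. and 94.5° F. with the offsets above would yield per-region estimates of 98.1° F. and 97.9° F., averaging to an internal core temperature of 98.0° F.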
In some cases, in response to detecting that the internal core temperature for the individual is outside of an approved range of temperature values (e.g., has exceeded a maximum internal core temperature), a higher-accuracy temperature screening process may be performed for the individual or an indication that the internal core temperature for the individual has exceeded an approved range of temperature values may be displayed.
One issue that prevents the large-scale deployment of temperature screening processes in environments with dynamic ambient conditions is that accurate non-invasive temperature screenings may require a prohibitive amount of time. For example, performing an accurate temperature screening for individuals entering and moving freely within a hotel lobby may require the individuals to remain within the hotel lobby for at least 15 minutes to equilibrate to the temperature of the hotel lobby. According to some embodiments, the technical benefits of the systems and methods disclosed herein (that is, for non-invasively identifying individuals within a temperature screening area that have elevated internal core temperatures greater than a maximum approved temperature) may include improved temperature screening throughput for a given temperature screening accuracy, improved temperature screening accuracy with a minimal impact on temperature screening throughput, reduced energy consumption when performing temperature screening processes, and/or the ability to manufacture high-accuracy temperature screening systems using lower-cost hardware components (e.g., using a combination of thermal imaging cameras of different price ranges).
Technology described herein improves an automated temperature screening of individuals within a dynamic environment. As individuals enter and move freely within a temperature screening area, individuals with temperature anomalies, such as individuals with estimated internal core temperatures greater than a threshold temperature, may be identified using a temperature screening approach that provides both high throughput and high accuracy. During a high-throughput screening process, the temperatures of exposed skin surface regions of individuals (e.g., exposed forehead and cheek regions) may be tracked over time while an individual is within the temperature screening area. Because an individual may turn away from an image capturing device over time, or an obstruction may appear that prevents the image capturing device from capturing an exposed skin surface region of the individual, the high-throughput screening process may continuously monitor and track the exposed skin surface region and then determine and store a temperature for the exposed skin surface region when a sufficient number of pixels for the exposed skin surface region are captured. The image capturing device may include sensors or cameras for capturing thermal images and color images of the exposed skin surface region.
The determined temperatures of one or more exposed skin surface regions of an individual may be used to calculate an internal core temperature for the individual, which may then be compared with an approved range of temperature values in order to determine whether the individual has an internal core temperature that is outside of the approved range of temperature values. In at least one embodiment, a first exposed skin surface region may comprise a forehead region of the individual and a second exposed skin surface region may comprise a cheek region of the individual. Estimates of the internal core temperature for the individual may be computed by applying a first temperature offset to a first temperature of the first exposed skin surface region and applying a second temperature offset to a second temperature of the second exposed skin surface region. The first temperature offset and the second temperature offset may be acquired from a lookup table, for example, based on the locations of the exposed skin surface regions on the individual. In some cases, an average internal core temperature may be computed from the two internal core temperature estimates. In response to detecting that an individual has an internal core temperature that is outside of the approved range of temperature values, a non-invasive higher-accuracy temperature screening process may be performed for the individual.
In some cases, the higher-accuracy temperature screening process may cause an orientation adjustment and/or an optical adjustment (e.g., optically zooming in) to a thermal imaging device such that a field of view of the thermal imaging device captures a targeted exposed skin surface region of the individual, such as an exposed forehead region of the individual. The higher-accuracy temperature screening process may also display information to cause the individual to be moved into a particular screening area or within a particular distance of the thermal imaging device, display information to cause the individual to change an orientation of their body or to look towards the thermal imaging device, and/or display information to cause the individual to remove obstructions to capturing temperature information for the targeted skin surface region (e.g., displaying instructions to remove glasses, face coverings, and ear muffs). The optics of the thermal imaging device may be adjusted such that an image of a facial region of the individual may be captured with at least a minimum number of pixels for the region. As an example, the thermal imaging device may capture an image of the inner canthus region (or an eye tear duct region) of the individual with at least 4×4 pixels covering the inner canthus region of the individual. In at least one embodiment, the minimum number of pixels for the region may be set based on an average walking speed for the individual and/or a distance between the individual and the thermal imaging device.
Some technical benefits of selectively applying the higher-accuracy temperature screening process to individuals that are identified as having an internal core temperature that is outside of an approved range of temperature values include increased temperature screening throughput for a given temperature screening accuracy and increased temperature screening accuracy for a given temperature screening throughput. Temperature screening throughput may refer to the number of individuals that pass through a temperature screening area within a specified period of time (e.g., a system may perform temperature screening and allow up to thirty individuals to pass through the temperature screening area per minute). Temperature screening accuracy may refer to the proximity of a determined surface temperature to the actual surface temperature. In at least one example, the temperature screening accuracy may be set such that the determined surface temperature using a temperature screening process is within one-half of a degree of the actual surface temperature. In another example, the temperature screening accuracy may be set such that the estimated internal temperature of an individual is within one degree of the actual internal temperature of the individual.
During the high-throughput screening process, a temperature window may be specified corresponding with an approved range of temperature values. The temperature window may correspond with a temperature range that is greater than a low threshold temperature and less than a high threshold temperature. For example, the temperature window may comprise a range of temperatures between 94 degrees Fahrenheit and 101 degrees Fahrenheit. In response to detecting that a surface temperature of an exposed skin surface region (or regions) of an individual is either less than the low threshold temperature (e.g., is less than 94° F.) or greater than the high threshold temperature (e.g., is greater than 101° F.), a high-accuracy temperature screening process may additionally be performed to determine a second surface temperature for the same region (or a smaller portion of the region) of the individual. The high-accuracy temperature screening process may determine the second surface temperature using a high-resolution thermal imaging camera that captures at least a threshold number of pixels for the region of the individual. A thermal image with an increase in the number of pixels for the region of the individual (e.g., an inner canthus of the individual) may determine a surface temperature with improved accuracy. The region of the individual may comprise a facial region (e.g., a portion of the individual's face), a forehead region of the individual, a cheek region of the individual, an ear region of the individual, a neck region of the individual, an inner eye region of the individual, an orbital region of the individual (e.g., a portion of the individual's face around the individual's eyes or between the individual's eyes), or a temporal region of the individual (e.g., a region between an individual's eye and ear), for example.
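The temperature-window check that triggers the high-accuracy process may be sketched as follows; this is a minimal illustration, with the 94° F. and 101° F. bounds taken from the example above:

```python
def needs_high_accuracy_screening(surface_temp_f, low_f=94.0, high_f=101.0):
    """Return True when a measured surface temperature falls outside the
    approved temperature window, so that a high-accuracy temperature
    screening process may additionally be performed for the individual."""
    return surface_temp_f < low_f or surface_temp_f > high_f
```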
During the high-throughput screening process, a first image capturing device including a thermal imaging camera or image sensor may determine a surface temperature for a targeted region of an individual. In some embodiments, during the high-accuracy temperature screening process, a second image capturing device including a higher-resolution thermal imaging camera or image sensor may determine a second surface temperature for the targeted region or for a sub-region of the targeted region. The sub-region of the targeted region may comprise a portion of the targeted region. In at least one example, during the high-throughput screening process, a surface temperature for an individual's orbital region may be determined; subsequently, during the high-accuracy temperature screening process, a second surface temperature may be determined for a sub-region or portion of the individual's orbital region, such as an inner canthus, eye tear duct, or a facial region between the individual's eye and nasal bridge. In some cases, the thermal imaging information acquired during the high-throughput screening process may be insufficient to determine a surface temperature for a targeted sub-region of the region. For example, the thermal images captured during the high-throughput screening process may cover the targeted sub-region with only 2×2 pixels, while the thermal images captured during the high-accuracy screening process may cover the targeted sub-region with at least 4×4 pixels.
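A minimal sketch of the pixel-coverage test distinguishing the two processes, assuming a square minimum coverage such as the 4×4-pixel example above:

```python
def covers_subregion(pixel_rows, pixel_cols, min_side=4):
    """Check whether a thermal image covers a targeted sub-region (e.g.,
    an inner canthus) with at least min_side x min_side pixels. A 2x2
    capture from the high-throughput process fails this test, while a
    4x4 capture from the high-accuracy process passes it."""
    return pixel_rows >= min_side and pixel_cols >= min_side
```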
In some embodiments, during the high-accuracy temperature screening process, an individual may be positioned within a field of view of a thermal imaging device in response to detecting that the surface temperature of an exposed skin surface region exceeds a temperature threshold. The individual may be positioned within the field of view of the thermal imaging device by either moving the individual into the field of view of the thermal imaging device or by adjusting an orientation of the thermal imaging device. In at least one example, out of a set of thermal imaging cameras, the thermal imaging camera that is reoriented may comprise the thermal imaging camera within the set of thermal imaging cameras that is closest to the individual or that has the best view or least obstructed view of the individual's exposed skin surface region (e.g., the reoriented camera may comprise the thermal imaging camera that has the ability to capture the highest number of pixels covering the individual's face or exposed skin surface region). A technical benefit of configuring or reorienting the thermal imaging camera that is closest to the individual or the thermal imaging camera that has the least obstructed view of the individual's face or targeted skin surface region is that temperature information with higher resolution and higher reliability may be obtained.
In some embodiments, during the high-throughput temperature screening process, as the temperatures of exposed skin surface regions of individuals are tracked over time, corresponding temperature measurement confidence values may be determined based on a distance between an individual and a thermal imaging device and/or a walking speed of the individual when temperature information for a skin surface region was captured. Temperature information for various regions of an individual and corresponding temperature measurement confidence values may be updated over time as the individual moves around the temperature screening area. The resolution and reliability of temperature information may be improved when the individual is closer to a thermal imaging device and when the individual is moving at a slower rate within the temperature screening area. A technical benefit of storing temperature measurement confidence values along with the temperature information for tracked regions of an individual is that the accuracy and reliability of the temperature screening may be increased. In some cases, the temperature threshold for determining whether an individual has an elevated internal core temperature may be adjusted based on the temperature measurement confidence values. For example, if the temperature measurement confidence value for a first surface temperature of an exposed skin surface region of an individual is less than a threshold confidence value (e.g., is less than 0.7), then the maximum approved temperature within the temperature window may be reduced by one degree. The high-throughput temperature screening process may perform region detection and tracking for individuals within the temperature screening area and determine a temperature measurement confidence factor for each region based on the distance to the individual, the direction they are looking, the walking speed of the individual, and the presence of facial obstructions (e.g., glasses).
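One possible sketch of a temperature measurement confidence value and the threshold adjustment described above; the linear distance and speed weighting is an assumption, as the description does not prescribe a particular formula:

```python
def measurement_confidence(distance_m, walking_speed_mps,
                           max_distance_m=10.0, max_speed_mps=2.0):
    """Hypothetical confidence score in [0, 1]: readings are more
    reliable when the individual is closer to the thermal imaging
    device and is moving at a slower rate."""
    distance_factor = max(0.0, 1.0 - distance_m / max_distance_m)
    speed_factor = max(0.0, 1.0 - walking_speed_mps / max_speed_mps)
    return distance_factor * speed_factor

def adjusted_max_temperature(base_max_f, confidence, min_confidence=0.7):
    """Reduce the maximum approved temperature by one degree when the
    measurement confidence value is below the threshold confidence."""
    return base_max_f - 1.0 if confidence < min_confidence else base_max_f
```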
In some embodiments, the amount of time to perform an accurate temperature screening of individuals within a dynamic environment may be reduced by tracking one or more skin surface regions of the individuals and extrapolating asymptotic surface temperatures for the one or more skin surface regions of the individuals before the one or more skin surface regions actually reach their asymptotic surface temperatures. The dynamic environment may comprise a temperature screening area (e.g., a hotel lobby or an entrance to a grocery store). In some cases, the temperatures of the one or more skin surface regions of the individuals (e.g., orbital regions and temporal regions of a person's face) within the dynamic environment may be tracked over time using one or more thermal imaging devices (e.g., a thermal imaging camera) as the individuals are moving within the environment. Temperatures and corresponding time stamps for when the temperatures were captured may be used to generate temperature-time sequences (or temperature-vs-time sequences) for the one or more skin surface regions of the individuals. As an individual moves around the environment, different exposed skin surfaces may be captured by the one or more thermal imaging devices. An asymptotic surface temperature for a region of the one or more skin surface regions may be determined when temperature-time data points for a corresponding temperature-time sequence are deemed sufficient to determine the asymptotic surface temperature for the region.
In at least one embodiment, the temperature-time sequence may be sufficient to determine the asymptotic surface temperature for the region if a fitting error between the temperature-time sequence and a temperature fitting function for the region is less than a threshold error value. In other embodiments, the temperature-time sequence may be sufficient to determine the asymptotic surface temperature for the region if the temperature-time sequence has at least a threshold number of data points (e.g., at least ten temperature-time data points within a two-minute time interval) and a rate of change for a subset of the data points in the temperature-time sequence is below a threshold rate of change (e.g., the rate of change for the at least ten temperature-time data points is less than one degree within the two-minute time interval).
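The second sufficiency test above (at least ten data points within a two-minute interval whose total temperature change is under one degree) may be sketched as follows, as a non-limiting illustration:

```python
def sequence_is_sufficient(samples, window_s=120.0, min_points=10,
                           max_change_f=1.0):
    """Decide whether a temperature-time sequence is sufficient to
    determine an asymptotic surface temperature: at least min_points
    samples fall within the most recent time window, and the temperature
    changes by less than max_change_f across those samples.

    samples: list of (time_s, temp_f) pairs ordered by time."""
    if not samples:
        return False
    latest_time = samples[-1][0]
    recent = [temp for time, temp in samples if latest_time - time <= window_s]
    if len(recent) < min_points:
        return False
    return max(recent) - min(recent) < max_change_f
```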
In some embodiments, during a temperature screening process, individuals may enter a temperature screening area with different subject conditions and from different ambient conditions and move freely around the temperature screening area. Over time, for each individual within the temperature screening area, the temperature screening process may capture or acquire temperature-time information (e.g., from one or more thermal cameras) and generate and store temperature-time sequences corresponding with one or more skin surface regions of an individual. In response to detecting that at least a threshold number of asymptotic surface temperatures (e.g., at least two surface temperatures) corresponding with the one or more skin surface regions of the individual may be generated or extrapolated with at most a threshold amount of temperature error (e.g., a temperature error of less than half a degree), an internal core temperature for the individual may be estimated and transferred to a computing device or an electronic display. The temperature offset applied to a particular asymptotic surface temperature for a particular skin surface region of the individual may be determined based on a location of the particular skin surface region on the individual or from a lookup table or database that stores mappings of temperature offsets for various exposed surface regions of an individual. In some cases, the temperature offset may depend on a height, weight, mass, and/or surface area of an individual.
In at least one embodiment, an asymptotic surface temperature for a region of an individual may be extrapolated using a temperature profile fitting function (or temperature fitting function) for the region. In at least one example, a cheek region of an individual with a first surface area and a first height may be mapped to a first temperature fitting function that specifies a temperature over time trajectory for the cheek region. In another example, a forehead region of an individual with a first surface area and a first height may be mapped to a second temperature fitting function different from the first temperature fitting function that specifies a temperature over time trajectory for the forehead region. Different skin surface regions of an individual may equilibrate with an ambient temperature at different rates due to variations in skin tissue density and/or distance from an individual's nose or mouth. The asymptotic surface temperature for a region may be extrapolated using a temperature fitting function when at least a threshold number of temperature-time data points for a temperature-time sequence have a curve fitting error that is less than a threshold error value. In at least one example, the determination of whether a goodness-of-fit between the threshold number of temperature-time data points and the temperature fitting function for the region is sufficient to extrapolate an asymptotic surface temperature for the region may depend on a goodness-of-fit measure, such as having an R-squared value greater than 0.9. Some approaches for determining whether a set of temperature-time data points for a temperature-time sequence is sufficient to extrapolate an asymptotic surface temperature for a skin surface region may take into account the rate of change of the temperature-vs-time sequence and/or how well the temperature-vs-time sequence fits a temperature fitting function with at least a threshold number of data points over a given time period.
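As one non-limiting illustration, if the temperature fitting function is modeled as an exponential approach to equilibrium, T(t) = T_inf + (T_0 − T_inf)·exp(−t/τ) (an assumption consistent with Newton's law of cooling, not a model prescribed by this description), the asymptote T_inf may be extrapolated in closed form from three equally time-spaced samples:

```python
def extrapolate_asymptote(y0, y1, y2):
    """Extrapolate the asymptotic surface temperature T_inf from three
    equally time-spaced temperature samples of an exponential approach
    T(t) = T_inf + (T_0 - T_inf) * exp(-t / tau), using the Aitken
    (Shanks) formula. An already-flat sequence returns the last sample."""
    denominator = y0 + y2 - 2.0 * y1
    if abs(denominator) < 1e-9:
        return y2  # sequence has effectively reached its asymptote
    return (y0 * y2 - y1 * y1) / denominator
```

For samples of 80.0, 88.0, and 92.0° F. (each closing half of the remaining gap), the formula recovers an asymptote of 96.0° F. before the skin surface has actually equilibrated.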
A technical benefit of identifying a temperature fitting function for a targeted region of an individual based on a location of the targeted region and/or a distance that the targeted region is from the individual's nose or mouth and extrapolating an asymptotic surface temperature for the targeted region of the individual using the temperature fitting function is that accurate non-invasive temperature screening may be performed with a reduced equilibration time. A technical benefit of identifying a temperature fitting function for a targeted region based on an estimated surface area of the individual, an estimated height or weight of the individual, and/or an amount of air flow within an environment is that an accuracy of an extrapolated asymptotic surface temperature for the targeted region may be increased.
The amount of heat transfer between an exposed skin surface and the environment may be proportional to the difference between the temperature of the exposed skin surface and the ambient temperature of the environment. As an individual equilibrates with the ambient temperature of the environment and the temperature difference decreases over time, the rate of heat transfer may also decrease, causing a decrease in the rate of change of the temperature of the exposed skin surface. In some cases, a set of temperature-time data points for a temperature-time sequence used for detecting that a curve fitting error for the set of temperature-time data points is less than a threshold error (e.g., has an R-squared value greater than 0.8) may be selected as the most recent set of temperature-time data points for the temperature-time sequence (e.g., the eight most recent temperature-time data points within the temperature-time sequence). In other cases, the set of temperature-time data points for a temperature-time sequence that is used for detecting that a curve fitting with a particular temperature fitting function is sufficient may comprise a set of five temperature-time data points that are spaced apart in time by at least 30 seconds.
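The second selection strategy above (a fixed number of samples spaced apart by a minimum time interval) may be sketched as follows, scanning backward from the most recent sample:

```python
def select_fit_points(samples, count=5, min_spacing_s=30.0):
    """Select up to `count` samples, scanning from newest to oldest,
    such that consecutive selections are spaced apart in time by at
    least min_spacing_s; the selection is returned oldest-first.

    samples: list of (time_s, temp_f) pairs ordered by time."""
    selected = []
    last_time = None
    for time_s, temp_f in reversed(samples):
        if last_time is None or last_time - time_s >= min_spacing_s:
            selected.append((time_s, temp_f))
            last_time = time_s
        if len(selected) == count:
            break
    return list(reversed(selected))
```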
Upon detection that an individual has been within a temperature screening area for more than a threshold period of time (e.g., for more than 3 minutes) and that a temperature-time sequence for the individual has not been deemed sufficient to extrapolate an asymptotic surface temperature for an exposed skin surface region of the individual, an indication directing the individual to a particular area (e.g., a holding area with increased air flow) may be displayed. In some cases, an individual may be directed into a holding area to temporarily equilibrate via an electronic message to a portable computing device, such as a mobile phone, used by the individual.
High-accuracy screening area 104 includes a portion of ETRS 116. In some examples, high-accuracy screening area 104 can include a separate ETRS used only in high-accuracy screening area 104. In some examples, person 120 can be stationary and can be located in line with ETRS 116 such that person 120 is approximately the same distance from imaging device 110 as ETRS 116 is from imaging device 110.
In some operations, imaging device 110 detects heat patterns in a scene by receiving energy emitted in the infrared-wavelength spectrum from the scene and processing the IR energy to generate an IR (e.g., thermal) image. In some operations, imaging device 110 can generate a visible light image of the same scene by receiving energy in the visible light-wavelength spectrum and processing the visible light energy to generate a visible light image.
In some examples, imaging device 110 collects or captures the IR energy and visible light energy substantially simultaneously so that the visible light image and the IR image generated by the imaging device 110 are of the same scene at substantially the same time. In such examples, the IR image generated by imaging device 110 can be indicative of localized temperatures within the scene at a particular period of time while the visible light image generated by the imaging device 110 can be indicative of the same scene at the same period of time. In other examples, imaging device 110 can capture IR energy and visible light energy from a scene at different periods of time.
Imaging device 110 is located such that its field of view, as defined by lines 112 and 114, includes both high-throughput screening area 102 and high-accuracy screening area 104.
Imaging device 110 is located at a height at which it can view and identify the heads of the persons 118 and is preferably at a similar height to the heads of the persons 118. This can be advantageous as less distortion can be present in images generated by imaging device 110. In some examples, imaging device 110 can have an adjustable height such that it can adjust to the heights of the persons 118 and keep their heads in an optimal view. However, in some examples, imaging device 110 can be located at a height substantially above persons 118 and can thereby allow imaging device 110 to avoid interference or obstructions (e.g., from other persons' heads). In some examples, imaging device 110 is located at an angle relative to the high-throughput screening area 102 and high-accuracy screening area 104. The angle can be a horizontal or vertical angle with respect to a ground surface. For example, imaging device 110 can be located at a height substantially higher than the heads of persons 118 and can have a downward vertical angle such that it can view the heads of persons 118. Imaging device 110 can be attached to supports such as a tripod or to a wall or to other objects so that it can be located to have the high-throughput screening area 102 and high-accuracy screening area 104 within its field of view. Other supporting mechanisms are contemplated.
Processor 128 can be directly connected to imaging device 110 using a wire, but in some examples, processor 128 can be connected wirelessly to imaging device 110. In some such examples, processor 128 is located remotely from imaging device 110, and in some examples, processor 128 can be a processor in a remote server. In some such examples, imaging device 110 can be configured to send image data through the internet or other remote protocols to the remote processor. A remote processor can be advantageous as it does not need to occupy the same space as imaging device 110 and can be more powerful than a local processor. In some examples, more than one processor can be in communication with imaging device 110 and perform the same or different functions as the other processors.
In operation, processor 128 can receive imagery data, which can include IR light image data and visible light image data of a first target scene (e.g., the high-throughput screening area 102) from imaging device 110. Additionally, processor 128 can receive imagery data, which can include IR light image data and visible light image data of a second target scene (e.g., the high-accuracy screening area 104) from imaging device 110. In some examples, processor 128 can receive imagery data from both the first target scene and second target scene substantially simultaneously from imaging device 110. In some such examples, imaging device 110 can generate imagery data, including IR light image data and visible light image data, substantially simultaneously and send the data to processor 128. In some embodiments, the infrared image data of the imagery data can be indicative of temperatures in the target scene. Further, in some embodiments, processor 128 can interpret the received IR light image data and assign temperatures corresponding with the IR light image data. Alternatively, in some examples, imaging device 110 assigns temperatures corresponding with the IR light image data and communicates the temperatures to processor 128. Processor 128 can additionally interpret the received visible light image data and use the data for various functions as is described further herein. The temperatures can correspond to the surface temperatures of objects such as persons 118.
As discussed above, a first target scene, which can be high-throughput screening area 102, can include multiple persons such as persons 118. Image data representative of the first target scene can thus include image data associated with multiple persons. Processor 128 can analyze the image data received from imaging device 110 and correlate specific portions of the image data with specific persons 118. For example, a first and second person can be located in high-throughput screening area 102 with imaging device 110 generating image data of the scene that includes the first and second person. Processor 128 can use algorithms and other techniques to determine that a specific portion of visible light imaging data is associated with the first person and that a different portion of the visible light imaging data is associated with the second person. Processor 128 can then determine temperatures associated with IR light image data and correlate specific portions of the temperatures with the first person and different specific portions of the temperatures with the second person. In some examples, processor 128 can perform the process described above substantially simultaneously for multiple individuals in the high-throughput screening area 102. In some examples, processor 128 identifies persons in a target scene using a portion of the imagery data which can include IR image data and/or visible light image data. For example, processor 128 can identify a person using the visible light image data and can subsequently map a portion of the IR image data with the portion of the visible light image data that is representative of the person. In some such examples, processor 128 can determine the temperature of the person. In some examples, processor 128 can identify persons using IR image data.
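The disclosure does not prescribe a particular implementation for this correlation; one minimal sketch, assuming persons have already been detected in the visible light image data as bounding boxes (the names `person_temperatures`, `boxes`, and `ir_temps` are illustrative assumptions), might map each person's region onto the co-registered IR pixel temperatures:

```python
# Sketch: correlate per-person visible-light bounding boxes with IR pixel
# temperatures to assign each detected person a temperature.
# All names here are hypothetical; the patent does not mandate this approach.

def person_temperatures(boxes, ir_temps):
    """boxes: dict of person_id -> (row0, col0, row1, col1) from visible-light
    detection; ir_temps: 2D list of per-pixel temperatures derived from IR
    light image data. Returns the maximum temperature inside each region."""
    temps = {}
    for pid, (r0, c0, r1, c1) in boxes.items():
        region = [ir_temps[r][c] for r in range(r0, r1) for c in range(c0, c1)]
        temps[pid] = max(region)
    return temps
```

Using the maximum temperature within a region is one of the options discussed below; an average or a location-specific value could be substituted.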
In some examples, processor 128 can receive visible light image data of a first target scene and visible light image data of a second target scene from imaging device 110. Processor 128 can analyze the visible light image data received from imaging device 110. For example, processor 128 can use image analysis to correlate patterns of pixels (e.g., pixel regions) with objects and/or persons in order to identify the objects and/or persons. Additionally or alternatively, in some examples, processor 128 can use facial recognition and/or facial detection techniques to determine what, if any, portion of an image contains facial features associated with persons. Image analysis and facial recognition techniques can consist of various algorithms, equations, and steps and can include using multiple images, portions of images, or other systems which determine if a person is present in the image and/or if facial features are present in the image. For example, processor 128 can analyze visible light image data and determine if portions of the visible light image data correspond to persons 118 or to other objects. Processor 128 can further use tracking techniques to track the location of persons and/or facial features within an image and/or in multiple images over time. For example, processor 128 can receive visible light image data comprising multiple images per second from a target scene, determine that the image data includes a person, determine that the person has facial features, and can further track the person through each image over time.
Furthermore, in some examples, processor 128 can use facial recognition, facial detection, or other techniques to store and compare facial features or other identifying features of persons. In some examples, this can allow processor 128 to determine if a person has been imaged before and if the person has gone from the high-throughput screening area 102 to the high-accuracy screening area 104 or from high-accuracy screening area 104 to high-throughput screening area 102. Additionally, in some examples, persons can be identified at a much later point in time, such as during a subsequent temperature screening. This information can be used to track individual persons' temperatures and their change in temperature over long periods of time (e.g., over days, weeks, or longer). In some examples, information about the identified individual can be stored for other uses such as for calibration or other uses which can increase the accuracy of the temperature scanner system.
Using the techniques described above, or other techniques, processor 128 can determine the temperature of a person, or multiple persons, using imaging device 110. The temperature can be determined continuously, at intervals over time, or at a single point in time. The temperature can be a direct temperature (e.g., surface temperature) determined from the IR light image data; however, in some examples, the temperature can be an indirect temperature. In some such examples, the indirect temperature can be correlated with a direct temperature but can include adjustments such as offsets from the direct temperature. For example, a direct temperature of a person can be determined using the IR light image data as described above. However, the direct temperature may need to be adjusted to be more accurate or to reflect the difference between the surface temperature of the person and an internal temperature of the person. An external temperature reference source (ETRS) can be used for adjusting the direct temperature as described further herein. Other manipulations of the temperature can be done and are contemplated.
Although a person can have multiple temperatures associated with them, in some examples, processor 128 determines the highest temperature associated with the person and can compare it to a temperature threshold. In other examples, processor 128 determines the lowest temperature associated with the person and can compare it to a different temperature threshold. Further, in some examples, processor 128 determines a temperature of a specific area of a person and, in some examples, processor 128 determines an average temperature of a person for comparing to a threshold temperature. Other temperature locations and types associated with a person are contemplated. Preferably, temperatures closely correlated with an internal temperature of the person are used for comparing to a temperature threshold.
A temperature threshold can be any temperature or range of temperatures. In some examples, multiple temperature thresholds are used. In some examples, the temperature threshold can be a temperature for a human (e.g., 98.6° F.) or a range of normal temperatures for a human (e.g., 97.7° F. to 99.5° F.). In some examples, the temperature threshold can be an abnormal temperature for a human (e.g., 100+° F.). In some examples, the temperature threshold can be fixed while in other examples, the temperature threshold can be adjusted as is described further herein.
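A minimal sketch of the threshold comparison described above might look as follows; the particular range (97.7° F. to 99.5° F.) and abnormal threshold (100° F.) follow the example values in the text and are illustrative, not mandated:

```python
# Sketch: compare a determined temperature against example thresholds.
# The constants below mirror the example values given in the disclosure.

NORMAL_RANGE_F = (97.7, 99.5)   # example range of normal human temperatures
ABNORMAL_F = 100.0              # example abnormal temperature threshold

def within_normal_range(temp_f):
    """True if the determined temperature falls inside the normal range."""
    lo, hi = NORMAL_RANGE_F
    return lo <= temp_f <= hi

def is_abnormal(temp_f):
    """True if the determined temperature meets the abnormal threshold."""
    return temp_f >= ABNORMAL_F
```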
Continuing with the example of
Further, in the operation of example
In some examples, the first imaging device 210 can have a first depth of field and the second imaging device 216 can have a second depth of field. In some such examples, the first imaging device 210 with the first depth of field can receive infrared image data at a first range of distances from the first imaging device 210. Further, in some such examples, the second imaging device 216 with the second depth of field can receive infrared image data at a second range of distances from the second imaging device 216. In some examples, the second depth of field is smaller than the first depth of field, and in some examples, the second range of distances is smaller than the first range of distances. A larger depth of field with a larger range of distances can allow the first imaging device to receive more infrared image data from a scene which includes multiple persons. However, a smaller depth of field with a smaller range of distances can allow the second imaging device to receive more focused infrared image data which can also be more accurate. In some examples, the depth of field of the first imaging device and the second imaging device can be adjusted. In some examples, the second imaging device has a higher resolution than the first imaging device.
Additionally, in some examples, second imaging device 216 can have a shorter minimum focus distance and a longer lens effective focal length than first imaging device 210. A shorter minimum focus distance and longer lens effective focal length can allow objects such as a person to be in focus at a distance closer to the imaging device. It can be advantageous to have objects such as a person be closer to second imaging device 216 as image data, such as IR light image data which corresponds to temperatures of the objects, can be more accurate than if objects are further away from the second imaging device 216. In
Continuing with the operation of
The configuration of the temperature scanner system of
As shown in
In some examples, within the first field of view 302, the processor, in combination with the imaging device, can further identify one or more specific portions of persons 316. In some examples, a specific portion is the face 318 of the persons. However, in some examples, the processor can identify portions of the face such as an eye or portion of the eye. In some examples, the processor can determine a temperature associated with a specific portion of the person and in some examples, the processor can further track the temperature of the specific portion of the person over time (e.g., while the person is moving). In some examples, the processor can have a higher accuracy when determining the temperature of a specific portion of the face (e.g., eye tear duct/inner canthus) than an entire face of the person.
In some examples, within the second field of view 304, the processor, in combination with the imaging device, can further identify one or more specific portions of the person 320. In some examples, a specific portion is the face of the person 320. However, in some examples, the processor can identify portions of the face such as an eye or forehead. In some examples, the processor can determine a temperature associated with a specific portion of the person and in some examples, the processor can further track the temperature of the specific portion of the person over time (e.g., while the person is moving). In some examples, the processor can have higher accuracy when determining the temperature of a specific portion of the face (e.g., eye tear duct/inner canthus) than an entire face of the person.
Continuing with the example of
As depicted in
While persons' heads and/or faces have been described as being used, other parts of persons may be used to identify, track, and/or determine the accuracy of a temperature reading of the persons. For example, in some embodiments, the system uses a person's head to identify the person and uses the person's head to track the person throughout a target scene.
Continuing with
In some examples, multiple ETRSs are used to further increase the accuracy. In some examples, a single ETRS can be used between multiple scenes (e.g., the ETRS of
In some examples, imagery, temperatures, and other data (e.g., a confidence factor) can be displayed on a display. Imagery data can include visible light imagery and/or infrared light imagery. In some such examples, the display can display information about each individual person proximate to the imagery of said person. In some examples, the display is monitored by an operator such that if a person has a temperature above the threshold, the operator can notify the person, and if a person has a temperature below the threshold, the operator can allow access through access gates. However, in some examples, the display is not actively monitored by an operator and displays information to any person such as the persons being screened. More than one display can be used, however in some examples, one display is used which incorporates one or more imaging devices. For example,
As previously discussed herein, one or more imaging devices in communication with a processor can determine the temperature of one or more persons in a target scene. However, in some examples, a confidence factor is included with the determining of the temperature of one or more persons. A confidence factor, as used in this disclosure, is a measurement of certainty/uncertainty. For example, the confidence factor can be a determination of the likelihood that something, such as a measurement, value, or series of values, is accurate. In at least one example, a large uncertainty range can indicate a lower confidence factor while a small uncertainty range can indicate a higher confidence factor. The confidence factor can be an additional step in determining the temperature of one or more persons and can be used to increase the accuracy of any temperature determinations of the persons. In some examples, the confidence factor can be associated with an image or series of images which can include IR light image data and/or visible light image data. In some examples the confidence factor can be associated with a temperature or series of temperatures which can be determined from IR light image data and optionally, visible light image data. In operation, a confidence factor is determined by a processor (e.g., processor 228 of
The confidence factor can include many individual factors which are specific to the determination being done. In some examples, the confidence factor can have individual factors related to the specific properties of an image such as the clarity, blurriness, brightness, darkness, and contrast. In some examples, the confidence factor can be weighted. In some examples, the confidence factor can take into account multiple images over a period of measurement. In some such examples, the confidence factor can include an analysis of the multiple images (e.g., average amount of clarity and/or brightness). In one such example, a first confidence factor can be assigned to a first image (e.g., frame) of a plurality of images (e.g., a video) and a second confidence factor can be assigned to a second image of the plurality of images. Each of the first and the second confidence factors can be weighted based on distance from the person to the imaging device and an overall confidence factor can be aggregated from the first and second confidence factors. An overall confidence factor (e.g., aggregated confidence factor) can be compared and used in any case where an individual confidence factor is used (e.g., compared to a threshold). In some examples, the confidence factor can have individual factors related to the objects within the image such as the number of persons or properties of the persons. For example, a confidence factor can include the likelihood that a person has one or more obstructions on their face which interfere with the temperature determination. Some examples of obstructions include hair, glasses, hats, and masks. These obstructions could affect the accuracy of the temperature determination of the person. In some other examples, a confidence factor can include the distance between a person whose temperature is being determined and the imaging device generating the image data of the person. 
Because the distance can affect the temperature determination of the person, a confidence factor can be assigned to the temperature determination which takes into account that smaller distances from an imaging device generally produce more accurate temperature determinations than larger distances from an imaging device. In some examples, a confidence factor can include the direction a person is facing and a speed at which the person is moving. The confidence factor can include any number of the factors above and other factors not listed.
In some example operations, a relatively low confidence factor can be associated with/assigned to a temperature reading of a person's face due to the temperature reading of their face being below a threshold. The temperature reading being below a threshold could indicate that some obstruction is located on the person's face and the relatively low confidence factor can reflect that it is likely some obstruction is preventing a more accurate temperature reading. In such a case, an initial confidence factor can be 100% and is subsequently lowered by 20% to be 80% due to the person's temperature being below a threshold. In some other example operations, a relatively low confidence factor can be associated with/assigned to a temperature reading of a person's face due to the person not looking toward the imaging device. The relatively low confidence factor can thus reflect the likelihood that the temperature reading of the person's face is accurate due to the positioning of the person's face. In such a case, an initial confidence factor can be 100% and is subsequently lowered by 30% to be 70% due to the person not looking toward the imaging device. Other factors are contemplated, and different factors can have different impacts on the confidence factor. For example, a person not looking toward the imaging device as well as their temperature being below a threshold could lower the confidence factor from 100% to 50% with each factor contributing to the decrease in the confidence factor. In some examples, factors add together to decrease the confidence factor as described above, while in some examples, factors can be combined in a non-linear fashion. One such example of a non-linear combination is that one factor contributes to a decrease of 10% in the confidence factor by itself and another factor contributes to a decrease of 15% by itself. However, when both factors are present, the confidence factor can decrease by 50%. 
Other examples of confidence factors, other combinations of factors, and other contributions of the factors to the confidence factor are contemplated and this disclosure is not limited to the examples above.
In some examples, the confidence factor can change over time. In some such examples, the confidence factor can change over time based on the temperature determination changing over time. A processor can change the confidence factor over time based on information or factors which were not previously used in the confidence factor. For example, the confidence factor associated with a temperature determination of a person can increase continuously as a person gets closer to an imaging device because the imaging device can receive more IR light from the person and/or the imaging device can obtain finer details of the person. It can be advantageous to continually update a confidence factor as it can increase the accuracy of determinations such as a temperature determination.
In some example operations, a confidence factor can change as the imaging device receives images. In one such example operation, a confidence factor can be associated with/assigned to a first image in which the person's face is blurred due to their head motion. Subsequently, the confidence factor can be updated to reflect a second image in which the person's face is clear (e.g., not blurred). In some examples, the updating can be an aggregate of individual confidence factors associated with/assigned to a temperature reading, while in some examples, the updating can be a running average. Other methods of analysis and updating of the confidence factor are contemplated.
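One way to sketch the running-average style of updating mentioned above (the class name and incremental formulation are illustrative assumptions, not part of the disclosure):

```python
# Sketch: fold per-image confidence factors into a running average as new
# images of a person are received over time.

class ConfidenceTracker:
    def __init__(self):
        self.count = 0
        self.average = 0.0

    def update(self, frame_confidence):
        """Incorporate a new per-image confidence factor and return the
        updated running average for the person."""
        self.count += 1
        self.average += (frame_confidence - self.average) / self.count
        return self.average
```

An aggregate of individual confidence factors, as also mentioned above, could instead weight each frame's confidence (e.g., by distance to the imaging device) before combining.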
In some examples, the confidence factor can be compared to a threshold by a processor. If the confidence factor does not exceed a threshold, the processor can perform an additional action, such as providing a signal that the confidence factor did not exceed the threshold. In some examples in which a confidence factor is based on the accuracy of the determined temperature of a person, the confidence factor not exceeding its threshold, the determined temperature exceeding a threshold, or a combination thereof, can be used to indicate the person needs further screening. For example, a person can have a temperature determined in a high-throughput screening area which is within an acceptable range, however, an associated confidence factor can be lower than a threshold, indicating that the determined temperature is not sufficiently accurate. In such an example, the person can be directed to a high-accuracy screening area for further temperature determination.
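The routing decision described above can be sketched as a single predicate; the particular threshold values here are illustrative assumptions:

```python
# Sketch: decide whether a person screened in the high-throughput area
# should be directed to the high-accuracy area for further determination.
# Both default thresholds are hypothetical example values.

def needs_high_accuracy_screening(temp_f, confidence,
                                  temp_threshold_f=99.0,
                                  confidence_threshold=0.9):
    """Direct the person onward if the determined temperature exceeds its
    threshold, or if the confidence factor does not exceed its threshold
    (indicating the reading may not be sufficiently accurate)."""
    return temp_f > temp_threshold_f or confidence <= confidence_threshold
```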
In some examples of the system for screening persons for elevated temperatures, processes for adjusting the determining of temperature readings can be included. In some such examples, a processor, which determines the temperatures of the persons in the target scene, can adjust the determined temperatures based on previous determinations. For example, a processor can determine the temperatures of persons in a first target scene using IR image data of the first target scene. The processor can later adjust and determine the temperatures of persons in the first target scene; however, the temperatures determined later can be higher and/or lower than they would have been if the processor had not adjusted. In at least one example, the temperatures of the persons all exceed the threshold and the processor adjusts such that the temperatures of the same persons no longer exceed the threshold. In some examples, the adjustment can be temporary (e.g., for a determined amount of time) while in some examples, the adjustment can be permanent until subsequent adjustment.
In some examples, the determined temperature readings of the second scene are used to adjust the determining of temperature readings of the first scene. For example, if determined temperatures of a majority of individuals in the second scene do not exceed a threshold, the processor can be adjusted to change its determination of temperatures of individuals in the first scene. In this example "false positives," where an individual is directed to a second scene due to their temperature exceeding a threshold in the first target scene, can be reduced. Further, in some examples, the temperature of individual persons relative to an average temperature of a plurality of persons is used to adjust the system. In some such examples, the processor can determine if a temperature of a person is different than previous temperatures of persons. In some examples, adjustment can be done while the system for screening persons is active (e.g., continuous adjustment). Further, adjustment, as described above, can be applied to the threshold in addition to or in lieu of adjusting the determining of the temperature. In some examples, directing random persons to go from the high-throughput screening area to the high-accuracy screening area can aid in adjusting various portions of the screening system including the threshold. The random directing of persons to the high-accuracy screening area can prevent large variations between the first scene and second scene temperatures and can increase accuracy of both. In some examples, the directing of persons to the high-accuracy screening area is not random and can vary based on a number of factors including the number of persons and the frequency with which persons are required to go through the high-accuracy screening area. In some examples, the threshold can be adjusted in response to outdoor temperatures.
Adjusting the threshold for outdoor temperature can be advantageous as persons can have increased/decreased temperatures for a period of time after entering a screening area from the outside. It can also be advantageous to adjust temperature determinations and/or thresholds because persons can have a range of temperatures which can be considered “normal” (e.g., one person has a higher “normal” temperature than another person). Additionally, it can be advantageous to adjust temperature determinations and/or thresholds because portions of the system, including the imaging device(s) used to generate IR image data, can be subject to unintentional changes (e.g., temperature changes) over time.
In some examples of the system for screening persons for elevated temperatures, data, which can include IR image data, temperature data, and confidence factor data, can be correlated between data obtained from a high-throughput area and data obtained from a high-accuracy area. In some such examples, a processor can correlate and/or analyze the data. The processor can use the data to determine various aspects of the system. For example, if the imaging device of the high-throughput area identifies an individual person with a temperature exceeding a threshold, the imaging device of the high-accuracy area can identify if the same individual is in the high-accuracy area (e.g., using facial recognition). The data can then be linked for later use such as for adjustment of the system. In such an example, the system can determine that a person has been through the high-accuracy area before allowing the person through an access gate. In some examples, the data from the high-throughput area and data from the high-accuracy area are correlated such that the process of assigning a confidence factor in the high-throughput area is updated during operation. Other examples of using correlated data are contemplated including using the correlated data to adjust the temperature determinations of the imaging devices and using the correlated data to adjust the threshold. A person having ordinary skill will appreciate that the disclosure is not limited to the examples above.
In further examples of the system for screening persons for elevated temperatures, data (e.g., IR image data) can be used from both the high-accuracy screening area and the high-throughput screening area for later analysis and updating of algorithms. Algorithms can include algorithms for correlating IR image data with temperatures and algorithms for determining threshold temperatures. For example, machine learning can be used with the data collected by the first imaging device and the second imaging device to adjust later, or in real time, screening parameters of the first imaging device or of the processor. Other methods for analyzing the data and adjusting various parameters of the screening system are contemplated.
In at least one embodiment, the capture device 620 may include one or more image sensors for capturing images. An image sensor may comprise a CCD image sensor or a CMOS image sensor. In some embodiments, capture device 620 may include a thermal image sensor and/or an IR CMOS image sensor. The capture device 620 may include an IR light component 634 for a depth camera, a depth camera 636, an RGB camera 638, and a thermal camera 640. In at least one example, the IR light component 634 may emit an infrared light into a capture area and may then use sensors to detect the backscattered light from the surface of one or more individuals in the capture area using the color and/or IR light sensing components within the capture device 620. In some embodiments, pulsed infrared light may be used such that the time between an outgoing light pulse and a corresponding incoming light pulse may be measured and used to determine a physical distance from the capture device 620 to a particular region on a surface of an individual within the capture area. Capture device 620 may also include optics for producing collimated light and/or for diffusing light (e.g., an optical diffuser for spreading light from an IR light source that generates a narrow beam of light) for illuminating an environment.
As depicted in
As depicted in
The temperature tracking engine 694 includes a skeletal tracking engine 690 and a region tracking engine 688. The skeletal tracking engine 690 may leverage a skeletal model of a human to help recognize body parts (e.g., arms, hands, and faces). The region tracking engine 688 may track one or more regions of an individual using image information acquired from capture devices 620 and 622. The region tracking engine 688 may use thermal image information, color image information, and/or depth information in order to detect a particular facial region of an individual and determine a surface temperature for the particular facial region. In the event that the particular facial region is determined to be obstructed due to people or objects within an environment blocking a view of the particular facial region, due to the individual facing away from a capture device, or due to the presence of a wearable obstruction to the particular facial region (e.g., a face mask), the region tracking engine 688 may delay storing temperature information for the particular facial region until reliable temperature information for the particular facial region is acquired.
In step 702, a first set of images corresponding with a first field of view (e.g., of a first imaging device) is acquired. In some cases, the first field of view may correspond with a field of view of a first imaging device, which may correspond with capture device 620 in FIG. 6, imaging device 110 in
In step 708, a temperature measurement confidence value for the first surface temperature is determined. In at least one example, if it is detected that the person is within a threshold distance (e.g., within thirty feet from the first imaging device), then the temperature measurement confidence value for the first surface temperature may be set to 1.0; otherwise, if it is detected that the person is not within the threshold distance, then the temperature measurement confidence value for the first surface temperature may be reduced to a value less than 1.0 based on the distance between the first imaging device and the person. In another example, if it is detected that the person was moving or walking when the first set of images were captured and a speed of movement is less than a threshold speed (e.g., is less than 3 miles per hour), then the temperature measurement confidence value for the first surface temperature may be set to 1.0; otherwise, if it is detected that the speed of movement for the person at the time that the first set of images were captured is greater than or equal to the threshold speed, then the temperature measurement confidence value for the first surface may be set to a value less than 1.0 (e.g., to 0.7) based on the speed of movement.
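The two example rules of step 708 can be sketched as follows. The 30-foot and 3-mile-per-hour thresholds and the 0.7 reduced value come from the text; the proportional falloff beyond the threshold distance is an assumption, since the text only states the value is reduced based on distance:

```python
# Sketch of step 708's example confidence rules. The falloff function for
# distances beyond the threshold is a hypothetical choice.

def distance_confidence(distance_ft, threshold_ft=30.0):
    """1.0 within the threshold distance; reduced based on distance beyond
    it (assumed here: proportional to excess distance)."""
    if distance_ft <= threshold_ft:
        return 1.0
    return max(threshold_ft / distance_ft, 0.0)

def speed_confidence(speed_mph, threshold_mph=3.0, reduced=0.7):
    """1.0 below the threshold speed; a reduced value (e.g., 0.7) at or
    above it."""
    return 1.0 if speed_mph < threshold_mph else reduced
```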
In step 710, it is detected that the first surface temperature exceeds a first temperature threshold (e.g., is greater than 99 degrees Fahrenheit) based on the first surface temperature and the temperature measurement confidence value. In at least one example, if the temperature measurement confidence value is greater than 0.9 and the first surface temperature exceeds the first temperature threshold (e.g., the first surface temperature exceeds 99 degrees Fahrenheit), then it may be detected that the first surface temperature exceeds the first temperature threshold. If the temperature measurement confidence value is less than 0.7, then the corresponding surface temperature may be deemed to be unreliable and not used for determining whether the first surface temperature has exceeded the first temperature threshold.
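Step 710's example gating logic can be sketched as follows; the behavior for confidence values between 0.7 and 0.9 is not specified in the text and is treated here (an assumption) as not exceeding the threshold:

```python
# Sketch of step 710's example: a temperature counts as exceeding the
# threshold only when the confidence value is high enough, and readings
# with confidence below 0.7 are deemed unreliable and not used at all.

def exceeds_threshold(temp_f, confidence, threshold_f=99.0):
    """Return True/False for the threshold determination, or None if the
    reading is unreliable (confidence below 0.7) and should not be used."""
    if confidence < 0.7:
        return None  # unreliable reading; excluded from the determination
    return confidence > 0.9 and temp_f > threshold_f
```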
In step 712, the person is positioned within a second field of view (e.g., of a second imaging device different from the first imaging device) in response to detecting that the first surface temperature exceeds the first temperature threshold. In some cases, both the first field of view and the second field of view may be captured using the same imaging device. An imaging device may correspond with capture device 622 in
In step 714, a second set of images corresponding with the second field of view of a second imaging device is acquired while the person is within the second field of view of the second imaging device. In step 716, a second surface temperature associated with the facial region of the person is determined using the second set of images. In at least one embodiment, the second surface temperature may correspond with a portion of the facial region less than the entire facial region. For example, if the facial region of the person comprises a forehead region of the person, then the portion of the facial region may comprise a small region between the eyes of the person.
In at least one embodiment, a number of pixels for the facial region of the person captured by the first set of images may comprise a first number of pixels and the number of pixels for the facial region of the person captured by the second set of images may comprise a second number of pixels greater than the first number of pixels. The second surface temperature associated with the facial region of the person may be determined by averaging temperature values for the pixels covering the facial region or by averaging the three highest temperature values for the pixels covering the facial region. In step 718, the second surface temperature associated with the facial region of the person is outputted. The second surface temperature may be transmitted to another computing device, such as a mobile computing device or smart phone. The second surface temperature may be displayed using an electronic display. In some embodiments, an internal core temperature for the person may be estimated using the second surface temperature.
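The two averaging strategies just described (averaging all pixel temperature values covering the facial region, or averaging only the three highest values) might be sketched as below; the function name and parameterization are assumptions for illustration.

```python
def surface_temperature(pixel_temps, use_hottest=3):
    """Estimate a region's surface temperature from per-pixel values.

    With use_hottest=None, average every pixel covering the region;
    otherwise average only the `use_hottest` highest values, which
    emphasizes the warmest (often most core-representative) pixels.
    """
    if use_hottest is None:
        return sum(pixel_temps) / len(pixel_temps)
    hottest = sorted(pixel_temps, reverse=True)[:use_hottest]
    return sum(hottest) / len(hottest)
```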
In step 732, a first surface temperature associated with a region of a person is determined. The region of the person may comprise a facial region or another region of the person's body, such as an exposed arm region or an exposed leg region. Along with the first surface temperature, a temperature measurement confidence value for the first surface temperature may be determined and stored. The temperature measurement confidence value may be determined based on a number of pixels covering the region of the person; in one example, the temperature measurement confidence value may comprise the number of pixels covering the region divided by 16. The temperature measurement confidence value may correspond with a confidence score or a reliability score for the first surface temperature. In at least one example, if a walking speed or a movement speed of the person at the time that the first surface temperature was captured was greater than a threshold speed, then the temperature measurement confidence value may be divided in half or set to 0.5; otherwise, the temperature measurement confidence value may be set to 1.0.
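The pixel-count-based confidence of step 732 can be sketched as follows, assuming (as one reading of the example above) that the pixels-divided-by-16 score is capped at 1.0 and halved when the subject was moving too fast; the function name and cap are assumptions.

```python
def pixel_count_confidence(num_pixels, speed_mph, threshold_speed_mph=3.0,
                           pixels_for_full_confidence=16):
    """Illustrative reliability score for a surface-temperature reading.

    Confidence grows with the number of pixels covering the region
    (capped at 1.0) and is halved when the subject was moving faster
    than the threshold speed at capture time.
    """
    confidence = min(num_pixels / pixels_for_full_confidence, 1.0)
    if speed_mph > threshold_speed_mph:
        confidence /= 2.0
    return confidence
```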
In step 734, it is detected that the first surface temperature exceeds a first temperature threshold based on the first surface temperature and the temperature measurement confidence value. It may be detected that the first surface temperature exceeded the first temperature threshold if the first surface temperature is greater than the first temperature threshold and the temperature measurement confidence value for the first surface temperature is greater than 0.7. In one embodiment, it may be detected that the first surface temperature exceeds the first temperature threshold (e.g., is greater than 99° F.) and that the temperature measurement confidence value exceeds a confidence value threshold (e.g., is greater than 0.7).
In some cases, an imaging device may be adjusted or reoriented in response to detecting that the first surface temperature exceeds a first temperature threshold and that the temperature measurement confidence value for the first surface temperature exceeds a confidence value threshold. In other cases, an imaging device may be adjusted or reoriented in response to detecting that the first surface temperature exceeds a first temperature threshold or that the temperature measurement confidence value for the first surface temperature is below a confidence value threshold.
In some embodiments, it may be determined whether an orientation of an imaging device should be adjusted or reoriented. In at least one embodiment, if the person is within a threshold distance of the imaging device, then the orientation of the imaging device will be adjusted to capture the region of the person. In another embodiment, if a speed of the person is less than a threshold speed, then the orientation of the imaging device will be adjusted to capture the region of the person.
In some embodiments, an imaging device may not be adjusted, reconfigured, or reoriented to position the person within a field of view of the imaging device if the person is redirected or moved to a designated location in front of the imaging device or otherwise positioned within the field of view of the imaging device.
In step 736, it is determined whether the person should be redirected or otherwise moved into a field of view of an imaging device. If it is determined that the person should be redirected into the field of view of an imaging device, then step 738 is performed. In step 738, an indication that the person should be directed or moved into the field of view of the imaging device is transmitted. In step 740, the indication that the person should be directed or moved into the field of view of the imaging device is displayed. In at least one example, the indication that the person should be directed into the field of view of the imaging device may be displayed by displaying textual instructions and/or symbols on a display screen. The display screen may comprise a display of a portable electronic device or a smart phone used by the person.
If it is determined that the imaging device should be adjusted or that the person should not be redirected into the field of view of the imaging device, then step 742 is performed. In step 742, the imaging device is adjusted such that the person is positioned within a field of view of the imaging device. The adjustment may comprise an orientation adjustment and/or an optical adjustment (e.g., adjusting an optical zoom) of the imaging device. In step 744, a second surface temperature associated with the region of the person is determined using a second set of images corresponding with the field of view of the imaging device. In step 746, the second surface temperature associated with the region of the person is outputted. In at least one embodiment, the second surface temperature associated with the region of the person may be transmitted to another computing device or displayed using an electronic display.
In some embodiments, the second surface temperature may be used to estimate or compute an internal core temperature for the person. A temperature offset may be determined from a lookup table based on a location of the region of the person and applied to the second surface temperature to compute the internal core temperature for the person. The second surface temperature associated with the region of the person may be determined by averaging temperature values for all pixels covering the region or by averaging the three highest temperature values for pixels covering the region of the person. The region may comprise an inner canthus region or an eye tear duct region of the person.
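The lookup-table offset described above might be sketched as below. The offset values in the table are hypothetical placeholders, not clinically validated numbers; real offsets would be determined empirically and may also depend on ambient conditions.

```python
# Hypothetical offsets (degrees F) from measured skin surface temperature
# to internal core temperature, keyed by the location of the measured
# region. These values are placeholders for illustration only.
REGION_OFFSETS_F = {
    "inner_canthus": 1.0,   # tear duct region tracks core most closely
    "forehead": 2.5,
    "arm": 4.0,             # exposed limbs deviate further from core
}


def estimate_core_temperature(surface_temp_f, region):
    """Apply a region-specific offset from a lookup table to estimate
    the internal core temperature from a surface temperature."""
    return surface_temp_f + REGION_OFFSETS_F[region]
```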
In some embodiments, it may be detected that a first surface temperature associated with a first region of the person has exceeded a first temperature threshold and then, after positioning the person within a second field of view of a second imaging device, it may be detected that a second surface temperature associated with a second region of the person different from the first region of the person has exceeded a second temperature threshold. The regions of the person for which surface temperatures are determined before and after the second imaging device has been repositioned or reoriented may comprise the same region of the person or different regions of the person.
In some embodiments, upon detecting that a first surface temperature exceeds a first temperature threshold, the orientation of the imaging device and the optics of the imaging device may be adjusted such that the person is positioned within a field of view of the imaging device. The field of view may allow for a higher resolution of a target region of the person and/or for higher confidence scores for temperatures associated with the target region of the person. As the person walks within the field of view of the imaging device and moves closer to the imaging device, the relative speed of the person may decrease, allowing for higher confidence scores for temperatures associated with the target region of the person.
As discussed elsewhere herein, subjects within the entry area 810 and outside of the facility 800 may have a variety of surface temperatures and relationships between their corresponding surface and core temperatures. For example, subject 830 entering facility 800 may have spent less time outside of the facility 800 than subject 832, who remains outside of the facility 800, and may have a surface temperature comparatively less affected by the environment outside of the facility than subject 832.
The facility 800 includes an infrared sensing device 860, shown as being positioned near the checkpoint 815 and subject 836 who is positioned at the front of the queue 850 closest to the checkpoint 815. In some examples, infrared sensing device 860 can be used to receive infrared (IR) radiation from a subject (e.g., subject 836) and provide temperature information representative of a surface temperature of the subject (e.g., a surface temperature of a facial region of the subject). In various examples, infrared sensing device 860 can include an infrared spot thermometer configured to provide a single temperature value associated with a target scene observed by the spot thermometer. In some examples, infrared sensing device 860 includes a temperature gun-style infrared sensing device in which the device is aimed at a subject for which a temperature measurement is to be taken.
In some examples, infrared sensing device 860 can include an infrared imaging tool having a plurality of infrared sensing elements, such as a plurality of bolometers. In some examples, each bolometer of a plurality of bolometers can be configured to absorb infrared energy focused through an infrared lens assembly and increase in temperature in response to the absorbed energy. The electrical resistance of each bolometer may change as the temperature of the bolometer changes. With each detector element functioning as a sensor pixel, a two-dimensional image or picture representation of the infrared radiation can be generated by translating the changes in resistance of each detector element into a time-multiplexed electrical signal that can be processed for visualization on a display or storage in memory (e.g., of a computing device). An infrared imaging tool can be configured to provide infrared image data representative of a scene.
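The translation from per-bolometer resistance changes to a temperature image can be sketched as below. This assumes a simple linear temperature coefficient of resistance; a real microbolometer pipeline additionally requires per-pixel calibration and non-uniformity correction, omitted here, and the function name and parameters are illustrative.

```python
def bolometer_frame_to_temps(resistances, r0, alpha, t0=25.0):
    """Convert a 2D grid of bolometer resistances into a temperature image.

    Uses the linear model R = r0 * (1 + alpha * (T - t0)), so each
    detector element (sensor pixel) maps its resistance change back to
    a temperature: T = t0 + (R / r0 - 1) / alpha.
    """
    return [[t0 + (r / r0 - 1.0) / alpha for r in row] for row in resistances]
```

With a nominal resistance of 1000 ohms and a (negative) temperature coefficient typical of bolometer materials, a small drop in resistance registers as a small rise in pixel temperature.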
As discussed elsewhere herein, in some cases, a surface temperature of a subject (e.g., of subject 836) may not provide enough information to determine a core temperature of the subject. The subject may have been standing in queue 850 for a long time (e.g., sufficiently long to become equilibrated to the environment of the entry area 810), or may have just recently entered the facility 800 from outside, which may be substantially warmer or colder than the inside of the facility 800.
The temperature screening system of
In some embodiments, processor 870 may be configured to generate a display to visually present information, such as image data or other visual indications of data. The temperature screening system may include a display device 874 in communication with processor 870, and processor 870 can be configured to provide a generated display to the display device 874. In some examples, the display device 874 is a dedicated display, such as a monitor or television. In some embodiments, display device 874 can include a device such as a smartphone or tablet, for example, carried by a user tasked with screening subjects.
While various components are shown as being in communication with processor 870, communication can be accomplished via wired or wireless communication. In some embodiments, one or more components can be physically coupled to the processor 870 to facilitate communication therewith, such as via one or more wires, fiber optic cables, or the like. Additionally or alternatively, in some examples, one or more components can be in wireless communication with the processor 870, such as via Bluetooth, Wi-Fi, or the like.
In some embodiments, one or more components of the system shown in
As described elsewhere herein, in some cases, an infrared sensing device (e.g., 876) can provide a surface temperature of a subject, but if the subject has not equilibrated to the environment, the surface temperature may not be a reliable indicator of a core temperature. Monitoring a subject's surface temperature over time may provide information that can be used to determine whether a subject's core temperature can be accurately determined using surface temperature measurements.
The process of
The process further includes determining a figure of merit indicating whether a core temperature of the subject can be accurately determined from the data set during step 920. In some examples, the figure of merit represents whether an accurate equilibrium surface temperature of the subject can be determined, which can reflect whether an accurate core temperature can be determined from the temperature-time data. For example, the figure of merit can include a subject stability figure of merit that indicates whether the subject has sufficiently equilibrated to the environment such that the most recent surface temperature measurement is approximately equal to an equilibrium surface temperature. In other examples, the figure of merit represents whether the set of temperature-time data can be accurately fit to a fitting function for predicting an equilibrium temperature, even if the equilibrium temperature has not yet been reached. Such a determination can be made, for example, using a sum of squares analysis comparing a fitting function to the temperature-time data. In some examples, if an accurate equilibrium surface temperature can be determined from the temperature-time data, an accurate core temperature can similarly be determined based on the accurate equilibrium surface temperature. In some examples, determining the core temperature of the subject based on the temperature-time data during step 960 includes determining an equilibrium surface temperature based on the temperature-time data. In other examples, the temperature-time data can be used to calculate the core temperature without explicitly calculating the equilibrium surface temperature.
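A sum-of-squares figure of merit of the kind described above might be sketched as follows; the function names and the example residual threshold of 0.5 are assumptions for illustration.

```python
def residual_sum_of_squares(times, temps, model):
    """Sum of squared residuals between measured temperatures and a
    candidate fitting function evaluated at the same times."""
    return sum((temp - model(t)) ** 2 for t, temp in zip(times, temps))


def fit_is_reliable(times, temps, model, rss_threshold=0.5):
    """Figure of merit: trust the fit (and hence the predicted
    equilibrium temperature) only when the residual sum of squares
    against the temperature-time data is small."""
    return residual_sum_of_squares(times, temps, model) < rss_threshold
```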
In the illustrated process of
If the figure of merit meets the threshold, the core temperature of the subject can be accurately determined, and the process involves determining an equilibrium surface temperature of the subject based on the temperature-time data during step 960. In some embodiments, determining the equilibrium surface temperature comprises utilizing the most recently received temperature information as the equilibrium surface temperature. In other examples, determining the equilibrium surface temperature comprises utilizing one or more statistical algorithms, look-up tables, and/or curve-fitting techniques to analyze the temperature-time data and determine an equilibrium surface temperature (e.g., by extrapolating the temperature-time data beyond the most recently received temperature information).
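One simple extrapolation of the kind mentioned above can be sketched with Aitken's delta-squared formula, which assumes the surface temperature approaches equilibrium along an exponential settling curve sampled at equal time intervals. The text does not prescribe a particular fitting function, so this is only one illustrative choice.

```python
def extrapolate_equilibrium(t1, t2, t3):
    """Aitken extrapolation from three equally spaced surface
    temperature samples toward the asymptotic equilibrium value.

    For a geometric (exponential) approach to equilibrium, the limit is
    (t1*t3 - t2^2) / (t3 - 2*t2 + t1).
    """
    denom = t3 - 2.0 * t2 + t1
    if abs(denom) < 1e-9:
        # Successive differences no longer shrinking: the most recent
        # sample is already approximately the equilibrium value.
        return t3
    return (t1 * t3 - t2 * t2) / denom
```

For example, samples of 90, 94, and 96 degrees (differences halving each interval) extrapolate to an equilibrium of 98 degrees before the subject has fully equilibrated.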
The process of
In some examples, an equilibrium surface temperature need not explicitly be calculated, and the temperature-time data itself can be used (e.g., in combination with ambient temperature information) to determine a core temperature. For example, in some embodiments, an algorithm or lookup table can be used to determine a core temperature using ambient temperature information and one or more data points in the temperature-time data.
With respect to
In some embodiments, systems may present various indicia to a user based on the results of the process of
In some examples, if the fitting function sufficiently fits the temperature-time data (e.g., the residual sum of squares is below a predetermined threshold), then the temperature-time data can be extrapolated to determine an equilibrium surface temperature based on the fitting function. The equilibrium surface temperature can be used to determine a core temperature of the subject. In some cases, using the equilibrium surface temperature to determine the core temperature need not involve explicitly identifying or outputting the equilibrium surface temperature value prior to determining the core temperature as described elsewhere herein.
Processes such as those described with respect to
Returning to
As depicted, the facility 800 includes a plurality of infrared sensing devices 860, 862, 864 positioned proximate to queue 850 and generally along a pathway from the door 805 to the checkpoint 815. In some embodiments, one or more of a plurality of infrared sensing devices 860, 862, 864 can be used to provide temperature information to a system processor representative of a surface temperature of a subject at a point in time. For example, subject 836 may pass by infrared sensing devices 864, 862, 860 in order while traveling from the door 805 to checkpoint 815. Data from each of the infrared sensing devices representative of a surface temperature of a given subject (e.g., 836) can be aggregated as a set of temperature-time data associated with the subject, which can be used as described herein to determine whether an accurate equilibrium surface temperature of the subject can be determined from the temperature-time data.
Each of a plurality of infrared sensing devices (e.g., 864, 862, 860) can be positioned anywhere within the entry area 810. The infrared sensing devices can include, for example, staffed or automated stations at which subjects stop for screening in order to be screened a plurality of times to generate the temperature-time data associated therewith. In some such examples, staff at the staffed stations can manually log temperature information for the subject, for example, using a handheld infrared sensing device. In some examples, a subject may enter identifying information into an automated infrared sensing device and the infrared sensing device can provide temperature information representative of the subject's surface temperature automatically. In other examples, subjects may visit the infrared sensing devices and manually initiate measurement of his or her own surface temperature, for example using a fixed stationary infrared sensing device.
Temperature data may be associated with a particular subject based on a unique user identifier, such as a number or other identifier assigned to the subject. Such an identifier may be entered along with temperature information collected at a given infrared sensing device in order to accumulate temperature-time data over time. In other examples, facial recognition or other automated recognition technology may be used to aggregate temperature information acquired at different times and to associate a set of temperature-time data with a particular subject. In some such examples, infrared image data representing a subject's face can be analyzed at a plurality of times, for example, at one or more infrared sensing devices. Facial recognition can be performed using the infrared image data, or can be done using other image data, such as visible light image data. A maximum temperature value from the infrared image data may be used as temperature data associated with the subject at a given time, and associated with the user within the system based on the facial recognition.
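The aggregation described above (keying sightings by a subject identifier and logging the maximum pixel temperature per sighting) could be sketched as below; the class and method names are hypothetical.

```python
from collections import defaultdict


class TemperatureTimeLog:
    """Aggregate per-subject temperature-time data from repeated sightings.

    Subjects are keyed by an identifier (e.g., an assigned number or a
    facial-recognition match); each sighting contributes the maximum
    pixel temperature observed for the subject at that time.
    """

    def __init__(self):
        self._log = defaultdict(list)

    def record_sighting(self, subject_id, timestamp, pixel_temps):
        # Use the hottest pixel of the subject's face at this time.
        self._log[subject_id].append((timestamp, max(pixel_temps)))

    def temperature_time_data(self, subject_id):
        # Return the subject's (time, temperature) pairs in time order.
        return sorted(self._log[subject_id])
```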
In some examples, a subject (e.g., 840) may enter queue 850. Upon reaching checkpoint 815, temperature-time data associated with the subject can be analyzed to determine whether an accurate equilibrium surface temperature can be determined based on the temperature-time data. If so, a core temperature of the subject can be determined such as described herein, and the subject can be admitted or denied access to screened area 820 based on the determined core temperature. If not, the subject may be directed to holding area 812 so that additional temperature information can be acquired for the subject. As shown in the illustrated example, an infrared sensing device 866 is included in the holding area 812 and can be used to collect additional temperature information associated with the subject. In some cases, the infrared sensing device 866 within the holding area 812 may comprise a higher resolution infrared camera compared with an infrared camera associated with infrared sensing device 864.
In some embodiments, a station including an infrared sensing device (e.g., 860, 862, 864, or 866) can include a display and can indicate to a subject at the station or a person staffing the station whether an accurate equilibrium surface temperature for the subject can be determined based on the temperature-time data associated with the subject.
The facility 800 of
Infrared sensing devices 890, 894 can be configured to provide infrared image data to one or more processors (e.g., processor 870 in
In some embodiments, the one or more processors can be configured to track subjects through an area, such as through the entry area 810. Tracking can be performed in a variety of ways, for example, by assigning each subject a retroreflective tag or other item that can be detected by an infrared sensing device, or by identifying subjects using track descriptors unique to each subject. Other tracking processes, including probabilistic tracking techniques, are also possible. Tracking a subject through a space allows the system to aggregate temperature information representative of a surface temperature for a given subject even if the subject moves throughout the entry area 810. In some cases, tracking further includes tracking a particular measurement surface of each subject, such as tracking a subject's face through a scene, and in some cases, tracking a maximum surface temperature of the subject's face through the scene.
In some embodiments, one or more processors (e.g., processor 870 in
In some examples, the one or more processors are in communication with a display such that the one or more processors can output information associated with a subject on the display. For example, with respect to
In some embodiments, one or more processors in communication with infrared sensing devices 890 and 894 are configured to aggregate temperature-time data regarding surface temperature for each of one or more subjects within a field of view of the infrared sensing devices.
The data set of
In some embodiments, a temperature screening system can be configured to collect data into a data set such as that shown in
The data shown in the data set of
In an example embodiment described with respect to
In some examples, temperature-time data associated with a user can be aggregated while a user waits at an infrared sensing device.
In the illustrated example of
The display 1900 in
In the illustrated example, one or more processors determined that an accurate equilibrium surface temperature for subject 1902 could not be determined (“No” in the top line of status box 1912), and instructed a user to hold the subject 1902 at the infrared sensing device for further data collection and/or subject equilibration (“Hold” in the bottom line of status box 1912). With respect to subjects 1904, 1906, one or more processors determined that an accurate equilibrium surface temperature for the subjects could be determined (“Yes” in the top line of status boxes 1914, 1916) and instructed a user to proceed to measure a core temperature of the subjects (“Measure” in the bottom line of status boxes 1914, 1916). In some such examples, a user may initiate a core temperature measurement, for example, using an equilibrium surface temperature (e.g., a most recent data point in the temperature-time data or an extrapolated equilibrium temperature based on a fitting function) and a body mode offset.
In the illustrated example, one or more processors determined that an accurate equilibrium surface temperature for subject 1922 could not be determined (“No” in the top line of status box 1932), and instructed a user to hold the subject 1922 at the infrared sensing device for further data collection and/or subject equilibration (“Hold” in the bottom line of status box 1932). With respect to subjects 1924, 1926, one or more processors determined that an accurate equilibrium surface temperature for the subjects could be determined (“Yes” in the top line of status boxes 1934, 1936). As described elsewhere herein, one or more processors can be configured to, if an accurate equilibrium surface temperature of a subject can be determined, determine a core temperature of the subject. In the example of
Subsequent action can be taken for each of the subjects 1922, 1924, 1926 in
Various configurations have been shown herein for acquiring temperature-time data associated with a subject in a space (e.g., in an entry area) in order to determine whether the subject's equilibrium surface temperature and/or core temperature can be accurately measured using the temperature-time data. In various examples, any combination of such configurations can be used. For example, one or more infrared imaging tools having a field of view through which subjects can be tracked as in
The temperature-time data from such configurations can be used in a variety of ways to make such a determination, and, if an accurate equilibrium surface temperature and/or core temperature can be determined, systems can use the temperature-time data to determine such temperature(s).
The determined rate of change can be compared to a threshold during step 1020. If the rate of change is below the threshold, then a most recent surface temperature can be considered approximately equal to an equilibrium surface temperature during step 1030. In this case, the method can include determining an equilibrium surface temperature and/or a core temperature for the subject during step 1040.
As noted, in some embodiments, if the rate of change is below the threshold in step 1020, a most recent measurement can be considered an equilibrium surface temperature. Additionally or alternatively, in some cases, determining a core temperature comprises using an equilibrium surface temperature and a body mode offset (e.g., based on an ambient temperature and using an algorithm or lookup table) to determine the core temperature.
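The rate-of-change check and body mode offset described above can be sketched as follows. The function names, the example rate threshold, and the offset value are assumptions for illustration.

```python
def rate_of_change(sample_a, sample_b):
    """Degrees per second between two (time_seconds, temperature) samples."""
    (t1, temp1), (t2, temp2) = sample_a, sample_b
    return (temp2 - temp1) / (t2 - t1)


def core_from_recent(samples, rate_threshold=0.005, body_mode_offset=2.5):
    """If the surface temperature has settled (rate of change below the
    threshold), treat the most recent reading as the equilibrium surface
    temperature and add a body mode offset to estimate core temperature.

    Returns None when the rate of change is still too high, signalling
    that more data (or a curve fit) is needed.
    """
    if abs(rate_of_change(samples[-2], samples[-1])) >= rate_threshold:
        return None  # subject has not yet equilibrated
    return samples[-1][1] + body_mode_offset
```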
In the example of
In some embodiments, if a rate of change determined from two data points is not below the threshold, additional data can be acquired or analyzed (beyond the two data points) to perform additional analysis, such as fitting accumulated temperature-time data to a fitting function. The process of
The process shown in
In addition or alternatively to determining whether enough data is used (at step 1070), the method can include determining whether the fitting function is a good fit to the analyzed data during step 1080. For example, such analysis can include determining how well a fitting function fits the data, such as by a residual sum of squares or other analysis. If the fitting function does not fit the data well (e.g., the residual sum of squares is above a threshold), the fit can be considered inadequate or unreliable, for example, for predicting an asymptotic equilibrium surface temperature.
Thus, in various examples, when three or more data points are fit to a fitting function, a figure of merit representing whether an accurate equilibrium surface temperature of the subject can be determined can include a number and/or time span of temperature-time data. Additionally or alternatively, a figure of merit can include a representation of whether a fitting function accurately reflects the temperature-time data.
If a number of data points and/or time span of collected data points is used as a figure of merit, and the number and/or span falls below a threshold, additional data can be collected during step 1050 and the fit repeated during step 1060 with the new data. As shown in
As described, in various embodiments, a data threshold (e.g., number of data points and/or time span reflected by data points) as in step 1070, a fit sufficiency threshold (e.g., using a residual sum of squares threshold or the like) as in step 1080, or both can be used to determine whether a subject's equilibrium surface temperature can be accurately determined using the temperature-time data. If one or both of the thresholds analyzed in a given embodiment are satisfied, the process can proceed to determining an equilibrium surface temperature and/or a core temperature during step 1040.
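The combined decision (enough data and a sufficiently good fit) might be sketched as below; the minimum point count, minimum time span, and residual threshold are illustrative values, and the function name is an assumption.

```python
def can_determine_equilibrium(times, temps, model,
                              min_points=3, min_span_s=60.0,
                              rss_threshold=0.5):
    """Decide whether temperature-time data supports an accurate
    equilibrium surface temperature estimate.

    Applies both checks described above: the amount of data (point count
    and time span) and the quality of the fit (residual sum of squares
    against the candidate fitting function `model`).
    """
    if len(times) < min_points or (max(times) - min(times)) < min_span_s:
        return False  # not enough data: collect more and refit
    rss = sum((temp - model(t)) ** 2 for t, temp in zip(times, temps))
    return rss <= rss_threshold  # a poor fit makes the estimate unreliable
```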
The example process shown in
As described elsewhere herein, in some embodiments, determining a core temperature includes determining a subject's equilibrium surface temperature and calculating a core temperature using a body mode offset (or a temperature offset). In various embodiments, determining the core temperature can be performed using an equilibrium surface temperature and a body mode offset based on a lookup table or algorithm.
Processes such as those described herein, for example, with respect to
Additionally or alternatively, processes such as those described herein can prevent inaccurate screenings of subjects who have not yet equilibrated to the environment such that an instantaneous surface temperature measurement cannot be reliably used to determine a core temperature. Thus, reliability, safety, and efficiency of a contactless temperature screening system can be increased by analyzing, on a subject-by-subject basis, whether the subject's equilibrium surface temperature can be accurately determined by using temperature-time data representative of a subject's surface temperature at a plurality of times. Additionally, using temperature-time data as described herein can account for subjects' surface temperature equilibration to the surrounding environment (e.g., by verifying the subject has equilibrated by looking at a rate of temperature change of the subject or by fitting the temperature-time data to a fitting function) and can be more accurate in determining a surface temperature and/or core temperature when compared to a single surface temperature measurement at a single point in time.
In some examples, temperature screening systems can manage subjects based on whether an accurate equilibrium surface temperature can be determined, such as by directing subjects for which an accurate equilibrium surface temperature cannot be determined to a holding area or otherwise requiring that such subjects remain in an entry area for further screening. In some examples, systems can be configured to cause ambient air to flow through an area in which subjects are equilibrating (e.g., entry area 810 in
Various embodiments described herein refer to determining whether an accurate equilibrium surface temperature or an accurate core temperature can be determined. In some cases, such determinations are synonymous. For instance, in some examples, an equilibrium surface temperature is used to determine a core temperature as described herein (e.g., using an equilibrium surface temperature and a body mode offset). In such examples, if the equilibrium surface temperature cannot be accurately determined, then the core temperature similarly cannot be accurately determined. Thus, in some embodiments, whether an accurate core temperature can be determined is analogous to whether an accurate equilibrium surface temperature can be determined. In other examples, temperature-time data can be used to calculate a core temperature without explicitly determining an equilibrium surface temperature, for example, wherein an algorithm for calculating a core temperature uses temperature-time data and a body mode offset without explicitly determining an equilibrium surface temperature. Some such instances utilize the same data and analyses to determine a core temperature as when determining an equilibrium surface temperature even if a surface temperature value is not expressly calculated or presented. In such examples, an inability to accurately determine an accurate equilibrium surface temperature corresponds to an inability to accurately determine a core temperature.
In some embodiments, temperature screening systems (e.g., one or more processors) can be configured to learn or calibrate operation over time. For example, in some embodiments, an equilibrium surface temperature or core temperature for one or more subjects may be measured or estimated based on processes described herein, such as by determining a surface temperature asymptote based on fitting temperature-time data to a fitting function. The subject can then be shunted back into the entry area for further equilibration. A subsequent surface or core temperature measurement or estimation can be performed after additional equilibration time in the entry area and compared to the prior measured or estimated value. The system can thus learn whether an accurate equilibrium surface temperature and/or core temperature determination was made in the initial measurement or estimation. In some examples, a system can be configured to generate or update one or more thresholds based on such information.
For instance, in an example scenario, at the time of the initial determination of an equilibrium surface temperature, the rate of temperature change was X degrees per second and a most-recent data point was used as an estimated surface temperature. However, after further equilibration and surface temperature analysis, the initial estimated surface temperature was found to be incorrect. In such an instance, a threshold rate of temperature change for determining whether an accurate equilibrium surface temperature can be determined or has been reached should be lower than X, since a temperature rate of change of X degrees per second proved insufficient to treat the most recent data point of the temperature-time data as an accurate equilibrium surface temperature.
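This threshold-lowering calibration could be sketched as follows (a hypothetical illustration only; the function name, the 0.5-degree tolerance, and the shrink factor are assumptions rather than values from the specification):

```python
def update_rate_threshold(current_threshold, observed_rate,
                          initial_estimate, later_measurement,
                          tolerance=0.5, shrink=0.9):
    """If the initial equilibrium surface temperature estimate (made while
    the subject's temperature was changing at observed_rate, in degrees per
    second) disagrees with a later measurement taken after further
    equilibration, the rate threshold was too permissive: lower it below
    the observed rate. Otherwise, keep the current threshold.
    """
    if abs(initial_estimate - later_measurement) > tolerance:
        # The observed rate proved insufficient; require a stricter
        # (lower) threshold in future analyses.
        return min(current_threshold, observed_rate * shrink)
    return current_threshold
```

In the scenario above, a rate of X degrees per second that yielded an incorrect estimate would cause the stored threshold to drop below X for subsequent subjects.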
Similarly, in another example scenario, at the time of the initial determination, at least three data points from a set of temperature-time data are fit to a fitting function and an asymptotic equilibrium surface temperature is estimated. However, if the estimated asymptotic equilibrium surface temperature is sufficiently different (e.g., more than a threshold amount of difference) from the actual surface temperature measured after further equilibration, the data fit was insufficient to determine an accurate equilibrium surface temperature. In various examples, a fitting parameter associated with the data (e.g., a residual sum of squares or R-square metric) can be considered inadequate for determining an accurate equilibrium surface temperature. Additionally or alternatively, in some examples, the number of temperature-time data points and/or the time span of the temperature-time data used in the fitting is insufficient. Such information can be used to adjust decisions made by the system in future analyses. Various non-limiting examples have been described.
In some embodiments, for a particular facial region (e.g., a cheek region), a plurality of temperature fitting functions may be stored within a lookup table or database with each temperature fitting function corresponding with a particular asymptotic surface temperature for the particular facial region. In at least one example, a cheek region of a person with a first body type may be associated with a first temperature fitting function corresponding with an asymptotic surface temperature of 99° F., a second temperature fitting function corresponding with an asymptotic surface temperature of 99.5° F., and a third temperature fitting function corresponding with an asymptotic surface temperature of 100° F.
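The lookup-table approach for a cheek region could be sketched as follows (a hypothetical illustration; the curve shapes, time constants, and table contents are assumed for demonstration and are not values from the specification):

```python
import math

# Hypothetical lookup table for a cheek region: each candidate asymptotic
# surface temperature T_inf is associated with a warming-curve fitting
# function of the assumed form T(t) = T_inf - A * exp(-t / tau).
CHEEK_FITTING_FUNCTIONS = {
    99.0:  lambda t: 99.0 - 4.0 * math.exp(-t / 60.0),
    99.5:  lambda t: 99.5 - 4.0 * math.exp(-t / 60.0),
    100.0: lambda t: 100.0 - 4.0 * math.exp(-t / 60.0),
}

def best_asymptote(temp_time_data, table=CHEEK_FITTING_FUNCTIONS):
    """Return (asymptotic_temperature, squared_error) for the stored
    fitting function that best matches the observed (time, temp) data."""
    def sse(fn):
        return sum((fn(t) - temp) ** 2 for t, temp in temp_time_data)
    best = min(table, key=lambda t_inf: sse(table[t_inf]))
    return best, sse(table[best])
```

Observed temperature-time data for a subject would then select, for example, the 99.5° F. curve if that stored function yields the smallest fitting error.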
As depicted in
In some embodiments, a computing device, such as computing device 680 in
As depicted in
In step 1202, a set of images of an environment is acquired from one or more imaging devices. The one or more imaging devices may include a capture device, such as capture device 620 in
In step 1212, an asymptotic surface temperature for the facial region of the person is determined using the temperature fitting function and the temperature-time sequence. In some cases, the asymptotic surface temperature for the facial region of the person may be determined in response to detecting that the temperature-time sequence is sufficient to determine the asymptotic surface temperature for the facial region of the person. In at least one example, it may be detected that the temperature-time sequence is sufficient to determine the asymptotic surface temperature for the facial region of the person if the temperature-time sequence fits the temperature fitting function with a curve fitting error below a threshold error value. In step 1214, a core body temperature (or an internal core temperature) for the person is determined based on the asymptotic surface temperature for the facial region of the person. In at least one embodiment, the core body temperature may be calculated using a temperature offset associated with the facial region of the person applied to the asymptotic surface temperature determined in step 1212. In step 1216, the core body temperature for the person is outputted. The core body temperature may be outputted by transmitting the core body temperature to a portable electronic device or by displaying the core body temperature using a display.
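Steps 1212 and 1214 could be sketched as follows (a hypothetical illustration assuming a warming model of the form T(t) = T_inf − A·exp(−t/τ) and a coarse grid-search fit; the error threshold, grid ranges, and region offset are assumed values, not values from the specification):

```python
import math

def fit_asymptote(temp_time_seq, error_threshold=0.1):
    """Fit T(t) = T_inf - A * exp(-t / tau) to a temperature-time
    sequence by coarse grid search. Return the asymptotic surface
    temperature T_inf, or None if the best RMS fitting error exceeds
    the threshold (i.e., the sequence is insufficient per step 1212)."""
    best = None
    t_max = max(temp for _, temp in temp_time_seq)
    for t_inf in [t_max + 0.1 * k for k in range(50)]:
        for a in [0.5 * k for k in range(1, 20)]:
            for tau in (30.0, 60.0, 90.0, 120.0):
                err = math.sqrt(sum(
                    (t_inf - a * math.exp(-t / tau) - temp) ** 2
                    for t, temp in temp_time_seq) / len(temp_time_seq))
                if best is None or err < best[0]:
                    best = (err, t_inf)
    return best[1] if best[0] <= error_threshold else None

def core_temperature(asymptotic_surface_temp, region_offset=1.5):
    """Step 1214: apply a region-specific temperature offset (the 1.5
    degree value is hypothetical) to the asymptotic surface temperature."""
    return asymptotic_surface_temp + region_offset
```

Data that follows the warming model yields an asymptote near the true equilibrium value, while data that cannot be fit within the threshold error (e.g., an oscillating sequence) yields None, signaling that the sequence is insufficient.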
In step 1232, a plurality of temperatures corresponding with a region of a person during a time period is acquired. In step 1234, a temperature-time sequence for the region of the person is generated using the plurality of temperatures. In step 1236, a temperature fitting function is identified based on the region of the person. In one example, the region of the person may comprise a cheek region and the temperature fitting function may correspond with the temperature fitting function 1104 in
At least one embodiment of the disclosed technology comprises a temperature screening system including a storage device (e.g., a semiconductor memory) and one or more processors in communication with the storage device. The storage device is configured to store a plurality of temperatures corresponding with a facial region of a person during a time period. The one or more processors are configured to generate a temperature-time sequence for the facial region of the person using the plurality of temperatures, identify a temperature fitting function based on the facial region of the person, determine an asymptotic surface temperature for the facial region of the person using the temperature fitting function and the temperature-time sequence, estimate a core body temperature for the person based on the asymptotic surface temperature for the facial region of the person, and output the core body temperature for the person.
In some cases, the disclosed technology may further include one or more processors configured to estimate a mass or surface area of the person and identify the temperature fitting function based on the mass of the person and/or the surface area of the person. The disclosed technology may further include one or more processors configured to determine a walking speed for the person during the time period and determine a plurality of temperature measurement confidence values corresponding with the plurality of temperatures based on the walking speed for the person during the time period. The one or more processors may also be configured to detect that the temperature-time sequence is sufficient based on the plurality of temperature measurement confidence values corresponding with the plurality of temperatures.
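The walking-speed-based confidence values could be sketched as follows (a hypothetical illustration; the maximum speed and the minimum confidence mass are assumed values, not values from the specification):

```python
def measurement_confidences(walking_speeds, max_speed=1.5):
    """Assign each temperature measurement a confidence in [0, 1] that
    decreases linearly with the subject's walking speed (in meters per
    second), since motion degrades thermal-measurement quality; speeds
    at or above max_speed (an assumed value) receive zero confidence."""
    return [max(0.0, 1.0 - speed / max_speed) for speed in walking_speeds]

def sequence_sufficient(confidences, min_total=3.0):
    """Treat the temperature-time sequence as sufficient when the summed
    confidence mass meets a (hypothetical) minimum total."""
    return sum(confidences) >= min_total
```

Under this sketch, a sequence captured while a subject stood still or walked slowly would accumulate confidence mass quickly, while measurements taken of a fast-moving subject would contribute little or nothing toward sufficiency.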
At least one embodiment of the disclosed technology includes acquiring a plurality of temperatures corresponding with a facial region of a person during a time period, generating a temperature-time sequence for the facial region of the person using the plurality of temperatures, identifying a temperature fitting function based on the facial region of the person, determining an asymptotic surface temperature for the facial region of the person using the temperature fitting function and the temperature-time sequence, estimating a core body temperature for the person based on the asymptotic surface temperature for the facial region of the person, and outputting the core body temperature for the person.
In some cases, the disclosed technology may further include determining the asymptotic surface temperature for the facial region of the person by extrapolating the asymptotic surface temperature for the facial region of the person from the temperature fitting function in response to detecting the temperature-time sequence being sufficient to determine the asymptotic surface temperature for the facial region of the person. The disclosed technology may further include detecting that the temperature-time sequence is sufficient to determine the asymptotic surface temperature for the facial region of the person by computing a fitting error between the temperature-time sequence and the temperature fitting function and detecting the temperature-time sequence is sufficient to determine the asymptotic surface temperature in response to the fitting error being below a threshold error value.
At least one embodiment of the disclosed technology includes acquiring a plurality of temperatures corresponding with a facial region of a person during a first time period, generating a temperature-time sequence for the facial region of the person during the first time period using the plurality of temperatures, determining an amount of surface area for the person, identifying a temperature fitting function based on a location of the facial region of the person and the amount of surface area for the person, determining an asymptotic surface temperature for the facial region of the person using the temperature fitting function and the temperature-time sequence in response to detecting that the temperature-time sequence is sufficient to determine the asymptotic surface temperature for the facial region of the person, wherein the asymptotic surface temperature for the facial region of the person is greater than any temperature of the plurality of temperatures. The method may further include determining a core body temperature for the person based on the location of the facial region of the person and the asymptotic surface temperature for the facial region of the person and outputting the core body temperature for the person.
In some cases, the disclosed technology may further include determining an air flow within the environment and identifying the temperature fitting function based on the air flow within the environment.
The disclosed technology may be described in the context of computer-executable instructions being executed by a computer or processor. The computer-executable instructions may correspond with portions of computer program code, routines, programs, objects, software components, data structures, or other types of computer-related structures that may be used to perform processes using a computer. Computer program code used for implementing various operations or aspects of the disclosed technology may be developed using one or more programming languages, including an object-oriented programming language such as Java or C++, a functional programming language such as Lisp, a procedural programming language such as the “C” programming language or Visual Basic, or a dynamic programming language such as Python or JavaScript. In some cases, computer program code or machine-level instructions derived from the computer program code may execute entirely on an end user's computer, partly on an end user's computer, partly on an end user's computer and partly on a remote computer, or entirely on a remote computer or server.
The flowcharts and block diagrams in the figures provide illustrations of the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various aspects of the disclosed technology. In this regard, each block in a flowchart may correspond with a program module or portion of computer program code, which may comprise one or more computer-executable instructions for implementing the specified functionality. In some implementations, the functionality noted within a block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. In some implementations, the functionality noted within a block may be implemented using hardware, software, or a combination of hardware and software.
For purposes of this document, it should be noted that the dimensions of the various features depicted in the figures may not necessarily be drawn to scale.
For purposes of this document, reference in the specification to “an embodiment,” “one embodiment,” “some embodiments,” or “another embodiment” may be used to describe different embodiments and do not necessarily refer to the same embodiment.
For purposes of this document, a connection may be a direct connection or an indirect connection (e.g., via another part). In some cases, when an element is referred to as being connected or coupled to another element, the element may be directly connected to the other element or indirectly connected to the other element via intervening elements. When an element is referred to as being directly connected to another element, then there are no intervening elements between the element and the other element.
For purposes of this document, the term “based on” may be read as “based at least in part on.”
For purposes of this document, without additional context, use of numerical terms such as a “first” object, a “second” object, and a “third” object may not imply an ordering of objects, but may instead be used for identification purposes to identify different objects.
For purposes of this document, the term “set” of objects may refer to a “set” of one or more of the objects.
In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.
The present application is a continuation of International Patent Application No. PCT/US2022/014122, filed Jan. 27, 2022, which claims priority to U.S. Provisional Application No. 63/142,249, filed Jan. 27, 2021, which are herein incorporated by reference in their entirety.
Related U.S. Application Data

Provisional Application: No. 63/142,249, filed Jan. 2021 (US)
Parent Application: PCT/US2022/014122, filed Jan. 2022 (US)
Child Application: No. 17/861,049 (US)