The present invention relates to systems and methods for gathering and processing information from an IR sensor to produce a thermal image, and to a method for producing thermal image data.
Thermographic cameras are known. Most thermographic cameras use a lens to focus IR radiation onto a chip with multiple sensors. Such sensors are expensive and bulky. There is a need for a simple and low-cost thermographic camera.
In a first aspect of the invention, there is provided a method for providing thermal image data using a device comprising an IR sensor arranged to receive IR radiation from the surroundings in a field of view, and position determining means able to determine a position and orientation of the IR sensor, the method involving the steps
In particular, there is provided a method for providing thermal image data using a device comprising an IR sensor arranged to receive IR radiation from the surroundings in a field of view, and position determining means able to determine a position and orientation of the IR sensor, where the thermal image data represents a plurality of pixels in a dot matrix or a bitmap, the method involving the steps
The design with a single IR sensor provides a cost-efficient way of providing thermal imaging in a low-cost device, for example a mobile phone.
In one embodiment it is assumed that the entire temperature difference between two partially overlapping fields is caused by one or both of the non-common fields. In one embodiment it is assumed that the entire temperature difference between two partially overlapping fields is caused by one of the non-common fields.
The field of view may be swept in at least two different sweep directions, resulting in a first temperature value for a non-common field from a first sweep direction and a second temperature value for a non-common field from a second sweep direction, and the first and the second temperature values are used to determine the temperature for an area in the image, or pixels that represent the area, that is shared by the two non-common fields. Using a plurality of sweep directions increases the resolution of the image, and adding further sweep directions increases the resolution further.
The data from the various sweep directions can be used in any suitable manner to determine the temperature of the shared region. In one embodiment the temperature for an area or pixel in the image is determined by averaging the temperatures determined for the shared region.
The IR sensor may be provided in a handheld device, for example a mobile device such as a mobile phone. The user may be provided with instructions to sweep the device in at least two different directions that are not the direction of detection of the field of view of the IR sensor. The instructions to the user may comprise an instruction to move the handheld device in a predefined direction, where the direction is updated depending on data from the position and orientation determining means. When the device has a display, the display may be used to display indications that prompt the user to sweep, and also to display a thermal image created with the use of the thermal image data. The display may be used to display the image as it is formed in real time. This has the advantage that the user is provided with real-time feedback on how to sweep to improve the image.
In a second aspect of the invention there is provided a system comprising an IR sensor arranged to receive IR radiation from the surroundings in a field of view, and position determining means able to determine a position and orientation of the IR sensor, the system arranged to provide thermal image data, where the system is configured to cause the IR sensor to determine an IR sensor value corresponding to a temperature with at least a predetermined frequency, and further configured to determine the current relative position and orientation of the IR sensor for the time point for which the IR sensor values are determined, and further configured to use the determined IR sensor values, together with their respective detected positions and orientations, to determine data for a thermal image, where the temperature for an area in the image is determined by determining temperatures for two partially overlapping fields of view, such that an overlap between two fields of view is associated with two fields not common (non-common fields) to the two overlapping fields of view, and where a temperature value for a non-common field is determined by using the temperature difference between the two partially overlapping fields of view and the proportion of the area of the non-common field in relation to the area of the field of view.
In particular, there is provided a system comprising an IR sensor arranged to receive IR radiation from the surroundings in a field of view, and position determining means able to determine a position and orientation of the IR sensor, the system arranged to provide thermal image data representing a plurality of pixels in a dot matrix or a bitmap, where the system is configured to cause the IR sensor to determine an IR sensor value corresponding to a temperature with at least a predetermined frequency, and further configured to determine the respective current relative position and orientation of the IR sensor for the time point for which the IR sensor values are determined, and further configured to use the determined IR sensor values, together with their respective detected positions and orientations, to determine data for a thermal image, where a temperature for a pixel in the image is determined by determining temperatures for two partially overlapping fields of view, such that an overlap between two fields of view is associated with two part-fields not common (non-common fields) to the two overlapping fields of view, and where a temperature value for a non-common field is determined by using the temperature difference between the two partially overlapping fields of view and the proportion of the area of the non-common field in relation to the area of the field of view, and where the temperature for a point in the image is determined by using the temperature values thus determined for non-common fields to which the point belongs.
In a third aspect of the invention there is provided software arranged to repeatedly obtain IR sensor values from an IR sensor, and further arranged to repeatedly obtain, from position determining means, the position and orientation of the IR sensor, the software further being configured to use the IR sensor values, together with their respective detected positions and orientations, to determine the thermal image of the area, where a temperature for an area in the image is determined by determining temperatures for two partially overlapping fields of view, such that an overlap between two fields of view is associated with fields not common to the two overlapping fields of view (non-common fields), and where a temperature value for a non-common field is determined by using the temperature difference between the two partially overlapping fields of view and the proportion of the area of the non-common field in relation to the area of the field of view, and where the temperature for an area in the thermal image data is determined by using the temperature values thus determined.
In particular, there is provided software arranged to repeatedly obtain IR sensor values from an IR sensor, and further arranged to repeatedly obtain, from position determining means, the position and orientation of the IR sensor, the software further being configured to use the IR sensor values, together with their respective detected positions and orientations, to determine the thermal image of the area, where the thermal image data represents a plurality of pixels in a dot matrix or a bitmap, where a temperature for a pixel in the image is determined by determining temperatures for two partially overlapping fields of view, such that an overlap between two fields of view is associated with fields not common to the two overlapping fields of view (non-common fields), and where a temperature value for a non-common field is determined by using the temperature difference between the two partially overlapping fields of view and the proportion of the area of the non-common field in relation to the area of the field of view, and where the temperature for a pixel in the thermal image data is determined by using the temperature values thus determined.
With reference to
The IR sensor 2 is preferably of a type that senses one value, and one value only, at one given moment. The thermal image data is produced by scanning the field of view 4 of the IR sensor 2 across an area of interest 3, repeatedly gathering IR information during scanning, and then compiling and processing the thus gathered information using the position and/or orientation of the IR sensor 2 to produce the image data.
The system 1 also comprises a memory 8, a processor 9, and a communication bus 10. In a preferred embodiment the system 1 comprises a digital display 11 for showing digital images. The digital display 11 may be for example an LCD display. System 1 may comprise any suitable combination of software and hardware. System 1 may for example comprise an operating system and an input device such as a touch display.
The memory 8 is used for storing data, for example software carrying out the method, and also necessary data such as IR sensor values, positions, orientations, time points, time stamps, overlapping fields and their positions, and image data.
The system 1 also comprises positioning and/or orientation determining means 6 referred to as “position determining means 6” herein, able to determine a position and/or the orientation of the IR sensor 2. Orientation of the IR sensor 2 means the orientation of the direction of detection 5. Preferably, both position and orientation are determined by position determining means 6, but in certain embodiments only position or orientation is determined. Position and orientation are determined at fine resolution, preferably such that a movement of a few millimetres (for translation) or degrees (for orientation) can be detected.
Position and orientation are determined with methods known in the art. Hence, position determining means 6 may comprise any suitable technology including tracking cameras, marker-based or markerless tracking, inertial tracking, digital model building, gyroscopes and accelerometers, which technologies may be combined in any suitable manner. Present models of mobile phones such as iPhone X and Android phones are able to determine position and orientation with a high resolution. Apple iOS provides an API called Core Motion for providing position and orientation data, and Android provides a similar API.
Preferably the IR sensor 2 and position determining means 6 are comprised in a device 7, which preferably is a device that can be handheld. The device 7 may be a mobile phone, such as an iPhone or an Android phone. Nevertheless, parts of system 1 may be located outside device 7, such as on a server. In particular, memory 8 and processor 9, or parts of memory 8 and processor 9, may be located on a server which is in digital communication, for example wireless communication, with device 7.
The direction 5 of detection of the field of view 4 is preferably fixed in relation to device 7. The position determining means 6 is able to determine the direction 5 of detection of the field of view 4 and the position of the IR sensor 2. The position and the direction 5 are preferably determined in relation to objects in the surroundings such as objects in the area of interest 3. The position and direction may preferably be determined in relation to a stationary object. The direction of detection 5 may be the same as a line of observation perpendicular to and directed towards the surface of a display 11 on device 7 (see
System 1 may also comprise suitable components known in the art for processing the signal from the IR sensor 2, such as, for example, amplifiers, filters, A/D converters, etc. System 1 is powered by a suitable power source such as for example a battery. The IR signal is provided from the IR sensor 2 to the rest of system 1, which may be able to process the signal, store data representing the signal and carry out computations using this data using any suitable combination of software and hardware.
The system 1 is used for production of thermal image data in the form of a dot matrix 12 representing a thermal image, an example of which is shown in
The image data is suitable for rendering a thermal image 12 on a display, for example display 11 of system 1. The image data may be arranged as dot matrix 12 where a temperature value is stored for each pixel 13. The number of pixels in thermal image 12 may be from, for example, 10,000 to several million.
Each pixel 13 in the dot matrix 12 may assume at least two different colours (for example black and white) but it is preferred that more colours can be shown. Each colour represents a temperature value or a range of temperature values, where the colour scheme and the temperature range for each colour are selected depending on the application and the intended use. Typically, a heat image uses the colour scale black-blue-green-yellow-red, where black indicates the lowest temperature and red indicates the highest temperature. The colours are pseudo-colours and any useful colour scheme may be used. The display 11 of device 7 may be used for rendering the thermal image based on the dot matrix 12. Hence a pixel 13 may display a colour representing a temperature value.
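The pseudo-colour mapping above can be sketched as follows. This is an illustrative sketch only: the function name and the temperature thresholds are assumptions for the example, not part of the application, and in practice the ranges would be selected for the intended use.

```python
# Illustrative sketch: map a temperature value to one of the pseudo-colours
# in the black-blue-green-yellow-red scale. The thresholds are arbitrary
# example values, not taken from the application.

def temperature_to_colour(temp_c):
    # each entry is (upper bound in degrees C, colour for values below it)
    scale = [(0, "black"), (10, "blue"), (20, "green"), (30, "yellow")]
    for upper, colour in scale:
        if temp_c < upper:
            return colour
    return "red"  # highest temperature range

print(temperature_to_colour(9.7))  # blue
```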
For generating the image data, the field of view 4 is swept over the area of interest 3, i.e. the area of which a thermal image 12 is to be generated. The direction of sweep is indicated with arrow 15 in the figures. When the device 7 is handheld, sweeping is done manually, by moving the hand. However, powered and/or automatically controlled sweeping may be used in some embodiments, whereby for example a motor controlled by a computer moves the field of view 4. Hence, the system 1 and method described herein may be used as an alternative to multipixel thermal imaging chip sets.
The IR sensor 2 determines the IR value in the field of view 4 with at least a predetermined frequency. For a handheld device, a suitable frequency may be from 1 Hz to 100 Hz, where 2 Hz to 20 Hz is more preferred.
The position determining means 6 repeatedly determines the position and/or the orientation of the IR sensor 2 with a suitable time interval, which may be simultaneous with or at the same frequency as the capturing of IR data. However, different sampling frequencies may be used. The sampling frequency for position and/or orientation may be at least the frequency for sampling IR data. Time-pairing algorithms may be used to pair the correct IR data with the correct data for position and/or orientation. For example, time stamps may be used. Hence, the position determining means 6 detects the respective current relative position and/or orientation of the IR sensor 2 for each IR value while the field of view 4 is being swept.
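One possible time-pairing approach can be sketched as follows. This is an illustrative sketch under assumed data layouts (time-stamped sample lists), not part of the application: each IR sample is paired with the position/orientation sample whose time stamp is nearest.

```python
# Illustrative sketch: pair time-stamped IR samples with the nearest
# time-stamped position/orientation ("pose") samples, for the case where the
# two sensors are sampled at different frequencies. Both input lists are
# assumed to be sorted by time stamp.
from bisect import bisect_left

def pair_samples(ir_samples, pose_samples):
    """ir_samples: list of (timestamp, ir_value); pose_samples: list of
    (timestamp, pose). Returns a list of (ir_value, pose) pairs."""
    pose_times = [t for t, _ in pose_samples]
    pairs = []
    for t, ir in ir_samples:
        i = bisect_left(pose_times, t)
        # consider the pose samples just before and just after time t
        candidates = [j for j in (i - 1, i) if 0 <= j < len(pose_samples)]
        j = min(candidates, key=lambda j: abs(pose_times[j] - t))
        pairs.append((ir, pose_samples[j][1]))
    return pairs
```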
Sweeping the field of view 4 may be done by translating it, for example by translating the device 7 so that the field of view 4 moves across the area of interest 3 (shown in
Hence, each IR value detected during scanning becomes associated with data representing position and/or orientation. The IR image data may be compiled by using temperature values and positioning them in a dot matrix 12 with the use of the position and/or orientation data. The position and/or orientation data obtained by position determining means 6 may be used to position the various captured fields of view in three-dimensional space. As shown in
In
The proportion of the area of the non-common parts 102, 103 in relation to the field of view 4a, 4b depends on the sampling frequency and the speed with which the sweep is carried out. The sweep speed may vary, in particular when a handheld device 7 is used. The proportion of the area of a non-common field 102, 103 in relation to the area of the field of view 4 may be determined by an area proportion determining means. The area proportion determining means may use the current sweep speed as determined by position determining means 6. The area proportion determining means is a part of system 1 and is preferably implemented as software that uses information from position determining means 6 and the sampling time points. Hence, the area proportion determining means uses information from position determining means 6 to determine the speed with which the field of view 4 moves over the area of interest 3 and can use that information to determine the proportion between the area of a non-common field 102, 103 and a field of view 4a, 4b. A high sweep speed results in large non-common fields 102, 103 and a low sweep speed results in small non-common fields. A low sweep speed results in improved resolution of the thermal information. A suitable proportion of the non-common field in relation to the total field of view 4 may be from 1 to 20%.
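The area proportion calculation can be sketched as follows. This is a minimal sketch under assumptions not stated in the application: a pure translation sweep and a rectangular field of view of known extent along the sweep direction; the function and its parameters are illustrative only.

```python
# Illustrative sketch: estimate the proportion of the non-common field from
# the current sweep speed (as would be reported by the position determining
# means) and the IR sampling frequency, assuming a pure translation sweep
# over a rectangular field of view.

def non_common_proportion(sweep_speed, sampling_frequency, fov_width):
    """sweep_speed in m/s, sampling_frequency in Hz, fov_width in m
    (extent of the field of view along the sweep direction)."""
    displacement = sweep_speed / sampling_frequency  # movement per sample
    # the newly exposed strip has width `displacement` out of `fov_width`;
    # cap at 1.0, i.e. no overlap at all
    return min(displacement / fov_width, 1.0)

# e.g. 0.03 m/s swept at 10 Hz over a 0.02 m wide field gives a 15% share
print(non_common_proportion(0.03, 10, 0.02))
```

The same relation shows why a low sweep speed improves resolution: the slower the sweep, the smaller the non-common strip each sample resolves.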
In one embodiment the sampling frequency of the IR sensor 2 and the position determining means 6 automatically increases when the sweep speed is high.
The change in temperature between fields of view 4a and 4b is divided by the area proportion and the resulting ratio is used to calculate a temperature for the old and/or new non-common field 102, 103. In one embodiment it is assumed that the entire temperature difference between overlapping fields 4a, 4b is due to the new non-common field 103. In one embodiment it is assumed that the entire temperature difference is due to the old non-common field 102. In further embodiments it is assumed that the temperature difference is due to both the new non-common field 103 and the old non-common field 102, for example in fixed proportions, for example 50% due to the new non-common field 103 and 50% due to the old non-common field 102.
In the following is an example where it is assumed that the entire change in temperature from field of view 4a to field of view 4b is due to the new non-common field 103. Say for example that the temperature of field of view 4a is 23° C. and that the temperature of field of view 4b is 21° C. The difference in temperature (−2° C.) will be attributed to the new non-common field 103. If we assume that the non-common field 103 has a size of 15% of field of view 4b, the change is −2° C./0.15=−13.3° C., i.e. the temperature value for the new non-common field 103 will be 23−13.3=9.7° C. Using data from more directions of sweep as described below increases the accuracy.
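The worked example above can be sketched in code as follows. This is an illustrative sketch only; the function name and parameters are assumptions, not part of the application. It implements the case where the entire temperature change is attributed to the new non-common field.

```python
# Illustrative sketch: attribute the whole temperature change between two
# overlapping fields of view to the new non-common field, scaling the change
# by the inverse of the non-common field's area share.

def non_common_field_temperature(t_old, t_new, area_proportion):
    """Estimate the temperature of the new non-common field.

    t_old, t_new    -- temperatures of the old and new fields of view (deg C)
    area_proportion -- area of the non-common field as a fraction of the
                       field of view, e.g. 0.15 for 15%
    """
    change = (t_new - t_old) / area_proportion
    return t_old + change

# Worked example from the text: 23 C -> 21 C with a 15% non-common field
print(round(non_common_field_temperature(23.0, 21.0, 0.15), 1))  # 9.7
```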
A method is shown in
The method is preferably applied to a series of overlapping fields of view 4a, 4b, 4c, 4d . . . as shown in
A larger area of interest 3 may be covered by sweeping across the area of interest 3 multiple times, each time with a suitable offset (
With reference to
In step 300, the field of view is swept over the area of interest 3 in a first direction 15a, resulting in a first temperature value for a first new non-common field 103a. This is done using the method above with reference to
The shared region 21 is determined by using the position information stored for each non-common area 103a, 103b. The starting position 14 may be used, for example, and all temperature values may be related in space to starting position 14.
Above it is described how new non-common areas 103a,b are used, but old non-common areas 102a,b may alternatively be used. Both new and old non-common areas may also be used as described above, where, for example, any temperature change is apportioned between old and new areas in a predetermined manner, for example 50/50.
In preferred embodiments values from three, four, five, six, seven, eight or more sweep directions 15 are used. Hence temperature values for three, four, five, six, seven, eight or more non-common fields 102, 103 may be used. Using temperature data from a plurality (such as two or more) of sweep directions increases the resolution of the thermal image 12. The temperature data from the plurality of sweep directions can be used in any suitable manner. In one embodiment the average of the temperature values from the different directions 15a, 15b is used. In one embodiment, where there are more than two sweep directions, the value that is furthest away from the average is used. In one embodiment the extreme value that is closest to the average is used (if the extreme values are 10 and 2 and the average is 5, 2 is chosen). In one embodiment the median value is used.
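The combination strategies above can be sketched as follows. This is an illustrative sketch only; the function and strategy names are assumptions, not part of the application.

```python
# Illustrative sketch: combine temperature values obtained for the same
# shared region from several sweep directions, using the strategies
# described above.
from statistics import mean, median

def combine_temperatures(values, strategy="average"):
    if strategy == "average":
        return mean(values)
    if strategy == "median":
        return median(values)
    if strategy == "extreme_closest_to_average":
        # of the minimum and maximum, pick the one closest to the average
        avg = mean(values)
        return min((min(values), max(values)), key=lambda v: abs(v - avg))
    raise ValueError(f"unknown strategy: {strategy}")

# Example from the text: extremes 10 and 2 with average 5 -> 2 is chosen
print(combine_temperatures([10, 3, 2], "extreme_closest_to_average"))  # 2
```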
Thus, for each pixel 13 in the image 12, the temperature value may be the result of using temperature data for a plurality of temperature values, for example the average of the plurality of temperature values.
The directions 15a, 15b of sweep of the two datasets are preferably separated by at least 45°. Preferably two directions of sweep are separated by 90°. In a preferred embodiment the field of view 4 is swept in at least four mutually different directions 15, resulting in four different sets of determined IR sensor values. The mutually different directions 15 are preferably separated by 90°. In an even more preferred embodiment eight different directions 15 separated by 45° are used. In one embodiment three different sweep directions 15 separated by 120° are used.
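The angular separation between two sweep directions can be checked as sketched below. This is an illustrative sketch under the assumption (not stated in the application) that sweep directions are represented as 2-D vectors in the sweep plane.

```python
# Illustrative sketch: angular separation in degrees between two sweep
# directions given as 2-D vectors in the sweep plane.
import math

def separation_degrees(d1, d2):
    dot = d1[0] * d2[0] + d1[1] * d2[1]
    n1, n2 = math.hypot(*d1), math.hypot(*d2)
    # clamp to [-1, 1] to guard against floating-point rounding
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

print(separation_degrees((1, 0), (0, 1)))  # 90.0
```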
The sweeps are preferably carried out in the same plane or planes that are parallel. The planes are preferably perpendicular to the direction of observation 5 of the field of view 4 at the starting point 14.
In one embodiment, shown in
However, random sweeping may be used as shown in
A user may be provided with instructions to sweep in different predefined directions as described above, such as for example, two directions separated by 90°. The instructions may be provided in any manner such as text, symbols, or sound such as synthetic speech.
In a preferred embodiment, instructions are provided on display 11. An example is shown in
In one embodiment, shown in
The user may also be prompted to sweep in a certain plane, which may be the plane perpendicular to the direction of observation 5 at the starting point 14 or the plane of the display 11 of the device 7 at the starting point 14. Commands such as “closer” and “further away” may be used on the display 11 to keep the device 7 in the plane while sweeping. The user may also be prompted to sweep with a certain speed. As mentioned above, the resolution will be better if the sweep speed is low. Hence the user may be prompted to increase or decrease the sweep speed as the case may be. Feedback may be provided in real time.
In one embodiment, the display 11 of the device 7 is used to display a thermal image 12 using the data for the thermal image processed as above. The method may thus comprise the step of displaying a thermal image 12 on the display 11 of the device 7 that comprises the IR sensor 2. The image 12 may be rendered on the display 11 in real time as the user sweeps across the area of interest 3. Hence, image data may be provided to display 11 for rendering in real time. This provides feedback to the user.
There is also provided imaging software implementing the methods described herein. The imaging software may be provided as an app to be downloaded to a portable device 7 with an IR sensor 2 and position determining means 6. The imaging software is preferably stored on memory 8. The imaging software may be configured to obtain IR sensor values and position and/or orientation data from position determining means 6. For example, the imaging software may repeatedly query one or more sensors, switch on one or more sensors, and provide a predetermined configuration to one or more sensors. Such configuration may for example be instructions to the IR sensor 2 or position determining means 6 regarding sampling frequencies. The imaging software may be configured to pair IR sensor data with data for position and/or orientation. The imaging software includes the area proportion determining means described above.
The imaging software is configured to compile data to produce thermal image data for a thermal image 12. Hence the imaging software is configured to determine non-common fields, and positions and orientations for these, and to compile temperature data from different directions as described herein.
The inventive method was simulated using digital thermal data of an area of interest showing a man and a dog, shown in the upper left panels of
Number | Date | Country | Kind
---|---|---|---
1950850-6 | Jul 2019 | SE | national
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/EP2020/068461 | 7/1/2020 | WO |