The present invention relates to the field of infrared cameras. More particularly, this invention relates to infrared cameras enabling an operator to capture a thermal image of an asset and the scene or area where the asset is located and then associate it with any auxiliary information relating to the asset or scene.
Thermal imaging cameras have long been used in many settings to measure temperature profiles and analyze thermal variations and differences of one or more assets in a scene, such as an electric motor in an area of a manufacturing plant. Such profiles can be used to compare the temperatures of different assets at any given time or, conversely, to track temperatures or thermal variations and differences over a period of time. For example, infrared cameras are frequently used in industrial applications as part of preventative maintenance programs and for building sciences and diagnostics inspections. These programs typically rely on periodic inspections of the assets of a plant, facility, or structure to discover likely failures before they occur or to monitor and document ongoing performance. For example, in an industrial setting, plant personnel often develop survey routes in order to routinely gather temperature data on the identified equipment. Similar examples exist in other applications, such as building sciences.
In an industrial setting example, after collecting a thermal baseline image for each piece of equipment along a route, a person (such as a thermographer, an inspector, a building performance professional, an electrician, a technician, etc.) can then identify changes in thermal characteristics over the course of several inspections by comparing later thermal images of the equipment with the baseline image or other prior images. Changes in thermal characteristics can in some cases indicate the imminent failure of a part, thus allowing the person to schedule maintenance or replacement prior to failure.
In a simple case, a person can visually compare thermal images captured at different times to determine changes in thermal characteristics over the time period. However, only so much information can be gleaned from thermal images alone. To aid in the analysis of the most recently captured image, or in comparing that image to previously captured images, infrared imaging cameras have been configured to allow operators to add notes or annotations to an image. These annotations are generally created as audio or text files and can subsequently be saved with the image to provide supplementary information that is difficult to ascertain from the image alone.
However, issues have been encountered in adding these annotations to the image. In particular, some cameras only enable annotations to be added at the time the thermal image is saved (with limited ability for later modification), while other cameras enable annotations to be added after the image is saved. As such, it is not uncommon to find operators of these cameras also carrying a notebook to jot down their impressions of the scene in the field. The operator then waits until later to combine these impressions into a single file that can be stored with the saved image of the scene. Unfortunately, this process is time-consuming and invites error, because what is observed and noted in the field may not be fully recalled, even with the use of such a notebook. Further, the operator may not remember which particular image the annotations correspond to, inviting further potential error.
Many of the above and other problems are solved by embodiments of an apparatus and methods shown in U.S. Pat. No. 9,176,990, assigned to the assignee of this patent and hereby incorporated by reference. The '990 patent shows a thermal imager for capturing a primary thermal image of an asset or scene and then associating it, as desired, with auxiliary information relating to the asset or scene. The auxiliary information can be associated with the primary thermal image in the field after the image is captured. The auxiliary information can pertain to further detail regarding the asset, the surroundings of the scene, and/or the surroundings of the location of the scene, which, when associated with the primary thermal image, collectively represents a form of asset information card for the image. The auxiliary information is generally in the form of images and video recordings, whether infrared and/or visible light, and can be associated with the primary thermal image in varying or distinct manners, such as based on the relatedness of the information to the asset or the scene. The infrared image is annotated with the other images and recordings in the thermal imager at or proximate the time of recording.
The infrared imager of the '990 patent has the added ability to capture still images, sound recordings, and video, and can thereby annotate its infrared images by capturing the infrared images and the annotations at approximately the same time. However, many users continue to use existing infrared imagers that have no ability to capture still images, video, or sound recordings. As such, there remains a problem with annotating thermal images with other images and recordings taken by video cameras, still image cameras, and sound recorders. Moreover, the infrared images often reside on a remote server and need only be annotated with images and recordings that can be captured by non-infrared equipment, such as the ubiquitous smartphone, which records video, audio, and still images. There remains a long-felt and unmet need for annotating infrared images with auxiliary information captured by non-infrared devices.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
The invention described and claimed herein has a number of embodiments and should not be interpreted or otherwise limited to the abbreviated discussion of some embodiments in this summary. The embodiments of the invention address the problem of how to collocate (group) information about an asset, such as a motor or machine, by using tags. The primary piece of information is a thermal image. The thermal image is annotated with auxiliary information, including but not limited to still digital images, sound recordings of notes spoken by an observer of the asset, and video images. Auxiliary information is captured with a smartphone, a digital camera, a digital recorder, or a digital video camera. Each capture device applies a tag to the captured information that relates the captured information to an asset.
There are numerous types of tags, including date/time stamps, serial numbers, bar and matrix codes, RFID codes, and geolocation (latitude and longitude) tags. The tagged information is processed in the capture device or is sent to another device, where the tags on the information are used to collocate information about the same asset into a common folder. The receiving device annotates a thermal image with icons that represent the types of auxiliary information in the folder.
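The tag-based grouping just described can be illustrated with a minimal sketch. The disclosure does not specify an implementation language or data format; the Python `Capture` structure, the tag names, and the `collocate_by_tag` function below are hypothetical stand-ins for whatever a capture device or receiving device actually records.

```python
from dataclasses import dataclass, field
from collections import defaultdict

@dataclass
class Capture:
    kind: str                                  # "thermal", "image", "audio", or "video"
    path: str                                  # where the captured file is stored
    tags: dict = field(default_factory=dict)   # e.g. {"serial": "M-1234", "timestamp": "..."}

def collocate_by_tag(captures, tag_name="serial"):
    """Group captures that carry the same value for `tag_name` into one
    'folder' (here simply a list) per asset; untagged captures are skipped."""
    folders = defaultdict(list)
    for capture in captures:
        value = capture.tags.get(tag_name)
        if value is not None:
            folders[value].append(capture)
    return dict(folders)
```

A folder produced this way holds a thermal image together with its auxiliary captures, from which icon annotations can be derived (for instance, one icon per `kind` present in the folder).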
The foregoing aspects and many of the attendant advantages of this invention will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
While illustrative embodiments have been illustrated and described, it will be appreciated that various changes can be made therein without departing from the spirit and scope of the invention.
Turning to
The ASD 30 may be a personal computer, a tablet computer, or a server. The ASD 30 has a processor 31 and a collocation program 33. The processor 31 operates the collocation program 33 to sort thermal images and auxiliary information with related tags into a common folder 40, providing a folder that holds the infrared image 35 and its auxiliary information. The processor also operates the collocation program to create icons 36-38 representative of the types of auxiliary information associated with the tag. Icons 36-38 are symbols representative of visual images 6, sound recordings 7, and video images 8, respectively.
Moving to
Other capture devices 12-14 include suitable apparatus and circuitry to apply one or more tags to captured auxiliary information, including images 6, sound recordings 7, and videos 8. The digital camera 12 and video camera 14 can apply date/time stamps to information and are inherently capable of capturing images of a serial number, bar code, or matrix (two-dimensional) code, which can later be deciphered as tags for asset 20. The digital recorder 13 may also attach a date/time stamp to a recording, and the user may make an audible record of a serial number that can also be deciphered later. The other capture devices 12-14 may also be equipped with or modified to include apparatus and circuitry to detect and append RFID and geolocation codes to auxiliary information. Each capture device may send its information via a wired or wireless communication path to the ASD 30.
The ASD 30 has a processor 31, a memory 32, and one or more computer programs 33 for collocating images. Memory 32 holds multiple folders 40.1, 40.2, 40.3, . . . 40.N of thermal images and auxiliary information. Each folder 40.N has an infrared image 35 of an asset 20, and the thermal image is annotated with icons 36-38 representing the types of auxiliary information in the folder. ASD 30 has suitable apparatus, methods, and computer programs, such as optical character recognition and speech recognition programs, to decipher and recognize serial numbers, bar codes, matrix codes, and audible notes corresponding to tags. The tag identification programs may operate independently or together to collocate received images and recordings. For example, audio recordings of spoken serial numbers are recognized by speech recognition programs. Such audio recordings may be processed together with date/time tag data to collocate the audio recording with a captured image having a similar date/time.
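The combined use of a deciphered serial-number tag and date/time data described above might look like the following sketch. This is an illustrative Python fragment only, with hypothetical data structures; the recognizer that turns a spoken or photographed serial number into text is represented solely by its already-deciphered output, and the one-hour window is an assumed parameter, not one the disclosure specifies.

```python
from datetime import timedelta

def collocate_recording(recognized_serial, recording_time, images,
                        window=timedelta(hours=1)):
    """Collocate an audio recording with a captured image: prefer images whose
    serial tag matches the recognized serial, then pick the one whose date/time
    tag is closest to the recording's, within the given window."""
    matches = [img for img in images if img.get("serial") == recognized_serial]
    if not matches:                       # fall back to date/time data alone
        matches = images
    nearby = [img for img in matches
              if abs(img["time"] - recording_time) <= window]
    if not nearby:
        return None                       # nothing close enough to collocate
    return min(nearby, key=lambda img: abs(img["time"] - recording_time))
```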
ASD 30 has a display 34 for displaying an infrared image 35 of the motor 20 or other asset with one or more icons 36, 37, 38 at the bottom of the image 35. The icons 36-38 refer to still images, audio recordings, and video images, respectively, of the motor 20 and its relevant area or environment 21. A user operating an input device 39 may access any folder 40.N and display it on display 34. By clicking one of the icons 36-38, the user may display still or video images or listen to comments in the audio annotations. The input device 39 is any suitable device, including one or more of a keyboard, mouse, track ball, track pad, and touch screen on the display 34, or combinations thereof.
The specific embodiment of
The tablet computer/ASD 30 wirelessly receives the images and sound recordings from the infrared camera 10 and the smartphone 11. The tablet computer receives the thermal image 35 and stores it in a folder 40.3. The ASD 30 has a collocation application that reads the date/time stamps of the images and recordings received from the smartphone 11, collocates those images and sound recordings into folder 40.3, and adds icons 36-38 to the thermal image 35. The collocation program matches the images and sound recordings from the smartphone 11 to the thermal image 35 because they have date/time tags of the same day and close to the same time as the thermal image 35. In some embodiments, the collocation program may ask the user to accept or decline the prospective collocation of images and sound recordings with a given thermal image. Upon accessing folder 40.3, the user sees the thermal image 35 of the motor 20 and the icons 36-38. By operating a user interface (not shown) of ASD 30, the user may select one of the icons 36-38 to view the still or video images or listen to the sound recordings.
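The same-day, close-in-time matching with an optional accept/decline step might be sketched as follows. This is a hedged Python illustration only; the 15-minute window, the folder representation, and the `accept` callback (standing in for the user prompt) are hypothetical choices, as the disclosure does not fix these details.

```python
from datetime import timedelta

def propose_collocations(thermal_time, auxiliary, window=timedelta(minutes=15)):
    """Select auxiliary captures whose date/time tag falls on the same day as,
    and within `window` of, the thermal image's date/time tag."""
    return [aux for aux in auxiliary
            if aux["time"].date() == thermal_time.date()
            and abs(aux["time"] - thermal_time) <= window]

def collocate_with_confirmation(thermal, auxiliary, accept=lambda aux: True):
    """Build a folder from accepted proposals; `accept` models the optional
    prompt asking the user to confirm or decline each prospective collocation."""
    folder = {"thermal": thermal, "aux": []}
    for aux in propose_collocations(thermal["time"], auxiliary):
        if accept(aux):
            folder["aux"].append(aux)
    return folder
```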
While some embodiments of the invention have been shown and described, modifications and variations may be made thereto by those of ordinary skill in the art without departing from the spirit and scope of the present invention. For example, each of the capture devices 10-14 may be constructed or modified to have a collocation program and other programs for deciphering bar codes embedded in images and recognizing spoken serial numbers in sound recordings. Each of the capture devices may send its information to any of the other devices or to a personal computer, tablet computer, or cloud storage facility. In this regard, the personal computer, tablet computer, and cloud storage facility may have apparatus, circuitry, and programs for collocating tagged information into folders and for deciphering codes embedded in images and recordings.
It should be understood that aspects of the various embodiments may be interchanged either in whole or in part. Furthermore, those of ordinary skill in the art will appreciate that the foregoing description is by way of example only, and is not intended to limit the invention, except as further described in the appended claims. Those skilled in the art will understand that other and equivalent components and steps may be used to achieve substantially the same results in substantially the same way as that described and claimed.
This application claims the benefit of Provisional Application No. 61/801,380, filed Mar. 15, 2013, and Provisional Application No. 61/876,719, filed Sep. 11, 2013, the disclosures of which are hereby incorporated by reference herein.
This application was published as US 2014/0267765 A1 in September 2014.