One or more embodiments of the invention relate generally to infrared imaging devices and more particularly, for example, to infrared imaging devices for portable equipment and to systems and methods for multi-spectrum imaging using infrared imaging devices.
Various types of portable electronic devices, such as smart phones, cell phones, tablet devices, portable media players, portable game devices, digital cameras, and laptop computers, are in widespread use. These devices typically include a visible-light image sensor or camera that allows users to take a still picture or a video clip. One of the reasons for the increasing popularity of such embedded cameras may be the ubiquitous nature of mobile phones and other portable electronic devices. That is, because users may already be carrying mobile phones and other portable electronic devices, such embedded cameras are always at hand when users need one. Another reason for the increasing popularity may be the increasing processing power, storage capacity, and/or display capability that allow sufficiently fast capturing, processing, and storage of large, high quality images using mobile phones and other portable electronic devices.
However, image sensors used in these portable electronic devices are typically CCD-based or CMOS-based sensors limited to capturing visible light images. As such, these sensors may at best detect only a very limited range of visible light or wavelengths close to visible light (e.g., near infrared light when objects are actively illuminated with light in the near infrared spectrum). As a result, there is a need for techniques to provide infrared imaging capability in a portable electronic device form factor.
Various techniques are disclosed for providing a device attachment configured to releasably attach to and provide infrared imaging functionality to mobile phones or other portable electronic devices. For example, a device attachment may include a housing with a partial enclosure (e.g., a tub or cutout) on a rear surface thereof shaped to at least partially receive a user device, a multi-wavelength image sensor assembly disposed within the housing and configured to capture infrared image data and visible light image data, and a processing module communicatively coupled to the multi-wavelength sensor assembly and configured to transmit the infrared image data and/or the visible light image data to the user device.
The device attachment may be configured to cooperate with one or more components of an attached device such as a smartphone to capture and/or process image data. For example, an additional visible light camera on a smart phone attached to the device attachment may be used to capture additional visible light images that can be used, together with visible light images captured using a visible light image sensor in the device attachment, to measure distances to objects in a scene using the parallax of the objects between the two visible light image sensors. The measured distances can be used to align or otherwise combine infrared images from the infrared image sensor with the visible light images from the visible light imaging module. As another example, a light source in a smart phone attached to the device attachment may be operated to illuminate some or all of a scene to be imaged by imaging modules in the device attachment for use in combining infrared and visible light images.
A timer may be used to determine when a thermal imaging module in the device attachment can be used for determining calibrated temperatures of imaged objects.
The scope of the invention is defined by the claims, which are incorporated into this section by reference. A more complete understanding of embodiments of the invention will be afforded to those skilled in the art, as well as a realization of additional advantages thereof, by a consideration of the following detailed description of one or more embodiments. Reference will be made to the appended sheets of drawings that will first be described briefly.
Embodiments of the invention and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures.
Referring now to the figures, device attachment 1250 may be configured to receive a portable electronic device such as user device 1200. As shown in the figures, device attachment 1250 may include a shutter member 250 and imaging components such as an infrared imaging module 7000 and a non-thermal imaging module 7002.
In some embodiments, shutter member 250 may be used, for example, to protect imaging components 7000 and 7002 when not in use. Shutter 250 may also be used as a temperature reference as part of a calibration process (e.g., a non-uniformity correction (NUC) process as described in U.S. patent application Ser. No. 14/099,818 filed Dec. 6, 2013 which is incorporated by reference herein in its entirety, a radiometric calibration process, and/or other calibration processes) for infrared imaging module 7000 as would be understood by one skilled in the art. Device attachment 1250 may include a front portion 7007 and a rear portion 7009. Front portion 7007 may be formed from a housing that encloses functional components of the device attachment such as a battery, connectors, imaging components, processors, memory, communications components, and/or other components of a device attachment as described herein. Rear portion 7009 may be a structural housing portion having a shape that forms a recess into which user device 1200 is configured to be releasably attached.
The device connector may be implemented according to the connector specification associated with the type of user device 1200. For example, the device connector may implement a proprietary connector (e.g., an Apple® dock connector for iPod™ and iPhone™ such as a “Lightning” connector, a 30-pin connector, or others) or a standardized connector (e.g., various versions of Universal Serial Bus (USB) connectors, Portable Digital Media Interface (PDMI), or other standard connectors as provided in user devices).
In one embodiment, the device connector may be interchangeably provided, so that device attachment 1250 may accommodate different types of user devices that accept different device connectors. For example, various types of device connector plugs may be provided and configured to be attached to a base connector of device attachment 1250, so that a connector plug that is compatible with user device 1200 can be attached to the base connector before attaching device attachment 1250 to user device 1200. In another embodiment, the device connector may be fixedly provided.
Device attachment 1250 may also communicate with user device 1200 via a wireless connection. In this regard, device attachment 1250 may include a wireless communication module configured to facilitate wireless communication between user device 1200 and device attachment 1250. In various embodiments, a wireless communication module may support the IEEE 802.11 WiFi standards, the Bluetooth™ standard, the ZigBee™ standard, or other appropriate short range wireless communication standards. Thus, device attachment 1250 may be used with user device 1200 without relying on the device connector, if a connection through the device connector is not available or not desired.
Infrared imaging module 7000 may be implemented, for one or more embodiments, with a small form factor and in accordance with wafer level packaging techniques or other packaging techniques. Infrared imaging module 7000 may include a lens barrel, a housing, an infrared sensor assembly, a circuit board, a base, and a processing module.
An infrared sensor assembly may include a plurality of infrared sensors (e.g., infrared detectors) implemented in an array or other fashion on a substrate and covered by a cap. For example, in one embodiment, an infrared sensor assembly may be implemented as a focal plane array (FPA). Such a focal plane array may be implemented, for example, as a vacuum package assembly. In one embodiment, an infrared sensor assembly may be implemented as a wafer level package (e.g., singulated from a set of vacuum package assemblies provided on a wafer). In one embodiment, an infrared sensor assembly may be implemented to operate using a power supply of approximately 2.4 volts, 2.5 volts, 2.8 volts, or similar voltages.
Infrared sensors in infrared imaging module 7000 may be configured to detect infrared radiation (e.g., infrared energy) from a target scene including, for example, mid wave infrared wave bands (MWIR), long wave infrared wave bands (LWIR), and/or other thermal imaging bands as may be desired in particular implementations. Infrared sensors may be implemented, for example, as microbolometers or other types of thermal imaging infrared sensors arranged in any desired array pattern to provide a plurality of pixels.
User device 1200 may be any type of portable electronic device that may be configured to communicate with device attachment 1250 to receive infrared images captured by infrared imaging module 7000 and/or non-thermal images such as visible light images from non-thermal imaging module 7002.
Infrared image data captured by infrared imaging module 7000 and/or non-thermal image data such as visible light image data captured by non-thermal imaging module 7002 may be provided to a processing module of device attachment 1250 and/or device 1200 for further processing.
The processing module may be configured to perform appropriate processing of captured infrared image data, and transmit raw and/or processed infrared image data to user device 1200. For example, when device attachment 1250 is attached to user device 1200, a processing module may transmit raw and/or processed infrared image data to user device 1200 via a wired device connector or wirelessly via appropriate wireless components further described herein. Thus, for example, user device 1200 may be appropriately configured to receive the infrared image data (e.g., thermal image data) and/or non-thermal image data from device attachment 1250 to display user-viewable infrared images (e.g., thermograms) to users on display 201 and permit users to store infrared image data, non-thermal image data, multi-wavelength image data, and/or user-viewable infrared images. That is, user device 1200 may be configured to run appropriate software instructions (e.g., a smart phone “app”) to function as an infrared camera that permits users to frame and take infrared, non-infrared, and/or combined still images, videos, or both. Device attachment 1250 and user device 1200 may be configured to perform other infrared imaging functionalities, such as storing and/or analyzing thermographic data (e.g., temperature information) contained within infrared image data.
Device attachment 1250 may also include a battery 1208 that may be used to power the components of device attachment 1250.
In some embodiments, a non-thermal camera module 101 of device 1200 may be used together with non-thermal camera module 7002 of device attachment 1250. When blending infrared (e.g., thermal) and non-thermal (e.g., visible) video images, the two images may be mapped to each other pixel by pixel. Differences between the two cameras (e.g., distortion, parallax, pointing angle, etc.) can be compensated. Imaging modules 7000 and 7002 may be mounted close to each other to reduce parallax differences between images captured with the imaging modules. In order to provide corrections for any remaining parallax differences, particularly for very nearby objects in an image, non-thermal camera 101 in the device 1200 can be used in conjunction with non-thermal camera module 7002 to determine the distance to the objects in a scene. The determined distance can then be used to adjust the alignment of infrared (e.g., thermal) and non-thermal (e.g., visible) video images even at variable scene distances.
As shown in the figures, images captured using non-thermal camera 101 of device 1200 and non-thermal imaging module 7002 of device attachment 1250 may be provided to processing circuitry such as a distance measure engine 301 that measures the distance to one or more objects in a scene using the parallax between the two images. The measured distance, the non-thermal image captured by non-thermal imaging module 7002, and a thermal image (e.g., an infrared image) from thermal imaging module 7000 can be provided to processing circuitry such as merge engine 303. Merge engine 303 can use the measured distance to correct any remaining parallax differences between the thermal image and the non-thermal image so that the thermal image and the non-thermal image can be combined and provided to display 201 for display to a user. Distance measure engine 301 and merge engine 303 may represent algorithms performed by a logic device (e.g., a programmable logic device or microprocessor).
At block 400, a first non-thermal image may be captured using a non-thermal image sensor in the device attachment and, optionally, a thermal image may be captured using a thermal image sensor in the device attachment.
At block 402, a second non-thermal image may be captured using a non-thermal image sensor in a device camera.
At block 404, a distance may be determined to a scene object using the first and second non-thermal images (e.g., by determining a parallax-induced shift of the object in the first and second non-thermal images and triangulating the distance to the object using the determined shift and the known relative locations of the non-thermal image sensor in the device attachment and the non-thermal image sensor in the device). The known relative locations may be determined based on the known positions of the non-thermal image sensors in each respective device and the known position of the device within the device attachment and/or based on a calibration operation performed by capturing an image of an object at a known distance using both of the non-thermal image sensors and determining the relative locations of the non-thermal image sensors using the images of the object and the known distance.
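The following is a minimal sketch of the triangulation described in block 404, assuming a rectified horizontal stereo pair; the function name, baseline, focal length, and pixel coordinates are illustrative values rather than parameters from any particular embodiment.

```python
def distance_from_parallax(x_attachment_px: float,
                           x_device_px: float,
                           baseline_m: float,
                           focal_px: float) -> float:
    """Estimate the distance (meters) to an object from its
    parallax-induced horizontal shift between the two non-thermal images."""
    disparity_px = abs(x_attachment_px - x_device_px)
    if disparity_px == 0.0:
        raise ValueError("zero disparity: object is effectively at infinity")
    # Classic pinhole stereo relation: Z = f * B / d
    return focal_px * baseline_m / disparity_px

# Example: a 4 px shift with a 5 cm baseline and a 1000 px focal length
# yields 1000 * 0.05 / 4 = 12.5 m.
print(distance_from_parallax(512.0, 516.0, baseline_m=0.05, focal_px=1000.0))
```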
In some embodiments, the capturing of the non-thermal images may be controlled to improve accuracy of determining a parallax-induced shift between the first and second non-thermal images, which in turn would improve the accuracy of the determined distance and the parallax correction. For example, if the first and second non-thermal images are captured while the user and/or the object are in motion, the accuracy of determining a parallax-induced shift may be affected due to a shift and/or blurring of the objects in the images caused by the motion. Such a motion-induced shift may occur, for example, if the timing of the capturing by the first and second non-thermal image sensors is not adequately synchronized.
Thus, in some embodiments, the capture operations of the two non-thermal image sensors may be synchronized, or the first and second non-thermal images may otherwise be captured sufficiently close in time, to reduce such motion-induced shifts and thereby improve the accuracy of the determined parallax and distance.
At block 406, the thermal image and the first non-thermal image may be combined using the determined distance to the object (e.g., by performing a parallax correction between the thermal image and the first non-thermal image using the determined distance).
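As a minimal sketch of the parallax correction in block 406, assuming registered same-size grayscale arrays, a horizontal-only offset, and assumed (not source-specified) baseline and focal-length values for the thermal/non-thermal pair:

```python
import numpy as np

def parallax_corrected_blend(thermal: np.ndarray,
                             non_thermal: np.ndarray,
                             distance_m: float,
                             baseline_m: float = 0.01,
                             focal_px: float = 1000.0,
                             alpha: float = 0.5) -> np.ndarray:
    """Shift the thermal image by the parallax predicted for the measured
    distance, then alpha-blend it with the non-thermal image."""
    # Predicted horizontal parallax (pixels) at the measured distance
    shift_px = int(round(focal_px * baseline_m / distance_m))
    aligned = np.roll(thermal, -shift_px, axis=1)  # crude shift; edges wrap
    return alpha * aligned + (1.0 - alpha) * non_thermal
```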
In order to improve the parallax corrections determined using the non-thermal camera module in the device attachment and the non-thermal camera module in the device, any distortion and alignment error between the two non-thermal camera modules can be calibrated. For example, an image may be captured of an object such as a hand in front of a thermally and visually uniform background using the non-thermal camera module in the device attachment, the thermal imaging module in the device attachment, and the non-thermal camera module in the device. Processing circuitry (e.g., a smartphone app running on the device processor) can be used to match the edges of the hand in all three images and correlate the alignment between the two non-thermal camera modules to a factory-calibrated alignment between the non-thermal camera module in the device attachment and the thermal imaging module in the device attachment.
At block 500, images may be captured of an object (e.g., a hand) at a common time using each of a thermal image sensor in a device attachment, a non-thermal image sensor in the device attachment, and a non-thermal image sensor in an attached device.
At block 502, edges of the object in each captured image may be detected.
At block 504, alignment and distortion corrections between the non-thermal image sensor in the device attachment and the non-thermal image sensor in the attached device may be determined based on the locations in the images of the detected edges.
At block 506, the alignment and distortion corrections may be stored (e.g., in the device attachment or the device) for use in distance measurements for parallax corrections between images captured using the thermal image sensor and the non-thermal image sensor in the device attachment.
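One hedged way to realize blocks 502-506 (not necessarily the calibration used in any embodiment) is to extract edge maps of the imaged object and estimate the translation between them by phase correlation; OpenCV is assumed available, and a full calibration would also model rotation and lens distortion.

```python
import cv2
import numpy as np

def edge_alignment_offset(img_a: np.ndarray, img_b: np.ndarray):
    """Blocks 502-504: detect object edges in two same-size grayscale
    images and estimate the (dx, dy) shift between them."""
    edges_a = cv2.Canny(img_a, 50, 150).astype(np.float32)
    edges_b = cv2.Canny(img_b, 50, 150).astype(np.float32)
    (dx, dy), _response = cv2.phaseCorrelate(edges_a, edges_b)
    return dx, dy  # persist per block 506 for later parallax corrections
```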
While various embodiments illustrated above relate to aligning and combining thermal and non-thermal images, device attachment 1250 may also provide radiometric functionality such as image-based temperature measurement.
In some embodiments, thermal imaging module 7000 may be used to determine an image-based calibrated temperature of an object (e.g., by capturing one or more calibrated thermal images and determining, from the intensity and/or spectrum of the object in the thermal images, the temperature of the object as would be understood by one skilled in the art). The accuracy of this type of image-based temperature measurement can be improved by ensuring that the thermal imaging module has been recently calibrated when an image-based temperature measurement is to be made.
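As a minimal sketch of such an image-based temperature estimate, assuming a simple linear radiometric calibration (the gain and offset below are placeholders, not values from any embodiment):

```python
def pixel_to_celsius(counts: float,
                     gain: float = 0.01,
                     offset: float = -273.15) -> float:
    """Map a radiometrically calibrated pixel count to degrees Celsius
    under an assumed linear model T = gain * counts + offset."""
    return gain * counts + offset

# Example: with a gain of 0.01 K/count, a reading of 30000 counts maps
# to 300.0 K, i.e., about 26.85 degrees Celsius.
print(pixel_to_celsius(30000.0))
```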
At block 600, a system that includes a device attachment having a thermal image sensor and an attached device may perform a calibration of the thermal image sensor using a closed shutter (e.g., by closing the shutter and capturing one or more images of the shutter using the thermal image sensor).
At block 602, the system may monitor the time since the last calibration of the thermal image sensor (e.g., by a processor in the device attachment or a processor in an attached device).
At block 604, the system may receive a request for an image-based temperature determination from a user.
At block 606, the system may determine whether the time since calibration is less than a maximum allowable time using the monitored time. The maximum allowable time may be, as examples, less than 10 seconds, less than 20 seconds, less than 30 seconds, or less than one minute since the last calibration. In response to determining that the time since the last calibration is less than the maximum allowable time, the system may proceed to block 608.
At block 608, one or more thermal images and/or an infrared spectrum of an object may be captured.
At block 610, the system may determine the temperature of the object from the thermal images and/or the infrared spectrum.
If it is determined at block 606 that the time since the last calibration is greater than the maximum allowable time, the system may proceed to block 612.
At block 612, the system may instruct the user to perform a new calibration of the thermal imaging module using the closed shutter to ensure that the subsequent temperature measurement is accurate.
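The timer logic of blocks 600-612 could be sketched as follows, assuming a monotonic clock and the 30-second example limit from block 606; the names are illustrative.

```python
import time

MAX_CALIBRATION_AGE_S = 30.0  # one of the example limits in block 606
_last_calibration = None      # monotonic timestamp of last calibration

def record_shutter_calibration() -> None:
    """Block 600: note the time of a closed-shutter calibration."""
    global _last_calibration
    _last_calibration = time.monotonic()

def temperature_measurement_allowed() -> bool:
    """Blocks 602-606: allow a measurement only if the last calibration
    is recent enough; otherwise prompt a recalibration (block 612)."""
    if _last_calibration is None:
        return False
    return (time.monotonic() - _last_calibration) < MAX_CALIBRATION_AGE_S
```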
In some embodiments, a light source in a portable electronic device that is attached to a device attachment having a thermal imaging module may be used in cooperation with the thermal imaging module and a non-thermal imaging module to enhance imaging of a scene. For example, light source 103 of device 1200 may be operated to illuminate some or all of a scene to be imaged by imaging modules 7000 and 7002 of device attachment 1250.
At step 800, thermal image data may be captured using a thermal image sensor in a device attachment and non-thermal image data may be captured using a non-thermal image sensor in the device attachment. If desired, additional non-thermal image data may be captured using a camera in an attached device.
At step 802, while capturing the thermal image data and the non-thermal image data using the device attachment, a light source of an attached device may be operated. The light source may be operated (e.g., flashed or held on) during image capture operations based on, for example, user input and/or automatically determined light levels. Illuminating the scene using the light source may enhance the non-thermal images captured by the non-thermal image sensor in the device attachment.
At step 804, the captured thermal image data and the captured non-thermal image data from the device attachment may be combined to form an enhanced output image that includes some or all of the thermal image data and actively illuminated non-thermal image data. In some embodiments, thermal and non-thermal images may be processed to generate combined images using high contrast processing.
Regarding high contrast processing, high spatial frequency content may be obtained from one or more of the thermal and non-thermal images (e.g., by performing high pass filtering, difference imaging, and/or other techniques). A combined image may include a radiometric component of a thermal image and a blended component including infrared (e.g., thermal) characteristics of a scene blended with the high spatial frequency content, according to a blending parameter, which may be adjustable by a user and/or machine in some embodiments. In some embodiments, high spatial frequency content from non-thermal images may be blended with thermal images by superimposing the high spatial frequency content onto the thermal images, where the high spatial frequency content replaces or overwrites those portions of the thermal images corresponding to where the high spatial frequency content exists. For example, the high spatial frequency content may include edges of objects depicted in images of a scene, but may not exist within the interior of such objects. In such embodiments, blended image data may simply include the high spatial frequency content, which may subsequently be encoded into one or more components of combined images.
For example, a radiometric component of a thermal image may be a chrominance component of the thermal image, and the high spatial frequency content may be derived from the luminance and/or chrominance components of a non-thermal image. In this embodiment, a combined image may include the radiometric component (e.g., the chrominance component of the thermal image) encoded into a chrominance component of the combined image and the high spatial frequency content directly encoded (e.g., as blended image data but with no thermal image contribution) into a luminance component of the combined image. By doing so, a radiometric calibration of the radiometric component of the thermal image may be retained. In similar embodiments, blended image data may include the high spatial frequency content added to a luminance component of the thermal images, and the resulting blended data encoded into a luminance component of resulting combined images. The non-thermal image may be from any type of non-thermal imager, including for example a visible light imager, a low light visible light imager, a CCD imaging device, an EMCCD imaging device, a CMOS imaging device, an sCMOS imaging device, an NIR imaging device, a SWIR imaging device, or other types of non-thermal imagers (e.g., including passive or active illumination as would be understood by one skilled in the art).
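A minimal sketch of this luminance/chrominance encoding, assuming OpenCV, 8-bit inputs already registered to the same size, and the variant in which high-frequency visible detail is added to the thermal luminance; the blur kernel and clipping are illustrative choices:

```python
import cv2
import numpy as np

def high_contrast_fuse(thermal_bgr: np.ndarray, visible_bgr: np.ndarray) -> np.ndarray:
    """Keep the thermal image's chrominance (its radiometric component)
    and add visible high spatial frequency content to its luminance."""
    vis_y = cv2.cvtColor(visible_bgr, cv2.COLOR_BGR2YCrCb)[:, :, 0].astype(np.float32)
    # High spatial frequency content = image minus its low-pass version
    high_pass = vis_y - cv2.GaussianBlur(vis_y, (9, 9), 0)
    fused = cv2.cvtColor(thermal_bgr, cv2.COLOR_BGR2YCrCb).astype(np.float32)
    fused[:, :, 0] = np.clip(fused[:, :, 0] + high_pass, 0, 255)
    return cv2.cvtColor(fused.astype(np.uint8), cv2.COLOR_YCrCb2BGR)
```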
For example, any of the techniques disclosed in the following applications may be used in various embodiments: U.S. patent application Ser. No. 12/477,828 filed Jun. 3, 2009; U.S. patent application Ser. No. 12/766,739 filed Apr. 23, 2010; U.S. patent application Ser. No. 13/105,765 filed May 11, 2011; U.S. patent application Ser. No. 13/437,645 filed Apr. 2, 2012; U.S. Provisional Patent Application No. 61/473,207 filed Apr. 8, 2011; U.S. Provisional Patent Application No. 61/746,069 filed Dec. 26, 2012; U.S. Provisional Patent Application No. 61/746,074 filed Dec. 26, 2012; U.S. Provisional Patent Application No. 61/748,018 filed Dec. 31, 2012; U.S. Provisional Patent Application No. 61/792,582 filed Mar. 15, 2013; U.S. Provisional Patent Application No. 61/793,952 filed Mar. 15, 2013; and International Patent Application No. PCT/EP2011/056432 filed Apr. 21, 2011, all of which are incorporated herein by reference in their entirety. Any of the techniques described herein, or described in other applications or patents referenced herein, may be applied to any of the various thermal devices, non-thermal devices, and uses described herein.
In some embodiments, any one of device attachment 1250 or device 1200 may be configured to receive user input indicating a portion-of-interest to be imaged by a first imaging module (e.g., infrared imaging module 7000), control the light source 103 to illuminate at least the portion-of-interest in a spectrum sensed by a second imaging module (e.g., visible spectrum imaging module 7002 and/or 101), receive illuminated captured images of the portion-of-interest from the second imaging module, and generate a combined image comprising illuminated characteristics of the scene derived from the illuminated captured images. In some embodiments, a thermal image may be used to detect a “hot” spot in an image, such as an image of a circuit breaker box. Light source 103 may be used to illuminate a label of a circuit breaker to provide a better image and potentially pinpoint the cause of the hot spot.
In some embodiments, any portion of process 5800 may be implemented in a loop so as to continuously operate on a series of infrared and/or visible spectrum images, such as a video of a scene. In other embodiments, process 5800 may be implemented in a partial feedback loop including display of intermediary processing (e.g., after or while receiving infrared and/or visible spectrum images, registering images to each other, generating illuminated and/or combined images, or performing other processing of process 5800) to a user, for example, and/or including receiving user input, such as user input directed to any intermediary processing step. Further, in some embodiments, process 5800 may include one or more steps, sub-steps, sub-processes, or blocks of any of the other processes described herein.
At block 5810, device attachment 1250 generates visible spectrum images of a scene. For example, imaging module 7002 may be configured to generate one or more visible spectrum images of a scene. In some embodiments, block 5810 may include one or more operations of the other processes described herein.
At block 5812, optionally at the same time as block 5810, device attachment 1250 generates infrared images of the scene. For example, imaging module 7000 may be configured to generate one or more infrared images of the scene. In some embodiments, block 5812 may include one or more operations of the other processes described herein.
At block 5820, device attachment 1250 produces an output signal of data corresponding to the generated images. For example, any one of imaging modules 7000 or 7002 and/or a processor may be adapted to produce an output signal of data corresponding to the images generated in blocks 5810 and 5812. In some embodiments, the output signal may adhere to a particular interface standard, for example, such as MIPI®.
At block 5830, device attachment 1250 and/or device 1200 stores the data according to a common data format. For example, the data may be stored in a desired data file according to a common data format.
At block 5840, device attachment 1250 and/or device 1200 registers the images to each other. For example, device attachment 1250 and/or device 1200 may be adapted to register any one of the generated images to another one of the generated images by performing one or more of interpolation, scaling, cropping, rotational transformation, morphing, and/or filtering operations on one or more of the images to substantially match spatial content within the images. In some embodiments, device attachment 1250 and/or device 1200 may be adapted to register images to each other using one or more of the processes described herein.
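A hedged sketch of the registration in block 5840, assuming OpenCV and a known similarity transform (scale, rotation, translation) between the two views; the transform parameters would come from a calibration and are placeholders here.

```python
import cv2
import numpy as np

def register_to(src: np.ndarray, dst_shape, scale: float = 1.0,
                angle_deg: float = 0.0, tx: float = 0.0, ty: float = 0.0) -> np.ndarray:
    """Scale, rotate, translate, and crop src so its spatial content
    substantially matches a destination image of shape dst_shape."""
    h, w = dst_shape[:2]
    center = (src.shape[1] / 2.0, src.shape[0] / 2.0)
    m = cv2.getRotationMatrix2D(center, angle_deg, scale)
    m[0, 2] += tx  # fold the translation into the affine matrix
    m[1, 2] += ty
    return cv2.warpAffine(src, m, (w, h), flags=cv2.INTER_LINEAR)
```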
At block 5850, device attachment 1250 and/or device 1200 receives user input indicating a portion-of-interest of the scene. For example, device attachment 1250 and/or device 1200 may be adapted to receive user input provided by one or more other components, a touchscreen display, and/or other devices indicating a portion-of-interest of the already imaged scene. The user input may be used to designate a pixel or group of pixels corresponding to the portion-of-interest. In some embodiments, the user input may be combined with the selection of registration operations performed in block 5840 to determine corresponding pixels in a variety of captured images.
At block 5852, a light source such as light source 103 of device 1200 illuminates the portion of interest. For example, any one of device attachment 1250 and/or device 1200 may be adapted to control light source 103 to illuminate all or a designated portion of interest in a particular scene. In some embodiments, a particular spectrum and/or portion of a scene may be selected by controlling a MEMS lens and/or other system coupled or otherwise associated with an illumination module.
At block 5854, device attachment 1250 and/or device 1200 generates illuminated images of the portion-of-interest. For example, any one of imaging modules 7000, 7002, or 101 sensitive to the spectrum illuminated in block 5852 may be adapted to generate an illuminated image that is captured while light source 103 is illuminating at least the portion-of-interest designated in block 5850.
At block 5860, device attachment 1250 and/or device 1200 generates combined images of the scene from the visible spectrum images, the infrared images, and/or the illuminated images. In one embodiment, a combined image may include a visible spectrum image with embedded data corresponding to infrared image data for each pixel of visible spectrum data. When such a combined image is displayed, a user may select a pixel or group of pixels with a user interface and text corresponding to the infrared image data may be displayed alongside the visible spectrum image, such as in a text box or legend, for example. In some embodiments, any one of imaging modules 7000, 7002, and/or 101 may be adapted to generate combined images using one or more of the processes described herein.
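One possible representation of such a combined image, sketched under the assumption that per-pixel infrared data is simply kept alongside the visible image; the class and method names are hypothetical, not a format from the source.

```python
import numpy as np

class CombinedImage:
    """Visible image with co-registered per-pixel infrared data, so a
    selected pixel can be reported as legend text (block 5860)."""

    def __init__(self, visible_rgb: np.ndarray, infrared: np.ndarray):
        assert visible_rgb.shape[:2] == infrared.shape[:2], "must be registered"
        self.visible_rgb = visible_rgb  # H x W x 3 array for display
        self.infrared = infrared        # H x W per-pixel infrared values

    def legend_text(self, row: int, col: int) -> str:
        # Text shown alongside the visible image when a pixel is selected
        return f"pixel ({row}, {col}): {self.infrared[row, col]:.1f}"
```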
At block 5870, device 1200 displays one or more of the generated images. For example, device 1200 may be adapted to use a display (e.g., display 201) to display one or more of the generated visible spectrum, infrared, illuminated, and/or combined images to a user.
While various embodiments illustrated herein are described in relation to a device attachment, it should be understood that one or more embodiments of the invention are also applicable to the device alone or in conjunction with the device attachment. For example, the thermal image sensor may be implemented directly in the device (e.g., device 1200), and optionally the additional non-thermal image sensor may also be implemented within the device. Consequently, the principles taught herein may be applied based on the sensors implemented within the device.
While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.
This application is a continuation of International Patent Application No. PCT/US2014/073096 filed Dec. 31, 2014 and entitled “DEVICE ATTACHMENT WITH DUAL BAND IMAGING SENSOR” which is incorporated herein by reference in its entirety. International Patent Application No. PCT/US2014/073096 claims the benefit of U.S. Provisional Patent Application No. 61/923,732 filed Jan. 5, 2014 and entitled “DEVICE ATTACHMENT WITH DUAL BAND IMAGING SENSOR” which is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 14/281,883 filed May 19, 2014 and entitled “DEVICE ATTACHMENT WITH INFRARED IMAGING SENSOR” is hereby incorporated by reference in its entirety. International Patent Application No. PCT/US2013/062433 filed Sep. 27, 2013 and entitled “DEVICE ATTACHMENT WITH INFRARED IMAGING SENSOR” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/880,827 filed Sep. 20, 2013 and entitled “DEVICE ATTACHMENT WITH INFRARED IMAGING SENSOR” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 13/901,428 filed May 23, 2013 and entitled “DEVICE ATTACHMENT WITH INFRARED IMAGING SENSOR” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/652,075 filed May 25, 2012 and entitled “DEVICE ATTACHMENT WITH INFRARED IMAGING SENSOR” is hereby incorporated by reference in its entirety. U.S. Design Pat. application No. 29/423,027 filed May 25, 2012 and entitled “DEVICE ATTACHMENT WITH CAMERA” is hereby incorporated by reference in its entirety. International Patent Application No. PCT/US2013/078551 filed Dec. 31, 2013 and entitled “INFRARED IMAGING DEVICE HAVING A SHUTTER” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/747,789 filed Dec. 31, 2012 and entitled “INFRARED IMAGING DEVICE HAVING A SHUTTER” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 13/966,052 filed Aug. 13, 2013 and entitled “INFRARED CAMERA SYSTEM HOUSING WITH METALIZED SURFACE” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/683,124 filed Aug. 14, 2012 and entitled “INFRARED CAMERA SYSTEM HOUSING WITH METALIZED SURFACE” is hereby incorporated by reference in its entirety. International Patent Application No. PCT/US2014/059200 filed Oct. 3, 2014 and entitled “DURABLE COMPACT MULTISENSOR OBSERVATION DEVICES” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 14/101,245 filed Dec. 9, 2013 and entitled “LOW POWER AND SMALL FORM FACTOR INFRARED IMAGING” is hereby incorporated by reference in its entirety. International Patent Application No. PCT/US2012/041744 filed Jun. 8, 2012 and entitled “LOW POWER AND SMALL FORM FACTOR INFRARED IMAGING” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/656,889 filed Jun. 7, 2012 and entitled “LOW POWER AND SMALL FORM FACTOR INFRARED IMAGING” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/545,056 filed Oct. 7, 2011 and entitled “NON-UNIFORMITY CORRECTION TECHNIQUES FOR INFRARED IMAGING DEVICES” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/495,873 filed Jun. 10, 2011 and entitled “INFRARED CAMERA PACKAGING SYSTEMS AND METHODS” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/495,879 filed Jun. 
10, 2011 and entitled “INFRARED CAMERA SYSTEM ARCHITECTURES” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/495,888 filed Jun. 10, 2011 and entitled “INFRARED CAMERA CALIBRATION TECHNIQUES” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 14/099,818 filed Dec. 6, 2013 and entitled “NON-UNIFORMITY CORRECTION TECHNIQUES FOR INFRARED IMAGING DEVICES” is hereby incorporated by reference in its entirety. International Patent Application No. PCT/US2012/041749 filed Jun. 8, 2012 and entitled “NON-UNIFORMITY CORRECTION TECHNIQUES FOR INFRARED IMAGING DEVICES” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 14/101,258 filed Dec. 9, 2013 and entitled “INFRARED CAMERA SYSTEM ARCHITECTURES” is hereby incorporated by reference in its entirety. International Patent Application No. PCT/US2012/041739 filed Jun. 8, 2012 and entitled “INFRARED CAMERA SYSTEM ARCHITECTURES” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 14/138,058 filed Dec. 21, 2013 and entitled “COMPACT MULTI-SPECTRUM IMAGING WITH FUSION” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/748,018 filed Dec. 31, 2012 and entitled “COMPACT MULTI-SPECTRUM IMAGING WITH FUSION” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 14/299,987 filed Jun. 9, 2014 and entitled “INFRARED CAMERA SYSTEMS AND METHODS FOR DUAL SENSOR APPLICATIONS” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 12/477,828 filed Jun. 3, 2009 and entitled “INFRARED CAMERA SYSTEMS AND METHODS FOR DUAL SENSOR APPLICATIONS” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 14/138,040 filed Dec. 21, 2013 and entitled “TIME SPACED INFRARED IMAGE ENHANCEMENT” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/792,582 filed Mar. 15, 2013 and entitled “TIME SPACED INFRARED IMAGE ENHANCEMENT” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/746,069 filed Dec. 26, 2012 and entitled “TIME SPACED INFRARED IMAGE ENHANCEMENT” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 14/138,052 filed Dec. 21, 2013 and entitled “INFRARED IMAGING ENHANCEMENT WITH FUSION” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/793,952 filed Mar. 15, 2013 and entitled “INFRARED IMAGING ENHANCEMENT WITH FUSION” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/746,074 filed Dec. 26, 2012 and entitled “INFRARED IMAGING ENHANCEMENT WITH FUSION” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 14/246,006 filed Apr. 4, 2014 entitled “SMART SURVEILLANCE CAMERA SYSTEMS AND METHODS” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 13/437,645 filed Apr. 2, 2012 and entitled “INFRARED RESOLUTION AND CONTRAST ENHANCEMENT WITH FUSION” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 13/105,765 filed May 11, 2011 and entitled “INFRARED RESOLUTION AND CONTRAST ENHANCEMENT WITH FUSION” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/473,207 filed Apr. 
8, 2011 and entitled “INFRARED RESOLUTION AND CONTRAST ENHANCEMENT WITH FUSION” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 12/766,739 filed Apr. 23, 2010 and entitled “INFRARED RESOLUTION AND CONTRAST ENHANCEMENT WITH FUSION” is hereby incorporated by reference in its entirety. International Patent Application No. PCT/EP2011/056432 filed Apr. 21, 2011 and entitled “INFRARED RESOLUTION AND CONTRAST ENHANCEMENT WITH FUSION” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 14/029,716 filed Sep. 17, 2013 and entitled “ROW AND COLUMN NOISE REDUCTION IN THERMAL IMAGES” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/745,489 filed Dec. 21, 2012 and entitled “ROW AND COLUMN NOISE REDUCTION IN THERMAL IMAGES” is hereby incorporated by reference in its entirety. U.S. Provisional Patent Application No. 61/745,504 filed Dec. 21, 2012 and entitled “PIXEL-WISE NOISE REDUCTION IN THERMAL IMAGES” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 13/622,178 filed Sep. 18, 2012 and entitled “SYSTEMS AND METHODS FOR PROCESSING INFRARED IMAGES” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 13/529,772 filed Jun. 21, 2012 and entitled “SYSTEMS AND METHODS FOR PROCESSING INFRARED IMAGES” is hereby incorporated by reference in its entirety. U.S. patent application Ser. No. 12/396,340 filed Mar. 2, 2009 and entitled “SYSTEMS AND METHODS FOR PROCESSING INFRARED IMAGES” is hereby incorporated by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
2764055 | Clemens et al. | Sep 1956 | A |
5128796 | Barney et al. | Jul 1992 | A |
6297794 | Tsubouchi et al. | Oct 2001 | B1 |
6330371 | Chen et al. | Dec 2001 | B1 |
6348951 | Kim | Feb 2002 | B1 |
6396543 | Shin et al. | May 2002 | B1 |
6424843 | Reitmaa et al. | Jul 2002 | B1 |
6633231 | Okamoto | Oct 2003 | B1 |
6681120 | Kim | Jan 2004 | B1 |
6707044 | Lannestedt et al. | Mar 2004 | B2 |
6759949 | Miyahara | Jul 2004 | B2 |
6883054 | Yamaguchi et al. | Apr 2005 | B2 |
6911652 | Walkenstein | Jun 2005 | B2 |
7050107 | Frank et al. | May 2006 | B1 |
D524785 | Huang | Jul 2006 | S |
7084857 | Lieberman et al. | Aug 2006 | B2 |
7208733 | Mian et al. | Apr 2007 | B2 |
7263379 | Parkulo et al. | Aug 2007 | B1 |
7284921 | Lapstun et al. | Oct 2007 | B2 |
7296747 | Rohs | Nov 2007 | B2 |
7305368 | Lieberman et al. | Dec 2007 | B2 |
7321783 | Kim | Jan 2008 | B2 |
7333832 | Tsai et al. | Feb 2008 | B2 |
7377835 | Parkulo et al. | May 2008 | B2 |
7420663 | Wang et al. | Sep 2008 | B2 |
7453064 | Lee | Nov 2008 | B2 |
7470902 | Kraemer et al. | Dec 2008 | B1 |
7477309 | Cuccias | Jan 2009 | B2 |
7567818 | Pylkko | Jul 2009 | B2 |
7572077 | Lapstun et al. | Aug 2009 | B2 |
7575077 | Priepke et al. | Aug 2009 | B2 |
7595904 | Lapstun et al. | Sep 2009 | B2 |
7616877 | Zarnowski et al. | Nov 2009 | B2 |
7620265 | Wolff et al. | Nov 2009 | B1 |
7627364 | Sato | Dec 2009 | B2 |
7697962 | Cradick et al. | Apr 2010 | B2 |
7723686 | Hannebauer | May 2010 | B2 |
7725141 | Su | May 2010 | B2 |
7728281 | Chen | Jun 2010 | B2 |
7735974 | Silverbrook et al. | Jun 2010 | B2 |
7747454 | Bartfeld et al. | Jun 2010 | B2 |
7760919 | Namgoong | Jul 2010 | B2 |
7761114 | Silverbrook et al. | Jul 2010 | B2 |
7773870 | Naruse | Aug 2010 | B2 |
7801733 | Lee et al. | Sep 2010 | B2 |
7810733 | Silverbrook et al. | Oct 2010 | B2 |
7872574 | Betts et al. | Jan 2011 | B2 |
7900842 | Silverbrook et al. | Mar 2011 | B2 |
7903152 | Kim | Mar 2011 | B2 |
7947222 | Bae et al. | May 2011 | B2 |
7960700 | Craig et al. | Jun 2011 | B2 |
8049163 | Granneman et al. | Nov 2011 | B1 |
8153980 | Brady et al. | Apr 2012 | B1 |
8208026 | Hogasten et al. | Jun 2012 | B2 |
8275413 | Fraden et al. | Sep 2012 | B1 |
8305577 | Kivioja et al. | Nov 2012 | B2 |
8345226 | Zhang | Jan 2013 | B2 |
8520970 | Strandemar | Aug 2013 | B2 |
8537343 | Zhang | Sep 2013 | B2 |
8565547 | Strandemar | Oct 2013 | B2 |
8749635 | Hogasten et al. | Jun 2014 | B2 |
8780208 | Hogasten et al. | Jul 2014 | B2 |
8781420 | Schlub et al. | Jul 2014 | B2 |
8825112 | Fraden et al. | Sep 2014 | B1 |
9171361 | Strandemar | Oct 2015 | B2 |
9235876 | Hogasten et al. | Jan 2016 | B2 |
9237284 | Hogasten et al. | Jan 2016 | B2 |
20020006337 | Kimura et al. | Jan 2002 | A1 |
20020058352 | Jacksen et al. | May 2002 | A1 |
20020122036 | Sasaki | Sep 2002 | A1 |
20020135571 | Klocek et al. | Sep 2002 | A1 |
20020140542 | Prokoski et al. | Oct 2002 | A1 |
20020149600 | Van Splunter et al. | Oct 2002 | A1 |
20030007193 | Sato et al. | Jan 2003 | A1 |
20030112871 | Demos | Jun 2003 | A1 |
20030122957 | Emme | Jul 2003 | A1 |
20030223623 | Gutta et al. | Dec 2003 | A1 |
20040047518 | Tiana | Mar 2004 | A1 |
20040101298 | Mandelbaum et al. | May 2004 | A1 |
20040119020 | Bodkin | Jun 2004 | A1 |
20040127156 | Park | Jul 2004 | A1 |
20040128070 | Schmidt et al. | Jul 2004 | A1 |
20040157612 | Kim | Aug 2004 | A1 |
20040165788 | Perez et al. | Aug 2004 | A1 |
20040169860 | Jung et al. | Sep 2004 | A1 |
20040207036 | Ikeda | Oct 2004 | A1 |
20040211907 | Wellman et al. | Oct 2004 | A1 |
20040256561 | Beuhler et al. | Dec 2004 | A1 |
20050030314 | Dawson | Feb 2005 | A1 |
20050067852 | Jeong | Mar 2005 | A1 |
20050089241 | Kawanishi et al. | Apr 2005 | A1 |
20050068333 | Nakahashi et al. | May 2005 | A1 |
20050093890 | Baudisch | May 2005 | A1 |
20050110803 | Sugimura | May 2005 | A1 |
20050138569 | Baxter et al. | Jun 2005 | A1 |
20050169655 | Koyama et al. | Aug 2005 | A1 |
20050184993 | Ludwin et al. | Aug 2005 | A1 |
20050213813 | Lin et al. | Sep 2005 | A1 |
20050213853 | Maier et al. | Sep 2005 | A1 |
20050219249 | Xie et al. | Oct 2005 | A1 |
20050248912 | Kang et al. | Nov 2005 | A1 |
20050265688 | Kobayashi | Dec 2005 | A1 |
20050270784 | Hahn et al. | Dec 2005 | A1 |
20050277447 | Buil et al. | Dec 2005 | A1 |
20060039686 | Soh et al. | Feb 2006 | A1 |
20060045504 | Zarnowski et al. | Mar 2006 | A1 |
20060060984 | Wakabayashi et al. | Mar 2006 | A1 |
20060077246 | Kawakami et al. | Apr 2006 | A1 |
20060097172 | Park | May 2006 | A1 |
20060120712 | Kim | Jun 2006 | A1 |
20060132642 | Hosaka et al. | Jun 2006 | A1 |
20060140501 | Tadas | Jun 2006 | A1 |
20060147191 | Kim | Jul 2006 | A1 |
20060154559 | Yoshida | Jul 2006 | A1 |
20060210249 | Seto | Sep 2006 | A1 |
20060234744 | Sung et al. | Oct 2006 | A1 |
20060240867 | Wang et al. | Oct 2006 | A1 |
20060279758 | Myoki | Dec 2006 | A1 |
20060285907 | Kang et al. | Dec 2006 | A1 |
20070004449 | Sham | Jan 2007 | A1 |
20070019077 | Park | Jan 2007 | A1 |
20070019099 | Lieberman et al. | Jan 2007 | A1 |
20070019103 | Lieberman et al. | Jan 2007 | A1 |
20070033309 | Kuwabara et al. | Feb 2007 | A1 |
20070034800 | Huang | Feb 2007 | A1 |
20070052616 | Yoon | Mar 2007 | A1 |
20070057764 | Sato et al. | Mar 2007 | A1 |
20070103479 | Kim et al. | May 2007 | A1 |
20070120879 | Kanade et al. | May 2007 | A1 |
20070132858 | Chiba et al. | Jun 2007 | A1 |
20070139739 | Kim et al. | Jun 2007 | A1 |
20070159524 | Kim et al. | Jul 2007 | A1 |
20070189583 | Shimada et al. | Aug 2007 | A1 |
20070211965 | Helbing et al. | Sep 2007 | A1 |
20070222798 | Kuno | Sep 2007 | A1 |
20070248284 | Bernsen et al. | Oct 2007 | A1 |
20070274541 | Uetake et al. | Nov 2007 | A1 |
20070285439 | King et al. | Dec 2007 | A1 |
20070286517 | Paik et al. | Dec 2007 | A1 |
20070299226 | Park et al. | Dec 2007 | A1 |
20080038579 | Schuisky et al. | Feb 2008 | A1 |
20080056612 | Park et al. | Mar 2008 | A1 |
20080079834 | Chung et al. | Apr 2008 | A1 |
20080112012 | Yokoyama et al. | May 2008 | A1 |
20080151056 | Ahamefula | Jun 2008 | A1 |
20080165190 | Min et al. | Jul 2008 | A1 |
20080165342 | Yoshida et al. | Jul 2008 | A1 |
20080170082 | Kim | Jul 2008 | A1 |
20080218474 | Ahn et al. | Sep 2008 | A1 |
20080248833 | Silverbrook et al. | Oct 2008 | A1 |
20080259181 | Yamashita et al. | Oct 2008 | A1 |
20080266079 | Lontka | Oct 2008 | A1 |
20080278772 | Silverbrook et al. | Nov 2008 | A1 |
20080284880 | Numata | Nov 2008 | A1 |
20080292144 | Kim | Nov 2008 | A1 |
20080297614 | Lieberman et al. | Dec 2008 | A1 |
20090023421 | Parkulo et al. | Jan 2009 | A1 |
20090027525 | Lin et al. | Jan 2009 | A1 |
20090040042 | Lontka | Feb 2009 | A1 |
20090040195 | Njolstad et al. | Feb 2009 | A1 |
20090052883 | Lee et al. | Feb 2009 | A1 |
20090065695 | DeMarco | Mar 2009 | A1 |
20090129700 | Rother et al. | May 2009 | A1 |
20090131104 | Yoon | May 2009 | A1 |
20090148019 | Hamada et al. | Jun 2009 | A1 |
20090213110 | Kato et al. | Aug 2009 | A1 |
20090215479 | Karmarkar | Aug 2009 | A1 |
20090227287 | Kotidis | Sep 2009 | A1 |
20090238238 | Hollander et al. | Sep 2009 | A1 |
20090278048 | Choe et al. | Nov 2009 | A1 |
20090297062 | Molne et al. | Dec 2009 | A1 |
20090303363 | Blessinger | Dec 2009 | A1 |
20100020229 | Hershey et al. | Jan 2010 | A1 |
20100066866 | Lim | Mar 2010 | A1 |
20100090965 | Birkler | Apr 2010 | A1 |
20100090983 | Challener et al. | Apr 2010 | A1 |
20100103141 | Challener et al. | Apr 2010 | A1 |
20100113068 | Rothschild | May 2010 | A1 |
20100131268 | Moeller | May 2010 | A1 |
20100134604 | Kieffer et al. | Jun 2010 | A1 |
20100144387 | Chou | Jun 2010 | A1 |
20100163730 | Schmidt et al. | Jul 2010 | A1 |
20100234067 | Silverbrook et al. | Sep 2010 | A1 |
20100245582 | Harel | Sep 2010 | A1 |
20100245585 | Fisher et al. | Sep 2010 | A1 |
20100245826 | Lee | Sep 2010 | A1 |
20100314543 | Lee et al. | Dec 2010 | A1 |
20110043486 | Hagiwara et al. | Feb 2011 | A1 |
20110063446 | McMordie et al. | Mar 2011 | A1 |
20110102599 | Kwon et al. | May 2011 | A1 |
20110117532 | Relyea et al. | May 2011 | A1 |
20110121978 | Schwörer et al. | May 2011 | A1 |
20110122075 | Seo et al. | May 2011 | A1 |
20110128384 | Tiscareno et al. | Jun 2011 | A1 |
20120007987 | Gaber | Jan 2012 | A1 |
20120083314 | Ng et al. | Apr 2012 | A1 |
20120184252 | Hirsch | Jul 2012 | A1 |
20120273688 | Tsai et al. | Nov 2012 | A1 |
20120274814 | Wajs | Nov 2012 | A1 |
20120276954 | Kowalsky | Nov 2012 | A1 |
20120292518 | Goldstein | Nov 2012 | A1 |
20120320086 | Kasama et al. | Dec 2012 | A1 |
20130204570 | Mendelson et al. | Aug 2013 | A1 |
20130250047 | Hollinger | Sep 2013 | A1 |
20130258111 | Frank et al. | Oct 2013 | A1 |
20130270441 | Burt | Oct 2013 | A1 |
20130286236 | Mankowski | Oct 2013 | A1 |
20130320220 | Donowsky | Dec 2013 | A1 |
20130329054 | Hoelter et al. | Dec 2013 | A1 |
20130342691 | Lewis et al. | Dec 2013 | A1 |
20140092257 | Hogasten et al. | Apr 2014 | A1 |
20140098238 | Boulanger et al. | Apr 2014 | A1 |
20140139685 | Nussmeier et al. | May 2014 | A1 |
20140218520 | Teich et al. | Aug 2014 | A1 |
20140240512 | Hogasten et al. | Aug 2014 | A1 |
20140253735 | Fox et al. | Sep 2014 | A1 |
20140267768 | Burleigh | Sep 2014 | A1 |
20140285672 | Hogasten et al. | Sep 2014 | A1 |
20150271420 | Neal | Sep 2015 | A1 |
20150334315 | Teich et al. | Nov 2015 | A1 |
20150358560 | Boulanger et al. | Dec 2015 | A1 |
Number | Date | Country |
---|---|---|
2764055 | Jul 2012 | CA |
2874947 | Feb 2007 | CN |
2899321 | May 2007 | CN |
201203922 | Mar 2009 | CN |
101635754 | Jan 2010 | CN |
201481406 | May 2010 | CN |
201550169 | Aug 2010 | CN |
101859209 | Oct 2010 | CN |
201628839 | Nov 2010 | CN |
101945154 | Jan 2011 | CN |
102045423 | May 2011 | CN |
102045448 | May 2011 | CN |
102055836 | May 2011 | CN |
201869255 | Jun 2011 | CN |
201897853 | Jul 2011 | CN |
102178510 | Sep 2011 | CN |
202261481 | May 2012 | CN |
102880289 | Jan 2013 | CN |
202998279 | Jun 2013 | CN |
102006057431 | Jun 2008 | DE |
0398725 | Nov 1990 | EP |
0837600 | Apr 1998 | EP |
0973137 | Jan 2000 | EP |
1983485 | Oct 2008 | EP |
2136554 | Dec 2009 | EP |
2477391 | Jul 2012 | EP |
1997275518 | Apr 1999 | JP |
2004004465 | Jan 2004 | JP |
2004048571 | Feb 2004 | JP |
2004241491 | Aug 2004 | JP |
2006098098 | Apr 2006 | JP |
2006105655 | Apr 2006 | JP |
2007006475 | Jan 2007 | JP |
2007267035 | Oct 2007 | JP |
2007325842 | Dec 2007 | JP |
2010181324 | Aug 2010 | JP |
2012231309 | Nov 2012 | JP |
20000026757 | May 2000 | KR |
100227582 | Nov 2000 | KR |
100272582 | Nov 2000 | KR |
20000073381 | Dec 2000 | KR |
100285817 | Jan 2001 | KR |
20010001341 | Jan 2001 | KR |
20010002462 | Jan 2001 | KR |
20010010010 | Feb 2001 | KR |
20010014992 | Feb 2001 | KR |
20010044756 | Jun 2001 | KR |
20010050263 | Jun 2001 | KR |
20010060752 | Jul 2001 | KR |
20010068202 | Jul 2001 | KR |
20010070355 | Jul 2001 | KR |
20010074565 | Aug 2001 | KR |
20020006967 | Jan 2002 | KR |
20020044339 | Jun 2002 | KR |
20020049605 | Jun 2002 | KR |
20020061406 | Jul 2002 | KR |
20020061920 | Jul 2002 | KR |
20020069690 | Sep 2002 | KR |
20020078469 | Oct 2002 | KR |
20020083368 | Nov 2002 | KR |
20020083961 | Nov 2002 | KR |
20020085124 | Nov 2002 | KR |
20020085490 | Nov 2002 | KR |
20020095752 | Dec 2002 | KR |
20030000332 | Jan 2003 | KR |
20030007030 | Jan 2003 | KR |
20030012444 | Feb 2003 | KR |
20030016607 | Mar 2003 | KR |
20030024545 | Mar 2003 | KR |
20030037101 | May 2003 | KR |
20030051140 | Jun 2003 | KR |
20030055693 | Jul 2003 | KR |
20030056667 | Jul 2003 | KR |
20030067116 | Aug 2003 | KR |
20030085742 | Nov 2003 | KR |
20030088968 | Nov 2003 | KR |
20040001684 | Jan 2004 | KR |
20040001686 | Jan 2004 | KR |
20040023826 | Mar 2004 | KR |
20040027692 | Apr 2004 | KR |
20040033223 | Apr 2004 | KR |
20040033532 | Apr 2004 | KR |
20040033986 | Apr 2004 | KR |
20040033993 | Apr 2004 | KR |
20040039868 | May 2004 | KR |
20040040296 | May 2004 | KR |
20040042475 | May 2004 | KR |
20040044624 | May 2004 | KR |
100437890 | Jun 2004 | KR |
20040054416 | Jun 2004 | KR |
20040058969 | Jul 2004 | KR |
20040062802 | Jul 2004 | KR |
20040064855 | Jul 2004 | KR |
20040066724 | Jul 2004 | KR |
20040068864 | Aug 2004 | KR |
20040070840 | Aug 2004 | KR |
20040076308 | Sep 2004 | KR |
20040086994 | Oct 2004 | KR |
20040102386 | Dec 2004 | KR |
20050008245 | Jan 2005 | KR |
20050011313 | Jan 2005 | KR |
20050012505 | Feb 2005 | KR |
20050014448 | Feb 2005 | KR |
20050015293 | Feb 2005 | KR |
20050015526 | Feb 2005 | KR |
20050015745 | Feb 2005 | KR |
20050018370 | Feb 2005 | KR |
20050023950 | Mar 2005 | KR |
20050028537 | Mar 2005 | KR |
20050033308 | Apr 2005 | KR |
101006660 | Sep 2005 | KR |
1020050095463 | Sep 2005 | KR |
100547739 | Jan 2006 | KR |
20060023957 | Mar 2006 | KR |
1020060019715 | Mar 2006 | KR |
20060054877 | May 2006 | KR |
20060071220 | Jun 2006 | KR |
100612890 | Aug 2006 | KR |
100633792 | Oct 2006 | KR |
100646966 | Nov 2006 | KR |
20060119077 | Nov 2006 | KR |
20060119236 | Nov 2006 | KR |
20060120318 | Nov 2006 | KR |
20060121595 | Nov 2006 | KR |
100660125 | Dec 2006 | KR |
100663528 | Jan 2007 | KR |
100672377 | Jan 2007 | KR |
20070002590 | Jan 2007 | KR |
20070005263 | Jan 2007 | KR |
20070005553 | Jan 2007 | KR |
20070009380 | Jan 2007 | KR |
100677913 | Feb 2007 | KR |
100689465 | Mar 2007 | KR |
20070028201 | Mar 2007 | KR |
100722974 | May 2007 | KR |
100729813 | Jun 2007 | KR |
20070067650 | Jun 2007 | KR |
100743171 | Jul 2007 | KR |
100743254 | Jul 2007 | KR |
20070068501 | Jul 2007 | KR |
20070078477 | Aug 2007 | KR |
20070082960 | Aug 2007 | KR |
20070087513 | Aug 2007 | KR |
20070091486 | Sep 2007 | KR |
100766953 | Oct 2007 | KR |
100771364 | Oct 2007 | KR |
20070104957 | Oct 2007 | KR |
100777428 | Nov 2007 | KR |
20070115754 | Dec 2007 | KR |
20070122344 | Dec 2007 | KR |
20070122345 | Dec 2007 | KR |
100802525 | Feb 2008 | KR |
20080013314 | Feb 2008 | KR |
20080015099 | Feb 2008 | KR |
20080015100 | Feb 2008 | KR |
20080015973 | Feb 2008 | KR |
20080018407 | Feb 2008 | KR |
100822053 | Apr 2008 | KR |
20080045551 | May 2008 | KR |
100841243 | Jun 2008 | KR |
20080053057 | Jun 2008 | KR |
20080054596 | Jun 2008 | KR |
100846192 | Jul 2008 | KR |
20080059882 | Jul 2008 | KR |
20080069007 | Jul 2008 | KR |
100854932 | Aug 2008 | KR |
20080071070 | Aug 2008 | KR |
20080078315 | Aug 2008 | KR |
100866177 | Oct 2008 | KR |
100866475 | Nov 2008 | KR |
100866476 | Nov 2008 | KR |
100866573 | Nov 2008 | KR |
100870724 | Nov 2008 | KR |
20080096918 | Nov 2008 | KR |
20080098409 | Nov 2008 | KR |
100871916 | Dec 2008 | KR |
20080112331 | Dec 2008 | KR |
20090003899 | Jan 2009 | KR |
20090018486 | Feb 2009 | KR |
20090020864 | Feb 2009 | KR |
100888554 | Mar 2009 | KR |
20090036734 | Apr 2009 | KR |
100897170 | May 2009 | KR |
20090052526 | May 2009 | KR |
100901784 | Jun 2009 | KR |
100903348 | Jun 2009 | KR |
20090089931 | Aug 2009 | KR |
100922497 | Oct 2009 | KR |
20090105424 | Oct 2009 | KR |
100932752 | Dec 2009 | KR |
100935495 | Jan 2010 | KR |
20100006652 | Jan 2010 | KR |
2010022327 | Mar 2010 | KR |
20100039170 | Apr 2010 | KR |
100958030 | May 2010 | KR |
20100059681 | Jun 2010 | KR |
20100070116 | Jun 2010 | KR |
20100070119 | Jun 2010 | KR |
20100072994 | Jul 2010 | KR |
100977516 | Aug 2010 | KR |
2010091758 | Aug 2010 | KR |
20100089125 | Aug 2010 | KR |
20100090521 | Aug 2010 | KR |
20100091758 | Aug 2010 | KR |
20100098958 | Sep 2010 | KR |
100985816 | Oct 2010 | KR |
100990904 | Nov 2010 | KR |
20100123021 | Nov 2010 | KR |
20110006437 | Jan 2011 | KR |
20110011264 | Feb 2011 | KR |
2011024290 | Mar 2011 | KR |
20110019994 | Mar 2011 | KR |
101111167 | Apr 2011 | KR |
1111167 | Feb 2012 | KR |
1020130142810 | Dec 2013 | KR |
201116030 | May 2011 | TW |
WO 2000023814 | Apr 2000 | WO |
WO 03093963 | Nov 2003 | WO |
WO 2004027459 | Apr 2004 | WO |
WO 2005002228 | Jan 2005 | WO |
WO 2005015143 | Feb 2005 | WO |
WO 2005088846 | Sep 2005 | WO |
WO 2006112866 | Oct 2006 | WO |
WO 2007006242 | Jan 2007 | WO |
WO 2007053329 | May 2007 | WO |
WO 2008090345 | Jul 2008 | WO |
WO 2009122114 | Oct 2009 | WO |
WO 2010005152 | Jan 2010 | WO |
WO 2010033142 | Mar 2010 | WO |
WO 2012149926 | Nov 2012 | WO |
WO 2012170954 | Dec 2012 | WO |
WO 2014105241 | Jul 2014 | WO |
WO 2014159758 | Oct 2014 | WO |
Entry |
---|
Frank et al. “Mobile Communications Device Attachment with Camera” U.S. Appl. No. 29/423,027, filed May 25, 2012, 6 pgs. |
Gangkofner et al. “Optimizing the High-Pass Filter Addition Technique for Image Fusion”, Photogrammetric Engineering & Remote Sensing, vol. 74, No. 9, Sep. 1, 2008, pp. 1107-1118, XP9150814. |
Ager et al. “Geo-Positional Accuracy Evaluation of QuickBird Ortho-Ready Standard 2A Multispectral Imagery”, Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery X, vol. 5425, Aug. 12, 2004, pp. 488-499, XP040185047, Bellingham, WA. |
Darpa, “Broad Agency Announcement Low Cost Thermal Imager Manufacturing (LCTI-M)”, Microsystems Technology Office, DARPA-BAA-11-27, Jan. 24, 2011. pp. 1-42, Arlington, VA. |
Number | Date | Country | |
---|---|---|---|
20160316154 A1 | Oct 2016 | US |
Number | Date | Country | |
---|---|---|---|
61923732 | Jan 2014 | US |
Number | Date | Country | |
---|---|---|---|
Parent | PCT/US2014/073096 | Dec 2014 | US |
Child | 15199867 | US |