The present invention relates generally to asset inspection and, more particularly, to image-based inspection of assets using reference images.
In industrial environments such as manufacturing facilities or other locations, there is often a need to inspect various assets such as machines, electronics, or other devices. In many cases, the assets may be temperature-sensitive and therefore required to operate at temperatures within expected tolerances to facilitate ongoing reliable functionality. For example, if an asset exhibits a temperature that is too high or too low, this may indicate a fault in need of repair.
Various conventional techniques exist for monitoring assets. In some cases, large numbers of sensors or fixed camera systems may be installed throughout a facility. However, such implementations can require significant investments in infrastructure and may be cost prohibitive. Moreover, the fixed nature of such implementations can limit their ability to monitor all relevant assets in a given environment. In other cases, a user may be required to manually inspect the assets. However, this approach can be subject to human error as it puts the responsibility on the user to properly monitor the condition of the asset repeatedly. Accordingly, there is a need for an improved approach to asset monitoring.
According to various embodiments of the present disclosure, a method includes capturing, by a camera, a live image of an asset under inspection. The method further includes receiving, at the camera, a manipulation to align the camera relative to the asset based on a comparison between the live image and a reference image of the asset. The method further includes capturing, by the camera, an adjusted live image of the asset aligned with the reference image.
According to various embodiments of the present disclosure, a system includes a camera. The camera is configured to capture a live image of an asset under inspection. The camera is configured to receive a manipulation to align the camera relative to the asset based on a comparison between the live image and a reference image of the asset. The camera is configured to capture an adjusted live image of the asset under inspection aligned with the reference image.
The scope of the invention is defined by the claims, which are incorporated into this section by reference. A more complete understanding of embodiments of the present invention will be afforded to those skilled in the art, as well as a realization of additional advantages thereof, by a consideration of the following detailed description of one or more embodiments. Reference will be made to the appended sheets of drawings that will first be described briefly.
Embodiments of the present invention and their advantages are best understood by referring to the detailed description that follows. It should be appreciated that like reference numerals are used to identify like elements illustrated in one or more of the figures.
Embodiments of the present disclosure provide systems and methods for asset inspection. A reference image of an asset may be used to capture a similar image of the asset repeatedly, speed up inspection, and draw correct conclusions regarding the status or health of the asset. The reference image may be a thermal and/or visible light image of the asset, such as under a normal operating state or conditions.
During inspection of the asset, the reference image may be presented to the user, such as while the user is in front of the asset. For example, the reference image may be presented together with a live image of the asset, such as to compare the live image to the reference image (e.g., for feedback to align the images together and/or to assess a status of the asset). For instance, a manipulation may be provided to the camera to align the camera relative to the asset based on a comparison between the live image and the reference image. Presenting the reference image together with the live image may support the taking of similar images of the asset every time, which enables trending. For example, such configurations may ensure that the camera is positioned at roughly the same distance and angle relative to the asset; otherwise, the two images would be dissimilar.
To speed up inspection and provide decision support (e.g., to assist the user in making the correct conclusions in the field), the live image can inherit properties from the reference image, or vice versa. For example, the reference image can be prepared before being transferred to the camera. For instance, measuring tools, such as measuring spots or boxes, can be placed on areas of interest, and/or color palette, level, and span, among other image properties, can be adjusted to make the areas of interest clearly visible in the image. In short, any kind of preparation can be applied to the reference image before it is transferred to the camera. The selection and preparation of the reference image can be done by the same user doing the inspection, or it can be done as guidance by a more experienced user. In embodiments, the reference image may inherit a characteristic of the live image, such as when the user changes any setting of the live image. In such embodiments, the reference image may be changed correspondingly.
Portable device 101 may be positioned to receive infrared radiation 194A and/or visible light radiation 194B from a scene 190 (e.g., corresponding to a field of view of portable device 101) in an environment 102 (e.g., a workplace, warehouse, industrial site, manufacturing facility, or other environment). In various embodiments, scene 190 may include one or more physical assets 192 (e.g., temperature-sensitive machines, equipment, electronics, or other devices) of interest which may be captured in thermal images and/or visible light images by portable device 101. Although a single example asset 192 is illustrated in
As shown, portable device 101 includes a housing 103 (e.g., a camera body graspable by a user), a thermal imaging subsystem 110A, a visible light imaging subsystem 110B, a logic device 168, user controls 170, a memory 172, a communication interface 174, a machine readable medium 176, a display component 178, a position sensor 179, other sensors 180, and other components 182, or any combination thereof. Such embodiments are illustrative only, and portable device 101 may include other components facilitating the operations described herein.
Thermal imaging subsystem 110A and visible light imaging subsystem 110B may be used to capture thermal images and visible light images in response to infrared radiation 194A and visible light radiation 194B, respectively, received from scene 190.
Thermal imaging subsystem 110A may include an aperture 158A, filters 160A, optical components 162A, a thermal imager 164A, and a thermal imager interface 166A. In this regard, infrared radiation 194A passing through aperture 158A may be received by filters 160A that selectively pass particular thermal wavelength ranges (e.g., wavebands) of infrared radiation 194A. Optical components 162A (e.g., an optical assembly including one or more lenses, additional filters, transmissive windows, and/or other optical components) pass the filtered infrared radiation 194A for capture by thermal imager 164A.
Thermal imager 164A may capture thermal images of scene 190 in response to the filtered infrared radiation 194A. Thermal imager 164A may include an array of sensors (e.g., microbolometers) for capturing thermal images (e.g., thermal image frames) of scene 190. In some embodiments, thermal imager 164A may also include one or more analog-to-digital converters for converting analog signals captured by the sensors into digital data (e.g., pixel values) to provide the captured images. Thermal imager interface 166A provides the captured images to logic device 168 which may be used to process the images, store the original and/or processed images in memory 172, and/or retrieve stored images from memory 172.
Visible light imaging subsystem 110B may include an aperture 158B, filters 160B, optical components 162B, a visible light imager 164B, and a visible light imager interface 166B. It will be appreciated that the various components of visible light imaging subsystem 110B may operate in an analogous manner as corresponding components of thermal imaging subsystem 110A with appropriate technology for capturing visible light images.
Moreover, although particular components are illustrated for each of thermal imaging subsystem 110A and visible light imaging subsystem 110B, it will be understood that the illustrated components are provided for purposes of example. As such, greater or fewer numbers of components may be used in each subsystem as appropriate for particular implementations.
Logic device 168 may include, for example, a microprocessor, a single-core processor, a multi-core processor, a microcontroller, a programmable logic device configured to perform processing operations, a digital signal processing (DSP) device, one or more memories for storing executable instructions (e.g., software, firmware, or other instructions), and/or any other appropriate combinations of devices and/or memory to perform any of the various operations described herein. Logic device 168 is configured to interface and communicate with the various components of portable device 101 to perform various method and processing steps described herein. In various embodiments, processing instructions may be integrated in software and/or hardware as part of logic device 168, or code (e.g., software and/or configuration data) which may be stored in memory 172 and/or a machine readable medium 176. In various embodiments, the instructions stored in memory 172 and/or machine readable medium 176 permit logic device 168 to perform the various operations discussed herein and/or control various components of portable device 101 for such operations.
Memory 172 may include one or more memory devices (e.g., one or more memories) to store data and information. The one or more memory devices may include various types of memory including volatile and non-volatile memory devices, such as RAM (Random Access Memory), ROM (Read-Only Memory), EEPROM (Electrically-Erasable Read-Only Memory), flash memory, fixed memory, removable memory, and/or other types of memory.
Machine readable medium 176 (e.g., a memory, a hard drive, a compact disk, a digital video disk, or a flash memory) may be a non-transitory machine readable medium storing instructions for execution by logic device 168. In various embodiments, machine readable medium 176 may be included as part of portable device 101 and/or separate from portable device 101, with stored instructions provided to portable device 101 by coupling the machine readable medium 176 to portable device 101 and/or by portable device 101 downloading (e.g., via a wired or wireless link) the instructions from the machine readable medium (e.g., containing the non-transitory information).
Logic device 168 may be configured to process captured images and provide them to display component 178 for presentation to and viewing by the user. Display component 178 may include a display device such as a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, and/or other types of displays as appropriate to display images and/or information to the user of portable device 101. Logic device 168 may be configured to display images and information on display component 178. For example, logic device 168 may be configured to retrieve images and information from memory 172 and provide images and information to display component 178 for presentation to the user of portable device 101. Display component 178 may include display electronics, which may be utilized by logic device 168 to display such images and information.
User controls 170 may include any desired type of user input and/or interface device having one or more user actuated components, such as one or more buttons, slide bars, knobs, keyboards, joysticks, and/or other types of controls that are configured to generate one or more user actuated input control signals. In some embodiments, user controls 170 may be integrated with display component 178 as a touchscreen to operate as both user controls 170 and display component 178. Logic device 168 may be configured to sense control input signals from user controls 170 and respond to sensed control input signals received therefrom. In some embodiments, portions of display component 178 and/or user controls 170 may be implemented by appropriate portions of a tablet, a laptop computer, a desktop computer, and/or other types of devices.
In various embodiments, user controls 170 may be configured to include one or more other user-activated mechanisms to provide various other control operations of portable device 101, such as auto-focus, menu enable and selection, field of view (FoV), brightness, contrast, gain, offset, spatial, temporal, and/or various other features and/or parameters.
Position sensor 179 may be implemented as any appropriate type of device used to determine a position (e.g., location) of portable device 101 in environment 102 (e.g., in an industrial facility containing assets 192 to be monitored). For example, in various embodiments, position sensor 179 may be implemented as a global positioning system (GPS) device, motion sensors (e.g., accelerometers, vibration sensors, gyroscopes, and/or others), depth sensing systems (e.g., time of flight cameras, LiDAR scanners, thermal cameras, visible light cameras, and/or others), antennas, other devices, and/or any combination thereof as desired. In some embodiments, position sensor 179 may send appropriate signals to logic device 168 for processing to determine the absolute and/or relative position of portable device 101 in environment 102.
Portable device 101 may include various types of other sensors 180 including, for example, temperature sensors and/or other sensors as appropriate.
Logic device 168 may be configured to receive and pass images from thermal and visible light imager interfaces 166A-B, additional data from position sensor 179 and sensors 180, and control signal information from user controls 170 to one or more external devices such as remote system 198 through communication interface 174 (e.g., through wired and/or wireless communications). In this regard, communication interface 174 may be implemented to provide wired communication over a cable and/or wireless communication over an antenna. For example, communication interface 174 may include one or more wired or wireless communication components, such as an Ethernet connection, a wireless local area network (WLAN) component based on the IEEE 802.11 standards, a wireless broadband component, a mobile cellular component, a wireless satellite component, or various other types of wireless communication components including radio frequency (RF), microwave frequency (MWF), and/or infrared frequency (IRF) components configured for communication with a network. As such, communication interface 174 may include an antenna coupled thereto for wireless communication purposes. In other embodiments, the communication interface 174 may be configured to interface with a DSL (e.g., Digital Subscriber Line) modem, a PSTN (Public Switched Telephone Network) modem, an Ethernet device, and/or various other types of wired and/or wireless network communication devices configured for communication with a network.
In some embodiments, a network may be implemented as a single network or a combination of multiple networks. For example, in various embodiments, the network may include the Internet and/or one or more intranets, landline networks, wireless networks, and/or other appropriate types of communication networks. In another example, the network may include a wireless telecommunications network (e.g., cellular phone network) configured to communicate with other communication networks, such as the Internet. As such, in various embodiments, portable device 101 and/or its individual associated components may be associated with a particular network link such as for example a URL (Uniform Resource Locator), an IP (Internet Protocol) address, and/or a mobile phone number.
Portable device 101 may include various other components 182 such as speakers, displays, visual indicators (e.g., recording indicators), vibration actuators, a battery or other power supply (e.g., rechargeable or otherwise), and/or additional components as appropriate for particular implementations.
Although various features of portable device 101 are illustrated together in
Referring to
Reference image 214 may be any image used to identify an inspection condition of asset 192 based on a comparison with live image 210. For example, reference image 214 may be an image of asset 192 itself, such as an image taken by an installer during installation of asset 192, an image taken by a manufacturer during manufacture of asset 192, or any other image of asset 192 taken at any time prior to live image 210. In some embodiments, reference image 214 may be an image of a similar asset and not of asset 192 itself. For instance, reference image 214 may be an image of another device/equipment of the same model as asset 192 (e.g., a standard image of asset model, the same asset at another location, etc.) or an image of another device/equipment having properties and/or a configuration similar to asset 192 (e.g., a prior model of asset 192, a comparable model of asset 192, etc.).
Reference image 214 may be provided in many ways. For example, reference image 214 may be provided (e.g., to camera 200) by an image database maintained by a server (e.g., by database 199 of remote system 198). In some embodiments, reference image 214 may be taken by a second camera different than camera 200. For example, as noted above, reference image 214 may be taken by the installer during installation of asset 192, by the manufacturer during manufacture of asset 192, or by another person or device.
In embodiments, reference image 214 may be selected or identified (e.g., by a user, by system 100, etc.) for use in comparing against live image 210. For example, using user controls 170, a user may select, from among multiple images, an image to use as reference image 214, such as toggling between various prior images of asset 192. In this manner, the user may toggle between a time series of images of asset 192, such as to provide additional decision support. Such embodiments may also enable a trend plot of temperature values to be presented for a measurement tool, which may provide additional decision support. In embodiments, reference image 214 may be selected automatically, or at least selected by default, based on a user setting. For instance, the user setting may include at least one of a “last asset inspection” setting, a “first image taken of the asset” setting, or a “last image associated with a similar time or environmental condition of the live image” setting, although other configurations are contemplated. The “last asset inspection” setting may select, as default, the last inspection image of asset 192 as reference image 214. The “first image taken of the asset” setting may select, as default, the earliest image taken of asset 192 as reference image 214. The “last image associated with a similar time or environmental condition of the live image” setting may select, as default, the latest image of asset 192 taken during a similar time of day and/or year (e.g., morning, afternoon, fall, October, etc.) and/or similar environmental conditions (e.g., ambient temperature, etc.) as reference image 214, such as for assets whose temperature may vary over the year. Depending on the application, the selection of reference image 214 can be done by the same user doing the inspection, or the selection can be done as guidance by a more experienced user.
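By way of a non-limiting illustration, the default reference-selection settings described above may be sketched in Python as follows. The record fields, setting names, and similarity score below are illustrative assumptions only and are not part of the disclosure:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class StoredImage:
    """Hypothetical record for a stored inspection image (fields are illustrative)."""
    asset_id: str
    captured_at: datetime
    ambient_temp_c: float
    path: str

def select_reference(images: List[StoredImage], setting: str,
                     now: Optional[datetime] = None,
                     ambient_temp_c: Optional[float] = None) -> Optional[StoredImage]:
    """Pick a default reference image for an asset according to a user setting."""
    if not images:
        return None
    if setting == "last_asset_inspection":
        # Latest prior inspection image of the asset.
        return max(images, key=lambda im: im.captured_at)
    if setting == "first_image_taken":
        # Earliest image taken of the asset.
        return min(images, key=lambda im: im.captured_at)
    if setting == "similar_conditions":
        # Image captured under the most similar time of day and ambient
        # temperature; the scoring here is a crude illustrative heuristic.
        def score(im: StoredImage) -> float:
            return (abs(im.captured_at.hour - now.hour)
                    + abs(im.ambient_temp_c - ambient_temp_c))
        return min(images, key=score)
    raise ValueError(f"unknown setting: {setting}")
```

In practice, a "similar conditions" selection might also weight the season or other environmental data stored with each image; the simple hour-plus-temperature score above merely illustrates the concept.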
As shown in
In embodiments, camera 200 and/or system 100 may apply a characteristic associated with reference image 214 or live image 210 to live image 210 or reference image 214 prior to displaying the images. The characteristic may include temperature measuring functions, properties that affect the appearance of the image, and/or properties that affect the temperature reading of asset 192. For example, reference image 214 may be prepared to include measuring tools (e.g., temperature measurement box(es) 230, temperature measurement spot(s) 232, etc.) placed on areas of interest and/or by adjusting image characteristics (e.g., color palette, temperature span settings, thermal brightness (level) settings, etc.) in a way to make areas of interest clearly visible. In embodiments, the characteristic may include other image properties/characteristics, such as various image parameters/properties and/or other data associated with the image (e.g., ambient temperature, time, user, camera type/model, location, position, etc.). Such examples are illustrative only, and any kind of preparation can be applied to reference image 214 before transferring to camera 200. Depending on the application, the preparation of reference image 214 can be done by the same user doing the inspection, or the preparation can be done as guidance by a more experienced user.
Live image 210 may inherit the characteristics of reference image 214 described above. For example, live image 210 may inherit any or all measurement functions (e.g., temperature measurement box(es) 230, temperature measurement spot(s) 232, etc.) and their placement from reference image 214, leading to an efficient inspection of asset 192 as live image 210 is automatically prepared with the correct measuring tools in place. Additionally, or alternatively, live image 210 may inherit the image characteristics and/or properties of reference image 214, such as color palette, temperature span, level, emissivity, distance to object, ambient temperature, etc. In this manner, the system may ensure that a user is looking at images with the same visual presentation to aid in inspection (e.g., to ensure an "apples-to-apples" comparison). For example, with the same properties controlling the presentation of the images, something that appears warmer in one image actually is warmer. Although live image 210 is described as inheriting a characteristic of reference image 214, in embodiments, reference image 214 may inherit a characteristic of live image 210 (e.g., should the user change any setting of live image 210, reference image 214 is changed correspondingly).
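As a minimal, non-limiting sketch of the inheritance described above, image characteristics may be modeled as a simple mapping copied from one image to the other. The Python below is an illustrative assumption only; the setting names and values are hypothetical:

```python
# Hypothetical prepared settings for a reference image; names/values illustrative.
REFERENCE_SETTINGS = {
    "palette": "ironbow",
    "span_c": (20.0, 80.0),          # temperature span (low, high) in deg C
    "level_c": 50.0,                 # thermal brightness (level)
    "emissivity": 0.95,
    "measurement_tools": [
        {"type": "box", "xywh": (120, 40, 60, 60)},
        {"type": "spot", "xy": (200, 150)},
    ],
}

INHERITED_KEYS = ("palette", "span_c", "level_c", "emissivity", "measurement_tools")

def inherit(target_settings: dict, source_settings: dict) -> dict:
    """Return a copy of target_settings with presentation and measurement
    settings copied over from source_settings; the inheritance may run in
    either direction (reference -> live, or live -> reference)."""
    inherited = dict(target_settings)
    for key in INHERITED_KEYS:
        if key in source_settings:
            inherited[key] = source_settings[key]
    return inherited
```

Applying `inherit(live_settings, REFERENCE_SETTINGS)` before display would give both images the same palette, span, and measurement tools, supporting the "apples-to-apples" comparison noted above.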
Comparison between live image 210 and reference image 214 may facilitate the taking of similar images of asset 192 every time. For example, comparing live image 210 to reference image 214 may ensure that camera 200 is positioned at roughly the same distance and angle relative to asset 192 during inspection; otherwise, the two images may not look similar or capture the same information. To that end, camera 200 may be configured to receive a manipulation to align camera 200 relative to asset 192 based on a comparison between live image 210 and reference image 214. The manipulation may adjust at least one of a position, an angle, or a field of view of camera 200 to align live image 210 with reference image 214. Depending on the application, the manipulation may be performed by the user of camera 200, such as in real time based on user comparison of live image 210 to reference image 214, or the manipulation may be performed by another device (e.g., a robot operated by system 100 and/or remote system 198), although other configurations are contemplated.
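In one crude, non-limiting sketch, the alignment feedback described above may be driven by a scalar similarity metric between the live and reference images; a practical implementation might instead use feature matching or image registration. The mean-absolute-difference metric and tolerance below are illustrative assumptions:

```python
def alignment_error(live, ref):
    """Mean absolute pixel difference between two equally sized
    single-channel images (nested lists of pixel values)."""
    if len(live) != len(ref) or len(live[0]) != len(ref[0]):
        raise ValueError("live and reference images must have the same dimensions")
    total = sum(abs(a - b)
                for row_l, row_r in zip(live, ref)
                for a, b in zip(row_l, row_r))
    return total / (len(live) * len(live[0]))

def is_aligned(live, ref, tolerance=2.0):
    """True when the images are similar enough to treat the camera as aligned;
    the tolerance is an illustrative placeholder value."""
    return alignment_error(live, ref) <= tolerance
```

The error could be recomputed on each live frame and presented to the user (or to a robot controller) as feedback while the camera's position, angle, or field of view is adjusted.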
Once live image 210 and reference image 214 are (or appear) similar, camera 200 may capture an adjusted live image of asset 192, the adjusted live image aligned with reference image 214. The adjusted live image may then be used to determine a status, condition, or operational state of asset 192. For example, the adjusted live image may display or otherwise identify a fault, failure, or undesired operation condition of asset 192, or that asset 192 is operating satisfactorily. The adjusted live image may be stored (e.g., in camera 200, in database 199, etc.) for use in future inspections of asset 192. For example, the adjusted live image may be used as reference image 214 in future inspections of asset 192.
Depending on the application, live image 210, reference image 214, and the adjusted live image may be thermal images, such as captured by thermal imaging subsystem 110A. In embodiments, camera 200 may be configured to capture a visible light live image of asset 192 (e.g., as captured by visible light imaging subsystem 110B) and receive a visible light reference image of asset 192. In such embodiments, the manipulation of camera 200 to align live image 210 with reference image 214 may be based on a comparison between the visible light live image and the visible light reference image.
In some embodiments, camera 200 may be configured to process the thermal live image and the visible light live image to provide a combined live image. Camera 200 may also process the thermal reference image and the visible light reference image to provide a combined reference image. In such embodiments, the manipulation of camera 200 to align live image 210 with reference image 214 may be based on a comparison between the combined live image and the combined reference image. The combined live image and the combined reference image may be generated using various thermal plus visible light combining techniques as further discussed herein.
Referring to
Referring to
In block 710, process 700 includes receiving (e.g., by camera 200) an identification of an asset to be inspected. For example, asset 192 may be flagged as needing inspection, such as during routine inspections of one or more assets in a warehouse, on an industry floor, etc. In embodiments, the identification of asset 192 to be inspected may be based on at least one of a predetermined inspection route, a detected position of camera 200 relative to asset 192 (e.g., GPS positioning), a communication between camera 200 and asset 192 (e.g., near-field communication (NFC), wireless communication, Bluetooth communication, etc.), or user input. For example, during routine inspections, the user may provide an indication (e.g., via user controls 170, voice control, etc.) to proceed to the next asset for inspection. In embodiments, the use of an inspection route may be the same or similar to that disclosed in U.S. Provisional Patent Application No. 63/003,111, filed Mar. 31, 2020, and International Patent Application No. PCT/US2021/025011, filed Mar. 30, 2021, both of which are hereby incorporated by reference in their entirety.
In block 715, process 700 includes capturing (e.g., by camera 200) a live image (e.g., live image 210) of the asset under inspection. For instance, thermal imaging subsystem 110A and/or visible light imaging subsystem 110B may be used to capture a thermal live image and/or a visible light live image of asset 192, such as in a manner as described above.
In block 720, process 700 includes identifying a reference image of the asset based on a user selection and/or a setting. The user setting may include at least one of a “last asset inspection” setting, a “first image taken of the asset” setting, or a “last image associated with a similar time or environmental condition of the live image” setting, as described above. Such implementations are exemplary only, and other configurations are contemplated.
In block 725, process 700 includes receiving (e.g., by camera 200) the reference image of the asset. Block 725 may include receiving the reference image from an image database maintained by a server (e.g., database 199 of remote system 198). The reference image may be taken by a second camera different than camera 200. For example, the reference image may be an image taken by an installer during installation of asset 192, an image taken by a manufacturer during manufacture of asset 192, or any other image of asset 192 taken at any time prior to the live image. The reference image may be an image of asset 192 itself, or an image of a different asset. The reference image may be a thermal reference image or a visible light reference image.
In some embodiments, process 700 may include optional block 730 wherein thermal and/or visible images may be combined to provide combined live images and/or combined reference images comprising thermal image content and visible light image content. For example, block 730 may include processing a thermal live image and a visible light live image to provide a combined live image, processing a thermal reference image and a visible light reference image to provide a combined reference image, and/or other processing. In some embodiments, the processing performed in block 730 may include any of the various techniques set forth in U.S. Pat. Nos. 8,520,970, 8,565,547, 8,749,635, 9,171,361, 9,635,285, and/or 10,091,439, all of which are hereby incorporated by reference in their entirety. In some embodiments, such processing may include, for example, contrast enhancement processing (e.g., also referred to as MSX processing, high contrast processing, and/or fusion processing), true color processing, triple fusion processing, alpha blending, and/or other processing as appropriate.
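As one simplified, non-limiting sketch of combining thermal and visible light content, a per-pixel alpha blend is shown below; the contrast enhancement (e.g., MSX) and fusion techniques referenced above are considerably more sophisticated. The grayscale representation and blend weight are illustrative assumptions:

```python
def blend_images(thermal, visible, alpha=0.5):
    """Per-pixel alpha blend of two equally sized grayscale images
    (nested lists of 0-255 values); alpha weights the thermal content."""
    return [[round(alpha * t + (1 - alpha) * v) for t, v in zip(row_t, row_v)]
            for row_t, row_v in zip(thermal, visible)]
```

In an actual implementation, the thermal image would typically be upscaled and registered to the visible light image before blending, and the blend might operate on color-mapped thermal data rather than raw grayscale values.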
Such combined live images and/or combined reference images may be used as the live images and/or reference images in other blocks of process 700 described herein to facilitate convenient review of such combined images and ease of alignment by a user or by another device. For example, in some embodiments, such combined images may permit high resolution visible light features to be discerned simultaneously with low resolution thermal features.
In some embodiments, process 700 may include optional block 732, where the live image and/or the reference image is adjusted to compensate for detected environmental conditions and/or device operating conditions. For example, one or more sensors may monitor ambient temperature, device temperature, humidity, and/or other conditions of the environment, which may affect camera operation and/or image capture/characteristics. In such embodiments, block 732 may compensate for the detected conditions, such that the live and reference images are similar for comparison purposes.
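One crude, non-limiting sketch of the compensation in block 732 is a uniform offset derived from the difference in ambient temperature between the live and reference captures; the linear drift coefficient below is an illustrative assumption, not a disclosed calibration model:

```python
def compensate_for_ambient(pixel_temps_c, live_ambient_c, reference_ambient_c,
                           drift_coeff=1.0):
    """Shift apparent pixel temperatures by the ambient difference so the
    live image is comparable to the reference image; drift_coeff models how
    strongly ambient temperature affects the reading (1.0 = one-to-one)."""
    offset = drift_coeff * (reference_ambient_c - live_ambient_c)
    return [t + offset for t in pixel_temps_c]
```

A real thermal camera would apply a more involved radiometric correction (e.g., accounting for emissivity and reflected temperature); the offset above only conveys the idea of normalizing the two images to comparable conditions.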
In block 735, process 700 includes receiving (e.g., at camera 200) a manipulation to align the camera relative to the asset based on a comparison between the live image and the reference image of the asset. The manipulation may adjust at least one of a position, an angle, or a field of view of the camera to align the live image with the reference image. The manipulation may be performed by the user of the camera, or by another device (e.g., a robot, a machine, etc.), as described above. The manipulation and comparison may be based on thermal imagery, visible light imagery, or combined thermal and visible light imagery, as noted above.
In block 740, process 700 includes applying a characteristic associated with the reference image or the live image to the live image or the reference image. The characteristic applied may include temperature measuring tools or functions (e.g., temperature measurement box(es) 230, temperature measurement spot(s) 232) placed on areas of interest, properties that affect the appearance of the image (e.g., color palette, span, level), and/or properties that affect the temperature reading (e.g., emissivity, distance to object, ambient temperature) of asset 192, among other characteristics.
In block 745, process 700 includes displaying the live image and the reference image simultaneously on a display component for viewing by a user. For example, the live and reference images may be displayed on display component 178 of portable device 101/camera 200 and/or a display component of a remote device (e.g., of remote system 198, a smartphone, etc.). The live and reference images may be displayed side-by-side, picture-in-picture, vertically stacked, or in other configurations. For example, the live and reference images may be displayed as shown in
In block 750, process 700 includes identifying a detected difference between the live image and the reference image on the display component. Block 750 may include visually highlighting, flagging, or otherwise noting differences between the live image and the reference image, such as identified by the user. In embodiments, the differences between the live image and the reference image may be detected using a processor (e.g., logic device 168 of portable device 101/camera 200 and/or remote system 198), such as via a neural network running a machine learning algorithm or other artificial intelligence. Block 750 may include boxing or otherwise isolating the detected difference, such as in a manner as explained with reference to
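As one hedged sketch of the boxing behavior (a simple threshold, not the neural-network approach the disclosure also contemplates), a detected difference could be isolated by bounding the pixels whose temperature readings diverge beyond a tolerance:

```python
import numpy as np

def difference_box(live, reference, threshold_c=5.0):
    """Return (x_min, y_min, x_max, y_max) bounding the pixels whose
    absolute temperature difference exceeds threshold_c, or None if no
    pixel differs by that much."""
    mask = np.abs(live - reference) > threshold_c
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))

# Example: a simulated hot spot on an otherwise unchanged asset.
reference = np.full((8, 8), 25.0)
live = reference.copy()
live[2:4, 5:7] += 10.0  # rows 2-3, columns 5-6 run 10 deg C hotter
box = difference_box(live, reference)
```

The returned box could then be drawn on the display component to visually highlight the difference for the user.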
In block 755, process 700 includes capturing (e.g., by camera 200) an adjusted live image of the asset aligned with the reference image. For example, once the live image and the reference image are (or at least appear) similar, thermal imaging subsystem 110A and/or visible light imaging subsystem 110B may capture a thermal live image and/or a visible light live image of asset 192 aligned with the reference image, such as in a manner as described above. In embodiments, the adjusted live image may be used as a reference image in future inspections of asset 192.
In view of the present disclosure, it will be appreciated that various techniques are provided to facilitate alignment of live images with reference images to permit comparable and useful images to be repeatedly captured of an asset under inspection. Repeated capture of comparable and useful images of the asset may speed up inspection and lead to correct conclusions regarding the status or health of the asset. For example, the live image may be presented together with a reference image to facilitate quick identification of any differences in the live image from the reference image, such as a change in temperature. To aid inspection and increase efficiency, the live image can inherit properties from the reference image, or vice versa, such that the images appear similar for the comparison.
Where applicable, various embodiments provided by the present disclosure can be implemented using hardware, software, or combinations of hardware and software. Also, where applicable, the various hardware components and/or software components set forth herein can be combined into composite components comprising software, hardware, and/or both without departing from the spirit of the present disclosure. Where applicable, the various hardware components and/or software components set forth herein can be separated into sub-components comprising software, hardware, or both without departing from the spirit of the present disclosure. In addition, where applicable, it is contemplated that software components can be implemented as hardware components, and vice-versa.
Software in accordance with the present disclosure, such as program code and/or data, can be stored on one or more computer readable mediums. It is also contemplated that software identified herein can be implemented using one or more general purpose or specific purpose computers and/or computer systems, networked and/or otherwise. Where applicable, the ordering of various steps described herein can be changed, combined into composite steps, and/or separated into sub-steps to provide features described herein.
Embodiments described above illustrate but do not limit the invention. It should also be understood that numerous modifications and variations are possible in accordance with the principles of the present invention. Accordingly, the scope of the invention is defined only by the following claims.
This application is a continuation of International Patent Application No. PCT/US2023/075638 filed Sep. 29, 2023 and entitled “CAMERA ALIGNMENT USING REFERENCE IMAGE FOR ASSET INSPECTION SYSTEMS AND METHODS,” which claims priority to and the benefit of U.S. Provisional Patent Application No. 63/412,251 filed Sep. 30, 2022 and entitled “CAMERA ALIGNMENT USING REFERENCE IMAGE FOR ASSET INSPECTION SYSTEMS AND METHODS,” all of which are incorporated herein by reference in their entirety.
| Number | Date | Country |
|---|---|---|
| 63412251 | Sep 2022 | US |
| | Number | Date | Country |
|---|---|---|---|
| Parent | PCT/US2023/075638 | Sep 2023 | WO |
| Child | 19088632 | | US |