This disclosure relates generally to monitoring, and, more particularly, to methods, apparatus and systems for monitoring devices.
Conventional monitoring systems typically rely on alerts from sensors and analysis of event streams to infer that an anomalous condition has manifested. However, mechanical malfunctions may manifest only as indirect and diffuse correlations in behavioral parameters over a long period of time. Consequently, mechanical malfunctions may escape detection until a serious degradation exists.
The figures are not to scale. As used in this patent, stating that any part (e.g., a layer, film, area, or plate) is in any way positioned on (e.g., positioned on, located on, disposed on, or formed on, etc.) another part, indicates that the referenced part is either in contact with the other part, or that the referenced part is above the other part with one or more intermediate part(s) located therebetween. Stating that any part is in contact with another part means that there is no intermediate part between the two parts.
Conventional monitoring systems may fail to detect degradation of a mechanical system. In manned systems, such as a vehicle with a driver, vehicle sensory data is supplemented by humans who may notice subtle alterations in vehicle performance, unusual noises, unusual smells, or the like, and who may assess the need to investigate further. Such conventional monitoring systems use data streams from system sensors (e.g., engine rotation speed, vibration, tire pressure, etc.) as indicators, but these data streams at times prove insufficient to discover a problem or indicate a severity of a problem. Autonomous systems likewise receive data streams from internal sensors and analyze the data streams to draw inferences therefrom. This monitoring paradigm and analysis is suitable for simple mechanisms of degradation. For instance, a low tire pressure indicator is sufficient to alert as to a low tire pressure. However, a variety of mechanical failure mechanisms can prove difficult to diagnose via on-board sensors. For instance, a blown head gasket could manifest in a variety of ways (e.g., misfires, lowered compression, overheating, swelling of the radiator cap, corruption of fluids, oil leak, coolant leak, whitish exhaust, etc.).
In accord with some teachings of this disclosure, autonomous devices (e.g., autonomous land vehicles, autonomous aerial vehicles, drones, robots, spacecraft, industrial equipment or machinery, etc.) and/or manned devices (e.g., terrestrial vehicles, aircraft, watercraft, etc.) implement an example governor to monitor one or more systems, subsystems, or components to assess damage and/or degradation of the apparatus/device, including any system(s), subsystem(s), or component(s). The example governor helps to eliminate the human-in-the-loop performing manual screening of autonomous devices and helps to increase device autonomy.
As shown in
The image sensor 120 is to obtain example image data, such as thermal image data, spatial image data and/or optical image data, of the area(s) of the device 110 in which the image sensor 120 is disposed. In some examples, the image sensor 120 may include the non-contact MLX90620 temperature measurement device from Melexis of Belgium, which includes a 16×4 element far infrared (FIR) thermopile sensor array constructed to produce a real-time map of heat values. In some examples, the image sensor 120 includes a spatial image sensor such as an Intel® RealSense™ Depth Module D400.
The image sensor 120 outputs the image data via an example communication pathway 130, such as a hardwired communication pathway or a wireless communication pathway, to an example governor 150. In some examples, the governor 150 is disposed within the device 110. For instance, the governor 150 may be disposed in a dashboard, under a seat, or in a trunk of the device 110. In some examples, the governor 150 is disposed at a remote location (e.g., external to the device 110, in a different region than the device 110, etc.). As described below, the governor 150 processes the image data from the image sensor 120 and outputs the image data and/or a derivative thereof, via a communication device 155, to an example RF broadcast tower 160 and/or an example network 165.
In some examples, the communication device 155 includes a device such as a transmitter, a transceiver, a modem and/or network interface card to facilitate exchange of the image data with one or more external machines 170 (e.g., computing devices of any kind, computer, server, etc.) via the RF broadcast tower 160 and/or network 165. In some examples, the communication device 155 may communicate, directly or indirectly (e.g., via one or more intermediary devices), to the network 165 via an Ethernet connection, a digital subscriber line (DSL), a telephone line, coaxial cable, a cellular telephone system, a 10Base-T connection, a FireWire connector or a Universal Serial Bus (USB) connector. Thus, while an example RF broadcast tower 160 and an example communication device 155 are indicated in the example of
In general, the example governor 150 is to cause the image sensor 120 to obtain example image data of the structure of the device 110 at a first time, from which an example impression may be formed. The impression facilitates comparison of actual sensor data with data calculated by applying a trained model, or impression, to previous data. For instance, in a first example, where previous samples of data at times t−1, t−2, . . . t−N are known and the example governor 150 receives sensor data at time t, the governor 150 applies the impression to the data samples at times t−1, t−2, . . . t−N and calculates estimated data for time t using the impression. The governor 150 then compares the estimated data at time t with the actual data at time t and determines a verdict as to whether the comparison is favorable (e.g., a "good" state) or unfavorable (e.g., a "bad" state). In a second example, where previous samples of data at times t−1, t−2, . . . t−N are known and the example governor 150 receives sensor data at time t, the governor 150 applies the impression to the data samples at times t, t−1, t−2, . . . t−N and determines a verdict as to whether the comparison is favorable (e.g., a "good" state), unfavorable (e.g., a "bad" state) or "unknown." The impression can be updated continuously after every sample of data, periodically, or aperiodically. For instance, in some examples, the governor 150 updates the impression periodically using batches of data. In some examples, the governor 150 updates the impression when a new classification becomes available in response to telemetry data or an external expert system. The impression allows the governor 150 to determine if measured data is close to data that is predicted, with the governor 150 calculating a verdict (e.g., "good" state or "bad" state) through a comparison of predicted data and measured data (e.g., a neural net "feedforward" evaluation) and modifying or updating the impression to integrate input data with which the governor 150 was not previously familiar (e.g., a neural net "backpropagation" or training).
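By way of illustration, the two verdict modes may be sketched as follows, assuming a generic impression object; the predict and classify methods, the window length N, and the threshold value are hypothetical placeholders rather than elements of this disclosure.

```python
from collections import deque
import numpy as np

N = 4                      # number of retained past samples (t-1 ... t-N)
THRESHOLD = 3.0            # hypothetical allowable deviation
history = deque(maxlen=N)  # previous data samples

def verdict_by_prediction(impression, sample_t):
    """First example: estimate data for time t from t-1 ... t-N, then compare."""
    if len(history) < N:
        return "unknown"
    estimate = impression.predict(np.stack(history))  # estimated data for time t
    error = np.abs(estimate - sample_t).sum()         # e.g., sum of absolute differences
    return "good" if error < THRESHOLD else "bad"

def verdict_by_classification(impression, sample_t):
    """Second example: classify the window t, t-1 ... t-N directly."""
    window = np.stack(list(history) + [sample_t])
    return impression.classify(window)                # "good", "bad", or "unknown"
```

In either mode, the governor 150 would then append the sample at time t to the history and update the impression continuously, in batches, or when a new classification becomes available, as described above.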
The example image manager 210 is to receive and process example image data from the image sensor 120 and to pass the image data to the example impression manager 220 for processing. In some examples, the example image manager 210 obtains image data of the structure of the device 110 responsive to a request from the governor 150.
The example impression manager 220 is to use the image data to form an impression or trained model of the structure of the device 110 imaged by the image sensor 120. In some examples, each volume and/or surface area in a field of view, or points of view, of the image sensor 120 is assigned a value corresponding to the image data obtained by the image sensor 120. In some examples, the impression can be a set of weights in a matrix representing a neural network (e.g., Artificial Neural Network (ANN), Recurrent Neural Network (RNN), Convolutional Neural Network (CNN), Deep Neural Network (DNN), etc.) or some other representation of behavior (e.g., a “decision tree,” Support Vector Machine (SVM), Logistic Regression (LR), etc.). The impression is trained (e.g., matrix coefficients are updated, etc.) using data from the image sensor 120. For example, a thermal image sensor would output image data including a temperature reading at each point or pixel in a field of view (e.g., a 640×480 thermal image sensor may include 307,200 pixels) and a spatial image sensor would output distances (e.g., via time-of-flight) at each point or pixel in a field of view (e.g., a point cloud of a 3D image sensor, etc.). The point cloud data for temperature and/or distance is brought into a common reference system (e.g., polar, Cartesian, etc.). The result of training is a modified impression which accommodates or integrates valid changes in image data.
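By way of illustration, the following sketch brings co-registered depth and temperature images into a common Cartesian reference system by back-projecting each pixel through an assumed pinhole camera model; the intrinsics (fx, fy, cx, cy) and the co-registration of the two sensors are assumptions made for illustration.

```python
import numpy as np

def to_point_cloud(depth, temperature, fx, fy, cx, cy):
    """depth, temperature: H x W arrays from co-registered sensors."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(np.float64)
    x = (u - cx) * z / fx                     # back-project pixel columns
    y = (v - cy) * z / fy                     # back-project pixel rows
    # One row per point: (x, y, z, temperature) in a shared Cartesian frame
    return np.stack([x, y, z, temperature], axis=-1).reshape(-1, 4)
```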
In some examples, an impression may include an arrangement of raw data, such as a temperature in a volume or surface area defined by a selected coordinate system, or a derivative of the raw data, such as a map of spatial thermal gradients within a given volume and/or a mapping of the spatial thermal gradients onto a coefficients vector (e.g., a Radon transform). The example impression manager 220 may use any manner of expression of the image data to uniquely identify the imaged operational state. For instance, the impression manager 220 may convert the image data into an alternative representation via a mathematical transform, linear transform, matrix representation, linear mapping, eigenvalue decomposition, wavelet decomposition, geometric multiscale analysis, polygonal 3D model, surface model, non-uniform rational basis spline (NURBS) surface model, polygon mesh, and store and/or export the representation in a suitable format (e.g., as a Standard Tessellation Language (STL) file, Standard ACIS Text (SAT) file and/or OBJ geometry file, or any other 3D modelling file format).
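For instance, a sketch of one such mapping, computing spatial thermal gradients and projecting their magnitudes onto a coefficients vector via a Radon transform (here using the scikit-image library), may take the following form; flattening the resulting sinogram into a single vector is an illustrative choice:

```python
import numpy as np
from skimage.transform import radon

def gradient_coefficients(thermal_map):
    """Map spatial thermal gradients onto a coefficients vector."""
    gy, gx = np.gradient(thermal_map)                  # spatial thermal gradients
    magnitude = np.hypot(gx, gy)                       # gradient magnitude map
    sinogram = radon(magnitude, theta=np.arange(0.0, 180.0), circle=False)
    return sinogram.ravel()                            # coefficients vector
```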
In some examples an initial impression is provided by a manufacturer of the device 110 (e.g., a vehicle, etc.) and the initial impression is updated using image data from the image sensor 120.
In some examples, the example impression comparator 230 is to apply an impression to samples of image data at times t−1, t−2, . . . t−N and to calculate estimated image data for time t. The impression comparator 230 then compares the estimated image data for time t to actual image data from the image sensor 120 for time t to determine a level of correspondence between the estimated image data for time t and the actual image data at time t. The impression comparator 230 then renders a verdict as to whether or not the actual data corresponds to a "known good" state or a "known bad" state. In some examples, the example impression comparator 230 is to apply an impression to samples of image data at times t, t−1, t−2, . . . t−N and is to determine a verdict for time t directly (e.g., a "known good" state, a "known bad" state, an "unknown" state, etc.).
In some instances, the impression manager 220 maps spatial gradients and/or thermal gradients onto a coefficients vector (e.g., via a Radon transform, Hough transform, Funk transform, combinations of transforms, etc.) to form and/or update the impression, and the impression comparator 230 is to compare basis vectors in a vector space between sets of image data at different times to detect correspondence between the image data and a known state (e.g., a good state, a bad state, etc.) and/or an unknown state.
In some examples, the governor 150 outputs an example impression and/or the image data relating to the impression, or derivatives thereof, to an example memory 250. The example memory 250 includes, in an example operating state manager 252 corresponding to an operating state of the device, the example impression 254, including image data relating to the impression 254 and/or derivatives thereof. In some examples, the operating state manager 252 differentiates between a plurality of operating states (e.g., one or more "known good" state(s), one or more "known bad" state(s), etc.). The operating state manager 252 also includes an example first image data set 256 and successive image data sets, from the first image data set 256 through an example Nth image data set 258, where N is any integer. In some examples, the first image data set 256 and/or any other image data set through the Nth image data set 258 include more than one image data set (e.g., a plurality of image data sets from a plurality of different times). In some examples, the first image data set 256 is preloaded into the memory 250 by a vendor of the device 110.
In some examples, the memory 250 is local to the device 110. In some examples, the memory 250 is remote to the device 110 and communication between the governor 150 and the memory 250 is via the communication device 155 and/or via the communication device 155 and any intermediary devices such as the RF broadcast tower 160 and/or the network 165.
While an example manner of implementing the governor 150 of
A flowchart representative of example machine readable instructions for implementing the governor 150 of
As mentioned above, the example processes of
In some examples, the image sensor 302 includes a plurality of image sensors 302 producing vectors of measured values, with each of the image sensors 302 producing scalar values combinable with respect to time as a tuple or n-vector, where n is any integer.
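By way of illustration, such a combination may be sketched as follows, with illustrative sensor names:

```python
import numpy as np

def sample_vector(t, sensors):
    """Combine several scalar-valued sensors into one n-vector at time t.

    sensors: mapping of name -> callable returning a scalar reading at time t.
    """
    return np.array([read(t) for read in sensors.values()])

# Hypothetical usage with three scalar channels, yielding a 3-vector per time step:
# sensors = {"thermopile_a": read_a, "thermopile_b": read_b, "thermopile_c": read_c}
```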
The reflected light from the example first object 370, the example second object 372 and the example third object 374 interacts with one or more lenses and photosensors of the image sensor 340 yielding, via time-of-flight, distance data for each point in the point cloud. The image sensor 340 is to build a map (e.g., a distance map and/or a thermal map) of a selected volume of the structure of the autonomous and/or manned device (e.g., a volume within the engine compartment, etc.) from which the governor 150 can detect deviations in alignment and/or temperature. In some examples, the light source 350 includes one or more lights to illuminate a selected volume of the structure of the autonomous and/or manned device (e.g., a volume within the engine compartment, etc.). In some examples, the light source 350 includes a solid-state diode, a lamp, and/or a bulb to output light in one or more ranges of wavelengths (e.g., visible light spectrum, infrared light spectrum, etc.). In some examples, the image sensor 340 includes RealSense™ technology from Intel®, such as an Intel® RealSense™ Depth Module D400 Series image sensor.
In some examples, the impression manager 220 integrates or associates the image data from the image sensor 120 (e.g., image sensor 302, 325, 340) with telemetry data from one or more additional sensors 125 of the autonomous and/or manned device. For instance, image data used by the impression manager 220 to form and/or update the impression 254 may be associated with data from one or more other sensors 125 (e.g., a pressure sensor, a vibration sensor, a velocity sensor, an acceleration sensor, etc.) operatively associated with one or more systems or subsystems of the device (e.g., autonomous and/or manned device, device 110). The impression comparator 230 is thus informed as to changes in an operating condition and is able to contextually determine whether a rendered verdict is indicative of a changed operating condition or is indicative of a potential malfunction.
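One simple association, sketched below with illustrative field names, is to tag each image sample with the concurrent telemetry readings so that verdicts can be interpreted in context:

```python
def tagged_sample(t, image_data, telemetry_sensors):
    """Bundle image data with concurrent telemetry; names are illustrative."""
    return {
        "time": t,
        "image": image_data,  # from the image sensor 120
        "telemetry": {name: read(t) for name, read in telemetry_sensors.items()},
    }
```

The impression manager 220 could then form and/or update the impression 254 over such tagged samples, so that a deviation coinciding with, e.g., a pressure excursion can be attributed to a changed operating condition rather than a fault.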
In
Example comparisons utilizing the example data of
The programs or instructions of
The impression manager 220 is then used to form an impression 254 or update the impression 254 at example block 520 of
In some examples, at example block 535 and/or example block 540, the image manager 210 and/or governor 150 may perform pre-processing of the image data from the image sensor 120 and/or telemetry data from the sensors 125, respectively. The pre-processing may be used, for example, to suppress distortions in the data, eliminate noisiness in the data, enhance the data, normalize the data and/or transform the image data into an appropriate coordinate system and into a format suitable for further processing, such as by the impression manager 220 and/or the impression comparator 230.
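A minimal sketch of such pre-processing follows; the median filter and min-max normalization are illustrative choices rather than steps mandated by this disclosure:

```python
import numpy as np
from scipy.ndimage import median_filter

def preprocess(frame):
    """Suppress noise, normalize, and flatten a frame for further processing."""
    frame = median_filter(frame, size=3)       # suppress speckle/impulse noise
    lo, hi = float(frame.min()), float(frame.max())
    if hi > lo:
        frame = (frame - lo) / (hi - lo)       # normalize to [0, 1]
    return np.asarray(frame, dtype=np.float32).ravel()
```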
At example block 545, the impression comparator 230 applies the impression 254 to the image data. For instance, the governor 150 may use the impression manager 220 to access the impression 254 from the operating state manager 252 and apply the impression 254 to previous samples of image data at times t−1, t−2, . . . t−N. As noted above, the impression 254 may include, for example, a mapping of spatial gradients and/or thermal gradients onto a coefficients vector or a set of weights in a matrix representing a neural network (e.g., ANN/RNN/CNN/DNN, "decision tree," SVM, LR, etc.). At example block 545, the impression comparator 230 may also apply the impression 254 to calculate estimated data for time t based on the impression 254 as applied to the previous samples of image data at times t−1, t−2, . . . t−N. At block 550, the impression comparator 230 determines a verdict using the impression and the image data, such as by comparing the impression 254 of the estimated data at time t with the actual data at time t. At block 555, the impression comparator 230 renders a verdict as to whether the comparison of the impression 254 of the estimated data at time t with the actual data at time t is favorable and reflects a good verdict (e.g., reflective of a known "good" state of the device 110). In some examples, where previous samples of data at times t−1, t−2, . . . t−N are known, the impression comparator 230 applies the impression 254 to the data samples at times t, t−1, t−2, . . . t−N at block 550 to determine a verdict and determines at block 555 whether the verdict corresponds to a good verdict or a bad verdict.
If the result at block 555 is "YES," control passes to example block 560, where the governor 150 and/or the impression comparator 230 determines whether the telemetry data from the sensor(s) 125 of the device 110 is within acceptable limits. If the result at block 555 is "NO," control passes to example block 565, where the impression comparator 230 renders a verdict as to whether the comparison of the impression 254 of the estimated data at time t with the actual data at time t is unfavorable and reflects a bad verdict (e.g., reflective of a known "bad" state of the device 110). In some examples, where previous samples of data at times t−1, t−2, . . . t−N are known, the impression comparator 230 applies the impression 254 to the data samples at times t, t−1, t−2, . . . t−N at block 550 and determines a verdict at block 555 as to whether the comparison is unfavorable and reflects a bad verdict.
If the result at block 565 is “NO,” control passes to example block 570, where the impression comparator 230 and/or governor 150 stores the image data in the memory 250 for later evaluation. Control then passes to example block 560, discussed above, where the governor 150 and/or the impression comparator 230 determine whether the telemetry data from the sensor(s) 125 of the device 110 is within acceptable limits.
At block 560, if the telemetry data from the sensor(s) 125 of the device 110 is within acceptable limits (i.e., the result is "YES"), control passes to example block 575, where the impression comparator 230 and/or the governor 150 updates the impression 254 to incorporate the image data as "good" (e.g., reflecting a normal operational condition, reflecting a normal structural condition, etc.). Control then passes to block 525. If the telemetry data from the sensor(s) 125 of the device 110 is not within acceptable limits and the result at block 560 is "NO," control passes to example block 580. At block 580, the impression comparator 230 and/or the governor 150 updates the impression 254 to incorporate the image data as "bad" (e.g., reflecting an abnormal operational condition, reflecting an abnormal structural condition, etc.). Control then passes to block 585. At block 585, the impression comparator 230 and/or governor 150 stores the image data in the memory 250 for later evaluation. Control then passes to example block 590. At block 590, the governor 150 and/or the impression comparator 230 output a deviation report relating to the impression 254 and/or the image data (e.g., reporting locally to a controller, reporting remotely to a central facility or server, etc.). If a particular deviation report (e.g., a temperature in a particular location of a structure or system of the autonomous or manned device, a displacement of a structure or system of the autonomous or manned device, etc.) is later positively associated with a particular performance issue and/or maintenance issue, the impressions and/or image data can then be flagged as a known problem impression. Particularly for a plurality of similarly configured devices 110 (e.g., a fleet of drones, a fleet of vehicles, etc.), a deviation report issued by one device 110 informing of a potential and/or actual performance and/or maintenance issue may enable trending of problems, targeted preventive maintenance, and timely corrective actions not only for the one device 110, but also for the other similarly configured devices 110.
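The decision flow of blocks 545-590 may be sketched as follows; the helper objects are illustrative stand-ins for the components of the governor 150, and the routing of a confirmed bad verdict at block 565 is an assumption, as that branch is left implicit above:

```python
def evaluate(impression, actual_t, telemetry_ok, memory):
    """Mirror of blocks 545-590; `impression` and `memory` are illustrative."""
    verdict = impression.verdict(actual_t)         # blocks 545-555: apply impression
    if verdict == "unknown":                       # block 565 result "NO"
        memory.store(actual_t)                     # block 570: keep for later evaluation
    if verdict != "bad" and telemetry_ok:          # block 560 result "YES"
        impression.update(actual_t, label="good")  # block 575
        return None                                # control returns to monitoring (block 525)
    impression.update(actual_t, label="bad")       # block 580 (assumed path for a "bad" verdict)
    memory.store(actual_t)                         # block 585
    return {"deviation_report": actual_t}          # block 590: output deviation report
```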
The first input data vector 610A represents image data for a selected volume 405 at a first time (t0). Each of the blocks 620 of the selected volume 405 has a uniform first temperature, expressed as uniform fill in the first input data vector 610A of
The fourth input data vector 610D represents image data for a selected volume 405 at a fourth time (t3). Some blocks 620 of the selected volume 405 have a first temperature, block 628 is shown to have a fourth temperature higher than the third temperature, and block 630 is shown to have the third temperature, with each of the different temperatures being expressed using a different fill. The fifth input data vector 610E represents image data for a selected volume 405 at a fifth time (t4). Some blocks 620 of the selected volume 405 are shown to have the first temperature, block 632 is shown to have the fourth temperature, and block 634 is shown to have the third temperature, with each of the different temperatures being expressed using a different fill. Block 636 of the fifth input data vector 610E is shown to have the second temperature. The sixth input data vector 610F represents image data for a selected volume 405 at a sixth time (t5). Some blocks 620 of the selected volume 405 are shown to have the first temperature, block 638 is shown to have the fourth temperature, and block 640 and block 642 are shown to have the third temperature.
In each of the first training sample 702, the second training sample 704 and the third training sample 706, the rows 618F, 618G and 618J include blocks 620 exhibiting higher temperatures than the remaining blocks 620, which are all at the first temperature 708. In the first training sample 702, row 618F shows that the block 620 at column 616A (CL(t−3)) has a second temperature 710 higher than the first temperature 708, the block 620 at column 616B (CL(t−2)) has a third temperature 712 higher than the second temperature 710, the block 620 at column 616C (CL(t−1)) has a fourth temperature 714 higher than the third temperature 712 and the block 620 at column 616N (CL(t)) is shown to have the fourth temperature 714. Row 618G of the first training sample 702 shows that the block 620 at column 616A (CL(t−3)) has the first temperature 708, the block 620 at column 616B (CL(t−2)) has the second temperature 710, the block 620 at column 616C (CL(t−1)) has the third temperature 712 and the block 620 at column 616N (CL(t)) is shown to have the third temperature 712. Row 618J of the first training sample 702 shows that the block 620 at column 616N (CL(t)) has the second temperature 710.
In the second training sample 704 and the third training sample 706, row 618F shows that the block 620 at column 616A (CL(t−3)) has the third temperature 712, the block 620 at column 616B (CL(t−2)) has the fourth temperature 714, the block 620 at column 616C (CL(t−1)) has the fourth temperature 714 and the block 620 at column 616N (CL(t)) has the fourth temperature 714. Row 618G of the second training sample 704 and the third training sample 706 shows that the block 620 at column 616A (CL(t−3)) has the second temperature 710, the block 620 at column 616B (CL(t−2)) has the third temperature 712, the block 620 at column 616C (CL(t−1)) has the third temperature 712 and the block 620 at column 616N (CL(t)) has the third temperature 712. Row 618J of the second training sample 704 shows that the block 620 at column 616C (CL(t−1)) has the second temperature 710 and the block 620 at column 616N (CL(t)) has the third temperature 712.
In some examples, such as is shown in the example of
As noted above, in some examples, the training of the impression 254 includes implementation of neural networks, decision trees, support vector machines (SVMs), and/or other machine learning applications. The training of the impression 254 may include, for example, updating of matrix coefficients responsive to image data from the image sensor 120 to modify the impression 254 to accommodate valid changes in image data. In the example of
In some examples, the training of the impression 254 is during normal operation of the device 110 for a sufficient time for the impression 254 to learn actual behavior of the device 110 in advance of any potential for anomalous behavior of the device 110. In some examples, the device 110 may include a vendor-supplied impression 254 (e.g., values of coefficients for a pre-trained model, etc.) in the memory 250.
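By way of illustration, the following toy sketch trains an impression by fitting a per-pixel linear predictor of the frame at time t from the previous N frames using least squares; a neural network, decision tree, or SVM could fill the same role, and this choice is purely illustrative:

```python
import numpy as np

def train_impression(frames, n_past=3):
    """frames: T x P array (T time steps, P pixels); returns the coefficients."""
    X, Y = [], []
    for t in range(n_past, len(frames)):
        X.append(frames[t - n_past:t].ravel())   # window t-N ... t-1 as features
        Y.append(frames[t])                      # target frame at time t
    X, Y = np.asarray(X), np.asarray(Y)
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)    # coefficients act as the impression
    return W

def predict_next(W, recent_frames):
    """Estimate the frame at time t from the n_past most recent frames."""
    return recent_frames.ravel() @ W
```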
The example array 850 of
The expected values 831A-831P are then compared by the impression comparator 230 to the example rows 851A-851P of data vectors (CL(t)) in example array 850, corresponding to actual image data.
In some examples, the data vectors (CL(t)) of rows 851A-851P of array 850 (e.g., actual values of image data) are compared to the data vectors (N*L(t)) of the rows of expected values 831A-831P of array 830 (e.g., the expected values of impression 254) to determine if any comparison between corresponding data vectors (e.g., a comparison of the data vector of row 851A to the data vector of row 831A, etc.) is greater than a threshold difference. In some examples, where the data vectors represent temperatures (e.g., absolute temperatures, temperature gradients, etc.), the threshold difference may be expressed in a difference in temperatures (e.g., a difference of 1° F., 2° F., 3° F. . . . 10° F., 20° F., 30° F., etc.). To illustrate, the data vector C10(t) of row 851J is shown to correspond to the fourth temperature 714, whereas the data vector N*10(t) of row 831J (e.g., the impression) is shown to correspond to the third temperature 712, indicating a difference therebetween. In some examples, where the data vectors represent distances or dimensions, the threshold difference may be expressed in a difference in dimension (e.g., a difference of 0.1 mm, 0.2 mm, 0.3 mm, etc.). In some examples, the threshold difference is determined via a sum of absolute differences, a sum of differences squared, or the like.
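By way of illustration, such a threshold comparison may be sketched as follows, using a sum of absolute differences between corresponding data vectors and an arbitrary illustrative threshold:

```python
import numpy as np

def flag_deviations(actual, expected, threshold=2.0):
    """Compare rows of array 850 (actual) to rows of array 830 (expected)."""
    sad = np.abs(actual - expected).sum(axis=-1)   # sum of absolute differences per row
    return sad > threshold                          # True where the threshold is exceeded

# A sum of differences squared is an equally valid choice:
# ssd = ((actual - expected) ** 2).sum(axis=-1)
```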
The processor platform 900 of the illustrated example includes a processor 912. The processor 912 of the illustrated example is hardware. For example, the processor 912 can be implemented by one or more integrated circuits, logic circuits, microprocessors or controllers from any desired family or manufacturer. The hardware processor may be a semiconductor based (e.g., silicon based) device. In this example, the processor 912 implements the governor 150, the image manager 210, the impression manager 220, and the impression comparator 230.
The processor 912 of the illustrated example includes a local memory 913 (e.g., a cache). The processor 912 of the illustrated example is in communication with a main memory including a volatile memory 914 and a non-volatile memory 916 via a bus 918. The volatile memory 914 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS Dynamic Random Access Memory (RDRAM) and/or any other type of random access memory device. The non-volatile memory 916 may be implemented by flash memory and/or any other desired type of memory device. Access to the volatile memory and non-volatile memory 914, 916, local memory and/or main memory is controlled by a memory controller.
The processor platform 900 of the illustrated example also includes an interface circuit 920. The interface circuit 920 may be implemented by any type of interface standard, such as an Ethernet interface, a universal serial bus (USB), and/or a PCI express interface.
In the illustrated example, one or more input devices 922 are connected to the interface circuit 920. The input device(s) 922 permit(s) a user to enter data and/or commands into the processor 912. The input device(s) can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a track-pad, a trackball, isopoint and/or a voice recognition system.
One or more output devices 924 are also connected to the interface circuit 920 of the illustrated example. The output devices 924 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display, a touchscreen, a tactile output device, a printer and/or speakers). The interface circuit 920 of the illustrated example, thus, typically includes a graphics driver card, a graphics driver chip and/or a graphics driver processor.
The interface circuit 920 of the illustrated example also includes a communication device such as a transmitter, a receiver, a transceiver, a modem and/or network interface card to facilitate exchange of data with external machines (e.g., computing devices of any kind) via a network 165 (e.g., an Ethernet connection, a digital subscriber line (DSL), a telephone line, coaxial cable, a cellular telephone system, etc.).
The processor platform 900 of the illustrated example also includes one or more mass storage devices 928 for storing software and/or data. Examples of such mass storage devices 928 include floppy disk drives, hard drive disks, compact disk drives, Blu-ray disk drives, RAID systems, and digital versatile disk (DVD) drives.
The coded instructions 932 of
From the foregoing, it will be appreciated that example methods, apparatus and articles of manufacture have been disclosed that enable early detection of a wide range of known and/or unforeseeable malfunctions and that enable early intervention. This is particularly advantageous for autonomous devices, as it can help to detect problems before the autonomous devices become unrecoverable, and for devices having a high degree of similitude, as knowledge gained on one autonomous and/or manned device can be applied to other similar autonomous and/or manned devices.
Example 1 is a monitoring system including an image sensor to obtain image data of a device and a governor to cause the image sensor to obtain image data of the device, to form an impression from the image data, to use the impression and the image data to generate expected values for a state of the device at a predetermined time, and to compare image data for the device at the predetermined time with the generated expected values.
Example 2 includes the monitoring system as defined in example 1, wherein the image sensor includes at least one of a thermal image sensor, a spatial image sensor, and an optical image sensor and wherein the image data includes at least one of thermal image data, spatial image data, and optical image data.
Example 3 includes the monitoring system as defined in example 1 or example 2, wherein the image sensor includes a thermal image sensor and a spatial image sensor, and wherein the image data includes thermal image data and spatial image data.
Example 4 includes the monitoring system as defined in any of examples 1-3, wherein the image sensor is to obtain the image data of the device, via the image sensor, during operation of the device.
Example 5 includes the monitoring system as defined in any of examples 1-4, wherein the image sensor is to obtain the image data of the device, via the image sensor, between operation cycles of the device.
Example 6 includes the monitoring system as defined in any of examples 1-5, wherein the impression is formed from a plurality of sets of the image data.
Example 7 includes the monitoring system as defined in any of examples 1-6, further including a non-transitory machine readable medium to store at least one of the impression or the image data.
Example 8 includes the monitoring system as defined in any of examples 1-7, wherein the impression includes a mapping of at least one of spatial gradients or thermal gradients within a selected volume of a structure of the device onto coefficients vectors to permit comparison, over respective basis spaces, to actual values from the image data.
Example 9 includes the monitoring system as defined in any of examples 1-8, wherein the governor is to, following the comparison of the image data for the device at the predetermined time with the generated expected values from the impression, determine whether telemetry data from instruments monitoring one or more systems or subsystems of the device is within acceptable limits.
Example 10 includes the monitoring system as defined in any of examples 1-9, further including a communication device to communicate, to a remote device, at least one of the impression, the image data, or a deviation report relating to the impression.
Example 11 includes the monitoring system as defined in any of examples 1-10, wherein the governor is to generate a deviation report responsive to a difference between an expected value generated by the impression and an actual value of the image data.
Example 12 includes the monitoring system as defined in any of examples 1-11, wherein the remote device includes another device similarly configured to the device.
Example 13 includes the monitoring system as defined in any of examples 1-12, wherein the remote device includes a central server or service in communication with a plurality of similarly configured devices.
Example 14 is a method for automated monitoring of a device, including imaging a device during an operating state of the device to obtain first image data, forming an impression from the first image data, imaging the device during a subsequent operating state of the device to obtain second image data, estimating values for the second image data using the impression, and comparing the estimated values for the second image data to actual values for the second image data.
Example 15 includes the method for automated monitoring of example 14, and further includes determining if a difference between the estimated values for the second image data and the actual values for the second image data is less than a threshold difference.
Example 16 includes the method for automated monitoring of example 14 or example 15, and further includes determining if telemetry data from a sensor of the device is within acceptable limits.
Example 17 includes the method for automated monitoring of any of examples 14-16, and further includes updating the impression to incorporate the second image data as good data if the telemetry data is within acceptable limits.
Example 18 includes the method for automated monitoring of any of examples 14-17, and further includes updating the impression to incorporate the second image data as bad data if telemetry data from a sensor of the device is not within acceptable limits.
Example 19 includes the method for automated monitoring of any of examples 14-18, and further includes outputting a deviation report.
Example 20 includes the method for automated monitoring of any of examples 14-19, and further includes, following a determination that the difference between the estimated values for the second image data and the actual values for the second image data is less than the threshold difference, comparing the estimated values for the second image data to known values corresponding to a bad outcome to determine if the estimated values for the second image data correspond to the bad outcome.
Example 21 includes the method for automated monitoring of any of examples 14-20, and further includes outputting a deviation report if the estimated values for the second image data are determined to correspond to the bad outcome.
Example 22 includes the method for automated monitoring of any of examples 14-21, and further includes determining if telemetry data from a sensor of the device is within acceptable limits if the estimated values for the second image data are determined not to correspond to the bad outcome.
Example 23 includes the method for automated monitoring of any of examples 14-22, and further includes updating the impression to incorporate the second image data as good data if the telemetry data is within acceptable limits.
Example 24 is a system including an imaging means to obtain image data of a device and a governing means to cause the imaging means to obtain image data of the device, to form an impression from the image data, to use the impression and the image data to generate expected values for a state of the device at a predetermined time, and to compare image data for the device at the predetermined time with the generated expected values.
Example 25 includes the system of example 24, wherein the imaging means includes at least one of a thermal imaging means, a spatial imaging means, and an optical imaging means and wherein the image data includes at least one of thermal image data, spatial image data, and optical image data.
Example 26 includes the system of example 24 or example 25, wherein the imaging means includes a thermal imaging means and a spatial imaging means, and wherein the image data includes thermal image data and spatial image data.
Example 27 includes the system of any of examples 24-26, wherein the imaging means is to obtain the image data of the device, via the imaging means, during operation of the device.
Example 28 includes the system of any of examples 24-27, wherein the imaging means is to obtain the image data of the device, via the imaging means, between operation cycles of the device.
Example 29 includes the system of any of examples 24-28, wherein the impression is formed from a plurality of sets of the image data.
Example 30 includes the system of any of examples 24-29, further including a communication means to communicate to a remote device at least one of the impression, the image data, or a deviation report relating to the impression in response to a difference between expected values generated from the impression and actual values of the image data exceeding a threshold difference.
Example 31 is a non-transitory machine readable medium comprising executable instructions that, when executed, cause at least one processor to image a device during an operating state of the device to obtain first image data, form an impression from the first image data, image the device during the operating state of the device to obtain second image data, estimate values for the second image data using the impression and compare the estimated values for the second image data to actual values for the second image data.
Example 32 includes the non-transitory machine readable medium of example 31, and further includes executable instructions that, when executed, cause at least one processor to determine if a difference between the estimated values for the second image data and the actual values for the second image data is less than a threshold difference.
Example 33 includes the non-transitory machine readable medium of example 31 or example 32, and further includes executable instructions that, when executed, cause at least one processor to determine if telemetry data from a sensor of the device is within acceptable limits.
Example 34 includes the non-transitory machine readable medium of any of examples 31-33, and further includes executable instructions that, when executed, cause at least one processor to update the impression to incorporate the image data as good data if the telemetry data is within acceptable limits.
Example 35 includes the non-transitory machine readable medium of any of examples 31-34, and further includes executable instructions that, when executed, cause at least one processor to update the impression to incorporate the image data as bad data if telemetry data from a sensor of the device is not within acceptable limits.
Example 36 includes the non-transitory machine readable medium of any of examples 31-35, and further includes executable instructions that, when executed, cause at least one processor to output a deviation report if a difference between the estimated values for the second image data and the actual values for the second image data is greater than a threshold difference.
Example 37 is a monitoring system including an image sensor to obtain image data of a device and a governor to cause the image sensor to obtain image data of the device, to form an impression from the image data, and to use the impression and the image data to determine a verdict.
Example 38 includes the monitoring system of example 37, wherein the verdict is determined by using the impression and the image data to generate expected values for a state of the device at a predetermined time and to compare image data for the device at the predetermined time with the generated expected values.
Example 39 includes the monitoring system of example 37 or example 38, wherein the impression is to determine the verdict using the image data directly.
Although certain example methods, devices and articles of manufacture have been disclosed herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all methods, devices and articles of manufacture fairly falling within the scope of the claims of this patent. For instance, while examples herein have disclosed implementation of an image sensor 120 to obtain image data representative of a temperature of one or more objects in a specified volume 405, in some examples contactless thermometers (pyrometers) and/or contact thermometer arrays (a plurality of thermometers) may be placed on surfaces of one or more parts to provide temperature data in lieu of image data.