The present invention relates to a technique for detecting gas using an image.
When a gas leak occurs, a slight temperature change occurs in the area where the leaked gas is drifting. As a technique for detecting gas using this principle, gas detection using an infrared image has been known.
As the gas detection using an infrared image, for example, Patent Literature 1 discloses a gas leak detection apparatus that includes an infrared camera for imaging an area to be inspected, and an image processing unit for processing the infrared image captured by the infrared camera, in which the image processing unit includes an extraction unit for extracting dynamic fluctuation caused by a gas leak from a plurality of infrared images arranged in a time series.
As the gas detection using an image other than the gas detection using an infrared image, for example, gas detection using an optical flow has been proposed. Patent Literature 2 discloses a gas leak detection system that is a system for detecting a gas leak on the basis of imaging by a long-focus optical system, which includes an imaging means for continuously capturing an object irradiated with parallel light or light similar to the parallel light using a camera of the long-focus optical system, a computing means for converting, using an optical flow process, the continuous image data captured by the imaging means into vector display image data in which a motion of particles in a plurality of image data is displayed as a vector, and an output means for outputting and displaying the vector display image data converted by the computing means.
A gas region extracted by image processing may be generated on the basis of an event other than appearance of the gas to be detected. For example, when the sun is obstructed by moving clouds, or when shadows of steam or the like cast on a surface that reflects sunlight are fluctuating, the resulting fluctuations may be included in the image as a gas region. Therefore, in the case of a gas detection technique based on time-series images (e.g., a moving image) having been subject to image processing of extracting a gas region, even if gas detection (gas region detection) is carried out, a user may determine that there is a possibility of misdetection in consideration of the weather conditions (wind, weather), the time zone (daytime, night-time), and the like at the time of the gas detection.
In such a case, the user determines whether or not the detection is misdetection by viewing the gas region included in the image; however, there may be a case where this cannot be determined by viewing only the image at the time of the gas detection. In view of the above, the user views motions, changes in shape, and the like of the gas region in the past, before the time at which the gas is detected, thereby determining whether or not the detection is misdetection. Furthermore, in the case of the shadow fluctuation mentioned above, the user determines whether or not the detection is misdetection by viewing whether or not a similar gas region is detected in the same time zone, at the same position relative to the sun, when the sun is not obstructed by clouds. In order to make this determination, it is conceivable to go back from the time point at which the gas is detected and reproduce the time-series images. However, in a case where the retroactive period of time is long (e.g., one day or one week), the reproduction time of the time-series images becomes long, and the user cannot quickly determine whether or not the detection is misdetection. If the time-series images are subject to fast-forward reproduction, a gas region included in the image may be missed.
Patent Literature 1: JP 2012-58093 A
Patent Literature 2: JP 2009-198399 A
The present invention aims to provide an image processing device for gas detection, an image processing method for gas detection, and an image processing program for gas detection that enable a user to grasp contents of a time-series image in a short time without missing a gas region included in the image.
In order to achieve the object mentioned above, an image processing device for gas detection reflecting one aspect of the present invention includes a first generation unit and a display control unit. The first generation unit obtains first time-series images whose imaging time is a first predetermined time period, sets a plurality of second predetermined time periods arranged in a time series and included in the first predetermined time period, and generates, for a plurality of second time-series images respectively corresponding to the plurality of second predetermined time periods, a representative image of the second time-series images corresponding to the second predetermined time period and to a part of the first time-series images, thereby generating time-series representative images. The first generation unit generates, in the case of generating the representative image using the second time-series images including a gas region, the representative image including the gas region. The display control unit displays, on a display, a plurality of the representative images included in the time-series representative images in a time-series order.
Advantages and features provided by one or a plurality of embodiments of the invention are fully understood from the following detailed description and the accompanying drawings. The detailed description and the accompanying drawings are provided merely as examples, and are not intended to limit the present invention.
Hereinafter, one or more embodiments of the present invention will be described with reference to the accompanying drawings. However, the scope of the present invention is not limited to the disclosed embodiments.
In each drawing, configurations denoted by the same reference sign indicate the same configuration, and description of content that has already been described is omitted.
The infrared camera 2 captures video of infrared images of a subject including a monitoring target of a gas leak (e.g., a portion where gas transport pipes are connected with each other), and generates moving image data MD indicating the video. The moving image data MD only needs to indicate a plurality of infrared images captured in a time series, and is not limited to a moving image. The infrared camera 2 includes an optical system 4, a filter 5, a two-dimensional image sensor 6, and a signal processing unit 7.
The optical system 4 forms an infrared image of a subject on the two-dimensional image sensor 6. The filter 5 is disposed between the optical system 4 and the two-dimensional image sensor 6, and transmits only infrared light of a specific wavelength among the light having passed through the optical system 4. The wavelength band to pass through the filter 5 among the infrared wavelength bands depends on the type of the gas to be detected. For example, in the case of methane, a filter 5 that allows a wavelength band of 3.2 to 3.4 μm to pass therethrough is used. The two-dimensional image sensor 6 is, for example, a cooled indium antimonide (InSb) image sensor, which receives the infrared light having passed through the filter 5. The signal processing unit 7 converts analog signals output from the two-dimensional image sensor 6 into digital signals, and performs publicly known image processing. Those digital signals become the moving image data MD.
The image processing device for gas detection 3 is a personal computer, a smartphone, a tablet terminal, or the like, and includes an image data input unit 8, an image processing unit 9, a display control unit 10, a display 11, and an input unit 12 as functional blocks.
The image data input unit 8 is a communication interface that communicates with a communication unit (not illustrated) of the infrared camera 2. The moving image data MD transmitted from the communication unit of the infrared camera 2 is input to the image data input unit 8. The image data input unit 8 transmits the moving image data MD to the image processing unit 9.
The image processing unit 9 performs predetermined processing on the moving image data MD. The predetermined processing is, for example, processing of generating time-series pixel data from the moving image data MD.
The time-series pixel data will be specifically described.
The pixels at the same position in the plurality (K) of frames indicate pixels in the same order. For example, in the case of the first pixel, data obtained by arranging, in a time series, pixel data of the first pixel included in the first frame, pixel data of the first pixel included in the second frame, . . . , pixel data of the first pixel included in the (K−1)-th frame, and pixel data of the first pixel included in the K-th frame is to be the time-series pixel data D1 of the first pixel. Furthermore, in the case of the M-th pixel, data obtained by arranging, in a time series, pixel data of the M-th pixel included in the first frame, pixel data of the M-th pixel included in the second frame, . . . , pixel data of the M-th pixel included in the (K−1)-th frame, and pixel data of the M-th pixel included in the K-th frame is to be the time-series pixel data D1 of the M-th pixel. The number of the time-series pixel data D1 is the same as the number of pixels included in one frame.
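As an illustration only, the rearrangement of the frames into time-series pixel data can be sketched as follows; the use of NumPy, the array shapes, and the random placeholder data are assumptions for this example and are not part of the embodiment.

```python
import numpy as np

# Placeholder for the moving image data MD: K frames of H x W pixels
# (the shapes and the random data are assumptions for this example).
K, H, W = 300, 240, 320
frames = np.random.rand(K, H, W).astype(np.float32)

# Rearranging the frames so that each row holds the K values of one pixel
# position yields the M (= H * W) pieces of time-series pixel data D1.
M = H * W
D1 = frames.reshape(K, M).T  # shape (M, K): one time series per pixel

# D1[0] is the time-series pixel data of the first pixel,
# D1[M - 1] is the time-series pixel data of the M-th pixel.
print(D1.shape)  # (76800, 300)
```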
Referring to
The display control unit 10 causes the display 11 to display the moving image indicated by the moving image data MD and the moving image on which the predetermined processing mentioned above is performed by the image processing unit 9.
The input unit 12 receives various kinds of input related to gas detection. Although the image processing device for gas detection 3 according to the embodiment includes the display 11 and the input unit 12, the image processing device for gas detection 3 may not include those units.
The HDD 3d stores programs for implementing the functional blocks of the image processing unit 9 and the display control unit 10, and various kinds of data (e.g., the moving image data MD). The program for implementing the image processing unit 9 is a processing program for obtaining the moving image data MD and performing the predetermined processing mentioned above on the moving image data MD. The program for implementing the display control unit 10 is, for example, a display control program for displaying a moving image indicated by the moving image data MD on the display 11, or displaying a moving image having been subject to the predetermined processing mentioned above performed by the image processing unit 9 on the display 11. Although those programs are stored in advance in the HDD 3d, they are not limited thereto. For example, a recording medium (e.g., an external recording medium such as a magnetic disk or an optical disk) recording those programs may be prepared, and the programs recorded in the recording medium may be stored in the HDD 3d. In addition, those programs may be stored in a server connected to the image processing device for gas detection 3 via a network, and may be transmitted to the HDD 3d via the network and stored in the HDD 3d. Those programs may be stored in the ROM 3c instead of the HDD 3d. The image processing device for gas detection 3 may include a flash memory instead of the HDD 3d, and those programs may be stored in the flash memory.
The CPU 3a is an exemplary hardware processor, which reads out those programs from the HDD 3d, loads them in the RAM 3b, and executes the loaded programs, thereby implementing the image processing unit 9 and the display control unit 10. However, a part of or all of respective functions of the image processing unit 9 and the display control unit 10 may be implemented by processing performed by a digital signal processor (DSP) instead of or together with processing performed by the CPU 3a. Likewise, a part of or all of the respective functions may be implemented by processing performed by a dedicated hardware circuit instead of or together with processing performed by software.
Note that the image processing unit 9 includes a plurality of components illustrated in
Those programs are expressed using definitions of the components. The first generation unit 91 and the first generation program will be described as an example. The first generation unit 91 obtains first time-series images whose imaging time is a first predetermined time period, sets a plurality of second predetermined time periods arranged in a time series and included in the first predetermined time period, and generates, for second time-series images respectively corresponding to the plurality of second predetermined time periods, a representative image of the second time-series images corresponding to the second predetermined time period and to a part of the first time-series images, thereby generating time-series representative images. The first generation program is a program that obtains first time-series images whose imaging time is a first predetermined time period, sets a plurality of second predetermined time periods arranged in a time series and included in the first predetermined time period, and generates, for second time-series images respectively corresponding to the plurality of second predetermined time periods, a representative image of the second time-series image corresponding to the second predetermined time period and to a part of the first time-series images, thereby generating time-series representative images.
A flowchart of those programs (first generation program, second generation program, etc.) to be executed by the CPU 3a is illustrated in
The present inventor has found out that, in gas detection using an infrared image, in a case where a gas leak and a background temperature change occur in parallel and the background temperature change is larger than the temperature change due to the leaked gas, it is not possible to display an image of leaking gas unless the background temperature change is considered. This will be described in detail.
An image I1 is an infrared image of the test site captured at time T1 immediately before the sunlight is obstructed by clouds. An image I2 is an infrared image of the test site captured at time T2, 5 seconds after the time T1. Since the sunlight is obstructed by clouds at the time T2, the background temperature is lower than that at the time T1.
An image I3 is an infrared image of the test site captured at time T3, 10 seconds after the time T1. Since the state in which the sunlight is obstructed by clouds continues from the time T2 to the time T3, the background temperature at the time T3 is lower than that at the time T2.
An image I4 is an infrared image of the test site captured at time T4, 15 seconds after the time T1. Since the state in which the sunlight is obstructed by clouds continues from the time T3 to the time T4, the background temperature at the time T4 is lower than that at the time T3.
The background temperature has dropped by about 4° C. in 15 seconds from the time T1 to the time T4. Therefore, it can be seen that the image I4 is overall darker than the image I1, and the background temperature is lower.
At a time after the time T1 and before the time T2, gas ejection starts at the spot SP1. The temperature change due to the ejected gas is slight (about 0.5° C.). Therefore, while the gas is ejected at the spot SP1 at the time T2, the time T3, and the time T4, the background temperature change is much larger than the temperature change due to the ejected gas, whereby it cannot be understood that the gas is ejected at the spot SP1 by viewing the image I2, the image I3, and the image I4.
The graph illustrating a temperature change at the spot SP1 is different from the graph illustrating a temperature change at the spot SP2. Since no gas is ejected at the spot SP2, the temperature change at the spot SP2 indicates a background temperature change. Meanwhile, since gas is ejected at the spot SP1, gas is drifting at the spot SP1. Therefore, the temperature change at the spot SP1 indicates a temperature change obtained by adding the background temperature change and the temperature change due to the leaked gas.
It can be seen from the graph illustrated in
As described above, in a case where the background temperature change is much larger than the temperature change due to the ejected gas (leaked gas), it cannot be understood that the gas is ejected at the spot SP1 by viewing the image I2, the image I3, and the image I4 illustrated in
The reason is that the moving image data MD (
The image processing unit 9 (
Frequency component data that has a frequency higher than that of the frequency component data indicating the temperature change due to the leaked gas, and that indicates high-frequency noise, is regarded as high-frequency component data D3. The image processing unit 9 performs, in addition to the processing of removing the low-frequency component data D2, processing of removing the high-frequency component data D3 on each of the plurality of time-series pixel data D1 included in the moving image data MD.
In this manner, the image processing unit 9 does not perform processing of removing the low-frequency component data D2 and the high-frequency component data D3 in units of frames, but performs processing of removing the low-frequency component data D2 and the high-frequency component data D3 in units of time-series pixel data D1.
The image processing device for gas detection 3 generates a monitoring image using an infrared image. When a gas leak occurs, the monitoring image includes an image showing an area in which gas appears due to the gas leak. The image processing device for gas detection 3 detects the gas leak on the basis of the monitoring image. While various methods are available as a method of generating a monitoring image, an exemplary method of generating a monitoring image will be described here. The monitoring image is generated using infrared images of a monitoring target and the background.
Referring to
The image processing unit 9 sets data extracted from the time-series pixel data D1 by calculating a simple moving average in units of a first predetermined number of frames smaller than K frames for the time-series pixel data D1 as the low-frequency component data D2, and extracts M pieces of low-frequency component data D2 corresponding to the respective M pieces of time-series pixel data D1 (step S2).
The first predetermined number of frames is, for example, 21 frames. A breakdown thereof includes a target frame, consecutive 10 frames before the target frame, and consecutive 10 frames after the target frame. The first predetermined number only needs to be a number capable of extracting the low-frequency component data D2 from the time-series pixel data D1, and may be more than 21 or less than 21, not being limited to 21.
The image processing unit 9 sets data extracted from the time-series pixel data D1 by calculating a simple moving average in units of a third predetermined number (e.g., 3) of frames smaller than the first predetermined number (e.g., 21) for the time-series pixel data D1 as the high-frequency component data D3, and extracts M pieces of high-frequency component data D3 corresponding to the respective M pieces of time-series pixel data D1 (step S3).
The third predetermined number of frames is, for example, three frames. A breakdown thereof includes a target frame, one frame immediately before the target frame, and one frame immediately after the target frame. The third predetermined number only needs to be a number capable of extracting a third frequency component from the time-series pixel data, and may be more than three, not being limited to three.
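A minimal sketch of steps S2 and S3 for one time-series pixel data D1, under the same assumptions (NumPy arrays, centered windows), is shown below; the edge treatment is an assumption, since the embodiment does not specify it.

```python
import numpy as np

def moving_average(series: np.ndarray, window: int) -> np.ndarray:
    # Centered simple moving average over `window` frames: the target frame plus
    # (window - 1) / 2 frames before and after it. Repeating the edge values is
    # an assumed, simplified edge treatment not specified by the embodiment.
    pad = window // 2
    padded = np.pad(series, pad, mode="edge")
    return np.convolve(padded, np.ones(window) / window, mode="valid")

d1 = np.random.rand(300).astype(np.float32)  # time-series pixel data D1 of one pixel (placeholder)

d2 = moving_average(d1, 21)  # step S2: low-frequency component data D2 (21-frame units)
d3 = moving_average(d1, 3)   # step S3: data D3 (3-frame units)
```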
The image processing unit 9 sets data obtained by calculating a difference between the time-series pixel data D1 and the low-frequency component data D2 extracted from the time-series pixel data D1 as difference data D4, and calculates M pieces of difference data D4 corresponding to the respective M pieces of time-series pixel data D1 (step S4).
The image processing unit 9 sets data obtained by calculating a difference between the time-series pixel data D1 and the high-frequency component data D3 extracted from the time-series pixel data D1 as difference data D5, and calculates M pieces of difference data D5 corresponding to the respective M pieces of time-series pixel data D1 (step S5).
The difference data D5 is data obtained by calculating a difference between the time-series pixel data D1 and the high-frequency component data D3 illustrated in
The difference data D4 includes frequency component data indicating a temperature change due to the leaked gas, and the high-frequency component data D3 (data indicating high-frequency noise). The difference data D5 does not include frequency component data indicating a temperature change due to the leaked gas, and includes the high-frequency component data D3.
Since the difference data D4 includes the frequency component data indicating a temperature change due to the leaked gas, the variation in the amplitude and waveform of the difference data D4 becomes larger after the start of the gas ejection at the spot SP1 (90th and subsequent frames). On the other hand, since the difference data D5 does not include the frequency component data indicating a temperature change due to the leaked gas, such a situation does not occur. The difference data D5 repeats a minute amplitude. This is the high-frequency noise.
Although the difference data D4 and the difference data D5 are correlated with each other, they are not completely correlated with each other. That is, in a certain frame, a value of the difference data D4 may be positive and a value of the difference data D5 may be negative or vice versa. Therefore, even if a difference between the difference data D4 and the difference data D5 is calculated, the high-frequency component data D3 cannot be removed. In order to remove the high-frequency component data D3, it is necessary to convert the difference data D4 and the difference data D5 into values such as absolute values that can be subject to subtraction.
In view of the above, the image processing unit 9 sets data obtained by calculating moving standard deviation in units of a second predetermined number of frames smaller than the K frames for the difference data D4 as standard deviation data D6, and calculates M pieces of standard deviation data D6 corresponding to the respective M pieces of time-series pixel data D1 (step S6). Note that moving variance may be calculated instead of the moving standard deviation.
Further, the image processing unit 9 sets data obtained by calculating moving standard deviation in units of a fourth predetermined number of frames smaller than the K frames (e.g., 21) for the difference data D5 as standard deviation data D7, and calculates M pieces of standard deviation data D7 corresponding to the respective M pieces of time-series pixel data D1 (step S7). Moving variance may be used instead of the moving standard deviation.
Since the standard deviation data D6 and the standard deviation data D7 are standard deviation, they do not include negative values. Therefore, the standard deviation data D6 and the standard deviation data D7 can be regarded as data obtained by converting the difference data D4 and the difference data D5 such that they can be subject to subtraction.
The image processing unit 9 sets data obtained by calculating a difference between the standard deviation data D6 and the standard deviation data D7 obtained from the same time-series pixel data D1 as difference data D8, and calculates M pieces of difference data D8 corresponding to the respective M pieces of time-series pixel data D1 (step S8).
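Continuing the sketch, steps S4 to S8 for one time-series pixel data D1 might be written as follows; the 21-frame window used for the moving standard deviation in step S6 and the edge handling are assumptions for illustration.

```python
import numpy as np

def moving_average(series: np.ndarray, window: int) -> np.ndarray:
    # Centered simple moving average; repeating the edge values is an assumed,
    # simplified edge treatment.
    pad = window // 2
    padded = np.pad(series, pad, mode="edge")
    return np.convolve(padded, np.ones(window) / window, mode="valid")

def moving_std(series: np.ndarray, window: int) -> np.ndarray:
    # Moving standard deviation over a centered window (same simplified edges).
    pad = window // 2
    padded = np.pad(series, pad, mode="edge")
    return np.array([padded[i:i + window].std() for i in range(len(series))])

d1 = np.random.rand(300).astype(np.float32)  # time-series pixel data D1 of one pixel (placeholder)

d2 = moving_average(d1, 21)   # step S2: low-frequency component data D2
d3 = moving_average(d1, 3)    # step S3: data D3 (3-frame simple moving average)

d4 = d1 - d2                  # step S4: difference data D4
d5 = d1 - d3                  # step S5: difference data D5

d6 = moving_std(d4, 21)       # step S6: standard deviation data D6 (21-frame window assumed)
d7 = moving_std(d5, 21)       # step S7: standard deviation data D7 (e.g., 21-frame window)

d8 = d6 - d7                  # step S8: difference data D8
# Repeating this for all M pixels and arranging the D8 values frame by frame
# yields the monitoring images generated in step S9 described below.
```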
The image processing unit 9 generates a monitoring image (step S9). That is, the image processing unit 9 generates a video including the M pieces of difference data D8 obtained in step S8. Each frame included in the video is a monitoring image. The monitoring image is an image obtained by visualizing the difference of the standard deviation. The image processing unit 9 outputs the video obtained in step S9 to the display control unit 10. The display control unit 10 displays the video on the display 11. Examples of the monitoring image included in the video include an image I12 illustrated in
Since the image I12 illustrated in
As described above, according to the embodiment, the image processing unit 9 (
Sensor noise depends on temperature, becoming smaller as the temperature becomes higher. In the two-dimensional image sensor 6 (
According to the embodiment, steps S100 to S102 illustrated in
Referring to
The second generation unit 92 performs a process of steps S1 to S9 illustrated in
The monitoring image Im1 is, for example, the image I12 illustrated in
Although the gas region is extracted in the process of steps S1 to S9 illustrated in
Referring to
When the image processing unit 9 determines that the monitoring image video V1 includes the gas region, the image processing device for gas detection 3 makes predetermined notification, thereby notifying the user of the gas detection. When the user determines that the detection may be misdetection, the user operates the input unit 12 to input the first predetermined time period and the second predetermined time period and to input a command to generate the representative image video V2. The first predetermined time period is a period that goes back from the time point at which the gas is detected. The second predetermined time period is a time unit of the monitoring image video V1 to be used for generating a representative image Im2. Here, it is assumed that the first predetermined time period is 24 hours, and the second predetermined time period is 10 seconds. Those are specific examples, and the first predetermined time period and the second predetermined time period are not limited to those values.
The first generation unit 91 obtains, from among the monitoring image videos V1 stored in the second generation unit 92, the monitoring image video V1 up to 24 hours before the time point at which the image processing device for gas detection 3 detects the gas, and divides the 24 hours of the obtained monitoring image video V1 into 10-second intervals. Each 10 seconds corresponds to a part P1 (exemplary second time-series image) of the monitoring image video V1. The part P1 of the monitoring image video V1 includes a plurality of monitoring images Im1 arranged in a time series.
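A possible sketch of this division is shown below; the frame rate of the monitoring image video V1 and its representation as a list of frames are assumptions for the example.

```python
FPS = 30              # assumed frame rate of the monitoring image video V1
SECOND_PERIOD_S = 10  # second predetermined time period (10 seconds)

def split_into_parts(v1_frames):
    """Divide the monitoring image video V1 (a list of monitoring images Im1)
    into consecutive 10-second parts P1."""
    frames_per_part = FPS * SECOND_PERIOD_S
    return [v1_frames[i:i + frames_per_part]
            for i in range(0, len(v1_frames), frames_per_part)]

# usage sketch: 24 hours of frames would yield 8,640 parts P1
# parts = split_into_parts(v1_frames)
```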
Referring to
A specific example of the representative image video V2 is illustrated in
In order to suppress oversight of the gas region, if the gas region is present in at least a part of the 10 seconds, the first generation unit 91 causes the representative image Im2 to include the gas region. A first exemplary method of generating the representative image Im2 will be described. In the first example, the first generation unit 91 sets the maximum value of the values indicated by the pixels positioned in the same order in the plurality of monitoring images Im1 included in the part P1 of the monitoring image video V1 as the value of the pixel positioned in the same order in the representative image Im2, thereby generating the representative image Im2.
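A minimal sketch of the first example, assuming each part P1 is available as a NumPy array of monitoring images Im1, is shown below.

```python
import numpy as np

def representative_max(part_p1: np.ndarray) -> np.ndarray:
    """First example (sketch): the representative image Im2 takes, for each pixel
    position, the maximum value over the monitoring images Im1 of the part P1.
    part_p1 is assumed to be an array of shape (num_frames, height, width)."""
    return part_p1.max(axis=0)

# usage sketch with placeholder data
part_p1 = np.random.rand(300, 240, 320).astype(np.float32)  # 10 s of monitoring images (assumed)
im2 = representative_max(part_p1)
```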
A second exemplary method of generating the representative image Im2 will be described. In the second example, in a case where the part P1 of the monitoring image video V1 includes the gas region, the first generation unit 91 calculates an average luminance value of the gas region for each monitoring image Im1 including the gas region among the plurality of monitoring images Im1 included in the part P1 of the monitoring image video V1.
The first generation unit 91 selects the monitoring image Im1 having the maximum average luminance value of the gas region as a representative image Im2.
A third exemplary method of generating the representative image Im2 will be described. In the third example, an area of the gas region is used instead of the average luminance value of the gas region. In a case where the part P1 of the monitoring image video V1 includes the gas region, the first generation unit 91 calculates an area of the gas region for each monitoring image Im1 including the gas region among the plurality of monitoring images Im1 included in the part P1, and selects the monitoring image Im1 having the maximum gas region area as a representative image Im2.
According to the third example, in a case where the part P1 (second time-series images) of the monitoring image video V1 includes the gas region, the area of the gas region included in the representative image Im2 can be enlarged. Accordingly, the user can easily find the gas region.
In the second and third examples, the first generation unit 91 determines whether or not the part P1 of the monitoring image video V1 includes the gas region, and generates the representative image Im2 including the gas region in the case where the part P1 of the monitoring image video V1 includes the gas region. In the second and third examples, the first generation unit 91 determines that the part P1 of the monitoring image video V1 does not include the gas region in the case where none of the plurality of monitoring images Im1 included in the part P1 of the monitoring image video V1 includes the gas region. In the case where the part P1 of the monitoring image video V1 does not include the gas region, the first generation unit 91 sets a predetermined monitoring image Im1 (an optional monitoring image Im1) among the plurality of monitoring images Im1 included in the part P1 of the monitoring image video V1 as a representative image Im2. The predetermined monitoring image Im1 may be any one (e.g., the top monitoring image Im1) as long as it is one of the plurality of monitoring images Im1 included in the part P1 of the monitoring image video V1.
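The second and third examples, together with the fallback described above, might be sketched as follows; the luminance threshold used here to delimit the gas region is an assumption for illustration, since the gas region is actually obtained by the image processing of the embodiment.

```python
import numpy as np

GAS_THRESHOLD = 0.5  # assumed luminance threshold delimiting the gas region in a monitoring image

def representative_by_gas(part_p1: np.ndarray, criterion: str = "luminance") -> np.ndarray:
    """Second/third examples (sketch): select the monitoring image Im1 whose gas
    region has the largest average luminance ("luminance") or the largest area
    ("area"); fall back to the top image when no frame contains a gas region."""
    best_score, best_frame = None, None
    for frame in part_p1:
        gas_mask = frame > GAS_THRESHOLD        # pixels regarded as the gas region
        if not gas_mask.any():
            continue                            # this frame contains no gas region
        if criterion == "luminance":
            score = float(frame[gas_mask].mean())   # average luminance value of the gas region
        else:
            score = float(gas_mask.sum())           # area (pixel count) of the gas region
        if best_score is None or score > best_score:
            best_score, best_frame = score, frame
    if best_frame is None:
        return part_p1[0]   # no gas region in the part P1: use a predetermined (top) image
    return best_frame
```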
The user views the representative image video V2 (time-series representative images) to grasp the contents of the monitoring image video V1 (first time-series images) in a short time. In a case where there is a second predetermined time period in which no gas region is present among the plurality of second predetermined time periods (10 seconds), it is necessary for the user to recognize that fact. In view of the above, in the case of the part P1 of the monitoring image video V1 corresponding to the second predetermined time period in which no gas region is present (in the case where no gas region is included in that part of the monitoring image video V1), the first generation unit 91 sets a predetermined monitoring image Im1 among the plurality of monitoring images Im1 included in the part P1 of the monitoring image video V1 as a representative image Im2.
As described above, the first generation unit 91 obtains the first time-series images (monitoring image video V1) whose imaging time is the first predetermined time period (24 hours), sets a plurality of the second predetermined time periods (10 seconds) arranged in a time series and included in the first predetermined time period, and generates, for the second time-series images respectively corresponding to the plurality of second predetermined time periods, a representative image Im2 of the second time-series images corresponding to the second predetermined time period and to a part of the first time-series images, thereby generating time-series representative images (representative image video V2).
In the case where the first predetermined time period is 24 hours and the second predetermined time period is 10 seconds, the number of representative images Im2 (frames) included in the representative image video V2 is 8,640 (=24 hours × 60 minutes × 6). When the representative image video V2 is reproduced at a frame rate of 4 fps, the reproduction time is 36 minutes as expressed by the following formula.
8,640 frames ÷ 4 fps = 2,160 seconds = 36 minutes
Note that the second predetermined time period is lengthened when it is desired to further shorten the reproduction time. For example, in the case where the second predetermined time period is 1 minute, the number of representative images Im2 (frames) included in the representative image video V2 is 1,440 (=24 hours×60 minutes). The reproduction time is 6 minutes as expressed by the following formula.
1,440 frames ÷ 4 fps = 360 seconds = 6 minutes
In the case of generating the representative image Im2 using the first example described above, the maximum value of the pixel values during the second predetermined time period is set as the pixel value of the representative image Im2. Therefore, in this case, noise tends to be included in the representative image Im2 when the second predetermined time period is lengthened.
Referring to
In addition, since the representative image Im2 is an image that represents the part P1 of the monitoring image video V1, the number of the representative images Im2 included in the representative image video V2 is smaller than the number of the monitoring images Im1 included in the monitoring image video V1. Therefore, the reproduction time of the representative image video V2 can be made shorter than that of the monitoring image video V1.
In this manner, according to the embodiment, the user can grasp the contents of the time-series images (monitoring image video V1) in a short time.
In a case where the part P1 of the monitoring image video V1 includes the gas region, the first generation unit 91 generates a representative image Im2 including the gas region. Therefore, according to the embodiment, oversight of the gas region can be suppressed.
As described above, according to the embodiment, the user can grasp the contents of the monitoring image video V1 in a short time without missing the gas region included in the image. Therefore, effects similar to the effects obtained by digest reproduction of the monitoring image video V1 can be obtained.
A service is conceivable in which the gas detection system 1 is used to monitor a gas monitoring target (e.g., gas piping in a gas plant) for a long period of time and facts that occurred during the period are provided to the user. If the representative image video V2 is stored in a cloud computing storage, a service provider is not required to visit the site where the gas monitoring target is located. In the case of using cloud computing, it is not realistic, from the viewpoint of data capacity and bandwidth, to continuously upload all the data of the monitoring image video V1 to the cloud, and it is preferable to reduce the data volume. As described above, since the number of the representative images Im2 included in the representative image video V2 is smaller than the number of the monitoring images Im1 included in the monitoring image video V1, the data volume of the representative image video V2 can be made smaller than that of the monitoring image video V1.
A first variation of the embodiment will be described. In the embodiment, as illustrated in
Referring to
In the first variation, the first generation unit 91 generates a representative image Im2 using a part P1 of the monitoring image video V1 corresponding to each 10 seconds. This is similar to the embodiment.
The total period of the plurality of divided periods (each 2 minutes) is the same length as the first predetermined time period (24 hours). According to the first variation, since the second predetermined time period (10 seconds) is shorter than the divided period, the plurality of second predetermined time periods is set at predetermined intervals. According to the first variation, for the same length of the second predetermined time period, the number of the representative images Im2 can be made smaller than in the aspect in which the plurality of second predetermined time periods is set to be continuous. Therefore, even if the first predetermined time period is long, the contents of the monitoring image video V1 can be roughly grasped without increasing the reproduction time of the representative image video V2.
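A possible sketch of the first variation is shown below, under the same assumed frame-rate and list-of-frames representation; placing each 10-second period at the head of its divided period is also an assumption for illustration.

```python
FPS = 30                # assumed frame rate of the monitoring image video V1
DIVIDED_PERIOD_S = 120  # divided period (2 minutes)
SECOND_PERIOD_S = 10    # second predetermined time period (10 seconds)

def spaced_parts(v1_frames):
    """First variation (sketch): take one 10-second part P1 per 2-minute divided
    period; here each part is taken from the head of its divided period, which
    is an assumed placement."""
    frames_per_divided = FPS * DIVIDED_PERIOD_S
    frames_per_part = FPS * SECOND_PERIOD_S
    return [v1_frames[start:start + frames_per_part]
            for start in range(0, len(v1_frames), frames_per_divided)]
```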
A second variation of the embodiment will be described.
Referring to
The first generation unit 91 determines whether or not the gas region is included in the monitoring image video V1 in the divided period T2. No gas region is assumed to be included in the monitoring image video V1 in the divided period T2. The first generation unit 91 sets a predetermined monitoring image Im1 as a representative image Im2 among a plurality of monitoring images Im1 belonging to the divided period T2. For example, the first monitoring image Im1 is set as a representative image Im2.
The first generation unit 91 determines whether or not the gas region is included in the monitoring image video V1 in the divided period T3. The gas region is assumed to be included in the monitoring image video V1 in the divided period T3. The first generation unit 91 sets a second predetermined time period (10 seconds) in the period in which the gas region first appears in the divided period T3. The first generation unit 91 generates a representative image Im2 using the part P1 of the monitoring image video V1 corresponding to the second predetermined time period.
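A sketch of this selection within one divided period is shown below; the threshold used to decide that a frame contains a gas region and the frame-rate constant are assumptions for illustration.

```python
import numpy as np

FPS = 30                # assumed frame rate of the monitoring image video V1
SECOND_PERIOD_S = 10    # second predetermined time period (10 seconds)
GAS_THRESHOLD = 0.5     # assumed threshold for deciding that a frame contains a gas region

def part_for_divided_period(divided_frames: np.ndarray) -> np.ndarray:
    """Second variation (sketch): within one divided period, place the 10-second
    second predetermined time period where a gas region first appears; if no gas
    region appears, use the head of the divided period."""
    frames_per_part = FPS * SECOND_PERIOD_S
    has_gas = (divided_frames > GAS_THRESHOLD).any(axis=(1, 2))  # per-frame gas presence
    gas_indices = np.flatnonzero(has_gas)
    start = int(gas_indices[0]) if gas_indices.size else 0
    start = min(start, max(len(divided_frames) - frames_per_part, 0))  # keep the window in range
    return divided_frames[start:start + frames_per_part]
```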
According to the second variation, for each divided period, a representative image Im2 including no gas region is generated when no gas region is present throughout the divided period, and a representative image Im2 including the gas region is generated when the gas region is present in at least a part of the divided period. Therefore, in a case where the gas region is present in a part of the divided period, oversight of the gas region can be suppressed.
The second variation is premised on determination on whether or not the gas region is included in the monitoring image video V1. Accordingly, in the second variation, the above-described second exemplary method of generating the representative image Im2 (the representative image Im2 is determined on the basis of an average luminance value of the gas region) or the third example (the representative image Im2 is determined on the basis of an area of the gas region) is applied.
A third variation will be described. In the third variation, a gas region is colored.
An image processing unit 9 of the gas detection system 1a includes a color processing unit 93. The color processing unit 93 performs image processing of colorizing the gas region. The monitoring images Im1 illustrated in
The color processing unit 93 colorizes the gas region according to a luminance value of each pixel included in the cut out gas region. The color processing unit 93 regards a pixel having a luminance value equal to or less than a predetermined threshold value as noise, and does not color the pixel. Accordingly, the color processing unit 93 colors pixels having luminance values exceeding the predetermined threshold value.
The color processing unit 93 sets three adjacent pixels as one set in the cut out gas region, and calculates an average value of the luminance values of those pixels. This average value is to be the original luminance value. For example, when the average value (original luminance value) is 63, the color processing unit 93 sets, among the three pixels included in the set, the luminance value of the pixel corresponding to R to 0, the luminance value of the pixel corresponding to G to 255, and the luminance value of the pixel corresponding to B to 255. The color processing unit 93 performs a similar process on the other sets as well. Accordingly, the gas region is colorized. When the gas concentration is high, the luminance value (pixel value) of each pixel included in the gas region is relatively large, whereby the gas region has a larger red area. When the gas concentration is low, the luminance value (pixel value) of each pixel included in the gas region is relatively small, whereby the gas region has a larger blue area.
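A rough sketch of such a colorization is shown below; the jet-like mapping and the noise threshold value are assumptions, and the embodiment's exact correspondence between luminance and color is only approximated.

```python
import numpy as np

NOISE_THRESHOLD = 10  # assumed: luminance values at or below this are treated as noise and left uncolored

def colorize_gas_region(gas_luminance: np.ndarray) -> np.ndarray:
    """Sketch of the colorization: pixels above the noise threshold are mapped
    with a jet-like blue-to-red scale (low luminance bluish, high luminance
    reddish). The exact colormap of the embodiment (e.g., an original luminance
    value of 63 becoming cyan) is only approximated here."""
    t = np.clip(gas_luminance / 255.0, 0.0, 1.0)
    r = np.clip(1.5 - np.abs(4 * t - 3), 0.0, 1.0)
    g = np.clip(1.5 - np.abs(4 * t - 2), 0.0, 1.0)
    b = np.clip(1.5 - np.abs(4 * t - 1), 0.0, 1.0)
    rgb = (np.stack([r, g, b], axis=-1) * 255).astype(np.uint8)
    rgb[gas_luminance <= NOISE_THRESHOLD] = 0  # noise pixels remain uncolored
    return rgb
```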
The color processing unit 93 colorizes the gas region for each of the gas regions included in the 2nd to 16th monitoring images Im1 in a similar manner.
The color processing unit 93 combines the colorized gas region (hereinafter referred to as a colored gas region) with a visible image Im3. More specifically, the color processing unit 93 obtains, from the moving image data md, a frame (visible image Im3) captured at the same time as the monitoring image Im1 illustrated in
The visible image Im3 is a color image. The colored gas region R2 is combined near the center (spot SP1 in
Video of the visible images Im3 in which the colored gas region R2 is combined in this manner is referred to as a visible image video V3.
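A minimal sketch of this combination is shown below, assuming both images are available as NumPy arrays of the same size; simple replacement of the visible pixels by the colored gas pixels is an assumption, and blending could be used instead.

```python
import numpy as np

def overlay_colored_gas(visible_im3: np.ndarray, colored_gas_r2: np.ndarray) -> np.ndarray:
    """Sketch of combining the colored gas region R2 with the visible image Im3.
    Both are assumed to be RGB arrays of the same size; the colored gas pixels
    simply replace the visible pixels here, though alpha blending could be used."""
    combined = visible_im3.copy()
    mask = colored_gas_r2.any(axis=-1)   # pixels belonging to the colored gas region
    combined[mask] = colored_gas_r2[mask]
    return combined
```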
Referring to
A specific example of the representative image video V4 is illustrated in
In order to suppress oversight of the colored gas region R2, if the colored gas region R2 is present in at least a part of 10 seconds, the first generation unit 91 causes the representative image Im4 to include the colored gas region R2. A method of generating the representative image Im4 will be described. Referring to
The first generation unit 91 determines that the part P2 of the visible image video V3 does not include the colored gas region R2 in the case where none of the plurality of visible images Im3 included in the part P2 of the visible image video V3 includes the colored gas region R2. In the case where the part P2 of the visible image video V3 does not include the colored gas region R2, the first generation unit 91 sets a predetermined visible image Im3 among the plurality of visible images Im3 included in the part P2 of the visible image video V3 as a representative image Im4. The predetermined visible image Im3 may be any one (e.g., the top visible image Im3) as long as it is one of the plurality of visible images Im3 included in the part P2 of the visible image video V3.
The third variation includes the following second mode in addition to the first mode as described above. The first generation unit 91 and the second generation unit 92 illustrated in
As described above, according to the third variation, the gas region included in the representative image Im4 is colorized (colored gas region R2), whereby the gas region can be highlighted. Accordingly, the user can easily find the gas region.
The third variation can be combined with the first variation illustrated in
Although the color visible image Im3 has been described as an example of the background of the colored gas region R2 in the third variation, a grayscale visible image Im3 may be used as the background. In addition, an infrared image captured by an infrared camera 2 may be used as the background. The visible camera 13 is not required in the mode of using the infrared image as the background.
An image processing device for gas detection according to a first aspect of an embodiment includes a first generation unit that generates time-series representative images by obtaining first time-series images whose imaging time is a first predetermined time period, setting a plurality of second predetermined time periods arranged in a time series and included in the first predetermined time period, and performing, on a plurality of second time-series images respectively corresponding to the plurality of second predetermined time periods, generation of a representative image of the second time-series images corresponding to the second predetermined time periods and to a part of the first time-series images, in which, in a case where the representative image is generated using the second time-series images including a gas region, the first generation unit generates the representative image including the gas region, and a display control unit that displays, on a display, a plurality of the representative images included in the time-series representative images in a time-series order is further provided.
In the first time-series images, a gas monitoring target (e.g., a gas pipe of a gas plant) is captured. The first time-series images may be time-series images having been subject to image processing of extracting a gas region, or may be time-series images not having been subject to such image processing. In the latter case, for example, in a case where liquefied natural gas leaks from a gas pipe, a misty image (gas region) is included in the first time-series image even if the image processing of extracting the gas region is not performed. The image processing of extracting the gas region is not limited to the image processing described in the embodiment, and may be publicly known image processing.
Of the first predetermined time period (first predetermined time period > second predetermined time period), the first time-series image includes the gas region during the period in which the gas to be detected appears or during the period in which an event causing misdetection occurs. Of the first predetermined time period, the first time-series image does not include the gas region during the period in which the gas to be detected does not appear and the event causing the misdetection does not occur.
The representative image is an image representing the second time-series image (a part of the first time-series images). The time-series representative images include a plurality of representative images arranged in a time series. The display control unit displays the plurality of representative images on a display in a time-series order (reproduces the time-series representative images). Therefore, the user can grasp the contents of the first time-series images by viewing those representative images.
In addition, since the representative image is an image representing the second time-series image that is a part of the first time-series images, the number of the representative images included in the time-series representative images is smaller than the number of images included in the first time-series images. Therefore, the time-series representative images can have a shorter reproduction time than the first time-series images.
As described above, according to the image processing device for gas detection of the first aspect of the embodiment, the user can grasp the contents of the time-series images (first time-series images) in a short time.
The first generation unit generates a representative image including the gas region in the case of the second time-series image including the gas region. Therefore, according to the image processing device for gas detection of the first aspect of the embodiment, oversight of the gas region can be suppressed.
The image processing device for gas detection according to the first aspect of the embodiment includes a first mode for determining whether or not the second time-series image includes the gas region, and a second mode for not determining whether or not the second time-series image includes the gas region. In the second mode, a representative image including the gas region is generated as a result if the second time-series image includes the gas region, and a representative image not including the gas region is generated as a result if the second time-series image does not include the gas region.
In the configuration described above, a processing unit for performing image processing of colorizing the gas region is further provided.
According to this configuration, the gas region is colorized, whereby the gas region can be highlighted. Accordingly, the user can easily find the gas region. The gas region may be colorized at the stage of the first time-series images (processing of colorizing the gas region may be performed on a plurality of images included in the first time-series images), or the gas region may be colorized at the stage of the time-series representative images (processing of colorizing the gas region may be performed on a plurality of representative images included in the time-series representative images).
In the above configuration, in a case where the second time-series image includes the gas region, the first generation unit calculates an area of the gas region for each image including the gas region among a plurality of images included in the second time-series images, and selects the image having the maximum gas region area as the representative image.
This configuration is the first mode mentioned above. According to this configuration, in a case where the second time-series image includes the gas region, the area of the gas region included in the representative image can be enlarged. Accordingly, the user can easily find the gas region.
In the above configuration, in a case where the second time-series image includes the gas region, the first generation unit calculates an average luminance value of the gas region for each image including the gas region among the plurality of images included in the second time-series images, and selects the image having the maximum average luminance value of the gas region as the representative image.
This configuration is the first mode mentioned above. According to this configuration, in a case where the second time-series image includes the gas region, the average luminance value of the gas region included in the representative image can be increased. Accordingly, the user can easily find the gas region.
In the above configuration, in a case where the second time-series image does not include the gas region, the first generation unit selects a predetermined image among the plurality of images included in the second time-series images as the representative image.
This configuration is the first mode mentioned above. As described above, the user views the time-series representative images to grasp the contents of the first time-series images in a short time. Accordingly, in a case where there is a second predetermined time period in which no gas region is present among the plurality of second predetermined time periods, it is necessary for the user to recognize that fact. In view of the above, in the case of the second time-series image corresponding to the second predetermined time period in which no gas region is present (in a case where the second time-series image does not include the gas region), the first generation unit sets a predetermined image (an optional image) among the plurality of images included in the second time-series images as a representative image. The predetermined image may be any one (e.g., the top image) as long as it is one of the plurality of images included in the second time-series images.
In the above configuration, the first generation unit sets a plurality of divided periods obtained by dividing the first predetermined time period, and sets the second predetermined time period included in the divided period and shorter than the divided period for each of the plurality of divided periods.
The total period of the plurality of divided periods is the same length as the first predetermined time period. According to this configuration, since the second predetermined time period is shorter than the divided period, a plurality of second predetermined time periods is to be set at predetermined intervals. According to this configuration, the number of the representative images can be made smaller in the case where the second predetermined time period has the same length compared with the aspect in which the plurality of second predetermined time periods is set to be continuous. Therefore, according to this configuration, even if the first predetermined time period is long, the contents of the first time-series images can be roughly grasped without increasing the reproduction time of the time-series representative images. This configuration is effective in the case where the first predetermined time period is long (e.g., one day).
In the above configuration, in a case where there is a period in which the gas region is present in the divided period, the first generation unit sets the period as the second predetermined time period.
There may be a gas region in a part of a divided period instead of the entire period thereof. When the second predetermined time period is set in a period in which the gas region is present, the first generation unit generates a representative image including the gas region, and when the second predetermined time period is set in a period in which no gas region is included, it generates a representative image including no gas region. This configuration gives priority to the former case. Accordingly, for each divided period, the first generation unit generates a representative image including no gas region when no gas region is present throughout the divided period, and generates a representative image including the gas region when the gas region is present in at least a part of the divided period. According to this configuration, oversight of the gas region can be suppressed in the case where the gas region is present in at least a part of the divided period.
In the above configuration, the first generation unit sets the maximum value of the values indicated by the pixels positioned in the same order in the plurality of images included in the second time-series images as a value of the pixel positioned in the same order in the representative image, thereby generating the representative image.
Since the values indicated by the pixels included in the gas region are relatively large, the region including the pixels having relatively large values is the gas region. This configuration is the second mode mentioned above, and a representative image is generated without determining whether or not the gas region is included in the second time-series image. According to this configuration, in a case where the second time-series image includes the gas region, the gas region included in the representative image is to be a gas region indicating a logical sum of the gas regions included in the respective images included in the second time-series images. Therefore, it has been found out that, in a case where the gas fluctuates due to a change in the wind direction or the like, the area of the gas region included in the representative image can be enlarged. In such a case, the user can easily find the gas region.
In the above configuration, the first generation unit sets a plurality of divided periods obtained by dividing the first predetermined time period, and sets the second predetermined time period included in the divided period and shorter than the divided period for each of the plurality of divided periods.
According to this configuration, even if the first predetermined time period is long, the contents of the first time-series images can be roughly grasped without increasing the reproduction time of the time-series representative images. This configuration is effective in the case where the first predetermined time period is long (e.g., one day).
In the above configuration, a processing unit for performing image processing of colorizing the gas region is further provided in the case where the representative image includes the gas region.
This configuration determines whether or not the representative image includes the gas region, and colorizes the gas region in the case where the representative image includes the gas region. Therefore, the gas region can be highlighted according to this configuration.
In the above configuration, there is further provided a second generation unit that generates the first time-series images by performing image processing of extracting the gas region on third time-series images captured during the first predetermined time period.
According to this configuration, the time-series images having been subject to the image processing of extracting the gas region are to be the first time-series images.
An image processing method for gas detection according to a second aspect of the embodiment includes a first generation step of generating time-series representative images by obtaining first time-series images whose imaging time is a first predetermined time period, setting a plurality of second predetermined time periods arranged in a time series and included in the first predetermined time period, and performing, on a plurality of second time-series images respectively corresponding to the plurality of second predetermined time periods, generation of a representative image of the second time-series images corresponding to the second predetermined time periods and to a part of the first time-series images, in which, in a case where the representative image is generated using the second time-series images including a gas region, the first generation step generates the representative image including the gas region, and a display control step of displaying, on a display, a plurality of the representative images included in the time-series representative images in a time-series order is further provided.
The image processing method for gas detection according to the second aspect of the embodiment defines the image processing device for gas detection according to the first aspect of the embodiment from the viewpoint of a method, and exerts effects similar to those of the image processing device for gas detection according to the first aspect of the embodiment.
An image processing program for gas detection according to a third aspect of the embodiment causes a computer to perform a first generation step of generating time-series representative images by obtaining first time-series images whose imaging time is a first predetermined time period, setting a plurality of second predetermined time periods arranged in a time series and included in the first predetermined time period, and performing, on a plurality of second time-series images respectively corresponding to the plurality of second predetermined time periods, generation of a representative image of the second time-series images corresponding to the second predetermined time periods and to a part of the first time-series images, in which, in a case where the representative image is generated using the second time-series images including a gas region, the first generation step generates the representative image including the gas region, and the program further causing a computer to perform a display control step of displaying, on a display, a plurality of the representative images included in the time-series representative images in a time-series order.
The image processing program for gas detection according to the third aspect of the embodiment defines the image processing device for gas detection according to the first aspect of the embodiment from the viewpoint of a program, and exerts effects similar to those of the image processing device for gas detection according to the first aspect of the embodiment.
Although the embodiment of the present invention has been illustrated and described in detail, it is illustrative only and does not limit the present invention. The scope of the present invention should be construed on the basis of the description of the appended claims.
This application is based on Japanese patent application No. 2017-181283 filed on September 21, 2017, the entire disclosure of which is hereby incorporated by reference.
According to the present invention, it is possible to provide an image processing device for gas detection, an image processing method for gas detection, and an image processing program for gas detection that enable a user to grasp contents of a time-series image in a short time without missing a gas region included in the image.