The present disclosure relates to a deterioration detection apparatus, a deterioration detection system, a deterioration detection method, and a program.
Infrastructure equipment such as conduits is attached to the lateral or back sides of outdoor structures such as bridges, in order to allow liquids, gases, communication cables, and the like to pass through. Companies and local governments that own such infrastructure equipment periodically inspect the conduits and the attachment members that fasten them to bridges, checking for deterioration such as rusting.
Conventionally, inspection has been performed through close visual examination, in which a scaffold for inspection is installed on the above-mentioned structures and a worker approaches and inspects the equipment. However, inspection through close visual examination incurs the cost of installing a scaffold and makes it difficult to ensure worker safety during work at height. In view of this, inspection methods have been proposed in recent years in which an unmanned drone captures an image of the equipment, and deterioration of the equipment is efficiently detected from the captured image using image processing technology. For example, NPL 1 discloses a technique for dividing a captured image into rectangular regions using an image classification technique based on deep learning (CNN: Convolutional Neural Network), and automatically determining whether or not there is deterioration in each of the divided rectangular regions.
However, an image of equipment captured by an unmanned drone includes elements other than the equipment that is the inspection target, such as a tree, a river, a vehicle, a pedestrian, a sign, a road, and a building. Therefore, there has been a problem in that, with conventional techniques, it is difficult to accurately detect deterioration with focus on the target equipment based on such a captured image.
An object of the present disclosure that has been made with the foregoing in view is to provide a deterioration detection apparatus, a deterioration detection system, a deterioration detection method, and a program for enabling deterioration of equipment to be accurately detected based on a captured image.
A deterioration detection apparatus according to an embodiment of the present invention is a deterioration detection apparatus that detects deterioration of equipment attached to a structure, and includes an equipment region extraction unit configured to extract a region in which the equipment is present, based on a captured image of the equipment, and a deterioration region detection unit configured to detect a deterioration region of the equipment based on the region in which the equipment is present.
A deterioration detection system according to an embodiment of the present invention is a deterioration detection system that detects deterioration of equipment attached to a structure, and includes the above deterioration detection apparatus, an image capturing apparatus configured to capture an image of the equipment, and a server apparatus configured to store the deterioration region.
A deterioration detection method according to an embodiment of the present invention is a deterioration detection method for detecting deterioration of equipment attached to a structure, and includes a step of capturing an image of the equipment, a step of extracting a region in which the equipment is present, based on the captured image, and detecting a deterioration region of the equipment based on the region in which the equipment is present, and a step of storing the deterioration region.
A program according to an embodiment of the present invention causes a computer to function as the deterioration detection apparatus.
According to the present disclosure, it is possible to provide a deterioration detection apparatus, a deterioration detection system, a deterioration detection method, and a program for enabling deterioration of equipment to be accurately detected based on a captured image.
Embodiments of the present invention will be described below in detail with reference to the drawings.
An exemplary configuration of a deterioration detection system 1 according to an embodiment of the present invention will be described with reference to the drawings.
The deterioration detection system 1 is a system that detects deterioration V of equipment 3 attached to a structure 2 based on a captured image (moving image, still image) of the equipment 3, using deep learning. Examples of the structure 2 include bridges. Examples of the equipment 3 include conduits, attachment members for attaching conduits to a bridge, and the like.
As shown in the figure, the deterioration detection system 1 includes an image capturing apparatus 100, a deterioration detection apparatus 200, and a server apparatus 300.
The image capturing apparatus 100 is an unmanned aerial vehicle, a telephoto camera, or the like. The image capturing apparatus 100 captures images of the equipment 3. It is sufficient for the image capturing apparatus 100 to have a function of optically capturing an image of the equipment 3, and there is no particular limitation on its configuration. The image capturing apparatus 100 transmits image data of a captured image to the deterioration detection apparatus 200. Note that the captured image includes not only the equipment 3 but also elements other than the equipment 3 that is the inspection target, such as a tree, a river, a vehicle, a pedestrian, a sign, a road, and a building.
The deterioration detection apparatus 200 is, for example, a mobile phone such as a smartphone, a tablet terminal, or a notebook PC (personal computer) used by the worker U. The deterioration detection apparatus 200 receives image data of a captured image from the image capturing apparatus 100. The deterioration detection apparatus 200 extracts a region in which the equipment 3 is present based on the captured image, and detects a deterioration region of the equipment 3 based on the region in which the equipment 3 is present, as will be described later in detail. The deterioration detection apparatus 200 transmits detection data of the deterioration region of the equipment 3 to the server apparatus 300 via a network.
The server apparatus 300 receives the detection data of the deterioration region of the equipment 3 from the deterioration detection apparatus 200 via the network. The server apparatus 300 stores the detection data of the deterioration region of the equipment 3.
An exemplary configuration of the deterioration detection apparatus 200 according to the present embodiment will be described with reference to the drawings.
As shown in the figure, the deterioration detection apparatus 200 includes an input unit 210, a control unit 220, a storage unit 230, an output unit 240, and a communication unit 250. The control unit 220 includes an equipment region extraction unit 221 and a deterioration region detection unit 222.
Various types of information are input to the input unit 210. The input unit 210 may be any device that enables the worker U to perform a predetermined operation, such as a microphone, a touch panel, a keyboard, or a mouse. When the worker U performs a predetermined operation using the input unit 210, for example, image data of an image of the equipment 3 captured by the image capturing apparatus 100 is input to the equipment region extraction unit 221. The input unit 210 may be formed integrally with the deterioration detection apparatus 200, or may be provided separately.
The control unit 220 may be constituted by dedicated hardware, or may also be constituted by a general-purpose processor or a processor specialized for specific processing.
The equipment region extraction unit 221 extracts a region in which the equipment 3 is present from the image data of the captured image input through the input unit 210, using an image classification technique based on a CNN, a deep learning technique. Examples of the model include VGG16, but there is no limitation thereto. The equipment region extraction unit 221 outputs extraction data of the region in which the equipment 3 is present to the deterioration region detection unit 222.
The following document can be referred to for details of VGG16, for example.
Karen Simonyan, Andrew Zisserman (2014), Very Deep Convolutional Networks for Large-Scale Image Recognition, arXiv:1409.1556 [cs.CV].
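As a concrete illustration, the following is a minimal sketch of how a VGG16-based classifier could score an image region; the two-class output head, the preprocessing constants, and the function names are illustrative assumptions, not part of the disclosure.

```python
import torch
import torch.nn as nn
from torchvision import models, transforms

# Sketch: VGG16 backbone with a two-class head ("equipment" / "not equipment").
# In practice the head would be fine-tuned on labeled image regions.
model = models.vgg16(weights=models.VGG16_Weights.IMAGENET1K_V1)
model.classifier[6] = nn.Linear(4096, 2)  # replace the 1000-class ImageNet head
model.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),  # VGG16 expects 224 x 224 inputs
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def equipment_score(region_image):
    """Return a 0-1 score estimating whether the equipment appears
    in the given PIL image of a rectangular region."""
    x = preprocess(region_image).unsqueeze(0)
    with torch.no_grad():
        logits = model(x)
    return torch.softmax(logits, dim=1)[0, 1].item()
```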
The equipment region extraction unit 221 will be described in detail.
As shown in the figure, the equipment region extraction unit 221 includes a rectangular region division unit 2211, a rectangular region displacement unit 2212, a score calculation unit 2213, and a determination unit 2214. The rectangular region division unit 2211 divides the captured image I (height: H pixels, width: W pixels) into a plurality of rectangular regions R (height: h pixels, width: w pixels).
Specifically, the rectangular region division unit 2211 divides the captured image I into A×B (= (W/w) × (H/h)) rectangular regions R by marking off rectangular regions R in the captured image I while moving A (= W/w) times in the X direction and B (= H/h) times in the Y direction. For example, the rectangular region division unit 2211 divides the captured image I into 48 (= 8 × 6) rectangular regions R by moving eight times in the X direction and six times in the Y direction. Note that the size of each rectangular region R (height: h, width: w), the number of rectangular regions R (A×B), and the like may be set as appropriate.
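The division step can be pictured with a short sketch; the array layout and the names below are assumptions for illustration.

```python
import numpy as np

def divide_into_tiles(image: np.ndarray, h: int, w: int) -> dict:
    """Divide a captured image I (height H, width W) into A x B
    rectangular regions R (height h, width w), keyed by grid
    coordinates P(i, j). Assumes W and H are multiples of w and h."""
    H, W = image.shape[:2]
    A, B = W // w, H // h  # A columns (X direction), B rows (Y direction)
    # e.g., A = 8 and B = 6 give the 48 regions mentioned above
    return {(i, j): image[j * h:(j + 1) * h, i * w:(i + 1) * w]
            for j in range(B) for i in range(A)}
```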
As shown in the figure, the rectangular region displacement unit 2212 displaces each rectangular region R in the X direction and/or the Y direction such that a portion thereof overlaps the original region, and thereby generates displaced rectangular regions R′ corresponding to the rectangular region R.
Hereinafter, in the present specification, "1/2 displacement" means displacing the rectangular region R by w/2 in the X direction a predetermined number of times, or displacing the rectangular region R by h/2 in the Y direction a predetermined number of times (see the open arrows in the drawings). The same notation applies to "1/3 displacement" and, more generally, "1/n displacement".
As shown in the figure, in the case of 1/2 displacement, the rectangular region displacement unit 2212 generates a displaced rectangular region R′1(1/2) by displacing the rectangular region R by w/2 in the X direction, a displaced rectangular region R′2(1/2) by displacing the rectangular region R by h/2 in the Y direction, and a displaced rectangular region R′3(1/2) by displacing the rectangular region R by w/2 in the X direction and by h/2 in the Y direction.
At this time, an overlapping region X(1/2) in which the rectangular region R, the displaced rectangular region R′1(1/2), the displaced rectangular region R′2(1/2), and the displaced rectangular region R′3(1/2) all overlap is generated. The size of the overlapping region X(1/2) can be expressed as height: h/2 (pixels) and width: w/2 (pixels), for example.
As shown in the figure, in the case of 1/3 displacement, the rectangular region displacement unit 2212 generates eight displaced rectangular regions R′1(1/3) to R′8(1/3) by displacing the rectangular region R by w/3 in the X direction once or twice, by h/3 in the Y direction once or twice, and by combinations thereof.
At this time, an overlapping region X(1/3) in which the rectangular region R and the displaced rectangular regions R′1(1/3) to R′8(1/3) all overlap is generated. The size of the overlapping region X(1/3) can be expressed as height: h/3 (pixels) and width: w/3 (pixels), for example.
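One way to realize the 1/2 and 1/3 displacement described above is sketched below: offsets of w/n and h/n in the X and Y directions generate the n² − 1 displaced rectangular regions R′ for a given region R (three regions for n = 2, eight for n = 3). The function and variable names are illustrative.

```python
def displaced_regions(x0: int, y0: int, w: int, h: int, n: int) -> list:
    """Given the top-left corner (x0, y0) of a rectangular region R,
    return the n*n - 1 displaced regions R' obtained by shifting R by
    k * w/n in the X direction and/or l * h/n in the Y direction
    (k, l = 0 .. n-1, excluding the unshifted case k = l = 0)."""
    regions = []
    for l in range(n):          # number of h/n displacements in Y
        for k in range(n):      # number of w/n displacements in X
            if k == 0 and l == 0:
                continue        # (0, 0) is the original region R itself
            regions.append((x0 + k * w // n, y0 + l * h // n, w, h))
    return regions
```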
In addition, when generating the displaced rectangular regions R′, the rectangular region displacement unit 2212 determines the number of displaced rectangular regions R′ and their positions, in other words, their two-dimensional orthogonal coordinates P(x, y) on the xy plane.
When the coordinates of the rectangular region R are denoted by P(i, j), the coordinates of the displaced rectangular region R′ generated by displacing the rectangular region R by w/n in the X direction k times can be expressed as P(1/n)(i+k, j). Similarly, the coordinates of the displaced rectangular region R′ generated by displacing the rectangular region R by h/n in the Y direction l times can be expressed as P(1/n)(i, j+l), and the coordinates of the displaced rectangular region R′ generated by displacing the rectangular region R by w/n in the X direction k times and by h/n in the Y direction l times can be expressed as P(1/n)(i+k, j+l).
Coordinate examples of the rectangular region R and the displaced rectangular regions R′ for 1/2 displacement and for 1/3 displacement are shown in the drawings.
The score calculation unit 2213 calculates a score S1 (first score) indicating whether or not the equipment 3 is present in the rectangular region R and a score S2 (second score) indicating whether or not the equipment 3 is present in each displaced rectangular region R′, using an image classification technique based on a CNN, a deep learning technique. VGG16 is used as the model for learning, for example. The score S1 in the rectangular region R and the scores S2 in the displaced rectangular regions R′ are numerical values from 0 to 1, calculated as estimated values.
The score calculation unit 2213 then calculates a score S3 (third score) indicating whether or not the equipment 3 is present in the overlapping region X, based on the score S1 in the rectangular region R and the score S2 in the displaced rectangular region R′. The number of scores S2 in the displaced rectangular regions R′ matches the number of displaced rectangular regions R′. When three displaced rectangular regions R′ are generated by the rectangular region displacement unit 2212 for example, the score calculation unit 2213 calculates the score S3 in the overlapping region X based on four scores in total, namely the score S1 in the rectangular region R and three scores S2 in the three displaced rectangular regions R′. When eight displaced rectangular regions R′ are generated by the rectangular region displacement unit 2212 for example, the score calculation unit 2213 calculates the score S3 in the overlapping region X based on nine scores in total, namely the score S1 in the rectangular region R and eight scores S2 in the eight displaced rectangular regions R′.
The score calculation unit 2213 may calculate the weighted average of the score S1 and the scores S2 and use it as the score S3, for example. Alternatively, it may calculate the geometric average of the score S1 and the scores S2 and use that as the score S3. It may also take the minimum value or the maximum value of the score S1 and the scores S2 and use the result as the score S3. Note that the method for calculating the score S3 is not limited to these calculation methods.
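The computation F could be sketched as follows; the selectable-method interface and the default equal weights are assumptions for illustration.

```python
import math

def combine_scores(s1: float, s2_list: list, method: str = "weighted",
                   weights: list = None) -> float:
    """Combine the score S1 of a rectangular region R with the scores S2
    of its displaced regions R' into the score S3 of the overlapping
    region X, using a weighted average, geometric average, min, or max."""
    scores = [s1] + list(s2_list)
    if method == "weighted":
        weights = weights or [1.0 / len(scores)] * len(scores)
        return sum(a * s for a, s in zip(weights, scores))
    if method == "geometric":
        return math.prod(scores) ** (1.0 / len(scores))
    if method == "min":
        return min(scores)
    if method == "max":
        return max(scores)
    raise ValueError(f"unknown method: {method}")
```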
As shown in the figure, in the case of 1/2 displacement, the score calculation unit 2213 calculates the score S1(1/2)(i, j) in the rectangular region R(1/2), the score S2(1/2)(i+1, j) in the displaced rectangular region R′1(1/2), the score S2(1/2)(i, j+1) in the displaced rectangular region R′2(1/2), and the score S2(1/2)(i+1, j+1) in the displaced rectangular region R′3(1/2).
Furthermore, the score calculation unit 2213 calculates a score S3(1/2) in the overlapping region X(1/2) based on the score S1(1/2)(i, j) in the rectangular region R(1/2), the score S2(1/2)(i+1, j) in the displaced rectangular region R′1(1/2), the score S2(1/2)(i, j+1) in the displaced rectangular region R′2(1/2), and the score S2(1/2)(i+1, j+1) in the displaced rectangular region R′3(1/2), using the following expression:

S3(1/2) = F(S1(1/2)(i, j), S2(1/2)(i+1, j), S2(1/2)(i, j+1), S2(1/2)(i+1, j+1)) (Expression 1)

F in Expression 1 indicates computation of a weighted average, a geometric average, a minimum value, a maximum value, or the like.
If F indicates computation of a weighted average, for example, the score calculation unit 2213 calculates the score S3(1/2) in the overlapping region X(1/2) based on these four scores using Expression 2 below, where a, b, c, and d each indicate a weight:

S3(1/2) = a × S1(1/2)(i, j) + b × S2(1/2)(i+1, j) + c × S2(1/2)(i, j+1) + d × S2(1/2)(i+1, j+1) (Expression 2)
Here, assuming the score S1(1/2)(i, j) in the rectangular region R(1/2) = 0.8, the score S2(1/2)(i+1, j) in the displaced rectangular region R′1(1/2) = 0.7, the score S2(1/2)(i, j+1) in the displaced rectangular region R′2(1/2) = 0.8, and the score S2(1/2)(i+1, j+1) in the displaced rectangular region R′3(1/2) = 0.7, and assuming equal weights a = b = c = d = 1/4 for example, the score S3(1/2) in the overlapping region X(1/2) can be expressed as follows:

S3(1/2) = 0.25 × 0.8 + 0.25 × 0.7 + 0.25 × 0.8 + 0.25 × 0.7 = 0.75
If F indicates computation of a geometric average, for example, the score calculation unit 2213 calculates the score S3(1/2) in the overlapping region X(1/2) based on the same four scores using the following expression:

S3(1/2) = (S1(1/2)(i, j) × S2(1/2)(i+1, j) × S2(1/2)(i, j+1) × S2(1/2)(i+1, j+1))^(1/4)
Here, assuming the same scores as above (S1(1/2)(i, j) = 0.8, S2(1/2)(i+1, j) = 0.7, S2(1/2)(i, j+1) = 0.8, and S2(1/2)(i+1, j+1) = 0.7), the score S3(1/2) in the overlapping region X(1/2) can be expressed as follows:

S3(1/2) = (0.8 × 0.7 × 0.8 × 0.7)^(1/4) = 0.3136^(1/4) ≈ 0.75
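Both worked examples above can be checked in a few lines (equal weights are assumed for the weighted average):

```python
import math

scores = [0.8, 0.7, 0.8, 0.7]  # S1 and the three S2 values from the text

weighted = sum(0.25 * s for s in scores)   # a = b = c = d = 1/4
geometric = math.prod(scores) ** (1 / 4)   # fourth root of the product

print(weighted)   # 0.75
print(geometric)  # ~0.748
```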
As shown in the figure, in the case of 1/3 displacement, the score calculation unit 2213 calculates the score S1(1/3)(i, j) in the rectangular region R(1/3) and the scores S2(1/3) in the eight displaced rectangular regions R′1(1/3) to R′8(1/3).
Furthermore, the score calculation unit 2213 calculates a score S3(1/3) in the overlapping region X(1/3) based on the score S1(1/3)(i, j) in the rectangular region R(1/3) and the scores S2(1/3)(i+1, j), S2(1/3)(i+2, j), S2(1/3)(i, j+1), S2(1/3)(i+1, j+1), S2(1/3)(i+2, j+1), S2(1/3)(i, j+2), S2(1/3)(i+1, j+2), and S2(1/3)(i+2, j+2) in the displaced rectangular regions R′1(1/3) to R′8(1/3), using the following expression (Math. 6):

S3(1/3) = F(S1(1/3)(i, j), S2(1/3)(i+1, j), S2(1/3)(i+2, j), S2(1/3)(i, j+1), S2(1/3)(i+1, j+1), S2(1/3)(i+2, j+1), S2(1/3)(i, j+2), S2(1/3)(i+1, j+2), S2(1/3)(i+2, j+2))

F in Math. 6 indicates computation of a weighted average, a geometric average, a minimum value, a maximum value, or the like.
In this manner, the score calculation unit 2213 calculates the score S3 in each overlapping region X of the captured image I, as shown in the figure.
Note that, when calculating the score S3 in the overlapping region X, the score calculation unit 2213 does not necessarily need to use all of the scores associated with the overlapping region X in the above computation. The score calculation unit 2213 may select a plurality of scores as appropriate to perform the above computation. At this time, for example, it may select the scores of displaced rectangular regions R′ that are closer to the rectangular region R and exclude the scores of displaced rectangular regions R′ that are more distant from it.
The determination unit 2214 determines whether or not the equipment 3 is present in the rectangular region R, based on the score S3 in the overlapping region X. The determination unit 2214 compares the score S3 in the overlapping region X with a threshold value Vth; if the score S3 is larger than or equal to the threshold value Vth, it determines that the equipment 3 is present in the rectangular region R, and if the score S3 is smaller than the threshold value Vth, it determines that the equipment 3 is not present in the rectangular region R. The threshold value Vth is not particularly limited, and may be set as appropriate or calculated automatically. The determination unit 2214 outputs extraction data of the region in which the equipment 3 is present to the deterioration region detection unit 222.
If the score S3 in the overlapping region X is 0.8 and the threshold value Vth is 0.7 for example, the determination unit 2214 determines that the equipment 3 is present in the rectangular region R, in other words the image in the rectangular region R is an image of the equipment 3. Accordingly, this rectangular region R is extracted as a region in which the equipment 3 is present.
If the score S3 in the overlapping region X is 0.6 and the threshold value Vth is 0.7 for example, the determination unit 2214 determines that the equipment 3 is not present in the rectangular region R, in other words the image in the rectangular region R is not an image of the equipment 3 (but an image of an element other than the equipment 3 that is the inspection target, such as a tree, a river, a vehicle, a pedestrian, a sign, a road, or a building).
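The determination step reduces to a threshold comparison, sketched below with the example values above; the names are illustrative.

```python
def is_equipment_present(s3: float, v_th: float = 0.7) -> bool:
    """Determine whether the equipment is present in the rectangular
    region R: present if the score S3 in the overlapping region X is
    larger than or equal to the threshold value Vth."""
    return s3 >= v_th

assert is_equipment_present(0.8)       # 0.8 >= 0.7 -> equipment present
assert not is_equipment_present(0.6)   # 0.6 <  0.7 -> equipment absent
```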
The deterioration region detection unit 222 detects a deterioration region of the equipment 3 from the extraction data of the region in which the equipment 3 is present, which is input from the equipment region extraction unit 221, using a region detection technique based on semantic segmentation, a deep learning technique. Deterioration regions of the equipment 3 vary in shape, size, and the like, and thus recognition in units of pixels is preferable to class-classification-based recognition. Examples of the semantic segmentation model include U-net, but there is no limitation thereto. The deterioration region detection unit 222 outputs the extraction data of the deterioration region of the equipment 3 to the output unit 240.
The following document can be referred to for details of U-net, for example.
Olaf Ronneberger, Philipp Fischer, Thomas Brox (2015), U-Net: Convolutional Networks for Biomedical Image Segmentation, arXiv:1505.04597 [cs.CV].
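A minimal sketch of this pixel-level detection step is shown below; the pre-trained U-net-style model file, its loading method, and the 0.5 mask threshold are illustrative assumptions.

```python
import torch

# Assumption: a U-net-style segmentation model trained to label
# deteriorated pixels was saved beforehand; the file name is hypothetical.
model = torch.load("unet_deterioration.pt", map_location="cpu")
model.eval()

def detect_deterioration(region: torch.Tensor) -> torch.Tensor:
    """Run semantic segmentation on an extracted equipment region
    (float tensor of shape [3, H, W], values in 0-1) and return a
    boolean per-pixel mask of the deterioration region."""
    with torch.no_grad():
        logits = model(region.unsqueeze(0))   # -> [1, 1, H, W]
    return torch.sigmoid(logits)[0, 0] > 0.5  # illustrative threshold
```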
The storage unit 230 includes one or more memories, and may include a semiconductor memory, a magnetic memory, an optical memory, and the like. Each memory of the storage unit 230 may function as a primary storage device, a secondary storage device, or a cache memory, for example. Each memory does not necessarily need to be provided inside the deterioration detection apparatus 200, and a configuration may also be adopted in which each memory is provided outside the deterioration detection apparatus 200.
The storage unit 230 stores various types of information to be used for operations of the deterioration detection apparatus 200. The storage unit 230 stores image data of a captured image, extraction data of a region in which the equipment 3 is present, extraction data of a deterioration region of the equipment 3, and the like. The storage unit 230 also stores data such as the rectangular region R, the displaced rectangular regions R′, the overlapping region X, the score S1, the scores S2, and the score S3. In addition, the storage unit 230 stores various programs, various types of data, and the like.
The output unit 240 outputs various types of information. The output unit 240 is a liquid crystal display, an organic EL (Electro-Luminescence) display, a speaker, or the like. The output unit 240 displays a predetermined screen based on detection data of a deterioration region of the equipment 3 input from the deterioration region detection unit 222, for example. The output unit 240 may be formed integrally with the deterioration detection apparatus 200, or may be provided separately.
The communication unit 250 has a function of communicating with the image capturing apparatus 100 and a function of communicating with the server apparatus 300. The communication unit 250 receives image data of a captured image from the image capturing apparatus 100, for example. The communication unit 250 transmits detection data of a deterioration region of the equipment 3 to the server apparatus 300, for example.
The deterioration detection apparatus 200 according to the present embodiment extracts a region in which equipment is present, based on a captured image, and detects a deterioration region of the equipment based on the region in which the equipment is present. When extracting the region in which the equipment is present, the deterioration detection apparatus 200 uses a plurality of scores calculated for each rectangular region generated by dividing the captured image, instead of a single score per rectangular region. Accordingly, even if a captured image shows elements other than the equipment, it is possible to accurately specify an image of the equipment from such a captured image, and thus to accurately detect deterioration of the equipment.
An example of a deterioration detection method according to an embodiment of the present invention will be described with reference to the drawings.
In step S101, the image capturing apparatus 100 captures an image of the equipment 3. The image capturing apparatus 100 transmits image data of the captured image to the deterioration detection apparatus 200. Note that the worker U may store the image data of the image captured by the image capturing apparatus 100 in an electronic medium such as a memory card or a USB memory.
In step S102, the deterioration detection apparatus 200 receives the image data of the captured image from the image capturing apparatus 100. The deterioration detection apparatus 200 divides the captured image into a plurality of rectangular regions.
In step S103, for each of the plurality of rectangular regions included in the captured image, the deterioration detection apparatus 200 displaces the rectangular region in the X direction and/or the Y direction such that a portion thereof overlaps, and thereby generates displaced rectangular regions corresponding to the rectangular region.
In step S104, the deterioration detection apparatus 200 calculates a score S1 indicating whether or not the equipment 3 is present in each rectangular region and scores S2 indicating whether or not the equipment 3 is present in the corresponding displaced rectangular regions, using an image classification technique based on a CNN, a deep learning technique. VGG16 is used as the model, for example.
In step S105, the deterioration detection apparatus 200 calculates a score S3 indicating whether or not the equipment 3 is present in the overlapping region, based on the score S1 indicating whether or not the equipment 3 is present in the rectangular region and the score S2 indicating whether or not the equipment 3 is present in the displaced rectangular region, using a predetermined calculation method.
In step S106, the deterioration detection apparatus 200 determines whether or not the equipment 3 is present in the rectangular region, based on the score S3 in the overlapping region. The deterioration detection apparatus 200 compares the score S3 in the overlapping region with the threshold value Vth; if the score S3 is larger than or equal to the threshold value Vth, it determines that the equipment 3 is present in the rectangular region, and if the score S3 is smaller than the threshold value Vth, it determines that the equipment 3 is not present in the rectangular region.
In step S107, the deterioration detection apparatus 200 detects a deterioration region of the equipment 3 based on the extraction data of the region in which the equipment 3 is present, using a region detection technique based on semantic segmentation, a deep learning technique. U-net is used as the model, for example. The deterioration detection apparatus 200 transmits detection data of the deterioration region of the equipment 3 to the server apparatus 300.
In step S108, the server apparatus 300 receives the detection data of the deterioration region of the equipment 3 from the deterioration detection apparatus 200. The server apparatus 300 stores the detection data of the deterioration region of the equipment 3.
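Put together, steps S102 to S107 could look like the following sketch; every helper here (divide_into_tiles, displaced_regions, equipment_score, combine_scores, detect_deterioration, and the crop/to_tensor conversions) refers to the illustrative sketches above or is a hypothetical stand-in.

```python
def run_deterioration_detection(image, h, w, n=2, v_th=0.7):
    """Hypothetical end-to-end sketch of steps S102-S107.
    crop(image, region) and to_tensor(tile) are assumed conversion
    helpers; the remaining functions are the sketches shown above."""
    results = {}
    tiles = divide_into_tiles(image, h, w)                    # step S102
    for (i, j), tile in tiles.items():
        x0, y0 = i * w, j * h
        s1 = equipment_score(tile)                            # step S104
        s2 = [equipment_score(crop(image, r))                 # steps S103-S104
              for r in displaced_regions(x0, y0, w, h, n)]
        s3 = combine_scores(s1, s2)                           # step S105
        if s3 >= v_th:                                        # step S106
            results[(i, j)] = detect_deterioration(
                to_tensor(tile))                              # step S107
    return results  # detection data sent to the server in step S108
```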
In the deterioration detection method according to the present embodiment, instead of one-stage processing in which a deterioration region of equipment is detected based on a captured image as is conventional, two-stage processing is performed in which a region in which equipment is present is extracted based on a captured image, and a deterioration region of the equipment is detected based on the region in which the equipment is present. Accordingly, even if a captured image shows an element other than the equipment, it is possible to accurately specify an image of the equipment based on such a captured image, and thus it is possible to accurately detect deterioration of the equipment that is an inspection target.
The determination accuracy of scores obtained using the deterioration detection apparatus 200 according to the present embodiment (which includes an equipment region extraction unit) was compared with and evaluated against the determination accuracy of scores obtained using a conventional deterioration detection apparatus (which does not include an equipment region extraction unit).
The determination accuracy of a score was calculated based on a confusion matrix. The true positive rate (TPR) was used as the evaluation index of the determination accuracy. Note that, besides the true positive rate, the accuracy, the precision, the false positive rate (FPR), or the like may also be used as an evaluation index of the determination accuracy of a score.
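For reference, the true positive rate can be computed from a confusion matrix in a few lines, for example with scikit-learn; the toy labels below are illustrative.

```python
from sklearn.metrics import confusion_matrix

# Toy ground-truth and predicted labels (1 = equipment present).
y_true = [1, 1, 1, 0, 0, 1, 0, 1]
y_pred = [1, 1, 0, 0, 1, 1, 0, 1]

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
tpr = tp / (tp + fn)  # true positive rate (recall)
fpr = fp / (fp + tn)  # false positive rate, also mentioned above
print(tpr, fpr)       # 0.8 0.333...
```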
In the working example, 1/2 displacement was performed. The size of the rectangular region R(1/2) was set to height: 80 (pixels) and width: 80 (pixels), so the size of the overlapping region X(1/2) was height: 40 (pixels) and width: 40 (pixels). The average value of the score S1 in the rectangular region R(1/2) and the scores S2 in the displaced rectangular regions R′1(1/2), R′2(1/2), and R′3(1/2) was calculated as the score S3 in the overlapping region X(1/2).
In the comparative example, the rectangular region R was not displaced. The size of the rectangular region R was set to height: 40 (pixels) and width: 40 (pixels), and a score S was calculated for a predetermined region included in the rectangular region R, the size of which was also height: 40 (pixels) and width: 40 (pixels).
The graph 201 shown in the drawings indicates the determination accuracies of scores in the working example and the comparative example; the true positive rate of the working example was higher than that of the comparative example.
Therefore, it is indicated that the determination accuracy of a score according to the deterioration detection apparatus 200 according to the present embodiment is higher than that of the conventional deterioration detection apparatus. That is to say, it is indicated that the deterioration detection apparatus 200 according to the present embodiment can accurately specify an image of equipment based on a captured image.
Note that, in the above working example, when 1/2 displacement, 1/3 displacement, . . . , and 1/n displacement were performed and the determination accuracies of the scores, the computation amounts, and the computation times were compared, the computation amount and computation time increased as n became larger. It was found that, when the balance between these is taken into consideration, 1/2 displacement is the most suitable among 1/2 displacement, 1/3 displacement, . . . , and 1/n displacement. Therefore, it is indicated that a high effect is achieved by the deterioration detection apparatus 200 generating an appropriate number of displaced rectangular regions at appropriate positions.
The detection accuracy of a deterioration region of the equipment 3 obtained using the deterioration detection apparatus 200 according to the present embodiment (which includes an equipment region extraction unit) was compared with and evaluated against that obtained using a conventional deterioration detection apparatus (which does not include an equipment region extraction unit).
As a working example, a region 300 in which the equipment 3 is present was extracted based on a captured image I, and a deterioration region of the equipment 3 was detected based on the region 300.
As a comparative example, a deterioration region of the equipment 3 was detected directly based on the captured image I.
Comparisons between the regions 301 and 303 detected in the working example and the corresponding regions detected in the comparative example, shown in the drawings, indicate that the working example detected the deterioration regions of the equipment 3 more accurately than the comparative example.
Therefore, it is indicated that, with the deterioration detection apparatus 200 according to the present embodiment, the detection accuracy of the deterioration region of the equipment 3 is higher than that of the conventional deterioration detection apparatus. That is to say, it is indicated that the deterioration detection apparatus 200 according to the present embodiment can accurately detect deterioration of the equipment 3 based on a captured image.
The present invention is not limited to the above embodiments and modified examples. The above-described various types of processing may be executed not only chronologically in the order described, but also in parallel or individually as required or according to the processing capacity of the device that executes the processing. Further modifications can be made as appropriate without departing from the spirit of the invention.
It is also possible to use a computer that can execute program instructions in order to cause the computer to function as the above embodiments and modified examples. Such a computer can be realized by storing, in the storage unit of the computer, a program describing the processing contents that realize the functions of each device, and causing the processor of the computer to read out and execute this program; at least some of the processing contents may be realized with hardware. Here, the computer may be a general-purpose computer, a dedicated computer, a workstation, a PC, an electronic notepad, or the like. The program instructions may be program code, code segments, or the like for executing the necessary tasks. The processor may be a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), or the like.
A program for causing a computer to execute the above-described deterioration detection method is a program that causes the computer to execute the processing of the steps shown in the flowchart described above.
In addition, this program may be recorded in a computer-readable recording medium. Using such a recording medium, the program can be installed in a computer. Here, the recording medium on which the program is recorded may be a non-transitory recording medium. The non-transitory recording medium may be a CD (Compact Disc)-ROM (Read-Only Memory), a DVD (Digital Versatile Disc)-ROM, a BD (Blu-ray (registered trademark) Disc)-ROM, or the like. In addition, this program can be provided by being downloaded via a network.
Although the above embodiments have been described as a representative example, it is apparent to those of ordinary skill in the art that many modifications and replacements can be made without departing from the scope and spirit of the present disclosure. Therefore, the present invention is not to be interpreted as being limited by the above embodiments, and various modifications and changes can be made without departing from the scope of the claims. For example, a plurality of configuration blocks illustrated in a configuration diagram of an embodiment of the present invention can be combined into one, or one configuration block can be divided. In addition, a plurality of processes illustrated in a flowchart of an embodiment of the present invention can be combined into one, or one process can be divided.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2020/012104 | 3/18/2020 | WO |