This application claims the benefit of Japanese Patent Application No. 2021-208441, filed Dec. 22, 2021, which is hereby incorporated by reference herein in its entirety.
The present invention relates to an information processing apparatus, a shooting system, a method, and a non-transitory computer-readable storage medium.
An inspector of a structure, such as a bridge or a tunnel, checks the positions and extents of deformations (cracks, water leakage, and so forth) on the surfaces of the structure, and records the inspection result. In general, a civil engineer visually checks whether there are deformations on the surfaces of the structure, and records the positions, sizes, and the like of the deformations in, for example, a hand-held notebook. However, in recent years, there has been a shortage of civil engineers who inspect structures due to, for example, the aging of the workforce. To address this problem, “image-based inspection” has been performed whereby the surfaces of a structure are shot with high definition using an image capturing apparatus (camera), and deformations are checked and recorded based on the shot images using an image analysis technique.
In image-based inspection, an interchangeable-lens, high-definition image capturing apparatus (camera) that supports a large number of pixels is mounted on, for example, an automatic tripod head or a drone, and the surfaces of a structure to be inspected are exhaustively shot using the image capturing apparatus. In a case where the surfaces of a structure, such as a bridge or a tunnel, are shot with a resolution that allows detection of deformations (cracks), several tens to several hundreds of shot images are obtained. However, as each shot image only shows a small area of the structure, an inspector cannot understand which area of the structure has been shot from each shot image alone. For this reason, the respective shot images are stitched (composited) to generate a large-scale stitched image (composite image) that shows the surfaces of the structure over a wider range. To enable this compositing, shooting is performed in such a manner that the respective shooting ranges overlap one another.
A technique has been disclosed that determines an abnormality, such as missing data in shot images that have been shot by a shooting apparatus mounted on an unmanned flight vehicle, and specifies images that need to be re-shot in accordance with whether there is an abnormality. Specifically, images to be re-shot are specified by detecting a data abnormality, such as missing image data, at the time of communication of shot images (Japanese Patent No. 6619761).
The present invention in its one aspect provides an information processing apparatus comprising a storage unit configured to store pieces of image quality information respectively for a plurality of images that show a subject, a determination unit configured to determine whether image qualities of the images are favorable based on the pieces of image quality information of the images, a composition unit configured to generate a composite image using images that have been determined to have favorable image qualities by the determination unit, and a notification unit configured to, in a case where there is a missing pixel area in the composite image, provide a notification about a re-shooting method for the subject corresponding to the missing pixel area.
The present invention in its one aspect provides a shooting system comprising a mobile object including a shooting unit configured to shoot a subject, and an information processing apparatus that processes images shot by the shooting unit, wherein the information processing apparatus includes a storage unit configured to store pieces of image quality information respectively for a plurality of images that show the subject, a determination unit configured to determine whether image qualities of the images are favorable based on the pieces of image quality information of the images, a composition unit configured to generate a composite image using images that have been determined to have favorable image qualities by the determination unit, and a notification unit configured to, in a case where there is a missing pixel area in the composite image, provide a notification about a re-shooting method for the subject corresponding to the missing pixel area.
The present invention in its one aspect provides a shooting system comprising a tripod head, a shooting unit configured to shoot a subject, the shooting unit being mounted on the tripod head, and an information processing apparatus that processes images shot by the shooting unit, wherein the information processing apparatus includes a storage unit configured to store pieces of image quality information respectively for a plurality of images that show the subject, a determination unit configured to determine whether image qualities of the images are favorable based on the pieces of image quality information of the images, a composition unit configured to generate a composite image using images that have been determined to have favorable image qualities by the determination unit, and a notification unit configured to, in a case where there is a missing pixel area in the composite image, provide a notification about a re-shooting method for the subject corresponding to the missing pixel area.
The present invention in its one aspect provides a method comprising storing pieces of image quality information respectively for a plurality of images that show a subject, determining whether image qualities of the images are favorable based on the pieces of image quality information of the images, generating a composite image using images that have been determined to have favorable image qualities by the determining, and in a case where there is a missing pixel area in the composite image, providing a notification about a re-shooting method for the subject corresponding to the missing pixel area.
The present invention in its one aspect provides a non-transitory computer-readable storage medium storing a program that, when executed by a computer, causes the computer to perform a method of an information processing apparatus comprising storing pieces of image quality information respectively for a plurality of images that show a subject, determining whether image qualities of the images are favorable based on the pieces of image quality information of the images, generating a composite image using images that have been determined to have favorable image qualities by the determining, and in a case where there is a missing pixel area in the composite image, providing a notification about a re-shooting method for the subject corresponding to the missing pixel area.
Further features of the present invention will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
Hereinafter, embodiments will be described in detail with reference to the attached drawings. Note that the following embodiments are not intended to limit the scope of the claimed invention. Multiple features are described in the embodiments, but no limitation is made to an invention that requires all such features, and multiple such features may be combined as appropriate. Furthermore, in the attached drawings, the same reference numerals are given to the same or similar configurations, and redundant description thereof is omitted.
According to the present invention, a subject that needs to be re-shot can be efficiently shot.
An information processing apparatus stores respective pieces of image quality information of a plurality of images that show a subject, and determines whether the image qualities of the images are favorable based on the pieces of image quality information of the images. The information processing apparatus generates a composite image with use of images that have been determined to have favorable image qualities, and in a case where there is a missing pixel area in the composite image, provides a notification about a method of re-shooting a subject corresponding to the missing pixel area. Note that although the information processing apparatus is mounted on a mobile object (drone) or a tripod head, it may be mounted on any apparatus on which the information processing apparatus can be mounted. Furthermore, the information processing apparatus need not be mounted on a mobile object (drone) or a tripod head; an information processing apparatus that is separately prepared may control a mobile object (drone) or a tripod head via, for example, wireless communication. Moreover, data may be exchanged between a mobile object (drone) or a tripod head and the information processing apparatus via a removable storage medium. In this case, a shooting system that includes the information processing apparatus and the mobile object (drone) is configured. Here, image quality information denotes a plurality of elements with which the image quality of an image obtained by shooting a subject is evaluated (e.g., a shooting resolution, focus, and blurring). The image quality denotes the quality of an image obtained by shooting a subject. Whether the image quality is favorable is determined based on whether at least one of the plurality of elements of the image quality information is favorable. A composite image denotes one image obtained by compositing a plurality of images obtained by shooting a subject.
The following describes the reason why a technique to assist an inspector in determining the image quality and making a judgment on re-shooting is necessary. For example, in inspection of an extremely thin crack with a width of 0.2 mm on the surface of a structure, the extremely thin crack may not be shown in images unless the shooting settings of an image capturing apparatus (camera) are appropriately configured. Among the shooting settings, especially a shooting resolution (an actual dimension corresponding to one pixel in an image), focus (focal point), and blurring are important in ensuring the quality of the image (hereinafter, image quality). When the inspector inspects a crack with a width of 0.2 mm or more, it is sufficient that the shooting resolution be higher than 0.5 mm/pixel. When the inspector inspects a crack with a width of 0.05 mm or more, it is sufficient that the shooting resolution be higher than 0.3 mm/pixel. It is possible to shoot a crack thinner than the shooting resolution; however, in shooting of a much thinner crack, it is necessary to shoot the crack with a higher shooting resolution. For example, in a bridge inspection under domestic jurisdiction, cracks with a width of 0.05 mm or more are the targets of inspection, and it is recommended to perform shooting with a shooting resolution higher than 0.3 mm/pixel (Ministry of Land, Infrastructure, Transport and Tourism, Manual for Delivery of Three-Dimensional Deliverable Using Inspection Assistance Technique (Image Measurement Technique) (Draft), March 2021).
Focus adjustment exerts a large influence on how cracks are shown in shot images. For example, in images obtained by shooting cracks in an out-of-focus state, extremely thin cracks are not shown due to defocus. In image-based inspection, as the entirety of shot images shows the surface of a structure (hereinafter, an inspection surface), it is necessary to configure settings related to shooting of an image capturing apparatus so as to bring the entirety of the shot images into focus. That is to say, it is necessary to set a diaphragm (f-number) so that the depth of field (the range that is in focus) includes the closest point through to the farthest point of the inspection surface. The larger the f-number, the deeper the depth of field, and the wider the range that is in focus. However, increasing the f-number will reduce the shutter speed, thereby making a subject (cracks) easily blurred.
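The f-number selection described above can be sketched with the standard thin-lens depth-of-field formulas. The function names, the circle-of-confusion value, and the candidate f-numbers below are illustrative assumptions and do not appear in the application; this is a rough sketch, not the apparatus's actual method.

```python
# Hypothetical sketch: choose an f-number whose depth of field covers the
# closest and farthest points of the inspection surface. Standard
# thin-lens depth-of-field formulas; the circle of confusion (coc_mm) and
# candidate f-numbers are illustrative assumptions.

def depth_of_field(focal_mm, f_number, subject_dist_mm, coc_mm=0.03):
    """Return (near_limit_mm, far_limit_mm) of the in-focus range."""
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = subject_dist_mm * (hyperfocal - focal_mm) / (
        hyperfocal + subject_dist_mm - 2 * focal_mm)
    if subject_dist_mm >= hyperfocal:
        far = float("inf")
    else:
        far = subject_dist_mm * (hyperfocal - focal_mm) / (
            hyperfocal - subject_dist_mm)
    return near, far

def smallest_f_number_covering(focal_mm, subject_dist_mm,
                               nearest_mm, farthest_mm,
                               candidates=(2.8, 4, 5.6, 8, 11, 16, 22)):
    """Pick the smallest candidate f-number whose depth of field spans the
    nearest and farthest points of the inspection surface (stopping down
    no more than necessary, to avoid small-aperture defocus)."""
    for n in candidates:
        near, far = depth_of_field(focal_mm, n, subject_dist_mm)
        if near <= nearest_mm and far >= farthest_mm:
            return n
    return None
```

Returning the smallest sufficient f-number reflects the trade-off stated above: a larger f-number deepens the depth of field but forces a slower shutter speed and eventually causes small-aperture defocus.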
Also, a drastic increase in the f-number causes defocus called “small-aperture defocus”, and exerts an influence on how cracks are shown. Furthermore, shaking of an image capturing apparatus (camera) at the time of shooting of cracks exerts an influence on how cracks are shown. In blurry images, the appearance of cracks is such that the cracks are not easily distinguishable from stains on the surface of a structure. Although increasing the ISO film speed can increase the shutter speed of the image capturing apparatus (camera), setting the ISO film speed too high will cause ISO noise. As the ISO film speed exerts an influence on how cracks are shown in shot images, it is necessary to adjust the shooting conditions so that shooting can be performed without blur while avoiding an increase in the value of the ISO film speed whenever possible.
As described above, the matters to be kept in mind during shooting at the time of inspection of a structure are clear; however, as optimum setting values vary depending on the shooting condition, they cannot be determined in advance. For example, the focal length that realizes a necessary shooting resolution varies depending on the distance from the image capturing apparatus (camera) to the target of shooting (a crack). As a necessary depth of field varies depending on the direction of the surface to be shot (the surface with a crack), an optimum f-number varies. As the brightness of the surrounding of the target of shooting (a crack) also varies depending on the weather, the shutter speed necessary for shooting images without blur varies as well. For example, in a case where the surface of a structure is shot using an image capturing apparatus (camera) mounted on a drone, the distance between the image capturing apparatus (camera) and the target of shooting, as well as the direction of the surface to be shot, may change immediately before shooting depending on the wind condition at the time of shooting. Furthermore, an optimum value of a shutter speed for shooting without blur changes due to a sudden gust of wind at the time of shooting.
In view of the above-described conditions, the image qualities of shot images are determined after the surface of a structure has been shot, and if there is a shot image that is inappropriate for inspection by an inspector, the location to be re-shot and the method for improving the image qualities of shot images are specified, and re-shooting is performed. Here, in terms of cost reduction, the determination of the image qualities of shot images and the judgment on whether re-shooting is necessary are made immediately after shooting at the site of shooting. For example, in a case where a structure to be inspected is far from the office of the inspector, and in a case where it is necessary to make arrangements to, for example, rent shooting equipment, re-shooting of the target of inspection is difficult because it incurs a large amount of effort. In general, the determination of the image qualities of shot images and the judgment on whether re-shooting is necessary are made based on the experience and know-how of the inspector; however, it is difficult to make this determination and judgment appropriately in a short amount of time. In view of this, there is a need for a technique to assist the inspector in determining the image qualities and making a judgment on re-shooting.
The shooting unit 101 is an image capturing apparatus that shoots a subject, and includes a camera mounted on, for example, a mobile object (e.g., a drone), a tripod head, and the like. The shooting unit 101 captures images of the surfaces of a structure to be inspected (referred to as inspection target surfaces), and generates shot image data. “Shot image data” includes image data of an inspection target, model names of the image capturing apparatus and a lens, such camera parameters as the focal length, shutter speed, and ISO film speed at the time of shooting, information of a shooting distance from the shooting unit 101 to the subject, and pieces of in-focus degree information of a shot image (defocus values). The shooting unit 101 obtains information of a shooting distance from the shooting unit 101 to the subject with use of, for example, known means, such as a distance measurement apparatus included in the shooting unit 101.
The shooting unit 101 includes, for example, two photoelectric conversion units in which a sensor included in the shooting unit 101 performs photoelectric conversion on a per-pixel basis. The shooting unit 101 calculates pieces of in-focus degree information (defocus values) of a shot image based on the phase difference between two images that are respectively recorded in the photoelectric conversion units. A piece of in-focus degree information is represented by, for example, a value of 0.0 or more for each pixel. In a case where a piece of in-focus degree information indicates 0.0, it means that there is no phase difference between two images in that pixel, and that pixel is in focus. In a case where a piece of in-focus degree information indicates a value larger than 0.0, it means that there is a phase difference corresponding to that numerical value between two images in that pixel, and that pixel is out of focus.
The mobile object (drone) or tripod head on which the shooting unit 101 is mounted may be controlled by means of a user operation or an automatic operation. No matter which one of the user operation and the automatic operation controls the shooting unit 101, the shooting unit 101 obtains a plurality of shot images by exhaustively shooting the inspection target surfaces while changing a shooting range with respect to the subject. In order to cause the composition unit 104 to composite the shot images, the shooting unit 101 shoots images in such a manner that the shooting ranges of respective inspection target surfaces overlap (overlie) one another.
For example, the shooting unit 101 may perform shooting in such a manner that respective shooting ranges overlap (overlie) one another by repeating still image shooting. Alternatively, the shooting unit 101 may obtain shot image data in which respective shooting ranges overlap one another by extracting (capturing) still images from moving images after the moving images have been shot. Here, the timing of shooting of still images and the timing of extraction of still images from moving images may be arbitrary timings based on a user setting, or may be timings automatically set by the shooting unit 101. The shooting unit 101 transmits a plurality of pieces of shot image data that have been obtained to the storage unit 102.
The storage unit 102 is a storage apparatus that stores various types of data inside the information processing apparatus 100, and includes, for example, an HDD, an SSD, a RAM, a ROM, and the like. The storage unit 102 receives the pieces of shot image data from the shooting unit 101, and stores the pieces of shot image data into various types of storage media. Under an instruction from a CPU (not shown) included in the information processing apparatus 100, the storage unit 102 transmits the pieces of shot image data to the determination unit 103. The storage unit 102 receives determination information obtained as a result of the determination unit 103 determining whether the image qualities of the pieces of shot image data are favorable (equivalent to a determination result). Under an instruction from the CPU (not shown), the storage unit 102 transmits only the pieces of shot image data with favorable image qualities to the composition unit 104 based on the determination information.
The determination unit 103 receives pieces of shot image data from the storage unit 102, and determines the image qualities of the pieces of shot image data. The determination unit 103 transmits determination information obtained by determining the image qualities of the pieces of shot image data to the storage unit 102 and the notification unit 105. The determination unit 103 calculates a shooting resolution, which is included in image quality information, with use of the following expression 1 based on a shooting distance from the shooting unit 101 to a subject (an inspection target surface), the camera’s sensor size, the focal length, and the number of pixels. Image quality information includes, for example, a shooting resolution, focus, and blurring.
The shooting distance and the focal length are information recorded in shot image data. The sensor size and the number of pixels are numerical values that are unique to each of the models of image capturing apparatuses (cameras). Therefore, the determination unit 103 obtains the unique numerical values with reference to, for example, a database stored in the storage unit 102 or the like based on model information of the image capturing apparatus (camera) recorded in shot image data. The determination unit 103 determines the image quality of shot image data by comparing the shooting resolution calculated based on expression 1 with a base shooting resolution that has been recorded in advance in the storage unit 102 or the like. That is to say, the determination unit 103 determines that the image quality of shot image data is favorable in a case where the numerical value of the shooting resolution is equal to or larger than the numerical value of the base shooting resolution. On the other hand, the determination unit 103 determines that the image quality of shot image data is not favorable in a case where the numerical value of the shooting resolution is smaller than the numerical value of the base shooting resolution.
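As a sketch of the check above, the shooting resolution can be computed from the standard pinhole-camera relation (shooting distance × sensor size ÷ (focal length × pixel count)); whether this matches the application's expression 1 is an assumption. “Higher resolution” is interpreted here as a smaller mm/pixel value, and all names are illustrative.

```python
# Hedged sketch of the shooting-resolution calculation and comparison
# against a base shooting resolution. The pinhole relation below is
# assumed to correspond to the application's expression 1.

def shooting_resolution_mm_per_px(distance_mm, sensor_width_mm,
                                  focal_length_mm, pixels_wide):
    """Actual dimension on the subject covered by one pixel [mm/pixel]."""
    return (distance_mm * sensor_width_mm) / (focal_length_mm * pixels_wide)

def resolution_is_favorable(distance_mm, sensor_width_mm, focal_length_mm,
                            pixels_wide, base_mm_per_px=0.3):
    # A smaller mm/pixel value means a finer (higher) shooting resolution,
    # so the image quality is treated as favorable when the computed value
    # does not exceed the base shooting resolution.
    res = shooting_resolution_mm_per_px(distance_mm, sensor_width_mm,
                                        focal_length_mm, pixels_wide)
    return res <= base_mm_per_px
```

For example, a full-frame-width sensor (36 mm, 8000 pixels wide) with a 50 mm lens at 3 m gives 0.27 mm/pixel, which satisfies the 0.3 mm/pixel recommendation cited earlier, while the same setup at 5 m gives 0.45 mm/pixel and does not.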
Next, a description is given of an example in which the determination unit 103 determines whether the image quality of shot image data is favorable with use of “focus” included in image quality information. The determination unit 103 determines whether the image quality is favorable based on pieces of in-focus degree information recorded in shot image data. Shot image data includes, for example, values of pieces of in-focus degree information of respective pixels in a shot image. As described earlier, in a case where the numerical value of a piece of in-focus degree information of a pixel is 0.0, it means that the pixel is in focus. In a case where the numerical value is larger than 0.0, it means that the pixel is out of focus in accordance with that numerical value.
The determination unit 103 determines whether focus is favorable with respect to each shot image based on the extent to which the entirety of the shot image includes pixels with a predetermined numerical value or less. For example, in a case where pixels with pieces of in-focus degree information having a numerical value of 1.0 or less account for at least 50% of the entirety of the shot image, the determination unit 103 determines that focus of the shot image is favorable. On the other hand, in a case where pixels with pieces of in-focus degree information having a numerical value of 1.0 or less account for less than 50% of the entirety of the shot image, the determination unit 103 determines that focus of the shot image is not favorable.
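The focus determination just described can be sketched as follows, assuming the in-focus degree information is available as a two-dimensional map of per-pixel defocus values (0.0 meaning in focus). The threshold values follow the example in the text; the function name is illustrative.

```python
# Sketch of the focus determination: favorable when pixels whose defocus
# value is at most pixel_threshold account for at least ratio_threshold
# of the whole image (1.0 and 50% in the example above).

def focus_is_favorable(defocus_map, pixel_threshold=1.0, ratio_threshold=0.5):
    """defocus_map: 2-D list of per-pixel in-focus degree values."""
    total = sum(len(row) for row in defocus_map)
    in_focus = sum(1 for row in defocus_map for v in row
                   if v <= pixel_threshold)
    return in_focus / total >= ratio_threshold
```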
Furthermore, a description is given of an example in which the determination unit 103 determines whether the image quality of shot image data is favorable with use of “blurring” included in image quality information. The determination unit 103 determines whether blurring is included by performing frequency analysis with respect to image data recorded in shot image data. In a case where shot image data includes blurring, cracks and the like are smeared over a wide range in a shot image, and thus the shot image data has low-frequency content. In view of this, the determination unit 103 calculates a spatial frequency for each area in the shot image with use of, for example, a known method, such as a Fourier transform. The determination unit 103 determines whether the shot image data includes blurring based on an average value obtained by averaging the frequency components that have been calculated for respective areas in the shot image. That is to say, in a case where the average value indicates a high frequency, the determination unit 103 determines that the shot image data does not include blurring. On the other hand, in a case where the average value indicates a low frequency, the determination unit 103 determines that the shot image data includes blurring.
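The frequency-based blur check can be sketched as follows. The concrete statistic (a magnitude-weighted mean spatial frequency over the whole image rather than per-area averaging) and the threshold are assumptions for illustration, not the apparatus's actual computation.

```python
import numpy as np

# Illustrative sketch of the frequency-analysis blur check: a low
# FFT-magnitude-weighted mean spatial frequency suggests blurred
# (low-detail) content. Statistic and threshold are assumptions.

def is_blurred(gray, freq_threshold=0.15):
    """gray: 2-D numpy array of a grayscale image, values 0..255."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray)))
    h, w = gray.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # Normalized distance of each frequency bin from the DC component.
    dist = np.hypot((yy - h / 2) / h, (xx - w / 2) / w)
    mean_freq = (spectrum * dist).sum() / spectrum.sum()
    return mean_freq < freq_threshold
```

A uniform (detail-free) patch yields a mean frequency near zero and is flagged as blurred, while a high-contrast fine pattern concentrates energy at high frequencies and is not.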
The determination unit 103 determines the image quality of shot image data from the storage unit 102, generates “determination information” indicating whether the image quality of the shot image data is favorable, and transmits the determination information to the storage unit 102 and the notification unit 105. Here, in a case where the determination unit 103 determines one element included in image quality information from the shot image data (one of the shooting resolution, focus, and blurring), a determination result related to one element is used as the determination information. In a case where the determination unit 103 determines a plurality of elements included in image quality information from the shot image data, a determination result based on the combination of the plurality of elements is used as the determination information.
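The combination of per-element results into one piece of determination information can be sketched as follows. Both policies that appear in the text are shown: treating any unfavorable element as making the image unfavorable, and the alternative of treating any favorable element as making it favorable. Element names are illustrative.

```python
# Minimal sketch of combining per-element determination results
# (shooting resolution, focus, blurring) into one overall determination.

def overall_quality(elements, require_all=True):
    """elements: dict like {"resolution": True, "focus": True,
    "blur": False}, mapping each evaluated element to whether it was
    determined to be favorable."""
    if require_all:
        return all(elements.values())   # any unfavorable element => unfavorable
    return any(elements.values())       # any favorable element => favorable
```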
On the other hand, in a case where the determination unit 103 determines that one of the elements of image quality information (e.g., blurring) is not favorable, it generates determination information 110 indicating that the image quality of shot image data is not favorable. Specifically, in a case where the shooting resolution is ○, focus is ○, and blurring is × with regard to “number = 2”, the determination unit 103 determines that the image quality is ×. Alternatively, for example, in a case where the determination unit 103 determines that one of the elements of image quality information (e.g., the shooting resolution) is favorable, it may generate determination information indicating that the image quality of shot image data is favorable. Specifically, in a case where the shooting resolution is ○, focus is ×, and blurring is × with regard to “number = 3”, the determination unit 103 determines that the image quality is ○. Note that as shown in
The composition unit 104 receives pieces of shot image data from the storage unit 102, and generates composite image data by compositing the pieces of shot image data using a known method. For example, in a case where pieces of shot image data that have been shot using a fixed tripod head are to be composited, the composition unit 104 composites together the pieces of shot image data using a method of panoramic composition. Also, in a case where pieces of shot image data that have been shot using a mobile object (drone) are to be composited, the composition unit 104 composites together the pieces of shot image data using a method of three-dimensional reconstruction. In a case where the foregoing methods are used, the composition unit 104 is not always capable of compositing all pieces of shot image data. For example, in a case where the composition unit 104 generates a composite image using pieces of shot image data that include only a few areas that overlap with (overlie) other pieces of shot image data, it may not be capable of generating a composite image with no missing pixel. A missing pixel means a pixel for which no pixel value has been recorded.
Also, the pieces of shot image data that are used to generate the composite image 201 do not represent all of the pieces of shot image data that have been shot by the shooting unit 101, but consist only of pieces of shot image data for which the image qualities have been determined to be favorable by the determination unit 103. Therefore, the section (pixels) for which the image qualities have not been determined to be favorable in pieces of shot image data is not used to generate the composite image 201, and is thus displayed as the area 203 in the composite image 201. The area 203 is an area in which a total of 13 pixels are missing, and is “a missing section B attributed to poor image quality”.
Furthermore, in a case where pieces of shot image data include an unshot section because the shooting unit 101 failed to shoot a subject (inspection target surfaces) due to a certain reason, this section is displayed as the area 204 in the composite image 201. Examples of the certain reason include a case where the shooting unit 101 mounted on a mobile object (drone) that is moving at high speed cannot shoot a desired subject area, and a case where a shutter cannot be released due to, for example, a poor connection at a contact point between the image capturing apparatus and the lens in the shooting unit 101. The area 204 is an area in which one pixel is missing, and is “a missing section C attributed to a failure in shooting”.
As described above, after generating the composite image 201, the composition unit 104 specifies the areas 202 to 204 in the composite image 201 for which no pixel value has been recorded (pixels are missing). Next, the composition unit 104 estimates the reasons for missing pixels in the area 204 (missing section C), the area 203 (missing section B), and the area 202 (missing section A), in stated order. Note that the order of estimation of the reasons for missing pixels is exemplary, and no limitation is intended by this.
For example, the composition unit 104 specifies “used pieces of shot image data” which compose the surroundings of the area 203 and the area 204 and which were used to generate the composite image 201, and checks whether the shooting orders of these used pieces of shot image data are consecutive. For example, it is expected that there is no unallocated number (missing data) among the used pieces of shot image data that compose the surrounding of the area 204 (the missing section C attributed to a failure in shooting). For example, the composition unit 104 can determine whether there is an unallocated number (missing data) based on whether the unique numbers (e.g., serial numbers) that have been respectively allocated to the used pieces of shot image data are consecutive. Therefore, in a case where the composition unit 104 determines that there is no unallocated number (missing data) in the used pieces of shot image data located in the surrounding of the area 204, it estimates that the reason for missing pixels in the area 204 (missing section C) is a “shortage of images”. Here, the “shortage of images” includes the reasons for missing pixels attributed to “a failure in shooting” and “a shortage of overlap”. Note that the composition unit 104 can determine whether the reason for missing pixels in the area 204 (missing section C) is “a failure in shooting” or “a shortage of overlap” as described later.
On the other hand, in a case where there is an unallocated number (missing data) among the shooting orders of the used pieces of shot image data, it means that there is shot image data that the composition unit 104 has not received from the storage unit 102. That is to say, it is expected that there is an unallocated number (missing data) among the used pieces of shot image data that compose the surrounding of the area 203 (the missing section B attributed to poor image quality). Therefore, in a case where the composition unit 104 determines that there is an unallocated number (missing data) in the used pieces of shot image data located in the surrounding of the area 203, it estimates that the reason for missing pixels in the area 203 is “poor image quality”.
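The serial-number check described above can be sketched as follows, assuming each piece of shot image data carries a consecutive serial number allocated in shooting order; the function name is illustrative.

```python
# Hedged sketch of the "unallocated number" estimation: a gap in the
# serial numbers of the used images surrounding a missing area means an
# image existed but was excluded (poor image quality); no gap suggests a
# shortage of images (failure in shooting or shortage of overlap).

def estimate_missing_reason(surrounding_serials):
    """surrounding_serials: serial numbers of the used images that
    compose the surrounding of a missing pixel area."""
    s = sorted(surrounding_serials)
    has_gap = any(b - a > 1 for a, b in zip(s, s[1:]))
    return "poor image quality" if has_gap else "shortage of images"
```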
As another estimation method, the composition unit 104 may specify, from among pieces of shot image data received from the storage unit 102, pieces of shot image data that were not used to generate the composite image 201 (referred to as unused pieces of shot image data). The composition unit 104 specifies pieces of shot image data which have shooting times close to shooting times of the unused pieces of shot image data (e.g., the difference between the former shooting times and the latter shooting times falls within a predetermined range), and which were used to generate the composite image 201 (referred to as used pieces of shot image data). The composition unit 104 specifies a portion of the composite image 201 that the used pieces of shot image data compose (i.e., the positions of the used pieces of shot image data in the composite image 201). For example, it is expected that used pieces of shot image data that compose the surrounding of the area 203 (the missing section B attributed to poor image quality) in the composite image 201 have shooting times close to shooting times of the unused pieces of shot image data. In view of this, in a case where a missing pixel area is included in the surrounding of the used pieces of shot image data specified from the composite image 201, the composition unit 104 estimates that the reason for missing pixels in the area 203 is “poor image quality”. Also, regarding an area for which the reason for missing pixels has not been determined to be “poor image quality” (e.g., the area 204), the composition unit 104 can estimate that the reason for missing pixels therein is the “shortage of images”.
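The alternative shooting-time-based estimation can be sketched as follows; the timestamp representation and the tolerance are illustrative assumptions.

```python
# Sketch of the alternative estimation: unused images (excluded from
# compositing) are matched to used images whose shooting times are close;
# a missing pixel area surrounded by those used images is then attributed
# to "poor image quality".

def used_images_near_unused(unused_times, used_times, max_gap=5.0):
    """Return the set of used-image shooting times within max_gap seconds
    of some unused image's shooting time; a missing area surrounded by
    these used images suggests the reason "poor image quality"."""
    return {u for u in used_times
            if any(abs(u - x) <= max_gap for x in unused_times)}
```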
Furthermore, the composition unit 104 can determine whether the “shortage of images” has occurred due to the “shortage of overlap” in used images, or has occurred due to the “failure in shooting”, which is a shortage of pieces of shot image data that are used to generate the composite image 201. The composition unit 104 determines whether the “shortage of images” has been caused by the “shortage of overlap” or the “failure in shooting” using the following method.
First, the composition unit 104 estimates the reason for missing pixels in the area 202 using the estimation method that is used in estimating the reason for missing pixels in the area 204 (missing section C) (“shortage of images”). Consequently, the composition unit 104 estimates that the reason for missing pixels in each of the area 202 and the area 204 is the “shortage of images”. Furthermore, the composition unit 104 determines whether the reason for missing pixels in each area is the “failure in shooting” or “shortage of overlap” based on whether the number of missing pixels (the size of the missing pixel area) in each of the area 202 and the area 204 exceeds a predetermined threshold.
The predetermined threshold is the number of missing pixels, and is assumed to be four, for example. In a case where the composition unit 104 determines that the number of missing pixels in the area 204 of the composite image 201 exceeds the predetermined threshold, it estimates that the reason for missing pixels in the area 204 is the “failure in shooting”. On the other hand, in a case where the composition unit 104 determines that the number of missing pixels in the area 202 does not exceed the predetermined threshold, it estimates that the reason for missing pixels in the area 202 is the “shortage of overlap”. Note that the predetermined threshold is not limited to four, and may be set to any value that allows the two reasons to be distinguished from each other.
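The size-based threshold test described above can be sketched as follows (a hedged illustration; the function name is an assumption, and the default threshold of four merely follows the example value given above):

```python
# Sketch of the size-based classification of a "shortage of images"
# (illustrative; the function name is an assumption, the default threshold
# follows the example value of four given in the description).

def classify_image_shortage(num_missing_pixels, threshold=4):
    """A missing area larger than the threshold suggests that an entire
    shot is absent ("failure in shooting"); a smaller one suggests
    insufficient overlap between adjacent shots ("shortage of overlap")."""
    if num_missing_pixels > threshold:
        return "failure in shooting"
    return "shortage of overlap"

print(classify_image_shortage(9))  # e.g. the large area 204
print(classify_image_shortage(2))  # e.g. the small area 202
```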
Below is a description of another estimation method for a case where the reason for missing pixels is the “failure in shooting”. The composition unit 104 obtains a communication history related to a defect in the contact point between the image capturing apparatus (camera) and the lens in the shooting unit 101, which can cause the “failure in shooting” (information including a time and an event, as well as information of a shutter interval). Then, the composition unit 104 specifies used images that were shot at shooting times adjacent to the time of failure to release the shutter of the camera (the occurrence of a shooting failure event). In this way, the composition unit 104 can estimate that the reason for missing pixels in the area 204 is the “failure in shooting” based on whether there is a missing pixel area in the surrounding of the used images that were specified from the composite image 201 using the foregoing method. Furthermore, a description is now given of another estimation method for a case where the reason for missing pixels is the “shortage of overlap”. The composition unit 104 may obtain the time at which a subject was shot when the moving speed of the mobile object (drone) on which the shooting unit 101 is mounted exceeded a threshold, and specify used images that were shot at shooting times adjacent to the obtained time. In this way, the composition unit 104 can estimate that the reason for missing pixels in the area 202 is the “shortage of overlap” based on whether there is a missing pixel area in the surrounding of the used images that were specified from the composite image 201 using the foregoing method. The threshold for the moving speed of the mobile object (drone) may be any numerical value of a speed at which the “shortage of overlap” occurs when the shooting unit 101 shoots a subject.
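Both event-based estimations above follow the same pattern: match logged event times (a failure to release the shutter, or a moment when the drone exceeded its speed threshold) against the shooting times of used images. A sketch, with all names and the matching window assumed for illustration:

```python
# Sketch of matching logged event times against used-image shooting times
# (illustrative; names and the matching window are assumptions).

def images_near_events(shot_times, event_times, window_s=1.0):
    """Return indices of used images shot within window_s seconds of a
    logged event, e.g. a failure to release the shutter or a moment when
    the mobile object exceeded its speed threshold."""
    return sorted({i for i, t in enumerate(shot_times)
                   for e in event_times if abs(t - e) <= window_s})

# Example: a shooting-failure event was logged at t=2.2 s.
print(images_near_events([0.0, 1.0, 2.0, 3.0], [2.2], window_s=0.5))  # [2]
```

A missing pixel area adjacent to the regions composed by the returned images would then be attributed to the corresponding reason (“failure in shooting” for shutter events, “shortage of overlap” for overspeed events).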
After estimating the reasons for missing pixels in the areas 202 to 204, the composition unit 104 records respective estimation results in the composite image data in association with the coordinates of the areas 202 to 204 in the composite image 201. The composition unit 104 transmits the generated composite image data to the notification unit 105.
Based on the determination information on the image qualities of pieces of shot image data from the determination unit 103 and on the composite image data from the composition unit 104, the notification unit 105 provides a notification about methods of re-shooting a subject (inspection target surfaces) to be used by the shooting unit 101. In the composite image data, the coordinates of the areas 202 to 204 in the composite image 201 and the reasons for missing pixels in the areas 202 to 204 are recorded in association with each other. The notification unit 105 refers to the composite image data for the reasons for missing pixels in the areas 202 to 204, and presents re-shooting methods corresponding to the reasons for missing pixels in association with the coordinates of the areas 202 to 204 in the composite image 201.
In a case where the reason for missing pixels in the area 202 is the “shortage of overlap”, the notification unit 105 provides a notification so that the speed at which the shooting range is changed when the shooting unit 101 shoots a subject (inspection target surfaces) is re-set. For example, in a case where the shooting unit 101 is mounted on a mobile object (drone), the notification unit 105 provides a notification so that the moving speed of the mobile object is reduced. If the speed at which the shooting range is changed with respect to the subject (inspection target surfaces) is reduced, more pieces of shot image data will overlap, and thus the “shortage of overlap” in the composite image 201 will be resolved.
In a case where the mobile object (drone) on which the shooting unit 101 is mounted moves autonomously, the notification unit 105 may directly access the shooting unit 101 and change the settings so as to reduce the moving speed of the mobile object. In a case where the mobile object (drone) on which the shooting unit 101 is mounted moves in accordance with a user operation, the notification unit 105 may provide a notification to the terminal (not shown) and the like of the user so that the user reduces the speed of the mobile object. Also, there are cases where the composition unit 104 generates a composite image based on a group of still images extracted from moving images of the subject (inspection target surfaces) shot by the shooting unit 101. At this time, in a case where the composite image 201 includes the area 202 attributed to the “shortage of overlap”, the notification unit 105 provides a notification so that the interval at which the still images are extracted (captured) from the moving images is reduced.
In a case where the reason for missing pixels in the area 204 is the “failure in shooting”, the notification unit 105 provides a notification so that the interval at which the shutter of the shooting unit 101 is released is changed. For example, in a case where the shutter included in the shooting unit 101 is controlled by a user operation, the notification unit 105 provides a notification to the terminal and the like of the user so that the interval of a shutter operation is reduced. In a case where the shutter included in the shooting unit 101 is an interval shutter that is automatically operated, the notification unit 105 provides a notification about re-setting of a shooting interval. Here, as the interval shutter is set so as to “shoot one still image every X seconds”, the notification is provided so as to reduce “X seconds”. The notification unit 105 may directly access the shooting unit 101 and change the settings of the interval shutter. Also, in a case where the shutter included in the shooting unit 101 is controlled by a user operation, the notification unit 105 may provide a notification to the user so that a shutter operation is performed flawlessly.
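The correspondence between a reason for missing pixels and the notification described in the last two paragraphs can be sketched as a simple lookup. Only the correspondences themselves come from the description above; the dictionary and function names are assumptions.

```python
# Sketch of mapping a reason for missing pixels to a re-shooting notification
# (illustrative; the dictionary and function names are assumptions).

RESHOOT_NOTIFICATION = {
    "shortage of overlap":
        "reduce the speed at which the shooting range is changed "
        "(e.g. slow the mobile object, or shorten the interval at which "
        "still images are extracted from moving images)",
    "failure in shooting":
        "shorten the interval at which the shutter is released "
        "(reduce the X in 'shoot one still image every X seconds')",
}

def notify(area_coords, reason):
    """Compose the notification for one missing pixel area, using the
    coordinates recorded in the composite image data."""
    return f"Re-shoot the area at {area_coords}: {RESHOOT_NOTIFICATION[reason]}"

print(notify((120, 80), "failure in shooting"))
```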
In a case where the reason for missing pixels in the area 203 is “poor image quality”, the notification unit 105 provides a notification so that the shooting settings of the shooting unit 101 are changed. In order to change the shooting settings of the shooting unit 101, the notification unit 105 uses the determination information on the image qualities of unused pieces of shot image data, which is received from the determination unit 103. In a case where the reason for missing pixels in the area 203 is “poor image quality”, the determination information always includes shot image data with image quality information for which the image quality has not been determined to be favorable. Image quality information for which the image quality has not been determined to be favorable is information indicating that any one of the shooting resolution, focus, and blurring is not favorable. The notification unit 105 decides on the content of the change in the shooting settings of the shooting unit 101 with reference to the image quality information indicating unfavorable image quality.
In a case where it is determined, for example, that the “shooting resolution” is not favorable based on the determination information (determination result) on the image quality, the notification unit 105 provides a notification so that the setting of the focal length of the shooting unit 101 is changed to the telephoto side. Once the setting of the focal length has been changed to the telephoto side, the shooting range is reduced, and thus the notification unit 105 provides a notification so that the speed at which the shooting range of the shooting unit 101 is changed is reduced, or the shooting interval of the shooting unit 101 is reduced. Alternatively, the notification unit 105 may provide a notification so that the shooting position of the shooting unit 101 further approaches the subject (inspection target surfaces) while maintaining the focal length of the shooting unit 101 as is. In this way, the shooting distance from the shooting unit 101 to the subject (inspection target surfaces) is reduced, and thus the shooting resolution can be further increased.
In a case where it is determined that “focus” is not favorable based on the determination information (determination result) on the image quality, the notification unit 105 provides a notification so that the f-number of the shooting unit 101 is increased. As a result, the depth of field becomes deeper, thereby making it easier to bring the subject (inspection target surfaces) into focus.
In a case where it is determined that “blurring” is not favorable based on the determination information (determination result) on the image quality, the notification unit 105 provides a notification so that the shutter speed of the shooting unit 101 is increased. At this time, together with the foregoing notification, the notification unit 105 may provide a notification so that the ISO speed is increased. Also, “blurring” may occur in a case where the speed at which the shooting range of the shooting unit 101 is changed is too fast. In view of this, the notification unit 105 may provide a notification so that the speed at which the shooting range of the shooting unit 101 is changed is reduced. In a case where the mobile object (drone) on which the shooting unit 101 is mounted moves autonomously, the notification unit 105 may directly access the shooting unit 101 and reduce the moving speed of the mobile object. On the other hand, in a case where the mobile object on which the shooting unit 101 is mounted moves in accordance with a user operation, the notification unit 105 may provide a notification to the terminal (not shown) of the user so that the moving speed of the mobile object is reduced. At this time, in a case where the notification unit 105 notifies the terminal of the user of a change in the shooting settings of the shooting unit 101, it may directly access the shooting unit 101 and change the shooting settings in place of the user.
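The setting changes suggested for each unfavorable quality factor (shooting resolution, focus, blurring) can likewise be sketched as a lookup. Only the correspondences are taken from the three preceding paragraphs; the function name and return format are assumptions.

```python
# Sketch of choosing shooting-setting changes from the unfavorable quality
# factor in the determination information (illustrative names and format).

def setting_changes(quality_factor):
    if quality_factor == "shooting resolution":
        return ["shift the focal length to the telephoto side (and slow the "
                "shooting-range change or shorten the shooting interval)",
                "or move the shooting position closer to the subject, "
                "keeping the focal length as is"]
    if quality_factor == "focus":
        return ["increase the f-number (deeper depth of field)"]
    if quality_factor == "blurring":
        return ["increase the shutter speed (optionally raise the ISO speed)",
                "reduce the speed at which the shooting range is changed"]
    raise ValueError(f"unknown quality factor: {quality_factor}")

for change in setting_changes("blurring"):
    print(change)
```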
In step S401, the shooting unit 101 shoots a subject of a structure to be inspected (inspection target surfaces), and obtains pieces of shot image data. The shooting unit 101 transmits the pieces of shot image data to the storage unit 102, and processing proceeds to step S402.
In step S402, the storage unit 102 stores the pieces of shot image data. The storage unit 102 transmits the pieces of shot image data to the determination unit 103, and processing proceeds to step S403.
In step S403, based on respective pieces of image quality information of the plurality of pieces of shot image data obtained by shooting the subject, the determination unit 103 determines the image quality of each piece of shot image data. The determination unit 103 transmits determination information (determination result) obtained by determining the image quality of each piece of shot image data to the storage unit 102 and the notification unit 105, and processing proceeds to step S404.
In step S404, based on the determination information (determination result) of each piece of shot image data, the storage unit 102 selects pieces of shot image data for generating a composite image. That is to say, the storage unit 102 selects, from among respective pieces of shot image data, only pieces of shot image data for which the determination information (determination result) on the image qualities indicates favorable image qualities. The storage unit 102 transmits the selected pieces of shot image data to the composition unit 104, and processing proceeds to step S405.
In step S405, the composition unit 104 generates a composite image 201 by compositing together the pieces of shot image data with favorable image qualities using a known method (e.g., a method of panoramic composition or a method of three-dimensional reconfiguration), and processing proceeds to step S406.
In step S406, based on whether there is a missing pixel area (for which no pixel value has been recorded) in the composite image 201, the composition unit 104 determines whether to change the shooting settings of the shooting unit 101. In a case where the composition unit 104 determines that there is a missing pixel area in the composite image 201 (Yes in step S406), processing proceeds to step S407. In a case where the composition unit 104 determines that there is no missing pixel area in the composite image 201 (No in step S406), processing ends.
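The check in step S406 amounts to finding connected areas of the composite image for which no pixel value has been recorded. A minimal sketch follows, representing the composite as a 2-D grid with `None` marking an unrecorded pixel; the grid representation, 4-connectivity, and function name are assumptions for illustration.

```python
# Sketch of collecting 4-connected missing pixel areas in a composite image
# (illustrative; None marks a pixel with no recorded value).

def missing_areas(grid):
    """Return a list of connected missing areas, each as a list of (y, x)
    coordinates, found by flood fill over 4-connected None cells."""
    h, w = len(grid), len(grid[0])
    seen, areas = set(), []
    for y in range(h):
        for x in range(w):
            if grid[y][x] is None and (y, x) not in seen:
                stack, area = [(y, x)], []
                seen.add((y, x))
                while stack:
                    cy, cx = stack.pop()
                    area.append((cy, cx))
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and grid[ny][nx] is None
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                areas.append(area)
    return areas

grid = [[1, 1, None],
        [1, None, None],
        [1, 1, 1]]
print(len(missing_areas(grid)))  # 1 (one connected missing area of 3 pixels)
```

An empty result would correspond to "No in step S406" (processing ends); otherwise each area's coordinates and pixel count feed the estimation in step S407.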
In step S407, the composition unit 104 determines areas 202 to 204 with missing pixels (for which no pixel value has been recorded) in the composite image 201, and estimates the reasons for missing pixels in the areas 202 to 204. The composition unit 104 records the reasons for missing pixels into composite image data in association with respective positions (coordinates) of the areas 202 to 204. The composition unit 104 transmits the composite image data to the notification unit 105. The notification unit 105 provides a notification about re-shooting methods to be used by the shooting unit 101 based on the determination information on the image qualities of the pieces of shot image data from the determination unit 103, and on the composite image 201 from the composition unit 104. Thereafter, processing proceeds to step S408.
In step S408, in accordance with a selection of a re-shooting method via a user operation or the notification unit 105, the notification unit 105 changes various types of settings related to re-shooting of the subject (inspection target surfaces) corresponding to the areas 202 to 204 to be performed by the shooting unit 101, and processing returns to step S401.
As described above, according to the first embodiment, image qualities are determined based on pieces of image quality information of pieces of shot image data, and a composite image is generated using images with favorable image qualities based on determination information (determination result) on the image qualities. According to the first embodiment, in a case where there is a missing pixel area in the composite image, a re-shooting method for the missing pixel area can be estimated from the relationship between shooting times of a used image and an unused image. Consequently, in a case where there is a missing pixel area in the composite image, the missing pixel area can be visualized as a location of re-shooting, and a re-shooting method for shooting a subject corresponding to the missing pixel area can be presented.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
Number | Date | Country | Kind
---|---|---|---
2021-208441 | Dec 2021 | JP | national