The present invention relates to a technique for determining the quality of a captured image.
A technique for detecting deformations such as a crack from a captured image in inspecting a structure desirably uses a captured image that is in focus and sharp. Since a high resolution image is desirable for the detection of fine deformations from a range to be inspected, a plurality of high resolution images may be combined into a combined image for use in inspecting a large-scale structure.
Japanese Patent No. 6619761 discusses a technique for determining anomalies such as a missing image among images to be combined, and identifying an image or images to be recaptured based on the presence or absence of an anomaly.
As described above, captured images used in inspecting a structure desirably satisfy predetermined image qualities, such as being properly focused and having high resolution. If the captured images do not satisfy such qualities, image processing for inspection, such as deformation detection and combination, can fail to be properly performed, resulting in a need for recapturing the images. Recapturing images (hereinafter also referred to as reimaging) costs a lot of labor if the structure is located at a remote place, since the image capturing involves preparing materials and, in particular, the reimaging may be performed on a different day. A technique for determining whether the captured images satisfy the predetermined image qualities, i.e., whether image recapturing is necessary, is therefore desired. Further, it is difficult to appropriately determine such image qualities manually by visual observation.
The technique discussed in Japanese Patent No. 6619761 identifies an image to be recaptured by detecting a data anomaly, such as missing image data, while communicating the captured image. However, in a case where the image data is normal but has low image quality, for example, because the recorded image is not properly focused or has low resolution, the captured image is not determined to be recaptured. The reimaging determination according to the conventional technique thus does not take the image quality into account, and in this respect there is room for improvement.
According to an aspect of the present invention, an information processing apparatus includes an obtaining unit configured to obtain at least one of in-focus degree information indicating an in-focus degree of each predetermined region of an image, frequency analysis information indicating a frequency analysis result of the image, and imaging resolution information indicating imaging resolution, a determination unit including at least one of a function of determining a ratio of a region where the in-focus degree satisfies a predetermined condition in the image based on the in-focus degree information, a function of determining whether the frequency analysis result satisfies a predetermined condition based on the frequency analysis information, and a function of determining whether the imaging resolution satisfies a predetermined condition based on the imaging resolution information, and an output unit configured to output information for specifying that the image is not to be used for predetermined image processing based on a result of a determination made by the determination unit.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
According to the present invention, whether to use a captured image for the predetermined image processing can be determined based on the image quality of the captured image.
The information processing apparatus 100 is an apparatus for controlling entire imaging processing according to the present exemplary embodiment. The information processing apparatus 100 includes a central processing unit (CPU) 101, a read-only memory (ROM) 102, a random access memory (RAM) 103, a hard disk drive (HDD) 104, a display unit 105, an operation unit 106, and a communication unit 107. The CPU 101 performs calculations and logical determinations for various types of processing, and controls the components connected to a system bus 110. The ROM 102 is a program memory and stores programs used for control, including various processing procedures to be described below, by the CPU 101. The RAM 103 is used as a temporary storage area such as a main memory and a work area for the CPU 101. The program memory may be implemented by loading the programs from an external storage device connected to the information processing apparatus 100 into the RAM 103.
The HDD 104 is used to store electronic data, such as image data, and programs according to the present exemplary embodiment. An external storage device may be used as a device having a similar role. For example, the external storage device may be implemented by a medium (recording medium) and an external storage drive for accessing the medium. Known examples of such a medium include a flexible disk (FD), a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), a Universal Serial Bus (USB) memory, a magneto-optical (MO) disk, and a flash memory. The external storage device may be a network-connected server apparatus.
The display unit 105 is a device that outputs an image on a display screen. Examples include a liquid crystal display (LCD) and an organic electroluminescence (EL) display (OELD). The display unit 105 may be an external device connected to the information processing apparatus 100 in a wired or wireless manner. The operation unit 106 includes a keyboard and a mouse, and accepts a user's various operations. The communication unit 107 performs wired or wireless bidirectional communication with another information processing apparatus, a communication device, or an external storage device by using conventional communication techniques. For example, the communication unit 107 can include a chip and an antenna for performing public wireless communication. The communication unit 107 may be configured to perform communication by using other wireless communication methods such as a wireless local area network (LAN) and Bluetooth®.
In the present exemplary embodiment, the imaging assist apparatus 150 is a camera platform apparatus capable of changing an imaging position and an imaging direction based on control from the information processing apparatus 100. The imaging apparatus 180 to be described below is mounted on the imaging assist apparatus 150. The imaging assist apparatus 150 includes a communication unit 151, an imaging position and direction control unit 152, and an imaging instruction unit 153. The communication unit 151 performs wireless or wired communication with the information processing apparatus 100, and controls the imaging direction and position and issues imaging instructions based on instructions from the information processing apparatus 100. For example, the communication unit 151 can include a chip and an antenna for performing public wireless communication. The communication unit 151 may be configured to perform communication by using other wireless communication methods such as a wireless LAN and Bluetooth®.
The imaging position and direction control unit 152 changes the imaging position and direction of the camera platform apparatus so that an image of an imaging region of an object to be inspected can be captured. The imaging instruction unit 153 controls the imaging apparatus 180 set at the imaging position and direction changed by the imaging position and direction control unit 152 to capture an image.
The imaging apparatus 180 is an apparatus for capturing an image based on imaging instruction information received from the information processing apparatus 100 via the imaging assist apparatus 150. The imaging apparatus 180 includes a full image plane phase difference image sensor, and records in-focus degree information (defocus values) of the captured image. Details of the in-focus degree information will be described below with reference to
Each pixel of the full image plane phase difference image sensor of the imaging apparatus 180 includes two photoelectric conversion units, which will be referred to as a split pixel A and a split pixel B. The split pixels A and B, which are regularly arranged in two dimensions in the full image plane phase difference image sensor, output an image A and an image B, respectively, as parallax images. An image A+B obtained by adding the image A and the image B is recorded as a recording still image. The defocus amounts are calculated based on phase differences between the parallax images. In the following description, the full image plane phase difference image sensor is configured so that the defocus amounts are derived pixel by pixel, whereas the defocus amounts may be derived in units of predetermined regions, such as in units of blocks each including a plurality of pixels (e.g., 5×5 pixels).
Next, the imaging processing according to the present exemplary embodiment will be described.
In step S201, the CPU 101 of the information processing apparatus 100 performs imaging processing. The imaging processing illustrated in step S201 is processing to capture images by operating the imaging assist apparatus 150 and the imaging apparatus 180 based on control from the information processing apparatus 100. The imaging processing illustrated in step S201 will be described below with reference to a flowchart of
In step S202, the CPU 101 obtains the table 501 generated in step S201 and the captured image files. The subsequent processing of steps S203 to S205 is repeated on each of the obtained captured image files in order of captured image identifiers (IDs) listed in the table 501.
In step S203, the CPU 101 performs imaging condition determination processing for determining the imaging condition with which a captured image is captured. The imaging condition determination processing will be described below with reference to the flowchart of
Examples of the imaging condition determined in step S203 include an aperture value (f-stop number). To use images that have a large depth of field and are not much affected by blur due to diffraction for inspection, a condition range is provided for the aperture values at which the images for inspection use are captured. In other words, in the present exemplary embodiment, an imaging range where an image is captured at an aperture value outside the predetermined range or other than predetermined values is determined not to be used for predetermined image processing for inspection, such as deformation detection and combination processing, and the captured image is determined to be recaptured.
Another example of the imaging condition determined in step S203 may be an International Organization for Standardization (ISO) value (sensitivity) indicating light-capturing capability. An image that is captured at a high ISO value and likely to be much affected by noise is unsuitable for inspection. A condition value is therefore provided for the ISO value at which the images for inspection use are captured. In other words, in the present exemplary embodiment, an imaging range where an image is captured at an ISO value outside a predetermined range or other than predetermined values is determined not to be used for predetermined image processing for inspection, such as the deformation detection and the combination processing, and the captured image is determined to be recaptured.
Another example of the imaging condition determined in step S203 may be an object distance. If the distance to an object (object distance) in capturing the image is too large, image resolution can be too low to detect fine deformations. A condition is therefore imposed on the distance to the object at which the image for inspection use is captured. In other words, in the present exemplary embodiment, an imaging range in which an image is captured at an object distance outside a predetermined range or other than predetermined values is determined not to be used for predetermined image processing for inspection, such as the deformation detection and the combination processing, and the captured image is determined to be recaptured. In addition, the imaging condition determined in step S203 may include a plurality of conditions set for respective attributes.
In step S204, the CPU 101 determines whether to shift the processing to step S205 for the image to be processed, using the determination result of step S203. If the captured image is determined to agree with the predetermined imaging condition in step S203 (YES in step S204), the processing proceeds to step S205. If not (NO in step S204), the processing proceeds to step S203. In step S203, the CPU 101 processes the captured image with the next captured image ID.
In step S205, the CPU 101 performs image quality determination processing for determining the image quality of the captured image. The image quality determination processing will be described with reference to the flowchart of
In step S206, the CPU 101 determines whether the imaging condition determination processing of step S203 or the image quality determination processing of step S205 has been completed for all the captured images obtained in step S202. If the determination processing has been completed for all the captured images (YES in step S206), the processing proceeds to step S207. If not (NO in step S206), the processing proceeds to step S203. In step S203, the CPU 101 processes the captured image with the next captured image ID.
In step S207, the CPU 101 determines whether the determination information in the table 501 includes an image quality determination result “NG” or “out of imaging condition”. If the determination information includes an image quality determination result “NG” or “out of imaging condition” (YES in step S207), the processing proceeds to step S208. If not (NO in step S207), the processing ends.
In step S208, the CPU 101 identifies an image or images to be recaptured. In the first exemplary embodiment, the information processing apparatus 100 generates a combined image based on the imaging positions and imaging ranges of the respective captured images, and presents the imaging position(s) of the image(s) to be recaptured on the combined image to the operator. The information processing apparatus 100 identifies the image(s) about which the determination information in the table 501 is not OK, and presents the imaging position(s) and imaging range(s) of the image(s) to be recaptured in the combined image based on the information about the imaging position(s). As illustrated in
Next, details of the imaging processing illustrated in step S201 of
If the specification of the inspection range (imaging range) from the user is completed, then in step S302, the CPU 101 generates the captured image list illustrated by the table 501 of
In step S303, the CPU 101 controls the camera platform apparatus that is the imaging assist apparatus 150 and the imaging apparatus 180 to capture images in the order of the captured image IDs in the table 501 generated in step S302 based on the information (imaging position information) about the imaging positions corresponding to the captured image IDs.
The information processing apparatus 100 changes the imaging direction and imaging position of the imaging assist apparatus 150 based on coordinate information described in the imaging position information corresponding to each captured image ID in the table 501. The information processing apparatus 100 then controls the imaging apparatus 180 to adjust focus by an automatic focus function with the central area of the screen as the distance measurement point, for example. The information processing apparatus 100 transmits an imaging instruction to the imaging apparatus 180 so that an image is captured upon the completion of the automatic focusing. The information processing apparatus 100 stores information such as the captured filename into the corresponding record of the captured image list based on control from the imaging assist apparatus 150 or the imaging completion notification transmitted from the imaging apparatus 180. As illustrated in
Next, details of the imaging condition determination processing illustrated in step S203 of
In step S602, the CPU 101 determines whether the values of the imaging information obtained in step S601 agree with a predetermined imaging condition. If the values are determined to be out of the imaging condition (NO in step S602), the processing proceeds to step S603. If the values are determined to fall within the imaging condition (YES in step S602), the processing ends. The imaging condition includes thresholds or ranges set in advance. The determination is made based on whether the values included in the imaging information fall within the thresholds or ranges. For example, the aperture value, the ISO sensitivity, and the distance to the object described in the description of the processing of step S203 may be used as the imaging condition.
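By way of a non-limiting illustration, the determination of step S602 may be sketched in Python as follows. The threshold ranges and the metadata keys used here are hypothetical examples and are not values prescribed by the present exemplary embodiment.

# Sketch of the imaging condition determination in step S602.
# The allowable ranges and the metadata keys are illustrative assumptions.
IMAGING_CONDITION = {
    "aperture": (5.6, 11.0),            # allowable f-stop range
    "iso": (100, 3200),                 # allowable ISO sensitivity range
    "object_distance_m": (1.0, 20.0),   # allowable object distance range in meters
}

def within_imaging_condition(imaging_info: dict) -> bool:
    """Return True only if every obtained value falls inside its allowed range."""
    for key, (low, high) in IMAGING_CONDITION.items():
        value = imaging_info.get(key)
        if value is None or not (low <= value <= high):
            return False
    return True

# Example: an image captured at f/16 is judged to be out of the imaging condition.
print(within_imaging_condition({"aperture": 16.0, "iso": 200, "object_distance_m": 5.0}))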
In step S603, the CPU 101 determines that the determination result is “out of imaging condition”, and records the determination result into the determination information field of the table 501. The information processing apparatus 100 may be configured to, if the values of the obtained imaging information satisfy a predetermined imaging condition, record that fact into the determination information field of the table 501.
Next, details of the image quality determination processing illustrated in step S205 of
In step S702, the CPU 101 calculates the ratio of regions by using the in-focus degree information obtained in step S701. The processing for calculating the ratio in step S702 will be described with reference to
The full image plane phase difference image sensor of the imaging apparatus 180 can obtain information about the defocus amount at each pixel position during imaging. Conventional techniques can be used as a method for obtaining the information about the defocus value. For example, automatic focusing techniques using the amount of focus deviation in the front-to-back direction detected from an image sensor have already been widely put to practical use. The information processing apparatus 100 may be configured to obtain the defocus value by using parallax images captured by a stereoscopic camera.
In the defocus maps of
The calculation of the ratio of in-focus regions will be described with reference to
In step S703, the CPU 101 determines whether the ratio calculated in step S702 is greater than or equal to a predetermined threshold. If the ratio is greater than or equal to the predetermined threshold (YES in step S703), the processing proceeds to step S704. If the ratio is less than the threshold (NO in step S703), the processing proceeds to step S705. Suppose that the threshold for the ratio is 80%. In such a case, the ratio in
In step S702, the ratio of regions where the defocus amount is “0” is described to be determined. However, it is not limited thereto. For example, the information processing apparatus 100 may be configured to determine the ratio of regions where the defocus amount is “0” or “1”. In other words, the information processing apparatus 100 may be configured to determine the ratio of regions where the in-focus degree is greater than or equal to a threshold. Alternatively, the information processing apparatus 100 may be configured to determine the ratio of regions where the defocus amount is “3” in step S702. In such a case, if, in step S703, the ratio is less than a threshold, the processing may proceed to step S704. If the ratio is greater than or equal to the threshold, the processing may proceed to step S705. Alternatively, the information processing apparatus 100 may be configured to determine a first ratio of regions where the defocus amount is “0” and a second ratio of regions where the defocus amount is “3” in step S702. In such a case, if the first ratio is greater than a first threshold and the second ratio is less than a second threshold different from the first threshold, the processing may proceed to step S704. In other cases, the processing may proceed to step S705. In this way, whether the in-focus degree of the image satisfies various predetermined conditions can be determined based on the in-focus degree information. Based on the determination result of the in-focus state, whether to use the image for image processing such as deformation detection and combination can be determined. Alternatively, the information processing apparatus 100 may be configured to determine whether the in-focus degree of the image satisfies various predetermined conditions based on the in-focus degree information, and use the determination result of the in-focus state in determining whether to recapture the image.
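As a minimal sketch, assuming that the in-focus degree information is given as a two-dimensional array of defocus amounts for the predetermined regions, the ratio calculation of step S702 and the threshold comparison of step S703 may be expressed as follows. The defocus values treated as in focus and the 80% threshold are illustrative.

import numpy as np

def in_focus_ratio(defocus_map: np.ndarray, in_focus_values=(0,)) -> float:
    """Ratio of regions whose defocus amount indicates an in-focus state (step S702)."""
    return float(np.isin(defocus_map, in_focus_values).sum()) / defocus_map.size

def in_focus_quality_ok(defocus_map: np.ndarray, ratio_threshold: float = 0.8) -> bool:
    """Step S703: OK if the in-focus ratio is greater than or equal to the threshold."""
    return in_focus_ratio(defocus_map) >= ratio_threshold

# Example with a 4x4 defocus map: 13 of 16 regions have defocus amount 0 (about 81%),
# so the image is determined to be usable (OK).
sample = np.array([[0, 0, 0, 0],
                   [0, 0, 0, 0],
                   [0, 0, 0, 1],
                   [0, 0, 3, 3]])
print(in_focus_ratio(sample), in_focus_quality_ok(sample))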
In step S704, the CPU 101 determines that the captured image file that is a processing target can be used for image processing such as deformation detection and combination, or does not need to be recaptured (OK). The CPU 101 records the determination result into the determination information field of the table 501. Then, the image quality determination processing ends.
In step S705, the CPU 101 determines that the captured image file that is a processing target is unusable for image processing such as deformation detection and combination, or to be recaptured (NG). The CPU 101 records the determination result into the determination information field of the table 501. Then, the image quality determination processing ends.
The image quality determination processing of step S205 is described to use the in-focus degree information, i.e., the degree of in-focus state. However, the image quality can be determined using other image processing results. For example, frequency analysis processing can be used to calculate a degree of out of focus blur from the amounts of high and low frequency components, and the image quality may be determined based on a predetermined threshold for out of focus blur intensity. More specifically, out of focus blur information indicating the degree of out of focus blur in each predetermined region of the image is generated by using the frequency analysis processing on the image, and the ratio of regions where the degree of out of focus blur satisfies a predetermined condition in the image is determined based on the generated out of focus blur information. The information processing apparatus 100 may be configured to output determination information indicating whether to use the image for predetermined image processing based on the determination result whether the degree of out of focus blur satisfies a predetermined condition.
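The embodiment does not fix a specific frequency analysis method for this purpose; the following is a hedged sketch in which the degree of out of focus blur per region is estimated from the fraction of high frequency energy in a block-wise Fourier spectrum, with the block size and thresholds chosen purely for illustration.

import numpy as np

def blurred_region_ratio(gray: np.ndarray, block: int = 64, hf_fraction_threshold: float = 0.1) -> float:
    """Ratio of blocks whose high frequency energy fraction is below the threshold,
    i.e., blocks regarded as affected by out of focus blur."""
    h, w = gray.shape
    blurred, total = 0, 0
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            tile = gray[y:y + block, x:x + block].astype(np.float64)
            spectrum = np.abs(np.fft.fftshift(np.fft.fft2(tile)))
            yy, xx = np.ogrid[:block, :block]
            center = block // 2
            high = (yy - center) ** 2 + (xx - center) ** 2 > (block // 4) ** 2
            fraction = spectrum[high].sum() / (spectrum.sum() + 1e-12)
            blurred += int(fraction < hf_fraction_threshold)
            total += 1
    return blurred / max(total, 1)

# Example: a synthetic noise image contains many high frequency components,
# so almost no block is classified as blurred.
print(blurred_region_ratio(np.random.default_rng(0).random((256, 256))))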
Further, the information indicating the degree of out of focus blur and the in-focus degree information may be used in combination. More specifically, the information processing apparatus 100 may be configured to determine whether to recapture an image in the imaging range of a captured image by using the image quality determination result obtained by using the in-focus degree information and the image quality determination result obtained by using the degree of out of focus blur based on the frequency analysis processing.
While the imaging processing of step S201 is described to be controlled and performed by the information processing apparatus 100, it is not limited thereto. More specifically, the information processing apparatus 100 may be configured to omit the processing of step S201 and obtain information to be used in subsequent processing, such as the captured image list, while obtaining captured images in step S202.
As described above, in the present exemplary embodiment, the information processing apparatus 100 determines the image quality of a plurality of captured images of a structure in advance, before using the captured images for inspection of the structure, such as combination and deformation detection. The information processing apparatus 100 can thereby identify images having poor image quality and present the identified images to the operator as candidates to be recaptured. Since the information processing apparatus 100 identifies the locations (positions) of the captured images to be recaptured in the combined image, the operator can recapture images at the identified imaging positions, which reduces labor in recapturing the images. In this way, the combined image can be prevented from including low quality images, and the detection processing for deformations such as cracks can be performed with higher reliability. Moreover, the reimaging can be performed in a short time since only unsuitable images are recaptured instead of all the images.
The information processing apparatus 100 may be configured to generate visualization information for visualizing the defocus values such as illustrated in
In the present exemplary embodiment, the imaging assist apparatus 150 is described to be a camera platform apparatus. However, the imaging apparatus 180 may be mounted on an autonomous-flying drone (unmanned aircraft), in which case images can be captured at different imaging positions. In the case of using a drone apparatus, a Global Navigation Satellite System (GNSS) device, an altimeter, and/or an electronic compass mounted on the drone apparatus is/are used to measure the imaging positions and directions specified by the operator and capture images at the specified positions. In capturing images at a place where the GNSS device is not usable, like under the floor slab of a bridge, the imaging apparatus 180 communicates with a base station installed on the ground and relatively measures the direction to and distance from the drone apparatus to measure the imaging position. Since the techniques related to the measurement of the imaging position are not the main objective of the present invention, a description thereof will be omitted here.
The information processing system according to the present exemplary embodiment is described to include the information processing apparatus 100, the imaging assist apparatus 150, and the imaging apparatus 180 that are configured as separate independent apparatuses. However, the information processing system may be configured as a single apparatus having the functions of the apparatuses described above. For example, the information processing system may be configured as an integral drone apparatus having the imaging function. Alternatively, the information processing system may be configured to perform distributed processing by a greater number of apparatuses.
In the first exemplary embodiment, a configuration is described where images identified to be recaptured are displayed on the combined image as illustrated in
Next, image quality determination processing according to the first modification will be described.
In step S1101 executed after the processing of step S704, the CPU 101 moves the captured image file determined to be OK in step S704 to the OK directory created in step S1001.
In step S1102 executed after the processing of step S705, the CPU 101 moves the captured image file determined to be NG in step S705 to the NG directory created in step S1001.
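A minimal sketch of the directory classification in steps S1001, S1101, and S1102 is given below. The directory names and layout are illustrative assumptions and not part of the embodiment.

import shutil
from pathlib import Path

def classify_captured_image(image_path: str, is_ok: bool, base_dir: str = "captured") -> Path:
    """Move a captured image file into the OK or NG directory (steps S1101 and S1102)."""
    dest_dir = Path(base_dir) / ("OK" if is_ok else "NG")
    dest_dir.mkdir(parents=True, exist_ok=True)  # corresponds to the directory creation in step S1001
    return Path(shutil.move(image_path, str(dest_dir / Path(image_path).name)))

# Example: a file determined to be NG is moved into captured/NG.
# classify_captured_image("IMG_0002.JPG", is_ok=False)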
The flowchart of the image quality determination processing (step S205) according to the first modification of the first exemplary embodiment has been described above. According to the first modification, captured image files to be recaptured can be found out based on the presence or absence of a captured image file in the NG directory. Captured image files determined not to satisfy the imaging condition in the imaging condition determination processing of step S203 are not moved into either of the directories. Such captured image files can be determined to be recaptured. The information processing apparatus 100 may be configured to also move the captured image files determined not to satisfy the imaging condition into the NG directory as captured image files to be recaptured. The information processing apparatus 100 may be configured to also create an imaging condition NG directory in step S1001, and move the captured image files determined not to satisfy the imaging condition into the imaging condition NG directory.
If all the captured image files are moved to the OK directory by the directory classification, an inspection operation using the combined image can be immediately started since all the captured images have favorable image quality.
The configuration for classifying the captured images into the directories according to the first modification of the first exemplary embodiment enables collective check of the images with respect to each determination result of the image quality determination. The operator can immediately check for image quality defects by checking the images determined to be recaptured or by visually observing the superimposed image files visualizing the in-focus degree information described in the first exemplary embodiment.
Since moving and classifying the captured image files into the directories sorts out captured image files determined to be NG, and the OK directory includes only captured image files determined to be OK, the operator does not need to perform image classification operations. If the images are recaptured, the subsequent generation of the combined image and inspections using the combined image can be immediately performed by adding recaptured image files determined to be OK, since all the captured image files to be combined are in the OK directory.
As described above, according to the first modification, the information processing apparatus 100 outputs information for storing files expressing images into predetermined directories based on the result of the image quality determination processing. Whether to recapture images can be indicated by the storage locations of the files expressing the images.
In the first exemplary embodiment, the list of images to be recaptured is managed on the RAM 103 or the HDD 104. A second modification of the first exemplary embodiment describes an example of processing of outputting the captured image list in the table 501, in a file form instead of by memory transfer.
In the second modification, the processing for generating the table 501, which is the captured image list, in step S302 of
With such a configuration, the list of captured image files to be recaptured is handled as a list file, whereby the determination results of the captured image files can be listed. The list file can also be used by other apparatuses, systems, and applications.
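As an illustrative sketch of this modification, the captured image list including the determination results may be written out as a CSV file. The column names below are assumptions for the example and not a format defined by the embodiment.

import csv

def write_captured_image_list(records: list, path: str = "captured_image_list.csv") -> None:
    """Write the captured image list (table 501) with determination results to a CSV file."""
    fieldnames = ["captured_image_id", "imaging_position", "file_name", "determination"]
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(records)

write_captured_image_list([
    {"captured_image_id": 1, "imaging_position": "(0, 0)", "file_name": "IMG_0001.JPG", "determination": "OK"},
    {"captured_image_id": 2, "imaging_position": "(0, 1)", "file_name": "IMG_0002.JPG", "determination": "NG"},
])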
In the first exemplary embodiment, the first modification of the first exemplary embodiment, and the second modification of the first exemplary embodiment, images are recaptured by the operator checking (visually observing) the determination results whether the images are to be recaptured. A third modification of the first exemplary embodiment describes an example in which the imaging assist apparatus 150 is controlled to recapture images by using the determination results and the information about the imaging positions of images to be recaptured.
In step S1201, the CPU 101 extracts, from the table 501, the captured image IDs of records in which the determination information is not “OK”, i.e., is “NG” indicating that the image is to be recaptured.
In step S1202, the CPU 101 obtains information about the imaging positions corresponding to the captured image IDs extracted in step S1201.
In step S1203, like step S303 of
In the third modification of the first exemplary embodiment, the imaging positions are identified from the captured image IDs of the images to be recaptured, and the imaging assist apparatus 150 that is a camera platform apparatus is controlled to recapture the images by using the information about the identified imaging positions. In this way, the images can be recaptured without the operator making operations for reimaging.
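A hedged sketch of the reimaging control in steps S1201 to S1203 is shown below. The record layout of the table and the controller object exposing move_to() and capture() methods are hypothetical stand-ins; the embodiment does not specify the interface of the imaging assist apparatus 150 at this level.

def reimage_failed_captures(table, imaging_assist):
    """Recapture only the images whose determination information is not OK."""
    # Step S1201: extract the IDs of records whose determination is not "OK".
    ng_ids = [r["captured_image_id"] for r in table if r["determination"] != "OK"]
    # Step S1202: obtain the imaging positions corresponding to the extracted IDs.
    positions = {r["captured_image_id"]: r["imaging_position"] for r in table}
    # Step S1203: control the camera platform apparatus to recapture each image.
    for image_id in ng_ids:
        imaging_assist.move_to(positions[image_id])
        imaging_assist.capture(image_id)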
The captured image files recaptured in the reimaging processing of
The information processing apparatus 100 may be configured to display a screen for checking whether to recapture images as illustrated in
In the first exemplary embodiment and the first, second, and third modifications of the first exemplary embodiment, the determination in the image quality determination processing is described to be made by using the in-focus degree information. As described above, the image quality determination processing may be performed by using the degree of out of focus blur based on the frequency analysis processing instead of the in-focus degree information. The image quality determination result using the in-focus degree information and the image quality determination result using the degree of out of focus blur based on the frequency analysis processing may be combined. A fourth modification describes a case where a determination using imaging resolution information indicating imaging resolution and a determination using frequency analysis information indicating a frequency analysis result are made in addition to the determination using the in-focus degree information.
The imaging resolution refers to the size of the surface to be imaged per pixel of a captured image, expressed in units of mm/pixel. The imaging resolution can be calculated from the size of the image sensor, the image size of the surface to be imaged, and the distance to the surface to be imaged. The greater the value of the imaging resolution, the rougher the resolution, and the more difficult it is to observe deformations such as a crack.
The frequency analysis information is obtained by performing the frequency analysis processing on the image and calculating an average value of the obtained frequency components. If the calculated average value is small, there are fewer high frequency components or edgy portions and more out of focus blur and motion blur (the degrees of out of focus blur and motion blur are high). In addition to the out of focus blur determination by the in-focus degree determination, motion blur can be determined by the frequency analysis. Images in which deformations such as a crack are difficult to observe because of the effect of motion blur can thereby be excluded.
In step S1401, the CPU 101 obtains imaging resolution information included in the captured image. More specifically, the CPU 101 calculates the imaging resolution from the image size of the captured image, the size of the image sensor, and the distance to the surface to be imaged. The distance to the surface to be imaged is obtained as the distance to the object when the imaging apparatus focuses on the position of the distance measurement point on the object.
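As a rough sketch of this calculation, assuming a thin-lens approximation and that the focal length is available in addition to the quantities named above (the embodiment may instead use the physical size of the imaged surface directly), the imaging resolution can be computed as follows.

def imaging_resolution_mm_per_pixel(sensor_width_mm: float, image_width_px: int,
                                    object_distance_mm: float, focal_length_mm: float) -> float:
    """Approximate size of the imaged surface covered by one pixel, in mm/pixel."""
    object_side_width_mm = sensor_width_mm * object_distance_mm / focal_length_mm
    return object_side_width_mm / image_width_px

# Example: a 36 mm wide sensor, a 6000 pixel wide image, a 5 m object distance,
# and a 50 mm lens give about 0.6 mm/pixel.
print(imaging_resolution_mm_per_pixel(36.0, 6000, 5000.0, 50.0))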
In step S1402, the CPU 101 determines whether the imaging resolution obtained in step S1401 is less than or equal to a predetermined threshold. If the imaging resolution is less than or equal to the threshold (YES in step S1402), the processing proceeds to step S1403. If the imaging resolution is greater than the threshold (NO in step S1402), the processing proceeds to step S705. In this way, whether the captured image has imaging resolution desirable as the quality of an inspection image can be determined. If the imaging resolution is greater than the threshold, i.e., the captured image of the object is rough, the captured image file is determined to be NG without proceeding to the subsequent determination processing. The setting of the threshold for the imaging resolution determination will be described below with reference to a setting screen 1601 illustrated in
The processing of steps S701, S702, and S703 is similar to that described with reference to
In step S1403, the CPU 101 determines whether to continue the processing subsequent to the imaging resolution determination. If the subsequent processing is to be continued (YES in step S1403), the processing proceeds to step S701. If the subsequent processing is to be ended (NO in step S1403), the processing proceeds to step S704. The setting of whether to continue the processing subsequent to the imaging resolution determination will be described below with reference to the setting screen 1601 illustrated in
In step S1404, the CPU 101 determines whether to continue the processing subsequent to the in-focus degree determination processing. If the subsequent processing is to be continued (YES in step S1404), the processing proceeds to step S1405. If the subsequent processing is to be ended (NO in step S1404), the processing proceeds to step S704. As with step S1403, the setting of whether to continue the subsequent processing will be described below with reference to the setting screen 1601 illustrated in
In step S1405, the CPU 101 performs the frequency analysis processing on the captured image to obtain a frequency component value. The CPU 101 calculates frequency components in a horizontal direction and frequency components in a vertical direction by using wavelet transformation as an example of the frequency analysis processing. The CPU 101 calculates an average of the obtained frequency components. The calculated average value is the frequency component value.
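A minimal sketch of step S1405 using the PyWavelets library is given below. The choice of the Haar wavelet and the averaging of the magnitudes of the horizontal and vertical detail coefficients are illustrative assumptions.

import numpy as np
import pywt  # PyWavelets

def frequency_component_value(gray: np.ndarray) -> float:
    """Single-level 2D wavelet transform, then the average magnitude of the
    horizontal and vertical detail (high frequency) coefficients."""
    _, (horizontal, vertical, _) = pywt.dwt2(gray.astype(np.float64), "haar")
    details = np.concatenate([horizontal.ravel(), vertical.ravel()])
    return float(np.mean(np.abs(details)))

# A sharp image yields a larger value than a heavily blurred one,
# so a small value suggests out of focus blur or motion blur.
print(frequency_component_value(np.random.default_rng(0).random((256, 256))))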
In step S1406, the CPU 101 determines whether the frequency component value calculated in step S1405 is greater than or equal to a predetermined threshold. If the frequency component value is greater than or equal to the threshold (YES in step S1406), the processing proceeds to step S704. If the frequency component value is less than the threshold (NO in step S1406), the processing proceeds to step S705. Thus, if the frequency component value is greater than or equal to the threshold, the captured image is determined to include a lot of high frequency components and a lot of edgy regions. If the frequency component value is less than the threshold, the captured image is determined to include a lot of low frequency components and to have been much affected by out of focus blur and motion blur. Since the motion blur determination processing is performed after the in-focus degree determination, whether the captured image is suitable as an inspection image can be determined even in an imaging situation where the captured image is determined to be in focus and is motion-blurred. An example of the imaging situation where the captured image is in focus and is motion-blurred is a situation in which the image is captured by the imaging apparatus 180 mounted on a flying object such as a drone. The drone can be swung due to wind during the imaging. The threshold for the motion blur determination will be described below with reference to the setting screen 1601 of
In step S1407, the CPU 101 generates an image on which determination results are superimposed (determination result superimposed image) by using the information about the result of the determination made in step S704 or S705. The generated image will be described with reference to
A result image 1500 includes defocus regions 1501, 1502, and 1503, and a frame region 1511. The frame region 1511 includes an imaging resolution determination result region 1512, an in-focus degree determination result region 1513, and a motion blur determination result region 1514. The defocus regions 1501, 1502, and 1503 are regions into which the defocus map illustrated in
The frame region 1511 displays the imaging resolution determination result region (imaging resolution determination icon) 1512, the in-focus degree determination result region (in-focus degree determination icon) 1513, and the motion blur determination result region (motion blur determination icon) 1514 as symbols for indicating the respective determination results. The result image 1500 illustrates an example in which the imaging resolution determination and the in-focus degree determination are OK and the motion blur determination is NG. The icons are differently expressed depending on the determination results. For example, the imaging resolution determination result region 1512 and the in-focus degree determination result region 1513 where the determinations are OK are expressed by white icons with black letters. The motion blur determination result region 1514 where the determination is NG is expressed by a black icon with a white letter. This not only enables the visual observation of the determination results but also facilitates checking the stage where the error has occurred. Moreover, which of the determination methods has not been performed may be indicated by the presence or absence of the icons. For example, a case where, in step S1402 of the image quality determination flowchart of
In the frame region 1511, the color of each icon may be changed based on the determination results. Suppose, for example, that the imaging resolution determination icon is assigned red, the in-focus degree determination icon is assigned blue, and the motion blur determination icon is assigned yellow. If the in-focus degree determination on an image is NG, the result image 1500 is generated with the frame region 1511 filled with blue. In this way, even in small image sizes like thumbnail images, determination results that are difficult to figure out from the icons of the determination regions can be identified from the color of the frame region 1511.
The flowchart of the image quality determination processing illustrated in
The determination pattern selection section 1611 is divided into three determination sections, i.e., an imaging resolution determination section 1612, an in-focus degree determination section 1613, and a motion blur determination section 1614. The determination sections 1612, 1613, and 1614 include determination name labels 1615, 1616, and 1617, and determination symbols 1618, 1619, and 1620, respectively. Input areas 1621 to 1626 for numerical values such as determination thresholds used in the image quality determination flowchart of
Furthermore, the icons 1512, 1513, and 1514 in
Of the determination name labels 1615 to 1617, the in-focus degree determination name label 1616 includes an in-focus degree determination checkbox 1616a and an in-focus degree determination name section 1616b. The motion blur determination name label 1617 includes a motion blur determination checkbox 1617a and a motion blur determination name section 1617b. By making ON/OFF operations on the checkboxes 1616a and 1617a, whether to further perform the corresponding determination processes if the result of the imaging resolution determination process that is always performed is OK can be specified. For example, in
Based on the states of the in-focus degree determination checkbox 1616a and the motion blur determination checkbox 1617a, the CPU 101 determines whether to continue the processing in steps S1403 and S1404. If the determination in the previous stage is OK and the subsequent processing is not needed, the subsequent processing can thus be omitted. For example, in an imaging mode using a drone that is likely to cause motion blur, the motion blur checkbox 1617a can be turned ON to perform the motion blur determination process. In an imaging mode using a tripod that is unlikely to cause motion blur, the motion blur determination checkbox 1617a can be turned OFF not to perform the motion blur determination process. In the case of performing only the imaging resolution determination process, the imaging resolution can be identified to check whether the captured image is suitable for an inspection image, by turning OFF the in-focus degree determination checkbox 1616a and the motion blur determination checkbox 1617a. In performing the determination processing on a large number of images, an effect of reducing the processing time as described above can be expected since the processing can be stopped based on the intended uses and purposes.
The value input to the imaging resolution determination threshold input area (first setting unit) 1621 indicates the threshold in step S1402. The value input to the in-focus degree determination threshold input area 1622 indicates the predetermined value in step S702. The value input to the in-focus degree determination ratio input area (second setting unit) 1623 indicates the threshold for the ratio in step S703. The value input to the in-focus degree determination region input area (second setting unit) 1624 is an item for setting a region where the ratio is calculated in step S702, and expresses the area of the central region intended for the determination in percentage, with the area of the entire image subjected to the determination processing as 100%. Reducing the central region intended for the determination (reducing the area) reduces the amount of the calculation processing, from which an improvement in speed can be expected. If the value input to the in-focus degree determination region input area (second setting unit) 1624 is set so that the region intended for the determination is less than 100% (e.g., 50%), and the combined image is generated by stitching, only the central region of the image used for the combined image is subjected to the determination. Since only the central region is subjected to the determination, the peripheral portions of the image serving as overlapping margins are excluded from the determination. In this way, images with peripheral portions that are out of focus can therefore be determined to be usable for stitching.
The value input to the motion blur determination threshold input area (third setting unit) 1625 indicates the threshold in step S1406. The value input to the motion blur determination region input area (third setting unit) 1626 is an item for setting a region where the frequency component value is calculated in step S1405, and indicates the area of the central region intended for the calculation in percentage, with the area of the entire image subjected to the calculation processing as 100%. In addition to effects similar to those of the in-focus degree determination region input area 1624, the motion blur determination region input area 1626 can provide an effect of excluding high frequency components that would otherwise affect the motion blur determination because the captured image also includes plants or other objects behind the structure if the structure to be inspected is a bridge pier or the like.
In the present modification, the image quality determination processing of step S205 is replaced by the processing of the flowchart illustrated in
In the present modification, the imaging resolution determination process, the in-focus degree determination process, and the motion blur determination process are described to be performed in this order as illustrated in
The imaging resolution determination section 1612 does not include a checkbox like the in-focus degree determination checkbox 1616a or the motion blur determination checkbox 1617a. The reason is that the subsequent determination processes do not need to be performed if the captured image does not have imaging resolution desirable as the image quality of an inspection image, i.e., the object is roughly imaged.
If both the in-focus degree determination checkbox (first selection unit) 1616a and the motion blur determination checkbox (second selection unit) 1617a are OFF and the motion blur determination checkbox 1617a is then turned ON, the in-focus degree determination checkbox 1616a may also turn ON in an interlocked manner. If both the in-focus degree determination checkbox (first selection unit) 1616a and the motion blur determination checkbox (second selection unit) 1617a are ON and the in-focus degree determination checkbox 1616a is then turned OFF, the motion blur determination checkbox 1617a may also turn OFF in an interlocked manner.
This can ensure that the motion blur determination is made after the captured image is determined to be in focus. The reason for such settings is that by performing the motion blur determination process after the in-focus degree determination process as described above, whether the captured image is suitable for an inspection image can be determined even in an imaging situation where the captured image is determined to be in focus and is motion-blurred.
In the foregoing example, the determination result superimposed image (superimposed image, result superimposed image) is generated regardless of whether the determination result is OK or NG. However, superimposed images may be generated only for objects determined to be NG. The processing time may be reduced by generating a list of determination results without generating superimposed images. The superimposed image storage condition selection section 1630 of
If “do not store” is selected in the superimposed image storage condition selection section 1630, the determination processing illustrated in
In generating the determination result superimposed image in step S1407, the superimposition of the icons indicating the determination results on the result image shows in which process the image is determined to be NG. Since no icon is displayed for an unexecuted determination process or processes, the user can find out how far the processes have been performed.
Other examples of the display mode of the result superimposed image will be described with reference to
The NG determination region 1741 in the right frame region 1712 displays the same color as that of the determination region corresponding to the determination process ending with NG. In
If the imaging resolution determination is NG, the left frame region 1711 displays only the imaging resolution determination region 1731. The NG determination region 1741 in the right frame region 1712 displays the same color as that of the imaging resolution determination region 1731. Now, suppose that only the in-focus degree determination checkbox 1616a is ON in the determination pattern selection section 1611, the imaging resolution determination is OK, and the in-focus degree determination is also OK. In such a case, the left frame region 1711 displays the imaging resolution determination region 1731 and the in-focus degree determination region 1732. The NG determination region 1741 in the right frame region 1712 does not display anything.
With such a UI, the imaging resolution information can be observed on the result image. This also enables intuitive observation of which of the three determination processes, i.e., the imaging resolution, in-focus degree, and motion blur determinations, has or have actually been performed, and which of the processes has ended with NG. In other words, visualization information can be generated for visualizing which of the imaging resolution determination, the in-focus degree determination, and the motion blur determination has or have been performed, and which of the executed determinations is determined not to satisfy a predetermined condition.
Since the imaging resolution determination region 1731, the in-focus degree determination region 1732, and the motion blur determination region 1733 are in colors corresponding to those of the imaging resolution determination symbol 1618, the in-focus degree determination symbol 1619, the motion blur determination symbol 1620 in
In the present modification, the information processing apparatus 100 is described to include a determination unit having an in-focus degree determination function, a motion blur determination function, and an imaging resolution determination function and be capable of selecting the determination mode. However, the determination unit may include only one or two of the functions. For example, the determination unit may be configured to be able to perform only the imaging resolution determination function among the above-described functions. The determination unit may be configured to be able to perform only the in-focus degree determination function and the imaging resolution determination function. An obtaining unit may have a configuration corresponding to that of the determination unit. Even with such a configuration, the information processing apparatus 100 can determine whether to use a captured image for predetermined image processing based on the image quality of the captured image.
An exemplary embodiment of the present invention is directed to determining whether to use a captured image for predetermined image processing based on the image quality of the captured image.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Applications No. 2020-186062, filed Nov. 6, 2020, and No. 2021-154360, filed Sep. 22, 2021, which are hereby incorporated by reference herein in their entirety.