The present invention relates to an image processing method, an image processing system, and a program.
Marine surveillance is performed by detecting sea-going vessels with use of captured images such as satellite images. In particular, by using synthetic aperture radar (SAR) images, in which the ground surface is imaged from a high altitude, as the captured images, it is possible to perform marine surveillance that is not affected by the weather.
Patent Literature 1 discloses an example of detecting a vessel by using an SAR image. In the art disclosed in Patent Literature 1, a captured image is binarized and an area with high luminance is extracted as a candidate vessel.
Patent Literature 1: JP 2019-175142 A
However, in the art disclosed in Patent Literature 1, there is a problem that a vessel coming alongside land cannot be detected with high accuracy due to interference from objects provided on the land. That is, since objects such as bridges and cranes exist on the land that a vessel comes alongside, the vessel cannot be detected with high accuracy due to interference from such objects. Specifically, when an object on the land has high luminance, it is difficult to appropriately set a luminance value serving as a threshold for binarization, so that the object on the land and the vessel may not be distinguishable from each other with high accuracy. Moreover, an object on the land may be erroneously detected as a vessel. This problem arises not only in the case of detecting a vessel in a water area in a captured image, but also in the case of detecting a mobile body located in any specific area.
In view of the above, an object of the present invention is to provide an image processing method, an image processing system, and a program capable of solving the above-described problem, that is, the problem that a mobile body in a captured image cannot be detected with high accuracy.
An image processing method, according to one aspect of the present invention, is configured to include
Further, an image processing device, according to one aspect of the present invention, is configured to include
Further, a program, according to an aspect of the present invention, is configured to cause an information processing device to realize
With the configurations described above, the present invention can detect a mobile body in a captured image with high accuracy.
A first exemplary embodiment of the present invention will be described with reference to
An image processing device 10 of the present embodiment detects a sea-going vessel, in order to perform marine surveillance, from captured images such as satellite images captured using a synthetic aperture radar (SAR). However, the captured images processed by the image processing device 10 are not necessarily images of an area on the sea and may be images of any area. Moreover, the image processing device 10 is not limited to detecting a vessel from a captured image and may detect any mobile body. For example, the image processing device 10 may process images of an area such as an airport and detect a mobile body such as an aircraft. Furthermore, an image to be processed by the image processing device 10 is not limited to a satellite image captured using an SAR and may be any image.
The image processing device 10 is configured of one or a plurality of information processing devices each having an arithmetic device and a storage device. As illustrated in
The focused image storage unit 15 stores therein a focused image (object image), which is an image of the sea, that is, an area (object area) from which a vessel as a mobile body is to be detected. A focused image is a satellite image captured using an SAR as illustrated in the upper drawing of
The background image storage unit 16 stores therein a background image, which is a satellite image captured using an SAR similarly to the focused image, and is an image of an area corresponding to the object area of the focused image, that is, the sea in an area almost the same as the object area. In particular, a background image is a captured image at a time when no vessel is moving in the object area such as the sea. Therefore, a background image is an image captured in a time period when there is no vessel in the object area, or an image obtained by performing, on a plurality of past images of the object area, image processing of removing a mobile body, that is, a vessel. For example, a background image is generated by performing positioning of a plurality of past captured images of the object area and selecting a minimum value for each pixel, thereby removing pixels of high luminance values that may be determined to be a vessel, as described below. Since a background image is a satellite image itself or is generated from satellite images, it is associated with position information such as latitude and longitude on the earth based on information such as the orbit of the satellite and the setting of the imaging device.
The difference image storage unit 17 stores therein a difference image representing a difference between the focused image and the background image. Specifically, a difference image is an image in which the difference between the luminance values of pixels corresponding to each other in the focused image and the background image is used as a new pixel value, as described below. Therefore, as illustrated in the lower drawing of
The geographic information storage unit 18 stores therein map information of the object area where the focused image and the background image are captured. In particular, the map information includes position information of the land in the object area, for example, position information such as the latitude and longitude of the land on the earth.
The difference image generation unit 11 (image generation means) reads out, from the focused image storage unit 15, a focused image captured at a predetermined time that is a processing object from which a vessel is to be detected, reads out a background image from the background image storage unit 16, generates a difference image from the focused image and the background image, and stores it in the difference image storage unit 17. For example, the difference image generation unit 11 performs positioning of the focused image illustrated in the upper drawing of
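As an illustration only, the following is a minimal sketch of such difference image generation, assuming the focused image and the background image are already co-registered single-channel arrays of the same shape. Taking the absolute difference is an assumption, since the text does not specify how the difference is signed.

```python
import numpy as np

def make_difference_image(focused: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Per-pixel luminance difference between two co-registered images.

    Pixels that are unchanged between the two images end up near zero,
    so only objects present in one image but not the other remain bright.
    """
    # Assumes both arrays have the same shape and have already been
    # positioned (registered) against each other.
    diff = focused.astype(np.int32) - background.astype(np.int32)
    return np.abs(diff).astype(np.uint16)
```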
Note that the difference image generation unit 11 may have a function of generating a background image to be used for generating a difference image. In that case, the difference image generation unit 11 generates a background image by acquiring past captured images of the object area previously stored in the background image storage unit 16, performing positioning on those captured images on the basis of the position information, the similarity of the topography in the images, and the like, and selecting a minimum value for each pixel, thereby removing pixels having high luminance values that may be determined to be a vessel. However, a background image may be generated by any method.
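For illustration, a minimal sketch of the per-pixel minimum operation described above, assuming the past captured images have already been positioned against each other (the function name is illustrative):

```python
import numpy as np

def generate_background(past_images: list) -> np.ndarray:
    """Per-pixel minimum over co-registered past images.

    Transient bright returns such as vessels appear in only a few of the
    images, so taking the minimum suppresses them while keeping the
    stable sea and land response.
    """
    stack = np.stack(past_images, axis=0)  # shape: (num_images, H, W)
    return stack.min(axis=0)
```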
The binary image generation unit 12 (detection means) performs processing to binarize each of the focused image, the background image, and the difference image. At that time, the binary image generation unit 12 determines a threshold of a luminance value for binarization in each image. In the following description, the threshold setting process by the binary image generation unit 12 will be described first.
The binary image generation unit 12 first uses the geographic information stored in the geographic information storage unit 18 to set a water area (specific area) on the focused image. Specifically, the binary image generation unit 12 specifies a land area (exclusion area) representing the position of the land on the focused image, from the position information included in the focused image and the geographic information including the position information of the land. Then, the binary image generation unit 12 sets an extended land area (extended exclusion area) obtained by extending the edge of the land area adjacent to the water area toward the water area side by a predetermined distance. For example, the binary image generation unit 12 sets the extended land area by extending the edge of the land area adjacent to the water area toward the water area side by 20 pixels in the focused image, that is, by 20 m in the object area. Then, the binary image generation unit 12 excludes the extended land area from the object area of the entire focused image and sets the remaining area as the water area. Thereby, the binary image generation unit 12 sets the water area on the focused image as indicated by an area surrounded by a dotted line in the upper drawing of
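A minimal sketch of this water area setting, assuming the land area is available as a boolean mask derived from the geographic information; the dilation-based extension and the 20-pixel margin follow the example above, while the function name and the use of scipy are illustrative:

```python
import numpy as np
from scipy import ndimage

def set_water_area(land_mask: np.ndarray, margin_px: int = 20) -> np.ndarray:
    """Boolean water-area mask from a land mask (True where the map marks land).

    The land mask is dilated by margin_px pixels to form the extended
    land area; everything outside the extended land area is treated as
    the water area.
    """
    extended_land = ndimage.binary_dilation(land_mask, iterations=margin_px)
    return ~extended_land
```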
Similarly to the above, the binary image generation unit 12 sets a water area on the background image as illustrated by an area surrounded by a dotted line in the center drawing of
Then, the binary image generation unit 12 generates a distribution of the luminance values of the pixels of the water area set in each of the focused image, the background image, and the difference image, and sets, from each distribution, a threshold of luminance values for binarization. Specifically, for the focused image, the binary image generation unit 12 first generates a distribution of the luminance values of all pixels in the area set as the water area, surrounded by a dotted line in the upper drawing of
Then, the binary image generation unit 12 also performs, on the background image and the difference image, the same processing as that performed on the focused image, and generates respective binary images. Specifically, for the background image, the binary image generation unit 12 generates a distribution of the luminance values of all pixels in the area set as the water area, surrounded by a dotted line in the center drawing of
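For illustration, one plausible realization of the threshold setting and binarization, assuming the luminance distribution of the water-area pixels is approximated by a Gaussian; the text does not name the fitted function, and the mean + k*sigma cutoff with k = 3 is an assumption:

```python
import numpy as np

def water_threshold(image: np.ndarray, water_mask: np.ndarray, k: float = 3.0) -> float:
    """Derive a binarization threshold from the luminance distribution of
    the water-area pixels. A Gaussian fit with a mean + k*sigma cutoff is
    one plausible choice (k = 3.0 is an illustrative value)."""
    water_pixels = image[water_mask].astype(np.float64)
    return float(water_pixels.mean() + k * water_pixels.std())

def binarize(image: np.ndarray, threshold: float) -> np.ndarray:
    """Binary image: True where luminance exceeds the threshold,
    i.e. the pixel is treated as 'not a water area'."""
    return image > threshold
```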
The candidate pixel extraction unit 13 (detection means) extracts pixels of a candidate vessel by using the binary images of the focused image, the background image, and the difference image generated as described above. At that time, the candidate pixel extraction unit 13 determines, for each binary image, whether or not each pixel is a water area (specific area), and extracts pixels of a candidate vessel on the basis of the determination results of the binary images. For example, when a focused pixel shows a change in pixel value, that is, when the pixel is not a water area in the binary image of the focused image, is a water area in the binary image of the background image, and is not a water area in the binary image of the difference image, the candidate pixel extraction unit 13 extracts it as a pixel of a candidate vessel. As a result, the pixels of the areas surrounded by the rectangles of the dotted lines in
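The extraction rule above maps directly onto boolean operations over the three binary images; a minimal sketch, assuming the convention that True means "not a water area":

```python
import numpy as np

def extract_candidate_pixels(bin_focused: np.ndarray,
                             bin_background: np.ndarray,
                             bin_difference: np.ndarray) -> np.ndarray:
    """A pixel is a candidate-vessel pixel when it is NOT a water area in
    the focused image, IS a water area in the background image, and is
    NOT a water area in the difference image.

    Inputs are boolean binary images where True means 'not a water area'
    (luminance above the per-image threshold)."""
    return bin_focused & ~bin_background & bin_difference
```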
The mobile body detection unit 14 (detection means) detects a mobile body that is a vessel located on the focused image, on the basis of the pixels extracted as a candidate vessel as described above. For example, the mobile body detection unit 14 generates a figure configured of a plurality of pixels on the basis of the distance between the pixels extracted as a candidate vessel. For example, when the pixels extracted as a candidate vessel are adjacent to each other or located within a range of a certain distance, the mobile body detection unit 14 puts a set of the pixels into one figure. Then, the mobile body detection unit 14 compares, for example, the generated figure with a template indicating the shape of a vessel prepared in advance, and when determining that the generated figure is almost the same as the template, detects the generated figure as a vessel. Then, as illustrated in
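A minimal sketch of this grouping step, assuming adjacency-based grouping via connected-component labeling; a simple size check stands in for the template comparison, whose exact criterion the text leaves to the prepared template:

```python
import numpy as np
from scipy import ndimage

def detect_vessels(candidate_mask: np.ndarray, min_area: int = 10) -> list:
    """Group adjacent candidate pixels into one figure and keep figures
    passing a simple size check (min_area = 10 is an illustrative value,
    standing in for the template comparison).

    Returns one bounding box (top, left, bottom, right) per detection."""
    labels, num = ndimage.label(candidate_mask)
    detections = []
    for i in range(1, num + 1):
        ys, xs = np.nonzero(labels == i)
        if ys.size >= min_area:  # placeholder for the template criterion
            detections.append((int(ys.min()), int(xs.min()),
                               int(ys.max()), int(xs.max())))
    return detections
```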
Next, operation of the image processing device 10 as described above will be described mainly with reference to the flowchart of
First, the image processing device 10 acquires a plurality of past captured images of the object area previously stored in the background image storage unit 16. Then, the image processing device 10 performs positioning on the captured images on the basis of the position information and selects a minimum value for each pixel to remove pixels having high luminance values that may be determined to be a vessel, thereby generating a background image, and stores it in the background image storage unit 16 (step S1). For example, the image processing device 10 generates a background image as illustrated in the center drawing of
Then, the image processing device 10 reads out, from the focused image storage unit 15, a focused image as illustrated in the upper drawing of
Then, the image processing device 10 sets a water area in each of the focused image, the background image, and the difference image (step S3). For example, regarding the focused image, the image processing device 10 specifies a land area representing the position of the land on the focused image by using the geographic information stored in the geographic information storage unit 18, and sets an extended land area by extending the edge of the land area adjacent to the water area toward the water area side by a predetermined distance. Then, the image processing device 10 excludes the extended land area from the object area of the entire focused image, and sets the remaining area as the water area. Thereby, the image processing device 10 sets the water area on the focused image as indicated by an area surrounded by the dotted line in the upper drawing of
Then, the image processing device 10 generates a distribution of the luminance values of the pixels in the water area, for each water area set in the focused image, the background image, and the difference image (step S4). For example, the image processing device 10 generates the distribution of the luminance values by approximating it with a function. Then, from the distribution generated for each of the focused image, the background image, and the difference image, the image processing device 10 sets a threshold of a luminance value for binarizing each image (step S5). For example, for each of the focused image, the background image, and the difference image, the image processing device 10 sets a threshold of luminance values with which the luminance values of the sea that can be considered to be a water area and the luminance values that can be considered to be an object existing in the water area are distinguishable from each other. Regarding the difference image, this amounts to setting a threshold of luminance values with which pixels having no change and pixels having a change between the focused image and the background image are distinguishable from each other.
Then, for each of the focused image, the background image, and the difference image, the image processing device 10 generates a binary image in which the luminance value of each pixel is binarized by using a threshold set for each image (step S6). As a result, the image processing device 10 generates binary images as illustrated in the upper drawing, the center drawing, and the lower drawing of
Then, the image processing device 10 extracts pixels of a candidate vessel by using the binary images of the focused image, the background image, and the difference image generated as described above (step S7). For example, the image processing device 10 determines, for each binary image, whether or not each pixel is a water area, and extracts pixels of a candidate vessel on the basis of the determination results. In particular, in the present embodiment, when a focused pixel shows a change in pixel value, that is, when the pixel is not a water area in the binary image of the focused image, is a water area in the binary image of the background image, and is not a water area in the binary image of the difference image, the image processing device 10 extracts it as a pixel of a candidate vessel. As a result, the pixels in the areas surrounded by the rectangles of dotted lines in
Then, the image processing device 10 detects a mobile body, that is, a vessel located on the focused image, on the basis of the pixels extracted as a candidate vessel as described above (step S8). For example, the image processing device 10 generates a figure configured of a plurality of pixels on the basis of the distance between the pixels extracted as a candidate vessel, and when the figure satisfies a criterion such as matching a template, the image processing device 10 detects it as a vessel. For example, as illustrated in
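Tying steps S1 to S8 together, a sketch of the overall flow using the illustrative helper functions from the earlier snippets (none of these names come from the patent itself):

```python
def detect_vessels_in_image(focused, past_images, land_mask):
    """End-to-end sketch of steps S1 to S8, built from the earlier
    illustrative helpers."""
    background = generate_background(past_images)            # step S1
    difference = make_difference_image(focused, background)  # step S2
    water = set_water_area(land_mask)                        # step S3
    binaries = [binarize(img, water_threshold(img, water))   # steps S4-S6
                for img in (focused, background, difference)]
    candidates = extract_candidate_pixels(*binaries)         # step S7
    return detect_vessels(candidates)                        # step S8
```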
As described above, in the present embodiment, by using an object image from which a mobile body that is a vessel is to be detected, a background image, and a difference image thereof, it is possible to suppress the influence of the land and objects on the land and to detect a vessel with high accuracy. In particular, in the present embodiment, a vessel can be detected with high accuracy by using binary images of the object image, the background image, and the difference image, performing binarization by using the distribution of luminance values of the water area set in each of the object image, the background image, and the difference image, and setting, at that time, a water area from which the extended land area is removed.
In the above description, an example in which the image processing device 10 detects a vessel appearing in the focused image has been provided. However, it is possible to detect various vessels by changing the criterion for extracting a pixel as a candidate vessel in the candidate pixel extraction unit 13. For example, by changing the criterion, it is possible to detect a vessel that has disappeared, that is, a vessel that is at anchor in the background image but is absent from the focused image. In that case, when a focused pixel shows a change in pixel value, that is, when the pixel is a water area in the binary image of the focused image, is not a water area in the binary image of the background image, and is not a water area in the binary image of the difference image, the candidate pixel extraction unit 13 extracts it as a pixel of a candidate for the lost vessel.
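Under the same boolean convention as the earlier candidate-extraction sketch (True means "not a water area"), this changed criterion amounts to flipping the focused and background terms:

```python
def extract_lost_vessel_pixels(bin_focused, bin_background, bin_difference):
    """Variant rule for a vessel present in the background image but
    absent from the focused image: the pixel IS a water area in the
    focused image, is NOT a water area in the background image, and is
    NOT a water area in the difference image."""
    return ~bin_focused & bin_background & bin_difference
```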
Moreover, the image processing device 10 is not limited to detection of a vessel on the sea, which is a water area, and can be applied to detection of any mobile body in any area. For example, the image processing device 10 may process images of an area such as an airport and detect a mobile body such as an aircraft. In that case, the above-described water area (specific area) is replaced with a paved ground. This means that the binary image generation unit 12 sets an area of paved ground instead of a water area as illustrated by a dotted line in
Next, a second exemplary embodiment of the present invention will be described with reference to
First, a hardware configuration of an image processing device 100 in the present embodiment will be described with reference to
The image processing device 100 can construct and be equipped with the image generation means 121 and the detection means 122 illustrated in
Note that
The image processing device 100 executes the image processing method illustrated in the flowchart of
As illustrated in
Since the present invention is configured as described above, by using an object image from which a mobile body is to be detected, a background image, and a difference image thereof, it is possible to detect the mobile body with high accuracy while suppressing the influence of an area different from the place to which the mobile body can move and of objects existing in such an area.
Note that the program described above can be supplied to a computer by being stored in a non-transitory computer-readable medium of any type. Non-transitory computer-readable media include tangible storage media of various types. Examples of non-transitory computer-readable media include magnetic storage media (for example, a flexible disk, a magnetic tape, and a hard disk drive), magneto-optical storage media (for example, a magneto-optical disk), a CD-ROM (Read Only Memory), a CD-R, a CD-R/W, and semiconductor memories (for example, a mask ROM, a PROM (Programmable ROM), an EPROM (Erasable PROM), a flash ROM, and a RAM (Random Access Memory)). Note that the program may also be supplied to a computer by a transitory computer-readable medium of any type. Examples of transitory computer-readable media include electric signals, optical signals, and electromagnetic waves. A transitory computer-readable medium can supply the program to a computer via a wired communication channel such as an electric wire or an optical fiber, or via a wireless communication channel.
While the present invention has been described with reference to the exemplary embodiments described above, the present invention is not limited to the above-described embodiments. The form and details of the present invention can be changed within the scope of the present invention in various manners that can be understood by those skilled in the art. Further, at least one of the functions of the image generation means 121 and the detection means 122 described above may be carried out by an information processing device provided and connected to any location on the network, that is, may be carried out by so-called cloud computing.
The whole or part of the exemplary embodiments disclosed above can be described as the following supplementary notes. Hereinafter, outlines of the configurations of an image processing method, an image processing device, and a program, according to the present invention, will be described. However, the present invention is not limited to the configurations described below.
An image processing method comprising:
The image processing method according to supplementary note 1, further comprising
The image processing method according to supplementary note 2, further comprising
The image processing method according to supplementary note 2 or 3, further comprising
The image processing method according to supplementary note 4, further comprising
The image processing method according to supplementary note 5, further comprising
The image processing method according to supplementary note 5 or 6, further comprising
The image processing method according to any of supplementary notes 5 to 7, further comprising
The image processing method according to any of supplementary notes 2 to 8, wherein
An image processing device comprising:
The image processing device according to supplementary note 10, wherein
The image processing device according to supplementary note 11, wherein the detection means sets an extended exclusion area that is generated by extending an exclusion area specified according to a predetermined criterion from an area included in each of the object image, the corresponding image, and the difference image, and determines a residual area in which the extended exclusion area is removed from the area included in each of the object image, the corresponding image, and the difference image, to be the specific area.
The image processing device according to supplementary note 11 or 12, wherein
The image processing device according to supplementary note 13, wherein on the basis of the distributions generated for the respective specific areas included in the object image, the corresponding image, and the difference image, the detection means generates transformed images that are images from which the specific areas are detectable for the object image, the corresponding image, and the difference image respectively, and extracts the mobile body on a basis of the generated transformed images.
The image processing device according to supplementary note 14, wherein on the basis of the distributions generated for the respective specific areas included in the object image, the corresponding image, and the difference image, the detection means sets thresholds of luminance values for binarizing the object image, the corresponding image, and the difference image respectively, generates the transformed images obtained by binarizing the object image, the corresponding image, and the difference image respectively with use of the thresholds, and extracts the mobile body on the basis of the generated transformed images.
The image processing device according to supplementary note 14 or 15, wherein
The image processing device according to any of supplementary notes 14 to 16, wherein
A computer-readable medium storing thereon a program for causing an information processing device to realize:
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2020/016962 | 4/17/2020 | WO |