This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2012-120105, filed on May 25, 2012; the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to an object detecting apparatus, an object detecting method and a computer program product.
Various technologies for detecting an object from an image captured by a camera are known. It is also known to connect, via a network, a camera that takes images and a processing device that performs processing for detecting an object.
It is, however, difficult to detect whether or not an object is present from a blurred image captured while the camera moves.
According to an embodiment, an object detecting apparatus includes an image acquiring unit and a determining unit. The image acquiring unit is configured to acquire at least one target image for detection, the target image being captured by an image capturing unit while the image capturing unit moves within a second image capturing range included in a predetermined first image capturing range. The determining unit is configured to determine whether or not an object that is not captured in at least one reference image is captured in the target image, on the basis of a difference between each frequency of pixel values in a histogram for a first region and each frequency of pixel values in a second region of the target image corresponding to the first region, the first region being one of a plurality of regions each extending in a direction of blurring caused by movement of the image capturing unit in the reference image, the reference image being captured by the image capturing unit while the image capturing unit moves within the first image capturing range.
Embodiments of an object detecting apparatus will be described in detail below with reference to the accompanying drawings.
The image capturing unit 10 is a digital camera or the like that has a solid-state image sensor such as a charge coupled device (CCD) and a lens (not illustrated), takes a plurality of images at predetermined time intervals and outputs the captured images to the image acquiring unit 20. For example, the image capturing unit 10 captures reference images and target images for detection, which will be described later. The image capturing unit 10 may be configured to take an image at an arbitrarily set time or may be a moving image imaging device that captures images of a plurality of frames within a predetermined time. Thus, the images captured by the image capturing unit 10 may or may not include regions that overlap with one another.
The drive unit 12 drives the image capturing unit 10 so that the image capturing unit 10 can move in a first image capturing range, which will be described later, for imaging. Note that the drive unit 12 moves the image capturing unit 10 during exposure, in which light is let onto the solid-state image sensor or the like via the lens. Specifically, because the image capturing unit 10 moves a substantial amount relative to its shutter speed, an image captured by the image capturing unit 10 is blurred according to the movement of the image capturing unit 10. In addition, when the image capturing unit 10 performs image capturing in a dark place, images are more likely to be blurred even when the movement of the image capturing unit 10 is small, as compared with a case in which image capturing is performed in bright light.
The image-capturing position detecting unit 14 detects position information representing each image-capturing position of the image capturing unit 10 and outputs the detected position information to the drive control unit 16 and the position information acquiring unit 22. For example, the image-capturing position detecting unit 14 is implemented by an encoder provided in the drive unit 12. Alternatively, the image-capturing position detecting unit 14 may be implemented by an acceleration sensor provided in the image capturing unit 10.
The drive control unit 16 receives the position information detected by the image-capturing position detecting unit 14 and controls the drive unit 12 so that the image capturing unit 10 moves within a predetermined range. For example, the drive control unit 16 controls the drive unit 12 so that an image-capturing position of a reference image and an image-capturing position of an image for detection will correspond to each other on the basis of the position information detected by the image-capturing position detecting unit 14.
For example, the image capturing unit 10 captures images I0, I1 and I2 while moving from the start point to the end point. Note that each of the images (the images I0, I1 and I2, for example) that are captured by the image capturing unit 10 in a state in which it is clear that no object to be detected is present is used as a reference image (background image) for detecting an object to be detected by the object detecting apparatus 1. Images that are captured by the image capturing unit 10 while the object detecting apparatus 1 is in operation and an object to be detected needs to be detected are target images for detection, which are distinguished from the reference images. An object to be detected will hereinafter be simply abbreviated to an “object” in some cases.
Note that the range in which the image capturing unit 10 can capture images while moving from the start point to the end point is referred to as a first image capturing range for capturing reference images. In addition, the image capturing unit 10 is allowed to capture images within a second image capturing range, for capturing target images for detection, that is included in the first image capturing range. The first image capturing range and the second image capturing range may be identical. When the second image capturing range is included in the first image capturing range, the number of reference images and the number of target images for detection may each be one or more. The image capturing unit 10 captures a reference image when the object detecting apparatus 1 is initialized, when the image capturing unit 10 has received an instruction from outside (not illustrated), or the like.
The image acquiring unit 20 acquires the images (the reference images and the target images for detection) captured by the image capturing unit 10 and outputs them to the histogram generating unit 24 and the determining unit 28.
The histogram generating unit 24 generates a histogram representing distribution of frequency of each pixel value (luminance) for the images received from the image acquiring unit 20 and normalizes the histogram.
Subsequently, the histogram generating unit 24 divides the image of the captured image range W into a plurality of first regions (R0 to Rk) each extending in the direction of blurring caused by the movement of the image capturing unit 10 and generates a histogram (h0 to hk) of pixel values for each of the first regions.
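For illustration, the per-region histogram computation can be sketched as follows in Python with NumPy. This is a minimal sketch, not the embodiment's implementation: it assumes the direction of blurring is horizontal (so that each first region is a band of image rows), 8-bit grayscale input, and the illustrative function name region_histograms.

```python
import numpy as np

def region_histograms(image, num_regions):
    """Divide a grayscale image into horizontal bands (first regions
    R0 to Rk, assumed to extend in the direction of blurring) and
    return one normalized 256-bin histogram of pixel values per band."""
    bands = np.array_split(image, num_regions, axis=0)  # split along rows
    histograms = []
    for band in bands:
        hist, _ = np.histogram(band, bins=256, range=(0, 256))
        histograms.append(hist / hist.sum())  # normalize the frequencies
    return histograms  # h0 .. hk, one histogram per first region
```

If the blurring were vertical instead, only the split axis would change; taking the bands along the blur direction is what makes each histogram insensitive to where along that direction the image capturing unit was during the exposure.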
The histogram generating unit 24 also associates the generated histograms (and the reference images) with the position information received from the position information acquiring unit 22 so that the image-capturing positions and the histograms correspond to each other, and outputs the result to the storage unit 26.
The storage unit 26 receives and stores the associated histogram (and reference images) and position information from the histogram generating unit 24. The storage unit 26 may also be configured to store a determination result from the determining unit 28, which will be described later.
The determining unit 28 receives the images captured by the image capturing unit 10 via the image acquiring unit 20 and receives the position information detected by the image-capturing position detecting unit 14 via the position information acquiring unit 22. The determining unit 28 also acquires the associated histogram and position information and the like from the storage unit 26. The determining unit 28 then determines whether an object (an object to be detected) that is not imaged in the reference images but is imaged in the target images for detection is present or not and outputs the determination result to the output unit 30.
A method for determining whether or not the object to be detected is present by the determining unit 28 will be described here.
Using a first threshold T1 set in advance for the frequency of pixel values, the determining unit 28 compares the pixel value of each pixel in each second region of a target image for detection with the histogram of the first region corresponding to that second region.
The second region of a target image for detection is a region on the image corresponding to a first region of a reference image. For example, when a target image for detection is captured at the same image-capturing position as a reference image and contains no object that is not present in the reference image, the two images are substantially identical or similar, and the image in each second region is substantially identical or similar to that in the corresponding first region.
Specifically, the determining unit 28 determines pixels of a second region whose pixel values have a frequency equal to or greater than the first threshold T1 to be pixels representing a background, and determines pixels of a second region whose pixel values have a frequency smaller than the first threshold T1 to be object candidate pixels.
The determining unit 28 determines whether the pixels in each second region are pixels representing a background or object candidate pixels by comparison with the histogram of the corresponding first region. In other words, the determining unit 28 makes this determination for every pixel of the target image for detection.
Furthermore, the determining unit 28 determines whether or not the number N of pixels of the target image for detection determined to be object candidate pixels is equal to or greater than a second threshold T2. If the number N of object candidate pixels is equal to or greater than the second threshold T2, the determining unit 28 determines that an object that is not imaged in the reference images but is imaged in the target image for detection is present. Conversely, if the number N of object candidate pixels is smaller than the second threshold T2, the determining unit 28 determines that no object to be detected is present. Note that the second threshold T2 is set in advance by the user according to conditions such as the size of the object whose presence is to be determined and the required determination accuracy.
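As an illustration of this two-threshold test, the following minimal sketch (continuing the region_histograms sketch above) applies the first threshold T1 per pixel and the second threshold T2 to the candidate count N. The threshold values and the function name are assumptions chosen for the example, not values prescribed by the embodiment.

```python
import numpy as np

def object_present(target_image, histograms, t1=0.001, t2=500):
    """Return True when an object not imaged in the reference images is
    judged to be imaged in the target image for detection.
    t1: minimum normalized frequency for a pixel to count as background.
    t2: minimum number N of object candidate pixels (illustrative values)."""
    bands = np.array_split(target_image, len(histograms), axis=0)
    n_candidates = 0
    for band, hist in zip(bands, histograms):
        freqs = hist[band.astype(np.int64)]  # frequency of each pixel's value
        n_candidates += int(np.count_nonzero(freqs < t1))  # object candidates
    return n_candidates >= t2  # compare N with the second threshold T2
```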
The output unit 30 receives and outputs the determination result of the determination by the determining unit 28. For example, the output unit 30 is a display device and displays the determination result and the like, as described later.
Note that the configuration of the object detecting apparatus 1 is not limited to the configuration described above.
Next, an outline of exemplary operation of the object detecting apparatus 1 will be described.
In step S102, the histogram generating unit 24 generates a histogram for each first region of the reference images acquired in step S100.
In step S104, the object detecting apparatus 1 acquires one or more images for detection and position information representing image-capturing positions of the images for detection.
In step S106, the determining unit 28 compares each histogram of the reference images with the pixel value of each pixel of the target images for detection corresponding to those reference images, to determine whether each pixel of the target images for detection is a pixel representing a background or an object candidate pixel (that is, whether or not each pixel is a candidate of the object to be detected). In this process, the determining unit 28 identifies the target image for detection corresponding to a reference image by using the position information of the reference images and the position information of the target images for detection. Thus, the position information of a reference image is preferably identical to the position information of a target image for detection. The operation for detecting an object by the object detecting apparatus 1, however, is not limited to the case in which the position information of a reference image and the position information of a target image for detection are identical. For example, the object detecting apparatus 1 may detect whether or not the object described above is present in a target image for detection by using a histogram generated from three reference images and one target image for detection whose position information is identical or substantially identical to that of any one of the reference images. Alternatively, the object detecting apparatus 1 may detect whether or not the object is present in a target image for detection whose image-capturing position is slightly shifted from that of a reference image. In a case where the background changes little regardless of the image-capturing position within the first image capturing range (that is, the difference in the luminance distribution is small), the drive control unit 16 need not control the drive unit 12 so that the image-capturing position of a reference image and that of a target image for detection correspond to each other. In other words, the correspondence between the image-capturing position of a reference image and that of a target image for detection may be set by the user according to the required accuracy of detection of the object or the like.
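One way to realize this correspondence is to pick, for each target image for detection, the stored histogram set whose image-capturing position is nearest. The following is a minimal sketch in which the flat data layout, a one-dimensional image-capturing position (such as an encoder reading), and the tolerance are all assumptions made for illustration.

```python
def find_reference_histograms(histogram_db, position, tolerance=1.0):
    """Select the histograms whose image-capturing position is closest to
    that of the target image for detection.
    histogram_db: list of (position, histograms) pairs from the storage unit."""
    best_position, best_histograms = min(
        histogram_db, key=lambda entry: abs(entry[0] - position))
    if abs(best_position - position) > tolerance:
        return None  # no sufficiently close reference image was captured
    return best_histograms
```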
In step S108, the determining unit 28 determines whether or not an object that is not imaged in the reference images but is imaged in the target image for detection is present by using the second threshold T2.
In step S110, the output unit 30 receives and outputs the determination result of the determination by the determining unit 28. For example, the output unit 30 may be configured to display a target image for detection for which the determining unit 28 has determined that “the object is present” with a circle in the center thereof.
Modifications
In the embodiment described above, the object detecting apparatus 1 generates a histogram by the histogram generating unit 24 by using all of the reference images (images I0, I1 and I2, for example) captured within the first image capturing range. In a modification of the embodiment, the object detecting apparatus 1 divides the first image capturing range into a plurality of ranges and generates a histogram for each of the divided ranges by the histogram generating unit 24. Specifically, the histogram generating unit 24 divides a captured image range W including all the reference images and generates a plurality of histograms.
Subsequently, the histogram generating unit 24 divides the image of the captured image range Wa into a plurality of first regions (Ra0 to Rak) extending in the direction of blurring caused by the movement of the image capturing unit 10 and generates a histogram (ha0 to hak) of pixel values for each of the first regions. The histograms (ha0 to hak) may be presented as a histogram group image Ha in which the horizontal direction represents the 8-bit pixel value (luminance), the vertical direction represents the subscript numbers 0 to k of the first regions Ra0 to Rak, and the frequency of the pixel values is represented by an 8-bit density value, for example.
The histogram generating unit 24 also divides the image of the captured image range Wb into a plurality of first regions (Rb0 to Rbk) extending in the direction of blurring caused by the movement of the image capturing unit 10 and generates a histogram (hb0 to hbk) of pixel values for each of the first regions. The histograms (hb0 to hbk) may likewise be presented as a histogram group image Hb in which the horizontal direction represents the 8-bit pixel value (luminance), the vertical direction represents the subscript numbers 0 to k of the first regions Rb0 to Rbk, and the frequency of the pixel values is represented by an 8-bit density value, for example.
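The histogram group images Ha and Hb described above can be formed by stacking the per-region histograms. Below is a minimal sketch assuming the normalized histograms from the earlier region_histograms sketch; the rescaling rule is an assumption made for illustration.

```python
import numpy as np

def histogram_group_image(histograms):
    """Stack per-region histograms into a picture: row n is the histogram
    of first region Rn, the 256 columns are the pixel values, and the
    frequency is rescaled to an 8-bit density value."""
    h = np.vstack(histograms)  # shape (k + 1, 256)
    if h.max() > 0:
        h = h / h.max()  # rescale so the largest frequency maps to 255
    return (h * 255).astype(np.uint8)
```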
Next, a method for determining the position at which the histogram generating unit 24 divides the captured image range W will be described.
Next, the histogram generating unit 24 acquires target images for detection, captured in a state in which no object to be detected is present, at the same image-capturing positions as the images I0, I1 and I2, and calculates the number Nm of object candidate pixels. Note that Nm may be calculated by using the images I0, I1 and I2 themselves as target images for detection without performing imaging again at the same image-capturing positions. That is, the histogram generating unit 24 may regard an image Im (each of the images I0, I1 and I2) as a target image for detection and compare each pixel of the first region (second region) of the image Im with the corresponding histogram hn to calculate the number Nm of object candidate pixels of each image Im.
The histogram generating unit 24 then identifies the index m at which |Nm−Nm-1| becomes maximum, and divides the captured image range W between the image-capturing positions of the images Im-1 and Im.
The histogram generating unit 24 does not perform division of the captured image range W when the maximum value of |Nm−Nm-1| is smaller than a predetermined threshold. The histogram generating unit 24 may also be configured to repeat division of the captured image range W until all of |Nm−Nm-1| become smaller than the predetermined threshold.
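The division rule above can be sketched as follows, assuming candidate_counts[m] holds the number Nm obtained by treating reference image Im as a target image for detection; the function name and threshold handling are illustrative. Note that when division is repeated, the embodiment would regenerate the histograms, and hence the counts Nm, within each sub-range before testing it again; the sketch covers a single division step.

```python
def split_index(candidate_counts, threshold):
    """Locate the index m at which |N_m - N_{m-1}| is maximum; return None
    (no division of the captured image range W) when that maximum is
    smaller than the predetermined threshold."""
    diffs = [abs(candidate_counts[m] - candidate_counts[m - 1])
             for m in range(1, len(candidate_counts))]
    if not diffs or max(diffs) < threshold:
        return None  # all adjacent differences are small: keep W undivided
    return diffs.index(max(diffs)) + 1  # divide between I_{m-1} and I_m
```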
While an example of a case in which the determining unit 28 determines whether each pixel of each second region of a target image for detection is a pixel representing a background or an object candidate pixel to be detected has been described in the embodiment described above, the object detecting apparatus 1 is not limited thereto. For example, the object detecting apparatus 1 may be configured such that the determining unit 28 generates a histogram of pixel values for each second region, determines whether or not a histogram of a first region and a histogram of a corresponding second region are different, and determines that an object to be detected is present if the number of second regions determined to be different is a predetermined number or larger.
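This variant may be sketched, for illustration, by reusing the region_histograms function from the earlier sketch. The L1 histogram distance and the two thresholds are assumptions, since the embodiment leaves the difference measure open.

```python
import numpy as np

def object_present_by_regions(target_image, ref_histograms,
                              region_threshold=0.5, count_threshold=3):
    """Variant: build a histogram for each second region of the target
    image, compare it with the histogram of the corresponding first
    region, and report an object when enough regions differ."""
    target_hists = region_histograms(target_image, len(ref_histograms))
    differing = sum(
        1 for th, rh in zip(target_hists, ref_histograms)
        if np.abs(th - rh).sum() > region_threshold  # L1 histogram distance
    )
    return differing >= count_threshold
```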
As described above, the object detecting apparatus according to the embodiment determines whether or not an object to be detected is present on the basis of the difference between the frequencies in a histogram of pixel values generated for each of a plurality of first regions, each extending in the direction of blurring caused by movement, and the frequencies of the pixel values in each corresponding second region. The apparatus can therefore detect whether an object to be detected is present even from a blurred image obtained by image capturing during movement.
Meanwhile, the object detecting apparatus described above can also be put into practice with the use of a general-purpose computer device that serves as the basic hardware. That is, the image-capturing position detecting unit 14, the drive control unit 16, the image acquiring unit 20, the position information acquiring unit 22, the histogram generating unit 24, the storage unit 26, the determining unit 28 and the output unit 30 can be implemented by running computer programs on a processor installed in the computer device. At that time, the object detecting apparatus can be put into practice by installing the computer programs in the computer device in advance. Alternatively, the object detecting apparatus can be put into practice by storing the computer programs in a memory medium such as a compact disk read only memory (CD-ROM), or by distributing the computer programs via a network as a computer program product, and then appropriately installing the computer programs in the computer device. Moreover, the storage unit 26 can be implemented with the use of a memory medium such as a memory that is embedded in the computer device or attached to the computer device from outside, a hard disk, a compact disk recordable (CD-R), a compact disk rewritable (CD-RW), a digital versatile disk random access memory (DVD-RAM), or a digital versatile disk recordable (DVD-R).
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.