This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2016-220648, filed Nov. 11, 2016, the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to a processing device, an image capture device, and an automatic control system.
In recent years, a computational photography technology has received a lot of attention. In this technology, the image capture process is modified so that distance information is encoded into the captured image, which allows the image and the distance information to be acquired at the same time. By using this technology, the distance to an object can be obtained from the captured image.
Various embodiments will be described hereinafter with reference to the accompanying drawings.
In general, according to one embodiment, a processing device includes a memory and a circuit coupled with the memory. The circuit acquires a first image of a first color component and a second image of a second color component. The first image has a non-point-symmetric blur function (point spread function) and captures a first object. The second image has a point-symmetric blur function and captures the first object. The circuit determines whether the first object is on a near side of a first position or on a deep side of the first position when viewed from a capture position based on the first image and the second image.
First, a configuration of an image capture device according to an embodiment will be described with reference to
In recent years, a technology of calculating a distance to an object on an image by using the image has received a lot of attention. However, the process of calculating the distance to the object incurs a high calculation cost, and high-speed calculation may not be easy. In addition, depending on the application, there are cases in which it is important to determine whether the object is on the near side or on the deep side of a reference position rather than to calculate the distance to the object. Therefore, there is a need to realize a new function in order to determine the position of the object with respect to the reference position at a high speed.
As illustrated in
The image sensor 30 receives light passing through the filter 10 and the lens 20, and converts (photoelectrically converts) the received light into an electric signal to generate an image. The image sensor 30 generates an image including pixels, each of which contains at least one color component. As the image sensor 30, for example, a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor is used. The image sensor 30 includes, for example, imaging elements which receive red (R) light, imaging elements which receive green (G) light, and imaging elements which receive blue (B) light. Each imaging element receives the light of the corresponding wavelength band and converts the received light into an electric signal. A color image can be generated by A/D converting these electric signals. In the following, the R component, the G component, and the B component of the image may be referred to as the R image, the G image, and the B image, respectively. Further, the R image, the G image, and the B image can be generated using the electric signals of the red, green, and blue imaging elements, respectively.
The CPU 40 controls various components in the image capture device 100. The CPU 40 executes various programs which are loaded from the nonvolatile memory 90 that is used as a storage device to the RAM 50. In the nonvolatile memory 90, an image generated by the image sensor 30 and a processing result of the image may be stored.
Into the memory card slot 60, various portable storage media such as an SD memory card and an SDHC memory card may be inserted. When a storage medium is inserted into the memory card slot 60, data may be written to and read from the storage medium. The data includes, for example, image data and distance data.
The display 70 is, for example, a liquid crystal display (LCD). The display 70 displays a screen image based on a display signal generated by the CPU 40. Further, the display 70 may be a touch screen display. In this case, for example, a touch panel is disposed on the upper surface of the LCD. The touch panel is a capacitive pointing device for inputting on the screen of the LCD. The touch panel detects a contact position on the screen that is touched by a finger and a movement of the contact position.
The communication unit 80 is an interface device that performs a wired communication or a wireless communication. The communication unit 80 includes a transmitter transmitting a signal in a wired or wireless manner, and a receiver receiving a signal in a wired or wireless manner.
The filter 10 includes two or more color filter regions. The color filter regions each have a non-symmetric shape with respect to the optical center of the image capture device. For example, part of the wavelength band of light transmitted through one color filter region overlaps with part of the wavelength band of light transmitted through another color filter region. The wavelength band of light transmitted through one color filter region may include, for example, the wavelength band of light transmitted through another color filter region. In the following, the description will be given as an example using the filter 10 of
The first filter region 11 and the second filter region 12 may each be a filter changing a transmittance of an arbitrary wavelength band, a polarization filter passing a polarized light in an arbitrary direction, or a microlens changing a focusing power of an arbitrary wavelength band. For example, the filter changing the transmittance of an arbitrary wavelength band may be a primary color filter (RGB), a complementary color filter (CMY), a color compensating filter (CC-RGB/CMY), an infrared/ultraviolet cutoff filter, an ND filter, or a shielding plate. When the first filter region 11 and the second filter region 12 are microlenses, the distribution of focused light rays is deviated by the lens 20, and thus the blur shape changes.
In the following, a case where the first filter region 11 is a yellow (Y) filter region and the second filter region 12 is a cyan (C) filter region will be exemplified in order to help with understanding.
When the filter 10 is disposed in an aperture of the camera, the aperture is divided into two color parts, and this structured aperture constitutes a color-filtered aperture. The image sensor 30 generates an image based on light rays transmitting the color-filtered aperture. The lens 20 may be disposed between the filter 10 and the image sensor 30 on an optical path through which the light is incident on the image sensor 30. The filter 10 may be disposed between the lens 20 and the image sensor 30 on the optical path through which the light is incident on the image sensor 30. When multiple lenses 20 are provided, the filter 10 may be disposed between two of the lenses 20.
More specifically, the light of the wavelength band corresponding to a second sensor 32 transmits both the first filter region 11 of yellow and the second filter region 12 of cyan. The light of the wavelength band corresponding to a first sensor 31 transmits the first filter region 11 of yellow but does not transmit the second filter region 12 of cyan. The light of the wavelength band corresponding to a third sensor 33 transmits the second filter region 12 of cyan but does not transmit the first filter region 11 of yellow.
Transmitting a light of a certain wavelength band through a filter or a filter region means transmitting (passing) the light of the wavelength band through the filter or the filter region at a high transmittance. This means that the attenuation of the light (that is, a reduction of the amount of light) of the wavelength band due to the filter or the filter region is extremely small. Not transmitting the light of a certain wavelength band through a filter or a filter region means shielding the light by the filter or the filter region, for example, transmitting the light of the wavelength band through the filter or the filter region at a low transmittance. This means that the attenuation of the light of the wavelength band due to the filter or the filter region is extremely large. The filter or the filter region attenuates the light by, for example, absorbing the light of a certain wavelength band.
Therefore, the light of the wavelength band corresponding to the R image transmits only the first filter region 11 of yellow, and the light of the wavelength band corresponding to the B image transmits only the second filter region 12 of cyan.
The blur shapes on the R and B images change depending on a distance (or a depth) d to the object. In addition, each of the filter regions 11 and 12 has a non-point-symmetric shape with respect to the optical center 13. Therefore, the directions of blur deviation on the R and B images are inverted according to whether the object is on the near side or on the deep side of the focus position when viewed from an image capture point. The focus position is a point away from the image capture point by a focus distance df, and is a focused position at which blur does not occur on the image captured by the image capture device 100.
The description will be given about a change of the light rays and the blur shape due to the color-filtered aperture where the filter 10 is disposed, with reference to
When an object 15 is on the deep side from the focus distance df (focus position) (d>df), blur occurs in an image captured by the image sensor 30. A blur function indicating a shape of blur on the image is different among the R image, the G image, and the B image. That is, a blur function 101R of the R image indicates the blur shape deviated to the left side, a blur function 101G of the G image indicates the blur shape without deviation, and a blur function 101B of the B image indicates the blur shape deviated to the right side.
When the object 15 is at the focus distance df (d=df), blur almost does not occur in an image captured by the image sensor 30. A blur function indicating a shape of blur on the image is almost the same among the R image, the G image, and the B image. That is, a blur function 102R of the R image, a blur function 102G of the G image, and a blur function 102B of the B image show blur shapes without deviation.
When the object 15 is on the near side from the focus distance df (d<df), blur occurs in an image captured by the image sensor 30. A blur function indicating a shape of blur on the image is different among the R image, the G image, and the B image. That is, a blur function 103R of the R image indicates the blur shape deviated to the right side, a blur function 103G of the G image shows the blur shape without deviation, and a blur function 103B of the B image shows the blur shape deviated to the left side.
When blur occurs on an image captured using the filter 10, a different shape of blur occurs on the R image, the G image, and the B image, respectively. As illustrated in
Blur correction filters 17 and 18, which are configured to correct the non-point-symmetric blur on the R image and the B image into point-symmetric blur based on the blur estimated per distance to an object, are applied to the blur function 16R of the R image and the blur function 16B of the B image. Then, a determination is made as to whether the blur functions 16R and 16B match the blur function 16G of the G image. A plurality of blur correction filters corresponding to a plurality of distances, one per distance at a specific interval, is prepared as the blur correction filters 17 and 18. When a blur function 19R to which the blur correction filter 17 is applied or a blur function 19B to which the blur correction filter 18 is applied matches the blur function 16G of the G image, the distance corresponding to the blur correction filter 17 or 18 is determined as the distance to the captured object 15.
Whether a blur function matches another blur function can be determined using a correlation between the R or B image to which the blur correction filter is applied and the G image. Therefore, for example, searching the blur correction filters for the one that yields the highest correlation between the blur-corrected R or B image and the G image makes it possible to estimate the distance to the object captured in each pixel of the image. That is, corrected images are generated by correcting the blur shape of the R or B image with the plurality of blur correction filters, each created on the assumption of a different distance to the object shown in the image, and the distance at which the correlation between the generated corrected image and the G image is highest is found. In this way, the distance to the object can be calculated.
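For reference, the filter-bank search described above can be written compactly. The sketch below is a minimal illustration, assuming that the blur correction filters are available as two-dimensional convolution kernels indexed by candidate distance; the function and variable names (for example, correction_filters) are hypothetical and are not part of the embodiment.

```python
# Minimal sketch of distance estimation by searching a bank of blur correction filters.
import numpy as np
from scipy.signal import convolve2d

def estimate_distance(target_img, reference_img, correction_filters):
    """target_img: the R or B image with the non-point-symmetric blur.
    reference_img: the G image with the point-symmetric blur.
    correction_filters: dict mapping a candidate distance to a 2D kernel."""
    best_distance, best_corr = None, -np.inf
    for distance, kernel in correction_filters.items():
        # Correct the non-point-symmetric blur under the assumed distance.
        corrected = convolve2d(target_img, kernel, mode="same", boundary="symm")
        # Correlation between the corrected image and the reference (G) image.
        corr = np.corrcoef(corrected.ravel(), reference_img.ravel())[0, 1]
        if corr > best_corr:
            best_distance, best_corr = distance, corr
    return best_distance
```

Because one convolution and one correlation are computed per candidate distance, the cost grows linearly with the number of prepared filters, which is the cost issue discussed next.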
However, the process of calculating the distance to the object using a number of blur correction filters incurs a high calculation cost. Therefore, depending on the number of prepared blur correction filters, the distance to the object may not be obtained fast enough for applications that require real-time processing. In addition, depending on the usage, an exact distance to the object is not necessary. For example, it may be sufficient only to determine whether the object is on the deep side of a reference position or on the near side of the reference position.
Therefore, in this embodiment, by using the image captured using the color-filtered aperture disposed with the filter 10, the distance to the object is not calculated, but it is determined whether the object is on the deep side of the reference position, or on the near side of the reference position. In this embodiment, for example, a blur deviation of a color component that contains a blur having a shape expressed by the non-point-symmetric blur function is determined, so that a relative position of the object with respect to the reference position can be determined at a high speed. A distance (hereinafter, also referred to as a reference distance) from a capture position to the reference position may be the focus distance, or may be an arbitrary distance designated by a user.
When the object 15 is at the focus distance df (d=df), the rays (light flux) corresponding to one point on the object 15 are collected in a narrow range (for example, one point) 302 on the image sensor 30. Therefore, an image 52 having no blur is generated. In contrast, when the object 15 is on the deep side from the focus distance df (d>df), the rays corresponding to one point on the object 15 are not collected at one point on the image sensor 30 but are spread over a wider range than in the case where the object 15 is at the focus distance df. Therefore, an image 51 containing a blur 301 is generated. Likewise, when the object 15 is on the near side from the focus distance df (d<df), the rays corresponding to one point on the object 15 are not collected at one point on the image sensor 30 but are spread over a wider range than in the case where the object 15 is at the focus distance df. Therefore, an image 53 containing a blur 303 is generated.
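Although the embodiment does not give a formula, the dependence of the blur size on the distance d can be illustrated with the standard thin-lens (circle of confusion) relation, where A is the aperture diameter and f is the focal length of the lens 20; this is a textbook approximation added here only for reference, not a statement of the embodiment:

$$ b \;=\; A\,\frac{f}{d_f - f}\cdot\frac{\lvert d - d_f\rvert}{d} $$

Under this relation, the blur size b is zero at d = d_f and increases as the object 15 moves away from the focus position toward either the near side or the deep side, which corresponds to the images 52, 51, and 53.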
As illustrated in
An example of a functional configuration of the image capture device 100 will be described with reference to
The image capture device 100 further includes an image processing unit 41 and a control signal generating unit 42. Each arrow from the image sensor 30 to the control signal generating unit 42 indicates a path of the electric signal. Hardware (circuit), software (program) executed by the CPU 40, or a combination of software and hardware can realize the respective functional configurations of the image capture device 100 including the image processing unit 41 and the control signal generating unit 42.
The image processing unit 41 determines whether the object captured in the image is on the near side or on the deep side of the reference position based on the blur on the image generated by the image sensor 30. The image processing unit 41 includes an acquisition unit 411 and a determination unit 412.
The acquisition unit 411 acquires images generated by the image sensor 30. The acquisition unit 411 acquires, for example, an image of a first color component that has a non-point-symmetric blur function and captures a first object, and an image of a second color component that has a point-symmetric blur function and captures the first object. The first color component is, for example, the R component or the B component, and the second color component is, for example, the G component. The acquisition unit 411 may acquire, for example, an image including pixels each having at least one color component. In this image, blur does not occur in a pixel for which the distance to the object is the focus distance, and blur occurs in a pixel for which the distance to the object is not the focus distance. Further, the blur function indicative of blur of the first color component of the pixels is non-point-symmetric.
The determination unit 412 determines whether the first object is on the near side of the reference position (first position) or on the deep side of the reference position when viewed from the capture position based on the image of the first color component and the image of the second color component. The reference position is, for example, a point at which a distance from the capture position is the reference distance. The reference distance may be the focus distance, or may be an arbitrary distance designated by the user. The image capture device 100 may also include a reception unit 43 that receives information input by the user. The reception unit 43 may receive information indicating the reference distance and information designating a processing target pixel on the acquired image. The reference distance may be calculated from a reference surface given by the user. Alternatively, the reception unit 43 may receive information related to the reference surface in place of the reference distance. The reference surface may be a flat surface, a curved surface or a discontinuous surface. For example, the user may input information indicating the reference distance through an input device such as a mouse, a keyboard or a touch screen display, or may designate a region where a processing target pixel on the image is included. When the reception unit 43 receives information designating a processing target pixel on the image, the determination unit 412 may determine whether the first object containing the processing target pixel is on the near side or on the deep side of the reference position when viewed from the capture position.
In addition, the determination unit 412 may determine a deviation of blur of the first color component on the acquired image. The determination unit 412 determines whether the object is on the near side or on the deep side of the reference position based on the deviation of blur of the first color component. When the reception unit 43 receives the information designating the processing target pixel on the image, the determination unit 412 may determine the deviation of blur of the first color component of the processing target pixel.
The control signal generating unit 42 generates various control signals for controlling the image capture device 100 and/or an external device based on the determination result of the determination unit 412 on whether the object is on the near side or on the deep side of the reference position. The control signal generating unit 42 detects, for example, an event in which the object arrives at the reference position, or an event in which the object goes away from the reference position, based on the determination result, and generates various control signals for controlling the image capture device 100 and/or the external device.
Further, the determination unit 412 also outputs the determination result indicating whether the object is on the near side or on the deep side of the reference position to the external device.
Next, the description will be given with reference to
Originally, that is, if there is no color-filtered aperture where the filter 10 is disposed and no blur, the edge region 511 is configured by a dark color region 511L on the left side and a bright color region 511R on the right side. A boundary between these dark and bright color regions 511L and 511R is an edge 511E. Therefore, a relation 61 between the positions of the pixels and the pixel values in these regions 511L and 511R on each of the R image, the G image and the B image shows a sharp edge shape.
In practice, the edge region 511 is affected by the color-filtered aperture, and contains the blur. Therefore, a first region 611 on the left side and a second region 612 on the right side of the edge 511E on the image 51 have a tinge of red.
More specifically, in the edge region 511 on the G image, there occurs a point-symmetric blur that is expressed by the blur function 101G. Therefore, a relation 61G between the positions of the pixels and the pixel values in the edge region 511 on the G image shows that a large blur occurs on both of the first region 611 on the left side of the edge 511E and the second region 612 on the right side of the edge 511E.
In the edge region 511 on the R image, there occurs a non-point-symmetric blur that is deviated to the left side and is expressed by the blur function 101R. Therefore, a relation 61R between the positions of the pixels and the pixel values in the edge region 511 on the R image shows that a large blur occurs in the first region 611 on the left side of the edge 511E and a small blur occurs in the second region 612 on the right side of the edge 511E.
In the edge region 511 on the B image, there occurs the non-point-symmetric blur that is deviated to the right side and is expressed by the blur function 101B. Therefore, a relation 61B between the positions of the pixels and the pixel values in the edge region 511 on the B image shows that a small blur occurs in the first region 611 on the left side of the edge 511E, and a large blur occurs in the second region 612 on the right side of the edge 511E.
In this way, the light of the red wavelength band and the light of the blue wavelength band pass through part of the filter, and thus the non-point-symmetric blur occurs.
Therefore, when the object 15 is on the deep side from the focus distance, the edge region 511 has a characteristic that a gradient of the first region 611 on the R image is large and a gradient of the second region 612 on the R image is small, and a characteristic that the gradient of the first region 611 on the B image is small and the gradient of the second region 612 on the B image is large.
On the basis of these characteristics, the determination unit 412 determines that the object 15 is on the deep side from the focus distance based on, for example, (1) that the gradient of the first region 611 on the R image is equal to or more than a first threshold and the gradient of the second region 612 on the R image is less than a second threshold, and/or (2) that the gradient of the first region 611 on the B image is less than the second threshold and the gradient of the second region 612 on the B image is equal to or more than the first threshold, by using the pixels in the edge region 511.
Originally, that is, if there is no color-filtered aperture where the filter 10 is disposed and no blur, the edge region 512 is configured by a bright color region 512L on the left side and a dark color region 512R on the right side. A boundary between these bright and dark color regions 512L and 512R is an edge 512E. Therefore, a relation 62 between the positions of the pixels and the pixel values in these regions 512L and 512R on each of the R image, the G image, and the B image shows a sharp edge shape.
In practice, the edge region 512 is affected by the color-filtered aperture, and contains the blur. Therefore, a first region 621 on the left side and a second region 622 on the right side of the edge 512E on the image 51 have a tinge of blue.
More specifically, in the edge region 512 on the G image, there occurs a point-symmetric blur that is expressed by the blur function 101G. Therefore, a relation 62G between the positions of the pixels and the pixel values in the edge region 512 on the G image shows that a large blur occurs on both of the first region 621 on the left side of the edge 512E and the second region 622 on the right side of the edge 512E.
In the edge region 512 on the R image, there occurs a non-point-symmetric blur that is deviated to the left side and is expressed by the blur function 101R. Therefore, a relation 62R between the positions of the pixels and the pixel values in the edge region 512 on the R image shows that a large blur occurs in the first region 621 on the left side of the edge 512E and a small blur occurs in the second region 622 on the right side of the edge 512E.
In the edge region 512 on the B image, there occurs the non-point-symmetric blur which is deviated to the right side and is expressed by the blur function 101B. Therefore, a relation 62B between the positions of the pixels and the pixel values in the edge region 512 on the B image shows that a small blur occurs in the first region 621 on the left side of the edge 512E, and a large blur occurs in the second region 622 on the right side of the edge 512E.
Therefore, when the object 15 is on the deep side from the focus distance, the edge region 512 has a characteristic that a gradient of the first region 621 on the R image is large and a gradient of the second region 622 on the R image is small, and a characteristic that the gradient of the first region 621 on the B image is small and the gradient of the second region 622 on the B image is large.
On the basis of these characteristics, the determination unit 412 determines that the object 15 is on the deep side from the focus distance based on, for example, (1) that the gradient of the first region 621 on the R image is equal to or more than the first threshold and the gradient of the second region 622 on the R image is less than the second threshold, and/or (2) that the gradient of the first region 621 on the B image is less than the second threshold and the gradient of the second region 622 on the B image is equal to or more than the first threshold, by using the pixels in the edge region 512.
In this way, it is possible to determine whether the object is on the near side or on the deep side of the focus distance by using the edge region of the image generated by the light of the wavelength band where the point-symmetric blur occurs and the edge region of the image generated by the light of the wavelength band where the non-point-symmetric blur occurs.
Originally, that is, if there is no color-filtered aperture where the filter 10 is disposed and no blur, the edge region 531 is configured by a dark color region 531L on the left side and a bright color region 531R on the right side. A boundary between these dark and bright color regions 531L and 531R is an edge 531E. Therefore, a relation 63 between the positions of the pixels and the pixel values in these regions 531L and 531R on each of the R image, the G image and the B image shows a sharp edge shape.
In practice, the edge region 531 is affected by the color-filtered aperture, and contains the blur. Therefore, a first region 631 on the left side and a second region 632 on the right side of the edge 531E on the image 53 have a tinge of blue.
More specifically, in the edge region 531 on the G image, there occurs a point-symmetric blur expressed by the blur function 103G. Therefore, a relation 63G between the positions of the pixels and the pixel values in the edge region 531 on the G image shows that a large blur occurs on both of the first region 631 on the left side of the edge 531E and the second region 632 on the right side of the edge 531E.
In the edge region 531 on the R image, there occurs a non-point-symmetric blur that is deviated to the right side and is expressed by the blur function 103R. Therefore, a relation 63R between the positions of the pixels and the pixel values in the edge region 531 on the R image shows that a small blur occurs in the first region 631 on the left side of the edge 531E and a large blur occurs in the second region 632 on the right side of the edge 531E.
In the edge region 531 on the B image, there occurs the non-point-symmetric blur that is deviated to the left side and is expressed by the blur function 103B. Therefore, a relation 63B between the positions of the pixels and the pixel values in the edge region 531 on the B image shows that a large blur occurs in the first region 631 on the left side of the edge 531E, and a small blur occurs in the second region 632 on the right side of the edge 531E.
Therefore, when the object 15 is on the near side from the focus distance, the edge region 531 has a characteristic that a gradient of the first region 631 on the R image is small and a gradient of the second region 632 on the R image is large, and a characteristic that the gradient of the first region 631 on the B image is large and the gradient of the second region 632 on the B image is small.
On the basis of these characteristics, the determination unit 412 determines that the object 15 is on the near side from the focus distance based on, for example, (1) that the gradient of the first region 631 on the R image is less than the second threshold and the gradient of the second region 632 on the R image is equal to or more than the first threshold, and/or (2) that the gradient of the first region 631 on the B image is equal to or more than the first threshold and the gradient of the second region 632 on the B image is less than the second threshold, by using the pixels in the edge region 531.
Originally, that is, if there is no color-filtered aperture where the filter 10 is disposed and no blur, the edge region 532 is configured by a bright color region 532L on the left side and a dark color region 532R on the right side. A boundary between these bright and dark color regions 532L and 532R is an edge 532E. Therefore, a relation 64 between the positions of the pixels and the pixel values in these regions 532L and 532R on each of the R image, the G image and the B image shows a sharp edge shape.
In practice, the edge region 532 is affected by the color-filtered aperture, and contains the blur. Therefore, a first region 641 on the left side and a second region 642 on the right side of the edge 532E on the image 53 have a tinge of red.
More specifically, in the edge region 532 on the G image, there occurs a point-symmetric blur expressed by the blur function 103G. Therefore, a relation 64G between the positions of the pixels and the pixel values in the edge region 532 on the G image shows that a large blur occurs on both of the first region 641 on the left side of the edge 532E and the second region 642 on the right side of the edge 532E.
In the edge region 532 on the R image, there occurs a non-point-symmetric blur which is deviated to the right side and is expressed by the blur function 103R. Therefore, a relation 64R between the positions of the pixels and the pixel values in the edge region 532 on the R image shows that a small blur occurs in the first region 641 on the left side of the edge 532E and a large blur occurs in the second region 642 on the right side of the edge 532E.
In the edge region 532 on the B image, there occurs the non-point-symmetric blur that is deviated to the left side and is expressed by the blur function 103B. Therefore, a relation 64B between the positions of the pixels and the pixel values in the edge region 532 on the B image shows that a large blur occurs in the first region 641 on the left side of the edge 532E, and a small blur occurs in the second region 642 on the right side of the edge 532E.
Therefore, when the object 15 is on the near side from the focus distance, the edge region 532 has a characteristic that a gradient of the first region 641 on the R image is small and a gradient of the second region 642 is large on the R image, and a characteristic that the gradient of the first region 641 on the B image is large and the gradient of the second region 642 on the B image is small.
On the basis of these characteristics, the determination unit 412 determines that the object 15 is on the near side from the focus distance based on, for example, (1) that the gradient of the first region 641 on the R image is less than the second threshold and the gradient of the second region 642 on the R image is equal to or more than the first threshold, and/or (2) that the gradient of the first region 641 on the B image is equal to or more than the first threshold and the gradient of the second region 642 on the B image is less than the second threshold, by using the pixels in the edge region 532.
In this way, the determination unit 412 determines the deviation of blur shown in the gradient of the region on the left side of the edge and in the gradient of the region on the right side of the edge, so that it is possible to determine whether the object is on the near side or on the deep side from the focus distance.
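As an illustration of the determination described above, the following sketch implements the threshold comparison for an edge region such as the edge region 511 or 531. It is a minimal example under assumed inputs: the pixel values of the first (left) and second (right) regions are given as one-dimensional arrays, the gradient of each region is taken as the mean absolute difference of adjacent pixel values, and t1 and t2 correspond to the first and second thresholds; none of these concrete choices are prescribed by the embodiment.

```python
# Minimal sketch of the gradient-based near/deep determination.
import numpy as np

def region_gradient(pixels):
    # Mean absolute difference of adjacent pixel values, used as the gradient of the region.
    return float(np.mean(np.abs(np.diff(np.asarray(pixels, dtype=float)))))

def near_or_deep_from_gradients(r_left, r_right, b_left, b_right, t1, t2):
    """Determine the side of the focus distance from the gradients of the
    first (left) and second (right) regions on the R and B images."""
    gr_l, gr_r = region_gradient(r_left), region_gradient(r_right)
    gb_l, gb_r = region_gradient(b_left), region_gradient(b_right)
    # Deep side: the R blur is deviated toward the left region, the B blur toward the right region.
    if (gr_l >= t1 and gr_r < t2) or (gb_l < t2 and gb_r >= t1):
        return "deep"
    # Near side: the deviations are inverted.
    if (gr_l < t2 and gr_r >= t1) or (gb_l >= t1 and gb_r < t2):
        return "near"
    return "undetermined"
```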
Further, the description will be given with reference to
The determination unit 412 determines pixels included in an inner circle 72 and pixels included in a shaded portion 73 between the circle 72 and the outer circle when double circle 71 is disposed to be matched in the center in the edge region 511 configured by the dark color region 511L and the bright color region 511R of the R image. The sizes of the circle 72 and the shaded portion 73 forming the double circle 71 can be determined, for example, based on the characteristic of the lens 20 and the size of the edge region 511. For example, the radius of the inner circle 72 can be set to a length from a boundary between the dark color region 511L and the bright color region 511R to a pixel farthest away from the boundary contained in a region where the blur is distributed on the R image. The region where the blur is distributed is changed according to the distance to the object, and is not known in advance. Therefore, the region may be estimated from a profile of the pixel values, or may be set to a value that is not too large.
Even in the G and B images, the double circle 71 is set to be matched with the position of the double circle 71 of the R image. The center of the circle is, for example, set on the boundary between the dark color region 511L and the bright color region 511R.
The determination unit 412 calculates a value of R/(R+G+B) using the pixel values of the R component, the G component and the B component of the pixels contained in each circle 72. In addition, the determination unit 412 calculates a value of R/(R+G+B) using the pixel values of the R component, the G component and the B component of the pixels contained in the shaded portion 73. Then, as illustrated in
The values of R/(R+G+B) in two regions may be calculated using the color image obtained by combining the R image, the G image and the B image. In addition, the values of R/(R+G) or the values of R/(G+B) in two regions may be calculated in place of R/(R+G+B). The value of R/(R+G+B) of the shaded portion 73 and the circle 72 may be used in place of calculating the value of R/(R+G+B) of the shaded portion 73. In place of the double circle 71, a polygonal shape such as a rectangular shape overlapping at least in a part or other shapes may be used.
The determination unit 412 determines pixels included in an inner circle 72 and pixels included in a shaded portion 73 between the circle 72 and the outer circle when double circle 71 is disposed to be matched in the center in the edge region 512 configured by the bright color region 512L and the dark color region 512R of the B image. The sizes of the circle 72 and the shaded portion 73 forming the double circle 71 can be determined, for example, based on the characteristic of the lens 20 and the size of the edge region 512. For example, the radius of the inner circle 72 can be set to a length from a boundary between the bright color region 512L and the dark color region 512R to a pixel farthest away from the boundary contained in a region where the blur is distributed on the B image. The region where the blur is distributed is changed according to the distance to the object, and is not known in advance. Therefore, the region may be estimated from a profile of the pixel values, or may be set to a value that is not too large.
Even in the G and R images, the double circle 71 is set to be matched with the position of the double circle 71 of the B image. The center of the circle is, for example, set on the boundary between the bright color region 512L and the dark color region 512R.
The determination unit 412 calculates a value of B/(R+G+B) using the pixel values of the R component, the G component and the B component of the pixels contained in each circle 72. In addition, the determination unit 412 calculates a value of B/(R+G+B) using the pixel values of the R component, the G component and the B component of the pixels contained in the shaded portion 73. Then, as illustrated in
The values of B/(R+G+B) in two regions may be calculated using the color image obtained by combining the R image, the G image and the B image. In addition, the values of B/(R+G) or the values of B/(G+B) in two regions may be calculated in place of B/(R+G+B). The value of B/(R+G+B) of the shaded portion 73 and the circle 72 may be used in place of calculating the value of B/(R+G+B) of the shaded portion 73. In place of the double circle 71, a polygonal shape such as a rectangular shape overlapping at least in a part or other shapes may be used.
The determination unit 412 determines pixels included in an inner circle 72 and pixels included in a shaded portion 73 between the circle 72 and the outer circle when double circle 71 is disposed to be matched in the center in the edge region 531 configured by the dark color region 531L and the bright color region 531R of the B image. The sizes of the circle 72 and the shaded portion 73 forming the double circle 71 can be determined, for example, based on the characteristic of the lens 20 and the size of the edge region 531. For example, the radius of the inner circle 72 can be set to a length from a boundary between the dark color region 531L and the bright color region 531R to a pixel farthest away from the boundary contained in a region where the blur is distributed on the B image. The region where the blur is distributed is changed according to the distance to the object, and is not known in advance. Therefore, the region may be estimated from a profile of the pixel values, or may be set to a value that is not too large.
Even in the G and R images, the double circle 71 is set to be matched with the position of the double circle 71 of the B image. The center of the circle is, for example, set on the boundary between the dark color region 531L and the bright color region 531R.
The determination unit 412 calculates a value of B/(R+G+B) using the pixel values of the R component, the G component, and the B component of the pixels contained in each circle 72. In addition, the determination unit 412 calculates a value of B/(R+G+B) using the pixel values of the R component, the G component and the B component of the pixels contained in the shaded portion 73. Then, as illustrated in
The values of B/(R+G+B) in two regions may be calculated using the color image obtained by combining the R image, the G image and the B image. In addition, the values of B/(R+G) or the values of B/(G+B) in two regions may be calculated in place of B/(R+G+B). The value of B/(R+G+B) of the shaded portion 73 and the circle 72 may be used in place of calculating the value of B/(R+G+B) of the shaded portion 73. In place of the double circle 71, a polygonal shape such as a rectangular shape overlapping at least in a part or other shapes may be used.
The determination unit 412 determines pixels included in an inner circle 72 and pixels included in a shaded portion 73 between the circle 72 and the outer circle when double circle 71 is disposed to be matched in the center in the edge region 532 configured by the bright color region 532L and the dark color region 532R of the R image. The sizes of the circle 72 and the shaded portion 73 forming the double circle 71 can be determined, for example, based on the characteristic of the lens 20 and the size of the edge region 532. For example, the radius of the inner circle 72 can be set to a length from a boundary between the bright color region 532L and the dark color region 532R to a pixel farthest away from the boundary contained in a region where the blur is distributed on the R image. The region where the blur is distributed is changed according to the distance to the object, and is not known in advance. Therefore, the region may be estimated from a profile of the pixel values, or may be set to a value that is not too large.
Even in the G and B images, the double circle 71 is set to be matched with the position of the double circle 71 of the R image. The center of the circle is, for example, set on the boundary between the bright color region 532L and the dark color region 532R.
The determination unit 412 calculates a value of R/(R+G+B) using the pixel values of the R component, the G component and the B component of the pixels contained in each circle 72. In addition, the determination unit 412 calculates a value of R/(R+G+B) using the pixel values of the R component, the G component and the B component of the pixels contained in the shaded portion 73. Then, as illustrated in
The values of R/(R+G+B) in two regions may be calculated using the color image obtained by combining the R image, the G image and the B image. In addition, the values of R/(R+G) or the values of R/(G+B) in two regions may be calculated in place of R/(R+G+B). The value of R/(R+G+B) of the shaded portion 73 and the circle 72 may be used in place of calculating the value of R/(R+G+B) of the shaded portion 73. In place of the double circle 71, a polygonal shape such as a rectangular shape overlapping at least in a part or other shapes may be used.
In this way, the determination unit 412 can determine whether the object is on the near side or on the deep side from the focus distance by determining a deviation of blur that indicates a ratio of the color component in the edge region.
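The ratio computation described above can be sketched as follows. The sketch assumes that the double circle 71 has already been placed with its center on the edge and that Boolean masks selecting the pixels of the inner circle 72 and of the shaded portion 73 are given; the comparison of the two ratios that yields the final near/deep decision depends on the relations illustrated in the figures and is therefore left to the caller. All names are illustrative only.

```python
# Minimal sketch of the color-ratio computation over the double circle 71.
import numpy as np

def color_ratio(r, g, b, mask, channel="R"):
    """Ratio of one color component to R+G+B over the pixels selected by `mask`."""
    r, g, b = (np.asarray(x, dtype=float) for x in (r, g, b))
    total = r[mask].sum() + g[mask].sum() + b[mask].sum()
    numerator = {"R": r, "G": g, "B": b}[channel][mask].sum()
    return numerator / total if total > 0 else 0.0

def edge_region_ratios(r, g, b, inner_mask, ring_mask, channel="R"):
    """Return the ratios for the inner circle 72 and the shaded portion 73."""
    ratio_circle = color_ratio(r, g, b, inner_mask, channel)
    ratio_ring = color_ratio(r, g, b, ring_mask, channel)
    return ratio_circle, ratio_ring
```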
With the above configuration, it is possible to determine whether the object is on the near side or on the deep side from the focus distance (focus position). Since there is no need to calculate the distance from the image capture device to the object, the determination of this embodiment can be performed by a simple process at a high speed.
The reference distance may be the focus distance described above, or other distances may be used. The reference distance may be an arbitrary distance designated by the user. In the following, the description will be given with reference to
As illustrated in
Using this characteristic, the determination unit 412 obtains two correction images by applying, to the target image having the non-point-symmetric blur, a first blur correction filter for correcting the blur for the case where the object 15 is on the near side from the reference distance and a second blur correction filter for correcting the blur for the case where the object 15 is on the deep side from the reference distance. The determination unit 412 then determines which of the two correction images has the higher correlation with the reference image having the point-symmetric blur, and thereby determines whether the object 15 is on the near side or on the deep side from the reference distance. That is, when the correction image having the higher correlation with the reference image is the image to which the first blur correction filter is applied, the determination unit 412 determines that the object 15 is on the near side from the reference distance. When the correction image having the higher correlation with the reference image is the image to which the second blur correction filter is applied, the determination unit 412 determines that the object 15 is on the deep side from the reference distance. In other words, it can be said that the determination unit 412 determines whether the actual distance to the object 15 is closer to the distance assumed by the first blur correction filter, which is on the near side from the reference distance, or to the distance assumed by the second blur correction filter, which is on the deep side from the reference distance.
More specifically, the determination unit 412 applies the first blur correction filter to the target image having the non-point-symmetric blur to correct the blur when the object 15 is on the near side from the reference distance, and thus calculates a first correction image. The first blur correction filter is, for example, a filter to correct the blur when the object 15 is on the near side by a predetermined distance from the reference distance. In addition, the determination unit 412 applies the second blur correction filter to the target image to correct the blur when the object 15 is on the deep side from the reference distance, and thus calculates a second correction image. The second blur correction filter is, for example, a filter to correct the blur when the object 15 is on the deep side by the predetermined distance from the reference distance.
Next, the determination unit 412 calculates a first correlation value between the first correction image and the reference image. The determination unit 412 also calculates a second correlation value between the second correction image and the reference image. The first correlation value and the second correlation value may be obtained using, for example, a normalized cross-correlation (NCC), a zero-mean normalized cross-correlation (ZNCC), or a color alignment measure.
The determination unit 412 compares the first correlation value with the second correlation value. If the first correlation value is larger than the second correlation value, the determination unit 412 determines that the object 15 is on the near side from the reference distance. On the other hand, if the second correlation value is larger than the first correlation value, the determination unit 412 determines that the object 15 is on the deep side from the reference distance.
Further, the determination unit 412 may calculate a first difference degree between the first correction image and the reference image, and may calculate a second difference degree between the second correction image and the reference image. If the first difference degree is larger than the second difference degree, the determination unit 412 determines that the object 15 is on the deep side from the reference distance. On the other hand, when the second difference degree is larger than the first difference degree, the determination unit 412 determines that the object 15 is on the near side from the reference distance. The first difference degree and the second difference degree are obtained using, for example, a sum of squared differences (SSD) or a sum of absolute differences (SAD).
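A minimal sketch of this two-filter comparison is shown below, assuming that the first (near-side) and second (deep-side) blur correction filters are given as two-dimensional convolution kernels and using ZNCC as the correlation measure; SSD could be used analogously as the difference degree. The kernel variables and helper names are illustrative only.

```python
# Minimal sketch of the near/deep determination using two blur correction filters.
import numpy as np
from scipy.signal import convolve2d

def zncc(a, b):
    # Zero-mean normalized cross-correlation between two images.
    a = a.ravel() - a.mean()
    b = b.ravel() - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def near_or_deep_by_correction(target_img, reference_img, near_kernel, deep_kernel):
    """Apply the first (near-side) and second (deep-side) blur correction filters and
    pick the side whose correction image correlates better with the reference image."""
    first_correction = convolve2d(target_img, near_kernel, mode="same", boundary="symm")
    second_correction = convolve2d(target_img, deep_kernel, mode="same", boundary="symm")
    first_corr = zncc(first_correction, reference_img)    # first correlation value
    second_corr = zncc(second_correction, reference_img)  # second correlation value
    return "near" if first_corr > second_corr else "deep"
```

Only two convolutions and two correlations are computed, which is why this determination is cheaper than the full distance search over many blur correction filters.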
With the above configuration, it is possible to determine whether the object is on the deep side or on the near side from the reference distance (reference position) by a simple process at a high speed. The determination unit 412 applies two blur correction filters to the target image. Therefore, it can be said that the calculation cost is less than that in the case of the process of obtaining the distance to the object by applying a number of blur correction filters to the target image as described with reference to
As described above, the control signal generating unit 42 generates various control signals for controlling the image capture device 100 and the external devices based on the determination result of the determination unit 412 on whether the object is on the near side or on the deep side of the reference distance. The determination unit 412 may transmit a signal containing the determination result to the control signal generating unit 42, and the control signal generating unit 42 may generate the control signal for controlling the focus distance and zooming in or out of the image capture device 100 based on the determination result of the determination unit 412.
Further, the signal generated by the determination unit 412 includes, for example, data on a captured image and data on the determination result on the pixel in the captured image. The data on the captured image is, for example, data on a color space expressed by RGB or YUV of the pixels. The determination unit 412 can generate (output), for example, a list of sets of three pixel values of RGB or YUV of a pixel and the determination result on the pixel. The sets are arranged in an order of pixels included in the captured image. The order is, for example, an order of raster scanning from the pixel at the left upper end to the pixel at the right lower end of the captured image. In addition, the determination unit 412 may generate data of a list of only determination results arranged in the order, or may generate a list of sets between coordinates of the pixel on the captured image and the determination result on the pixel. As described above, the determination unit 412 can determine whether the object is on the near side or on the deep side of the reference distance with respect to a pixel of a processing target designated in the captured image. Therefore, the list may not include the determination results on all the pixels in the image, but may contain the determination results on some pixels in the image. In addition, an image and a numerical value based on the generated list may be displayed on the display 70. For example, a pop-up screen may be displayed on the captured image displayed on the display 70 to show whether the object is on the near side or on the deep side of the reference distance. In addition, an image by which a user can identify whether the object is on the near side or on the deep side of the reference distance may be displayed on the display 70. The displayed image is, for example, an image in which the pixels on the near side of the reference distance and the pixels on the deep side of the reference distance are separated by color.
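As an illustration of the output format described in this paragraph, the following sketch builds a raster-order list of sets of the three pixel values and the determination result; the array names and the string labels used for the determination result are assumptions for the example.

```python
# Minimal sketch of the raster-order output list of (R, G, B, determination result).
import numpy as np

def build_output_list(r, g, b, results):
    """r, g, b: 2D arrays of pixel values; results: 2D array of per-pixel
    determination results (for example, 'near' or 'deep') aligned with the image."""
    height, width = np.asarray(r).shape
    output = []
    for y in range(height):        # raster scan from the upper-left pixel
        for x in range(width):     # to the lower-right pixel
            output.append((int(r[y][x]), int(g[y][x]), int(b[y][x]), results[y][x]))
    return output
```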
In addition, for example, when the object to be focused is at a position different from the focus distance, the control signal generating unit 42 generates a control signal for changing the focus distance to the position of the object. The image capture device 100 controls the lens 20 according to the generated control signal to change the focus distance to the near side or to the deep side. Therefore, an automatic focus and a tracking focus can be realized with respect to the object. The position to be focused may be input from the reception unit 43.
In addition, for example, if the object is on the deep side from the reference distance, the control signal generating unit 42 generates a control signal for zooming in. If the object is on the near side from the reference distance, the control signal generating unit 42 generates a control signal for zooming out. The image capture device 100 performs a zoom-in operation or a zoom-out operation by controlling the lens 20 that is a zoom lens according to the generated control signal. Therefore, the object on the image can be kept constant in size. Further, the reference distance may be kept constant even after the zoom-in or zoom-out operation.
Furthermore, when the image capture device 100 is a video recording device that records an image, the control signal generating unit 42 may generate a control signal based on the determination result on whether the object is on the near side or on the deep side from the reference distance. The control signal relates to a recording start of the image, a recording stop of the image, a resolution switching, and/or a compression ratio switching. The video recording device includes a device having a function of recording continuously captured images, such as a monitor camera, a drive recorder, or a camera equipped in a drone. The image capture device 100 that is the video recording device performs the recording start of the image, the recording stop of the image, the resolution switching, or the compression ratio switching according to the generated control signal. Therefore, when the object is on the near side from the reference distance, the image capture device 100 may start the recording of the image, increase the resolution, or lower the compression ratio. For example, the recording of the image may start, the resolution may be increased, or the compression ratio may be lowered from a time point when a person approaches a region within the reference distance from the monitor camera provided in a house, or from a time point immediately before an accident in which an object approaches a region within the reference distance from the camera of the drive recorder. In addition, when the object goes away toward the deep side, the image capture device 100 may stop the recording of the image, lower the resolution, or increase the compression ratio. Further, for example, when the image capture device 100 captures the ground from the sky and a distant image containing an object on the deep side of the reference distance is recorded, the resolution may be increased or the compression ratio may be lowered in order to observe a detailed portion of the distant object.
In addition, when the image capture device 100 is a video recording device that records an image, the image capture device 100 may include an attribute information generating unit 44 to generate attribute information corresponding to the recorded image. The attribute information generating unit 44 generates the attribute information for at least one image based on the determination result on whether the object is on the near side or on the deep side of the reference distance. For example, the attribute information generating unit 44 generates the attribute information (that is, an index) for at least one image corresponding to a scene in which the object approaches on the near side. The attribute information generating unit 44 can record the image and the attribute information in association with each other. Therefore, when the user watches the recorded video or the recorded images, the user can play only the scenes for which the attribute information is generated and skip the other scenes, so that the user can efficiently watch only the scenes in which an event occurs. Conversely, the user can efficiently watch only the scenes in which an event does not occur by playing the scenes for which the attribute information is not generated.
Next, the description will be given with reference to a flowchart of
First, the CPU 40 of the image capture device 100 determines whether an image is acquired (step S11). When an image is not acquired (No in step S11), it is determined again whether an image is acquired by returning to step S11.
When an image is acquired (Yes in step S11), the CPU 40 sets an image (for example, the G image) of a color component containing a point-symmetric blur of color components in the acquired image as the reference image, and detects an edge of an object from the reference image (step S12). For example, when a difference between the pixel values of an interested pixel on the reference image and an adjacent pixel is equal to or more than a threshold, the CPU 40 detects the interested pixel as the edge.
Next, the CPU 40 sets an image (for example, the R image or the B image) of a color component containing the non-point-symmetric blur of the color components in the acquired image as the target image, and determines pixels corresponding to the edge region containing the edge detected in step S12 from the target image (step S13). The edge region contains, for example, the pixels detected as the edge and pixels on either side of them. The CPU 40 calculates a deviation of blur in the edge region using the pixel values of the determined pixels (step S14). The deviation of blur is expressed by, for example, the gradient of the first region and the gradient of the second region in the edge region. For example, the first region contains pixels positioned on the left side of the edge, and the second region contains pixels positioned on the right side of the edge. In this case, the deviation of blur is expressed by a gradient calculated based on the pixel values of the pixels positioned on the left side of the edge and a gradient calculated based on the pixel values of the pixels positioned on the right side of the edge.
Then, the CPU 40 determines whether the object is on the near side or on the deep side from the focus distance based on the calculated deviation of blur (step S15). For example, the CPU 40 determines whether the object is on the near side or on the deep side from the focus distance based on a magnitude relation between the gradient of the first region and the gradient of the second region.
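The following is a minimal sketch of steps S13 to S15 for a single edge pixel on one row of the target image. The width of the edge region and, in particular, which magnitude relation corresponds to "near" are assumptions; the actual correspondence depends on the filter 10 and the optics.

```python
import numpy as np

def near_or_deep(target_row, edge_index, width=3):
    """Compare the gradient of the first region (left of the edge) with that
    of the second region (right of the edge) and return a determination.
    The mapping from the magnitude relation to near/deep is an assumption."""
    left = target_row[max(0, edge_index - width):edge_index]
    right = target_row[edge_index + 1:edge_index + 1 + width]
    grad_left = np.mean(np.diff(left)) if len(left) > 1 else 0.0
    grad_right = np.mean(np.diff(right)) if len(right) > 1 else 0.0
    return "near" if abs(grad_left) > abs(grad_right) else "deep"

# Example: one row of a target (R) image whose blur is steeper on the left
# of the edge located at index 5.
row = np.array([0, 0, 10, 40, 90, 100, 100, 98, 96, 94], dtype=float)
print(near_or_deep(row, edge_index=5))
```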
In addition, a flowchart of
First, the CPU 40 of the image capture device 100 determines whether an image has been acquired (step S21). When no image has been acquired (No in step S21), the process returns to step S21 and the determination is performed again.
When an image has been acquired (Yes in step S21), the CPU 40 sets, as the reference image, an image (for example, the G image) of the color component having the point-symmetric blur among the color components of the acquired image, and sets, as the target image, an image (for example, the R image or the B image) of a color component having the non-point-symmetric blur. The CPU 40 applies to the target image a correction filter that corrects the blur on the assumption that the object is on the near side of the reference distance, thereby generating a first correction image (step S22). In addition, the CPU 40 applies to the target image a correction filter that corrects the blur on the assumption that the object is on the deep side of the reference distance, thereby generating a second correction image (step S23).
Then, the CPU 40 calculates the first correlation value between the first correction image and the reference image (step S24). In addition, the CPU 40 calculates the second correlation value between the second correction image and the reference image (step S25).
Next, the CPU 40 determines whether the calculated first correlation value is larger than the second correlation value (step S26). If the first correlation value is larger than the second correlation value (Yes in step S26), the CPU 40 determines that the object is on the near side from the reference distance (step S27). On the other hand, if the first correlation value is equal to or less than the second correlation value (No in step S26), the CPU 40 determines that the object is on the deep side from the reference distance (step S28).
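As an illustration, the comparison of steps S22 to S28 could be sketched as follows. The correction kernels depend on the filter 10 and the lens and are therefore left as parameters; the normalized cross-correlation used here is only one possible correlation value.

```python
import numpy as np
from scipy.signal import convolve2d

def determine_by_correlation(target, reference, near_kernel, deep_kernel):
    """Apply the two hypothetical correction kernels to the target image,
    correlate each corrected image with the reference image, and keep the
    hypothesis (near/deep) whose corrected image correlates better."""
    first = convolve2d(target, near_kernel, mode="same", boundary="symm")
    second = convolve2d(target, deep_kernel, mode="same", boundary="symm")

    def corr(a, b):
        a = a - a.mean()
        b = b - b.mean()
        return float((a * b).sum() / (np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12))

    return "near" if corr(first, reference) > corr(second, reference) else "deep"
```

In an actual implementation, the two kernels would be derived from the blur function of the target color component for the near-side case and the deep-side case, respectively.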
Further, the procedures illustrated in
Next, examples of systems to which the image capture device 100 is applied will be described. The image capture device 100 is configured as described above and determines whether the object is on the near side or on the deep side of the reference distance (reference position).
The control signal generating unit 42 in the image capture device 100 generates a control signal related to the opening and closing of the door portion 602 based on the determination result of the determination unit 412, and outputs the generated control signal to the driving unit 601. More specifically, the control signal generating unit 42 generates a control signal to open the door portion 602 based on a determination result indicating that the object is on the near side of the reference distance, and outputs the control signal to the driving unit 601. In addition, the control signal generating unit 42 generates a control signal to close the door portion 602 based on a determination result indicating that the object is on the deep side of the reference distance, and outputs the control signal to the driving unit 601. When the door portion 602 is already open and the object is positioned on the near side of the reference distance, the control signal generating unit 42 may generate a signal to keep the door portion 602 open and transmit the signal to the driving unit 601. Similarly, when the door portion 602 is closed, the control signal generating unit 42 may generate a signal to keep the door portion 602 closed and transmit the signal to the driving unit 601, according to the relation between the object and the reference distance. When the object moves from the deep side to the near side of the reference distance, the control signal generating unit 42 may generate a signal to open the door portion 602 and transmit the signal to the driving unit 601. When the object moves from the near side to the deep side of the reference distance, the control signal generating unit 42 may generate a signal to close the door portion 602 and transmit the signal to the driving unit 601. The image capture device 100 stores the relation between the object and the reference distance in the storage unit in order to determine the movement of the object.
The driving unit 601 includes, for example, a motor, and opens or closes the door portion 602 by transferring a driving force of the motor to the door portion 602. The driving unit 601 opens or closes the door portion 602 based on the control signal generated by the control signal generating unit 42.
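A hypothetical sketch of the control signal generation for the door portion 602 follows; the signal values and the keep-open/keep-closed handling are illustrative only.

```python
OPEN, CLOSE = "open", "close"

def door_control_signal(object_is_near):
    """Open (or keep open) the door while the object is on the near side of
    the reference distance; otherwise close (or keep closed) it."""
    return OPEN if object_is_near else CLOSE

# Example: a pedestrian crossing the reference surface and then walking away.
for near in [False, True, True, False]:
    print(door_control_signal(near))   # close, open, open, close
```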
The reference distance does not have to be the same for all pixels. For example, a reference surface may be defined by a set of reference distances. The reference surface may be a flat surface, a curved surface, or a discontinuous surface. For example, the determination unit 412 of the image capture device 100 determines whether the pedestrian 106, which is an object, is on the near side or on the deep side of a reference surface 107 using the acquired image. The reference surface 107 is set, for example, at a certain distance in front of the door portion 602. The reference surface 107 is, for example, a flat surface parallel to the door portion 602. The reference surface 107 may or may not be perpendicular to the optical axis of the lens 20. The image capture device 100 provided on the upper side of the door portion 602 determines whether the pedestrian 106 is on the near side or on the deep side of the reference surface 107.
Further, the reception unit 43 of the image capture device 100 may receive a designation of a specific object, a specific region, or a specific pixel on the acquired image. The reception unit 43 receives, for example, information indicating a pixel contained in the object in front of the door portion 602 designated by the user. The determination unit 412 may then determine whether that pixel is on the near side or on the deep side of the reference distance. By making the determination only for some pixels in the image, the determination result can be obtained simply and at a high speed.
In the example illustrated in
In addition, in the example illustrated in
The determination unit 412 of the image capture device 100 may continuously determine whether the pedestrian (object) 106 is on the near side or on the deep side of the reference surface 107 using continuously captured images. By using the continuous determination results, the determination unit 412 can detect that the pedestrian 106 has moved from the near side to the deep side of the reference surface 107, or from the deep side to the near side. Further, the determination unit 412 can detect, using the continuous determination results, the time for which the pedestrian 106 has stayed on the near side or on the deep side. The determination unit 412 may output a signal containing such a detection result to the control signal generating unit 42.
The control signal generating unit 42 generates the control signal to open the door portion 602 based on the detection result indicating that the pedestrian 106 moves from the deep side to the near side of the reference surface 107, and outputs the control signal to the driving unit 601. In addition, the control signal generating unit 42 generates the control signal to close the door portion 602 based on the detection result indicating that the pedestrian 106 moves from the near side to the deep side of the reference surface 107, and outputs the control signal to the driving unit 601.
Further, based on the detection result indicating the time for which the pedestrian 106 has stayed on the near side of the reference surface 107, the control signal generating unit 42 may estimate, when that time is equal to or more than a threshold, that the pedestrian 106 is staying in front of the door portion 602 and does not intend to pass through it. In this case, the control signal generating unit 42 may generate the control signal to close the door portion 602 and output the control signal to the driving unit 601.
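A possible sketch of the transition and dwell-time detection described above is given below; the frame period, the history representation, and the threshold handling are assumptions.

```python
def detect_transition(prev_near, curr_near):
    """Detect movement across the reference surface between two consecutive
    determination results."""
    if not prev_near and curr_near:
        return "moved to near side"    # candidate for opening the door
    if prev_near and not curr_near:
        return "moved to deep side"    # candidate for closing the door
    return None

def should_close_for_staying(near_history, frame_period_s, threshold_s):
    """Close the door if the object has stayed on the near side for at least
    threshold_s seconds without passing through."""
    stay = 0
    for near in reversed(near_history):
        if not near:
            break
        stay += 1
    return stay * frame_period_s >= threshold_s

print(detect_transition(False, True))                     # moved to near side
print(should_close_for_staying([True] * 100, 0.1, 5.0))   # True (10 s >= 5 s)
```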
The description will be given with reference to a flowchart of
First, the image capture device 100 generates an image of the area around the door portion 602 (step S31). Then, the image capture device 100 performs the determination process to determine whether an object (for example, a pedestrian) is on the near side or on the deep side of the reference surface using the generated image (step S32).
If the object is on the near side from the reference surface based on the determination process (Yes in step S33), the control signal generating unit 42 generates the control signal to open an automatic door (step S34). On the other hand, if the object is on the deep side from the reference surface (No in step S33), the control signal generating unit 42 generates the control signal to close the automatic door (step S35). Then, the control signal generating unit 42 outputs the control signal to the driving unit 601 (step S36).
The driving unit 601 receives the control signal from the control signal generating unit 42 and opens or closes the door portion 602 based on the received control signal (step S37). In other words, upon receiving the control signal to open the automatic door, the driving unit 601 opens the door portion 602; upon receiving the control signal to close the automatic door, the driving unit 601 closes the door portion 602.
Such a configuration of the automatic door system 600 may also be applied to the control of a door of an automobile. As illustrated in
The control signal generating unit 42 generates the control signal related to the opening and closing of the door 703 of the automobile 700 based on the determination result, output from the determination unit 412 of the image capture device 100, on whether the object is on the near side or on the deep side of the reference distance (or the reference surface). More specifically, when the object is on the near side of the reference distance, the control signal generating unit 42 generates a control signal that prevents the door 703 of the automobile 700 from being opened. Therefore, even when a passenger of the automobile 700 tries to open the door 703, for example, the door 703 is controlled so as not to open. This makes it possible to prevent, for example, an accident in which the opening door 703 collides with the object.
When the object is on the deep side of the reference distance, the control signal generating unit 42 generates a control signal that allows the door 703 of the automobile 700 to be opened. Therefore, when the passenger of the automobile 700 operates the door 703 to open it, the door 703 is controlled to open. In other words, when the object is farther away than the distance at which the opened door 703 would come into contact with it, the door 703 opens according to the operation of the passenger of the automobile 700.
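For illustration, the interlock described for the door 703 can be expressed as a single predicate; the function name is hypothetical.

```python
def door_703_may_open(object_is_near):
    """Allow the door to open only when no object is on the near side of the
    reference distance (i.e. outside the swing range of the door)."""
    return not object_is_near

print(door_703_may_open(True))    # False: the opening operation is inhibited
print(door_703_may_open(False))   # True: the passenger's operation opens the door
```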
As illustrated in
The control signal generating unit 42 in the image capture device 100 generates the control signal related to the movement of the moving object 800 based on the determination result, output from the image capture device 100, on whether the object is on the near side or on the deep side of the reference distance. The control signal relates to acceleration/deceleration, the level of a lifting force, turning, switching between a normal operation mode and an automatic operation mode (collision avoidance mode), and/or actuation of a safety device such as an air bag, for the moving object 800 or a part thereof. More specifically, based on the determination result indicating that the object is on the near side of the reference distance, the control signal generating unit 42 generates the control signal related to at least one of deceleration, adjusting the level of the lifting force, turning in a direction away from the object, switching from the normal operation mode to the automatic operation mode (collision avoidance mode), and actuation of the safety device. Based on the determination result indicating that the object is on the deep side of the reference distance, the control signal generating unit 42 generates the control signal related to at least one of acceleration, adjusting the level of the lifting force, turning in a direction approaching the object, and switching from the automatic operation mode to the normal operation mode. The control signal generating unit 42 outputs the generated control signal to the driving unit 801.
The driving unit 801 operates the moving object 800 based on the control signal. That is, based on the control signal, the driving unit 801 causes the moving object 800 or a part thereof to accelerate or decelerate, to change the level of the lifting force, to turn, to switch between the normal operation mode and the automatic operation mode (collision avoidance mode), and/or to actuate the safety device such as the air bag. As described above, the image capture device 100 can determine at a high speed whether the object is on the near side or on the deep side of the reference distance. Therefore, such a configuration is applied, for example, to the movement of a robot or the automatic operation of an automobile that must be controlled in real time.
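One way to express the mapping from the determination result to these control actions is sketched below; the concrete set of actions issued at one time and their names are assumptions, since the embodiment only names the categories of control.

```python
def moving_object_commands(object_is_near):
    """Return illustrative control actions for the moving object 800."""
    if object_is_near:
        return ["decelerate", "turn_away_from_object",
                "switch_to_collision_avoidance_mode", "actuate_safety_device"]
    return ["acceleration_allowed", "switch_to_normal_mode"]

print(moving_object_commands(True))
print(moving_object_commands(False))
```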
As another example, in a case where the moving object 800 is a drone, at the time of inspecting a crack or a broken wire from the sky, the image capture device 100 acquires an image obtained by capturing an inspection target and determines whether the target is on the near side or on the deep side of the reference distance. The control signal generating unit 42 generates, based on the determination result, the control signal to control the thrust of the drone such that the distance to the inspection target is kept constant. Herein, the thrust includes the lifting force. The driving unit 801 operates the drone based on the control signal, so that the drone can fly in parallel with the inspection target. In a case where the moving object 800 is a monitoring drone, the control signal may be generated to control the thrust of the drone such that the distance to the monitoring target is kept constant.
In addition, while the drone is flying, the image capture device 100 acquires an image obtained by capturing the ground and determines whether the ground is on the near side or on the deep side of the reference distance (that is, whether the height from the ground is smaller or larger than the reference distance). The control signal generating unit 42 generates, based on the determination result, the control signal to control the thrust of the drone such that the height from the ground becomes a designated height. The driving unit 801 can make the drone fly at the designated height by operating the drone based on the control signal. In the case of a crop-spraying drone, keeping the height from the ground constant makes it easy to spray agricultural chemicals evenly.
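Because the embodiment yields a binary near/deep result rather than a measured height, a height-holding controller could simply nudge the thrust in the indicated direction, as in the following sketch; the step size and the controller form are assumptions.

```python
def adjust_thrust(thrust, ground_is_near, step=0.02):
    """ground_is_near=True means the height is below the reference distance
    (the designated height), so increase thrust; otherwise decrease it."""
    return thrust + step if ground_is_near else thrust - step

thrust = 0.50
for near in [True, True, False, True]:   # sequence of determination results
    thrust = adjust_thrust(thrust, near)
print(round(thrust, 2))                  # 0.54
```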
Further, in a case where the moving object 800 is a drone or an automobile, at the time of coordinated flying of drones or platoon running of automobiles, the image capture device 100 acquires an image obtained by capturing a peripheral drone or a preceding automobile and determines whether that drone or automobile is on the near side or on the deep side of the reference distance. The control signal generating unit 42 generates, based on the determination result, the control signal to control the thrust of the drone or the speed of the automobile such that the distance to the peripheral drone or the preceding automobile becomes constant. The driving unit 801 operates the drone or the automobile based on the control signal, so that the coordinated flying of the drones or the platoon running of the automobiles can be performed easily. In a case where the moving object 800 is an automobile, the reference distance may be set by a driver by receiving the designation of the driver through a user interface, so that the automobile can run at the driver's desired vehicle-to-vehicle distance. Alternatively, the reference distance may be changed according to the speed of the automobile in order to keep a safe vehicle-to-vehicle distance from the preceding automobile. The safe vehicle-to-vehicle distance depends on the speed of the automobile; therefore, the reference distance may be set longer as the speed of the automobile increases. In addition, in a case where the moving object 800 is an automobile, the control signal generating unit 42 may be configured such that a predetermined distance in the advancing direction is set as the reference distance, and when an object appears on the near side of the reference distance, a brake is automatically applied or a safety device such as an air bag is actuated. In this case, the safety device such as an automatic brake or an air bag is provided as the driving unit 801.
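A hypothetical example of making the reference distance depend on the vehicle speed is shown below; the formula (a fixed time gap plus a margin) is an assumption and not part of the embodiment.

```python
def reference_distance_m(speed_kmh, time_gap_s=2.0, margin_m=5.0):
    """Longer reference distance at higher speed, here via a fixed time gap."""
    return (speed_kmh / 3.6) * time_gap_s + margin_m

for v in (30, 60, 100):
    print(v, "km/h ->", round(reference_distance_m(v), 1), "m")
# 30 km/h -> 21.7 m, 60 km/h -> 38.3 m, 100 km/h -> 60.6 m
```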
As illustrated in
The monitor unit 901 causes the image capture device 100 to capture images continuously and, first, displays the images captured by the image capture device 100 through the user interface 902. The user interface 902 performs, for example, a display process on a display device and an input process from a keyboard or a pointing device. The display device and the pointing device may be realized as an integrated device such as a touch screen display, for example.
In addition, the monitor unit 901, second, monitors the state within the capture range of the image capture device 100 based on the determination results that are sequentially output from the image capture device 100 and indicate whether the object is on the near side or on the deep side of the reference distance. The monitor unit 901 analyzes a flow of persons, for example, persons entering and leaving the region within the reference distance, or a flow of vehicles, for example, vehicles entering and leaving the region within the reference distance, and records the analysis result in a storage device such as a hard disk drive (HDD). Further, the analysis does not necessarily have to be performed in real time, and may be performed as a batch process in which the determination results accumulated in the storage device, indicating whether the object is on the near side or on the deep side of the reference distance, are analyzed. In addition, the monitor unit 901 may notify, through the user interface 902, that a person or a vehicle has entered or left the region within the reference distance.
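The flow analysis could, for example, reduce a sequence of near/deep determination results for a tracked person or vehicle to counts of entries into and exits from the region within the reference distance, as in this sketch; the history representation is an assumption.

```python
def count_flows(near_history):
    """Count entries into and exits from the region within the reference
    distance, based on consecutive near/deep determination results."""
    entries = exits = 0
    for prev, curr in zip(near_history, near_history[1:]):
        if not prev and curr:
            entries += 1
        elif prev and not curr:
            exits += 1
    return {"entries": entries, "exits": exits}

print(count_flows([False, True, True, False, True, False]))
# {'entries': 2, 'exits': 2}
```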
As described above, according to this embodiment, it is possible to determine the position of the object with respect to the reference position at a high speed. Therefore, the determination result on whether the object is on the near side or on the deep side of the reference position can be obtained in real time, so that it is possible to realize a system that appropriately controls various types of apparatuses in an environment where a positional relation with respect to the object is dynamically changed.
Further, each of the various functions described in any of the embodiments may be realized by a circuit (processing circuit). Examples of the processing circuit include a programmed processor such as a central processing unit (CPU). This processor performs each described function by executing a computer program (instructions) stored in a memory. This processor may be a microprocessor including an electric circuit. Examples of the processing circuit also include a digital signal processor (DSP), an application specific integrated circuit (ASIC), a microcontroller, a controller, and other electric circuit components. Each of the components other than the CPU described in the embodiments may also be realized by a processing circuit.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.