The present invention relates to a maneuvering assisting apparatus. More specifically, the present invention relates to a maneuvering assisting apparatus which is arranged in a moving body such as an automobile and detects an obstacle in a surrounding area.
One example of this type of apparatus is disclosed in Patent Literature 1. According to this background art, a floodlight device floodlights a spot light beam toward a road surface rearward of a vehicle at a plurality of floodlight angles different from each other. A video of the area rearward of the vehicle, including the light spot of the floodlighted spot light beam, is captured by a camera. A control unit detects the light spot from the video captured by the camera, and calculates a state of the road surface rearward of the vehicle based on the detected result. Furthermore, with reference to the calculated state of the road surface, the control unit determines a range through which the vehicle has difficulty passing, and displays an image representing the determined range on a display.
However, according to the background art, the floodlight angle of the spot light beam must be changed continuously in order to calculate the state of the road surface. In other words, a complicated process is required to calculate the state of the road surface from the floodlighted spot light beam.
A maneuvering assisting apparatus according to this invention comprises: a radiating means, arranged in a moving body, which radiates a light beam obliquely downward; an imaging means, arranged in the moving body, which has a viewing field corresponding to a radiating direction of the radiating means; a determining means which determines whether or not a change amount in a reflection position of the light beam radiated by the radiating means exceeds a reference, based on output of the imaging means; a changing means which changes the radiating direction of the radiating means when a determined result of the determining means is updated from a negative result to a positive result; a detecting means which detects a manner of changing in the reflection position of the light beam radiated by the radiating means based on the output of the imaging means in parallel with a changing process of the changing means; and a notifying means which outputs a different notification corresponding to a detected result of the detecting means toward an operator of the moving body.
Preferably, the radiating means radiates a linear light beam extending in right and left directions of the viewing field of the imaging means, and the changing means changes the radiating direction to an upper direction.
Preferably, the notifying means includes a first notifying means which outputs the notification in a first manner when the detected result of the detecting means indicates that the reflected light beam has disappeared, and a second notifying means which outputs the notification in a second manner when the detected result of the detecting means indicates that at least one portion of the reflected light beam is detected.
Preferably, further comprised is a creating means which creates a bird's-eye view image based on the output of the imaging means, and the determining means and the detecting means respectively execute a determining process and a detecting process with reference to the bird's-eye view image created by the creating means.
Preferably, the imaging means includes a plurality of cameras arranged on a periphery of the moving body, the radiating means includes a plurality of laser irradiators respectively allocated to the plurality of cameras, and further comprised is an updating means which repeatedly updates the camera and the laser irradiator to be noticed in the determining process of the determining means.
Preferably, further comprised is a setting means which sets an updating manner of the updating means to a different manner corresponding to at least one of a moving speed of the moving body and a moving direction of the moving body.
Preferably, further comprised is an adjusting means which adjusts an imaging direction of the imaging means and the radiating direction of the radiating means with reference to at least one of the moving speed of the moving body and the moving direction of the moving body, and the changing means executes a radiating-direction changing process by using the radiating direction adjusted by the adjusting means as a reference.
A maneuvering assisting program product according to the present invention is a maneuvering assisting program product executed by a processor of a maneuvering assisting apparatus provided with a radiating means, arranged in a moving body, which radiates a light beam obliquely downward, and an imaging means, arranged in the moving body, which has a viewing field corresponding to a radiating direction of the radiating means, the maneuvering assisting program product comprising: a determining step of determining whether or not a change amount in a reflection position of the light beam radiated by the radiating means exceeds a reference, based on output of the imaging means; a changing step of changing the radiating direction of the radiating means when a determined result of the determining step is updated from a negative result to a positive result; a detecting step of detecting a manner of changing in the reflection position of the light beam radiated by the radiating means based on the output of the imaging means in parallel with a changing process of the changing step; and a notifying step of outputting a different notification corresponding to a detected result of the detecting step toward an operator of the moving body.
A maneuvering assisting method according to the present invention is a maneuvering assisting method executed by a maneuvering assisting apparatus provided with a radiating means, arranged in a moving body, which radiates a light beam obliquely downward, and an imaging means, arranged in the moving body, which has a viewing field corresponding to a radiating direction of the radiating means, the maneuvering assisting method comprising: a determining step of determining whether or not a change amount in a reflection position of the light beam radiated by the radiating means exceeds a reference, based on output of the imaging means; a changing step of changing the radiating direction of the radiating means when a determined result of the determining step is updated from a negative result to a positive result; a detecting step of detecting a manner of changing in the reflection position of the light beam radiated by the radiating means based on the output of the imaging means in parallel with a changing process of the changing step; and a notifying step of outputting a different notification corresponding to a detected result of the detecting step toward an operator of the moving body.
The above described objects and other objects, features and advantages of the present invention will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings.
A maneuvering assisting apparatus 10 of this embodiment includes four laser irradiators L_1 to L_4 and four cameras C_1 to C_4, which are arranged on an automobile 100, and a CPU 12p which processes scene images output from the cameras C_1 to C_4.
With reference to the mounted positions of these components, the laser irradiator L_1 is installed near the center of a front portion of the automobile 100, and oriented frontward, obliquely downward of the automobile 100. The camera C_1 is installed near the center in a width direction of the front portion and on an upper side in a height direction of the automobile 100, and oriented frontward, obliquely downward of the automobile 100.
The laser irradiator L_2 is installed near the center on a right side of the automobile 100 and oriented rightward, obliquely downward of the automobile 100. The camera C_2 is installed near the center in a width direction on a right side and on an upper side in a height direction of the automobile 100, and oriented rightward, obliquely downward of the automobile 100.
The laser irradiator L_3 is installed near the center of a rear portion of the automobile 100, and oriented rearward, obliquely downward of the automobile 100. The camera C_3 is installed near the center in a width direction of a rear portion and on an upper side in a height direction of the automobile 100, and oriented rearward, obliquely downward of the automobile 100.
The laser irradiator L_4 is installed near the center on a left side of the automobile 100, and oriented leftward, obliquely downward of the automobile 100. The camera C_4 is installed near the center in a width direction on a left side and on an upper side in a height direction of the automobile 100, and oriented leftward, obliquely downward of the automobile 100.
A state where the automobile 100 and the surrounding ground on which no obstacle exists are viewed from above is assumed. The laser beams radiated from the laser irradiators L_1 to L_4 depict bright lines G_1 to G_4 on the ground, respectively.
The camera C_1 has a viewing field VW_1 capturing a front of the automobile 100, the camera C_2 has a viewing field VW_2 capturing a right direction of the automobile 100, the camera C_3 has a viewing field VW_3 capturing a rear of the automobile 100, and the camera C_4 has a viewing field VW_4 capturing a left direction of the automobile 100. Moreover, the viewing fields VW_1 and VW_2 have a common viewing field VW_12, the viewing fields VW_2 and VW_3 have a common viewing field VW_23, the viewing fields VW_3 and VW_4 have a common viewing field VW_34, and the viewing fields VW_4 and VW_1 have a common viewing field VW_41.
That is, in a state where no obstacle exists in a surrounding area of the automobile 100, all the bright lines G_1 to G_4 extend linearly and the bright lines G_1 to G_4 are respectively captured by the cameras C_1 to C_4.
Returning to the configuration of the maneuvering assisting apparatus 10, scene images P_1 to P_4 captured by the cameras C_1 to C_4 are fetched in response to a vertical synchronization signal Vsync, and the CPU 12p creates bird's-eye view images BEV_1 to BEV_4 based on the fetched scene images P_1 to P_4 and holds them in a work area W1 of a memory 12m.
The bird's-eye view image BEV_1 is equivalent to an image captured by a virtual camera looking perpendicularly down on the viewing field VW_1, and the bird's-eye view image BEV_2 is equivalent to an image captured by a virtual camera looking perpendicularly down on the viewing field VW_2. Moreover, the bird's-eye view image BEV_3 is equivalent to an image captured by a virtual camera looking perpendicularly down on the viewing field VW_3, and the bird's-eye view image BEV_4 is equivalent to an image captured by a virtual camera looking perpendicularly down on the viewing field VW_4.
According to
Subsequently, the CPU 12p joins the bird's-eye view images BEV_1 to BEV_4 to each other by coordinate conversion. The bird's-eye view images BEV_2 to BEV_4 are rotated and/or moved by using the bird's-eye view image BEV_1 as a reference, and as a result, a whole-circumference bird's-eye view image is created on a work area W2.
In the whole-circumference bird's-eye view image, the bird's-eye view images BEV_1 and BEV_2 overlap each other in an overlapped area OL_12, the bird's-eye view images BEV_2 and BEV_3 overlap each other in an overlapped area OL_23, the bird's-eye view images BEV_3 and BEV_4 overlap each other in an overlapped area OL_34, and the bird's-eye view images BEV_4 and BEV_1 overlap each other in an overlapped area OL_41. These overlapped areas correspond to the common viewing fields VW_12, VW_23, VW_34 and VW_41, respectively.
The CPU 12p extracts a partial image D1 in which the overlapped areas OL_12 to OL_41 are located at the four corners, and pastes a vehicle image D2 resembling the upper portion of the automobile 100 at the center of the extracted image D1. The maneuvering assisting image thus created is displayed on a display device 16 installed at the driver's seat of the automobile 100.
Subsequently, a manner of creating the bird's-eye view images BEV_1 to BEV_4 is described. It is noted that all the bird's-eye view images BEV_1 to BEV_4 are created in the same manner, and therefore, the manner of creating the bird's-eye view image BEV_3 is described as a representative.
With reference to a geometric model for this coordinate transformation, the camera C_3 is installed at a height "h" from the ground with its optical axis inclined obliquely downward, the inclination corresponding to an obtuse angle θ. Three coordinate systems are defined: a camera coordinate system (X, Y, Z), a coordinate system (Xp, Yp) of an imaging surface S of the camera C_3, and a world coordinate system (Xw, Yw, Zw).
In the camera coordinate system (X, Y, Z), an optical center of the camera C_3 is used as an origin O, and in this state, the Z axis is defined in an optical axis direction, the X axis is defined in a direction orthogonal to the Z axis and parallel to the ground, and the Y axis is defined in a direction orthogonal to the Z axis and X axis. In the coordinate system (Xp, Yp) of the imaging surface S, a center of the imaging surface S is used as the origin, and in this state, the Xp axis is defined in a lateral direction of the imaging surface S and the Yp axis is defined in a vertical direction of the imaging surface S.
In the world coordinate system (Xw, Yw, Zw), an intersecting point between a perpendicular line passing through the origin O of the camera coordinate system (X, Y, Z) and the ground is used as an origin Ow, and in this state, the Yw axis is defined in a direction vertical to the ground, the Xw axis is defined in a direction parallel to the X axis of the camera coordinate system (X, Y, Z), and the Zw axis is defined in a direction orthogonal to the Xw axis and Yw axis. Also, a distance from the Xw axis to the X axis is “h”, and an obtuse angle formed by the Zw axis and the Z axis is equivalent to the above described angle θ.
When coordinates in the camera coordinate system (X, Y, Z) are written as (x, y, z), “x”, “y”, and “z” respectively indicate an X-axis component, a Y-axis component, and a Z-axis component in the camera coordinate system (X, Y, Z). When coordinates in the coordinate system (Xp, Yp) of the imaging surface S are written as (xp, yp), “xp” and “yp” respectively indicate an Xp-axis component and a Yp-axis component in the coordinate system (Xp, Yp) of the imaging surface S. When coordinates in the world coordinate system (Xw, Yw, Zw) are written as (xw, yw, zw), “xw”, “yw”, and “zw” respectively indicate an Xw-axis component, a Yw-axis component, and a Zw-axis component in the world coordinate system (Xw, Yw, Zw).
A transformation equation between the coordinates (x, y, z) of the camera coordinate system (X, Y, Z) and the coordinates (xw, yw, zw) of the world coordinate system (Xw, Yw, Zw) is represented by Equation 1 below:
Herein, if a focal length of the camera C_3 is assumed as “f”, a transformation equation between the coordinates (xp, yp) of the coordinate system (Xp, Yp) of the imaging surface S and the coordinates (x, y, z) of the camera coordinate system (X, Y, Z) is represented by Equation 2 below:
Furthermore, based on Equation 1 and Equation 2, Equation 3 is obtained. Equation 3 shows a transformation equation between the coordinates (xp, yp) of the coordinate system (Xp, Yp) of the imaging surface S and the coordinates (xw, zw) of the two-dimensional ground coordinate system (Xw, Zw).
Furthermore, a bird's-eye-view coordinate system (X3, Y3) is defined as a coordinate system of the bird's-eye view image BEV_3.
A projection from the two-dimensional ground coordinate system (Xw, Zw) onto the bird's-eye-view coordinate system (X3, Y3) is equivalent to a so-called parallel projection. When the height of the virtual camera, i.e., the virtual view point, is assumed as "H", a transformation equation between the coordinates (xw, zw) of the two-dimensional coordinate system (Xw, Zw) and the coordinates (x3, y3) of the bird's-eye-view coordinate system (X3, Y3) is represented by Equation 4 below. The height H of the virtual camera is determined in advance.
Furthermore, based on Equation 4, Equation 5 is obtained, and based on Equation 5 and Equation 3, Equation 6 is obtained. Moreover, based on Equation 6, Equation 7 is obtained. Equation 7 is equivalent to a transformation equation for transforming the coordinates (xp, yp) of the coordinate system (Xp, Yp) of the imaging surface S into the coordinates (x3, y3) of the bird's-eye-view coordinate system (X3, Y3).
The coordinates (xp, yp) of the coordinate system (Xp, Yp) of the imaging surface S represent coordinates of the scene image P_3 captured by the camera C_3. Therefore, the scene image P_3 from the camera C_3 is transformed into the bird's-eye view image BEV_3 by using Equation 7. In reality, the scene image P_3 first undergoes image processing such as lens distortion correction, and is then transformed into the bird's-eye view image BEV_3 by using Equation 7.
Subsequently, an operation of detecting an obstacle in the surrounding area of the automobile 100 is described. Firstly, the CPU 12p sets a variable N to any one of "1" to "4", and duplicates the bird's-eye view image BEV_N held in the work area W1 into a work area W3 in response to the vertical synchronization signal Vsync. As described above, the bright line G_N depicted by the laser beam of the laser irradiator L_N appears in the duplicated bird's-eye view image BEV_N.
The CPU 12p detects the bright line G_N from the bird's-eye view image BEV_N, and also detects a change amount of the bright line G_N as "ΔG_N". The change amount ΔG_N is equivalent to a difference between the bright line G_N detected in a state where no obstacle exists in the surrounding area and the latest bright line G_N, and represents the size of a distortion portion of the bright line G_N attributed to the obstacle.
When the automobile 100 approaches an obstacle OBJ1 existing rearward of the automobile 100, a portion of the laser beam radiated from the laser irradiator L_3 impinges on the obstacle OBJ1, and the bright line G_3 is partially distorted. The change amount ΔG_3 increases accordingly.
The CPU 12p compares the detected change amount ΔG_N with the reference REF, and updates the variable N if ΔG_N ≤ REF is established. When the variable N before updating is "1", the variable N after updating indicates "2", and when the variable N before updating is "2", the variable N after updating indicates "3". Moreover, when the variable N before updating is "3", the variable N after updating indicates "4", and when the variable N before updating is "4", the variable N after updating indicates "1". Thus, the variable N is cyclically updated. Upon completion of the updating process of the variable N, the CPU 12p returns to the above-described process.
On the other hand, when ΔG_N > REF is established, the CPU 12p changes the irradiating direction of the laser irradiator L_N upward by a predetermined angle. The predetermined angle is an angle at which the bright line G_N falls outside the viewing field VW_N of the camera C_N in a case where no obstacle exists in the surrounding area of the automobile 100.
When the irradiating direction of the laser irradiator L_3 is changed upward by the predetermined angle in a state where the obstacle OBJ1 exists rearward of the automobile 100, the raised laser beam is irradiated on the obstacle OBJ1 if the obstacle OBJ1 is sufficiently tall, so that at least one portion of the bright line G_3 appears on the obstacle OBJ1; otherwise, the raised laser beam passes over the obstacle OBJ1, and the bright line G_3 disappears.
After the irradiating direction of the laser irradiator L_N is changed, the CPU 12p duplicates the bird's-eye view image BEV_N held in the work area W1 into the work area W3, and determines whether or not the bright line G_N has disappeared from the duplicated bird's-eye view image BEV_N. If at least one portion of the bright line G_N appears in the bird's-eye view image BEV_N, the CPU 12p controls a warning lamp 18 arranged on a dashboard so as to issue a warning colored red. In contrast, if no bright line G_N appears in the bird's-eye view image BEV_N, the warning lamp 18 is controlled so as to issue a warning colored yellow.
According to this behavior, the red warning indicates that the obstacle is tall enough to intercept even the raised laser beam and therefore requires greater caution, while the yellow warning indicates a lower obstacle over which the raised laser beam passes.
When the driver performs a warning stop operation on an operation panel 20 after the warning is thus issued, the CPU 12p stops the warning lamp 18, restores the irradiating direction of the laser irradiator L_N, and thereafter returns to the above-described process.
Specifically, the CPU 12p executes a plurality of tasks in parallel, including an image creating task and an obstacle sensing task described below.
With reference to the image creating task, the bird's-eye view images BEV_1 to BEV_4 are created on the work area W1 based on the scene images P_1 to P_4 in a step S5, in response to the vertical synchronization signal Vsync, and the created bird's-eye view images are joined into the whole-circumference bird's-eye view image on the work area W2.
With reference to the obstacle sensing task, the variable N is first set to an initial value. In a step S13, the bird's-eye view image BEV_N held in the work area W1 is duplicated into the work area W3 in response to the vertical synchronization signal Vsync, and the bright line G_N and its change amount ΔG_N are detected from the duplicated image. In a step S23, it is determined whether or not the change amount ΔG_N exceeds the reference REF. When a determined result is NO, the process advances to a step S25, while when the determined result is YES, the process advances to a step S27.
In the step S25, the variable N is updated among "1" to "4". When the variable N before updating is "1", the variable N after updating indicates "2", and when the variable N before updating is "2", the variable N after updating indicates "3". Moreover, when the variable N before updating is "3", the variable N after updating indicates "4", and when the variable N before updating is "4", the variable N after updating indicates "1".
In the step S27, the irradiating direction of the laser irradiator L_N is changed upward by the predetermined angle. When the vertical synchronization signal Vsync is generated, the process advances from a step S29 to a step S31 so as to duplicate the bird's-eye view image BEV_N of a subsequent frame held in the work area W1 into the work area W3. In a step S33, it is determined whether or not the bright line G_N has disappeared from the duplicated bird's-eye view image BEV_N. When no bright line G_N is detected from the bird's-eye view image BEV_N, the process advances to a step S35, while when at least one portion of the bright line G_N is detected from the bird's-eye view image BEV_N, the process advances to a step S37.
In the step S35, the warning lamp 18 is controlled so as to issue the warning colored yellow. In the step S37, the warning lamp 18 is controlled so as to issue the warning colored red. Upon completion of the process in the step S35 or S37, it is determined in a step S39 whether or not the warning stop operation is performed. When a determined result is updated from NO to YES, the warning lamp 18 is stopped in a step S41, the irradiating direction of the laser irradiator L_N is restored in a step S43, and thereafter the process returns to the step S13.
As can be seen from the above-described explanation, the laser irradiator L_N (N: 1 to 4) is arranged on the automobile 100 and radiates the laser beam obliquely downward. The radiated laser beam depicts the bright line G_N on the ground. The camera C_N arranged on the automobile 100 has the viewing field VW_N corresponding to a radiating direction of the laser irradiator L_N.
The CPU 12p determines whether or not a change amount in a reflection position of the laser beam radiated by the laser irradiator L_N, i.e., the change amount of the bright line G_N exceeds the reference REF, based on output of the camera C_N (S23). The radiating direction of the laser irradiator L_N is changed by the CPU 12p when a determined result is updated from a negative result to a positive result (S27).
A changing manner of the bright line G_N is detected by the CPU 12p based on the output of the camera C_N in parallel with the changing process for the radiating direction (S33). The CPU 12p outputs a different notification corresponding to the detected result toward the driver of the automobile 100 (S35, S37).
Thus, the radiating direction of the laser irradiator L_N is changed when the change amount of the bright line G_N exceeds the reference REF. Moreover, the notification outputted toward the driver of the automobile 100 differs corresponding to the changing manner of the bright line G_N. Thereby, it becomes possible to ensure obstacle-detection accuracy while suppressing the processing load.
It is noted that, in this embodiment, orientations of the cameras C_1 to C_4 are fixed, and orientations of the laser irradiators L_1 to L_4 are also fixed except when being changed upward in the step S27.
However, a speed sensor 22 which detects a moving speed of the automobile 100 and a direction sensor 24 which detects a moving direction of the automobile 100 may be optionally added. In this case, a task which adjusts the imaging directions of the cameras C_1 to C_4, the radiating directions of the laser irradiators L_1 to L_4, and the weighting amounts referred to for updating the variable N is additionally executed by the CPU 12p with reference to outputs of these sensors.
With reference to this additional task, outputs of the speed sensor 22 and the direction sensor 24 are fetched in a step S51, and it is determined in a step S55 whether or not the moving speed of the automobile 100 falls below a threshold value. It is determined in steps S61 and S63 whether the automobile 100 is moving straight forward or straight rearward at high speed, and it is determined in steps S73, S75 and S85 whether the automobile 100 is moving right forward, right rearward or left forward at high speed.
When YES is determined in the step S55, the angles of the cameras C_1 to C_4 (the imaging directions) and the angles of the laser irradiators L_1 to L_4 (the radiating directions) are initialized in a step S57, and weighting amounts of the cameras C_1 to C_4 are initialized in a step S59. Upon completion of the process in the step S59, the process returns to the step S51.
It is noted that the weighting amounts of the cameras C_1 to C_4 are parameters for adjusting the updating manner of the variable N. The variable N is preferentially set to the number of a camera having a larger weighting amount. For example, in a case where the weighting amount of the camera C_1 is the largest and the weighting amount of the camera C_3 is the smallest, the cycle in which the variable N is updated to "1" becomes the shortest and the cycle in which the variable N is updated to "3" becomes the longest.
When YES is determined in the step S61, it is regarded that the automobile 100 is moving forward at high speed (speed equal to or more than the threshold value) and therefore, the angles of the camera C_1 and the laser irradiator L_1 are adjusted upward in a step S65. Also, the weighting amount of the camera C_1 is increased in a step S67. Upon completion of the process in the step S67, the process returns to the step S51.
When YES is determined in the step S63, it is regarded that the automobile 100 is moving rearward at high speed, and therefore, the angles of the camera C_3 and the laser irradiator L_3 are adjusted upward in a step S69. Also, the weighting amount of the camera C_3 is increased in a step S71. Upon completion of the process in the step S71, the process returns to the step S51.
When YES is determined in the step S73, it is regarded that the automobile 100 is moving right forward at high speed, and therefore, the angles of the cameras C_1 and C_2 and the laser irradiators L_1 and L_2 are adjusted right forward of the automobile 100 in a step S77. Also, the weighting amounts of the cameras C_1 and C_2 are increased in a step S79. Upon completion of the process in the step S79, the process returns to the step S51.
When YES is determined in the step S75, it is regarded that the automobile 100 is moving right rearward at high speed, and therefore, the angles of the cameras C_2 and C_3 and the laser irradiators L_2 and L_3 are adjusted right rearward of the automobile 100 in a step S81. Also, the weighting amounts of the cameras C_2 and C_3 are increased in a step S83. Upon completion of the process in the step S83, the process returns to the step S51.
When YES is determined in the step S85, it is regarded that the automobile 100 is moving left forward at high speed, and therefore, the angles of the cameras C_1 and C_4 and the laser irradiators L_1 and L_4 are adjusted left forward of the automobile 100 in a step S87. Also, the weighting amounts of the cameras C_1 and C_4 are increased in a step S89. Upon completion of the process in the step S89, the process returns to the step S51.
When NO is determined in the step S85, it is regarded that the automobile 100 is moving left rearward at high speed, and therefore, the angles of the cameras C_3 and C_4 and the laser irradiators L_3 and L_4 are adjusted left rearward of the automobile 100 in a step S91. Also, the weighting amounts of the cameras C_3 and C_4 are increased in a step S93. Upon completion of the process in the step S93, the process returns to the step S51.
Moreover, according to this embodiment, an automobile is assumed as the moving body; however, a construction machine or a train may be assumed as the moving body instead. In the case of the construction machine, since there is a possibility that the ground of a workplace is uneven, it is preferable to increase the reference REF referred to in the step S23.
Furthermore, in this embodiment, the warning is issued toward the driver of the automobile 100; however, the warning may be issued toward a driver outside the automobile 100, i.e., a driver of another automobile. Thereby, the possibility of avoiding a situation where another automobile crashes into the automobile 100 is increased.
Moreover, in this embodiment, the laser irradiator L_2 is installed near the center on the right side of the automobile 100, the camera C_2 is installed near the center in the width direction on the right side and on the upper side in the height direction of the automobile 100, the laser irradiator L_4 is installed near the center on the left side of the automobile 100, and the camera C_4 is installed near the center in the width direction on the left side and on the upper side in the height direction of the automobile 100. However, the laser irradiator L_2 and the camera C_2 may be installed at the top and bottom of a door mirror on the right side, and the laser irradiator L_4 and the camera C_4 may be installed at the top and bottom of a door mirror on the left side.
Moreover, in this embodiment, the irradiating direction of the laser beam is changed upward when ΔG_N, i.e., the change amount of the bright line G_N, exceeds the reference REF (see the step S23).
Furthermore, in this embodiment, the warning lamp 18 is arranged in order to issue the warning; however, the warning may instead be issued from a speaker, or may be displayed on the display device 16.
Moreover, in this embodiment, the bird's-eye view images BEV_1 to BEV_4 are created on the work area W1 of the memory 12m (see the step S5). However, the determining process and the detecting process may be executed with reference to the scene images P_1 to P_4 instead of the bird's-eye view images.
Furthermore, in this embodiment, a single obstacle sensing task is executed while the variable N is cyclically updated; however, four obstacle sensing tasks respectively corresponding to the cameras C_1 to C_4 may be executed in parallel.
Moreover, according to
Notes relating to the above-described embodiment will be shown below. It is possible to arbitrarily combine these notes with the above-described embodiment unless any contradiction occurs.
The coordinate transformation for producing a bird's-eye view image from a photographed image, which is described in the embodiment, is generally called a perspective projection transformation. Instead of using this perspective projection transformation, the bird's-eye view image may also be optionally produced from the photographed image through a well-known planar projection transformation. When the planar projection transformation is used, a homography matrix (coordinate transformation matrix) for transforming a coordinate value of each pixel on the photographed image into a coordinate value of each pixel on the bird's-eye view image is evaluated in advance at a stage of a camera calibrating process. A method of evaluating the homography matrix is well known. Then, during image transformation, the photographed image may be transformed into the bird's-eye view image based on the homography matrix. In either way, the photographed image is transformed into the bird's-eye view image by projecting the photographed image on the bird's-eye view image.
Although the present invention has been described and illustrated in detail, it is clearly understood that the same is by way of illustration and example only and is not to be taken by way of limitation, the spirit and scope of the present invention being limited only by the terms of the appended claims.
Priority application: JP 2008-245213, filed September 2008 (national).
International application: PCT/JP2009/065575 (WO), filed September 7, 2009; 371(c) date: March 14, 2011.