An embodiment of the present invention relates to a periphery monitoring device.
Conventionally, periphery monitoring devices have been developed that display, on a display in the vehicle interior, an image of the surroundings of the vehicle generated by an on-vehicle imaging device (for example, a camera) to provide the driver in the driver's seat with information on the surrounding situation. One example of such a periphery monitoring device facilitates the driver's determination as to whether a corner of the vehicle body will come into contact with a peripheral object by displaying, on an overhead image, an estimated trajectory indicating the passage of the corner while the vehicle turns in a small space such as a parking lot.
Patent Document 1: Japanese Laid-open Patent Application No. 2012-66616
Such a conventional technique enables the driver to relatively easily determine whether each of the corners comes into contact with any object in the surroundings. However, while the vehicle travels forward, such a determination is to be made comprehensively as to whether all the corners can pass at the same time without contacting the object. With the conventional system that displays an estimated trajectory, the driver needs experience and skills to intuitively determine how the vehicle behaves during traveling, or to determine whether the vehicle as a whole avoids contact with an object.
An object of the present invention is to provide a periphery monitoring device that enables a driver to more intuitively determine how the vehicle behaves during traveling, or whether the vehicle as a whole avoids contact with an object.
According to one embodiment of the present invention, for example, a periphery monitoring device includes an acquirer and a controller. The acquirer acquires a peripheral image and a vehicle image of the vehicle. The peripheral image represents the peripheral situation of the vehicle based on image data output from an imager mounted on the vehicle to image the surroundings of the vehicle, and is to be displayed in an overhead mode. The vehicle image is to be displayed on the peripheral image in the overhead mode. The controller causes a virtual vehicle image to be displayed on the peripheral image together with the vehicle image. The virtual vehicle image represents, in the overhead mode, a state of the vehicle when traveling at the current steering angle. With this configuration, for example, the periphery monitoring device displays, on an overhead image, the vehicle image and the virtual vehicle image representing the state of the vehicle when traveling at the current steering angle, to present the relationship between the traveling vehicle and the surroundings, such as the relationship between the virtual vehicle image and an object located around the vehicle. Thus, the periphery monitoring device can provide display in such a manner that the user (driver) can intuitively recognize the relationship between the surroundings and the vehicle during traveling.
The controller of the periphery monitoring device causes the virtual vehicle image to be displayed such that the virtual vehicle image travels away from the vehicle image in a direction corresponding to the current steering angle of the vehicle from a superimposed position of the virtual vehicle image and the vehicle image. With this configuration, for example, the periphery monitoring device can display in advance change in the relationship between the surroundings and the vehicle when continuously traveling at the current steering angle, which enables the user to more intuitively recognize the behavior of the vehicle and the positional relationship with respect to the object during traveling.
The controller of the periphery monitoring device, for example, changes orientation of the virtual vehicle image with respect to the vehicle image so as to correspond to orientation of the vehicle traveling at the current steering angle while causing the virtual vehicle image and the vehicle image to be displayed at the superimposed position. With this configuration, the periphery monitoring device displays a future direction of the vehicle. Thus, the periphery monitoring device can provide display to allow the user to intuitively recognize behavior (posture, orientation) of the vehicle when traveling at the current steering angle, and easily understand a current steering direction. For example, in the case that the vehicle is coupled to a towed vehicle, the user can easily estimate the behavior of the towed vehicle by recognizing the behavior of the vehicle.
The acquirer of the periphery monitoring device acquires positional information indicating a position of an object to watch for located around the vehicle, and the controller sets a display stop position of the virtual vehicle image in accordance with the position of the object to watch for, for example. With this configuration, the periphery monitoring device stops moving the virtual vehicle image at the time of, or immediately before, interference with an object to watch for, for example, an obstacle (such as another vehicle, a wall, or a pedestrian), if such interference would occur while the vehicle travels at the current steering angle, thereby making it possible to draw the attention of the user.
The controller of the periphery monitoring device sets a display mode of the virtual vehicle image in accordance with a distance to the object to watch for. With this configuration, for example, the periphery monitoring device can further ensure that the user recognizes the presence of the object to watch for.
The acquirer of the periphery monitoring device acquires a coupling state of a towed vehicle towed by the vehicle with respect to the vehicle, and the controller causes the virtual vehicle image to be displayed on the peripheral image together with a coupling image representing the coupling state of the towed vehicle, for example. With this configuration, for example, the periphery monitoring device can concurrently display the coupling image of the towed vehicle and the virtual vehicle image, to enable the user to easily recognize from a future moving state or orientation of the virtual vehicle image how the state of the coupled towed vehicle (coupling angle) is changed due to the traveling of the towing vehicle (for example, backward travel).
The controller of the periphery monitoring device causes the virtual vehicle image to be displayed after the vehicle starts traveling, for example. With this configuration, for example, the periphery monitoring device can avoid continuously displaying the virtual vehicle image to simplify the image display while the vehicle is stopped, and display the future relationship between the vehicle and the surroundings while the vehicle gradually moves, as needed. That is, the user can understand a future moving route while gradually driving the vehicle, and easily choose an appropriate moving route in accordance with the most recent surrounding environment.
When the current steering angle of the vehicle corresponds to a steering neutral position, the controller of the periphery monitoring device causes the virtual vehicle image not to be displayed. With this configuration, the periphery monitoring device enables the user to intuitively recognize from a display state of the display device that the current steering angle corresponds to a steering neutral position, that is, the vehicle is movable forward substantially straight. Additionally, the periphery monitoring device can simplify the peripheral image in an overhead mode, making it possible for the user to easily understand peripheral situation.
Hereinafter, exemplary embodiments of the present invention are disclosed. Configurations of the embodiments below, and operations, results, and effects attained by the configurations are merely exemplary. The present invention can be implemented by configurations other than the configurations disclosed in the following embodiments, and can attain at least one of various effects based on the basic configurations and derivative effects.
As illustrated in
A vehicle body 2 defines a vehicle interior 2a where an occupant (not illustrated) rides. The vehicle interior 2a is provided with a steering 4, an accelerator 5, a braking unit 6, and a gearshift 7, facing a seat 2b of a driver as an occupant. The steering 4 is, for example, a steering wheel projecting from a dashboard 24. The accelerator 5 is, for example, an accelerator pedal located under a foot of the driver. The braking unit 6 is, for example, a brake pedal located under a foot of the driver. The gearshift 7 is, for example, a shift lever projecting from a center console. The steering 4, the accelerator 5, the braking unit 6, and the gearshift 7 are not limited thereto.
The vehicle interior 2a is provided with a display device 8 serving as a display output and a voice output device 9 serving as a voice output. Examples of the display device 8 include a liquid crystal display (LCD) and an organic electroluminescent display (OELD). The voice output device 9 is, for example, a speaker. The display device 8 is, for example, covered with a transparent operation input 10 such as a touch panel. The occupant can view images displayed on the screen of the display device 8 through the operation input 10. The occupant can also touch, press, and move the operation input 10 with his or her finger or fingers at positions corresponding to the images displayed on the screen of the display device 8 to execute operational inputs. The display device 8, the voice output device 9, and the operation input 10 are, for example, included in a monitor 11 disposed in the center of the dashboard 24 in the vehicle width direction, that is, the transverse direction. The monitor 11 can include an operation input (not illustrated) such as a switch, a dial, a joystick, and a push button. Another voice output device (not illustrated) may be disposed in the vehicle interior 2a at a location different from the monitor 11, and voice can be output from both the voice output device 9 of the monitor 11 and the other voice output device. For example, the monitor 11 can double as a navigation system and an audio system.
In the vehicle interior 2a, a display device 12 different from the display device 8 is also disposed. As illustrated in
As illustrated in
As illustrated in
The imager 15a is, for example, located at a rear end 2e of the vehicle body 2 on a wall of a hatch-back door 2h under the rear window. The imager 15b is, for example, located at a right end 2f of the vehicle body 2 on a right side mirror 2g. The imager 15c is, for example, located at the front of the vehicle body 2, that is, at a front end 2c of the vehicle body 2 in the vehicle length direction on a front bumper or a front grill. The imager 15d is, for example, located at a left end 2d of the vehicle body 2 in the vehicle width direction on a left side mirror 2g. The ECU 14 can perform computation and image processing on image data generated by the imagers 15, thereby creating an image with a wider viewing angle and a virtual overhead image of the vehicle 1 as viewed from above. The ECU 14 performs computation and image processing on wide-angle image data (curved image data) generated by the imagers 15 to correct distortion or generate a cutout image of a particular area. The ECU 14 can perform viewpoint conversion to convert image data into virtual image data imaged at a virtual viewpoint different from the viewpoint of the imagers 15. For example, the ECU 14 can convert image data into virtual image data of a side-view image representing the side surface of the vehicle 1 as viewed away from the vehicle 1. The ECU 14 causes the display device 8 to display the generated image data to provide peripheral monitoring information that allows the driver to conduct a safety check of the right and left sides of the vehicle 1 and ahead of, behind, and around the vehicle 1 while viewing the vehicle 1 from above.
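The viewpoint conversion onto an overhead view typically rests on a planar-ground assumption: each camera pixel is mapped to ground-plane coordinates through a precomputed homography. The following Python fragment is only an illustrative sketch of that mapping (the function name and homography values are hypothetical, not part of the embodiment):

```python
import numpy as np

def warp_point(H, u, v):
    """Map an image pixel (u, v) to ground-plane coordinates via a
    3x3 homography H, assuming all imaged points lie on a flat road."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

# A pure scaling homography doubles both coordinates.
H = np.diag([2.0, 2.0, 1.0])
```

In a full top-view synthesis, each of the four camera images would be warped with its own calibrated homography and the results blended into a single canvas centered on the vehicle.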
The ECU 14 can perform driver assistance by identifying a section line drawn on the road surface around the vehicle 1 from the image data generated by the imagers 15, or perform parking assistance by detecting (extracting) a parking space (section lines).
As illustrated in
As illustrated in
The ECU 14 includes, for example, a central processing unit (CPU) 14a, a read only memory (ROM) 14b, a random access memory (RAM) 14c, a display controller 14d, a voice controller 14e, and a solid state drive (SSD, flash memory) 14f. The CPU 14a can perform computation and control of image processing involving an image displayed on the display device 8 and the display device 12, for example. The CPU 14a generates, for example, an overhead image (peripheral image) exhibiting the image of the vehicle 1 at the center from the image data generated by the imagers 15. By displaying a virtual vehicle image of the vehicle 1 on the peripheral image when traveling at a current steering angle, the CPU 14a presents the image in a manner that the driver can intuitively understand a future positional relationship between the vehicle 1 and the object to watch for (such as an obstacle, a parking line, and a section line) located around the vehicle 1. The overhead image can be created by a known method, so that description thereof is omitted. The CPU 14a can perform various kinds of computation and control such as determination on a target moving position (for example, a target parking position) of the vehicle 1, calculation of a guide route for the vehicle 1, determination on interference or non-interference with an object, automatic control (guiding control) of the vehicle 1, and cancellation of automatic control.
The CPU 14a can read an installed and stored computer program from a non-volatile storage device such as the ROM 14b to perform computation in accordance with the computer program. The RAM 14c temporarily stores various kinds of data used in the calculation by the CPU 14a. Of the computation by the ECU 14, the display controller 14d mainly executes synthesis of image data to be displayed on the display device 8. The voice controller 14e mainly executes processing on voice data output from the voice output device 9, of the computation of the ECU 14. The SSD 14f is a rewritable nonvolatile storage and can store therein data upon power-off of the ECU 14. The CPU 14a, the ROM 14b, and the RAM 14c can be integrated in the same package. The ECU 14 may include another logical operation processor such as a digital signal processor (DSP) or a logic circuit, instead of the CPU 14a. The SSD 14f may be replaced by a hard disk drive (HDD). The SSD 14f and the HDD may be provided separately from the ECU 14.
Examples of the brake system 18 include an anti-lock brake system (ABS) for preventing locking-up of the wheels during braking, an electronic stability control (ESC) for preventing the vehicle 1 from skidding during cornering, an electric brake system that enhances braking force (performs braking assistance), and a brake by wire (BBW). The brake system 18 applies braking force to the wheels 3 and the vehicle 1 through an actuator 18a. The brake system 18 is capable of detecting signs of lock-up of the brake and skidding of the wheels 3 from, for example, a difference in the revolving speeds between the right and left wheels 3 for various types of control. Examples of the brake sensor 18b include a sensor for detecting the position of a movable part of the braking unit 6. The brake sensor 18b can detect the position of a brake pedal being a movable part. The brake sensor 18b includes a displacement sensor. The CPU 14a can calculate a braking distance from a current speed of the vehicle 1 and the magnitude of the braking force calculated from a result of the detection by the brake sensor 18b.
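For constant deceleration, such a braking distance follows from the kinematic relation v^2 / (2a). The sketch below illustrates one possible form of this computation (the function name is hypothetical and the mapping from braking force to deceleration is assumed to have been done beforehand):

```python
def braking_distance(speed_mps, decel_mps2):
    """Stopping distance under constant deceleration: v^2 / (2*a).

    speed_mps  -- current vehicle speed in m/s
    decel_mps2 -- deceleration magnitude in m/s^2 (must be positive)
    """
    if decel_mps2 <= 0:
        raise ValueError("deceleration must be positive")
    return speed_mps ** 2 / (2.0 * decel_mps2)
```

For example, a vehicle at 10 m/s decelerating at 5 m/s^2 stops in 10 m.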
The steering-angle sensor 19 represents, for example, a sensor for detecting the amount of steering of the steering 4 such as a steering wheel. The steering-angle sensor 19 includes, for example, a Hall element. The ECU 14 acquires the steering amount of the steering 4 operated by the driver and the steering amount of each wheel 3 during automatic steering from the steering-angle sensor 19 for various kinds of control. Specifically, the steering-angle sensor 19 detects the rotation angle of a rotational part of the steering 4. The steering-angle sensor 19 is an exemplary angle sensor.
The accelerator sensor 20 represents, for example, a sensor that detects the position of a movable part of the accelerator 5. The accelerator sensor 20 can detect the position of the accelerator pedal as a movable part. The accelerator sensor 20 includes a displacement sensor.
The shift sensor 21 is, for example, a sensor that detects the position of a movable part of the gearshift 7. The shift sensor 21 can detect positions of a lever, an arm, and, a button as movable parts, for example. The shift sensor 21 may include a displacement sensor or may be configured as a switch.
The wheel-speed sensor 22 represents a sensor for detecting the amount of revolution and the revolving speed per unit time of the wheels 3. The wheel-speed sensor 22 is placed on each wheel 3 to output the number of wheel speed pulses indicating the detected revolving speed, as a sensor value. The wheel-speed sensor 22 may include, for example, a Hall element. The ECU 14 acquires the sensor value from the wheel-speed sensor 22 and computes the moving amount of the vehicle 1 from the sensor value for various kinds of control. In calculating the speed of the vehicle 1 from the sensor value of each wheel-speed sensor 22, the CPU 14a sets the speed of the vehicle 1 according to the speed of the wheel 3 with the smallest sensor value among the four wheels for executing various kinds of control. If one of the four wheels 3 exhibits a larger sensor value than the other wheels 3, such as one wheel 3 exhibiting higher rotation speed per unit period (unit time, or unit distance) by a given value or more than the other wheels 3, the CPU 14a regards the wheel 3 as being slipping (in idling state), and executes various kinds of control. The wheel-speed sensor 22 may be included in the brake system 18 (not illustrated). In such a case, the CPU 14a may acquire a result of the detection of the wheel-speed sensor 22 via the brake system 18.
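The min-of-four-wheels speed estimate and the slip check described above can be sketched as follows (the function name and the slip threshold value are illustrative assumptions):

```python
def vehicle_speed(wheel_speeds, slip_margin=2.0):
    """Estimate vehicle speed as the smallest of the four wheel speeds,
    and flag any wheel spinning faster than that baseline by more than
    slip_margin as slipping (idling)."""
    base = min(wheel_speeds)
    slipping = [i for i, s in enumerate(wheel_speeds) if s - base > slip_margin]
    return base, slipping
```

Here a wheel whose sensor value exceeds the slowest wheel by the given margin is treated as spinning, so its reading is excluded from the speed estimate.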
The configuration, arrangement, and electrical connection of various sensors and actuators described above are merely exemplary, and can be variously set (changed).
By way of example, the ECU 14 implementing the periphery monitoring system 100 generates a peripheral image representing the surroundings of the vehicle 1 in an overhead mode, and causes display of a vehicle image of the vehicle 1 in the overhead mode on the peripheral image, together with a virtual vehicle image representing a state of the vehicle 1 (a moving position or orientation of the vehicle body) when traveling at the current steering angle.
For display in the overhead mode as described above, as illustrated in
In the present embodiment, the virtual vehicle image may be displayed in a first display mode or a second display mode.
To provide display in the first display mode or the second display mode as described above, the acquirer 30 mainly acquires the peripheral image 46 representing the peripheral situation of the vehicle 1 in the overhead mode based on the image data output from the imagers 15 that image the surroundings of the vehicle 1, and the vehicle image 48 of the vehicle 1 to be displayed on the peripheral image 46 in the overhead mode. That is, the acquirer 30 acquires, from various sensors, the ROM 14b, and the SSD 14f, various kinds of information (data) required for performing display in the overhead mode, and temporarily holds it in the RAM 14c, for example.
For example, the steering-angle acquirer 30a acquires information (a steering angle) on an operation state of the steering 4 (steering wheel) output from the steering-angle sensor 19. That is, the steering-angle acquirer 30a acquires a steering angle indicating the driver's intended traveling direction of the vehicle 1. The steering-angle acquirer 30a may acquire information about whether the vehicle 1 is movable forward or backward from the position of the movable part of the gearshift 7 acquired from the shift sensor 21, to be able to identify the steering angle as a forward steering angle or a backward steering angle.
The peripheral-image generator 30b can generate the peripheral image 46 in the overhead mode through known viewpoint conversion and distortion correction on the image data generated by the imagers 15a to 15d. By displaying the peripheral image 46, the peripheral situation of the vehicle 1 can be presented to the user. The peripheral image 46 is based on the image data generated by the imagers 15a to 15d, so that the peripheral image 46 can be an overhead image centered on the vehicle 1 (an image having a viewpoint above the center of the screen 8b) as a basic image. In another embodiment, the viewpoint may be changed through viewpoint conversion to generate the peripheral image 46 representing the position of the vehicle 1 moved to the bottom end, that is, a forward overhead image mainly representing the region ahead of the vehicle 1 in the overhead mode. Conversely, the peripheral image 46 can be an image mainly representing the vehicle 1 moved in position to the top end, that is, a rearward overhead image of the region behind the vehicle 1 in the overhead mode. For example, the forward overhead image is useful for the first display mode, with no object to watch for located and the virtual vehicle image 50 largely moving ahead of the vehicle 1. The rearward overhead image is useful for the first display mode with the virtual vehicle image 50 largely moving behind the vehicle 1. The overhead image including the vehicle 1 (vehicle image 48) at the center is useful for the second display mode. The present embodiment describes an example of displaying the vehicle image 48 at a center of the peripheral image 46, but the display position of the vehicle image 48 can be appropriately changed by the user's (driver's) operation to the operation input 10.
The vehicle-marker acquirer 30c acquires, as vehicle markers, the vehicle image 48 (vehicle icon) of the vehicle 1 in the overhead mode, the virtual vehicle image 50 (virtual icon), and the towed vehicle image 66 of the towed vehicle 60 (a trailer icon, refer to
The object-to-watch-for acquirer 30d acquires an object for the vehicle 1 to watch for when traveling, from the result of the detection by the ranging units 16 and 17 and the image data generated by the imagers 15, for example. For example, the ranging units 16 and 17 search the surroundings of the vehicle 1 for an object such as another vehicle 52, a bicycle, a pedestrian, a wall, or a structure; if one is found, the object-to-watch-for acquirer 30d acquires (detects) a distance (positional information) to the object. The object-to-watch-for acquirer 30d detects a parking line indicating a parking region, a section line, and a stop line drawn on the road surface through image processing on the image data generated by the imagers 15. The object detected by the ranging units 16 and 17 can be used by the vehicle-marker display-position controller 32a of the control unit 32 to stop the moving (first display mode) or turning (second display mode) of the virtual vehicle image 50, to determine whether the virtual vehicle image 50, when displayed, interferes (contacts) with the object, that is, whether the vehicle 1 can continue to travel at the current steering angle, and to notify the user (driver) of the presence of the object to call for attention. The parking line, the section line, and the stop line detected based on the image data generated by the imagers 15 can be used in notifying the user of drive timing or an amount of driving the vehicle 1 for guiding the vehicle 1 to the location thereof. The object to watch for can be detected with a laser scanner, for example. The imagers 15 may be stereo cameras to detect the presence of the object or a distance to the object from the image data. In this case, the ranging units 16 and 17 are omissible.
In the case of the vehicle 1 coupled with the towed vehicle 60 (trailer), the trailer-coupling-angle acquirer 30e detects a coupling angle between the vehicle 1 and the towed vehicle 60 (an angle and a coupling state of the coupling arm 62 with respect to the vehicle 1) from the image data generated by the imager 15a, for example. While the vehicle 1 coupled with the towed vehicle 60 is traveling, the vehicle 1 and the towed vehicle 60 may differently behave from each other. Specifically, while the vehicle 1 travels backward, the coupling angle between the vehicle 1 and the towed vehicle 60 may increase or decrease depending on the steering angle of the vehicle 1 and a current coupling angle. Thus, the vehicle-marker display-position controller 32a of the control unit 32 moves the virtual vehicle image 50 in accordance with the acquired coupling angle while displaying the vehicle image 48 and the towed vehicle image 66, to facilitate estimation of future behavior of the towed vehicle 60 (towed vehicle image 66). In the case of the coupling device 56 (hitch ball 56a) coupling the vehicle 1 to the towed vehicle 60 including an angle sensor, the coupling angle of the coupling arm 62 may be directly acquired from the angle sensor. This reduces the processing load of the CPU 14a from that by image processing on the image data. Without the coupling device 56 of the vehicle 1 for coupling to the towed vehicle 60, the trailer-coupling-angle acquirer 30e may be omissible.
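How the coupling angle grows or shrinks with the steering angle and the current coupling angle can be illustrated with a standard simplified tractor-trailer kinematic model with zero hitch offset. This is only an illustrative sketch under assumed parameter values, not necessarily the model used by the embodiment:

```python
import math

def step_coupling_angle(phi, steer, v, dt, wheelbase=2.8, trailer_len=4.0):
    """One Euler step of a simplified tractor-trailer model (hitch at the
    rear axle): d(phi)/dt = (v/L)*tan(steer) - (v/d)*sin(phi),
    where phi is the coupling angle, L the tractor wheelbase, and d the
    trailer length. Negative v models backward travel."""
    dphi = (v / wheelbase) * math.tan(steer) - (v / trailer_len) * math.sin(phi)
    return phi + dphi * dt
```

With this model, forward travel at zero steering damps the coupling angle toward zero, while backward travel amplifies it, which is consistent with the instability of the coupling angle during backward travel described above.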
The control unit 32 mainly performs control of the display of, on the peripheral image 46, the virtual vehicle image 50 representing a state of the vehicle 1 traveling at the current steering angle in the overhead mode together with the vehicle image 48.
The vehicle-marker display-position controller 32a determines a display position of the vehicle image 48, which is one of the vehicle markers acquired by the vehicle-marker acquirer 30c. As described above, the vehicle-marker display-position controller 32a may choose a viewpoint of the peripheral image 46 (overhead image) in accordance with a moving direction of the virtual vehicle image 50, to determine the display position of the vehicle image 48 in accordance with the viewpoint. The vehicle-marker display-position controller 32a determines a display position of the virtual vehicle image 50, which is one of the vehicle markers, in accordance with the steering angle of the vehicle 1 acquired by the steering-angle acquirer 30a. In the first display mode of the virtual vehicle image 50, the vehicle-marker display-position controller 32a displays the virtual vehicle image 50 on the peripheral image 46 (overhead image) such that it continuously or intermittently moves to a position corresponding to the position of the vehicle 1 after traveling, for example, three meters at the steering angle at that point, with reference to the display position of the vehicle image 48. In this case, as illustrated in
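The display position reached after traveling a given distance at the current steering angle can be computed, for example, with the kinematic bicycle model. The following sketch uses assumed names and an assumed wheelbase value, with the reference point at the rear axle and the vehicle initially heading along the x axis:

```python
import math

def pose_after_travel(steer, distance, wheelbase=2.8):
    """Pose (x, y, heading) after moving `distance` along a constant-
    steering-angle arc, using the kinematic bicycle model."""
    if abs(steer) < 1e-9:
        return distance, 0.0, 0.0          # straight ahead
    r = wheelbase / math.tan(steer)        # turning radius of the rear axle
    heading = distance / r                 # heading change along the arc
    return r * math.sin(heading), r * (1.0 - math.cos(heading)), heading
```

The returned pose can then be converted to pixel coordinates on the peripheral image 46 to place and orient the virtual vehicle image 50.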
In the first display mode of the virtual vehicle image 50, after the object-to-watch-for acquirer 30d detects the object to watch for, the vehicle-marker display-position controller 32a can acquire a display stop position at which to stop the virtual vehicle image 50 before the virtual vehicle image 50 comes into contact with another vehicle 52, for example. That is, in displaying the virtual vehicle image 50 moving away from the vehicle image 48, the virtual vehicle image 50 can be stopped before contacting another vehicle 52 for the purpose of calling for the driver's attention. That is, the display can show that the vehicle 1 can travel to the stop position of the virtual vehicle image 50 without contacting an obstacle such as another vehicle 52.
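One way to obtain such a display stop position is to march the virtual vehicle's reference point along the predicted constant-steering arc and stop when it comes within a safety margin of a detected obstacle. The following is an illustrative sketch only (names, the three-meter horizon, the margin, and the point-obstacle simplification are all assumptions; a real implementation would test the full vehicle outline):

```python
import math

def display_stop_distance(steer, obstacles, max_dist=3.0, margin=0.5,
                          wheelbase=2.8, step=0.05):
    """Return the travel distance at which the reference point on the
    predicted arc first comes within `margin` of any obstacle point
    (x, y), or max_dist if the path stays clear."""
    d = 0.0
    while d <= max_dist:
        if abs(steer) < 1e-9:
            x, y = d, 0.0                              # straight path
        else:
            r = wheelbase / math.tan(steer)            # turning radius
            x, y = r * math.sin(d / r), r * (1.0 - math.cos(d / r))
        for ox, oy in obstacles:
            if math.hypot(ox - x, oy - y) < margin:
                return d                               # stop display here
        d += step
    return max_dist
```

With no obstacle on the arc, the virtual vehicle image would travel the full horizon; with an obstacle one meter ahead, it would stop about half a meter short of it.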
In the second display mode of the virtual vehicle image 50, the vehicle-marker display-position controller 32a displays the virtual vehicle image 50 on the peripheral image 46 (overhead image) such that, at the display position, the virtual vehicle image 50 is oriented in a direction corresponding to the orientation of the vehicle 1 after traveling, for example, three meters at the current steering angle. In this case, as illustrated in
In the case of the towed vehicle 60 coupled to the vehicle 1, the vehicle-marker display-position controller 32a displays the towed vehicle image 66 acquired by the vehicle-marker acquirer 30c on the peripheral image 46 (overhead image) in accordance with the coupling angle acquired by the trailer-coupling-angle acquirer 30e. For example, as illustrated in
The display-mode controller 32b mainly changes the display mode of the virtual vehicle image 50. For example, as illustrated in
For another example, the virtual vehicle image 50 may be changed from non-blinking in regular setting to blinking to call the user's attention. As illustrated in
In the second display mode illustrated in
The overhead display controller 32c controls the display mode of the screen 8b. For example, the peripheral image 46 as an overhead image may be displayed in response to a user's (driver's) request through the operation input 10. The peripheral image 46 may be displayed, assuming issuance of a display request, if the driver operates to transition to backward traveling, increasing blind spots, or upon detection of the object (obstacle) to watch for by the object-to-watch-for acquirer 30d in the traveling direction. After acquiring a display request for the peripheral image 46, the overhead display controller 32c switches the screen 8a of the display device 8 displaying a navigation screen or an audio screen in regular setting to an actual-image display mode representing the traveling direction of the vehicle 1, and displays the screen 8b together with the screen 8a. As illustrated in
The driving assist 34 acquires the estimated motion line 42 and the estimated direction line 44 to be displayed on the screen 8a, and provides driving assistance for the driver of the vehicle 1, parking assistance for guiding the vehicle 1 into a parking region, and exit assistance for exiting the vehicle 1 from the parking region.
The route-marker acquirer 34a acquires the estimated motion line 42 and the estimated direction line 44 according to the steering angle of the vehicle 1 acquired by the steering-angle acquirer 30a, and a position of the gearshift 7 (shift lever), or receipt of a forward instruction or a backward instruction from the driver through the operation input 10. The estimated motion line 42 and the estimated direction line 44 are displayed ahead of or behind the vehicle 1 up to three meters, for example. A display length may be changed by the driver's operating the operation input 10. The estimated motion line 42 can indicate a future position of the wheel 3 on a road surface when the vehicle 1 travels at the current steering angle. The estimated motion line 42 is changed depending on the steering angle of the vehicle 1, so that the driver can easily search for a route by which the vehicle 1 can run on a road surface having less unevenness, for example. Similarly, the estimated direction line 44 can indicate a future moving direction of the vehicle 1 when traveling at the current steering angle. The estimated direction line 44 is also changed depending on the steering angle of the vehicle 1, so that the driver can easily find a moving direction of the vehicle 1 while comparing with the peripheral situation of the vehicle 1, by changing a steering amount.
The vehicle-state acquirer 34b acquires a current status of the vehicle 1 to perform driver assistance for the vehicle 1. For example, the vehicle-state acquirer 34b acquires the magnitude of current braking force from a signal from the brake system 18, or acquires a current vehicle speed or a degree of acceleration/deceleration of the vehicle 1 from a result of the detection by the wheel-speed sensor 22. In accordance with a signal from the gearshift 7, the vehicle-state acquirer 34b also acquires the current state of the vehicle 1 such as being movable forward or backward, or stoppable (parkable).
The target-position determiner 34c, the route calculator 34d, and the guidance controller 34e mainly function to provide parking assistance or exit assistance.
In the present embodiment, in the first display mode, the virtual vehicle image 50 is moved on the overhead peripheral image 46 ahead of the vehicle image 48, to display the progress of guidance in one of the assistance modes. In actually guiding the vehicle 1, the vehicle 1 may be guided directly from a guidance start position to the target parking position without turning back, or may turn back two or more times or temporarily stop.
In actual parking assistance for the vehicle 1, a reference point set to the vehicle 1 is guided to the target parking position within a parkable region to place the vehicle 1 in the parkable region. The reference point is set at the center of the rear-wheel axle, for example. Thus, to guide the vehicle image 48 on the screen 8b, as illustrated in
The same applies to the exit assistance. For example, to notify the driver of a temporary stop at the time when the front part of the vehicle 1 exits from the parking space to a road, the display color of the virtual vehicle image 50, which moves away from the vehicle image 48 in a parked state on the peripheral image 46, is changed to red, for example, at the time when the virtual vehicle image 50 reaches the road. The driver can then check rightward and leftward before entering the road. Also in this case, the driver can understand the peripheral situation from the virtual vehicle image 50 displayed in the overhead mode, and easily recognize the temporary stop location at which to check rightward and leftward.
To perform such parking assistance (exit assistance), the target-position determiner 34c detects a parkable region 68a in the peripheral region of the vehicle 1 with reference to an obstacle around the vehicle 1 and a parking line or a stop line on the road surface, which are acquired by the object-to-watch-for acquirer 30d based on the information from the imagers 15 and the ranging units 16 and 17. The target-position determiner 34c also determines the target parking position N for guiding the vehicle 1 with reference to the detected parkable region 68a and the information from the imagers 15 and the ranging units 16 and 17.
The route calculator 34d calculates the guide route L for guiding the vehicle 1 from the present position of the vehicle 1 to the target parking position (such that the reference point M matches the target parking position N) by a known method. In response to receipt of a request for the point to watch for (turn-back point), the route calculator 34d sets the point to watch for (turn-back point) on the guide route with reference to the obstacles located around the vehicle 1 (such as the other vehicles 52a and 52b) and the section line 68 acquired by the object-to-watch-for acquirer 30d.
The guidance controller 34e guides the vehicle 1 along the guide route L calculated by the route calculator 34d. In this case, when the turn-back point P1 is set on the guide route L, for example, a voice message may be output via the voice controller 14e, or a text message or an indicator may be displayed on the display device 8 or the display device 12 to prompt the driver to temporarily stop the vehicle 1 and shift the gear at the present position.
The display-switch receiver 36 receives an operation signal (request signal) when the driver makes a display request for the virtual vehicle image 50 in the overhead mode via the operation input 10 or the operation unit 14g. In another embodiment, for example, the display-switch receiver 36 may regard the shifting of the gearshift (shift lever) to the backward range as the display request for the virtual vehicle image 50 in the overhead mode, and receive the request signal. The display-switch receiver 36 may also receive a cancel request for canceling display of the virtual vehicle image 50 in the overhead mode via the operation input 10 or the operation unit 14g.
With reference to the obstacle (such as another vehicle 52) located around the vehicle 1 and the section line 68, acquired by the object-to-watch-for acquirer 30d, the notifier 38 displays a message on the screen 8a, or outputs a voice message via the voice controller 14e, if an object to watch for is present around the vehicle 1. The notifier 38 may cause the display-mode controller 32b to change the display mode of the vehicle image 48 or the virtual vehicle image 50 on the peripheral image 46 for a necessary notification. The output 40 outputs, to the display controller 14d or the voice controller 14e, the details of overhead display determined by the control unit 32 or the details of assistance determined by the driving assist 34.
The following describes an example of display processing for the overhead image performed by the periphery monitoring system 100 configured as described above, with reference to the flowcharts in
First, the ECU 14 checks whether the display-switch receiver 36 has received the display request for the virtual vehicle image 50 (S100). With no display request for the virtual vehicle image 50 received (No at S100), the ECU 14 temporarily ends this processing. After receiving the display request for the virtual vehicle image 50 (Yes at S100), the overhead display controller 32c switches the screen 8a of the display device 8 (S102). That is, the regular mode of the screen 8a displaying a navigation screen or an audio screen is switched to a mode of displaying an actual image representing the traveling direction of the vehicle 1. As illustrated in
Subsequently, the vehicle-marker acquirer 30c acquires, from a storage such as the ROM 14b, the vehicle image 48 (vehicle icon) and the virtual vehicle image 50 (virtual vehicle, virtual icon) in the overhead mode (S104). In this case, the acquired vehicle image 48 and virtual vehicle image 50 may be the same data in different display modes. At this point, if the trailer-coupling-angle acquirer 30e acquires the coupling angle of the towed vehicle 60 (Yes at S106), the vehicle-marker acquirer 30c acquires the towed vehicle image 66 (towed vehicle icon) (S108). If the trailer-coupling-angle acquirer 30e does not acquire the coupling angle of the towed vehicle 60 (No at S106), that is, if the vehicle 1 does not tow the towed vehicle 60, the processing skips S108. The processing also skips S108 if the vehicle 1 tows the towed vehicle 60 but the coupling angle cannot be acquired from the image data generated by the imager 15a due to a dark environment, for example.
If the current control is in a mode other than the parking assistance mode (No at S110), for example, the ECU 14 acquires the peripheral image 46 (overhead image) generated by the peripheral-image generator 30b to be displayed on the screen 8b (S112). Subsequently, the ECU 14 checks whether a rearward display mode is currently requested from the operation state of the gearshift 7 or the operation input 10 (S114). In the rearward display mode (Yes at S114), for example, when the gearshift 7 is shifted to the backward range, or when a signal indicating that the driver intends to travel backward is acquired through an input to the operation input 10, the ECU 14 performs rearward display processing for displaying an image of the region behind the vehicle as succeeding processing (S116). That is, the screen 8a displays an actual image of the region behind the vehicle 1 imaged by the imager 15a, and the screen 8b displays the virtual vehicle image 50 moving backward. If the rearward display mode is not requested at S114 (No at S114), for example, when the gearshift 7 is shifted to the forward range, or when a signal indicating that the driver intends to drive the vehicle forward is acquired through an input to the operation input 10, the ECU 14 performs frontward display processing for displaying an image of the region ahead of the vehicle as succeeding processing (S118). That is, the screen 8a displays an actual image of the region ahead of the vehicle 1 imaged by the imager 15c, and the screen 8b displays the virtual vehicle image 50 moving forward.
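For illustration only, the branch between the rearward and frontward display processing at S114 to S118 can be sketched as follows. This sketch is not part of the disclosed embodiment; the function name, shift-position codes, and return values are assumptions:

```python
def select_display_mode(shift_position, operator_request=None):
    """Decide which actual image the main screen should show,
    mirroring S114-S118: the backward range (or an explicit backward
    request through the operation input) selects the rear camera view;
    otherwise the front camera view is selected."""
    if shift_position == "R" or operator_request == "backward":
        return "rear_camera"    # corresponds to S116, rearward display
    return "front_camera"       # corresponds to S118, frontward display
```

In either branch, the overhead screen continues to show the virtual vehicle image moving in the selected direction, as described above.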
Subsequently, the ECU 14 acquires the steering angle of the vehicle 1 detected by the steering angle sensor 19 via the steering-angle acquirer 30a (S120). If the display request for the virtual vehicle is received at S100 and the received request is for the first display mode (Yes at S122), the vehicle-marker display-position controller 32a displays the virtual vehicle image 50 traveling away from the vehicle image 48 in a direction corresponding to the steering angle of the vehicle 1 (S124). In this case, the virtual vehicle image 50 may be displayed continuously or intermittently. This display mode may be chosen by the driver. The route-marker acquirer 34a acquires the estimated motion line 42 and the estimated direction line 44 in accordance with the steering angle of the vehicle 1 and superimposes them on the actual image on the screen 8a.
At this point, after determining that an object to watch for (for example, another vehicle 52) acquired by the object-to-watch-for acquirer 30d is present in the moving direction of the virtual vehicle image 50 and that the object is an obstacle that may interfere with (come into contact with) the vehicle (Yes at S126), the vehicle-marker display-position controller 32a calculates a stop display position of the virtual vehicle image 50 (S128). If the display position of the virtual vehicle image 50 reaches the calculated stop display position (Yes at S130), the vehicle-marker display-position controller 32a stops the moving display of the virtual vehicle image 50 immediately before another vehicle 52 (at the stop display position) as illustrated in
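For illustration only, the stop display position calculated at S128 can be sketched as the travel distance just short of the obstacle, quantized to the display step. This sketch is not part of the disclosed embodiment; the function name, the safety margin, and the step size are assumptions:

```python
def stop_display_position(obstacle_distance_m, margin_m=0.3, step_m=0.25):
    """Return the travel distance (in meters along the estimated route)
    at which the virtual vehicle image should stop so that it appears
    immediately before the obstacle, quantized to the display step."""
    limit = max(obstacle_distance_m - margin_m, 0.0)
    # Snap down to the nearest displayable step so the image never
    # overlaps the obstacle on screen.
    return (limit // step_m) * step_m
```

The moving display then halts once the image's display position reaches this value, corresponding to the Yes branch at S130.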
Subsequently, the ECU 14 monitors whether a display stop request for the virtual vehicle image 50 is received via the display-switch receiver 36 (S134). With no receipt of the display stop request (No at S134), the ECU 14 returns to S110 to continuously display the virtual vehicle image 50. For example, if the mode is not changed at S110 and S122, the virtual vehicle image 50 temporarily disappears from the peripheral image 46, is displayed again away from the position of the vehicle image 48, and moves in a direction corresponding to the current steering angle of the vehicle 1. Thus, in response to a change in the steering angle of the vehicle 1, the virtual vehicle image 50 moves on the display in a direction different from that in the previous display. That is, the virtual vehicle image 50 can be moved in a direction for avoiding an obstacle such as another vehicle 52. The driver can thus find a steering angle of the vehicle 1 at which the vehicle does not interfere with (does not come into contact with) another vehicle 52 while referring to the movement of the virtual vehicle image 50.
After receipt of a display request for a mode other than the first display mode at S122 (No at S122), that is, a display request for the second display mode, the vehicle-marker display-position controller 32a displays the virtual vehicle image 50 acquired at S104 at the display position of the vehicle image 48, turned in a direction corresponding to the vehicle body direction at the time when the vehicle 1 moves backward by a given distance (for example, three meters) at the current steering angle (S136). At this point, the route-marker acquirer 34a acquires the estimated motion line 42 and the estimated direction line 44 in accordance with the steering angle of the vehicle 1 to be superimposed on the actual image on the screen 8a.
If determining that an object to watch for (for example, another vehicle 52) present in the turning direction of the virtual vehicle image 50 determined by the vehicle-marker display-position controller 32a is an obstacle that may interfere with the vehicle (Yes at S138), the display-mode controller 32b changes the display mode of the virtual vehicle image 50 to a highlighted display (S140), and advances the process to S134. For example, as illustrated in
After acquiring the coupling angle of the towed vehicle 60 at S106 and the towed vehicle image 66 at S108, the overhead display controller 32c displays the towed-vehicle display region 64 on the screen 8b as illustrated in
At S110, if the current control state is the parking assistance mode (Yes at S110), for example, the ECU 14 advances to the flowchart in
As described above with reference to
If the vehicle-state acquirer 34b confirms an operation of the gearshift 7 to change the shift position (Yes at S156), the ECU 14 temporarily advances the processing to S110 to check the continuance of the parking assistance mode. That is, when the driver decides not to park despite having moved the vehicle 1 to the gear shifting point, the processing proceeds to S112, the display processing of the virtual vehicle image 50. In response to continuance of the parking assistance mode, the processing proceeds to S142, in which guidance control has already been started (Yes at S142), and then proceeds to S150, skipping S144 to S148, to continue the traveling display of the virtual vehicle image 50. If the virtual vehicle image 50 has not reached the gear shifting position on display at S152 (No at S152), the ECU 14 advances to S158, skipping S154 and S156.
With no change in the shift position at S156 (No at S156), the vehicle-marker display-position controller 32a checks whether the virtual vehicle image 50 has reached the target parking position N on display (S158). If the virtual vehicle image 50 has not reached the target parking position N on display (No at S158), the processing proceeds to S110, as described above, and the vehicle-marker display-position controller 32a continues to control display of the virtual vehicle image 50 while checking the continuance of the parking assistance. If the virtual vehicle image 50 reaches the target parking position N on display (Yes at S158), the vehicle-marker display-position controller 32a stops moving the display of the virtual vehicle image 50 at the target parking position N. The display-mode controller 32b displays the virtual vehicle image 50 in a stop mode (S160). For example, the display-mode controller 32b changes the state of the virtual vehicle image 50 to the blinking state while maintaining its display color in green, the regular setting. With such display of the virtual vehicle image 50, the driver can easily recognize that the vehicle 1 can reach the target parking position N if the vehicle 1 is guided at the current steering angle. The guidance controller 34e checks whether the vehicle 1 has reached the target parking position N (S162). If the vehicle 1 has not reached the target parking position N yet (No at S162), the guidance controller 34e continues the display at S160. If the vehicle 1 reaches the target parking position N (Yes at S162), the processing ends. In this case, the ECU 14 may cause the voice controller 14e to output a voice message representing completion of the parking assistance from the voice output device 9. The ECU 14 may also cause the display controller 14d to display a text message representing completion of the parking assistance on the display device 8.
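For illustration only, the switch to the stop mode at S158 to S160 can be sketched as follows. This sketch is not part of the disclosed embodiment; the function name, the representation of positions as distances along the guide route, and the tolerance are assumptions:

```python
def virtual_vehicle_display_state(virtual_pos_m, target_pos_m, tol_m=0.05):
    """Mirror S158-S160: while the virtual vehicle image is short of the
    target parking position it keeps moving; once it reaches the target
    (within a small tolerance) it stops and blinks, keeping the regular
    green display color."""
    if abs(target_pos_m - virtual_pos_m) <= tol_m:
        return {"moving": False, "blinking": True, "color": "green"}
    return {"moving": True, "blinking": False, "color": "green"}
```

The blinking, stationary image signals that guidance at the current steering angle can bring the vehicle to the target parking position, as described above.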
After elapse of a given period, the ECU 14 may return the display of the display device 8 to regular display such as a navigation screen or an audio screen.
In this way, the periphery monitoring system 100 according to the present embodiment displays the virtual vehicle image 50 in the overhead mode. Consequently, the driver can intuitively recognize a future moving position of the vehicle 1, a future orientation of the vehicle 1, and a future positional relationship between the vehicle 1 and an object to watch for (for example, another vehicle 52), when the vehicle 1 travels at the current steering angle. This abates the driver's sense of insecurity and makes it easier for the driver to make appropriate driving determinations, which contributes to reducing the driving load.
As illustrated in
In displaying the virtual vehicle images 50 in an afterimage display mode as illustrated in
Thus, by not displaying the virtual vehicle image 50 while the current steering angle of the vehicle 1 corresponds to the steering neutral position, the driver can intuitively recognize that the vehicle is movable substantially straight (steering angle = 0 degrees). The peripheral image displayed in the overhead mode is also simplified, enabling the driver to more easily understand the peripheral situation.
In the case of not displaying the virtual vehicle image 50 when the current steering angle of the vehicle 1 corresponds to the steering neutral position, as illustrated in
In regular forward travel, the peripheral-image generator 30b can display an actual frontward image on the screen 8a of the display device 8 according to the image data generated by the imager 15c. When the ECU 14 receives an operation (braking request) of the braking unit 6 (brake pedal) from the brake sensor 18b and the object-to-watch-for acquirer 30d detects a stop line 72 ahead on a road surface 70, the ECU 14 executes a stop-position display mode. In this case, the overhead display controller 32c displays the screen 8b (peripheral image 46) on the display device 8. The vehicle-marker display-position controller 32a displays the vehicle image 48 on the peripheral image 46. The ECU 14 calculates an estimated stop position of the vehicle 1 from, for example, the detected value (braking force) of the brake sensor 18b, the vehicle speed of the vehicle 1 based on the detected value of the wheel-speed sensor 22, and the deceleration. The vehicle-marker display-position controller 32a acquires a display position of the virtual vehicle image 50 (50d) corresponding to the estimated stop position.
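For illustration only, an estimated stop position of the kind described above can be sketched with constant-deceleration kinematics, d = v^2 / (2a). This sketch is not part of the disclosed embodiment; the function name is an assumption, and in practice the deceleration would be derived from the braking-force value of the brake sensor 18b rather than passed in directly:

```python
def estimated_stop_distance_m(speed_mps, decel_mps2):
    """Distance traveled before stopping under constant deceleration,
    d = v^2 / (2a), with speed in m/s and deceleration in m/s^2."""
    if decel_mps2 <= 0.0:
        raise ValueError("deceleration must be positive")
    return speed_mps ** 2 / (2.0 * decel_mps2)
```

The virtual vehicle image would then be placed this distance ahead of the vehicle image along the estimated route.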
In displaying the stop position with the virtual vehicle image 50 as illustrated in
Thus, prompt display of the virtual vehicle image 50 allows the driver to increase or decrease the braking force appropriately and quickly. Specifically, an excessive increase in braking force (sudden braking) is avoidable. If the driver's initial operation amount of the braking unit 6 is excessive, the virtual vehicle image 50 stops before the stop line 72 on display. Also in this case, displaying the virtual vehicle image 50 in a highlighted manner makes it possible for the driver to recognize the excessive braking force and reduce it. Along with the driver's adjustment of the braking force, the display position of the virtual vehicle image 50 may be changed. The ECU 14 may appropriately output a voice message in accordance with the display state of the virtual vehicle image 50. For example, the ECU 14 may output a message such as “Appropriate braking force”, “Insufficient braking force, please step on the brake pedal a little harder”, or “Excessive braking force, please relax braking force a little”. Alternatively, the ECU 14 may output different kinds of annunciation sound to convey the same or similar messages depending on the display state of the virtual vehicle image 50.
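For illustration only, the selection among the quoted messages can be sketched by comparing the estimated stop position with the position of the stop line. This sketch is not part of the disclosed embodiment; the function name and the tolerance threshold are assumptions:

```python
def braking_message(est_stop_m, stopline_m, tol_m=0.5):
    """Map the predicted stop position (distance ahead, in meters) and
    the stop-line distance to one of the example voice messages.  The
    tolerance band below the stop line counts as 'appropriate'."""
    if est_stop_m > stopline_m:
        # Vehicle would overshoot the stop line.
        return "Insufficient braking force, please step on the brake pedal a little harder"
    if est_stop_m < stopline_m - tol_m:
        # Vehicle would stop well short of the line.
        return "Excessive braking force, please relax braking force a little"
    return "Appropriate braking force"
```

Different annunciation sounds could be substituted for the text, as the passage above notes.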
As illustrated in
A display processing program for a virtual vehicle image executed by the CPU 14a according to the embodiment may be recorded and provided in an installable or executable file format on a computer-readable recording medium such as a compact disc read only memory (CD-ROM), a flexible disk (FD), a compact disc recordable (CD-R), and a digital versatile disc (DVD).
The display processing program for a virtual vehicle image may be stored in a computer connected to a network such as the Internet and provided by being downloaded via the network. Furthermore, the display processing program for a virtual vehicle image executed in the present embodiment may be provided or distributed via a network such as the Internet.
Embodiments and modifications of the present invention have been described above for illustrative purposes only and are not intended to limit the scope of the invention. Such novel embodiments may be carried out in a variety of forms, and various omissions, substitutions, and modifications can be made without departing from the spirit of the invention. Such embodiments and modifications are included in the scope and spirit of the invention and are included in the scope of the inventions set forth in the claims and their equivalents.
Number | Date | Country | Kind
---|---|---|---
2017-110347 | Jun 2017 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2018/006590 | 2/22/2018 | WO | 00