The present disclosure relates to an assistance method and an assistance apparatus.
Conventionally, there has been known a technology of causing another vehicle to recognize the presence of an own vehicle by emitting laser light, in order to prevent a collision accident between vehicles or between a vehicle and a pedestrian in an environment with low visibility such as nighttime.
A related technique is described in JP 2005-157873 A.
Under such circumstances, from the viewpoint of further improving safety, there is room for improvement in the information that another vehicle is caused to recognize by the irradiation with laser light.
An object of the present disclosure is to improve traffic safety in an environment with low visibility.
An assistance method according to the present disclosure includes: detecting a movement trajectory of a moving object moving around an own vehicle; acquiring a predicted path of the moving object based on the movement trajectory of the moving object; determining a display mode related to the moving object on a road surface on the predicted path of the moving object, based on the predicted path of the moving object; and controlling irradiation of the road surface with laser light based on the display mode related to the moving object.
Hereinafter, embodiments of an assistance apparatus, a vehicle, an assistance method, and a program according to the present disclosure will be described with reference to the drawings.
In the description of the present disclosure, components having the same or substantially the same functions as components described with reference to earlier drawings are denoted by the same reference numerals, and their description may be omitted as appropriate. In addition, even when the same or substantially the same portion is represented, its dimensions and ratios may differ from drawing to drawing. Furthermore, in order to keep the drawings legible, only the main components are denoted by reference numerals in the description of each drawing, and even components having the same or substantially the same functions as components in earlier drawings may not be denoted by reference numerals.
Here, the front tire 13f according to the embodiment is an example of a first wheel. The rear tire 13r according to the embodiment is an example of a second wheel.
The vehicle body 12 is supported by the wheels 13. The vehicle 1 includes a driving machine (not illustrated) and is movable by driving at least one of the wheels 13 (a driving wheel) by power of the driving machine. As the driving machine, any driving machine can be applied, such as an engine using gasoline, hydrogen, or the like as fuel, a motor using electric power from a battery, or a combination of an engine and a motor. The direction in which the two pairs of wheels 13 are arranged corresponds to a traveling direction of the vehicle 1. The vehicle 1 can move forward or backward by shifting a gear (not illustrated) or the like, and can make right or left turns by steering.
The vehicle body 12 has a front end portion F which is an end portion adjacent to the front tire 13f, and a rear end portion R which is an end portion adjacent to the rear tire 13r. The vehicle body 12 has a substantially rectangular shape in top view, and four corners of the substantially rectangular shape may be referred to as end portions.
A pair of bumpers 14 is provided near a lower end of the vehicle body 12, at the front end portion F and the rear end portion R of the vehicle body 12. Among the pair of bumpers 14, a front bumper 14f covers the entire front surface and parts of the side surfaces in the vicinity of the lower end portion of the vehicle body 12. Among the pair of bumpers 14, a rear bumper 14r covers the entire rear surface and parts of the side surfaces in the vicinity of the lower end portion of the vehicle body 12.
A sonar (sound navigation and ranging) 15 that transmits and receives sound waves such as ultrasonic waves is disposed at a predetermined end portion of the vehicle body 12. The sonar 15 includes wave transmitting/receiving units 15f and 15r. For example, one or more wave transmitting/receiving units 15f are disposed on the front bumper 14f, and one or more wave transmitting/receiving units 15r are disposed on the rear bumper 14r. Furthermore, the number and/or positions of the wave transmitting/receiving units 15f and 15r are not limited to those in the illustrated example.
In the present embodiment, the sonar 15 using sound waves such as ultrasonic waves is exemplified, but the present disclosure is not limited thereto. For example, the vehicle 1 may include a radar that transmits and receives electromagnetic waves instead of the sonar 15 or in addition to the sonar 15. Furthermore, the sonar 15 may be simply referred to as a sensor.
The sonar 15 detects an obstacle around the vehicle 1 based on a sound wave transmission/reception result. The sonar 15 measures a distance between the obstacle around the vehicle 1 and the vehicle 1 based on the sound wave transmission/reception result. Here, the sonar 15 according to the embodiment is an example of an on-vehicle sensor.
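For example, with an ultrasonic sonar, the distance to an obstacle follows from the round-trip time of the reflected wave: the wave travels to the obstacle and back, so the one-way distance is half the total path length. The following is a minimal Python sketch of this calculation; the function name and the speed-of-sound constant are illustrative assumptions, not values taken from the embodiment.

```python
def sonar_distance(round_trip_time_s: float,
                   speed_of_sound_m_s: float = 343.0) -> float:
    """Estimate the distance to an obstacle from the echo round-trip time.

    The transmitted wave travels to the obstacle and back, so the
    one-way distance is half of the total path length.
    """
    return speed_of_sound_m_s * round_trip_time_s / 2.0


# Example: an echo received 12 ms after transmission corresponds to
# roughly 2.06 m at 343 m/s (speed of sound in air at about 20 degrees C).
print(sonar_distance(0.012))  # 2.058
```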
In addition, the vehicle 1 includes all-around cameras 16 that image the surroundings of the vehicle 1. As an example, the vehicle 1 includes, as the all-around cameras 16, a front camera 16a that images a front area, a rear camera 16b that images a rear area, a left side camera 16c that images a left side area, and a right side camera (not illustrated) that images a right side area.
Hereinafter, when the front camera 16a, the rear camera 16b, the left side camera 16c, and the right side camera are not particularly distinguished from each other, they are simply referred to as the all-around camera 16. The position and/or the number of the all-around cameras 16 are not limited to those in the illustrated example.
The all-around camera 16 is a camera capable of capturing an image of the surroundings of the vehicle 1, and is, for example, a camera that captures a color image. The captured image captured by the all-around camera 16 may be a moving image or a still image. In addition, the all-around camera 16 may be a camera built into the vehicle 1, a camera of a drive recorder retrofitted to the vehicle 1, or the like. Here, the all-around camera 16 according to the embodiment is an example of the on-vehicle sensor.
At least one irradiation device 17 is provided at a predetermined end portion of the vehicle body 12.
The irradiation device 17 includes a light emitter such as a light emitting diode (LED) and an optical system that converges, expands, or deflects light from the light emitter. The irradiation device 17 switches light emission by the light emitter on and off under the control of the assistance apparatus 3. In addition, under the control of the assistance apparatus 3, the irradiation device 17 operates the optical system such that the light from the light emitter forms a predetermined irradiation shape at a predetermined position on a road surface ahead of the vehicle 1 in the traveling direction. In other words, the irradiation device 17 is configured to be able to irradiate the road surface with laser light in a visible light range. As an example, the irradiation device 17 irradiates the road surface positioned in the vehicle traveling direction with visible laser light such that the light has an irradiation shape corresponding to a display mode determined by the assistance apparatus 3.
The light emitter of the irradiation device 17 is not limited to the LED, and may be another light source. For example, the light emitter may be a solid-state laser such as a semiconductor laser, a gas laser such as a He-Ne laser, or a liquid laser. The light emitter may be a high-intensity discharge (HID) lamp, a halogen lamp, or a lamp shared with a front light of the vehicle 1. Alternatively, the irradiation device 17 may be integrated with the front light of the vehicle 1.
The irradiation device 17 may be configured to be able to change a color or a luminance of the laser light with which the road surface is irradiated. The color or luminance may be changed depending on an output of the light emitter or may be changed by a filter of the optical system.
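The embodiment does not define a concrete software interface for the irradiation device 17, but the controllable properties described above (on/off of light emission, irradiation shape and position, color, and luminance) can be summarized in a hypothetical data structure. All names in the following Python sketch are assumptions made for illustration.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Shape(Enum):
    ARROW = auto()      # e.g., a display indicating a predicted path
    ICON = auto()       # e.g., a display indicating the type of a moving object
    STOP_LINE = auto()  # e.g., a display indicating a stop line


@dataclass
class IrradiationCommand:
    """Hypothetical request for one road-surface irradiation."""
    shape: Shape
    position_m: tuple[float, float]   # target point on the road surface
    color_rgb: tuple[int, int, int]   # set by the emitter output or an optical filter
    luminance: float                  # relative intensity, 0.0 to 1.0
    enabled: bool = True              # on/off of light emission
    icon_kind: str | None = None      # e.g., "motorcycle", used with Shape.ICON
```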
The irradiation device 17 may be a projector that projects an image or the like on the road surface positioned in the vehicle traveling direction. Further, the irradiation device 17 may be configured to cooperate with another irradiation device such as a headlight of the vehicle 1. For example, when the irradiation device 17 irradiates the road surface with the laser light in a desired display mode, the assistance apparatus 3 may perform control to reduce the light quantity of the headlight.
As illustrated in the drawings, the vehicle 1 includes the assistance apparatus 3 and a human machine interface (HMI) 21.
As illustrated in the drawings, the assistance apparatus 3 includes a central processing unit (CPU) 31, a read only memory (ROM) 32, a random access memory (RAM) 33, a hard disk drive (HDD) 34, and an interface (I/F) 35.
The HMI 21 is an interface for outputting a notification such as assistance information to a driver of the vehicle 1. The HMI 21 is provided, for example, around a driver's seat of the vehicle 1. The HMI 21 only needs to be able to output a predetermined notification so that the driver of the vehicle 1 can recognize it, and may be provided at another location, such as near a rear seat.
The HMI 21 may be a head mounted display (HMD) mounted on the head of the driver. Furthermore, the HMI 21 may be a projection type display device such as a head up display (HUD) that projects a video (virtual image) in front of the driver, for example, onto a display region provided on a windshield 180 or a dashboard 190. Furthermore, the HMI 21 is not limited to a device that displays a video, and may include other notification devices, such as a speaker and a horn, that output a notification sound, a warning sound, or a voice.
The CPU 31 is an arithmetic device that controls the entire assistance apparatus 3. The CPU 31 loads a program stored in the ROM 32 or the HDD 34 into the RAM 33 and executes the program, thereby implementing each processing described below.
The CPU 31 according to the embodiment is an example of a processor in the assistance apparatus 3. As the processor, another processor may be provided instead of the CPU 31 or in addition to the CPU 31. As such another processor, various processors such as a graphics processing unit (GPU), an application specific integrated circuit (ASIC), and a field programmable gate array (FPGA) can be used as appropriate.
The ROM 32 stores programs, parameters, and the like that implement various types of processing executed by the CPU 31.
The RAM 33 is, for example, a main storage device of the assistance apparatus 3, and temporarily stores data necessary for various types of processing executed by the CPU 31.
The HDD 34 stores various data, programs, and the like used by the assistance apparatus 3. As an example, the HDD 34 holds the past movement of a moving object as a monitoring target detected by an advanced driver assistance system (ADAS) sensor such as the sonar 15 or the all-around camera 16, a predicted path calculated for the monitoring target, determined irradiation contents, and the like. Various storage media and storage devices such as a solid state drive (SSD) and a flash memory can be used as appropriate instead of the HDD 34 or in addition to the HDD 34.
The I/F 35 is an interface for transmitting and receiving data. The I/F 35 receives data from another device provided on the vehicle 1, for example, the on-vehicle sensor such as the sonar 15 or the all-around camera 16. In addition, the I/F 35 transmits data to other devices provided on the vehicle 1, for example, the irradiation device 17 and the HMI 21.
The I/F 35 may acquire a signal from an accelerator sensor (not illustrated) that detects an operation amount of an accelerator pedal by the driver and a signal from a brake sensor (not illustrated) that detects an operation amount of a brake pedal by the driver, or operation amounts based on these signals.
The I/F 35 may transmit and receive information to and from another ECU mounted on the vehicle 1 via a controller area network (CAN) or the like on the vehicle 1, or may communicate with an information processing device outside the vehicle 1 via a network such as the Internet. As an example, the I/F 35 acquires vehicle information regarding a state of the vehicle 1, such as a vehicle speed pulse, various velocities including a yaw rate, an acceleration, position information, and shift information, from another ECU or various on-vehicle sensors of the vehicle 1 via the CAN, for example.
The detection unit 301 monitors movement of a moving object moving around an own vehicle and detects a movement trajectory. For example, the detection unit 301 acquires data via the I/F 35 from the on-vehicle sensor of the ADAS provided on the vehicle 1, such as the sonar 15 or the all-around camera 16. In addition, the detection unit 301 detects past movement of a monitoring target based on the acquired data.
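For illustration, the movement trajectory handled by the detection unit 301 can be modeled as a time series of observed positions accumulated from the on-vehicle sensor data. The following Python sketch uses hypothetical names (Observation, Trajectory) that do not appear in the embodiment.

```python
from dataclasses import dataclass, field


@dataclass
class Observation:
    t: float  # time stamp in seconds
    x: float  # position in meters, e.g., in an own-vehicle or world frame
    y: float


@dataclass
class Trajectory:
    """Past movement of one monitoring target (moving object)."""
    observations: list[Observation] = field(default_factory=list)

    def add(self, obs: Observation) -> None:
        self.observations.append(obs)

    def latest_velocity(self) -> tuple[float, float]:
        """Finite-difference velocity from the last two observations
        (assumes at least two observations have been accumulated)."""
        a, b = self.observations[-2], self.observations[-1]
        dt = b.t - a.t
        return ((b.x - a.x) / dt, (b.y - a.y) / dt)
```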
Here, the monitoring target is, for example, a moving object moving around the own vehicle, and may also include the own vehicle. A movement trajectory of the own vehicle may be acquired based on an output of a global navigation satellite system (GNSS) sensor such as a global positioning system (GPS) sensor or other on-vehicle sensors such as a wheel speed sensor, an inertial sensor, and an acceleration sensor.
In addition, the moving object is at least one of a vehicle other than the own vehicle or a pedestrian. More specifically, the moving object is at least one of a person such as a pedestrian or a mobility device that transports a person or an object, such as a bicycle or an automobile. The mobility device includes various vehicles that can move along a movement path provided on the ground, such as a bicycle, a motorcycle, an automobile, a kick scooter, and a senior car (electric mobility scooter). In addition, the mobility device may be driven by human power or may be driven using power of a prime mover or a motor. Further, the mobility device may be configured to be capable of autonomous driving.
The path prediction unit 302 acquires a predicted path of the moving object based on the movement trajectory of the moving object detected by the detection unit 301. In other words, the path prediction unit 302 predicts future movement of the moving object based on the past movement of the moving object detected by the detection unit 301.
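The embodiment does not fix a specific prediction algorithm. A minimal sketch, assuming a constant-velocity model applied to the hypothetical Trajectory type introduced above, could look as follows; real implementations would typically use richer motion models.

```python
def predict_path(trajectory: Trajectory,
                 horizon_s: float = 3.0,
                 step_s: float = 0.5) -> list[tuple[float, float]]:
    """Extrapolate future positions assuming the target keeps its latest velocity.

    Returns sampled points of the predicted path, which the irradiation
    content determination can later map onto the road surface.
    """
    last = trajectory.observations[-1]
    vx, vy = trajectory.latest_velocity()
    steps = int(horizon_s / step_s)
    return [(last.x + vx * step_s * k, last.y + vy * step_s * k)
            for k in range(1, steps + 1)]
```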
The determination unit 303 determines a collision risk between at least two moving objects moving around the own vehicle based on predicted paths of the at least two moving objects.
The irradiation content determination unit 304 determines a display mode related to the moving object on the road surface on the predicted path of the moving object based on the predicted path of the moving object moving around the own vehicle.
As an example, the display mode related to the moving object includes a display indicating the predicted path of the moving object.
As an example, the display mode related to the moving object includes a display indicating the type of the moving object. The type of the moving object indicates which of various moving objects, such as a vehicle other than the own vehicle or a pedestrian, the corresponding moving object is. For example, the type of the moving object includes at least one of “automobile”, “motorcycle”, “bicycle”, or “pedestrian”. In addition, the type of the moving object may indicate, for example, a “passenger car” or a “truck” among automobiles.
As an example, in a case where there is a collision risk between at least two moving objects, the display mode related to the moving object includes a display indicating a stop line for at least one of the at least two moving objects having the collision risk.
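Combining the display modes described above, the irradiation content determination can be sketched as follows, reusing the hypothetical IrradiationCommand structure from the earlier sketch; the placement, colors, and luminance values are illustrative assumptions.

```python
def determine_display_mode(kind: str,
                           path: list[tuple[float, float]],
                           has_collision_risk: bool) -> list[IrradiationCommand]:
    """Assemble the displays related to one moving object on its predicted path."""
    commands = [
        # Display indicating the predicted path, drawn toward its far end.
        IrradiationCommand(Shape.ARROW, path[-1], (255, 255, 255), 0.8),
        # Display indicating the type of the moving object, near the object itself.
        IrradiationCommand(Shape.ICON, path[0], (255, 255, 255), 0.8,
                           icon_kind=kind),
    ]
    if has_collision_risk:
        # Display indicating a stop line, placed on the near side of the path.
        commands.append(IrradiationCommand(Shape.STOP_LINE, path[0],
                                           (255, 0, 0), 1.0))
    return commands
```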
The irradiation control unit 305 controls irradiation of the road surface in front of the own vehicle or the moving object moving around the own vehicle with the laser light based on the display mode related to the moving object.
In the scenes illustrated in the drawings, a plurality of moving objects 501, 503, 505, and 507 are moving around the own vehicle. Among these, for example, the moving object 505 is an automobile, and the moving object 507 is a motorcycle.
As an example, the irradiation content determination unit 304 determines a display mode related to the moving object 507 on a road surface on a predicted path 403 of the moving object 507, based on the predicted path 403 of the moving object 507 moving around the own vehicle. Here, a display 601 including an arrow indicating the predicted path 403 of the moving object 507 and an icon indicating the type of the moving object 507 is an example of the display mode related to the moving object 507. As an example, the icon indicating the type of the moving object 507 is an icon indicating “motorcycle”.
As an example, the irradiation content determination unit 304 determines a display mode related to the moving object 505 on a road surface on a predicted path of the moving object 505, based on the predicted path of the moving object 505 moving around the own vehicle. Here, a display 603 including an arrow indicating the predicted path of the moving object 505 and an icon indicating the type of the moving object 505 is an example of the display mode related to the moving object 505. As an example, the icon indicating the type of the moving object 505 is an icon indicating “automobile (passenger car or four-wheeled vehicle)”.
Display of the displays 601 and 603, in other words, irradiation of the road surface to project the predicted path and the type, may be performed in a case where it is determined that a dangerous situation has occurred. In other words, the irradiation content determination unit 304 may determine the display mode including the predicted path and the type in a case where the dangerous situation has occurred. Alternatively, the irradiation control unit 305 may start control of irradiation of the road surface with laser light for implementing the display mode including the predicted path and the type in a case where the dangerous situation has occurred.
Here, the dangerous situation is a case where it is determined that there is a collision risk between at least two moving objects among the plurality of moving objects 501, 503, 505, and 507. As an example, in a case where the predicted paths intersect, the determination unit 303 determines that there is a collision risk between the moving objects whose predicted paths intersect each other.
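As one concrete realization of this intersection test (a sketch, not necessarily the method of the embodiment), predicted paths sampled as polylines can be checked segment by segment with a standard orientation test:

```python
Point = tuple[float, float]


def _cross(a: Point, b: Point, c: Point) -> float:
    """Cross product of vectors (b - a) and (c - a); its sign gives orientation."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])


def segments_intersect(p1: Point, p2: Point, q1: Point, q2: Point) -> bool:
    """Proper intersection test (collinear or merely touching cases are ignored)."""
    d1 = _cross(q1, q2, p1)
    d2 = _cross(q1, q2, p2)
    d3 = _cross(p1, p2, q1)
    d4 = _cross(p1, p2, q2)
    # The endpoints of each segment must lie on opposite sides of the other line.
    return d1 * d2 < 0 and d3 * d4 < 0


def collision_risk(path_a: list[Point], path_b: list[Point]) -> bool:
    """Judge a collision risk when two predicted paths (polylines) cross."""
    return any(
        segments_intersect(path_a[i], path_a[i + 1], path_b[j], path_b[j + 1])
        for i in range(len(path_a) - 1)
        for j in range(len(path_b) - 1)
    )
```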
As an example, in a case where it is determined that the dangerous situation has occurred, the irradiation content determination unit 304 determines the display mode related to the moving object 507, the display mode including a display 605 indicating a stop line for the moving object 507 on the road surface on the predicted path of the moving object 507.
As an example, in a case where it is determined that the dangerous situation has occurred, the irradiation content determination unit 304 determines the display mode related to the moving object 505, the display mode including a display 607 indicating a stop line for the moving object 505 on the road surface on the predicted path of the moving object 505.
Display of the displays 605 and 607, in other words, the stop lines, on the road surface may be performed regardless of whether or not it is determined that the dangerous situation has occurred. In other words, each of the displays 605 and 607 may be displayed on the road surface together with the displays 601 and 603. In this case, the determination unit 303 does not have to determine whether the dangerous situation has occurred. Alternatively, the determination unit 303 does not have to be provided in the assistance apparatus 3.
Next, a flow of the assistance processing executed by the assistance apparatus 3 configured as described above will be described.
The detection unit 301 monitors movement of a moving object other than the own vehicle (S101). The path prediction unit 302 predicts a path of the moving object (S102).
The irradiation content determination unit 304 determines an irradiation mode for displaying the predicted path on the road surface. Furthermore, the irradiation control unit 305 causes the irradiation device 17 to irradiate the road surface with laser light so as to project the predicted path of the moving object in the determined irradiation mode (S103).
The determination unit 303 determines whether traffic around the vehicle 1 is in the dangerous situation (S104). In a case where it is not determined that the traffic is in the dangerous situation (S104: No), the flow of this assistance processing ends.
On the other hand, in a case where it is determined that the traffic is in the dangerous situation (S104: Yes), the irradiation content determination unit 304 determines a display mode of displaying a stop line on the road surface. Furthermore, the irradiation control unit 305 causes the irradiation device 17 to irradiate the road surface with laser light to project the stop line for the moving object on the road surface in the determined display mode (S105). Thereafter, the flow of this assistance processing ends.
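Putting the sketches above together, the flow from S101 to S105 can be organized as one processing cycle. This again relies on the hypothetical names introduced earlier (Trajectory, predict_path, collision_risk, determine_display_mode, IrradiationCommand); the device method irradiate() is likewise an assumption rather than an interface of the embodiment.

```python
from dataclasses import dataclass


@dataclass
class Target:
    kind: str               # e.g., "automobile", "motorcycle", "bicycle", "pedestrian"
    trajectory: Trajectory  # past movement detected by the detection unit


def assistance_step(targets: dict[str, Target], device) -> None:
    """One cycle of the assistance processing (S101 to S105)."""
    # S101/S102: monitor each moving object and predict its path.
    paths = {tid: predict_path(t.trajectory) for tid, t in targets.items()}

    # S104: judge the dangerous situation from each pair of predicted paths.
    risky: set[str] = set()
    ids = list(paths)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if collision_risk(paths[a], paths[b]):
                risky.update((a, b))

    # S103/S105: determine the display mode and irradiate the road surface.
    for tid, t in targets.items():
        for command in determine_display_mode(t.kind, paths[tid], tid in risky):
            device.irradiate(command)
```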
The assistance apparatus 3 is not limited to the display mode related to the moving object moving around the own vehicle, and may irradiate the road surface with laser light in a display mode related to the own vehicle. In other words, the detection unit 301 may acquire a predicted path of the own vehicle based on the movement trajectory of the own vehicle. In addition, the irradiation content determination unit 304 may determine the display mode related to the own vehicle on the road surface on the predicted path of the own vehicle based on the predicted path of the own vehicle. Furthermore, the irradiation control unit 305 may control the irradiation of the road surface with the laser light based on the display mode related to the own vehicle.
The detection unit 301 may acquire the vehicle information such as the position, the speed, the movement direction, and the type of the moving object moving around the own vehicle by V2X communication such as vehicle-to-vehicle communication or road-to-vehicle communication.
Furthermore, the irradiation control unit 305 may transmit a control signal by, for example, V2X communication to cause not just the irradiation device 17 of the own vehicle but also the irradiation device 17 mounted on the moving object moving around the own vehicle or the irradiation device 17 installed on the road surface to irradiate the road surface with laser light.
The assistance apparatus 3 may perform control of sounding the horn instead of or in addition to the irradiation of the road surface to project the stop line.
As described above, the assistance apparatus 3 according to the embodiment predicts the future path of another vehicle from the past movement of the other vehicle, and reports the presence of the other vehicle by irradiating the road surface with laser light in the display mode based on the predicted path. In addition, in a case where it is determined that there is a risk, the assistance apparatus 3 according to the embodiment reports the risk not just to the own vehicle but also a risk between other vehicles, between another vehicle and a pedestrian, and the like.
With such a configuration, it is possible to notify the driver of the own vehicle and the driver of another vehicle of the presence of a motorcycle or the like that passes by the side of the own vehicle. Therefore, even in an environment with blind spots or low visibility, such as nighttime or traffic congestion, it is possible to improve safety not just for the own vehicle but also for another vehicle by preventing an unexpected accident.
In each embodiment described above, “determining whether it is A” may mean “determining that it is A”, “determining that it is not A”, or “determining whether or not it is A”.
The program executed by the assistance apparatus 3 of each embodiment described above is provided by being recorded in a computer-readable recording medium such as a CD-ROM, an FD, a CD-R, or a DVD as a file in an installable format or an executable format.
Further, the program executed by the assistance apparatus 3 according to each embodiment described above may be stored on a computer connected to a network such as the Internet and be provided by being downloaded via the network. Further, the program executed by the assistance apparatus 3 may be provided or distributed via a network such as the Internet.
In addition, the program executed by the assistance apparatus 3 according to each embodiment described above may be provided by being incorporated in the ROM or the like in advance.
In addition, the program executed by the assistance apparatus 3 according to each embodiment described above has a module configuration including the respective functional units (the detection unit 301, the path prediction unit 302, the determination unit 303, the irradiation content determination unit 304, and the irradiation control unit 305). As actual hardware, the CPU 31 reads the program from the ROM 32 or the HDD 34 and executes it, whereby the respective functional units are loaded into and generated on the RAM 33.
According to at least one embodiment described above, traffic safety in an environment with low visibility can be improved.
Although some embodiments of the present invention have been described, these embodiments have been presented as examples, and are not intended to limit the scope of the invention. These embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the gist of the invention. These embodiments and modifications thereof are included in the scope and gist of the invention and are included in the invention described in the claims and equivalents thereof.
The following technology is disclosed by the above description of the embodiment.
(1) An assistance method including: detecting a movement trajectory of a moving object moving around an own vehicle; acquiring a predicted path of the moving object based on the movement trajectory of the moving object; determining a display mode related to the moving object on a road surface on the predicted path of the moving object, based on the predicted path of the moving object; and controlling irradiation of the road surface with laser light based on the display mode related to the moving object.
(2) The assistance method according to (1), further including:
(3) The assistance method according to (1) or (2), wherein the display mode related to the moving object includes a display of the predicted path of the moving object.
(4) The assistance method according to any of (1) to (3), wherein the display mode related to the moving object includes a display of a type of the moving object.
(5) The assistance method according to (4), wherein the type of the moving object includes at least one of an automobile, a motorcycle, a bicycle, or a pedestrian.
(6) The assistance method according to any of (1) to (5), wherein the moving object is at least one of a vehicle other than the own vehicle or a pedestrian.
(7) The assistance method according to any of (1) to (6), further including:
(8) An assistance apparatus including: a detection unit configured to detect a movement trajectory of a moving object moving around an own vehicle; a path prediction unit configured to acquire a predicted path of the moving object based on the movement trajectory of the moving object; an irradiation content determination unit configured to determine a display mode related to the moving object on a road surface on the predicted path of the moving object, based on the predicted path of the moving object; and an irradiation control unit configured to control irradiation of the road surface with laser light based on the display mode related to the moving object.
(9) The assistance apparatus according to (8), further including:
(10) The assistance apparatus according to (8) or (9), in which the irradiation content determination unit is configured to determine the display mode related to the moving object including a display of the predicted path of the moving object.
(11) The assistance apparatus according to any of (8) to (10), in which the irradiation content determination unit is configured to determine the display mode related to the moving object including a display of a type of the moving object.
(12) The assistance apparatus according to (11), in which the type of the moving object includes at least one of an automobile, a motorcycle, a bicycle, or a pedestrian.
(13) The assistance apparatus according to any of (8) to (12), in which the moving object is at least one of a vehicle other than the own vehicle or a pedestrian.
(14) The assistance apparatus according to any of (8) to (13), further including a determination unit configured to determine a collision risk between at least two moving objects moving around the own vehicle based on predicted paths of the at least two moving objects,
(15) The assistance apparatus according to any of (8) to (14), further including an irradiation device configured to irradiate the road surface on the predicted path of the moving object with the laser light.
(16) A vehicle including:
(17) A program for causing a computer to execute the assistance method according to any of (1) to (7).
(18) A recording medium (computer program product) on which the program according to (17) to be executed by a computer is recorded.
Although the present embodiment and the modifications thereof have been described above, they have been presented as examples and are not intended to limit the scope of the invention. These novel embodiments and modifications can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the gist of the invention. The embodiments and modifications are included in the scope and gist of the invention, and are included in the invention described in the claims and equivalents thereof.
This application is a continuation of International Application No. PCT/JP2023/026084, filed on Jul. 14, 2023, which claims the benefit of priority of the prior Japanese Patent Application No. 2022-206045, filed on Dec. 22, 2022, the entire contents of which are incorporated herein by reference.