The present invention relates to a traffic safety assistance system.
In recent years, efforts to give consideration to vulnerable people among traffic participants and to provide them with access to a sustainable transportation system have been increasing. Toward this end, research and development of preventive safety technology, which further improves traffic safety and convenience, has been attracting attention.
For example, a technique disclosed in JP2021-165938A causes a display device of a first traffic participant (an own vehicle) to display a surrounding area image showing a surrounding area of the first traffic participant, and further display symbolic images indicating second traffic participants (other vehicles) at positions corresponding to the second traffic participants in the surrounding area image.
However, in JP2021-165938A, the second traffic participants are displayed using the same symbolic images (see FIG. 4 of JP2021-165938A). Accordingly, it is difficult to determine which second traffic participant in the actual space corresponds to which symbolic image in the surrounding area image. Further, in a case where a second traffic participant is hiding in a blind spot in the actual space (for example, a vehicle hiding behind a right-turning vehicle in the oncoming lane), it is difficult to determine which symbolic image in the surrounding area image corresponds to that second traffic participant.
In view of the above background, an object of the present invention is to provide a traffic safety assistance system including a display device that enables easy determination of the correspondence between the second traffic participant in the actual space and the appearance image in the surrounding area image, and thereby contributes to the development of a sustainable transportation system.
To achieve such an object, one aspect of the present invention provides a traffic safety assistance system (1), comprising: an acquiring device (12 to 15) configured to acquire information about a plurality of traffic participants (2 to 5) that exists in a target traffic area (A); a server device (18) configured to estimate prospective behavior of the plurality of traffic participants based on the information about the plurality of traffic participants acquired by the acquiring device; and a notifying device (12 to 15) configured to give notifications to the plurality of traffic participants based on an estimation result of the server device, wherein the plurality of traffic participants includes at least one first traffic participant (71) and at least one second traffic participant (72 to 74), the server device is configured to transmit an appearance image (92 to 94) of the second traffic participant corresponding to appearance of the second traffic participant to the first traffic participant, the notifying device includes a display device (84) provided in the first traffic participant, and the display device is configured to display a surrounding area image (91) showing a surrounding area of the first traffic participant, and display the appearance image of the second traffic participant, which is transmitted from the server device to the first traffic participant, at a position corresponding to the second traffic participant in the surrounding area image.
According to this aspect, the appearance image corresponding to the appearance of the second traffic participant is displayed at the position corresponding to the second traffic participant in the surrounding area image. Thus, it is possible to bring the appearance image of the second traffic participant in the surrounding area image closer to the actual appearance of the second traffic participant. Accordingly, it is possible to easily determine the correspondence between the second traffic participant in the actual space and the appearance image in the surrounding area image, and thereby contribute to the development of a sustainable transportation system.
In the above aspect, preferably, the server device is configured to transmit sound data to the first traffic participant together with the appearance image of the second traffic participant, the sound data being data on a sound generated by the second traffic participant, the notifying device includes a sound output device (85) provided in the first traffic participant, and the sound output device is configured to output a sound corresponding to the sound data while the display device is displaying the appearance image of the second traffic participant.
According to this aspect, the sound output device outputs a sound corresponding to the data on the sound generated by the second traffic participant while the display device displays the appearance image corresponding to the appearance of the second traffic participant. Accordingly, it is possible to more easily determine the correspondence between the second traffic participant in the actual space and the appearance image in the surrounding area image.
In the above aspect, preferably, the display device is configured to display a prescribed symbolic image at the position corresponding to the second traffic participant in the surrounding area image in a case where the server device cannot transmit the appearance image of the second traffic participant to the first traffic participant.
According to this aspect, even if the server device cannot transmit the appearance image of the second traffic participant to the first traffic participant, it is possible to show the position of the second traffic participant in the surrounding area image using the symbolic image.
In the above aspect, preferably, the server device includes a storage unit (61) configured to store appearance images of the plurality of traffic participants, and the server device is configured to transmit the appearance image of the second traffic participant read from the storage unit to the first traffic participant in a case where the storage unit stores the appearance image of the second traffic participant.
According to this aspect, since the server device reads the appearance image of the second traffic participant from its own storage unit, it is possible to promptly transmit the appearance image of the second traffic participant from the server device to the first traffic participant.
In the above aspect, preferably, the server device is configured to transmit the appearance image of the second traffic participant acquired from the second traffic participant to the first traffic participant in a case where the storage unit does not store the appearance image of the second traffic participant but the second traffic participant holds the appearance image of the second traffic participant.
According to this aspect, it is possible to increase the probability that the server device can transmit the appearance image of the second traffic participant to the first traffic participant.
In the above aspect, preferably, the server device is configured to transmit the appearance image of the second traffic participant acquired via an outside network to the first traffic participant in a case where the storage unit does not store the appearance image of the second traffic participant and the second traffic participant does not hold the appearance image of the second traffic participant.
According to this aspect, it is possible to further increase the probability that the server device can transmit the appearance image of the second traffic participant to the first traffic participant.
In the above aspect, preferably, the server device is configured to determine whether a risk is posed between the plurality of traffic participants, the at least one second traffic participant comprises a plurality of second traffic participants including: a plurality of risk parties that are included in the parties involved in the risk; and a non-risk party that is not included in the parties involved in the risk, and the display device is configured to display appearance images of the plurality of risk parties, which vary from one risk party to another, at positions corresponding to the plurality of risk parties, and display a prescribed symbolic image at a position corresponding to the non-risk party.
According to this aspect, it is possible to easily distinguish between the second traffic participants that are included in the parties involved in the risk and the second traffic participant that is not included in the parties involved in the risk.
In the above aspect, preferably, the first traffic participant is a first vehicle, the second traffic participant is a second vehicle, and the display device is configured to blink at least a portion of an appearance image of the second vehicle in a case where a blinker (75) of the second vehicle is operating.
According to this aspect, it is possible to confirm that the second vehicle is about to turn based on the appearance image of the second vehicle.
In the above aspect, preferably, the first traffic participant is a first vehicle, the second traffic participant is a second vehicle, and the server device is configured to transmit a three-dimensional image of the second vehicle corresponding to a vehicle model and color of the second vehicle to the first vehicle.
According to this aspect, it is possible to bring the appearance image of the second traffic participant in the surrounding area image even closer to the actual appearance of the second traffic participant. Accordingly, it is possible to more easily determine the correspondence between the second traffic participant in the actual space and the appearance image in the surrounding area image.
Thus, according to the above aspects, it is possible to provide a traffic safety assistance system including a display device that enables easy determination of the correspondence between the second traffic participant in the actual space and the appearance image in the surrounding area image.
In the following, a traffic safety assistance system 1 according to an embodiment of the present invention will be described with reference to the drawings.
With reference to
Each automobile device group 12 is provided in the corresponding automobile 2 and moves together with the corresponding automobile 2. Each automobile device group 12 includes an onboard actuator 21, an onboard sensor 22, an onboard navigation device 23, an onboard HMI 24, an onboard controller 25, an onboard communication device 26, and an onboard terminal 27. The components other than the onboard terminal 27 in each automobile device group 12 constitute a portion of the corresponding automobile 2.
The onboard actuator 21 includes a driving device that applies a driving force to the automobile 2, a brake device that applies a braking force to the automobile 2, and a steering device that steers wheels of the automobile 2. The driving device consists of an internal combustion engine and/or an electric motor.
The onboard sensor 22 detects various conditions related to the automobile 2. The onboard sensor 22 includes an external environment sensor that detects the state of an external environment of the automobile 2, a vehicle sensor that detects the state of the automobile 2, and a driver sensor that detects the state of a driver of the automobile 2. The external environment sensor includes an external environment camera that captures an image of an object (a delimiting line, an obstacle, another vehicle, and the like) existing around the automobile 2, a radar that detects a position of the object existing around the automobile 2 using radio waves such as millimeter waves, and a lidar (LiDAR) that detects the position of the object existing around the automobile 2 using light such as infrared rays. The vehicle sensor includes a vehicle speed sensor that detects a vehicle speed of the automobile 2. The driver sensor includes a driver camera that captures an image of the driver and a biological sensor that detects biological information (pulse, breathing, skin potential, and the like) about the driver.
The onboard navigation device 23 is a device that provides route guidance to the destination of the automobile 2, and the like. The onboard navigation device 23 stores map information. The onboard navigation device 23 identifies the current position of the automobile 2 based on GNSS signals received from artificial satellites.
The onboard HMI 24 gives notifications to the driver of the automobile 2. For example, the onboard HMI 24 includes a touch panel and a sound output device. The touch panel displays various screens to the driver. The sound output device outputs audio guidance, warning sounds, and the like to the driver.
The onboard controller 25 is an electronic control unit (ECU) consisting of one or more computers configured to execute various processes. The onboard controller 25 includes an arithmetic processing unit (a processor such as a CPU or an MPU) and a storage device (memory such as ROM and RAM). The onboard controller 25 is connected to each component of the automobile 2 and controls each component of the automobile 2. The onboard controller 25 controls the traveling of the automobile 2 by controlling the onboard actuator 21. The onboard controller 25 recognizes the object existing around the automobile 2 based on the detection result of the external environment sensor of the onboard sensor 22.
The onboard communication device 26 is connected to the server device 18 via a wireless communication network. The onboard communication device 26 transmits to the server device 18 various states related to the automobile 2 detected by the onboard sensor 22 and the current position of the automobile 2 identified by the onboard navigation device 23. The onboard communication device 26 receives traffic safety assistance information (details thereof will be described later) from the server device 18.
The onboard terminal 27 is a portable terminal carried by the driver of the automobile 2. The onboard terminal 27 is an electronic control unit (ECU) consisting of one or more computers configured to execute various processes. The onboard terminal 27 includes an arithmetic processing unit (a processor such as a CPU or an MPU) and a storage device (memory such as ROM and RAM). For example, the onboard terminal 27 consists of a wearable device or a smartphone. The onboard terminal 27 is connected to the server device 18 via the wireless communication network.
Each motorcycle device group 13 is provided in the corresponding motorcycle 3 and moves together with the corresponding motorcycle 3. With reference to
Each bicycle terminal 14 is a portable terminal carried by the driver of the corresponding bicycle 4 and moves together with the corresponding bicycle 4. For example, each bicycle terminal 14 consists of a wearable device or a smartphone.
Each bicycle terminal 14 includes a terminal sensor 41, a terminal navigation device 42, a terminal HMI 43, a terminal controller 44, and a terminal communication device 45.
The terminal sensor 41 is installed in each bicycle terminal 14 and detects the state of the bicycle 4. The terminal sensor 41 includes an acceleration sensor 41A that detects acceleration of the bicycle 4 and a gyro sensor 41B that detects an angular velocity of the bicycle 4.
The terminal navigation device 42 stores map information. The terminal navigation device 42 identifies the current position of the bicycle 4 based on GNSS signals received from artificial satellites.
The terminal HMI 43 gives notifications to the driver of the bicycle 4. For example, the terminal HMI 43 includes a touch panel and a sound output device. The touch panel displays various screens to the driver. The sound output device outputs audio guidance, warning sounds, and the like to the driver.
The terminal controller 44 is an electronic control unit (ECU) consisting of one or more computers configured to execute various processes. The terminal controller 44 includes an arithmetic processing unit (a processor such as a CPU or an MPU) and a storage device (memory such as ROM and RAM).
The terminal communication device 45 is connected to the server device 18 via the wireless communication network. The terminal communication device 45 transmits to the server device 18 the state of the bicycle 4 detected by the terminal sensor 41 and the current position of the bicycle 4 identified by the terminal navigation device 42. The terminal communication device 45 receives the traffic safety assistance information (details thereof will be described later) from the server device 18.
Each pedestrian terminal 15 is a portable terminal carried by the corresponding pedestrian 5 and moves together with the corresponding pedestrian 5. For example, each pedestrian terminal 15 consists of a wearable device or a smartphone.
Each pedestrian terminal 15 includes a terminal sensor 51, a terminal navigation device 52, a terminal HMI 53, a terminal controller 54, and a terminal communication device 55. The terminal sensor 51 includes an acceleration sensor 51A that detects acceleration of the pedestrian 5 and a gyro sensor 51B that detects an angular velocity of the pedestrian 5. The components of each pedestrian terminal 15 are the same as the components of each bicycle terminal 14, so the description thereof will be omitted.
Each infrastructural camera 16 is provided in the target traffic area A. Each infrastructural camera 16 captures images of the plurality of traffic participants and the plurality of infrastructures in the target traffic area A. Each infrastructural camera 16 is connected to the server device 18 via the wireless communication network. Each infrastructural camera 16 transmits the captured images of the traffic participants and the infrastructures to the server device 18.
The signal controller 17 controls each traffic signal 9 installed in the target traffic area A. The signal controller 17 acquires information about each traffic signal 9 installed in the target traffic area A. The signal controller 17 is connected to the server device 18 via the wireless communication network. The signal controller 17 transmits the acquired information about each traffic signal 9 to the server device 18.
The server device 18 is an electronic control unit (ECU) consisting of one or more computers configured to execute various processes. The server device 18 includes an arithmetic processing unit (a processor such as a CPU or an MPU) and a storage device (memory such as ROM and RAM). The server device 18 consists of a virtual server provided in the cloud. In another embodiment, the server device 18 may consist of a physical server installed in, for example, a company that operates the traffic safety assistance system 1.
The server device 18 includes a database unit 61, a behavior estimating unit 62, a risk determining unit 63, an assistance information generating unit 64, and an assistance information transmitting unit 65.
The database unit 61 stores map information (hereinafter referred to as “the area map information”) about the target traffic area A. The area map information includes information about each roadway 6 (for example, information about the width and number of lanes in each roadway 6) in the target traffic area A, information about each sidewalk 7 (for example, information about the width of each sidewalk 7) in the target traffic area A, and information about each crosswalk 8 (for example, information about the position of each crosswalk 8) in the target traffic area A.
The behavior estimating unit 62 recognizes the state of the target traffic area A (which includes the state of the traffic participants existing in the target traffic area A) based on the area map information stored in the database unit 61 and the information transmitted from the onboard communication device 26 of each automobile device group 12, the onboard communication device 36 of each motorcycle device group 13, the terminal communication device 45 of each bicycle terminal 14, the terminal communication device 55 of each pedestrian terminal 15, each infrastructural camera 16, and the signal controller 17. The behavior estimating unit 62 estimates the prospective behavior (for example, the prospective trajectory) of the plurality of traffic participants based on the state of the target traffic area A. More specifically, the behavior estimating unit 62 constructs a virtual space corresponding to the target traffic area A based on the state of the target traffic area A, and estimates the prospective behavior of the plurality of traffic participants by performing a simulation in the virtual space.
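Purely by way of illustration, the following is a minimal sketch of such a trajectory estimation, assuming a simple constant-velocity model in place of the virtual-space simulation described above; the names ParticipantState and estimate_trajectory, as well as the horizon and step parameters, are hypothetical and not part of the embodiment.

```python
from dataclasses import dataclass

@dataclass
class ParticipantState:
    """Hypothetical snapshot of one traffic participant: position (m) and velocity (m/s)."""
    x: float
    y: float
    vx: float
    vy: float

def estimate_trajectory(state: ParticipantState, horizon_s: float = 5.0, step_s: float = 0.5):
    """Extrapolate a prospective trajectory under a constant-velocity assumption.

    Returns a list of (time, x, y) samples over the prediction horizon; the
    embodiment instead performs a simulation in a virtual space, so this only
    illustrates the input/output shape of such an estimation.
    """
    samples = []
    t = 0.0
    while t <= horizon_s:
        samples.append((t, state.x + state.vx * t, state.y + state.vy * t))
        t += step_s
    return samples
```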
The risk determining unit 63 determines whether a risk (for example, a possibility of contact) is posed between the plurality of traffic participants based on the prospective behavior of the plurality of traffic participants estimated by the behavior estimating unit 62. For example, when the prospective trajectory of one of the traffic participants overlaps with the prospective trajectory of another of the traffic participants, the risk determining unit 63 determines that the risk is posed between these two traffic participants.
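A correspondingly minimal sketch of this overlap determination is given below, reusing the (time, x, y) trajectory samples from the previous sketch; the 2-meter threshold and the function name trajectories_overlap are illustrative assumptions, not values defined by the embodiment.

```python
import math

def trajectories_overlap(traj_a, traj_b, threshold_m: float = 2.0) -> bool:
    """Return True if the two prospective trajectories (lists of (time, x, y)
    samples) pass within threshold_m of each other at the same time step,
    i.e. a risk (possibility of contact) is posed between the participants."""
    for (_, xa, ya), (_, xb, yb) in zip(traj_a, traj_b):
        if math.hypot(xa - xb, ya - yb) < threshold_m:
            return True
    return False
```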
The assistance information generating unit 64 generates the traffic safety assistance information (hereinafter abbreviated as "the assistance information") based on the determination result of the risk determining unit 63. The assistance information is information for notifying the traffic participants that are the parties involved in the risk (hereinafter referred to as "the risk parties") of the presence of the risk. The assistance information includes information about the risk parties (for example, information about the locations of the risk parties) and information about the contents of the risk (for example, information about the location where the risk is posed).
The assistance information transmitting unit 65 transmits the assistance information generated by the assistance information generating unit 64 to the risk parties. For example, in a case where one automobile 2 in the target traffic area A is included in the risk parties, the assistance information transmitting unit 65 transmits the assistance information to the onboard communication device 26 provided in the one automobile 2. Accordingly, the onboard HMI 24 provided in the one automobile 2 notifies the driver of the one automobile 2 of the assistance information.
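As a hedged illustration of this flow from risk determination to notification, the sketch below builds assistance information for each pair of risk parties and dispatches it to their communication devices; the AssistanceInfo structure and the send() interface are hypothetical stand-ins for the devices described above, not an actual API of the embodiment.

```python
from dataclasses import dataclass
from typing import Any, Dict, List, Tuple

@dataclass
class AssistanceInfo:
    """Hypothetical assistance information: the risk parties and where the risk is posed."""
    risk_party_ids: Tuple[str, str]
    risk_location: Tuple[float, float]  # point where the prospective trajectories overlap

def notify_risk_parties(risk_pairs: List[Tuple[str, str]],
                        risk_locations: List[Tuple[float, float]],
                        transmitters: Dict[str, Any]) -> None:
    """Build assistance information for each determined risk and transmit it
    to every risk party's communication device (onboard or terminal)."""
    for pair, location in zip(risk_pairs, risk_locations):
        info = AssistanceInfo(risk_party_ids=pair, risk_location=location)
        for party_id in pair:
            transmitters[party_id].send(info)  # send() is an assumed transmit interface
```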
As described above, each automobile device group 12 (more specifically, the onboard sensor 22 and the onboard navigation device 23), each motorcycle device group 13 (more specifically, the onboard sensor 32 and the onboard navigation device 33), each bicycle terminal 14 (more specifically, the terminal sensor 41 and the terminal navigation device 42), and each pedestrian terminal 15 (more specifically, the terminal sensor 51 and the terminal navigation device 52) acquire the information about the plurality of traffic participants existing in the target traffic area A. The server device 18 estimates the prospective behavior of the plurality of traffic participants based on the information about the plurality of traffic participants acquired by these devices. Each automobile device group 12 (more specifically, the onboard HMI 24), each motorcycle device group 13 (more specifically, the onboard HMI 34), each bicycle terminal 14 (more specifically, the terminal HMI 43), and each pedestrian terminal 15 (more specifically, the terminal HMI 53) give notifications to the plurality of traffic participants based on the estimation result of the server device 18. Thus, the traffic safety assistance system 1 assists the plurality of traffic participants in safe traffic.
In a case where the traffic participant is a vehicle (for example, the automobile 2, the motorcycle 3, or the bicycle 4), a notification to the traffic participant means a notification to an occupant (for example, a driver) of the traffic participant. On the other hand, in a case where the traffic participant is a human being (for example, the pedestrian 5), a notification to the traffic participant means a notification to the traffic participant himself/herself.
With reference to
With reference to
The first automobile 71 is provided with a speaker 85 (an example of a sound output device) at any location such as the dashboard 82 or a door (not shown). The speaker 85 is a device that outputs audio guidance, warning sounds, and the like to the driver of the first automobile 71. The speaker 85 is one of the components of the onboard HMI 24 of the first automobile 71.
With reference to
The surrounding area image 91 is a bird's-eye image (three-dimensional image) of the surrounding area of the first automobile 71 as viewed from a point above the first automobile 71, looking forward and downward. In another embodiment, the surrounding area image 91 may be a look-down image (two-dimensional image) of the first automobile 71 and its surrounding area as viewed from directly above.
First to third appearance images 92 to 94 are displayed at positions corresponding to the surrounding vehicles 72 to 74 in the surrounding area image 91. The first appearance image 92 is an appearance image of the second automobile 72, the second appearance image 93 is an appearance image of the third automobile 73, and the third appearance image 94 is an appearance image of the motorcycle 74. Hereinafter, the first to third appearance image(s) 92 to 94 will be collectively referred to as “the appearance image(s) 92 to 94”. The meter panel 84 may determine the positions of the appearance images 92 to 94 in the surrounding area image 91 based on the position information about the surrounding vehicles 72 to 74 transmitted from the surrounding vehicles 72 to 74 to the first automobile 71 via the server device 18.
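One conceivable way for the meter panel 84 to derive such display positions from the transmitted position information is sketched below, assuming a simple top-down mapping from coordinates relative to the first automobile 71 to pixel coordinates; the screen size, scale, and own-vehicle anchor point are invented for illustration.

```python
def to_screen_position(rel_x_m: float, rel_y_m: float,
                       screen_w: int = 800, screen_h: int = 480,
                       px_per_m: float = 8.0):
    """Map a surrounding vehicle's position relative to the own vehicle
    (x: lateral, y: longitudinal, in meters) to a pixel position in the
    surrounding area image, with the own vehicle near the bottom center."""
    cx, cy = screen_w / 2, screen_h * 0.8  # own-vehicle anchor (assumed layout)
    return (int(cx + rel_x_m * px_per_m), int(cy - rel_y_m * px_per_m))
```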
The appearance images 92 to 94 of the surrounding vehicles 72 to 74 are stored in the database unit 61 (an example of the storage unit) of the server device 18 and transmitted from the server device 18 to the first automobile 71 together with the position information about the surrounding vehicles 72 to 74. In a case where the appearance image of at least one of the surrounding vehicles 72 to 74 cannot be transmitted from the server device 18 to the first automobile 71, the meter panel 84 may display a prescribed symbolic image at a position corresponding to that surrounding vehicle in the surrounding area image 91.
The appearance images 92 to 94 of the surrounding vehicles 72 to 74 are three-dimensional images corresponding to the vehicle models and colors of the surrounding vehicles 72 to 74 (an example of the appearance of the surrounding vehicles 72 to 74). Accordingly, the shapes and colors of the appearance images 92 to 94 vary from one of the surrounding vehicles 72 to 74 to another. In another embodiment, in a case where the surrounding area image 91 is a look-down image (two-dimensional image), the appearance images 92 to 94 of the surrounding vehicles 72 to 74 may be two-dimensional images.
In a case where a blinker 75 of any of the surrounding vehicles 72 to 74 is operating, the meter panel 84 blinks at least a portion of the appearance image of that surrounding vehicle in the surrounding area image 91.
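A minimal sketch of such blinking logic, assuming the display is refreshed with a current timestamp, is as follows; the one-second blink period is an illustrative assumption rather than a value specified by the embodiment.

```python
def blinker_highlight_on(time_s: float, period_s: float = 1.0) -> bool:
    """Return True during the first half of each blink period, so that the
    blinking portion of an appearance image is alternately highlighted and
    dimmed while the corresponding vehicle's blinker 75 is operating."""
    return (time_s % period_s) < (period_s / 2)
```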
While the meter panel 84 is displaying the appearance images 92 to 94 of the surrounding vehicles 72 to 74, the speaker 85 outputs simulated sounds of the surrounding vehicles 72 to 74. The simulated sounds of the surrounding vehicles 72 to 74 correspond to the sound data on the surrounding vehicles 72 to 74 (data on sounds generated by the surrounding vehicles 72 to 74) transmitted from the server device 18 to the first automobile 71. For example, the simulated sounds of the surrounding vehicles 72 to 74 are sounds that reproduce the output sounds of the driving devices (internal combustion engines and/or electric motors) of the surrounding vehicles 72 to 74. The tones and volumes of the simulated sounds vary from one of the surrounding vehicles 72 to 74 to another.
Next, image transmission control executed by the server device 18 will be described with reference to
When the image transmission control is started, the server device 18 determines whether the database unit 61 stores the appearance image of the second automobile 72 (step ST1).
In a case where the database unit 61 stores the appearance image of the second automobile 72 (step ST1: Yes), the server device 18 transmits the appearance image of the second automobile 72 read from the database unit 61 to the first automobile 71 (step ST2).
In a case where the database unit 61 does not store the appearance image of the second automobile 72 (step ST1: No), the server device 18 determines whether the second automobile 72 holds the appearance image of the second automobile 72 (i.e., the appearance image of the own vehicle) (step ST3).
In a case where the second automobile 72 holds the appearance image of the second automobile 72 (step ST3: Yes), the server device 18 acquires the appearance image of the second automobile 72 from the second automobile 72, and transmits the appearance image of the second automobile 72 acquired from the second automobile 72 to the first automobile 71 (step ST4).
In a case where the second automobile 72 does not hold the appearance image of the second automobile 72 (step ST3: No), the server device 18 determines whether the appearance image of the second automobile 72 exists on the Internet (an example of the outside network) (step ST5).
In a case where the appearance image of the second automobile 72 exists on the Internet (step ST5: Yes), the server device 18 acquires the appearance image of the second automobile 72 via the Internet, and transmits the appearance image of the second automobile 72 acquired via the Internet to the first automobile 71 (step ST6).
In a case where the appearance image of the second automobile 72 does not exist on the Internet (step ST5: No), the server device 18 ends the image transmission control without transmitting the appearance image of the second automobile 72 to the first automobile 71.
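The above control flow (steps ST1 to ST6) can be summarized by the following sketch; the database, vehicle-query, and Internet-search interfaces are hypothetical stand-ins for the components described above, not a defined API of the server device 18.

```python
def transmit_appearance_image(server, second_vehicle_id, first_vehicle):
    """Image transmission control (steps ST1 to ST6): try the database unit
    first, then the second vehicle itself, then the Internet; if no source
    holds the appearance image, end without transmitting anything."""
    image = server.database.get(second_vehicle_id)         # ST1: storage unit
    if image is None:
        image = server.query_vehicle(second_vehicle_id)    # ST3: ask the vehicle itself
    if image is None:
        image = server.search_internet(second_vehicle_id)  # ST5: outside network
    if image is not None:
        server.send(first_vehicle, image)                  # ST2 / ST4 / ST6
```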
As shown in
In a situation shown in
In a situation shown in
In a situation shown in
In the above embodiment, as shown in
In the above embodiment, as shown in
In the above embodiment, the automobile 2 (the first automobile 71) is an example of the first traffic participant. In another embodiment, the traffic participant (for example, the motorcycle 3, the bicycle 4, or the pedestrian 5) other than the automobile 2 may be an example of the first traffic participant.
In the above embodiment, the automobiles 2 (the second and third automobiles 72, 73) and the motorcycle 3 (the motorcycle 74) are examples of the second traffic participants. In another embodiment, the traffic participants (for example, the bicycle 4 or the pedestrian 5) other than the automobiles 2 and the motorcycle 3 may be examples of the second traffic participant. In a case where the bicycle 4 or the pedestrian 5 is an example of the second traffic participant, an image of the bicycle 4 or the pedestrian 5 captured by the camera of the bicycle terminal 14 or the pedestrian terminal 15 may be used as the appearance image of the second traffic participant.
In the above embodiment, the automobiles 2, the motorcycles 3, the bicycles 4, and the pedestrians 5 are examples of the traffic participants. In another embodiment, mobile objects (for example, two-wheeled self-balancing transporters, ships, aircraft, and the like) other than the automobiles 2, the motorcycles 3, the bicycles 4, and the pedestrians 5 may be examples of the traffic participants.
Concrete embodiments of the present invention have been described in the foregoing, but the present invention should not be limited by the foregoing embodiments and various modifications and alterations are possible within the scope of the present invention.
This application claims priority from Japanese Patent Application No. 2023-057786, filed in March 2023.