The present disclosure relates to an information presentation method, an information presentation device, a vehicle control method, and a vehicle control device.
There has been a method of estimating a state of a driver riding on a mobile body (see, for example, JP 2021-130390 A).
However, such conventional techniques require further improvement.
An information presentation method according to one aspect of the present disclosure is a method of presenting information to a driver of a first vehicle. The information presentation method includes detecting one or more objects located in front of the first vehicle via at least one first sensor serving to sense an outside of the first vehicle, and determining a risk of each of the one or more objects. The information presentation method includes outputting alert sound for alerting a first dangerous object via one or more speakers provided in an interior of the first vehicle in response to determining, based on the risk of each of the one or more objects, that the first dangerous object is present among the one or more objects. The first dangerous object corresponds to a first risk exceeding a predetermined level. The alert sound is presented to the driver as a sound image localized at a first sound image position between the driver and the first dangerous object. The risk relates to a future collision risk between each of the one or more objects and the first vehicle. The information presentation method includes changing a sound image position of the sound image from the first sound image position to a second sound image position when the risk of the first dangerous object rises from the first risk to a second risk. The second sound image position is a position between the driver and the first dangerous object and closer to the driver than the first sound image position.
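As a rough sketch of the sound image behavior described in this aspect, the position of the localized alert sound can be modeled as a point on the segment between the driver and the dangerous object that moves toward the driver as the risk rises. The linear mapping, the interpolation fractions, and all names below are illustrative assumptions, not specifics of the disclosure.

```python
import numpy as np

def sound_image_position(driver_pos, object_pos, risk, risk_low=0.5, risk_high=1.0):
    """Place the alert's sound image on the segment between the driver and the
    dangerous object; a higher risk moves the image closer to the driver.

    `risk_low` is the level at which the alert starts (first sound image
    position) and `risk_high` the level at which the image is nearest the
    driver. The linear mapping is an illustrative assumption.
    """
    driver_pos = np.asarray(driver_pos, dtype=float)
    object_pos = np.asarray(object_pos, dtype=float)
    # t = 0 -> first sound image position, t = 1 -> position nearest the driver
    t = np.clip((risk - risk_low) / (risk_high - risk_low), 0.0, 1.0)
    # Interpolate from a first position (e.g., 60% of the way to the object)
    # toward a second position close to the driver (e.g., 20% of the way).
    far_frac, near_frac = 0.6, 0.2
    frac = far_frac + (near_frac - far_frac) * t
    return driver_pos + frac * (object_pos - driver_pos)

# First risk -> first sound image position; second (higher) risk -> closer image.
p1 = sound_image_position([0, 0], [10, 5], risk=0.5)  # [6.0, 3.0]
p2 = sound_image_position([0, 0], [10, 5], risk=0.9)  # [2.8, 1.4]
```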
The environment surrounding our daily life is increasingly digitized.
For example, many people own smartphones, which are information terminals dedicated to individuals, and have come to install and use on them various application programs (hereinafter simply called "applications"), such as an application for managing the user's health and a social communication application for communicating with other people.
The present disclosure discloses a technique for assisting a user so that the user can live a healthy, happy, comfortable, convenient, reassuring, safe, pleasant, economical, and reasonable life by causing the following to operate cooperatively: a smartphone, which is an information terminal having various information processing capabilities; an application that operates on the smartphone; a computer resource (hereinafter referred to as a cloud) that is connected via a network and manages and provides various information; a mobile body (hereinafter referred to as a vehicle) that has an advanced information processing capability for assisting safe driving of the user; and the application operating on the vehicle.
Note that the present disclosure can also be implemented as a program for causing a computer to execute characteristic configurations included in the control method used herein or as a system that operates in accordance with the program. It goes without saying that such a computer program can be distributed via a computer-readable non-transitory recording medium such as an SD (Secure Digital) card or via a communication network such as the Internet.
Note that all the embodiments described below illustrate specific examples of the present disclosure. Numerical values, shapes, constituent elements, steps, the order of steps, and the like described in the following embodiments are examples and do not limit the present disclosure. Among the constituent elements in the following embodiments, constituent elements not described in the independent claims indicating the most superordinate concept are described as optional constituent elements. The contents of all the embodiments can be combined.
In our society, it is predicted that the Internet will become even more widespread in the future and that various sensors will become commonplace. Accordingly, it is predicted that information ranging from an individual's internal state and activities to an entire town, including buildings and transportation networks, will be digitized and made available in computer systems. Digitized data concerning individuals (personal information) will be securely managed as big data in a cloud server of an information bank or the like via a communication network and will be used for various purposes for individuals and society.
Such an advanced information society is called Society 5.0 in Japan. The advanced information society is a society in which economic development and the solution of social problems are expected by an information base (a cyber-physical system) in which a real space (a physical space), which is the material world surrounding individuals, and a virtual space (a cyberspace), in which computers cooperatively perform various kinds of processing concerning the physical space, are highly fused.
In such an advanced information society, by analyzing communication (including acquisition and provision of information and an expression method for the information) and behavior in various daily scenes performed by an individual and analyzing big data including accumulated personal information, it is possible to provide information and services necessary for the individual with a communication method considered to be optimal for the individual corresponding to the scene.
In the following description, a specific embodiment that provides a safe and comfortable movement experience in an advanced information society in which such a cyber-physical system operates is described.
In addition, the vehicle 1 and the information terminal 2 can directly perform wireless communication with a device present in a short distance by using near field communication 6 such as Wi-Fi (registered trademark), Bluetooth (registered trademark), or UWB, which is an ultra-wideband wireless communication standard.
An electronic key 7 for using the vehicle 1 and a digital driver's license 8, which is a driver's license of the user, are stored in the information terminal 2. The electronic key 7 necessary for using the vehicle 1 is acquired by the information terminal 2 by communicating with a vehicle management cloud 10 via the Internet. Conditions necessary for the user to drive are also described in the digital driver's license 8.
Further, the information terminal 2 may include a personal data store (hereinafter referred to as PDS) that aggregates the user's personal information, information about driving, and the like and that manages sharing with third parties based on personal permission, or an application that provides the function of an information bank mediating the distribution of social data in this way.
The Internet includes a personal information management cloud 9 that provides functions of an information bank and a PDS. Personal information of the user, information about driving and the like, and the like are aggregated and managed in the personal information management cloud 9. The use by a third party is managed based on the personal permission.
Note that, as described above, the equivalent functions are sometimes provided by a smartphone. Therefore, in the following description of the present disclosure, it is assumed that such personal information, information about driving and the like, and the like are managed by the information terminal 2 and/or the personal information management cloud 9.
In the present disclosure, these pieces of information may be managed by either the information terminal 2 or the personal information management cloud 9. In particular, information relating to driving may be accumulated in a memory in a vehicle and use by a third party may be managed by an application of the vehicle 1 in the same manner as the information terminal 2 and the personal information management cloud 9.
The vehicle management cloud 10 operates in cooperation with the vehicle 1 such that the vehicle 1 can be used with the electronic key 7 linked with the vehicle 1. The vehicle management cloud 10 also acquires, sets, updates, and manages information about a use status of the vehicle 1 and setting of a safe driving function in cooperation with an application executed by an arithmetic operation unit 106 of the vehicle 1.
A third party cloud 11 is a cloud for a third party to provide services relating to a user and/or a vehicle. For example, the third party cloud is a cloud for implementing services provided by various third parties based on a use status accumulated in the vehicle such as a vehicle management service that proposes replacement of consumables, an insurance service that proposes update of a vehicle insurance, and an administrative service that determines a place having a high accident risk and performs road maintenance.
The vehicle 1 includes an information presentation device and a vehicle control device. The information presentation device and the vehicle control device include, for example, a processor and a memory. The processor executes a program stored in the memory to implement at least one functional block (see the vehicle 1 in the figure).
The information terminal 2 includes a sensor unit 201 for acquiring video information, audio information, and/or physical quantities of the surrounding environment, an information input/output unit 202 that performs input and output of information such as video and sound with the user, an operation unit 203 that receives button pressing, touch operation, and the like from the user, an arithmetic operation unit 204 that performs various calculations and information processing such as information drawing performed in the information terminal 2, a memory 205 that retains data and files to be used by the arithmetic operation unit 204, and a communication unit 206 for communicating with other computers on a communication network.
When an application for performing key management for using the vehicle 1 with the electronic key 7, an application for managing collected personal information and information relating to driving and the like, or the like is installed in the information terminal 2, a program included in the application and necessary data are recorded in the memory 205 of the information terminal 2 and the program is executed by the arithmetic operation unit 204.
Note that the information terminal 2 is described as a smartphone but is not limited to this. The information terminal 2 may take a form such as a wristwatch-type smartwatch, glasses-type smart glasses, a ring-type smart ring, a smart speaker operated by voice, or a robot including a movable unit.
The personal information management cloud 9, the vehicle management cloud 10, and the third party cloud 11 each include a communication unit 901 for communicating with other computers on a communication network, a memory 902 that records information about the vehicle and the user and a management program for the information, and an arithmetic operation unit 903 that performs various data processing. Note that these clouds, the vehicle 1, and the information terminal 2 may communicate through communication means other than the Internet of the wide-area communication network 5. For example, the near field communication 6 may be used for the unlocking processing performed between the vehicle 1 and the information terminal 2.
In order to simplify the description, it is assumed that a risk is determined for the vehicle 1 based on the distance from the point Oc to an object around the vehicle, and that the risk is determined to be a predetermined risk (for example, "high") when an object approaches within a radius Rc around Oc. In the following description, a moving route is described as information including the current position and the moving speed (the speed and the direction) of the moving object.
In an example at t=t0 in the figure, an initial state is drawn in which the vehicle 1 is beginning a right turn and the bicycle 12 is traveling straight toward it at a distance D greater than Rc+Rb.
The sensor unit 103 of the vehicle 1 uses a sensor to identify a moving object such as a person, a bicycle, or a vehicle in addition to a stationary object such as a road, a traffic light, or a sign around the vehicle and detects the position and the moving speed (the speed and the direction) of the moving object. The arithmetic operation unit 106 of the vehicle 1 monitors and evaluates a situation around the vehicle and a current risk in real time based on data acquired by these sensors.
In an example at t=t1 in the figure, illustrating a situation after time has elapsed from time t0, a state is drawn in which the vehicle 1 has progressed in its right turn and the bicycle 12 has traveled straight so that the distance D has decreased, while still exceeding Rc+Rb.
In an example at t=t2, illustrating a situation at time t2 after time has elapsed from time t1, a state is drawn in which the vehicle 1 has further advanced its right turn and the bicycle 12 has traveled straight so that the distance has decreased to D=Rc+Rb. The sensor unit 103 of the vehicle 1 sequentially detects this situation and continues to update the memory 107.
The arithmetic operation unit 106 sequentially calculates these values to grasp the situation around the vehicle, and risk determination is performed as time elapses in this way. Based on the determination, the arithmetic operation unit 106 notifies the driver via the information input/output unit 104 and instructs the movable unit 101 to perform emergency braking of the vehicle in order to avoid an accident.
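The distance-based monitoring above might be sketched as follows. The single radius Rc follows the simplification introduced earlier, while the function names, the loop structure, and the numeric values are illustrative assumptions.

```python
import math

RC = 5.0  # radius [m] around the front center point Oc within which the risk is "high"

def distance(oc, ob):
    """Euclidean distance between the vehicle's point Oc and an object's point Ob."""
    return math.hypot(ob[0] - oc[0], ob[1] - oc[1])

def monitor_step(oc, objects):
    """One cycle of the monitoring loop: update distances and react to risk.

    `objects` maps an object id to its current front center point. Returns the
    ids for which an alert or emergency braking would be requested.
    """
    alerts = []
    for obj_id, ob in objects.items():
        d = distance(oc, ob)
        if d <= RC:  # object entered the radius Rc -> predetermined ("high") risk
            alerts.append(obj_id)
    return alerts

# Example corresponding to t = t2: the bicycle has approached to around D = Rc + Rb.
print(monitor_step(oc=(0.0, 0.0), objects={"bicycle_12": (4.0, 2.0)}))  # ['bicycle_12']
```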
Similarly, there is a bicycle 12 about to cross the road obliquely front right in the traveling direction of the vehicle 1. A moving route of the bicycle 12 is represented as Vb2, its front center point as Ob2, and a distance D2 is the distance from Oc to Ob2.
Similarly, a motorcycle 14 travels side by side obliquely rear left of the vehicle 1. A moving route of the motorcycle 14 is represented as Vb3 and its front center point as Ob3. Similarly, there is a following vehicle 15 behind the vehicle 1. A moving route of the following vehicle 15 is represented as Vb4 and its front center point as Ob4.
In addition to identifying the types of the mobile bodies illustrated in the figure, the sensor unit 103 detects the position and the moving speed (the speed and the direction) of each of them.
Navigation information 20 is displayed in the middle. At the right end, destination information 21 or destination periphery information 22 is displayed. There are left and right side mirrors 23, a front windshield 24 at the front, and, in an upper portion, a rearview mirror 26 including a sensor 25 (for example, an RGB camera, a combination of an infrared LED and an infrared camera, a multispectral camera, a radio wave sensor using reflection variation of an electromagnetic wave, an audio microphone, or the like) for detecting the states of the driver and passengers.
There are four independent speakers 28 at the top, the bottom, the left, and the right. In the dashboard, there is a space video projection device 27 (a head-up display (including a holographic display) that can display a visual image on a transparent panel, the windshield 24, or an empty space; a display utilizing a two-sided corner reflector array; a transparent display that displays a visual image on a transparent panel; a retinal display that directly forms an image on the retina; and the like).
The arithmetic operation unit 106 of the vehicle 1 calculates and sets, based on sensing data acquired by a sensor from the outside of the vehicle, a dangerous center region 29A indicating a center portion of a dangerous region including the front center point Ob1 of the preceding vehicle 13 and a dangerous region 29B including the dangerous center region 29A and indicating a dangerous region concerning the preceding vehicle 13.
Similarly, the arithmetic operation unit 106 calculates and sets a dangerous center region 30A indicating a center portion of a dangerous region including the front center point Ob2 of the bicycle 12 and a dangerous region 30B including the dangerous center region 30A and indicating a dangerous region concerning the bicycle 12. Although not illustrated, dangerous center regions and dangerous regions may be similarly calculated and set for the motorcycle 14 reflected on the side mirror 23 and the following vehicle 15 reflected on the rearview mirror 26.
The dangerous center regions 29A and 30A and the dangerous regions 29B and 30B indicate where an object with a high risk is present when viewed from the driver, in other words, a direction in which the driver should direct a line of sight to check an object around the vehicle for which the driver should consider safety.
Note that the dangerous center regions 29A and 30A are position information including a part of an object around the vehicle; they are regions centered on a specific position (the front center point Ob1 or Ob2) where the distance between the own vehicle (the vehicle 1) and the object (the bicycle 12 or the preceding vehicle 13) is short and the possibility of collision is high, and they are determined by the arithmetic operation unit 106 based on data sensed by the sensor unit 103 of the vehicle 1 and/or data received by the communication unit 108.
Similarly, the dangerous regions 29B and 30B are larger regions including the dangerous center regions 29A and 30A. The dangerous regions 29B and 30B are regions set to include a region identified as a part or a whole including a dangerous center region of the object by the arithmetic operation unit 106 based on the data sensed by the sensor unit 103 of the vehicle 1 and/or the data received by the communication unit 108. Therefore, the dangerous center regions 29A and 30A may be set to be region information indicating a part of an oncoming vehicle and the dangerous regions 29B and 30B may be set to be region information indicating the entire oncoming vehicle. More specifically, the dangerous region may indicate an entire individual object (for example, the entire vehicle 13) present around the vehicle and the dangerous center region may indicate a portion of the object at a short distance from the own vehicle (for example, the left front portion of the vehicle 13).
A sensor of the vehicle 1 (such as the sensor 25 provided in the rearview mirror) also sequentially senses a driving state of the driver, and an attention region, which is a region to which the driver has most recently directed a certain level of attention or more, is detected.
In the example illustrated in the figure, several regions to which the driver directed a certain level of attention or more are detected.
Since the driver paid attention to an attention region 33, it can be seen that the driver is checking the motorcycle 14 obliquely rear left, reflected in the left side mirror 23. Similarly, since the driver paid attention to an attention region 34, it can be seen that the driver is checking the navigation information 20.
In recent years, line-of-sight detection technology has become widespread, and the target object the driver checks during driving, as described above, can be acquired by detecting the driver's head position (or the center position of both eyes) and line-of-sight direction (or the relative positions of reference points such as the pupil, the outer corner of the eye, and the outer edge of the iris) with a sensor in the vehicle and performing image recognition processing with the arithmetic operation unit 106.
Note that, in the present disclosure, the method of acquiring a line-of-sight direction and the determination standard for a gazing region are not limited, as long as a region to which the driver directed a certain level of attention or more can be detected.
Concerning this determination, a region may be determined as an attention region when the direction in which the driver's line of sight is directed stays within a predetermined angle for a given time or more.
Alternatively, a region within a predetermined distance from a line segment connecting, in chronological order over a predetermined time, points where the line of sight stayed for a certain time or more may be determined as a region to which the driver directed a certain level of attention or more.
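One of the dwell-based criteria above (the line-of-sight direction staying within a predetermined angle for a given time or more) could be sketched like this; the sample format and the thresholds are assumptions for illustration.

```python
def detect_attention_regions(gaze_samples, max_angle_deg=3.0, min_dwell_s=0.5):
    """Return dwell clusters in which the line-of-sight direction stayed within
    a predetermined angle for a given time or more.

    `gaze_samples` is a list of (timestamp_s, yaw_deg, pitch_deg) tuples,
    ordered by time. Each returned item is ((yaw, pitch), dwell_seconds).
    """
    regions, start = [], 0
    for i in range(1, len(gaze_samples) + 1):
        moved = (
            i == len(gaze_samples)  # end of the stream closes the last cluster
            or abs(gaze_samples[i][1] - gaze_samples[start][1]) > max_angle_deg
            or abs(gaze_samples[i][2] - gaze_samples[start][2]) > max_angle_deg
        )
        if moved:
            dwell = gaze_samples[i - 1][0] - gaze_samples[start][0]
            if dwell >= min_dwell_s:  # stayed long enough -> attention region
                regions.append((gaze_samples[start][1:], dwell))
            start = i
    return regions
```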
In driving the vehicle 1, the driver has to continuously monitor, without omission and over a wide viewing angle around the vehicle, the positional relations with stationary and moving objects while following the instructions of road lines, signs, traffic lights, and the like, and has to keep performing safe driving control in accordance with the situation around the vehicle.
This is considered to be because there are moments when safety consideration is not practically thorough in the case of a driver unaccustomed to driving operation, a driver who drives overseas or elsewhere where driving rules differ from usual, a driver whose cognitive judgment ability has deteriorated with aging, or a driver who continues driving operation for a long time, such as a bus or taxi driver.
Thus, the present disclosure discloses, as an example, the safe driving assist system 100 that plainly informs the driver of an alert urging appropriate safety consideration, only when necessary, in a situation as illustrated in the figure.
As described with reference to the figures up to this point, the arithmetic operation unit 106 determines a risk for each object around the vehicle and detects the regions to which the driver has directed attention.
Further, the arithmetic operation unit 106 displays the marker display 37 (for example, a caution mark) between the driver and the bicycle 12, which has been determined to be a high-risk object because safety consideration for it has been insufficient for a predetermined period, so that the driver can easily direct attention to it.
Further, the arithmetic operation unit 106 controls the speaker 28 of the information input/output unit 104 to set a virtual sound source position between the driver and the bicycle 12 and to output sound such that the sound alert 36 ("pipipi") is heard from the direction of the bicycle 12. Note that, needless to say, these marks and sounds are examples; marks of other shapes or different sounds may be used.
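Setting a virtual sound source between the driver and the bicycle 12 is a sound image localization problem; a real implementation would typically use HRTF processing or vector-base amplitude panning over all four speakers 28. The minimal two-channel constant-power panning below only conveys the idea, and the simplification to two channels plus all names are assumptions.

```python
import math

def pan_gains(source_azimuth_deg):
    """Constant-power left/right amplitude panning toward a virtual source
    azimuth (0 deg = straight ahead, positive = to the right). Azimuths beyond
    +/-90 deg are clamped in this simplified sketch.
    """
    az = math.radians(max(-90.0, min(90.0, source_azimuth_deg)))
    pan = (math.sin(az) + 1.0) / 2.0  # 0 = full left, 1 = full right
    left = math.cos(pan * math.pi / 2)
    right = math.sin(pan * math.pi / 2)
    return left, right

# Bicycle obliquely front right -> the alert sound is pulled to the right channel.
print(pan_gains(45.0))  # approx (0.23, 0.97)
```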
As described above, for a dangerous object around the vehicle of which the driver is considered to be unaware, the information input/output unit 104 notifies the driver of the dangerous object using the arrow mark 38, the caution mark (the marker display 37), the sound alert 36, or the like, considering the position and the direction of the dangerous object, in a relatively low-risk situation before the vehicle itself activates emergency braking such as that of an advanced driver assistance system (ADAS). Accordingly, safe driving can be assisted at an earlier stage, focusing on the important part.
Similarly, the figure illustrates, as a time series, a flow in which the driver is notified of the bicycle 12 approaching from obliquely front right.
First, the arithmetic operation unit 106 determines that the driver has not directed, for an immediately preceding predetermined time or more, a certain level of attention or more to a dangerous region including the bicycle 12 approaching from obliquely front right (an upper left part 81). Subsequently, the arithmetic operation unit 106 determines that the bicycle 12 has approached the vehicle 1 without being noticed by the driver and that a predetermined risk is exceeded (a center upper part 82).
Subsequently, the arithmetic operation unit 106 uses (the space video projection device 27 of) the information input/output unit 104 to display the arrow mark 38 indicating the position of a dangerous center region (a dangerous region or a dangerous object) around the line-of-sight direction of the driver. The arithmetic operation unit 106 sets a virtual sound source position such that the sound alert 36 can be heard from the direction of the dangerous center region (the dangerous region or the dangerous object) and outputs the sound alert 36 from (the speaker 28 of) the information input/output unit 104 (an upper right part 83).
Marker display (the marker display 37) for calling attention to the dangerous center region may be performed simultaneously. The sound alert 36 may be sounded continuously, at the longest until the marker display in the lower right part 89 is extinguished, or may be output only once.
Subsequently, the driver notices the display of the arrow mark 38 and the sound alert 36 (a middle left part 84). The driver who noticed the display of the arrow mark 38 or the sound alert 36 starts to check the direction indicated by the arrow mark 38 and the direction of the virtual sound source position where the sound alert 36 sounds (a center middle part 85). Subsequently, the position of the dangerous center region is notified by the marker display 37 (a middle right part 86).
Subsequently, the driver notices the marker display 37 and notices the bicycle 12 present in the dangerous center region (or the dangerous region) (a lower left part 87). Subsequently, the driver performs driving operation for avoiding danger, so as not to cause an accident with, or so as to maintain a safe distance from, the bicycle 12, which is the newly noticed dangerous object (a lower center part 88). Finally, the risk of the bicycle 12, which had been the dangerous object, falls below a predetermined value and the marker display is extinguished (a lower right part 89).
Note that the timings of the start and the end of the arrow mark 38, the marker display 37, and the sound alert 36 described here are examples. In the present disclosure, the timings of notification, the relative order, the display positions, and the virtual sound source positions of the arrow mark 38, the marker display 37, and the sound alert 36 may be different. For example, the arrow mark 38 may not be used; instead, the sound alert 36 may sound first, the marker display 37 may be performed next, the sound alert 36 may be stopped at the timing when the driver checks its direction or when the risk falls below a predetermined value, and the marker display 37 may then be extinguished.
For example, a panel smaller than the windshield 24 may be installed in front of the driver, one or more of the windshield 24, the left and right side mirrors 23, the rearview mirror 26, and the left and right windows may be laid out in an arrangement viewed from the driver's seat, and the arrow mark 38 and the marker display 37 may be performed therein. Alternatively, a virtual viewpoint video obtained by overlooking the surroundings of the own vehicle from above may be generated by the arithmetic operation unit 106 based on data acquired from the sensor unit 103 and/or the communication unit 108 and may be displayed on the information input/output unit 104 (for example, a video display unit in the vicinity of the cockpit 17). At this time, it is conceivable to plainly mark and display a dangerous object such that the driver can check at a glance where a dangerous object having a risk of a predetermined amount or more is present in the virtual viewpoint video.
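The example flow of parts 81 to 89 described above behaves like a small state machine. A minimal sketch follows, assuming a boolean for "driver checked the indicated direction" and a single risk threshold; neither is specified in this form by the disclosure.

```python
from enum import Enum, auto

class AlertPhase(Enum):
    IDLE = auto()
    ARROW_AND_SOUND = auto()  # arrow mark 38 shown, sound alert 36 localized
    MARKER = auto()           # marker display 37 at the dangerous center region
    DONE = auto()             # marker extinguished

def next_phase(phase, risk, driver_checked_direction, risk_threshold=0.5):
    """Advance the notification sequence sketched in the text. The ordering
    follows the example flow (arrow + sound, then marker, extinguished when
    the risk falls below a predetermined value); as noted in the text, other
    orders are also possible.
    """
    if phase is AlertPhase.IDLE and risk >= risk_threshold:
        return AlertPhase.ARROW_AND_SOUND       # parts 82-83
    if phase is AlertPhase.ARROW_AND_SOUND and driver_checked_direction:
        return AlertPhase.MARKER                # parts 85-86
    if phase is AlertPhase.MARKER and risk < risk_threshold:
        return AlertPhase.DONE                  # part 89
    return phase
```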
The sensor (and/or the communication unit 108) of the vehicle 1, which has detected that the information terminal 2 and the vehicle 1 have approached within a distance in which proximity communication is possible or the information terminal 2 has approached within a predetermined distance from the door of the vehicle 1, starts authentication of the electronic key 7. The arithmetic operation unit 106 of the vehicle 1 transmits an input value including a random number to the information terminal 2 via the communication unit 108 (step S101).
The communication unit 206 of the information terminal 2 receives the input value and the arithmetic operation unit 204 calculates a response value corresponding to the input value (step S102). The arithmetic operation unit 204 of the information terminal 2 returns the response value and a user ID, which is identification information for identifying the user, to the vehicle 1 via the communication unit 206 (step S103).
The arithmetic operation unit 106, which has received the response value via the communication unit 108 of the vehicle 1, verifies whether the response value is an expected result. When the response value is an expected result, the arithmetic operation unit 106 of the vehicle 1 unlocks the door via the key control unit 105. Further, the arithmetic operation unit 106 of the vehicle 1 identifies the user having the user ID used for the unlocking as the driver (step S104).
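The input value/response value exchange of steps S101 to S104 is a standard challenge-response pattern. The disclosure does not specify the cryptographic primitive; the sketch below assumes a shared symmetric key and an HMAC, purely for illustration.

```python
import hashlib
import hmac
import os

# Assumed for this sketch: the key is provisioned to both the vehicle and the
# terminal (e.g., via the vehicle management cloud 10 when the electronic key
# 7 is issued).
SHARED_KEY = os.urandom(32)

def vehicle_make_challenge():
    return os.urandom(16)  # input value including a random number (step S101)

def terminal_response(challenge, user_id):
    # Steps S102-S103: the terminal computes the response value and returns it
    # together with the user ID.
    return hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest(), user_id

def vehicle_verify(challenge, response):
    # Step S104: the vehicle checks that the response matches the expected value.
    expected = hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

ch = vehicle_make_challenge()
resp, uid = terminal_response(ch, user_id="user-123")
assert vehicle_verify(ch, resp)  # door unlocked; uid identifies the driver
```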
Note that the arithmetic operation unit 106 of the vehicle 1 may perform personal identification of the user sitting in the driver's seat by using a sensor (such as a camera that performs image recognition) and may determine that the user ID of that user is the user ID of the driver. Alternatively, the arithmetic operation unit 106 may detect that the user whose user ID was used for unlocking has sat in the driver's seat and determine this user as the driver.
When use permission of a driving condition is checked in the PDS 9A (step S106), the vehicle 1, which has identified the user ID of the driver, receives the driving condition (step S107) and determines whether the driver is capable of driving the vehicle (step S108). This is described in detail below.
The user can set a function (hereinafter, safe driving function) of the vehicle that assists safe driving enabled during driving (step S109). This is described in detail below.
While the user is driving, the vehicle assists the driving such that the user can safely drive with the safe driving function based on the setting (step S110). This is described in detail below.
The arithmetic operation unit 106 of the vehicle 1 records driving data including a history of driving operation of the driver in a memory or updates the driving data (step S111). When the user approves the driving data, the vehicle 1 shares the driving data with the vehicle management cloud 10 and the PDS 9A as appropriate (step S112 and step S115).
The vehicle management cloud 10 records the driving data (step S113), evaluates a safe driving function for the user (step S114), and, in response to determining that new setting should be recommended, proposes the new setting to the user via the vehicle 1 (step S117). Processing concerning the proposal of the new safe driving function (step S118) using the vehicle management cloud 10 is described in detail below.
The PDS 9A, which has received the driving data, may record the driving data (step S116) and provide some incentive to the user via the vehicle 1 (or the information terminal 2) as a reward for the data provision (steps S119 to S120). As an example of this, an automatic change of an automobile insurance is described in detail below.
As described above, a cycle in which the information terminal 2, the vehicle 1, the vehicle management cloud 10, and the PDS 9A of the personal information management cloud 9 (or the information terminal 2 and the vehicle 1) cooperate to appropriately assist safe driving of the user while the user drives and to propose better services from the history of that driving is thus established.
The present disclosure is not limited to only plainly assisting safe driving of the user as appropriate with respect to dangerous objects around the vehicle using the space video projection device 27 and 3D stereophonic sound. By utilizing the driving data linked with the user accumulated using these, it is possible to smoothly implement, in a data-driven format based on the driving data of the user, determination of propriety of driving operation, proposal of setting/updating of a safe driving function, and cooperation with a third party service involved in vehicle driving.
Accordingly, even for the various drivers and driving situations having the problems with safe driving described above, it is possible to drive mainly by oneself at ease while receiving the assistance of the safe driving function. The technology of the present disclosure is expected to reduce social problems such as the difficulty of giving sufficient consideration to safe driving in an aging society, the significant inconvenience in daily life caused when a driver's license has to be returned because of the risk of causing an accident, and the progression of cognitive decline as opportunities to go out decrease.
In a date and time field, information indicating the date and time when this record was generated is recorded as a data value. In this example, the date and time of this record, 22:38:11 on Mar. 17, 2022 in Japan Standard Time, is expressed using the ISO 8601 format.
In the next user ID field, a user ID for identifying a driver corresponding to this record is described. This may be replaced with information capable of identifying an individual such as a driver's license number. In the vehicle ID field, a vehicle ID (a chassis number) for identifying a vehicle corresponding to this record is described.
In a safe driving function field, the validity/invalidity of the safe driving function of the vehicle 1 at the time of this driving, or a setting value applied when the safe driving function operates, is described. A condition for outputting the sound alert 36 is also described; as the condition, for example, the distance D illustrated in the figure and a grace time are described.
The marker display 37 and the arrow mark 38 are described as false, so the functions of both are disabled. When these functions are enabled, true is described here, or conditions such as a distance and a grace time, like those of the sound alert 36, are described. Eight direction sensitivities represent thresholds of warning notifications for the eight directions of vehicle front, obliquely front right, right, obliquely rear right, rear, obliquely rear left, left, and obliquely front left, in stages of 0 (low) to 9 (high). The obliquely rear left is set to 9; compared with the other directions, whose sensitivities are smaller than 9, the warning notification is thus set to operate for a dangerous object obliquely rear left even when the dangerous object is farther from the vehicle.
These may be settings made by the user, or may be calculated by the arithmetic operation unit 903 of the vehicle management cloud 10, which evaluates driving data and proposes setting values of the safe driving function, or by the arithmetic operation unit 106 of the vehicle 1. When the vehicle management cloud 10 performs the setting, it may do so by statistical processing of the information in the incident and accident report fields of the user's past driving data, setting a high numerical value for a direction in which the probability that the user's safety consideration is insufficient is high and a low numerical value for a direction in which that probability is low.
In the incident and accident report field, a history of operation of the safe driving function of the vehicle 1 is described. In particular, when the vehicle 1 detects a risk of a predetermined level or more, the vehicle state and the positional relation with the dangerous object at the time the risk is determined are described.
Vehicle speed expresses the moving speed of the vehicle 1 at this time in units of km/h. A steering angle expresses the angle of steering wheel operation of the vehicle 1 at this time in degrees; when the steering wheel 16 is turned to the right the angle is positive, and when it is turned to the left the angle is negative. A direction expresses the relative position of the target dangerous object in degrees as a clockwise angle from the front of the vehicle; 233 degrees in this example indicates a direction obliquely rear left of the vehicle 1.
A type indicates the type of the target object for which the risk has been detected. The type includes a person, a bicycle, a motorcycle, an ordinary passenger car, a large passenger car, a train, a traffic sign, a traffic light, a guardrail, a step on a road surface, and the like. In this example, the type is a motorcycle, consistent with the direction of 233 degrees described above.
A risk indicates an evaluation result of a risk based on an accident risk described below. In this example, it is indicated that the risk of this incident was medium. In the incident and accident report field, a valid data value is recorded when the safe driving function of the vehicle 1 operates. Otherwise, an invalid data value is recorded.
In a place field, information indicating the place where this record was generated is recorded as a data value.
In a moving distance field, information indicating the total moving distance, in units of km, from the place where the safe driving function last operated to the current place is recorded as a data value.
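Gathering the fields described above, one plausible in-memory form of such a driving data record might look like the following. All field names and concrete values are illustrative assumptions, not a format prescribed by the disclosure.

```python
# Illustrative record only; the concrete values are assumptions.
driving_record = {
    "datetime": "2022-03-17T22:38:11+09:00",  # ISO 8601, Japan Standard Time
    "user_id": "user-123",                    # or a driver's license number
    "vehicle_id": "chassis-ABC-0001",         # chassis number
    "safe_driving_function": {
        "sound_alert": {"distance_m": 30, "grace_time_s": 2},  # output condition
        "marker_display": False,              # function disabled
        "arrow_mark": False,                  # function disabled
        # Eight direction sensitivities, 0 (low) to 9 (high), clockwise from front.
        "direction_sensitivity": {
            "front": 3, "front_right": 3, "right": 3, "rear_right": 3,
            "rear": 3, "rear_left": 9, "left": 3, "front_left": 3,
        },
    },
    "incident_report": {
        "vehicle_speed_kmh": 23,
        "steering_angle_deg": -12,   # negative = steering wheel turned left
        "direction_deg": 233,        # clockwise angle from the vehicle front
        "type": "motorcycle",
        "risk": "medium",
    },
    "place": {"lat": 34.6937, "lon": 135.5023},  # illustrative coordinates
    "moving_distance_km": 1.2,
}
```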
A high risk is the state in which the risk is the highest, as shown in the table. This is a state in which the arithmetic operation unit 106 of the vehicle 1 directly controls the movable unit 101 in order to perform immediate vehicle control for avoiding an accident.
A medium risk is the state in which the risk is the second highest after the high risk. The arithmetic operation unit 106 of the vehicle 1 notifies the user of the danger via the information input/output unit 104 so that the user performs immediate vehicle control for avoiding an accident. The difference from "high" is that the arithmetic operation unit 106 does not directly control the movable unit 101 in order to avoid an accident.
A low risk is a state in which the risk is lower than the medium risk and higher than the risk determined as safe. It is a state in which the arithmetic operation unit 106 determines that, although a dangerous object that can cause an accident is present within a constant range around the vehicle, there is no need for immediate vehicle control to avoid an accident.
The risk determined as safe is the state in which the risk is the lowest. It is a state in which the arithmetic operation unit 106 determines that there is no dangerous object that can cause an accident within a constant range around the vehicle.
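Taken together, the four levels can be viewed as a classification over the situation around the vehicle. A minimal sketch keyed on the distance to the nearest dangerous object follows; the thresholds are assumptions, since the disclosure fixes only the ordering of the levels and the actions tied to them.

```python
def classify_risk(distance_m, r_high=5.0, r_medium=10.0, r_low=30.0):
    """Map the distance to a dangerous object onto the four risk levels.
    The thresholds are illustrative assumptions.
    """
    if distance_m <= r_high:
        return "high"    # arithmetic operation unit 106 directly brakes (movable unit 101)
    if distance_m <= r_medium:
        return "medium"  # notify the driver to perform immediate avoidance
    if distance_m <= r_low:
        return "low"     # dangerous object within a constant range, no urgency
    return "safe"        # no dangerous object within the constant range

print(classify_risk(8.0))   # 'medium'
print(classify_risk(50.0))  # 'safe'
```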
The arithmetic operation unit 106 of the vehicle 1 that has identified the user ID of the driver in the processing to that point requests the PDS 9A (whose function is provided by the personal information management cloud 9, the information terminal 2, the vehicle 1, or an application operating in any device) for a driving condition corresponding to the user ID via the communication unit 108 (step S201). The request in step S201 includes a vehicle ID (and/or a manufacturer ID for identifying a manufacturer of the vehicle 1) for identifying the vehicle 1 that has made the inquiry.
Note that, when the PDS 9A is managed by the personal information management cloud 9 or an application operating therein, the PDS 9A is referred to as centralized PDS. In this case, data is collectively managed in a cloud provided by a management company for personal information. When the PDS 9A is managed by the information terminal 2, the vehicle 1, or an application operating therein, the PDS 9A is referred to as distributed PDS. In this case, the user collectively manages personal information on the information terminal 2 of the user.
The processing on the PDS 9A side is common to both cases. Therefore, the present disclosure describes these embodiments collectively, without distinguishing whether the PDS 9A is present on the personal information management cloud 9, on the information terminal 2, on the vehicle 1, is managed across two or more of these, or whether the personal information is managed by an unspecified number of computers on a network using a distributed ledger technology.
That is, the computer (personal information management cloud 9) in the present embodiment may be one of multiple computers capable of communicating with one another via a network. Each of the multiple computers manages at least one of a driving characteristic database including driver's license information and permission information on a distributed ledger.
Note that, in the present disclosure, it is assumed that the personal information of the user, information about driving, and the like are managed by the PDS 9A. However, as long as an equivalent mechanism is provided for managing third-party use of the equivalent information based on personal permission, the information need not be managed in the PDS 9A.
The arithmetic operation unit 903 of the PDS 9A, which has received the request in step S201 via the communication unit 901, accesses the memory 902 and checks whether the user having the user ID has permitted use of driving condition information of the user to the vehicle 1 having the vehicle ID (or a manufacturer of the vehicle) (step S202).
If the user has not permitted the use (step S203: No), the arithmetic operation unit 903 transmits, to the vehicle 1 via the communication unit 901, an indication that there is no use permission (step S204). The arithmetic operation unit 106 of the vehicle 1, which has received this via the communication unit 108, uses the information input/output unit 104 to prompt the user to permit the use (step S205).
When the user permits the use, the arithmetic operation unit 106 of the vehicle 1 notifies the PDS 9A to that effect via the communication unit 108 (step S206: Yes). The PDS 9A having received this notification records, in the memory 902 thereof, indication that the user has permitted the use of the driving condition information to the vehicle ID (or the manufacturer ID) (step S207). Then, the PDS 9A reads the driving condition information of the user having the user ID from the memory 902, and returns the driving condition information to the vehicle 1 via the communication unit 901 (step S208).
Thus, the driving characteristic information (the driving condition information) is acquired from the driving characteristic database (the PDS 9A) managed in the computer with which the first communication circuit (the communication unit 108 of the vehicle 1) can communicate. Driving characteristic data of users acquired from vehicles is accumulated in the driving characteristic database.
When the user does not permit the use (step S206: No), the arithmetic operation unit 106 of the vehicle 1 ends the processing here.
When the arithmetic operation unit 903 of the PDS 9A has successfully confirmed the use permission (step S203: Yes), the arithmetic operation unit 903 reads the driving condition information of the user having the user ID from the memory 902, returns the driving condition information to the vehicle 1 via the communication unit 901 (step S208), and ends the processing.
The arithmetic operation unit 106 of the vehicle 1, which has received the driving condition information of the user via the communication unit 108, acquires the safe driving function of the vehicle 1 (step S209). Then, the arithmetic operation unit 106 compares the driving condition information of the user and the safe driving function of the vehicle 1 and determines whether the driving condition can be satisfied (step S210).
In response to determining that the driving condition can be satisfied (step S210: Yes), the arithmetic operation unit 106 of the vehicle 1 permits the user to drive the vehicle 1 (step S211). Further, the arithmetic operation unit 106 enables the safe driving function of the vehicle 1 in accordance with the driving condition information, appropriately sets parameters of the safe driving function with respect to the driving condition information of the user, and ends the processing.
In response to determining that the driving condition cannot be satisfied (step S210: No), the arithmetic operation unit 106 of the vehicle 1 does not permit the user to drive the vehicle 1 (step S212). Further, the arithmetic operation unit 106 notifies the user of the reason using the information input/output unit 104 (for example, a cockpit monitor) and ends the processing.
In the driving characteristic database (the PDS 9A) of the present embodiment, a plurality of user IDs for identifying a plurality of users and driving characteristic data including the driver's license information of those users are accumulated in correlation with each other. Moreover, the computer (the personal information management cloud 9 or the information terminal 2) stores in the memory, in correlation with one another, the plurality of user IDs, a plurality of vehicle IDs for identifying a plurality of vehicles, and permission information indicating, for each user, which of those vehicles the user has permitted to access the user's driving characteristic data. When the computer determines, based on the user ID of the driver, the vehicle ID of the vehicle, and the permission information, that the driver has permitted the vehicle to access the driving characteristic data of the driver, the vehicle acquires the driving characteristic data of the driver.
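The permission check and the subsequent capability comparison (steps S202 to S212) could be sketched as follows. The data structures and names are assumptions standing in for the memory 902 and the driving condition information.

```python
def pds_get_driving_condition(pds_db, permissions, user_id, vehicle_id):
    """PDS side of steps S202-S208: return the driving condition only when the
    user has permitted this vehicle (or its manufacturer) to use it.
    `pds_db` and `permissions` stand in for the memory 902.
    """
    if vehicle_id not in permissions.get(user_id, set()):
        return None  # step S204: no use permission
    return pds_db[user_id]  # step S208

def vehicle_check_driving_condition(condition, available_functions):
    """Vehicle side of steps S209-S212: permit driving only when every safe
    driving function required by the condition is available on this vehicle.
    """
    required = set(condition.get("required_safe_driving_functions", []))
    return required <= set(available_functions)

permissions = {"user-123": {"chassis-ABC-0001"}}
pds_db = {"user-123": {"required_safe_driving_functions": ["sound_alert"]}}
cond = pds_get_driving_condition(pds_db, permissions, "user-123", "chassis-ABC-0001")
print(vehicle_check_driving_condition(cond, ["sound_alert", "marker_display"]))  # True
```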
Note that, although it is determined in the above-described embodiment whether the driving condition is satisfied, the present disclosure is not limited to this. When the driving skill of the user is less than a predetermined value or a predetermined condition is imposed on the driver's license of the user, this information may be transmitted (broadcast) to vehicles and road equipment such as traffic lights in the periphery.
The vehicle control method of the present embodiment acquires, from the storage device (the memory 107) of the vehicle 1, a function list indicating two or more driving assist functions implemented on the vehicle 1, and transmits vehicle information and an alert matter to one or more other vehicles in response to determining, based on the driving characteristic information and the function list, that the level of the driving skill of the driver is less than a reference value and that no assist function for compensating for the insufficiency of the driving skill is among the driving assist functions.
In this case, vehicle information for identifying the vehicle that has transmitted this information is transmitted. The vehicle information may include at least one of vehicle model information, number information, color information, current position information, and travel lane information (determined from the current position).
Note that a vehicle in the periphery, which has received the information, may display the information on the monitor of its cockpit to notify its driver. Accordingly, an appropriate inter-vehicle distance and the like can be maintained, contributing to safe driving. In a case where the vehicle in the periphery that has received this information is an automatic driving vehicle and the transmitting vehicle can be identified, a larger inter-vehicle distance than usual may be kept from that vehicle, driving control may be performed to change the route and move away from it, or the driving plan may be corrected.
In this manner, using the driving condition information of the user recorded in the PDS 9A, the arithmetic operation unit 106 of the vehicle 1 determines whether the user can drive the vehicle 1 and, even when the user can, what kind of safe driving function has to be enabled or set.
Accordingly, driving is possible only with a vehicle 1 in which the necessary safe driving functions are available; in other words, the user can be prevented from driving a vehicle 1 in which a safe driving function necessary for the user is unavailable. Because this should be determined in accordance with the user's safe driving skill, it is desirable that the driving condition information be managed by the PDS 9A, which can securely and collectively manage the user's personal information and the like and can restrict access from third parties.
Note that the driving condition may be determined not only based on information with a relatively gentle change with time managed by the PDS 9A but also taking into account a result of alcohol concentration measurement by exhalation collection of the user immediately before the start of driving.
The personal information managed by the PDS 9A has been described as the driving condition information for each user. However, the driving condition information may be information expressed or recorded in the driver's license of the user. For example, the driving condition information may include type information of the driver's license, such as a conditional license limited to, for example, safe driving assist vehicles (an assist car limited license) or a large vehicle license, condition information for driving (for example, glasses), the acquisition date and the expiration date of the driver's license, the address and a face photograph image of the user, and the like.
Further, the driving condition information may include the driving data of the user described above.
In the above description, it is assumed that the user cannot drive when the driving condition is not satisfied. However, the present disclosure is not limited to this. When driving is permissible in view of the expiration date of the user's driver's license and the type of vehicle permitted for driving, and only the safe driving function is insufficient, the driving itself may be permitted with a restriction added.
For example, the following restrictions are conceivable. The maximum speed may be limited to 80% of the speed permitted by law (for example, if the legal maximum is 60 km/h, the speed is limited to 48 km/h). A maximum vehicle speed may be set (for example, 40 km/h) above which the speed cannot be increased even if the user steps on the accelerator. The range of destinations the user can set in the car navigation system may be limited to within a predetermined distance from the user's home (for example, settable within a range of 100 km from home), or to a predetermined region or administrative unit including the user's home (for example, settable within the municipality where the home is located). The routes (roads) available in the car navigation system may be narrowed such that the vehicle cannot travel on other than specific safe routes.
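The speed and destination restrictions above are simple arithmetic; a small sketch follows, with the 80% factor and the 100 km range taken from the examples in the text and everything else assumed.

```python
def restricted_max_speed(legal_max_kmh, factor=0.8, hard_cap_kmh=None):
    """Example restriction: limit the maximum speed to 80% of the legal
    maximum (60 km/h -> 48 km/h), optionally with an absolute cap such as
    40 km/h beyond which accelerator input is ignored.
    """
    limited = legal_max_kmh * factor
    return min(limited, hard_cap_kmh) if hard_cap_kmh else limited

def destination_allowed(dest, home, max_range_km=100.0):
    """Example restriction: a destination is settable in the car navigation
    system only within a predetermined distance from the user's home.
    Positions are (x_km, y_km) in a local frame for simplicity.
    """
    dx, dy = dest[0] - home[0], dest[1] - home[1]
    return (dx * dx + dy * dy) ** 0.5 <= max_range_km

print(restricted_max_speed(60))                       # 48.0
print(destination_allowed((30.0, 40.0), (0.0, 0.0)))  # True (50 km from home)
```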
Further, when only the safe driving function described above is insufficient, the vehicle 1 may output, to the outside, visual information indicating that the safe driving function is insufficient. This may be, for example, a specific form in which the presence or absence of light emission, the color of the light emission, and the blinking interval of a brake lamp, a blinker, a headlight, a license plate lamp, or another lighting mechanism of the vehicle 1 can be identified.
Next, a vehicle control method of the present embodiment is described. The vehicle control method of the present embodiment is a vehicle control method for controlling a vehicle communicable with one or more other vehicles located in the periphery, the vehicle control method including acquiring driving characteristic information indicating a driving characteristic of a driver via a first communication circuit mounted on the vehicle, determining, based on the driving characteristic information, whether there is an alert matter to be notified to the one or more other vehicles concerning the driving characteristic of the driver, and transmitting, in response to determining that there is the alert matter, vehicle information for identifying the vehicle and the alert matter to the one or more other vehicles located in the periphery of the vehicle via a second communication circuit mounted on the vehicle. The driving characteristic information is information indicating a driving skill of the driver or information about a driver's license of the driver, and the alert matter is information indicating that the level of the driving skill is less than a reference value or information indicating that there is a constraint condition on the driver's license.
As illustrated in the figure, the vehicle 1 first acquires the driving characteristic information of the driver.
The vehicle 1, which has acquired the driving characteristic information, determines whether there is a matter to be alerted to another vehicle 1311 for safe driving (step S225). This determination may be made as described above.
When the alert to the other vehicle is determined to be unnecessary, the arithmetic operation unit 106 ends the processing (step S225). When it is necessary, the arithmetic operation unit 106 periodically broadcasts (transmits) vehicle information for identifying the vehicle and the information to be notified to other vehicles or road equipment (step S226). The other vehicle 1311, which has received the information, notifies its driver of the information in the case of manned driving (step S227). When the other vehicle 1311 is an automatic driving vehicle, it performs driving control to make the inter-vehicle distance wider than usual and maintain a safe distance such that the vehicle can pass safely (step S227).
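The periodic broadcast of step S226 (and step S237 below) carries vehicle information plus the alert matter. One plausible payload, with all field names assumed, is sketched here.

```python
import json
import time

def build_alert_broadcast(vehicle):
    """Periodic broadcast payload of steps S226/S237: vehicle information for
    identifying the sender plus the alert matter. Field names are illustrative
    assumptions, not a prescribed message format.
    """
    return json.dumps({
        "timestamp": time.time(),
        "vehicle_info": {                     # identifies the transmitting vehicle
            "model": vehicle["model"],
            "plate_number": vehicle["plate_number"],
            "color": vehicle["color"],
            "position": vehicle["position"],  # current position
            "lane": vehicle["lane"],          # travel lane
        },
        # e.g. "driving skill below reference" or "license has a constraint condition"
        "alert_matter": vehicle["alert_matter"],
    })

payload = build_alert_broadcast({
    "model": "sedan-X", "plate_number": "12-34", "color": "white",
    "position": [34.6937, 135.5023], "lane": 1,
    "alert_matter": "assist function insufficient",
})
print(payload)
```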
The arithmetic operation unit 106 of the vehicle 1 acquires driving characteristic information (driving skill information) of the user from the memory based on the user ID of the driver (step S231). Further, the arithmetic operation unit 106 of the vehicle 1 acquires, from the memory 107, list information of driving assist functions provided in the vehicle (step S232). Note that the arithmetic operation unit 106 of the vehicle 1 may acquire the driving characteristic information and/or the list information of the driving assist functions not from the memory but from another computer on the network such as the PDS 9A or the vehicle management cloud 10 via the communication unit 108.
Subsequently, the arithmetic operation unit 106 of the vehicle 1 determines, from the driving skill information of the user, whether the user's driving skill can be assisted by the driving assist functions provided in the vehicle (step S233). In the case of Yes (step S233: Yes), the arithmetic operation unit 106 selects the corresponding driving assist functions to compensate for the insufficiency of the driving skill and enables them (step S234). Accordingly, the user can drive while the necessary parts of the driving skill are complemented. Then, the arithmetic operation unit 106 ends the processing.
On the other hand, in the case of No (step S233: No), there is a concern as to whether safe driving can be performed. In this case, as described above, the arithmetic operation unit 106 of the vehicle 1 places a predetermined constraint on the setting of a destination in the car navigation system or narrows the passable roads to only roads to which a certain level of safety design or more has been applied (a road width of a predetermined amount or more, sections where a guardrail is present, a statistically low accident occurrence, and the like) (step S235). This makes it possible to avoid driving in unfamiliar places and to suppress the risk of an accident by not passing along dangerous roads.
It is also effective for the arithmetic operation unit 106 of the vehicle 1 to suppress (limit) the maximum speed of the vehicle to the predetermined speed as described above (step S236). The arithmetic operation unit 106 of the vehicle 1 may set the legal maximum speed of each road as the maximum speed at which the vehicle can be driven, may set a maximum speed lower than the legal maximum, or may determine the maximum speed uniformly.
Further, in order to notify that a vehicle requiring safe driving is present among vehicles in the periphery, the arithmetic operation unit 106 of the vehicle 1 periodically transmits, to vehicles in the periphery and road equipment via wireless communication (the communication unit 108), vehicle information for identifying the own vehicle and information indicating that, for example, a driving assist function is insufficient or a driver's license has a specific restriction condition (step S237).
As described above, the vehicle control method in the present embodiment restricts the navigation function of the car navigation system mounted on the vehicle 1 in response to determining that a necessary assist function is absent or insufficient among the driving assist functions. The restriction of the navigation function includes restricting the destinations that can be set by the driver, restricting the routes that can be set by the driver, or alerting the driver so as to restrict the maximum speed of the vehicle 1 to a predetermined value or less. Then, the arithmetic operation unit 106 ends the processing.
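The branch structure of steps S233 to S237 could be orchestrated as below. The callbacks stand in for the car navigation system, the speed limiter, and the communication unit 108; all names are assumptions.

```python
def apply_safe_driving_assist(weak_points, available_functions, actions):
    """Orchestration of steps S233-S237 (simplified sketch): when every weak
    point of the driver's skill can be compensated by an available assist
    function, enable those functions; otherwise apply restrictions and
    broadcast an alert to the periphery.
    """
    if all(w in available_functions for w in weak_points):
        for w in weak_points:
            actions["enable_function"](w)      # step S234
        return "assist_enabled"
    actions["restrict_navigation"]()           # step S235
    actions["cap_max_speed"]()                 # step S236
    actions["broadcast_alert"]()               # step S237
    return "restricted"

# Minimal stub wiring for illustration:
log = []
actions = {
    "enable_function": lambda f: log.append(f"enable {f}"),
    "restrict_navigation": lambda: log.append("navi restricted"),
    "cap_max_speed": lambda: log.append("speed capped"),
    "broadcast_alert": lambda: log.append("alert broadcast"),
}
apply_safe_driving_assist(["lane_keeping"], ["sound_alert"], actions)
print(log)  # ['navi restricted', 'speed capped', 'alert broadcast']
```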
Note that a traffic light for vehicles, which is road equipment that has received this information, may adjust its switching timing or restrict vehicle passage in a specific direction such that the vehicle can pass safely. For example, when the vehicle enters an intersection to turn right or left, the traffic light may adjust its timing so that the vehicle has a longer time than usual to make the turn safely. A traffic light at a crosswalk may also adjust its switching timing to assist safe traveling of the vehicle; for example, while the vehicle crosses the crosswalk, it is conceivable to set the pedestrian signal to red to prevent pedestrians from entering the crosswalk.
Illustrated is a state in which the user individually adjusts, within an adjustable range 40, the distance at which the sound alert 36 notified at the time of the low risk among the four stages is output, the distance at which the marker display 37 is output, and the distance at which the arrow mark 38 is displayed.
By sliding the screen of the information input/output unit 104 while touching it with a finger 41, the user can freely set the distance at which the sound alert 36 sounds, the distance at which the marker display 37 is performed, and the distance at which the arrow mark 38 is displayed. In this example, the arrow mark 38 is disabled on the setting screen and is therefore not displayed. By individually setting the warning output conditions in this way, it is possible to apply the setting desired by the user.
First, the arithmetic operation unit 106 of the vehicle 1 transmits driving data of the user identified by the user ID to the vehicle management cloud 10, during driving and/or after driving ends, via the communication unit 108 (step S301). The vehicle management cloud 10 receives the driving data of the user (step S302). The arithmetic operation unit 903 of the vehicle management cloud 10 acquires, from a safe driving function field of the received driving data, a first function currently used by the user as a safe driving function and a first setting value, which is a setting value of the first function (step S303). For example, the first function is the currently used sound alert 36.
The arithmetic operation unit 903 of the vehicle management cloud 10 determines, based on driving data of the user within a predetermined period, a second function required or recommended as a safe driving function to the user and a second setting value that is a setting recommendation value of the second function (step S304). For example, the second function means the sound alert 36 that is currently used and the marker display 37 that is not currently used but is recommended anew.
Then, the vehicle management cloud 10 acquires the difference between the first function and the second function and the difference between the first setting value and the second setting value (step S305). The vehicle management cloud 10 determines whether the first function and the second function differ or whether the difference between the first setting value and the second setting value is equal to or more than a predetermined value (step S306). In response to determining that the first function and the second function are the same and the difference between the first setting value and the second setting value is less than the predetermined value (step S306: No), the vehicle management cloud 10 ends the processing. In this case, the vehicle 1 also ends the processing without generating new processing.
When the first function and the second function are different or the difference between the first setting value and the second setting value is equal to or more than the predetermined value (step S306: Yes), the vehicle management cloud 10 transmits a notification for recommending the second function and the second setting value to the user to the vehicle 1 via the communication unit 901 (step S307).
The arithmetic operation unit 106 of the vehicle 1, which has received this notification via the communication unit 108, causes (the cockpit monitor or the like of) the information input/output unit 104 to display the received notification when the user having the user ID is driving or starts driving (step S308).
When the user does not approve this (step S309: No), the vehicle 1 ends the processing. In this case, the vehicle management cloud 10 also ends the processing without generating new processing. When the user approves (step S309: Yes), the arithmetic operation unit 106 of the vehicle 1 enables the second function and sets or updates the setting value of the second function to the second setting value (step S310).
When new billing processing is required for the change to the second function and the second setting value, the arithmetic operation unit 106 of the vehicle 1 requests the vehicle management cloud 10 to perform the billing processing (step S311) and ends the processing. The vehicle management cloud 10 receives the request via the communication unit 901, carries out the billing processing based on settlement information of the user registered in advance (step S312), and ends the processing.
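As a minimal sketch of the comparison in steps S303 to S307, assuming hypothetical function names and a single numeric setting value per function, the decision of whether to send a recommendation could look like the following; the actual data model is not specified in the present disclosure.

```python
# Sketch of steps S303-S307: compare the currently used safe driving
# function/setting with the required or recommended one and decide
# whether to notify the user. Names and values are illustrative assumptions.
SETTING_DIFF_THRESHOLD = 5.0  # the "predetermined value" for setting differences

def needs_recommendation(first_function: str, first_setting: float,
                         second_function: str, second_setting: float) -> bool:
    """Return True when a recommendation notification should be sent (step S306: Yes)."""
    if first_function != second_function:   # the functions differ
        return True
    # Functions match: recommend only if the setting gap is large enough.
    return abs(second_setting - first_setting) >= SETTING_DIFF_THRESHOLD

# Example: the sound alert is already in use, but the marker display is
# newly recommended, so a notification is sent to the vehicle (step S307).
if needs_recommendation("sound_alert", 30.0, "sound_alert+marker", 30.0):
    print("send recommendation of second function/setting to vehicle")
```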
In the vehicle control method according to the present embodiment, in response to determining that a necessary assist function is absent or insufficient among the driving assist functions, necessary function information indicating the necessary assist function is transmitted, via a first communication circuit (the communication unit 108), to a second computer (the vehicle management cloud 10) that manages distribution of driving assist applications. Then, in response to determining, based on the necessary function information, that the second computer has a first application (a second function) corresponding to the necessary assist function among the driving assist applications, recommendation information for recommending introduction of the first application into the vehicle is acquired from the second computer. Thereafter, a message recommending introduction of the first application into the vehicle is presented, based on the recommendation information, to the driver via a display or a speaker provided in the vehicle 1. When the driver agrees to the message, the first application is installed in the vehicle 1.
As described above, the required or recommended safe driving function can be enabled, or its setting value can be updated, based on the driving data. In order to use this safe driving function, the user pays a usage fee to the vehicle manufacturer as necessary. When the safe driving function of the vehicle 1 is implemented by software, it is conceivable that a new safe driving function is added after purchase of the vehicle or that an existing function is improved. In such a case, the mechanism described above is considered to work effectively.
In the processing described above, the vehicle management cloud 10 may transmit a change proposal for the safe driving function to the information terminal 2 and, when the user accepts the change proposal on the information terminal 2, the information terminal 2 may request the vehicle 1 to change the safe driving function.
If the enabling of the second function is an essential condition for driving, the user may be notified to that effect via the information input/output unit 104. The driving of the user may not be permitted unless the user's consent is obtained.
First, the arithmetic operation unit 106 of the vehicle 1 acquires a first function and a first setting value currently used by the user identified by the user ID (step S401). Further, the arithmetic operation unit 106 of the vehicle 1 determines, based on driving data of the user within a predetermined period, a second function and a second setting value that are necessary or recommended as the safe driving function (step S402). Further, the arithmetic operation unit 106 of the vehicle 1 acquires the difference between the functions and the difference between the setting values as described above (step S403).
Subsequently, the arithmetic operation unit 106 of the vehicle 1 determines whether the functions differ or whether the difference between the setting values is equal to or more than a predetermined value as described above (step S404). In the case of step S404: No, the arithmetic operation unit 106 ends the processing. In the case of step S404: Yes, the arithmetic operation unit 106 notifies the user, using the information input/output unit 104, of a recommendation to change to the second function and the second setting value (step S405).
When the user does not approve this (step S406: No), the arithmetic operation unit 106 ends the processing. When the user approves (step S406: Yes), the arithmetic operation unit 106 enables the second function and sets or updates the setting value to the second setting value (step S407). Then, the arithmetic operation unit 106 requests the vehicle management cloud 10 to perform the billing processing as necessary (step S408). The vehicle management cloud 10, which has received the request, carries out the settlement processing (step S409) and ends the processing.
In this example, the sound alert 36 and the marker display 37 are used as both the first function and the second function; the function itself is not changed, and only the setting value is updated based on the second setting value. As described above, the safe driving function of the vehicle 1 is updated based on the approval of the user, and safety is automatically improved based on the driving data. Further, the billing processing related thereto can also be smoothly implemented by the present disclosure.
First, the arithmetic operation unit 106 of the vehicle 1 detects the type, position, and speed of each dangerous object around the vehicle using a sensor (step S501). At the same time, the arithmetic operation unit 106 of the vehicle 1 calculates an attention region of the user during driving from the head position of the user (or, for example, the center position of both eyes) and a line-of-sight detection result.
The arithmetic operation unit 106 of the vehicle 1 then determines whether an object having the high risk described above is present (step S502).
When the object having the high risk is absent (step S502: No), the arithmetic operation unit 106 determines whether an object having a medium risk is present (step S503). When the object having the medium risk is present (step S503: Yes), the arithmetic operation unit 106 notifies the user of the imminent risk using the information input/output unit 104 (step S505). Thereafter, as described above, as long as the arithmetic operation unit 106 does not end the assist of the safe driving (step S510: No), the arithmetic operation unit 106 returns to the sensing of the periphery of the vehicle and the attention region of the user (step S501).
When the object having the medium risk is absent (step S503: No), the arithmetic operation unit 106 determines whether an object having a low risk is present (step S506). When the object having the low risk is present (step S506: Yes), the arithmetic operation unit 106 further determines whether the user is aware of the object having the low risk (step S507). In response to determining that the user is not aware of the object having the low risk (step S507: No), the arithmetic operation unit 106 performs notification using the information input/output unit 104 such that the user notices the object having the low risk (step S508). Thereafter, as described above, as long as the arithmetic operation unit 106 does not end the assist of the safe driving (step S510: No), the arithmetic operation unit 106 returns to the sensing of the periphery of the vehicle and the attention region of the user (step S501).
In response to determining that the user is aware of the object having the low risk (step S507: Yes) or that the object having the low risk is absent (step S506: No), there is no information to be notified to the user. For that reason, when notification has been performed using the information input/output unit 104, the arithmetic operation unit 106 stops the notification (step S509). Thereafter, as described above, as long as the arithmetic operation unit 106 does not end the assist of the safe driving (step S510: No), the arithmetic operation unit 106 returns to the sensing of the periphery of the vehicle and the attention region of the user (step S501).
As described above, in the present disclosure, when the risk is determined to be low (the low risk) and the user is determined not to be aware of the object having the low risk, the object having the low risk is notified to the user via the information input/output unit 104. However, when the risk is determined to be low but the user is determined to be already aware of the object having the low risk, the risk is not notified, unlike the case of the medium risk or the high risk.
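The branching of steps S501 to S510 described above can be summarized as follows in a minimal Python sketch. The sensing, notification, and awareness-check functions are placeholders, not actual interfaces of the vehicle 1, and the handling of the high-risk branch is assumed to be a notification as well.

```python
# Sketch of the assist loop in steps S501-S510: sense the surroundings,
# then notify according to the highest applicable risk tier. All
# callables are placeholder assumptions.
def assist_loop(sense, notify, stop_notification, awareness_check, should_end):
    while not should_end():                                 # step S510
        objects = sense()                                   # step S501
        if any(o.risk == "high" for o in objects):          # step S502
            notify("high")                                  # assumed high-risk notification
        elif any(o.risk == "medium" for o in objects):      # step S503
            notify("medium")                                # step S505
        elif any(o.risk == "low" for o in objects):         # step S506
            low = next(o for o in objects if o.risk == "low")
            if not awareness_check(low):                    # step S507
                notify("low")                               # step S508
            else:
                stop_notification()                         # step S509
        else:
            stop_notification()                             # step S509
```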
First, the arithmetic operation unit 106 of the vehicle 1 acquires, from the memory 107, a history of the position information (Ob) of the object having the low risk over a predetermined time (step S601). Note that it is assumed that, in the memory 107 of the vehicle 1, position information and speed information are sequentially updated for each dangerous object around the vehicle determined by the arithmetic operation unit 106 and that a history of at least the predetermined time is available.
Subsequently, the arithmetic operation unit 106 of the vehicle 1 acquires, from the memory 107, a history of the position information (Of) of the head of the user and of the line-of-sight direction information of the user over the predetermined time (step S602). It is assumed that, in the memory 107 of the vehicle 1, the position information of the head and the line-of-sight direction information determined by the arithmetic operation unit 106 are sequentially updated and that a history of at least the predetermined time is available.
Subsequently, the arithmetic operation unit 106 of the vehicle 1 generates, over the predetermined time, unit vectors (Vb) directed from Of to Ob using the acquired position information (Of) of the user's head and the acquired position information (Ob) of the object having the low risk (step S603).
Subsequently, the arithmetic operation unit 106 of the vehicle 1 generates, over the predetermined time, unit vectors (Ve) extending from Of along the line-of-sight direction, using the acquired position information (Of) of the user's head and the acquired line-of-sight direction information (step S604).
Subsequently, the arithmetic operation unit 106 of the vehicle 1 calculates the angles formed by the Vb vectors and the Ve vectors within the predetermined time and derives the smallest angle among them (the minimum angle between the vectors) (step S605). The arithmetic operation unit 106 of the vehicle 1 determines whether this minimum angle is equal to or less than a predetermined value (step S606).
When the angle is equal to or less than the predetermined value (step S606: Yes), the arithmetic operation unit 106 determines that the user is aware of (has checked) the object having the low risk within the predetermined time (step S607) and ends the processing. On the other hand, when the angle is not equal to or less than the predetermined value (step S606: No), the arithmetic operation unit 106 determines that the user is not aware of (has not checked) the object having the low risk within the predetermined time (step S608) and ends the processing.
As described above, when the line-of-sight direction of the user has come sufficiently close to the direction of the object having the low risk within the most recent predetermined time, the user is determined to have checked the object; otherwise, the user is determined not to have checked the object.
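As a minimal sketch of steps S601 to S608, assuming the histories are given as arrays of 3D coordinates and that an angle threshold of 10 degrees stands in for the predetermined value, the determination could be written as follows (numpy is used only for brevity; the disclosure does not specify a library).

```python
# Sketch of steps S601-S608: over the most recent predetermined time,
# compare the gaze direction Ve with the direction Vb from the head
# position Of toward the object position Ob; if the smallest angle is
# at or below a threshold, the user is judged to have checked the object.
import numpy as np

ANGLE_THRESHOLD_DEG = 10.0  # the "predetermined value" (assumed)

def has_checked(object_positions, head_positions, gaze_dirs) -> bool:
    min_angle = float("inf")
    for ob, of, ve in zip(object_positions, head_positions, gaze_dirs):
        vb = (ob - of) / np.linalg.norm(ob - of)   # unit vector Of -> Ob (step S603)
        ve = ve / np.linalg.norm(ve)               # unit gaze vector Ve (step S604)
        ang = np.degrees(np.arccos(np.clip(np.dot(vb, ve), -1.0, 1.0)))
        min_angle = min(min_angle, ang)            # minimum angle (step S605)
    return min_angle <= ANGLE_THRESHOLD_DEG        # steps S606-S608
```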
Note that, here, whether the user has visually recognized the object having the low risk is checked using the line-of-sight direction. However, the present disclosure is not limited to this. For example, line-of-sight detection may be performed for each eye, and whether a dangerous object has been visually recognized may be determined using the focal length of the eyes and the like. In this case, there is an advantage that, even if the user is looking in the direction of the object having the low risk, it can be determined that the eyes are focused on the windshield 24 and the object having the low risk is not actually seen.
A temporal contrast difference outside the vehicle may also be measured by a sensor of the vehicle 1. When a sudden contrast difference equal to or larger than a predetermined amount occurs in a short time, there is a possibility that, even if the user is looking in the direction of an object around the vehicle, the user cannot visually recognize the object. Therefore, in this case, it may be determined that the user is not aware of the object. This is likely to occur near the entrance and the exit of a tunnel. Further, the pupil's contraction adjustment function may have deteriorated because of aging. For that reason, when there is a sudden contrast difference in a short time and/or when the user is old, it may be determined that the user cannot visually recognize the object around the vehicle even if the line of sight 35 is directed at it (or the probability of such a determination may be increased).
First, at time t0 (t = t0), Ob, at the center of the front of the bicycle 12, is present at the position Ob (t0). For that reason, when the center position of the face of the driver is represented as Of, the driver needs to direct the line of sight 35 in the direction from Of to Ob (t0) in order to check the bicycle 12. The unit vector in this direction is represented as Vb (t0). The Vb (t0) vector can be calculated by acquiring the relative position of the bicycle 12 from the vehicle 1 and the center position of the face of the driver using sensor information of the vehicle 1 at time t0.
The driver is checking the rearview mirror 26 at time t0. The line of sight 35 of the driver at this time is represented by the unit vector Ve (t0). The Ve (t0) vector can be calculated by acquiring the center position of the face of the driver and the line-of-sight direction information of the driver using the sensor information of the vehicle 1 at time t0.
At time t0, the direction the driver is viewing is the direction of the Ve (t0) vector, and the direction the driver should view to check the bicycle is the direction of the Vb (t0) vector. The angle formed by the two vectors is represented as Ang (t0).
Similarly, at time t1 (t = t1) and time t2 (t = t2), the bicycle 12 continues to move forward toward the front of the vehicle 1 and the risk increases. Meanwhile, the driver is checking the preceding vehicle 13. The angle formed by the Vb (t1) vector and the Ve (t1) vector is represented as Ang (t1), and the angle formed by the Vb (t2) vector and the Ve (t2) vector is represented as Ang (t2). These are calculated in the same manner as Ang (t0).
When three measurement points are taken between t0 and t2, the angular differences between the direction viewed by the driver and the direction of the bicycle 12 to be viewed are Ang (t0), Ang (t1), and Ang (t2) at times t0, t1, and t2, respectively. Therefore, whether the driver has checked the bicycle 12 within the time from t0 to t2 is determined as yes if at least one of Ang (t0), Ang (t1), and Ang (t2) is smaller than a predetermined angle, and as no if none of them is smaller than the predetermined angle.
As described above, the position detection and the line-of-sight detection may be performed for each eye and the determination may be performed by further using the focal length and the like. Moreover, it may be determined whether the visual recognition has been performed taking into account a temporal change in the contrast difference around the vehicle and/or the age of the driver.
First, the arithmetic operation unit 106 of the vehicle 1 acquires the position information (Ob) of the object having the low risk and the position information (Of) of the head of the user (step S701). The position information acquired in step S701 is used in the subsequent processing. The arithmetic operation unit 106 then checks whether the sound alert 36 is enabled (step S702).
When the sound alert 36 is enabled (step S702: Yes), the arithmetic operation unit 106 of the vehicle 1 calculates position information (Pa) of a point that is located on a line connecting the position information (Ob) of the object and the position information (Of) of the head of the user and is located away from the position information (Of) of the head by a predetermined distance (step S703).
Then, the arithmetic operation unit 106 of the vehicle 1 sets the point as a virtual sound source position such that the user can hear the sound alert 36 from the position information (Pa) and then outputs a sound signal using the information input/output unit 104 (step S704).
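As a minimal sketch of the geometry in steps S701 to S704 (the numeric positions and the 2 m distance are illustrative assumptions), the virtual sound source position Pa could be computed as follows; the virtual display position Pv of steps S706 and S707, described below, uses the same construction.

```python
# Sketch of steps S703-S704: place the virtual sound source Pa on the
# line from the head position Of toward the object position Ob, at a
# predetermined distance from Of. The same geometry applies to the
# marker display position Pv.
import numpy as np

def virtual_source_position(of: np.ndarray, ob: np.ndarray,
                            distance: float) -> np.ndarray:
    """Point at `distance` from Of along the direction Of -> Ob."""
    direction = (ob - of) / np.linalg.norm(ob - of)
    return of + distance * direction

of = np.array([0.0, 1.2, 0.0])      # driver head position (example values)
ob = np.array([4.0, 1.0, 12.0])     # low-risk object position (example values)
pa = virtual_source_position(of, ob, 2.0)  # Pa, 2 m from the head toward the object
```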
Note that the position of the information input/output unit 104 in the vehicle 1 (for example, an acoustic system including a plurality of speakers 28 embedded around the cockpit 17, in a headrest of a seat, or the like) is known in advance. A channel-based scheme may be used, in which the audio signal output from the speakers 28 is determined in advance for each piece of position information (Pa) of the virtual sound source, or an object-based scheme may be used, in which the position information (Pa) of the virtual sound source is arranged in a three-dimensional space and audio signal processing simulating the sound reaching the user's ears is sequentially performed. In the present disclosure, the method for implementing the 3D stereophonic sound is not limited.
Note that, in the present disclosure, the position information (Pa) of the virtual sound source is not limited to the above and may be on a virtual three-dimensional line connecting any position in the dangerous region and the position information (Of) of the head of the user.
When the sound alert 36 is not enabled (step S702: No) or when the processing of the sound alert 36 ends (step S704), subsequently, the arithmetic operation unit 106 of the vehicle 1 checks whether the marker display 37 is enabled (step S705).
When the marker display 37 is enabled (step S705: Yes), the arithmetic operation unit 106 of the vehicle 1 calculates position information (Pv) of a point that is located on a line connecting the position information (Ob) of the object and the position information (Of) of the head of the user and is located away from the position information (Of) of the head by a predetermined distance (step S706).
Then, the arithmetic operation unit 106 of the vehicle 1 sets the point as a virtual display position (an image forming position of a marker display video) such that the user can visually recognize the marker display 37 at the position information (Pv), and outputs a video signal using the information input/output unit 104 (step S707).
Note that, in the present disclosure, the position information (Pv) of the marker display is not limited to the above and may be on a virtual three-dimensional line connecting any position in the dangerous region and the position information (Of) of the head of the user.
Note that (the space video projection device 27 of) the information input/output unit 104 of the vehicle 1 may be configured from a plurality of video projection devices such that a video can be displayed at a wide angle toward the window on the front side in the traveling direction, where the safety confirmation range of the user lies, or the video may be displayed by a single video projection device.
A viewpoint video of the vehicle 1 from the outside (for example, from a rear upper part or directly above) may be generated by the arithmetic operation unit 106 in real time from sensor data and displayed on the information input/output unit 104 present around the driver's seat (for example, a small head-up display provided in front of the driver's seat or a monitor that is a part of the cockpit).
When the marker display 37 is not enabled (step S705: No) or when the processing of the marker display 37 ends (step S707), the arithmetic operation unit 106 ends the processing. Note that, although description of the display processing of the arrow mark 38 is omitted here, the processing may be performed in the same manner as the marker display 37, with the exception that the display position is near the current attention region of the user (on an extension line of the Ve vector).
Note that, here, for convenience of description, the processing is described as outputting the sound alert 36 and then outputting the marker display 37. However, the present disclosure is not limited to this, and these kinds of processing may be performed in parallel or may be executed sequentially in an order different from the order described above.
An information presentation method of the present embodiment is an information presentation method for presenting information to a driver of a first vehicle (the vehicle 1), the method including detecting, via at least one first sensor (the sensor unit 103) that senses the outside of the first vehicle, one or more objects located in front of the first vehicle, determining a risk of each of the one or more objects, and outputting an alert sound for alerting a first dangerous object via one or more speakers (the information input/output unit 104) provided in an interior of the first vehicle in response to determining, based on the risk of each of the one or more objects, that the first dangerous object having a first risk exceeding a predetermined level is present among the one or more objects. The alert sound is presented to the driver as a sound image localized at a first sound image position between the driver and the first dangerous object, the risk relates to a future collision risk between each of the one or more objects and the first vehicle, and, when the risk of the first dangerous object rises from the first risk to a second risk, the sound image position of the sound image is changed from the first sound image position to a second sound image position, and the second sound image position is a position between the driver and the first dangerous object and closer to the driver than the first sound image position.
The first sound image position is a position (a virtual sound source position) on an imaginary line extending from the pupil of the driver to the first dangerous object or a position away from the imaginary line by a first distance. The second sound image position is a position on the imaginary line or a position away from the imaginary line by a second distance. Each of the first sound image position and the second sound image position is a position on the imaginary line. The sound image moves on the imaginary line when the sound image position of the sound image is changed. Further, when the first dangerous object is determined to be present, a first marker (marker display) for alerting the first dangerous object to the driver is displayed on a head-up display mounted on the first vehicle, and the first marker is presented to the driver as a virtual image formed between the driver and the first dangerous object.
The first marker is presented to the driver as a virtual image overlapping with the first dangerous object. In addition, the first marker is presented to the driver as a virtual image formed at a position on an imaginary line extending from the pupil of the driver to the first dangerous object or at a position away from the imaginary line by a predetermined distance. Further, an image forming position of the first marker moves along an imaginary line extending from the pupil of the driver to the first dangerous object.
First, the virtual sound source position of the sound alert 36 is arranged closer to the user as the risk increases, at Pa (t0), Pa (t1), and Pa (t2) as time elapses through times t0, t1, and t2. This causes the user to perceive the sound alert 36 as if it gradually moves from a far place to a near place. Accordingly, it is possible to naturally inform the user that the target object around the vehicle is approaching the vehicle 1 and that the risk is increasing.
Note that bringing the virtual sound source position of the sound alert 36 closer as the risk increases is an example, and the present disclosure is not limited to this. A fixed value may be adopted as the distance between the position information (Pa) of the virtual sound source and the center position information (Of) of the face of the user, and the fixed value may be adjusted according to the preference of the user or adjusted from the driving data by the arithmetic operation unit 106 (or the arithmetic operation unit 903) of the vehicle 1 (or the vehicle management cloud 10).
Note that the volume of the sound alert 36 may be increased in accordance with the determination result of the risk. It is conceivable that being warned with gradually louder sound has the advantage of allowing the user to recognize the rising risk more easily.
Note that the sound of the sound alert 36 may be changed in accordance with the determination result of the risk. For example, the sound alert 36 may sound “pi” at time t0 when the risk is low, may sound “pipi” at time t1 when the risk is slightly increased, and may sound “pipipi” at time t2 when the risk is further increased. By changing the sound to be output in this way according to the determination result of the risk, it is possible to more plainly notify the user of the risk.
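Combining the distance, volume, and sound pattern described above, a minimal sketch of a risk-to-alert mapping could look like the following; the concrete distances, volumes, and patterns are illustrative assumptions, not values prescribed by the present disclosure.

```python
# Sketch of the escalation described above: as the determined risk rises,
# the virtual sound source distance shrinks while the volume grows and
# the alert pattern lengthens. All concrete values are assumptions.
RISK_PROFILE = {
    #  risk     (distance from driver [m], volume [0..1], pattern)
    "low":    (3.0, 0.3, "pi"),
    "medium": (2.0, 0.6, "pipi"),
    "high":   (1.0, 0.9, "pipipi"),
}

def alert_parameters(risk: str):
    """Return (distance, volume, pattern) for the given risk tier."""
    distance, volume, pattern = RISK_PROFILE[risk]
    return distance, volume, pattern
```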
The virtual display position of the marker display 37 is arranged closer to the target object as the risk increases, at Pv (t0), Pv (t1), and Pv (t2) as time elapses through times t0, t1, and t2. This reduces the difference between the focal length when the user views the marker display 37 and the focal length when the user views the target object, so the amount of change in focal length is small and the target object can be visually recognized quickly. Accordingly, by following the marker display 37 with the eyes, the user is naturally and plainly informed of the direction in which the target object around the vehicle is located.
Note that moving the virtual display position of the marker display 37 away from the user toward the target object is an example, and the present disclosure is not limited to this. A fixed value may be adopted as the distance between the position information (Pv) of the virtual display and the center position information (Of) of the face of the user. The fixed value may be adjusted by the user in accordance with his/her preference or adjusted from the driving data by the arithmetic operation unit 106 (or the arithmetic operation unit 903) of the vehicle 1 (or the vehicle management cloud 10). Alternatively, the virtual display at the position information (Pv) may be superimposed on the real image around the position of the target object in the three-dimensional space.
Note that at least one of the pattern, color, size, and clarity of the marker display 37 may be changed according to the determination result of the risk. For example, the marker display 37 may be made larger as the risk increases, or may be displayed with a color or clarity (opacity) that can be visually recognized more clearly. It is conceivable that this has the advantage of allowing the user to recognize the risk more easily.
Adjusted light for the marker 121 is output from an optical system in the information input/output unit 104 and is reflected by the windshield 24 to reach the eyes of the driver, whereby an image is formed and visually recognized. In this example, the information input/output unit 104 performs control such that the image of the marker 121 at each time is formed on a virtual image plane (the virtual image plane 122) perpendicular to the road (the ground 124), that is, along the gravity direction.
As described above, as the risk increases with the elapse of time through times t0, t1, and t2, the marker 121 moves from the driver side toward the target object 123 (the dangerous region) on a line connecting the driver and the target object 123 and is displayed at positions Pv (t0), Pv (t1), and Pv (t2). The marker 121 may also be moved closer to the target object side not only as time changes but also when the determination result of the risk increases. The driver is considered to have the advantage of a video experience in which the marker 121 appears to fly toward the target object 123, so the line of sight of the driver is easily guided (attention is easily drawn).
Note that, in this example, the image forming position moves from the driver side to the target object side on the line connecting the driver and the target object 123 (the dangerous region). However, the present disclosure is not limited to this. The optical system of the information input/output unit 104 may be controlled such that the image forming position (the virtual image plane 126) moves on the line on the ground obtained by perpendicularly projecting the line connecting the driver and the target object 123 onto the ground 124. In this case, the driver has a video experience in which the marker 125 appears to approach the target object 123 while crawling on the ground 124 from the driver's side, which is considered to easily guide the line of sight.
Display positions are controlled by the information input/output unit 104 such that all pieces of position information Pv (t0), Pv (t1), and Pv (t2), which are the display position information of a marker 127, are included inside the cone.
Naturally, the display position Pv of the marker 127 may be set on an imaginary line connecting the head of the driver and one point in the dangerous object (or the dangerous region 129) in accordance with the risk. The same applies to the case of a horizontal virtual image plane.
Similarly, display positions are controlled by the information input/output unit 104 of the vehicle 1 such that all pieces of position information Pv (t0), Pv (t1), and Pv (t2), which are the display position information of a marker 131, are included inside the square pyramid.
In this case, there is the same advantage as described above.
On the other hand, the bicycle (the target object 123) far from the driver is present on a slope of the ground 124. Therefore, when the marker 133 is displayed on the ground 124 near the bicycle, the virtual image plane 134 is arranged along the slope so as to be parallel to the tilt and the unevenness of the ground 124 near the bicycle. When markers are displayed in this way, by arranging each virtual image plane along the tilt or the unevenness of the ground near the target object of the marker, the respective markers for the respective target objects can be visually recognized by the driver without discomfort.
As described above, when a plurality of markers (the marker 133 and the marker 137) are simultaneously displayed in parallel to the ground 124 or in the direction perpendicular to the gravity direction, the optical system of the information input/output unit 104 is controlled such that, for each of the markers, the tilts of the virtual image planes (the virtual image plane 134 and the virtual image plane 138) are matched with the tilt of the ground on which the markers are displayed. Note that partial tilt and unevenness of the ground 124 may be acquired from high-resolution digital map information or may be measured in real time using a 3D space measurement technology by a sensor (a LiDAR, a camera, or the like).
Further, there is a case in which a part or the whole of a dangerous object is included in a blind spot region that cannot be seen from the driver. In such a case, it is difficult to plainly notify the driver of the dangerous object even if the marker is displayed immediately in front of the target object as described above.
For that reason, when a part or the whole of the target object 139 (the dangerous object or the dangerous region) of the marker display is included in the blind spot region 140 from the driver, it is conceivable to arrange a perpendicular virtual image plane 142, aligned with the gravity direction, at a position on the driver side of the target object 139 and to display a marker 141 at a position on the virtual image plane that is close to the target object 139 and easily visually recognized by the driver.
In some cases, it is difficult to detect a dangerous object present in the blind spot region 140 only with the sensor unit 103 mounted on the vehicle 1, such as a LiDAR or a camera. For that reason, concerning a situation or a dangerous object around the vehicle that cannot be detected by an in-vehicle sensor, data detected by a sensor mounted on another nearby vehicle (V2V communication or the like), nearby road equipment (V2I communication or the like), an artificial satellite, or the like, or information about the situation around the own vehicle constructed in a cyberspace from such data, may be transmitted by a computer on these kinds of equipment or on a network and received by the communication unit 108 of the vehicle 1. The arithmetic operation unit 106 may thereby acquire or determine the position of the dangerous object around the own vehicle. Naturally, information about the type and the moving speed vector of the dangerous object may be received at the same time.
On the other hand, for a vehicle in the blind spot region from the driver, the perpendicular virtual image plane 142 present immediately in front of the target object, as described above, may be used.
The distance 143 to the marker 141 visually recognized by the driver is the same as the distance indicated as the horizontal distance to the virtual image plane 142 described above.
First, in an upper left part 821 and an upper center part 822, a dangerous region having a constant or higher risk is detected in front of the vehicle.
In an upper right part 823, display of an animation marker 381 for drawing the driver's attention to the dangerous region is started and, at the same time, the sound alert 36 is output from a virtual sound source position set in the direction of the dangerous region. The animation marker 381 is an arc-shaped marker centered on the dangerous region (in this example, the bicycle 12) and, when it appears, is displayed with a large radius that greatly exceeds the dangerous region.
In a left middle part 824, the radius of the animation marker 381 slightly decreases and a slightly smaller arc-shaped marker is displayed centering on the dangerous region. The driver notices the sound alert 36 and/or this animation marker 381.
In a center middle part 825, the radius of the animation marker 381 further decreases and a smaller arc-shaped marker is displayed centering on the dangerous region. The driver learns that the dangerous region lies in the direction in which the sound alert 36 sounds (the direction of the virtual sound source position) and/or in the center direction of the animation marker 381, and starts to check the region (direction) to which the driver is guided.
In a middle right part 826, the animation marker 381 changes to its final form and is reduced to a minimum size centered on the dangerous region. The required time from the upper left part 821 to the middle right part 826 is an instant (as an example, within 0.5 seconds, within 5 seconds, or the like).
In a lower left part 827, a lower center part 828, and a lower right part 829, marker display indicating the position of the dangerous region is performed. This marker display is a marker 382 that, unlike the animation marker, is not animated at a size deviating from the dangerous region but continues to be displayed while keeping a constant relative position to the dangerous region, such that the dangerous object in the target dangerous region is easily visually recognized and easily kept in view. Thereafter, the flow is the same as the flow described above.
In this display example, the line-of-sight direction of the driver is not used. Therefore, by performing a notification starting with the animation marker 381 or the sound alert 36 for a detected dangerous object having a constant or higher risk, it is possible to assist safe driving more easily and inexpensively, without providing equipment for detecting the line-of-sight direction (the line of sight 35) of the driver and without displaying an arrow mark.
Note that, in the above description, the radius of the animation marker 381 is reduced with time. However, the present disclosure is not limited to this. The arc-shaped animation marker 381 may not only be reduced in radius with time but may also be changed in color or pattern, or a sound effect may be sounded simultaneously.
For example, various display forms are conceivable, such as a pattern in which, while the radius is large, the arc is thin, the radius shrinks quickly, the color is light red, and a long tail is drawn, and, as the radius decreases, the arc becomes thicker, the radius shrinks more slowly, the color becomes dark red, and the tail becomes shorter.
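As a minimal sketch of such a shrinking-arc animation, assuming a 0.5-second duration and illustrative radius, thickness, and color values, one frame of the animation marker could be parameterized as follows.

```python
# Sketch of the shrinking arc animation: the radius is interpolated from
# an initial large value down to the final size within a fixed duration,
# and appearance parameters are tied to the same progress value.
# Duration, radii, and colors are illustrative assumptions.
def animation_frame(elapsed: float, duration: float = 0.5,
                    r_start: float = 3.0, r_end: float = 0.5):
    p = min(elapsed / duration, 1.0)           # animation progress, 0..1
    radius = r_start + (r_end - r_start) * p   # radius shrinks over time
    thickness = 1.0 + 3.0 * p                  # arc thickens as it shrinks
    color = (1.0, 0.6 * (1.0 - p), 0.6 * (1.0 - p))  # light red -> dark red
    return radius, thickness, color
```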
Subsequently, when the condition for marker display is satisfied (step S712: Yes) (for example, in the case of the low risk or higher), the arc-shaped animation marker 381 is displayed centered on the dangerous region while its radius is reduced with time.
Note that the arc-shaped animation marker may be displayed while one or more of its thickness, color, and pattern are simultaneously changed as described above. The marker may also change along the time axis instead of being displayed with a fixed color, pattern, or form.
Accordingly, the animation marker, which is displayed over a range exceeding the dangerous region, appears instantaneously. Therefore, it is possible to notify the driver of the dangerous object in a form the driver easily notices, without the trouble of detecting the line of sight of the driver.
At time t0, the bicycle 12 on the front right side is still far from the vehicle 1 and the risk is determined as “low”. In order to notify the driver of this, the arithmetic operation unit 106 of the vehicle 1 calculates a quadrangular pyramid that includes, for example, the head of the driver and a dangerous region 373 (a quadrangular region), arranges a virtual sound source position Pa (t0) inside the quadrangular pyramid, and outputs a sound alert 374, “pi”, at a small volume.
With this sound alert, the driver can intuitively sense, without delay and only with the sense of hearing, that a dangerous object with a “low” risk is present on the front right side of the vehicle. In other words, the direction of the dangerous object and its risk can be communicated to the driver irrespective of the line of sight of the driver.
At t1, when a little time has elapsed from time t0, the bicycle 12 is closer to the traveling route of the own vehicle and the risk is determined as “medium”. The arithmetic operation unit 106 of the vehicle 1 acquires the latest head position and the dangerous region 373 and updates the quadrangular pyramid in the three-dimensional space. Then, the arithmetic operation unit 106 arranges the virtual sound source position Pa (t1) inside the quadrangular pyramid. At this time, since the risk is “medium”, the arithmetic operation unit 106 arranges Pa (t1) at a distance closer to the driver than Pa (t0), which corresponds to the risk “low”. The sound alert 375 is “pipi”, a stronger warning than when the risk is “low”, and is output at a slightly larger, medium volume.
Further, since Pa (t1) is closer to the driver than Pa (t0), the driver can intuitively feel that the dangerous object is approaching. With this sound alert 375, or from the difference from the sound alert 374 output when the risk was “low” at time t0, the driver can sense that a dangerous object having the risk “medium” is present on the front right side of the own vehicle.
Further, at t2, when a little more time has elapsed from time t1, the bicycle 12 is even closer to the traveling route of the own vehicle and the risk is determined as “high”. The arithmetic operation unit 106 of the vehicle 1 acquires the latest head position and the dangerous region 373 and updates the quadrangular pyramid in the three-dimensional space. Then, the arithmetic operation unit 106 arranges the virtual sound source position Pa (t2) inside the quadrangular pyramid. At this time, since the risk is “high”, the arithmetic operation unit 106 arranges Pa (t2) at a distance closer to the driver than Pa (t1), which corresponds to “medium”. The sound alert 376 is “pipii”, a stronger warning than when the risk is “medium”, and is output at an even larger volume.
With the sound alert 376, or from the difference from the sound alert 374 and the sound alert 375 output when the risk was “low” and “medium” at times t0 and t1, the driver can sense that a dangerous object having the risk “high” is present on the front right side of the own vehicle. Since Pa (t2) is closer to the driver than Pa (t1) and Pa (t0), the driver can intuitively feel that the dangerous object is approaching with the elapse of time, that is, that it is dangerous.
Next, an example is described in which a semi-arc marker corresponding to the risk is displayed for another vehicle 13 that is about to merge.
At time t0, the other vehicle 13 is about to merge from the front left side while the own vehicle travels straight. At this time, although the own vehicle is traveling straight, the other vehicle 13 is stopped without obstructing the traveling lane, and the risk is set to “low”. While the risk is “low”, a green semi-arc marker 377 is displayed for the other vehicle on the front left side, which is the target object, on the front side surface of the other vehicle in parallel to the ground.
Accordingly, the driver can easily determine that, although the other vehicle is present on the front left side, its risk is low, and the driver can drive safely. The stop of the other vehicle 13 on the front left side may be determined by detecting a temporal change in the position of the other vehicle 13 with the sensor unit 103 (the LiDAR, the camera, or the like) mounted on the own vehicle, or a radio signal transmitted from the other vehicle 13 may be acquired by the communication unit 108 of the vehicle 1 and the speed of 0 km/h may be received as a part of driving control information.
At t1, when a little time has elapsed from time t0, it is detected that the own vehicle continues traveling straight but the other vehicle 13 on the front left side has also started moving forward. In this state, there is a risk of collision, and the risk rises to “medium”. While the risk is “medium”, an orange semi-arc marker 378 is displayed for the other vehicle 13 on the front side surface of the other vehicle 13, facing the front direction, in parallel to the ground.
In order to indicate that the other vehicle 13 is moving forward, the semi-arc marker 378 is drawn with its arc covering the traveling direction side (the front side in the drawing) of the other vehicle 13 and visually informs the driver that the other vehicle 13 is moving forward.
This is merely an example, and any one or more of the shape, color, pattern, size, luminance, and blinking period of a marker may be changed to distinguish and display the state of the target object of the marker (for example, whether the target object is moving forward, moving backward, stopped, turning right, or the like).
In general, it is difficult for the driver to discriminate whether the other vehicle 13 is about to enter at a merging point or is waiting for the own vehicle to pass. If a temporal change in the position of the other vehicle 13 can be measured, or if the driving control information can be acquired from the other vehicle 13 via wireless communication (the communication unit 108 of the vehicle 1), the situation can be conveyed to the driver using video and sound without delay and in a form easy to understand. Therefore, it can be expected to contribute to safe driving.
At t2, when a little more time has elapsed from time t1, it is detected that the own vehicle continues traveling straight but the other vehicle 13 on the front left side is also about to move forward and merge. In this state, the risk of collision is high, and the risk rises to “high”. While the risk is “high”, a red semi-arc marker 379 is displayed for the other vehicle on the front side surface of the other vehicle 13, facing the front direction, in parallel to the ground. In order to indicate that the other vehicle 13 is moving forward at higher speed, the semi-arc marker 379 is drawn with its arc wider on the traveling direction side (the front side in this drawing) of the other vehicle 13 and visually informs the driver that the other vehicle 13 is moving forward. Further, a marker such as an arrow 380 indicating an expected traveling route of the other vehicle 13 may be displayed in order to plainly indicate to the driver that the other vehicle 13 is about to merge.
A temporal movement amount of the other vehicle 13 or the steering angle of its tires may be detected by the sensor unit 103 of the vehicle 1, and the expected traveling route may be derived by the arithmetic operation unit 106 and displayed by the information input/output unit 104. Alternatively, the expected traveling route may be acquired as driving control information from the other vehicle 13 via wireless communication (the communication unit 108 of the vehicle 1). Since the expected traveling route is displayed superimposed on the real world, it is possible to visually understand, for example, whether the other vehicle 13 is stopped, whether it is increasing or reducing speed, and in which direction it is about to travel. If how the other vehicle 13 moves or does not move is plainly superimposed directly on the driver's vision using the head-up display or the like as described above, it can be expected to greatly contribute to safe driving.
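As one conceivable way to derive an expected traveling route from the observed speed and steering angle, a minimal sketch using a simple kinematic bicycle model is shown below; the model choice, wheelbase, and step values are assumptions, and the present disclosure does not prescribe a derivation method.

```python
# Sketch of deriving an expected traveling route of the other vehicle
# from its observed speed and steering (tire) angle, using a kinematic
# bicycle model. Wheelbase and step values are illustrative assumptions.
import math

def predict_route(x, y, heading, speed, steer_angle,
                  wheelbase=2.7, dt=0.1, steps=20):
    """Return a list of (x, y) points along the expected route."""
    route = []
    for _ in range(steps):
        x += speed * math.cos(heading) * dt        # advance position
        y += speed * math.sin(heading) * dt
        heading += speed / wheelbase * math.tan(steer_angle) * dt  # turn
        route.append((x, y))
    return route

# Example: a vehicle at the origin heading along +x at 3 m/s with a
# slight left steer; the returned points could be rendered as arrow 380.
points = predict_route(0.0, 0.0, 0.0, 3.0, 0.1)
```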
Note that it is also conceivable to read the driver's driving intention, such as the expected traveling route and whether to merge or to continue stopping and wait for passage, from the operation situation of one or more of the steering wheel, the accelerator, and the brake, and to control a video on an outward display panel of the own vehicle or project the video on the ground near the own vehicle. When such outward display of the intention is performed, it is easier for the driver of an oncoming vehicle to drive smoothly and safely, and a reduction in accidents can be expected.
Note that such an expected traveling route and the current driving intention can be read more reliably in an automatic driving vehicle. When an unmanned automatic driving vehicle travels while mixed with manned vehicles, if the current driving intention of the automatic driving vehicle (for example, to decelerate, accelerate, stop, halt, merge immediately, or merge after another vehicle passes) can be presented to a driver by the method described above, it is considered possible to implement driving intention confirmation between the automatic driving vehicle and the drivers of manned vehicles efficiently and in a unified form.
If an object satisfying the condition of the high risk is absent around the vehicle, the processing proceeds to No (step S722: No) and the arithmetic operation unit 106 of the vehicle 1 checks whether an object satisfying the condition of the medium risk is present around the vehicle (step S724). If an object satisfying the condition of the medium risk is present around the vehicle, the processing proceeds to Yes (step S724: Yes) and the information input/output unit 104 of the vehicle 1 displays the corresponding markers in a second shape, color, and pattern (step S725). The information input/output unit 104 may simultaneously notify the driver of the presence of the object with a second sound signal, volume, and virtual sound source position.
If an object satisfying the condition of the medium risk is absent around the vehicle, the processing proceeds to No (step S724: No) and the arithmetic operation unit 106 of the vehicle 1 checks whether an object satisfying the condition of the low risk is present around the vehicle (step S726). If an object satisfying the condition of the low risk is present around the vehicle, the processing proceeds to Yes (step S726: Yes) and the information input/output unit 104 of the vehicle 1 displays the corresponding markers in a third shape, color, and pattern (step S727). The driver may simultaneously be notified with a third sound signal, volume, and virtual sound source position.
If an object satisfying the condition of the low risk is absent around the vehicle, the processing proceeds to No (step S726: No) and the information input/output unit 104 of the vehicle 1 stops displaying the markers because no dangerous object requiring attention with a marker is present (step S728). The information input/output unit 104 stops the sound alert as well. Upon ending the assist of the safe driving, the arithmetic operation unit 106 of the vehicle 1 ends the processing (step S729: Yes). When not ending the processing (step S729: No), the arithmetic operation unit 106 of the vehicle 1 returns to the beginning (step S721) and resumes the loop of sensing the periphery of the vehicle with the sensor unit 103.
Note that, although only the shapes and colors of the markers are described here, when the sound alert is used simultaneously, audio signals, volumes, and virtual sound source positions of different sound alerts may be used in accordance with the respective risks, as described above.
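As a minimal sketch of the tier selection in steps S722 to S727, assuming illustrative shapes and colors (the green, orange, and red semi-arc markers described above are one possibility), the style lookup could look like the following.

```python
# Sketch of steps S722-S727: each risk tier maps to its own marker
# shape/color/pattern. The concrete styles are illustrative assumptions.
MARKER_STYLES = {
    "high":   {"shape": "semi-arc", "color": "red",    "pattern": "wide"},
    "medium": {"shape": "semi-arc", "color": "orange", "pattern": "solid"},
    "low":    {"shape": "semi-arc", "color": "green",  "pattern": "solid"},
}

def select_marker_style(objects):
    """Return the style for the highest risk present, or None (step S728)."""
    for tier in ("high", "medium", "low"):   # steps S722, S724, S726
        if any(o.risk == tier for o in objects):
            return MARKER_STYLES[tier]
    return None                              # no dangerous object: stop display
```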
The processing between the insurance company cloud 11 and the PDS 9A is described below along a flowchart.
First, the arithmetic operation unit 903 of the insurance company cloud 11 (or an application operating therein) requests, via the communication unit 901, the PDS 9A used by the user to provide driving-related information of the user identified by a user ID over a predetermined period (for example, the past one month or one year) (step S801). The request includes the user ID, requested data type information (driving data and information about the driver's license (an expiration date, drivable vehicle types, driving conditions, and the like)), the period of the requested data (for example, the past one year), and requester identification information (information for identifying the insurance company) for identifying the requesting corporation or organization.
The arithmetic operation unit 903 of the PDS 9A, which has received the request via the communication unit 901, collates the request against a database recorded in the memory 902 and checks whether the user having the user ID has permitted the insurance company to use the driving-related information (step S802). When the user has not permitted the use (step S803: No), the arithmetic operation unit 903 transmits a message proposing to permit the use to the information terminal 2 of the user via the communication unit 901 (step S804).
The arithmetic operation unit 204 of the information terminal 2, which has received the message via the communication unit 206, displays the message to the user using (the display of) the information input/output unit 202 (step S805) and urges the user to permit the insurance company to use the driving-related information (the driving data and the information about the driver's license) (step S806).
When the user permits the use (step S806: Yes), a response to that effect is returned to the PDS 9A via the communication unit 206 of the information terminal 2. The arithmetic operation unit 903 of the PDS 9A, which has received the response via the communication unit 901, additionally writes, in the database recorded in the memory 902, that the user has permitted the use of the driving-related information to the insurance company (step S809).
On the other hand, when the user does not permit the use (step S806: No), the arithmetic operation unit 204 answers the PDS 9A to that effect. The PDS 9A, which has received the answer, answers the insurance company cloud 11 that the user has not permitted the use of the driving-related information (step S807). The insurance company cloud 11, which has received the answer (step S808), ends this processing because the use permission has not been obtained.
When the user has already permitted the insurance company to use the driving-related information (step S803: Yes) or when the user has permitted the use anew (step S806: Yes), the arithmetic operation unit 903 of the PDS 9A answers the insurance company cloud 11 with the driving-related information of the user (the user ID) for the predetermined period via the communication unit 901 (step S810).
The arithmetic operation unit 903 of the insurance company cloud 11, which has received the answer via the communication unit 901, determines, based on the received driving-related information, the vehicle information used by the user, and the content of the current insurance contract, whether to propose an update of the insurance contract or to update the insurance contract (step S811).
Then, when an update of the insurance contract is proposed or performed (step S812: Yes), the arithmetic operation unit 903 of the insurance company cloud 11 transmits the update proposal or the update content of the insurance contract to the information terminal 2 (step S813). The arithmetic operation unit 204 of the information terminal 2, which has received this via the communication unit 206, notifies the user of the update proposal or the update content of the insurance contract using the information input/output unit 202 (step S814). On the other hand, in response to determining that neither an update proposal nor an update of the insurance contract is performed (step S812: No), the arithmetic operation unit 903 of the insurance company cloud 11 ends this processing.
The insurance company cloud 11, which has acquired the driving-related information of the user from the PDS 9A as described above, can determine, based on the current contract content, the vehicle 1 to be a contract target, and the driver's license information and the driving data of the user, whether to propose an update of the insurance contract or to update the insurance contract, and can notify the user to that effect via the information terminal 2 of the user. By managing, with the PDS 9A, the driving-related information including the driving data collected by the vehicle 1 and allowing the insurance company to use the driving-related information under personal permission, it is possible to update the insurance contract to an appropriate one that reflects the recent driving history of the user.
This can also be considered a benefit that the user obtains only by accumulating the driving-related information of the user in the PDS 9A and allowing a third party to use it. The driving-related information of the user managed by the PDS 9A is not used only for the safe driving function in the vehicle; its use and release are also permitted for services of third parties as described above. Therefore, new data utilization methods can emerge and the user can receive new benefits. What the present disclosure discloses here is a specific application case of information processing in which a third party utilizes such driving-related information.
Note that the third-party use of the driving-related information including the driving data obtained from the vehicle 1 as described above is not limited to the insurance company. For example, if the driving data is released to an administrative unit in charge of a place where an incident/accident report has been made, or to a road maintenance company, it is possible to grasp where high-risk events have occurred. Accordingly, it is considered possible to find and eliminate the reason why high-risk events occur at that place.
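For example, under the assumption that each released driving-data record carries a location and a risk label (a hypothetical format), grasping where high-risk events concentrate reduces to a simple aggregation:

```python
from collections import Counter

# Hypothetical driving-data records; the record format is illustrative only.
events = [
    {"location": "intersection-A", "risk": "high"},
    {"location": "intersection-A", "risk": "high"},
    {"location": "road-B", "risk": "low"},
]

# Count high-risk events per location to find places that warrant
# investigation by an administrative unit or a road maintenance company.
hotspots = Counter(e["location"] for e in events if e["risk"] == "high")
print(hotspots.most_common())  # [('intersection-A', 2)]
```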
For example, by permitting an automobile sales company to use the driving data, it is also considered possible to examine, at the time of vehicle purchase, what kind of safe driving function is preferable while collating candidate functions with the specific driving data. The vehicle selling side can propose a vehicle corresponding to the safe driving skill of the user, and the user side can select a vehicle and safe driving functions corresponding to that skill.
As described above, by permitting the vehicle 1 driven by the user (or the manufacturer thereof) to use the driving data, the driving conditions of the driver's license, and the like, the vehicle 1 can check, before the start of driving, whether the user is permitted to drive the vehicle 1, which safe driving functions are necessary for the driving, and the like.
This makes it possible to prevent a user without legal permission from driving the vehicle 1. When the safe driving functions required for the user to drive are insufficient, it is possible to satisfy the movement needs of the user while improving safety by limiting the speed and the drivable range at the time of driving.
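A minimal sketch of such a pre-drive check, assuming hypothetical license fields and invented restriction values, could be:

```python
import datetime

# Hypothetical sketch of the pre-drive check; the field names and the
# concrete restriction values are illustrative only.

def pre_drive_check(license_info: dict, required_functions: set[str],
                    available_functions: set[str]) -> dict:
    """Return whether driving is permitted and which limits to apply."""
    if license_info["expiration_date"] < datetime.date.today():
        return {"permitted": False, "reason": "license expired"}

    restrictions = {}
    missing = required_functions - available_functions
    if missing:
        # Satisfy the movement needs of the user while improving safety:
        # permit driving but limit the speed and the drivable range.
        restrictions = {"max_speed_kmh": 40, "range_km": 10,
                        "missing_functions": sorted(missing)}
    return {"permitted": True, "restrictions": restrictions}

result = pre_drive_check(
    {"expiration_date": datetime.date(2030, 1, 1)},
    required_functions={"lane_keeping", "emergency_braking"},
    available_functions={"emergency_braking"},
)
```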
By permitting the vehicle 1 driven by the user (or the manufacturer thereof) to use the driving data, the driving conditions of the driver's license, and the like, at least one of the sound alert 36, the marker display 37, and the arrow mark 38 may be output not only in response to a determination result of the low risk but also in a safer state, for the purpose of assisting a user who is unaccustomed to driving shortly after acquiring a driver's license in checking dangerous objects around the vehicle.
In this case, when the driving data of the user indicates that safe driving can be carried out at a predetermined level, the notification of the sound alert 36, the marker display 37, and the arrow mark 38 may be limited to when the low risk is determined, as described above.
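A hypothetical sketch of this adjustment (the skill score and the thresholds are invented for illustration): the outputs of the sound alert 36, the marker display 37, and the arrow mark 38 are gated by a risk level whose notification threshold is lowered for an inexperienced driver and restored once the driving data shows safe driving at a predetermined level.

```python
# Hypothetical sketch of gating the sound alert 36, the marker display 37,
# and the arrow mark 38; the score and thresholds are invented for illustration.

SAFER_STATE = 0   # additionally notify at this level for inexperienced drivers
LOW_RISK = 1      # normal notification threshold (low risk determined)

def should_notify(risk_level: int, safe_driving_score: float,
                  predetermined_level: float = 0.8) -> bool:
    if safe_driving_score >= predetermined_level:
        # The driving data shows safe driving at the predetermined level:
        # limit the notification to when the low risk (or higher) is determined.
        return risk_level >= LOW_RISK
    # Shortly after license acquisition: also notify in the safer state to
    # assist checking of dangerous objects around the vehicle.
    return risk_level >= SAFER_STATE

print(should_notify(risk_level=0, safe_driving_score=0.5))  # True for a novice
print(should_notify(risk_level=0, safe_driving_score=0.9))  # False for a skilled driver
```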
In the safe driving assist system 100 described in the present disclosure, by making the driving-related information (the information about the driver's license, the driving data, and the like) available to third parties as well, it is considered possible to implement a safer and more appropriate movement experience that conforms to finer needs, not only for the user but for the entire society around the user.
The safe driving assist system 100 of the present disclosure detects a dangerous object around the vehicle and estimates whether the user is aware of the dangerous object. Therefore, in particular when the user is not aware of the dangerous object, it is possible to plainly notify the user of the dangerous object using a display technology based on augmented reality or virtual reality and a 3D stereophonic sound technology. It is also possible to generate new added-value services by causing third parties to use, based on personal permission, the data concerning the driving of the user obtained here. As a future image of mobility in the Society 5.0 society, the industrial applicability is considered to be extremely high.
A program executed by the safe driving assist system 100 in the present embodiment is provided by being incorporated in advance in a ROM or the like.
The program to be executed by the safe driving assist system 100 in the present embodiment may be provided by being recorded in a computer-readable recording medium such as a CD-ROM, a flexible disk (FD), a CD-R, or a DVD (Digital Versatile Disk) as a file in an installable format or an executable format.
Further, the program to be executed by the safe driving assist system 100 in the present embodiment may be configured to be stored on a computer connected to a network such as the Internet and to be provided by being downloaded via the network. The program to be executed by the safe driving assist system 100 in the present embodiment may be provided or distributed through a network such as the Internet.
Although several embodiments of the present invention have been described above, these embodiments are presented as examples and are not intended to limit the scope of the invention. These embodiments can be implemented in other various forms, and various omissions, substitutions, and changes can be made without departing from the gist of the invention. These embodiments and the modifications thereof are included in the scope and the gist of the invention and are included in the inventions described in the claims and the scope of equivalents of the inventions.
Examples of other various embodiments disclosed by the above description of the embodiments include the following techniques.
An information presentation method for presenting information to a driver of a vehicle via a head-up display mounted on the vehicle, the information presentation method including:
A control method for controlling a first computer that retains driving characteristic data of a plurality of users, the control method including:
The control method according to the technique 2, wherein
A vehicle control method for controlling a vehicle mounted with a communication circuit connectable to a network, the vehicle control method including:
The vehicle control method according to the technique 4, wherein
A vehicle control method for controlling a vehicle mounted with a communication circuit connectable to a network, the vehicle control method including:
The vehicle control method according to the technique 6, wherein
The vehicle control method according to the technique 6, wherein,
Foreign application priority data: Japanese Patent Application No. 2022-057194, filed March 2022 (JP, national).
This application is a continuation of International Application No. PCT/JP2023/006878, filed on Feb. 24, 2023, which claims the benefit of priority of the prior Japanese Patent Application No. 2022-057194, filed on Mar. 30, 2022, the entire contents of which are incorporated herein by reference.
Related U.S. application data: this application (U.S. Serial No. 18/888,936) is a child of parent International Application No. PCT/JP2023/006878 (WO), filed February 2023.