This application claims priority to Japanese Patent Application No. 2023-186823, filed on Oct. 31, 2023, the entire contents of which are incorporated herein by reference.
The present disclosure relates to an alarm system.
A line-of-sight monitoring apparatus has been disclosed that monitors the direction of a driver's line of sight and, using an output apparatus mounted in the vehicle, warns the driver if the line of sight deviates from a predetermined range. For example, see Patent Literature (PTL) 1.
According to the prior art, when a driver is driving inattentively, the line-of-sight monitoring apparatus warns the driver to return the line of sight to the predetermined range. However, the line-of-sight monitoring apparatus in the prior art does not provide alarms in response to changes in the travel condition of the vehicle caused by inattentive driving. Therefore, the driver cannot immediately respond to a deteriorating travel condition.
It would be helpful to reduce the risk of deterioration of the travel condition of a vehicle due to the driver gazing at a display of an in-vehicle apparatus.
An alarm system according to an embodiment of the present disclosure that solves the above problem is an alarm system to be mounted in a vehicle, the alarm system including:
According to the present disclosure, the risk of deterioration of the travel condition of a vehicle due to the driver gazing at a display of an in-vehicle apparatus can be reduced.
In the accompanying drawings:
An embodiment of the present disclosure will be described below, with reference to the drawings. The drawings used in the following description are schematic. In addition, dimensional ratios or the like on the drawings do not necessarily match actual ones.
An alarm system 10 of one embodiment of the present disclosure comprises an imager 11, an alarm apparatus 12, and a display 13, as illustrated in
The imager 11 is a camera located in the cabin of the vehicle 20 that captures images including the face of the driver 21. The imager 11 may be located near the rearview mirror, on the dashboard, or in the center of the steering wheel of the vehicle 20, for example. The imager 11 is positioned so that the face of the driver 21 is included within the field of view. For example, the optical axis O of the imager 11 is directed toward the face of the driver 21. The imager 11 outputs the captured image of the driver 21 to the alarm apparatus 12.
The alarm apparatus 12 includes a controller 14 and a memory 15. The alarm apparatus 12 is configured to acquire an image including the face of the driver 21 driving the vehicle 20 from the imager 11. The alarm apparatus 12 is configured to acquire information regarding a travel condition of the vehicle 20 from one or more ECUs (Electronic Control Units) 16 in the vehicle 20. The controller 14 of the alarm apparatus 12 can generate an alarm and display it on the display 13 based on the information acquired from the imager 11 and the ECU 16.
Information regarding the travel condition of the vehicle 20 includes information on a travel speed of the vehicle 20, information on an inter-vehicle distance between the vehicle 20 and the vehicle that travels ahead, and information indicating changes in travel speed and inter-vehicle distance. Information regarding the travel condition is not limited to information regarding the travel speed and the inter-vehicle distance. For example, information regarding travel conditions may include information about steering angle and information regarding wobble and lane departure of the vehicle 20.
The controller 14 includes at least one processor, at least one programmable circuit, at least one dedicated circuit, or any combination thereof. The processor is a general purpose processor such as a central processing unit (CPU) or a graphics processing unit (GPU), or a dedicated processor that is dedicated to specific processing. The programmable circuit is, for example, a field-programmable gate array (FPGA). The dedicated circuit is, for example, an application specific integrated circuit (ASIC). The controller 14 executes processes related to the operations of the alarm apparatus 12 while controlling the components of the alarm apparatus 12.
The memory 15 includes at least one semiconductor memory, at least one magnetic memory, at least one optical memory, or any combination thereof. The semiconductor memory is, for example, random access memory (RAM) or read only memory (ROM). The RAM is, for example, static random access memory (SRAM) or dynamic random access memory (DRAM). The ROM is, for example, electrically erasable programmable read only memory (EEPROM). The memory 15 functions as, for example, a main memory, an auxiliary memory, or a cache memory. The memory 15 stores programs to be executed by the controller 14, data to be used for the operations of the alarm apparatus 12, data obtained by the operations of the alarm apparatus 12, and the like.
The memory 15 may store the positions of the imager 11 and the display 13. For example, the memory 15 may store the position of the imager 11 and the range of the display surface on which the image of the display 13 is displayed by means of a coordinate system fixed to the vehicle 20. The memory 15 may also store the standard eye position of the driver 21 for one or more registered drivers 21. In this case, the memory 15 may store information to identify the registered driver from the facial image of the driver.
The ECU 16 is an electronic control apparatus that controls each piece of equipment in the vehicle 20. The ECU 16 includes ECUs for advanced driver assistance systems (ADAS), such as inter-vehicle distance control and ACC (Adaptive Cruise Control). The ECUs 16 are connected with each other via a network, such as the Controller Area Network (CAN). The ECU 16 can acquire signals from a vehicle speed sensor 17 that detects the speed of the vehicle 20 and a distance measuring apparatus 18 that measures the inter-vehicle distance between the vehicle 20 and the vehicle that travels ahead of the vehicle 20. The ECU 16 can output information on the speed of the vehicle 20 and/or the inter-vehicle distance to the alarm apparatus 12. Signals from the vehicle speed sensor 17 and the distance measuring apparatus 18 may be output to the alarm apparatus 12 without going through the ECU 16. The ECU 16 may be located at various locations inside the vehicle 20.
The vehicle speed sensor 17 is, for example, a sensor that measures the number of tire revolutions. From the number of tire revolutions per unit time and the circumference of the tire, the speed of the vehicle 20 is calculated. The vehicle speed sensor 17 can be the sensor used for the speedometer display of the vehicle 20.
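The calculation described above can be sketched as follows; the function name and units are illustrative and not taken from the disclosure:

```python
def vehicle_speed_kmh(revs_per_minute: float, tire_circumference_m: float) -> float:
    """Speed of the vehicle from tire rotation: the tire covers one
    circumference per revolution, so distance per minute times 60 gives
    distance per hour, converted here from meters to kilometers."""
    return revs_per_minute * tire_circumference_m * 60.0 / 1000.0
```

For example, a tire with a 2 m circumference turning 500 times per minute corresponds to 60 km/h.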
The distance measuring apparatus 18 includes light detection and ranging (LIDAR), millimeter wave radar, and stereo cameras; any one or more of these devices can be employed as the distance measuring apparatus 18. LIDAR and millimeter wave radar are scanning devices that emit pulsed electromagnetic waves (infrared light, millimeter waves, etc.) and measure the scattered return. These distance measuring apparatuses 18 can calculate the distance to each reflection point on an object based on the time it takes for the emitted electromagnetic waves to be reflected by the object and detected (time of flight). The LIDAR and millimeter wave radar may be located on the front side of the vehicle 20, e.g., in the front bumper, as the distance measuring apparatus 18 in
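The time-of-flight relation used by LIDAR and millimeter wave radar can be written as a one-line sketch; the pulse covers the distance twice, hence the division by two:

```python
C_M_PER_S = 299_792_458.0  # speed of light in vacuum

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to a reflection point from the round-trip time of an
    electromagnetic pulse that travels to the object and back."""
    return C_M_PER_S * round_trip_time_s / 2.0
```

A round-trip time of one microsecond thus corresponds to a reflection point roughly 150 m ahead.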
The display 13 is used for displaying information for an in-vehicle apparatus mounted in the vehicle 20. The display 13 displays images based on image signals input from in-vehicle apparatuses in the vehicle 20, including a navigation apparatus 19. As the display 13, various types of displays, such as a Liquid Crystal Display (LCD), an organic Electro-Luminescent (EL) display, and an inorganic EL display, may be used. The display 13 may be embedded within the dashboard or placed on top of the dashboard. When looking at the display 13, the driver 21 must move his/her line of sight from the front of the vehicle 20 to the display 13.
The navigation apparatus 19 displays a road map of the area around where the vehicle 20 is traveling and provides guidance on the route to a destination of the vehicle 20. The navigation apparatus 19 has various functions such as zooming in and out of maps, searching for destinations, and displaying nearby facilities. The navigation apparatus 19 continuously generates image signals and outputs them to the display 13. The display 13 normally displays images from the navigation apparatus 19.
As illustrated in
The image acquisition interface 31 acquires image signals of the image including the face of the driver 21 from the imager 11.
The line-of-sight estimation unit 32 estimates the position and direction of the line of sight of the driver 21 from the image signals acquired from the imager 11. For example, the line-of-sight estimation unit 32 estimates the position of the eyes of the driver 21 and the direction of the line of sight of the driver 21 based on the image of the driver 21. Known methods can be employed to estimate the position of the eyes of the driver 21 and the direction of the line of sight of the driver 21. For example, methods for estimating the direction of the line of sight include using Purkinje images, using the relative positional relationship between the iris and the eye socket, and approximating the pupil contour with an ellipse and estimating from the ellipse parameters. For the position of the eyes of the driver 21, information stored in advance in the memory 15 of the alarm apparatus 12 for each registered driver may be used. In this case, the controller 14 may identify the driver 21 based on the image of the face of the driver 21.
The travel information acquisition interface 33 acquires from the ECU 16 information on the travel speed of the vehicle 20 and information on the inter-vehicle distance between the vehicle 20 and the vehicle that travels ahead of the vehicle 20. The travel information acquisition interface 33 may further acquire information indicating a change in travel speed, for example, information on the acceleration of the vehicle 20. The travel information acquisition interface 33 may also acquire information on the speed of change in inter-vehicle distance. The change in speed and inter-vehicle distance of the vehicle 20 may be calculated by the controller 14 based on the information on the travel speed and the information on the inter-vehicle distance, respectively.
The gazing state determination unit 34 recognizes that the driver 21 is looking at the display 13 when the line of sight of the driver 21 intersects with the display 13. Whether or not the line of sight of the driver 21 intersects with the display 13 can be determined based on the position and direction of the line of sight estimated by the line-of-sight estimation unit 32 and the position information of the display 13 stored in the memory 15.
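The intersection determination can be sketched as a ray-rectangle test in a coordinate system fixed to the vehicle 20. The geometry parameters below (display center, normal, in-plane axes, half-extents) are illustrative assumptions, not values from the disclosure:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def gaze_intersects_display(eye, gaze, center, normal, u_axis, v_axis,
                            u_half, v_half):
    """True if the gaze ray from the driver's eye hits the display rectangle.

    eye/gaze: eye position and line-of-sight direction (3-vectors).
    center/normal: center and unit normal of the display surface.
    u_axis/v_axis: unit vectors spanning the surface; u_half/v_half are
    the half width and half height of the display along those axes.
    """
    denom = dot(gaze, normal)
    if abs(denom) < 1e-9:
        return False                      # gaze parallel to the display plane
    t = dot([c - e for c, e in zip(center, eye)], normal) / denom
    if t <= 0:
        return False                      # display lies behind the eyes
    hit = [e + t * g for e, g in zip(eye, gaze)]
    off = [h - c for h, c in zip(hit, center)]
    return abs(dot(off, u_axis)) <= u_half and abs(dot(off, v_axis)) <= v_half
```

The ray is first intersected with the plane of the display surface, and the hit point is then checked against the stored extent of the display.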
When a situation in which the driver 21 looks at the display 13 has continued for a predetermined time period or longer, the gazing state determination unit 34 determines that the driver 21 is gazing at the display 13. In other words, in this disclosure, "gazing" means continuously looking at the display 13 for a predetermined time period or longer. The predetermined time period can be set, for example, to 1 or 2 seconds. The time for which the driver 21 keeps looking at the display 13 is desirably less than 1 second, and depending on the surrounding environment during driving, a dangerous situation may occur if the driver 21 continues to look at the display 13 for more than 2 seconds.
The gazing state determination unit 34 may set the predetermined time period to a value that differs according to the information regarding the travel condition acquired by the travel information acquisition interface 33. For example, the gazing state determination unit 34 may set the predetermined time period shorter as the travel speed increases. For example, the gazing state determination unit 34 may set the predetermined time period shorter as the inter-vehicle distance decreases. In this way, the more likely it is that a danger will occur due to the driver 21 gazing at the display 13, the sooner the warning can be issued to the driver 21. This reduces the possibility of hazards to the driving of the vehicle 20.
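One way to realize such a speed- and distance-dependent threshold is a simple interpolation; the linear form and the reference values below are assumptions for illustration only:

```python
def gaze_time_threshold_s(speed_kmh, gap_m, base_s=2.0, min_s=1.0,
                          speed_ref_kmh=100.0, gap_ref_m=50.0):
    """Shrink the allowed gaze duration from base_s toward min_s as the
    travel speed rises toward speed_ref_kmh or the inter-vehicle
    distance falls below gap_ref_m (all reference values assumed)."""
    speed_risk = min(max(speed_kmh / speed_ref_kmh, 0.0), 1.0)
    gap_risk = 1.0 - min(max(gap_m / gap_ref_m, 0.0), 1.0)
    risk = max(speed_risk, gap_risk)      # 0 = low risk, 1 = high risk
    return base_s - (base_s - min_s) * risk
```

At low speed with a generous inter-vehicle distance the driver is allowed the full 2 seconds; at high speed or with a short distance the threshold drops toward 1 second.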
The warning information generation unit 35 generates warning information to be displayed on the display 13 when the gazing state determination unit 34 determines that the driver 21 is gazing at the display 13.
Warning information includes information regarding the travel condition. For example, warning information may include the travel speed. In particular, when the travel speed is decreasing, the warning information generation unit 35 may generate warning information indicating that the vehicle 20 is decelerating. If the travel speed decreases because the attention of the driver 21 is directed to the display 13, it may irritate the driver of the following vehicle and create a risk of the vehicle 20 being rear-ended. In this case, the warning information may include a notification that the speed is too low.
The warning information generation unit 35 may also generate warnings based on the information on the inter-vehicle distance. For example, the warning information generation unit 35 may generate a warning when the inter-vehicle distance is shorter than a predetermined distance or when the inter-vehicle distance is decreasing faster than a predetermined rate. If the inter-vehicle distance to the vehicle that travels ahead is short, or if the inter-vehicle distance is decreasing rapidly, there is a risk of a rear-end collision with the vehicle ahead. In this case, the warning information may include a notification that the inter-vehicle distance is short.
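The deceleration and inter-vehicle distance conditions described above can be combined in a small decision function; the thresholds and message strings are hypothetical:

```python
def generate_warnings(speed_kmh, prev_speed_kmh, gap_m, gap_rate_mps,
                      min_gap_m=20.0, max_closing_rate_mps=3.0):
    """Return warning texts for a decelerating vehicle and for a short
    or rapidly shrinking inter-vehicle distance (gap_rate_mps < 0 means
    the gap is closing). Thresholds are illustrative assumptions."""
    warnings = []
    if speed_kmh < prev_speed_kmh:
        warnings.append(f"Decelerating: {speed_kmh:.0f} km/h")
    if gap_m < min_gap_m or gap_rate_mps < -max_closing_rate_mps:
        warnings.append(f"Inter-vehicle distance short: {gap_m:.0f} m")
    return warnings
```

Either condition alone produces its own message, so a slowing vehicle that is also closing on the vehicle ahead yields both warnings at once.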
The display controller 36 displays the warning information generated by the warning information generation unit 35 on the display 13, as illustrated in
Thus, according to the alarm system 10 of the present disclosure, information informing the driver 21 of the worsening travel condition is displayed directly on the display 13 at which the driver 21 is gazing, so the driver 21 can immediately take action against the worsening travel condition. The driver 21 can accelerate the vehicle 20 when the speed has decreased. The driver 21 can also ease off the accelerator or apply the brakes when the inter-vehicle distance is getting shorter. The alarm system of the present disclosure allows the driver 21 to respond to a deteriorating travel condition more quickly than simply encouraging the driver 21 to turn the line of sight to the front.
In the above embodiment, the warning information was superimposed on the display 13 normally used for the display of the navigation apparatus 19. Since the screen of the navigation apparatus 19 tends to be gazed at by the driver 21 for a relatively long time in order to read the displayed contents, it is highly effective to display warning information to direct the attention of the driver 21 to driving.
A flow of processing executed by the controller 14 of the alarm apparatus 12 will be described below using
The controller 14 acquires an image including the face of the driver 21 from the image acquisition interface 31 (S1).
The controller 14 estimates the line of sight of the driver 21 based on the image of the driver 21 (S2).
The controller 14 determines whether the line of sight of the driver 21 intersects with the display 13 (S3). If the line of sight of the driver 21 intersects with the display 13, the driver 21 is perceived to be looking at the display 13.
When the line of sight of the driver 21 intersects with the display 13 in S3 (S3: Yes), the controller 14 proceeds to S4. When the line of sight of the driver 21 does not intersect with display 13 in S3 (S3: No), the driver 21 is not looking at the display 13, so the controller 14 terminates the process without performing any special processing.
The controller 14 acquires information regarding the travel condition from the ECU 16 (S4).
The controller 14 determines whether the driver 21 is gazing at the display 13 (S5). If a situation in which the line of sight of the driver 21 intersects with the display 13 has continued for a predetermined time period or longer, the driver 21 is judged to be gazing at the display 13.
When the driver 21 is gazing at the display 13 in S5 (S5: Yes), the controller 14 proceeds to S6. When the driver 21 is not gazing at the display 13 because the time period during which the driver 21 looks at the display 13 in S5 is less than the predetermined time period (S5: No), the controller 14 returns to S1 and continues monitoring whether the driver 21 is looking at the display 13.
The controller 14 generates warning information using the information regarding the travel conditions (S6).
The controller 14 interrupts the image of the navigation apparatus 19 displayed on the display 13 to display the warning information generated in S6 (S7).
The controller 14 can repeat the process from S1 to S7 to reduce the risk of deterioration of the travel condition of the vehicle 20 by continuously monitoring the driver 21 while the vehicle 20 is in motion. This allows the driver 21 to drive the vehicle 20 in a stable manner.
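The gaze-timing branch of the flow (S3 and S5) can be captured as a small state machine. The per-frame boolean input and the function shape below are assumptions made for illustration:

```python
def gaze_step(gaze_start_s, looking_at_display, now_s, threshold_s):
    """One iteration of the S3/S5 decision.

    gaze_start_s: time at which the current gaze began, or None.
    Returns (new_gaze_start_s, should_warn); should_warn becomes True
    once the gaze has persisted for threshold_s or longer (S5: Yes).
    """
    if not looking_at_display:
        return None, False                # S3: No -> reset, keep monitoring
    if gaze_start_s is None:
        gaze_start_s = now_s              # gaze just started
    return gaze_start_s, (now_s - gaze_start_s) >= threshold_s
```

Calling this once per captured frame reproduces the loop behavior: the timer resets whenever the line of sight leaves the display, and the warning path (S6, S7) is entered only after the predetermined time period has elapsed.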
It should be noted that the present disclosure is not limited to the above embodiment, and various modifications and revisions can be implemented. For example, functions or the like included in each means, each step, or the like can be rearranged without logical inconsistency, and a plurality of means, steps, or the like can be combined into one or divided.
For example, in the above embodiment, the imager 11 captures and outputs an image including the face of the driver 21, and the controller 14 of the alarm apparatus 12 estimates the position and direction of the line of sight based on the image of the driver 21. However, the imager 11 may be an intelligent camera with image recognition functions, and the process of estimating the position and direction of the line of sight of the driver 21 may be performed by the imager 11. For example, the imager 11 may be equipped with a machine-learned model to estimate the position and direction of the line of sight based on the image of the driver. In this case, the alarm apparatus 12 acquires the position and direction of the line of sight of the driver as output from the imager 11. The controller 14 does not require the line-of-sight estimation unit 32.
The imager 11 can be a stereo camera with multiple cameras arranged in parallel, rather than a monocular camera. In this case, the controller 14 of the alarm apparatus 12 may analyze the images output from the multiple cameras to estimate the position and direction of the line of sight.
Furthermore, in the above embodiment, the controller 14 determines whether the driver 21 is looking at the display 13 based on whether the line of sight of the driver 21, estimated from the image of the face of the driver 21, crosses the display surface of the display 13. However, it is not essential to estimate the position and direction of the line of sight from the face image. For example, a data set pairing various face images of the driver 21 with labels indicating whether the driver 21 is looking at the display 13 may be used for machine learning, so that a machine-learned model can directly determine whether the driver 21 is looking at the display 13 from the face image of the driver 21.
In the above example, the navigation apparatus 19 is shown as an in-vehicle apparatus. However, the in-vehicle apparatuses are not limited to the navigation apparatus 19. For example, the in-vehicle apparatuses can include in-vehicle multimedia apparatuses that integrate navigation functions with music playback, reception of various broadcasts, and communication functions linked to mobile phones.