This application claims priority on Japanese Patent Application No. 2004-321148, filed on Nov. 4, 2004, the entire contents of which are hereby incorporated by reference.
Conventionally, driver alert systems have been proposed. Such a system judges when a driver has become inattentive to the visual information used in driving and, if the driver has become inattentive, provides an alert to the driver. A driver may become inattentive due to use of a cellular phone or conversation with a passenger, for example.
For example, JP 11-276461A describes a driver alert system that monitors a driver's eye movement. More specifically, the system monitors the frequency of the driver's eye movement and, from this, judges whether the driver is becoming visually inattentive with respect to the task of driving. When the driver's attentiveness level is falling, and the driver's ability to process external information is therefore falling, the system provides the driver with an alert. For example, a heads-up projection may be displayed on the windshield or a tone may be output. However, this may not be sufficient for the driver to visually recognize a dangerous situation such as an impending impact with another vehicle or obstacle.
The present invention relates to a vision recognition support device.
In one embodiment, the vision recognition support device includes an obscuring device configured to obscure a view of an operator operating an apparatus.
The embodiment may also include a control device controlling operation of the obscuring device to temporarily obscure the view of the operator. For example, the control device may receive information regarding the operator and determine whether to activate the obscuring device to temporarily obscure the view of the operator based on the received information. The information on the operator may come from an operator attentiveness level detecting device, and the control device determines to activate the obscuring device when the detected attentiveness level falls.
As another example, the control device may receive information regarding the apparatus and determine whether to activate the obscuring device to temporarily obscure the view of the operator based on the received information. The information on the apparatus may come from an apparatus state detecting system detecting an operating state of the apparatus, and/or a collision detecting device detecting whether a collision situation exists between the apparatus and an object.
As an example, the apparatus may be a vehicle and the operator may be a driver of the vehicle. In this example, the obscuring device may be employed at one of a windshield and a side window of the vehicle.
The present invention will become more fully understood from the detailed description given herein below and the accompanying drawings, wherein like elements are represented by like reference numerals, which are given by way of illustration only and thus are not limiting of the present invention and wherein:
FIGS. 4(a) and 4(b) illustrate graphs for explaining example triggering conditions and activation schemes for the obscuring device.
Now referring to the drawings, an explanation will be given of various example embodiments of the present invention.
Hereafter, the embodiments of the vision recognition support system of this invention are explained using the example of a driver of a vehicle, such as a car. However, the present invention may be employed in various forms, and this invention is not limited to a vision recognition support system for the driver of a vehicle. For example, the system may be employed to support vision recognition in any situation where an operator's vision recognition is required for proper operation of an apparatus such as a vehicle, a manufacturing machine, etc.
The attentiveness level detecting device 10 detects the driver's attentiveness level to the visual information used in driving a vehicle. For example, the attentiveness level detecting device 10 may be the same as that employed in JP 11-276461A. However, the attentiveness level detecting device 10 may be any well-known attentiveness level detecting device, such as that disclosed in US Published Application No. 2003/0146841 A1, which is hereby incorporated by reference in its entirety. Accordingly, as described above, the attentiveness level detecting device 10 may detect the driver's eye movement and determine whether the driver's attentiveness is falling based on the frequency of the detected eye movement. Namely, the attentiveness level detecting device 10 may judge a level of the driver's attentiveness (e.g., high, normal, low, etc.) based on the frequency of detected eye movement. The attentiveness level detecting device 10 outputs this detected attentiveness level to the control device 50.
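By way of a non-limiting illustration, the following Python sketch shows one way the frequency of detected eye movement might be mapped to a coarse attentiveness level of the kind output by the attentiveness level detecting device 10. The function name and threshold values are illustrative assumptions only and are not taken from JP 11-276461A or US 2003/0146841 A1.

```python
# Illustrative sketch only: mapping an eye-movement frequency to a coarse
# attentiveness level. The thresholds below are assumed values, not values
# taken from the cited references.

HIGH_FREQ_THRESHOLD = 0.5   # eye movements per second (assumed)
LOW_FREQ_THRESHOLD = 0.2    # eye movements per second (assumed)


def classify_attentiveness(eye_movements_per_second: float) -> str:
    """Return 'high', 'normal', or 'low' from the observed frequency.

    A falling frequency of eye movement is treated here as a sign of
    falling attentiveness, as described for the detecting device 10.
    """
    if eye_movements_per_second >= HIGH_FREQ_THRESHOLD:
        return "high"
    if eye_movements_per_second >= LOW_FREQ_THRESHOLD:
        return "normal"
    return "low"


if __name__ == "__main__":
    print(classify_attentiveness(0.1))  # prints "low"
```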
The vehicle state detecting system 20 may include a number of vehicle sensors that indicate the state of the vehicle. For example, the vehicle state detecting system 20 may include a speed sensor sensing a speed of the vehicle, a transmission position sensor detecting the position of the vehicle's gearbox, an acceleration sensor detecting acceleration/deceleration of the vehicle, a steering sensor detecting an amount of steering from a neutral position, a radar (i) detecting proximity and/or direction of objects (e.g., other vehicles) in front of the vehicle and/or (ii) detecting proximity and/or direction of objects to either or both sides of the vehicle, and an imaging sensor sensing the image in front of the vehicle and detecting therefrom a position of the vehicle in the vehicle's current running lane. As each of the above-described sensors is well known in the art, as are their locations in a vehicle, these sensors and their locations will not be described in detail for the sake of brevity. The vehicle state detecting system 20 supplies the output from the sensors in the system 20 to the collision judging device 30 and the control device 50. Furthermore, it will be understood that the sensors used in detecting the operating state of the apparatus will vary depending on the apparatus.
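For purposes of discussion only, the sketch below shows one possible way to aggregate the sensor outputs of the vehicle state detecting system 20 before supplying them to the collision judging device 30 and the control device 50. The container type, field names, and units are assumptions made for this illustration.

```python
# Illustrative sketch only: a possible container for the outputs of the
# sensors in vehicle state detecting system 20. Field names and units are
# assumed for illustration.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class VehicleState:
    speed_m_s: float                 # from the speed sensor
    gear_position: str               # from the transmission position sensor
    acceleration_m_s2: float         # from the acceleration sensor
    steering_angle_deg: float        # steering amount from the neutral position
    # each radar object as (distance in meters, bearing in degrees)
    radar_objects: List[Tuple[float, float]] = field(default_factory=list)
    lane_offset_m: float = 0.0       # lateral position in the current running
                                     # lane, derived from the imaging sensor


def example_snapshot() -> VehicleState:
    """Build an example snapshot that could be passed to devices 30 and 50."""
    return VehicleState(
        speed_m_s=22.0,
        gear_position="D",
        acceleration_m_s2=-0.3,
        steering_angle_deg=1.5,
        radar_objects=[(35.0, 0.0)],  # one object straight ahead at 35 m
        lane_offset_m=0.1,
    )
```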
Using the information from the vehicle state detecting system 20, the collision judging device 30 judges the possibility of collision between the vehicle and an object (e.g., another vehicle) that exists in front of or to the side of the vehicle. The collision judging device 30 may be any well-known collision judging device. Because such collision judging devices are well known in the art, the collision judging device 30 will not be described in detail for the sake of brevity.
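The document does not specify how the collision judging device 30 reaches its judgment. Purely as an illustrative sketch, one conventional approach is a time-to-collision test based on the range and closing speed supplied by the vehicle state detecting system 20; the threshold below is an assumed value.

```python
# Illustrative sketch only: a simple time-to-collision test of the kind a
# collision judging device might apply. The 3-second threshold is assumed.

TTC_THRESHOLD_S = 3.0  # assumed warning threshold, in seconds


def collision_possible(distance_m: float, closing_speed_m_s: float) -> bool:
    """Return True when the time to collision falls below the threshold."""
    if closing_speed_m_s <= 0.0:
        return False  # gap constant or opening; no collision course
    time_to_collision_s = distance_m / closing_speed_m_s
    return time_to_collision_s < TTC_THRESHOLD_S


if __name__ == "__main__":
    print(collision_possible(distance_m=20.0, closing_speed_m_s=10.0))  # True (2 s)
```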
Based on the input received from the attentiveness level detecting device 10, the vehicle state detecting system 20 and the collision judging device 30, the control device 50 controls operation of the obscuring device 40. The control device 50 will be described in greater detail below.
The transparent EL display 46, when deactivated, is transparent and does not obscure the driver's vision. However, when activated, the transparent EL display 46 may obscure or, alternatively, block the driver's vision. As is known, activating the transparent EL display 46 generally requires application of a voltage to the EL matrix forming a part of the transparent EL display 46. Accordingly, in case of a failure in the vehicle's electrical system, the transparent EL display 46 remains transparent by default. Because transparent EL displays are well known in the art, the transparent EL display 46 will not be described in detail for the sake of brevity. Also, it will be understood that the present invention is not limited to a transparent EL display as the obscuring device 40. Instead, any system that allows for selectively obscuring or blocking a driver's view may be used.
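Because any selectively obscuring system may serve as the obscuring device 40, the following sketch illustrates, purely as an assumption for discussion, a minimal software interface for such a device, with a transparent EL panel as one possible implementation. The class and method names are illustrative only.

```python
# Illustrative sketch only: a minimal interface for an obscuring device, with
# a transparent EL panel as one possible implementation. Names are assumed.
from abc import ABC, abstractmethod


class ObscuringDevice(ABC):
    @abstractmethod
    def activate(self) -> None:
        """Obscure or block the operator's view."""

    @abstractmethod
    def deactivate(self) -> None:
        """Return to the transparent, non-obscuring state."""


class TransparentELPanel(ObscuringDevice):
    """Obscures only while a drive voltage is applied, so a failure of the
    electrical system leaves the panel transparent by default."""

    def __init__(self) -> None:
        self._voltage_applied = False

    def activate(self) -> None:
        self._voltage_applied = True    # apply voltage to the EL matrix

    def deactivate(self) -> None:
        self._voltage_applied = False   # remove the voltage; panel is transparent

    @property
    def is_obscuring(self) -> bool:
        return self._voltage_applied
```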
As stated above, the control device 50 controls the operation of the obscuring device 40. The control device 50 may be implemented as a microcomputer, where the microcomputer includes a bus line connecting well-known elements such as a CPU, ROM, RAM, I/O, etc. The program for operating the vision recognition support system 100 is stored in the ROM. According to this program, the CPU, etc., performs the operation processing described in more detail below.
The control part 54 controls the obscuring device 40 while the vehicle is running. More specifically, the control part 54 selectively activates the obscuring device 40. For example, if the driver's attentiveness level changes from high or normal to low, the control part 54 may activate and then deactivate the obscuring device 40 to temporarily obscure or block the driver's view. As another example, if the collision judging device 30 indicates a possible collision with an object, the control part 54 temporarily activates the obscuring device 40 to temporarily obscure or block the driver's view of the object. This may seem counterintuitive, but it serves to present the driver with a greater, more discrete or stepwise change in the image viewed by the driver. Because the driver should perceive a greater change in the viewed image, the driver may more readily ascertain or recognize the situation being faced, and react accordingly.
Depending on the condition leading to activation of the obscuring device 40 and the vehicle state as indicated by the vehicle state detecting system 20, the length of time for which the obscuring device 40 is activated may change. Also, instead of a single activation, the control part 54 may repeatedly activate and deactivate the obscuring device 40, with the lengths of activation and deactivation being controlled by the control part 54.
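As a non-limiting illustration of the kind of control described above for the control part 54, the sketch below triggers a temporary obscuration when the attentiveness level falls to low or a collision possibility is indicated, and applies one or more activation/deactivation pulses. The trigger conditions, durations, and pulse counts are assumed values for illustration only.

```python
# Illustrative sketch only: trigger logic and pulsed activation of the kind
# described for control part 54. Durations and pulse counts are assumed.
import time


def should_trigger(attentiveness: str, collision_possible: bool) -> bool:
    """Trigger when attentiveness has fallen to 'low' or a collision is possible."""
    return attentiveness == "low" or collision_possible


def pulse_obscuring_device(device, on_s: float = 0.2, off_s: float = 0.3,
                           pulses: int = 1) -> None:
    """Temporarily obscure the view, optionally as a repeated on/off pattern.

    The activation and deactivation lengths may be varied with the condition
    that triggered the obscuration and with the vehicle state.
    """
    for _ in range(pulses):
        device.activate()
        time.sleep(on_s)       # length of activation
        device.deactivate()
        time.sleep(off_s)      # length of deactivation before the next pulse


class _PrintDevice:
    """Stand-in for the obscuring device 40, used only for this sketch."""

    def activate(self) -> None:
        print("view obscured")

    def deactivate(self) -> None:
        print("view restored")


if __name__ == "__main__":
    if should_trigger(attentiveness="low", collision_possible=False):
        pulse_obscuring_device(_PrintDevice(), on_s=0.2, off_s=0.3, pulses=2)
```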
As will be appreciated, the conditions established to trigger activation and deactivation of the obscuring device are a matter of design choice, as are the operational parameters (e.g., length of activation) of the obscuring device for each condition. As will further be appreciated, these conditions and parameters may be established empirically according to routine testing.
For example, the vehicle speed, and the distance and the time to an object for which a collision possibility has been detected, are useful metrics for determining whether a triggering condition exists. As another example, the object is perceived as having a certain area within the driver's field of view. As is known, this area may be determined from the output of the imaging sensor in the vehicle state detecting system 20. Accordingly, another useful metric for determining a triggering condition may be the rate of change in the area of the object's image.
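For illustration only, the following sketch estimates the rate of change in an imaged object's area from two successive imaging-sensor measurements and compares it against a trigger threshold. The numeric thresholds, and the choice of a lower threshold when the attentiveness level is high (as in the example discussed below with respect to FIGS. 4(a) and 4(b)), are assumptions made for this sketch.

```python
# Illustrative sketch only: estimating the rate of change in an object's
# imaged area and testing it against a trigger threshold. The threshold
# values and their mapping to attentiveness levels are assumed.

def area_change_rate(area_prev_px: float, area_now_px: float, dt_s: float) -> float:
    """Rate of change of the object's image area, in pixels per second."""
    return (area_now_px - area_prev_px) / dt_s


def exceeds_trigger_threshold(rate_px_s: float, attentiveness: str) -> bool:
    """Compare the rate against a threshold selected per the attentiveness level."""
    threshold = 150.0 if attentiveness == "high" else 300.0  # assumed values
    return rate_px_s >= threshold


if __name__ == "__main__":
    rate = area_change_rate(area_prev_px=1200.0, area_now_px=1500.0, dt_s=1.0)
    print(exceeds_trigger_threshold(rate, attentiveness="high"))  # True
```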
By way of example only, triggering conditions and the operational parameters associated therewith will be described with respect to FIGS. 4(a) and 4(b). In this example, the control part 54 at least compares the rate of change in area of an imaged object (e.g., another vehicle) against a threshold to determine whether to trigger an activation scheme for the obscuring device 40.
Assuming the driver's attentiveness level is high and the lower threshold is used, then at time t1, when the rate of change in area of the imaged object reaches the threshold, the control part 54 activates the obscuring device 40. As shown by curve "Q", the driver's perceived change in area of the object drops to zero because the driver's view of the object is obscured. However, as shown by curve "q", the rate of change in the area of the imaged object continues to increase.
In this embodiment, the control part 54 activates the obscuring device 40 for a period of time Δ1t, such that at time t1+Δ1t the obscuring device 40 is deactivated. As shown by curve "Q", once the obscuring device 40 is deactivated, the driver's perceived change in area of the object jumps to rejoin curve "q", presenting the driver with a discrete, stepwise change in the viewed image.
As further shown by
Next, processing performed by the control part 54 will be described.
The invention being thus described, it will be obvious that the same may be varied in many ways. Such variations are not to be regarded as a departure from the spirit and scope of the invention, and all such modifications as would be obvious to one skilled in the art are intended to be included within the scope of the invention.