The current embodiment relates to motor vehicles and in particular to a system and method for responding to driver behavior.
Motor vehicles are operated by drivers in various conditions. Lack of sleep, monotonous road conditions, use of items, or health-related conditions can increase the likelihood that a driver may become drowsy or inattentive while driving. When drowsy or inattentive, drivers may have delayed reaction times. A drowsy driver also has an increased likelihood of falling asleep at the wheel, which can pose a risk of harm to the driver, other vehicle occupants, occupants of nearby vehicles, and pedestrians.
In one aspect, a method of controlling one or more vehicle systems in a motor vehicle includes receiving monitoring information, determining if a driver is drowsy and modifying the control of one or more vehicle systems when the driver is drowsy.
In another aspect, a method of controlling a vehicle system in a motor vehicle includes receiving monitoring information, determining a level of drowsiness, and modifying the control of the vehicle system according to the level of drowsiness when the driver is drowsy.
In another aspect, a method of controlling a vehicle system in a motor vehicle includes receiving information from a sensor, where the sensor is capable of detecting information about the autonomic nervous system of a driver. The method also includes determining if the driver is drowsy and modifying the control of the vehicle system when the driver is drowsy.
In another aspect, a method of controlling a vehicle system in a motor vehicle includes receiving monitoring information and determining a body state index for a driver, where the body state index characterizes drowsiness. The method also includes determining a control parameter using the body state index and operating a vehicle system using the control parameter.
Other systems, methods, features and advantages will be, or will become, apparent to one of ordinary skill in the art upon examination of the following figures and detailed description. It is intended that all such additional systems, methods, features and advantages be included within this description and this summary, be within the scope of the embodiments, and be protected by the following claims.
The embodiments can be better understood with reference to the following drawings and detailed description. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the embodiments. Moreover, in the figures, like reference numerals designate corresponding parts throughout the different views.
In some cases, a motor vehicle includes one or more engines. The term “engine” as used throughout the specification and claims refers to any device or machine that is capable of converting energy. In some cases, potential energy is converted to kinetic energy. For example, energy conversion can include a situation where the chemical potential energy of a fuel or fuel cell is converted into rotational kinetic energy or where electrical potential energy is converted into rotational kinetic energy. Engines can also include provisions for converting kinetic energy into potential energy. For example, some engines include regenerative braking systems where kinetic energy from a drive train is converted into potential energy. Engines can also include devices that convert solar or nuclear energy into another form of energy. Some examples of engines include, but are not limited to: internal combustion engines, electric motors, solar energy converters, turbines, nuclear power plants, and hybrid systems that combine two or more different types of energy conversion processes.
For purposes of clarity, only some components of motor vehicle 100 are shown in the current embodiment. Furthermore, it will be understood that in other embodiments some of the components may be optional. Additionally, it will be understood that in other embodiments, any other arrangements of the components illustrated here can be used for powering motor vehicle 100.
Generally, motor vehicle 100 may be propelled by any power source. In some embodiments, motor vehicle 100 may be configured as a hybrid vehicle that uses two or more power sources. In other embodiments, motor vehicle 100 may use a single power source, such as an engine.
In one embodiment, motor vehicle 100 can include engine 102. Generally, the number of cylinders in engine 102 could vary. In some cases, engine 102 could include six cylinders. In other cases, engine 102 could be a three-cylinder, four-cylinder or eight-cylinder engine. In still other cases, engine 102 could have any other number of cylinders.
In some embodiments, motor vehicle 100 may include provisions for communicating, and in some cases controlling, the various components associated with engine 102 and/or other systems of motor vehicle 100. In some embodiments, motor vehicle 100 may include a computer or similar device. In the current embodiment, motor vehicle 100 may include electronic control unit 150, hereby referred to as ECU 150. In one embodiment, ECU 150 may be configured to communicate with, and/or control, various components of motor vehicle 100.
ECU 150 may include a microprocessor, RAM, ROM, and software all serving to monitor and supervise various parameters of the engine, as well as other components or systems of motor vehicle 100. For example, ECU 150 is capable of receiving signals from numerous sensors, devices, and systems located in the engine. The output of various devices is sent to ECU 150 where the device signals may be stored in an electronic storage, such as RAM. Both current and electronically stored signals may be processed by a central processing unit (CPU) in accordance with software stored in an electronic memory, such as ROM.
ECU 150 may include a number of ports that facilitate the input and output of information and power. The term “port” as used throughout this detailed description and in the claims refers to any interface or shared boundary between two conductors. In some cases, ports can facilitate the insertion and removal of conductors. Examples of these types of ports include mechanical connectors. In other cases, ports are interfaces that generally do not provide easy insertion or removal. Examples of these types of ports include soldering or electric traces on circuit boards.
All of the following ports and provisions associated with ECU 150 are optional. Some embodiments may include a given port or provision, while others may exclude it. The following description discloses many of the possible ports and provisions that can be used, however, it should be kept in mind that not every port or provision must be used or included in a given embodiment.
In some embodiments, ECU 150 can include provisions for communicating and/or controlling various systems associated with engine 102. In one embodiment, ECU 150 can include port 151 for receiving various kinds of steering information. In some cases, ECU 150 may communicate with electronic power steering system 160, also referred to as EPS 160, through port 151. EPS 160 may comprise various components and devices utilized for providing steering assistance. In some cases, for example, EPS 160 may include an assist motor as well as other provisions for providing steering assistance to a driver. In addition, EPS 160 could be associated with various sensors including torque sensors, steering angle sensors as well as other kinds of sensors. Examples of electronic power steering systems are disclosed in Kobayashi, U.S. Pat. No. 7,497,471, filed Feb. 27, 2006 as well as Kobayashi, U.S. Pat. No. 7,497,299, filed Feb. 27, 2006, the entirety of both being hereby incorporated by reference.
In some embodiments, ECU 150 can include provisions for receiving various kinds of optical information. In one embodiment, ECU 150 can include port 152 for receiving information from one or more optical sensing devices, such as optical sensing device 162. Optical sensing device 162 could be any kind of optical device including a digital camera, video camera, infrared sensor, laser sensor, as well as any other device capable of detecting optical information. In one embodiment, optical sensing device 162 could be a video camera. In addition, in some cases, ECU 150 could include port 159 for communicating with thermal sensing device 163. Thermal sensing device 163 may be configured to detect thermal information. In some cases, thermal sensing device 163 and optical sensing device 162 could be combined into a single sensor.
Generally, one or more optical sensing devices and/or thermal sensing devices could be associated with any portion of a motor vehicle. In some cases, an optical sensing device could be mounted to the roof of a vehicle cabin. In other cases, an optical sensing device could be mounted in a vehicle dashboard. Moreover, in some cases, multiple optical sensing devices could be installed inside a motor vehicle to provide viewpoints of a driver or occupant from multiple different angles. In one embodiment, optical sensing device 162 may be installed in a portion of motor vehicle 100 so that optical sensing device 162 can capture images of the face and/or head of a driver or occupant. Similarly, thermal sensing device 163 could be located in any portion of motor vehicle 100 including a dashboard, roof or in any other portion. Thermal sensing device 163 may also be located so as to provide a view of the face and/or head of a driver.
In some embodiments, ECU 150 can include provisions for receiving information about the location of a driver's head. In one embodiment, ECU 150 can include port 135 for receiving information related to the distance between a driver's head and headrest 137. In some cases, this information can be received from proximity sensor 134. Proximity sensor 134 could be any type of sensor configured to detect the distance between the driver's head and headrest 137. In some cases, proximity sensor 134 could be a capacitor. In other cases, proximity sensor 134 could be a laser sensing device. In still other cases, any other types of proximity sensors known in the art could be used for proximity sensor 134. Moreover, in other embodiments, proximity sensor 134 could be used to detect the distance between any part of the driver and any portion of motor vehicle 100 including, but not limited to: a headrest, a seat, a steering wheel, a roof or ceiling, a driver side door, a dashboard, a central console as well as any other portion of motor vehicle 100.
In some embodiments, ECU 150 can include provisions for receiving information about the biological state of a driver. For example, ECU 150 could receive information related to the autonomic nervous system (or visceral nervous system) of a driver. In one embodiment, ECU 150 may include port 153 for receiving information about the state of a driver from bio-monitoring sensor 164. Examples of different information about a driver that could be received from bio-monitoring sensor 164 include, but are not limited to: heart information, such as heart rate, blood pressure, and oxygen content; brain information, such as electroencephalogram (EEG) measurements, functional near infrared spectroscopy (fNIRS) measurements, and functional magnetic resonance imaging (fMRI) measurements; digestion information; respiration rate information; salivation information; perspiration information; pupil dilation information; as well as other kinds of information related to the autonomic nervous system or other biological systems of the driver.
Generally, a bio-monitoring sensor could be disposed in any portion of a motor vehicle. In some cases, a bio-monitoring sensor could be disposed in a location proximate to a driver. For example, in one embodiment, bio-monitoring sensor 164 could be located within or on the surface of driver seat 190. In other embodiments, however, bio-monitoring sensor 164 could be located in any other portion of motor vehicle 100, including, but not limited to: a steering wheel, a headrest, an armrest, a dashboard, a rear-view mirror, as well as any other location. Moreover, in some cases, bio-monitoring sensor 164 may be a portable sensor that is worn by a driver, associated with a portable device located in proximity to the driver, such as a smart phone or similar device, or associated with an article of clothing worn by the driver.
In some embodiments, ECU 150 can include provisions for communicating with and/or controlling various visual devices. Visual devices include any devices that are capable of displaying information in a visual manner. These devices can include lights (such as dashboard lights, cabin lights, etc.), visual indicators, video screens (such as a navigation screen or touch screen), as well as any other visual devices. In one embodiment, ECU 150 includes port 154 for communicating with visual devices 166.
In some embodiments, ECU 150 may include provisions for receiving input from a user. For example, in some embodiments, ECU 150 can include port 158 for receiving information from user input device 111. In some cases, user input device 111 could comprise one or more buttons, switches, a touch screen, touch pad, dial, pointer or any other type of input device. For example, in one embodiment, input device 111 could be a keyboard or keypad. In another embodiment, input device 111 could be a touch screen. In one embodiment, input device 111 could be an ON/OFF switch. In some cases, input device 111 could be used to turn on or off any body state monitoring devices associated with the vehicle or driver. For example, in an embodiment where an optical sensor is used to detect body state information, input device 111 could be used to switch this type of monitoring on or off. In embodiments using multiple monitoring devices, input device 111 could be used to simultaneously turn on or off all the different types of monitoring associated with these monitoring devices. In other embodiments, input device 111 could be used to selectively turn on or off some monitoring devices but not others.
In some embodiments, ECU 150 may include ports for communicating with and/or controlling various different engine components or systems. Examples of different engine components or systems include, but are not limited to: fuel injectors, spark plugs, electronically controlled valves, a throttle, as well as other systems or components utilized for the operation of engine 102.
It will be understood that only some components of motor vehicle 100 are shown in the current embodiment. In other embodiments, additional components could be included, while some of the components shown here could be optional. Moreover, ECU 150 could include additional ports for communicating with various other systems, sensors or components of motor vehicle 100. As an example, in some cases, ECU 150 could be in electrical communication with various sensors for detecting various operating parameters of motor vehicle 100, including but not limited to: vehicle speed, vehicle location, yaw rate, lateral g forces, fuel level, fuel composition, various diagnostic parameters as well as any other vehicle operating parameters and/or environmental parameters (such as ambient temperature, pressure, elevation, etc.).
In some embodiments, ECU 150 can include provisions for communicating with and/or controlling various different vehicle systems. Vehicle systems include any automatic or manual systems that may be used to enhance the driving experience and/or enhance safety. In one embodiment, ECU 150 can include port 157 for communicating with and/or controlling vehicle systems 172. For purposes of illustration, a single port is shown in the current embodiment for communicating with vehicle systems 172. However, it will be understood that in some embodiments, more than one port can be used. For example, in some cases, a separate port may be used for communicating with each separate vehicle system of vehicle systems 172. Moreover, in embodiments where ECU 150 comprises part of the vehicle system, ECU 150 can include additional ports for communicating with and/or controlling various different components or devices of a vehicle system. Examples of different vehicle systems 172 are discussed below.
Motor vehicle 100 can include electronic stability control system 222 (also referred to as ESC system 222). ESC system 222 can include provisions for maintaining the stability of motor vehicle 100. In some cases, ESC system 222 may monitor the yaw rate and/or lateral g acceleration of motor vehicle 100 to help improve traction and stability. ESC system 222 may actuate one or more brakes automatically to help improve traction. An example of an electronic stability control system is disclosed in Ellis et al., U.S. Pat. No. 8,423,257, the entirety of which is hereby incorporated by reference. In one embodiment, the electronic stability control system may be a vehicle stability system.
In some embodiments, motor vehicle 100 can include antilock brake system 224 (also referred to as ABS system 224). ABS system 224 can include various different components such as a speed sensor, a pump for applying pressure to the brake lines, valves for removing pressure from the brake lines, and a controller. In some cases, a dedicated ABS controller may be used. In other cases, ECU 150 can function as an ABS controller. Examples of antilock braking systems are known in the art. One example is disclosed in Ingaki, et al., U.S. Pat. No. 6,908,161, filed Nov. 18, 2003, the entirety of which is hereby incorporated by reference. Using ABS system 224 may help improve traction in motor vehicle 100 by preventing the wheels from locking up during braking.
Motor vehicle 100 can include brake assist system 226. Brake assist system 226 may be any system that helps to reduce the force required by a driver to depress a brake pedal. In some cases, brake assist system 226 may be activated for older drivers or any other drivers who may need assistance with braking. An example of a brake assist system can be found in Wakabayashi et al., U.S. Pat. No. 6,309,029, filed Nov. 17, 1999, the entirety of which is hereby incorporated by reference.
In some embodiments, motor vehicle 100 can include automatic brake prefill system 228 (also referred to as ABP system 228). ABP system 228 includes provisions for prefilling one or more brake lines with brake fluid prior to a collision. This may help reduce the reaction time of the braking system as the driver depresses the brake pedal. Examples of automatic brake prefill systems are known in the art. One example is disclosed in Bitz, U.S. Pat. No. 7,806,486, the entirety of which is hereby incorporated by reference.
In some embodiments, motor vehicle 100 can include low speed follow system 230 (also referred to as LSF system 230). LSF system 230 includes provisions for automatically following a preceding vehicle at a set distance or range of distances. This may reduce the need for the driver to repeatedly depress and release the accelerator pedal in slow traffic situations. LSF system 230 may include components for monitoring the relative position of a preceding vehicle (for example, using remote sensing devices such as lidar or radar). In some cases, LSF system 230 may include provisions for communicating with any preceding vehicles for determining the GPS positions and/or speeds of the vehicles. Examples of low speed follow systems are known in the art. One example is disclosed in Arai, U.S. Pat. No. 7,337,056, filed Mar. 23, 2005, the entirety of which is hereby incorporated by reference. Another example is disclosed in Higashimata et al., U.S. Pat. No. 6,292,737, filed May 19, 2000, the entirety of which is hereby incorporated by reference.
Motor vehicle 100 can include cruise control system 232. Cruise control systems are well known in the art and allow a user to set a cruising speed that is automatically maintained by a vehicle control system. For example, while traveling on a highway, a driver may set the cruising speed to 55 mph. Cruise control system 232 may maintain the vehicle speed at approximately 55 mph automatically, until the driver depresses the brake pedal or otherwise deactivates the cruising function.
Motor vehicle 100 can include collision warning system 234. In some cases, collision warning system 234 may include provisions for warning a driver of any potential collision threats with one or more vehicles. For example, a collision warning system can warn a driver when another vehicle is passing through an intersection as motor vehicle 100 approaches the same intersection. Examples of collision warning systems are disclosed in Mochizuki, U.S. Pat. No. 8,557,718, and Mochizuki et al., U.S. Pat. No. 8,587,418, the entirety of both being hereby incorporated by reference. In one embodiment, collision warning system 234 could be a forward collision warning system.
Motor vehicle 100 can include collision mitigation braking system 236 (also referred to as CMBS 236). CMBS 236 may include provisions for monitoring vehicle operating conditions (including target vehicles and objects in the environment of the vehicle) and automatically applying various stages of warning and/or control to mitigate collisions. For example, in some cases, CMBS 236 may monitor forward vehicles using a radar or other type of remote sensing device. If motor vehicle 100 gets too close to a forward vehicle, CMBS 236 could enter a first warning stage. During the first warning stage, a visual and/or audible warning may be provided to warn the driver. If motor vehicle 100 continues to get closer to the forward vehicle, CMBS 236 could enter a second warning stage. During the second warning stage, CMBS 236 could apply automatic seatbelt pretensioning. In some cases, visual and/or audible warnings could continue throughout the second warning stage. Moreover, in some cases, during the second stage automatic braking could also be activated to help reduce the vehicle speed. In some cases, a third stage of operation for CMBS 236 may involve braking the vehicle and tightening a seatbelt automatically in situations where a collision is very likely. An example of such a system is disclosed in Bond, et al., U.S. Pat. No. 6,607,255, filed Jan. 17, 2002, the entirety of which is hereby incorporated by reference. The term collision mitigation braking system as used throughout this detailed description and in the claims refers to any system that is capable of sensing potential collision threats and providing various types of warning responses as well as automated braking in response to potential collisions.
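For purposes of illustration only, the following Python sketch shows one possible way to organize the staged warning and control logic described above. The time-to-collision thresholds, stage numbers and action names are hypothetical assumptions introduced here for clarity; they are not taken from the referenced patent.

```python
# Hypothetical sketch of a staged collision-mitigation escalation.
# Thresholds and action names are illustrative assumptions only.

def cmbs_stage(time_to_collision_s: float) -> dict:
    """Map a time-to-collision estimate to a warning/control stage."""
    if time_to_collision_s > 3.0:
        return {"stage": 0, "actions": []}
    if time_to_collision_s > 2.0:
        # First stage: visual and/or audible warning only.
        return {"stage": 1, "actions": ["visual_warning", "audible_warning"]}
    if time_to_collision_s > 1.0:
        # Second stage: add seatbelt pretensioning and light automatic braking.
        return {"stage": 2, "actions": ["visual_warning", "audible_warning",
                                        "pretension_seatbelt", "light_braking"]}
    # Third stage: strong automatic braking and seatbelt tightening.
    return {"stage": 3, "actions": ["pretension_seatbelt", "full_braking"]}


if __name__ == "__main__":
    for ttc in (4.0, 2.5, 1.5, 0.8):
        print(ttc, cmbs_stage(ttc))
```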
Motor vehicle 100 can include auto cruise control system 238 (also referred to as ACC system 238). In some cases, ACC system 238 may include provisions for automatically controlling the vehicle to maintain a predetermined following distance behind a preceding vehicle or to prevent a vehicle from getting closer than a predetermined distance to a preceding vehicle. ACC system 238 may include components for monitoring the relative position of a preceding vehicle (for example, using remote sensing devices such as lidar or radar). In some cases, ACC system 238 may include provisions for communicating with any preceding vehicles for determining the GPS positions and/or speeds of the vehicles. An example of an auto cruise control system is disclosed in Arai et al., U.S. Pat. No. 7,280,903, filed Aug. 31, 2005, the entirety of which is hereby incorporated by reference.
Motor vehicle 100 can include lane departure warning system 240 (also referred to as LDW system 240). LDW system 240 may determine when a driver is deviating from a lane and provide a warning signal to alert the driver. An example of a lane departure warning system can be found in Tanida et al., U.S. Pat. No. 8,063,754, the entirety of which is hereby incorporated by reference.
Motor vehicle 100 can include blind spot indicator system 242. Blind spot indicator system 242 can include provisions for helping to monitor the blind spot of a driver. In some cases, blind spot indicator system 242 can include provisions to warn a driver if a vehicle is located within a blind spot. Any known systems for detecting objects traveling around a vehicle can be used.
In some embodiments, motor vehicle 100 can include lane keep assist system 244. Lane keep assist system 244 can include provisions for helping a driver to stay in the current lane. In some cases, lane keep assist system 244 can warn a driver if motor vehicle 100 is unintentionally drifting into another lane. Also, in some cases, lane keep assist system 244 may provide assisting control to maintain a vehicle in a predetermined lane. An example of a lane keep assist system is disclosed in Nishikawa et al., U.S. Pat. No. 6,092,619, filed May 7, 1997, the entirety of which is hereby incorporated by reference.
In some embodiments, motor vehicle 100 could include navigation system 248. Navigation system 248 could be any system capable of receiving, sending and/or processing navigation information. The term “navigation information” refers to any information that can be used to assist in determining a location or providing directions to a location. Some examples of navigation information include street addresses, street names, street or address numbers, apartment or suite numbers, intersection information, points of interest, parks, any political or geographical subdivision including town, township, province, prefecture, city, state, district, ZIP or postal code, and country. Navigation information can also include commercial information including business and restaurant names, commercial districts, shopping centers, and parking facilities. In some cases, the navigation system could be integrated into the motor vehicle. In other cases, the navigation system could be a portable or stand-alone navigation system.
Motor vehicle 100 can include climate control system 250. Climate control system 250 may be any type of system used for controlling the temperature or other ambient conditions in motor vehicle 100. In some cases, climate control system 250 may comprise a heating, ventilation and air conditioning (HVAC) system as well as an electronic controller for operating the HVAC system. In some embodiments, climate control system 250 can include a separate dedicated controller. In other embodiments, ECU 150 may function as a controller for climate control system 250. Any kind of climate control system known in the art may be used.
Motor vehicle 100 can include electronic pretensioning system 254 (also referred to as EPT system 254). EPT system 254 may be used with a seatbelt for a vehicle. EPT system 254 can include provisions for automatically tightening, or tensioning, the seatbelt. In some cases, EPT system 254 may automatically pretension the seatbelt prior to a collision. An example of an electronic pretensioning system is disclosed in Masuda et al., U.S. Pat. No. 6,164,700, filed Apr. 20, 1999, the entirety of which is hereby incorporated by reference.
Additionally, vehicle systems 172 could incorporate electronic power steering system 160, visual devices 166, audio devices 168 and tactile devices 170, as well as any other kinds of devices, components or systems used with vehicles.
It will be understood that each of these vehicle systems may be standalone systems or may be integrated with ECU 150. For example, in some cases, ECU 150 may operate as a controller for various components of one or more vehicle systems. In other cases, some systems may comprise separate dedicated controllers that communicate with ECU 150 through one or more ports.
Additionally, in some embodiments, motor vehicle 100 may include brain monitoring system 310 for monitoring various kinds of brain information. In some cases, brain monitoring system 310 could include electroencephalogram (EEG) sensors 330, functional near infrared spectroscopy (fNIRS) sensors 332, functional magnetic resonance imaging (fMRI) sensors 334 as well as other kinds of sensors capable of detecting brain information. Such sensors could be located in any portion of motor vehicle 100. In some cases, sensors associated with brain monitoring system 310 could be disposed in a headrest. In other cases, sensors could be disposed in the roof of motor vehicle 100. In still other cases, sensors could be disposed in any other locations.
In some embodiments, motor vehicle 100 may include digestion monitoring system 312. In other embodiments, motor vehicle 100 may include salivation monitoring system 314. In some cases, monitoring digestion and/or salivation could also help in determining if a driver is drowsy. Sensors for monitoring digestion information and/or salivation information can be disposed in any portion of a vehicle. In some cases, sensors could be disposed on a portable device used or worn by a driver.
It will be understood that each of the monitoring systems discussed above could be associated with one or more sensors or other devices. In some cases, the sensors could be disposed in one or more portions of motor vehicle 100. For example, the sensors could be integrated into a seat, door, dashboard, steering wheel, center console, roof or any other portion of motor vehicle 100. In other cases, however, the sensors could be portable sensors worn by a driver, integrated into a portable device carried by the driver or integrated into an article of clothing worn by the driver.
For purposes of convenience, various components discussed above may be referred to collectively as response system 199.
Response system 199 can include provisions for determining if a driver is drowsy based on biological information, including information related to the autonomic nervous system of the driver. For example, a response system could detect a drowsy condition for a driver by analyzing heart information, breathing rate information, brain information, perspiration information as well as any other kinds of autonomic information.
A motor vehicle can include provisions for assessing the behavior of a driver and automatically adjusting the operation of one or more vehicle systems in response to the behavior. Throughout this specification, drowsiness will be used as the example behavior being assessed; however, it should be understood that any driver behavior could be assessed, including but not limited to drowsy behavior, distracted behavior, impaired behavior and/or generally inattentive behavior. The assessment and adjustment discussed below may compensate for the driver's slower reaction time, attention lapse and/or reduced alertness. For example, in situations where a driver may be drowsy, the motor vehicle can include provisions for detecting that the driver is drowsy. Moreover, since drowsiness can increase the likelihood of hazardous driving situations, the motor vehicle can include provisions for modifying one or more vehicle systems automatically in order to mitigate hazardous driving situations. In one embodiment, a driver behavior response system can receive information about the state of a driver and automatically adjust the operation of one or more vehicle systems.
The following detailed description discusses a variety of different methods for operating vehicle systems in response to driver behavior. In different embodiments, the various different steps of these processes may be accomplished by one or more different systems, devices or components. In some embodiments, some of the steps could be accomplished by a response system 199 of a motor vehicle. In some cases, some of the steps may be accomplished by an ECU 150 of a motor vehicle. In other embodiments, some of the steps could be accomplished by other components of a motor vehicle, including but not limited to, the vehicle systems 172. Moreover, for each process discussed below and illustrated in the Figures it will be understood that in some embodiments one or more of the steps could be optional.
In step 402, response system 199 may receive monitoring information. In some cases, the monitoring information can be received from one or more sensors. In other cases, the monitoring information can be received from one or more autonomic monitoring systems. In still other cases, the monitoring information can be received from one or more vehicle systems. In still other cases, the monitoring information can be received from any other device of motor vehicle 100. In still other cases, the monitoring information can be received from any combination of sensors, monitoring systems, vehicle systems or other devices.
In step 404, response system 199 may determine the driver state. In some cases, the driver state may be normal or drowsy. In other cases, the driver state may include three or more states ranging between normal and very drowsy (or even asleep). In this step, response system 199 may use any information received during step 402, including information from any kinds of sensors or systems. For example, in one embodiment, response system 199 may receive information from an optical sensing device that indicates the driver has closed his or her eyes for a substantial period of time. Other examples of determining the state of a driver are discussed in detail below.
In step 406, response system 199 may determine whether or not the driver is drowsy. If the driver is not drowsy, response system 199 may proceed back to step 402 to receive additional monitoring information. If, however, the driver is drowsy, response system 199 may proceed to step 408. In step 408, response system 199 may automatically modify the control of one or more vehicle systems, including any of the vehicle systems discussed above. By automatically modifying the control of one or more vehicle systems, response system 199 may help to avoid various hazardous situations that can be caused by a drowsy driver.
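For purposes of illustration only, the following Python sketch shows one possible form of the monitor-determine-modify loop of steps 402 through 408. The sensor reading, the drowsiness test and the 0.7 threshold are hypothetical placeholders; an actual embodiment could fuse many different signals as described elsewhere in this specification.

```python
# Minimal sketch of the monitor/determine/modify loop (steps 402-408).
# All signals, thresholds, and actions below are illustrative assumptions.

import random
import time


def receive_monitoring_information() -> dict:
    # Stand-in for sensor/vehicle-system input (e.g., an eyelid closure ratio).
    return {"eyelid_closure": random.random()}


def driver_is_drowsy(info: dict) -> bool:
    # Simple illustrative test; a real system could fuse many signals.
    return info["eyelid_closure"] > 0.7


def modify_vehicle_systems() -> None:
    print("Modifying control: earlier warnings, increased headway, etc.")


def response_loop(cycles: int = 5) -> None:
    for _ in range(cycles):
        info = receive_monitoring_information()   # step 402
        if driver_is_drowsy(info):                # steps 404/406
            modify_vehicle_systems()              # step 408
        time.sleep(0.01)


if __name__ == "__main__":
    response_loop()
```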
In some embodiments, a user may not want any vehicle systems modified or adjusted. In these cases, the user may switch input device 111, or a similar kind of input device, to the OFF position.
Additionally, upon detecting that a driver is drowsy or otherwise inattentive, response system 199 may control the low speed follow system 230, the cruise control system 232, the collision warning system 234, the collision mitigation braking system 236, the auto cruise control system 238, the lane departure warning system 240, the blind spot indicator system 242 and the lane keep assist system 244 to provide added protection during the driver's lapse of attention. For example, the low speed follow system 230, cruise control system 232 and lane keep assist system 244 could be disabled when the driver is drowsy to prevent unintended use of these systems. Likewise, the collision warning system 234, collision mitigation braking system 236, lane departure warning system 240 and blind spot indicator system 242 could warn a driver sooner about potential hazards. In some cases, the auto cruise control system 238 could be configured to increase the minimum gap distance between motor vehicle 100 and the preceding vehicle.
In some embodiments, upon detecting that a driver is drowsy or otherwise inattentive, response system 199 may control the electronic power steering system 160, visual devices 166, the climate control system 250 (such as HVAC), audio devices 168, the electronic pretensioning system 254 for a seatbelt and tactile devices 170 to supplement the driver's alertness. For example, the electronic power steering system 160 may be controlled to decrease power steering assistance. This requires the driver to apply more effort and can help improve awareness or alertness. Visual devices 166 and audio devices 168 may be used to provide visual feedback and audible feedback, respectively. Tactile devices 170 and the electronic pretensioning system 254 can be used to provide tactile feedback to a driver. Also, the climate control system 250 may be used to change the cabin or driver temperature to affect the drowsiness of the driver. For example, by changing the cabin temperature the driver may be made more alert.
A response system can include provisions for determining a level of drowsiness for a driver. The term “level of drowsiness” as used throughout this detailed description and in the claims refers to any numerical or other kind of value for distinguishing between two or more states of drowsiness. For example, in some cases, the level of drowsiness may be given as a percentage between 0% and 100%, where 0% refers to a driver that is totally alert and 100% refers to a driver that is fully drowsy or even asleep. In other cases, the level of drowsiness could be a value in the range between 1 and 10. In still other cases, the level of drowsiness may not be a numerical value, but could be associated with a given discrete state, such as “not drowsy”, “slightly drowsy”, “drowsy”, “very drowsy” and “extremely drowsy”. Moreover, the level of drowsiness could be a discrete value or a continuous value. In some cases, the level of drowsiness may be associated with a body state index, which is discussed in further detail below.
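For purposes of illustration only, the following Python sketch maps a percentage level of drowsiness onto the discrete states named above; the 20% cut points are arbitrary assumptions.

```python
# Illustrative mapping from a 0-100% drowsiness level to a discrete state.
# The state names follow the description above; the cut points are arbitrary.

DISCRETE_STATES = ["not drowsy", "slightly drowsy", "drowsy",
                   "very drowsy", "extremely drowsy"]


def drowsiness_label(level_percent: float) -> str:
    """Map a 0-100% drowsiness level onto one of the discrete states."""
    level_percent = max(0.0, min(100.0, level_percent))
    index = min(int(level_percent // 20), len(DISCRETE_STATES) - 1)
    return DISCRETE_STATES[index]


if __name__ == "__main__":
    for level in (5, 35, 55, 75, 95):
        print(level, "->", drowsiness_label(level))
```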
In step 442, response system 199 may receive monitoring information. In some cases, the monitoring information can be received from one or more sensors. In other cases, the monitoring information can be received from one or more autonomic monitoring systems. In still other cases, the monitoring information can be received from one or more vehicle systems. In still other cases, the monitoring information can be received from any other device of motor vehicle 100. In still other cases, the monitoring information can be received from any combination of sensors, monitoring systems, vehicle systems or other devices.
In step 444, response system 199 may determine if the driver is drowsy. If the driver is not drowsy, response system 199 may return to step 442. If the driver is drowsy, response system 199 may proceed to step 446. In step 446, response system 199 may determine the level of drowsiness. As discussed above, the level of drowsiness could be represented by a numerical value or could be a discrete state labeled by a name or variable. In step 448, response system 199 may modify the control of one or more vehicle systems according to the level of drowsiness.
Examples of systems that can be modified according to the level of drowsiness include, but are not limited to: antilock brake system 224, automatic brake prefill system 228, brake assist system 226, auto cruise control system 238, electronic stability control system 222, collision warning system 234, lane keep assist system 244, blind spot indicator system 242, electronic pretensioning system 254 and climate control system 250. In addition, electronic power steering system 160 could be modified according to the level of drowsiness, as could visual devices 166, audio devices 168 and tactile devices 170. In some embodiments, the timing and/or intensity associated with various warning indicators (visual indicators, audible indicators, haptic indicators, etc.) could be modified according to the level of drowsiness. For example, in one embodiment, electronic pretensioning system 254 could increase or decrease the intensity and/or frequency of automatic seatbelt tightening to warn the driver at a level appropriate for the level of drowsiness.
As an example, when a driver is extremely drowsy, the antilock brake system 224 may be modified to achieve a shorter stopping distance than when a driver is somewhat drowsy. As another example, automatic brake prefill system 228 could adjust the amount of brake fluid delivered during a prefill or the timing of the prefill according to the level of drowsiness. Likewise, the level of brake assistance provided by brake assist system 226 could be varied according to the level of drowsiness, with assistance increasing as the level of drowsiness increases. Also, the headway distance for auto cruise control system 238 could be increased with the level of drowsiness. In addition, the error between the yaw rate and the steering yaw rate determined by electronic stability control system 222 could be decreased in proportion to the level of drowsiness. In some cases, collision warning system 234 and lane departure warning system 240 could provide earlier warnings to a drowsy driver, where the timing of the warnings is modified in proportion to the level of drowsiness. Likewise, the detection area size associated with blind spot indicator system 242 could be varied according to the level of drowsiness. In some cases, the strength of a warning pulse generated by electronic pretensioning system 254 may vary in proportion to the level of drowsiness. Also, climate control system 250 may vary the number of degrees that the temperature is changed according to the level of drowsiness. Moreover, the brightness of the lights activated by visual devices 166 when a driver is drowsy could be varied in proportion to the level of drowsiness. Also, the volume of sound generated by audio devices 168 could be varied in proportion to the level of drowsiness. In addition, the amount of vibration or tactile stimulation delivered by tactile devices 170 could be varied in proportion to the level of drowsiness. In some cases, the maximum speed at which low speed follow system 230 operates could be modified according to the level of drowsiness. Likewise, the on/off setting or the maximum speed at which cruise control system 232 can be set may be modified in proportion to the level of drowsiness. Additionally, the degree of power steering assistance provided by electronic power steering system 160 could be varied in proportion to the level of drowsiness. Also, the distance at which the collision mitigation braking system begins to brake could be lengthened, or the lane keep assist system could be modified so that the driver must provide more input to the system.
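For purposes of illustration only, the following Python sketch shows one possible way to scale several control parameters in proportion to a level of drowsiness, in the spirit of the examples above. The baseline values and scaling factors are hypothetical assumptions.

```python
# Sketch of scaling a few control parameters with the level of drowsiness.
# Baseline values and scaling factors are illustrative assumptions only.

def scaled_parameters(level: float, max_level: float = 10.0) -> dict:
    """Return adjusted control parameters for a drowsiness level (0..max_level)."""
    f = max(0.0, min(level / max_level, 1.0))   # normalized drowsiness, 0..1
    return {
        # Auto cruise control: headway distance grows with drowsiness.
        "headway_m": 40.0 * (1.0 + 0.5 * f),
        # Collision/lane-departure warnings: earlier warning lead time.
        "warning_lead_s": 2.0 * (1.0 + 0.75 * f),
        # Electronic power steering: reduced assistance to require more effort.
        "steering_assist": 1.0 - 0.3 * f,
        # Audio warning volume scaled up with drowsiness.
        "warning_volume": 0.5 + 0.5 * f,
    }


if __name__ == "__main__":
    for level in (0, 5, 10):
        print(level, scaled_parameters(level))
```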
In step 452, response system 199 may receive monitoring information, as discussed above with respect to step 442.
Next, in step 456, response system 199 can determine a body state index of the driver. The term “body state index” refers to a measure of the drowsiness of a driver. In some cases, the body state index could be given as a numerical value. In other cases, the body state index could be given as a non-numerical value. Moreover, the body state index may range from values associated with complete alertness to values associated with extreme drowsiness or even a state in which the driver is asleep. In one embodiment, the body state index could take on the values 1, 2, 3 and 4, where 1 is the least drowsy and 4 is the most drowsy. In another embodiment, the body state index could take on values from 1-10.
Generally, the body state index of the driver can be determined using any of the methods discussed throughout this detailed description for detecting driver behavior as it relates to drowsiness. In particular, the level of drowsiness may be detected by sensing different degrees of driver behavior. For example, as discussed below, drowsiness in a driver may be detected by sensing eyelid movement and/or head movement. In some cases, the degree of eyelid movement (the degree to which the eyes are open or closed) or the degree of head movement (how tilted the head is) could be used to determine the body state index. In other cases, the autonomic monitoring systems could be used to determine the body state index. In still other cases, the vehicle systems could be used to determine the body state index. For example, the degree of unusual steering behavior or the degree of lane departures may indicate a certain body state index.
In step 458, response system 199 may determine a control parameter. The term “control parameter” as used throughout this detailed description and in the claims refers to a parameter used by one or more vehicle systems. In some cases, a control parameter may be an operating parameter that is used to determine if a particular function should be activated for a given vehicle system. For example, in situations where an electronic stability control system is used, the control parameter may be a threshold error in the steering yaw rate that is used to determine if stability control should be activated. As another example, in situations where automatic cruise control is used, the control parameter may be a parameter used to determine if cruise control should be automatically turned off. Further examples of control parameters are discussed in detail below and include, but are not limited to: stability control activation thresholds, brake assist activation thresholds, blind spot monitoring zone thresholds, time to collision thresholds, road crossing thresholds, lane keep assist system status, low speed follow status, electronic power steering status, auto cruise control status as well as other control parameters.
In some cases, a control parameter can be determined using vehicle system information as well as the body state index determined during step 456. In other cases, only the body state index may be used to determine the control parameter. In still other cases, only the vehicle operating information may be used to determine the control parameter. Following step 458, during step 460, response system 199 may operate a vehicle system using the control parameter.
In one embodiment, the value of the control coefficient 470 increases from 0% to 25% as the body state index increases from 1 to 4. In some cases, the control coefficient may serve as a multiplicative factor for increasing or decreasing the value of a control parameter. For example, in some cases when the body state index is 4, the control coefficient may be used to increase the value of a control parameter by 25%. In other embodiments, the control coefficient could vary in any other manner. In some cases, the control coefficient could vary linearly as a function of body state index. In other cases, the control coefficient could vary in a nonlinear manner as a function of body state index. In still other cases, the control coefficient could vary between two or more discrete values as a function of body state index.
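For purposes of illustration only, the following Python sketch shows a control coefficient that increases linearly from 0% to 25% as the body state index increases from 1 to 4 and is applied as a multiplicative factor to a control parameter. As noted above, a nonlinear or stepwise mapping could be used instead.

```python
# Linear control coefficient: 0% at body state index 1, 25% at index 4,
# used as a multiplicative factor on a control parameter.

def control_coefficient(body_state_index: int) -> float:
    """Return a coefficient in 0.00..0.25 for a body state index in 1..4."""
    index = max(1, min(4, body_state_index))
    return 0.25 * (index - 1) / 3.0


def adjusted_parameter(base_value: float, body_state_index: int) -> float:
    """Increase a control parameter by the coefficient (e.g., +25% at index 4)."""
    return base_value * (1.0 + control_coefficient(body_state_index))


if __name__ == "__main__":
    for idx in (1, 2, 3, 4):
        print(idx, round(control_coefficient(idx), 3),
              round(adjusted_parameter(100.0, idx), 1))
```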
It will be understood that calculation unit 480 is intended to be any general algorithm or process used to determine one or more control parameters. In some cases, calculation unit 480 may be associated with response system 199 and/or ECU 150. In other cases, however, calculation unit 480 could be associated with any other system or device of motor vehicle 100, including any of the vehicle systems discussed previously.
In some embodiments, a control parameter may be associated with a status or state of a given vehicle system.
A response system can include provisions for detecting the state of a driver by monitoring the eyes of a driver.
It will be understood that any type of algorithm known in the art for analyzing eye movement from images can be used. In particular, any type of algorithm that can recognize the eyes and determine the position of the eyelids between a closed and open position may be used. Examples of such algorithms may include various pattern recognition algorithms known in the art.
In other embodiments, thermal sensing device 163 can be used to sense eyelid movement. For example, as the eyelids move between opened and closed positions, the amount of thermal radiation received at thermal sensing device 163 may vary. In other words, thermal sensing device 163 can be configured to distinguish between various eyelid positions based on variations in the detected temperature of the eyes.
In step 602, response system 199 may receive optical/thermal information. In some cases, optical information could be received from a camera or other optical sensing device. In other cases, thermal information could be received from a thermal sensing device. In still other cases, both optical and thermal information could be received from a combination of optical and thermal devices.
In step 604, response system 199 may analyze eyelid movement. By detecting eyelid movement, response system 199 can determine if the eyes of a driver are open, closed or in a partially closed position. The eyelid movement can be determined using either optical information or thermal information received during step 602. Moreover, as discussed above, any type of software or algorithm can be used to determine eyelid movement from the optical or thermal information. Although the current embodiment comprises a step of analyzing eyelid movement, in other embodiments the movement of the eyeballs could also be analyzed.
In step 606, response system 199 determines the body state index of the driver according to the eyelid movement. The body state index may have any value. In some cases, the value ranges between 1 and 4, with 1 being the least drowsy and 4 being the drowsiest state. In some cases, to determine the body state index, response system 199 determines if the eyes are closed or partially closed for extended periods. In order to distinguish drooping eyelids due to drowsiness from blinking, response system 199 may use a threshold time that the eyelids are closed or partially closed. If the eyes of the driver are closed or partially closed for periods longer than the threshold time, response system 199 may determine that this is due to drowsiness. In such cases, the driver may be assigned a body state index that is greater than 1 to indicate that the driver is drowsy. Moreover, response system 199 may assign different body state index values for different degrees of eyelid movement or eyelid closure.
In some embodiments, response system 199 may determine the body state index based on detecting a single instance of prolonged eyelid closure or partial eyelid closure. Of course, it may also be the case that response system 199 analyzes eye movement over an interval of time and looks at average eye movements.
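For purposes of illustration only, the following Python sketch shows one possible way to assign a body state index from eyelid closure durations, using a threshold time to separate ordinary blinking from drowsy eyelid closure as described in step 606. The 0.5 second blink threshold and the index cut points are arbitrary assumptions.

```python
# Sketch of steps 602-606: body state index from eyelid closure durations.
# The blink threshold and index cut points are illustrative assumptions.

BLINK_THRESHOLD_S = 0.5   # closures shorter than this are treated as blinks


def body_state_from_eyelids(closure_events_s: list[float]) -> int:
    """Map durations of closed/partially closed eye events to an index 1-4."""
    longest = max((d for d in closure_events_s if d >= BLINK_THRESHOLD_S),
                  default=0.0)
    if longest == 0.0:
        return 1            # only normal blinking observed
    if longest < 1.0:
        return 2
    if longest < 2.0:
        return 3
    return 4                # prolonged closure: extremely drowsy


if __name__ == "__main__":
    print(body_state_from_eyelids([0.1, 0.2, 0.15]))   # -> 1
    print(body_state_from_eyelids([0.2, 0.8]))         # -> 2
    print(body_state_from_eyelids([0.3, 2.5]))         # -> 4
```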
A response system can include provisions for detecting the state of a driver by monitoring the head of a driver.
It will be understood that any type of algorithm known in the art for analyzing head movement from images can be used. In particular, any type of algorithm that can recognize the head and determine the position of the head may be used. Examples of such algorithms may include various pattern recognition algorithms known in the art.
In step 802, response system 199 may receive optical and/or thermal information. In some cases, optical information could be received from a camera or other optical sensing device. In other cases, thermal information could be received from a thermal sensing device. In still other cases, both optical and thermal information could be received from a combination of optical and thermal devices.
In step 804, response system 199 may analyze head movement. By detecting head movement, response system 199 can determine if a driver is leaning forward. The head movement can be determined using either optical information or thermal information received during step 802. Moreover, as discussed above, any type of software or algorithm can be used to determine head movement from the optical or thermal information.
In step 806, response system 199 determines the body state index of the driver in response to the detected head movement. For example, in some cases, to determine the body state index of the driver, response system 199 determines if the head is tilted in any direction for extended periods. In some cases, response system 199 may determine if the head is tilting forward. In some cases, response system 199 may assign a body state index depending on the level of tilt and/or the time interval over which the head remains tilted. For example, if the head is tilted forward for brief periods, the body state index may be assigned a value of 2, to indicate that the driver is slightly drowsy. If the head is tilted forward for a significant period of time, the body state index may be assigned a value of 4 to indicate that the driver is extremely drowsy.
In some embodiments, response system 199 may determine the body state index based on detecting a single instance of a driver tilting his or her head forward. Of course, it may also be the case that response system 199 analyzes head movement over an interval of time and looks at average head movements.
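For purposes of illustration only, the following Python sketch assigns a body state index from the degree and duration of forward head tilt, in the manner of step 806. The angle and duration limits are arbitrary assumptions.

```python
# Sketch of steps 802-806: body state index from head tilt and its duration.
# The angle and duration limits are illustrative assumptions.

def body_state_from_head(tilt_deg: float, tilted_duration_s: float) -> int:
    """Return an index 1-4 from forward head tilt and how long it persists."""
    if tilt_deg < 10.0:
        return 1                         # head essentially upright
    if tilted_duration_s < 2.0:
        return 2                         # brief forward tilt: slightly drowsy
    if tilted_duration_s < 5.0:
        return 3
    return 4                             # sustained tilt: extremely drowsy


if __name__ == "__main__":
    print(body_state_from_head(5.0, 10.0))    # -> 1
    print(body_state_from_head(20.0, 1.0))    # -> 2
    print(body_state_from_head(25.0, 8.0))    # -> 4
```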
A response system can include provisions for detecting the state of a driver by monitoring the relative position of the driver's head with respect to a headrest.
It will be understood that any type of algorithm known in the art for analyzing head distance and/or movement from proximity or distance information can be used. In particular, any type of algorithm that can determine the relative distance between a headrest and the driver's head can be used. Also, any algorithms for analyzing changes in distance to determine head motion could also be used. Examples of such algorithms may include various pattern recognition algorithms known in the art.
In step 202, response system 199 may receive proximity information. In some cases, proximity information could be received from a capacitor or laser based sensor. In other cases, proximity information could be received from any other sensor. In step 204, response system 199 may analyze the distance of the head from a headrest. By determining the distance between the driver's head and the head rest, response system 199 can determine if a driver is leaning forward. Moreover, by analyzing head distance over time, response system 199 can also detect motion of the head. The distance of the head from the headrest can be determined using any type of proximity information received during step 202. Moreover, as discussed above, any type of software or algorithm can be used to determine the distance of the head and/or head motion information.
In step 206, response system 199 determines the body state index of the driver in response to the detected head distance and/or head motion. For example, in some cases, to determine the body state index of the driver, response system 199 determines if the head is leaning away from the headrest for extended periods. In some cases, response system 199 may determine if the head is tilting forward. In some cases, response system 199 may assign a body state index depending on the distance of the head from the head rest as well as from the time interval over which the head is located away from the headrest. For example, if the head is located away from the headrest for brief periods, the body state index may be assigned a value of 2, to indicate that the driver is slightly drowsy. If the head is located away from the headrest for a significant period of time, the body state index may be assigned a value of 4 to indicate that the driver is extremely drowsy. It will be understood that in some cases, a system could be configured so that the alert state of the driver is associated with a predetermined distance between the head and the headrest. This predetermined distance could be a factory set value or a value determined by monitoring a driver over time. Then, the body state index may be increased when the driver's head moves closer to the headrest or further from the headrest with respect to the predetermined distance. In other words, in some cases the system may recognize that the driver's head may tilt forward and/or backward as he or she gets drowsy.
In some embodiments, response system 199 may determine the body state index based on detecting a single distance measurement between the driver's head and a headrest. Of course, it may also be the case that response system 199 analyzes the distance between the driver's head and the headrest over an interval of time and uses average distances to determine body state index.
In some other embodiments, response system 199 could detect the distance between the driver's head and any other reference location within the vehicle. For example, in some cases a proximity sensor could be located in a ceiling of the vehicle and response system 199 may detect the distance of the driver's head with respect to the location of the proximity sensor. In other cases, a proximity sensor could be located in any other part of the vehicle. Moreover, in other embodiments, any other portions of a driver could be monitored for determining if a driver is drowsy or otherwise alert. For example, in still another embodiment, a proximity sensor could be used in the backrest of a seat to measure the distance between the backrest and the back of the driver.
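For purposes of illustration only, the following Python sketch compares measured head-to-headrest distances against a predetermined baseline distance and raises the body state index as the deviation persists, in the manner of steps 204 and 206. The baseline, tolerance and sampling values are arbitrary assumptions.

```python
# Sketch of steps 202-206: body state index from head-to-headrest proximity.
# Baseline, tolerance, and sampling values are illustrative assumptions.

def body_state_from_proximity(distances_cm: list[float],
                              baseline_cm: float = 5.0,
                              tolerance_cm: float = 3.0,
                              sample_period_s: float = 0.5) -> int:
    """Return an index 1-4 from a series of proximity samples."""
    # Count consecutive recent samples outside the tolerance band.
    away = 0
    for d in reversed(distances_cm):
        if abs(d - baseline_cm) > tolerance_cm:
            away += 1
        else:
            break
    seconds_away = away * sample_period_s
    if seconds_away < 1.0:
        return 1
    if seconds_away < 3.0:
        return 2
    if seconds_away < 6.0:
        return 3
    return 4


if __name__ == "__main__":
    print(body_state_from_proximity([5, 6, 5, 4]))           # -> 1
    print(body_state_from_proximity([5, 12, 13, 14, 15]))    # -> 2
```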
A response system can include provisions for detecting abnormal steering by a driver for purposes of determining if a driver is drowsy.
In step 1002, response system 199 may receive steering angle information. In some cases, the steering angle information may be received from EPS 160 or directly from a steering angle sensor. Next, in step 1004, response system 199 may analyze the steering angle information. In particular, response system 199 may look for patterns in the steering angle as a function of time that suggest inconsistent steering, which could indicate a drowsy driver. Any method of analyzing steering information to determine if the steering is inconsistent can be used. Moreover, in some embodiments, response system 199 may receive information from lane keep assist system 244 to determine if a driver is steering motor vehicle 100 outside of a current lane.
In step 1006, response system 199 may determine the body state index of the driver based on steering wheel movement. For example, if the steering wheel movement is inconsistent, response system 199 may assign a body state index of 2 or greater to indicate that the driver is drowsy.
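The notion of inconsistent steering can be approximated, for illustration only, by the variability of the steering angle over a short window of samples. The standard-deviation test, the six-degree threshold, and the function name below are assumptions; the text itself allows any method of analyzing the steering information.

```python
from statistics import pstdev

def body_state_index_from_steering(angles_deg, std_threshold_deg=6.0):
    """Flag inconsistent steering from a short window of steering angle samples.

    angles_deg: recent steering angle samples (degrees)
    std_threshold_deg: assumed variability above which steering is treated
                       as inconsistent
    """
    if len(angles_deg) < 2:
        return 1                              # not enough data; assume alert
    variability = pstdev(angles_deg)          # population standard deviation
    return 2 if variability > std_threshold_deg else 1
```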
A response system can include provisions for detecting abnormal driving behavior by monitoring lane departure information.
In step 1020, response system 199 may receive lane departure information. In some cases, the lane departure information may be received from LDW system 240 or directly from some kind of sensor (such as a steering angle sensor, or a relative position sensor). Next, in step 1022, response system 199 may analyze the lane departure information. Any method of analyzing lane departure information can be used.
In step 1024, response system 199 may determine the body state index of the driver based on lane departure information. For example, if the vehicle is drifting out of the current lane, response system 199 may assign a body state index of 2 or greater to indicate that the driver is drowsy. Likewise, if the lane departure information is a lane departure warning from LDW system 240, response system 199 may assign a body state index of 2 or greater to indicate that the driver is drowsy. Using this process, response system 199 can use information from one or more vehicle systems 172 to help determine if a driver is drowsy. This is possible since drowsiness (or other types of inattentiveness) not only manifests as driver behaviors, but can also cause changes in the operation of the vehicle, which may be monitored by the various vehicle systems 172.
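As a minimal sketch of how lane departure information could feed the body state index, the fragment below returns an index of 2 when the vehicle is drifting or a lane departure warning is active, and 1 otherwise; the combination logic and the argument names are assumptions.

```python
def body_state_index_from_lane_info(drifting_out_of_lane, ldw_warning_active):
    """Return a body state index of 2 or greater when lane departure
    information suggests drowsiness, otherwise 1."""
    if drifting_out_of_lane or ldw_warning_active:
        return 2
    return 1
```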
In step 1202, response system 199 may receive information related to the autonomic nervous system of the driver. In some cases, the information can be received from a sensor. The sensor could be associated with any portion of motor vehicle 100 including a seat, armrest or any other portion. Moreover, the sensor could be a portable sensor in some cases.
In step 1204, response system 199 may analyze the autonomic information. Generally, any method of analyzing autonomic information to determine if a driver is drowsy could be used. It will be understood that the method of analyzing the autonomic information may vary according to the type of autonomic information being analyzed. In step 1206, response system 199 may determine the body state index of the driver based on the analysis conducted during step 1204.
It will be understood that the methods discussed above for determining the body state index of a driver according to eye movement, head movement, steering wheel movement and/or sensing autonomic information are only intended to be exemplary and in other embodiments any other method of detecting the behavior of a driver, including behaviors associated with drowsiness, could be used. Moreover, it will be understood that in some embodiments multiple methods for detecting driver behavior to determine a body state index could be used simultaneously.
A response system can include provisions for controlling one or more vehicle systems to help wake a drowsy driver. For example, a response system could control various systems to stimulate a driver in some way (visually, audibly, or through movement, for example). A response system could also change ambient conditions in a motor vehicle to help wake the driver and thereby increase the driver's alertness.
In step 1502, response system 199 may receive drowsiness information. In some cases, the drowsiness information includes whether a driver is in a normal state or a drowsy state. Moreover, in some cases, the drowsiness information could include a value indicating the level of drowsiness, for example on a scale of 1 to 10, with 1 being the least drowsy and 10 being the drowsiest.
In step 1504, response system 199 determines if the driver is drowsy based on the drowsiness information. If the driver is not drowsy, response system 199 returns back to step 1502. If the driver is drowsy, response system 199 proceeds to step 1506. In step 1506, steering wheel information may be received. In some cases, the steering wheel information can be received from EPS system 160. In other cases, the steering wheel information can be received from a steering angle sensor or a steering torque sensor directly.
In step 1508, response system 199 may determine if the driver is turning the steering wheel. If not, response system 199 returns to step 1502. If the driver is turning the steering wheel, response system 199 proceeds to step 1510 where the power steering assistance is decreased. It will be understood that in some embodiments, response system 199 may not check to see if the wheel is being turned before decreasing power steering assistance.
In step 1524, response system 199 may determine the body state index of a driver using any of the methods discussed above for determining a body state index. Next, in step 1526, response system 199 may set a power steering status corresponding to the amount of steering assistance provided by the electronic power steering system. For example, in some cases, the power steering status is associated with two states, including a “low” state and a “standard” state. In the “standard” state, power steering assistance is applied at a predetermined level corresponding to an amount of power steering assistance that improves drivability and helps increase the driving comfort of the user. In the “low” state, less steering assistance is provided, which requires increased steering effort by a driver. As indicated by look-up table 1540, the power steering status may be selected according to the body state index. For example, if the body state index is 1 or 2 (corresponding to no drowsiness or slight drowsiness), the power steering status is set to the standard state. If, however, the body state index is 3 or 4 (corresponding to a drowsy condition of the driver), the power steering status is set to the low state. It will be understood that look-up table 1540 is only intended to be exemplary and in other embodiments the relationship between body state index and power steering status can vary in any manner.
Once the power steering status is set in step 1526, response system 199 proceeds to step 1528. In step 1528, response system 199 determines if the power steering status is set to low. If not, response system 199 may return to step 1520 and continue operating power steering assistance at the current level. However, if response system 199 determines that the power steering status is set to low, response system 199 may proceed to step 1530. In step 1530, response system 199 may ramp down power steering assistance. For example, if the power steering assistance is supplying a predetermined amount of torque assistance, the power steering assistance may be varied to reduce the assisting torque. This requires the driver to increase steering effort. For a drowsy driver, the increased effort required to turn the steering wheel may help increase his or her alertness and improve vehicle handling.
In some cases, during step 1532, response system 199 may provide a warning to the driver of the decreased power steering assistance. For example, in some cases, a dashboard light reading "power steering off" or "power steering decreased" could be turned on. In other cases, a navigation screen or other display screen associated with the vehicle could display a message indicating the decreased power steering assistance. In still other cases, an audible or haptic indicator could be used to alert the driver. This helps to inform the driver of the change in power steering assistance so the driver does not become concerned about a power steering failure.
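The relationship between look-up table 1540 and the ramp-down of steering assistance might be sketched as follows. The table values for indexes 1 through 4 follow the example above; the torque floor, step size, and function name are illustrative assumptions.

```python
# Assumed reconstruction of look-up table 1540: indexes 1-2 map to the
# "standard" state, indexes 3-4 map to the "low" state.
POWER_STEERING_STATUS = {1: "standard", 2: "standard", 3: "low", 4: "low"}

def ramp_power_steering(body_state_index, assist_torque_nm,
                        low_assist_torque_nm=1.0, step_nm=0.2):
    """Return the power steering status and the next assist torque value.

    When the status is "low", the assist torque is reduced gradually toward
    a floor value; the torque numbers and step size are illustrative only.
    """
    status = POWER_STEERING_STATUS.get(body_state_index, "standard")
    if status == "low" and assist_torque_nm > low_assist_torque_nm:
        assist_torque_nm = max(low_assist_torque_nm, assist_torque_nm - step_nm)
    return status, assist_torque_nm
```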
In step 1802, response system 199 may receive drowsiness information. In step 1804, response system 199 determines if the driver is drowsy. If the driver is not drowsy, response system 199 proceeds back to step 1802. If the driver is drowsy, response system 199 proceeds to step 1806. In step 1806, response system 199 automatically adjusts the cabin temperature. In some cases, response system 199 may lower the cabin temperature by engaging a fan or air-conditioner. However, in some other cases, response system 199 could increase the cabin temperature using a fan or heater. Moreover, it will be understood that the embodiments are not limited to changing temperature and in other embodiments other aspects of the in-cabin climate could be changed, including airflow, humidity, pressure or other ambient conditions. For example, in some cases, a response system could automatically increase the airflow into the cabin, which may stimulate the driver and help reduce drowsiness.
In step 2102, response system 199 may receive drowsiness information. In step 2104, response system 199 determines if the driver is drowsy. If the driver is not drowsy, response system 199 returns to step 2102. Otherwise, response system 199 proceeds to step 2106. In step 2106, response system 199 may provide tactile stimuli to the driver. For example, response system 199 could control a seat or other portion of motor vehicle 100 to shake and/or vibrate (for example, a steering wheel). In other cases, response system 199 could vary the rigidity of a seat or other surface in motor vehicle 100.
In step 2108, response system 199 may turn on one or more lights or indicators. The lights could be any lights associated with motor vehicle 100 including dashboard lights, roof lights or any other lights. In some cases, response system 199 may provide a brightly lit message or background on a display screen, such as a navigation system display screen or climate control display screen. In step 2110, response system 199 may generate various sounds using speakers in motor vehicle 100. The sounds could be spoken words, music, alarms or any other kinds of sounds. Moreover, the volume level of the sounds could be chosen to ensure the driver is put in an alert state by the sounds, but not so loud as to cause great discomfort to the driver.
A response system can include provisions for controlling a seatbelt system to help wake a driver. In some cases, a response system can control an electronic pretensioning system for a seatbelt to provide a warning pulse to a driver.
A motor vehicle can include provisions for adjusting various brake control systems according to the behavior of a driver. For example, a response system can modify the control of antilock brakes, brake assist, brake prefill as well as other braking systems when a driver is drowsy. This arrangement helps to increase the effectiveness of the braking system in hazardous driving situations that may result when a driver is drowsy.
In step 2702, response system 199 may receive drowsiness information. In step 2704, response system 199 may determine if the driver is drowsy. If the driver is not drowsy, response system 199 returns to step 2702. If the driver is drowsy, response system 199 may proceed to step 2706. In step 2706, response system 199 may determine the current stopping distance. The current stopping distance may be a function of the current vehicle speed, as well as other operating parameters including various parameters associated with the brake system. In step 2708, response system 199 may automatically decrease the stopping distance. This may be achieved by modifying one or more operating parameters of ABS system 224. For example, the brake line pressure can be modified by controlling various valves, pumps and/or motors within ABS system 224.
In some embodiments, a response system can automatically prefill one or more brake lines in a motor vehicle in response to driver behavior.
In step 2802, response system 199 may receive drowsiness information. In step 2804, response system 199 may determine if the driver is drowsy. If the driver is not drowsy, response system 199 may return to step 2802. If the driver is drowsy, response system 199 may automatically prefill the brake lines with brake fluid in step 2806. For example, response system 199 may use automatic brake prefill system 228. In some cases, this may help increase braking response if a hazardous condition arises while the driver is drowsy. It will be understood that any number of brake lines could be prefilled during step 2806. Moreover, any provisions known in the art for prefilling brake lines could be used including any pumps, valves, motors or other devices needed to supply brake fluid automatically to brake lines.
Some vehicles may be equipped with brake assist systems that help reduce the amount of force a driver must apply to engage the brakes. These systems may be activated for older drivers or any other drivers who may need assistance with braking. In some cases, a response system could utilize the brake assist systems when a driver is drowsy, since a drowsy driver may not be able to apply the necessary force to the brake pedal for stopping a vehicle quickly.
In some embodiments, a response system could modify the degree of assistance in a brake assist system. For example, a brake assist system may operate under normal conditions with a predetermined activation threshold. The activation threshold may be associated with the rate of change of the master cylinder brake pressure. If the rate of change of the master cylinder brake pressure exceeds the activation threshold, brake assist may be activated. However, when a driver is drowsy, the brake assist system may modify the activation threshold so that brake assist is activated sooner. In some cases, the activation threshold could vary according to the degree of drowsiness. For example, if the driver is only slightly drowsy, the activation threshold may be higher than when the driver is extremely drowsy.
In step 2938, response system 199 determines if the rate of brake pressure increase exceeds the activation threshold. If not, response system 199 proceeds back to step 2930. Otherwise, response system 199 proceeds to step 2940. In step 2940, response system 199 activates a modulator pump and/or valves to automatically increase the brake pressure. In other words, in step 2940, response system 199 activates brake assist. This allows for an increase in the amount of braking force applied at the wheels.
In order to accommodate changes in brake assist due to drowsiness, the initial threshold setting may be modified according to the state of the driver. In step 2954, response system 199 determines the body state index of the driver using any method discussed above. Next, in step 2956, response system 199 determines a brake assist coefficient. As seen in look-up table 2960, the brake assist coefficient may vary between 0% and 25% according to the body state index. Moreover, the brake assist coefficient generally increases as the body state index increases. In step 2958, the activation threshold is selected according to the initial threshold setting and the brake assist coefficient. If the brake assist coefficient has a value of 0%, the activation threshold is just equal to the initial threshold setting. However, if the brake assist coefficient has a value of 25%, the activation threshold may be modified by up to 25% in order to increase the sensitivity of the brake assist when the driver is drowsy. In some cases, the activation threshold may be increased by up to 25% (or any other amount corresponding to the brake assist coefficient). In other cases, the activation threshold may be decreased by up to 25% (or any other amount corresponding to the brake assist coefficient).
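A possible reading of the brake assist adjustment is sketched below. Only the 0% and 25% endpoints of look-up table 2960 and its increasing trend come from the text; the intermediate coefficients are assumed, and because the text allows the threshold to be either raised or lowered, the direction is exposed as a parameter (the default of lowering the threshold follows the earlier statement that brake assist is activated sooner for a drowsy driver).

```python
# Assumed shape of look-up table 2960; only the endpoints and the increasing
# trend are taken from the text.
BRAKE_ASSIST_COEFFICIENT = {1: 0.00, 2: 0.10, 3: 0.20, 4: 0.25}

def brake_assist_activation_threshold(initial_threshold, body_state_index,
                                      lower_when_drowsy=True):
    """Modify the activation threshold by the brake assist coefficient.

    The direction of the adjustment is an assumption exposed as a parameter.
    """
    coeff = BRAKE_ASSIST_COEFFICIENT.get(body_state_index, 0.0)
    factor = (1.0 - coeff) if lower_when_drowsy else (1.0 + coeff)
    return initial_threshold * factor
```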
A motor vehicle can include provisions for increasing vehicle stability when a driver is drowsy. In some cases, a response system can modify the operation of an electronic stability control system. For example, in some cases, a response system could ensure that a detected yaw rate and a steering yaw rate (the yaw rate estimated from steering information) are very close to one another. This can help enhance steering precision and reduce the likelihood of hazardous driving conditions while the driver is drowsy.
In step 3202, response system 199 may receive drowsiness information. In step 3204, response system 199 determines if the driver is drowsy. If the driver is not drowsy, response system 199 may return to step 3202. Otherwise, response system 199 receives yaw rate information in step 3206. The yaw rate information could be received from a yaw rate sensor in some cases. In step 3208, response system 199 receives steering information. This could include, for example, the steering wheel angle received from a steering angle sensor. In step 3210, response system 199 determines the steering yaw rate using the steering information. In some cases, additional operating information could be used to determine the steering yaw rate. In step 3212, response system 199 may reduce the allowable error between the measured yaw rate and the steering yaw rate. In other words, response system 199 helps minimize the difference between the driver intended path and the actual vehicle path.
In order to reduce the allowable error between the yaw rate and the steering yaw rate, response system 199 may apply braking to one or more brakes of motor vehicle 100 in order to maintain motor vehicle 100 close to the driver intended path. Examples of maintaining a vehicle close to a driver intended path can be found in Ellis et al., U.S. Pat. No. 8,423,257, the entirety of which is hereby incorporated by reference.
In step 3244, response system 199 sets an activation threshold associated with the electronic stability control system. The activation threshold may be associated with a predetermined stability error. In step 3246, response system 199 determines if the stability error exceeds the activation threshold. If not, response system 199 may return to step 3238. Otherwise, response system 199 may proceed to step 3248. In step 3248, response system 199 applies individual wheel brake control in order to increase vehicle stability. In some embodiments, response system 199 could also control the engine to apply engine braking or modify cylinder operation in order to help stabilize the vehicle.
In some cases, in step 3250, response system 199 may activate a warning indicator. The warning indicator could be any dashboard light or message displayed on a navigation screen or other video screen. The warning indicator helps to alert a driver that the electronic stability control system has been activated. In some cases, the warning could be an audible warning and/or a haptic warning.
In step 3266, response system 199 determines a stability control coefficient. As seen in look-up table 3270, the stability control coefficient may be determined from the body state index. In one example, the stability control coefficient ranges from 0% to 25%. Moreover, the stability control coefficient generally increases with the body state index. For example, if the body state index is 1, the stability control coefficient is 0%. If the body state index is 4, the stability control coefficient is 25%. It will be understood that these ranges for the stability control coefficient are only intended to be exemplary and in other cases the stability control coefficient could vary in any other manner as a function of the body state index.
In step 3268, response system 199 may set the activation threshold using the initial threshold setting and the stability control coefficient. For example, if the stability control coefficient has a value of 25%, the activation threshold may be up to 25% larger than the initial threshold setting. In other cases, the activation threshold may be up to 25% smaller than the initial threshold setting. In other words, the activation threshold may be increased or decreased from the initial threshold setting in proportion to the value of the stability control coefficient. This arrangement helps to increase the sensitivity of the electronic stability control system by modifying the activation threshold in proportion to the state of the driver.
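For illustration, the stability control adjustment could be expressed as below, where the stability error is taken to be the difference between the measured yaw rate and the steering yaw rate. The intermediate coefficient values and the choice to shrink the allowable error as the index rises are assumptions consistent with, but not dictated by, the text.

```python
# Assumed reconstruction of look-up table 3270: endpoints from the text,
# intermediate values illustrative.
STABILITY_CONTROL_COEFFICIENT = {1: 0.00, 2: 0.10, 3: 0.20, 4: 0.25}

def esc_should_intervene(measured_yaw_rate, steering_yaw_rate,
                         initial_threshold, body_state_index):
    """Decide whether individual wheel braking should be applied."""
    coeff = STABILITY_CONTROL_COEFFICIENT.get(body_state_index, 0.0)
    threshold = initial_threshold * (1.0 - coeff)   # smaller allowable error when drowsy
    stability_error = abs(measured_yaw_rate - steering_yaw_rate)
    return stability_error > threshold
```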
In step 3402, response system 199 may receive drowsiness information. In step 3404, response system 199 may determine if the driver is drowsy. If the driver is not drowsy, response system 199 may proceed back to step 3402. Otherwise, response system 199 may proceed to step 3406. In step 3406, response system 199 may modify the operation of a collision warning system so that the driver is warned earlier about potential collisions. For example, if the collision warning system was initially set to warn a driver about a potential collision if the distance to the collision point is less than 25 meters, response system 199 could modify the system to warn the driver if the distance to the collision point is less than 50 meters.
In step 3502, collision warning system 234 may retrieve the heading, position and speed of an approaching vehicle. In some cases, this information could be received from the approaching vehicle through a wireless network, such as a DSRC network. In other cases, this information could be remotely sensed using radar, laser or other remote sensing devices.
In step 3504, collision warning system 234 may estimate a vehicle collision point. The vehicle collision point is the location of a potential collision between motor vehicle 100 and the approaching vehicle, which could be traveling in any direction relative to motor vehicle 100. In some cases, in step 3504, collision warning system 234 may use information about the position, heading and speed of motor vehicle 100 to calculate the vehicle collision point. In some embodiments, this information could be received from a GPS receiver that is in communication with collision warning system 234 or response system 199. In other embodiments, the vehicle speed could be received from a vehicle speed sensor.
In step 3506, collision warning system 234 may calculate the distance and/or time to the vehicle collision point. In particular, to determine the distance, collision warning system 234 may calculate the difference between the vehicle collision point and the current location of motor vehicle 100. Likewise, to determine the time, collision warning system 234 could calculate the amount of time it will take to reach the vehicle collision point.
In step 3508, collision warning system 234 may receive drowsiness information from response system 199, or any other system or components. In step 3509, collision warning system 234 may determine if the driver is drowsy. If the driver is not drowsy, collision warning system 234 may proceed to step 3510, where a first threshold parameter is retrieved. If the driver is drowsy, collision warning system 234 may proceed to step 3512, where a second threshold parameter is retrieved. The first threshold parameter and the second threshold parameter could be either time thresholds or distance thresholds, according to whether the time to collision or distance to collision was determined during step 3506. In some cases, where both time and distance to the collision point are used, the first threshold parameter and the second threshold parameter can each comprise both a distance threshold and a time threshold. Moreover, it will be understood that the first threshold parameter and the second threshold parameter may be substantially different thresholds in order to provide a different operating configuration for collision warning system 234 according to whether the driver is drowsy or not drowsy. Following both step 3510 and 3512, collision warning system 234 proceeds to step 3514. In step 3514, collision warning system 234 determines if the current distance and/or time to the collision point is less than the threshold parameter selected during the previous step (either the first threshold parameter or the second threshold parameter).
The first threshold parameter and the second threshold parameter could have any values. In some cases, the first threshold parameter may be less than the second threshold parameter. In particular, if the driver is drowsy, it may be beneficial to use a higher threshold parameter, since this corresponds to warning a driver earlier about a potential collision. If the current distance or time is less than the threshold distance or time (the threshold parameter), collision warning system 234 may warn the driver in step 3516. Otherwise, collision warning system 234 may not warn the driver in step 3518.
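A minimal sketch of the threshold selection in steps 3508 through 3518, assuming a straight-line distance to the collision point and reusing the 25 m / 50 m example given earlier, is shown below; the function and parameter names are introduced here for illustration only.

```python
import math

def collision_warning_needed(host_position, collision_point, driver_is_drowsy,
                             normal_threshold_m=25.0, drowsy_threshold_m=50.0):
    """Warn earlier when the driver is drowsy by using a larger distance threshold.

    host_position and collision_point are (x, y) coordinates; the straight-line
    distance calculation is an assumption.
    """
    distance_m = math.hypot(collision_point[0] - host_position[0],
                            collision_point[1] - host_position[1])
    threshold_m = drowsy_threshold_m if driver_is_drowsy else normal_threshold_m
    return distance_m < threshold_m
```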
A response system can include provisions for modifying the operation of an auto cruise control system according to driver behavior. In some embodiments, a response system can change the headway distance associated with an auto cruise control system. In some cases, the headway distance is the closest distance a motor vehicle can get to a preceding vehicle. If the auto cruise control system detects that the motor vehicle is closer than the headway distance, the system may warn the driver and/or automatically slow the vehicle to increase the headway distance.
In step 3802, response system 199 may receive drowsiness information. In step 3804, response system 199 may determine if the driver is drowsy. If the driver is not drowsy, response system 199 may return to step 3802. If the driver is drowsy, response system 199 may proceed to step 3806. In step 3806, response system 199 may determine if auto cruise control is being used. If not, response system 199 may return back to step 3802. If auto cruise control is being used, response system 199 may proceed to step 3808. In step 3808, response system 199 may retrieve the current headway distance for auto cruise control. In step 3810, response system 199 may increase the headway distance. With this arrangement, response system 199 may help increase the distance between motor vehicle 100 and other vehicles when a driver is drowsy to reduce the chances of a hazardous driving situation while the driver is drowsy.
In step 3934, response system 199 determines if the auto cruise control status is on. If so, response system 199 proceeds to step 3942. Otherwise, if the auto cruise control status is off, response system 199 proceeds to step 3936. In step 3936, response system 199 ramps down control of automatic cruise control. For example, in some cases response system 199 may slow down the vehicle gradually to a predetermined speed. In step 3938, response system 199 may turn off automatic cruise control. In some cases, in step 3940, response system 199 may inform the driver that automatic cruise control has been deactivated using a dashboard warning light or message displayed on a screen of some kind. In other cases, response system 199 could provide an audible warning that automatic cruise control has been deactivated. In still other cases a haptic warning could be used.
If the auto cruise control status is determined to be on during step 3934, response system 199 may set the auto cruise control distance setting in step 3942. For example, look-up table 3946 provides one possible configuration for a look-up table relating the body state index to a distance setting. In this case, a body state index of 1 corresponds to a first distance, a body state index of 2 corresponds to a second distance and a body state index of 3 corresponds to a third distance. Each distance may have a substantially different value. In some cases, the value of each headway distance may increase as the body state index increases in order to provide more headway room for drivers who are drowsy or otherwise inattentive. In step 3944, response system 199 may operate auto cruise control using the distance setting determined during step 3942.
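Look-up table 3946 might be approximated as follows; only the trend (a longer headway distance for a higher body state index) comes from the text, and the specific distances are assumptions.

```python
# Stand-in for look-up table 3946; the distances themselves are assumed.
ACC_HEADWAY_DISTANCE_M = {1: 30.0, 2: 45.0, 3: 60.0}

def acc_distance_setting(body_state_index):
    """Return the auto cruise control headway distance for the given index."""
    return ACC_HEADWAY_DISTANCE_M.get(body_state_index,
                                      max(ACC_HEADWAY_DISTANCE_M.values()))
```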
A response system can include provisions for automatically reducing a cruising speed in a cruise control system based on driver monitoring information.
In step 3902, response system 199 may receive drowsiness information. In step 3904, response system 199 may determine if the driver is drowsy. If the driver is not drowsy, response system 199 returns to step 3902, otherwise response system 199 proceeds to step 3906. In step 3906, response system 199 determines if cruise control is operating. If not, response system 199 returns back to step 3902. If cruise control is operating, response system 199 determines the current cruising speed in step 3908. In step 3910, response system 199 retrieves a predetermined percentage. The predetermined percentage could have any value between 0% and 100%. In step 3912, response system 199 may reduce the cruising speed by the predetermined percentage. For example, if motor vehicle 100 is cruising at 60 mph and the predetermined percentage is 50%, the cruising speed may be reduced to 30 mph. In other embodiments, the cruising speed could be reduced by a predetermined amount, such as by 20 mph or 30 mph. In still other embodiments, the predetermined percentage could be selected from a range of percentages according to the driver body index. For example, if the driver is only slightly drowsy, the predetermined percentage could be smaller than the percentage used when the driver is very drowsy. Using this arrangement, response system 199 may automatically reduce the speed of motor vehicle 100, since slowing the vehicle may reduce the potential risks posed by a drowsy driver.
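The speed reduction of steps 3908 through 3912 can be sketched as below. The per-index percentages are assumptions; only the worked example (60 mph reduced by 50% to 30 mph) is fixed by the text.

```python
# Assumed per-index reduction percentages.
CRUISE_REDUCTION_PCT = {1: 0.0, 2: 0.10, 3: 0.30, 4: 0.50}

def reduced_cruise_speed(current_speed_mph, body_state_index):
    """Reduce the cruising speed in proportion to the body state index."""
    pct = CRUISE_REDUCTION_PCT.get(body_state_index, 0.0)
    return current_speed_mph * (1.0 - pct)

# reduced_cruise_speed(60, 4) returns 30.0, matching the example above
```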
In step 3831, response system 199 may determine the body state index of the driver. Next, in step 3832, response system 199 may set the low speed follow status based on the body state index of the driver. For example, look-up table 3850 shows an exemplary relationship between body state index and the low speed follow status. In particular, the low speed follow status varies between an “on” state and an “off” state. For low body state index (body state indexes of 1 or 2) the low speed follow status may be set to “on”. For high body state index (body state indexes of 3 or 4) the low speed follow status may be set to “off”. It will be understood that the relationship between body state index and low speed follow status shown here is only exemplary and in other embodiments the relationship could vary in any other manner.
In step 3834, response system 199 determines if the low speed follow status is on or off. If the low speed follow status is on, response system 199 returns to step 3830. Otherwise, when the low speed follow status is off, response system 199 proceeds to step 3836. In step 3836, response system 199 may ramp down control of the low speed follow function. For example, the low speed follow system may gradually increase the headway distance with the preceding vehicle until the system is shut down in step 3838. By automatically turning off low speed follow when a driver is drowsy, response system 199 may help increase driver attention and awareness since the driver must put more effort into driving the vehicle.
In some cases, in step 3840, response system 199 may inform the driver that low speed follow has been deactivated using a dashboard warning light or message displayed on a screen of some kind. In other cases, response system 199 could provide an audible warning that low speed follow has been deactivated.
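The low speed follow behavior of steps 3832 through 3838 might be sketched as follows, with look-up table 3850 reconstructed from the example above; the ramp step and shut-off headway are illustrative assumptions.

```python
# Reconstruction of look-up table 3850: indexes 1-2 -> "on", 3-4 -> "off".
LOW_SPEED_FOLLOW_STATUS = {1: "on", 2: "on", 3: "off", 4: "off"}

def low_speed_follow_step(body_state_index, headway_m,
                          ramp_step_m=2.0, shutoff_headway_m=60.0):
    """Return the next headway distance and whether low speed follow stays on.

    When the status is "off", the headway is widened gradually until the
    function is shut down; the step size and shut-off distance are assumed.
    """
    if LOW_SPEED_FOLLOW_STATUS.get(body_state_index, "on") == "on":
        return headway_m, True
    headway_m += ramp_step_m
    return headway_m, headway_m < shutoff_headway_m
```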
A response system can include provisions for modifying the operation of a lane departure warning system, which helps alert a driver if the motor vehicle is unintentionally leaving the current lane. In some cases, a response system could modify when the lane departure warning system alerts a driver. For example, the lane departure warning system could warn the driver before the vehicle crosses a lane boundary line, rather than waiting until the vehicle has already crossed the lane boundary line.
In step 4202, response system 199 may retrieve drowsiness information. In step 4204, response system 199 may determine if the driver is drowsy. If the driver is not drowsy, response system 199 proceeds back to step 4202. Otherwise, response system 199 proceeds to step 4206. In step 4206, response system 199 may modify the operation of lane departure warning system 240 so that the driver is warned earlier about potential lane departures.
In step 4276, response system 199 may set the road crossing threshold. The road crossing threshold may be a time associated with the time to lane crossing. In step 4278, response system 199 determines if the time to lane crossing exceeds the road crossing threshold. If not, response system 199 proceeds back to step 4270. Otherwise, response system 199 proceeds to step 4280, where a warning indicator is illuminated indicating that the vehicle is crossing a lane. In other cases, audible or haptic warnings could also be provided. If the vehicle continues exiting the lane, a steering effort correction may be applied in step 4282.
In step 4294, response system 199 determines an initial threshold setting from the minimum reaction time and the vehicle operating information. In step 4296, response system 199 determines the body state index of the driver. In step 4298, response system 199 determines a lane departure warning coefficient according to the body state index. An exemplary look-up table 4285 includes a range of coefficient values between 0% and 25% as a function of the body state index. Finally, in step 4299, response system 199 may set the road crossing threshold according to the lane departure warning coefficient and the initial threshold setting.
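One possible form of step 4299 is sketched below. Only the 0% to 25% range of look-up table 4285 comes from the text; the intermediate coefficients and the direction of the adjustment are assumptions.

```python
# Assumed shape of look-up table 4285.
LANE_DEPARTURE_COEFFICIENT = {1: 0.00, 2: 0.10, 3: 0.20, 4: 0.25}

def road_crossing_threshold(initial_threshold_s, body_state_index,
                            raise_when_drowsy=True):
    """Set the road crossing threshold from the initial setting and the
    lane departure warning coefficient."""
    coeff = LANE_DEPARTURE_COEFFICIENT.get(body_state_index, 0.0)
    factor = (1.0 + coeff) if raise_when_drowsy else (1.0 - coeff)
    return initial_threshold_s * factor
```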
In addition to providing earlier warnings to a driver through a lane departure warning system, response system 199 can also modify the operation of a lane keep assist system, which may also provide warnings as well as driving assistance in order to maintain a vehicle in a predetermined lane.
In step 4236, response system 199 may determine the deviation of the vehicle path from the road center. In step 4238, response system 199 may learn the driver's centering habits. For example, alert drivers generally adjust the steering wheel constantly in an attempt to maintain the car in the center of a lane. In some cases, the centering habits of a driver can be detected by response system 199 and learned. Any machine learning method or pattern recognition algorithm could be used to determine the driver's centering habits.
In step 4240, response system 199 may determine if the vehicle is deviating from the center of the road. If not, response system 199 proceeds back to step 4230. If the vehicle is deviating, response system 199 proceeds to step 4242. In step 4242, response system 199 may determine the body state index of the driver. Next, in step 4244, response system 199 may set the lane keep assist status using the body state index. For example, look-up table 4260 is an example of a relationship between body state index and lane keep assist status. In particular, the lane keep assist status is set to a standard state for low body state index (indexes 1 or 2) and is set to a low state for a higher body state index (indexes 3 or 4). In other embodiments, any other relationship between body state index and lane keep assist status can be used.
In step 4246, response system 199 may check the lane keep assist status. If the lane keep assist status is standard, response system 199 proceeds to step 4248 where standard steering effort corrections are applied to help maintain the vehicle in the lane. If, however, response system 199 determines that the lane keep assist status is low in step 4246, response system 199 may proceed to step 4250. In step 4250, response system 199 determines if the road is curved. If not, response system 199 proceeds to step 4256 to illuminate a lane keep assist warning so the driver knows the vehicle is deviating from the lane. If, in step 4250, response system 199 determines the road is curved, response system 199 proceeds to step 4252. In step 4252, response system 199 determines if the driver's hands are on the steering wheel. If so, response system 199 proceeds to step 4254 where the process ends. Otherwise, response system 199 proceeds to step 4256.
This arrangement allows response system 199 to modify the operation of the lane keep assist system in response to driver behavior. In particular, the lane keep assist system may only help steer the vehicle automatically when the driver state is alert (low body state index). Otherwise, if the driver is drowsy or very drowsy (higher body state index), response system 199 may control the lane keep assist system to only provide warnings of lane deviation without providing steering assistance. This may help increase the alertness of the driver when he or she is drowsy.
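The decision flow of steps 4244 through 4256 can be summarized in the following sketch, with look-up table 4260 reconstructed from the example above; the function name and the returned action labels are placeholders introduced here.

```python
# Reconstruction of look-up table 4260: indexes 1-2 -> "standard", 3-4 -> "low".
LANE_KEEP_ASSIST_STATUS = {1: "standard", 2: "standard", 3: "low", 4: "low"}

def lane_keep_assist_action(body_state_index, road_is_curved, hands_on_wheel):
    """Choose between steering correction, warning only, or no action."""
    status = LANE_KEEP_ASSIST_STATUS.get(body_state_index, "standard")
    if status == "standard":
        return "apply_steering_correction"          # step 4248
    if not road_is_curved:
        return "illuminate_warning"                 # step 4256
    return "end" if hands_on_wheel else "illuminate_warning"  # steps 4254 / 4256
```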
A response system can include provisions for modifying the control of a blind spot indicator system when a driver is drowsy. For example, in some cases, a response system could increase the detection area. In other cases, the response system could control the monitoring system to deliver warnings earlier (i.e., when an approaching vehicle is further away).
In step 4302, response system 199 may receive drowsiness information. In step 4304, response system 199 determines if the driver is drowsy. If the driver is not drowsy, response system 199 returns back to step 4302. If the driver is drowsy, response system 199 proceeds to step 4306. In step 4306, response system 199 may increase the blind spot detection area. For example, if the initial blind spot detection area is associated with the region of the vehicle between the passenger side mirror and a point about 3-5 meters behind the rear bumper, the modified blind spot detection area may be associated with the region of the vehicle between the passenger side mirror and a point about 4-7 meters behind the rear bumper. Following this, in step 4308, response system 199 may modify the operation of blind spot indicator system 242 so that the system warns a driver when a vehicle is further away. In other words, if the system initially warns a driver if the approaching vehicle is within 5 meters of motor vehicle 100, or the blind spot, the system may be modified to warn the driver when the approaching vehicle is within 10 meters of motor vehicle 100, or the blind spot of motor vehicle 100. Of course, it will be understood that in some cases, step 4306 or step 4308 may be optional steps. In addition, other sizes and locations of the blind spot zone are possible.
In step 4420, response system 199 may determine the location and/or bearing of a tracked object. In step 4422, response system 199 sets a zone threshold. The zone threshold may be a location threshold for determining when an object has entered into a blind spot monitoring zone. In some cases, the zone threshold may be determined using the body state index of the driver as well as information about the tracked object.
In step 4424, response system 199 determines if the tracked object crosses the zone threshold. If not, response system 199 proceeds to step 4418. Otherwise, response system 199 proceeds to step 4426. In step 4426, response system 199 determines if the relative speed of the object is in a predetermined range. If the relative speed of the object is in the predetermined range, it is likely to stay in the blind spot monitoring zone for a long time and may pose a very high threat. Response system 199 may ignore objects with a relative speed outside the predetermined range, since the object is not likely to stay in the blind spot monitoring zone for very long. If the relative speed is not in the predetermined range, response system 199 proceeds back to step 4418. Otherwise, response system 199 proceeds to step 4428.
In step 4428, response system 199 determines a warning type using the body state index. In step 4430, response system 199 sets the warning intensity and frequency using the body state index. Lookup table 4440 is an example of a relationship between body state index and a coefficient for warning intensity. Finally, in step 4432, response system 199 activates the blind spot indicator warning to alert the driver of the presence of the object in the blind spot.
Generally, the zone threshold can be determined using the initial threshold setting (determined in step 4452) and the blind spot zone coefficient. For example, if the blind spot zone coefficient has a value of 25%, the zone threshold may be up to 25% larger than the initial threshold setting. In other cases, the zone threshold may be up to 25% smaller than the initial threshold setting. In other words, the zone threshold may be increased or decreased from the initial threshold setting in proportion to the value of the blind spot zone coefficient. Moreover, as the value of the zone threshold changes, the size of the blind spot zone or blind spot detection area may change. For example, in some cases, as the value of the zone threshold increases, the length of the blind spot detection area is increased, resulting in a larger detection area and higher system sensitivity. Likewise, in some cases, as the value of the zone threshold decreases, the length of the blind spot detection area is decreased, resulting in a smaller detection area and lower system sensitivity.
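As an illustrative sketch of the zone threshold calculation, the fragment below scales an initial threshold by a blind spot zone coefficient; the coefficient table values and the default direction of the adjustment are assumptions.

```python
# Assumed shape of the blind spot zone coefficient table (0%-25%).
BLIND_SPOT_ZONE_COEFFICIENT = {1: 0.00, 2: 0.10, 3: 0.20, 4: 0.25}

def blind_spot_zone_threshold(initial_threshold_m, body_state_index,
                              enlarge_when_drowsy=True):
    """Scale the zone threshold by the blind spot zone coefficient; a larger
    threshold lengthens the detection area and raises system sensitivity."""
    coeff = BLIND_SPOT_ZONE_COEFFICIENT.get(body_state_index, 0.0)
    factor = (1.0 + coeff) if enlarge_when_drowsy else (1.0 - coeff)
    return initial_threshold_m * factor
```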
For purposes of illustration, the distance between vehicles is used as the threshold for determining if response system 199 should issue a warning and/or apply other types of intervention. However, it will be understood that in some cases, the time to collision between vehicles may be used as the threshold for determining what actions response system 199 may perform. In some cases, for example, information about the velocities of the host and target vehicles, as well as the relative distance between the vehicles, can be used to estimate a time to collision. Response system 199 may determine if warnings and/or other operations should be performed according to the estimated time to collision.
In step 4556, response system 199 may set a first time to collision threshold and a second time to collision threshold. In some cases, the first time to collision threshold may be greater than the second time to collision threshold. However, in other cases, the first time to collision threshold may be less than or equal to the second time to collision threshold. Details for determining the first time to collision threshold and the second time to collision threshold are discussed below.
In step 4558, response system 199 may determine if the time to collision is less than the first time to collision threshold. If not, response system 199 returns to step 4550. In some cases, the first time to collision threshold may be a value above which there is no immediate threat of a collision. If the time to collision is less than the first time to collision threshold, response system 199 proceeds to step 4560.
At step 4560, response system 199 may determine if the time to collision is less than the second time to collision threshold. If not, response system 199 enters a first warning stage at step 4562. Response system 199 may then proceed through further steps discussed below.
In step 4588, response system 199 may determine a time to collision coefficient. In some cases, the time to collision coefficient can be determined using look-up table 4592, which relates the time to collision coefficient to the body state index of the driver. In some cases, the time to collision coefficient increases from 0% to 25% as the body state index increases. In step 4590, response system 199 may set the first time to collision threshold and the second time to collision threshold. Although a single time to collision coefficient is used in this embodiment, the first time to collision threshold and the second time to collision threshold may differ according to the first initial threshold setting and the second initial threshold setting, respectively. Using this configuration, in some cases, the first time to collision threshold and the second time to collision threshold may be increased as the body state index of a driver increases. This allows response system 199 to provide earlier warnings of potential hazards when a driver is drowsy. Moreover, the timing of the warnings varies in proportion to the body state index.
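Step 4590 might be sketched as follows. The use of a single coefficient applied to two different initial settings follows the text; the intermediate coefficient values and the direction of the adjustment (widening the thresholds so that warnings come earlier) are assumptions.

```python
# Assumed shape of look-up table 4592 (0%-25% endpoints from the text).
TIME_TO_COLLISION_COEFFICIENT = {1: 0.00, 2: 0.10, 3: 0.20, 4: 0.25}

def time_to_collision_thresholds(first_initial_s, second_initial_s,
                                 body_state_index, widen_when_drowsy=True):
    """Derive both time to collision thresholds from a single coefficient."""
    coeff = TIME_TO_COLLISION_COEFFICIENT.get(body_state_index, 0.0)
    factor = (1.0 + coeff) if widen_when_drowsy else (1.0 - coeff)
    return first_initial_s * factor, second_initial_s * factor
```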
In step 4704, response system 199 may set the warning frequency and intensity. This may be determined using the body state index in some cases. In particular, as the body state index increases due to the increased drowsiness of the driver, the warning frequency and intensity can be increased. For example, in some cases look-up table 4570 can be used to determine the warning frequency and intensity. In particular, in some cases as the warning intensity coefficient increases (as a function of body state index), the intensity of any warning can be increased by up to 25%. In step 4706, response system 199 may apply a warning for forward collision awareness. In some cases, the intensity of the warning can be increased for situations where the warning intensity coefficient is large. For example, for a low warning intensity coefficient (0%) the warning intensity may be set to a predetermined level. For higher warning intensity coefficients (greater than 0%) the warning intensity may be increased beyond the predetermined level. In some cases, the luminosity of visual indicators can be increased. In other cases, the volume of audible warnings can be increased. In still other cases, the pattern of illuminating a visual indicator or making an audible warning could be varied.
In step 4602, response system 199 may receive drowsiness information. In step 4604, response system 199 may determine if the driver is drowsy. If the driver is not drowsy, response system 199 proceeds back to step 4602. Otherwise, response system 199 proceeds to step 4606. In step 4606, response system 199 may turn off the navigation system. This may help reduce driver distraction.
It will be understood that in some embodiments, multiple vehicle systems could be modified according to driver behavior substantially simultaneously. For example, in some cases when a driver is drowsy, a response system could modify the operation of a collision warning system and a lane keep assist system to alert a driver earlier of any potential collision threats or unintentional lane departures. Likewise, in some cases when a driver is drowsy, a response system could automatically modify the operation of an antilock brake system and a brake assist system to increase braking response. The number of vehicle systems that can be simultaneously activated in response to driver behavior is not limited.
It will be understood that the current embodiment illustrates and discusses provisions for sensing driver behavior and modifying the operation of one or more vehicle systems accordingly. However, these methods are not limited to use with a driver. In other embodiments, these same methods could be applied to any occupant of a vehicle. In other words, a response system may be configured to detect if various other occupants of a motor vehicle are drowsy. Moreover, in some cases, one or more vehicle systems could be modified accordingly.
While various embodiments have been described, the description is intended to be exemplary, rather than limiting and it will be apparent to those of ordinary skill in the art that many more embodiments and implementations are possible that are within the scope of the embodiments. Accordingly, the embodiments are not to be restricted except in light of the attached claims and their equivalents. Also, various modifications and changes may be made within the scope of the attached claims.
This application is a continuation of U.S. application Ser. No. 14/977,787 filed Dec. 22, 2015, published as US 2016/0152233, and now issued as U.S. Pat. No. 9,855,945, which is a continuation of U.S. application Ser. No. 13/843,249 filed on Mar. 15, 2013, published as US 2013/0245886, and now issued as U.S. Pat. No. 9,296,382, which is a continuation of U.S. application Ser. No. 13/030,637 filed on Feb. 18, 2011, now issued as U.S. Pat. No. 8,698,639, all of which are expressly incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
4671111 | Lemelson | Jun 1987 | A |
4891764 | McIntosh | Jan 1990 | A |
5057834 | Nordstrom | Oct 1991 | A |
5191524 | Pincus et al. | Mar 1993 | A |
5195606 | Martyniuk | Mar 1993 | A |
5465079 | Bouchard et al. | Nov 1995 | A |
5485892 | Fujita | Jan 1996 | A |
5546305 | Kondo | Aug 1996 | A |
5570087 | Emelson | Oct 1996 | A |
5609158 | Dhan | Mar 1997 | A |
5682901 | Kamen | Nov 1997 | A |
5719950 | Dsten et al. | Feb 1998 | A |
5783997 | Saitoh et al. | Jul 1998 | A |
5815070 | Yoshikawa | Sep 1998 | A |
5821860 | Yokoyama | Oct 1998 | A |
5913375 | Hishikawa | Jun 1999 | A |
5942979 | Uppino | Aug 1999 | A |
5960376 | Yamakado et al. | Sep 1999 | A |
6009377 | Hiwatashi | Dec 1999 | A |
6044696 | Spencer-Smith | Apr 2000 | A |
6061610 | Boer | May 2000 | A |
6067497 | Sekine | May 2000 | A |
6104296 | Yasushi et al. | Aug 2000 | A |
6172613 | Deline et al. | Jan 2001 | B1 |
6185487 | Kondo et al. | Feb 2001 | B1 |
6195008 | Bader | Feb 2001 | B1 |
6198996 | Berstis | Mar 2001 | B1 |
6256558 | Sugiura et al. | Jul 2001 | B1 |
6271745 | Anzai et al. | Aug 2001 | B1 |
6278362 | Yoshikawa et al. | Aug 2001 | B1 |
6337629 | Bader | Jan 2002 | B1 |
6361503 | Starobin et al. | Mar 2002 | B1 |
6393348 | Ziegler et al. | May 2002 | B1 |
6393361 | Yano et al. | May 2002 | B1 |
6435626 | Kostadina | Aug 2002 | B1 |
6438472 | Tano et al. | Aug 2002 | B1 |
6459365 | Tamura | Oct 2002 | B2 |
6485415 | Uchiyama et al. | Nov 2002 | B1 |
6485418 | Yasushi et al. | Nov 2002 | B2 |
6542081 | Torch | Apr 2003 | B2 |
6575902 | Burton | Jun 2003 | B1 |
6603999 | SerVaas | Aug 2003 | B2 |
6663572 | Starobin et al. | Dec 2003 | B2 |
6668221 | Harter, Jr. et al. | Dec 2003 | B2 |
6697723 | Olsen et al. | Feb 2004 | B2 |
6734799 | Munch | May 2004 | B2 |
6791462 | Choi | Sep 2004 | B2 |
6809643 | Elrod et al. | Oct 2004 | B1 |
6810309 | Sadler et al. | Oct 2004 | B2 |
6822573 | Basir et al. | Nov 2004 | B2 |
6876949 | Hilliard et al. | Apr 2005 | B2 |
6909947 | Douros et al. | Jun 2005 | B2 |
6950027 | Banas | Sep 2005 | B2 |
6974414 | Victor | Dec 2005 | B2 |
6993378 | Wiederhold et al. | Jan 2006 | B2 |
7032705 | Zheng et al. | Apr 2006 | B2 |
7046128 | Roberts | May 2006 | B2 |
7062313 | Nissila | Jun 2006 | B2 |
7092849 | Lafitte et al. | Aug 2006 | B2 |
7102495 | Mattes et al. | Sep 2006 | B2 |
7138938 | Prakah-Asante et al. | Nov 2006 | B1 |
7149653 | Bihler et al. | Dec 2006 | B2 |
7183930 | Basir et al. | Feb 2007 | B2 |
7183932 | Bauer | Feb 2007 | B2 |
7196629 | Ruoss et al. | Mar 2007 | B2 |
7219923 | Fujita et al. | May 2007 | B2 |
7225013 | Geva et al. | May 2007 | B2 |
7248997 | Nagai et al. | Jul 2007 | B2 |
7254439 | Misczynski et al. | Aug 2007 | B2 |
7266430 | Basson et al. | Sep 2007 | B2 |
7283056 | Bukman et al. | Oct 2007 | B2 |
7301465 | Tengshe et al. | Nov 2007 | B2 |
7304580 | Sullivan et al. | Dec 2007 | B2 |
7330570 | Sogo et al. | Feb 2008 | B2 |
7349792 | Durand | Mar 2008 | B2 |
7350608 | Fernandez | Apr 2008 | B2 |
7389178 | Raz et al. | Jun 2008 | B2 |
7401233 | Duri et al. | Jul 2008 | B2 |
7403804 | Ridder et al. | Jul 2008 | B2 |
7424357 | Ozaki et al. | Sep 2008 | B2 |
7424414 | Craft | Sep 2008 | B2 |
7465272 | Kriger | Dec 2008 | B2 |
7482938 | Suzuki | Jan 2009 | B2 |
7496457 | Fujita et al. | Feb 2009 | B2 |
7502152 | Lich et al. | Mar 2009 | B2 |
7507207 | Sakai et al. | Mar 2009 | B2 |
7511833 | Breed | Mar 2009 | B2 |
7517099 | Hannah | Apr 2009 | B2 |
7532964 | Fujita et al. | May 2009 | B2 |
7561054 | Raz et al. | Jul 2009 | B2 |
7576642 | Rodemer | Aug 2009 | B2 |
7618091 | Akaike et al. | Nov 2009 | B2 |
7620521 | Breed et al. | Nov 2009 | B2 |
7639148 | Victor | Dec 2009 | B2 |
7649445 | Kuramori et al. | Jan 2010 | B2 |
7650217 | Ueyama | Jan 2010 | B2 |
7663495 | Haque et al. | Feb 2010 | B2 |
7672764 | Yoshioka et al. | Mar 2010 | B2 |
7689271 | Sullivan | Mar 2010 | B1 |
7719431 | Bolourchi | May 2010 | B2 |
RE41376 | Torch | Jun 2010 | E |
7756558 | Ridder et al. | Jul 2010 | B2 |
7769499 | McQuade et al. | Aug 2010 | B2 |
7800592 | Kerr et al. | Sep 2010 | B2 |
7803111 | Kriger | Sep 2010 | B2 |
7805224 | Basson et al. | Sep 2010 | B2 |
7809954 | Miller et al. | Oct 2010 | B2 |
7864039 | Georgeson | Jan 2011 | B2 |
7866703 | Spahn et al. | Jan 2011 | B2 |
7946483 | Miller et al. | May 2011 | B2 |
7948361 | Bennett et al. | May 2011 | B2 |
7948387 | Ishida et al. | May 2011 | B2 |
7953477 | Tulppo et al. | May 2011 | B2 |
8019407 | Lian et al. | Sep 2011 | B2 |
8106783 | Wada et al. | Jan 2012 | B2 |
8140241 | Takeda et al. | Mar 2012 | B2 |
8157730 | LeBoeuf et al. | Apr 2012 | B2 |
8251447 | Fujita et al. | Aug 2012 | B2 |
8315757 | Yamamura et al. | Nov 2012 | B2 |
8328690 | Ohtsu | Dec 2012 | B2 |
8428821 | Nilsson | Apr 2013 | B2 |
8471909 | Ishikawa | Jun 2013 | B2 |
8497774 | Scalisi et al. | Jul 2013 | B2 |
8618952 | Mochizuki | Dec 2013 | B2 |
8698639 | Fung et al. | Apr 2014 | B2 |
8706204 | Seo | Apr 2014 | B2 |
8742936 | Galley | Jun 2014 | B2 |
8773239 | Phillips et al. | Jul 2014 | B2 |
8788148 | Ohta et al. | Jul 2014 | B2 |
8983732 | Lisseman et al. | Mar 2015 | B2 |
9149231 | Fujita | Oct 2015 | B2 |
9296382 | Fung et al. | Mar 2016 | B2 |
9315194 | Okuda | Apr 2016 | B2 |
9440646 | Fung et al. | Sep 2016 | B2 |
9751534 | Fung et al. | Sep 2017 | B2 |
10308258 | Fung et al. | Jun 2019 | B2 |
10759436 | Fung | Sep 2020 | B2 |
10875536 | Fung | Dec 2020 | B2 |
20010037171 | Sato | Nov 2001 | A1 |
20020005778 | Breed | Jan 2002 | A1 |
20020097145 | Tumey | Jul 2002 | A1 |
20020101354 | Banas | Aug 2002 | A1 |
20020105438 | Forbes | Aug 2002 | A1 |
20030062768 | Loudon et al. | Apr 2003 | A1 |
20030105578 | Takenaga | Jun 2003 | A1 |
20030149354 | Bakharev | Aug 2003 | A1 |
20030151516 | Basir | Aug 2003 | A1 |
20030171684 | Vasin et al. | Sep 2003 | A1 |
20040032957 | Mansy et al. | Feb 2004 | A1 |
20040044293 | Burton | Mar 2004 | A1 |
20040088095 | Eberle et al. | May 2004 | A1 |
20040133082 | Abraham-Fuchs et al. | Jul 2004 | A1 |
20040201481 | Yoshinori | Oct 2004 | A1 |
20040245036 | Fujita et al. | Dec 2004 | A1 |
20050022606 | Partin et al. | Feb 2005 | A1 |
20050030184 | Victor | Feb 2005 | A1 |
20050033189 | McCraty et al. | Feb 2005 | A1 |
20050065711 | Dahlgren | Mar 2005 | A1 |
20050080533 | Basir et al. | Apr 2005 | A1 |
20050148894 | Misczynski et al. | Jul 2005 | A1 |
20050155808 | Braeuchle et al. | Jul 2005 | A1 |
20050219058 | Katagiri | Oct 2005 | A1 |
20050246134 | Nagai et al. | Nov 2005 | A1 |
20050256414 | Kettunen et al. | Nov 2005 | A1 |
20060082437 | Yuhara | Apr 2006 | A1 |
20060122478 | Sliepen et al. | Jun 2006 | A1 |
20060142968 | Han | Jun 2006 | A1 |
20060161322 | Njoku | Jul 2006 | A1 |
20060212195 | Veith et al. | Sep 2006 | A1 |
20060283652 | Yanai et al. | Dec 2006 | A1 |
20060287605 | Lin et al. | Dec 2006 | A1 |
20070080816 | Haque | Apr 2007 | A1 |
20070159344 | Kisacanin | Jul 2007 | A1 |
20070168128 | Tokoro | Jul 2007 | A1 |
20070190970 | Watson | Aug 2007 | A1 |
20070222617 | Chai | Sep 2007 | A1 |
20070243854 | Taki et al. | Oct 2007 | A1 |
20070265540 | Fuwamoto et al. | Nov 2007 | A1 |
20070265777 | Munakata | Nov 2007 | A1 |
20070296601 | Sultan | Dec 2007 | A1 |
20070299910 | Fontenot et al. | Dec 2007 | A1 |
20080015422 | Wessel | Jan 2008 | A1 |
20080027337 | Dugan et al. | Jan 2008 | A1 |
20080027341 | Sackner et al. | Jan 2008 | A1 |
20080033518 | Rousso et al. | Feb 2008 | A1 |
20080040004 | Breed | Feb 2008 | A1 |
20080046150 | Breed | Feb 2008 | A1 |
20080071177 | Yanagidaira et al. | Mar 2008 | A1 |
20080167757 | Kanevsky et al. | Jul 2008 | A1 |
20080183388 | Goodrich | Jul 2008 | A1 |
20080185207 | Kondoh | Aug 2008 | A1 |
20080195261 | Breed | Aug 2008 | A1 |
20080228046 | Futatsuyama et al. | Sep 2008 | A1 |
20080258884 | Schmitz | Oct 2008 | A1 |
20080290644 | Spahn et al. | Nov 2008 | A1 |
20080294015 | Tsuji | Nov 2008 | A1 |
20080312796 | Matsuura et al. | Dec 2008 | A1 |
20080319602 | McClellan et al. | Dec 2008 | A1 |
20090040054 | Wang et al. | Feb 2009 | A1 |
20090054751 | Babashan et al. | Feb 2009 | A1 |
20090091435 | Bolourchi | Apr 2009 | A1 |
20090115589 | Galley | May 2009 | A1 |
20090132143 | Kamiya | May 2009 | A1 |
20090156988 | Ferren et al. | Jun 2009 | A1 |
20090209829 | Yanagidaira et al. | Aug 2009 | A1 |
20090234552 | Takeda et al. | Sep 2009 | A1 |
20090268022 | Omi | Oct 2009 | A1 |
20090284361 | Boddie et al. | Nov 2009 | A1 |
20090289780 | Tenorio-Fox | Nov 2009 | A1 |
20090313987 | Tu | Dec 2009 | A1 |
20090315767 | Scalisi et al. | Dec 2009 | A1 |
20090318776 | Toda | Dec 2009 | A1 |
20090318777 | Kameyama | Dec 2009 | A1 |
20090326399 | Batalloso et al. | Dec 2009 | A1 |
20100009808 | Ohtsu | Jan 2010 | A1 |
20100030434 | Okabe | Feb 2010 | A1 |
20100049068 | Fuwamoto | Feb 2010 | A1 |
20100094103 | Kaplan | Apr 2010 | A1 |
20100106365 | Visconti et al. | Apr 2010 | A1 |
20100113950 | Lin et al. | May 2010 | A1 |
20100148923 | Takizawa | Jun 2010 | A1 |
20100168527 | Zumo et al. | Jul 2010 | A1 |
20100185101 | Sakai et al. | Jul 2010 | A1 |
20100188233 | Kuntzel | Jul 2010 | A1 |
20100217099 | LeBoeuf | Aug 2010 | A1 |
20100222687 | Thijs et al. | Sep 2010 | A1 |
20100228419 | Lee | Sep 2010 | A1 |
20100234692 | Kuo et al. | Sep 2010 | A1 |
20100250044 | Alasry et al. | Sep 2010 | A1 |
20100253526 | Szczerba | Oct 2010 | A1 |
20100295707 | Bennie et al. | Nov 2010 | A1 |
20110028857 | Ibanez et al. | Feb 2011 | A1 |
20110034912 | De Graff et al. | Feb 2011 | A1 |
20110046498 | Klap et al. | Feb 2011 | A1 |
20110046970 | Fontenot | Feb 2011 | A1 |
20110066042 | Pandia | Mar 2011 | A1 |
20110105925 | Hatakeyama | May 2011 | A1 |
20110109462 | Deng | May 2011 | A1 |
20110112442 | Meger et al. | May 2011 | A1 |
20110137200 | Yin et al. | Jun 2011 | A1 |
20110144515 | Bayer | Jun 2011 | A1 |
20110152701 | Buxi et al. | Jun 2011 | A1 |
20110163863 | Chatmon | Jul 2011 | A1 |
20110169625 | James | Jul 2011 | A1 |
20110213511 | Visconti et al. | Sep 2011 | A1 |
20110246028 | Lisseman et al. | Oct 2011 | A1 |
20110284304 | Van Schoiack | Nov 2011 | A1 |
20120010514 | Vrazic | Jan 2012 | A1 |
20120022392 | Leuthardt et al. | Jan 2012 | A1 |
20120025993 | Akiyama | Feb 2012 | A1 |
20120054054 | Kameyama | Mar 2012 | A1 |
20120071775 | Osorio et al. | Mar 2012 | A1 |
20120083668 | Pradeep et al. | Apr 2012 | A1 |
20120097472 | Kubo et al. | Apr 2012 | A1 |
20120116198 | Veen et al. | May 2012 | A1 |
20120123806 | Schumann, Jr. et al. | May 2012 | A1 |
20120133515 | Palshof | May 2012 | A1 |
20120212353 | Fung et al. | Aug 2012 | A1 |
20120215403 | Tengler et al. | Aug 2012 | A1 |
20120259181 | Fujita et al. | Oct 2012 | A1 |
20120271513 | Yoneda et al. | Oct 2012 | A1 |
20120290215 | Adler et al. | Nov 2012 | A1 |
20130030256 | Fujita et al. | Jan 2013 | A1 |
20130038735 | Nishiguchi et al. | Feb 2013 | A1 |
20130046154 | Lin et al. | Feb 2013 | A1 |
20130124038 | Naboulsi | May 2013 | A1 |
20130144470 | Ricci | Jun 2013 | A1 |
20130158741 | Hahne | Jun 2013 | A1 |
20130172771 | Muhlsteff | Jul 2013 | A1 |
20130245886 | Fung et al. | Sep 2013 | A1 |
20140121927 | Hanita | May 2014 | A1 |
20140148988 | Lathrop et al. | May 2014 | A1 |
20140163374 | Ogasawara et al. | Jun 2014 | A1 |
20140188770 | Agrafioti et al. | Jul 2014 | A1 |
20140224040 | Van't Zelfde et al. | Aug 2014 | A1 |
20140275854 | Venkatraman | Sep 2014 | A1 |
20140309789 | Ricci | Oct 2014 | A1 |
20140309881 | Fung | Oct 2014 | A1 |
20140309893 | Ricci | Oct 2014 | A1 |
20150029014 | Bande Martinez | Jan 2015 | A1 |
20150367858 | Fung et al. | Dec 2015 | A1 |
20180057015 | Barke | Mar 2018 | A1 |
20180105180 | Fung | Apr 2018 | A1 |
20180357894 | Bjersing | Dec 2018 | A1 |
20190241190 | Fung et al. | Aug 2019 | A1 |
Number | Date | Country |
---|---|---|
1798521 | Jul 2006 | CN |
1802273 | Jul 2006 | CN |
10126224 | Dec 2002 | DE |
10248894 | May 2004 | DE |
69730298 | Jan 2005 | DE |
102004045677 | Jul 2005 | DE |
102004037298 | Mar 2006 | DE |
102005020847 | Nov 2006 | DE |
102006050017 | Apr 2008 | DE |
102008042342 | Apr 2010 | DE |
102009051260 | Jun 2010 | DE |
102010013243 | Sep 2011 | DE |
202012001096 | May 2012 | DE |
102012208644 | May 2013 | DE |
102012102459 | Sep 2013 | DE |
102013200777 | Jul 2014 | DE |
102013010928 | Dec 2014 | DE |
0549909 | Jul 1993 | EP |
1661511 | May 2006 | EP |
2426012 | Mar 2012 | EP |
2591969 | May 2013 | EP |
2675686 | Dec 2013 | EP |
2880166 | Jun 2006 | FR |
2465439 | May 2010 | GB |
58149101 | Sep 1983 | JP |
06107032 | Apr 1994 | JP |
06255520 | Sep 1994 | JP |
H710024 | Jan 1995 | JP |
9216567 | Aug 1997 | JP |
11105579 | Apr 1999 | JP |
11151231 | Jun 1999 | JP |
11328593 | Nov 1999 | JP |
200057479 | Feb 2000 | JP |
2000211543 | Aug 2000 | JP |
2000261880 | Sep 2000 | JP |
2000515829 | Nov 2000 | JP |
2001151137 | Jun 2001 | JP |
2001260698 | Sep 2001 | JP |
2002087107 | Mar 2002 | JP |
2002102188 | Apr 2002 | JP |
2002362391 | Dec 2002 | JP |
2004149003 | May 2004 | JP |
2004246708 | Sep 2004 | JP |
2005071185 | Mar 2005 | JP |
2005114673 | Apr 2005 | JP |
2005168908 | Jun 2005 | JP |
3687356 | Aug 2005 | JP |
200614765 | Jan 2006 | JP |
3757684 | Mar 2006 | JP |
2006182277 | Jul 2006 | JP |
2006232172 | Sep 2006 | JP |
2006302206 | Nov 2006 | JP |
3862192 | Dec 2006 | JP |
2006346109 | Dec 2006 | JP |
2007229116 | Sep 2007 | JP |
2007244479 | Sep 2007 | JP |
2007304705 | Nov 2007 | JP |
2008102777 | May 2008 | JP |
2008123448 | May 2008 | JP |
2008181327 | Aug 2008 | JP |
2008197821 | Aug 2008 | JP |
2008204107 | Sep 2008 | JP |
2008213823 | Sep 2008 | JP |
2008223879 | Sep 2008 | JP |
2008225899 | Sep 2008 | JP |
2008229091 | Oct 2008 | JP |
2008287561 | Nov 2008 | JP |
2008302741 | Dec 2008 | JP |
2008305096 | Dec 2008 | JP |
2009039167 | Feb 2009 | JP |
2009080718 | Apr 2009 | JP |
2009101714 | May 2009 | JP |
2009116693 | May 2009 | JP |
2009132307 | Jun 2009 | JP |
2009142576 | Jul 2009 | JP |
2009146377 | Jul 2009 | JP |
2009172205 | Aug 2009 | JP |
2009202841 | Sep 2009 | JP |
2009208739 | Sep 2009 | JP |
2009213779 | Sep 2009 | JP |
4340991 | Oct 2009 | JP |
4361011 | Nov 2009 | JP |
2010058691 | Mar 2010 | JP |
2010122650 | Jun 2010 | JP |
2010122897 | Jun 2010 | JP |
2010128649 | Jun 2010 | JP |
2010128669 | Jun 2010 | JP |
2010142593 | Jul 2010 | JP |
2010143578 | Jul 2010 | JP |
2010186276 | Aug 2010 | JP |
2010198313 | Sep 2010 | JP |
2011008457 | Jan 2011 | JP |
201130869 | Feb 2011 | JP |
20040098704 | Nov 2004 | KR |
20050015771 | Feb 2005 | KR |
20110127978 | Nov 2011 | KR |
2298215 | Apr 2007 | RU |
WO02096694 | Dec 2002 | WO |
WO2004108466 | Dec 2004 | WO |
WO2007090896 | Aug 2007 | WO |
WO2009098731 | Aug 2009 | WO |
WO2009104460 | Aug 2009 | WO |
WO2010122650 | Oct 2010 | WO |
WO2011038803 | Apr 2011 | WO |
WO2013113947 | Aug 2013 | WO |
WO2014123222 | Aug 2014 | WO |
WO2014149657 | Sep 2014 | WO |
Entry |
---|
Wu, H., Rubinstein, M., Shih, E., Guttag, J., Durand, F. & Freeman, W., “Eulerian Video Magnification for Revealing Subtle Changes in the World,” ACM Transactions on Graphics 31, No. 4 (Jul. 1, 2012): pp. 1-8. |
Examination Report of AU2012218054 dated Jun. 20, 2014, 2 pages. |
Office Action of CN Serial No. 201280009235.6, dated Oct. 20, 2014, 8 pages. |
Office Action of CN Serial No. 201280009235.6, dated Oct. 20, 2014, 8 pages, English translation. |
Extended European Search Report of EP12747073.0 dated Jul. 3, 2014. |
Extended European Search Report of related application No. 16177772.7 dated Nov. 7, 2016, 10 pages. |
Office Action of Japanese Patent Application No. 2014-227769 dated Oct. 20, 2015, 9 pages. |
Office Action of Japanese Patent Application 2016-142323 dated May 30, 2017, 3 pages. |
Office Action of KR Serial No. 10-2013-7024830 dated Oct. 13, 2014, 9 pages. |
Office Action of KR Serial No. 10-2013-7024830 dated Oct. 13, 2014, 11 pages, English translation. |
Office Action of Korean Patent Application No. 10-2015-7005245 dated Aug. 7, 2015, 8 pages. |
Office Action of Korean Patent Application No. 10-2015-7005245 dated Aug. 7, 2015, 9 pages, English translation. |
Office Action of U.S. Appl. No. 13/030,637 dated Mar. 28, 2013, 38 pages. |
Office Action of U.S. Appl. No. 13/030,637 dated Aug. 7, 2013, 23 pages. |
Office Action of U.S. Appl. No. 13/843,077 dated Feb. 11, 2016, 11 pages. |
Office Action of U.S. Appl. No. 13/843,194 dated Mar. 27, 2015, 39 pages. |
Office Action of U.S. Appl. No. 13/843,194 dated Sep. 24, 2015, 14 pages. |
Office Action of U.S. Appl. No. 13/843,249 dated Oct. 7, 2014, 30 pages. |
Office Action of U.S. Appl. No. 13/843,249 dated Apr. 28, 2015, 19 pages. |
Office Action of U.S. Appl. No. 13/843,249 dated Sep. 4, 2015, 11 pages. |
Office Action of U.S. Appl. No. 13/843,249 dated Nov. 24, 2015, 12 pages. |
Office Action of U.S. Appl. No. 13/858,038 dated Jun. 26, 2015, 19 pages. |
Office Action of U.S. Appl. No. 13/858,038 dated Oct. 15, 2015, 12 pages. |
Office Action of U.S. Appl. No. 14/074,710 dated Jan. 21, 2016, 17 pages. |
Office Action of U.S. Appl. No. 14/315,726 dated Aug. 5, 2016, 37 pages. |
Office Action of U.S. Appl. No. 14/315,726 dated Sep. 9, 2015, 42 pages. |
Office Action of U.S. Appl. No. 14/315,726 dated Dec. 2, 2015, 18 pages. |
Office Action of U.S. Appl. No. 14/461,530 dated Oct. 2, 2015, 44 pages. |
Office Action of U.S. Appl. No. 14/461,530 dated Jan. 14, 2016, 15 pages. |
Office Action of U.S. Appl. No. 14/851,753 dated Sep. 27, 2016, 95 pages. |
Office Action of U.S. Appl. No. 14/851,753 dated Dec. 21, 2016, 12 pages. |
Office Action of U.S. Appl. No. 14/851,753 dated Mar. 22, 2017, 14 pages. |
Office Action of U.S. Appl. No. 14/878,295 dated Jul. 27, 2017, 101 pages. |
Office Action of U.S. Appl. No. 14/977,787 dated Nov. 30, 2016, 65 pages. |
Office Action of U.S. Appl. No. 14/977,787 dated Apr. 21, 2017, 17 pages. |
Office Action of U.S. Appl. No. 14/983,176 dated May 9, 2016, 46 pages. |
International Search Report and Written Opinion dated May 23, 2012 in PCT Application No. PCT/US2012/023362. |
International Search Report and Written Opinion of PCT/US2014/020131 dated Jul. 1, 2014. |
International Search Report and Written Opinion of PCT/US2015/037019 dated Nov. 2, 2015, 12 pages. |
Boer, E., “Behavioral Entropy as a Measure of Driving Performance,” 2001, 5 pages. |
Boyraz, P. et al., “Multi-sensor driver drowsiness monitoring,” Proceedings of the I MECH E Part D Journal of Automobile Engineering, vol. 222, No. 11, Jul. 23, 2008, pp. 1857-1878. |
Brown et al.: “Framework for Multivariate Selectivity Analysis, Part I: Theoretical and Practical Merits”, Applied Spectroscopy, vol. 59, No. 6, 2005, pp. 787-803. |
Eoh, H. et al., “Driving Performance Analysis of the ACAS FOT Data and Recommendations for a Driving Workload Manager,” Technical Report UMTRI-2006-18, Dec. 2006, 126 pages. [Online] [Retrieval date unknown] Retrieved from the Internet URL:http://www.deepblue.lib.umich.edu/bitstream/2027.42/64468/1/102432.pdf. |
Heitmann, Anneke et al., “Technologies for the monitoring and prevention of driver fatigue,” Proceedings of the First International Driving Symposium on Human Factors in Driver Assessment, Training and Vehicle Design, 2001, pp. 81-86. |
Kable, Greg, “Ferrari plans mind reading tech,” Autocar.co.uk, Jan. 7, 2011. |
Kavsaoğlu et al.: “A novel feature ranking algorithm for biometric recognition with PPG signals”, Computers in Biology and Medicine vol. 49, 2014, pp. 1-14. |
Kong, L. et al., “Non-contact detection of oxygen saturation based on visible light imaging device using ambient light,” Optics Express, vol. 21, No. 15, p. 17464-17471, Jul. 29, 2013. |
Langdale-Smith, N., Jan. 27, 2015. CES 2015—Seeing Machines: The Future of Automotive Safety. Retrieved from https://www.youtube.com/watch?v=obPnLufAu7o. |
Murata et al.: “Noninvasive Biological Sensor System for Detection of Drunk Driving”, IEEE Transactions on Information Technology in Biomedicine, vol. 15, No. 1, Jan. 2011. |
Nakayama, O. et al., “Development of a Steering Entropy Method for Evaluating Driver Workload,” SAE Technical Paper Series 1999-01-0892, Mar. 1-4, 1999, Detroit, Michigan, USA. |
Ofalt, Martin M., Jr., “Ford, MIT Partnering to Increase Driver Wellness and Safety,” The College Driver.com, Jan. 24, 2010. |
Poh, M., McDuff, D.J., & Picard R.W., “Advancements in Noncontact, Multiparameter Physiological Measurements Using a Webcam,” IEEE Transactions on Biomedical Engineering, vol. 58, No. 1, pp. 7-11, Jan. 2011. |
Poh, M., McDuff, D.J. & Picard R.W., “Non-contact, automated cardiac pulse measurements using video imaging and blind source separation,” Optics Express, vol. 18, No. 10, pp. 10762-10774, May 10, 2010. |
Reimer, Bryan et al., “An Evaluation of Driver Reactions to New Vehicle Parking Assist Technologies Developed to Reduce Driver Stress,” MIT AgeLab White Paper, Nov. 4, 2010, pp. 1-26. |
Ridder et al.: “Framework for Multivariate Selectivity Analysis, Part II: Experimental Applications”, Applied Spectroscopy, vol. 59, No. 6, 2005, pp. 804-815. |
Serbedzija, et al. “Vehicle as a Co-Driver”, 1st International Symposium on Vehicular Computing Systems, Published May 16, 2010, 7 pages. |
Szondy, David, “Volvo uses face recognition to help tired drivers”, Mar. 18, 2014, http://www.gizmag.com/volvo-automated-driver-monitoring/31257/. |
Wiegand et al., Development and Evaluation of a Naturalistic Observer Rating of Drowsiness Protocol; Feb. 25, 2009 retrieved from the internet, retrieved on May 14, 2012; http://scholar.lib.vt.edu/VTTI/reports/ORD_Final_Report_022509.pdf, entire document. |
Wu, H., Rubinstein, M., Shih, E., Guttag, J., Durand, F. & Freeman, W., “Eulerian Video Magnification for Revealing Subtle Changes in the World,” MIT CSAIL, 8 pages. |
Article: http://www.faurecia.cn/jian-kang-mai-bo-fo-ji-ya-active-wellness-zuo-yi-wei-jia-cheng-zhe-jian-kang-hu-hang, printed on Apr. 27, 2015. |
http://media.ford.com/article_display.cfm?article_id=36728 “Ford Research Developing Intelligent System to Help Drivers Manage Stressful Situations on the Road”, Dearborn, Michigan, Jun. 27, 2012, 2 pages. |
http://reflect.pst.ifi.lmu.de/ “The Reflect Project” article (1 page) and Video Link to “The Reflect Project”: http://vimeo.com/25081038, filmed in Maranello, Italy, Mar. 2011, 7 minutes, 53 seconds. |
Internet Video: CEATEC new chip detects motion, heartbeats—Videos (news)—PC Advisor, printed Jan. 17, 2012. |
Press Release: “Faurecia keeps travelers fit, healthier in a heartbeat with “Active Wellness” car seat”, Apr. 20, 2015. |
Press Release, “Ford and MIT research study shows technological advancements reduce stress on driver,” http://web.mit.edu/press/2010/ford-mit-release.html, Nov. 4, 2010. |
Press Release: “Hoana Partners with Automotive Seat Manufacturer Faurecia to Introduce “Active Wellness™” at Auto Shanghai 2015”, Apr. 20, 2015, http://www.hoana.com/hoana_partners_with_faurecia/. |
Press Release: “Volvo Cars conducts research into driver sensors in order to create cars that get to know their drivers”, http://www.media.volvocars.com/global/en-gb/print/140898?print=1, Mar. 17, 2014. |
TruTouch Technologies prototype, Driver Alcohol Detection System for Safety, www.DADSS.org, 1 page. |
TruTouch Technologies: “Technology Overview” pp. 1-4, printed Apr. 27, 2015. |
Vector Forces by Dr. Larry Bortner dated Aug. 21, 2004. |
Video: https://www.youtube.com/watch?v=obPnLufAu7o, printed May 11, 2015, 2 pages. |
YouTube Video Link: https://www.youtube.com/watch?feature=youtu.be&v=_1UBDFSzQ28&app=desktop, printed on Apr. 27, 2015—Faurecia at the 2015 Shanghai Auto Show. |
Office Action of U.S. Appl. No. 15/656,595 dated Oct. 2, 2018, 143 pages. |
Office Action of U.S. Appl. No. 15/720,489 dated Oct. 1, 2018, 146 pages. |
Extended European Search Report of related application No. EP 15811941.2 dated Aug. 3, 2018, 7 pages. |
Office Action of U.S. Appl. No. 15/836,341 dated Jun. 28, 2019, 25 pages. |
Office Action of Japanese Patent Application 2018-209617 dated Sep. 3, 2019, 3 pages. |
Office Action of Japanese Patent Application 2018-209617 dated Sep. 3, 2019, 3 pages, English translation. |
Office Action of U.S. Appl. No. 15/836,341 dated Jun. 26, 2020, 22 pages. |
Notice of Allowance of U.S. Appl. No. 15/836,341 dated Sep. 2, 2020, 7 pages. |
Office Action of U.S. Appl. No. 15/836,341 dated Sep. 26, 2019, 14 pages. |
Office Action of U.S. Appl. No. 16/936,222 dated Nov. 30, 2021, 112 pages. |
Office Action of U.S. Appl. No. 16/385,108 dated Feb. 3, 2020, 106 pages. |
Office Action of U.S. Appl. No. 16/419,133 dated Feb. 3, 2020, 106 pages. |
Office Action of U.S. Appl. No. 16/419,145 dated Feb. 19, 2020, 117 pages. |
Office Action of U.S. Appl. No. 16/419,152 dated Feb. 19, 2020, 117 pages. |
Office Action of U.S. Appl. No. 16/419,161 dated Feb. 18, 2020, 109 pages. |
Office Action of U.S. Appl. No. 15/836,341 dated Feb. 6, 2020, 22 pages. |
Notice of Allowance of U.S. Appl. No. 16/385,108 dated Apr. 21, 2020, 25 pages. |
Notice of Allowance of U.S. Appl. No. 16/419,133 dated May 4, 2020, 25 pages. |
Notice of Allowance of U.S. Appl. No. 16/419,145 dated Apr. 28, 2020, 13 pages. |
Notice of Allowance of U.S. Appl. No. 16/419,152 dated Apr. 24, 2020, 13 pages. |
Notice of Allowance of U.S. Appl. No. 16/419,161 dated Jun. 4, 2020, 10 pages. |
Notice of Allowance of U.S. Appl. No. 16/936,222 dated Mar. 11, 2022, 8 pages. |
Number | Date | Country
---|---|---
20180072310 A1 | Mar 2018 | US
Relation | Number | Date | Country
---|---|---|---
Parent | 14977787 | Dec 2015 | US
Child | 15720597 | | US
Parent | 13843249 | Mar 2013 | US
Child | 14977787 | | US
Parent | 13030637 | Feb 2011 | US
Child | 13843249 | | US