Distracted driving is dangerous and sometimes fatal. In 2011, up to 10% of fatal auto collisions in the U.S. were reported as distraction-affected crashes. For drivers under the age of 20, 21% of the fatal, distraction-related crashes involved the use of cell phones at the time of the collisions. Many states now outlaw texting while driving, but laws alone are not enough to deter some drivers. An effective way to prevent distracted driving may be to limit the use of cell phones by disabling the distracting features. Existing systems and methods use servers or additional devices to identify whether the user is driving; consequently, these systems and methods incur extra costs to the users. It is therefore desirable to utilize the computing power and features of the mobile device itself to prevent drivers from distractions.
The structure, overall operation, and technical characteristics of the present invention will become apparent from the following detailed description of preferred embodiments and the illustration of the related drawings.
The invention is incorporated in a method and a system with interconnected software and a mobile device that detect whether a user of the mobile device is driving above a certain speed at which serious, or even fatal, accidents are likely to happen. For the purpose of this application, a “mobile device” should be broadly construed to include various handheld devices, such as cell phones, handheld game consoles, portable media players, personal digital assistants (PDAs), mobile internet devices (MIDs), mobile televisions, and any other hand-held device that can transmit text messages and related information. If the user is found to be driving, certain distracting features of the mobile device may be temporarily disabled by a lock-out mechanism of the mobile device. The distracting features may include texting, YouTube, emails, movies, and so forth. When the features are disabled, a message may be displayed to inform the user of the temporary disablement.
The mobile device may include a motion detector to detect the traveling speed and to determine whether the traveling speed is beyond a certain, predefined threshold level, such as 10 miles/hour, which indicates that the user may be driving. The traveling speed of the user means an instantaneous speed of the mobile device, rather than an average speed over the course of the journey. The location of the mobile device may be sampled at a certain frequency, such as several times a minute, so that the traveling speed can be constantly updated. Alternatively, the speed may be determined with a global positioning system (GPS) or any other means known to a person skilled in the art. For example, the vehicle may communicate its traveling speed to the mobile device. If the traveling speed is lower than the speed at which crashes statistically may cause serious injuries, such as when a driver stops at a red light, then the user may be allowed to use the distracting features.
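The speed criterion above can be sketched as follows. This is a minimal illustration, not the claimed implementation: it assumes the motion detector can sample two timestamped latitude/longitude fixes, derives an instantaneous speed from their great-circle separation, and compares it against the 10 miles/hour threshold. All function names are hypothetical.

```python
import math

# Illustrative sketch of the speed criterion: estimate instantaneous
# speed from two sampled GPS fixes and compare it against a predefined
# threshold (10 mph). Names here are assumptions, not a real mobile API.

EARTH_RADIUS_M = 6_371_000.0
THRESHOLD_MPH = 10.0
METERS_PER_MILE = 1609.344

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon fixes."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def speed_mph(fix_a, fix_b):
    """Each fix is (lat, lon, unix_time_s); returns the speed in mph."""
    dist_m = haversine_m(fix_a[0], fix_a[1], fix_b[0], fix_b[1])
    dt_s = fix_b[2] - fix_a[2]
    return (dist_m / dt_s) * 3600.0 / METERS_PER_MILE

def may_be_driving(fix_a, fix_b):
    """True when the instantaneous speed exceeds the threshold."""
    return speed_mph(fix_a, fix_b) > THRESHOLD_MPH
```

A fix pair 0.001 degrees of latitude apart over 10 seconds corresponds to roughly 25 mph, which would exceed the threshold; two identical fixes (a stopped vehicle) would not.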
In addition to the traveling speed, indications of whether a user is driving include the gestures of the user, the objects near the user, and the orientation of the mobile device being used. The mobile device may include a user interaction analyzer to detect these driver indications. For example, user gesture identification consistent with driving may include identifying movements and positions of the user's eyes, head, shoulders, and hands, and any combinations thereof. Objects near the user may be used to identify whether the user is sitting in the driver seat. As to the orientation of the mobile device, a driver is not likely to tilt the mobile device flat against her lap when using the device. Therefore, if a user uses the mobile device with the screen facing up, the user is less likely to be the driver.
One embodiment may identify the user gestures based on images taken by its cameras, since the gestures of a driver differ from those of a passenger. An embodiment may include a rearward facing camera and facial recognition software that distinguishes the user's face from all of the objects comprising the background scenery (the interior components of the vehicle and the objects outside the vehicle); if the software identifies that the tilt of the user's face changes in a nodding motion, the user interaction analyzer of the embodiment may decide that the user is a driver. Another embodiment with an eye movement tracking sensor may detect that the user is a driver by identifying an up and down darting motion of the user's eye. Alternatively, if a user interaction analyzer automatically takes pictures of the user every 10 seconds, identifies the objects in the pictures, and finds in a picture that at least one of the user's hands rests on the steering wheel (a part of the background scenery), then the user interaction analyzer may determine that a driver gesture is detected. Similarly, various combinations of eye, head, hand, and shoulder movements and positions may be used for identifying whether the user is driving.
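The nodding-motion indication can be illustrated with a short sketch. It assumes hypothetical facial recognition software that reports a time series of face-tilt (pitch) angles, and treats a nod as a downward excursion beyond a threshold followed by a return toward the starting tilt; the thresholds are illustrative choices, not values from the specification.

```python
# Illustrative sketch: detect a "nodding" motion from a series of
# face-tilt (pitch) samples reported by hypothetical facial-recognition
# software. A nod is modeled as a downward dip past a threshold that
# then returns near the baseline tilt. Threshold values are assumptions.

NOD_DEPTH_DEG = 15.0   # assumed minimum downward tilt for a nod
RETURN_DEG = 5.0       # assumed tolerance for returning to baseline

def is_nodding(pitch_deg_samples):
    """True if the sampled face tilt dips down and then comes back up."""
    if not pitch_deg_samples:
        return False
    baseline = pitch_deg_samples[0]
    dipped = False
    for pitch in pitch_deg_samples[1:]:
        if pitch <= baseline - NOD_DEPTH_DEG:
            dipped = True            # head tilted down far enough
        elif dipped and abs(pitch - baseline) <= RETURN_DEG:
            return True              # head came back up: a nod
    return False
```

A sequence such as 0, -5, -20, -10, -2 degrees would register as a nod; a shallow sway that never passes the depth threshold would not.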
Additionally, the relative positions of the objects may indicate where the user is sitting in a vehicle. Examples of the objects include an instrument cluster, a steering wheel, the road, other vehicles, and so forth. An embodiment may use a forward facing camera to capture images for identifying the objects. When the embodiment identifies from the image a certain shape that resembles a part of an instrument cluster, the user may be sitting in the driver seat and therefore be a driver. Also, if a steering-wheel-like object, or the line where the dashboard meets the windshield, appears in the image taken by the forward facing camera, the embodiment may determine that the user is holding the mobile device while driving. Another example is a road with road markings delineating the lanes: if the marking on the left is closer to the user, the user may be sitting in the driver seat (in most countries). Various embodiments may use various objects and combinations thereof to identify the user's seat.
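The lane-marking example can be sketched as a simple geometric test. This assumes some separate (unshown) detector has already located the left and right lane markings as horizontal pixel positions in the forward-facing camera image; the function and parameter names are hypothetical.

```python
# Hypothetical sketch: infer the user's seat from lane markings detected
# in a forward-facing camera image. If the left marking is horizontally
# nearer the camera's centerline than the right marking, the device is
# likely on the left side of the lane, i.e. the driver seat in countries
# where the driver sits on the left. Marking detection itself is assumed.

def likely_driver_seat(left_marking_x, right_marking_x, image_width):
    """x coordinates in pixels; True if the left marking is closer."""
    center_x = image_width / 2.0
    left_gap = abs(center_x - left_marking_x)
    right_gap = abs(right_marking_x - center_x)
    return left_gap < right_gap
```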
Finally, the orientation of the mobile device may be used by the user interaction analyzer of the mobile device to distinguish a driver from passengers. One embodiment may use an orientation sensor, such as an accelerometer in a smart phone, to detect the angle of the mobile device. If the mobile device is placed flat against a user's lap, the user is less likely to be a driver. But if the screen of the mobile device is facing away from the direction of travel, and the mobile device is inclined forward within 45 degrees of vertical, the user may be a driver.
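The orientation test can be sketched from a raw accelerometer reading. The sketch assumes the common smart-phone axis convention (+z out of the screen, +y toward the top edge, gravity in m/s²), so a device lying flat has gravity mostly on z, while a device held within 45 degrees of vertical has gravity mostly on y; the helper names are illustrative.

```python
import math

# Sketch under assumptions: classify device orientation from a gravity
# vector in the device frame (m/s^2). Axis convention assumed: +z out
# of the screen, +y toward the top edge of the device.

def tilt_from_vertical_deg(ax, ay, az):
    """Angle between the device's top edge and the upward direction."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    return math.degrees(math.acos(max(-1.0, min(1.0, ay / g))))

def driver_like_orientation(ax, ay, az):
    """True when the device is held within 45 degrees of vertical."""
    return tilt_from_vertical_deg(ax, ay, az) <= 45.0

def screen_facing_up(ax, ay, az):
    """Gravity dominated by the screen normal: device lies flat."""
    return abs(az) > abs(ax) and abs(az) > abs(ay)
```

An upright device reads roughly (0, 9.81, 0) and passes the driver-like test; a device flat on a lap reads roughly (0, 0, 9.81) and is classified as screen-up, the passenger-like posture described above.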
One object of this invention is to identify whether the user is driving while using a mobile device and to disable the distracting features of the mobile device, while allowing the use of the mobile device by passengers.
One object of this invention is to prevent distracted driving involving mobile devices while certain emergency uses are still allowed.
Another object of this invention is to provide a low cost system and method for effectively preventing distracted driving caused by a mobile device while not incurring extra cost for additional devices or services specially designed to identify a driver.
The preferred embodiments are a method and a mobile device for preventing distracted driving by disabling distracting features of the mobile device. The preferred method is illustrated in
In Step 110 of
For a preferred embodiment with a GPS receiver, the motion detector may determine the traveling speed of the mobile device by measuring the Doppler shift in the GPS signals from the satellites. The signals from the satellites change depending on how fast the mobile device is moving toward or away from the satellites. The more satellites the GPS receiver can track, the more accurate the calculated traveling speed is. For another embodiment with a cell phone as the mobile device, the traveling speed may be determined by the motion detector from the distance and location of cell towers.
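The Doppler relation underlying this measurement can be shown for a single satellite. A real receiver combines many satellites to solve for a full 3-D velocity; this sketch only converts one measured carrier-frequency shift into a line-of-sight speed using v = c·Δf/f with the GPS L1 carrier frequency.

```python
# Illustrative sketch: line-of-sight speed from the Doppler shift of a
# GPS L1 carrier, v = c * df / f. A real receiver solves for 3-D
# velocity from several satellites; one satellite yields only the
# radial component.

C_M_PER_S = 299_792_458.0     # speed of light
GPS_L1_HZ = 1_575_420_000.0   # GPS L1 carrier frequency

def radial_speed_m_s(doppler_shift_hz):
    """Speed toward (+) or away from (-) one satellite, in m/s."""
    return C_M_PER_S * doppler_shift_hz / GPS_L1_HZ
```

A shift of 100 Hz on L1 corresponds to roughly 19 m/s (about 43 mph) along the line of sight, which is why even small frequency shifts are a sensitive speed measurement.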
The preferred embodiment may then use the user interaction analyzer to detect whether the user is a driver by various driver indications. The first indication is based on a predefined set of driving gestures—user gestures detected in Step 120 that are consistent with driving. The preferred embodiment may adopt a rearward facing camera and facial recognition software for detecting the orientation of the user's face. If the tilt of the user's face changes in a nodding motion, the user may be a distracted driver. If the user blocks the camera, the user may be a driver who tries to circumvent the detection.
In another embodiment with a rearward facing motion sensor, the motion sensor may also detect a nodding motion which can be used to determine that the user is driving. Moreover, if an embodiment is equipped with an eye motion sensor, the user's up and down darting motion may indicate that she is a driver. Various embodiments may use one or more gesture matches with the predetermined driver gestures as a pass of this criterion.
In Step 130, the preferred embodiment may identify objects generally seen from a driver seat but not from a passenger seat for determining where the user is sitting. The preferred embodiment with a forward facing camera may take pictures of the environment the user is facing. Based on the shapes of objects in the pictures, the embodiment may determine that the objects match those generally seen from a driver seat, thus meeting the criterion. The objects used may include things in the vehicle such as the steering wheel, the instrument cluster, the dashboard, the meeting of the dashboard and the windshield, or any parts of these objects. Things outside the vehicle may also be used, such as the road and other vehicles. The criterion in various embodiments may require only one match or multiple matches.
Some environmental conditions may affect the effectiveness of Steps 120 and 130. For example, at night it may be too dark for the embodiments to recognize the user's gestures or the objects around the user. Therefore, another driver indication may be necessary. The preferred embodiment also adopts the orientation of the mobile device to detect, by the user interaction analyzer, whether the user is a driver. If the screen is facing away from the direction of travel and tilting slightly forward, such as being within 45 degrees of vertical or any predetermined range of orientations of a driver's mobile device, the user may be the driver. On the contrary, if the user is a passenger, the user may tilt her device flat against her lap, with the screen facing up. Smart phones generally are equipped with accelerometers for determining orientation.
For an embodiment with a smart phone as the mobile device or having an orientation sensor, the embodiment may determine by the user interaction analyzer that the user is a driver when the device is not facing up. In the preferred embodiment illustrated in
If the criteria 110 and 150 are met, the preferred embodiment in
The message may be “pull over and stop if you wish to text” when the distracting feature disabled is texting with a cell phone. In addition, the lock-out mechanism may include at least one hot button for emergency situations in Steps 180 and 190. The hot button may be pre-defined to allow the user to text or contact 911 or an emergency contact person. It is preferred that the emergency contact is established with the service provider so that the user cannot circumvent the blocking.
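The lock-out behavior described in the last two paragraphs can be sketched as follows. Every name here is an assumption made for illustration: the lock-out activates only when both the speed criterion and the driver-indication criterion are met, blocks the distracting features with the quoted message, and always leaves the emergency hot-button actions available.

```python
# Minimal sketch (assumed names throughout): a lock-out mechanism that
# disables distracting features while both driver criteria are met, but
# always allows predefined emergency actions such as dialing 911.

DISTRACTING_FEATURES = {"texting", "video", "email"}
EMERGENCY_ACTIONS = {"call_911", "call_emergency_contact"}

class LockOut:
    def __init__(self):
        self.active = False

    def update(self, speed_over_threshold, driver_detected):
        # Both criteria must be met before the lock-out engages.
        self.active = speed_over_threshold and driver_detected

    def allowed(self, action):
        if action in EMERGENCY_ACTIONS:
            return True                  # hot-button path always open
        if self.active and action in DISTRACTING_FEATURES:
            return False                 # feature temporarily disabled
        return True

    def message(self, action):
        """Notice shown when a disabled feature is attempted."""
        if not self.allowed(action):
            return "pull over and stop if you wish to text"
        return None
```

Note that the emergency check comes first, so the 911 path cannot be blocked by the lock-out state, mirroring the requirement that emergency uses remain available.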
The preferred embodiment in
In addition, alternate embodiments may adopt some but not all of the criteria in
While the invention has been described by means of specific embodiments, numerous modifications and variations could be made thereto by those ordinarily skilled in the art without departing from the scope and spirit disclosed herein.
The present application claims priority to co-pending U.S. provisional patent application entitled “Driver Distraction Disabling via Gesture Recognition,” having Ser. No. 61/859,105, filed on Jul. 26, 2013, which is entirely incorporated herein by reference.