Driver attention evaluation

Information

  • Patent Grant
  • Patent Number
    9,845,097
  • Date Filed
    Wednesday, August 12, 2015
  • Date Issued
    Tuesday, December 19, 2017
Abstract
A driver evaluation system for a vehicle generates for display a parameter indicative of a driver's level of attention to driving the vehicle for a temporal period based on a complement of a weighted average of a plurality of counts. Some of the counts represent a number of interaction events between the driver and an infotainment system in the vehicle during the temporal period.
Description
TECHNICAL FIELD

This disclosure relates to systems and methods for evaluating a driver's focus on the task of driving.


BACKGROUND

Sophisticated automobile electronic devices can provide an array of features and functions with which the driver can interact. Moreover, the number of available portable computing and telecommunication devices, such as cell phones, tablets, and wearable devices, is increasing. As a result, these portable devices are more likely to be present in the vehicle. Overuse of these electronic and portable devices while driving may draw the driver's attention away from the road.


SUMMARY

A driver evaluation system for a vehicle includes one or more controllers programmed to generate for display a parameter indicative of a driver's level of attention to driving the vehicle for a temporal period based on a complement of a weighted average of a plurality of counts. At least some of the counts represent a number of interaction events between the driver and an infotainment system in the vehicle during the temporal period. The one or more controllers may further be programmed to link the parameter with geographic coordinates traversed by the vehicle during the temporal period, and in response to a driver attention request including the geographic coordinates, output the parameter. The one or more controllers may further be programmed to accumulate one of the counts such that, in response to occurrence of an interaction event, the one of the counts being greater than zero, and a count timer exceeding a predefined time limit, the count timer is decreased according to a difference between the count timer and a quotient of the predefined time limit and the one of the counts. The temporal period may be defined by a sliding window time period. The one or more controllers may further be programmed to record a value of the parameter for each of temporal periods defining a drive cycle. At least one of the plurality of counts may represent a number of look-away events from a road during the temporal period. The look-away events may be based on an eye gaze direction or head pose of the driver. The infotainment system may be a cell phone, an instrument panel cluster, or a center stack console.


A driver evaluation method includes, by a controller, accumulating a driver inattention event count such that, in response to occurrence of an interaction event and a count timer exceeding a limit, the count timer is decreased according to a difference between the count timer and a quotient of the limit and the count. The method also includes displaying a driver attention state value that is based on a complement of a weighted average that includes the count. The method may further include linking the driver attention state value with geographic coordinates, and in response to a driver attention request including the geographic coordinates, outputting the driver attention state value. The method may further include recording the driver attention state value for each of a plurality of temporal periods defining a drive cycle, and upon completion of the drive cycle, displaying an average of at least some of the driver attention state values. The method may further include wirelessly transmitting the driver attention state value off-board. The driver inattention event count may represent a number of look-away events from a road for a temporal period. The look-away events may be based on an eye gaze direction or head pose of a driver.


A vehicle includes an interface, and one or more controllers programmed to, in response to a driver attention request, generate for display via the interface a parameter indicative of a driver's level of attention to driving the vehicle for a selected time period based on a complement of a weighted average, having a value between zero and one, of a plurality of counts. At least some of the counts represent a number of interaction events between the driver and an infotainment system in the vehicle during the selected time period. The one or more controllers may further be programmed to accumulate one of the counts such that, in response to occurrence of an interaction event, the one of the counts being greater than zero, and a value of a count timer exceeding a predefined time limit, the value is decreased according to a difference between the count timer and a quotient of the predefined time limit and the one of the counts. The selected time period may define a drive cycle. At least one of the plurality of counts may represent a number of look-away events from a road during the selected time period. The look-away events may be based on an eye gaze direction or head pose of the driver.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of a driver attention evaluation system.



FIGS. 2A and 2B are plots showing, respectively, driver inattention events over time and a corresponding count of the same for a sliding window time period.



FIG. 3 is a block diagram of an algorithm for counting driver inattention events.



FIGS. 4A and 4B are plots showing, respectively, driver inattention events over time and a corresponding count of the same using the algorithm of FIG. 3.



FIG. 5 is a block diagram of a vehicle including the driver attention evaluation system of FIG. 1.





DETAILED DESCRIPTION

Embodiments of the present disclosure are described herein. It is to be understood, however, that the disclosed embodiments are merely examples and other embodiments may take various and alternative forms. The figures are not necessarily to scale; some features could be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present invention. As those of ordinary skill in the art will understand, various features illustrated and described with reference to any one of the figures may be combined with features illustrated in one or more other figures to produce embodiments that are not explicitly illustrated or described. The combinations of features illustrated provide representative embodiments for typical applications. Various combinations and modifications of the features consistent with the teachings of this disclosure, however, could be desired for particular applications or implementations.


Drivers have access to a variety of in-vehicle infotainment systems such as center stack consoles, clusters, mobile phones, wearable devices, etc. Extended use of these devices during periods of driving may reduce driver focus and attention. Although various systems may provide alerts or postpone information delivery based on the extent to which a driver is interacting with infotainment devices, it may further be useful to provide a gauge of overall driver attentiveness. That is, it may be useful to represent the driver's level of attention to the driving task for a given time period (e.g., a drive cycle, a selected portion of a drive cycle, etc.) as a function of their aggregate interaction with available devices during that time period: high interaction would suggest low driver attentiveness to the driving task, low interaction would suggest high driver attentiveness to the driving task. Numeric values, symbols, icons, etc. can be used to convey the driver's level of attention.


Referring to FIG. 1, one or more processors 10 may implement a set of algorithms structured to generate a parameter representing the driver's level of attention to the driving task. To begin, data concerning driver head pose excursions, center stack interaction, cluster interaction, connected brought-in device interaction, etc. is collected via respective counting algorithms 12, 14, 16, 18. As an example, in-vehicle cameras may be used to track a direction of the driver's eye gaze or head pose using known techniques. If the gaze or pose is evaluated as being away from the road for at least some predetermined period of time (using, for example, known recursive signal processing techniques to determine look-away glances longer than the predetermined period of time), an accumulator value, PE, may be incremented. Likewise if signals from a center stack console, instrument panel cluster, a brought-in device in communication with the vehicle, etc. indicate that a driver is engaged therewith, respective accumulator values, CS, Cl, BD, may be incremented.


Any number of techniques may be used to track the level of interaction with the above mentioned subsystems and devices. For example, FIG. 2A shows a number of interactions with a given infotainment device over a 55 second time period. A value of 1 indicates an interaction event, such as touching a screen, turning a dial, or dialing a cell phone; a value of 0 indicates the absence of an interaction event. FIG. 2B shows a corresponding circular buffer count with a 12 second moving window. While the counting result from the circular buffer is the true accumulated count for the moving 12 second window, a 50 Hz sampling rate requires 600 data storage memory allocations.
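The circular-buffer count of FIG. 2B can be sketched directly: one 0/1 slot per sample, with the window enforced by the buffer's fixed size. The class name and parameter names below are illustrative, not from the patent.

```python
from collections import deque

class SlidingWindowCounter:
    """True moving-window event count via a circular buffer.

    Stores one 0/1 sample per tick, so a 12 s window sampled
    at 50 Hz requires 600 stored samples per tracked device.
    """

    def __init__(self, window_seconds=12, sample_rate_hz=50):
        # deque with maxlen acts as a circular buffer: appending
        # beyond capacity silently drops the oldest sample.
        self.buffer = deque(maxlen=window_seconds * sample_rate_hz)

    def step(self, event: bool) -> int:
        """Record one sample and return the count within the window."""
        self.buffer.append(1 if event else 0)
        return sum(self.buffer)
```

The per-device memory cost of this exact count is what motivates the two-register approximation of FIG. 3.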


Referring to FIG. 3, an alternative counting algorithm may be used that only requires two fixed size registers: one for the count and one for a count timer. If an interaction event is detected at decision block 24, the count is incremented at operation 26. If the count is greater than 0 at decision block 28, the algorithm proceeds to decision block 30. If the count timer is less than a predefined time limit, the count timer is incremented at operation 32. The count is then output at operation 34. The algorithm then returns to decision block 24.


Thus upon system initialization, the count timer remains at 0 until occurrence of a first interaction event—at which time the count timer begins to increase in value up to the predefined time limit. The predefined time limit effectively dictates the rate at which the count will decay absent further interaction events: the smaller the limit, the faster the decay; the larger the limit, the slower the decay. Hence, the limit can be selected or tuned to achieve a desired decay for a particular infotainment device. For example, the limit may be selected such that the count for interactions with a mobile device decays more slowly than the count for interactions with a radio volume dial, because interactions with the mobile device may be more distracting than interactions with the radio volume dial.


Returning to decision block 30, if the timer is greater than the limit, the timer is reduced at operation 36 by a fraction of its value according to the difference between the timer and the quotient of the limit and the count. Additionally, the count is decremented. The algorithm then proceeds to operation 34.


Once the count timer exceeds the limit, the value of the count no longer increases because any increment to the count that took place at operation 26 is then removed at operation 36. Put a different way, once the count timer exceeds the limit, the value of the count will either remain the same (in the presence of interaction events) or decrease in value.


Returning to decision block 28, if the count is equal to 0, the timer is reset to 0. The algorithm then returns to decision block 24.
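The flow of FIG. 3 as described above can be sketched in a few lines. This is one reading of the described operations, assuming the algorithm runs once per sample tick and that "decreased according to a difference between the count timer and a quotient of the limit and the count" means subtracting that difference; variable names are illustrative.

```python
class DecayingEventCounter:
    """Two-register approximation of a sliding-window event count (FIG. 3).

    Keeps only a count and a count timer. Once the timer reaches the
    predefined limit, the count stops growing and decays at a rate
    set by the limit.
    """

    def __init__(self, time_limit: int):
        self.limit = time_limit  # in ticks; tunes the decay rate
        self.count = 0
        self.timer = 0

    def step(self, event: bool) -> int:
        """One pass of the flow: blocks 24/28/30, operations 26/32/36/34."""
        if event:                          # decision block 24
            self.count += 1                # operation 26
        if self.count > 0:                 # decision block 28
            if self.timer < self.limit:    # decision block 30
                self.timer += 1            # operation 32
            else:
                # operation 36: reduce the timer by the difference between
                # the timer and the quotient of the limit and the count,
                # and decrement the count.
                self.timer -= self.timer - self.limit / self.count
                self.count -= 1
        else:
            self.timer = 0                 # no events remain; reset timer
        return self.count                  # operation 34
```

With a limit of 12 ticks, a single event is counted immediately, held while the timer climbs to the limit, and then decays away — approximating the 12-slot window of FIG. 2B with two registers instead of 600.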



FIG. 4A again shows the number of interactions with the given infotainment device over a 55 second time period. (The same number of interactions with the same frequency as FIG. 2A.) FIG. 4B shows a corresponding count using the memory efficient algorithm of FIG. 3. Comparing FIGS. 2B and 4B, the memory efficient algorithm approximates the count with a fair degree of accuracy, but with much less overhead.


Referring again to FIG. 1, the respective counts PE, CS, Cl and BD are provided to an attention state aggregation algorithm 40. In one example, the respective counts are summed using a weighted average to generate an attention state aggregation value, ASA, according to the equation

ASA = Σ_{i=1}^{N} w_i y_i   Eq. (1)

where N is the number of driver interaction devices tracked, y_i is the accumulated interaction count for device i, and w_i is the weight attributed to device i. The weights may be selected such that any resulting ASA has a value between 0 and 1. The weights may also be selected to account for certain interactions being more distracting than others: dialing a brought-in device (such as a cell phone) may be more distracting than turning a dial on a cluster. Hence, in certain embodiments, the rates at which the counts decay and the weights associated with the counts may differ to reflect that certain types of interactions are more taxing than others.


Using the inputs PE, CS, Cl and BD, the ASA would be given by

ASA = PE·w_PE + CS·w_CS + Cl·w_Cl + BD·w_BD   Eq. (2)

Any suitable aggregating technique, however, may be used.


The ASA is then provided to a driver attention state evaluation algorithm 42. In one example, the driver attention state value, DASV, may be computed as the complement of the ASA according to the equation

DASV = 1 − ASA   Eq. (3)

Hence, the DASV provides a value between 0 and 1 that may be scaled as a percentage and binned into different categories to reflect driver attention for a given time period. Values in the range of 80% to 100% can represent high driver attention to the driving task, whereas values in the range of 0% to 20% can represent low driver attention to the driving task.
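Equations (2) and (3) and the binning above can be sketched as follows. The per-count normalization ceilings are an assumption introduced here so that each y_i, and hence the ASA, stays between 0 and 1 (the patent only requires the weights to be selected so that the ASA lands in that range); function names and thresholds other than the 20%/80% bands are illustrative.

```python
def driver_attention_state(counts, weights, max_counts):
    """DASV = 1 - ASA, with ASA the weighted average of Eq. (2).

    counts:     raw accumulated counts, e.g. (PE, CS, Cl, BD).
    weights:    per-device weights; should sum to at most 1.
    max_counts: normalization ceilings so each y_i is in [0, 1]
                (an assumption, not specified by the patent).
    """
    asa = sum(w * min(c / m, 1.0)
              for c, w, m in zip(counts, weights, max_counts))
    return 1.0 - asa          # Eq. (3): complement of the weighted average

def attention_bin(dasv):
    """Scale the DASV as a percentage and bin it into coarse categories."""
    pct = dasv * 100
    if pct >= 80:
        return "high attention"
    if pct <= 20:
        return "low attention"
    return "moderate attention"
```

For example, zero interactions across all devices yield a DASV of 1.0 ("high attention"), while saturated counts on every device drive the DASV toward 0 ("low attention").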


The respective counts discussed above can be sampled at selected or periodic times, and equations (2) and (3) used to generate a DASV for that period. These DASVs can then be, for example, averaged to develop a DASV for a given drive cycle, or linked with various geographic portions of a drive cycle so that the driver's attention for a city driving portion or highway portion of the drive cycle can be evaluated. As an example, a DASV can be generated every minute, and each DASV can be associated with the current geographic coordinates of the vehicle at that time. A user may then request the DASV for a particular portion of a drive cycle via a request for the DASV associated with geographic coordinates defined by that portion. Values for the DASV, or an average thereof, may then be reported. Other scenarios are also contemplated.
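The per-period logging and coordinate-based lookup described above can be sketched as a simple record-and-query structure. The record layout and the per-axis degree threshold used to decide whether a logged period "includes" the requested coordinates are assumptions made for illustration.

```python
class DasvLog:
    """Links each periodic DASV with the vehicle position at that time."""

    def __init__(self):
        self.records = []  # one (lat, lon, dasv) tuple per temporal period

    def record(self, lat, lon, dasv):
        """Store the DASV for one period with the current coordinates."""
        self.records.append((lat, lon, dasv))

    def query(self, lat, lon, radius_deg=0.05):
        """Average DASV over periods logged near the requested coordinates.

        A simple per-axis degree threshold stands in for matching the
        geographic portion of the drive cycle (an assumption).
        Returns None if no period matches.
        """
        near = [d for la, lo, d in self.records
                if abs(la - lat) <= radius_deg and abs(lo - lon) <= radius_deg]
        return sum(near) / len(near) if near else None
```

A request for, say, the highway portion of a drive cycle would then query with coordinates inside that portion and receive the average DASV recorded there.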


Referring to FIG. 5, a vehicle 44 includes a plurality of sub-systems or devices that a driver may interact with such as a center stack console 46 and an instrument panel cluster 48. A driver may also have brought into the vehicle 44 a device 50, such as a cell phone or wearable device, that is in communication with various controllers and communication infrastructure of the vehicle 44 using known techniques. Still further, the vehicle 44 may include a camera system 51 configured as known in the art to track head movements or eye gaze directions of the driver. Data indicating whether the driver is interacting with any of the center stack console 46, the instrument panel cluster 48 or brought-in device 50 may be provided to the driver attention evaluation system 10 via a car area network or other communication lines therebetween. Likewise, data representing the driver's head pose or direction of eye gaze may be provided to the driver attention evaluation system 10. Using the algorithms described above, the processors of the driver attention evaluation system 10 may generate DASV from the data provided.


The DASV may be passed to a controller 52, which may selectively provide the DASV to an interface 54 for display, to a memory 56 for later retrieval, or to a transmitter (or otherwise) for off-board transmission.


The processes, methods, or algorithms disclosed herein may be deliverable to or implemented by a processing device, controller, or computer, which may include any existing programmable electronic control unit or dedicated electronic control unit. Similarly, the processes, methods, or algorithms may be stored as data and instructions executable by a controller or computer in many forms including, but not limited to, information permanently stored on non-writable storage media such as ROM devices and information alterably stored on writeable storage media such as floppy disks, magnetic tapes, CDs, RAM devices, and other magnetic and optical media. The processes, methods, or algorithms may also be implemented in a software executable object. Alternatively, the processes, methods, or algorithms may be embodied in whole or in part using suitable hardware components, such as Application Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs), state machines, controllers or other hardware components or devices, or a combination of hardware, software and firmware components.


The words used in the specification are words of description rather than limitation, and it is understood that various changes may be made without departing from the spirit and scope of the disclosure. As previously described, the features of various embodiments may be combined to form further embodiments of the invention that may not be explicitly described or illustrated. While various embodiments could have been described as providing advantages or being preferred over other embodiments or prior art implementations with respect to one or more desired characteristics, those of ordinary skill in the art recognize that one or more features or characteristics may be compromised to achieve desired overall system attributes, which depend on the specific application and implementation. These attributes may include, but are not limited to cost, strength, durability, life cycle cost, marketability, appearance, packaging, size, serviceability, weight, manufacturability, ease of assembly, etc. As such, embodiments described as less desirable than other embodiments or prior art implementations with respect to one or more characteristics are not outside the scope of the disclosure and may be desirable for particular applications.

Claims
  • 1. A driver evaluation system for a vehicle comprising: a controller programmed to generate for display a parameter indicative of a driver's level of attention to driving the vehicle for a sliding window time period based on a complement of a weighted average of a plurality of counts at least some of which represent a number of interaction events between the driver and an infotainment system in the vehicle during the time period.
  • 2. The system of claim 1, wherein the controller is further programmed to link the parameter with geographic coordinates traversed by the vehicle during the time period, and in response to a driver attention request including the geographic coordinates, output the parameter.
  • 3. The system of claim 1, wherein the controller is further programmed to accumulate one of the counts such that, in response to occurrence of an interaction event, the one of the counts being greater than zero, and a count timer exceeding a predefined time limit, the count timer is decreased according to a difference between the count timer and a quotient of the predefined time limit and the one of the counts.
  • 4. The system of claim 1, wherein the controller is further programmed to record a value of the parameter for time periods defining a drive cycle.
  • 5. The system of claim 1, wherein at least one of the counts represents a number of look-away events from a road during the time period.
  • 6. The system of claim 5, wherein the look-away events are based on an eye gaze direction or head pose of the driver.
  • 7. The system of claim 1, wherein the infotainment system is a cell phone, an instrument panel cluster, or a center stack console.
  • 8. A driver evaluation method comprising: by a controller, accumulating a driver inattention event count such that, in response to occurrence of an inattention event and a count timer exceeding a limit, the count timer is decreased according to a difference between the count timer and a quotient of the limit and the count, and displaying a driver attention state value that is based on a complement of a weighted average that includes the count.
  • 9. The method of claim 8 further comprising linking the driver attention state value with geographic coordinates, and in response to a driver attention request including the geographic coordinates, outputting the driver attention state value.
  • 10. The method of claim 8 further comprising recording the driver attention state value for each of a plurality of time periods defining a drive cycle, and upon completion of the drive cycle, displaying an average of at least some of the driver attention state values.
  • 11. The method of claim 8 further comprising wirelessly transmitting the driver attention state value off-board.
  • 12. The method of claim 8, wherein the driver inattention event count represents a number of look-away events from a road for a time period.
  • 13. The method of claim 12, wherein the look-away events are based on an eye gaze direction or head pose of a driver.
  • 14. A vehicle comprising: an interface; and one or more controllers programmed to, in response to a driver attention request, generate for display via the interface a parameter indicative of a driver's level of attention to driving the vehicle for a selected time period based on a complement of a weighted average, having a value between zero and one, of a plurality of counts at least some of which represent a number of interaction events between the driver and an infotainment system in the vehicle during the selected time period, and accumulate one of the counts such that, in response to occurrence of an interaction event, the one of the counts being greater than zero, and a value of a count timer exceeding a predefined time limit, the value is decreased according to a difference between the count timer and a quotient of the predefined time limit and the one of the counts.
  • 15. The vehicle of claim 14, wherein the selected time period defines a drive cycle.
  • 16. The vehicle of claim 14, wherein at least one of the counts represents a number of look-away events from a road during the selected time period.
  • 17. The vehicle of claim 16, wherein the look-away events are based on an eye gaze direction or head pose of the driver.
Related Publications (1)
Number Date Country
20170043781 A1 Feb 2017 US