System and method for naming, filtering, and recall of remotely monitored event data

Information

  • Patent Grant
  • Patent Number
    8,666,590
  • Date Filed
    Friday, June 22, 2007
  • Date Issued
    Tuesday, March 4, 2014
Abstract
System and method for capturing video data, comprising buffering video data captured from a video recording device in a vehicle, detecting a triggering event, saving a portion of the video data occurring within a specified period of time near the event, and naming a saved portion of video data with a label associated with the triggering event.
Description
TECHNICAL FIELD

The present invention relates generally to a system and method for tagging data files and, more particularly, to a system and method for tagging and recalling video data files for a driver monitoring system.


BACKGROUND

Video monitoring of vehicle drivers and passengers is known; however, existing vehicle video monitoring systems do not provide easily usable video files for the personnel who supervise drivers or review their behavior. Current systems merely provide a digital or analog recording of an entire driving shift without any markers, tags, or other indication of where questionable driver behavior may be found in the recording. As a result, a supervisor or person analyzing driver behavior must view the video recording and/or exceptions for an entire shift, week, month, or longer to identify incidents of poor driving behavior, such as failure to use a seatbelt, use of a cell phone while driving, failure to pay attention to the road, aggressive driving, or impact events. This method is inefficient and difficult to use, particularly if the driver's shift spans an entire workday, which may require the supervisor to review eight hours or more of video for each driver.


One known method for processing long video recordings of drivers is to have a third party review the entire recording and break it into segments each time a new violation occurs. For example, the third-party reviewer may watch the video for an entire driving shift and break the video file into separate sub-files each time the reviewer observes the driver in the video commit a violation, such as driving without a seatbelt, using a cell phone, or not paying attention to the road. In known systems, these sub-files are marked with minimal information, such as a date/time stamp, that is not helpful to a supervisor or reviewer who is looking for particular types of violations or who wants to prioritize his review toward more serious violations.


SUMMARY OF THE INVENTION

The present invention is directed generally to a system and method for capturing video data, comprising buffering video data captured from a video recording device in a vehicle, detecting a triggering event, saving a portion of the video data occurring within a specified period of time near the event, and naming a saved portion of video data with a label associated with the triggering event. The video data may be video of a driver of a vehicle, occupants of a vehicle, or a view outside of the vehicle. The triggering event may be detected by a vehicle monitoring system mounted in the vehicle. The vehicle monitoring system may be coupled to an on-board diagnostic system in the vehicle, and the triggering event may be detected from data received from the on-board diagnostic system.


The triggering event may be detected using signals received from an on-board diagnostic system in the vehicle. This may be a speeding violation, an impact detection, a seatbelt warning, or a use of a wireless device, for example. The specified period of time captured in the saved video is configurable based upon a type of triggering event. The saved portion of video data may be a still image or may further include audio data. The saved portions of video data are provided to a database outside of the vehicle, for example, to be reviewed and analyzed.


In one embodiment, a system and method of capturing vehicle video comprises capturing video data associated with triggering events that occur in a vehicle, wherein the video data is a view of occupants of the vehicle, saving the video data as a file with a name corresponding to the associated triggering event, and providing one or more saved video data files to a database outside of the vehicle. The video data files may be reviewed, searched using the video data file name, grouped according to triggering events using the video data file name, prioritized for review using the video data file name, or searched for with a selected triggering event using the video data file name. The video data files may be provided to the database via a wireless connection, a hardwired connection, or via a memory storage device.


In another embodiment, a system for capturing vehicle video comprises one or more video data recorders mounted in the vehicle, wherein the video data recorders provide a stream of video data, one or more buffers for capturing, at least temporarily, the video data streams from the one or more video data recorders, a vehicle monitoring system coupled to the one or more video data recorders and the buffers, and a video data storage device for storing video data files comprising at least a portion of a video data stream. The vehicle monitoring system identifies an occurrence of a preselected event and, in response, causes one or more video data files to be saved to the video storage device. The vehicle monitoring device is coupled to an on-board diagnostic system in the vehicle. The preselected event may be the occurrence of certain parameters in the on-board diagnostic system. The preselected event is a potential speeding violation, a potential collision, a potential seatbelt violation, or a potential use of a wireless device in the vehicle. The video data files are labeled using a term associated with an event that was detected at the time the video data was captured.


A method for saving video data captured in a vehicle comprises saving a video data file comprising video captured from inside a vehicle within a selected period of time of an event, and naming the saved video data file using a label associated with the event.





BRIEF DESCRIPTION OF THE DRAWINGS

For a more complete understanding of the present invention, and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:



FIG. 1 is a block diagram of a system incorporating embodiments of the invention;



FIG. 2 is a diagram of the location of cameras used in embodiments of the invention;



FIG. 3 is a block diagram of a system incorporating embodiments of the invention; and



FIG. 4 is an illustration of video data capture according to embodiments of the invention.





DETAILED DESCRIPTION

The present invention provides many applicable inventive concepts that can be embodied in a wide variety of specific contexts. The specific embodiments discussed are merely illustrative of specific ways to make and use the invention, and do not limit the scope of the invention.


With reference now to FIG. 1, there is shown a vehicle 101 in which a vehicle monitoring device is installed. The monitoring device may be self-contained, such as a single unit mounted on a windshield 105 or dashboard 106. Alternatively, the monitoring device may include multiple components, such as a processor or central unit mounted under a car seat 103 or in a trunk 104. Similarly, the monitoring device may have a self-contained antenna in the unit (105), or may be connected to remotely mounted antennas 107. The vehicle monitoring units may be connected to an on-board diagnostic (OBD) system 102 or bus in the vehicle. Information and data associated with the operation of the vehicle may be collected from the OBD system 102, such as engine operating parameters, vehicle identification, seatbelt use, door position, etc. The OBD system 102 may also be used to power the vehicle monitoring device. In one embodiment, the vehicle monitoring device is of the type described in U.S. patent application Ser. No. 11/805,237, filed on May 22, 2007, entitled “System and Method for Monitoring Vehicle Parameters and Driver Behavior,” the disclosure of which is hereby incorporated by reference herein.


The vehicle monitoring system may include a camera or any other digital video recording device. Referring to FIG. 2, the camera may be mounted on the vehicle's dashboard 201, windshield 202, headliner 203, or any other location that allows for video capture of at least the driver of the vehicle while the vehicle is in operation. The camera may be incorporated into a vehicle monitoring device that is mounted on the vehicle's windshield 105 or dashboard 106. Alternatively, a camera sensor mounted on a dashboard or windshield may be coupled to a remotely mounted vehicle monitoring device 103, 104. The recorded video information may be stored at the camera location (e.g. 105, 106) or in a remote monitoring device (e.g. 103, 104).


The video data may also be transmitted in real-time or at intervals from the vehicle monitoring system to a central monitoring system or server for storage and/or processing. For example, the video may be transmitted to server 109 via communication network 108, which may be a cellular, satellite, WiFi, Bluetooth, infrared, ultrasound, short wave, microwave or any other suitable network. Server 109 may process the video data and/or store the video data to database 110, which may be part of server 109 or a separate device located nearby or at a remote location. Users can access the video data files on server 109 and database 110 using terminal 111, which may be co-located with server 109 and database 110 or coupled via the Internet or other network connection. In some embodiments, the video data captured by the monitoring system in vehicle 101 may be transmitted via a hardwired communication connection, such as an Ethernet connection that is attached to vehicle 101 when the vehicle is within a service yard or at a base station. Alternatively, the video data may be transferred via a flash memory, diskette, or other memory device that can be directly connected to server 109 or terminal 111.


Video data formats are well known and it is understood that the present invention may use and store video data in any compressed or uncompressed file format now known or later developed, including, for example, the Moving Picture Experts Group (MPEG), Windows Media Video (WMV), or any other file format developed by the International Telecommunication Union (ITU), International Organization for Standardization (ISO), International Electrotechnical Commission (IEC) or other standards body, company or individual.


In one embodiment of the invention, the captured video is used to monitor, mentor or otherwise analyze a driver's behavior during certain events. For example, if the vehicle is operated improperly, such as speeding, taking turns too fast, colliding with another vehicle, or driving in an unapproved area, then a supervisor may want to view the driver video recorded during those events to determine what the driver was doing at that time and if the driver's behavior can be improved. Additionally, if the driver's behavior is inappropriate or illegal, such as not wearing a seatbelt or using a cell phone while driving, but does not cause the vehicle to operate improperly, a supervisor may also want to review the video recorded during those events. Accordingly, it would be helpful to a user, such as a supervisor, fleet manager, driving instructor, parent, vehicle owner or other authority (collectively hereinafter a “supervisor”) to have the capability to quickly identify a portion of a driver video record that is associated with such vehicle misuse or improper driver behavior. The supervisor could then analyze the video and provide feedback to the driver to correct the improper or illegal driving behavior.



FIG. 3 is a block diagram of a video capturing system according to one embodiment of the invention. Vehicle monitoring device 301 is mounted anywhere appropriate in the vehicle. Camera or digital video recording device 302 is mounted on the windshield, dashboard, or headliner, for example, so that the driver will be in the field of view. Camera 302 outputs a stream of video data to video data buffer 303. When commanded by vehicle monitoring device 301, video data buffer 303 stores portions or clips of the video data stream to video data storage 304. The video data stream may also, or alternatively, be fed directly to video data storage 304 so that most or all of the video stream is captured. In one embodiment, the video data stream corresponds to video of the driver that is captured during operation of the vehicle.


Video of the passengers and other occupants of the vehicle may also be captured in addition to the driver video data. In other embodiments, more than one camera or video recording device is used in order to capture multiple views simultaneously, such as, for example, a driver view, a passenger view, a view looking forward out of the vehicle, an instrument panel view, and/or a side view. The camera mounting locations are not limited to the windshield, dashboard or headliner, but may be placed anywhere inside or outside of the vehicle and may be oriented to view into or out of the vehicle. Accordingly, multiple video data streams, clips or files may be provided to video data buffer 303 and video data storage 304. Alternatively, separate video data buffers 303 and video data storage devices 304 may be assigned to one or more different video data streams.


Vehicle monitoring device 301 is coupled to camera 302, video data buffer 303, and video storage device 304. These may be separate components, one single component, or various ones of the components may be combined into one device. It will be understood that camera 302 may be any video capture device or equipment. Moreover, video data buffer 303 and video storage device 304 may be any appropriate data buffering and storage devices. Vehicle monitoring device 301 detects predetermined events, such as a collision, a speeding violation, or a disconnected seatbelt, and causes video data buffer 303 to capture video data associated with the triggering event. That event video data is then stored to video data storage device 304. The event video data may be one or more still images or a video clip of any preselected length. Preferably, the event video data files are named so that they may easily be searched, identified and recalled by a supervisor. For example, if a speeding violation was detected, the associated event video data clip might be named or labeled “Speeding,” “Speeding Violation,” or “Speeding x MPH” where “x” is a maximum speed recorded or a speed differential over a posted speed limit.


U.S. patent application Ser. No. 11/805,238, filed May 22, 2007, entitled “System and Method for Monitoring and Updating Speed-By-Street Data,” which application is hereby incorporated by reference herein in its entirety, describes the use of speed-by-street data to identify the specific speed limits on a particular street. The vehicle's owner, fleet manager, or other authority may set speeding thresholds that will trigger the capture of video clips associated with speeding. Static thresholds, such as speeds over 70 MPH, and dynamic thresholds, such as speeds 10 MPH over a posted speed limit, may be set. When vehicle monitoring device 301 determines that the vehicle is currently speeding, such as when a speeding threshold is met, an event trigger will be sent to video data buffer 303 causing a video data file associated with that speeding event to be stored to video data storage device 304 and labeled with an appropriately usable file name.
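The static and dynamic speeding thresholds described above can be sketched as a simple check. This is only an illustrative assumption of how such a trigger might be coded: the function name, threshold values, and label formats are hypothetical and are not taken from the patent's implementation.

```python
# Hypothetical sketch of a speeding-trigger check combining a static
# threshold (absolute speed) with a dynamic threshold (margin over the
# posted limit). All names and values are illustrative assumptions.

STATIC_LIMIT_MPH = 70      # static threshold, e.g. "speeds over 70 MPH"
DYNAMIC_MARGIN_MPH = 10    # dynamic threshold, e.g. "10 MPH over posted"

def speeding_event(vehicle_speed_mph, posted_limit_mph):
    """Return a descriptive trigger label if a threshold is met, else None."""
    if vehicle_speed_mph > STATIC_LIMIT_MPH:
        # label with the maximum speed recorded
        return f"Speeding {vehicle_speed_mph} MPH"
    over = vehicle_speed_mph - posted_limit_mph
    if over > DYNAMIC_MARGIN_MPH:
        # label with the differential over the posted speed limit
        return f"Speeding +{over} MPH"
    return None
```

A monitoring loop would call such a check on each speed sample and, on a non-None result, send the event trigger and label to the video buffer.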


Vehicle monitoring device 301 may send information identifying the triggering event to video data buffer 303 or video data storage device 304 for use in naming the event video files. Either or both of video data buffer 303 and video data storage device 304 may be configured to name the event video files. Alternatively, vehicle monitoring device 301 may determine the appropriate name or label and provide that information to video data buffer 303 or video data storage device 304 to name the stored file. Other information or criteria in addition to the triggering event identifier may be provided to name the file. For example, if a collision or impact is detected, the event video data may simply be named “Collision” or “Possible Impact.” If additional information is available from monitoring device 301, a more detailed label may be generated, such as “Collision—forward quarter,” “Rear Impact,” or “Impact Delta V x” where “x” is a measured or observed Delta V during the collision.
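The event-to-label mapping described above might be sketched as follows. The mapping table, function name, and label strings are illustrative assumptions; an actual system would derive them from the monitoring device's event data.

```python
def event_label(event_type, detail=None):
    """Build a human-readable label for a saved clip.

    event_type and detail are hypothetical inputs standing in for the
    information the monitoring device would supply with a trigger.
    """
    base = {
        "impact": "Possible Impact",
        "speeding": "Speeding",
        "seatbelt": "Seatbelt",
        "cell_phone": "Cell Phone",
    }.get(event_type, event_type.title())
    if detail:
        # append event-specific detail, e.g. a measured Delta V
        return f"{base} - {detail}"
    return base
```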


As disclosed in the above-cited U.S. patent application Ser. No. 11/805,237, one embodiment of the vehicle monitoring device receives inputs from accelerometers and/or a crash data recorder that measures “g” forces on the vehicle. These forces may indicate collisions, turning too fast, jackrabbit starts, hard braking or other extreme driving maneuvers. If the vehicle monitoring system detects such forces or identifies a potential collision or impact, an event trigger will be sent to video data buffer 303 causing a video data file associated with that acceleration or impact event to be stored to video data storage device 304 and labeled with an appropriately usable file name.


The device may collect video continuously into a buffered memory; once a specified event threshold is exceeded, the device collects a configurable amount of past video as well as a configurable amount of future video (post-infraction) and then saves that video to a file, with the infraction that caused the data capture coded into the file name. In the alternative, the device could remain off and be triggered quickly once an infraction or activity of interest is detected. However, such an arrangement would prevent the capture of video of past events.



FIG. 4 illustrates the processing and storing of video data according to exemplary embodiments of the invention. Video data stream 401 represents the video data captured by camera 302 and provided to video data buffer 303. Video data stream 401 may be in any appropriate format. Video data stream 401 begins at start time 402, which may correspond to the movement of the vehicle's key to an “on” or “ignition” position, the start of the vehicle's engine, the start of a selected route, entry into a designated area, a predetermined time, or any other time event. Video data stream 401 flows in the direction “t” illustrated until end 403, which may correspond to the movement of the vehicle's key to an “off” position, the shutdown of the vehicle's engine, the end of a selected route, exit from a designated area, a predetermined time, or any other time or event.


Buffer window 404 represents an amount of video data that is stored in video data buffer 303. Accordingly, a portion of the video data stream 401 from the current time (t0) to some time in the past (t1) is captured in the video data buffer 303. The period from t0 to t1 is the size of the buffer window, such as 15 seconds, 30 seconds, 2 minutes, etc. The buffer window size may be adjustable depending upon the detected event and the supervisor's settings. For example, the supervisor may not want to view any video clips longer than 30 seconds, so the video buffer is set to a 30 second size. Whenever a triggering event is detected, such as speeding or a collision, the data in the video buffer is captured and stored to video data storage device 304. This allows the supervisor to later observe some period of time (e.g. 30 seconds) leading up to the event. The video buffer and video storage device may be further configured to allow additional video to be captured following the triggering event so that the supervisor may observe some period of time before and after the event. For example, if the buffer size was 30 seconds and the system was configured to capture 10 seconds of video following the triggering event before storing the video clip, then the supervisor could later view the 20 seconds leading up to the event and 10 seconds after the event.
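The buffer-window behavior described above resembles a rolling (ring) buffer that snapshots the pre-event window when a trigger fires and then appends a configured number of post-event frames before saving the clip. The sketch below is a minimal illustration under the simplifying assumption of one frame per second and at least one second of post-event capture; the class and method names are hypothetical.

```python
from collections import deque

class FrameBuffer:
    """Rolling-buffer sketch: keep the last `pre_seconds` of frames and,
    on a trigger, collect `post_seconds` further frames before saving
    a labeled clip. Assumes one frame per second and post_seconds >= 1."""

    def __init__(self, pre_seconds, post_seconds):
        self.pre = deque(maxlen=pre_seconds)  # oldest frames fall off
        self.post_seconds = post_seconds
        self.pending = None                   # (label, frames left, clip)
        self.saved = []                       # finished (label, frames) clips

    def add_frame(self, frame):
        if self.pending:
            label, remaining, clip = self.pending
            clip.append(frame)                # post-event frame
            if remaining - 1 == 0:
                self.saved.append((label, clip))
                self.pending = None
            else:
                self.pending = (label, remaining - 1, clip)
        self.pre.append(frame)

    def trigger(self, label):
        # snapshot the pre-event window; post-event frames arrive
        # via subsequent add_frame() calls
        self.pending = (label, self.post_seconds, list(self.pre))
```

With a 3-second window and 2 seconds of post-event capture, a trigger after frame 5 yields a clip of frames 3 through 7.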


It will be understood that the size of buffer window 404 and the amount of video data captured to individual data files is configurable and may be of any size supported by the available equipment. In another embodiment, the type of triggering event may determine how much time the video clip should capture. Vehicle monitoring device 301 may receive inputs from the vehicle OBD, such as a seatbelt warning, and inputs from other sensors, such as a cell phone use detector. If the driver or a passenger does not use his or her seatbelt, vehicle monitoring device 301 will detect the seatbelt warning on the OBD bus. If the cell phone use detector observes a wireless device being used in or near the vehicle, an input is sent to the vehicle monitoring device 301. In either case, vehicle monitoring device 301 sends an event trigger to video data buffer 303 to capture the driver video. A supervisor may not want to watch 30 seconds or more of the driver talking on a cell phone or not wearing a seatbelt; instead, the supervisor simply needs to visually confirm that the violation occurred. Accordingly, the system may be configured to capture a shorter video clip, such as 10 seconds, or a still image when a seatbelt, cell phone use, or similar event is detected. On the other hand, for speeding violations, collisions, and aggressive driving triggers, the system may be configured to capture longer video clips.
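The idea that each event type determines its own capture length might be expressed as a small configuration table: confirmation-only events get a still image or short clip, while driving events get longer footage. The event names, durations, and default value below are illustrative assumptions only.

```python
# Hypothetical per-event capture settings. "still" events need only
# visual confirmation; "clip" events warrant real footage.
CAPTURE_CONFIG = {
    "seatbelt":   {"mode": "still"},
    "cell_phone": {"mode": "clip", "seconds": 10},
    "speeding":   {"mode": "clip", "seconds": 30},
    "collision":  {"mode": "clip", "seconds": 60},
}

def capture_plan(event_type):
    """Look up how much video to save for an event type.

    Unknown event types fall back to a short default clip (assumed).
    """
    return CAPTURE_CONFIG.get(event_type, {"mode": "clip", "seconds": 15})
```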


As illustrated in FIG. 4, the captured video clips 405 are stored to video data storage device 304. Each video clip, which may be of any length or may be a still image, is named so that the files may be easily searched and recalled, as noted above. For example, seatbelt and cellular phone use may simply be named “seatbelt” or “cell phone,” while other events, such as speeding and collisions may be assigned more detailed names. Additional information, such as a date/time stamp, driver name, vehicle identifier, fleet identifier, or the like may also be added to the file name or as additional data added to the file itself. The additional information may be visible or not visible when the video clip is played or observed.


Video data storage 304 may be located in the vehicle and, at the end of the shift, trip or route (403), video clips 405 may be transferred to server 109 or database 110 (FIG. 1), such as by wireless communication over network 108 or by hardwire or Ethernet connection. Vehicle monitoring device 301 may also have a USB port, memory card slot, diskette recording device or other equipment for transferring video clips to a flash drive, memory card, diskette, compact disk, or other memory device. Video clips 405 may then be loaded to server 109 or database 110 directly or remotely, for example, via terminal 111.


Once the video clips are loaded to server 109 and/or database 110, a supervisor may review all of the video files for a particular shift, trip, or route. The files for a particular driver, group of drivers, day, group of days, vehicle, fleet, or all the video files may also be viewed. The supervisor may search, sort and prioritize the video clips using the file names. For example, if the supervisor wanted to see all video clips associated with speeding, the word “speed” could be used as a search term, using any standard file search tool, to find all of the speeding video clips. Similarly, reports on the video clips could be generated using the file names, such as whether there were incidents of speeding, collisions, seatbelt misuse, or the like during a particular shift. The file naming convention described herein allows the supervisor to immediately identify the relevance of each video file and to recall only those files of interest.
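The file-name search and report generation described above can be emulated with ordinary string operations. The sketch below assumes the VHCLx_EVENT_DATE_TIME naming scheme illustrated in Table 1; the function names are hypothetical.

```python
def find_clips(filenames, term):
    """Case-insensitive substring search over clip file names,
    emulating a standard file-search tool."""
    return [name for name in filenames if term.lower() in name.lower()]

def violation_report(filenames):
    """Count clips per violation type, read from the second
    underscore-delimited field of names like
    VHCL1_SPEED_052106_15.21.56 (naming scheme assumed)."""
    counts = {}
    for name in filenames:
        parts = name.split("_")
        if len(parts) >= 2:
            counts[parts[1]] = counts.get(parts[1], 0) + 1
    return counts
```

Searching for "speed" returns only the speeding clips; the report summarizes how many incidents of each type occurred in the listing.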


Any event or time can be selected as a trigger for capturing video data. It is expected that different users may configure a vehicle monitoring system to capture specific events based upon their use of the vehicle. For example, a parent may configure the device to capture video of potential speeding, impact, seatbelt, and cell phone use violations or events. On the other hand, a fleet operator may record video of those events in addition to capturing video of other events, such as video from particular routes, stops, deliveries, or pick-ups. The monitoring system may be configured to use any OBD data or other sensor data to trigger video capture. For example, if the OBD senses an open vehicle door or if a specific sensor is installed to detect when a vehicle door opens, that event can be used to trigger video capture of the driver and vehicle occupants, which may be useful for example in the taxi, livery, bus, or other commercial transportation industries. Similarly, the start of a taxi meter may trigger video capture of the vehicle occupants.


Additionally, the opening and/or closing of a driver's door and/or passenger door may also constitute a triggering event. Also, the sitting position and/or feedback from seat sensors regarding weight, posture, placement, and/or positioning may constitute a triggerable event. For example, detecting a condition indicating that a child is riding in the front seat, such as the passenger's positioning, posture, weight, and/or the like, may trigger video capture of the passenger seat occupant. It will be understood that any exception condition or parameters may be selected to trigger video recording and that the captured video files may be named using a descriptive or meaningful label or tag associated with the triggering event.


The present invention may also be used to capture audio data in addition to or instead of video data. The audio data may be captured using microphones mounted with the video recording device or using separate microphones. The audio data may be saved in the same data file as the corresponding video data, or may be saved in separate audio data files. The audio data files may be named in the same descriptive manner as described herein for video data files.


Table 1 illustrates a list of file names for saved video clips according to one embodiment of the invention. The saved video clips are labeled so that the video clip can be correlated to specific violations. The file names illustrate that, for this example trip on May 21, 2006, the driver failed to use a seatbelt at the beginning of the drive at 3:01 PM. The time and date stamp may be as specific as desired by the user, such as including the year and seconds as shown. Alternatively, the file name may just include the violation type without any further details, or may include a sequential identification of the violations, such as “Speeding 1,” “Speeding 2,” “Speeding 3” etc. If the seatbelt remains unattached, the system may be configured to record and label an appropriate video clip every 15 minutes or some other longer or shorter repetition interval to prevent a constant stream of seatbelt violations from being recorded.
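The repetition interval described above amounts to throttling repeated captures of the same ongoing violation. A minimal sketch, assuming timestamps in seconds and a hypothetical class name:

```python
REPEAT_INTERVAL_S = 15 * 60   # re-record an ongoing violation every 15 min

class ViolationThrottle:
    """Suppress repeated captures of the same ongoing violation
    (e.g. an unfastened seatbelt) until the interval has elapsed."""

    def __init__(self, interval_s=REPEAT_INTERVAL_S):
        self.interval_s = interval_s
        self.last_capture = {}   # event type -> timestamp of last clip

    def should_capture(self, event_type, now_s):
        last = self.last_capture.get(event_type)
        if last is None or now_s - last >= self.interval_s:
            self.last_capture[event_type] = now_s
            return True
        return False
```

The first seatbelt warning is captured immediately; further warnings within the interval are suppressed, preventing a constant stream of identical clips.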



TABLE 1

VHCL1_SEATBELT_052106_15.01.04
VHCL1_SPEED_052106_15.21.56
VHCL2_CELL PHONE_052106_15.36.06
VHCL2_SPEED_052106_15.36.40
VHCL1_BRAKE_05212006_15.25.16
VHCL2_SPEED_052106_16.35.21
VHCL3_HARD ACCEL_052106_17.15.56
VHCL1_SPEED_052106_16.52.06
VHCL2_BRAKE_05212006_17.25.16
VHCL2_IMPACT_052106_17.25.18

The vehicles or drivers in the example shown in Table 1 are identified in the file name using the VHCLx field. This identifier could be a vehicle's fleet number, license number, or VIN; the cell, satellite, or modem number of the vehicle monitoring unit; or a driver identifier. The video files may be searched and sorted by the vehicle/driver identifier field, which allows files from multiple vehicles to be processed or reviewed at the same time. In Table 1, video file names for data from three vehicles are illustrated. These vehicles had potential speeding, acceleration, seatbelt, and cell phone violations.


Additional detail may be included in the file name, such as a speeding amount (e.g., “Speeding 10” or “Speeding 15”) to show the extent of the speeding violation. The clips of Table 1 show that one driver had a hard brake (i.e. deceleration) and an impact or collision at 17:25. If the system assigned file names as shown in Table 1, then the user could jump straight to content of interest, such as to view the video of an impact. Alternatively, the file listing could be sorted, searched, or otherwise organized using commonly available file search tools.
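Sorting a listing like Table 1 by vehicle and then by time can be done by splitting on the underscore-delimited fields. The field layout is assumed from the example names in Table 1, and the function name is hypothetical.

```python
def sort_clips(filenames):
    """Sort clip names by vehicle identifier field, then by time stamp,
    per the assumed VHCLx_EVENT_DATE_TIME naming scheme. Note the
    HH.MM.SS stamps sort correctly as plain strings."""
    def key(name):
        parts = name.split("_")
        return (parts[0], parts[-1])   # (vehicle id, time stamp)
    return sorted(filenames, key=key)
```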


Although the present invention and its advantages have been described in detail, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims. Moreover, the scope of the present application is not intended to be limited to the particular embodiments of the process, machine, manufacture, composition of matter, means, methods and steps described in the specification. As one of ordinary skill in the art will readily appreciate from the disclosure of the present invention, processes, machines, manufacture, compositions of matter, means, methods, or steps, presently existing or later to be developed, that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized according to the present invention. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or steps.

Claims
  • 1. A computer-implemented method for monitoring, recording and storing various types of event data detected during operation of a vehicle in a manner that facilitates retrieval of stored event data by selecting for a particular vehicle retrieval of one or more types of events related to at least one of i) operation of the vehicle, ii) use of vehicle equipment, iii) operator behaviors while in the vehicle, or iv) passenger behaviors while in the vehicle, the computer-implemented method comprising: detecting at one or more sensors installed in a vehicle one or more parameters from which data is derived that defines various types of events related to at least one of i) operation of the vehicle, ii) use of vehicle equipment, iii) operator behaviors while in the vehicle, or iv) passenger behaviors while in the vehicle;for each type of event, configuring a time period over which a triggered event is to be recorded;receiving at a processing system inputs from said one or more sensors, the inputs from said one or more sensors being analyzed by one or more processors of the processing system to determine triggering events that determine when an event type is to be recorded and stored;recording at one or more digital video monitors installed in the vehicle one or more behaviors for at least one of the vehicle operator, the vehicle passenger, a view inside the vehicle, or a view outside the vehicle;for each instance of a triggering event, storing in a data storage device a portion of digital video recorded over the configured time period for the triggered event, and storing the recorded portion of digital video in a digital format that is identified by the type of event that is triggered; andfor a given vehicle, and for a selected type of event, retrieving from data storage for separate presentation all stored instances of the recorded digital video for a selected type of event over a given time period.
  • 2. The method of claim 1, wherein the configured time period includes at least one of a time prior to the event and a time after the event.
  • 3. The method of claim 1, wherein the configured time period includes at least both a time prior to the event and a time after the event.
  • 4. The method of claim 1, wherein the video data is video of a driver of the vehicle.
  • 5. The method of claim 1, wherein the video data is video of one or more passengers of the vehicle.
  • 6. The method of claim 1, wherein the video data is video of a view outside of the vehicle.
  • 7. The method of claim 1, wherein the processing system is included in a vehicle monitoring system mounted in the vehicle.
  • 8. The method of claim 7, wherein the one or more sensors are part of an on-board diagnostic system in the vehicle, and wherein the on-board diagnostic system is coupled to the vehicle monitoring system.
  • 9. The method of claim 7, wherein the data storage device is included in the vehicle monitoring system.
  • 10. The method of claim 9, wherein a database is stored outside of the vehicle, and wherein the method further comprises transmitting from the vehicle monitoring system to the database the portion of digital video recorded for each triggering event.
  • 11. The method of claim 10, wherein the transmission is wireless.
  • 12. The method of claim 11, wherein the transmission is automatically initiated by the vehicle monitoring system when the vehicle is returned to a fleet vehicle base.
  • 13. The method of claim 10, wherein the stored portion of digital video is temporarily stored in a buffer and then automatically wirelessly transmitted from the buffer to the database.
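Claims 10 through 13 describe holding recorded clips in an on-vehicle buffer and automatically transmitting them to an off-vehicle database, for example when the vehicle returns to the fleet base. A minimal sketch of that drain-on-arrival behavior follows; the wireless link is simulated by a list append, and all names are assumptions.

```python
def flush_clips(local_buffer, database, at_base):
    """Drain the vehicle-side clip buffer into the off-vehicle database
    when the vehicle is at its fleet base; otherwise keep buffering.
    Returns the number of clips transmitted (transfer is simulated)."""
    if not at_base:
        return 0
    sent = 0
    while local_buffer:
        database.append(local_buffer.pop(0))  # stands in for the wireless transfer
        sent += 1
    return sent
```

In a real system the `at_base` condition would come from the vehicle monitoring system's position fix, and the append would be a wireless upload; both are simplified here.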
  • 14. The method of claim 1, wherein at least one of the triggering events is a speeding violation.
  • 15. The method of claim 1, wherein at least one of the triggering events is an impact detection.
  • 16. The method of claim 1, wherein at least one of the triggering events is a seatbelt warning.
  • 17. The method of claim 1, wherein at least one of the triggering events is a detection of a use of a wireless device.
  • 18. The method of claim 1, wherein at least one of the triggering events is detected by a seat sensor.
  • 19. The method of claim 18, wherein an input that is used to detect the triggering event from the seat sensor is selected from the group consisting of: a weight; a passenger size; an occupant's position; an occupant's posture; and a placement of an object on a seat.
  • 20. The method of claim 1, wherein the stored portion of video data is a still image.
  • 21. The method of claim 1, wherein the stored portion of video data comprises audio data.
  • 22. The method of claim 1, further comprising retrieving from data storage all stored instances of the recorded digital video for each type of event over a given time period, and prioritizing presentation of the retrieved instances based on the types of events.
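The retrieval step of claims 1 and 22 — pull stored clips for a vehicle over a time window, optionally filtered to one event type, and present them prioritized by event type — could look like the sketch below. The severity ranking is an invented example; the patent leaves the prioritization ordering unspecified.

```python
# Assumed severity ranking (lower = more serious); not specified by the patent.
SEVERITY = {"impact": 0, "speeding": 1, "seatbelt": 2, "cell_phone": 3}

def retrieve(clips, event_type=None, since=None):
    """Filter stored clips by optional event-type label and start time,
    then order the results by the assumed severity of their labels."""
    selected = [c for c in clips
                if (event_type is None or c["type"] == event_type)
                and (since is None or c["time"] >= since)]
    return sorted(selected, key=lambda c: SEVERITY.get(c["type"], len(SEVERITY)))
```

A supervisor reviewing a shift would then see impact events before seatbelt warnings, rather than scanning an unmarked eight-hour recording.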
  • 23. A system for monitoring, recording and storing various types of event data detected during operation of a vehicle in a manner that facilitates retrieval of stored event data by selecting for a particular vehicle retrieval of one or more selected types of events related to at least one of i) operation of the vehicle, ii) use of vehicle equipment, iii) operator behaviors while in the vehicle, or iv) passenger behaviors while in the vehicle, the system comprising:
    one or more sensors installed in a vehicle for detecting one or more parameters from which data is derived that defines various types of events related to at least one of i) operation of the vehicle, ii) use of vehicle equipment, iii) operator behaviors while in the vehicle, or iv) passenger behaviors while in the vehicle;
    a processing system for configuring a time period over which a triggered event is to be recorded, and for receiving one or more inputs from said one or more sensors, the processing system comprising one or more processors digitally processing said inputs from said one or more sensors to determine triggering events that determine when an event type is to be recorded and stored;
    one or more monitors installed in the vehicle for recording one or more behaviors for at least one of the vehicle operator, the vehicle passenger, a view inside the vehicle, or a view outside the vehicle;
    a data storage device for storing a portion of digital video recorded over the configured time period for each instance of a triggering event, and for storing the recorded portion of digital video in a digital format that is identified by the type of event that is triggered; and
    an output device for retrieval from data storage and for separate presentation of all stored instances of the recorded digital video for a selected type of event over a given time period.
  • 24. The system of claim 23, wherein the configured time period includes at least one of a time prior to the event and a time after the event.
  • 25. The system of claim 23, wherein the configured time period includes at least both a time prior to the event and a time after the event.
  • 26. The system of claim 23, wherein the video data is video of a driver of the vehicle.
  • 27. The system of claim 23, wherein the video data is video of one or more passengers of the vehicle.
  • 28. The system of claim 23, wherein the video data is video of a view outside of the vehicle.
  • 29. The system of claim 23, wherein the processing system is included in a vehicle monitoring system mounted in the vehicle.
  • 30. The system of claim 29, wherein the one or more sensors are part of an on-board diagnostic system in the vehicle, and wherein the on-board diagnostic system is coupled to the vehicle monitoring system.
  • 31. The system of claim 29, wherein the data storage device is included in the vehicle monitoring system.
  • 32. The system of claim 31, wherein a database is stored outside of the vehicle, and wherein the vehicle monitoring system transmits to the database the portion of digital video recorded for each triggering event.
  • 33. The system of claim 32, wherein the transmission is wireless.
  • 34. The system of claim 32, wherein the transmission is automatically initiated by the vehicle monitoring system when the vehicle is returned to a fleet vehicle base.
  • 35. The system of claim 32, wherein the stored portion of digital video is temporarily stored in a buffer and then automatically wirelessly transmitted from the buffer to the database.
  • 36. The system of claim 23, wherein at least one of the triggering events is a speeding violation.
  • 37. The system of claim 23, wherein at least one of the triggering events is an impact detection.
  • 38. The system of claim 23, wherein at least one of the triggering events is a seatbelt warning.
  • 39. The system of claim 23, wherein at least one of the triggering events is a detection of a use of a wireless device.
  • 40. The system of claim 23, wherein at least one of the triggering events is detected by a seat sensor.
  • 41. The system of claim 40, wherein an input that is used to detect the triggering event from the seat sensor is selected from the group consisting of: a weight; a passenger size; an occupant's position; an occupant's posture; and a placement of an object on a seat.
  • 42. The system of claim 23, wherein the stored portion of video data is a still image.
  • 43. The system of claim 23, wherein the stored portion of video data comprises audio data.
  • 44. The system of claim 23, wherein all stored instances of the recorded digital video for each type of event are retrieved for a given time period, and presentation of the retrieved instances is prioritized based on the types of events.
  • 45. One or more digital storage devices containing computer-executable instructions for causing one or more processors of a computing system to implement a method for storing various types of event data detected during operation of a vehicle in a manner that facilitates retrieval of stored event data by selecting for a particular vehicle retrieval of one or more selected types of events related to at least one of i) operation of the vehicle, ii) use of vehicle equipment, iii) operator behaviors while in the vehicle, or iv) passenger behaviors while in the vehicle, the computer-implemented method comprising:
    receiving at a processing system inputs from any of a plurality of sensors installed in a vehicle, the sensors detecting parameters from which data is derived that defines various types of events related to at least one of i) operation of the vehicle, ii) use of vehicle equipment, iii) operator behaviors while in the vehicle, or iv) passenger behaviors while in the vehicle;
    for each type of event, configuring at a processing system a time period over which a triggered event is to be recorded;
    receiving at the processing system inputs from said one or more sensors, the inputs from said one or more sensors being analyzed by one or more processors of the processing system to determine triggering events that determine when an event type is to be recorded and stored;
    recording at one or more digital video monitors installed in the vehicle one or more behaviors for at least one of the vehicle operator, the vehicle passenger, a view inside the vehicle, or a view outside the vehicle;
    for each instance of a triggering event, storing in a data storage device a portion of digital video recorded over the configured time period for the triggered event, and storing the recorded portion of digital video in a digital format that is identified by the type of event that is triggered; and
    for a given vehicle, and for a selected type of event, retrieving from data storage for separate presentation of all stored instances of the recorded digital video for a selected type of event over a given time period.
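Each independent claim requires that the stored clip be held "in a digital format that is identified by the type of event that is triggered." One straightforward realization, consistent with the naming described in the Abstract, is to embed the vehicle, event-type label, and timestamp in the stored file's identifier so a reviewer can filter by label alone. The layout below is purely illustrative, not a format taken from the patent.

```python
from datetime import datetime

def clip_name(vehicle_id, event_type, when):
    """Build an event-labelled clip identifier; the vehicle/label/timestamp
    layout and the .mp4 container are illustrative assumptions."""
    return f"{vehicle_id}_{event_type}_{when.strftime('%Y%m%d-%H%M%S')}.mp4"
```

For example, `clip_name("TRK42", "seatbelt", datetime(2007, 6, 22, 9, 30))` yields `TRK42_seatbelt_20070622-093000.mp4`, which a supervisor can match against the seatbelt label without opening the file.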
US Referenced Citations (479)
Number Name Date Kind
1767325 Taylor Jun 1930 A
3975708 Lusk Aug 1976 A
4369427 Drebinger et al. Jan 1983 A
4395624 Wartski Jul 1983 A
4419654 Funk Dec 1983 A
4458535 Juergens Jul 1984 A
4785280 Fubini Nov 1988 A
4926417 Futami May 1990 A
4939652 Steiner Jul 1990 A
5032821 Domanico Jul 1991 A
5119504 Durboraw, III Jun 1992 A
5223844 Mansell et al. Jun 1993 A
5225842 Brown et al. Jul 1993 A
5303163 Ebaugh et al. Apr 1994 A
5305214 Komatsu Apr 1994 A
5309139 Austin May 1994 A
5311197 Sorden et al. May 1994 A
5325082 Rodriguez Jun 1994 A
5347260 Ginzel Sep 1994 A
5359528 Haendel Oct 1994 A
5365114 Tsurushima Nov 1994 A
5365451 Wang et al. Nov 1994 A
5394136 Lammers Feb 1995 A
5400018 Scholl Mar 1995 A
5414432 Penny, Jr. et al. May 1995 A
5422624 Smith Jun 1995 A
5424584 Matsuda Jun 1995 A
5430432 Camhi Jul 1995 A
5436612 Aduddell Jul 1995 A
5436837 Gerstung Jul 1995 A
5446659 Yamawaki Aug 1995 A
5453939 Hoffman Sep 1995 A
5457439 Kuhn Oct 1995 A
5475597 Buck Dec 1995 A
5485161 Vaughn Jan 1996 A
5499182 Ousborne Mar 1996 A
5521579 Bernhard May 1996 A
5521580 Kaneko May 1996 A
5525960 McCall Jun 1996 A
5548273 Nicol Aug 1996 A
5581464 Woll Dec 1996 A
5586130 Doyle Dec 1996 A
5600558 Mearek Feb 1997 A
5612875 Haendel Mar 1997 A
5625337 Medawar Apr 1997 A
5638077 Martin Jun 1997 A
5642284 Parupalli Jun 1997 A
5648755 Yagihashi Jul 1997 A
5659289 Zonkoski Aug 1997 A
5689067 Klein Nov 1997 A
5708417 Tallman Jan 1998 A
5717374 Smith Feb 1998 A
5719771 Buck Feb 1998 A
5723768 Ammon Mar 1998 A
5740548 Hudgens Apr 1998 A
5742915 Stafford Apr 1998 A
5751245 Janky et al. May 1998 A
5764139 Nojima Jun 1998 A
5767767 Lima Jun 1998 A
5777580 Janky et al. Jul 1998 A
5795997 Gittins Aug 1998 A
5797134 McMillan et al. Aug 1998 A
5801618 Jenkins Sep 1998 A
5801948 Wood Sep 1998 A
5815071 Doyle Sep 1998 A
5825283 Camhi Oct 1998 A
5825284 Dunwoody Oct 1998 A
5844475 Horie Dec 1998 A
5847271 Poublon Dec 1998 A
5862500 Goodwin Jan 1999 A
5867093 Dodd Feb 1999 A
5877678 Donoho Mar 1999 A
5880674 Ufkes Mar 1999 A
5880958 Helms et al. Mar 1999 A
5883594 Lau Mar 1999 A
5892434 Carlson Apr 1999 A
5907277 Tokunaga May 1999 A
5914654 Smith Jun 1999 A
5918180 Dimino Jun 1999 A
5926087 Busch Jul 1999 A
5928291 Jenkins et al. Jul 1999 A
5941915 Federle et al. Aug 1999 A
5945919 Trask Aug 1999 A
5949330 Hoffman Sep 1999 A
5949331 Schofield Sep 1999 A
5954781 Slepian Sep 1999 A
5955942 Slifkin Sep 1999 A
5957986 Coverdill Sep 1999 A
5964816 Kincaid Oct 1999 A
5969600 Tanguay Oct 1999 A
5974356 Doyle et al. Oct 1999 A
5978737 Pawlowski Nov 1999 A
5982278 Cuvelier Nov 1999 A
5987976 Sarangapani Nov 1999 A
5999125 Kurby Dec 1999 A
6002327 Boesch Dec 1999 A
6008724 Thompson Dec 1999 A
6018293 Smith Jan 2000 A
6026292 Coppinger et al. Feb 2000 A
6028508 Mason Feb 2000 A
6028510 Tamam Feb 2000 A
6037861 Ying Mar 2000 A
6037862 Ying Mar 2000 A
6038496 Dobler Mar 2000 A
6044315 Honeck Mar 2000 A
6059066 Lary May 2000 A
6064928 Wilson May 2000 A
6064970 McMillan et al. May 2000 A
6067008 Smith May 2000 A
6067009 Hozuka May 2000 A
6072388 Kyrtsos Jun 2000 A
6073007 Doyle Jun 2000 A
6075458 Ladner et al. Jun 2000 A
6078853 Ebner Jun 2000 A
6081188 Kutlucinar Jun 2000 A
6084870 Wooten et al. Jul 2000 A
6094149 Wilson Jul 2000 A
6098048 Dashefsky Aug 2000 A
6100792 Ogino Aug 2000 A
6104282 Fragoso Aug 2000 A
6108591 Segal et al. Aug 2000 A
6121922 Mohan Sep 2000 A
6124810 Segal et al. Sep 2000 A
6130608 McKeown Oct 2000 A
6131067 Girerd et al. Oct 2000 A
6133827 Alvey Oct 2000 A
6141610 Rothert Oct 2000 A
6147598 Murphy Nov 2000 A
6172602 Hasfjord Jan 2001 B1
6178374 Möhlenkamp et al. Jan 2001 B1
6184784 Shibuya Feb 2001 B1
6185501 Smith Feb 2001 B1
6198995 Settles Mar 2001 B1
6204756 Senyk Mar 2001 B1
6204757 Evans Mar 2001 B1
6208240 Ledesma Mar 2001 B1
6212455 Weaver Apr 2001 B1
6216066 Goebel Apr 2001 B1
6222458 Harris Apr 2001 B1
6225898 Kamiya May 2001 B1
6227862 Harkness May 2001 B1
6229438 Kutlucinar May 2001 B1
6232873 Dilz May 2001 B1
6246933 Bague Jun 2001 B1
6247360 Anderson Jun 2001 B1
6249219 Perez Jun 2001 B1
6253129 Jenkins et al. Jun 2001 B1
6255892 Gartner Jul 2001 B1
6255939 Roth Jul 2001 B1
6262658 O'Connor Jul 2001 B1
6265989 Taylor Jul 2001 B1
6266588 McClellan Jul 2001 B1
6278361 Magiawala Aug 2001 B1
6285931 Hattori Sep 2001 B1
6289332 Menig Sep 2001 B2
6294988 Shomura Sep 2001 B1
6294989 Schofield Sep 2001 B1
6295492 Lang Sep 2001 B1
6297768 Allen, Jr. Oct 2001 B1
6301533 Markow Oct 2001 B1
6306063 Horgan et al. Oct 2001 B1
6308120 Good Oct 2001 B1
6308134 Croyle et al. Oct 2001 B1
6313742 Larson Nov 2001 B1
6320497 Fukumoto Nov 2001 B1
6331825 Ladner et al. Dec 2001 B1
6333686 Waltzer Dec 2001 B1
6337653 Bchler Jan 2002 B1
6339739 Folke Jan 2002 B1
6339745 Novik Jan 2002 B1
6344805 Yasui Feb 2002 B1
6351211 Bussard Feb 2002 B1
6356188 Meyers Mar 2002 B1
6356822 Diaz Mar 2002 B1
6356833 Jeon Mar 2002 B2
6356836 Adolph Mar 2002 B1
6359554 Skibinski Mar 2002 B1
6362730 Razavi Mar 2002 B2
6362734 McQuade Mar 2002 B1
6366199 Osbom Apr 2002 B1
6378959 Lesesky Apr 2002 B2
6389340 Rayner May 2002 B1
6393348 Ziegler May 2002 B1
6404329 Hsu Jun 2002 B1
6405112 Rayner Jun 2002 B1
6405128 Bechtolsheim et al. Jun 2002 B1
6415226 Kozak Jul 2002 B1
6424268 Isonaga Jul 2002 B1
6427687 Kirk Aug 2002 B1
6430488 Goldman Aug 2002 B1
6433681 Foo Aug 2002 B1
6441732 Laitsaari Aug 2002 B1
6449540 Rayner Sep 2002 B1
6459367 Green Oct 2002 B1
6459369 Wang Oct 2002 B1
6459961 Obradovich Oct 2002 B1
6459969 Bates Oct 2002 B1
6462675 Humphrey Oct 2002 B1
6472979 Schofield Oct 2002 B2
6476763 Allen, Jr. Nov 2002 B2
6480106 Crombez Nov 2002 B1
6484035 Allen, Jr. Nov 2002 B2
6484091 Shibata Nov 2002 B2
6493650 Rodgers Dec 2002 B1
6512969 Wang Jan 2003 B1
6515596 Awada Feb 2003 B2
6519512 Haas Feb 2003 B1
6525672 Chainer Feb 2003 B2
6526341 Bird et al. Feb 2003 B1
6529159 Fan et al. Mar 2003 B1
6535116 Zhou Mar 2003 B1
6542074 Tharman Apr 2003 B1
6542794 Obradovich Apr 2003 B2
6549834 McClellan Apr 2003 B2
6552682 Fan Apr 2003 B1
6556905 Mittelsteadt Apr 2003 B1
6559769 Anthony May 2003 B2
6564126 Lin May 2003 B1
6567000 Slifkin May 2003 B2
6571168 Murphy May 2003 B1
6587759 Obradovich Jul 2003 B2
6594579 Lowrey Jul 2003 B1
6599243 Woltermann Jul 2003 B2
6600985 Weaver Jul 2003 B2
6604033 Banet Aug 2003 B1
6609063 Bender et al. Aug 2003 B1
6609064 Dean Aug 2003 B1
6611740 Lowrey Aug 2003 B2
6611755 Coffee Aug 2003 B1
6622085 Amita et al. Sep 2003 B1
6629029 Giles Sep 2003 B1
6630884 Shanmugham Oct 2003 B1
6631322 Arthur et al. Oct 2003 B1
6636790 Lightner Oct 2003 B1
6639512 Lee et al. Oct 2003 B1
6643578 Levine Nov 2003 B2
6651001 Apsell Nov 2003 B2
6654682 Kane et al. Nov 2003 B2
6657540 Knapp Dec 2003 B2
6662013 Takiguchi et al. Dec 2003 B2
6662141 Kaub Dec 2003 B2
6664922 Fan Dec 2003 B1
6665613 Duvall Dec 2003 B2
6674362 Yoshioka Jan 2004 B2
6675085 Straub Jan 2004 B2
6677854 Dix Jan 2004 B2
6678612 Khawam Jan 2004 B1
6696932 Skibinski Feb 2004 B2
6703925 Steffel Mar 2004 B2
6710738 Allen, Jr. Mar 2004 B2
6714894 Tobey et al. Mar 2004 B1
6718235 Borugian Apr 2004 B1
6718239 Rayner Apr 2004 B2
6727809 Smith Apr 2004 B1
6728605 Lash Apr 2004 B2
6732031 Lightner May 2004 B1
6732032 Banet May 2004 B1
6737962 Mayor May 2004 B2
6741169 Magiawala May 2004 B2
6741170 Alrabady May 2004 B2
6745153 White Jun 2004 B2
6748322 Fernandez Jun 2004 B1
6750761 Newman Jun 2004 B1
6750762 Porter Jun 2004 B1
6756916 Yanai Jun 2004 B2
6759952 Dunbridge Jul 2004 B2
6766244 Obata et al. Jul 2004 B2
6768448 Farmer Jul 2004 B2
6775602 Gordon Aug 2004 B2
6778068 Wolfe Aug 2004 B2
6778885 Agashe et al. Aug 2004 B2
6784793 Gagnon Aug 2004 B2
6784832 Knockeart et al. Aug 2004 B2
6788196 Ueda Sep 2004 B2
6788207 Wilkerson Sep 2004 B2
6792339 Basson Sep 2004 B2
6795017 Puranik et al. Sep 2004 B1
6798354 Schuessler Sep 2004 B2
6803854 Adams et al. Oct 2004 B1
6807481 Gastelum Oct 2004 B1
6813549 Good Nov 2004 B2
6819236 Kawai Nov 2004 B2
6832141 Skeen et al. Dec 2004 B2
6845314 Fosseen Jan 2005 B2
6845316 Yates Jan 2005 B2
6845317 Craine Jan 2005 B2
6847871 Malik et al. Jan 2005 B2
6847872 Bodin Jan 2005 B2
6847873 Li Jan 2005 B1
6847887 Casino Jan 2005 B1
6850841 Casino Feb 2005 B1
6859039 Horie Feb 2005 B2
6859695 Klausner Feb 2005 B2
6865457 Mittelsteadt Mar 2005 B1
6867733 Sandhu et al. Mar 2005 B2
6868386 Henderson et al. Mar 2005 B1
6870469 Ueda Mar 2005 B2
6873253 Veziris Mar 2005 B2
6873261 Anthony Mar 2005 B2
6879894 Lightner Apr 2005 B1
6885293 Okumura Apr 2005 B2
6892131 Coffee May 2005 B2
6894606 Forbes et al. May 2005 B2
6895332 King May 2005 B2
6909398 Knockeart et al. Jun 2005 B2
6914523 Munch Jul 2005 B2
6922133 Wolfe Jul 2005 B2
6922616 Obradovich Jul 2005 B2
6922622 Dulin Jul 2005 B2
6925425 Remboski Aug 2005 B2
6928348 Lightner Aug 2005 B1
6937162 Tokitsu Aug 2005 B2
6950013 Scaman Sep 2005 B2
6954140 Holler Oct 2005 B2
6958976 Kikkawa Oct 2005 B2
6965827 Wolfson Nov 2005 B1
6968311 Knockeart et al. Nov 2005 B2
6970075 Cherouny Nov 2005 B2
6970783 Knockeart et al. Nov 2005 B2
6972669 Saito Dec 2005 B2
6980131 Taylor Dec 2005 B1
6981565 Gleacher Jan 2006 B2
6982636 Bennie Jan 2006 B1
6983200 Bodin Jan 2006 B2
6988033 Lowrey Jan 2006 B1
6988034 Marlatt et al. Jan 2006 B1
6989739 Li Jan 2006 B2
7002454 Gustafson Feb 2006 B1
7002579 Olson Feb 2006 B2
7005975 Lehner Feb 2006 B2
7006820 Parker et al. Feb 2006 B1
7012632 Freeman et al. Mar 2006 B2
7019641 Lakshmanan Mar 2006 B1
7020548 Saito et al. Mar 2006 B2
7023321 Brillon et al. Apr 2006 B2
7023332 Saito Apr 2006 B2
7024318 Fischer Apr 2006 B2
7027808 Wesby Apr 2006 B2
7034705 Yoshioka Apr 2006 B2
7038578 Will May 2006 B2
7042347 Cherouny May 2006 B2
7047114 Rogers May 2006 B1
7049941 Rivera-Cintron May 2006 B2
7054742 Khavakh et al. May 2006 B2
7059689 Lesesky Jun 2006 B2
7069126 Bernard Jun 2006 B2
7069134 Williams Jun 2006 B2
7072753 Eberle Jul 2006 B2
7081811 Johnston Jul 2006 B2
7084755 Nord Aug 2006 B1
7088225 Yoshioka Aug 2006 B2
7089116 Smith Aug 2006 B2
7091880 Sorensen Aug 2006 B2
7098812 Hirota Aug 2006 B2
7099750 Miyazawa Aug 2006 B2
7099774 King Aug 2006 B2
7102496 Ernst Sep 2006 B1
7109853 Mattson Sep 2006 B1
7113081 Reichow Sep 2006 B1
7113107 Taylor Sep 2006 B2
7117075 Larschan et al. Oct 2006 B1
7119696 Borugian Oct 2006 B2
7124027 Ernst Oct 2006 B1
7124088 Bauer et al. Oct 2006 B2
7129825 Weber Oct 2006 B2
7132934 Allison Nov 2006 B2
7132937 Lu et al. Nov 2006 B2
7132938 Suzuki Nov 2006 B2
7133755 Salman Nov 2006 B2
7135983 Filippov Nov 2006 B2
7138916 Schwartz Nov 2006 B2
7139661 Holze Nov 2006 B2
7145442 Wai Dec 2006 B1
7149206 Pruzan Dec 2006 B2
7155321 Bromley et al. Dec 2006 B2
7161473 Hoshal Jan 2007 B2
7164986 Humphries Jan 2007 B2
7170390 Quiñones Jan 2007 B2
7170400 Cowelchuk Jan 2007 B2
7174243 Lightner Feb 2007 B1
7180407 Guo Feb 2007 B1
7180409 Brey Feb 2007 B2
7187271 Nagata Mar 2007 B2
7196629 Ruoss Mar 2007 B2
7197500 Israni et al. Mar 2007 B1
7216022 Kynast et al. May 2007 B2
7216035 Hörtner May 2007 B2
7218211 Ho May 2007 B2
7222009 Hijikata May 2007 B2
7225065 Hunt May 2007 B1
7228211 Lowrey Jun 2007 B1
7233235 Pavlish Jun 2007 B2
7236862 Kanno Jun 2007 B2
7239948 Nimmo Jul 2007 B2
7256686 Koutsky Aug 2007 B2
7256700 Ruocco Aug 2007 B1
7256702 Isaacs Aug 2007 B2
7260497 Watabe Aug 2007 B2
RE39845 Hasfjord Sep 2007 E
7269507 Cayford Sep 2007 B2
7269530 Lin Sep 2007 B1
7271716 Nou Sep 2007 B2
7273172 Olsen Sep 2007 B2
7280046 Berg Oct 2007 B2
7283904 Benjamin Oct 2007 B2
7286917 Hawkins Oct 2007 B2
7286929 Staton Oct 2007 B2
7289024 Sumcad Oct 2007 B2
7289035 Nathan Oct 2007 B2
7292152 Torkkola Nov 2007 B2
7292159 Culpepper Nov 2007 B2
7298248 Finley Nov 2007 B2
7298249 Avery Nov 2007 B2
7301445 Moughler Nov 2007 B2
7317383 Ihara Jan 2008 B2
7317392 DuRocher Jan 2008 B2
7317927 Staton Jan 2008 B2
7319848 Obradovich Jan 2008 B2
7321294 Mizumaki Jan 2008 B2
7321825 Ranalli Jan 2008 B2
7323972 Nobusawa Jan 2008 B2
7323974 Schmid Jan 2008 B2
7323982 Staton Jan 2008 B2
7327239 Gallant Feb 2008 B2
7327258 Fast Feb 2008 B2
7333883 Geborek Feb 2008 B2
7339460 Lane Mar 2008 B2
7349782 Churchill Mar 2008 B2
7352081 Taurasi Apr 2008 B2
7355508 Mian Apr 2008 B2
7365639 Yuhara Apr 2008 B2
7366551 Hartley Apr 2008 B1
7375624 Hines May 2008 B2
7376499 Salman May 2008 B2
7378946 Lahr May 2008 B2
7378949 Chen May 2008 B2
7386394 Shulman Jun 2008 B2
7421334 Dahlgren et al. Sep 2008 B2
7433889 Barton Oct 2008 B1
7447509 Cossins et al. Nov 2008 B2
7499949 Barton Mar 2009 B2
7565230 Gardner et al. Jul 2009 B2
7706940 Itatsu Apr 2010 B2
7880642 Gueziec Feb 2011 B2
7898388 Ehrman et al. Mar 2011 B2
7941258 Mittelsteadt et al. May 2011 B1
8150628 Hyde et al. Apr 2012 B2
8311277 Peleg et al. Nov 2012 B2
20010018628 Jenkins et al. Aug 2001 A1
20020005895 Freeman et al. Jan 2002 A1
20020024444 Hiyama et al. Feb 2002 A1
20020103622 Burge Aug 2002 A1
20030055555 Knockeart et al. Mar 2003 A1
20040039504 Coffee et al. Feb 2004 A1
20040066330 Knockeart et al. Apr 2004 A1
20040077339 Martens Apr 2004 A1
20040083041 Skeen et al. Apr 2004 A1
20040138794 Saito et al. Jul 2004 A1
20040142672 Stankewitz Jul 2004 A1
20040143602 Ruiz et al. Jul 2004 A1
20040210353 Rice Oct 2004 A1
20040236474 Chowdhary et al. Nov 2004 A1
20050064835 Gusler Mar 2005 A1
20050091018 Craft Apr 2005 A1
20050096809 Skeen et al. May 2005 A1
20050137757 Phelan et al. Jun 2005 A1
20060080359 Powell et al. Apr 2006 A1
20060154687 McDowell Jul 2006 A1
20060190822 Basson et al. Aug 2006 A1
20060234711 McArdle Oct 2006 A1
20070040928 Jung et al. Feb 2007 A1
20070124332 Ballesty et al. May 2007 A1
20070136078 Plante Jun 2007 A1
20070229234 Smith Oct 2007 A1
20070293206 Lund Dec 2007 A1
20080064413 Breed Mar 2008 A1
20080122603 Plante et al. May 2008 A1
20080255888 Berkobin Oct 2008 A1
20100033577 Doak et al. Feb 2010 A1
Foreign Referenced Citations (4)
Number Date Country
2071931 Dec 1993 CA
197 00 353 Jul 1998 DE
WO2005109369 Nov 2005 WO
WO2008109477 Sep 2008 WO
Non-Patent Literature Citations (4)
Entry
Ogle, et al.; Accuracy of Global Positioning System for Determining Driver Performance Parameters; Transportation Research Record 1818; Paper No. 02-1063; pp. 12-24.
Shen, et al.; A computer Assistant for Vehicle Dispatching with Learning Capabilities; Annals of Operations Research 61; pp. 189-211, 1995.
Tijerina, et al.; Final Report Supplement; Heavy Vehicle Driver Workload Assessment; Task 5: Workload Assessment Protocol; U.S. Department of Transportation; 69 pages, Oct. 1996.
Myra Blanco; Effects of In-Vehicle Information System (IVIS) Tasks on the Information Processing Demands of a Commercial Vehicle Operations (CVO) Driver; 230 pages, 1999.
Related Publications (1)
Number Date Country
20080319604 A1 Dec 2008 US