Vehicle exception event management systems

Information

  • Patent Grant
  • Patent Number
    9,738,156
  • Date Filed
    Friday, October 17, 2014
  • Date Issued
    Tuesday, August 22, 2017
Abstract
Exception event recorders and analysis systems include: vehicle mounted sensors arranged as a vehicle event recorder to capture both discrete and non-discrete data; a discretization facility; a database; and an analysis server, all coupled together as a computer network. Motor vehicles with video cameras and onboard diagnostic systems capture data when the vehicle is involved in a crash or other anomaly (an ‘event’). At a station where interpretation of non-discrete data is rendered, i.e. a discretization facility, captured data is used as a basis for production of supplemental discrete data to further characterize the event. Such interpreted data is joined to captured data and inserted into a database in a structure which is searchable and which supports logical or mathematical analysis by automated machines. A coupled analysis server is arranged to test stored data for prescribed conditions and, upon finding such, to initiate further actions appropriate for the detected condition.
Description
BACKGROUND OF THE INVENTIONS

Field


The following invention disclosure is generally concerned with vehicle event recorders and more specifically concerned with recording systems including a video discretization facility and operations arranged to create discrete data relating to a video image series and to associate that discrete data with other digital data associated with the event in a database record.


Prior Art


The inventions presented in U.S. Pat. No. 6,947,817 by inventor Diem, for nonintrusive diagnostic tools for testing oxygen sensor operation, relate to a diagnostic system for testing a vehicle where such systems include a wireless communications link between a vehicle and a remote network of server computers. In particular, a WiFi type access point allows an analyzer to communicate by way of the Internet with a server computer hosting an oxygen sensor SOAP (simple object access protocol) service. In a nutshell, the system relates to smog sensors for automobiles which communicate with remote servers by way of WiFi communications links.


Video surveillance systems are used to provide video records of events, incidents, happenings, et cetera in locations of special interest. For example, retail banking offices are generally protected with video surveillance systems which provide video evidence in case of robbery. While video surveillance systems are generally used in fixed location scenarios, mobile video surveillance systems are also commonly used today.


In particular, video systems have been configured for use in conjunction with an automobile and especially for use with police cruiser type automobiles. As a police cruiser is frequently quite near the scene of an active crime, important image information may be captured by video cameras installed on the police cruiser. Specific activity of interest which may occur about an automobile is not always associated with crime and criminals. Sometimes events which occur in the environments immediately about an automobile are of interest for reasons having nothing to do with crime. In example, a simple traffic accident where two cars come together in a collision may be the subject of video evidence of value. Events and circumstances leading up to the collision accident may be preserved such that an accurate reconstruction can be created. This information is useful when trying to come to a determination as to cause, fault and liability. As such, general use of video systems in conjunction with automobiles is quickly becoming an important tool useful for the protection of all. Some examples of these systems are illustrated below with reference to pertinent documents.


Inventor Schmidt presents in U.S. Pat. No. 5,570,127 a video recording system for a passenger vehicle, namely a school bus, which has two video cameras, one for an inside bus view and one for a traffic view, a single recorder, and a system whereby the two cameras are multiplexed, at appropriate times, to the recording device. A switching signal determines which of the two video cameras is in communication with the video recorder so as to view passengers on the passenger vehicle at certain times and passing traffic at other times.


Thomas Doyle of San Diego, Calif. and QUALCOMM Inc., also of San Diego, present an invention for a method and apparatus for detecting fault conditions in a vehicle data recording device, to detect tampering or unauthorized access, in U.S. Pat. No. 5,586,130. The system includes vehicle sensors for monitoring one or more operational parameters of the vehicle. The fault detection technique contemplates storing a current time value at regular intervals during periods in which the recording device is provided with a source of main power. Inventor Doyle also teaches, in U.S. Pat. No. 5,815,071, a method and apparatus for monitoring parameters of vehicle electronic control units.


A “computerized vehicle log” is presented by Dan Kikinis of Saratoga, Calif. in U.S. Pat. No. 5,815,093. The vehicle accident recording system employs a digital camera connected to a controller and nonvolatile memory, and an accident sensing interrupter. The oldest memory is overwritten by the newest images until an accident is detected, at which time the memory is blocked from further overwrites to protect the more vital images, which may include important information about the accident. Mr. Kikinis instructs that in preferred embodiments, the system has a communications port whereby stored images are downloaded after an accident to a digital device capable of displaying images. This feature is described in greater detail in the specification, which indicates a wired download to a server having specialized image handling and processing software thereon.


Inventor Mr. Turner of Compton, Calif., no less, teaches an antitheft device for an automotive vehicle having both an audible alarm and visual monitor system. Video monitor operators are responsible for monitoring and handling an emergency situation and informing a 911 emergency station. This system is presented in U.S. Pat. No. 6,002,326.


A vehicle accident video recorder, in particular, a railroad vehicle accident video recorder, is taught by inventors Cox et al. In this system, a method and monitoring unit for recording the status of the railroad vehicle prior to a potential accident is presented. The monitoring unit continuously monitors the status of an emergency brake of the railroad vehicle and the status of a horn of the railroad vehicle. Video images are recorded and captured for a predetermined period of time after detecting that the emergency brake or horn blast has been applied as an event trigger. This invention is the subject of U.S. Pat. No. 6,088,635.


A vehicle crash data recorder is presented by inventor Ferguson of Bellaire, Ohio in U.S. Pat. No. 6,185,490. The apparatus is arranged with a three stage memory to record and retain information. And further, it is equipped with serial and parallel connectors to provide instant on-scene access to accident data. It is important to note that Ferguson finds it important to include the possibility of on-site access to the data. Further, Ferguson teaches use of a wired connection in the form of a serial or parallel hardwire connector. This teaching of Ferguson is common in many advanced systems configured as vehicle event recorders.


A traffic accident data recorder and traffic accident reproduction system and method is presented as U.S. Pat. No. 6,246,933. A plurality of sensors for registering vehicle operation parameters, including at least one vehicle mounted digital video/audio camera, is included for sensing, storing and updating operational parameters. A rewritable, nonvolatile memory is provided for storing those processed operational parameters and video images and audio signals, which are provided by the microprocessor controller. Data is converted to a computer readable form and read by a computer such that an accident can be reconstructed via the data collected.


U.S. Pat. No. 6,298,290, presented by Abe et al, teaches a memory apparatus for vehicle information data. A plurality of sensors, including a CCD camera, collision sensor, vehicle speed sensor, steering angle sensor, brake pressure sensor, and acceleration sensor, are all coupled to a control unit. Further, the control unit passes information to a flash memory and a RAM memory subject to an encoder. The information collected is passed through a video output terminal. This illustrates another hardwired system and the importance placed by experts in the art on a computer hardware interface. This is partly due to the fact that video systems are typically data intensive and wired systems are necessary as they have bandwidth sufficient for transfers of large amounts of data.


Mazzilli of Bayside, N.Y. teaches in U.S. Pat. No. 6,333,759 a 360° automobile video camera system. A complex mechanical mount provides for a single camera to adjust its viewing angle giving a 360° range for video recording inside and outside of an automotive vehicle.


U.S. Pat. No. 6,389,339, granted to inventor Just of Alpharetta, Ga., teaches a vehicle operation monitoring system and method. Operation of a vehicle is monitored with an onboard video camera linked with a radio transceiver. A monitoring service includes a cellular telecommunications network to convey video data received from the transceiver to a home-base computer. These systems are aimed at parental monitoring of adolescent driving. The mobile modem is designed for transmitting live video information into the network as the vehicle travels.


Morgan, Hausman, Chilek, Hubenak, Kappler, Witz, and Wright with their heads together invented an advanced law enforcement and response technology in U.S. Pat. No. 6,411,874, granted Jun. 25, 2002. A central control system affords intuitive and easy control of numerous subsystems associated with a police car or other emergency vehicle. This highly integrated system provides advanced control apparatus which drives a plurality of detector systems including video and audio systems distributed about the vehicle. A primary feature included in this device is an advanced user interface and display system which permits high level driver interaction with the system.


Inventor Lambert teaches in U.S. Pat. No. 6,421,080 a “digital surveillance system with pre-event recording”. Pre-event recording is important in accident recording systems, because detection of the accident generally happens after the accident has occurred. A first memory is used for temporary storage. Images are stored in the temporary storage continuously until a trigger is activated which indicates an accident has occurred at which time images are transferred to a more permanent memory.


Systems taught by Gary Rayner in U.S. Pat. Nos. 6,389,340; 6,405,112; 6,449,540; and 6,718,239 are each directed to cameras for automobiles which capture video images, both of forward-looking and driver views, and store recorded images locally on a mass storage system. An operator, at the end of the vehicle service day, puts a wired connector into a device port and downloads information into a desktop computer system having specialized application software whereby the images and other information can be played back and analyzed at a highly integrated user display interface.


It is not possible in the systems Rayner teaches for an administrative operator to manipulate or otherwise handle the data captured in the vehicle at an off-site location without human intervention. It is necessary for a download operator to transfer data captured from the recorder unit device to a disconnected computer system. While proprietary ‘DriveCam’ files can be e-mailed or otherwise transferred through the Internet, those files are in a format which can only be digested by desktop software running at a remote computer. It is necessary to have the DriveCam desktop application on the remote computer in order that the files be properly read. In this way, data captured by the vehicles is totally unavailable to some parties having an interest in the data, namely those parties who do not have access to a computer appropriately arranged with the specific DriveCam application software. A second and major disadvantage of systems presented by Rayner is the necessity that a human operator service the equipment each day in a manual download action.


Remote reporting and manipulation of automobile systems is not entirely new. The following are very important teachings relating to some automobile systems having a wireless communications link component.


Inventors Fan et al teach inventions of methods and systems for detecting vehicle collision using the global positioning system (GPS). The disclosure of Jun. 12, 2001 resulted in granted patent U.S. Pat. No. 6,459,988. A GPS receiver is combined with wireless technology to automatically report accidents to remotely located third parties. The system uses GPS signals to determine when an acceleration value exceeds a preset threshold which is meant to be indicative of an accident having occurred.


Of particular interest are inventions presented by inventors Nagda et al in the document numbered U.S. Pat. No. 6,862,524, entitled using location data to determine traffic route information. In this system for determining and disseminating traffic information or route information, traffic condition information is collected from mobile units that provide their location or position information. Further, route information may be utilized to determine whether a mobile unit is allowed or prohibited from traveling along a certain route.


A common assignee, @Road Inc., owns the preceding two patents in addition to the following: U.S. Pat. Nos. 6,529,159; 6,552,682; 6,594,576; 6,664,922; 6,795,017; 6,832,140; 6,867,733; 6,882,313; and 6,922,566. As such, @Road Inc., must be considered a major innovator in position technologies arts as they relate to mobile vehicles and remote server computers.


General Motors Corp. teaches in U.S. Pat. No. 6,728,612 an automated telematics test system and method. The invention provides a method and system for testing a telematics system in a mobile vehicle, where a test command from a test center to a call center is based upon a test script. The mobile vehicle is continuously in contact, by way of cellular communication networks, with a remotely located host computer.


Inventor Earl Diem and Delphi Technologies Inc. had granted to them, on Sep. 20, 2005, U.S. Pat. No. 6,947,817. The nonintrusive diagnostic tool for sensing oxygen sensor operation includes a scheme where an oxygen analyzer deployed in a mobile vehicle communicates by way of an access point with a remotely located server. A diagnostic heuristic is used to analyze the data and confirm proper operation of the sensor. Analysis may be performed by a mainframe computer located quite remote from the actual oxygen sensor.


Similar patents including special relationships between mobile vehicles and remote host computers include those presented by various inventors in U.S. Pat. Nos. 6,735,503; 6,739,078; 6,760,757; 6,810,362; 6,832,141; and 6,850,823.


Another special group of inventions owned by Reynolds and Reynolds Holding Inc., is taught first by Lightner et al, in U.S. Pat. No. 6,928,348 issued Aug. 9, 2005. In these inventions, Internet based emission tests are performed on vehicles having special wireless couplings to computer networks. Data may be further transferred to entities of particular interest including the EPA or California Air Resources Board, for example, or particular insurance companies and other organizations concerned with vehicle emissions and environment.


Other patents held by Reynolds and Reynolds Holding Inc. include those relating to reporting of automobile performance parameters to remote servers via wireless links. Specifically, an onboard data bus (OBD) system is coupled to a microprocessor by way of a standard electrical connector. The microprocessor periodically receives data and transmits it into the wireless communications system. This information is more fully described in U.S. Pat. No. 6,636,790, granted Oct. 21, 2003. Inventors Lightner et al present a method and apparatus for remotely characterizing vehicle performance. Data at the onboard data bus is periodically received by a microprocessor and passed into a local transmitter. The invention specifically calls out transmission of data on a predetermined time interval. Thus these inventions do not anticipate nor include processing and analysis steps which result in data being passed at times other than expiration of the predetermined time period.


Reynolds and Reynolds Holding Inc., further describes systems where motor vehicles are coupled by wireless communications links to remote host servers in U.S. Pat. No. 6,732,031.


Additionally, recent developments are expressed in an application for U.S. patent having document number 2006/0095175, published on May 4, 2006. This disclosure describes a comprehensive system having many important components. In particular, deWaal et al present a ‘crash survivable apparatus’ in which information may be processed and recorded for later transmission into related coupled systems. An ability to rate driver performance based upon captured data is a particular feature described in some detail.


Also, inventor Boykin of Mt. Juliet, Tennessee presents a “composite mobile digital information system” in U.S. Pat. No. 6,831,556. In these systems, a mobile server capable of transmitting captured information from a vehicle to a second location, such as a building, is described. In particular, a surveillance system for capturing video, audio, and data information is provided in a vehicle.


Inventors Lao et al teach, in their publication numbered 2005/0099498, a “Digital Video System-Intelligent Information Management System”, which is another application for U.S. patent published May 12, 2005. A digital video information management system for monitoring and managing a system of digital collection devices is specified. A central database receives similar information from a plurality of distributed coupled systems. Those distributed systems may also be subject to reset and update operations via the centralized server.


Finally, “Mobile and Vehicle-Based Digital Video System” is the title of U.S. patent application disclosure publication numbered 2005/0100329 also published on May 12, 2005. It also describes a vehicle based video capture and management system with digital recording devices optimized for field use. Because these systems deploy non-removable media for memory, they are necessarily coupled to data handling systems via various communications links to convey captured data to analysis servers.


While systems and inventions of the art are designed to achieve particular goals and objectives, some of those being no less than remarkable, these inventions have limitations which prevent their use in new ways now possible. Inventions of the art are not used and cannot be used to realize the advantages and objectives of the inventions taught herefollowing.


SUMMARY OF THESE INVENTIONS

Comes now: James Plante; Gregory Mauro; Ramesh Kasavaraju; and Andrew Nickerson, with inventions of data processing, recording and analysis systems for use in conjunction with vehicle event recorders. An ‘exception event’ occurs whenever an extraordinary condition arises during normal use of a motor vehicle. Upon declaration of such exception event, or hereinafter simply ‘event’, information is recorded at the vehicle—in particular, information relating to vehicle and operator performance and the state of the environments about the vehicle.


Accordingly, systems first presented herein are arranged to capture, record, interpret, and analyze information relating to or arising from vehicle use. In particular, both discrete and non-discrete types of information are captured by various vehicle mounted sensors in response to an event having been declared via an event trigger. Non-discrete data is passed to and processed by a discretization facility where it is used to produce an interpreted dataset which is then associated and recombined with the original captured data, thus forming a complete event dataset.


Analysis can then be taken up against these complete datasets which include interpreted data where analysis results are used to drive automated actions in related coupled systems. Accordingly, those actions depend upon: interpreted information processed in the discretization facility; discrete data captured at the vehicle event recorder; and combinations thereof.


An analysis server is provided to run database queries which depend upon both the discrete data and the interpreted data, as both of these are in machine processable form. The analysis server is therefore enabled with greater functionality as its information base is considerably broadened to include that which would not otherwise be processable by automated machines. The analysis server is arranged to initiate actions in response to detection of certain conditions in the event database. These may be actions which depend on a single event record, or on a plurality of event records. The following examples illustrate this point thoroughly.


A vehicle event recorder having a suitable event trigger captures video and numeric data in response to a detected impact or impulse force. Numeric information collected by the plurality of vehicle subsystem sensors is insufficient to fully characterize the nature of the event. However, upon review by an expert event interpreter of the captured video and audio information, various important aspects of the event can be specified in a discrete way. For example, it can be determined that the impact should be characterized as a “curb strike” type impact where a single front wheel made excessive contact with the roadway edge or other object. The interpreter's review is expressed via a graphical user interface system particularly designed for this purpose. These graphical user interfaces are comprised of control objects which can be set to various values which reflect the interpretation. As such, the control object value state, having been manipulated by an interpreter after reviewing non-discrete data, may be associated with a particular event and stored in a database where it may be read by a machine in an analysis step. For example, in a general daily review of vehicle activity, a computer (analysis server) determines that a curb strike has occurred. Further, the analysis server considers the degree of severity by further analyzing force data and finally determines a maintenance action is necessary and orders a front-end alignment action be performed on the vehicle. The analysis server transmits the order (for example via e-mail) to the fleet maintenance department. Upon the next occasion where the vehicle is in for maintenance, the necessary alignment will be executed.
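
For illustration only, a minimal sketch of such a daily review follows. The table name, column names, severity threshold, and the order_maintenance helper are assumptions invented for this example; they are not details taken from this disclosure.

```python
# Hypothetical sketch of the daily curb strike review; schema and threshold
# are illustrative assumptions, not part of the disclosed system.
import sqlite3

SEVERITY_THRESHOLD_G = 1.5  # assumed peak force (g) that warrants alignment

def order_maintenance(vehicle_id: str, action: str, event_id: int) -> None:
    # The disclosure suggests e-mail to the fleet maintenance department;
    # printing stands in for that messaging step here.
    print(f"MAINTENANCE ORDER: {action} for vehicle {vehicle_id} (event {event_id})")

def daily_curb_strike_review(db_path: str) -> None:
    """Scan yesterday's complete event records for severe curb strikes."""
    con = sqlite3.connect(db_path)
    rows = con.execute(
        "SELECT event_id, vehicle_id, peak_force_g FROM events "
        "WHERE interpreted_type = 'curb strike' "
        "AND date(event_time) = date('now', '-1 day')"
    ).fetchall()
    for event_id, vehicle_id, peak_force_g in rows:
        # Severity test against the captured (discrete) force data.
        if peak_force_g >= SEVERITY_THRESHOLD_G:
            order_maintenance(vehicle_id, "front-end alignment", event_id)
```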


In a second illustrative example, an analysis server reads a plurality of event records. This time, an action initiated by the analysis server is directed not to a vehicle, but rather to a vehicle operator. This may be the case despite the fact that a single operator may have operated many different vehicles of a vehicle fleet to bring about several event records; each event record having an association with the operator in question. An analysis server may produce a query to identify all of the events which are characterized as “excess idle time” type events associated with any single operator. When a vehicle is left idling for extended periods, the operation efficiency of the vehicle is reduced. Accordingly, fleet managers discourage employee operators from extended idling periods. However, under some conditions, extended idling is warranted. For example, where a school bus is loading children in extremely cold weather, it is necessary to run the engine to generate heat for the bus interior. It is clear that an excess idling type event should only be declared after careful interpretation of non-discrete video data. Discrete data produced by vehicle subsystem detectors may be insufficient to properly declare all excess idling type events. Whenever a single operator has accumulated excess idling events at a rate beyond a predetermined threshold, for example three per month, the analysis server can automatically detect such conditions. Upon detection, the analysis server can take action to order a counseling session between a fleet manager and the operator in question. In this way, automated systems which depend upon interpreted data are useful for managing operations of fleet vehicles.
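
A sketch of the corresponding query, again with assumed table and column names, might count interpreted excess-idle events per operator for the current month and return those operators at or above the threshold:

```python
# Hypothetical monthly excess-idle review; the schema is an assumption.
import sqlite3

IDLE_EVENTS_PER_MONTH = 3  # example threshold from the text above

def operators_needing_counseling(db_path: str) -> list:
    con = sqlite3.connect(db_path)
    rows = con.execute(
        "SELECT operator_id, COUNT(*) AS n FROM events "
        "WHERE interpreted_type = 'excess idle time' "
        "AND event_time >= date('now', 'start of month') "
        "GROUP BY operator_id HAVING n >= ?",
        (IDLE_EVENTS_PER_MONTH,),
    ).fetchall()
    # Each returned operator triggers an order for a counseling session.
    return [operator_id for operator_id, _count in rows]
```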


Vehicle event recorders combine capture of non-discrete information, including images and audio, with capture of discrete digital or numeric data. Information is passed to a specialized processing station or discretization facility including a unique event record media player arranged to play back a very specific event dataset and a graphical user interface arranged with special controls having adjustable states.


These systems are further coupled to databases which support storage of records having a structure suitable to accommodate these event records as described. In addition, these database records are coupled to the controls of the graphical user interface via the controls' present state values. Finally, these systems are also comprised of analysis servers which interrogate the database to determine when various conditions are met and to initiate actions in response thereto.


Objectives Of These Inventions


It is a primary object of these inventions to provide information processing systems for use with vehicle event recorders.


It is an object of these inventions to provide advanced analysis on non-discrete data captured in vehicle event recorders.


A better understanding can be had with reference to detailed description of preferred embodiments and with reference to appended drawings. Embodiments presented are particular ways to realize these inventions and are not inclusive of all ways possible. Therefore, there may exist embodiments that do not deviate from the spirit and scope of this disclosure as set forth by appended claims, but do not appear here as specific examples. It will be appreciated that a great plurality of alternative versions are possible.





BRIEF DESCRIPTION OF THE DRAWING FIGURES

These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims and drawings where:



FIG. 1 is a schematic drawing of an example exception event management system;



FIG. 2 illustrates in further detail a discretization portion of these systems;



FIG. 3 similarly details these discretization facilities;



FIG. 4 illustrates an example of a display monitor including a graphical user interface coupled with a special purpose multimedia player;



FIG. 5 suggests an alternative version including special graphical objects;



FIG. 6 illustrates elements of these systems as they relate to data types and further to portions of a database record structure;



FIG. 7 is a schematic of a vehicle mounted portion including the various sensors which capture data in an event;



FIG. 8 is a block diagram depicting the structure of an event record's contents and their relationships with a discretization facility; and



FIG. 9 is a system block diagram overview.





GLOSSARY OF SPECIAL TERMS

Throughout this disclosure, reference is made to some terms which may or may not be exactly defined in popular dictionaries as they are defined here. To provide a more precise disclosure, the following terms are presented with a view to clarity so that the true breadth and scope may be more readily appreciated. Although every attempt is made to be precise and thorough, it is a necessary condition that not all meanings associated with each term can be completely set forth. Accordingly, each term is intended to also include its common meaning which may be derived from general usage within the pertinent arts or by dictionary meaning. Where the presented definition is in conflict with a dictionary or arts definition, one must consider context of use and provide liberal discretion to arrive at an intended meaning. One will be well advised to err on the side of attaching broader meanings to terms used in order to fully appreciate the entire depth of the teaching and to understand all intended variations.


Vehicle Event Recorder


A vehicle event recorder is a vehicle mounted apparatus including video recording equipment, audio recording equipment, vehicle system sensors, environmental sensors, microprocessors, application-specific programming, and a communications port, among other elements. A vehicle event recorder is arranged to capture information and data in response to detection of an abnormal condition or ‘exception event’.


Exception Event


An ‘exception event’ is any occurrence or incident which gives rise to declaration of an ‘event’ and results in the production of a recorded dataset of information relating to vehicle operator and systems status and performance especially including video images of environments about the vehicle. An exception event is declared via a trigger coupled to either a measured physical parameter which may exceed a prescribed threshold (automatic) or a user who might manipulate a ‘panic button’ tactile switch (manual).
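
By way of illustration only, the two trigger paths of this definition reduce to a simple disjunction; the threshold value below is an assumed figure, not one given in this disclosure.

```python
# Minimal sketch of the two trigger paths: automatic (a measured parameter
# exceeds a prescribed threshold) and manual (a panic button switch).
ACCEL_THRESHOLD_G = 1.0  # assumed prescribed threshold

def exception_event_declared(accel_magnitude_g: float,
                             panic_button_pressed: bool) -> bool:
    """Return True when an exception event should be declared."""
    automatic = accel_magnitude_g > ACCEL_THRESHOLD_G
    manual = panic_button_pressed
    return automatic or manual
```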


Preferred Embodiments Of These Inventions


In accordance with preferred embodiments of these inventions, vehicle exception event management systems are provided. It will be appreciated that each of the embodiments described includes an apparatus, and the apparatus of one preferred embodiment may be different than the apparatus of another embodiment.


Preferred embodiments of vehicle exception event management systems are particularly characterized as including the following elements. A vehicle event recorder is a vehicle mounted system to capture data relating to vehicle use in response to a trigger or ‘exception event’. The vehicle event recorder outputs such data to a database and also to a specially arranged discretization facility for interpretation. The discretization facility may be arranged as a part of a computer network to which the vehicle event recorder is similarly coupled. Output from the discretization facility is carefully coupled to stored event records thus preserving their association with the appropriate event. In this way, stored exception event records are far richer with information developed by both discrete and interpretive systems.


A basic understanding of these systems is realized in view of the drawing figures, in particular the overview illustration of FIG. 1. A common motor vehicle 1 may be provided with systems first presented here. In particular, a vehicle event recorder 2 is provided which includes a video camera, memory, and event trigger such that upon declaration of an exception event, video data relating to the event, more particularly video associated with a period immediately prior to and immediately after the event, is recorded to memory for temporary storage. In some versions, an OBD system 3 is also coupled to the event trigger and memory in a similar fashion whereby data captured in these same periods by the OBD system is stored to a memory for further processing.


After a session of normal vehicle use, or ‘service period’, the vehicle is coupled to a computer network such that data captured and stored in temporary on-board memory can be transferred further into the system components such as a database 4, discretization facility 5, and analysis server 6. In preferred versions, the vehicle may be connected to a system network merely by returning to a predetermined parking facility. There, a data communications link or data coupling between the vehicle mounted vehicle event recorder and a local wireless access point permits data associated with various events which occurred since last download to be downloaded 7 to the system database.


At this stage, a single event data record is allocated for each new event dataset and each data record is assigned a unique identifier 8, sometimes known as a primary key. As such, there exists a one-to-one correspondence between events and event data records stored in the database. While an event data record may be comprised of both non-discrete data 9 (including video image series, analog audio recordings, and acceleration measurements, for example) and discrete data 10 (such as a binary indication of headlights on/off, numeric speed values, steering angle indicators, and gear ratio indicators, among others), the event data record is not complete, or is ‘preliminary’, at this stage. An interpreted portion 11 of the event record remains allocated but empty. Until a discretization step is taken up at a discretization facility and the data is reviewed, analyzed and interpreted to formulate the interpreted data portion, which is then added to the event data record, the event data record is only partially complete.
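
One possible relational sketch of such an event record follows; the column names and the choice to store non-discrete media as blobs are assumptions made for illustration, not details of the disclosed structure.

```python
# Hypothetical schema for the three-portion event record described above.
import sqlite3

def create_event_table(con: sqlite3.Connection) -> None:
    con.execute("""
        CREATE TABLE IF NOT EXISTS events (
            event_id          INTEGER PRIMARY KEY,  -- unique identifier 8
            vehicle_id        TEXT NOT NULL,
            operator_id       TEXT,
            event_time        TEXT NOT NULL,        -- ISO timestamp
            -- non-discrete portion 9, stored as raw media
            video_blob        BLOB,
            audio_blob        BLOB,
            accel_trace       BLOB,
            -- discrete portion 10, captured directly at the vehicle
            headlights_on     INTEGER,
            speed_kph         REAL,
            steering_deg      REAL,
            peak_force_g      REAL,
            -- interpreted portion 11, empty until discretization
            interpreted_type  TEXT,
            driver_sunglasses INTEGER
        )
    """)
```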


An event data record 12 is passed to a discretization facility. The discretization facility operates to read, analyze and interpret non-discrete data contained in the event data record. In some versions, non-discrete data is processed by advanced computer processes capable of interpretation by applying “fuzzy logic” rules and processing. In other versions, a human interpreter intervenes to read certain non-discrete data and convert it into representative discrete values processable via a machine. In still other versions, both machine and human discretization processes are employed.


Machine processes may be illustrated as interpretation algorithms 14 applied to video data. Video images subject to image processing routines specifically arranged to “recognize” particular patterns can yield a discrete output. In example, the moment of impact is readily discoverable as the frame-to-frame image tends to change greatly at the moment of impact. Thus, some motion detection routines will be suitable for deciphering the precise moment of impact. Another useful illustrative example includes interpretation of traffic light signals. Image analysis can be applied such that it is determined precisely which traffic light color was indicated as the vehicle approached an intersection. In even more advanced schemes, the traffic light changes may be automatically quantified by image analysis whereby it can be shown approximately how much time has passed between a light change and an impact. These and other fully automated image processing modules may be implemented as part of a discretization facility which reads non-discrete image data and produces discrete numeric outputs. Of course, an endless number of image recognition algorithms may be arranged to produce discrete output from image interpretation. It is not useful to attempt to enumerate them here and it is not the purpose of this teaching to present new image processing routines. On the other hand, it is the purpose of this disclosure to present new relationships between the vehicle event recorders and the systems which process, store and use data collected thereby, and those relationships are detailed here. It is not only video data which might be subject to processing by interpretation modules, but also audio data and any other non-discrete data captured by a vehicle event recorder.
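
As a sketch of the frame-to-frame motion test just described, assuming frames arrive as equally sized grayscale numpy arrays at a uniform frame rate, and with an arbitrary illustrative threshold:

```python
# Illustrative impact-moment detector: the frame with the largest mean
# absolute pixel change from its predecessor is taken as the impact frame.
import numpy as np

def moment_of_impact(frames, fps, threshold=25.0):
    """frames: list of equally sized grayscale arrays. Returns time in s."""
    best_index, best_score = None, threshold
    for i in range(1, len(frames)):
        diff = np.abs(frames[i].astype(float) - frames[i - 1].astype(float))
        score = diff.mean()  # mean absolute change between consecutive frames
        if score > best_score:
            best_index, best_score = i, score
    return None if best_index is None else best_index / fps
```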


Audio data may be processed by discretization algorithms configured to recognize the screech of skidding tires and the crunch of glass and metal. In this case, discretization of audio data may yield a numeric estimation for speed, time of extreme braking, and moment of impact, et cetera. Again, it is not useful to present detail as to any particular recognition scheme as many can be envisioned by qualified engineers without deviation from the scope of the systems presented here. In addition to video and audio types of non-discrete data, acceleration data captured as an analog or non-discrete signal may be similarly processed. Mathematical integration applied to acceleration data yields velocity and position values for any moment of time in the event period.
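
The integration noted above has a direct numerical form; the sketch below applies cumulative trapezoidal integration to sampled acceleration, assuming the velocity and position at the start of the event period are known (or taken as zero).

```python
# Velocity and position recovered from sampled acceleration by cumulative
# trapezoidal integration; v0 and x0 are the assumed initial conditions.
import numpy as np

def integrate_acceleration(t, a, v0=0.0, x0=0.0):
    """t: sample times (s); a: acceleration samples (m/s^2)."""
    t, a = np.asarray(t, dtype=float), np.asarray(a, dtype=float)
    dt = np.diff(t)
    v = v0 + np.concatenate(([0.0], np.cumsum(0.5 * (a[1:] + a[:-1]) * dt)))
    x = x0 + np.concatenate(([0.0], np.cumsum(0.5 * (v[1:] + v[:-1]) * dt)))
    return v, x  # one velocity and one position value per sample time
```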


Besides and in parallel with automated means for interpretive reading of non-discrete data, these discretization facilities also include means for manual interpretive reading of non-discrete data. In some cases, there can be no substitute for the human brain, which has a very high interpretive capacity. Accordingly, discretization facilities of these inventions also provide a system which permits a human interpreter to review non-discrete information of an event record, interpret its meaning, and effect and bring about discrete machine representations thereof. Specifically, a special proprietary media player is arranged with particular view to presenting data captured by these vehicle event recorder systems in a dynamic graphical/image presentation over a prescribed timeline. Further, these manual interpretive systems also include simultaneous display of a custom graphical user interface which includes devices for discrete data entry. Such devices, or graphical user interface “controls”, are each associated with a particular attribute relating to an event/driver/vehicle/environment and each have a range of discrete values as well as a present state value. By reviewing data via the discretization facility media player and manipulating the graphical user interface, a human interpreter generates interpreted data which is discrete in nature. Thus, both automated and manual systems may be used at a discretization facility to produce discrete data from review and interpretation of non-discrete information. The discretization facility output, the interpreted data, is then combined with the preliminary event record to form a complete event record 15 and returned to the database for further processing/analysis.


Event records which are complete with discrete, non-discrete, and interpreted data may be interrogated by database queries which depend upon any or all of these data types or combinations thereof. In systems which do not provide for discretization of non-discrete data, it is impossible to run effective machine based analysis as the processable information is quite limited.


Analysis of so prepared complete event records comprising discrete data, non-discrete data, and interpreted data may be performed to drive automated systems/actions 16 including: maintenance actions (wheel re-alignments in response to impacts characterized as ‘curb strike’ type collisions, for example); occurrences of prescribed events (operator service exceeds 10,000 hours without accidents); triggers (driver violations require scheduling of a counseling meeting); and weekly performance reports on drivers/vehicles, among others. Some of these actions are further detailed in sections herefollowing. For the point being made here, it is sufficient to say automated systems are tied to event data which was previously subject to a discretization operation. Analysis servers may run periodic analysis on event data or may run ‘on-demand’ type analysis in response to custom requests formulated by an administrator.


In this way, these systems provide for advanced analysis to be executed upon detailed event records which include ‘in-part’ discretized or interpreted data. Data captured during vehicle use is stored and processed in a manner to yield the highest possible machine access for advanced analysis, which enables and initiates highly useful responses.



FIGS. 2 and 3 illustrate the discretization facility 21 in isolation and in better detail. Arranged as a node of a computer network in communication with system databases, the discretization facility is comprised of primary elements including an event record media player 22 as well as a graphical user interface 23. The media player is preferably arranged as a proprietary player operable for playing files arranged in a predetermined format specific to these systems. Similarly, graphical user interfaces of these systems are application specific to support functions particular to these systems and not found in general purpose graphical user interfaces. In addition, these discretization facilities may optionally include interpretive algorithm systems 24 which read and interpret non-discrete data to provide a discrete interpreted output. A discretization facility receives as input a preliminary event record 25, the event record comprising at least a portion of data characterized as non-discrete. In example, a video or audio recording is non-discrete data which cannot be used in mathematical analysis requiring discrete inputs. After being processed by the discretization facility, an event record 26 is provided as output where the event record includes a newly added portion of interpreted data characterized as discrete. In some cases, a human operator interacting with the graphical user interface and media player is the means of creating the interpreted data.


This process is further illustrated in FIG. 3 which shows media player data inputs as well as an example of a graphical user interface. A discretization facility 31 is embodied with major elements including an event record media player 32 and a custom graphical user interface 33. Data produced by a vehicle event recorder and an on-board diagnostics system is received at the discretization facility, and this data arrives in a format and structure specifically designed for these systems. Specifically, a timeline which synchronously couples video data and OBD data assures a display/viewing suitable for accurate interpretation. This is partly due to the specific nature of the data to be presented. Common media player standards do not support playing of certain forms of data which may be collected by a vehicle event recorder and on-board diagnostics systems; for example, Windows™ Media Player cannot be used in conjunction with data captured in a motor vehicle, as Windows™ Media Player takes no account of data related to speed, acceleration, steering wheel orientation, et cetera. In contrast, data specific to these exception event recording systems includes: digital and numeric data 34 formed by sensors coupled to vehicle subsystems, as well as more conventional audio data 35 recorded at an audio transducer. These may include operator compartment microphones as well as microphones arranged to receive and record sounds from the vehicle exterior. Acceleration data 36, i.e. the second derivative of position with respect to time, may be presented as continuous or non-discrete data subject to interpretation. Video data 37, captured as a series of instantaneous frames separated in time, captures the view of environments about the vehicle including exterior views, especially forward views of traffic, and interior views, especially views of a vehicle operator. Each of these types of data may be subject to some level of interpretation to extract vital information.


Some examples are illustrated as follows. Some vehicle collision type events include complex multiple impacts. These multiple impacts might be with fixed objects like trees and road signs or may be with other vehicles. In any case, a microphone which captures sounds from a vehicle exterior may produce an audio recording which, upon careful review and interpretation, might contribute to a detailed timeline as to the various impacts which occur in the series. Numeric data which indicates an operator's actions, such as an impulse braking action, a swerve type extreme steering action, et cetera, may be considered in conjunction with an event record timeline to indicate operator attention/inattention and other related response factors. Accelerometer data can be used to indicate an effective braking action, for example. Acceleration data also gives information with respect to a series of impacts which might accompany an accident. Acceleration data associated with orthogonal reference directions can be interpreted to indicate the resulting direction of travel in collisions. Mathematical integration of acceleration data provides precise position and velocity information as well. Video images can be played back frame-by-frame in slow motion to detect conditions not readily otherwise measured by subsystem sensors. A human reviewer is particularly effective at determining the presence of certain factors in an event scene. As such, media players of these systems are particularly arranged to receive this data as described and to present it in a logical manner so a human reviewer can easily view or “read” the data. While viewing an event playback, an interpreter is also provided with a special graphical user interface which permits easy quantification and specification to reflect various attributes which may be observed or interpreted in the playback. A human operator may manipulate graphical user interface controls 38 to set their present state values. These controls each have a range of values and a present state value. The present state value is adjusted by an operator to any value within the applicable range. The present state value of each control is coupled to the database via appropriate programming such that the database will preserve the present state value of the control and transfer it as part of an event record stored in long term memory.
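
The control abstraction described here, a range of permitted values plus a present state value written back to the event record, might be sketched as follows; the class, its fields, and the interpreted_values table are assumptions made for illustration only.

```python
# Hypothetical interpreter control: a range of values, a present state value,
# and a write-back of that value to the event record's database entry.
from dataclasses import dataclass

@dataclass
class InterpreterControl:
    attribute: str          # e.g. "crash_type" or "driver_sunglasses"
    allowed_values: tuple   # the control's range of discrete values
    present_value: object = None

    def set_value(self, value) -> None:
        if value not in self.allowed_values:
            raise ValueError(f"{value!r} is outside the range of {self.attribute}")
        self.present_value = value

def store_interpretation(con, event_id, controls) -> None:
    """Preserve each control's present state value with the event record."""
    for c in controls:
        con.execute(
            "INSERT INTO interpreted_values (event_id, attribute, value) "
            "VALUES (?, ?, ?)",
            (event_id, c.attribute, str(c.present_value)),
        )
```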


An example of graphical user interfaces effected in conjunction with event record type media players is illustrated further in FIG. 4, which together fill an image field 41, for example that of a computer workstation monitor. The first portion of the image field may be arranged as an event video player 42. Video images captured by a vehicle event recorder may be replayed at the player to provide a detailed visual depiction of the event scene. A video series, necessarily having an associated timeline, may be replayed on these players in several modes including: fast forward, rewind, slow motion, or actual ‘real-time’ speed, among others, as is conventional in video playback systems. A second portion, a graphical display field 43, may be arranged to present graphical and numeric information. This data is sometimes dependent upon time and can be presented in a manner whereby it changes in time with synchronization to the displayed video images. For example, a binary indication of the headlights' status may be presented as “ON” or “1” at the first video frame, but indicated as “OFF” or “0” just after a collision where the lights are damaged and no longer drawing current as detected by appropriate sensors. Another area of the display field includes a graphical user interface 44. A “tab strip” type graphical user interface control is particularly useful in some versions of these systems. Graphical user interface controls may be grouped into logically related collections and presented separately on a common tab. A timeline control 46 permits an interpreter to advance and to recede the instant time at will by sliding a pip along the line. “Start” and “stop” playback controls 47 can be used to freeze a frame or to initiate normal play. Similarly, controls may additionally include fast forward, rewind, loop, et cetera. A control interface 48 to adjust audio playback (volume) is also part of these media players. It is important to note that the graphical presentations of display field 43 are strictly coupled to the video with respect to time such that, frame-by-frame, data represented there indicates that which was captured at the same instant the video frame was captured. Sometimes information presented is represented for the entire event period. For example, it is best to show force data 49 for the entire event period. In this case, a “present instant” reference line 410 is used to indicate the moment which corresponds with the video frame capture. It is easy to see that conventional media players found in the art are wholly unsuitable for use in these systems. Those media players do not account for presentation of event data with synchronization to a video timeline, for example the graphical representation of instantaneous steering wheel orientation angle 411 or instantaneous speed. Media players of the art are suitable for display of video simultaneously with a data element such as air temperature, as air temperature does not appreciably change in time and so there exists no need for synchronization with the video frames. However, when presented data is collected via sensors coupled to vehicle subsystems and is synchronized with the video, the media player is characterized as an event record media player (ERMP) and constitutes a proprietary media player.
Further, this specialized media player is an exceptionally good tool for reading and presenting an event intuitively and in detail as it provides a broad information base from which detailed and accurate interpretations may be easily made. While a few interesting and illustrative examples of data types are presented in the data display field, it should be appreciated that a great many other types not shown here may also be included in advanced systems. As it is necessary for a clear disclosure to keep the drawing easily understandable, no attempt is made to show all possible data factors which might be presented in a data display field of these systems. Indeed, there may be many hundreds of parameters captured at the vehicle during an event which might be nicely displayed in conjunction with a frame-by-frame video of the event. One should realize that each particular parameter may contribute to a valuable understanding of the event, but that its omission here is no indication of its level of importance. What is important and taught here is the notion that a better interpretive platform is realized when any time dependent parameter is played back in a dedicated display field in conjunction with the video, where synchronization between the two is effected.
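
A sketch of the timeline coupling that distinguishes an ERMP follows: given the frame currently on screen, the displayed sensor value is the sample whose timestamp lies nearest that frame's instant. A uniform frame rate and sorted sample times are assumptions of this example.

```python
# Illustrative frame-to-sample alignment for synchronized playback.
import bisect

def sample_for_frame(frame_index, fps, sample_times, sample_values):
    """Return the sensor value captured closest to the frame's timestamp."""
    t = frame_index / fps
    i = bisect.bisect_left(sample_times, t)
    if i == 0:
        return sample_values[0]
    if i == len(sample_times):
        return sample_values[-1]
    before, after = sample_times[i - 1], sample_times[i]
    # Choose the nearer of the two neighboring samples in time.
    return sample_values[i] if (after - t) < (t - before) else sample_values[i - 1]
```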


The ERMP, so defined in the paragraphs immediately prior, is preferably presented at the same time as graphical user interface 44. Graphical user interfaces are sometimes preferably arranged as a tab strip. For example, a “Driver” tab 412 may have controls associated therewith which relate specifically to driver characterizations. Various graphical user interface control element types are useful in implementations of these graphical user interface systems: checkboxes 413, drop-down listboxes 414, radio buttons 415, sliders 416, command buttons, et cetera, among others. Checkboxes may be used to indicate binary conditions such as whether or not a driver is using a cell phone, is smoking, is alert, is wearing sunglasses, made an error, is using a seat belt properly, or is distracted, for example. It is easily appreciated that these are merely illustrative examples; one would certainly devise many alternative and equally interesting characterizations associated with a driver and driver performance in fully qualified systems. Again these are provided merely for illustration of graphical user interface controls.


One will easily see, however, their full value in consideration of the following. To arrange a physical detector which determines whether or not a driver is wearing sunglasses is a difficult task indeed; possible, but very difficult. Conversely, in view of these systems which permit discretization of such driver characteristics, including the state of her sunglasses, that is, systems which arrive at a discrete and thus computer processable expression of this condition, the detailed nature of an event is realized quite readily. By a simple review of an event video, an interpreter can make the determination that a driver is wearing sunglasses and indicate such by ticking an appropriate checkbox. As the checkbox, and more precisely its present state value, is coupled to the specific event record, information is passed to and stored in the database and becomes processable by computer algorithms. Previously known systems do not accommodate such machine processable accounts; various information is usually left in a non-discrete form, if captured at all. A fleet manager can thereafter form the query: “what is the ratio of noon hour accident type events where drivers were wearing sunglasses versus those with drivers not wearing sunglasses”. Without systems first presented here, such information would not be available without an extremely exhaustive, labor intensive examination of multiple videos.
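
Against the assumed schema sketched earlier, the fleet manager's question reduces to a single query; treating the noon hour as 12:00 to 12:59 is a further assumption of this example.

```python
# Hypothetical form of the sunglasses query described above.
def noon_sunglasses_ratio(con) -> float:
    wearing, not_wearing = con.execute(
        "SELECT "
        "SUM(CASE WHEN driver_sunglasses = 1 THEN 1 ELSE 0 END), "
        "SUM(CASE WHEN driver_sunglasses = 0 THEN 1 ELSE 0 END) "
        "FROM events "
        "WHERE interpreted_type = 'accident' "
        "AND strftime('%H', event_time) = '12'"
    ).fetchone()
    wearing, not_wearing = wearing or 0, not_wearing or 0
    return wearing / not_wearing if not_wearing else float("inf")
```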


Of course, these systems are equally useful for information which is not binary, yet still discrete. A listbox type control may provide a group having a discrete number of distinct members. For example, a “crash type” listbox containing five elements (‘values’), each associated with a different type of crash, may be provided where a reviewer's interpretation could be expressed accordingly. For example, a “sideswiped” crash could be declared after careful review of the media player data and so indicated via the drop-down listbox member associated with that crash type. Of course, it is easy to appreciate the difficulty of equipping a car with electronic sensors necessary to distinguish between a “sideswipe” type crash and a “rear-ender” crash. Thus, a considerable amount of information collected by a video event recorder is non-discrete and not processable by automated analysis until it has been reduced to a discrete form in these discretization facilities. These systems are ideal for converting non-discrete information into a processable discrete (interpreted) dataset to be connected with the event record in an electronic database, in a data structure coupled to the controls of the graphical user interface. Analysis executed on such complete event records which include interpreted data can be performed to trigger dependent actions.


Another useful combination version of an event record media player 51 and custom graphical user interface 52 is illustrated in FIG. 5. In this version, an ERMP includes three fields coupled together via an event timeline. An image field 53 is a first field arranged to show video and image data captured via any of the various cameras of a vehicle event recorder. A numeric or graphical field 54 is arranged to represent non-image data captured at a vehicle event recorder during an event. Some presentations of this data may be made in a graphical form, such as arrow indicators 55 to indicate acceleration direction and magnitude, or the wheel graphical icon 56 to indicate the steering wheel orientation angle. Presenting some numeric data in graphical form may aid interpreters in visualizing a situation better; it is easy to appreciate that the wheel icon expresses orientation in a far more intuitive way than a mere numeric value such as “117°”. A “present instant” indicator 57 moves in agreement (synchronously) with the event timeline and consequently the displayed image frame. In this way, the ERMP couples video images of an event record with numeric data of the same event. Another graphical field 58, an icon driven image display, indicates a computed path of a vehicle during an event and further illustrates various collisions as well as the severity (indicated by the size of the star balloon) associated with those collisions. The graphic additionally includes a “present instant” indication 59 and is thereby similarly coupled to the video and, more precisely, the event timeline common to all three display fields of the ERMP. This graphic aids an interpreter in understanding the event scenario details with particular regard to events having a plurality of impacts.


In response to viewing this ERMP, an interpreter can manipulate the graphical user interface provided with specific controls associated with the various impacts which may occur in a single event. For illustration, three impacts are included in the example represented: impacts 1 and 2 come close together in time, with impact 1 being less severe than impact 2; impact 3, severe in intensity, comes sometime after impact 2. By ticking appropriate checkboxes, an interpreter specifies the details of the event as determined from review of information presented in the ERMP. By using drop-down list boxes 511, the interpreter specifies the intensity of the various impacts. Special custom graphical control 512, a nonstandard graphical user interface control, graphically presents a vehicle and four quadrants A, B, C, D, where an interpreter can indicate via mouse clicks 513 the portion of the vehicle in which the various impacts occurred. In this way, graphical user interface 52 is used in conjunction with ERMP 51 to read and interpret both non-discrete and discrete data captured by a vehicle event recorder and to provide for discretization of those interpretations by graphical user interface controls, each dedicated to various descriptors which further specify the accident. Experts will appreciate that a great plurality of controls designed to specify event details will finally come to produce the most useful systems; it is not the purpose of this description to present each of those controls which may be possible. Rather, this teaching is directed to the novel relationships between unique ERMPs and graphical user interfaces and, further, discretization facilities in combination with vehicle mounted vehicle event recorders and the database and analysis systems coupled therewith.



FIG. 6 illustrates further relationships between data source subsystems and data record structure, in particular those subsystems operable for capture of data both non-discrete and discrete in nature, and those subsystems operable for converting captured non-discrete data to discrete data.


Attention is drawn to discretization facility 61, which may include image processing modules such as pattern recognition systems. In addition, these discretization facilities include a combination of a specialized event record media player and a custom graphical user interface. Alternatively, a human operator 62 may view image, audio, numeric and graphical data to interpret the event details and enter results via manipulation of graphical user interface controls. In either case, the discretization facility produces an output of machine processable discrete data related to the non-discrete input received there.
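The two interpretation paths can be sketched as follows; this is a simplified illustration under the assumption that both paths reduce to a common interface, and every function name is invented for the example.

    # Sketch: a machine path (stand-in for pattern recognition) and a human
    # path (GUI control values) both yield machine-processable discrete data.
    from typing import Optional

    def machine_discretize(video) -> dict:
        # Placeholder standing in for an image-processing module.
        return {"crash_type": "REAR_ENDER", "source": "algorithm"}

    def human_discretize(gui_control_values: dict) -> dict:
        # The GUI control values are themselves the discrete output.
        return {**gui_control_values, "source": "operator"}

    def discretize(event: dict, gui_values: Optional[dict] = None) -> dict:
        """Route an event through the human path if GUI values exist."""
        if gui_values is not None:
            return human_discretize(gui_values)
        return machine_discretize(event["video"])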


Event data is captured and recorded at a vehicle event recorder 63 coupled to vehicle subsystems and vehicle operating environments. In some preferred versions, an on-board diagnostics system 64 is coupled 65 to the vehicle event recorder such that the vehicle event recorder trigger operates to define an event. An on-board diagnostics system usually presents data continuously; however, in these event driven systems, on-board diagnostics data is only captured for a period associated with an event declaration. As described herein, the vehicle event recorder produces both numeric/digital data as well as non-discrete data such as video and audio streams. Specifically, transducers 66 coupled to vehicle subsystems and analog to digital converters, A/D, produce discrete data 67. Some of this discrete data comes from the on-board diagnostics system and some comes from subsystems independent of on-board diagnostic systems. Further, a video camera 68 produces video image series, or non-discrete data 69. A copy 610 of these data, including both discrete and non-discrete, is received at the discretization facility for interpretation either by computer interpretive algorithms or by operator driven schemes. All data, however so created, is assembled together and associated as a single unit or event record in a database structure which includes a unique identifier or "primary key" 611. Interpreted data 612 output from the discretization facility (i.e. the values of graphical user interface controls) is included as one portion of the complete event record; a second portion is the non-discrete data 613 captured by the vehicle event recorder; and a third portion of the event record is the discrete data 614 captured in the vehicle event recorder and not created as a result of an interpretive system.
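A minimal sketch of that three-portion record structure, with an invented key layout, might read:

    # Sketch: a unique primary key ties together the interpreted portion,
    # the captured non-discrete portion, and the captured discrete portion.
    import uuid

    def build_event_record(discrete: dict, non_discrete: dict, interpreted: dict) -> dict:
        return {
            "primary_key": str(uuid.uuid4()),  # unique event identifier
            "discrete": discrete,              # A/D transducer and OBD values
            "non_discrete": non_discrete,      # video and audio references
            "interpreted": interpreted,        # discretization facility output
        }

    record = build_event_record(
        discrete={"speed_kph": 62, "brake": True},
        non_discrete={"video": "event_1042.mp4", "audio": "event_1042.wav"},
        interpreted={"crash_type": "SIDESWIPE"},
    )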


It is useful to have a closer look at vehicle mounted subsystems and their relationship with the vehicle event recorder and the on-board diagnostics systems. FIG. 7 illustrates a vehicle event recorder 71 and an on-board diagnostics system 72 and the coupling 73 therebetween. Since an event is declared by a trigger 74 of the vehicle event recorder, it is desirable when capturing data from the on-board diagnostics system that the data be received and time stamped or otherwise synchronized with a system clock 75. In this way, data from the on-board diagnostics system can be properly played back with accurate correspondence between the on-board diagnostics system data and the video images, each frame of which has an instant in time associated therewith. Without this timestamp, it is impossible to synchronize data from the on-board diagnostics system with data from the vehicle event recorder. An on-board diagnostics system may include transducers coupled to vehicle subsystems, for example the steering system 76; the engine 77 (such as an oil pressure sensor or engine speed sensors); the transmission 78 (gear ratio); and the brakes system 79, among others. Today, standard on-board diagnostics systems make available diagnostic data from a great plurality of vehicle subsystems. Each of such sensors can be used to collect data during an event, and that data may be preserved at a memory 710 as part of an event record by the vehicle event recorder. The vehicle event recorder may also comprise sensors independent of the on-board diagnostics system which capture numeric and digital data during declared events. A keypad 711 is illustrative: a keypad permits a vehicle operator to be associated with the system via a "login" as the operator for an assigned use period. A global positioning system receiver 712 and electronic compass 713 similarly may be implemented as part of a vehicle event recorder, each taking discrete measurements which can be used to characterize an event. In addition to systems which capture discrete data, a vehicle event recorder also may include systems which capture data in a non-discrete form. Video camera 714, microphone 715, and accelerometer set 716 each may be used to provide data useful in interpretive systems which operate to produce discrete data therefrom. While several of each type of data collection system are mentioned here, this is not intended to be an exhaustive list. It will be appreciated that a vehicle event recorder may include many additional discrete and non-discrete data capture subsystems. It is important to understand from this teaching that both discrete and non-discrete data are captured at a vehicle event recorder, that discrete data may be captured at an on-board diagnostics system, and that these data capture operations are time stamped or otherwise coupled in time to effect a synchronization between the two.
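The timestamping scheme may be sketched as below; the class and clock choice are illustrative assumptions only.

    # Sketch: each on-board diagnostics sample is stamped against a shared
    # clock when received, so playback can later align it with video frames.
    import time

    class StampedChannel:
        """Collects (timestamp, value) pairs against a common system clock."""
        def __init__(self, clock=time.monotonic):
            self.clock = clock
            self.samples = []          # list of (t, value) pairs

        def record(self, value):
            self.samples.append((self.clock(), value))

    oil_pressure = StampedChannel()
    oil_pressure.record(41.3)          # stamped at receipt, not at playback
    oil_pressure.record(40.9)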



FIG. 8 illustrates the relationship between a preliminary event record 81, as taken by on-board hardware, and a complete event record 82 which includes an interpreted data portion having discrete, computer processable data therein. In this way, advanced algorithms may be run against the complete event record to more effectively control and produce appropriate fleet management actions.


An event record produced by vehicle mounted systems includes both a discrete data portion 83 and a non-discrete data portion 84. Data associated with a particular declared event is captured and sent to a discretization facility 85 for processing. At the discretization facility, non-discrete data is read either by humans or by machines in interpretive systems, and an interpreted data portion 86 is produced and appended to the original event record to arrive at a complete event record.
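In code form, the completion step is simply an append of the interpreted portion; the following one-function sketch uses invented key names.

    # Sketch: the interpreted portion produced at the discretization facility
    # is appended to the preliminary on-board record to complete it.
    def complete_event_record(preliminary: dict, interpreted: dict) -> dict:
        complete = dict(preliminary)            # keep captured portions intact
        complete["interpreted"] = interpreted   # append the interpreted portion
        return complete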


Finally, FIG. 9 presents a system review in block diagram form. Primary system elements mounted in a motor vehicle 91 include a vehicle event recorder 92 and optionally an on-board diagnostics system 93. These may be linked together by a system clock 94 and a vehicle event recorder event trigger 95. Together, these systems operate to capture data which may be characterized as discrete and data which may be characterized as non-discrete, the data relating to a declared event, and further to pass that captured data to a database 96. A discretization facility 97 comprises an event record media player 98 where data may be presented visually in a time managed system. A discretization facility further includes a graphical user interface 99 which a system operator may manipulate to effect changes to the present value states of a plurality of controls, each having a value range. These control values are coupled to the database, and more specifically to the data record associated with the event being played at the media player, such that the data record thereafter includes these control values which reflect interpretations from the discretization facility. An analysis server 910 includes a query generator 911 which operates to run queries against event data stored in the database, the queries at least partly depending on the interpreted data stored as part of the complete event record. Result sets 912 returned from the database can be used in analysis systems as thresholds which trigger actions 913 to be taken up in external systems. For example, upon meeting some predefined conditions, special reports 914 may be generated and transmitted to interested parties. In other systems, vehicle maintenance scheduling and operations may be driven by results produced partly based upon interpreted data in the complete event record.
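The query-then-trigger behavior of the analysis server can be sketched as follows; the table, columns, and threshold are invented for illustration and do not reflect any schema in the disclosure.

    # Sketch: a query depending partly on interpreted data; its result set
    # is compared against a threshold which triggers a follow-on action.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (id TEXT, driver TEXT, crash_type TEXT)")
    conn.execute("INSERT INTO events VALUES ('e1', 'driver7', 'SIDESWIPE')")
    conn.execute("INSERT INTO events VALUES ('e2', 'driver7', 'REAR_ENDER')")

    REPORT_THRESHOLD = 2
    (count,) = conn.execute(
        "SELECT COUNT(*) FROM events WHERE driver = ? AND crash_type IS NOT NULL",
        ("driver7",),
    ).fetchone()
    if count >= REPORT_THRESHOLD:
        print("generate special report: driver7 has", count, "interpreted events")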


In review, the systems presented here are, in general, vehicle exception event management systems. Exception events include, for example: a crash; an accident; an incident; et cetera. These management systems are primarily comprised of the following subsystem components: a vehicle event recorder; a discretization facility; and a database. The vehicle event recorder includes a video recorder or camera with a field-of-view directed to the vehicle's environments, for example a forward traffic view or a rearward vehicle operator view. These video cameras are set to capture video images whenever a system trigger declares the occurrence of an event. The discretization facility is a node of a computer network which is communicatively connected to the vehicle event recorder. In this way, data may be transferred from the vehicle event recorder to the discretization facility. The discretization facility is also in communication with the database such that discrete data generated at the discretization facility and provided as output is further transferred from the discretization facility to the database in accordance with an appropriate structure.


The discretization facility of these vehicle exception event management systems includes two very important subsystem elements: a media player and a graphical user interface. Preferably displayed simultaneously at a single monitor, the media player receives captured data from the vehicle event recorder and re-plays the data in a prescribed format and design at the monitor such that an interpreter can consider and interpret various aspects of the recorded information.


On the same monitor and at the same time, a graphical user interface having several control elements, each with a range of discrete value states, may be presented in a way where the user can manipulate the values associated with each control, i.e. via mouse click actions. Finally, the graphical user interface controls are coupled to the database such that their values are transferred to the appropriate database record associated with the event represented at the monitor by the media player and the graphical user interface so manipulated. A discretization facility may also include a tactile device, such as a computer 'mouse', wherein a human operator may manipulate the present value states of graphical user interface control elements.
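The control-to-database coupling might be sketched as a simple commit of present control values keyed by the displayed event; the schema here is hypothetical.

    # Sketch: when the operator commits the interface, each control's present
    # value is written to the record for the event shown in the media player.
    import sqlite3

    def commit_controls(conn, event_id: str, control_values: dict) -> None:
        conn.executemany(
            "INSERT INTO interpretations (event_id, field, value) VALUES (?, ?, ?)",
            [(event_id, f, str(v)) for f, v in control_values.items()],
        )

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE interpretations (event_id TEXT, field TEXT, value TEXT)")
    commit_controls(conn, "e1", {"impact_1_quadrant": "B", "impact_1_intensity": "minor"})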


These media players are distinct as they accommodate various data types not present in other media player systems. These systems also play several data types at the same time. Thus, they are ‘multi-media’ players and include subsystems to replay and present video, audio, and other exception event data; i.e. all that data associated with an event recordset.


Media players of these event management systems have three distinct display field portions: a video field portion, a graphics field portion, and a text field portion. One or more video images are presented in the video field portion. In some cases, the field portion is divided into several different view fields to accommodate video from different cameras of the same video event recorder. All three display fields are synchronized together via a common timeline. Various data captured in a vehicle event recorder are time stamped so that they can be replayed synchronously with other data. In this regard, it is said that video fields, graphics fields and text fields are coupled by a common event timeline.


Replay of data is controlled by way of special timeline controls of the media player. That is, the media player timeline controls permit playback functions described as: replay; rewind; slow motion; fast forward; and loop play. When a timeline is being played in a forward direction, audio may accompany video and graph information via a system speaker.
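These playback functions can be modeled compactly; the sketch below assumes playback is a signed, scaled advance of the present instant, with looping as a wrap at the timeline ends.

    # Sketch: rate 1.0 plays, 0.25 is slow motion, 4.0 fast forwards,
    # -1.0 rewinds; loop play wraps the present instant at either end.
    class TimelinePlayer:
        def __init__(self, duration: float, loop: bool = False):
            self.duration = duration
            self.t = 0.0
            self.rate = 1.0
            self.loop = loop

        def tick(self, dt: float) -> float:
            self.t += self.rate * dt
            if self.loop:
                self.t %= self.duration
            else:
                self.t = min(max(self.t, 0.0), self.duration)
            return self.t

    player = TimelinePlayer(duration=12.0, loop=True)
    player.rate = 0.25            # slow motion
    print(player.tick(1.0))       # advances the present instant to 0.25 s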


Graphics fields of these media players may include at least one dynamic graphic element responsive to data in the event dataset; graphical representation of data sometimes aids in its comprehension and interpretation.


A graphics field may be arranged to include, for example, a plot of force versus time; sometimes referred to as 'G-force' or 'acceleration forces', these are the forces which act on a vehicle as it advances through the event period. In the best versions, the graphics field comprises two plots of force, each plot being associated with crossed or orthogonal directions. Another useful example of a graphical representation of event data is a graphics field element having a dynamic representation of steering wheel position.
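Such a two-trace force plot could be produced along these lines (synthetic data; requires the matplotlib package, and every value here is invented for illustration):

    # Sketch: longitudinal and lateral acceleration plotted against the
    # common event timeline, as two crossed-direction force traces.
    import matplotlib.pyplot as plt

    t = [i / 10.0 for i in range(50)]                 # 0.0 .. 4.9 s
    g_long = [0.1 if x < 2.0 else -0.8 for x in t]    # braking after t = 2.0 s
    g_lat = [0.0 if x < 2.5 else 0.4 for x in t]      # swerve after t = 2.5 s

    fig, ax = plt.subplots()
    ax.plot(t, g_long, label="longitudinal G")
    ax.plot(t, g_lat, label="lateral G")
    ax.set_xlabel("time (s)")
    ax.set_ylabel("acceleration (G)")
    ax.legend()
    plt.show()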


Text fields may be provided in these graphical user interfaces to include at least one dynamic text element responsive to data in an event dataset. A text field may further include at least one text element which characterizes an exception event, the characterization being related to some attribute of the vehicle or operator or environmental condition.


Finally, these systems may also include an analysis server coupled to the database wherein machine processable commands may be executed against data stored in the database. Machine processable commands include prescribed queries which may be executed periodically.


One will now fully appreciate how systems may be arranged to process, interpret and analyze data collected in conjunction with vehicle event recorders. Although the present inventions have been described in considerable detail with clear and concise language and with reference to certain preferred versions thereof, including best modes anticipated by the inventors, other versions are possible. Therefore, the spirit and scope of the invention should not be limited by the description of the preferred versions contained herein, but rather by the claims appended hereto.

Claims
  • 1. A discretization system configured to receive vehicle event information and form vehicle event records related to vehicle events, the discretization system comprising: one or more physical computer processors configured to: receive vehicle event information that includes first non-discrete sensory data and first discrete quantitative data related to operation of a vehicle during a vehicle event, wherein the first non-discrete sensory data includes information captured by a video camera having a field-of-view that includes an environment about the vehicle; facilitate discretization of a portion of the first non-discrete sensory data, wherein the portion of the first non-discrete sensory data includes the information captured by the video camera; determine second discrete quantitative data based on the discretization of the portion of the first non-discrete sensory data; determine third discrete quantitative data based on one or more of the first discrete quantitative data, the second discrete quantitative data, or the first non-discrete sensory data; form a vehicle event record that includes the third discrete quantitative data; and transfer the vehicle event record to a remote computing device that is external to the vehicle; and a user interface, wherein the user interface receives one or more of entry and selection of manual interpretation information pertaining to the portion of the first non-discrete sensory data from a human interpreter, wherein the manual interpretation information includes discrete numeric values; wherein the one or more physical computer processors is configured to determine the second discrete quantitative data based on the discrete numeric values included in the manual interpretation information.
  • 2. The system of claim 1, wherein the one or more physical computer processors are configured such that one or more of the first discrete quantitative data, the second discrete quantitative data, and the third discrete quantitative data include observations related to operator behavior, the observations related to operator behavior related to one or more of a collision, a near collision, a driving error, a distraction, a position of a head of an operator, an eye position of the operator, a hand position of the operator, talking by the operator, a lane departure, a lane position, a number of lanes in a direction of travel, a road type, a following distance, a number of visible vehicles, or a vehicle environment.
  • 3. The system of claim 1, wherein the one or more physical computer processors are further configured to determine the third discrete quantitative data based on one or more of predetermined logic rules, or predetermined algorithms.
  • 4. The system of claim 1, wherein the one or more physical computer processors are further configured to facilitate presentation of the third discrete quantitative data to a remotely located user.
  • 5. The system of claim 1, further comprising a data store configured to electronically store the first discrete quantitative data, the first non-discrete sensory data, the second discrete quantitative data, and the third discrete quantitative data, the data store comprising one or more of a relational database, a NoSQL database, or a Hadoop data store.
  • 6. The system of claim 5, wherein the data store is configured such that the first discrete quantitative data, the first non-discrete sensory data, the second discrete quantitative data, and the third discrete quantitative data are associated with the vehicle event and a vehicle event timeline in the data store.
  • 7. The system of claim 1, wherein the user interface includes one or more of a keypad, a button, a switch, a keyboard, a knob, a lever, a display screen, a touch screen, a speaker, a microphone, an indicator light, an audible alarm, a printer, or a tactile feedback device.
  • 8. The system of claim 1, wherein the one or more physical computer processors are further configured to automatically discretize a second portion of the first non-discrete sensory data via one or more of a neural network or logistic regression; and determine the third discrete quantitative data based on one or more of the first discrete quantitative data, the second discrete quantitative data, or the automatically discretized second portion of the first non-discrete sensory data.
  • 9. The system of claim 1, wherein the one or more physical computer processors are further configured to automatically discretize a second portion of the first non-discrete sensory data via one or more of pattern recognition image processing techniques or pattern recognition audio signal processing techniques; and determine the third discrete quantitative data based on one or more of the first discrete quantitative data, the second discrete quantitative data, or the automatically discretized second portion of the first non-discrete sensory data.
  • 10. The system of claim 1, wherein the first non-discrete sensory data received by the discretization system includes one or more of audio recording data or visual information representing a vehicle environment of the vehicle, the visual information acquired by one or more cameras, the vehicle environment including spaces in and around an interior and an exterior of the vehicle, the one or more cameras including one or more of a forward looking camera, a driver view camera, a passenger view camera, a rear vehicle view camera, or a side vehicle view camera.
  • 11. The system of claim 1, wherein the user interface is configured to facilitate manual discretization of a portion of the first non-discrete sensory data by a human interpreter via a graphical representation of the first discrete quantitative data, the graphical representation of the first discrete quantitative data including a graphical representation of one or more of vehicle acceleration, vehicle speed, engine speed, vehicle gear, vehicle brake position, vehicle steering wheel position, throttle position, engine load, vehicle angular velocity, gear ratio, lane departure, following distance, a collision warning, rollover protection system activation, fishtailing protection system activation, a speedometer, an engine RPM gage, or a force gauge.
  • 12. The system of claim 1, wherein the one or more physical computer processors are further configured to generate and facilitate distribution of a report based on one or more of the first discrete quantitative data, the first non-discrete sensory data, the second discrete quantitative data, and the third discrete quantitative data.
  • 13. The system of claim 12, wherein the one or more physical computer processors are configured such that the report includes coaching information.
  • 14. The system of claim 12, wherein the one or more physical computer processors are configured such that the report is distributed via one or more of an email, a text message, or a phone call.
  • 15. The system of claim 12, wherein the one or more physical computer processors are further configured to generate and facilitate distribution of a report based on discrete quantitative data and non-discrete sensory data from multiple individual vehicle events.
  • 16. A method for receiving vehicle event information and forming vehicle event records related to vehicle events, the method comprising: receiving vehicle event information that includes first non-discrete sensory data and first discrete quantitative data related to operation of a vehicle during a vehicle event, wherein the first non-discrete sensory data includes information captured by a video camera having a field-of-view that includes an environment about the vehicle; facilitating discretization of a portion of the first non-discrete sensory data, wherein the portion of the first non-discrete sensory data includes the information captured by the video camera; determining second discrete quantitative data based on the discretization of the portion of the first non-discrete sensory data; determining third discrete quantitative data based on one or more of the first discrete quantitative data, the second discrete quantitative data, or the first non-discrete sensory data; forming a vehicle event record that includes the third discrete quantitative data; transferring the vehicle event record to a remote computing device that is external to the vehicle; and receiving one or more of entry or selection of manual interpretation information pertaining to the portion of the first non-discrete sensory data from a human interpreter via a user interface, wherein the manual interpretation information includes discrete numeric values; and wherein determining the second discrete quantitative data is based on the discrete numeric values included in the manual interpretation information.
  • 17. The method of claim 16, wherein one or more of the first discrete quantitative data, the second discrete quantitative data, and the third discrete quantitative data include observations related to operator behavior, the observations related to operator behavior related to one or more of a collision, a near collision, a driving error, a distraction, a position of a head of an operator, an eye position of the operator, a hand position of the operator, talking by the operator, a lane departure, a lane position, a number of lanes in a direction of travel, a road type, a following distance, a number of visible vehicles, or a vehicle environment.
  • 18. The method of claim 16, further comprising determining the third discrete quantitative data based on one or more of predetermined logic rules, or predetermined algorithms.
  • 19. The method of claim 16, further comprising facilitating presentation of the third discrete quantitative data to a remotely located user.
  • 20. The method of claim 16, further comprising electronically storing the first discrete quantitative data, the first non-discrete sensory data, the second discrete quantitative data, and the third discrete quantitative data in a data store, the data store comprising one or more of a relational database, a NoSQL database, or a Hadoop data store.
  • 21. The method of claim 16, wherein the data store is configured such that the first discrete quantitative data, the first non-discrete sensory data, the second discrete quantitative data, and the third discrete quantitative data are associated with the vehicle event and a vehicle event timeline in the data store.
  • 22. The method of claim 16, wherein the user interface includes one or more of a keypad, a button, a switch, a keyboard, a knob, a lever, a display screen, a touch screen, a speaker, a microphone, an indicator light, an audible alarm, a printer, or a tactile feedback device.
  • 23. The method of claim 16, further comprising automatically discretizing a second portion of the first non-discrete sensory data via one or more of a neural network or logistic regression; and determining the third discrete quantitative data based on one or more of the first discrete quantitative data, the second discrete quantitative data, or the automatically discretized second portion of the first non-discrete sensory data.
  • 24. The method of claim 16, further comprising automatically discretizing a second portion of the first non-discrete sensory data via one or more of pattern recognition image processing techniques or pattern recognition audio signal processing techniques; and determining the third discrete quantitative data based on one or more of the first discrete quantitative data, the second discrete quantitative data, or the automatically discretized second portion of the first non-discrete sensory data.
  • 25. The method of claim 16, wherein the first non-discrete sensory data includes one or more of audio recording data or visual information representing a vehicle environment of the vehicle, the visual information acquired by one or more cameras, the vehicle environment including spaces in and around an interior and an exterior of the vehicle, the one or more cameras including one or more of a forward looking camera, a driver view camera, a passenger view camera, a rear vehicle view camera, or a side vehicle view camera.
  • 26. The method of claim 16, further comprising facilitating manual discretization of a portion of the first non-discrete sensory data by a human interpreter via a graphical representation of the first discrete quantitative data, the graphical representation of the first discrete quantitative data including a graphical representation of one or more of vehicle acceleration, vehicle speed, engine speed, vehicle gear, vehicle brake position, vehicle steering wheel position, throttle position, engine load, vehicle angular velocity, gear ratio, lane departure, following distance, a collision warning, rollover protection system activation, fishtailing protection system activation, a speedometer, an engine RPM gage, or a force gauge.
  • 27. The method of claim 16, further comprising generating and facilitating distribution of a report based on one or more of the first discrete quantitative data, the first non-discrete sensory data, the second discrete quantitative data, and the third discrete quantitative data.
  • 28. The method of claim 27, wherein the report includes coaching information.
  • 29. The method of claim 27, wherein the report is distributed via one or more of an email, a text message, or a phone call.
  • 30. The method of claim 16, further comprising generating and facilitating distribution of a report based on discrete quantitative data and non-discrete sensory data from multiple individual vehicle events.
US Referenced Citations (907)
Number Name Date Kind
673203 Freund Apr 1901 A
673795 Hammer May 1901 A
673907 Johnson May 1901 A
676075 Mcdougall Jun 1901 A
679511 Richards Jul 1901 A
681036 Burg Aug 1901 A
681283 Waynick Aug 1901 A
681998 Swift Sep 1901 A
683155 Thompson Sep 1901 A
683214 Mansfield Sep 1901 A
684276 Lonergan Oct 1901 A
685082 Wood Oct 1901 A
685969 Campbell Nov 1901 A
686545 Selph Nov 1901 A
689849 Brown Dec 1901 A
691982 Sturgis Jan 1902 A
692834 Davis Feb 1902 A
694781 Prinz Mar 1902 A
2943141 Knight Jun 1960 A
3634866 Meyer Jan 1972 A
3781824 Caiati Dec 1973 A
3812287 Lemelson May 1974 A
3885090 Rosenbaum May 1975 A
3992656 Joy Nov 1976 A
4054752 Dennis Oct 1977 A
4072850 McGlynn Feb 1978 A
4258421 Juhasz Mar 1981 A
4271358 Schwarz Jun 1981 A
4276609 Patel Jun 1981 A
4280151 Tsunekawa Jul 1981 A
4281354 Conte Jul 1981 A
4401976 Stadelmayr Aug 1983 A
4409670 Herndon Oct 1983 A
4420773 Toyoda Dec 1983 A
4425097 Owens Jan 1984 A
4456931 Toyoda Jun 1984 A
4489351 d'Alayer de Costemore Dec 1984 A
4496995 Colles Jan 1985 A
4500868 Tokitsu Feb 1985 A
4528547 Rodney Jul 1985 A
4533962 Decker Aug 1985 A
4558379 Hutter Dec 1985 A
4588267 Pastore May 1986 A
4593313 Nagasaki Jun 1986 A
4621335 Bluish Nov 1986 A
4625210 Sagl Nov 1986 A
4630110 Cotton Dec 1986 A
4632348 Keesling Dec 1986 A
4638289 Zottnik Jan 1987 A
4646241 Ratchford Feb 1987 A
4651143 Yamanaka Mar 1987 A
4671111 Lemelson Jun 1987 A
4718685 Kawabe Jan 1988 A
4754255 Sanders Jun 1988 A
4758888 Lapidot Jul 1988 A
4763745 Eto Aug 1988 A
4785474 Bernstein Nov 1988 A
4789904 Peterson Dec 1988 A
4794566 Richards Dec 1988 A
4804937 Barbiaux Feb 1989 A
4806931 Nelson Feb 1989 A
4807096 Skogler Feb 1989 A
4814896 Heitzman Mar 1989 A
4837628 Sasaki Jun 1989 A
4839631 Tsuji Jun 1989 A
4843463 Michetti Jun 1989 A
4843578 Wade Jun 1989 A
4853856 Hanway Aug 1989 A
4853859 Morita Aug 1989 A
4866616 Takeuchi Sep 1989 A
4876597 Roy Oct 1989 A
4883349 Mittelhaeuser Nov 1989 A
4896855 Furnish Jan 1990 A
4926331 Windle May 1990 A
4930742 Schofield Jun 1990 A
4936533 Adams Jun 1990 A
4939652 Steiner Jul 1990 A
4942464 Milatz Jul 1990 A
4945244 Castleman Jul 1990 A
4949186 Peterson Aug 1990 A
4980913 Skret Dec 1990 A
4987541 Levente Jan 1991 A
4992943 McCracken Feb 1991 A
4993068 Piosenka Feb 1991 A
4995086 Lilley Feb 1991 A
5012335 Cohodar Apr 1991 A
5027104 Reid Jun 1991 A
5046007 McCrery Sep 1991 A
5050166 Cantoni Sep 1991 A
5056056 Gustin Oct 1991 A
5057820 Markson Oct 1991 A
5096287 Kakinami Mar 1992 A
5100095 Haan Mar 1992 A
5111289 Lucas May 1992 A
5140434 Van Blessinger Aug 1992 A
5140436 Blessinger Aug 1992 A
5140438 Kurahashi Aug 1992 A
5144661 Shamosh Sep 1992 A
5178448 Adams Jan 1993 A
5185700 Bezos Feb 1993 A
5196938 Blessinger Mar 1993 A
5223844 Mansell Jun 1993 A
5224211 Roe Jun 1993 A
5262813 Scharton Nov 1993 A
5283433 Tsien Feb 1994 A
5294978 Katayama Mar 1994 A
5305214 Komatsu Apr 1994 A
5305216 Okura Apr 1994 A
5308247 Dyrdek May 1994 A
5309485 Chao May 1994 A
5311197 Sorden May 1994 A
5321753 Gritton Jun 1994 A
5327288 Wellington Jul 1994 A
5330149 Haan Jul 1994 A
5343527 Moore Aug 1994 A
5353023 Mitsugi Oct 1994 A
5361326 Aparicio Nov 1994 A
5387926 Bellan Feb 1995 A
5388045 Kamiya Feb 1995 A
5388208 Weingartner Feb 1995 A
5404330 Lee Apr 1995 A
5408330 Squicciarini Apr 1995 A
5422543 Weinberg Jun 1995 A
5430431 Nelson Jul 1995 A
5430432 Camhi Jul 1995 A
5435184 Pineroli Jul 1995 A
5445024 Riley Aug 1995 A
5445027 Zoerner Aug 1995 A
5446659 Yamawaki Aug 1995 A
5455625 Englander Oct 1995 A
5455716 Suman Oct 1995 A
5465079 Bouchard Nov 1995 A
5473729 Bryant Dec 1995 A
5477141 Naether Dec 1995 A
5495242 Kick Feb 1996 A
5495243 McKenna Feb 1996 A
5497419 Hill Mar 1996 A
5499182 Ousborne Mar 1996 A
5504482 Schreder Apr 1996 A
5513011 Matsumoto Apr 1996 A
5515285 Garrett May 1996 A
5519260 Washington May 1996 A
5521633 Nakajima May 1996 A
5523811 Wada Jun 1996 A
5526269 Ishibashi Jun 1996 A
5530420 Tsuchiya Jun 1996 A
5532678 Kin Jul 1996 A
5537156 Katayama Jul 1996 A
5539454 Williams Jul 1996 A
5541590 Nishio Jul 1996 A
5544060 Fujii Aug 1996 A
5546191 Hibi Aug 1996 A
5546305 Kondo Aug 1996 A
5548273 Nicol Aug 1996 A
5552990 Ihara Sep 1996 A
5559496 Dubats Sep 1996 A
5568211 Bamford Oct 1996 A
5570087 Lemelson Oct 1996 A
5570127 Schmidt Oct 1996 A
5574424 Nguyen Nov 1996 A
5574443 Hsieh Nov 1996 A
D376571 Kokat Dec 1996 S
5581464 Woll Dec 1996 A
5586130 Doyle Dec 1996 A
5590948 Moreno Jan 1997 A
5596382 Bamford Jan 1997 A
5596647 Wakai Jan 1997 A
5600775 King Feb 1997 A
5608272 Tanguay Mar 1997 A
5610580 Lai Mar 1997 A
5612686 Takano Mar 1997 A
5631638 Kaspar May 1997 A
5638273 Coiner Jun 1997 A
5642106 Hancock Jun 1997 A
5646856 Kaesser Jul 1997 A
5652706 Morimoto Jul 1997 A
RE35590 Bezos Aug 1997 E
5654892 Fujii Aug 1997 A
5659355 Barron Aug 1997 A
5666120 Kline Sep 1997 A
5667176 Zamarripa Sep 1997 A
5669698 Veldman Sep 1997 A
5671451 Takahashi Sep 1997 A
5677979 Squicciarini Oct 1997 A
5680117 Arai Oct 1997 A
5680123 Lee Oct 1997 A
5686765 Washington Nov 1997 A
5686889 Hillis Nov 1997 A
5689442 Swanson Nov 1997 A
5696705 Zykan Dec 1997 A
5706362 Yabe Jan 1998 A
5706909 Bevins Jan 1998 A
5712679 Coles Jan 1998 A
5717456 Rudt Feb 1998 A
5719554 Gagnon Feb 1998 A
5758299 Sandborg May 1998 A
5781101 Stephen Jul 1998 A
5781145 Williams Jul 1998 A
5784007 Pepper Jul 1998 A
5784021 Oliva Jul 1998 A
5784521 Nakatani Jul 1998 A
5790403 Nakayama Aug 1998 A
5790973 Blaker Aug 1998 A
5793308 Rosinski Aug 1998 A
5793420 Schmidt Aug 1998 A
5793739 Tanaka Aug 1998 A
5793985 Natarajan Aug 1998 A
5794165 Minowa Aug 1998 A
5797134 McMillan Aug 1998 A
5798458 Monroe Aug 1998 A
5800040 Santo Sep 1998 A
5802545 Coverdill Sep 1998 A
5802727 Blank Sep 1998 A
5805079 Lemelson Sep 1998 A
5813745 Fant Sep 1998 A
5815071 Doyle Sep 1998 A
5815093 Kikinis Sep 1998 A
5819198 Peretz Oct 1998 A
5825284 Dunwoody Oct 1998 A
5825412 Hobson Oct 1998 A
5844505 Van Dec 1998 A
5845733 Wolfsen Dec 1998 A
5867802 Borza Feb 1999 A
5877897 Schofield Mar 1999 A
5896167 Omae Apr 1999 A
5897602 Mizuta Apr 1999 A
5897606 Miura Apr 1999 A
5899956 Chan May 1999 A
5901806 Takahashi May 1999 A
5914748 Parulski Jun 1999 A
5919239 Fraker Jul 1999 A
5926210 Hackett Jul 1999 A
5928291 Jenkins Jul 1999 A
5938321 Bos Aug 1999 A
5946404 Bakshi Aug 1999 A
5948038 Daly Sep 1999 A
5959367 OFarrell Sep 1999 A
5978017 Tino Nov 1999 A
6002326 Turner Dec 1999 A
6006148 Strong Dec 1999 A
6008723 Yassan Dec 1999 A
6008841 Charlson Dec 1999 A
6009370 Minowa Dec 1999 A
6011492 Garesche Jan 2000 A
6028528 Lorenzetti Feb 2000 A
6037860 Zander Mar 2000 A
6037977 Peterson Mar 2000 A
6041410 Hsu Mar 2000 A
6049079 Noordam Apr 2000 A
6057754 Kinoshita May 2000 A
6060989 Gehlot May 2000 A
6064792 Fox May 2000 A
6067488 Tano May 2000 A
6076026 Jambhekar Jun 2000 A
6084870 Wooten Jul 2000 A
6088635 Cox Jul 2000 A
6092008 Bateman Jul 2000 A
6092021 Ehlbeck Jul 2000 A
6092193 Loomis Jul 2000 A
6100811 Hsu Aug 2000 A
6111254 Eden Aug 2000 A
6118768 Bhatia Sep 2000 A
6122738 Millard Sep 2000 A
6141611 Mackey Oct 2000 A
6144296 Ishida Nov 2000 A
6147598 Murphy Nov 2000 A
6151065 Steed Nov 2000 A
6163338 Johnson Dec 2000 A
6163749 McDonough Dec 2000 A
6167186 Kawasaki Dec 2000 A
6170742 Yacoob Jan 2001 B1
6181373 Coles Jan 2001 B1
6182010 Berstis Jan 2001 B1
6185490 Ferguson Feb 2001 B1
6195605 Tabler Feb 2001 B1
6200139 Clapper Mar 2001 B1
6208919 Barkesseh Mar 2001 B1
6211907 Scaman Apr 2001 B1
6218960 Ishikawa Apr 2001 B1
6246933 Bague Jun 2001 B1
6246934 Otake Jun 2001 B1
6252544 Hoffberg Jun 2001 B1
6253129 Jenkins Jun 2001 B1
6259475 Ramachandran Jul 2001 B1
6263265 Fera Jul 2001 B1
6266588 McClellan Jul 2001 B1
6298290 Abe Oct 2001 B1
6300875 Schafer Oct 2001 B1
6324450 Iwama Nov 2001 B1
6333759 Mazzilli Dec 2001 B1
6337622 Sugano Jan 2002 B1
6349250 Hart Feb 2002 B1
6353734 Wright Mar 2002 B1
6356823 Iannotti Mar 2002 B1
6360147 Lee Mar 2002 B1
6366207 Murphy Apr 2002 B1
6389339 Just May 2002 B1
6389340 Rayner May 2002 B1
6400835 Lemelson Jun 2002 B1
6405112 Rayner Jun 2002 B1
6405132 Breed Jun 2002 B1
6408232 Cannon Jun 2002 B1
6411874 Morgan Jun 2002 B2
6421080 Lambert Jul 2002 B1
6434510 Callaghan Aug 2002 B1
6449540 Rayner Sep 2002 B1
6456321 Ito Sep 2002 B1
6459988 Fan Oct 2002 B1
6470241 Yoshikawa Oct 2002 B2
6472771 Frese Oct 2002 B1
6490513 Fish Dec 2002 B1
6493650 Rodgers Dec 2002 B1
6505106 Lawrence Jan 2003 B1
6507838 Syeda-Mahmood Jan 2003 B1
6508400 Ishifuji Jan 2003 B1
6516256 Hartmann Feb 2003 B1
6518881 Monroe Feb 2003 B2
6525672 Chainer Feb 2003 B2
6529159 Fan Mar 2003 B1
6535804 Chun Mar 2003 B1
6552682 Fan Apr 2003 B1
6556905 Mittelsteadt Apr 2003 B1
6559769 Anthony May 2003 B2
6574538 Sasaki Jun 2003 B2
6575902 Burton Jun 2003 B1
6580373 Ohashi Jun 2003 B1
6580983 Laguer-Diaz Jun 2003 B2
6593848 Atkins Jul 2003 B1
6594576 Fan Jul 2003 B2
6611740 Lowrey Aug 2003 B2
6611755 Coffee Aug 2003 B1
6624611 Kirmuss Sep 2003 B2
6629029 Giles Sep 2003 B1
6629030 Klausner Sep 2003 B2
6636791 Okada Oct 2003 B2
6664922 Fan Dec 2003 B1
6665613 Duvall Dec 2003 B2
6679702 Rau Jan 2004 B1
6684137 Takagi Jan 2004 B2
6694483 Nagata Feb 2004 B1
6701234 Vogelsang Mar 2004 B1
6714894 Tobey Mar 2004 B1
6718239 Rayner Apr 2004 B2
6721640 Glenn Apr 2004 B2
6721652 Sanqunetti Apr 2004 B1
6728612 Carver Apr 2004 B1
6732031 Lightner May 2004 B1
6732032 Banet May 2004 B1
6735503 Ames May 2004 B2
6737954 Chainer May 2004 B2
6738697 Breed May 2004 B2
6739078 Lajoie May 2004 B2
6741168 Webb May 2004 B2
6745153 White Jun 2004 B2
6747692 Patel Jun 2004 B2
6748305 Klausner Jun 2004 B1
6760757 Lundberg Jul 2004 B1
6762513 Landgraf Jul 2004 B2
6779716 Grow Aug 2004 B1
6795017 Puranik Sep 2004 B1
6795111 Mazzilli Sep 2004 B1
6795759 Doyle Sep 2004 B2
6798743 Ma Sep 2004 B1
6804590 Sato Oct 2004 B2
6810362 Adachi Oct 2004 B2
6812831 Ikeda Nov 2004 B2
6819989 Maeda Nov 2004 B2
6831556 Boykin Dec 2004 B1
6832140 Fan Dec 2004 B2
6832141 Skeen Dec 2004 B2
6836712 Nishina Dec 2004 B2
6842762 Raithel Jan 2005 B2
6847873 Li Jan 2005 B1
6850823 Eun Feb 2005 B2
6859695 Klausner Feb 2005 B2
6859705 Rao Feb 2005 B2
6862524 Nagda Mar 2005 B1
6865457 Mittelsteadt Mar 2005 B1
6867733 Sandhu Mar 2005 B2
6873261 Anthony Mar 2005 B2
6882313 Fan Apr 2005 B1
6882912 DiLodovico Apr 2005 B2
6894606 Forbes May 2005 B2
6895248 Akyol May 2005 B1
6898492 De Leon May 2005 B2
6898493 Ehrman May 2005 B2
6919823 Lock Jul 2005 B1
6922566 Puranik Jul 2005 B2
6928348 Lightner Aug 2005 B1
6931309 Phelan Aug 2005 B2
6947817 Diem Sep 2005 B2
6950122 Mirabile Sep 2005 B1
6954223 Miyazawa Oct 2005 B2
6988034 Marlatt Jan 2006 B1
7003289 Kolls Feb 2006 B1
7012632 Freeman Mar 2006 B2
7020548 Saito Mar 2006 B2
7023333 Blanco Apr 2006 B2
7039510 Gumpinger May 2006 B2
7076348 Bucher Jul 2006 B2
7079927 Tano Jul 2006 B1
7082359 Breed Jul 2006 B2
7082382 Rose et al. Jul 2006 B1
7088387 Freeman Aug 2006 B1
7095782 Cohen Aug 2006 B1
7098812 Hirota Aug 2006 B2
7100190 Johnson Aug 2006 B2
7113853 Hecklinger Sep 2006 B2
7117075 Larschan Oct 2006 B1
7119832 Blanco Oct 2006 B2
7138904 Dutu Nov 2006 B1
7155321 Bromley Dec 2006 B2
7177738 Diaz Feb 2007 B2
7209833 Isaji Apr 2007 B2
7239252 Kato Jul 2007 B2
7254482 Kawasaki Aug 2007 B2
7265663 Steele Sep 2007 B2
7266507 Simon Sep 2007 B2
7272179 Siemens Sep 2007 B2
7308341 Schofield Dec 2007 B2
7317974 Luskin Jan 2008 B2
7343306 Bates Mar 2008 B1
7348895 Lagassey Mar 2008 B2
7349027 Endo Mar 2008 B2
7370261 Winarski May 2008 B2
7382933 Dorai Jun 2008 B2
7386376 Basir Jun 2008 B2
7389178 Raz Jun 2008 B2
7457693 Olsen Nov 2008 B2
7471189 Vastad Dec 2008 B2
7471192 Hara Dec 2008 B2
7536457 Miller May 2009 B2
7548586 Mimar Jun 2009 B1
7561054 Raz Jul 2009 B2
7584033 Mittelsteadt Sep 2009 B2
7623754 McKain Nov 2009 B1
7659827 Gunderson Feb 2010 B2
7659835 Jung Feb 2010 B2
7667731 Kreiner Feb 2010 B2
7689001 Kim Mar 2010 B2
7702442 Takenaka Apr 2010 B2
7725216 Kim May 2010 B2
7768548 Silvernail Aug 2010 B2
7769499 McQuade Aug 2010 B2
7783956 Ko Aug 2010 B2
7804426 Etcheson Sep 2010 B2
7821421 Tamir Oct 2010 B2
7845560 Emanuel Dec 2010 B2
7853376 Peng Dec 2010 B2
7868912 Venetianer Jan 2011 B2
7893958 DAgostino Feb 2011 B1
7940250 Forstall May 2011 B2
7941258 Mittelsteadt May 2011 B1
7974748 Goerick Jul 2011 B2
8054168 McCormick Nov 2011 B2
8068979 Breed Nov 2011 B2
8090598 Bauer Jan 2012 B2
8113844 Huang Feb 2012 B2
8139820 Plante Mar 2012 B2
8140265 Grush Mar 2012 B2
8140358 Ling Mar 2012 B1
8152198 Breed Apr 2012 B2
8239092 Plante Aug 2012 B2
8269617 Cook Sep 2012 B2
8311858 Everett Nov 2012 B2
8314708 Gunderson Nov 2012 B2
8321066 Becker Nov 2012 B2
8373567 Denson Feb 2013 B2
8417562 Siemens Apr 2013 B1
8442690 Goldstein May 2013 B2
8471701 Yariv Jun 2013 B2
8508353 Cook Aug 2013 B2
8538696 Cassanova Sep 2013 B1
8538785 Coleman Sep 2013 B2
8564426 Cook Oct 2013 B2
8564446 Gunderson Oct 2013 B2
8571755 Plante Oct 2013 B2
8577703 McClellan Nov 2013 B2
8606492 Botnen Dec 2013 B1
8634958 Chiappetta Jan 2014 B1
8635557 Geise Jan 2014 B2
8676428 Richardson Mar 2014 B2
8744642 Nemat-Nasser Jun 2014 B2
8775067 Cho Jul 2014 B2
8781292 Ross Jul 2014 B1
8803695 Denson Aug 2014 B2
8805110 Rhoads Aug 2014 B2
8849501 Cook Sep 2014 B2
8855847 Uehara Oct 2014 B2
8862395 Richardson Oct 2014 B2
8868288 Plante Oct 2014 B2
8880279 Plante Nov 2014 B2
8892310 Palmer Nov 2014 B1
8989959 Plante Mar 2015 B2
8996234 Tamari Mar 2015 B1
8996240 Plante Mar 2015 B2
9047721 Botnen Jun 2015 B1
9085362 Kilian Jul 2015 B1
9183679 Plante Nov 2015 B2
9201842 Plante Dec 2015 B2
9208129 Plante Dec 2015 B2
9226004 Plante Dec 2015 B1
9240079 Lambert Jan 2016 B2
9607526 Hsu-Hoffman Mar 2017 B1
20010005217 Hamilton Jun 2001 A1
20010005804 Rayner Jun 2001 A1
20010018628 Jenkins Aug 2001 A1
20010020204 Runyon Sep 2001 A1
20010052730 Baur Dec 2001 A1
20020019689 Harrison Feb 2002 A1
20020027502 Mayor Mar 2002 A1
20020029109 Wong Mar 2002 A1
20020035422 Sasaki Mar 2002 A1
20020044225 Rakib Apr 2002 A1
20020059453 Eriksson May 2002 A1
20020061758 Zarlengo May 2002 A1
20020067076 Talbot Jun 2002 A1
20020087240 Raithel Jul 2002 A1
20020091473 Gardner Jul 2002 A1
20020105438 Forbes Aug 2002 A1
20020107619 Klausner Aug 2002 A1
20020111725 Burge Aug 2002 A1
20020111756 Modgil Aug 2002 A1
20020118206 Knittel Aug 2002 A1
20020120374 Douros Aug 2002 A1
20020135679 Scaman Sep 2002 A1
20020138587 Koehler Sep 2002 A1
20020163532 Thomas Nov 2002 A1
20020169529 Kim Nov 2002 A1
20020169530 Laguer-Diaz Nov 2002 A1
20020183905 Maeda Dec 2002 A1
20030016753 Kim Jan 2003 A1
20030028298 Macky Feb 2003 A1
20030053433 Chun Mar 2003 A1
20030055557 Dutta Mar 2003 A1
20030065805 Barnes Apr 2003 A1
20030067541 Joao Apr 2003 A1
20030079041 Parrella Apr 2003 A1
20030080713 Kirmuss May 2003 A1
20030080878 Kirmuss May 2003 A1
20030081121 Kirmuss May 2003 A1
20030081122 Kirmuss May 2003 A1
20030081127 Kirmuss May 2003 A1
20030081128 Kirmuss May 2003 A1
20030081934 Kirmuss May 2003 A1
20030081935 Kirmuss May 2003 A1
20030095688 Kirmuss May 2003 A1
20030112133 Webb Jun 2003 A1
20030125854 Kawasaki Jul 2003 A1
20030144775 Klausner Jul 2003 A1
20030152145 Kawakita Aug 2003 A1
20030154009 Basir Aug 2003 A1
20030158638 Yakes Aug 2003 A1
20030177187 Levine Sep 2003 A1
20030187704 Hashiguchi Oct 2003 A1
20030191568 Breed Oct 2003 A1
20030195678 Betters Oct 2003 A1
20030214585 Bakewell Nov 2003 A1
20030220835 Barnes Nov 2003 A1
20030222880 Waterman Dec 2003 A1
20040008255 Lewellen Jan 2004 A1
20040033058 Reich Feb 2004 A1
20040039503 Doyle Feb 2004 A1
20040039504 Coffee Feb 2004 A1
20040044452 Bauer Mar 2004 A1
20040044592 Ubik Mar 2004 A1
20040054444 Abeska Mar 2004 A1
20040054513 Laird Mar 2004 A1
20040054689 Salmonsen Mar 2004 A1
20040064245 Knockeart Apr 2004 A1
20040070926 Boykin Apr 2004 A1
20040083041 Skeen Apr 2004 A1
20040088090 Wee May 2004 A1
20040103008 Wahlbin May 2004 A1
20040103010 Wahlbin May 2004 A1
20040104842 Drury Jun 2004 A1
20040111189 Miyazawa Jun 2004 A1
20040117638 Monroe Jun 2004 A1
20040135979 Hazelton Jul 2004 A1
20040138794 Saito Jul 2004 A1
20040145457 Schofield et al. Jul 2004 A1
20040153244 Kellum Aug 2004 A1
20040153362 Bauer Aug 2004 A1
20040167689 Bromley Aug 2004 A1
20040179600 Wells Sep 2004 A1
20040181326 Adams Sep 2004 A1
20040184548 Kerbiriou Sep 2004 A1
20040203903 Wilson Oct 2004 A1
20040209594 Naboulsi Oct 2004 A1
20040210353 Rice Oct 2004 A1
20040230345 Tzamaloukas Nov 2004 A1
20040230370 Tzamaloukas Nov 2004 A1
20040230373 Tzamaloukas Nov 2004 A1
20040230374 Tzamaloukas Nov 2004 A1
20040233284 Lesesky Nov 2004 A1
20040236474 Chowdhary Nov 2004 A1
20040243285 Gounder Dec 2004 A1
20040243308 Irish Dec 2004 A1
20040243668 Harjanto Dec 2004 A1
20040254689 Blazic Dec 2004 A1
20040254698 Hubbard Dec 2004 A1
20040267419 Jeng Dec 2004 A1
20050021199 Zimmerman Jan 2005 A1
20050043869 Funkhouser Feb 2005 A1
20050060070 Kapolka Mar 2005 A1
20050060071 Winner Mar 2005 A1
20050065682 Kapadia Mar 2005 A1
20050065716 Timko Mar 2005 A1
20050073585 Ettinger Apr 2005 A1
20050078423 Kim Apr 2005 A1
20050088291 Blanco Apr 2005 A1
20050099498 Lao May 2005 A1
20050100329 Lao May 2005 A1
20050102074 Kolls May 2005 A1
20050125117 Breed Jun 2005 A1
20050131585 Luskin Jun 2005 A1
20050131595 Luskin Jun 2005 A1
20050131597 Raz Jun 2005 A1
20050136949 Barnes Jun 2005 A1
20050137757 Phelan Jun 2005 A1
20050137796 Gumpinger Jun 2005 A1
20050146458 Carmichael Jul 2005 A1
20050149238 Stefani Jul 2005 A1
20050149259 Cherveny Jul 2005 A1
20050152353 Couturier Jul 2005 A1
20050159964 Sonnenrein Jul 2005 A1
20050166258 Vasilevsky Jul 2005 A1
20050168258 Poskatcheev Aug 2005 A1
20050171692 Hamblen Aug 2005 A1
20050174217 Basir Aug 2005 A1
20050182538 Phelan Aug 2005 A1
20050182824 Cotte Aug 2005 A1
20050185052 Raisinghani Aug 2005 A1
20050185936 Lao Aug 2005 A9
20050192749 Flann Sep 2005 A1
20050197748 Holst Sep 2005 A1
20050200714 Marchese Sep 2005 A1
20050203683 Olsen Sep 2005 A1
20050206741 Raber Sep 2005 A1
20050209776 Ogino Sep 2005 A1
20050212920 Evans Sep 2005 A1
20050216144 Baldassa Sep 2005 A1
20050228560 Doherty Oct 2005 A1
20050233805 Okajima Oct 2005 A1
20050251304 Cancellara Nov 2005 A1
20050256681 Brinton Nov 2005 A1
20050258942 Manasseh Nov 2005 A1
20050264691 Endo Dec 2005 A1
20050283284 Grenier Dec 2005 A1
20060001671 Kamijo Jan 2006 A1
20060007151 Ram Jan 2006 A1
20060011399 Brockway Jan 2006 A1
20060015233 Olsen Jan 2006 A1
20060022842 Zoladek Feb 2006 A1
20060025897 Shostak Feb 2006 A1
20060030986 Peng Feb 2006 A1
20060040239 Cummins Feb 2006 A1
20060047380 Welch Mar 2006 A1
20060053038 Warren Mar 2006 A1
20060055521 Blanco Mar 2006 A1
20060057543 Roald Mar 2006 A1
20060058950 Kato Mar 2006 A1
20060072792 Toda Apr 2006 A1
20060078853 Lanktree Apr 2006 A1
20060082438 Bazakos Apr 2006 A1
20060092043 Lagassey May 2006 A1
20060095175 DeWaal May 2006 A1
20060095199 Lagassey May 2006 A1
20060095349 Morgan May 2006 A1
20060103127 Lie May 2006 A1
20060106514 Liebl May 2006 A1
20060111817 Phelan May 2006 A1
20060122749 Phelan Jun 2006 A1
20060129578 Kim Jun 2006 A1
20060142913 Coffee Jun 2006 A1
20060143435 Kwon Jun 2006 A1
20060147187 Takemoto Jul 2006 A1
20060161960 Benoit Jul 2006 A1
20060168271 Pabari Jul 2006 A1
20060178793 Hecklinger Aug 2006 A1
20060180647 Hansen Aug 2006 A1
20060184295 Hawkins Aug 2006 A1
20060192658 Yamamura Aug 2006 A1
20060200008 Moore-Ede Sep 2006 A1
20060200305 Sheha Sep 2006 A1
20060204059 Ido Sep 2006 A1
20060209090 Kelly Sep 2006 A1
20060209840 Paatela Sep 2006 A1
20060212195 Veith Sep 2006 A1
20060215884 Ota Sep 2006 A1
20060226344 Werth Oct 2006 A1
20060229780 Underdahl Oct 2006 A1
20060242680 Johnson Oct 2006 A1
20060247833 Malhotra Nov 2006 A1
20060253307 Warren Nov 2006 A1
20060259218 Wu Nov 2006 A1
20060261931 Cheng Nov 2006 A1
20070001831 Raz Jan 2007 A1
20070005404 Raz Jan 2007 A1
20070027583 Tamir Feb 2007 A1
20070027726 Warren Feb 2007 A1
20070035632 Silvernail Feb 2007 A1
20070043487 Krzystofczyk Feb 2007 A1
20070100509 Piekarz May 2007 A1
20070120948 Fujioka May 2007 A1
20070124332 Ballesty May 2007 A1
20070127833 Singh Jun 2007 A1
20070132773 Plante Jun 2007 A1
20070135979 Plante Jun 2007 A1
20070135980 Plante Jun 2007 A1
20070136078 Plante Jun 2007 A1
20070142986 Alaous Jun 2007 A1
20070143499 Chang Jun 2007 A1
20070150138 Plante Jun 2007 A1
20070150140 Seymour Jun 2007 A1
20070173994 Kubo Jul 2007 A1
20070179691 Grenn Aug 2007 A1
20070183635 Weidhaas Aug 2007 A1
20070208494 Chapman Sep 2007 A1
20070213920 Igarashi Sep 2007 A1
20070216521 Guensler Sep 2007 A1
20070219685 Plante Sep 2007 A1
20070219686 Plante Sep 2007 A1
20070236474 Ramstein Oct 2007 A1
20070241874 Okpysh Oct 2007 A1
20070244614 Nathanson Oct 2007 A1
20070253307 Mashimo Nov 2007 A1
20070257781 Denson Nov 2007 A1
20070257782 Etcheson Nov 2007 A1
20070257804 Gunderson Nov 2007 A1
20070257815 Gunderson Nov 2007 A1
20070260677 DeMarco Nov 2007 A1
20070262855 Zuta Nov 2007 A1
20070263984 Sterner Nov 2007 A1
20070268158 Gunderson Nov 2007 A1
20070271105 Gunderson Nov 2007 A1
20070273480 Burkman Nov 2007 A1
20070279214 Buehler Dec 2007 A1
20070280677 Drake Dec 2007 A1
20070299612 Kimura Dec 2007 A1
20080035108 Ancimer Feb 2008 A1
20080059019 Delia Mar 2008 A1
20080071827 Hengel Mar 2008 A1
20080111666 Plante May 2008 A1
20080122603 Plante May 2008 A1
20080137912 Kim Jun 2008 A1
20080143834 Comeau Jun 2008 A1
20080147267 Plante et al. Jun 2008 A1
20080157510 Breed Jul 2008 A1
20080167775 Kuttenberger Jul 2008 A1
20080169914 Albertson Jul 2008 A1
20080177436 Fortson Jul 2008 A1
20080195261 Breed Aug 2008 A1
20080204556 deMiranda Aug 2008 A1
20080211779 Pryor Sep 2008 A1
20080234920 Nurminen Sep 2008 A1
20080243389 Inoue Oct 2008 A1
20080252412 Larsson Oct 2008 A1
20080252485 Lagassey Oct 2008 A1
20080252487 McClellan Oct 2008 A1
20080269978 Shirole Oct 2008 A1
20080281485 Plante Nov 2008 A1
20080309762 Howard et al. Dec 2008 A1
20080319604 Follmer Dec 2008 A1
20090009321 McClellan Jan 2009 A1
20090043500 Satoh Feb 2009 A1
20090043971 Kim Feb 2009 A1
20090051510 Follmer Feb 2009 A1
20090138191 Engelhard May 2009 A1
20090157255 Plante Jun 2009 A1
20090216775 Ratliff et al. Aug 2009 A1
20090224869 Baker Sep 2009 A1
20090290848 Brown Nov 2009 A1
20090299622 Denaro Dec 2009 A1
20090312998 Berckmans Dec 2009 A1
20090326796 Prokhorov Dec 2009 A1
20090327856 Mouilleseaux Dec 2009 A1
20100030423 Nathanson Feb 2010 A1
20100045451 Periwal Feb 2010 A1
20100047756 Schneider Feb 2010 A1
20100049516 Talwar Feb 2010 A1
20100054709 Misawa Mar 2010 A1
20100057342 Muramatsu Mar 2010 A1
20100063672 Anderson Mar 2010 A1
20100063680 Tolstedt Mar 2010 A1
20100063850 Daniel Mar 2010 A1
20100070175 Soulchin Mar 2010 A1
20100076621 Kubotani Mar 2010 A1
20100085193 Boss Apr 2010 A1
20100085430 Kreiner Apr 2010 A1
20100087984 Joseph Apr 2010 A1
20100100315 Davidson Apr 2010 A1
20100103165 Lee Apr 2010 A1
20100104199 Zhang Apr 2010 A1
20100149418 Freed Jun 2010 A1
20100153146 Angell Jun 2010 A1
20100157061 Katsman Jun 2010 A1
20100191411 Cook Jul 2010 A1
20100201875 Rood Aug 2010 A1
20100220892 Kawakubo Sep 2010 A1
20100250020 Lee Sep 2010 A1
20100250021 Cook Sep 2010 A1
20100250022 Hines Sep 2010 A1
20100250060 Maeda Sep 2010 A1
20100250116 Yamaguchi Sep 2010 A1
20100253918 Seder Oct 2010 A1
20100268415 Ishikawa Oct 2010 A1
20100283633 Becker Nov 2010 A1
20100312464 Fitzgerald Dec 2010 A1
20110035139 Konlditslotis Feb 2011 A1
20110043624 Haug Feb 2011 A1
20110060496 Nielsen Mar 2011 A1
20110077028 Wilkes Mar 2011 A1
20110091079 Yu-Song Apr 2011 A1
20110093159 Boling Apr 2011 A1
20110112995 Chang May 2011 A1
20110121960 Tsai May 2011 A1
20110125365 Larschan May 2011 A1
20110130916 Mayer Jun 2011 A1
20110140884 Santiago Jun 2011 A1
20110145042 Green Jun 2011 A1
20110153367 Amigo Jun 2011 A1
20110161116 Peak Jun 2011 A1
20110166773 Raz Jul 2011 A1
20110172864 Syed Jul 2011 A1
20110173015 Chapman Jul 2011 A1
20110208428 Matsubara Aug 2011 A1
20110212717 Rhoads Sep 2011 A1
20110213628 Peak Sep 2011 A1
20110224891 Iwuchukwu Sep 2011 A1
20110251752 DeLarocheliere Oct 2011 A1
20110251782 Perkins Oct 2011 A1
20110254676 Marumoto Oct 2011 A1
20110257882 McBurney Oct 2011 A1
20110273568 Lagassey Nov 2011 A1
20110282542 Nielsen Nov 2011 A9
20110283223 Vaittinen et al. Nov 2011 A1
20110304446 Basson Dec 2011 A1
20120021386 Anderson Jan 2012 A1
20120035788 Trepagnier Feb 2012 A1
20120041675 Juliver Feb 2012 A1
20120046803 Inou Feb 2012 A1
20120071140 Oesterling Mar 2012 A1
20120078063 Moore-Ede Mar 2012 A1
20120081567 Cote Apr 2012 A1
20120100509 Gunderson Apr 2012 A1
20120109447 Yousefi May 2012 A1
20120123806 Schumann May 2012 A1
20120134547 Jung May 2012 A1
20120150436 Rossano Jun 2012 A1
20120176234 Taneyhill Jul 2012 A1
20120190001 Knight Jul 2012 A1
20120198317 Eppolito Aug 2012 A1
20120210252 Fedoseyeva Aug 2012 A1
20120277950 Plante Nov 2012 A1
20120280835 Raz Nov 2012 A1
20120283895 Noda Nov 2012 A1
20120330528 Schwindt Dec 2012 A1
20130004138 Kilar Jan 2013 A1
20130006469 Green Jan 2013 A1
20130021148 Cook Jan 2013 A1
20130028320 Gardner Jan 2013 A1
20130030660 Fujimoto Jan 2013 A1
20130073112 Phelan Mar 2013 A1
20130073114 Nemat-Nasser Mar 2013 A1
20130096731 Tamari Apr 2013 A1
20130127980 Haddick May 2013 A1
20130145269 Latulipe Jun 2013 A1
20130151980 Lee Jun 2013 A1
20130170762 Marti Jul 2013 A1
20130197774 Denson Aug 2013 A1
20130209968 Miller Aug 2013 A1
20130274950 Richardson Oct 2013 A1
20130278631 Border Oct 2013 A1
20130317711 Plante Nov 2013 A1
20130332004 Gompert et al. Dec 2013 A1
20130345927 Cook Dec 2013 A1
20130345929 Bowden Dec 2013 A1
20140025225 Armitage Jan 2014 A1
20140025254 Plante Jan 2014 A1
20140032062 Baer Jan 2014 A1
20140046550 Palmer Feb 2014 A1
20140047371 Palmer Feb 2014 A1
20140058583 Kesavan Feb 2014 A1
20140089504 Scholz Mar 2014 A1
20140094992 Lambert Apr 2014 A1
20140098228 Plante Apr 2014 A1
20140152828 Plante Jun 2014 A1
20140226010 Molin Aug 2014 A1
20140232863 Paliga Aug 2014 A1
20140279707 Joshua Sep 2014 A1
20140280204 Avery Sep 2014 A1
20140300739 Mimar Oct 2014 A1
20140309849 Ricci Oct 2014 A1
20140335902 Guba Nov 2014 A1
20140336916 Yun Nov 2014 A1
20150057836 Plante Feb 2015 A1
20150105934 Palmer Apr 2015 A1
20150112542 Fuglewicz Apr 2015 A1
20150112545 Binion Apr 2015 A1
20150134226 Palmer May 2015 A1
20150135240 Shibuya May 2015 A1
20150156174 Fahey Jun 2015 A1
20150189042 Sun Jul 2015 A1
20150222449 Salinger Aug 2015 A1
20150317846 Plante Nov 2015 A1
20160054733 Hollida Feb 2016 A1
Foreign Referenced Citations (68)
Number Date Country
2469728 Dec 2005 CA
2469728 Dec 2005 CA
2692415 Aug 2011 CA
2692415 Aug 2011 CA
4416991 Nov 1995 DE
20311262 Sep 2003 DE
202005008238 Sep 2005 DE
102004004669 Dec 2005 DE
102004004669 Dec 2005 DE
0708427 Apr 1996 EP
0840270 May 1998 EP
0848270 May 1998 EP
1170697 Jan 2002 EP
1324274 Jul 2003 EP
1355278 Oct 2003 EP
1427165 Jun 2004 EP
1818873 Aug 2007 EP
2104075 Sep 2009 EP
2320387 May 2011 EP
2407943 Jan 2012 EP
2268608 Jan 1994 GB
2402530 Dec 2004 GB
2402530 Dec 2004 GB
2451485 Feb 2009 GB
2447184 Jun 2011 GB
2446994 Aug 2011 GB
58085110 May 1983 JP
S5885110 May 1983 JP
62091092 Apr 1987 JP
S6291092 Apr 1987 JP
S62166135 Jul 1987 JP
02056197 Feb 1990 JP
H0256197 Feb 1990 JP
H04257189 Sep 1992 JP
H05137144 Jun 1993 JP
5294188 Nov 1993 JP
H08124069 May 1996 JP
H09163357 Jun 1997 JP
H09272399 Oct 1997 JP
10076880 Mar 1998 JP
H1076880 Mar 1998 JP
2002191017 Jul 2002 JP
2002191017 Jul 2002 JP
1000588169 Dec 2000 KR
8809023 Nov 1988 WO
9005076 May 1990 WO
9427844 Dec 1994 WO
9600957 Jan 1996 WO
9701246 Jan 1997 WO
9726750 Jul 1997 WO
9937503 Jul 1999 WO
9940545 Aug 1999 WO
9962741 Dec 1999 WO
0007150 Feb 2000 WO
0028410 May 2000 WO
0048033 Aug 2000 WO
0077620 Dec 2000 WO
0123214 Apr 2001 WO
0125054 Apr 2001 WO
0146710 Jun 2001 WO
03045514 Jun 2003 WO
2006022824 Mar 2006 WO
2006022824 Mar 2006 WO
2007067767 Jun 2007 WO
2009081234 Jul 2009 WO
2011055743 May 2011 WO
2013072939 May 2013 WO
2013159853 Oct 2013 WO
Non-Patent Literature Citations (197)
Entry
DriveCam, Inc.'s Infringement Contentions Exhibit B, U.S. Pat. No. 7,659,827. Aug. 19, 2011. (29 pgs.).
DriveCam, Inc.'s Infringement Contentions Exhibit C, U.S. Pat. No. 7,804,426. Aug. 19, 2011. (47 pgs.).
DriveCam's Disclosure of Asserted Claims and Preliminary Infringement Contentions in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H-RBB, for the Southern District of California. Aug. 19, 2011. (6 pgs.).
Preliminary Claim Construction and Identification of Extrinsic Evidence of Defendant/Counterclaimant SmartDriveSystems, Inc.' in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H (RBB), for the Southern District of California. Nov. 8, 2011. (13 pgs.).
Supplement to DriveCam's Disclosure of Asserted Claims and Preliminary Infringement Contentions in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H-RBB, for the Southern District of California. Oct. 14, 2011. (7 pgs.).
USPTO Non-Final Office Action mailed Jan. 4, 2016 in U.S. Appl. No. 14/529,134, filed Oct. 30, 2014 (65 pgs).
Notice of Allowance for U.S. Appl. No. 13/957,810, mailed Jun. 8, 2015, 10 pages.
Adaptec published and sold its VideoOh! DVD software USB 2.0 Edition in at least Jan. 24, 2003.
Ambulance Companies Use Video Technology to Improve Driving Behavior, Ambulance Industry Journal, Spring 2003.
Amended Complaint for Patent Infringement, Trade Secret Misappropriation, Unfair Competition and Conversion in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H-RBB, for the Southern District of California, Document 34, filed Oct. 20, 2011, pp. 1-15.
Amendment filed Dec. 23, 2009 during prosecution of U.S. Appl. No. 11/566,424.
Answer to Amended Complaint; Counterclaims; and Demand for Jury Trial in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997 H (RBB), for the Southern District of California, Document 47, filed Dec. 13, 2011, pp. 1-15.
U.S. Appl. No. 11/296,906, filed Dec. 8, 2005, File History.
U.S. Appl. No. 11/297,669, filed Dec. 8, 2005, File History.
U.S. Appl. No. 11/297,889, filed Dec. 8, 2005, File History.
U.S. Appl. No. 11/298,069, filed Dec. 9, 2005, File History.
U.S. Appl. No. 11/299,028, filed Dec. 9, 2005, File History.
U.S. Appl. No. 11/593,659, filed Nov. 7, 2006, File History.
U.S. Appl. No. 11/593,682, filed Nov. 7, 2006, File History.
U.S. Appl. No. 11/593,882, filed Nov. 7, 2006, File History.
U.S. Appl. No. 11/595,015, filed Nov. 9, 2006, File History.
U.S. Appl. No. 11/637,754, filed Dec. 13, 2006, File History.
U.S. Appl. No. 11/637,755, filed Dec. 13, 2006, File History.
Bill, ‘DriveCam—FAQ’, Dec. 12, 2003.
Bill Siuru, ‘DriveCam Could Save You Big Bucks’, Land Line Magazine, May-Jun. 2000.
Chris Woodyard, ‘Shuttles save with DriveCam’, Dec. 9, 2003.
Dan Carr, Flash Video Template: Video Presentation with Navigation, Jan. 16, 2006, http://www.adobe.com/devnet/flash/articles/vidtemplate_mediapreso_flash8.html.
David Cullen, ‘Getting a real eyeful’, Fleet Owner Magazine, Feb. 2002.
David Maher, ‘DriveCam Brochure Folder’, Jun. 6, 2005.
David Maher, “DriveCam Brochure Folder”, Jun. 8, 2005.
David Vogeleer et al., Macromedia Flash Professional 8 Unleashed (Sams, Oct. 12, 2005).
Del Lisk, ‘DriveCam Training Handout Ver4’, Feb. 3, 2005.
DriveCam, Inc., User's Manual for DriveCam Video Systems' HindSight 20/20 Software Version 4.0 (2003).
DriveCam, Inc.'s Infringement Contentions Exhibit A, U.S. Pat. No. 6,389,340, Document 34.1, Oct. 20, 2011.
DriveCam, Inc.'s Infringement Contentions Exhibit B, U.S. Pat. No. 7,659,827. Aug. 19, 2011.
DriveCam, Inc.'s Infringement Contentions Exhibit B, U.S. Pat. No. 7,804,426, Document 34.2, Oct. 20, 2011.
DriveCam, Inc.'s Infringement Contentions Exhibit C, U.S. Pat. No. 7,659,827, Document 34.3, Oct. 20, 2011.
DriveCam, Inc.'s Infringement Contentions Exhibit C, U.S. Pat. No. 7,804,426. Aug. 19, 2011.
DriveCam, Inc.'s Infringement Contentions Exhibit D, Document 34.4, Oct. 20, 2011.
DriveCam—Illuminator Data Sheet, Oct. 2, 2004.
Drivecam.com as retrieved by the Internet Wayback Machine as of Mar. 5, 2005.
DriveCam's Disclosure of Asserted Claims and Preliminary Infringement Contentions in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H-RBB, for the Southern District of California. Aug. 19, 2011.
DriveCam Driving Feedback System, Mar. 15, 2004.
DriveCam Extrinsic Evidence with Patent LR 4.1.a Disclosures, Nov. 3, 2011.
DriveCam Extrinsic Evidence with Patent LR 4.1.a Disclosures, Nov. 8, 2011.
Driver Feedback System, Jun. 12, 2001.
First Amended Answer to Amended Complaint and First Amended Counterclaims; and Demand for Jury Trial in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997 H (RBB), for the Southern District of California, Document 53, filed Dec. 20, 2011, pp. 1-48.
First Amended Answer to Amended Complaint and First Amended Counterclaims; and Demand for Jury Trial in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997 H (RBB), for the Southern District of California, Document 55, filed Jan. 1, 2012, pp. 86-103.
First Amended Answer to Amended Complaint and First Amended Counterclaims; and Demand for Jury Trial in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997 H (RBB), for the Southern District of California, Document 55, filed Jan. 3, 2012, pp. 86-103.
Trivinci Systems, LLC, “Race-Keeper System User Guide”, v1.1.02, Jan. 2011, p. 21.
First Amended Answer to Amended Complaint and First Amended Counterclaims; and Demand for Jury Trial in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997 H (RBB), for the Southern District of California, Exhibit A, Document 55, filed Jan. 3, 2012, pp. 49-103.
Franke, U., et al., Autonomous Driving Goes Downtown, IEEE Intelligent Systems, 13(6):40-48 (1998); Digital Object Identifier 10.1109/5254.736001.
Gallagher, B., et al., Wireless Communications for Vehicle Safety: Radio Link Performance and Wireless Connectivity Methods, Vehicular Technology Magazine, IEEE, 1(4):4-24 (2006); Digital Object Identifier 10.1109/MVT.2006.343641.
Gandhi, T., et al., Pedestrian Protection Systems: Issues, Survey, and Challenges, IEEE Transactions on Intelligent Transportation Systems, 8(3):413-430 (2007); Digital Object Identifier 10.1109/TITS.2007.903444.
Gary and Sophia Rayner, Final Report for Innovations Deserving Exploratory Analysis (IDEA) Intelligent Transportation Systems (ITS) Programs' Project 84, I-Witness Black Box Recorder, San Diego, CA. Nov. 2001.
GE published its VCR User's Guide for Model VG4255 in 1995.
Glenn Oster, ‘Hindsight 20/20 v4.0 Software Installation’, 1 of 2, Jun. 20, 2003.
Glenn Oster, ‘HindSight 20/20 v4.0 Software Installation’, 2 of 2, Jun. 20, 2003.
Glenn Oster, ‘Illuminator Installation’, Oct. 3, 2004.
Hans Fantel, Video; Search Methods Make a Difference in Picking VCR's, NY Times, Aug. 13, 1989.
I/O Port Racing Supplies' website discloses using Traqmate's Data Acquisition with Video Overlay system in conjunction with professional driver coaching sessions (available at http://www.ioportracing.com/Merchant2/merchant.mvc?Screen=CTGY&Category_Code=coaching), printed from site on Jan. 11, 2012.
Innovate Motorsports, OT-1 16 Channel OBD-II Interface User Manual, Version 1.0, Nov. 28, 2007, pp. 3, 4, 21 & 27.
Interior Camera Data Sheet, Oct. 26, 2001.
International Search Report and Written Opinion issued in PCT/US07/68325 on Feb. 27, 2008.
International Search Report and Written Opinion issued in PCT/US07/68328 on Oct. 15, 2007.
International Search Report and Written Opinion issued in PCT/US07/68329 on Mar. 3, 2008.
International Search Report and Written Opinion issued in PCT/US07/68332 on Mar. 3, 2008.
International Search Report and Written Opinion issued in PCT/US07/68334 on Mar. 5, 2008.
International Search Report for PCT/US2006/47055, Mailed Mar. 20, 2008 (2 pages).
International Search Report issued in PCT/US2006/47042 mailed Feb. 25, 2008.
J. Gallagher, ‘Lancer Recommends Tech Tool’, Insurance and Technology Magazine, Feb. 2002.
Jean (DriveCam vendor), ‘DC Data Sheet’, Nov. 6, 2002.
Jean (DriveCam vendor), ‘DriveCam brochure’, Nov. 6, 2002.
Jean (DriveCam vendor), ‘Feedback Data Sheet’, Nov. 6, 2002.
Jean (DriveCam vendor), ‘Hindsight 20-20 Data Sheet’, Nov. 4, 2002.
Jessyca Wallace, ‘Analyzing and Processing DriveCam Recorded Events’, Oct. 6, 2003.
Jessyca Wallace, ‘Overview of the DriveCam Program’, Dec. 15, 2005.
Jessyca Wallace, ‘The DriveCam Driver Feedback System’, Apr. 6, 2004.
Joint Claim Construction Chart, U.S. Pat. No. 6,389,340, ‘Vehicle Data Recorder’ for Case No. 3:11-CV-00997-H-RBB, Document 43-1, filed Dec. 1, 2011, pp. 1-33.
Joint Claim Construction Chart in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 11-CV-0997-H (RBB), for the Southern District of California, Document 43, filed Dec. 1, 2011, pp. 1-2.
Joint Claim Construction Worksheet, U.S. Pat. No. 6,389,340, ‘Vehicle Data Reporter’ for Case No. 3:11-CV-00997-H-RBB, Document 44-1, filed Dec. 1, 2011, pp. 1-10.
Joint Claim Construction Worksheet, U.S. Pat. No. 6,389,340, “Vehicle Data Reporter” for Case No. 3:11-CV-00997-H-RBB, Document 44-1, filed Dec. 1, 2011, pp. 1-10.
Joint Claim Construction Worksheet in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997 H (RBB), for the Southern District of California, Document 44, filed Dec. 1, 2011, pp. 1-2.
Joint Motion for Leave to Supplement Disclosure of Asserted Claims and Preliminary Infringement Contentions in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-cv-00997-H-RBB, Document 29, filed Oct. 12, 2011, pp. 1-7.
Julie Stevens, ‘DriveCam Services’, Nov. 15, 2004.
Julie Stevens, ‘Program Support Roll-Out & Monitoring’, Jul. 13, 2004.
Jung, Sang-Hack, et al., Egomotion Estimation in Monocular Infra-red Image Sequence for Night Vision Applications, IEEE Workshop on Applications of Computer Vision (WACV '07), Feb. 2007, pp. 8-8; Digital Object Identifier 10.1109/WACV.2007.20.
JVC Company of America, JVC Video Cassette Recorder HR-IP820U Instructions (1996).
Kamijo, S., et al., A Real-Time Traffic Monitoring System by Stochastic Model Combination, IEEE International Conference on Systems, Man and Cybernetics, 4:3275-3281 (2003).
Kamijo, S., et al., An Incident Detection System Based on Semantic Hierarchy, Proceedings of the 7th International IEEE Intelligent Transportation Systems Conference, Oct. 3-6, 2004, pp. 853-858; Digital Object Identifier 10.1109/ITSC.2004.1399015.
Karen, ‘Downloading Options to HindSight 20/20’, Aug. 6, 2002.
Karen, ‘Managers Guide to the DriveCam Driving Feedback System’, Jul. 30, 2002.
Kathy Latus (Latus Design), ‘Case Study—Cloud 9 Shuttle’, Sep. 23, 2005.
Kathy Latus (Latus Design), ‘Case Study—Lloyd Pest Control’, Jul. 19, 2005.
Kathy Latus (Latus Design), ‘Case Study—Time Warner Cable’, Sep. 23, 2005.
Ki, Yong-Kul, et al., A Traffic Accident Detection Model using Metadata Registry, Proceedings of the Fourth International Conference on Software Engineering Research, Management and Applications, Aug. 9-11, 2006, pp. 255-259; Digital Object Identifier 10.1109/SERA.2006.8.
Kitchin, Charles. “Understanding accelerometer scale factor and offset adjustments.” Analog Devices (1995).
Lin, Chin-Teng et al., EEG-based drowsiness estimation for safety driving using independent component analysis; IEEE Transactions on Circuits and Systems—I: Regular Papers, 52(12):2726-2738 (2005); Digital Object Identifier 10.1109/TCSI.2005.857555.
Lisa McKenna, ‘A Fly on the Windshield?’, Pest Control Technology Magazine, Apr. 2003.
Miller, D.P., Evaluation of Vision Systems for Teleoperated Land Vehicles, Control Systems Magazine, IEEE, 8(3):37-41 (1988); Digital Object Identifier 10.1109/37.475.
Munder, S., et al., Pedestrian Detection and Tracking Using a Mixture of View-Based Shape-Texture Models, IEEE Transactions on Intelligent Transportation Systems, 9(2):333-343 (2008); Digital Object Identifier 10.1109/TITS.2008.922943.
Panasonic Corporation, Video Cassette Recorder (VCR) Operating Instructions for Models No. PV-V4020/PV-V4520.
Passenger Transportation Mode Brochure, May 2, 2005.
Patent Abstracts of Japan vol. 007, No. 180 (P-215), Aug. 9, 1983 (Aug. 9, 1983) JP 58 085110 A (Mitsuhisa Ichikawa), May 21, 1983 (May 21, 1983).
Patent Abstracts of Japan vol. 011, No. 292 (E-543), Sep. 19, 1987 (Sep. 19, 1987) JP 62 091092 A (OK ENG:KK), Apr. 25, 1987 (Apr. 25, 1987).
Patent Abstracts of Japan vol. 012, No. 001 (M-656), Jan. 6, 1988 (Jan. 6, 1988) JP 62 166135 A (Fuji Electric Co Ltd), Jul. 22, 1987 (Jul. 22, 1987).
Patent Abstracts of Japan vol. 014, No. 222 (E-0926), May 10, 1990 (May 10, 1990) JP 02 056197 A (Sanyo Electric Co Ltd), Feb. 26, 1990 (Feb. 26, 1990).
Patent Abstracts of Japan vol. 017, No. 039 (E-1311), Jan. 25, 1993 (Jan. 25, 1993) JP 04 257189 A (Sony Corp), Sep. 11, 1992 (Sep. 11, 1992).
Patent Abstracts of Japan vol. 017, No. 521 (E-1435), Sep. 20, 1993 (Sep. 20, 1993) JP 05 137144 A (Kyocera Corp), Jun. 1, 1993 (Jun. 1, 1993).
Patent Abstracts of Japan vol. 1996, No. 09, Sep. 30, 1996 (Sep. 30, 1996) JP 08 124069 A (Toyota Motor Corp), May 17, 1996 (May 17, 1996).
Patent Abstracts of Japan vol. 1997, No. 10, Oct. 31, 1997 (Oct. 31, 1997) JP 09 163357 A (Nippon Soken Inc), Jun. 20, 1997 (Jun. 20, 1997).
Patent Abstracts of Japan vol. 1998, No. 02, Jan. 30, 1998 (Jan. 30, 1998) JP 09 272399 A (Nippon Soken Inc), Oct. 21, 1997 (Oct. 21, 1997).
Patent Abstracts of Japan vol. 1998, No. 8, Jun. 30, 1998 (Jun. 30, 1998) JP 10 076880 A (Murakami Corp), Mar. 24, 1998 (Mar. 24, 1998).
PCT/US2010/022012, Invitation to Pay Additional Fees with Communication of Partial International Search, Jul. 21, 2010.
Peter G. Thurlow, Letter (including exhibits) Regarding Patent Owner's Response to Initial Office Action in Ex Parte Reexamination, Mar. 27, 2012.
Preliminary Claim Construction and Identification of Extrinsic Evidence of Defendant/Counterclaimant SmartDrive Systems, Inc. in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H (RBB), for the Southern District of California. Nov. 8, 2011.
Quinn Maughan, ‘DriveCam Enterprise Services’, Jan. 5, 2006.
Quinn Maughan, ‘DriveCam Managed Services’, Jan. 5, 2006.
Quinn Maughan, ‘DriveCam Standard Edition’, Jan. 5, 2006.
Quinn Maughan, ‘DriveCam Unit Installation’, Jul. 21, 2005.
Quinn Maughan, ‘Enterprise Services’, Apr. 17, 2006.
Quinn Maughan, ‘Enterprise Services’, Apr. 7, 2006.
Quinn Maughan, ‘Hindsight Installation Guide’, Sep. 29, 2005.
Quinn Maughan, ‘Hindsight Users Guide’, Jun. 7, 2005.
Ronnie Rittenberry, ‘Eyes on the Road’, Jul. 2004.
SmartDrive Systems, Inc.'s Production, S014246-S014255, Nov. 16, 2011.
Supplement to DriveCam's Disclosure of Asserted Claims and Preliminary Infringement Contentions in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H-RBB, for the Southern District of California. Oct. 14, 2011.
The DriveCam, Nov. 6, 2002.
The DriveCam, Nov. 8, 2002.
Traqmate GPS Data Acquisition's Traqmate Data Acquisition with Video Overlay system was used to create a video of a driving event on Oct. 2, 2005 (available at http://www.trackvision.net/phpBB2/viewtopic.php?t=51&sid=1184fbbcbe3be5c87ffa0f2ee6e2da76), printed from site on Jan. 11, 2012.
Trivinci Systems, LLC, Race-Keeper System User Guide, v1.1.02, Jan. 2011, pp. 34 and 39.
U.S. Appl. No. 12/691,639, entitled ‘Driver Risk Assessment System and Method Employing Selectively Automatic Event Scoring’, filed Jan. 21, 2010.
U.S. Appl. No. 11/377,167, Final Office Action dated Nov. 8, 2013.
U.S. Appl. No. 11/377,157, filed Mar. 16, 2006 entitled, “Vehicle Event Recorder Systems and Networks Having Parallel Communications Links”.
U.S. Appl. No. 11/377,167, filed Mar. 16, 2006 entitled, “Vehicle Event Recorder Systems and Networks Having Integrated Cellular Wireless Communications Systems”.
USPTO Final Office Action for U.S. Appl. No. 11/297,669, mailed Nov. 7, 2011, 15 pages.
USPTO Final Office Action for U.S. Appl. No. 13/957,810, mailed Jun. 27, 2014, 24 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/296,906, mailed Apr. 2, 2009, 7 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/296,906, mailed Nov. 6, 2009, 9 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/297,669, mailed Apr. 28, 2011, 11 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/299,028, mailed Apr. 24, 2008, 9 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,164, mailed Nov. 19, 2007, 7 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,164, mailed Nov. 25, 2011, 9 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,164, mailed Sep. 11, 2008, 8 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,167, mailed Jun. 5, 2008, 11 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/800,876, mailed Dec. 1, 2010, 12 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/800,876, mailed Dec. 20, 2011, 8 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 12/096,591, mailed May 20, 2014, 19 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 14/036,299, mailed Aug. 12, 2014.
USPTO Non-Final Office Action for U.S. Appl. No. 11/296,907, Mailed Mar. 22, 2007 (17 pages).
USPTO Non-final Office Action mailed Aug. 27, 2009 during prosecution of U.S. Appl. No. 11/566,424.
USPTO Non-Final Office Action mailed Nov. 27, 2013 in U.S. Appl. No. 13/957,810, filed Aug. 2, 2013.
Veeraraghavan, H., et al., Computer Vision Algorithms for Intersection Monitoring, IEEE Transactions on Intelligent Transportation Systems, 4(2):78-89 (2003); Digital Object Identifier 10.1109/TITS.2003.821212.
Wijesoma, W.S., et al., Road Curb Tracking in an Urban Environment, Proceedings of the Sixth International Conference of Information Fusion, 1:261-268 (2003).
World News Tonight, CBC Television News Program discussing teen drivers using the DriveCam Program and DriveCam Technology, Oct. 10, 2005, on PC formatted CD-R, World News Tonight.wmv, 7.02 MB, Created Jan. 12, 2011.
Written Opinion issued in PCT/US07/68328 on Oct. 15, 2007.
Written Opinion of the International Searching Authority for PCT/US2006/47042, Mailed Feb. 25, 2008 (5 pages).
Written Opinion of the International Searching Authority for PCT/US2006/47055, Mailed Mar. 20, 2008 (5 pages).
PCT International Search Report and Written Opinion for PCT/IB16/51863, dated Sep. 16, 2016.
“DriveCam, Inc.'s Disclosure of Proposed Constructions and Extrinsic Evidence Pursuant to Patent L.R. 4.1.a & 4.1.b” Disclosure and Extrinsic Evidence in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H-RBB, for the Southern District of California. Nov. 8, 2011.
“DriveCam Driving Feedback System”, DriveCam brochure, Jun. 12, 2001, Document #6600128.
“DriveCam Driving Feedback System” DriveCam brochure, Mar. 15, 2004.
“DriveCam Passenger Transportation Module”, DriveCam brochure, Oct. 26, 2001.
“DriveCam Video Event Data Recorder”, DriveCam brochure, Nov. 6, 2002, Document #6600127.
“Responsive Claim Construction and Identification of Extrinsic Evidence of Defendant/Counterclaimant SmartDrive Systems, Inc.” Claim Construction and Extrinsic Evidence in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H (RBB), for the Southern District of California. Nov. 15, 2011.
“Sonic MyDVD 4.0: Tutorial: Trimming video segments”. Tutorial for software bundled with Adaptec VideoOh! DVD USB 2.0 Edition, 2003.
“User's Manual for DriveCam Video Systems' HindSight 20/20 Software Version 4.0” DriveCam Manual, San Diego, 2003, Document #6600141-1.
Canadian Office Action issued in Application No. 2,632,685 dated Jan. 30, 2015; 5 pages.
Dan Maher, “DriveCam Taking Risk Out of Driving”, DriveCam brochure folder, Jun. 6, 2005.
Del Lisk, “DriveCam Training Seminar” Handout, 2004.
European Examination Report issued in EP 07772812.9 on Jan. 22, 2015; 5 pages.
Jean (DriveCam vendor) “DriveCam Driving Feedback System”, DriveCam brochure, Nov. 6, 2002, Document #6600128-1.
Notice of Allowance for U.S. Appl. No. 14/036,299, mailed Mar. 20, 2015, xx pages.
Notice of Allowance for U.S. Appl. No. 11/566,424, mailed Feb. 26, 2010, 6 pages.
Notice of Allowance for U.S. Appl. No. 11/377,164, mailed Dec. 3, 2014, 5 pages.
Notice of Allowance for U.S. Appl. No. 11/377,164, mailed Feb. 13, 2015, 2 pages.
Notice of Allowance for U.S. Appl. No. 11/377,164, mailed Feb. 25, 2014, 2 pages.
Notice of Allowance for U.S. Appl. No. 11/377,164, mailed Nov. 18, 2013, 7 pages.
Notice of Allowance for U.S. Appl. No. 11/377,167, mailed Apr. 1, 2015, 7 pages.
Notice of Allowance for U.S. Appl. No. 11/800,876, mailed Apr. 19, 2012, 8 pages.
USPTO Final Office Action for U.S. Appl. No. 11/296,906, mailed Aug. 8, 2012, 15 pages.
USPTO Final Office Action for U.S. Appl. No. 12/096,591, mailed Dec. 5, 2014, 23 pages.
USPTO Final Office Action for U.S. Appl. No. 12/096,591, mailed Jul. 18, 2012, 15 pages.
USPTO Final Office Action for U.S. Appl. No. 12/096,591, mailed Nov. 7, 2013, 14 pages.
USPTO Final Office Action for U.S. Appl. No. 13/957,810, mailed Jun. 27, 2014, 22 pages.
USPTO Final Office Action for U.S. Appl. No. 14/036,299, mailed Feb. 24, 2015, 9 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/296,906, mailed Apr. 8, 2014, 19 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/296,906, mailed Jun. 12, 2012, 13 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,164, mailed Apr. 7, 2014, 7 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,164, mailed Aug. 18, 2014, 5 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,164, mailed Sep. 10, 2012, 10 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,167, mailed Jun. 27, 2013, 11 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 12/096,591, mailed Jun. 14, 2011, 8 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 12/096,591, mailed Mar. 27, 2013, 16 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 13/957,810, mailed Apr. 17, 2015, 6 pages.
USPTO Non-final Office Action for U.S. Appl. No. 13/957,810, mailed Nov. 27, 2013, 18 pages.
PCT International Search Report and Written Opinion for PCT/US15/60721 dated Feb. 26, 2016, 11 pages.
Related Publications (1)
Number Date Country
20150035665 A1 Feb 2015 US
Continuations (1)
Number Date Country
Parent 11595015 Nov 2006 US
Child 14516650 US