Rail vehicle event detection and recording system

Information

  • Patent Grant
  • Patent Number
    9,663,127
  • Date Filed
    Tuesday, October 28, 2014
  • Date Issued
    Tuesday, May 30, 2017
Abstract
This disclosure relates to a system and method for detecting and recording rail vehicle events. The system comprises one or more cameras, one or more sensors, non-transient electronic storage, one or more physical computer processors, and/or other components. The one or more cameras may be configured to acquire visual information representing a rail vehicle environment. The one or more sensors may be configured to generate output signals conveying operation information related to operation of the rail vehicle. The non-transient electronic storage may be configured to store electronic information. The one or more physical computer processors may be configured to detect rail vehicle events based on the output signals and facilitate electronic storage of the visual information and the operation information for a period of time that includes the rail vehicle event in the non-transient electronic storage.
Description
FIELD

This disclosure relates to a system and method for detecting and recording rail vehicle events.


BACKGROUND

Typically, trains are not equipped with vehicle event detection systems. Some trains are equipped with cameras, but these cameras are usually only used for surveillance purposes to monitor interior passenger compartments. The cameras are not connected to mechanical and/or safety subsystems of the train in any way.


SUMMARY

One aspect of the disclosure relates to a rail vehicle event detection system for detecting and recording rail vehicle events. The rail vehicle event detection system may be configured to be coupled with a rail vehicle. In some implementations, the rail vehicle event detection system may be electrically isolated from the rail vehicle. In some implementations, the system may include one or more of an operator identity system, a camera, a transceiver, a sensor, a backup power system, electronic storage, a processor, a user interface, and/or other components.


Operator identity information may be received by the operator identity system. The operator identity information may identify periods of time that individual operators operate the rail vehicle. In some implementations, receiving operator identity information may include receiving entry and/or selection of the operator identity information from operators of the rail vehicle at the rail vehicle. In some implementations, receiving operator identity information may include receiving the operator identity information from a remotely located computing device. In some implementations, receiving operator identity information may include receiving operator identity information from a biometric sensor configured to generate output signals that convey biometric information that identifies an individual operator of the rail vehicle.


Visual information may be acquired by one or more cameras. The visual information may represent a rail vehicle environment. The rail vehicle environment may include spaces in and around an interior and an exterior of the rail vehicle. The visual information may include views of exterior sides of the rail vehicle that capture visual images of events (e.g., collisions, near collisions, etc.) that occur at or near the sides of the rail vehicle, views of interior compartments of the rail vehicle, and/or other visual information. In some implementations, visual information representing the rail vehicle environment at or near ends of the rail vehicle may be acquired. In some implementations, the visual information may be received from a third party camera and/or digital video recorder (DVR) system. For example, such systems may include Panorama, a system previously installed in the rail vehicle, and/or other systems. Visual information may be received from a third party camera and/or DVR system wirelessly and/or via wires.


Output signals may be generated by one or more sensors. The output signals may convey operation information related to operation and/or context of the rail vehicle. In some implementations, the output signals may convey information related to mechanical and/or safety subsystems of the rail vehicle. The output signals that convey information related to safety subsystems of the rail vehicle may include overspeed sensor information and/or other information, for example. In some implementations, the output signals may convey operation information related to operation of the rail vehicle at or near both ends of the rail vehicle. In some implementations, the output signals may convey information related to the environment around railcars of the rail vehicle. For example, such output signals may include information from a communications based train control (CBTC) system and/or other external signals received from third party rail safety products.


The processor may be configured to execute computer program components. The computer program components may include an event detection component, a storage component, a communication component, and/or other components.


Rail vehicle events may be detected by the event detection component. The rail vehicle events may be detected based on the output signals and/or other information. Electronic storage of rail vehicle event information may be facilitated by the storage component. The vehicle event information may be stored for a period of time that includes the rail vehicle event. The rail vehicle event information may include the operator identity information, the visual information, and the operation information for the period of time that includes the rail vehicle event.


Wireless communication of the rail vehicle event information may be facilitated by the communication component (e.g., via the transceiver). Wireless communication may be facilitated via the transceiver and/or wireless communication components configured to transmit and receive electronic information. In some implementations, the rail vehicle event information may be wirelessly communicated to a remote computing device via the wireless communication components, for example.


The system may be electrically isolated from the rail vehicle via an opto-isolator, an optical isolation circuit, and/or other isolation components. The opto-isolator may transfer electrical signals between two isolated circuits (e.g., a rail vehicle circuit and a rail vehicle event detection system circuit) using light. The opto-isolator may prevent unexpectedly high voltages in one circuit from being transferred to and/or damaging another circuit. The opto-isolator may couple an input current to an output current via a beam of light modulated by the input current. The opto-isolator may convert an input current signal into a light signal, send the light signal across a dielectric channel, capture the light signal on an output side of the dielectric channel, and then transform the light signal back into an electric signal (e.g., an output current).


These and other objects, features, and characteristics of the system and/or method disclosed herein, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates a system configured to detect and record rail vehicle events.



FIG. 2A illustrates the system and/or individual components of the system coupled with a rail vehicle at two locations.



FIG. 2B illustrates the system in communication with rail vehicle subsystems and a remote computing device.



FIG. 3 illustrates an example view of a graphical user interface presented to the reviewer via a remote computing device.



FIG. 4A illustrates an opto-isolator.



FIG. 4B illustrates an isolation circuit.



FIG. 5 illustrates a method for detecting and recording rail vehicle events.





DETAILED DESCRIPTION


FIG. 1 illustrates a system 10 configured to detect and record rail vehicle events. In some implementations, system 10 may include one or more of an operator identity system 12, a camera 14, a transceiver 16, a sensor 18, a backup power system 20, electronic storage 22, a processor 30, a user interface 40, and/or other components. System 10 is configured to be coupled with a rail vehicle. System 10 may be configured to monitor operation of the rail vehicle and/or determine whether rail vehicle events occur. By way of a non-limiting example, rail vehicle events may include collisions with other vehicles and/or pedestrians, near collisions, a specific behavior and/or driving maneuver performed by a rail vehicle operator (e.g., unsafe backing, unsafe braking, unsafe railroad crossing, unsafe turning, operating the rail vehicle with hands off of the control lever (or any other similar maneuver, such as operating the rail vehicle without a foot on a foot controller), passing a signal bar, passing red over red, failure to yield to pedestrians, failure to yield to vehicles, speeding, not checking mirrors, not scanning the road/tracks ahead, not scanning an intersection, operating a personal electronic device, intercom responses, being distracted while eating, drinking, reading, etc., slingshotting, following or not following a transit agency's standard operating procedure), penalty stops, activation of a specific rail vehicle safety system (such as a track brake and/or an emergency brake), train operating parameters (e.g., speed) exceeding threshold values, improper stops at stations, and/or other rail vehicle events. Responsive to determining that a rail vehicle event has occurred, system 10 may be configured to record rail vehicle event information and/or transmit the recorded rail vehicle event information to one or more remotely located computing devices (e.g., wirelessly and/or via wires). The rail vehicle event information may include visual images of the environment about the rail vehicle (e.g., the exterior of the rail vehicle, streets surrounding rail tracks, passenger compartments, operator compartments, etc.), sensor information generated by rail vehicle system sensors and/or aftermarket sensors installed as part of system 10 (e.g., sensors 18), operator information, and/or other information.
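Purely for illustration, the kinds of rail vehicle events listed above might be enumerated in software as in the following sketch; the event names chosen here are a small, assumed subset and are not an exhaustive or authoritative list from this disclosure.

```python
from enum import Enum, auto

# Illustrative subset of the rail vehicle event types described above.
class RailVehicleEventType(Enum):
    COLLISION = auto()
    NEAR_COLLISION = auto()
    UNSAFE_BRAKING = auto()
    PASSING_RED_OVER_RED = auto()
    PENALTY_STOP = auto()
    TRACK_BRAKE_ACTIVATION = auto()
    EMERGENCY_BRAKE_ACTIVATION = auto()
    OVERSPEED = auto()
    IMPROPER_STATION_STOP = auto()
```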


In some implementations, system 10 and/or individual components of system 10 may be coupled with a rail vehicle at one or more locations on and/or within the rail vehicle. For example, FIG. 2A illustrates system 10 and/or individual components of system 10 coupled with a rail vehicle 250 at two locations 252 and 254 at or near the ends of the rail vehicle. This is not intended to be limiting. In some implementations, system 10 and/or individual components of system 10 may be coupled with rail vehicle 250 at any number of locations. In some implementations, system 10 may be coupled with rail vehicle 250 in locations that facilitate communication with one or more subsystems of rail vehicle 250.


For example, FIG. 2B illustrates system 10 in communication with rail vehicle subsystems 202, 204, 206, and 208. Rail vehicle subsystems may include mechanical subsystems, vehicle safety subsystems, track safety subsystems, inter-railcar safety subsystems, camera subsystems, DVR subsystems, and/or other rail vehicle subsystems (further described below in relation to sensors 18). System 10 may be coupled with the rail vehicle subsystems wirelessly, so that information may be transmitted without physical connections, and/or system 10 may be physically coupled with the rail vehicle subsystems via wires and/or other physical couplings. As shown in FIG. 2B, system 10 may be configured to communicate (e.g., wirelessly and/or via wires) with one or more remote computing devices 210. System 10 may communicate information (e.g., rail vehicle event information and/or other information) to remote computing device 210 and/or receive information from remote computing device 210 (e.g., information related to settings and/or other control of system 10, and/or other information).


In some implementations, system 10 may be configured to communicate with other third party rail products 270 (DVR systems, safety systems, etc.). For example, system 10 may be configured to be physically coupled with a third party rail DVR system. As another example, system 10 may be configured to communicate with a CBTC safety system via a physical coupling. In some implementations, system 10 may be configured to communicate information to and/or receive information from third party products 270 wirelessly and/or via wires.


Remote computing device 210 may include one or more processors, a user interface, electronic storage, and/or other components. Remote computing device 210 may be configured to enable a user to interface with system 10, and/or provide other functionality attributed herein to remote computing device 210. Remote computing device 210 may be configured to communicate with system 10 via a network such as the Internet, a cellular network, a Wi-Fi network, Ethernet, and/or other interconnected computer networks. Remote computing device 210 may facilitate viewing and/or analysis of the information conveyed by output signals of sensors 18 (FIG. 1), information determined by processor 30 (FIG. 1), information stored by electronic storage 22 (FIG. 1), and/or other information. By way of non-limiting example, remote computing device 210 may include one or more of a server, a server cluster, a desktop computer, a laptop computer, a handheld computer, a tablet computing platform, a NetBook, a Smartphone, a gaming console, and/or other computing platforms.


As described above, in some implementations, remote computing device 210 may be and/or include a server. The server may include communication lines and/or ports to enable the exchange of information with a network, processor 30 of system 10, and/or other computing platforms. The server may include a plurality of processors, electronic storage, hardware, software, and/or firmware components operating together to provide the functionality attributed herein to remote computing device 210. For example, the server may be implemented by a cloud of computing platforms operating together as a system server.


Returning to FIG. 1, operator identity system 12 may be configured to receive operator identity information that identifies periods of time individual operators operate the rail vehicle. In some implementations, operator identity system 12 may be coupled with the rail vehicle and may be configured to receive entry and/or selection of the operator identity information from operators of the rail vehicle at the rail vehicle. For example, a rail vehicle operator may key in a specific identification code via a user interface located in the rail vehicle. In some implementations, operator identity system 12 may be configured to receive the operator identity information from a remotely located computing device (e.g., remote computing device 210 shown in FIG. 2B). In some implementations, the operator identity information may be received via a third party hardware platform configured (e.g., via software) to transmit the operator identity information to system 10.


In some implementations, operator identity system 12 may be configured to receive operator identity information from a biometric sensor configured to generate output signals that convey biometric information that identifies an individual operator of the rail vehicle. In some implementations, the biometric sensor may be worn by the operator. For example, such biometric sensors may include fingerprint scanning sensors, iris scanning sensors, sensors that generate output signals related to an electroencephalogram (EEG) of the operator used to uniquely identify the operator, sensors that generate output signals related to an electrocardiogram (ECG) of the operator used to uniquely identify the operator, sensors that generate output signals related to brain waves of the operator used to uniquely identify the operator, and/or other biometric sensors.
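As a rough illustration only, operator identity information and the period of time during which an individual operator operates the rail vehicle could be represented as in the sketch below; the record fields, the source labels, and the record_from_keypad helper are assumptions and not part of this disclosure.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class OperatorIdentityRecord:
    """Associates an operator with a period of time operating the rail vehicle."""
    operator_id: str                  # e.g., a keyed-in identification code or biometric match ID
    source: str                       # "keypad", "remote", or "biometric" (assumed labels)
    period_start: datetime
    period_end: Optional[datetime] = None

def record_from_keypad(code: str) -> OperatorIdentityRecord:
    # The operator keys in a specific identification code via the in-vehicle user interface.
    return OperatorIdentityRecord(operator_id=code, source="keypad",
                                  period_start=datetime.utcnow())
```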


Cameras 14 may be configured to acquire visual information representing a rail vehicle environment. Any number of individual cameras 14 may be positioned at various locations on and/or within the rail vehicle. The rail vehicle environment may include spaces in and around an interior and/or an exterior of the rail vehicle. Cameras 14 may be configured such that the visual information includes views of exterior sides of the rail vehicle, interior compartments of the rail vehicle, and/or other areas to capture visual images of activities that occur at or near the sides of the rail vehicle, in front of and/or behind the rail vehicle, within the rail vehicle, on streets surrounding rail vehicle tracks, and/or in other areas. In some implementations, cameras 14 may include multiple cameras positioned around the rail vehicle and synchronized together to provide 360 degree and/or other views of the inside of one or more portions of the rail vehicle (e.g., a driver compartment, a passenger compartment) and/or 360 degree and/or other views of the outside of the vehicle (e.g., at or near a leading end of the rail vehicle looking ahead toward upcoming traffic, street crossings, etc.). In some implementations, the visual information may be received from a third party camera and/or digital video recorder (DVR) system. For example, such systems may include Panorama, a system previously installed in the rail vehicle, and/or other systems. Visual information may be received from a third party camera and/or DVR system wirelessly and/or via wires.


Transceiver 16 may comprise wireless communication components configured to transmit and receive electronic information. In some implementations, processor 30 may be configured to facilitate wireless communication of rail vehicle event information to a remote computing device (e.g., remote computing device 210) via transceiver 16 and/or other wireless communication components. Transceiver 16 may be configured to transmit and/or receive encoded communication signals. Transceiver 16 may include a base station and/or other components. In some implementations, transceiver 16 may be configured to transmit and receive signals via one or more radio channels of a radio link. In some implementations, transceiver 16 may be configured to transmit and receive communication signals substantially simultaneously. Transmitting and/or receiving communication signals may facilitate communication between remote computing device 210 (FIG. 2B) and processor 30, for example.


Sensors 18 may be configured to generate output signals conveying operation information related to operation and/or context of the rail vehicle and/or other information. Information related to the operation of the rail vehicle may include feedback information from one or more subsystems of the rail vehicle, and/or other information. The subsystems may include, for example, the engine, the drive train, lighting systems (e.g., headlights, brake lights, train status indicator lights, track information lighting/signage), the braking system, power delivery (e.g., mechanical and/or electrical) systems, safety systems, radio systems, dispatch systems, and/or other subsystems. The subsystems of the rail vehicle may include one or more mechanical sensors, electronic sensors, and/or other sensors that generate output signals. In some implementations, sensors 18 may include at least one sensor that is a rail vehicle subsystem sensor associated with mechanical systems of the rail vehicle (e.g., the engine, drive train, lighting, braking, power delivery systems, etc.). In some implementations, sensors 18 may include at least one sensor that is a rail vehicle subsystem sensor associated with a rail vehicle safety system configured to generate output signals conveying information related to safety systems of the rail vehicle. Rail vehicle safety subsystem sensors may include automatic train protection (ATP) sensors (e.g., ATP bypass active, ATP overspeed sensors), an automatic train control system (ATCS), track switches, track brake sensors, emergency brake sensors, intercom call sensors, a high horn sensor, a slingshotting sensor (e.g., a sensor whose output signals indicate whether a side-to-side g-force at the last rail car, when the rail car speed is too high, causes passenger discomfort, has the potential to cause derailment, and/or may cause damage to the rail car and/or the track), and/or other sensors.
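One minimal way to model such output signals in software is a timestamped reading tagged with the subsystem it came from, as in the sketch below; the field names and sensor name strings are illustrative assumptions, not identifiers from this disclosure.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Union

@dataclass
class SensorReading:
    """A timestamped output signal from one rail vehicle subsystem sensor."""
    sensor_name: str           # e.g., "atp_overspeed", "track_brake", "emergency_brake" (assumed names)
    subsystem: str             # e.g., "safety" or "mechanical"
    value: Union[float, bool]  # numeric measurement or on/off state
    timestamp: datetime

# Example: an output signal indicating the emergency brake has been activated.
reading = SensorReading("emergency_brake", "safety", True, datetime.utcnow())
```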


Information related to the context of the rail vehicle may include information related to the environment in and/or around the rail vehicle. The vehicle environment may include spaces in and around an interior and an exterior of the rail vehicle. The information related to the context of the rail vehicle may include information related to movement of the rail vehicle, an orientation of the rail vehicle, a geographic position of the rail vehicle, a spatial position of the rail vehicle relative to other objects, a tilt angle of the rail vehicle, and/or other information. In some implementations, the output signals conveying the information related to the context of the rail vehicle may be generated via non-standard aftermarket sensors installed in the rail vehicle and/or other sensors. The non-standard aftermarket sensors may include, for example, a video camera (e.g., cameras 14), a microphone, an accelerometer, a gyroscope, a geolocation sensor (e.g., a GPS device), a radar detector, a magnetometer, radar, biometric sensors, an intercom, an active safety sensor that utilizes a camera such as Mobile Eye and/or Bendex, and/or other sensors. In some implementations, the output signals may include information from a communications based train control (CBTC) system and/or other external signals received from third party rail safety products.


Although sensors 18 are depicted in FIG. 1 as a single element, this is not intended to be limiting. Sensors 18 may include one or more sensors located adjacent to and/or in communication with the various mechanical systems of the rail vehicle, adjacent to and/or in communication with the various safety systems of the rail vehicle, in one or more positions (e.g., at or near the front/rear of the rail vehicle) to accurately acquire information representing the vehicle environment (e.g., visual information, spatial information, orientation information), and/or in other locations. For example, in some implementations, system 10 may be configured such that a first sensor is located in a driver compartment of the rail vehicle near operational controls used to operate the rail vehicle and a second sensor is located on top of the rail vehicle and is in communication with a geolocation satellite. In some implementations, sensors 18 are configured to generate output signals substantially continuously during operation of the rail vehicle.


One or more components of system 10 may be electrically coupled with the rail vehicle such that the one or more components of system 10 may be powered by electrical power from the rail vehicle. The one or more components of system 10 may be individually electrically coupled to the rail vehicle and/or the components of system 10 may be electrically coupled to the rail vehicle via a common electrical connection. Backup power system 20 may be configured to provide electrical power to system 10 responsive to power received by system 10 from the rail vehicle ceasing. (Power from the rail vehicle may cease for various reasons such as turning the rail vehicle ignition off, mechanical malfunctions, criminal activity, and/or other events where it would be advantageous for system 10 to continue to operate.) Power system 20 may be configured to power operator identity system 12, cameras 14, transceiver 16, sensors 18, processor 30, user interface 40, electronic storage 22, and/or other components of system 10. Power system 20 may comprise one or more power sources connected in series and/or in parallel. In some implementations, power system 20 may be rechargeable. Power system 20 may be recharged via an AC power source, a rail vehicle power source, a USB port, a non-contact charging circuit, and/or other recharging methods. Examples of power sources that may be included in backup power system 20 include one or more DC batteries, lithium ion and/or lithium polymer cells, nickel metal hydride cells, and/or other power sources.


Electronic storage 22 may be configured to store electronic information. Electronic storage 22 may comprise electronic storage media that electronically stores information. The electronic storage media of electronic storage 22 may comprise one or both of system storage that is provided integrally (i.e., substantially non-removable) with system 10 and/or removable storage that is removably connectable to system 10 via, for example, a port (e.g., a USB port, a firewire port, etc.) or a drive (e.g., a disk drive, etc.). Electronic storage 22 may comprise one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media. Electronic storage 22 may store software algorithms, recorded video event data, information determined by processor 30, information received via user interface 40, and/or other information that enables system 10 to function properly. Electronic storage 22 may be (in whole or in part) a separate component within system 10, or electronic storage 22 may be provided (in whole or in part) integrally with one or more other components of system 10 (e.g., user interface 40, processor 30, etc.).


Processor 30 may be configured to provide information processing capabilities in system 10. As such, processor 30 may comprise one or more of a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information. Although processor 30 is shown in FIG. 1 as a single entity, this is for illustrative purposes only. In some implementations, processor 30 may comprise a plurality of processing units. These processing units may be physically located within the same device, or processor 30 may represent processing functionality of a plurality of devices operating in coordination.


Processor 30 may be configured to execute one or more computer program components. The computer program components may comprise one or more of an event detection component 32, a storage component 34, a communication component 36, and/or other components. Processor 30 may be configured to execute components 32, 34, and/or 36 by software; hardware; firmware; some combination of software, hardware, and/or firmware; and/or other mechanisms for configuring processing capabilities on processor 30. It should be appreciated that although components 32, 34, and 36 are illustrated in FIG. 1 as being co-located within a single processing unit, in implementations in which processor 30 comprises multiple processing units, one or more of components 32, 34, and/or 36 may be located remotely from the other components. The description of the functionality provided by the different components 32, 34, and/or 36 described herein is for illustrative purposes, and is not intended to be limiting, as any of components 32, 34, and/or 36 may provide more or less functionality than is described. For example, one or more of components 32, 34, and/or 36 may be eliminated, and some or all of its functionality may be provided by other components 32, 34, and/or 36. As another example, processor 30 may be configured to execute one or more additional components that may perform some or all of the functionality attributed below to one of components 32, 34, and/or 36.


Event detection component 32 may be configured to detect rail vehicle events based on the output signals from sensors 18 and/or other information. In some implementations, event detection component 32 may determine one or more rail vehicle parameters based on the output signals and/or other information. In some implementations, event detection component 32 may determine rail vehicle parameters that are not directly measurable by sensors 18. In some implementations, event detection component 32 may be configured to determine one or more rail vehicle parameters one or more times in an ongoing manner during operation of the rail vehicle. Event detection component 32 may be configured to detect rail vehicle events based on the information conveyed by the output signals generated by sensors 18, the rail vehicle parameters, pre-determined rail vehicle event criteria, and/or other information. A specific rail vehicle event may be detected by comparing the information conveyed by the output signals and/or the determined rail vehicle parameters to rail vehicle event criteria sets. For example, a first rail vehicle event may be detected responsive to the output signals and/or the determined parameters satisfying one or more individual criteria in a first criteria set associated with the first rail vehicle event.
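The criteria-set comparison described above can be sketched as follows: each rail vehicle event is associated with a set of predicates over a snapshot of sensor values and derived parameters, and the event is detected when every criterion in its set is satisfied. The event names, parameter keys, and threshold values in this sketch are assumptions for illustration, not values from this disclosure.

```python
from typing import Callable, Dict, List

# A criteria set is a list of predicates over a snapshot of sensor values and
# derived parameters; an event is detected when all criteria in its set pass.
CriteriaSet = List[Callable[[Dict[str, float]], bool]]

EVENT_CRITERIA: Dict[str, CriteriaSet] = {
    "overspeed": [lambda s: s.get("speed_kph", 0.0) > s.get("speed_limit_kph", 70.0)],
    "unsafe_braking": [
        lambda s: s.get("longitudinal_g", 0.0) < -0.3,   # hard deceleration (assumed threshold)
        lambda s: s.get("speed_kph", 0.0) > 10.0,
    ],
}

def detect_events(snapshot: Dict[str, float]) -> List[str]:
    """Return the names of all rail vehicle events whose criteria sets are satisfied."""
    return [name for name, criteria in EVENT_CRITERIA.items()
            if all(criterion(snapshot) for criterion in criteria)]

# Example: a snapshot exceeding the assumed speed limit triggers "overspeed".
print(detect_events({"speed_kph": 82.0, "speed_limit_kph": 70.0}))
```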


Storage component 34 may be configured to facilitate electronic storage of rail vehicle event information for a period of time that includes the rail vehicle event. The rail vehicle event information may be stored in non-transient electronic storage 22, electronic storage included in remote computing device(s) 210 (FIG. 2B), and/or in other locations. The rail vehicle event information may include the visual information from one or more cameras 14, the operation information from one or more sensors 18 for the period of time that includes the rail vehicle event, operator identity information, and/or other information. In some implementations, storage component 34 may be configured such that operator identity information for the period of time that includes the rail vehicle event is included in the rail vehicle event information. In some implementations, storage component 34 may be configured to synchronize the operator identity information, the visual information, the operation information, and/or other information with respect to time. For example, visual information from various cameras 14 may be synchronized with information conveyed by the output signals from various sensors 18 by storage component 34.
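One common way to retain information for a period of time that includes a detected event is a rolling pre-event buffer plus a fixed post-event capture window; the sketch below takes that approach, with window sizes and the sample format chosen arbitrarily for illustration and not specified by this disclosure.

```python
from collections import deque

class EventBuffer:
    """Keeps a rolling pre-event history of time-synchronized samples; after an event
    is detected, collects a post-event window so the combined period that includes the
    event can be persisted. Window sizes are assumptions."""

    def __init__(self, pre_event_samples: int = 300, post_event_samples: int = 300):
        self._pre = deque(maxlen=pre_event_samples)   # rolling history before the event
        self._post = []                               # filled only after detection
        self._post_needed = post_event_samples
        self._event_active = False

    def append(self, timestamp: float, sample: dict) -> None:
        # `sample` bundles synchronized visual, operation, and operator identity data.
        if self._event_active and len(self._post) < self._post_needed:
            self._post.append((timestamp, sample))
        else:
            self._pre.append((timestamp, sample))

    def mark_event(self) -> None:
        self._event_active = True

    def capture(self) -> list:
        # Combined window: pre-event history followed by post-event samples.
        return list(self._pre) + self._post
```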


Communication component 36 may be configured to facilitate wireless communication of information conveyed by the output signals, the determined parameters, the rail vehicle event information, and/or other information to remote computing device 210 (FIG. 2B) and/or other devices. Communication component 36 may be configured to facilitate communication via one or more of a WiFi network, a cellular network, an Ethernet network, and/or other network communication solutions. Communication component 36 may be configured such that a user of system 10 may choose one communication solution to start with and, without changing the hardware of system 10, switch to any other available communication solution whenever the user requests a change. Communication component 36 may be configured to facilitate communication responsive to the detection of a rail vehicle event. Communication component 36 may be configured to facilitate communication in real-time or near real-time. For example, communication component 36 may facilitate one or more individual communications during operation of the rail vehicle. Individual communications may be responsive to a detected rail vehicle event and may occur just after detection of an individual rail vehicle event. In some implementations, communication component 36 may be configured to facilitate communication after use of the rail vehicle has ceased such that the information conveyed by the output signals, the determined parameters, rail vehicle event information, and/or other information is communicated in a single communication. In some implementations, communication component 36 may be configured to associate visual and/or other information in the output signals of the one or more cameras 14 with information related to operation and/or context of the vehicle (e.g., vehicle subsystem sensors and/or aftermarket sensors 18).
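The ability to start with one communication solution and later switch to another without changing hardware can be modeled as a transport selected purely by configuration, as in the sketch below; the transport classes and the send_event_info helper are hypothetical names used only to illustrate the idea.

```python
from typing import Protocol

class Transport(Protocol):
    def send(self, payload: bytes) -> None: ...

class WifiTransport:
    def send(self, payload: bytes) -> None:
        print(f"sending {len(payload)} bytes over Wi-Fi")       # placeholder for the radio driver

class CellularTransport:
    def send(self, payload: bytes) -> None:
        print(f"sending {len(payload)} bytes over cellular")    # placeholder for the modem driver

# The communication solution is chosen by configuration, not by hardware changes.
TRANSPORTS = {"wifi": WifiTransport, "cellular": CellularTransport}

def send_event_info(payload: bytes, solution: str = "wifi") -> None:
    TRANSPORTS[solution]().send(payload)

send_event_info(b"example rail vehicle event record", solution="cellular")
```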


User interface 40 may be configured to provide an interface between system 10 and users through which the users may provide information to and receive information from system 10. This enables pre-determined profiles, criteria, data, cues, results, instructions, and/or any other communicable items, collectively referred to as “information,” to be communicated between a user and one or more of processor 30, sensors 18, remote computing device 210 (shown in FIG. 2B), operator identity system 12, cameras 14, electronic storage 22, backup power system 20, rail vehicle subsystems 202-208 (shown in FIG. 2B), and/or other components of system 10. In some implementations, all and/or part of user interface 40 may be included in remote computing device 210, operator identity system 12, and/or other components of system 10. In some implementations, user interface 40 may be included in a housing with one or more other components (e.g., processor 30) of system 10.


Examples of interface devices suitable for inclusion in user interface 40 comprise a keypad, buttons, switches, a keyboard, knobs, levers, a display screen, a touch screen, speakers, a microphone, an indicator light, an audible alarm, a printer, a tactile feedback device, and/or other interface devices. In one implementation, user interface 40 comprises a plurality of separate interfaces. In some implementations, user interface 40 comprises at least one interface that is provided integrally with processor 30 and/or electronic storage 22.


It is to be understood that other communication techniques, either hard-wired or wireless, are also contemplated by the present disclosure as user interface 40. In some implementations, user interface 40 may be included in a removable storage interface provided by electronic storage 22. In this example, information may be loaded into system 10 wirelessly from a remote location (e.g., via a network), from removable storage (e.g., a smart card, a flash drive, a removable disk, etc.), and/or other sources that enable the user(s) to customize the implementation of system 10. Other exemplary input devices and techniques adapted for use with system 10 as user interface 40 comprise, but are not limited to, an RS-232 port, RF link, an IR link, modem (telephone, cable, and/or other modems), a cellular network, a Wi-Fi network, a local area network, and/or other devices and/or systems. In short, any technique for communicating information with system 10 is contemplated by the present disclosure as user interface 40.


User interface 40, communication component 36, remote computing device 210 (FIG. 2B) and/or other components of system 10 may be configured to facilitate review of the rail vehicle event information and/or communication with an operator of the rail vehicle. In some implementations, the review and/or communication may be facilitated in real time or near real time to provide feedback to an operator about his performance. For example, a remotely located reviewer may review rail vehicle event information recently transmitted (e.g., by communication component 36 via transceiver 16) to remote computing device 210. The remote reviewer may look for behaviors such as unsafe backing, unsafe braking, unsafe railroad crossing, unsafe turning, operating the vehicle with hands and/or feet off of a control lever, passing a signal bar, passing red over red, failure to yield to pedestrians, failure to yield to vehicles, speeding, not checking mirrors, not scanning the road/tracks ahead, not scanning an intersection, operating a personal electronic device, being distracted by eating/drinking/reading/etc., improper stops at stations, following too close behind another train, and/or other dangerous behaviors. Based on his review of the operator's technique, the reviewer may send a message back to the rail vehicle operator, which the operator may receive via user interface 40, for example. In some implementations, communication component 36, remote computing device 210 and/or other components of system 10 may be configured to facilitate automatic analysis of rail vehicle event information and alert (e.g., via text message, email, a phone call, via an indicator displayed by user interface 40, etc.) reviewers, rail vehicle operators, and/or other users.



FIG. 3 illustrates an example view 300 of a graphical user interface presented to the reviewer via remote computing device 210 (FIG. 2B), for example. View 300 includes a forward looking camera field 302, a driver camera field 304, a map field 306, rail vehicle subsystems information fields 308, and/or other fields. These fields may facilitate review of the operator's performance and/or other information before, during, and/or after a rail vehicle event, and/or at other times. FIG. 3 is not intended to be limiting. The graphical user interface may include any number of views and/or fields. The graphical user interface may be presented to a reviewer via user interface 40. As described above, some or all of user interface 40 may be included in remote computing device 210 and/or the rail vehicle.


In some implementations, system 10 may be electrically isolated from the rail vehicle. System 10 may be electrically isolated from the rail vehicle via an opto-isolator, an optical isolation circuit, and/or other isolation components. An opto-isolator 400 is illustrated in FIG. 4A. As shown in FIG. 4A, opto-isolator 400 may be a stand-alone component within system 10. An isolation circuit 402 is illustrated in FIG. 4B. As shown in FIG. 4B, opto-isolator 400 may be electrically coupled to one or more safety systems 406 of the rail vehicle via an expansion port cable 408. Opto-isolator 400 may be electrically coupled to a housing 404 that houses one or more components of system 10 via a custom interface cable 410 and/or an expansion port 412. In some implementations, custom interface cable 410 may be electrically coupled to a relay 414 via coupling wires 416, 418. Housing 404 may be coupled to a hardwired power cable 420 (separately from expansion port 412). Hardwired power cable 420 may be electrically coupled with relay 414 via an ignition sense wire 422 and/or coupling wire 418. (In some implementations, ignition sense wire 422 and coupling wire 418 may be the same wire.) Hardwired power cable 420 may be electrically coupled with a negatively charged portion of a main electrical panel 432 of the rail vehicle and/or a DC-DC converter 430 via a ground wire 434. Hardwired power cable 420 may be electrically coupled with a positively charged portion of main electrical panel 432 and/or converter 430 via a power wire 440.


Referring to FIGS. 4A and 4B, opto-isolator 400 may transfer electrical signals between two isolated circuits (e.g., a rail vehicle circuit and a rail vehicle event detection system circuit) using light. Opto-isolator 400 may prevent unexpectedly high voltages in one circuit (e.g., the rail vehicle circuit) from being transferred to and/or damaging another circuit (e.g., the rail vehicle event detection system circuit). Opto-isolator 400 may couple an input current to an output current via a beam of light modulated by the input current. Opto-isolator 400 may convert the input current signal into a light signal, send the light signal across a dielectric channel, capture the light signal on an output side of the dielectric channel, and then transform the light signal back into an electric signal (e.g., an output current).


In some implementations, opto-isolator 400 may be configured to provide multiple (e.g., six) inputs driving a corresponding number (e.g., six) of optically isolated outputs. One of the inputs may provide a time delay function that requires the input signal to remain present for a minimum number (e.g., 5) of seconds before the signal is output to expansion port 412. The time delay may be enabled or disabled via a printed circuit board (PCB) jumper. When disabled, the input may function identically to the other (e.g., five) inputs. Inputs may present as high an impedance as practical to the rail vehicle source signals, whether powered or not. Power to drive the inputs may be supplied by the rail vehicle. By way of a non-limiting example, voltages between about 5V and about 50V may be considered a high signal. Inputs below about 2V may be considered a low signal. Inputs may include a selectable 3× attenuator to increase noise margins if necessary. Inputs may provide about 100V transient protection. Outputs may be optically isolated from the inputs. Output power may be provided by the same power source (e.g., a rail vehicle power source) that drives other components of system 10. Output states may mimic the input states (e.g., high in=high out). The output circuit may provide a minimum of about 7 volts with about a 5 mA load in the high state to ensure proper operation of expansion port 412.
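The input voltage levels and the optional five-second time delay described above can be approximated logically as in the following sketch. The real opto-isolator behavior is analog hardware, so this is only an illustration built from the figures given in the text (roughly 5-50 V reads as high, below about 2 V reads as low, 5 second delay); the function names and sample format are assumptions.

```python
def classify_input(voltage: float) -> str:
    """Classify a rail vehicle source signal using the levels described above."""
    if 5.0 <= voltage <= 50.0:
        return "high"
    if voltage < 2.0:
        return "low"
    return "undefined"

def delayed_output(samples: list, delay_s: float = 5.0, jumper_enabled: bool = True) -> bool:
    """Approximate the time-delay input: the signal must remain high for at least
    `delay_s` seconds before it is passed to the expansion port. `samples` is a
    time-ordered list of (timestamp, state) pairs; with the jumper disabled the
    input behaves like the other inputs (last state passes straight through)."""
    if not jumper_enabled:
        return bool(samples) and samples[-1][1] == "high"
    high_since = None
    for t, state in samples:
        if state == "high":
            high_since = t if high_since is None else high_since
            if t - high_since >= delay_s:
                return True
        else:
            high_since = None
    return False

# Example: a signal held high from t=0 to t=6 seconds satisfies the 5 s delay.
print(delayed_output([(0.0, "high"), (2.0, "high"), (6.0, "high")]))
```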



FIG. 5 illustrates a method 500 for detecting and recording rail vehicle events. The rail vehicle events may be detected and recorded with a rail vehicle event detection system configured to be coupled with a rail vehicle. In some implementations, the rail vehicle event detection system may be electrically isolated from the rail vehicle (e.g., via an opto-isolator). The operations of method 500 presented below are intended to be illustrative. In some implementations, method 500 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 500 are illustrated in FIG. 5 and described below is not intended to be limiting.


In some implementations, method 500 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information). The one or more processing devices may include one or more devices executing some or all of the operations of method 500 in response to instructions stored electronically on one or more electronic storage mediums. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 500.


At an operation 502, operator identity information may be received. The operator identity information may identify periods of time that individual operators operate the rail vehicle. In some implementations, receiving operator identity information may include receiving entry and/or selection of the operator identity information from operators of the rail vehicle at the rail vehicle. In some implementations, receiving operator identity information may include receiving the operator identity information from a remotely located computing device. In some implementations, receiving operator identity information may include receiving operator identity information from a biometric sensor configured to generate output signals that convey biometric information that identifies an individual operator of the rail vehicle. In some implementations, operation 502 may be performed by one or more operator identity systems the same as or similar to operator identity system 12 (shown in FIG. 1 and described herein).


At an operation 504, visual information may be acquired. The visual information may represent a rail vehicle environment. The rail vehicle environment may include spaces in and around an interior and an exterior of the rail vehicle. The visual information may include views of exterior sides of the rail vehicle that capture visual images of collisions that occur at the sides of the rail vehicle, passengers entering and/or exiting the rail vehicle, wheelchair loading and/or offloading, and/or other visual information. In some implementations, visual information representing the rail vehicle environment at or near both ends of the rail vehicle may be acquired. In some implementations, operation 504 may be performed by one or more cameras the same as or similar to cameras 14 (shown in FIG. 1 and described herein).


At an operation 506, output signals may be generated. The output signals may convey operation information related to operation of the rail vehicle. In some implementations, the output signals convey information related to safety systems of the rail vehicle. The output signals that convey information related to safety systems of the rail vehicle may include overspeed sensor information and/or other information. In some implementations, the output signals may convey operation information related to operation of the rail vehicle at or near both ends of the rail vehicle. In some implementations, the output signals may be communicated via wires and/or wirelessly using WiFi, Bluetooth, radio signals, a wireless network such as the internet and/or a cellular network, and/or other communication techniques. In some implementations, operation 506 may be performed by one or more sensors the same as or similar to sensors 18 (shown in FIG. 1 and described herein).


At an operation 508, rail vehicle events may be detected. The rail vehicle events may be detected based on the output signals and/or other information. In some implementations, operation 508 may be performed by a processor component the same as or similar to event detection component 32 (shown in FIG. 1 and described herein).


At an operation 510, electronic storage of rail vehicle event information may be facilitated. The vehicle event information may be stored for a period of time that includes the rail vehicle event. The rail vehicle event information may include the visual information and the operation information for the period of time that includes the rail vehicle event. In some implementations, operation 510 may be performed by a processor component the same as or similar to storage component 34 (shown in FIG. 1 and described herein).


At an operation 512, wireless communication of the rail vehicle event information may be facilitated. Wireless communication may be facilitated via wireless communication components configured to transmit and receive electronic information. In some implementations, the rail vehicle event information may be wirelessly communicated to a remote computing device via the wireless communication components. In some implementations, operation 512 may be performed by a processor component the same as or similar to communication component 36 (shown in FIG. 1 and described herein).
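Read together, operations 502 through 512 amount to a simple acquire-detect-store-transmit loop. The sketch below strings those operations together in order; the collaborator objects and their method names are assumptions used only to show the flow, not the claimed method itself.

```python
def run_detection_cycle(identity_system, cameras, sensors, detector, storage, transmitter):
    """Illustrative pass through operations 502-512 (collaborator names are hypothetical)."""
    operator = identity_system.current_operator()         # operation 502: receive operator identity
    frames = [camera.capture() for camera in cameras]     # operation 504: acquire visual information
    readings = [sensor.read() for sensor in sensors]      # operation 506: generate output signals
    events = detector.detect(readings)                    # operation 508: detect rail vehicle events
    for event in events:
        record = storage.store(event, operator, frames, readings)  # operation 510: store event info
        transmitter.send(record)                          # operation 512: wireless communication
```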


Although the system(s) and/or method(s) of this disclosure have been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred implementations, it is to be understood that such detail is solely for that purpose and that the disclosure is not limited to the disclosed implementations, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present disclosure contemplates that, to the extent possible, one or more features of any implementation can be combined with one or more features of any other implementation.

Claims
  • 1. A rail vehicle event detection system configured to be coupled with a rail vehicle, the system comprising: one or more cameras configured to acquire visual information representing a rail vehicle environment, the rail vehicle environment including spaces in and around an interior and an exterior of the rail vehicle; one or more sensors configured to generate output signals conveying operation information related to operation of the rail vehicle; non-transient electronic storage configured to store electronic information; one or more physical computer processors configured by computer readable instructions to: detect rail vehicle events based on the output signals; and facilitate electronic storage of rail vehicle event information for a period of time that includes the rail vehicle event in the non-transient electronic storage, the rail vehicle event information including the visual information from the one or more cameras and the operation information from the one or more sensors for the period of time that includes the rail vehicle event; and a power system configured to provide electrical power to the rail vehicle event detection system responsive to electrical power received by the rail vehicle event detection system from the rail vehicle ceasing.
  • 2. The system of claim 1, further comprising an operator identity system configured to receive operator identity information that identifies periods of time individual operators operate the rail vehicle, wherein the one or more physical computer processors are configured such that operator identity information for the period of time that includes the rail vehicle event is included in the rail vehicle event information.
  • 3. The system of claim 2, wherein the operator identity system is coupled with the rail vehicle and is configured to receive entry and/or selection of the operator identity information from operators of the rail vehicle at the rail vehicle.
  • 4. The system of claim 2, wherein the operator identity system is configured to receive the operator identity information from a remotely located computing device.
  • 5. The system of claim 2, wherein the operator identity system is configured to receive operator identity information from a biometric sensor configured to generate output signals that convey biometric information that identifies an individual operator of the rail vehicle.
  • 6. The system of claim 1, wherein the one or more cameras are configured such that the visual information includes views of exterior sides of the rail vehicle to capture visual images of collisions that occur at the sides of the rail vehicle.
  • 7. The system of claim 1, further comprising wireless communication components configured to transmit and receive electronic information, wherein the one or more physical computer processors are configured to facilitate wireless communication of the rail vehicle event information to a remote computing device via the wireless communication components.
  • 8. The system of claim 1, wherein the one or more sensors are configured such that at least one of the one or more sensors is a rail vehicle subsystem sensor associated with mechanical systems of the rail vehicle, the at least one sensor configured to generate output signals conveying information related to the mechanical systems of the vehicle.
  • 9. The system of claim 1, wherein the one or more sensors are configured such that at least one of the one or more sensors is a rail vehicle subsystem sensor associated with a rail vehicle safety system, the at least one sensor configured to generate output signals conveying information related to safety systems of the rail vehicle.
  • 10. The system of claim 9, wherein the one or more sensors are configured such that the at least one rail vehicle subsystem sensor associated with the rail vehicle safety system is one or more of an overspeed sensor, a track brake sensor, an intercom sensor, a high horn sensor, an emergency brake sensor, or a CBTC sensor.
  • 11. A rail vehicle event detection system configured to be coupled with a rail vehicle, the system comprising: one or more cameras configured to acquire visual information representing a rail vehicle environment, the rail vehicle environment including spaces in and around an interior and an exterior of the rail vehicle; one or more sensors configured to generate output signals conveying operation information related to operation of the rail vehicle; non-transient electronic storage configured to store electronic information; and one or more physical computer processors configured by computer readable instructions to: detect rail vehicle events based on the output signals; and facilitate electronic storage of rail vehicle event information for a period of time that includes the rail vehicle event in the non-transient electronic storage, the rail vehicle event information including the visual information from the one or more cameras and the operation information from the one or more sensors for the period of time that includes the rail vehicle event, wherein the rail vehicle event detection system is electrically isolated from the rail vehicle.
  • 12. The system of claim 1, wherein the one or more cameras are configured to acquire visual information representing the rail vehicle environment at or near both ends of the rail vehicle; and wherein the one or more sensors are configured to generate output signals conveying operation information related to operation of the rail vehicle at or near both ends of the rail vehicle.
  • 13. A method for detecting and recording rail vehicle events with a rail vehicle event detection system configured to be coupled with a rail vehicle, the method comprising: acquiring visual information representing a rail vehicle environment, the rail vehicle environment including spaces in and around an interior and an exterior of the rail vehicle; generating output signals conveying operation information related to operation of the rail vehicle; detecting rail vehicle events based on the output signals; and facilitating electronic storage of rail vehicle event information for a period of time that includes the rail vehicle event in non-transient electronic storage, the rail vehicle event information including the visual information and the operation information for the period of time that includes the rail vehicle event, wherein the rail vehicle event detection system is electrically isolated from the rail vehicle.
  • 14. The method of claim 13, further comprising receiving operator identity information that identifies periods of time individual operators operate the rail vehicle, wherein the operator identity information for the period of time that includes the rail vehicle event is included in the rail vehicle event information.
  • 15. The method of claim 14, further comprising receiving entry and/or selection of the operator identity information from operators of the rail vehicle at the rail vehicle.
  • 16. The method of claim 14, further comprising receiving the operator identity information from a remotely located computing device.
  • 17. The method of claim 14, further comprising receiving operator identity information from a biometric sensor configured to generate output signals that convey biometric information that identifies an individual operator of the rail vehicle.
  • 18. The method of claim 13, wherein the visual information includes views of exterior sides of the rail vehicle to capture visual images of collisions that occur at the sides of the rail vehicle.
  • 19. The method of claim 13, wherein the output signals convey information related to safety systems of the rail vehicle.
  • 20. The method of claim 19, wherein the output signals that convey information related to safety systems of the rail vehicle include one or more of overspeed sensor information, track brake sensor information, intercom sensor information, high horn sensor information, emergency brake sensor information, or CBTC sensor information.
  • 21. The method of claim 13, wherein the rail vehicle event detection system is electrically isolated from the rail vehicle by an opto-isolator.
  • 22. The method of claim 13, further comprising acquiring visual information representing the rail vehicle environment at or near both ends of the rail vehicle; and generating output signals conveying operation information related to operation of the rail vehicle at or near both ends of the rail vehicle.
  • 23. The system of claim 11, wherein the rail vehicle event detection system is electrically isolated from the rail vehicle via an opto-isolator.
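The following is an illustrative, non-limiting sketch (in Python) of the buffered event-capture behavior recited in claims 11 and 13: a rolling window of visual information and operation information is maintained, and when a rail vehicle event is detected from the sensor output signals, the window spanning the event is written to non-transient storage. The class names, window lengths, detection rule, and storage.write() call are assumptions made for illustration only and are not part of the claims.

import time
from collections import deque

PRE_EVENT_SECONDS = 15    # assumed length of buffered data kept before an event
POST_EVENT_SECONDS = 15   # assumed length of data captured after an event

class RollingBuffer:
    """Keeps timestamped samples covering a fixed number of seconds."""
    def __init__(self, seconds):
        self.seconds = seconds
        self.samples = deque()

    def add(self, timestamp, sample):
        self.samples.append((timestamp, sample))
        # Drop samples that have aged out of the window.
        while self.samples and timestamp - self.samples[0][0] > self.seconds:
            self.samples.popleft()

class EventRecorder:
    def __init__(self, storage):
        # storage is assumed to expose a write(record) method backed by
        # non-transient electronic storage.
        self.storage = storage
        self.video = RollingBuffer(PRE_EVENT_SECONDS)
        self.operation = RollingBuffer(PRE_EVENT_SECONDS)

    def on_camera_frame(self, frame):
        # Visual information from the one or more cameras.
        self.video.add(time.time(), frame)

    def on_sensor_sample(self, sample):
        # Operation information conveyed by the sensor output signals.
        now = time.time()
        self.operation.add(now, sample)
        if self.detect_event(sample):
            self.capture_event(now)

    def detect_event(self, sample):
        # Placeholder detection rule: treat an asserted emergency brake
        # signal as a rail vehicle event.
        return bool(sample.get("emergency_brake"))

    def capture_event(self, event_time):
        record = {
            "event_time": event_time,
            "visual_information": list(self.video.samples),
            "operation_information": list(self.operation.samples),
        }
        # In practice, data arriving for POST_EVENT_SECONDS after the event
        # would be appended before the record is finalized; omitted here.
        self.storage.write(record)

Whether the pre-event and post-event windows are fixed or configurable is left open here; the sketch only illustrates that the stored record spans a period of time that includes the detected event.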
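As a companion sketch for claims 20, 21, and 23, the snippet below shows one way the enumerated safety-system signals (overspeed, track brake, intercom, high horn, emergency brake, CBTC) could be read as discrete inputs behind opto-isolators and folded into a single operation-information sample. The pin assignments and the read_pin() helper are hypothetical; on real hardware this would be a digital I/O driver on the isolated side of the opto-isolators.

# Hypothetical mapping of safety-system signals (claim 20) to discrete,
# opto-isolated input lines (claims 21 and 23).
SAFETY_INPUTS = {
    "overspeed": 0,
    "track_brake": 1,
    "intercom": 2,
    "high_horn": 3,
    "emergency_brake": 4,
    "cbtc": 5,
}

def read_pin(pin):
    """Stand-in for reading one opto-isolated digital input line."""
    return False  # replace with the platform's digital I/O call

def sample_safety_signals():
    """Return one operation-information sample for the event recorder."""
    return {name: read_pin(pin) for name, pin in SAFETY_INPUTS.items()}

A sample returned by sample_safety_signals() could be passed to EventRecorder.on_sensor_sample() in the sketch above, so that an asserted emergency brake line, for example, triggers the buffered capture.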
US Referenced Citations (875)
Number Name Date Kind
673203 Freund Apr 1901 A
673795 Hammer May 1901 A
673907 Johnson May 1901 A
676075 McDougall Jun 1901 A
679511 Richards Jul 1901 A
681036 Burg Aug 1901 A
681283 Waynick Aug 1901 A
681998 Swift Sep 1901 A
683155 Thompson Sep 1901 A
683214 Mansfield Sep 1901 A
684276 Lonergan Oct 1901 A
685082 Wood Oct 1901 A
685969 Campbell Nov 1901 A
686545 Selph Nov 1901 A
689849 Brown Dec 1901 A
691982 Sturgis Jan 1902 A
692834 Davis Feb 1902 A
694781 Prinz Mar 1902 A
2943141 Knight Jun 1960 A
3634866 Meyer Jan 1972 A
3781824 Caiati Dec 1973 A
3812287 Lemelson May 1974 A
3885090 Rosenbaum May 1975 A
3992656 Joy Nov 1976 A
4054752 Dennis Oct 1977 A
4072850 McGlynn Feb 1978 A
4258421 Juhasz Mar 1981 A
4271358 Schwarz Jun 1981 A
4276609 Patel Jun 1981 A
4280151 Tsunekawa Jul 1981 A
4281354 Conte Jul 1981 A
4401976 Stadelmayr Aug 1983 A
4409670 Herndon Oct 1983 A
4420773 Toyoda Dec 1983 A
4425097 Owens Jan 1984 A
4456931 Toyoda et al. Jun 1984 A
4489351 d'Alayer de Costemore Dec 1984 A
4496995 Colles Jan 1985 A
4500868 Tokitsu Feb 1985 A
4528547 Rodney Jul 1985 A
4533962 Decker Aug 1985 A
4558379 Hutter Dec 1985 A
4588267 Pastore May 1986 A
4593313 Nagasaki Jun 1986 A
4621335 Bluish Nov 1986 A
4625210 Sagl Nov 1986 A
4630110 Cotton Dec 1986 A
4632348 Keesling Dec 1986 A
4638289 Zottnik Jan 1987 A
4646241 Ratchford Feb 1987 A
4651143 Yamanaka Mar 1987 A
4671111 Lemelson Jun 1987 A
4718685 Kawabe Jan 1988 A
4754255 Sanders Jun 1988 A
4758888 Lapidot Jul 1988 A
4763745 Eto Aug 1988 A
4785474 Bernstein Nov 1988 A
4789904 Peterson Dec 1988 A
4794566 Richards Dec 1988 A
4804937 Barbiaux Feb 1989 A
4806931 Nelson Feb 1989 A
4807096 Skogler Feb 1989 A
4814896 Heitzman Mar 1989 A
4837628 Sasaki Jun 1989 A
4839631 Tsuji Jun 1989 A
4843463 Michetti Jun 1989 A
4843578 Wade Jun 1989 A
4853856 Hanway Aug 1989 A
4853859 Morita Aug 1989 A
4866616 Takeuchi Sep 1989 A
4876597 Roy Oct 1989 A
4883349 Mittelhaeuser Nov 1989 A
4896855 Furnish Jan 1990 A
4926331 Windle May 1990 A
4930742 Schofield Jun 1990 A
4936533 Adams Jun 1990 A
4939652 Steiner Jul 1990 A
4942464 Milatz Jul 1990 A
4945244 Castleman Jul 1990 A
4949186 Peterson Aug 1990 A
4980913 Skret Dec 1990 A
4987541 Levente Jan 1991 A
4992943 McCracken Feb 1991 A
4993068 Piosenka Feb 1991 A
4995086 Lilley Feb 1991 A
5012335 Cohodar Apr 1991 A
5027104 Reid Jun 1991 A
5046007 McCrery Sep 1991 A
5050166 Cantoni Sep 1991 A
5056056 Gustin Oct 1991 A
5057820 Markson Oct 1991 A
5096287 Kakinami Mar 1992 A
5100095 Haan Mar 1992 A
5111289 Lucas May 1992 A
5140434 Van Aug 1992 A
5140436 Blessinger Aug 1992 A
5144661 Shamosh Sep 1992 A
5178448 Adams Jan 1993 A
5185700 Bezos Feb 1993 A
5196938 Blessinger Mar 1993 A
5223844 Mansell Jun 1993 A
5224211 Roe Jun 1993 A
5262813 Scharton Nov 1993 A
5283433 Tsien Feb 1994 A
5294978 Katayama Mar 1994 A
5305214 Komatsu Apr 1994 A
5305216 Okura Apr 1994 A
5308247 Dyrdek May 1994 A
5309485 Chao May 1994 A
5311197 Sorden May 1994 A
5321753 Gritton Jun 1994 A
5327288 Wellington Jul 1994 A
5330149 Haan Jul 1994 A
5343527 Moore Aug 1994 A
5353023 Mitsugi Oct 1994 A
5361326 Aparicio Nov 1994 A
5387926 Bellan Feb 1995 A
5388045 Kamiya Feb 1995 A
5388208 Weingartner Feb 1995 A
5404330 Lee Apr 1995 A
5408330 Squicciarini Apr 1995 A
5422543 Weinberg Jun 1995 A
5430431 Nelson Jul 1995 A
5430432 Camhi Jul 1995 A
5435184 Pineroli Jul 1995 A
5445024 Riley Aug 1995 A
5445027 Zoerner Aug 1995 A
5446659 Yamawaki Aug 1995 A
5455625 Englander Oct 1995 A
5455716 Suman Oct 1995 A
5465079 Bouchard Nov 1995 A
5473729 Bryant Dec 1995 A
5477141 Naether Dec 1995 A
5495242 Kick Feb 1996 A
5495243 McKenna Feb 1996 A
5497419 Hill Mar 1996 A
5499182 Ousborne Mar 1996 A
5504482 Schreder Apr 1996 A
5505076 Parkman Apr 1996 A
5513011 Matsumoto Apr 1996 A
5515285 Garrett May 1996 A
5519260 Washington May 1996 A
5521633 Nakajima May 1996 A
5523811 Wada Jun 1996 A
5526269 Ishibashi Jun 1996 A
5530420 Tsuchiya Jun 1996 A
5532678 Kin Jul 1996 A
5537156 Katayama Jul 1996 A
5539454 Williams Jul 1996 A
5541590 Nishio Jul 1996 A
5544060 Fujii Aug 1996 A
5546191 Hibi Aug 1996 A
5546305 Kondo Aug 1996 A
5548273 Nicol Aug 1996 A
5552990 Ihara Sep 1996 A
5559496 Dubats Sep 1996 A
5568211 Bamford Oct 1996 A
5570087 Lemelson Oct 1996 A
5570127 Schmidt Oct 1996 A
5574424 Nguyen Nov 1996 A
5574443 Hsieh Nov 1996 A
D376571 Kokat Dec 1996 S
5581464 Woll Dec 1996 A
5586130 Doyle Dec 1996 A
5590948 Moreno Jan 1997 A
5596382 Bamford Jan 1997 A
5600775 King Feb 1997 A
5610580 Lai Mar 1997 A
5612686 Takano Mar 1997 A
5631638 Kaspar May 1997 A
5638273 Coiner Jun 1997 A
5642106 Hancock Jun 1997 A
5646856 Kaesser Jul 1997 A
5652706 Morimoto Jul 1997 A
RE35590 Bezos Aug 1997 E
5654892 Fujii Aug 1997 A
5659355 Barron Aug 1997 A
5666120 Kline Sep 1997 A
5667176 Zamarripa Sep 1997 A
5669698 Veldman Sep 1997 A
5671451 Takahashi Sep 1997 A
5677979 Squicciarini Oct 1997 A
5680117 Arai Oct 1997 A
5680123 Lee Oct 1997 A
5686765 Washington Nov 1997 A
5686889 Hillis Nov 1997 A
5689442 Swanson Nov 1997 A
5696705 Zykan Dec 1997 A
5706362 Yabe Jan 1998 A
5706909 Bevins Jan 1998 A
5712679 Coles Jan 1998 A
5717456 Rudt Feb 1998 A
5719554 Gagnon Feb 1998 A
5758299 Sandborg May 1998 A
5781101 Stephen Jul 1998 A
5781145 Williams Jul 1998 A
5784007 Pepper Jul 1998 A
5784021 Oliva Jul 1998 A
5784521 Nakatani Jul 1998 A
5790403 Nakayama Aug 1998 A
5790973 Blaker Aug 1998 A
5793308 Rosinski Aug 1998 A
5793420 Schmidt Aug 1998 A
5793739 Tanaka Aug 1998 A
5793985 Natarajan Aug 1998 A
5794165 Minowa Aug 1998 A
5797134 McMillan Aug 1998 A
5798458 Monroe Aug 1998 A
5800040 Santo Sep 1998 A
5802545 Coverdill Sep 1998 A
5802727 Blank Sep 1998 A
5805079 Lemelson Sep 1998 A
5813745 Fant Sep 1998 A
5815071 Doyle Sep 1998 A
5815093 Kikinis Sep 1998 A
5819198 Peretz Oct 1998 A
5825284 Dunwoody Oct 1998 A
5825412 Hobson Oct 1998 A
5844505 Van Dec 1998 A
5845733 Wolfsen Dec 1998 A
5867802 Borza Feb 1999 A
5877897 Schofield Mar 1999 A
5883337 Dolan Mar 1999 A
5896167 Omae Apr 1999 A
5897602 Mizuta Apr 1999 A
5897606 Miura Apr 1999 A
5899956 Chan May 1999 A
5901806 Takahashi May 1999 A
5914748 Parulski Jun 1999 A
5919239 Fraker Jul 1999 A
5926210 Hackett Jul 1999 A
5928291 Jenkins Jul 1999 A
5938321 Bos Aug 1999 A
5946404 Bakshi Aug 1999 A
5948038 Daly Sep 1999 A
5956664 Bryan Sep 1999 A
5959367 OFarrell Sep 1999 A
5978017 Tino Nov 1999 A
5995881 Kull Nov 1999 A
6002326 Turner Dec 1999 A
6006148 Strong Dec 1999 A
6008723 Yassan Dec 1999 A
6008841 Charlson Dec 1999 A
6009370 Minowa Dec 1999 A
6011492 Garesche Jan 2000 A
6028528 Lorenzetti Feb 2000 A
6037860 Zander Mar 2000 A
6037977 Peterson Mar 2000 A
6041410 Hsu Mar 2000 A
6049079 Noordam Apr 2000 A
6057754 Kinoshita May 2000 A
6060989 Gehlot May 2000 A
6064792 Fox May 2000 A
6067488 Tano May 2000 A
6076026 Jambhekar Jun 2000 A
6084870 Wooten Jul 2000 A
6088635 Cox Jul 2000 A
6092008 Bateman Jul 2000 A
6092193 Loomis Jul 2000 A
6100811 Hsu Aug 2000 A
6111254 Eden Aug 2000 A
6118768 Bhatia Sep 2000 A
6122738 Millard Sep 2000 A
6141611 Mackey Oct 2000 A
6144296 Ishida Nov 2000 A
6147598 Murphy Nov 2000 A
6151065 Steed Nov 2000 A
6163338 Johnson Dec 2000 A
6163749 McDonough Dec 2000 A
6167186 Kawasaki Dec 2000 A
6170742 Yacoob Jan 2001 B1
6181373 Coles Jan 2001 B1
6182010 Berstis Jan 2001 B1
6185490 Ferguson Feb 2001 B1
6195605 Tabler Feb 2001 B1
6200139 Clapper Mar 2001 B1
6208919 Barkesseh Mar 2001 B1
6211907 Scaman Apr 2001 B1
6218960 Ishikawa Apr 2001 B1
6246933 Baque Jun 2001 B1
6246934 Otake Jun 2001 B1
6252544 Hoffberg Jun 2001 B1
6253129 Jenkins Jun 2001 B1
6259475 Ramachandran Jul 2001 B1
6263265 Fera Jul 2001 B1
6266588 McClellan Jul 2001 B1
6298290 Abe Oct 2001 B1
6300875 Schafer Oct 2001 B1
6324450 Iwama Nov 2001 B1
6333759 Mazzilli Dec 2001 B1
6337622 Sugano Jan 2002 B1
6349250 Hart Feb 2002 B1
6353734 Wright Mar 2002 B1
6356823 Iannotti Mar 2002 B1
6360147 Lee Mar 2002 B1
6366207 Murphy Apr 2002 B1
6389339 Just May 2002 B1
6389340 Rayner May 2002 B1
6400835 Lemelson Jun 2002 B1
6405112 Rayner Jun 2002 B1
6405132 Breed Jun 2002 B1
6408232 Cannon Jun 2002 B1
6411874 Morgan Jun 2002 B2
6421080 Lambert Jul 2002 B1
6434510 Callaghan Aug 2002 B1
6449540 Rayner Sep 2002 B1
6456321 Ito Sep 2002 B1
6459988 Fan Oct 2002 B1
6470241 Yoshikawa Oct 2002 B2
6472771 Frese Oct 2002 B1
6490513 Fish Dec 2002 B1
6493650 Rodgers Dec 2002 B1
6505106 Lawrence Jan 2003 B1
6507838 Syeda-Mahmood Jan 2003 B1
6508400 Ishifuji Jan 2003 B1
6516256 Hartmann Feb 2003 B1
6518881 Monroe Feb 2003 B2
6525672 Chainer Feb 2003 B2
6526352 Breed Feb 2003 B1
6529159 Fan Mar 2003 B1
6535804 Chun Mar 2003 B1
6552682 Fan Apr 2003 B1
6553308 Uhlmann Apr 2003 B1
6556905 Mittelsteadt Apr 2003 B1
6559769 Anthony May 2003 B2
6574538 Sasaki Jun 2003 B2
6575902 Burton Jun 2003 B1
6580373 Ohashi Jun 2003 B1
6580983 Laguer-Diaz Jun 2003 B2
6593848 Atkins, III Jul 2003 B1
6594576 Fan Jul 2003 B2
6611740 Lowrey Aug 2003 B2
6611755 Coffee Aug 2003 B1
6624611 Kirmuss Sep 2003 B2
6629029 Giles Sep 2003 B1
6629030 Klausner Sep 2003 B2
6636791 Okada Oct 2003 B2
6664922 Fan Dec 2003 B1
6665613 Duvall Dec 2003 B2
6679702 Rau Jan 2004 B1
6684137 Takagi Jan 2004 B2
6694483 Nagata Feb 2004 B1
6701234 Vogelsang Mar 2004 B1
6714894 Tobey Mar 2004 B1
6718239 Rayner Apr 2004 B2
6721640 Glenn Apr 2004 B2
6721652 Sanqunetti Apr 2004 B1
6728612 Carver Apr 2004 B1
6732031 Lightner May 2004 B1
6732032 Banet May 2004 B1
6735503 Ames May 2004 B2
6737954 Chainer May 2004 B2
6738697 Breed May 2004 B2
6739078 Morley May 2004 B2
6741168 Webb May 2004 B2
6745153 White Jun 2004 B2
6747692 Patel Jun 2004 B2
6748305 Klausner Jun 2004 B1
6760757 Lundberg Jul 2004 B1
6762513 Landgraf Jul 2004 B2
6795017 Puranik Sep 2004 B1
6795111 Mazzilli Sep 2004 B1
6795759 Doyle Sep 2004 B2
6798743 Ma Sep 2004 B1
6804590 Sato Oct 2004 B2
6810362 Adachi Oct 2004 B2
6812831 Ikeda Nov 2004 B2
6819989 Maeda Nov 2004 B2
6831556 Boykin Dec 2004 B1
6832140 Fan Dec 2004 B2
6832141 Skeen Dec 2004 B2
6836712 Nishina Dec 2004 B2
6842762 Raithel Jan 2005 B2
6847873 Li Jan 2005 B1
6850823 Eun Feb 2005 B2
6859695 Klausner Feb 2005 B2
6859705 Rao Feb 2005 B2
6862524 Nagda Mar 2005 B1
6865457 Mittelsteadt Mar 2005 B1
6867733 Sandhu Mar 2005 B2
6873261 Anthony Mar 2005 B2
6882313 Fan Apr 2005 B1
6882912 DiLodovico Apr 2005 B2
6894606 Forbes May 2005 B2
6895248 Akyol May 2005 B1
6898492 De Leon May 2005 B2
6898493 Ehrman May 2005 B2
6919823 Lock Jul 2005 B1
6922566 Puranik Jul 2005 B2
6928348 Lightner Aug 2005 B1
6931309 Phelan Aug 2005 B2
6947817 Diem Sep 2005 B2
6950122 Mirabile Sep 2005 B1
6954223 Miyazawa Oct 2005 B2
6988034 Marlatt Jan 2006 B1
7003289 Kolls Feb 2006 B1
7012632 Freeman Mar 2006 B2
7020548 Saito Mar 2006 B2
7023333 Blanco Apr 2006 B2
7027621 Prokoski Apr 2006 B1
7039510 Gumpinger May 2006 B2
7076348 Bucher Jul 2006 B2
7079927 Tano Jul 2006 B1
7082359 Breed Jul 2006 B2
7082382 Rose et al. Jul 2006 B1
7088387 Freeman Aug 2006 B1
7095782 Cohen Aug 2006 B1
7098812 Hirota Aug 2006 B2
7100190 Johnson Aug 2006 B2
7113853 Hecklinger Sep 2006 B2
7117075 Larschan Oct 2006 B1
7119832 Blanco Oct 2006 B2
7138904 Dutu Nov 2006 B1
7155321 Bromley Dec 2006 B2
7177738 Diaz Feb 2007 B2
7209833 Isaji Apr 2007 B2
7239252 Kato Jul 2007 B2
7254482 Kawasaki Aug 2007 B2
7265663 Steele Sep 2007 B2
7266507 Simon Sep 2007 B2
7272179 Siemens Sep 2007 B2
7308341 Schofield Dec 2007 B2
7317974 Luskin Jan 2008 B2
7343306 Bates Mar 2008 B1
7348895 Lagassey Mar 2008 B2
7349027 Endo Mar 2008 B2
7370261 Winarski May 2008 B2
7382933 Dorai Jun 2008 B2
7386376 Basir Jun 2008 B2
7389178 Raz Jun 2008 B2
7398140 Kernwein Jul 2008 B2
7457693 Olsen Nov 2008 B2
7471189 Vastad Dec 2008 B2
7471192 Hara Dec 2008 B2
7536457 Miller May 2009 B2
7561054 Raz Jul 2009 B2
7584033 Mittelsteadt Sep 2009 B2
7623754 McKain Nov 2009 B1
7659827 Gunderson Feb 2010 B2
7659835 Jung Feb 2010 B2
7667731 Kreiner Feb 2010 B2
7698028 Bilodeau Apr 2010 B1
7702442 Takenaka Apr 2010 B2
7725216 Kim May 2010 B2
7768548 Silvernail Aug 2010 B2
7769499 McQuade Aug 2010 B2
7783956 Ko Aug 2010 B2
7804426 Etcheson Sep 2010 B2
7821421 Tamir Oct 2010 B2
7853376 Peng Dec 2010 B2
7893958 DAgostino Feb 2011 B1
7940250 Forstall May 2011 B2
7941258 Mittelsteadt May 2011 B1
7974748 Goerick Jul 2011 B2
8054168 McCormick Nov 2011 B2
8068979 Breed Nov 2011 B2
8090598 Bauer Jan 2012 B2
8113844 Huang Feb 2012 B2
8139820 Plante Mar 2012 B2
8140265 Grush Mar 2012 B2
8140358 Ling Mar 2012 B1
8152198 Breed Apr 2012 B2
8239092 Plante Aug 2012 B2
8269617 Cook Sep 2012 B2
8311858 Everett Nov 2012 B2
8314708 Gunderson Nov 2012 B2
8321066 Becker Nov 2012 B2
8373567 Denson Feb 2013 B2
8417562 Siemens Apr 2013 B1
8442690 Goldstein May 2013 B2
8471701 Yariv Jun 2013 B2
8508353 Cook Aug 2013 B2
8538696 Cassanova Sep 2013 B1
8538785 Coleman Sep 2013 B2
8564426 Cook Oct 2013 B2
8564446 Gunderson Oct 2013 B2
8571755 Plante Oct 2013 B2
8577703 McClellan Nov 2013 B2
8606492 Botnen Dec 2013 B1
8635557 Geise Jan 2014 B2
8676428 Richardson Mar 2014 B2
8744642 Nemat-Nasser Jun 2014 B2
8775067 Cho Jul 2014 B2
8803695 Denson Aug 2014 B2
8849501 Cook Sep 2014 B2
8855847 Uehara Oct 2014 B2
8868288 Plante Oct 2014 B2
8880279 Plante Nov 2014 B2
8892310 Palmer Nov 2014 B1
8989959 Plante Mar 2015 B2
8996234 Tamari Mar 2015 B1
8996240 Plante Mar 2015 B2
9047721 Botnen Jun 2015 B1
9183679 Plante Nov 2015 B2
9201842 Plante Dec 2015 B2
9208129 Plante Dec 2015 B2
9226004 Plante Dec 2015 B1
9240079 Lambert Jan 2016 B2
9296401 Palmer Mar 2016 B1
20010005217 Hamilton Jun 2001 A1
20010005804 Rayner Jun 2001 A1
20010018628 Jenkins Aug 2001 A1
20010020204 Runyon Sep 2001 A1
20010052730 Baur Dec 2001 A1
20020019689 Harrison Feb 2002 A1
20020027502 Mayor Mar 2002 A1
20020029109 Wong Mar 2002 A1
20020035422 Sasaki Mar 2002 A1
20020044225 Rakib Apr 2002 A1
20020059453 Eriksson May 2002 A1
20020061758 Zarlengo May 2002 A1
20020067076 Talbot Jun 2002 A1
20020087240 Raithel Jul 2002 A1
20020091473 Gardner Jul 2002 A1
20020105438 Forbes Aug 2002 A1
20020107619 Klausner Aug 2002 A1
20020111725 Burge Aug 2002 A1
20020111756 Modgil Aug 2002 A1
20020118206 Knittel Aug 2002 A1
20020120374 Douros Aug 2002 A1
20020135679 Scaman Sep 2002 A1
20020138587 Koehler Sep 2002 A1
20020163532 Thomas Nov 2002 A1
20020169529 Kim Nov 2002 A1
20020169530 Laguer-Diaz Nov 2002 A1
20020183905 Maeda Dec 2002 A1
20030016753 Kim Jan 2003 A1
20030028298 Macky Feb 2003 A1
20030053433 Chun Mar 2003 A1
20030055557 Dutta Mar 2003 A1
20030065805 Barnes Apr 2003 A1
20030067541 Joao Apr 2003 A1
20030079041 Parrella Apr 2003 A1
20030080713 Kirmuss May 2003 A1
20030080878 Kirmuss May 2003 A1
20030081121 Kirmuss May 2003 A1
20030081122 Kirmuss May 2003 A1
20030081127 Kirmuss May 2003 A1
20030081128 Kirmuss May 2003 A1
20030081934 Kirmuss May 2003 A1
20030081935 Kirmuss May 2003 A1
20030095688 Kirmuss May 2003 A1
20030112133 Webb Jun 2003 A1
20030125854 Kawasaki Jul 2003 A1
20030144775 Klausner Jul 2003 A1
20030154009 Basir Aug 2003 A1
20030158638 Yakes Aug 2003 A1
20030177187 Levine Sep 2003 A1
20030187704 Hashiguchi Oct 2003 A1
20030191568 Breed Oct 2003 A1
20030195678 Betters et al. Oct 2003 A1
20030214585 Bakewell Nov 2003 A1
20030220835 Barnes Nov 2003 A1
20030222880 Waterman Dec 2003 A1
20040008255 Lewellen Jan 2004 A1
20040015276 Kane Jan 2004 A1
20040033058 Reich Feb 2004 A1
20040039503 Doyle Feb 2004 A1
20040039504 Coffee Feb 2004 A1
20040044452 Bauer Mar 2004 A1
20040044592 Ubik Mar 2004 A1
20040054444 Abeska Mar 2004 A1
20040054513 Laird Mar 2004 A1
20040054689 Salmonsen Mar 2004 A1
20040064245 Knockeart Apr 2004 A1
20040070926 Boykin Apr 2004 A1
20040083041 Skeen Apr 2004 A1
20040088090 Wee May 2004 A1
20040103008 Wahlbin May 2004 A1
20040103010 Wahlbin May 2004 A1
20040104842 Drury Jun 2004 A1
20040111189 Miyazawa Jun 2004 A1
20040138794 Saito Jul 2004 A1
20040145457 Schofield et al. Jul 2004 A1
20040153244 Kellum Aug 2004 A1
20040153362 Bauer Aug 2004 A1
20040167689 Bromley Aug 2004 A1
20040179600 Wells Sep 2004 A1
20040181326 Adams Sep 2004 A1
20040184548 Kerbiriou Sep 2004 A1
20040203903 Wilson Oct 2004 A1
20040209594 Naboulsi Oct 2004 A1
20040210353 Rice Oct 2004 A1
20040230345 Tzamaloukas Nov 2004 A1
20040230370 Tzamaloukas Nov 2004 A1
20040230373 Tzamaloukas Nov 2004 A1
20040230374 Tzamaloukas Nov 2004 A1
20040233284 Lesesky Nov 2004 A1
20040236474 Chowdhary Nov 2004 A1
20040243285 Gounder Dec 2004 A1
20040243308 Irish Dec 2004 A1
20040243668 Harjanto Dec 2004 A1
20040254689 Blazic Dec 2004 A1
20040254698 Hubbard Dec 2004 A1
20050021199 Zimmerman Jan 2005 A1
20050043869 Funkhouser Feb 2005 A1
20050060070 Kapolka Mar 2005 A1
20050060071 Winner Mar 2005 A1
20050065682 Kapadia Mar 2005 A1
20050065716 Timko Mar 2005 A1
20050073585 Ettinger Apr 2005 A1
20050078423 Kim Apr 2005 A1
20050088291 Blanco Apr 2005 A1
20050099498 Lao May 2005 A1
20050100329 Lao May 2005 A1
20050102074 Kolls May 2005 A1
20050107954 Nahla May 2005 A1
20050125117 Breed Jun 2005 A1
20050131585 Luskin Jun 2005 A1
20050131595 Luskin Jun 2005 A1
20050131597 Raz Jun 2005 A1
20050136949 Barnes Jun 2005 A1
20050137757 Phelan Jun 2005 A1
20050137796 Gumpinger Jun 2005 A1
20050146458 Carmichael Jul 2005 A1
20050149238 Stefani Jul 2005 A1
20050149259 Cherveny Jul 2005 A1
20050159964 Sonnenrein Jul 2005 A1
20050166258 Vasilevsky Jul 2005 A1
20050168258 Poskatcheev Aug 2005 A1
20050171692 Hamblen Aug 2005 A1
20050174217 Basir Aug 2005 A1
20050182538 Phelan Aug 2005 A1
20050182824 Cotte Aug 2005 A1
20050183627 Hommen Aug 2005 A1
20050185052 Raisinghani Aug 2005 A1
20050185936 Lao Aug 2005 A9
20050192749 Flann Sep 2005 A1
20050197748 Hoist Sep 2005 A1
20050200714 Marchese Sep 2005 A1
20050203683 Olsen Sep 2005 A1
20050205719 Hendrickson Sep 2005 A1
20050206741 Raber Sep 2005 A1
20050209776 Ogino Sep 2005 A1
20050212920 Evans Sep 2005 A1
20050216144 Baldassa Sep 2005 A1
20050228560 Doherty Oct 2005 A1
20050233805 Okajima Oct 2005 A1
20050251304 Cancellara Nov 2005 A1
20050251337 Rajaram Nov 2005 A1
20050256681 Brinton Nov 2005 A1
20050258942 Manasseh Nov 2005 A1
20050264691 Endo Dec 2005 A1
20050283284 Grenier Dec 2005 A1
20060001671 Kamijo Jan 2006 A1
20060007151 Ram Jan 2006 A1
20060011399 Brockway Jan 2006 A1
20060015233 Olsen Jan 2006 A1
20060022842 Zoladek Feb 2006 A1
20060025897 Shostak Feb 2006 A1
20060030986 Peng Feb 2006 A1
20060040239 Cummins Feb 2006 A1
20060047380 Welch Mar 2006 A1
20060053038 Warren Mar 2006 A1
20060055521 Blanco Mar 2006 A1
20060057543 Roald Mar 2006 A1
20060058950 Kato Mar 2006 A1
20060072792 Toda Apr 2006 A1
20060078853 Lanktree Apr 2006 A1
20060082438 Bazakos Apr 2006 A1
20060092043 Lagassey May 2006 A1
20060095175 DeWaal et al. May 2006 A1
20060095199 Lagassey May 2006 A1
20060095349 Morgan May 2006 A1
20060098843 Chew May 2006 A1
20060103127 Lie May 2006 A1
20060106514 Liebl May 2006 A1
20060111817 Phelan May 2006 A1
20060122749 Phelan Jun 2006 A1
20060142913 Coffee Jun 2006 A1
20060147187 Takemoto Jul 2006 A1
20060161960 Benoit Jul 2006 A1
20060168271 Pabari Jul 2006 A1
20060178793 Hecklinger Aug 2006 A1
20060180647 Hansen Aug 2006 A1
20060200008 Moore-Ede Sep 2006 A1
20060200305 Sheha Sep 2006 A1
20060204059 Ido Sep 2006 A1
20060209090 Kelly Sep 2006 A1
20060209840 Paatela Sep 2006 A1
20060212195 Veith Sep 2006 A1
20060215884 Ota Sep 2006 A1
20060226344 Werth Oct 2006 A1
20060229780 Underdahl Oct 2006 A1
20060242680 Johnson Oct 2006 A1
20060244830 Davenport Nov 2006 A1
20060247833 Malhotra Nov 2006 A1
20060253307 Warren Nov 2006 A1
20060259218 Wu Nov 2006 A1
20060261931 Cheng Nov 2006 A1
20070001831 Raz Jan 2007 A1
20070005404 Raz Jan 2007 A1
20070027583 Tamir Feb 2007 A1
20070027726 Warren Feb 2007 A1
20070043487 Krzystofczyk Feb 2007 A1
20070120948 Fujioka May 2007 A1
20070124332 Ballesty May 2007 A1
20070127833 Singh Jun 2007 A1
20070132773 Plante Jun 2007 A1
20070135979 Plante Jun 2007 A1
20070135980 Plante Jun 2007 A1
20070136078 Plante Jun 2007 A1
20070142986 Alaous Jun 2007 A1
20070143499 Chang Jun 2007 A1
20070150138 Plante Jun 2007 A1
20070150140 Seymour Jun 2007 A1
20070173994 Kubo Jul 2007 A1
20070179691 Grenn Aug 2007 A1
20070183635 Weidhaas Aug 2007 A1
20070208494 Chapman Sep 2007 A1
20070216521 Guensler Sep 2007 A1
20070216771 Kumar Sep 2007 A1
20070217670 Bar-Am Sep 2007 A1
20070219685 Plante Sep 2007 A1
20070219686 Plante Sep 2007 A1
20070241874 Okpysh Oct 2007 A1
20070244614 Nathanson Oct 2007 A1
20070257781 Denson Nov 2007 A1
20070257782 Etcheson Nov 2007 A1
20070257804 Gunderson Nov 2007 A1
20070257815 Gunderson Nov 2007 A1
20070260677 DeMarco Nov 2007 A1
20070268158 Gunderson Nov 2007 A1
20070271105 Gunderson Nov 2007 A1
20070272116 Bartley Nov 2007 A1
20070273480 Burkman Nov 2007 A1
20070279214 Buehler Dec 2007 A1
20070299612 Kimura Dec 2007 A1
20080035108 Ancimer Feb 2008 A1
20080059019 Delia Mar 2008 A1
20080071827 Hengel Mar 2008 A1
20080111666 Plante May 2008 A1
20080122603 Plante May 2008 A1
20080143834 Comeau Jun 2008 A1
20080147267 Plante Jun 2008 A1
20080157510 Breed Jul 2008 A1
20080167775 Kuttenberger Jul 2008 A1
20080169914 Albertson Jul 2008 A1
20080177436 Fortson Jul 2008 A1
20080234920 Nurminen Sep 2008 A1
20080243389 Inoue Oct 2008 A1
20080252485 Lagassey Oct 2008 A1
20080252487 McClellan Oct 2008 A1
20080269978 Shirole Oct 2008 A1
20080281485 Plante Nov 2008 A1
20080309762 Howard et al. Dec 2008 A1
20080319604 Follmer Dec 2008 A1
20090009321 McClellan Jan 2009 A1
20090043500 Satoh Feb 2009 A1
20090043971 Kim Feb 2009 A1
20090051510 Follmer Feb 2009 A1
20090138191 Engelhard May 2009 A1
20090157255 Plante Jun 2009 A1
20090216775 Ratliff et al. Aug 2009 A1
20090224869 Baker Sep 2009 A1
20090290848 Brown Nov 2009 A1
20090299622 Denaro Dec 2009 A1
20090312998 Berckmans Dec 2009 A1
20090326796 Prokhorov Dec 2009 A1
20100030423 Nathanson Feb 2010 A1
20100045451 Periwal Feb 2010 A1
20100049516 Talwar Feb 2010 A1
20100057342 Muramatsu Mar 2010 A1
20100063672 Anderson Mar 2010 A1
20100063680 Tolstedt Mar 2010 A1
20100063850 Daniel Mar 2010 A1
20100070175 Soulchin Mar 2010 A1
20100076621 Kubotani Mar 2010 A1
20100085193 Boss Apr 2010 A1
20100085430 Kreiner Apr 2010 A1
20100087984 Joseph Apr 2010 A1
20100094489 Moffitt Apr 2010 A1
20100100315 Davidson Apr 2010 A1
20100104199 Zhang Apr 2010 A1
20100153146 Angell Jun 2010 A1
20100157061 Katsman Jun 2010 A1
20100191411 Cook Jul 2010 A1
20100204857 Forrest Aug 2010 A1
20100220892 Kawakubo Sep 2010 A1
20100241296 Rhea Sep 2010 A1
20100250020 Lee Sep 2010 A1
20100250021 Cook Sep 2010 A1
20100250060 Maeda Sep 2010 A1
20100250116 Yamaguchi Sep 2010 A1
20100253918 Seder Oct 2010 A1
20100268415 Ishikawa Oct 2010 A1
20100283633 Becker Nov 2010 A1
20100312464 Fitzgerald Dec 2010 A1
20100327125 Braband Dec 2010 A1
20110035139 Konlditslotis Feb 2011 A1
20110043624 Haug Feb 2011 A1
20110060496 Nielsen Mar 2011 A1
20110077028 Wilkes Mar 2011 A1
20110091079 Yu-Song Apr 2011 A1
20110093159 Boling Apr 2011 A1
20110112995 Chang May 2011 A1
20110121960 Tsai May 2011 A1
20110125365 Larschan May 2011 A1
20110130916 Mayer Jun 2011 A1
20110140884 Santiago Jun 2011 A1
20110153367 Amigo Jun 2011 A1
20110161116 Peak Jun 2011 A1
20110173015 Chapman Jul 2011 A1
20110213628 Peak Sep 2011 A1
20110216200 Chung Sep 2011 A1
20110224891 Iwuchukwu Sep 2011 A1
20110251752 DeLarocheliere Oct 2011 A1
20110257882 McBurney Oct 2011 A1
20110273568 Lagassey Nov 2011 A1
20110283223 Vattinen et al. Nov 2011 A1
20110285842 Davenport Nov 2011 A1
20110304446 Basson Dec 2011 A1
20120021386 Anderson Jan 2012 A1
20120035788 Trepagnier Feb 2012 A1
20120041675 Juliver Feb 2012 A1
20120046803 Inou Feb 2012 A1
20120071140 Oesterling Mar 2012 A1
20120072088 Cutright Mar 2012 A1
20120078063 Moore-Ede Mar 2012 A1
20120081567 Cote Apr 2012 A1
20120100509 Gunderson Apr 2012 A1
20120109447 Yousefi May 2012 A1
20120123806 Schumann May 2012 A1
20120130563 McBain May 2012 A1
20120134547 Jung May 2012 A1
20120150436 Rossano Jun 2012 A1
20120190001 Knight Jul 2012 A1
20120203402 Jape Aug 2012 A1
20120210252 Fedoseyeva Aug 2012 A1
20120245908 Berggren Sep 2012 A1
20120269383 Bobbitt Oct 2012 A1
20120277950 Plante Nov 2012 A1
20120283895 Noda Nov 2012 A1
20130004138 Kilar Jan 2013 A1
20130006469 Green Jan 2013 A1
20130018534 Hilleary Jan 2013 A1
20130021148 Cook Jan 2013 A1
20130030660 Fujimoto Jan 2013 A1
20130032054 Schneider Feb 2013 A1
20130046421 ElFassi Feb 2013 A1
20130048795 Cross Feb 2013 A1
20130073114 Nemat-Nasser Mar 2013 A1
20130096731 Tamari Apr 2013 A1
20130197774 Denson Aug 2013 A1
20130274950 Richardson Oct 2013 A1
20130317711 Plante Nov 2013 A1
20130332004 Gompert et al. Dec 2013 A1
20130345927 Cook Dec 2013 A1
20140012438 Shoppa Jan 2014 A1
20140025225 Armitage Jan 2014 A1
20140025254 Plante Jan 2014 A1
20140046550 Palmer Feb 2014 A1
20140047371 Palmer Feb 2014 A1
20140052315 Isailovski Feb 2014 A1
20140098228 Plante Apr 2014 A1
20140152828 Plante Jun 2014 A1
20140226010 Molin Aug 2014 A1
20140257594 Hashimoto Sep 2014 A1
20140279707 Joshua Sep 2014 A1
20140280204 Avery Sep 2014 A1
20140339374 Mian Nov 2014 A1
20150000415 Kelley Jan 2015 A1
20150035665 Plante Feb 2015 A1
20150057836 Plante Feb 2015 A1
20150105934 Palmer Apr 2015 A1
20150134226 Palmer May 2015 A1
20150202935 Muthusamy Jul 2015 A1
20150203116 Fairgrieve Jul 2015 A1
20150317846 Plante Nov 2015 A1
20150371462 Ramesh Dec 2015 A1
20160114820 Palmer Apr 2016 A1
20160140872 Palmer May 2016 A1
20160200330 Palmer Jul 2016 A1
20160200333 Palmer Jul 2016 A1
20160292936 Palmer Oct 2016 A1
Foreign Referenced Citations (99)
Number Date Country
1126093 Jun 1982 CA
1126093 Jun 1982 CA
2469728 Dec 2005 CA
2469728 Dec 2005 CA
2632689 Jun 2007 CA
2692415 Aug 2011 CA
2692415 Aug 2011 CA
4416991 Nov 1995 DE
20311262 Sep 2003 DE
20311262 Sep 2003 DE
202005008238 Sep 2005 DE
102004004669 Dec 2005 DE
102004004669 Dec 2005 DE
0708427 Apr 1996 EP
0840270 May 1998 EP
0848270 May 1998 EP
1115092 Jul 2001 EP
1170697 Jan 2002 EP
1324274 Jul 2003 EP
1355278 Oct 2003 EP
1427165 Jun 2004 EP
1818873 Aug 2007 EP
06847537.5 Aug 2008 EP
07752929.5 Dec 2008 EP
07753183.8 Dec 2008 EP
07772812.9 Dec 2008 EP
2104075 Sep 2009 EP
2268608 Jan 1994 GB
2402530 Dec 2004 GB
2402530 Dec 2004 GB
2451485 Feb 2009 GB
2447184 Jun 2011 GB
2446994 Aug 2011 GB
58085110 May 1983 JP
S5885110 May 1983 JP
62091092 Apr 1987 JP
S6291092 Apr 1987 JP
S62166135 Jul 1987 JP
02056197 Feb 1990 JP
H0256197 Feb 1990 JP
H04257189 Sep 1992 JP
H05137144 Jun 1993 JP
5294188 Nov 1993 JP
H08124069 May 1996 JP
H09163357 Jun 1997 JP
H09272399 Oct 1997 JP
10076880 Mar 1998 JP
H1076880 Mar 1998 JP
2002191017 Jul 2002 JP
2002191017 Jul 2002 JP
8809023 Nov 1988 WO
9005076 May 1990 WO
9427844 Dec 1994 WO
9600957 Jan 1996 WO
9701246 Jan 1997 WO
9726750 Jul 1997 WO
9937503 Jul 1999 WO
9940545 Aug 1999 WO
PCT/US99/01810 Aug 1999 WO
9962741 Dec 1999 WO
0007150 Feb 2000 WO
0028410 May 2000 WO
0048033 Aug 2000 WO
0077620 Dec 2000 WO
0123214 Apr 2001 WO
0125054 Apr 2001 WO
PCT/US99/29382 Apr 2001 WO
0146710 Jun 2001 WO
03045514 Jun 2003 WO
2005095175 Oct 2005 WO
2005118366 Dec 2005 WO
2006022824 Mar 2006 WO
2006022824 Mar 2006 WO
WO2006047042 May 2006 WO
2006125256 Nov 2006 WO
2006125256 Nov 2006 WO
WO2006047029 Dec 2006 WO
WO2006047055 Dec 2006 WO
WO2007006265 Mar 2007 WO
WO2007006404 Mar 2007 WO
WO2007006536 Mar 2007 WO
2007067767 Jun 2007 WO
PCT/US07/68325 Nov 2007 WO
PCT/US07/68328 Nov 2007 WO
PCT/US07/68329 Nov 2007 WO
PCT/US07/68331 Nov 2007 WO
PCT/US07/68332 Nov 2007 WO
PCT/US07/68333 Nov 2007 WO
PCT/US07/68334 Nov 2007 WO
PCT/US07/084366 Nov 2007 WO
WO2007083997 Nov 2007 WO
WO2007083998 Nov 2007 WO
PCT/US07/75397 Feb 2008 WO
PCT/US10/22012 Jul 2010 WO
2011055743 May 2011 WO
PCT/US11/22087 Jul 2011 WO
PCT/US12/55063 Mar 2013 WO
PCT/US12/55060 Apr 2013 WO
2013134615 Sep 2013 WO
Non-Patent Literature Citations (261)
Entry
DriveCam, Inc.'s Infringement Contentions Exhibit B, U.S. Pat. No. 7,659,827. Aug. 19, 2011. (29 pgs.).
DriveCam, Inc.'s Infringement Contentions Exhibit C, U.S. Pat. No. 7,804,426. Aug. 19, 2011. (47 pgs.).
DriveCam's Disclosure of Asserted Claims and Preliminary Infringement Contentions in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H-RBB, for the Southern District of California. Aug. 19, 2011. (6 pgs.).
Preliminary Claim Construction and Identification of Extrinsic Evidence of Defendant/Counterclaimant SmartDrive Systems, Inc. in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H (RBB), for the Southern District of California. Nov. 8, 2011. (13 pgs.).
Supplement to DriveCam's Disclosure of Asserted Claims and Preliminary Infringement Contentions' in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H-RBB, for the Southern District of California. Oct. 14, 2011. (7 pgs.).
“DriveCam, Inc's Disclosure of Proposed Constructions and Extrinsic Evidence Pursuant to Patent L.R. 4.1.a & 4.1.b” Disclosure and Extrinsic Evidence in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H-RBB, for the Southern District of California. Nov. 8, 2011, 68 pages.
“DriveCam Driving Feedback System”, DriveCam brochure, Jun. 12, 2001, Document #6600128, 2 pages.
“DriveCam Driving Feedback System” DriveCam brochure, Mar. 15, 2004, 4 pages.
“DriveCam Passenger Transportation Module”, DriveCam brochure, Oct. 26, 2001, 2 pages.
“DriveCam Video Event Data Recorder”, DriveCam brochure, Nov. 6, 2002, Document #6600127, 2 pages.
“Responsive Claim Construction and Identification of Extrinsic Evidence of Defendant/Counterclaimant SmartDrive Systems, Inc.” Claim Construction and Extrinsic Evidence in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H (RBB), for the Southern District of California. Nov. 15, 2011, 20 pages.
“Sonic MyDVD 4.0: Tutorial: Trimming video segments”. Tutorial for software bundled with Adaptec VideoOh! DVD USB 2.0 Edition, 2003, 13 pages.
“User's Manual for DriveCam Video Systems' HindSight 20/20 Software Version 4.0” DriveCam Manual, San Diego, 2003, Document #6600141-1, 54 pages.
Canadian Office Action issued in Application No. 2,632,685 dated Jan. 30, 2015; 5 pages.
Dan Maher, “DriveCam Taking Risk Out of Driving”, DriveCam brochure folder, Jun. 6, 2005, 6 pages.
Del Lisk, “DriveCam Training Seminar” Handout, 2004, 16 pages.
European Examination Report issued in EP 07772812.9 on Jan. 22, 2015; 5 pages.
Jean (DriveCam vendor) “DriveCam Driving Feedback System”, DriveCam brochure, Nov. 6, 2002, Document #6600128-1, 2 pages.
Notice of Allowance for U.S. Appl. No. 14/036,299, mailed Mar. 20, 2015, 5 pages.
Notice of Allowance for U.S. Appl. No. 11/566,424, mailed Feb. 26, 2010, 6 pages.
Notice of Allowance for U.S. Appl. No. 11/377,164, mailed Dec. 3, 2014, 5 pages.
Notice of Allowance for U.S. Appl. No. 11/377,164, mailed Feb. 13, 2015, 2 pages.
Notice of Allowance for U.S. Appl. No. 11/377,164, mailed Feb. 25, 2014, 2 pages.
Notice of Allowance for U.S. Appl. No. 11/377,164, mailed Nov. 18, 2013, 7 pages.
Notice of Allowance for U.S. Appl. No. 11/377,167, mailed Apr. 1, 2015, 7 pages.
Notice of Allowance for U.S. Appl. No. 11/800,876, mailed Apr. 19, 2012, 8 pages.
Notice of Allowance for U.S. Appl. No. 13/957,810, mailed Jun. 8, 2015, 10 pages.
USPTO Final Office Action for U.S. Appl. No. 11/296,906, mailed Aug. 8, 2012, 15 pages.
USPTO Final Office Action for U.S. Appl. No. 12/096,591, mailed Dec. 5, 2014, 23 pages.
USPTO Final Office Action for U.S. Appl. No. 12/096,591, mailed Jul. 18, 2012, 15 pages.
USPTO Final Office Action for U.S. Appl. No. 12/096,591, mailed Nov. 7, 2013, 14 pages.
USPTO Final Office Action for U.S. Appl. No. 13/957,810, mailed Jun. 27, 2014, 22 pages.
USPTO Final Office Action for U.S. Appl. No. 14/036,299, mailed Feb. 24, 2015, 9 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/296,906, mailed Apr. 8, 2014, 19 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/296,906, mailed Jun. 12, 2012, 13 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,164, mailed Apr. 7, 2014, 7 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,164, mailed Aug. 18, 2014, 5 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,164, mailed Sep. 10, 2012, 10 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,167, mailed Jun. 27, 2013, 11 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 12/096,591, mailed Jun. 14, 2011, 8 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 12/096,591, mailed Mar. 27, 2013, 16 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 13/957,810, mailed Apr. 17, 2015, 6 pages.
USPTO Non-final Office Action for U.S. Appl. No. 13/957,810, mailed Nov. 27, 2013, 18 pages.
Gary A. Rayner, U.S. Appl. No. 09/405,857, filed Sep. 24, 1999.
Jamie Etcheson, U.S. Appl. No. 11/566,424, filed Dec. 4, 2006.
DriveCam, Inc., U.S. Appl. No. 14/070,206, filed Nov. 1, 2013.
USPTO Non-Final Office Action mailed Jan. 4, 2016 in U.S. Appl. No. 14/529,134, filed Oct. 30, 2014 (65 pgs).
Adaptec published and sold its VideoOh! DVD software USB 2.0 Edition at least as early as Jan. 24, 2003.
Ambulance Companies Use Video Technology to Improve Driving Behavior, Ambulance Industry Journal, Spring 2003.
Amended Complaint for Patent Infringement, Trade Secret Misappropriation, Unfair Competition and Conversion in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H-RBB, for the Southern District of California, Document 34, filed Oct. 20, 2011, pp. 1-15.
Amendment filed Dec. 23, 2009 during prosecution of U.S. Appl. No. 11/566,424.
Answer to Amended Complaint; Counterclaims; and Demand for Jury Trial in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997 H (RBB), for the Southern District of California, Document 47, filed Dec. 13, 2011, pp. 1-15.
U.S. Appl. No. 11/296,906, filed Dec. 8, 2005, File History.
U.S. Appl. No. 11/297,669, filed Dec. 8, 2005, File History.
U.S. Appl. No. 11/297,889, filed Dec. 8, 2005, File History.
U.S. Appl. No. 11/298,069, filed Dec. 9, 2005, File History.
U.S. Appl. No. 11/299,028, filed Dec. 9, 2005, File History.
U.S. Appl. No. 11/593,659, filed Nov. 7, 2006, File History.
U.S. Appl. No. 11/593,682, filed Nov. 7, 2006, File History.
U.S. Appl. No. 11/593,882, filed Nov. 7, 2006, File History.
U.S. Appl. No. 11/595,015, filed Nov. 9, 2006, File History.
U.S. Appl. No. 11/637,754, filed Dec. 13, 2006, File History.
U.S. Appl. No. 11/637,755, filed Dec. 13, 2006, File History.
Bill, ‘DriveCam—FAQ’, Dec. 12, 2003.
Bill Siuru, ‘DriveCam Could Save You Big Bucks’, Land Line Magazine, May-Jun. 2000.
Chris Woodyard, ‘Shuttles save with DriveCam’, Dec. 9, 2003.
Dan Carr, Flash Video Template: Video Presentation with Navigation, Jan. 16, 2006, http://www.adobe.com/devnet/flash/articles/vidtemplate_mediapreso_flash8.html.
David Cullen, ‘Getting a real eyeful’, Fleet Owner Magazine, Feb. 2002.
David Maher, ‘DriveCam Brochure Folder’, Jun. 6, 2005.
David Maher, “DriveCam Brochure Folder”, Jun. 8, 2005.
David Vogeleer et al., Macromedia Flash Professional 8UNLEASHED (Sams Oct. 12, 2005).
Del Lisk, ‘DriveCam Training Handout Ver4’, Feb. 3, 2005.
Drivecam, Inc., User's Manual for Drivecam Video Systems' Hindsight 20/20 Software Version 4.0 (2003).
DriveCam, Inc.'s Infringement Contentions Exhibit A, U.S. Pat. No. 6,389,340, Document 34.1, Oct. 20, 2011.
DriveCam, Inc.'s Infringement Contentions Exhibit B, U.S. Pat. No. 7,659,827. Aug. 19, 2011.
DriveCam, Inc.'s Infringement Contentions Exhibit B, U.S. Pat. No. 7,804,426, Document 34.2, Oct. 20, 2011.
DriveCam, Inc.'s Infringement Contentions Exhibit C, U.S. Pat. No. 7,659,827, Document 34.3, Oct. 20, 2011.
DriveCam, Inc.'s Infringement Contentions Exhibit C, U.S. Pat. No. 7,804,426. Aug. 19, 2011.
DriveCam, Inc.'s Infringement Contentions Exhibit D, Document 34.4, Oct. 20, 2011.
DriveCam—Illuminator Data Sheet, Oct. 2, 2004.
Drivecam.com as retrieved by the Internet Wayback Machine as of Mar. 5, 2005.
DriveCam's Disclosure of Asserted Claims and Preliminary Infringement Contentions in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H-RBB, for the Southern District of California. Aug. 19, 2011.
DriveCam Driving Feedback System, Mar. 15, 2004.
DriveCam Extrinsic Evidence with Patent LR 4.1 .a Disclosures, Nov. 3, 2011.
DriveCam Extrinsic Evidence with Patent LR 4.1 .a Disclosures, Nov. 8, 2011.
Driver Feedback System, Jun. 12, 2001.
First Amended Answer to Amended Complaint and First Amended Counterclaims; and Demand for Jury Trial in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997 H (RBB), for the Southern District of California, Document 53, filed Dec. 20, 2011, pp. 1-48.
First Amended Answer to Amended Complaint and First Amended Counterclaims; and Demand for Jury Trial in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997 H (RBB), for the Southern District of California, Document 55, filed Jan. 1, 2012, pp. 86-103.
First Amended Answer to Amended Complaint and First Amended Counterclaims; and Demand for Jury Trial in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997 H (RBB), for the Southern District of California, Document 55, filed Jan. 3, 2012, pp. 86-103.
First Amended Answer to Amended Complaint and First Amended Counterclaims; and Demand for Jury Trial in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997 H (RBB), for the Southern District of California, Exhibit A, Document 55, filed Jan. 3, 2012, pp. 49-103.
Franke, U., et al., Autonomous Driving Goes Downtown, IEEE Intelligent Systems, 13(6):40-48 (1998); Digital Object Identifier 10.1109/5254.736001.
Gallagher, B., et al., Wireless Communications for Vehicle Safety: Radio Link Performance and Wireless Connectivity Methods, Vehicular Technology Magazine, IEEE, 1(4):4-24 (2006); Digital Object Identifier 10.1109/MVT.2006.343641.
Gandhi, T., et al., Pedestrian Protection Systems: Issues, Survey, and Challenges, IEEE Transactions on Intelligent Transportation Systems, 8(3):413-430 (2007); Digital Object Identifier 10.1109/TITS.2007.903444.
Gary and Sophia Rayner, Final Report for Innovations Deserving Exploratory Analysis (IDEA) Intelligent Transportation Systems (ITS) Programs' Project 84, I-Witness Black Box Recorder, San Diego, CA. Nov. 2001.
GE published its VCR User's Guide for Model VG4255 in 1995.
Glenn Oster, ‘Hindsight 20/20 v4.0 Software Installation’, 1 of 2, Jun. 20, 2003.
Glenn Oster, ‘HindSight 20/20 v4.0 Software Installation’, 2 of 2, Jun. 20, 2003.
Glenn Oster, ‘Illuminator Installation’, Oct. 3, 2004.
Hans Fantel, Video; Search Methods Make a Difference in Picking VCR's, NY Times, Aug. 13, 1989.
I/O Port Racing Supplies' website discloses using Traqmate's Data Acquisition with Video Overlay system in conjunction with professional driver coaching sessions (available at http://www.ioportracing.com/Merchant2/merchant.mvc?Screen=CTGY&Category_Code=coaching), printed from site on Jan. 11, 2012.
Inovate Motorsports, OT-1 16 Channel OBD-II Interface User Manual, Version 1.0, Nov. 28, 2007, pp. 3, 4, 21 & 27.
Interior Camera Data Sheet, Oct. 26, 2001.
International Search Report and Written Opinion issued in PCT/US07/68325 on Feb. 27, 2008.
International Search Report and Written Opinion issued in PCT/US07/68328 on Oct. 15, 2007.
International Search Report and Written Opinion issued in PCT/US07/68329 on Mar. 3, 2008.
International Search Report and Written Opinion issued in PCT/US07/68332 on Mar. 3, 2008.
International Search Report and Written Opinion issued in PCT/US07/68334 on Mar. 5, 2008.
International Search Report for PCT/US2006/47055, Mailed Mar. 20, 2008 (2 pages).
International Search Report issued in PCT/US2006/47042 mailed Feb. 25, 2008.
J. Gallagher, ‘Lancer Recommends Tech Tool’, Insurance and Technology Magazine, Feb. 2002.
Jean (DriveCam vendor), ‘DC Data Sheet’, Nov. 6, 2002.
Jean (DriveCam vendor), ‘DriveCam brochure’, Nov. 6, 2002.
Jean (DriveCam vendor), ‘Feedback Data Sheet’, Nov. 6, 2002.
Jean (DriveCam vendor), ‘Hindsight 20-20 Data Sheet’, Nov. 4, 2002.
Jessyca Wallace, ‘Analyzing and Processing DriveCam Recorded Events’, Oct. 6, 2003.
Jessyca Wallace, ‘Overview of the DriveCam Program’, Dec. 15, 2005.
Jessyca Wallace, ‘The DriveCam Driver Feedback System’, Apr. 6, 2004.
Joint Claim Construction Chart, U.S. Pat. No. 6,389,340, ‘Vehicle Data Recorder’ for Case No. 3:11-CV-00997-H-RBB, Document 43-1, filed Dec. 1, 2011, pp. 1-33.
Joint Claim Construction Chart in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 11-CV-0997-H (RBB), for the Southern District of California, Document 43, filed Dec. 1, 2011, pp. 1-2.
Joint Claim Construction Worksheet, U.S. Pat. No. 6,389,340, ‘Vehicle Data Reporter’ for Case No. 3:11-CV-00997-H-RBB, Document 44-1, filed Dec. 1, 2011, pp. 1-10.
Joint Claim Construction Worksheet, U.S. Pat. No. 6,389,340, “Vehicle Data Reporter” for Case No. 3:11-CV-00997-H-RBB, Document 44-1, filed Dec. 1, 2011, pp. 1-10.
Joint Claim Construction Worksheet in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997 H (RBB), for the Southern District of California, Document 44, filed Dec. 1, 2011, pp. 1-2.
Joint Motion for Leave to Supplement Disclosure of Asserted Claims and Preliminary Infringement Contentions in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-Cv-00997-H-RBB, Document 29, filed Oct. 12, 2011, pp. 1-7.
Julie Stevens, ‘DriveCam Services’, Nov. 15, 2004.
Julie Stevens, ‘Program Support Roll-Out & Monitoring’, Jul. 13, 2004.
Jung, Sang-Hack, et al., Egomotion Estimation in Monocular Infra-red Image Sequence for Night Vision Applications, IEEE Workshop on Applications of Computer Vision (WACV '07), Feb. 2007, pp. 8-8; Digital Object Identifier 10.1109/WACV.2007.20.
JVC Company of America, JVC Video Cassette Recorder HR-IP820U Instructions (1996).
Kamijo, S., et al., A Real-Time Traffic Monitoring System by Stochastic Model Combination, IEEE International Conference on Systems, Man and Cybernetics, 4:3275-3281 (2003).
Kamijo, S., et al., An Incident Detection System Based on Semantic Hierarchy, Proceedings of the 7th International IEEE Intelligent Transportation Systems Conference, Oct. 3-6, 2004, pp. 853-858; Digital Object Identifier 10.1109/ITSC.2004.1399015.
Karen, ‘Downloading Options to HindSight 20/20’, Aug. 6, 2002.
Karen, ‘Managers Guide to the DriveCam Driving Feedback System’, Jul. 30, 2002.
Kathy Latus (Latus Design), ‘Case Study—Cloud 9 Shuttle’, Sep. 23, 2005.
Kathy Latus (Latus Design), ‘Case Study—Lloyd Pest Control’, Jul. 19, 2005.
Kathy Latus (Latus Design), ‘Case Study—Time Warner Cable’, Sep. 23, 2005.
Ki, Yong-Kul, et al., A Traffic Accident Detection Model using Metadata Registry, Proceedings of the Fourth International Conference on Software Engineering Research, Management and Applications; Aug. 9-11, 2006 pp. 255-259 Digital Object Identifier 10.1109/SERA.2006.8.
Kitchin, Charles. “Understanding accelerometer scale factor and offset adjustments.” Analog Devices (1995).
Lin, Chin-Teng et al., EEG-based drowsiness estimation for safety driving using independent component analysis; IEEE Transactions on Circuits and Systems-I: Regular Papers, 52(12):2726-2738 (2005); Digital Object Identifier 10.1109/TCSI.2005.857555.
Lisa Mckenna, ‘A Fly on the Windshield?’, Pest Control Technology Magazine, Apr. 2003.
Miller, D.P., Evaluation of Vision Systems for Teleoperated Land Vehicles. Control Systems Magazine, IEEE, 8(3):37-41 (1988); Digital Identifier 10.1109/37.475.
Munder, S., et al., Pedestrian Detection and Tracking Using a Mixture of View-Based Shape-Texture Models, IEEE Transactions on Intelligent Transportation Systems, 9(2):333-343 (2008); Digital Identifier 10.1109/TITS.2008.922943.
Panasonic Corporation, Video Cassette Recorder (VCR) Operating Instructions for Models No. PV-V4020/PV-V4520.
Passenger Transportation Mode Brochure, May 2, 2005.
Patent Abstracts of Japan vol. 007, No. 180 (P-215), Aug. 9, 1983 (Aug. 9, 1983) & JP 58 085110 A (Mitsuhisa Ichikawa), May 21, 1983 (May 21, 1983).
Patent Abstracts of Japan vol. 011, No. 292 (E-543), Sep. 19, 1987 (Sep. 19, 1987) & JP 62 091092 A (OK Eng:KK), Apr. 25, 1987 (Apr. 25, 1987).
Patent Abstracts of Japan vol. 012, No. 001 (M-656), Jan. 6, 1988 (Jan. 6, 1988) & JP 62 166135 A (Fuji Electric Co Ltd), Jul. 22, 1987 (Jul. 22, 1987).
Patent Abstracts of Japan vol. 014, No. 222 (E-0926), May 10, 1990 (May 10, 1990) & JP 02 056197 A (Sanyo Electric Co Ltd), Feb. 26, 1990 (Feb. 26, 1990).
Patent Abstracts of Japan vol. 017, No. 039 (E-1311), Jan. 25, 1993 (Jan. 25, 1993) & JP 04 257189 A (Sony Corp), Sep. 11, 1992 (Sep. 11, 1992).
Patent Abstracts of Japan vol. 017, No. 521 (E-1435), Sep. 20, 1993 (Sep. 20, 1993) & JP 05 137144 A (Kyocera Corp), Jun. 1, 1993 (Jun. 1, 1993).
Patent Abstracts of Japan vol. 1996, No. 09, Sep. 30, 1996 (Sep. 30, 1996) & JP 08 124069 A (Toyota Motor Corp), May 17, 1996 (May 17, 1996).
Patent Abstracts of Japan vol. 1997, No. 10, Oct. 31, 1997 (Oct. 31, 1997) & JP 09 163357 A (Nippon Soken Inc), Jun. 20, 1997 (Jun. 20, 1997).
Patent Abstracts of Japan vol. 1998, No. 02, Jan. 30, 1998 (Jan. 30, 1998) & JP 09 272399 A (Nippon Soken Inc), Oct. 21, 1997 (Oct. 21, 1997).
Patent Abstracts of Japan vol. 1998, No. 8, Jun. 30, 1998 (Jun. 30, 1998) & JP 10 076880 A (Murakami Corp), Mar. 24, 1998 (Mar. 24, 1998).
PCT/US2010/022012, Invitation to Pay Additional Fees with Communication of Partial International Search, Jul. 21, 2010.
Peter G. Thurlow, Letter (including exhibits) Regarding Patent Owner's Response to Initial Office Action in Ex Parte Reexamination, Mar. 27, 2012.
Preliminary Claim Construction and Identification of Extrinsic Evidence of Defendant/Counterclaimant SmartDrive Systems, Inc. in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H (RBB), for the Southern District of California. Nov. 8, 2011.
Quinn Maughan, ‘DriveCam Enterprise Services’, Jan. 5, 2006.
Quinn Maughan, ‘DriveCam Managed Services’, Jan. 5, 2006.
Quinn Maughan, ‘DriveCam Standard Edition’, Jan. 5, 2006.
Quinn Maughan, ‘DriveCam Unit Installation’, Jul. 21, 2005.
Quinn Maughan, ‘Enterprise Services’, Apr. 17, 2006.
Quinn Maughan, ‘Enterprise Services’, Apr. 7, 2006.
Quinn Maughan, ‘Hindsight Installation Guide’, Sep. 29, 2005.
Quinn Maughan, ‘Hindsight Users Guide’, Jun. 7, 2005.
Ronnie Rittenberry, ‘Eyes on the Road’, Jul. 2004.
SmartDrives Systems, Inc's Production, SO14246-S014255, Nov. 16, 2011.
Supplement to DriveCam's Disclosure of Asserted Claims and Preliminary Infringement Contentions' in DriveCam, Inc. v. SmartDrive Systems, Inc., Case No. 3:11-CV-00997-H-RBB, for the Southern District of California. Oct. 14, 2011.
The DriveCam, Nov. 6, 2002.
The DriveCam, Nov. 8, 2002.
Traqmate GPS Data Acquisition's Traqmate Data Acquisition with Video Overlay system was used to create a video of a driving event on Oct. 2, 2005 (available at http://www.trackvision.net/phpBB2/viewtopic.php?t=51&sid=1184fbbcbe3be5c87ffa0f2ee6e2da76), printed from site on Jan. 11, 2012.
Trivinci Systems, LLC, Race-Keeper Systems User Guide, Jan. 2011, v1, 1.02, pp. 34 and 39.
U.S. Appl. No. 12/691,639, entitled ‘Driver Risk Assessment System and Method Employing Selectively Automatic Event Scoring’, filed Jan. 21, 2010.
U.S. Appl. No. 11/377,167, Final Office Action dated Nov. 8, 2013.
U.S. Appl. No. 11/377,157, filed Mar. 16, 2006 entitled, “Vehicle Event Recorder Systems and Networks Having Parallel Communications Links”.
U.S. Appl. No. 11/377,167, filed Mar. 16, 2006 entitled, “Vehicle Event Recorder Systems and Networks Having Integrated Cellular Wireless Communications Systems”.
USPTO Final Office Action for U.S. Appl. No. 11/297,669, mailed Nov. 7, 2011, 15 pages.
USPTO Final Office Action for U.S. Appl. No. 13/957,810, mailed Jun. 27, 2014, 24 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/296,906, mailed Apr. 2, 2009, 7 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/296,906, mailed Nov. 6, 2009, 9 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/297,669, mailed Apr. 28, 2011, 11 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/299,028, mailed Apr. 24, 2008, 9 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,164, mailed Nov. 19, 2007, 7 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,164, mailed Nov. 25, 2011, 9 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,164, mailed Sep. 11, 2008, 8 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/377,167, mailed Jun. 5, 2008, 11 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/800,876, mailed Dec. 1, 2010, 12 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 11/800,876, mailed Dec. 20, 2011, 8 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 12/096,591, mailed May 20, 2014, 19 pages.
USPTO Non-Final Office Action for U.S. Appl. No. 14/036,299, mailed Aug. 12, 2014.
USPTO Non-Final Office Action for U.S. Appl. No. 11/296,907, mailed Mar. 22, 2007 (17 pages).
USPTO Non-final Office Action mailed Aug. 27, 2009 during prosecution of U.S. Appl. No. 11/566,424.
USPTO Non-Final Office Action mailed Nov. 27, 2013 in U.S. Appl. No. 13/957,810, filed Aug. 2, 2013.
Veeraraghavan, H., et al., Computer Vision Algorithms for Intersection Monitoring, IEEE Transactions on Intelligent Transportation Systems, 4(2):78-89 (2003); Digital Object Identifier 10.1109/TITS.2003.821212.
Wijesoma, W.S., et al., Road Curb Tracking in an Urban Environment, Proceedings of the Sixth International Conference of Information Fusion, 1:261-268 (2003).
World News Tonight, CBC Television New Program discussing teen drivers using the DriveCam Program and DriveCam Technology, Oct. 10, 2005, on PC formatted CD-R, World News Tonight.wmv, 7.02 MB, Created Jan. 12, 2011.
Written Opinion issued in PCT/US07/68328 on Oct. 15, 2007.
Written Opinion of the International Searching Authority for PCT/US2006/47042, Mailed Feb. 25, 2008 (5 pages).
Written Opinion of the International Searching Authority for PCT/US2006/47055, Mailed Mar. 20, 2008 (5 pages).
Mortlock, “Automatic Train Control: Concept of System,” Jun. 28, 2010. Retrieved from http://www.hsr.ca.gov/docs/programs/eir_memos/Proj_Guidelines_TM3_3_1R00.pdf (64 pages).
PCT International Search Report and Written Opinion for PCT/US2015/066873, dated Feb. 19, 2016 (17 pages).
PCT International Search Report and Written Opinion for PCT/US2016/012757 dated Mar. 18, 2016 (7 pages).
European Search Report EP16150325.5 dated May 19, 2016 (13 pages).
Gary A. Rayner, U.S. Appl. No. 09/020,700, filed Feb. 9, 1998.
Gary A. Rayner, U.S. Appl. No. 09/405,857, filed Sep. 24, 1999.
Gary A. Rayner, U.S. Appl. No. 09/611,891, filed Jul. 7, 2000.
Gary A. Rayner, U.S. Appl. No. 09/669,449, filed Sep. 25, 2000.
Gary A. Rayner, U.S. Appl. No. 09/732,813, filed Dec. 11, 2000.
Charlie Gunderson, U.S. Appl. No. 11/382,222, filed May 8, 2006.
Charlie Gunderson, U.S. Appl. No. 11/382,239, filed May 9, 2006.
Charlie Gunderson, U.S. Appl. No. 11/382,325, filed May 9, 2006.
Charlie Gunderson, U.S. Appl. No. 11/382,328, filed May 9, 2006.
David Stanley, U.S. Appl. No. 11/465,765, filed Aug. 18, 2006.
Larry Richardson, U.S. Appl. No. 11/467,486, filed Aug. 25, 2006.
Craig Denson, U.S. Appl. No. 11/467,694, filed Aug. 28, 2006.
Carl Miller, U.S. Appl. No. 11/566,526, filed Dec. 4, 2006.
Jamie Etcheson, U.S. Appl. No. 11/566,424, filed Dec. 4, 2006.
Jamie Etcheson, U.S. Appl. No. 11/566,539, filed Dec. 4, 2006.
Bryan Cook, U.S. Appl. No. 12/359,787, filed Jan. 26, 2009.
Bryon Cook, U.S. Appl. No. 12/691,639, filed Jan. 21, 2010.
Bryon Cook, U.S. Appl. No. 12/793,362, filed Jun. 3, 2010.
Bryon Cook, U.S. Appl. No. 12/814,117, filed Jun. 11, 2010.
Charlie Gunderson, U.S. Appl. No. 13/234,103, filed Sep. 15, 2011.
Syrus C. Nemat-Nasser, U.S. Appl. No. 13/235,263, filed Sep. 16, 2011.
Roni Tamari, U.S. Appl. No. 13/271,417, filed Oct. 12, 2011.
Bryan Cook, U.S. Appl. No. 13/586,750, filed Aug. 15, 2012.
Craig Denson, U.S. Appl. No. 13/736,709, filed Jan. 8, 2013.
Bryon Cook, U.S. Appl. No. 13/923,130, filed Jun. 20, 2013.
DriveCam, Inc., U.S. Appl. No. 13/914,339, filed Jun. 10, 2013.
DriveCam, Inc., U.S. Appl. No. 14/027,038, filed Sep. 13, 2013.
Larry Richardson, U.S. Appl. No. 13/448,725, filed Apr. 17, 2012.
DriveCam, Inc., U.S. Appl. No. 14/034,296, filed Sep. 23, 2013.
Joshua Donald Botnen, U.S. Appl. No. 13/222,301, filed Aug. 31, 2011.
DriveCam Inc., U.S. Appl. No. 14/070,206, filed Nov. 1, 2013.
DriveCam, Inc., U.S. Appl. No. 90/011,951, filed Oct. 11, 2011.
DriveCam, Inc., U.S. Appl. No. 95/001,779, filed Oct. 11, 2011.
James Plante, U.S. Appl. No. 11/296,906, filed Dec. 8, 2005.
James Plante, U.S. Appl. No. 12/096,591, filed Oct. 3, 2008.
James Plante, U.S. Appl. No. 11/296,907, filed Dec. 8, 2005.
James Plante, U.S. Appl. No. 12/096,592, filed Oct. 3, 2008.
James Plante, U.S. Appl. No. 13/734,800, filed Jan. 4, 2013.
James Plante, U.S. Appl. No. 11/297,669, filed Dec. 8, 2005.
James Plante, U.S. Appl. No. 11/377,157, filed Mar. 16, 2006.
James Plante, U.S. Appl. No. 11/377,164, filed Mar. 16, 2006.
James Plante U.S. Appl. No. 11/377,167, filed Mar. 16, 2006.
James Plante, U.S. Appl. No. 11/593,659, filed Nov. 7, 2006.
James Plante, U.S. Appl. No. 13/568,151, filed Aug. 7, 2012.
James Plante, U.S. Appl. No. 13/570,283, filed Aug. 9, 2012.
James Plante, U.S. Appl. No. 11/593,682, filed Nov. 7, 2006.
James Plante, U.S. Appl. No. 11/637,754, filed Dec. 13, 2006.
James Plante, U.S. Appl. No. 11/800,876, filed May 8, 2007.
James Plante, U.S. Appl. No. 13/539,312, filed Jun. 30, 2012.
James Plante, U.S. Appl. No. 11/298,069, filed Dec. 9, 2005.
James Plante, U.S. Appl. No. 13/957,810, filed Aug. 2, 2013.
James Plante, U.S. Appl. No. 11/299,028, filed Dec. 9, 2005.
Jason Palmer, U.S. Appl. No. 14/076,511, filed Nov. 11, 2013.
Jason Palmer, U.S. Appl. No. 14/055,833, filed Oct. 16, 2013.
Jason Palmer, U.S. Appl. No. 14/186,416, filed Feb. 21, 2014.
James Plante, U.S. Appl. No. 14/036,299, filed Sep. 25, 2013.
James Plante, U.S. Appl. No. 14/177,047, filed Feb. 10, 2014.
DriveCam, Inc., U.S. Appl. No. 95/001,781, filed Oct. 11, 2011.
DriveCam, Inc., U.S. Appl. No. 95/001802, filed Nov. 3, 2011.
Related Publications (1)
Number Date Country
20160114820 A1 Apr 2016 US