Methods of facilitating emergency assistance

Information

  • Patent Grant
  • Patent Number
    11,069,221
  • Date Filed
    Friday, November 15, 2019
  • Date Issued
    Tuesday, July 20, 2021
Abstract
A method of responding to a vehicle accident includes receiving data indicating that a vehicle is involved in an accident. The method further includes transmitting a communication to a mobile device of a user in response to receiving the data. Still further, the method includes displaying the communication via a mobile device application on the mobile device, the communication prompting the user to provide responses to one or more questions regarding the accident. The method further includes determining a likely severity of the accident based on the received data, receiving an indication of a response to the one or more questions from the user, and, based on the determination of the likely severity of the accident and the received indication of the response, prompting the user with an emergency assistance recommendation. In addition, the method further includes performing one or more assessments based on the determination of the likely severity of the accident, the one or more assessments including determining damage to the vehicle, repairs needed for the vehicle, or fault of the user for the accident.
Description
FIELD

The present embodiments relate generally to telematics data and/or insurance policies. More particularly, the present embodiments relate to performing certain actions, and/or adjusting insurance policies, based upon telematics and/or other data indicative of risk or insured behavior.


BACKGROUND

At times, insurance providers are able to provide helpful information to customers who have recently been in an accident. When a customer calls a claims associate to report an accident and initiate a claim, for example, the associate may be able to offer suggestions with respect to the next steps that the customer should take. Often, however, customers do not call their insurance providers promptly after an accident, and/or it takes a significant amount of time for the associate to locate and relay useful information. Moreover, in an emergency situation (e.g., a serious car accident), a claims associate may be very limited in his or her ability to provide assistance. In such a situation, the customer may be unable to contact a claims associate and, more importantly, may be unable to contact emergency services/responders.


The present embodiments may overcome these and/or other deficiencies.


BRIEF SUMMARY

The present embodiments disclose systems and methods that may relate to the intersection of telematics and insurance. In some embodiments, for example, telematics and/or other data may be collected and used to determine a likely severity of a vehicle accident. The data may be gathered from one or more sources, such as mobile devices (e.g., smart phones, smart glasses, smart watches, smart wearable devices, smart contact lenses, and/or other devices capable of wireless communication); smart vehicles; smart vehicle or smart home mounted sensors; third party sensors or sources of data (e.g., other vehicles, public transportation systems, government entities, and/or the Internet); and/or other sources of information. Based upon the likely severity, a communication related to emergency assistance or an emergency assistance request may be generated. The communication may be sent to a driver involved in the accident (e.g., for approval, rejection or modification prior to being sent to an emergency service provider), and/or sent directly to an emergency service provider, for example.
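The severity determination described above can be illustrated with a minimal sketch. The field names (`peak_deceleration_g`, `airbag_deployed`, `rollover_detected`), thresholds, and severity labels below are illustrative assumptions, not taken from the patent, which leaves the analysis unspecified.

```python
# Hypothetical sketch: classifying a likely accident severity from a few
# telematics signals. All field names and thresholds are assumptions.

def estimate_severity(telematics: dict) -> str:
    """Classify a likely accident severity from simple telematics signals."""
    g_force = telematics.get("peak_deceleration_g", 0.0)
    airbag = telematics.get("airbag_deployed", False)
    rollover = telematics.get("rollover_detected", False)

    if rollover or g_force >= 4.0:
        return "severe"
    if airbag or g_force >= 2.0:
        return "moderate"
    return "minor"

# Example: a high-g impact is classified as severe.
severity = estimate_severity({"peak_deceleration_g": 4.5})
```

A real implementation would likely weigh many more of the signals listed above (vehicle speed, external conditions, V2V data, and so on), but the structure of mapping collected data to a severity estimate is the same.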


In one aspect, a computer-implemented method of loss mitigation may include (1) collecting, by one or more remote servers associated with an insurance provider, accident data associated with a vehicle accident involving a driver. The accident data may include vehicle telematics data, and/or the driver may be associated with an insurance policy issued by the insurance provider. The method may also include (2) analyzing, by the one or more remote servers, the accident data; (3) determining, by the one or more remote servers and based upon the analysis of the accident data, a likely severity of the vehicle accident; (4) generating, by the one or more remote servers and based upon the determined likely severity of the vehicle accident, a communication related to emergency assistance or an emergency assistance recommendation; (5) transmitting, via wireless communication, the communication related to the emergency assistance or emergency assistance recommendation from the one or more remote servers to a mobile device associated with the driver; (6) receiving, at the one or more remote servers, a wireless communication from the driver indicating approval or modification of the emergency assistance or emergency assistance recommendation; and/or (7) notifying, via a communication sent from the one or more remote servers, a third party of requested emergency assistance in accordance with the emergency assistance or emergency assistance recommendation as approved or modified by the driver. The method may include additional, fewer, or alternate actions, including those discussed elsewhere herein.
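The seven steps of this aspect can be sketched as a single server-side flow. The function names, message shapes, and approval protocol below are illustrative assumptions; the transport (wireless communication with the mobile device) is abstracted into callbacks.

```python
# Hypothetical sketch of the seven-step loss-mitigation flow: collect and
# analyze accident data (1)-(3), generate a recommendation (4), transmit it
# to the driver and receive approval or modification (5)-(6), then notify a
# third party (7). Names and thresholds are assumptions.

def run_loss_mitigation(accident_data, send_to_driver, notify_third_party):
    """Walk accident data through analysis, driver approval, and dispatch."""
    severity = "severe" if accident_data.get("impact_g", 0) > 3.0 else "minor"
    recommendation = {
        "severity": severity,
        "action": "dispatch_ambulance" if severity == "severe" else "roadside_assist",
    }
    reply = send_to_driver(recommendation)   # transmit; reply carries approval
    if reply.get("approved"):
        # Use the driver's modified request if one was supplied.
        notify_third_party(reply.get("modified", recommendation))
        return "dispatched"
    return "cancelled"
```

In this sketch `send_to_driver` and `notify_third_party` stand in for the wireless links to the mobile device and to the emergency service provider, respectively.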


In another aspect, a computer-implemented method of loss mitigation may include (1) collecting, by one or more remote servers associated with an insurance provider, accident data associated with a vehicle accident involving a driver. The accident data may include vehicle telematics data, and/or the driver may be associated with an insurance policy issued by the insurance provider. The method may also include (2) analyzing, by the one or more remote servers, the accident data; (3) determining, by the one or more remote servers and based upon the analysis of the accident data, a likely severity of the vehicle accident; (4) generating, by the one or more remote servers and based upon the determined likely severity of the vehicle accident, a communication related to emergency assistance or an emergency assistance recommendation; and/or (5) transmitting the communication related to the emergency assistance or emergency assistance recommendation from the one or more remote servers to a third party to facilitate a prompt and appropriate emergency responder response to the vehicle accident. The method may include additional, fewer, or alternate actions, including those discussed elsewhere herein.


In another aspect, a system for facilitating loss mitigation may include one or more processors and one or more memories. The one or more memories may store instructions that, when executed by the one or more processors, cause the one or more processors to (1) collect accident data associated with a vehicle accident involving a driver. The accident data may include vehicle telematics data, and/or the driver may be associated with an insurance policy issued by an insurance provider. The instructions may also cause the one or more processors to (2) analyze the accident data; (3) determine, based upon the analysis of the accident data, a likely severity of the vehicle accident; (4) generate, based upon the determined likely severity of the vehicle accident, a communication related to emergency assistance or an emergency assistance recommendation; (5) cause the communication related to the emergency assistance or emergency assistance recommendation to be transmitted, via wireless communication, to a mobile device associated with the driver; (6) receive a wireless communication from the driver indicating approval or modification of the emergency assistance or emergency assistance recommendation; and/or (7) cause a third party to be notified of requested emergency assistance in accordance with the emergency assistance or emergency assistance recommendation as approved or modified by the driver.


Advantages will become more apparent to those skilled in the art from the following description of the preferred embodiments which have been shown and described by way of illustration. As will be realized, the present embodiments may be capable of other and different embodiments, and their details are capable of modification in various respects. Accordingly, the drawings and description are to be regarded as illustrative in nature and not as restrictive.





BRIEF DESCRIPTION OF THE DRAWINGS

There are shown in the drawings arrangements which are presently discussed. It is understood, however, that the present embodiments are not limited to the precise arrangements and instrumentalities shown.



FIG. 1 illustrates an exemplary computer system on which the techniques described herein may be implemented, according to one embodiment.



FIG. 2 illustrates an exemplary mobile device or smart vehicle controller that may collect, receive, generate and/or send telematics and/or other data for purposes of the techniques described herein, according to one embodiment.



FIG. 3 illustrates an exemplary computer-implemented method of loss mitigation associated with an insured event, according to one embodiment.



FIG. 4 illustrates an exemplary computer-implemented method of providing intelligent routing to reduce risk and/or the likelihood of an insured event occurring, according to one embodiment.



FIG. 5 illustrates an exemplary computer-implemented method of theft prevention and/or mitigation, according to one embodiment.





DETAILED DESCRIPTION

The present embodiments may relate to, inter alia, collecting data, including telematics and/or other data, and analyzing the data (e.g., by an insurance provider server or processor) to provide insurance-related benefits to insured individuals, and/or to apply the insurance-related benefits to insurance policies or premiums of insured individuals. For example, the insurance-related benefits may include risk or loss mitigation and/or prevention, and/or may include theft protection, mitigation, and/or avoidance. The insurance-related benefits may also, or instead, include other products and/or services, such as intelligent vehicle routing in real-time, for example. The present embodiments may prevent losses/injury/damage to persons and/or property, and/or reward an insured for exhibiting risk-averse behavior (e.g., in the form of lower insurance premiums or rates, or additional insurance discounts, points, and/or rewards).


I. Exemplary Telematics Data System


FIG. 1 illustrates a block diagram of an exemplary telematics system 1 on which the exemplary methods described herein may be implemented. The high-level architecture includes both hardware and software applications, as well as various data communications channels for communicating data between the various hardware and software components. The telematics system 1 may be roughly divided into front-end components 2 and back-end components 4.


The front-end components 2 may obtain information regarding a vehicle 8 (e.g., a car, truck, motorcycle, etc.) and/or the surrounding environment. Information regarding the surrounding environment may be obtained by one or more other vehicles 6, public transportation system components 22 (e.g., a train, a bus, a trolley, a ferry, etc.), infrastructure components 26 (e.g., a bridge, a stoplight, a tunnel, a rail crossing, etc.), smart homes 28 having smart home controllers 29, and/or other components communicatively connected to a network 30. Information regarding the vehicle 8 may be obtained by a mobile device 10 (e.g., a smart phone, a tablet computer, a special purpose computing device, etc.) and/or a smart vehicle controller 14 (e.g., an on-board computer, a vehicle diagnostic system, a vehicle control system or sub-system, etc.), which may be communicatively connected to each other and/or the network 30.


In some embodiments, telematics data may be generated by and/or received from sensors 20 associated with the vehicle 8. Such telematics data from the sensors 20 may be received by the mobile device 10 and/or the smart vehicle controller 14, in some embodiments. Other, external sensors 24 (e.g., sensors associated with one or more other vehicles 6, public transportation system components 22, infrastructure components 26, and/or smart homes 28) may provide further data regarding the vehicle 8 and/or its environment, in some embodiments. For example, the external sensors 24 may obtain information pertaining to other transportation components or systems within the environment of the vehicle 8, and/or information pertaining to other aspects of that environment. The sensors 20 and the external sensors 24 are described further below, according to some embodiments.


In some embodiments, the mobile device 10 and/or the smart vehicle controller 14 may process the sensor data from sensors 20, and/or other of the front-end components 2 may process the sensor data from external sensors 24. The processed data (and/or information derived therefrom) may then be communicated to the back-end components 4 via the network 30. In other embodiments, the front-end components 2 may communicate the raw sensor data from sensors 20 and/or external sensors 24, and/or other telematics data, to the back-end components 4 for processing. In thin-client embodiments, for example, the mobile device 10 and/or the smart vehicle controller 14 may act as a pass-through communication node for communication with the back-end components 4, with minimal or no processing performed by the mobile device 10 and/or the smart vehicle controller 14. In other embodiments, the mobile device 10 and/or the smart vehicle controller 14 may perform substantial processing of received sensor, telematics, or other data. Summary information, processed data, and/or unprocessed data may be communicated to the back-end components 4 via the network 30.
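The thin-client versus on-device-processing distinction above amounts to a choice of what payload the device uploads. A minimal sketch, with assumed field names, might look like this:

```python
# Hypothetical sketch of the upload choice: a thin client forwards raw
# sensor readings for back-end processing, while a thicker client
# summarizes locally and uploads only derived values. Names are assumptions.

def package_for_upload(raw_readings, thin_client=True):
    """Return the payload a device would send to the back-end components."""
    if thin_client:
        # Pass-through node: forward raw data unmodified.
        return {"type": "raw", "readings": raw_readings}
    # Substantial on-device processing: upload summary information only.
    speeds = [r["speed_mph"] for r in raw_readings]
    return {
        "type": "summary",
        "max_speed_mph": max(speeds),
        "avg_speed_mph": sum(speeds) / len(speeds),
    }
```

The trade-off is the usual one: the thin client minimizes device-side computation at the cost of bandwidth, while the summary path does the reverse.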


The mobile device 10 may be a general-use personal computer, cellular phone, smart phone, tablet computer, or a dedicated vehicle use monitoring device. In some embodiments, the mobile device 10 may include a wearable device such as a smart watch, smart glasses, wearable smart technology, or a pager. Although only one mobile device 10 is illustrated, it should be understood that a plurality of mobile devices may be used in some embodiments. The smart vehicle controller 14 may be a general-use on-board computer capable of performing many functions relating to vehicle operation, an on-board computer system or sub-system, or a dedicated computer for monitoring vehicle operation and/or generating telematics data. Further, the smart vehicle controller 14 may be installed by the manufacturer of the vehicle 8 or as an aftermarket modification or addition to the vehicle 8. Either or both of the mobile device 10 and the smart vehicle controller 14 may communicate with the network 30 over link 12 and link 18, respectively. Additionally, the mobile device 10 and smart vehicle controller 14 may communicate with one another directly over link 16. In some embodiments, the mobile device 10 and/or the smart vehicle controller 14 may communicate with other of the front-end components 2, such as the vehicles 6, public transit system components 22, infrastructure components 26, and/or smart homes 28, either directly or indirectly (e.g., via the network 30).


The one or more sensors 20 referenced above may be removably or fixedly disposed within (and/or on the exterior of) the vehicle 8, within the mobile device 10, and/or within the smart vehicle controller 14, for example. The sensors 20 may include any one or more of various different sensor types, such as an ignition sensor, an odometer, a system clock, a speedometer, a tachometer, an accelerometer, a gyroscope, a compass, a geolocation unit (e.g., a GPS unit), a camera and/or video camera, a distance sensor (e.g., radar, LIDAR, etc.), and/or any other sensor or component capable of generating or receiving data regarding the vehicle 8 and/or the environment in which the vehicle 8 is located.


Some of the sensors 20 (e.g., radar, LIDAR, ultrasonic, infrared, or camera units) may actively or passively scan the vehicle environment for objects (e.g., other vehicles, buildings, pedestrians, etc.), traffic control elements (e.g., lane markings, signs, signals, etc.), external conditions (e.g., weather conditions, traffic conditions, road conditions, etc.), and/or other physical characteristics of the environment. Other sensors of sensors 20 (e.g., GPS, accelerometer, or tachometer units) may provide operational and/or other data for determining the location and/or movement of the vehicle 8. Still other sensors of sensors 20 may be directed to the interior or passenger compartment of the vehicle 8, such as cameras, microphones, pressure sensors, thermometers, or similar sensors to monitor the vehicle operator and/or passengers within the vehicle 8.


The external sensors 24 may be disposed on or within other devices or components within the vehicle's environment (e.g., other vehicles 6, infrastructure components 26, etc.), and may include any of the types of sensors listed above. For example, the external sensors 24 may include sensors that are the same as or similar to sensors 20, but disposed on or within some of the vehicles 6 rather than the vehicle 8.


To send and receive information, each of the sensors 20 and/or external sensors 24 may include a transmitter and/or a receiver designed to operate according to predetermined specifications, such as the dedicated short-range communication (DSRC) channel, wireless telephony, Wi-Fi, or other existing or later-developed communications protocols. As used herein, the terms “sensor” or “sensors” may refer to the sensors 20 and/or external sensors 24.


The other vehicles 6, public transportation system components 22, infrastructure components 26, and/or smart homes 28 may be referred to herein as “external” data sources. The other vehicles 6 may include any other vehicles, including smart vehicles, vehicles with telematics-capable mobile devices, autonomous vehicles, and/or other vehicles communicatively connected to the network 30 via links 32.


The public transportation system components 22 may include bus, train, ferry, ship, airline, and/or other public transportation system components. Such components may include vehicles, tracks, switches, access points (e.g., turnstiles, entry gates, ticket counters, etc.), and/or payment locations (e.g., ticket windows, fare card vending machines, electronic payment devices operated by conductors or passengers, etc.), for example. The public transportation system components 22 may further be communicatively connected to the network 30 via a link 34, in some embodiments.


The infrastructure components 26 may include smart infrastructure or devices (e.g., sensors, transmitters, etc.) disposed within or communicatively connected to transportation or other infrastructure, such as roads, bridges, viaducts, terminals, stations, fueling stations, traffic control devices (e.g., traffic lights, toll booths, entry ramp traffic regulators, crossing gates, speed radar, cameras, etc.), bicycle docks, footpaths, or other infrastructure system components. In some embodiments, the infrastructure components 26 may be communicatively connected to the network 30 via a link (not shown in FIG. 1).


The smart homes 28 may include dwellings or other buildings that generate or collect data regarding their condition, occupancy, proximity to a mobile device 10 or vehicle 8, and/or other information. The smart homes 28 may include smart home controllers 29 that monitor the local environment of the smart home, which may include sensors (e.g., smoke detectors, radon detectors, door sensors, window sensors, motion sensors, cameras, etc.). In some embodiments, the smart home controller 29 may include or be communicatively connected to a security system controller for monitoring access and activity within the environment. The smart home 28 may further be communicatively connected to the network 30 via a link 36, in some embodiments.


The external data sources may collect data regarding the vehicle 8, a vehicle operator, a user of an insurance program, and/or an insured of an insurance policy. Additionally, or alternatively, the other vehicles 6, the public transportation system components 22, the infrastructure components 26, and/or the smart homes 28 may collect such data, and provide that data to the mobile device 10 and/or the smart vehicle controller 14 via links not shown in FIG. 1.


In some embodiments, the front-end components 2 communicate with the back-end components 4 via the network 30. The network 30 may include a proprietary network, a secure public internet, a virtual private network and/or one or more other types of networks, such as dedicated access lines, plain ordinary telephone lines, satellite links, cellular data networks, or combinations thereof. In embodiments where the network 30 comprises the Internet, data communications may take place over the network 30 via an Internet communication protocol.


The back-end components 4 may use a remote server 40 to receive data from the front-end components 2, determine characteristics of vehicle use, determine risk levels, modify insurance policies, and/or perform other processing functions in accordance with any of the methods described herein. In some embodiments, the server 40 may be associated with an insurance provider, either directly or indirectly. The server 40 may include one or more computer processors adapted and configured to execute various software applications and components of the telematics system 1.


The server 40 may further include a database 46, which may be adapted to store data related to the operation of the vehicle 8 and/or other information. As used herein, the term “database” may refer to a single database or other structured data storage, or to a collection of two or more different databases or structured data storage components. Additionally, the server 40 may be communicatively coupled via the network 30 to one or more data sources, which may include an accident database 42 and/or a third party database 44. The accident database 42 and/or third party database 44 may be communicatively connected to the network via a communication link 38. The accident database 42 and/or the third party database 44 may be operated or maintained by third parties, such as commercial vendors, governmental entities, industry associations, nonprofit organizations, or others.


The data stored in the database 46 might include, for example, dates and times of vehicle use, duration of vehicle use, speed of the vehicle 8, RPM or other tachometer readings of the vehicle 8, lateral and longitudinal acceleration of the vehicle 8, incidents or near-collisions of the vehicle 8, communications between the vehicle 8 and external sources (e.g., other vehicles 6, public transportation system components 22, infrastructure components 26, smart homes 28, and/or external information sources communicating through the network 30), environmental conditions of vehicle operation (e.g., weather, traffic, road condition, etc.), errors or failures of vehicle features, and/or other data relating to use of the vehicle 8 and/or the vehicle operator. Prior to storage in the database 46, some of the data may have been uploaded to the server 40 via the network 30 from the mobile device 10 and/or the smart vehicle controller 14. Additionally, or alternatively, some of the data may have been obtained from additional or external data sources via the network 30. Additionally, or alternatively, some of the data may have been generated by the server 40. The server 40 may store data in the database 46 and/or may access data stored in the database 46 when executing various functions and tasks associated with the methods described herein.
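One record of the kind database 46 might hold can be sketched with a few of the fields listed above. The field names, types, and units below are illustrative assumptions; the patent does not prescribe a schema.

```python
# Hypothetical sketch of a single vehicle-use record for database 46,
# covering a subset of the data listed above. Field names and units
# are assumptions.

from dataclasses import dataclass, field

@dataclass
class VehicleUseRecord:
    start_time: str            # date and time of vehicle use (ISO 8601)
    duration_min: float        # duration of vehicle use, in minutes
    max_speed_mph: float       # peak speed of the vehicle 8
    max_rpm: int               # peak tachometer reading
    lateral_g: float           # peak lateral acceleration
    longitudinal_g: float      # peak longitudinal acceleration
    incidents: list = field(default_factory=list)  # incidents/near-collisions

# Example record for one trip, with a hard-braking incident appended.
record = VehicleUseRecord("2021-07-20T08:15:00", 42.0, 61.0, 3200, 0.4, 0.9)
record.incidents.append("hard_braking")
```

Note the `default_factory` for the `incidents` list: each record gets its own list rather than sharing one mutable default.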


The server 40 may include a controller 55 that is operatively connected to the database 46 via a link 56. It should be noted that, while not shown in FIG. 1, one or more additional databases may be linked to the controller 55 in a known manner. For example, separate databases may be used for sensor data, vehicle insurance policy information, and vehicle use information. The controller 55 may include a program memory 60, a processor 62 (which may be called a microcontroller or a microprocessor), a random-access memory (RAM) 64, and an input/output (I/O) circuit 66, all of which may be interconnected via an address/data bus 65. It should be appreciated that although only one microprocessor 62 is shown, the controller 55 may include multiple microprocessors 62. Similarly, the memory of the controller 55 may include multiple RAMs 64 and multiple program memories 60. Although the I/O circuit 66 is shown as a single block, it should be appreciated that the I/O circuit 66 may include a number of different types of I/O circuits. The RAM 64 and program memories 60 may be implemented as semiconductor memories, magnetically readable memories, or optically readable memories, for example. The controller 55 may also be operatively connected to the network 30 via a link 35.


The server 40 may further include a number of software applications stored in a program memory 60. The various software applications on the server 40 may include specific programs, routines, or scripts for performing processing functions associated with the methods described herein. Additionally, or alternatively, the various software applications on the server 40 may include general-purpose software applications for data processing, database management, data analysis, network communication, web server operation, or other functions described herein or typically performed by a server. The various software applications may be executed on the same computer processor or on different computer processors. Additionally, or alternatively, the software applications may interact with various hardware modules that may be installed within or connected to the server 40. Such modules may implement part or all of the various exemplary methods discussed herein or other related embodiments.


In some embodiments, the server 40 may be a remote server associated with or operated by or on behalf of an insurance provider. The server 40 may be configured to receive, collect, and/or analyze telematics and/or other data in accordance with any of the methods described herein. The server 40 may be configured for one-way or two-way wired or wireless communication via the network 30 with a number of telematics and/or other data sources, including the accident database 42, the third party database 44, the database 46 and/or the front-end components 2. For example, the server 40 may be in wireless communication with mobile device 10; insured smart vehicles 8; smart vehicles of other motorists 6; smart homes 28; present or past accident database 42; third party database 44 operated by one or more government entities and/or others; public transportation system components 22 and/or databases associated therewith; smart infrastructure components 26; and/or the Internet. The server 40 may be in wired or wireless communications with other sources of data, including those discussed elsewhere herein.


Although the telematics system 1 is shown in FIG. 1 to include one vehicle 8, one mobile device 10, one smart vehicle controller 14, one other vehicle 6, one public transportation system component 22, one infrastructure component 26, one smart home 28, and one server 40, it should be understood that different numbers of each may be utilized. For example, the system 1 may include a plurality of servers 40 and hundreds or thousands of mobile devices 10 and/or smart vehicle controllers 14, all of which may be interconnected via the network 30. Furthermore, the database storage or processing performed by the server 40 may be distributed among a plurality of servers in an arrangement known as “cloud computing.” This configuration may provide various advantages, such as enabling near real-time uploads and downloads of information as well as periodic uploads and downloads of information. This may in turn support a thin-client embodiment of the mobile device 10 or smart vehicle controller 14 discussed herein.



FIG. 2 illustrates a block diagram of an exemplary mobile device 10 and/or smart vehicle controller 14. The mobile device 10 and/or smart vehicle controller 14 may include a processor 72, display 74, sensor 76, memory 78, power supply 80, wireless radio frequency transceiver 82, clock 84, microphone and/or speaker 86, and/or camera or video camera 88. In other embodiments, the mobile device and/or smart vehicle controller may include additional, fewer, and/or alternate components.


The sensor 76 may be able to record audio or visual information. If FIG. 2 corresponds to the mobile device 10, for example, the sensor 76 may be a camera integrated within the mobile device 10. The sensor 76 may alternatively be configured to sense speed, acceleration, directional, fluid, water, moisture, temperature, fire, smoke, wind, rain, snow, hail, motion, and/or other type of condition or parameter, and/or may include a gyro, compass, accelerometer, or any other type of sensor described herein (e.g., any of the sensors 20 described above in connection with FIG. 1). Generally, the sensor 76 may be any type of sensor that is currently existing or hereafter developed and is capable of providing information regarding the vehicle 8, the environment of the vehicle 8, and/or a person.


The memory 78 may include software applications that control the mobile device 10 and/or smart vehicle controller 14, and/or control the display 74 configured for accepting user input. The memory 78 may include instructions for controlling or directing the operation of vehicle equipment that may prevent, detect, and/or mitigate vehicle damage. The memory 78 may further include instructions for controlling a wireless or wired network of a smart vehicle, and/or interacting with mobile device 10 and remote server 40 (e.g., via the network 30).


The power supply 80 may be a battery or dedicated energy generator that powers the mobile device 10 and/or smart vehicle controller 14. The power supply 80 may harvest energy from the vehicle environment and be partially or completely energy self-sufficient, for example.


The transceiver 82 may be configured for wireless communication with sensors 20 located about the vehicle 8, other vehicles 6, other mobile devices similar to mobile device 10, and/or other smart vehicle controllers similar to smart vehicle controller 14. Additionally, or alternatively, the transceiver 82 may be configured for wireless communication with the server 40, which may be remotely located at an insurance provider location.


The clock 84 may be used to time-stamp the date and time that information is gathered or sensed by various sensors. For example, the clock 84 may record the time and date that photographs are taken by the camera 88, video is captured by the camera 88, and/or other data is received by the mobile device 10 and/or smart vehicle controller 14.
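The role of clock 84 above is simply to stamp each reading with the date and time it was gathered. A minimal sketch, with assumed names and an injectable clock for testability, might be:

```python
# Hypothetical sketch of clock 84's time-stamping role: attach a capture
# timestamp to each sensor reading. Names are assumptions; `clock` is
# injectable so a fixed time source can stand in for the device clock.

from datetime import datetime, timezone

def stamp_reading(sensor_name, value, clock=None):
    """Attach the capture date and time to a sensor reading."""
    now = clock() if clock is not None else datetime.now(timezone.utc)
    return {"sensor": sensor_name, "value": value, "captured_at": now.isoformat()}

# Example: stamping a photograph taken by camera 88.
reading = stamp_reading("camera_88", "photo_0001.jpg")
```

The same stamping would apply uniformly to photographs, video, audio, and any other data received by the mobile device 10 and/or smart vehicle controller 14.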


The microphone and speaker 86 may be configured for recognizing voice or audio input and/or commands. The clock 84 may record the time and date that various sounds are collected by the microphone and speaker 86, such as sounds of windows breaking, air bags deploying, tires skidding, conversations or voices of passengers, music within the vehicle 8, rain or wind noise, and/or other sound heard within or outside of the vehicle 8.


The present embodiments may be implemented without changes or extensions to existing communications standards. The smart vehicle controller 14 may also include a relay, node, access point, Wi-Fi AP (Access Point), local node, pico-node, or relay node, and/or the mobile device 10 may be capable of RF (Radio Frequency) communication, for example. The mobile device 10 and/or smart vehicle controller 14 may include Wi-Fi, Bluetooth, GSM (Global System for Mobile communications), LTE (Long Term Evolution), CDMA (Code Division Multiple Access), UMTS (Universal Mobile Telecommunications System), and/or other types of components and functionality.


II. Telematics Data

Telematics data, as used herein, may include conventional telematics data and/or other types of data that have not conventionally been viewed as "telematics data." The telematics data may be generated by, and/or collected or received from, various sources. For example, the data may include, indicate, and/or relate to vehicle (and/or mobile device) speed; acceleration; braking; deceleration; turning; time; GPS (Global Positioning System) or GPS-derived location, speed, acceleration, or braking information; vehicle and/or vehicle equipment operation; external conditions (e.g., road, weather, traffic, and/or construction conditions); other vehicles or drivers in the vicinity of an accident; vehicle-to-vehicle (V2V) communications; vehicle-to-infrastructure communications; and/or image and/or audio information of the vehicle and/or insured driver before, during, and/or after an accident. The data may include other types of data, including those discussed elsewhere herein. The data may be collected via wired or wireless communication.
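The heterogeneous data types above might be represented as simple timestamped records that a remote server aggregates by source. The following is a minimal illustrative sketch; the field names and the `aggregate` helper are hypothetical, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple

@dataclass
class TelematicsRecord:
    """One timestamped sample of telematics or other data (illustrative)."""
    timestamp: float                          # UNIX epoch seconds
    speed_mph: Optional[float] = None
    acceleration_g: Optional[float] = None
    braking: Optional[bool] = None
    gps: Optional[Tuple[float, float]] = None # (latitude, longitude)
    source: str = "mobile_device"             # e.g. "smart_vehicle", "infrastructure"

def aggregate(records: List[TelematicsRecord]) -> Dict[str, List[TelematicsRecord]]:
    """Group records by source, as a server building its database might."""
    by_source: Dict[str, List[TelematicsRecord]] = {}
    for r in records:
        by_source.setdefault(r.source, []).append(r)
    return by_source
```

A server-side database would of course persist and index such records; the grouping step simply illustrates the "aggregated from different sources or feeds" idea.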


The data may be generated by mobile devices (smart phones, cell phones, laptops, tablets, phablets, PDAs (Personal Digital Assistants), computers, smart watches, pagers, hand-held mobile or portable computing devices, smart glasses, smart electronic devices, wearable devices, smart contact lenses, and/or other computing devices); smart vehicles; dash or vehicle mounted systems or original telematics devices; public transportation systems; smart street signs or traffic lights; smart infrastructure, roads, or highway systems (including smart intersections, exit ramps, and/or toll booths); smart trains, buses, or planes (including those equipped with Wi-Fi or hotspot functionality); smart train or bus stations; internet sites; aerial, drone, or satellite images; third party systems or data; nodes, relays, and/or other devices capable of wireless RF (Radio Frequency) communications; and/or other devices or systems that capture image, audio, or other data and/or are configured for wired or wireless communication.


In some embodiments, the data collected may also derive from police or fire departments, hospitals, and/or emergency responder communications; police reports; municipality information; automated Freedom of Information Act requests; and/or other data collected from government agencies and officials. The data from different sources or feeds may be aggregated.


The data generated may be transmitted, via wired or wireless communication, to a remote server, such as a remote server and/or other processor(s) associated with an insurance provider. The remote server and/or associated processors may build a database of the telematics and/or other data, and/or otherwise store the data collected.


The remote server and/or associated processors may analyze the data collected and then perform certain actions and/or issue tailored communications based upon the data, including the insurance-related actions or communications discussed elsewhere herein. The automatic gathering and collecting of data from several sources by the insurance provider, such as via wired or wireless communication, may lead to expedited insurance-related activity, including the automatic identification of insured events, and/or the automatic or semi-automatic processing or adjusting of insurance claims.


In one embodiment, telematics data may be collected by a mobile device (e.g., smart phone) application. An application that collects telematics data may ask an insured for permission to collect and send data about driver behavior and/or vehicle usage to a remote server associated with an insurance provider. In return, the insurance provider may provide incentives to the insured, such as lower premiums or rates, or discounts. The application for the mobile device may be downloadable from the internet.


III. Pre-Generated Requests for Assistance

Gathered telematics and/or other data (e.g., any type or types of telematics and/or other data described above in Section I and/or Section II) may facilitate determining the severity of (i) an accident; (ii) damage to a vehicle; and/or (iii) the injuries to the persons involved. The data gathered, such as data gathered after the accident, may facilitate determining what vehicle systems are broken or damaged, and/or are in need of minor or substantial repairs. The data gathered may indicate how much vehicle damage has occurred, and whether or not emergency services may be necessary and/or should be called or otherwise contacted.


The telematics and/or other data may also be used to (a) identify a first notice of loss, which in turn may be used to automatically start or initiate the claim handling process; and/or (b) accident reconstruction. Loss identification and/or accident reconstruction may then be paired individually and/or collectively with insurance policy data to automatically generate an insurance claim for an insured event. External data (e.g., public infrastructure or transportation system data) may also be used to determine the type and/or severity of the insured event, and the insurance claim may be modified accordingly.


A. Accident Identification


An insurance provider remote server (e.g., server 40 of FIG. 1) may promptly identify that an accident has occurred from the data gathered. Immediately thereafter, the remote server may automatically push a pre-generated or tailored message to the insured via wireless communication. The message may request that assistance be sent or directed to the current location of the insured or the vehicle accident. The insured may approve or modify the pre-generated message. The pre-generation of the message requesting assistance may substantially reduce the amount of time that it takes emergency responders to arrive at the scene of a serious accident in some instances. Such time savings may facilitate the saving of human lives in certain vehicle accidents.
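A pre-generated assistance request of the kind described above might be assembled server-side and held for the insured's approval or modification. This is a hedged sketch only; the message fields, the "awaiting_insured_approval" status value, and the function name are all illustrative assumptions:

```python
from typing import Dict, Tuple

def pre_generate_assistance_request(insured_name: str,
                                    gps_location: Tuple[float, float]) -> Dict:
    """Build a pre-generated assistance message that the insured may approve
    or modify before it is relayed to emergency responders (illustrative)."""
    lat, lon = gps_location
    return {
        "type": "emergency_assistance_request",
        "text": (f"{insured_name} may have been in a vehicle accident at "
                 f"({lat:.5f}, {lon:.5f}). Send assistance to this location."),
        "location": gps_location,
        # Held until the insured approves or modifies it via the mobile app.
        "status": "awaiting_insured_approval",
    }
```

In practice the location would come from GPS data in the telematics feed, and the push itself would go over whatever wireless channel the mobile application uses.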


As an example, in the case of an accident, communications and/or options may be pushed to the insured's mobile device (e.g., mobile device 10 of FIG. 1). The insured or driver may be asked “Are you injured?”; “Do you need assistance or an ambulance?”; “Do you need the police sent?”; “Is the accident serious or minor?”; “How many people are injured?”; “Is anyone seriously injured?”; and/or “Is your vehicle still drivable?” via their mobile device (such as a smart phone, smart glasses, or wearable device) and/or vehicle wireless communication system.


In some embodiments, a customer or insured may control whether or not emergency responders (e.g., police, fire fighters, ambulances, tow trucks, or even insurance agents) are deployed to the scene of an accident. As suggested above, for example, a mobile device or smart phone application may ask the insured: "Have you been in an accident?"; "Do you need assistance?"; "Is the accident serious?"; and/or other questions. The mobile device application may allow an insured to communicate with an insurance provider, and/or communicate directly to emergency responders, more effectively and efficiently than with conventional techniques, and/or may save time when it is of critical importance for those injured in a vehicle accident. Additionally or alternatively, the mobile device (and/or insurance provider remote server, such as remote server 40 of FIG. 1) may automatically call emergency services for the insured once (a) an accident has been detected from analysis of the telematics and/or other data collected, and/or (b) the severity of the accident has been automatically and remotely determined from the data.


B. Post-Accident Services


The mobile device application may (1) include and/or present a list of next steps that the insured should take after a vehicle accident (including instructions on how to submit an insurance claim, or automatically generate an insurance claim, for the insured event); (2) provide post-accident assistance; (3) allow for pre-selecting towing and/or auto repair service providers; and/or (4) call pre-determined persons (e.g., spouse, significant other, loved one, parents, children, friends, etc.). The mobile device application may allow the insured to customize the automatic or semi-automatic services that may be provided and/or presented to the insured when an insured event (e.g., vehicle accident) is detected from analysis of the telematics and/or other data.


The mobile device application (and/or application or functionality of a smart vehicle display or controller, such as smart vehicle controller 14 of FIG. 1) may automatically determine that a vehicle is likely no longer drivable from the data collected. Thereafter, the mobile device application may present towing services (and ratings thereof) on a mobile device of the insured promptly and/or immediately after an accident. The insured may then pick a towing service using the mobile device (and/or smart vehicle) application. The application may then direct the mobile device and/or smart vehicle to electronically notify the towing service of a request for immediate service, such as via wireless communication.


The mobile device and/or smart vehicle application may also present options, such as whether to direct the mobile device and/or smart vehicle to call an insurance agent and/or family members. The options may allow the insured to control the communications, and/or the communications may be pre-set by the insured to automatically occur. For instance, if the telematics and/or other data gathered indicates that the insured is in a serious vehicle accident, the mobile device and/or smart vehicle application may direct the mobile device and/or smart vehicle to automatically notify the insured's spouse of the details of the accident, including severity, accident location, status of the insured or driver, and/or current location of the insured or driver (e.g., in an ambulance or at a hospital).


The mobile device and/or smart vehicle application may automatically generate an insurance claim, and/or attach associated data gathered from various sensors or systems pertinent to the insured event. The application may present the insured an option to automatically submit the automatically generated insurance claim, such as by pressing an icon or button on a user interface or display screen of a mobile device application or smart vehicle control system.


C. Application Customization


The mobile device and/or smart vehicle application may allow the insured to customize the application. The application may allow the insured to select services that are requested when an accident is detected from the data collected. Detection of an accident may then trigger requests for the pre-selected services, which may include contacting police, an ambulance, and/or an insurance agent.


In one embodiment, the insurance provider may keep a user-customized profile of user preferences for an insured. The profile may indicate if a customer call center should proactively call the insured when collected data indicates that an accident has occurred. Also, for a serious accident, the insurance provider remote server may send a text or other message to the responsible insurance agent. The responsible insurance agent may then reach out to the insured promptly to provide individual customer service.


IV. Loss Mitigation Services

Gathered telematics and/or other data (e.g., any type or types of telematics and/or other data described above in Section I and/or Section II) may facilitate loss mitigation services. If an insured event happens, an insurance provider may be remotely notified via wireless communication and/or may identify such insured events based upon data remotely received from vehicles, mobile devices, and/or other electronic devices or systems.


The telematics and/or other data gathered may lead to triage of an auto accident. The data gathered may facilitate identification of whether the claim is minor and may be a “self-serve” type of claim. Additionally or alternatively, the data gathered may indicate that the claim is major, and may involve a fatality or a total loss claim. An application on a smart phone (e.g., mobile device 10 of FIG. 1, or on a smart vehicle controller such as smart vehicle controller 14 of FIG. 1) of the insured may automatically present options for the insured to submit a self-serve type of claim, and/or automatically or semi-automatically get the insured in contact with a representative of the insurance provider for more serious and complex claims. Moreover, any of the assistance requests discussed above in Section III may be automatically sent to the insured, to a first responder (e.g., hospital), and/or to other individuals or entities, for example (e.g., after approval or modification of the request by the insured).
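The triage distinction between "self-serve" and major claims could be sketched as a simple classifier. The inputs, the damage threshold, and the return labels below are illustrative assumptions; the disclosure does not specify thresholds:

```python
def triage_claim(fatality: bool, total_loss: bool, estimated_damage_usd: float,
                 self_serve_threshold_usd: float = 2500.0) -> str:
    """Classify a claim as 'self_serve' or 'major', per the triage described
    above.  The dollar threshold is a hypothetical value, not from the text."""
    if fatality or total_loss or estimated_damage_usd >= self_serve_threshold_usd:
        return "major"        # route the insured to a claims representative
    return "self_serve"       # insured may submit via the mobile application
```

A "major" result would trigger the application to connect the insured with a representative, while "self_serve" would surface the automated claim-submission flow.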


The mobile device and/or smart vehicle application may allow two customers of the same insurance provider to exchange information after an accident. For instance, the applications and/or mobile devices may be equipped for Near Field Communication (NFC). The insurance customers may agree upon the facts of the accident, including who was at fault, and submit a single or joint insurance claim to the insurance provider via their mobile devices. Such submission, especially for minor accidents, may facilitate prompt and efficient handling of the insurance claim(s) by the insurance provider, and alleviate any inconvenience incurred on the part of the insured or group of insurance customers with respect to filing insurance claims and/or other paperwork.


V. Intelligent Routing and Other Recommendations

The present embodiments may facilitate generating intelligent routing and/or other recommendations, and transmitting those to an insured. Intelligent routing recommendations may be based upon vehicle location, route, and/or destination information. The intelligent routing may also be based upon historical data and/or real-time data. The historical and/or real-time data may relate to past or current accidents, weather, traffic, traffic patterns, road conditions, and/or road construction. The intelligent routing functionality, and/or usage (or percentage of usage) thereof, may be used to adjust insurance premiums or rates, and/or discounts.


A. Route Guidance


The intelligent routing recommendations may provide (e.g., via wireless communication, from server 40 of FIG. 1 to mobile device 10, and/or smart vehicle controller 14, of FIG. 1) directions and/or route guidance to a driver or insured based upon traffic patterns and/or actual accident data. The intelligent routing may also take into consideration current weather, construction, traffic, and/or other current conditions.


The intelligent routing recommendations may provide real-time warnings or updates to drivers or insurance customers. Moreover, the intelligent routing recommendations may lead to collision or accident avoidance; more efficient or quicker trips; driving through less traffic or construction; better gas mileage; and/or other benefits.


For instance, short-term or minor road construction projects that may occur with little or no notice may be promptly detected by an insured or the insured's smart vehicle. The GPS location of the minor road construction project (which may be temporarily shutting down a main traffic route or otherwise slowing down traffic) may be sent from the smart vehicle of the insured to the insurance provider remote server. The remote server may then estimate routes to divert traffic around the construction project and notify other insurance customers in the area of an alternate recommended route, such as via wireless communication to their smart vehicles (e.g., vehicle 8 or smart vehicle controller 14 of FIG. 1) or mobile devices (e.g. mobile device 10 of FIG. 1).
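Selecting which nearby customers to notify of a reported construction site could be done with a simple radius test on GPS coordinates. This is a minimal sketch under assumed data shapes (a dict of customer IDs to coordinates) and an assumed 5-mile radius; the route-estimation step itself is not shown:

```python
import math
from typing import Dict, List, Tuple

Coord = Tuple[float, float]  # (latitude, longitude) in degrees

def haversine_miles(a: Coord, b: Coord) -> float:
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 3958.8 * 2 * math.asin(math.sqrt(h))  # mean Earth radius in miles

def customers_to_notify(construction_gps: Coord,
                        customer_locations: Dict[str, Coord],
                        radius_miles: float = 5.0) -> List[str]:
    """Pick customers near a reported construction site for an alternate-route
    alert.  The radius is an illustrative assumption."""
    return [cid for cid, loc in customer_locations.items()
            if haversine_miles(construction_gps, loc) <= radius_miles]
```

The remote server would then push the recommended alternate route only to the customers this filter selects, rather than broadcasting to all insureds.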


The telematics and/or other data may be used to generate messages or alerts transmitted to a smart vehicle or mobile device. A message may indicate that the driver is entering a danger zone associated with an above average risk. For instance, the area may have a lot of ongoing construction, and/or be associated with a higher than average number of accidents.


In one embodiment, the intelligent routing may utilize vehicle-to-vehicle (V2V) communication. The V2V communication may reveal that the vehicles ahead of an insured vehicle are all braking, indicating an accident ahead. The V2V communication data may be sent directly from one vehicle to an insured vehicle (e.g., from vehicle 6 to vehicle 8 of FIG. 1), or alternatively, from one vehicle to a remote telematics or an insurance provider server (e.g., from vehicle 6 to server 40 of FIG. 1). The remote server may then send a message or warning to the insured or insured vehicle to slow down, or even exit a highway and take an alternate route. Access to the remote server may be granted via a subscription service or as a customer service provided by the insurance provider.
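The "vehicles ahead are all braking" inference could be reduced to a threshold on the fraction of V2V reports indicating braking. The report shape and the 80% threshold below are hypothetical, chosen only to make the sketch concrete:

```python
from typing import Dict, List

def accident_likely_ahead(v2v_reports: List[Dict],
                          braking_fraction_threshold: float = 0.8) -> bool:
    """Infer a probable accident ahead when most vehicles ahead report braking.

    v2v_reports: dicts with at least a boolean "braking" field (illustrative).
    The 80% threshold is an assumption, not a value from the disclosure.
    """
    if not v2v_reports:
        return False
    braking = sum(1 for r in v2v_reports if r["braking"])
    return braking / len(v2v_reports) >= braking_fraction_threshold
```

When this returns true, the remote server (or the vehicle itself) might warn the driver to slow down or suggest exiting for an alternate route, as described above.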


V2V communication may include sending a message to a smart vehicle or mobile device directing the smart vehicle or mobile device to automatically start recording telematics data. For instance, V2V communication may indicate that an accident has occurred or is likely to happen. In such situations, automatically recording telematics and/or other data may facilitate accident reconstruction and/or cause of accident determination.


B. Accident Location Reporting


In one embodiment, an insured may opt-in to a program that allows or facilitates, from telematics and/or other data, automatic vehicle accident location reporting. Reporting accident locations in real-time to an insurance provider remote server may facilitate the remote server determining intelligent routes for a group of other insurance customers presently on the road. Customers currently traveling toward the scene of the accident may be re-directed by the remote server. The intelligent routes may direct each of the other insurance customers away from, or to avoid, the scene of the accident, facilitating more efficient and safer travel.


In other words, if one insured self-reports an accident location (e.g., via automatic wireless communication indicating GPS location information), other insurance customers or drivers may be able to promptly and effectively avoid the accident scene through intelligent routing. The intelligent routing may not only consider avoidance of the accident scene, but also other driving risk conditions, such as current traffic, construction, and/or weather conditions, to determine an overall lowest risk alternate route to each vehicle's respective destination.


C. Other Recommendations


Telematics and/or other data gathered (e.g., any type or types of telematics and/or other data described above in Section I and/or Section II) may reveal certain trends about an insured. The data may indicate that the insured is typically driving in areas associated with an above-average number of accidents and/or high crime neighborhoods. The data may also indicate that the insured frequently drives over the speed limit and/or takes turns above a recommended speed. The high-risk accident areas or corners may be highlighted on a road map display, such as a vehicle navigation unit, for ease of viewing.


In response, the insurance provider remote server may push appropriate recommendations to the insured, such as recommendations to take certain corners at a slower speed and/or avoid traveling on roads, or through areas, associated with a high number of past accidents. The insurance provider remote server may also present an insurance-related benefit on a display that may be earned if the insured follows the insurance-related recommendations as a means of incentivizing lower risk driving behavior.


VI. Theft Prevention & Mitigation

A telematics device may determine that the driver of a vehicle is not the owner or an authorized driver (e.g., not someone covered under the auto insurance policy). The vehicle and/or mobile device may determine that an unknown driver is attempting or may attempt to start an insured vehicle, or is already driving the insured vehicle, by detecting that an unknown or unrecognized mobile device (e.g., smart phone) is in the vehicle.


As an example, the Bluetooth signatures of allowed/authorized mobile devices may be learned from normal mobile device operation. However, if an unrecognized Bluetooth signal is detected, it may be determined that the vehicle has been stolen, especially if GPS information from the insured's mobile device indicates that the insured is not presently in the insured vehicle. The insured, insurance provider, and/or police may all be automatically notified of the theft.
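The two-signal test just described (unrecognized Bluetooth signature plus GPS evidence that the insured is elsewhere) can be sketched directly. Function and parameter names are illustrative:

```python
from typing import Iterable

def theft_suspected(detected_bt_signatures: Iterable[str],
                    authorized_bt_signatures: Iterable[str],
                    insured_in_vehicle: bool) -> bool:
    """Flag a possible theft: an unrecognized Bluetooth signature is present
    in the vehicle while GPS indicates the insured is not (illustrative)."""
    unrecognized = set(detected_bt_signatures) - set(authorized_bt_signatures)
    return bool(unrecognized) and not insured_in_vehicle
```

A true result would trigger the automatic notifications to the insured, the insurance provider, and/or the police mentioned above; a recognized signature, or the insured's presence in the vehicle, suppresses the alert.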


Additionally or alternatively, a current GPS location of the insured vehicle may be displayed on a virtual map of a mobile device application, along with speed and direction information. The mobile device application with "Find My Car" functionality may be used to locate vehicles parked in large parking lots, such as at a shopping mall or airport, where the insured may have forgotten where the vehicle is parked, and/or may be used to locate stolen vehicles.


The telematics and/or other data may indicate that a home is left unoccupied for a substantial length of time. For instance, the data collected may indicate how often the insured visits, and how much time the insured spends at, a second or vacation home. If an insured home is left unoccupied for a substantial amount of time, a recommendation may be sent to the insured to monitor the condition or status of the home more closely, to alleviate the risk of theft and/or of needed repairs being left unattended. Insurance savings (e.g., a premium discount) may be provided to the insured in return for following the recommendations provided by the insurance provider.


VII. Exemplary Loss Mitigation Method


FIG. 3 illustrates an exemplary computer-implemented method 100 of loss mitigation associated with a vehicle accident. In some embodiments, the method 100 may be implemented in whole or in part by one or more components of the system 1 depicted in FIG. 1. For example, the method 100 may be implemented by one or more servers remote from the components (e.g., sensors, vehicles, mobile devices, etc.) sourcing telematics data, such as the server 40 (e.g., processor(s) 62 of the server 40 when executing instructions stored in the program memory 60 of the server 40) or another server not shown in FIG. 1.


The method 100 may include collecting accident data associated with a vehicle accident involving a driver (block 102). The driver may be associated with an insurance policy issued by the insurance provider (e.g., an owner of the policy, or another individual listed on the policy). The accident data may include telematics data, and possibly other data, collected from one or more sources. For example, the accident data may include data associated with or generated by one or more mobile devices (e.g., mobile device 10 of FIGS. 1 and 2); an insured vehicle or a computer system of the insured vehicle (e.g., vehicle 8 or smart vehicle controller 14 of FIGS. 1 and 2, or one or more sensors mounted on the vehicle); a vehicle other than the insured vehicle (e.g., vehicle 6 of FIG. 1); vehicle-to-vehicle (V2V) communication (e.g., communications between vehicle 8 and vehicle 6 in FIG. 1); and/or roadside equipment or infrastructure located near a location of the vehicle accident (e.g., infrastructure components 26 of FIG. 1). Generally, the accident data may include any one or more of the types of data discussed above in Section I and/or II (and/or other suitable types of data), and may be collected according to any of the techniques discussed above in Section I and/or II (and/or other suitable techniques). The accident data may have been generated by the respective source(s), and/or collected, before, during and/or after the accident.


The method 100 may also include analyzing any or all of the collected accident data (block 104), and determining a likely severity of the accident based upon the analysis (block 106). For example, it may be determined that an accident is likely severe (e.g., likely involves severe personal injury) if accelerometer data included in the accident data indicates a very large and abrupt change in speed. As another example, it may be determined that an accident is likely severe if the accident data (e.g., from a vehicle-mounted camera) shows that the accident was a head-on accident between two vehicles.
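The severity determination at block 106 might, for instance, bucket accidents by the abruptness of the speed change (delta-v) and by collision geometry, as in the two examples just given. The thresholds and labels below are illustrative assumptions, not values from the disclosure:

```python
def likely_severity(delta_v_mph: float, head_on: bool = False) -> str:
    """Map an abrupt change in speed and collision geometry to a likely
    severity bucket (block 106).  Thresholds are hypothetical."""
    if head_on or delta_v_mph >= 30.0:
        return "severe"     # e.g., likely involves severe personal injury
    if delta_v_mph >= 10.0:
        return "moderate"
    return "minor"
```

The resulting bucket would then drive the communication generated at block 108, e.g., a "severe" determination could cause an emergency assistance request to be sent without waiting for driver feedback (block 108C).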


The method 100 may also include automatically communicating with the driver (e.g., the insured) (block 108). For example, a communication related to emergency assistance or an emergency assistance recommendation may be generated based upon the likely severity as determined at block 106, and then transmitted from one or more remote servers implementing the method 100 to a mobile device associated with (e.g., owned and/or carried by) the driver, such as mobile device 10 of FIG. 1. The communication may take any of various different forms, such as, for example, “Are you injured?”; “Do you need assistance or an ambulance?”; “Do you need the police sent?”; “Is the accident serious or minor?”; “How many people are injured?”; “Is anyone seriously injured?”; and/or “Is your vehicle still drivable?”


Alternative embodiments and/or scenarios corresponding to block 108 (and/or a process subsequent to block 108) are reflected in blocks 108A through 108C. For example, the driver (e.g., insured) may either accept or reject the emergency assistance indicated in the communication (block 108A), e.g., by making a selection via a user interface of the mobile device, in response to a prompt that appears in connection with the communication. Alternatively, the driver may modify the emergency assistance request or recommendation (block 108B), e.g., by indicating a different type of emergency assistance (ambulance, police, etc.). Again, the modification may be made via a user interface of the mobile device, in response to a prompt that appears in connection with the communication. As yet another alternative, an emergency assistance request may automatically be sent to a third party (e.g., police department, fire department, hospital, etc.) without waiting for any feedback from the driver (block 108C). For example, the communication at block 108 may merely notify the driver that emergency assistance has been requested, and possibly specify the type of assistance requested.
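The accept / modify / reject branching of blocks 108A and 108B could be expressed as a small handler over the recommendation payload. The field names, action strings, and status values below are hypothetical:

```python
from typing import Dict

def handle_driver_response(recommendation: Dict, response: Dict) -> Dict:
    """Apply the driver's response to an emergency assistance recommendation
    (blocks 108A-108B).  All field names are illustrative assumptions."""
    action = response.get("action")
    if action == "accept":
        return {**recommendation, "status": "dispatch"}
    if action == "modify":
        # Driver indicated a different type of assistance (ambulance, police, ...).
        return {**recommendation,
                "assistance_type": response["assistance_type"],
                "status": "dispatch"}
    return {**recommendation, "status": "cancelled"}  # reject (or no response)
```

Block 108C corresponds to skipping this handler entirely and dispatching immediately, with the communication at block 108 serving only as a notification.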


Although not shown in FIG. 3, the method 100 may also include receiving a wireless communication from the driver (e.g., from the mobile device of the driver) in response to the communication at block 108. The wireless communication may indicate whether the driver approved and/or modified (or rejected) the recommended or proposed emergency assistance, for example. In such an embodiment, if the assistance is not rejected, the method 100 may further include notifying a third party (e.g., police department, fire department, hospital, etc.) of the proposed emergency assistance, in accordance with the modifications, if any, made by the driver.


The method 100 may also include determining (e.g., based upon the analysis at block 104) fault of the driver for the accident. As seen in FIG. 3, for example, the fault for the driver (e.g., the insured) and/or for another driver may be compared or otherwise analyzed, and assigned to the appropriate party or parties (block 110). The fault may be determined as one or more binary indicators (e.g., “at fault” or “not at fault”), percentages (e.g., “25% responsible”), ratios or fractions, and/or any other suitable indicator(s) or measure(s) of fault. In some embodiments and/or scenarios, fault for a first individual is implicitly determined based upon the fault that is explicitly determined for another individual (e.g., an insured may implicitly be determined to have 0% fault if another driver is explicitly determined to be 100% at fault).
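The implicit-fault rule described at the end of this paragraph (an insured is 0% at fault when another driver is explicitly 100% at fault) amounts to splitting the unassigned remainder among the parties without an explicit percentage. A minimal sketch, with hypothetical names:

```python
from typing import Dict, List

def assign_fault(explicit_fault: Dict[str, float],
                 all_parties: List[str]) -> Dict[str, float]:
    """Complete a fault assignment (block 110): parties without an explicit
    percentage split the remainder evenly.  Illustrative only."""
    result = dict(explicit_fault)
    implicit = [p for p in all_parties if p not in explicit_fault]
    if implicit:
        share = (100.0 - sum(explicit_fault.values())) / len(implicit)
        for p in implicit:
            result[p] = share
    return result
```

An even split of the remainder is one simple choice; an actual adjuster (or model) could of course apportion the remainder unevenly based on the accident reconstruction.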


The method 100 may also include handling an insurance claim associated with the accident (block 112). For example, the fault of the driver (e.g., insured) determined at block 110 may be used to determine the appropriate payout by the insurance provider, or whether another insurance provider should be responsible for payment, etc.


The method 100 may also include adjusting, generating and/or updating one or more insurance-related items (block 114). The insurance-related item(s) may include, for example, parameters of the insurance policy (e.g., a deductible), a premium, a rate, a discount, and/or a reward. The adjustment, generation and/or update may be based upon the fault determined at block 110, or based upon the driver having the emergency assistance functionality that allows the method 100 to be performed (e.g., a mobile device application that enables the driver to receive the communication sent at block 108 and/or to send the wireless communication received at block 108), for example.


In other embodiments, the method 100 may include additional, fewer, or alternate actions as compared to those shown in FIG. 3, including any of those discussed elsewhere herein. For example, the method 100 may further include transmitting information indicative of the adjusted, generated, or updated insurance-related items to a mobile device associated with the driver (or another individual associated with the insurance policy), such as mobile device 10 of FIG. 1, to be displayed on the mobile device for review, modification, or approval by the driver or other individual.


As another example, the method 100 may further include receiving a wireless communication from the driver cancelling emergency assistance that has already been requested from a third party. As yet another example, the method 100 may include (1) generating an estimated insurance claim based upon the likely severity determined at block 106; (2) transmitting the estimated insurance claim to the driver's or insured's mobile device to facilitate presenting all or some of the claim to the driver or insured; (3) receiving a wireless communication from the driver or insured indicating approval, rejection or modification of the claim; and/or (4) handling the claim in accordance with the approval, rejection or modification. In still other example embodiments, the method 100 may omit blocks 110, 112 and/or 114.


As can be seen from the above discussion, the method 100 may enable a prompt response by the appropriate personnel (e.g., by first responders with an ambulance), and various components (e.g., in the example system 1) may complete their associated tasks relatively quickly and/or efficiently. For instance, the processor 62 of FIG. 1 may require much less time and/or far fewer processing cycles to request emergency assistance than if an insurance provider employee were to learn about the accident via other means (e.g., a phone call from the insured or passenger) and then instruct processor 62 to generate a request for help.


VIII. Additional Exemplary Loss Mitigation Methods

In one aspect, a computer-implemented method of loss mitigation may be provided. The method may include (1) collecting or receiving telematics and/or other data at or via a remote server associated with an insurance provider, the telematics and/or other data being associated with a vehicle accident involving a specific driver and/or an insured. The insured may own an insurance policy issued by the insurance provider, and the telematics and/or other data may be gathered before, during, and/or after the vehicle accident. The method may include (2) analyzing the telematics and/or other data at and/or via the remote server; (3) determining, at and/or via the remote server, a likely severity of the vehicle accident from the analysis of the telematics and/or other data; (4) generating a communication related to emergency assistance or an emergency assistance recommendation, at the remote server, based upon the likely severity of the vehicle accident that is determined from the analysis of the telematics and/or other data; (5) transmitting, via wireless communication, the communication related to the emergency assistance or emergency assistance recommendation from the remote server to a mobile device or smart vehicle associated with the specific driver and/or insured; (6) receiving, at and/or via the remote server, a wireless communication from the specific driver and/or insured indicating approval, rejection, or modification of the emergency assistance or emergency assistance recommendation; and/or (7) notifying a third party, such as emergency responders (e.g., police or medical personnel), via a communication sent from the remote server, of the emergency assistance approved and/or requested by the specific driver. The method may include additional, fewer, or alternate actions, including those discussed elsewhere herein.


For instance, the method may include adjusting, generating, and/or updating, at and/or via the remote server, an insurance policy, premium, rate, discount, and/or reward for the specific driver and/or the insured based upon having the emergency assistance functionality. The method may further comprise transmitting information related to the adjusted, generated, and/or updated insurance policy, premium, rate, discount, and/or reward from the remote server to a mobile device associated with the specific driver and/or insured to facilitate presenting, on a display of the mobile device, all or a portion of the adjusted, generated, and/or updated insurance policy, premium, rate, discount, and/or reward to the specific driver and/or insured for their review, modification, and/or approval. Also, the telematics and/or other data may include the types of data discussed elsewhere herein.


In another aspect, another computer-implemented method of loss mitigation may be provided. The method may include (1) collecting or receiving telematics and/or other data at or via a remote server associated with an insurance provider, the telematics and/or other data being associated with a vehicle accident involving a specific driver and/or an insured. The insured may own an insurance policy issued by the insurance provider, and the telematics and/or other data may be gathered before, during, and/or after the vehicle accident. The method may include (2) analyzing the telematics and/or other data at and/or via the remote server; (3) determining, at and/or via the remote server, a likely severity of the vehicle accident from the analysis of the telematics and/or other data; (4) generating a communication related to emergency assistance or an emergency assistance recommendation, at and/or via the remote server, based upon the likely severity of the vehicle accident that is determined from the analysis of the telematics and/or other data; and/or (5) transmitting, via wireless communication, the communication related to the emergency assistance or emergency assistance recommendation from the remote server directly to a third party, such as a police department, fire department, and/or hospital to facilitate prompt and appropriate emergency responder response to the vehicle accident.


The method may further comprise notifying the specific driver and/or insured, via wireless communication sent from the remote server, that the emergency assistance from the third party has been requested, and/or receiving from the specific driver and/or insured, at or via the remote server, a wireless communication indicating a cancellation of the emergency assistance from the third party and/or that the emergency assistance is not necessary. The method may include adjusting, generating, and/or updating, via the remote server, an insurance policy, premium, rate, discount, and/or reward for the specific driver and/or the insured based upon having the emergency assistance functionality.


The method may include transmitting information related to the adjusted, generated, and/or updated insurance policy, premium, rate, discount, and/or reward from the remote server to a mobile device associated with the specific driver and/or insured to facilitate presenting, on a display of the mobile device, all or a portion of the adjusted, generated, and/or updated insurance policy, premium, rate, discount, and/or reward to the specific driver and/or insured for their review, modification, and/or approval.


IX. Exemplary Estimated Claim Generation Method

In one aspect, a computer-implemented method of generating an insurance claim for an insured may be provided. The method may include: (1) collecting or receiving telematics and/or other data (e.g., any of the telematics and/or other data described above in Section I and/or Section II) at or via a remote server associated with an insurance provider (e.g., server 40 of FIG. 1), the telematics and/or other data being associated with a vehicle accident involving a specific driver and/or an insured, the insured owning an insurance policy issued by the insurance provider and the telematics and/or other data being gathered before, during, and/or after the vehicle accident; (2) analyzing the telematics and/or other data at or via the remote server; (3) determining, at or via the remote server, a likely severity of the vehicle accident from the analysis of the telematics and/or other data; (4) generating an estimated insurance claim, at or via the remote server, based upon the severity of the vehicle accident determined from the analysis of the telematics and/or other data; (5) transmitting, via wireless communication, the estimated insurance claim from the remote server to a mobile device associated with the specific driver and/or insured (e.g., mobile device 10 of FIG. 1) to facilitate presenting all, or a portion of, the estimated insurance claim to the specific driver and/or insured; (6) receiving, at or via the remote server, a wireless communication from the specific driver and/or insured indicating approval, rejection, or modification of the estimated insurance claim; and/or (7) handling or addressing the estimated insurance claim, at or via the remote server, in accordance with the specific driver and/or insured's approval, rejection, or modification of the estimated insurance claim. 
The method may further include (8) adjusting, generating, and/or updating, at or via the remote server, an insurance policy, premium, rate, discount, and/or reward for the specific driver and/or the insured (or insured vehicle) based upon the estimated insurance claim. The method may include additional, fewer, or alternate actions, including those discussed elsewhere herein.
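Steps (4) through (7) of the estimated claim method could be sketched as below. The dollar amounts, severity-to-estimate mapping, and field names are hypothetical assumptions used only to make the flow concrete.

```python
# Illustrative sketch of steps (4)-(7): building an estimated insurance
# claim from a determined severity level and recording the insured's
# response. All amounts and field names are assumed for demonstration.

from dataclasses import dataclass

@dataclass
class EstimatedClaim:
    policy_id: str
    severity: str
    estimated_damage_usd: int
    status: str = "pending_insured_review"

# Hypothetical mapping from accident severity to an initial estimate.
SEVERITY_TO_ESTIMATE = {"minor": 1_500, "moderate": 8_000, "severe": 25_000}

def generate_estimated_claim(policy_id: str, severity: str) -> EstimatedClaim:
    """Step (4): generate an estimated claim from the likely severity."""
    return EstimatedClaim(policy_id, severity, SEVERITY_TO_ESTIMATE[severity])

def handle_insured_response(claim: EstimatedClaim, response: str) -> EstimatedClaim:
    """Steps (6)-(7): apply the insured's approval/rejection/modification."""
    if response not in {"approved", "rejected", "modified"}:
        raise ValueError(f"unexpected response: {response}")
    claim.status = response
    return claim
```

The resulting claim record would then be transmitted to the insured's mobile device for review (step 5) before the response is applied.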


X. Exemplary Intelligent Routing Method


FIG. 4 illustrates an exemplary computer-implemented method 200 of providing intelligent routing to reduce risk and/or the likelihood of an insured event from occurring. In some embodiments, the method 200 may be implemented in whole or in part by one or more components of the system 1 depicted in FIG. 1. For example, the method 200 may be implemented by one or more servers remote from the components (e.g., sensors, vehicles, mobile devices, etc.) sourcing telematics data, such as the server 40 (e.g., processor(s) 62 of the server 40 when executing instructions stored in the program memory 60 of the server 40) or another server not shown in FIG. 1.


The method 200 may include receiving trip information including a vehicle's destination, planned route, and/or current location. As seen in FIG. 4, for example, the method 200 may include collecting telematics and/or other data associated with the vehicle's location, route, and/or destination (and possibly other information, such as the vehicle's origination point) at an insurance provider remote server (block 202). The data may include GPS, navigation, and/or other data associated with, or generated by, the driver's mobile device, the driver's vehicle (or a computer system thereof), another vehicle (e.g., a vehicle in the vicinity of the driver's vehicle), V2V communication, and/or roadside equipment and/or infrastructure, for example.


The method 200 may also include analyzing the data/information collected at block 202 (block 204). In some embodiments and/or scenarios, the method 200 may include comparing/analyzing the vehicle location, route, and/or destination with real-time traffic, construction, and/or weather conditions (block 206A). The real-time traffic, construction, and/or weather conditions may be telematics data collected from other vehicles (and/or roadside equipment or infrastructure, etc.), for example. In other embodiments and/or scenarios, the method 200 may include comparing/analyzing the vehicle location, route, and/or destination with information in a database of traffic conditions, construction conditions, weather conditions, and/or past accidents (block 206B). The method 200 may include building the database using traffic, construction, weather, and/or accident information gathered from one or more sources (e.g., news feeds, telematics data, etc.), for example.


The method 200 may also include identifying a lower risk route or routes between the vehicle's current location and the destination (block 208). For example, the method 200 may include identifying areas (e.g., roads or road segments) associated with higher risk of vehicle accident using collected vehicle telematics data and/or database (e.g., traffic, construction, weather, accident, etc.) information, and the route(s) may be identified/determined at block 208 such that those high-risk areas are avoided. Alternatively, as seen in FIG. 4, the method 200 may include identifying a more fuel-efficient route from the vehicle's current location to the destination at block 208.
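The route identification at block 208 can be pictured as scoring candidate routes against a per-segment risk database (built at blocks 206A/206B) and recommending the lowest-scoring one. The segment identifiers and risk values below are illustrative assumptions.

```python
# Hypothetical sketch of block 208: scoring candidate routes by summing
# per-road-segment risk scores drawn from a traffic/construction/weather/
# accident database, then recommending the lowest-risk route.

# Assumed per-segment risk scores (e.g., derived from past accidents,
# current construction, and weather conditions).
SEGMENT_RISK = {"A": 0.2, "B": 0.9, "C": 0.1, "D": 0.3, "E": 0.15}

def route_risk(route: list) -> float:
    """Total risk of a route = sum of its segments' risk scores."""
    return sum(SEGMENT_RISK[segment] for segment in route)

def lowest_risk_route(candidates: list) -> list:
    """Pick the candidate route with the smallest total risk."""
    return min(candidates, key=route_risk)

candidates = [["A", "B"], ["A", "D", "C"], ["E", "D"]]
print(lowest_risk_route(candidates))  # prints ['E', 'D']
```

A fuel-efficiency variant would simply swap the risk scores for per-segment fuel-cost estimates in the same minimization.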


The method 200 may also include communicating at least one of the one or more identified lower risk routes to the driver (e.g., the insured) as a recommended route to the destination (block 210). The route may be communicated via wireless communication to a mobile device and/or a smart vehicle of the driver (e.g., to mobile device 10, and/or a vehicle navigation system of vehicle 8, of FIG. 1), for example.


The method 200 may also include determining whether the recommended route was taken by the driver based upon analysis of telematics and/or other data (block 212). For example, GPS data may be received from the driver's mobile device or smart vehicle, and used to determine whether the recommended route was followed or a different route was taken instead.
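One simple way to implement the check at block 212 is to count how many GPS fixes from the trip fall near the recommended route. The sketch below uses planar waypoint distances for brevity; a real system would use geodesic distance and interpolate between waypoints, and the tolerance and match fraction are assumed values.

```python
# Illustrative sketch of block 212: deciding whether the driver followed
# the recommended route by checking what fraction of GPS fixes lie near
# the route's waypoints. Tolerance and match fraction are assumptions.

import math

def distance_to_route(point, route_points) -> float:
    """Minimum planar distance from a GPS fix to any route waypoint."""
    return min(math.dist(point, waypoint) for waypoint in route_points)

def route_followed(gps_trace, route_points, tol=0.001, min_match=0.8) -> bool:
    """True if at least min_match of the trace is within tol of the route."""
    near = sum(1 for p in gps_trace
               if distance_to_route(p, route_points) <= tol)
    return near / len(gps_trace) >= min_match
```

The boolean result would then feed the insurance-related adjustment at block 214 (e.g., crediting a discount when recommended routes are taken).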


The method 200 may also include adjusting, updating, and/or generating one or more insurance-related items based upon the determination at block 212 (block 214). The insurance-related item(s) may include, for example, parameters of the insurance policy (e.g., a deductible), a premium, a rate, a discount, and/or a reward. Alternatively, or additionally, the insurance-related item(s) may be adjusted, updated, and/or generated (e.g., insurance discounts may be provided) based upon an amount of usage, by the driver or another individual associated with the same insurance policy, of the intelligent routing functionality (e.g., a number or percentage of recommended routes taken, etc.).


In other embodiments, the method 200 may include additional, fewer, or alternate actions as compared to those shown in FIG. 4, including any of those discussed elsewhere herein. For example, the method 200 may omit block 212 and/or block 214.


As can be seen from the above discussion, the method 200 may efficiently determine low-risk routes for drivers. For instance, the processor 62 of FIG. 1 may centrally determine low-risk routes for numerous different drivers in much less time than would be needed if those drivers were to instead use personal (e.g., mobile) computing devices to directly collect, and manually review, information (e.g., past or current accident information) needed to identify their own low-risk routes.


XI. Additional Exemplary Intelligent Routing Methods

In another aspect, a computer-implemented method of intelligent routing may be provided. The method may include (1) collecting telematics and/or other data and/or building a database related to multiple vehicle accidents; (2) identifying, via a processor or remote server, areas of higher than average vehicle accidents and/or less risky travel routes or roads; (3) receiving, at or via the remote server, a destination, a planned route, and/or a current location of a vehicle, such as from telematics related data; (4) based upon the destination, planned route, or current location of the vehicle, determining, at or via the remote server, a recommended route to the destination that avoids the areas of higher than average vehicle accidents and/or reduces accident associated risk; and/or (5) transmitting the recommended route from the remote server to the insured and/or driver via wireless communication for display on the vehicle navigation system and/or a mobile device associated with the insured and/or driver to facilitate the insured and/or driver traveling via a route associated with lower risk of accident.


The method may include generating insurance discounts based upon an amount that the insured uses the intelligent routing functionality provided by an insurance provider. The telematics and/or other data may include the data indicated elsewhere herein. The method of intelligent routing may include additional, fewer, or alternative actions, including those discussed elsewhere herein.


In another aspect, another method of intelligent routing may be provided. The method may include: (1) building a database associated with road traffic, construction areas, and/or vehicle accidents; (2) receiving, at or via an insurance provider remote server, a vehicle destination and a current vehicle location associated with an insured vehicle from the insured vehicle and/or a mobile device of a driver and/or insured associated with the insured vehicle, such as from telematics related data; (3) analyzing, at or via the insurance provider remote server, the vehicle destination and the current vehicle location associated with the insured vehicle in conjunction with the database associated with the road traffic, construction areas, and/or vehicle accidents to determine a low risk recommended or alternate route to the destination; and/or (4) transmitting from the remote server, the low risk recommended or alternate route to the destination to the vehicle and/or a mobile device associated with the driver and/or insured to facilitate the driver and/or insured taking the low risk recommended or alternate route to the destination.


The method may include generating insurance discounts based upon an amount of usage (by an insured) of the intelligent routing functionality provided by an insurance provider. The telematics and/or other data may include the data indicated elsewhere herein. The method of intelligent routing may include additional, fewer, or alternative actions, including those discussed elsewhere herein.


XII. Exemplary Theft Prevention or Mitigation Method


FIG. 5 illustrates an exemplary computer-implemented method 300 of theft prevention or mitigation. In some embodiments, the method 300 may be implemented in whole or in part by one or more components of the system 1 depicted in FIG. 1. For example, the method 300 may be implemented by one or more servers remote from the components (e.g., sensors, vehicles, mobile devices, etc.) sourcing telematics data, such as the server 40 (e.g., processor(s) 62 of the server 40 when executing instructions stored in the program memory 60 of the server 40) or another server not shown in FIG. 1. While the blocks of FIG. 5 refer to a “remote server,” it is understood that other components may implement the method 300 in other embodiments. For example, the method 300 may be implemented by a vehicle controller, such as the smart vehicle controller 14 of FIGS. 1 and 2, or another vehicle controller not shown in FIG. 1 or 2.


The method 300 may include collecting driver-related data over time (block 302). The data may be associated with one or more authorized drivers of an insured vehicle (e.g., a policy owner and one or more family members), with the driver(s) and vehicle being covered by an insurance policy issued by an insurance provider (e.g., an insurance provider associated with one or more servers implementing the method 300, in one embodiment). In particular, the collected driver-related data may be associated with identification and/or driving behavior of the driver(s). For example, the driver-related data may include data indicative of driver weights, driver appearances, acceleration, braking and/or cornering behaviors of the drivers, and so on.


The driver-related data may include telematics data, and possibly other data, collected from one or more sources. For example, the driver-related data may include data associated with or generated by one or more mobile devices (e.g., mobile device 10 of FIGS. 1 and 2); an insured vehicle or a computer system of the insured vehicle (e.g., vehicle 8 or smart vehicle controller 14 of FIGS. 1 and 2, or one or more sensors mounted on the vehicle, such as a driver's seat weight sensor, or a vehicle-mounted camera for capturing images of drivers, etc.); a vehicle other than the insured vehicle (e.g., vehicle 6 of FIG. 1); vehicle-to-vehicle (V2V) communication (e.g., communications between vehicle 8 and vehicle 6 in FIG. 1); and/or roadside equipment or infrastructure located near a location of the vehicle accident (e.g., infrastructure components 26 of FIG. 1). Generally, the driver-related data may include any one or more of the types of data discussed above in Section I and/or II (and/or other suitable types of data), and may be collected according to any of the techniques discussed above in Section I and/or II (and/or other suitable techniques).


The collected driver-related data may be analyzed (block 304). For example, the data may be analyzed in order to determine an “electronic signature” for the mobile device of each authorized driver. As another example, vehicle operation data such as acceleration, braking and cornering, and/or other data, may be analyzed to determine higher-level behaviors of a driver (e.g., how often the driver brakes suddenly, or how often and/or by how much the driver exceeds the speed limit, etc.). Data may also be analyzed to categorize the data according to driver (e.g., determine, based upon weights or other identifying data, which driving behaviors correspond to which authorized driver, etc.).


While not shown in FIG. 5, the method 300 may also include a block in which a database associated with the authorized driver(s) is built based upon the driver-related data. For example, the output data produced by the analysis at block 304 (e.g., driver-specific weights, images or facial features, driving behaviors, etc.) may be stored in the database.


The known data (e.g., stored in the database) may be compared to new data to determine that a driver is unauthorized, i.e., not one of the individuals covered by the insurance policy (block 306). While referred to here as an unauthorized “driver,” the individual may be currently driving the insured vehicle, or may merely be attempting to start the vehicle or even just sitting in a seat (e.g., driver's seat) of the vehicle.


While not shown in FIG. 5, the method 300 may include a block in which current or real-time driver-related data is collected prior to making the comparison at block 306. For example, current telematics and/or other data associated with the unauthorized individual may be collected (e.g., in a manner similar to that described above in connection with block 302) in order to determine identifying characteristics and/or driving behaviors of the individual.


The comparison at block 306 may include, for example, comparing a weight of the current driver with the weights of each authorized driver (e.g., based upon data that was generated by a driver's seat weight sensor of the insured vehicle), comparing an appearance of the current driver with the appearance of each authorized driver (e.g., based upon image data captured by a vehicle-mounted camera and using suitable image processing techniques), and/or comparing electronic signatures or signals of mobile devices of the authorized drivers with an unknown electronic signature or signal of an unrecognizable mobile device associated with the unauthorized individual. Additionally or alternatively, the comparison may include comparing acceleration, braking and/or cornering behaviors or patterns of the current driver with like behaviors or patterns of each authorized driver, etc.
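The comparison at block 306 can be sketched as matching current driver-related data against stored profiles of the authorized drivers. The profile fields, device identifiers, and weight tolerance below are hypothetical assumptions chosen only to illustrate the matching logic.

```python
# Hypothetical sketch of block 306: comparing current driver-related data
# (mobile-device signature, seat-sensor weight) against stored profiles
# of authorized drivers. All values and tolerances are illustrative.

AUTHORIZED_PROFILES = [
    {"name": "policy_owner", "weight_kg": 82.0, "device_id": "phone-111"},
    {"name": "family_member", "weight_kg": 61.0, "device_id": "phone-222"},
]

def is_authorized(current: dict, weight_tol_kg: float = 5.0) -> bool:
    """A driver matches if their mobile-device signature is recognized,
    or the measured seat weight is close to a known driver's weight."""
    for profile in AUTHORIZED_PROFILES:
        if current.get("device_id") == profile["device_id"]:
            return True
        weight = current.get("weight_kg")
        if weight is not None and abs(weight - profile["weight_kg"]) <= weight_tol_kg:
            return True
    return False
```

A production system would combine several such signals (weight, image, device signature, driving behavior) rather than accepting any single match.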


After determining that the current driver is unauthorized, the insured vehicle may be disabled (block 308). For example, a remote server implementing the method 300 may send a wireless signal to a vehicle controller within the insured vehicle (e.g., smart vehicle controller 14 of FIG. 1), causing the insured vehicle to gradually and slowly come to a halt (if currently moving), or preventing the insured vehicle from being started (if not yet moving), etc. In some embodiments, the disablement/prevention may only occur if an authorized driver (e.g., the policy owner) acknowledges/confirms that the person currently driving (or attempting to start, etc.) the insured vehicle does not have the permission of the policy owner and/or vehicle owner to drive the insured vehicle.
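The gated disablement at block 308 could be sketched as below, where the server acts only after the policy owner confirms the occupant is not permitted and then issues a command appropriate to the vehicle's state. The command names and dictionary-based vehicle stand-in are hypothetical.

```python
# Illustrative sketch of block 308: directing vehicle disablement only
# after an authorized driver confirms the occupant is unauthorized.
# Command names and the dict-based vehicle stand-in are assumptions.

def handle_unauthorized_driver(vehicle: dict, owner_confirms: bool,
                               moving: bool) -> str:
    """Return the disablement command sent to the vehicle controller."""
    if not owner_confirms:
        return "no_action"  # owner indicates the driver is permitted
    if moving:
        vehicle["command"] = "gradual_stop"   # slow safely to a halt
    else:
        vehicle["command"] = "ignition_lock"  # prevent starting
    return vehicle["command"]
```

The alternative path (block 310) would instead leave the vehicle operable and stream its GPS location to the owner's mobile device and/or a police server.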


Disablement of the vehicle may also depend upon other conditions being satisfied. For example, it may first need to be verified that the unauthorized individual is sitting in a driver's seat of the insured vehicle (e.g., rather than simply being a passenger). The verification may be made by triangulation or communication techniques between the unauthorized individual's mobile device and a vehicle-mounted transmitter, and/or using a visual image of the unauthorized individual, for example.


As an alternative to block 308, the method 300 may include tracking the location of the insured vehicle (block 310). Vehicle tracking may be accomplished using GPS coordinates, for example, and may persist until the vehicle is returned to the vehicle owner. The method 300 may also include transmitting a current GPS location of the insured vehicle to a mobile device of one of the authorized drivers (e.g., the policy owner and/or vehicle owner), and/or to a third party remote server (e.g., a server associated with a police department).


In other embodiments, the method 300 may include additional, fewer, or alternate actions as compared to those shown in FIG. 5, including any of those discussed elsewhere herein. For example, instead of (or in addition to) block 308, the method 300 may include notifying one of the authorized drivers (e.g., the policy owner) and/or authorities (e.g., a server of a police department), via wired or wireless communications, that the insured vehicle was (or may be) stolen.


As can be seen from the above discussion, the method 300 may efficiently prevent vehicle theft, or efficiently mitigate the losses and/or inconveniences due to such a theft. For instance, the processor 62 of FIG. 1 may detect a likely vehicle theft far more quickly than if an insurance provider employee were to input theft reporting data to server 40 only after an insured or other individual recognized, and then called to report, the theft.


XIII. Additional Exemplary Theft Prevention or Mitigation Method

In one aspect, a computer-implemented method of vehicle theft prevention or mitigation may be provided. The method may include: (1) collecting or receiving telematics and/or other data at or via a remote server associated with an insurance provider (or at or via a vehicle controller) over time, the telematics and/or other data being associated with an insured driver or family member driving an insured vehicle (and/or their identification), the insured vehicle being covered by an insurance policy issued by the insurance provider; (2) building, at or via the remote server (or vehicle controller), a database of insured drivers or family members (i) authorized to drive the insured vehicle, and/or (ii) covered by the insurance policy; (3) collecting or receiving current telematics and/or other data at or via the remote server (or vehicle controller) associated with an individual attempting to start or currently driving the insured vehicle; (4) determining, at or via the remote server (or vehicle controller), that the individual attempting to start or currently driving the insured vehicle is not among the insured drivers or family members (i) authorized to drive the insured vehicle, or (ii) covered by the insurance policy; and/or (5) if so, then directing or controlling, at or via the remote server (or vehicle controller), a disablement of an operation of the insured vehicle (i.e., preventing the vehicle from operating, or safely and orderly slowing the vehicle down to a halt and/or moving the vehicle off to the side of the road) and/or preventing the individual from starting or otherwise operating the insured vehicle to facilitate preventing or mitigating theft of the insured vehicle.


The determining, at or via the remote server (or vehicle controller), that the individual attempting to start, or currently driving, the insured vehicle is not among the insured drivers or family members (i) authorized to drive the insured vehicle, or (ii) covered by the insurance policy may be performed by comparing electronic signatures or signals of mobile devices of the insured drivers or family members with an unknown electronic signature or signal of an unrecognizable mobile device associated with the individual attempting to start, or currently driving, the insured vehicle, or otherwise sitting in a driver's seat of the insured vehicle. The method may include verifying, before preventing operation of the insured vehicle, that the unknown individual attempting to start, or currently driving, the insured vehicle is sitting in the driver's seat of the insured vehicle, such as via (a) triangulation or communication techniques between the unrecognizable mobile device and vehicle mounted transmitters, and/or (b) using visual images gathered or collected from the telematics and/or other data.


In one embodiment, determining, at or via the remote server (or vehicle controller), that the individual attempting to start, or currently driving, the insured vehicle is not among the insured drivers or family members (i) authorized to drive the insured vehicle, or (ii) covered by the insurance policy is performed by comparing electronic signatures or signals of various mobile devices. In another embodiment, determining, at or via the remote server (or vehicle controller), that the individual attempting to start, or currently driving, the insured vehicle is not among the insured drivers or family members (i) authorized to drive the insured vehicle, or (ii) covered by the insurance policy is performed by comparing (a) visual images (such as gathered by vehicle mounted cameras or mobile devices) or weights (such as determined from seat sensors) of the insured drivers or family members with (b) visual images or a weight of the individual attempting to start, or currently driving, the insured vehicle, respectively.


In one aspect, the telematics and/or other data may include data associated with, or generated by, mobile devices, such as smart phones, smart glasses, and/or smart wearable electronic devices capable of wireless communication. The telematics and/or other data may include data associated with, or generated by, an insured vehicle or a computer system of the insured vehicle. The telematics and/or other data may include data associated with, or generated by, (i) a vehicle other than the insured vehicle; (ii) vehicle-to-vehicle (V2V) communication; and/or (iii) road side equipment or infrastructure.


The method may further include, when it is determined, at or via the remote server (or vehicle controller), that the individual attempting to start, or currently driving, the insured vehicle is not among the insured drivers or family members (i) authorized to drive the insured vehicle, or (ii) covered by the insurance policy, generating a message (or wireless communication) and transmitting the message from the remote server (or vehicle controller) to a mobile device of one of the insured drivers or family members, or to authorities to facilitate vehicle recapture or safety. The method may include tracking the GPS location of the insured vehicle at the remote server (or vehicle controller), and/or transmitting the present GPS location of the insured vehicle to a mobile device of an insured or to a third party remote server, such as a police department. The method may include additional, fewer, or alternate actions, including those discussed elsewhere herein.


XIV. Additional Considerations

The following additional considerations apply to the foregoing discussion. Throughout this specification, plural instances may implement operations or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.


Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information.


As used herein any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.


As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).


In addition, use of “a” or “an” is employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the invention. This description should be read to include one or at least one and the singular also includes the plural unless it is obvious that it is meant otherwise.


Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs through the principles disclosed herein. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation and details of the methods and systems disclosed herein without departing from the spirit and scope defined in the appended claims. Finally, the patent claims at the end of this patent application are not intended to be construed under 35 U.S.C. § 112(f) unless traditional means-plus-function language is expressly recited, such as “means for” or “step for” language being explicitly recited in the claim(s).

Claims
  • 1. A method comprising: receiving, from a vehicle occupied by a user, data indicating that the vehicle is involved in an accident; transmitting, in response to receiving the data, a communication to a mobile device of the user; displaying, via a mobile device application installed on the mobile device, the communication, wherein the communication prompts the user to provide responses to one or more questions regarding the accident; determining, based on the received data, a likely severity of the accident, wherein determining the likely severity includes determining whether one or more injuries were likely sustained during the accident; receiving an indication of a response to the one or more questions from the user; based on the determination of the likely severity of the accident and the received indication of the response to the one or more questions, prompting the user with an emergency assistance recommendation; and based on the determination of the likely severity of the accident, performing one or more assessments, wherein performing the one or more assessments includes at least one of determining damage to the vehicle, determining repairs needed for the vehicle, or determining fault of the user for the accident.
  • 2. The method of claim 1, wherein performing the one or more assessments further includes identifying a first notice of loss.
  • 3. The method of claim 1, wherein performing the one or more assessments includes determining fault of the user for the accident, and wherein the method further includes adjusting a parameter of an insurance policy associated with the user based on the determined fault.
  • 4. The method of claim 3, wherein adjusting the parameter of the insurance policy includes adjusting at least one of a deductible, a premium, a rate, a discount, or a reward associated with the insurance policy.
  • 5. The method of claim 1, further including: identifying an insurance policy associated with the user or with the vehicle; and generating an estimated insurance claim associated with the insurance policy based on the determined likely severity of the accident.
  • 6. The method of claim 5, further including transmitting at least a portion of the estimated insurance claim to the mobile device.
  • 7. The method of claim 1, further including: receiving an indication of an approval or a modification of the emergency assistance recommendation from the user; and transmitting a notification to an emergency responder requesting emergency assistance in accordance with the received indication of the approval or the modification of the emergency assistance recommendation.
  • 8. The method of claim 1, wherein the data indicating that the vehicle is involved in the accident is received from a telematics device associated with the vehicle.
  • 9. The method of claim 8, wherein the data indicating the vehicle is involved in the accident includes at least one of a speed of the vehicle at a time of the accident, an acceleration of the vehicle at the time of the accident, a sound of an air bag deploying at the time of the accident, or a GPS location associated with the vehicle at the time of the accident.
  • 10. The method of claim 1, further comprising: transmitting, to one or more emergency responders, an emergency assistance request based on at least one of the determined likely severity of the accident or the received indication of the response to the one or more questions.
  • 11. The method of claim 1, wherein the one or more questions regarding the accident include at least one question regarding whether injuries were sustained during the accident.
  • 12. A system comprising: a telematics device configured to transmit vehicle data, of a vehicle, to a remote server; and the remote server, including: at least one processor; and at least one memory storing computer-readable instructions that, when executed by the processor, cause the remote server to: receive, from the telematics device, data indicating that the vehicle is involved in an accident, wherein the vehicle is occupied by a user; transmit, in response to receiving the data, a communication to a mobile device of the user to be displayed via a mobile device application installed on the mobile device, wherein the communication prompts the user to provide responses to one or more questions regarding the accident; determine, based on the received data, a likely severity of the accident, wherein determining the likely severity includes determining whether one or more injuries were likely sustained during the accident; receive an indication of a response to the one or more questions from the user; based on the determination of the likely severity of the accident and the received indication of the response to the one or more questions, prompt the user with an emergency assistance recommendation; and based on the determination of the likely severity of the accident, perform one or more assessments, the one or more assessments including at least one of determining damage to the vehicle, determining repairs needed for the vehicle, or determining fault of the user for the accident.
  • 13. The system of claim 12, wherein the one or more assessments further include identifying a first notice of loss.
  • 14. The system of claim 12, wherein the one or more assessments includes determining fault of the user for the accident, and wherein the instructions further cause the remote server to adjust a parameter of an insurance policy associated with the user based on the determined fault.
  • 15. The system of claim 12, wherein the instructions further cause the remote server to: identify an insurance policy associated with the user or with the vehicle; and generate an estimated insurance claim associated with the insurance policy based on the determined likely severity of the accident.
  • 16. The system of claim 15, wherein the instructions further cause the remote server to transmit at least a portion of the estimated insurance claim to the mobile device.
  • 17. The system of claim 12, wherein the instructions further cause the remote server to: receive, from the user, an indication of an approval or a modification to the emergency assistance recommendation; and transmit a notification to an emergency responder requesting emergency assistance in accordance with the received indication of the approval or the modification of the emergency assistance recommendation.
  • 18. The system of claim 12, wherein the instructions further cause the remote server to transmit, to one or more emergency responders, an emergency assistance request based on at least one of the determined likely severity of the accident or the received indication of the response to the one or more questions.
  • 19. The system of claim 12, wherein the one or more questions regarding the accident include at least one question regarding whether injuries were sustained during the accident.
  • 20. The system of claim 12, wherein the one or more questions regarding the accident include at least one question regarding whether the user requires emergency assistance.
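
The flow recited in independent claim 1 (receive telematics data, estimate likely severity including likely injuries, combine the severity estimate with the user's responses into an emergency assistance recommendation, then perform assessments) can be illustrated with a minimal sketch. All function names, data fields, and severity thresholds below are hypothetical assumptions for illustration only; they are not part of the claimed method or any disclosed implementation.

```python
# Hypothetical sketch of the accident-response flow of claim 1.
# Field names and thresholds are illustrative assumptions, not from the patent.

def likely_severity(data):
    """Score telematics data (speed, acceleration, air bag) into a severity band."""
    score = 0
    if data.get("speed_mph", 0) > 45:
        score += 2
    if abs(data.get("acceleration_g", 0)) > 4:
        score += 2
    if data.get("airbag_deployed", False):
        score += 3
    if score > 4:
        return "high"
    return "moderate" if score >= 2 else "low"

def injuries_likely(severity):
    """Claim 1: the severity determination includes whether injuries were likely."""
    return severity in ("moderate", "high")

def handle_accident(data, user_responses):
    """Combine the severity estimate and the user's question responses
    into a recommendation and a set of assessments."""
    severity = likely_severity(data)
    recommendation = None
    if injuries_likely(severity) or user_responses.get("injuries", False):
        recommendation = "contact emergency responders"
    assessments = {
        # Placeholder assessments; a real system would use far richer data.
        "damage": "likely total loss" if severity == "high" else "repairable",
        "fault": None,
    }
    return {"severity": severity,
            "recommendation": recommendation,
            "assessments": assessments}
```

For example, telematics data showing a 60 mph impact with air bag deployment, plus a user response indicating injuries, yields a "high" severity and a recommendation to contact emergency responders, while a low-speed event with no injuries yields no recommendation.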
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 16/178,838 (filed Nov. 2, 2018), which is a continuation of U.S. application Ser. No. 15/676,470 (filed Aug. 14, 2017), which is a continuation of U.S. application Ser. No. 14/798,757 (filed Jul. 14, 2015), which claims the benefit of U.S. Provisional Application No. 62/027,021 (filed Jul. 21, 2014); U.S. Provisional Application No. 62/040,735 (filed Aug. 22, 2014); U.S. Provisional Application No. 62/145,022 (filed Apr. 9, 2015); U.S. Provisional Application No. 62/145,024 (filed Apr. 9, 2015); U.S. Provisional Application No. 62/145,027 (filed Apr. 9, 2015); U.S. Provisional Application No. 62/145,028 (filed Apr. 9, 2015); U.S. Provisional Application No. 62/145,029 (filed Apr. 9, 2015); U.S. Provisional Application No. 62/145,032 (filed Apr. 9, 2015); U.S. Provisional Application No. 62/145,033 (filed Apr. 9, 2015); U.S. Provisional Application No. 62/145,145 (filed Apr. 9, 2015); U.S. Provisional Application No. 62/145,228 (filed Apr. 9, 2015); U.S. Provisional Application No. 62/145,232 (filed Apr. 9, 2015); and U.S. Provisional Application No. 62/145,234 (filed Apr. 9, 2015). The entirety of each of the foregoing applications is incorporated by reference herein. Additionally, the present application is related to co-pending U.S. patent application Ser. No. 14/798,741 (filed Jul. 14, 2015); co-pending U.S. patent application Ser. No. 14/798,750 (filed Jul. 14, 2015); co-pending U.S. patent application Ser. No. 14/798,745 (filed Jul. 14, 2015); co-pending U.S. patent application Ser. No. 14/798,763 (filed Jul. 14, 2015); co-pending U.S. patent application Ser. No. 14/798,609 (filed Jul. 14, 2015); co-pending U.S. patent application Ser. No. 14/798,615 (filed Jul. 14, 2015); co-pending U.S. patent application Ser. No. 14/798,626 (filed Jul. 14, 2015); co-pending U.S. patent application Ser. No. 14/798,633 (filed Jul. 14, 2015); co-pending U.S. patent application Ser. No. 14/798,769 (filed Jul. 14, 2015); and co-pending U.S. patent application Ser. No. 14/798,770 (filed Jul. 14, 2015).

20130245881 Scarbrough Sep 2013 A1
20130257626 Masli et al. Oct 2013 A1
20130267194 Breed Oct 2013 A1
20130278442 Rubin et al. Oct 2013 A1
20130289819 Hassib et al. Oct 2013 A1
20130290037 Hu et al. Oct 2013 A1
20130297418 Collopy et al. Nov 2013 A1
20130302758 Wright Nov 2013 A1
20130304513 Hyde et al. Nov 2013 A1
20130304514 Hyde et al. Nov 2013 A1
20130307786 Heubel Nov 2013 A1
20130317665 Fernandes et al. Nov 2013 A1
20130317693 Jefferies et al. Nov 2013 A1
20130317711 Plante Nov 2013 A1
20130317786 Kuhn Nov 2013 A1
20130317865 Tofte Nov 2013 A1
20130332402 Rakshit Dec 2013 A1
20130339062 Brewer Dec 2013 A1
20140002651 Plante Jan 2014 A1
20140004734 Hoang Jan 2014 A1
20140006660 Frei et al. Jan 2014 A1
20140009307 Bowers et al. Jan 2014 A1
20140011647 Lalaoua Jan 2014 A1
20140012492 Bowers et al. Jan 2014 A1
20140013965 Perez Jan 2014 A1
20140019170 Coleman et al. Jan 2014 A1
20140027790 Lin et al. Jan 2014 A1
20140030073 Lacy et al. Jan 2014 A1
20140039934 Rivera Feb 2014 A1
20140047347 Mohn et al. Feb 2014 A1
20140047371 Palmer et al. Feb 2014 A1
20140052323 Reichel et al. Feb 2014 A1
20140052336 Moshchuk et al. Feb 2014 A1
20140052479 Kawamura Feb 2014 A1
20140058761 Freiberger et al. Feb 2014 A1
20140059066 Koloskov Feb 2014 A1
20140063064 Seo et al. Mar 2014 A1
20140070980 Park Mar 2014 A1
20140080100 Phelan et al. Mar 2014 A1
20140095009 Oshima et al. Apr 2014 A1
20140095214 Mathe et al. Apr 2014 A1
20140099607 Armitage et al. Apr 2014 A1
20140100892 Collopy et al. Apr 2014 A1
20140104405 Weidl et al. Apr 2014 A1
20140106782 Chitre et al. Apr 2014 A1
20140108198 Jariyasunant et al. Apr 2014 A1
20140111332 Przybylko et al. Apr 2014 A1
20140111647 Atsmon et al. Apr 2014 A1
20140114691 Pearce Apr 2014 A1
20140125474 Gunaratne May 2014 A1
20140129053 Kleve et al. May 2014 A1
20140129139 Ellison et al. May 2014 A1
20140129301 Van Wiemeersch et al. May 2014 A1
20140130035 Desai et al. May 2014 A1
20140135598 Weidl et al. May 2014 A1
20140148988 Lathrop et al. May 2014 A1
20140149148 Luciani May 2014 A1
20140152422 Breed Jun 2014 A1
20140156133 Cullinane et al. Jun 2014 A1
20140156134 Cullinane et al. Jun 2014 A1
20140156176 Caskey et al. Jun 2014 A1
20140167967 He et al. Jun 2014 A1
20140168399 Plummer et al. Jun 2014 A1
20140172467 He et al. Jun 2014 A1
20140172727 Abhyanker et al. Jun 2014 A1
20140180727 Freiberger et al. Jun 2014 A1
20140188322 Oh et al. Jul 2014 A1
20140191858 Morgan et al. Jul 2014 A1
20140207707 Na et al. Jul 2014 A1
20140218187 Chun et al. Aug 2014 A1
20140218520 Teich et al. Aug 2014 A1
20140221781 Schrauf et al. Aug 2014 A1
20140236638 Pallesen et al. Aug 2014 A1
20140240132 Bychkov Aug 2014 A1
20140244096 An et al. Aug 2014 A1
20140253376 Large et al. Sep 2014 A1
20140257866 Gay et al. Sep 2014 A1
20140257869 Binion et al. Sep 2014 A1
20140266655 Palan Sep 2014 A1
20140272810 Fields et al. Sep 2014 A1
20140272811 Palan Sep 2014 A1
20140277916 Mullen et al. Sep 2014 A1
20140278569 Sanchez et al. Sep 2014 A1
20140278571 Mullen et al. Sep 2014 A1
20140278586 Sanchez et al. Sep 2014 A1
20140278840 Scofield et al. Sep 2014 A1
20140279707 Joshua et al. Sep 2014 A1
20140301218 Luo et al. Oct 2014 A1
20140303827 Dolgov et al. Oct 2014 A1
20140306799 Ricci Oct 2014 A1
20140306814 Ricci Oct 2014 A1
20140309864 Ricci Oct 2014 A1
20140309870 Ricci et al. Oct 2014 A1
20140310186 Ricci Oct 2014 A1
20140330478 Cullinane et al. Nov 2014 A1
20140335902 Guba et al. Nov 2014 A1
20140337930 Hoyos et al. Nov 2014 A1
20140343972 Fernandes et al. Nov 2014 A1
20140350970 Schumann, Jr. et al. Nov 2014 A1
20140358324 Sagar et al. Dec 2014 A1
20140358592 Wedig et al. Dec 2014 A1
20140376410 Ros et al. Dec 2014 A1
20140378082 Ros et al. Dec 2014 A1
20140379385 Duncan et al. Dec 2014 A1
20140380264 Misra et al. Dec 2014 A1
20150006278 Di Censo et al. Jan 2015 A1
20150019266 Stempora Jan 2015 A1
20150024705 Rashidi Jan 2015 A1
20150025917 Stempora Jan 2015 A1
20150032581 Blackhurst et al. Jan 2015 A1
20150035685 Strickland et al. Feb 2015 A1
20150039348 Miller et al. Feb 2015 A1
20150039350 Martin et al. Feb 2015 A1
20150039397 Fuchs Feb 2015 A1
20150045983 Fraser et al. Feb 2015 A1
20150051752 Paszkowicz Feb 2015 A1
20150051787 Doughty et al. Feb 2015 A1
20150058046 Huynh et al. Feb 2015 A1
20150066284 Yopp Mar 2015 A1
20150070160 Davidsson et al. Mar 2015 A1
20150070265 Cruz-Hernandez et al. Mar 2015 A1
20150073645 Davidsson et al. Mar 2015 A1
20150088334 Bowers et al. Mar 2015 A1
20150088358 Yopp Mar 2015 A1
20150088360 Bonnet et al. Mar 2015 A1
20150088373 Wilkins Mar 2015 A1
20150088550 Bowers et al. Mar 2015 A1
20150095132 Van Heerden et al. Apr 2015 A1
20150100189 Tellis et al. Apr 2015 A1
20150100190 Yopp Apr 2015 A1
20150100191 Yopp Apr 2015 A1
20150100353 Hughes et al. Apr 2015 A1
20150109450 Walker Apr 2015 A1
20150112504 Binion et al. Apr 2015 A1
20150112543 Binion et al. Apr 2015 A1
20150112545 Binion et al. Apr 2015 A1
20150112730 Binion et al. Apr 2015 A1
20150112731 Binion et al. Apr 2015 A1
20150112800 Binion et al. Apr 2015 A1
20150113521 Suzuki et al. Apr 2015 A1
20150120331 Russo et al. Apr 2015 A1
20150127570 Doughty May 2015 A1
20150128123 Eling May 2015 A1
20150142244 You et al. May 2015 A1
20150142262 Lee May 2015 A1
20150149017 Attard et al. May 2015 A1
20150149018 Attard et al. May 2015 A1
20150149023 Attard et al. May 2015 A1
20150149265 Huntzicker et al. May 2015 A1
20150153733 Ohmura et al. Jun 2015 A1
20150154711 Christopulos et al. Jun 2015 A1
20150158469 Cheatham, III et al. Jun 2015 A1
20150158495 Duncan et al. Jun 2015 A1
20150160653 Cheatham, III et al. Jun 2015 A1
20150161738 Stempora Jun 2015 A1
20150161893 Duncan et al. Jun 2015 A1
20150161894 Duncan et al. Jun 2015 A1
20150166069 Engelman et al. Jun 2015 A1
20150169311 Dickerson et al. Jun 2015 A1
20150170287 Tirone et al. Jun 2015 A1
20150170290 Bowne et al. Jun 2015 A1
20150170522 Noh Jun 2015 A1
20150178997 Ohsaki Jun 2015 A1
20150178998 Attard et al. Jun 2015 A1
20150185034 Abhyanker Jul 2015 A1
20150187013 Adams et al. Jul 2015 A1
20150187015 Adams et al. Jul 2015 A1
20150187016 Adams et al. Jul 2015 A1
20150187019 Fernandes et al. Jul 2015 A1
20150187194 Hypolite et al. Jul 2015 A1
20150189241 Kim et al. Jul 2015 A1
20150193219 Pandya et al. Jul 2015 A1
20150193220 Rork et al. Jul 2015 A1
20150203107 Lippman Jul 2015 A1
20150203113 Duncan et al. Jul 2015 A1
20150221142 Kim et al. Aug 2015 A1
20150229885 Offenhaeuser Aug 2015 A1
20150232064 Cudak et al. Aug 2015 A1
20150233719 Cudak et al. Aug 2015 A1
20150235323 Oldham Aug 2015 A1
20150235557 Engelman et al. Aug 2015 A1
20150239436 Kanai et al. Aug 2015 A1
20150241241 Cudak et al. Aug 2015 A1
20150241853 Vechart et al. Aug 2015 A1
20150242953 Suiter Aug 2015 A1
20150246672 Pilutti et al. Sep 2015 A1
20150253772 Solyom et al. Sep 2015 A1
20150254955 Fields et al. Sep 2015 A1
20150266489 Solyom et al. Sep 2015 A1
20150266490 Coelingh et al. Sep 2015 A1
20150271201 Ruvio et al. Sep 2015 A1
20150274072 Croteau et al. Oct 2015 A1
20150284009 Cullinane et al. Oct 2015 A1
20150293534 Takamatsu Oct 2015 A1
20150294422 Carver et al. Oct 2015 A1
20150302719 Mroszczak et al. Oct 2015 A1
20150307110 Grewe et al. Oct 2015 A1
20150310742 Albornoz Oct 2015 A1
20150310758 Daddona et al. Oct 2015 A1
20150321641 Abou Mahmoud et al. Nov 2015 A1
20150332407 Wilson, II et al. Nov 2015 A1
20150334545 Maier et al. Nov 2015 A1
20150336502 Hillis et al. Nov 2015 A1
20150338852 Ramanujam Nov 2015 A1
20150339777 Zhalov Nov 2015 A1
20150339928 Ramanujam Nov 2015 A1
20150343947 Bernico et al. Dec 2015 A1
20150346727 Ramanujam Dec 2015 A1
20150348335 Ramanujam Dec 2015 A1
20150348337 Choi Dec 2015 A1
20150356797 McBride et al. Dec 2015 A1
20150382085 Lawrie-Fussey et al. Dec 2015 A1
20160014252 Biderman et al. Jan 2016 A1
20160019790 Tobolski et al. Jan 2016 A1
20160026182 Boroditsky et al. Jan 2016 A1
20160027276 Freeck et al. Jan 2016 A1
20160036899 Moody et al. Feb 2016 A1
20160042463 Gillespie Feb 2016 A1
20160042644 Velusamy Feb 2016 A1
20160042650 Stenneth Feb 2016 A1
20160055750 Linder et al. Feb 2016 A1
20160068103 McNew et al. Mar 2016 A1
20160071418 Oshida et al. Mar 2016 A1
20160073324 Guba et al. Mar 2016 A1
20160083285 De Ridder et al. Mar 2016 A1
20160086285 Jordan Peters et al. Mar 2016 A1
20160086393 Collins et al. Mar 2016 A1
20160092962 Wasserman et al. Mar 2016 A1
20160093212 Barfield, Jr. et al. Mar 2016 A1
20160101783 Abou-Nasr et al. Apr 2016 A1
20160104250 Allen et al. Apr 2016 A1
20160105365 Droste et al. Apr 2016 A1
20160116293 Grover et al. Apr 2016 A1
20160116913 Niles Apr 2016 A1
20160117871 McClellan et al. Apr 2016 A1
20160117928 Hodges et al. Apr 2016 A1
20160125735 Tuukkanen May 2016 A1
20160129917 Gariepy et al. May 2016 A1
20160140783 Catt et al. May 2016 A1
20160140784 Akanuma et al. May 2016 A1
20160147226 Akselrod et al. May 2016 A1
20160163217 Harkness Jun 2016 A1
20160167652 Slusar Jun 2016 A1
20160171521 Ramirez et al. Jun 2016 A1
20160187127 Purohit et al. Jun 2016 A1
20160187368 Modi et al. Jun 2016 A1
20160189303 Fuchs Jun 2016 A1
20160189306 Bogovich et al. Jun 2016 A1
20160189544 Ricci Jun 2016 A1
20160200326 Cullinane et al. Jul 2016 A1
20160203560 Parameshwaran Jul 2016 A1
20160221575 Posch et al. Aug 2016 A1
20160229376 Abou Mahmoud et al. Aug 2016 A1
20160231746 Hazelton et al. Aug 2016 A1
20160248598 Lin et al. Aug 2016 A1
20160255154 Kim et al. Sep 2016 A1
20160264132 Paul et al. Sep 2016 A1
20160272219 Ketfi-Cherif et al. Sep 2016 A1
20160275790 Kang et al. Sep 2016 A1
20160277911 Kang et al. Sep 2016 A1
20160282874 Kurata et al. Sep 2016 A1
20160288833 Heimberger et al. Oct 2016 A1
20160291153 Mossau et al. Oct 2016 A1
20160292679 Kolin et al. Oct 2016 A1
20160301698 Katara et al. Oct 2016 A1
20160303969 Akula Oct 2016 A1
20160304027 Di Censo et al. Oct 2016 A1
20160304038 Chen et al. Oct 2016 A1
20160304091 Remes Oct 2016 A1
20160313132 Larroy Oct 2016 A1
20160314224 Wei et al. Oct 2016 A1
20160323233 Song et al. Nov 2016 A1
20160327949 Wilson et al. Nov 2016 A1
20160343249 Gao et al. Nov 2016 A1
20160347329 Zelman et al. Dec 2016 A1
20160370194 Colijn et al. Dec 2016 A1
20170001146 Van Baak et al. Jan 2017 A1
20170015263 Makled et al. Jan 2017 A1
20170017734 Groh et al. Jan 2017 A1
20170017842 Ma et al. Jan 2017 A1
20170023945 Cavalcanti et al. Jan 2017 A1
20170024938 Lindsay Jan 2017 A1
20170036678 Takamatsu Feb 2017 A1
20170038773 Gordon et al. Feb 2017 A1
20170067764 Skupin et al. Mar 2017 A1
20170072967 Fendt et al. Mar 2017 A1
20170076606 Gupta et al. Mar 2017 A1
20170078948 Guba et al. Mar 2017 A1
20170080900 Huennekens et al. Mar 2017 A1
20170084175 Sedlik et al. Mar 2017 A1
20170086028 Hwang et al. Mar 2017 A1
20170106876 Gordon et al. Apr 2017 A1
20170116794 Gortsas Apr 2017 A1
20170120761 Kapadia et al. May 2017 A1
20170123421 Kentley et al. May 2017 A1
20170123428 Levinson et al. May 2017 A1
20170136902 Ricci May 2017 A1
20170147722 Greenwood May 2017 A1
20170148324 High et al. May 2017 A1
20170154479 Kim Jun 2017 A1
20170168493 Miller et al. Jun 2017 A1
20170169627 Kim et al. Jun 2017 A1
20170176641 Zhu et al. Jun 2017 A1
20170178422 Wright Jun 2017 A1
20170178423 Wright Jun 2017 A1
20170178424 Wright Jun 2017 A1
20170192428 Vogt et al. Jul 2017 A1
20170200367 Mielenz Jul 2017 A1
20170212511 Paiva Ferreira et al. Jul 2017 A1
20170234689 Gibson et al. Aug 2017 A1
20170236210 Kumar et al. Aug 2017 A1
20170249844 Perkins et al. Aug 2017 A1
20170270617 Fernandes et al. Sep 2017 A1
20170274897 Rink et al. Sep 2017 A1
20170308082 Ullrich et al. Oct 2017 A1
20170309092 Rosenbaum Oct 2017 A1
20170330448 Moore et al. Nov 2017 A1
20180004223 Baldwin Jan 2018 A1
20180013831 Dey et al. Jan 2018 A1
20180046198 Nordbruch et al. Feb 2018 A1
20180053411 Wieskamp et al. Feb 2018 A1
20180075538 Konrardy et al. Mar 2018 A1
20180080995 Heinen Mar 2018 A1
20180099678 Absmeier et al. Apr 2018 A1
20180194343 Lorenz Jul 2018 A1
20180231979 Miller et al. Aug 2018 A1
20180307250 Harvey Oct 2018 A1
Foreign Referenced Citations (8)
Number Date Country
2494727 Mar 2013 GB
2002-259708 Sep 2002 JP
WO-2005083605 Sep 2005 WO
WO-2010034909 Apr 2010 WO
WO-2014139821 Sep 2014 WO
WO-2014148976 Sep 2014 WO
WO-2016067610 May 2016 WO
WO-2016156236 Oct 2016 WO
Non-Patent Literature Citations (132)
Entry
“Biofeedback mobile app”, Kurzweil Accelerating Intelligence, downloaded from the Internet at: <http://www.kurzweilai.net/biofeedback-mobile-app> (Feb. 12, 2013).
“Driverless Cars . . . The Future is Already Here”, AutoInsurance Center, downloaded from the Internet at: <http://www.autoinsurancecenter.com/driverless-cars...the-future-is-already-here.htm> (2010; downloaded on Mar. 27, 2014).
“Integrated Vehicle-Based Safety Systems (IVBSS)”, Research and Innovative Technology Administration (RITA), http://www.its.dot.gov/ivbss/, retrieved from the internet on Nov. 4, 2013, 3 pages.
“Intel Capital to Invest in Future of Automotive Technology”, News Release, Intel Corp. (Feb. 29, 2012).
“Linking Driving Behavior to Automobile Accidents and Insurance Rates: An Analysis of Five Billion Miles Driven”, Progressive Insurance brochure (Jul. 2012).
“MIT Spin-off Affectiva Raises $5.7 Million to Commercialize Emotion Technology”, Business Wire (Jul. 19, 2011).
“Private Ownership Costs”, RACQ, Wayback Machine, http://www.racq.com.au:80/~/media/pdf/racqpdfs/cardsanddriving/cars/0714_vehicle_running_costs.ashx/ (Oct. 6, 2014).
“Self-Driving Cars: The Next Revolution”, KPMG, Center for Automotive Research (2012).
The Influence of Telematics on Customer Experience: Case Study of Progressive's Snapshot Program, J.D. Power Insights, McGraw Hill Financial (2013).
Al-Shihabi et al., A framework for modeling human-like driving behaviors for autonomous vehicles in driving simulators, Agents'01, pp. 286-291 (May 2001).
Alberi et al., A proposed standardized testing procedure for autonomous ground vehicles, Virginia Polytechnic Institute and State University, 63 pages (Apr. 29, 2008).
Beard et al., Autonomous vehicle technologies for small fixed-wing UAVs, J. Aerospace Computing Info. Commun. (Jan. 2005).
Birch, Mercedes-Benz' world class driving simulator complex enhances moose safety, SAE International, Automotive Engineering (Nov. 13, 2010).
Bondarev, Design of an Emotion Management System for a Home Robot, Koninklijke Philips Electronics NV, 63 pp. (2002).
Bosker, Affectiva's Emotion Recognition Tech: When Machines Know What You're Feeling, www.HuffPost.com (Dec. 24, 2012).
Broggi et al., Extensive Tests of Autonomous Driving Technologies, IEEE Trans on Intelligent Transportation Systems, 14(3):1403-15 (May 30, 2013).
Campbell et al., Autonomous Driving in Urban Environments: Approaches, Lessons, and Challenges, Phil. Trans. R. Soc. A, 368:4649-72 (2010).
Carroll et al. “Where Innovation is Sorely Needed”, http://www.technologyreview.com/news/422568/where-innovation-is-sorely-needed/?nlid, retrieved from the internet on Nov. 4, 2013, 3 pages.
Chan et al., The emotional side of cognitive distraction: implications for road safety, Accident Analysis and Prevention, 50:147-54 (2013).
Cutler, Using the IPhone's Front-Facing Camera, Cardiio Measures Your Heartrate, downloaded from the Internet at: <https://techcrunch.com/2012/08/09/cardiio/> (Aug. 9, 2012).
Davies, Avoiding Squirrels and Other Things Google's Robot Car Can't Do, downloaded from the Internet at: <http://www.wired.com/2014/05/google-self-driving-car-can-cant/> (downloaded on May 28, 2014).
Davies, Here's How Mercedes-Benz Tests its New Self-Driving Car, Business Insider (Nov. 20, 2012).
Duffy et al., Sit, Stay, Drive: The Future of Autonomous Car Liability, SMU Science & Technology Law Review, vol. 16, pp. 101-123 (Winter 2013).
Figueiredo et al., An Approach to Simulate Autonomous Vehicles in Urban Traffic Scenarios, University of Porto, 7 pages (Nov. 2009).
Filev et al., Future Mobility: Integrating Vehicle Control with Cloud Computing, Mechanical Engineering, 135.3:S18-S24, American Society of Mechanical Engineers (Mar. 2013).
Foo et al., Three-dimensional path planning of unmanned aerial vehicles using particle swarm optimization, Sep. 2006, AIAA.
Franke et al., Autonomous Driving Goes Downtown, IEEE Intelligent Systems, (Nov. 1998).
Funkhouser, Paving the Road Ahead: Autonomous vehicles, products liability, and the need for a new approach, Utah Law Review, vol. 437, Issue 1 (2013).
Garza, “Look Ma, No Hands!” Wrinkles and Wrecks in the Age of Autonomous Vehicles, New England Law Review, vol. 46, pp. 581-616 (2012).
Gechter et al., Towards a Hybrid Real/Virtual Simulation of Autonomous Vehicles for Critical Scenarios, International Academy Research and Industry Association (IARIA), 4 pages (2014).
Gerdes et al., Implementable ethics for autonomous vehicles, Chapter 5, IN: Maurer et al. (eds.), Autonomes Fahren, Springer Vieweg, Berlin (2015).
Gleeson, “How much is a monitored alarm insurance deduction?”, Demand Media (Oct. 30, 2014).
Goldmark, MIT is making a road frustration index to measure stresses of driving, Fast Company (Jul. 23, 2013).
Graham-Rowe, “A Smart Phone that Knows You're Angry”, MIT Technology Review (Jan. 9, 2012).
Gray et al., A unified approach to threat assessment and control for automotive active safety, IEEE, 14(3):1490-9 (Sep. 2013).
Grifantini, Sensor detects emotions through the skin, MIT Technology Review (Oct. 26, 2010).
Gurney, Sue my car not me: Products liability and accidents involving autonomous vehicles, Journal of Law, Technology & Policy (2013).
Hancock et al., “The Impact of Emotions and Predominant Emotion Regulation Technique on Driving Performance,” Work, 41 Suppl 1:3608-11 (Feb. 2012).
Hars, Autonomous Cars: The Next Revolution Looms, Inventivio GmbH, 4 pages (Jan. 2010).
Healy, Detecting Stress during Real-world Driving Tasks Using Physiological Sensors, IEEE Trans Intelligent Transportation Systems 6.2:156-66 (2005).
Kluckner et al., Image based building classification and 3D modeling with super-pixels, ISPRS Technical Commission II Symposium, PCV 2010, vol. XXXVIII, part 3A, pp. 233-238 (Sep. 3, 2010).
Kus, Implementation of 3D optical scanning technology for automotive applications, Sensors, 9:1967-79 (2009).
Laine et al., Behavioral triggers of skin conductance responses and their neural correlates in the primate amygdala, J. Neurophysiol., 101:1749-54 (2009).
Lattner et al., Knowledge-based risk assessment for intelligent vehicles, pp. 191-196, IEEE KIMAS 2005, Apr. 18-21, Waltham, Massachusetts (Apr. 2005).
Lee et al., Autonomous Vehicle Simulation Project, Int. J. Software Eng. and Its Applications, 7(5):393-402 (2013).
Lee et al., What is stressful on the road? Analysis on aggression-inducing traffic situations through self-report, Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 57(1):1500-1503 (Sep. 2013).
Levendusky, Advancements in automotive technology and their effect on personal auto insurance, downloaded from the Internet at: <http://www.verisk.com/visualize/advancements-in-automotive-technology-and-their-effect> (2013).
Lewis, The History of Driverless Cars, downloaded from the Internet at: <www.thefactsite.com/2017/06/driverless-cars-history.html> (Jun. 2017).
Lomas, Can an algorithm be empathetic? UK startup EI technologies is building software that's sensitive to tone of voice, downloaded from the Internet at: <https://techcrunch.com/2013/08/04/empathy/> (Aug. 4, 2013).
Marchant et al., The coming collision between autonomous vehicles and the liability system, Santa Clara Law Review, 52(4): Article 6 (2012).
McCraty et al., “The Effects of Different Types of Music on Mood, Tension, and Mental Clarity.” Alternative Therapies in Health and Medicine 4.1 (1998): 75-84. NCBI PubMed. Web. Jul. 11, 2013.
Mercedes-Benz, Press Information: Networked With All Sense, Mercedes-Benz Driving Simulator (Nov. 2012).
Merz et al., Beyond Visual Range Obstacle Avoidance and Infrastructure Inspection by an Autonomous Helicopter, Sep. 2011, IEEE.
Miller, A simulation and regression testing framework for autonomous workers, Case Western Reserve University, 12 pages (Aug. 2007).
Mui, Will auto insurers survive their collision with driverless cars? (Part 6), downloaded from the Internet at: <http://www.forbes.com/sites/chunkamui/2013/03/28/will-auto-insurers-survive-their-collision> (Mar. 28, 2013).
Murph, Affectiva's Q Sensor Wristband Monitors and Logs Stress Levels, Might Bring Back the Snap Bracelet, Engadget.com (Nov. 2, 2010).
Nasoz et al., Emotion recognition from physiological signals using wireless sensors for presence technologies, Cogn. Tech. Work, 6:4-14 (2004).
Nass et al., Improving automotive safety by pairing driver emotion and car voice emotion. CHI 2005 Late Breaking Results: Short Papers, Portland, Oregon (Apr. 2-7, 2005).
Pereira, An Integrated Architecture for Autonomous Vehicle Simulation, University of Porto., 114 pages (Jun. 2011).
Peterson, New technology—old law: autonomous vehicles and California's insurance framework, Santa Clara Law Review, 52(4):Article 7 (Dec. 2012).
Philipson, Want to drive safely? Listen to Elton John, Aerosmith or S Club 7, The Telegraph (Jan. 8, 2013).
Pohanka et al., Sensors simulation environment for sensor data fusion, 14th International Conference on Information Fusion, Chicago, IL, pp. 1-8 (2011).
Quinlan et al., Bringing Simulation to Life: A Mixed Reality Autonomous Intersection, Proc. IROS 2010—IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei Taiwan, 6 pages (Oct. 2010).
Read, Autonomous cars & the death of auto insurance, downloaded from the Internet at: <http://www.thecarconnection.com/news/1083266_autonomous-cars-the-death-of-auto-insurance> (Apr. 1, 2013).
Reddy, The New Auto Insurance Ecosystem: Telematics, Mobility and the Connected Car, Cognizant (Aug. 2012).
Reifel et al., “Telematics: The Game Changer—Reinventing Auto Insurance”, A.T. Kearney (2010).
Roberts, “What is Telematics Insurance?”, MoneySupermarket (Jun. 20, 2012).
Ryan Hurlbert, “Can Having Safety Features Reduce Your Insurance Premiums?”, Dec. 15, 2010, 1 page.
Ryan, Can having safety features reduce your insurance premiums? (Dec. 15, 2010).
Saberi et al., An approach for functional safety improvement of an existing automotive system, IEEE (2015).
Sepulcre et al., Cooperative vehicle-to-vehicle active safety testing under challenging conditions, Transportation Research Part C, 26:233-55 (2013).
Sharma, Driving the future: the legal implications of autonomous vehicles conference recap, downloaded from the Internet at: <http://law.scu.edu/hightech/autonomousvehicleconfrecap2012> (Aug. 2012).
Shaya, “For Some, Driving Is More Stressful than Skydiving.” AutomotiveNews.com. Automotive News, Jun. 12, 2013.
Sorrel, App Measures Vital Signs Using IPad Camera, wired.com (Nov. 18, 2011).
Stavens, Learning to Drive: Perception for Autonomous Cars, Stanford University, 104 pages (May 2011).
Stienstra, Autonomous Vehicles & the Insurance Industry, 2013 CAS Annual Meeting—Minneapolis, MN (Nov. 2013).
Talbot, “Wrist Sensor Tells You How Stressed Out You Are”, MIT Technology Review (Dec. 20, 2012).
Tiberkak et al., An architecture for policy-based home automation system (PBHAS), 2010 IEEE Green Technologies Conference (Apr. 15-16, 2010).
Toor, Valve looks to sweat levels and eye controls for future game design, downloaded from the Internet at: <https://www.theverge.com/2013/5/7/4307750/valve-biometric-eye-tracking-sweat-left-4-dead-portal-2> (May 7, 2013).
U.S. Appl. No. 14/798,609, Final Office Action, dated Mar. 22, 2019.
U.S. Appl. No. 14/798,609, Nonfinal Office Action, dated Aug. 23, 2018.
U.S. Appl. No. 14/798,615, Final Office Action, dated Aug. 3, 2018.
U.S. Appl. No. 14/798,615, Final Office Action, dated Jun. 25, 2019.
U.S. Appl. No. 14/798,615, Nonfinal Office Action, dated Feb. 7, 2018.
U.S. Appl. No. 14/798,615, Nonfinal Office Action, dated Jan. 25, 2019.
U.S. Appl. No. 14/798,626, Final Office Action, dated Jul. 19, 2018.
U.S. Appl. No. 14/798,626, Final Office Action, dated Jun. 3, 2019.
U.S. Appl. No. 14/798,626, Nonfinal Office Action, dated Jan. 23, 2019.
U.S. Appl. No. 14/798,626, Nonfinal Office Action, dated Jan. 30, 2018.
U.S. Appl. No. 14/798,633, Final Office Action, dated Sep. 19, 2018.
U.S. Appl. No. 14/798,633, Nonfinal Office Action, dated Apr. 27, 2018.
U.S. Appl. No. 14/798,633, Nonfinal Office Action, dated Mar. 4, 2019.
U.S. Appl. No. 14/798,633, Notice of Allowance, dated Jul. 3, 2019.
U.S. Appl. No. 14/798,741, Final Office Action, dated Apr. 17, 2019.
U.S. Appl. No. 14/798,741, Final Office Action, dated Jul. 17, 2018.
U.S. Appl. No. 14/798,741, Nonfinal Office Action, dated Jan. 29, 2018.
U.S. Appl. No. 14/798,741, Nonfinal Office Action, dated Nov. 9, 2018.
U.S. Appl. No. 14/798,745, Final Office Action, dated Aug. 30, 2018.
U.S. Appl. No. 14/798,745, Nonfinal Office Action, dated Apr. 17, 2018.
U.S. Appl. No. 14/798,745, Notice of Allowance, dated Apr. 29, 2019.
U.S. Appl. No. 14/798,750, Final Office Action, dated Aug. 20, 2019.
U.S. Appl. No. 14/798,750, Final Office Action, dated Aug. 29, 2018.
U.S. Appl. No. 14/798,750, Nonfinal Office Action, dated Apr. 12, 2019.
U.S. Appl. No. 14/798,750, Nonfinal Office Action, dated Mar. 5, 2018.
U.S. Appl. No. 14/798,757, Nonfinal Office Action, dated Jan. 17, 2017.
U.S. Appl. No. 14/798,757, Notice of Allowance, dated Jul. 12, 2017.
U.S. Appl. No. 14/798,763, Final Office Action, dated Apr. 11, 2019.
U.S. Appl. No. 14/798,763, Final Office Action, dated Jul. 12, 2018.
U.S. Appl. No. 14/798,763, Nonfinal Office Action, dated Aug. 16, 2019.
U.S. Appl. No. 14/798,763, Nonfinal Office Action, dated Feb. 5, 2018.
U.S. Appl. No. 14/798,763, Nonfinal Office Action, dated Oct. 25, 2018.
U.S. Appl. No. 14/798,769, Final Office Action, dated Mar. 14, 2017.
U.S. Appl. No. 14/798,769, Nonfinal Office Action, dated Oct. 6, 2016.
U.S. Appl. No. 14/798,769, Notice of Allowance, dated Jun. 27, 2017.
U.S. Appl. No. 14/798,770, Nonfinal Office Action, dated Nov. 2, 2017.
U.S. Appl. No. 14/798,770, Notice of Allowance, dated Jun. 25, 2018.
U.S. Appl. No. 15/676,460, Notice of Allowance, dated Oct. 5, 2017.
U.S. Appl. No. 15/676,470, Nonfinal Office Action, dated Apr. 24, 2018.
U.S. Appl. No. 15/676,470, Notice of Allowance, dated Sep. 17, 2018.
U.S. Appl. No. 15/859,854, Notice of Allowance, dated Mar. 28, 2018.
U.S. Appl. No. 15/964,971, Nonfinal Office Action, dated Jun. 5, 2018.
U.S. Appl. No. 16/685,392, Potter et al., “Methods of Facilitating Emergency Assistance”, filed Nov. 15, 2019.
U.S. Appl. No. 16/685,470, Potter et al., “Methods of Facilitating Emergency Assistance”, filed Nov. 15, 2019.
UTC Spotlight: Superstorm Sandy LiDAR Damage Assessment to Change Disaster Recovery, Feb. 2013.
Vasudevan et al., Safe semi-autonomous control with enhanced driver modeling, 2012 American Control Conference, Fairmont Queen Elizabeth, Montreal, Canada (Jun. 27-29, 2012).
Villasenor, Products liability and driverless cars: Issues and guiding principles for legislation, Brookings Center for Technology Innovation, 25 pages (Apr. 2014).
Wang et al., Shader-based sensor simulation for autonomous car testing, 15th International IEEE Conference on Intelligent Transportation Systems, Anchorage, Alaska, pp. 224-229 (Sep. 2012).
Wardzinski, Dynamic risk assessment in autonomous vehicles motion planning, Proceedings of the 2008 1st International Conference on Information Technology, IT 2008, Gdansk, Poland (May 19-21, 2008).
Wiesenthal et al., “The Influence of Music on Driver Stress,” J. Applied Social Psychology, 30(8):1709-19 (Aug. 2000).
Woodbeck et al., “Visual cortex on the GPU: Biologically inspired classifier and feature descriptor for rapid recognition”, Jun. 28, 2008, IEEE Computer Society Conf. on Computer Vision and Pattern Recognition Workshops 2008, p. 1-8.
Young et al., “Cooperative Collision Warning Based Highway Vehicle Accident Reconstruction”, Eighth International Conference on Intelligent Systems Design and Applications, Nov. 26-28, 2008, pp. 561-565.
Zhou et al., A Simulation Model to Evaluate and Verify Functions of Autonomous Vehicle Based on Simulink, Tongji University, 12 pages (2009).
Provisional Applications (13)
Number Date Country
62145022 Apr 2015 US
62145234 Apr 2015 US
62145027 Apr 2015 US
62145228 Apr 2015 US
62145029 Apr 2015 US
62145232 Apr 2015 US
62145032 Apr 2015 US
62145033 Apr 2015 US
62145024 Apr 2015 US
62145028 Apr 2015 US
62145145 Apr 2015 US
62040735 Aug 2014 US
62027021 Jul 2014 US
Continuations (3)
Number Date Country
Parent 16178838 Nov 2018 US
Child 16685470 US
Parent 15676470 Aug 2017 US
Child 16178838 US
Parent 14798757 Jul 2015 US
Child 15676470 US