Aspects of the disclosure generally relate to the detection of vehicle crashes using sensors and computing devices, which may be integrated into mobile devices.
Typically, drivers of vehicles involved in crashes (or in some cases, emergency personnel) report crashes to insurance providers days or even weeks after the crash. The delay in reporting crashes often results in a delay in processing insurance claims. The information that the driver gives to his or her insurance provider after the fact might also be incomplete or vague. For example, the driver might have forgotten the location of the accident.
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosure. The summary is not an extensive overview of the disclosure. It is neither intended to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure. The following summary merely presents some concepts of the disclosure in a simplified form as a prelude to the description below.
Aspects of the disclosure relate to systems, methods, and computing devices, such as a mobile computing device comprising an accelerometer configured to measure acceleration of at least one axis of the accelerometer, a processor, and memory storing computer-executable instructions that, when executed by the processor, cause the processor of the mobile computing device to receive acceleration events measured by the accelerometer and determine whether a number of the acceleration events measured by the accelerometer exceeds a threshold number of acceleration events during a predetermined time window. If the number of the acceleration events measured by the accelerometer exceeds the threshold number of acceleration events, a determination that the mobile computing device is within a vehicle and that the vehicle was involved in a crash may be made. On the other hand, if the number of acceleration events measured by the accelerometer does not exceed the threshold number of acceleration events, a determination that the vehicle was not involved in a crash may be made.
The mobile computing device described herein may have memory storing additional computer-executable instructions that, when executed by the processor of the mobile computing device, cause the processor of the mobile computing device to determine whether each of the acceleration events exceeding the threshold number of acceleration events has a magnitude exceeding an acceleration magnitude threshold. Determining that the vehicle was involved in the crash may comprise determining that the vehicle was involved in the crash if the number of acceleration events measured by the accelerometer exceeds the threshold number of acceleration events, and each of the acceleration events exceeding the threshold number of acceleration events has a magnitude exceeding the acceleration magnitude threshold.
In some aspects, the acceleration events may comprise acceleration events having a magnitude exceeding an acceleration magnitude threshold. The accelerometer may comprise three axes, and the magnitude may comprise at least one of a sum of the absolute values of each of the three axes of the accelerometer, a sum of the squares of each of the three axes of the accelerometer, and a magnitude of a single axis of the accelerometer. Additionally or alternatively, the acceleration magnitude threshold may comprise a plurality of acceleration magnitude thresholds, and the memory of the mobile computing device may store additional computer-executable instructions that, when executed by the processor, cause the processor of the mobile computing device to determine a severity of the crash based on whether one or more of the plurality of acceleration magnitude thresholds has been exceeded and responsive to determining that the vehicle was involved in the crash.
The time window disclosed herein may comprise a number of acceleration samples measured periodically by the accelerometer. Additionally or alternatively, the time window may comprise a time value greater than 5 milliseconds.
In some aspects, the mobile computing device may further comprise communication circuitry configured to wirelessly communicate with other devices. The memory of the mobile computing device may store additional computer-executable instructions that, when executed by the processor, cause the processor of the mobile computing device to send, via the communication circuitry and to a crash detection server, a message comprising information identifying an owner of the mobile computing device, information identifying the mobile computing device, and information identifying a location of the mobile computing device. The message may be sent responsive to a determination that the vehicle was involved in the crash. The mobile computing device may comprise a mobile phone, and the memory of the mobile phone may store additional computer-executable instructions that, when executed by the processor, cause the processor of the mobile phone to receive, via the communication circuitry, a phone call. The phone call may be received responsive to the message being sent to the crash detection server.
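By way of a non-limiting illustration, the contents of such a message may be sketched as follows (the field names and values are hypothetical assumptions for illustration only, not part of the disclosure):

```python
# Hypothetical sketch of a crash-notification message; field names
# and values are illustrative, not a defined protocol.
crash_message = {
    "owner": "Jane Driver",                      # identifies the owner of the device
    "device_id": "mobile-216",                   # identifies the mobile computing device
    "location": {"lat": 41.88, "lon": -87.63},   # identifies the device's location
}
```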
Aspects of the disclosure relate to systems and methods for identifying, by a mobile computing device, a time window, an acceleration magnitude threshold, and a number of acceleration events threshold. The system and method may include a determination of a number of acceleration events measured by an accelerometer of the mobile computing device exceeding the acceleration magnitude threshold during the time window. The system and method may also include a determination that the mobile computing device is within a vehicle and that the vehicle was involved in a crash responsive to determining that the number of acceleration events measured by the accelerometer of the mobile computing device exceeding the acceleration magnitude threshold also exceeds the number of acceleration events threshold.
The systems and methods disclosed herein may include a determination of a location of the mobile computing device and a confirmation that the crash occurred based on the location of the mobile computing device. Moreover, the mobile computing device may identify a second acceleration magnitude threshold greater than the acceleration magnitude threshold. A determination of a second number of acceleration events measured by the accelerometer of the mobile computing device exceeding both the acceleration magnitude threshold and the second acceleration magnitude threshold may be made. The severity of the crash may be determined based on the number of acceleration events exceeding the acceleration magnitude threshold and the second number of acceleration events exceeding both the acceleration magnitude threshold and the second acceleration magnitude threshold.
Other features and advantages of the disclosure will be apparent from the additional description provided herein.
A more complete understanding of the present invention and the advantages thereof may be acquired by referring to the following description in consideration of the accompanying drawings, in which like reference numbers indicate like features, and wherein:
In the following description of the various embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which are shown, by way of illustration, various embodiments of the disclosure that may be practiced. It is to be understood that other embodiments may be utilized.
As will be appreciated by one of skill in the art upon reading the following disclosure, various aspects described herein may be embodied as a method, a computer system, or a computer program product. Accordingly, those aspects may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. In addition, aspects may take the form of a computing device configured to perform specified actions. Furthermore, such aspects may take the form of a computer program product stored by one or more computer-readable storage media having computer-readable program code, or instructions, embodied in or on the storage media. Any suitable computer readable storage media may be utilized, including hard disks, CD-ROMs, optical storage devices, magnetic storage devices, and/or any combination thereof. In addition, various signals representing data or events as described herein may be transferred between a source and a destination in the form of electromagnetic waves traveling through signal-conducting media such as metal wires, optical fibers, and/or wireless transmission media (e.g., air and/or space).
Input/Output (I/O) module 109 may include a microphone, keypad, touch screen, and/or stylus through which a user of the computing device 101 may provide input, and may also include one or more of a speaker for providing audio input/output and a video display device for providing textual, audiovisual and/or graphical output. Software may be stored within memory unit 115 and/or other storage to provide instructions to processor 103 for enabling device 101 to perform various functions. For example, memory unit 115 may store software used by the device 101, such as an operating system 117, application programs 119, and an associated internal database 121. The memory unit 115 includes one or more of volatile and/or non-volatile computer memory to store computer-executable instructions, data, and/or other information. Processor 103 and its associated components may allow the crash detection computing device 101 to execute a series of computer-readable instructions to transmit or receive sensor data, process sensor data, and determine or confirm crash and non-crash events from the sensor data.
The crash detection computing device 101 may operate in a networked environment 100 supporting connections to one or more remote computers, such as terminals/devices 141 and 151. Crash detection computing device 101, and related terminals/devices 141 and 151, may include devices installed in vehicles, mobile devices that may travel within vehicles, or devices outside of vehicles that are configured to receive and process vehicle and other sensor data. Thus, the crash detection computing device 101 and terminals/devices 141 and 151 may each include personal computers (e.g., laptop, desktop, or tablet computers), servers (e.g., web servers, database servers), vehicle-based devices (e.g., on-board vehicle computers, short-range vehicle communication systems, sensor and telematics devices), or mobile communication devices (e.g., mobile phones, portable computing devices, and the like), and may include some or all of the elements described above with respect to the crash detection computing device 101. The network connections depicted in
It will be appreciated that the network connections shown are illustrative and other means of establishing a communications link between the computers may be used. The existence of any of various network protocols such as TCP/IP, Ethernet, FTP, HTTP and the like, and of various wireless communication technologies such as GSM, CDMA, Wi-Fi, and WiMAX, is presumed, and the various computing devices and crash detection system components described herein may be configured to communicate using any of these network protocols or technologies.
Additionally, one or more application programs 119 used by the crash detection computing device 101 may include computer executable instructions (e.g., sensor data analysis programs, crash detection algorithms, and the like) for transmitting and receiving sensor and crash data and performing other related functions as described herein.
Sensor data may refer to information pertaining to one or more actions or events performed by a vehicle and can include aspects of information identified or determined from data collected from a vehicle or mobile device. Sensor data can include, for example, location data, speed or velocity data, acceleration data, presence data, time data, direction data, mobile device orientation data, rotation/gyroscopic data, and the like.
Vehicle 210 may be, for example, an automobile, motorcycle, scooter, bus, recreational vehicle, boat, or other vehicle for which sensor or crash data may be collected and analyzed. A mobile computing device 216 within the vehicle 210 may be used to collect sensor or crash data (e.g., via sensors 218) and/or to receive sensor or crash data from the vehicle 210 (e.g., via vehicle sensors 219). The mobile device 216 may process the data to detect a crash or non-crash event and/or transmit the sensor or crash data to the crash detection server 250 or other external computing devices. Mobile computing device 216 may be, for example, a mobile phone, personal digital assistant (PDA), tablet computer, laptop computer, smartwatch, or other device that may be carried by drivers or passengers inside or outside of the vehicle 210. The mobile computing device 216 may contain some or all of the hardware/software components as the computing device 101 depicted in
When mobile computing device 216 within the vehicle 210 is used to sense vehicle data, the mobile computing device 216 may store, analyze, and/or transmit the vehicle data to one or more other computing devices. For example, mobile device 216 may transmit vehicle data directly to crash detection server 250, and thus may be used instead of sensors or communication systems of the vehicle 210.
The mobile device 216 may include various sensors 218 capable of detecting and recording conditions at and operational parameters of the vehicle 210 if the mobile device 216 is inside the vehicle. The sensors 218 may be used to sense, for example, the location of the mobile device 216, such as the GPS coordinates (e.g., latitude and longitude). The location of the mobile device 216 may also be determined based on wireless networks the mobile device has connected to, such as Wi-Fi networks, cellular networks, and the like. Images taken by a camera of the mobile device 216 may also be used to determine the location. For example, the mobile device may capture an image before, during, or after the accident, and the captured image may be compared to images stored in one or more databases (e.g., databases of a search engine). Once a match is found, the location of the mobile device 216 may be determined based on the tagged location of the matching image in the database. In some aspects, location may be detected, for example, at least once per second (e.g., at 60 Hz).
The sensors 218 of the mobile device 216, such as a GPS and/or a compass, may sense the speed and/or direction at which the mobile device 216 (and accordingly vehicle 210) is traveling. An accelerometer of the mobile device 216 may sense the acceleration of the mobile device. A gyroscope may be used to determine the orientation of the mobile device. In some aspects, orientation may be detected, for example, at a rate of 90 Hz. The gyroscope may also be used to measure the speed of rotation of the mobile device 216. A magnetometer may be used to measure the strength and direction of the magnetic field relative to the mobile device. The sensors 218 previously described are exemplary, and the mobile device 216 may include any other sensors used for crash detection.
The data collected by the mobile device 216 may be stored and/or analyzed within the mobile device 216. The processing components of the mobile computing device 216 may be used to analyze sensor data, determine that a crash has or has not occurred, and confirm whether or not the crash has occurred. Additionally or alternatively, the mobile device 216 may transmit, via a wired or wireless transmission network, the data to one or more external devices for storage or analysis, such as vehicle computer 214 or crash detection server 250. In other words, mobile computing device 216 may be used in conjunction with, or in place of, the vehicle computer 214 or crash detection server 250 to detect crashes.
The vehicle computer 214 of the vehicle 210 may contain some or all of the hardware/software components as the computing device 101 depicted in
The system 200 may include a crash detection server 250, containing some or all of the hardware/software components as the computing device 101 depicted in
The crash detection computer 251 may be configured to retrieve data from the database 252, or may receive driving data directly from vehicle 210, mobile device 216, or other data sources. The crash detection computer 251 may perform crash detection analyses and other related functions, as will be described in further detail in the examples below. The analyses described herein may be performed entirely in the crash detection computer 251 of the crash detection server 250, entirely in the vehicle computer 214, or entirely in the mobile device 216. In other examples, certain analyses may be performed by vehicle computer 214, other analyses may be performed by the crash detection computer 251, and yet other analyses may be performed by the mobile device 216.
The system 200 may also include an external location detection device 220, containing some or all of the hardware/software components as the computing device 101 depicted in
In some aspects, the location of the mobile device 216 and/or vehicle 210 may be determined using another mobile device and/or vehicle. For example, vehicle 210 may be configured to perform vehicle-to-vehicle (V2V) communications, by establishing connections and transmitting/receiving vehicle data to and from other nearby vehicles using short-range communication system 212.
Short-range communication system 212 is a vehicle-based data transmission system configured to transmit vehicle data to other nearby vehicles, and to receive vehicle data from other nearby vehicles. In some examples, communication system 212 may use the dedicated short-range communications (DSRC) protocols and standards to perform wireless communications between vehicles. In the United States, 75 MHz of spectrum in the 5.850-5.925 GHz band has been allocated for DSRC systems and applications, and various other DSRC allocations have been defined in other countries and jurisdictions. However, the short-range communication system 212 need not use DSRC, and may be implemented using other short-range wireless protocols in other examples, such as WLAN communication protocols (e.g., IEEE 802.11), Bluetooth (e.g., IEEE 802.15.1), or one or more of the Communication Access for Land Mobiles (CALM) wireless communication protocols and air interfaces.
The V2V transmissions between the short-range communication system 212 and another vehicle's communication system may be sent via DSRC, Bluetooth, satellite, GSM, infrared, IEEE 802.11, WiMAX, RFID, and/or any suitable wireless communication media, standards, and protocols. In certain systems, the short-range communication system 212 may include specialized hardware installed in vehicle 210 (e.g., transceivers, antennas, etc.), while in other examples the communication system 212 may be implemented using existing vehicle hardware components (e.g., radio and satellite equipment, navigation computers) or may be implemented by software running on the mobile device 216 of drivers and passengers within the vehicle 210.
The range of V2V communications between vehicle communication systems may depend on the wireless communication standards and protocols used, the transmission/reception hardware (e.g., transceivers, power sources, antennas), and other factors. Short-range V2V communications may range from just a few feet to many miles. V2V communications also may include vehicle-to-infrastructure (V2I) communications, such as transmissions from vehicles to non-vehicle receiving devices, for example, toll booths, railroad crossings, and road-side traffic monitoring devices. Certain V2V communication systems may periodically broadcast data from a vehicle 210 to any other vehicle, or other infrastructure device capable of receiving the communication, within the range of the vehicle's transmission capabilities. For example, a vehicle 210 may periodically broadcast (e.g., every 0.1 second, every 0.5 seconds, every second, every 5 seconds, etc.) certain vehicle data via its short-range communication system 212, regardless of whether or not any other vehicles or reception devices are in range. In other examples, a vehicle communication system 212 may first detect nearby vehicles and receiving devices, and may initialize communication with each by performing a handshaking transaction before beginning to transmit its vehicle data to the other vehicles and/or devices.
The types of vehicle data transmitted by the vehicle 210 may depend on the protocols and standards used for the V2V communication, the range of communications, whether a crash has been detected, and other factors. In certain examples, the vehicle 210 may periodically broadcast corresponding sets of similar vehicle driving data, such as the location (which may include an absolute location in GPS coordinates or other coordinate systems, and/or a relative location with respect to another vehicle or a fixed point), speed, and direction of travel. In certain examples, the nodes in a V2V communication system (e.g., vehicles and other reception devices) may use internal clocks with synchronized time signals, and may send transmission times within V2V communications, so that the receiver may calculate its distance from the transmitting node based on the difference between the transmission time and the reception time. The state or usage of the vehicle's 210 controls and instruments may also be transmitted, for example, whether the vehicle is accelerating, braking, turning, and by how much, and/or which of the vehicle's instruments are currently activated by the driver (e.g., head lights, turn signals, hazard lights, cruise control, 4-wheel drive, traction control, windshield wipers, etc.). Vehicle warnings such as detection by the vehicle's 210 internal systems that the vehicle is skidding, that an impact has occurred, or that the vehicle's airbags have been deployed, also may be transmitted in V2V communications.
The mobile computing device 216 may be used instead of, or in conjunction with, short-range communication system 212. For example, the mobile device 216 may communicate directly with the other vehicle or directly with another mobile device, which may be inside or outside of the other vehicle. Additionally or alternatively, the other vehicle may communicate location information to vehicle 210, and vehicle 210 may in turn communicate this location information to the mobile device 216. Any data collected by any vehicle sensor or mobile device 216 sensor may be transmitted via V2V or other communication to other nearby vehicles, mobile devices, or infrastructure devices receiving V2V communications from communication system 212 or communications directly from mobile device 216. Further, additional vehicle driving data not from the vehicle's sensors (e.g., vehicle make/model/year information, driver information, etc.) may be collected from other data sources, such as a driver's or passenger's mobile device 216, crash detection server 250, and/or another external computer system, and transmitted using V2V communications to nearby vehicles and other transmitting and receiving devices using communication system 212.
Systems and methods described herein may detect vehicle crashes (e.g., accidents) based on the number of high magnitude accelerometer readings within a particular time window. For example, a computing device 101 may receive five samples of accelerometer readings made within a time window. The computing device 101 may determine that a crash has occurred if the magnitude of three or more of the accelerometer readings is greater than a threshold. Otherwise, the computing device 101 may determine that a non-crash event occurred, such as the mobile device 216 being dropped or a hard braking event of the vehicle 210. The previous description is merely exemplary, and additional examples of the crash detection system 200 and method performed by the system are described below.
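The counting logic described above may be sketched as follows (a non-limiting, illustrative sketch in Python; the function name and default values are hypothetical, and a count of at least the threshold is treated as a crash, consistent with the worked example below):

```python
def is_crash(magnitudes_g, magnitude_threshold_g=4.0, event_count_threshold=3):
    """Declare a crash when enough samples in the window exceed the
    acceleration magnitude threshold. Defaults are illustrative only."""
    # Count the high-magnitude acceleration events in the window.
    high_events = sum(1 for m in magnitudes_g if m > magnitude_threshold_g)
    # A crash is declared when the count reaches the event-count threshold;
    # otherwise the window is treated as a non-crash event (e.g., a drop).
    return high_events >= event_count_threshold
```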
In step 305, a computing device, such as the crash detection server 250 or mobile device 216, may determine whether to update an acceleration magnitude threshold. The acceleration magnitude threshold may be used alone or in combination with the number of high acceleration events within a time window to determine whether a crash has occurred. As will be described in further detail in the examples below, a computing device may use the acceleration magnitude threshold to distinguish between a crash event (e.g., magnitude of acceleration exceeding the threshold) and a hard braking event (e.g., magnitude of acceleration not exceeding the threshold).
The magnitude and direction of acceleration may be measured by, for example, an accelerometer of the mobile device 216 and/or vehicle 210. The accelerometer may include three different axes (i.e., x-axis, y-axis, and z-axis), and acceleration measurements may be taken for each axis. The magnitude of acceleration for the purposes of crash detection may be determined using any number of methods. For example, the magnitude of acceleration may be determined based on the sum of the absolute values of all three axes of the accelerometer, as illustrated in the following algorithm:
|x|+|y|+|z|
The computing device may add an offset to the axis corresponding to the direction of gravity in order to account for the effect of gravity on acceleration measurements. For example, if the direction of gravity corresponds to the z axis, and acceleration is measured using the standard gravity unit of measurement (G or 9.8 m/s2), the following algorithm may be used to determine the magnitude of acceleration for the purposes of crash detection:
|x|+|y|+|z+1|
Alternatively, if the orientation of the mobile device 216 is unknown, a high-pass filter may be used to remove the effect of gravity. The magnitude of acceleration may alternatively be determined based on the sum of the squares of all three axes of the accelerometer, as illustrated in the following algorithm:
x2+y2+z2
The computing device may add an offset to the axis corresponding to the direction of gravity, such as the z-axis, as illustrated in the following algorithm:
x2+y2+(z+1)2
In some aspects, the magnitude of acceleration may be determined using the magnitude of a single axis of the accelerometer. If a single axis is used, the computing device may choose the axis to measure based on the orientation of the mobile device 216. For example, the gyroscope and compass of the mobile device 216 may be used to determine the orientation of the mobile device, such as by determining the direction of the force of gravity. The orientation of the mobile device may be fixed by a cradle attached to the vehicle 210 (e.g., to the windshield or dashboard of the vehicle 210) configured to hold the mobile device. The mobile device 216 and/or vehicle 210 may detect whether the mobile device 216 is in the cradle using, for example, wired connections (e.g., if the mobile device 216 is plugged into the cradle), wireless connections (e.g., near-field communication (NFC), wireless charging, etc.), or presence sensors (e.g., light sensors on the mobile device 216 or cradle, which may be covered when the mobile device 216 is placed in the cradle). If the mobile device 216 is fixed by the cradle, the computing device may select the appropriate axis (or axes) to measure for acceleration, such as the x-axis, the y-axis, the z-axis, or a combination thereof. Each axis may use a different acceleration magnitude threshold for the purposes of determining a crash or non-crash event.
Returning to
Exemplary, non-limiting acceleration magnitude thresholds include 3G, 4G, and 8G. In some aspects, the computing device may use multiple acceleration magnitude thresholds to determine the severity of the crash. For example, the computing device may be configured for three thresholds: 3G, 8G, and 60G. If the magnitude of acceleration is below 3G, the computing device may determine that a crash did not occur. If the magnitude of acceleration is between 3G and 8G, the computing device may determine that a minor crash occurred. If the magnitude of acceleration is between 8G and 60G, the computing device may determine that a moderate crash occurred. If the magnitude of acceleration is above 60G, the computing device may determine that a severe crash occurred. While the above example uses three thresholds, any number of thresholds (and thus levels of severity) may be used.
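The multi-threshold severity determination above may be sketched as follows (an illustrative, non-limiting sketch; the function name, labels, and the treatment of boundary values as belonging to the higher band are assumptions):

```python
def crash_severity(magnitude_g, thresholds=(3.0, 8.0, 60.0)):
    """Classify crash severity by how many acceleration magnitude
    thresholds have been exceeded: <3G none, 3-8G minor, 8-60G
    moderate, >60G severe (illustrative thresholds)."""
    labels = ("none", "minor", "moderate", "severe")
    # Each threshold met raises the severity by one level.
    level = sum(magnitude_g >= t for t in thresholds)
    return labels[level]
```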
In some aspects, the threshold selected may depend on the configuration and capabilities of the accelerometer in the mobile device 216 or vehicle 210. For example, if the accelerometer is capable of measuring accelerations of up to +/−16G, the computing device may select any threshold value(s) less than 16G.
In step 315, the computing device may determine whether to update a time window. The time window may establish a period of time for which the computing device makes acceleration measurements for the purposes of determining a crash. The time window may be represented as a time value, such as 5 milliseconds. Alternatively, the time window may be represented as a number of acceleration measurements, such as 7 measurements, if the accelerometer makes periodic measurements (e.g., 125 measurements per second or 125 Hz). In the latter example, the time value for the time window may be 56 milliseconds (i.e., 7 measurements÷125 measurements/second). 125 Hz is merely exemplary, and other non-limiting examples include 90 Hz and 100 Hz. Other non-limiting examples of the number of acceleration measurements include 3, 5, and 10 measurements. As will be described in further detail in the examples below, a computing device may determine whether the number of high magnitude acceleration measurements within the time window exceeds a threshold number of acceleration measurements. In step 320, the computing device may determine a new time window if the computing device determined in step 315 to update the window. The time window may be updated in order to improve the accuracy of the crash detection algorithm, based on an analysis of crash and non-crash data collected from a plurality of mobile devices and/or from a plurality of vehicles. The time window may be increased to screen out noise or to observe multiple collisions that occur during a crash.
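The conversion between a sample-count window and its time value may be sketched as follows (an illustrative helper; the function name is hypothetical):

```python
def window_duration_ms(num_samples, sample_rate_hz):
    """Duration, in milliseconds, of a window of periodically taken
    samples (e.g., 7 samples at 125 Hz span 56 ms)."""
    return 1000.0 * num_samples / sample_rate_hz
```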
In step 325, the computing device may determine whether to update a threshold number of acceleration events. In step 330, the computing device may determine a new threshold number of acceleration events if the computing device determines to update the threshold in step 325. The threshold number of acceleration events may be used in combination with the acceleration magnitude threshold and time window previously described to determine whether a crash has occurred. For example, if the number of high magnitude acceleration events during the time window exceeds the threshold number of acceleration events, the computing device may determine that a crash occurred. Otherwise, the computing device may determine that a non-crash event occurred, such as the mobile device being dropped. In some aspects, the time window described above may be chosen to be long enough to distinguish the short duration of a dropped phone's impact with a surface from the longer duration of a vehicle crash. For example, the period of time may be greater than or equal to 5 milliseconds.
As previously described, each of the acceleration magnitude threshold, the time window, and/or the number of acceleration events threshold may be updated according to the steps illustrated in
A brief, non-limiting example of a computing device using the acceleration magnitude threshold, time window, and number of acceleration events threshold will now be described. Assume that the acceleration magnitude threshold is 4G, the time window is 5 measurements (or 40 milliseconds measured periodically at 125 Hz), and the number of acceleration events threshold is 3 measurements. The computing device may receive 5 acceleration measurements from the accelerometer during the time window and determine the magnitude of acceleration for each of the 5 measurements. If the magnitude of acceleration for at least 3 of the measurements exceeds 4G, the computing device may determine that a crash occurred. Otherwise, the computing device may determine that a non-crash event occurred, such as the phone being dropped or a hard braking event. Additional examples of crash detection will now be provided with reference to
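The worked example above may be expressed concretely as follows (the magnitudes in the window are invented for illustration):

```python
# Parameters from the example: 4G magnitude threshold, a 5-sample
# window (40 ms at 125 Hz), and an event-count threshold of 3.
window = [1.2, 5.1, 0.8, 6.4, 4.3]          # hypothetical magnitudes in G
high = sum(1 for m in window if m > 4.0)    # 5.1, 6.4, and 4.3 exceed 4G
crash = high >= 3                           # at least 3 high events -> crash
```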
In step 405, a computing device may determine whether a trigger event has occurred. The trigger event may indicate the possibility of a crash, such as a magnitude of acceleration that exceeds an acceleration magnitude threshold. In some aspects, a threshold smaller than the acceleration magnitude threshold may be used to trigger the computing device to initiate detection of a crash. The trigger event may also be based on GPS measurements. For example, the computing device may determine that a trigger event has occurred if the change in speed measured by the GPS system of the mobile device 216 (or vehicle 210) is greater than a certain threshold. The computing device may wait for a trigger event before proceeding to step 410.
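The trigger check of step 405 may be sketched as below. The specific threshold values (a pre-trigger acceleration threshold smaller than the crash threshold, and a GPS speed-change threshold) are assumptions for illustration; the text states that such thresholds may exist but does not give their magnitudes.

```python
def is_trigger_event(accel_magnitude_g, gps_speed_delta_mps,
                     trigger_accel_g=2.0, speed_delta_threshold_mps=8.0):
    # A trigger occurs if either the acceleration magnitude exceeds a
    # (possibly smaller) pre-trigger threshold, or the change in GPS-measured
    # speed exceeds a speed-change threshold. Both defaults are assumed values.
    return (accel_magnitude_g >= trigger_accel_g
            or abs(gps_speed_delta_mps) >= speed_delta_threshold_mps)

print(is_trigger_event(2.5, 0.0))    # acceleration-based trigger
print(is_trigger_event(0.1, -10.0))  # GPS speed-change trigger
```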
In step 410, the computing device may start the time window for taking acceleration measurements. As previously explained, the time window may comprise a time period and/or a number of measurements to take (e.g., if the acceleration measurements are periodically taken, such as every millisecond). The computing device may also initialize the time window to t=0 (the base time). In step 415, the computing device may initialize an acceleration count, which may be used to track the number of high acceleration events detected during the time window. The acceleration count may be initialized to 0 if the event that triggered the start of the time window is not included in the acceleration count, such as if the magnitude of the acceleration event trigger did not exceed the acceleration magnitude threshold or if the event is not otherwise to be counted. On the other hand, the acceleration count may be initialized to 1 if the magnitude of the acceleration event trigger exceeded the acceleration magnitude threshold or if the event is otherwise to be counted.
Instead of waiting for a trigger event (step 405) to trigger the time window (step 410) and to initialize the acceleration count (step 415), the computing device may use a rolling time window. Sensor data, such as acceleration data and/or GPS data, may be periodically collected and stored in, for example, the memory of the mobile device 216. When a new sensor reading is made, the computing device may drop the oldest reading in the time window and add the new reading to the window.
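The rolling window may be sketched with a fixed-size buffer that evicts the oldest reading when a new one arrives. The class name and the window length of 5 are illustrative, not taken from the disclosure.

```python
from collections import deque

class RollingAccelWindow:
    def __init__(self, size=5):
        # maxlen makes the deque drop the oldest reading automatically
        # when a new reading is appended to a full window.
        self.readings = deque(maxlen=size)

    def add(self, magnitude_g):
        self.readings.append(magnitude_g)

    def high_event_count(self, threshold_g=4.0):
        return sum(1 for m in self.readings if m > threshold_g)

window = RollingAccelWindow(size=5)
for m in [1.0, 4.5, 0.9, 5.1, 4.2, 0.8]:  # the 6th reading evicts the 1st
    window.add(m)
print(window.high_event_count())  # 3
```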
In step 420, the computing device may determine whether the time window has ended. For example, if the time window is 5 milliseconds, the computing device may determine that the time window has ended when t=5 ms. If the time window is 5 measurements, the computing device may determine that the time window has ended when 5 measurements have been taken since the beginning of the time window.
If the time window has not ended (step 420: N), in step 425, the computing device may determine whether the magnitude of the acceleration for the currently sampled acceleration exceeds the acceleration magnitude threshold. For example, if the threshold is 4G and the magnitude of the current acceleration sample is 2.5G (step 425: N), the computing device may return to step 420 to determine whether the time window has ended and/or to take the next measurement. On the other hand, if the magnitude of the current acceleration sample is 4.6G (step 425: Y), the computing device may proceed to step 428.
In step 428, the computing device may optionally determine whether the previous acceleration sample (e.g., immediately previous acceleration sample) also exceeded the acceleration magnitude threshold. If the previous sample did not exceed the threshold (step 428: N), the computing device may proceed to step 430 and increment the acceleration count. On the other hand, if the previous sample exceeded the threshold (step 428: Y), the computing device might not increment the acceleration count and instead return to step 420. In other words, the computing device may optionally determine whether a crash has occurred based on the number of non-consecutive acceleration readings above the acceleration magnitude threshold, rather than relying on consecutive samples. As will be described below, the computing device may determine that a crash occurred based on either consecutive or non-consecutive acceleration samples.
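The optional step-428 behavior amounts to counting only samples whose immediately previous sample was below the threshold, so that one sustained excursion above the threshold is counted once. A sketch (the function name and flag are illustrative):

```python
def count_events(magnitudes_g, threshold_g=4.0, skip_consecutive=True):
    # With skip_consecutive=True, a sample above the threshold is counted
    # only if the immediately previous sample was below the threshold
    # (step 428); otherwise every above-threshold sample is counted.
    count = 0
    prev_above = False
    for m in magnitudes_g:
        above = m > threshold_g
        if above and not (skip_consecutive and prev_above):
            count += 1
        prev_above = above
    return count

samples = [4.6, 4.8, 1.2, 5.0, 4.4]
print(count_events(samples))                          # 2 (4.6/4.8 count once)
print(count_events(samples, skip_consecutive=False))  # 4
```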
In step 435, the computing device may determine whether the acceleration count within the time window has exceeded the number of acceleration events threshold. For example, if the threshold is two high magnitude acceleration events and the acceleration count is two (step 435: N), the computing device may return to step 420 to determine whether the time window has ended and/or to take the next measurement. On the other hand, if the acceleration count is three (step 435: Y), the computing device may proceed to step 445 and determine that a crash has occurred. The computing device may also determine that the mobile device is located within the vehicle involved in the crash. As previously explained, the computing device may determine the severity of the crash based on a plurality of acceleration magnitude thresholds. For example, if one, some, or all of the measured magnitudes exceeds a high threshold, the computing device may determine that a severe crash occurred. If one, some, or all of the magnitudes falls between a medium and high threshold, the computing device may determine that a moderate crash occurred. If one, some, or all of the magnitudes falls between a low and medium threshold, the computing device may determine that a minor crash occurred. If the mobile device 216 or vehicle computer 214 determines that a crash occurred in step 445, the device may generate a message indicating the crash and send the message to, for example, crash detection server 250.
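The severity determination described above may be sketched as follows. The text states only that low, medium, and high acceleration magnitude thresholds may exist; the numeric defaults below are assumptions for illustration.

```python
def classify_severity(peak_magnitude_g,
                      low_g=2.0, medium_g=4.0, high_g=6.0):
    # Thresholds are assumed values: a magnitude above the high threshold
    # suggests a severe crash; between medium and high, a moderate crash;
    # between low and medium, a minor crash.
    if peak_magnitude_g >= high_g:
        return "severe"
    if peak_magnitude_g >= medium_g:
        return "moderate"
    if peak_magnitude_g >= low_g:
        return "minor"
    return "below crash thresholds"

print(classify_severity(7.2))  # severe
print(classify_severity(4.6))  # moderate
print(classify_severity(2.5))  # minor
```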
In step 450, the computing device may confirm whether a crash occurred by analyzing additional data. In some aspects, the computing device may confirm the accident based on GPS readings. For example, the computing device may confirm the accident based on the change in speed of the vehicle 210 being greater than a threshold (e.g., indicating a hard stop or deceleration) and the GPS coordinates of the vehicle after the hard stop or deceleration falling within a certain radius of the location of the hard stop or deceleration for a particular length of time (e.g., thirty seconds).
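The GPS-based confirmation of step 450 may be sketched as follows. For simplicity, positions are treated as (x, y) offsets in meters from the stop location; a real implementation would compute great-circle (e.g., haversine) distances between latitude/longitude fixes. The thresholds are assumed values.

```python
import math

def confirm_crash(speed_delta_mps, later_positions, stop_point,
                  decel_threshold_mps=8.0, radius_m=50.0):
    # Confirm the crash if (1) the change in speed indicates a hard stop or
    # deceleration, and (2) every subsequent GPS fix stays within radius_m
    # of the stop location for the observation period (e.g., thirty seconds).
    if abs(speed_delta_mps) < decel_threshold_mps:
        return False
    return all(math.dist(p, stop_point) <= radius_m for p in later_positions)

# Hard deceleration, then the vehicle stays within 50 m of the stop point.
print(confirm_crash(-12.0, [(3, 4), (10, 0), (0, 20)], (0, 0)))  # True
# Mild deceleration: not confirmed.
print(confirm_crash(-3.0, [(0, 0)], (0, 0)))                     # False
```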
A JavaScript Object Notation (JSON) configuration may be used for crash determination and confirmation, as previously described. An exemplary JSON structure may be as follows:
A JSON dictionary may include keys for “gps” and “accelerometer.” The following table illustrates the keys for “accelerometer”:
The following table illustrates the keys for “gps”:
The above JSON configuration example may be used to determine and confirm a crash in the following scenario. The GPS trail may show a deceleration with a magnitude of 0.33 G, followed by the vehicle not moving more than 50 m in 30 s. Within an acceleration window of length 7 (e.g., a time value of 7/90 seconds for 90 Hz sampling) starting at the same time as the above GPS deceleration event, at least 3 of the 7 acceleration magnitude readings exceed 5G.
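A configuration matching this scenario might be built and serialized as below. The text names only the top-level "gps" and "accelerometer" keys; the nested key names are hypothetical, and the numeric values come from the scenario above.

```python
import json

# Hypothetical key names under the "gps" and "accelerometer" dictionaries;
# values taken from the scenario described in the text.
config = {
    "gps": {
        "deceleration_threshold_g": 0.33,
        "stationary_radius_m": 50,
        "stationary_time_s": 30,
    },
    "accelerometer": {
        "sample_rate_hz": 90,
        "window_length_samples": 7,
        "magnitude_threshold_g": 5.0,
        "min_event_count": 3,
    },
}
print(json.dumps(config, indent=2))
```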
Additionally or alternatively, the computing device may confirm (after detecting) the crash based on the location of the mobile device 216 and/or vehicle 210. For example, if the computing device determines that the mobile device 216 is on a road (or within a predetermined radius from a road), the computing device may confirm the crash. Otherwise, the computing device may determine that a crash did not occur. The location of the mobile device 216 and/or vehicle 210 may be determined using the location detection device 220, as previously described. The computing device may determine the existence of a road by accessing a database of maps, such as GPS or search engine maps. If the crash is not confirmed (step 450: N), the computing device may return to step 405 to determine whether another trigger event has occurred. If the crash is confirmed (step 450: Y), the computing device may proceed to step 455.
In step 455, the computing device may generate and/or store the crash data, such as the number of acceleration events counted, the severity of the crash, and the threshold values. The computing device may also generate and/or store the location of the crash, the time of the crash (including time zone), the identity of the vehicle (e.g., VIN, make/model, license plate number, etc.), the identity of the driver involved in the crash (e.g., name, customer number, driver's license number, etc.), and the identity of the mobile device 216 (e.g., IMEI, MAC address, IP address, etc.). For example, the time may be represented by a timestamp in the format YYYY-MM-DD HH:MM:SS-ZZZZ, where -ZZZZ is the time zone offset from UTC (e.g., -0500 for Eastern Standard Time). In some aspects, the mobile device 216 may send the data to the crash detection server 250, which may store the data in database 252. The mobile device 216 may also send data for a number of seconds before and after the time window (e.g., 5 seconds before and 5 seconds after, or 10 seconds before and 10 seconds after) to the crash detection server 250, and the data may be stored in database 252. By providing this data to the crash detection server 250, the server may be able to compare the before, during, and after values to confirm the crash. The crash detection server 250 may also use the stored information to make fast insurance claim determinations (relative to the driver reporting the crash days or weeks later), begin estimating vehicle damage costs faster at the First Notice of Loss (FNOL), and identify the location of accidents.
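The timestamp format described above can be produced with standard date formatting; `%z` renders the UTC offset as -0500, yielding YYYY-MM-DD HH:MM:SS-ZZZZ. The specific date and time below are illustrative.

```python
from datetime import datetime, timedelta, timezone

# Eastern Standard Time is UTC-0500, matching the example offset in the text.
est = timezone(timedelta(hours=-5))
crash_time = datetime(2015, 4, 13, 14, 30, 5, tzinfo=est)

stamp = crash_time.strftime("%Y-%m-%d %H:%M:%S%z")
print(stamp)  # 2015-04-13 14:30:05-0500
```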
In step 460, the computing device may notify one or more individuals of the crash, via email, a telephone call, an on-screen pop-up, or any other communication medium. For example, the computing device may contact emergency personnel, such as local fire or police personnel. The message to the emergency personnel may include the location of the crash, the identity of the driver involved in the crash, the license plate number of the vehicle, the severity of the crash, and the like. The computing device may similarly send messages to other individuals, such as the driver's emergency contact identified in his or her profile stored in database 252. The computing device may also attempt to contact the driver or passenger of the vehicle involved in the crash. For example, the computing device may attempt to call the mobile device 216 or an onboard vehicle communication system in the vehicle 210. Additionally or alternatively, the computing device may provide emergency personnel with the phone number of the mobile device 216, which they may use to contact individuals in the vehicle.
Returning to step 420, the computing device may determine that the time window ended (step 420: Y), without the acceleration count exceeding the threshold number of acceleration events needed to determine that a crash occurred. In response, the computing device may determine that a non-crash event occurred, such as the mobile device 216 being dropped or a hard braking event. For example, if the mobile device 216 is dropped, the computing device might only detect one or two high magnitude events (compared to three or four for a crash). Accordingly, in step 440, the computing device may determine whether the number of high magnitude acceleration events falls below a mobile device drop threshold, such as two or three. If so (step 440: Y), the computing device may determine, in step 470, that the mobile device was dropped. The computing device may optionally return to step 405 to monitor for additional trigger events and/or crashes. Otherwise, in step 475, the computing device may determine that a hard braking event occurred. The computing device may return to step 405 to monitor for additional trigger events and/or crashes.
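The non-crash classification of steps 440–475 may be sketched as follows. The drop threshold of three events follows the "two or three" example in the text; the function name is illustrative.

```python
def classify_non_crash(high_event_count, drop_threshold=3):
    # Per steps 440-475: after the window ends without a crash determination,
    # a count below the drop threshold suggests the mobile device was dropped;
    # otherwise a hard braking event is determined.
    return "dropped" if high_event_count < drop_threshold else "hard braking"

print(classify_non_crash(1))  # dropped
print(classify_non_crash(3))  # hard braking
```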
While the aspects described herein have been discussed with respect to specific examples including various modes of carrying out aspects of the disclosure, those skilled in the art will appreciate that there are numerous variations and permutations of the above described systems and techniques that fall within the spirit and scope of the invention.
This application is a continuation of pending U.S. patent application Ser. No. 16/106,455, filed Aug. 21, 2018 and entitled “Automatic Crash Detection,” which is a continuation of U.S. patent application Ser. No. 15/880,187 (now U.S. Pat. No. 10,083,550), filed Jan. 25, 2018 and entitled “Automatic Crash Detection,” which is a continuation of U.S. patent application Ser. No. 15/665,710 (now U.S. Pat. No. 9,916,698), filed Aug. 1, 2017 and entitled “Automatic Crash Detection,” which is a continuation of U.S. patent application Ser. No. 14/685,067 (now U.S. Pat. No. 9,767,625), filed Apr. 13, 2015 and entitled “Automatic Crash Detection.” Each of the prior applications is incorporated herein by reference in its entirety.
Number | Name | Date | Kind |
---|---|---|---|
2833495 | Feeney et al. | May 1958 | A |
4198864 | Breed | Apr 1980 | A |
4716458 | Heitzman et al. | Dec 1987 | A |
5517183 | Bozeman, Jr. | May 1996 | A |
5521822 | Wang | May 1996 | A |
5719554 | Gagnon | Feb 1998 | A |
5736970 | Bozeman, Jr. | Apr 1998 | A |
5903317 | Sharir et al. | May 1999 | A |
5950169 | Borghesi et al. | Sep 1999 | A |
5963128 | McClelland | Oct 1999 | A |
6023664 | Bennet | Feb 2000 | A |
6060989 | Gehlot | May 2000 | A |
6061610 | Boer | May 2000 | A |
6076028 | Donnelly | Jun 2000 | A |
6141611 | Mackey et al. | Oct 2000 | A |
6246933 | Bague | Jun 2001 | B1 |
6262657 | Okuda et al. | Jul 2001 | B1 |
6266617 | Evans | Jul 2001 | B1 |
6295492 | Lang et al. | Sep 2001 | B1 |
6330499 | Chou et al. | Dec 2001 | B1 |
6405112 | Rayner | Jun 2002 | B1 |
6438475 | Gioutsos et al. | Aug 2002 | B1 |
6472982 | Eida et al. | Oct 2002 | B2 |
6509868 | Flick | Jan 2003 | B2 |
6539249 | Kadhiresan et al. | Mar 2003 | B1 |
6553308 | Uhlmann et al. | Apr 2003 | B1 |
6573831 | Ikeda et al. | Jun 2003 | B2 |
6594579 | Lowrey et al. | Jul 2003 | B1 |
6611740 | Lowrey et al. | Aug 2003 | B2 |
6641038 | Gehlot et al. | Nov 2003 | B2 |
6642844 | Montague | Nov 2003 | B2 |
6701234 | Vogelsang | Mar 2004 | B1 |
6732020 | Yamagishi | May 2004 | B2 |
6732031 | Lightner et al. | May 2004 | B1 |
6741168 | Webb et al. | May 2004 | B2 |
6756887 | Evans | Jun 2004 | B2 |
6765499 | Flick | Jul 2004 | B2 |
6798356 | Flick | Sep 2004 | B2 |
6909947 | Douros et al. | Jun 2005 | B2 |
6925425 | Remboski et al. | Aug 2005 | B2 |
6946966 | Koenig | Sep 2005 | B2 |
6982654 | Rau et al. | Jan 2006 | B2 |
6988033 | Lowrey et al. | Jan 2006 | B1 |
7069118 | Coletrane et al. | Jun 2006 | B2 |
7082359 | Breed | Jul 2006 | B2 |
7092803 | Kapolka et al. | Aug 2006 | B2 |
7099835 | Williams, III | Aug 2006 | B2 |
7113127 | Banet et al. | Sep 2006 | B1 |
7119669 | Lundsgaard et al. | Oct 2006 | B2 |
7129826 | Nitz et al. | Oct 2006 | B2 |
7133611 | Kaneko | Nov 2006 | B2 |
7135993 | Okamoto et al. | Nov 2006 | B2 |
7155259 | Bauchot et al. | Dec 2006 | B2 |
7155321 | Bromley et al. | Dec 2006 | B2 |
7158016 | Cuddihy et al. | Jan 2007 | B2 |
7174243 | Lightner et al. | Feb 2007 | B1 |
7271716 | Nou | Sep 2007 | B2 |
7305293 | Flick | Dec 2007 | B2 |
7323972 | Nobusawa | Jan 2008 | B2 |
7323973 | Ceglia et al. | Jan 2008 | B1 |
7348895 | Lagassey | Mar 2008 | B2 |
7418400 | Lorenz | Aug 2008 | B1 |
7477968 | Lowrey et al. | Jan 2009 | B1 |
7504965 | Windover et al. | Mar 2009 | B1 |
7508298 | Pisz et al. | Mar 2009 | B2 |
7565230 | Gardner et al. | Jul 2009 | B2 |
7600426 | Savolainen et al. | Oct 2009 | B2 |
7624031 | Simpson et al. | Nov 2009 | B2 |
7650235 | Lee et al. | Jan 2010 | B2 |
7671727 | Flick | Mar 2010 | B2 |
7715961 | Kargupta | May 2010 | B1 |
7747365 | Lowrey et al. | Jun 2010 | B1 |
7872636 | Gopi et al. | Jan 2011 | B1 |
7908921 | Binda et al. | Mar 2011 | B2 |
8000979 | Blom | Aug 2011 | B2 |
8014789 | Breed | Sep 2011 | B2 |
8019629 | Medina, III et al. | Sep 2011 | B1 |
8022845 | Zlojutro | Sep 2011 | B2 |
8041635 | Garcia et al. | Oct 2011 | B1 |
8069060 | Tipirneni | Nov 2011 | B2 |
8090598 | Bauer et al. | Jan 2012 | B2 |
8140358 | Ling et al. | Mar 2012 | B1 |
8214100 | Lowrey et al. | Jul 2012 | B2 |
8229759 | Zhu et al. | Jul 2012 | B2 |
8255275 | Collopy et al. | Aug 2012 | B2 |
8260639 | Medina, III et al. | Sep 2012 | B1 |
8271187 | Taylor et al. | Sep 2012 | B2 |
8285588 | Postrel | Oct 2012 | B2 |
8311858 | Everett et al. | Nov 2012 | B2 |
8321086 | Park et al. | Nov 2012 | B2 |
8330593 | Golenski | Dec 2012 | B2 |
8370254 | Hopkins, III et al. | Feb 2013 | B1 |
8401877 | Salvagio | Mar 2013 | B2 |
8403225 | Sharra et al. | Mar 2013 | B2 |
8417604 | Orr et al. | Apr 2013 | B2 |
8423239 | Blumer et al. | Apr 2013 | B2 |
8432262 | Talty et al. | Apr 2013 | B2 |
8433590 | Prescott | Apr 2013 | B2 |
8438049 | Ranicar, III et al. | May 2013 | B2 |
8442508 | Harter et al. | May 2013 | B2 |
8442797 | Kim et al. | May 2013 | B2 |
8447459 | Lowrey et al. | May 2013 | B2 |
8452486 | Banet et al. | May 2013 | B2 |
8463488 | Hart | Jun 2013 | B1 |
8466781 | Miller et al. | Jun 2013 | B2 |
8478514 | Kargupta | Jul 2013 | B2 |
8484113 | Collopy et al. | Jul 2013 | B2 |
8494938 | Kazenas | Jul 2013 | B1 |
8510133 | Peak et al. | Aug 2013 | B2 |
8510200 | Pearlman et al. | Aug 2013 | B2 |
8527135 | Lowrey et al. | Sep 2013 | B2 |
8571895 | Medina, III et al. | Oct 2013 | B1 |
8577703 | McClellan et al. | Nov 2013 | B2 |
8581712 | Morgan et al. | Nov 2013 | B2 |
8589015 | Willis et al. | Nov 2013 | B2 |
8595034 | Bauer et al. | Nov 2013 | B2 |
8598977 | Maalouf et al. | Dec 2013 | B2 |
8620692 | Collopy et al. | Dec 2013 | B2 |
8630768 | McClellan et al. | Jan 2014 | B2 |
8633985 | Haynes et al. | Jan 2014 | B2 |
8635091 | Amigo et al. | Jan 2014 | B2 |
8688380 | Cawse et al. | Apr 2014 | B2 |
8712893 | Brandmaier et al. | Apr 2014 | B1 |
8751270 | Hanson et al. | Jun 2014 | B1 |
8799034 | Brandmaier et al. | Aug 2014 | B1 |
8930581 | Anton et al. | Jan 2015 | B2 |
9002719 | Tofte | Apr 2015 | B2 |
9165325 | Chakravarty et al. | Oct 2015 | B2 |
9324201 | Jun | Apr 2016 | B2 |
9361735 | Leise | Jun 2016 | B1 |
9659331 | Hanson et al. | May 2017 | B1 |
9672719 | Hollenstain et al. | Jun 2017 | B1 |
9767625 | Snyder et al. | Sep 2017 | B1 |
10580075 | Brandmaier et al. | Mar 2020 | B1 |
10657647 | Chen et al. | May 2020 | B1 |
20020003571 | Schofield et al. | Jan 2002 | A1 |
20020007289 | Malin et al. | Jan 2002 | A1 |
20020049535 | Rigo et al. | Apr 2002 | A1 |
20020055861 | King et al. | May 2002 | A1 |
20020103622 | Burge | Aug 2002 | A1 |
20020111725 | Burge | Aug 2002 | A1 |
20020161697 | Stephens et al. | Oct 2002 | A1 |
20030005765 | Brudis et al. | Jan 2003 | A1 |
20030233261 | Kawahara et al. | Dec 2003 | A1 |
20040000992 | Cuddihy et al. | Jan 2004 | A1 |
20040068350 | Tomson | Apr 2004 | A1 |
20040083123 | Kim et al. | Apr 2004 | A1 |
20040088090 | Wee | May 2004 | A1 |
20040128065 | Taylor et al. | Jul 2004 | A1 |
20040145457 | Schofield et al. | Jul 2004 | A1 |
20040186744 | Lux | Sep 2004 | A1 |
20040189493 | Estus et al. | Sep 2004 | A1 |
20040189722 | Acres | Sep 2004 | A1 |
20040205622 | Jones et al. | Oct 2004 | A1 |
20050021374 | Allahyari | Jan 2005 | A1 |
20050104745 | Bachelder et al. | May 2005 | A1 |
20050119826 | Lee et al. | Jun 2005 | A1 |
20050161505 | Yin et al. | Jul 2005 | A1 |
20050216487 | Fisher et al. | Sep 2005 | A1 |
20050278082 | Weekes | Dec 2005 | A1 |
20060025897 | Shostak et al. | Feb 2006 | A1 |
20060055583 | Orr et al. | Mar 2006 | A1 |
20060067573 | Parr et al. | Mar 2006 | A1 |
20060192783 | Kass et al. | Aug 2006 | A1 |
20060224305 | Ansari et al. | Oct 2006 | A1 |
20060226960 | Pisz et al. | Oct 2006 | A1 |
20060282202 | Cashier | Dec 2006 | A1 |
20070009136 | Pawlenko et al. | Jan 2007 | A1 |
20070027583 | Tamir et al. | Feb 2007 | A1 |
20070037610 | Logan | Feb 2007 | A1 |
20070043594 | Lavergne | Feb 2007 | A1 |
20070136162 | Thibodeau et al. | Jun 2007 | A1 |
20070162308 | Peters | Jul 2007 | A1 |
20070238954 | White et al. | Oct 2007 | A1 |
20070288268 | Weeks | Dec 2007 | A1 |
20080027761 | Bracha | Jan 2008 | A1 |
20080052134 | Nowak et al. | Feb 2008 | A1 |
20080078253 | Blackwood et al. | Apr 2008 | A1 |
20080215375 | Nakano et al. | Sep 2008 | A1 |
20080225118 | Suzuki | Sep 2008 | A1 |
20080242261 | Shimanuki et al. | Oct 2008 | A1 |
20080255722 | McClellan et al. | Oct 2008 | A1 |
20080294690 | McClellan et al. | Nov 2008 | A1 |
20080300731 | Nakajima et al. | Dec 2008 | A1 |
20080306636 | Caspe-Detzer et al. | Dec 2008 | A1 |
20080306996 | McClellan | Dec 2008 | A1 |
20080319665 | Berkobin et al. | Dec 2008 | A1 |
20090013755 | Tsai | Jan 2009 | A1 |
20090036091 | Ball et al. | Feb 2009 | A1 |
20090063174 | Fricke | Mar 2009 | A1 |
20090099732 | Pisz | Apr 2009 | A1 |
20090106052 | Moldovan | Apr 2009 | A1 |
20090125180 | Berkobin et al. | May 2009 | A1 |
20090164504 | Flake et al. | Jun 2009 | A1 |
20090192688 | Padmanabhan et al. | Jul 2009 | A1 |
20090198772 | Kim et al. | Aug 2009 | A1 |
20090234678 | Arenas | Sep 2009 | A1 |
20090248283 | Bicego, Jr. | Oct 2009 | A1 |
20090254241 | Basir | Oct 2009 | A1 |
20090265385 | Beland et al. | Oct 2009 | A1 |
20100020170 | Higgins-Luthman et al. | Jan 2010 | A1 |
20100030540 | Choi et al. | Feb 2010 | A1 |
20100036595 | Coy et al. | Feb 2010 | A1 |
20100049552 | Fini et al. | Feb 2010 | A1 |
20100131300 | Collopy et al. | May 2010 | A1 |
20100138242 | Ferrick et al. | Jun 2010 | A1 |
20100161491 | Bauchot et al. | Jun 2010 | A1 |
20100174564 | Stender et al. | Jul 2010 | A1 |
20100205012 | McClellan | Aug 2010 | A1 |
20100219944 | McCormick et al. | Sep 2010 | A1 |
20100250369 | Peterson et al. | Sep 2010 | A1 |
20100323657 | Barnard et al. | Dec 2010 | A1 |
20110012720 | Hirschfeld | Jan 2011 | A1 |
20110060496 | Nielsen et al. | Mar 2011 | A1 |
20110070834 | Griffin et al. | Mar 2011 | A1 |
20110070864 | Karam et al. | Mar 2011 | A1 |
20110077028 | Wilkes, III et al. | Mar 2011 | A1 |
20110106449 | Chowdhary et al. | May 2011 | A1 |
20110118934 | Lowrey et al. | May 2011 | A1 |
20110153367 | Amigo et al. | Jun 2011 | A1 |
20110153369 | Feldman et al. | Jun 2011 | A1 |
20110161116 | Peak et al. | Jun 2011 | A1 |
20110161118 | Borden et al. | Jun 2011 | A1 |
20110161119 | Collins | Jun 2011 | A1 |
20110185178 | Gotthardt | Jul 2011 | A1 |
20110281564 | Armitage et al. | Nov 2011 | A1 |
20110285874 | Showering et al. | Nov 2011 | A1 |
20110307119 | Basir et al. | Dec 2011 | A1 |
20110307188 | Peng et al. | Dec 2011 | A1 |
20120021386 | Anderson et al. | Jan 2012 | A1 |
20120028680 | Breed | Feb 2012 | A1 |
20120047203 | Brown et al. | Feb 2012 | A1 |
20120069051 | Hagbi et al. | Mar 2012 | A1 |
20120072243 | Collins et al. | Mar 2012 | A1 |
20120076437 | King | Mar 2012 | A1 |
20120084179 | McRae et al. | Apr 2012 | A1 |
20120109690 | Weinrauch et al. | May 2012 | A1 |
20120109692 | Collins et al. | May 2012 | A1 |
20120119936 | Miller et al. | May 2012 | A1 |
20120136802 | McQuade et al. | May 2012 | A1 |
20120150412 | Yoon et al. | Jun 2012 | A1 |
20120191476 | Reid et al. | Jul 2012 | A1 |
20120192235 | Tapley et al. | Jul 2012 | A1 |
20120197486 | Elliott | Aug 2012 | A1 |
20120197669 | Kote et al. | Aug 2012 | A1 |
20120202551 | Mirbaha | Aug 2012 | A1 |
20120209631 | Roscoe et al. | Aug 2012 | A1 |
20120209632 | Kaminski et al. | Aug 2012 | A1 |
20120230548 | Calman et al. | Sep 2012 | A1 |
20120232995 | Castro et al. | Sep 2012 | A1 |
20120239417 | Pourfallah et al. | Sep 2012 | A1 |
20120242503 | Thomas et al. | Sep 2012 | A1 |
20120250938 | DeHart | Oct 2012 | A1 |
20120259665 | Pandhi et al. | Oct 2012 | A1 |
20120290150 | Doughty et al. | Nov 2012 | A1 |
20120303392 | Depura et al. | Nov 2012 | A1 |
20120316893 | Egawa | Dec 2012 | A1 |
20120330687 | Hilario et al. | Dec 2012 | A1 |
20130006674 | Bowne et al. | Jan 2013 | A1 |
20130006675 | Bowne et al. | Jan 2013 | A1 |
20130018676 | Fischer et al. | Jan 2013 | A1 |
20130033386 | Zlojutro | Feb 2013 | A1 |
20130035964 | Roscoe et al. | Feb 2013 | A1 |
20130046510 | Bowne et al. | Feb 2013 | A1 |
20130054274 | Katyal | Feb 2013 | A1 |
20130069802 | Foghel et al. | Mar 2013 | A1 |
20130073318 | Feldman et al. | Mar 2013 | A1 |
20130073321 | Hofmann et al. | Mar 2013 | A1 |
20130090881 | Janardhanan et al. | Apr 2013 | A1 |
20130138267 | Hignite et al. | May 2013 | A1 |
20130151288 | Bowne et al. | Jun 2013 | A1 |
20130166098 | Lavie et al. | Jun 2013 | A1 |
20130166326 | Lavie et al. | Jun 2013 | A1 |
20130179027 | Mitchell | Jul 2013 | A1 |
20130179198 | Bowne et al. | Jul 2013 | A1 |
20130190967 | Hassib et al. | Jul 2013 | A1 |
20130197856 | Barfield et al. | Aug 2013 | A1 |
20130197945 | Anderson | Aug 2013 | A1 |
20130204645 | Lehman et al. | Aug 2013 | A1 |
20130211660 | Mitchell | Aug 2013 | A1 |
20130226369 | Yorio et al. | Aug 2013 | A1 |
20130226397 | Kuphal et al. | Aug 2013 | A1 |
20130289819 | Hassib et al. | Oct 2013 | A1 |
20130290036 | Strange | Oct 2013 | A1 |
20130297097 | Fischer et al. | Nov 2013 | A1 |
20130297353 | Strange et al. | Nov 2013 | A1 |
20130297418 | Collopy et al. | Nov 2013 | A1 |
20130300552 | Chang | Nov 2013 | A1 |
20130311209 | Kaminski et al. | Nov 2013 | A1 |
20130316310 | Musicant et al. | Nov 2013 | A1 |
20130317860 | Schumann, Jr. | Nov 2013 | A1 |
20130317865 | Tofte et al. | Nov 2013 | A1 |
20130332026 | McKown et al. | Dec 2013 | A1 |
20130336523 | Ruan | Dec 2013 | A1 |
20130339062 | Brewer et al. | Dec 2013 | A1 |
20140067429 | Lowell | Mar 2014 | A1 |
20140081675 | Ives et al. | Mar 2014 | A1 |
20140114691 | Pearce | Apr 2014 | A1 |
20140121878 | Pandhi et al. | May 2014 | A1 |
20140122012 | Barfield | May 2014 | A1 |
20140132404 | Katoh et al. | May 2014 | A1 |
20140195070 | Shimizu et al. | Jul 2014 | A1 |
20140244312 | Gray et al. | Aug 2014 | A1 |
20140244678 | Zamer et al. | Aug 2014 | A1 |
20140277916 | Mullen et al. | Sep 2014 | A1 |
20140300739 | Mimar | Oct 2014 | A1 |
20140313334 | Slotky | Oct 2014 | A1 |
20140316825 | van Dijk et al. | Oct 2014 | A1 |
20140344050 | McKinley | Nov 2014 | A1 |
20140368602 | Woodgate et al. | Dec 2014 | A1 |
20150006023 | Fuchs | Jan 2015 | A1 |
20150019267 | Prieto et al. | Jan 2015 | A1 |
20150073834 | Gurenko et al. | Mar 2015 | A1 |
20150088550 | Bowers et al. | Mar 2015 | A1 |
20150106133 | Smith, Jr. | Apr 2015 | A1 |
20150149218 | Bayley et al. | May 2015 | A1 |
20150269791 | Amigo et al. | Sep 2015 | A1 |
20150307048 | Santora | Oct 2015 | A1 |
20150324924 | Wilson et al. | Nov 2015 | A1 |
20160094964 | Barfield, Jr. et al. | Mar 2016 | A1 |
20160203703 | Graeve | Jul 2016 | A1 |
20160255282 | Bostick et al. | Sep 2016 | A1 |
20170089710 | Slusar | Mar 2017 | A1 |
20170293894 | Taliwal et al. | Oct 2017 | A1 |
20180033220 | Pal et al. | Feb 2018 | A1 |
Number | Date | Country |
---|---|---|
2002301438 | Sep 2006 | AU |
2007200869 | Mar 2007 | AU |
2658219 | Jan 2008 | CA |
203025907 | Jun 2013 | CN |
103390326 | Nov 2013 | CN |
1488198 | Dec 2004 | EP |
1826734 | Aug 2007 | EP |
1965361 | Sep 2008 | EP |
2147320 | Jan 2010 | EP |
2481037 | Aug 2012 | EP |
2486384 | Jun 2012 | GB |
2488956 | Sep 2012 | GB |
2005112932 | May 2004 | KR |
1998047109 | Oct 1998 | WO |
2002079934 | Oct 2002 | WO |
2006074682 | Jul 2006 | WO |
2012045128 | Apr 2012 | WO |
2012067640 | May 2012 | WO |
2012097441 | Jul 2012 | WO |
2012106878 | Aug 2012 | WO |
2012173655 | Dec 2012 | WO |
2012174590 | Dec 2012 | WO |
Entry |
---|
Jan. 15, 2020—U.S. Notice of Allowance—U.S. Appl. No. 16/106,380. |
Jan. 24, 2020—U.S. Non-Final Office Action—U.S. Appl. No. 15/271,834. |
Aug. 7, 2019—U.S. Non-Final Office Action—U.S. Appl. No. 16/106,380. |
Mar. 21, 2019—U.S. Final Office Action—U.S. Appl. No. 15/271,812. |
Mar. 21, 2019—U.S. Final Office Action—U.S. Appl. No. 15/271,834. |
Apr. 12, 2019 (WO) International Search Report—App. PCT/US2019/016324. |
Apr. 12, 2019 (WO) Written Opinion of the International Searching Authority—App. PCT/US2019/016324. |
“Vehicle Wrap Trends: What are QR Codes and why do I need one?” Sunrise Signs. Retrieved from http://www.sunrisesigns.com/our-blog/bid/34661/Vehicle-Wrap-Trends-What-are-QR-Codes-and-why-do-I-need-one on May 21, 2013. |
“When to File a Car Insurance Claim—and When Not To,” retrieved Jun. 3, 2016 from https://www.nerdwallet.com/blog/insurance/when-to-file-car-insurance-claims/, 6 pages. |
“WreckWatch: Automatic Traffic Accident Detection and Notification with Smartphones.” J. White et al., Journal of Mobile Networks and Applications manuscript, retrieved Apr. 15, 2015. |
Nov. 30, 2017—(WO) International Search Report—PCT/US17/52199. |
Allen Hong, The Linear-Logic ScanGauge II Review, Jun. 10, 2007. |
AX22 Performance Computer, Race Technology Ltd. [On-line], Retrieved from the Internet: http://www.race-technology.com. |
Barometer-Aided Road Grade Estimation, Jussi Parviainen et al., Tampere University of Technology, Finland; 2009. |
Bubble Level, 2010, Apple Inc. [On-line], Retrieved from the Internet: http://developer.apple.com/library/ios/samplecode/BubbleLevel/Listings/ReadMe_txt.html. |
Car Accelerometer details, 2011, Hurtado Apps—iPhone/iPod applications [On-line], Retrieved from the Internet: http://apps.hurtado.cl/car/car-details. |
Carl Duzen, et al., Using an Accelerometer to Classify Motion, CAPE inc, 2001. |
Charles Petzold, Accelerometer and Location Service in Windows Phone 7, Nov. 23, 2010 [On-line], Retrieved from the Internet: http://www.c-sharpcorner.com/UploadFile/8c85cf/4363/. |
CS-525H: Immersive Human-Computer Interaction, Oct. 25, 2010, Department of Computer Science, Worcester Polytechnic Institute. |
Dash3 Instruction Manual, 2010, Race Technology Limited [On-line], Retrieved from the Internet: http://www.race-technology.com. |
Dash4 Pro, 2011, Race Technology Ltd. [On-line], Retrieved from the Internet: http://www.race-technology.com/dash4_pro_2_31014.html. |
Doug Newcomb, Cool iPhone Car Applications, Nov. 20, 2008 [On-line], Retrieved from the Internet: http://edmunds.com. |
DragTimes.com Density Altitude, DragTimes, Sep. 1, 2013 <https://play.google.com/store/apps/details?id=com.DragTimes&feature=search_result>. |
DynoStorm, 2009, BunsenTech, LLC [On-line], Retrieved from the Internet: http://www.bunsentech.com/projects/dynostorm/. |
Everywhere Navigation: Integrated Solutions on Consumer Mobile Devices, Naser El-Sheimy et al., Inside GNSS, pp. 74-82, Oct. 2011. |
Fleet Management Features, 2011, RedTail Telematics [On-line], Retrieved from the Internet: http://www.redtailtelematics.com/fleet-management/features/. |
GForce, 2011 [On-line], Retrieved from the Internet: http://gadgitech.com/uk/IPhone/Applications/GForce.html. |
Giuseppe Ghiani, et al., Multimodal PDA Interfaces to Assist Drivers in Monitoring Their Vehicles, ISTI-CNR. |
Glossary III: Rise of the Smartphones, Scott McCormick, May 12, 2011 <http://floatlearning.com/2011/05/glossary-iii-rise-of-the-smartphones/>. |
GMeter, 2008 [On-line], Retrieved from the Internet: http://hunter.pairsite.com/gmeter/. |
iHUD, an aerospace-inspired spatial motion visualization on the iPhone 3G and 3GS, and iPad [On-line], [attached copy retrieved on Apr. 14, 2011], Retrieved from the Internet: http://www.i-hud.com/.
Insurance, 2011, Webtech Wireless [On-line], Retrieved from the Internet: http://www.wtwmail.com/en/industry_solutions/insurance/.
K.A.T. Matrix 3-Axis Accelerometer (Car Performance Meter), 2011 [On-line], Retrieved from the Internet: http://www.amazon.com.
Maciag, A. K. (1980). Motor accident insurance and systems of compensation. (Order No. MK49023, University of Alberta (Canada)). ProQuest Dissertations and Theses, 1. Retrieved from <http://search.proquest.com/docview/303097892?accountid=14753>.
Race Technology Knowledge Base, 2008 [On-line], Retrieved from the Internet: http://www.race-technology.com/wiki/index.php/AnalysisTools.
Released—GReddy iPhone and iPod App, Jun. 28, 2010, The Octane Report [On-line], Retrieved from the Internet: http://www.octanereport.com.
Rev User Manual, Nov. 9, 2009, DevToaster, LLC [On-line], Retrieved from the Internet: http://www.devtoaster.com.
Solution: Fleet Performance, 2009, Cadec Global Inc. [On-line], Retrieved from the Internet: http://www.cadec.com/solutions/executiveDashboards.php.
Spevacek, C. E., Ledwith, J. F., Newman, T. R., & Lennes, John B., Jr. (2001). Additional insured and indemnification issues affecting the insurance industry, coverage counsel, and defense counsel—legal advice and practice pointers. FDCC Quarterly, 52(1), 3-101. Retrieved from <http://search.proquest.com/docview/201215466?accountid=14753>.
Technical Plan, Harker Innovation Team [On-line], Retrieved from the Internet: http://fuelourfuturenow.discoveryeducation.com/pdfs/dash-plus/Harker_Plan.pdf.
Vehicle Performance Computer Owner's Manual, 2011. Beltronics [On-line], Retrieved from the Internet: http://www.beltronics.com.
Vehicle productivity, security & safety, 2010, Acadian Companies [On-line], Retrieved from the Internet: http://www.acadian.com/site598.php.
Verma, M., R. Lange, and D. McGarry. "A Study of US Crash Statistics from Automated Notification Data." In 20th International Technical Conference on the Enhanced Safety of Vehicles (ESV). Lyon, France, pp. 18-21. 2007.
Vitalijs Lennojs, aGile Dashboard, Dec. 19, 2008 [On-line], Retrieved from the Internet: http://iphoneapplicationlist.com/app/id300133977/.
What Can You Do With a Barometer, Joe Levi, Pocketnow, Oct. 19, 2011, <http://pocketnow.com/android/what-can-you-do-with-a-barometer-on-a-smartphone>.
Your Resource Highway to Driver Safety, 2011, GeoPoint Partners, LLC [On-line], Retrieved from the Internet: http://www.geopointpartners.com/.
May 21, 2020—U.S. Non-Final Office Action—U.S. Appl. No. 15/271,812. |
Sep. 16, 2020—U.S. Notice of Allowance—U.S. Appl. No. 15/271,834. |
Harley, Aurora, “Ensure High Contrast for Text Over Images by Harley,” Nielsen Norman Group, Oct. 15, 2015, retrieved from https://www.nngroup.com/articles/text-over-images, 14 pages. |
Oct. 3, 2019—U.S. Non-Final Office Action—U.S. Appl. No. 15/271,812. |
"Automated Collision Notification (ACN) Field Operational Test (FOT) Evaluation Report." L.R. Bachman et al., NHTSA. Feb. 2001.
"Automated Collision Notification (ACN) Field Operational Test-Final Report (FOT)." D. Funke et al., NHTSA. Oct. 31, 2000.
"Automatic Crash Notification." ComCARE Alliance. Retrieved from <http://www.nhtsa.gov/DOT/NHTSA/NRD/Articles/ERD/PDF/Research/COMCARE_ACN_System.pdf> on Nov. 12, 2013.
"Automatic Crash Response, Car Safety, & Emergency Services-OnStar." OnStar, retrieved from <http://www.onstar.onstar.com/web/portal/emergencyexplore?tab=1&g=1> on Jan. 12, 2013.
"Automatic License Plate Recognition (ALPR) Scanning Systems." Experienced Criminal Lawyers, Get Lawyer Leads, Inc., Retrieved from http://www.experiencedcriminallawyers.com/articles/automatic-license-plate-recognition-alpr-scanning-systems on Jun. 28, 2013.
"Bump (application)." Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Bump_(application) on Aug. 29, 2013.
"Car insurance firms revving up mobile app features." Mark Chalon Smith, Insurance.com, Feb. 6, 2012. Retrieved from http://www.insurance.com/auto-insurance/auto-insurance-basics/car-insurance-mobile-apps.html on Jun. 28, 2013.
"Course Notebook." Jeremy S. Daily, ME 4024: Machine Dynamics, University of Tulsa; Spring 2013.
"Design and Development of a GSM Based Vehicle Theft Control System and Accident Detection by Wireless Sensor Network." V.B. Gopala Krishna et al., International Journal of Emerging Trends in Engineering and Development, Issue 2, vol. 5, pp. 529-540. Jul. 2012.
"Design and implementation of a smart card based healthcare information system." Geylani Kardas et al., Computer Methods and Programs in Biomedicine 81. pp. 66-78. Sep. 27, 2003.
"Encrypted QR Codes." qrworld. Nov. 27, 2011. Retrieved from http://qrworld.wordpress.com/2011/11/27/encrypted-qr-codes on Nov. 12, 2013.
"Fall Detection with Three-Axis Accelerometer and Magnetometer in a Smartphone." Soo-Young Hwang et al., National University, Korea, retrieved on Apr. 15, 2015.
"Filing an Auto Claim", Rocky Mountain Insurance, 3 pages (Year: 2009).
"Financial Rights Legal Centre: Making a Claim on Your Car Insurance," retrieved from www.financialrights.org.au, 8 pages.
"For insurance companies, the day of digital reckoning." Henrik Naujoks et al., Bain & Company. 2013.
"Fraunhofer offers secure NFC keys that can be shared via QR codes." Karl Dryer, NFC World. Mar. 20, 2013. Retrieved from http://www.nfcworld.com/2013/03/20/323184/fraunhofer-offers-secure-nfc-keys-that-can-be-shared-via-qr-codes on Nov. 13, 2013.
"G-tac." Liberty for One, retrieved from <http://apps.libertyforone.com/g-tac/> on Jun. 17, 2015.
"Geico App-Android Apps on Google Play." GEICO. Retrieved from <http://play.google.com/store/apps/details?id=com.geico.mobile&hl=en> on Nov. 12, 2013.
"Information-Sharing in Out-of-Hospital Disaster Response: The Future Role of Information Technology." Jeffrey L. Arnold et al., Abstracts of Prehospital and Disaster Medicine. Retrieved from http://journals.cambridge.org/action/displayAbstract?fromPage=online&aid=8231246 on May 20, 2013.
"Insurance Claim Manager App," retrieved Jun. 3, 2016, from https://www.snappii.com/resource-center/snappii-insurance-claims-manager-app/, 4 pages.
"Insurance Tech Trends 2013: Elements of postdigital." Mark White et al., Deloitte Development LLC. 2013.
"Introducing the Octagon Insurance Claims App," retrieved Jun. 3, 2016 from http://www.octagoninsurance.com/insurance-claim/octagon-insurance-mobile-claims-app., 3 pages.
"Liberty Mutual Mobile App: Connecting you to Liberty Mutual on the go," retrieved Jun. 3, 2016 from https://www.libertymutual.com/liberty-mutual-mobile/mobile-app, 4 pages.
"License plate readers allow police to quickly scan, check for offenders." Ann Marie Bush, The Capital-Journal, Mar. 17, 2013, Retrieved from http://cjonline.com/news/2013-03-17/license-plate-readers-allow-police-quickly-scan-check-offenders on Jun. 28, 2013.
"License Plate Scanner Obsoletes Meter Maid." Bertel Schmitt, The Truth About Cars. Feb. 1, 2011. Retrieved from http://www.thetruthaboutcars.com/2011/02/license-plate-scanner-obsoletes-meter-maid on Jun. 28, 2013.
"Mercedes-Benz mbrace™." Mercedes-Benz, Oct. 22, 2010.
"Microsoft Tag Implementation Guide: Practical requirements and recommendations to ensure successful Tag production." Microsoft Tag. Aug. 2010.
"Near Field Communication: A Simple Exchange of Information." Samsung. Mar. 5, 2013. Retrieved from http://www.samsung.com/us/article/near-held-communication-a-simple-exchange-of-information on May 21, 2013.
"Nericell: Rich Monitoring of Road and Traffic Conditions using Mobile Smartphones." Prashanth Mohan et al., Microsoft Research India, Bangalore, ACM, 2008.
"New Idea: QR Codes for License Plates." Andrew Maxwell, Heka Interactive. Feb. 11, 2011. Retrieved from http://www.andrewcmaxwell.com/2011/02/new-idea-qr-codes-for-license-plates on May 21, 2013.
"New Technology Security Risks: QR codes and Near Field Communication." Charlotte Gray. Retrieved from http://www.qwiktag.com/index.php/knowledge-base/150-technology-security-risks-qr-codes on Nov. 13, 2013.
"Portable Automatic Conjecturing and Announcing System for Real-Time Accident Detection." C.F. Lai et al., International Journal on Smart Sensing and Intelligent Systems, vol. 2(9), Jun. 2009.
"Privacy Policy." Lemon Wallet. Retrieved from http://lemon.com/privacy on May 20, 2013.
"Providing Accident Detection in Vehicular Networks through OBD-II Devices and Android-based Smart Phones." M. Narsing Rao et al., International Journal of Science and Research (ISSN: 2319-7064), vol. 2(9), Sep. 2013.
"QR Code." IDL Services. Retrieved from http://www.internationaler-fuehrerschein.com/en/the-idd/qr-code-quick-response-code-feature-in-the-idd.html on May 21, 2013.
"Safe Driving and Accidental Monitoring Using GPS System and Three Axis Accelerometer." R. Goregaonkar et al., International Journal of Emerging Technology and Advanced Engineering, vol. 3(11), Nov. 2013.
"Scan Someone's License Plate and Message Them Instantly with New Bump App." Rebecca Boyle, Popular Science, Sep. 17, 2010. Retrieved from http://www.popsci.com/cars/article/2010-09/social-networking-site-uses-license-plates-connect-drivers on Jun. 28, 2013.
"SmoothDrive" app, CelluDrive Ltd., May 11, 2011, <http://www.celludrive.com/ptasite/home.htm>.
"Snooper UK Store—Buy Direct from the Manufacturer." Snooper, retrieved from <http://snooper.uk/products/snooper-lynx-app/index.html> on Apr. 15, 2015.
"Speed-Breaker Early Warning System." Mohit Jain et al., retrieved on Apr. 15, 2015.
"Taking Advantage of the Pre-Claim Assistance Provision in your Professional Liability Policy," retrieved Jun. 3, 2016 from http://www.sugarmanlaw.com/News-Articles/ID/33/Taking-Advantage-of-the-Pre-Claim-Assistance-Provision_in_your-Professional_Liability_Policy, 2 pages.
"The Automated Collision Notification System." Bruce R. Donnelly et al., NHTSA. Retrieved from <http://nhtsa.gov/DOT/NHTSA/NRD/Articles/EDR/PDF/Research/Automated_Collision_Notification_System_PDF> on Nov. 12, 2013.
"The driving quality app: Product Description." DriSMo, retrieved from <http://hovedprosjekter.hig.no/v2011/imt/in/drismo/index.php?option=com_content&view=article&id=5&Itemid=3> on Apr. 15, 2015.
"The Potential for Automatic Crash Notification Systems to Reduce Road Fatalities." Julie A. Lahausse et al., Annals of Advances in Automotive Medicine, vol. 52, pp. 85-92. 2008. (retrieved from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3256762/ on Jan. 12, 2013).
"This App Turns Smartphones Into Safe Driving Tools." Kate Freeman, Mashable. Aug. 30, 2012. Retrieved from <http://mashable.com/2012/08/30/drivescribe-app-safe-driving> on Nov. 12, 2013.
"Top 10 Technology Trends Impacting Life and P&C Insurers in 2013." Juergen Weiss et al., Gartner. Mar. 27, 2013.
"Trends 2013—North American Insurance eBusiness and Channel Strategy." Ellen Carney, Forrester. May 16, 2013.
"Using Smartphones and Wireless Mobile Sensor Networks to Detect Car Accidents and Provide Situational Awareness to Emergency Responders." Christopher Thompson et al., Vanderbilt University; retrieved Dec. 22, 2014.
"Using Smartphones to Detect Car Accidents and Provide Situational Awareness to First Responders." Christopher Thompson, Institute for Software Integrated Systems, Vanderbilt University; presented at the Third International ICST Conference on Mobile Wireless Middleware, Operating Systems, and Applications; retrieved Dec. 22, 2014.
"Vehicle Damage Claims," retrieved Jun. 3, 2016 from https://www.statefarm.com/claims/resources/auto/vehicle-damage, 2 pages.
Jul. 7, 2020—U.S. Final Office Action—U.S. Appl. No. 15/271,834. |
Oct. 28, 2020—U.S. Non-Final Office Action—U.S. Appl. No. 16/848,196. |
Nov. 10, 2020—U.S. Final Office Action—U.S. Appl. No. 15/271,812. |
Number | Date | Country
---|---|---
20190156594 A1 | May 2019 | US
| Number | Date | Country
---|---|---|---
Parent | 16106455 | Aug 2018 | US
Child | 16255264 | | US
Parent | 15880187 | Jan 2018 | US
Child | 16106455 | | US
Parent | 15665710 | Aug 2017 | US
Child | 15880187 | | US
Parent | 14685067 | Apr 2015 | US
Child | 15665710 | | US