The present application relates generally to insurance and, more specifically, to systems and methods for collecting and processing vehicle image and sound data for insurance rating or underwriting purposes.
In vehicle insurance, insurance providers generally seek to determine an insurance policy premium for a vehicle based on the risk of losses associated with the condition of the vehicle. For purposes of making this determination, it is well understood that various vehicle conditions can influence the probability that the vehicle will experience a loss that is recognizable under the policy. For example, mileage accrued on the vehicle can affect the overall operating condition of the vehicle. As such, vehicles with less mileage or that are driven less generally have a lower risk of loss, and therefore may be offered lower premiums for a given level of coverage. Conversely, vehicles with high mileage or that are driven often correspond to a higher risk of loss, and therefore may be offered higher premiums for the same level of coverage.
Currently, insurance providers have limited or inadequate access to information regarding vehicle condition. Most of the information is gathered through questionnaires provided to prospective policy holders who own or operate the vehicle. However, responses obtained from the questionnaires may not always be accurate or complete. Thus, the determined insurance policy premium for the vehicle may be poorly correlated with the actual risk of losses associated with the conditions of the vehicle.
The features and advantages described in this summary and the following detailed description are not all-inclusive. Many additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims hereof. Additionally, other embodiments may omit one or more (or all) of the features and advantages described in this summary.
A computer-implemented method for providing vehicle insurance may include receiving, via one or more processors, data associated with a vehicle. The data may be captured by a sensor of a computing device and may include image data recorded by a photo sensor of the computing device and sound data recorded by an audio sensor of the computing device. The method may also compare, via one or more processors, the received data to stored baseline vehicle data to determine an operating condition of the vehicle, the stored baseline vehicle data including a baseline vehicle image or a baseline engine sound model. Further, the method may identify, via one or more processors, a risk of loss for the vehicle based on the determined operating condition of the vehicle, where the risk of loss includes the image data deviating from the baseline vehicle image by a threshold amount or the sound data recorded by the audio sensor of the computing device deviating from the baseline engine sound model by a threshold amount. The method may then determine, via one or more processors, an insurance premium for the vehicle based at least in part on the identified risk of loss. Finally, the method may provide, via one or more processors, the determined insurance premium to a user.
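The threshold-deviation logic of this method can be sketched in a few lines. This is a minimal illustration only; the function names, the surcharge factor, and both threshold values are assumptions, not part of the specification.

```python
# Hypothetical thresholds: not specified in the disclosure.
IMAGE_THRESHOLD = 0.25   # assumed fractional deviation from the baseline vehicle image
SOUND_THRESHOLD = 200.0  # assumed deviation in Hz from the baseline engine sound model

def identify_risk_of_loss(image_deviation, sound_deviation_hz):
    """Flag a risk of loss when either the image data or the sound data
    deviates from its baseline by more than a threshold amount."""
    return (image_deviation > IMAGE_THRESHOLD
            or abs(sound_deviation_hz) > SOUND_THRESHOLD)

def determine_premium(base_premium, high_risk):
    """Apply an illustrative surcharge when a risk of loss was identified."""
    return base_premium * 1.5 if high_risk else base_premium
```

For example, a 300 Hz deviation in the recorded engine sound would exceed the assumed 200 Hz threshold, so `determine_premium(1000.0, identify_risk_of_loss(0.1, 300.0))` would yield the surcharged premium.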
A non-transitory computer-readable storage medium may include computer-readable instructions to be executed on one or more processors of a system for providing vehicle insurance. The instructions, when executed, may cause the one or more processors to receive data associated with a vehicle. The data may be captured by a sensor of a computing device. The instructions, when executed, may also cause the one or more processors to compare the received data to stored baseline vehicle data to determine an operating condition of the vehicle. Further, the instructions, when executed, may cause the one or more processors to identify a risk of loss for the vehicle based on the determined operating condition of the vehicle. The instructions, when executed, may then cause the one or more processors to determine an insurance premium for the vehicle based at least in part on the identified risk of loss. Finally, the instructions, when executed, may cause the one or more processors to provide the determined insurance premium to a user.
A computer system for providing vehicle insurance may comprise a vehicle data repository and an insurance server that includes a memory having instructions for execution on one or more processors. The instructions, when executed by the one or more processors, may cause the insurance server to retrieve, via a network connection, sound data associated with a vehicle from the vehicle data repository. The instructions, when executed by the one or more processors, may also cause the insurance server to retrieve, via a network connection, baseline engine sound model data from the vehicle data repository. Further, the instructions, when executed by the one or more processors, may cause the insurance server to compare the sound data associated with the vehicle with the baseline engine sound model to determine an operating condition of the vehicle. The instructions, when executed by the one or more processors, may cause the insurance server to identify a risk of loss for the vehicle based on the determined operating condition of the vehicle, wherein the risk of loss includes the sound data deviating from the baseline engine sound model by a threshold amount. The instructions, when executed by the one or more processors, may then cause the insurance server to determine an insurance premium for the vehicle based at least in part on the identified risk of loss. Finally, the instructions, when executed by the one or more processors, may cause the insurance server to provide, via a network connection, the determined insurance premium to a user.
The figures depict a preferred embodiment of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
Generally speaking, the disclosed system collects and analyzes image and sound data associated with the conditions of a vehicle in order to provide a policy holder with vehicle insurance ratings. As described herein, the conditions of the vehicle may correspond to the vehicle mileage, engine sounds, appearance, etc.
Generally, a policy holder (e.g., a current or potential policy holder) may operate the one or more sensors 114 on the computing device 102 to collect or capture image and sound data associated with the conditions of the vehicle. For example, the one or more sensors 114 may include an imaging sensor (e.g., a camera, a video recorder, etc.) that the policy holder may operate to capture images and/or videos of the vehicle. As another example, the one or more sensors 114 may include an audio sensor (e.g., a microphone) that the policy holder may operate to record sounds related to the vehicle such as engine sounds while the vehicle is running.
Once image and sound data associated with the conditions of the vehicle is collected or captured by the one or more sensors 114, the processor 108 may cause the data to be stored in the memory 110 before being transmitted to the insurance server 104 via the network 106. As such, the memory 110 may store captured images and/or videos as vehicle image data 110A, and recorded sounds as vehicle sound data 110B.
Additionally, the memory 110 may store vehicle owner or operator data 110C related to the owner or operators of the vehicle (e.g., the policy holder). For example, the owner or operators may input personal information (e.g., age, gender, home address, previous insurance claims, etc.) via, for example, the user interface 112. The processor 108 may then cause the personal information to be stored as the vehicle owner or operator data 110C in the memory 110. In some embodiments, the processor 108 may cause the gathered vehicle image data 110A, vehicle sound data 110B, and vehicle owner or operator data 110C to be transmitted directly to the insurance server 104 via the network 106.
In some embodiments, some or all of the vehicle image data 110A, vehicle sound data 110B, and vehicle owner or operator data 110C may be sent to the insurance server 104 via a third party. For example, a server of a vehicle data provider (not shown in
With continued reference to
A processor 104A of the insurance server 104 may execute instructions stored in a memory 104B of the insurance server 104 to retrieve data stored in the vehicle data repository 120. The insurance server 104 may operate directly on the data 120A-120C provided in the vehicle data repository 120, or may operate on other data that is generated based on the data 120A-120C. For example, the insurance server 104 may convert any or all of the data 120A-120C to a particular format (e.g., for efficient storage), and later utilize the modified data. In some embodiments, the vehicle data repository 120 may not be directly coupled to the insurance server 104, but instead may be accessible by the insurance server 104 via a network such as the network 106.
The insurance server 104 may be configured to provide insurance ratings for the vehicle based on image and sound data associated with the condition of the vehicle. To this end, the insurance server 104 may compare the data 120A-120C with the one or more vehicle condition models 122A and vehicle owner or operator statistics 122B or other data to determine a risk rating or a parameter corresponding to a risk rating. The sound data may also be used to identify the beginning and ending of a trip using the vehicle. For example, an audio device may listen for a particular engine sound for the vehicle (i.e., an audio fingerprint for the vehicle), and the functions described herein may determine when a trip has started and when the trip has ended. The vehicle fingerprint sound could also trigger collection of other usage-based data such as miles driven, speed, time of day, geographic location, etc., or be used to distinguish a trip in Vehicle A from a trip in Vehicle B. Some example embodiments and scenarios are described here for illustration purposes.
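The trip-detection idea above, in which trips begin and end when the vehicle's audio fingerprint appears and disappears, can be sketched as follows. Real fingerprint matching would involve spectral analysis; here it is reduced to simple label equality, and all names are hypothetical.

```python
# Hypothetical sketch: detect trip intervals from a stream of
# (timestamp, sound_label) samples, where a sample matching the
# vehicle's audio fingerprint indicates the engine is running.

def detect_trips(samples, fingerprint):
    """Return (start, end) pairs for intervals where the vehicle's
    engine fingerprint is heard in a time-ordered sample list."""
    trips, start = [], None
    for ts, label in samples:
        if label == fingerprint and start is None:
            start = ts                 # fingerprint appears: trip begins
        elif label != fingerprint and start is not None:
            trips.append((start, ts))  # fingerprint stops: trip ends
            start = None
    if start is not None:              # stream ended mid-trip
        trips.append((start, samples[-1][0]))
    return trips
```

Because each interval is attributed to whichever fingerprint matched, the same stream could distinguish a trip in Vehicle A from a trip in Vehicle B by running the detector once per known fingerprint.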
In an example embodiment, a condition of the vehicle may correspond to the vehicle mileage. For example, the vehicle image data 120A in the repository 120 may specify odometer reading information. As such, the insurance server 104 may compare the odometer information in the data 120A to a vehicle odometer model (e.g., stored as one of the vehicle condition models 122) in the repository 120. The vehicle odometer model may identify one or more ranges of vehicle mileage, where each range corresponds to an indicator of loss likelihood. Thus, the insurance server 104 may determine a risk indicator by matching the odometer information in the data 120A to one of the vehicle mileage ranges in the vehicle odometer model. The insurance server 104 may then determine an insurance premium for the vehicle based at least in part on the determined risk indicator.
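The range-matching step of this embodiment can be sketched as a simple table lookup. The mileage bands and indicator labels below are assumptions for illustration; the specification does not define specific ranges.

```python
# Hypothetical vehicle odometer model: mileage ranges mapped to
# indicators of loss likelihood (bands and labels are assumed).
ODOMETER_MODEL = [
    (0, 30_000, "low"),
    (30_000, 90_000, "medium"),
    (90_000, float("inf"), "high"),
]

def risk_indicator(mileage):
    """Match an odometer reading to the mileage range it falls in
    and return that range's loss-likelihood indicator."""
    for low, high, indicator in ODOMETER_MODEL:
        if low <= mileage < high:
            return indicator
    raise ValueError("mileage must be non-negative")
```

The insurance server would then feed the returned indicator into its premium calculation.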
In another example embodiment, a condition of the vehicle may correspond to the vehicle engine sound. For example, the vehicle sound data 120B in the repository 120 may specify engine sound information. As such, the insurance server 104 may compare the engine sound information in the data 120B to a baseline engine sound model (e.g., stored as one of the vehicle condition models 122) in the repository 120. The baseline engine sound model may indicate that certain engine sounds correspond to particular operating conditions or states of the vehicle that can lead to a high risk of loss, such as states of disrepair or indications of inappropriate or illegal modifications to the vehicle engine, when the sound data recorded by the audio sensor of the computing device deviates from the baseline engine sound model by a threshold amount. For example, where an exhaust note of a normally operating vehicle at idle may include a frequency of 600 Hz, the recorded sound data may indicate a low frequency of 300 Hz or a high frequency of 1000 Hz. Either the low or the high frequency may indicate disrepair or other engine states that correspond to a high risk of loss. Thus, by comparing the engine sound information in the data 120B to the baseline engine sound model, the insurance server 104 may determine an appropriate risk of loss. Based at least in part on the identified risk of loss, the insurance server 104 may then determine an appropriate insurance premium for the vehicle.
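The 600 Hz idle example above reduces to a single deviation check. The tolerance value here is an assumption; the specification only requires that the deviation exceed some threshold amount.

```python
# Sketch of the idle-frequency example: a normally operating vehicle
# idles near 600 Hz, and deviations beyond a threshold flag risk.
BASELINE_IDLE_HZ = 600.0
TOLERANCE_HZ = 150.0  # hypothetical allowed deviation

def engine_sound_risky(measured_hz):
    """Return True when the measured idle frequency deviates from the
    baseline engine sound model by more than the threshold amount."""
    return abs(measured_hz - BASELINE_IDLE_HZ) > TOLERANCE_HZ
```

Under these assumed values, both the 300 Hz and 1000 Hz readings from the example would be flagged, while a reading near 600 Hz would not.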
In some embodiments, the insurance server 104 may use other conditions of the vehicle (e.g., paint conditions, tire conditions, window conditions, interior conditions, dents, scratches, or other vehicle defects) for vehicle insurance rating purposes.
Further, the insurance server 104 may utilize the vehicle owner or operator data 120C to provide insurance ratings for the vehicle. In an example embodiment, the vehicle owner or operator data 120C may specify information such as the age, gender, and marital status of the vehicle owner or operator, the home address or the neighborhood in which the vehicle owner or operator resides, how the vehicle is used, how far the vehicle owner or operator drives every day and where the vehicle is driven, how many miles are driven per day, week, month, etc., how many speeding tickets the vehicle owner or operator has, how many accidents the vehicle owner or operator has been involved in, how many vehicle insurance claims the vehicle owner or operator has filed, and other relevant information. The insurance server 104 may compare any or all of the information in the data 120C to the vehicle owner or operator statistics 122B in the repository 120. For example, the vehicle owner or operator statistics 122B may indicate a high risk of loss for the vehicle if the vehicle owner or operator lives in a dangerous neighborhood. In another example, the vehicle owner or operator statistics 122B may indicate a higher risk of loss for the vehicle if the vehicle owner or operator has been given speeding tickets or has been involved in accidents in the past. Conversely, the vehicle owner or operator statistics 122B may indicate a lower risk of loss if the vehicle owner or operator has a clean driving record. Thus, by comparing the information in the data 120C to the vehicle owner or operator statistics 122B, the insurance server 104 may further refine or modify the risk of loss associated with the vehicle in order to provide more accurate vehicle insurance ratings.
It is understood that the above examples are not exclusive, and that more than one such embodiment may coexist within a single system.
Moreover, the insurance server 104 may be configured to provide renewals, updates, and/or adjustments of an existing insurance premium. To do so, the insurance server 104 may perform an audit, where the policy holder submits new or updated information (e.g., new image and sound data) regarding the conditions of the vehicle and/or the owner or operators of the vehicle. Using the new or updated information, the insurance server 104 may calculate a new or updated risk rating to determine or update the existing insurance premium.
In an example embodiment, the insurance server 104 may determine an insurance premium for a vehicle based at least in part on a mileage estimation provided by the policy holder. The mileage estimation may specify that the policy holder will only accrue a certain amount of mileage on the vehicle over a certain period of time covered by the insurance premium. At renewal time, the insurance server 104 may audit the policy holder to determine the actual mileage accrued on the vehicle. If the actual mileage is within the mileage estimation, then the insurance server 104 may renew the existing insurance premium for the vehicle. However, if the actual mileage is much greater than the mileage estimation, then the insurance server 104 may calculate a new risk rating and determine a new insurance premium for the vehicle.
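The renewal audit described above can be sketched as a simple comparison of actual against estimated mileage. The per-mile surcharge and its rate are hypothetical; the specification only says a new risk rating and premium would be calculated when the estimate is exceeded.

```python
# Hypothetical renewal audit: renew the existing premium when the
# mileage estimate held; otherwise recompute with an assumed
# per-mile surcharge standing in for a full re-rating.

def audit_renewal(estimated_miles, actual_miles, current_premium,
                  surcharge_per_mile=0.05):
    """Return the renewed premium if actual mileage is within the
    estimate, or a surcharged premium for the excess mileage."""
    if actual_miles <= estimated_miles:
        return current_premium                 # renew as-is
    excess = actual_miles - estimated_miles
    return current_premium + excess * surcharge_per_mile
```

A policy estimated at 10,000 miles and driven 12,000 would, under these assumed values, pick up a surcharge on the 2,000 excess miles rather than simply renewing.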
In some embodiments, the policy holder may be requested to use a link to connect with a professional agent to perform the auditing process. For example, using a video chat service, the professional agent may inspect the vehicle and submit information regarding the conditions of the vehicle to the insurance server 104.
In the embodiment of
In addition, the user interface 200 may include a vehicle information input field 208, which allows the policy holder to enter information about the vehicle as well as the owner or operators of the vehicle (which may be the policy holder himself or herself). In the embodiment of
Once the policy holder has entered or selected all the necessary information in the input fields 204, 206 and 208, the policy holder may execute the button 210 to submit the information to the insurance server 104. For example, the computing device 102 may transmit the information to the insurance server 104 for storage and processing via a network connection such as the network 106.
Referring now to
The method 300 begins by receiving image and sound data associated with a vehicle (block 302). For example, the method 300 may receive a captured image and/or a video of the vehicle odometer reading as the image data, and a recorded sound of the vehicle engine as the sound data.
Next, the method 300 analyzes the received image and sound data by comparing the received image and sound data to stored baseline vehicle data in order to determine an operating condition of the vehicle (block 304). For example, the method 300 may compare mileage information in the odometer reading (as received in the image data or as entered into the user interface 200) to a set of mileage ranges that are part of the stored baseline vehicle data. From the comparison, the method 300 may determine how well the vehicle can operate given the mileage that is already accrued on the vehicle. Block 304 may also analyze the sound data to identify the beginning and ending of a trip using the vehicle. Block 304 may also trigger collection of other usage-based data such as miles driven, speed, time of day, geographic location, etc.
The method 300 then identifies a risk of loss for the vehicle based at least in part on the determined operating condition of the vehicle (block 306). Continuing with the above example, the method 300 may determine in the block 304 that the vehicle has a high mileage. Accordingly, the method 300 may identify a high risk of loss associated with the vehicle.
Finally, the method 300 may determine an insurance premium (or provide an indication of the determined insurance premium) for the vehicle based at least in part on the identified risk of loss (block 308). The method 300 may provide (e.g., display or communicate) the insurance premium to a policy holder via a display screen (e.g., the user interfaces 112 of
In some embodiments, the method 300 may include additional functionalities not shown in
In some embodiments, the method 300 may include additional blocks not shown in
Using the system (e.g., 100), user interface (e.g., 200), and method (e.g., 300) described herein, insurance ratings may be provided based on vehicle image and sound data.
As shown in
The processor 402 of
The system memory 414 may include any desired type of volatile and/or non-volatile memory such as, for example, static random access memory (SRAM), dynamic random access memory (DRAM), flash memory, read-only memory (ROM), etc. The mass storage memory 416 may include any desired type of mass storage device. For example, if the computing device 401 is used to implement an application 418 having an API 419 (including functions and instructions as described by the method 300 of
The peripheral I/O controller 410 performs functions that enable the processor 402 to communicate with peripheral input/output (I/O) devices 422 and 424, a network interface 426, a local network transceiver 427, a cellular network transceiver 428, and a GPS transceiver 429 via the network interface 426. The I/O devices 422 and 424 may be any desired type of I/O device such as, for example, a keyboard, a display (e.g., a liquid crystal display (LCD), a cathode ray tube (CRT) display, etc.), a navigation device (e.g., a mouse, a trackball, a capacitive touch pad, a joystick, etc.), etc. The cellular network transceiver 428 may be resident with the local network transceiver 427. The local network transceiver 427 may include support for a Wi-Fi network, Bluetooth, Infrared, or other wireless data transmission protocols. In other embodiments, one element may simultaneously support each of the various wireless protocols employed by the computing device 401. For example, a software-defined radio may be able to support multiple protocols via downloadable instructions. In operation, the computing device 401 may poll for visible wireless network transmitters (both cellular and local network) on a periodic basis. Such polling may be possible even while normal wireless traffic is being supported on the computing device 401. The network interface 426 may be, for example, an Ethernet device, an asynchronous transfer mode (ATM) device, an 802.11 wireless interface device, a DSL modem, a cable modem, a cellular modem, etc., that enables the system 100 to communicate with another computer system having at least the elements described in relation to the system 100.
While the memory controller 412 and the I/O controller 410 are depicted in
The system 400 may include but is not limited to any combination of a LAN, a MAN, a WAN, a mobile, a wired or wireless network, a private network, or a virtual private network. Moreover, while only two remote computing devices 430 and 432 are illustrated in
Additionally, certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code or instructions embodied on a machine-readable medium or in a transmission signal, wherein the code is executed by a processor) or hardware modules. A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
Similarly, the methods or routines described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.
The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., application program interfaces (APIs)).
The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
Some portions of this specification are presented in terms of algorithms or symbolic representations of operations on data stored as bits or binary digital signals within a machine memory (e.g., a computer memory). These algorithms or symbolic representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. As used herein, an “algorithm” is a self-consistent sequence of operations or similar processing leading to a desired result. In this context, algorithms and operations involve physical manipulation of physical quantities. Typically, but not necessarily, such quantities may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine. It is convenient at times, principally for reasons of common usage, to refer to such signals using words such as “data,” “content,” “bits,” “values,” “elements,” “symbols,” “characters,” “terms,” “numbers,” “numerals,” or the like. These words, however, are merely convenient labels and are to be associated with appropriate physical quantities.
Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information.
As used herein any reference to “some embodiments” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in some embodiments” in various places in the specification are not necessarily all referring to the same embodiment.
Some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. For example, some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. The embodiments are not limited in this context.
Further, the figures depict preferred embodiments of a system for providing insurance ratings based on vehicle image and sound data for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.
Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs for a system and a process for providing insurance ratings based on vehicle image and sound data through the disclosed principles herein. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation and details of the method and apparatus disclosed herein without departing from the spirit and scope defined in the appended claims.
This application is a continuation of U.S. application Ser. No. 14/203,344, filed Mar. 10, 2014, which claims the benefit of U.S. Provisional Application No. 61/775,652, filed Mar. 10, 2013; both of which are hereby incorporated by reference in their entirety.
8606514 | Rowley et al. | Dec 2013 | B2 |
8612139 | Wang et al. | Dec 2013 | B2 |
8630768 | McClellan et al. | Jan 2014 | B2 |
8635091 | Amigo et al. | Jan 2014 | B2 |
8655544 | Fletcher et al. | Feb 2014 | B2 |
8682699 | Collins et al. | Mar 2014 | B2 |
8686844 | Wine | Apr 2014 | B1 |
8725408 | Hochkirchen et al. | May 2014 | B2 |
8731768 | Fernandes et al. | May 2014 | B2 |
8799035 | Coleman et al. | Aug 2014 | B2 |
8799036 | Christensen et al. | Aug 2014 | B1 |
8812330 | Cripe et al. | Aug 2014 | B1 |
8874477 | Hoffberg | Oct 2014 | B2 |
8892451 | Everett et al. | Nov 2014 | B2 |
8935036 | Christensen et al. | Jan 2015 | B1 |
8983677 | Wright | Mar 2015 | B2 |
8989914 | Nemat-Nasser et al. | Mar 2015 | B1 |
9008956 | Hyde et al. | Apr 2015 | B2 |
9031545 | Srey et al. | May 2015 | B1 |
9053469 | Bohanek et al. | Jun 2015 | B1 |
9098367 | Ricci | Aug 2015 | B2 |
9105066 | Gay et al. | Aug 2015 | B2 |
9141582 | Brinkmann et al. | Sep 2015 | B1 |
9141995 | Brinkmann et al. | Sep 2015 | B1 |
9141996 | Christensen et al. | Sep 2015 | B2 |
9164957 | Hassib | Oct 2015 | B2 |
9183441 | Blumer et al. | Nov 2015 | B2 |
9208525 | Hayward et al. | Dec 2015 | B2 |
9221428 | Kote et al. | Dec 2015 | B2 |
9256991 | Crawford | Feb 2016 | B2 |
9377528 | Birken | Jun 2016 | B2 |
9418383 | Hayward et al. | Aug 2016 | B1 |
9454786 | Srey et al. | Sep 2016 | B1 |
9734537 | Christensen | Aug 2017 | B2 |
20010044733 | Lee et al. | Nov 2001 | A1 |
20020026394 | Savage et al. | Feb 2002 | A1 |
20020111725 | Burge | Aug 2002 | A1 |
20020128985 | Greenwald | Sep 2002 | A1 |
20020198843 | Wang et al. | Dec 2002 | A1 |
20030112133 | Webb et al. | Jun 2003 | A1 |
20030191581 | Ukai et al. | Oct 2003 | A1 |
20030209893 | Breed et al. | Nov 2003 | A1 |
20030229528 | Nitao | Dec 2003 | A1 |
20030236686 | Matsumoto et al. | Dec 2003 | A1 |
20040039611 | Hong | Feb 2004 | A1 |
20040102984 | Wahlbin et al. | May 2004 | A1 |
20040117358 | von Kaenel et al. | Jun 2004 | A1 |
20040153362 | Bauer et al. | Aug 2004 | A1 |
20040193347 | Harumoto | Sep 2004 | A1 |
20040225557 | Phelan et al. | Nov 2004 | A1 |
20050024185 | Chuey | Feb 2005 | A1 |
20050171663 | Mittelsteadt | Aug 2005 | A1 |
20050267784 | Slen et al. | Dec 2005 | A1 |
20050283388 | Eberwine et al. | Dec 2005 | A1 |
20060049925 | Hara et al. | Mar 2006 | A1 |
20060053038 | Warren et al. | Mar 2006 | A1 |
20060059020 | Davidson | Mar 2006 | A1 |
20060075120 | Smit | Apr 2006 | A1 |
20060079280 | LaPerch | Apr 2006 | A1 |
20060095301 | Gay | May 2006 | A1 |
20060114531 | Webb | Jun 2006 | A1 |
20060206415 | Ross | Sep 2006 | A1 |
20060247852 | Kortge et al. | Nov 2006 | A1 |
20070005404 | Raz et al. | Jan 2007 | A1 |
20070061173 | Gay | Mar 2007 | A1 |
20070106539 | Gay | May 2007 | A1 |
20070124045 | Ayoub et al. | May 2007 | A1 |
20070156312 | Breed | Jul 2007 | A1 |
20070156468 | Gay et al. | Jul 2007 | A1 |
20070256499 | Pelecanos | Nov 2007 | A1 |
20070268158 | Gunderson et al. | Nov 2007 | A1 |
20070282638 | Surovy | Dec 2007 | A1 |
20070288270 | Gay et al. | Dec 2007 | A1 |
20070299700 | Gay et al. | Dec 2007 | A1 |
20080018466 | Batra et al. | Jan 2008 | A1 |
20080027761 | Bracha | Jan 2008 | A1 |
20080051996 | Dunning et al. | Feb 2008 | A1 |
20080059019 | Delia | Mar 2008 | A1 |
20080065427 | Helitzer et al. | Mar 2008 | A1 |
20080174451 | Harrington et al. | Jul 2008 | A1 |
20080215376 | Engelman | Sep 2008 | A1 |
20080243558 | Gupte | Oct 2008 | A1 |
20080255888 | Berkobin et al. | Oct 2008 | A1 |
20090002147 | Bloebaum et al. | Jan 2009 | A1 |
20090024419 | McClellan et al. | Jan 2009 | A1 |
20090024458 | Palmer | Jan 2009 | A1 |
20090043441 | Breed | Feb 2009 | A1 |
20090094066 | Freudman et al. | Apr 2009 | A1 |
20090150023 | Grau et al. | Jun 2009 | A1 |
20090210257 | Chalfant et al. | Aug 2009 | A1 |
20100030568 | Daman | Feb 2010 | A1 |
20100066513 | Bauchot et al. | Mar 2010 | A1 |
20100088123 | McCall et al. | Apr 2010 | A1 |
20100131302 | Collopy | May 2010 | A1 |
20100131304 | Collopy et al. | May 2010 | A1 |
20100138244 | Basir | Jun 2010 | A1 |
20100185534 | Satyavolu et al. | Jul 2010 | A1 |
20100223080 | Basir et al. | Sep 2010 | A1 |
20100238009 | Cook et al. | Sep 2010 | A1 |
20110022421 | Brown et al. | Jan 2011 | A1 |
20110040579 | Havens | Feb 2011 | A1 |
20110106370 | Duddle et al. | May 2011 | A1 |
20110125363 | Blumer et al. | May 2011 | A1 |
20110137685 | Tracy et al. | Jun 2011 | A1 |
20110153367 | Amigo et al. | Jun 2011 | A1 |
20110161117 | Busque et al. | Jun 2011 | A1 |
20110161118 | Borden et al. | Jun 2011 | A1 |
20110200052 | Mungo et al. | Aug 2011 | A1 |
20110213628 | Peak et al. | Sep 2011 | A1 |
20110267186 | Rao et al. | Nov 2011 | A1 |
20110304446 | Basson et al. | Dec 2011 | A1 |
20110307188 | Peng et al. | Dec 2011 | A1 |
20120004933 | Foladare et al. | Jan 2012 | A1 |
20120021386 | Anderson et al. | Jan 2012 | A1 |
20120029945 | Altieri et al. | Feb 2012 | A1 |
20120065834 | Senart et al. | Mar 2012 | A1 |
20120069979 | Henry, Jr. et al. | Mar 2012 | A1 |
20120072243 | Collins et al. | Mar 2012 | A1 |
20120072244 | Collins et al. | Mar 2012 | A1 |
20120089423 | Tamir et al. | Apr 2012 | A1 |
20120089701 | Goel | Apr 2012 | A1 |
20120101855 | Collins et al. | Apr 2012 | A1 |
20120109418 | Lorber | May 2012 | A1 |
20120109692 | Collins et al. | May 2012 | A1 |
20120130752 | Moskal | May 2012 | A1 |
20120158436 | Bauer et al. | Jun 2012 | A1 |
20120190386 | Anderson | Jul 2012 | A1 |
20120197669 | Kote et al. | Aug 2012 | A1 |
20120209632 | Kaminski et al. | Aug 2012 | A1 |
20120209634 | Ling et al. | Aug 2012 | A1 |
20120214472 | Tadayon et al. | Aug 2012 | A1 |
20120226421 | Kote et al. | Sep 2012 | A1 |
20120259665 | Pandhi et al. | Oct 2012 | A1 |
20120271661 | Reynolds et al. | Oct 2012 | A1 |
20120283893 | Lee et al. | Nov 2012 | A1 |
20120323531 | Pascu | Dec 2012 | A1 |
20120323772 | Michael | Dec 2012 | A1 |
20120330499 | Scheid | Dec 2012 | A1 |
20130006675 | Bowne et al. | Jan 2013 | A1 |
20130013347 | Ling et al. | Jan 2013 | A1 |
20130013348 | Ling et al. | Jan 2013 | A1 |
20130018677 | Chevrette | Jan 2013 | A1 |
20130035964 | Roscoe et al. | Feb 2013 | A1 |
20130041521 | Basir et al. | Feb 2013 | A1 |
20130041621 | Smith et al. | Feb 2013 | A1 |
20130046510 | Bowne et al. | Feb 2013 | A1 |
20130046559 | Coleman et al. | Feb 2013 | A1 |
20130046562 | Taylor et al. | Feb 2013 | A1 |
20130046646 | Malan | Feb 2013 | A1 |
20130084847 | Tibbitts et al. | Apr 2013 | A1 |
20130110310 | Young | May 2013 | A1 |
20130117050 | Berg et al. | May 2013 | A1 |
20130144474 | Ricci | Jun 2013 | A1 |
20130144657 | Ricci | Jun 2013 | A1 |
20130151064 | Becker et al. | Jun 2013 | A1 |
20130161110 | Furst | Jun 2013 | A1 |
20130166098 | Lavie et al. | Jun 2013 | A1 |
20130166326 | Lavie et al. | Jun 2013 | A1 |
20130188794 | Kawamata | Jul 2013 | A1 |
20130189660 | Mangum et al. | Jul 2013 | A1 |
20130211662 | Blumer | Aug 2013 | A1 |
20130226624 | Blessman et al. | Aug 2013 | A1 |
20130244210 | Nath | Sep 2013 | A1 |
20130262530 | Collins et al. | Oct 2013 | A1 |
20130289819 | Hassib et al. | Oct 2013 | A1 |
20130297387 | Michael | Nov 2013 | A1 |
20130304276 | Flies | Nov 2013 | A1 |
20130304515 | Gryan et al. | Nov 2013 | A1 |
20130317693 | Jefferies et al. | Nov 2013 | A1 |
20130325519 | Tracy et al. | Dec 2013 | A1 |
20130344856 | Silver et al. | Dec 2013 | A1 |
20130345896 | Blumer et al. | Dec 2013 | A1 |
20140012604 | Allen, Jr. | Jan 2014 | A1 |
20140019167 | Cheng et al. | Jan 2014 | A1 |
20140019170 | Coleman et al. | Jan 2014 | A1 |
20140025401 | Hagelstein et al. | Jan 2014 | A1 |
20140046701 | Steinberg et al. | Feb 2014 | A1 |
20140052479 | Kawamura | Feb 2014 | A1 |
20140058761 | Freiberger et al. | Feb 2014 | A1 |
20140074345 | Gabay | Mar 2014 | A1 |
20140074402 | Hassib et al. | Mar 2014 | A1 |
20140086419 | Rana | Mar 2014 | A1 |
20140089101 | Meller | Mar 2014 | A1 |
20140108058 | Bourne et al. | Apr 2014 | A1 |
20140111647 | Atsmon et al. | Apr 2014 | A1 |
20140114696 | Amigo et al. | Apr 2014 | A1 |
20140180723 | Cote et al. | Jun 2014 | A1 |
20140180727 | Freiberger et al. | Jun 2014 | A1 |
20140257865 | Gay et al. | Sep 2014 | A1 |
20140257866 | Gay et al. | Sep 2014 | A1 |
20140257867 | Gay et al. | Sep 2014 | A1 |
20140257868 | Hayward et al. | Sep 2014 | A1 |
20140257869 | Binion et al. | Sep 2014 | A1 |
20140257870 | Cielocha et al. | Sep 2014 | A1 |
20140257871 | Christensen et al. | Sep 2014 | A1 |
20140257872 | Christensen et al. | Sep 2014 | A1 |
20140257873 | Hayward et al. | Sep 2014 | A1 |
20140257874 | Hayward et al. | Sep 2014 | A1 |
20140278574 | Barber | Sep 2014 | A1 |
20140304011 | Yager et al. | Oct 2014 | A1 |
20140310028 | Christensen et al. | Oct 2014 | A1 |
20160086393 | Collins et al. | Mar 2016 | A1 |
20160225098 | Helitzer et al. | Aug 2016 | A1 |
Entry |
---|
Classic Car Feature Article “Insurance by the Mile”, Article #102504, by Jack Nerad for Driving Today, downloaded from the Internet at: <http://www.antiquecar.com/feature-insurance_by_the_mile.php> (Oct. 25, 2004). |
Mihailescu, An assessment of charter airline benefits for Port Elizabeth and the Eastern Cape, Chinese Business Review, pp. 34-45 (Feb. 2010). |
U.S. Appl. No. 14/202,660, Final Office Action, dated Jul. 10, 2015. |
U.S. Appl. No. 14/202,660, Final Office Action, dated Sep. 4, 2014. |
U.S. Appl. No. 14/202,660, Nonfinal Office Action, dated Feb. 3, 2015. |
U.S. Appl. No. 14/202,660, Nonfinal Office Action, dated Jul. 28, 2017. |
U.S. Appl. No. 14/202,660, Nonfinal Office Action, dated May 14, 2014. |
U.S. Appl. No. 14/202,812, Final Office Action, dated Jul. 23, 2015. |
U.S. Appl. No. 14/202,812, Final Office Action, dated Sep. 5, 2014. |
U.S. Appl. No. 14/202,812, Nonfinal Office Action, dated Feb. 23, 2015. |
U.S. Appl. No. 14/202,812, Nonfinal Office Action, dated Jun. 30, 2017. |
U.S. Appl. No. 14/202,812, Nonfinal Office Action, dated May 23, 2014. |
U.S. Appl. No. 14/202,997, Notice of Allowance, dated May 29, 2014. |
U.S. Appl. No. 14/203,015, Notice of Allowance, dated Mar. 31, 2015. |
U.S. Appl. No. 14/203,015, Office Action, dated May 22, 2014. |
U.S. Appl. No. 14/203,015, Office Action, dated Oct. 29, 2014. |
U.S. Appl. No. 14/203,115, Final Office Action, dated Dec. 12, 2016. |
U.S. Appl. No. 14/203,115, Final Office Action, dated Mar. 12, 2015. |
U.S. Appl. No. 14/203,115, Final Office Action, dated Nov. 5, 2015. |
U.S. Appl. No. 14/203,115, Nonfinal Office Action, dated Jun. 11, 2014. |
U.S. Appl. No. 14/203,115, Nonfinal Office Action, dated Jun. 15, 2017. |
U.S. Appl. No. 14/203,115, Nonfinal Office Action, dated Jun. 28, 2016. |
U.S. Appl. No. 14/203,115, Nonfinal Office Action, dated Jun. 30, 2015. |
U.S. Appl. No. 14/203,115, Nonfinal Office Action, dated Oct. 9, 2014. |
U.S. Appl. No. 14/203,143, Examiner's Answer to Appeal Brief, dated Nov. 17, 2016. |
U.S. Appl. No. 14/203,143, Final Office Action, dated Jan. 7, 2016. |
U.S. Appl. No. 14/203,143, Final Office Action, dated May 18, 2015. |
U.S. Appl. No. 14/203,143, Final Office Action, dated Sep. 23, 2014. |
U.S. Appl. No. 14/203,143, Nonfinal Office Action, dated Jan. 14, 2015. |
U.S. Appl. No. 14/203,143, Nonfinal Office Action, dated Jul. 29, 2015. |
U.S. Appl. No. 14/203,143, Nonfinal Office Action, dated Jun. 3, 2014. |
U.S. Appl. No. 14/203,210, Final Office Action, dated Aug. 11, 2015. |
U.S. Appl. No. 14/203,210, Final Office Action, dated Aug. 28, 2017. |
U.S. Appl. No. 14/203,210, Final Office Action, dated Nov. 28, 2014. |
U.S. Appl. No. 14/203,210, Final Office Action, dated Oct. 13, 2016. |
U.S. Appl. No. 14/203,210, Nonfinal Office Action, dated Apr. 22, 2014. |
U.S. Appl. No. 14/203,210, Nonfinal Office Action, dated Aug. 27, 2014. |
U.S. Appl. No. 14/203,210, Nonfinal Office Action, dated Jun. 29, 2017. |
U.S. Appl. No. 14/203,210, Nonfinal Office Action, dated Mar. 19, 2015. |
U.S. Appl. No. 14/203,210, Nonfinal Office Action, dated May 26, 2016. |
U.S. Appl. No. 14/203,338, Final Office Action, dated Oct. 6, 2014. |
U.S. Appl. No. 14/203,338, Notice of Allowance, dated May 20, 2015. |
U.S. Appl. No. 14/203,338, Office Action, dated Feb. 3, 2015. |
U.S. Appl. No. 14/203,338, Office Action, dated Jun. 2, 2014. |
U.S. Appl. No. 14/203,344, Final Office Action, dated Dec. 22, 2015. |
U.S. Appl. No. 14/203,344, Final Office Action, dated Mar. 18, 2015. |
U.S. Appl. No. 14/203,344, Nonfinal Office Action, dated Jun. 30, 2015. |
U.S. Appl. No. 14/203,344, Nonfinal Office Action, dated Jun. 6, 2014. |
U.S. Appl. No. 14/203,344, Nonfinal Office Action, dated Nov. 24, 2014. |
U.S. Appl. No. 14/203,344, Nonfinal Office Action, dated Oct. 6, 2016. |
U.S. Appl. No. 14/203,344, Notice of Allowance, dated Apr. 7, 2017. |
U.S. Appl. No. 14/203,349, Final Office Action, dated Mar. 17, 2015. |
U.S. Appl. No. 14/203,349, Final Office Action, dated Dec. 3, 2015. |
U.S. Appl. No. 14/203,349, Nonfinal Office Action, dated Feb. 10, 2017. |
U.S. Appl. No. 14/203,349, Nonfinal Office Action, dated Jun. 15, 2015. |
U.S. Appl. No. 14/203,349, Nonfinal Office Action, dated May 20, 2014. |
U.S. Appl. No. 14/203,349, Nonfinal Office Action, dated Oct. 23, 2014. |
U.S. Appl. No. 14/203,349, Notice of Allowance, dated Jul. 26, 2017. |
U.S. Appl. No. 14/203,356, Final Office Action, dated Apr. 17, 2015. |
U.S. Appl. No. 14/203,356, Nonfinal Office Action, dated Jun. 13, 2014. |
U.S. Appl. No. 14/203,356, Nonfinal Office Action, dated Sep. 24, 2014. |
U.S. Appl. No. 14/203,356, Notice of Allowance, dated Aug. 6, 2015. |
U.S. Appl. No. 14/314,822, Final Office Action, dated Apr. 20, 2017. |
U.S. Appl. No. 14/314,822, Final Office Action, dated Mar. 10, 2015. |
U.S. Appl. No. 14/314,822, Final Office Action, dated Oct. 8, 2015. |
U.S. Appl. No. 14/314,822, Nonfinal Office Action, dated Dec. 10, 2014. |
U.S. Appl. No. 14/314,822, Nonfinal Office Action, dated Jan. 12, 2017. |
U.S. Appl. No. 14/314,822, Nonfinal Office Action, dated Jul. 7, 2015. |
U.S. Appl. No. 14/788,998, Final Office Action, dated Apr. 18, 2016. |
U.S. Appl. No. 14/788,998, Final Office Action, dated Jun. 14, 2017. |
U.S. Appl. No. 14/788,998, Nonfinal Office Action, dated Dec. 4, 2015. |
U.S. Appl. No. 14/788,998, Nonfinal Office Action, dated Feb. 10, 2017. |
U.S. Appl. No. 14/795,369, Final Office Action, dated Apr. 20, 2016. |
U.S. Appl. No. 14/795,369, Final Office Action, dated Aug. 22, 2017. |
U.S. Appl. No. 14/795,369, Nonfinal Office Action, dated Dec. 8, 2015. |
U.S. Appl. No. 14/795,369, Nonfinal Office Action, dated Feb. 7, 2017. |
U.S. Appl. No. 14/862,703, Nonfinal Office Action, dated Dec. 10, 2015. |
U.S. Appl. No. 14/862,703, Notice of Allowance, dated Apr. 13, 2016. |
U.S. Appl. No. 15/674,067, “Systems and Methods for Generating Vehicle Insurance Policy Data Based on Empirical Vehicle Related Data”, Hayward et al., filed Aug. 10, 2017. |
Number | Date | Country | |
---|---|---|---|
61775652 | Mar 2013 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14203344 | Mar 2014 | US |
Child | 15639016 | US |