Systems and methods for generating vehicle insurance policy data based on empirical vehicle related data

Information

  • Patent Grant
  • Patent Number
    9,865,020
  • Date Filed
    Thursday, August 10, 2017
  • Date Issued
    Tuesday, January 9, 2018
Abstract
The present disclosure generally relates to a computer implemented system and method for automatically generating insurance policy related data. The system and method may determine a vehicle operator and generate empirical vehicle operator identity data. The system and method may further acquire empirical vehicle operation data related to the actual vehicle operator, and correlate the empirical vehicle operator identity data and the empirical vehicle operation data to generate vehicle insurance policy related data. The system and method may further include processing one or more insurance options, including underwriting and pricing, based at least in part on the vehicle insurance policy related data.
Description
TECHNICAL FIELD

The present disclosure generally relates to systems and methods for assessing, pricing, and provisioning vehicle insurance. In particular, the present disclosure relates to systems and methods for generating vehicle insurance policy data based on empirical vehicle operator identity data and empirical vehicle operation data.


BACKGROUND

Vehicle insurance policies may be based, at least in part, on information related to a vehicle insurance policy applicant, such as age of the applicant, gender of the applicant, number of prior insurance claim(s) that the applicant has submitted, driving record of the applicant, etc. Vehicle insurance policies may also be based, at least in part, on information related to a driving routine associated with the vehicle insurance policy applicant, such as where the insurance applicant lives and where the applicant drives to work.


Various sensors, such as seat belt sensors, seat occupancy sensors, vehicle telematics sensors, infrared sensors, vibration sensors, image sensors, ultrasonic sensors, etc., are being incorporated within modern-day vehicles. Data derived from associated sensors is used to monitor and/or control vehicle operation.


SUMMARY

Generating vehicle insurance policy related data based on empirical vehicle related data is desirable. In particular, it is desirable to automatically generate insurance policy related data based on empirical data related to a vehicle operator identity and/or empirical data related to vehicle operation.


A computer implemented method for automatically generating insurance policy data, that is representative of a vehicle insurance policy, may include receiving, at one or more processors, empirical vehicle operator identity data that may be representative of an identity of a vehicle operator. The method may further include receiving, at one or more processors, empirical vehicle operation data that may be representative of actual operation of a vehicle and that may be, at least partially, based on vehicle sensor data. The method may also include correlating, by one or more processors, at least a portion of the empirical vehicle operator identity data with at least a portion of the empirical vehicle operation data. The method may yet further include generating, by one or more processors, vehicle insurance policy related data based, at least in part, on the correlated empirical vehicle operator identity data and empirical vehicle operation data.
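The correlation and generation steps recited above might be sketched as follows. This is a minimal, non-authoritative illustration of the claimed method; the type names, field names, and the 60-second correlation window are assumptions introduced for the example, not terms from the disclosure.

```python
from dataclasses import dataclass

# Hypothetical record types; field names are illustrative assumptions.
@dataclass
class OperatorIdentity:
    operator_id: str
    timestamp: float  # seconds

@dataclass
class OperationSample:
    timestamp: float       # seconds
    vehicle_in_motion: bool

def correlate(identities, samples, window=60.0):
    """Pair each operation sample with the most recent operator
    identity observed within `window` seconds before the sample."""
    pairs = []
    for s in samples:
        candidates = [i for i in identities
                      if 0 <= s.timestamp - i.timestamp <= window]
        if candidates:
            latest = max(candidates, key=lambda i: i.timestamp)
            pairs.append((latest.operator_id, s))
    return pairs

def generate_policy_data(pairs):
    """Aggregate, per operator, a count of samples taken while the
    vehicle was actually in motion (a simple exposure measure)."""
    usage = {}
    for operator_id, sample in pairs:
        if sample.vehicle_in_motion:
            usage[operator_id] = usage.get(operator_id, 0) + 1
    return usage
```

The design choice here is to treat identity observations and operation samples as separate time-stamped streams joined by timestamp proximity, mirroring the method's separate "receiving" steps followed by a "correlating" step.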


In an embodiment, a system for automatically generating vehicle insurance policy related data, that is representative of a vehicle insurance policy, may include an empirical vehicle operator identity data acquisition module stored on a memory that, when executed by a processor, causes the processor to acquire empirical vehicle operator identity data that may be representative of an identity of a vehicle operator. The system may also include an empirical vehicle operation data acquisition module stored on a memory that, when executed by a processor, causes the processor to acquire empirical vehicle operation data that may be representative of operation of a vehicle. The system may further include a vehicle insurance policy data generation module stored on a memory that, when executed by a processor, causes the processor to generate vehicle insurance policy related data based, at least in part, on the empirical vehicle operator identity data and the empirical vehicle operation data.


In another embodiment, a tangible, computer-readable medium may store instructions that, when executed by a processor, cause the processor to automatically generate vehicle insurance policy related data that is representative of a vehicle insurance policy. The tangible, computer-readable medium may also include an empirical vehicle operator identity data acquisition module that, when executed by a processor, causes the processor to acquire empirical vehicle operator identity data that may be representative of an identity of a vehicle operator. The tangible, computer-readable medium may further include an empirical vehicle operation data acquisition module that, when executed by a processor, causes the processor to acquire empirical vehicle operation data that may be representative of operation of a vehicle. The tangible, computer-readable medium may also include a vehicle insurance policy data generation module that, when executed by a processor, causes the processor to generate vehicle insurance policy related data based, at least in part, on the empirical vehicle operator identity data and the empirical vehicle operation data.





BRIEF DESCRIPTION OF THE DRAWINGS

The figures described below depict various aspects of the systems and methods disclosed herein. It should be understood that each figure depicts an embodiment of a particular aspect of the disclosed systems and methods, and that each of the figures is intended to accord with a possible embodiment thereof. Further, wherever possible, the following description refers to the reference numerals included in the following figures, in which features depicted in multiple figures are designated with consistent reference numerals.



FIG. 1 illustrates a block diagram of a computer system on which an exemplary vehicle insurance policy data generation system and method may operate in accordance with the described embodiments;



FIGS. 2A-2C depict various views of the interior of an example vehicle that illustrate locations of vehicle sensors within, and on, a vehicle;



FIGS. 3A-3C illustrate various example images constructed from data retrieved from the vehicle sensors of FIGS. 2A-2C;



FIG. 4 illustrates a block diagram of an exemplary vehicle module for use in generating and transmitting empirical vehicle operator identity data and empirical vehicle operation data;



FIG. 5 depicts a flow diagram of an example method of generating and transmitting empirical vehicle operator identity data;



FIG. 6 depicts a flow diagram of an example method of generating and transmitting empirical vehicle operations data;



FIG. 7 illustrates a block diagram of an exemplary remote server for use in receiving empirical vehicle operator identity data and empirical vehicle operations data, and generating vehicle insurance policy data based on the empirical vehicle operator identity data and empirical vehicle operations data; and



FIG. 8 depicts a flow diagram of an example method of generating vehicle insurance policy data based on empirical vehicle operator identity data and empirical vehicle operations data.





DETAILED DESCRIPTION

While vehicle insurance rates are typically based, at least in part, on information associated with an applicant, or applicants, seeking insurance coverage, undisclosed drivers often operate the associated vehicle(s). Methods and systems are provided that automatically generate vehicle insurance policy related data based on empirical vehicle operator identity data. The empirical vehicle operator identity data may be representative of an identity of an operator, or operators, that have actually operated an associated insured vehicle. Empirical vehicle operator identity data may, for example, be based on data acquired from various vehicle sensors, such as seat occupancy sensors, seatbelt sensors, body heat sensors (e.g., infrared sensors), weight sensors (e.g., pressure transducers), cameras (e.g., image sensors), etc. The vehicle sensor data may be time stamped.


In addition to vehicle insurance policy rates being based on information pertaining to an insurance applicant, vehicle insurance policy rates may be based on information related to operation of the vehicle. For example, vehicle insurance customers who operate their vehicles for less time generally pay a lower amount for vehicle insurance when compared to customers who operate their vehicles frequently, all other factors being equal. In addition to, or as an alternative to, generating vehicle insurance policy data based on empirical vehicle operator identity data, the present systems and methods may generate vehicle insurance policy data based on empirical vehicle operation data. Empirical vehicle operation related data may be representative of an amount of time an insured vehicle was actually in use. For example, travel time may be used as a unit of exposure for, at least in part, determining a vehicle insurance rate. In particular, a vehicle motion sensor (e.g., a vehicle speedometer sensor, a vehicle odometer sensor, a vibration sensor or a light sensor) may be used to detect motion of a vehicle. Data received from a vehicle motion sensor may be time stamped. The time stamped vehicle motion sensor data may be used to generate, record and transmit empirical vehicle operation related data that may be representative of a length of time a vehicle was in use.
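The travel-time computation described above might be sketched as follows, assuming time-stamped motion-sensor readings arrive as (timestamp, in-motion) pairs; the data layout is an illustrative assumption, not a format specified by the disclosure.

```python
# A minimal sketch of using travel time as a unit of exposure,
# assuming readings arrive as (timestamp_seconds, in_motion) tuples.
def vehicle_usage_seconds(readings):
    """Sum the time spanned by consecutive readings while the
    vehicle was reported in motion."""
    total = 0.0
    for (t0, moving), (t1, _unused) in zip(readings, readings[1:]):
        if moving:
            total += t1 - t0
    return total

# Example: moving from t=0 to t=60 and from t=120 to t=150,
# giving 90 seconds of actual use.
readings = [(0, True), (30, True), (60, False), (120, True), (150, False)]
```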


Turning to FIG. 1, a high-level block diagram of a vehicle insurance policy data generation system 100 is illustrated that may implement communications between a vehicle module 105 and a remote computing device 110 (e.g., a remote server) to receive vehicle sensor data, to generate empirical vehicle operator identity data, to generate empirical vehicle operation data and to generate vehicle insurance policy data. For example, the vehicle insurance policy data generation system 100 may acquire data from vehicle sensors (e.g., vehicle telematics systems sensors, seat belt sensors, steering wheel sensors, seat occupancy sensors, vibration sensors, image sensors, infrared sensors ultrasonic sensors, audio sensors, pressure sensors, etc.) and generate empirical vehicle operator identity data, empirical vehicle operation data and vehicle insurance policy data based on the vehicle sensor data. These vehicle sensors may, for example, be located as denoted with regard to reference numbers 225a, 235a, 245a, 260a, 280b of FIGS. 2A and 2B.


For clarity, only one vehicle module 105 is depicted in FIG. 1. While FIG. 1 depicts only one vehicle module 105, it should be understood that any number of vehicle modules 105 may be supported. The vehicle module 105 may include a memory 120 and a processor 125 for storing and executing, respectively, a module 121. The module 121, stored in the memory 120 as a set of computer-readable instructions, may be related to an empirical vehicle operator identity data module (e.g., empirical vehicle operator identity data module 421 of FIG. 4) and/or an empirical vehicle operation data module (e.g., empirical vehicle operation data module 422 of FIG. 4). Execution of the module 121 may also cause the processor 125 to associate the empirical vehicle operator identity data and/or the empirical vehicle operation data with a time and, or date (i.e., “time stamp” the data). Execution of the module 121 may also cause the processor 125 to receive known vehicle operator identity data from, for example, an insurance related database (e.g., insurance related database 170 of FIG. 1). Execution of the module 121 may further cause the processor 125 to communicate with the processor 155 of the remote computing device 110 via the network interface 130, the vehicle module communications network connection 131 and the wireless communication network 115 to transmit empirical vehicle operator identity data and/or the empirical vehicle operation data from the vehicle module 105 to the remote server 110.
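The "time stamp" association described above might be sketched as a small helper; the dictionary layout and the injectable clock are assumptions made for the example.

```python
import time

# Minimal sketch of time-stamping a sensor reading as described above.
# Passing `clock` explicitly makes the helper testable; by default the
# current wall-clock time is attached.
def time_stamp(sensor_reading, clock=time.time):
    return {"reading": sensor_reading, "timestamp": clock()}
```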


The vehicle module 105 may further include an image sensor input 135 communicatively connected to a first image sensor 136 and a second image sensor 137. While two image sensors 136, 137 are depicted in FIG. 1, any number of image sensors may be included. The vehicle module 105 may also include an infrared sensor input 140 communicatively connected to a first infrared sensor 141 and a second infrared sensor 142. While two infrared sensors 141, 142 are depicted in FIG. 1, any number of infrared sensors may be included. The vehicle module 105 may further include an ultrasonic sensor input 145 communicatively connected to a first ultrasonic sensor 146 and a second ultrasonic sensor 147. While two ultrasonic sensors 146, 147 are depicted in FIG. 1, any number of ultrasonic sensors may be included. The vehicle module 105 may also include a microphone input 150 communicatively connected to a first microphone 151 and a second microphone 152. While two microphones 151, 152 are depicted in FIG. 1, any number of microphones may be included. The vehicle module 105 may also include vibration sensor inputs 106 communicatively connected to a first vibration sensor 107 and a second vibration sensor 108. While two vibration sensors 107, 108 are depicted in FIG. 1, any number of vibration sensors may be included. The vehicle module 105 may also include seat occupancy sensor inputs 122 communicatively connected to a first seat occupancy sensor 123 and a second seat occupancy sensor 124. While two seat occupancy sensors 123, 124 are depicted in FIG. 1, any number of seat occupancy sensors may be included. Any one of the seat occupancy sensors 123, 124 may be, for example, an ultrasonic sensor, a pressure sensor, a body heat sensor (e.g., an infrared sensor) or a camera/video sensor (e.g., an image sensor). The vehicle module 105 may also include steering wheel sensor inputs 132 communicatively connected to a first steering wheel sensor 133 and a second steering wheel sensor 134.
While two steering wheel sensors 133, 134 are depicted in FIG. 1, any number of steering wheel sensors may be included. Any one of the steering wheel sensors 133, 134 may be, for example, an ultrasonic sensor, a pressure sensor, a body heat sensor (e.g., an infrared sensor) or a camera/video sensor (e.g., an image sensor). The vehicle module 105 may also include seat belt sensor inputs 127 communicatively connected to a first seat belt sensor 128 and a second seat belt sensor 129. While two seat belt sensors 128, 129 are depicted in FIG. 1, any number of seat belt sensors may be included. The vehicle module 105 may further include vehicle telematics system inputs 126. The vehicle telematics system inputs 126 may include, for example, a global positioning system (GPS) sensor, a vehicle speedometer sensor, a vehicle odometer sensor, a vehicle air bag sensor, a vehicle interior temperature sensor, a vehicle exterior temperature sensor, a vehicle pitch sensor, a vehicle yaw sensor and/or a time and day clock sensor. The vehicle module 105 may further include a display/user input device 125.


As one example, a first image sensor 136 may be located in a driver-side A-pillar (e.g., location of vehicle sensor 235a of FIG. 2A), a second image sensor 137 may be located in a passenger-side A-pillar (e.g., location of vehicle sensor 245a of FIG. 2A), a first infrared sensor 141 may be located in a driver-side B-pillar (e.g., location of vehicle sensor 280b of FIG. 2B), a second infrared sensor 142 may be located in a passenger-side B-pillar (not shown in the Figs.), first and second ultrasonic sensors 146, 147 may be located in a center portion of a vehicle dash (e.g., location of vehicle sensor 225a of FIG. 2A) and first and second microphones 151, 152 may be located on a bottom portion of a vehicle interior rearview mirror (e.g., location of vehicle sensor 260a of FIG. 2A). The processor 125 may acquire vehicle sensor data from any one of, or all of, these vehicle sensors 107, 108, 123, 124, 126, 128, 129, 133, 134, 136, 137, 141, 142, 146, 147, 151, 152 and may generate real-time vehicle operator identity data, empirical vehicle operator identity data and/or empirical vehicle operation data based on the vehicle sensor data. The processor 125 may transmit empirical vehicle operator identity data and/or empirical vehicle operation data to the remote computing device 110. Alternatively, the processor 125 may transmit vehicle sensor data and/or real-time vehicle operator identity data to the remote computing device 110 and the processor 155 may generate empirical vehicle operator identity data and/or empirical vehicle operation data based on the vehicle sensor data and/or real-time vehicle operator identity data.


The network interface 130 may be configured to facilitate communications between the vehicle module 105 and the remote computing device 110 via any hardwired or wireless communication network 115, including for example a wireless LAN, MAN or WAN, WiFi, the Internet, a Bluetooth connection, or any combination thereof. Moreover, the vehicle module 105 may be communicatively connected to the remote computing device 110 via any suitable communication system, such as via any publicly available or privately owned communication network, including those that use wireless communication structures, such as wireless communication networks, including for example, wireless LANs and WANs, satellite and cellular telephone communication systems, etc. The vehicle module 105 may cause insurance risk related data to be stored in a remote computing device 110 memory 160 and/or a remote insurance related database 170.


The remote computing device 110 may include a memory 160 and a processor 155 for storing and executing, respectively, a module 161. The module 161, stored in the memory 160 as a set of computer-readable instructions, facilitates applications related to generation of vehicle insurance policy data. The module 161 may also facilitate communications between the computing device 110 and the vehicle module 105 via a network interface 165, a remote computing device network connection 166 and the network 115 and other functions and instructions.


The computing device 110 may be communicatively coupled to an insurance related database 170. While the insurance related database 170 is shown in FIG. 1 as being communicatively coupled to the remote computing device 110, it should be understood that the insurance related database 170 may be located within separate remote servers (or any other suitable computing devices) communicatively coupled to the remote computing device 110. Optionally, portions of insurance related database 170 may be associated with memory modules that are separate from one another, such as a memory 120 of the vehicle module 105. The processor 155 may further execute the module 161 to store known vehicle operator identity data within the insurance related database 170. The known vehicle operator identity data may be generated based on digital images of individuals associated with an insurance policy application and/or other authorized drivers associated with an insurance policy application.


Turning to FIGS. 2A-2C, vehicle sensor systems 200a, 200b, 200c are illustrated. As depicted in FIG. 2A, the vehicle sensor system 200a may include a center-dash vehicle sensor 225a located in a center area of the dash, a driver-side A-pillar vehicle sensor 235a located in a driver side A-pillar 230a, a passenger-side A-pillar vehicle sensor 245a located in a passenger-side A-pillar 240a and a rearview mirror vehicle sensor 260a located on a bottom-side of the rearview mirror 255a. The vehicle sensor system 200a may further, or alternatively, include vehicle sensors in a driver-side visor 265a, a passenger-side visor 270a, a rearview mirror mounting bracket 250a and, or the steering wheel 210a. As described in detail herein, a position of a left-hand 215a of a vehicle driver and, or a position of a right-hand 220a of the vehicle driver, relative to a vehicle steering wheel 210a may be determined based on data acquired from any one of the vehicle sensors 225a, 235a, 245a, 260a. Any one of the vehicle sensors 225a, 235a, 245a, 260a may be an image sensor 136, 137, a pressure sensor 123, 124, a vibration sensor 107, 108, an infrared sensor 141, 142, an ultrasonic sensor 146, 147, a microphone 151, 152 or any other suitable vehicle sensor. Empirical vehicle operator identity data, real-time vehicle operator identity data and/or empirical vehicle operation data may be generated based on data received from any one of, or any combination of, vehicle sensors shown in FIG. 2A.


With reference to FIG. 2B, the vehicle sensor system 200b may include a driver-side B-pillar vehicle sensor 280b located in a driver-side B-pillar 275b and a center-dash vehicle sensor 225b located in a center area of the dash. While not shown in FIG. 2B, the vehicle sensor system 200b may include a passenger-side B-pillar vehicle sensor and, or any other vehicle sensors as described in conjunction with FIG. 2A. The vehicle sensor system 200b may further include a display device 285b. The display device 285b may be located in a center-console area. As illustrated in FIG. 2B, data acquired from the vehicle sensors 225b, 280b may be used to determine an identity of an occupant of a driver-side seat 290b, a passenger-side seat 295b, a position of hands on a steering wheel 210b and, or at least a portion of a face of a vehicle driver (not shown in FIG. 2B). Empirical vehicle operator identity data, real-time vehicle operator identity data and/or empirical vehicle operation data may be generated based on data received from any one of, or any combination of, vehicle sensors shown in FIG. 2B.


Turning to FIG. 2C, the vehicle sensor system 200c may include a driver-side A-pillar vehicle sensor 235c located in a driver side A-pillar 230c, a passenger-side A-pillar vehicle sensor 245c located in a passenger-side A-pillar 240c and a rearview mirror vehicle sensor 260c located on a bottom-side of the rearview mirror 255c. The vehicle sensor system 200c may further, or alternatively, include vehicle sensors in a rearview mirror mounting bracket 250c and, or the steering wheel 210c. While not shown in FIG. 2C, the vehicle monitoring system 200c may include any other vehicle sensors as described in conjunction with FIGS. 2A and 2B. As illustrated in FIG. 2C, data acquired from the vehicle sensors 235c, 245c may be used to generate vehicle operator identity data corresponding to an occupant of a driver-side seat 290c, occupancy of a passenger-side seat 295c, a position of hands on a steering wheel 210c and, or at least a portion of a face of a vehicle driver (not shown in FIG. 2C). Driver position within the driver-side seat 290c may, for example, be inferred from shifting weight on the seat 290c. Shifting weight on a seat may be determined via a signal obtained from a pressure transducer 123, 124 located within the seat. Empirical vehicle operator identity data, real-time vehicle operator identity data and/or empirical vehicle operation data may be generated based on data received from any one of, or any combination of, vehicle sensors shown in FIG. 2C.
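The weight-shift inference described above might be sketched as follows; the sampling format and the threshold value are assumptions introduced for illustration, not parameters from the disclosure.

```python
# Illustrative sketch: flag a weight shift whenever consecutive
# pressure-transducer samples differ by more than a threshold.
# The threshold (in arbitrary pressure units) is an assumption.
def weight_shifts(pressure_samples, threshold=5.0):
    """Count shifts in seat weight from a series of pressure readings."""
    return sum(
        1 for a, b in zip(pressure_samples, pressure_samples[1:])
        if abs(b - a) > threshold
    )
```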


With reference to FIGS. 3A-3C, vehicle interiors 300a, 300b, 300c are depicted. As described in detail herein, data acquired from the vehicle sensors 325a, 335a, 345a, 360a, 380b of FIGS. 3A and 3B (or any other suitably located vehicle sensors) may be used to determine a position of at least a portion of a passenger 397a within the vehicle interior 300a. The data acquired from the vehicle sensors 325a, 335a, 345a, 360a, 380b (or any other suitably located vehicle sensors) may be used to determine whether or not the passenger 397a is wearing a seatbelt 396a. As further illustrated in FIG. 3A, data acquired from the vehicle sensors 325a, 335a, 345a, 360a, 380b of FIGS. 3A and 3B (or any other suitably located vehicle sensors) may be used to determine a position and, or orientation of a vehicle driver's head 319a and, or right-hand 320a on a steering wheel 310a. For example, the data acquired from the vehicle sensors 325a, 335a, 345a, 360a, 380b may be used to determine whether the vehicle driver's head 319a is oriented toward a rearview mirror 355a, oriented toward the driver-side A-pillar 330a or oriented toward the front windshield. The data acquired from the vehicle sensors 325a, 335a, 345a, 360a, 380b may be used to determine whether the driver is wearing a seatbelt 391a. In any event, the vehicle interior 300a may include a microphone 350a located proximate the rearview mirror 355a. As described in detail herein, data acquired from the microphone 350a may be used to determine a source of sound within and/or around the vehicle 300a and, or a volume of the sound. Empirical vehicle operator identity data, real-time vehicle operator identity data and/or empirical vehicle operation data may be generated based on data received from any one of, or any combination of, vehicle sensors shown in FIG. 3A.



FIG. 3B depicts a vehicle interior 300b including a driver-side A-pillar vehicle sensor 335b located on a driver-side A-pillar 330b. As described in detail herein, data acquired from the vehicle sensor 335b (along with any other suitably located vehicle sensors) may be used to determine a position and, or orientation of a driver's head 319b, the driver's left hand 315b and, or right hand 320b relative to the steering wheel 310b. For example, data acquired from the vehicle sensor 335b (along with any other suitably located vehicle sensors) may be used to determine a gesture that the driver is performing with her left hand 315b. Empirical vehicle operator identity data, real-time vehicle operator identity data and/or empirical vehicle operation data may be generated based on data received from any one of, or any combination of, vehicle sensors shown in FIG. 3B. For example, data from a microphone 151, 152 may be used to identify a vehicle operator and/or a number of occupants. In particular, a number of distinct voices may be determined based on data from a microphone 151, 152. Alternatively, or additionally, the number of door opening and closing sounds may be determined based on data from a microphone 151, 152. Furthermore, data from a microphone 151, 152 may also be used to identify a particular vehicle.


Turning to FIG. 3C, a vehicle interior 300c includes a vehicle sensor 360c located on a bottom side of a rearview mirror 355c opposite a rearview mirror mount 350c. As described in detail herein, data acquired from the vehicle sensor 360c (along with any other suitably located vehicle sensors) may be used to determine a position and, or orientation of a driver's head 319c, the driver's left hand 315c and, or right hand 320c relative to the steering wheel 310c. For example, data acquired from the vehicle sensor 360c (along with any other suitably located vehicle sensors) may be used to determine that the driver's head 319c is oriented toward a cellular telephone 321c in her right hand 320c. Alternatively, or additionally, data acquired from the vehicle sensor 360c (along with any other suitably located vehicle sensors) may be used to determine the presence of a cell phone in a driver's hand. As also described in detail herein, a determination may be made that the driver is inattentive to the road based on the driver's head 319c being oriented toward the cellular telephone 321c. Empirical vehicle operator identity data, real-time vehicle operator identity data and/or empirical vehicle operation data may be generated based on data received from any one of, or any combination of, vehicle sensors shown in FIG. 3C.


Turning to FIGS. 4 and 5, a vehicle module 405 of a vehicle insurance policy data generation system 400 is depicted along with a method of generating empirical vehicle operator identity data on the vehicle module 405 and, or transmitting the empirical vehicle operator identity data to a remote server 110. The vehicle module 405 may be similar to the vehicle module 105 of FIG. 1. The method 500 may be implemented by executing the modules 421, 424 on a processor (e.g., processor 125).


In any event, the vehicle module 405 may include an empirical vehicle operator identity data acquisition module 421 and an empirical vehicle related data transmission module 424 stored on a memory 420. The processor 125 may store a vehicle insurance application module on a memory (e.g., memory 420) of the vehicle module 405 and the vehicle insurance application module may be configured (block 505). The processor 125 may execute the empirical vehicle operator identity data acquisition module 421 and cause the processor 125 to acquire vehicle operator identity sensor data from at least one vehicle sensor (block 510). The processor 125 may further execute the empirical vehicle operator identity data acquisition module 421 and cause the processor 125 to generate real-time vehicle operator identity data (block 510). The processor 125 may further execute the empirical vehicle operator identity data acquisition module 421 and cause the processor 125 to receive known vehicle operator identity data (block 510). The processor 125 may further execute the empirical vehicle operator identity data acquisition module 421 and cause the processor 125 to generate empirical operator identity data based on, for example, a comparison of the real-time vehicle operator identity data with the known vehicle operator identity data (block 515). The processor 125 may execute the empirical vehicle related data transmission module 424 to cause the processor 125 to transmit the empirical vehicle operator identity data to a remote server (e.g., remote server 110 of FIG. 1) (block 520).
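The comparison of real-time identity data against known identity data (block 515) might be sketched as a nearest-match lookup over feature vectors. The feature representation, the distance metric, and the match threshold are all assumptions introduced for the example; the disclosure does not specify a particular comparison technique.

```python
import math

# Sketch of block 515: compare a real-time identity feature vector
# against known operator feature vectors and emit empirical identity
# data. Vectors, threshold, and output layout are illustrative.
def match_operator(realtime_vec, known_operators, max_distance=0.5):
    best_id, best_dist = None, float("inf")
    for operator_id, known_vec in known_operators.items():
        dist = math.dist(realtime_vec, known_vec)  # Euclidean distance
        if dist < best_dist:
            best_id, best_dist = operator_id, dist
    if best_dist <= max_distance:
        return {"operator_id": best_id, "matched": True}
    return {"operator_id": None, "matched": False}
```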


The method of generating empirical vehicle operator identity data 500 may include using a picture and/or a video of a vehicle operator's face to identify the driver of the vehicle. For example, the method 500 may include capturing at least one image of each person who is authorized to operate a vehicle in accordance with an associated insurance policy. The images may be stored within a database (e.g., insurance related database 170 of FIG. 1) with other empirical vehicle operator identity data. A camera and/or video device (e.g., image sensor 136, 137 of FIG. 1) may be provided within an associated insured vehicle. The camera and/or video device 136, 137 may, for example, be mounted on the dashboard (e.g., dashboard 225b of FIG. 2B) or steering wheel (e.g., steering wheel 210b of FIG. 2B) of an insured vehicle. The camera and/or video device 136, 137 may be activated when a driver occupies the vehicle. A camera and/or video device 136, 137 may be activated when a driver unlocks and enters the vehicle (e.g., sits in the driver's seat). Alternatively, a camera and/or video device 136, 137 may be activated when the driver grasps the steering wheel 210b. Optionally, a camera and/or video device 136, 137 may be activated when the driver inserts the key in an ignition of the insured vehicle. An image and/or video of the vehicle operator's face may be captured. Empirical vehicle operator identity data may be generated based on the captured image or video, using, for example, image recognition technology to identify key facial characteristics. The image recognition technology may, for example, determine a physical status of the driver by comparing an image of the driver, that was captured from within the vehicle, to a series of images that had previously been stored within an associated database 170. The facial characteristics data may be used to assess a current physical status of the driver within a predetermined set of rules.
Thereby, an individual may be prohibited from operating a vehicle if the driver is not authorized to drive the vehicle or an authorized driver is determined to be in a condition unsuitable for operating the vehicle. For example, a facial image of a vehicle operator may be compared against a library of known vehicle operators to determine whether the vehicle operator has permission to drive a particular vehicle. If the vehicle operator does not have permission to drive the vehicle, the system (e.g., processor 115) may prohibit operation of the vehicle. Alternatively or additionally, if the vehicle operator does not have permission to drive the vehicle because the operator is not named on an insurance policy (i.e., the operator is an undisclosed driver), the system (e.g., processor 115) may initiate an audit function to audit the actual owner/policy holder or otherwise inform him or her of the unauthorized/undisclosed driver.
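The allow/prohibit decision and the audit function described above may be sketched, purely for illustration, as follows; the function name, return values, and audit-log shape are assumptions made for this sketch.

```python
def authorize_operator(operator_name, insured_operators, audit_log):
    """Allow vehicle operation only for operators named on the insurance
    policy; otherwise prohibit operation and queue an audit notice for
    the policyholder about the undisclosed driver."""
    if operator_name in insured_operators:
        return "ALLOW"
    audit_log.append(f"audit: undisclosed driver '{operator_name}' "
                     f"attempted to operate the insured vehicle")
    return "PROHIBIT"

# Example: person_b is not named on the policy, so operation is
# prohibited and the policyholder is notified.
log = []
decision = authorize_operator("person_b", {"person_a"}, log)
```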


Further, a facial image of a vehicle operator may be used to determine whether the operator is wearing corrective lenses, if required in accordance with his or her driver's license. If the driver is required to wear corrective lenses and does not have them on, operation of the vehicle may be prohibited. Yet further, a facial image of a vehicle operator may be used to determine whether the operator is too tired or stressed to operate the vehicle. Images of faces, even in static photos, may show key characteristics of weariness and stress. Weariness and/or stress may affect the reflexes and acuity of a vehicle operator and may impact an ability of the vehicle operator to drive the vehicle. If a vehicle operator is determined to be too stressed or tired, the operation of the vehicle may be prohibited. Furthermore, an owner of a vehicle may require authentication of an operator of his or her vehicle prior to the vehicle being enabled for operation. Moreover, an owner of a vehicle may require validation that an authorized driver is in a suitable condition to operate the vehicle.
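The rule-based suitability check described above may be sketched as follows. The field names and fatigue threshold are illustrative assumptions about the output of the image-recognition step, not part of the disclosure.

```python
def assess_physical_status(status):
    """Apply a predetermined set of rules to decide whether an authorized
    driver is in a suitable condition to operate the vehicle.

    `status` is an assumed dictionary of image-recognition outputs, e.g.
    requires_corrective_lenses, lenses_detected, and a fatigue_score in
    [0, 1] derived from facial characteristics of weariness and stress.
    """
    if status.get("requires_corrective_lenses") and not status.get("lenses_detected"):
        return "unsuitable: corrective lenses required but not detected"
    if status.get("fatigue_score", 0.0) > 0.8:  # assumed threshold
        return "unsuitable: operator appears too tired or stressed"
    return "suitable"
```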


With further reference to FIG. 4, along with reference to FIG. 6, a vehicle module 405 of a vehicle insurance policy data generation system 400 is depicted along with a method 600 of generating empirical vehicle operation data on the vehicle module 405 and/or transmitting empirical vehicle operation data to a remote server 110. The vehicle module 405 may be similar to the vehicle module 121 of FIG. 1. The method 600 may be implemented by executing the modules 422, 424 on a processor (e.g., processor 115).


In any event, the vehicle module 405 may include an empirical vehicle operation data acquisition module 422 and an empirical vehicle related data transmission module 424. The processor 115 may store a vehicle insurance application module on a memory (e.g., memory 420) of the vehicle module 405 and the vehicle insurance application module may be configured (block 605). The processor 115 may execute the empirical vehicle operation data acquisition module 422 to cause the processor 115 to receive vehicle operation sensor data (block 610). The processor 115 may further execute the empirical vehicle operation data acquisition module 422 to cause the processor 115 to generate empirical vehicle operation data based on the vehicle operation sensor data (block 615). The processor 115 may execute the empirical vehicle related data transmission module 424 to cause the processor 115 to transmit empirical vehicle operation data to a remote server (e.g., remote server 110 of FIG. 1) (block 620).


The processor 115 may execute an empirical vehicle operating environment data acquisition module 423 to cause the processor 115 to receive vehicle sensor data associated with an operating environment of the vehicle. For example, the processor 115 may generate empirical vehicle operating environment data based on data acquired from a temperature sensor, a rain sensor, an ice sensor, a snow sensor, or other vehicle sensor capable of sensing an operating environment associated with the vehicle.


A method of generating empirical vehicle operation related data 600 may, for example, include detecting driving patterns. For example, vehicle operation data may be received from a vehicle telematics system (e.g., a GPS or a steering wheel angle sensor). Vehicle operation data may indicate, for example, left turn data which may be representative of a number of left turns a vehicle has navigated. One or more vehicle sensors (e.g., vibration sensors, light sensors or pressure sensors) may be installed on the exterior of the vehicle, such as on the windshield. Sensor technology (e.g., sensor technology available from Nexense ETC) may be used to monitor the length of time a vehicle is in use. Nexense's sensor technology may, for example, be used to measure sounds, movement and/or pressure within, and around, a vehicle. A pressure-sensitive sensor pad 123, 124 may be installed on a vehicle driver's seat. Data received from the vehicle driver's seat pressure sensor 123, 124 may be used to determine a length of time the driver's side seat was occupied. Alternatively, or additionally, a pressure sensor may be placed on an exterior of a vehicle (e.g., on the windshield). Data from the exterior pressure-sensitive sensor may be used, for example, to measure air flow over the vehicle as the vehicle is in motion. In another example, an audio sensor (e.g., a microphone 151, 152 of FIG. 1) may be used to monitor engine sound as the vehicle is in use. Furthermore, empirical vehicle operation data may be based on vehicle sensor data (e.g., seat occupancy sensors 123, 124, seat belt sensors 128, 129, body heat sensors 141, 142, cameras 136, 137, etc.). Empirical vehicle operation related data may, for example, be representative of circumstances where multiple passengers are traveling in an insured vehicle. Multiple passengers may, for example, create a high risk for teenage vehicle operators. As discussed in detail elsewhere herein, an associated insurance policy rate may be adjusted based on the empirical vehicle operation related data.
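The seat-sensor examples above may be sketched, under assumed sampling intervals and data shapes, as follows; the threshold, field names, and one-reading-per-minute cadence are assumptions made for this sketch.

```python
def seat_occupancy_minutes(samples, threshold=5.0):
    """Minutes the driver's seat pressure sensor (e.g., sensor pad 123,
    124) read above an occupancy threshold; `samples` holds one
    (minute, pressure) reading per minute."""
    return sum(1 for _, pressure in samples if pressure > threshold)

def passenger_count(seat_sensors):
    """Count occupied non-driver seats from seat occupancy sensor flags,
    e.g., to flag multiple-passenger trips for a teenage operator."""
    return sum(1 for seat, occupied in seat_sensors.items()
               if occupied and seat != "driver")
```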


Empirical vehicle operation related data may be generated based on one or more vehicle motion sensors (e.g., vibration sensors 107, 108, pressure sensors 123, 124 and/or light sensors 136, 137). Data from the vehicle motion sensors may be time stamped and used to determine a length of time a vehicle was in use. Empirical vehicle operation related data may be transmitted to an insurance agency. The insurance agency may determine vehicle usage based on, for example, travel time data. Travel time data may be used to determine vehicle insurance policy pricing adjustments and/or future policy payment adjustments for usage-based vehicle insurance. Updated vehicle insurance policy information may be automatically provided to an insurance customer.
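The time-stamp-based usage calculation and a usage-based pricing adjustment may be sketched as follows. The pairing of motion-sensor time stamps into in-use intervals, the per-minute rate, and the function names are illustrative assumptions, not the disclosed rating method.

```python
def usage_minutes(intervals):
    """Total in-use time, in minutes, from time-stamped motion-sensor
    data; each interval is an assumed (start_epoch_s, end_epoch_s) pair
    for a period of detected motion."""
    return sum(end - start for start, end in intervals) / 60.0

def usage_based_adjustment(base_premium, minutes, rate_per_minute=0.01):
    """Illustrative usage-based pricing: the premium scales with the
    travel time derived from the time-stamped sensor data."""
    return round(base_premium + minutes * rate_per_minute, 2)
```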


Turning to FIGS. 7 and 8, a remote server 710 of a vehicle insurance policy data generation system 700 is depicted along with a method 800 of establishing an insurance risk related data file on the server. The remote server 710 may be similar to the remote server with insurance application 110 of FIG. 1. The method 800 may be implemented by executing the modules 762-765 on a processor (e.g., processor 155 of FIG. 1).


In any event, the remote server 710 may include an empirical vehicle operator identity data receiving module 762, an empirical vehicle operation data receiving module 763, a data correlation module 764 and a vehicle insurance policy data generation module 765 stored on a memory 760. The processor 155 may execute the empirical vehicle operator identity data receiving module 762 to cause the processor 155 to receive empirical vehicle operator identity data (block 805). The processor 155 may execute the empirical vehicle operation data receiving module 763 to cause the processor 155 to receive empirical vehicle operation data (block 810). The processor 155 may execute the data correlation module 764 to cause the processor 155 to correlate at least a portion of the empirical vehicle operator identity data with at least a portion of the empirical vehicle operation data (block 815). The processor 155 may execute the vehicle insurance policy data generation module 765 to cause the processor 155 to generate vehicle insurance policy data based on the correlated empirical vehicle operator identity data and empirical vehicle operation data (block 820). Alternatively, the processor 155 may execute the vehicle insurance policy data generation module 765 to cause the processor 155 to generate vehicle insurance policy data based on the empirical vehicle operator identity data and the empirical vehicle operation data (block 820).
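The correlation of block 815 may be sketched, for illustration, as a time-based join of the two data streams; the record shapes below are assumptions made for this sketch.

```python
def correlate(identity_records, operation_records):
    """Pair each time-stamped empirical vehicle operation record with the
    empirical vehicle operator identity record in effect at that time
    (block 815, sketched). Identity records are assumed to carry
    (operator, start, end) time windows; operation records carry
    (time, data)."""
    correlated = []
    for op in operation_records:
        driver = None
        for ident in identity_records:
            if ident["start"] <= op["time"] < ident["end"]:
                driver = ident["operator"]
                break
        # Operation data outside any identity window keeps driver=None,
        # i.e., an unattributed (possibly undisclosed) operator.
        correlated.append({"time": op["time"],
                           "operator": driver,
                           "data": op["data"]})
    return correlated
```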


As a particular example of the generated insurance policy data, an insurance policy may include a principal vehicle operator (e.g., person A having 0 recorded accidents). The principal vehicle operator may weigh 125 lbs. A weight sensor 123, 124, positioned within a driver's seat of an associated insured vehicle, may generate empirical vehicle operator identity data that indicates a person weighing 250 lbs. has operated the vehicle. For example, the empirical vehicle operator identity data may indicate that a vehicle operator (e.g., person B having 10 recorded accidents) has driven the insured vehicle most of the time. Alternatively, or additionally, data acquired from a facial recognition device (e.g., a camera/image processor 136, 137) may be used to generate empirical vehicle operator identity data. The processor 155 may generate vehicle insurance policy data based on the empirical vehicle operator identity data. The processor 155 may transmit the vehicle insurance policy related data to an insurance underwriting agent for use in calculating a vehicle insurance rate. An insurance policy may be adjusted based on the vehicle insurance policy related data. For example, person B may be assigned as the principal operator. Alternatively, the insurance policy may be adjusted based on a combination of person A and person B. For example, a discount may be determined when a teenager drives solo 90% of the time. Alternatively, a vehicle insurance rate may be increased when a teenager drives solo only 20% of the time. The processor 155 may generate vehicle insurance policy data based on empirical vehicle operator identity data when underwriting household composite vehicle insurance policies. The processor 155 may time stamp the empirical vehicle operator identity data. Thereby, the processor 155 may determine an amount of time that a vehicle has been driven by a particular individual based on the time-stamped empirical vehicle operator identity data.
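The driving-time proportions discussed above (principal-operator assignment and solo-driving adjustments) may be sketched as follows, assuming operator-correlated, time-stamped trip records; the record fields and function names are illustrative assumptions.

```python
def driving_shares(trips):
    """Each operator's share of total driving time from time-stamped trip
    records of the assumed shape
    {"operator": str, "minutes": float, "passengers": int}.
    The operator with the largest share may be assigned as the
    principal operator."""
    total = sum(t["minutes"] for t in trips)
    shares = {}
    for t in trips:
        shares[t["operator"]] = shares.get(t["operator"], 0.0) + t["minutes"]
    return {op: minutes / total for op, minutes in shares.items()}

def solo_fraction(trips, operator):
    """Fraction of an operator's driving time with no passengers aboard,
    e.g., to support a discount when a teenager drives solo 90% of
    the time."""
    mine = [t for t in trips if t["operator"] == operator]
    if not mine:
        return 0.0
    solo = sum(t["minutes"] for t in mine if t["passengers"] == 0)
    return solo / sum(t["minutes"] for t in mine)
```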


This detailed description is to be construed as exemplary only and does not describe every possible embodiment, as describing every possible embodiment would be impractical, if not impossible. One could implement numerous alternate embodiments, using either current technology or technology developed after the filing date of this application.

Claims
  • 1. A computer implemented method for automatically generating insurance policy related data, the method comprising: identifying, by the one or more processors associated with a server, a vehicle operator using vehicle operator identity data that is based partially upon vehicle sensor data, the vehicle operator identity data identifying one or more physical aspects of the vehicle operator and including an image of the vehicle operator; changing, by one or more processors associated with a vehicle module installed in a vehicle, a state of the vehicle to (i) prevent the vehicle operator from operating the vehicle when the determined identity of the vehicle operator does not match a vehicle operator who is insured to operate the vehicle, or (ii) allow the vehicle operator to operate the vehicle when the determined identity of the vehicle operator matches a vehicle operator who is insured to operate the vehicle; receiving, by the one or more processors associated with the server, vehicle operation data that is based partially upon the vehicle sensor data and is representative of operation of the vehicle by the vehicle operator who is allowed to operate the vehicle; and calculating, by the one or more processors associated with the server, comprehensive vehicle insurance policy related data based upon a correlation between the vehicle operation data, the vehicle operator identity data, and the vehicle operator such that the comprehensive vehicle insurance policy related data indicates (i) driving habits of the vehicle operator who is allowed to operate the vehicle, and (ii) a proportion of driving time in which the vehicle operator was accompanied by one or more passengers when driving the vehicle.
  • 2. The method of claim 1, wherein the act of changing the state of the vehicle to allow the vehicle operator to operate the vehicle comprises: changing the state of the vehicle to allow the vehicle operator to operate the vehicle when the vehicle operator matches a vehicle operator that is insured to operate the vehicle based upon a comparison of the vehicle operator identity data to known vehicle operator identity data.
  • 3. The method of claim 2, wherein the known vehicle operator identity data is representative of (i) one or more identifiable physical aspects of vehicle operators that are insured to operate the vehicle, and (ii) images of the vehicle operators that are insured to operate the vehicle.
  • 4. The method of claim 1, wherein the act of changing the state of the vehicle to prevent the vehicle operator from operating the vehicle comprises: identifying facial characteristics of the vehicle operator to assess a physical status of the vehicle operator in accordance with a predetermined set of rules; and changing the state of the vehicle to prevent the vehicle operator from operating the vehicle when the vehicle operator is determined to be in a condition unsuitable for operating the vehicle based upon the physical status of the vehicle operator.
  • 5. The method of claim 1, wherein the act of calculating the comprehensive vehicle insurance policy related data comprises: calculating one or more insurance options including underwriting and pricing.
  • 6. The method of claim 1, wherein the vehicle sensor data is received from at least one vehicle sensor including one or more of: a light sensor; a pressure sensor; a seat belt sensor; a seat occupancy sensor; an image sensor; a vehicle telematics system sensor; a steering wheel angle sensor; a vibration sensor; a vehicle pitch sensor; a facial recognition sensor; a fingerprint sensor; an eye scan sensor; a vehicle yaw sensor; a vehicle speed sensor; a vehicle brake sensor; a steering wheel hand sensor; an air bag sensor; a microphone; an ultrasonic sensor; and an infrared sensor.
  • 7. A system for automatically generating vehicle insurance policy related data, the system comprising: a server configured to: identify a vehicle operator using vehicle operator identity data that is based partially upon vehicle sensor data, the vehicle operator identity data identifying one or more physical aspects of the vehicle operator and including an image of the vehicle operator; and a vehicle module configured to: change a state of the vehicle to prevent the vehicle operator from operating the vehicle when the determined identity of the vehicle operator does not match a vehicle operator who is insured to operate the vehicle, or change a state of the vehicle to allow the vehicle operator to operate the vehicle when the determined identity of the vehicle operator matches a vehicle operator who is insured to operate the vehicle, and wherein the server is further configured to: receive vehicle operation data that is based partially upon the vehicle sensor data and is representative of operation of the vehicle by the vehicle operator who is allowed to operate the vehicle; and calculate comprehensive vehicle insurance policy related data based upon a correlation between the vehicle operation data, the vehicle operator identity data, and the vehicle operator such that the comprehensive vehicle insurance policy related data indicates (i) driving habits of the vehicle operator who is allowed to operate the vehicle, and (ii) a proportion of driving time in which the vehicle operator was accompanied by one or more passengers when driving the vehicle.
  • 8. The system of claim 7, wherein: the vehicle module is further configured to change the state of the vehicle to allow the vehicle operator to operate the vehicle when the vehicle operator matches a vehicle operator that is insured to operate the vehicle based upon a comparison of the vehicle operator identity data to known vehicle operator identity data; and the known vehicle operator identity data is representative of (i) one or more identifiable physical aspects of vehicle operators that are insured to operate the vehicle, and (ii) images of the vehicle operators that are insured to operate the vehicle.
  • 9. The system of claim 8, wherein the server is further configured to calculate the comprehensive vehicle insurance policy related data by calculating one or more insurance options including underwriting and pricing.
  • 10. The system of claim 8, wherein the vehicle sensor data is received from at least one vehicle sensor including one or more of: a light sensor; a pressure sensor; a seat belt sensor; a seat occupancy sensor; an image sensor; a vehicle telematics system sensor; a steering wheel angle sensor; a vibration sensor; a vehicle pitch sensor; a facial recognition sensor; a fingerprint sensor; an eye scan sensor; a vehicle yaw sensor; a vehicle speed sensor; a vehicle brake sensor; a steering wheel hand sensor; an air bag sensor; a microphone; an ultrasonic sensor; and an infrared sensor.
  • 11. The system of claim 7, wherein the vehicle module is further configured to change the state of the vehicle to prevent the vehicle operator from operating the vehicle when identified facial characteristics of the vehicle operator used to assess a physical status of the vehicle operator in accordance with a predetermined set of rules indicate that the vehicle operator is in a condition unsuitable for operating the vehicle.
  • 12. A non-transitory, tangible, computer-readable medium storing instructions that, when executed by a processor, cause the processor to: identify a vehicle operator using vehicle operator identity data that is based partially upon vehicle sensor data, the vehicle operator identity data identifying one or more physical aspects of the vehicle operator and including an image of the vehicle operator; change a state of the vehicle to prevent the vehicle operator from operating the vehicle when the determined identity of the vehicle operator does not match a vehicle operator who is insured to operate the vehicle, or change a state of the vehicle to allow the vehicle operator to operate the vehicle when the determined identity of the vehicle operator matches a vehicle operator who is insured to operate the vehicle; receive vehicle operation data that is based partially upon the vehicle sensor data and is representative of operation of the vehicle by the vehicle operator who is allowed to operate the vehicle; and calculate comprehensive vehicle insurance policy related data based upon a correlation between the vehicle operation data, the vehicle operator identity data, and the vehicle operator such that the comprehensive vehicle insurance policy related data indicates (i) driving habits of the vehicle operator who is allowed to operate the vehicle, and (ii) a proportion of driving time in which the vehicle operator was accompanied by one or more passengers when driving the vehicle.
  • 13. The non-transitory, tangible, computer-readable medium of claim 12, wherein the instructions to change the state of the vehicle to allow the vehicle operator to operate the vehicle further include instructions that, when executed by the processor, cause the processor to change the state of the vehicle to allow the vehicle operator to operate the vehicle when the vehicle operator matches a vehicle operator that is insured to operate the vehicle based upon a comparison of the vehicle operator identity data to known vehicle operator identity data.
  • 14. The non-transitory, tangible, computer-readable medium of claim 13, wherein the known vehicle operator identity data is representative of (i) one or more identifiable physical aspects of vehicle operators that are insured to operate the vehicle, and (ii) images of the vehicle operators that are insured to operate the vehicle.
  • 15. The non-transitory, tangible, computer-readable medium of claim 12, wherein the instructions to change the state of the vehicle to prevent the vehicle operator from operating the vehicle further include instructions that, when executed by the processor, cause the processor to change the state of the vehicle to prevent the vehicle operator from operating the vehicle when identified facial characteristics of the vehicle operator used to assess a physical status of the vehicle operator in accordance with a predetermined set of rules indicate that the vehicle operator is in a condition unsuitable for operating the vehicle.
  • 16. The non-transitory, tangible, computer-readable medium of claim 12, wherein the instructions to calculate the comprehensive vehicle insurance policy related data further include instructions that, when executed by the processor, cause the processor to calculate one or more insurance options including underwriting and pricing.
  • 17. The non-transitory, tangible, computer-readable medium of claim 12, wherein the vehicle sensor data is received from at least one vehicle sensor including one or more of: a light sensor; a pressure sensor; a seat belt sensor; a seat occupancy sensor; an image sensor; a vehicle telematics system sensor; a steering wheel angle sensor; a vibration sensor; a vehicle pitch sensor; a facial recognition sensor; a fingerprint sensor; an eye scan sensor; a vehicle yaw sensor; a vehicle speed sensor; a vehicle brake sensor; a steering wheel hand sensor; an air bag sensor; a microphone; an ultrasonic sensor; and an infrared sensor.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/203,349, filed on Mar. 10, 2014, which claims the benefit of U.S. Provisional Application No. 61/775,652, filed on Mar. 10, 2013. The disclosure of each of which is incorporated by reference herein in its entirety.

US Referenced Citations (231)
Number Name Date Kind
4303904 Chasek Dec 1981 A
5310999 Claus et al. May 1994 A
5499182 Ousborne Mar 1996 A
5550551 Alesio Aug 1996 A
5797134 McMillan et al. Aug 1998 A
6064970 McMillan et al. May 2000 A
6313791 Klanke Nov 2001 B1
6408232 Cannon et al. Jun 2002 B1
6434510 Callaghan Aug 2002 B1
6718235 Borugian Apr 2004 B1
6741168 Webb et al. May 2004 B2
6831993 Lemelson Dec 2004 B2
6856933 Callaghan Feb 2005 B1
6868386 Henderson et al. Mar 2005 B1
7343306 Bates et al. Mar 2008 B1
7343310 Stender Mar 2008 B1
7571128 Brown Aug 2009 B1
7659827 Gunderson et al. Feb 2010 B2
7692552 Harrington Apr 2010 B2
7724145 Batra et al. May 2010 B2
7725348 Allen et al. May 2010 B1
7812712 White et al. Oct 2010 B2
7860764 Alexander et al. Dec 2010 B1
7865378 Gay Jan 2011 B2
7870010 Joao Jan 2011 B2
7873455 Arshad Jan 2011 B2
7890355 Gay et al. Feb 2011 B2
7930098 Huang Apr 2011 B2
7937278 Cripe et al. May 2011 B1
7991629 Gay et al. Aug 2011 B2
8027853 Kazenas Sep 2011 B1
8056538 Harnack Nov 2011 B2
8086523 Palmer Dec 2011 B1
8090598 Bauer et al. Jan 2012 B2
8140358 Ling et al. Mar 2012 B1
8240480 Shaw Aug 2012 B2
8280752 Cripe et al. Oct 2012 B1
8311858 Everett et al. Nov 2012 B2
8332242 Medina, III Dec 2012 B1
8352118 Mittelsteadt Jan 2013 B1
8359213 Berg et al. Jan 2013 B2
8359259 Berg et al. Jan 2013 B2
8407139 Palmer Mar 2013 B1
8423239 Blumer et al. Apr 2013 B2
8489433 Altieri et al. Jul 2013 B2
8508353 Cook et al. Aug 2013 B2
8527146 Jackson Sep 2013 B1
8538789 Blank et al. Sep 2013 B1
8566126 Hopkins, III Oct 2013 B1
8569141 Huang Oct 2013 B2
8605948 Mathony et al. Dec 2013 B2
8606512 Bogovich et al. Dec 2013 B1
8606514 Rowley et al. Dec 2013 B2
8612139 Wang et al. Dec 2013 B2
8630768 McClellan et al. Jan 2014 B2
8635091 Amigo et al. Jan 2014 B2
8655544 Fletcher et al. Feb 2014 B2
8682699 Collins et al. Mar 2014 B2
8686844 Wine Apr 2014 B1
8725408 Hochkirchen et al. May 2014 B2
8731768 Fernandes et al. May 2014 B2
8744642 Nemat-Nasser Jun 2014 B2
8781900 Schwarz Jul 2014 B2
8799035 Coleman et al. Aug 2014 B2
8799036 Christensen et al. Aug 2014 B1
8812330 Cripe et al. Aug 2014 B1
8892451 Everett et al. Nov 2014 B2
8935036 Christensen et al. Jan 2015 B1
8983677 Wright et al. Mar 2015 B2
9008956 Hyde et al. Apr 2015 B2
9031545 Srey et al. May 2015 B1
9098367 Ricci Aug 2015 B2
9105066 Gay et al. Aug 2015 B2
9141996 Christensen Sep 2015 B2
9164957 Hassib et al. Oct 2015 B2
9183441 Blumer et al. Nov 2015 B2
9208525 Hayward et al. Dec 2015 B2
9221428 Kote Dec 2015 B2
9235750 Sutton Jan 2016 B1
9256991 Crawford Feb 2016 B2
9418383 Hayward et al. Aug 2016 B1
9454786 Srey et al. Sep 2016 B1
9665997 Morgan May 2017 B2
9779458 Hayward Oct 2017 B2
20010044733 Lee et al. Nov 2001 A1
20020026394 Savage et al. Feb 2002 A1
20020111725 Burge Aug 2002 A1
20020128985 Greenwald Sep 2002 A1
20020198843 Wang et al. Dec 2002 A1
20030112133 Webb et al. Jun 2003 A1
20030191581 Ukai et al. Oct 2003 A1
20030236686 Matsumoto et al. Dec 2003 A1
20040039611 Hong et al. Feb 2004 A1
20040117358 von Kaenel et al. Jun 2004 A1
20040153362 Bauer et al. Aug 2004 A1
20040225557 Phelan Nov 2004 A1
20050024185 Chuey Feb 2005 A1
20050267784 Slen et al. Dec 2005 A1
20050283388 Eberwine et al. Dec 2005 A1
20060049925 Hara et al. Mar 2006 A1
20060053038 Warren Mar 2006 A1
20060075120 Smit Apr 2006 A1
20060079280 LaPerch Apr 2006 A1
20060095301 Gay May 2006 A1
20060114531 Webb et al. Jun 2006 A1
20060247852 Kortge et al. Nov 2006 A1
20070005404 Raz Jan 2007 A1
20070061173 Gay Mar 2007 A1
20070106539 Gay May 2007 A1
20070124045 Ayoub et al. May 2007 A1
20070156468 Gay et al. Jul 2007 A1
20070200663 White Aug 2007 A1
20070256499 Pelecanos et al. Nov 2007 A1
20070268158 Gunderson Nov 2007 A1
20070282638 Surovy Dec 2007 A1
20070288270 Gay et al. Dec 2007 A1
20070299700 Gay et al. Dec 2007 A1
20080018466 Batra et al. Jan 2008 A1
20080027761 Bracha Jan 2008 A1
20080051996 Dunning et al. Feb 2008 A1
20080065427 Helitzer et al. Mar 2008 A1
20080174451 Harrington Jul 2008 A1
20080243558 Gupte Oct 2008 A1
20080255888 Berkobin et al. Oct 2008 A1
20090002147 Bloebaum et al. Jan 2009 A1
20090024419 McClellan Jan 2009 A1
20090150023 Grau et al. Jun 2009 A1
20090210257 Chalfant et al. Aug 2009 A1
20100030568 Daman Feb 2010 A1
20100066513 Bauchot Mar 2010 A1
20100088123 McCall et al. Apr 2010 A1
20100131302 Collopy et al. May 2010 A1
20100131304 Collopy May 2010 A1
20100138244 Basir Jun 2010 A1
20100185534 Satyavolu et al. Jul 2010 A1
20100223080 Basir et al. Sep 2010 A1
20100238009 Cook et al. Sep 2010 A1
20110022421 Brown et al. Jan 2011 A1
20110040579 Havens Feb 2011 A1
20110106370 Duddle et al. May 2011 A1
20110125363 Blumer et al. May 2011 A1
20110137685 Tracy et al. Jun 2011 A1
20110153367 Amigo et al. Jun 2011 A1
20110161117 Busque et al. Jun 2011 A1
20110161118 Borden et al. Jun 2011 A1
20110195699 Tadayon Aug 2011 A1
20110200052 Mungo et al. Aug 2011 A1
20110213628 Peak et al. Sep 2011 A1
20110267186 Rao Nov 2011 A1
20110304446 Basson Dec 2011 A1
20110307188 Peng et al. Dec 2011 A1
20120004933 Foladare et al. Jan 2012 A1
20120021386 Anderson et al. Jan 2012 A1
20120029945 Altieri et al. Feb 2012 A1
20120065834 Senart Mar 2012 A1
20120072243 Collins et al. Mar 2012 A1
20120072244 Collins et al. Mar 2012 A1
20120089423 Tamir et al. Apr 2012 A1
20120089701 Goel Apr 2012 A1
20120101855 Collins et al. Apr 2012 A1
20120109418 Lorber May 2012 A1
20120109692 Collins May 2012 A1
20120158436 Bauer et al. Jun 2012 A1
20120190386 Anderson Jul 2012 A1
20120197669 Kote et al. Aug 2012 A1
20120209632 Kaminski et al. Aug 2012 A1
20120209634 Ling et al. Aug 2012 A1
20120214472 Tadayon Aug 2012 A1
20120259665 Pandhi et al. Oct 2012 A1
20120323531 Pascu et al. Dec 2012 A1
20120323772 Michael Dec 2012 A1
20120330499 Scheid et al. Dec 2012 A1
20130006675 Bowne et al. Jan 2013 A1
20130013347 Ling et al. Jan 2013 A1
20130013348 Ling et al. Jan 2013 A1
20130018677 Chevrette Jan 2013 A1
20130041521 Basir Feb 2013 A1
20130041621 Smith et al. Feb 2013 A1
20130046510 Bowne et al. Feb 2013 A1
20130046559 Coleman et al. Feb 2013 A1
20130046562 Taylor et al. Feb 2013 A1
20130046646 Malan Feb 2013 A1
20130073114 Nemat-Nasser Mar 2013 A1
20130110310 Young May 2013 A1
20130117050 Berg et al. May 2013 A1
20130144474 Ricci Jun 2013 A1
20130144657 Ricci Jun 2013 A1
20130151064 Becker et al. Jun 2013 A1
20130161110 Furst Jun 2013 A1
20130166098 Lavie Jun 2013 A1
20130166326 Lavie et al. Jun 2013 A1
20130188794 Kawamata et al. Jul 2013 A1
20130211662 Blumer et al. Aug 2013 A1
20130226624 Blessman et al. Aug 2013 A1
20130244210 Nath et al. Sep 2013 A1
20130262530 Collins et al. Oct 2013 A1
20130289819 Hassib et al. Oct 2013 A1
20130297387 Michael Nov 2013 A1
20130304276 Flies Nov 2013 A1
20130304515 Gryan et al. Nov 2013 A1
20130317693 Jefferies et al. Nov 2013 A1
20130325519 Tracy et al. Dec 2013 A1
20130345896 Blumer et al. Dec 2013 A1
20140012604 Allen, Jr. Jan 2014 A1
20140019167 Cheng Jan 2014 A1
20140019170 Coleman et al. Jan 2014 A1
20140025401 Hagelstein et al. Jan 2014 A1
20140046701 Steinberg et al. Feb 2014 A1
20140052479 Kawamura Feb 2014 A1
20140058761 Freiberger et al. Feb 2014 A1
20140074345 Gabay et al. Mar 2014 A1
20140074402 Hassib et al. Mar 2014 A1
20140089101 Meller Mar 2014 A1
20140108058 Bourne et al. Apr 2014 A1
20140111647 Atsmon et al. Apr 2014 A1
20140114696 Amigo et al. Apr 2014 A1
20140180723 Cote et al. Jun 2014 A1
20140257865 Gay et al. Sep 2014 A1
20140257866 Gay et al. Sep 2014 A1
20140257867 Gay et al. Sep 2014 A1
20140257868 Hayward et al. Sep 2014 A1
20140257869 Binion et al. Sep 2014 A1
20140257870 Cielocha et al. Sep 2014 A1
20140257871 Christensen et al. Sep 2014 A1
20140257872 Christensen et al. Sep 2014 A1
20140257873 Hayward et al. Sep 2014 A1
20140257874 Hayward et al. Sep 2014 A1
20140278574 Barber Sep 2014 A1
20140304011 Yager et al. Oct 2014 A1
20140310028 Christensen et al. Oct 2014 A1
20160086393 Collins Mar 2016 A1
Non-Patent Literature Citations (16)
Entry
Mihailescu, "An Assessment of Charter Airline Benefits for Port Elizabeth and the Eastern Cape", Chinese Business Review, pp. 34-45 (Feb. 2010).
Nerad, "Insurance by the Mile", AntiqueCar.com, Mar. 11, 2007, downloaded from the Internet at: <http://www.antiquecar.com/feature_insurance_by_the_mile.php> (3 pages).
U.S. Appl. No. 14/203,015, Notice of Allowance, dated Mar. 31, 2015.
U.S. Appl. No. 14/203,015, Office Action, dated May 22, 2014.
U.S. Appl. No. 14/203,015, Office Action, dated Oct. 29, 2014.
U.S. Appl. No. 14/203,338, Final Office Action, dated Oct. 6, 2014.
U.S. Appl. No. 14/203,338, Notice of Allowance, dated May 20, 2015.
U.S. Appl. No. 14/203,338, Office Action, dated Feb. 3, 2015.
U.S. Appl. No. 14/203,338, Office Action, dated Jun. 2, 2014.
U.S. Appl. No. 14/203,349, Final Office Action, dated Mar. 17, 2015.
U.S. Appl. No. 14/203,349, Final Office Action, dated Dec. 3, 2015.
U.S. Appl. No. 14/203,349, Nonfinal Office Action, dated Feb. 10, 2017.
U.S. Appl. No. 14/203,349, Nonfinal Office Action, dated Jun. 15, 2015.
U.S. Appl. No. 14/203,349, Nonfinal Office Action, dated May 20, 2014.
U.S. Appl. No. 14/203,349, Nonfinal Office Action, dated Oct. 23, 2014.
U.S. Appl. No. 14/203,349, Notice of Allowance, dated Jul. 26, 2017.
Provisional Applications (1)
Number Date Country
61775652 Mar 2013 US
Continuations (1)
Number Date Country
Parent 14203349 Mar 2014 US
Child 15674067 US