The present disclosure is generally directed to vehicle systems, in particular, toward semi-autonomous or fully autonomous vehicles.
In recent years, transportation methods have changed substantially. This change is due in part to a concern over the limited availability of natural resources, a proliferation in personal technology, and a societal shift to adopt more environmentally and user-friendly transportation solutions. These considerations have encouraged the development of a number of new features on vehicles that allow the user to concentrate on tasks other than driving.
While these vehicles appear to be new, they are generally implemented as a number of traditional subsystems tied together with a communication system. In fact, the design and construction of the vehicles are limited to operations that can be completed by drivers. Among other things, these limitations fail to take advantage of the autonomous nature of vehicles.
Embodiments of the present disclosure will be described in connection with a vehicle, and in accordance with one exemplary embodiment an electric vehicle and/or hybrid-electric vehicle and associated systems.
With attention to
Referring to
Referring to
Exemplary data comprises charging type 310A comprising a manual charging station 310J, a robotic charging station 310K such as robotic charging system 254, a roadway charging system 310L such as those of roadway system 250, an emergency charging system 310M such as that of emergency charging vehicle system 270, an aerial charging system 310N such as that of aerial vehicle charging system 280, and an overhead charging type 310O such as that of overhead charging system 258.
Compatible vehicle charging panel types 310B comprise locations on vehicle 100 wherein charging may be received, such as vehicle roof 130, vehicle side 160 and vehicle lower or undercarriage 140. Compatible vehicle storage units 310C data indicates storage unit types that may receive power from a given charging type 310A. Available automation level 310D data indicates the degree of automation available for a given charging type; a high level may indicate full automation, allowing the vehicle driver 220 and/or vehicle passengers 230 to not involve themselves in charging operations, while a low level of automation may require the driver 220 and/or occupant 230 to manipulate/position a vehicle charging device to engage with a particular charging type 310A to receive charging. Charging status 310E indicates whether a charging type 310A is available for charging (i.e., is "up") or is unavailable for charging (i.e., is "down"). Charge rate 310F provides a relative value for time to charge, while Cost 310G indicates the cost to vehicle 100 to receive a given charge. The Other data element 310H may provide additional data relevant to a given charging type 310A, such as a recommended separation distance between a vehicle charging plate and the charging source. The Shielding data element 310I indicates whether electromagnetic shielding is recommended for a given charging type 310A and/or charging configuration. Further data fields 310P, 310Q are possible.
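By way of a non-limiting illustration, one row of such a data structure may be represented as follows. This is only a sketch; the field names, types, and sample values are hypothetical, chosen merely to mirror data elements 310A-310I:

```python
from dataclasses import dataclass, field

@dataclass
class ChargingTypeRecord:
    """One row of a charging data structure; fields mirror elements 310A-310I."""
    charging_type: str                 # 310A, e.g. "roadway", "robotic", "overhead"
    panel_locations: list              # 310B, compatible panel positions on the vehicle
    storage_units: list                # 310C, storage unit types that may receive power
    automation_level: int              # 310D, higher value = greater automation
    status_up: bool                    # 310E, True if the charging type is "up"
    charge_rate: float                 # 310F, relative time-to-charge value
    cost: float                        # 310G, cost to the vehicle per charge unit
    other: dict = field(default_factory=dict)  # 310H, e.g. recommended separation
    shielding_recommended: bool = False        # 310I, EM shielding recommendation

# Hypothetical example row for a roadway charging type.
record = ChargingTypeRecord(
    charging_type="roadway",
    panel_locations=["undercarriage"],
    storage_units=["battery"],
    automation_level=5,
    status_up=True,
    charge_rate=0.8,
    cost=0.12,
    other={"separation_distance_cm": 15},
    shielding_recommended=True,
)
```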
The static charging areas 520A, 520B may be positioned in a static area such as a designated spot, pad, parking space 540A, 540B, traffic-controlled space (e.g., an area adjacent to a stop sign, traffic light, gate, etc.), portion of a building, portion of a structure, etc., and/or combinations thereof. Some static charging areas may require that the electric vehicle 100 be stationary before a charge, or electrical energy transfer, is initiated. The charging of vehicle 100 may occur by any of several means, including a plug or other protruding feature. The power source 516A, 516B may include a receptacle or other receiving feature, and/or vice versa.
The charging area may be a moving charging area 520C. Moving charging areas 520C may include charging areas associated with one or more portions of a vehicle, a robotic charging device, a tracked charging device, a rail charging device, etc., and/or combinations thereof. In a moving charging area 520C, the electric vehicle 100 may be configured to receive a charge, via a charging panel, while the vehicle 100 is moving and/or while the vehicle 100 is stationary. In some embodiments, the electric vehicle 100 may synchronize to move at the same speed, acceleration, and/or path as the moving charging area 520C. In one embodiment, the moving charging area 520C may synchronize to move at the same speed, acceleration, and/or path as the electric vehicle 100. In any event, the synchronization may be based on an exchange of information communicated across a communications channel between the electric vehicle 100 and the charging area 520C. Additionally or alternatively, the synchronization may be based on information associated with a movement of the electric vehicle 100 and/or the moving charging area 520C. In some embodiments, the moving charging area 520C may be configured to move along a direction or path 532 from an origin position to a destination position 520C′.
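The speed synchronization described above may be sketched as a simple proportional adjustment. This is an illustrative assumption only; a real controller would also match acceleration and path, and the gain and speeds below are arbitrary:

```python
def sync_speed(vehicle_speed, charger_speed, gain=0.5):
    """Return an adjusted vehicle speed that closes part of the gap to the
    moving charging area's speed (one proportional control step)."""
    return vehicle_speed + gain * (charger_speed - vehicle_speed)

# Iterate the adjustment until the two speeds converge within a tolerance.
v, c = 10.0, 14.0   # m/s; hypothetical vehicle and charging-area speeds
for _ in range(20):
    v = sync_speed(v, c)
```

Because each step halves the remaining speed gap, the two speeds converge quickly; the same loop could be driven by speed reports exchanged over the communications channel.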
In some embodiments, a transformer may be included to convert a power setting associated with a main power supply to a power supply used by the charging areas 520A-C. For example, the transformer may increase or decrease a voltage associated with power supplied via one or more power transmission lines.
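The turns-ratio relationship such a transformer relies on can be sketched as follows. An ideal, lossless transformer is assumed, and the voltages and turns counts are illustrative, not values from the disclosure:

```python
def transform_voltage(v_primary, n_primary, n_secondary):
    """Ideal transformer: the secondary voltage scales with the turns ratio,
    V_s = V_p * (N_s / N_p)."""
    return v_primary * (n_secondary / n_primary)

# Step a hypothetical 2400 V transmission-line supply down to 240 V (10:1).
charging_area_voltage = transform_voltage(2400.0, 100, 10)

# Step a hypothetical 120 V supply up to 480 V (1:4).
boosted_voltage = transform_voltage(120.0, 50, 200)
```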
Referring to
The charging panel controller 610 may receive signals from vehicle sensors 626 to determine, for example, if a hazard is present in the path of the vehicle 100 such that deployment of the vehicle charging panel 608 is inadvisable. The charging panel controller 610 may also query vehicle database 210 comprising data structures 300 to establish other required conditions for deployment. For example, the database may indicate that a particular roadway does not provide a charging service or that the charging service is inactive, in which case the charging panel 608 would not be deployed.
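The deployment check described above may be sketched as a simple predicate. The boolean inputs are hypothetical abstractions of the sensor 626 signals and database 210 query results:

```python
def may_deploy_panel(hazard_detected, service_available, service_active):
    """Deploy the charging panel only when no hazard is sensed in the
    vehicle's path and the roadway reports an active charging service."""
    return (not hazard_detected) and service_available and service_active
```

Any single failing condition, such as a hazard in the vehicle's path, is sufficient to keep the panel stowed.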
The power source 516 may include at least one electrical transmission line 624 and at least one power transmitter or charging area 520. During a charge, the charging panel 608 may serve to transfer energy from the power source 516 to at least one energy storage unit 612 (e.g., battery, capacitor, power cell, etc.) of the electric vehicle 100.
Robotic charging unit 700 comprises one or more robotic unit arms 704, at least one robotic unit arm 704 interconnected with charging plate 520. The one or more robotic unit arms 704 maneuver charging plate 520 relative to charging panel 608 of vehicle 100. Charging plate 520 is positioned to a desired or selectable separation distance, as assisted by a separation distance sensor disposed on charging plate 520. Charging plate 520 may remain at a finite separation distance from charging panel 608, or may directly contact charging panel 608 (i.e., such that the separation distance is zero). Charging may be by induction. In alternative embodiments, the separation distance sensor is additionally or instead disposed on robotic arm 704. Vehicle 100 receives charging via charging panel 608, which in turn charges energy storage unit 612. Charging panel controller 610 is in communication with energy storage unit 612, charging panel 608, vehicle database 210, charge provider controller 622, and/or any one of the elements of instrument panel 400.
Robotic charging unit 700 further comprises, is in communication with, and/or is interconnected with charge provider controller 622, power source 516 and a robotic unit database. Power source 516 supplies power, such as electrical power, to charging plate 520 to enable charging of vehicle 100 via charging panel 608. Controller 622 maneuvers or operates robotic unit arm 704, either directly and/or completely, or with assistance from a remote user, such as a driver or passenger in vehicle 100, by way of, in one embodiment, manual charging controller 432.
The overhead charging cable or first wire 814 is analogous to a contact wire used to provide charging to electric trains or other vehicles. An external source provides or supplies electrical power to the first wire 814. The charge provider comprises an energy source, i.e., a provider battery, and a provider charge circuit or controller in communication with the provider battery. The overhead charging cable or first wire 814 engages the overhead contact 824, which is in electrical communication with charge receiver panel 608. The overhead contact 824 may comprise any known means to connect to overhead electrical power cables, such as a pantograph 820, a bow collector, a trolley pole or any means known to those skilled in the art. Further disclosure regarding electrical power or energy transfer via overhead systems is found in US Pat. Publ. No. 2013/0105264 to Ruth, entitled "Pantograph Assembly," the entire contents of which are incorporated by reference for all purposes. In one embodiment, the charging of vehicle 100 by overhead charging system 800 via overhead contact 824 is by any means known to those skilled in the art, including those described in the above-referenced US Pat. Publ. No. 2013/0105264 to Ruth.
The overhead contact 824 presses against the underside of the lowest overhead wire of the overhead charging system, i.e., the overhead charging cable or first wire 814, also known as the contact wire. The overhead contact 824 may be electrically conductive. Alternatively or additionally, the overhead contact 824 may be adapted to receive electrical power from the overhead charging cable or first wire 814 by inductive charging.
In one embodiment, the receipt and/or control of the energy provided via overhead contact 824 (as connected to the energy storage unit 612) is provided by a receiver charge circuit or charging panel controller 610.
Overhead contact 824 and/or charging panel 608 may be located anywhere on vehicle 100, to include, for example, the roof, side panel, trunk, hood, or front or rear bumper of the charge receiver vehicle 100, as long as the overhead contact 824 may engage the overhead charging cable or first wire 814. Charging panel 608 may be stationary (e.g., disposed on the roof of vehicle 100) or may be moveable, e.g., moveable with the pantograph 820. Pantograph 820 may be positioned in at least two states, comprising retracted and extended. In the extended state, pantograph 820 engages first wire 814 by way of the overhead contact 824. In the retracted state, pantograph 820 may typically reside flush with the roof of vehicle 100 and extend only when required for charging. Control of the charging and/or positioning of the charging panel 608, pantograph 820 and/or overhead contact 824 may be manual, automatic or semi-automatic (such as via controller 610); said control may be performed through a GUI engaged by a driver or occupant of receiving vehicle 100 and/or a driver or occupant of the charging vehicle.
With reference to
In one embodiment, the charging plate 520 is not in physical interconnection with AV 280, that is, there is no tether 1010. In this embodiment, the charging plate 520 is positioned and controlled by AV 280 by way of a controller on AV 280 or in communication with AV 280.
In one embodiment, the charging plate 520 position and/or characteristics (e.g. charging power level, flying separation distance, physical engagement on/off) are controlled by vehicle 100 and/or a user in or driver of vehicle 100.
Charge or power output of power source 516 is provided or transmitted to charger plate 520 by way of a charging cable or wire, which may be integral to tether 1010. In one embodiment, the charging cable is non-structural, that is, it provides little or no structural support to the connection between AV 280 and charger plate 520.
Charging panel 608 of vehicle 100 receives power from charger plate 520. Charging panel 608 and charger plate 520 may be in direct physical contact (termed a "contact" charger configuration) or not in direct physical contact (termed a "flyer" charger configuration), but must be at or below a threshold (separation) distance to enable charging, such as by induction. Energy transfer or charging from the charger plate 520 to the charging panel 608 is by inductive charging (i.e., the use of an electromagnetic field to transfer energy between two objects). The charging panel 608 provides received power to energy storage unit 612 by way of charging panel controller 610. Charging panel controller 610 is in communication with vehicle database 210, which comprises an AV charging data structure.
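The contact/flyer distinction and the separation-distance threshold may be illustrated as follows. The 5 cm threshold is an assumed figure for the sketch, not a value given in the disclosure:

```python
def charge_mode(separation_m, threshold_m=0.05):
    """Classify the charger configuration and report whether inductive
    transfer is enabled at the given separation distance.
    Returns a (configuration, charging_enabled) pair."""
    if separation_m == 0:
        return "contact", True           # plate touches the panel
    if separation_m <= threshold_m:
        return "flyer", True             # close enough for induction
    return "flyer", False                # too far apart to charge
```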
Charging panel 608 may be located anywhere on vehicle 100, to include, for example, the roof, side panel, trunk, hood, front or rear bumper and wheel hub of vehicle 100. Charging panel 608 is mounted on the roof of vehicle 100 in the embodiment of
Charging panel controller 610 may be located anywhere on charge receiver vehicle 100, to include, for example, the roof, side panel, trunk, hood, front or rear bumper and wheel hub of the charge receiver vehicle 100. In some embodiments, charging panel 608 may be deployable, i.e., may extend or deploy only when charging is needed. For example, charging panel 608 may typically stow flush with the lower plane of vehicle 100 and extend when required for charging. Similarly, charger plate 520 may, in one embodiment, not be connected to the lower rear of the emergency charging vehicle 270 by way of connector 1150 and may instead be mounted elsewhere on the emergency charging vehicle 270, to include, for example, the roof, side panel, trunk, hood, front or rear bumper and wheel hub of emergency charging vehicle 270. Connector 1150 may be configured to maneuver charger plate 520 to any position on emergency charging vehicle 270 so as to enable charging. Control of the charging and/or positioning of the charging plate may be manual, automatic or semi-automatic; said control may be performed through a GUI engaged by a driver or occupant of the receiving vehicle and/or a driver or occupant of the charging vehicle.
Referring now to
The structural subsystem includes the frame 1204 of the vehicle 100. The frame 1204 may comprise a separate frame and body construction (i.e., body-on-frame construction), a unitary frame and body construction (i.e., a unibody construction), or any other construction defining the structure of the vehicle 100. The frame 1204 may be made from one or more materials including, but in no way limited to, steel, titanium, aluminum, carbon fiber, plastic, polymers, etc., and/or combinations thereof. In some embodiments, the frame 1204 may be formed, welded, fused, fastened, pressed, etc., and/or combinations thereof, or otherwise shaped to define a physical structure and strength of the vehicle 100. In any event, the frame 1204 may comprise one or more surfaces, connections, protrusions, cavities, mounting points, tabs, slots, or other features that are configured to receive other components that make up the vehicle 100. For example, the body panels, powertrain subsystem, controls systems, interior components, communications subsystem, and safety subsystem may interconnect with, or attach to, the frame 1204 of the vehicle 100.
The frame 1204 may include one or more modular system and/or subsystem connection mechanisms. These mechanisms may include features that are configured to provide a selectively interchangeable interface for one or more of the systems and/or subsystems described herein. The mechanisms may provide for a quick exchange, or swapping, of components while providing enhanced security and adaptability over conventional manufacturing or attachment. For instance, the ability to selectively interchange systems and/or subsystems in the vehicle 100 allows the vehicle 100 to adapt to the ever-changing technological demands of society and advances in safety. Among other things, the mechanisms may provide for the quick exchange of batteries, capacitors, power sources 1308A, 1308B, motors 1312, engines, safety equipment, controllers, user interfaces, interior and exterior components, body panels 1208, bumpers 1316, sensors, etc., and/or combinations thereof. Additionally or alternatively, the mechanisms may provide unique security hardware and/or software embedded therein that, among other things, can prevent fraudulent or low-quality construction replacements from being used in the vehicle 100. Similarly, the mechanisms, subsystems, and/or receiving features in the vehicle 100 may employ poka-yoke, or mistake-proofing, features that ensure a particular mechanism is always interconnected with the vehicle 100 in a correct position, function, etc.
By way of example, complete systems or subsystems may be removed and/or replaced from a vehicle 100 utilizing a single minute exchange principle. In some embodiments, the frame 1204 may include slides, receptacles, cavities, protrusions, and/or a number of other features that allow for quick exchange of system components. In one embodiment, the frame 1204 may include tray or ledge features, mechanical interconnection features, locking mechanisms, retaining mechanisms, etc., and/or combinations thereof. In some embodiments, it may be beneficial to quickly remove a used power source 1308A, 1308B (e.g., battery unit, capacitor unit, etc.) from the vehicle 100 and replace the used power source 1308A, 1308B with a charged power source. Continuing this example, the power source 1308A, 1308B may include selectively interchangeable features that interconnect with the frame 1204 or other portion of the vehicle 100. For instance, in a power source 1308A, 1308B replacement, the quick release features may be configured to release the power source 1308A, 1308B from an engaged position and slide or move away from the frame 1204 of a vehicle 100. Once removed, the power source 1308A, 1308B may be replaced (e.g., with a new power source, a charged power source, etc.) by engaging the replacement power source into a system receiving position adjacent to the vehicle 100. In some embodiments, the vehicle 100 may include one or more actuators configured to position, lift, slide, or otherwise engage the replacement power source with the vehicle 100. In one embodiment, the replacement power source may be inserted into the vehicle 100 or vehicle frame 1204 with mechanisms and/or machines that are external or separate from the vehicle 100.
In some embodiments, the frame 1204 may include one or more features configured to selectively interconnect with other vehicles and/or portions of vehicles. These selectively interconnecting features can allow for one or more vehicles to selectively couple together and decouple for a variety of purposes. For example, it is an aspect of the present disclosure that a number of vehicles may be selectively coupled together to share energy, increase power output, provide security, decrease power consumption, provide towing services, and/or provide a range of other benefits. Continuing this example, the vehicles may be coupled together based on travel route, destination, preferences, settings, sensor information, and/or some other data. The coupling may be initiated by at least one controller of the vehicle and/or traffic control system upon determining that a coupling is beneficial to one or more vehicles in a group of vehicles or a traffic system. As can be appreciated, the power consumption for a group of vehicles traveling in a same direction may be reduced or decreased by removing any aerodynamic separation between vehicles. In this case, the vehicles may be coupled together to subject only the foremost vehicle in the coupling to air and/or wind resistance during travel. In one embodiment, the power output by the group of vehicles may be proportionally or selectively controlled to provide a specific output from each of the one or more of the vehicles in the group.
The interconnecting, or coupling, features may be configured as electromagnetic mechanisms, mechanical couplings, electromechanical coupling mechanisms, etc., and/or combinations thereof. The features may be selectively deployed from a portion of the frame 1204 and/or body of the vehicle 100. In some cases, the features may be built into the frame 1204 and/or body of the vehicle 100. In any event, the features may deploy from an unexposed position to an exposed position or may be configured to selectively engage/disengage without requiring an exposure or deployment of the mechanism from the frame 1204 and/or body. In some embodiments, the interconnecting features may be configured to interconnect one or more of power, communications, electrical energy, fuel, and/or the like. One or more of the power, mechanical, and/or communications connections between vehicles may be part of a single interconnection mechanism. In some embodiments, the interconnection mechanism may include multiple connection mechanisms. In any event, the interconnection mechanism, whether single or multiple, may employ the poka-yoke features as described above.
The power system of the vehicle 100 may include the powertrain, power distribution system, accessory power system, and/or any other components that store power, provide power, convert power, and/or distribute power to one or more portions of the vehicle 100. The powertrain may include the one or more electric motors 1312 of the vehicle 100. The electric motors 1312 are configured to convert electrical energy provided by a power source into mechanical energy. This mechanical energy may be in the form of a rotational or other output force that is configured to propel or otherwise provide a motive force for the vehicle 100.
In some embodiments, the vehicle 100 may include one or more drive wheels 1320 that are driven by the one or more electric motors 1312 and motor controllers 1314. In some cases, the vehicle 100 may include an electric motor 1312 configured to provide a driving force for each drive wheel 1320. In other cases, a single electric motor 1312 may be configured to share an output force between two or more drive wheels 1320 via one or more power transmission components. It is an aspect of the present disclosure that the powertrain include one or more power transmission components, motor controllers 1314, and/or power controllers that can provide a controlled output of power to one or more of the drive wheels 1320 of the vehicle 100. The power transmission components, power controllers, or motor controllers 1314 may be controlled by at least one other vehicle controller described herein.
As provided above, the powertrain of the vehicle 100 may include one or more power sources 1308A, 1308B. These one or more power sources 1308A, 1308B may be configured to provide drive power, system and/or subsystem power, accessory power, etc. While described herein as a single power source 1308 for sake of clarity, embodiments of the present disclosure are not so limited. For example, it should be appreciated that independent, different, or separate power sources 1308A, 1308B may provide power to various systems of the vehicle 100. For instance, a drive power source may be configured to provide the power for the one or more electric motors 1312 of the vehicle 100, while a system power source may be configured to provide the power for one or more other systems and/or subsystems of the vehicle 100. Other power sources may include an accessory power source, a backup power source, a critical system power source, and/or other separate power sources. Separating the power sources 1308A, 1308B in this manner may provide a number of benefits over conventional vehicle systems. For example, separating the power sources 1308A, 1308B allows one power source 1308 to be removed and/or replaced independently without requiring that power be removed from all systems and/or subsystems of the vehicle 100 during a power source 1308 removal/replacement. For instance, one or more of the accessories, communications, safety equipment, and/or backup power systems, etc., may be maintained even when a particular power source 1308A, 1308B is depleted, removed, or becomes otherwise inoperable.
In some embodiments, the drive power source may be separated into two or more cells, units, sources, and/or systems. By way of example, a vehicle 100 may include a first drive power source 1308A and a second drive power source 1308B. The first drive power source 1308A may be operated independently from or in conjunction with the second drive power source 1308B and vice versa. Continuing this example, the first drive power source 1308A may be removed from a vehicle while a second drive power source 1308B can be maintained in the vehicle 100 to provide drive power. This approach allows the vehicle 100 to significantly reduce weight (e.g., of the first drive power source 1308A, etc.) and improve power consumption, even if only for a temporary period of time. In some cases, a vehicle 100 running low on power may automatically determine that pulling over to a rest area or emergency lane and removing, or "dropping off," at least one power source 1308A, 1308B may reduce enough weight of the vehicle 100 to allow the vehicle 100 to navigate to the closest power source replacement and/or charging area. In some embodiments, the removed, or "dropped off," power source 1308A may be collected by a collection service, vehicle mechanic, tow truck, or even another vehicle or individual.
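The drop-off decision described above can be sketched as a rough feasibility test. The linear mass-to-consumption model and every figure below are illustrative assumptions for the sketch, not values from the disclosure:

```python
def can_reach_after_drop(remaining_wh, distance_km,
                         consumption_wh_per_km, mass_kg, drop_kg,
                         mass_sensitivity=0.6):
    """Rough sketch: assume dropping `drop_kg` reduces per-km energy
    consumption in proportion to the fraction of vehicle mass shed,
    scaled by an assumed sensitivity factor. Return whether the
    remaining stored energy then covers the distance to the nearest
    power source replacement and/or charging area."""
    reduced = consumption_wh_per_km * (1 - mass_sensitivity * drop_kg / mass_kg)
    return remaining_wh >= distance_km * reduced

# Hypothetical figures: 1 kWh left, 150 Wh/km baseline consumption,
# 1800 kg vehicle dropping a 300 kg power source.
reachable_7km = can_reach_after_drop(1000, 7, 150, 1800, 300)
reachable_8km = can_reach_after_drop(1000, 8, 150, 1800, 300)
```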
The power source 1308 may include a GPS or other geographical location system that may be configured to emit a location signal to one or more receiving entities. For instance, the signal may be broadcast or targeted to a specific receiving party. Additionally or alternatively, the power source 1308 may include a unique identifier that may be used to associate the power source 1308 with a particular vehicle 100 or vehicle user. This unique identifier may allow efficient recovery of a dropped-off power source 1308. In some embodiments, the unique identifier may provide information for the particular vehicle 100 or vehicle user to be billed or charged with a cost of recovery for the power source 1308.
The power source 1308 may include a charge controller 1324 that may be configured to determine charge levels of the power source 1308, control a rate at which charge is drawn from the power source 1308, control a rate at which charge is added to the power source 1308, and/or monitor a health of the power source 1308 (e.g., one or more cells, portions, etc.). In some embodiments, the charge controller 1324 or the power source 1308 may include a communication interface. The communication interface can allow the charge controller 1324 to report a state of the power source 1308 to one or more other controllers of the vehicle 100 or even communicate with a communication device separate and/or apart from the vehicle 100. Additionally or alternatively, the communication interface may be configured to receive instructions (e.g., control instructions, charge instructions, communication instructions, etc.) from one or more other controllers of the vehicle 100 or a communication device that is separate and/or apart from the vehicle 100.
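The rate-limiting and level-tracking behavior attributed to the charge controller 1324 may be sketched as follows. The capacities, limits, and class shape are illustrative assumptions; cell-health monitoring and the communication interface are omitted:

```python
class ChargeController:
    """Minimal sketch of a charge controller: tracks the stored charge
    level and clamps charge/discharge rates to configured limits."""

    def __init__(self, capacity_wh, max_charge_w, max_discharge_w):
        self.capacity_wh = capacity_wh
        self.level_wh = 0.0
        self.max_charge_w = max_charge_w
        self.max_discharge_w = max_discharge_w

    def add_charge(self, power_w, hours):
        """Add energy, limiting the rate at which charge is added."""
        power_w = min(power_w, self.max_charge_w)
        self.level_wh = min(self.capacity_wh, self.level_wh + power_w * hours)
        return self.level_wh

    def draw_charge(self, power_w, hours):
        """Draw energy, limiting the rate at which charge is drawn."""
        power_w = min(power_w, self.max_discharge_w)
        energy = min(self.level_wh, power_w * hours)
        self.level_wh -= energy
        return energy

    def state_of_charge(self):
        """Report the charge level as a fraction of capacity."""
        return self.level_wh / self.capacity_wh

# Hypothetical 10 kWh source, 5 kW charge limit, 8 kW discharge limit.
ctrl = ChargeController(10000, 5000, 8000)
ctrl.add_charge(7000, 1.0)       # request exceeds the limit; clamped to 5 kW
drawn = ctrl.draw_charge(10000, 0.25)   # clamped to 8 kW for 15 minutes
```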
The powertrain includes one or more power distribution systems configured to transmit power from the power source 1308 to one or more electric motors 1312 in the vehicle 100. The power distribution system may include electrical interconnections 1328 in the form of cables, wires, traces, wireless power transmission systems, etc., and/or combinations thereof. It is an aspect of the present disclosure that the vehicle 100 include one or more redundant electrical interconnections 1332 of the power distribution system. The redundant electrical interconnections 1332 can allow power to be distributed to one or more systems and/or subsystems of the vehicle 100 even in the event of a failure of an electrical interconnection portion of the vehicle 100 (e.g., due to an accident, mishap, tampering, or other harm to a particular electrical interconnection, etc.). In some embodiments, a user of a vehicle 100 may be alerted via a user interface associated with the vehicle 100 that a redundant electrical interconnection 1332 is being used and/or damage has occurred to a particular area of the vehicle electrical system. In any event, the one or more redundant electrical interconnections 1332 may be configured along completely different routes than the electrical interconnections 1328 and/or include different modes of failure than the electrical interconnections 1328 to, among other things, prevent a total interruption of power distribution in the event of a failure.
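The failover behavior of the redundant interconnections may be illustrated with a minimal path-selection routine. The function and its return values are hypothetical abstractions; the user alert is noted only as a comment:

```python
def route_power(primary_ok, redundant_ok):
    """Select a power distribution path: prefer the primary
    interconnection (1328), fall back to the redundant interconnection
    (1332), and report a fault if both paths have failed."""
    if primary_ok:
        return "primary"
    if redundant_ok:
        # A user-interface alert would be raised here to indicate
        # that the redundant interconnection is in use.
        return "redundant"
    return "fault"
```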
In some embodiments, the power distribution system may include an energy recovery system 1336. This energy recovery system 1336, or kinetic energy recovery system, may be configured to recover energy produced by the movement of a vehicle 100. The recovered energy may be stored as electrical and/or mechanical energy. For instance, as a vehicle 100 travels or moves, a certain amount of energy is required to accelerate, maintain a speed, stop, or slow the vehicle 100. In any event, a moving vehicle has a certain amount of kinetic energy. When brakes are applied in a typical moving vehicle, most of the kinetic energy of the vehicle is lost as heat generated in the braking mechanism. In an energy recovery system 1336, when a vehicle 100 brakes, at least a portion of the kinetic energy is converted into electrical and/or mechanical energy for storage. Mechanical energy may be stored as mechanical movement (e.g., in a flywheel, etc.) and electrical energy may be stored in batteries, capacitors, and/or some other electrical storage system. In some embodiments, electrical energy recovered may be stored in the power source 1308. For example, the recovered electrical energy may be used to charge the power source 1308 of the vehicle 100.
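The energy available for recovery follows from the kinetic energy of the moving vehicle, which may be sketched as follows. The 60% capture efficiency and the vehicle figures are assumed values for illustration only:

```python
def recoverable_energy_wh(mass_kg, speed_mps, efficiency=0.6):
    """Kinetic energy of the moving vehicle (1/2 * m * v^2, in joules),
    scaled by the fraction the recovery system can capture, and
    converted to watt-hours (3600 J per Wh)."""
    kinetic_j = 0.5 * mass_kg * speed_mps ** 2
    return efficiency * kinetic_j / 3600.0

# A hypothetical 1800 kg vehicle braking from 20 m/s (72 km/h) carries
# 100 Wh of kinetic energy, of which 60 Wh would be captured here.
recovered = recoverable_energy_wh(1800, 20.0)
```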
The vehicle 100 may include one or more safety systems. Vehicle safety systems can include a variety of mechanical and/or electrical components including, but in no way limited to, low impact or energy-absorbing bumpers 1316A, 1316B, crumple zones, reinforced body panels, reinforced frame components, impact bars, power source containment zones, safety glass, seatbelts, supplemental restraint systems, air bags, escape hatches, removable access panels, impact sensors, accelerometers, vision systems, radar systems, etc., and/or the like. In some embodiments, one or more of the safety components may include a safety sensor or group of safety sensors associated with the one or more safety components. For example, a crumple zone may include one or more strain gages, impact sensors, pressure transducers, etc. These sensors may be configured to detect or determine whether a portion of the vehicle 100 has been subjected to a particular force, deformation, or other impact. Once detected, the information collected by the sensors may be transmitted or sent to one or more of a controller of the vehicle 100 (e.g., a safety controller, vehicle controller, etc.) or a communication device associated with the vehicle 100 (e.g., across a communication network, etc.).
In some embodiments, the vehicle 100 may include an inductive charging system and inductive charger 1412. The inductive charger 1412 may be configured to receive electrical energy from an inductive power source external to the vehicle 100. In one embodiment, when the vehicle 100 and/or the inductive charger 1412 is positioned over an inductive power source external to the vehicle 100, electrical energy can be transferred from the inductive power source to the vehicle 100. For example, the inductive charger 1412 may receive the charge and transfer the charge via at least one power transmission interconnection 1408 to the charge controller 1324 and/or the power source 1308 of the vehicle 100. The inductive charger 1412 may be concealed in a portion of the vehicle 100 (e.g., at least partially protected by the frame 1204, one or more body panels 1208, a shroud, a shield, a protective cover, etc., and/or combinations thereof) and/or may be deployed from the vehicle 100. In some embodiments, the inductive charger 1412 may be configured to receive charge only when the inductive charger 1412 is deployed from the vehicle 100. In other embodiments, the inductive charger 1412 may be configured to receive charge while concealed in the portion of the vehicle 100.
In addition to the mechanical components described herein, the vehicle 100 may include a number of user interface devices. The user interface devices receive and translate human input into a mechanical movement or electrical signal or stimulus. The human input may be one or more of motion (e.g., body movement, body part movement, in two-dimensional or three-dimensional space, etc.), voice, touch, and/or physical interaction with the components of the vehicle 100. In some embodiments, the human input may be configured to control one or more functions of the vehicle 100 and/or systems of the vehicle 100 described herein. User interfaces may include, but are in no way limited to, at least one graphical user interface of a display device, steering wheel or mechanism, transmission lever or button (e.g., including park, neutral, reverse, and/or drive positions, etc.), throttle control pedal or mechanism, brake control pedal or mechanism, power control switch, communications equipment, etc.
An embodiment of the electrical system 1500 associated with the vehicle 100 may be as shown in
The power generation unit 1504 may be as described in conjunction with
The billing and cost control unit 1512 may interface with the power management controller 1324 to determine the amount of charge or power provided to the power storage 612 through the power generation unit 1504. The billing and cost control unit 1512 can then provide information for billing the vehicle owner. Thus, the billing and cost control unit 1512 can receive and/or send power information to third party system(s) regarding the received charge from an external source. The information provided can help determine an amount of money required, from the owner of the vehicle, as payment for the provided power. Alternatively, or in addition, if the owner of the vehicle provided power to another vehicle (or another device/system), that owner may be owed compensation for the provided power or energy, e.g., a credit.
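As a non-limiting illustration of the charge/credit accounting described above, the net amount owed by the vehicle owner may be computed from energy received, energy provided to another vehicle or system, and a tariff. The names and rate are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class ChargeSession:
    kwh_received: float   # energy delivered to the vehicle's power storage
    kwh_provided: float   # energy the vehicle supplied to another vehicle/system
    rate_per_kwh: float   # tariff agreed with the third-party billing system

def net_amount(session: ChargeSession) -> float:
    """Positive result: the owner owes payment; negative: the owner is due a credit."""
    return round((session.kwh_received - session.kwh_provided) * session.rate_per_kwh, 2)
```

A session that only received power produces a charge, while one that only provided power produces a credit, matching the compensation case described above.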
The power management controller 1324 can be a computer or computing system(s) and/or electrical system with associated components, as described herein, capable of managing the power generation unit 1504 to receive power, routing the power to the power storage 612, and then providing the power from either the power generation unit 1504 and/or the power storage 612 to the loads 1508. Thus, the power management controller 1324 may execute programming that controls switches, devices, components, etc. involved in the reception, storage, and provision of the power in the electrical system 1500.
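The routing decision described above — serve the loads from generation first, charge storage with any surplus, and draw any shortfall from storage — may be sketched as follows. The function name and kW-based interface are illustrative assumptions:

```python
def route_power(load_demand_kw: float, generation_kw: float,
                storage_kw_available: float) -> tuple[float, float, float]:
    """Return (from_generation, from_storage, to_storage) in kW.

    Generation serves the loads first; any surplus charges the power
    storage; any shortfall is drawn from storage, limited to what the
    storage can currently deliver.
    """
    from_generation = min(load_demand_kw, generation_kw)
    shortfall = load_demand_kw - from_generation
    from_storage = min(shortfall, storage_kw_available)
    to_storage = max(generation_kw - load_demand_kw, 0.0)
    return from_generation, from_storage, to_storage
```

In practice the power management controller 1324 would drive switches and converters to realize such an allocation rather than simply compute it.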
An embodiment of the power generation unit 1504 may be as shown in
Another power source 1308 may include wired or wireless charging 1608. The wireless charging system 1608 may include inductive and/or resonant frequency inductive charging systems that can include coils, frequency generators, controllers, etc. Wired charging may be any kind of grid-connected charging that has a physical connection, although wireless charging may also be grid-connected through a wireless interface. The wired charging system can include connectors, wired interconnections, the controllers, etc. The wired and wireless charging systems 1608 can provide power to the power generation unit 1504 from external power sources 1308.
Internal sources for power may include a regenerative braking system 1612. The regenerative braking system 1612 can convert the kinetic energy of the moving car into electrical energy through a generation system mounted within the wheels, axle, and/or braking system of the vehicle 100. The regenerative braking system 1612 can include any coils, magnets, electrical interconnections, converters, controllers, etc. required to convert the kinetic energy into electrical energy.
Another source of power 1308, internal to or associated with the vehicle 100, may be a solar array 1616. The solar array 1616 may include any system or device of one or more solar cells mounted on the exterior of the vehicle 100 or integrated within the body panels of the vehicle 100 that provides or converts solar energy into electrical energy to provide to the power generation unit 1504.
The power sources 1308 may be connected to the power generation unit 1504 through an electrical interconnection 1618. The electrical interconnection 1618 can include any wire, interface, bus, etc. between the one or more power sources 1308 and the power generation unit 1504.
The power generation unit 1504 can also include a power source interface 1620. The power source interface 1620 can be any type of physical and/or electrical interface used to receive the electrical energy from the one or more power sources 1308; thus, the power source interface 1620 can include an electrical interface 1624 that receives the electrical energy and a mechanical interface 1628 which may include wires, connectors, or other types of devices or physical connections. The mechanical interface 1628 can also include a physical/electrical connection 1634 to the power generation unit 1504.
The electrical energy from the power source 1308 can be processed through the power source interface 1620 to an electric converter 1632. The electric converter 1632 may convert the characteristics of the power from one of the power sources into a useable form that may be used either by the power storage 612 or one or more loads 1508 within the vehicle 100. The electric converter 1632 may include any electronics or electrical devices and/or component that can change electrical characteristics, e.g., AC frequency, amplitude, phase, etc. associated with the electrical energy provided by the power source 1308. The converted electrical energy may then be provided to an optional conditioner 1638. The conditioner 1638 may include any electronics or electrical devices and/or component that may further condition the converted electrical energy by removing harmonics, noise, etc. from the electrical energy to provide a more stable and effective form of power to the vehicle 100.
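The conditioning step described above can be pictured as a smoothing filter applied to the converted waveform. The moving-average implementation below is an illustrative sketch of noise removal, not the actual conditioner 1638:

```python
def condition(samples: list[float], window: int = 3) -> list[float]:
    """Conditioner sketch: a trailing moving-average filter that smooths
    high-frequency noise/harmonic spikes from a sampled waveform.

    `window` is the number of trailing samples averaged (an assumption;
    a real conditioner would use analog filtering or DSP)."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out
```

A single large spike in the input is spread and attenuated in the output, which is the qualitative effect the conditioner provides.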
An embodiment of the power storage 612 may be as shown in
The battery 1704 can be any type of battery for storing electrical energy, for example, a lithium ion battery, a lead acid battery, a nickel cadmium battery, etc. Further, the battery 1704 may include different types of power storage systems, such as, ionic fluids or other types of fuel cell systems. The energy storage 1704 may also include one or more high-capacity capacitors 1704. The capacitors 1704 may be used for long-term or short-term storage of electrical energy. The input into the battery or capacitor 1704 may be different from the output, and thus, the capacitor 1704 may be charged quickly but drain slowly. The functioning of the converter 1632 and battery/capacitor 1704 may be monitored or managed by a charge management unit 1708.
The charge management unit 1708 can include any hardware (e.g., any electronics or electrical devices and/or components), software, or firmware operable to adjust the operations of the converter 1632 or batteries/capacitors 1704. The charge management unit 1708 can receive inputs or periodically monitor the converter 1632 and/or battery/capacitor 1704; from this information, the charge management unit 1708 may then adjust settings or inputs into the converter 1632 or battery/capacitor 1704 to control the operation of the power storage system 612.
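One common adjustment such a charge management unit might make is tapering the charge current as the battery approaches full. The constant-current/linear-taper policy and the 80% knee below are illustrative assumptions only:

```python
def adjust_charge_current(battery_soc: float, max_current_a: float) -> float:
    """Charge-management sketch: hold maximum current below 80% state of
    charge, then taper linearly to zero at 100% SOC.

    battery_soc is in [0.0, 1.0]; the 0.8 knee is an assumed policy value."""
    if battery_soc < 0.8:
        return max_current_a
    return max_current_a * max(0.0, (1.0 - battery_soc) / 0.2)
```

The charge management unit 1708 would feed the resulting setpoint back into the converter 1632 as one of the "settings or inputs" described above.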
An embodiment of one or more loads 1508 associated with the vehicle 100 may be as shown in
The electric motor 1804 can be any type of DC or AC electric motor. The electric motor may be a direct drive or induction motor using permanent magnets and/or windings either on the stator or rotor. The electric motor 1804 may also be brushless or include brush contacts. The electric motor 1804 may be capable of providing a torque and enough kinetic energy to move the vehicle 100 in traffic.
The different loads 1508 may also include environmental loads 1812, sensor loads 1816, safety loads 1820, user interaction loads 1808, etc. User interaction loads 1808 can be any energy used by user interfaces or systems that interact with the driver and/or passenger(s). These loads 1808 may include, for example, the heads up display, the dash display, the radio, user interfaces on the head unit, lights, and/or other types of loads that provide or receive information from the occupants of the vehicle 100. The environmental loads 1812 can be any loads used to control the environment within the vehicle 100. For example, the air conditioning or heating unit of the vehicle 100 can be environmental loads 1812. Other environmental loads can include lights, fans, and/or defrosting units, etc. that may control the environment within the vehicle 100. The sensor loads 1816 can be any loads used by sensors, for example, air bag sensors, GPS, and other such sensors used to either manage or control the vehicle 100 and/or provide information or feedback to the vehicle occupants. The safety loads 1820 can include any safety equipment, for example, seat belt alarms, airbags, headlights, blinkers, etc. that may be used to manage the safety of the occupants. There may be more or fewer loads than those described herein, although they may not be shown in
The communications componentry can include one or more wired or wireless devices such as a transceiver(s) and/or modem that allows communications not only between the various systems disclosed herein but also with other devices, such as devices on a network, and/or on a distributed network such as the Internet and/or in the cloud.
The communications subsystem can also include inter- and intra-vehicle communications capabilities such as hotspot and/or access point connectivity for any one or more of the vehicle occupants and/or vehicle-to-vehicle communications.
Additionally, and while not specifically illustrated, the communications subsystem can include one or more communications links (that can be wired or wireless) and/or communications busses (managed by the bus manager), including one or more of CANbus, OBD-II, ARINC 429, Byteflight, CAN (Controller Area Network), D2B (Domestic Digital Bus), FlexRay, DC-BUS, IDB-1394, IEBus, I2C, ISO 9141-1/-2, J1708, J1587, J1850, J1939, ISO 11783, Keyword Protocol 2000, LIN (Local Interconnect Network), MOST (Media Oriented Systems Transport), Multifunction Vehicle Bus, SMARTwireX, SPI, VAN (Vehicle Area Network), and the like or in general any communications protocol and/or standard.
The various protocols and communications can be communicated one or more of wirelessly and/or over transmission media such as single wire, twisted pair, fiber optic, IEEE 1394, MIL-STD-1553, MIL-STD-1773, power-line communication, or the like. (All of the above standards and protocols are incorporated herein by reference in their entirety.)
As discussed, the communications subsystem enables communications between any of the inter-vehicle systems and subsystems as well as communications with non-collocated resources, such as those reachable over a network such as the Internet.
In addition to well-known componentry (which has been omitted for clarity), the device communications subsystem 1900 includes interconnected elements including one or more of: one or more antennas 1904, an interleaver/deinterleaver 1908, an analog front end (AFE) 1912, memory/storage/cache 1916, controller/microprocessor 1920, MAC circuitry 1922, modulator/demodulator 1924, encoder/decoder 1928, a plurality of connectivity managers 1934-1966, GPU 1942, accelerator 1944, a multiplexer/demultiplexer 1954, transmitter 1970, receiver 1972, and wireless radio components such as a Wi-Fi PHY/Bluetooth® module 1980, a Wi-Fi/BT MAC module 1984, transmitter 1988, and receiver 1992. The various elements in the device 1900 are connected by one or more links/busses (not shown, again for sake of clarity).
The device 1900 can have one or more antennas 1904, for use in wireless communications such as multi-input multi-output (MIMO) communications, multi-user multi-input multi-output (MU-MIMO) communications, Bluetooth®, LTE, 4G, 5G, Near-Field Communication (NFC), etc. The antenna(s) 1904 can include, but are not limited to, one or more of directional antennas, omnidirectional antennas, monopoles, patch antennas, loop antennas, microstrip antennas, dipoles, and any other antenna(s) suitable for communication transmission/reception. In an exemplary embodiment, transmission/reception using MIMO may require particular antenna spacing. In another exemplary embodiment, MIMO transmission/reception can enable spatial diversity allowing for different channel characteristics at each of the antennas. In yet another embodiment, MIMO transmission/reception can be used to distribute resources to multiple users, for example, within the vehicle and/or in another vehicle.
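The "particular antenna spacing" noted above for MIMO is commonly taken to be on the order of half a wavelength for spatial diversity. A minimal sketch of that rule of thumb (the λ/2 criterion is a common engineering assumption, not stated in the disclosure):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def half_wavelength_spacing_m(frequency_hz: float) -> float:
    """Half-wavelength antenna spacing often used as a minimum for
    MIMO spatial diversity: lambda/2 = c / (2 * f)."""
    return SPEED_OF_LIGHT_M_S / (2.0 * frequency_hz)
```

At 2.4 GHz (Wi-Fi/Bluetooth®) this gives roughly 6.25 cm, a spacing easily accommodated on a vehicle body.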
Antenna(s) 1904 generally interact with the Analog Front End (AFE) 1912, which is needed to enable the correct processing of the received modulated signal and signal conditioning for a transmitted signal. The AFE 1912 can be functionally located between the antenna and a digital baseband system in order to convert the analog signal into a digital signal for processing and vice-versa.
The subsystem 1900 can also include a controller/microprocessor 1920 and a memory/storage/cache 1916. The subsystem 1900 can interact with the memory/storage/cache 1916 which may store information and operations necessary for configuring and transmitting or receiving the information described herein. The memory/storage/cache 1916 may also be used in connection with the execution of application programming or instructions by the controller/microprocessor 1920, and for temporary or long term storage of program instructions and/or data. As examples, the memory/storage/cache 1916 may comprise a computer-readable device, RAM, ROM, DRAM, SDRAM, and/or other storage device(s) and media.
The controller/microprocessor 1920 may comprise a general purpose programmable processor or controller for executing application programming or instructions related to the subsystem 1900. Furthermore, the controller/microprocessor 1920 can perform operations for configuring and transmitting/receiving information as described herein. The controller/microprocessor 1920 may include multiple processor cores, and/or implement multiple virtual processors. Optionally, the controller/microprocessor 1920 may include multiple physical processors. By way of example, the controller/microprocessor 1920 may comprise a specially configured Application Specific Integrated Circuit (ASIC) or other integrated circuit, a digital signal processor(s), a controller, a hardwired electronic or logic circuit, a programmable logic device or gate array, a special purpose computer, or the like.
The subsystem 1900 can further include a transmitter 1970 and receiver 1972 which can transmit and receive signals, respectively, to and from other devices, subsystems and/or other destinations using the one or more antennas 1904 and/or links/busses. Included in the subsystem 1900 circuitry is the medium access control or MAC Circuitry 1922. MAC circuitry 1922 provides for controlling access to the wireless medium. In an exemplary embodiment, the MAC circuitry 1922 may be arranged to contend for the wireless medium and configure frames or packets for communicating over the wireless medium.
The subsystem 1900 can also optionally contain a security module (not shown). This security module can contain information regarding, but not limited to, security parameters required to connect the device to one or more other devices or other available network(s), and can include WEP or WPA/WPA-2 (optionally+AES and/or TKIP) security access keys, network keys, etc. The WEP security access key is a security password used by Wi-Fi networks. Knowledge of this code can enable a wireless device to exchange information with an access point and/or another device. The information exchange can occur through encoded messages with the WEP access code often being chosen by the network administrator. WPA is an added security standard that is also used in conjunction with network connectivity with stronger encryption than WEP.
The exemplary subsystem 1900 also includes a GPU 1942, an accelerator 1944, a Wi-Fi/BT/BLE PHY module 1980 and a Wi-Fi/BT/BLE MAC module 1984 and wireless transmitter 1988 and receiver 1992.
The various connectivity managers 1934-1966 manage and/or coordinate communications between the subsystem 1900 and one or more of the systems disclosed herein and one or more other devices/systems. The connectivity managers include an emergency charging connectivity manager 1934, an aerial charging connectivity manager 1938, a roadway charging connectivity manager 1942, an overhead charging connectivity manager 1946, a robotic charging connectivity manager 1950, a static charging connectivity manager 1954, a vehicle database connectivity manager 1958, a remote operating system connectivity manager 1962 and a sensor connectivity manager 1966.
The emergency charging connectivity manager 1934 can coordinate not only the physical connectivity between the vehicle and the emergency charging device/vehicle, but can also communicate with one or more of the power management controller, one or more third parties and optionally a billing system(s). As an example, the vehicle can establish communications with the emergency charging device/vehicle to one or more of coordinate interconnectivity between the two (e.g., by spatially aligning the charging receptacle on the vehicle with the charger on the emergency charging vehicle) and optionally share navigation information. Once charging is complete, the amount of charge provided can be tracked and optionally forwarded to, for example, a third party for billing. In addition to being able to manage connectivity for the exchange of power, the emergency charging connectivity manager 1934 can also communicate information, such as billing information to the emergency charging vehicle and/or a third party. This billing information could be, for example, the owner of the vehicle, the driver of the vehicle, company information, or in general any information usable to charge the appropriate entity for the power received.
The aerial charging connectivity manager 1938 can coordinate not only the physical connectivity between the vehicle and the aerial charging device/vehicle, but can also communicate with one or more of the power management controller, one or more third parties and optionally a billing system(s). As an example, the vehicle can establish communications with the aerial charging device/vehicle to one or more of coordinate interconnectivity between the two (e.g., by spatially aligning the charging receptacle on the vehicle with the charger on the aerial charging vehicle) and optionally share navigation information. Once charging is complete, the amount of charge provided can be tracked and optionally forwarded to, for example, a third party for billing. In addition to being able to manage connectivity for the exchange of power, the aerial charging connectivity manager 1938 can similarly communicate information, such as billing information to the aerial charging vehicle and/or a third party. This billing information could be, for example, the owner of the vehicle, the driver of the vehicle, company information, or in general any information usable to charge the appropriate entity for the power received etc., as discussed.
The roadway charging connectivity manager 1942 and overhead charging connectivity manager 1946 can coordinate not only the physical connectivity between the vehicle and the charging device/system, but can also communicate with one or more of the power management controller, one or more third parties and optionally a billing system(s). As one example, the vehicle can request a charge from the charging system when, for example, the vehicle needs or is predicted to need power. As an example, the vehicle can establish communications with the charging device/vehicle to one or more of coordinate interconnectivity between the two for charging and share information for billing. Once charging is complete, the amount of charge provided can be tracked and optionally forwarded to, for example, a third party for billing. This billing information could be, for example, the owner of the vehicle, the driver of the vehicle, company information, or in general any information usable to charge the appropriate entity for the power received etc., as discussed. The person responsible for paying for the charge could also receive a copy of the billing information as is customary. The robotic charging connectivity manager 1950 and static charging connectivity manager 1954 can operate in a similar manner to that described herein.
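The pattern shared by these connectivity managers — track the charge delivered during a session, then forward a billing record identifying the responsible entity to a third party — may be sketched as follows. The record fields and message format are illustrative assumptions:

```python
from dataclasses import dataclass, asdict

@dataclass
class ChargingRecord:
    """Session summary a connectivity manager might forward for billing."""
    responsible_entity: str   # e.g., vehicle owner, driver, or company
    charger_type: str         # e.g., "roadway", "overhead", "robotic", "static"
    kwh_delivered: float

def billing_message(record: ChargingRecord) -> dict:
    """Package the tracked session as a message for a third-party biller."""
    msg = asdict(record)
    msg["event"] = "charge_complete"
    return msg
```

The same message shape could serve the roadway, overhead, robotic, and static managers, differing only in the `charger_type` value.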
The vehicle database connectivity manager 1958 allows the subsystem to receive and/or share information stored in the vehicle database. This information can be shared with other vehicle components/subsystems and/or other entities, such as third parties and/or charging systems. The information can also be shared with one or more vehicle occupant devices, such as an app on a mobile device the driver uses to track information about the vehicle and/or a dealer or service/maintenance provider. In general, any information stored in the vehicle database can optionally be shared with any one or more other devices optionally subject to any privacy or confidentiality restrictions.
The remote operating system connectivity manager 1962 facilitates communications between the vehicle 100 and any one or more autonomous vehicle systems. These communications can include one or more of navigation information, vehicle information, occupant information, or in general any information related to the remote operation of the vehicle.
The sensor connectivity manager 1966 facilitates communications between any one or more of the vehicle sensors and any one or more of the other vehicle systems. The sensor connectivity manager 1966 can also facilitate communications between any one or more of the sensors and/or vehicle systems and any other destination, such as a service company, app, or in general to any destination where sensor data is needed.
In accordance with one exemplary embodiment, any of the communications discussed herein can be communicated via the conductor(s) used for charging. One exemplary protocol usable for these communications is Power-line communication (PLC). PLC is a communication protocol that uses electrical wiring to simultaneously carry both data and Alternating Current (AC) electric power transmission or electric power distribution. It is also known as power-line carrier, power-line digital subscriber line (PDSL), mains communication, power-line telecommunications, or power-line networking (PLN). For DC environments in vehicles, PLC can be used in conjunction with CAN-bus, LIN-bus over power line (DC-LIN), and DC-BUS.
The communications subsystem can also optionally manage one or more identifiers, such as an IP (internet protocol) address(es), associated with the vehicle and one or more other systems, subsystems, or components therein. These identifiers can be used in conjunction with any one or more of the connectivity managers as discussed herein.
An alternative or additional embodiment of vehicle 100 may be as shown in
The antennas 2004a, 2004b, and 2004c can communicate using one or more NFC standards. NFC standards cover communications protocols and data exchange formats and are based on existing radio-frequency identification (RFID) standards including ISO/IEC 14443 and Felicity Card (FeliCa). The standards include ISO/IEC 18092 and those defined by the NFC Forum. In addition to the NFC Forum, the GSM Association (GSMA) group defined a platform for the deployment of GSMA NFC Standards within mobile handsets. GSMA's efforts include Trusted Services Manager, Single Wire Protocol, testing/certification and secure element. NFC components incorporated into the antennas 2004a, 2004b, and 2004c can include, for example, the iClass®, veriClass®, and other NFC devices developed by HID®.
The antennas 2004a, 2004b, and 2004c can be placed in locations on the vehicle 100 to allow easier connection with other devices. For example, antenna 2004a may be placed on the roof 130 of the vehicle 100 to allow for connection with overhead NFC devices, such as may be found in parking garages. Antenna 2004c may be positioned outboard on a mirror 2016 or side panel of the vehicle 100 to better connect with devices on walls or other vertical structures, such as may be found at drive through restaurants and establishments. Another antenna 2004b may be within physical proximity to a charging port 2012 or gas tank door to facilitate communications with a charging device or a gas pump. Other antennas 2004 may be positioned in other parts of the vehicle 100 to facilitate other communications with other entities or facilities.
The vehicle 100 may also communicate with a dongle 2008. A dongle 2008 may be another hardware/software component that can communicate wirelessly with the vehicle 100. The dongle 2008 may communicate using an NFC connection or may employ another wireless technology, such as Bluetooth® or WLAN technology. A common example of the dongle 2008 is the keyless entry system of many vehicles. Unlike the simple keyless entry system, the dongle 2008 may provide biometric data associated with the user of the dongle 2008. For example, the dongle 2008 may be associated with a single user, who may need to provide a biometric characteristic (e.g., a fingerprint, a voice signature, etc.) to employ the dongle 2008. This biometric signature may be exchanged between the dongle 2008 and the vehicle 100. Further, the presence of the dongle 2008, determined by a continuous connection between the dongle 2008 and the vehicle 100, can indicate the presence of the associated user holding the dongle 2008.
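The "continuous connection" presence check described above can be pictured as a heartbeat with a timeout: the user is considered present only while the dongle keeps checking in. The class, timeout value, and heartbeat model are illustrative assumptions:

```python
class DonglePresence:
    """Presence sketch: the associated user is considered present while
    heartbeats from the dongle keep arriving within a timeout window."""

    def __init__(self, timeout_s: float = 5.0):
        self.timeout_s = timeout_s   # assumed window; real value is design-specific
        self.last_seen: float | None = None

    def heartbeat(self, now: float) -> None:
        """Record that the dongle's wireless connection was observed at `now`."""
        self.last_seen = now

    def user_present(self, now: float) -> bool:
        """True while the most recent heartbeat is within the timeout window."""
        return self.last_seen is not None and (now - self.last_seen) <= self.timeout_s
```

A vehicle controller could poll `user_present()` before enabling user-specific functions such as contactless payment.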
An embodiment of the hardware/software configuration associated with the antennas 2004 may be as shown in
The near field communication chipset 2104 can include any hardware and software associated with the execution of near field communications. As explained previously, loop antenna 2004 of the NFC device can be embedded within the vehicle. However, the other circuitry and components of the near field communication devices may be incorporated into a single NFC chipset 2104. Thus, the NFC chipset 2104 can include the components that receive data from the loop antenna, decode and/or encode information from or onto the transmission, or conduct other functions that allow for communications over the NFC connection. For example, the NFC chipset 2104 can include the iClass®, veriClass®, and other NFC devices developed by HID®, or similar hardware and/or software.
The soft touch interface 2116 may include the physical interconnection abilities described in conjunction with
An embodiment of a soft touch interface may be as described in
An embodiment of software 2300 that may be stored within the memory 1916 may be as shown in
The contactless payment application 2308 can conduct financial transactions using the vehicle. The contactless payment application 2308 may conduct the financial transactions with third parties, with or without user input. The half token application 2312 can provide half tokens or pre-authorizations to third parties when interacting with those third parties. As such, the half token application 2312 is capable of authenticating or verifying the identity and veracity of the third party or providing that information for the user to the third party. In some configurations, the half token application 2312 is capable of sending authorizations for secure information in two or more packet communications that may be conducted over different communication media such that a higher level of security is achieved for the vehicle 100.
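The half-token scheme described above — sending an authorization in two parts over different media so neither part alone authorizes anything — may be sketched as follows. The split-and-digest approach here is an illustrative stand-in, not the disclosure's actual protocol:

```python
import hashlib

def split_token(token: str) -> tuple[str, str]:
    """Split an authorization token into two halves; each half would be
    sent over a different communication medium."""
    mid = len(token) // 2
    return token[:mid], token[mid:]

def token_digest(token: str) -> str:
    """Digest the third party holds to validate a recombined token."""
    return hashlib.sha256(token.encode()).hexdigest()

def verify(half_a: str, half_b: str, expected_digest: str) -> bool:
    """Third-party check: recombine the halves in order and compare digests."""
    return token_digest(half_a + half_b) == expected_digest
```

Because the halves must arrive in the correct order and both must be present, an attacker intercepting only one medium learns nothing usable.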
Event handling application 2316 is capable of receiving events associated with the vehicle 100 or the user. Events can include such items as a need for charging of the electric vehicle, purchase requests by the user, service needs of the vehicle 100, communication requests by the user, or other types of events. The event handler 2316 can receive the event notifications and instruct other parts or components of the vehicle 100 to respond to the events as received. A browser 2320 can be the Internet interface for the vehicle 100 and/or user. Thus, the browser 2320 can be any type of Internet interface, for example, Internet Explorer, Safari, Mozilla, etc. The browser 2320 may also provide Internet access to the vehicle 100.
Geolocation application 2324 can provide location information to the vehicle 100. The geolocation information may be provided from a GPS antenna, through triangulation of cellular towers or other known landmarks, or by some other type of location information. Geolocation information may be provided to the mapping application 2328. The mapping application 2328 can provide a coordinated location for the vehicle 100 based on the geolocation information provided by the geolocation application 2324 and one or more maps stored within the vehicle 100. Thus, the mapping application 2328 can resolve the physical locations of vehicle 100 based on the geolocation information.
The vehicle identity application 2332 can determine the identity of one or more occupants of the vehicle 100. Thus, the vehicle identity application 2332 can use one or more characteristics of biometric information to identify users within the vehicle 100. The identity of the users may then be used by other components of the vehicle 100.
The motor control application 2336 can control the motor function of the electric vehicle 100. Thus, the motor control application 2336 can accelerate or decelerate the electric motor based on user input or input from an automated driving function. The motor control application 2336 may also include the ability to send information using the motor. As such, the motor control application 2336 can include a data over field application 2340, which can encode data onto the electric field generated by the motor. In another configuration, the motor control application 2336 can cycle the motor in a pattern to convey information to a device that can receive or interpret the electric field of the motor.
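The motor-cycling idea above resembles simple on-off keying: bits are mapped to distinguishable motor states that a nearby receiver can sense from the field. The rpm values and threshold below are illustrative assumptions, not parameters from the disclosure:

```python
def cycle_pattern(data_bits: str, on_rpm: int = 1200, off_rpm: int = 800) -> list[int]:
    """Encode bits as a motor-cycling pattern (on-off keying sketch):
    '1' maps to a higher-rpm step, '0' to a lower-rpm step."""
    return [on_rpm if bit == "1" else off_rpm for bit in data_bits]

def decode_pattern(pattern: list[int], threshold: int = 1000) -> str:
    """A receiver infers bits from sensed field-strength steps, using an
    assumed threshold between the two rpm levels."""
    return "".join("1" if rpm >= threshold else "0" for rpm in pattern)
```

Encoding and then decoding a bit string round-trips cleanly as long as the two levels stay on opposite sides of the threshold.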
The voice input output application 2344 can receive voice input from a user within the vehicle 100. Further, the voice input output application 2344 can provide sound output to the speakers of the vehicle 100. Sound output can be synthesized voice to convey information to the user. In other configurations, the voice input output application 2344 can provide for the exchange of audio and/or video in a communication between a user of the vehicle 100 and another party. If synthesized voice output is provided, the voice input output application 2344 can include a synthesis module 2348. The synthesis module 2348 can translate written or other signals into voice output that can be sent to the speakers of the vehicle 100. As such, the driver of the vehicle 100 need not read the information but can listen to the information conveyed by the voice input/output application 2344.
An embodiment of data system 2400 that may be stored, retrieved, sent, managed etc. with the vehicle 100 and associated with a user may be as shown in
The biometrics field 2408 can include biometric information for the user. This information 2408 can include one or more of, but is not limited to: facial structure, facial recognition information, voiceprint, voice recognition information, eye color, DNA information, fingerprint information, or other types of biometric information that could be used to identify the user. This information 2408 may be gathered by sensors and stored in the data structure 2404.
A username 2412 and password 2416 can be identifying information created by a user. The username 2412 can be any type of identifying information that the user can enter to access vehicle systems, and the password 2416 can be another form of data known to the user, used with the username 2412, to authenticate the user. The username 2412 and password 2416 may be created by the user in another computer and downloaded into the vehicle 100 or may be created in the vehicle 100.
The mobile device information 2420 can include any information about one or more mobile devices or other types of devices that may be associated with the user and linked with the vehicle 100. Mobile device information 2420 can include the mobile device number, the media access control (MAC) address, a uniform resource locator (URL), or other types of information that might be associated with a mobile device. Further, mobile device information 2420 can include any kind of Bluetooth link, passcode, or information about a wearable that might provide information about a user.
The dongle code 2424 can be the code or frequency of the dongle 2008. This information may be associated with the user rather than with the vehicle 100 because the user carries the dongle 2008, making it more user-specific than vehicle-specific. Information from the dongle 2008 can be associated with any kind of specific information about the user. The encryption key 2428 can be any type of known secret that is common between the vehicle 100 and the user. This encryption key 2428 can include some type of pretty good privacy (PGP) key or other types of encryption information used to encrypt the information in the data structure 2404 or other information described herein.
The credentials 2432 can include any type of other credentials or information that might be used to access the encryption key 2428 or other information described herein. These credentials can be other types of information that may be different than that described previously, such as a barcode or other type of information that might be provided from another device or memory or may have been provided by the user through a user interface.
The VIN 2436 and ESN 2440 can be information about the vehicle 100. This information 2436, 2440 may be provided automatically when the data structure 2404 is created by the processor 5608 of the vehicle 100. The VIN 2436 may be stored not only as plates or information physically connected to the vehicle but also as electronic information stored securely within the vehicle 100. The ESN 2440 may be an identification number for the communication system 1900 of the vehicle 100.
The engine code 2444 can be any type of code related to the electric or gasoline engine. This code 2444 can be provided by the engine through some type of communication with the magnetic or electric field of the engine. The engine code 2444 can be an alphanumeric code, a numeric code, a globally-unique identifier (GUID), or some other type of code. Thus, similar to the VIN 2436 or the ESN 2440, the engine code 2444 can provide an identification of the engine.
Another data system 2500 may be as shown in
Payment information 2508 can be any type of credit card, bank account, or other financial information that might be used in a secure communication payment through or with the vehicle 100. Similarly, the PIN 2512 can be any type of PIN associated with one or more of the financial instruments provided in the payment information 2508. These PINs may be created by the third-party financial institutions and given to the user. The user may enter the PIN 2512 into the vehicle 100 through a user interface, and the vehicle can store the PIN 2512 in data structure 2504.
The IDs 2516 may be any type of information used with the payment information 2508. Thus, the IDs 2516 can include the name on a credit card, an alias, or some other information associated with the payment information 2508. The address 2520, similar to the IDs 2516, can be the address for the user of the payment information 2508. As such, the address 2520 can be the home address, business address, or some other type of address associated with the user having payment information stored in field 2508.
Limits 2524, preferences 2528, and rules 2532 can be user-created fields or may be set automatically, or learned by machine learning, based on repetitive behaviors of the user while in the vehicle 100. Limits 2524 can be any limit on any type of secure communication or secure transaction used with the vehicle 100. These limits 2524 can be expressed in dollars, in a number of transactions, or in some other form. Limits 2524 can be associated with the vehicle 100 or may be associated with two or more vehicles and thus may limit transactions across all vehicles with which the user may be associated. Preferences 2528 can be any type of financial or behavioral preference of the user. A behavioral preference 2528 can be a determination of the desires or wants of a user based on their behavior. For example, if a user buys coffee every morning at a certain Starbucks, that coffee-buying event at the Starbucks can be a preference of the user stored in field 2528. The rules 2532 can be any type of information dictated by the user to control secure communications or transactions with the vehicle 100. For example, the user may preauthorize certain types of transactions or communications with the vehicle 100, while others may not be preauthorized. As such, this preauthorization may be included in the rules field 2532.
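A minimal sketch of how limits 2524 and rules 2532 might gate a transaction follows; the field names, thresholds, and transaction types are hypothetical assumptions, not taken from the disclosure:

```python
from dataclasses import dataclass, field

@dataclass
class TransactionPolicy:
    # Illustrative stand-ins for limits 2524 (dollar and count limits)
    # and rules 2532 (preauthorized transaction types).
    dollar_limit: float = 100.0
    max_transactions: int = 10
    preauthorized_types: set = field(default_factory=set)
    transactions_made: int = 0

    def allows(self, amount: float, tx_type: str) -> bool:
        # A transaction passes only if it is under the dollar limit,
        # under the count limit, and of a preauthorized type.
        return (amount <= self.dollar_limit
                and self.transactions_made < self.max_transactions
                and tx_type in self.preauthorized_types)

policy = TransactionPolicy(preauthorized_types={"coffee", "toll"})
assert policy.allows(4.50, "coffee")
assert not policy.allows(4.50, "parking")   # type not preauthorized
assert not policy.allows(250.0, "toll")     # over the dollar limit
```

A learned preference 2528 could simply add entries to `preauthorized_types` over time.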
Data system 2600, which can include one or more types of information regarding a user's behavior or about the vehicle 100 associated with the user, may be as shown in
The product services field 2608 can include any type of product or service bought by the user while operating the vehicle 100. For example, the products or services 2608 can represent such things as food purchases, gasoline purchases, car wash purchases, toll purchases, parking purchases, or other types of events that occur with and/or are associated with the vehicle 100. The lists of products and services 2608 may also include context about when or where things were bought, what time the user bought them, how often they bought them, and other types of information. As such, this information on product and/or service purchases 2608 can be generated and developed into preferences 2528, as explained previously with
An embodiment of one or more routes taken by the user may be stored in route field 2612. These routes 2612 may be the consistent or common routes taken by the user and may also include information about what types of products or services may be offered along those routes. This information 2612 can allow the vehicle 100 to present options to the user or conduct automatic communications or transactions along such routes 2612.
Triggering events 2616 may be predetermined events that may cause certain purchases or communications to be conducted automatically by the vehicle 100. For example, a triggering event 2616 of inclement weather may cause the user to automatically purchase heightened map and weather data for the vehicle display. In other events, triggering events 2616 may be set once the user enters the vehicle 100 before conducting a drive. For example, a user can establish what communications or transactions are preauthorized for that drive when the triggering event 2616 of starting the vehicle occurs.
Battery rules 2620 can define what types of actions are allowed for charging, discharging, changing, or managing the battery charge. These battery rules 2620 can include one or more of, but are not limited to: where batteries may be charged or replaced, where a vehicle 100 is allowed to get a charge without authorization, and other types of information. As such, the battery rules 2620 can manage the financial transactions conducted and associated with the battery.
An embodiment of a signaling diagram between the vehicle 100 and a user 220 may be as shown in
The vehicle 100 may then request the user's username 2412 and password 2416, in signal 2704. The username and password request may be sent to a user interface displayed on a display associated with the vehicle 100. In other configurations, the username 2412 and/or password 2416 may be requested and provided to the vehicle 100 with a user interface event and an audio or other type of response or gesture, in signal 2708. The response from the user 220 can provide the credentials for the vehicle 100 to authenticate the user. In some configurations, the dongle 2008 may send a code 2712 to the vehicle 100. As the dongle 2008 is associated with the user 220, this code 2712 can identify the user to the vehicle 100. Along with the code 2712, the dongle 2008 may send a function or a request for an operation, optionally in step 2716. A function can be some type of request for the vehicle to perform an operation such as starting the vehicle 100, opening doors, opening windows, or doing some other kind of task. This function 2716 can also be a request to conduct a secure communication or transaction that may be authorized only with the dongle 2008. As such, even if the user's vehicle 100 is stolen, without the dongle 2008, financial transactions may not be conducted.
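The dongle-based identification described above can be sketched as follows, assuming the vehicle stores only a digest of the dongle code so the raw code need not be kept in memory; the function names, stored secret, and digest scheme are illustrative assumptions:

```python
import hashlib
import hmac

# Hypothetical stored digest of the dongle's code 2712.
STORED_DONGLE_DIGEST = hashlib.sha256(b"dongle-secret-2712").hexdigest()

def authenticate_dongle(code: bytes) -> bool:
    # Constant-time comparison avoids leaking match position via timing.
    digest = hashlib.sha256(code).hexdigest()
    return hmac.compare_digest(digest, STORED_DONGLE_DIGEST)

def handle_request(code: bytes, function: str) -> str:
    # Functions such as financial transactions are refused without a
    # valid dongle code, mirroring the anti-theft behavior above.
    if not authenticate_dongle(code):
        return "rejected"
    return f"executing {function}"

assert handle_request(b"dongle-secret-2712", "start_vehicle") == "executing start_vehicle"
assert handle_request(b"wrong-code", "open_doors") == "rejected"
```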
Biometrics may also be collected by the vehicle 100, in signal 2720. Sensors 5437, within the vehicle 100, may obtain biometric information from the user 220. This biometric information can be a visual recording of the user's face, a voice recording, or other physical characteristics recorded by the sensors 5437 within the vehicle 100. This biometric information can be compared to the biometric information 2408. If there is a positive comparison with the biometric information 2408 and the user is identified, the vehicle 100 may send an identity verification signal 2724 back to the user 220. The vehicle 100 may present some identity verification to the user interface to allow the user to understand that they have been identified. The user 220 may respond by providing an okay signal 2728 back to the vehicle 100. The okay signal 2728 may be given in a user interface or may be a voice or hand gesture within the vehicle 100 recorded by the vehicle sensors 5437.
In some configurations, the vehicle 100 may ask for a preauthorization or a half-token provided by the user, in signal 2732. Here, the vehicle 100 may present a user interface or an audio request to the user 220 asking whether the user wishes to authorize certain automatic communications or automatic transactions before the drive begins. If the user does desire such automation, the user 220 may send an ack or acknowledge signal 2736 back to the vehicle 100. The ack signal 2736 can again be a user interface input, a voice command, a gesture, or another type of response between the user 220 and the vehicle 100.
The vehicle 100 may then send a second token, e.g., an order, 2740 to a user 220. The order 2740 may be an automated request to a third party for some type of product or service. This order 2740 may be presented to the user interface to allow the user to acknowledge whether the order should be made in signal 2744. If acknowledged, the vehicle 100 can confirm the purchase to the user, in signal 2748, after making the proper purchase, transaction and/or communication with the third party. This confirmed purchase then may be acknowledged by the user 220, in signal 2752.
Another embodiment of a signaling process between a vehicle 100 and a second party, such as a retailer 2804, may be as shown in
The vehicle 100 may then send a half-token or preauthorization for some type of good or service or other type of secure communication to the second party, in signal 2812. The half-token signal 2812 may include some secure communication data but may not be a complete set of data for the second party to conduct an activity without a second set or portion of information. Thereafter, the vehicle 100 may send a preorder or other type of secure communication 2816 to the second party 2804; while the half-token 2812 may authenticate or authorize a transaction, the order 2816 may be the next step in identifying what is desired by the user of the vehicle 100. The vehicle 100 may then send an identifier 2820 for identifying the user and/or vehicle. This information 2820 can be sent in a long-distance communication without the vehicle/user being in physical proximity to the second party 2804. As such, the identifying information 2820 can later be used to identify that vehicle/user when the vehicle 100 does arrive within physical proximity of the second party 2804.
Sometime thereafter, after the vehicle 100 drives to the second party 2804, a second communication establishment may be conducted with signals 2824. The vehicle 100 can establish communications that require physical proximity, for example, near-field communications or some other type of communication connection. Thereafter, the second party 2804 may send a request to identify or verify the identity of the vehicle 100 and/or the user 220, in signal 2828. This signal 2828 may be a request for more information to complete the token. The vehicle 100 can send an acknowledgement signal 2832 to the second party 2804.
The second party may then send a confirm order signal 2836, which may be acknowledged by the vehicle 100 in signal 2840. The confirm order 2836 may cause the vehicle 100 to present information to the user interface for the user 220 to confirm the order and/or information previously sent by the vehicle 100. Upon acknowledging the order, the user 220 can authorize the vehicle 100 to send billing information, in signal 2844. Billing information 2844 may be the second, matching half-token sent to the second party 2804. Billing information thus may complete the information needed by the second party 2804 to conduct or provide the service or good ordered in signal 2816. The second party 2804 may acknowledge the billing information in signal 2848.
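One way to realize the half-token exchange above is a split-secret commitment, in which the second party 2804 can act only after the preauthorization half (signal 2812) and the billing half (signal 2844) recombine; the commitment scheme and all names below are illustrative assumptions, not the disclosed protocol:

```python
import hashlib
import secrets

def make_half_tokens(order_id: str):
    """Split one authorization secret into two halves and publish a
    commitment bound to the order; names are hypothetical."""
    secret = secrets.token_bytes(32)
    first_half, second_half = secret[:16], secret[16:]
    commitment = hashlib.sha256(secret + order_id.encode()).hexdigest()
    return first_half, second_half, commitment

def second_party_verify(first_half, second_half, order_id, commitment) -> bool:
    # The retailer can only act once both halves recombine into the
    # secret matching the commitment sent with the preorder.
    digest = hashlib.sha256(first_half + second_half + order_id.encode()).hexdigest()
    return digest == commitment

h1, h2, c = make_half_tokens("latte-2816")
assert second_party_verify(h1, h2, "latte-2816", c)
assert not second_party_verify(h1, b"\x00" * 16, "latte-2816", c)
```

The half-token alone reveals nothing actionable, which is why, as described above, a stolen vehicle without the completing information cannot finalize a transaction.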
To enter financial information, the user may be provided with an indication 2916 that the user can select a radio button 2920 to enter financial information. If selected, the user can then enter information under different categories indicated by static information displays. For example, the name on the credit card may be presented in indication 2924 where the user could enter their name in fields 2928a and 2928b. A credit card number indication 2932 provides for a text box 2936 that allows the user to enter their credit card number into text box 2936. The billing address field 2940 allows for the user to enter their street, city, state, and ZIP code in fields 2944, 2948, 2952, and 2956, respectively. A security code field 2960 allows the user to enter their security code (CVV) from the back of their credit card in field 2964, or their PIN for the credit card in field 2968. Through the user interface 2908, the user can enter all their financial information which then may be used automatically by the vehicle 100 in secure communications.
Another user interface 3000 is shown in
A dropdown menu or selection menu, having banner 3016, allows a user to approve transactions for different types of events. For example, the user may, according to the banners 3020, 3028, and 3036, approve transactions for a particular route, time period, or per transaction by selecting one of the radio buttons 3024, 3032, or 3040, respectively. For any one of the items, for instance the time period selection 3028, 3032, a dropdown menu 3044 with a time limit banner may be presented. Thus, for the time period selection 3028, the user may have different types of time periods from which to select. For example, the selections 3048, 3056, and 3064 indicate that the user can select approval of transactions or payment authorizations for a one-hour time period, a two-hour time period, or for a particular commute, by selecting radio buttons 3052, 3060, 3068, respectively.
Another embodiment of a window user interface 3072 may be as shown in
Another user interface 3100 for authorizing a particular transaction may be as shown in
Another embodiment of a user interface 3200 that provides for placing ads within the user interface 2908 of the vehicle 100 may be as shown in
Another approval or authorization user interface 3300 to conduct a transaction may be as shown in
Further authorization information that may be used for conducting transactions through the vehicle 100 may be as shown in
Another user interface 3500 for approving an order that was already preordered may be as shown in
Another user interface 3600 for approving transactions for a particular type of event may be as shown in
Another embodiment of a user interface 3632 that includes a popup information box 3636 may be as shown in
A user interface 3700 for allowing a user to approve transactions or communications along a particular route may be as shown in
Another user interface 3800 similar to user interface 3704 shown in
An embodiment of a user interface 3900 for receiving and/or providing financial information about a passenger in the vehicle 100 may be as shown in
Two additional fields 3972, 3976, in user interface 3900, allow the passenger to enter their mobile device information. This mobile device information 3972 and 3976 allows the vehicle 100 to identify the user and communicate the passenger's financial information when being identified within the vehicle 100. Also, having the mobile device information 3972, 3976 allows the vehicle 100 to conduct transactions through the mobile device rather than through the vehicle 100, if desired.
Another user interface 4000, within display 2908, related to user preferences, as represented by different visual indicia on tab 2912c, may be as shown in
The fields 4008-4052 may include selections for automatic payments (allows the vehicle 100 to conduct payments without user input), pre-authorizations (requires the vehicle 100 to get authorization for orders), approve payment (requires the vehicle 100 to get approval for payments), use of PIN (requires the vehicle 100 to obtain a PIN for payments), store usual orders (requires the vehicle 100 to store common or recurring orders), and allow passenger payment (allows a passenger to associate payment information with the vehicle 100 and make payments). Any one of these preferences may be provided or selected by selecting radio buttons 4012, 4020, 4028, 4036, 4044, and 4052.
An embodiment of a user interface 4100 associated with event preferences, as represented by tab 2912d having different visual indicia, may be as shown in
An embodiment of a method 4200 for receiving and storing user secure information may be as shown in
A processor or CPU 5608 can receive an event, in step 4208. The event can be the starting of the vehicle 100, a connection made by a user with the dongle 2008, or another event. This event can cause the CPU 5608 to present a user interface 2900 on a display 5616.
In step 4212, the CPU 5608 renders the user interface 2900 for displaying on the output device 5616. The output device 5616 can include the screen 2904 provided in the head unit, a heads-up display, or some other type of display within the vehicle 100. From the user interface 2900, the CPU 5608 can receive input into the user interface 2908, in step 4216. The input can be the entry or determination of whether or not the user desires to enter financial information to be associated with the vehicle 100.
In step 4220, the CPU 5608 determines whether the user desires to enter secure information. For example, the user may select radio button 2920 to alert the CPU 5608 that the user desires to enter secure information. If secure information is desired to be entered, the method 4200 proceeds “YES” to step 4224. However, if no information is desired to be entered, method 4200 proceeds “NO” to step 4240.
In step 4224, the CPU 5608 receives secure input through the user interface 2900. Here, the user may enter information about their finances or financial information in fields 2924 through 2968. This information may be provided to the CPU 5608, or the information may then be buffered in memory 5636. In step 4228, the CPU 5608 can encrypt the information and store it within memory 5636. The encryption key may be some secret shared between the user and the vehicle 100. This encryption information may be provided in another user interface, through voice input, or through some other input. Further, there may be other types of encryption keys that may be associated with biometrics or other information.
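Steps 4224-4232 might be sketched as follows, assuming the shared secret feeds a PBKDF2 key derivation. The XOR stream cipher with an HMAC tag is purely illustrative (a production system would use an authenticated cipher such as AES-GCM), and all names, salts, and parameters are hypothetical:

```python
import hashlib
import hmac
import os

def derive_key(shared_secret: str, salt: bytes) -> bytes:
    # Key derived from the secret shared between the user and vehicle 100.
    return hashlib.pbkdf2_hmac("sha256", shared_secret.encode(), salt, 100_000)

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Deterministic SHA-256 counter-mode keystream (illustrative only).
    stream, counter = b"", 0
    while len(stream) < length:
        stream += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return stream[:length]

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    nonce = os.urandom(16)
    body = bytes(p ^ s for p, s in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + body, hashlib.sha256).digest()  # integrity tag
    return nonce + body + tag

def decrypt(blob: bytes, key: bytes) -> bytes:
    nonce, body, tag = blob[:16], blob[16:-32], blob[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, nonce + body, hashlib.sha256).digest()):
        raise ValueError("integrity check failed")
    return bytes(b ^ s for b, s in zip(body, _keystream(key, nonce, len(body))))

key = derive_key("shared-secret", salt=b"vehicle-100-salt")
blob = encrypt(b"card=4111;pin=1234", key)
assert decrypt(blob, key) == b"card=4111;pin=1234"
```

The `blob` is what would land in secure storage in step 4232; without the shared secret, memory 5636 yields only ciphertext.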
In step 4232, the CPU 5608 stores the encrypted information in secure storage within the vehicle 100. Secure storage can be a separate secure area of storage 5636. In some configurations, this secure storage may be accessed differently and may be physically separated from working memory 5636. In other configurations, the secure storage may be a separate part of the working memory 5636 that may be accessed with different protocols. Upon storing this information, the CPU 5608 associates the user of the vehicle 100 with the encrypted information, in step 4236. Here, the user's biometrics 2408, username 2412, password 2416 etc. may be stored with data structure 2504 or other information to associate the user with the information.
A method 4300 for encrypting the sensitive information, as described in conjunction with
In step 4308, the CPU 5608 can receive sensitive information, as described in conjunction with
In step 4316, the CPU 5608 can encrypt the sensitive information received in step 4308 with the key. This key is associated with both the vehicle and the user and may not be accessed unless the user is present within the vehicle 100 and associated with the vehicle 100. As such, if the vehicle 100 is stolen, the information cannot be accessed. Further, another user cannot enter the vehicle 100 and use the information, and the user may not use some other vehicle for secure communications with the information. The encrypted information may then be stored in working memory 5636 or in storage 5620, 5624, in step 4320. As such, the CPU 5608 can permanently store the information in secure storage as encrypted information that is difficult to use except when the user is associated with the vehicle 100.
Thus, operator or owner financial information and user biometric data can be stored in an encrypted memory 5624, 5636 on the vehicle 100, such as in encrypted RAM or another storage medium. If the RAM is encrypted, it must be decrypted automatically for use by the CPU 5608. In some configurations, the CPU 5608 does not operate on the RAM directly; instead, the CPU 5608 loads code and data from the RAM into the CPU's internal caches. The loading/unloading process is transparent to both the applications 5644 and the operating system 5640.
An automatic RAM encryption system can be embodied in hardware, for example, in the CPU 5608. If encryption were to be done in the RAM, an attacker may be able to freeze the RAM and then plug the RAM into a different machine to extract the information. In a similar manner, decryption in an external RAM controller may not prevent active attacks. The automatic RAM encryption system could also defeat a cold boot attack or a platform reset attack in which an attacker, with physical access to a computer, is able to retrieve encryption keys from a running operating system after using a cold reboot to restart the machine. This type of attack relies on the data remanence property of DRAM and SRAM to retrieve memory contents that remain readable in the seconds to minutes after power has been removed.
In additional or alternative configurations, disk encryption techniques can also be employed. When a user first powers on the vehicle and before the operating system 5640 can boot up, the user must unlock his or her disk by supplying the correct encryption key. The files that make up the operating system are on the encrypted disk, so there is no way for the computer 5608 to work with the operating system 5640 until the disk is unlocked.
In some configurations, inputting a security credential, like a passphrase or biometric identifier, does not unlock the whole disk but unlocks an encryption key, which in turn unlocks everything on the disk. This indirection allows a user to change his or her passphrase (if a passphrase is used) without having to re-encrypt the disk with a new key and also makes it possible to have multiple passphrases, other security credentials, or combinations thereof that can unlock the disk, for example, if the operator or owner were to add another user account to the on board computer.
Other configurations may defeat a cold boot attack; encryption software periodically flushes the encryption key every few minutes when not in use. The user would be required to re-authenticate himself or herself to the on board computer system 5608, after expiration of a selected period of time. A user-friendly transparent proximity solution can exist where the vehicle remote keyless system (the dongle 2008) acts as a type of wireless device to transmit a unique code or signal that can cause the controller 5608 to securely access the encryption key whenever needed or otherwise cause the encryption key to be refreshed periodically.
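The periodic key flush described above can be sketched as a time-bounded key cache; the class name, timeout, and refresh mechanism (e.g., triggered by the dongle 2008 signal) are illustrative assumptions:

```python
import time

class KeyCache:
    """Holds the decryption key only for a limited time-to-live; after
    expiry the key is flushed and must be re-supplied (by the user or
    by the dongle's proximity signal), mitigating cold boot attacks."""
    def __init__(self, ttl_seconds: float = 300.0):
        self._key = None
        self._expires = 0.0
        self._ttl = ttl_seconds

    def refresh(self, key: bytes):
        # Called on re-authentication or on receipt of the dongle signal.
        self._key = key
        self._expires = time.monotonic() + self._ttl

    def get(self):
        if self._key is not None and time.monotonic() >= self._expires:
            self._key = None  # flush on expiry
        return self._key

cache = KeyCache(ttl_seconds=0.01)
cache.refresh(b"k" * 32)
assert cache.get() == b"k" * 32
time.sleep(0.02)
assert cache.get() is None  # key flushed; re-authentication required
```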
As will be appreciated, keyless remotes 2008 contain a short-range radio transmitter and must be within a certain range, usually 5-20 meters, of the vehicle 100 to work. The remote 2008 sends a coded signal by radio waves to a receiver unit in the vehicle 100, which performs a selected function, such as locking or unlocking the door. Alternatively, the secure key can be derived from the coded signal itself by a known key generation algorithm (e.g., one in which the seed is the coded signal or the coded signal along with another identifier, for example, a biometric indicator of a user or all or part of the VIN 2436 of the associated vehicle 100).
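Deriving the secure key from the coded signal, optionally seeded with the VIN 2436 and/or a biometric indicator as described above, might look like the following sketch; the PBKDF2 parameters, salt, and sample VIN are hypothetical values chosen for illustration:

```python
import hashlib

def derive_secure_key(coded_signal: bytes, vin: str = "", biometric: bytes = b"") -> bytes:
    # Seed = the remote's coded signal, optionally combined with all or
    # part of the VIN 2436 and/or a biometric indicator of the user.
    seed = coded_signal + vin.encode() + biometric
    return hashlib.pbkdf2_hmac("sha256", seed, b"vehicle-key-derivation", 100_000)

key_a = derive_secure_key(b"\x01\x02\x03", vin="1HGCM82633A004352")  # hypothetical VIN
key_b = derive_secure_key(b"\x01\x02\x03", vin="1HGCM82633A004352")
key_c = derive_secure_key(b"\x01\x02\x03")

assert key_a == key_b      # deterministic for the same seed
assert key_a != key_c      # including the VIN changes the derived key
assert len(key_a) == 32    # SHA-256-sized key
```

Because the key is recomputed from the signal on demand, it never needs to be stored at rest on the vehicle 100.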
An embodiment of a method 4400 for a driver and/or passenger to enter secure information into the vehicle 100 may be as shown in
In steps 4408, 4424, the vehicle 100 can sense the presence of a driver or passenger, respectively. The internal sensors 5437 may sense the presence of a person within the vehicle 100. This person may be a driver or a passenger. The information gleaned from the sensors within the vehicle 100 may then be provided to the CPU 5608. In steps 4412, 4428, the CPU 5608 can receive the biometric information associated with the driver/passenger. This information may then be used to determine whether or not the passenger or driver is previously associated with the vehicle 100. The biometric information may be as stored in data structure 2404 in field 2408.
With the biometric information, the CPU 5608 can compare the received information with the information in field 2408 to determine if the driver or passenger is authenticated, in steps 4416, 4432. If the received biometric information from the internal sensors 5437 is the same as or similar to the stored biometric information 2408, the driver/passenger is authenticated. As such, the CPU 5608 can determine if the driver/passenger is authenticated in steps 4420 or 4436. If the information matches, then the driver/passenger is authenticated and the method 4400 proceeds "YES" to step 4448. However, if the driver or passenger is not authenticated, the method 4400 proceeds "NO" to step 4440, where the CPU 5608 determines if other information is available.
Other information may include a username 2412, a password 2416, mobile device information 2420, etc. that may be used to authenticate the user. If there is other information, the CPU 5608 can gather or obtain that information, in step 4444. Thus, the CPU 5608 may request the username or password from the user, connect or couple with the mobile device or the dongle, or conduct other operations to gain that information. If the other information is valid and compares to the information stored in data structure 2404, the user may be authenticated. However, if no other information is available, the operation may end without the user being authenticated or allowed to conduct secure communications. If the user is ultimately authenticated, then in step 4448, the vehicle processor 5608 can allow secure interactions between the vehicle 100 and third parties.
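The authentication flow of steps 4416-4448, with its fallback to other credentials, can be sketched as follows; the toy similarity matcher and its threshold are illustrative stand-ins for a real biometric comparator, and all names are hypothetical:

```python
def authenticate(sensor_biometric, stored_biometric, other_info=None,
                 similarity_threshold=0.9):
    """Compare sensed biometrics with stored field 2408; if the match
    fails, fall back to other credentials (username/password, etc.)."""
    def similarity(a, b):
        # Toy matcher: fraction of matching elements (hypothetical).
        matches = sum(1 for x, y in zip(a, b) if x == y)
        return matches / max(len(a), len(b))

    if similarity(sensor_biometric, stored_biometric) >= similarity_threshold:
        return "authenticated"                      # steps 4420/4436: YES
    if other_info and other_info.get("username") and other_info.get("password"):
        return "authenticated_via_credentials"      # step 4444 fallback
    return "not_authenticated"                      # operation ends

assert authenticate([1, 2, 3, 4], [1, 2, 3, 4]) == "authenticated"
assert authenticate([1, 2, 3, 4], [9, 9, 9, 9],
                    {"username": "u", "password": "p"}) == "authenticated_via_credentials"
assert authenticate([1, 2, 3, 4], [9, 9, 9, 9]) == "not_authenticated"
```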
Based on a driver's preferences (e.g., user goes to Starbucks every morning, etc.), the onboard computer automatically pre-authorizes a transaction at a particular time and/or in response to a particular detected location of the vehicle 100. This pre-authorization can be done by observation and based on historic user behavior or spatial proximity of the vehicle 100 to the vendor (e.g., range of a transceiver on the vehicle 100 to a transceiver at the vendor or GPS-based locations of the vehicle 100 and vendor).
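The GPS-based proximity trigger above can be sketched with a haversine geofence check on the two fixes; the 150-meter radius and the coordinates are assumed values chosen only for illustration:

```python
import math

def within_range(vehicle, vendor, radius_m=150.0):
    """Haversine great-circle distance between two GPS fixes
    (lat, lon in degrees); True if within the assumed radius."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*vehicle, *vendor))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    distance_m = 2 * 6371000 * math.asin(math.sqrt(a))  # mean Earth radius
    return distance_m <= radius_m

# Hypothetical fixes roughly 100 m apart versus roughly 1 km apart.
assert within_range((37.7749, -122.4194), (37.7758, -122.4194))
assert not within_range((37.7749, -122.4194), (37.7839, -122.4194))
```

A pre-authorization routine would call such a check, alongside the time-of-day and historic-behavior conditions, before releasing the transaction.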
In this example, the vehicle 100 can gate the release of information to avoid unintended purchases. This gated release of information can be done by an interactive user interface (UI) on the vehicle 100. Presumably, the vehicle 100 may identify a potential purchase by the driver, such as by receiving and pre-filtering purchase options before presenting the option to the operator. Otherwise, the operator could be swamped by options while he or she is driving through an area loaded with retail stores. The trigger could be based on prior driver behavior, proximity of the vehicle 100 to a vendor, a state of the vehicle 100 (e.g., stopped, in motion, velocity of the vehicle 100, etc.), and the like. There may also be operator authentication prior to activating the pre-purchase function. This authentication can prevent a thief or family member from using the function without authorization.
In some configurations, the vehicle 100 may anticipate payments to be made, or opportunities to pay, based on a number of factors associated with the vehicle 100 or vehicle state, such as speed, turn signal activation, road or lane change, and the like. In response to detecting one or more of these factors, the vehicle 100 may present the user with a context-sensitive prompt designed to provide an advanced authorization for payment for a good or service. This advanced authorization for payment can save authentication time, processing time, and/or waiting time associated with payment stations, providing a quick transactional exchange.
By way of example, a user may be approaching a gas station and a drive-through coffee shop in a vehicle 100. Upon reducing speed, directing, and/or positioning the vehicle 100 into proximity with the coffee shop, the vehicle 100 may prompt the user for advanced authorization for the coffee shop (e.g., rather than the gas station or any other service). The user may accept the advanced authorization (e.g., for an open-amount, particular amount, etc.). Additionally or alternatively, the vehicle 100 may prompt the user for authorization for a particular drink and/or group of drinks, food, etc. The user may accept a “usual” order (e.g., based on historical data, preferences, and/or voice command), and as the user drives up to the “ordering” window/kiosk, the user may be directed to the “pickup” window where the drink and/or food is waiting for the user and has already been paid for by the vehicle authorization made.
In some configurations, payments made by a vehicle 100 may be conditionally authorized ahead of a transaction time for a limited period of time. This pre-transaction authorization may be restricted and/or only active for a particular time surrounding a determined movement of the vehicle 100. In one example, a user may enter a destination into a driving-navigation program (e.g., GPS navigation system associated with the vehicle 100, etc.). The vehicle 100 may determine a particular path of travel for the vehicle 100 and determine that specific routes require payment (e.g., toll roads, parking garages, congestion tax fees, etc.) or requires charging to be performed. In response to determining that the specific routes require some type of payment, the vehicle 100 may determine to authorize the predicted payments required on the specific routes. This authorization may be valid only for a limited time, while the vehicle 100 is traveling along the route, and/or as long as the vehicle 100 has power, etc. For instance, if the user deviates from the path, the pre-transaction authorization is revoked, preventing any accidental or unintended payments.
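Revoking the pre-transaction authorization on route deviation can be sketched as follows, with route positions reduced to simple waypoint labels; the function and waypoint names are hypothetical:

```python
def check_route_authorization(current_position, planned_route, authorized):
    """A pre-transaction authorization is valid only while the vehicle
    stays on the planned path; any deviation revokes it."""
    if current_position not in planned_route:
        return False  # deviation: revoke, preventing unintended payments
    return authorized

route = ["home", "highway_toll", "garage"]
assert check_route_authorization("highway_toll", route, authorized=True)
assert not check_route_authorization("detour_street", route, authorized=True)
```

A fuller implementation would also expire the authorization on a timer or on vehicle power-off, per the limited-time conditions described above.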
A driver may generally associate their payment information with a vehicle 100. Yet, sometimes, the vehicle 100 may transport multiple parties, for example, in a car pool. If a person other than the driver desires to pay for a good or service, the person may not be able to utilize the vehicle's payment system. Thus, the vehicle 100 may be capable of adding multiple buyers/payers to the payment system by associating those other people with the vehicle 100.
A passenger may enter the vehicle 100. The vehicle 100 may sense the presence of the passenger by receiving a phone signal, by receiving sensor data from sensors associated with the passenger (e.g., activation of the airbag, heat, weight, etc.), etc. A user interface presented on the head unit or on a screen of the entertainment system may prompt the passenger with the option to associate financial information with the vehicle 100. If the passenger participates, the passenger can utilize the payment system. Further, the vehicle 100 may use that person's cell phone for transactions when the passenger is paying. Any stored data may be erased when the passenger leaves the vehicle 100, which prevents reuse of the payment information. Or, in the alternative, the payment information may remain dormant until the user is identified in the vehicle 100 again or the user's mobile device enters the vehicle 100 again. In some situations, the passenger and driver can use the vehicle payment system to split payments, for example, for battery charging, food, hotels, etc.
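The add/erase/dormant lifecycle for passenger payment profiles, and the payment-splitting behavior, can be modeled as below. All names are illustrative assumptions; the actual token handling and identification mechanisms are not specified here.

```python
class VehiclePaymentRegistry:
    """Tracks payment profiles for the driver and any passengers.
    A profile is deactivated (or erased) when its owner leaves and
    reactivated only when the same person or device is detected again."""
    def __init__(self):
        self._profiles = {}   # person_id -> {"payment_token": ..., "active": bool}

    def add_payer(self, person_id, payment_token):
        self._profiles[person_id] = {"payment_token": payment_token, "active": True}

    def on_exit(self, person_id, erase=False):
        if erase:
            self._profiles.pop(person_id, None)        # wipe stored data entirely
        elif person_id in self._profiles:
            self._profiles[person_id]["active"] = False  # keep data but make dormant

    def on_reentry(self, person_id):
        if person_id in self._profiles:
            self._profiles[person_id]["active"] = True

    def split_payment(self, amount, payer_ids):
        """Split a charge evenly among the named payers that are active."""
        active = [p for p in payer_ids
                  if self._profiles.get(p, {}).get("active")]
        if not active:
            raise ValueError("no active payers")
        share = round(amount / len(active), 2)
        return {p: share for p in active}
```

Erasing on exit prevents reuse of the payment information, while the dormant path supports the alternative in which the profile reactivates when the passenger or their mobile device returns.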
An embodiment of a method 4500 for allowing contactless secure communications may be as shown in
The CPU 5608 can receive information from internal sensors 5437 that indicate the presence of a user within the vehicle, in step 4508. As described in conjunction with
Based on the identity of the person, the CPU 5608 can retrieve a profile for the person, in step 4516. The profile may include data structure 2504 or data structure 2604, as described in conjunction with
In step 4524, the vehicle 100 may enter a geographic location. The geographic location can be a route commonly taken by a user of the vehicle 100. This geographical location or route may be as described in conjunction with
An embodiment of a method 4600 for a vehicle to make secure communications may be as shown in
A vehicle 100 may enter a location, in step 4608. For example, the vehicle 100 can enter a drive-through, a toll lane, or some other type of location. The vehicle 100 may then determine if a radio frequency (RF) field is detected at one of the two or more RF devices 2004, as described in conjunction with
The CPU 5608 can then determine if the user authorizes contactless communication, in step 4620. As such, the CPU 5608 can access preferences 2528, rules 2532, or other information in data structure(s) 2504, 2604, etc. to determine if contactless communications are allowed. Further, the user interface 3072 or 3004 may be presented to the user and a determination made if the user provides an authorization of contactless communications. If contactless communications are not authorized, the method 4600 proceeds “NO” to step 4628. However, if contactless communications are authorized, method 4600 proceeds “YES” to step 4624. In step 4624, the CPU 5608 can automatically send contactless communications through the RF device 2004 to the third party. These communications can include financial information or other secure information from the vehicle 100. As such, the vehicle 100 can be an extension of the user in delivering information to a third party.
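The ordering of checks in step 4620 (stored preferences and rules first, then an explicit prompt) can be sketched as a single decision function. The dictionary keys and the callable standing in for the user interface are assumptions for illustration.

```python
def contactless_authorized(preferences, rules, prompt_user):
    """Decide whether to send contactless communications: consult stored
    preferences and pre-authorization rules first, then fall back to
    prompting the occupant. `prompt_user` is a callable standing in for
    the in-vehicle user interface (e.g., interface 3072 or 3004)."""
    if preferences.get("allow_contactless") is False:
        return False                   # user has opted out entirely
    if rules.get("allow_contactless"):
        return True                    # pre-authorized by stored rules
    return bool(prompt_user())         # otherwise ask the occupant
```

Only when this returns true would the communications be sent through the RF device 2004 to the third party.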
The method 4600 provides for devices at different locations that allow for information exchange using various communication standards (e.g., near field communication (NFC), BLUETOOTH™ Low Energy (BLE), radio frequency identification (RFID), etc.). The devices can be mounted on or incorporated into the vehicle 100 in various locations, e.g., a wing mirror, a windshield, a roof, etc. A vehicle 100 may have a communication device, e.g., an NFC or RFID device, installed in such a way that the device 2004 may communicate with external devices to conduct transactions, including contactless payments.
The devices 2004 may be passive or active. For example, the communication device 2004 may be installed on or near the outer surface of the vehicle 100. The device 2004 may be placed in many areas of the vehicle 100 to communicate with external devices in a number of situations. A driver side device 2004c may be used to communicate and conduct contactless payment transactions in drive-thru situations. An overhead or under-carriage device 2004a may be utilized in situations such as toll-lanes or parking garages to similarly conduct payment transactions while the vehicle 100 may be moving. A communication device 2004c may be placed on or near the gas tank such that the device 2004c may conduct a payment transaction with a gas tank nozzle or gas pump. The device 2004 may be in communication with an onboard computer 5608 placed inside the vehicle 100. The device 2004 may be used to store and share any amount of information.
An embodiment of a method 4700 for conducting secure communications with the third party may be as shown in
The vehicle 100 begins secure communications with another party, in step 4708. Here, the processor 5608 may communicate through the communications system 1900 to another entity. The communication may be a remote communication conducted through a wireless or cellular system. In other embodiments, it may be a nearfield communication. The third party may send a request for authorization for some type of communication. In such a case, the vehicle 100 receives a requirement for authorization by the user, in step 4712. In other situations, an internal rule 2532 or some other event may cause the processor 5608 to require authorization by the user.
In step 4716, the processor 5608 may display a user interface (e.g., user interface 2904) to provide authorization to the vehicle 100. This user interface may be user interface 3000 shown in
In step 4724, the processor 5608 can create an authorization based on the authorization given by the user and send such authorization through the communications system 1900 to the other entity. As such, the processor 5608 can provide authorizations for secure communications or other events, such as financial transactions, through a display and/or user interfaces on the vehicle 100 rather than through a mobile device or a device (e.g., a credit card terminal) associated with the other party.
User interfaces for authorization of transactions may be provided in a head unit or other display (e.g., dash, heads-up display (HUD), etc.). An in-vehicle display and input device may be provided to conduct contactless payment transactions with external devices. When an external device initiates a payment transaction with a communication device of the vehicle 100, a user inside the vehicle 100 may be presented with a means for authorizing the payment, for example, a user interface that accepts a PIN entry, a signature, a fingerprint ID, a retina scan, etc., or another interface that provides a different type of authorization (e.g., voice authorization, gesture authorization, etc.). The means for authorization may be dependent on the amount or type of transaction involved. Authorization may take place prior to, during, or following the transaction. For example, in the case of toll-lanes, the payment may be authorized following the use of the toll-lane. For drive-thru restaurant transactions, the authorization may take place prior to the sale. For the authorization of the sale of gas, the authorization may take place immediately following the pumping of the gas and prior to sale.
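The mapping from transaction type and amount to an authorization means and timing can be sketched as a small lookup. The thresholds and category names below are purely illustrative assumptions; a real system would draw them from the user's stored profile.

```python
# Assumed timing per transaction type, following the examples in the text:
# tolls authorized after use, drive-thru food before the sale, and fuel
# after pumping but before the sale.
AUTH_TIMING = {
    "toll": "after",
    "food": "before",
    "fuel": "after_pump_before_sale",
}

def required_authorization(transaction_type, amount):
    """Pick an authorization means and timing based on the transaction.
    Amount thresholds here are hypothetical."""
    timing = AUTH_TIMING.get(transaction_type, "before")
    if amount < 10:
        means = "none"        # small charges may pass without a challenge
    elif amount < 100:
        means = "pin"         # PIN entry or signature
    else:
        means = "biometric"   # fingerprint, retina, voice, etc.
    return means, timing
```

A design choice worth noting: keeping the timing table separate from the amount logic lets either policy change independently.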
In some embodiments, payments made by a vehicle 100 may be authorized ahead of a transaction time and/or for a limited period of time. This pre-transaction authorization may be restricted and/or only active for a particular time surrounding a determined movement of the vehicle 100. In one example, a user may enter a destination into a driving-navigation program 5402 (e.g., GPS navigation system associated with the vehicle, etc.). The vehicle 100 may determine a particular path of travel for the vehicle 100 and determine that specific routes require payment (e.g., toll roads, parking garages, congestion tax fees, etc.). In response to determining that the specific routes require some type of payment, the vehicle 100 may determine to authorize the predicted payments required on the specific routes. This authorization may be valid only for a limited time, while the vehicle 100 is traveling along the route, and/or as long as the vehicle 100 has power, etc. For instance, if the user deviates from the path, the pre-transaction authorization is revoked, preventing any accidental or unintended payments.
Payment may be later authorized based on vehicle identification and later acceptance. When a vehicle 100 conducts a payment transaction, the transaction may be authorized based on a unique identifier associated with the vehicle 100 and/or geolocation information associated with the vehicle 100 and/or the driver. During the initial part of the transaction, the vehicle's contactless payment system may deliver the payment information and information regarding the vehicle 100 and/or driver of the vehicle 100 (e.g., a half token). This information may be collected in other ways by the external payment device to support the authorization. For example, the external device may photograph the vehicle 100, license plate, and/or driver. This information may be used to support the authorization for the transaction. Such pre-transaction authorization may be useful in situations requiring the driver to manually enter an authorization, such as by PIN code, signature, etc., which may be inconvenient or impossible when driving. For example, the system may be used for payments made while the vehicle 100 is in motion, e.g., for paying while entering/exiting parking garages and/or for paying for toll-lanes.
In some embodiments, the vehicle 100 may anticipate payments to be made, or opportunities to pay, based on a number of factors associated with the vehicle 100. These factors can include location, speed, proximity to a location, preference data, and/or historical information. In response to detecting one or more of these factors, the vehicle 100 may present the user with a context-sensitive prompt designed to provide an advanced authorization for payment for a good or service. This advanced authorization for payment can save authentication time, processing time, and/or waiting time associated with payment stations, providing a quick transactional exchange.
By way of example, a user may be approaching a gas station and a drive-through coffee shop in a vehicle 100. Upon reducing speed and/or positioning the vehicle 100 into proximity with the coffee shop, the vehicle 100 may prompt the user for advanced authorization for the coffee shop (e.g., rather than the gas station or any other service). The user may accept the advanced authorization (e.g., for an open amount, a particular amount, etc.). Additionally or alternatively, the vehicle 100 may prompt the user for authorization for a particular drink and/or group of drinks, food, etc. The user may accept a “usual” order (e.g., based on historical data, preferences, and/or voice command), and as the user drives up to the “ordering” window/kiosk, the user may be directed to the “pickup” window where the drink and/or food is waiting for the user and has already been paid for by the vehicle authorization made in advance.
Multiple communications for authorization, based on geolocation, proximity, etc., may be prompted by a user interface, for example, “Are you going to buy a burger?”, “Are you filling up with gas?”, and so on. The user interface may be set for certain vehicle speeds, based on preferences, etc. An initial step in the communications could be an “advertisement” (e.g., “Would you like a Starbucks?”). This advertisement could appear on the display of the vehicle 100. If the answer is “Yes,” a second step of authenticating the vehicle/driver/passenger(s) to Starbucks could occur to allow the transaction to proceed. Otherwise, the question could time out.
An embodiment of a method 4800 for sending secure information or completing a communication through a deployable antenna 2208 may be as shown in
The vehicle 100 may enter a location, in step 4808. Here, the vehicle 100 may maneuver within physical proximity of a third-party location. The third-party location may have a receiving pad 2212. In step 4812, the processor 5608 may receive an instruction to deploy antenna 2208 to touch the pad 2212. In response, the processor 5608 may operate the deploying device 2204 to deploy the antenna 2208 to physically contact the pad 2212.
Upon making an electrical connection through the contact surface(s), the CPU 5608 can establish a communication/electrical connection through the deployable antenna 2208, in step 4816. For example, authentication or authorization information may be exchanged between the vehicle 100 and the third party through antenna 2208. At this point, the CPU 5608 may provide a user interface, such as user interface 3000 or user interface 3012, to determine if the user authorizes the communication, in step 4820. If the user provides input that communications are authorized, the method 4800 proceeds “YES” to step 4824, where the processor 5608 automatically sends the communication or financial information. If no authorization is given, the method proceeds “NO” to end operation 4828.
The soft contact reader for payment, similar to the close-proximity (e.g., NFC, close range, etc.) readers 2004, can be deployed in different locations for different functions (e.g., the side of the vehicle for coffee or food at a drive-through; the top of the vehicle for tolls, parking, etc.; or other locations). Most forms of contactless payment utilize NFC protocol antennas. Such communication devices require very near proximity to transfer information. To utilize this technology with the vehicle 100 when conducting contactless payment transactions with external devices, a specially designed scanner device is needed. An NFC antenna device may be installed in a type of “brush” or “arm” wherein the external device 2208 may come in contact with the NFC antenna of the vehicle without damaging the vehicle 100 in any way. For example, when conducting a contactless payment transaction while driving into or out of a parking garage, a vehicle 100 may drive over a brush or arm equipped with an NFC reading device. This reading device may come in contact with an NFC antenna placed on the undercarriage of the vehicle. In other situations, for example at a drive-thru restaurant, the vehicle 100 may drive alongside an NFC-reader-equipped brush or arm, wherein the NFC reader is enabled to accept payment information from an NFC antenna device placed on the driver side of the vehicle 100.
An embodiment of sending authentication information in a method 4900 may be as shown in
In step 4908, the communications system 1900 can establish a connection or communication link between the vehicle 100 and a third party. This connection may be a nearfield communication, wireless communication, cellular communication, etc. Any of the preceding or following descriptions of communication connections may be made. In step 4912, the CPU 5608 can then receive a request for a secure communication. This secure communication may be authorized or secured using authenticating information. This authenticating information may be information identifying a user. As such, the CPU 5608 can retrieve identifying information associated with the user, in step 4916. Here, the CPU may extract or create a half-token 2312 or information shown in data structure 2404.
In step 4920, the CPU 5608 may retrieve from memory the information associated with the vehicle 100, for example, a VIN 2436, an ESN 2440, an engine code 2444. The vehicle's information may be combined with the user information into a new grouping of identifying information from both the user and vehicle 100, in step 4924. This combined information may then be sent by the processor 5608 to the communications system 1900 and sent on to another party, in step 4928.
Thus, the vehicle 100 can employ a vehicle VIN 2436, an ESN 2440 for the vehicle 100, an ESN of a mobile device, etc. for communication/purchase authentication. Only if a third party receives both sets of information is there an authentication for a communication/purchase. Another individual cannot purchase anything if the vehicle 100 is carjacked or stolen. Further, in some configurations, a user could provide the VIN 2436 and/or ESN 2440 as an ignition key for vehicle 100. An additional level of authentication may require a PIN 2512, or additional biometrics 2408, for the authorization.
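The dual-factor idea above (authentication succeeds only when the user's information and the vehicle's identifiers are both present) can be illustrated with a simple bound credential. The hash construction below is an assumption standing in for whatever token scheme the system actually uses; the identifiers are hypothetical.

```python
import hashlib

def make_credential(user_token, vin, esn):
    """Bind the user's half of the credential to vehicle identifiers
    (VIN 2436, ESN 2440) so a third party can authenticate only when
    both sets of information are present."""
    material = "|".join([user_token, vin, esn]).encode()
    return hashlib.sha256(material).hexdigest()

def verify(presented, user_token, vin, esn):
    # A stolen or carjacked vehicle alone cannot reproduce the credential,
    # because the user's token is also required (and vice versa).
    return presented == make_credential(user_token, vin, esn)
```

An additional PIN 2512 or biometric 2408 could be mixed into the same material for the further authentication level mentioned above.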
An embodiment of a method 5000 for determining whether a communication may be limited by predetermined limits may be as shown in
The communications system 1900 may establish a communications connection with a third party, in step 5008. In step 5012, the processor 5608 may receive a request for secure communication from the third party. Upon receiving and in response to receiving such a request, the processor 5608 may access limits field 2524, in data structure 2504, to retrieve limits on such secure communications, in step 5016. The processor 5608 may then determine whether the communications are prohibited by these limits, in step 5020. Here, limits can include limits on the number of communications provided, monetary limits on transactions that may be conducted, etc. These limits on financial transactions may cover two or more vehicles or just the single vehicle 100. The limits may also be associated with a particular financial account or shared across family members.
Depending on the limit, the processor 5608 can determine whether a communication is authorized, in step 5020. If the communication is prohibited, the method 5000 proceeds “YES” to step 5024, where the communication is ended. However, if the communication is not prohibited, the method 5000 proceeds “NO” to step 5028, where the communications are sent. For example, without prohibitions, the financial information of the user may be sent to conduct a financial transaction.
The limits placed on transaction amounts can be based on an individual or on a collection of people, on a single transaction, or over a specified period of time. The transaction history could be synchronized among multiple vehicles to provide a collective limit for all vehicles or may be based on a limit per vehicle, per location, etc. Limits on the types of transactions or products being purchased could also be imposed by another entity; for example, a parent can set transaction limits for teenage drivers.
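The interplay of per-vehicle and collective limits described above can be sketched as follows. The limit values, period handling, and identifiers are all hypothetical; synchronization across vehicles is assumed to have already happened before `allowed` is called.

```python
from collections import defaultdict

class TransactionLimiter:
    """Enforces a per-vehicle spending limit and a collective limit
    across all vehicles sharing an account (e.g., a family)."""
    def __init__(self, per_vehicle_limit, collective_limit):
        self.per_vehicle_limit = per_vehicle_limit
        self.collective_limit = collective_limit
        self.spent = defaultdict(float)   # vehicle_id -> amount this period

    def allowed(self, vehicle_id, amount):
        vehicle_total = self.spent[vehicle_id] + amount
        fleet_total = sum(self.spent.values()) + amount
        return (vehicle_total <= self.per_vehicle_limit
                and fleet_total <= self.collective_limit)

    def record(self, vehicle_id, amount):
        if not self.allowed(vehicle_id, amount):
            raise PermissionError("transaction prohibited by limits")
        self.spent[vehicle_id] += amount
```

A transaction passes only when it fits under both the individual vehicle's cap and the shared cap, mirroring the "collective limit for all vehicles" behavior above.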
Method 5100 may be similar to method 5000. Generally, the method 5100 starts with a start operation 5104 and ends with operation 5132. The method 5100 can include more or fewer steps or can arrange the order of the steps differently than those shown in
The method 5100, shown in
Method 5100 is directed to the other side of the transaction of method 5000. A provider/seller can also have business rules that are applied on the supplier side. The supplier sets a different notional set of purchase rules, i.e., business operational minimums or maximums, to facilitate selection or negotiation of battery exchange/repair/charging terms. Such rules are applied when the user seeks such services, as enabled by a searching algorithm querying a database of available service stations. Automated negotiation between the user and the service provider may occur. The business/user is able to adjust selectable parameters; e.g., the user may be in a particular hurry and increase his/her maximum fee for a certain service, e.g., charging prices. Thus, the business and user may conduct negotiations or bargain within pre-defined limits or parameters.
Method 5200 provides for conducting secure communications based on a set of preauthorization rules as shown in
In step 5208, the vehicle 100 establishes a connection with a third party through the communications system 1900. Again, the processor 5608 receives a request for a secure communication, in step 5212. Rather than obtaining preferences, the CPU 5608 can retrieve pre-authorization rules from field 2532 of data structure 2504, in step 5216. These pre-authorization rules 2532 may be similar to those that are set in user interface 4000 or 4100. The pre-authorization rules may determine whether a processor 5608 can conduct pre-orders or other communications without user input, at least for a portion of the communications.
If the communication is allowed by the rules 2532, in step 5220, the method 5200 proceeds “YES” to step 5228, where the communication is sent through the communication system 1900. However, if the communication is not allowed by the rules 2532, the method 5200 proceeds “NO” to step 5224, where the CPU 5608 determines whether the user authorizes the communication. If the communication is authorized based on input received in a user interface, such as interface 3000, the method 5200 proceeds “YES” to step 5228, again to send the communication. However, if the communications are not authorized or pre-authorized, the method 5200 proceeds “NO” to step 5232, where the communication is ended.
The user establishes a set of rules triggering automatic authorization for or purchasing of services. For example, the user may establish that when visibility conditions along a route are predicted to drop below certain minimum, a moving map display 3800 with overlaid weather is provided on a vehicle display. Such a display would aid the driver in considering alternative routes or increasing the driver's situational awareness as to the severity, duration, and direction of the cause of the visibility drop (e.g., is it ground fog over a wide area expected to linger or a result of a fast-moving thunderstorm). The real-time moving map overlaid weather display 3800 may require a costly live feed from a weather satellite, and, as such, the feed is not provided unless the triggering event is satisfied. In another example, the user may establish that if his programmed driving route requires in route charging, a purchase of a reserved charging spot at a charging station along the route is automatically made so as to reserve the spot.
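The rule-triggered automatic authorizations described above can be modeled as condition/action pairs evaluated against vehicle state. Everything here (the state keys, the rule class, the reservation action) is a hypothetical sketch, not the disclosed rule format 2532.

```python
class TriggeredPurchaseRule:
    """A user-defined rule that fires an automatic authorization or
    purchase when its condition is met, e.g., buying a weather-feed
    subscription when visibility drops, or reserving a charging spot
    when a planned route requires in-route charging."""
    def __init__(self, condition, action):
        self.condition = condition    # callable: state -> bool
        self.action = action          # callable performing the purchase

    def evaluate(self, state):
        # Only satisfy the triggering event; otherwise do nothing
        # (e.g., the costly satellite weather feed is never bought).
        if self.condition(state):
            return self.action(state)
        return None

# Example: reserve a charging spot only when the route needs charging.
purchases = []
rule = TriggeredPurchaseRule(
    condition=lambda s: s["route_needs_charging"],
    action=lambda s: purchases.append(("reserve_charging", s["station"])) or "reserved",
)
```

Keeping the condition and the purchase as separate callables lets the user adjust either one (trigger threshold or what gets bought) without touching the other.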
An embodiment of a bifurcated communication between the vehicle 100 and a third party may be as shown in
Here, a first event may be received, in step 5308. This event may be the user driving to a certain location, taking a route, or some other type of event that occurs that causes the vehicle 100 to decide to send a first portion of a secure communication to a third party, in step 5312. For example, the processor 5608 may provide a pre-order for some good or service typically purchased by the user at that time of day along that route. This pre-authorization may be sent to the third party to prepare an order. At some point thereafter, in step 5316, the processor 5608 can receive a second event associated with that first event. The second event may be the vehicle 100 driving to the physical location of the third party. As such, the second event may be the navigation system 302 identifying that the vehicle 100 is at the third party's location.
In step 5320, the processor 5608 can determine whether the user then authorizes the complete communication/transaction. Here, the user may be presented with a user interface 3300, 3500 associated with the preauthorization. If the communication/transaction is authorized, in step 5320, by selecting, for example, button 3512, the method proceeds “YES” to step 5328, where a second portion of the secure communication may be sent to a third party. The second communication can be a second half-token. As such, a half-token 2312 can be sent, in step 5312 while further information—such as financial information 2508, a PIN 2512, or some other type of information to complete the transaction—may be sent as the related second half token, in step 5328. If the user does not authorize the communication, the method 5300 can proceed “NO” to step 5336.
Utilizing the vehicle as a mobile wallet can pose some issues. One issue is the problem with ensuring that the vehicle 100 pays the right merchant, at the right time, in the right location. To ensure accuracy in payment, the vehicle 100 may break the payment procedure into two or more phases. A first phase may order goods or services and may provide some or all of the payment information. A second phase may send the total payment information or confirm the information in the first phase, such as by user input in response to a query (e.g., providing a PIN). Thus, the vehicle 100 can pre-authorize charges.
The vehicle 100 may also use different communication media (e.g., cellular communication, Bluetooth, NFC, etc.) to conduct the two or more phases. Using different communication media can enhance security, as merchants or other nefarious individuals may not use the payment information without both phases of the communication, and one of the phases may be conducted with a communication medium that requires physical proximity of the vehicle 100. Moreover, having a second phase requiring operator input can prevent a first vehicle from purchasing goods or services for a second vehicle waiting in line with the first vehicle 100.
As an example, the vehicle 100 may be driving toward a pay-point and pre-order a good. The vehicle 100 may understand that the driver is heading toward Starbucks (through past behavior), and the vehicle 100 can communicate to the Starbucks that payment will be authorized and that the driver desires the beverage hot, sweet, etc. (based on driver behavior and location). The pre-authorization may be sent over a cellular network. When the vehicle 100 arrives at the Starbucks, the final authorization may be sent over a local proximity-based payment system, e.g., NFC.
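The half-token pairing underlying the two-phase exchange can be sketched with a simple binding between the halves, so the merchant can confirm both phases came from the same vehicle and transaction. The hash construction and the shared `payment_secret` are assumptions for illustration; the disclosed half-tokens 2312 are not specified at this level.

```python
import hashlib
import secrets

def split_token(payment_secret):
    """Split payment credentials into two half-tokens sent over different
    media (e.g., cellular for the pre-order, NFC at the pay-point).
    Neither half alone is usable by the merchant."""
    half1 = secrets.token_hex(16)          # random first half (pre-order phase)
    # The second half is bound to the first, so only the matching pair
    # from the same transaction verifies.
    half2 = hashlib.sha256((half1 + payment_secret).encode()).hexdigest()
    return half1, half2

def merchant_verify(half1, half2, payment_secret):
    """Merchant-side check that the two halves belong together."""
    return half2 == hashlib.sha256((half1 + payment_secret).encode()).hexdigest()
```

Because `half2` is derived from `half1`, intercepting a single phase (or pairing halves from two different vehicles in line) fails verification, which is the security property the two-phase design is after.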
Payment can be authorized based on vehicle identification and later acceptance. When a vehicle 100 conducts a payment transaction, the transaction may be authorized based on a unique identifier associated with the vehicle and/or a geolocation information associated with the vehicle 100 and/or the driver. During the transaction, the vehicle's contactless payment system may deliver the payment information as well as information regarding the vehicle 100 and/or driver of the vehicle. This information may be collected in other ways by the external payment device to support the authorization. For example, the external device may photograph the vehicle 100, license plate, and/or driver. This information may be used to support the authorization for the transaction. Such pre-transaction authorization may be useful in situations where requiring the driver to manually enter an authorization, such as by PIN code, signature, etc., may be inconvenient or impossible because the driver is driving the vehicle 100. For example, the system may be used for payments made while the vehicle 100 is in motion, e.g., for paying while entering/exiting parking garages and/or for paying for toll-lanes.
In accordance with at least some embodiments of the present disclosure, the communication network 5452 may comprise any type of known communication medium or collection of communication media and may use any type of protocols, such as SIP, TCP/IP, SNA, IPX, AppleTalk, and the like, to transport messages between endpoints. The communication network 5452 may include wired and/or wireless communication technologies. The Internet is an example of the communication network 5452 that constitutes an Internet Protocol (IP) network consisting of many computers, computing networks, and other communication devices located all over the world, which are connected through many telephone systems and other means. Other examples of the communication network 5452 include, without limitation, a standard Plain Old Telephone System (POTS), an Integrated Services Digital Network (ISDN), the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), such as an Ethernet network, a Token-Ring network, and/or the like, a Wide Area Network (WAN), a virtual network, including without limitation a virtual private network (“VPN”), the Internet, an intranet, an extranet, a cellular network, an infra-red network, a wireless network (e.g., a network operating under any of the IEEE 802.11 suite of protocols, the Bluetooth® protocol known in the art, and/or any other wireless protocol), and any other type of packet-switched or circuit-switched network known in the art and/or any combination of these and/or other networks. In addition, it can be appreciated that the communication network 5452 need not be limited to any one network type and instead may be comprised of a number of different networks and/or network types. The communication network 5452 may comprise a number of different communication media, such as coaxial cable, copper cable/wire, fiber-optic cable, antennas for transmitting/receiving wireless messages, and combinations thereof.
The driving vehicle sensors and systems 5404 may include at least one navigation 5408 (e.g., global positioning system (GPS), etc.), orientation 5412, odometry 5416, LIDAR 5420, RADAR 5424, ultrasonic 5428, camera 5432, infrared (IR) 5436, and/or other sensor or system 5438. These driving vehicle sensors and systems 5404 may be similar, if not identical, to the sensors and systems 116A-K, 112 described in conjunction with
The navigation sensor 5408 may include one or more sensors having receivers and antennas that are configured to utilize a satellite-based navigation system including a network of navigation satellites capable of providing geolocation and time information to at least one component of the vehicle 100. Examples of the navigation sensor 5408 as described herein may include, but are not limited to, at least one of Garmin® GLO™ family of GPS and GLONASS combination sensors, Garmin® GPS 15x™ family of sensors, Garmin® GPS 16x™ family of sensors with high-sensitivity receiver and antenna, Garmin® GPS 18x OEM family of high-sensitivity GPS sensors, Dewetron DEWE-VGPS series of GPS sensors, GlobalSat 1-Hz series of GPS sensors, other industry-equivalent navigation sensors and/or systems, and may perform navigational and/or geolocation functions using any known or future-developed standard and/or architecture.
The orientation sensor 5412 may include one or more sensors configured to determine an orientation of the vehicle 100 relative to at least one reference point. In some embodiments, the orientation sensor 5412 may include at least one pressure transducer, stress/strain gauge, accelerometer, gyroscope, and/or geomagnetic sensor. Examples of the navigation sensor 5408 as described herein may include, but are not limited to, at least one of Bosch Sensortec BMX 160 series low-power absolute orientation sensors, Bosch Sensortec BMX054 9-axis sensors, Bosch Sensortec BMI054 6-axis inertial sensors, Bosch Sensortec BMI160 6-axis inertial sensors, Bosch Sensortec BMF054 9-axis inertial sensors (accelerometer, gyroscope, and magnetometer) with integrated Cortex M0+ microcontroller, Bosch Sensortec BMP280 absolute barometric pressure sensors, Infineon TLV493D-A1B6 3D magnetic sensors, Infineon TLI493D-W1B6 3D magnetic sensors, Infineon TL family of 3D magnetic sensors, Murata Electronics SCC2000 series combined gyro sensor and accelerometer, Murata Electronics SCC1300 series combined gyro sensor and accelerometer, other industry-equivalent orientation sensors and/or systems, and may perform orientation detection and/or determination functions using any known or future-developed standard and/or architecture.
The odometry sensor and/or system 5416 may include one or more components configured to determine a change in position of the vehicle 100 over time. In some embodiments, the odometry system 5416 may utilize data from one or more other sensors and/or systems 5404 in determining a position (e.g., distance, location, etc.) of the vehicle 100 relative to a previously measured position for the vehicle 100. Additionally or alternatively, the odometry sensors 5416 may include one or more encoders, Hall speed sensors, and/or other measurement sensors/devices configured to measure a wheel speed, rotation, and/or number of revolutions made over time. Examples of the odometry sensor/system 5416 as described herein may include, but are not limited to, at least one of Infineon TLE4924/26/27/28C high-performance speed sensors, Infineon TL4941plusC(B) single chip differential Hall wheel-speed sensors, Infineon TL5041plusC Giant Magnetoresistance (GMR) effect sensors, Infineon TL family of magnetic sensors, EPC Model 25SP Accu-CoderPro™ incremental shaft encoders, EPC Model 30M compact incremental encoders with advanced magnetic sensing and signal processing technology, EPC Model 925 absolute shaft encoders, EPC Model 958 absolute shaft encoders, EPC Model MA36S/MA63S/SA36S absolute shaft encoders, Dynapar™ F18 commutating optical encoder, Dynapar™ HS35R family of phased array encoder sensors, other industry-equivalent odometry sensors and/or systems, and may perform change-in-position detection and/or determination functions using any known or future-developed standard and/or architecture.
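The basic wheel-encoder odometry calculation implied above (revolutions times wheel circumference) can be written out directly. The parameter names are illustrative; real systems also correct for wheel slip and tire wear, which this sketch ignores.

```python
import math

def odometry_distance(encoder_counts, counts_per_rev, wheel_diameter_m):
    """Estimate distance traveled from an incremental wheel encoder:
    each full revolution advances the vehicle by one wheel circumference
    (pi * diameter)."""
    revolutions = encoder_counts / counts_per_rev
    return revolutions * math.pi * wheel_diameter_m
```

For example, 2000 counts on a 1000-count-per-revolution encoder with a 0.5 m wheel corresponds to two circumferences of travel.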
The LIDAR sensor/system 5420 may include one or more components configured to measure distances to targets using laser illumination. In some embodiments, the LIDAR sensor/system 5420 may provide 3D imaging data of an environment around the vehicle 100. The imaging data may be processed to generate a full 360-degree view of the environment around the vehicle 100. The LIDAR sensor/system 5420 may include a laser light generator configured to generate a plurality of target illumination laser beams (e.g., laser light channels). In some embodiments, this plurality of laser beams may be aimed at, or directed to, a rotating reflective surface (e.g., a mirror) and guided outwardly from the LIDAR sensor/system 5420 into a measurement environment. The rotating reflective surface may be configured to continually rotate 360 degrees about an axis, such that the plurality of laser beams is directed in a full 360-degree range around the vehicle 100. A photodiode receiver of the LIDAR sensor/system 5420 may detect when light from the plurality of laser beams emitted into the measurement environment returns (e.g., reflected echo) to the LIDAR sensor/system 5420. The LIDAR sensor/system 5420 may calculate, based on a time associated with the emission of light to the detected return of light, a distance from the vehicle 100 to the illuminated target. In some embodiments, the LIDAR sensor/system 5420 may generate over 2.0 million points per second and have an effective operational range of at least 100 meters. 
Examples of the LIDAR sensor/system 5420 as described herein may include, but are not limited to, at least one of Velodyne® LiDAR™ HDL-64E 64-channel LIDAR sensors, Velodyne® LiDAR™ HDL-32E 32-channel LIDAR sensors, Velodyne® LiDAR™ PUCK™ VLP-16 16-channel LIDAR sensors, Leica Geosystems Pegasus:Two mobile sensor platform, Garmin® LIDAR-Lite v3 measurement sensor, Quanergy M8 LiDAR sensors, Quanergy S3 solid state LiDAR sensor, LeddarTech® LeddarVU compact solid state fixed-beam LIDAR sensors, other industry-equivalent LIDAR sensors and/or systems, and may perform illuminated target and/or obstacle detection in an environment around the vehicle 100 using any known or future-developed standard and/or architecture.
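The time-of-flight range calculation described for the LIDAR sensor/system 5420 can be sketched as follows (the function name is illustrative; the physics is the standard out-and-back division by two):

```python
SPEED_OF_LIGHT_MPS = 299_792_458.0

def lidar_range_m(round_trip_s):
    """Range to an illuminated target from the round-trip time of a
    laser pulse: the light travels out and back, so divide by two."""
    return SPEED_OF_LIGHT_MPS * round_trip_s / 2.0

# A target at 100 m (the stated effective range) returns its echo
# after roughly 667 nanoseconds.
rng = lidar_range_m(2 * 100.0 / SPEED_OF_LIGHT_MPS)
```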
The RADAR sensors 5424 may include one or more radio components that are configured to detect objects/targets in an environment of the vehicle 100. In some embodiments, the RADAR sensors 5424 may determine a distance, position, and/or movement vector (e.g., angle, speed, etc.) associated with a target over time. The RADAR sensors 5424 may include a transmitter configured to generate and emit electromagnetic waves (e.g., radio, microwaves, etc.) and a receiver configured to detect returned electromagnetic waves. In some embodiments, the RADAR sensors 5424 may include at least one processor configured to interpret the returned electromagnetic waves and determine locational properties of targets. Examples of the RADAR sensors 5424 as described herein may include, but are not limited to, at least one of Infineon RASIC™ RTN7735PL transmitter and RRN7745PL/46PL receiver sensors, Autoliv ASP Vehicle RADAR sensors, Delphi L2C0051TR 77 GHz ESR Electronically Scanning Radar sensors, Fujitsu Ten Ltd. Automotive Compact 77 GHz 3D Electronic Scan Millimeter Wave Radar sensors, other industry-equivalent RADAR sensors and/or systems, and may perform radio target and/or obstacle detection in an environment around the vehicle 100 using any known or future-developed standard and/or architecture.
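One locational property a RADAR processor can extract is radial speed from the Doppler shift of the returned wave. The sketch below assumes a 77 GHz carrier, matching the parts listed above; the function names are illustrative:

```python
SPEED_OF_LIGHT_MPS = 299_792_458.0
CARRIER_HZ = 77e9  # automotive RADAR band, per the 77 GHz sensors above

def relative_speed_mps(doppler_shift_hz):
    """Radial speed of a target from the Doppler shift of the returned
    electromagnetic wave: v = f_d * c / (2 * f_c)."""
    return doppler_shift_hz * SPEED_OF_LIGHT_MPS / (2.0 * CARRIER_HZ)

# Shift produced by a target closing at 30 m/s.
shift = 2.0 * CARRIER_HZ * 30.0 / SPEED_OF_LIGHT_MPS
```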
The ultrasonic sensors 5428 may include one or more components that are configured to detect objects/targets in an environment of the vehicle 100. In some embodiments, the ultrasonic sensors 5428 may determine a distance, position, and/or movement vector (e.g., angle, speed, etc.) associated with a target over time. The ultrasonic sensors 5428 may include an ultrasonic transmitter and receiver, or transceiver, configured to generate and emit ultrasound waves and interpret returned echoes of those waves. In some embodiments, the ultrasonic sensors 5428 may include at least one processor configured to interpret the returned ultrasonic waves and determine locational properties of targets. Examples of the ultrasonic sensors 5428 as described herein may include, but are not limited to, at least one of Texas Instruments TIDA-00151 automotive ultrasonic sensor interface IC sensors, MaxBotix® MB8450 ultrasonic proximity sensor, MaxBotix® ParkSonar™-EZ ultrasonic proximity sensors, Murata Electronics MA40H1S-R open-structure ultrasonic sensors, Murata Electronics MA40S4R/S open-structure ultrasonic sensors, Murata Electronics MA58MF14-7N waterproof ultrasonic sensors, other industry-equivalent ultrasonic sensors and/or systems, and may perform ultrasonic target and/or obstacle detection in an environment around the vehicle 100 using any known or future-developed standard and/or architecture.
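Interpreting a returned ultrasonic echo follows the same out-and-back logic as LIDAR, only at the speed of sound. A minimal sketch, assuming dry air at roughly 20 °C:

```python
SPEED_OF_SOUND_MPS = 343.0  # assumed: dry air at ~20 C

def ultrasonic_range_m(echo_delay_s):
    """Distance to an obstacle from the out-and-back travel time of an
    ultrasound pulse."""
    return SPEED_OF_SOUND_MPS * echo_delay_s / 2.0

# An obstacle 1.5 m away echoes after roughly 8.7 milliseconds.
rng = ultrasonic_range_m(2 * 1.5 / SPEED_OF_SOUND_MPS)
```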
The camera sensors 5432 may include one or more components configured to detect image information associated with an environment of the vehicle 100. In some embodiments, the camera sensors 5432 may include a lens, filter, image sensor, and/or a digital image processor. It is an aspect of the present disclosure that multiple camera sensors 5432 may be used together to generate stereo images providing depth measurements. Examples of the camera sensors 5432 as described herein may include, but are not limited to, at least one of ON Semiconductor® MT9V024 Global Shutter VGA GS CMOS image sensors, Teledyne DALSA Falcon2 camera sensors, CMOSIS CMV50000 high-speed CMOS image sensors, other industry-equivalent camera sensors and/or systems, and may perform visual target and/or obstacle detection in an environment around the vehicle 100 using any known or future-developed standard and/or architecture.
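The depth measurement obtained from two camera sensors 5432 used together follows the standard rectified-stereo relation Z = f·B/d. The rig parameters below are assumptions for illustration, not values from the disclosure:

```python
def stereo_depth_m(focal_px, baseline_m, disparity_px):
    """Depth of a point seen by two rectified cameras: Z = f * B / d,
    where d is the pixel disparity between the two images."""
    return focal_px * baseline_m / disparity_px

# Assumed rig: 700 px focal length, 12 cm baseline, 21 px disparity.
depth = stereo_depth_m(700.0, 0.12, 21.0)  # 4.0 m
```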
The infrared (IR) sensors 5436 may include one or more components configured to detect image information associated with an environment of the vehicle 100. The IR sensors 5436 may be configured to detect targets in low-light, dark, or poorly-lit environments. The IR sensors 5436 may include an IR light emitting element (e.g., IR light emitting diode (LED), etc.) and an IR photodiode. In some embodiments, the IR photodiode may be configured to detect returned IR light at or about the same wavelength as that emitted by the IR light emitting element. In some embodiments, the IR sensors 5436 may include at least one processor configured to interpret the returned IR light and determine locational properties of targets. The IR sensors 5436 may be configured to detect and/or measure a temperature associated with a target (e.g., an object, pedestrian, other vehicle, etc.). Examples of IR sensors 5436 as described herein may include, but are not limited to, at least one of Opto Diode lead-salt IR array sensors, Opto Diode OD-850 Near-IR LED sensors, Opto Diode SA/SHA727 steady state IR emitters and IR detectors, FLIR® LS microbolometer sensors, FLIR® TacFLIR 380-HD InSb MWIR FPA and HD MWIR thermal sensors, FLIR® VOx 640×480 pixel detector sensors, Delphi IR sensors, other industry-equivalent IR sensors and/or systems, and may perform IR visual target and/or obstacle detection in an environment around the vehicle 100 using any known or future-developed standard and/or architecture.
The vehicle 100 can also include one or more interior sensors 5437. Interior sensors 5437 can measure characteristics of the inside environment of the vehicle 100. The interior sensors 5437 may be as described in conjunction with
A navigation system 5402 can include any hardware and/or software used to navigate the vehicle either manually or autonomously. The navigation system 5402 may be as described in conjunction with
In some embodiments, the driving vehicle sensors and systems 5404 may include other sensors 5438 and/or combinations of the sensors 5406-5437 described above. Additionally or alternatively, one or more of the sensors 5406-5437 described above may include one or more processors configured to process and/or interpret signals detected by the one or more sensors 5406-5437. In some embodiments, at least some sensor information provided by the vehicle sensors and systems 5404 may be processed by at least one sensor processor 5440. Raw and/or processed sensor data may be stored in a sensor data memory 5444 storage medium. In some embodiments, the sensor data memory 5444 may store instructions used by the sensor processor 5440 for processing sensor information provided by the sensors and systems 5404. In any event, the sensor data memory 5444 may be a disk drive, optical storage device, solid-state storage device such as a random access memory (“RAM”) and/or a read-only memory (“ROM”), which can be programmable, flash-updateable, and/or the like.
The vehicle control system 5448 may receive processed sensor information from the sensor processor 5440 and determine to control an aspect of the vehicle 100. Controlling an aspect of the vehicle 100 may include presenting information via one or more display devices 5472 associated with the vehicle, sending commands to one or more computing devices 5468 associated with the vehicle, and/or controlling a driving operation of the vehicle. In some embodiments, the vehicle control system 5448 may correspond to one or more computing systems that control driving operations of the vehicle 100 in accordance with the Levels of driving autonomy described above. In one embodiment, the vehicle control system 5448 may operate a speed of the vehicle 100 by controlling an output signal to the accelerator and/or braking system of the vehicle. In this example, the vehicle control system 5448 may receive sensor data describing an environment surrounding the vehicle 100 and, based on the sensor data received, determine to adjust the acceleration, power output, and/or braking of the vehicle 100. The vehicle control system 5448 may additionally control steering and/or other driving functions of the vehicle 100.
The vehicle control system 5448 may communicate, in real-time, with the driving sensors and systems 5404 forming a feedback loop. In particular, upon receiving sensor information describing a condition of targets in the environment surrounding the vehicle 100, the vehicle control system 5448 may autonomously make changes to a driving operation of the vehicle 100. The vehicle control system 5448 may then receive subsequent sensor information describing any change to the condition of the targets detected in the environment as a result of the changes made to the driving operation. This continual cycle of observation (e.g., via the sensors, etc.) and action (e.g., selected control or non-control of vehicle operations, etc.) allows the vehicle 100 to operate autonomously in the environment.
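The observe-act cycle described above can be sketched as a simple closed loop. The proportional controller below is a deliberately simplified stand-in for the control logic of the vehicle control system 5448; the gain and target values are assumptions:

```python
def control_step(measured_speed, target_speed, kp=0.5):
    """One observe-act iteration: produce a proportional command from
    the error between target and measured speed (positive = accelerate,
    negative = brake). A simplified illustration, not the disclosed
    control law."""
    return kp * (target_speed - measured_speed)

# Simulate the feedback loop: each command nudges the observed speed
# toward the target, and the next observation reflects that change.
speed = 0.0
for _ in range(20):
    speed += control_step(speed, 25.0)
```

Each pass through the loop mirrors the cycle in the text: sense the current condition, act, then sense the result of that action.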
In some embodiments, the one or more components of the vehicle 100 (e.g., the driving vehicle sensors 5404, vehicle control system 5448, display devices 5472, etc.) may communicate across the communication network 5452 to one or more entities 5455A-N via a communications subsystem 5450 of the vehicle 100. Embodiments of the communications subsystem 5450 are described in greater detail in conjunction with
In some embodiments, the vehicle control system 5448 may receive control information from one or more control sources 5455B. The control source 5455 may provide vehicle control information including autonomous driving control commands, vehicle operation override control commands, and the like. The control source 5455 may correspond to an autonomous vehicle control system, a traffic control system, an administrative control entity, and/or some other controlling server. It is an aspect of the present disclosure that the vehicle control system 5448 and/or other components of the vehicle 100 may exchange communications with the control source 5455 across the communication network 5452 and via the communications subsystem 5450.
Information associated with controlling driving operations of the vehicle 100 may be stored in a control data memory 5464 storage medium. The control data memory 5464 may store instructions used by the vehicle control system 5448 for controlling driving operations of the vehicle 100, historical control information, autonomous driving control rules, and the like. In some embodiments, the control data memory 5464 may be a disk drive, optical storage device, solid-state storage device such as a random access memory (“RAM”) and/or a read-only memory (“ROM”), which can be programmable, flash-updateable, and/or the like.
In addition to the mechanical components described herein, the vehicle 100 may include a number of user interface devices. The user interface devices receive and translate human input into a mechanical movement or electrical signal or stimulus. The human input may be one or more of motion (e.g., body movement, body part movement, in two-dimensional or three-dimensional space, etc.), voice, touch, and/or physical interaction with the components of the vehicle 100. In some embodiments, the human input may be configured to control one or more functions of the vehicle 100 and/or systems of the vehicle 100 described herein. User interfaces may include, but are in no way limited to, at least one graphical user interface of a display device, steering wheel or mechanism, transmission lever or button (e.g., including park, neutral, reverse, and/or drive positions, etc.), throttle control pedal or mechanism, brake control pedal or mechanism, power control switch, communications equipment, etc.
Environmental sensors may comprise sensors configured to collect data relating to the internal environment of a vehicle 100. Examples of environmental sensors may include one or more of, but are not limited to: oxygen/air sensors 5401, temperature sensors 5403, humidity sensors 5405, light/photo sensors 5407, and more. The oxygen/air sensors 5401 may be configured to detect a quality or characteristic of the air in the interior space 108 of the vehicle 100 (e.g., ratios and/or types of gasses comprising the air inside the vehicle 100, dangerous gas levels, safe gas levels, etc.). Temperature sensors 5403 may be configured to detect temperature readings of one or more objects, users 216, and/or areas of a vehicle 100. Humidity sensors 5405 may detect an amount of water vapor present in the air inside the vehicle 100. The light/photo sensors 5407 can detect an amount of light present in the vehicle 100. Further, the light/photo sensors 5407 may be configured to detect various levels of light intensity associated with light in the vehicle 100.
User interface sensors may comprise sensors configured to collect data relating to one or more users (e.g., a driver and/or passenger(s)) in a vehicle 100. As can be appreciated, the user interface sensors may include sensors that are configured to collect data from users 216 in one or more areas of the vehicle 100. Examples of user interface sensors may include one or more of, but are not limited to: infrared sensors 5409, motion sensors 5411, weight sensors 5413, wireless network sensors 5415, biometric sensors 5417, camera (or image) sensors 5419, audio sensors 5421, and more.
Infrared sensors 5409 may be used to measure IR light irradiating from at least one surface, user, or another object in the vehicle 100. Among other things, the infrared sensors 5409 may be used to measure temperatures, form images (especially in low light conditions), identify users 216, and even detect motion in the vehicle 100.
The motion sensors 5411 may detect motion and/or movement of objects inside the vehicle 100. Optionally, the motion sensors 5411 may be used alone or in combination to detect movement. For example, a user may be operating a vehicle 100 (e.g., while driving, etc.) when a passenger in the rear of the vehicle 100 unbuckles a safety belt and proceeds to move about the vehicle 100. In this example, the movement of the passenger could be detected by the motion sensors 5411. In response to detecting the movement and/or the direction associated with the movement, the passenger may be prevented from interfacing with and/or accessing at least some of the vehicle control features. As can be appreciated, the user may be alerted of the movement/motion such that the user can act to prevent the passenger from interfering with the vehicle controls. Optionally, the number of motion sensors in a vehicle may be increased to increase an accuracy associated with motion detected in the vehicle 100.
Weight sensors 5413 may be employed to collect data relating to objects and/or users in various areas of the vehicle 100. In some cases, the weight sensors 5413 may be included in the seats and/or floor of a vehicle 100. Optionally, the vehicle 100 may include a wireless network sensor 5415. This sensor 5415 may be configured to detect one or more wireless network(s) inside the vehicle 100. Examples of wireless networks may include, but are not limited to, wireless communications utilizing Bluetooth®, Wi-Fi™, ZigBee, IEEE 802.11, and other wireless technology standards. For example, a mobile hotspot may be detected inside the vehicle 100 via the wireless network sensor 5415. In this case, the vehicle 100 may determine to utilize and/or share the detected mobile hotspot with one or more other devices associated with the vehicle 100.
Biometric sensors 5417 may be employed to identify and/or record characteristics associated with a user. It is anticipated that biometric sensors 5417 can include at least one of image sensors, IR sensors, fingerprint readers, weight sensors, load cells, force transducers, heart rate monitors, blood pressure monitors, and the like as provided herein.
The camera sensors 5419 may record still images, video, and/or combinations thereof. Camera sensors 5419 may be used alone or in combination to identify objects, users, and/or other features, inside the vehicle 100. Two or more camera sensors 5419 may be used in combination to form, among other things, stereo and/or three-dimensional (3D) images. The stereo images can be recorded and/or used to determine depth associated with objects and/or users in a vehicle 100. Further, the camera sensors 5419 used in combination may determine the complex geometry associated with identifying characteristics of a user. For example, the camera sensors 5419 may be used to determine dimensions between various features of a user's face (e.g., the depth/distance from a user's nose to a user's cheeks, a linear distance between the center of a user's eyes, and more). These dimensions may be used to verify, record, and even modify characteristics that serve to identify a user. The camera sensors 5419 may also be used to determine movement associated with objects and/or users within the vehicle 100. It should be appreciated that the number of image sensors used in a vehicle 100 may be increased to provide greater dimensional accuracy and/or views of a detected image in the vehicle 100.
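The dimensions between facial features described above are Euclidean distances between landmark coordinates. A minimal sketch follows; the landmark coordinates are hypothetical values for illustration:

```python
import math

def landmark_distance(p, q):
    """Euclidean distance between two facial landmarks given as (x, y)
    pixel coordinates, the kind of dimension used to help identify a
    user."""
    return math.dist(p, q)

# Hypothetical landmark coordinates for the centers of two eyes.
eye_span = landmark_distance((120.0, 90.0), (180.0, 90.0))  # 60.0 px
```

With depth from the stereo pair, the same computation can be carried out in three dimensions (e.g., nose-to-cheek depth/distance).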
The audio sensors 5421 may be configured to receive audio input from a user of the vehicle 100. The audio input from a user may correspond to voice commands, conversations detected in the vehicle 100, phone calls made in the vehicle 100, and/or other audible expressions made in the vehicle 100. Audio sensors 5421 may include, but are not limited to, microphones and other types of acoustic-to-electric transducers or sensors. Optionally, the interior audio sensors 5421 may be configured to receive and convert sound waves into an equivalent analog or digital signal. The interior audio sensors 5421 may serve to determine one or more locations associated with various sounds in the vehicle 100. The location of the sounds may be determined based on a comparison of volume levels, intensity, and the like, between sounds detected by two or more interior audio sensors 5421. For instance, a first audio sensor 5421 may be located in a first area of the vehicle 100 and a second audio sensor 5421 may be located in a second area of the vehicle 100. If a sound is detected at a first volume level by the first audio sensor 5421 and at a second, higher, volume level by the second audio sensor 5421 in the second area of the vehicle 100, the sound may be determined to be closer to the second area of the vehicle 100. As can be appreciated, the number of sound receivers used in a vehicle 100 may be increased (e.g., more than two, etc.) to increase measurement accuracy surrounding sound detection and location, or source, of the sound (e.g., via triangulation, etc.).
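A simple one-dimensional version of the localization idea above uses the difference in arrival times at two sensors. This sketch assumes a source located between the two sensors and a fixed speed of sound; the function name and geometry are illustrative:

```python
SPEED_OF_SOUND_MPS = 343.0  # assumed: dry air at ~20 C

def source_offset_m(t1_s, t2_s):
    """For a sound source between two audio sensors on a line, its
    offset from the midpoint (positive toward sensor 2) is
    (t1 - t2) * v / 2, where t1 and t2 are the arrival times at each
    sensor."""
    return (t1_s - t2_s) * SPEED_OF_SOUND_MPS / 2.0

# Sensors 4 m apart; a source 1 m from the midpoint, toward sensor 2,
# is 3 m from sensor 1 and 1 m from sensor 2.
offset = source_offset_m(3.0 / 343.0, 1.0 / 343.0)  # 1.0 m
```

With three or more sensors the same time differences support full triangulation of the source position in the cabin.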
The safety sensors may comprise sensors configured to collect data relating to the safety of a user and/or one or more components of a vehicle 100. Examples of safety sensors may include one or more of, but are not limited to: force sensors 5425, mechanical motion sensors 5427, orientation sensors 5429, restraint sensors 5431, and more.
The force sensors 5425 may include one or more sensors inside the vehicle 100 configured to detect a force observed in the vehicle 100. One example of a force sensor 5425 may include a force transducer that converts measured forces (e.g., force, weight, pressure, etc.) into output signals. Mechanical motion sensors 5427 may correspond to encoders, accelerometers, damped masses, and the like. Optionally, the mechanical motion sensors 5427 may be adapted to measure the force of gravity (i.e., G-force) as observed inside the vehicle 100. Measuring the G-force observed inside a vehicle 100 can provide valuable information related to a vehicle's acceleration, deceleration, collisions, and/or forces that may have been suffered by one or more users in the vehicle 100. Orientation sensors 5429 can include accelerometers, gyroscopes, magnetic sensors, and the like that are configured to detect an orientation associated with the vehicle 100.
The restraint sensors 5431 may correspond to sensors associated with one or more restraint devices and/or systems in a vehicle 100. Seatbelts and airbags are examples of restraint devices and/or systems. As can be appreciated, the restraint devices and/or systems may be associated with one or more sensors that are configured to detect a state of the device/system. The state may include extension, engagement, retraction, disengagement, deployment, and/or other electrical or mechanical conditions associated with the device/system.
The associated device sensors 5423 can include any sensors that are associated with a device in the vehicle 100. As previously stated, typical devices may include smart phones, tablets, laptops, mobile computers, and the like. It is anticipated that the various sensors associated with these devices can be employed by the vehicle control system 5448. For example, a typical smart phone can include an image sensor, an IR sensor, an audio sensor, a gyroscope, an accelerometer, a wireless network sensor, a fingerprint reader, and more. It is an aspect of the present disclosure that one or more of these associated device sensors 5423 may be used by one or more subsystems of the vehicle 100.
A GPS Antenna/receiver 5431 can be any antenna, GPS puck, and/or receiver capable of receiving signals from a GPS satellite or other navigation system. The signals may be demodulated, converted, interpreted, etc. by the GPS Antenna/receiver 5431 and provided to the location module 5433. Thus, the GPS Antenna/receiver 5431 may convert the time signals from the GPS system and provide a location (e.g., coordinates on a map) to the location module 5433. Alternatively, the location module 5433 can interpret the time signals into coordinates or other location information.
The location module 5433 can be the controller of the satellite navigation system designed for use in the vehicle 100. The location module 5433 can acquire position data, as from the GPS Antenna/receiver 5431, to locate the user or vehicle 100 on a road in the unit's map database 5435. Using the road database 5435, the location module 5433 can give directions to other locations along roads also in the database 5435. When a GPS signal is not available, the location module 5433 may apply dead reckoning to estimate distance data from sensors 5404 including one or more of, but not limited to, a speed sensor attached to the drive train of the vehicle 100, a gyroscope, an accelerometer, etc. Additionally or alternatively, the location module 5433 may use known locations of Wi-Fi hotspots, cell tower data, etc. to determine the position of the vehicle 100, such as by using time difference of arrival (TDOA) and/or frequency difference of arrival (FDOA) techniques.
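The dead-reckoning estimate described above advances the last known position using wheel-speed and gyro data. A minimal single-step sketch, with illustrative function and parameter names:

```python
import math

def dead_reckon(x, y, heading_rad, speed_mps, dt_s):
    """Advance an estimated position one time step using a drive-train
    speed reading and a gyro heading, as when no GPS signal is
    available. A simplified illustration of dead reckoning."""
    x += speed_mps * dt_s * math.cos(heading_rad)
    y += speed_mps * dt_s * math.sin(heading_rad)
    return x, y

# Heading 0 rad at 10 m/s for one second advances the estimate 10 m.
pos = dead_reckon(0.0, 0.0, 0.0, 10.0, 1.0)  # (10.0, 0.0)
```

Repeated over many steps, the accumulated error grows, which is why the module may also fall back on Wi-Fi hotspot locations, cell tower data, and TDOA/FDOA techniques.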
The maps database 5435 can include any hardware and/or software to store information about maps, geographical information system (GIS) information, location information, etc. The maps database 5435 can include any data definition or other structure to store the information. Generally, the maps database 5435 can include a road database that may include one or more vector maps of areas of interest. Street names, street numbers, house numbers, and other information can be encoded as geographic coordinates so that the user can find some desired destination by street address. Points of interest (waypoints) can also be stored with their geographic coordinates. For example, a point of interest may include speed cameras, fuel stations, public parking, and “parked here” (or “you parked here”) information. The maps database 5435 may also include road or street characteristics, for example, speed limits, location of stop lights/stop signs, lane divisions, school locations, etc. The map database contents can be produced or updated by a server connected through a wireless system in communication with the Internet, even as the vehicle 100 is driven along existing streets, yielding an up-to-date map.
The computing environment 5500 may also include one or more servers 5514, 5516. In this example, server 5514 is shown as a web server and server 5516 is shown as an application server. The web server 5514 may be used to process requests for web pages or other electronic documents from computing devices 5504, 5508, 5512. The web server 5514 can be running an operating system including any of those discussed above, as well as any commercially-available server operating systems. The web server 5514 can also run a variety of server applications, including SIP (Session Initiation Protocol) servers, HTTP(s) servers, FTP servers, CGI servers, database servers, Java servers, and the like. In some instances, the web server 5514 may publish available operations as one or more web services.
The computing environment 5500 may also include one or more file and/or application servers 5516, which can, in addition to an operating system, include one or more applications accessible by a client running on one or more of the computing devices 5504, 5508, 5512. The server(s) 5516 and/or 5514 may be one or more general purpose computers capable of executing programs or scripts in response to the computing devices 5504, 5508, 5512. As one example, the server 5516, 5514 may execute one or more web applications. The web application may be implemented as one or more scripts or programs written in any programming language, such as Java™, C, C#®, or C++, and/or any scripting language, such as Perl, Python, or TCL, as well as combinations of any programming/scripting languages. The application server(s) 5516 may also include database servers, including without limitation those commercially available from Oracle®, Microsoft®, Sybase®, IBM® and the like, which can process requests from database clients running on a computing device 5504, 5508, 5512.
The web pages created by the server 5514 and/or 5516 may be forwarded to a computing device 5504, 5508, 5512 via a web (file) server 5514, 5516. Similarly, the web server 5514 may be able to receive web page requests, web services invocations, and/or input data from a computing device 5504, 5508, 5512 (e.g., a user computer, etc.) and can forward the web page requests and/or input data to the web (application) server 5516. In further embodiments, the server 5516 may function as a file server. Although for ease of description,
The computing environment 5500 may also include a database 5518. The database 5518 may reside in a variety of locations. By way of example, database 5518 may reside on a storage medium local to (and/or resident in) one or more of the computers 5504, 5508, 5512, 5514, 5516. Alternatively, it may be remote from any or all of the computers 5504, 5508, 5512, 5514, 5516, and in communication (e.g., via the network 5510) with one or more of these. The database 5518 may reside in a storage-area network (“SAN”) familiar to those skilled in the art. Similarly, any necessary files for performing the functions attributed to the computers 5504, 5508, 5512, 5514, 5516 may be stored locally on the respective computer and/or remotely, as appropriate. The database 5518 may be a relational database, such as Oracle 20i®, that is adapted to store, update, and retrieve data in response to SQL-formatted commands.
The computer system 5600 may additionally include a computer-readable storage media reader 5624; a communications system 5628 (e.g., a modem, a network card (wireless or wired), an infra-red communication device, etc.); and working memory 5636, which may include RAM and ROM devices as described above. The computer system 5600 may also include a processing acceleration unit 5632, which can include a DSP, a special-purpose processor, and/or the like.
The computer-readable storage media reader 5624 can further be connected to a computer-readable storage medium, together (and, optionally, in combination with storage device(s) 5620) comprehensively representing remote, local, fixed, and/or removable storage devices plus storage media for temporarily and/or more permanently containing computer-readable information. The communications system 5628 may permit data to be exchanged with a network and/or any other computer described above with respect to the computer environments described herein. Moreover, as disclosed herein, the term “storage medium” may represent one or more devices for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices and/or other machine readable mediums for storing information.
The computer system 5600 may also comprise software elements, shown as being currently located within a working memory 5636, including an operating system 5640 and/or other code 5644. It should be appreciated that alternate embodiments of a computer system 5600 may have numerous variations from that described above. For example, customized hardware might also be used and/or particular elements might be implemented in hardware, software (including portable software, such as applets), or both. Further, connection to other computing devices such as network input/output devices may be employed.
Examples of the processors 340, 5608 as described herein may include, but are not limited to, at least one of Qualcomm® Snapdragon® 800 and 801, Qualcomm® Snapdragon® 620 and 615 with 4G LTE Integration and 64-bit computing, Apple® A7 processor with 64-bit architecture, Apple® M7 motion coprocessors, Samsung® Exynos® series, the Intel® Core™ family of processors, the Intel® Xeon® family of processors, the Intel® Atom™ family of processors, the Intel Itanium® family of processors, Intel® Core® i5-4670K and i7-4770K 22 nm Haswell, Intel® Core® i5-3560K 22 nm Ivy Bridge, the AMD® FX™ family of processors, AMD® FX-4300, FX-6300, and FX-8350 32 nm Vishera, AMD® Kaveri processors, Texas Instruments® Jacinto C6000™ automotive infotainment processors, Texas Instruments® OMAP™ automotive-grade mobile processors, ARM® Cortex™-M processors, ARM® Cortex-A and ARM926EJ-S™ processors, other industry-equivalent processors, and may perform computational functions using any known or future-developed standard, instruction set, libraries, and/or architecture.
Any of the steps, functions, and operations discussed herein can be performed continuously and automatically.
Aspects of the embodiments can include: A vehicle, comprising: a sensor to: detect a first presence of a driver in a vehicle; detect a second presence of a passenger in the vehicle; a memory to: store first sensitive information for the driver of the vehicle; store second sensitive information for the passenger in the vehicle; a processor in communication with the sensor and the memory, the processor to: receive the first presence and the second presence; provide a user interface for the passenger to enter the second sensitive information; receive the second sensitive information for the passenger;
associate the second sensitive information with the vehicle; and send the second sensitive information to the memory for storage.
Any of the one or more above aspects, wherein the sensor receives biometric information associated with the passenger.
Any of the one or more above aspects, wherein the biometric information is also stored with the second sensitive information.
Any of the one or more above aspects, wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the second sensitive information is encrypted in the memory.
Any of the one or more above aspects, wherein the second sensitive information comprises one or more of a biometric, a username, a password, mobile device information, payment information, a personal identification number, identifiers, an address, limits, preferences, and/or rules.
Any of the one or more above aspects, wherein the second sensitive information comprises one or more of a desired product, a desired service, and/or a triggering event.
Any of the one or more above aspects, wherein the processor presents a user interface to a head unit display to receive the second sensitive information.
Any of the one or more above aspects, wherein the user interface receives input from the passenger.
Any of the one or more above aspects, wherein the input includes financial information.
A method for associating passenger information with a vehicle, comprising: detecting a first presence of a driver in a vehicle; detecting a second presence of a passenger in the vehicle; receiving the first presence and the second presence; providing a user interface for the passenger to enter second sensitive information; receiving the second sensitive information for the passenger; associating the second sensitive information with the vehicle; and storing the second sensitive information for the passenger in the vehicle.
Any of the one or more above aspects, wherein the sensor receives biometric information associated with the passenger, wherein the biometric information is also stored with the second sensitive information, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the second sensitive information is encrypted in the memory, wherein the second sensitive information comprises one or more of a biometric, a username, a password, mobile device information, payment information, a personal identification number, identifiers, an address, limits, preferences, and/or rules.
Any of the one or more above aspects, wherein the second sensitive information comprises one or more of a desired product, a desired service, and/or a triggering event.
Any of the one or more above aspects, wherein the processor presents a user interface to receive the second sensitive information, wherein the user interface receives input from the passenger, and wherein the input includes financial information.
A non-transitory information storage media having stored thereon one or more instructions, that when executed by one or more processors, cause a vehicle to perform a method, the method comprising: detecting a first presence of a driver in a vehicle; detecting a second presence of a passenger in the vehicle; receiving the first presence and the second presence; providing a user interface for the passenger to enter second sensitive information; receiving the second sensitive information for the passenger; associating the second sensitive information with the vehicle; and storing the second sensitive information for the passenger in the vehicle.
Any of the one or more above aspects, wherein the sensor receives biometric information associated with the passenger, wherein the biometric information is also stored with the second sensitive information, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the second sensitive information is encrypted in the memory, wherein the second sensitive information comprises one or more of a biometric, a username, a password, mobile device information, payment information, a personal identification number, identifiers, an address, limits, preferences, and/or rules.
Any of the one or more above aspects, wherein the second sensitive information comprises one or more of a desired product, a desired service, and/or a triggering event.
Any of the one or more above aspects, wherein the processor presents a user interface to receive the second sensitive information, wherein the user interface receives input from the passenger, and wherein the input includes financial information.
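The storage-and-re-identification flow recited above (store encrypted sensitive information keyed to a passenger's biometric, then use the same biometric to identify the passenger on a later entry) can be sketched as follows. This is a toy illustration under stated assumptions, not the claimed implementation: the class name is hypothetical, the XOR keystream merely stands in for real encryption at rest, and exact SHA-256 digest lookup stands in for real biometric matching.

```python
import hashlib

class PassengerStore:
    """Toy store of passenger sensitive information keyed by a biometric digest."""

    def __init__(self, key: bytes):
        self._key = key          # illustrative shared key; a real system would use AEAD
        self._records = {}       # biometric digest -> "encrypted" record

    def _bio_id(self, biometric: bytes) -> str:
        # Stand-in for biometric matching: an exact SHA-256 digest lookup.
        return hashlib.sha256(biometric).hexdigest()

    def _xor(self, data: bytes) -> bytes:
        # Trivial XOR keystream standing in for real encryption of the record.
        return bytes(b ^ self._key[i % len(self._key)] for i, b in enumerate(data))

    def enroll(self, biometric: bytes, sensitive: bytes) -> None:
        """First entry: associate the sensitive information with the biometric."""
        self._records[self._bio_id(biometric)] = self._xor(sensitive)

    def identify(self, biometric: bytes):
        """Later entry: the same biometric identifies the passenger and recovers the record."""
        record = self._records.get(self._bio_id(biometric))
        return self._xor(record) if record is not None else None
```

Because the same keystream both "encrypts" and "decrypts", enrolling and then identifying with the same biometric returns the original record, while an unknown biometric returns nothing.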
A vehicle, comprising: two or more radio frequency (RF) antennas to wirelessly communicate sensitive information associated with a user of the vehicle; two or more RF transceivers, each RF transceiver associated with one of the two or more RF antennas, the two or more RF transceivers to wirelessly communicate the sensitive information; a processor in communication with the two or more RF transceivers, the processor to: determine which one of the two or more RF antennas is receiving a strongest signal; select a first RF transceiver associated with the RF antenna with the strongest signal to send the sensitive information; and send the sensitive information to the first RF transceiver.
Any of the one or more above aspects, wherein the RF transceiver and RF antenna are a radio frequency identification device.
Any of the one or more above aspects, wherein the RFID device is an active RFID device.
Any of the one or more above aspects, wherein the first RF antenna is located near a side mirror of the vehicle.
Any of the one or more above aspects, wherein the first RF antenna communicates with a second RF antenna in physical proximity to a drive-through window of a drive-through restaurant.
Any of the one or more above aspects, wherein the first RF antenna is located on a roof of the vehicle.
Any of the one or more above aspects, wherein the first RF antenna communicates with a second RF antenna above the vehicle in a toll lane.
Any of the one or more above aspects, wherein the first RF antenna is located near a charging port or a gas refill door.
Any of the one or more above aspects, wherein the first RF antenna communicates with a second RF antenna on a charging station or gas pump.
Any of the one or more above aspects, wherein the vehicle contemporaneously communicates through two or more RF antennas.
A method for communicating information with a vehicle, comprising: receiving a signal at one of two or more radio frequency (RF) antennas and one of two or more RF transceivers, wherein each RF transceiver is associated with one of the two or more RF antennas, to wirelessly communicate sensitive information associated with a user of the vehicle; determining which one of the two or more RF antennas is receiving a strongest signal; selecting a first RF transceiver associated with the RF antenna with the strongest signal to send the sensitive information; and sending the sensitive information through the first RF transceiver.
Any of the one or more above aspects, wherein the RF transceiver and RF antenna are a radio frequency identification device, and wherein the RFID device is an active RFID device.
Any of the one or more above aspects, wherein the first RF antenna is located near a side mirror of the vehicle, and wherein the first RF antenna communicates with a second RF antenna in physical proximity to a drive-through window of a drive-through restaurant.
Any of the one or more above aspects, wherein the first RF antenna is located on a roof of the vehicle, and wherein the first RF antenna communicates with a second RF antenna above the vehicle in a toll lane.
Any of the one or more above aspects, wherein the first RF antenna is located near a charging port or a gas refill door, and wherein the first RF antenna communicates with a second RF antenna on a charging station or gas pump.
A non-transitory information storage media having stored thereon one or more instructions, that when executed by one or more processors, cause a vehicle to perform a method, the method comprising: receiving a signal at one of two or more radio frequency (RF) antennas and one of two or more RF transceivers, wherein each RF transceiver is associated with one of the two or more RF antennas, to wirelessly communicate sensitive information associated with a user of the vehicle; determining which one of the two or more RF antennas is receiving a strongest signal; selecting a first RF transceiver associated with the RF antenna with the strongest signal to send the sensitive information; and sending the sensitive information through the first RF transceiver.
Any of the one or more above aspects, wherein the RF transceiver and RF antenna are a radio frequency identification device, and wherein the RFID device is an active RFID device.
Any of the one or more above aspects, wherein the first RF antenna is located near a side mirror of the vehicle, and wherein the first RF antenna communicates with a second RF antenna in physical proximity to a drive-through window of a drive-through restaurant.
Any of the one or more above aspects, wherein the first RF antenna is located on a roof of the vehicle, and wherein the first RF antenna communicates with a second RF antenna above the vehicle in a toll lane.
Any of the one or more above aspects, wherein the first RF antenna is located near a charging port or a gas refill door, and wherein the first RF antenna communicates with a second RF antenna on a charging station or gas pump.
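The antenna-selection logic recited above reduces to choosing the transceiver whose associated antenna reports the strongest received signal, then routing the sensitive information through it. A minimal sketch, assuming hypothetical antenna identifiers and RSSI readings in dBm (where "strongest" means least negative):

```python
def select_transceiver(rssi_dbm_by_antenna: dict) -> str:
    """Return the id of the antenna/transceiver pair reporting the strongest signal."""
    if not rssi_dbm_by_antenna:
        raise ValueError("no antenna readings available")
    # RSSI in dBm: the largest (least negative) value is the strongest signal.
    return max(rssi_dbm_by_antenna, key=rssi_dbm_by_antenna.get)

def send_sensitive_information(rssi_dbm_by_antenna: dict, payload: bytes) -> tuple:
    """Route the payload to the transceiver behind the strongest antenna."""
    chosen = select_transceiver(rssi_dbm_by_antenna)
    # A real system would hand the payload to the radio driver for this transceiver.
    return (chosen, payload)
```

With readings from, say, a side-mirror, roof, and charging-port antenna, the function simply picks whichever is currently best placed (drive-through window, toll gantry, or charging station).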
A vehicle, comprising: a sensor to: detect a presence of a driver in a vehicle; a memory to: store sensitive information for the driver of the vehicle; a processor in communication with the sensor and the memory, the processor to: receive the presence; based upon receiving the presence, provide a user interface for a user of the vehicle to enter user information; receive vehicle information associated with the vehicle; combine the user information and the vehicle information to generate the sensitive information; and send the sensitive information to the memory for storage.
Any of the one or more above aspects, wherein the sensor receives biometric information associated with the passenger.
Any of the one or more above aspects, wherein the biometric information is also stored with the sensitive information.
Any of the one or more above aspects, wherein the sensitive information is sent to a third party to authenticate the user.
Any of the one or more above aspects, wherein the sensitive information is encrypted in the memory.
Any of the one or more above aspects, wherein the user information comprises one or more of a biometric, a username, a password, mobile device information, payment information, a personal identification number, identifiers, an address, limits, preferences, and/or rules.
Any of the one or more above aspects, wherein the user information comprises one or more of a desired product, a desired service, and/or a triggering event.
Any of the one or more above aspects, wherein the processor presents a user interface to a head unit display to receive the sensitive information.
Any of the one or more above aspects, wherein the user interface receives input from the passenger, and wherein the input includes financial information.
Any of the one or more above aspects, wherein the vehicle information comprises a vehicle identification number (VIN), an electronic serial number (ESN), and/or an engine code.
A method for communicating information with a vehicle, comprising: detecting a presence of a driver in a vehicle; based upon detecting the presence, providing a user interface for a user of the vehicle to enter user information; receiving vehicle information associated with the vehicle; combining the user information and the vehicle information to generate sensitive information; and sending the sensitive information to a memory for storage.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein the biometric information is also stored with the sensitive information.
Any of the one or more above aspects, wherein the sensitive information is sent to a third party to authenticate the user.
Any of the one or more above aspects, wherein the user information comprises one or more of a biometric, a username, a password, mobile device information, payment information, a personal identification number, identifiers, an address, limits, preferences, rules, a desired product, a desired service, and/or a triggering event.
Any of the one or more above aspects, wherein the vehicle information comprises a vehicle identification number (VIN), an electronic serial number (ESN), and/or an engine code.
A non-transitory information storage media having stored thereon one or more instructions, that when executed by one or more processors, cause a vehicle to perform a method, the method comprising: detecting a presence of a driver in a vehicle; based upon detecting the presence, providing a user interface for a user of the vehicle to enter user information; receiving vehicle information associated with the vehicle; combining the user information and the vehicle information to generate sensitive information; and sending the sensitive information to a memory for storage.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein the biometric information is also stored with the sensitive information.
Any of the one or more above aspects, wherein the sensitive information is sent to a third party to authenticate the user.
Any of the one or more above aspects, wherein the user information comprises one or more of a biometric, a username, a password, mobile device information, payment information, a personal identification number, identifiers, an address, limits, preferences, rules, a desired product, a desired service, and/or a triggering event.
Any of the one or more above aspects, wherein the vehicle information comprises a vehicle identification number (VIN), an electronic serial number (ESN), and/or an engine code.
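Combining user information with vehicle identifiers (e.g. VIN, ESN) into a single record that a third party can authenticate can be sketched by binding the two with an HMAC tag. The field layout, JSON encoding, and shared-secret handling below are illustrative assumptions, not the disclosed design:

```python
import hashlib
import hmac
import json

def combine_credentials(user_info: dict, vehicle_info: dict, secret: bytes) -> dict:
    """Bind user information to vehicle identifiers with an HMAC-SHA256 tag."""
    # Canonical JSON so both sides compute the tag over identical bytes.
    payload = json.dumps({"user": user_info, "vehicle": vehicle_info}, sort_keys=True)
    tag = hmac.new(secret, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def verify_credentials(record: dict, secret: bytes) -> bool:
    """A third party holding the secret can authenticate the combined record."""
    expected = hmac.new(secret, record["payload"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["tag"])
```

Any tampering with either the user fields or the vehicle identifiers invalidates the tag, which is the property that lets the combined record serve as "sensitive information" for authentication.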
A vehicle, comprising: a navigation system to provide a location and/or a route of the vehicle; a memory to: store user preferences associated with a user in the vehicle; a processor in communication with the navigation system and the memory, the processor to: determine a third party resident in the location and/or on the route of the vehicle; retrieve user preferences from the memory; determine, based on the user preferences, whether the user would desire to communicate with the third party; and automatically send a first communication to the third party for the user.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger.
Any of the one or more above aspects, wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service.
Any of the one or more above aspects, wherein the user preference is a usual order based on historical information from the user's past behavior.
Any of the one or more above aspects, wherein the user is prompted to approve the first communication with a user interface displayed in a display in the vehicle.
Any of the one or more above aspects, wherein the vehicle sends a second communication to complete an interaction with the third party.
Any of the one or more above aspects, wherein the second communication is sent when the vehicle is in physical proximity to the third party.
Any of the one or more above aspects, wherein the second communication is a final authorization for a transaction.
Any of the one or more above aspects, wherein the final authorization includes a PIN entered by the user in a display of the vehicle.
A method for communicating information with a vehicle, comprising: determining a location and/or a route of the vehicle; determining a third party resident in the location and/or on the route of the vehicle; retrieving user preferences associated with a user in the vehicle from a memory; determining, based on the user preferences, whether the user would desire to communicate with the third party; and automatically sending a first communication to the third party for the user.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service, wherein the user preference is a usual order based on historical information from the user's past behavior, and wherein the user is prompted to approve the first communication with a user interface displayed in a display in the vehicle.
Any of the one or more above aspects, wherein the vehicle sends a second communication to complete an interaction with the third party, and wherein the second communication is sent when the vehicle is in physical proximity to the third party.
Any of the one or more above aspects, wherein the second communication is a final authorization for a transaction, and wherein the final authorization includes a PIN entered by the user in a display of the vehicle.
A non-transitory information storage media having stored thereon one or more instructions, that when executed by one or more processors, cause a vehicle to perform a method, the method comprising: determining a location and/or a route of the vehicle; determining a third party resident in the location and/or on the route of the vehicle; retrieving user preferences associated with a user in the vehicle from a memory; determining, based on the user preferences, whether the user would desire to communicate with the third party; and automatically sending a first communication to the third party for the user.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service, wherein the user preference is a usual order based on historical information from the user's past behavior, and wherein the user is prompted to approve the first communication with a user interface displayed in a display in the vehicle.
Any of the one or more above aspects, wherein the vehicle sends a second communication to complete an interaction with the third party, and wherein the second communication is sent when the vehicle is in physical proximity to the third party.
Any of the one or more above aspects, wherein the second communication is a final authorization for a transaction, and wherein the final authorization includes a PIN entered by the user in a display of the vehicle.
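The route-matching step above (find third parties on the route, consult stored preferences, and decide whether to send a first communication) can be sketched as a simple filter. Every name here is hypothetical, and the "usual order" lookup stands in for whatever preference model the vehicle actually keeps:

```python
def pending_communications(route_third_parties: list, usual_orders: dict) -> list:
    """For each third party on the route, emit the user's usual order, if any.

    Each emitted entry is a candidate "first communication" the vehicle could
    send automatically, or after prompting the user to approve it in a display.
    """
    return [
        {"third_party": tp, "order": usual_orders[tp]}
        for tp in route_third_parties
        if tp in usual_orders
    ]
```

Third parties on the route with no stored preference simply produce no communication, which matches the determination step of whether the user "would desire" to communicate.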
A vehicle, comprising: a display in the vehicle to display a user interface; a processor in communication with the display, the processor to: determine a user in the vehicle desires to interact with a third party; render the user interface for the display to request authorization from the user to allow a communication to the third party; receive user input associated with the request for authorization; and based on the user input, automatically send a first communication to the third party for the user.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger.
Any of the one or more above aspects, wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service.
Any of the one or more above aspects, wherein the user interface is an authorization for the vehicle to pre-order the good or service.
Any of the one or more above aspects, wherein the vehicle sends a second communication to complete an interaction with the third party.
Any of the one or more above aspects, wherein the second communication is sent when the vehicle is in physical proximity to the third party.
Any of the one or more above aspects, wherein the second communication is a final authorization for a transaction.
Any of the one or more above aspects, wherein the user interface is presented to finally authorize the second communication.
Any of the one or more above aspects, wherein the final authorization includes a PIN entered by the user in the user interface.
A method for communicating information with a vehicle, comprising: determining a user in the vehicle desires to interact with a third party; displaying a user interface to request authorization from the user to allow a communication to the third party; receiving user input associated with the request for authorization; and based on the user input, automatically sending a first communication to the third party for the user.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service, and wherein the user interface is an authorization for the vehicle to pre-order the good or service.
Any of the one or more above aspects, wherein the vehicle sends a second communication to complete an interaction with the third party, and wherein the second communication is sent when the vehicle is in physical proximity to the third party.
Any of the one or more above aspects, wherein the second communication is a final authorization for a transaction, wherein the user interface is presented to finally authorize the second communication, and wherein the final authorization includes a PIN entered by the user in the user interface.
A non-transitory information storage media having stored thereon one or more instructions, that when executed by one or more processors, cause a vehicle to perform a method, the method comprising: determining a user in the vehicle desires to interact with a third party; displaying a user interface to request authorization from the user to allow a communication to the third party; receiving user input associated with the request for authorization; and based on the user input, automatically sending a first communication to the third party for the user.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service, and wherein the user interface is an authorization for the vehicle to pre-order the good or service.
Any of the one or more above aspects, wherein the vehicle sends a second communication to complete an interaction with the third party, and wherein the second communication is sent when the vehicle is in physical proximity to the third party.
Any of the one or more above aspects, wherein the second communication is a final authorization for a transaction, wherein the user interface is presented to finally authorize the second communication, and wherein the final authorization includes a PIN entered by the user in the user interface.
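The two-stage authorization described above (a user-approved pre-order as the first communication, then a proximity- and PIN-gated final authorization as the second) is naturally a small state machine. This sketch is an assumption-laden toy, not the claimed system; the state names and PIN comparison are purely illustrative:

```python
class TransactionFlow:
    """Two-stage flow: approved pre-order first, PIN-gated final authorization second."""

    def __init__(self, pin: str):
        self._pin = pin
        self.state = "idle"

    def approve_preorder(self, user_approved: bool):
        """First communication: sent only after the user approves in the UI."""
        if user_approved:
            self.state = "preordered"
            return "first_communication"
        return None

    def finalize(self, entered_pin: str, in_proximity: bool):
        """Second communication: sent only in physical proximity and with the right PIN."""
        if self.state == "preordered" and in_proximity and entered_pin == self._pin:
            self.state = "complete"
            return "second_communication"
        return None
```

The ordering constraint (no final authorization without a prior approved pre-order) falls out of the state check rather than any external bookkeeping.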
A vehicle, comprising: a memory to: store user information associated with a user in the vehicle; store vehicle information associated with the vehicle; a processor in communication with the memory, the processor to: determine the user desires to interact with a third party; retrieve user information from the memory; retrieve vehicle information from the memory; combine the user information and the vehicle information into a first communication for the third party; and automatically send the first communication to the third party for the user.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger.
Any of the one or more above aspects, wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service and the user information and vehicle information authenticate the user for the order.
Any of the one or more above aspects, wherein the user information comprises financial information, biometric information, and/or a PIN.
Any of the one or more above aspects, wherein the vehicle information comprises a vehicle identification number, an electronic serial number, and/or an engine code.
Any of the one or more above aspects, wherein the vehicle sends a second communication to complete an interaction with the third party, and wherein the second communication is sent when the vehicle is in physical proximity to the third party.
Any of the one or more above aspects, wherein the second communication is a final authorization for a transaction.
Any of the one or more above aspects, wherein the second communication includes user information or vehicle information not in the first communication.
Any of the one or more above aspects, wherein the final authorization includes a PIN entered by the user in a user interface in the vehicle.
A method for communicating information with a vehicle, comprising: storing user information associated with a user in the vehicle; storing vehicle information associated with the vehicle; determining the user desires to interact with a third party; combining the user information and the vehicle information into a first communication for the third party; and automatically sending the first communication to the third party for the user.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service and the user information and vehicle information authenticate the user for the order.
Any of the one or more above aspects, wherein the user information comprises financial information, biometric information, and/or a PIN, and wherein the vehicle information comprises a vehicle identification number, an electronic serial number, and/or an engine code.
Any of the one or more above aspects, wherein the vehicle sends a second communication to complete an interaction with the third party, and wherein the second communication is sent when the vehicle is in physical proximity to the third party, wherein the second communication is a final authorization for a transaction, wherein the second communication includes user information or vehicle information not in the first communication, and wherein the final authorization includes a PIN entered by the user in a user interface in the vehicle.
A non-transitory information storage media having stored thereon one or more instructions, that when executed by one or more processors, cause a vehicle to perform a method, the method comprising: storing user information associated with a user in the vehicle; storing vehicle information associated with the vehicle; determining the user desires to interact with a third party; combining the user information and the vehicle information into a first communication for the third party; and automatically sending the first communication to the third party for the user.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service and the user information and vehicle information authenticate the user for the order.
Any of the one or more above aspects, wherein the user information comprises financial information, biometric information, and/or a PIN, and wherein the vehicle information comprises a vehicle identification number, an electronic serial number, and/or an engine code.
Any of the one or more above aspects, wherein the vehicle sends a second communication to complete an interaction with the third party, and wherein the second communication is sent when the vehicle is in physical proximity to the third party, wherein the second communication is a final authorization for a transaction, wherein the second communication includes user information or vehicle information not in the first communication, and wherein the final authorization includes a PIN entered by the user in a user interface in the vehicle.
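The split recited above, where the second communication carries user or vehicle information deliberately withheld from the first, can be sketched as two message builders. All field names (payment token, biometric digest) are hypothetical placeholders for whatever user and vehicle information an actual system would hold:

```python
def build_first_communication(user_info: dict, vehicle_info: dict) -> dict:
    """First message: just enough user and vehicle data to authenticate the order."""
    return {"vin": vehicle_info["vin"], "payment": user_info["payment_token"]}

def build_second_communication(user_info: dict, vehicle_info: dict, pin: str) -> dict:
    """Final authorization: fields withheld from the first message, plus the user's PIN."""
    return {
        "esn": vehicle_info["esn"],          # vehicle identifier not in the first message
        "biometric": user_info["biometric_digest"],
        "pin": pin,                          # entered in the in-vehicle user interface
    }
```

Keeping the two payloads disjoint means the third party only receives the full authenticating set once the vehicle is in proximity and the PIN is entered, matching the final-authorization aspect.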
A vehicle, comprising: a memory to store limits on communications associated with a user in the vehicle; a processor in communication with the memory, the processor to: determine the user desires to interact with a third party; retrieve the limits from the memory; based on the limits, determine if the vehicle can send a first communication to the third party; and when allowed by the limits, automatically send the first communication to the third party for the user.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger.
Any of the one or more above aspects, wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service and the limits are a monetary limit applied to the order.
Any of the one or more above aspects, wherein the limit applies only to the user and the vehicle.
Any of the one or more above aspects, wherein the limit applies only to the user and to two or more vehicles.
Any of the one or more above aspects, wherein the limit applies only to two or more users and the vehicle.
Any of the one or more above aspects, wherein the limit is set on the user by a second user.
Any of the one or more above aspects, wherein the limit is on a number of communications that may be completed between the vehicle and third party.
Any of the one or more above aspects, wherein the limit only applies for a predetermined time period or route.
A method for communicating information with a vehicle, comprising: a memory storing limits on communications associated with a user in the vehicle; a processor determining the user desires to interact with a third party; retrieving the limits from the memory; based on the limits, determining if the vehicle can send a first communication to the third party; and when allowed by the limits, automatically sending the first communication to the third party for the user.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service and the limits are a monetary limit applied to the order.
Any of the one or more above aspects, wherein one or more of: the limit applies only to the user and the vehicle, the limit applies only to the user and to two or more vehicles, the limit applies only to two or more users and the vehicle, and/or the limit is set on the user by a second user.
Any of the one or more above aspects, wherein the limit only applies for a predetermined time period or route.
A non-transitory information storage media having stored thereon one or more instructions, that when executed by one or more processors, cause a vehicle to perform a method, the method comprising: storing, in a memory, limits on communications associated with a user in the vehicle; determining the user desires to interact with a third party; retrieving the limits from the memory; based on the limits, determining if the vehicle can send a first communication to the third party; and when allowed by the limits, automatically sending the first communication to the third party for the user.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service and the limits are a monetary limit applied to the order.
Any of the one or more above aspects, wherein one or more of: the limit applies only to the user and the vehicle, the limit applies only to the user and to two or more vehicles, the limit applies only to two or more users and the vehicle, and/or the limit is set on the user by a second user.
Any of the one or more above aspects, wherein the limit only applies for a predetermined time period or route.
A vehicle, comprising: a memory to store a user rule applicable to communications associated with a user in the vehicle; a processor in communication with the memory, the processor to: determine a need for communication with a third party; retrieve the user rule from the memory; based on the user rule, determine to which third party the vehicle can send a first communication to address the need; select the third party; and when determined by the user rule, automatically send the first communication to the selected third party to address the need.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger.
Any of the one or more above aspects, wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service.
Any of the one or more above aspects, wherein the user rule determines a distance the vehicle can travel to the third party.
Any of the one or more above aspects, wherein the user rule determines a monetary amount the vehicle can pay to the third party.
Any of the one or more above aspects, wherein the user rule allows the processor to bargain with the third party.
Any of the one or more above aspects, wherein the third party also has a vendor rule to govern an interaction with the vehicle.
Any of the one or more above aspects, wherein the user rule allows a search of two or more third parties that can address the need.
Any of the one or more above aspects, wherein the user rule is a time limit to address the need.
A method for communicating information with a vehicle, comprising: a memory storing a user rule applicable to communications associated with a user in the vehicle; a processor determining a need for communication with a third party; retrieving the user rule from the memory; based on the user rule, determining to which third party the vehicle can send a first communication to address the need; selecting the third party; and when determined by the user rule, automatically sending the first communication to the selected third party to address the need.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service.
Any of the one or more above aspects, wherein one or more of: the user rule determines a distance the vehicle can travel to the third party, the user rule determines a monetary amount the vehicle can pay to the third party, the user rule allows the processor to bargain with the third party, and/or the user rule is a time limit to address the need.
Any of the one or more above aspects, wherein the user rule allows a search of two or more third parties that can address the need.
A non-transitory information storage media having stored thereon one or more instructions, that when executed by one or more processors, cause a vehicle to perform a method, the method comprising: storing, in a memory, a user rule applicable to communications associated with a user in the vehicle; determining a need for communication with a third party; retrieving the user rule from the memory; based on the user rule, determining to which third party the vehicle can send a first communication to address the need; selecting the third party; and when determined by the user rule, automatically sending the first communication to the selected third party to address the need.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service.
Any of the one or more above aspects, wherein one or more of: the user rule determines a distance the vehicle can travel to the third party, the user rule determines a monetary amount the vehicle can pay to the third party, the user rule allows the processor to bargain with the third party, and/or the user rule is a time limit to address the need.
Any of the one or more above aspects, wherein the user rule allows a search of two or more third parties that can address the need.
A vehicle, comprising: a memory to store triggering event information applicable to communications associated with a user in the vehicle; a processor in communication with the memory, the processor to: determine a triggering event has occurred; retrieve the triggering event information from the memory; based on the triggering event information, determine a third party to which the vehicle sends a first communication; and when determined by the triggering event information, automatically send the first communication to the third party to address the triggering event.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger.
Any of the one or more above aspects, wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service.
Any of the one or more above aspects, wherein the triggering event information causes a change to a user interface in the vehicle.
Any of the one or more above aspects, wherein the triggering event is a weather-related event.
Any of the one or more above aspects, wherein the first communication purchases weather information to be overlaid on the user interface.
Any of the one or more above aspects, wherein the triggering event is a police action.
Any of the one or more above aspects, wherein the first communication purchases real-time traffic information to be overlaid on the user interface to avoid the police action.
Any of the one or more above aspects, wherein the triggering event is a malfunction of the vehicle, and wherein the first communication purchases a tow service or a part to address the malfunction.
A method for communicating information with a vehicle, comprising: a memory storing triggering event information applicable to communications associated with a user in the vehicle; a processor determining a triggering event has occurred; retrieving the triggering event information from the memory; based on the triggering event information, determining a third party to which the vehicle sends a first communication; and when determined by the triggering event information, automatically sending the first communication to the third party to address the triggering event.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service.
Any of the one or more above aspects, wherein the triggering event information causes a change to a user interface in the vehicle, wherein the triggering event is a weather-related event, and wherein the first communication purchases weather information to be overlaid on the user interface.
Any of the one or more above aspects, wherein the triggering event information causes a change to a user interface in the vehicle, wherein the triggering event is a police action, and wherein the first communication purchases real-time traffic information to be overlaid on the user interface to avoid the police action.
A non-transitory information storage media having stored thereon one or more instructions, that when executed by one or more processors, cause a vehicle to perform a method, the method comprising: storing, in a memory, triggering event information applicable to communications associated with a user in the vehicle; determining a triggering event has occurred; retrieving the triggering event information from the memory; based on the triggering event information, determining a third party to which the vehicle sends a first communication; and when determined by the triggering event information, automatically sending the first communication to the third party to address the triggering event.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is an order for a good or service.
Any of the one or more above aspects, wherein the triggering event information causes a change to a user interface in the vehicle, wherein the triggering event is a weather-related event, and wherein the first communication purchases weather information to be overlaid on the user interface.
Any of the one or more above aspects, wherein the triggering event information causes a change to a user interface in the vehicle, wherein the triggering event is a police action, and wherein the first communication purchases real-time traffic information to be overlaid on the user interface to avoid the police action.
A vehicle, comprising: two or more soft touch radio frequency (RF) antennas to communicate sensitive information associated with a user of the vehicle; two or more RF transceivers, each RF transceiver associated with one of the two or more soft touch RF antennas, the two or more RF transceivers to communicate sensitive information associated with a user of the vehicle; a processor in communication with the two or more RF transceivers, the processor to: receive an input to deploy one of the two or more RF antennas; select a first RF transceiver associated with the deployed RF antenna to send the sensitive information; and send the sensitive information to the first RF transceiver.
Any of the one or more above aspects, wherein the RF transceiver and RF antenna comprise a radio frequency identification (RFID) device.
Any of the one or more above aspects, wherein the RFID device is an active RFID device.
Any of the one or more above aspects, wherein the deployed RF antenna is located near a side mirror of the vehicle.
Any of the one or more above aspects, wherein the deployed RF antenna contacts a pad physically attached to a structure, associated with a second RF antenna, in physical proximity with a drive-through window of a drive-through restaurant.
Any of the one or more above aspects, wherein the deployed RF antenna is located on a roof of the vehicle.
Any of the one or more above aspects, wherein the deployed RF antenna contacts a pad physically attached to a structure, associated with a second RF antenna, above the vehicle in a toll lane.
Any of the one or more above aspects, wherein the deployed RF antenna is located near a charging port or a gas refill door.
Any of the one or more above aspects, wherein the deployed RF antenna contacts a pad physically attached to a structure on a charging station or gas pump.
Any of the one or more above aspects, wherein the vehicle contemporaneously communicates through two or more soft touch RF antennas.
A method for communicating information with a vehicle, comprising: receiving an input to deploy one of two or more soft touch radio frequency (RF) antennas to communicate sensitive information associated with a user of the vehicle; selecting a first RF transceiver associated with the deployed RF antenna to send the sensitive information; and sending the sensitive information through the first RF transceiver.
Any of the one or more above aspects, wherein the RF transceiver and RF antenna comprise a radio frequency identification (RFID) device, and wherein the RFID device is an active RFID device.
Any of the one or more above aspects, wherein the deployed RF antenna is located near a side mirror of the vehicle, and wherein the deployed RF antenna contacts a pad physically attached to a structure, associated with a second RF antenna, in physical proximity with a drive-through window of a drive-through restaurant.
Any of the one or more above aspects, wherein the deployed RF antenna is located on a roof of the vehicle, and wherein the deployed RF antenna contacts a pad physically attached to a structure, associated with a second RF antenna, above the vehicle in a toll lane.
Any of the one or more above aspects, wherein the deployed RF antenna is located near a charging port or a gas refill door, and wherein the deployed RF antenna contacts a pad physically attached to a structure on a charging station or gas pump.
A non-transitory information storage media having stored thereon one or more instructions, that when executed by one or more processors, cause a vehicle to perform a method, the method comprising: receiving an input to deploy one of two or more soft touch radio frequency (RF) antennas to communicate sensitive information associated with a user of the vehicle; selecting a first RF transceiver associated with the deployed RF antenna to send the sensitive information; and sending the sensitive information through the first RF transceiver.
Any of the one or more above aspects, wherein the RF transceiver and RF antenna comprise a radio frequency identification (RFID) device, and wherein the RFID device is an active RFID device.
Any of the one or more above aspects, wherein the deployed RF antenna is located near a side mirror of the vehicle, and wherein the deployed RF antenna contacts a pad physically attached to a structure, associated with a second RF antenna, in physical proximity with a drive-through window of a drive-through restaurant.
Any of the one or more above aspects, wherein the deployed RF antenna is located on a roof of the vehicle, and wherein the deployed RF antenna contacts a pad physically attached to a structure, associated with a second RF antenna, above the vehicle in a toll lane.
Any of the one or more above aspects, wherein the deployed RF antenna is located near a charging port or a gas refill door, and wherein the deployed RF antenna contacts a pad physically attached to a structure on a charging station or gas pump.
A vehicle, comprising: a memory to: store sensitive information associated with a user in the vehicle; store user preferences associated with the user in the vehicle; and a processor in communication with the memory, the processor to: based on the user preferences, determine the user desires to communicate with a third party in an interaction; automatically send a first communication to the third party; and send a second communication to the third party for the user to complete the interaction.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger.
Any of the one or more above aspects, wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is a pre-order for a good or service.
Any of the one or more above aspects, wherein the user preference is a usual order based on historical information from the user's past behavior.
Any of the one or more above aspects, wherein the user is prompted to approve the first communication with a user interface displayed in a display in the vehicle.
Any of the one or more above aspects, wherein the second communication is an authorization to complete the order with the third party.
Any of the one or more above aspects, wherein the second communication is sent when the vehicle is in physical proximity to the third party.
Any of the one or more above aspects, wherein the first communication is sent over a first communication network, wherein the second communication is sent over a second communication network, and wherein the first and second communication networks are different.
Any of the one or more above aspects, wherein the first communication is sent when the vehicle is distant from the third party.
A method for communicating information with a vehicle, comprising: storing sensitive information associated with a user in the vehicle; storing user preferences associated with the user in the vehicle; based on the user preferences, determining the user desires to communicate with a third party in an interaction; automatically sending a first communication to the third party; and sending a second communication to the third party for the user to complete the interaction.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is a pre-order for a good or service.
Any of the one or more above aspects, wherein the user preference is a usual order based on historical information from the user's past behavior, and wherein the user is prompted to approve the first communication with a user interface displayed in a display in the vehicle.
Any of the one or more above aspects, wherein the second communication is sent when the vehicle is in physical proximity to the third party, wherein the first communication is sent over a first communication network, wherein the second communication is sent over a second communication network, and wherein the first and second communication networks are different, and wherein the first communication is sent when the vehicle is distant from the third party.
A non-transitory information storage media having stored thereon one or more instructions, that when executed by one or more processors, cause a vehicle to perform a method, the method comprising: storing sensitive information associated with a user in the vehicle; storing user preferences associated with the user in the vehicle; based on the user preferences, determining the user desires to communicate with a third party in an interaction; automatically sending a first communication to the third party; and sending a second communication to the third party for the user to complete the interaction.
Any of the one or more above aspects, wherein a sensor associated with the vehicle receives biometric information associated with the passenger, and wherein, when the passenger enters the vehicle a second time, the biometric information identifies the passenger.
Any of the one or more above aspects, wherein the first communication is a pre-order for a good or service.
Any of the one or more above aspects, wherein the user preference is a usual order based on historical information from the user's past behavior, and wherein the user is prompted to approve the first communication with a user interface displayed in a display in the vehicle.
Any of the one or more above aspects, wherein the second communication is sent when the vehicle is in physical proximity to the third party, wherein the first communication is sent over a first communication network, wherein the second communication is sent over a second communication network, and wherein the first and second communication networks are different, and wherein the first communication is sent when the vehicle is distant from the third party.
A means, system, SOC, ASIC, FPGA, device, circuit, component, software component, or other module for conducting any of the methods and/or any of the one or more aspects above.
The exemplary systems and methods of this disclosure have been described in relation to vehicle systems and electric vehicles. However, to avoid unnecessarily obscuring the present disclosure, the preceding description omits a number of known structures and devices. This omission is not to be construed as a limitation of the scope of the claimed disclosure. Specific details are set forth to provide an understanding of the present disclosure. It should, however, be appreciated that the present disclosure may be practiced in a variety of ways beyond the specific detail set forth herein.
Furthermore, while the exemplary embodiments illustrated herein show the various components of the system collocated, certain components of the system can be located remotely, at distant portions of a distributed network, such as a LAN and/or the Internet, or within a dedicated system. Thus, it should be appreciated that the components of the system can be combined into one or more devices, such as a server or communication device, or collocated on a particular node of a distributed network, such as an analog and/or digital telecommunications network, a packet-switched network, or a circuit-switched network. It will be appreciated from the preceding description, and for reasons of computational efficiency, that the components of the system can be arranged at any location within a distributed network of components without affecting the operation of the system. For example, the various components can be located in a switch such as a PBX and media server, gateway, in one or more communications devices, at one or more users' premises, or some combination thereof. Similarly, one or more functional portions of the system could be distributed between a telecommunications device(s) and an associated computing device.
Furthermore, it should be appreciated that the various links connecting the elements can be wired or wireless links, or any combination thereof, or any other known or later developed element(s) that is capable of supplying and/or communicating data to and from the connected elements. These wired or wireless links can also be secure links and may be capable of communicating encrypted information. Transmission media used as links, for example, can be any suitable carrier for electrical signals, including coaxial cables, copper wire, and fiber optics, and may take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.
While the flowcharts have been discussed and illustrated in relation to a particular sequence of events, it should be appreciated that changes, additions, and omissions to this sequence can occur without materially affecting the operation of the disclosed embodiments, configuration, and aspects.
A number of variations and modifications of the disclosure can be used. It would be possible to provide for some features of the disclosure without providing others.
In yet another embodiment, the systems and methods of this disclosure can be implemented in conjunction with a special purpose computer, a programmed microprocessor or microcontroller and peripheral integrated circuit element(s), an ASIC or other integrated circuit, a digital signal processor, a hard-wired electronic or logic circuit such as a discrete element circuit, a programmable logic device or gate array such as a PLD, PLA, FPGA, or PAL, any comparable means, or the like. In general, any device(s) or means capable of implementing the methodology illustrated herein can be used to implement the various aspects of this disclosure. Exemplary hardware that can be used for the present disclosure includes computers, handheld devices, telephones (e.g., cellular, Internet enabled, digital, analog, hybrids, and others), and other hardware known in the art. Some of these devices include processors (e.g., a single or multiple microprocessors), memory, nonvolatile storage, input devices, and output devices. Furthermore, alternative software implementations including, but not limited to, distributed processing or component/object distributed processing, parallel processing, or virtual machine processing can also be constructed to implement the methods described herein.
In yet another embodiment, the disclosed methods may be readily implemented in conjunction with software using object or object-oriented software development environments that provide portable source code that can be used on a variety of computer or workstation platforms. Alternatively, the disclosed system may be implemented partially or fully in hardware using standard logic circuits or VLSI design. Whether software or hardware is used to implement the systems in accordance with this disclosure is dependent on the speed and/or efficiency requirements of the system, the particular function, and the particular software or hardware systems or microprocessor or microcomputer systems being utilized.
In yet another embodiment, the disclosed methods may be partially implemented in software that can be stored on a storage medium and executed on a programmed general-purpose computer with the cooperation of a controller and memory, a special purpose computer, a microprocessor, or the like. In these instances, the systems and methods of this disclosure can be implemented as a program embedded on a personal computer such as an applet, JAVA® or CGI script, as a resource residing on a server or computer workstation, as a routine embedded in a dedicated measurement system, system component, or the like. The system can also be implemented by physically incorporating the system and/or method into a software and/or hardware system.
Although the present disclosure describes components and functions implemented in the embodiments with reference to particular standards and protocols, the disclosure is not limited to such standards and protocols. Other similar standards and protocols not mentioned herein are in existence and are considered to be included in the present disclosure. Moreover, the standards and protocols mentioned herein and other similar standards and protocols not mentioned herein are periodically superseded by faster or more effective equivalents having essentially the same functions. Such replacement standards and protocols having the same functions are considered equivalents included in the present disclosure.
The present disclosure, in various embodiments, configurations, and aspects, includes components, methods, processes, systems and/or apparatus substantially as depicted and described herein, including various embodiments, subcombinations, and subsets thereof. Those of skill in the art will understand how to make and use the systems and methods disclosed herein after understanding the present disclosure. The present disclosure, in various embodiments, configurations, and aspects, includes providing devices and processes in the absence of items not depicted and/or described herein or in various embodiments, configurations, or aspects hereof, including in the absence of such items as may have been used in previous devices or processes, e.g., for improving performance, achieving ease, and/or reducing cost of implementation.
The foregoing discussion of the disclosure has been presented for purposes of illustration and description. The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing Detailed Description for example, various features of the disclosure are grouped together in one or more embodiments, configurations, or aspects for the purpose of streamlining the disclosure. The features of the embodiments, configurations, or aspects of the disclosure may be combined in alternate embodiments, configurations, or aspects other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claimed disclosure requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment, configuration, or aspect. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate preferred embodiment of the disclosure.
Moreover, though the description of the disclosure has included description of one or more embodiments, configurations, or aspects and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights, which include alternative embodiments, configurations, or aspects to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges, or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges, or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.
The phrases “at least one,” “one or more,” “or,” and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions “at least one of A, B and C,” “at least one of A, B, or C,” “one or more of A, B, and C,” “one or more of A, B, or C,” “A, B, and/or C,” and “A, B, or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
The term “a” or “an” entity refers to one or more of that entity. As such, the terms “a” (or “an”), “one or more,” and “at least one” can be used interchangeably herein. It is also to be noted that the terms “comprising,” “including,” and “having” can be used interchangeably.
The term “automatic” and variations thereof, as used herein, refer to any process or operation, which is typically continuous or semi-continuous, done without material human input when the process or operation is performed. However, a process or operation can be automatic, even though performance of the process or operation uses material or immaterial human input, if the input is received before performance of the process or operation. Human input is deemed to be material if such input influences how the process or operation will be performed. Human input that merely consents to the performance of the process or operation is not deemed to be “material.”
Aspects of the present disclosure may take the form of an embodiment that is entirely hardware, an embodiment that is entirely software (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.” Any combination of one or more computer-readable medium(s) may be utilized. The computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
A computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including, but not limited to, wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
The terms “determine,” “calculate,” “compute,” and variations thereof, as used herein, are used interchangeably and include any type of methodology, process, mathematical operation or technique.
Examples of the processors as described herein may include, but are not limited to, at least one of Qualcomm® Snapdragon® 800 and 801, Qualcomm® Snapdragon® 610 and 615 with 4G LTE Integration and 64-bit computing, Apple® A7 processor with 64-bit architecture, Apple® M7 motion coprocessors, Samsung® Exynos® series, the Intel® Core™ family of processors, the Intel® Xeon® family of processors, the Intel® Atom™ family of processors, the Intel Itanium® family of processors, Intel® Core® i5-4670K and i7-4770K 22 nm Haswell, Intel® Core® i5-3570K 22 nm Ivy Bridge, the AMD® FX™ family of processors, AMD® FX-4300, FX-6300, and FX-8350 32 nm Vishera, AMD® Kaveri processors, Texas Instruments® Jacinto C6000™ automotive infotainment processors, Texas Instruments® OMAP™ automotive-grade mobile processors, ARM® Cortex™-M processors, ARM® Cortex-A and ARM926EJ-S™ processors, other industry-equivalent processors, and may perform computational functions using any known or future-developed standard, instruction set, libraries, and/or architecture.
The term “means” as used herein shall be given its broadest possible interpretation in accordance with 35 U.S.C., Section 112(f) and/or Section 112, Paragraph 6. Accordingly, a claim incorporating the term “means” shall cover all structures, materials, or acts set forth herein, and all of the equivalents thereof. Further, the structures, materials or acts and the equivalents thereof shall include all those described in the summary, brief description of the drawings, detailed description, abstract, and claims themselves.
The present application claims the benefits of and priority, under 35 U.S.C. § 119(e), to U.S. Provisional Application Ser. No. 62/359,563, filed on Jul. 7, 2016 and U.S. Provisional Application Ser. No. 62/378,348, filed on Aug. 23, 2016, both entitled “Next Generation Vehicle.” The entire disclosures of the applications listed above are hereby incorporated by reference, in their entirety, for all that they teach and for all purposes.
20130005414 | Bindra et al. | Jan 2013 | A1 |
20130013157 | Kim et al. | Jan 2013 | A1 |
20130019252 | Haase et al. | Jan 2013 | A1 |
20130024060 | Sukkarie et al. | Jan 2013 | A1 |
20130024364 | Shrivastava et al. | Jan 2013 | A1 |
20130030645 | Divine et al. | Jan 2013 | A1 |
20130030811 | Olleon et al. | Jan 2013 | A1 |
20130031540 | Throop et al. | Jan 2013 | A1 |
20130031541 | Wilks et al. | Jan 2013 | A1 |
20130035063 | Fisk et al. | Feb 2013 | A1 |
20130035969 | Fisher | Feb 2013 | A1 |
20130046624 | Calman | Feb 2013 | A1 |
20130050069 | Ota | Feb 2013 | A1 |
20130055096 | Kim et al. | Feb 2013 | A1 |
20130059607 | Herz et al. | Mar 2013 | A1 |
20130063336 | Sugimoto et al. | Mar 2013 | A1 |
20130066512 | Willard et al. | Mar 2013 | A1 |
20130067599 | Raje et al. | Mar 2013 | A1 |
20130075530 | Shander et al. | Mar 2013 | A1 |
20130079964 | Sukkarie et al. | Mar 2013 | A1 |
20130083805 | Lu et al. | Apr 2013 | A1 |
20130085787 | Gore et al. | Apr 2013 | A1 |
20130086164 | Wheeler et al. | Apr 2013 | A1 |
20130099915 | Prasad et al. | Apr 2013 | A1 |
20130103196 | Monceaux et al. | Apr 2013 | A1 |
20130105264 | Ruth et al. | May 2013 | A1 |
20130116882 | Link et al. | May 2013 | A1 |
20130116915 | Ferreira et al. | May 2013 | A1 |
20130132286 | Schaefer et al. | May 2013 | A1 |
20130134730 | Ricci | May 2013 | A1 |
20130135118 | Ricci | May 2013 | A1 |
20130138591 | Ricci | May 2013 | A1 |
20130138714 | Ricci | May 2013 | A1 |
20130139140 | Rao et al. | May 2013 | A1 |
20130141247 | Ricci | Jun 2013 | A1 |
20130141252 | Ricci | Jun 2013 | A1 |
20130143495 | Ricci | Jun 2013 | A1 |
20130143546 | Ricci | Jun 2013 | A1 |
20130143601 | Ricci | Jun 2013 | A1 |
20130144459 | Ricci | Jun 2013 | A1 |
20130144460 | Ricci | Jun 2013 | A1 |
20130144461 | Ricci | Jun 2013 | A1 |
20130144462 | Ricci | Jun 2013 | A1 |
20130144463 | Ricci et al. | Jun 2013 | A1 |
20130144469 | Ricci | Jun 2013 | A1 |
20130144470 | Ricci | Jun 2013 | A1 |
20130144474 | Ricci | Jun 2013 | A1 |
20130144486 | Ricci | Jun 2013 | A1 |
20130144520 | Ricci | Jun 2013 | A1 |
20130144657 | Ricci | Jun 2013 | A1 |
20130145065 | Ricci | Jun 2013 | A1 |
20130145279 | Ricci | Jun 2013 | A1 |
20130145297 | Ricci et al. | Jun 2013 | A1 |
20130145360 | Ricci | Jun 2013 | A1 |
20130145401 | Ricci | Jun 2013 | A1 |
20130145482 | Ricci et al. | Jun 2013 | A1 |
20130147638 | Ricci | Jun 2013 | A1 |
20130151031 | Ricci | Jun 2013 | A1 |
20130151065 | Ricci | Jun 2013 | A1 |
20130151088 | Ricci | Jun 2013 | A1 |
20130151288 | Bowne et al. | Jun 2013 | A1 |
20130152003 | Ricci et al. | Jun 2013 | A1 |
20130154298 | Ricci | Jun 2013 | A1 |
20130157640 | Aycock | Jun 2013 | A1 |
20130157647 | Kolodziej | Jun 2013 | A1 |
20130158778 | Tengler et al. | Jun 2013 | A1 |
20130158821 | Ricci | Jun 2013 | A1 |
20130166096 | Jotanovic | Jun 2013 | A1 |
20130166097 | Ricci | Jun 2013 | A1 |
20130166098 | Lavie et al. | Jun 2013 | A1 |
20130166152 | Butterworth | Jun 2013 | A1 |
20130166208 | Forstall et al. | Jun 2013 | A1 |
20130167159 | Ricci et al. | Jun 2013 | A1 |
20130173531 | Rinearson et al. | Jul 2013 | A1 |
20130179689 | Matsumoto et al. | Jul 2013 | A1 |
20130181847 | Willig et al. | Jul 2013 | A1 |
20130190978 | Kato et al. | Jul 2013 | A1 |
20130194108 | Lapiotis et al. | Aug 2013 | A1 |
20130197796 | Obradovich et al. | Aug 2013 | A1 |
20130198031 | Mitchell et al. | Aug 2013 | A1 |
20130198737 | Ricci | Aug 2013 | A1 |
20130198802 | Ricci | Aug 2013 | A1 |
20130200991 | Ricci et al. | Aug 2013 | A1 |
20130203400 | Ricci | Aug 2013 | A1 |
20130204455 | Chia et al. | Aug 2013 | A1 |
20130204457 | King | Aug 2013 | A1 |
20130204466 | Ricci | Aug 2013 | A1 |
20130204484 | Ricci | Aug 2013 | A1 |
20130204493 | Ricci et al. | Aug 2013 | A1 |
20130204943 | Ricci | Aug 2013 | A1 |
20130205026 | Ricci | Aug 2013 | A1 |
20130205412 | Ricci | Aug 2013 | A1 |
20130207794 | Patel et al. | Aug 2013 | A1 |
20130212065 | Rahnama | Aug 2013 | A1 |
20130212659 | Maher et al. | Aug 2013 | A1 |
20130215116 | Siddique et al. | Aug 2013 | A1 |
20130218412 | Ricci | Aug 2013 | A1 |
20130218445 | Basir | Aug 2013 | A1 |
20130219039 | Ricci | Aug 2013 | A1 |
20130226365 | Brozovich | Aug 2013 | A1 |
20130226371 | Rovik et al. | Aug 2013 | A1 |
20130226392 | Schneider et al. | Aug 2013 | A1 |
20130226449 | Rovik et al. | Aug 2013 | A1 |
20130226622 | Adamson et al. | Aug 2013 | A1 |
20130227648 | Ricci | Aug 2013 | A1 |
20130231784 | Rovik et al. | Sep 2013 | A1 |
20130231800 | Ricci | Sep 2013 | A1 |
20130232142 | Nielsen et al. | Sep 2013 | A1 |
20130238165 | Garrett et al. | Sep 2013 | A1 |
20130241720 | Ricci et al. | Sep 2013 | A1 |
20130245882 | Ricci | Sep 2013 | A1 |
20130250933 | Yousefi et al. | Sep 2013 | A1 |
20130253832 | Nallu et al. | Sep 2013 | A1 |
20130261871 | Hobbs et al. | Oct 2013 | A1 |
20130261966 | Wang et al. | Oct 2013 | A1 |
20130265178 | Tengler et al. | Oct 2013 | A1 |
20130274997 | Chien | Oct 2013 | A1 |
20130279111 | Lee | Oct 2013 | A1 |
20130279491 | Rubin et al. | Oct 2013 | A1 |
20130282238 | Ricci et al. | Oct 2013 | A1 |
20130282357 | Rubin et al. | Oct 2013 | A1 |
20130282946 | Ricci | Oct 2013 | A1 |
20130288606 | Kirsch | Oct 2013 | A1 |
20130293364 | Ricci et al. | Nov 2013 | A1 |
20130293452 | Ricci et al. | Nov 2013 | A1 |
20130293480 | Kritt et al. | Nov 2013 | A1 |
20130295901 | Abramson et al. | Nov 2013 | A1 |
20130295908 | Zeinstra et al. | Nov 2013 | A1 |
20130295913 | Matthews et al. | Nov 2013 | A1 |
20130300554 | Braden | Nov 2013 | A1 |
20130301584 | Addepalli et al. | Nov 2013 | A1 |
20130304371 | Kitatani et al. | Nov 2013 | A1 |
20130308265 | Arnouse | Nov 2013 | A1 |
20130309977 | Heines et al. | Nov 2013 | A1 |
20130311038 | Kim et al. | Nov 2013 | A1 |
20130325453 | Levien et al. | Dec 2013 | A1 |
20130325568 | Mangalvedkar et al. | Dec 2013 | A1 |
20130329372 | Wilkins | Dec 2013 | A1 |
20130329888 | Alrabady et al. | Dec 2013 | A1 |
20130332023 | Bertosa et al. | Dec 2013 | A1 |
20130338914 | Weiss | Dec 2013 | A1 |
20130339027 | Dokor et al. | Dec 2013 | A1 |
20130344856 | Silver et al. | Dec 2013 | A1 |
20130345929 | Bowden et al. | Dec 2013 | A1 |
20140028542 | Lovitt et al. | Jan 2014 | A1 |
20140032014 | DeBiasio et al. | Jan 2014 | A1 |
20140054957 | Bellis | Feb 2014 | A1 |
20140058672 | Wansley et al. | Feb 2014 | A1 |
20140066014 | Nicholson et al. | Mar 2014 | A1 |
20140067201 | Visintainer et al. | Mar 2014 | A1 |
20140067564 | Yuan | Mar 2014 | A1 |
20140070917 | Protopapas | Mar 2014 | A1 |
20140081544 | Fry | Mar 2014 | A1 |
20140088798 | Himmelstein | Mar 2014 | A1 |
20140096068 | Dewan et al. | Apr 2014 | A1 |
20140097955 | Lovitt et al. | Apr 2014 | A1 |
20140109075 | Hoffman et al. | Apr 2014 | A1 |
20140109080 | Ricci | Apr 2014 | A1 |
20140120829 | Bhamidipati et al. | May 2014 | A1 |
20140121862 | Zarrella et al. | May 2014 | A1 |
20140125802 | Beckert et al. | May 2014 | A1 |
20140129281 | Struzik | May 2014 | A1 |
20140143839 | Ricci | May 2014 | A1 |
20140164611 | Molettiere et al. | Jun 2014 | A1 |
20140168062 | Katz et al. | Jun 2014 | A1 |
20140168436 | Pedicino | Jun 2014 | A1 |
20140169621 | Burr | Jun 2014 | A1 |
20140171752 | Park et al. | Jun 2014 | A1 |
20140172727 | Abhyanker et al. | Jun 2014 | A1 |
20140188533 | Davidson | Jul 2014 | A1 |
20140195272 | Sadiq et al. | Jul 2014 | A1 |
20140198216 | Zhai et al. | Jul 2014 | A1 |
20140199961 | Mohammed et al. | Jul 2014 | A1 |
20140200737 | Lortz et al. | Jul 2014 | A1 |
20140207328 | Wolf et al. | Jul 2014 | A1 |
20140220966 | Muetzel et al. | Aug 2014 | A1 |
20140222298 | Gurin | Aug 2014 | A1 |
20140222623 | Napper | Aug 2014 | A1 |
20140223384 | Graumann | Aug 2014 | A1 |
20140240089 | Chang | Aug 2014 | A1 |
20140244078 | Downey et al. | Aug 2014 | A1 |
20140244111 | Gross et al. | Aug 2014 | A1 |
20140244156 | Magnusson et al. | Aug 2014 | A1 |
20140245277 | Petro et al. | Aug 2014 | A1 |
20140245278 | Zellen | Aug 2014 | A1 |
20140245284 | Alrabady et al. | Aug 2014 | A1 |
20140252091 | Morse et al. | Sep 2014 | A1 |
20140257627 | Hagan, Jr. | Sep 2014 | A1 |
20140267035 | Schalk et al. | Sep 2014 | A1 |
20140277936 | El Dokor et al. | Sep 2014 | A1 |
20140278070 | McGavran et al. | Sep 2014 | A1 |
20140278071 | San Filippo et al. | Sep 2014 | A1 |
20140281971 | Isbell, III et al. | Sep 2014 | A1 |
20140282161 | Cash | Sep 2014 | A1 |
20140282278 | Anderson et al. | Sep 2014 | A1 |
20140282470 | Buga et al. | Sep 2014 | A1 |
20140282931 | Protopapas | Sep 2014 | A1 |
20140292545 | Nemoto | Oct 2014 | A1 |
20140292665 | Lathrop et al. | Oct 2014 | A1 |
20140303899 | Fung | Oct 2014 | A1 |
20140306799 | Ricci | Oct 2014 | A1 |
20140306814 | Ricci | Oct 2014 | A1 |
20140306817 | Ricci | Oct 2014 | A1 |
20140306826 | Ricci | Oct 2014 | A1 |
20140306833 | Ricci | Oct 2014 | A1 |
20140306834 | Ricci | Oct 2014 | A1 |
20140306835 | Ricci | Oct 2014 | A1 |
20140307655 | Ricci | Oct 2014 | A1 |
20140307724 | Ricci | Oct 2014 | A1 |
20140308902 | Ricci | Oct 2014 | A1 |
20140309789 | Ricci | Oct 2014 | A1 |
20140309790 | Ricci | Oct 2014 | A1 |
20140309804 | Ricci | Oct 2014 | A1 |
20140309805 | Ricci | Oct 2014 | A1 |
20140309806 | Ricci | Oct 2014 | A1 |
20140309813 | Ricci | Oct 2014 | A1 |
20140309814 | Ricci et al. | Oct 2014 | A1 |
20140309815 | Ricci et al. | Oct 2014 | A1 |
20140309838 | Ricci | Oct 2014 | A1 |
20140309839 | Ricci et al. | Oct 2014 | A1 |
20140309842 | Jefferies et al. | Oct 2014 | A1 |
20140309847 | Ricci | Oct 2014 | A1 |
20140309849 | Ricci | Oct 2014 | A1 |
20140309852 | Ricci | Oct 2014 | A1 |
20140309853 | Ricci | Oct 2014 | A1 |
20140309862 | Ricci | Oct 2014 | A1 |
20140309863 | Ricci | Oct 2014 | A1 |
20140309864 | Ricci | Oct 2014 | A1 |
20140309865 | Ricci | Oct 2014 | A1 |
20140309866 | Ricci | Oct 2014 | A1 |
20140309867 | Ricci | Oct 2014 | A1 |
20140309868 | Ricci | Oct 2014 | A1 |
20140309869 | Ricci | Oct 2014 | A1 |
20140309870 | Ricci et al. | Oct 2014 | A1 |
20140309871 | Ricci | Oct 2014 | A1 |
20140309872 | Ricci | Oct 2014 | A1 |
20140309873 | Ricci | Oct 2014 | A1 |
20140309874 | Ricci | Oct 2014 | A1 |
20140309875 | Ricci | Oct 2014 | A1 |
20140309876 | Ricci | Oct 2014 | A1 |
20140309877 | Ricci | Oct 2014 | A1 |
20140309878 | Ricci | Oct 2014 | A1 |
20140309879 | Ricci | Oct 2014 | A1 |
20140309880 | Ricci | Oct 2014 | A1 |
20140309885 | Ricci | Oct 2014 | A1 |
20140309886 | Ricci | Oct 2014 | A1 |
20140309891 | Ricci | Oct 2014 | A1 |
20140309892 | Ricci | Oct 2014 | A1 |
20140309893 | Ricci | Oct 2014 | A1 |
20140309913 | Ricci et al. | Oct 2014 | A1 |
20140309919 | Ricci | Oct 2014 | A1 |
20140309920 | Ricci | Oct 2014 | A1 |
20140309921 | Ricci et al. | Oct 2014 | A1 |
20140309922 | Ricci | Oct 2014 | A1 |
20140309923 | Ricci | Oct 2014 | A1 |
20140309927 | Ricci | Oct 2014 | A1 |
20140309929 | Ricci | Oct 2014 | A1 |
20140309930 | Ricci | Oct 2014 | A1 |
20140309934 | Ricci | Oct 2014 | A1 |
20140309935 | Ricci | Oct 2014 | A1 |
20140309982 | Ricci | Oct 2014 | A1 |
20140310031 | Ricci | Oct 2014 | A1 |
20140310075 | Ricci | Oct 2014 | A1 |
20140310103 | Ricci | Oct 2014 | A1 |
20140310186 | Ricci | Oct 2014 | A1 |
20140310277 | Ricci | Oct 2014 | A1 |
20140310379 | Ricci et al. | Oct 2014 | A1 |
20140310594 | Ricci et al. | Oct 2014 | A1 |
20140310610 | Ricci | Oct 2014 | A1 |
20140310702 | Ricci et al. | Oct 2014 | A1 |
20140310739 | Ricci et al. | Oct 2014 | A1 |
20140310788 | Ricci | Oct 2014 | A1 |
20140322676 | Raman | Oct 2014 | A1 |
20140324692 | Yarbrough et al. | Oct 2014 | A1 |
20140347207 | Zeng et al. | Nov 2014 | A1 |
20140347265 | Allen et al. | Nov 2014 | A1 |
20140380442 | Addepalli et al. | Dec 2014 | A1 |
20150007155 | Hoffman et al. | Jan 2015 | A1 |
20150012186 | Horseman | Jan 2015 | A1 |
20150032366 | Man et al. | Jan 2015 | A1 |
20150032670 | Brazell | Jan 2015 | A1 |
20150057839 | Chang et al. | Feb 2015 | A1 |
20150061895 | Ricci | Mar 2015 | A1 |
20150081133 | Schulz | Mar 2015 | A1 |
20150081167 | Pisz et al. | Mar 2015 | A1 |
20150088423 | Tuukkanen | Mar 2015 | A1 |
20150088515 | Beaumont et al. | Mar 2015 | A1 |
20150095190 | Hammad et al. | Apr 2015 | A1 |
20150116200 | Kurosawa et al. | Apr 2015 | A1 |
20150127493 | Winkelman et al. | May 2015 | A1 |
20150146605 | Rubin et al. | May 2015 | A1 |
20150158499 | Koravadi | Jun 2015 | A1 |
20150161578 | Ahmed et al. | Jun 2015 | A1 |
20150178034 | Penilla et al. | Jun 2015 | A1 |
20150199685 | Betancourt et al. | Jul 2015 | A1 |
20150220916 | Prakash et al. | Aug 2015 | A1 |
20150295920 | Van Kerrebroeck et al. | Oct 2015 | A1 |
20150363986 | Hoyos | Dec 2015 | A1 |
20150381633 | Grim et al. | Dec 2015 | A1 |
20160008985 | Kim et al. | Jan 2016 | A1 |
20160028737 | Srinivasan | Jan 2016 | A1 |
20160070527 | Ricci | Mar 2016 | A1 |
20160086391 | Ricci | Mar 2016 | A1 |
20160117725 | Capel et al. | Apr 2016 | A1 |
20160180604 | Wilson et al. | Jun 2016 | A1 |
20160252359 | Ikavalko et al. | Sep 2016 | A1 |
20160269456 | Ricci | Sep 2016 | A1 |
20160269469 | Ricci | Sep 2016 | A1 |
20170127230 | Enriquez | May 2017 | A1 |
20170136880 | Ricci | May 2017 | A1 |
20170136887 | Ricci | May 2017 | A1 |
20170136902 | Ricci | May 2017 | A1 |
20170136903 | Ricci | May 2017 | A1 |
20170136904 | Ricci | May 2017 | A1 |
20170136905 | Ricci | May 2017 | A1 |
20170136907 | Ricci | May 2017 | A1 |
20170136910 | Ricci | May 2017 | A1 |
20170357980 | Bakun | Dec 2017 | A1 |
20180174139 | Arora | Jun 2018 | A1 |
Number | Date | Country |
---|---|---|
1417755 | May 2003 | CN |
1847817 | Oct 2006 | CN |
101303878 | Nov 2008 | CN |
102467827 | May 2012 | CN |
1223567 | Jul 2002 | EP |
1484729 | Dec 2004 | EP |
2192015 | Jun 2010 | EP |
2004-284450 | Oct 2004 | JP |
2006-0128484 | Dec 2006 | KR |
WO 2007126204 | Nov 2007 | WO |
WO 2012102879 | Aug 2012 | WO |
WO 2013074866 | May 2013 | WO |
WO 2013074867 | May 2013 | WO |
WO 2013074868 | May 2013 | WO |
WO 2013074897 | May 2013 | WO |
WO 2013074899 | May 2013 | WO |
WO 2013074901 | May 2013 | WO |
WO 2013074919 | May 2013 | WO |
WO 2013074981 | May 2013 | WO |
WO 2013074983 | May 2013 | WO |
WO 2013075005 | May 2013 | WO |
WO 2013181310 | Dec 2013 | WO |
WO 2014014862 | Jan 2014 | WO |
WO 2014143563 | Sep 2014 | WO |
WO 2014158667 | Oct 2014 | WO |
WO 2014158672 | Oct 2014 | WO |
WO 2014158766 | Oct 2014 | WO |
WO 2014172312 | Oct 2014 | WO |
WO 2014172313 | Oct 2014 | WO |
WO 2014172316 | Oct 2014 | WO |
WO 2014172320 | Oct 2014 | WO |
WO 2014172322 | Oct 2014 | WO |
WO 2014172323 | Oct 2014 | WO |
WO 2014172327 | Oct 2014 | WO |
WO 2016145073 | Sep 2016 | WO |
WO 2016145100 | Sep 2016 | WO |
Entry |
---|
U.S. Appl. No. 61/567,962, filed Dec. 7, 2011, Baarman et al. |
“Nexus 10 Guidebook for Android,” Google Inc., © 2012, Edition 1.2, 166 pages. |
“Self-Driving: Self-Driving Autonomous Cars,” available at http://www.automotivetechnologies.com/autonomous-self-driving-cars, accessed Dec. 2016, 9 pages. |
Amor-Segan et al., “Towards the Self Healing Vehicle,” Automotive Electronics, Jun. 2007, 2007 3rd Institution of Engineering and Technology Conference, 7 pages. |
Bennett, “Meet Samsung's Version of Apple AirPlay,” CNET.com, Oct. 10, 2012, 11 pages. |
Cairnie et al., “Using Finger-Pointing to Operate Secondary Controls in Automobiles,” Proceedings of the IEEE Intelligent Vehicles Symposium 2000, Oct. 3-5, 2000, 6 pages. |
Clark, “How Self-Driving Cars Work: The Nuts and Bolts Behind Google's Autonomous Car Program,” Feb. 21, 2015, available at http://www.makeuseof.com/tag/how-self-driving-cars-work-the-nuts-and-bolts-behind-googles-autonomous-car-program/, 9 pages. |
Deaton et al., “How Driverless Cars Will Work,” Jul. 1, 2008, HowStuffWorks.com. <http://auto.howstuffworks.com/under-the-hood/trends-innovations/driverless-car.htm> Sep. 18, 2017, 10 pages. |
Dumbaugh, “Safe Streets, Livable Streets: A Positive Approach to urban Roadside Design,” Ph.D. dissertation for School of Civil & Environ. Engr., Georgia Inst. of Technology, Dec. 2005, 235 pages. |
Fei et al., “A QoS-aware Dynamic Bandwidth Allocation Algorithm for Relay Stations in IEEE 802.16j-based Vehicular Networks,” Proceedings of the 2010 IEEE Global Telecommunications Conference, Dec. 10, 2010, 10 pages. |
Ge et al., “Optimal Relay Selection in IEEE 802.16j Multihop Relay Vehicular Networks,” IEEE Transactions on Vehicular Technology, 2010, vol. 59(5), pp. 2198-2206. |
Guizzo, Erico, “How Google's Self-Driving Car Works,” Oct. 18, 2011, available at https://spectrum.ieee.org/automaton/robotics/artificial-intelligence/how-google-self-driving-car-works, 5 pages. |
Heer et al., “ALPHA: An Adaptive and Lightweight Protocol for Hop-by-hop Authentication,” Proceedings of CoNEXT 2008, Dec. 2008, pp. 1-12. |
Jahnich et al., “Towards a Middleware Approach for a Self-Configurable Automotive Embedded System,” International Federation for Information Processing, 2008, pp. 55-65. |
Persson, “Adaptive Middleware for Self-Configurable Embedded Real-Time Systems,” KTH Industrial Engineering and Management, 2009, pp. iii-71 and references. |
Raychaudhuri et al., “Emerging Wireless Technologies and the Future Mobile Internet,” p. 48, Cambridge Press, 2011, 3 pages. |
Stephens, Leah, “How Driverless Cars Work,” Interesting Engineering, Apr. 28, 2016, available at https://interestingengineering.com/driverless-cars-work/, 7 pages. |
Stoller, “Leader Election in Distributed Systems with Crash Failures,” Indiana University, 1997, pp. 1-15. |
Strunk et al., “The Elements of Style,” 3d ed., Macmillan Publishing Co., 1979, 3 pages. |
Suwatthikul, “Fault detection and diagnosis for in-vehicle networks,” Intech, 2010, pp. 283-286 [retrieved from: www.intechopen.com/books/fault-detection-and-diagnosis-for-in-vehicle-networks]. |
Walter et al., “The smart car seat: personalized monitoring of vital signs in automotive applications.” Personal and Ubiquitous Computing, Oct. 2011, vol. 15, No. 7, pp. 707-715. |
Wolf et al., “Design, Implementation, and Evaluation of a Vehicular Hardware Security Module,” ICISC'11 Proceedings of the 14th Int'l Conf. Information Security & Cryptology, Springer-Verlag Berlin, Heidelberg, 2011, pp. 302-318. |
International Search Report and Written Opinion for PCT Application No. PCT/US2017/041061, dated Sep. 14, 2017, 8 pages. |
Official Action for U.S. Appl. No. 15/396,616, dated Aug. 24, 2017, 8 pages. |
Notice of Allowance for U.S. Appl. No. 15/396,616, dated Jan. 24, 2018, 9 pages. |
Official Action for U.S. Appl. No. 15/396,592, dated Aug. 16, 2018, 12 pages. |
Official Action for U.S. Appl. No. 15/396,601, dated May 31, 2018, 34 pages. |
Official Action for U.S. Appl. No. 15/396,604, dated May 31, 2018, 65 pages. |
Official Action for U.S. Appl. No. 15/396,613, dated Aug. 27, 2018, 14 pages. |
U.S. Appl. No. 15/396,591, filed Dec. 31, 2016. |
U.S. Appl. No. 15/396,592, filed Dec. 31, 2016. |
U.S. Appl. No. 15/396,595, filed Dec. 31, 2016. |
U.S. Appl. No. 15/396,597, filed Dec. 31, 2016. |
U.S. Appl. No. 15/396,601, filed Dec. 31, 2016. |
U.S. Appl. No. 15/396,604, filed Dec. 31, 2016. |
U.S. Appl. No. 15/396,607, filed Dec. 31, 2016. |
U.S. Appl. No. 15/396,613, filed Dec. 31, 2016. |
Liu et al., “A Survey of Payment Card Industry Data Security Standard,” IEEE Communications Surveys & Tutorials, vol. 12(3), 2010, pp. 287-303. |
Notice of Allowance for U.S. Appl. No. 15/396,620, dated Mar. 12, 2018, 10 pages. |
Corrected Notice of Allowance for U.S. Appl. No. 15/396,620, dated Mar. 26, 2018, 7 pages. |
Official Action for U.S. Appl. No. 15/396,591, dated Nov. 5, 2018, 9 pages. |
Notice of Allowance for U.S. Appl. No. 15/396,592, dated Jan. 17, 2019, 8 pages. |
Final Action for U.S. Appl. No. 15/396,601, dated Jan. 14, 2019, 36 pages. |
Final Action for U.S. Appl. No. 15/396,604, dated Nov. 29, 2018, 64 pages. |
U.S. Appl. No. 15/396,591, filed Dec. 31, 2016, Ricci. |
U.S. Appl. No. 15/396,592, filed Dec. 31, 2016, Ricci. |
U.S. Appl. No. 15/396,595, filed Dec. 31, 2016, Ricci. |
U.S. Appl. No. 15/396,597, filed Dec. 31, 2016, Ricci. |
U.S. Appl. No. 15/396,601, filed Dec. 31, 2016, Ricci. |
U.S. Appl. No. 15/396,604, filed Dec. 31, 2016, Ricci. |
U.S. Appl. No. 15/396,607, filed Dec. 31, 2016, Ricci. |
U.S. Appl. No. 15/396,613, filed Dec. 31, 2016, Ricci. |
Profis, “Everything you need to know about NFC and mobile payments,” CNET, 2014, retrieved from https://www.cnet.com/how-to/how-nfc-works-and-mobile-payments/, 8 pages. |
Strickland, “How Near Field Communication Works,” Near Field Communication, retrieved from http://electronics.howstuffworks.com/near-field-communication.htm, retrieved on Sep. 27, 2016, 11 pages. |
“How NFC Works,” Near Field Communication, retrieved from http://nearfieldcommunication.org/how-it-works.html, retrieved on Sep. 27, 2016, 1 page. |
“Near Field Communication Technology Standards,” Near Field Communication, retrieved from http://nearfieldcommunication.org/technology.html, retrieved on Sep. 27, 2016, 1 page. |
“NFC SD and SIM Cards,” Near Field Communication, retrieved from http://nearfieldcommunication.org/sd-sim-cards.html, retrieved on Sep. 27, 2016, 1 page. |
“Near Field Communication versus Bluetooth,” Near Field Communication, retrieved from http://nearfieldcommunication.org/bluetooth.html, retrieved on Sep. 27, 2016, 1 page. |
“Near Field versus Far Field,” Near Field Communication, retrieved from http://nearfieldcommunication.org/far-field.html, retrieved on Sep. 27, 2016, 1 page. |
Notice of Allowance for U.S. Appl. No. 15/396,591, dated Feb. 28, 2019, 5 pages. |
Official Action for U.S. Appl. No. 15/396,595, dated Apr. 18, 2019, 18 pages. |
Notice of Allowance for U.S. Appl. No. 15/396,604, dated Apr. 3, 2019, 20 pages. |
Official Action for U.S. Appl. No. 15/396,610, dated Feb. 20, 2019, 6 pages, Restriction Requirement. |
Corrected Notice of Allowability for U.S. Appl. No. 15/396,592, dated Feb. 25, 2019, 2 pages. |
Notice of Allowance for U.S. Appl. No. 15/396,591, dated May 1, 2019, 5 pages. |
Official Action for U.S. Appl. No. 15/396,597, dated May 2, 2019, 14 pages. |
Official Action for U.S. Appl. No. 15/396,601, dated Jun. 26, 2019, 39 pages. |
Official Action for U.S. Appl. No. 15/396,607, dated Jul. 2, 2019, 14 pages. |
Official Action for U.S. Appl. No. 15/396,613, dated Jun. 14, 2019, 18 pages. |
Official Action for U.S. Appl. No. 15/396,595, dated Oct. 31, 2019, 6 pages. |
Notice of Allowance for U.S. Appl. No. 15/396,595, dated Dec. 18, 2019, 8 pages. |
Notice of Allowance for U.S. Appl. No. 15/396,597, dated Jan. 6, 2020, 9 pages. |
Final Action for U.S. Appl. No. 15/396,601, dated Dec. 23, 2019, 43 pages. |
Final Action for U.S. Appl. No. 15/396,607, dated Oct. 22, 2019, 16 pages. |
Official Action for U.S. Appl. No. 15/396,613, dated Jan. 22, 2020, 18 pages. |
Official Action for U.S. Appl. No. 15/396,607, dated Mar. 31, 2020, 15 pages. |
Number | Date | Country | |
---|---|---|---|
20180012279 A1 | Jan 2018 | US |
Number | Date | Country | |
---|---|---|---|
62378348 | Aug 2016 | US | |
62359563 | Jul 2016 | US |