The present invention relates to systems and methods for vehicle security, methods for finding parking locations and identifying security grades of parking locations, as well as notifications to users regarding security alerts, and exchange of information with cloud-based processing systems to enable vehicle security features.
Vehicles, such as motorized vehicles and electric vehicles, have been around for some time. Vehicles provide a means that enables humans to drive from place to place. In today's world, vehicles have become an indispensable mode of transportation, and provide the freedom to travel at any time of day and for various distances. Vehicles can be publicly operated or can be privately owned. Humans most commonly operate vehicles, no matter the type, whether electric or combustion engine based. In recent years, technology has been advancing to allow for better wireless interfacing and networking with vehicles. Such wireless interfacing has included, for example, integrated alarms and remote lock and unlock by way of key fobs. These features, although representing advancements in security, still do not provide vehicle drivers with the level of security needed in today's society.
It is in this context that embodiments of the invention arise.
Methods and systems are disclosed for providing access to safety ratings of parking locations and access alerts associated with vehicles, including processing of notifications to user accounts associated with monitored vehicles. One example method includes receiving at a server, over time, safety alerts from a plurality of vehicles. Each safety alert is associated with a geographic location. The method associates, by the server, one or more safety alerts to parking locations corresponding to geographic locations from where the safety alerts were received. The server then generates a safety grade for one or more of the parking locations, and the safety grade is based on a number of safety alerts associated to the parking location and a safety type of the safety alerts. The method includes receiving a request at the server, from a computing device, to access the safety grade for a parking location proximate to a current geo-location of the computing device or a destination location, and sending data to a user interface of the computing device. The data includes identification of one or more parking locations proximate to the current geo-location of the computing device or the destination location and associated safety grades.
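By way of non-limiting illustration only, the grading operation described above may be sketched as follows in Python. The alert-type weights, score thresholds and letter grades are assumed example values chosen for clarity and are not required by any embodiment; an implementation may combine the number and safety type of the alerts in any suitable manner.

from collections import defaultdict

# Assumed example weights per safety-alert type.
ALERT_WEIGHTS = {"touch": 1, "motion": 2, "collision": 4, "break_in": 5}

def safety_grade(alert_types):
    """Grade one parking location from the types of safety alerts received there."""
    score = sum(ALERT_WEIGHTS.get(alert_type, 1) for alert_type in alert_types)
    if score < 3:
        return "A"   # safe
    if score < 10:
        return "B"   # relatively safe
    return "F"       # not safe

def grades_by_location(alert_stream):
    """Group incoming (parking_location, alert_type) pairs and grade each location."""
    by_location = defaultdict(list)
    for parking_location, alert_type in alert_stream:
        by_location[parking_location].append(alert_type)
    return {loc: safety_grade(types) for loc, types in by_location.items()}

A server could run such a grouping over the accumulated alert history and return the resulting grades in response to requests from computing devices.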
In one embodiment, methods for managing personal security of a user of a vehicle are provided. The vehicle has a plurality of active sensors. The active sensors include one or more of a first type of sensors that collect data and one or more of a second type of sensors that produce actions. The method includes receiving a first remote request at the vehicle to activate a first level of security. The first remote request is an override signal to actively inform said vehicle to set said first level of security for approaching said vehicle by said user. The method includes recording an area proximate to the vehicle using one of said first type of sensors and illuminating said area proximate to the vehicle using one of said second type of sensors in response to activating the first level of security, the recording producing a media file. The method includes transmitting the media file to Internet storage associated with an account of the user of the vehicle. The first remote request causes the recording and the transmitting.
In one embodiment, the vehicle includes a computer and a communications system. The communications system is configured to provide the computer of the vehicle with wireless communication with a cloud services system that includes a database that stores a user account that identifies the vehicles as registered with the cloud services system. The user account identifies settings for the vehicle and information for notifying a user of the vehicle. The vehicle includes a plurality of sensors associated to sides of the vehicle, such that contact with a specific side of the vehicle is identified using a sensor of the plurality of sensors. The vehicle further includes a plurality of cameras integrated in the vehicle to enable capturing of image data of an area around the vehicle. The computer is configured to receive data from the plurality of sensors of the vehicle to detect a contact with the vehicle and identify a side of the vehicle from which the contact was detected. The computer is configured to send data to the cloud services system which will then send the notification indicating that contact was detected with the vehicle. The cloud services system enables access to additional data and controls, including live feeds of the area around the vehicle.
In another embodiment, systems are disclosed for a vehicle and for associated methods for handling contact detection of the vehicle. The vehicle includes a computer and a communications system. The communications system is configured to provide the computer of the vehicle with wireless communication with a cloud services system that includes a database that stores a user account that identifies the vehicles as registered with the cloud services system. The user account identifies settings for the vehicle and information for notifying a user of the vehicle. The vehicle includes a plurality of sensors associated to sides of the vehicle, such that contact with a specific side of the vehicle is identified using a sensor of the plurality of sensors. The vehicle further includes a plurality of cameras integrated in the vehicle to enable capturing of image data of an area around the vehicle. The computer is configured to receive data from the plurality of sensors of the vehicle to detect a contact with the vehicle and identify a side of the vehicle from which the contact was detected. The computer is configured to send a notification to the user of the vehicle indicating that contact was detected with the vehicle. The notification providing a link to an application that interfaces with the cloud services system to enable access to additional data and controls, including live feeds of the area around the vehicle.
In some embodiments, the live feed is a video feed of the area around the vehicle, and the controls enable focus of the video feed to specific locations of the area around the vehicle. The contact is one of a touch of the vehicle, or a motion of the vehicle, or a collision with the vehicle, the additional data includes identification of the side of the vehicle associated with the contact that was detected. The notification is sent by a server to an account viewable via a mobile device or computer of the user. The application provides an option for viewing the live feed, or contacting a security agent, or both.
In some embodiments, one or more of a motion sensor to detect motion proximate to the vehicle, or a heat detector to detect heat proximate to the vehicle, or both, are provided.
In some embodiments, at least one of the plurality of cameras is configured to record an image or a video clip in the area around the vehicle for the contact that was detected, and the image or video clip is accessible via the application for viewing.
In some embodiments, the cloud services system is configured to save a history of events associated with one or more detected contacts with the vehicle, and the application or an internet connected device being provided with access to the history to view the events. The cloud services system includes one or more servers.
In some embodiments, a plurality of microphones are provided to capture sounds in the area around the vehicle, or an infrared detector for detecting data in the area around the vehicle, or both the plurality of microphones and the infrared detector.
In some embodiments, the wireless communication is configured to process one of radio communication, or Wi-Fi™ communication, or Bluetooth™ communication, or a near field communication (NFC), or cellular communication, or satellite communication, or peer-to-peer communication, or a combination of two or more thereof, and the wireless communication enables connection to the Internet for transacting with the cloud services system. In some embodiments, the live feed is a video feed of the area around the vehicle that provides a generated bird's eye view of the area around the vehicle, the controls further enable one or more of panning, rotating, tilting and patrolling 360-degrees around the vehicle.
In some embodiments, one or more of the plurality of sensors are selected from microphones, or motion detection sensors, or heat detection sensors, or infrared (IR) detector sensors, or sound detection light activated sensors, or recording system sensors, or communications system sensors, or gyroscope sensor, or combinations thereof.
In some embodiments, the controls provided via the application further include access to view a number of alerts, or a history of alerts, or past recordings, or past or current incidents, or incidents over a period of time, or video clips, or images, or still images, or a collection of still images, or audio files, or audio snippets, or alarms, or details regarding alarms, or details regarding the contact detected or past contacts detected, or management settings and remote controls of the vehicle, or combinations of two or more thereof.
In some embodiments, the computer of the vehicle uses data from the plurality of sensors and the plurality of cameras to determine when the contact that was detected qualifies as an event for which the notification should be sent to the user, wherein when the contact that was detected is not of a level that qualifies for the notification, the notification is not sent to the user.
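A minimal sketch of such a qualification check is shown below; the threshold values, the sensor readings used, and the notification channel are assumptions made for illustration, not a defined interface of the vehicle computer.

# Assumed thresholds for deciding whether a detected contact is reportable.
IMPACT_THRESHOLD_G = 0.5        # accelerometer reading, in g
PERSON_CONFIDENCE_MIN = 0.6     # camera-based person-detection confidence

def contact_qualifies(impact_g, person_confidence):
    """Return True when the detected contact rises to a reportable event."""
    light_brush = impact_g < IMPACT_THRESHOLD_G and person_confidence < PERSON_CONFIDENCE_MIN
    return not light_brush

def maybe_notify(user_id, side, impact_g, person_confidence, send_notification):
    """Send the notification only for qualifying contacts; otherwise stay silent."""
    if contact_qualifies(impact_g, person_confidence):
        send_notification(user_id, "Contact detected on the " + side + " side of your vehicle")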
In some embodiments, a speaker and a microphone are provided on the vehicle, such that controls provided via the application use the speaker and the microphone of the vehicle to enable the user to speak with a person standing proximate to the vehicle, and at least one of the cameras enables focus onto the person standing proximate to the vehicle.
In one embodiment, a method for managing personal security of a user of a vehicle is provided. The vehicle has a plurality of active sensors. The method includes receiving a first remote request at the vehicle to activate a first level of security, the first level of security activating a first active sensor. The method includes recording an area proximate to the vehicle in response to activating the first level of security, the recording producing a media file. The method includes transmitting the media file to Internet storage associated with an account of the user of the vehicle.
In some embodiments, the method further includes, receiving a second remote request at the vehicle to activate a second level of security, the second level of security activating a second active sensor, the second active sensor is more alarm intensive than the first active sensor.
In some embodiments, the second level of security triggers a notification to predefined authorities, the notification including at least part of the media file.
In some embodiments, the active sensors include, one or more of a first type of sensors that collect data; and one or more of a second type of sensors that produce actions, wherein the actions including turning on lights, controlling vehicle components, controlling recording of audio or video, flashing lights, providing audible responses, sounding alarms, sounding voice messages, or combinations thereof.
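By way of illustration only, the mapping of security levels to sensor actions described in the preceding embodiments may be sketched as follows; the level numbers, the action names, and the callable-based interface are assumptions for this sketch rather than a disclosed implementation.

# Assumed example mapping of security levels to actions of the second type of sensors.
SECURITY_LEVELS = {
    1: ["record_video", "turn_on_lights"],
    2: ["record_video", "turn_on_lights", "flash_lights", "sound_alarm",
        "voice_warning", "notify_authorities"],
}

def activate_level(level, actions):
    """Invoke the callable registered for each action at the requested level.

    `actions` maps an action name to a no-argument callable supplied by the
    vehicle electronics (the names here are placeholders, not a defined API).
    """
    for name in SECURITY_LEVELS.get(level, []):
        actions[name]()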
In some embodiments, the method further includes,
transmitting a notification to a predefined destination regarding the first remote request, wherein the transmitting of the notification is saved to history in the account of the user at the internet storage, such that access to the Internet storage is provided to the user via the account to view or share data regarding the notification.
In some embodiments, the method further includes, sending a notification to the user upon activating the first level of security, the notification including data indicative of a danger condition, the danger condition determined based on monitored conditions proximate to the vehicle and rules that define which ones of the monitored conditions should be considered danger conditions.
In some embodiments, the notification is sent to a mobile device of the user that identifies the danger condition, a key fob for the vehicle, a predefined security agent, police, or a combination thereof.
In some embodiments, the media file is one of video, audio, image, sounds and images, video clips, infrared images, sound waves, impact data, or combinations thereof that can be stored in digital form.
In another embodiment, a method of managing security of a vehicle having electronics for managing the vehicle and for communicating wirelessly with the Internet is provided. The method includes receiving one or more requests at the vehicle from a wireless device handled by a user of the vehicle, wherein each one of successive requests triggers a heightened level of security. The method includes initiating video recording proximate to the vehicle upon receiving one of the requests. The method includes
transmitting, over the Internet, a notification to a predetermined recipient concerning the one or more requests received at the vehicle, the notification including at least part of the video recording. The notification and the at least part of the video recording are saved at a remote server, connected to the Internet, associated with an account of the user for the vehicle.
In some embodiments, the account of the user for the vehicle is accessed at a website or application and the account identifies a history of events that include saved notifications and access to the video recordings associated with the notifications.
In some embodiments, the wireless device is a key fob having at least one button, wherein at least one of the buttons is selected one or more times to cause the one or more requests that are received by the vehicle.
In some embodiments, the wireless device is a portable computing device having one or more selection options, wherein at least one of the selection options is selected one or more times to cause the one or more requests that are received by the vehicle.
In some embodiments, the notification is correlated to a geographic location (i.e., geo-location) of the vehicle.
In some embodiments, the notification for the vehicle is added to a notification history database for the geographic location, the notification history database including notifications from a plurality of vehicles that generate notifications over time, the method further comprising generating safety grades, at a server connected to the Internet, and assigning the safety grades to a plurality of geographic locations based on past occurrences of notifications at or near the plurality of geographic locations.
In some embodiments, the method further includes, presenting the safety grades, upon request, to devices requesting safety history for a geographic location.
In another embodiment, a method for managing security of a vehicle having electronics for managing the vehicle and for communicating wirelessly with the Internet is disclosed. The method includes receiving one or more requests at the vehicle from a wireless device handled by a user of the vehicle. The method includes illuminating an area proximate to the vehicle with lighting upon receiving a first request, the illuminating being color coded to be indicative of a safety level associated with the vehicle based on events that were detected at the vehicle or proximate to the vehicle during a time before receiving the one or more requests at the vehicle, the one or more requests include a request to open or access the vehicle, the method being executed by a processor.
In some embodiments, the method further includes, initiating video recording proximate to the vehicle upon receiving one of the requests; transmitting, over the Internet, a notification to a predetermined recipient concerning the one or more requests received at the vehicle, the notification including at least part of the video recording; and wherein the notification and the at least part of the video recording are saved at a remote server, connected to the Internet, associated with an account of the user for the vehicle.
In some embodiments, the method further includes, providing a changed lighting after receiving a second request, the changed lighting is intensified lighting or different color lighting.
In some embodiments, a second request triggers elevated audible outputs from the vehicle, the audible outputs including sounds and/or voice language warnings.
In some embodiments, the illuminated area proximate to the vehicle is viewable from a distance to indicate a safety level, wherein a different color defines a different predefined safety level.
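The color-coded illumination of the above embodiments might be sketched as follows; the particular colors, the event-count thresholds, and the use of recent event counts as the safety signal are assumed example choices, not limitations.

# Assumed mapping of predefined safety levels to perimeter-lighting colors.
SAFETY_COLORS = {
    "safe": (0, 255, 0),       # green glow
    "caution": (255, 191, 0),  # amber glow
    "danger": (255, 0, 0),     # red glow
}

def illumination_color(recent_event_count):
    """Pick a perimeter-lighting color from the number of recent events near the vehicle."""
    if recent_event_count == 0:
        return SAFETY_COLORS["safe"]
    if recent_event_count < 3:   # threshold is an assumed example value
        return SAFETY_COLORS["caution"]
    return SAFETY_COLORS["danger"]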
The present invention relates to systems and methods for vehicle security, methods for finding parking locations and identifying security grades of parking locations, as well as notifications to users regarding security alerts, and exchange of information with cloud-based processing systems to enable vehicle security features.
A number of embodiments are described below, with reference to specific implementations that refer to vehicles, but such implementations should be broadly construed to include any type of vehicle, structure or object. Without limitation, vehicles can include any type of moving object that can be steered, and can include vehicles that are for human occupancy or not. Vehicles can include those that are privately owned, owned by corporations, or commercially operated, such as buses, automobiles, trucks, cars, trains, trolleys, etc. Example vehicles can include those that are combustion engine based, electric engine (EV) based, hybrids, or other types of energy source vehicles.
The embodiments of the present invention relate to vehicle security and/or vehicle personal security, and methods for managing remote control of security functions and remote access and control from a remote location over the Internet. At the remote location, a user is able to access a user interface for an application, which provides users access to their user accounts. A user account can be for a user and the user can add one or more vehicles, objects, data or appliances for remote reporting, viewing and control. In one embodiment, a user is an owner or user of a vehicle. The user can register the vehicle with a remote service.
The remote service can be accessed over the Internet, such as via a website or application of a portable device. The remote service can provide a multitude of cloud services for the user, such as remote control features, remote viewing services, remote alarm controls, remote camera activation, remote audio/video recording of the vehicle (i.e., areas around the vehicle and inside the vehicle). In one embodiment, the vehicle is able to connect to the Internet (e.g., when the vehicle engine is off, on, and/or is occupied or un-occupied) to allow a user, via a remote cloud service, to access features of the vehicle. The vehicle can be accessed when running, when parked, when stopped, when moving, etc. The vehicle and its audio recording devices and video cameras can be accessed from remote locations, to allow users to remotely communicate with the vehicle or with people riding or residing inside the vehicle.
The remote communication can also allow a person to communicate remotely with people standing outside (or inside) of a vehicle. For instance, if a user is accessing his or her vehicle from a remote location, cameras installed in and/or on the vehicle allow the remote user to see a person standing proximate to the vehicle. The remote user can then communicate with a person standing proximate to the vehicle using microphones and speakers of the vehicle.
In one embodiment, the user or owner of the vehicle can get a notification from the vehicle if the vehicle detects a person standing proximate to the vehicle or taking some action that is not consistent with passive standing. For instance, if a person is trying to gain unauthorized entry into the vehicle, the vehicle can send a notification to the owner and the owner can view the activity remotely and can also speak to the person attempting to break in or who has broken in to the vehicle. The same notification can be sent to authorities, such as the police. This notification can contain audio, video, location, date stamp and other metrics associated with the incident. This notification can be sent automatically or initiated remotely by a vehicle's user/owner to alert authorities after the user/owner has determined that the activity is in fact a crime and not a false alarm.
In the following examples, additional security features are also described which are relevant and pertain to preventative security tools. The security tools can include, for instance, allowing a user to select a button or input, which remotely lights the vehicle up before or as the user walks to the vehicle. This feature is useful, for instance, when a user needs to get to his or her car in a parking garage or location that may not be safe. For instance, if a user needs to collect his or her car from a parking spot at night, the user may not feel very comfortable going to his or her car, for fear of being carjacked.
The user, in one embodiment, can select a button that lights up his or her car from a remote location. In one embodiment, the button can be thought of as a personal security escort service, which provides the driver approaching the vehicle with added security.
In one embodiment, the button can also act to begin recording of activity proximate to the vehicle. The recording can be saved to storage of the vehicle and can also be streamed to the user's cloud services account. Each time the user selects to use the security feature, recording begins and a history log can be saved. If an incident were to occur, the user can click or push a second button remotely that causes immediate notification to authorities or police. In this embodiment, the area around the vehicle is also recorded with cameras, microphones, motion sensors, etc.
Other sensors can be integrated into the vehicle, such as motion sensors, heat sensors, multiple cameras, etc. These sensors can provide triggers to code executed on vehicle hardware and/or cloud hardware, to initiate one or more alarms or take proactive security actions and/or send notifications to users of the vehicles and/or authorities.
If the user arrives at the vehicle and no incident occurs, the user can elect whether to keep or discard the recorded data. In one embodiment, each time a recording occurs, the history data can also save information regarding the time of alerts, location coordinates of the events (GPS data), etc. This data can also be saved and processed by cloud processing programs to collect data over time of vehicles having this communication/security feature.
Heat maps of prior activity can also be defined for later use. In one embodiment, the data used to construct the heat map of prior activity may be crowd-sourced in that data from events from all prior individual users is amalgamated into trends and probability of incident. In this manner, if a user wishes to see the historical data of a parking area or location, the user can see or determine how safe the location might be. Using this data, the user can determine when the safest time of day or the safest day of the week is to park at a certain location.
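A possible aggregation for such crowd-sourced heat maps is sketched below; the per-hour bucketing, the location-bucket scheme, and the data format are assumptions for this sketch, and an implementation may aggregate by day of week, parking slot, or any other granularity.

from collections import defaultdict

def build_heat_map(events):
    """Aggregate crowd-sourced incidents into per-location, per-hour counts.

    `events` is an iterable of (location_bucket, timestamp) pairs, where
    `location_bucket` is any hashable bucket (e.g., rounded GPS coordinates)
    and `timestamp` is a datetime; the bucketing scheme is an assumption.
    """
    heat = defaultdict(lambda: defaultdict(int))
    for location_bucket, timestamp in events:
        heat[location_bucket][timestamp.hour] += 1
    return heat

def safest_hour(heat, location_bucket):
    """Return the hour of day with the fewest historical incidents at a location."""
    counts = heat.get(location_bucket, {})
    return min(range(24), key=lambda hour: counts.get(hour, 0))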
In one embodiment, the data regarding historical events can be provided to users on a display of a vehicle, smartphone or computer with Internet/Cloud access and the data can be provided in various levels of granularity. For instance, the data can be provided with a simple grade, such as A for safe, B for relatively safe, F for not safe, etc. If the user wishes more detailed information, the user can be provided with information regarding past actual events. People wishing more security can therefore select a better location to park. In one embodiment, a same parking garage can have different grade levels for different areas or for different parking slots. Information regarding localized incidents within parking garages, parking lots, street parallel parking, shopping centers, airports, service stations etc. can also be used by parking garage operators, police, business owners and or service attendants or the like to improve security as they will have access to granular incident data and heat maps.
In another embodiment, a vehicle's A/V recording systems may be triggered to record video and audio at certain predetermined times, as programmed by the user. The user can program the vehicle to record upon a panic button press, upon a break-in condition, or upon any type of trigger condition. The event can then be transmitted to a cloud storage system to enable the user to view data about the condition. All or part of the data can be provided to predetermined entities, such as police, local security, etc.
The data can be viewed via any computer or mobile device having access to the Internet. In one embodiment, the vehicle can be continuously recording A/V data in a buffer, e.g., circular or non-circular buffer (e.g., storage that is local on the vehicle or storage that is part of a cloud based data center or centers). The data may be discarded after a buffer period of time. If a trigger condition occurs, the trigger condition can capture a buffer period of time before the trigger and a period of time after the trigger. These updates can be programmed to be auto-sent to a recipient, and can be provided as notifications to smartphone devices.
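The buffered pre-trigger/post-trigger capture just described might be sketched as follows; the window lengths, frame rate, and frame representation are assumed example values.

from collections import deque

class TriggeredAVBuffer:
    """Keep a rolling pre-trigger window of frames and capture a post-trigger window."""

    def __init__(self, pre_seconds=30, post_seconds=30, fps=10):
        # Window lengths and frame rate are assumed example values.
        self.pre = deque(maxlen=pre_seconds * fps)
        self.post_frames_needed = post_seconds * fps
        self.post_remaining = 0
        self.clip = []

    def add_frame(self, frame):
        """Feed continuously recorded frames; old frames fall out of the rolling buffer."""
        if self.post_remaining > 0:
            self.clip.append(frame)
            self.post_remaining -= 1
        else:
            self.pre.append(frame)

    def trigger(self):
        """On a trigger condition, freeze the pre-trigger frames and start the post window."""
        self.clip = list(self.pre)
        self.post_remaining = self.post_frames_needed

Once the post-trigger window has been filled, the assembled clip could be uploaded to the cloud storage account and referenced in the notification sent to the recipient.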
In one embodiment, structures described herein can include parking structures, parking lots, private or commercial buildings, drive-through, bridges, toll roads, highways, shared or home driveways, designated driving and parking areas. In the specific embodiments described herein, vehicles, structures and objects may include circuitry and communication logic to enable communication with a cloud processing system over the Internet. A cloud processing system, as described herein, will include systems that are operated and connected to the Internet or to each other using local networking communication protocols.
A cloud processing system can be defined as an interconnected and distributed physical or virtual software defined network that utilizes virtual or physical processing and storage machines that enable various applications and operating systems to facilitate the communication with and between various client devices (vehicles, user devices, structures, objects etc.). The communication with and between the various client devices will enable the cloud processing system to deliver additional processing information, data, and real-time metrics concerning data obtained from other processing systems as well as client feedback data. The distributed nature of the cloud processing system will enable users of various vehicles, structures and objects to access the Internet, and be presented with more flexible processing power that will provide the requested services in a more effective manner.
The processing systems can be defined from various data centers that include multiple computing systems that provide the processing power to execute one or more computer readable programs. The processing of the computer readable programs can produce operations that can respond to requests made by other processing systems that may be local to a vehicle's electronic system. For example, a vehicle can include electronics that utilize memory and a processor to execute program instructions to provide services.
In one embodiment, the services provided by the electronic systems of a vehicle can include services that access the various components or subsystems of a vehicle, such as door locks, service histories, user profiles, audio settings, entertainment settings, mapping functions, communications systems, telecommunication synchronization systems, speakers, heating and cooling functions, auto-engine start/shut-off remotely via smart devices, remote heating/cooling initiation, remote face-to-face conferencing, etc. The electronic systems within a vehicle can also provide a user interface, such as a graphical user interface. The graphical user interface can include a plurality of buttons, controls and transceivers to receive input from a user. The input from a user can also be provided by voice input, facial recognition, eye-retina scans, fingerprint scans, a combination of biometrics, or via a capacitive or regular touchscreen contained or displayed within the vehicle, the vehicle's glass, doors, dashboard etc.
In other embodiments, the electronics of a vehicle can synchronize with a user's portable electronics. The user's electronics can include, for example, mobile devices that include smart phones, tablet computers, laptop computers, general-purpose computers, special purpose computers, etc. The various computing devices of the vehicle, and/or the computing devices of the user (smart devices), can be connected to the Internet or to each other. Provided that a user has access or account access to the cloud service, the cloud processing services on the Internet can provide additional processing information to the electronics of the vehicle.
In the following embodiments, examples will be provided for ways of having the cloud processing services deliver processing information concerning various physical locations that have mapping data associated therewith. The following embodiments will also provide examples of ways a cloud processing service, together with physical sensors, can allow vehicles, structures and objects to become aware of each other, share locations, measurements and mapping data, intended paths and other metrics, along with remote administration of the same.
The mapping data associated with the various locations can include locations of objects in the real world. The objects in the real world can include roads, sidewalks, buildings, barriers, fencing, parking structures, walls or obstacles within a location, doors, positioning of walls, location information of other vehicles within a location, sensor data associated with various locations, mapping data that outlines the geometries of a building or vehicle, sensor location that is static and/or dynamic, area and volume information within buildings, structures or areas, sensors for detecting movement or presence of obstacles within a location, data identifying occupancy of specific locations such as a parking structure, a parking space, etc.
In one embodiment, the sensors of a building, showing the outline of the building, can provide data regarding what spaces are available within a designated parking area, for example. When a vehicle reaches a building, parking lot, parking designated area, or ad-hoc parking lot where auto-park is available, the vehicle will become aware of the availability of non-human operated auto parking and will transfer and receive information to and from the cloud to download and/or access the building's location and map of sensors. When a vehicle reaches a different auto-park location, it will download that particular map.
In one embodiment, vehicles can maintain information regarding where they are, where they are heading and their destination, which is maintained by GPS and navigation systems on board. The information collected and maintained by every vehicle is mutually exclusive, meaning that only each individual vehicle is aware of its own heading, rate of speed and current location. This information, in one embodiment, is crowd-sourced and crowd-shared/consumed for use in accident avoidance. By networking vehicles within a certain radius together, all individually location-aware vehicles become aware of all other vehicles in their sphere of influence. Every vehicle will network with vehicles in their range using wireless communication systems such as but not limited to Wi-Fi, WiGig, LTE, cellular, radio, near field communication or other methods.
In one embodiment, each vehicle may maintain a table (e.g., in local storage or in cloud storage) of all other vehicles in, entering, and/or leaving its sphere of influence. The vehicle's cameras can be engaged to take still photos and/or video record any incident, whether it results in a successful avoidance or an impact. This footage can be used to alert authorities of the severity of the accident and aid insurance companies in identifying fault. A vehicle will maintain a buffer of events for a given amount of time before and after a collision event or collision avoidance event, such as the location, speed, heading, and avoidance measures, to store and identify the metrics that led to an incident.
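One way the table of vehicles in the sphere of influence could be maintained is sketched below; the range, the broadcast message format, and the flat-earth distance approximation are assumptions for this sketch.

import math

RANGE_METERS = 150.0  # assumed radius of the "sphere of influence"

def approx_distance_m(a, b):
    """Approximate ground distance in meters between two (lat, lon) points."""
    lat1, lon1 = a
    lat2, lon2 = b
    dx = (lon2 - lon1) * 111_320 * math.cos(math.radians((lat1 + lat2) / 2))
    dy = (lat2 - lat1) * 110_540
    return math.hypot(dx, dy)

def update_neighbor_table(table, own_position, broadcasts):
    """Track vehicles in, entering, and leaving this vehicle's range.

    `broadcasts` maps vehicle_id -> (lat, lon, heading, speed) as shared over a
    peer-to-peer link; the message format is an assumption for this sketch.
    """
    for vehicle_id, (lat, lon, heading, speed) in broadcasts.items():
        if approx_distance_m(own_position, (lat, lon)) <= RANGE_METERS:
            table[vehicle_id] = {"position": (lat, lon), "heading": heading, "speed": speed}
        else:
            table.pop(vehicle_id, None)  # the vehicle has left the sphere of influence
    return table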
In one embodiment, a personal security system for a vehicle can include a number of features. One feature is electronics in the vehicle that can communicate with sensors of the vehicle and can communicate with the Internet for accessing cloud processing services and storage. The communication system for the vehicle can include, for instance, cellular communication with cell towers, WiFi, WiGig, 802.11ac, 802.11ad and subsequent wireless networking standards and technology for communication with wireless routers, peer-to-peer communication with other vehicles or connected computers, near field communication, Bluetooth communication, satellite communication, radio communication, infrared communication, or combinations thereof.
In one embodiment, the communications of the vehicle and electronics of the vehicle will enable direct communication with a user of the vehicle. The user of the vehicle can include, for instance, the owner of the vehicle, a driver of the vehicle, or any third party having access to the vehicle (either to drive the vehicle, to monitor the vehicle remotely, etc.)
In one embodiment, the user can communicate security instructions to the vehicle using a radio-frequency-emitting or network-connected key fob, mobile device, or the like. The key fob, for instance, may include a button to signal to the vehicle that a user wishes to activate a security protocol before, during or after the vehicle is accessed. The button can, in one embodiment, be a button that is in addition to a panic button. While a panic button can only provide on/off signals to the car (e.g., to sound alarms and lights), the additional button can be used to communicate more than on/off signals. Still further, the additional button can be the same button as an existing panic button. For instance, if the panic button is pushed once, that could mean that the user wishes a lower level of preventative security.
For example, vehicle preventative security can include: (a) turning on lights around the vehicle to make it safe for the user to approach (e.g., when the vehicle is parked in poorly lit place), (b) sounding a low level audio signal (less dramatic than a panic alarm); (c) providing voice and/or audio notification to the surrounding area noting that the vehicle is being monitored (e.g., “this vehicle is being monitored for security . . . you may be recorded during this time”); (d) turning on a recording light on the vehicle, which notifies people that the area is being recorded and actively viewed; (e) a combination of lighting, sound or audio notifications, and recording of the space around the vehicle as the user approaches can also be included as example preventative security. As noted, the preceding examples may be remotely triggered by a user, e.g., using a key fob or via a computer or mobile device connected to the Internet via an application or web browser.
In one embodiment, a key fob can allow a user to click once for a lower level of preventative security to be activated, and if no incident occurs, the user, once in the vehicle and safe, can select to erase the recorded images or data. The vehicle can also provide a query on the display screen asking the user to confirm whether the event should be maintained (e.g., saved on vehicle storage or saved to cloud storage) or erased. On the other hand, a user can double click the security button or select a separate button if the user determines that more security is needed while approaching the vehicle. If the user feels that danger is likely to occur, the user can hold down the security button or select a different button on the key fob or app on a smartphone. This will trigger elevated security sirens, more lights, and voice comments notifying people near the vehicle that the area is being recorded and/or that the recording is being sent to the authorities.
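A minimal sketch of interpreting the single click, double click, and press-and-hold inputs as escalating security levels is shown below; the timing thresholds and level numbering are assumptions for illustration.

def classify_fob_input(press_times, hold_duration):
    """Map key-fob input to a security level, following the scheme described above.

    `press_times` is a list of press timestamps in seconds and `hold_duration`
    is how long the last press was held; the timing thresholds are assumptions.
    """
    DOUBLE_CLICK_WINDOW = 0.5  # maximum seconds between two clicks
    HOLD_SECONDS = 1.5         # press-and-hold threshold

    if hold_duration >= HOLD_SECONDS:
        return 3  # highest level: sirens, more lights, voice warnings, notify authorities
    if len(press_times) >= 2 and press_times[-1] - press_times[-2] <= DOUBLE_CLICK_WINDOW:
        return 2  # elevated preventative security
    return 1      # lower preventative security (lighting and recording)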
As can be appreciated, by providing users with multiple levels of security activation, users can be provided with higher levels of security when approaching a vehicle alone or with small children. Any recordings, images, clips, audio recorded proximate to the vehicle can be shared with others. The sharing can be by way of email, text, application notifications, cloud access to storage holding data, events and data, etc. In one embodiment, the historical data for this user can be saved by the cloud services. In one embodiment, for other registered users of the cloud services, information from all or some of the historical data of events can be saved and used to map out a history of events for selected locations. The locations where events occurred can be mapped out using GPS data and time data.
In one embodiment, users of the service can access an app or cloud service account to determine the historical safety of a particular location or parking slot. This data can also be shared with parking locations, building owners, and others that can take corrective measures to improve security. In one embodiment, the data from historical events triggered at locations can be shared anonymously, without disclosing the identity of the car/user that triggered the alarm. This sharing will encourage others to share the data to collectively improve safety for particular parking areas.
In one embodiment, a vehicle can sense and collect data in its surroundings before a user decides to approach a vehicle. For instance, a vehicle can monitor a proximity volume around the vehicle automatically. In some cases, people will come in contact or in near proximity to the vehicle, but those actions would not be viewed as triggering an alarm. If, however, some activity is determined to be unusual, based on predefined rules, the vehicle can store the activity. If the activity continues (e.g., a person continues to look into the vehicle, is looking under the vehicle, approaches the vehicle too many times over some period of time, etc.), that information can be provided to the user/owner of the vehicle as a notification.
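One example of such a predefined rule, flagging a person who approaches the vehicle too many times within some period, is sketched below; the approach count, the time window, and the class-based structure are assumed example choices.

from collections import deque

class RepeatedApproachRule:
    """Flag activity as unusual when a person approaches too often within a time window."""

    def __init__(self, max_approaches=3, window_seconds=600):
        # The count and window are assumed rule values, configurable per user.
        self.max_approaches = max_approaches
        self.window_seconds = window_seconds
        self.approach_times = deque()

    def record_approach(self, timestamp):
        """Record an approach; return True when the rule says to notify the user."""
        self.approach_times.append(timestamp)
        while self.approach_times and timestamp - self.approach_times[0] > self.window_seconds:
            self.approach_times.popleft()
        return len(self.approach_times) > self.max_approaches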
If the user gets this notification, the notification can include video clips or images of the events. By having this information ahead of time, the user can determine whether or not to approach the vehicle at all, and possibly notify the authorities or local security. If the notification simply shows other users getting into and out of their cars beside the user's vehicle, that notification can be ignored by the user.
In other embodiments, when a user is attempting to park at a particular location, the vehicle can notify the driver in advance of the probable safety of the location at which the user is about to park. If the location is the site of, or proximate to, some threshold number of previous alert or security events, the driver can be notified so that the driver can determine if he or she still wishes to leave the vehicle parked at that location.
The system can also, in one embodiment, recommend a safer parking spot or location that is proximate to the current location. For example, the user can be recommended to park closer to one side of a building than the other, or at another parking garage with a lower level of historical security triggered activities.
The wireless communication can include cellular tower communication that couples and communicates through various networks to the Internet, to provide access to cloud processing 120. Other methods can include providing Wi-Fi communication to local Wi-Fi transmitters and receivers, which communicate with cloud processing 120. Other types of communication can include radio frequency communication, such as Bluetooth communication or combinations of Wi-Fi and Bluetooth. It should be understood that vehicle 102 can communicate with cloud processing 120 via any number of communication methods, so long as exchanges of data can be made with cloud processing 120 from time to time.
The communication can be made by vehicle electronics 103 while the vehicle is on or when the vehicle is off, so long as communication and processing circuitry of vehicle electronics 103 has a power source. The power source can include battery power that powers vehicle electronics 103 to communicate with cloud processing 120 when vehicle 102 is turned off. When vehicle 102 is turned on, the battery that drives vehicle electronics 103 can be recharged.
In the illustrated example, user 110 can use a key fob 104 to communicate directly with vehicle 102. Key fob 104 can include a plurality of buttons that allow the user 110 to open the vehicle, lock the vehicle, push a panic button 104A, or push a record button 104B. In one embodiment, the record button is a second button that provides the user 110 with a precautionary level of security when the user feels uncomfortable or desires to access the vehicle from a remote location.
By pressing the record button, the user can activate a security radius around vehicle 102, wherein the security radius may cause vehicle 102 to start recording data around the vehicle, and may light up the area proximate to the vehicle using its lighting. It should be understood that the record button can take any other name, and is not limited to being called a record button. In one example, the button can be viewed as a security escort button, a caution button, or can be integrated with the panic button so that multiple presses of the panic button can activate different levels of security.
In one embodiment, vehicle 102 will integrate additional vehicle lighting so that the vehicle will appear lit up and will alert nearby persons that the vehicle is being monitored and/or being recorded. As the user 110 walks towards the vehicle, the vehicle will record its surroundings and will light up the space so that the user 110 can approach the vehicle. This process may be used by the vehicle owner when desiring to approach the vehicle in an area that may not appear to be safe. The area not appearing to be safe can include, low lighting in the parking location, hovering or proximity of other people near the vehicle that may appear to be less than safe, or for any other reason that the user feels uncomfortable or has a desire to obtain more security when approaching vehicle 102.
In one embodiment, if a person approaches the vehicle 102 when user 110 is walking towards the vehicle, or the person is hiding behind vehicle 102, vehicle 102 can record and/or alert the person that the vehicle is being monitored. The person can be alerted with a sound, or a voice that communicates out loud to the area around the vehicle 102 that the area is being recorded, and that the vehicle owner is approaching. If, when the user 110 is walking toward the vehicle 102, some condition changes, such as the user 110 becoming more apprehensive of his or her security, the user can then push the panic button 104A. This will cause an elevated level of activity for vehicle 102.
The elevated level of activity can include (in addition to recording the area around the vehicle and turning on the lights around the vehicle) sounding a siren and voicing a warning to anyone in the proximity of the vehicle that they are being recorded and the authorities have been notified of the current alarm condition. If, when the user arrives at the car, the user feels safe, the user can turn off the panic button and can also turn off the recording. In one embodiment, the recordings and the triggering of the alarms can be saved to memory and storage of vehicle 102. In another embodiment, the recordings and data associated with the recordings and triggered alarms can be saved to cloud processing 120 automatically.
The saving of this data to cloud processing 120 allows user 110 to maintain a history of the times the alarms or triggered alarms occurred for vehicle 102, and the locations where the triggers occurred. In another embodiment, when more vehicles include the monitoring features described herein, those vehicles can also report different trigger or alarm conditions. Because the cars can be tracked for location using global positioning satellites (GPS), the position of the car when the triggers were detected (and any clips of video, images or metadata saved to the servers in cloud processing 120) can be communicated to databases of a service that monitors vehicle 102, using cloud processing 120. For example, if a plurality of cars sign up for a service, which includes assigning accounts to different users of vehicles, when vehicles experience or trigger alarm conditions or recording conditions, that data can be saved to cloud processing 120.
By consolidating and analyzing data from a plurality of vehicles, it is possible to determine locations that may not be safe, or can be predicted to not be safe based on the number of trigger conditions that occurred (e.g., in proximity to a GPS location or locations). In another embodiment, the key fob 104 can be represented as an APP (application) on a portable device. The APP can include graphical user interface icons 104′ that represent images of key fob 104. Thus, the user can also access the vehicle 102 using a graphical user interface that resembles key fob 104.
In one embodiment, the user 110 is registered with cloud processing 120 that will provide access to vehicle 102 over an application. User 110 will be allowed to access the application and the vehicle 102 upon being verified using accounts and passwords. Additional security, such as encryption, may also be used to prevent hacking of access codes. Accordingly, the user can be accessing vehicle 102 using a key fob 104 or a portable device 104′, or also a computer connected to the Internet.
If person 180b approaches the proximity zone 160, the person will be alerted that vehicle 102 is recording the proximity as user 110 approaches the vehicle. In one embodiment, user 110 can also be notified via a portable device that the proximity zone 160 around vehicle 102 has a person 180b proximate to the vehicle. At that point, user 110 can decide whether to approach the vehicle or not approach the vehicle, or to select the panic button 104A if user 110 fears for his or her safety. Although this example shows a planar two-dimensional parking lot 150, the remote access into the vehicle can include accessing vehicles in parking structures that are multilevel, approaching vehicles at home, approaching vehicles at work, or approaching vehicles in any other location.
As illustrated, the sensors 108 can be configured to capture image data around the perimeter of the vehicle 102 in proximity zone 160. In addition, optional sound capture can also be conducted within the perimeter 160. The perimeter 160 size can vary depending on the type of sensors utilized, or can be programmatically adjusted by the user. The user can, for example, reduce the radius or sensitivity of the proximity zone 160 so as to reduce the number of alerts that are received during a monitoring session. In other embodiments, the radius or sensitivity of proximity zone 160 can be maximized when the user needs to access the vehicle in unsafe conditions or environments. The sensing data, control and capture, can be communicated between vehicle 102 and cloud processing 120 utilizing vehicle electronics 103, and its communication systems. Also shown is an illumination area around the vehicle.
The illumination area around vehicle 102 can include special illumination lighting around the perimeter of the vehicle. The illumination lighting can include, for example, LED lighting at the base of the vehicle that lights up the floor or ground area around the vehicle with high intensity.
In one embodiment, the illumination lighting can be color coded. If a person is hiding or standing proximate to the vehicle as a user walks toward the vehicle that is illuminated, the illumination color or glow around the vehicle can change. If rules suggest that it is dangerous to approach the vehicle, the glow may be red, if it is safe, the glow may be green, or some other shades of color. The color can also switch from one color to another, as the user approaches with higher security needs.
In other embodiments, additional illumination lights can be provided in the vehicle that are activated based on motion capture. For example, if the user is attempting to approach a vehicle that is currently being monitored and a person is hiding behind the vehicle, the lighting in the area where the person is sensed (using motion detection, heat detection, sound detection, or a combination thereof) can be intensified to spotlight that person.
Additionally, sound can be directed toward persons that may be in the proximity zone 160 when a person is approaching vehicle 102 in a safety mode. As will be described below, the safety modes can be set by the user depending on the level of safety desired by the user when approaching vehicle 102. The safety levels can be low levels, such as lighting up the proximity zone 160 or simply emitting low-level sounds. In more elevated levels, the user can decide to automatically capture image data and send captured image data or video to third parties using cloud processing 120. Additionally, if a vehicle's ability to transfer, record or activate passive or active safety features is compromised, deactivated or tampered with, an emergency S.O.S. signal may be sent to the user of the vehicle using any remaining operative sensor data. This signal may also be sent to authorities and captured by cloud processing.
In one embodiment, some or all of the active sensors of the vehicle can be enabled when the vehicle is off or on, or some can be enabled upon receiving a request to enable some of the sensors from a user (e.g., when the user of the vehicle decides to approach the vehicle, or wishes to view the vehicle remotely or activate some feature of the vehicle remotely).
Active sensors 108 are in communication with vehicle electronics 103. Vehicle electronics 103 can include security module 103a, which is integrated with vehicle electronics 103 or can be an add-on module that communicates with vehicle electronics 103. Vehicle electronics 103 is in communication with cloud processing 120.
The recording data can then be saved to local storage, and later communicated to cloud processing 120 for additional storage remotely from the vehicle 102. Also shown is an audio capture module that includes a microphone controller. The microphone controller is in communication with the plurality of microphones 108a. The microphone controller may be communicating with activation logic and record logic to set when one or more of the microphones should be recording, and the recording data can be saved to the local storage. Just as image data can be saved to cloud storage using cloud processing 120, the microphone data that is recorded can also be transmitted to cloud storage in cloud processing 120. Further, optionally included in security module 103a is a motion/heat capture module.
This module can include a plurality of controllers to capture either motion or heat data surrounding vehicle 102. The sensors around vehicle 102 can detect when heat of a human body is proximate to the vehicle 102, or if heat is coming from another vehicle that may have been running. The motion controller can also detect when motion is occurring proximate to the vehicle 102. Various sensors 108b and 108c can be located in and around vehicle 102. Proximity detection logic may be in communication with the image capture module, audio capture module, motion/heat capture module.
Data inputs from these various modules and their respective sensors can then be processed by proximity detection logic. Proximity detection logic can implement a plurality of rules to determine when people, objects, heat, or obstacles are proximate to vehicle 102. This data can then be transferred to an alarm type trigger logic. Based on the different sensor data collected by the proximity detection logic, the alarm type trigger logic can operate to trigger different types of alarm activations, based on alarm rules. Alarm type trigger logic can trigger a plurality of actions, which without limitation can include remote notification, activation of horns, turning on or off of lights, output of voice warnings, triggering of recordings, notification of authorities, saving of recordings to cloud storage of user accounts, notifications to third parties, and combinations thereof. In one embodiment, the alarm type trigger logic is also in communication with a vehicle controller interface. The vehicle controller interface can include electronics of the vehicle, which are native to the vehicle. These electronics can serve to turn on existing lighting of the vehicle, horns of the vehicle, settings of the vehicle, etc. Each vehicle maker can include a different controller interface and can include application interfaces that enable security add-on modules to be connected to the vehicle controller interface.
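The rule-driven alarm selection just described might be sketched as follows; the condition names, action names, and rule format are assumed placeholders and not a defined interface of the alarm type trigger logic.

# Assumed example rules mapping detected conditions to alarm actions.
ALARM_RULES = [
    {"condition": "person_lingering",
     "actions": ["turn_on_lights", "voice_warning", "start_recording"]},
    {"condition": "break_in",
     "actions": ["sound_horn", "start_recording", "save_to_cloud",
                 "notify_user", "notify_authorities"]},
    {"condition": "collision",
     "actions": ["start_recording", "save_to_cloud", "notify_user"]},
]

def trigger_alarms(detected_condition, handlers):
    """Run every action registered for the detected condition.

    `handlers` maps each action name to a no-argument callable exposed through
    the vehicle controller interface or cloud connection (names are placeholders).
    """
    for rule in ALARM_RULES:
        if rule["condition"] == detected_condition:
            for action in rule["actions"]:
                handlers[action]()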
The vehicle controller interface can also link to communications systems of the vehicle. The vehicle's communications system will enable the vehicle 102 to communicate with cloud processing 120. Communications systems of the vehicle can also enable users to communicate with the vehicle using a key fob 104. A user with a remote computer or network connected mobile device can also communicate with cloud processing 120, which in turn communicates with the vehicle via communications systems in security module 103a or other logic that is native to the vehicle and manufacturer. In one embodiment, an override signal can be monitored to receive input from the user. The input from the user can be in the form of receiving a panic button input, which can trigger the alarm types to activate certain ones of the trigger alarms and notifications.
In another embodiment, the override can include a record button. In this embodiment, the record button is a lower level security feature that may be lower level than a panic. For example, a user that wants to be cautious when approaching the vehicle in an unlighted place may decide to push the record button that will activate certain features on the vehicle, such as lighting, sound indicators, the camera sensors, the microphone sensors, etc. However, the record button may not necessarily mean that the user is indeed in trouble. If while the user is approaching the vehicle in the record mode the user determines that an actual panic or higher-level security activity is warranted, the user can then press the panic button. The panic button provides an activation of more security features of the vehicle, such as auto notification of police, recording clips of data in the vicinity of the vehicle being transmitted to authorities, notification of local security, etc.
If the user is not approaching the vehicle and the vehicle is simply being monitored from a remote location or being monitored to allow access from a remote location, the override button is not selected. In this case, the alarm type trigger logic would utilize alarm rules to decide which types of triggered alarms will be set. Additionally, notifications to the user may be sent via mobile phone application notifications, e-mails, texts, and other communication methods. The notifications can include video clips, images, sound files, and metadata related to the trigger event.
Based on the type of detected events, certain data will be transferred to the user in the form of notifications, or simply added to the history file of the vehicle when parked at a particular location, and assigned a time tag for when the event occurred. The user can then access a mobile device, or any computer connected to the Internet to view any activity that has occurred to the vehicle when away from the vehicle, and any notifications concerning triggers of alarms. By having this information before approaching the vehicle, the user can proceed to the vehicle with more or less caution.
The normal monitoring may occur when the user is away from the vehicle, but may still wish to receive alarm conditions or data regarding the vehicle as notifications that may be viewed from the user's account via a cloud processing application. The method may include operation 202 during normal operation that detects a trigger activity proximate to the vehicle. The trigger activity may be simply detection of a person walking by the vehicle, a person talking near the vehicle, a person attempting to break into a vehicle, a collision detection from another vehicle or object, etc. The sensor data in operation 204 is analyzed to determine if the activity has occurred within the proximity zone around the vehicle. If the activity has been detected within the proximity zone, the system will read the alarm rules in operation 206 to determine the type of alarm to sound. The alarm can include audible sounds or simply triggering of applications or notifications or recordings, or combinations thereof.
As noted, the rules can be pre-set for the specific user, can be set based on historical events, can be set based on the time of day, can be set based on the location where the vehicle is parked, or other combinations of settings that can be made part of logic rules. In operation 212, one or more of the active sensors at one or more locations around the vehicle may be activated. For example, cameras may be activated to record, sounds may be emitted, lighting may be activated to spotlight particular objects or people in the vicinity of the vehicle, etc. Operation 214 indicates that data from the sensors may be recorded. The recorded data can then be temporarily saved at the vehicle. In operation 216, the recorded data can be saved and transmitted to a remote cloud services account for storage. The data can then be made accessible to persons having access to the user's account, monitoring services, authorities, etc. A determination is made in operation 218 as to whether the alarm was due to an override.
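A minimal sketch of this monitoring flow, assuming hypothetical data structures for sensor events, alarm rules, and cloud storage, might be organized in Python as follows; the operation numbers appear in comments for orientation only.

def monitor_cycle(sensor_event, proximity_zone_m, alarm_rules, cloud_storage):
    # Operations 202/204: confirm the trigger activity is inside the proximity zone.
    if sensor_event["distance_m"] > proximity_zone_m:
        return None

    # Operation 206: read the alarm rules to choose the type of alarm.
    alarm = alarm_rules.get(sensor_event["type"], "notify_only")

    # Operations 212/214: activate sensors and record data near the vehicle.
    recording = {"alarm": alarm, "clip": "clip_%d.mp4" % sensor_event["time"]}

    # Operation 216: save the recording to the remote cloud services account.
    cloud_storage.append(recording)

    # Operation 218: branch on whether an override caused the trigger.
    return "notify_authorities" if sensor_event.get("override") else "notify_owner"

# Example usage with hypothetical rules and a person detected near the vehicle.
cloud_storage = []
rules = {"contact": "audible_alarm", "person_nearby": "record_only"}
event = {"type": "person_nearby", "distance_m": 2.0, "time": 1700000000}
result = monitor_cycle(event, proximity_zone_m=5.0, alarm_rules=rules,
                       cloud_storage=cloud_storage)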
In this example, no override signal was sent, so the method moves to operation 222, where normal monitoring communicates a notification to the owner of the vehicle regarding the triggered activity. The notification can also be sent to third parties, to persons that may be driving the vehicle temporarily (for example, with a temporary user account for the vehicle), to the police, to local monitoring services, or to monitoring services of the user with which the car may be registered.
In another embodiment, the trigger may be due to an override signal being sent to the vehicle in operation 208. This might be the result of an active panic. An active panic is simply a notation that can be thought of as a user desiring higher levels of security when approaching a vehicle. In operation 210, it is determined whether the level of panic for the override is a single click, a double-click, a hold-down of a panic button, or an activation of recording.
These are only examples of the types of levels that can be signaled by a user when an override condition is triggered. Additional types of signals can be sent by the user via customized levels and activities that may be programmed through a user interface of an application that is provided to users having accounts with a security program. Based on the level of security set by the override in operation 210, the alarm rules are read in operation 206. At this point, operations 212 to 216 are processed, it is determined in operation 218 that the override condition was true, and an urgent communication or notification can be sent to authorities in operation 220. The police, monitoring service or other identified persons can then see the notification, and in operation 224 access is provided to recordings for the triggered activity and/or past activity at the vehicle via a website or application. In one embodiment, this data that is sent to cloud processing regarding an event can be shared with others within a proximity threshold of the event (e.g., either when a user/device requests the data or via a push notification, or combinations thereof). For example, if an event is triggered 5 parking spaces away (or any other predefined distance) from a different user's vehicle with monitoring services enabled, the user of the second, non-affected vehicle may receive a report/notification that an incident has occurred nearby and that the user should use extra caution. This data can be in the form of a report with a certain time interval cadence, or an on-the-fly alert or notification (e.g., email, application notification, text, voice message, etc.).
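One possible way to select which nearby monitored vehicles receive such a proximity-based notification is sketched below in Python; the haversine distance calculation is standard, while the 30-meter threshold and the vehicle records are illustrative assumptions.

import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    earth_radius_m = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * earth_radius_m * math.asin(math.sqrt(a))

def vehicles_to_notify(event_location, parked_vehicles, threshold_m=30.0):
    """Return parked, monitored vehicles within the threshold of the event."""
    lat, lon = event_location
    return [v for v in parked_vehicles
            if haversine_m(lat, lon, v["lat"], v["lon"]) <= threshold_m]

# Example: an incident a few parking spaces away triggers a nearby-caution alert.
parked = [{"id": "vehicle_b", "lat": 37.42210, "lon": -122.08410},
          {"id": "vehicle_c", "lat": 37.43000, "lon": -122.10000}]
nearby = vehicles_to_notify((37.42202, -122.08412), parked)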
In this example, icons can be provided to the user to provide easy access to the user's account, in this case the account of user Bob. The user can then manage his user account through a website that communicates with cloud processing 120, if the user has registered with the security system that is operated by cloud processing 120. Remote control can be provided to the user via a remote control icon. The user, through the user interface, can then select override signals such as panic and record directly from the computing device, without needing to access a key fob. The user can also review past recordings or start recording now, view live feeds in and around the proximity of the vehicle, and see notifications and alarms.
The user can be provided with a set number of alarms and notifications that, once selected, can show historical data of notifications, alarms and the like. The remote computing device, as noted above, can be a remote computer, a portable device, a vehicle display, or any other computing system. The user interface can also provide the user with historical data regarding parking spaces or parking locations where the vehicle may be parked. The user can also be provided with recommendations of safer locations to park the vehicle 102. The recommendations can be obtained by looking at historical alarm conditions from other vehicles for specific parking locations.
This information can therefore qualify specific locations and parking slots for vehicles, actively assisting when the user is selecting parking spots in a public location. In one embodiment, the recommendations can provide historical data of actual events that occurred at the specific location. By having this information, users can make informed decisions on where to park the vehicle when security may be an issue or is sensed to be an issue.
If the user is using a GUI of a computer screen to activate functions of a key fob, the color codes can be shown on the screen, and the user may be able to simply select the exact level without multiple clicks. Specialized key fobs can also provide this functionality and can be linked to the user's smartphone app. For example, a key fob can be made to communicate data with a smartphone or computer.
In one embodiment, if the user selects the security button once, a level 1 security is activated, which may be shown with the green LED. This is a lower level of security for when the user wants to be cautious when approaching a vehicle. The user can then select the security button a second time, which lights up the second LED in a different color, such as yellow.
The yellow color may indicate a level 2, which may activate different levels of security for the vehicle as shown in the example triggers. If the user pushes the security button a third time, a level 3 security is activated, which may trigger additional security measures beyond those for levels 1 and 2. This may be shown as an orange LED on the key fob, or on the display screen of the user's application. Similarly, if the user selects the security button a fourth time, a red LED will be lit indicating the highest level of security, which is shown as level 4. In this example, more aggressive levels of security and notifications are triggered as the user selects a higher level of security from the key fob. The levels of security are only examples, and more or fewer levels may be used, and more or fewer lights or indicators may be provided to select the particular security input.
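A simple Python sketch of translating repeated key fob presses into security levels and indicator colors is shown below; the four-level mapping mirrors the example above, and the function name and clamping behavior are illustrative assumptions.

LED_COLORS = {1: "green", 2: "yellow", 3: "orange", 4: "red"}

def security_level_from_clicks(click_count, max_level=4):
    """Clamp the number of button presses to the highest defined level."""
    level = min(max(click_count, 1), max_level)
    return level, LED_COLORS[level]

# Example: pressing the security button three times selects level 3 (orange LED).
level, led_color = security_level_from_clicks(3)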
In one embodiment, the key fob can receive voice input commands from the user. The user can activate the security level by simply speaking and the key fob or portable device can detect the sound and identify the individual as the owner or operator of the vehicle, and then apply the security level.
In one embodiment, lower levels of security may be activated when a user is approaching a vehicle and feels relatively safe. As the user feels that his safety has diminished, the user can selectively increase the level of security as the user approaches the vehicle. This acts to provide preventative measures for approaching the vehicle, such that nearby criminal entities will be notified of the security and surveillance by the vehicle, the fact that the area is being recorded, and the fact that authorities are being notified. In one embodiment, by enacting the multilevel security system for vehicles, the security of vehicle occupants or owners can be enhanced.
In other embodiments, the multiple levels of security can also be activated when the user is in the vehicle. For example, if the user is in the vehicle and is approached by criminal entities, the user can select the security button inside the vehicle to activate one or more of the security systems described above.
The access to the data can also be encrypted to prevent unauthorized access to the data. GPS and mapping services can also be in communication with the cloud processing 120 to provide data concerning the locations of the vehicles and activities that occurred to the vehicles when at particular locations. The cloud processing can be accessed by the vehicles themselves using their electronics and communications, via mobile devices, from home, from work, etc.
The security service can include security agents that monitor and receive trigger events from cloud processing for the various vehicles. The security service can be automated, or can include live people that handle specific events for members of an account and a service. The security service can also be in communication with authorities to contact police, local patrols, ambulances, etc. The cloud processing can also be in direct communication with the authorities without having to communicate with security service agents of a security account for a vehicle or vehicles. Reports can then be generated by cloud processing for specific owners of vehicles so that owners of vehicles can understand historical security breaches or triggers that have occurred and the locations where they have occurred.
The owners of the vehicles can also access cloud processing to get recommendations of safer locations to park, based on historical data of other vehicles that have experienced security events. These past security events can be mapped to a heat map that identifies where the events occurred, and how long ago the events occurred. If events concerning security have occurred often in a specific location, but those events occurred years ago, the heat map will deemphasize those events relative to events that have occurred more recently.
Recommendations to the users can then be populated based on more accurate and recent activity. For example, if a garage operator has recently improved security, past security breaches and events can be deemphasized if the garage operator registers with the security service and notifies the security service of their improvements in security. In one embodiment, different parking areas can be rated by the cloud services logic and the ratings can be provided back to the user. These ratings can be used to recommend better locations for parking. If the parking owners see that their ratings have fallen, the parking owners can address or correct their security issues to receive a better rating.
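One way a rating service might deemphasize older events, assuming a hypothetical exponential decay with a configurable half-life, is sketched below in Python; the severity weights and half-life value are illustrative only.

import time

def location_risk_score(events, half_life_days=365.0):
    """Sum event severities, exponentially decayed by the age of each event."""
    now = time.time()
    score = 0.0
    for event in events:
        age_days = (now - event["time"]) / 86400.0
        decay = 0.5 ** (age_days / half_life_days)  # older events count less
        score += event["severity"] * decay
    return score

# Example: two break-ins from years ago weigh less than one recent minor incident.
years_ago = time.time() - 3 * 365 * 86400
last_week = time.time() - 7 * 86400
score = location_risk_score([{"time": years_ago, "severity": 5.0},
                             {"time": years_ago, "severity": 5.0},
                             {"time": last_week, "severity": 1.0}])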
Other types of tracking can also be used. For example, tracking can be used to generate reports/notifications that are produced and distributed to safety coordinating officers of specific locations (or vehicle owners), or to identify information that occurred before or after an accident or incident at a specific location, such as a vehicle collision, vehicle theft or attempted theft, robbery, or attempted robbery, as well as panic situations during crimes. These alerts can also be provided with historical data associated with the alerts. The historical data can include images taken by cameras at the specific locations where the alerts occurred. The images can be videos, video clips, still images, collections of still images, audio files, or audio snippets, which are accessible instantly over cloud processing 120.
This information can also be accessible via cloud processing 120 after a specific incident is reported. For example, if a burglary or crime occurred within the location, the sensor data of the location as well as the sensor data of the vehicles can be coordinated to identify when and who traversed specific locations within the area. In one embodiment, the collected data can be partitioned to allow certain data to be shared with the owner of the vehicle and certain data to be restricted to the owner or operator of the location.
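The following Python sketch illustrates one possible approach to selecting clips recorded around the time of a reported incident and partitioning access between the vehicle owner and the location operator; the time window, record fields, and access policy are illustrative assumptions.

def clips_for_incident(clips, incident_time, window_s=600):
    """Select clips captured within a window before and after the incident."""
    return [c for c in clips if abs(c["time"] - incident_time) <= window_s]

def partition_access(clips):
    """The owner sees the vehicle's own clips; the operator sees facility clips."""
    owner_view = [c for c in clips if c["source"] == "vehicle"]
    operator_view = [c for c in clips if c["source"] == "facility"]
    return owner_view, operator_view

# Example: clips near a reported incident, split between owner and operator.
clips = [{"time": 1000, "source": "vehicle"},
         {"time": 1200, "source": "facility"},
         {"time": 5000, "source": "vehicle"}]
owner_clips, operator_clips = partition_access(
    clips_for_incident(clips, incident_time=1100))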
The vehicle cameras can also obtain data before, during, or after the vehicle begins to move. The vehicle cameras can be located at all locations around the vehicle. The vehicle cameras can be located in the front, in the rear, under the vehicle, above the vehicle, to the sides of the vehicle, etc. Other sensors of the vehicle can include ultrasonic sensors, heat sensors, IR sensors, sound sensors, gyroscopes, microphones, etc.
In some embodiments, the vehicles may establish peer-to-peer links to facilitate fast transfer of data. In other embodiments, vehicles may link to each other using pairing algorithms that allow the vehicles to exchange data using WiFi, Bluetooth, near field communication (NFC), or some other short range communication protocol.
The APP may use any combination of device electronics in coordination with GPS location coordinates to determine what types of overlays to display on the GUI 105 to show the user the different options he or she may have in parking. The user may choose from many different overlays on the given map 1002 showing the user's proximate location and parking options. In this case, the user may choose a grade map from selections 1012, which can also show information in heat map overlay mode, incident report overlay mode, recommended parking overlay mode, or rating mode, among others. In this particular example, a letter grade 1008a-e may be given to each available parking area 1006a-e within a proximate location of the user's current location 1004. The letter grades illustrate one such way of indicating varying degrees of safety of certain locations; however, numbers, symbols, colors, as well as intensities or combinations of these may also be used to convey varying degrees of safety.
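A letter grade could, for example, be derived from a numeric risk score as in the Python sketch below; the thresholds are illustrative assumptions and not tied to any particular rating scale.

def letter_grade(risk_score):
    """Lower risk scores map to better letter grades."""
    if risk_score < 1.0:
        return "A"
    if risk_score < 3.0:
        return "B"
    if risk_score < 6.0:
        return "C"
    if risk_score < 10.0:
        return "D"
    return "E"

# Example: grading three nearby parking areas from their risk scores.
grades = {area: letter_grade(score)
          for area, score in {"1006a": 0.4, "1006b": 4.2, "1006c": 12.0}.items()}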
A user may also decide to view more information about a given parking area 1006a-e and may be able to select an area by speaking a command to the app, which will identify the selection, by touching the screen to elect a choice, or by tapping with an indicating utensil. Once the user has made a selection, additional details may be displayed to the user pertaining to that particular parking area. This data may include metrics including, but not limited to, incidents, ratings, ability to rate, reservation options, alternatives, subscription options as well as payment options. A user may decide that parking location A is close to where they need to be; however, the location may have a high rate of incidents.
In one embodiment, the user may click on the incidents information tab to determine whether the incidents were severe or mild, and whether the risk is worth parking closer to their destination. The user may want to subscribe to a given parking area 1006a-e in order to receive alerts when an incident happens, to determine if the parking area is safe to park in at a later time or when they have parked in that area. After the user has parked in an area, the user may decide to rate the parking area, report an incident or pay for parking time.
In one embodiment, the homepage may also contain a section dedicated to interactive utilities 1020 including, but not limited to, destination entry, heat maps and map overlays, rating maps, reservations utilities, monitors, etc. The user may choose to enter their destination, in which case the user may want to receive information about available parking in and around that location along with information on varying safety levels associated with those parking areas. Similarly, the user may already be in the area where the user wishes to park, in which case the user may choose a dynamic location-based map overlay to help in deciding where the safest place to park is. The user may choose to reserve parking in advance or reserve parking when they are approaching an identified parking area on the map overlay interactive map utility.
A user's APP homepage may also include dynamically updating sections 1024 in which the most important information at a given time may be displayed or surfaced to a user. If a user has parked in a certain parking area, he or she may want to monitor metrics related to incidents that may have occurred to his or her vehicle, vehicles around his or her vehicle, any dynamically received alerts, as well as precaution levels. Additionally, a user may choose to configure his or her APP homepage to display the audio and video feeds most pertinent to their needs.
The example of the APP shows 3 such feeds: 1026a, a side view of a user's vehicle; 1026b, a bird's eye view of a user's vehicle; and 1026c, a view from inside the user's vehicle looking out through the driver's window. This example shows 3 such feeds; however, any number of feeds may or may not be shown simultaneously, depending on the APP configuration by the user. These feeds may be useful in identifying threats to a user's vehicle while the user is away. For instance, feed 1026c shows the view from inside the vehicle looking out through the driver's window, which has captured suspicious activity relating to an individual peering into the user's vehicle. Since all camera feeds may be passed through a filtering algorithm to determine if activity is suspicious or benign, a determination can be made by the logic whether to alert the user or not. In this case, the logic has determined that the activity of an individual peering into the user's vehicle qualifies as suspicious.
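By way of illustration only, a very simple rule-based filter of this kind might be sketched in Python as below; an actual system could use trained models, and the observation fields and thresholds shown here are hypothetical.

def classify_activity(observation):
    """Flag peering into windows, or prolonged loitering very close to the vehicle."""
    if observation.get("peering_into_window"):
        return "suspicious"
    if observation["distance_m"] < 1.0 and observation["duration_s"] > 20:
        return "suspicious"
    return "benign"

# Example: an individual peering through the driver's window triggers an alert.
observation = {"peering_into_window": True, "distance_m": 0.5, "duration_s": 35}
should_alert = classify_activity(observation) == "suspicious"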
In one embodiment, information regarding the incident or a brief summary of the incident 1042 may be shown to the user. In this case, the incident is regarding the break-in of a vehicle proximate to the user's vehicle. The user may also elect to consume all details related to alert 1038 by selecting the details button 1046, which will show all text and media available regarding alert 1038. This may be useful to the user, who may elect to move his or her vehicle to a safer location. A second example of an alert may be alert 1040 with description 1052 regarding suspicious activity. Details regarding this alert may be displayed in area 1044, which indicates that the user's vehicle determined, using gyroscopes, that it has been touched. The user may elect for more details by choosing button 1046 or may want to watch live feeds from their vehicle by choosing button 1048 to validate the alert or determine if the alert was a false alarm.
At this time, the user may elect to enable varying degrees of countermeasures to deter the suspicious individual. Actions 1060 available to the user may include, but are not limited to, honking the vehicle's horn, providing an audio warning, speaking directly to the suspicious individual near the vehicle, flashing the vehicle's lights or lighting system, alerting the authorities, capturing snapshots of the suspicious individual, recording all video and audio for storage locally on the device as well as on cloud processing storage systems, and electing to alert others in the area that may be affected by the suspicious activity. This alert may show up on another user's device 104 as an alert, as it did on the current user's device.
In one embodiment, the vehicles can communicate directly with each other via a temporary pairing process. The temporary pairing process can be automatically enabled when vehicles come too close to each other, for example. When this happens, local communication between the vehicles, such as a peer-to-peer connection, Wi-Fi connection, NFC connection, or Bluetooth connection can be established to enable the vehicles to share information concerning their proximity to one another. This local communication enables one or both vehicles to take corrective actions or alert a driver to change course or trigger automatic collision prevention measures (e.g., more aggressive notifications to one or both operators, slowing the speed of one or more vehicles, changing the driving direction of one or more vehicles, etc.). Once the close-proximity communication occurs and some corrective action is made, the data regarding the occurrence and the actions taken can be communicated to the cloud system for storage. The information can then be viewed by a registered user having access to an account for the vehicle(s).
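A minimal Python sketch of such a proximity-triggered pairing and logging step, assuming hypothetical distance thresholds and record formats, follows.

def maybe_pair_and_log(distance_m, pairing_threshold_m, cloud_log):
    """Pair when too close, pick a corrective action, and log it to the cloud."""
    if distance_m > pairing_threshold_m:
        return None
    action = "alert_drivers" if distance_m > 1.0 else "slow_vehicles"
    record = {"distance_m": distance_m, "action": action}
    cloud_log.append(record)  # later viewable from the registered user's account
    return record

# Example: two vehicles within 0.8 meters trigger an automatic slow-down action.
cloud_log = []
maybe_pair_and_log(distance_m=0.8, pairing_threshold_m=2.0, cloud_log=cloud_log)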
In still other embodiments, based on data collected from events or triggers of alarms, the cloud processing system can suggest specific parking spots, slots, garages, areas, areas of cities, areas of buildings, neighborhoods, etc. In still other embodiments, the data obtained from reported triggers, security breaches, or elevated levels of security requested can be augmented with data from authorities, such as crime statistics. These statistics can be obtained from online databases and combined or blended with data obtained from vehicle activities. The data can be obtained for different types of crime activity, such as car-jacking, muggings, theft, etc. The data can also be assigned to specific areas by GPS locations, by zip codes, or by a radius around some object, location or vehicle.
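One simple way to blend vehicle-reported events with public crime statistics for an area, assuming an illustrative weighting between the two sources, is sketched below in Python.

def blended_area_score(vehicle_event_count, crime_stat_rate, vehicle_weight=0.6):
    """Weighted combination of on-vehicle reports and public crime statistics."""
    return (vehicle_weight * vehicle_event_count
            + (1.0 - vehicle_weight) * crime_stat_rate)

# Example: a blended score for an area keyed by zip code.
area_scores = {"94301": blended_area_score(vehicle_event_count=3, crime_stat_rate=7.5)}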
In one embodiment, a user can remotely reserve a parking spot. The reserved parking spot can be tied to a particular rating of security. Spots in a parking garage, for example, can be reserved and paid for based on their demand. Spots with low crime or high safety ratings may lease for higher fees, while spots with a lower safety rating may lease for less money. The reservation can be made using a mobile device for remote reservation, or can be made by electronics of the vehicle. In one embodiment, as a vehicle arrives at a parking area, the user may be provided with options to park. The options can be tied to the security rating for the parking spots.
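A pricing rule of this kind could, for example, be sketched in Python as follows; the base fee and grade multipliers are illustrative assumptions.

def spot_fee(base_fee, safety_grade):
    """Safer spots lease at a premium; lower-rated spots lease for less."""
    multipliers = {"A": 1.5, "B": 1.25, "C": 1.0, "D": 0.85, "E": 0.7}
    return round(base_fee * multipliers.get(safety_grade, 1.0), 2)

# Example: reserving a grade-A spot at a 10.00 base rate costs 15.00.
fee = spot_fee(10.00, "A")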
In the foregoing description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be obvious, however, to one skilled in the art, that the present invention may be practiced without some or all of these specific details. In other instances, well known process operations have not been described in detail in order not to unnecessarily obscure the present invention.
The various embodiments defined herein may define individual implementations or can define implementations that rely on combinations of one or more of the defined embodiments. Further, embodiments of the present invention may be practiced with various computer system configurations including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers and the like. The invention can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a wire-based or wireless network.
With the above embodiments in mind, it should be understood that the invention could employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared and otherwise manipulated.
Any of the operations described herein that form part of the invention are useful machine operations. The invention also relates to a device or an apparatus for performing these operations. The apparatus can be specially constructed for the required purpose, or the apparatus can be a general-purpose computer selectively activated or configured by a computer program stored in the computer. In particular, various general-purpose machines can be used with computer programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required operations.
The invention can also be embodied as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data, which can thereafter be read by a computer system. The computer readable medium can also be distributed over a network-coupled computer system so that the computer readable code is stored and executed in a distributed fashion.
Although the foregoing invention has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications can be practiced within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.
The present application is a continuation application of U.S. patent application Ser. No. 13/911,072, filed on Jun. 5, 2013, and entitled “Methods and Systems for Vehicle Security and Remote Access and Safety Control Interfaces and Notifications,” which claims priority from U.S. Provisional Patent Application No. 61/760,003, filed on Feb. 1, 2013, and entitled “Methods and Systems For Vehicle Security and Remote Access and Safety Control Interfaces and Notifications”, which are herein incorporated by reference. U.S. patent application Ser. No. 13/911,072, filed on Jun. 5, 2013, and entitled “Methods and Systems for Vehicle Security and Remote Access and Safety Control Interfaces and Notifications,” also claims priority to U.S. Provisional Patent Application No. 61/745,729, filed on Dec. 24, 2012, and entitled “Methods and Systems For Electric Vehicle (EV) Charging, Charging Systems, Internet Applications and User Notifications”, which are herein incorporated by reference. U.S. patent application Ser. No. 13/911,072, filed on Jun. 5, 2013, and entitled “Methods and Systems for Vehicle Security and Remote Access and Safety Control Interfaces and Notifications,” is a continuation-in-part of U.S. application Ser. No. 13/452,882, filed Apr. 22, 2012, (Now U.S. Pat. No. 9,123,035, issued on Sep. 1, 2015) and entitled “Electric Vehicle (EV) Range Extending Charge Systems, Distributed Networks of Charge Locating Mobile Apps”, which claims priority to U.S. Provisional Application No. 61/478,436, filed on Apr. 22, 2011, all of which are incorporated herein by reference.
Prior Publication Data

Number | Date | Country
---|---|---
20180037193 A1 | Feb 2018 | US

Provisional Applications

Number | Date | Country
---|---|---
61760003 | Feb 2013 | US
61745729 | Dec 2012 | US
61478436 | Apr 2011 | US

Parent Applications

Relation | Number | Date | Country
---|---|---|---
Parent | 13911072 | Jun 2013 | US
Child | 15787414 | | US
Parent | 13452882 | Apr 2012 | US
Child | 13911072 | | US