This application relates generally to data processing and, more specifically, to systems and methods for interactive emergency information and identification.
During a catastrophic event, people rely on TV, radio, and other forms of media-related devices for immediate information on all aspects of the event, including the locations, the people involved, the responding agencies, and the victims. At the same time, the average system provides no immediate flow of event information to the individual person, employee, or management in a controlled environment in the vicinity of the event. However, timely response in emergency situations depends on accurate and up-to-date information about the emergency situation itself, affected persons, and their state. Prompt acquisition and exchange of such data can be essential in such situations. Audiovisual surveillance systems may require thorough analysis to detect all affected persons. Additionally, deployment of surveillance systems involves high investment and is generally perceived negatively by the public. Historically, state, local, and federal agencies have used systems based on radio communications, including mobile data terminals (MDTs) in emergency response vehicles. These systems rely on witnesses on scene to provide approximate data for correlation to the event that has just occurred.
Moreover, conventional systems can neither provide personalized information and guidelines to individuals affected by an emergency situation nor request and receive information related to the emergency situation from those individuals.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Provided are systems and methods for interactive emergency information and identification. The interactive emergency information and identification system may comprise a processor and a database. The processor may be configured to receive a notification about an emergency situation. The notification may include a location associated with the emergency situation. Based on the location, a geo-fence may be defined. The geo-fence may be a physical area of varying radii around the location. The geo-fence may be pre-defined by a user of the interactive emergency information and identification system or, alternatively, may be defined by the processor based on the notification and other information retrieved in relation to the emergency situation. Additionally, location information associated with the locations of a plurality of user devices may be received. The locations of the user devices may be determined based on multilateration of radio signals between radio towers, triangulation of a GPS signal associated with each of the plurality of user devices, WiFi positioning, Bluetooth sensor signals, and so forth. The user devices are associated with individuals, so a position of an individual within the geo-fence can be determined based on the location information. The position may include a proximity zone associated with the position of the individual. The individual may be informed about the emergency situation and provided with a functionality to give feedback via a user interface associated with the user device. The feedback may be received from the individual by the processor. The feedback may include a request for help, a statement that no help is required, an assessment of the emergency situation, audio data, video data, text data, and so forth. Additionally, the processor may be configured to transmit the feedback to one or more emergency agencies.
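For illustration only, the entities named above (the notification, the geo-fence, and the feedback) might be modeled as plain data records along the following lines. This is a minimal sketch; every name and field is a hypothetical choice, not part of the disclosure:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class EmergencyNotification:
    """Notification about an emergency situation, including its location."""
    event_type: str            # e.g., "shooting", "fire"
    latitude: float
    longitude: float
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class GeoFence:
    """Physical area of varying radii around the emergency location."""
    center_lat: float
    center_lon: float
    radii_m: tuple             # ascending proximity-zone boundaries, in meters

@dataclass
class Feedback:
    """Feedback received from an individual via the user device."""
    device_id: str
    kind: str                  # "help_request", "no_help_needed", "assessment", ...
    media: Optional[bytes] = None   # optional audio/video/photo payload
    text: Optional[str] = None
```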
The database may be communicatively coupled to the processor and configured to store at least the notification, the position of the individual, and the feedback.
In some embodiments, the processor may be further configured to provide emergency instructions associated with the emergency situation. The emergency instructions may be based on an emergency action plan associated with the emergency situation, and/or other data.
In further exemplary embodiments, modules, subsystems, or devices can be adapted to perform the recited steps. Other features and exemplary embodiments are described below.
Embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements and in which:
Interactive emergency information and identification systems and methods are described herein. In case of an emergency situation, such as a shooting, a terrorist attack, and so forth, individuals in proximity to the location of the emergency situation may be determined using the location services of their user devices (e.g., a smart phone or a tablet PC). The individuals within a certain distance from the location of the emergency situation may be informed about the emergency situation and requested to provide feedback via different platforms. The feedback may be provided by B2B partners, state or local entities, and/or one or more civilian-level users. Civilian-level users or individuals may provide information concerning their condition, their safety, and/or whatever information they may have concerning the emergency situation. Audio, video, and/or text data may be received from the individuals via their devices. For example, a photo of an active shooter or a video of a terrorist attack may be received. The received feedback may be forwarded to law enforcement or other appropriate agencies.
Additionally, data from various sources, such as local Emergency Action Plans or other specific plans, may be retrieved. Emergency instructions relevant to the emergency situation affecting the individuals may be extracted from the data and provided to the individuals via a user interface of their devices. For example, emergency instructions may be provided in a graphical form as directions on a map displayed on the user device. At the same time, the current position of the individual may be displayed on the map.
In some embodiments, an interactive emergency information and identification system may be used to request assistance in an emergency situation. Thus, a user may send an emergency notification and/or additional data related to the emergency via the user device. The user geographical position may be determined, and local emergency agencies may be informed about the emergency situation affecting the user. Additionally, emergency instructions may be retrieved based on the geographical position of the user and provided to the user via a graphical interface of the user device.
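A minimal sketch of how such an assistance request might be composed and sent from the user device follows; the endpoint URL and payload fields are hypothetical assumptions, and a deployed system would use the operator's actual API:

```python
import json
import urllib.request

def send_assistance_request(lat: float, lon: float, details: str,
                            endpoint: str = "https://example.invalid/api/assist"):
    """Compose an assistance request with the user's position and POST it.

    The endpoint and payload fields are hypothetical; they only illustrate
    that the request carries the geographical position of the user."""
    payload = json.dumps({
        "latitude": lat,
        "longitude": lon,
        "details": details,
    }).encode("utf-8")
    req = urllib.request.Request(endpoint, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:   # network call; may raise URLError
        return resp.status
```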
Referring now to the drawings,
The user device 130 may include a mobile telephone, a computer, a laptop, a smart phone, a tablet PC, and so forth. The user device 130, in some example embodiments, may include a Graphical User Interface (GUI) for displaying the user interface associated with the interactive emergency information and identification system 200. The user device 130 may also include a mobile transceiver assembly that may be used to determine a location of the user device. Determining the location may be enabled by a GPS receiver, a WiFi receiver, and/or a Bluetooth receiver.
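Since the device may hold fixes from several of these receivers at once, one simple and purely illustrative fusion strategy is to prefer the fix with the smallest uncertainty radius:

```python
from dataclasses import dataclass

@dataclass
class LocationFix:
    source: str        # "gps", "wifi", "bluetooth", "cell"
    latitude: float
    longitude: float
    accuracy_m: float  # estimated radius of uncertainty, in meters

def best_fix(fixes: list[LocationFix]) -> LocationFix:
    """Pick the fix with the smallest uncertainty radius."""
    return min(fixes, key=lambda f: f.accuracy_m)

# Example: GPS typically wins outdoors, WiFi positioning indoors.
fixes = [LocationFix("gps", 38.8977, -77.0365, 8.0),
         LocationFix("wifi", 38.8976, -77.0366, 25.0)]
print(best_fix(fixes).source)   # -> "gps"
```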
The individual 120 may be a bearer of the user device 130 who may interact with the interactive emergency information and identification system 200 and/or the responder 170 via a GUI. The responder 170 may communicate with the interactive emergency information and identification system 200 via the work station 180 or otherwise.
The interactive emergency information and identification system 200 may be operated by a security company 140 and may communicate with a B2B partner to exchange data related to an emergency situation and user contracts. Additionally, the interactive emergency information and identification system 200 may communicate with emergency and law enforcement agencies 160, for example, rescue services, fire emergency services, the FBI, a governmental operations center, and so forth. Thus, the interactive emergency information and identification system 200 may receive notifications associated with emergency situations, emergency action plans, and other data from the emergency and law enforcement agencies 160. Additionally, the interactive emergency information and identification system 200 may transmit information about one or more individuals in proximity to the location of the emergency situation as well as audio, video, and/or text data received from the individual 120 to the emergency and law enforcement agencies 160.
The processor 210 may receive location information associated with locations of user devices. The location information may be received based on the defined geo-fence. Since the user devices are associated with individuals, the processor 210 may determine a position of an individual within the geo-fence based on the location information. The position may include a proximity zone associated with the position of the individual.
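One possible realization of this determination, sketched below under the assumption that the geo-fence is a set of concentric circles with ascending radii, computes the great-circle distance from the device to the emergency location and maps it to a zone index:

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in meters."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def proximity_zone(device_lat, device_lon, center_lat, center_lon, zone_radii_m):
    """Return the index of the innermost proximity zone containing the
    device, or None if the device is outside the geo-fence.
    zone_radii_m must be ascending, e.g., (50, 200, 500)."""
    d = haversine_m(device_lat, device_lon, center_lat, center_lon)
    for i, r in enumerate(zone_radii_m):
        if d <= r:
            return i
    return None
```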
The processor 210 may inform the individual about the emergency situation via a user interface of the user device. Additionally, the individual may be provided with a functionality to give feedback related to the emergency situation. The feedback may be received by the processor 210 and may include a request for help, a statement that no help is required, an assessment of the emergency situation, audio information, video information, text information on the emergency situation, and so forth.
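The feedback kinds listed above could be handled along the following illustrative lines, with storage and agency notification injected as callables; all names are hypothetical:

```python
from enum import Enum

class FeedbackKind(Enum):
    HELP_REQUEST = "help_request"
    NO_HELP_NEEDED = "no_help_needed"
    ASSESSMENT = "assessment"
    MEDIA = "media"            # audio, video, or photo evidence

def handle_feedback(kind: FeedbackKind, device_id: str, payload,
                    store, notify_agency):
    """Persist every feedback record; escalate help requests immediately.

    `store` and `notify_agency` are injected callables standing in for the
    database write and the transmission to an emergency agency."""
    store(device_id, kind.value, payload)
    if kind is FeedbackKind.HELP_REQUEST:
        notify_agency(device_id, payload)
```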
Notification about the emergency situation, location of the emergency situation, individuals located in proximity to the emergency situation, and feedback of the individuals may be stored in the database 220 and may be accessible for an operator of the system 200, one or more responders, representatives of emergency or law enforcement agencies, and so forth.
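One plausible relational layout for such a database is sketched below using SQLite for brevity; the table and column names are illustrative assumptions, not taken from the disclosure:

```python
import sqlite3

conn = sqlite3.connect(":memory:")   # a real deployment would use a server DB
conn.executescript("""
CREATE TABLE notification (
    id          INTEGER PRIMARY KEY,
    event_type  TEXT NOT NULL,
    latitude    REAL NOT NULL,
    longitude   REAL NOT NULL,
    received_at TEXT NOT NULL
);
CREATE TABLE position (
    device_id       TEXT NOT NULL,
    notification_id INTEGER REFERENCES notification(id),
    latitude        REAL NOT NULL,
    longitude       REAL NOT NULL,
    zone            INTEGER,          -- proximity zone index, NULL if outside
    observed_at     TEXT NOT NULL
);
CREATE TABLE feedback (
    device_id       TEXT NOT NULL,
    notification_id INTEGER REFERENCES notification(id),
    kind            TEXT NOT NULL,    -- 'help_request', 'no_help_needed', ...
    body            BLOB,
    received_at     TEXT NOT NULL
);
""")
```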
As shown in
At operation 320, a geo-fence for the emergency situation may be defined. The geo-fence may be defined automatically based on the description, classification, and/or type of the emergency situation. Alternatively, the geo-fence may be predefined by an individual whose user device interacts with the interactive emergency information and identification system or by an operator of the system. The geo-fence may include two or more proximity zones. Zones may be differentiated based on proximity to the location of the emergency situation.
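As an illustration of automatic definition by event type, default zone radii might be looked up from a table such as the following; the radii are placeholder values, not values taken from the disclosure:

```python
# Illustrative defaults only; real radii would come from the emergency
# action plan or an operator.
DEFAULT_ZONES_M = {
    "shooting":  (50, 200, 500),
    "fire":      (100, 400, 1000),
    "flood":     (500, 2000, 5000),
}

def define_geofence(event_type: str) -> tuple:
    """Return ascending proximity-zone radii (meters) for an event type,
    falling back to a generic fence for unknown types."""
    return DEFAULT_ZONES_M.get(event_type, (100, 500, 1500))
```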
At operation 330, location information associated with the locations of user devices may be received. The location information may be determined via multilateration of radio signals between radio towers, triangulation of a GPS signal associated with each of the user devices, WiFi positioning, and Bluetooth sensor signals. The location information may be associated with the geo-fence. The user devices may include mobile phones, smart phones, tablet PCs, laptops, and so forth. The user devices may be carried by individuals, and the locations of the user devices may thus indicate the individuals' locations. Based on the location information and the geo-fence, a position of an individual within the geo-fence may be determined at operation 340. The position may include a proximity zone associated with the individual.
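For the multilateration mentioned above, a common textbook approach linearizes the range equations and solves them by least squares. The sketch below uses a planar approximation (coordinates in meters) and is illustrative only:

```python
import numpy as np

def multilaterate(anchors, distances):
    """Estimate a 2D position from >= 3 anchors (e.g., radio towers) and
    measured ranges, via linearized least squares.

    anchors:   (n, 2) planar anchor coordinates, meters
    distances: (n,) measured ranges, meters
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    # Subtracting the first range equation from the rest removes the
    # quadratic terms, leaving a linear system A @ [x, y] = b.
    A = 2 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Example: three towers, true position at (10, 20) meters.
towers = [(0, 0), (100, 0), (0, 100)]
true = np.array([10.0, 20.0])
ranges = [np.linalg.norm(true - t) for t in towers]
print(multilaterate(towers, ranges))   # ~ [10. 20.]
```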
At operation 350, the individual may be informed about the emergency situation via a user interface of the user device associated with the individual. The individual may be informed by a message displayed on a screen of the user device (for example, as a push message). In some embodiments, the message may depend on the proximity zone associated with the individual. Additionally, at operation 360, a functionality to give feedback may be provided to the individual via the user interface, and the feedback may be received at operation 370. Thus, information on the state of the individual may be requested. In such a way, the interactive emergency information and identification system may receive information on the number and state of individuals affected by the emergency situation. Moreover, audio, video, text, and other data related to the emergency situation may be received from the individual. The data may include, for example, a photo of a shooter in a shooting event, information on suspicious activity noticed by the individual, and so forth.
At optional operation 380, the data related to the feedback of the individual and location information may be distributed to corresponding agencies, B2B partners, and/or individual users. The volume and details of the data provided to B2B partners and users may depend on agreements and settings with these partners and/or users.
The data may also be transmitted to corresponding agencies, which may use it to facilitate emergency situation management and relief.
In some embodiments, emergency instructions associated with the emergency situation may be provided to the individual via the user interface (for example, as a text or as graphical instructions). The emergency instructions may be based on an emergency action plan associated with the emergency situation, instructions provided by corresponding agencies, and so forth. Additionally, the instructions may vary depending on the proximity zone associated with the position of the individual. For example, the individual within 10 meters of the shooter may receive instructions to stay in place, while the individual within a 50-100 meter proximity zone may receive instructions to move away from the shooter.
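The zone-dependent rules in this example could be encoded as a simple distance-to-instruction mapping; the thresholds and wording below merely mirror the example above and are not prescriptive:

```python
def instructions_for_zone(distance_m: float) -> str:
    """Map distance from the threat to an instruction, mirroring the
    example above (thresholds and texts are illustrative)."""
    if distance_m <= 10:
        return "Stay in place, take cover, and remain quiet."
    if distance_m <= 50:
        return "Take cover and await further instructions."
    if distance_m <= 100:
        return "Move away from the threat along the indicated route."
    return "Avoid the area until the all-clear is given."
```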
The current position of the individual may be continuously monitored, and the actions of the individual may be coordinated. For example, the individual may be informed that he is approaching a fire or moving away from a rescue team, or may be informed of recommended movement directions.
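Such coordination can be approximated by comparing successive distances to a hazard, as in the following sketch; coordinates are assumed to be planar (meters) after local map projection, and the jitter threshold is an illustrative value:

```python
import math

def trend_alert(prev_xy, curr_xy, hazard_xy, threshold_m=5.0):
    """Flag whether an individual is approaching or leaving a hazard.

    Coordinates are planar (meters); the threshold filters out ordinary
    positioning jitter between successive fixes."""
    d_prev = math.dist(prev_xy, hazard_xy)
    d_curr = math.dist(curr_xy, hazard_xy)
    if d_curr < d_prev - threshold_m:
        return "warning: approaching the hazard"
    if d_curr > d_prev + threshold_m:
        return "moving away from the hazard"
    return None

print(trend_alert((120, 0), (100, 0), (0, 0)))  # -> warning: approaching the hazard
```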
In some embodiments, a user of the interactive emergency information and identification system may send an assistance request. The system may receive the request and provide assistance to the user. The assistance may include informational assistance, transmitting the assistance request to an emergency agency, first aid service, and so forth.
A physical area of certain radii may be defined as a geo-fence 502 of the emergency situation as shown by
Location information associated with the locations of user devices near the geo-fence 502 may be processed to determine the user devices within the geo-fence 502. The locations of the user devices inside the geo-fence 502 may be taken as positions 506 of individuals inside the geo-fence 502. Additionally, user devices outside of, but in proximity to, the geo-fence 502 may be taken as positions 508 of individuals near the geo-fence 502. Screen 500 illustrates the positions 506 and 508 defined on the map 402 in relation to the location 404 of the emergency situation. Each of the positions 506 may be associated with a proximity zone within the geo-fence 502.
The screen 500 may be displayed to the operator 410 to visualize positions and movements of the individuals in relation to the emergency situation 404 in real time. Each of the positions 506, 508 may be accompanied by brief information associated with the individual. The information may be updated in real time and include name, age, state, phone number, and other data related to the individual.
In some embodiments, the operator 410 may connect and communicate with individuals via the administrator's interface, for example, by phone, text messages, and so forth. The connection may be automated using the administrator's interface. Thus, the operator 410 may call one of the individuals without having to dial a phone number: the operator 410 may simply activate an interface control element, and the system 200 will establish the connection automatically.
Additionally, a functionality to give feedback may be provided to the individual. Thus, the individual may send a request for help by activating an “I need help” button 608, or may define his state as satisfactory by activating an “I'm OK” button 610.
Furthermore, the interactive emergency information and identification system may provide a functionality to send data associated with the emergency situation. Thus,
Alternatively, the emergency instructions may include text, audio, or video messages, or any other form of communication.
Data received from individuals (e.g., feedback on the status of the individuals) may be analyzed by the system 200. Based on the analysis, consolidated data representing the safety of each individual may be generated. The consolidated data may be provided to an operator via an administrator's interface.
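The consolidation step might amount to a simple aggregation of the latest per-individual status reports, as in this sketch; the status labels are illustrative assumptions:

```python
from collections import Counter

def consolidate(statuses: dict) -> dict:
    """Summarize per-individual safety reports for the operator view.

    statuses maps device_id -> 'ok', 'needs_help', or None for individuals
    who have not responded yet."""
    counts = Counter("no_response" if s is None else s
                     for s in statuses.values())
    counts["total"] = len(statuses)
    return dict(counts)

print(consolidate({"a": "ok", "b": "needs_help", "c": None}))
# -> {'ok': 1, 'needs_help': 1, 'no_response': 1, 'total': 3}
```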
An example screen 900 displaying reported safety status of the individuals in real time is illustrated by
The example computer system 1000 includes a processor or multiple processors 1002, a hard disk drive 1004, a main memory 1006 and a static memory 1008, which communicate with each other via a bus 1010. The computer system 1000 may also include a network interface device 1012. The hard disk drive 1004 may include a computer-readable medium 1020, which stores one or more sets of instructions 1022 embodying or utilized by any one or more of the methodologies or functions described herein. The instructions 1022 can also reside, completely or at least partially, within the main memory 1006 and/or within the processors 1002 during execution thereof by the computer system 1000. The main memory 1006 and the processors 1002 also constitute machine-readable media.
While the computer-readable medium 1020 is shown in an exemplary embodiment to be a single medium, the term “computer-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “computer-readable medium” shall also be taken to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the machine and that causes the machine to perform any one or more of the methodologies of the present application, or that is capable of storing, encoding, or carrying data structures utilized by or associated with such a set of instructions. The term “computer-readable medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media. Such media can also include, without limitation, hard disks, floppy disks, NAND or NOR flash memory, digital video disks (DVDs), RAM, ROM, and the like.
The exemplary embodiments described herein can be implemented in an operating environment comprising computer-executable instructions (e.g., software) installed on a computer, in hardware, or in a combination of software and hardware. The computer-executable instructions can be written in a computer programming language or can be embodied in firmware logic. If written in a programming language conforming to a recognized standard, such instructions can be executed on a variety of hardware platforms and can interface with a variety of operating systems. Although not limited thereto, computer software programs for implementing the present method can be written in any number of suitable programming languages, such as, for example, C, C++, or C#, and can be processed using compilers, assemblers, interpreters, or other computer languages or platforms.
Thus, various interactive emergency information and identification systems and methods have been described. Although embodiments have been described with reference to specific example embodiments, it will be evident that various modifications and changes may be made to these embodiments without departing from the broader spirit and scope of the system and method described herein. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.