The present disclosure relates generally to a wearable device for emergency event evacuation and rescue.
Large facilities (e.g., buildings), such as commercial facilities, office buildings, hospitals, and the like, may have a fire alarm system that can be triggered during an emergency event (e.g., a fire) to warn occupants to evacuate. For example, a fire alarm system may include a fire control panel and a plurality of fire sensing devices (e.g., smoke detectors) located throughout the facility (e.g., on different floors and/or in different rooms of the facility) that can sense a fire occurring in the facility and provide a notification of the fire to the occupants of the facility via alarms.
When a fire or other emergency event occurs in a facility, it is important that the occupants of the facility be quickly and safely evacuated in an orderly and efficient manner. During many fires, smoke can be the greatest threat to the occupants of the facility, and the greatest obstacle to an effective evacuation. For instance, smoke inhalation can pose a significant health threat to the facility occupants. Further, smoke can make it difficult for firefighters or other emergency personnel to quickly and safely locate and rescue occupants in the facility who may be trapped or stranded.
A wearable device for emergency event evacuation and rescue is described herein. The wearable device can include a memory, and a processor configured to execute instructions stored in the memory to receive, while the wearable device is being worn by an occupant of a facility, a notification of an emergency event occurring in the facility, and transmit, responsive to receiving the notification of the emergency event occurring in the facility, a current location of the occupant in the facility and current health data of the occupant of the facility.
A wearable device for emergency event evacuation and rescue in accordance with the present disclosure can ensure that occupants of a facility (e.g., building) can be quickly and safely evacuated and rescued in an orderly and efficient manner during a fire or other emergency event occurring in the facility. For example, a wearable device for emergency event evacuation and rescue in accordance with the present disclosure can quickly alert an occupant of the facility of the fire or other emergency event, so that the occupant is aware of the emergency event and the need to evacuate the facility.
Further, a wearable device for emergency event evacuation and rescue in accordance with the present disclosure can make it easier for firefighters and/or other emergency personnel to quickly and safely locate and rescue occupants in the facility who may be trapped or stranded during the emergency event, especially when there is smoke present in the facility. Further, a wearable device for emergency event evacuation and rescue in accordance with the present disclosure can alert the firefighters and/or other emergency personnel of facility occupants who may be having health issues and/or are unconscious (e.g., due to smoke inhalation) during the emergency event. Accordingly, a wearable device for emergency event evacuation and rescue in accordance with the present disclosure can help the firefighters and/or other emergency personnel prioritize where to go (e.g., where they are needed) in the facility during the emergency event.
In the following detailed description, reference is made to the accompanying drawings that form a part hereof. The drawings show by way of illustration how one or more embodiments of the disclosure may be practiced.
These embodiments are described in sufficient detail to enable those of ordinary skill in the art to practice one or more embodiments of this disclosure. It is to be understood that other embodiments may be utilized and that mechanical, electrical, and/or process changes may be made without departing from the scope of the present disclosure.
As will be appreciated, elements shown in the various embodiments herein can be added, exchanged, combined, and/or eliminated so as to provide a number of additional embodiments of the present disclosure. The proportion and the relative scale of the elements provided in the figures are intended to illustrate the embodiments of the present disclosure and should not be taken in a limiting sense.
The figures herein follow a numbering convention in which the first digit or digits correspond to the drawing figure number and the remaining digits identify an element or component in the drawing. Similar elements or components between different figures may be identified by the use of similar digits. For example, 102 may reference element “02” in FIG. 1, and a similar element may be referenced as 302 in FIG. 3.
As used herein, “a”, “an”, or “a number of” something can refer to one or more such things, while “a plurality of” something can refer to more than one such thing. For example, “a number of components” can refer to one or more components, while “a plurality of components” can refer to more than one component.
As shown in FIG. 1, fire control system 100 can include a control panel 104.
Control panel 104 can be used (e.g., by a user) to monitor and/or control components (e.g., devices) of fire control system 100. For instance, the user can use control panel 104 to directly control the operation of (e.g., actions performed by) the components (not shown in FIG. 1) of fire control system 100.
The components being monitored and/or controlled by control panel 104 can be located throughout the facility (e.g., on different floors of the facility) and can be used to detect and/or manage a fire occurring in the facility, and/or to prevent a fire from occurring in the facility. For example, such components can include sensors (e.g., smoke detectors) that can sense a fire occurring in the facility, alarms that can provide a notification of the fire to the occupants of the facility, fans and/or dampers that can perform smoke control operations (e.g., pressurizing, purging, exhausting, etc.) during the fire, and/or sprinklers that can provide water to extinguish the fire, among other components.
As shown in FIG. 1, fire control system 100 can include a gateway device 106.
As shown in FIG. 1, fire control system 100 can include a network 108, a mobile device 110, and a computing device 112.
Gateway device 106 can communicate with computing device 112 via network 108, as illustrated in FIG. 1.
Gateway device 106 can also communicate with mobile device 110 via network 108, as illustrated in FIG. 1.
Network 108 can be a network relationship through which gateway device 106 and computing device 112 can communicate. Examples of such a network relationship can include a distributed computing environment (e.g., a cloud computing environment), a wide area network (WAN) such as the Internet, a local area network (LAN), a personal area network (PAN), a campus area network (CAN), or metropolitan area network (MAN), among other types of network relationships. For instance, network 108 can include a number of servers that receive information from, and transmit information to, gateway device 106 and computing device 112 via a wired or wireless network.
As used herein, a “network” can provide a communication system that directly or indirectly links two or more computers and/or peripheral devices and allows users to access resources on other computing devices and exchange messages with other users. A network can allow users to share resources on their own systems with other network users and to access information on centrally located systems or on systems that are located at remote locations. For example, a network can tie a number of computing devices together to form a distributed control network (e.g., cloud).
A network may provide connections to the Internet and/or to the networks of other entities (e.g., organizations, institutions, etc.). Users may interact with network-enabled software applications to make a network request, such as to get a file or print on a network printer. Applications may also communicate with network management software, which can interact with network hardware to transmit information between devices on the network.
A wearable device 102 for emergency event evacuation and rescue in accordance with the present disclosure is illustrated in FIG. 1. Wearable device 102 can be worn by an occupant of the facility.
Wearable device 102 can receive a notification (e.g., alert) of an emergency event, such as, for instance, a fire, occurring in the facility from control panel 104. For example, control panel 104 can determine the emergency event is occurring (e.g., detect the emergency event) based on signals (e.g., alarm signals) it receives from one or more of the components of the fire control system it is monitoring and/or controlling, and send a notification of the emergency event to wearable device 102 responsive to determining the emergency event is occurring (e.g., responsive to receiving the alarm signals). Control panel 104 can send (e.g., transmit) the notification of the emergency event to wearable device 102 via Bluetooth or other wireless network (not shown in FIG. 1).
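By way of illustration only, the following is a minimal sketch of the control-panel behavior just described, in which an incoming alarm signal is treated as an emergency event and a notification is broadcast to registered wearable devices. The EmergencyNotification record, the send_fn callback, and all other names in the sketch are hypothetical; the disclosure does not specify message formats or transport APIs.

```python
# Minimal sketch of the control-panel side: an alarm signal from any monitored
# fire sensing device is treated as an emergency event, and a notification is
# broadcast to every registered wearable device over a wireless link.
from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass
class EmergencyNotification:
    event_type: str      # e.g., "fire"
    facility_id: str
    timestamp: float


class ControlPanel:
    def __init__(self, wearable_ids: Iterable[str],
                 send_fn: Callable[[str, EmergencyNotification], None]):
        self.wearable_ids = list(wearable_ids)
        self.send_fn = send_fn   # stand-in for a Bluetooth/wireless transmit function

    def on_alarm_signal(self, event_type: str, facility_id: str, timestamp: float) -> None:
        # An alarm signal from a smoke detector (or other sensing device) means
        # an emergency event has been detected; notify every registered wearable.
        notification = EmergencyNotification(event_type, facility_id, timestamp)
        for wearable_id in self.wearable_ids:
            self.send_fn(wearable_id, notification)
```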
Responsive to receiving the notification of the emergency event, wearable device 102 can provide an alert of the emergency event to the occupant who is wearing the wearable device. Wearable device 102 can provide the alert by, for example, causing its display to flash, causing itself to vibrate, and/or displaying a message about the emergency event. The message can include a visual icon and/or text that indicates the type of emergency event. For instance, if the emergency event is a fire, the visual icon can be a flame, and the text can be “fire alarm”.
Responsive to receiving the notification of the emergency event, wearable device 102 can determine the current (e.g., real-time) location of the occupant in the facility, and transmit the current location of the occupant in the facility to gateway device 106. Wearable device 102 can transmit the current location of the occupant to gateway device 106 via Bluetooth or other wireless network (not shown in FIG. 1).
Responsive to receiving the notification of the emergency event, wearable device 102 can determine (e.g., measure) current (e.g., real-time) health data of the occupant of the facility, and transmit the current health data of the occupant in the facility to gateway device 106. For instance, wearable device 102 can transmit the current health data of the occupant of the facility to gateway device 106 concurrently with the current location of the occupant in the facility. Wearable device 102 can transmit the current health data of the occupant to gateway device 106 via Bluetooth or other wireless network (not shown in FIG. 1).
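The following sketch illustrates one possible wearable-side handler for the behavior described in the preceding paragraphs: alerting the wearer, then determining and transmitting the current location and current health data to the gateway device. The locate, read_health, and transmit callables, and the alert mapping, are illustrative stand-ins for hardware and radio interfaces that the disclosure does not detail.

```python
# Sketch of the wearable's response to a notification: alert the occupant,
# then determine and transmit the current location and current health data.
import time
from typing import Callable, Tuple

# Hypothetical mapping of event types to the alert icon and text shown on the
# wearable's display (e.g., a flame icon and "fire alarm" text for a fire).
ALERT_CONTENT = {"fire": ("flame", "fire alarm")}


class Wearable:
    def __init__(self,
                 occupant_id: str,
                 locate: Callable[[], Tuple[int, float, float]],   # (floor, x, y)
                 read_health: Callable[[], dict],                  # e.g., heart rate, consciousness
                 transmit: Callable[[dict], None]):                # send to gateway device
        self.occupant_id = occupant_id
        self.locate = locate
        self.read_health = read_health
        self.transmit = transmit

    def on_notification(self, event_type: str) -> None:
        # 1) Alert the occupant: flash the display, vibrate, and show a message.
        icon, text = ALERT_CONTENT.get(event_type, ("warning", event_type))
        print(f"[display flashes, device vibrates] {icon}: {text}")  # simulates the alert

        # 2) Determine the current location and current health data and
        #    transmit them together to the gateway device.
        floor, x, y = self.locate()
        self.transmit({
            "occupant_id": self.occupant_id,
            "location": {"floor": floor, "x": x, "y": y},
            "health": self.read_health(),
            "timestamp": time.time(),
        })
```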
Responsive to receiving the current location of the occupant in the facility and the current health data of the occupant of the facility, gateway device 106 can transmit (e.g., via network 108) the current location and current health data of the occupant to mobile device 110 and/or computing device 112. That is, mobile device 110 and/or computing device 112 can receive the current location of the occupant in the facility and the current health data of the occupant of the facility from wearable device 102 via gateway device 106.
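A minimal sketch of that relay step follows, assuming the gateway simply forwards each report it receives from a wearable device to one or more downstream receivers; the callback names are hypothetical.

```python
# Sketch of the gateway relay: reports received from wearable devices are
# forwarded, over the network, to the mobile device and/or computing device.
from typing import Callable, Dict, List


class Gateway:
    def __init__(self, forwards: List[Callable[[Dict], None]]):
        # One callback per downstream receiver (e.g., mobile device, computing device).
        self.forwards = forwards

    def on_wearable_report(self, report: Dict) -> None:
        # Relay each occupant's current location and health data as received.
        for forward in self.forwards:
            forward(report)
```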
Although a single wearable device is illustrated in FIG. 1, embodiments of the present disclosure are not so limited. For example, a plurality of wearable devices, each worn by a different respective occupant of the facility, can each transmit the current location and current health data of their respective occupant in an analogous manner.
Mobile device 110 and/or computing device 112 can display (e.g., on a user interface) the current location of each respective occupant in the facility and the current health data of each respective occupant of the facility received from the wearable devices. For example, the current location of each respective occupant in the facility can be displayed (e.g., represented) as a dot in a floor plan of the facility displayed on the user interface of mobile device 110 and/or computing device 112. The color of each respective dot displayed in the floor plan can correspond to the current health data of the facility occupant whose current location is represented by that dot. For instance, an occupant whose current health data is abnormal (e.g., bad) and/or indicates they are unconscious may be represented by a red dot, and an occupant whose current health data is normal (e.g., good) and indicates they are conscious may be represented by a green dot.
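One way the dot-color rule just described could be implemented is sketched below, assuming the health record carries simple "conscious" and "normal" flags; the actual health metrics and thresholds are not specified by the disclosure.

```python
# Sketch of the dot-color rule: red for abnormal and/or unconscious occupants,
# green for occupants whose health data is normal and who are conscious.
def dot_color(health: dict) -> str:
    if not health.get("conscious", True) or not health.get("normal", True):
        return "red"
    return "green"


# Example: an unconscious occupant is shown as a red dot.
assert dot_color({"conscious": False, "normal": True}) == "red"
assert dot_color({"conscious": True, "normal": True}) == "green"
```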
As an additional example, the floor plan can include (e.g., display) a plurality of areas (e.g., zones) of a facility, and the color of each respective area displayed in the floor plan can correspond to the quantity (e.g., density) of occupants currently located in that respective area. For example, zones in which a relatively larger quantity of occupants are currently located can be displayed (e.g., highlighted or shaded) in red, zones in which a relatively smaller quantity of occupants are currently located can be displayed in yellow, and zones in which no occupants are currently located can be displayed in green.
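A sketch of that zone-coloring rule follows; the numeric threshold is an illustrative assumption, since the disclosure only distinguishes relatively larger quantities (red), relatively smaller quantities (yellow), and zero occupants (green).

```python
# Sketch of zone coloring by the quantity (density) of occupants per zone.
from collections import Counter
from typing import Dict, Iterable


def zone_colors(all_zones: Iterable[str],
                occupant_zones: Iterable[str],
                high_threshold: int = 10) -> Dict[str, str]:
    counts = Counter(occupant_zones)   # zone id -> number of occupants currently located there
    colors = {}
    for zone in all_zones:
        count = counts.get(zone, 0)
        if count == 0:
            colors[zone] = "green"     # no occupants currently located in the zone
        elif count >= high_threshold:
            colors[zone] = "red"       # relatively large quantity of occupants
        else:
            colors[zone] = "yellow"    # relatively small quantity of occupants
    return colors
```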
If the current location of an occupant in the facility and/or the current health of an occupant in the facility changes during the emergency event, that change can be received from the wearable device being worn by that occupant, and mobile device 110 and/or computing device 112 can update the display to indicate (e.g., reflect) that change. For example, if an occupant moves to a different area (e.g., different zone) of the facility during the emergency event, the dot representing the current location of that occupant can move to that area in the floor plan in the display. As an additional example, if an occupant becomes unconscious during the emergency event, the color of the dot representing that occupant in the display can change (e.g., from green to red). As an additional example, if a relatively large quantity of occupants move to a different area of the facility during the emergency event, the color of that area in the floor plan in the display can change (e.g., from green or yellow to red).
In some embodiments, mobile device 110 and/or computing device 112 can determine and display directions to the current location of the occupant in the facility. For instance, the directions can be displayed (e.g., embedded) in the floor plan, and/or can be determined from the current location of the firefighter or other emergency personnel.
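One possible way to produce such directions is a shortest-path search over a grid representation of the floor plan, from the emergency personnel's current location to the occupant's current location, as sketched below; the disclosure does not specify how the directions are determined.

```python
# Illustrative breadth-first search over a walkable-cell grid of the floor plan.
from collections import deque
from typing import List, Optional, Tuple

Cell = Tuple[int, int]


def directions(grid: List[List[int]], start: Cell, goal: Cell) -> Optional[List[Cell]]:
    """Return a list of grid cells from start to goal, or None if unreachable.

    grid[r][c] == 1 means the cell is walkable; 0 means blocked (e.g., a wall).
    """
    rows, cols = len(grid), len(grid[0])
    came_from = {start: None}
    queue = deque([start])
    while queue:
        current = queue.popleft()
        if current == goal:
            # Reconstruct the path by walking the predecessor links backwards.
            path = []
            while current is not None:
                path.append(current)
                current = came_from[current]
            return path[::-1]
        r, c = current
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 1 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = current
                queue.append((nr, nc))
    return None
```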
As shown in FIG. 2, display 222 can include (e.g., display) a floor plan of the facility that includes a plurality of areas (e.g., zones) of the facility, such as areas 224-1 and 224-2.
As shown in FIG. 2, the current location of each respective occupant in the facility can be displayed (e.g., represented) as a dot in the floor plan, such as dot 226-1 representing the current location of an occupant in area 224-1.
Although not shown in FIG. 2 for clarity, the color of each respective dot displayed in the floor plan can correspond to the current health data of the facility occupant whose current location is represented by that dot, as previously described herein.
Although not shown in FIG. 2 for clarity, the color of each respective area displayed in the floor plan can correspond to the quantity of occupants currently located in that respective area, as previously described herein.
If the current location of an occupant in the facility and/or the current health of an occupant in the facility changes during the emergency event, that change can be received from the wearable device being worn by that occupant, and display 222 can be updated to indicate (e.g., reflect) that change. For example, if one of the occupants who is currently located in area 224-1 moves to area 224-2, the dot 226-1 representing the current location of that occupant can move to area 224-2. As an additional example, if one of the occupants who is currently located in area 224-1 becomes unconscious, the color of the dot 226-1 representing that occupant can change (e.g., from green to red). As an additional example, if all of the occupants who are currently located in area 224-1 move to area 224-2, the color of area 224-2 can change (e.g., from yellow to red).
As shown in FIG. 3, wearable device 302, mobile device 310, and computing device 312 can include a memory (e.g., memories 332, 342, and 352, respectively) and a processor (e.g., processors 334, 344, and 354, respectively). Wearable device 302, mobile device 310, and computing device 312 can be, for example, wearable device 102, mobile device 110, and computing device 112, respectively, previously described in connection with FIG. 1.
The memories 332, 342, and 352 can be any type of storage medium that can be accessed by processors 334, 344, and 354, respectively, to perform various examples of the present disclosure. For example, the memories 332, 342, and 352 can each be a non-transitory computer readable medium having computer readable instructions (e.g., computer program instructions) stored thereon that are executable by the processors 334, 344, and 354, respectively, for emergency event evacuation and rescue in accordance with the present disclosure.
The memories 332, 342, and 352 can be volatile or nonvolatile memory. The memories 332, 342, and 352 can also be removable (e.g., portable) memory, or non-removable (e.g., internal) memory. For example, the memories 332, 342, and 352 can be random access memory (RAM) (e.g., dynamic random access memory (DRAM) and/or phase change random access memory (PCRAM)), read-only memory (ROM) (e.g., electrically erasable programmable read-only memory (EEPROM) and/or compact-disc read-only memory (CD-ROM)), flash memory, a laser disc, a digital versatile disc (DVD) or other optical storage, and/or a magnetic medium such as magnetic cassettes, tapes, or disks, among other types of memory.
Further, although memories 332, 342, and 352 are illustrated as being located within wearable device 302, mobile device 310, and computing device 312, respectively, embodiments of the present disclosure are not so limited. For example, memories 332, 342, and/or 352 can also be located internal to another computing resource (e.g., enabling computer readable instructions to be downloaded over the Internet or another wired or wireless connection).
As shown in FIG. 3, wearable device 302, mobile device 310, and computing device 312 can also include a user interface (e.g., user interfaces 336, 346, and 356, respectively).
A user of wearable device 302, mobile device 310, and computing device 312 can interact with those respective devices via user interface 336, user interface 346, and user interface 356, respectively. For example, the user interface 336 can provide (e.g., display and/or present) information to the user of wearable device 302, and/or receive information from (e.g., input by) the user of wearable device 302. Further, the user interface 346 can provide information to the user of mobile device 310, and/or receive information from the user of mobile device 310. Further, the user interface 356 can provide information to the user of computing device 312, and/or receive information from the user of computing device 312. For instance, in some embodiments, user interface 336, user
interface 346, and/or user interface 356 can be a graphical user interface (GUI) that can provide and/or receive information to and/or from the users of wearable device 302, mobile device 310, and computing device 312, respectively. For example, user interface 336 can provide an alert of an emergency event to a user (e.g., facility occupant) who is wearing the wearable device, as previously described herein. Further, user interfaces 346 and/or 356 can display the current location of each respective occupant in a facility and the current health data of each respective occupant of the facility received from wearable devices being worn by the occupants, as previously described herein. The displays can each be, for instance, a touch-screen (e.g., the GUI can include touch-screen capabilities).
The user interfaces 336, 346, and 356 can each be localized to any language. For example, the user interfaces 336, 346, and 356 can each display information in any language, such as English, Spanish, German, French, Mandarin, Arabic, Japanese, Hindi, etc.
Although specific embodiments have been illustrated and described herein, those of ordinary skill in the art will appreciate that any arrangement calculated to achieve the same techniques can be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments of the disclosure.
It is to be understood that the above description has been made in an illustrative fashion, and not a restrictive one. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.
The scope of the various embodiments of the disclosure includes any other applications in which the above structures and methods are used. Therefore, the scope of various embodiments of the disclosure should be determined with reference to the appended claims, along with the full range of equivalents to which such claims are entitled.
In the foregoing Detailed Description, various features are grouped together in example embodiments illustrated in the figures for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the embodiments of the disclosure require more features than are expressly recited in each claim.
Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.