This disclosure relates to showing the location of, and relevant data about, personnel and sensors within an environment on top of a view of that environment.
The following description, given with respect to the attached drawings, may be better understood with reference to the non-limiting examples shown in the drawings.
It has long been desirable to provide enhanced situational awareness to first responders. For example, providing first responders with more information about their surrounding environment could improve rescue operations. Prior art devices have attempted to provide enhanced situational awareness to first responders by combining a virtual representation of an environment (e.g. a map or 3D representation of a building) with status information received from first responders and having a user interpret the relevance of the combination and communicate the relevance to first responders.
Prior art systems illustrated in
The following example provides an illustration of exemplary prior art systems. In an event where a one-story building is on fire, firefighters (i.e. first responders) arrive and enter the building. As they move around, they let the captain (i.e. user) know roughly where they are in the building (e.g. “I am entering the North-East corner”). The captain can use a map of the building to plot the locations of the firefighters in the building, or a more modern system might plot the locations on a map automatically, given sensors able to sense the locations of the responders in the building. The captain can then communicate information to the firefighters about their locations based on information from the map and/or from the captain's view of the building. Alternatively, the captain might use his view of the building for his own purposes without communicating information to the firefighters. In this example, dynamic information about the building (e.g. what parts are on fire and/or collapsing) is not combined with information received from the firefighters (e.g. their locations). That is, the captain must look at the map to determine where the firefighters are located, look at the building to see which parts are on fire, and integrate both types of information to determine whether any firefighters are in danger. Only after such integration can the captain communicate to the firefighters that they are in danger.
The system described herein uses augmented reality to show information received from first responders on top of a live view of the first responders' environment. By placing information received from first responders on top of a dynamic/live/real-time view of an environment, the system can provide an enhanced representation of the environment. This enhanced representation can be used to provide enhanced situational awareness for first responders. For example, in the scenario described above, if part of the building can be seen to be weakening under the fire, the captain will immediately be able to determine whether any of the firefighters are in danger by looking at the information superimposed on the dynamic view of the environment. The captain can then call the person at that location and instruct him to leave that area of the building. Further, if a responder is down, it is also possible to see the responder's location with respect to the actual building, which can be useful in determining the best way to reach the responder given the current conditions of the building. The system can also show the locations and values of sensors placed within the environment superimposed on top of a real-time view of the environment. For example, when a temperature sensor is dropped by a firefighter, the sensor's own tracking system (or the last location of the firefighter at the time he dropped the sensor) provides the location of the sensor. By showing data coming from the sensor on top of a real-time view of the environment, the captain can directly relate the sensor reading to a location in the environment.
The situational awareness system described herein makes use of Augmented Reality (AR) technology to provide the necessary view to the user. AR is like Virtual Reality, but instead of using completely artificial images (e.g. maps or 3D models), AR superimposes 3D graphics on a view of the real world. A very simple example of AR is used in televised football games to mark the first down with a yellow line. An example of an AR system that can be employed is one described in the examples of U.S. application Ser. No. 11/441,241 in combination with the present disclosure.
An AR visualization system comprises: a spatial database, a graphical computer, a viewpoint tracking device, and a display device.
The working principle of an Augmented Reality system is described below. A display device that displays dynamic images corresponding to a user's view is tracked. That is, the display's position and orientation are measured by a viewpoint tracking device. A spatial database and a graphical computer associate information with a real world environment. Associated information is superimposed on top of the dynamic display image in accordance with the display's position and orientation, thereby creating an augmented image.
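The working principle above can be sketched in code. The following is a minimal, hypothetical illustration (the function names and the trivial stand-in projection are assumptions, not part of the disclosure): the display's tracked pose determines which spatial-database entries are in view and where their labels belong on the augmented image.

```python
# Hypothetical sketch of the AR working principle: measure the display's
# pose, look up associated information in the spatial database, and compute
# where to superimpose each label on the dynamic display image.

def compose_augmented_frame(pose, spatial_db, project):
    """Return (label, screen position) pairs to superimpose for this pose."""
    overlay = []
    for label, world_pos in spatial_db.items():
        screen_pos = project(world_pos, pose)
        if screen_pos is not None:  # only annotate what is currently in view
            overlay.append((label, screen_pos))
    return overlay

# Trivial stand-in projection: the pose is reduced to an (x, y) view offset,
# and anything "behind" the offset is treated as out of view.
def project(world_pos, pose):
    sx, sy = world_pos[0] - pose[0], world_pos[1] - pose[1]
    if sx < 0 or sy < 0:
        return None  # outside this simplified field of view
    return (sx, sy)

frame = compose_augmented_frame(
    (10, 10), {"John 80%": (12, 15), "Exit": (1, 1)}, project
)
```

In a real system, `project` would use the full position and orientation measured by the viewpoint tracking device, as described below.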
Computer 110 collects information from sensors 116a worn by first responder 100 and sensors 116b placed throughout surrounding environment 108. Sensors 116a include sensors that allow a first responder's location to be monitored and can include sensors that provide information about the state of the first responder's health (e.g. temperature, heart rate, etc.), the first responder's equipment (e.g. capacity and/or power level of equipment), and conditions within the first responder's immediate proximity (e.g. temperature, air content). Sensors 116b can include any sensors that can gather information about the conditions of the environment 108. Examples of such sensors 116b include temperature sensors, radiation sensors, smoke detectors, gas sensors, wind sensors, pressure sensors, humidity sensors and the like. It should be noted that although
Computer 110 updates database 118 with the information received from sensors 116a and 116b, and database 118 stores this information. Database 118 may additionally contain model information about the environment 108, such as a 3D model of a building. Model information may be used to provide advanced functionality in the system, but is not necessary for the basic system implementation. Graphical computer 110 continuously renders information from the database 118, thereby showing a first responder's location within the environment 108 and generating graphics from current information received from sensors 116a and 116b. Because a sensor does not move once it is installed by a firefighter, it is not necessary for each sensor to carry its own tracking device; the location of the firefighter at the moment the sensor was dropped (or activated) can be used as the location of the sensor.
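The database behavior just described can be sketched as follows, using hypothetical names and a plain dictionary in place of database 118: a dropped sensor, which has no tracker of its own, is assigned the responder's position at the moment of the drop, and keeps that position even after the responder moves on.

```python
# Hypothetical sketch of the spatial database update: responders are tracked
# continuously, while a dropped sensor inherits the dropping responder's
# position once and keeps it thereafter.

database = {}  # id -> record with a fixed or tracked position

def update_responder(db, responder_id, position, status):
    db[responder_id] = {"position": position, "status": status}

def drop_sensor(db, sensor_id, responder_id, reading=None):
    # The sensor does not move after placement, so its position is the
    # responder's position at drop time, not a live-tracked value.
    db[sensor_id] = {"position": db[responder_id]["position"], "reading": reading}

def update_sensor_reading(db, sensor_id, reading):
    db[sensor_id]["reading"] = reading

update_responder(database, "john", (12.0, 3.0, 0.0), {"oxygen": 0.80})
drop_sensor(database, "temp-1", "john")
update_responder(database, "john", (20.0, 5.0, 0.0), {"oxygen": 0.78})
update_sensor_reading(database, "temp-1", 435)
```

After these updates, the sensor record still carries the position where it was dropped, while the responder record reflects his current location.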
Computer 110 also receives information about the viewpoint of the display device 124, captured by the tracking device 122. Computer 110 takes information from database 118 and tracking information about the viewpoint of the display device 124 and renders current information from sensors 116a and 116b in relation to the current view of the display device 124 by using a common 3D projection process. By measuring in real time the position and orientation of the display 124 (i.e. determining the user's viewpoint), it is possible to align information rendered from the spatial database 118 with the corresponding viewpoint.
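A common 3D projection process of the kind referred to above can be sketched as follows. This is a simplified, hypothetical pinhole-camera model (the pose is reduced to a position and a yaw angle; a real system would use the full orientation measured by a tracking device such as 122): a world point is transformed into the display's frame and projected onto the image plane.

```python
import math

def world_to_screen(point, cam_pos, cam_yaw, focal=800.0, cx=640.0, cy=360.0):
    """Project a world point (x, y up, z) to pixel coordinates, or None if behind."""
    # Translate into camera-centered coordinates.
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    dz = point[2] - cam_pos[2]
    # Rotate about the vertical (y) axis by the camera's yaw.
    c, s = math.cos(cam_yaw), math.sin(cam_yaw)
    x = c * dx - s * dz
    z = s * dx + c * dz
    if z <= 0:
        return None  # point is behind the viewpoint; nothing to draw
    # Pinhole projection onto the image plane (cx, cy is the image center).
    return (cx + focal * x / z, cy - focal * dy / z)
```

A point straight ahead of the display projects to the image center; as the measured pose changes, the projected position of each database entry moves accordingly, keeping the graphics aligned with the real world.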
The display device 124 is able to show the image generated by the graphical computer 110 superimposed on a view of the surrounding environment 108 as “seen” by or through the display device 124. Thus, user 102 has a global perspective of environment 108 with information superimposed thereon and is able to use this enhanced global perspective of environment 108 to communicate information to first responder 100, thereby efficiently providing first responder 100 with information about environment 108 that would not otherwise be available to first responder 100.
The three exemplary configurations (optical see-through, collocated camera and display, and camera and display at different locations) described above are mentioned for understanding the implementation of an AR system and are not intended to be limiting. Any AR system that is able to superimpose graphics that appear attached to the real world could be used.
It should be noted that the elements shown in
Also displayed next to John and Joe's names is information regarding the status of each. In this example, the percentage represents the level of oxygen that each has in his oxygen tank. Here John's oxygen tank is 80% full and Joe's tank is 90% full. This can provide the captain with an idea of how much time John and Joe have to operate inside the building. Avatars can alternatively be used to represent the first responders or any of the information received from them. There are numerous known avatars used in the electronic gaming art which could be incorporated into the system. Further, graphical information can represent the combined status of John and Joe, e.g. an indicator that represents a combined oxygen level. Alternatively, both could be shown using an aggregated symbol (e.g. a single symbol for a team of responders operating close together), to reduce display clutter.
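The aggregation alternative mentioned above can be sketched as follows, with hypothetical names and a simple greedy grouping by distance: responders operating close together collapse into one symbol, and the group's oxygen indicator is the minimum member level, i.e. the most conservative estimate of remaining operating time.

```python
# Hypothetical sketch of aggregating nearby responders into one display symbol
# to reduce clutter. Each group keeps a worst-case (minimum) oxygen level.

def aggregate_close_responders(responders, max_dist=5.0):
    """Greedily group responders within max_dist of a group's anchor position."""
    groups = []
    for name, (x, y), oxygen in responders:
        for g in groups:
            gx, gy = g["position"]
            if ((x - gx) ** 2 + (y - gy) ** 2) ** 0.5 <= max_dist:
                g["members"].append(name)
                g["oxygen"] = min(g["oxygen"], oxygen)  # conservative indicator
                break
        else:
            groups.append({"position": (x, y), "members": [name], "oxygen": oxygen})
    return groups

team = aggregate_close_responders(
    [("John", (0.0, 0.0), 0.80), ("Joe", (2.0, 1.0), 0.90)]
)
```

Here John and Joe, a few meters apart, would be drawn as one team symbol labeled with John's lower oxygen level.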
Shown above the representations of John and Joe is data coming from sensors that have been dropped inside the building. In this exemplary embodiment, the sensors are temperature sensors dropped somewhere in the burning building. One such sensor is supplying a temperature reading of 435 degrees, as shown. Other types of sensors and additional temperature sensors can be placed throughout the building.
Although the principles of the exemplary system are illustrated using the example of a situational-awareness system for firefighters, exemplary systems can also be implemented in the following applications: a system showing “blue force” (friendly) locations in military operations in urban environments; a system showing the locations of workers inside a building; a system used by security personnel showing the location of an alarm sensor that has been triggered; and a system used by maintenance personnel to show the location of, and data about, a sensor or a set of sensors in a plant/building.
In the case where first responders 100 see graphics superimposed on their individual views, first responders 100 might be using a helmet-mounted, wrist-mounted, or PDA/tablet display to see the information aligned with the real world environment 108. This display 124 would show the same information, such as the locations of and data about responders 100 and sensors 116b, or any other useful information. If a responder 100 needs assistance, it becomes easy for other responders to come to help, because they can see where the responder 100 is with respect to the environment 108 and can see how to reach the responder 100 while avoiding obstacles.
While the invention has been described in connection with what is presently considered to be the most practical and preferred embodiment, it is to be understood that the invention is not to be limited to the disclosed embodiment, but on the contrary, is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
The present application is related to the following co-pending U.S. Patent applications, the entire contents of each of which are incorporated herein by reference:
1. U.S. application Ser. No. 11/441,241 entitled “System and Method to Display Maintenance and Operation Instructions of an Apparatus Using Augmented Reality,” filed May 26, 2006;
2. U.S. application Ser. No. 11/______ entitled “Augmented Reality-Based System and Method Providing Status and Control of Unmanned Vehicles,” filed Mar. 8, 2007; and
3. U.S. application Ser. No. 11/516,545 entitled “Method and System for Geo-Referencing and Visualization of Detected Contaminants,” filed Sep. 7, 2006.