The present disclosure relates to methods, devices, and systems for building system maintenance using mixed reality.
Building systems can be installed in a building to manage aspects of the building. Building systems can include, for example, heating, ventilation, and air conditioning (HVAC) systems, access control systems, security systems, lighting systems, and fire systems, among others. A building system can refer to a single building system (e.g., an HVAC system) or multiple building systems. A building management system (BMS) can manage a system in a single building, multiple systems in a single building, and/or multiple systems across a number of buildings.
Maintenance of building systems can be accomplished by various users. For example, building maintenance personnel may perform maintenance on various devices included in building systems. Additionally, other users such as technicians and/or engineers may perform maintenance on various devices in building systems. In some examples, engineers and/or technicians from a manufacturer of a device may travel to a site of the building to perform maintenance on various devices in building systems.
Devices, methods, and systems for building system maintenance using mixed reality are described herein. For example, a mixed reality computing device for building system maintenance can include a mixed reality display, a memory, and a processor to execute executable instructions stored in the memory to receive a work order for a device in a building, determine a location of the mixed reality computing device in the building, and display virtual information about the device on the mixed reality display based on the location of the mixed reality computing device in the building, where the displayed virtual information includes information about fixing a fault of the device, and where the virtual information displayed on the mixed reality display is overlaid over an area of the mixed reality display based on a field of view of the mixed reality computing device.
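By way of non-limiting illustration, the following minimal Python sketch shows one way the flow summarized above could be organized as executable instructions; all names (e.g., MixedRealityMaintenanceApp, receive_work_order, overlay) are hypothetical placeholders rather than any particular embodiment's API.

```python
# Illustrative sketch only: receive a work order, determine the location of
# the mixed reality computing device, and overlay virtual information on the
# mixed reality display. All names here are hypothetical placeholders.

class MixedRealityMaintenanceApp:
    def __init__(self, display, locator, bms_client):
        self.display = display    # mixed reality display
        self.locator = locator    # location-determination component
        self.bms = bms_client     # link to the building management system

    def run(self):
        work_order = self.bms.receive_work_order()   # e.g., via a wireless connection
        location = self.locator.current_location()   # where the device is in the building
        if location == work_order.device_location:
            # Overlay information about fixing the fault over the area of the
            # display corresponding to the device in the field of view.
            self.display.overlay(work_order.fault_info,
                                 anchor=work_order.spatial_anchor)
```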
Building system maintenance can be performed by various users, including maintenance personnel, technicians, engineers, and/or other specialized users such as technicians and/or engineers from a manufacturer of a device utilized in the building. Building system maintenance can include regularly scheduled maintenance, servicing of devices, tuning of devices, validation of devices, and/or troubleshooting of devices, among other types of building system maintenance.
During building system maintenance, delays may occur. For example, specialized maintenance technicians may have to travel to the site of the building to perform building maintenance. Further, a specialized maintenance technician may not be available to travel to the building site because of scheduling, travel time, travel distance, etc. In some examples, on-site technicians, engineers, etc. may not have the expertise to perform certain building system maintenance functions. These or other scenarios may delay the maintenance of a particular device, and delayed maintenance of one device may cause a cascade of further delays. These types of delays may result in damage to building systems, building system downtime, and/or loss of money.
Devices, methods, and systems for building system maintenance using mixed reality described herein can be utilized to enable a user to perform maintenance activities utilizing a mixed reality display. For example, a mixed reality computing device can be utilized to receive a work order and display virtual information about a device included in the work order. A user can utilize the mixed reality computing device to perform activities included in the work order on various devices and/or equipment included in the building. For example, the user can utilize virtual information about the device displayed on a mixed reality display of the mixed reality computing device to perform various maintenance and/or other activities.
Building system maintenance using mixed reality can provide a convenient and manageable approach to building system maintenance. A knowledge gap for users can be overcome so that a user does not have to take time to learn a building layout to find a device for maintenance, learn how to perform maintenance on the device, etc. Additionally, displaying, by the mixed reality computing device, virtual information about a device can allow for easy and intuitive instructions on how to perform maintenance on different building systems in a building, reducing errors and/or maintenance delays, which can save costs in building system maintenance.
In the following detailed description, reference is made to the accompanying drawings that form a part hereof. The drawings show, by way of illustration, how one or more embodiments of the disclosure may be practiced.
These embodiments are described in sufficient detail to enable those of ordinary skill in the art to practice one or more embodiments of this disclosure. It is to be understood that other embodiments may be utilized and that process, electrical, and/or structural changes may be made without departing from the scope of the present disclosure.
As will be appreciated, elements shown in the various embodiments herein can be added, exchanged, combined, and/or eliminated so as to provide a number of additional embodiments of the present disclosure. The proportion and the relative scale of the elements provided in the figures are intended to illustrate the embodiments of the present disclosure, and should not be taken in a limiting sense.
The figures herein follow a numbering convention in which the first digit or digits correspond to the drawing figure number and the remaining digits identify an element or component in the drawing.
As used herein, “a” or “a number of” something can refer to one or more such things. For example, “a number of process variables” can refer to one or more process variables.
As used herein, mixed reality can include the merging of the real physical world and a virtual world to produce a visualization where physical and digital objects can co-exist and interact in real time. Mixed reality can include a mix of reality and virtual reality, encompassing both augmented reality and augmented virtuality via an immersive display. Mixed reality may include a mixed reality holographic object of virtual content overlaid on a visual of real world physical content, where the mixed reality content can be anchored to and interact with the real-world content. For example, the virtual content and real-world content may be able to react to each other in real time.
The mixed reality computing device 102 can include a display. The display can be a transparent mixed reality display. For example, the mixed reality computing device 102 may include a transparent display through which a user may view a physical environment in which the user is located, such as a building, an interior of a building, and/or a device. The transparent display can be, for example, a head mounted display, a handheld display, or a spatial display, among other types of transparent displays.
The mixed reality computing device 102 may also capture physical environment data from the physical environment. The physical environment may include one or more physical objects. Using such physical environment data, a 3-dimensional (3D) transformer may create a mixed reality model of the physical environment, including the physical objects having associated physical object properties.
The 3D transformer may cause to be displayed a mixed reality hologram using a spatial anchor. The spatial anchor may include a coordinate system that adjusts as needed, relative to other spatial anchors or a frame of reference to keep an anchored mixed reality hologram in place, as is further described herein. The spatial anchor may correspond to a device 104 within the building 100. The mixed reality hologram can include a 3D representation of a device 104, virtual information about the device 104, directions 112 to the device 104, and/or other information, as is further described herein. For example, a user can view the physical environment in which they are located through the transparent mixed reality display with a mixed reality model overlaid on the transparent mixed reality display. The mixed reality model can supplement the view of the physical environment with virtually displayed information. In some examples, the mixed reality model can include a work order for a device in a building 100 and information corresponding thereto, as is further described herein.
Mixed reality computing device 102 can receive a work order. As used herein, the term “work order” refers to a task or job. The work order can be for a heating, ventilation, and air conditioning (HVAC) device 104 in building 100. For example, the HVAC device may have experienced a fault, have routine maintenance to be performed, etc. As used herein, an HVAC device can be a device such as a boiler, chiller, air handling unit (AHU), rooftop unit (RTU), variable air volume (VAV) systems and control devices, and/or heat pumps, sensors, operating panels, controllers, actuators, fans, pumps, valves, coils, and/or radiators, etc. However, the HVAC device is not limited to these examples. Further, although device 104 is described above as an HVAC device, embodiments of the present disclosure are not so limited. For example, device 104 can be a fire suppression device, a security device, a plumbing device, an electrical device, and/or any other building device.
The work order for the HVAC device 104 can be transmitted to mixed reality computing device 102 by, for instance, a building management system via a wired or wireless connection. As used herein, a building management system (BMS) can be used to monitor and/or control a facility (e.g., building). For example, an operator, service technician, or other user can use a BMS to check and/or set the state of components of the facility, such as, for instance, control components, equipment (e.g., HVAC equipment), devices, networks, areas, and/or spaces of the building 100. The wired or wireless connection can be a network relationship that connects mixed reality computing device 102 with the building management system. Examples of such a network relationship can include a local area network (LAN), wide area network (WAN), personal area network (PAN), a distributed computing environment (e.g., a cloud computing environment), storage area network (SAN), metropolitan area network (MAN), a cellular communications network, and/or the Internet, among other types of network relationships.
The work order received by mixed reality computing device 102 can include details of the work order. Work order details can include a type of device 104, a task to be performed on device 104, a location of device 104, and/or safety information associated with an area including the device, among other types of work order details. For example, mixed reality computing device 102 can receive a work order for device 104. For instance, the work order may include cleaning and/or checking the functionality of a smoke detector (e.g., if device 104 is a smoke detector), tuning a field of view of a security camera (e.g., if device 104 is a security camera), checking functionality of an access control system (e.g., if device 104 is an access control system), checking the functionality of intruder alarms (e.g., if device 104 is an intruder alarm), calibrating an HVAC sensor (e.g., if device 104 is an HVAC sensor), performance testing of a public address system (e.g., if device 104 is a public address system), or functional testing of a fire suppression system (e.g., if device 104 is a fire suppression system), among other types of maintenance tasks.
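One plausible, purely illustrative way to represent such work order details in software is a simple record type; the field names below are assumptions and not a required format.

```python
from dataclasses import dataclass, field

@dataclass
class WorkOrder:
    """Illustrative work order record; all field names are hypothetical."""
    number: str                  # e.g., "C3424"
    status: str                  # e.g., "OPEN"
    device_type: str             # e.g., "HVAC sensor"
    task: str                    # e.g., "calibrate the HVAC sensor"
    device_location: str         # e.g., "Room 1"
    safety_info: list[str] = field(default_factory=list)  # e.g., ["hard hat", "gloves"]
```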
Mixed reality computing device 102 can display the details of the work order over a portion of the area of the mixed reality display. For example, mixed reality computing device 102 can display the details of the work order over a portion of the mixed reality display while the user simultaneously views the physical environment in which they are located. For example, the user can view information relating to a work order for device 104 (e.g., an HVAC sensor), including the task to be completed (e.g., calibration of the HVAC sensor), the type of device (e.g., a temperature sensor), the location of device 104 (e.g., Room 1 of building 100), and/or safety equipment which should be utilized (e.g., a hard hat, safety glasses, gloves, etc.), while simultaneously viewing the physical environment in which the user is located through the transparent display of mixed reality computing device 102.
Mixed reality computing device 102 can determine its location. For example, mixed reality computing device 102 can determine its location within building 100. In the example illustrated in FIG. 1, mixed reality computing device 102 can determine that it is located at initial location 108 in Room 2 of building 100, as is further described herein.
Mixed reality computing device 102 can determine its location using spatial analytics. As used herein, the term “spatial analytics” refers to determining properties of an area based on topological, geometric, and/or geographic properties of the area. For example, mixed reality computing device 102 can view an area such as Room 1 of building 100 to determine its location based on topological, geometric, and/or geographic properties of Room 1 of building 100.
Mixed reality computing device 102 can view an area using various sensors and systems included with mixed reality computing device 102. For example, mixed reality computing device 102 can include an optical sensor that utilizes at least one outward facing sensor. The outward facing sensor may detect properties of an area within its field of view 110. For example, the outward facing sensor of mixed reality computing device 102 can detect a layout of Room 1, geometric shapes and/or patterns in Room 1, properties of objects in Room 1 (e.g., shape, color, texture, material, etc.), and/or other properties of the area corresponding to Room 1 of building 100.
In some examples, the optical sensor can include a camera that can record photographs and/or video. In some examples, the mixed reality computing device 102 can utilize spatial analytics including analyzing a video feed of the optical sensor. For example, the mixed reality computing device 102 can analyze the video feed of the optical sensor to detect a layout of Room 1, geometric shapes and/or patterns in Room 1, properties of objects in Room 1 (e.g., shape, color, texture, material, etc.), and/or other properties of the area corresponding to Room 1 of building 100.
The mixed reality computing device 102 can compare the analyzed video feed of the camera with a predetermined model of building 100. For example, the mixed reality computing device 102 can determine a layout of Room 1, geometric shapes and/or patterns in Room 1, properties of objects in Room 1 (e.g., shape, color, texture, material, etc.), and/or other properties of the area corresponding to Room 1 of building 100, and compare the Room 1 layout, geometric shapes and patterns in Room 1, the properties of objects in Room 1, and/or other properties of the area corresponding to Room 1 with the predetermined model of building 100 that includes a predetermined model of Room 1. In some examples, the predetermined model of building 100 can be located in a remote server. In some examples, the predetermined model can be included in the BMS.
Although mixed reality computing device 102 is described above as determining its location by viewing an area and comparing the viewed area to a predetermined model, embodiments of the present disclosure are not so limited. For example, the mixed reality computing device 102 can utilize a global positioning system (GPS), Wi-Fi positioning system utilizing wireless access points (APs) (e.g., APs located in building 100), and/or other location determination mechanisms.
As described above, based on the comparison of the viewed area to a predetermined model by analyzing a video feed captured by a camera of mixed reality computing device 102 and matching it to the predetermined model, mixed reality computing device 102 can determine its location. For example, based on the layout of Room 1, geometric shapes and/or patterns in Room 1, properties of objects in Room 1 (e.g., shape, color, texture, material, etc.), and/or other properties of the area corresponding to Room 1 of building 100 captured by the camera of mixed reality computing device 102 matching the corresponding properties of Room 1 included in the predetermined model of building 100, the mixed reality computing device 102 can determine it is located in Room 1 of building 100.
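A minimal sketch of such comparison logic follows, assuming the video feed has already been reduced to a set of detected features and that the predetermined model maps each room to the features expected there; the matching threshold and all names are assumptions for illustration.

```python
def determine_room(detected_features, building_model):
    """Return the room of the predetermined building model whose expected
    properties (layout, shapes, patterns, object properties) best match the
    features detected in the optical sensor's video feed."""
    best_room, best_score = None, 0.0
    for room, expected in building_model.items():
        score = len(detected_features & expected) / max(len(expected), 1)
        if score > best_score:
            best_room, best_score = room, score
    return best_room if best_score > 0.5 else None  # threshold is an assumption

# e.g., determine_room({"door_left", "window_north", "blue_carpet"},
#                      {"Room 1": {"door_left", "window_north", "blue_carpet"},
#                       "Room 2": {"door_right", "gray_carpet"}})  -> "Room 1"
```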
Mixed reality computing device 102 can determine a location of device 104 in building 100. The location of device 104 in building 100 can be used to display virtual information regarding device 104 on the transparent display of mixed reality computing device 102. For example, mixed reality computing device 102 can display virtual information about device 104 when device 104 is in a field of view 110 of mixed reality computing device 102, as is further described herein.
Mixed reality computing device 102 can determine a location of device 104 to display virtual information about device 104 using a spatial anchor. As used herein, the term “spatial anchor” refers to a coordinate system determining a frame of reference to keep a mixed reality hologram (e.g., virtual information) located in an assigned position. The virtual information of the mixed reality hologram can correspond to a device in building 100. Each device in building 100 can include a unique spatial anchor.
Since each device in building 100 includes a unique spatial anchor, mixed reality computing device 102 can determine which device it has located (e.g., and the corresponding virtual information about the device to display) among the devices in the building 100 based on the spatial anchor of that device. For example, device 104 may be a controller included in a panel, where the panel includes five total controllers. Each of the five controllers included in the panel can include a unique and different spatial anchor such that the mixed reality computing device 102 can display virtual information corresponding to the controller of interest (e.g., device 104).
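Because every device carries a unique spatial anchor, resolving which device has been located can amount to a lookup keyed on the anchor's identifier; the sketch below assumes anchors expose a unique anchor_id, which is an assumption for illustration.

```python
# Hypothetical registry mapping unique spatial-anchor IDs to devices, so that
# detecting an anchor identifies the device of interest (e.g., one controller
# among the five controllers in the same panel).
ANCHOR_REGISTRY = {
    "anchor-ctrl-1": "controller 1",
    "anchor-ctrl-2": "controller 2",  # ...one entry per controller in the panel
}

def device_for_anchor(anchor_id):
    """Return the device associated with a detected spatial anchor, or None
    if the anchor does not correspond to a known device."""
    return ANCHOR_REGISTRY.get(anchor_id)
```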
As is further described herein, mixed reality computing device 102 can display a 3D representation of device 104 on the transparent display of mixed reality computing device 102 that is located in a position and orientation corresponding to the physical device 104 in the physical environment of Room 1 of building 100. The spatial anchor of device 104 can further function to keep the position and orientation of the 3D representation of device 104 static as the field of view 110 of mixed reality computing device 102 changes so that the user of mixed reality computing device 102 is not confused as to where the physical device 104 is located in the physical environment of Room 1.
As described above, mixed reality computing device 102 can determine its location in building 100. Additionally, mixed reality computing device 102 can receive the work order from the BMS of building 100 that includes a location of device 104. In some examples, mixed reality computing device 102 can determine that its location is different from the location of device 104 included in the work order. In such an example, mixed reality computing device 102 can display directions 112 to direct a user to device 104, as is further described herein.
In some examples, the mixed reality computing device 102 can determine its location is different from the location of device 104 based on mixed reality computing device 102 detecting a spatial anchor that is not associated with the device 104 included in the work order. For example, mixed reality computing device 102 can detect a spatial anchor of an object included in Room 2, where the detected spatial anchor of the object in Room 2 does not correspond to the spatial anchor of device 104. Based on the detected spatial anchor of the object in Room 2, mixed reality computing device 102 can determine its location is different from the location of device 104.
Based on the determination of the location of mixed reality computing device 102 (e.g., Room 2), the mixed reality computing device 102 can display directions 112 from initial location 108 to location 106. For example, as illustrated in FIG. 1, directions 112 can direct the user from initial location 108 in Room 2 to location 106 of device 104 in Room 1, as is further described herein.
The directions 112 can be displayed on the transparent display of mixed reality computing device 102. For example, the displayed directions 112 on the transparent display can include an arrow and a dotted line to point the user in a first direction towards the Hallway and out of Room 2 of building 100, and from the Hallway into Room 1, and to turn left once in Room 1 to locate device 104. The displayed directions 112 can be virtually displayed on the transparent display, overlaid over the physical environment of building 100. Accordingly, the user can view the physical environment of building 100 through the transparent display while simultaneously viewing the virtually displayed directions 112 on the transparent display as the user moves through building 100. The virtually displayed directions 112 can update in real-time as the user moves from Room 2 to Room 1.
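Directions such as these could be computed in many ways; one minimal, illustrative approach is a breadth-first search over a graph of connected spaces. The graph below mirrors the Room 2 → Hallway → Room 1 example and is an assumption, not a prescribed method.

```python
from collections import deque

# Illustrative adjacency graph of connected spaces in building 100.
SPACES = {"Room 2": ["Hallway"], "Hallway": ["Room 2", "Room 1"], "Room 1": ["Hallway"]}

def route(start, goal):
    """Breadth-first search returning the sequence of spaces from start to
    goal, which the display could render as arrows and a dotted line."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in SPACES.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# route("Room 2", "Room 1") -> ["Room 2", "Hallway", "Room 1"]
```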
Mixed reality computing device 102 can display virtual information about device 104 based on the location 106 of mixed reality computing device 102 and a location of device 104 in building 100. For example, mixed reality computing device 102 can (e.g., may only) display virtual information about device 104 in response to the location 106 of mixed reality computing device 102 and the location of device 104 being the same (e.g., mixed reality computing device 102 may not display virtual information about device 104 if the location 106 of mixed reality computing device 102 is different than the location of device 104). For instance, mixed reality computing device 102 can determine that mixed reality computing device 102 is in a same room as device 104. As a result, mixed reality computing device 102 can display virtual information about device 104.
The virtual information can include information about fixing a fault of device 104. For example, the work order for device 104 that is received by mixed reality computing device 102 can indicate that device 104 has a fault. As used herein, the term “fault” refers to an event that occurs to cause a piece of equipment to function improperly or to cause abnormal behavior in a building. In some examples, a fault can include a piece of equipment breaking down. In some examples, a fault can include a component of a piece of equipment ceasing to function correctly. In some examples, a fault can include abnormal behavior of a piece of equipment and/or an area.
Although a fault is described as including equipment breakdowns and abnormal behavior, embodiments of the present disclosure are not so limited. For example, faults can include any other event that causes equipment to function improperly, and/or causes abnormal behavior to occur in a building.
Virtual information can further include device information. For example, device 104 can be an AHU, and the device information can include a type of the AHU, a model of the AHU, and/or a serial number of the AHU, among other types of device information.
Virtual information can include wiring diagrams for device 104. For example, device 104 can include electrical circuits, electrical connections, and/or other electrical components. A wiring diagram for device 104 can be included in the virtual information such that a user can utilize the wiring diagram for various purposes, such as for troubleshooting, maintenance, testing, etc.
Virtual information can include user manuals for device 104. For example, device 104 can include a user manual, which can explain operating steps for device 104, operating parameters of device 104, safety information for device 104, etc.
Virtual information can include safety information for device 104. For example, different types of safety equipment may be utilized when working with different devices 104. For instance, electrical safety equipment may be specified when a work order includes tasks involving electricity, harnesses may be specified when a work order includes a device which is located above the ground, etc.
Virtual information can include operating information of the device 104. For example, real-time sensor values (e.g., real-time temperature) can be included in the virtual information. Other types of operating information of device 104 can include set-points of various equipment, etc.
As described above, mixed reality computing device 102 can display virtual information about device 104 in response to the location 106 of mixed reality computing device 102 and the location of device 104 being the same. In some examples, the location 106 of mixed reality computing device 102 is the same location as device 104 if it is within a predetermined distance from device 104. For example, if mixed reality computing device 102 is within the predetermined distance (e.g., 5 meters), mixed reality computing device 102 can display virtual information about device 104.
In some examples, mixed reality computing device 102 can display virtual information about device 104 in response to device 104 being located within the field of view 110 of mixed reality computing device 102. As used herein, the term “field of view” refers to an observable area mixed reality computing device 102 can view via the optical sensor (e.g., the camera) of mixed reality computing device 102. For example, when device 104 is located within the observable area of the camera of mixed reality computing device 102, mixed reality computing device 102 can display virtual information about device 104.
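Combining the two example conditions above (the predetermined distance and the field of view), a minimal, illustrative check might look like the following; the function and parameter names are assumptions.

```python
import math

PREDETERMINED_DISTANCE = 5.0  # meters; the example threshold from above

def should_display_virtual_info(viewer_pos, device_pos, device_in_fov):
    """Display virtual information about the device only when the mixed
    reality computing device is within the predetermined distance of the
    device and the device lies within the field of view of the optical
    sensor; positions are (x, y) coordinates in meters."""
    within = math.dist(viewer_pos, device_pos) <= PREDETERMINED_DISTANCE
    return within and device_in_fov

# e.g., should_display_virtual_info((0.0, 0.0), (3.0, 4.0), True) -> True (5 m away)
```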
The virtual information can be displayed on the transparent display of mixed reality computing device 102. For example, the virtual information displayed on the transparent display can include information about fixing a fault of device 104, including device information, wiring diagrams, user manuals, safety information, and/or operating information, among other types of virtual information. The virtual information can be overlaid over the physical environment of building 100. That is, the user can view the physical environment of building 100 through the transparent display while simultaneously viewing the virtual information on the transparent display. The virtual information can update in real-time.
In some examples, the device 104 may be obstructed by an obstacle in Room 1 of building 100. For example, device 104 may be a variable air volume (VAV) device located above ceiling panels so that it is not visible to a normal occupant of Room 1 of building 100. Nonetheless, mixed reality computing device 102 can display virtual information about device 104, information about fixing a fault of device 104, and/or display a 3D representation of device 104 via the transparent display of mixed reality computing device 102, as is further described in connection with FIG. 3.
As previously described in connection with FIG. 1, a mixed reality computing device can receive a work order for a device in a building.
In the example illustrated in FIG. 2, the mixed reality computing device can display a list 216 of work orders on a mixed reality display of the mixed reality computing device.
The list 216 of work orders can include three work orders which can each include various details. The first work order (e.g., #1) can include a work order number (e.g., C3424), a work order status of OPEN, and a predicted fault (e.g., VAV AIR LEAKAGE). Similarly, the second work order (e.g., #2) can include work order number C3527, a work order status of OPEN, and a predicted fault (e.g., VAV COOLING INEFFICIENCY), and the third work order (e.g., #3) can include work order number C4001, a work order status of OPEN, and a predicted fault (e.g., AHU OVER COOLING).
Although the list 216 of work orders is illustrated as including three work orders, embodiments of the present disclosure are not so limited. For example, the list 216 can include more than three work orders or less than three work orders.
In some examples, the list 216 of work orders can be user specific. For example, the mixed reality computing device may be utilized by different users. A first user may have a list of two work orders, while a second user may have the list 216 of three work orders. The mixed reality computing device can display the list of two work orders when the first user is using the mixed reality computing device, and display the list 216 of three work orders when the second user is using the mixed reality computing device.
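A user-specific list of this kind could be produced by filtering the assigned work orders for the current user; the assignee attribute below is hypothetical and not part of the work order details described above.

```python
def work_orders_for_user(all_work_orders, user_id):
    """Return only the work orders assigned to the current user, so that a
    first user sees a list of two work orders while a second user sees the
    list 216 of three work orders."""
    return [wo for wo in all_work_orders if wo.assignee == user_id]
```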
The mixed reality computing device can display a 3D representation of device 322 on the mixed reality display. The 3D representation illustrated in FIG. 3 can be, for example, a rectangular prism.
Although the 3D representation is described above as being a rectangular prism, embodiments of the present disclosure are not so limited. For example, the 3D representation can be a prism of any other shape (e.g., a square prism, a cuboid, or a cylinder, and/or more complex shapes such as star prisms, crossed prisms, or toroidal prisms), and/or any other 3D shape.
As illustrated in FIG. 3, the 3D representation of device 322 can be overlaid over the physical environment of the building viewable through the transparent display.
For example, the 3D representation of device 322 can be displayed on the transparent display of the mixed reality computing device overlaid over the physical environment of the building. For instance, the user can still view the ceiling panels, but can also view the 3D representation of device 322, as well as its location and/or devices that may be associated with and/or connected to device 322.
The 3D representation of device 322 can be displayed on the transparent display of the mixed reality computing device when device 322 is in the field of view of the mixed reality computing device. For example, when a user enters the space including device 322, device 322 may not be displayed on the transparent display since the user is not looking in the direction of device 322, or may not be in the correct area of the space including device 322, etc. When the user is looking in the direction of the device 322 such that device 322 is in the field of view of the mixed reality computing device, the 3D representation of device 322 can be displayed on the transparent display.
As previously described in connection with FIG. 1, device 322 may be obstructed by an obstacle 324, such as a ceiling panel, such that device 322 is not visible to a normal occupant of the space.
Although the obstacle 324 is described above as being a ceiling panel, embodiments of the present disclosure are not so limited. For example, the obstacle 324 can be any other obstacle that can obstruct a view of a device. For instance, the obstacle can include a wall, a panel, a cover, an access door, etc.
The displayed virtual information 328 can be displayed on the transparent display of the mixed reality computing device. For example, the displayed virtual information 328 can be overlaid over the physical environment of the building. That is, the user can view the physical environment of the building while simultaneously viewing the displayed virtual information 328 on the transparent display.
The displayed virtual information 328 can include information about fixing a predicted fault of a device. For instance, the mixed reality computing device can receive a work order about a particular device, and the work order can include a fault that the device may have experienced. The displayed virtual information 328 can include information about fixing the fault that the device may have experienced. The virtual information 328 can be displayed on the mixed reality display in response to the location of the mixed reality computing device being in the same location as the device corresponding to the received work order.
As illustrated in FIG. 3, the displayed virtual information 328 can correspond to device 322 (e.g., a VAV device having a predicted air leakage fault).
Utilizing the transparent display as described in connection with FIGS. 1-3, a user can receive a work order, navigate to the device corresponding to the work order, and view virtual information about the device.
As the user arrives at the device to begin the tasks included in the work order for the device, in some examples the transparent display can display a 3D representation of the device (e.g., as previously described in connection with FIG. 3), and/or steps of a standard operating procedure (SOP) corresponding to the work order, as is further described herein.
As used herein, the term “SOP” refers to a set of step-by-step instructions to carry out a series of operations. The instructions can be performed to carry out, for example, a work order. In other words, the work order can be accomplished by the user by performing a series of step-by-step instructions included in an SOP.
Various work orders may include different SOPs. For example, work order #1, having the open VAV air leakage fault, can include a different SOP than work order #2, corresponding to a VAV cooling inefficiency (e.g., as previously described in connection with FIG. 2).
In other words, the information about fixing the fault of the device (e.g., the VAV device) can include steps of an SOP corresponding to the fault. As illustrated in FIG. 4, steps 432 of the SOP can be displayed on the transparent display of the mixed reality computing device.
Although three steps 432 are illustrated in FIG. 4, embodiments of the present disclosure are not so limited. For example, the SOP can include more than three steps or less than three steps.
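An SOP of this kind maps naturally onto an ordered list of instructions, each paired with a completion flag that can also serve as the checklist described further below; the specific steps in this sketch are assumptions for illustration, not an actual procedure.

```python
# Illustrative SOP for a VAV air-leakage work order: an ordered list of
# step-by-step instructions with completion flags. The specific steps are
# assumptions for illustration only.
SOP_VAV_AIR_LEAKAGE = [
    {"instruction": "Remove the cowling from the VAV device", "done": False},
    {"instruction": "Inspect the damper and duct seals for leakage", "done": False},
    {"instruction": "Reattach the cowling and verify airflow", "done": False},
]
```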
In some examples, a video tutorial 434 can be displayed on the transparent display. For example, one user may be less skilled at a particular work order than other users, may have less technical ability, less technical experience, etc. As a result, a user may not fully understand a step of, or the steps of the SOP. Utilizing the mixed reality computing device, the user can view a video tutorial 434 of the steps 432 of the SOP. For example, the video tutorial 434 can provide a set of instructions with corresponding visual examples for the user to utilize in order to understand the steps 432 of the SOP. As an example, the user may not understand how to remove the cowling from the VAV device from the steps 432 of the SOP. The video tutorial 434 can provide the user with a visual example of how to remove the cowling from the VAV device in order to assist the user with the steps 432 of the SOP. The user can view the physical environment of the building through the transparent display while simultaneously viewing the video tutorial 434 on the transparent display.
The user can utilize live video assistance via the transparent display. For example, another technician, engineer, or other user who may be in a location remote from the location of the mixed reality computing device can connect to the mixed reality computing device and provide live video assistance to the user. For example, the user may not understand how to remove the cowling from the VAV device from the steps 432 of the SOP. Another technician can connect to the mixed reality computing device to explain and/or show the user how to remove the cowling from the VAV device. The technician can be displayed on the transparent display in a video viewable by the user of the mixed reality computing device. The technician can, in some examples, view what the user of the mixed reality computing device views via the optical sensor of the mixed reality computing device. The user can view the physical environment of the building through the transparent display while simultaneously viewing the live video assistance on the transparent display.
As a user is performing tasks in the SOP, the user can update a checklist. The checklist can document the steps the user has performed as the steps of the SOP are completed. For example, when the user removes the cowling from a VAV device, the user can update the checklist to document that the step of the SOP to remove the cowling from the VAV device has been completed.
In some examples, the user can cause the mixed reality computing device to update a checklist of an SOP in response to a gesture, a gaze, a voice command, and/or a combination thereof.
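As a sketch of how such an update might be wired up (the recognized event names and step structure are assumptions, matching the illustrative SOP above):

```python
def update_checklist(event, sop_steps):
    """Mark the next incomplete step of the SOP as done when the user issues
    a recognized gesture, gaze, or voice command."""
    if event in ("gesture:check", "gaze:dwell", "voice:mark-complete"):
        for step in sop_steps:
            if not step["done"]:
                step["done"] = True  # document that this step has been completed
                break

# e.g., update_checklist("voice:mark-complete", SOP_VAV_AIR_LEAKAGE) marks the
# first step ("Remove the cowling...") as completed.
```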
Building system maintenance using mixed reality can allow a user to easily receive work orders, locate devices in a building which may be unfamiliar to them, and perform steps of an SOP to complete work orders for the devices in the building. The mixed reality computing device can allow a user who may be unfamiliar with a building or with a particular device included in a work order to complete installation, maintenance, and/or repairs of devices in the building. Integrated video tutorials and live video support can provide a user of the mixed reality computing device with further information to complete a work order without committing additional resources to the work order, allowing the user to complete work orders on a variety of different devices in a variety of different locations and saving time, cost, and labor.
As shown in FIG. 5, mixed reality computing device 502 can include a memory 542 and a processor to execute executable instructions stored in the memory 542, as is further described herein.
Memory 542 can be volatile or nonvolatile memory. Memory 542 can also be removable (e.g., portable) memory, or non-removable (e.g., internal) memory. For example, memory 542 can be random access memory (RAM) (e.g., dynamic random access memory (DRAM) and/or phase change random access memory (PCRAM)), read-only memory (ROM) (e.g., electrically erasable programmable read-only memory (EEPROM) and/or compact-disc read-only memory (CD-ROM)), flash memory, a laser disc, a digital versatile disc (DVD) or other optical disk storage, and/or a magnetic medium such as magnetic cassettes, tapes, or disks, among other types of memory.
Further, although memory 542 is illustrated as being located in mixed reality computing device 502, embodiments of the present disclosure are not so limited. For example, memory 542 can also be located internal to another computing resource (e.g., enabling computer readable instructions to be downloaded over the Internet or another wired or wireless connection).
As shown in FIG. 5, mixed reality computing device 502 can include a display 544. The display 544 can be a transparent mixed reality display, as previously described herein.
Additionally, mixed reality computing device 502 can receive information from the user of mixed reality computing device 502 through an interaction with the user via a user interface. For example, mixed reality computing device 502 can receive input from the user via, for instance, voice commands, physical gestures, gazing, or by touching the display 544 in embodiments in which the display 544 includes touch-screen capabilities (e.g., embodiments in which the display is a touch screen).
Although specific embodiments have been illustrated and described herein, those of ordinary skill in the art will appreciate that any arrangement calculated to achieve the same techniques can be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments of the disclosure.
It is to be understood that the above description has been made in an illustrative fashion, and not a restrictive one. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.
The scope of the various embodiments of the disclosure includes any other applications in which the above structures and methods are used. Therefore, the scope of various embodiments of the disclosure should be determined with reference to the appended claims, along with the full range of equivalents to which such claims are entitled.
In the foregoing Detailed Description, various features are grouped together in example embodiments illustrated in the figures for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the embodiments of the disclosure require more features than are expressly recited in each claim.
Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment.