The present disclosure relates generally to the management of building systems of a building. An augmented reality (AR) system or apparatus can be located in a space of the building. The AR system can display information pertaining to the building to a user. However, providing such information may be challenging. For example, it may be difficult to determine the location of the AR system within the building. Therefore, the information that the AR system displays may not be accurate or may not include in-depth information.
One implementation of the present disclosure is a system. The system can include one or more memory devices storing instructions thereon that, when executed by one or more processors, cause the one or more processors to receive an indication of a point in a space of an environment, wherein an augmented reality system is located in the space of the environment at a coordinate in a first coordinate system. The instructions can cause the one or more processors to query a digital twin for a coordinate of the point in a second coordinate system, the digital twin including a contextual description of the environment. The instructions can cause the one or more processors to translate the coordinate of the augmented reality system in the first coordinate system to a coordinate of the augmented reality system in the second coordinate system based on the coordinate of the point in the second coordinate system and the coordinate of the augmented reality system in the first coordinate system. The instructions can cause the one or more processors to generate data to cause the augmented reality system to display information based at least in part on the translation.
In some embodiments, the point is at least one of a quick response (QR) code, a piece of art, a piece of equipment, or a feature of a building.
In some embodiments, the instructions cause the one or more processors to receive, from the augmented reality system, a coordinate of the point in the first coordinate system and translate the coordinate of the augmented reality system in the first coordinate system to the coordinate of the augmented reality system in the second coordinate system based on the coordinate of the point in the first coordinate system, the coordinate of the point in the second coordinate system, and the coordinate of the augmented reality system in the first coordinate system.
In some embodiments, the environment is a building and the space is a space of the building. In some embodiments, the point is a building object located within the space of the building. In some embodiments, the information is a description of the space of the building.
In some embodiments, the instructions cause the one or more processors to query the digital twin for the information, the information describing the space. In some embodiments, the instructions cause the one or more processors to generate the data based on a query response of the digital twin. In some embodiments, the instructions cause the one or more processors to transmit the data to the augmented reality system to cause the augmented reality system to display a view of the space augmented with the information.
In some embodiments, the system includes an augmented reality system. In some embodiments, the augmented reality system is configured to receive the data from the one or more processors and display a view of the space augmented with the information.
In some embodiments, the instructions cause the one or more processors to receive, from the augmented reality system, an indication of an asset located within the environment, retrieve, based on the indication of the asset and the translation, a description of the asset from the digital twin, and generate the data to cause the augmented reality system to display the description of the asset.
In some embodiments, the instructions cause the one or more processors to receive, from the augmented reality system, a coordinate of an asset within the environment in the first coordinate system, translate the coordinate of the asset in the first coordinate system to the second coordinate system, and identify the asset within the digital twin based on a comparison of the coordinate of the asset in the second coordinate system and a stored coordinate of the asset in the second coordinate system stored within the digital twin.
In some embodiments, the instructions cause the one or more processors to receive an account profile of a user of the augmented reality system, determine, based on the account profile, that the user associated with the account profile has access to the information of the environment or authorization to utilize the point for the translation, and generate data to cause the augmented reality system to display the information responsive to a determination that the user has access to the information of the environment or authorization to utilize the point for the translation.
In some embodiments, the instructions cause the one or more processors to receive a set of coordinates in the first coordinate system of the augmented reality system and translate the set of coordinates of the augmented reality system in the first coordinate system to a second set of coordinates in the second coordinate system based on the coordinate of the point in the second coordinate system and the set of coordinates of the augmented reality system in the first coordinate system.
In some embodiments, the instructions cause the one or more processors to generate data to cause a user device to display a graphical representation of the space of the environment, receive, from the user device, a selection of the point in the graphical representation of the space of the environment, identify, based on the selection of the point in the graphical representation of the space of the environment, the coordinate of the point in the second coordinate system, and save the coordinate of the point in the second coordinate system in the digital twin.
In some embodiments, the system includes the digital twin including a graph data structure including nodes representing entities of the environment and edges between the nodes, the edges representing relationships between the entities. In some embodiments, the instructions cause the one or more processors to receive the coordinate of the point in the second coordinate system. In some embodiments, the instructions cause the one or more processors to identify a first node of the nodes representing the point and generate or update a second node of the nodes related to the first node by an edge of the edges to cause the second node to store or link to the coordinate of the point in the second coordinate system.
In some embodiments, the system includes the digital twin including a graph data structure including nodes representing entities of the environment and edges between the nodes, the edges representing relationships between the entities. In some embodiments, the instructions cause the one or more processors to query the digital twin for the coordinate of the point in the second coordinate system by identifying a first node of the nodes representing the point, identifying an edge of the edges relating the first node to a second node, and retrieving the coordinate of the point in the second coordinate system based on the second node.
In some embodiments, the system includes the digital twin including a graph data structure including nodes representing entities of the environment and edges between the nodes, the edges representing relationships between the entities. In some embodiments, the information describes an entity of the environment. In some embodiments, the instructions cause the one or more processors to query the digital twin for the information by identifying a first node of the nodes representing the entity, identifying an edge of the edges relating the first node to a second node representing a coordinate of the entity in the second coordinate system, determining that the coordinate of the entity in the second coordinate system indicates that the entity is within a distance from the augmented reality system, and retrieving the information of the entity responsive to a determination that the entity is within the distance from the augmented reality system.
Another implementation of the present disclosure is a method. The method can include receiving, by one or more processing circuits, an indication of a point in a space of an environment, wherein an augmented reality system is located in the space of the environment at a coordinate in a first coordinate system. The method can include querying, by the one or more processing circuits, a digital twin for a coordinate of the point in a second coordinate system, the digital twin including a contextual description of the environment. The method can include translating, by the one or more processing circuits, the coordinate of the augmented reality system in the first coordinate system to a coordinate of the augmented reality system in the second coordinate system based on the coordinate of the point in the second coordinate system and the coordinate of the augmented reality system in the first coordinate system. The method can include generating, by the one or more processing circuits, data to cause the augmented reality system to display information based at least in part on the translation.
In some embodiments, the method includes receiving, from the augmented reality system, a coordinate of the point in the first coordinate system. In some embodiments, the method includes translating, by the one or more processing circuits, the coordinate of the augmented reality system in the first coordinate system to the coordinate of the augmented reality system in the second coordinate system based on the coordinate of the point in the first coordinate system, the coordinate of the point in the second coordinate system, and the coordinate of the augmented reality system in the first coordinate system.
In some embodiments, the method includes querying, by the one or more processing circuits, the digital twin for the information, the information describing the space. In some embodiments, the method includes generating, by the one or more processing circuits, the data based on a query response of the digital twin. In some embodiments, the method includes transmitting, by the one or more processing circuits, the data to the augmented reality system to cause the augmented reality system to display a view of the space augmented with the information.
In some embodiments, the method includes receiving, by the one or more processing circuits from the augmented reality system, an indication of an asset located within the environment, retrieving, by the one or more processing circuits, based on the indication of the asset and the translation, a description of the asset from the digital twin, and generating, by the one or more processing circuits, the data to cause the augmented reality system to display the description of the asset.
In some embodiments, the method includes receiving, by the one or more processing circuits, a set of coordinates in the first coordinate system of the augmented reality system and translating, by the one or more processing circuits, the set of coordinates of the augmented reality system in the first coordinate system to a second set of coordinates in the second coordinate system based on the coordinate of the point in the second coordinate system and the set of coordinates of the augmented reality system in the first coordinate system.
Another implementation of the present disclosure is a building system. The building system can include one or more processors configured to execute instructions stored on one or more memory devices, the instructions causing the one or more processors to receive an indication of a point in a space of a building, wherein an augmented reality system is located in the space of the building at a coordinate in a first coordinate system. The instructions cause the one or more processors to query a digital twin for a coordinate of the point in a second coordinate system, the digital twin including a contextual description of the building. The instructions cause the one or more processors to translate the coordinate of the augmented reality system in the first coordinate system to a coordinate of the augmented reality system in the second coordinate system based on the coordinate of the point in the second coordinate system and the coordinate of the augmented reality system in the first coordinate system. The instructions cause the one or more processors to generate data to cause the augmented reality system to display information based at least in part on the translation.
Various objects, aspects, features, and advantages of the disclosure will become more apparent and better understood by referring to the detailed description taken in conjunction with the accompanying drawings, in which like reference characters identify corresponding elements throughout. In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements.
Referring generally to the FIGURES, a building system that translates between coordinate systems for an AR system is shown, according to various exemplary embodiments. When an AR system first boots up, is activated, turned on, or otherwise enabled in a building space, the AR system can store an initial coordinate of the AR system as an origin location. As the AR system moves through the building space, the AR system can identify and store coordinates of the AR system indicating the new locations to which the AR system has moved. However, the coordinate system used by the AR system can be specific to the AR system. A coordinate system that describes the building may be different from the coordinate system of the AR system. This difference in coordinate systems used by the AR system and the building system can create technical challenges for the building system to provide information to the AR system about the spaces in which the AR system is located. For example, when there is such a difference in coordinate systems, the building system might provide inaccurate information or information that is irrelevant to the AR system or the space.
To solve these and other technical problems, the building system described herein can operate to translate at least one coordinate of the AR system (e.g., in a first coordinate system or AR system-relative coordinate system) into a coordinate of the building system (e.g., in a second coordinate system or building-relative coordinate system), using a reference coordinate of an object or point, so the AR system and the building system can be operated in the same coordinate system. The point can be or include a visually distinct entity, object, area, spot, or structure. The point can be a piece of art, a piece of equipment, a quick response (QR) code, or a feature of a building (e.g., a wall, a ceiling, a staircase, etc.). A digital twin of the building system can store information of the point. The information can be a coordinate of the point in a particular coordinate system (e.g., a space-relative coordinate system, a building-relative coordinate system, etc.). The building system can store information of the point or the coordinate of the point in the digital twin. The building system can retrieve the point information or the coordinate of the point from the digital twin and use the coordinate of the point to translate a coordinate of the AR system (or any other entity or system) from the first coordinate system (e.g., an AR system-specific coordinate system) to the second coordinate system (e.g., the building-relative coordinate system). The building system can use the translated coordinate of the AR system to cause the AR system to display building information associated with the space, point, or area in which the AR system is located. The building information can include asset data, building performance, environmental conditions, etc. The translation performed by the building system can allow for the display of various types of information on the AR system.
The building system described herein can provide accurate and in-depth information, allowing a building manager to manage the building more consistently and efficiently. Because the building system can translate the location of the AR system into a building coordinate, the building system can provide accurate information regarding a person, an asset, or a space that is located nearby or relevant to the AR system. Furthermore, the building system can use the translated coordinates of the AR system to track the location of the AR system as the AR system moves through the building. The information that the building system provides to the AR system can include a shape, a location, or an operating status of a piece of equipment. The information can include a name, a schedule, or a role of a person. The information can include a participant list for an event occurring in the building, a duration of an event, or a description of an event. This can allow a building manager to manage events and/or assets in the building, the building itself, and occupants of the building organically, while allowing a building user to have access to such various asset information without the aid of building personnel.
As discussed herein, the first coordinate system can relate to a coordinate system in which an AR system captures, records, or stores coordinates or locations. The second coordinate system can relate to a coordinate system of an environment, such as a building. The second coordinate system can be an environment or building relative coordinate system. Each coordinate system may have a different origin and thus coordinates of the first and second coordinate systems that describe the same object, point, asset, or device may be different.
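As a purely illustrative sketch of this relationship (assuming, for simplicity of exposition, that the two coordinate systems share axis orientation and scale and differ only in origin), a point whose coordinate is known in both systems determines a constant offset that can be applied to any other coordinate:

c_second(AR system) = c_second(point) + (c_first(AR system) − c_first(point))

If the axes of the first coordinate system are additionally rotated relative to the second, a rotation R can be applied to the difference term, i.e., c_second(AR system) = c_second(point) + R·(c_first(AR system) − c_first(point)). The illustrative code sketches that follow use this form.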
Referring now to
The memory devices 108 can include one or more devices (e.g., memory units, memory devices, storage devices, etc.) for storing data and/or computer code for completing and/or facilitating the various processes described in the present disclosure. The memory devices 108 can include random access memory (RAM), read-only memory (ROM), hard drive storage, temporary storage, non-volatile memory, flash memory, optical memory, or any other suitable memory for storing software objects and/or computer instructions. The memory devices 108 can include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present disclosure. The memory devices 108 can be communicably connected to the processors 104 and can include computer code for executing (e.g., by the processors 104) one or more processes described herein.
The building system 100 can include a digital twin 128. The digital twin 128 can be a digital representation of a physical environment (e.g., a building, a ship, a boat, a vessel, a train, a plane, a manufacturing environment, a warehouse, a laboratory, etc.). The digital twin 128 can store contextual data of the environment and operational data of the environment. The digital twin 128 can be or include a digital replica of a physical asset (i.e., a physical device twin) and can store processes, people, places, and/or systems that can be used for various purposes. The digital twin 128 can be a software component stored and/or managed by the building system 100. The digital twin 128 can be a computing or storage entity or data structure that describes a physical thing (e.g., a building, spaces of a building, devices of a building, people of the building, equipment of the building, etc.) by modeling the physical thing through a set of attributes that define it. The digital twin 128 can include both ingested information and actions learned or executed through artificial intelligence or machine learning agents. The digital twin 128 can be or include a graph, a graph data structure, and/or artificial intelligence agents. The digital twin 128 can enable an in-depth analysis of data of the physical environment and can provide the potential to monitor systems to mitigate risks, manage issues, and utilize simulations to test future solutions. The digital twin 128 can help technicians find the root cause of issues and solve problems faster, can support safety and security protocols, and can support building managers in more efficient use of energy and other facilities resources. The digital twin 128 can be used to enable and unify security systems, employee experience, facilities management, sustainability, etc.
The building system 100 can operate to communicate with an AR system 160 located in a space of the building via a network 140. The AR system 160 may include at least one processor, at least one memory device, computer hardware, at least one output device, and at least one input device. For example, the hardware of the AR system 160 can be similar to the processors 104 or the memory devices 108. The AR system 160 can be located in or nearby an area of an environment (e.g., a space of the building) where the point 164 and/or assets 168 (e.g., building assets) are located. The AR system 160 can be a system, device, or apparatus that displays information to a user by augmenting, overlaying, or adding information to a view of an environment. The view of the environment can be a natural view, e.g., light reflected from the environment and returned to the eyes of a user. The view of the environment can be a virtual view or captured view, e.g., a view displayed or projected to the eyes of the user based on a model or based on images or videos captured by a camera. The AR system 160 can be, include, or be embedded within a cellphone, smart glasses, a headset, a virtual reality device, or any other user device. The AR system 160 can include a display 172 that displays information (such as the display information 180 received from the building system 100) to a user. The display 172 can be or include a liquid crystal display (LCD), a light emitting diode (LED) display, a curved mirror based display, a waveguide based display, a projector, or any other type of display device. The display 172 may include diffraction optical components, holographic optical components, polarized optical components, and reflective optical components.
The AR system 160 can include at least one sensor 176. The sensor 176 can be or include an image sensor such as a charge-coupled device (CCD) or an active-pixel sensor (e.g., a complementary metal-oxide-semiconductor (CMOS) sensor), a monitoring sensor such as a light detection and ranging (LIDAR) sensor, a Bluetooth sensor, or a wireless beacon. In some embodiments, a LIDAR system can be used to digitize a space since it provides an accurate positioning system, e.g., it includes depth information. In some embodiments, the digital twin 128 of the building system 100 can use LIDAR and/or camera data for the points 164.
The AR system 160 can transmit, broadcast, convey, deliver, or otherwise communicate an indication 184 of point 164 and/or a coordinate 148 of AR system 160 to the building system 100 via the network 140. The network 140 can be or include at least one wireless network, wired network, or combination of wireless and wired networks. The network 140 can include a Wi-Fi network, a wired Ethernet network, a ZigBee network, a Bluetooth network, and/or any other wireless/wired network. The network 140 may be a local area network or a wide area network (e.g., the Internet, a building WAN, etc.) and may use a variety of communications protocols (e.g., BACnet, IP, LON, etc.). The network 140 may include routers, modems, servers, cell towers, satellites, and/or network switches. The AR system 160 can communicate the indication 184 of point 164 or the coordinate 148 of AR system 160 as data packets, data frames, data messages, data elements, or any other type of information. A translation system 112 in the building system 100 can include at least a query manager 120. The query manager 120 can be connected to the AR system 160, e.g., through the network 140. The query manager 120 can receive, collect, obtain, or get the indication 184 of point 164 or the coordinate 148 of AR system 160 from the AR system 160.
The AR system 160 can capture, generate, or identify an indication 184 of the point 164 based on data that includes the measurements of the sensor 176. The data can be images of the point 164 or wireless signal strengths detected from a wireless signal emitter of the point 164. The point 164 can be or include a piece of art (e.g., a painting, a photo, a sculpture, a piece of architecture, etc.), a piece of equipment, a feature of a building, or any visually distinct spot or area. The point 164 can be or include an asset tracking code such as a quick response (QR) code or a barcode, or a radio frequency identification (RFID) tag. Information of the point 164 can be stored in the digital twin 128 of the building system 100, the information including point location data 132. For example, an artwork can be a point whose coordinate, shape, and dimension can be stored in the digital twin 128.
The point location data 132 can be a location of the point 164 in the building or in a space of the building. The location of the point 164 can be in a Cartesian coordinate system, a polar coordinate system, a spherical coordinate system, a cylindrical coordinate system, or any other coordinate system. The location of the point 164 can include longitude, latitude, and/or altitude. The location of the point 164 can include at least one of x, y, or z values. The location of the point 164 can include at least one angle and/or at least one distance. Such coordinate systems can be relative to a certain point of the space where the point 164 is located (e.g., relative to a corner of a room where the point 164 is located) or can be relative to a certain point of the building. The point information stored in the digital twin 128 can include a shape, size, dimension, and color of the point 164. For example, the point 164 can be a sculpture whose location, shape, size, dimension, and color can be stored in the digital twin 128.
Assets 168 can be located in a space of the building. The assets 168 may be located nearby the AR system 160 and/or the point 164. The assets 168 may be any type of building asset (e.g., a building, a space, a device, a person, a piece of equipment, etc.). Information of the asset 168 (i.e., asset data 136) can be stored in the digital twin 128 of the building system 100, the information including a shape, a size, a color, a status, an operating value, a measured value, a performance level, an energy consumption level, a performance metric, a date of maintenance, a fault status, an installation date, a model name, a model number, etc. of the asset 168. For example, a light (such as a chandelier) can be an asset whose location, shape, and on/off status can be stored in the digital twin 128 as the asset data 136. For example, a thermostat can be the asset 168 and the asset data 136 can indicate a temperature setpoint of the thermostat, a measured temperature value of the thermostat, and/or an energy consumption of equipment controlled by the thermostat.
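For illustration only, the point location data 132 and the asset data 136 might be represented by records such as the following Python sketch (the class names and fields are assumptions for exposition, not the disclosure's actual schema):

```python
from dataclasses import dataclass, field
from typing import Optional, Tuple

@dataclass
class PointRecord:
    """Illustrative stand-in for a point 164 and its point location data 132."""
    identifier: str                           # e.g., a name, ID number, or QR code value
    building_xyz: Tuple[float, float, float]  # coordinate in the second (building-relative) system
    shape: Optional[str] = None               # e.g., "sculpture"
    dimensions_m: Optional[Tuple[float, float, float]] = None  # width, height, depth
    color: Optional[str] = None

@dataclass
class AssetRecord:
    """Illustrative stand-in for an asset 168 and its asset data 136."""
    identifier: str                           # e.g., a model name or number
    building_xyz: Tuple[float, float, float]  # stored building-relative coordinate
    status: Optional[str] = None              # e.g., on/off or fault status
    operating_values: dict = field(default_factory=dict)  # e.g., setpoint, measured temperature
```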
The AR system 160 can send the indication 184 of point 164 to the building system 100. The indication 184 of point 164 can be or include at least a part of the measurements of the sensor 176. For example, the indication 184 of point 164 can be an image/photo of the point 164 or of a part of the point 164. In some embodiments, the indication 184 of point 164 can include a code stored in a QR code, a signal from a barcode, or a signal from an RFID tag. In some embodiments, the indication 184 of point 164 can be manually entered by a building user. For example, a building user can manually indicate a nearest painting by the name of the painting, which can be a part of the indication 184 of point 164. In some embodiments, the AR system 160 can provide at least a part of a coordinate 148 of AR system 160 to the building system 100. For example, the AR system 160 may be equipped with a detection/ranging system (e.g., LIDAR), gyroscopes, etc. that detect or generate a location for the AR system 160 such that the AR system 160 can provide a location of the point 164 relative to the AR system 160 to the query manager 120 of the building system 100.
The query manager 120 can receive the indication 184 of point 164 and/or the coordinate 148 of AR system 160. The query manager 120 can analyze the indication 184 of point 164 to identify the point 164. For example, the query manager 120 can analyze the indication 184 of point 164 by running an image/pattern recognition algorithm (e.g., a convolutional neural network) on the indication 184 of point 164 collected by the sensor 176 (e.g., an image of an artwork captured by a camera). Based on the analysis, the query manager 120 can determine an identifier (e.g., the name of the point 164, an identification number of the point 164, a code of the point 164, etc.) associated with the indication 184 of point 164. Using the resulting identifier, the query manager 120 can query the digital twin 128 for characteristic information of the point 164 associated with the resulting identifier, the characteristic information including a dimension, a shape, a pattern, or a color of the point 164, or any other type of information of the point 164 that can be compared with the indication 184 of point 164. Based on the comparison between the indication 184 of point 164 and the characteristic information, the query manager 120 can determine a relative position between the AR system 160 and the point 164 (i.e., the coordinate 148 of AR system 160 in a first coordinate system). For example, the query manager 120 can determine the distance between the AR system 160 and the point 164 by comparing the actual dimension included in the characteristic information with the dimension in the indication 184 of point 164 (e.g., a photo). For example, the query manager 120 can determine the azimuthal location of the AR system 160 in relation to the point 164 by analyzing the indication 184 of point 164 (e.g., the angle at which a photo was taken).
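A minimal sketch of the dimension comparison described above, using a pinhole-camera approximation (the focal-length handling, function names, and numbers are illustrative assumptions, not the disclosure's actual method):

```python
import math

def estimate_distance_m(actual_height_m: float,
                        apparent_height_px: float,
                        focal_length_px: float) -> float:
    """Estimate the range to the point 164 from its apparent size in an image.

    Pinhole model: apparent_height_px = focal_length_px * actual_height_m / distance_m,
    so distance_m = focal_length_px * actual_height_m / apparent_height_px.
    """
    return focal_length_px * actual_height_m / apparent_height_px

def estimate_bearing_rad(pixel_offset_x: float, focal_length_px: float) -> float:
    """Estimate the azimuthal angle to the point 164 from its horizontal
    offset from the image center, e.g., the angle at which a photo was taken."""
    return math.atan2(pixel_offset_x, focal_length_px)

# Example: a 1.2 m tall artwork appearing 300 px tall with a 1500 px focal
# length places the AR system roughly 6 m from the point.
distance = estimate_distance_m(1.2, 300.0, 1500.0)  # 6.0
```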
In some embodiments, the AR system 160 can determine the coordinate 148 of AR system 160 and provide the coordinate 148 of AR system 160 to the query manager 120. For example, the AR system can determine a coordinate of the point 164 relative to the AR system 160, using, for example, a LIDAR sensor, the coordinate of the point 164 being in the first coordinate system (i.e., AR system-relative coordinate system). This allows the AR system 160 to identify the coordinate 148 of AR system 160 in relation to the point 164. The AR system 160 can provide the coordinate 148 of AR system 160 to the query manager 120. Determining the relative position between the AR system 160 and the point 164 (the point 164 whose coordinate in the second coordinate system is stored in the digital twin 128), either by the AR system 160 or the query manager 120, can allow the building system 100 to describe the AR system 160 in the second coordinate system.
In some embodiments, the AR system 160, such as an AR headset, a mobile device such as a smartphone, or another device, may be used to generate a spatially mapped set of images of the building or spaces. In some embodiments, one or more devices of the sensor 176 may be used, optionally along with orientation data (e.g., accelerometer data), to generate an image of the space with the point 164 identified for various objects or image portions in the space. Once spaces are mapped, the AR system 160 can utilize the set of images to determine an accurate relative location (e.g., set of coordinates) for the AR system 160 within the space. In some embodiments, various other types of technologies may be used alone or in combination to improve the accuracy of the imaging and/or location data in the space. For example, in some embodiments, the AR system 160 may include LIDAR sensors configured to capture LIDAR data for the space. In some embodiments, the image data may be used in conjunction with other location data to determine an accurate location of objects and/or a location of the AR system 160 within the space, such as data from Bluetooth-equipped devices configured to locate a relative or absolute location of the AR system 160 within the space. Bluetooth based location systems are described in U.S. patent application Ser. No. 17/220,795, filed Apr. 1, 2021, which is incorporated herein by reference in its entirety.
Using the resulting identifier, the query manager 120 can query the digital twin 128 for the point location data 132 associated with the resulting identifier. Upon the receipt of the query, the digital twin 128 can provide the point location data 132 (i.e., the coordinate of the point 164 in the second coordinate system or the building-relative coordinate system) to the query manager 120.
The query manager 120 can provide the point location data 132 and the coordinate 148 of the AR system 160 to a translation manager 124 of the translation system 112. The translation manager 124 can receive the point location data 132 and the coordinate 148 of the AR system 160 from the query manager 120. The translation manager 124 can determine the coordinate 148 of AR system 160 in the second coordinate system, e.g., the building-relative coordinate system. The translation manager 124 can translate the coordinate of the AR system 160 from the first coordinate system of the AR system 160 to the second coordinate system of the environment. For example, the translation manager 124 can translate the coordinate 148 of the AR system 160 from the first coordinate system to the second coordinate system based on at least one of a coordinate of the point 164 in the first coordinate system (e.g., a coordinate captured by the AR system 160 or determined by the query manager 120) and a coordinate of the point 164 in the second coordinate system (e.g., a coordinate indicated by the point location data 132). This allows the building system 100 to identify a location of the AR system 160 in a coordinate system that the building system 100 uses for other building entities (e.g., the point 164 or the asset 168).
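A minimal sketch of the translation performed by the translation manager 124, assuming a planar (2-D) case in which the two coordinate systems differ by an origin offset and, optionally, a rotation (the function name and the 2-D simplification are illustrative assumptions):

```python
import math
from typing import Tuple

Vec2 = Tuple[float, float]

def translate_to_building(ar_xy: Vec2,
                          point_ar_xy: Vec2,
                          point_building_xy: Vec2,
                          rotation_rad: float = 0.0) -> Vec2:
    """Map a coordinate from the first (AR system-relative) coordinate system
    into the second (building-relative) coordinate system, using the point 164,
    known in both systems, as the common reference."""
    # Offset of the AR coordinate from the point, in the first coordinate system.
    dx = ar_xy[0] - point_ar_xy[0]
    dy = ar_xy[1] - point_ar_xy[1]
    # Rotate the offset into the building-relative axes (identity if aligned).
    cos_t, sin_t = math.cos(rotation_rad), math.sin(rotation_rad)
    rx = cos_t * dx - sin_t * dy
    ry = sin_t * dx + cos_t * dy
    # Anchor at the point's known building-relative coordinate.
    return (point_building_xy[0] + rx, point_building_xy[1] + ry)
```

Applying the inverse (rotating by the negated angle and swapping the roles of the two reference coordinates) recovers AR system-relative coordinates from building-relative ones, which supports translating stored asset coordinates back for display, as discussed further below.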
The translation manager 124 can query the digital twin 128 for the asset data 136 for assets associated with or located nearby the point 164 or the AR system 160. For example, the translation manager 124 can query the digital twin 128 for information about entities located nearby the AR system 160 by querying the digital twin 128 with a coordinate of the AR system 160 translated into the second coordinate system. For example, when the point 164 is located in a zone of a building that an HVAC system controls, the translation manager 124 can query the digital twin 128 for environmental conditions (e.g., temperature, humidity, air quality, etc.) measured by a sensor located in the building zone. The digital twin 128 can provide the environmental conditions to the translation manager 124 responsive to the query. The translation manager 124 can provide the asset data 136 to a display manager 116. Based on the asset data 136, the display manager 116 can generate data, such as the display information 180. The display information 180 can be a graphical user interface, a portion of the asset data 136, a metric derived from the asset data 136, a layout of the asset data 136 within the graphical user interface, etc. The display manager 116 can transmit, communicate, send, or deliver the display information 180 to the AR system 160. The display manager 116 can communicate the display information 180 to the AR system 160 via the network 140. Communicating the display information 180 to the AR system 160 can cause the AR system 160 to display information on the display 172.
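A minimal sketch of such a proximity query, assuming the digital twin 128 can enumerate asset records carrying stored building-relative coordinates (the record shape follows the illustrative AssetRecord above; the radius and names are assumptions):

```python
import math
from typing import Iterable, List

def assets_near(ar_building_xy, assets: Iterable, radius_m: float = 10.0) -> List:
    """Return asset records whose stored building-relative coordinate lies
    within radius_m of the translated AR system coordinate."""
    nearby = []
    for asset in assets:
        ax, ay = asset.building_xyz[0], asset.building_xyz[1]
        if math.hypot(ax - ar_building_xy[0], ay - ar_building_xy[1]) <= radius_m:
            nearby.append(asset)
    return nearby
```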
The AR system 160 can receive the display information 180 from the display manager 116 of the building system 100 and display the display information 180 on the display 172. The display information 180 can be an augmented image, an overlaid image, or information added to a view of an environment. The displayed information can describe at least a portion of the asset data 136, the space itself, a piece of building equipment, or a person. For example, the display information 180 can include information describing the asset 168 (e.g., a status of the asset 168, measurements of the asset 168, an operating condition of the asset 168, a name of the asset 168, a fault the asset 168 is experiencing), information of the space where the AR system 160 is located (e.g., scheduled events, purposes of the space, temperature, etc.), information of the AR system 160 (e.g., a location of the AR system 160 in a building-relative coordinate system), or a virtual information desk. Although the display manager 116 transmits the display information 180 to the AR system 160 for display on the display 172, the AR system 160 is not limited to displaying visual information. For example, the building system 100 can cause the AR system 160 to play a sound, process data or information, communicate with one of the assets 168, control or operate the asset 168, collect data from the asset 168, etc.
In some embodiments, physical items, e.g., spaces, equipment, people, buildings, etc., can be linked via the point 164 to a virtual representation of the physical items (e.g., in a building graph). In some embodiments, based on the point 164, the building system 100 can identify where the AR system 160 is and how it is positioned. In some embodiments, the point 164 can be used to identify and/or flag items that are missing or moved. For example, any missing objects that should be present can be flagged as missing or moved. These may be stolen items, moved furniture, broken items, etc. The sensor 176 can capture a visual image of a space, and the display manager 116 can detect a missing or moved item based on a comparison between the asset data 136 and the captured image. The display information 180 can include such information so that the display 172 displays indications of the missing or moved item that the display manager 116 detects. This allows the point 164 to be used for theft or inventory alerts. In some embodiments, the point 164 can be used to display point attributes and/or label spaces. In some embodiments, the building system 100 can operate to perform asset inventory. For example, the AR system 160 can display the asset 168 in a room on the display 172. For example, when a user walks into a room, the AR system 160 can capture images of the point 164 and/or the asset 168. For example, the building system 100 can cause the AR system 160 to display an indication of a room in which the AR system 160 is located so that a user of the AR system 160 can understand in which room the user is located. For example, the building system 100 can identify the asset 168 located nearby the AR system 160 and insert, save, or add the asset 168 into the digital twin 128 without requiring the user to provide an input that inventories the asset 168.
In some embodiments, the point 164 could be used in a building information model (BIM). For example, the building system 100 can pull information from the BIM and display the information to a building user via the AR system 160. For example, the building system 100 can display, within the BIM, where the AR system 160 is located in the second coordinate system. BIMs can be used in augmented or virtual reality, in some embodiments; e.g., a user could use the AR system 160 to tour a building, and the user can select a location within the BIM for information about the location, the point 164, or the asset 168. In response to such an input by a user, the AR system 160 can display the information about the space, the point 164, or the asset 168, which is automatically retrieved from the digital twin 128.
In some embodiments, using context information associated with the point 164, e.g., the context information (e.g., asset data 136) stored in the digital twin 128, the AR system 160 can further determine any of a variety of different pieces of information about the space. For example, in various embodiments, the AR system 160 can identify a purpose of the space, data about or collected from sensors in the space, people who are in the space or who are associated with the space, and events that have happened, are happening, or will be happening in the space (e.g., a schedule associated with a conference room, etc.). In some embodiments, any context information stored in the digital twin 128 could be provided to the AR system 160. The AR system 160 may use the context information to generate the visual representation of the space (e.g., displaying the information on the display 172). In some implementations, the context information or a portion thereof (e.g., sensor readings) may be displayed based on the point 164, the asset 168 located nearby the AR system 160, the location of the AR system 160, or the context information associated with the space. In some implementations, the context information may be used to augment a visual appearance of the objects within the space (e.g., to highlight a faulted piece of equipment within the space, to identify a building object within the space based on a query relating to the object, to provide a wayfinding path to a particular area within the building or space using the points 164, etc.). While the present description discusses one point (i.e., the point 164), it should be understood that, in various embodiments, any number of points may be used and stored within the digital twin(s) 128 (e.g., one or more points 164 in each space, multiple points 164 per floor, etc.). In some implementations, the number of points 164 utilized may depend in part on the characteristics of the space (e.g., size, number of walls, etc.).
In some embodiments, the point 164 can form a bridge between the digital twin 128 and the AR system 160. The digital twin 128 (e.g., a building graph of the digital twin 128) can store data for the point 164 and/or building-relative coordinates or information of the asset 168 in the building. This can allow for augmented reality for entities or the asset 168 in the digital twin 128. In some embodiments, the point 164 can help map entities of the digital twin 128 and connect the AR system 160 to the digital twin 128. For example, via the points 164, the AR system 160 could be connected to a particular building and augment a reality view with building data and/or the asset data 136 stored in the digital twin 128. For example, for a piece of equipment, the AR system 160 can be connected to a digital twin 128 of the equipment via the point 164, and a reality view can be augmented with the asset data 136 for the piece of equipment.
In some embodiments, the points 164 and/or the asset 168 may be automatically or semi-automatically identified by the AR system 160. In some embodiments, based on the identification, the building system 100 may provide relevant information to the AR system 160. For example, when a user walks into a room with an AR system 160, the AR system 160 localizes to the room, e.g., via the point 164. In some embodiments, the AR system 160 may be automatically activated when a user walks into a room. In some embodiments, a sensor 176 of the AR system 160 may automatically detect the point 164 or a signal from the point 164. For example, a smartphone localizes to the room via Bluetooth beacon data. Based on the identified location, the building system 100 can automatically identify the asset 168 associated with the identified location and communicate display information 180 to the AR system 160 so the AR system 160 displays the display information 180 or relevant information on the display 172. In some embodiments, the point 164 can be connected in various manners, including Bluetooth beacon data, Wi-Fi base station data, ultra-wideband beacon data, etc.
In some embodiments, the building system 100 or a building user identifies a visually distinct spot of a physical environment and establishes the point 164 for the visually distinct spot. The user may point their device (i.e., the AR system 160) at the distinct spot (e.g., a piece of equipment, a room, a table, a computer, a sensor, etc.) to capture an image of the distinct spot, and the building system 100 stores the point location data 132 for the spot in the digital twin 128. The user may provide, via the AR system 160, a serial number, barcode, device name, room name, or any other identifying information for the distinct spot. This may link the point 164 to the virtual representation of the physical location via the information. When the AR system 160 captures a space where a visually distinct spot is located, an item whose information is unknown or not stored in the digital twin 128 can be identified by the building system 100 and stored in the digital twin 128.
In some embodiments, the AR system 160 can capture information that a user locates, e.g., the point 164 or the asset 168. For example, when a space manager locates a new table using a smartphone (i.e., the AR system 160), the table can be added to the digital twin 128 as a point 164 and/or an asset 168. The location of the table can be automatically added to the digital twin 128. For example, when a space manager locates a moved table, the updated location of the table can be added to the digital twin 128. This can reduce manual input in locating/registering the point 164 and/or the asset 168. In some embodiments, any time the AR system 160 locates and localizes on the asset 168 via the point 164, the asset data 136 can be identified and displayed (e.g., retrieved from a graph and displayed).
In some embodiments, the AR system 160 can provide a coordinate of an asset 168 in the first coordinate system (i.e., the AR system-relative coordinate system) to the digital twin 128. For example, the AR system 160 can include a sub-system (e.g., LIDAR) that can determine location information of an object and can determine the coordinate of the object in the first coordinate system. The digital twin 128 can receive the coordinate of the asset 168 in the first coordinate system and can translate the coordinate in the first coordinate system to a coordinate in the second coordinate system. In some embodiments, the translated coordinate of the asset 168 in the second coordinate system can be stored in the digital twin 128. In some embodiments, the digital twin 128 can identify the asset 168 within the digital twin 128 based on a comparison of the coordinate of the asset 168 in the second coordinate system (i.e., the coordinate provided by the AR system 160) and a stored coordinate of the asset 168 in the second coordinate system stored within the digital twin 128. This allows for identification of an object (e.g., the asset 168 or the point 164) based on the coordinate information.
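A minimal sketch of the coordinate comparison described above: the asset coordinate reported by the AR system 160, once translated into the second coordinate system, is matched to the nearest stored coordinate within a tolerance (the tolerance value, function name, and record fields are illustrative assumptions):

```python
import math
from typing import Iterable, Optional

def identify_asset(reported_building_xy,
                   stored_assets: Iterable,
                   tolerance_m: float = 0.5) -> Optional[object]:
    """Match a translated asset coordinate against coordinates stored in the
    digital twin 128; return the closest stored asset within tolerance, if any."""
    best, best_dist = None, tolerance_m
    for asset in stored_assets:
        d = math.hypot(asset.building_xyz[0] - reported_building_xy[0],
                       asset.building_xyz[1] - reported_building_xy[1])
        if d <= best_dist:
            best, best_dist = asset, d
    return best
```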
In some embodiments, a user can perform a scan of their room with the camera of their smartphone. For example, the user can capture a 360-degree photo of a space (or multiple flat photos). The building system 100 can match the point(s) 164 in the photo(s) and attach them to the correct locations of the digital twin 128. The building system 100 can label the asset(s) 168 in the photo(s) based on the point(s) 164. In some embodiments, asset inventorying and labeling can be done offline once such a panorama image is captured. The process can be automated with computer vision, in some embodiments. This also reduces technician site time.
In some embodiments, QR codes or other unique identifiers may be added to a space to provide the point 164 or to be a part of the indication 184 of point 164; however, this method requires the addition of a particular identifying element into the space. In some embodiments, the points 164 may be determined from a distinctive visual element within a space. For example, a computer vision process may determine a particular visual point within the space to use as the point 164. In some implementations, the point 164 may be shared between multiple AR systems 160 such that the multiple AR systems 160 are using the same common point 164. In some implementations, the common point 164 may be shared, for example, using a cloud service. The computer vision process may be able to represent the common point 164 in a manner such that the AR system 160 does not need to be in the same position at which the point 164 was captured, or viewing from the same angle, in order for the point 164 to be recognized by the AR system 160. Any visually distinctive element may be utilized as the point 164 for a space, such as a piece of artwork, a unique device, etc.
In some embodiments, the point 164 can be private and be linked to an account. The account could be an account for a user, organization, entity, company, company team, etc. In some embodiments, a user can create and/or save the point 164 to an account or link the point 164 to the account. In this regard, only information associated with the point 164 saved to the account of a particular user may be displayed to the user. In some embodiments, when a particular user who belongs to a particular group uses the AR system 160, only information linked to the point 164 associated with the particular group may be displayed.
In some embodiments, the digital twin 128 can operate to provide augmented and/or virtual reality to the AR system 160 in a user device (e.g., a cellphone, smart glasses, a virtual reality device, etc.). The digital twin 128 can, in some embodiments, coordinate with an internal system (e.g., the same system as the digital twin 128) and/or an external system that provides the augmented or virtual reality view on the user device.
In some embodiments, the building system 100 may be configured to cause the AR system 160 to present one or more virtual agents on the display 172 of the AR system 160 to assist a user in one or more spaces of the building. For example, in some embodiments, the building system 100 may cause the AR system 160 to generate a virtual front desk agent on the display 172 of the AR system 160 that can be seen in or near a lobby of the building and can assist with tasks such as, for example, contacting an employee/occupant of the building, helping the user access the building (e.g., via credentials), helping the user find their way to a particular room or space of the building, etc. In some embodiments, the building system 100 may cause the AR system 160 to generate a supply station agent that can be displayed on the display 172 of the AR system 160 when the user is near a supply station that may, for example, help the user find building supplies. In some embodiments, the building system 100 may cause the AR system 160 to generate a maintenance agent that can be displayed, for example, when the user is near a mechanical room and may assist the user with performing maintenance tasks. In some embodiments, the agents may be visually represented as human beings or in any other visual form. In various embodiments, the visual appearance and/or functionality/capabilities of the virtual agents may be generated at least in part using information from one or more digital twins 128 of the building. For example, the virtual front desk agent may have capabilities determined, for example, based on the digital twin 128 of the lobby or other spaces of the building, the digital twin 128 of the user (e.g., indicating a role, permissions, etc. of the user), the digital twin 128 of equipment in the building, etc. In another example, the supply station agent may have capabilities determined, for example, based on the digital twin 128 of the supply station or supplies contained therein (e.g., indicating a current stock level of one or more types of supplies). In another example, the maintenance agent may have capabilities determined, for example, based on the digital twin 128 of the user (e.g., indicating a role/capabilities/knowledge of the user), the digital twin 128 of the space (e.g., the mechanical room), the digital twin 128 of the equipment in or served by the space, the digital twin 128 of other spaces of the building such as rooms/floors served by the mechanical room, the digital twin 128 of equipment in the other spaces served by the mechanical room, etc.
Referring now to
A first location 265 can be a location where the first AR system 260 is activated, turned on, or otherwise enabled. A second location 266 can be a location where the second AR system 261 is activated, turned on, or otherwise enabled. Although the AR systems 260 and 261 are located at the same physical point of the space, the locations of the two AR systems 260 and 261 can be described in different coordinate systems. The two AR systems 260 and 261 separately map the space and determine coordinates differently in relation to their respective physical starting locations (e.g., the first location 265 and the second location 266). For example, if the AR systems 260 and 261 began mapping at different points, the two AR systems 260 and 261 will represent the same actual location (e.g., the location of the agent 202) within the space using different coordinate systems. For example, the first AR system 260 is in one coordinate system where the AR system 260 is located at (0, −1), whereas the second AR system 261 is in another coordinate system where the AR system 261 is located at (−1, 3). For example, the agent 202 is located at (−1.5, −2) in the coordinate system of the first AR system 260, whereas the agent 202 is located at (−2.5, 2) in the coordinate system of the second AR system 261.
Using the point 164 as a common reference point within the space, the two AR systems 260 and 261 can be represented in a single coordinate system (e.g., a building-relative coordinate system, a point-relative coordinate system, or a space-relative coordinate system). For example, the coordinate systems can be translated to a point-relative coordinate system 203. In this case, the AR systems 260 and 261 may both be located at (2, 0) and the agent 202 at (0.5, −1) in the point-relative coordinate system 203. In this manner, although the coordinates of the agent 202 within the respective coordinate systems of the separate AR systems 260 and 261 are different, the coordinates can be represented in the same coordinate system (i.e., in the point-relative coordinate system 203), providing a consistent experience between the AR systems 260 and 261.
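The example's coordinates can be checked numerically. In the following sketch (axes assumed aligned across the coordinate systems, consistent with the example values), translating the agent 202 from either AR system's own coordinate system through the shared reference yields the same point-relative coordinate:

```python
def to_point_relative(entity_xy, ar_xy, ar_point_relative_xy):
    """Express an entity observed in an AR system's own coordinate system in
    the point-relative coordinate system 203, given the AR system's known
    point-relative coordinate (both AR systems are at (2, 0) in that system)."""
    return (ar_point_relative_xy[0] + entity_xy[0] - ar_xy[0],
            ar_point_relative_xy[1] + entity_xy[1] - ar_xy[1])

# Agent 202 as seen by the first AR system 260: AR at (0, -1), agent at (-1.5, -2).
print(to_point_relative((-1.5, -2), (0, -1), (2, 0)))  # (0.5, -1)
# Agent 202 as seen by the second AR system 261: AR at (-1, 3), agent at (-2.5, 2).
print(to_point_relative((-2.5, 2), (-1, 3), (2, 0)))   # (0.5, -1)
```

Both observations resolve to (0.5, −1), matching the agent's coordinate in the point-relative coordinate system 203.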
Referring now to
In some embodiments, the coordinate 304 of the AR system 160 or the point 164 can be determined by the query manager 120 of the building system 100. Once the AR system 160 provides the indication 184 of point 164 (e.g., a photo of the star shown in
In some embodiments, the AR system 160 can determine the coordinate 304 of the AR system 160 or the point 164, or can provide the query manager 120 with a part of the information used to determine the coordinate 304 of the AR system 160 or the point 164. For example, the AR system 160 can determine the coordinate of the point 164 (e.g., the star) in the first coordinate system (i.e., in relation to the AR system 160), using, for example, a LIDAR sensor. This allows the AR system 160 to determine the relative position between the AR system 160 and the point 164, thereby providing the coordinate of the point 164 (or the coordinate 148 of AR system 160) in the first coordinate system to the query manager 120.
In some embodiments, the AR system 160 or a user of the AR system 160 may provide an identifier associated with the point 164 to the query manager 120. For example, the AR system 160 may automatically or semi-automatically identify the point 164 using a pattern recognition algorithm. For example, the AR system 160 may be configured to scan a QR code and send the code to the query manager 120. For example, a user of the AR system 160 can manually enter a code or an identifier associated with the point 164. In such embodiments, the query manager 120 need not analyze the indication 184 of point 164 to find the identifier associated with the point 164.
Referring now to
The query manager 120 can query the digital twin 128 of the space 400 to discover other assets in the space 400 and/or other spaces and receive the building-relative coordinates 402 of the assets 168. The translation manager 124 can then translate the AR system-relative coordinates 404 (i.e., in the first coordinate system) into the building-relative coordinates 402 (i.e., in the second coordinate system) based on the relationship between the building-relative coordinate 402 and the AR system-relative coordinate 404 of the point 164. In some embodiments, the building-relative coordinates 402 of the assets 168 of a building can be stored in the digital twin 128. In some embodiments, the translation manager 124 can translate the building-relative coordinates 402 of the assets 168 (i.e., in the second coordinate system) into the AR system-relative coordinates 404 of the assets 168 (i.e., in the first coordinate system) and include those AR system-relative coordinates in the display information 180. This allows for a user-friendly and AR system-oriented environment in which the AR system 160 operates and processes data regarding the space 400, the point 164, or the asset 168 in the AR system-relative coordinate system, while still being able to switch to operating in the building-relative coordinate system.
Referring to FIG. 5, a building information model (BIM) interface 500 is shown, according to an exemplary embodiment.
In some embodiments, the AR system 160 can display a graphical representation of a space of an environment (e.g., a building) where a point 164 is located. The AR system 160 may display the graphical representation on a display 172 of the AR system 160 or on a BIM interface 500 as shown in FIG. 5.
Referring now to FIG. 6, a building graph 600 of the digital twin 128 is shown, according to an exemplary embodiment.
The node 631 represents the AR system 160 and is linked, via the edge 680, to the node 634 that represents AR system-relative coordinate information and/or the indication 184 of point 164 of the AR system 160 represented by the node 631. The node 631 is also linked, via the edge 682, to the node 638 that represents the space where the point 164 represented by the node 601 and the AR system 160 represented by the node 631 are located. While a plurality of nodes representing the building assets 168 may be linked to the node 638, two such nodes 642 and 650 are shown linked to the node 638, via the edges 684 and 686, respectively. For example, the node 642 represents a sensor and is linked via the edge 688 to the node 646 for AR system-relative coordinate information, and the node 650 represents a thermostat and is linked via the edge 690 to the node 654 for AR system-relative coordinate information. In some embodiments, the asset nodes (e.g., 642, 650) may be linked to nodes that represent other information, such as a status of the asset 168. The nodes 608, 612, 616, 620, and 624 can be related respectively to the nodes 638, 642, 646, 650, and 654. For example, data for the node 642 or 650 may be provided by the node 612 or 620, as shown by the dashed lines 692 and 694.
In some embodiments, the points 164 can be used to onboard data into the building graph 600. For example, a user can use their AR system 160 to identify objects, identify physical points of interest, input information, create a point 164, etc. The building graph 600 can then provide data (e.g., point data, entity relationship data, names of entities, etc.) to the AR system 160 to augment the view in the device whenever the point 164 is triggered, e.g., by displaying data of the graph associated with the point 164 in the view.
In some embodiments, the digital twin 128 can include a graph data structure of an environment including nodes representing entities of the environment and edges between the nodes, the edges representing relationships between the entities. The building system 100 (e.g., the digital twin 128) can receive the coordinate of the point 164 in the second coordinate system (e.g., the building-relative coordinate system) and can identify a first node of the nodes representing the point 164 (e.g., node 601). In some embodiments, the building system 100 can generate or update a second node (e.g., node 604) related to the first node by an edge (e.g., edge 660) to cause the second node to store or link to the coordinate of the point 164 in the second coordinate system.
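A minimal sketch of generating or updating such a coordinate node is below; the node payloads, relation name, and class shape are assumptions for illustration, not the disclosed graph schema.

```python
from dataclasses import dataclass, field

@dataclass
class Graph:
    nodes: dict = field(default_factory=dict)   # node_id -> payload
    edges: set = field(default_factory=set)     # (src_id, relation, dst_id)

    def upsert_coordinate(self, point_id: str, coord_id: str, coord) -> None:
        """Create or update a coordinate node and relate the point node to it,
        mirroring node 601 -> edge 660 -> node 604 in the example graph."""
        self.nodes[coord_id] = {"type": "coordinate", "value": coord}
        self.edges.add((point_id, "hasLocation", coord_id))

graph = Graph()
graph.nodes["601"] = {"type": "point", "name": "star"}
graph.upsert_coordinate("601", "604", (10.0, 7.0))
```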
In some embodiments, the digital twin 128 can include a graph data structure of an environment including nodes representing entities of the environment and edges between the nodes, the edges representing relationships between the entities. The building system 100 can query the digital twin 128 for the coordinate of the point 164 in the second coordinate system by identifying a node and an edge. For example, the building system 100 (e.g., the digital twin 128) can identify a first node (e.g., node 601) representing the point 164 and identify an edge (e.g., edge 660) relating the first node to a second node (e.g., node 604). Based on the second node, the query manager 120 can retrieve the coordinate of the point 164 in the second coordinate system.
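Continuing the illustrative graph shape above, the retrieval side of this query can be sketched as a simple traversal: find the edge relating the point node to its coordinate node and read the stored value. The relation name remains an assumption.

```python
def point_coordinate(nodes, edges, point_id):
    """Follow the relating edge from the point node to its coordinate node."""
    for src, relation, dst in edges:
        if src == point_id and relation == "hasLocation":
            return nodes[dst]["value"]  # coordinate in the second coordinate system
    return None

nodes = {"601": {"type": "point"}, "604": {"type": "coordinate", "value": (10.0, 7.0)}}
edges = {("601", "hasLocation", "604")}
assert point_coordinate(nodes, edges, "601") == (10.0, 7.0)
```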
In some embodiments, the digital twin 128 can include a graph data structure of an environment including nodes representing entities of the environment and edges between the nodes, the edges representing relationships between the entities. The building system 100 can query the digital twin 128 for the information describing an entity of the environment (e.g., a building asset). The building system 100 can query the digital twin 128 for the information by identifying a first node (e.g., node 612) representing an entity (e.g., a sensor) and by identifying an edge (e.g., edge 668) relating the first node to a second node (e.g., node 616) representing a coordinate of the entity in the second coordinate system. The building system 100 can query the digital twin 128 for the information by determining that the coordinate of the entity in the second coordinate system indicates that the entity is within a distance from the AR system 160 and retrieving the information of the entity responsive to a determination that the entity is within the distance from the AR system 160. In some embodiments, the digital twin 128 can be automatically or semi-automatically queried for the information.
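A minimal sketch of the distance determination follows, assuming two-dimensional building-relative coordinates and a Euclidean radius; the data shapes are illustrative only.

```python
import math

def entities_near(assets, ar_coord, radius):
    """Return entity ids whose building-relative coordinate lies within
    `radius` of the AR system's building-relative coordinate.

    assets: mapping of entity id -> (x, y) in the second coordinate system.
    """
    return [
        entity_id
        for entity_id, (x, y) in assets.items()
        if math.hypot(x - ar_coord[0], y - ar_coord[1]) <= radius
    ]

assets = {"sensor-642": (11.0, 7.5), "thermostat-650": (40.0, 2.0)}
assert entities_near(assets, ar_coord=(10.0, 7.0), radius=5.0) == ["sensor-642"]
```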
Referring now to FIG. 7, a process 700 for locating the AR system 160 within an environment and causing the AR system 160 to display information is shown, according to an exemplary embodiment. The process 700 can be performed by the building system 100.
The process 700 can include a step 710 of receiving the indication 184 of point 164 sensed by the AR system 160, the AR system 160 located at a first coordinate of a first coordinate system. The step 710 can include receiving, by the building system 100, the indication 184 of point 164. For example, the sensor 176 of the AR system 160 can sense, detect, or identify the point 164. The sensor 176 can capture images of the point 164. The indication 184 of the point 164 can be a set of images, a code read from a QR code, an identifier of the point 164, or any other piece of information that the sensor 176 of the AR system 160 measures, detects, captures, or generates. The AR system 160 can transmit the indication 184 of the point 164 to the building system 100, e.g., to the query manager 120. Furthermore, the AR system 160 can record, determine, or generate first coordinates 148 in a first coordinate system (e.g., a coordinate system of the AR system 160) that indicate the location of the AR system 160 in an environment.
The indication 184 of the point 164 can be or include at least a part of the measurements of the sensor 176. For example, the indication 184 of point 164 can be an image/photo of the point 164 or an image/photo of a part of the point 164. In some embodiments, the indication 184 of point 164 can include a signal (e.g., a signal from a QR code, a signal from a barcode, or a signal from an RFID tag). In some embodiments, the indication 184 of point 164 can be manually entered by a building user. For example, a building user can manually indicate the nearest painting by the name of the painting, which can be a part of the indication 184 of point 164. In some embodiments, the AR system 160 can provide at least a part of the coordinate 148 of AR system 160 to the building system 100. For example, the AR system 160 may be equipped with a detection/ranging system (e.g., LIDAR) and provide a location of the point 164 relative to the AR system 160 to the query manager 120 of the building system 100.
The AR system 160 can transmit the coordinate 148 of the AR system 160 to the building system 100, e.g., to the query manager 120. The query manager 120 of the building system 100 can receive the indication 184 of point 164 directly or indirectly from the AR system 160. The building system 100 can communicate with the AR system 160 via one or more networks 140.
The process 700 can include a step 720 of querying the digital twin 128 for a coordinate of the point 164 in the second coordinate system. For example, the query manager 120 can use the indication 184 of the point 164 to query the digital twin 128 for the coordinate of the point 164 in a second coordinate system, e.g., a building-relative coordinate system or a coordinate system used by the building system 100. The query manager 120 can generate query data, e.g., query parameters including a name of the point 164, an identifier of the point 164, an image of the point 164, a shape of the point 164, or any other identifying information of the point 164. The query manager 120 can use the query data to query the digital twin 128. The query manager 120 can receive point location data 132 from the digital twin 128 responsive to the querying. The point location data 132 can include data that describes the point 164. For example, the point location data 132 can include a coordinate of the point 164 in the second coordinate system.
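One possible shape of such query data and a matching lookup is sketched below; the field names and the in-memory "twin" are assumptions for illustration, not an actual digital-twin API.

```python
# Illustrative query parameters the query manager 120 might assemble.
query = {
    "name": "star",               # name of the point 164, if known
    "identifier": "point-164",    # identifier decoded from, e.g., a QR code
}

def query_digital_twin(twin, params):
    """Return point location data (including the second-coordinate-system
    coordinate) for the first record matching a provided parameter."""
    for record in twin:
        if (record.get("identifier") == params.get("identifier")
                or record.get("name") == params.get("name")):
            return record
    return None

twin = [{"identifier": "point-164", "name": "star", "coordinate": (10.0, 7.0)}]
point_location_data = query_digital_twin(twin, query)
```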
In some embodiments, the query manager 120 can determine an identifier of the point 164 (e.g., the name of the point, an identification number/code of the point, etc.) based on the indication 184 of point 164. The digital twin 128 can receive a query from the query manager 120 for the point 164. The query can include the determined identifier or the indication 184 of point 164. Based on the identifier or the indication 184 of point 164, the digital twin 128 can provide the point location data 132 to the query manager 120. The digital twin 128 can provide the query manager 120 coordinate information of the point 164 stored in the digital twin 128, the coordinate information including the coordinates of the point 164 in the second coordinate system.
The query manager 120 can provide the point location data 132 and the coordinate 148 of AR system 160 to a translation manager 124. In some embodiments, the step 720 can include querying, by the query manager 120, the digital twin 128 for characteristic information of the point 164 to determine a spatial relationship between the AR system 160 and the point 164. Using the determined identifier, the query manager 120 can query the digital twin 128 for characteristic information of the point 164 associated with the identifier. The characteristic information can include a dimension, a shape, a pattern, or a color of the point 164, or any other type of information describing the point 164 that can be compared with the indication 184 of point 164. Based on the comparison between the indication 184 of point 164 and the characteristic information, the query manager 120 can determine a relative position between the AR system 160 and the point 164 (i.e., the coordinate 148 of AR system 160 in the first coordinate system or the second coordinate system). For example, the query manager 120 can determine the distance between the AR system 160 and the point 164 by comparing the actual dimension of the point 164 included in the characteristic information with the dimension appearing in the indication 184 of point 164 (e.g., a photo). As another example, the query manager 120 can determine the azimuthal location of the AR system 160 in relation to the point 164 by analyzing the indication 184 of point 164 (e.g., the angle at which a photo is taken).
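The dimension comparison can be sketched with a simple pinhole-camera assumption: the distance to the point follows from its physical size (from the characteristic information) and its apparent size in the captured photo. The function and values below are illustrative only.

```python
def distance_from_apparent_size(actual_height_m: float,
                                apparent_height_px: float,
                                focal_length_px: float) -> float:
    """Pinhole model: distance = actual size * focal length / apparent size."""
    return actual_height_m * focal_length_px / apparent_height_px

# A 0.5 m star spanning 200 px in an image with a 1200 px focal length
# implies the AR system is about 3 m from the point.
d = distance_from_apparent_size(0.5, 200.0, 1200.0)  # -> 3.0
```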
The process 700 can include a step 730 of translating the first coordinate of the AR system 160 in the first coordinate system into a second coordinate of the AR system 160 in the second coordinate system based on the coordinate of the point 164 in the second coordinate system and the first coordinate 148 of AR system 160 in the first coordinate system. For example, because the translation manager 124 has the coordinate of the point 164 in both the first coordinate system and the second coordinate system, the translation manager 124 can use the point 164 as a common reference point to translate between the first and second coordinate systems. By comparing the coordinates of the point 164 in the first and second coordinate systems, the translation manager 124 can map a coordinate from the first coordinate system into the second coordinate system, and, similarly, from the second coordinate system into the first coordinate system. The translation manager 124 can use the coordinates of the point 164 in the first and second coordinate systems to map the coordinate 148 of the AR system 160 from the first coordinate system into the second coordinate system.
In some embodiments, the translation manager 124 can perform the translation based on a spatial relationship between the AR system 160 and the point 164. For example, if the translation manager 124 can determine a vector indicating the direction and distance that the AR system 160 is located away from the point 164, the translation manager 124 can generate a coordinate for the AR system 160 in the second coordinate system. The translation manager 124 can determine the vector based on the images of the point 164 that the AR system 160 captures and data describing the physical characteristics of the point 164 stored in the digital twin 128.
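A minimal sketch of this vector-based step is below: once the vector from the point 164 to the AR system 160 is known in the second (building-relative) coordinate system, the AR system's second-coordinate-system coordinate is the point's coordinate plus that vector. The geometry and values are illustrative.

```python
def ar_coordinate(point_building, offset_vector):
    """Add the point-to-AR-system vector (expressed in the building frame)
    to the point's building-relative coordinate."""
    return (point_building[0] + offset_vector[0],
            point_building[1] + offset_vector[1])

# Point at (10, 7) in the building frame; AR system 3 m east and 1 m north:
assert ar_coordinate((10.0, 7.0), (3.0, 1.0)) == (13.0, 8.0)
```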
The process 700 can include a step 740 of displaying the information 180 based at least in part on the translation. The display manager 116 can cause the AR system 160 to display the information 180 based at least in part on the translation performed in the step 730. The AR system 160 can receive the display information 180 from the display manager 116 via the network 140. In some embodiments, displaying the display information 180 can include displaying the display information 180 on the display 172, which can be or include a liquid crystal display (LCD), a light emitting diode (LED) display, a curved-mirror-based display, a waveguide-based display, or any other type of display device. In some embodiments, displaying the information 180 may include projecting images (e.g., 2D or 3D models, holograms, etc.) onto a visual display 172 of the AR system 160. For example, the images can be projected onto an optical projection system of the AR system 160 and/or onto the display screen 172 of the AR system 160. In some embodiments, displaying the information 180 may include superimposing images over captured images (e.g., superimposing images onto a live video).
The construction and arrangement of the systems and methods as shown in the various exemplary embodiments are illustrative only. Although only a few embodiments have been described in detail in this disclosure, many modifications are possible (e.g., variations in sizes, dimensions, structures, shapes and proportions of the various elements, values of parameters, mounting arrangements, use of materials, colors, orientations, etc.). For example, the position of elements may be reversed or otherwise varied and the nature or number of discrete elements or positions may be altered or varied. Accordingly, all such modifications are intended to be included within the scope of the present disclosure. The order or sequence of any process or method steps may be varied or re-sequenced according to alternative embodiments. Other substitutions, modifications, changes, and omissions may be made in the design, operating conditions and arrangement of the exemplary embodiments without departing from the scope of the present disclosure.
The present disclosure contemplates methods, systems and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a machine, the machine properly views the connection as a machine-readable medium. Thus, any such connection is properly termed a machine-readable medium. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
Although the figures show a specific order of method steps, the order of the steps may differ from what is depicted. Also, two or more steps may be performed concurrently or with partial concurrence. Such variation will depend on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations could be accomplished with standard programming techniques with rule based logic and other logic to accomplish the various connection steps, processing steps, comparison steps and decision steps.
In various implementations, the steps and operations described herein may be performed on one processor or in a combination of two or more processors. For example, in some implementations, the various operations could be performed in a central server or set of central servers configured to receive data from one or more devices (e.g., edge computing devices/controllers) and perform the operations. In some implementations, the operations may be performed by one or more local controllers or computing devices (e.g., edge devices), such as controllers dedicated to and/or located within a particular building or portion of a building. In some implementations, the operations may be performed by a combination of one or more central or offsite computing devices/servers and one or more local controllers/computing devices. All such implementations are contemplated within the scope of the present disclosure. Further, unless otherwise indicated, when the present disclosure refers to one or more computer-readable storage media and/or one or more controllers, such computer-readable storage media and/or one or more controllers may be implemented as one or more central servers, one or more local controllers or computing devices (e.g., edge devices), any combination thereof, or any other combination of storage media and/or controllers regardless of the location of such devices.
While the techniques described herein relate to translation between coordinate systems for an AR system, the digital twin based techniques for displaying information about an environment on an AR system can be used with or without translation between coordinate systems. For example, an AR system can be programmed to utilize the coordinate system of a building and, therefore, the building system may not need to perform any coordinate system translation. The building system can utilize the coordinate of the AR system in the building-relative coordinate system to query the digital twin for information describing assets in an area around the AR system. The building system can generate data to cause the AR system to display the information describing the asset.
This application claims the benefit of, and priority to, U.S. Provisional Patent Application No. 63/285,077, filed Dec. 1, 2021, the entirety of which is incorporated by reference herein.