Systems and methods for entity visualization and management with an entity node editor

Information

  • Patent Grant
  • Patent Number
    11,941,238
  • Date Filed
    Monday, March 7, 2022
  • Date Issued
    Tuesday, March 26, 2024
Abstract
A method for visualizing and managing entities and connections between entities based on a graphical user interface (GUI) node editor includes constructing an entity datablock, wherein the entity datablock is a data structure describing an entity and comprises entity descriptive information, an entity category, an entity name, entity relationships, and an entity identifier. The method includes establishing the entity relationships as bi-directional relationships, wherein the bi-directional relationships link two disparate entities. The method further includes representing the entity datablock as a node and the bi-directional relationships as lines connecting two nodes, wherein the node comprises at least one of the entity descriptive information, the entity category, or the entity name.
Description
BACKGROUND

The present disclosure relates generally to building management systems. The present disclosure relates more particularly to visualizing entities associated with building management systems.


A user may operate a system to manage a number of entities. The system may include a large number of related entities and can display those related entities in a graphical user interface (GUI). As the number of related entities increases, the GUI can become difficult for the system to display and for the user to understand.


A hierarchical tree structure is one method to abstract the relationships between entities and to specify the most relevant entities. However, a hierarchical tree structure traditionally supports only one dimension of complexity.





BRIEF DESCRIPTION OF THE DRAWINGS

Various objects, aspects, features, and advantages of the disclosure will become more apparent and better understood by referring to the detailed description taken in conjunction with the accompanying drawings, in which like reference characters identify corresponding elements throughout. In the drawings, like reference numbers generally indicate identical, functionally similar, and/or structurally similar elements.



FIG. 1 is a schematic perspective view drawing of a building with a security system, according to an exemplary embodiment.



FIG. 2 is a block diagram of a system that can be configured to implement an entity node editor to visualize and manage entities of the building illustrated in FIG. 1, according to an exemplary embodiment.



FIG. 3 is an interface generated by the entity node editor illustrated in FIG. 2, the interface including multiple entities visualized as nodes, according to an exemplary embodiment.



FIG. 4 is a schematic drawing of a menu interaction of the interface illustrated in FIG. 3, including multiple interactive elements, according to an exemplary embodiment.



FIG. 5 is a schematic drawing of a relationship management interaction with the interface illustrated in FIG. 3, including multiple interactive relationship elements, according to an exemplary embodiment.



FIG. 6 is a flow diagram of a process of relationship management that can be performed by the entity node editor illustrated in FIG. 2, the flow diagram including interface elements of the interface of FIG. 3, according to an exemplary embodiment.



FIG. 7 is a flow diagram of a process of entity management that can be performed by the entity node editor illustrated in FIG. 2, the flow diagram including interface elements of the interface of FIG. 3, according to an exemplary embodiment.





SUMMARY

A method for visualizing and managing entities and connections between entities based on a graphical user interface (GUI) node editor includes constructing an entity datablock, wherein the entity datablock is a data structure describing an entity and includes entity descriptive information, an entity category, an entity name, entity relationships, and an entity identifier. The method includes establishing the entity relationships as bi-directional relationships, wherein the bi-directional relationships link two disparate entities. The method includes representing the entity datablock as a node and the bi-directional relationships as lines connecting two nodes. The node includes at least one of the entity descriptive information, the entity category, or the entity name. The method further includes receiving, by a system to implement a new entity datablock, from a user device, a request to create the new entity datablock. The method includes generating, by the system to implement the new entity datablock, an entity identifier for the new entity datablock. The method includes instantiating, by an entity database, a new data structure element with the entity identifier. The entity database includes a number of data structure elements. The method includes storing, by the entity database, entity information in the new data structure element. The entity information includes entity descriptive information, an entity category, an entity name, and entity relationships.
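
As a rough illustration of the construction flow described above, the sketch below walks through receiving a creation request, generating an entity identifier, instantiating a new data structure element, and storing the entity information. It is a minimal sketch only; the Python names (EntityDatabase, create_entity) and the dictionary-based datablock layout are assumptions for illustration and are not part of the claimed method.

```python
# Minimal sketch of the construction flow, assuming an in-memory entity
# database keyed by entity identifier. All names here are illustrative and
# are not defined by the disclosure.

class EntityDatabase:
    def __init__(self):
        self.datablocks = {}          # entity identifier -> entity datablock
        self.next_identifier = 10000  # starting value is an assumption

    def generate_identifier(self):
        # Generate an entity identifier for the new entity datablock.
        identifier = str(self.next_identifier)
        self.next_identifier += 1
        return identifier

    def create_entity(self, request):
        # Receive a request to create a new entity datablock, instantiate a
        # new data structure element with the identifier, and store the
        # entity information in it.
        identifier = self.generate_identifier()
        self.datablocks[identifier] = {
            "identifier": identifier,
            "name": request.get("name"),
            "category": request.get("category"),
            "info": request.get("info", {}),
            "relationships": {"parents": [], "children": []},
        }
        return identifier


db = EntityDatabase()
building_id = db.create_entity(
    {"name": "Building 5 - Corporate South", "category": "Building"}
)
```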


The method of establishing the entity relationships further includes linking the two disparate entities with a pair of unique identifiers. The pair of unique identifiers are associated with the two disparate entities and are stored in the entity datablock of each of the two disparate entities.


The method of representing the entity datablock as a node further includes displaying an interactive GUI element that allows a user to edit the entity datablock associated with the node. The method of representing the bi-directional relationships as lines further includes displaying an interactive GUI element that allows a user to edit the entity relationships of the two disparate entities.


The GUI node editor further includes one or more graphical interactive elements that form a digital representation of one or more physical entities. The one or more physical entities further includes an object that is able to be controlled by the GUI node editor. The GUI node editor is configured to control the one or more physical entities.


A GUI node editor for visualizing and managing entities and connections between entities is also provided. The GUI node editor includes representations of one or more physical entities, one or more user devices, and one or more memory devices configured to store instructions that, when executed on one or more processors, cause the one or more processors to construct an entity datablock. The entity datablock is a data structure describing an entity and includes entity descriptive information, an entity category, an entity name, entity relationships, and an entity identifier. The instructions cause the one or more processors to establish the entity relationships as bi-directional relationships. The bi-directional relationships link two disparate entities. The instructions cause the one or more processors to represent the entity datablock as a node and the bi-directional relationships as lines connecting two nodes. The node includes at least one of the entity descriptive information, the entity category, or the entity name.


The instructions cause the one or more processors to construct the entity datablock by receiving, from the one or more user devices, a request to create a new entity datablock. The instructions cause the one or more processors to construct the entity datablock by generating an entity identifier for the new entity datablock and instantiating, in an entity database, a new data structure element with the entity identifier. The entity database includes a number of data structure elements. The instructions cause the one or more processors to construct the entity datablock by storing entity information in the new data structure element. The entity information includes entity descriptive information, an entity category, an entity name, and entity relationships.


The instructions cause the one or more processors to establish the entity relationships by linking the two disparate entities with a pair of unique identifiers. The pair of unique identifiers are associated with the two disparate entities and are stored in the entity datablock of each of the two disparate entities. The instructions cause the one or more processors to represent the entity datablock as a node by displaying an interactive GUI element that allows a user to edit the entity datablock associated with the node. The instructions cause the one or more processors to represent the bi-directional relationships as lines by displaying an interactive GUI element that allows a user to edit the entity relationships of the two disparate entities. The GUI node editor further includes one or more graphical interactive elements that form a digital representation of one or more physical entities. The one or more physical entities includes an object that is able to be controlled by the GUI node editor. The GUI node editor is configured to control the one or more physical entities.


A building automation system includes one or more physical entities and a GUI node editor. The GUI node editor includes one or more entity datablocks. The one or more entity datablocks are a data structure describing the one or more physical entities and include entity descriptive information, an entity category, an entity name, entity relationships, and an entity identifier. The GUI node editor includes one or more bi-directional entity relationships. The one or more bi-directional entity relationships link two disparate entities. The GUI node editor includes a number of nodes and a number of lines connecting two nodes. The number of nodes includes at least one of the entity descriptive information, the entity category, or the entity name.


DETAILED DESCRIPTION

Overview


Referring generally to the FIGURES, systems and methods for visualizing and managing entities via an entity node editor are shown, according to various exemplary embodiments. Security operators using large security monitoring systems manage a large number of physical elements. In the context of a connected building management system, the term entity is used to describe any related physical element such as a facility, building, door, floor, sensor, or other element not here listed. In the context of a building management visualization methodology, the term node can be used to describe an entity which communicably connects two or more disparate entities.


A simplified management method to visualize entities is provided. By way of example, a node could be a floor of a building which contains several rooms. Managing and visualizing the connections between entities is a difficult task because of a potentially large number of individual entities. Traditional systems that manage and visualize numerous entities, either on-location or remotely, support only one-dimensional relationships between entities and can present challenges for a security operator to use effectively. Tools that allow security operators to focus on the most relevant information about entities allow for a faster response time. Visualizing complex connections between entities will allow security operators to more effectively manage a large number of entities in a building management system.


Visualization of greater levels of complexity and bi-directional relationships for a large number of entities is achieved in some embodiments. The entity node editor can be in the form of a graphical user interface (GUI). The entity node editor provides the user with detailed, contextual information about the entities under management, the relationship between entities, the number of connected entities, and other relevant information. Only the most relevant information can be presented to the user and any duplicate relationships are collapsed in the interface. This simplifies the way information is presented, making it easier for users to focus on the most important information. The interface may provide a means for the user to view the entity details and entity relationship details for further insight or management. The entity node editor described herein can be used as part of a security monitoring system, building management system, or other system, providing detailed and relevant information, together with a direct means of managing entities and entity relationships. The system advantageously overcomes problems created by software interfaces that do not provide sufficient visualization of levels of interconnection.


Building with Security System


Referring now to FIG. 1, a building 100 with a security camera 102 and a parking lot 110 is shown, according to an exemplary embodiment. The building 100 is a multi-story commercial building surrounded by or near the parking lot 110 but can be any type of building in some embodiments. The building 100 may be a school, a hospital, a store, a place of business, a residence, an apartment complex, a hotel, an office building, etc. The building 100 may be associated with the parking lot 110.


Both the building 100 and the parking lot 110 are at least partially in the field of view of the security camera 102. In some embodiments, multiple security cameras 102 may be used to capture portions of the building 100 and the parking lot 110 that are not in the field of view of a single security camera 102, or to create multiple overlapping or identical fields of view from different angles. The parking lot 110 may be used by one or more vehicles 104, where the vehicles 104 may be either stationary or moving (e.g., delivery vehicles). The building 100 and the parking lot 110 may be further used by one or more pedestrians 106 who can traverse the parking lot 110 and/or enter and/or exit the building 100. The building 100 may be further surrounded or partially surrounded by a sidewalk 108 to facilitate the foot traffic of one or more pedestrians 106, facilitate deliveries, etc. In some embodiments, the building 100 may be one of many buildings belonging to a single industrial park, shopping mall, or commercial park having a common parking lot and security camera 102. In another embodiment, the building 100 may be a residential building or multiple residential buildings that share a common roadway or parking lot.


The building 100 is shown to include a door 112 and multiple windows 114. An access control system can be implemented within the building 100 to secure these potential entrance ways of the building 100. For example, badge readers can be positioned outside the door 112 to restrict access to the building 100. The pedestrians 106 can each be associated with access badges that they can utilize with the access control system to gain access to the building 100 through the door 112. Furthermore, other interior doors within the building 100 can include access readers. In some embodiments, the doors are secured through biometric information, e.g., facial recognition, fingerprint scanners, etc. The access control system can generate events, e.g., an indication that a particular user or particular badge has interacted with the door. Furthermore, if the door 112 is forced open, the access control system, via a door sensor, can detect a door forced open (DFO) event.


The windows 114 can be secured by the access control system via burglar alarm sensors. These sensors can be configured to measure vibrations associated with the window 114. If vibration patterns or levels of vibrations are sensed by the sensors of the window 114, a burglar alarm can be generated by the access control system for the window 114.


The building 100 can further include HVAC systems. For example, waterside systems, airside systems, building management systems, and/or various other HVAC systems can be included within the building 100. For example, equipment such as chillers, boilers, rooftop units, air handler units, thermostats, sensors, actuators, dampers, valves, and other equipment can be implemented within the building 100 to control the environmental conditions of the building 100. Examples of building equipment that can be implemented within the building 100 can be found in U.S. patent application Ser. No. 16/048,052 filed Jul. 27, 2018, the entirety of which is incorporated by reference herein.


In some embodiments, the security camera 102 is installed for purposes of monitoring a parking lot 110 and/or sidewalk 108 for accumulated snow. For example, the security camera may be configured to communicate with an image analysis device (e.g., convolutional neural network) to determine if the parking lot 110 or sidewalk 108 are covered with snow and accordingly require snow removal services. In such embodiments, vehicles 104 and/or pedestrians 106 could partially occlude the parking lot 110 or sidewalk 108. When the parking lot 110 and sidewalk 108 are partially occluded, it is possible that an image analysis system could inaccurately classify the parking lot 110 or sidewalk 108 as being covered in snow.


In some embodiments, the security camera 102 is configured to use an image analysis system to observe the parking lot 110 for the purpose of determining how many parking spaces are open and/or occupied. In these embodiments, pedestrians 106, snow, or some other foreign object could partially occlude the parking spaces. In some embodiments, the security camera 102 could be configured to observe the entrance(s) and/or exit(s) of the building 100 for the purposes of counting the number of pedestrians 106 entering or exiting the building. In this embodiment, for example, vehicles 104 might partially occlude the entrance(s) and/or exit(s) of the building 100.


A System for Entity Visualization and Management


Referring now to FIG. 2, a system 201 is shown, according to an example embodiment. System 201 can be configured to visualize and manage multiple entities. System 201 can be implemented for building 100 (e.g., some or all of the components of the system 201 can be on premises or remote) to automatically monitor and control various building functions and systems. System 201 is shown to include an entity node editor 200, a user device 260, and one or more physical entities 270. The user device 260 may be a computer, a tablet, a phone, a smart-watch, a keypad, or any other device. The entity node editor 200 can be configured to generate a GUI visualization of the one or more physical entities 270 and allow management of said physical entities 270, as described in greater detail with further reference to FIGS. 3-7.


The user device 260 can be configured to display the GUI produced by the entity node editor 200. According to some embodiments, user device 260 may be a personal computer. However, in some embodiments, user device 260 can take other forms (e.g., a tablet, phone, controller, script, program, etc.). Additionally, in some embodiments, user device 260 can be part of entity node editor 200 (e.g., the entity node editor 200 can be implemented on the user device 260). In some embodiments, the user device 260 can be connected to the entity node editor 200. In some embodiments, user device 260 may contain a display 262. The display 262 can be configured to present information to a user in a visual format (e.g., as text, graphics, etc.). The display 262 may use any of a variety of display technologies such as light emitting diode (LED), organic light-emitting diode (OLED), liquid-crystal display (LCD), organic light-emitting transistor (OLET), surface-conduction electron-emitter display (SED), field emission display (FED), digital light processing (DLP), liquid crystal on silicon (LCoS), or any other display technologies known in the art. In some embodiments, the display 262 is configured to present visual media (e.g., text, graphics, etc.) without requiring a backlight (e.g., e-paper, etc.).


In some embodiments, user device 260 may contain input interface 264. The input interface 264 can be a touchscreen or other type of electronic display configured to receive input from a user (e.g., via a touch-sensitive panel). For example, the input interface 264 may include a touch-sensitive panel layered on top of an electronic visual display. A user can provide inputs through simple or multi-touch gestures by touching the input interface 264 with one or more fingers and/or with a stylus or pen. The input interface 264 can use any of a variety of touch-sensing technologies to receive user inputs, such as capacitive sensing (e.g., surface capacitance, projected capacitance, mutual capacitance, self-capacitance, etc.), resistive sensing, surface acoustic wave, infrared grid, infrared acrylic projection, optical imaging, dispersive signal technology, acoustic pulse recognition, or other touch-sensitive technologies known in the art. Many of these technologies allow the input interface 264 to be multi-touch responsive, registering touches at two or more locations at once. In some embodiments, user device 260 may not contain input interface 264 and/or display 262.


The entity node editor 200 is shown to include a processing circuit 210. Processing circuit 210 includes processor 215 and memory 225. Processing circuit 210 can be configured to implement the functionality of the entity node editor 200. The processor 215 can be a general purpose processor, an application specific integrated circuit (ASIC), one or more field programmable gate arrays (FPGAs), a group of processing components, or other suitable electronic processing components. The processing circuit 210 can include the memory 225.


The memory 225 can include one or more devices (e.g., memory units, memory devices, storage devices, etc.) for storing data and/or computer code for completing and/or facilitating the various processes described in the present disclosure. The memory 225 can include random access memory (RAM), read-only memory (ROM), hard drive storage, temporary storage, non-volatile memory, flash memory, optical memory, or any other suitable memory for storing software objects and/or computer instructions. The memory 225 can include database components, object code components, script components, or any other type of information structure for supporting the various activities and information structures described in the present disclosure. The memory 225 can be communicably connected to the processor 215 via the processing circuit 210 and can include computer code for executing (e.g., by the processor 215) one or more processes described herein.


The memory 225 is shown to include an entity node manager 235, an entity database 245, and an entity node presenter 255. The entity database 245 can be configured to store information associated with multiple physical entities and relationships between individual entities. The entity node manager 235 can be configured to edit information associated with individual entities stored in the entity database 245 and relationships between entities stored in the entity database 245. The entity node presenter 255 can be configured to visually display entities and relationships between entities stored in the entity database 245. The entity node presenter 255 can be configured to display the entities as nodes.


The entity database 245 can maintain data relevant to operation of the entity node editor 200. The entity database 245 can include one or more entity data structures, shown as entity datablock(s) 240. In some embodiments, the entity database 245 can be a hardware component (e.g., server, network attached storage, load balancing system, etc.). In some embodiments, the entity database 245 can be a software component (e.g., database management system, application programming interface, content management system, etc.). The entity database 245 can be communicably connected to the entity node manager 235, in some embodiments. In some embodiments, the entity database 245 is part of other components (e.g., the processor 215, the entity node editor 200, etc.), or is a separate component in part or entirely.


The one or more entity datablock(s) 240 can be configured to be digital representations of the one or more physical entities 270. The physical entities 270 can be anything illustrated in FIG. 1 (e.g., building 100, security camera 102, parking lot 110, sidewalk 108, pedestrian 106, etc.). Each individual entity datablock of the one or more entity datablock(s) 240 can correspond to an individual physical entity of the one or more physical entities 270. The one or more entity datablock(s) 240 may contain information relevant to the one or more physical entities 270 (e.g., name, location, related entities, entity category, address, employee contact information, owner, district, group affiliations, appearance, control systems, manager, description, employee list, security contacts, maintenance contacts, etc.). The one or more entity datablock(s) 240 are shown to include an entity identifier, shown as identifier 241, an entity category, shown as category 242, an entity name, shown as name 243, entity information, shown as info 244, and entity relationships 246, in some embodiments. The one or more entity datablock(s) 240 may contain a different number and type of data than those listed herein.
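
A minimal sketch of how the fields of an entity datablock 240 could be organized is shown below, assuming a Python dataclass; the comments map each field to the corresponding reference numeral of FIG. 2, and the class itself is illustrative rather than a structure defined by the disclosure.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class EntityRelationships:                  # entity relationships 246
    parents: List[str] = field(default_factory=list)    # parents 247
    children: List[str] = field(default_factory=list)   # children 248

@dataclass
class EntityDatablock:                      # entity datablock 240
    identifier: str                         # identifier 241
    category: str                           # category 242
    name: str                               # name 243
    info: Dict[str, str] = field(default_factory=dict)  # info 244
    relationships: EntityRelationships = field(
        default_factory=EntityRelationships
    )
```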


The identifier 241 can be configured to uniquely identify an individual entity datablock of the one or more entity datablock(s) 240. The entity node editor 200 can generate a unique individualized identifier by maintaining a list of assigned identifiers and assigning each new entity datablock an identifier value sequentially greater than the largest value contained in the list of assigned identifiers. By way of example, the first entity datablock may be stored with identifier value “10000” and the second entity datablock may be stored with identifier value “10001.” In some embodiments, the entity node editor 200 can generate an identifier for the new entity datablock in any other manner.
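
The incremental scheme described above could be sketched as follows, assuming string-valued identifiers that start at “10000”; the helper name is hypothetical.

```python
def next_identifier(assigned_identifiers):
    # Assign a value sequentially greater than the largest identifier already
    # contained in the list of assigned identifiers.
    if not assigned_identifiers:
        return "10000"
    return str(max(int(value) for value in assigned_identifiers) + 1)

print(next_identifier([]))                  # "10000"
print(next_identifier(["10000", "10001"]))  # "10002"
```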


The category 242 can be configured to descriptively label an individual entity datablock in the one or more entity datablock(s) 240 by the type of individual physical entity it represents (e.g., building, floor, city, room, security camera, access controller, employee, contractor, etc.). By way of example, an entity datablock representing “Building 5—Corporate South” could be given the category “Building.” Category 242 can be manually entered by a user from user device 260, can be assigned by entity node editor 200, or can be generated by any other method.


Name 243 can be configured to descriptively identify an individual entity datablock in the one or more entity datablock(s) 240 by the title of the individual physical entity it represents. The entity node editor 200 can label displayed entities by the associated name contained in the corresponding individual entity datablock. By way of example, a physical entity named “Building 5—Corporate South” could be given name 243 “Building 5—Corporate South” which would be displayed by entity node editor 200 as the name of the node representing the physical entity. Further examples of name 243 used as a node display by entity node editor 200 can be found in FIGS. 3-7.


Info 244 can be configured to include information describing an individual entity datablock in the one or more entity datablock(s) 240. Info 244 can be manually entered by a user from the user device 260, can be assigned by entity node editor 200, or can be generated by another method not here mentioned. Info 244 can include names, locations, related entities, entity categories, addresses, employee contact information, owners, districts, group affiliations, physical appearance, control systems, manager, description, employee list, security contacts, maintenance contacts, or other information not here listed.


Entity relationships 246 can be configured to represent interconnections between disparate entity datablocks of the one or more entity datablock(s) 240. The entity relationships 246 can include parent entities, shown as parents 247, and child entities, shown as children 248. The entity relationships 246 can include a different number and/or type of relationship than those listed. The entity relationships 246 can be manually entered by a user from the user device 260, can be assigned by the entity node editor 200, or can be generated by another method not here mentioned. The entity relationships 246 are described in more detail in relation to FIGS. 3-7.


Parents 247 can be configured to contain a list of one or more identifier(s) 241 of one or more entity datablock(s) 240 at a greater hierarchy level than that of the specific entity datablock containing the parents 247. By way of example, an entity datablock representing a room in a building can include, within its list of parents, the identifier of the entity datablock representing the building that contains the room. The parents 247 can be manually entered by a user from user device 260, can be assigned by entity node editor 200, or can be generated by another method not here mentioned. The parents 247 are described in more detail in relation to FIGS. 3-7.


Children 248 can be configured to contain a list of one or more identifier(s) 241 of one or more entity datablock(s) 240 at a lower hierarchy level than that of the specific entity datablock containing the children 248. By way of example, an entity datablock representing a building containing rooms may include, within its list of children, the identifiers of the entity datablocks representing the rooms contained in the building. The children 248 can be manually entered by a user from user device 260, can be assigned by the entity node editor 200, or can be generated by another method not here mentioned. The children 248 are described in more detail in relation to FIGS. 3-7.
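
A sketch of how a bi-directional parent/child link could be recorded is shown below, reusing the hypothetical EntityDatablock dataclass from the earlier sketch; both datablocks receive the other entity's identifier, which mirrors the pair-of-unique-identifiers arrangement described in the summary. The example entities and names are illustrative.

```python
def link_parent_child(parent_block, child_block):
    # Store the pair of unique identifiers in both entity datablocks so the
    # bi-directional relationship can be traversed from either side.
    if child_block.identifier not in parent_block.relationships.children:
        parent_block.relationships.children.append(child_block.identifier)
    if parent_block.identifier not in child_block.relationships.parents:
        child_block.relationships.parents.append(parent_block.identifier)

building = EntityDatablock("10000", "Building", "Building 5 - Corporate South")
room = EntityDatablock("10001", "Room", "Room 210")
link_parent_child(building, room)   # building gains a child, room gains a parent
```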


The entity node manager 235 can be configured to edit the one or more entity datablock(s) 240. In some embodiments, the entity node manager 235 is part of other components (e.g., the processor 215, the entity node editor 200, etc.), or can be a separate entity entirely. In some embodiments, the entity node manager 235 can be a hardware component (e.g., server, desktop computer, cluster, Azure, AWS, controller, network manager, gateway, bridge, etc.), while in some embodiments, the entity node manager 235 can be a software component (e.g., database handler, object code, script, etc.). The entity node manager 235 can be communicably connected to the one or more physical entities 270 through the entity node editor 200, in some embodiments. In some embodiments, the entity node manager 235 can be communicably connected to the entity database 245 through the processing circuit 210.


The entity node presenter 255 can be configured to visually display, on the user device 260, the one or more entities contained in the entity database 245. In some embodiments, the entity node presenter 255 displays the one or more entities as GUI nodes described in more detail with relation to FIGS. 3-7. In some embodiments, the entity node presenter 255 displays the one or more entities in another layout not here mentioned. In some embodiments, the entity node presenter 255 is part of other components (e.g., the processor 215, the entity node editor 200, etc.), or can be a separate entity entirely. In some embodiments, the entity node presenter 255 can be a hardware component (e.g., server, desktop computer, cluster, Azure, AWS, controller, network manager, gateway, bridge, etc.), while in some embodiments, the entity node presenter 255 can be a software component (e.g., database handler, object code, script, etc.). The entity node presenter 255 can be communicably connected to the user device 260 through the entity node editor 200, in some embodiments. In some embodiments, the entity node presenter 255 can be communicably connected to the entity database 245 through the processing circuit 210.


The one or more physical entities 270 can be tangible objects that are part of a building. The one or more physical entities 270 can be anything illustrated in FIG. 1 (e.g., the building 100, the security camera 102, the parking lot 110, the sidewalk 108, the pedestrian 106, etc.). The one or more physical entities 270 can be a physical hardware element (e.g., ingress card reader, security camera, access controller, retina scanner, fingerprint reader, etc.) and/or a traditionally non-physical element (e.g., biometric analysis, social media analysis, facial recognition, license plate analyzer, etc.). The one or more physical entities 270 can be doors, stairs, windows, rooms, chairs, computers, furniture, floors, phones, cabinets, thermostats, heating units, security cameras, art, trash, trash cans, silverware, faucets, bathrooms, elevators, employees, software, passwords, security codes, routers, servers, switches, carpet, sensors, building controllers, boilers, chillers, roof top units, air handler units, commercial stores, vendors, machines, equipment, vehicles, exhaust ports, roofs, or other elements not here mentioned. The one or more physical entities 270 can be configured to interact with the entity node editor 200 to provide data or perform actions.


In some embodiments, the one or more physical entities 270 can be coupled to the entity node editor 200. In some embodiments, the one or more physical entities 270 can include multiple sub-entities, the first of which is shown as first sub-entity 272. The first sub-entity 272 can be configured to be any of the elements described above in reference to the one or more physical entities 270. The first sub-entity 272 can be an entity at a lower hierarchy level than that of the one or more physical entities 270 which contain the first sub-entity 272. The first sub-entity 272 may contain sub-entities of its own which may contain further levels of sub-entities. By way of example, a physical entity could be a building which contains sub-entities “room 1,” “room 2,” and “room 3,” which each respectively contain sub-entities “chair 1,” “desk 2,” and “computer 3,” and “computer 3” could contain sub-entity “Microsoft Office subscription registration key 4.”


GUI Elements of an Entity Node Editor


Referring now to FIG. 3, an overview of an entity node editor interface 301 is shown, according to an exemplary embodiment. The entity node editor interface 301 can be configured to allow a user to visualize and manage connected entities. The entity node presenter 255 can manage the entity node editor interface 301. The entity node editor interface 301 can be organized into elements within the interface. In some embodiments, the entity node editor interface 301 may contain nodes that represent entities, shown as nodes 305-395. Each node (e.g., node 305, node 315, node 325, etc.) can be associated with a specific entity datablock in the entity database 245 of FIG. 2. Each node (e.g., node 305, node 315, node 325, etc.) may be identified by a name 304 and/or a category 306. The name 304 and/or the category 306 can be respectively associated with the name 243 and the category 242 of FIG. 2. Any node that is not connected to another node (e.g., node 305, etc.) may display a blank connection element 308. A node that is not connected to another node (e.g., node 305, etc.) may be associated with an entity datablock that contains no entity relationships 246. In some embodiments, a node that is connected to other nodes (e.g., node 315, node 325, etc.) displays the number of connections in the connection element 310. The number of connections in the connection element 310 may be computed by the entity node editor 200 by summing the number of identifiers 241 contained in the parents 247 and/or the children 248 of the specific entity datablock 240 for which the number of connections is sought. Lines between nodes, for example the line 312, represent entity relationships associated with the entity relationships 246 of FIG. 2. Lines between nodes may be differentiable (e.g., may display in a different color, line thickness, line style, etc.) based on the type of relationship represented (e.g., parent relationship, child relationship, etc.). In some embodiments, relationships between nodes (e.g., line 312, etc.) represent bi-directional relationships. In some embodiments, a user may select a node containing relationships (e.g., node 315, node 325, etc.) to display an expanded view 314 including the type of relationship. In some embodiments, the relationship types may include “isPartOf,” “hasFloor,” “locatedInCity,” and “RelationshipTypeX.” In some embodiments, the relationship types may be of other types not here listed and may differ depending on the physical entity the entity node editor 200 is representing. Connections between nodes (e.g., line 312, etc.) can be descriptive. By way of example, a node representing a floor entity could connect to a node representing a building entity through an “isPartOf” relationship on the floor side and a corresponding “hasFloor” relationship on the building side. In some embodiments, a user may select to view a sub-group of the entire list of entity nodes related to a node. Connection element 316 can display the total number of connected nodes and hidden connection 318 can display the number of hidden nodes. Thumbnail view 320 can display an overview of the entire entity node editor interface 301.
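
One way the connection count shown in the connection element 310 could be derived from the underlying datablock is sketched below, again reusing the hypothetical EntityDatablock dataclass; the helper names and the hidden-node handling are assumptions for illustration.

```python
def connection_count(datablock):
    # The number shown in the connection element is the count of identifiers
    # stored in parents 247 plus those stored in children 248.
    relationships = datablock.relationships
    return len(relationships.parents) + len(relationships.children)

def visible_and_hidden_counts(datablock, hidden_identifiers):
    # Split the total into visible connections and connections to hidden nodes.
    connected = datablock.relationships.parents + datablock.relationships.children
    hidden = sum(1 for identifier in connected if identifier in hidden_identifiers)
    return len(connected) - hidden, hidden
```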


Referring now to FIG. 4, a schematic drawing 401 of a menu interaction with the entity node editor interface 301 is shown, according to an exemplary embodiment. The schematic drawing 401 of elements of the entity node editor interface 301 includes node 402. The node 402 can correspond to the node 365 of FIG. 3 and/or can represent any other node. Extended controls may be revealed by a user moving a selection pointer over the entity node 402, as shown in view 403. In some embodiments, extended controls are revealed by a user positioning a finger above the entity node 402 as with a touch screen device. Extended controls may be revealed in response to other actions not here mentioned. In view 405, selection of expand icon 406 can display relationship menu 412.


The relationship menu 412 can be displayed in response to other actions not here mentioned. The relationship menu 412 may contain relationship elements “isPartOf,” “hasFloor,” “locatedInCity,” “relationshipType,” and/or other elements not here listed. Selection of an element (e.g., “locatedInCity,” “relationshipType,” etc.) within the relationship menu 412 can display a relationship management tool described in more detail with reference to FIG. 5. In view 407, selection of options icon 408 can display an options menu for the specific entity node. The options menu may contain action options “hide from view,” “edit properties,” “delete,” and/or other action options not here mentioned. In view 409, selection of manage relationships icon 410 can display the relationship management tool described in more detail with reference to FIG. 5.


The node 402 is shown to include an element 414 that indicates the number of existing relationships between the node 402 and other nodes. The element 414 includes the number “5” indicating that five different relationships exist between the node 402 and other entities. The relationship menu 412, when expanded, further describes which of the five relationships exist with other nodes. The relationship menu 412 can indicate what types of relationships, and their number, exist with other nodes. For example, element 416 indicates that one of the five relationships is an “isPartOf” relationship, element 418 indicates that three of the five relationships are “hasFloor” relationships, while element 420 indicates that one of the five relationships is a “locatedInCity” relationship. Element 422 includes no number indication since none of the existing relationships of the node 402 are “relationshipType” relationships.


Referring now to FIG. 5, a schematic drawing of elements of the entity node editor interface 301 illustrating relationship management interactions is shown, according to an exemplary embodiment. Relationship management tool 504 can be displayed in response to selection of connection display element 502. In some embodiments, the relationship management tool 504 can be displayed in response to other actions not here mentioned. The relationship management tool 504 can be configured to modify the relationships contained in the entity relationships 246 of the specific entity datablock corresponding to the node (e.g., node 305, node 315, node 325, etc.) selected (e.g., as shown with connection display element 502, etc.).


In some embodiments, the relationship management tool 504 can be configured to aid in visualizing the relationships of the specific node selected. A user may define the type of relationship between two nodes, as shown in action 506. In some embodiments, selection of the expand icon 406 and a relationship type, shown in view 508 (e.g., “isPartOf,” “hasFloor,” etc.), can display the existing entity connections in a category list format 510. In some embodiments, category list format 510 can include a list of the relationships associated with the entity node ordered by relationship element category. Relationship element categories can be defined by a user, the entity node editor 200, or by another means not here mentioned. Management of entity relationships, as shown in action 506, can update the one or more identifier(s) 241 contained in the one or more entity datablock(s) 240 of the associated entity nodes.
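
A sketch of how a relationship-type selection in the relationship management tool 504 might be applied to the underlying datablocks is shown below, building on the earlier link_parent_child sketch. The mapping of relationship types to the parent or child side is an assumption made for illustration and is not specified by the disclosure.

```python
# Hypothetical mapping from relationship types shown in the tool to the side
# (parent or child) on which the selected node's identifier is stored.
CHILD_SIDE_TYPES = {"isPartOf", "locatedInCity"}
PARENT_SIDE_TYPES = {"hasFloor"}

def apply_relationship(source_block, target_block, relationship_type):
    # Update the identifiers 241 held in the entity datablocks of the two
    # associated entity nodes according to the selected relationship type.
    if relationship_type in CHILD_SIDE_TYPES:
        link_parent_child(parent_block=target_block, child_block=source_block)
    elif relationship_type in PARENT_SIDE_TYPES:
        link_parent_child(parent_block=source_block, child_block=target_block)
    else:
        raise ValueError(f"Unsupported relationship type: {relationship_type}")

# Example: a floor "isPartOf" a building, so the building is stored as the
# floor's parent and the floor as the building's child.
apply_relationship(room, building, "isPartOf")
```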


Referring now to FIG. 6, a flow diagram of a process 601 for relationship deletion interactions with the entity node editor interface 301 is shown, according to an exemplary embodiment. The process 601 may remove the one or more identifier(s) 241 from the one or more entity datablock(s) 240 associated with a node (e.g., node 305, node 315, node 325, etc.). In some embodiments, the steps of the process 601 may cause the entity node presenter 255 to update the entity node editor interface 301 of FIG. 3. The process 601 may occur in response to other actions not here listed. In some embodiments, the entity node manager 235 can be configured to perform the process 601. However, in some embodiments, other components can be configured to perform the process 601. The entity node editor 200 can be configured to perform the process 601. Furthermore, any computing device as described herein can be configured to perform the process 601.


In step 602, the entity node presenter 255 displays a connection between entity nodes as described in greater detail with reference to FIG. 3. In some embodiments, the node 365 contains five relationships displayed as lines between nodes. The connection element 316 can display the total number of connected nodes and the hidden connection 318 can display the number of hidden nodes.


In step 604, the entity node presenter 255 allows the user to select the connection line between entity nodes. The entity node presenter 255 can receive the selection of the connection line from the user device 260. The entity node presenter 255 may display connections between entity nodes in another manner not here mentioned. In some embodiments, the entity node presenter 255 is configured to select relationship connections in another manner not here mentioned.


In step 606, the entity node presenter 255 can change the display state of the connection line between entity nodes in response to being selected by a user. In some embodiments, the connection line changes color (e.g., may display as blue, green, red, yellow, etc.), changes line weight, changes line style, or otherwise adjusts the display of the connection line to signal that the connection line has been selected.


In step 608, the entity node presenter 255 can receive an indication from the user device 260 to delete the relationship selected in the step 604. In some embodiments, the indication is generated by the user device 260 in response to a user interacting with a virtual or physical button. For example, a user may press a delete key to generate the indication to delete the relationship connection selected by the user in the step 604. In some embodiments, a backspace key or another input not here mentioned may serve to delete the relationship connection.


In step 610, the entity node presenter 255 can display an action confirmation dialog element 611. The action confirmation dialog element 611 may provide a user with an indication of the deletion action about to be performed on the relationship selected in the step 604 and request input from the user to confirm that the user wishes to proceed with deletion of that relationship. In some embodiments, the action confirmation dialog element 611 may include a header (“Delete connection?”), body text (“By deleting this connection, you are removing the following relationships: <entity name 1> <relationship element category> <entity name 2>”), and interactive elements “Do not ask again” 617, “CANCEL” 615, and “DELETE” 613.


In some embodiments, the action confirmation dialog element 611 may contain other elements not here mentioned. Selection of the interactive element “Do not ask again” 617 can configure the entity node presenter 255 to skip the step 610 in future relationship deletion interactions. Selection of the interactive element “CANCEL” 615 can cause the entity node editor 200 to stop the relationship deletion interaction 601 and return to a default display. Selection of the interactive element “DELETE” 613 can cause the entity node presenter 255 to remove the corresponding one or more identifier(s) 241 from the associated one or more entity datablock(s) 240. In some embodiments, the relationship deletion interaction 601 may cause the entity node presenter 255 to update the entity node editor interface 301 of FIG. 3.
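
On the data side, confirming the “DELETE” action could translate into removing the paired identifiers from both datablocks, as in the following sketch; it reuses the hypothetical EntityDatablock dataclass and is illustrative only.

```python
def delete_relationship(block_a, block_b):
    # Remove the corresponding identifiers from both entity datablocks so the
    # bi-directional relationship is dropped from each side.
    for id_list in (block_a.relationships.parents, block_a.relationships.children):
        if block_b.identifier in id_list:
            id_list.remove(block_b.identifier)
    for id_list in (block_b.relationships.parents, block_b.relationships.children):
        if block_a.identifier in id_list:
            id_list.remove(block_a.identifier)
```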


In step 612, the entity node presenter 255 can update the entity node editor interface 301 to reflect the deleted relationship connection. The entity node editor interface 301 can remove the connection line between entity nodes. In some embodiments, the node 365 now contains four relationships displayed as lines between nodes. The connection element 316 can display the total number of connected nodes and the hidden connection 318 can display the number of hidden nodes. In some embodiments, an “undo” 619 prompt is displayed by the entity node presenter 255 until the next user action.


Referring now to FIG. 7, a flow diagram of process 701 for entity deletion interactions that are performed with elements of the entity node editor interface 301 is shown, according to an exemplary embodiment. The process 701 may remove the one or more entity datablock(s) 240 associated with a node (e.g., node 305, node 315, node 325, etc.) from the entity database 245. In some embodiments, the entity deletion interaction 701 may cause the entity node presenter 255 to update the entity node editor interface 301 of FIG. 3. The entity deletion interaction 701 may occur in response to other actions not here listed. In some embodiments, the entity node manager 235 can be configured to perform the process 701. However, in some embodiments, other components can be configured to perform the entity deletion interaction 701. The entity node presenter 255 can be configured to perform the entity deletion interaction 701. Furthermore, any computing device as described herein can be configured to perform the process 701.


In step 702, the entity node presenter 255 can cause the user device 260 to display one or more nodes and one or more connections between nodes. Nodes and node connections are described in greater detail with reference to FIG. 3. In some embodiments, the node 365 contains five relationships displayed as lines between nodes. The connection element 316 can display the total number of connected nodes and the hidden connection 318 can display the number of hidden nodes.


In step 704, the entity node presenter 255 can allow the user to select the options icon on the entity node. The entity node presenter 255 can receive an indication of the selection of the entity node 705 from the user device 260. The entity node presenter 255 may be configured to perform the steps 706-712 in response to performing the step 704.


In step 706, the entity node presenter 255 can display an options menu in response to selection of the options icon on the entity node by a user. In some embodiments, the node changes color (e.g., may display as blue, green, red, yellow, etc.), changes display stroke, or otherwise adjusts the display of the entity node selected. In some embodiments, the options menu may contain elements including “Hide from view” 707, “Edit properties” 709, “Delete” 711, or other elements not here listed. Selection of the element “Hide from view” 707 can hide the entity node as part of the hidden elements. Selection of the element “Edit properties” 709 can display the relationship management tool 504 of FIG. 5. Selection of the element “Delete” 711 can delete the entity node. In some embodiments, a backspace key or another input not here mentioned may serve to delete the entity node. In step 708, the entity node presenter 255 can be configured to allow the user to select the “Delete” 711 element of the options menu.


In step 710, the entity node presenter 255 can be configured to display an action confirmation dialog element 713. In some embodiments, the action confirmation dialog element 713 may include a header (“Delete <entity name>?”), body text (“Are you sure you want to delete <entity name>? Please be aware that any connections between this and any other entities will also be deleted.”), and interactive elements “Do not ask again” 715, “CANCEL” 717, and “DELETE” 719. In some embodiments, the action confirmation dialog element 713 may contain other elements not here mentioned. Selection of the interactive element “Do not ask again” 715 can configure the entity node presenter 255 to skip the step 710 in future entity deletion interactions. Selection of the interactive element “CANCEL” 717 can cause the entity node presenter 255 to stop the entity deletion interaction 701 and return to a default display. Selection of the interactive element “DELETE” 719 can cause the entity node presenter 255 to remove the entity datablock corresponding to the selected entity node from the entity database 245 and remove all references to the identifier 241 of the deleted entity datablock 240. In some embodiments, the entity deletion interaction 701 may cause the entity node presenter 255 to update the entity node editor interface 301 of FIG. 3.
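
The cascading cleanup described for the “DELETE” action could look like the sketch below, assuming the datablocks are held in a dictionary keyed by identifier (as in the hypothetical EntityDatabase sketch earlier): the entity datablock is removed and every remaining datablock drops any reference to its identifier.

```python
def delete_entity(datablocks, identifier):
    # Remove the entity datablock from the entity database and strip every
    # remaining reference to its identifier from the other datablocks.
    datablocks.pop(identifier, None)
    for block in datablocks.values():
        for id_list in (block.relationships.parents, block.relationships.children):
            if identifier in id_list:
                id_list.remove(identifier)
```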


In step 712, the entity node presenter 255 can be configured to update the entity node editor interface 301 to reflect the deleted entity node. The entity node editor interface 301 can remove the connection line between entity nodes. In some embodiments, the node 365 now contains four relationships displayed as lines between nodes. The connection element 316 can display the total number of connected nodes and the hidden connection 318 can display the number of hidden nodes. In some embodiments, an “undo” prompt 721 is displayed by the entity node presenter 255 until the next user action.


CONFIGURATION OF EXEMPLARY EMBODIMENTS

The construction and arrangement of the systems and methods as shown in the various exemplary embodiments are illustrative only. Although only a few embodiments have been described in detail in this disclosure, many modifications are possible (e.g., variations in sizes, dimensions, structures, shapes and proportions of the various elements, values of parameters, mounting arrangements, use of materials, colors, orientations, etc.). For example, the position of elements may be reversed or otherwise varied and the nature or number of discrete elements or positions may be altered or varied. Accordingly, all such modifications are intended to be included within the scope of the present disclosure. The order or sequence of any process or method steps may be varied or re-sequenced according to alternative embodiments. Other substitutions, modifications, changes, and omissions may be made in the design, operating conditions and arrangement of the exemplary embodiments without departing from the scope of the present disclosure.


The present disclosure contemplates methods, systems and program products on any machine-readable media for accomplishing various operations. The embodiments of the present disclosure may be implemented using existing computer processors, or by a special purpose computer processor for an appropriate system, incorporated for this or another purpose, or by a hardwired system. Embodiments within the scope of the present disclosure include program products comprising machine-readable media for carrying or having machine-executable instructions or data structures stored thereon. Such machine-readable media can be any available media that can be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such machine-readable media can comprise RAM, ROM, EPROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of machine-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a machine, the machine properly views the connection as a machine-readable medium. Thus, any such connection is properly termed a machine-readable medium. Combinations of the above are also included within the scope of machine-readable media. Machine-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.


Although the figures show a specific order of method steps, the order of the steps may differ from what is depicted. Also, two or more steps may be performed concurrently or with partial concurrence. Such variation will depend on the software and hardware systems chosen and on designer choice. All such variations are within the scope of the disclosure. Likewise, software implementations could be accomplished with standard programming techniques with rule based logic and other logic to accomplish the various connection steps, processing steps, comparison steps and decision steps.

Claims
  • 1. A building system of a building comprising one or more storage devices having instructions stored thereon that, when executed by one or more processors, cause the one or more processors to: cause a storage device to store an entity database comprising a plurality of entities representing at least one of people, spaces, building devices, or points of the building and relationship data representing a plurality of relationships that relate the plurality of entities of the building; cause a user interface of a user device to display a graphical representation of at least a portion of the entity database; cause the user interface of the user device to display, responsive to selection of a new entity, an element indicating an existing entity of the entity database and including a plurality of selectable relationship types for a new relationship between the new entity and the existing entity, the element including relationship identifiers of the plurality of selectable relationship types for the new relationship between the new entity and the existing entity; receive, via the element, a user input selecting a new relationship type to create the new relationship between the new entity and the existing entity of the entity database, wherein the new relationship type is selected from the plurality of selectable relationship types; and update the entity database to create the new relationship between the new entity and the existing entity based on the user input.
  • 2. The building system of claim 1, wherein the instructions cause the one or more processors to: create a particular entity or a particular relationship and add the particular entity or the particular relationship to the entity database responsive to receiving a user edit from the user device; or delete the particular entity or the particular relationship from the entity database responsive to receiving the user edit.
  • 3. The building system of claim 1, wherein the instructions cause the one or more processors to cause the user interface of the user device to display, based on the relationship data, an indication of one or more relationships of the plurality of relationships.
  • 4. The building system of claim 1, wherein a relationship of the plurality of relationships between a first entity of the plurality of entities and a second entity of the plurality of entities is logically defined by the relationship data with one or more words or phrases, the one or more words or phrases comprising a predicate.
  • 5. The building system of claim 1, wherein the instructions cause the one or more processors to: receive, via the user device, a selection of one entity of the plurality of entities; identify the plurality of selectable relationship types of a set of relationship types for the one entity, wherein the plurality of selectable relationship types are relationships available for the one entity with second entities of the plurality of entities; and cause the element of the user device to display the relationship identifiers of the plurality of selectable relationship types.
  • 6. The building system of claim 1, wherein the instructions cause the one or more processors to: receive, via the user device, a selection of one relationship of the plurality of relationships; receive an indication to delete the one relationship; and update the entity database by removing the one relationship from the plurality of relationships in response to receiving the indication to delete the one relationship.
  • 7. The building system of claim 1, wherein the instructions cause the one or more processors to: receive, via the user device, a selection of one entity of the plurality of entities; receive an indication to delete the one entity; and update the entity database by removing the one entity from the plurality of entities in response to receiving the indication to delete the one entity.
  • 8. The building system of claim 1, wherein the instructions cause the one or more processors to receive an input including a particular new relationship between a first existing entity of the plurality of entities and a second existing entity of the plurality of entities; wherein the instructions cause the one or more processors to update the entity database by causing the particular new relationship to be added between the first existing entity of the plurality of entities and the second existing entity of the plurality of entities.
  • 9. The building system of claim 1, wherein the instructions cause the one or more processors to store the new entity and the new relationship between the new entity and the existing entity responsive to receiving the user input.
  • 10. A method of viewing and editing an entity database, the method comprising: causing, by one or more processing circuits, a storage device to store the entity database comprising a plurality of entities representing at least one of people, spaces, building devices, or points of the building and relationship data representing a plurality of relationships that relate the plurality of entities of the building; causing, by the one or more processing circuits, a user interface of a user device to display a graphical representation of at least a portion of the entity database; causing, by the one or more processing circuits, the user interface of the user device to display, responsive to selection of a new entity, an element indicating an existing entity of the entity database and including a plurality of selectable relationship types for a new relationship between the new entity and the existing entity, the element including relationship identifiers of the plurality of selectable relationship types for the new relationship between the new entity and the existing entity; receiving, by the one or more processing circuits, via the element, a user input selecting a new relationship type to create the new relationship between the new entity and the existing entity of the entity database, wherein the new relationship type is selected from the plurality of selectable relationship types; and updating, by the one or more processing circuits, the entity database to create the new relationship between the new entity and the existing entity based on the user input.
  • 11. The method of claim 10, further comprising: causing, by the one or more processing circuits, the user interface of the user device to display, based on the relationship data, an indication of one or more relationships.
  • 12. The method of claim 10, wherein a relationship of the plurality of relationships between a first entity of the plurality of entities and a second entity of the plurality of entities is logically defined by the relationship data with one or more words or phrases, the one or more words or phrases comprising a predicate.
  • 13. The method of claim 10, further comprising: receiving, by the one or more processing circuits, via the user device, a selection of one entity of the plurality of entities; identifying, by the one or more processing circuits, based on the entity database, the plurality of selectable relationship types of a set of relationship types for the one entity, wherein the plurality of selectable relationship types are relationships available for the one entity with second entities of the plurality of entities; and causing, by the one or more processing circuits, the element of the user device to display the relationship identifiers of the plurality of selectable relationship types.
  • 14. The method of claim 10, further comprising: receiving, by the one or more processing circuits, via the user device, a selection of one relationship of the plurality of relationships; receiving, by the one or more processing circuits, an indication to delete the one relationship; and updating, by the one or more processing circuits, the entity database by removing the one relationship from the plurality of relationships in response to receiving the indication to delete the one relationship.
  • 15. The method of claim 10, further comprising: receiving, by the one or more processing circuits, via the user device, a selection of one entity of the plurality of entities; receiving, by the one or more processing circuits, an indication to delete the one entity; and updating, by the one or more processing circuits, the entity database by removing the one entity from the plurality of entities in response to receiving the indication to delete the one entity.
  • 16. The method of claim 10, further comprising receiving, by the one or more processing circuits, an input including a particular new relationship between a first existing entity of the plurality of entities and a second existing entity of the plurality of entities; wherein the method further comprises updating, by the one or more processing circuits, the entity database by causing the particular new relationship to be added between the first existing entity of the plurality of entities and the second existing entity of the plurality of entities.
  • 17. The method of claim 10, further comprising receiving, by the one or more processing circuits, an input including the new relationship, the new entity, and a request to add the new relationship between the new entity and the existing entity of the plurality of entities; wherein the method further comprises storing, by the one or more processing circuits, the new entity and the new relationship between the new entity and the existing entity responsive to receiving the user input.
  • 18. One or more storage devices configured to store instructions thereon that, when executed by one or more processing circuits, cause the one or more processing circuits to: store an entity database comprising a plurality of entities representing at least one of people, spaces, building devices, or points of the building and relationship data representing a plurality of relationships that relate the plurality of entities of the building; cause a user interface of a user device to display a graphical representation of at least a portion of the entity database; cause the user interface of the user device to display, responsive to selection of a new entity, an element indicating an existing entity of the entity database and including a plurality of selectable relationship types for a new relationship between the new entity and the existing entity, the element including relationship identifiers of the plurality of selectable relationship types for the new relationship between the new entity and the existing entity; receive, via the element, a user input selecting a new relationship type to create the new relationship between the new entity and the existing entity of the entity database, wherein the new relationship type is selected from the plurality of selectable relationship types; and update the entity database to create the new relationship between the new entity and the existing entity based on the user input.
  • 19. The one or more storage devices of claim 18, wherein the instructions cause the one or more processing circuits to cause the user interface of the user device to display, based on the relationship data, an indication of one or more relationships of the plurality of relationships.
  • 20. The one or more storage devices of claim 18, wherein a relationship of the plurality of relationships between a first entity of the plurality of entities and a second entity of the plurality of entities is logically defined by the relationship data with one or more words or phrases, the one or more words or phrases comprising a predicate.
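The independent claims above (claims 1, 10, and 18) recite a data model of entities (people, spaces, building devices, or points) related by relationship data, together with an update flow in which selectable relationship types are displayed, a type is selected, and the database is updated with the new relationship. The following is a minimal sketch of that flow for illustration only; it is not taken from the patent, and all names (Entity, Relationship, EntityDatabase, selectable_relationship_types, create_relationship) and the example relationship types are hypothetical choices, since the claims do not prescribe any particular implementation, language, or API.

```python
# Hypothetical sketch of the entity database and relationship-creation flow
# recited in claims 1, 10, and 18. Not the claimed implementation; names and
# relationship types are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Entity:
    entity_id: str
    name: str
    category: str             # e.g. "person", "space", "device", "point"


@dataclass
class Relationship:
    source_id: str
    target_id: str
    relationship_type: str    # predicate, e.g. "isLocatedIn", "controls"


@dataclass
class EntityDatabase:
    entities: Dict[str, Entity] = field(default_factory=dict)
    relationships: List[Relationship] = field(default_factory=list)

    def selectable_relationship_types(self, entity_id: str) -> List[str]:
        """Relationship types available for the given entity; these are the
        identifiers a UI element could display for selection (fixed here
        purely for illustration)."""
        return ["isLocatedIn", "controls", "isFedBy", "hasPoint"]

    def create_relationship(self, new_entity: Entity, existing_id: str,
                            relationship_type: str) -> None:
        """Store the new entity and add a new relationship of the selected
        type between the new entity and an existing entity."""
        if relationship_type not in self.selectable_relationship_types(existing_id):
            raise ValueError(f"unsupported relationship type: {relationship_type}")
        self.entities[new_entity.entity_id] = new_entity
        self.relationships.append(
            Relationship(new_entity.entity_id, existing_id, relationship_type))


# Example: a user selects "isLocatedIn" from the displayed types to relate a
# new thermostat entity to an existing room entity.
db = EntityDatabase()
db.entities["room-101"] = Entity("room-101", "Room 101", "space")
db.create_relationship(Entity("tstat-7", "Thermostat 7", "device"),
                       "room-101", "isLocatedIn")
print([(r.source_id, r.relationship_type, r.target_id) for r in db.relationships])
```

In a real system the set of selectable relationship types would typically depend on the selected entity's category and the database contents rather than being a fixed list, as in the hypothetical selectable_relationship_types method above.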
CROSS-REFERENCE TO RELATED PATENT APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 16/723,087, filed Dec. 20, 2019, which is a continuation of U.S. patent application Ser. No. 16/175,507, filed Oct. 30, 2018, the entire contents of both of which are incorporated herein by reference.

US Referenced Citations (446)
Number Name Date Kind
5065347 Pajak Nov 1991 A
5301109 Landauer et al. Apr 1994 A
5446677 Jensen et al. Aug 1995 A
5546529 Bowers Aug 1996 A
5581478 Cruse et al. Dec 1996 A
5714971 Shalit Feb 1998 A
5812962 Kovac Sep 1998 A
5960381 Singers et al. Sep 1999 A
5973662 Singers et al. Oct 1999 A
6014612 Larson et al. Jan 2000 A
6031547 Kennedy Feb 2000 A
6134511 Subbarao Oct 2000 A
6157943 Meyer Dec 2000 A
6195094 Celebiler Feb 2001 B1
6252597 Lokuge Jun 2001 B1
6281896 Alimpich Aug 2001 B1
6285366 Ng Sep 2001 B1
6285966 Brown et al. Sep 2001 B1
6363422 Hunter et al. Mar 2002 B1
6381611 Roberge Apr 2002 B1
6385510 Hoog et al. May 2002 B1
6389331 Jensen et al. May 2002 B1
6401027 Xu et al. Jun 2002 B1
6437691 Sandelman et al. Aug 2002 B1
6477518 Li et al. Nov 2002 B1
6487457 Hull et al. Nov 2002 B1
6493755 Hansen et al. Dec 2002 B1
6577323 Jamieson et al. Jun 2003 B1
6626366 Kayahara et al. Sep 2003 B2
6646660 Patty Nov 2003 B1
6691282 Rochford Feb 2004 B1
6704016 Oliver et al. Mar 2004 B1
6732540 Sugihara et al. May 2004 B2
6734882 Becker May 2004 B1
6764019 Kayahara et al. Jul 2004 B1
6782385 Natsumeda et al. Aug 2004 B2
6813532 Eryurek et al. Nov 2004 B2
6816811 Seem Nov 2004 B2
6823680 Jayanth Nov 2004 B2
6826454 Sulfstede Nov 2004 B2
6865511 Frerichs et al. Mar 2005 B2
6925338 Eryurek et al. Aug 2005 B2
6986138 Sakaguchi et al. Jan 2006 B1
7031880 Seem et al. Apr 2006 B1
7401057 Eder Jul 2008 B2
7552467 Lindsay Jun 2009 B2
7627544 Chkodrov et al. Dec 2009 B2
7797276 Yang Sep 2010 B1
7818249 Lovejoy et al. Oct 2010 B2
7889051 Billig et al. Feb 2011 B1
7996488 Casabella et al. Aug 2011 B1
8078330 Brickfield et al. Dec 2011 B2
8104044 Scofield et al. Jan 2012 B1
8229470 Ranjan et al. Jul 2012 B1
8401991 Wu et al. Mar 2013 B2
8495745 Schrecker et al. Jul 2013 B1
8516016 Park et al. Aug 2013 B2
8532808 Drees et al. Sep 2013 B2
8532839 Drees et al. Sep 2013 B2
8600556 Nesler et al. Dec 2013 B2
8635182 MacKay Jan 2014 B2
8682921 Park et al. Mar 2014 B2
8731724 Drees et al. May 2014 B2
8737334 Ahn et al. May 2014 B2
8738334 Jiang et al. May 2014 B2
8751487 Byrne et al. Jun 2014 B2
8788097 Drees et al. Jul 2014 B2
8805995 Oliver Aug 2014 B1
8843238 Wenzel et al. Sep 2014 B2
8874071 Sherman et al. Oct 2014 B2
8941465 Pineau et al. Jan 2015 B2
8990127 Taylor Mar 2015 B2
9070113 Shafiee et al. Jun 2015 B2
9116978 Park et al. Aug 2015 B2
9185095 Moritz et al. Nov 2015 B1
9189527 Park et al. Nov 2015 B2
9196009 Drees et al. Nov 2015 B2
9229966 Aymeloglu et al. Jan 2016 B2
9286582 Drees et al. Mar 2016 B2
9311807 Schultz et al. Apr 2016 B2
9344751 Ream et al. May 2016 B1
9354968 Wenzel et al. May 2016 B2
9507686 Horn et al. Nov 2016 B2
9524594 Ouyang et al. Dec 2016 B2
9558196 Johnston et al. Jan 2017 B2
9652813 Gifford et al. May 2017 B2
9753455 Drees Sep 2017 B2
9811249 Chen et al. Nov 2017 B2
9838844 Emeis et al. Dec 2017 B2
9886478 Mukherjee Feb 2018 B2
9948359 Horton Apr 2018 B2
10055114 Shah et al. Aug 2018 B2
10055206 Park et al. Aug 2018 B2
10116461 Fairweather et al. Oct 2018 B2
10169454 Ait-Mokhtar et al. Jan 2019 B2
10171297 Stewart et al. Jan 2019 B2
10171586 Shaashua et al. Jan 2019 B2
10187258 Nagesh et al. Jan 2019 B2
10514963 Shrivastava et al. Dec 2019 B2
10515098 Park et al. Dec 2019 B2
10534326 Sridharan et al. Jan 2020 B2
10536295 Fairweather et al. Jan 2020 B2
10564993 Deutsch et al. Feb 2020 B2
10705492 Harvey Jul 2020 B2
10708078 Harvey Jul 2020 B2
10760815 Janakiraman et al. Sep 2020 B2
10762475 Song et al. Sep 2020 B2
10824120 Ahmed Nov 2020 B2
10845771 Harvey Nov 2020 B2
10854194 Park et al. Dec 2020 B2
10862928 Badawy et al. Dec 2020 B1
10921760 Harvey Feb 2021 B2
10921972 Park et al. Feb 2021 B2
10969133 Harvey Apr 2021 B2
10986121 Stockdale et al. Apr 2021 B2
11016998 Park et al. May 2021 B2
11024292 Park et al. Jun 2021 B2
11038709 Park et al. Jun 2021 B2
11041650 Li et al. Jun 2021 B2
11054796 Holaso Jul 2021 B2
11070390 Park et al. Jul 2021 B2
11073976 Park et al. Jul 2021 B2
11108587 Park et al. Aug 2021 B2
11113295 Park et al. Sep 2021 B2
11229138 Harvey et al. Jan 2022 B1
11314726 Park et al. Apr 2022 B2
11314788 Park et al. Apr 2022 B2
11556105 Cooley et al. Jan 2023 B2
11561522 Cooley et al. Jan 2023 B2
11561523 Cooley et al. Jan 2023 B2
11573551 Cooley et al. Feb 2023 B2
11586167 Cooley et al. Feb 2023 B2
20020010562 Schleiss et al. Jan 2002 A1
20020016639 Smith et al. Feb 2002 A1
20020059229 Natsumeda et al. May 2002 A1
20020123864 Eryurek et al. Sep 2002 A1
20020147506 Eryurek et al. Oct 2002 A1
20020177909 Fu et al. Nov 2002 A1
20030005486 Ridolfo et al. Jan 2003 A1
20030014130 Grumelart Jan 2003 A1
20030073432 Meade, II Apr 2003 A1
20030158704 Triginai et al. Aug 2003 A1
20030171851 Brickfield et al. Sep 2003 A1
20030200059 Ignatowski et al. Oct 2003 A1
20030218641 Longobardi Nov 2003 A1
20040068390 Saunders Apr 2004 A1
20040128314 Katibah et al. Jul 2004 A1
20040133314 Ehlers et al. Jul 2004 A1
20040153446 Castronova Aug 2004 A1
20040199360 Friman et al. Oct 2004 A1
20040233238 Lahdesmaki Nov 2004 A1
20040239683 Chu Dec 2004 A1
20050027408 Donoghue Feb 2005 A1
20050055308 Meyer et al. Mar 2005 A1
20050108262 Fawcett et al. May 2005 A1
20050154494 Ahmed Jul 2005 A1
20050278703 Lo et al. Dec 2005 A1
20050283337 Sayal Dec 2005 A1
20060053382 Gardner Mar 2006 A1
20060095521 Patinkin May 2006 A1
20060140207 Eschbach et al. Jun 2006 A1
20060184479 Levine Aug 2006 A1
20060200476 Gottumukkala et al. Sep 2006 A1
20060265751 Cosquer et al. Nov 2006 A1
20060271589 Horowitz et al. Nov 2006 A1
20070028179 Levin et al. Feb 2007 A1
20070203693 Estes Aug 2007 A1
20070261062 Bansal et al. Nov 2007 A1
20070273497 Kuroda et al. Nov 2007 A1
20070273610 Baillot Nov 2007 A1
20080034425 Overcash et al. Feb 2008 A1
20080094230 Mock et al. Apr 2008 A1
20080097816 Freire et al. Apr 2008 A1
20080186160 Kim et al. Aug 2008 A1
20080249756 Chaisuparasmikul Oct 2008 A1
20080252723 Park Oct 2008 A1
20080281472 Podgorny et al. Nov 2008 A1
20090063545 Hsu et al. Mar 2009 A1
20090195349 Frader-Thompson et al. Aug 2009 A1
20100045439 Tak et al. Feb 2010 A1
20100058248 Park Mar 2010 A1
20100131533 Ortiz May 2010 A1
20100274366 Fata et al. Oct 2010 A1
20100281387 Holland et al. Nov 2010 A1
20100286937 Hedley et al. Nov 2010 A1
20100324962 Nesler et al. Dec 2010 A1
20110015802 Imes Jan 2011 A1
20110047418 Drees et al. Feb 2011 A1
20110061015 Drees et al. Mar 2011 A1
20110071685 Huneycutt et al. Mar 2011 A1
20110077950 Hughston Mar 2011 A1
20110087650 MacKay et al. Apr 2011 A1
20110087988 Ray et al. Apr 2011 A1
20110088000 MacKay Apr 2011 A1
20110093493 Nair et al. Apr 2011 A1
20110125737 Pothering et al. May 2011 A1
20110137853 MacKay Jun 2011 A1
20110153603 Adiba et al. Jun 2011 A1
20110154363 Karmarkar Jun 2011 A1
20110157357 Weisensale et al. Jun 2011 A1
20110178977 Drees Jul 2011 A1
20110191343 Heaton et al. Aug 2011 A1
20110205022 Cavallaro et al. Aug 2011 A1
20110218777 Chen et al. Sep 2011 A1
20120011126 Park et al. Jan 2012 A1
20120011141 Park et al. Jan 2012 A1
20120022698 MacKay Jan 2012 A1
20120062577 Nixon Mar 2012 A1
20120064923 Imes et al. Mar 2012 A1
20120083930 Ilic et al. Apr 2012 A1
20120100825 Sherman et al. Apr 2012 A1
20120101637 Imes et al. Apr 2012 A1
20120135759 Imes et al. May 2012 A1
20120136485 Weber et al. May 2012 A1
20120158633 Eder Jun 2012 A1
20120259583 Noboa et al. Oct 2012 A1
20120272228 Marndi et al. Oct 2012 A1
20120278051 Jiang et al. Nov 2012 A1
20130007063 Kalra et al. Jan 2013 A1
20130038430 Blower et al. Feb 2013 A1
20130038707 Cunningham et al. Feb 2013 A1
20130060820 Bulusu et al. Mar 2013 A1
20130086497 Ambuhl et al. Apr 2013 A1
20130097706 Titonis et al. Apr 2013 A1
20130103221 Raman et al. Apr 2013 A1
20130167035 Imes et al. Jun 2013 A1
20130170710 Kuoch et al. Jul 2013 A1
20130204836 Choi et al. Aug 2013 A1
20130246916 Reimann et al. Sep 2013 A1
20130247205 Schrecker et al. Sep 2013 A1
20130262035 Mills Oct 2013 A1
20130275174 Bennett et al. Oct 2013 A1
20130275908 Reichard Oct 2013 A1
20130297050 Reichard et al. Nov 2013 A1
20130298244 Kumar et al. Nov 2013 A1
20130331995 Rosen Dec 2013 A1
20130338970 Reghetti Dec 2013 A1
20140032506 Hoey et al. Jan 2014 A1
20140059483 Mairs et al. Feb 2014 A1
20140081652 Klindworth Mar 2014 A1
20140129457 Peeler May 2014 A1
20140135952 Maehara May 2014 A1
20140152651 Chen et al. Jun 2014 A1
20140172184 Schmidt et al. Jun 2014 A1
20140189861 Gupta et al. Jul 2014 A1
20140207282 Angle et al. Jul 2014 A1
20140258052 Khuti et al. Sep 2014 A1
20140269614 Maguire et al. Sep 2014 A1
20140277765 Karimi et al. Sep 2014 A1
20140278461 Artz Sep 2014 A1
20140327555 Sager et al. Nov 2014 A1
20150019174 Kiff Jan 2015 A1
20150042240 Aggarwal et al. Feb 2015 A1
20150046180 Futscher De Deus et al. Feb 2015 A1
20150105917 Sasaki et al. Apr 2015 A1
20150145468 Ma et al. May 2015 A1
20150156031 Fadell et al. Jun 2015 A1
20150168931 Jin Jun 2015 A1
20150172300 Cochenour Jun 2015 A1
20150178421 Borrelli et al. Jun 2015 A1
20150185261 Frader-Thompson et al. Jul 2015 A1
20150186777 Lecue et al. Jul 2015 A1
20150202962 Habashima et al. Jul 2015 A1
20150204563 Imes et al. Jul 2015 A1
20150235267 Steube et al. Aug 2015 A1
20150241895 Lu et al. Aug 2015 A1
20150244730 Vu et al. Aug 2015 A1
20150244732 Golshan et al. Aug 2015 A1
20150261863 Dey et al. Sep 2015 A1
20150263900 Polyakov et al. Sep 2015 A1
20150286969 Warner et al. Oct 2015 A1
20150295796 Hsiao et al. Oct 2015 A1
20150304193 Ishii et al. Oct 2015 A1
20150316918 Schleiss et al. Nov 2015 A1
20150324422 Elder Nov 2015 A1
20150341212 Hsiao et al. Nov 2015 A1
20150348417 Ignaczak et al. Dec 2015 A1
20150379080 Jochimski Dec 2015 A1
20160011753 McFarland et al. Jan 2016 A1
20160127712 White et al. Jan 2016 A1
20160033946 Zhu et al. Feb 2016 A1
20160035246 Curtis Feb 2016 A1
20160065601 Gong et al. Mar 2016 A1
20160070736 Swan et al. Mar 2016 A1
20160078229 Gong et al. Mar 2016 A1
20160090839 Stolarczyk Mar 2016 A1
20160119434 Dong et al. Apr 2016 A1
20160139752 Shim et al. May 2016 A1
20160163186 Davidson et al. Jun 2016 A1
20160170390 Xie et al. Jun 2016 A1
20160171862 Das et al. Jun 2016 A1
20160173816 Huenerfauth et al. Jun 2016 A1
20160179315 Sarao et al. Jun 2016 A1
20160179342 Sarao et al. Jun 2016 A1
20160179990 Sarao et al. Jun 2016 A1
20160195856 Spero Jul 2016 A1
20160210268 Sales Jul 2016 A1
20160212165 Singla et al. Jul 2016 A1
20160239660 Azvine et al. Aug 2016 A1
20160239756 Aggour et al. Aug 2016 A1
20160247129 Song et al. Aug 2016 A1
20160260063 Harris et al. Sep 2016 A1
20160313751 Risbeck et al. Oct 2016 A1
20160313752 Przybylski Oct 2016 A1
20160313902 Hill et al. Oct 2016 A1
20160342629 Kirn Nov 2016 A1
20160350364 Anicic et al. Dec 2016 A1
20160357424 Pang Dec 2016 A1
20160357521 Zhang et al. Dec 2016 A1
20160357828 Tobin et al. Dec 2016 A1
20160358432 Branscomb et al. Dec 2016 A1
20160359680 Parandehgheibi Dec 2016 A1
20160363336 Roth et al. Dec 2016 A1
20160370258 Perez Dec 2016 A1
20160378306 Kresl et al. Dec 2016 A1
20160379326 Chan-Gove et al. Dec 2016 A1
20170006135 Siebel Jan 2017 A1
20170011318 Vigano et al. Jan 2017 A1
20170017221 Lamparter et al. Jan 2017 A1
20170039255 Raj et al. Feb 2017 A1
20170052536 Warner et al. Feb 2017 A1
20170053441 Nadumane et al. Feb 2017 A1
20170063894 Muddu et al. Mar 2017 A1
20170068409 Nair Mar 2017 A1
20170070775 Taxier et al. Mar 2017 A1
20170075984 Deshpande et al. Mar 2017 A1
20170084168 Janchookiat Mar 2017 A1
20170090437 Veeramani et al. Mar 2017 A1
20170093700 Gilley et al. Mar 2017 A1
20170098086 Hoernecke et al. Apr 2017 A1
20170103327 Penilla et al. Apr 2017 A1
20170103403 Chu et al. Apr 2017 A1
20170123389 Baez et al. May 2017 A1
20170134415 Muddu et al. May 2017 A1
20170177715 Chang et al. Jun 2017 A1
20170180147 Brandman et al. Jun 2017 A1
20170188216 Koskas et al. Jun 2017 A1
20170206242 Djordjevic Jul 2017 A1
20170212482 Boettcher et al. Jul 2017 A1
20170212668 Shah et al. Jul 2017 A1
20170220641 Chi et al. Aug 2017 A1
20170230930 Frey Aug 2017 A1
20170235466 Tanwir Aug 2017 A1
20170235817 Deodhar et al. Aug 2017 A1
20170251182 Siminoff et al. Aug 2017 A1
20170270124 Nagano et al. Sep 2017 A1
20170277769 Pasupathy et al. Sep 2017 A1
20170278003 Liu Sep 2017 A1
20170294132 Colmenares Oct 2017 A1
20170315522 Kwon et al. Nov 2017 A1
20170315697 Jacobson et al. Nov 2017 A1
20170322534 Sinha et al. Nov 2017 A1
20170323389 Vavrasek Nov 2017 A1
20170329289 Kohn et al. Nov 2017 A1
20170336770 MacMillan Nov 2017 A1
20170345287 Fuller et al. Nov 2017 A1
20170351957 Lecue et al. Dec 2017 A1
20170357225 Asp et al. Dec 2017 A1
20170357490 Park et al. Dec 2017 A1
20170357908 Cabadi et al. Dec 2017 A1
20170364841 Kurjanowicz et al. Dec 2017 A1
20180012159 Kozloski et al. Jan 2018 A1
20180013579 Fairweather et al. Jan 2018 A1
20180024520 Sinha et al. Jan 2018 A1
20180039238 Gärtner et al. Feb 2018 A1
20180048485 Pelton et al. Feb 2018 A1
20180069932 Tiwari et al. Mar 2018 A1
20180114140 Chen et al. Apr 2018 A1
20180137288 Polyakov May 2018 A1
20180157930 Rutschman et al. Jun 2018 A1
20180162400 Abdar Jun 2018 A1
20180176241 Manadhata et al. Jun 2018 A1
20180198627 Mullins Jul 2018 A1
20180203961 Aisu et al. Jul 2018 A1
20180239982 Rutschman et al. Aug 2018 A1
20180262573 Przybylski Sep 2018 A1
20180275625 Park et al. Sep 2018 A1
20180276962 Butler et al. Sep 2018 A1
20180292797 Lamparter et al. Oct 2018 A1
20180336785 Ghannam et al. Nov 2018 A1
20180356775 Harvey Dec 2018 A1
20180359111 Harvey Dec 2018 A1
20180364654 Locke et al. Dec 2018 A1
20190005025 Malabarba Jan 2019 A1
20190013023 Pourmohammad et al. Jan 2019 A1
20190025771 Park et al. Jan 2019 A1
20190037135 Hedge Jan 2019 A1
20190042988 Brown et al. Feb 2019 A1
20190088106 Grundstrom Mar 2019 A1
20190094824 Xie et al. Mar 2019 A1
20190095395 Piecko Mar 2019 A1
20190095519 Park Mar 2019 A1
20190095644 Park et al. Mar 2019 A1
20190096217 Pourmohammad et al. Mar 2019 A1
20190102840 Perl et al. Apr 2019 A1
20190121801 Jethwa et al. Apr 2019 A1
20190138512 Pourmohammad et al. May 2019 A1
20190147883 Mellenthin et al. May 2019 A1
20190158309 Park et al. May 2019 A1
20190163152 Worrall et al. May 2019 A1
20190268178 Fairweather et al. Aug 2019 A1
20190310979 Masuzaki et al. Oct 2019 A1
20190364060 Muddu et al. Nov 2019 A1
20190377306 Harvey Dec 2019 A1
20200125249 Park et al. Apr 2020 A1
20200226156 Borra et al. Jul 2020 A1
20200285203 Thakur et al. Sep 2020 A1
20200336328 Harvey Oct 2020 A1
20200348632 Harvey Nov 2020 A1
20200387576 Brett et al. Dec 2020 A1
20200396208 Brett et al. Dec 2020 A1
20210042299 Migliori Feb 2021 A1
20210043221 Yelchuru et al. Feb 2021 A1
20210325070 Endel et al. Oct 2021 A1
20210342961 Winter et al. Nov 2021 A1
20210381711 Harvey et al. Dec 2021 A1
20210381712 Harvey et al. Dec 2021 A1
20210382445 Harvey et al. Dec 2021 A1
20210383041 Harvey et al. Dec 2021 A1
20210383042 Harvey et al. Dec 2021 A1
20210383200 Harvey et al. Dec 2021 A1
20210383219 Harvey et al. Dec 2021 A1
20210383235 Harvey et al. Dec 2021 A1
20210383236 Harvey et al. Dec 2021 A1
20220066402 Harvey et al. Mar 2022 A1
20220066405 Harvey Mar 2022 A1
20220066432 Harvey et al. Mar 2022 A1
20220066434 Harvey et al. Mar 2022 A1
20220066528 Harvey et al. Mar 2022 A1
20220066722 Harvey et al. Mar 2022 A1
20220066754 Harvey et al. Mar 2022 A1
20220066761 Harvey et al. Mar 2022 A1
20220067226 Harvey et al. Mar 2022 A1
20220067227 Harvey et al. Mar 2022 A1
20220067230 Harvey et al. Mar 2022 A1
20220069863 Harvey et al. Mar 2022 A1
20220070293 Harvey et al. Mar 2022 A1
20220121965 Chatterji et al. Apr 2022 A1
20220138684 Harvey May 2022 A1
20220147000 Cooley et al. May 2022 A1
20220150124 Cooley et al. May 2022 A1
20220215264 Harvey et al. Jul 2022 A1
20230010757 Preciado Jan 2023 A1
20230071312 Preciado et al. Mar 2023 A1
20230076011 Preciado et al. Mar 2023 A1
20230083703 Meiners Mar 2023 A1
Foreign Referenced Citations (49)
Number Date Country
2019226217 Nov 2020 AU
2019226264 Nov 2020 AU
101415011 Apr 2009 CN
102136099 Jul 2011 CN
102136100 Jul 2011 CN
102650876 Aug 2012 CN
104040583 Sep 2014 CN
104603832 May 2015 CN
104919484 Sep 2015 CN
106204392 Dec 2016 CN
106406806 Feb 2017 CN
106960269 Jul 2017 CN
107147639 Sep 2017 CN
107598928 Jan 2018 CN
104572125 Oct 2018 CN
2 528 033 Nov 2012 EP
3 268 821 Jan 2018 EP
3 324 306 May 2018 EP
4 226 263 Aug 2023 EP
H10-049552 Feb 1998 JP
2003-162573 Jun 2003 JP
2007-018322 Jan 2007 JP
4073946 Apr 2008 JP
2008-107930 May 2008 JP
2013-152618 Aug 2013 JP
2014-044457 Mar 2014 JP
20160102923 Aug 2016 KR
WO-2006091521 Aug 2006 WO
WO-2009020158 Feb 2009 WO
WO-2011100255 Aug 2011 WO
WO-2013050333 Apr 2013 WO
WO-2015106702 Jul 2015 WO
WO-2015145648 Oct 2015 WO
WO-2017035536 Mar 2017 WO
WO-2017132718 Aug 2017 WO
WO-2017192422 Nov 2017 WO
WO-2017194244 Nov 2017 WO
WO-2017205330 Nov 2017 WO
WO-2017213918 Dec 2017 WO
WO-2018132112 Jul 2018 WO
WO-2020061621 Apr 2020 WO
WO-2022042925 Mar 2022 WO
WO-2022103812 May 2022 WO
WO-2022103813 May 2022 WO
WO-2022103820 May 2022 WO
WO-2022103822 May 2022 WO
WO-2022103824 May 2022 WO
WO-2022103829 May 2022 WO
WO-2022103831 May 2022 WO
Non-Patent Literature Citations (77)
Entry
Balaji et al., “Brick: Metadata schema for portable smart building applications,” Applied Energy, 2018, 20 pages.
Balaji et al., “Brick: Towards a Unified Metadata Schema for Buildings,” dated Nov. 16-17, 2016, 10 pages.
Balaji et al., Demo Abstract: Portable Queries Using the Brick Schema for Building Applications, dated Nov. 16-17, 2016, 2 pages.
Bhattacharya et al., Short Paper: Analyzing Metadata Schemas for Buildings—The Good, The Bad and The Ugly, ACM, dated Nov. 4-5, 2015, 4 pages.
Blender 2.79 Manual, Introduction to Nodes, https://docs.blender.org/manual/en/dev/render/blender_render/materials/nodes/introduction.html, retrieved Oct. 30, 2018, 4 pages.
Brick: Towards a Unified Metadata Schema For Buildings, dated Nov. 16, 2016, 46 pages.
Building Blocks for Smart Buildings, BrickSchema.org, dated Mar. 2019, 17 pages.
Fierro et al., Beyond a House of Sticks: Formalizing Metadata Tags with Brick, dated Nov. 13-14, 2019, 10 pages.
Fierro et al., Dataset: An Open Dataset and Collection Tool for BMS Point Labels, dated Nov. 10, 2019, 3 pages.
Fierro et al., Design and Analysis of a Query Processor for Brick, dated Jan. 2018, 25 pages.
Fierro et al., Design and Analysis of a Query Processor for Brick, dated Nov. 8-9, 2017, 10 pages.
Fierro et al., Mortar: An Open Testbed for Portable Building Analytics, dated Nov. 7-8, 2018, 10 pages.
Fierro et al., Why Brick is a Game Changer for Smart Buildings, URL: https://brickschema.org/papers/Brick_Memoori_Webinar_Presentation.pdf, Memoori Webinar, 2019, 67 pages.
Fierro, Writing Portable Building Analytics with the Brick Metadata Schema, UC Berkeley ACM E-Energy, 2019, 39 pages.
Gao et al., A large-scale evaluation of automated metadata inference approaches on sensors from air handling units, dated May 1, 2018, pp. 14-30.
International Search Report and Written Opinion for PCT Appl. Ser. No. PCT/US2017/013831 dated Mar. 31, 2017 (14 pages).
International Search Report and Written Opinion for PCT Appl. Ser. No. PCT/US2017/035524 dated Jul. 24, 2017 (14 pages).
International Search Report and Written Opinion on PCT/US2019/036823, dated Aug. 5, 2019, 13 pages.
Koh et al., “Scrabble: Transferrable Semi-Automated Semantic Metadata Normalization using Intermediate Representation,” dated Nov. 7-8, 2018, 10 pages.
Koh et al., Plaster: An Integration, Benchmark, and Development Framework for Metadata Normalization Methods, dated Nov. 7-8, 2018, 10 pages.
Koh et al., Who can Access What, and When?, dated Nov. 13-14, 2019, 4 pages.
Maya, Node Editor, Autodesk Knowledge Network, 2016, https://knowledge.autodesk.com/support/maya/learn-explore/caas/CloudHelp/cloudhelp/2016/ENU/Maya/files/GUID-23277302-6665-465F-8579-9BC734228F69-htm.html, retrieved Oct. 30, 2018, 6 pages.
Metadata Schema for Buildings, URL: https://brickschema.org/docs/Brick-Leaflet.pdf, Retrieved from Internet Dec. 24, 2019, 3 pages.
U.S. Appl. No. 17/566,029, Passivelogic, Inc.
U.S. Appl. No. 17/567,275, Passivelogic, Inc.
U.S. Appl. No. 17/722,115, Passivelogic, Inc.
Balaji et al., “Brick: Metadata schema for portable smart building applications,” Applied Energy, Sep. 15, 2018, 3 pages, (Abstract).
Bhattacharya, A., “Enabling Scalable Smart-Building Analytics,” Electrical Engineering and Computer Sciences, University of California at Berkeley, Technical Report No. UCB/EECS-2016-201, Dec. 15, 2016 (121 pages).
Chinese Office Action on CN Appl. No. 201780003995.9 dated Apr. 8, 2021 (21 pages with English language translation).
Chinese Office Action on CN Appl. No. 201780043400.2 dated Apr. 25, 2021 (15 pages with English language translation).
Curry, E. et al., “Linking building data in the cloud: Integrating cross-domain building data using linked data.” Advanced Engineering Informatics, 2013, 27 (pp. 206-219).
Digital Platform Litigation Documents Part 1, includes cover letter, dismissal of case DDE-1-21-cv-01796, IPR2023-00022 (documents filed Oct. 26, 2013-Oct. 7, 2022), and IPR2023-00085 (documents filed Jan. 26, 2023-Oct. 20, 2022) (748 pages total).
Digital Platform Litigation Documents Part 10, includes DDE-1-21-cv-01796 (documents filed Nov. 1, 2022-Dec. 22, 2021) (1795 pages total).
Digital Platform Litigation Documents Part 2, includes IPR2023-00085 (documents filed Oct. 20, 2022) (172 pages total).
Digital Platform Litigation Documents Part 3, includes IPR2023-00085 (documents filed Oct. 20, 2022) and IPR2023-00170 (documents filed Nov. 28, 2022-Nov. 7, 2022) (397 pages total).
Digital Platform Litigation Documents Part 4, includes IPR2023-00170 (documents filed Nov. 7, 2022) and IPR2023-00217 (documents filed Nov. 18, 2023-Nov. 15, 2022) (434 pages total).
Digital Platform Litigation Documents Part 5, includes IPR2023-00217 (documents filed Nov. 15, 2022) and IPR2023-00257 (documents filed Jan. 25, 2023-Nov. 23, 2022) (316 pages total).
Digital Platform Litigation Documents Part 6, includes IPR2023-00257 (documents filed Nov. 23, 2022) and IPR 2023-00346 (documents filed Jan. 3, 2023-Dec. 13, 2022) (295 pages total).
Digital Platform Litigation Documents Part 7, includes IPR 2023-00346 (documents filed Dec. 13, 2022) and IPR2023-00347 (documents filed Jan. 3, 2023-Dec. 13, 2022) (217 pages total).
Digital Platform Litigation Documents Part 8, includes IPR2023-00347 (documents filed Dec. 13, 2022), EDTX-2-22-cv-00243 (documents filed Sep. 20, 2022-Jun. 29, 2022), and DDE-1-21-cv-01796 (documents filed Feb. 3, 2023-Jan. 10, 2023) (480 pages total).
Digital Platform Litigation Documents Part 9, includes DDE-1-21-cv-01796 (documents filed Jan. 10, 2023-Nov. 1, 2022) (203 pages total).
El Kaed, C. et al., “Building management insights driven by a multi-system semantic representation approach,” 2016 IEEE 3rd World Forum on Internet of Things (WF-IoT), Dec. 12-14, 2016, (pp. 520-525).
Ellis, C. et al., “Creating a room connectivity graph of a building from per-room sensor units.” BuildSys '12, Toronto, ON, Canada, Nov. 6, 2012 (7 pages).
Extended European Search Report on EP Application No. 18196948.6 dated Apr. 10, 2019 (9 pages).
Fierro, G., “Design of an Effective Ontology and Query Processor Enabling Portable Building Applications,” Electrical Engineering and Computer Sciences, University of California at Berkeley, Technical Report No. UCB/EECS-2019-106, Jun. 27, 2019 (118 pages).
File History for U.S. Appl. No. 12/776,159, filed May 7, 2010 (722 pages).
Final Conference Program, ACM BuildSys 2016, Stanford, CA, USA, Nov. 15-17, 2016 (7 pages).
Harvey, T., “Quantum Part 3: The Tools of Autonomy, How PassiveLogic's Quantum Creator and Autonomy Studio software works,” URL: https://www.automatedbuildings.com/news/jan22/articles/passive/211224010000passive.html, Jan. 2022 (7 pages).
Harvey, T., “Quantum: The Digital Twin Standard for Buildings,” URL: https://www.automatedbuildings.com/news/feb21/articles/passivelogic/210127124501passivelogic.html, Feb. 2021 (6 pages).
Hu, S. et al., “Building performance optimisation: A hybrid architecture for the integration of contextual information and time-series data,” Automation in Construction, 2016, 70 (pp. 51-61).
International Search Report and Written Opinion on PCT/US2017/052060, dated Oct. 5, 2017, 11 pages.
International Search Report and Written Opinion on PCT/US2017/052633, dated Oct. 23, 2017, 9 pages.
International Search Report and Written Opinion on PCT/US2017/052829, dated Nov. 27, 2017, 24 pages.
International Search Report and Written Opinion on PCT/US2018/024068, dated Jun. 15, 2018, 22 pages.
International Search Report and Written Opinion on PCT/US2018/052971, dated Mar. 1, 2019, 19 pages.
International Search Report and Written Opinion on PCT/US2018/052974, dated Dec. 19, 2018, 13 pages.
International Search Report and Written Opinion on PCT/US2018/052975, dated Jan. 2, 2019, 13 pages.
International Search Report and Written Opinion on PCT/US2018/052994, dated Jan. 7, 2019, 15 pages.
International Search Report and Written Opinion on PCT/US2019/015481, dated May 17, 2019, 78 pages.
International Search Report and Written Opinion on PCT/US2020/058381, dated Jan. 27, 2021, 30 pages.
Japanese Office Action on JP Appl. No. 2018-534963 dated May 11, 2021 (16 pages with English language translation).
Li et al., “Event Stream Processing with Out-of-Order Data Arrival,” International Conferences on Distributed Computing Systems, 2007, (8 pages).
Nissin Electric Co., Ltd., “Smart power supply system (SPSS),” Outline of the scale verification plan, Nissin Electric Technical Report, Japan, Apr. 23, 2014, vol. 59, No. 1 (23 pages).
Passivelogic, “Explorer: Digital Twin Standard for Autonomous Systems. Made interactive.” URL: https://passivelogic.com/software/quantum-explorer/, retrieved from internet Jan. 4, 2023 (13 pages).
Passivelogic, “Quantum: The Digital Twin Standard for Autonomous Systems, A physics-based ontology for next-generation control and AI.” URL: https://passivelogic.com/software/quantum-standard/, retrieved from internet Jan. 4, 2023 (20 pages).
Quantum Alliance, “Quantum Explorer Walkthrough,” 2022, (7 pages) (screenshots from video).
Results of the Partial International Search for PCT/US2018/052971, dated Jan. 3, 2019, 3 pages.
Sinha, Sudhi and Al Huraimel, Khaled, “Reimagining Businesses with AI” John Wiley & Sons, Inc., Hoboken, NJ, USA, 2021 (156 pages).
Sinha, Sudhi R. and Park, Youngchoon, “Building an Effective IoT Ecosystem for Your Business,” Johnson Controls International, Springer International Publishing, 2017 (286 pages).
Sinha, Sudhi, “Making Big Data Work For Your Business: A guide to effective Big Data analytics,” Impackt Publishing LTD., Birmingham, UK, Oct. 2014 (170 pages).
The Virtual Nuclear Tourist, “Calvert Cliffs Nuclear Power Plant,” URL: http://www.nucleartourist.com/us/calvert.htm, Jan. 11, 2006 (2 pages).
University of California At Berkeley, EECS Department, “Enabling Scalable Smart-Building Analytics,” URL: https://www2.eecs.berkeley.edu/Pubs/TechRpts/2016/EECS-2016-201.html, retrieved from internet Feb. 15, 2022 (7 pages).
Van Hoof, Bert, “Announcing Azure Digital Twins: Create digital replicas of spaces and infrastructure using cloud, AI and IoT,” URL: https://azure.microsoft.com/en-us/blog/announcing-azure-digital-twins-create-digital-replicas-of-spaces-and-infrastructure-using-cloud-ai-and-iot/, Sep. 24, 2018 (11 pages).
W3C, “SPARQL: Query Language for RDF,” (located on The Wayback Machine, URL: https://web.archive.org/web/20161230061728/http://www.w3.org/TR/rdf-sparql-query/), retrieved from internet Nov. 15, 2022 (89 pages).
Wei et al., “Development and Implementation of Software Gateways of Fire Fighting Subsystem Running on EBI,” Control, Automation and Systems Engineering, IITA International Conference on, IEEE, Jul. 2009 (pp. 9-12).
White et al., “Reduce building maintenance costs with AWS IoT TwinMaker Knowledge Graph,” The Internet of Things on AWS—Official Blog, URL: https://aws.amazon.com/blogs/iot/reduce-building-maintenance-costs-with-aws-iot-twinmaker-knowledge-graph/, Nov. 18, 2022 (10 pages).
Zhou, Q. et al., “Knowledge-infused and Consistent Complex Event Processing over Real-time and Persistent Streams,” Future Generation Computer Systems, 2017, 76 (pp. 391-406).
Related Publications (1)
Number Date Country
20220300149 A1 Sep 2022 US
Continuations (2)
Number Date Country
Parent 16723087 Dec 2019 US
Child 17688053 US
Parent 16175507 Oct 2018 US
Child 16723087 US