Conventionally, advanced building automation systems allow lights, appliances or other building automation devices all over a building or other facility to be remotely controlled and monitored from one or more user interfaces or keypads. A typical user interface for these systems is a keypad with status indicators. Pressing a key on the keypad provides a command signal which energizes a light or invokes a lighting scene or other building automation scenario. An indicator associated with the key may also be made to light or otherwise provide indication in acknowledgement of completion of the desired action. Typically, pressing the key a second time extinguishes the lights or terminates the automation scenario, and also extinguishes the indicator as acknowledgement.
In early control systems, the indicator was managed locally. Pressing the key would merely toggle the state of the indicator. The indicator could easily get out of sync with the true state if the system was adjusted from a different keypad or if a communications error caused one of the commands to become lost in transit.
Other contemporary systems either use a separate status reporting or retrieval mechanism, do not report true status at all, or maintain all status information in a central processor for selective retrieval by a system component.
The following implementations and aspects thereof are described and illustrated in conjunction with systems, tools and methods which are meant to be exemplary and illustrative, not limiting in scope. In various implementations, one or more of the above-described issues have been reduced or eliminated, while other implementations are directed to other improvements.
In view of the foregoing, it is a general aspect of the presently described developments to provide for status indication of a device to be controlled in a building automation system by including functionalities and/or devices for recognizing a command signal that has been sent from a controlling device to a device to be controlled, and interpreting the state of the device to be controlled from that command signal.
The foregoing specific aspects and advantages of the present developments are illustrative of those which can be achieved by these developments and are not intended to be exhaustive or limiting of the possible advantages which can be realized. Thus, those and other aspects and advantages of these developments will be apparent from the description herein or can be learned from practicing the disclosure hereof, both as embodied herein or as modified in view of any variations which may be apparent to those skilled in the art. Thus, in addition to the exemplary aspects and embodiments described above, further aspects and embodiments will become apparent by reference to and by study of the following descriptions.
In the herein-described developments, building automation device status may be interpreted or inferred by receiving or listening in on the event messages used to control the system. Deriving status information from a control message removes the additional communications overhead of subsequently polling for status and thereby generating an additional status message communication. In a primary simple example, status can be inferred, passively or actively, in accordance herewith by listening in on the event messages sent out by other devices. For example, following transmission of an “On” command to a specific fixture or other device to be controlled, it can be assumed or interpreted that the fixture is indeed in the “On” state; there being no further need to ask the fixture to confirm whether it is indeed “On”. An indication of this “On” state can then also be generated, a sort of passive status generation.
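The passive inference described above can be sketched as follows. This is an illustrative sketch only, not the disclosed implementation; the class and method names (`StatusListener`, `on_message`, `indicator`) and the device identifier are hypothetical.

```python
# Illustrative sketch: inferring device state by "listening in" on command
# messages, rather than polling the device for its status afterward.
class StatusListener:
    """Maintains an inferred state table built from observed commands."""

    def __init__(self):
        self.inferred = {}  # device id -> last commanded on/off state

    def on_message(self, device_id, command):
        # The command itself implies the resulting state: once an "On"
        # command has been sent, the fixture is assumed to be on; no
        # follow-up status query is needed.
        if command == "On":
            self.inferred[device_id] = True
        elif command == "Off":
            self.inferred[device_id] = False

    def indicator(self, device_id):
        # Passive status generation: drive an indicator from inferred state.
        return self.inferred.get(device_id, False)

listener = StatusListener()
listener.on_message("fixture-31", "On")
print(listener.indicator("fixture-31"))  # True: inferred without polling
```

A real system would feed `on_message` from the shared communications channel, so every device that can overhear a command can keep its own indicators current.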
By contrast, other systems either use a separate status retrieval mechanism, do not report true status or maintain all status information in a central processor for universal retrieval by any system component. The current system, on the other hand, accomplishes status reporting via interpretation of the control communications. The herein described systems or means or mechanisms of obtaining and tracking system state minimize unnecessary communications. The same messages used to control the devices within the automation system can thus be used to update status indicators. This approach results in a simplified implementation and reduced communications overhead compared to a system utilizing separate messaging for state information. Often, the presently-disclosed systems may be involved in or use a communications scheme, e.g., wireless, where messaging bandwidth is at a premium. The developments hereof can thus minimize communication overhead or like burdens on a communications system whether in time or bandwidth.
The present systems are unique in that they offer rich feature sets but do not require a central processor to accomplish their functions. Implementations of similar distributed systems may involve communications via a relatively high-bandwidth CAN bus. For such systems, or for wireless versions of the present technology developments, to work reliably, optimizations of communications protocols have been developed.
A system hereof can include a number of types of building automation devices, from lighting to audio to video to temperature control, inter alia. In a first example, use of intelligent light switches herewith is described. Whether for lighting or otherwise, each of such switches may include a user interface, as for example in the form of a touch sensitive panel, an associated set of indicators such as LED indicators, and in some instances, a transceiver, e.g., an IEEE 802.15.4 (ZigBee) transceiver, a microcontroller and one or more transconductive devices, e.g., a TRIAC. Such an embodiment may include the microcontroller and the transceiver, together with either the transconductive device(s) and/or the user interface. A system generally would include two or more of these devices. Other implementations can include different control/controllable devices and other communications hardware/software such as CAN (Controller Area Network).
End-user software may be employed to define desired operation of each of the user interface inputs, e.g., buttons, and their associated indicators. The result can be a set of scripts, one for each device in the system. The scripts can then be downloaded into the memories of the microcontrollers in the devices of the system. The scripts could then direct the devices as to appropriate responses to button presses and message receipts.
In a first, simple example, see e.g.,
For this scenario, the scripts on devices 10 and 20 would direct each such device 10, 20 to transmit, through a network 90, a command message A when either of buttons 11 or 21 is touched and a message B when either of respective buttons 12 or 22 is depressed. In this example, scripts on device 30 could direct the device 30 to energize the attached light 31 on receipt of message A and to de-energize on receipt of message B.
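The scripted behavior of this first example can be sketched in code. This is a hypothetical model, not the actual downloaded scripts: the broadcast list standing in for network 90, the `press_button` helper, and the `OutputDevice30` class are all assumed names for illustration.

```python
# Sketch of Example 1: input devices 10 and 20 broadcast message "A" or "B";
# output device 30 reacts by energizing or de-energizing attached light 31.
network = []  # stand-in for network 90: a simple broadcast queue

def press_button(device, button):
    # Per the scripts, buttons 11/21 send message A; buttons 12/22 send B.
    msg = "A" if button in (11, 21) else "B"
    network.append(msg)

class OutputDevice30:
    def __init__(self):
        self.light_31_on = False

    def handle(self, msg):
        if msg == "A":
            self.light_31_on = True   # energize attached light 31
        elif msg == "B":
            self.light_31_on = False  # de-energize light 31

out = OutputDevice30()
press_button(10, 11)        # device 10, button 11 -> message A
for m in network:
    out.handle(m)
print(out.light_31_on)  # True
```

Because both input devices emit the same messages, either keypad can drive the light, and either can overhear the other's messages to keep its own indicator in step.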
Devices 10 and 20 may also be responsible for managing state as displayed, for example, on LED or other indicators associated with each of their two buttons (note, the indicators may be within and/or otherwise form a part of the buttons themselves, or may be separate or discrete geographically/locationally as shown in
In a slight variation of this first example, in Example 2, see also
A more specific example, Example 3, below, see also
In this example, there are two input devices 100 and 200 shown (though of course others could be used as well, and see the optional central controller 500 set forth in dashed lines). On/in these input devices 100 and 200 are a number of buttons and indicators, here LEDs, noting that device 200 is a multiple input device. Of these, LED (101) is associated with the “on/off state” of the corresponding “scene” controlled by Button (102); LED (201) is associated with the “on/off state” of the corresponding “scene” controlled by Button (202); LED (203) is associated with the “on/off state” of the corresponding “scene” controlled by Button (204), and, LED (205) is associated with the “on/off state” of the corresponding “scene” controlled by Button (206). In this example, the intended destination state of a scene is the state that is set when the scene is being turned on. Scenes are turned on when their buttons are pushed with their LEDs off, i.e., moving to the on state from the off state, the off state having been indicated by the off LED. Pushing the button when its LED is on turns off the scene, i.e., moving to the off state from the on state, the on state having been indicated by the on LED.
In this example, button (102) can be established to set the following “scene” when it is pushed with its corresponding LED (101) off; namely, to turn on all three of lights 301, 401 and 402 (i.e., to set light (301) to 100%, set light (401) to 100%, and set light (402) to 100%). In this case, button (102) can alternatively turn “off” the same “scene” when it is pushed with its corresponding LED (101) on; namely, to turn off all the lights (301), (401), (402).
Correspondingly, the respective buttons 202, 204 and 206 could be more specifically programmed to control only respective ones of the lights 301, 401 and 402. For example, button (202) could be set to set the following “scene” when it is pushed with its corresponding LED (201) off—Set Light (301) to 100%. Button (202) may then alternatively set the following “scene” when it is pushed with its corresponding LED (201) on—Turn off Light (301). Similarly, Button (204) could be used to set the following “scene” when it is pushed with its corresponding LED (203) off—Set Light (401) to 100%; and, alternatively, Button (204) sets the following “scene” when it is pushed with its corresponding LED (203) on—Turn off Light (401). And, Button (206) sets the following “scene” when it is pushed with its corresponding LED (205) off—Set Light (402) to 100%; and, Button (206) sets the following “scene” when it is pushed with its corresponding LED (205) on—Turn off Light (402).
In such an example, it can be set that when an LED is associated with the “on/off state” of the “scene” controlled by its corresponding button, the LED will be on (i.e., lit up) when the scene is “on” and it will be off when its scene is “off”. Defining what it means for a scene to be “on” and “off” can be interesting for determining when the appropriate LED should be on (i.e. lit up) or off. Note, in this example, lights are shown in the FIG.; however, it may be that a variety of alternative automation devices might be the “controlled” devices operating at the receipt of a command signal.
From this, there may be one or more of several useful definitions for what it means for a scene to be “on” or “off”. For example: (1) a scene is on when all of its controllable devices (e.g., lights) are on at the exact values set by the scene (ALL EXACT); (2) a scene is on when all of its controllable devices are on at any value (ALL ON); (3) a scene is on when at least one of its controllable devices is on (ANY ON). Further examples can be derived herefrom. For the purpose of this EXAMPLE 3, option 1 (ALL EXACT) will be the working definition of what it means for a scene to be on: “A scene is on when all of its controllable devices (e.g., lights) are on at the exact values set by the scene.” Further, LEDs 101, 201, 203, and 205 all use this same (ALL EXACT) definition. Table 1 provides an illustration.
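The three scene-“on” definitions can be stated compactly in code. This is a sketch under assumed conventions: device levels are percentages, a scene is a mapping from device to its intended level, and the function and variable names are hypothetical.

```python
# The three candidate rules for when a "scene" counts as on.
def scene_is_on(levels, scene, rule="ALL EXACT"):
    """levels: current device levels; scene: device -> intended level."""
    if rule == "ALL EXACT":
        # On only when every device sits at the exact value the scene set.
        return all(levels.get(d, 0) == v for d, v in scene.items())
    if rule == "ALL ON":
        # On when every device in the scene is on at any nonzero value.
        return all(levels.get(d, 0) > 0 for d in scene)
    if rule == "ANY ON":
        # On when at least one device in the scene is on.
        return any(levels.get(d, 0) > 0 for d in scene)
    raise ValueError(rule)

# Button 102's scene from Example 3: all three lights to 100%.
scene_102 = {"light301": 100, "light401": 100, "light402": 100}
levels = {"light301": 100, "light401": 100, "light402": 50}
print(scene_is_on(levels, scene_102, "ALL EXACT"))  # False: 402 not at 100
print(scene_is_on(levels, scene_102, "ALL ON"))     # True
```

The example shows why the choice of rule matters: with light 402 dimmed to 50%, the scene is “on” under ALL ON but “off” under the ALL EXACT rule used in EXAMPLE 3.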
In stepping through Table 1, starting with Line 1: an “Initial State” of the system is presented where everything is off. Then, in Line 2: Button 102 is pressed. As described above, Button 102 sets the following “scene” when it is pushed with its corresponding LED 101 off—{Set Light (301) to 100%; Set Light (401) to 100%; Set Light (402) to 100%}. It may then be observed that all lights 301, 401, and 402 have been set to 100%. Now to apply the rules hereof (ALL EXACT) to determine whether a “scene” is “on” or “off” and therefore whether its corresponding LED should be on (i.e., lit up) or off; it is known from the above description relative to
Moving next to a new command/action, as shown in Line 3: Button 102 is pressed, and as described above for “Example 3” and
Continuing in Table 1, in line 4, button 202 is pressed, this, as described above for EXAMPLE 3, corresponds to controlling, on/off, device 301, which is thus turned on in line 4. Similarly in lines 5 and 6 of Table 1, corresponding buttons 204 and 206 are consecutively pressed to achieve the change of status of the corresponding devices 401 and 402 to the on state. Line 6 provides the cumulative states. Then, if button 202 is pressed a second time, as indicated in line 7 of the Table, then, the device 301 would go to the off state. Similarly, in the next successive lines 8 and 9, pressings of buttons 204 and 206 lead to the off states for devices 401 and 402.
An exemplary implementation of the method for intended destination state management of state indicators is here described, see particularly
In a further sense, the method of recognition and interpretation may be part of further overall operations which may also include the herein,
A further set of more detailed methodologies will now be described. In a first sense, a first step may include the creation of Scenes and association thereof with actions (e.g. sensor inputs, keypad buttons, timers, anything that can generate an input message) on input devices. A second step may include the creation of variables to represent the “on/off” state of each controllable device in a scene. Then, as part of this second step, for each input device that has state indicators: further establishing for each state indicator on the input device that represents the “on/off” state of a scene and for each controllable device in the scene, creating a variable to represent the “on/off” state of the controllable device.
A third step may include configuring/programming each device to respond to input messages that affect its status indicators. This may include, for each status indicator on the device, for each controllable device that affects that status indicator, for each scene that has the potential to change the state of that controllable device, and for each input device/button that sets that scene: creating configuration/programming information that will cause the device to set the “on/off” state variable for the controllable device in response to the input message generated by the input device/button that invoked the scene.

A fourth step may include configuring/programming each device to update its status indicators each time it receives an input message. This may include, for each input message received: taking any prescribed action associated with the message (e.g., turning on a light); setting state variables (from the second step) using the configuration/programming information (from the third step); evaluating all state variables, according to whatever rules are being used to determine the “on/off” state of the scene that each status indicator represents, to determine whether all controllable devices are headed for their intended destination state; and, if the scene is determined to be “on”, turning on its status indicator, else turning it off.
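The second through fourth steps above can be sketched together. This is an illustrative model only: the `Keypad` class, its field names, and the device identifiers are assumptions, and the ALL EXACT rule is hard-coded for brevity.

```python
# Sketch of the methodology: a keypad keeps "on/off" state variables for the
# controllable devices in each scene it indicates (step 2), sets them from
# incoming messages (step 3), and re-evaluates its LEDs after every message
# using the ALL EXACT rule (step 4).
class Keypad:
    def __init__(self, scenes):
        self.scenes = scenes            # LED id -> {device: intended level}
        self.state = {}                 # step 2: per-device state variables
        self.leds = {led: False for led in scenes}

    def on_message(self, device, level):
        # Step 3: the message that commands a device also sets its variable.
        self.state[device] = level
        # Step 4: re-evaluate every indicator after each received message.
        for led, scene in self.scenes.items():
            self.leds[led] = all(
                self.state.get(d) == v for d, v in scene.items())

pad = Keypad({"LED101": {"301": 100, "401": 100, "402": 100}})
pad.on_message("301", 100)
pad.on_message("401", 100)
pad.on_message("402", 100)
print(pad.leds["LED101"])  # True: all devices at intended destination state
```

Note that no status query is ever sent: the keypad's LEDs track the scene purely from the command messages it overhears.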
In a further methodological example, a method of intended destination state management of a state indicator during the controlling of a single light, representing its state with an LED, will now be described. As from the description of
In a further variation of this method, e.g., this time taking into account multiple controlled devices and multiple indicators (
As introduced above, the methodology may optionally include a central control device or multiple system devices which could also participate in providing either direct indicator (status) control messages, e.g., messages A or B; or, may provide status indication by interpretation of the command signals sent by other devices.
A number of alternatives may be involved. For a first such example, to ensure that the indicators match the true status of the lighting system in the presence of communications errors, the status sensing mechanism may be augmented by a communications acknowledgement system. Communications acknowledgement may provide the system feedback as to whether a message has been successfully received. Delivery failure can be compensated for by resending the original command or holding indicator and state where it was prior to the failed delivery.
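The acknowledgement alternative can be sketched as a simple resend loop. This is a hypothetical sketch, not the disclosed mechanism: the `send_with_ack` helper, the retry count, and the flaky-channel stand-in are all assumptions for illustration.

```python
# Sketch of the acknowledgement alternative: resend on delivery failure, and
# hold the indicator at its prior state until delivery is confirmed.
def send_with_ack(send, msg, retries=3):
    """Return True once the channel acknowledges delivery, else False."""
    for _ in range(retries):
        if send(msg):           # send() returns True on acknowledgement
            return True
    return False                # caller holds indicator where it was

attempts = []
def flaky_send(msg):
    # Stand-in channel that drops the first transmission, then succeeds.
    attempts.append(msg)
    return len(attempts) >= 2

indicator = False
if send_with_ack(flaky_send, "On"):
    indicator = True            # update only after confirmed delivery
print(indicator, len(attempts))  # True 2
```

Gating the indicator update on the acknowledgement is what keeps the displayed state from drifting ahead of the true system state when a message is lost.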
Some communications channels prefer or only support directed messaging. Messages on these systems are directed to specific destinations and are not readily available to other devices listening in on the communications channel. On systems such as this, events that result in message transmissions (e.g. button touches), can be encoded with a list of devices which need to receive the message so that the command signal is directly sent to the output device enacting the command, as well as to any one or more input/status devices which would have status indications updated due to the change of information. The list of devices would include not only the devices controlling the lights which need to be adjusted in response to the message but any keypad devices which may need to update their state indicators in response to the system state change that these messages invoke. The lists of recipients for each message could be generated a priori as part of the same system configuration that generates the scripts for the devices, or could be updated at later times depending upon which devices are desired to provide the status indications.
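The recipient-list encoding for directed messaging can be sketched as follows. This is illustrative only: the event keys, device names, and `dispatch` function are hypothetical, and the table stands in for the a priori configuration described above.

```python
# Hypothetical encoding of the directed-messaging alternative: each event
# carries a recipient list covering both the output devices that enact the
# command and any keypads whose status indicators must track the change.
recipients = {
    # event -> devices needing the message (built at configuration time,
    # alongside the device scripts)
    ("button", 102): ["device300", "device400", "keypad100", "keypad200"],
}

def dispatch(event, payload, deliver):
    # Send the same command message directly to every listed recipient.
    for dest in recipients.get(event, []):
        deliver(dest, payload)

delivered = []
dispatch(("button", 102), "scene-on", lambda d, p: delivered.append((d, p)))
print(len(delivered))  # 4: outputs and status keypads each get the command
```

On a broadcast channel the keypads would simply overhear the command; here the same single command message is addressed to them explicitly, so no separate status message is ever needed.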
Some further alternatives for implementation of the current system may include keypads which could frequently poll the system; the system could then be adapted to respond with status information which the keypads could use to adjust their indicators to match the reported state of the lighting system. Optionally a central device or multiple system devices could provide direct indicator (status) control messages.
Note, in the primary examples hereof, on/off states were generally described; however, stepwise increases and/or decreases may also be used herewith, as for example in turning up or down the volume of an audio or video player. In such examples, the basic principle still applies: a command signal (a stepwise movement, e.g., volume up or volume down; temperature up or temperature down; or step-wise forwarding or rewinding of video or audio playback, inter alia) is both recognized by a state device (either the self-same device as the commanding device or output device, or a third-party device (e.g., devices 20, 200, 500 in the primary examples) not involved directly in the command initiation or reception) and interpreted for status indication.
In contrast to conventional systems, the presently described approach is unique in the building automation industry. Contemporary systems have either used a separate status reporting and/or retrieval mechanism, have not reported true status, or have maintained all status information in a central processor for universal retrieval by any system component. In any of these cases, a specific status message is additionally generated and sent, whether automatically or upon specific request; such status messages are an additional burden on the communications system, reducing effective bandwidth. Moreover, these conventional status reporting mechanisms are variously unreliable, of limited functionality, reliant upon a central server, or wasteful of communications bandwidth, which can limit the practical system size.
By contrast, the present developments perform differently from conventional systems in that status may generally be considered to be inferred from command messages. The present systems typically update indicator state when commands are issued, or at the least, when a new command is issued and recognized.
While a number of exemplary aspects and embodiments have been discussed above, those of skill in the art will recognize certain modifications, permutations, additions and sub combinations thereof. It is therefore intended that the following appended claims and claims hereafter introduced are interpreted to include all such modifications, permutations, additions and sub-combinations as are within their true spirit and scope.
The present application claims the benefit of and priority from the U.S. Provisional Application No. 61/018,313, filed 31 Dec. 2007, entitled “PASSIVE STATUS GENERATION FOR LIGHTING CONTROL SYSTEMS”; the subject matter of which hereby being specifically incorporated herein by reference for all that it discloses and teaches.