Lighting wall control with virtual assistant

Information

  • Patent Number
    10,595,380
  • Date Filed
    Monday, September 25, 2017
  • Date Issued
    Tuesday, March 17, 2020
Abstract
A lighting wall controller can be a retrofit for existing light switches: it replaces the conventional light switch and can still work as a conventional light switch or other power switch with “dumb” lights or appliances while providing the ability to control “smart” lights and/or other “smart” devices with voice commands. In addition to controlling lights and/or devices, voice commands can be used to request information or actions, which are returned to the user in response. The lighting wall controller thereby provides voice control functionality without requiring additional devices, such as dedicated voice control appliances.
Description
FIELD OF THE DISCLOSURE

The present disclosure relates to wall controls for lighting systems, and in particular to lighting wall controls including extended functionality such as a voice-directed virtual assistant.


BACKGROUND

Networked “smart home” devices continue to grow in popularity, providing increasing levels of functionality and convenience. For example, traditional light bulbs and lighting fixtures are increasingly being replaced with light-emitting diode (LED) based bulbs and fixtures, which may be networked together in order to provide features such as remote control from a smart phone and basic automation. In addition, devices such as door locks, thermostats, connected power outlets, and media remote controls are now being network connected to add features beyond what has previously been possible. Due to the large variety of these devices, there is now an emerging market for home automation “hubs”, which are capable of communicating with a variety of these devices in order to provide a user with a single place to control all of their devices. While many of these home automation “hubs” accomplish this task, they are often discrete devices that must be separately added to a network.

One type of home automation “hub” may provide voice control over one or more “smart home” devices. Referred to herein as a voice control appliance, these devices respond to voice commands by providing audible feedback or changing the settings of one or more “smart home” devices connected thereto. For example, the Amazon Echo is one such device that has gained popularity in recent years. While such devices may provide convenient “voice assistant” functionality, they are generally only capable of listening for voice commands in a relatively small space. That is, an installation may require several of these voice control appliances placed around a space in order to adequately hear voice commands issued by a user throughout the space. Providing voice control appliances in this manner may not only be unsightly, but may be impractical in some scenarios due to the fact that they generally require access to a power outlet, which may not be available. Environmental obstructions may also interfere with the ability of these voice control appliances to recognize voice commands due to the required placement of such a standalone device in a particular location.


Accordingly, there is a need for an improved way to communicate with networked “smart home” devices and distribute the control thereof within a space.


SUMMARY

The present disclosure relates to wall controls for lighting systems, and in particular to lighting wall controls including extended functionality such as a voice-directed virtual assistant. In one embodiment, a lighting wall controller can be a retrofit for existing light switches: it replaces the conventional light switch and can still work as a conventional light switch or other power switch with “dumb” lights or appliances while providing the ability to control “smart” lights and/or other “smart” devices with voice commands. In addition to controlling lights and/or devices, voice commands can be used to request information or actions, which are returned to the user in response. The lighting wall controller thereby provides voice control functionality without requiring additional devices, such as dedicated voice control appliances.


In one embodiment, the lighting wall controller can include processing circuitry, a memory, and a user interface. The memory includes instructions that, when executed by the processing circuitry, cause the lighting wall controller to process a voice command received from a user via the user interface and perform one or more actions in response thereto.


In one embodiment, processing the voice command from the user includes transcribing the voice command and sending the transcribed voice command to a remote device. The remote device then determines one or more actions to be taken based on the transcribed voice command and sends the one or more actions back to the lighting wall controller. In response, the lighting wall controller executes the one or more actions.


In one embodiment, processing the voice command from the user includes sending the voice command or a processed version of the voice command to a remote device, where it is transcribed. The remote device then determines one or more actions to be taken based on the transcribed voice command and sends the one or more actions back to the wall controller. In response, the wall controller executes the one or more actions. For example, the user may want to know the answer to a question. The user can ask the question to the wall controller, which sends the processed version of the voice command (i.e., the question) to the remote device or voice control appliance, and the remote device or voice control appliance retrieves the answer to the question itself or through other devices. The remote device then sends the answer back, which is presented to the user via a speaker, display, or other user interface.
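
The request/response exchange described above can be sketched as follows. This is a minimal illustration, not the patented implementation; the function names, message shapes, and the example answer are all hypothetical.

```python
# Hypothetical sketch of the remote-transcription flow: the wall controller
# forwards the (processed) voice command to a remote device, which
# transcribes it, determines one or more actions, and returns them.

def remote_device_handle(audio_or_text):
    """Stand-in for the remote device / voice control appliance."""
    transcript = audio_or_text if isinstance(audio_or_text, str) else "<transcribed>"
    if transcript.endswith("?"):
        # Informational query: look up an answer and return a "speak" action.
        return [{"action": "speak", "text": "It is 72 degrees outside."}]
    return [{"action": "set_light", "level": 0}]

def wall_controller_process(voice_command):
    """Wall controller: send the command out, then execute returned actions."""
    actions = remote_device_handle(voice_command)
    results = []
    for act in actions:
        if act["action"] == "speak":
            results.append(("speaker", act["text"]))   # play via speaker SPK
        elif act["action"] == "set_light":
            results.append(("light", act["level"]))    # drive the light output
    return results
```

For example, `wall_controller_process("What is the temperature outside?")` would yield a "speaker" result carrying the answer, while a non-question command yields a light-control result.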


In one embodiment, the remote device is a device on the same local area network (LAN) as the lighting wall controller. There may be one or more intermediate devices through which the lighting wall controller communicates with the remote device over the LAN. In another embodiment, the remote device is a device located outside of the LAN of the lighting wall controller, for example, on a wide area network (WAN) to which the lighting wall controller connects via a gateway. In one embodiment, the remote device is a voice control appliance. In another embodiment, the remote device is a server.


In one embodiment, processing the voice command from the user includes locally transcribing the voice command and determining one or more actions to be taken based on the transcribed voice command. In response, the lighting wall controller executes the one or more actions.
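
The fully local variant can be sketched as a simple lookup from transcribed phrases to actions, with no round trip to a remote device. The phrases and action tuples below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of local processing: the wall controller transcribes
# the command itself and maps it against a local action table.

LOCAL_COMMANDS = {
    "turn on the lights":  ("set_level", 100),
    "turn off the lights": ("set_level", 0),
    "dim the lights":      ("set_level", 30),
}

def process_locally(transcript):
    """Return the action for a locally transcribed command, if recognized."""
    action = LOCAL_COMMANDS.get(transcript.lower().strip())
    if action is None:
        return ("unrecognized", None)  # could fall back to a remote device
    return action
```

A design note: keeping the table small is what makes on-device recognition feasible, given the limited processing power the disclosure attributes to such devices; unrecognized phrases can still be deferred to a remote device.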


In some embodiments, the wall controllers form a network, such as a mesh network or partial (i.e., weak) mesh network, and transmit information or commands between each other. For example, a user in the master bedroom could send a command to turn off the lights in the kitchen. Depending on the embodiment, the voice command could go directly to the wall controller in the kitchen, which then turns off the lights, or the voice command could go to a voice control appliance or other device, which sends a command to the wall controller in the kitchen to turn off the lights.
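
The relay behavior described above can be sketched as follows; the room names, the flat dictionary standing in for the mesh, and the return strings are all hypothetical simplifications.

```python
# Hypothetical sketch of relaying a voice command across the wall-control
# mesh: the controller that hears the command executes it locally when it
# targets its own room, and otherwise forwards it to the controller
# associated with the named room.

CONTROLLERS = {"master bedroom": [], "kitchen": []}  # room -> executed actions

def handle_voice_command(heard_at, target_room, action):
    """Execute locally if the command targets this room, else relay it."""
    if target_room == heard_at:
        CONTROLLERS[heard_at].append(action)
        return "executed locally"
    CONTROLLERS[target_room].append(action)  # relayed over the mesh
    return f"relayed to {target_room}"
```

For example, a "lights off" command heard by the master-bedroom controller but targeting the kitchen would be relayed, and the kitchen controller would record the action.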


In one embodiment, the one or more actions include controlling a light output of a light bulb and/or lighting fixture. In another embodiment, the one or more actions include displaying information for a user via the user interface.


Those skilled in the art will appreciate the scope of the present disclosure and realize additional aspects thereof after reading the following detailed description of the preferred embodiments in association with the accompanying drawing figures.





BRIEF DESCRIPTION OF THE DRAWING FIGURES

The accompanying drawing figures incorporated in and forming a part of this specification illustrate several aspects of the disclosure, and together with the description serve to explain the principles of the disclosure.



FIG. 1 is a diagram illustrating a lighting network according to one embodiment of the present disclosure.



FIG. 2 is a functional schematic illustrating a lighting wall controller according to one embodiment of the present disclosure.



FIG. 3 is a functional schematic illustrating a lighting wall controller according to one embodiment of the present disclosure.



FIG. 4 is a functional schematic illustrating a lighting wall controller according to one embodiment of the present disclosure.



FIG. 5 is a diagram of a user interface for a lighting wall controller according to one embodiment of the present disclosure.



FIG. 6 is a diagram of a user interface for a lighting wall controller according to one embodiment of the present disclosure.



FIG. 7 is a diagram of a user interface of a lighting wall controller according to one embodiment of the present disclosure.



FIG. 8 is a diagram of a user interface of a lighting wall controller according to one embodiment of the present disclosure.



FIG. 9 is a call-flow diagram illustrating communication between a lighting wall controller and a remote device according to one embodiment of the present disclosure.



FIG. 10 is a call-flow diagram illustrating communication between a lighting wall controller and a remote device according to one embodiment of the present disclosure.



FIG. 11 is a flow diagram illustrating a method of processing one or more voice commands according to one embodiment of the present disclosure.





DETAILED DESCRIPTION

The embodiments set forth below represent the necessary information to enable those skilled in the art to practice the embodiments and illustrate the best mode of practicing the embodiments. Upon reading the following description in light of the accompanying drawing figures, those skilled in the art will understand the concepts of the disclosure and will recognize applications of these concepts not particularly addressed herein. It should be understood that these concepts and applications fall within the scope of the disclosure and the accompanying claims.


It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the present disclosure. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.


It will be understood that when an element such as a layer, region, or substrate is referred to as being “on” or extending “onto” another element, it can be directly on or extend directly onto the other element or intervening elements may also be present. In contrast, when an element is referred to as being “directly on” or extending “directly onto” another element, there are no intervening elements present. Likewise, it will be understood that when an element such as a layer, region, or substrate is referred to as being “over” or extending “over” another element, it can be directly over or extend directly over the other element or intervening elements may also be present. In contrast, when an element is referred to as being “directly over” or extending “directly over” another element, there are no intervening elements present. It will also be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected” or “directly coupled” to another element, there are no intervening elements present.


Relative terms such as “below” or “above” or “upper” or “lower” or “horizontal” or “vertical” may be used herein to describe a relationship of one element, layer, or region to another element, layer, or region as illustrated in the Figures. It will be understood that these terms and those discussed above are intended to encompass different orientations of the device in addition to the orientation depicted in the Figures.


The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes,” and/or “including” when used herein specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.


Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. It will be further understood that terms used herein should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.


As generally used herein, a “dumb” light or device is one that is simply controlled by adjusting or cutting off the power to the device, e.g., by a conventional light switch or TRIAC dimmer. A “smart” light or device is a device that includes decision-making capability such that it can respond to signals, commands, feedback, and/or information from sensors or other devices to adjust its operation.



FIG. 1 shows a lighting network 10 according to one embodiment of the present disclosure. The lighting network 10 includes a number of light bulbs 12 (which, while not shown as such, may also be lighting fixtures without departing from the principles of the present disclosure), a number of lighting wall controls 14, a router 16, a gateway 18, a voice control appliance 20, a voice control server 22, and a connected device 24, such as a smartphone, tablet, or computer. Each one of these devices is connected to one another, either directly or via an intermediate device. These connections are illustrated in FIG. 1 as lines located between the devices, and may represent wired or wireless connections in any number of different communications technologies and/or protocols.


For example, each one of the lighting wall controls 14 may include multiple communication interfaces as discussed below in order to communicate with the light bulbs 12 using a first communication technology and/or protocol, communicate with the smartphone 24 using a second communication technology and/or protocol, and communicate with the voice control server 22 using a third communication technology and/or protocol.


Together, the light bulbs 12, the lighting wall controls 14, the router 16, the voice control appliance 20, the voice control server 22, and the smartphone 24 may form a local-area network (LAN). Communications between these devices may occur directly or through one or more intermediate devices such as the router 16, which may facilitate communications between all of the devices. The gateway 18 may connect the LAN to a wide-area network (WAN), such as the Internet. In some embodiments, the voice control server 22 may connect to the devices in the lighting network 10 via the LAN. In other embodiments, the voice control server 22 connects to the devices in the lighting network via the WAN.


The light bulbs 12 are configured to receive power, for example, from an alternating current (AC) line source along with one or more control signals and provide a light output based thereon. One or more of the light bulbs 12 may be “dumb” bulbs that are conventionally controlled, for example by an AC input signal AC_IN. These light bulbs 12 generally provide a light output that is proportional to an average amount of energy provided by the AC input signal AC_IN (e.g., via a triode for alternating current (TRIAC) dimmer), and do not include a means for communicating with other devices. Other light bulbs 12 may be “smart” bulbs equipped with electronics to provide decision making capabilities and communications circuitry such that they are capable of receiving data from other devices such as one or more of the lighting wall controls 14 and adjusting the light output thereof based on the commands. In some embodiments, these “smart” light bulbs 12 may also be controlled by conventional means as discussed above.


Each one of the lighting wall controls 14 is configured to receive user input and power, for example, from an AC line source, and control a light output from one or more of the light bulbs 12 in response thereto. The lighting wall controls 14 may do so by providing a user interface, which may be mechanical or software based (e.g., a touchscreen). To control the light output of the light bulbs 12, the lighting wall controls 14 may provide the control signals thereto via a wired communications interface or a wireless communications interface. The wired control signals may be conventional alternating current (AC) dimmer signals (e.g., as provided by a dimmer switch such as a TRIAC dimmer), commands sent via an AC line interface (e.g., by modulating or otherwise transmitting data over the AC line), and/or Ethernet control signals. The wireless control signals may be Bluetooth, Zigbee, Thread, and/or Z-Wave control signals. In short, any type of wired or wireless control signals may be used to control a light output of the light bulbs 12, and the type of control signals used may be dependent on the individual light bulbs 12 themselves as discussed above.


In addition to the above, each one of the lighting wall controls 14 may communicate among themselves in order to synchronize tasks, share sensor data, coordinate listening for or responding to voice commands from a user, or the like. In one embodiment, the lighting wall controls 14 form a mesh network or a light mesh network in order to communicate with one another. Accordingly, the lighting wall controls 14 may relay commands between one another, allowing voice commands or user input provided at one of the lighting wall controls 14 to execute one or more actions on a different lighting wall control 14. For example, a voice command from a user may indicate that the user wishes to dim the lights in a particular location, such as the master bedroom. If the voice command is not received by a lighting wall control 14 located in the master bedroom, the lighting wall control 14 may relay this command to the appropriate lighting wall control 14, thereby allowing for the execution of the command.


To this end, each one of the lighting wall controls 14 may be associated with a particular location in a space. For example, a lighting wall control 14 may be associated with a master bedroom, a kitchen, a conference room, or the like. These locations, which may be provided by a user, determined automatically, or some combination thereof, may allow a user to provide voice commands that are spatially oriented such as the example given above where a user wishes to dim the lights in a master bedroom. Such a voice command will be communicated as necessary to the appropriate lighting wall controller 14 in order to execute the command. Associating the lighting wall controls 14 with locations may be especially important when the light bulbs 12 connected thereto are conventionally controlled, since the lighting wall control 14 is then the exclusive control point for the light output of these conventionally controlled light bulbs 12. When the light bulbs 12 include their own communications circuitry, intervening lighting wall controllers 14 may be bypassed such that the lighting wall controller 14 receiving a voice command may adjust the light output of the light bulbs 12 regardless of whether it is physically attached to them or located in the same room. In such scenarios, the light bulbs 12 themselves may be associated with a particular location in order to effectuate such behavior.


Notably, the lighting wall controllers 14 may control other “smart” devices in addition to the light bulbs 12. For example, the lighting wall controllers 14 may directly or indirectly provide commands to door locks, thermostats, media controllers, connected power outlets, and the like based on voice commands from a user as described in detail below.


In the embodiment shown in FIG. 1, the lighting wall controls 14 act as a gateway for the light bulbs 12, connecting them to the lighting network 10. However, in one embodiment a separate lighting gateway is provided through which the light bulbs 12 and the lighting wall controls 14 connect to other devices in the lighting network 10. In such an embodiment, the lighting wall controls 14 may have a reduced number of communication interfaces in order to simplify the design thereof.


The control signals provided from the lighting wall controls 14 to the light bulbs 12 may control any number of different parameters of the light provided therefrom. For example, the control signals from the lighting wall controls 14 may cause the light bulbs 12 to change an intensity of a light provided therefrom, a color of the light provided therefrom, a color temperature of the light provided therefrom, a color rendering index of the light provided therefrom, or any other desired parameter.


Each of the lighting wall controls 14 may control different groups of light bulbs 12 throughout the lighting network 10. These groups of light bulbs 12 may be controlled via different communication interfaces as shown in FIG. 1. For example, the lighting wall controls 14 may control the light output of a first group of light bulbs 12 via an AC interface, providing AC dimming signals thereto. Accordingly, the lighting wall controls 14 may be connected to the first group of light bulbs 12 via an AC line. Further, the lighting wall controls 14 may control the light output of a second group of light bulbs 12 via a wireless interface such as those discussed above. Accordingly, the lighting wall controls 14 do not have to be connected to the second group of light bulbs 12 directly. The lighting wall controls 14 may operate the first group of light bulbs 12 and the second group of light bulbs 12 in a dependent (i.e., synchronous) or independent manner. That is, the lighting wall controls 14 may ensure that the light output from the first group of light bulbs 12 substantially matches that of the second group of light bulbs 12, or may operate the light bulbs 12 such that the light output from the first group of light bulbs 12 is different from that of the second group of light bulbs 12. In this way, the lighting wall controls 14 may “bridge” the control of multiple groups of light bulbs 12 in the lighting network 10, each of which may be operated via a different communications interface, in order to provide seamless control of light bulbs 12 throughout a space. While the lighting wall controls 14 are shown coupled to separate light bulbs 12 in the lighting network 10, the light bulbs 12 controlled by each one of the lighting wall controls 14 may overlap in some embodiments. As discussed above, lighting wall controls 14 in the lighting network 10 may receive user input or voice commands from users that require execution of actions on other lighting wall controls 14.
This may occur, for example, when changes to a light output of light bulbs 12 or group of light bulbs 12 exclusively controlled by a particular lighting wall control 14 are requested by a user from a different lighting wall controller 14 or another device. This information may be passed to the appropriate lighting wall control 14 as necessary to execute these actions as discussed above.
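
The "bridging" of two bulb groups driven over different interfaces, described above, can be sketched as follows. The class, group names, and level semantics are illustrative assumptions rather than anything specified in the disclosure.

```python
# Hypothetical sketch of bridging two bulb groups: one group driven via the
# AC dimmer output, the other via a radio. In dependent (synchronous) mode
# both groups track the same level; in independent mode each group keeps
# its own level.

class BridgedGroups:
    def __init__(self, synchronous=True):
        self.synchronous = synchronous
        self.levels = {"ac_group": 0, "wireless_group": 0}

    def set_level(self, group, level):
        self.levels[group] = level
        if self.synchronous:
            # Mirror the change so the groups' outputs substantially match.
            for g in self.levels:
                self.levels[g] = level
```

In synchronous mode, setting the AC-connected group to 75% also brings the wirelessly controlled group to 75%; in independent mode the other group is left untouched.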


The lighting wall controls 14 may receive commands from the connected device 24 such as a smartphone via a wired or wireless interface. As discussed above, the connected device 24 may be any suitable device such as a tablet, a smart watch, a dedicated remote control, or the like. In various embodiments, these commands may traverse one or more intermediate devices in the lighting network 10 before reaching one or more of the lighting wall controls 14. In response to these commands, one or more of the lighting wall controls 14 may provide control signals to the light bulbs 12 in order to change a light output thereof.


In addition to the above, the lighting wall controls 14 may receive commands from the voice control appliance 20 via a wired or wireless interface. As discussed above, the voice control appliance 20 is a standalone device for responding to voice commands from a user. Commands may be generated by the voice control appliance 20 in response to voice input from a user. In generating the commands, the voice control appliance 20 may interact with the voice control server 22. The voice control appliance 20 and/or voice control server 22 may be configured to determine actions to take based on the voice commands from the user and relay these commands back to a requesting device. The computational complexity associated with natural language processing may necessitate the use of the voice control server 22 in some situations, since it may not be feasible to perform these computations on other devices in the lighting network 10 that may have limited processing power and/or stringent efficiency requirements.


While the voice control appliance 20 may provide a convenient way to interact with one or more devices, a lighting network 10 may require several of them in order to adequately listen for voice commands within a given space. Since the voice control appliance 20 is a separate device dedicated only to that task, it may be expensive or inconvenient for a user to place a number of these throughout a space to provide the desired level of coverage. Generally, these voice control appliances 20 recognize voice commands from a user in a relatively limited area. Accordingly, a substantial number of these devices must be placed strategically throughout a space in order to provide the desired functionality throughout the space. Further, these voice control appliances often require access to a power outlet, which may be problematic and/or produce unsightly results. The demands of these standalone devices may necessitate sub-optimal placement thereof such that the space in which voice commands are recognized is further reduced. Lighting wall controls 14 such as the one shown in FIG. 1 may be located in every room of a space, and in some cases in more than one place in a room. Further, these lighting wall controls 14 have access to power and are discreet in their appearance when compared to a dedicated device for which a user must find an appropriate spot. Finally, the placement of most lighting wall controllers 14 provides unrestricted access to sound waves in the surrounding area, and thus they will be easily able to detect voice commands from a user. Accordingly, in order to provide voice control throughout the entirety of a space, voice control or “virtual assistant” functionality is provided in the lighting wall controls 14 as discussed below.



FIG. 2 shows details of a lighting wall control 14 according to one embodiment of the present disclosure. The lighting wall control 14 includes processing circuitry 26, a memory 28, a user interface 30, communications circuitry 32, sensor circuitry 34, and power management circuitry 36. The processing circuitry 26 is configured to execute instructions stored in the memory 28 in order to provide the primary intelligence of the lighting wall control 14. In one embodiment, the memory 28 includes a voice processing module 38, which is a set of instructions stored in the memory 28 configured to allow the lighting wall control 14 to process voice commands as discussed below. While not shown, additional modules such as a fault detection module for detecting failures within the lighting wall control 14, a diagnostic module for diagnosing such failures, and a protection module for security or other purposes may be provided as instructions stored in the memory 28 or discrete circuitry in the lighting wall control 14 to increase the robustness of the device.


The user interface 30 allows a user to interact with the lighting wall control 14, and may provide several ways to do so. For example, the user interface 30 may include a switch SW, which may be mechanical or any other type of switch, a capacitive or otherwise touch sensitive interface TCH, a display DSP, or the like. In some embodiments the user interface 30 may include a touchless interface (not shown), such as a three-dimensional gesture sensor, which may be provided using various sensors such as an image sensor. The display may be as simple or complex as desired. For example, the display may be an indicator LED, multiple indicator LEDs, an LED array, a full display such as a liquid crystal display (LCD), or any combination thereof. To provide the voice control capability discussed herein, the user interface 30 may include a microphone MIC and a speaker SPK. The microphone MIC may include multiple microphones, which may be provided in an array in order to more accurately recognize voice commands from a user. Further, the speaker SPK may include multiple speakers in order to provide better sound, or may connect to one or more remote speakers in order to provide audible feedback to a user.


The communications circuitry 32 may include multiple communications interfaces 40, each of which may utilize a different communications technology and/or protocol to communicate with other devices in the lighting network 10. For example, a first communication interface 40A may be a WiFi communications interface, a second communication interface 40B may be a Bluetooth communications interface, and an nth communication interface 40N may be an IEEE 802.15 communications interface. In short, the communications circuitry 32 may include any number of different communications interfaces 40 in order to communicate with a variety of devices in the lighting network 10. As discussed above, in some embodiments the lighting wall control 14 may include a limited number of communications interfaces 40, and may communicate with other devices in the lighting network 10 via a separate lighting gateway.


The sensor circuitry 34 may include any number of sensors to allow the lighting wall control 14 to receive input from the surrounding environment. For example, the sensor circuitry 34 may include an ambient light sensor ALS, an occupancy sensor OCC, and an image sensor IMG. The ambient light sensor ALS may provide a measurement of the ambient light in the surrounding environment to the lighting wall control 14, which it may use to control a light output from one or more of the light bulbs 12. The occupancy sensor OCC may indicate whether or not the environment surrounding the lighting wall control 14 is occupied by a person, which may be used by the lighting wall control 14 to turn on and off the light bulbs 12. The image sensor IMG may be used to detect ambient light, occupancy, motion, and other light characteristics of the light bulbs 12. Any of these measurements may be used to adjust a light output of the light bulbs 12 in a desired fashion. Further, any number of additional sensors may be added to the sensor circuitry 34 (e.g., temperature sensors, barometric pressure sensors, accelerometers, or the like) in order to allow the lighting wall control 14 to collect additional information about the surrounding environment.
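
A minimal sketch of sensor-driven control along these lines is shown below, combining the ambient light sensor (ALS) and occupancy sensor (OCC) readings into a dim level. The lux thresholds and levels are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical sketch of using ALS and OCC readings to pick a dim level:
# an empty room gets no light, and a brighter room needs less output.

def choose_light_level(ambient_lux, occupied):
    """Return a dim level (0-100) from sensor readings."""
    if not occupied:
        return 0          # room empty: lights off
    if ambient_lux >= 500:
        return 20         # bright daylight: minimal output
    if ambient_lux >= 200:
        return 60         # moderate ambient light: partial output
    return 100            # dark room: full output
```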


The power management circuitry 36 may be configured to receive an AC input signal AC_IN, for example, an AC line voltage, and provide an AC output signal AC_OUT to one or more of the light bulbs 12. In doing so, the lighting wall control 14 may dim or otherwise alter the light output of the light bulbs 12. In one embodiment, the power management circuitry 36 includes an AC dimmer (not shown). In other embodiments, the power management circuitry 36 includes power converter circuitry such as AC to direct current (DC) converter circuitry, power factor correction circuitry, rectifier circuitry, or the like (not shown). In some embodiments, the power management circuitry 36 may be configured to be wired in a three-way, four-way, or multiple-way AC circuit. The power management circuitry 36 may cooperate with the processing circuitry 26 in order to properly respond to AC signals received from other switches in the multiple-way configuration and to properly provide AC signals to other switches in the multiple-way configuration in order for all of the switches in the circuit to properly function. Where multiple switches in the circuit are lighting wall controls 14 including intelligence such as that discussed herein, the lighting wall controls 14 may effectuate the multiple-way behavior by communicating in a wired or wireless manner. Where some of the switches in the circuit are “dumb” switches, the lighting wall control 14 may manipulate an AC output thereof in order to effectuate the multiple-way behavior. The lighting wall control 14 may require pass-through or constant AC power to provide all of the functionality thereof, and such considerations must therefore be taken into account when including the lighting wall control 14 in a multiple-way circuit.
In addition to receiving AC input signals AC_IN, the power management circuitry 36 may also be configured to receive DC input signals, condition or otherwise alter these signals as desired, and provide one or more output signals to the light bulbs 12 to control the light output thereof. In some embodiments, the power management circuitry 36 may include a battery to provide power in the event of a power outage, or to ensure storage of settings or otherwise operate one or more aspects of the lighting wall control 14 when line power is not available.
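The multiple-way switching behavior discussed above can be modeled simply: in a conventional three-way or four-way circuit, the load is energized when the switch positions XOR together, so flipping any one switch toggles the light. The following sketch is a logical model only, not circuit-level behavior.

```python
# Illustrative model of multiple-way switch logic: the light is on when
# the XOR of all switch positions is 1. This holds for three-way circuits
# (two switches) and for circuits with additional four-way switches, and
# is the behavior a smart wall control must reproduce when wired with
# "dumb" switches.
from functools import reduce
from operator import xor

def light_is_on(switch_positions):
    # switch_positions: a 0/1 position for each switch in the circuit
    return bool(reduce(xor, switch_positions))

print(light_is_on([0, 0]))     # False
print(light_is_on([1, 0]))     # True -- flipping either switch toggles
print(light_is_on([1, 0, 1]))  # False -- three switches (with a four-way)
```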



FIG. 3 shows a lighting wall control 14 according to an additional embodiment of the present disclosure. The lighting wall control 14 shown in FIG. 3 is substantially similar to that shown in FIG. 2, but further includes dedicated voice processing circuitry 42 therein. The dedicated voice processing circuitry 42 may be optimized for recognizing human speech. In one embodiment, the dedicated voice processing circuitry 42 is configured to transcribe spoken words into text, data, or any appropriate form, which may then be parsed to determine one or more actions to be taken based thereon. Further, the dedicated voice processing circuitry 42 may be optimized to listen for a “trigger phrase”, which may indicate that a person is providing a voice command to the lighting wall control 14. Listening for a trigger phrase may prevent the lighting wall control 14 from recording all spoken words in the surrounding environment in order to increase the privacy of users. To reduce the power consumption of the voice processing circuitry 42 and therefore optimize efficiency, it may be provided as a specialized application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), or the like. Providing dedicated voice processing circuitry 42 in the lighting wall control 14 may free up valuable processing power in the processing circuitry 26 for performing other tasks.
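The trigger-phrase gating described above can be sketched as follows. The trigger phrase and function names are hypothetical; the point is only that speech not preceded by the trigger is never recorded or forwarded.

```python
# Sketch of trigger-phrase gating for privacy (hypothetical names): only
# utterances that begin with the trigger phrase yield a command; all
# other speech is ignored rather than stored.
TRIGGER = "hey lights"  # assumed trigger phrase, for illustration

def extract_commands(utterances):
    """Return the command portion of each utterance that begins with the
    trigger phrase; everything else is discarded."""
    commands = []
    for utterance in utterances:
        text = utterance.lower().strip()
        if text.startswith(TRIGGER):
            commands.append(text[len(TRIGGER):].strip())
    return commands

print(extract_commands([
    "just chatting about dinner",
    "Hey lights turn on the lamp",
]))  # ['turn on the lamp']
```

In hardware, this gating would run continuously on the low-power dedicated voice processing circuitry 42, waking the main processing circuitry 26 only when the trigger is heard.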



FIG. 4 shows a lighting wall controller 14 according to an additional embodiment of the present disclosure. For reference, a number of light bulbs 12 are also shown. The lighting wall controller 14 includes processing circuitry, which is configured to execute instructions stored in a memory to provide the central intelligence of the lighting wall controller 14. A power supply module, along with a battery connected thereto, receives an AC input signal AC_IN and provides power to the processing circuitry, which may be distributed to other portions of the device. Fault and protection circuitry increases the robustness of the lighting wall controller by detecting faults and responding thereto. Wireless communications circuitry allows the lighting wall controller to communicate with other devices in the lighting network 10. Gate drive circuitry controls a power device, which is in line with the AC line voltage provided to the light bulbs 12 in order to control the light output thereof in a conventional manner as discussed above. The power device may be a transistor device or any other suitable device for controlling the amount of energy delivered to the light bulbs 12. An indicator such as an LED or an LCD is provided, and may be used to provide feedback to a user as discussed above. A speaker and associated circuitry may similarly be used to provide audible feedback to a user. A voice recognition module along with a microphone attached thereto allows the lighting wall control 14 to receive and respond to voice commands. A mechanical switch allows a user to cut power to the light bulbs 12 when desired. A number of sensors including an occupancy sensor, an ambient light sensor, an image sensor, a touch sensor (which may be a capacitive touch sensor), and a three-dimensional gesture sensor allow the lighting wall control 14 to receive input from the surrounding environment. 
The processing circuitry is coupled to each one of the fault and protection circuitry, the wireless communications circuitry, the gate drive circuitry, the indicator, the speaker, the voice recognition module, the mechanical switch, and the sensors. Accordingly, the processing circuitry may receive input from these portions of the device or provide commands thereto to direct the activity of the lighting wall control 14. In various embodiments, the processing circuitry may be a microcontroller unit or the like.



FIG. 5 shows a user interface 30 for a lighting wall control 14 according to one embodiment of the present disclosure. As shown, the user interface 30 includes a touch panel 44, which may be mechanical, capacitive, or otherwise touch sensitive. Further, while referred to as a “touch” panel, the touch panel 44 may respond to non-touch gestures such as those performed by a user in the space surrounding the lighting wall control 14. The touch panel 44 may control the intensity of light provided by light bulbs 12 controlled by the lighting wall controller based on input from a user. A faceplate 46 is provided around the touch panel 44. The faceplate 46 may include a first opening 48, a second opening 50, and a third opening 52. The first opening 48 may provide the microphone MIC access to the surrounding environment so that voice commands from a user may be detected. The second opening 50 may provide optical access to the surrounding environment for one or more of the ambient light sensor ALS, the occupancy sensor OCC, and the image sensor IMG. Additional openings may be provided in embodiments in which more than one of these sensors is provided. The third opening 52 may provide the speaker SPK access to the surrounding environment so that audible feedback and other sounds may be provided from the lighting wall control 14.



FIG. 6 shows a user interface 30 for a lighting wall control 14 according to an additional embodiment of the present disclosure. The user interface 30 is substantially similar to that shown in FIG. 5, except that the touch panel 44 shown in FIG. 5 is replaced with a touchscreen 54. The touchscreen 54 may display information about the light bulbs 12 controlled by the lighting wall control 14 as well as any other devices in the lighting network 10. For example, the touchscreen 54 may display the current occupancy status and the current brightness setting of the light bulbs 12 as shown in FIG. 6. Controls that are often used may be displayed in a prominent manner to allow a user to easily and intuitively control a light output of the light bulbs 12 connected to the lighting wall control 14. An indicator may be provided to show that the lighting wall control 14 is currently ready for voice commands from a user.



FIG. 7 shows the user interface 30 illustrated in FIG. 6 after a voice command from a user has been detected. The touchscreen 54 may indicate the command that was detected and indicate an action that is currently being executed in response thereto. In some embodiments, a progress indicator may be provided. Further, feedback may be solicited to refine the accuracy of voice recognition of the lighting wall control 14. For example, a prompt on the screen may ask whether the detected voice command was accurately transcribed, and whether the resulting action was the intended consequence of the detected voice command. After receiving this feedback, the lighting wall control 14 and/or a backend device used for responding to the voice commands may alter the transcription and/or response to the voice commands in order to better respond to voice commands over time.



FIG. 8 shows the user interface 30 illustrated in FIG. 6 after a different type of voice command from a user has been detected. While voice commands may be used to instruct the lighting wall control 14 to provide control signals to one or more other devices, they may also be used to request information from the lighting wall control 14, which must then be displayed or otherwise communicated to the user. For example, a user may ask for the weather forecast, which may then be displayed as shown in FIG. 8. Other types of information may be requested and displayed as well. In various embodiments, audible feedback may be provided to the user in addition to displaying the information on the touchscreen 54 or other user interface. Such audible feedback may include computer generated speech responding to the request from the user.



FIG. 9 is a call flow diagram illustrating communications between a lighting wall control 14 and a remote device 56 in order to execute one or more actions based on voice commands from a user according to one embodiment of the present disclosure. First, a voice command is received from a user (100). To receive a voice command, the lighting wall control 14 may constantly listen for voice commands and/or trigger phrases via the microphone MIC as discussed above. The voice command may then be transcribed into text, data representative of the voice command, or any appropriate form (102). The voice transcription may be accomplished via the processing circuitry 26 or the dedicated voice processing circuitry 42. The transcribed voice command is then sent to a remote device 56 (104), which may be the voice control appliance 20, the voice control server 22, or any other device, through one or more intermediate devices (e.g., the router 16, the gateway 18, or any other device). The remote device 56 determines any necessary actions to be taken based on the transcribed voice command (106). For example, the remote device 56 may use natural language processing along with machine learning algorithms to determine the intent of the voice command and how to respond. These actions are then sent back to the lighting wall control 14 (108), where they are executed thereby (110).


As discussed above, the actions may include changing a light output of one or more of the light bulbs 12, displaying information, controlling one or more other devices in the lighting network 10, or any other task. For example, a user may request the lighting wall control 14 to “Turn on the lights,” to “Set the brightness of the lights in conference room 1 to 80%,” or to “Turn on the projector.” The lighting wall control 14, along with the remote device 56, will then determine the necessary actions to be taken based on these requests.
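The FIG. 9 call flow can be sketched end to end as follows. The transcription and the command-to-action mapping are stubbed stand-ins (the patent leaves natural language processing to the remote device 56), and all names are hypothetical.

```python
# Sketch of the FIG. 9 flow: local transcription (102), remote action
# determination (104)-(108), local execution (110). Steps in parentheses
# correspond to the numbered steps in the description.
class RemoteDevice:
    def determine_actions(self, transcribed_command):
        # Stand-in for NLP / machine learning on the remote device 56.
        if "turn on the lights" in transcribed_command:
            return [("set_brightness", 100)]
        return []

class LightingWallControl:
    def __init__(self, remote):
        self.remote = remote
        self.brightness = 0

    def transcribe(self, audio):
        # Stand-in for transcription by processing circuitry 26 or the
        # dedicated voice processing circuitry 42.
        return audio.lower()

    def handle_voice_command(self, audio):
        text = self.transcribe(audio)                   # (102)
        actions = self.remote.determine_actions(text)   # (104)-(108)
        for name, value in actions:                     # (110)
            if name == "set_brightness":
                self.brightness = value
        return self.brightness

control = LightingWallControl(RemoteDevice())
print(control.handle_voice_command("Turn ON the lights"))  # 100
```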



FIG. 10 is a call flow diagram illustrating communications between a lighting wall control 14 and a remote device 56 in order to execute one or more actions based on voice commands from a user according to an additional embodiment of the present disclosure. First, a voice command is received from a user (200). The voice command is then sent to a remote device 56 (202). As discussed above, the remote device 56 may be the voice control appliance 20, the voice control server 22, or any other device, and communication with the remote device 56 may occur through one or more intermediate devices. Sending the voice command to the remote device 56 may include performing analog-to-digital conversion of the voice command from the user and sending a digital version thereof to the remote device 56. In some embodiments, compression may be applied to the digital version of the voice command in order to reduce the required bandwidth of communication between the lighting wall control 14 and the remote device 56. The remote device 56 may then transcribe the voice command (204) using dedicated hardware or software as discussed above, and may determine any necessary actions to be taken based on the transcribed voice command (206) as discussed above. These actions are then sent back to the lighting wall control 14 (208), where they are executed thereby (210).
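The digitize-and-compress step of the FIG. 10 variant can be illustrated as below. Here `zlib` stands in for whatever audio codec an implementation would actually use, and the 8-bit quantization is an assumption made for brevity.

```python
# Sketch of preparing a voice command for upload in the FIG. 10 flow:
# analog-to-digital conversion, then compression to reduce the bandwidth
# needed between the wall control 14 and the remote device 56.
import zlib

def digitize(samples):
    # Stand-in for A/D conversion: floats in [-1.0, 1.0] -> unsigned bytes.
    return bytes(int((s + 1.0) * 127.5) for s in samples)

def prepare_upload(samples):
    raw = digitize(samples)
    compressed = zlib.compress(raw)  # codec stand-in; lossless here
    return raw, compressed

# One second of "silence" compresses dramatically.
raw, compressed = prepare_upload([0.0] * 1000)
print(len(raw), len(compressed) < len(raw))  # 1000 True
```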



FIG. 11 is a flow diagram illustrating a method for responding to voice commands from the lighting wall control 14 according to one embodiment of the present disclosure. First, a voice command is received (300). The voice command is then transcribed into text, data representative of the voice command, or any appropriate form (302) for further processing. Necessary actions based on the transcribed voice command are then determined (304), and executed (306) by the lighting wall control 14. Notably, the transcription and determination of actions based on the transcribed voice command are performed locally on the lighting wall control 14 in the embodiment shown in FIG. 11. This may be enabled by the dedicated voice processing circuitry 42 discussed above.
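The fully local FIG. 11 flow can be sketched as follows, with all steps on the wall control itself. The command phrases, parsing, and actions are hypothetical illustrations, not the patent's method.

```python
# Sketch of the FIG. 11 flow: transcription (302), action determination
# (304), and execution (306) all performed locally, as enabled by the
# dedicated voice processing circuitry 42.
class LocalWallControl:
    def __init__(self):
        self.brightness = 50

    def transcribe(self, audio):            # (302) local transcription
        return audio.lower().strip()

    def determine_actions(self, text):      # (304) local intent parsing
        if text == "turn off the lights":
            return [("set_brightness", 0)]
        if text.startswith("set brightness to "):
            return [("set_brightness", int(text.rsplit(" ", 1)[-1]))]
        return []

    def execute(self, actions):             # (306) act on the result
        for name, value in actions:
            if name == "set_brightness":
                self.brightness = value

    def handle(self, audio):                # (300) end-to-end handling
        self.execute(self.determine_actions(self.transcribe(audio)))
        return self.brightness

control = LocalWallControl()
print(control.handle("Set brightness to 80"))  # 80
```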


The above approaches in FIGS. 9-11 illustrate different ways that the lighting wall control 14 could cooperate with a remote device such as the voice control appliance 20 and a voice control server 22 in order to respond to voice commands from a user. In particular, they illustrate different ways to distribute the transcription and processing of voice commands from a user to accomplish a desired task based thereon. The above approaches illustrate that the voice processing module 38 and/or the voice processing circuitry 42 in the lighting wall control 14 may perform several different levels of voice processing depending on the embodiment. For example, the voice processing performed by the voice processing module 38 and/or the voice processing circuitry 42 may be a simple analog-to-digital conversion, or may involve more intensive processing such as voice-to-text transcription. In certain applications, the voice processing module 38 and/or the voice processing circuitry 42 may work alongside the processing circuitry 26 in order to perform even more intensive processing such as natural language processing and the like in order to determine a desired action to be performed based on the voice commands. In short, the term “voice processing” used throughout the present application may indicate many different levels of intensity of processing of the voice commands.


Notably, the above are only exemplary approaches to such a problem. There are any number of ways in which a lighting wall controller could parse and respond to voice commands from a user, all of which are contemplated herein. Regardless of the details of how it is accomplished, providing hardware and accompanying software for detecting voice commands in a lighting wall control 14 allows voice command (i.e., “virtual assistant”) functionality to be distributed throughout a space without the need for a multitude of dedicated hardware that may be expensive or unsightly. That is, due to the fact that lighting wall controls 14 are already integrated into a power infrastructure and distributed spatially throughout a home, these lighting wall controls 14 offer significant benefits for providing an interface for voice control over dedicated hardware.


Those skilled in the art will recognize improvements and modifications to the preferred embodiments of the present disclosure. For example, this disclosure has focused on a lighting wall controller, but depending on the embodiment, the wall controller according to principles of the present disclosure need not control lights (or at least not in the conventional fashion) even though it replaces a conventional light switch or is mounted where a conventional light switch would typically be located. Additionally, the wall controllers can network with each other in various network structures, including with other devices, lights and/or sensors. All such improvements and modifications are considered within the scope of the concepts disclosed herein and the claims that follow.

Claims
  • 1. A lighting wall control comprising: a user interface; a voice processing module configured to process voice commands from a user; power management circuitry configured to receive an AC input signal and provide an AC output signal suitable for powering one or more lights and controlling a light output thereof; a communication interface configured to communicate with one or more additional lights; and processing circuitry coupled to the user interface, the voice processing module, the power management circuitry, and the communication interface, the processing circuitry configured to: adjust a light output of the one or more lights via the power management circuitry based on user input from the user interface and the voice commands processed by the voice processing module; and adjust a light output of the one or more additional lights based on the user input from the user interface and the voice commands from the voice processing module.
  • 2. The lighting wall control of claim 1 wherein the communication interface is a wireless communication interface.
  • 3. The lighting wall control of claim 1 wherein the processing circuitry is configured to control the light output of the one or more lights and the light output of the one or more additional lights in a synchronous manner.
  • 4. The lighting wall control of claim 1 wherein the processing circuitry is configured to control the light output of the one or more lights independently.
  • 5. The lighting wall control of claim 1 wherein the processing circuitry is configured to: receive commands from one or more other lighting wall controls via the communication interface, wherein the commands are generated from the one or more other lighting wall controls based on one of user input from a user interface and voice commands from a voice processing module; and adjust the light output of the one or more lights based on the commands.
  • 6. The lighting wall control of claim 1 wherein the lighting wall control is configured to: receive the voice commands via a microphone; transcribe the voice commands; and determine one or more actions to be performed based on the transcribed voice commands.
  • 7. The lighting wall control of claim 6 wherein the voice processing module is configured to process the voice commands by transcribing the voice commands.
  • 8. The lighting wall control of claim 7 wherein the processing circuitry is configured to determine the one or more actions to be performed based on the transcribed voice commands.
  • 9. The lighting wall control of claim 1 wherein the processing circuitry is further configured to: transmit the voice commands to a remote server via the communication interface; and receive one or more actions to be performed based on the voice commands from the remote server via the communication interface.
  • 10. The lighting wall control of claim 9 wherein the voice processing module is configured to process the voice commands by performing an analog-to-digital conversion on the voice commands such that the voice commands transmitted to the remote server are transmitted in a digital format.
  • 11. The lighting wall control of claim 9 wherein the processing circuitry is further configured to perform the one or more actions.
  • 12. The lighting wall control of claim 1 wherein: the voice processing module is configured to process the voice commands by transcribing the voice commands; the processing circuitry is configured to transmit the transcribed voice commands to a remote server via the communication interface; and the processing circuitry is configured to receive one or more actions to be performed based on the transcribed voice commands from the server via the communication interface.
  • 13. The lighting wall control of claim 1 wherein the processing circuitry is configured to display visual information via the user interface in response to the voice commands.
  • 14. The lighting wall control of claim 1 wherein the processing circuitry is further configured to request information from a remote server via the communication interface in response to the voice commands.
  • 15. The lighting wall control of claim 1 wherein the processing circuitry is further configured to send a command to one or more other lighting wall controls via the communication interface based on the voice commands.
  • 16. The lighting wall control of claim 1 further comprising a first communication interface and a second communication interface, wherein the processing circuitry is coupled to the first communication interface and the second communication interface and configured to: transmit the voice commands to a remote server via the first communication interface; receive one or more actions to be performed based on the voice commands from the remote server via the first communication interface; and adjust the light output of the one or more lights via the second communication interface.
  • 17. The lighting wall control of claim 16 wherein the voice processing module is configured to process the voice commands by performing an analog-to-digital conversion on the voice commands such that the voice commands transmitted to the remote server are transmitted in a digital format.
  • 18. The lighting wall control of claim 16 wherein the first communication interface is configured to communicate with the remote server via a wide area network (WAN).
  • 19. A lighting wall control comprising: a user interface; a voice processing module configured to process voice commands from a user; power management circuitry configured to receive an AC input signal and provide an AC output signal suitable for powering one or more lights and controlling a light output thereof; a communication interface configured to communicate with one or more additional lights; and processing circuitry coupled to the user interface, the voice processing module, the power management circuitry, and the communication interface, the processing circuitry configured to: adjust a light output of the one or more lights via the power management circuitry based on user input from the user interface and the voice commands processed by the voice processing module; receive commands from one or more other lighting wall controls via the communication interface, wherein the commands are generated from the one or more other lighting wall controls based on one of user input from a user interface and voice commands from a voice processing module; and adjust the light output of the one or more lights based on the commands from the one or more other lighting wall controls.
  • 20. A lighting wall control comprising: a user interface; a voice processing module configured to process voice commands from a user; a communication interface; and processing circuitry coupled to the user interface, the voice processing module, and the communication interface, the processing circuitry configured to: adjust a light output of one or more lights based on user input from the user interface and the voice commands processed by the voice processing module; transmit the voice commands to a remote server via the communication interface; and receive one or more actions to be performed based on the voice commands from the remote server via the communication interface.
  • 21. The lighting wall control of claim 20 wherein the processing circuitry is further configured to perform the one or more actions.
  • 22. A lighting wall control comprising: a user interface; a voice processing module configured to process voice commands from a user; a communication interface; and processing circuitry coupled to the user interface, the voice processing module, and the communication interface, the processing circuitry configured to: adjust a light output of one or more lights based on user input from the user interface and the voice commands processed by the voice processing module; and request information from a remote server via the communication interface in response to the voice commands.
  • 23. A lighting wall control comprising: a user interface; a voice processing module configured to process voice commands from a user; a communication interface; and processing circuitry coupled to the user interface, the voice processing module, and the communication interface, the processing circuitry configured to: adjust a light output of one or more lights based on user input from the user interface and the voice commands processed by the voice processing module; and send a command to one or more other lighting wall controls via the communication interface based on the voice commands.
  • 24. A lighting wall control comprising: a user interface; a voice processing module configured to process voice commands from a user; a first communication interface; a second communication interface; and processing circuitry coupled to the user interface, the voice processing module, and the first and second communication interfaces, the processing circuitry configured to: adjust a light output of one or more lights based on user input from the user interface and the voice commands processed by the voice processing module; transmit the voice commands to a remote server via the first communication interface; receive one or more actions to be performed based on the voice commands from the remote server via the first communication interface; and adjust the light output of the one or more lights via the second communication interface.
RELATED APPLICATIONS

This application claims the benefit of provisional patent application Ser. No. 62/400,525, filed Sep. 27, 2016, the disclosure of which is hereby incorporated herein by reference in its entirety.

20120135692 Feri et al. May 2012 A1
20120136485 Weber et al. May 2012 A1
20120139426 Ilyes et al. Jun 2012 A1
20120147604 Farmer Jun 2012 A1
20120147808 Rhee Jun 2012 A1
20120153840 Dahlen et al. Jun 2012 A1
20120161643 Henig et al. Jun 2012 A1
20120176041 Birru Jul 2012 A1
20120206050 Spero Aug 2012 A1
20120223657 Van de Ven Sep 2012 A1
20120224457 Kim et al. Sep 2012 A1
20120229048 Archer Sep 2012 A1
20120230696 Pederson et al. Sep 2012 A1
20120235579 Chemel et al. Sep 2012 A1
20120235600 Simonian et al. Sep 2012 A1
20120242242 Linz et al. Sep 2012 A1
20120242254 Kim et al. Sep 2012 A1
20120271477 Okubo et al. Oct 2012 A1
20120280638 Pereira et al. Nov 2012 A1
20120299485 Mohan et al. Nov 2012 A1
20120306375 van de Ven Dec 2012 A1
20120306377 Igaki et al. Dec 2012 A1
20120320262 Chung Dec 2012 A1
20120327650 Lay et al. Dec 2012 A1
20130002157 van de Ven et al. Jan 2013 A1
20130002167 Van de Ven Jan 2013 A1
20130013091 Cavalcanti et al. Jan 2013 A1
20130026953 Woytowitz Jan 2013 A1
20130033872 Randolph et al. Feb 2013 A1
20130049606 Ferstl et al. Feb 2013 A1
20130051806 Quilici et al. Feb 2013 A1
20130057395 Ohashi Mar 2013 A1
20130058258 Boesjes Mar 2013 A1
20130063042 Bora et al. Mar 2013 A1
20130063047 Veskovic Mar 2013 A1
20130069539 So Mar 2013 A1
20130075484 Rhee et al. Mar 2013 A1
20130077299 Hussell et al. Mar 2013 A1
20130088168 Mohan et al. Apr 2013 A1
20130093328 Ivey et al. Apr 2013 A1
20130147366 Huizenga et al. Jun 2013 A1
20130154831 Gray et al. Jun 2013 A1
20130155392 Barrilleaux et al. Jun 2013 A1
20130155672 Vo et al. Jun 2013 A1
20130200805 Scapa et al. Aug 2013 A1
20130221857 Bowers Aug 2013 A1
20130229784 Lessard et al. Sep 2013 A1
20130257292 Verfuerth et al. Oct 2013 A1
20130257315 Restrepo Oct 2013 A1
20130320862 Campbell et al. Dec 2013 A1
20130328486 Jones Dec 2013 A1
20130342911 Bartol et al. Dec 2013 A1
20140001952 Harris et al. Jan 2014 A1
20140001959 Motley et al. Jan 2014 A1
20140001962 Harris Jan 2014 A1
20140001963 Chobot et al. Jan 2014 A1
20140001972 Harris et al. Jan 2014 A1
20140001977 Zacharchuk et al. Jan 2014 A1
20140062678 de Clercq et al. Mar 2014 A1
20140070710 Harris Mar 2014 A1
20140167621 Trott et al. Jun 2014 A1
20140167646 Zukauskas et al. Jun 2014 A1
20140212090 Wilcox et al. Jul 2014 A1
20140232299 Wang Aug 2014 A1
20140268790 Chobot et al. Sep 2014 A1
20140312777 Shearer et al. Oct 2014 A1
20140347885 Wilcox et al. Nov 2014 A1
20140355302 Wilcox et al. Dec 2014 A1
20150008827 Carrigan et al. Jan 2015 A1
20150008828 Carrigan et al. Jan 2015 A1
20150008829 Lurie et al. Jan 2015 A1
20150008831 Carrigan et al. Jan 2015 A1
20150015145 Carrigan et al. Jan 2015 A1
20150022096 Deixler Jan 2015 A1
20150042243 Picard Feb 2015 A1
20150048758 Carrigan et al. Feb 2015 A1
20150160673 Vasylyev Jun 2015 A1
20150189724 Karc Jul 2015 A1
20150195883 Harris et al. Jul 2015 A1
20150253488 Wilcox et al. Sep 2015 A1
20150264780 Harris et al. Sep 2015 A1
20150345762 Creasman et al. Dec 2015 A1
20150351169 Pope et al. Dec 2015 A1
20150351187 McBryde et al. Dec 2015 A1
20150351191 Pope et al. Dec 2015 A1
20160029464 Hughes et al. Jan 2016 A1
20180109107 Mosebrook Apr 2018 A1
Foreign Referenced Citations (66)
Number Date Country
492840 Jan 2011 AT
3666702 May 2002 AU
2002219810 May 2002 AU
2002352922 Jun 2003 AU
2426769 May 2002 CA
2511368 May 2002 CA
101461151 Jun 2009 CN
102119507 Jul 2011 CN
60143707 D1 Feb 2011 DE
1330699 Jul 2003 EP
1334608 Aug 2003 EP
1461907 Sep 2004 EP
1719363 Nov 2006 EP
1886415 Feb 2008 EP
2304311 Apr 2011 EP
2327184 Jun 2011 EP
2440017 Apr 2012 EP
1114508 Oct 2008 HK
4576/KOLNP/2007 Jul 2008 IN
H11345690 Dec 1999 JP
2001155870 Jun 2001 JP
2003178889 Jun 2003 JP
2005510956 Apr 2005 JP
3860116 Dec 2006 JP
3896573 Mar 2007 JP
2010050069 Mar 2010 JP
2010198877 Sep 2010 JP
2011526414 Oct 2011 JP
2012226993 Nov 2012 JP
20060050614 May 2006 KR
20080025095 Mar 2008 KR
20110001782 Jan 2011 KR
20110095510 Aug 2011 KR
0126068 Apr 2001 WO
0126327 Apr 2001 WO
0126328 Apr 2001 WO
0126329 Apr 2001 WO
0126331 Apr 2001 WO
0126332 Apr 2001 WO
0126333 Apr 2001 WO
0126334 Apr 2001 WO
0126335 Apr 2001 WO
0126338 Apr 2001 WO
0239242 May 2002 WO
0241604 May 2002 WO
03047175 Jun 2003 WO
2004109966 Dec 2004 WO
2006095316 Sep 2006 WO
2006130662 Dec 2006 WO
2007102097 Sep 2007 WO
2009011898 Jan 2009 WO
2009076492 Jun 2009 WO
2009145747 Dec 2009 WO
2009151416 Dec 2009 WO
2009158514 Dec 2009 WO
2010010493 Jan 2010 WO
2010047971 Apr 2010 WO
2010122457 Oct 2010 WO
2011070058 Jun 2011 WO
2011087681 Jul 2011 WO
2011090938 Jul 2011 WO
2011152968 Dec 2011 WO
2012112813 Aug 2012 WO
2012125502 Sep 2012 WO
2013050970 Apr 2013 WO
2014120971 Aug 2014 WO
Non-Patent Literature Citations (41)
Entry
Author Unknown, “Cluster Analysis”, Wikipedia—the free encyclopedia, Updated May 21, 2013, Retrieved on May 30, 2013, http://en.wikipedia.org/wiki/cluster_analysis, 16 pages.
Author Unknown, “IEEE Standard for Information Technology—Telecommunications and Information Exchange Between Systems—Local and Metropolitan Area Networks—Specific Requirements—Part 3: Carrier Sense Multiple Access with Collision Detection (CSMA/CD) Access Method and Physical Layer Specifications—Amendment: Data Terminal Equipment (DTE) Power via Media Dependent Interface (MDI),” Standard 802.3af-2003, Jun. 18, 2003, The Institute of Electrical and Electronics Engineers, Inc., 133 pages.
Author Unknown, “IEEE Standard for Information Technology—Telecommunications and Information Exchange Between Systems—Local and Metropolitan Area Networks—Specific Requirements—Part 3: Carrier Sense Multiple Access with Collision Detection (CSMA/CD) Access Method and Physical Layer Specifications—Amendment 3: Data Terminal Equipment (DTE) Power via the Media Dependent Interface (MDI) Enhancements,” Standard 802.3at-2009, Sep. 11, 2009, The Institute of Electrical and Electronics Engineers, Inc., 141 pages.
Author Unknown, “Multi-Agent System”, Wikipedia—the free encyclopedia, Updated Apr. 18, 2013, Retrieved May 30, 2013, http://en.wikipedia.org/wiki/multi-agent_system, 7 pages.
Author Unknown, "I2C-Bus: What's That?," Updated 2012, Retrieved May 30, 2013, http://www.i2c-bus.org, 1 page.
Kuhn, Fabian et al., "Initializing Newly Deployed Ad Hoc and Sensor Networks," The Tenth Annual International Conference on Mobile Computing and Networking (MobiCom '04), Sep. 26-Oct. 4, 2004, 15 pages, Philadelphia, PA.
Teasdale, Dana et al., “Annual Technical Progress Report: Adapting Wireless Technology to Lighting Control and Environmental Sensing,” Dust Networks, Aug. 1, 2004, 41 pages.
DiGeronimo, John, “Search Report,” EIC 2800, Tracking No. 533769, Scientific & Technical Information Center, Feb. 1, 2017, 16 pages.
Author Unknown, “Controlling LEDs,” Lutron Electronics Co., Inc., Jan. 1, 2011, 16 pages.
Author Unknown, “Section 16950: Distributed Digital Lighting Control System,” Lighting Control Devices, Apr. 30, 2013, 20 pages.
Author Unknown, “System Design Guide—Lighting Control & Design: System Overview,” Lighting Control and Design, Form No. 1382.057, Accessed Aug. 9, 2013, 4 pages.
Author Unknown, “System Overview & Introduction,” nLight Network Lighting Controls, Accessed: Aug. 9, 2013, 4 pages, http://nlightcontrols.com/lighting-controls/overview.
Author Unknown, “The System: Components,” Simply5, Accessed: Aug. 9, 2013, 2 pages, http://simply5.net/how.html.
Technical Publications Department at Crestron, "Crestron Green Light Commercial Lighting Design Guide," Crestron Electronics, Inc., 2013, 74 pages.
Notice of Allowance for U.S. Appl. No. 15/628,975, dated Oct. 10, 2017, 9 pages.
U.S. Appl. No. 13/649,531, filed Oct. 11, 2012.
U.S. Appl. No. 13/589,899, filed Aug. 20, 2012.
U.S. Appl. No. 13/606,713, filed Sep. 7, 2012, now U.S. Pat. No. 8,829,800.
U.S. Appl. No. 13/782,022, filed Mar. 1, 2013.
U.S. Appl. No. 13/782,040, filed Mar. 1, 2013, now U.S. Pat. No. 8,975,827.
U.S. Appl. No. 13/782,053, filed Mar. 1, 2013.
U.S. Appl. No. 13/782,068, filed Mar. 1, 2013.
U.S. Appl. No. 13/782,078, filed Mar. 1, 2013, now U.S. Pat. No. 8,829,821.
U.S. Appl. No. 13/782,096, filed Mar. 1, 2013.
U.S. Appl. No. 13/782,131, filed Mar. 1, 2013, now U.S. Pat. No. 8,912,735.
U.S. Appl. No. 29/452,813, filed Apr. 22, 2013.
U.S. Appl. No. 13/868,021, filed Apr. 22, 2013.
U.S. Appl. No. 13/719,786, filed Dec. 19, 2012.
U.S. Appl. No. 14/588,762, filed Feb. 1, 2015.
U.S. Appl. No. 14/498,119, filed Sep. 26, 2014.
U.S. Appl. No. 14/287,812, filed May 27, 2014.
U.S. Appl. No. 14/292,286, filed May 30, 2014.
U.S. Appl. No. 14/292,332, filed May 30, 2014.
U.S. Appl. No. 14/292,363, filed May 30, 2014.
U.S. Appl. No. 14/498,147, filed Sep. 26, 2014.
U.S. Appl. No. 14/498,168, filed Sep. 26, 2014.
U.S. Appl. No. 14/498,197, filed Sep. 26, 2014.
U.S. Appl. No. 14/498,219, filed Sep. 26, 2014.
U.S. Appl. No. 14/681,846, filed Apr. 8, 2015.
U.S. Appl. No. 15/628,975, filed Jun. 21, 2017.
Notice of Allowance for U.S. Appl. No. 15/628,975, dated Dec. 28, 2017, 8 pages.
Related Publications (1)
Number Date Country
20180092189 A1 Mar 2018 US
Provisional Applications (1)
Number Date Country
62400525 Sep 2016 US