The present disclosure is directed generally to interactive lighting control, particularly to the controlling and creating of light effects such as the tuning of light scenes based on a determined state of a plant.
Decorative tree lighting typically includes strips of lights that are hung and wrapped on trees and other plants for decorative purposes. Such tree lighting is often used for festive occasions like Christmas and Diwali and is usually controlled by simple timing-based controllers that execute pre-defined lighting recipes. Modern lighting systems may allow the user to set different recipes via mobile-phone interfaces.
Some tree lighting systems include lighting elements that wrap around the trunk of a tree using a mesh-based wire arrangement. The mesh-based wire arrangement remains in contact with the trunk. Biomimetic textile-based biosensors are available to monitor, in vivo and in real time, variations in the solute content of plant sap. Such biosensors have no detectable effect on the plant's morphology. However, such biosensors are inserted directly into the tissue of the plant.
There is a need in the art to improve interactive tree lighting control systems using user-friendly sensors.
The present disclosure is directed to inventive systems and methods for interactive lighting control using plant lighting and surface-based sensors to capture sensor data indicative of a state of the plant. Generally, embodiments of the present disclosure are directed to improved systems and methods for determining a state of a plant using surface-based sensors. Applicant has recognized and appreciated that it would be beneficial to exploit the existing structure of large-scale plant lighting using contact-based sensing technologies to determine a state of the plant. Additionally, Applicant has recognized and appreciated that it would be beneficial to control the plant lighting based on data collected from the contact-based sensors and/or user preferences.
Generally, in one aspect, a system for controlling plant lighting is provided. The system includes a plurality of sensors arranged around a portion of a plant, wherein the plurality of sensors are distributed among a plurality of lighting elements and the plurality of sensors are configured to capture sensor data for at least one parameter of the plant; and a processor associated with the plurality of sensors and the plurality of lighting elements. The processor is configured to determine or receive location information indicative of relative locations of the plurality of sensors around the portion of the plant; receive, from the plurality of sensors, sensor data for the at least one parameter of the plant; annotate the sensor data with the location information of the plurality of sensors and timestamp information; analyze the annotated sensor data; and determine a state of the plant based on the annotated sensor data.
According to an embodiment, the system further includes a lighting controller associated with the processor, wherein the lighting controller is configured to receive user input comprising a lighting effect corresponding to the state of the plant and control at least one of the plurality of lighting elements to provide the lighting effect based on the user input.
According to an embodiment, the processor is further configured to: receive, from the plurality of sensors, initial sensor data for the at least one parameter of the plant; and automatically determine the location information based on the initial sensor data received.
According to an embodiment, the processor is further configured to: receive an image of the plant; and receive, from a user, the location information indicative of the relative locations of the plurality of sensors within the image.
According to an embodiment, the plurality of sensors are contact-based sensors.
According to an embodiment, the plurality of sensors are ultrasonic sensors.
According to an embodiment, the processor is configured to classify the state of the plant based on a time-series classification algorithm.
Generally, in another aspect, a method for controlling plant lighting is provided. The method includes: determining or receiving location information indicative of relative locations of a plurality of sensors arranged around a portion of a plant, wherein the plurality of sensors are distributed among a plurality of lighting elements and the plurality of sensors are configured to capture sensor data for at least one parameter of the plant; measuring, by the plurality of sensors, sensor data for the at least one parameter of the plant; annotating, by a processor, the sensor data with the location information of the plurality of sensors and timestamp information; analyzing, by the processor, the annotated sensor data from the plurality of sensors; and determining, by the processor, a state of the plant based on the annotated sensor data.
According to an embodiment, the method further includes: receiving, by a lighting controller, user input comprising a lighting effect corresponding to the state of the plant; and controlling, by the lighting controller, at least one of the plurality of lighting elements based on the user input.
According to an embodiment, the determining or receiving step includes: collecting, by the plurality of sensors, initial sensor data for the at least one parameter of the plant; and automatically determining the location information based on the initial sensor data collected.
According to an embodiment, the determining or receiving step includes: receiving an image of the plant; and receiving, from a user, the location information indicative of relative locations of the plurality of sensors within the image.
According to an embodiment, the measuring step includes measuring the sensor data with contact-based sensors.
According to an embodiment, the measuring step includes measuring the sensor data with ultrasonic sensors.
According to an embodiment, the step of determining the state of the plant includes classifying the state of the plant based on a time-series classification algorithm.
According to an embodiment, the method further includes: receiving user input comprising a lighting effect corresponding to an aggregation of a plurality of states of the plant, the plurality of states of the plant comprising the state of the plant; and controlling, by a lighting controller, at least one of the plurality of lighting elements based on the user input.
In various implementations, the processor described herein may take any suitable form, such as one or more processors or microcontrollers, circuitry, one or more controllers, a field programmable gate array (FPGA), or an application-specific integrated circuit (ASIC) configured to execute software instructions. Memory associated with the processor may take any suitable form or forms, including a volatile memory, such as random-access memory (RAM), static random-access memory (SRAM), or dynamic random-access memory (DRAM), or non-volatile memory such as read-only memory (ROM), flash memory, a hard disk drive (HDD), a solid-state drive (SSD), or other non-transitory machine-readable storage media. The term “non-transitory” means excluding transitory signals but does not further limit the forms of possible storage. In some implementations, the storage media may be encoded with one or more programs that, when executed on one or more processors and/or controllers, perform at least some of the functions discussed herein. It will be apparent that, in embodiments where the processor implements one or more of the functions described herein in hardware, the software described as corresponding to such functionality in other embodiments may be omitted. Various storage media may be fixed within a processor or may be transportable, such that the one or more programs stored thereon can be loaded into the processor so as to implement various aspects as discussed herein. Data and software, such as the algorithms or software necessary to analyze the data collected by the sensors, an operating system, firmware, or other application, may be installed in the memory.
It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the inventive subject matter disclosed herein. In particular, all combinations of claimed subject matter appearing at the end of this disclosure are contemplated as being part of the inventive subject matter disclosed herein.
In the drawings, like reference characters generally refer to the same parts throughout the different views. Also, the drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the present disclosure.
The present disclosure describes various embodiments of systems and methods for interacting with plant lighting using surface-based sensors to capture data indicative of a parameter of the plant. Applicant has recognized and appreciated that it would be beneficial to capture plant data (e.g., water transport measurements) using a plant-wide sensor system integrated with plant lighting and control the lighting based on the captured plant data. The present disclosure describes various embodiments of systems and methods for providing a distributed network of sensors by making use of illumination devices that are already arranged in a wire-mesh arrangement. Such existing infrastructure can be used as a backbone for the additional detection functionalities described herein.
Referring to
In embodiments, the sensors include at least some connected hydration sensors contacting various parts of the plant P. Water is typically transported through the xylem tissues present in the trunk of the plant during nighttime. Thus, the connected hydration sensors can be configured to measure water transport at night. In embodiments where the time of day is relevant, the sensors can include a clock, a daylight sensor, or any other suitable means for distinguishing day from night. The clock, daylight sensor, or other means can also be in communication with the sensors and/or other components of the system (e.g., processor 14). In embodiments, the sensors include at least some connected air quality detection sensors contacting various parts of the plant P to monitor the emission of gases such as carbon dioxide. The connected air quality detection sensors and connected hydration sensors can be used alternatively or in combination. Instead of or in addition to the connected hydration and air quality sensors, the sensors can include at least some ultrasonic sensors to determine how much water is within the plant. For example, the ultrasonic sensors can be positioned on either side of the trunk of the plant, and the signals emitted from one side and received at the other side can be used to determine how much water lies in between. In embodiments, the sensors include some contact-less sensors, such as optical sensors arranged at a distance from the plant. The term “connected,” as used with reference to the sensors, refers to any interconnection of two or more devices (including controllers or processors) that facilitates the transmission of information (e.g., for device control, data storage, data exchange, etc.) between the devices coupled to a network. Any suitable network for interconnecting two or more devices is contemplated, including any suitable topology and any suitable communication protocols.
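For illustration only, the ultrasonic arrangement described above might be sketched as follows in Python. The reference sound speeds and the linear mixing model are assumptions made for the example, not calibrated acoustics from any embodiment:

```python
def water_fraction(trunk_diameter_m, transit_time_s,
                   v_dry=1200.0, v_water=1480.0):
    """Rough water-content estimate from ultrasonic time-of-flight
    across a trunk. The two reference speeds (assumed "dry wood" and
    "saturated") and the linear interpolation are illustrative
    assumptions only."""
    # Effective sound speed across the trunk.
    v_measured = trunk_diameter_m / transit_time_s
    # Interpolate between the assumed dry and saturated speeds,
    # clamped to the range [0, 1].
    frac = (v_measured - v_dry) / (v_water - v_dry)
    return max(0.0, min(1.0, frac))
```

A slower measured transit (lower effective speed) maps toward the assumed dry-wood reference, a faster one toward the assumed water-saturated reference.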
As shown schematically in
As shown in arrangement 50 of
As shown in arrangement 70 of
The system 10 also includes a processor 14 associated with the plurality of sensors S and the plurality of lighting elements 12. The processor 14 is configured to determine or receive location information indicative of relative locations of the plurality of sensors around the portion of the plant P where they are positioned as further explained below. In an embodiment, the processor 14 is configured to determine the relative locations of the plurality of sensors S on the plant P based on an automated commissioning process. In such an automated commissioning process, the processor 14 is configured to receive initial sensor data for at least one parameter of the plant P from the plurality of sensors S described herein. When a sufficient number of measurements are collected by all of the sensors S, the sensors undergo self-commissioning. This results in the processor 14 gaining an understanding of which sensors are at which locations of the plant. For example, if the sensors are activated in a sequence based on their position on the plant, such a sequence can be exploited to determine the relative positions of the sensors. Some sensors that are near the ground will provide sensor data that can be differentiated from sensor data provided by sensors that are higher up the trunk. Thus, if root sensors are activated before trunk, branch, and leaf sensors, the processor 14 can determine that the root sensors are below the trunk, branch, and leaf sensors and so on. The spatio-temporal pattern of the sensor readings can be used as input to a suitable graph-learning algorithm.
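For illustration only, the activation-sequence heuristic described above might be sketched as follows; the sensor identifiers and the single first-reading timestamp per sensor are assumptions for the example, and a fuller embodiment could instead feed the whole spatio-temporal pattern to a graph-learning algorithm:

```python
from dataclasses import dataclass

@dataclass
class Activation:
    sensor_id: str
    first_reading_time: float  # seconds since measurements began

def infer_relative_order(activations):
    """Order sensors bottom-to-top, assuming sensors lower on the
    plant (e.g., near the roots) register water transport first."""
    ranked = sorted(activations, key=lambda a: a.first_reading_time)
    return [a.sensor_id for a in ranked]

# Hypothetical readings: the root sensor activates before the
# trunk and leaf sensors, so it is inferred to be lowest.
order = infer_relative_order([
    Activation("leaf-1", 42.0),
    Activation("root-1", 3.5),
    Activation("trunk-1", 17.2),
])
```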
In an embodiment, the processor 14 is configured to receive the relative positions of the plurality of sensors around the plant using a manual commissioning process. In such an embodiment, for example, the processor 14 can receive an image of the plant and user input indicative of location information of the relative locations of the sensors within the image. For example, a user can assign each sensor a location within the image and that data can be used to determine relative locations among the sensors. In another example, a user can be instructed to identify each sensor in a sequence starting at a point in the image (e.g., the bottom) and ending at another point in the image (e.g., the top). Such a sequence conveys relative locations of the sensors.
The processor 14 is also configured to receive sensor data from the sensors S after the commissioning process. Once commissioning is complete, the sensor data is obtained depending on the parameter(s) being monitored. The processor 14 can annotate the obtained sensor data with the corresponding parts of the plant, along with timestamp information. As used herein, the term “annotate” refers to the process of associating data from one data structure with data from another data structure. The term annotate can refer to data tagging or labeling in some embodiments. The processor can also analyze the annotated sensor data and determine a state of the plant based on the annotated sensor data. For example, when a sufficient number of measurements are collected by all of the sensors S, a time-series classification algorithm can be used to classify the state of the plant. In an embodiment, the states of the plant can include the following for a system of connected hydration sensors: “Water uptake ongoing”, “Water uptake starting”, “Water uptake complete”, etc. Various time-series classification techniques, such as Markov models or long short-term memory (LSTM) artificial neural networks, can be used as well.
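For illustration only, the annotation step and a toy stand-in for the classification step might look as follows; the dictionary fields and the thresholds are assumptions for the example, and an actual embodiment could use a Markov model or LSTM classifier instead:

```python
import time

def annotate(reading, sensor_id, locations):
    """Tag a raw reading with the sensor's commissioned plant
    location and a timestamp (the "annotate" step above)."""
    return {
        "value": reading,
        "sensor_id": sensor_id,
        "plant_part": locations[sensor_id],  # e.g., "trunk"
        "timestamp": time.time(),
    }

def classify_uptake(series, start_thresh=0.2, done_thresh=0.9):
    """Toy stand-in for a time-series classifier: map the latest
    normalized hydration level to one of the example states.
    Thresholds are illustrative assumptions."""
    latest = series[-1]
    if latest < start_thresh:
        return "Water uptake starting"
    if latest < done_thresh:
        return "Water uptake ongoing"
    return "Water uptake complete"
```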
The system 10 can also include a lighting controller 16 associated with the processor 14. As shown in
The one or more lighting effects of the user input 18 can include any light recipe comprising any combination of light parameters, such as color, color temperature, saturation, brightness, intensity, etc. The light parameters can include any number of colors as well. For example, a particular state of a plant can correspond to a mixture of colors, or a mixture of a sub-set of colors. The light recipe can also convey a summary of different states the plant has exhibited throughout a period of time, such as a day. In an embodiment, the different states that the plant has exhibited over a period of time can be averaged according to any suitable process. In embodiments, different states can be applied with different weights in the averaging process depending on when the states occur during the day. For example, a particular state at night might be weighted more heavily than a particular state exhibited during the day, or vice versa, depending on the parameter of the plant being monitored. In embodiments, the light recipe can also convey contextual information about the user's day. The user input 18 from the electronic device 403 can be stored in memory 420.
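For illustration only, the weighted summarization of a day's states might be sketched as a weighted vote; the weights and state labels are assumptions for the example, and any suitable averaging process could be substituted:

```python
from collections import defaultdict

def summarize_day(observations, night_weight=2.0, day_weight=1.0):
    """Weighted vote over (state, is_night) observations from one
    day. Night observations count more here, per the example above;
    the specific weights are illustrative assumptions."""
    scores = defaultdict(float)
    for state, is_night in observations:
        scores[state] += night_weight if is_night else day_weight
    # Return the state with the highest accumulated weight.
    return max(scores, key=scores.get)

# Two night observations (weight 2 each) outvote three day
# observations (weight 1 each).
summary = summarize_day([
    ("Water uptake ongoing", True),
    ("Water uptake ongoing", True),
    ("Water uptake complete", False),
    ("Water uptake complete", False),
    ("Water uptake complete", False),
])
```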
The memory 420 can be integrated in or otherwise connected to the lighting controller 16. Each state of the plant 402 can be associated with specific lighting effects provided via user input 18 by processor 422, if the association is not already carried out by the electronic device 403. Processor 422 can be integrated in or otherwise connected to the lighting controller 16. In embodiments, lighting effects can be associated with one or more states of a plant and the associated data can be stored in memory 420. In embodiments, the associated data can be in a look-up table (LUT) or any suitable alternative. Processor 422 can be configured to access such stored associated data in memory 420 when the controller 16 receives the one or more states of the plant 402.
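For illustration only, such a look-up table might be sketched as a simple mapping; the state labels, effect fields, and default fallback are assumptions made for the example:

```python
# Illustrative look-up table associating plant states with lighting
# effects; the effect fields mirror light parameters such as color
# and brightness discussed above.
EFFECT_LUT = {
    "Water uptake starting": {"color": "amber", "brightness": 0.4},
    "Water uptake ongoing": {"color": "blue", "brightness": 0.7},
    "Water uptake complete": {"color": "green", "brightness": 1.0},
}

DEFAULT_EFFECT = {"color": "white", "brightness": 0.5}

def effect_for(state):
    """Resolve a lighting effect for a reported plant state,
    falling back to a default for unmapped states."""
    return EFFECT_LUT.get(state, DEFAULT_EFFECT)
```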
The lighting controller 16 is also configured to control one or more of the lighting elements 12 to provide the lighting effect based on the user input 18. The lighting effect can be an intensity, one or more colors, a flashing pattern, or any other light effect property that can be altered. The lighting controller 16 can include a communications device 424, such as a wireless network device (e.g., Wi-Fi), Bluetooth device, infrared receiving unit, and so forth. Generally, light controllers include software components for configuring fixtures and designing and editing lighting scenes, and hardware components for sending control data to the fixtures. Controllers/drivers are typically used for flashing, dimming, and color mixing lights. Example light controllers include the Video System Manager Pro, the Light System Manager (LSM) controller, and the ColorDial Pro, from Signify N.V. of Eindhoven, NL.
The communications device 424 of the lighting controller 16 is adapted to receive one or more lighting adjustment signals from the processor 422 causing the controller 16 to alter one or more lighting properties of the lighting elements 12. The lighting controller system 400 includes a power supply 426.
Referring to
At step 1004 of the method, the plurality of sensors measure sensor data for the at least one parameter of the plant. Such measurements can be obtained continuously, periodically, or on demand.
At step 1006 of the method, the sensor data can be annotated with location information and timestamp information.
At step 1008 of the method, the sensor data can be analyzed and at step 1010 a state of the plant can be determined based on the analyzed sensor data. The state of the plant can indicate a health status of the plant and/or indicate issues with water transport, for example. In embodiments with connected hydration sensors, the hydration measurements collected can be used to estimate water flows in the plant.
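For illustration only, one way hydration readings at two known, vertically separated locations might be combined into a flow estimate is sketched below; the peak-lag model and its parameters are assumptions for the example:

```python
def flow_speed(sensor_gap_m, lower_peak_t, upper_peak_t):
    """Estimate sap flow speed from the time lag between hydration
    peaks at two vertically separated sensors (illustrative model:
    speed = separation / lag)."""
    lag = upper_peak_t - lower_peak_t
    if lag <= 0:
        raise ValueError("upper sensor should peak after the lower one")
    return sensor_gap_m / lag
```

A lag of 100 seconds between sensors 0.5 m apart would correspond to an estimated flow speed of 0.005 m/s under this model.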
Advantageously, a plant-wide sensor system is provided that can generate information indicating a state or health status of the plant. Such described sensor systems are easy to install and do not require inserting the sensors into the tissue of the plant.
In addition to measuring water transport or other parameters of a plant, the plant-wide sensor system provides an entirely new dimension for users to interact with and control the connected lighting elements 12.
At step 1012 of the method in
At step 1014 of the method, the lighting controller 16 can control one or more of the plurality of lighting elements 12 based on the user input.
In embodiments with connected hydration sensors, methods can involve initializing hydration sensor readings in step 1, learning a topology of the plant based on the hydration sensor readings in step 2, continuing to obtain hydration sensor readings at different parts of the plant at step 3, and combining the hydration sensor readings to classify a state of the plant at step 4. The classification step can involve a time-series classification process. At step 5, the method can involve determining whether a state of the plant is converged. If not, the process returns to step 3 to continue obtaining hydration sensor readings at different parts of the plant. If the state of the plant is converged, the process proceeds to control the lighting elements. The lighting elements are then activated based on sensor state and any user preferences associated with the sensor state.
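For illustration only, the five-step flow above (read, classify, check convergence, loop or actuate) might be sketched as follows; the callables passed in are assumptions standing in for the sensor, classifier, and lighting subsystems:

```python
def run_hydration_pipeline(read_sensors, classify, apply_effect,
                           max_iters=100):
    """Sketch of the loop above: keep reading hydration sensors
    (step 3) and classifying the plant state (step 4) until the
    state converges (step 5), then drive the lighting elements."""
    prev_state = None
    for _ in range(max_iters):
        readings = read_sensors()      # step 3: obtain readings
        state = classify(readings)     # step 4: time-series classification
        if state == prev_state:        # step 5: state converged?
            apply_effect(state)        # activate lighting for this state
            return state
        prev_state = state
    return prev_state                  # bail out if never converged
```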
Advantageously, the systems and methods can be used to allow a user to customize plant lighting based on sensor data of the plant. The measurements obtained across the plant-wide sensor system can be combined with user preferences to generate different lighting scenes in the lighting controller 16. Accordingly, when the controller controls the lighting elements 12 to display a particular lighting scene based on the user input, the user can immediately appreciate what is happening with the plant when viewing the lighting scene. The user can also make changes to the lighting scenes as desired.
It should also be understood that, unless clearly indicated to the contrary, in any methods claimed herein that include more than one step or act, the order of the steps or acts of the method is not necessarily limited to the order in which the steps or acts of the method are recited.
All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.
The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”
The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified.
As used herein in the specification and in the claims, “or” should be understood to have the same meaning as “and/or” as defined above. For example, when separating items in a list, “or” or “and/or” shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as “only one of” or “exactly one of,” or, when used in the claims, “consisting of,” will refer to the inclusion of exactly one element of a number or list of elements. In general, the term “or” as used herein shall only be interpreted as indicating exclusive alternatives (i.e. “one or the other but not both”) when preceded by terms of exclusivity, such as “either,” “one of,” “only one of,” or “exactly one of.”
As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified.
In the claims, as well as in the specification above, all transitional phrases such as “comprising,” “including,” “carrying,” “having,” “containing,” “involving,” “holding,” “composed of,” and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of” shall be closed or semi-closed transitional phrases, respectively.
While several inventive embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the inventive embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the inventive teachings is/are used. Those skilled in the art will recognize or be able to ascertain using no more than routine experimentation, many equivalents to the specific inventive embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed. Inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the inventive scope of the present disclosure.
Number | Date | Country | Kind
--- | --- | --- | ---
20201203.5 | Oct 2020 | EP | regional

Filing Document | Filing Date | Country | Kind
--- | --- | --- | ---
PCT/EP2021/076175 | 9/23/2021 | WO |

Number | Date | Country
--- | --- | ---
63086706 | Oct 2020 | US