The present disclosure relates to dynamic lighting, wherein one or more lighting devices provide lighting that changes over time to shape the environment of an indoor space.
Modern lighting devices continue to evolve, incorporating significant functionality in addition to providing light for general illumination. Many modern lighting devices include communications circuitry and form a network with one or more other devices. Leveraging the functionality of modern lighting fixtures, it may be desirable to provide dynamic lighting in which one or more characteristics of light provided from a lighting device or a group of lighting devices change over time to shape the environment of an indoor space.
Systems and methods for providing dynamic lighting are provided. In an exemplary aspect, one or more characteristics of light provided from a lighting device or a group of lighting devices change over time to shape the environment of an indoor space according to dynamic lighting instructions. Dynamic lighting may improve the health or wellbeing of individuals in an indoor space, for example, by simulating an outdoor environment to reduce stress, by providing circadian entrainment to improve sleep and wakefulness, or the like. Other aspects of the present disclosure enable lighting devices to provide light that is synchronized with one or more other devices and does not significantly drift over time so that the lighting devices can provide seamless dynamic lighting experiences that shape the environment of an indoor space.
In one embodiment, a lighting device includes a light source, communications circuitry, processing circuitry, and a memory. The processing circuitry is coupled to the light source and the communications circuitry. The memory is coupled to the processing circuitry and stores instructions, which, when executed by the processing circuitry, cause the lighting device to receive dynamic lighting instructions via the communications circuitry. The dynamic lighting instructions include transition information. In response to receiving the dynamic lighting instructions, the lighting device determines a light output function for changing a light output characteristic of the light source based on the transition information. The lighting device then adjusts a light output characteristic variable for controlling the light source over time such that the light output characteristic transitions from its current state based on the light output function. By operating the lighting device as described above, dynamic lighting can be synchronized across lighting devices with minimal communication overhead and seamless transitions in light output.
In another embodiment, a method for providing dynamic lighting includes receiving dynamic lighting instructions at a lighting device. The dynamic lighting instructions include transition information. In response to receiving the dynamic lighting instructions at the lighting device, the method further includes determining a light output function for changing a light output characteristic of a light source based on the transition information. The method further includes adjusting, over time, a light output characteristic variable for controlling the light source such that the light output characteristic transitions from its current state based on the light output function.
In another embodiment, an intelligent lighting coordinator includes communications circuitry, processing circuitry, and a memory coupled to the processing circuitry. The processing circuitry is coupled to the communications circuitry. The memory stores instructions, which, when executed by the processing circuitry, cause the intelligent lighting coordinator to receive a lighting control input via the communications circuitry and determine a first lighting control profile from the lighting control input. The intelligent lighting coordinator further determines dynamic lighting instructions for changing a light output characteristic of a light source based on the first lighting control profile and transmits the dynamic lighting instructions via the communications circuitry.
In another embodiment, an intelligent lighting system includes an intelligent lighting coordinator and a plurality of lighting devices. The intelligent lighting coordinator includes coordinator processing circuitry and a coordinator memory. The coordinator memory stores instructions, which, when executed by the coordinator processing circuitry, cause the intelligent lighting coordinator to receive a lighting control input and determine a first lighting control profile from the lighting control input. The intelligent lighting coordinator further transmits dynamic lighting instructions based on the first lighting control profile. Each one of the plurality of lighting devices includes a light source, lighting device processing circuitry, and a lighting device memory. The lighting device memory stores instructions, which, when executed by the lighting device processing circuitry, cause the one of the plurality of lighting devices to, in response to receiving the dynamic lighting instructions, determine a light output function for changing a light output characteristic of the light source using the dynamic lighting instructions and adjust a light output characteristic variable for controlling the light source over time such that the light output characteristic transitions from its current state based on the light output function.
Those skilled in the art will appreciate the scope of the present disclosure and realize additional aspects thereof after reading the following detailed description of the preferred embodiments in association with the accompanying drawing figures.
The accompanying drawing figures incorporated in and forming a part of this specification illustrate several aspects of the disclosure, and together with the description serve to explain the principles of the disclosure.
The embodiments set forth below represent the necessary information to enable those skilled in the art to practice the embodiments and illustrate the best mode of practicing the embodiments. Upon reading the following description in light of the accompanying drawing figures, those skilled in the art will understand the concepts of the disclosure and will recognize applications of these concepts not particularly addressed herein. It should be understood that these concepts and applications fall within the scope of the disclosure and the accompanying claims.
It will be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without departing from the scope of the present disclosure. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
It will be understood that when an element such as a layer, region, or substrate is referred to as being “on” or extending “onto” another element, it can be directly on or extend directly onto the other element or intervening elements may also be present. In contrast, when an element is referred to as being “directly on” or extending “directly onto” another element, there are no intervening elements present. Likewise, it will be understood that when an element such as a layer, region, or substrate is referred to as being “over” or extending “over” another element, it can be directly over or extend directly over the other element or intervening elements may also be present. In contrast, when an element is referred to as being “directly over” or extending “directly over” another element, there are no intervening elements present. It will also be understood that when an element is referred to as being “connected” or “coupled” to another element, it can be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected” or “directly coupled” to another element, there are no intervening elements present.
Relative terms such as “below” or “above” or “upper” or “lower” or “horizontal” or “vertical” may be used herein to describe a relationship of one element, layer, or region to another element, layer, or region as illustrated in the Figures. It will be understood that these terms and those discussed above are intended to encompass different orientations of the device in addition to the orientation depicted in the Figures.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises,” “comprising,” “includes,” and/or “including” when used herein specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. It will be further understood that terms used herein should be interpreted as having a meaning that is consistent with their meaning in the context of this specification and the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As discussed above, it may be desirable to provide dynamic lighting in which one or more characteristics of light provided from a lighting device or a group of lighting devices change over time to shape the environment of an indoor space. Dynamic lighting may improve the health or wellbeing of individuals in an indoor space, for example, by simulating an outdoor environment to reduce stress, by providing circadian entrainment to improve sleep and wakefulness, or the like. Conventionally, synchronization of the light output of multiple lighting devices has required significant overhead in the form of communications between the lighting devices and one or more coordinator devices (i.e., many messages sent at very short intervals). Often, lighting devices form part of a low bandwidth mesh network in which available data throughput is relatively low. For this reason, conventional methods for synchronizing lighting devices may not be capable of providing a seamless dynamic lighting experience, because the necessary messages flood such a low bandwidth network and thus interrupt the synchronization of the light output of the lighting devices. Further, conventional methods for synchronizing the light output of lighting devices are not tolerant of dropped messages, since the lighting devices rely on messages from the one or more coordinator devices to change any aspect of the light output provided therefrom. Dropped messages may result in no change in the light output from the lighting devices and, when a message finally does arrive at a lighting device, an abrupt change in light output that is disruptive to individuals in the space.
Alternatively, dynamic lighting has required a real time clock at each lighting device for accurate timekeeping and thus synchronization. Integrating a real time clock into a lighting device adds overhead in terms of both cost and complexity to the lighting device. Accordingly, it is often not practical to do so.
Aspects of the present disclosure enable lighting devices to provide light that is synchronized with one or more other devices and does not significantly drift over time so that the lighting devices can provide seamless dynamic lighting experiences that shape the environment of an indoor space.
In some embodiments, the processing circuitry 26 provides control signals for controlling the light source 20 according to one or more light output characteristics, while circuitry for providing signals suitable to drive the light source 20 in accordance with the control signals is integrated into the light source 20 itself. In other embodiments, drive signals may be provided directly by the processing circuitry 26 or may be provided by external circuitry such as driver circuitry, which is not shown. The sensor circuitry 22 may include any number of sensors such as an ambient light sensor, an occupancy sensor, one or more image sensors, a temperature sensor, or the like, and may provide sensor data from the one or more sensors to the processing circuitry 26 in order to enable certain functionality of the lighting device 12 discussed below. The communications circuitry 24 enables communication with other devices such as one or more other lighting devices 12 and the intelligent lighting coordinator 14. The memory 28 stores instructions, which, when executed by the processing circuitry 26, cause the lighting device 12 to perform one or more functions, such as providing dynamic lighting as discussed in detail below.
In some embodiments, the lighting device 12 includes multiple light sources 20, such as a direct light panel and an indirect light panel. In some embodiments, further light sources 20 may be included, such as a sky-emulating light source (e.g., where another light source may be a sun-emulating light source). In an exemplary aspect, the processing circuitry 26 provides control signals for controlling each of the light sources 20 independently according to one or more light output characteristics.
Exemplary dynamic lighting instructions are shown in
In some embodiments, the different profile identifiers are used to differentiate lighting devices 12 at different spatial locations within a space. For example, lighting devices 12 associated with the first profile identifier may be located at a first end of a space, lighting devices 12 associated with the second profile identifier may be located at a middle of the space, and lighting devices 12 associated with the third profile identifier may be located at a second end of the space opposite the first end. The destination states associated with each profile identifier may be configured to provide dynamic lighting that is coordinated across the space (e.g., light appears to move from the first end of the space to the second end of the space) over time. In embodiments discussed below, the dynamic lighting instructions are generated automatically based on knowledge of a spatial relationship between lighting devices 12 to provide such an effect. In some embodiments, different profile identifiers may additionally or alternatively be used to differentiate between light sources 20 within a same lighting device 12 (e.g., to differentiate an indirect/uplight from a direct/downlight).
Notably, the dynamic lighting instructions shown in
In response to receiving the dynamic lighting instructions, each lighting device 12 determines a light output function for changing each light output characteristic from its current state based on the transition information (block 102). Details regarding determination of the light output function are discussed below. Each lighting device 12 then adjusts one or more light output characteristic variables over time such that the light output characteristics transition from the current state based on the light output function (block 104).
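For illustration only, the semi-autonomous transition described above (blocks 102 and 104) may be sketched as follows. The names (`TransitionInfo`, `linear_slope`, `tick`) and the choice of a linear slope as the light output function are assumptions for this example, not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class TransitionInfo:
    destination: float   # destination state (e.g., CCT in kelvin)
    duration_s: float    # transition duration in seconds

def linear_slope(current: float, info: TransitionInfo) -> float:
    # Light output function expressed as a slope: change per second
    # between the current state and the destination state.
    return (info.destination - current) / info.duration_s

def tick(current: float, slope: float, dt_s: float, info: TransitionInfo) -> float:
    # Advance the light output characteristic variable by one time step,
    # clamping so it never overshoots the destination state.
    nxt = current + slope * dt_s
    return min(nxt, info.destination) if slope >= 0 else max(nxt, info.destination)

# Example: CCT transitions from 3000 K toward 5000 K over one hour.
info = TransitionInfo(destination=5000.0, duration_s=3600.0)
cct = 3000.0
slope = linear_slope(cct, info)
for _ in range(60):              # simulate one minute in 1 s steps
    cct = tick(cct, slope, 1.0, info)
```

Because the device stores only the slope and destination, it can continue the transition between updates without further messages from the coordinator.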
The light output characteristic variables are used, in a first mode (e.g., normal mode) of the lighting devices 12, to adjust the one or more light output characteristics of each light source 20. In some modes of the lighting devices 12 (e.g., based on occupancy events, due to an override instruction, in an emergency, etc.), the light output characteristic variables are not used to adjust the light output characteristics. However, in such modes, the light output characteristic variables may continue to be calculated based on the light output functions and are stored in the memory for when the first mode resumes.
Notably, each one of the lighting devices 12 continues to adjust the light output characteristic variables based on the determined light output function after the dynamic lighting instructions are received such that the lighting devices 12 operate semi-autonomously to transition between the current state and the destination state. However, as discussed above, the lighting devices 12 may not have access to a real time clock and thus may approximate a clock by counting processor clock cycles. Accordingly, the lighting devices 12 may experience timing drift such that they become unsynchronized with one or more other lighting devices 12.
To keep the light output from the lighting devices 12 synchronized, at some update interval the intelligent lighting coordinator 14 sends updated dynamic lighting instructions to the lighting devices 12 (block 106). The updated dynamic lighting instructions include updated transition information, such as an updated destination state (which may or may not change from the original dynamic lighting instructions) and an updated transition duration (or transition end time). The updated transition duration may be equal to the last transition duration sent minus the amount of time that has passed since the last dynamic lighting instructions were sent. For example, in a first set of updated dynamic lighting instructions sent five minutes after the original dynamic lighting instructions, the transition duration for the first profile identifier may be 55 minutes (60 minutes - 5 minutes). In other examples, synchronization may be provided in another manner, such as through periodic transmission of a clock synchronization signal.
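The remaining-duration arithmetic in this example may be sketched as a one-line helper; the function name is illustrative only.

```python
def updated_duration(last_duration_min: float, elapsed_min: float) -> float:
    # Remaining transition time carried in updated dynamic lighting
    # instructions: the last duration sent minus the time since it was sent.
    return max(last_duration_min - elapsed_min, 0.0)

# Example from the text: a 60-minute transition, updated 5 minutes later.
remaining = updated_duration(60.0, 5.0)   # 55.0 minutes
```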
In response to receiving the updated dynamic lighting instructions, each lighting device 12 determines an updated light output function for each light output characteristic based on the updated transition information (block 108). Each lighting device 12 then adjusts the one or more light output characteristic variables over time based on the updated light output function such that the light output characteristics transition from the current state based on the updated light output function (e.g., to the updated destination state over the updated transition duration) (block 110).
By updating the light output function (e.g., slope) each time updated dynamic lighting instructions are received and adjusting light output characteristics based on the updated light output function (e.g., an updated calculated slope between the current state and the destination state), the lighting devices 12 are able to provide transitions between different light output characteristics with minimal updates from the intelligent lighting coordinator 14 while simultaneously avoiding abrupt changes in light output characteristics. If a lighting device 12 experiences some timing drift between updated dynamic lighting instructions, the updated light output function may be different from the light output function determined in response to the previously received dynamic lighting instructions. The lighting device 12 does not attempt to adjust the light output characteristics back to the previously determined function, which would result in an abrupt change in the light output characteristics that would be disruptive to individuals in the space. Instead, the updated light output function is used to adjust the light output characteristics as discussed above.
In some embodiments, the dynamic lighting instructions may be used to adjust other settings for operating the lighting device 12 in addition to adjusting the light output characteristics (block 104, block 110). For example, operation of the sensor circuitry 22 may be adjusted (e.g., to activate, deactivate, adjust sensitivity, etc.), or other settings used for controlling the light sources (e.g., occupancy level, daylight settings, scheduled operations, etc.) may be adjusted.
As discussed above, the dynamic lighting instructions may include a destination state or light output function for one or more light output characteristics for the lighting devices 12 having different profile identifiers. Accordingly, a destination state for one or more light output characteristics is optionally extracted from the dynamic lighting instructions based on a profile identifier associated with the lighting device 12 (block 202). For example, if the lighting device 12 is associated with the first profile identifier (1001), the destination states for CCT and brightness associated with the first profile identifier may be extracted from the dynamic lighting instructions for calculation of the light output function discussed below.
For each one of the light output characteristics having transition information (block 204), a light output function is calculated based on the transition information (e.g., a slope between the current state of the light output characteristic and the destination state) for the light output characteristic (block 206). For example, if the light output characteristics include CCT and brightness, a slope between the current CCT and the destination CCT will be calculated and a slope between the current brightness and the destination brightness will be calculated. The one or more light output characteristics are then adjusted according to the slope calculated for each light output characteristic (block 208) such that the one or more light output characteristic variables (e.g., and the light output characteristics themselves) transition from the current state to the destination state over the transition duration. It should be understood that the light output function is not limited to a slope, but may also be any appropriate function for adjusting the light output characteristics over time, such as a geometric function, a circadian function, and so on. The memory 28 of the lighting device 12 may store instructions, which, when executed by the processing circuitry 26, cause the lighting device 12 to provide the functionality discussed above.
The light source 20 of the lighting device 12 may be limited in the resolution available for adjusting a given light output characteristic, as determined by a minimum step size representing the minimum amount by which a light output characteristic can be changed. This is dictated by the light source 20 itself as well as the circuitry that drives the light source 20. Due to the limits on the adjustability of the light output characteristics of the light source 20, a number of steps between the current state and the destination state may be calculated by dividing the transition magnitude by the step size. In the example shown, the step size is 400 K, thereby providing 5 steps between the current state and the destination state (2000 K/400 K=5). A step interval is then calculated by dividing the transition duration by the number of steps (60 min/5=12 min). The step interval is the interval between which the light output characteristic (in the present example CCT) should be changed by the minimum step size in the direction of the destination state. With the step interval calculated, the lighting device 12 now knows that it should change the CCT by the minimum step size (400 K) every 12 minutes to arrive at the destination state of 5000 K in 60 minutes.
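The step calculation above may be reproduced as follows for illustration; the helper name is an assumption, and the numbers mirror the example in the text (2000 K transition magnitude, 400 K minimum step, 60-minute duration).

```python
def step_schedule(current: float, destination: float,
                  min_step: float, duration_min: float):
    # Number of steps between the current and destination states, and the
    # interval at which to change the characteristic by the minimum step size.
    magnitude = abs(destination - current)
    steps = magnitude / min_step          # 2000 K / 400 K = 5 steps
    interval = duration_min / steps       # 60 min / 5 = 12 min per step
    return steps, interval

steps, interval = step_schedule(3000.0, 5000.0, 400.0, 60.0)
# Change the CCT by 400 K every 12 minutes to reach 5000 K in 60 minutes.
```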
For each one of the light output characteristics having updated transition information (block 304), an updated light output function is determined (e.g., an updated slope is calculated between the current state and the updated destination state) for the light output characteristic (block 306). The one or more light output characteristic variables are then adjusted according to the updated slope calculated for each light output characteristic (block 308) such that the one or more light output characteristics transition from the current state to the updated destination state over the updated transition duration. The memory 28 of the lighting device 12 may store instructions, which, when executed by the processing circuitry 26, cause the lighting device 12 to provide the functionality discussed above.
Dynamic lighting instructions are determined by the intelligent lighting coordinator 14 based on the lighting control profile (block 404). The dynamic lighting instructions may be similar to those discussed with respect to
Optionally, at some interval, updated dynamic lighting instructions may be provided (block 408). The interval may be determined by a timing drift associated with the lighting devices 12. For example, a measurable timing drift of the lighting devices 12 may result in noticeable differences between adjacent lighting devices 12 over some period of time if the updated dynamic lighting instructions are not provided. This period of time may be used to determine the interval used to send updated dynamic lighting instructions. The memory 34 of the intelligent lighting coordinator 14 may store instructions, which, when executed by the processing circuitry 32, cause the intelligent lighting coordinator 14 to provide the functionality discussed above.
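One way to derive such an update interval is sketched below. The drift rate and skew threshold are hypothetical values for illustration, not figures from the disclosure.

```python
def update_interval_min(drift_s_per_hour: float, noticeable_skew_s: float) -> float:
    # Send updated dynamic lighting instructions often enough that the
    # accumulated timing drift stays below a noticeable skew between devices.
    return (noticeable_skew_s / drift_s_per_hour) * 60.0

# If devices drift roughly 2 s per hour and a 10 s skew between adjacent
# devices would be noticeable, update about every 5 hours (300 minutes).
interval = update_interval_min(2.0, 10.0)
```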
By only sending updated dynamic lighting instructions at certain intervals and operating the lighting devices 12 in a semi-autonomous manner such that a slope between a current state and a destination state is calculated for each set of dynamic lighting instructions received as discussed above, the lighting devices 12 can remain synchronized when providing dynamic lighting with minimal overhead in terms of communication between the lighting devices 12 and the intelligent lighting coordinator 14. Further, abrupt changes in the light output of the lighting devices 12 are avoided to provide a pleasant and seamless dynamic lighting experience.
In an exemplary aspect, the lighting devices 12 operate in multiple modes. In a first mode, which may be considered a normal mode, a lighting device 12 operates as described above, with dynamic lighting provided according to dynamic lighting instructions received from the intelligent lighting coordinator 14. The lighting device 12 may operate in a second mode in response to a triggering event (e.g., received from an occupancy sensor, a wall controller, a scene controller, an emergency system, etc.) in which the lighting device 12 may not provide all functions of the normal mode. For example, the second mode may be an override mode in which one or more of the light output functions derived from the dynamic lighting instructions are overridden. While the light output functions are overridden, the adjustment of the light output characteristic variables may terminate, may be paused, or may continue such that the dynamic lighting resumes when the lighting device 12 exits the override mode.
In an example, the first mode may correspond to an occupied state determined from occupancy sensor data (e.g., from an occupancy sensor in the lighting device 12 or received from another device). The second mode may correspond to an unoccupied state such that the light source is off, outputs at a low brightness, or otherwise is not adjusted in accordance with the light output functions described above. However, some of the light output characteristics may continue to be dynamically adjusted, such as the CCT. An example is further illustrated below.
Between time t0 and t1, the occupancy state is unoccupied and no user commands or dynamic lighting instructions have been provided to the lighting device 12. Accordingly, a CCT and brightness of light from the light source 20 are both provided at an unoccupied level (e.g., according to the second mode), which is a predetermined level for the CCT and brightness. At time t1, dynamic lighting instructions are provided to the lighting device 12. In response to the dynamic lighting instructions and as discussed above, the lighting device 12 determines a light output function (e.g., calculates a slope between the current state and the destination state). In the present example, a slope between the current state of the CCT and the destination state of the CCT and a slope between the current state of the brightness and the destination state of the brightness are calculated. However, since the occupancy state is unoccupied, only the CCT is adjusted according to the slope calculated for the CCT while the brightness of the lighting device 12 is kept at the unoccupied level to save power.
Notably, this is merely one example of how the lighting device 12 can behave, and in some embodiments both the CCT and brightness of the lighting device 12 may be adjusted according to the slope calculated for each one of these characteristics even when the occupancy state is unoccupied (e.g., the light output characteristic variables may be stored but not output). Table 1 illustrates various ways that a lighting device 12 can respond to an occupancy state and other commands based on one embodiment of the present disclosure:
Between time t1 and t5, the lighting device 12 calculates an updated slope for the CCT and the brightness in response to receipt of dynamic lighting instructions, but only the CCT is adjusted according to the calculated slope for the CCT while the brightness remains at the unoccupied level. Notably, even if a particular light output characteristic is not being changed by the lighting device 12 (e.g., due to an unoccupied state or a manual command from a user), the lighting device 12 continues to receive dynamic lighting instructions and calculate an updated slope for the light output characteristic in the background. This allows the lighting device 12 to seamlessly resume the dynamic lighting program at a later time, if the conditions dictate that it should do so.
At time t5, the occupancy state changes from unoccupied to occupied. In response, the brightness is adjusted according to the slope calculated for the brightness (e.g., according to a first mode). In one embodiment, the brightness is immediately adjusted based on the calculated slope for the brightness. In other embodiments, some transition between the unoccupied level and a level based on the calculated slope for the brightness is performed.
Between time t5 and t7, updated slopes for the CCT and the brightness are calculated in response to receipt of dynamic lighting instructions, and the CCT and brightness are adjusted accordingly. At time t7, an override command is received from a user, causing the lighting device 12 to enter an override mode. The override command may be provided, for example, from the user application 18, a wall controller, or any other suitable means. The override command specifies a desired CCT and brightness. In response to the override command, the lighting device 12 immediately adjusts the CCT and brightness of the light source 20 to the desired CCT and brightness. Between time t7 and t10, the lighting device 12 continues to calculate an updated slope for the CCT and brightness in response to receipt of dynamic lighting instructions. However, the light source 20 is not adjusted based on the calculated slopes during the override mode. Instead, the light source 20 provides the light output characteristics according to the override command.
At time t10, the occupancy state changes from occupied to unoccupied. This ends the override mode and causes the lighting device 12 to adjust the brightness to the unoccupied level and the CCT to a level specified by the last calculated slope for the CCT based on the last received dynamic lighting instructions. Between time t10 and t13, the lighting device 12 continues to calculate an updated slope for the CCT and brightness in response to receipt of dynamic lighting instructions. However, only the CCT is adjusted according to the calculated slope for the CCT while the brightness remains at the unoccupied level.
At time t13, the occupancy state changes from unoccupied to occupied. In response, the brightness is adjusted according to the calculated slope for the brightness. Between time t13 and t15, updated slopes for the CCT and brightness are calculated in response to receipt of dynamic lighting instructions, and the CCT and brightness are adjusted accordingly. At time t15, dynamic lighting instructions are no longer received by the lighting device 12. Accordingly, the CCT and brightness are maintained at the destination state of the last received dynamic lighting instructions. At time t16, the occupancy state changes from occupied to unoccupied. In response, the brightness is adjusted to the unoccupied level while the CCT remains unchanged. At time t17, the occupancy state changes from unoccupied to occupied. In response, the brightness is adjusted to the brightness value of the last occupied state (just before time t16). The CCT and brightness remain the same until time t18, at which time the present example ends.
As illustrated above, the occupancy state may change which light output characteristics are adjusted based on the calculated slope for each light output characteristic. When the dynamic lighting instructions include destination states for a plurality of light output characteristics, each one of the plurality of light output characteristics may be adjusted according to its calculated slope when the occupancy state is occupied, while only a subset of the plurality of light output characteristics may be adjusted according to the corresponding calculated slopes when the occupancy state is unoccupied. For example, as illustrated above, both CCT and brightness may be adjusted according to their calculated slopes when the occupancy state is occupied, while only CCT may be adjusted according to the calculated slope for CCT when the occupancy state is unoccupied.
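The occupancy gating just described reduces to a small update rule. The following sketch is illustrative only; the function name, the fixed unoccupied level, and the choice to gate brightness (but not CCT) on occupancy are taken from the example timeline above, not from any specific claimed implementation.

```python
UNOCCUPIED_BRIGHTNESS = 0.0  # illustrative dimmed level for a vacant space

def apply_occupancy_gating(occupied: bool,
                           slope_cct: float, slope_brightness: float,
                           cct: float, brightness: float,
                           dt: float):
    """Advance each light output characteristic along its slope, but gate
    brightness on occupancy: when the space is unoccupied, only the CCT
    keeps tracking the dynamic lighting program."""
    new_cct = cct + slope_cct * dt  # CCT always follows its calculated slope
    if occupied:
        new_brightness = brightness + slope_brightness * dt
    else:
        new_brightness = UNOCCUPIED_BRIGHTNESS
    return new_cct, new_brightness
```

Because the CCT continues to advance regardless of occupancy, the light output rejoins the dynamic lighting program without a visible jump in color temperature when occupancy resumes.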
As discussed above, different profile identifiers in the dynamic lighting instructions may be used to differentiate lighting devices 12 at different spatial locations within a space, and thus the destination states for each profile identifier may be constructed to create a dynamic lighting program that is coordinated across a space. In one embodiment, the destination states for each profile identifier are automatically generated to create a dynamic lighting program that is coordinated across a space.
Generating the dynamic lighting instructions may include grouping lighting devices 12 into a number of profiles designated by a profile identifier based on their spatial relationships to one another, then generating destination states for each profile identifier to create a desired change in light across the space over time. In some embodiments, a lighting device 12 may have multiple profile identifiers (e.g., different profile identifiers for separate controls of different light sources 20 in the lighting device 12), or a single profile identifier may be used to provide separate control of light sources 20 in the lighting device 12. The profile identifiers may be fixed or configurable.
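One way such coordinated destination states might be generated is sketched below. This is a hypothetical illustration, not the disclosed method: the grouping by position along a single axis and the sinusoidal brightness wave are assumptions chosen only to show per-profile destination states varying across a space over time.

```python
import math

def assign_profiles(device_positions: dict, num_profiles: int) -> dict:
    """Group lighting devices into profiles by their position along one
    axis, so that spatially adjacent devices share a profile identifier.
    device_positions maps a device id to an (x, y) coordinate."""
    xs = sorted(pos[0] for pos in device_positions.values())
    span = (xs[-1] - xs[0]) or 1.0  # avoid division by zero for one column
    return {
        dev: min(int((pos[0] - xs[0]) / span * num_profiles), num_profiles - 1)
        for dev, pos in device_positions.items()
    }

def destination_states(num_profiles: int, t: float, period: float = 3600.0) -> dict:
    """Per-profile destination brightness forming a slow wave that sweeps
    across the space: each profile is phase-offset by its index."""
    return {
        pid: 50.0 + 50.0 * math.sin(2 * math.pi * (t / period - pid / num_profiles))
        for pid in range(num_profiles)
    }
```

Issuing `destination_states(...)` at successive times as the destination states in the dynamic lighting instructions would make the brightness wave appear to travel across the grouped devices.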
As discussed above, the user application 18 may be a software application running on a computing device such as a smartphone, a tablet, a computer, or the like.
Those skilled in the art will recognize improvements and modifications to the preferred embodiments of the present disclosure. All such improvements and modifications are considered within the scope of the concepts disclosed herein and the claims that follow.
This application claims the benefit of provisional patent application Ser. No. 62/926,862, filed Oct. 28, 2019, the disclosure of which is hereby incorporated herein by reference in its entirety.
Number | Date | Country
---|---|---
20210127475 A1 | Apr 2021 | US