Method, apparatus, and system for occupancy sensing

Information

  • Patent Grant
  • Patent Number
    9,014,829
  • Date Filed
    Friday, November 4, 2011
  • Date Issued
    Tuesday, April 21, 2015
Abstract
Embodiments of the present invention include an occupancy sensing unit configured to monitor an environment illuminated by a lighting fixture. An inventive occupancy sensing unit may include an occupancy sensor to detect radiation indicative of at least one occupancy event in the environment illuminated by the lighting fixture according to sensing parameters. The occupancy sensor can be coupled to a memory that logs sensor data, which represent the occupancy events, provided by the occupancy sensor. A processor coupled to the memory performs an analysis of the sensor data logged in the memory and adjusts the sensing parameters of the occupancy sensor based on the analysis.
Description
BACKGROUND

In many situations, it is desirable (if not necessary) for lighting to be activated as soon as a person/object of interest enters a particular area of interest. This can be accomplished by using occupancy and/or motion sensors to monitor the area of interest. When a sensor detects occupancy and/or motion, e.g., based on radiation or a change in radiation emitted in the area of interest, it sends a signal to a lighting fixture that causes the lighting fixture to illuminate the area of interest. The lighting fixture illuminates the area for as long as the sensor detects an occupant. As soon as the sensor stops detecting the occupant, a timer in the lighting fixture begins counting down a predetermined timeout or delay period during which the light remains on. The lighting fixture turns off when the delay period ends (unless the occupancy sensor detects another occupant, in which case the timer stops counting down). Consider, for example, a sensor whose timeout period is 60 seconds: if a person enters the sensor's field-of-view at 11:27:03 and stays in the field-of-view until 11:31:18, the light remains on until 11:32:18 provided that nobody else enters the field-of-view. If the predetermined timeout or delay period is too long, then the light remains on unnecessarily, wasting energy and shortening its useful life. If the predetermined amount of time is too short, then the light turns off prematurely, which may be annoying and possibly dangerous as well.
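
For illustration only, the fixed-timeout rule described above can be sketched in a few lines of Python (the function name and values are illustrative, not part of the patent):

```python
from datetime import datetime, timedelta

def light_off_time(last_detection: datetime, timeout_s: int = 60) -> datetime:
    # Fixed-timeout rule: the light turns off timeout_s seconds after the
    # sensor last detected an occupant, unless it is re-triggered first.
    return last_detection + timedelta(seconds=timeout_s)

# Example from the text: the occupant is last seen at 11:31:18; with a
# 60-second timeout the light stays on until 11:32:18.
print(light_off_time(datetime(2011, 11, 4, 11, 31, 18)).time())  # 11:32:18
```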


Occupancy sensors sense radiation at different wavelengths, including infrared, ultrasonic, visible, and/or radio-frequency wavelengths, to detect the presence or absence of people in a space. Passive infrared (PIR) sensors sense the difference in heat emitted by humans in motion from that of the background space. These sensors detect motion within a field of view that generally requires a clear line of sight; they cannot “see” through obstacles and have limited sensitivity to minor (hand) movement at distances greater than about 15 feet. PIR sensors tend to be most sensitive to movement laterally across their respective fields of view, which can be adjusted when the sensor is installed.


PIR sensors generally are most suitable for smaller, enclosed spaces (wall switch sensors), spaces where the sensor has a view of the activity (ceiling- and wall-mounted sensors), and outdoor areas and warehouse aisles. Potentially incompatible application characteristics include low motion levels by occupants, obstacles blocking the sensor's view, mounting on sources of vibration, or mounting within six feet to eight feet of HVAC air diffusers.


Ultrasonic sensors use the Doppler principle to detect occupancy by emitting an ultrasonic high-frequency signal (e.g., 32-40 kHz) throughout a space, sensing the frequency of a signal reflected by a moving object, and interpreting a change in frequency as motion. The magnitude and sign of the change in frequency represent the speed and direction, respectively, of the object with respect to the sensor. Ultrasonic sensors do not require a direct line of sight and instead can “see” around corners and objects, although they may need a direct line of sight if fabric partition walls are prevalent. In addition, the effective range of a ceiling-mounted sensor declines in proportion to partition height. Ultrasonic sensors are more effective for low motion activity, with high sensitivity to minor (e.g., hand) movement, typically up to 25 feet. Ultrasonic sensors tend to be most sensitive to movement towards and away from the sensor. Ultrasonic sensors typically have larger coverage areas than PIR sensors.
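
As a hedged illustration of the Doppler principle (standard physics, not a formula taken from the patent), the frequency shift seen by an ultrasonic sensor scales with the target's radial speed:

```python
def doppler_shift_hz(f_emit_hz: float, v_radial_ms: float,
                     c_sound_ms: float = 343.0) -> float:
    # Approximate two-way Doppler shift for a reflector moving at
    # v_radial_ms (positive = toward the sensor).
    return 2.0 * f_emit_hz * v_radial_ms / c_sound_ms

# A person walking toward a 40 kHz sensor at 1 m/s shifts the return by
# roughly 233 Hz; the sign flips if they walk away.
print(round(doppler_shift_hz(40_000, 1.0)))  # ~233
```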


Ultrasonic sensors are most suitable for open spaces, spaces with obstacles, restrooms, and spaces with hard surfaces. Potentially incompatible application characteristics include high ceilings (greater than 14 feet), high levels of vibration or air flow (which can cause nuisance switching), and open spaces that require selective coverage (such as control of lighting in individual warehouse aisles).


Dual-technology sensors employ both PIR and ultrasonic technologies, activating the lights only when both technologies detect the presence of people, which virtually eliminates the possibility of false-on. Dual-technology sensors keep the lights on so long as they continue to detect the presence of people using at least one of the two sensing technologies, which significantly reduces the possibility of false-off. Appropriate applications include classrooms, conference rooms, and other spaces where a higher degree of detection may be desirable.


For effective occupancy sensing, generally required coverage area and required sensitivity are coordinated by a lighting designer/engineer. Generally the designer must determine range and coverage area for the sensor based on the desired level of sensitivity. Manufacturers of sensors publish range and coverage area for sensors in their product literature, which may be different for minor (e.g., hand) motion and major (e.g., full-body) motion. Various coverage sizes and shapes are available for each sensor type. In a small space, one sensor may easily provide sufficient coverage. In a large space, it may be desirable to partition the lighting load into zones, with each zone controlled by one sensor.


The lighting designer/engineer must also decide how long each light should remain on after the associated occupancy and/or motion sensor no longer detects motion. This timeout parameter is typically controlled in hardware, so the designer may have only a few discrete options, e.g., 30 seconds, one minute, two minutes, five minutes, etc., for a particular type of lighting fixture. The operating characteristics and requirements of the lighting fixtures often determine the minimum timeouts. For example, fluorescent and high-intensity discharge (HID) fixtures have relatively long warm-up times, so they may have minimum timeouts of about 10-15 minutes to minimize wear and tear that would otherwise reduce the fixture life.


The timeout parameter is typically controlled by setting a switch (e.g., dual in-line package (DIP) switches), a dial, or another interface on the lighting fixture itself. Once the lighting fixture is installed, it may become difficult to change the timeout settings (if they can be changed at all). For example, industrial lighting fixtures, such as the high-bay lighting fixtures that illuminate aisles in a warehouse, are often too high to be reached without a lift. Even if the fixture is relatively easy to reach, it may be impractical to change the timeout parameter because the people who own, maintain, and/or use the facility have no way to determine the appropriate or optimum timeout setting.


U.S. Patent Application Publication No. 2007/0273307 to Westrick et al. discloses an automated lighting system that performs adaptive scheduling based on overrides from users. More specifically, Westrick's system follows a predetermined schedule to switch a lighting fixture from an “ON” mode (in which the fixture turns on in response to a signal from an occupancy sensor) to an “OFF” mode (in which the fixture does not respond to signals from the occupancy sensor). Firmware adjusts the amount of time the system spends in “ON” mode based on how often users override the lighting controls by actuating an override switch, such as an on/off paddle switch. If the system detects a high number of overrides immediately after a period in “ON” mode, the system increases the amount of time that the system is “ON” (and decreases the amount of time that the system is “OFF”). Although Westrick's system adjusts how long a light is enabled to respond to occupancy signals, it does not change how long the light remains on in response to an occupancy signal. It also requires direct user intervention. Westrick's system does not log or record any occupancy sensor data, so it is incapable of detecting, analyzing, and responding to more complicated occupancy behavior, such as changes in occupancy patterns based on the hour of the day or the day of the week.


U.S. Pat. No. 8,035,320 to Sibert discloses an illumination control network formed of luminaires whose behaviors are governed by a set of parameters, which may be selected from templates or set by direct user intervention. Sibert's luminaire has an occupancy response behavior that depends in part on a high threshold, a low threshold, and a decaying average, or running average, that represents the average output level from an occupancy sensor over a recent time interval. When the luminaire receives a signal from the occupancy sensor, it updates the running average, then compares the updated running average to the high and low thresholds. If the updated running average is lower than the low threshold, the luminaire remains off (or turns off). If the updated running average is higher than the high threshold, the luminaire turns on (or remains on) for a predetermined timeout period. If the updated running average is between the high and low thresholds, the luminaire remains in its current state until it receives another signal from the occupancy sensor or, if the luminaire is already on, until the timeout period elapses. The luminaire does not adjust the length of the timeout period in response to an occupancy signal. Like Westrick's system, Sibert's luminaires do not log or record any occupancy sensor data, so they cannot detect, analyze, or respond to more complicated occupancy behavior, such as changes in occupancy patterns based on the hour of the day or the day of the week.


SUMMARY

One embodiment of the invention includes an occupancy sensing unit to monitor an environment illuminated by a lighting fixture and associated methods of sensing occupancy in an illuminated environment. An example occupancy sensing unit comprises an occupancy sensor, a memory operatively coupled to the occupancy sensor, and a processor operatively coupled to the memory. The sensor detects radiation indicative of an occupancy event in the environment illuminated by the lighting fixture according to sensing parameters, including but not limited to gain, threshold, offset, polling frequency, and duty cycle, and provides data representing the occupancy event. The memory logs sensor data, possibly at the direction of the processor, which performs an analysis of the sensor data logged in the memory and adjusts the sensing parameters of the occupancy sensor based on the analysis of the sensor data logged in the memory.


In a further embodiment, the occupancy sensor provides an analog signal representative of the occupancy event. An analog-to-digital converter operatively coupled to the occupancy sensor provides a digital representation of the analog signal at one of a plurality of digital levels. The different levels in the plurality of digital levels represent different types of occupancy events.


The occupancy sensor may also comprise two or more sensing elements to provide one or more signals indicative of a velocity and/or a trajectory associated with the occupancy event. These signals can be used to provide sensor data that represents the velocity associated with the occupancy event. The processor may determine the frequency with which a particular velocity and/or a particular trajectory appears in the sensor data and adjust the sensing parameters, sensor timeout, lighting fixture timeout, and/or lighting levels accordingly.


The processor may also perform other types of analysis, such as creating an n-dimensional array of the sensor data logged in the memory, wherein each dimension of the array corresponds to a parameter associated with the occupancy event. Suitable parameters include, but are not limited to: frequency, amplitude, duration, rate of change, duty cycle, time of day, day of the week, month of the year, ambient light level, and/or ambient temperature associated with the sensor data logged in the memory. The processor can partition the n-dimensional array into clusters corresponding to different types of occupancy events and adjust the sensing parameters, which include, but are not limited to, sensor timeout, gain, threshold, offset, and/or sensitivity, based on the partitioning. Alternatively, or in addition, the processor can determine a frequency distribution (e.g., a histogram) of the occupancy events detected by the occupancy sensor and, optionally, adjust the sensing parameters based on the frequency distribution.
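
One way the described partitioning might look in practice is sketched below: a minimal k-means over a two-dimensional slice of the array. The feature choice (amplitude, duration) and cluster count are illustrative assumptions, not the patent's prescribed method:

```python
import numpy as np

# Illustrative slice of the n-dimensional array: one row per logged
# event, columns = [amplitude, duration in seconds].
events = np.array([[0.20, 1.0], [0.25, 1.2], [0.22, 0.9],
                   [0.90, 4.0], [0.85, 3.5]])

def kmeans(x, k=2, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    centers = x[rng.choice(len(x), k, replace=False)]
    for _ in range(iters):
        labels = ((x[:, None] - centers) ** 2).sum(-1).argmin(1)
        centers = np.array([x[labels == j].mean(0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels, centers

labels, centers = kmeans(events)
# Each cluster can then be mapped to an event type (e.g., "person" vs.
# "vehicle") and used to retune thresholds per type.
print(labels, centers)
```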


The processor may also place the LED in an inactive state after a sensor delay has elapsed following an end of the at least one occupancy event (as shown, for example, by a change in state of an output from the occupancy sensor). In addition, the processor can vary the length of the sensor delay based on its analysis of the logged sensor data.


Another exemplary occupancy sensing unit can include a communications interface to provide sensor data and/or a signal indicative of the occupancy event to a controller of a lighting fixture, a lighting management system, and/or another occupancy sensing unit. Such an occupancy sensing unit may be combined with or coupled to a light-emitting diode (LED) lighting fixture that includes one or more LEDs to illuminate the environment and a controller, operatively coupled to the LEDs and to the occupancy sensing unit, to actuate the LEDs in response to a signal indicative of an occupancy event. The controller can set the LEDs to a first lighting level in response to a signal indicative of a first type of occupancy event, and to a second lighting level in response to a signal indicative of a second type of occupancy event. Alternatively, or in addition, the controller can change a light level of the LEDs after a first elapsed time in response to a signal indicative of a first type of occupancy event, and change the light level of the LEDs after a second elapsed time in response to a signal indicative of a second type of occupancy event.
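
A hedged sketch of the two-level response just described follows; the event-type names, levels, and hold times are assumptions chosen for illustration:

```python
# Hypothetical mapping from classified event type to controller response.
RESPONSES = {
    "person":  {"level_pct": 50,  "hold_s": 120},   # first type of event
    "vehicle": {"level_pct": 100, "hold_s": 300},   # second type of event
}

def on_occupancy_event(event_type, set_level):
    # Set the LEDs to the level associated with this event type and
    # return how long that level should be held before changing.
    response = RESPONSES.get(event_type)
    if response:
        set_level(response["level_pct"])
        return response["hold_s"]
    return 0

hold = on_occupancy_event("vehicle", lambda pct: print(f"LEDs -> {pct}%"))
```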


Yet another embodiment includes a lighting system to provide variable occupancy-based illumination of an environment. Such a lighting system comprises a plurality of lighting fixtures, each of which includes a light source to illuminate the environment, an occupancy sensor to respond to an occupancy event, a communications interface, a memory, and a controller. The occupancy sensor provides a first occupancy signal representing the occupancy event, which is logged to memory and transmitted to other lighting fixtures in the plurality of lighting fixtures via the communications interface. The communications interface also receives a second occupancy signal from another lighting fixture in the plurality of lighting fixtures, and the memory stores sensor data representing the second occupancy signal as well. The controller, which is operatively coupled to the light source, the communications interface, and the memory, places the light source in an inactive state after a delay period has elapsed following an end of the at least one occupancy event (as shown, for example, by a change in state of the first and/or second occupancy signals). The controller performs an analysis of the sensor data logged in the memory, adjusts the delay period based on the analysis of the sensor data logged in the memory, and may optionally control a light level of the light source based at least in part on the first and second occupancy signals. In some cases, at least two of the plurality of lighting fixtures are configured to provide respective signals indicative of a velocity and/or a trajectory associated with an occupancy event.


As referred to herein, an “occupancy event” is any type of detectable incursion by or presence of a person or object into a space monitored by an occupancy sensor. Occupancy events include, but are not limited to: entry of a person or vehicle into a space monitored by an occupancy sensor and the presence of a person or object in a space monitored by an occupancy sensor. Detectable signatures of occupancy events include, but are not limited to: thermal radiation (i.e., heat) emitted by persons or objects, images of persons or objects, radiation reflected by persons or objects, and Doppler shifts of radiation reflected by moving persons or moving objects.


As referred to herein, a “sensor timeout” or “sensor delay” is the time elapsed between the end of an occupancy event (i.e., when the occupancy sensor stops seeing activity) and the moment that the occupancy sensor's output changes state to indicate the end of the occupancy event. Similarly, a “lighting fixture timeout” or “lighting fixture delay” is the time between when the sensor output indicates the end of an occupancy event and the moment that the lighting fixture goes into an “inactive” state. In one example, the occupancy sensor has a sensor timeout (e.g., 30 seconds) that is fixed in hardware, and the processor implements a lighting fixture timeout that can be varied from about zero seconds to over four hours (e.g., 16,384 seconds) in increments of one second. The processor uses the variable lighting fixture timeout to provide an adjustable amount of time between the end of an occupancy event and the moment that the lighting fixture goes into an “inactive” state. In other examples, the sensor timeout and lighting fixture timeout may coincide, in which case they are referred to collectively as a “timeout” or “delay.”


The following U.S. published applications are hereby incorporated herein by reference: U.S. publication no. 2009-0267540-A1, published Oct. 29, 2009, filed Apr. 14, 2009, and entitled “Modular Lighting Systems”; U.S. publication no. 2010-0296285-A1, published Nov. 25, 2010, filed Jun. 17, 2010, and entitled “Fixture with Rotatable Light Modules”; U.S. publication no. 2010-0301773-A1, published Dec. 2, 2010, filed Jun. 24, 2010, and entitled “Fixture with Individual Light Module Dimming;” U.S. Provisional Application No. 61/510,173, filed on Jul. 21, 2011, and entitled “Lighting Fixture”; and U.S. Provisional Application No. 61/555,075, filed on Nov. 3, 2011, and entitled “Methods, Apparatus, and Systems for Intelligent Lighting.”


It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the inventive subject matter disclosed herein. In particular, all combinations of claimed subject matter appearing at the end of this disclosure are contemplated as being part of the inventive subject matter disclosed herein. It should also be appreciated that terminology explicitly employed herein that also may appear in any disclosure incorporated by reference should be accorded a meaning most consistent with the particular concepts disclosed herein.





BRIEF DESCRIPTION OF THE DRAWINGS

The skilled artisan will understand that the drawings primarily are for illustrative purposes and are not intended to limit the scope of the inventive subject matter described herein. The drawings are not necessarily to scale; in some instances, various aspects of the inventive subject matter disclosed herein may be shown exaggerated or enlarged in the drawings to facilitate an understanding of different features. In the drawings, like reference characters generally refer to like features (e.g., functionally similar and/or structurally similar elements).



FIG. 1 is a block diagram of a light fixture with an occupancy sensing unit, according to embodiments of the present invention.



FIGS. 2A and 2B are, respectively, elevation and plan views of an occupancy sensing unit with an adjustable field of view (radiation pattern), according to embodiments of the present invention.



FIG. 3 is a plot of a digital signal generated by the occupancy sensing unit of FIGS. 2A and 2B, according to an embodiment of the present invention.



FIG. 4 is a plot that illustrates the occupancy state and lit state of a notional illuminated environment.



FIG. 5 is a histogram of the number of occupancy events for the notional illuminated environment of FIG. 4 over a single day.



FIGS. 6A and 6B are histograms that illustrate occupancy profiles for a given illuminated environment on weekdays (FIG. 6A) and weekends (FIG. 6B).



FIG. 6C is a plot of the number of occupancy events versus the time between occupancy events for two different occupancy patterns.



FIG. 6D is a plot of energy consumed by a lighting fixture versus sensor delay for two different occupancy profiles.



FIG. 7 is a flow diagram that illustrates how an occupancy sensing unit adjusts gain, offset, and/or threshold parameters in real-time by analyzing logged sensor data, according to embodiments of the present invention.



FIG. 8 illustrates a notional two-dimensional parameter map, according to one embodiment of the present invention.



FIGS. 9A and 9B are flow diagrams that illustrate how an occupancy sensing unit adjusts timeout parameters in real-time by analyzing logged sensor data, according to embodiments of the present invention.



FIG. 10A is a block diagram of a lighting system that employs multiple lighting fixtures and/or occupancy sensing units to provide variable occupancy-based lighting, according to embodiments of the present invention.



FIG. 10B is a flow diagram that illustrates operation of the lighting system of FIG. 10A, according to embodiments of the present invention.





DETAILED DESCRIPTION

Following below are more detailed descriptions of various concepts related to, and embodiments of, inventive systems, methods, and apparatus for occupancy sensing. Inventive aspects include tailoring an occupancy sensor system to provide increased performance for industrial facilities, warehouses, cold storage facilities, etc. The inventive occupancy sensor methods, apparatus, and systems described herein also facilitate accurately sensing occupancy as well as harvesting occupancy data, e.g., for use in various lighting and energy conservation purposes. Inventive occupancy sensing units may report the harvested data back to an integral processor and/or external management system that use the harvested data to change lighting fixture behaviors, such as light levels and timeout parameters, so as to reduce energy consumption and increase safety based on actual occupancy patterns. It should be appreciated that various concepts introduced above and discussed in greater detail below may be implemented in any of numerous ways, as the disclosed concepts are not limited to any particular manner of implementation. Examples of specific implementations and applications are provided primarily for illustrative purposes.


Inventive aspects of the occupancy sensing units include, but are not limited to: tunable occupancy sensing, self-learning occupancy sensing, cooperative occupancy sensing, and dual-function sensing that facilitates mapping and other functionality. Tunable occupancy sensing units may employ software-based tuning of the occupancy sensor gain and cutoff characteristics for improving the precision of occupancy event detection and classification. In some cases, a sensor may be tuned to enhance detection of and discrimination among multiple object types (e.g., a person on foot, a moving forklift, etc.). Tuning can be used in conjunction with self-learning to set timeouts, active light levels, and inactive light levels based on patterns in past occupancy data. Past occupancy data can also be used to determine signatures associated with particular types of occupant activities. Some occupancy sensing units may even include cameras, radio-frequency antennas (e.g., Bluetooth sniffers), and other sensors to capture additional historical data for analysis. Inventive occupancy sensing units may also share both real-time and historical occupancy sensing data with each other to increase detection reliability, to identify malfunctioning sensors, and to provide more flexible lighting responses.



FIG. 1 shows a lighting fixture 100 that can be used to illuminate an environment in response to occupancy events that occur within or in the vicinity of the illuminated environment. The lighting fixture 100 includes an occupancy sensor 110 that is operably coupled to a memory 120 (shown in FIG. 1 as an electrically erasable programmable read-only memory (EEPROM)) via a filter 134, an amplifier 136, a multi-bit analog-to-digital converter (ADC) 132, and a processor 130. Together, the occupancy sensor 110, memory 120, and processor 130 form an occupancy sensing unit 102 that detects occupancy events, stores data representing the occupancy events, analyzes the stored data, and/or controls the lighting fixture 100 based on the occupancy events and/or the analysis of the stored data.


More specifically, upon detection of an occupancy event, the processor 130 may send a signal to one or more light-emitting diode (LED) drivers 140, which respond to the signal by changing the amount of light emitted by one or more LED light bars 142. The processor 130 may continue transmitting the signal to the LED drivers 140 for as long as the occupancy sensor 110 detects occupancy, or it may send a second signal to the LED drivers 140 as soon as the occupancy sensor 110 stops detecting occupancy (i.e., when the occupancy event ends). At this point, the lighting fixture 100 enters a delay or timeout period during which the LED light bars 142 remain in the active state (or possibly transition to a state of intermediate activity, e.g., 50% illumination). Once the delay period has elapsed, as indicated by the change in state of a signal from the processor 130 and/or the LED driver 140, the LED light bars 142 enter an inactive state (e.g., they turn off or emit light at a very low level). As described below, the processor 130 may adjust the delay period and/or the light levels based on its analysis of logged sensor data.
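
The active/delay/inactive behavior just described can be summarized as a small state machine. The following is a sketch under assumed names, not the fixture's actual firmware:

```python
ACTIVE, DELAY, INACTIVE = "active", "delay", "inactive"

def step(state, occupied, delay_started, delay_s, now):
    # One update of the timeout logic: stay active while occupied, start
    # counting the delay when occupancy ends, go inactive when it elapses.
    if occupied:
        return ACTIVE, None
    if state == ACTIVE:
        return DELAY, now                    # occupancy event just ended
    if state == DELAY and now - delay_started >= delay_s:
        return INACTIVE, None
    return state, delay_started

state, t0 = ACTIVE, None
state, t0 = step(state, False, t0, delay_s=30, now=100)   # -> delay
state, t0 = step(state, False, t0, delay_s=30, now=131)   # -> inactive
print(state)
```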


The lighting fixture 100 also includes a temperature sensor 180, which can optionally be integrated into the occupancy sensing unit 102, along with other sensors, including but not limited to ambient light sensors (e.g., photocells), sensors for tracking radio-frequency identification (RFID) tags, cameras, and even other types of occupancy sensors. These additional sensors (not shown) may be coupled to the processor 130 via one or more digital input/output ports 164 and/or one or more analog input ports 166.


A communications interface 160 coupled to the processor 130 may, optionally, be incorporated into the occupancy sensing unit 102 if desired. The communications interface 160, which is coupled to an antenna 162, provides the occupancy sensing unit 102 with access to a wireless communications network, such as a local area network or the Internet. The occupancy sensing unit 102 may transmit raw or processed occupancy data to a database, other lighting fixtures, or other occupancy sensing units via the communications interface 160. It may also receive occupancy data, firmware or software updates, predicted environmental data (e.g., temperature and ambient light level data), commissioning information, or any other suitable information from other sources, e.g., other lighting fixtures, occupancy sensing units, or external controllers.


The lighting fixture 100 also includes a real-time clock 170 that can also, optionally, be incorporated into the occupancy sensing unit 102 if desired. The real-time clock 170 provides time-stamp information on an as-needed or periodic basis to the memory 120 and the processor 130, which may store or tag the occupancy data with time stamps to indicate when the data was collected. The real-time clock 170 may also be used to time or coordinate the sensor/lighting fixture delay period and to synchronize the occupancy sensing unit 102 to other devices, systems, or communications networks.


A hardware power meter 150 coupled to the processor 130 meters alternating-current (AC) power (e.g., 120 VAC at 60 Hz) from an AC power input 156. The hardware power meter 150 provides the processor 130 with metering data representing the amount and rates of power consumption as a function of time. A low-voltage power supply 152 coupled to the power meter 150 transforms the AC power into low-voltage (e.g., 5 V) direct-current (DC) power suitable for running the processor 130 and/or other low-voltage electrical components in the lighting fixture. A high-voltage power supply 154 coupled to the power meter 150 transforms the AC power into high-voltage DC power suitable for running the LED driver 140 and the LED light bars 142. The low-voltage power supply 152 and/or the high-voltage power supply 154 may filter and/or otherwise condition the AC power as desired.


Alternatively, the lighting fixture 100 (and occupancy sensing unit 102) may draw power from an external DC power supply, such as a rechargeable battery. Such an embodiment may include one or more DC-DC power converters coupled to a DC power input and configured to step up or step down the DC power as desired or necessary for proper operation of the electronic components in the lighting fixture 100 (and occupancy sensing unit 102). For instance, the DC-DC power converter(s) may supply DC voltages suitable for logic operations (e.g., 5 VDC) and for powering electronic components (e.g., 12 VDC).


Occupancy Sensors and Sensor Configurations


While the configuration of the facilities in which the occupancy sensor system may be used can be quite varied, there are certain attributes of the functionality of occupancy sensing in warehouses and distribution centers that are based on mounting heights, positions, and angles. Therefore, an occupancy sensor as described herein may work for a variety of installation locations in a warehouse or distribution center including without limitation: racked aisles, ends of aisles, cross-aisles, and open spaces. The occupancy sensor design overcomes limitations found in existing designs, which typically provide either 360-degree coverage for open areas or a long lobe of sensitivity for aisle applications.


To provide 360-degree monitoring and/or enhanced monitoring in certain directions, an occupancy sensor design may include multiple sensors and/or multiple sensing elements, which may be configured in various ways. One example is to align and overlap two or more sensing elements along one axis (e.g., for use in aisles). Another example is to position two or more sensing elements to provide angled fields of view, e.g., fields of view whose optical axes are offset from each other and/or oriented with respect to each other at an angle of about 30 degrees, 45 degrees, 60 degrees, 90 degrees, or any other desired or suitable angle. Various combinations of angled and offset sensing regions, when combined with processing and optimization capabilities provided by an inventive occupancy sensing unit, may provide a desired degree of sensitivity and configurability. An exemplary occupancy sensor design may fulfill the needs of multiple applications with a single embodiment by supporting occupancy sensing within two or more long lobes (e.g., for aisles in a warehouse) and in a 360-degree zone for open environments or where the sensor is approached from multiple directions. Networked control of lights may benefit from the improved sensing resolution of the inventive occupancy sensor to further facilitate operation based on “local control” that facilitates control of multiple lights or lighting fixtures (e.g., in a predetermined zone) by a single occupancy sensing unit (e.g., on a single lighting fixture or disposed remotely).


The lighting fixture 100 or occupancy sensing unit 102 may also include an accelerometer (not shown) coupled to the processor 130 to provide a signal representative of swaying, vibration, or other movement of the occupancy sensor 110. Because the occupancy sensor 110 detects relative motion, swaying or other movement of the occupancy sensor 110 may result in “false positive” detections. The processor 130 may use the signal from the accelerometer to determine the velocity of the occupancy sensor 110 and to compensate for the occupancy sensor's motion when determining and classifying signals from the occupancy sensor 110. If the processor 130 detects that the occupancy sensor's velocity varies periodically, for example, the processor 130 may determine that the occupancy sensor 110 is swaying and subtract the sensor's velocity from the detected velocity of the moving objects in the sensor's field of view. (Alternatively, or in addition, the occupancy sensor mounting may be made more rigid to reduce or prevent swaying.)
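
A rough sketch of the compensation idea (the patent gives no formula; variable names and values are assumed): subtract the sensor's own estimated velocity from each detected velocity before classification.

```python
import numpy as np

def compensate(detected_velocity, sensor_velocity):
    # Subtract the sensor's own (accelerometer-derived) velocity from a
    # detected object velocity before classification.
    return np.asarray(detected_velocity) - np.asarray(sensor_velocity)

# A sensor swaying at 0.1 m/s along x makes a target measured at 0.7 m/s
# really ~0.6 m/s relative to the room.
print(compensate([0.7, 0.0], [0.1, 0.0]))
```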


Suitable occupancy sensors may provide adjustable sensing areas with one or more sensing elements, including but not limited to passive infrared (PIR) sensing elements, a visible or infrared camera, ultrasonic sensing elements, radio-frequency antennas (e.g., for radar), or combinations thereof (e.g., as in hybrid PIR/ultrasonic devices). The occupancy sensor 110 shown in FIGS. 2A and 2B includes three PIR sensing elements (shown in FIGS. 2A and 2B as sensing elements 112a, 112b, and 112c; collectively, sensing elements 112) arrayed at the respective focal points of respective Fresnel lenses (shown in FIGS. 2A and 2B as lenses 114a, 114b, 114c, and 114d; collectively, lenses 114). Each lens 114 focuses infrared radiation in a particular field of view (shown in FIGS. 2A and 2B as fields of view 116a, 116b, and 116c; collectively, fields of view 116) onto one or more corresponding sensing elements 112.


The sensing elements 112 and lenses 114 can be selected and/or adjusted to ensure that the occupancy sensor's aggregate field of view (i.e., the combination of individual fields of view 116) encompasses certain portions of the illuminated environment. In some cases, the fields of view 116 may be arranged such that the occupancy sensor 100 detects a moving object, such as a person or vehicle (e.g., a forklift), before the moving object enters the illuminated environment. (In these cases, one or more of the fields of view 116 may extend beyond the area illuminated by the lighting fixture 100.) The processor 130 estimates the moving object's velocity and predicts the moving object's trajectory from the occupancy sensor data; if the processor 130 determines that the moving object is going to enter the illuminated area, it turns on the lighting fixture 100 soon enough to provide sufficient illumination for safety purposes. For example, the processor 130 may estimate that the object is a forklift moving at about 25 mph based on the amplitude and variation(s) in occupancy sensor data and turn on the lights about 40-50 seconds before the forklift enters the illuminated area to ensure that the forklift operator can see a distance equal to or greater than the stopping distance of the forklift. If the processor 130 estimates that the object is a person walking at about 5 mph, it may turn the lights on only about 20-30 seconds before the person enters the illuminated area. The processor 130 may also determine how long the lighting fixture 100 remains on based on the object's estimated velocity, e.g., it may reduce the sensor delay for objects moving at higher speeds and increase the sensor delay for objects moving at lower speeds.
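
A back-of-the-envelope sketch of the predictive turn-on described above; the formula and the sight distances are assumptions chosen only to land in the text's example lead-time ranges:

```python
def turn_on_lead_s(speed_ms, sight_distance_m, ramp_s=2.0):
    # Turn the lights on early enough that the object's operator can see
    # at least sight_distance_m (e.g., the stopping distance) on arrival.
    return sight_distance_m / speed_ms + ramp_s

# Illustrative values: a forklift at ~25 mph (~11 m/s) and a pedestrian
# at ~5 mph (~2.2 m/s).
print(round(turn_on_lead_s(11.2, 450.0)))  # ~42 s, cf. 40-50 s in the text
print(round(turn_on_lead_s(2.2, 50.0)))    # ~25 s, cf. 20-30 s
```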


In other cases, the sensing elements 112 and lenses 114 may also be arranged to ensure that other portions of the illuminated environment or vicinity do not fall within the aggregate field of view. For instance, fields of view 116 may be arranged during or after installation to prevent a person or vehicle at edge of the illuminated environment or outside the illuminated environment from triggering the occupancy sensor 110 prematurely or inadvertently. Similarly, predictable, consistent occupancy sensing may facilitate reporting of energy usage to a utility provider (e.g., for measurement and verification, or demand response actions, and the like), such as in the case of remotely mounted sensors, or sensors controlling more than one fixture (e.g., through a network). The fields of view 116 may also be adjusted (on a regular basis, if desired) based on traffic patterns, occupancy patterns, energy consumption, and other factors derived from analysis of the occupancy sensor data logged by the occupancy sensing unit 102.


Referring again to FIGS. 2A and 2B, the illustrated occupancy sensor 110 includes two Fresnel lenses 114a and 114b that collect radiation falling within longitudinally oriented fields of view 116a and 116b, respectively, and focus the collected radiation onto sensing elements 112a and 112b, respectively. The illustrated occupancy sensor 110 also includes two Fresnel lenses 114c and 114d that collect infrared radiation falling within transversely oriented fields of view 116c and 116d, respectively, and focus it onto a single, centrally positioned sensing element 112c. The fields of view 116, each of which has a roughly conical shape, may be arranged to detect occupancy events along an aisle in a warehouse: the longitudinally oriented fields of view 116a and 116b cover the aisle itself, and the transversely oriented fields of view 116c and 116d cover an intersection in the middle of the aisle. Together, the fields of view 116 enable monitoring occupancy over about 360 degrees in an area close to the sensor (i.e., at the intersection) and along the length of the aisle itself.


The occupancy sensor 110 can be mounted a height of about seven meters to about fourteen meters (e.g., eight meters, ten meters, twelve meters, or any other suitable height) to provide varying amounts of floor coverage. At a mounting height of fourteen meters, for example, the occupancy sensor 110 may have a detection radius of about nine meters; reducing the mounting height to about ten meters reduces the floor detection radius to about seven meters, and at a mounting height of about seven meters, the floor detection radius may be about five meters. Alternatively, the occupancy sensor 110 may have lenses 114 selected and mounted such that the floor detection radius varies more or less gradually with mounting height.


The occupancy sensing unit 102 may be an integral part of a lighting fixture, as shown in FIG. 1, or it can be a modular unit suitable for use with new or existing lighting fixtures, such as LED fixtures, fluorescent fixtures, and/or HID fixtures. In some examples, the occupancy sensing unit may be a kit that can be built and coupled to an existing lighting fixture or installed as a stand-alone module in the vicinity of a new or existing lighting fixture. For instance, exemplary occupancy sensing units may be retrofit to mid-bay and/or high-bay lighting fixtures in a warehouse, cold-storage facility, or other industrial space. When installed properly, occupancy sensing units may be used to reduce energy consumption by the lighting fixtures, optimize facility layout, and/or enhance safety.


The occupancy sensing unit 102 may be configured to detect (and identify) objects moving at speeds of anywhere from walking speed (about 0.6 m/s) to the driving speed of a forklift or similar vehicle (about 10 mph, or roughly 4.5 m/s). The occupancy sensor(s) 110 in the occupancy sensing unit 102 may be rotatable, e.g., through at least about 90 degrees and up to about 180 degrees, either by hand, via a remote-controlled actuator, or both. The occupancy sensing unit 102 may have an operating temperature of about −40° C. to about +40° C. (or even +50° C.) and a storage temperature of about −40° C. to about +60° C. It may also operate in conditions of about 20% to about 90% humidity.


Processing and Storing Occupancy Sensor Data


As is well understood by those of skill in the art, each sensing element 112 in the occupancy sensor 110 produces an analog signal 201, such as a photocurrent, whose magnitude is directly proportional to the strength of detected radiation. Depending on the sensor design, the analog signals 201 from the sensing elements 112 are either processed separately or multiplexed together to form a single analog output. Alternatively, the signals may be multiplexed together after they have been digitized.


In the example shown in FIG. 2A, the analog signals 201 are transmitted through a filter 134, such as a bandpass or lowpass filter, coupled to the output of occupancy sensor 110 to produce filtered analog signals 203. As understood by those of skill in the art, the filter 134 removes noise and other undesired signals in bands outside the passband or beyond the cutoff frequency. In some embodiments, the processor 130 may tune the passband width, center frequency, or cutoff frequency of the filter 134 based on an analysis of logged occupancy sensor data. An amplifier 136 coupled to the output of the filter 134 amplifies the filtered analog signals 203 by a gain, which can be varied by the processor 130 based on an analysis of logged occupancy sensor data, to produce an amplified signal 205. A multi-bit ADC 132 coupled to the output of the amplifier 136 converts the amplified analog signal 205 into one or more multi-bit digital signals 300 (e.g., 16-bit, 32-bit, or 64-bit digital signals) whose amplitudes represent the strength of the detected radiation. The processor 130 may control the offset, sample period, and bit levels of the ADC 132 based on analysis of logged occupancy sensor data. Those of skill in the art will readily appreciate that alternative occupancy sensing units may include other components or arrangements of components to generate one or more digital signals representative of detected occupancy events.
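
The conditioning chain (filter, then gain, then quantization) might be modeled as below; this is a toy sketch with assumed parameter values, not the unit's actual signal processing:

```python
import numpy as np

def condition(raw, gain=4.0, offset=0.0, n_bits=16, v_ref=5.0, kernel=5):
    # Toy version of the chain in FIG. 2A: moving-average lowpass filter
    # (filter 134), amplification (amplifier 136), then multi-bit
    # quantization (ADC 132).
    filtered = np.convolve(raw, np.ones(kernel) / kernel, mode="same")
    amplified = gain * filtered + offset
    full_scale = 2 ** n_bits - 1
    codes = np.round(amplified / v_ref * full_scale)
    return np.clip(codes, 0, full_scale).astype(int)

print(condition(np.array([0.0, 0.1, 0.6, 0.7, 0.2, 0.0])))
```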



FIG. 3 is a plot of an exemplary digital signal 300 from an occupancy sensing unit 102 that illustrates how the processor 130 uses amplitude and duration thresholds to classify signals from the occupancy sensor 110. As explained in greater detail below, the thresholds (and associated responses) may be adjusted based on analysis of logged sensor data. In some embodiments, the processor 130 compares the amplitude(s) of the digital signal(s) 300 (e.g., with one or more comparators) to one or more thresholds (e.g., represented by reference levels) that represent different types and/or different numbers of occupants. For instance, the processor 130 may ignore signals whose amplitudes are below a low threshold 302 representing a noise floor. The processor 130 may determine that a signal 300 whose amplitude falls between the low threshold 302 and an intermediate threshold 304 represents a person who has just entered the sensor's field of view 116 and turn on one or more of the LED light bars 142 in the lighting fixture 100. If the processor 130 determines that the signal amplitude exceeds a high threshold 306, the processor 130 may determine that a vehicle has entered or is about to enter the illuminated area and turn on all of the LED light bars 142 in the fixture. Although FIG. 3 depicts only low, intermediate, and high thresholds, those of skill in the art will readily appreciate that the processor 130 may compare the digital signal 300 to more or fewer thresholds as desired.


The processor 130 can also measure how long the amplitude of the digital signal 300 exceeds any one of the thresholds and use this measurement as a classification criterion. For instance, if the digital signal 300 exceeds a given threshold only briefly (i.e., for less than a minimum duration 310), the processor 130 may discard the data point as spurious. The processor 130 may also compute the average signal amplitude over a given window and/or the rate of change in signal strength (i.e., the derivative of the signal amplitude with respect to time); if the signal amplitude changes too quickly or too slowly to represent an occupancy event, then the processor 130 may discard or ignore the data.
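
A minimal sketch of the amplitude-plus-duration test described above (threshold values and type names are assumptions, in the spirit of FIG. 3):

```python
def classify(samples, low=0.2, mid=0.3, high=0.8, min_samples=3):
    # Ignore sub-noise-floor signals, require a minimum duration, then
    # bin by peak amplitude as in FIG. 3.
    above = [s for s in samples if s > low]
    if len(above) < min_samples:
        return "spurious"          # too brief: discard as noise
    peak = max(samples)
    if peak > high:
        return "vehicle"
    if peak > mid:
        return "person"
    return "unoccupied"

print(classify([0.1, 0.4, 0.5, 0.4]))  # person
```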


The processor 130 may also learn and identify patterns in the digital signals 300 that represent particular types of occupancy events. For example, in cases where each sensing element 112 provides a separate digital signal 300, the digital signals 300 from each sensing element may successively increase, then decrease, as a moving object passes through the fields of view 116. The processor 130 determines the object's direction of movement from the order in which the digital signals 300 change; it determines the object's speed from how quickly the digital signals 300 change, either by taking the derivative of each signal individually, by estimating the object's change in position over time from the peaks in the different signals, or both. The processor 130 uses its estimate of object velocity to turn on lights in the object's predicted path and to turn off lights shortly after the object's predicted departure from the illuminated area (rather than simply turning off the lights after a fixed timeout period).
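
For illustration, direction and speed can be estimated from the times at which each element's signal peaks; a sketch assuming collinear, equally spaced fields of view (not the patent's exact algorithm):

```python
def direction_and_speed(peak_times_s, footprint_spacing_m):
    # The order of per-element peaks gives direction of travel; the
    # spacing of the elements' floor footprints over the elapsed time
    # gives speed.
    dt = peak_times_s[-1] - peak_times_s[0]
    direction = "forward" if dt > 0 else "reverse"
    speed = footprint_spacing_m * (len(peak_times_s) - 1) / abs(dt)
    return direction, speed

# Peaks 2 s apart across footprints 4 m apart suggest ~2 m/s, forward.
print(direction_and_speed([10.0, 12.0, 14.0], 4.0))
```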


The processor 130 may also set or vary the light levels for different types of occupancy events. For instance, the processor 130 may turn on all the lights to 100% illumination when it detects a moving vehicle. It may also turn on these lights gradually, especially at night, to avoid blinding the vehicle's driver. In other examples, the processor 130 may turn on lights to relatively low levels (e.g., 30%) at night to preserve a person's night vision.


The processor 130 also logs representations of the digital signals 300 in the memory 120. These representations, or historical occupancy sensor data, may be stored in a raw format, as processed data (e.g., with time stamps from the real-time clock 170 or other timing device), or both. The processor 130 may also log representations of its responses to occupancy signals 300 (e.g., data representing commands such as “turn on light bars 1 and 2 at 50% of maximum amplitude for five minutes”) as well as data about the occupancy sensing unit 102 and lighting fixture 100 including, but not limited to: gain, offset, and threshold values of the occupancy sensor 110; the age, operating status, and power consumption rates of the system components and the system itself; etc. The memory 120 may store data from other sensors, including, but not limited to, data concerning temperature, time (including hour, day, and month), ambient light levels, humidity, etc.
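
The shape of one logged entry might look as follows; a sketch whose field names are assumed, not specified by the patent:

```python
from dataclasses import dataclass, asdict

@dataclass
class OccupancyRecord:
    # Illustrative shape of one logged entry; field names are assumed.
    timestamp: str         # from the real-time clock 170
    raw_code: int          # digitized sensor output
    classification: str    # e.g., "person", "vehicle", "spurious"
    response: str          # e.g., "light bars 1-2 at 50% for 5 min"
    ambient_temp_c: float  # from the temperature sensor 180

rec = OccupancyRecord("2011-11-04T11:27:03", 41230, "person",
                      "light bars 1-2 at 50% for 5 min", 21.5)
print(asdict(rec))
```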


Analyzing Logged Occupancy Sensor Data


As stated above, the memory 120 in the occupancy sensing unit 102 may store a variety of data, including the two types of raw data shown in FIG. 4: data representing the occupancy state (i.e., “occupied” or “unoccupied”) of an illuminated environment as well as the status of the lighting fixture (i.e., “lit” or “unlit”) over a single day. In this example, both the occupancy state and the environment status are binary, i.e., either on or off. In other examples, the occupancy sensor data may be in a “raw” format, e.g., as shown in FIG. 3. Alternatively, the occupancy sensor data may be processed to indicate a particular type of occupancy event. Similarly, the lighting fixture data may indicate the number of lights that are on and/or the dimming level of each light.


Even the simple, binary case illustrated in FIG. 4 shows that the light is on when the illuminated environment is unoccupied. Suppose that the illuminated environment is a break room with the lighting fixture 100 of FIG. 1 in an office or warehouse that is open during the day and patrolled by security guards at night. Workers come into the break room in the morning, starting at 6 am, to get coffee or relax briefly before starting to work. The break room is occupied nearly continuously between about noon and 6 pm as the workers take lunch breaks and coffee breaks. A security guard may step in briefly at midnight. An occupancy sensor 110 detects an occupancy event every time a person enters the break room, as indicated by the transition from an “unoccupied” state to an “occupied” state in the lower curve in FIG. 4. Upon sensing a transition from unoccupied to occupied status, the occupancy sensing unit 102 turns on the light bars 142, as indicated by the transition from an “unlit” state to a “lit” state in the upper curve in FIG. 4, which remain on for a predetermined timeout period.



FIG. 4 shows that the break room remains lit and occupied nearly continuously between about noon and 6 pm. FIG. 4 also shows that the timeout period for the light bars 142 is longer than it needs to be for occupancy events between 6 pm and 6 am: the fixture remains lit for many minutes after the room becomes unoccupied. Similarly, the timeout period for the light bars 142 is longer than necessary between about 6 am and noon. As a result, the lighting fixture 100 consumes more energy than necessary between 6 pm and noon. This extra “on” time also causes the light sources (e.g., LEDs) in the light bars 142 to age more quickly.


Placing the raw data plotted in FIG. 4 in TABLES 1-3 (below) shows that the light bars 142 in the break room are on for 2.8 hours longer than necessary. As a result, the light bars 142 consume about 34% more energy than if they were on only when the room was occupied. Reducing the timeout period would reduce the excess “on” time and the amount of extra energy consumed by the lighting fixture.


The raw data also show that the status of the illuminated environment is never “off and occupied,” which indicates that the occupancy sensing unit 102 is not experiencing “false negatives,” i.e., the occupancy sensing unit 102 has detected every occupancy event that occurred in the twenty-four-hour period under examination. If the status of the illuminated space is ever “off and occupied,” indicating that the occupancy sensing unit 102 had failed to detect or respond to an occupancy event (or had been overridden), then the processor 130 may adjust the occupancy sensor settings to lower detection thresholds (e.g., decrease threshold 302 in FIG. 3) and/or change responses to detected occupancy events (e.g., change the dimming level). Similarly, the processor 130 may increase detection thresholds if it determines that there are too many “false positives,” i.e., the occupancy sensing unit 102 transmits a signal representative of an occupancy event when no occupancy event has taken place.









TABLE 1

Lighting Data

Lighting Metric               Time (Hours)    Time (Percentage)
Total On Time                 11.0            45.8
Average On Period             3.7             15.3
Shortest On Period            1.0             4.2
Longest On Period             8.0             33.3
Total On/Off Cycles           3
Average On/Off Cycles/Day     3.1


TABLE 2

Occupancy Data

Occupancy Metric              Time (Hours)    Time (Percentage)
Total Occupancy Time          8.2             34.1
Average Occupancy Period      1.2             5.0
Shortest Occupancy Period     0.1             0.4
Longest Occupancy Period      2.3             9.6
Total Occupancy Cycles        7
Average Occupancy Cycles/Day  7.6


TABLE 3

Illuminated Environment Status

Environment Status            Time (Hours)    Time (Percentage)
On and Occupied               8.2             34.1
On and Vacant                 2.8             11.7
Off and Occupied              0.0             0.0
Off and Vacant                15.8            65.8











FIG. 4 and TABLES 1-3 represent a coarse level of analysis of only a single day's worth of data. Collecting data for longer periods (e.g., weeks, months, or years) enables more sophisticated analysis and more sophisticated control of inventive lighting fixtures. For instance, the plot in FIG. 4 suggests that the occupancy pattern in the illuminated space changes over the course of the day. Although useful, FIG. 4 and TABLES 1-3 do not present a complete picture of the occupancy patterns associated with the illuminated space.


Analyzing an extended data set as a function of time and/or frequency yields a more complete picture of the occupancy patterns associated with a particular illuminated environment. For instance, FIG. 5 shows a histogram of the total number of occupancy events at a particular time of day for an extended period of time (e.g., four weeks). The histogram suggests that occupancy events occur most frequently between noon and 2 pm (lunch time) with substantial occupancy activity extending from about 6 am until about 6 pm (a twelve-hour work day). Occupancy events occur sporadically between about 6 pm and about 6 am, with slight peaks at about 8 pm and midnight.


The processor 130 may use the occupancy pattern(s) revealed by a frequency distribution of occupancy events, such as the histogram shown in FIG. 5, to determine a timeout period that varies with the time of day. In this example, the processor 130 may set different timeout periods for the light bars 142 for different times of day: a relatively short timeout period between 6 pm and 6 am, when the illuminated environment is only briefly occupied, and a longer timeout period between 6 am and 6 pm, when the illuminated environment is occupied for longer periods of time. In addition, the processor 130 may adjust the lighting levels for the active and inactive states, and even set a third lighting level for an intermediate state corresponding to the sensor delay period.
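
One way such a time-of-day schedule might be derived is sketched below: bin logged event times by hour and assign a longer timeout to busy hours. The timeout values and busy threshold are illustrative assumptions:

```python
import numpy as np

def timeout_schedule(event_hours, busy_timeout_s=600, quiet_timeout_s=60,
                     busy_threshold=10):
    # Bin logged events by hour of day (as in FIG. 5) and assign a longer
    # timeout to busy hours, a shorter one to quiet hours.
    counts, _ = np.histogram(event_hours, bins=24, range=(0, 24))
    return [busy_timeout_s if c >= busy_threshold else quiet_timeout_s
            for c in counts]

# Events clustered around midday get the long timeout; a sparsely
# occupied early-morning hour gets the short one.
hours = [12.1, 12.5, 13.0, 13.2] * 5 + [23.5, 2.0]
schedule = timeout_schedule(hours)
print(schedule[12], schedule[2])  # 600 60
```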



FIGS. 6A-6D show how occupancy and lighting patterns for an illuminated space change on time scales of one week or more. FIGS. 6A and 6B are histograms of the number of occupancy events versus time of day for a single illuminated environment during weekdays (FIG. 6A) and weekends (FIG. 6B). The histograms indicate that the frequency and number of occupancy events are highest during working hours (i.e., 8 am to 5 pm) on both weekdays and weekends, but the total number of occupancy events is dramatically lower during weekends than during weekdays. In addition, FIG. 6B shows that the occupancy sensor 110 detects hardly any occupancy events between midnight and 6 am on the weekends.


The processor 130 in the occupancy sensing unit 102 may identify and use patterns shown in the histograms of FIGS. 6A and 6B to define two different occupancy profiles for the illuminated environment: a first profile that applies on weekdays and a second profile that applies on weekends. Each profile may be linked to or include an expected occupancy pattern, a specific set of sensor parameters (gain, threshold, offset, timeout), and a specific set of responses to particular types of occupancy events. The processor 130 may adjust parameters and/or responses associated with each profile so as to reduce or minimize energy consumption of the lighting fixture 100 for the associated occupancy pattern. In this case, for example, FIG. 6C shows that occupancy events tend to occur more frequently on weekdays (the first profile) than on weekends (the second profile).



FIG. 6D, which is a plot of consumed energy versus sensor delay for the first and second occupancy profiles (as well as for having the lights on all the time), shows that the difference in occupancy event frequencies for the first and second occupancy profiles has a profound effect on the amount of energy consumed to illuminate the same space. In this case, the relatively frequent occurrence of occupancy events for the first occupancy profile causes energy consumption to increase as a function of sensor delay. Energy consumption for the second occupancy profile also increases as a function of sensor delay, but more quickly because occupancy events occur less frequently for the second occupancy profile compared to the first occupancy profile.
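
The relationship between sensor delay and lit time can be sketched numerically; this is an illustration of the general mechanism, with made-up intervals, not a reproduction of FIG. 6D:

```python
def lit_seconds(occupancy_intervals, delay_s):
    # Extend each (start, end) occupancy interval by the delay and merge
    # overlaps: with frequent events the delay often bridges into the
    # next event, so added delay grows total lit time more slowly than
    # with sparse events.
    extended = sorted((s, e + delay_s) for s, e in occupancy_intervals)
    total, (cur_s, cur_e) = 0, extended[0]
    for s, e in extended[1:]:
        if s <= cur_e:
            cur_e = max(cur_e, e)
        else:
            total += cur_e - cur_s
            cur_s, cur_e = s, e
    return total + (cur_e - cur_s)

frequent = [(0, 60), (120, 180), (240, 300)]   # weekday-like profile
sparse   = [(0, 60), (3600, 3660)]             # weekend-like profile
print(lit_seconds(frequent, 90), lit_seconds(sparse, 90))  # 390 300
```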


Adjusting Sensor Detection Parameters based on Stored Occupancy Sensor Data


Illustrative occupancy sensing units may further benefit warehouse and other LED light applications by learning occupancy patterns so as to adjust the occupancy sensors and/or light fixtures. For example, learned occupancy patterns based on detected occupancy events (e.g., coming and going) may provide some indication of a behavioral signature for certain individuals or objects entering an occupancy sensing area. Certain times of the work day may be found to have higher occupancy activity in certain areas of the facility. Lights in those areas, and perhaps leading up to those areas, may be kept on longer once an occupancy event has been detected during more active times of day.


When mixed occupancy events (e.g., a moving electric forklift and a walking human) are detected in adjacent or nearby areas, the processor 130 may apply certain operational rules, such as safety rules, when processing occupancy sensor data so that additional or key safety areas (e.g., ends of aisles) are well lit. In addition, different types of warehouse activities may benefit from different lighting. Occupancy detection may provide an indication as to the type of activity based on the dwell time of an occupant in a region: someone performing an audit or inventory count may tend to stay in a particular area of the inventory aisles for longer periods of time than someone simply picking a part.
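
As a toy illustration of the dwell-time heuristic, the sketch below classifies activity from dwell time alone; the five-minute boundary and the labels are assumptions, not values from the disclosure.

```python
# Hypothetical dwell-time rule: a long stay in one aisle region suggests an
# audit or inventory count; a short stay suggests part picking.
def classify_activity(dwell_seconds: float) -> str:
    return "inventory_count" if dwell_seconds > 300 else "part_pick"

print(classify_activity(45))    # -> part_pick
print(classify_activity(1200))  # -> inventory_count
```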


The occupancy sensing unit 102 may include hardware, firmware, and/or software that controls the sensing parameters (e.g., gain, offset, threshold, polling frequency, and/or polling duty cycle) of each sensing element 112 in the occupancy sensor 110. For instance, each sensing element 112 may be coupled to an individually tunable occupancy sensor circuit that controls the operating mode (on, off, standby, etc.), gain, sensitivity, delay, hysteresis, etc., of the sensing element 112. Such a circuit may be tuned locally by the processor 130 or over a network for different illuminated environments. For instance, the sensor 110 or individual sensing elements 112 may be tuned for the differences between humans and fork trucks based on temperature signatures, velocities, field of view orientations, ambient light levels (e.g., due to proximity to a window), etc. mined from stored sensor data.
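
One possible shape for such a per-element tuning interface is sketched below in Python; every field name and default is an assumption standing in for the circuit-level knobs listed above, and the forklift tuning values are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class SensingElementConfig:
    """Per-element tuning knobs mirroring the parameters listed above."""
    mode: str = "on"            # "on", "off", or "standby"
    gain: float = 1.0
    sensitivity: float = 0.5
    delay_s: float = 60.0
    hysteresis: float = 0.05
    polling_hz: float = 1.0     # polling frequency
    polling_duty: float = 1.0   # polling duty cycle

def tune_for_forklifts(cfg: SensingElementConfig) -> SensingElementConfig:
    # Assumed tuning: larger, hotter, faster targets need less gain and a
    # less sensitive trigger than a walking person.
    cfg.gain *= 0.5
    cfg.sensitivity = 0.3
    return cfg

cfg = tune_for_forklifts(SensingElementConfig())
print(cfg.gain, cfg.sensitivity)  # 0.5 0.3
```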



FIG. 7 is a flowchart that illustrates a first process 700 for converting raw analog occupancy sensor data (from a PIR occupancy sensor, for example) to a digital output indicating an “occupied” state or an “unoccupied” state (or finer-grained output, e.g., “unoccupied” vs. “person” vs. “forklift”). Conventional occupancy sensors have gain, offset, and threshold parameters that are hard-coded in hardware (e.g., as resistor values) or in firmware and do not permit user adjustment. At most, the raw signal from a conventional sensor is scaled with a gain and/or shifted by an offset, then compared to a threshold to determine whether or not an occupancy event has occurred. Because the gain, offset, and threshold are fixed when a conventional occupancy sensor is built, the conventional occupancy sensor cannot be adapted to changing (or variable) sensing conditions. As a result, a conventional occupancy sensor is likely to suffer from false positives or false negatives when used across a wide range of real-world environments.


In contrast, the sensor operation illustrated in FIG. 7 enables adaptive responses to changing occupancy patterns through real-time adjustment of gain, offset, and threshold parameters based on an analysis of past sensor values. In block 702, raw occupancy sensor data is logged to memory at regular intervals, e.g., once per second, based on a timing signal from a real-time clock, counter, or other time-keeping device. Next, in block 704, the processor coupled to the memory creates a multidimensional array or “map” of sensor readings that shows, for instance, the frequency, amplitude, duration, and/or rate of change of the raw occupancy data. If desired, the processor may create or update the map once per clock cycle, i.e., every time new data is logged to memory. The processor then processes the map in block 706, e.g., using automated data classification techniques to “partition” the map of sensor readings into clusters corresponding to particular output states (such as “person,” “forklift,” “empty,” etc.). The processor then stores the classification results back into memory in block 708 for use in controlling the lighting fixture and in future analyses of lighting system performance. The processor also determines new gain, offset, and threshold parameters based on the classification results and tunes the occupancy sensing unit accordingly in block 710.
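
The loop of blocks 702-710 might look roughly like the following Python sketch; the feature names, decision boundaries, and simulated samples are all assumptions, and a real unit would derive its boundaries from its own logged data.

```python
from collections import deque

log = deque(maxlen=3600)  # block 702: raw samples logged once per second

def build_map(samples):
    """Block 704: reduce raw data to amplitude / rate-of-change features."""
    samples = list(samples)
    amps = [abs(s) for s in samples]
    rates = [abs(b - a) for a, b in zip(samples, samples[1:])]
    return {"mean_amp": sum(amps) / len(amps),
            "max_rate": max(rates, default=0.0)}

def classify(features):
    """Block 706: partition the feature map into output states."""
    if features["mean_amp"] < 0.1:
        return "empty"
    return "forklift" if features["max_rate"] > 1.0 else "person"

# Simulated PIR samples standing in for real sensor reads.
for sample in [0.0, 0.2, 1.5, 1.4, 0.3, 0.1]:
    log.append(sample)
state = classify(build_map(log))  # blocks 704-706
print(state)                      # -> forklift; block 708 would store it
```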


In some embodiments, the automated data classification techniques performed by the processor may include “cluster analysis,” which is the assignment of a set of objects into groups (called clusters) based on common characteristics. Objects in a particular cluster tend to be more similar, in some sense, to each other than to objects in other clusters. One example of basic cluster analysis involves creating a scatter plot of detected occupancy events versus two independent parameters, such as time of day and estimated object velocity, then dividing the points on the scatter plot into clusters. For instance, the points can be grouped based on their mean distance from each other or from a “centroid,” or central vector. Alternatively, points can be grouped into clusters using distribution models, density models, or subspace models as understood in the art. The processor may infer occupancy patterns and behaviors from the size, location (with respect to the parameters), and number of elements in a particular cluster. Other suitable automated data classification techniques include, but are not limited to: machine learning, pattern recognition, image analysis, information retrieval, and data mining.
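
As one concrete (and generic) instance of the centroid-based grouping mentioned above, the sketch below runs a small k-means over events plotted against time of day (hours) and estimated speed (m/s); the data points and k = 2 are assumptions for illustration.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Toy k-means over 2-D (time-of-day, speed) occupancy events."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # Assign each event to its nearest centroid.
            i = min(range(k),
                    key=lambda j: (p[0] - centroids[j][0]) ** 2 +
                                  (p[1] - centroids[j][1]) ** 2)
            groups[i].append(p)
        # Recompute each centroid as the mean of its group.
        centroids = [(sum(p[0] for p in g) / len(g),
                      sum(p[1] for p in g) / len(g)) if g else centroids[i]
                     for i, g in enumerate(groups)]
    return centroids, groups

events = [(8.1, 0.4), (8.3, 0.5), (8.6, 0.3),   # morning, walking speeds
          (17.5, 3.2), (17.9, 3.5)]             # evening, vehicle speeds
centroids, clusters = kmeans(events, k=2)
print(centroids)  # two centers: one near (8.3, 0.4), one near (17.7, 3.35)
```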



FIG. 8 is a two-dimensional parameter map generated from a notional set of occupancy sensor data. Each point represents a detected occupancy event as a function of two parameters: rate of change (x axis), which correlates with object speed, and signal amplitude (y axis), which correlates with object size. The points form clusters that can be grouped together, e.g., based on a maximum distance from a center of mass for each cluster. The clusters can then be classified based on their respective ranges for each combination of parameters. In this case, medium-sized, slow-moving objects are taken to be people; large, fast-moving objects are taken to be vehicles; and small, fast-moving objects are taken to be animals.
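
The range-based labeling of FIG. 8 could be approximated with rules like the following; the numeric thresholds are assumptions chosen only to reproduce the three labels named above.

```python
# Assumed thresholds: rate of change stands in for speed, amplitude for size.
def label_cluster(mean_rate: float, mean_amp: float) -> str:
    slow = mean_rate < 1.0
    if slow and 0.5 <= mean_amp <= 2.0:
        return "person"        # medium-sized, slow-moving
    if not slow and mean_amp > 2.0:
        return "vehicle"       # large, fast-moving
    if not slow and mean_amp < 0.5:
        return "animal"        # small, fast-moving
    return "unknown"

print(label_cluster(mean_rate=0.4, mean_amp=1.0))  # -> person
print(label_cluster(mean_rate=2.5, mean_amp=3.0))  # -> vehicle
```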


After the processor 130 (or a user) has classified each cluster, the processor 130 (or user) may estimate the mean, median, and range of parameters associated with each particular class of object. For instance, the processor 130 may determine that people move at rates of 0.1 m/s to 0.6 m/s with a mean speed of 0.4 m/s. Given knowledge of the size of the illuminated area, the processor 130 may adjust the sensor timeout or the lighting fixture timeout to match or exceed a person's mean (or maximum) travel time through the illuminated environment.
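
Using the example figures above (a 0.4 m/s mean walking speed), a timeout sized to a person's travel time might be computed as follows; the aisle length and safety margin are assumptions for the sketch.

```python
# Sketch: pick a timeout that covers a person's travel time through the
# monitored area, with a safety margin so the light does not cut out early.
def travel_timeout(aisle_length_m: float, mean_speed_mps: float = 0.4,
                   margin: float = 1.5) -> float:
    return margin * aisle_length_m / mean_speed_mps

print(travel_timeout(30.0))  # 30 m aisle -> 112.5 s timeout
```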


Adding parameters to the parameter space further enhances the processor's ability to tailor the lighting and to reduce energy consumption. For instance, the processor 130 may also infer the most common trajectory through the illuminated area by determining which occupancy sensor(s) detected the plotted occupancy events. It may also determine that different types of objects take different paths. For instance, a multidimensional parameter map with the parameters direction, speed, and size may show that vehicles tend to travel down a central aisle, whereas people tend to travel along narrower aisles branching off the central aisle. All of these classifications can be used to tune the sensor detection and response parameters, including timeout, gain, offset, and threshold.


Adjusting Sensor Delay Based on Analysis of Stored Occupancy Sensor Data


Inventive occupancy sensing units are also capable of determining an optimal value of the sensor delay and adjusting the sensor delay accordingly. If the sensor delay is too long, then the lighting fixture remains on unnecessarily, wasting energy; if the sensor delay is too short, then the lighting fixture turns off too soon (i.e., while the illuminated environment is still occupied), which impairs safety, productivity, and comfort. In a conventional occupancy sensor, the sensor delay parameter is hard-coded into the sensor via a DIP switch, button, or other manual interface. Changing the sensor delay of a conventional sensor requires manually actuating the switch on the sensor, which can be difficult, dangerous, and time-consuming for a sensor mounted on a high-bay lighting fixture fourteen feet above the ground. In addition, even if the sensor delay can be changed, there is no way to determine an “optimal” sensor delay setting for a conventional sensor because the conventional sensor does not record or analyze historical data.


In one example, an inventive occupancy sensing unit has a sensor delay (“timeout”) that can be adjusted by the processor in the occupancy sensing unit according to the processes 900 and 950 shown in FIGS. 9A and 9B, respectively. Each sensor delay adjustment operation begins in block 902 with logging time-stamped occupancy sensor data to a memory in the occupancy sensing unit and/or to a remote memory (e.g., a memory connected to the occupancy sensing unit via the Internet or another communications network). The data logging may occur at regular intervals, e.g., once per second, as determined by a real-time clock, counter, network clock signal, or other time-keeping device. In block 904, the logged occupancy sensor data is used to create or update histograms of “sensor on” and “sensor off” durations, e.g., as shown in FIGS. 5, 6A, and 6B.
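
Block 904's histograms can be built by run-length encoding the logged 1 Hz samples, as in this sketch (1 = occupied, 0 = vacant; the sample stream is invented for illustration).

```python
from collections import Counter

def duration_histograms(samples):
    """Histogram "sensor on" and "sensor off" run lengths, in seconds."""
    on_hist, off_hist = Counter(), Counter()
    run_state, run_len = samples[0], 0
    for s in samples:
        if s == run_state:
            run_len += 1
        else:
            (on_hist if run_state else off_hist)[run_len] += 1
            run_state, run_len = s, 1
    (on_hist if run_state else off_hist)[run_len] += 1  # close final run
    return on_hist, off_hist

on_h, off_h = duration_histograms([1, 1, 1, 0, 0, 1, 1, 0])
print(on_h)   # Counter({3: 1, 2: 1}) -- seconds of continuous occupancy
print(off_h)  # Counter({2: 1, 1: 1})
```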


Next, the processor adjusts the sensor parameters based on the histograms created or updated in block 904. In process 900, shown in FIG. 9A, the processor compares the sensor delay to a histogram peak in block 906. When set properly, the sensor delay matches the histogram peak, and no adjustment is necessary. If the sensor delay is less than the histogram peak, the processor increments the sensor delay by a predetermined amount, e.g., thirty seconds, one minute, five minutes, or ten minutes, in block 908. If the sensor delay is greater than the histogram peak, the processor decrements the sensor delay by a predetermined amount, e.g., thirty seconds, one minute, five minutes, or ten minutes, in block 910. In process 950, shown in FIG. 9B, sensor delay adjustment involves determining, in block 926, whether the number of “sensor off” (“sensor on”) occurrences falls below (above) a predetermined threshold number of occurrences. If the number of “sensor off” (“sensor on”) occurrences is below the threshold, the processor increments (decrements) the sensor delay by a predetermined amount, e.g., thirty seconds, one minute, five minutes, or ten minutes, in block 928. If desired, the processor may average or otherwise combine the results of the sensor delay determination techniques of both process 900 and process 950. Once the processor has incremented or decremented the sensor delay, it stores the sensor delay in memory, in block 912, and tunes the sensor delay accordingly, in block 914.
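
Process 900's compare-and-step rule (blocks 906-910) reduces to a few lines; the histogram contents are invented, and the thirty-second step is one of the predetermined amounts named above.

```python
STEP_S = 30  # one of the predetermined adjustment amounts

def adjust_delay(delay_s: int, on_histogram: dict) -> int:
    """on_histogram maps "sensor on" duration (s) to occurrence count."""
    peak = max(on_histogram, key=on_histogram.get)  # block 906
    if delay_s < peak:
        return delay_s + STEP_S                     # block 908: increment
    if delay_s > peak:
        return delay_s - STEP_S                     # block 910: decrement
    return delay_s                                  # already matches the peak

hist = {30: 4, 60: 11, 120: 7, 300: 2}
print(adjust_delay(30, hist))  # -> 60, stepping toward the 60 s peak
```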


Cooperative Occupancy Sensing



FIGS. 10A and 10B show how inventive lighting fixtures 100 and occupancy sensing units 102 can be used together with a lighting engine 1100 as part of a lighting system 1000 to provide more sophisticated detection, identification, and analysis of occupancy patterns in illuminated environments. A set of lighting fixtures 100, each including an occupancy sensing unit 102, provides variable, occupancy-based illumination for a particular environment, such as a warehouse, commercial space, or government facility. Each occupancy sensing unit 102 collects, stores, and analyzes time-stamped occupancy sensing data 104 as described above and as in block 1052 of process 1050. The lighting fixtures 100 exchange information with each other and with the lighting engine 1100 via their respective communication interfaces.


The lighting engine 1100 includes a harvesting engine 1002, which may be implemented in a general-purpose computer processor or as an application-specific processor, that is communicatively coupled to each occupancy sensing unit 102 via a communications network, such as a radio-frequency wireless communications network, an infrared communications network, or a wire- or optical fiber-based communications network. The harvesting engine 1002 retrieves time-stamped occupancy sensing data 104 from the local memory 120 in each occupancy sensing unit 102 on a periodic or as-needed basis as in block 1054 of FIG. 10B. Alternatively, each occupancy sensing unit 102 may transmit its respective occupancy sensing data 104 to a central harvesting engine 1002. The harvesting engine aggregates the retrieved or transmitted data 104 in an aggregated occupancy event database 1004, which can be implemented in any type of suitable nonvolatile memory. The data in the database 1004 may include, but is not limited to, a time stamp, a fixture identifier, and an event identification code or tag.
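
The aggregated occupancy event database 1004 could be realized with any embedded store; this sketch uses SQLite as a stand-in, with columns mirroring the time stamp, fixture identifier, and event tag named above (the table and column names are assumptions).

```python
import sqlite3

db = sqlite3.connect(":memory:")  # stand-in for a nonvolatile store
db.execute("""CREATE TABLE occupancy_events (
                  ts        TEXT,   -- ISO-8601 time stamp
                  fixture   TEXT,   -- fixture identifier
                  event_tag TEXT)""")
db.execute("INSERT INTO occupancy_events VALUES (?, ?, ?)",
           ("2011-11-04T11:27:03", "fixture-017", "person"))
for row in db.execute("SELECT * FROM occupancy_events"):
    print(row)
```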


The lighting engine 1100 also includes an event processor 1006 coupled to the event database 1004. Like the harvesting engine 1002, the event processor 1006 can be implemented in a general-purpose computer processor or as an application-specific processor. The event processor 1006 transforms the time-stamped data in the aggregated event database 1004 into an interval-based form for display, as in block 1058 of FIG. 10B. The interval-based data can be presented to a user via a reporting graphical user interface (GUI) 1010 that shows occupancy data per fixture or zone, traffic between fixtures or within lighting zones, current and historical sensor and fixture parameters, and energy usage per fixture or zone as a function of time and/or space.
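
The block 1058 transformation, pairing time-stamped on/off events into occupancy intervals for per-fixture reporting, might look like this sketch (the event format is an assumption).

```python
from datetime import datetime

def to_intervals(events):
    """events: sorted (timestamp, state) pairs with state in {"on", "off"}."""
    intervals, start = [], None
    for ts, state in events:
        if state == "on" and start is None:
            start = ts                      # open an interval
        elif state == "off" and start is not None:
            intervals.append((start, ts))   # close it
            start = None
    return intervals

evts = [(datetime(2011, 11, 4, 11, 27, 3), "on"),
        (datetime(2011, 11, 4, 11, 31, 18), "off")]
print(to_intervals(evts))  # one ~4-minute occupancy interval
```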


The lighting engine 1100 and lighting fixtures 100 (and possibly separate occupancy sensing units 102) are commissioned and connected to each other to form a wireless network (i.e., the lighting system 1000). In one example, occupancy sensing units 102 are installed on existing high-bay lighting fixtures 100 in a cold-storage facility and connected to a power supply, such as an AC power line. An installer commissions the occupancy sensing units 102 with a wireless device, such as a laptop computer, smart phone, or personal digital assistant, by sending a commissioning signal to each occupancy sensing unit 102 from the wireless device while walking through the cold-storage facility (as opposed to commissioning each sensing unit 102 by hand). The wireless device may


Once installed, the occupancy sensing units 102 can communicate with each other directly via their respective communications interfaces 160 or indirectly via a central controller, such as the event processor 1006 in the lighting engine 1100. The occupancy sensing units 102 may be coupled to each other (and to the event processor 1006) via a wireless network (e.g., a Zigbee® network) or a wired network (e.g., an Ethernet network). The occupancy sensing units 102 may exchange signals, such as “heartbeat” signals representing current operating status, on a periodic basis. They may also distribute raw or processed occupancy sensing information. For instance, an occupancy sensing unit 102 at the head of a warehouse aisle may detect an occupancy event, then broadcast an indication of the occupancy event to every occupancy sensing unit 102 in the vicinity. Alternatively, the occupancy sensing unit 102 at the head of the warehouse aisle may detect and identify a moving object, predict the object's trajectory, and send indications of the object's predicted trajectory to those occupancy sensing units 102 along the object's predicted trajectory. The notified occupancy sensing units 102 may then activate their respective lighting fixtures 100 to illuminate the predicted trajectory.
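
The trajectory hand-off described above might be sketched as follows; the aisle topology, unit names, and the notify() stand-in for the real network send are all assumptions.

```python
AISLE = ["unit-1", "unit-2", "unit-3", "unit-4"]  # physical order along aisle

def notify(unit_id: str, message: dict) -> None:
    print(f"{unit_id} <- {message}")  # stand-in for the real network send

def broadcast_trajectory(detector: str, direction: int) -> None:
    """direction: +1 toward higher indices, -1 toward lower indices."""
    i = AISLE.index(detector)
    for unit in AISLE[i + direction::direction]:
        notify(unit, {"event": "predicted_arrival", "from": detector})

broadcast_trajectory("unit-1", +1)  # lights the rest of the aisle in advance
```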


In addition to exchanging occupancy information, the occupancy sensing units 102 may identify and compensate for malfunctioning occupancy sensing units 102. Consider an aisle monitored by three occupancy sensing units 102 and illuminated by three lighting fixtures 100 arranged along the aisle. A person must enter the aisle from one end or the other, so the middle occupancy sensing unit 102 should not detect an occupancy event unless one of the other occupancy sensing units 102 detects occupancy first. If the occupancy sensing units 102 on the ends detect objects moving along the aisle (e.g., they detect occupancy sensing events at an interval about equal to the time it takes to walk from one end of the aisle to the other) but the middle occupancy sensing unit 102 detects nothing, they may determine that the middle occupancy sensing unit 102 is broken and activate the middle lighting fixture 100. Similarly, if the middle occupancy sensing unit 102 detects an occupancy event but the occupancy sensing units 102 on the ends of the aisle do not detect anything, the middle occupancy sensing unit 102 may be broken. In some instances, these indications may be used to tune or re-calibrate the gain, offset, and threshold settings of the malfunctioning occupancy sensing unit 102.
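
The end-versus-middle consistency check could be expressed as a small predicate like the one below; the walk-time window and timestamp convention are assumptions for the sketch.

```python
def middle_unit_suspect(end_a_ts, middle_ts, end_b_ts,
                        walk_time_s: float = 30.0) -> bool:
    """Timestamps in seconds; None means the unit saw nothing.

    Flags the middle unit when both ends fire within roughly one aisle
    walk-time of each other but the middle unit stays silent.
    """
    ends_agree = (end_a_ts is not None and end_b_ts is not None
                  and abs(end_b_ts - end_a_ts) <= 1.5 * walk_time_s)
    return ends_agree and middle_ts is None

print(middle_unit_suspect(100.0, None, 128.0))  # -> True: flag and re-tune
```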


Conclusion


While various inventive embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the inventive embodiments described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the inventive teachings is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific inventive embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, inventive embodiments may be practiced otherwise than as specifically described and claimed. Inventive embodiments of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the inventive scope of the present disclosure.


The above-described embodiments can be implemented in any of numerous ways. For example, the embodiments may be implemented using hardware, software or a combination thereof. When implemented in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers.


Further, it should be appreciated that a computer may be embodied in any of a number of forms, such as a rack-mounted computer, a desktop computer, a laptop computer, or a tablet computer. Additionally, a computer may be embedded in a device not generally regarded as a computer but with suitable processing capabilities, including a Personal Digital Assistant (PDA), a smart phone or any other suitable portable or fixed electronic device.


Also, a computer may have one or more input and output devices. These devices can be used, among other things, to present a user interface. Examples of output devices that can be used to provide a user interface include printers or display screens for visual presentation of output and speakers or other sound generating devices for audible presentation of output. Examples of input devices that can be used for a user interface include keyboards, and pointing devices, such as mice, touch pads, and digitizing tablets. As another example, a computer may receive input information through speech recognition or in other audible format.


Such computers may be interconnected by one or more networks in any suitable form, including a local area network or a wide area network, such as an enterprise network, an intelligent network (IN), or the Internet. Such networks may be based on any suitable technology, may operate according to any suitable protocol, and may include wireless networks, wired networks, or fiber optic networks.


The various methods or processes outlined herein may be coded as software that is executable on one or more processors that employ any one of a variety of operating systems or platforms. Additionally, such software may be written using any of a number of suitable programming languages and/or programming or scripting tools, and also may be compiled as executable machine language code or intermediate code that is executed on a framework or virtual machine.


In this respect, various inventive concepts may be embodied as a computer readable storage medium (or multiple computer readable storage media) (e.g., a computer memory, one or more floppy discs, compact discs, optical discs, magnetic tapes, flash memories, circuit configurations in Field Programmable Gate Arrays or other semiconductor devices, or other non-transitory medium or tangible computer storage medium) encoded with one or more programs that, when executed on one or more computers or other processors, perform methods that implement the various embodiments of the invention discussed above. The computer readable medium or media can be transportable, such that the program or programs stored thereon can be loaded onto one or more different computers or other processors to implement various aspects of the present invention as discussed above.


The terms “program” or “software” are used herein in a generic sense to refer to any type of computer code or set of computer-executable instructions that can be employed to program a computer or other processor to implement various aspects of embodiments as discussed above. Additionally, it should be appreciated that according to one aspect, one or more computer programs that when executed perform methods of the present invention need not reside on a single computer or processor, but may be distributed in a modular fashion amongst a number of different computers or processors to implement various aspects of the present invention.


Computer-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Typically the functionality of the program modules may be combined or distributed as desired in various embodiments.


Also, data structures may be stored in computer-readable media in any suitable form. For simplicity of illustration, data structures may be shown to have fields that are related through location in the data structure. Such relationships may likewise be achieved by assigning storage for the fields with locations in a computer-readable medium that convey relationship between the fields. However, any suitable mechanism may be used to establish a relationship between information in fields of a data structure, including through the use of pointers, tags or other mechanisms that establish relationship between data elements.


Also, various inventive concepts may be embodied as one or more methods, of which an example has been provided. The acts performed as part of the method may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.


All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.


The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”


The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.


As used herein in the specification and in the claims, “or” should be understood to have the same meaning as “and/or” as defined above. For example, when separating items in a list, “or” or “and/or” shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as “only one of” or “exactly one of,” or, when used in the claims, “consisting of,” will refer to the inclusion of exactly one element of a number or list of elements. In general, the term “or” as used herein shall only be interpreted as indicating exclusive alternatives (i.e., “one or the other but not both”) when preceded by terms of exclusivity, such as “either,” “one of,” “only one of,” or “exactly one of.” “Consisting essentially of,” when used in the claims, shall have its ordinary meaning as used in the field of patent law.


As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, “at least one of A and B” (or, equivalently, “at least one of A or B,” or, equivalently “at least one of A and/or B”) can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.


In the claims, as well as in the specification above, all transitional phrases such as “comprising,” “including,” “carrying,” “having,” “containing,” “involving,” “holding,” “composed of,” and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of” shall be closed or semi-closed transitional phrases, respectively, as set forth in the eighth edition as revised in July 2010 of the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03.

Claims
  • 1. An occupancy sensing unit to monitor an environment illuminated by a lighting fixture, the occupancy sensing unit comprising:
    A) at least one occupancy sensor to detect radiation indicative of at least one occupancy event, in the environment illuminated by the lighting fixture, according to sensing parameters;
    B) a memory, operatively coupled to the at least one occupancy sensor, to log sensor data, representing the at least one occupancy event, provided by the at least one occupancy sensor; and
    C) a processor, operatively coupled to the memory, to:
      C1) perform an analysis of the sensor data logged in the memory, the analysis comprising forming a representation of the sensor data logged in the memory based on at least one of a frequency, an amplitude, a duration, or a rate of change of the sensor data logged in the memory,
      C2) perform a classification of the sensor data according to the representation of the sensor data logged in the memory formed in C1),
      C3) store, in the memory, results of the classification performed in C2) for analysis of future sensor data from the at least one occupancy sensor, and
      C4) adjust at least one of a gain, a threshold, an offset, a timeout, or a sensitivity of the at least one occupancy sensor based on the results stored in the memory in C3).
  • 2. The occupancy sensing unit of claim 1, wherein the at least one occupancy sensor provides an analog signal representative of the at least one occupancy event, and wherein the occupancy sensing unit further comprises: an analog-to-digital converter, operatively coupled to the at least one occupancy sensor, to provide a digital representation of the analog signal at one of a plurality of digital levels, wherein different levels among the plurality of digital levels represent different types of occupancy events.
  • 3. The occupancy sensing unit of claim 1, wherein the at least one occupancy sensor comprises: two or more sensing elements to provide one or more signals indicative of at least one of a velocity or a trajectory associated with the at least one occupancy event, and wherein the sensor data represents the velocity associated with the at least one occupancy event.
  • 4. The occupancy sensing unit of claim 3, wherein the analysis comprises determination of a frequency with which at least one of a particular velocity or a particular trajectory appears in the sensor data.
  • 5. The occupancy sensing unit of claim 1, wherein the sensing parameters comprise the at least one of the gain, the threshold, the offset, the timeout, or the sensitivity of the at least one occupancy sensor.
  • 6. The occupancy sensing unit of claim 1, wherein the lighting fixture remains in an active state after the at least one occupancy sensor stops sensing the radiation indicative of the at least one occupancy event for a sensor delay.
  • 7. The occupancy sensing unit of claim 6, wherein the processor adjusts the sensor delay based on the analysis of the sensor data logged in the memory.
  • 8. The occupancy sensing unit of claim 1, wherein the analysis performed by the processor comprises creating an n-dimensional array of the sensor data logged in the memory, wherein each dimension of the array corresponds to a parameter associated with the at least one occupancy event.
  • 9. The occupancy sensing unit of claim 8, wherein the analysis performed by the processor further comprises partitioning the n-dimensional array into clusters corresponding to different types of occupancy events.
  • 10. The occupancy sensing unit of claim 8, wherein the dimensions of the array comprise at least one of a frequency, amplitude, duration, rate of change, duty cycle, time of day, day of the week, month of the year, ambient light level, or ambient temperature associated with the sensor data logged in the memory.
  • 11. The occupancy sensing unit of claim 1, wherein the analysis performed by the processor comprises determining a distribution of a frequency with which the at least one occupancy sensor detects occupancy events.
  • 12. The occupancy sensing unit of claim 11, wherein the processor adjusts a duration of a sensor delay based on the distribution of the frequency with which the at least one occupancy sensor detects occupancy events.
  • 13. The occupancy sensing unit of claim 1, further comprising: a communications interface to provide at least one of sensor data or a signal indicative of the at least one occupancy event to at least one of a controller of a lighting fixture, a lighting management system, or another occupancy sensing unit.
  • 14. The occupancy sensing unit of claim 1, in combination with a light-emitting diode (LED) lighting fixture comprising: D) at least one LED to illuminate the environment; and E) a controller, operatively coupled to the at least one LED and to the occupancy sensing unit, to place the at least one LED in an active state in response to a signal indicative of the at least one occupancy event and to place the at least one LED in an inactive state after an elapse of a sensor delay.
  • 15. The occupancy sensing unit of claim 14, wherein the controller: E1) sets the at least one LED to a first lighting level in response to a signal indicative of a first type of occupancy event, and E2) sets the at least one LED to a second lighting level in response to a signal indicative of a second type of occupancy event.
  • 16. The occupancy sensing unit of claim 14, wherein the controller: E3) changes a light level of the at least one LED after a first elapsed time in response to a signal indicative of a first type of occupancy event, and E4) changes the light level of the at least one LED after a second elapsed time in response to a signal indicative of a second type of occupancy event.
  • 17. A method of monitoring an environment illuminated by a lighting fixture, the method comprising:
    A) providing, with an occupancy sensor having sensing parameters, sensor data representative of at least one occupancy event in the environment illuminated by the lighting fixture according to the sensing parameters;
    B) logging the sensor data in a memory;
    C) performing an analysis of the sensor data logged in the memory in B), the analysis comprising forming a representation of the sensor data logged in the memory based on at least one of a frequency, an amplitude, a duration, or a rate of change of the sensor data logged in the memory;
    D) performing a classification of the sensor data according to the representation of the sensor data logged in the memory formed in C);
    E) storing, in the memory, results of the classification performed in D) for analysis of future sensor data from the occupancy sensor; and
    F) adjusting at least one of a gain, a threshold, an offset, a timeout, or a sensitivity of the occupancy sensor based on the results stored in the memory in E).
  • 18. The method of claim 17, wherein A) further comprises: A1) providing, with the occupancy sensor, an analog signal representative of the at least one occupancy event; and A2) digitizing the analog signal at one of a plurality of digital levels to provide the sensor data, wherein different levels in the plurality of digital levels represent different types of occupancy events.
  • 19. The method of claim 17, wherein the sensor data represents at least one of a velocity or a trajectory associated with the at least one occupancy event.
  • 20. The method of claim 19, wherein C) comprises determining a frequency with which at least one of a particular velocity or a particular trajectory appears in the sensor data.
  • 21. The method of claim 17, wherein C) comprises creating an n-dimensional array of the sensor data logged in the memory, wherein each dimension of the array is a parameter associated with the at least one occupancy event.
  • 22. The method of claim 21, wherein C) further comprises partitioning the n-dimensional array into clusters corresponding to different types of occupancy events.
  • 23. The method of claim 21, wherein the dimensions of the array comprise at least one of a frequency, amplitude, duration, rate of change, duty cycle, time of day, day of the week, month of the year, ambient light level, or ambient temperature associated with the sensor data logged in the memory.
  • 24. The method of claim 17, wherein C) comprises determining a distribution of a frequency with which the occupancy sensor detects occupancy events.
  • 25. The method of claim 24, wherein F) comprises changing a duration of a sensor delay after detection of the at least one occupancy event based on the distribution of the frequency with which the occupancy sensor detects occupancy events.
  • 26. The method of claim 17, further comprising: changing a sensor delay after an end of the at least one occupancy event based on the analysis in C).
  • 27. The method of claim 17, further comprising: providing at least one of the sensor data or a signal indicative of the at least one occupancy event to at least one of a controller of a lighting fixture, a lighting management system, or another occupancy sensing unit.
  • 28. The method of claim 17, further comprising: changing an illumination level of the environment in response to a signal indicative of the at least one occupancy event.
  • 29. The method of claim 28, wherein changing the illumination level of the environment comprises: setting the illumination level to a first level in response to a signal indicative of a first type of occupancy event, and setting the illumination level to a second level in response to a signal indicative of a second type of occupancy event.
  • 30. The method of claim 28, wherein changing the illumination level of the environment comprises: changing the illumination level after a first elapsed time in response to a signal indicative of a first type of occupancy event, and changing the illumination level after a second elapsed time in response to a signal indicative of a second type of occupancy event.
  • 31. A lighting system to provide variable occupancy-based illumination of an environment, the lighting system comprising: a plurality of lighting fixtures, wherein each lighting fixture in the plurality of lighting fixtures comprises:
    A) at least one occupancy sensor to provide a first occupancy signal representing at least one occupancy event;
    B) a communications interface to transmit the first occupancy signal to at least one other lighting fixture in the plurality of lighting fixtures and to receive a second occupancy signal from another lighting fixture in the plurality of lighting fixtures;
    C) a memory, operatively coupled to the communications interface, to store sensor data representing the first and second occupancy signals;
    D) at least one light source to illuminate the environment in response to at least one of the first occupancy signal or the second occupancy signal; and
    E) a controller, operatively coupled to the light source, the communications interface, and the memory, to:
      E1) place the at least one light source in an inactive state after elapse of a predetermined delay period following an end of the at least one occupancy event,
      E2) perform an analysis of the sensor data logged in the memory, the analysis comprising forming a representation of the sensor data logged in the memory based on at least one of a frequency, an amplitude, a duration, or a rate of change of the sensor data logged in the memory,
      E3) perform a classification of the sensor data according to the representation of the sensor data logged in the memory formed in E2),
      E4) store, in the memory, results of the classification performed in E3) for analysis of future sensor data from the at least one occupancy sensor, and
      E5) adjust the predetermined delay period of the at least one occupancy sensor based on the results stored in the memory in E4).
  • 32. The lighting system of claim 31, wherein the controller controls a light level of the at least one light source based at least in part on the first and second occupancy signals.
  • 33. The lighting system of claim 31, wherein at least two of the plurality of lighting fixtures are configured to provide respective signals indicative of at least one of a velocity or a trajectory associated with the at least one occupancy event.
CROSS-REFERENCE TO RELATED PATENT APPLICATIONS

This application claims the benefit, under 35 U.S.C. §119(e), of U.S. Provisional Patent Application No. 61/409,991, filed on Nov. 4, 2010, entitled “Occupancy Sensor,” which application is hereby incorporated herein by reference.

20100127634 Dowling et al. May 2010 A1
20100134051 Huizenga et al. Jun 2010 A1
20100135186 Choong et al. Jun 2010 A1
20100148689 Morgan et al. Jun 2010 A1
20100169249 Jhala et al. Jul 2010 A1
20100171145 Morgan et al. Jul 2010 A1
20100171442 Draper et al. Jul 2010 A1
20100185339 Huizenga et al. Jul 2010 A1
20100201267 Bourquin et al. Aug 2010 A1
20100204841 Chemel et al. Aug 2010 A1
20100207534 Dowling et al. Aug 2010 A1
20100211443 Carrel et al. Aug 2010 A1
20100246168 Verfuerth et al. Sep 2010 A1
20100246172 Liu Sep 2010 A1
20100253499 Haab et al. Oct 2010 A1
20100259931 Chemel et al. Oct 2010 A1
20100262313 Chambers et al. Oct 2010 A1
20100264834 Gaines et al. Oct 2010 A1
20100264846 Chemel et al. Oct 2010 A1
20100270933 Chemel et al. Oct 2010 A1
20100295473 Chemel et al. Nov 2010 A1
20100295474 Chemel et al. Nov 2010 A1
20100295475 Chemel et al. Nov 2010 A1
20100295482 Chemel et al. Nov 2010 A1
20100296285 Chemel et al. Nov 2010 A1
20100301768 Chemel et al. Dec 2010 A1
20100301769 Chemel et al. Dec 2010 A1
20100301770 Chemel et al. Dec 2010 A1
20100301771 Chemel et al. Dec 2010 A1
20100301773 Chemel et al. Dec 2010 A1
20100301774 Chemel et al. Dec 2010 A1
20100301834 Chemel et al. Dec 2010 A1
20100302779 Chemel et al. Dec 2010 A1
20100307075 Zampini et al. Dec 2010 A1
20100308736 Hung et al. Dec 2010 A1
20110001436 Chemel et al. Jan 2011 A1
20110001438 Chemel et al. Jan 2011 A1
20110033632 Vance et al. Feb 2011 A1
20110035404 Morgan et al. Feb 2011 A1
20110038148 Pyle Feb 2011 A1
20110043124 Johnston et al. Feb 2011 A1
20110057581 Ashar et al. Mar 2011 A1
20110060701 Verfuerth et al. Mar 2011 A1
20110068702 Van De Ven et al. Mar 2011 A1
20110084608 Lin et al. Apr 2011 A1
20110090684 Logan et al. Apr 2011 A1
20110102052 Billingsley et al. May 2011 A1
20110118890 Parsons May 2011 A1
20110140612 Mohan et al. Jun 2011 A1
20110146669 Bartol et al. Jun 2011 A1
20110172844 Choong et al. Jul 2011 A1
20110198977 VanderSluis Aug 2011 A1
20110204820 Tikkanen et al. Aug 2011 A1
20110216538 Logan et al. Sep 2011 A1
20110235317 Verfuerth et al. Sep 2011 A1
20110248171 Rueger et al. Oct 2011 A1
20110254466 Jackson et al. Oct 2011 A1
20110279063 Wang et al. Nov 2011 A1
20110279248 Ogawa Nov 2011 A1
20120007511 Choong et al. Jan 2012 A1
20120032599 Mohan et al. Feb 2012 A1
20120037725 Verfuerth Feb 2012 A1
20120038281 Verfuerth Feb 2012 A1
20120038490 Verfuerth Feb 2012 A1
20120040606 Verfuerth Feb 2012 A1
20120044350 Verfuerth Feb 2012 A1
20120044670 Piepgras et al. Feb 2012 A1
20120058663 Oster Mar 2012 A1
20120062125 Mohan et al. Mar 2012 A1
20120081906 Verfuerth et al. Apr 2012 A1
20120112654 Choong et al. May 2012 A1
20120112667 Mohan et al. May 2012 A1
20120130544 Mohan et al. May 2012 A1
20120167957 Verfuerth et al. Jul 2012 A1
20120182729 Verfuerth et al. Jul 2012 A1
20120203601 Verfuerth et al. Aug 2012 A1
20120209755 Verfuerth et al. Aug 2012 A1
20120229049 Mohan et al. Sep 2012 A1
20120233045 Verfuerth et al. Sep 2012 A1
20120235579 Chemel et al. Sep 2012 A1
20120274222 Verfuerth et al. Nov 2012 A1
20120299485 Mohan et al. Nov 2012 A1
20120326608 Mohan et al. Dec 2012 A1
20130006437 Verfuerth et al. Jan 2013 A1
20130020949 Mohan et al. Jan 2013 A1
20130033183 Verfuerth et al. Feb 2013 A1
20130063042 Bora et al. Mar 2013 A1
20130069542 Curasi et al. Mar 2013 A1
20130069543 Mohan et al. Mar 2013 A1
20130088168 Mohan et al. Apr 2013 A1
20130094230 Verfuerth et al. Apr 2013 A1
20130131882 Verfuerth et al. May 2013 A1
20130141904 Verfuerth et al. Jun 2013 A1
20130169185 Dai et al. Jul 2013 A1
20130193857 Tlachac et al. Aug 2013 A1
20130229795 Wang et al. Sep 2013 A1
20130257292 Verfuerth et al. Oct 2013 A1
20130293117 Verfuerth Nov 2013 A1
20130308325 Verfuerth et al. Nov 2013 A1
20140028199 Chemel et al. Jan 2014 A1
20140117852 Zhai et al. May 2014 A1
20140285090 Chemel et al. Sep 2014 A1
20140285095 Chemel et al. Sep 2014 A1
20140292208 Chemel et al. Oct 2014 A1
20140293605 Chemel et al. Oct 2014 A1
20140333222 Chemel et al. Nov 2014 A1
20140375206 Holland et al. Dec 2014 A1
20150008827 Carrigan et al. Jan 2015 A1
20150008828 Carrigan et al. Jan 2015 A1
20150061511 Chemel et al. Mar 2015 A1
Foreign Referenced Citations (7)
Number Date Country
1873908 Dec 2006 CN
05-073133 Mar 1993 JP
2006-106762 Apr 2006 JP
2007-045407 Feb 2007 JP
WO 9620369 Jul 1996 WO
WO 2007003038 Jan 2007 WO
WO 2007116332 Oct 2007 WO
Non-Patent Literature Citations (115)
Entry
Albeo Technologies, C Series, http://www.albeotech.com/?site_id=1500&item_id=161711, retrieved May 18, 2011.
Albeo Technologies, C3 Series, http://www.albeotech.com/?site_id=1500&item_id=173338, retrieved May 18, 2011.
Albeo Technologies, S Series, http://www.albeotech.com/?site_id=1500&item_id=161722, retrieved May 18, 2011.
Albeo Technologies, Surface Mounts, http://www.albeotech.com/?site_id=1500&item_id=161724, retrieved May 18, 2011.
Beta LED, 227 Series LED Canopy, http://www.betaled.com/us-en/TechnicalLibrary/TechnicalDocuments/227-series-canopy.aspx, retrieved May 18, 2011.
Beta LED, 227 Series LED Soffit, http://www.betaled.com/us-en/TechnicalLibrary/TechnicalDocuments/227-series-soffit.aspx, retrieved May 18, 2011.
Beta LED, 304 Series LED Interior, http://www.betaled.com/us-en/TechnicalLibrary/TechnicalDocuments/304-series-canopy.aspx, retrieved May 18, 2011.
Beta LED, 304 Series LED Parking Structure, http://www.betaled.com/us-en/TechnicalLibrary/TechnicalDocuments/304-series-parking.aspx, retrieved May 18, 2011.
Beta LED, 304 Series LED Soffit, http://www.betaled.com/us-en/TechnicalLibrary/TechnicalDocuments/304-series-soffit.aspx, retrieved May 18, 2011.
Beta LED, The Edge Canopy, http://www.betaled.com/us-en/TechnicalLibrary/TechnicalDocuments/TheEdgeCanopy.aspx, retrieved May 18, 2011.
Beta LED, The Edge LED Parking Structure, http://www.betaled.com/us-en/TechnicalLibrary/TechnicalDocuments/TheEdgeParking.aspx, retrieved May 18, 2011.
Color Kinetics, eW Cove EC Powercore line, http://www.colorkinetics.com/support/datasheets/eW_Cove_EC_Powercore_2700K_12in_SpecSheet.pdf, retrieved May 18, 2011.
Color Kinetics, eW Cove MX Powercore line, http://www.colorkinetics.com/support/datasheets/eW_Cove_MX_Powercore_2700K_Wide_Beam_Angle_SpecSheet.pdf, retrieved May 18, 2011.
Color Kinetics, eW Cove QLX Powercore line, http://www.colorkinetics.com/support/datasheets/eW_Cove_QLX_Powercore_6in_110degreex110degree.pdf, retrieved May 18, 2011.
Color Kinetics, eW Fuse Powercore line, http://www.colorkinetics.com/support/datasheets/eW_Fuse_Powercore_2700K_10degree_x_60degree.pdf, retrieved May 18, 2011.
Color Kinetics, eW Graze Powercore line, http://www.colorkinetics.com/support/datasheets/eW_Graze_Powercore_SpecSheet_2700K_10x60.pdf, retrieved May 18, 2011.
Examination Report dated May 10, 2013 in AU 2009236311.
Final Office Action, dated Nov. 19, 2013, in U.S. Appl. No. 12/831,358.
Final Office Action, issued Oct. 10, 2013, in corresponding U.S. Appl. No. 12/828,495.
International Application Status Report dated Aug. 9, 2012, for PCT/US2009/040514, pp. 1-2.
International Preliminary Report on Patentability of PCT/US2009/040514 dated Jun. 26, 2009.
International Preliminary Report on Patentability of PCT/US2011/059334 dated Feb. 2, 2013.
International Search Report and Written Opinion in PCT/US2012/063372 mailed Mar. 19, 2013, pp. 1-18.
International Search Report and Written Opinion of PCT/US2012/029834 dated Jul. 12, 2012, pp. 1-10.
International Search Report and Written Opinion of PCT/US2013/031790 dated Jun. 3, 2013.
Non-Final Office Action, dated Jan. 10, 2014, in U.S. Appl. No. 12/827,209.
Non-Final Office Action, issued Oct. 2, 2013, in corresponding U.S. Appl. No. 12/832,211.
Non-Final Office Action, issued Sep. 10, 2013, in corresponding U.S. Appl. No. 12/817,425.
Non-Final Office Action, mailed Nov. 21, 2013, in U.S. Appl. No. 12/831,476.
Non-Final Office Action, mailed Oct. 21, 2013, in U.S. Appl. No. 12/832,211.
Notice of Allowance issued in U.S. Appl. No. 12/823,195 dated Dec. 12, 2011.
Notice of Allowance issued in U.S. Appl. No. 12/823,195 dated Oct. 27, 2011.
Notice of Allowance, issued Oct. 2, 2013, in corresponding U.S. Appl. No. 12/827,336.
Office Action dated Apr. 11, 2012 from U.S. Appl. No. 12/831,476.
Office Action dated Apr. 2, 2012 from U.S. Appl. No. 12/822,577.
Office Action dated Jun. 27, 2011 from U.S. Appl. No. 12/423,543.
Office Action dated Mar. 5, 2012 from U.S. Appl. No. 12/830,868.
Office Action dated Nov. 3, 2011 from U.S. Appl. No. 12/817,425.
Supplementary European Search Report for EP09732558.3 dated Aug. 23, 2012, pp. 1-8.
US Notice of Allowance on U.S. Appl. No. 12/423,543 mailed Feb. 8, 2012, pp. 1-39.
US Notice of Allowance on U.S. Appl. No. 12/423,543 mailed Apr. 11, 2012, pp. 1-15.
US Notice of Allowance on U.S. Appl. No. 12/423,543 mailed Jun. 21, 2012, pp. 1-4.
US Notice of Allowance on U.S. Appl. No. 12/822,577 mailed Mar. 15, 2013, pp. 1-17.
US Notice of Allowance on U.S. Appl. No. 12/822,421 mailed Mar. 1, 2013, pp. 1-14.
US Notice of Allowance on U.S. Appl. No. 12/824,797 mailed Nov. 9, 2012, pp. 1-11.
US Notice of Allowance on U.S. Appl. No. 12/827,397 mailed Oct. 29, 2012, pp. 1-19.
US Notice of Allowance on U.S. Appl. No. 12/828,340 mailed Nov. 21, 2012, pp. 1-16.
US Notice of Allowance on U.S. Appl. No. 12/830,868 mailed Mar. 25, 2013, pp. 1-21.
US Notice of Allowance on U.S. Appl. No. 12/830,868 dated Jun. 24, 2013, pp. 1-8.
US Notice of Allowance on U.S. Appl. No. 12/833,332 mailed Mar. 21, 2013, pp. 1-11.
US Notice of Allowance on U.S. Appl. No. 12/833,181 dated May 23, 2013, pp. 1-34.
US Office Action on 099431-0129, dated Aug. 13, 2012.
US Office Action on U.S. Appl. No. 12/817,425 mailed Apr. 30, 2012, pp. 1-28.
US Office Action on U.S. Appl. No. 12/822,421 mailed Jan. 19, 2012, pp. 1-35.
US Office Action on U.S. Appl. No. 12/822,421 mailed Sep. 12, 2012, pp. 1-37.
US Office Action on U.S. Appl. No. 12/822,577 mailed Oct. 11, 2012, pp. 1-35.
US Office Action on U.S. Appl. No. 12/824,797 mailed Jun. 29, 2012, pp. 1-22.
US Office Action on U.S. Appl. No. 12/827,336 dated Jun. 13, 2013, pp. 1-20.
US Office Action on U.S. Appl. No. 12/827,336 mailed Oct. 4, 2012, pp. 1-44.
US Office Action on U.S. Appl. No. 12/827,397 mailed Jul. 11, 2012, pp. 1-22.
US Office Action on U.S. Appl. No. 12/828,385 mailed Mar. 19, 2013 pp. 1-22.
US Office Action on U.S. Appl. No. 12/828,495 mailed Mar. 28, 2013, pp. 1-25.
US Office Action on U.S. Appl. No. 12/828,340 mailed Jul. 2, 2012, pp. 1-24.
US Office Action on U.S. Appl. No. 12/828,385 mailed Sep. 12, 2012, pp. 1-17.
US Office Action on U.S. Appl. No. 12/828,495 mailed Dec. 12, 2012, pp. 1-31.
US Office Action on U.S. Appl. No. 12/828,495 mailed May 17, 2012, pp. 1-15.
US Office Action on U.S. Appl. No. 12/831,358 dated Jun. 13, 2013, pp. 1-18.
US Office Action on U.S. Appl. No. 12/831,476 dated Jul. 23, 2013, pp. 1-45.
US Office Action on U.S. Appl. No. 12/831,476 mailed Oct. 17, 2012, pp. 1-36.
US Office Action on U.S. Appl. No. 12/831,476 mailed Feb. 13, 2013, pp. 1-46.
US Office Action on U.S. Appl. No. 12/832,179 dated Jul. 17, 2013, pp. 1-19.
US Office Action on U.S. Appl. No. 12/832,179 mailed Mar. 13, 2013, pp. 1-25.
US Office Action on U.S. Appl. No. 12/832,179 mailed Sep. 12, 2012, pp. 1-19.
US Office Action on U.S. Appl. No. 12/832,211 dated Jun. 20, 2013, pp. 1-24.
US Office Action on U.S. Appl. No. 12/832,211 mailed Sep. 12, 2012, pp. 1-18.
US Office Action on U.S. Appl. No. 12/833,181 mailed Sep. 12, 2012, pp. 1-19.
US Office Action on U.S. Appl. No. 12/833,332 mailed Nov. 23, 2012, pp. 1-15.
US Office Action on U.S. Appl. No. 12/833,332 mailed Aug. 20, 2012, pp. 1-21.
Vainio, A.-M. et al., Learning and adaptive fuzzy control system for smart home, Mar. 2008, http://www.springerlink.com/content/ll72k32006l4qx81/fulltext.pdf, 10 pages.
Vishal Garg, N.K. Bansal, Smart occupancy sensors to reduce energy consumption, Energy and Buildings, vol. 32, Issue 1, Jun. 2000, pp. 81-87, ISSN 0378-7788, 10.1016/S0378-7788(99)00040-7. (http://www.sciencedirect.com/science/article/pii/S037877889.
ZigBee Alliance “Wireless Sensors and Control Networks: Enabling New Opportunities with ZigBee”, Bob Heile, Chairman, ZigBee Alliance, Dec. 2006 Powerpoint Presentation.
ZigBee Alliance Document No. 08006r03, Jun. 2008, ZigBee-200y Layer PICS and Stack Profile, Copyright © 1996-2008 by the ZigBee Alliance, 2400 Camino Ramon, Suite 375, San Ramon, CA 94583, USA; http://www.zigbee.org.
ZigBee Specification Document 053474r17, Notice of Use and Disclosure, Jan. 17, 2008, 11:09 A.M., sponsored by the ZigBee Alliance; Copyright © 2007 ZigBee Standards Organization. All rights reserved.
Progress Report: Reducing Barriers To Use of High Efficiency Lighting Systems, Oct. 2001 (http://www.lrc.rpi.edu/researchAreas/reducingBarriers/pdf/year1FinalReport.pdf), 108 pages.
International Search Report in International Application No. PCT/US2009/040514 mailed Jun. 26, 2009, 4 pages.
Written Opinion in International Application No. PCT/US2009/040514, dated Jun. 26, 2009, 3 pages.
International Preliminary Report on Patentability in International Application No. PCT/US2012/029834, dated Sep. 24, 2013, 7 pages.
Office Action in U.S. Appl. No. 12/831,358, mailed Nov. 19, 2013, 16 pages.
Notice of Allowance in U.S. Appl. No. 12/828,495, mailed Feb. 19, 2014, 8 pages.
Notice of Allowance in U.S. Appl. No. 14/045,679, mailed Feb. 20, 2014, 8 pages.
Office Action in U.S. Appl. No. 12/832,179, mailed Feb. 21, 2014, 16 pages.
Advisory Action in U.S. Appl. No. 12/831,358, mailed Feb. 27, 2014, 2 pages.
Office Action in U.S. Appl. No. 12/817,425, mailed Mar. 27, 2014, 16 pages.
Notice of Allowance in U.S. Appl. No. 12/832,211, mailed Apr. 23, 2014, 10 pages.
International Preliminary Report on Patentability in International Application No. PCT/US2012/063372, dated Mar. 19, 2013, 14 pages.
Office Action in U.S. Appl. No. 13/425,295, mailed Jun. 10, 2014, 12 pages.
Notice of Allowance in U.S. Appl. No. 12/831,476, mailed Jun. 11, 2014, 5 pages.
Notice of Acceptance in Australian Application No. 2009236311, dated Jun. 12, 2014, 2 pages.
Notice of Allowance in U.S. Appl. No. 12/832,179, mailed Aug. 1, 2014, 9 pages.
Examination Report in Australian Patent Application No. 2011323165, dated Aug. 22, 2014, 3 pages.
Notice of Allowance in U.S. Appl. No. 12/831,358, mailed Aug. 29, 2014, 9 pages.
Final Office Action in U.S. Appl. No. 12/817,425, mailed Sep. 15, 2014, 17 pages.
International Search Report and Written Opinion in International Application No. PCT/US2014/35990, mailed Sep. 18, 2014, 11 pages.
International Preliminary Report on Patentability in International Application No. PCT/US2013/031790, mailed Sep. 23, 2014, 10 pages.
Restriction Requirement in U.S. Appl. No. 14/294,081, mailed Oct. 9, 2014, 6 pages.
Examination Report in Australian Patent Application No. 2012230991, dated Nov. 18, 2014, 3 pages.
Restriction Requirement in U.S. Appl. No. 12/817,425, mailed Dec. 10, 2014, 6 pages.
Final Office Action in U.S. Appl. No. 13/425,295, mailed Jan. 2, 2015, 17 pages.
Office Action in U.S. Appl. No. 14/294,082, mailed Jan. 2, 2015, 10 pages.
Office Action in U.S. Appl. No. 14/294,081, mailed Jan. 22, 2015, 7 pages.
International Search Report and Written Opinion in International Application No. PCT/US2014/060095, mailed Jan. 29, 2015, 16 pages.
Office Action in U.S. Appl. No. 14/289,601, mailed Jan. 30, 2015, 6 pages.
Office Action in U.S. Appl. No. 14/245,196, mailed Feb. 9, 2015, 13 pages.
Examination Report in Australian Patent Application No. 2012332206, dated Feb. 12, 2015, 3 pages.
Office Action in U.S. Appl. No. 12/817,425, mailed Feb. 25, 2015, 6 pages.
Related Publications (1)
Number Date Country
20120143357 A1 Jun 2012 US
Provisional Applications (1)
Number Date Country
61409991 Nov 2010 US