This application claims priority to U.S. Provisional Patent Application No. 61/494,987, filed Jun. 9, 2011.
In various situations, motion and/or occupancy of individuals in a room may be detected for various reasons. For example, the lighting and/or climate controls may be altered based on occupancy and/or the motion in the room. Altering the lighting, climate, etc. based on the motion and/or occupancy of a room by individuals may reduce energy costs.
A computer implemented method for sensing occupancy of a workspace includes creating a difference image that represents luminance differences of pixels between past and current images of the workspace resulting from motion in the workspace, determining motion occurring in regions of the workspace based on the difference image, and altering a workspace environment based at least in part on the determined motion. The method also includes determining which pixels in the difference image represent persistent motion that can be ignored and determining which pixels representing motion in the difference image are invalid because the pixels are isolated from other pixels representing motion.
Another example of a computer implemented method for sensing occupancy of a workspace includes creating a difference image that represents luminance differences of pixels in two sequential images of the workspace resulting from motion in the workspace, the difference image including motion pixels and non-motion pixels, determining which motion pixels are invalid by comparing the motion pixels to adjacent pixels and changing the motion pixels to non-motion pixels if the adjacent pixels are not also motion pixels, creating an updated difference image with the pixels changed from motion pixels to non-motion pixels, and altering a workspace environment based at least in part on the updated difference image.
A further computer implemented method for sensing occupancy of a workspace includes creating a difference image that represents luminance differences of pixels in two sequential images of the workspace resulting from motion in the workspace, the difference image including motion pixels and non-motion pixels, creating a persistence image having pixels corresponding to pixels of the difference image, and altering a workspace environment based at least in part on the persistence and difference images. A value of a pixel in the persistence image is increased each time a corresponding motion pixel is identified in the difference image, the value being decreased when a corresponding non-motion pixel is identified in the difference image, wherein the pixel in the persistence image is ignored when the value exceeds a threshold value.
The accompanying drawings illustrate a number of exemplary embodiments and are a part of the specification. Together with the following description, these drawings demonstrate and explain various principles of the instant disclosure.
While the embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, the exemplary embodiments described herein are not intended to be limited to the particular forms disclosed. Rather, the instant disclosure covers all modifications, equivalents, and alternatives falling within the scope of the appended claims.
Building efficiency and energy conservation are becoming increasingly important in our society. One way to conserve energy is to power devices in a workspace only when those devices are needed. Many types of devices are needed only when the user is within a workspace or in close proximity to such devices. One scenario is an office workspace that includes a plurality of electronic devices such as lighting, heating and air conditioning, computers, telephones, etc. One aspect of the present disclosure relates to monitoring the presence of an occupant within the office workspace, and turning on and off at least some of the electronic devices based on the user's proximity to the office workspace.
An occupancy sensor system and related methods may be used to determine when an occupant's current location is within a given workspace. A sequence of images of the workspace may be used to determine the occupant's location, allowing motion data to be extracted from the images. The current and past motion extracted from the images constitutes what may be referred to as motion history. Occupancy information may be used, for example, so that lighting within the space may be adjusted to reduce energy consumption. Another example is altering room heating or cooling, or providing other environmental controls, in response to determining the occupant's location. The space in which occupancy is monitored and sensed is referred to as a workspace, and typically has physical boundaries. For example, the workspace may have one or more fixed entrance locations that are monitored relative to other portions of the workspace.
Referring now to
The workspace modifier module 102 may be positioned remotely from the rooms 106a-c. Alternatively, the workspace modifier module 102 may be housed locally in close proximity to one or more of the rooms 106a-c. Furthermore, the rooms 106a-c may be positioned adjacent to each other or be positioned at locations remote from each other. While a plurality of rooms 106a-c is shown in
Referring to
The differencing module 114 may compare past and current images and create the difference image as described below with reference to
The occupy sensor 108a includes various components and modules in addition to an occupy sensing module 112, as shown in
Other types of sensors may be associated with the system 100 of
Referring now to
Referring now to
Each time a pixel in the corrected difference image 184 does not represent valid motion, the value of the corresponding pixel in the persistence image 186 is decremented. In one embodiment, the persistence image is decremented by 1, but may not go below 0. If the value of a pixel in a persistence image 186 is above a predetermined threshold, that pixel is considered to represent persistent motion. Persistent motion is motion that reoccurs often enough that it should be ignored (e.g., a fan blowing in an office workspace). In the example of
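To make the persistence mechanism concrete, the following is a minimal sketch in Python/NumPy under stated assumptions: the persistence image is a signed integer array (avoiding unsigned wraparound), and the function name and the threshold value of 30 are illustrative rather than taken from the disclosure.

```python
import numpy as np

def update_persistence(persistence, corrected_diff, threshold=30):
    """Sketch of the persistence-image update described above.

    `persistence` is a signed integer array with the same shape as the
    binary `corrected_diff`; the threshold of 30 is a hypothetical value.
    """
    motion = corrected_diff == 1
    # Increment where the corrected difference image shows valid motion.
    persistence[motion] += 1
    # Decrement elsewhere, never dropping below 0.
    persistence[~motion] = np.maximum(persistence[~motion] - 1, 0)
    # Pixels whose value exceeds the threshold represent persistent motion
    # (e.g., a fan blowing in the workspace) and are ignored downstream.
    ignore = persistence > threshold
    return persistence, ignore
```

The returned ignore mask can then be used to exclude persistent-motion pixels from the region motion evaluation described below.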
Referring now to
The room 106a may include a further border, referred to as ignore border 146. In the event the sensor 108a is able to see through the entrance of the room 106a (e.g., through an open door) into a space beyond border I 140, movement within the ignore border 146 may be masked and ignored.
A state machine may be updated using the triggers generated in the region motion evaluation discussed with reference to
Proper placement and sizing of the borders shown in
Border II 144 may be placed around at least a portion of the periphery of border I 140. Border II 144 may surround all peripheral surfaces of border I 140 that are otherwise exposed to the workspace 110a. Border II 144 may be large enough that the system can detect the occupant's presence in border II 144 separate and distinct from detecting the occupant's presence in border I 140.
The ignore border 146 may also be rectangular in shape (or any other suitable shape) and is placed adjacent to border I 140 adjacent to the door opening. The ignore border 146 may be used to mask pixels in the image (e.g., image 180) that are outside of the workspace 110a, but that are visible in the image. Any motion within the ignore border 146 is typically ignored.
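As an illustration of this masking, the sketch below (Python/NumPy, with hypothetical helper names and coordinates) builds a rectangular ignore-border mask and zeroes out any difference-image pixels that fall within it.

```python
import numpy as np

def rectangular_mask(shape, top, left, bottom, right):
    """Boolean mask covering a rectangular ignore border; the rectangle
    coordinates are assumptions chosen for the example."""
    mask = np.zeros(shape, dtype=bool)
    mask[top:bottom, left:right] = True
    return mask

def apply_ignore_border(diff_image, ignore_mask):
    """Zero out motion pixels inside the ignore border so that motion
    visible through the door opening never contributes to triggers."""
    masked = diff_image.copy()
    masked[ignore_mask] = 0
    return masked
```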
A state machine may be used to help define the behavior of the occupy sensor and related system and methods. In one example, there are four states in the state machine: not occupy, border I motion, border II motion, and workspace occupy. Other examples may include more or fewer states depending on, for example, the number of borders established in the workspace. The not occupy state may be valid initially and when the occupant has moved from border I to somewhere outside of the workspace. If the occupant moves from border I to somewhere outside of the workspace, the workspace environment may be altered (e.g., lights being turned off). The border I motion state may be valid when the occupant has moved into border I from either outside the workspace or from within the workspace. The border II motion state may be valid when the occupant has moved into border II from either border I or the workspace. If the occupant enters border II from border I, workspace environment may be altered (e.g., the workspace lights are turned on). The workspace occupy state may be valid when the occupant has moved into the workspace from either border I or border II.
A motion disappear trigger 158 may result, for example, in lights being turned off, and may occur as the occupant moves from border I 140 into the ignore border 146. A border I motion trigger 160 may occur as the occupant moves from outside of the workspace 110a into border I 140. A border II motion trigger 162, resulting, for example, in turning a light on, may occur as the occupant moves from border I 140 to border II 144. A border I motion trigger 164 may occur as the occupant moves from border II 144 to border I 140. A workspace motion trigger 166 may occur as the occupant moves from border II 144 to the workspace 110a. A border II motion trigger 168 may occur when an occupant moves from the workspace 110a to border II 144. A workspace motion trigger 170 may occur as the occupant moves from border I 140 to the workspace 110a. A border I motion trigger 172 may occur as the occupant moves from the workspace 110a to border I 140.
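One possible reading of the four states and triggers 158-172 is the small transition table below. This is an illustrative Python sketch; the trigger strings, action names, and table representation are assumptions, not the patent's literal implementation.

```python
from enum import Enum, auto

class State(Enum):
    NOT_OCCUPIED = auto()
    BORDER_I_MOTION = auto()
    BORDER_II_MOTION = auto()
    WORKSPACE_OCCUPIED = auto()

# (current state, trigger) -> (next state, optional action), following the
# trigger descriptions 158-172 above.
TRANSITIONS = {
    (State.BORDER_I_MOTION, "motion_disappear"): (State.NOT_OCCUPIED, "lights_off"),
    (State.NOT_OCCUPIED, "border_i_motion"): (State.BORDER_I_MOTION, None),
    (State.BORDER_I_MOTION, "border_ii_motion"): (State.BORDER_II_MOTION, "lights_on"),
    (State.BORDER_II_MOTION, "border_i_motion"): (State.BORDER_I_MOTION, None),
    (State.BORDER_II_MOTION, "workspace_motion"): (State.WORKSPACE_OCCUPIED, None),
    (State.WORKSPACE_OCCUPIED, "border_ii_motion"): (State.BORDER_II_MOTION, None),
    (State.BORDER_I_MOTION, "workspace_motion"): (State.WORKSPACE_OCCUPIED, None),
    (State.WORKSPACE_OCCUPIED, "border_i_motion"): (State.BORDER_I_MOTION, None),
}

def step(state, trigger, act=print):
    """Advance the state machine; unknown (state, trigger) pairs are ignored."""
    next_state, action = TRANSITIONS.get((state, trigger), (state, None))
    if action:
        act(action)  # e.g., hand off to lighting or climate control
    return next_state
```

For example, `step(State.BORDER_I_MOTION, "border_ii_motion")` fires the lights-on action and returns the border II motion state.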
Referring now to
Step 210 includes creating a data structure with dimensions M×N to store a binary difference image. Step 212 includes creating a data structure with dimensions M×N to store the previous image. Step 214 includes creating a data structure with dimensions M×N to store a persistent motion image. The following step 216 includes copying the current image to the previous-image data structure. In step 218, for each pixel in the current image, if the absolute value of the difference between the current pixel and the corresponding pixel in the previous image is greater than a threshold, the corresponding value in the difference image is set to 1; otherwise, it is set to 0. Step 220 includes, for each pixel in the difference image set to 1, leaving the value of the pixel at 1 if the pixel is not on any edge of the image and all nearest neighbor pixels are set to 1; otherwise, the pixel value is set to 0.
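A minimal sketch of steps 216-220 in Python/NumPy might look like the following; the threshold value of 15 and the function names are assumptions, and the images are assumed to be 8-bit grayscale arrays.

```python
import numpy as np

def difference_image(current, previous, threshold=15):
    """Steps 216-218: threshold absolute luminance differences into a
    binary difference image (the threshold value is illustrative)."""
    delta = np.abs(current.astype(np.int16) - previous.astype(np.int16))
    return (delta > threshold).astype(np.uint8)

def remove_isolated_motion(diff):
    """Step 220: a motion pixel survives only if it is off the image edge
    and all eight nearest neighbors are also motion pixels."""
    out = np.zeros_like(diff)
    # AND together the pixel and its eight neighbors over the interior.
    interior = (
        diff[1:-1, 1:-1] & diff[:-2, :-2] & diff[:-2, 1:-1] & diff[:-2, 2:] &
        diff[1:-1, :-2] & diff[1:-1, 2:] &
        diff[2:, :-2] & diff[2:, 1:-1] & diff[2:, 2:]
    )
    out[1:-1, 1:-1] = interior
    return out
```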
A step 226 includes determining whether the corresponding pixel in the difference image is set to 0. If so, step 230 includes decrementing the value of the corresponding pixel in the persistence image by 1, without decreasing below the value of 0. If the corresponding pixel in the difference image is set to 1, the condition in step 222 is yes, and the condition in step 228 is no, then a further step 234 includes setting the value of the corresponding pixel in the motion history image to 255 or some other predefined value. A step 236 includes increasing the dimensions of the motion region rectangle to include this pixel. A count of pixels with motion is incremented by 1 in step 238.
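Steps 226-238 might be sketched as follows. Because the conditions tested in steps 222 and 228 are not spelled out here, the persistence-image check below stands in for them as an assumption; the decay of the motion history image follows the description of method 300 below, and the start value of 255 follows step 234.

```python
import numpy as np

def update_motion_history(history, diff, persistence, persist_threshold=30,
                          start_value=255, decay=1):
    """Sketch of steps 226-238. `history` and `persistence` are signed
    integer arrays (avoiding uint8 wraparound) shaped like `diff`."""
    # Decay the history image where no motion is present, never dropping
    # below 0 (the matching persistence decrement is sketched earlier).
    no_motion = diff == 0
    history[no_motion] = np.maximum(history[no_motion] - decay, 0)
    # Step 234: stamp valid (non-persistent) motion pixels with 255 or
    # some other predefined value.
    valid = (diff == 1) & (persistence <= persist_threshold)
    history[valid] = start_value
    # Steps 236-238: grow the motion region rectangle to cover the valid
    # motion pixels and count them.
    ys, xs = np.nonzero(valid)
    count = len(ys)
    rect = ((ys.min(), xs.min()), (ys.max(), xs.max())) if count else None
    return history, rect, count
```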
Many other methods may be possible in accordance with the various systems, embodiments and features disclosed herein. An example method 300 is shown with reference to
Another example step may include determining the number of pixels representing motion in each of a plurality of regions in the workspace and creating a trigger representative of which region has the most pixels representing motion. The method 300 may include evaluating the triggers based on a preset priority. The method 300 may include updating a state machine using a signal representing the evaluated triggers, wherein the state machine controls the workspace environment.
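The region counting and trigger creation might be sketched as below; the region masks and the trigger names (chosen to match the state-machine sketch above) are assumptions.

```python
import numpy as np

def region_trigger(diff, regions):
    """Count motion pixels per region and return a trigger naming the
    region with the most motion, or None if no motion is present.

    `regions` maps a trigger name (e.g., 'border_i_motion') to a boolean
    mask with the same shape as the binary difference image `diff`.
    """
    counts = {name: int(diff[mask].sum()) for name, mask in regions.items()}
    best = max(counts, key=counts.get)
    return best if counts[best] > 0 else None
```

The resulting trigger can then feed the `step` function of the state-machine sketch above.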
Creating the difference image according to method 300 may include comparing an absolute value of the difference in luminance between the images to a threshold value, setting a corresponding pixel in the difference image to 1 if the difference is greater than the threshold value, and setting the corresponding pixel in the difference image to 0 if the difference is less than the threshold value. The method 300 may further include comparing every pixel representing motion to its immediate neighboring pixels and, if the immediate neighboring pixels are not also pixels representing motion, changing a value of the pixel. The method 300 may include creating a persistence image having a pixel corresponding to each pixel in the difference image, wherein the pixels in the persistence image are incremented in value each time a pixel in the difference image is determined to represent motion. The method 300 may include creating a motion history image representing how recently motion was detected in each pixel representing motion in the difference image. A further step may include assigning a pixel value to pixels in the motion history image corresponding to pixels in the difference image representing motion, and subtracting from the pixel value each time a pixel in the current image is determined to not represent motion.
Another example step of method 400 may include comparing the motion pixels to a plurality of immediately adjacent or neighboring pixels (e.g., 8 immediate neighboring pixels). The method 400 may include determining the number of motion pixels in each of a plurality of regions in the workspace and creating a trigger representative of the region that has the most motion pixels. Method 400 may include updating a state machine using the triggers, wherein the state machine controls the workspace environment. Another step may include creating a persistence image having a pixel corresponding to each pixel in a difference image, wherein the pixels in the persistence image are incremented in value each time a pixel in the difference image is determined to represent motion. The method 400 may include creating a motion history image representing how recently motion was detected in each motion pixel.
The method 500 may also include determining motion occurring in regions of the workspace based at least in part on the persistence and difference images. The method may also include creating a motion history image representing how recently motion was detected in each motion pixel. The method 500 may include determining the number of pixels representing motion in each of the regions in the workspace and creating a trigger representative of the region having the most pixels representing motion. The method may include evaluating the triggers based on a preset priority. The method 500 may further include updating a state machine using a signal representing the trigger, the state machine controlling alterations to the workspace environment.
Bus 610 allows data communication between central processor 604 and system memory 606, which may include read-only memory (ROM) or flash memory (neither shown), and random access memory (RAM) (not shown), as previously noted. The RAM is generally the main memory into which the operating system and application programs are loaded. The ROM or flash memory can contain, among other code, the Basic Input-Output System (BIOS), which controls basic hardware operation such as the interaction with peripheral components or devices. For example, the occupy sensing module 112 used to implement the present systems and methods may be stored within the system memory 606. The occupy sensing module 112 may be an example of the occupy sensing module of
Communications interface 608 may provide a direct connection to a remote server or to the Internet via an internet service provider (ISP). Communications interface 608 may provide a direct connection to a remote server via a direct network link to the Internet via a POP (point of presence). Communications interface 608 may provide such connection using wireless techniques, including digital cellular telephone connection, Cellular Digital Packet Data (CDPD) connection, digital satellite data connection, or the like.
Many other devices or subsystems (not shown) may be connected in a similar manner. Conversely, all of the devices shown in
Moreover, regarding the signals described herein, those skilled in the art will recognize that a signal can be directly transmitted from a first block to a second block, or a signal can be modified (e.g., amplified, attenuated, delayed, latched, buffered, inverted, filtered, or otherwise modified) between the blocks. Although the signals of the above described embodiment are characterized as transmitted from one block to the next, other embodiments of the present systems and methods may include modified signals in place of such directly transmitted signals as long as the informational and/or functional aspect of the signal is transmitted between blocks. To some extent, a signal input at a second block can be conceptualized as a second signal derived from a first signal output from a first block due to physical limitations of the circuitry involved (e.g., there will inevitably be some attenuation and delay). Therefore, as used herein, a second signal derived from a first signal includes the first signal or any modifications to the first signal, whether due to circuit limitations or due to passage through other circuit elements which do not change the informational and/or final functional aspect of the first signal.
While the foregoing disclosure sets forth various embodiments using specific block diagrams, flowcharts, and examples, each block diagram component, flowchart step, operation, and/or component described and/or illustrated herein may be implemented, individually and/or collectively, using a wide range of hardware, software, or firmware (or any combination thereof) configurations. In addition, any disclosure of components contained within other components should be considered exemplary in nature since many other architectures can be implemented to achieve the same functionality.
The process parameters and sequence of steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed. The various exemplary methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.
Furthermore, while various embodiments have been described and/or illustrated herein in the context of fully functional electronic devices, one or more of these exemplary embodiments may be distributed as a program product in a variety of forms, regardless of the particular type of computer-readable media used to actually carry out the distribution. The embodiments disclosed herein may also be implemented using software modules that perform certain tasks. These software modules may include script, batch, or other executable files that may be stored on a computer-readable storage medium or in an electronic device. In some embodiments, these software modules may configure an electronic device to perform one or more of the exemplary embodiments disclosed herein.
The foregoing description, for purpose of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the invention to the precise forms disclosed. Many modifications and variations are possible in view of the above teachings. The embodiments were chosen and described in order to best explain the principles of the present systems and methods and their practical applications, to thereby enable others skilled in the art to best utilize the present systems and methods and various embodiments with various modifications as may be suited to the particular use contemplated.
Unless otherwise noted, the terms “a” or “an,” as used in the specification and claims, are to be construed as meaning “at least one of.” In addition, for ease of use, the words “including” and “having,” as used in the specification and claims, are interchangeable with and have the same meaning as the word “comprising.”
This invention was made with government support under Contract No. DE-EE0003114 awarded by the U.S. Department of Energy. The government has certain rights in the invention.
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/US2012/041673 | 6/8/2012 | WO | 00 | 12/9/2013 |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2012/170898 | 12/13/2012 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
4937878 | Lo et al. | Jun 1990 | A |
5121201 | Seki | Jun 1992 | A |
5489827 | Xia | Feb 1996 | A |
5528698 | Kamei et al. | Jun 1996 | A |
5684887 | Lee et al. | Nov 1997 | A |
5835613 | Breed et al. | Nov 1998 | A |
5845000 | Breed et al. | Dec 1998 | A |
6005958 | Farmer et al. | Dec 1999 | A |
6081606 | Hansen et al. | Jun 2000 | A |
6141041 | Carlbom et al. | Oct 2000 | A |
6141432 | Breed et al. | Oct 2000 | A |
6198998 | Farmer et al. | Mar 2001 | B1 |
6324453 | Breed et al. | Nov 2001 | B1 |
6335985 | Sambonsugi et al. | Jan 2002 | B1 |
6393133 | Breed et al. | May 2002 | B1 |
6608910 | Srinivasa et al. | Aug 2003 | B1 |
6841780 | Cofer et al. | Jan 2005 | B2 |
6870945 | Schoepflin et al. | Mar 2005 | B2 |
6961443 | Mahbub | Nov 2005 | B2 |
7508979 | Comaniciu et al. | Mar 2009 | B2 |
7511613 | Wang | Mar 2009 | B2 |
7796780 | Lipton et al. | Sep 2010 | B2 |
7801330 | Zhang et al. | Sep 2010 | B2 |
8334906 | Lipton et al. | Dec 2012 | B2 |
20020104094 | Alexander et al. | Aug 2002 | A1 |
20020122570 | Paragios et al. | Sep 2002 | A1 |
20020181742 | Wallace et al. | Dec 2002 | A1 |
20020181743 | Khairallah et al. | Dec 2002 | A1 |
20030002738 | Cooper | Jan 2003 | A1 |
20030021445 | Larice et al. | Jan 2003 | A1 |
20030044045 | Schoepflin et al. | Mar 2003 | A1 |
20030123704 | Farmer et al. | Jul 2003 | A1 |
20030133595 | Farmer et al. | Jul 2003 | A1 |
20030223617 | Wallace et al. | Dec 2003 | A1 |
20040008773 | Itokawa | Jan 2004 | A1 |
20040151342 | Venetianer et al. | Aug 2004 | A1 |
20040234137 | Weston et al. | Nov 2004 | A1 |
20040247158 | Kohler et al. | Dec 2004 | A1 |
20050002544 | Winter et al. | Jan 2005 | A1 |
20050058322 | Farmer et al. | Mar 2005 | A1 |
20050196015 | Luo et al. | Sep 2005 | A1 |
20050201591 | Kiselewich | Sep 2005 | A1 |
20050271280 | Farmer et al. | Dec 2005 | A1 |
20060170769 | Zhou | Aug 2006 | A1 |
20070127824 | Luo et al. | Jun 2007 | A1 |
20070176402 | Irie et al. | Aug 2007 | A1 |
20070177800 | Connell | Aug 2007 | A1 |
20080079568 | Primous et al. | Apr 2008 | A1 |
20080226172 | Connell | Sep 2008 | A1 |
20080273754 | Hick et al. | Nov 2008 | A1 |
20090087025 | Ma | Apr 2009 | A1 |
20100322476 | Kanhere et al. | Dec 2010 | A1 |
20110260871 | Karkowski | Oct 2011 | A1 |
Number | Date | Country |
---|---|---|
1020070104999 | Oct 2007 | KR |
1020080103586 | Nov 2008 | KR |
100883065 | Feb 2009 | KR |
1020100121020 | Nov 2010 | KR |
Entry |
---|
Intille, Stephen et al., “Real-Time Closed-World Tracking,” Proceedings of the 1997 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 697-703, Jun. 1997. |
Zhou, Xuhui et al., “A Master-Slave System to Acquire Biometric Imagery of Humans at Distance,” Proceedings of the First ACM SIGMM International Workshop on Video Surveillance (IWVS '03), pp. 113-120, 2003. |
International Search Report and Written Opinion for PCT/US2012/041673, Dec. 28, 2012. |
Bradski, G. and Davis, J., Motion Segmentation and Pose Recognition with Motion History Gradients, Machine Vision and Applications, vol. 13, 2002, pp. 174-184. |
Number | Date | Country | |
---|---|---|---|
20140093130 A1 | Apr 2014 | US |
Number | Date | Country | |
---|---|---|---|
61494987 | Jun 2011 | US |