This patent specification relates to motion sensors, and in particular to passive infrared motion sensors.
This section is intended to introduce the reader to various aspects of art that may be related to various aspects of the present techniques, which are described and/or claimed below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be understood that these statements are to be read in this light, and not as admissions of prior art.
Several types of passive infra-red sensors have been described in the prior art for detecting the presence of human occupants in an area of interest, such as a home or office. Such sensors detect the changes in the infra-red radiation falling on an infra-red detector caused by movement of an infra-red emitting intruder in the field of view of the sensor. The area under surveillance is focused onto the infra-red sensitive detector by an array of lenses that produce a number of discrete zones. As the occupant crosses from zone to zone, the changes in the detector output above the ambient level from the surroundings are amplified by suitable circuitry, and an alarm signal is generated.
Detector effectiveness often is improved with optics that include segmented mirrors or lenses having multiple fields-of-view. Movement of an infra-red target into or through any of the fields will produce an electrical signal at the sensor, increasing the probability of detection. A detector mounted six or seven feet high in the corner of a room, for example, may have twenty or more separate fields-of-view, sometimes called zones, covering the room both horizontally and vertically. Fields-of-view that intercept the floor will detect or “catch” intruders attempting to crawl into the protected region. At the same time, however, they also catch ground based domestic animals, such as dogs and cats. Since household pets are likely to produce false alarms whenever they are active in the protected area, detectors often are disarmed, or the pets are confined to areas not protected by the system. This causes a dilemma in households where pets that might otherwise deter intruders instead reduce system effectiveness.
Accordingly, what is needed are systems and methods for accurately distinguishing between human occupants and pets.
A summary of certain embodiments disclosed herein is set forth below. It should be understood that these aspects are presented merely to provide the reader with a brief summary of these certain embodiments and that these aspects are not intended to limit the scope of this disclosure. Indeed, this disclosure may encompass a variety of aspects that may not be set forth below.
Systems and methods that use pattern recognition to characterize stimuli captured by passive infrared motion sensors are provided. The pattern recognition can be performed by comparing one or more features extracted from motion sensor signals to known features, thereby providing enhanced pet rejection that exceeds the performance of conventional threshold-based, pet-rejecting PIR systems. In some embodiments, the known features can be obtained through simulations that accurately model the performance of motion sensors and their response to a large variety of stimuli. The simulations result in an extensive database that can be accessed by motion sensor units when performing pattern matching algorithms to determine whether the stimulus is a human or a pet.
Various refinements of the features noted above may be used in relation to various aspects of the present disclosure. Further features may also be incorporated in these various aspects as well. These refinements and additional features may be used individually or in any combination. For instance, various features discussed below in relation to one or more of the illustrated embodiments may be incorporated into any of the above-described aspects of the present disclosure alone or in any combination. The brief summary presented above is intended only to familiarize the reader with certain aspects and contexts of embodiments of the present disclosure without limitation to the claimed subject matter.
A further understanding of the nature and advantages of the embodiments discussed herein may be realized by reference to the remaining portions of the specification and the drawings.
In the following detailed description, for purposes of explanation, numerous specific details are set forth to provide a thorough understanding of the various embodiments. Those of ordinary skill in the art will realize that these various embodiments are illustrative only and are not intended to be limiting in any way. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure.
In addition, for clarity purposes, not all of the routine features of the embodiments described herein are shown or described. One of ordinary skill in the art would readily appreciate that in the development of any such actual embodiment, numerous embodiment-specific decisions may be required to achieve specific design objectives. These design objectives will vary from one embodiment to another and from one developer to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming but would nevertheless be a routine engineering undertaking for those of ordinary skill in the art having the benefit of this disclosure.
It is to be appreciated that while one or more hazard detection embodiments are described further herein in the context of being used in a residential home, such as a single-family residential home, the scope of the present teachings is not so limited. More generally, hazard detection systems are applicable to a wide variety of enclosures such as, for example, duplexes, townhomes, multi-unit apartment buildings, hotels, retail stores, office buildings, and industrial buildings. Further, it is understood that while the terms user, customer, installer, homeowner, occupant, guest, tenant, landlord, repair person, and the like may be used to refer to the person or persons who are interacting with the hazard detector in the context of one or more scenarios described herein, these references are by no means to be considered as limiting the scope of the present teachings with respect to the person or persons who are performing such actions.
Optics system 120 and PIR detection system 130 can be specifically designed to achieve a desired field of view for motion detection system 115. That is, the field of view may be designed such that each of the zones defining the field of view is assigned a specific weight. For example,
The zones and the weighting thereof discussed in
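The per-zone weighting described above can be illustrated as a weighted combination of zone contributions. The sketch below is a minimal illustration only; the zone signals and weight values are hypothetical and are not taken from the disclosure:

```python
# Hypothetical per-zone weighting: each zone of the field of view
# contributes to the composite detector output in proportion to its
# assigned weight.
def weighted_detector_signal(zone_signals, zone_weights):
    """Combine raw per-zone signals into one weighted output sample."""
    if len(zone_signals) != len(zone_weights):
        raise ValueError("one weight per zone is required")
    return sum(s * w for s, w in zip(zone_signals, zone_weights))

# Example: a floor-level zone (which a pet can reach) is de-weighted.
signals = [0.8, 0.1, 0.9]   # raw IR deltas in three zones (hypothetical)
weights = [1.0, 1.0, 0.25]  # floor zone weighted at 0.25 (hypothetical)
combined = weighted_detector_signal(signals, weights)
```

In this sketch, de-weighting floor-intercepting zones reduces the contribution of ground-based stimuli to the composite signal without discarding those zones entirely.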
The simulation based features can be obtained through simulations that use a particular hardware configuration and a multitude of stimulus factors. The hardware configuration can be an actual motion detector, or it may be a software representation of a motion detector. Thus, for any given hardware arrangement of the motion detector system, that particular arrangement can be subjected to a battery of stimuli to generate simulation based features. These simulation based features may form a basis for performing pattern matching with features extracted from motion sensor data. A database of simulation based features can be created for any number of different hardware configurations. The appropriate simulation based features may be stored in motion detection sensors that most closely resemble the hardware configuration, such that those simulation based features can be used for pattern matching.
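The idea of storing simulation based features keyed by hardware configuration, and selecting the stored set that most closely resembles a given sensor, can be sketched as follows. The configuration keys, feature values, and similarity metric below are all hypothetical illustrations:

```python
# Hypothetical database of simulation based features keyed by hardware
# configuration; a sensor loads the entry for the stored configuration
# it most closely resembles.
SIM_FEATURE_DB = {
    # (lens_count, mount_height_ft) -> simulated feature vectors per class
    (9, 7): {"human": [[1.2, 3.0]], "pet": [[0.3, 5.5]]},
    (9, 9): {"human": [[1.0, 2.8]], "pet": [[0.25, 5.2]]},
}

def features_for_config(lens_count, mount_height_ft):
    """Return simulated features for the most similar stored config."""
    key = min(
        SIM_FEATURE_DB,
        key=lambda k: abs(k[0] - lens_count) + abs(k[1] - mount_height_ft),
    )
    return SIM_FEATURE_DB[key]
```

A deployed unit with, say, a nine-lens optic mounted at nine feet would retrieve the feature set simulated for the (9, 9) configuration rather than recomputing simulations on-device.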
PIR detector system module 420 may be a digital representation of a PIR detector system. The module may model element placement 421 and element design 422. Element placement 421 may refer to the number and position of detector elements. Element design 422 may refer to the construction of each element. For example, elements may be constructed to have different compositions, shapes, substrate layouts, etc.
Field of view module 430 may be a digital representation of the motion sensor's field of view. In one embodiment, the digital representation of the field of view may be the product of modules 410 and 420. In another embodiment, the field of view may take into account any masking that may have been applied to the optical system. Module 430 may model lenslet weighting 431 to provide an accurate digital representation of a motion detector system.
Stimulus factor module 440 may represent different factors related to stimuli that may be captured by a motion detector. Stimulus factors can include, for example, distance 441, angle 442, velocity 443, mounting height 444, pet size 445, and person height 446. Distance 441, angle 442, and velocity 443 may refer to different characteristics of the movement of a stimulus within the zones of the motion sensor. Mounting height 444 may refer to the position of the motion sensor on a wall or ceiling. For example, a motion sensor positioned at nine feet (relative to the floor) will capture stimuli differently than a motion sensor mounted at seven feet. Pet size 445 may refer to the size of a pet. For example, pets can be classified in different sizes such as small (e.g., 20 pounds), medium (e.g., 40 pounds), and large (e.g., 70 pounds). Person height 446 may, as its label implies, refer to the height of a human occupant. For example, a human occupant can be modeled to have a certain minimum height (e.g., four feet, 10 inches). If desired, humans of multiple heights may be modeled.
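The stimulus factors enumerated above (distance 441, angle 442, velocity 443, mounting height 444, pet size 445, person height 446) can be collected into a simple data structure for driving simulation trials. The field names and example values below are illustrative assumptions, not part of the disclosure:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical container for the stimulus factors of module 440;
# field names and units are illustrative.
@dataclass
class StimulusFactors:
    distance_ft: float          # distance 441
    angle_deg: float            # angle 442
    velocity_fps: float         # velocity 443
    mounting_height_ft: float   # mounting height 444
    pet_size_lb: Optional[float] = None       # pet size 445 (20/40/70 lb)
    person_height_in: Optional[float] = None  # person height 446

    def is_pet(self) -> bool:
        """A trial models a pet when a pet size is specified."""
        return self.pet_size_lb is not None

# One simulated trial: a medium (40 lb) pet crossing at 3 ft/s past a
# sensor mounted at seven feet.
trial = StimulusFactors(distance_ft=12.0, angle_deg=30.0,
                        velocity_fps=3.0, mounting_height_ft=7.0,
                        pet_size_lb=40.0)
```

A battery of such trials, sweeping each factor across its range, would produce the variety of stimuli from which simulation based features are derived.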
Simulation based features module 450 may include different simulated features obtained using a combination of modules 410, 420, 430, and 440. The simulated features can include any number of raw data extractions and/or statistical interpretations of the raw data. Four illustrative features are shown in module 450. These include amplitude 451, frequency 452, phase 453, and time series of peaks 454. Amplitude 451 may refer to the magnitude of a signal provided by a motion sensor. Referring briefly to
Frequency 452 may refer to the rate at which a stimulus crosses from zone to zone. Referring briefly to
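Extraction of features such as amplitude 451, frequency 452, and time series of peaks 454 from a sampled PIR signal can be sketched as follows. This is a simplified illustration under assumed conventions: amplitude as peak magnitude, frequency approximated from zero crossings, and peaks as local maxima; none of these specific computations are prescribed by the disclosure:

```python
# Hypothetical feature extraction over a sampled PIR signal.
def extract_features(samples, sample_rate_hz):
    """Return amplitude, an approximate crossing rate, and peak times."""
    # Amplitude 451: peak magnitude of the signal.
    amplitude = max(abs(s) for s in samples)
    # Frequency 452: approximate rate of zone-to-zone transitions,
    # estimated from sign changes (two crossings per cycle).
    zero_crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    duration_s = len(samples) / sample_rate_hz
    crossing_rate_hz = zero_crossings / (2 * duration_s)
    # Time series of peaks 454: times of local maxima.
    peak_times = [
        i / sample_rate_hz
        for i in range(1, len(samples) - 1)
        if samples[i - 1] < samples[i] > samples[i + 1]
    ]
    return {"amplitude": amplitude,
            "frequency": crossing_rate_hz,
            "peak_times": peak_times}
```

The resulting dictionary of raw extractions could then be compared against the corresponding simulation based features during pattern matching.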
Pattern matching module 630 may access a pattern lookup engine 631 when attempting to match extracted features to simulation based features so that it can produce a stimulus character determination 640. Pattern lookup engine 631 may be implemented as a decision forest classifier. The decision forest classifier is one example of pattern matching that may be implemented. Other suitable pattern matching techniques may be used by pattern lookup engine 631. The simulation based patterns may be stored in a local memory (not shown) that may be updated when system 600 receives updates from server 680. Server 680 may pass simulation based patterns 681 (e.g., obtained from simulator 400) to system 600 as part of a regular or systematic update process.
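The disclosure names a decision forest classifier as one implementation of pattern lookup engine 631 and expressly permits other pattern matching techniques. As a deliberately simple stand-in, the sketch below matches an extracted feature vector to its nearest simulation based feature vector; all feature values are hypothetical:

```python
import math

# Simplified pattern lookup: nearest-neighbor match of an extracted
# feature vector against stored simulation based feature vectors.
# A decision forest classifier is the example named by the disclosure;
# this nearest-neighbor substitute is only an illustration.
SIMULATED_PATTERNS = {
    "human": [[1.2, 3.0], [1.0, 2.8]],   # [amplitude, crossing rate]
    "pet":   [[0.3, 5.5], [0.25, 5.2]],  # hypothetical values
}

def determine_character(extracted):
    """Return the class label of the closest simulated feature vector."""
    best_label, best_dist = None, math.inf
    for label, vectors in SIMULATED_PATTERNS.items():
        for v in vectors:
            d = math.dist(extracted, v)
            if d < best_dist:
                best_label, best_dist = label, d
    return best_label
```

In a deployed unit, `SIMULATED_PATTERNS` would correspond to the locally stored simulation based patterns refreshed from the server during updates.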
At step 730, the at least one extracted feature is pattern matched with simulation based features to determine a character of the stimulus. The simulation based features may have been previously obtained via a simulator such as simulator 400. In one embodiment, the simulation based features can be generated based on a plurality of stimulus factors and a software representation of the motion sensor system. For example, the motion sensor system may be embodied by one or more of optical system 410, PIR detector system 420, and field-of-view 430. The pattern matching can be performed using, for example, a pattern lookup engine or a decision forest classifier.
At step 740, an action can be executed in response to the determined character of the stimulus. In one embodiment, the action can include activating an alarm and/or alerting a remote service when the determined character of the stimulus is a human. In another embodiment, the action can include not activating an alarm when the determined character of the stimulus is a pet. If desired, the action can include notifying an owner that his or her pet is moving around the house.
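The action selection of step 740 can be sketched as a simple dispatch on the determined character of the stimulus. The action names below are illustrative placeholders, not identifiers from the disclosure:

```python
# Hypothetical action dispatch for step 740: the determined character
# of the stimulus selects the response. Action names are illustrative.
def execute_action(character, notify_owner_on_pet=False):
    """Return the list of actions taken for the determined character."""
    actions = []
    if character == "human":
        # A human stimulus activates the alarm and alerts a remote service.
        actions.append("activate_alarm")
        actions.append("alert_remote_service")
    elif character == "pet":
        # A pet stimulus does not activate the alarm; optionally notify
        # the owner that a pet is moving around the house.
        if notify_owner_on_pet:
            actions.append("notify_owner_pet_activity")
    return actions
```

Keeping the alarm decision in one dispatch point makes it straightforward to add further responses (e.g., owner notifications) without altering the pattern matching stage.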
It should be appreciated that the steps in
With reference to
Special-purpose computer system 800 can include computer 802, a monitor 806 coupled to computer 802, one or more additional user output devices 830 (optional) coupled to computer 802, one or more user input devices 840 (e.g., keyboard, mouse, track ball, touch screen) coupled to computer 802, an optional communications interface 850 coupled to computer 802, and a computer-program product 805 stored in a tangible computer-readable memory in computer 802. Computer-program product 805 directs computer system 800 to perform the above-described methods. Computer 802 may include one or more processors 860 that communicate with a number of peripheral devices via a bus subsystem 890. These peripheral devices may include user output device(s) 830, user input device(s) 840, communications interface 850, and a storage subsystem, such as random access memory (RAM) 870 and non-volatile storage drive 880 (e.g., disk drive, optical drive, solid state drive), which are forms of tangible computer-readable memory.
Computer-program product 805 may be stored in non-volatile storage drive 880 or another computer-readable medium accessible to computer 802 and loaded into random access memory (RAM) 870. Each processor 860 may comprise a microprocessor, such as a microprocessor from Intel® or Advanced Micro Devices, Inc.®, or the like. To support computer-program product 805, the computer 802 runs an operating system that handles the communications of computer-program product 805 with the above-noted components, as well as the communications between the above-noted components in support of the computer-program product 805. Exemplary operating systems include Windows® or the like from Microsoft Corporation, Solaris® from Sun Microsystems, LINUX, UNIX, and the like.
User input devices 840 include all possible types of devices and mechanisms to input information to computer 802. These may include a keyboard, a keypad, a mouse, a scanner, a digital drawing pad, a touch screen incorporated into the display, audio input devices such as voice recognition systems, microphones, and other types of input devices. In various embodiments, user input devices 840 are typically embodied as a computer mouse, a trackball, a track pad, a joystick, a wireless remote, a drawing tablet, or a voice command system. User input devices 840 typically allow a user to select objects, icons, text and the like that appear on the monitor 806 via a command such as a click of a button or the like. User output devices 830 include all possible types of devices and mechanisms to output information from computer 802. These may include a display (e.g., monitor 806), printers, non-visual displays such as audio output devices, etc.
Communications interface 850 provides an interface to other communication networks, such as communication network 895, and devices and may serve as an interface to receive data from and transmit data to other systems, WANs and/or the Internet. Embodiments of communications interface 850 typically include an Ethernet card, a modem (telephone, satellite, cable, ISDN), an (asynchronous) digital subscriber line (DSL) unit, a FireWire® interface, a USB® interface, a wireless network adapter, and the like. For example, communications interface 850 may be coupled to a computer network, to a FireWire® bus, or the like. In other embodiments, communications interface 850 may be physically integrated on the motherboard of computer 802, and/or may be a software program, or the like.
RAM 870 and non-volatile storage drive 880 are examples of tangible computer-readable media configured to store data such as computer-program product embodiments of the present invention, including executable computer code, human-readable code, or the like. Other types of tangible computer-readable media include floppy disks, removable hard disks, optical storage media such as CD-ROMs, DVDs, bar codes, semiconductor memories such as flash memories, read-only-memories (ROMs), battery-backed volatile memories, networked storage devices, and the like. RAM 870 and non-volatile storage drive 880 may be configured to store the basic programming and data constructs that provide the functionality of various embodiments of the present invention, as described above.
Software instruction sets that provide the functionality of the present invention may be stored in RAM 870 and non-volatile storage drive 880. These instruction sets or code may be executed by the processor(s) 860. RAM 870 and non-volatile storage drive 880 may also provide a repository to store data and data structures used in accordance with the present invention. RAM 870 and non-volatile storage drive 880 may include a number of memories including a main random access memory (RAM) to store instructions and data during program execution and a read-only memory (ROM) in which fixed instructions are stored. RAM 870 and non-volatile storage drive 880 may include a file storage subsystem providing persistent (non-volatile) storage of program and/or data files. RAM 870 and non-volatile storage drive 880 may also include removable storage systems, such as removable flash memory.
Bus subsystem 890 provides a mechanism to allow the various components and subsystems of computer 802 to communicate with each other as intended. Although bus subsystem 890 is shown schematically as a single bus, alternative embodiments of the bus subsystem may utilize multiple busses or communication paths within the computer 802.
It should be noted that the methods, systems, and devices discussed above are intended merely to be examples. It must be stressed that various embodiments may omit, substitute, or add various procedures or components as appropriate. For instance, it should be appreciated that, in alternative embodiments, the methods may be performed in an order different from that described, and that various steps may be added, omitted, or combined. Also, features described with respect to certain embodiments may be combined in various other embodiments. Different aspects and elements of the embodiments may be combined in a similar manner. Also, it should be emphasized that technology evolves and, thus, many of the elements are examples and should not be interpreted to limit the scope of the invention.
Specific details are given in the description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, well-known processes, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the embodiments. This description provides example embodiments only, and is not intended to limit the scope, applicability, or configuration of the invention. Rather, the preceding description of the embodiments will provide those skilled in the art with an enabling description for implementing embodiments of the invention. Various changes may be made in the function and arrangement of elements without departing from the spirit and scope of the invention.
It is to be appreciated that while the described methods and systems for intuitive status signaling at opportune times for a hazard detector are particularly advantageous in view of the particular device context, the scope of the present disclosure is not so limited. Hazard detectors represent important life safety devices; they are likely to be placed in many rooms around the house; they are likely to be well-positioned for viewing from many places in those rooms, including from near light switches; they will usually not have full on-device graphical user interfaces but can be outfitted quite readily with non-graphical yet simple, visually appealing on-device user interface elements (e.g., a simple pressable button with shaped on-device lighting); and, in the case of battery-only hazard detectors, power limitations make it desirable that status communications use minimal amounts of electrical power. Nevertheless, the described methods and systems for intuitive status signaling at opportune times are widely applicable to any of a variety of smart-home devices such as those described in relation to
Any processes described with respect to
It is to be understood that any or each module or state machine discussed herein may be provided as a software construct, firmware construct, one or more hardware components, or a combination thereof. For example, any one or more of the state machines or modules may be described in the general context of computer-executable instructions, such as program modules, that may be executed by one or more computers or other devices. Generally, a program module may include one or more routines, programs, objects, components, and/or data structures that may perform one or more particular tasks or that may implement one or more particular abstract data types. It is also to be understood that the number, configuration, functionality, and interconnection of the modules or state machines are merely illustrative, and that the number, configuration, functionality, and interconnection of existing modules may be modified or omitted, additional modules may be added, and the interconnection of certain modules may be altered.
Whereas many alterations and modifications of the present invention will no doubt become apparent to a person of ordinary skill in the art after having read the foregoing description, it is to be understood that the particular embodiments shown and described by way of illustration are in no way intended to be considered limiting. Therefore, reference to the details of the preferred embodiments is not intended to limit their scope.