The present techniques relate generally to power management in electronic devices. More particularly, the present techniques relate to power management in electronic devices by sensing user presence and intent.
Computers, cellular telephones, tablets, display panels, televisions, and many other types of electronic devices may incorporate several sensors, such as passive infrared, audio, and camera sensors, to sense an action of a user and, in response thereto, to cause the device to take one or more predetermined actions, such as resuming operation or scrolling displayed content. While these types of sensors may be applied in an effort to reduce the amount of power the device consumes, the sensors may be continuously powered on even when the device is in an inactive mode of operation. Thus, although the sensors are intended to reduce power consumption, they may actually increase the power consumed by the device under certain usage or operating conditions. Further, the types of sensors typically utilized are relatively costly to procure and have relatively limited capability, in that they may detect only the occurrence of an event rather than specific characteristics of that event. Thus, such sensors may provide relatively limited information to a device, which, in turn, may cause the device to operate in an undesirable or erroneous manner.
The same numbers are used throughout the disclosure and the figures to reference like components and features. Numbers in the 100 series refer to features originally found in FIG. 1, numbers in the 200 series refer to features originally found in FIG. 2, and so on.
In the following description and claims, the terms “coupled” and “connected,” along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Rather, in particular embodiments, “connected” may be used to indicate that two or more elements are in direct physical or electrical contact with each other. “Coupled” may mean that two or more elements are in direct physical or electrical contact. However, “coupled” may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
Some embodiments may be implemented in one or a combination of hardware, firmware, and software. Some embodiments may also be implemented as instructions stored on a machine-readable medium, which may be read and executed by a computing platform to perform the operations described herein. A machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine, e.g., a computer. For example, a machine-readable medium may include read only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; or electrical, optical, acoustical or other form of propagated signals, e.g., carrier waves, infrared signals, digital signals, or the interfaces that transmit and/or receive signals, among others.
An embodiment is an implementation or example. Reference in the specification to “an embodiment,” “one embodiment,” “some embodiments,” “various embodiments,” or “other embodiments” means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least some embodiments, but not necessarily all embodiments, of the present techniques. The various appearances of “an embodiment,” “one embodiment,” or “some embodiments” are not necessarily all referring to the same embodiments. Elements or aspects from an embodiment can be combined with elements or aspects of another embodiment.
Not all components, features, structures, characteristics, etc. described and illustrated herein need be included in a particular embodiment or embodiments. If the specification states a component, feature, structure, or characteristic “may”, “might”, “can” or “could” be included, for example, that particular component, feature, structure, or characteristic is not required to be included. If the specification or claim refers to “a” or “an” element, that does not mean there is only one of the element. If the specification or claims refer to “an additional” element, that does not preclude there being more than one of the additional element.
It is to be noted that, although some embodiments have been described in reference to particular implementations, other implementations are possible according to some embodiments. Additionally, the arrangement and/or order of circuit elements or other features illustrated in the drawings and/or described herein need not be arranged in the particular way illustrated and described. Many other arrangements are possible according to some embodiments.
In each system shown in a figure, the elements in some cases may each have a same reference number or a different reference number to suggest that the elements represented could be different and/or similar. However, an element may be flexible enough to have different implementations and work with some or all of the systems shown or described herein. The various elements shown in the figures may be the same or different. Which one is referred to as a first element and which is called a second element is arbitrary.
CPU 102 may, in embodiments, be a conventional CPU or, in other embodiments, may be a CPU specifically configured for operation at low or reduced power consumption rates. CPU 102 is capable of reading and executing computer-readable instructions, which, in embodiments, may include instructions stored in memory 104, including operating system instructions 106. CPU 102 may further be capable of reading and executing other computer-readable instructions such as instructions 108 that may also be stored in memory 104.
Memory 104, in addition to storing operating system instructions 106 and instructions 108, may also store data such as application-related data. Memory 104 may include memories of different types, and thus may be configured as one or more of, or various combinations of, nonvolatile read-only, read-only, random access, hard disk drive, or other types of memories.
Operating system 106 is a set of computer-readable instructions that, when executed by CPU 102, enables the basic operation of electronic device 100 and manages its hardware, software, memory, and processes. Operating system 106 may be configured as an Android, Windows, Linux, iOS, proprietary, or virtually any other type of operating system. Instructions 108 are a set of computer-readable instructions that, when executed by CPU 102, may cause electronic device 100 to perform various functions or take various actions, either autonomously or in response to user input or instructions. Instructions 108 may also, when executed by CPU 102, cause the electronic device to perform one or more embodiments of the user presence and intent sensing method of the present techniques, which will be more particularly described hereinafter.
Sensor 110 may include one or more ambient temperature sensors, such as, for example, infrared thermopile sensors. As will be described in more detail with reference to
Input/output device 112 may be configured as a conventional input/output (I/O) port, such as a USB port, or may be configured as a wireless I/O port, such as, for example, a WiFi or Bluetooth port, that enables electronic device 100 to exchange information wirelessly with another device. User interface device 114 enables a user to input data to and receive information from the electronic device 100, and may be configured as, for example, a display, touch screen display, keyboard and display combination, voice control and recognition system, speaker, or any combination of one or more of the foregoing or similar. Device bus 116 may be a conventional bus that carries electronic signals and data between and among the components of device 100.
As will be more particularly described hereinafter, electronic device 100, and more particularly instructions 108 executed by CPU 102 in conjunction with sensor 110, generally operates to sense infrared radiation within a desired or target range and, in embodiments, to determine various characteristics of the sensed infrared radiation, including whether the source of that radiation is approaching or moving away from the device 100 and the approximate distance of the source from the device 100, to thereby determine or infer an intent of a user or other source of the infrared radiation. In response thereto, the device 100 may, in embodiments, take certain actions, such as activating other sensors to identify or receive input from the user or other source of the infrared radiation, activating or modifying the operating characteristics of other elements of the device 100, or displaying a message or otherwise communicating information that may be of a general nature and intended for the detected user or other source, or that may be of a specific nature intended for and adapted to a detected and identified user or other source. Alternatively, the device 100 may refrain from waking if the inferred intent of the user or other source is not consistent with an intent to activate or otherwise interact with the device, thereby conserving power.
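By way of non-limiting illustration only, the inference of whether a source of infrared radiation is approaching or moving away may, in embodiments, be implemented by logic similar to the following sketch; the function name, sampling scheme, and threshold value shown are hypothetical and are not part of any particular embodiment described herein.

```python
# Illustrative sketch only: classify a short history of temperature deviations
# (degrees C above a baseline) as approaching, receding, or stationary.
# The threshold and sample values are hypothetical.

def infer_motion(deviation_samples, trend_threshold=0.05):
    """Infer motion of an infrared source from the trend in sensed deviation."""
    if len(deviation_samples) < 2:
        return "stationary"
    mid = len(deviation_samples) // 2
    early = sum(deviation_samples[:mid]) / mid
    late = sum(deviation_samples[mid:]) / (len(deviation_samples) - mid)
    if late - early > trend_threshold:
        return "approaching"   # deviation growing: source likely moving closer
    if early - late > trend_threshold:
        return "receding"      # deviation shrinking: source likely moving away
    return "stationary"

print(infer_motion([0.1, 0.15, 0.3, 0.5]))   # prints "approaching"
```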
Amplifier and processing circuitry 220, in embodiments, may include amplification, analog-to-digital conversion and other necessary or desired processing of the electrical signals issued by the sensors 210A-C. Once the sensor signals are processed by amplifier and processing circuitry 220, the processed signals are passed via signal line 216 to decoder device 230.
Decoder device 230 may, in embodiments, be a microcontroller device configured to decode the changes in sensed average ambient temperature, as well as any localized variations thereof, within the field of view of sensors 210A-210C, as indicated by the processed signals provided thereto by amplifier and processing circuitry 220. In embodiments, the function of decoding the changes and any variations in the sensed average ambient temperature within the field of view of sensors 210A-210C may be performed by instructions 108 executed by CPU 102 of electronic device 100. The ambient temperature information decoded by decoder device 230 may then be placed on device signal bus 116 for further processing by electronic device 100, which further processing is more particularly described with reference to
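By way of non-limiting illustration, the decoding performed by decoder device 230, or by instructions 108 executed by CPU 102, may resemble the following sketch; the raw ADC counts and the scale and offset calibration constants are hypothetical and shown for illustration only.

```python
# Illustrative decoding sketch: convert digitized thermopile readings into
# temperatures and compute the average spatial ambient temperature within the
# combined field of view of sensors 210A-C. Calibration values are hypothetical.

def decode_temperatures(raw_counts, scale=0.02, offset=-40.0):
    """Convert raw ADC counts from amplifier and processing circuitry 220
    into temperatures in degrees C."""
    return [scale * count + offset for count in raw_counts]

def average_spatial_temperature(temperatures):
    """Average the per-sensor temperatures into one spatial ambient value."""
    return sum(temperatures) / len(temperatures)

readings = decode_temperatures([3170, 3185, 3160])        # one reading per sensor
print(round(average_spatial_temperature(readings), 2))    # e.g. 23.43
```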
In embodiments, sensor hub 200 may also include various other types of sensors that may be utilized by electronic device 100. Sensor hub 200 may include an audio sensor or microphone 240 to detect sound in the environment of device 100, a gyroscope 250 to sense an orientation of device 100, an ambient light sensor 260 to detect a level of ambient light surrounding the device 100, and a camera sensor 270.
Sensors 210A-C may be enclosed in a housing 280. Housing 280 may be constructed of a material that is impervious to all infrared radiation other than the range of infrared radiation that is desired to be detected. More particularly, housing 280 may be constructed of material that is substantially impervious to all infrared radiation other than the infrared radiation that is desired to be detected. In embodiments, housing 280 may be constructed of a material that is substantially impervious to all infrared radiation other than the range emitted by a human, such as from approximately 7 to approximately 15 micrometers. In such an embodiment, housing 280 may be constructed of, for example, a high-density polyethylene.
At block 402, a baseline average ambient temperature characteristic is established. The baseline average spatial ambient temperature characteristic is established for and within the field of view of one or more sensors associated with the electronic device. In embodiments, an average spatial ambient temperature of the field of view of the one or more sensors may be determined based on the infrared radiation sensed by the one or more sensors 110, 210A-C.
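By way of non-limiting illustration, the establishment of the baseline at block 402 may, in embodiments, be implemented by logic similar to the following sketch; the read_sensors callable and the number of samples are hypothetical stand-ins for periodic readings from sensors 110, 210A-C.

```python
# Illustrative sketch of block 402: average several periodic spatial-average
# readings into a baseline average spatial ambient temperature characteristic.

def establish_baseline(read_sensors, samples=10):
    """read_sensors() is assumed to return one temperature (deg C) per sensor."""
    spatial_averages = []
    for _ in range(samples):
        temps = read_sensors()                             # one periodic measurement
        spatial_averages.append(sum(temps) / len(temps))   # spatial average
    return sum(spatial_averages) / len(spatial_averages)   # baseline over time

baseline = establish_baseline(lambda: [23.4, 23.6, 23.5])  # simulated readings
print(round(baseline, 2))   # 23.5
```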
At block 404, method 400 monitors the fields of view of the one or more sensors for any variation from the baseline average spatial ambient temperature characteristic. More particularly, the one or more sensors take periodic temperature measurements to determine a current average spatial ambient temperature within their corresponding fields of view. That current average spatial ambient temperature is compared against the baseline established at block 402 to determine whether the current average spatial ambient temperature varies from the baseline. If no variation, or no variation in excess of a predetermined minimum threshold, is detected, method 400 continues to monitor via blocks 402 and 404 the fields of view of the one or more sensors for a variation from the baseline average spatial ambient temperature. Conversely, if a variation, or a variation in excess of a predetermined threshold, is detected, method 400 proceeds to block 406.
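By way of non-limiting illustration, the comparison performed at block 404 may, in embodiments, be implemented by logic similar to the following sketch; the threshold value is hypothetical.

```python
# Illustrative sketch of block 404: compare the current average spatial ambient
# temperature against the baseline and report a variation only if it exceeds a
# predetermined minimum threshold (hypothetical value shown).

def detect_variation(current_average, baseline, min_threshold=0.25):
    """Return the signed variation in degrees C, or None if within threshold."""
    variation = current_average - baseline
    return variation if abs(variation) > min_threshold else None

print(detect_variation(24.0, 23.5))   # 0.5 deg C above baseline -> 0.5
print(detect_variation(23.6, 23.5))   # within threshold -> None
```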
At block 406, the detected variation is characterized. More particularly, the detected variation is compared with the known characteristic temperature variation indicative of the presence, within the field of view of the one or more sensors, of an object of interest, i.e., an object that emits infrared radiation within a known range and thus has a known characteristic temperature, such as the characteristic temperature of a human body, an animal, a mechanical object such as a robot, or any other object having a known characteristic temperature. If the variation is indicative of the object of interest being present within the field of view of the one or more sensors, the variation may be further characterized by taking additional or more frequent periodic measurements of the average spatial ambient temperature within the field of view of the one or more sensors to determine certain characteristics of the variation, including, in embodiments, whether the variation is stationary or in motion relative to the one or more sensors. In embodiments, characterizing the variation may include periodically reading the ambient temperature measured by the one or more sensors and recording over time the variation in the measured ambient temperature relative to the baseline characteristic. Exemplary characteristic variations are illustrated with reference to
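By way of non-limiting illustration, the characterization performed at block 406 may, in embodiments, resemble the following sketch; the characteristic range for a human body and the motion threshold are hypothetical values shown for illustration only.

```python
# Illustrative sketch of block 406: characterize a detected variation by testing
# it against an assumed characteristic range for an object of interest (here a
# hypothetical range for a human body) and by noting whether it changes over time.

HUMAN_DELTA_RANGE = (0.3, 6.0)   # assumed deg C above baseline; illustrative only

def characterize(variation_history, motion_threshold=0.2):
    """variation_history: successive variations (deg C) relative to the baseline."""
    latest = variation_history[-1]
    of_interest = HUMAN_DELTA_RANGE[0] <= latest <= HUMAN_DELTA_RANGE[1]
    in_motion = (max(variation_history) - min(variation_history)) > motion_threshold
    return {"object_of_interest": of_interest, "in_motion": in_motion}

print(characterize([0.4, 0.7, 1.1]))  # {'object_of_interest': True, 'in_motion': True}
```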
At block 408, a determination is made as to whether the detected variation is recognized as a known or anticipated variation. The characteristics of the detected variation may be compared with known or anticipated variations. In embodiments, the characteristics of certain known or anticipated variations may be stored in the form of data or as a look-up table in the memory of the electronic device, such as memory 104 of electronic device 100, and the characteristics of the detected variation may be compared against the characteristics of the known or anticipated variations to determine whether there is sufficient similarity between the two. In other embodiments, a microcontroller, such as microcontroller 230, may be programmed or otherwise configured to compare detected variations with certain known or anticipated variations and to thereby determine whether the detected variation constitutes a recognized variation. If the characteristics of the detected variation do not bear a sufficient similarity to the characteristics of the known or anticipated variations, the variation is not recognized as a known or anticipated variation, and method 400 proceeds to block 402, where a new baseline average spatial ambient temperature characteristic for the field of view of the sensors is determined. Thus, a reduction in the power consumed by the electronic device may be achieved by virtue of the electronic device not reacting (such as by activating or powering on) to an unrecognized variation. Conversely, if the characteristics of the detected variation are sufficiently similar to the characteristics of a known or anticipated variation, the detected variation is recognized as a known or anticipated variation that corresponds to a predetermined inferred intent of the detected object, and method 400 proceeds to block 410.
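By way of non-limiting illustration, the comparison against stored known or anticipated variations at block 408 may, in embodiments, resemble the following sketch; the profile names, traces, and similarity threshold are hypothetical and stand in for data that might be stored in memory 104 or programmed into microcontroller 230.

```python
# Illustrative sketch of block 408: compare a detected variation trace against
# stored known or anticipated variation profiles (a simple in-memory stand-in
# for a look-up table in memory 104) using a mean absolute difference.

KNOWN_VARIATIONS = {                          # profile name -> expected trace (deg C)
    "user_approaching": [0.3, 0.6, 1.0, 1.5],
    "user_walking_past": [0.3, 0.6, 0.6, 0.3],
}

def recognize(detected_trace, max_mean_error=0.2):
    """Return the best-matching known variation, or None if nothing is similar."""
    best_name, best_error = None, float("inf")
    for name, profile in KNOWN_VARIATIONS.items():
        n = min(len(profile), len(detected_trace))
        error = sum(abs(a - b) for a, b in zip(profile, detected_trace)) / n
        if error < best_error:
            best_name, best_error = name, error
    return best_name if best_error <= max_mean_error else None

print(recognize([0.35, 0.55, 1.05, 1.4]))   # prints "user_approaching"
```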
At block 410, an action corresponding to the recognized variation, and to the intent of the object, such as a user, inferred therefrom, may be taken by the electronic device. In embodiments, the electronic device, such as electronic device 100, may take various actions responsive to the detection and recognition of a variation that corresponds to an inferred intent of the user, including, for example, displaying notifications to the detected presence; adjusting the rendering perspective on a display of the electronic device to render the displayed information as appearing to be aimed at the location of, or to track movement of, the detected user presence; waking the electronic device; or activating additional sensors within the sensor hub 200 or the electronic device 100 to enable the electronic device to accept input, such as, for example, via spoken command, or to recognize the detected user presence via facial or voice recognition and unlock the device or display notifications intended for or preselected by the detected and recognized user. Thus, by taking a predetermined action only in response to recognized variations, the power consumed by the electronic device may be reduced.
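By way of non-limiting illustration, the mapping from a recognized variation to one or more predetermined actions at block 410 may, in embodiments, resemble the following sketch; the variation names and action names are hypothetical examples of the responses described above.

```python
# Illustrative sketch of block 410: map a recognized variation to one or more
# predetermined actions. Names are hypothetical examples only.

PREDETERMINED_ACTIONS = {
    "user_approaching": ["wake_device", "activate_camera_for_face_recognition"],
    "user_walking_past": [],                   # no action taken: conserve power
}

def act_on(recognized_variation):
    """Perform each predetermined action for the recognized variation."""
    for action in PREDETERMINED_ACTIONS.get(recognized_variation, []):
        print("performing action:", action)    # placeholder for device behavior

act_on("user_approaching")
```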
However, it should be noted that, in embodiments, the sensor hubs 200A-200E may be variously configured with different types of sensors or combinations of the foregoing sensors. For example, in embodiments, the sensor hub 200A associated with the display of the device 700 may include a full complement of sensors, whereas the sensor hubs 200B-E associated with the sides of the device may include only the sensors 210A-C, their associated lenses 212A-C, amplifier and processing circuitry 220, and decoder device 230 to detect and characterize the presence of a user. Thus, device 700, when lying flat on a table or other surface, is configured to sense a user presence and a direction of movement of an object of interest by operation of the sensor hubs 200B-E associated with the sides of device 700. It should also be noted that sensors 110, 210A-C may be enclosed in a housing substantially similar to housings 120 and 280 described above, which is impervious to all infrared radiation outside the range of interest, such as a range from approximately 7 to approximately 15 micrometers in the case of human detection.
An electronic device is provided that includes at least one sensor for sensing a presence and inferring therefrom an intent of an object within a field of view of the at least one sensor. The device includes a central processing unit (CPU), nonvolatile memory, a signal bus communicatively coupling the sensor, the CPU and the nonvolatile memory, and computer readable instructions stored in the nonvolatile memory that, when executed by the CPU, cause the device to characterize a sensed presence, infer an intent of the sensed presence, and to take one or more predetermined actions responsive thereto.
A method for sensing, in an electronic device, the presence and intent of an object is provided. The method includes sensing a baseline ambient spatial average temperature for a sensed field of view, monitoring the sensed field of view for a variation in the ambient spatial average temperature of the field of view relative to the baseline ambient spatial average temperature, characterizing any detected variation, determining whether the detected variation is a recognized variation, from which an intent of the object can be inferred, and causing the electronic device to take at least one predetermined action responsive to a recognized variation.
It is to be understood that specifics in the aforementioned examples may be used anywhere in one or more embodiments. For instance, all optional features of exemplary devices described above may also be implemented with respect to any of the other exemplary devices and/or the method described herein. Furthermore, although flow diagrams and/or state diagrams may have been used herein to describe embodiments, the present techniques are not limited to those diagrams or to their corresponding descriptions. For example, the illustrated flow need not move through each box or state or in exactly the same order as depicted and described.
The present techniques are not restricted to the particular details listed herein. Indeed, those skilled in the art having the benefit of this disclosure will appreciate that many other variations from the foregoing description and drawings may be made within the scope of the present techniques. Accordingly, it is the following claims including any amendments thereto that define the scope of the techniques.