The present invention relates to a low-power wake-up method for an electronic device having a sensor. More particularly, the present invention relates to ways to reduce battery power utilized by portable devices and facilitate a return of the electronic device to operation from a sleep mode.
In order to save power, which is of particular importance to battery-powered devices, a "sleep mode" has conventionally been utilized, typically when the electronic device has been in an idle state for a predetermined amount of time.
Portable electronic devices, including but not limited to cell phones, smart phones, tablets, personal digital assistants (PDAs), and portable music players, are just a few of the many types of devices where battery usage is critical, and there continues to be a need to provide more functionality while at the same time reducing battery power usage.
Conventional devices may, after a period of non-usage, dim the brightness of the display, or blank the display entirely, to conserve energy.
For example, with regard to computers, sleep mode is defined as an energy-saving standby condition of a computer which can be reactivated by external stimulus, such as touching the keyboard. For example, when a notebook computer goes into sleep mode, the display screen and disk drive are normally shut down. Once awakened (i.e. sent a specific signal), the computer returns to its former operating status.
Moreover, in the case of portable electronic devices, sleep mode operates in, for example, smartphones, tablets, music players, and Personal Digital Assistants (PDAs), just to name a few possibilities, and is in no way limited to such devices.
In fact, many smartphones now default to a sleep mode when not used, unless actively performing certain tasks. When there are no active user interactions such as screen touches, every component, including the central processor, can stay off unless an app instructs the operating system to keep the device fully powered on.
Moreover, a number of background operations need to be performed while the phone is idle. In one such example, a mailer may need to automatically update email by checking with a remote server. To prevent the phone from going to sleep during such operations, smartphone manufacturers often make application programming interfaces, or APIs, available to app developers. The developers insert the APIs into apps to instruct the phone to stay awake long enough to perform necessary operations.
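As an illustration of how such an API is typically used, consider the following minimal C sketch; the functions app_acquire_wakelock, app_release_wakelock, and sync_mail_with_server are hypothetical placeholders and do not correspond to the API of any particular smartphone platform.

```c
/* Hypothetical keep-awake API of the kind described above; the names below
 * are placeholders for illustration only, not any vendor's actual API. */
extern void app_acquire_wakelock(const char *tag);
extern void app_release_wakelock(const char *tag);
extern void sync_mail_with_server(void);  /* placeholder background operation */

void background_mail_update(void)
{
    app_acquire_wakelock("mail-sync");   /* ask the OS to stay awake */
    sync_mail_with_server();             /* perform the background operation */
    app_release_wakelock("mail-sync");   /* allow the phone to sleep again */
}
```

The pattern is simply acquire-before and release-after: the app holds the device awake only for the duration of the necessary operation.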
In a typical smartphone, an Application Processor (AP) is asleep when the device is asleep. In order to wake up the device, conventional systems require the user to press a power button or an unlock button.
Sleep mode saves battery power, particularly when compared with leaving a device in a fully operational state while idle, and advantageously permits the user to avoid having to reset programming codes or wait for an electronic device to reboot. In wireless electronic devices, such as portable mobile terminals, tablets, etc., which often seek out networks and have to provide passwords to obtain access upon being rebooted or reset, the use of sleep mode is preferable to a rather cumbersome and slow process of rebooting.
However, returning to an operational mode from a sleep mode (wake mode) requires an action to be undertaken by the user. For example, a power button or an unlock icon must be pressed, which is slow and sometimes awkward, especially when trying to quickly perform an action on the electronic device. Even in the case of a virtual keypad, an unlock icon must be touched or slid in order to restore the electronic device to an operational mode, meaning that the user is inconvenienced by contacting a button of the device or sliding a finger along the screen.
Some conventional attempts to address these shortcomings include providing a luminance sensor or a camera. However, in such cases the application processor (AP) cannot go into sleep mode and must always be in an operating mode to process the sensed data from the sensor or camera. This type of monitoring requires a high amount of power consumption, since it is impossible for the AP to control the sensor directly when the AP is asleep.
More recently, devices have been configured with a low-power processor dedicated to processing only the sensing data. However, this low-power processor reads data from the sensor by polling, and must therefore be maintained in a wake-up state, consuming significant amounts of power.
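For purposes of illustration only, the drawback of this conventional polling approach can be sketched in C as follows; this is a minimal sketch, and every function name used (sensor_read, threshold_exceeded, wake_main_processor, delay_ms) is a hypothetical placeholder rather than an API from any conventional device or cited reference.

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical hardware-access helpers (placeholders, not real APIs). */
extern uint16_t sensor_read(void);             /* sample the sensor */
extern bool     threshold_exceeded(uint16_t);  /* check the wake-up condition */
extern void     wake_main_processor(void);     /* signal the AP to wake */
extern void     delay_ms(uint32_t);            /* wait between samples */

/* Conventional polling loop: the low-power processor can never sleep,
 * because it must keep sampling the sensor at a fixed interval. */
void conventional_polling_task(void)
{
    for (;;) {
        uint16_t value = sensor_read();
        if (threshold_exceeded(value)) {
            wake_main_processor();
        }
        delay_ms(100);  /* the core stays powered for the entire interval */
    }
}
```

Even with a long polling interval, the loop itself keeps the low-power processor out of sleep mode.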
With regard to conventional attempts to address the above-discussed issues, U.S. Pat. Appln. Pub. No. 2010/0313050 discloses that a sensor processor system selects a power profile to be applied to the application processor system based on the sensed data and instructs the power management controller to apply the selected power profile to the application processor system. Two processors are used for low-power sensing, and the AP is woken up when the sensed data meets a condition.
However, a significant drawback of U.S. Pat. Appln. Pub. No. 2010/0313050 is that the sensor processor always operates to monitor the ambient environment using a polling-type sensor, without a sleep mode. The sensor processor applies the power profile to the application processor system (S/W type).
In another conventional attempt to improve the art, U.S. Pat. Appln. Pub. No. 2009/0259865 discloses an electronic device that includes a circuit configured to operate when the main processor is in the sleep mode. The circuit comprises at least one low-power processor and a sensor. However, the low-power processor in this conventional system always operates without being in sleep mode, in order to be able to monitor the ambient environment via a polling-type sensor.
Accordingly, there is a need in the art for a system and method that permits additional components to be in sleep mode and yet provides ambient monitoring of the device, and that permits a quick switch back to an operating mode from sleep mode without pressing buttons or touching the display screen.
This summary of the invention is not to be used as a basis to interpret the scope of the appended claims, as the claimed invention is far broader than the description in this summary.
An apparatus and method for waking up a main processor in an ultra-low power device preferably includes a main processor and a sub-processor that utilizes less power than the main processor and may be internalized in the main processor. According to an exemplary aspect of the presently claimed invention, at least one sensor is preferably an interrupt-type sensor (as opposed to, for example, a polling-type sensor). One of the many advantages of the presently claimed invention is that both the main processor and the sub-processor can remain in sleep mode, since a low-power or ultra-low power interrupt sensor can operate while the sub-processor is in sleep mode, and the sub-processor awakens only after receiving an interrupt signal from the interrupt sensor indicating that a change has been detected.
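A minimal sketch of this interrupt-driven behavior, assuming a generic microcontroller-class sub-processor, is shown below; enter_deep_sleep(), notify_wake_event(), and the interrupt wiring are hypothetical placeholders and not part of the claimed invention or any specific vendor API.

```c
#include <stdbool.h>

/* Hypothetical placeholders for platform-specific operations. */
extern void enter_deep_sleep(void);   /* e.g. a wait-for-interrupt instruction */
extern void notify_wake_event(void);  /* hand off to validation / AP wake-up logic */

static volatile bool wake_event_pending = false;

/* Interrupt service routine wired to the interrupt-type sensor line.
 * It runs only when the sensor itself detects a change (e.g. a hand wave),
 * so the sub-processor can spend the rest of the time asleep. */
void sensor_irq_handler(void)
{
    wake_event_pending = true;
}

void sub_processor_main(void)
{
    for (;;) {
        if (!wake_event_pending) {
            enter_deep_sleep();       /* core sleeps until the sensor interrupt fires */
            continue;
        }
        wake_event_pending = false;
        notify_wake_event();          /* proceed to validate the event and wake the AP */
    }
}
```

In contrast to the polling loop sketched earlier, the sub-processor here consumes power only while servicing an actual sensor event.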
In addition, the presently claimed invention also permits a return from sleep mode to operating mode by a mere wave of the hand, which was heretofore unknown. Shaking the unit, or moving a stylus pen arranged along an exterior of the device, are further examples of the many ways the device can be awakened from sleep mode.
The above and other exemplary aspects of the invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the invention, for which reference should be made to the appended claims. Moreover, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures described herein. The apparatus includes a circuit having at least one of a low power processor and an ultra-low power sensor to monitor at least one of signals, commands, inputs, and changes in the environment. The circuit wakes up the main processor responsive to one of the low power processor and the ultra-low power interrupt sensor.
The present invention has been described with respect to particular exemplary embodiments and with reference to certain drawings, but the invention is not limited thereto, but rather is set forth only by the appended claims. The drawings described are only schematic and are non-limiting. In the drawings, for illustrative purposes, the size of some of the elements may be exaggerated and not drawn to a particular scale. Where the term “comprising” is used in the present description and claims, it does not exclude other elements or steps. Where an indefinite or definite article is used when referring to a singular noun, e.g., “a”, “an”, or “the”, this includes a plural of that noun unless otherwise specifically stated. Hence, the term “comprising” should not be interpreted as being restricted to the items listed thereafter; it does not exclude other elements or steps, and so the scope of the expression “a device comprising items A and B” should not be limited to devices consisting only of components A and B. This expression signifies that, with respect to the present invention, the only relevant components of the device are A and B.
Furthermore, the terms “first”, “second”, “third” and the like, if used in the description and in the claims, are provided for distinguishing between similar elements and not necessarily for describing a sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances (unless clearly disclosed otherwise) and that the exemplary embodiments of the invention described herein may be operated in other sequences and/or arrangements than those described or illustrated herein.
To aid in an understanding of the present invention, an artisan should understand and appreciate that the terms “main processor” and “sub-processor” are terminologies used for ease of understanding of the invention, and other terminologies having the same meaning can be used interchangeably in their place.
For example, to aid the artisan, the terms “main processor”, “application processor”, “AP”, “first processor”, and “processor 1” as used herein all refer to the same processor 110 that is shown in the accompanying drawings.
In addition, the terms “sub-processor”, “sensing processor”, “second processor”, “processor 2”, “Sensor Hub (Processor)”, and “MCU (Micro Controller Unit)” as used herein all refer to the same processor 120 that is shown in the accompanying drawings.
An artisan understands and appreciates that the term “ultra-low power” typically refers to a processor operating at power consumption values of less than approximately 1 mA, and often in the μA range. “Ultra-low power levels” likewise refers to power consumption at levels of less than approximately 1 mA.
In addition, the artisan also understands and appreciates that the term “low power” typically refers to a processor (or sub-processor) operating in the 1-10 mA range. The apparatus may comprise a wireless communication device, such as a mobile communication terminal, a cellphone, smart phone, tablet, Personal Digital Assistant (PDA), notebook, netbook, etc. just to name a few possible non-limiting examples of devices.
The sub-processor 120 operates at a low power or ultra-low power and, according to the present invention, the sub-processor can remain in a sleep mode along with the main processor 110 because of the use of an interrupt sensor 130. As discussed herein above, the conventional apparatus uses only a polling sensor, which requires either the main processor or the sub-processor to remain fully operational in order for the device to change from sleep mode to operational mode.
The interrupt sensor 130 operates at ultra-low power levels and sends an interrupt signal to the sub-processor 120 when a predetermined condition is sensed, which can be, for example, waving one's hand in front of the display, shaking the device, or moving a piece of the device, such as shifting a position of a stylus 475 shown in the accompanying drawings.
At step 200, the main processor 110 and the sub-processor 120 are in sleep mode. At steps 210 and 220, an interrupt sensor 130 (including but not limited to an infrared (IR) sensor) detects a gesture within a proximity distance of the electronic device, typically of a display or touchscreen. The proximity distance can be, for example, 10-15 cm, but the invention does not require a specific distance, so long as the sensor can recognize the wave of the user's hand.
At step 230, the sub-processor 120 is awakened by the interrupt signal sent from the interrupt sensor 130. Alternatively, at step 240 an accelerometer may detect the device being shaken or waved, and also cause the sub-processor 120 to be awakened.
At step 250, the sub-processor determines whether or not the sensed data from the interrupt sensor 130 is valid by comparing the value with a table in storage.
In addition, a polling sensor can optionally be included so that when the mobile device is placed in a case or bag, the interrupt sensor does not unintentionally operate. Accordingly, the sub-processor wakes up due to the interrupt from the interrupt sensor, and the main processor wakes up when 1) the sensing data of the interrupt sensor is within the valid range or 2) the sensing data of the polling sensor is within the valid range, with 1) or 2) being determined by the sub-processor at step 250.
After the sub-processor 120 determines that the data is valid, for example by being in a valid range or having reached a predetermined threshold, the sub-processor 120 at step 260 wakes the main processor 110, which in turn at step 270 provides feedback to the user, in the form of, for example, unlocking the screen, prompting the user, making the display operable, showing a home screen, etc. According to an exemplary aspect of the present invention, the predetermined threshold can be a particular value which, if the output is greater than or equal to it, is determined by the sub-processor as satisfying the wake-up condition(s). In addition, there can be a range of values received from the sensor that is predetermined as satisfying a wake-up condition, this being provided only for purposes of illustration and not for limiting the appended claims, such as, for example, a microvolt (μV) or microamp (μA) range. Any other such range (e.g., mA) that is within the capability of the sub-processor to distinguish between values received from the sensor, so as to ascertain a valid range or predetermined threshold, is within the spirit and scope of the claimed invention.
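To illustrate how steps 230 through 270 fit together, a minimal C sketch of the sub-processor's wake-up decision is given below; the valid range and every helper name (read_interrupt_sensor_uv, read_polling_sensor_uv, polling_sensor_present, assert_ap_wake_line) are hypothetical values and placeholders chosen only for illustration.

```c
#include <stdint.h>
#include <stdbool.h>

/* Example valid range in microvolts; real values would come from the table in storage. */
#define WAKE_RANGE_MIN_UV  150u
#define WAKE_RANGE_MAX_UV  900u

/* Hypothetical placeholders for sensor access and the AP wake line. */
extern uint32_t read_interrupt_sensor_uv(void);  /* value latched by the interrupt sensor */
extern uint32_t read_polling_sensor_uv(void);    /* optional polling sensor reading */
extern bool     polling_sensor_present(void);
extern void     assert_ap_wake_line(void);       /* step 260: wake the main processor */

static bool in_valid_range(uint32_t value_uv)
{
    return value_uv >= WAKE_RANGE_MIN_UV && value_uv <= WAKE_RANGE_MAX_UV;
}

/* Called after the sub-processor is awakened by the sensor interrupt (step 230)
 * or by an accelerometer event (step 240). Implements the validity check of
 * step 250 and, if satisfied, the main-processor wake-up of step 260. */
void handle_wake_event(void)
{
    bool valid = in_valid_range(read_interrupt_sensor_uv());

    /* Per the description, the main processor also wakes when the optional
     * polling sensor's data falls within the valid range. */
    if (!valid && polling_sensor_present()) {
        valid = in_valid_range(read_polling_sensor_uv());
    }

    if (valid) {
        assert_ap_wake_line();  /* the AP then unlocks the screen, etc. (step 270) */
    }
}
```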
As shown in the accompanying drawings, touch screen 655 permits display and entry of data. Storage device 685 is in communication with the controller and comprises a non-transitory machine-readable medium.
Auxiliary input 675 can be anything from a keyboard to a mouse, and the wireless communication device, shown as a single box, may comprise different hardware modules for transmitting over short-range communication such as Near Field Communication, Bluetooth, WLAN, 802.11, RF communications, etc.
In the invention, the ambient environment is monitored by an interrupt sensor, so that the sub-processor and the main processor (application processor) can remain in sleep mode together. Not only does the invention save power, but it also provides the user with the convenience of not having to push a button to activate/convert the device from a sleep mode back to a normal operating mode.
The sensing of a swiping gesture near the device is sufficient to wake the device from sleep mode; alternatively, shaking or waving the device also restores the device to a normal operating state by waking it up.
The above-described methods according to the present invention can be implemented in hardware, in firmware, or as software or computer code that can be stored in a recording medium such as a CD-ROM, a RAM, a floppy disk, a hard disk, or a magneto-optical disk, or as computer code originally stored on a remote recording medium or a non-transitory machine-readable medium, downloaded over a network, and stored on a local recording medium, so that the methods described herein can be rendered in software stored on the recording medium using a general-purpose computer, a special processor, or programmable or dedicated hardware, such as flash memory, an ASIC, or an FPGA. As would be understood in the art, the computer, the processor, the microprocessor controller, or the programmable hardware includes memory components, e.g., RAM, ROM, Flash, etc., that may store or receive software or computer code that, when accessed and executed by the computer, processor, or hardware, implements the processing methods described herein. In addition, it would be recognized that when a general-purpose computer accesses code for implementing the processing shown herein, the execution of the code transforms the general-purpose computer into a special-purpose computer for executing the processing shown herein.
Number | Date | Country | Kind |
---|---|---|---|
10-2013-0088382 | Jul 2013 | KR | national |
This application is a Continuation of U.S. patent application Ser. No. 14/744,521 filed on Jun. 19, 2015 which is a Continuation of U.S. patent application Ser. No. 13/595,119 filed on Aug. 27, 2012 and assigned U.S. Pat. No. 9,063,731 issued Jun. 23, 2015 which claims the benefit under 35 U.S.C. § 119(a) of a Korean Patent Application filed in the Korean Intellectual Property Office on Jul. 26, 2013 and assigned Serial No. 10-2013-0088382, the entire disclosure of which is incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
6070140 | Tran | May 2000 | A |
6223294 | Kondoh | Apr 2001 | B1 |
6449496 | Beith et al. | Sep 2002 | B1 |
6532447 | Christensson | Mar 2003 | B1 |
7176902 | Peterson, Jr. et al. | Feb 2007 | B2 |
7633076 | Huppi et al. | Dec 2009 | B2 |
8072379 | Gopinath | Dec 2011 | B2 |
8145053 | Sakurai | Mar 2012 | B2 |
8230246 | Sharkey | Jul 2012 | B1 |
8706172 | Priyantha et al. | Apr 2014 | B2 |
8816985 | Tate et al. | Aug 2014 | B1 |
8819467 | Park et al. | Aug 2014 | B2 |
8912877 | Ling et al. | Dec 2014 | B2 |
9047055 | Song | Jun 2015 | B2 |
9063731 | Heo et al. | Jun 2015 | B2 |
9430024 | Heo | Aug 2016 | B2 |
9524030 | Modarres et al. | Dec 2016 | B2 |
9606625 | Levesque et al. | Mar 2017 | B2 |
10048758 | Modarres et al. | Aug 2018 | B2 |
10241553 | Heo | Mar 2019 | B2 |
10372164 | Huitema | Aug 2019 | B2 |
10551969 | Jeong et al. | Feb 2020 | B2 |
20020180724 | Oshima et al. | Dec 2002 | A1 |
20030040339 | Chang | Feb 2003 | A1 |
20030177402 | Piazza | Sep 2003 | A1 |
20030226044 | Cupps et al. | Dec 2003 | A1 |
20060161377 | Rakkola et al. | Jul 2006 | A1 |
20070078487 | Vaisnys et al. | Apr 2007 | A1 |
20070102525 | Orr et al. | May 2007 | A1 |
20070140199 | Zhao et al. | Jun 2007 | A1 |
20070273673 | Park et al. | Nov 2007 | A1 |
20090135751 | Hodges et al. | May 2009 | A1 |
20090259865 | Sheynblat et al. | Oct 2009 | A1 |
20090278738 | Gopinath | Nov 2009 | A1 |
20100007801 | Cooper et al. | Jan 2010 | A1 |
20100013778 | Liu et al. | Jan 2010 | A1 |
20100235667 | Mucignat et al. | Sep 2010 | A1 |
20100268831 | Scott et al. | Oct 2010 | A1 |
20100302028 | Desai et al. | Dec 2010 | A1 |
20100306711 | Kahn et al. | Dec 2010 | A1 |
20100313050 | Harrat et al. | Dec 2010 | A1 |
20110071759 | Pande et al. | Mar 2011 | A1 |
20110074693 | Ranford | Mar 2011 | A1 |
20110077865 | Chen | Mar 2011 | A1 |
20110105955 | Yudovsky et al. | May 2011 | A1 |
20110126014 | Camp, Jr. et al. | May 2011 | A1 |
20110162894 | Weber | Jul 2011 | A1 |
20120005509 | Araki et al. | Jan 2012 | A1 |
20120071149 | Bandyopadhyay et al. | Mar 2012 | A1 |
20120096290 | Shkolnikov et al. | Apr 2012 | A1 |
20120100895 | Priyantha et al. | Apr 2012 | A1 |
20120154292 | Zhao | Jun 2012 | A1 |
20120191993 | Drader | Jul 2012 | A1 |
20120212319 | Ling et al. | Aug 2012 | A1 |
20120243719 | Franklin et al. | Sep 2012 | A1 |
20120249431 | Li | Oct 2012 | A1 |
20120254878 | Nachman et al. | Oct 2012 | A1 |
20130082937 | Liu et al. | Apr 2013 | A1 |
20130082939 | Zhao et al. | Apr 2013 | A1 |
20130265276 | Obeidat et al. | Oct 2013 | A1 |
20130290761 | Moon et al. | Oct 2013 | A1 |
20130307809 | Sudou | Nov 2013 | A1 |
20130314349 | Chien | Nov 2013 | A1 |
20140006825 | Shenhav | Jan 2014 | A1 |
20140025973 | Schillings | Jan 2014 | A1 |
20140049480 | Rabii | Feb 2014 | A1 |
20140075226 | Heo et al. | Mar 2014 | A1 |
20140149754 | Silva et al. | May 2014 | A1 |
20140237277 | Mallinson et al. | Aug 2014 | A1 |
20140285449 | Cho et al. | Sep 2014 | A1 |
20140320393 | Modarres et al. | Oct 2014 | A1 |
20150185909 | Gecnuk | Jul 2015 | A1 |
20150286263 | Heo et al. | Oct 2015 | A1 |
20160103488 | Levesque et al. | Apr 2016 | A1 |
20160306393 | Huitema | Oct 2016 | A1 |
20170034331 | Hao et al. | Feb 2017 | A1 |
20170060248 | Modarres et al. | Mar 2017 | A1 |
20180088736 | Jeong et al. | Mar 2018 | A1 |
20190073035 | Modarres et al. | Mar 2019 | A1 |
Number | Date | Country |
---|---|---|
1757027 | Apr 2006 | CN |
201000588 | Jan 2008 | CN |
101751112 | Jun 2010 | CN |
101978748 | Feb 2011 | CN |
102461135 | May 2012 | CN |
102508591 | Jun 2012 | CN |
2 479 642 | Jul 2012 | EP |
2 482 167 | Aug 2012 | EP |
6-95787 | Apr 1994 | JP |
10-333789 | Dec 1998 | JP |
11-102253 | Apr 1999 | JP |
2003-501959 | Jan 2003 | JP |
2005-283843 | Oct 2005 | JP |
2011-139301 | Jul 2011 | JP |
2002-536917 | Oct 2012 | JP |
2014-49011 | Mar 2014 | JP |
10-0748984 | Aug 2007 | KR |
10-2010-0061894 | Jun 2010 | KR |
10-2011-0071216 | Jun 2011 | KR |
2017119531 | Jul 2017 | WO |
Entry |
---|
Chinese Search Report dated Feb. 3, 2021. |
Number | Date | Country | |
---|---|---|---|
20190220076 A1 | Jul 2019 | US |
Relation | Number | Date | Country
---|---|---|---
Parent | 14744521 | Jun 2015 | US
Child | 16361329 | | US
Parent | 13595119 | Aug 2012 | US
Child | 14744521 | | US