Optimizing preemptive operating system with motion sensing

Information

  • Patent Grant
  • Patent Number
    10,754,683
  • Date Filed
    Tuesday, March 27, 2018
  • Date Issued
    Tuesday, August 25, 2020
Abstract
A method and apparatus to provide a scheduler comprising determining a current use characteristic for a device based on motion information and active applications, and scheduling a future task.
Description
FIELD OF THE INVENTION

The present invention relates to preemptive operating systems, and more particularly to scheduling in preemptive operating systems.


BACKGROUND

Mobile devices are gaining increasing functionality and importance in our daily lives. Accelerometers may be incorporated in these devices for measuring the motion that the device experiences. More and more of these mobile devices have multi-tasking preemptive operating systems that allow the device to run several programs or applications at once. These preemptive operating systems have schedulers to prioritize tasks. In prior implementations, these schedulers based their decision on the priority of each application or function, and occasionally on the time of day.


SUMMARY OF THE INVENTION

A method and apparatus to provide a scheduler comprising determining a current use characteristic for a device based on motion information and active applications, and scheduling a future task.





BRIEF DESCRIPTION OF THE DRAWINGS

The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:



FIG. 1 is a network diagram illustrating a network in which the present system may work.



FIG. 2 is a block diagram of one embodiment of the scheduler.



FIG. 3 is a flowchart of one embodiment of scheduling.



FIGS. 4A and 4B are a flowchart of one embodiment of setting scheduling preferences.



FIG. 5 is a list of exemplary tasks and the associated resources used.



FIG. 6 is a block diagram of a computer system that may be utilized with the present invention.





DETAILED DESCRIPTION

The method and apparatus described herein provide a preemptive operating system which schedules tasks in a mobile device. Prior art schedulers have had no awareness of the motion that the device is experiencing, and in particular of what the user's motion implies about the state of the device and the likelihood that the user will perform certain actions with the device. Such schedulers know only the current state of various programs and whether the device's screen is turned on or off.


The scheduler of the present invention in one embodiment optimizes these preemptive operating environments using motion information. The scheduler optimizes tasks, programs, and data communications based upon the device's use characteristic, determined based on the motion information. Data communications may include pulling data from and pushing data to the network.


While the scheduler improves the performance of all programs on a mobile device, some programs require especially low latency in their execution in order to operate accurately. In particular, programs that receive and process the input from integrated or wirelessly tethered sensors degrade rapidly in performance and accuracy as latency increases. Many of these sensor programs analyze data in real time, sampling a sensor at rates from 1 to 300+ Hz. In a preemptive operating system where the host processor interfaces directly with a given sensor, a higher priority task such as a phone call or data transfer will preempt the lower priority task of real-time data analysis. If system bandwidth is limited, the lower priority task may be halted entirely, significantly degrading the program's performance.



FIG. 1 is a network diagram illustrating a network in which the present system may work. The system includes a mobile device 110. The mobile device 110, in one embodiment, receives data from one or more sensors 120, 170. The sensors may be coupled to the mobile device 110 via a wireless connection (sensors 120), such as 802.11 WiFi, Bluetooth, etc., or may be integrated in the mobile device (sensors 170). The integrated sensors 170, in one embodiment, include an inertial sensor. The mobile device 110 can also retrieve data from a server 130 via network 140, or send data to a server 130 via network 140. The network may be the Internet, a cellular telephone network, or any other network. The mobile device 110 may obtain data from the network via various protocols, including Wireless Application Protocol (WAP), HTTP, or other protocols.


The mobile device 110 may also obtain data from another mobile device 150, either through a direct connection or through network 140. The scheduler 160 in mobile device 110 determines when the various tasks and communications occur. This may include obtaining data from, and sending data to, servers 130, sensors 120, and other mobile devices 150, as well as internal processes such as programs and tasks.



FIG. 2 is a block diagram of one embodiment of the scheduler. The scheduler 160 in one embodiment is a software application which runs on a mobile device. In another embodiment, the scheduler 160 may have a client and a server component, with the functionality split between the client and the server. In one embodiment, the preference settings and calculations may reside on the server, which has more processing power and storage available, while the implementation/use aspects reside on the client. For simplicity, the description below refers to any schedulable task, program, or data communication as a “task.”


Scheduler 160 includes a resource identification list 210, which includes a listing of one or more potential tasks and the resource(s) that the task uses. For example, a download task utilizes network bandwidth, storage bandwidth (memory bus), and storage. By contrast, a telephone call task only uses network bandwidth. FIG. 5 lists a number of exemplary tasks and their associated resources.
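As a concrete illustration, the resource identification list can be modeled as a mapping from task type to the set of resources that task consumes. The following is a minimal sketch only; the task names and resource labels are hypothetical stand-ins, not drawn from FIG. 5:

```python
# Hypothetical resource identification list (cf. resource ID list 210):
# each task type maps to the set of resources it uses.
RESOURCE_LIST = {
    "download":        {"network_bandwidth", "storage_bandwidth", "storage"},
    "phone_call":      {"network_bandwidth"},
    "sensor_analysis": {"cpu", "memory_bandwidth"},
    "backup":          {"network_bandwidth", "storage_bandwidth", "storage"},
}

def shared_resources(task_a: str, task_b: str) -> set:
    """Return the resources two tasks would contend for."""
    return RESOURCE_LIST.get(task_a, set()) & RESOURCE_LIST.get(task_b, set())
```

For example, shared_resources("download", "phone_call") returns {"network_bandwidth"}, flagging a potential conflict over the network.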


Prioritizer 220 includes a list of tasks and their relative priorities. For example, a task which is observed by the user is a higher priority than a task which is generally not observed by the user. A task which provides semi-real-time feedback or other data processing is higher priority than a task which provides background download, synchronization, or similar features. In one embodiment, the user may use a user interface 225 to prioritize tasks, as sketched below. In one embodiment, the system comes with a default set of priorities, which may be edited or adjusted by the user.
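A minimal sketch of such a priority table, assuming a lower number means higher priority and that a user edit simply overwrites the default (all task names are hypothetical):

```python
# Hypothetical default priorities (lower number = higher priority).
DEFAULT_PRIORITIES = {
    "sensor_analysis": 0,  # semi-real-time feedback observed by the user
    "phone_call":      1,  # direct user interaction
    "download":        3,  # background task
    "backup":          4,  # rarely observed by the user
}

def apply_user_override(priorities: dict, task: str, new_priority: int) -> None:
    """Model a user adjustment made through user interface 225."""
    priorities[task] = new_priority
```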


Motion information logic 230 receives motion data. In one embodiment, motion data is received from an accelerometer or other inertial sensor. In one embodiment, motion data is received from a 3-dimensional accelerometer that is part of the device. Motion logic 230 determines a current motion, and based on an identified activity of the user, determines expected future motion as well. For example, if the user is walking at a rapid cadence, it is likely that he or she will continue to walk. If the user is playing a game, he or she is likely to continue moving the device and playing the game.
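One way to sketch this logic, assuming a crude variance-based heuristic over accelerometer magnitudes and a simple persistence model for prediction (both are assumptions standing in for whatever classifier motion logic 230 actually uses):

```python
import math

def classify_activity(accel_samples, sample_rate_hz=50.0):
    """Crude activity guess from the variance of accelerometer magnitudes.

    accel_samples: list of (x, y, z) readings; thresholds are illustrative.
    """
    if not accel_samples:
        return "stationary"
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in accel_samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    if var < 0.01:
        return "stationary"
    if var < 1.0:
        return "handling"   # device being actively moved/used, e.g. gaming
    return "walking"

def predict_future_motion(current_activity):
    """Activities tend to persist: a walking user will likely keep walking."""
    return current_activity
```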


In one embodiment, the system further includes an active application detector 240. In one embodiment, the active application detector 240 detects when an application is active (i.e., being used), even if there is no associated motion. For example, the user may open an application such as a web download application while keeping the mobile device stationary.


Current task scheduler 250 prioritizes current tasks based on prioritizer 220 data and resource identification list 210 data. The current tasks are determined by current task scheduler 250 based on active application detector 240 and motion logic 230.


If the current task scheduler 250 determines that two applications conflict, it can, in one embodiment, send a stop message to application stop logic 270. In one embodiment, current task scheduler 250 also then sends the stopped task to future task scheduler 260. In one embodiment, current task scheduler 250 may also utilize resource restrictor 280 to reduce available resources for lower priority tasks. In one embodiment, current task scheduler 250 uses prioritizer data to determine which application(s) to throttle.


Resource restrictor 280 may be used to reduce the available resources to one or more of the currently active applications. This may include reducing available bandwidth.
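A hedged sketch of this conflict-handling path, reusing the shared_resources mapping and the lower-is-higher priority convention from the sketches above. Current task scheduler 250, resource restrictor 280, and application stop logic 270 are modeled as plain functions; the structure follows the description, but the details are assumptions:

```python
def throttle(task, resources):
    """Stand-in for resource restrictor 280: reduce the task's resource share."""
    print(f"throttling {task}'s use of {sorted(resources)}")

def stop(task):
    """Stand-in for application stop logic 270."""
    print(f"stopping {task}")

def resolve_conflict(active_tasks, priorities, future_tasks, use_throttling):
    """Find one resource conflict among active tasks and resolve it."""
    for i, a in enumerate(active_tasks):
        for b in active_tasks[i + 1:]:
            shared = shared_resources(a, b)
            if not shared:
                continue
            # Lower number = higher priority, so max() picks the loser.
            loser = max(a, b, key=lambda t: priorities[t])
            if use_throttling:
                throttle(loser, shared)
            else:
                stop(loser)
                active_tasks.remove(loser)
                future_tasks.append(loser)  # requeue via future task scheduler 260
            return loser
    return None
```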


Future task scheduler 260 receives future tasks for scheduling. In one embodiment, these future tasks may be received from current task scheduler 250. In one embodiment, future tasks may be received from the user. The user may, in one embodiment, add tasks to a list of “future tasks” which should be performed when there are resources available. For example, for a larger download or upload project, the user may indicate that the project is a “future task” instead of directly initiating the task.


Future task scheduler 260 passes a task to current task scheduler 250 when the motion info logic 230 and active application detector 240 indicate that the time is good for performing that task. For example, when the device is not in motion and there are no applications using network bandwidth, an upload or download future task may be scheduled. In one embodiment, future task scheduler 260 passes tasks for data calls to the server for uploads, downloads, and synchronization to the current task scheduler 250 when the device is idle. In one embodiment, the device is idle when no motion is detected. In one embodiment, the device is idle when no motion is detected and the user is not interacting with any application.
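A minimal sketch of this idle test and the hand-off from the future task scheduler to the current task scheduler, following the second embodiment above (no motion and no application in use); the function names are illustrative:

```python
def device_is_idle(current_activity, active_apps):
    """Idle when no motion is detected and no application is in use."""
    return current_activity == "stationary" and not active_apps

def release_future_tasks(future_tasks, current_activity, active_apps, run_task):
    """Hand queued tasks to the current task scheduler while the device stays idle."""
    while future_tasks and device_is_idle(current_activity, active_apps):
        run_task(future_tasks.pop(0))
```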


In one embodiment, the system may have tasks that are interruptible (such as downloads) and tasks that are not interruptible (such as installation of applications). In one embodiment, future task scheduler 260 may also have as an input a clock. In one embodiment, the future task scheduler may take into account the likelihood of a user initiating a conflicting task, prior to passing a non-interruptible task to the current task scheduler 250.
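For the non-interruptible case, the likelihood check might be approximated from historical usage by hour of day using the clock input; a sketch under that assumption (the histogram representation and threshold are invented for illustration):

```python
def likely_user_conflict(usage_by_hour, hour, threshold=0.1):
    """usage_by_hour[h]: historical fraction of hour h with user-initiated tasks.

    Returns True when a conflicting user task is likely, in which case a
    non-interruptible future task should be held back for now.
    """
    return usage_by_hour[hour % 24] >= threshold
```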



FIG. 3 is a flowchart of one embodiment of scheduling. The process starts at block 305.


At block 310, motion data is logged, in one embodiment in a buffer or similar temporary memory. At block 315, current motion is identified, and future expected motions are identified. At block 320, the active applications are identified.


At block 325, the process determines whether there is a conflict between the motions/sensors and any current tasks. If there is no conflict, the process continues to block 330. At block 330, the process determines whether there are any future tasks. Future tasks are tasks either scheduled by the user to be executed in the future, or halted previously. If there are no future tasks, the process returns to block 310 to continue logging motion data.


If there are future tasks, the process, at block 335, determines whether there are resources available currently to execute the future task. In one embodiment, this is determined based on the motion data. In one embodiment, this is determined based on the motion data and the active application data. In one embodiment, this is determined based on the motion data and time-of-day data.


If the resources are available, at block 340 the future task is initiated. The process then returns to block 330, to query whether there are any more future tasks to be scheduled. In one embodiment, the future tasks are scheduled in order of priority. That is, the first query is for the highest priority future task, then for the next highest priority, and so on. In one embodiment, each future task is evaluated by this process. If there are no remaining future tasks, the process returns to block 310 to continue logging motion data.


If, at block 325, the process found that there was a conflict among the current applications, the process continues to block 350.


At block 350, the conflicting resource is identified. This may include network bandwidth, memory bandwidth, display, etc.


At block 355, the lowest priority application which uses that resource is identified. In one embodiment, the lowest priority application may be one that is restartable and not viewed or actively utilized by the user. For example, a backup application may be the lowest priority application.


At block 360, the process determines whether throttling should be used. In one embodiment, throttling is always used when available. In one embodiment, throttling is only used if the application is a non-interruptible application. In one embodiment, the user may set a preference for throttling.


If throttling should be used, the process, at block 365, throttles the conflicting application's use of the conflicting resource. The process then returns to block 325, to determine whether there is still a conflict.


If throttling should not be used, at block 370 the lowest priority application is stopped. It is then, at block 375, added to the future tasks list. In this way, the system ensures that the task will be performed at some future time. The process then returns to block 325, to determine whether there is still a conflict.


In this way, the system provides a method to ensure that low priority applications are throttled based on motion data, and potentially other data. Note that while this and other processes are shown in flowchart form, the actual implementation need not be sequential as described. Thus, for example, the future task scheduler may also continuously monitor resource availability for the tasks on its list. In one embodiment, conflicts may be tested for every time there is a change in state in the device, e.g., a new application is started, a new motion type is detected, etc.
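Composing the sketches above, one pass of the FIG. 3 loop might look like the following. This is a hypothetical rendering of the flowchart, not the patented implementation; idleness is used here as a simplified stand-in for the resource-availability check at block 335:

```python
def scheduling_pass(motion_buffer, active_tasks, future_tasks,
                    priorities, use_throttling=True):
    """One iteration: classify motion, resolve conflicts, release future tasks."""
    activity = classify_activity(motion_buffer)          # blocks 310-315
    loser = resolve_conflict(active_tasks, priorities,   # blocks 325, 350-375
                             future_tasks, use_throttling)
    if loser is None and future_tasks:                   # blocks 330-340
        release_future_tasks(future_tasks, activity, active_tasks,
                             run_task=active_tasks.append)
```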



FIGS. 4A and 4B are a flowchart of one embodiment of setting scheduling preferences. The process starts at block 405. In one embodiment, this process is performed on the mobile device. In another embodiment, this process may be performed on a remote server, and the results may be uploaded to the mobile device. In one embodiment, the process may be split between the mobile device and a server.


At block 410, the applications on the mobile device are identified. In one embodiment, this process is triggered each time a new application is added to the mobile device. In one embodiment, only new applications are evaluated and prioritized in that instance.


At block 415, the process identifies any applications needing real-time feedback. Such applications may include sensor applications which require real-time control commands, and applications, such as telephone applications, where even short delays can impact the user experience.


At block 420, the process determines whether there are any such applications. If so, at block 422, these applications receive the highest priority. The process then continues to block 425. If there are no such applications, the process continues directly to block 425.


At block 425, the process identifies any applications having direct user interactions. Such applications may include games, productivity applications, and other applications where delays can impact the user experience.


At block 430, the process determines whether there are any such applications. If so, at block 432, these applications receive the next highest priority. The process then continues to block 435. If there are no such applications, the process continues directly to block 435.


At block 435, the process identifies any non-interruptible applications. Such applications may include software installation, games requiring periodic memory access, and other applications that cannot be terminated without causing problems.


At block 440, the process determines whether there are any such applications. If so, at block 442, these applications receive the next highest priority. The process then continues to block 445. If there are no such applications, the process continues directly to block 445.


At block 445, the process identifies any applications including periodic reporting. This includes sensors that have periodic updates, applications which report out to the user, applications such as email which use periodic data pulls, etc.


At block 450, the process determines whether there are any such applications. If so, at block 452, these applications receive the next highest priority. The process then continues to block 455. If there are no such applications, the process continues directly to block 455.


At block 455, the remaining applications receive the lowest priority.


At block 460, the process determines whether there are likely conflicts between equally prioritized applications. For example, it is unlikely that a user will be playing two games simultaneously, but the user may walk and make a telephone call at the same time. If there are equally prioritized applications which may conflict, the process continues to block 462.


At block 462, the conflicting applications are reprioritized based on usage statistics or other measurements. In one embodiment, the prioritization occurs within the same category. That is, the lowest priority application within a category is still a higher priority than the highest priority application in the next lower category. In one embodiment, more frequently used applications receive higher priority. In one embodiment, delay-sensitivity is used for prioritizing within the category. In one embodiment, this step is skipped entirely, and the user is prompted to make prioritization decisions. In one embodiment, if two such applications are found in conflict during use, the one which was activated later is considered the higher priority application.
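A sketch of this prioritization pass, treating the four categories of blocks 422-452 as tiers and using usage counts to break ties within a tier. The category tags and the specific tie-break rule are assumptions layered on the description above:

```python
# Category tags in descending priority, mirroring blocks 422, 432, 442, 452.
CATEGORY_TIERS = [
    "real_time_feedback",   # block 422: highest priority
    "direct_interaction",   # block 432
    "non_interruptible",    # block 442
    "periodic_reporting",   # block 452
]

def assign_priorities(app_tags, usage_counts):
    """Map each app to a (tier, tie_break) tuple; lower tuples sort first.

    app_tags: app name -> set of category tags; usage_counts: app -> launches.
    Apps matching no category fall to the lowest tier (block 455). Within a
    tier, more frequently used apps rank higher, without crossing tiers.
    """
    priorities = {}
    for app, tags in app_tags.items():
        tier = next((i for i, t in enumerate(CATEGORY_TIERS) if t in tags),
                    len(CATEGORY_TIERS))
        priorities[app] = (tier, -usage_counts.get(app, 0))
    return priorities
```

Sorting the applications by these tuples yields the priority order that, per block 465, may be presented to the user for adjustment.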


At block 465, in one embodiment the priorities are provided to the user, and the user is permitted to make changes. In one embodiment, this only occurs if the user specifically requests it. Otherwise, the entire scheduling process is completely transparent to the user, and the user need not be aware of it at all. In one embodiment, if the user lowers the priority of an application which requires real-time feedback or has user interaction, the user is warned of the risk of such a reprioritization.


At block 470, the priorities are saved. The process then ends at block 475. In one embodiment, this process may be invoked by the user at any time, may be automatically triggered periodically, may be triggered whenever a new application is added to the mobile device, or may be started by another trigger.



FIG. 6 is a block diagram of a particular machine that may be used with the present invention. It will be apparent to those of ordinary skill in the art, however, that other alternative systems of various system architectures may also be used.


The data processing system illustrated in FIG. 6 includes a bus or other internal communication means 640 for communicating information, and a processing unit 610 coupled to the bus 640 for processing information. The processing unit 610 may be a central processing unit (CPU), a digital signal processor (DSP), or another type of processing unit 610.


The system further includes, in one embodiment, a random access memory (RAM) or other volatile storage device 620 (referred to as memory), coupled to bus 640 for storing information and instructions to be executed by processor 610. Main memory 620 may also be used for storing temporary variables or other intermediate information during execution of instructions by processing unit 610.


The system also comprises in one embodiment a read only memory (ROM) 650 and/or static storage device 650 coupled to bus 640 for storing static information and instructions for processor 610. In one embodiment, the system also includes a data storage device 630 such as a magnetic disk or optical disk and its corresponding disk drive, or Flash memory or other storage which is capable of storing data when no power is supplied to the system. Data storage device 630 in one embodiment is coupled to bus 640 for storing information and instructions.


The system may further be coupled to an output device 670, such as a cathode ray tube (CRT) or a liquid crystal display (LCD), coupled to bus 640 through bus 660 for outputting information. The output device 670 may be a visual output device, an audio output device, and/or a tactile output device (e.g., vibrations, etc.).


An input device 675 may be coupled to the bus 660. The input device 675 may be an alphanumeric input device, such as a keyboard including alphanumeric and other keys, for enabling a user to communicate information and command selections to processing unit 610. An additional user input device 680 may further be included. One such user input device 680 is a cursor control device, such as a mouse, a trackball, a stylus, cursor direction keys, or a touch screen, which may be coupled to bus 640 through bus 660 for communicating direction information and command selections to processing unit 610, and for controlling movement on display device 670.


Another device, which may optionally be coupled to computer system 600, is a network device 685 for accessing other nodes of a distributed system via a network. The communication device 685 may include any of a number of commercially available networking peripheral devices such as those used for coupling to an Ethernet, token ring, Internet, or wide area network, personal area network, wireless network or other method of accessing other devices. The communication device 685 may further be a null-modem connection, or any other mechanism that provides connectivity between the computer system 600 and the outside world.


Note that any or all of the components of this system illustrated in FIG. 6 and associated hardware may be used in various embodiments of the present invention.


It will be appreciated by those of ordinary skill in the art that the particular machine that embodies the present invention may be configured in various ways according to the particular implementation. The control logic or software implementing the present invention can be stored in main memory 620, mass storage device 630, or other storage medium locally or remotely accessible to processor 610.


It will be apparent to those of ordinary skill in the art that the system, method, and process described herein can be implemented as software stored in main memory 620 or read only memory 650 and executed by processor 610. This control logic or software may also be resident on an article of manufacture comprising a computer readable medium having computer readable program code embodied therein and being readable by the mass storage device 630 and for causing the processor 610 to operate in accordance with the methods and teachings herein.


The present invention may also be embodied in a handheld or portable device containing a subset of the computer hardware components described above. For example, the handheld device may be configured to contain only the bus 640, the processor 610, and memory 650 and/or 620.


The handheld device may be configured to include a set of buttons or input signaling components with which a user may select from a set of available options. These could be considered input device #1 (675) or input device #2 (680). The handheld device may also be configured to include an output device 670 such as a liquid crystal display (LCD) or display element matrix for displaying information to a user of the handheld device. Conventional methods may be used to implement such a handheld device. The implementation of the present invention for such a device would be apparent to one of ordinary skill in the art given the disclosure of the present invention as provided herein.


The present invention may also be embodied in a special purpose appliance including a subset of the computer hardware components described above, such as a kiosk or a vehicle. For example, the appliance may include a processing unit 610, a data storage device 630, a bus 640, and memory 620, and no input/output mechanisms, or only rudimentary communications mechanisms, such as a small touch-screen that permits the user to communicate in a basic manner with the device. In general, the more special-purpose the device is, the fewer of the elements need be present for the device to function. In some devices, communications with the user may be through a touch-based screen, or similar mechanism. In one embodiment, the device may not provide any direct input/output signals, but may be configured and accessed through a website or other network-based connection through network device 685.


It will be appreciated by those of ordinary skill in the art that any configuration of the particular machine implemented as the computer system may be used according to the particular implementation. The control logic or software implementing the present invention can be stored on any machine-readable medium locally or remotely accessible to processor 610. A machine-readable medium includes any mechanism for storing information in a form readable by a machine (e.g. a computer). For example, a machine readable medium includes read-only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, or other storage media which may be used for temporary or permanent data storage. In one embodiment, the control logic may be implemented as transmittable data, such as electrical, optical, acoustical or other forms of propagated signals (e.g. carrier waves, infrared signals, digital signals, etc.).


In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims
  • 1. A method comprising: identifying one or more resources currently in use by a device; identifying current motion information for the device; determining predicted future motion for the device; and scheduling a future task based on the resources used by other active tasks, the current motion information, and the predicted future motion of the device.
  • 2. The method of claim 1, further comprising: scheduling the future task to be immediately executed when the task utilizes resources not currently used by the device.
  • 3. The method of claim 1, further comprising: determining that the future task would utilize a particular resource currently used by an other task on the device; and when the future task is a higher priority than the other task, reducing resource availability to the other task.
  • 4. The method of claim 3, wherein reducing resource availability comprises terminating the other task by the device.
  • 5. The method of claim 4, wherein when the other task is terminated, the other task is placed on a future task list to be executed when the resource becomes available again.
  • 6. The method of claim 1, further comprising: prioritizing tasks based on resource use and application profile.
  • 7. The method of claim 1, wherein the task is transmitting data to a server, and wherein the data transmission task is scheduled when the motion information and the currently active applications indicate that the device is idle.
  • 8. The method of claim 7, wherein the data is for one or more of: uploading, downloading, and synchronizing data with a server.
  • 9. The method of claim 7, wherein the device is idle when the device is motionless, and no user interaction with the device is occurring.
  • 10. The method of claim 7, wherein the device is idle when the device is motionless and no voice call is being made.
  • 11. A device comprising: a motion sensor to obtain motion information; a resource detector to detect one or more currently used resources; and a processor to predict a future use characteristic for the device, based on the one or more currently used resources, current motion information of the device, and predicted future motion information; and a scheduler to schedule one or more tasks to be executed at a future time based on the future use characteristics of the device.
  • 12. The device of claim 11, further comprising: the scheduler to schedule the task to be immediately executed, when the task utilizes resources not currently used.
  • 13. The device of claim 11, further comprising: a prioritizer to determine a conflict between the currently used resources and the resources needed by the task; and when the task is a higher priority, a resource restrictor to reduce resource availability to current users of the resource.
  • 14. The device of claim 13, wherein reducing resource availability comprises terminating the current users of the resource.
  • 15. The mobile device of claim 14, wherein when the existing operation is terminated, the existing operation is placed on a future task list to be executed when the resource becomes available again.
  • 16. The device of claim 11, further comprising: the scheduler to prioritize tasks based on resource use and application profile.
  • 17. The device of claim 11, wherein the task is transmitting data to a server, and wherein the data transmission task is scheduled when the motion information and the currently active applications indicate that the device is idle.
  • 18. The device of claim 17, wherein the data call is for one or more of: uploading, downloading, and synchronizing data with a server.
  • 19. The device of claim 17, wherein the device is idle when one of the following is occurring: the device is motionless and no user interaction with the device is occurring, or when the device is motionless and no voice call is being made.
  • 20. A device including a network connection comprising: a motion information logic to receive motion information of a device, and to predict future motion information; a resource detector to detect current resource usage of the device; a processor to determine future use characteristics based on the one or more of the current resource usage, the current motion information, and the predicted future motion information; a scheduler to receive a task to be scheduled to be executed at a future time; the scheduler to schedule the task for future execution based on the predicted future use characteristics.
Parent Case Info

The present patent application is a continuation of U.S. application Ser. No. 14/936,629, filed on Nov. 9, 2015, issued as U.S. Pat. No. 9,940,161 on Apr. 10, 2018, which is a continuation of U.S. application Ser. No. 14/047,937, filed on Oct. 7, 2013, issued as U.S. Pat. No. 9,183,044 on Nov. 10, 2015, which is a continuation of U.S. application Ser. No. 11/829,813, filed on Jul. 27, 2007, now U.S. Pat. No. 8,555,282, which issued on Oct. 8, 2013, all of which are incorporated herein by reference.

US Referenced Citations (451)
Number Name Date Kind
4238146 Kitamura et al. Dec 1980 A
4285041 Smith Aug 1981 A
4571680 Wu Feb 1986 A
4578769 Frederick Mar 1986 A
4700369 Siegal et al. Oct 1987 A
4776323 Spector Oct 1988 A
5192964 Shinohara et al. Mar 1993 A
5313060 Gast et al. May 1994 A
5323060 Gast et al. Jun 1994 A
5386210 Lee Jan 1995 A
5430480 Allen et al. Jul 1995 A
5446725 Ishiwatari Aug 1995 A
5446775 Wright et al. Aug 1995 A
5454114 Yach et al. Sep 1995 A
5485402 Smith et al. Jan 1996 A
5506987 Abramson Apr 1996 A
5515419 Sheffer May 1996 A
5583776 Levi et al. Dec 1996 A
5593431 Sheldon Jan 1997 A
5654619 Iwashita Aug 1997 A
5703786 Conkright Dec 1997 A
5708863 Satoh et al. Jan 1998 A
5717611 Terui et al. Feb 1998 A
5737439 Lapsley et al. Apr 1998 A
5771001 Cobb Jun 1998 A
5778882 Raymond et al. Jul 1998 A
5790490 Satoh et al. Aug 1998 A
5911065 Williams Jun 1999 A
5955667 Fyfe Sep 1999 A
5955871 Nguyen Sep 1999 A
5960085 de la Huerga Sep 1999 A
5976083 Richardson et al. Nov 1999 A
6013007 Root et al. Jan 2000 A
6061456 Andrea et al. May 2000 A
6122595 Varley et al. Sep 2000 A
6129686 Friedman Oct 2000 A
6135951 Richardson et al. Oct 2000 A
6145389 Ebeling et al. Nov 2000 A
6246321 Rechsteiner et al. Jun 2001 B1
6282496 Chowdhary Aug 2001 B1
6336891 Fedrigon et al. Jan 2002 B1
6353449 Gregg et al. Mar 2002 B1
6369794 Sakurai et al. Apr 2002 B1
6374054 Schinner Apr 2002 B1
6396883 Yang et al. May 2002 B2
6408330 de la Huerga Jun 2002 B1
6428490 Kramer et al. Aug 2002 B1
6470147 Imada Oct 2002 B1
6478736 Mault Nov 2002 B1
6493652 Ohlenbusch et al. Dec 2002 B1
6496695 Kouji et al. Dec 2002 B1
6513381 Fyfe et al. Feb 2003 B2
6522266 Soehren et al. Feb 2003 B1
6529144 Nilsen et al. Mar 2003 B1
6532419 Begin et al. Mar 2003 B1
6539336 Vock et al. Mar 2003 B1
6595929 Stivoric et al. Jul 2003 B2
6601016 Brown et al. Jul 2003 B1
6607493 Song Aug 2003 B2
6609004 Morse Aug 2003 B1
6611789 Darley Aug 2003 B1
6628898 Endo Sep 2003 B2
6634992 Ogawa Oct 2003 B1
6665802 Ober Dec 2003 B1
6672991 O'Malley Jan 2004 B2
6685480 Nishimoto et al. Feb 2004 B2
6700499 Kubo et al. Mar 2004 B2
6731958 Shirai May 2004 B1
6766176 Gupta et al. Jul 2004 B1
6771250 Oh Aug 2004 B1
6786877 Foxlin Sep 2004 B2
6788980 Johnson Sep 2004 B1
6790178 Mault et al. Sep 2004 B1
6807564 Zellner et al. Oct 2004 B1
6813582 Levi et al. Nov 2004 B2
6823036 Chen Nov 2004 B1
6826477 Ladetto et al. Nov 2004 B2
6836744 Asphahani et al. Dec 2004 B1
6881191 Oakley et al. Apr 2005 B2
6885971 Vock et al. Apr 2005 B2
6895425 Kadyk et al. May 2005 B1
6898550 Blackadar et al. May 2005 B1
6928382 Hong et al. Aug 2005 B2
6941239 Unuma et al. Sep 2005 B2
6959259 Vock et al. Oct 2005 B2
6975959 Dietrich et al. Dec 2005 B2
6997852 Watterson et al. Feb 2006 B2
7002553 Shkolnikov Feb 2006 B2
7010332 Irvin et al. Mar 2006 B1
7020487 Kimata Mar 2006 B2
7027087 Nozaki et al. Apr 2006 B2
7028547 Shiratori et al. Apr 2006 B2
7042509 Onuki May 2006 B2
7054784 Flentov et al. May 2006 B2
7057551 Vogt Jun 2006 B1
7072789 Vock et al. Jul 2006 B2
7089508 Wright Aug 2006 B1
7092846 Vock et al. Aug 2006 B2
7096619 Jackson et al. Aug 2006 B2
7148797 Albert Dec 2006 B2
7148879 Amento et al. Dec 2006 B2
7149964 Cottrille et al. Dec 2006 B1
7155507 Hirano et al. Dec 2006 B2
7158912 Vock et al. Jan 2007 B2
7169084 Tsuji Jan 2007 B2
7171222 Fostick Jan 2007 B2
7171331 Vock et al. Jan 2007 B2
7173604 Marvit et al. Feb 2007 B2
7176886 Marvit et al. Feb 2007 B2
7176887 Marvit et al. Feb 2007 B2
7176888 Marvit et al. Feb 2007 B2
7177684 Kroll et al. Feb 2007 B1
7180500 Marvit et al. Feb 2007 B2
7180501 Marvit et al. Feb 2007 B2
7180502 Marvit et al. Feb 2007 B2
7200517 Darley et al. Apr 2007 B2
7212230 Stavely May 2007 B2
7212943 Aoshima et al. May 2007 B2
7220220 Stubbs et al. May 2007 B2
7245725 Beard Jul 2007 B1
7254516 Case et al. Aug 2007 B2
7280096 Marvit et al. Oct 2007 B2
7280849 Bailey Oct 2007 B1
7297088 Tsuji Nov 2007 B2
7301526 Marvit et al. Nov 2007 B2
7301527 Marvit et al. Nov 2007 B2
7301528 Marvit et al. Nov 2007 B2
7301529 Marvit et al. Nov 2007 B2
7305323 Skvortsov et al. Dec 2007 B2
7328611 Klees et al. Feb 2008 B2
7334472 Seo et al. Feb 2008 B2
7353112 Choi et al. Apr 2008 B2
7365735 Reinhardt et al. Apr 2008 B2
7365736 Marvit et al. Apr 2008 B2
7365737 Marvit et al. Apr 2008 B2
7379999 Zhou et al. May 2008 B1
7382611 Tracy et al. Jun 2008 B2
7387611 Inoue et al. Jun 2008 B2
7397357 Krumm et al. Jul 2008 B2
7451056 Flentov et al. Nov 2008 B2
7457719 Kahn et al. Nov 2008 B1
7457872 Aton et al. Nov 2008 B2
7463997 Pasolini et al. Dec 2008 B2
7467060 Kulach et al. Dec 2008 B2
7489937 Chung et al. Feb 2009 B2
7502643 Farringdon et al. Mar 2009 B2
7512515 Vock et al. Mar 2009 B2
7526402 Tanenhaus et al. Apr 2009 B2
7608050 Sugg Oct 2009 B2
7640804 Daumer et al. Jan 2010 B2
7647196 Kahn et al. Jan 2010 B2
7653508 Kahn et al. Jan 2010 B1
7664657 Letzt et al. Feb 2010 B1
7689107 Enomoto Mar 2010 B2
7705884 Pinto et al. Apr 2010 B2
7736272 Martens Jun 2010 B2
7752011 Niva et al. Jul 2010 B2
7753861 Kahn et al. Jul 2010 B1
7765553 Douceur Jul 2010 B2
7774156 Niva et al. Aug 2010 B2
7788059 Kahn et al. Aug 2010 B1
7857772 Bouvier et al. Dec 2010 B2
7881902 Kahn et al. Feb 2011 B1
7892080 Dahl Feb 2011 B1
7907901 Kahn et al. Mar 2011 B1
7934020 Xu Apr 2011 B1
7987070 Kahn et al. Jul 2011 B2
8187182 Kahn et al. May 2012 B2
8275635 Stivoric et al. Sep 2012 B2
8398546 Pacione et al. Mar 2013 B2
8458715 Khosla Jun 2013 B1
8555282 Kahn Oct 2013 B1
8562489 Burton et al. Oct 2013 B2
8790279 Brunner Jul 2014 B2
8949070 Kahn et al. Feb 2015 B1
8996332 Kahn et al. Mar 2015 B2
9183044 Kahn Nov 2015 B2
20010047488 Verplaetse et al. Nov 2001 A1
20020006284 Kim Jan 2002 A1
20020022551 Watterson et al. Feb 2002 A1
20020023654 Webb Feb 2002 A1
20020027164 Mault et al. Mar 2002 A1
20020042830 Bose et al. Apr 2002 A1
20020044634 Rooke et al. Apr 2002 A1
20020054214 Yoshikawa May 2002 A1
20020089425 Kubo et al. Jul 2002 A1
20020091956 Potter et al. Jul 2002 A1
20020109600 Mault et al. Aug 2002 A1
20020118121 Lehrman et al. Aug 2002 A1
20020122543 Rowen Sep 2002 A1
20020138017 Bui et al. Sep 2002 A1
20020142887 O'Malley Oct 2002 A1
20020150302 McCarthy et al. Oct 2002 A1
20020151810 Wong et al. Oct 2002 A1
20020173295 Nykanen et al. Nov 2002 A1
20020190947 Feinstein Dec 2002 A1
20020193124 Hamilton et al. Dec 2002 A1
20030018430 Ladetto et al. Jan 2003 A1
20030033411 Kavoori Feb 2003 A1
20030048218 Milnes et al. Mar 2003 A1
20030083596 Kramer et al. May 2003 A1
20030093187 Walker et al. May 2003 A1
20030101260 Dacier et al. May 2003 A1
20030109258 Mantyjarvi et al. Jun 2003 A1
20030139692 Barrey et al. Jul 2003 A1
20030139908 Wegerich et al. Jul 2003 A1
20030149526 Zhou et al. Aug 2003 A1
20030151672 Robins et al. Aug 2003 A1
20030187683 Kirchhoff et al. Oct 2003 A1
20030208110 Mault et al. Nov 2003 A1
20030208113 Mault et al. Nov 2003 A1
20030227487 Hugh Dec 2003 A1
20030236625 Brown et al. Dec 2003 A1
20040017300 Kotzin et al. Jan 2004 A1
20040024846 Randall et al. Feb 2004 A1
20040043760 Rosenfeld et al. Mar 2004 A1
20040044493 Coulthard Mar 2004 A1
20040047498 Mulet-Parada et al. Mar 2004 A1
20040078219 Kaylor et al. Apr 2004 A1
20040078220 Jackson Apr 2004 A1
20040081441 Sato et al. Apr 2004 A1
20040106421 Tomiyoshi et al. Jun 2004 A1
20040106958 Mathis et al. Jun 2004 A1
20040122294 Hatlestad et al. Jun 2004 A1
20040122295 Hatlestad et al. Jun 2004 A1
20040122296 Hatlestad et al. Jun 2004 A1
20040122297 Stahmann et al. Jun 2004 A1
20040122333 Nissila Jun 2004 A1
20040122484 Hatlestad et al. Jun 2004 A1
20040122485 Stahmann et al. Jun 2004 A1
20040122486 Stahmann et al. Jun 2004 A1
20040122487 Hatlestad et al. Jun 2004 A1
20040125073 Potter et al. Jul 2004 A1
20040130628 Stavely Jul 2004 A1
20040135898 Zador Jul 2004 A1
20040146048 Cotte Jul 2004 A1
20040148340 Cotte Jul 2004 A1
20040148341 Cotte Jul 2004 A1
20040148342 Cotte Jul 2004 A1
20040148351 Cotte Jul 2004 A1
20040176067 Lakhani et al. Sep 2004 A1
20040185821 Yuasa Sep 2004 A1
20040219910 Beckers Nov 2004 A1
20040225467 Vock et al. Nov 2004 A1
20040236500 Choi et al. Nov 2004 A1
20040242202 Torvinen Dec 2004 A1
20040247030 Wiethoff Dec 2004 A1
20040259494 Mazar Dec 2004 A1
20050015768 Moore Jan 2005 A1
20050027567 Taha Feb 2005 A1
20050033200 Soehren et al. Feb 2005 A1
20050038691 Babu Feb 2005 A1
20050048945 Porter Mar 2005 A1
20050048955 Ring Mar 2005 A1
20050078197 Gonzales Apr 2005 A1
20050079873 Caspi et al. Apr 2005 A1
20050079877 Ichimura Apr 2005 A1
20050081200 Rutten Apr 2005 A1
20050101841 Kaylor et al. May 2005 A9
20050102167 Kapoor May 2005 A1
20050107944 Hovestadt et al. May 2005 A1
20050113649 Bergantino May 2005 A1
20050113650 Pacione et al. May 2005 A1
20050125797 Gabrani Jun 2005 A1
20050131736 Nelson et al. Jun 2005 A1
20050141522 Kadar et al. Jun 2005 A1
20050143106 Chan et al. Jun 2005 A1
20050146431 Hastings et al. Jul 2005 A1
20050157181 Kawahara et al. Jul 2005 A1
20050165719 Greenspan et al. Jul 2005 A1
20050168587 Sato et al. Aug 2005 A1
20050182824 Cotte Aug 2005 A1
20050183086 Abe Aug 2005 A1
20050202934 Olrik et al. Sep 2005 A1
20050203430 Williams et al. Sep 2005 A1
20050210300 Song et al. Sep 2005 A1
20050210419 Kela et al. Sep 2005 A1
20050212751 Marvit et al. Sep 2005 A1
20050212752 Marvit et al. Sep 2005 A1
20050212753 Marvit et al. Sep 2005 A1
20050212760 Marvit et al. Sep 2005 A1
20050216403 Tam et al. Sep 2005 A1
20050222801 Wulff et al. Oct 2005 A1
20050232388 Tsuji Oct 2005 A1
20050232404 Gaskill Oct 2005 A1
20050234676 Shibayama Oct 2005 A1
20050235058 Rackus et al. Oct 2005 A1
20050238132 Tsuji Oct 2005 A1
20050240375 Sugai Oct 2005 A1
20050243178 McConica Nov 2005 A1
20050245988 Miesel Nov 2005 A1
20050248718 Howell et al. Nov 2005 A1
20050256414 Kettunen et al. Nov 2005 A1
20050258938 Moulson Nov 2005 A1
20050262237 Fulton et al. Nov 2005 A1
20050281289 Huang et al. Dec 2005 A1
20060009243 Dahan et al. Jan 2006 A1
20060010699 Tamura Jan 2006 A1
20060017692 Wehrenberg et al. Jan 2006 A1
20060020177 Seo et al. Jan 2006 A1
20060026212 Tsukerman Feb 2006 A1
20060029284 Stewart Feb 2006 A1
20060040793 Martens Feb 2006 A1
20060063980 Hwang et al. Mar 2006 A1
20060064276 Ren et al. Mar 2006 A1
20060068919 Gottfurcht Mar 2006 A1
20060080551 Mantyjarvi et al. Apr 2006 A1
20060090088 Choi et al. Apr 2006 A1
20060090161 Bodas et al. Apr 2006 A1
20060098097 Wach et al. May 2006 A1
20060100546 Silk May 2006 A1
20060109113 Reyes et al. May 2006 A1
20060136173 Case, Jr. et al. Jun 2006 A1
20060140422 Zurek et al. Jun 2006 A1
20060149516 Bond et al. Jul 2006 A1
20060154642 Scannell, Jr. Jul 2006 A1
20060161377 Rakkola et al. Jul 2006 A1
20060161459 Rosenfeld et al. Jul 2006 A9
20060167387 Buchholz et al. Jul 2006 A1
20060167647 Krumm et al. Jul 2006 A1
20060167943 Rosenberg Jul 2006 A1
20060172706 Griffin et al. Aug 2006 A1
20060174685 Skvortsov et al. Aug 2006 A1
20060201964 DiPerna et al. Sep 2006 A1
20060204214 Shah et al. Sep 2006 A1
20060205406 Pekonen et al. Sep 2006 A1
20060206258 Brooks Sep 2006 A1
20060223547 Chin et al. Oct 2006 A1
20060249683 Goldberg et al. Nov 2006 A1
20060256082 Cho et al. Nov 2006 A1
20060257042 Ofek et al. Nov 2006 A1
20060259268 Vock et al. Nov 2006 A1
20060259574 Rosenberg Nov 2006 A1
20060284979 Clarkson Dec 2006 A1
20060288781 Daumer et al. Dec 2006 A1
20060289819 Parsons et al. Dec 2006 A1
20070004451 Anderson Jan 2007 A1
20070005988 Zhang et al. Jan 2007 A1
20070017136 Mosher et al. Jan 2007 A1
20070024441 Kahn et al. Feb 2007 A1
20070037605 Logan Feb 2007 A1
20070037610 Logan Feb 2007 A1
20070038364 Lee et al. Feb 2007 A1
20070040892 Aoki et al. Feb 2007 A1
20070050157 Kahn et al. Mar 2007 A1
20070060446 Asukai et al. Mar 2007 A1
20070061105 Darley et al. Mar 2007 A1
20070063850 Devaul et al. Mar 2007 A1
20070067094 Park et al. Mar 2007 A1
20070072158 Unuma et al. Mar 2007 A1
20070072581 Aerrabotu Mar 2007 A1
20070073482 Churchill et al. Mar 2007 A1
20070075127 Rosenberg Apr 2007 A1
20070075965 Huppi et al. Apr 2007 A1
20070078324 Wijisiriwardana Apr 2007 A1
20070082789 Nisslia et al. Apr 2007 A1
20070102525 Orr et al. May 2007 A1
20070104479 Machida May 2007 A1
20070106991 Yoo May 2007 A1
20070125852 Rosenberg Jun 2007 A1
20070130582 Chang et al. Jun 2007 A1
20070142715 Benet et al. Jun 2007 A1
20070143068 Pasolini et al. Jun 2007 A1
20070143765 Aridor Jun 2007 A1
20070145680 Rosenberg Jun 2007 A1
20070150136 Doll et al. Jun 2007 A1
20070156364 Rothkopf Jul 2007 A1
20070161410 Huang et al. Jul 2007 A1
20070165790 Taori Jul 2007 A1
20070169126 Todoroki Jul 2007 A1
20070176776 Kashiwagi et al. Aug 2007 A1
20070176898 Suh Aug 2007 A1
20070192483 Rezvani et al. Aug 2007 A1
20070195784 Allen et al. Aug 2007 A1
20070204744 Sako et al. Sep 2007 A1
20070208531 Darley et al. Sep 2007 A1
20070208544 Kulach et al. Sep 2007 A1
20070213085 Fedora Sep 2007 A1
20070213126 Deutsch et al. Sep 2007 A1
20070219708 Brasche Sep 2007 A1
20070221045 Terauchi et al. Sep 2007 A1
20070225935 Ronkainen et al. Sep 2007 A1
20070233788 Bender Oct 2007 A1
20070239399 Sheynblat et al. Oct 2007 A1
20070250261 Soehren Oct 2007 A1
20070259685 Engblom et al. Nov 2007 A1
20070259716 Mattice et al. Nov 2007 A1
20070259717 Mattice et al. Nov 2007 A1
20070260418 Ladetto et al. Nov 2007 A1
20070260482 Nurmela et al. Nov 2007 A1
20070263995 Park et al. Nov 2007 A1
20070281762 Barros et al. Dec 2007 A1
20070296696 Nurmi Dec 2007 A1
20080005738 Imai Jan 2008 A1
20080030586 Helbing et al. Feb 2008 A1
20080046888 Appaji Feb 2008 A1
20080052716 Theurer Feb 2008 A1
20080072014 Krishnan et al. Mar 2008 A1
20080082994 Ito et al. Apr 2008 A1
20080086734 Jensen Apr 2008 A1
20080102785 Childress et al. May 2008 A1
20080109158 Huhtala et al. May 2008 A1
20080113689 Bailey May 2008 A1
20080114538 Lindroos May 2008 A1
20080120520 Eriksson May 2008 A1
20080125959 Doherty May 2008 A1
20080140338 No Jun 2008 A1
20080153671 Ogg et al. Jun 2008 A1
20080161072 Lide et al. Jul 2008 A1
20080165022 Herz et al. Jul 2008 A1
20080168361 Forstall et al. Jul 2008 A1
20080171918 Teller et al. Jul 2008 A1
20080214358 Ogg et al. Sep 2008 A1
20080231713 Florea et al. Sep 2008 A1
20080231714 Estevez et al. Sep 2008 A1
20080232604 Dufresne et al. Sep 2008 A1
20080243432 Kato et al. Oct 2008 A1
20080254944 Muri et al. Oct 2008 A1
20080303681 Herz et al. Dec 2008 A1
20080311929 Carro Dec 2008 A1
20090017880 Moore et al. Jan 2009 A1
20090024233 Shirai et al. Jan 2009 A1
20090031319 Fecioru Jan 2009 A1
20090043531 Kahn et al. Feb 2009 A1
20090047645 Dibenedetto et al. Feb 2009 A1
20090067826 Shinohara et al. Mar 2009 A1
20090082994 Schuler et al. Mar 2009 A1
20090088204 Culbert et al. Apr 2009 A1
20090098880 Lindquist Apr 2009 A1
20090099668 Lehman et al. Apr 2009 A1
20090099812 Kahn et al. Apr 2009 A1
20090124348 Yoseloff et al. May 2009 A1
20090124938 Brunner May 2009 A1
20090128448 Riechel May 2009 A1
20090174782 Kahn et al. Jul 2009 A1
20090213002 Rani et al. Aug 2009 A1
20090215502 Griffin Aug 2009 A1
20090234614 Kahn et al. Sep 2009 A1
20090274317 Kahn et al. Nov 2009 A1
20090296951 De Haan Dec 2009 A1
20090319221 Kahn et al. Dec 2009 A1
20090325705 Filer et al. Dec 2009 A1
20100056872 Kahn et al. Mar 2010 A1
20100057398 Darley et al. Mar 2010 A1
20100199189 Ben-Aroya et al. Aug 2010 A1
20100245131 Graumann Sep 2010 A1
20100277489 Geisner Nov 2010 A1
20100283742 Lam Nov 2010 A1
20110003665 Burton et al. Jan 2011 A1
20120092157 Tran Apr 2012 A1
20140369522 Asukai et al. Dec 2014 A1
Foreign Referenced Citations (25)
Number Date Country
1104143 May 2001 EP
0833537 Jul 2002 EP
1988492 Nov 2008 EP
2431813 May 2007 GB
7020547 Jan 1995 JP
200090069 Mar 2000 JP
2001057695 Feb 2001 JP
2003108154 Apr 2003 JP
2003143683 May 2003 JP
2005167758 Jun 2005 JP
2005260944 Sep 2005 JP
2005277995 Oct 2005 JP
2005309691 Nov 2005 JP
2006026092 Feb 2006 JP
2006118909 May 2006 JP
2006239398 Sep 2006 JP
2006287730 Oct 2006 JP
2007080219 Mar 2007 JP
2007104670 Apr 2007 JP
2007121694 May 2007 JP
2007142611 Jun 2007 JP
2007193908 Aug 2007 JP
WO9922338 May 1999 WO
WO0063874 Oct 2000 WO
WO02088926 Nov 2002 WO
Non-Patent Literature Citations (40)
Entry
“Access and Terminals (AT); Multimedia Message Service (MMS) for PSTN/ISDN; Multimedia Message Communication Between a Fixed Network Multimedia Message Terminal Equipment and a Multimedia Message Service Centre,” ETSI AT-F Rapporteur Meeting, Feb. 4-6, 2003, Gothenburg, DES/AT-030023 V0.0.1 (Mar. 2003).
“Decrease Processor Power Consumption using a CoolRunner CPLD,” Xilinx XAPP347 (v1.0), May 16, 2001, 9 pages.
“Sensor Fusion,” <www.u-dynamics.com>, accessed Aug. 29, 2008, 2 pages.
Anderson, Ian, et al, “Shakra: Tracking and Sharing Daily Activity Levels with Unaugmented Mobile Phones,” Mobile Netw Appl, Aug. 3, 2007, pp. 185-199.
Ang, Wei Tech, et al, “Zero Phase Filtering for Active Compensation of Periodic Physiological Motion,” Proc 1st IEEE /RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics, Feb. 20-22, 2006, pp. 182-187.
Aylward, Ryan, et al, “Sensemble: A Wireless, Compact, Multi-User Sensor System for Interactive Dance,” International Conference on New Interfaces for Musical Expression (NIME06), Jun. 4-8, 2006, pp. 134-139.
Baca, Arnold, et al, “Rapid Feedback Systems for Elite Sports Training,” IEEE Pervasive Computing, Oct.-Dec. 2006, pp. 70-76.
Bakhru, Kesh, “A Seamless Tracking Solution for Indoor and Outdoor Position Location,” IEEE 16th International Symposium on Personal, Indoor, and Mobile Radio Communications, 2005, pp. 2029-2033.
Bliley, Kara E, et al, “A Miniaturized Low Power Personal Motion Analysis Logger Utilizing MEMS Accelerometers and Low Power Microcontroller,” IEEE EMBS Special Topic Conference on Microtechnologies in Medicine and Biology, May 12-15, 2005, pp. 92-93.
Bourzac, Katherine, “Wearable Health Reports,” Technology Review, Feb. 28, 2006, <http://www.techreview.com/printer_friendly_article.aspx?id=16431>, Mar. 22, 2007, 3 pages.
Cheng, et al, “Periodic Human Motion Description for Sports Video Databases,” Proceedings of the Pattern Recognition, 2004, 8 pages.
Dao, Ricardo, “Inclination Sensing with Thermal Accelerometers”, MEMSIC, May 2002, 3 pages.
Fang, Lei, et al, “Design of a Wireless Assisted Pedestrian Dead Reckoning System—The NavMote Experience,” IEEE Transactions on Instrumentation and Measurement, vol. 54, No. 6, Dec. 2005, pp. 2342-2358.
Healey, Jennifer, et al, “Wearable Wellness Monitoring Using ECG and Accelerometer Data,” IEEE Int. Symposium on Wearable Computers (ISWC'05), 2005, 2 pages.
Hemmes, Jeffrey, et al, “Lessons Learned Building TeamTrak: An Urban/Outdoor Mobile Testbed,” 2007 IEEE Int. Conf. on Wireless Algorithms, Aug. 1-3, 2007, pp. 219-224.
Jones, L, et al, “Wireless Physiological Sensor System for Ambulatory Use,” <http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?tp=&arnumber=1612917&isnumber=33861>, Apr. 3-5, 2006, 1 page.
Jovanov, Emil, et al, “A Wireless Body Area Network of Intelligent Motion Sensors for Computer Assisted Physical Rehabilitation,” Journal of NeuroEngineering and Rehabilitation, Mar. 2005, 10 pages.
Kalpaxis, Alex, “Wireless Temporal-Spatial Human Mobility Analysis Using Real-Time Three Dimensional Acceleration Data,” IEEE Intl. Multi-Cont on Computing in Global IT (ICCGI'07), 2007, 7 pages.
Lee, Hyunseok, et al, “A Dual Processor Solution for the MAC Layer of a Software Defined Radio Terminal,” Advanced Computer Architecture Laboratory, University of Michigan, 25 pages.
Lee, Seon-Woo, et al., “Recognition of Walking Behaviors for Pedestrian Navigation,” ATR Media Integration & Communications Research Laboratories, Kyoto, Japan, pp. 1152-1155.
Margaria, Rodolfo, “Biomechanics and Energetics of Muscular Exercise”, Chapter 3, Oxford: Clarendon Press 1976, pp. 105-125.
Milenkovic, Milena, et al, “An Accelerometer-Based Physical Rehabilitation System,” IEEE SouthEastern Symposium on System Theory, 2002, pp. 57-60.
Mizell, David, “Using Gravity to Estimate Accelerometer Orientation”, Seventh IEEE International Symposium on Wearable Computers, 2003, 2 pages.
Ormoneit, D, et al, Learning and Tracking of Cyclic Human Motion: Proceedings of NIPS 2000, Neural Information Processing Systems, 2000, Denver, CO, pp. 894-900.
Otto, Chris, et al, “System Architecture of a Wireless Body Area Sensor Network for Ubiquitous Health Monitoring,” Journal of Mobile Multimedia, vol. 1, No. 4, 2006, pp. 307-326.
Park, Chulsung, et al, “Eco: An Ultra-Compact Low-Power Wireless Sensor Node for Real-Time Motion Monitoring,” IEEE Int Symp. on Information Processing in Sensor Networks, 2005, pp. 398-403.
Ricoh, “Advanced digital technology changes creativity,” <http://www.ricoh.com/r_dc/gx/gx200/features2.html>, Accessed May 12, 2011, 4 pages.
Shen, Chien-Lung, et al, “Wearable Band Using a Fabric-Based Sensor for Exercise ECG Monitoring,” IEEE Int. Symp. on Wearable Computers, 2006, 2 pages.
Tapia, Emmanuel Munguia, et al, “Real-Time Recognition of Physical Activities and Their Intensities Using Wireless Accelerometers and a Heart Rate Monitor,” IEEE Cont. on Wearable Computers, Oct. 2007, 4 pages.
Tech, Ang Wei, “Real-time Image Stabilizer,” <http://www.mae.ntu.edu.sg/ABOUTMAE/DIVISIONS/RRC_BIOROBOTICS/Pages/rtimage.aspx>, Mar. 23, 2009, 3 pages.
Wang, Shu, et al, “Location Based Services for Mobiles: Technologies and Standards, LG Electronics MobileComm,” IEEE ICC 2008, Beijing, pp. 1-66 (part 1 of 3).
Wang, Shu, et al, “Location Based Services for Mobiles: Technologies and Standards, LG Electronics MobileComm,” IEEE ICC 2008, Beijing, pp. 67-92 (part 2 of 3).
Wang, Shu, et al, “Location Based Services for Mobiles: Technologies and Standards, LG Electronics MobileComm,” IEEE ICC 2008, Beijing, pp. 93-123 (part 3 of 3).
Weckesser, P, et al, “Multiple Sensorprocessing for High-Precision Navigation and Environmental Modeling with a Mobile Robot,” IEEE, 1995, pp. 453-458.
Weinberg, Harvey, “MEMS Motion Sensors Boost Handset Reliability,” Jun. 2006, <http://www.mwrf.com/Articles/Print.cfm?ArticleID=12740>, Feb. 21, 2007, 4 pages.
Weinberg, Harvey, “Minimizing Power Consumption of iMEMS® Accelerometers,” Analog Devices, <http://www.analog.com/static/imported-files/application_notes/5935151853362884599AN601.pdf>, 2002, 5 pages.
Wixted, Andrew J, et al, “Measurement of Energy Expenditure in Elite Athletes Using MEMS-Based Triaxial Accelerometers,” IEEE Sensors Journal, vol. 7, No. 4, Apr. 2007, pp. 481-488.
Wu, Winston H, et al, “Context-Aware Sensing of Physiological Signals,” IEEE Int. Conf. on Engineering for Medicine and Biology, Aug. 23-26, 2007, pp. 5271-5275.
Yoo, Chang-Sun, et al, “Low Cost GPS/INS Sensor Fusion System for UAV Navigation,” IEEE, 2003, 9 pages.
Zypad WL 1100 Wearable Computer, <http://www.eurotech.fi/products/manuals/Zypad%20WL%201100_sf.pdf>, Jan. 16, 2008, 2 pages.
Continuations (3)
Number Date Country
Parent 14936629 Nov 2015 US
Child 15937855 US
Parent 14047937 Oct 2013 US
Child 14936629 US
Parent 11829813 Jul 2007 US
Child 14047937 US