Method and apparatus for a motion state aware device

Information

  • Patent Grant
  • Patent Number
    9,529,437
  • Date Filed
    Tuesday, May 26, 2009
  • Date Issued
    Tuesday, December 27, 2016
Abstract
A device comprising a motion context logic that receives data from at least one motion sensor is described. The motion context logic determines a user's motion context. Context based action logic manages the device based on the user's motion context.
Description
FIELD OF THE INVENTION

The present invention relates to a headset or other user carried device, and more particularly to a motion state aware headset or device.


BACKGROUND

A headset is a headphone combined with a microphone. Headsets provide the equivalent functionality of a telephone handset with hands-free operation. Headsets can be wired or wireless. Wireless headsets generally connect to a phone via a Bluetooth or equivalent network connection.


SUMMARY

A device comprising a motion context logic that receives data from at least one motion sensor is described. The motion context logic determines a user's motion context. Context based action logic manages the device based on the user's motion context.





BRIEF DESCRIPTION OF THE DRAWINGS

The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:



FIG. 1 is an illustration of an exemplary headset, a telephone, and their connectivity.



FIG. 2 is an illustration of one embodiment of a dual processor implementation of the device.



FIG. 3 is a block diagram of one embodiment of the context-based system.



FIG. 4 is a flowchart of one embodiment of determining motion context.



FIG. 5 is a flowchart of one embodiment of utilizing motion context based commands.



FIG. 6 is a flowchart of one embodiment of adjusting settings based on motion context.



FIG. 7 is a flowchart of one embodiment of how power management is handled.





DETAILED DESCRIPTION

The method and apparatus described are for a motion context aware headset or user carried device. Although the term headset is used in the description, one of skill in the art would understand that the description below also applies to mobile phones, eyeglasses, or other user carried devices which may include a motion sensing mechanism and can adjust their actions and responses based on the user's motion state. The device, in one embodiment, is designed to be coupled to a cellular phone or other telephone. In another embodiment, the device may be a self-contained cellular unit which directly interacts with the cellular network.


The headset includes at least one motion sensor. The headset may also receive data from other sensors. In one embodiment, these sensors may be in the headset, or may be external. For example, sensors may include a global positioning system (GPS) sensor, one or more motion sensors in the phone/handset, a barometric sensor, capacitance (touch) sensor(s), proximity sensors, or other sensors which may provide motion context.


The following detailed description of embodiments of the invention makes reference to the accompanying drawings, in which like references indicate similar elements and which show, by way of illustration, specific embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. One skilled in the art understands that other embodiments may be utilized and that logical, mechanical, electrical, functional and other changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.



FIG. 1 is an illustration of an exemplary headset, a telephone, and their connectivity. Headset 110 is coupled to a mobile device 130, in one embodiment, via local communication link 120. In one embodiment, headset 110 is paired with mobile device 130. In one embodiment, local communication link 120 is a Personal Area Network (PAN) link. In one embodiment, the link is a Bluetooth link. Mobile device 130 can connect to a network 150 through cellular communication link 140 or through wireless fidelity (WiFi) link 160, such as a link through an 802.11(a)-(g) compliant modem. In one embodiment, headset 110 may include links to communicate directly with the cellular network and/or the wireless network. In one embodiment, the headset 110 and mobile device 130 may share processing, to maximize battery life and provide the most seamless experience possible to the user.



FIG. 2 is an illustration of one embodiment of a dual processor implementation of the headset. Low power processor 210 receives input from the buttons, the accelerometer, and, in one embodiment, a speech sensor and other sensors included in the device. In one embodiment, some sensors which require extensive processing may be coupled to, or part of, the high power processor 250. In one embodiment, low power processor 210 may be a Texas Instruments® MSP430 microcontroller, which is an ultra-low-power 16-bit RISC mixed-signal processor. The low power processor (LPP) 210 also can turn the high power processor (HPP) 250 on and off.


The high power processor (HPP) 250, in one embodiment, may be a CSR® BlueCore™ 5 integrated circuit, which provides a programmable single-chip Bluetooth solution with on-chip DSP, stereo CODEC, and Flash memory. In another embodiment, the HPP may be a different type of processor, providing different functionality.


In one embodiment, the LPP 210 receives data from various sensors, which may include accelerometer 220 and other sensors (not shown). In one embodiment, accelerometer 220 is the BOSCH Sensortec® BMA150. The LPP 210 also sends and receives signals from and to a user input device (button 240) and a user output device (LED 245). These are merely exemplary user interfaces; alternative interfaces may, of course, be utilized. For example, instead of or in addition to a button, the device may have a dial, a set of buttons, capacitance touch sensitive pads, a touch screen, or another type of user interface. Instead of or in addition to light emitting diodes (LEDs) 245, the device may have a screen or any other visual data display mechanism. In one embodiment, HPP 250 maintains a Bluetooth connection, when it is awake, and receives any call signals from a telephone that is coupled to the headset 200.


The LPP 210 determines, using the headset logic engine, whether the HPP 250 should be woken up. If so, the LPP 210 sends a Power On signal to the HPP 250. Similarly, in one embodiment, when the device is quiescent and the HPP 250 is not needed, the LPP 210 sends a Power Off signal. In one embodiment, the HPP may automatically go to sleep if no use has been detected in a preset period of time, e.g. 5 seconds. In one embodiment, the HPP maintains a Bluetooth connection with the handset, if available.
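The power handoff just described can be summarized in a short sketch. The following Python is a minimal illustration rather than the patent's implementation; the class and callback names (HppPowerGate, note_activity, tick) are hypothetical, and only the 5 second idle timeout comes from the text.

```python
import time

HPP_IDLE_TIMEOUT_S = 5.0  # "no use has been detected in a preset period of time, e.g. 5 seconds"

class HppPowerGate:
    def __init__(self, power_on, power_off):
        self._power_on = power_on        # callback that asserts the HPP Power On signal
        self._power_off = power_off      # callback that asserts the HPP Power Off signal
        self._hpp_awake = False
        self._last_use = 0.0

    def note_activity(self, now):
        """Called by the logic engine whenever the HPP is actually needed."""
        self._last_use = now
        if not self._hpp_awake:
            self._power_on()
            self._hpp_awake = True

    def tick(self, now):
        """Called periodically; puts the HPP back to sleep after the idle timeout."""
        if self._hpp_awake and (now - self._last_use) > HPP_IDLE_TIMEOUT_S:
            self._power_off()
            self._hpp_awake = False

# Example usage with stub callbacks:
gate = HppPowerGate(lambda: print("HPP: power on"), lambda: print("HPP: power off"))
gate.note_activity(now=time.monotonic())
gate.tick(now=time.monotonic() + 6)   # prints "HPP: power off" after the quiet period
```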



FIG. 3 is a block diagram of one embodiment of the context-based system 300. The context based system 300, in one embodiment, is implemented on the low power processor (LPP). In one embodiment, the context based system 300 is implemented across an LPP and a high power processor (HPP), with the processes split based on their specific processing requirements. However, in one embodiment, in a headset the high power processor is used only to maintain a Bluetooth connection and to handle phone calls, which require voice processing.


The motion context logic 310 receives sensor data. In one embodiment, the sensor data is motion data. In one embodiment, other sensor data may also be received. In one embodiment, other sensor data may include data such as barometer data, GPS data, temperature data, or any other data which may be used by the headset. In one embodiment, the data is collected by buffer 305. In one embodiment, some of the sensor data may be unrelated to the motion context, and may be simply collected by the context based system. In that case, in one embodiment, the data may simply be stored, in store 307. In one embodiment, store 307 may be Flash memory or similar non-volatile storage. Store 307, in one embodiment, may also store processed data from sensors.
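As an illustration of the data flow around buffer 305 and store 307, the following sketch (hypothetical names; Python used purely for readability) routes motion samples to a bounded buffer for the motion context logic and files other sensor data away for storage.

```python
from collections import deque

MOTION_SENSORS = {"accelerometer"}

class SensorBuffer:
    def __init__(self, maxlen=128):
        self.motion_samples = deque(maxlen=maxlen)   # consumed by motion context logic 310
        self.store = []                              # stands in for non-volatile store 307

    def add(self, sensor, sample):
        if sensor in MOTION_SENSORS:
            self.motion_samples.append((sensor, sample))
        else:
            self.store.append((sensor, sample))      # e.g. barometer, GPS, temperature data

buf = SensorBuffer()
buf.add("accelerometer", (0.1, 0.0, 9.8))
buf.add("barometer", 1013.2)
```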


Motion context logic 310 uses the sensor data to determine the user's motion context. The motion context of the user may include whether the user is wearing the headset, and whether the user is sitting, standing, lying down, or moving in various ways. In one embodiment, the location context of the user may be part of the motion context. That is, in one embodiment the motion context of “walking on the street” may be different from the motion context of “walking around a track.”
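A coarse sketch of how such a motion context could be derived is shown below. The variance-based activity guess and its thresholds are illustrative assumptions, not the patent's algorithm; the sketch only shows how an activity estimate and a location context can be combined into a single motion context.

```python
import statistics

def classify_activity(magnitudes_g):
    """Very coarse activity guess from the variance of acceleration magnitude (illustrative thresholds)."""
    var = statistics.pvariance(magnitudes_g)
    if var < 0.01:
        return "stationary"          # sitting / standing / lying down would need further cues
    elif var < 0.5:
        return "walking"
    return "running"

def motion_context(magnitudes_g, location=None):
    activity = classify_activity(magnitudes_g)
    return f"{activity} at {location}" if location else activity

print(motion_context([1.0, 1.01, 0.99, 1.0], location="office hallway"))
# -> "stationary at office hallway"
```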


Context based action logic 320 receives the motion context information from motion context logic 310. Context based action logic 320 in one embodiment is implemented in the logic engine of the low power processor 210.


Gesture command logic 325 identifies motion commands. In one embodiment, commands may be defined by gestures, e.g. the user tapping the headset once, twice, three times, or in a certain pattern, for example two rapid taps followed by a slower tap. Commands may also be defined by shakes or other recognizable movements. In one embodiment, the commands available via a gesture command logic interface replace the commands entered via button pushes in the prior art. Since it is significantly easier to tap the side of a headset while wearing it than to push a small button that one cannot see, this improves the user experience. It also allows for a waterproof, or water resistant and sweat-proof, button-free device. Gesture commands, in one embodiment, may be defined out of the box. In one embodiment, the user may further add, edit, and delete gesture commands to configure the device to suit his or her preferences. In one embodiment, gesture commands depend on the motion context. For example, a double tap when the user is running may initiate a run/training sequence. The same double tap when there is an incoming call may pick up the call.
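The tap patterns mentioned above can be distinguished by the timing between taps. The sketch below is a simplified, assumption-level example (the gap thresholds are invented for illustration) of classifying a list of tap timestamps into patterns such as two rapid taps followed by a slower tap.

```python
def classify_taps(timestamps, fast_gap_s=0.3, slow_gap_s=0.8):
    """Classify a sequence of tap times into a named pattern (thresholds are assumptions)."""
    if not timestamps:
        return "none"
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    if len(gaps) == 0:
        return "single_tap"
    if len(gaps) == 1 and gaps[0] <= fast_gap_s:
        return "double_tap"
    if len(gaps) == 2 and gaps[0] <= fast_gap_s and fast_gap_s < gaps[1] <= slow_gap_s:
        return "fast_fast_slow"      # two rapid taps followed by a slower tap
    return "unrecognized"

print(classify_taps([0.0, 0.2]))        # double_tap
print(classify_taps([0.0, 0.2, 0.9]))   # fast_fast_slow
```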


In one embodiment, a user may define custom gesture commands using training logic 330. In one embodiment, the user interface may permit the use of verbal instructions in programming the device. Verbal instructions may also be used in conjunction with, or to replace, gesture commands. The gesture command logic 325 passes identified gesture commands to power management 335. Power management 335 determines whether to turn the high power processor (not shown) on or off.


For certain commands, the high power processor is used to execute related actions. In that case, power management 335 ensures that the high power processor is active. Process sharing system 370 passes data and requests to the high power processor, and receives returned processed data, when appropriate.


Context based action logic 320 may further include sound adjust logic 340. Sound adjust logic 340 adjusts the sound input and output parameters, when appropriate. The sound output may be a receiving telephone connection, music played on the device, beeps, feedback noises, or any other sounds produced by the device. Depending on the user's context, the sounds may be too loud or too soft; for example, the user may need a louder ring when he or she is jogging than in an office, the speaker volume may need to be louder when the user is driving in a car, and the microphone may need to pick up softer tones and reduce echo in a quiet office. The system adjusts the sounds based on the determined motion context. In one embodiment, the method disclosed in co-pending application Ser. No. 12/469,633, entitled “A Method And Apparatus For Adjusting Headset/Handset Audio For A User Environment,” filed May 20, 2009, which is herein incorporated by reference, may be utilized in connection with sound adjust logic 340.
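One way to picture the sound adjust logic is as a table of audio profiles keyed by motion context. The profile names and gain values below are purely illustrative assumptions; the sketch shows only the shape of the adjustment, not the method of the incorporated co-pending application.

```python
# Hypothetical audio profiles per motion context (values are illustrative only).
AUDIO_PROFILES = {
    "quiet_office": {"speaker_gain_db": -6, "mic_gain_db": 6, "ring_gain_db": -3},
    "jogging":      {"speaker_gain_db": 6,  "mic_gain_db": 0, "ring_gain_db": 9},
    "driving":      {"speaker_gain_db": 4,  "mic_gain_db": 3, "ring_gain_db": 6},
}

def adjust_audio(context, apply_gain):
    """Apply the gains suggested for the current motion context, if a profile exists."""
    profile = AUDIO_PROFILES.get(context)
    if profile:
        for channel, gain in profile.items():
            apply_gain(channel, gain)

adjust_audio("jogging", lambda ch, g: print(f"{ch} -> {g:+d} dB"))
```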


Call management system 345 detects when a call is received. In one embodiment, the handset receives the call and transmits a “ring” message to the headset. If the high power processor is asleep, power management 335 wakes it and ensures that the call is transmitted. The gesture command logic 325 receives the command to pick up the call, if given by the user.


In one embodiment, sensor management 365 manages the power consumption of the various sensors. In one embodiment, sensors are turned off when the headset is not in use, to increase battery life. For example, when the headset is not moving, it is not necessary to obtain GPS data more than once. Similarly, when the motion data indicates that the headset has not moved, or has moved minimally, barometer data and temperature data are unlikely to have changed. Therefore, those sensors can be turned off.
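A minimal sketch of this sensor power management, under assumed sensor names and an assumed "moved minimally" threshold: when recent motion is below the threshold, the slowly changing sensors are powered down, and they are powered back up when motion resumes.

```python
SECONDARY_SENSORS = ("gps", "barometer", "thermometer")
MOTION_THRESHOLD_G = 0.05   # illustrative "moved minimally" threshold

def manage_sensors(recent_motion_g, set_sensor_power):
    """Power secondary sensors according to whether the headset is actually moving."""
    moving = recent_motion_g > MOTION_THRESHOLD_G
    for sensor in SECONDARY_SENSORS:
        set_sensor_power(sensor, on=moving)

manage_sensors(0.01, lambda s, on: print(f"{s}: {'on' if on else 'off'}"))
# gps: off, barometer: off, thermometer: off
```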


In one embodiment, one of the functions provided by the system is to reduce the headset's functionality to a minimum when the headset is not being worn. For example, users will often take off their Bluetooth headset, not turn it off, and leave it on the desk for most of the day. During this time, the high power processor may be completely off, along with almost all sensors, while the low power processor is on stand-by and periodically monitors the motion sensor. In one embodiment, the LPP goes to sleep and periodically wakes up just enough to sample the accelerometer and analyze the accelerometer data. In one embodiment, the monitoring period is every second. In another embodiment, it may be more or less frequent. In one embodiment, the monitoring period gradually increases, from when lack of motion is initially detected, to a maximum delay. In one embodiment, the maximum delay may be 1 second.
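The stand-by sampling schedule can be sketched as a simple back-off, shown below. Only the 1 second maximum delay comes from the text; the initial interval and growth factor are assumptions for illustration.

```python
def next_interval(current_s, max_s=1.0, growth=1.5, initial_s=0.1):
    """Grow the accelerometer sampling interval toward the maximum delay."""
    if current_s is None:          # motion just stopped: start sampling quickly
        return initial_s
    return min(current_s * growth, max_s)

interval = None
for _ in range(6):
    interval = next_interval(interval)
    print(f"sample accelerometer, then sleep {interval:.2f} s")
```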


If the data indicates no motion, the LPP goes back to sleep. If the sample indicates that there is motion, the LPP wakes up and continues monitoring the accelerometer. In one embodiment, the LPP further determines whether the HPP should be woken up too. In one embodiment, the LPP automatically wakes up the HPP when it detects that the user has placed the headset in the “wearing” position, e.g. in the ear/head/etc. In one embodiment, the physical configuration of the headset is such that the position in which it is worn can be distinguished from any resting position. In one embodiment, the motion characteristics of placing the headset in the worn location are detected. The HPP is woken because the user may be picking up the headset in order to take a call. The HPP establishes a Bluetooth connection to the phone and determines whether there is a call in progress. If not, the HPP, in one embodiment, goes back to sleep.


By waking up the HPP when the headset is placed on the ear, the user perceives no delay in the ability to pick up the call on the headset. In another embodiment, the LPP waits until a gesture command is received before activating the HPP. In one embodiment, the user may set, via options, which of these behaviors is used.


When the user picks up the headset to answer the phone, the low power processor is powered up when it detects motion via the accelerometer. Then, in one embodiment, the HPP is automatically woken, to determine whether there is a phone call/ring in progress. In another embodiment, the LPP monitors for the “pick-up phone command” and wakes up the HPP when that command is detected. The LPP, in one embodiment, wakes up any other relevant sensors. In this way, the battery life of a headset can be significantly extended.
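The pick-up sequence just described can be condensed into the following sketch. The callback names are assumptions; the logic mirrors the text: wake on motion, wake the HPP only if the headset appears to be in the wearing position, check for a call over Bluetooth, and let the HPP sleep again if there is none.

```python
def on_motion_detected(is_worn_position, wake_hpp, hpp_call_in_progress, sleep_hpp):
    """Decide, on the LPP, whether the HPP needs to be involved after motion is detected."""
    if not is_worn_position():
        return "lpp_only"                 # keep monitoring on the low power processor
    wake_hpp()                            # headset moved into the wearing position
    if hpp_call_in_progress():            # HPP checks the Bluetooth link to the phone
        return "route_call_to_headset"
    sleep_hpp()                           # no call: HPP goes back to sleep
    return "lpp_only"

state = on_motion_detected(lambda: True, lambda: None, lambda: False, lambda: None)
print(state)   # lpp_only
```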



FIG. 4 is a flowchart of one embodiment of determining motion context. The process starts at block 405. At block 410, the headset is powered up. In one embodiment, the process is initiated when the headset is powered. In another embodiment, the process is initiated when the headset is paired with a phone and/or powered up.


At block 420, the system starts receiving accelerometer data. The accelerometer, in one embodiment, is located in the headset. In one embodiment the accelerometer data is buffered. At block 430, the process determines whether there are any other accelerometers which may be providing data to be processed. In one embodiment, there may be more than one accelerometer in the headset. In one embodiment, there may be a separate accelerometer in the handset, or in another external sensor location. In one embodiment, data from these additional accelerometers is received and integrated, at block 435, to obtain a better picture of the user's current motion.


At block 440, the process determines whether there are any other sensors. In one embodiment, the headset or paired handset may include additional sensors such as a barometric sensor, a thermometer, a proximity sensor, a capacitance (touch) sensor, and/or other sensors that may assist in determining the user's motion state. If there are additional sensors, the data from the additional sensors is integrated into the motion state data at block 445.


At block 450, the process determines whether location data is available. If so, at block 455 the user's location context is calculated. At block 460, the user's motion context is determined based on all available data. In one embodiment, the user's motion context includes the user's position (e.g. walking, running, standing, sitting, lying down), as well as the user's proximate location (e.g. at the office, on the street, in a car, etc.).


Thus, the final “motion context” may include location context. This enables the system to differentiate between “the user is walking down a busy street” and “the user is walking in a hallway in his office.”



FIG. 5 is a flowchart of one embodiment of utilizing motion context based commands. The process starts at block 505. At block 510, a command is identified. The command may have been communicated via motion (e.g. tapping on the headset, shaking the headset up and down, etc.). Alternatively, the command may be indicated through the push of a button, or by verbally giving a command. The process continues to block 520.


At block 520, the process determines whether the command has motion context. If so, at block 530, the process obtains the motion context.


At block 540, the process determines whether the command has an action context. Certain commands mean different things based on other actions within the headset. For example, the same tap may indicate “pick up the phone” if the phone is ringing, and “hang up the phone” if a call has just been terminated. Thus, if appropriate, the action context is determined at block 550.
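A small sketch of resolving a command against its action context first and its motion context second; the names and contexts are assumptions chosen loosely to match the examples in the text.

```python
def resolve_command(gesture, action_context, motion_context):
    """Map one gesture to different actions depending on action and motion context."""
    if gesture == "double_tap":
        if action_context == "ringing":
            return "pick_up_call"
        if action_context == "in_call":
            return "hang_up"
        if motion_context == "running":
            return "start_training_sequence"
    return "no_action"

print(resolve_command("double_tap", "ringing", "walking"))   # pick_up_call
print(resolve_command("double_tap", None, "running"))        # start_training_sequence
```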


At block 560, the correct version of the action associated with the received command is executed. The process then ends.



FIG. 6 is a flowchart of one embodiment of adjusting settings based on motion context. The process starts at block 605. In one embodiment, this process is active whenever the headset is active.


At block 610, the process determines whether there has been a motion context change. If so, at block 630, the process determines whether any active applications are context dependent. If there are no context-dependent applications, the process returns to block 610 to continue monitoring. When there is an active application which relies on context, the process continues to block 640.


If there has been no context change, the process continues to block 620. At block 620, the process determines whether there has been a context-based function initiated. If not, the process returns to block 610, to continue monitoring for motion context changes. If a context based function has been initiated, the process continues directly to block 640.


At block 640, the new motion context is obtained. Note that the motion context describes the user's/device's current state.


At block 650, the application-relevant settings are obtained. The application-relevant settings indicate which features are used by the application, and suggested changes to those settings based on the current motion context. For example, if the application in question is a music player, the volume, bass level, and treble level may be the settings which are indicated.
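The application-relevant settings lookup can be pictured as a small registry, sketched below with hypothetical applications, settings, and values: each application declares which settings it uses and what values are suggested for a given motion context.

```python
# Hypothetical registry: application -> motion context -> suggested setting values.
APP_SETTINGS = {
    "music_player": {
        "walking": {"volume": 6, "bass": 4, "treble": 5},
        "jogging": {"volume": 9, "bass": 6, "treble": 5},
    },
}

def suggested_settings(app, context):
    """Return only the settings the application declares for this motion context."""
    return APP_SETTINGS.get(app, {}).get(context, {})

print(suggested_settings("music_player", "jogging"))
# {'volume': 9, 'bass': 6, 'treble': 5}
```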


At block 660, the process determines whether the change in context indicates that one or more of those settings need adjusting. If not, the process returns to block 610, to continue monitoring. For example, when a user moves from sitting to standing in the same location, it is likely that no adjustment to the volume of music playing is needed. However, if the user moves from walking to jogging, the sound likely needs to be louder to be heard. Similarly, in one embodiment, command recognition is tightened or loosened based on the user's motion state. For example, when a user is jogging, many motions can look like a gesture command. Thus, a narrower definition of the gesture command may be used to ensure that what was indicated is actually the gesture command, rather than mere jostling from jogging. For example, in one embodiment, the jogging-related motion is subtracted from the motion data to determine whether a command was received.
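The tightening idea can be illustrated as follows. This is an assumption-level example, not the patent's detector: an estimate of the activity's own motion is subtracted, and a larger residual is required before a tap is accepted while the user is jogging.

```python
def tap_detected(samples_g, motion_context):
    """Accept a tap only if the residual above the activity's baseline is large enough."""
    baseline = sum(samples_g) / len(samples_g)          # crude estimate of the activity's own motion
    residual_peak = max(abs(s - baseline) for s in samples_g)
    threshold = 2.5 if motion_context == "jogging" else 1.2   # tightened definition while jogging
    return residual_peak > threshold

print(tap_detected([1.0, 1.1, 3.2, 1.0], "standing"))  # True  (peak residual ~1.6)
print(tap_detected([1.0, 1.1, 3.2, 1.0], "jogging"))   # False (a stronger tap is required)
```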


If adjustment is needed, at block 670 the appropriate settings are altered. The process then returns to block 610. In this way, the settings are automatically adjusted based on user and application context, as the user moves through his or her day. This means that the user need not fiddle with the headset when he or she gets into a car, or walks into a loud restaurant. This type of automatic adjustment is very convenient.



FIG. 7 is a flowchart of one embodiment of how power management is handled. The process starts at block 705. At block 710, the headset is turned on.


At block 715, sensor data is monitored and integrated to get a user motion context. As noted above, this reflects the user standing, sitting, walking, jogging, etc. It may further include motion location context, e.g. “standing in the office” or “walking into a loud restaurant” or “driving/riding in a car.”


At block 720, the process determines whether the headset is not in use. In one embodiment, this may be indicated when the headset has been immobile for a preset period. In general, whenever a human is wearing an object, that human makes small motions. Even someone holding “perfectly still” makes micro-motions which would be detectable by an accelerometer or other motion sensing device. In one embodiment, the industrial design of the headset ensures that it is worn in an orientation unlikely to be replicated when the headset is just sitting on a surface, and the orientation is used to determine whether the headset is not in use. Therefore, if a device has not moved for a period of time, it indicates that the device has been placed somewhere and is no longer being worn. If that is the case, the process continues to block 725. At block 725, the high power processor is placed in deep sleep mode.


At block 730, the low power processor is placed in a power saving mode as well. In power saving mode, the motion sensor is monitored to detect if the user picks up the headset, but, in one embodiment, no other sensors are monitored. In one embodiment, all other sensors which can be controlled by the headset are also placed in a low power consumption mode, or turned off. Clearly, when the headset is not moving, continuously monitoring the GPS signal is not useful. In one embodiment, the sensors remain on, but the sampling rate is lowered significantly. In one embodiment, the low power processor may monitor something other than motion data, e.g. gyroscope data, weight data, etc.


Once the devices are in power save mode, the process monitors to see if motion is detected, at block 735. As noted above, this may be done via an accelerometer within the device itself, or coupled to the device. In one embodiment, the monitoring frequency decreases over time. In one embodiment, the monitoring frequency may decrease gradually from the standard accelerometer sampling rate to a stand-by rate.


If motion is detected, at block 740 the low power processor is used to detect the motion context. The motion context generally would indicate why the headset had been moved, e.g. to pick up a call on the headset, etc. The process then continues to block 745.


At block 745, the process determines whether any of the identified applications, states, or received commands require additional processing power to be provided by the high power processor. In one embodiment, if the headset is picked up and placed in the configuration to be worn, the system determines whether there is an incoming phone call. In one embodiment, this check uses the Bluetooth connection, and therefore requires the high power processor. If the high power processor is needed, at block 760 it is woken up. The process monitors until the actions are completed, at block 765. When the process determines that the actions are completed, the process returns to block 715 to continue monitoring motions.


If at block 745, the high power processor was found to be unnecessary, at block 750 the LPP is used to perform the relevant operations. The process then returns to block 715 to continue monitoring.


In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims
  • 1. A motion aware mobile device comprising: a processor; a motion context logic implemented in the processor to determine a motion context of the device, the motion context automatically identifying absent user input a geographic location of the device, a type of user activity, a user's environment, and a relative location of the device with respect to the user's body based in part on data from a motion sensor, the relative location indicating that the device is in an in-use location on the user's body based on the data from the motion sensor or a not-in-use location based on an orientation of the device, wherein the type of user activity identified in the motion context includes distinguishing types of user activities selected from among: standing, sitting, walking, jogging; a context-based action logic to automatically manage the device based on the motion context via a context-based action logic engine; and a gesture command logic to identify a motion command and to automatically determine an appropriate action based on the context-based action logic engine, the identification automatically tightened or loosened to change a definition of a gesture command based on the motion context of the device.
  • 2. The device of claim 1, further comprising: a gesture command logic automatically to identify a motion command and to determine an appropriate action, the appropriate action defined by the motion context and application context of the device, the identification tightened or loosened based on the motion context of the device.
  • 3. The device of claim 1, wherein the context-based action logic engine further comprises: a sound adjust logic to automatically adjust an input and/or an output sound level based on the motion context of the device.
  • 4. The device of claim 1, further comprising: a power management system to place portions of the device into a suspended state, when the device is not being actively used, the state of not being actively used determined based on the motion context.
  • 5. The device of claim 4, wherein the portion of the device placed into the suspended state comprises a high-powered processor, while maintaining a low-powered processor to manage the device.
  • 6. The device of claim 4, further comprising, the power management system placing at least one sensor coupled to the device into a power saving mode.
  • 7. The device of claim 4, wherein when the device is immobile for a period, the suspended state includes having only the motion data monitored to detect removal from inactive status.
  • 8. A motion-state aware device comprising: a low power processor (LPP) to monitor a sensor, the LPP including: a motion context logic to determine a motion context of the device, the motion context identifying a type of user activity, a motion location context of the device automatically determined from a location of the device and the type of user activity at the location, a user's environment, and a relative location of the device with respect to the user's body; a context-based action logic to automatically manage the device based in the motion context via a context-based action logic engine; a gesture command logic to automatically identify a motion command and to automatically determine an appropriate action based on the context-based action logic, identification tightened or loosened to change a definition of a gesture command based on the motion context of the device; and, a power management system to wake up portions of the motion-state aware device when the motion context indicates a need for the portions.
  • 9. The device of claim 8, further comprising: a high power processor to provide processing capability, the high power processor turned on and off by the LPP based on the motion context.
  • 10. The device of claim 8, wherein the motion command is configured by a user of the device.
  • 11. The device of claim 10, further comprising: a call management system to detect a received call; and the power management system to wake up portions of the device to enable the device to ring and the received call to be picked up.
  • 12. The device of claim 9, wherein the low power processor automatically wakes up the high power processor when the motion context of the device indicates the device is in a worn location.
  • 13. The device of claim 8, further comprising: a gesture command logic to automatically identify a motion command and to determine an appropriate action, the appropriate action defined by the motion context and application context of the device.
  • 14. The device of claim 8, further comprising: a sound adjust logic to adjust an input and/or an output sound level based on the motion context of the device.
  • 15. The device of claim 9, further comprising: a power management system to place the high power processor into a suspended state, when the device is not being actively used, the state of not being actively used determined based on the motion context.
  • 16. The device of claim 15, further comprising, the power management system placing at least one sensor into a power saving mode.
  • 17. The device of claim 15, wherein when the device is immobile for a period, the suspended state includes having only the motion data monitored to detect removal from inactive status.
  • 18. The device of claim 8, further comprising the LPP including: a call management system to detect a received call; and a power management system to wake up portions of the device to enable the device to ring and the received call to be picked up.
  • 19. A method for adjusting settings of a motion aware device, the method comprising: determining a motion context of a motion aware device based on a type of user activity, a motion location context of the user automatically determined absent user input based on a user's location and the type of user activity at the location, a user's environment, and a relative location of the device with respect to the user's body, the relative location indicating that the device is in an in-use location on the user's body based on motion characteristics of the device or a not-in-use location based on an orientation of the device; and automatically managing the motion aware device based on the motion context by automatically tightening or loosening gesture recognition based on the motion context of the device.
  • 20. The method of claim 19, further comprising: determining that the device is not in use based on the motion context; and placing a portion of the motion aware device in a low power mode.
  • 21. The method of claim 20, wherein the portion is one or more of: a low power processor, a high power processor, and a sensor.
Related Publications (1)
Number Date Country
20100306711 A1 Dec 2010 US