The present invention relates to accelerometers, and more particularly to using gestures as a user interface for a mobile device.
Accelerometers are becoming cheaper and more ubiquitous, and numerous mobile devices include them. For example, the SAMSUNG SPH-S4000 and SCH-S400 phones feature gesture recognition, enabling a user to control the phone's functionality by moving it. An accelerometer is built into the phone, and a user can skip songs on its MP3 player by shaking the phone from side to side, or play games by shaking the phone rather than using a more traditional joystick. However, this interface has numerous problems, including accidental activation. As commentators point out, if shaking the device skips songs, then jogging with the telephone would cause random skipping whenever the device was shaken right or left by the user's motions.
The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings, in which like reference numerals refer to similar elements.
The method and apparatus described here use motions or gestures as a user interface. The gestures, or motions, enable a user to navigate a computing device in various ways. In one embodiment, the navigation may be for a mobile device, such as a cellular telephone, MP3 player, or other such device. In another embodiment, the navigation may be for a non-mobile device, such as a stationary computer utilizing a mobile controller, such as a mouse. In one embodiment, the gestures or motions are detected using an embedded or wirelessly tethered accelerometer. In one embodiment, the accelerometer is also used to detect the user's activity level.
The system provides the ability to interact with a device using pre-defined gestures or motion series. It further provides the ability for a user to define interaction gestures that he or she prefers and that are not likely to be problematic (i.e., accidentally triggered). Furthermore, in one embodiment, the gesture interface interacts with spoken or displayed menu items to provide a usable interface in loud environments.
In one embodiment, the system further modifies and/or turns off certain motion gestures based on current activities. For example, in one embodiment, certain gesture recognition algorithms are adjusted or shut off entirely when the user is walking, biking, or running above a certain cadence. This is useful because it permits the recognition of a gesture that is easy to perform while the user is sitting and holding the device, yet ensures that the command is not accidentally set off while the user is running. For example, tapping on a device may be used as a user interface command for controlling the mobile device or an application within it. When the user is jogging or running, the mobile device may knock against the user's leg if carried in a shorts pocket, or against other objects in a handbag, etc. To solve this challenge, the gesture detection algorithm may be modified as the user's cadence increases, as sketched below, so as not to be falsely triggered by the motions associated with the user's activity.
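As a rough illustration of this cadence-based gating, consider the following minimal Python sketch; the threshold values, the linear scaling, and the function names are illustrative assumptions, not taken from the specification.

```python
from typing import Optional

BASE_TAP_THRESHOLD_G = 1.5     # assumed tap threshold at rest, in g
DISABLE_CADENCE_SPM = 140.0    # assumed cadence (steps/min) above which taps are off

def tap_threshold(cadence_spm: float) -> Optional[float]:
    """Return the tap-detection threshold for the current cadence,
    or None if tap gestures should be disabled at this activity level."""
    if cadence_spm >= DISABLE_CADENCE_SPM:
        return None  # running: shut off tap recognition entirely
    # Raise the threshold as cadence increases, so incidental knocks
    # while walking or jogging do not trigger the command.
    return BASE_TAP_THRESHOLD_G * (1.0 + cadence_spm / 100.0)

def is_tap(peak_accel_g: float, cadence_spm: float) -> bool:
    threshold = tap_threshold(cadence_spm)
    return threshold is not None and peak_accel_g >= threshold
```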
At block 105, the user suggests a motion sequence. In one embodiment, the user does this by indicating the start of the motion sequence, performing it, and indicating its end. In one embodiment, the motion sequence may include more than one motion, or one complex motion.
At block 110, the process compares the suggested motion sequence to known accidental motion patterns. Accidental motion patterns include motions likely to be performed unintentionally while walking, running, or talking (if the device is a mobile phone), or during any other activity the user is likely to perform while carrying the device, as well as existing registered gesture sequences. In short, an accidental motion pattern is any motion the user may make that could trigger the command associated with the suggested motion.
At block 115, the process determines whether the suggested motion sequence is too similar to an accidental motion. In one embodiment, the comparison takes into account the movement type, speed, and accelerations of the suggested motion pattern. If the motion is too similar, the user is prompted to try another motion sequence. In one embodiment, the user is informed of the reason for the similarity. For example, the user may be told that “the up-down motion resembles jogging or similar accidental motion, please select an alternative pattern.” In one embodiment, the system may further provide suggestions, for example to “change the speed/angle/range of motion” to avoid the similarity.
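The specification does not say how similarity is scored; one plausible implementation of the block 115 test is a dynamic time warping (DTW) distance between acceleration traces, as in this sketch (the cutoff value is an assumption that would be tuned empirically).

```python
def dtw_distance(a: list, b: list) -> float:
    """Dynamic time warping distance between two 1-D acceleration traces,
    normalized by the combined trace length."""
    inf = float("inf")
    n, m = len(a), len(b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],       # insertion
                                 cost[i][j - 1],       # deletion
                                 cost[i - 1][j - 1])   # match
    return cost[n][m] / (n + m)

SIMILARITY_CUTOFF = 0.2  # assumed threshold

def too_similar(suggested: list, accidental_patterns: list) -> bool:
    """Block 115: flag the suggested motion if it is close to any
    known accidental pattern."""
    return any(dtw_distance(suggested, p) < SIMILARITY_CUTOFF
               for p in accidental_patterns)
```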
If the motion sequence is not too similar to an accidental motion, the process continues to block 125. At block 125, the user is requested to repeat the motion for confirmation. In one embodiment, if the two repetitions of the motion are too dissimilar, the user is requested to repeat the motion again. If the motions continue to be too dissimilar, the motion sequence is discarded as too difficult, and the user is requested to select another motion sequence.
If the repeated motions match properly, at block 127 the user is permitted to define one or more actions to associate with the gesture. The actions may range from initiating an emergency response, to dialing a particular user-defined number, making a selection among a set of menu items, listing menu items on the display or via a speaker, activating an application, or any other definable action. In one embodiment, the action may relate to the mobile device as a whole and/or to a particular application within it. In one embodiment, the user may define different actions depending on the currently active application. Thus, for example, in a music application a rapid tilt to the side may mean “advance to next song,” while in the address book application the same rapid tilt may mean “scroll one screen to the next set of addresses.”
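A minimal data structure for such per-application bindings might be a table keyed by (application, gesture), as below; the application, gesture, and action names are invented for illustration.

```python
from typing import Optional

# Hypothetical gesture-action table keyed by (active application, gesture).
# "*" stands for a device-wide binding that applies in any application.
gesture_actions = {
    ("music_player", "rapid_tilt_side"): "advance_to_next_song",
    ("address_book", "rapid_tilt_side"): "scroll_to_next_addresses",
    ("*", "shake_twice"): "activate_emergency_response",
}

def action_for(app: str, gesture: str) -> Optional[str]:
    """Prefer an application-specific binding, then fall back to a
    device-wide binding if one exists."""
    return (gesture_actions.get((app, gesture))
            or gesture_actions.get(("*", gesture)))
```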
At block 130, the gesture and its associated action(s) are added to the gesture library. At block 135, the user can decide to add another motion sequence to the gesture library. If the user chooses to do so, the process returns to block 105. Otherwise, the process ends. In one embodiment, a gesture may be defined not only for a particular application, but also for a particular background activity, ambient noise level, or motion cadence. In one embodiment, the user may define separate gestures associated with the same command based on any of these features. For example, if the user is jogging, he or she may not want to use a gesture that involves rapid shaking up and down, and may instead define a different gesture to use as that command. In one embodiment, such separate gestures are provided in the default set of gestures as well.
The process starts at block 210, when a suggested motion is received from the user for analysis. In one embodiment, the system includes a library of accidental motions. In one embodiment, this library is added to as the device is used: for someone who sprints, for example, the accidental motions differ from those of someone who primarily race walks. In one embodiment, the library of motions includes a list of activities, and the user may select the set of activities he or she performs with the mobile device.
At block 215, the process determines whether the suggested motion is clearly dissimilar from motions which may be made accidentally or which are intended to activate a different command. If so, at block 220, the motion is accepted. In one embodiment, this requires that the motion be clearly non-conflicting. If the motion is not clearly non-conflicting, the process continues to block 225.
At block 225, the process determines whether the suggested motion is similar to a “standard” movement performed during a normal course of action. For example, for a mobile phone, standard movements include walking, sitting, and other activities a user is expected to perform all the time. Similarity to these “standard” movements means the motion would be a problem under normal circumstances, and therefore cannot be accepted. If the motion is similar to a standard movement, it is rejected at block 230.
Otherwise, at block 235, the process obtains identification of the application/command associated with the suggested motion. In one embodiment, the user first identifies the application/command, and then provides the motion. In another embodiment, the user first provides the application/command, and then the suggested motion. In another embodiment, the process requests the application/command only when the motion isn't similar to a standard movement, but isn't dissimilar enough from possible movements to be an automatic pass.
At block 240, the process determines whether the application/command is likely to be utilized concurrently with the interfering base activity. For example, a user is likely to utilize commands associated with a music player while jogging, but is unlikely to play an electronic bowling game while jogging. If concurrent use is likely, the process continues to block 230 and rejects the suggested motion.
If the command is not likely to be utilized concurrently with the interfering activity, at block 245 the process notifies the user of the potential conflict. At block 250, the process determines whether the user is willing to accept the conflict. If not, the process continues to block 230 and rejects the suggested motion. Otherwise, the process continues to block 255.
At block 255, the process determines whether it would be better to shut down the availability of the motion sequence command while the interfering activity is occurring. For example, if the command is for a game and the underlying activity is jogging, it may be best to turn off the availability of the command when the user is jogging. If so, at block 260, the system sets up a condition such that the motion command is unavailable while that activity is occurring. For example, tapping to access a particular game menu may be unavailable when the system determines that the user is jogging.
Otherwise, in one embodiment, the motion signature is adjusted for the activity, at block 265. The process then ends, at block 270.
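Taken together, blocks 210 through 270 amount to a decision procedure along these lines; the predicate functions are stubs standing in for the tests described above, and the structure is an interpretation rather than the claimed algorithm. The command identified at block 235 is passed in as a parameter.

```python
def clearly_nonconflicting(motion, library):    # block 215 test (stub)
    return False

def similar_to_standard_movement(motion):       # block 225 test (stub)
    return False

def likely_concurrent_use(command, activity):   # block 240 test (stub)
    return False

def evaluate_suggested_motion(motion, command, activity, library,
                              user_accepts_conflict, prefer_disable):
    if clearly_nonconflicting(motion, library):            # block 215
        return "accepted"                                  # block 220
    if similar_to_standard_movement(motion):               # block 225
        return "rejected"                                  # block 230
    if likely_concurrent_use(command, activity):           # block 240
        return "rejected"                                  # block 230
    if not user_accepts_conflict:                          # blocks 245-250
        return "rejected"                                  # block 230
    if prefer_disable:                                     # block 255
        return f"accepted; disabled while {activity}"      # block 260
    return f"accepted; signature adjusted for {activity}"  # block 265
```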
In one embodiment, the user is prompted to select an easy-to-remember motion sequence. For example, the system may suggest a beat from a song the user knows, a favorite dance move, or something similar. In one embodiment, the system further suggests that the user select a motion that can be made unobtrusively; windmilling the arms, for example, may not be the best motion sequence because it makes it obvious that the user is performing an unnatural motion. At the same time, the motion sequence should be one that will not be accidentally triggered by the user's normal actions.
Once the motion is defined—in one embodiment, using the process described above—the emergency response itself is configured.
At block 310, the user is asked to define a first contact for emergency response. In one embodiment, the default first contact is the local police emergency number. In one embodiment, that number may be 911. In another embodiment, the user's local number is utilized, because calling 911 in some mobile devices connects to a central service center which may not be local to the user. In one embodiment, if the mobile device includes a GPS (global positioning system) or other location-determination mechanism, a local emergency number is identified and used.
At block 315, the emergency settings are configured for this gesture. In one embodiment, the user may choose to change any of the default configurations. In one embodiment, the default configuration is to transmit audio but mute incoming audio, so that it is not obvious that sounds are being transmitted. Alternatively, the configuration may set the telephone to act as a speaker phone, broadcasting audio as well as receiving it. In one embodiment, the emergency settings may also include a short audio message indicating that this is an emergency connection, played to whatever agency receives the call.
At block 320, the emergency settings are set to transmit location coordinates, if the emergency contact is capable of receiving such data, and the mobile device has the capability of obtaining the data. In one embodiment, the user may define the location. In one embodiment, the data may be based on GPS (global positioning system) data, if the mobile device includes this feature. In one embodiment, the data may be based on wireless locator data. In one embodiment, the data may be based on network triangulation data.
The user is then queried whether he or she wishes to add an additional contact to the emergency response, at block 325. If so, the process returns to block 310 to add the contact. In one embodiment, if multiple contacts are designated and the device is capable of conference calls, the system connects to all of them simultaneously. Alternatively, the contacts may be called sequentially; in one embodiment, the user may specify the order. At block 330, the emergency response is stored.
At block 335, the process provides the opportunity for the user to define a cancellation gesture or command. The cancellation gesture/command is designed to enable the user to cancel the emergency response if it was accidentally triggered. In one embodiment, the cancellation command may be a numeric pass code. The process then ends.
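The settings gathered in blocks 310 through 335 could be collected in a structure like the following sketch; the field names and defaults are assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class EmergencyConfig:
    """Hypothetical container for the emergency-response settings."""
    contacts: List[str] = field(default_factory=lambda: ["911"])  # block 310
    transmit_audio: bool = True          # block 315: send outgoing audio
    mute_incoming: bool = True           # block 315: hide that a call is live
    play_intro_message: bool = True      # block 315: short emergency announcement
    send_location: bool = True           # block 320: GPS/triangulation if available
    simultaneous_contacts: bool = False  # block 325: conference vs. sequential calls
    cancellation_passcode: Optional[str] = None  # block 335
```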
At block 355, feedback is provided to the user indicating that the emergency gesture was received. In one embodiment, this feedback is designed to be unobtrusive and quiet, so that it is perceptible only to the user. The feedback may be auditory, visual, or tactile (such as a vibration), or a combination of these.
At block 360, the device starts recording data. This occurs, in one embodiment, substantially immediately after detection of the emergency gesture. The recording, in one embodiment, may include recording of audio data, video data, image data, movement data, and/or data from other sensors within the device. If location data is available—through GPS, network triangulation, or another source—that data is also recorded.
In one embodiment, the recording is stored in a “black box” system. This ensures that the data is not trivially erasable, and in one embodiment is designed to keep the data stored even if the mobile device is broken. In one embodiment, the data from the emergency recording can only be erased with the use of a security key, known to the user.
At block 365, the process determines whether a cancellation gesture/command was received. In one embodiment, the user is given a short amount of time to cancel the emergency response.
If a cancellation signal was given, at block 370 the recording is terminated, and the process is aborted. The process then ends at block 390. In one embodiment, the user is able to erase the recorded data from the black box. If no cancellation is given, the process continues to block 375.
At block 375, the system attempts to establish a connection to the designated emergency contacts over any available channel to send out a call for help. In one embodiment, this includes switching to roaming, sending data over WiFi if so enabled, and sending data via WAP (Wireless Application Protocol), as well as over the more traditional carrier network.
At block 380, the process determines whether the connection has been established. If not, the system continues trying, until either the user terminates the emergency, or a connection is established.
At block 385, once the connection is established, the emergency data is sent to the contact. As noted above, the contact would generally be local law enforcement, an emergency response team, or a dispatcher. In one embodiment, an initial notification message is transmitted, indicating that this is an emergency and giving the location of the user if available, and then a live audio/video broadcast is initiated to give the emergency response team/dispatcher additional information. In one embodiment, the location information may be converted by the system from GPS or network triangulation data into human-readable location data. In one embodiment, if the emergency contact's system is capable of it, the user's system may provide a data dump of collected information, i.e., recorded information that was collected before the connection was established. In one embodiment, the data continues being sent until the user aborts the process, the contact aborts the process, or the device can no longer maintain a connection. In one embodiment, if the connection is lost and the user has not aborted the emergency, the process attempts to establish a new connection.
In this way, the user is provided an emergency response mechanism which can be easily activated and provides added security.
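As a rough sketch of the channel-cycling behavior described at blocks 375 and 380: the channel list and the `try_connect` helper below are illustrative stand-ins, not an actual carrier API.

```python
import time

CHANNELS = ["carrier", "roaming", "wifi", "wap"]  # block 375: any available channel

def try_connect(channel):
    """Stub: attempt a connection on one channel; return a handle or None."""
    return None

def establish_connection(aborted):
    """Block 380: keep cycling through channels until a connection is
    established or the user terminates the emergency."""
    while not aborted():
        for channel in CHANNELS:
            conn = try_connect(channel)
            if conn is not None:
                return conn  # block 385 would then send the emergency data
        time.sleep(1.0)  # brief back-off before the next round of attempts
    return None
```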
At block 410, the process determines whether the accumulated accelerometer data matches a defined gesture. In one embodiment, the system includes one or more default gestures provided with the system. In one embodiment, for a mobile handset, these gestures may include gestures for picking up the telephone. One example of a gesture that may be provided is described in U.S. Patent Application Ser. No. 60/948,434. As noted above, the user may also record one or more gestures during the set-up phase. In one embodiment, the user may remove or modify any of the default gestures. In one embodiment, the system continuously compares the recorded gesture data to the accumulated accelerometer data. If no gesture is recognized, the process continues to accumulate data and make the comparison.
If a gesture has been recognized, the process continues to block 415. At block 415, the actions associated with the defined gesture are identified. These actions may include the emergency response discussed above, dialing a particular number, or any other action as defined by the user.
At block 420, the process identifies the active application to which the gesture relates. At block 430, the action is performed in the designated application. The process then returns to block 405, to continue accumulating accelerometer data.
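Blocks 405 through 430 describe a steady-state loop; a schematic version follows, where `read_accelerometer`, `match_gesture`, `active_app`, and `perform` are assumed callables supplied by the surrounding system.

```python
from collections import deque

WINDOW = 256  # assumed number of recent samples retained for matching

def run_gesture_loop(read_accelerometer, match_gesture, active_app, perform):
    """Accumulate accelerometer samples (block 405), compare them against
    the gesture library (block 410), and dispatch the matched gesture's
    actions to the active application (blocks 415-430)."""
    samples = deque(maxlen=WINDOW)
    while True:
        samples.append(read_accelerometer())
        gesture = match_gesture(list(samples))
        if gesture is not None:
            for action in gesture.actions:
                perform(active_app(), action)
            samples.clear()  # avoid re-triggering on the same motion
```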
At block 510, the system presents the list via auditory and/or visual output. The user can then utilize a “selection gesture,” at block 515. The selection gesture is defined by the user during training of the phone.
At block 520, the action associated with the list item selected by the user is performed.
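A minimal sketch of this menu interaction, assuming helper callables for speech/display output and for detecting the trained selection gesture:

```python
def present_and_select(menu_items, announce, wait_for_selection_gesture, perform):
    """Blocks 510-520: present the list, then perform the action tied to
    the item the user picks with the selection gesture."""
    for index, (label, _action) in enumerate(menu_items):
        announce(f"{index + 1}: {label}")     # block 510: auditory/visual output
    chosen = wait_for_selection_gesture()     # block 515: returns an item index
    _label, action = menu_items[chosen]
    perform(action)                           # block 520: run the associated action
```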
The gesture interface is especially useful in loud, badly lit environments, such as shop floors or clubs, where spoken commands are impossible and making a selection visually is also difficult. It can also be useful for individuals with strong accents who have difficulty training speech recognition. Gesture recognition is much easier to train, since the user can simply define any gesture to correspond to a particular type of action.
The sensor engine interfaces with the sensor and controls its sampling rate, etc. The inference engine, in one embodiment, does all other processing, including step counting, gesture recognition, and so on. In one embodiment, the inference engine resolves complex raw motion data into organized, actionable information.
Under this architecture, the inference engine is divided into two components: min and max. The data analysis and computation are split between the MCU integrated with the accelerometer (min) and the main processor (max). In one embodiment, low-complexity, high-speed processing is done on the MCU, and the more processor-intensive computations are resolved on the main processor.
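One way to picture the min/max division is as a two-stage pipeline, with cheap feature extraction on the accelerometer's MCU and heavier inference on the main processor; where exactly the boundary falls is an assumption in this sketch.

```python
def min_stage(raw_samples):
    """Min (MCU side): low-complexity, high-speed preprocessing, here a
    short moving average followed by simple peak detection."""
    smoothed = []
    for i in range(len(raw_samples)):
        window = raw_samples[max(0, i - 2):i + 1]
        smoothed.append(sum(window) / len(window))
    return [i for i in range(1, len(smoothed) - 1)
            if smoothed[i - 1] < smoothed[i] > smoothed[i + 1]]

def max_stage(peaks):
    """Max (main-processor side): processor-intensive inference, e.g. step
    counting or gesture classification over the MCU's extracted features."""
    return {"steps": len(peaks)}  # crude stand-in for a real classifier
```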
These are merely exemplary architectures. As is understood in the art, since none of these processes must be truly instantaneous, the processing may be performed remotely, and may be divided among the various devices and processors based on available processing power, speed requirements, and network availability and speed. In one embodiment, the handheld device may be an independent device, providing all processing during use.
In one embodiment, the ambient noise level is a variable that is input into the gesture detection algorithm and is used to scale the magnitude of the required gesture; i.e., if there is a lot of ambient noise, then a relatively large (more pronounced) gesture is necessary, compared to when the device is very still.
The same applies to the user's cadence when walking, jogging, or running. The cadence is an input into the gesture recognition algorithms of the ISIE 740, and that input adjusts the gesture. In one embodiment, the cadence may change the gesture entirely, to a different gesture that is practical when running at that cadence.
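The two scaling inputs just described might combine into a single required-magnitude computation like this sketch; the scale factors are invented for illustration.

```python
BASE_MAGNITUDE = 1.0  # assumed required gesture magnitude when the device is still

def required_magnitude(ambient_noise: float, cadence_spm: float) -> float:
    """Scale the gesture magnitude the ISIE requires: more ambient motion
    noise or a faster cadence demands a more pronounced gesture."""
    noise_factor = 1.0 + ambient_noise          # grows with measured noise level
    cadence_factor = 1.0 + cadence_spm / 120.0  # grows with steps per minute
    return BASE_MAGNITUDE * noise_factor * cadence_factor
```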
In one embodiment, device location identifier 755 can tell from the motion signature of walking or other regular motions where the device is located. In one embodiment, this data is used by ISIE 740 to modify the gesture algorithm based on the device's location.
If the ISIE 740 identifies a gesture, and the gesture is available, the corresponding actions are retrieved from the gesture library 730. Translator 750 then translates the identified actions into commands for the mobile device.
In one embodiment, the gesture library 730 is populated by the user, using gesture registration logic 760. Gesture registration logic enables a user to define a motion, gesture, or set of motions, and associate one or more actions with the gesture. In one embodiment, the actions may be a series of actions. For example, a single motion may be used to dial a particular number, enter a passcode, and start playing a game.
Configuration logic 770, in one embodiment, allows the user to define actions which change the mobile device's configuration. For example, the emergency response may be to configure the mobile telephone to be a speaker phone, and set the volume to the maximum available volume. Configuration logic 770 interacts with the mobile device's settings, so the user may change the phone's configuration via gesture. For example, one of the defined gestures may change the mobile device's settings from “outdoors” to “meeting” without requiring the user to fuss with their telephone during a meeting.
Emergency logic 780 provides the special features associated with emergency gestures. These may include setting up a conference call so the phone can dial all identified parties substantially concurrently, providing a recorded outgoing message, turning off the incoming audio, etc. Emergency logic 780 is coupled to black box recorder 785, which provides a location to store the emergency record.
In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
The present application is a continuation of U.S. application Ser. No. 11/776,532 filed on Jul. 11, 2007, which claims priority to U.S. Provisional Application Ser. No. 60/830,205 filed on Jul. 11, 2006, and incorporates those applications in their entirety.
Number | Date | Country
---|---|---
60830205 | Jul 2006 | US

Relation | Number | Date | Country
---|---|---|---
Parent | 11776532 | Jul 2007 | US
Child | 14555547 | | US