Observing a person's gait is often an important clinical step in diagnosing certain types of musculoskeletal and neurological conditions. Proper gait diagnosis may also be valuable in properly fitting a patient with a prosthesis. Currently, gait analysis is largely dependent on the subjective perception of a trained professional. Such manual diagnosis methods may rely on a small set of observable activities and may have inherent inaccuracies, creating the potential for misdiagnosis and the delay of proper treatment. Thus, there is currently a need for improved systems and methods for diagnosing the gait of an individual.
In various embodiments, a method of monitoring the health of an individual comprises: (1) receiving information obtained from at least one sensor worn adjacent the individual's head; (2) at least partially in response to receiving the information, utilizing the information to assess the gait of the individual; and (3) at least partially in response to assessing the gait, determining whether the assessed gait includes one or more particular gait patterns that are associated with a particular medical condition.
In various embodiments, a method of monitoring the proper fit of a prosthesis worn by an individual comprises: (1) receiving information obtained from at least one sensor worn adjacent an individual's head; (2) at least partially in response to receiving the information, utilizing the information to assess the gait of the individual and to analyze the assessed gait to determine one or more gait patterns associated with the individual's gait; (3) determining whether the one or more gait patterns are consistent with a particular gait abnormality; and (4) in response to identifying a gait pattern that is consistent with a particular gait abnormality, generating an alert to indicate that the individual may have a gait abnormality, which may further evidence an improper fit of the prosthesis.
In various embodiments, a computer system for monitoring the gait of an individual comprises a pair of glasses comprising one or more sensors for assessing the gait of the individual. In various particular embodiments, the system is configured to analyze a user's assessed gait to determine whether the assessed gait includes one or more particular gait patterns that are consistent with one or more particular medical conditions. In response to determining that the assessed gait includes one or more particular gait patterns that are consistent with one or more particular medical conditions, the system may generate an alert that communicates the particular medical condition to a user (e.g., the individual or a caregiver of the individual).
Various embodiments of systems and methods for monitoring an individual's gait are described below. In the course of this description, reference will be made to the accompanying drawings, which are not necessarily drawn to scale.
Various embodiments will now be described more fully hereinafter with reference to the accompanying drawings. It should be understood that the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout.
Overview
A system, according to various embodiments, includes eyewear (or any other suitable wearable device) that includes one or more sensors (e.g., one or more heart rate monitors, one or more electrocardiograms (EKG), one or more electroencephalograms (EEG), one or more pedometers, one or more thermometers, one or more transdermal sensors, one or more front-facing cameras, one or more eye-facing cameras, one or more microphones, one or more accelerometers, one or more blood pressure sensors, one or more pulse oximeters, one or more respiratory rate sensors, one or more blood alcohol concentration (BAC) sensors, one or more motion sensors, one or more gyroscopes, one or more geomagnetic sensors, one or more global positioning system sensors, one or more impact sensors, or any other suitable one or more sensors) that may be used to monitor the gait of an individual. The system may further include one or more suitable computing devices for analyzing the individual's gait. This information may then be used, for example, to: (1) identify one or more medical conditions associated with the individual; (2) assess the fit of a prosthetic device worn by the individual, and/or (3) assess an individual's recovery from a particular injury or medical procedure.
Exemplary Technical Platforms
As will be appreciated by one skilled in the relevant field, the present invention may be, for example, embodied as a computer system, a method, or a computer program product. Accordingly, various embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, particular embodiments may take the form of a computer program product stored on a computer-readable storage medium having computer-readable instructions (e.g., software) embodied in the storage medium. Various embodiments may take the form of internet-implemented computer software. Any suitable computer-readable storage medium may be utilized including, for example, hard disks, compact disks, DVDs, optical storage devices, and/or magnetic storage devices.
Various embodiments are described below with reference to block diagrams and flowchart illustrations of methods, apparatuses (e.g., systems) and computer program products. It should be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by a computer executing computer program instructions. These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner such that the instructions stored in the computer-readable memory produce an article of manufacture that is configured for implementing the function specified in the flowchart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
Accordingly, blocks of the block diagrams and flowchart illustrations support combinations of mechanisms for performing the specified functions, combinations of steps for performing the specified functions, and program instructions for performing the specified functions. It should also be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, can be implemented by special purpose hardware-based computer systems that perform the specified functions or steps, or combinations of special purpose hardware and other hardware executing appropriate computer instructions.
Example System Architecture
In various embodiments, the one or more wearable gait monitoring device(s) 150 may further comprise at least one processor and one or more sensors (e.g., an accelerometer, a magnetometer, a gyroscope, a front-facing camera, a location sensor such as a GPS unit, etc.). In particular embodiments, the system is configured to gather data, for example, using the one or more sensors, regarding the user's gait as the user walks or runs (e.g., the user's stride cadence, the user's speed (e.g., the speed of the user's feet and/or body), the orientation of the user (e.g., the orientation of the user's body and/or feet), the elevation of the user's respective feet from the ground, the movement of the user's head such as bobbing, etc.).
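For illustration only, the per-sample data that such a device might report could be sketched as follows; the field names, units, and values here are assumptions for this sketch rather than part of the disclosure:

```python
from dataclasses import dataclass

@dataclass
class GaitSample:
    """One hypothetical reading from a head-worn gait monitoring device.

    The field names and units are illustrative assumptions only.
    """
    timestamp_s: float       # seconds since the start of the recording session
    accel_m_s2: tuple        # (x, y, z) linear acceleration of the head, in m/s^2
    gyro_rad_s: tuple        # (x, y, z) angular velocity, in rad/s
    heading_deg: float       # magnetometer-derived heading, in degrees
    latitude: float          # GPS latitude, if a location sensor is present
    longitude: float         # GPS longitude, if a location sensor is present

# Example: a single sample as it might be buffered before upload for analysis.
sample = GaitSample(
    timestamp_s=12.48,
    accel_m_s2=(0.12, 9.79, 0.31),
    gyro_rad_s=(0.01, -0.02, 0.00),
    heading_deg=271.5,
    latitude=33.7490,
    longitude=-84.3880,
)
```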
In various embodiments, the database is configured to store information regarding gait patterns associated with various predetermined medical conditions. The database may also store information regarding normal gait patterns for a particular individual or for one or more individuals who are similar in physical stature to the particular individual. In various embodiments, the database stores past information regarding an individual's gait, which may include recent gait measurements for the individual; these measurements may, for example, be used to track the individual's progress in improving their gait (e.g., after an injury or a medical procedure).
The one or more computer networks 115 may include any of a variety of types of wired or wireless computer networks such as the Internet, a private intranet, a mesh network, a public switched telephone network (PSTN), or any other type of network (e.g., a network that uses Bluetooth or near field communications to facilitate communication between computers). The communication link between the Wearable Gait Monitoring System 100 and the Database 130 may be, for example, implemented via a Local Area Network (LAN) or via the Internet. In particular embodiments, the one or more computer networks 115 facilitate communication between the one or more third party servers 140a, 140b, 140c, the Gait Server 120, the Database 130, and one or more remote computing devices 110a, 110b. In various embodiments, the handheld device 110a is configured to communicate with the wearable gait monitoring device 150 via, for example, Bluetooth. In various other embodiments, the wearable gait monitoring device 150 may communicate with a remote server, for example, the Gait Server 120, via a cellular or wireless Internet connection. In yet other embodiments, the system may be further configured to allow the wearable gait monitoring device 150 to communicate with the remote server (e.g., the Gait Server 120) without the intermediary handheld device 110a.
In particular embodiments, the Gait Server 120 may be connected (e.g., networked) to other computers in a LAN, an intranet, an extranet, and/or the Internet. As noted above, the Gait Server 120 may operate in the capacity of a server or a client computer in a client-server network environment, or as a peer computer in a peer-to-peer (or distributed) network environment. The Gait Server 120 may be a desktop personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a server, a network router, a switch or bridge, or any other computer capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that computer. Further, while only a single computer is illustrated, the term “computer” shall also be taken to include any collection of computers that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
An exemplary Gait Server 120 includes a processing device 202, a main memory 204 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM) or Rambus DRAM (RDRAM), etc.), a static memory 206 (e.g., flash memory, static random access memory (SRAM), etc.), and a data storage device 218, which communicate with each other via a bus 232.
The processing device 202 represents one or more general-purpose processing devices such as a microprocessor, a central processing unit, or the like. More particularly, the processing device 202 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or processor implementing other instruction sets, or processors implementing a combination of instruction sets. The processing device 202 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 202 may be configured to execute processing logic 226 for performing various operations and steps discussed herein.
The Gait Server 120 may further include a network interface device 208. The Gait Server 120 also may include a video display unit 210 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)), an alphanumeric input device 212 (e.g., a keyboard), a cursor control device 214 (e.g., a mouse), and a signal generation device 216 (e.g., a speaker).
The data storage device 218 may include a non-transitory computer-accessible storage medium 230 (also known as a non-transitory computer-readable storage medium or a non-transitory computer-readable medium) on which is stored one or more sets of instructions (e.g., software 222) embodying any one or more of the methodologies or functions described herein. The software 222 may also reside, completely or at least partially, within the main memory 204 and/or within the processing device 202 during execution thereof by the Gait Server 120, with the main memory 204 and the processing device 202 also constituting computer-accessible storage media. The software 222 may further be transmitted or received over the network 115 via the network interface device 208.
While the computer-accessible storage medium 230 is shown in an exemplary embodiment to be a single medium, the terms “computer-accessible storage medium” and “computer-readable medium” should be understood to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “computer-accessible storage medium” and “computer-readable medium” should also be understood to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the computer and that cause the computer to perform any one or more of the methodologies of the present invention. The terms “computer-accessible storage medium” and “computer-readable medium” should accordingly be understood to include, but not be limited to, solid-state memories, optical and magnetic media, etc.
More Detailed Description of Gait Monitoring Functionality
Various embodiments of a system for monitoring the gait of an individual are described below and may be implemented in any suitable context. For example, particular embodiments may be implemented to: (1) identify one or more medical conditions associated with the individual; (2) assess the fit of a prosthetic device worn by the individual; and/or (3) assess an individual's recovery from a particular injury or medical procedure.
Various aspects of the system's functionality may be executed by certain system modules, including the Gait Monitoring Module 300. The Gait Monitoring Module 300 is discussed in greater detail below.
Gait Monitoring Module
Referring to the flowchart of the Gait Monitoring Module 300, the process begins at Step 305, in which the system receives data from one or more sensors regarding the movement of the individual's body as the individual ambulates.
In particular embodiments, one or more of the system's sensors may be embedded in, or otherwise attached to, eyewear or another wearable device (e.g., a wearable device worn adjacent the individual's head or another suitable part of the individual's body). In particular embodiments, one or more of the system's sensors may be incorporated into a prosthesis or into a portion of the individual's shoes. In certain embodiments, the system may include one or more sensors that are incorporated into (e.g., embedded in, or attached to) a plurality of wearable devices (e.g., eyewear and the individual's shoes) that are adapted to be worn simultaneously by the user while the system retrieves signals from the sensors to assess the individual's gait.
In particular embodiments, the system may include a set of eyewear that includes one or more motion sensors (e.g., accelerometers, gyroscopes, or location sensors) for sensing the movement of the head of an individual who is wearing the eyewear as the individual walks. The system may then use this head movement information (e.g., using any suitable technique, such as any suitable technique described herein) to determine whether the user has a gait abnormality. The system may do this, for example, by comparing one or more of the measured head motions of an individual (e.g., as measured when the individual is walking or running) with the actual or typical head motions experienced by individuals with gait abnormalities as those individuals walk or run.
In various embodiments, the system is configured to measure and receive at least one of the velocity, height, and orientation of one or more of the individual's feet. For example, in certain embodiments, the system is configured to measure and receive (e.g., using suitable sensors) the linear acceleration of each of the individual's feet, the height of each of the feet from the ground, and/or the position and/or orientation of each of the feet relative to the central axis of the individual's body as the individual walks or runs.
The system continues at Step 310 by using the data received from the system's sensors to identify one or more relative peaks in linear acceleration of the individual's body and/or head as the user ambulates (e.g., walks or runs). In various embodiments, the system may do this by processing the data received from the sensor(s) in Step 305, and then isolating the relative peaks in the data. Such peaks represent the relative maxima and minima of the linear acceleration of the user's head, body, and/or one or more of the individual's lower body parts (e.g., knee, ankle, or foot) as the user ambulates. Alternatively or additionally, the system may be configured to identify the relative peaks in linear acceleration by identifying the slope of the line formed by regression analysis of the data received from the sensors. This regression analysis may indicate the change in magnitude of the linear acceleration with time.
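As a hedged sketch of this step, the relative maxima and minima of a linear-acceleration trace could be isolated with an off-the-shelf peak finder; the sampling rate, prominence, and minimum peak separation below are assumptions, not values prescribed by this disclosure:

```python
import numpy as np
from scipy.signal import find_peaks

def relative_acceleration_peaks(accel_magnitude, sample_rate_hz=100.0, min_prominence=0.5):
    """Return indices of relative maxima and minima in a linear-acceleration trace.

    accel_magnitude is a 1-D sequence of acceleration magnitudes (m/s^2) sampled
    while the individual ambulates; the parameter values are illustrative only.
    """
    accel_magnitude = np.asarray(accel_magnitude, dtype=float)

    # Require successive peaks to be at least ~0.3 s apart so that sensor noise
    # within a single step is not counted as multiple strides.
    min_separation = max(1, int(0.3 * sample_rate_hz))

    maxima, _ = find_peaks(accel_magnitude, prominence=min_prominence, distance=min_separation)
    minima, _ = find_peaks(-accel_magnitude, prominence=min_prominence, distance=min_separation)
    return maxima, minima
```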
In identifying the relative peaks in linear acceleration, the system may determine the magnitude and phase of each peak so that these values may be used to aid in the diagnosis of one or more gait abnormalities. For example, the system may compare the magnitude and phase of the peaks associated with the individual's gait with the magnitude and phase of the peaks associated with: (1) the gait of one or more individuals who are known to have one or more gait abnormalities; (2) a typical gait associated with individuals who are known to have one or more gait abnormalities; and/or (3) the individual's normal gait (which may be determined based on data stored in system memory that the system obtained, for example, when the individual was known to walk or run without a gait abnormality). This comparison may be helpful in determining whether the individual has a gait abnormality and, if so, whether the gait abnormality exists due to an improper prosthetic fit.
In a particular embodiment, the above comparison may involve comparing the magnitude and/or phase of peaks that represent a user's head movement as the user ambulates with the magnitude and/or phase of peaks that represent the head movement, during ambulation, of: (1) one or more individuals who are known to have one or more gait abnormalities; (2) a typical individual (or theoretical model individual) who is known to have one or more gait abnormalities; and/or (3) the individual themself (this data may be determined, for example, based on data stored in system memory that the system obtained when the individual was known to walk or run without a gait abnormality).
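A minimal sketch of such a comparison, assuming each gait has already been summarized by the magnitudes of its acceleration peaks and their phases (expressed as fractions of the stride cycle), might look like the following; the weighting and the alignment-by-truncation are assumptions made only for illustration:

```python
import numpy as np

def peak_dissimilarity(magnitudes_a, phases_a, magnitudes_b, phases_b, phase_weight=0.5):
    """Return a rough dissimilarity score between two gaits, each summarized by the
    magnitudes and phases of its acceleration peaks.

    Phases are assumed to lie in [0, 1) as fractions of the stride cycle. Lower
    scores indicate more similar gaits. The peak lists are truncated to a common
    length for simplicity; a production system would align peaks more carefully.
    """
    n = min(len(magnitudes_a), len(magnitudes_b))
    mag_a, mag_b = np.asarray(magnitudes_a[:n], float), np.asarray(magnitudes_b[:n], float)
    ph_a, ph_b = np.asarray(phases_a[:n], float), np.asarray(phases_b[:n], float)

    # Relative difference in peak magnitude.
    mag_diff = np.mean(np.abs(mag_a - mag_b) / (np.abs(mag_b) + 1e-9))

    # Circular difference in phase (a phase of 0.95 is close to a phase of 0.05).
    delta = np.abs(ph_a - ph_b)
    phase_diff = np.mean(np.minimum(delta, 1.0 - delta))

    return (1.0 - phase_weight) * mag_diff + phase_weight * phase_diff
```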
Continuing at Step 315, the system is configured to analyze the received gait information to determine whether the individual has an identifiable gait abnormality and to communicate the results of the analysis to the user. In various embodiments, the system may use the gait information to: (1) identify potential, previously undiagnosed medical conditions (e.g., one or more medical conditions, such as ALS or MS, that may be indicated by a particular gait abnormality, such as foot drop); (2) assess the quality of the fit of a prosthesis; and/or (3) assess the individual's progress in recovering from an injury or medical procedure (e.g., knee or hip surgery).
Use of System to Identify Previously Undiagnosed Medical Condition
In identifying a potential, previously undiagnosed medical condition, the system is configured to compare the gait of the individual with: (1) the gait of one or more individuals who are known to have one or more gait abnormalities (e.g., hemiplegic gait, diplegic gait, neuropathic gait, foot drop, myopathic gait, or ataxic gait); (2) a typical gait associated with individuals who are known to have one or more gait abnormalities; and/or (3) the individual's normal gait. To do this, the system may compare one or more gait patterns of a user (e.g., in the manner discussed above or in any other suitable way) with information regarding one or more abnormal gait patterns that is stored in a Gait Database 130. The system may do this, for example, by applying any suitable mathematical or other data comparison technique to determine whether one or more of the individual's gait patterns are at least substantially similar to one or more abnormal gait patterns stored in the system's Gait Database 130.
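One way such a substantial-similarity test might be sketched, assuming each stored abnormal gait pattern has been reduced to a numeric feature vector (e.g., cadence plus peak magnitudes and phases), is a simple nearest-match threshold; the feature vectors and threshold below are placeholders, not clinical data from the Gait Database 130:

```python
import numpy as np

# Placeholder reference entries standing in for records in the Gait Database 130.
ABNORMAL_GAIT_PATTERNS = {
    "foot drop": np.array([0.82, 1.9, 0.31, 0.74]),
    "hemiplegic gait": np.array([0.71, 2.4, 0.22, 0.69]),
    "ataxic gait": np.array([0.65, 2.1, 0.40, 0.55]),
}

def match_abnormal_patterns(individual_features, threshold=0.25):
    """Return the names of stored abnormal gait patterns whose feature vectors are
    substantially similar to the individual's (normalized Euclidean distance below
    an assumed threshold)."""
    individual_features = np.asarray(individual_features, dtype=float)
    matches = []
    for name, reference in ABNORMAL_GAIT_PATTERNS.items():
        distance = np.linalg.norm(individual_features - reference) / np.linalg.norm(reference)
        if distance < threshold:
            matches.append(name)
    return matches
```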
If the system determines that the individual has, or may have, a particular gait abnormality, the system may generate and send a notification to a suitable individual (e.g., the individual or the individual's physician) indicating that the individual may have a gait abnormality and/or that it may be beneficial to examine or monitor the individual for one or more medical conditions that are typically associated with the gait abnormality, e.g., stroke, amyotrophic lateral sclerosis, muscular dystrophy, Charcot-Marie-Tooth disease, multiple sclerosis, cerebral palsy, hereditary spastic paraplegia, and Friedreich's ataxia. The notification may be, for example, a suitable electronic notification (e.g., a message on a display screen, an e-mail, or a text message), or any other suitable notification.
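For illustration, such a notification might be assembled as in the sketch below; the abnormality-to-condition mapping keeps only the foot-drop example mentioned earlier in this description, is otherwise left as a placeholder, and is not intended as clinical guidance:

```python
# Placeholder mapping from a detected gait abnormality to medical conditions a
# physician might be prompted to consider; only the foot-drop entry reflects the
# example given in the description above.
ASSOCIATED_CONDITIONS = {
    "foot drop": ["amyotrophic lateral sclerosis", "multiple sclerosis"],
    # ...additional abnormality-to-condition entries would be configured here...
}

def build_notification(individual_name, abnormality):
    """Compose the text of an electronic notification (e.g., an e-mail or text
    message) to the individual or the individual's physician."""
    conditions = ASSOCIATED_CONDITIONS.get(abnormality, [])
    message = (
        f"Gait monitoring for {individual_name} indicates a possible {abnormality}. "
        "It may be beneficial to examine or monitor the individual"
    )
    if conditions:
        message += " for the following associated conditions: " + ", ".join(conditions)
    return message + "."
```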
Use of System to Determine Whether a Prosthesis Fits Correctly
In assessing the quality of fit of the prosthesis, the system may, in various embodiments, be configured to compare the user's assessed gait with: (1) the gait of one or more individuals who are known to have one or more gait abnormalities that are associated with an improper prosthetic fit; (2) a typical gait associated with individuals who are known to have one or more gait abnormalities that are associated with an improper prosthetic fit; and/or (3) the individual's normal gait. This comparison may be done as discussed above or in any other suitable way. In particular embodiments, the gait patterns that the individual's gait patterns are compared with may be modeled, for example, based on previously recorded data for individuals with one or more physical attributes (e.g., height, age, weight, femur length, etc.) that are similar to those of the individual. In various other embodiments, such patterns may be modeled from previously recorded data for users who are not physically similar to the individual.
In response to determining that the individual has one or more gait patterns that are associated with an improper prosthetic fit, the system may generate an alert indicating that the prosthesis may fit improperly. The system may send this alert electronically, for example, via email, text message, or via a display on a display screen, to the user and/or their physician or other suitable individual.
In various embodiments, after determining that the individual has an abnormal gait, the system may then determine whether the gait deviation results from an improperly fitting prosthesis or from an injury associated with the individual (e.g., an infected wound adjacent the prosthesis). It is noted that an improper fit of a prosthetic leg may result in any of a number of gait deviations such as trans-femoral (TF) long prosthetic step, TF excessive lumbar lordosis, TF drop off at end of stance, TF foot slap, TF medial or lateral whips, TF uneven heel rise, etc. While such gait deviations may result from an improper prosthetic fit, they may also manifest from: (1) various improper actions or movements by the amputee while the amputee is wearing the prosthesis; or (2) an injury adjacent the prosthesis. Clinically distinguishing an improper gait caused by a poorly fitting prosthetic from an improper gait caused by improper use of a properly fitted prosthesis may be important in helping the amputee regain proper functionality of the prosthetic.
Use of System to Assess an Individual's Recovery from an Injury or Medical Procedure
In assessing an individual's recovery from an injury or medical procedure, the system may compare the individual's current gait with historical gait information for the individual stored in the Database 130. The historical gait information, in various embodiments, may include gait pattern information taken for the individual at some time in the past (e.g., the recent past) before or after the user suffered the injury or underwent the medical procedure.
The system may then analyze both sets of gait information to determine whether the individual's gait has become more consistent with the user's normal gait (e.g., fewer gait abnormalities, greater regularity, quicker lateral acceleration, etc.). To do this, the system may, in various embodiments, compare the user's current gait information with a normal gait to determine whether the user's gait has become more consistent with a normal gait over time. In other embodiments, the system may compare the most current gait data with other post-procedure or post-injury gait data for the individual to determine whether the user's gait has become more consistent with a normal gait (e.g., the individual's normal gait).
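A hedged sketch of this progress check, assuming each recorded session has already been reduced to a single deviation-from-normal score (lower meaning closer to the individual's normal gait), could look like this; the scoring itself and the trend thresholds are assumptions for illustration only:

```python
def recovery_trend(session_deviation_scores):
    """Given deviation-from-normal scores for successive post-injury or
    post-procedure sessions (oldest first), report whether the individual's gait
    is becoming more consistent with a normal gait over time."""
    if len(session_deviation_scores) < 2:
        return "insufficient data"
    first, latest = session_deviation_scores[0], session_deviation_scores[-1]
    if latest < first * 0.9:
        return "gait is trending toward the individual's normal gait"
    if latest > first * 1.1:
        return "gait is trending away from the individual's normal gait"
    return "no clear change relative to the individual's normal gait"

# Example usage with hypothetical weekly scores:
print(recovery_trend([0.42, 0.37, 0.30, 0.21]))
```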
Upon analyzing both sets of gait information, the system may generate an appropriate assessment of the user's recovery and/or generate one or more treatment recommendations. The system may, in various embodiments, generate a report that communicates the progress of an individual's recovery. The system may also, or alternatively, generate an alternate treatment plan for the individual, if necessary. For example, a particular generated report may include one or more recommendations with regard to a particular type and length of physical therapy to be performed by the individual, and/or one or more dietary restrictions that the individual should implement to aid recovery and to regain muscle tone and strength in the affected limb. The system may then communicate the report to the individual or an appropriate third party.
User Experience
Gait Abnormality Diagnosis
In a particular example, a pair of eyewear with embedded sensors may be used to monitor the user's gait over a suitable period of time (e.g., days, weeks, months, or years). As the sensors measure the movements of the individual's body (e.g., the individual's head, legs, feet, etc.), the system may transmit the related movement data to a remote server, where the information is stored in a suitable database. After receiving the data, the remote server may process the data to identify one or more gait patterns for the individual. The system may then compare one or more of the individual's gait patterns with one or more known irregular gait patterns to determine whether the individual has an irregular gait pattern, as discussed above.
The system may be utilized, for example, in the following scenario. A patient may present to a physician complaining of weakness and decreased use of one leg. The physician may perform a routine physical examination, ask diagnostic questions, and have the patient walk briefly in order to physically demonstrate the purported condition. Upon observing the patient, the physician may suspect that the patient has a gait abnormality but may be unable to isolate the specific abnormality presented by the patient. The physician may then instruct the patient to wear the wearable gait monitoring device over the course of one or more days. During this time, the wearable gait monitoring device would obtain and record information regarding the individual's gait as discussed above.
The system may then use the information to identify one or more gait pattern irregularities as discussed above and generate a message to the user's treating physician indicating that the individual appears to have an abnormal gait. The system may optionally further display one or more potential medical conditions associated with that gait, e.g., amyotrophic lateral sclerosis, multiple sclerosis, etc. The physician may then meet with the individual to discuss the individual's condition, and/or to order additional testing to establish a particular diagnosis. For example, the physician may review the patient's medical history, presented gait pattern, and possible conditions contributing to the gait abnormality to diagnose and/or to order more tests to aid in the diagnosis of such medical conditions.
The system may similarly be used to analyze the fit of a particular prosthesis, or to assess a user's recovery from an injury or surgery, using one or more of the techniques described above.
Many modifications and other embodiments of the invention will come to mind to one skilled in the art to which this invention pertains, having the benefit of the teaching presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the invention is not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for the purposes of limitation.