Systems, apparatus, and methods for using a wearable device to monitor operator alertness

Information

  • Patent Grant
  • Patent Number
    10,617,342
  • Date Filed
    Thursday, June 1, 2017
  • Date Issued
    Tuesday, April 14, 2020
Abstract
A computer system and method are adapted for determining an alertness of a wearer of a computerized wearable device having one or more sensors by receiving a signal from the sensors and determining a baseline heart rate of the wearer. The system monitors the wearer for an elevated heart rate and/or transmits a stimulus to the wearer to produce an elevated heart rate. In response to the elevated heart rate, the system compares the wearer's current heart rate to the wearer's baseline heart rate to determine the wearer's current alertness by calculating the length of time it takes the wearer's current heart rate to return to the baseline heart rate. The system notifies the wearer of the wearer's current alertness by generating an alert, such as by sending a notification to the wearer's computing device. The system may also generate an alert from a vehicle driven by the wearer.
Description
BACKGROUND

Individuals may become drowsy while driving or operating a machine because of monotonous driving conditions, lack of sleep, or due to a medical condition. When individuals drive or operate machinery while drowsy, accidents are prone to happen. Other accidents may also occur when an individual becomes distracted while driving or operating machinery. Accordingly, there is a need for improved systems and methods for monitoring the alertness of an individual, especially when the individual is driving or operating machinery. Various embodiments of the present systems and methods recognize and address the foregoing considerations, and others, of prior art systems and methods.


SUMMARY OF THE VARIOUS EMBODIMENTS

In general, in various embodiments, a computer-implemented method is adapted for determining the alertness of a wearer of a computerized wearable device. The computerized wearable device comprises one or more sensors coupled to the computerized wearable device. The one or more sensors are adapted to detect one or more physiological characteristics of the wearer of the computerized wearable device, where the one or more characteristics are associated with the wearer's alertness. The computerized wearable device also comprises one or more stimulus transmitters coupled to the computerized wearable device, where the one or more stimulus transmitters are adapted to initiate the transmission of one or more stimuli to the wearer. The method comprises: (1) receiving, by a processor, at least one signal from the one or more sensors of the computerized wearable device; (2) at least partially in response to receiving the at least one signal, determining, by a processor, a baseline heart rate of the wearer of the computerized wearable device; (3) transmitting, by a processor, at least one stimulus from the one or more stimulus transmitters to the wearer; (4) determining, by a processor, a current heart rate of the wearer; (5) comparing, by a processor, the current heart rate of the wearer to the baseline heart rate of the wearer; (6) at least partially in response to comparing the current heart rate of the wearer to the baseline heart rate of the wearer, determining, by a processor, the current alertness level of the wearer; and (7) notifying, by a processor, the wearer of the wearer's current alertness level.
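
For illustration only, the following Python sketch walks through steps (1) through (7) above using hypothetical read_heart_rate() and send_stimulus() functions in place of the actual sensor and stimulus-transmitter interfaces; the tolerance and timing values are assumptions, not values taken from the claims.

    import time

    def read_heart_rate():
        """Stand-in for a signal received from the wearable's heart rate sensor (beats per minute)."""
        return 72.0

    def send_stimulus(kind):
        """Stand-in for one of the stimulus transmitters (e.g., a vibration or a sound)."""
        print("stimulus sent:", kind)

    def monitor_alertness(samples=10, poll_interval_s=1.0):
        # Steps (1)-(2): receive signals and establish a baseline heart rate.
        baseline = sum(read_heart_rate() for _ in range(samples)) / samples

        # Step (3): transmit a stimulus intended to elevate the wearer's heart rate.
        send_stimulus("vibration")

        # Steps (4)-(6): track the current heart rate and time its return toward baseline.
        start = time.monotonic()
        while read_heart_rate() > baseline * 1.05:  # "returned" = within 5% of baseline
            time.sleep(poll_interval_s)
        recovery_time_s = time.monotonic() - start

        # Step (7): notify the wearer based on how quickly the heart rate recovered.
        return "alert" if recovery_time_s < 60 else "drowsy"

    print(monitor_alertness())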


In general, in various embodiments, a computer system is adapted for determining the alertness of a wearer of a computerized wearable device comprising one or more sensors coupled to the computerized wearable device. The one or more sensors are adapted to detect one or more physiological characteristics of the wearer of the computerized wearable device, where the one or more characteristics are associated with the wearer's alertness. The computer system comprises at least one processor, where the computer system is configured for: (1) receiving at least one signal from the one or more sensors of the computerized wearable device; (2) at least partially in response to receiving the at least one signal, measuring one or more baseline heart rates of the wearer of the computerized wearable device that are indicative of a normal heart rate of the wearer; (3) receiving a current heart rate of the wearer; (4) determining that the current heart rate of the wearer is above a predetermined heart rate threshold value that corresponds to an excited state of the wearer; (5) calculating a length of time that the current heart rate of the wearer remains above the predetermined heart rate threshold value; (6) comparing the current heart rate of the wearer to the normal heart rate of the wearer; (7) at least partially in response to comparing the current heart rate of the wearer to the normal heart rate of the wearer, determining the current alertness level of the wearer; and (8) notifying the wearer of the wearer's current alertness level.
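
As a rough sketch of items (4) and (5) above, the following snippet approximates how long a series of timestamped heart rate readings remains above a predetermined threshold; the sampling format, function name, and threshold value are assumptions made for illustration.

    def seconds_above_threshold(readings, threshold_bpm):
        """Approximate how long the heart rate stays above a threshold.

        `readings` is a list of (timestamp_seconds, bpm) tuples sampled at a
        roughly uniform rate; the duration is estimated by summing the gaps
        between consecutive samples whose first reading exceeds the threshold.
        """
        total = 0.0
        for (t0, bpm0), (t1, _bpm1) in zip(readings, readings[1:]):
            if bpm0 > threshold_bpm:
                total += t1 - t0
        return total

    # Samples every 10 seconds; the wearer stays above a 100 bpm threshold for about 30 seconds.
    samples = [(0, 80), (10, 105), (20, 110), (30, 108), (40, 90)]
    print(seconds_above_threshold(samples, threshold_bpm=100))  # 30.0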


In general, in various embodiments, computerized eyewear comprises at least one processor and one or more sensors operatively coupled to the at least one processor. The one or more sensors are selected from a group consisting of: (1) a heart rate monitor; (2) a pulse oximeter; (3) a gyroscope; (4) a forward-facing camera; and (5) an eye-facing camera. The computerized eyewear further comprises a power source operatively coupled to the at least one processor and a communication device operatively coupled to the at least one processor. The computerized eyewear is configured to: (1) receive at least one signal from the one or more sensors on the computerized eyewear; (2) at least partially in response to receiving the at least one signal, determine a current alertness of the wearer of the computerized eyewear; (3) at least partially in response to determining the current alertness of the wearer, compare the current alertness of the wearer to a normal alertness of the wearer; and (4) notify the wearer when the current alertness of the wearer deviates from the normal alertness of the wearer.





BRIEF DESCRIPTION OF THE DRAWINGS

Various embodiments of systems and methods for detecting attributes of a wearer's physiology are described below. In the course of this description, reference will be made to the accompanying drawings, which are not necessarily drawn to scale and wherein:



FIG. 1 is a block diagram of an Alertness Monitoring System in accordance with an embodiment of the present system.



FIG. 2 is a block diagram of the Alertness Monitoring Server of FIG. 1.



FIG. 3 depicts a flowchart that generally illustrates various steps executed by an Alertness Monitoring Module according to a particular embodiment.



FIG. 4 depicts an exemplary wearable health monitoring device of FIG. 1.





DETAILED DESCRIPTION OF SOME EMBODIMENTS

Various embodiments will now be described more fully hereinafter with reference to the accompanying drawings. It should be understood that the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout.


Overview

A wearable alertness monitoring system, in various embodiments, may, for example, be embodied in any suitable wearable device configured to monitor the alertness of a wearer. The system may, for example, be embodied as a pair of eyewear, as contact lenses, as a wristwatch, as a suitable piece of clothing (e.g., such as a suitable shirt, pair of pants, undergarment, compression sleeve, etc.), as footwear, as a hat, as a helmet, as an orthopedic cast, or any other suitable wearable item. In a particular example, a wearable alertness monitoring system embodied as a pair of eyewear may enable the system to access one or more (e.g., all five) of a wearer's senses (e.g., touch, sight, sound, smell, and taste) based at least in part on a proximity of the eyewear to the wearer's sensory systems (e.g., eyes, mouth, ears, nose) when worn by the wearer. In other embodiments, a wearable alertness monitoring system embodied as a helmet may enable the system to determine impact, acceleration, posture, and other physiological attributes of the wearer.


In various embodiments, the system comprises one or more sensors configured to determine one or more physiological attributes of the wearer. The one or more sensors may be coupled to the wearable device in any suitable way. For instance, the one or more sensors may be embedded into the wearable device, coupled to the wearable device, and/or operatively coupled to the wearable device. The one or more sensors may include, for example, one or more heart rate monitors, one or more electrocardiogram (EKG) sensors, one or more pedometers, one or more gyroscopes, one or more geomagnetic sensors, one or more thermometers, one or more front-facing cameras, one or more eye-facing cameras, one or more microphones, one or more accelerometers, one or more blood pressure sensors, one or more pulse oximeters, one or more near-field communication sensors, one or more bone scanners, one or more infrared LED/photodiode sensor combinations, one or more photodiode sensors, one or more magnetometers, or any other suitable one or more sensors. In particular embodiments, the system is configured to gather data about the wearer, for example, using the one or more sensors (e.g., such as temperature, balance, heart rate, activity, activity levels, alertness, food eaten, medications taken, steps taken, head position, body movements, facial muscle movements, trauma, etc.).


In some embodiments, one or more body position sensors may be physically or wirelessly coupled to the wearable device and adapted to assess the relative positions of two or more of the wearer's body parts over time in order to determine whether the wearer's head position is desirable or undesirable (e.g., whether the wearer's head has slumped forward). For example, the wearable device may include a magnetometer that is adapted to sense the relative positions of a first magnet that a wearer is wearing adjacent the wearer's forehead (e.g., as an adhesive attachment or embedded within an article of clothing) and a second magnet that the wearer is wearing adjacent the wearer's chest (e.g., as an adhesive attachment or embedded within an article of clothing). The system may then use data obtained from the magnetometer regarding the relative positions of the magnets to determine whether the wearer's head is slumping forward. In response to determining that the wearer's head is slumping forward, the system may send an alert to the wearer indicating that they should resume a correct head position.
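
The head-position check described above might be approximated as follows; the coordinate frame, angle threshold, and function names are illustrative assumptions and are not the geometry actually used by the magnetometer-based embodiment.

    import math

    def head_is_slumping(forehead_xyz, chest_xyz, max_forward_angle_deg=25.0):
        """Rough forward-slump check from the estimated positions of the two magnets.

        Positions are (x, y, z) in a shared frame where +z is up and +x is the
        wearer's forward direction; the head is treated as slumped when the
        chest-to-forehead vector tilts forward past the allowed angle.
        """
        dx = forehead_xyz[0] - chest_xyz[0]  # forward offset of the head
        dz = forehead_xyz[2] - chest_xyz[2]  # vertical offset of the head
        forward_angle = math.degrees(math.atan2(dx, dz))
        return forward_angle > max_forward_angle_deg

    # Upright posture: the head sits almost directly above the chest.
    print(head_is_slumping((0.02, 0.0, 0.35), (0.0, 0.0, 0.0)))  # False
    # Slumped posture: the head is well forward of the chest.
    print(head_is_slumping((0.20, 0.0, 0.25), (0.0, 0.0, 0.0)))  # True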


In various embodiments, the sensors determine the alertness of the wearer by monitoring the wearer's heart rate, pulse, orientation, acceleration, geomagnetic field, and images from the one or more front facing cameras and/or one or more eye-facing cameras. The system may measure the wearer's baseline levels based on these characteristics to determine the wearer's normal alertness. The system may measure the wearer's baseline once to create a static baseline, or intermittently to create a dynamic baseline. After determining the wearer's normal alertness based on the baseline levels from the sensors, the system may receive at least one signal from the one or more sensors. The system may use the at least one signal received from the one or more sensors to determine the wearer's current alertness. The system may then compare the wearer's current alertness to the wearer's baseline (e.g., normal) alertness. If the wearer's current alertness deviates from the normal alertness of the wearer, the system may notify the wearer of the deviation. In some embodiments, the system may notify a third party of the deviation. In various embodiments, the system may notify the wearer via the wearable device or through a notification sent to a mobile device associated with the wearer. In some embodiments, the system may also provide suggestions to the wearer on how to change the wearer's current alertness to conform to the wearer's normal alertness. The system, in particular embodiments, may also provide suggestions to the wearer on the cause of the wearer's current alertness. For instance, where the wearer's current alertness includes a lowered head, closed eyes, and slumped shoulders (as determined by additional sensors placed on the wearer's shoulders) and the wearer's normal alertness is sitting up straight with shoulders back, the system may determine that the wearer is sleeping. After determining that the wearer is sleeping, the system may notify the wearer to wake up or to correct their current alertness to their normal alertness. The system may also suggest the cause of the wearer's deviation in alertness (e.g., fatigue).


In various embodiments, while the system is using one or more sensors (e.g., eyewear based sensors) to assess the alertness of the wearer, the system may also (e.g., at least substantially simultaneously) capture one or more images of the wearer or the wearer's surroundings (e.g., using a camera, such as a forward-facing camera associated with eyewear worn by the wearer) that may be used in determining the user's activity or physiological state.


Exemplary Technical Platforms


As will be appreciated by one skilled in the relevant field, the present systems and methods may be, for example, embodied as a computer system, a method, or a computer program product. Accordingly, various embodiments may be entirely hardware or a combination of hardware and software. Furthermore, particular embodiments may take the form of a computer program product stored on a computer-readable storage medium having computer-readable instructions (e.g., software) embodied in the storage medium. Various embodiments may also take the form of web-implemented computer software. Any suitable computer-readable storage medium may be utilized including, for example, hard disks, compact disks, DVDs, optical storage devices, and/or magnetic storage devices.


Various embodiments are described below with reference to block diagram and flowchart illustrations of methods, apparatuses (e.g., systems), and computer program products. It should be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by a computer executing computer program instructions. These computer program instructions may be loaded onto a general purpose computer, a special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions that execute on the computer or other programmable data processing apparatus create means for implementing the functions specified in the flowchart block or blocks.


The computer instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on a user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including but not limited to: a local area network (LAN); a wide area network (WAN); a cellular network; or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).


These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner such that the instructions stored in the computer-readable memory produce an article of manufacture that is configured for implementing the function specified in the flowchart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process (e.g., method) such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.


Example System Architecture



FIG. 1 is a block diagram of an Alertness Monitoring System 100 according to particular embodiments. As may be understood from this figure, the Alertness Monitoring System 100 includes One or More Networks 115, One or More Third Party Servers 50, an Alertness Monitoring Server 120 that may, for example, be adapted to execute an Alertness Monitoring Module 300, a Database 140, One or More Remote Computing Devices 154 (e.g., such as a smart phone, a tablet computer, a wearable computing device, a laptop computer, a desktop computer, etc.), and One or More Wearable Health Monitoring Devices 156, which may, for example, be embodied as one or more of eyewear, headwear, clothing, a watch, a hat, a helmet, a cast, an adhesive bandage, a piece of jewelry (e.g., a ring, earring, necklace, bracelet, etc.), or any other suitable wearable device. In particular embodiments, the one or more computer networks 115 facilitate communication between the One or More Third Party Servers 50, the Alertness Monitoring Server 120, Database 140, One or More Remote Computing Devices 154, and the one or more Health Monitoring Devices 156.


The one or more networks 115 may include any of a variety of types of wired or wireless computer networks such as the Internet, a private intranet, a mesh network, a public switched telephone network (PSTN), or any other type of network (e.g., a network that uses Bluetooth or near field communications to facilitate communication between computing devices). The communication link between the One or More Remote Computing Devices 154 and the Alertness Monitoring Server 120 may be, for example, implemented via a Local Area Network (LAN) or via the Internet.



FIG. 2 illustrates a diagrammatic representation of the architecture for the Alertness Monitoring Server 120 that may be used within the Alertness Monitoring System 100. It should be understood that the computer architecture shown in FIG. 2 may also represent the computer architecture for any one of the One or More Remote Computing Devices 154, the one or more Third Party Servers 50, and the one or more Health Monitoring Devices 156 shown in FIG. 1. In particular embodiments, the Alertness Monitoring Server 120 may be suitable for use as a computer within the context of the Alertness Monitoring System 100 that is configured for determining an alertness of a wearer of a computerized wearable device using signals received from sensors coupled to the computerized wearable device.


In particular embodiments, the Alertness Monitoring Server 120 may be connected (e.g., networked) to other computing devices in a LAN, an intranet, an extranet, and/or the Internet as shown in FIG. 1. As noted above, the Alertness Monitoring Server 120 may operate in the capacity of a server or a client computing device in a client-server network environment, or as a peer computing device in a peer-to-peer (or distributed) network environment. The Alertness Monitoring Server 120 may be a desktop personal computing device (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, a switch or bridge, or any other computing device capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that computing device. Further, while only a single computing device is illustrated, the term “computing device” shall also be interpreted to include any collection of computing devices that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.


An exemplary Alertness Monitoring Server 120 includes a processing device 202, a main memory 204 (e.g., read-only memory (ROM), flash memory, dynamic random access memory (DRAM) such as synchronous DRAM (SDRAM) or Rambus DRAM (RDRAM), etc.), a static memory 206 (e.g., flash memory, static random access memory (SRAM), etc.), and a data storage device 218, which communicate with each other via a bus 232.


The processing device 202 represents one or more general-purpose or specific processing devices such as a microprocessor, a central processing unit (CPU), or the like. More particularly, the processing device 202 may be a complex instruction set computing (CISC) microprocessor, reduced instruction set computing (RISC) microprocessor, very long instruction word (VLIW) microprocessor, or processor implementing other instruction sets, or processors implementing a combination of instruction sets. The processing device 202 may also be one or more special-purpose processing devices such as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), a digital signal processor (DSP), network processor, or the like. The processing device 202 may be configured to execute processing logic 226 for performing various operations and steps discussed herein.


The Alertness Monitoring Server 120 may further include a network interface device 208. The Alertness Monitoring Server 120 may also include a video display unit 210 (e.g., a liquid crystal display (LCD) or a cathode ray tube (CRT)), an alpha-numeric input device 212 (e.g., a keyboard), a cursor control device 214 (e.g., a mouse), and a signal generation device 216 (e.g., a speaker).


The data storage device 218 may include a non-transitory computing device-accessible storage medium 230 (also known as a non-transitory computing device-readable storage medium, a non-transitory computing device-readable medium, or a non-transitory computer-readable medium) on which is stored one or more sets of instructions (e.g., the Alertness Monitoring Module 300) embodying any one or more of the methodologies or functions described herein. The one or more sets of instructions may also reside, completely or at least partially, within the main memory 204 and/or within the processing device 202 during execution thereof by the Alertness Monitoring Server 120—the main memory 204 and the processing device 202 also constituting computing device-accessible storage media. The one or more sets of instructions may further be transmitted or received over a network 115 via a network interface device 208.


While the computing device-accessible storage medium 230 is shown in an exemplary embodiment to be a single medium, the term “computing device-accessible storage medium” should be understood to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “computing device-accessible storage medium” should also be understood to include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the computing device and that causes the computing device to perform any one or more of the methodologies of the present invention. The term “computing device-accessible storage medium” should accordingly be understood to include, but not be limited to, solid-state memories, optical and magnetic media, etc.


Exemplary System Platform


As noted above, a system, according to various embodiments, is adapted to monitor the alertness of a wearer of a wearable device. Various aspects of the system's functionality may be executed by certain system modules, including the Alertness Monitoring Module 300. The Alertness Monitoring Module 300 is discussed in greater detail below.


Alertness Monitoring Module



FIG. 3 is a flow chart of operations performed by an exemplary Alertness Monitoring Module 300, which may, for example, run on the Alertness Monitoring Server 120, or any suitable computing device (such as the One or More Health Monitoring Devices 156 or other suitable mobile computing device). In particular embodiments, the Alertness Monitoring Module 300 may determine a wearer's normal alertness and current alertness to determine whether the wearer's current alertness deviates from the wearer's normal alertness.


The system begins, in various embodiments, at Step 305 by receiving at least one signal from one or more sensors of a computerized wearable device. In various embodiments, the one or more sensors are coupled to a computing device that is associated with (e.g., embedded within, attached to) the computerized wearable device or other health monitoring device. In particular embodiments, the computerized wearable device or other health monitoring device comprises at least one processor, computer memory, suitable wireless communications components (e.g., a Bluetooth chip), and a power supply for powering the health monitoring device and/or the various sensors. In various embodiments, the one or more sensors may be adapted to detect one or more characteristics of a wearer of the computerized wearable device, wherein the one or more characteristics of the wearer are associated with the wearer's alertness. In various embodiments, the sensors coupled to the computerized wearable device or other health monitoring device may include, for example, one or more of the following: a heart rate monitor, an electrocardiogram (EKG), a gyroscope, a geomagnetic sensor, a pedometer, a thermometer, a front-facing camera, an eye-facing camera, a microphone, an accelerometer, a magnetometer, a blood pressure sensor, a pulse oximeter, a skin conductance response sensor, a near-field communication sensor, an infrared LED/photodiode sensor combination, an electrooculography sensor, or any other suitable sensor. In particular embodiments, the sensors coupled to the computerized wearable device comprise one or more of a heart rate monitor, a pulse oximeter, a gyroscope, a forward-facing camera, and an eye-facing camera.
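
For concreteness, a signal received in Step 305 might be represented as a small record that tags each raw value with its source sensor and a timestamp; the field names below are illustrative assumptions, not part of the described system.

    from dataclasses import dataclass
    import time

    @dataclass
    class SensorReading:
        sensor: str       # e.g., "heart_rate", "pulse_oximeter", "gyroscope"
        value: float      # sensor-specific units (bpm, %SpO2, degrees per second, ...)
        timestamp: float  # seconds since the epoch when the signal was received

    def receive_signal(sensor, value):
        """Wrap a raw value from one of the wearable's sensors with its source and receipt time."""
        return SensorReading(sensor=sensor, value=value, timestamp=time.time())

    print(receive_signal("heart_rate", 72.0))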


Referring to FIG. 4, which depicts an exemplary embodiment of a computerized wearable device in the form of eyewear 400, the one or more sensors may be physically coupled to the eyewear 400 in any suitable way. For example, in various embodiments, the one or more sensors may be embedded into the eyewear 400. In some embodiments, the one or more sensors may be positioned along the frame 410 of the eyewear 400. In other embodiments, the one or more sensors may be positioned along the temples 412, 414 of the eyewear 400. In still other embodiments, the one or more sensors may be coupled to one or more of the lenses 418, 420 of the eyewear 400. As noted above, the one or more sensors may be coupled to a Bluetooth device that is configured to transmit the one or more signals to a handheld wireless device, and the step of receiving one or more signals from the one or more sensors (discussed above with reference to Step 305) further comprises receiving the one or more signals from the wireless handheld device (e.g., via the Internet, Wi-Fi, a cellular network, etc.). In particular embodiments, one or more of the one or more sensors may be detachable from the eyewear 400. For instance, if a wearer does not need a temperature sensor or other particular sensor, the sensor may be removed from the eyewear. In other embodiments, the one or more sensors may be detached from the eyewear to attach to other portions of the wearer's body. For example, in determining the level of alertness of the wearer, the gyroscope, accelerometer, or other suitable device may be attached to the wearer's back or shoulders and may be adapted to transmit data to the eyewear. The structure of the eyewear will be discussed further below.


Returning to FIG. 3, in particular embodiments, the at least one signal from the one or more sensors may include one or more signals that may be used to derive: (1) the wearer's heart rate; (2) the wearer's heart rhythm; (3) a distance traveled by the wearer; (4) the wearer's body temperature; (5) one or more images associated with the wearer or the wearer's environment; (6) one or more sounds associated with the wearer's body or the wearer's environment; (7) a speed traveled by the wearer; (8) the wearer's blood pressure; (9) the wearer's oxygen saturation level; (10) the wearer's brainwave activity; (11) the wearer's pupil size; (12) the wearer's perspiration level; (13) the wearer's respiratory rate; (14) the number and/or cadence of steps taken by the wearer; (15) the movement of one or more of the wearer's facial muscles; (16) one or more biochemical changes within the wearer's body (e.g., changes in hormone levels, releases of neurotransmitters); (17) changes in one or more characteristics of the wearer's skin (e.g., skin paleness or clamminess); (18) one or more postures associated with the wearer; (19) the corneo-retinal standing potential that exists between the front and the back of the human eye; and/or (20) any other suitable attribute of the wearer or the wearer's environment. For instance, the system may receive a signal from an eye-facing camera associated with the eyewear that the wearer is looking down at the same time that the system receives a signal from the front-facing camera that there is a road in front of the wearer.


In particular embodiments, the system may receive one or more of the above-referenced signals substantially automatically. In various embodiments, the system may receive one or more of the signals on a substantially periodic basis (e.g., by the second, by the minute, hourly, daily, etc.). For example, the system may receive one or more signals every thirty seconds throughout the day. In other embodiments, the system may receive one or more signals at least partially in response to receiving an indication from the wearer that the system should receive a signal. For instance, the wearer may speak a voice command to the wearable device requesting that the device receive a signal from the gyroscope to get the wearer's orientation. In various embodiments, the system may receive an indication from the wearer of when to have the system receive the signal. For example, the system may receive an indication from the wearer to have the system receive a signal from the gyroscope at 8:00 a.m. and at 2:00 p.m. on a particular day. In particular embodiments, the system may receive a request from the wearer to have a particular signal received from a particular sensor at the same time that the system receives a second particular signal from a second particular sensor. For example, when the system receives a signal that indicates that the wearer's pupil size has changed, the system may, at least partially in response to receiving the changed pupil size signal, also obtain an orientation of the wearer from the gyroscope associated with the eyewear.
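
A minimal sketch of the periodic-reception case described above (e.g., one reading every thirty seconds) follows; the polling interval, duration, and stand-in sensor are assumptions chosen only for illustration.

    import time

    def poll_periodically(read_sensor, period_s=30.0, duration_s=120.0):
        """Collect (elapsed_seconds, value) readings from `read_sensor` every `period_s` seconds."""
        readings = []
        start = time.monotonic()
        while time.monotonic() - start < duration_s:
            readings.append((round(time.monotonic() - start, 1), read_sensor()))
            time.sleep(period_s)
        return readings

    # Stand-in sensor that always reports 72 bpm, sampled every 2 seconds
    # for 6 seconds to keep the demonstration short.
    print(poll_periodically(lambda: 72.0, period_s=2.0, duration_s=6.0))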


In various embodiments, the system may receive one or more signals at least partially in response to receiving an indication from a third party that the system should receive a signal. For instance, the third party may remotely request that the wearable device receive a signal from the heart rate monitor. In some embodiments, the system may receive an indication from the third party of when to have the system receive the signal. For example, the system may receive an indication from the third party (e.g., the wearer's supervisor) to have the system receive a signal from the heart rate monitor every 15 minutes. In particular embodiments, the system may receive a request from the third party to have a particular signal received from a particular sensor at the same time that the system receives a second particular signal from a second particular sensor. For example, when the system receives a signal that indicates that the wearer's heart rate has increased, the system may, at least partially in response to receiving the increased heart rate signal, also obtain an image from the eye-facing camera associated with the eyewear, which may be analyzed by the system to determine why the wearer's heart rate increased.


In some embodiments, the system receives a signal of an image captured by the eyewear. In various embodiments, the system receives a plurality of images captured by the eyewear. In particular embodiments, the system receives the image from the front-facing camera. In some embodiments, the system receives the image substantially automatically from the front-facing camera. In other embodiments, the system may receive the image in response to receiving an indication from the wearer to capture the image. For example, the system may receive a voice command from the wearer to capture the image. In various embodiments, the system may store the captured image in local or remote memory. In some embodiments, the image captured by the eyewear may be a video.


In various embodiments, the system may receive only one signal from a single sensor associated with the eyewear. In other embodiments, the system may receive a signal from a plurality of the sensors associated with the eyewear. In yet other embodiments, the system may receive multiple signals from one or more of the sensors. In various embodiments, the system may be configured to receive a first signal from a first sensor at the same time that it receives a second signal from a second sensor. For example, the system may be configured to receive an image signal from an eye-facing camera associated with the eyewear at the same time that the system receives an orientation signal from the gyroscope associated with the eyewear. As a further example, the system may be configured to simultaneously receive a signal from both an eye-facing camera and a heart rate monitor associated with the eyewear.


At least partially in response to receiving the at least one signal, at Step 310, the system determines a baseline heart rate of the wearer of the computerized wearable device. In particular embodiments, the baseline heart rate of the wearer may be indicative of a normal heart rate of the wearer. In particular embodiments, the baseline heart rate may be a normal starting baseline heart rate of the wearer. For instance, the normal starting baseline heart rate of the wearer when the wearer wakes up may be a resting heart rate range from 60 to 100 beats per minute. In other embodiments, the baseline heart rate may be a desired baseline heart rate of the wearer. For instance, the baseline heart rate may be received from medical guidelines based on a desired baseline heart rate for a wearer of the same sex, height, weight, age, etc. In still other embodiments, the baseline heart rate may be a baseline level of an average person (e.g., not a baseline level based on measurements from the wearer).


In various embodiments, the system may determine the baseline heart rate from a single heart rate monitor. The system may, in some embodiments, determine the baseline heart rate multiple times (e.g., measure the baseline heart rate from the heart rate monitor three times) to create an average baseline heart rate. In various embodiments, the system may be configured to determine the baseline heart rate from the heart rate monitor at the same time that it determines one or more physiological baseline levels from a second sensor. For example, the system may be configured to receive a baseline heart rate at the same time that it receives a position measurement from the accelerometer. In addition, the system may also receive an oxygen saturation level from a pulse oximeter sensor to determine the wearer's blood oxygen level, which may be used in supplementing alertness measurements, balance measurements, and position measurements.


In various embodiments, the system may determine the baseline heart rate substantially automatically after the sensor generates the data. In particular embodiments, the system may determine the baseline heart rate only once to create a static baseline. In some embodiments, the system may determine the baseline heart rate periodically (e.g., by the second, by the minute, hourly, daily, etc.) to create a dynamic baseline. For example, the system may determine the wearer's heart rate every thirty seconds throughout the day to create a dynamic baseline for the wearer's heart rate. In other embodiments, the system may determine the baseline heart rate after receiving an indication from the wearer and/or a third party that the system should determine the one or more baseline levels.
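
One possible way to realize the dynamic baseline described above is a rolling average that is updated each time a new periodic reading arrives; the window size below is an assumption chosen only for illustration.

    from collections import deque

    class DynamicBaseline:
        """Rolling-average baseline heart rate built from periodic samples.

        Each new sample (e.g., one every thirty seconds) displaces the oldest,
        so the baseline tracks the wearer's recent normal heart rate instead
        of a single static reading.
        """

        def __init__(self, window=20):
            self.samples = deque(maxlen=window)

        def update(self, bpm):
            self.samples.append(bpm)
            return sum(self.samples) / len(self.samples)

    baseline = DynamicBaseline(window=3)
    for bpm in (70, 74, 72, 71):
        print(round(baseline.update(bpm), 1))  # 70.0, 72.0, 72.0, 72.3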


In particular embodiments, the system may determine the baseline heart rate during a predefined time period. In various embodiments, the system may determine the baseline heart rate when a sensor detects movement and/or activities of the wearer (e.g., sitting, lying down, standing, driving a vehicle, flying in an airplane, walking, running, lifting, exercising, etc.). For example, the system may determine the baseline heart rate when the sensor detects that the wearer is driving a vehicle as determined by the speed at which the wearer is currently traveling.


In particular embodiments, the system may store the baseline heart rate in associated memory. In various embodiments, the system may store the baseline heart rate substantially automatically after measuring the data. In other embodiments, the system may store the baseline heart rate after receiving manual input from the wearer and/or a third party requesting that the system store the baseline heart rate. In various embodiments, the system may store the baseline heart rate for a specified period of time. For instance, the system may store the baseline heart rate for a day, a month, a year, etc., in the Database 140. In some embodiments, the system may store the baseline heart rate on any suitable server or other device. In various embodiments, the system may store the baseline heart rate on the Alertness Monitoring Server 120. In particular embodiments, the system may store the baseline heart rate in an account associated with the wearer. In various embodiments, the system may store the baseline heart rate with a timestamp of when the one or more baseline levels were received.


Continuing at Step 315, the system transmits at least one stimulus from one or more stimulus transmitters to the wearer. In various embodiments, the at least one stimulus from the one or more stimulus transmitters may include: (1) a vibration; (2) a change in temperature; (3) a smell; (4) a light; (5) a sound; (6) a shock; or (7) any other suitable stimulus that causes a reaction in the wearer. In particular embodiments, the at least one stimulus is intended to produce a reaction in the wearer (e.g., a physical reaction, emotional reaction, neurological reaction, etc.). For example, a particular stimulus may produce an emotional reaction in the wearer causing the wearer to become excited. In some embodiments, the one or more stimulus transmitters may be any suitable transmitter such as: (1) a vibration transmitter; (2) a temperature changing transmitter; (3) a smell transmitter; (4) a light transmitter; (5) a sound transmitter; and (6) a shock transmitter.


In particular embodiments, the one or more stimulus transmitters may be operatively coupled to a vehicle that the wearer drives and/or machinery that the wearer operates. For example, the one or more stimulus transmitters may be configured to generate one or more stimuli from one or more of the following vehicle systems: (1) in-cabin climate control; (2) audio system; (3) engine; (4) in-cabin lighting elements; and (5) brakes. By being operatively coupled to the one or more vehicle systems, the one or more transmitters may be configured to generate one or more stimuli such as: (1) turning up or down the volume on the vehicle's audio system; (2) disabling the driving functionality of the vehicle; (3) turning up or down the temperature on the vehicle's in-cabin climate control; (4) turning on or off the lights in the vehicle's cabin; (5) applying the brakes on the vehicle; etc.
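
The mapping from stimuli to vehicle systems might look something like the following sketch; the VehicleStub class and its attributes are stand-ins invented for illustration and do not correspond to any actual vehicle interface.

    class VehicleStub:
        """Stand-in for a vehicle interface; a real vehicle integration would differ."""
        def __init__(self):
            self.volume = 10
            self.cabin_temp_c = 22.0
            self.lights_on = False

    def apply_vehicle_stimulus(vehicle, stimulus):
        """Dispatch one of the in-cabin stimuli described above to the (stubbed) vehicle systems."""
        if stimulus == "audio":
            vehicle.volume += 10            # turn the audio system up
        elif stimulus == "climate":
            vehicle.cabin_temp_c -= 3.0     # cool the cabin to rouse the wearer
        elif stimulus == "lighting":
            vehicle.lights_on = True        # turn the cabin lights on
        else:
            raise ValueError("unsupported stimulus: " + stimulus)

    car = VehicleStub()
    apply_vehicle_stimulus(car, "audio")
    print(car.volume)  # 20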


In particular embodiments, the one or more stimulus transmitters may be configured to transmit the at least one stimulus from one or more remote computing devices associated with the wearer and/or a third party. For example, the one or more stimulus transmitters may transmit the at least one stimulus from the wearer's cellphone by causing the cellphone to ring, vibrate, create a flashing light, and/or generate any other suitable alert. In various embodiments, the system is configured to transmit the one or more stimuli wirelessly (e.g., via a cellular network, Bluetooth, Wi-Fi, etc.).


At Step 320, the system determines a current heart rate of the wearer. In particular embodiments, the system determines the current heart rate of the wearer at least partially in response to the system transmitting at least one stimulus from the one or more stimulus transmitters to the wearer. In some embodiments, the system determines the current heart rate of the wearer prior to the system transmitting at least one stimulus from the one or more stimulus transmitters to the wearer. In some of these embodiments, the system may also determine the current heart rate of the wearer after the system transmits the at least one stimulus to the wearer. In particular embodiments, the system may determine the current heart rate of the wearer after detecting a change in the wearer's current heart rate. In particular embodiments, the system may determine the current heart rate after receiving an indication from the wearer and/or a third party that the system should determine the current heart rate.


In various embodiments, the system may determine the current heart rate after a predefined time period. In some embodiments, the system may determine the current heart rate when a sensor detects movement and/or activities of the wearer (e.g., sitting, lying down, standing, driving a vehicle, flying in an airplane, walking, running, lifting, exercising, etc.). For example, the system may determine the current heart rate when the sensor detects that the wearer is driving a vehicle as determined by the speed at which the wearer is currently traveling.


The system may, in some embodiments, determine the current heart rate multiple times (e.g., measure the current heart rate from the heart rate monitor three times) to create an average current heart rate. In various embodiments, in addition to determining the current heart rate of the wearer, the system may determine other current physiological attributes of the wearer (e.g., pupil size, oxygen level, blood sugar level, the wearer's gaze and/or changes thereto, perspiration level, respiratory rate, brain wave activity, blink rate, free of physical injuries, injured, asleep, awake, conscious, unconscious, alive, deceased, stable, good, fair, serious, critical, distressed, etc.). In some embodiments, the system may be configured to determine the current heart rate from the heart rate monitor at the same time that it determines one or more current physiological levels from a second sensor. For example, the system may be configured to receive a current heart rate at the same time that it receives a head position measurement from the gyroscope.


In particular embodiments, the system may determine the current heart rate to be an elevated heart rate. In various embodiments, the system may determine the current heart rate to be a normal heart rate. In particular embodiments, the system may determine the current heart rate to be a first particular heart rate for a first period of time and a second particular heart rate for a second period of time.


The system, at Step 325, compares the current heart rate of the wearer to the baseline heart rate of the wearer. In various embodiments, the system compares the wearer's current heart rate to the wearer's baseline heart rate substantially automatically after the system determines the current heart rate of the wearer. In some embodiments, the system may compare the wearer's current heart rate to the wearer's baseline heart rate periodically (e.g., by the second, by the minute, hourly, daily, weekly, monthly, etc.). For example, the system may compare the wearer's current heart rate to the wearer's baseline heart rate every thirty minutes throughout the day. In other embodiments, the system may compare the wearer's current heart rate to the wearer's baseline heart rate after receiving an indication from the wearer or a third party that the system should compare the wearer's current heart rate to the wearer's baseline heart rate. In various embodiments, the system may receive an indication from the wearer and/or a third party of when to have the system compare the wearer's current heart rate to the wearer's baseline heart rate. For example, the system may receive an indication from the wearer to have the system compare the wearer's current heart rate to the baseline heart rate of the wearer at 6:00 a.m. and at 2:30 p.m. on a particular day.


In various embodiments, the system may compare the wearer's current heart rate to the wearer's baseline heart rate to determine if the wearer's current heart rate deviates from the wearer's baseline heart rate. In particular embodiments, the wearer's current heart rate may not deviate from the wearer's baseline heart rate. In some embodiments, the wearer's current heart rate may deviate from the wearer's baseline heart rate. In particular embodiments, the system may detect one or more deviations from the wearer's baseline heart rate. In various embodiments, the system may determine the wearer's current heart rate deviates from the wearer's baseline heart rate based on a predetermined percentage. For instance, the system may determine a wearer's current heart rate deviates from the wearer's baseline heart rate if the wearer's current heart rate is 10% faster than the wearer's baseline heart rate. In particular embodiments, the system may store the comparisons in an account associated with the wearer. In some embodiments, the comparisons may be accessible by the wearer and/or a third party. For instance, the comparisons may be diagramed in a chart that is accessible from the wearable device or from a computing device by the wearer's employer.
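
The percentage-based deviation check described above (e.g., flagging a current heart rate more than 10% faster than the baseline) could be expressed as simply as the following; the 10% figure is the example from the text, and the function name is an assumption.

    def deviates_from_baseline(current_bpm, baseline_bpm, percent=10.0):
        """Return True when the current heart rate is more than `percent` faster than the baseline."""
        return current_bpm > baseline_bpm * (1 + percent / 100.0)

    print(deviates_from_baseline(current_bpm=84, baseline_bpm=75))  # True  (12% faster)
    print(deviates_from_baseline(current_bpm=80, baseline_bpm=75))  # False (about 6.7% faster)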


At least partially in response to comparing the current heart rate of the wearer to the baseline heart rate of the wearer, at Step 330, the system determines a current alertness level of the wearer. In particular embodiments, the system may determine the current alertness level of the wearer by calculating an amount of time it takes the current heart rate of the wearer to return from an elevated heart rate that results from the one or more stimuli to about the baseline heart rate of the wearer. In some embodiments, the system may set a predetermined amount of time that it takes the current heart rate of the wearer to return from an elevated heart rate to about the baseline heart rate of the wearer as a normal amount of time. In particular embodiments, the system may set a range of time as a normal amount of time for the wearer's heart rate to return from the elevated heart rate to the baseline heart rate. For example, the predetermined amount of time may be a range between 60 seconds and 300 seconds. In various embodiments, the system may determine whether the amount of time it takes the current heart rate of the wearer to return from the elevated heart rate that results from the one or more stimuli to the baseline heart rate of the wearer is above a particular threshold value. In particular embodiments, the system may determine the current alertness level of the wearer to be a particular alertness level for a particular period of time. For example, the system may determine that the wearer has been awake for over 15 hours.
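
A sketch of the recovery-time calculation described above follows; the tolerance band, the 60-to-300-second normal range taken from the example, and the mapping of recovery times to alertness labels are illustrative assumptions rather than the claimed method.

    def recovery_time_seconds(samples, baseline_bpm, tolerance=0.05):
        """Seconds from the stimulus until the heart rate returns to within `tolerance` of baseline.

        `samples` is a list of (seconds_since_stimulus, bpm) tuples; None is
        returned if the heart rate never comes back to the baseline band.
        """
        for t, bpm in samples:
            if bpm <= baseline_bpm * (1 + tolerance):
                return t
        return None

    def alertness_from_recovery(recovery_s, normal_range=(60, 300)):
        """Map a recovery time onto a coarse alertness label (illustrative only)."""
        if recovery_s is None or recovery_s > normal_range[1]:
            return "low alertness"      # slow (or no) return to baseline
        return "normal alertness"

    samples = [(0, 120), (30, 110), (90, 95), (150, 78)]
    t = recovery_time_seconds(samples, baseline_bpm=75)
    print(t, alertness_from_recovery(t))  # 150 normal alertness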


In some embodiments, the system may determine the wearer's level of alertness based on the amount of time that it takes for the wearer's current heart rate to return to a particular deviation from the wearer's baseline heart rate. For example, the wearer's alertness level may be determined based on the amount of time it takes the wearer's current heart rate to fall to within a 5% range about the wearer's baseline heart rate. In particular, if the wearer's baseline heart rate is 75 beats per minute and the current heart rate is 120 beats per minute, then the system may determine the amount of time it takes for the current heart rate to get to within 5% of the baseline heart rate (e.g., 79 beats per minute).


In various embodiments, the system may determine the current alertness of the wearer based on the one or more physical attributes of the wearer (e.g., pupil size, heart rate, perspiration level, respiratory rate, brain wave activity, free of physical injuries, injured, asleep, awake, conscious, unconscious, alive, deceased, stable, good, fair, serious, critical, distressed, etc.).


In particular embodiments, the system may store the current alertness of the wearer in an account associated with the wearer. In some embodiments, the current alertness of the wearer may be accessible by the wearer and/or a third party. For instance, the current alertness of the wearer may be diagramed in a chart that is accessible from the wearable device or from a computing device by the wearer's supervisor. In various embodiments, the system may store the current alertness of the wearer in the Database 140.


At Step 335, the system notifies the wearer of the wearer's current alertness level. In various embodiments, the system may notify a third party of the wearer's current alertness level. In particular embodiments, the system may notify the wearer and/or a third party when the wearer's current alertness deviates from the wearer's baseline alertness. In some embodiments, in addition to notifying the wearer, the system may update the wearer's account to note that a notification was sent. In particular embodiments, the system may notify the wearer of the deviation from the baseline alertness by displaying an image on the lens of the eyewear. In other embodiments, the system notifies the wearer of the deviation from the baseline alertness by communicating through a speaker to the wearer. In various embodiments, the system may notify the wearer of the deviation from the baseline alertness by sending an electric shock to the wearer. In particular embodiments, the system may notify the wearer by generating an alert from one or more of the following vehicle systems: (1) in-cabin climate control; (2) audio system; (3) engine; (4) in-cabin lighting elements; and (5) brakes. By being operatively coupled to the one or more vehicle systems, the system may be configured to generate one or more alerts such as: (1) turning up or down the volume on the vehicle's audio system; (2) disabling the driving functionality of the vehicle; (3) turning up or down the temperature on the vehicle's in-cabin climate control; (4) turning on or off the lights in the vehicle's cabin; (5) applying the brakes on the vehicle; etc.
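
Notification might be dispatched over whichever channels are available (lens display, speaker, paired phone, vehicle systems); the callable-based channel mapping below is a hypothetical sketch, not an interface defined by the system.

    def notify(alertness_level, channels):
        """Send the wearer's current alertness level over each available notification channel.

        `channels` maps a channel name to a callable that delivers the message;
        real devices (lens display, speaker, paired phone, vehicle systems)
        would each supply their own implementation.
        """
        message = "Current alertness: " + alertness_level
        for name, send in channels.items():
            send("[" + name + "] " + message)

    notify("drowsy", {
        "lens_display": print,
        "speaker": print,
        "phone_sms": print,
    })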


In some embodiments, the system notifies the wearer and/or the third party of the deviation from the baseline alertness by sending a notification to the wearer's and/or the third party's mobile device. In particular embodiments, the system notifies the wearer and/or the third party of the deviation from the baseline alertness by email and/or text message. In other embodiments, the system may notify the wearer and/or the third party of a single deviation from the normal alertness substantially immediately after the system detects the deviation between the wearer's current alertness as compared to the wearer's normal alertness.


In various embodiments, the system may notify the wearer and/or the third party of the deviation from the normal alertness after a particular event. For example, the system may notify a third party of the wearer's current alertness if the system determines that the wearer's vehicle has swerved. In some embodiments, the system may notify the wearer and/or the third party of the deviation from the normal alertness after a particular period of time. For instance, the system may notify the wearer of the deviation from the wearer's normal alertness to a current alertness of drowsy after detecting that the wearer has been drowsy for one minute. In particular embodiments, at least partially in response to determining that the amount of time it takes the current heart rate of the wearer to return from an elevated heart rate that results from the one or more stimuli to about the baseline heart rate of the wearer is above a particular threshold value, the system may generate an alert that the wearer is tired.


In various embodiments, at least partially in response to determining that the wearer has experienced an impact, the system may notify one of the wearer or a third party that the wearer has experienced an impact. For example, if the wearer becomes drowsy and experiences an impact, such as from the vehicle's airbag deploying when the vehicle is involved in an accident, the system may notify the wearer's supervisor that the wearer has experienced an impact.


In various embodiments, in addition to notifying the wearer of the deviation from the normal alertness, the system may also provide suggestions to the wearer on how to change the wearer's current alertness to make it conform to the wearer's normal alertness. For instance, where the wearer's normal alertness while driving is alert and attentive and the wearer's current alertness while driving is drowsy, the system may suggest that the wearer pull the vehicle over and rest. In some embodiments, the system may also provide suggestions to the wearer on the cause of the wearer's current alertness level. For example, where the wearer's normal alertness is alert and the wearer's current alertness is drowsy, the system may provide a suggestion to the wearer that the cause of the wearer's current alertness may be fatigue or a more serious medical issue such as depression based on the wearer's frequency of being drowsy. In addition to using the wearer's heartbeat to determine the wearer's level of alertness, the system may use information about one or more of the wearer's pulse oximeter level, blood pressure level, pupil response, etc. to determine the cause of the wearer's alertness level.


In various embodiments, the system, when executing the Alertness Monitoring Module 300, may omit particular steps, perform particular steps in an order other than the order presented above, or perform additional steps not discussed directly above.


Structure of the Eyewear


Referring again to FIG. 4, eyewear 400, according to various embodiments, includes: (1) the eyewear frame 410; (2) the first temple 412; and (3) the second temple 414. These various components are discussed in more detail below.


Eyewear Frame


Referring still to FIG. 4, eyewear 400, in various embodiments, includes any suitable eyewear frame 410 configured to support one or more lenses 418, 420. In the embodiment shown in this figure, the eyewear frame 410 has a first end 402 and a second end 404. The eyewear frame 410 may be made of any suitable material such as metal, ceramic, polymers or any combination thereof. In particular embodiments, the eyewear frame 410 is configured to support the first and second lenses 418, 420 about the full perimeter of the lenses. In other embodiments, the eyewear frame 410 may be configured to support the first and second lenses 418, 420 about only a portion of each respective lens. In various embodiments, the eyewear frame 410 is configured to support a number of lenses other than two lenses (e.g., a single lens, a plurality of lenses, etc.). In particular embodiments, the lenses 418, 420 may include prescription lenses, sunglass lenses, or any other suitable type of lens (e.g., reading lenses, non-prescription lenses), which may be formed from glass or polymers.


The eyewear frame 410 includes a first nose pad (not shown in the figure) and a second nose pad 424, which may be configured to maintain the eyewear 400 adjacent the front of a wearer's face such that the lenses 418, 420 are positioned substantially in front of the wearer's eyes while the wearer is wearing the eyewear 400. In particular embodiments, the nose pads may comprise a material that is configured to be comfortable when worn by the wearer (e.g., rubber, polymer, etc.). In other embodiments, the nose pads may include any other suitable material (e.g., plastic, metal, etc.). In still other embodiments, the nose pads may be integrally formed with the frame 410 and made from the same material as the frame.


The eyewear frame 410 includes a first and a second hinge 426, 428 that attach the first and second temples 412, 414 to the first and second ends 402, 404 of the frame, respectively. In various embodiments, the hinges may be formed by any suitable connection (e.g., tongue and groove, ball and socket, spring hinge, etc.). In particular embodiments, the first hinge 426 may be welded to, or integrally formed with, the frame 410 and the first temple 412, and the second hinge 428 may be welded to, or integrally formed with, the frame 410 and the second temple 414.


First and Second Temples


As shown in FIG. 4, the first temple 412, according to various embodiments, is rotatably connected to the frame 410 so that the first temple 412 can extend from a position substantially perpendicular to the frame 410, to a position substantially parallel to the frame 410, or to any position in between. The first temple 412 has a first and second end 412a, 412b. Proximate the first temple second end 412b, the first temple 412 includes an earpiece 413 configured to be supported by a wearer's ear. Similarly, the second temple 414, according to various embodiments, is rotatably connected to the frame 410 so that the second temple 414 can extend from a position substantially perpendicular to the frame 410, to a position substantially parallel to the frame 410, or to any position in between. The second temple 414 has a first and second end 414a, 414b. Proximate the second temple second end 414b, the second temple 414 includes an earpiece 415 configured to be supported by a wearer's ear.


Sensors


In various embodiments, the second temple 414 has one or more sensors 430 connected to the second temple 414. As discussed above, in various embodiments, the one or more sensors 430 may be coupled to the frame 410, the first and second temples 412, 414, the first and second lenses 418, 420, or any other portion (e.g., the nose pads, etc.) of the eyewear 400 in any suitable way. For instance, the one or more sensors 430 may be embedded into the eyewear 400, coupled to the eyewear 400, and/or operatively coupled to the eyewear 400. In various embodiments, the one or more sensors 430 may be formed at any point along the eyewear 400. For instance, a fingerprint reader may be disposed adjacent the first temple of the eyewear 400. In various embodiments, the one or more sensors 430 may be formed in any shape. In addition, the one or more sensors 430 may be formed on the inner (back) surface of the frame 410, the first and second temples 412, 414, the first and second lenses 418, 420, or any other portion of the eyewear 400. In other embodiments, the one or more sensors 430 may be formed on the outer (front) surface of the frame 410, the first and second temples 412, 414, the first and second lenses 418, 420, or any other portion of the eyewear 400.


In various embodiments, the one or more sensors 430 that are coupled to the eyewear (or other wearable device) are adapted to detect one or more characteristics of the eyewear or a wearer of the eyewear, wherein the one or more characteristics of the eyewear or the wearer are associated with the wearer's physiology. In various embodiments, the one or more sensors coupled to the eyewear or other wearable device may include, for example, one or more of the following: a near-field communication sensor, a Bluetooth chip, a GPS unit, an RFID tag (passive or active), a fingerprint reader, an iris reader, a retinal scanner, a voice recognition sensor, a heart rate monitor, an electrocardiogram (EKG), an electroencephalogram (EEG), a pedometer, a thermometer, a front-facing camera, an eye-facing camera, a microphone, an accelerometer, a magnetometer, a blood pressure sensor, a pulse oximeter, a skin conductance response sensor, any suitable biometric reader, an infrared LED sensor, a photodiode sensor, or any other suitable sensor. In some embodiments, the one or more sensors may include a unique shape, a unique code, or a unique design physically inscribed into the eyewear that may be readable by an individual or a remote computing device.
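
Purely as an illustration of working with whichever of these sensors happens to be present, the sketch below registers a few assumed sensor names and collects one reading from each; the interface, names, and units are assumptions for the example, not a specified device API.

```python
from typing import Callable, Dict

class SensorRegistry:
    """Collects readings from whichever sensors happen to be coupled to the eyewear."""

    def __init__(self) -> None:
        self._sensors: Dict[str, Callable[[], float]] = {}

    def register(self, name: str, read: Callable[[], float]) -> None:
        self._sensors[name] = read

    def snapshot(self) -> Dict[str, float]:
        # One reading per registered sensor, keyed by sensor name.
        return {name: read() for name, read in self._sensors.items()}

# Example with placeholder readings:
registry = SensorRegistry()
registry.register("heart_rate_bpm", lambda: 72.0)
registry.register("pulse_oximeter_spo2_pct", lambda: 98.0)
registry.register("skin_temperature_c", lambda: 33.5)
print(registry.snapshot())
```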


In various embodiments, the one or more sensors are coupled to a computing device that is associated with (e.g., embedded within, attached to) the eyewear or other wearable device. In particular embodiments, the eyewear or other wearable device comprises at least one processor, computer memory, suitable wireless communications components (e.g., a Bluetooth chip) and a power supply for powering the wearable device and/or the various sensors.


As noted above, the one or more sensors may be coupled to a Bluetooth device that is configured to transmit the one or more signals to a handheld wireless device, and the step of using the eyewear to measure the baseline heart rate from the one or more sensors (discussed above in reference to Step 310) further comprises receiving the one or more signals from the handheld wireless device (e.g., via the Internet). In particular embodiments, one or more of the sensors may be detachable from the eyewear. For instance, if a wearer does not need a temperature sensor or other particular sensor, the sensor may be removed from the eyewear.
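
The sketch below illustrates only the receiving end of that relay under an assumed JSON message format: heart-rate samples forwarded by the handheld device are averaged into a baseline. The field names and the simple averaging are assumptions for the example; neither the Bluetooth transport nor the Internet relay is shown.

```python
import json
from statistics import mean

def baseline_from_relayed_payload(payload: str) -> float:
    """Average heart rate over a resting window of relayed samples."""
    samples = json.loads(payload)  # e.g., [{"t": 0, "hr_bpm": 70}, ...]
    return mean(sample["hr_bpm"] for sample in samples)

# Example payload as it might arrive from the handheld device:
payload = json.dumps([{"t": 0, "hr_bpm": 70}, {"t": 1, "hr_bpm": 72}, {"t": 2, "hr_bpm": 71}])
print(baseline_from_relayed_payload(payload))  # 71
```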


Exemplary User Experience


Monitor Driver Alertness


In a particular example of a wearer using the Alertness Monitoring Module 300 to determine their alertness, the wearer may put on the wearable device (e.g., computerized eyewear) as the wearer enters a vehicle to begin driving. During this time, the system tracks the alertness of the wearer using the system's heart rate sensor, orientation sensor(s), brainwave activity sensor, blink rate sensor, oxygen sensor, infrared LED, photodiode sensor, EOG sensor, front-facing camera, and/or eye-facing camera. The wearer may begin by having the system take a baseline measurement of the wearer's heart rate while the wearer is looking straight ahead out of the windshield of the vehicle. The wearer's baseline heart rate may be determined to be in the normal range for an individual of similar height, weight, sex, age, etc. The system then tracks the heart rate of the wearer to detect any changes in the wearer's heart rate. If the wearer's heart rate remains relatively unchanged for a predetermined period of time, the system may stimulate the wearer's heart rate by producing a vibration from the wearer's cellphone. If the wearer's heart rate then remains above a predetermined heart rate threshold for longer than a predetermined period of time, the system may determine that the wearer is drowsy or that the wearer's level of alertness is impaired relative to an expected level of alertness. The system may alert the wearer that the wearer is drowsy by decreasing the temperature in the vehicle driven by the wearer. This may cause the wearer to become alert again. If the wearer does not become appropriately alert, the system may generate an alert to a computing device associated with a third party, such as a supervisor of the wearer. In another example, the wearer's blink rate and/or blink frequency may increase or decrease based on the wearer's alertness level. For instance, where the wearer has become tired, the wearer's frequency of blinks may decrease as the wearer's eyes remain closed for longer periods of time, or remain open and fixated, as the wearer's level of alertness diminishes. The system may then determine that the wearer's current alertness deviates from the wearer's normal alertness. In response to determining a particular deviation from the wearer's normal alertness level (e.g., as determined by changes in the wearer's blink rate), the system may emit a sound, or other alert, to notify the wearer (or other individual) that the wearer's alertness level is too low for driving.
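
The following condensed sketch ties the steps of this example together. Every constant (baseline window, "unchanged" period, elevated margin, recovery limit) and every hook (read_hr, vibrate_phone, lower_cabin_temp, notify_supervisor, is_alert_again) is a hypothetical placeholder; the embodiments above do not prescribe particular values or interfaces.

```python
import time

# All constants below are illustrative assumptions, not values specified by the embodiments.
BASELINE_SAMPLES = 60          # number of one-second readings used for the baseline
UNCHANGED_PERIOD_S = 300       # an essentially unchanged heart rate for this long triggers a stimulus
UNCHANGED_BAND_BPM = 2         # readings within this band of the previous one count as "unchanged"
ELEVATED_MARGIN_BPM = 10       # "elevated" means this far above the baseline
RECOVERY_LIMIT_S = 120         # staying elevated longer than this is treated as impaired alertness

def monitor_driver(read_hr, vibrate_phone, lower_cabin_temp, notify_supervisor, is_alert_again):
    # 1. Baseline: average heart rate while the wearer looks straight ahead.
    baseline = sum(read_hr() for _ in range(BASELINE_SAMPLES)) / BASELINE_SAMPLES

    previous = None
    unchanged_since = time.monotonic()
    elevated_since = None
    while True:
        hr = read_hr()

        # 2. Heart rate unchanged for too long: stimulate via the wearer's cellphone.
        if previous is not None and abs(hr - previous) > UNCHANGED_BAND_BPM:
            unchanged_since = time.monotonic()
        previous = hr
        if time.monotonic() - unchanged_since >= UNCHANGED_PERIOD_S:
            vibrate_phone()
            unchanged_since = time.monotonic()

        # 3. Elevated beyond the recovery limit: treat the wearer as drowsy and escalate alerts.
        if hr >= baseline + ELEVATED_MARGIN_BPM:
            elevated_since = elevated_since or time.monotonic()
            if time.monotonic() - elevated_since >= RECOVERY_LIMIT_S:
                lower_cabin_temp()                       # first alert: cool the cabin
                if not is_alert_again():
                    notify_supervisor("Driver alertness appears impaired.")
                elevated_since = None
        else:
            elevated_since = None

        time.sleep(1.0)
```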


CONCLUSION

Many modifications and other embodiments of the invention will come to mind to one skilled in the art to which this invention pertains, having the benefit of the teaching presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the invention is not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for the purposes of limitation.

Claims
  • 1. A computerized eyewear comprising:
    a. at least one processor;
    b. one or more sensors operatively coupled to the at least one processor, the one or more sensors selected from the group consisting of:
      i. a heart rate monitor,
      ii. a pulse oximeter,
      iii. a gyroscope,
      iv. a forward facing camera, and
      v. an eye-facing camera;
    c. a power source operatively coupled to the at least one processor; and
    d. a communication device operatively coupled to the at least one processor;
    wherein the computerized eyewear is operatively coupled to a vehicle that a wearer is driving and is configured to:
      i. receive at least one signal from the one or more sensors on the computerized wearable device;
      ii. at least partially in response to receiving the at least one signal, determine a current alertness of the wearer of the computerized eyewear;
      iii. at least partially in response to determining the current alertness of the wearer, compare the current alertness of the wearer to a normal alertness of the wearer; and
      iv. notify the wearer when the current alertness of the wearer deviates from the normal alertness of the wearer by generating an alert selected from the group consisting of:
        1. turning up the volume on the vehicle's audio system;
        2. disabling driving functionality of the vehicle; and
        3. decreasing the temperature on the vehicle's in-cabin climate control.
  • 2. The computerized eyewear of claim 1, wherein
    a. the one or more sensors comprises a heart rate monitor; and
    b. the computerized eyewear is further configured to:
      i. receive a signal from the heart rate monitor;
      ii. determine a current heart rate of the wearer at least partially based on the received signal from the heart rate monitor; and
      iii. compare the current heart rate of the wearer to a baseline heart rate of the wearer.
  • 3. The computerized eyewear of claim 1, wherein
    a. the one or more sensors comprises a gyroscope; and
    b. the computerized eyewear is further configured to:
      i. receive one or more signals from the gyroscope;
      ii. determine a current pitch, roll, and yaw of the wearer's head at least partially based on the received one or more signals from the gyroscope; and
      iii. compare the current pitch, roll, and/or yaw of the wearer's head to a baseline pitch, roll, and/or yaw of the wearer's head.
  • 4. The computerized eyewear of claim 3, wherein the computerized eyewear is further configured to notify the wearer when the current pitch, roll, and/or yaw deviates from the baseline pitch, roll, and yaw by a predetermined amount.
  • 5. The computerized eyewear of claim 1, wherein
    a. the one or more sensors further comprises a blink rate sensor; and
    b. the computerized eyewear is further configured to:
      i. receive a signal from the blink rate sensor;
      ii. determine a current blink rate of the wearer at least partially based on the received signal from the blink rate sensor;
      iii. determine whether the current blink rate is below a particular blink rate threshold value; and
      iv. at least partially in response to determining that the current blink rate is below the particular blink rate threshold value, generate an alert indicating that the wearer is tired.
  • 6. A computer-implemented method of determining an alertness of a wearer of a computerized wearable device, the method comprising:
    a. receiving, by a processor, at least one signal from one or more sensors of the computerized wearable device;
    b. at least partially in response to receiving the at least one signal, determining, by a processor, a baseline heart rate of the wearer of the computerized wearable device;
    c. transmitting, by a processor, at least one stimulus from one or more stimulus transmitters to the wearer;
    d. determining, by a processor, a current heart rate of the wearer;
    e. comparing, by a processor, the current heart rate of the wearer to the baseline heart rate of the wearer;
    f. at least partially in response to comparing the current heart rate of the wearer to the baseline heart rate of the wearer, determining, by a processor, a current alertness level of the wearer by
      i. calculating, by a processor, an amount of time it takes the current heart rate of the wearer to return from an elevated heart rate that results from the one or more stimuli to about the baseline heart rate of the wearer;
      ii. determining, by a processor, whether the amount of time it takes the current heart rate of the wearer to return to about the baseline heart rate is above a particular threshold value; and
      iii. at least partially in response to determining that the amount of time it takes the current heart rate to return to about the baseline heart rate is above a particular threshold value, generating, by a processor, an alert that the wearer is tired; and
    g. notifying, by a processor, the wearer of the wearer's current alertness level.
  • 7. The computer-implemented method of claim 6, further comprising providing a computerized wearable device comprising one or more sensors coupled to the computerized wearable device, the one or more sensors being adapted to detect one or more physiological characteristics of the wearer of the computerized wearable device, wherein
    a. the one or more characteristics are associated with the wearer's alertness and one or more stimulus transmitters coupled to the computerized wearable device, the one or more stimulus transmitters being adapted to initiate the transmission of one or more stimuli,
    b. the computerized wearable device is eyewear having the one or more sensors, a power source, and a communication device coupled thereto.
  • 8. The computer-implemented method of claim 6, wherein the one or more stimuli are selected from the group consisting of:
    a. a shock,
    b. a vibration,
    c. a sound, and
    d. a flashing light.
  • 9. The computer-implemented method of claim 6, wherein generating the alert that the wearer is tired comprises sending a notification to a computing device associated with a third party.
  • 10. The computer-implemented method of claim 6, wherein generating the alert that the wearer is tired comprises sending a notification to a computing device associated with the wearer.
  • 11. The computer-implemented method of claim 6, wherein the computerized wearable device is further operatively coupled to a vehicle that the wearer is driving.
  • 12. The computer-implemented method of claim 11, wherein generating the alert that the wearer is tired comprises turning up the volume on the vehicle's audio system.
  • 13. The computer-implemented method of claim 11, wherein generating the alert that the wearer is tired comprises disabling driving functionality of the vehicle.
  • 14. The computer-implemented method of claim 11, wherein generating the alert that the wearer is tired comprises decreasing the temperature on the vehicle's in-cabin climate control.
  • 15. A computer system for determining the alertness of a wearer of a computerized wearable device, the computerized wearable device comprising:
    a. one or more sensors coupled to the computerized wearable device, the one or more sensors being adapted to detect one or more physiological characteristics of the wearer of the computerized wearable device, wherein the one or more characteristics are associated with the wearer's alertness; and
    b. at least one processor operatively coupled to the one or more sensors, wherein the at least one processor is configured for:
      i. receiving at least one signal from the one or more sensors of the computerized wearable device;
      ii. at least partially in response to receiving the at least one signal, measuring one or more baseline heart rates of the wearer of the computerized wearable device that are indicative of a normal heart rate of the wearer;
      iii. receiving at least another signal from the one or more sensors of the computerized wearable device, wherein the at least another signal indicates a current heart rate of the wearer;
      iv. calculating a predetermined heart rate threshold value based on the wearer's baseline heart rate;
      v. determining that the current heart rate of the wearer is above the predetermined heart rate threshold value that corresponds to an excited state of the wearer;
      vi. calculating a length of time that the current heart rate of the wearer remains above the predetermined heart rate threshold value;
      vii. at least partially in response to the length of time that the current heart rate of the wearer stays above the predetermined heart rate threshold value, determining the current alertness level of the wearer; and
      viii. notifying the wearer of the wearer's current alertness level.
  • 16. The computer system of claim 15, wherein the at least one processor is further configured for:
    a. calculating an amount of time it takes the current heart rate of the wearer to return to the normal heart rate of the wearer;
    b. determining whether the amount of time it takes the current heart rate of the wearer to return to the normal heart rate of the wearer is above a particular time threshold value;
    c. at least partially in response to determining that the amount of time it takes the current heart rate of the wearer to return to the normal heart rate of the wearer is above the particular time threshold value, generating an alert indicating that the wearer is tired.
  • 17. The computer system of claim 15, wherein:
    a. the computerized wearable device comprises one or more stimulus transmitters coupled to the computerized wearable device, the one or more stimulus transmitters being adapted to transmit one or more stimuli to the wearer of the computerized wearable device; and
    b. the at least one processor is further configured for:
      i. determining that the current heart rate of the wearer has been within a predetermined range of the normal heart rate for a predetermined amount of time;
      ii. at least partially in response to determining that the current heart rate of the wearer has been within the predetermined range of the normal heart rate for the predetermined amount of time, transmitting at least one stimulus from the one or more stimulus transmitters;
      iii. calculating a second length of time it takes the current heart rate of the wearer to return to the normal heart rate of the wearer;
      iv. determining whether the second length of time it takes the current heart rate of the wearer to return to the normal heart rate of the wearer is above a particular time threshold value;
      v. at least partially in response to determining that the second length of time it takes the current heart rate of the wearer to return to the normal heart rate of the wearer is above the particular time threshold value, generating an alert indicating that the wearer is tired.
  • 18. The computer system of claim 17, wherein the particular time threshold value comprises a range between 60 seconds and 300 seconds.
CROSS-REFERENCE TO RELATED APPLICATION

This application is a continuation-in-part of U.S. patent application Ser. No. 14/610,439, filed Jan. 30, 2015, and entitled, “Wearable Physiology Monitor Computer Apparatus, Systems, and Related Methods,” which claims the benefit of U.S. Provisional Patent Application No. 62/046,406, filed Sep. 5, 2014, and entitled, “Wearable Health Computer Apparatus, Systems, and Related Methods,” the disclosures of which are hereby incorporated by reference herein in their entirety.

Related Publications (1)
Number Date Country
20170265798 A1 Sep 2017 US
Provisional Applications (1)
Number Date Country
62046406 Sep 2014 US
Continuation in Parts (1)
Number Date Country
Parent 14610439 Jan 2015 US
Child 15611574 US