Methods and apparatus regarding electronic eyewear applicable for seniors

Information

  • Patent Grant
  • Patent Number
    11,721,183
  • Date Filed
    Friday, August 7, 2020
  • Date Issued
    Tuesday, August 8, 2023
Abstract
Novel methods and apparatus regarding electronic eyewear are disclosed. Different embodiments can be applicable for seniors. One embodiment includes an eyewear that can charge when placed in an eyewear case. A user can access a live operator by tapping a button at the eyewear. The live operator can connect the user to a person. The eyewear can detect falls. In another embodiment, the eyewear with a digital assistant can be voice activated, and can provide assistance via the digital assistant. In yet another embodiment, the eyewear includes a position identifying system to identify a position of the eyewear. In one embodiment, the eyewear could wirelessly interact with a device coupled to an apparatus in the vicinity of the eyewear.
Description
BACKGROUND OF THE INVENTION

As the elderly population continues to grow, technologies for aging will become crucial for ensuring seniors' well-being. Existing devices and services attempt to address some of these needs, but can be unwieldy and conspicuous. It is desirable to have methods and apparatuses that combine ease of use with compact electronics in items many seniors already use every day, to address the needs of seniors, particularly those who live by themselves.


SUMMARY OF THE INVENTION

Different embodiments focus on using electronic eyewear to address the needs of seniors, particularly those who live by themselves and may need extra help, such as to get around or keep in touch with loved ones.


As the elderly population continues to grow, technologies for aging are becoming more crucial for ensuring seniors' well-being. A number of embodiments combine ease-of-use with compact electronics in an item many seniors already use every day: eyewear.


Different embodiments can benefit elderly users who want to remain independent. The different embodiments empower and guide the elderly through routine obstacles, from maintaining a schedule to addressing critical needs in the event of an emergency.


One embodiment prioritizes a simple learning curve: a waterproof eyewear that can charge automatically when placed in an appropriate eyewear case, and can allow a user to access different features of the eyewear through the eyewear's one-button interface. For example, one tap of the button can summon a digital assistant—available in the user's preferred language—that can place calls to contacts and respond to questions like “Major news today?”; “Have I walked enough today?”; and “How do I get home from here?”. If the digital assistant can't help, it can connect the user to a live professional operator. The user can also connect to the operator directly at any time, simply by, for example, tapping the button more than once. The operator can offer more personalized assistance, including responding to more complicated requests, or connecting the user to the right person for the situation, such as a loved one or emergency services.


Another embodiment includes features for the user who can benefit from additional assistance and monitoring. The eyewear can detect falls, and can continually measure the user's key vital signs. In the event of a problem, the eyewear can send a caregiver to check on the user. In different embodiments, the eyewear can also provide convenient reminders for eating, taking medications, drinking water, and sleeping.


One embodiment not only can detect falls but also can connect the user to a live operator. The eyewear can be charged easily via inductive charging by placing the eyewear in an appropriate eyewear case. In another embodiment, the eyewear can also be voice activated.


One embodiment includes a position identifying system to help guide the user, such as when the user is lost. Many seniors want to remain independent and sometimes want to go for a walk, but they may forget how to get home. In one embodiment, by pushing a button on the eyewear, a user could be linked to a person who could guide the user home. Or in another embodiment, instead of a person, a digital assistant in the eyewear could help.


In one embodiment, the eyewear could wirelessly interact with a device in an apparatus in its vicinity. For example, the apparatus could be a stove. When the user turns on a burner at the stove, the device could send a signal to the eyewear. And a digital assistant in the eyewear could alert the user if the eyewear has not received from the device another wireless signal regarding the burner being turned off after a preset amount of time.


Different embodiments offer an easy, familiar way for seniors to maintain their independence, facilitating fast and automatic communication, and providing guidance. The different embodiments would enable seniors to retire their worries, not their lifestyles, and offer loved ones peace of mind.


Other aspects and advantages of the present invention will become apparent from the following detailed description, which, when taken in conjunction with the accompanying drawings, illustrates by way of example the principles of the invention.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 shows an embodiment of an eyewear in an eyewear casing that could help charge the eyewear.



FIG. 2 shows an embodiment of an eyewear in a plate with a slot that could help charge the eyewear.



FIG. 3 shows an embodiment of an eyewear sitting in a structure that could help charge the eyewear.



FIG. 4 shows an embodiment of an oscillator circuit for charging.



FIG. 5 shows an embodiment of a receiving charging circuit to charge a battery.



FIG. 6 shows an embodiment for RF charging in an eyewear.



FIG. 7 shows an embodiment of an eyewear that a senior can use.





Same numerals in FIGS. 1-7 are assigned to similar elements in all the figures. Embodiments of the invention are discussed below with reference to FIGS. 1-7. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes as the invention extends beyond these limited embodiments.


DETAILED DESCRIPTION OF THE INVENTION

Different embodiments of electronic eyewear for everyday use are disclosed herein and may be used individually or in any combination. They could be used, for example, by seniors.


One embodiment is equipped with an easily accessible button located, for example, at a side of the eyewear. Pressing the button once enables the user to speak to a digital assistant in her preferred language. The assistant can respond to a number of preset requests, including “How can I get to the library?”; “Louder!”; “Call Katherine for me.”; and “Tell me the major news today.”


If the user becomes lost, she can ask the digital assistant, “How do I get home?” In response, the assistant can provide real-time step-by-step directions to guide the user home using the eyewear's built-in GPS or other location guidance system.


In one embodiment, for more complex inquiries, the user can simply ask the digital assistant to connect her to a live professional operator. Or, by pressing the button more than once, the user can connect directly to the operator at any time, without going through the digital assistant. The operator can, if necessary, soothe the user, provide more personal help, and answer more complicated questions. The operator can also connect the user to family and friends, caregivers, or emergency services, as the particular situation requires.
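
As a minimal sketch of how this one-button interface might be dispatched, the following counts taps inside a short window and routes one tap to the digital assistant and two or more taps to the live operator. The `assistant` and `operator` interfaces, the window length, and the method names are illustrative assumptions, not details given in this description.

```python
import time

TAP_WINDOW_S = 1.0  # taps within this window form one gesture (assumed value)

class ButtonDispatcher:
    """One-button interface: one tap summons the digital assistant,
    more than one tap connects directly to a live operator."""

    def __init__(self, assistant, operator):
        self.assistant = assistant
        self.operator = operator
        self._taps = []

    def on_tap(self):
        now = time.monotonic()
        # keep only the taps that are still inside the gesture window
        self._taps = [t for t in self._taps if now - t < TAP_WINDOW_S] + [now]

    def resolve_gesture(self):
        """Called once the gesture window has elapsed."""
        count, self._taps = len(self._taps), []
        if count == 1:
            self.assistant.listen()   # single tap: digital assistant
        elif count >= 2:
            self.operator.connect()   # multiple taps: live operator
```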


In one embodiment, when the eyewear needs to be recharged, the user can place the eyewear in its case, which functions as an automatic charging station without any extra effort from the user. The next day, the glasses can be fully charged for use. It can be that easy!


One embodiment can detect falls. For example, if the user falls while walking and is then practically motionless, the eyewear could ask if the user is okay or directly alert a caregiver. It also can measure the user's other vital signs, such as body temperature, heart rate, and blood oxygen level. The eyewear can automatically connect to emergency services if monitored data surpasses predetermined safety thresholds. The eyewear can be waterproof, so it can continue to function even when it gets wet, like on a rainy day.


In one embodiment, the eyewear can add features for seniors who would benefit from additional assistance and monitoring. The eyewear can provide convenient reminders for eating, taking medications, drinking water, and sleeping. Motion sensors can be built into the eyewear to detect whether the user has remained stationary for too long. A digital assistant in the eyewear can encourage the sedentary user to stand, walk a few steps, or stroll outside. In one embodiment, the eyewear can light up the user's path to facilitate night activity.


In one embodiment, the eyewear can include a being-worn sensor, which could sense if the eyewear is worn. The eyewear can be automatically activated or turned on when the eyewear is worn. For example, when a user gets up in the morning and puts on the eyewear, the eyewear could be turned on automatically. Different embodiments of a being-worn sensor in an eyewear are described in U.S. Pat. No. 8,434,863, entitled, “Eyeglasses with Printed Circuit Board,” which is incorporated herein by reference.


In one embodiment, the eyewear could be automatically charged if placed in a charging station. For example, the eyewear could come with a case, which could be connected to a wall power outlet (directly or via a wire/cable). If the eyewear is placed in the case, the eyewear can then be charged. To illustrate, in one embodiment, a user could take off a pair of eyeglasses with folding arms. With the arms folded, the glasses could be turned off. The user could put the glasses into their case, where the eyeglasses could be charged. One charging mechanism can be via a connector at the eyewear being connected to a connector at a case (e.g., an electrical connector, such as a USB connector, accessible inside the case). Different embodiments regarding a charging station for an eyewear are disclosed in "Eyewear Housing for Charging Embedded Battery in Eyewear Frame," with application Ser. No. 15/409,723, which is incorporated herein by reference.


Another charging mechanism can be via inductive coupling. For example, there could be receiving charging coils of wires in the eyewear, such as inside the frame of the eyewear. To illustrate, the receiving charging coils could be at a lens holder and/or at one or more arms of a pair of eyeglasses. At corresponding position(s) inside a case where the eyewear could be housed, there could also be transmitting charging coils of wires. Via inductive coupling, a rechargeable battery in the eyewear can be charged via coupling between the receiving charging coils in the frame of the eyewear and the transmitting charging coils in the case. One such structure 100 is shown, for example, in FIG. 1, where there could be receiving charging coils at least around the lens holders of the eyewear. When the eyewear is placed in the case, for example, with the case closed, the eyewear could be charged automatically by transmitting charging coils in the case, such as at the oval-shaped area 102 at a side wall of the case as shown in FIG. 1, which could be arranged to be in close proximity to at least one of the lens holders 104 when the eyeglasses are placed in the case and the case is closed. In one embodiment, the eyewear and the case can be designed so that the eyewear could go into the case in only one way, such as when one wants to close the case. In one embodiment, there could be a switch at the case, such as close to the hinge of the case, so that when the case is closed, the switch could be activated to start the charging process.



FIG. 2 shows an embodiment of a plate 200 with a slot 202 to help charge an eyewear. In this example, transmitting charging coils can be in a wall area 204 at one end of the plate next to the slot 202. When the lens holders of an eyewear are in the slot, as shown, for example, in FIG. 2, the lens holders could be in close proximity to the transmitting charging coils in the wall area 204. Then via receiving charging coils, for example, in the lens holders, the battery within the eyewear could be charged.


In yet another embodiment, the eyewear could be placed on or in a structure, with areas in close proximity to parts of the eyewear, such as in close proximity to at least one of its arms, to charge the eyewear. One such structure 300 is shown, for example, in FIG. 3. There could be transmitting charging coils at a back panel 302 of the structure 300 to charge the battery within the eyewear, which could have receiving charging coils at least in one of the arms 304 of the eyewear. To illustrate, when the user retires at night and puts the eyewear on the structure 300 shown in FIG. 3, the eyewear's battery can be automatically recharged by transmitting charging coils in the back panel 302 sending charging signals to receiving charging coils in at least one arm 304 of the eyewear.



FIG. 4 shows an example of an oscillator circuit 400, serving as at least a portion of a transmitting charging circuit. The circuit 400 in FIG. 4 is sometimes known as a Colpitts oscillator. The circuit 400 includes an oscillator (such as the C1, C2 and L structure) generating AC signals at the base of a transistor (such as a 2N2222). The output of the oscillator circuit, Vout, could be used to drive a set of transmitting charging coils, such as 10 turns of coils with a diameter of 1.5″, connected between Vout and ground. With appropriate components, the oscillation could be at, for example, 140 kHz. One could also place an amplifier at Vout to increase its power to drive the transmitting charging coils. The oscillator circuit 400 shown in FIG. 4 is just an example. Other types of oscillator circuits can be used.
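
For reference, the oscillation frequency of a Colpitts tank follows from the inductance L and the series combination of C1 and C2:

$$ f = \frac{1}{2\pi\sqrt{L\,C_{\mathrm{eff}}}}, \qquad C_{\mathrm{eff}} = \frac{C_1 C_2}{C_1 + C_2} $$

With illustrative values C1 = C2 = 10 nF (so C_eff = 5 nF) and L ≈ 260 µH, f ≈ 140 kHz; these component values are assumptions chosen to match the example frequency, not values given above. The same relation, f = 1/(2π√(L1 C1)), sets the tuning of the receiving-side resonator of FIG. 5.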



FIG. 5 shows an example of a receiving charging circuit 500 to charge one or more batteries, such as in an eyewear. The circuit 500 could include a set of receiving charging coils, such as 10 turns of coils or wires with a diameter of 1.5″. The receiving charging coils could wirelessly receive charging signals from transmitting charging coils. AC signals at the outputs from the receiving charging coils, V1, could be connected to a resonator, such as L1 and C1, tuned at the charging frequency, such as 140 kHz. Then the AC signals at V1 can be converted to DC signals at V2, which can be stepped up by a voltage multiplier to raise the DC voltage to, for example, 5V at V3. The 5V DC signals can run a battery charger to charge a set of batteries at VBat in an eyewear.


Yet another charging mechanism can be via RF charging. FIG. 6 shows an example of such an implementation 600 in an eyewear. In the example, an antenna could run along an arm of the eyewear. The antenna can be tuned to the charging frequency, such as 2.4 GHz, to wirelessly capture RF signals from an RF transmitting circuit. The captured RF signals could be received by an RF receiving charging chip to charge one or more batteries in the eyewear. The RF transmitting circuit with a corresponding charging antenna could be at a charging structure, such as at the back panel 302 of the structure 300 shown in FIG. 3, to generate the RF signals to charge the one or more batteries in the eyewear.


In one embodiment, when a battery within the eyewear is being charged, the electronics in the eyewear other than the charging electronics could be off; and when the eyewear is not being charged, the electronics in the eyewear could be on. This could be done, for example, by measuring a voltage at the charging electronics. A high voltage value could indicate that charging is in progress, which could cause the other electronics in the eyewear to be turned off; and a voltage value below a threshold could indicate no charging, which could cause the electronics in the eyewear to be on.
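
A minimal sketch of this charging-state logic, assuming a hypothetical `eyewear` interface with an ADC read of the charging electronics and a battery gauge; the voltage threshold is illustrative, and the 25% low-battery level echoes the battery-indicator example below:

```python
CHARGING_VOLTAGE_THRESHOLD = 3.0   # volts; illustrative threshold at the charging electronics
LOW_BATTERY_FRACTION = 0.25        # alert at 25% of the fully-charged value

def update_power_state(eyewear):
    """While charging, keep the non-charging electronics off; when not
    charging, run normally and warn the user if the battery is low."""
    voltage = eyewear.read_charging_voltage()       # hypothetical ADC read
    if voltage >= CHARGING_VOLTAGE_THRESHOLD:       # high value: charging is on
        eyewear.set_main_electronics(on=False)
    else:                                           # below threshold: not charging
        eyewear.set_main_electronics(on=True)
        if eyewear.battery_fraction() <= LOW_BATTERY_FRACTION:
            eyewear.speak("Battery is low. Please recharge the glasses.")
```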


In one embodiment, the battery in the eyewear can be sufficient to operate the electronics in the eyewear for at least 24 hours when fully charged.


In one embodiment, the eyewear can notify the user when the battery level is low via an indicator, such as a visual indicator, like an LCD display or one or more LEDs. In another example, the eyewear could orally tell the user, via a speaker at the eyewear, that the battery level is low and should be recharged. To illustrate, when the battery level has dropped to 25% of its fully-charged value, the speaker could alert the user.


In one embodiment, the eyewear could monitor the heart rate of its user via a heartbeat sensor. Different embodiments of a heart rate monitor in an eyewear are described in U.S. Pat. No. 7,677,723, entitled, "Eyeglasses with a Heart Rate Monitor," which is incorporated herein by reference.


In one embodiment, the eyewear could include a pulse oximeter. At least one of the nose pads of the eyewear could have the pulse oximeter. In another embodiment, instead of at the nose pads, a pulse oximeter can be clipped onto a portion of the ear of the user, such as the user's earlobe, or can be attached to other areas of the skin of the user with capillaries. Based on the measurements of a pulse oximeter, a controller in the eyewear could determine the percentage of oxygen in the user's blood.


In one embodiment, the eyewear can include a temperature sensor. Different embodiments of a temperature sensor in an eyewear are described in U.S. Pat. No. 7,380,936, entitled, “Eyeglasses with a Clock or other Electrical Component,” which is incorporated herein by reference.


In one embodiment, the eyewear can include a motion sensor, such as a pedometer. Different embodiments of a motion sensor in an eyewear are described in U.S. Pat. No. 7,255,437, entitled, “Eyeglasses with Activity Monitoring,” which is incorporated herein by reference.


In one embodiment, the eyewear can include a blood pressure sensor. In one embodiment, a blood pressure sensor can be based on optical ballistocardiography and pulse oximetry. Electronics for both mechanisms could be at the nose pads of the eyewear, or they could be pressing onto parts of the skin of the user with capillaries. To illustrate, optical ballistocardiography can measure physical displacements of a section of the user's skin as a function of the user's pulse. The optical device can be made as a mat of optic fibers, with some emitting light and some sensing with phototransistors. The user's pulses can vibrate the skin, which could press onto the mat (such as the mat at a nose pad). The vibrating skin could alternately compress and relax the mat of fibers, causing the light received by the phototransistors to be modulated as a function of the movement of the mat. With that, optical ballistocardiography could produce a first pulse signal waveform. The pulse oximeter could produce a second pulse signal waveform. Depending on the time lag between the two waveforms, a controller in the eyewear could calculate the user's blood pressure.
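
The time-lag step lends itself to a short numerical sketch. The cross-correlation below is standard signal processing; the final linear mapping from lag to blood pressure is a placeholder whose coefficients would have to come from per-user calibration, since no model is specified here:

```python
import numpy as np

def pulse_time_lag(bcg, ppg, fs):
    """Estimate the time lag (seconds) between the ballistocardiography
    waveform and the pulse-oximeter waveform via cross-correlation.
    bcg, ppg: equal-length 1-D arrays sampled at fs Hz."""
    bcg = (bcg - bcg.mean()) / bcg.std()
    ppg = (ppg - ppg.mean()) / ppg.std()
    xcorr = np.correlate(bcg, ppg, mode="full")
    lag_samples = xcorr.argmax() - (len(ppg) - 1)
    return lag_samples / fs

def estimate_blood_pressure(lag_s, a=-120.0, b=180.0):
    """Map the lag to a blood-pressure estimate with a linear model;
    the coefficients a and b are placeholders that would come from
    per-user calibration, not values from this description."""
    return a * lag_s + b
```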


In one embodiment, the eyewear can include a blood glucose sensor. The blood glucose sensor could be at a nose pad of the eyewear.


In one embodiment, to save battery, one or more of the sensors do not sense the user continuously, but at regular intervals, such as every 10 minutes. Different sensors could sense at different intervals. For example, in one embodiment, heart rate could be monitored more frequently than temperature.
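
A small scheduling sketch of this duty-cycling, with illustrative intervals (only the 10-minute figure above is given as an example) and a hypothetical per-sensor `sample()` API:

```python
import time

# assumed per-sensor sampling intervals in seconds; heart rate more often than temperature
INTERVALS = {"heart_rate": 600, "spo2": 900, "temperature": 1800}

def run_sampling_loop(sensors):
    """Poll each sensor only when its interval has elapsed, to save
    battery, instead of sensing continuously."""
    next_due = {name: 0.0 for name in INTERVALS}
    while True:
        now = time.monotonic()
        for name, interval in INTERVALS.items():
            if now >= next_due[name]:
                sensors[name].sample()           # hypothetical sensor API
                next_due[name] = now + interval
        time.sleep(1)
```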


In one embodiment, the eyewear could alert a user to drink water, eat, exercise, weigh himself, sleep, and/or engage in other activities. This could be based on time. In one embodiment, typical times can be used, such as alerting the user to have dinner at 6 pm.


In one embodiment, the eyewear could be programmed to be tailored to the user, for example, supporting a set of predetermined routines. To illustrate, the eyewear could be programmed to alert the user to take medication (such as asking the user, “Taken the medication yet?”), follow a treatment plan, and/or sleep at specific times. As to taking medication, the eyewear could alert the user how and when to take medication, and what medication to take. The eyewear could alert the user to walk a little more, if the user has not walked much that day (as shown, for example, by a pedometer in the eyewear). Or, at a certain time each day, the eyewear could tell the user major news of the day. These could be major news identified by Facebook, and wirelessly downloaded to the eyewear.


In one embodiment, the eyewear can be wirelessly coupled to a calendar to remind the user of appointments, such as “time to play bridge in the lunch room.” The calendar could be downloaded to a memory device in the eyewear, such as when the user has a new calendared event for the calendar. In another example, a new calendared event could be entered, such as wirelessly, into the calendar in the eyewear. In yet another example, a new calendared event could be entered wiredly via a connector, such as a USB connector, into the calendar in the eyewear.


In one embodiment, based on motion monitored by a pedometer in an eyewear, the eyewear could alert the user, such as by 5 pm, to walk another certain number of steps before the end of the day.


In one embodiment, the eyewear could alert a user by sound, such as via an ear bud coupled to the eyewear. The alert could be based on an audio tone, such as beeps. In another embodiment, the alerts can be via human voices, and could be in a language preferred by the user.


In one embodiment, the eyewear could send emergency alerts to others. For example, the eyewear could send emergency signals wirelessly to a near-by device, such as the user's cell phone, which could automatically make a cellular call to an interested party. In another embodiment, the eyewear could send emergency signals directly to the interested party.


In one embodiment, the eyewear could periodically, such as every 12 hours, wirelessly send its monitored data to an interested party, who could review the data to determine if there are issues of concern. Or a system could analyze the data. Such analysis could be performed automatically, and the interested party could be alerted if there is an issue. The wireless transmission could be performed via another device in the vicinity of the user. The eyewear could send the monitored data to the other device, such as wirelessly to the user's cell phone, or wiredly to a computer that is plugged into a wall power outlet. In another embodiment, at least a portion, or a significant portion, or all of the analysis can be performed by a processor in the eyewear.


In one embodiment, a user could activate an emergency call. To prevent false alarms, one embodiment can require the user to press and hold a switch or button at the eyewear and let go. Then an interested party would call the user, and ask if the user needs assistance. In another embodiment, the user needs to press and hold the switch twice to initiate a distress call.


In one embodiment, the eyewear could have just a switch for a user to activate, and the switch can be a switch to at least activate an emergency call. For example, the user could activate the switch by pushing it. The switch can be made conspicuous. In addition to the switch, the eyewear could also include a microphone to receive the user's voice input. In another embodiment, the eyewear could include at least a speaker, which could be at an ear bud, and the eyewear could also include a user-controlled volume-changing switch to change the volume of the speaker. The simplicity of having fewer switches at the eyewear for a user to activate or push could help make the eyewear easier for a senior to use.


In one embodiment, the eyewear includes voice recognition software or firmware embedded therein. The software could have a dictionary to recognize words and sentences commonly used in specific areas, such as in times of emergency. For example, when the user wears the eyewear, the eyewear can passively listen. The eyewear could be programmed to be activated by a specific word. When that word is captured by the eyewear, the eyewear could try to recognize subsequent voice inputs based on its voice recognition capabilities.
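
A sketch of this passive-listening flow, with a hypothetical `recognizer` interface standing in for the embedded voice recognition software and an assumed activation word (the specific word is left open above):

```python
WAKE_WORD = "glasses"   # assumed activation word; not specified in this description

def passive_listen_loop(recognizer, assistant):
    """Stay in a low-power passive loop until the activation word is
    heard, then hand the subsequent voice input to full recognition."""
    while True:
        phrase = recognizer.listen_low_power()     # small always-on model
        if phrase and WAKE_WORD in phrase.lower():
            request = recognizer.listen_full()     # full recognition path
            assistant.handle(request)
```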


In one embodiment, the eyewear could send out an emergency call if, for example, abnormal signals or signals below certain preset thresholds have been tracked by the eyewear. To illustrate, if a pulse oximeter indicates that the oxygen level of the user is below 93%, an emergency call could be sent out to an interested party, with the reason for the call also included. Or a caregiver could be automatically sent to check on the user.


In one embodiment, the eyewear also can include a cellular phone that could at least receive calls.


In one embodiment, the eyewear can include a connector, which could be used to download monitored information from the eyewear. Different embodiments of a connector coupled to an eyewear are described in U.S. Pat. No. 7,500,747, entitled, “Eyeglasses with Electrical Components,” which is incorporated herein by reference.


In one embodiment, instead of downloading via a connector, monitored information or data could be accessed wirelessly. In another embodiment, the wireless access can be via a short-range wireless network, such as Bluetooth. The information in the eyewear could be password protected.


In one embodiment, the eyewear can include a light, such as an LED, which can turn on if the eyewear is worn at specific time frames, such as in the middle of the night. The time period to turn on the light could be programmed. Turning on the light can also depend on the user putting on the eyewear, which could, for example, activate the eyewear. In other words, in this embodiment, the light could turn on if the eyewear goes from the off state to the on state during the specific time frames. In one embodiment, the light can automatically turn off after a predetermined period of time, such as 3 minutes. To turn it back on, in one embodiment, one could take off the eyewear and put it back on again. In another embodiment, the user could activate or turn on the light manually via, for example, a switch at the eyewear. In yet another embodiment, the user could turn on the light via voice, with the eyewear having voice recognition electronics.


In one embodiment, the eyewear can include hearing enhancement abilities. Different embodiments of hearing enhancement abilities in an eyewear are described in U.S. Pat. No. 7,760,898, entitled, “Eyeglasses with Hearing Enhanced and Other Audio Signal-generating Capabilities,” which is incorporated herein by reference.


In other embodiments, the glasses can have a number of hearing enhancing capabilities. In one embodiment, the hearing enhancement is for those with mild or medium hearing loss. In another embodiment, the hearing enhancement is for those with severe hearing loss.


One hearing enhancement functionality is frequency-dependent amplification. For example, higher frequencies are amplified more than lower frequencies; certain frequency bands are not amplified; or the frequencies to be amplified are tailored to the user.


To tailor the amplification to a user, hearing enhancement capabilities can be calibrated against the user. The calibration can be done by the user or by a third party. The calibration can be performed through a website, which guides the user through the process. The calibrated frequency hearing profile of the user can be stored. Such calibration can be performed periodically, such as once a year.


In one embodiment, the user can perform the calibration by himself/herself. For example, the audio frequencies are separated into different bands. The glasses generate a different sound pressure level (SPL) at each band. The specific power level that the user finds most comfortable would be the power level for that band. Alternatively, the glasses could generate different tones in different frequency bands. The user could compare the tones and rate the perceived loudness. In this process, the glasses can prompt the user and lead the user through the process interactively. Based on the measurements, the glasses could create a calibration curve, which becomes the personal hearing profile for that user. After calibration, signals received in different bands, such as by a microphone in the glasses, will be amplified or attenuated according to the hearing profile.
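
Applying a stored hearing profile then amounts to per-band amplification. A minimal sketch, with an illustrative five-band profile (the band edges and gain values are assumptions, not calibration data from above):

```python
import numpy as np

# assumed calibration output: one gain in dB per band, with band edges in Hz
BAND_EDGES = [250, 500, 1000, 2000, 4000, 8000]
GAIN_DB = [0, 2, 5, 9, 12]        # illustrative hearing profile

def apply_hearing_profile(samples, fs):
    """Amplify each frequency band of a microphone frame according to
    the user's calibrated hearing profile."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    for (lo, hi), gain_db in zip(zip(BAND_EDGES, BAND_EDGES[1:]), GAIN_DB):
        band = (freqs >= lo) & (freqs < hi)
        spectrum[band] *= 10 ** (gain_db / 20.0)   # dB to linear amplitude
    return np.fft.irfft(spectrum, n=len(samples))
```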


In one embodiment, the eyewear can include a position identifying system, such as a GPS device, or other position identifying system, such as a system using wifi and/or cellular networks via, for example, triangulation. In one embodiment, a combination of more than one position identifying system could be used to identify position, and the combination could become the position identifying system. In one embodiment, the position identifying system can be normally off and activated remotely to track the position of the eyewear. For example, the user could send out an emergency call to an interested party, who could activate the position identifying system. This could be used to track the user, such as when the user is lost, and provide instructions orally to guide the user home. In one embodiment, the user could activate the position identifying system, and the system could guide the user to a preset place, such as back home via, for example, voice.


In one embodiment, the eyewear can include a lanyard, which could be permanently attached to the eyewear. The lanyard could be attached to a battery pack to provide power (or additional power) to electronics in the eyewear. Different embodiments of electronics tethered to an eyewear are described in U.S. Pat. No. 7,192,136, entitled, “Tethered Electrical Components for Eyeglasses,” which is incorporated herein by reference.


In one embodiment, the eyewear could detect if the user has fallen. For example, the eyewear could include an accelerometer. As one example, if the eyewear (a) detects an acceleration and then a stop, (b) with the being-worn sensor indicating the eyewear being worn, and (c) with little or no subsequent motion, the eyewear could assume the user has fallen, and send an emergency signal to an interested party. In another embodiment, the eyewear also could include an up/down sensor (or level sensor) showing the orientation of the eyewear. If the eyewear detects the above, with the eyewear orientation still showing it being in at least a substantially up orientation at the stop, the eyewear could send out an emergency signal asking an interested party to contact the user.


Different embodiments could be used to determine, for example, if a user wearing the eyewear is in danger, such as having fainted. One approach could be based on changes in the patterns of the monitored measurements, as determined by a controller in the eyewear. For example, the eyewear can be aware of the average heart rate or heart-beat pattern of the user. A dangerous condition could be a significant deviation from the average. Another approach could be based on measurements and/or changes in the patterns of measurements from different sensors in combination. For example, the eyewear has detected that the user might have fallen, and a microphone in the eyewear detects no sound from the user for a subsequent duration of time.
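
Combining the threshold example above (a pulse-oximeter reading below 93%) with the deviation-from-average and multi-sensor approaches, a compact sketch might look like the following; the sigma limit and the exact inputs are assumptions:

```python
import statistics

SPO2_EMERGENCY = 93.0    # % threshold from the pulse-oximeter example above
HR_SIGMA_LIMIT = 3.0     # assumed: flag heart rates far from the user's own baseline

def is_dangerous(hr_history, hr_now, spo2_now, fall_suspected, silent_after_fall):
    """Danger if a hard threshold is crossed, if the heart rate deviates
    significantly from the user's average pattern, or if a suspected
    fall is followed by silence at the microphone."""
    if spo2_now < SPO2_EMERGENCY:
        return True
    if len(hr_history) >= 2:
        mean = statistics.fmean(hr_history)
        sd = statistics.stdev(hr_history)
        if sd > 0 and abs(hr_now - mean) / sd > HR_SIGMA_LIMIT:
            return True
    return fall_suspected and silent_after_fall
```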


In one embodiment, in response to detecting danger, an interested party could try to contact the user. For example, the interested party could call the user and ask the user to, for example, push a button on the eyewear, such as pushing the button multiple times if the user is in danger.


In one embodiment, the eyewear can include an imaging system, such as a camera. The imaging system could be used, for example, to read barcodes off products, such as medicine bottles. And a controller in the eyewear could identify the product based on the barcodes read. In one embodiment, the imaging system could be a 3D imaging system, and the controller could, for example, identify the product based on its 3D image. Different embodiments of eyewear with cameras are described in U.S. Pat. No. 7,806,525, entitled, “Eyeglasses having a camera,” which is incorporated herein by reference.


In one embodiment, as shown, for example, in FIG. 7, the eyewear 700 includes (a) a button 702, which could be at an extended endpiece 704 of the eyewear 700, such as at its top surface (Different embodiments of eyewear with extended endpieces are described in U.S. Pat. No. 8,109,629, entitled, “Eyewear Supporting Electrical Components and Apparatus Therefor,” which is incorporated herein by reference); (b) at least a microphone, which could be at an extended endpiece of the eyewear; (c) at least a speaker 706, which could be at an extended endpiece of the eyewear; (d) at least a battery, which could be at an extended endpiece, or could be at an arm, of the eyewear; (e) a battery indicator; (f) voice recognition capability, with a digital assistant, to respond to a limited range of requests in voice; (g) a phone directory; and (h) cellular connection capabilities. The eyewear could also include a local wireless system that could track locations (such as based on WiFi signals), or a navigation satellite system that could track locations (such as a GPS system); and a pedometer. The eyewear could further include inductive charging ability, such as, for example, with wires at lens holders or arms of the eyewear. The eyewear, including the button, the microphone, and the speaker, could be waterproofed.


For the above embodiment, in operation, for example, when the button is pushed once, the digital assistant could respond to a limited range of requests from the user, such as: “What is the time?”; “Major news today?”; “Have I walked enough today?”; “Call Andy for me.”; “Louder.”; “Softer.”; “Help!”; “Everything is fine.”; and “How do I get home from here?” The digital assistant could be trained to respond accordingly. For example, if the question is “How do I get home?”, the digital assistant could respond, for example, by providing turn-by-turn directions via the built-in position identifying system to the user. The interaction with the digital assistant could be in a language preferred by the user.


In one embodiment, the user could use the eyewear to listen to audio books. For example, one or more digital books could be stored in or downloaded into the eyewear. The embodiment could include a display, such as an LCD display, to show the books and allow the user to scroll down the list of books to select the one the user wants. The books could be categorized. The user could select a category to have books under the category listed. Selection could be done via a button or a switch at the eyewear. In one embodiment, selection could be done via voice. In the embodiment with a digital assistant, the user could ask the digital assistant to start playing a digital book by describing to the digital assistant the book, such as telling the assistant the title of the book. In another embodiment, if the eyewear does not have the book, the digital assistant could find out how much it would cost to get the book, such as from Amazon, and ask the user if the user wants to acquire the book. If the user wants to, the digital assistant could download the book, such as from Amazon, based on, for example, the user's charge card information previously stored in the eyewear.


In one embodiment, the eyewear could include noise cancellation circuits, such as based on multiple microphones. For example, there could be a first microphone in an area on a top portion of the eyewear, such as a top portion of a lens holder or a top portion of an arm of the eyewear, for capturing sound from the environment. And there could be a second microphone in an area on a bottom portion of the eyewear, such as a bottom portion of a lens holder or a bottom portion of an arm of the eyewear, for capturing the user's voice. The audio signals from the second microphone capturing the user's voice could be adjusted based on the audio signals from the first microphone for noise cancellation, via techniques known to those skilled in the art.


In one embodiment, the eyewear could also identify where the user parked, via another device in the car of the user; turn a light on or off when asked, if the eyewear includes a light output, which could be pointing forward at an extended endpiece; and ask the user some brain exercise questions.


In one embodiment, if the digital assistant can't help, it could connect the user to a live professional operator (e.g. via cellular connection). The user can also connect to the operator directly at any time by quickly tapping the button, such as two or more times within a second. The operator can offer more personalized assistance, including responding to more difficult requests, or connecting the user to the right person for the situation, such as a close relative, as needed. When the battery is low, the battery indicator could provide an indication to the user, and the indication could be oral via the speaker.


In one embodiment, the eyewear could detect if the user has fallen. For example, the eyewear could include an accelerometer, which could operate as a motion sensor, such as a pedometer. As one example, if the eyewear detects an acceleration and then a stop, with little or no subsequent motion, the digital assistant could orally ask the user whether everything is fine. If the user doesn't respond, the eyewear could assume the user has fallen, and send an emergency signal to an interested party. In another embodiment, the eyewear also could include an up/down sensor (or level sensor) showing the orientation of the eyewear. If the eyewear detects the above regarding a fall, (a) with the eyewear orientation still showing it being in at least a substantially up orientation at the stop, and/or (b) with the eyewear orientation showing it being not in a substantially up orientation after the stop for a duration of time, the digital assistant could orally ask the user to respond. If the user doesn't respond, the eyewear could assume the user has fallen, and send an emergency signal asking for a caregiver to check on the user. In one embodiment, the caregiver, in addition to using the position identifying system in the eyewear, could also activate the eyewear wirelessly to give out a beeping sound to help the caregiver locate the user.
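
A minimal sketch of this sequence, assuming hypothetical `accel`, `worn_sensor`, `assistant`, and `alert` interfaces; the thresholds and timings are illustrative, not values from this description:

```python
import time

IMPACT_G = 2.5          # assumed acceleration spike, in g, suggesting a fall
STILL_G = 0.1           # assumed near-motionless band around 1 g of gravity
STILL_SECONDS = 10      # how long "little or no subsequent motion" lasts
RESPONSE_SECONDS = 15   # how long to wait for the user to answer

def check_for_fall(accel, worn_sensor, assistant, alert):
    """On an acceleration spike followed by a stop with little motion
    while the eyewear is worn, ask the user if everything is fine,
    and escalate to an emergency signal on silence."""
    if not worn_sensor.is_worn():
        return
    if accel.peak_magnitude_g() < IMPACT_G:
        return
    start = time.monotonic()
    while time.monotonic() - start < STILL_SECONDS:
        if abs(accel.magnitude_g() - 1.0) > STILL_G:
            return                     # user is moving again; no fall assumed
        time.sleep(0.1)
    assistant.say("Are you okay?")
    if not assistant.wait_for_reply(timeout=RESPONSE_SECONDS):
        alert.send_emergency("possible fall; user not responding")
```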


In one embodiment, the digital assistant can engage in simple dialogues with the user. This could be initiated by the user, such as by the user pushing the button and starting to talk to the digital assistant. For example, the user could say, “I don't feel good.” The assistant could respond open-endedly, such as, “Why don't you feel good?” Through the dialogue, the assistant could spot patterns in the user's spoken language.


For example, once a day, or several times a day, the assistant could ask the user how the user is feeling, and could offer choices for the user to select to respond, such as lonely, sad, or depressed. Then the digital assistant could recognize words and phrases in subsequent dialogues with the user that could be contextually likely to be associated with the selected choice. In one embodiment, the assistant could provide some simple counseling based on standard cognitive behavioral therapy. In another embodiment, if the assistant decides that the user is depressed, the assistant could suggest activities the user likes that could enhance the user's mood or distract the user. For example, the assistant could play some music the user likes.


In one embodiment, the digital assistant could be trained to analyze (a) syntactic patterns in the user's words, such as the frequency of nouns used versus adjectives; (b) classes of words the user uses, such as whether they are more related to perception or action; (c) how often the user talks to the digital assistant; (d) whether the user changes topics more abruptly than the user's average responses; and (e) acoustic features, such as changes in volume, pitch, and frequency of pauses by the user.


In one embodiment, the digital assistant could identify dangerous signs. For example, by analyzing dialogue from the user for an extended period of time, the assistant could identify a drastic change in the user's verbal communication. Note that the analysis could be done by a remote device, based on the user's daily communication data transmitted by the eyewear. Or, in another embodiment, the analysis, or a portion of the analysis, could be done by the eyewear. The analysis could be based on machine learning. In view of the dangerous signs identified, in one embodiment, a professional operator or a close relative of the user could be alerted, and this could be done, for example, by the digital assistant.


One embodiment for voice recognition could use processors or servers remotely accessed via the Internet (such as in the cloud), such as GPUs, to train the voice recognition models, which can be a machine learning-driven task. And the embodiment could use a lower-power processor, a client processor, or an edge processor in the eyewear to run inferences. In one embodiment, with the models trained, the client processor could run voice recognition without the need to wirelessly call the servers. This could lead to consuming less power, reducing latency, and giving a better user experience. In another embodiment, the eyewear could directly interact with the remote servers without using a client processor.


In one embodiment, for the client processor, the eyewear could use a lighter-weight TPU (Tensor Processing Unit), a custom chip optimized for inference in edge devices. Instead of TensorFlow, a Google machine learning development platform, other embodiments could use other development platforms, such as Caffe2 or PyTorch.
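
As one concrete possibility, and only as an assumed toolchain consistent with the TensorFlow example above, a model trained on remote servers could be exported to TensorFlow Lite and run by the client processor along these lines:

```python
import numpy as np
from tflite_runtime.interpreter import Interpreter  # lightweight on-device runtime

def run_on_device(model_path, audio_features):
    """Run a cloud-trained, locally stored model; no wireless call to a
    server is needed per recognition request."""
    interpreter = Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp["index"], audio_features.astype(np.float32))
    interpreter.invoke()
    return interpreter.get_tensor(out["index"])   # e.g. word probabilities
```

In this split, only training touches the remote servers; each recognition request stays on the device, which is where the power and latency savings come from.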


Instead of voice recognition, different embodiments could train and/or use machine learning models in the remote servers for other applications, such as image recognition.


In one embodiment, the eyewear could include mental exercises to sharpen mental skills of the user, such as word games, testing recall, and doing math mentally. In another embodiment, the eyewear could play games or other activities with the user to entertain the user. As an example, this could be administered to the user verbally by the eyewear.


In one embodiment, the eyewear could turn off automatically if there is no audio input, such as from the user, received for a preset amount of time, such as 2 minutes. In another embodiment, the user could ask the digital assistant to turn the eyewear off by, for example, saying, “Goodbye, glasses.” In one embodiment, the eyewear or the digital assistant could be programmed to be called a name, such as Joe. To turn the eyewear off, the user could say, “Goodbye, Joe.”


In one embodiment, instead of or in addition to a button, the eyewear could be programmed to be activated by voice, such as when the user says a specific word or phrase. When that word/phrase is captured by the eyewear, the eyewear could be activated. In one embodiment, the eyewear, with voice recognition capability, does not have the button and could be activated by voice. In one embodiment, when the eyewear is activated, a digital assistant in the eyewear could address the user.


In one embodiment, the eyewear could be wirelessly coupled to a medicine box. This could be done via a short-range wireless communication system, such as Bluetooth. When the medicine box is opened, the box could send a signal to the eyewear. The signal could be an indication that the user has initiated an action to take medication. The signal could also initiate an audio script by the eyewear as to the medication to take and how much. In another example, based on time monitored by the eyewear, the audio script could indicate to the user that it is not time to take any medication yet.


In one embodiment, the eyewear could be coupled, such as wirelessly coupled, to a remote medical monitoring system. The system could keep track of measurements performed by the eyewear on a user, and perform analysis accordingly. Different embodiments of a medical monitoring system are described in U.S. Pat. No. 8,112,293, entitled, “Medical monitoring system,” which is incorporated herein by reference.


In one embodiment, the eyewear could wirelessly interact with one or more other types of electronic devices in its vicinity (or local e-devices). A local e-device could be at one's property indoor or outdoor. It typically includes computing and communication electronics that, for example, allow it to interact, pair, or be authorized to participate and communicate via a local network or short-range wireless network (e.g. a wifi network or a Bluetooth network) with at least the eyewear. In one embodiment, a local e-device can be an IoT device. In another embodiment, a local e-device can be connected to an electrical power outlet at a wall. In yet another embodiment, a local e-device could also interact wirelessly with one or more other local e-devices in its vicinity.


In one embodiment, a local e-device could be an indoor device, such as in a house of the user.


In one embodiment, an indoor device could be coupled to an apparatus or a physical object in an indoor environment, such as in the house of the user; and the apparatus includes at least a status that could be changed by the user. As the status changes, a signal could be sent to the indoor device, which could send a corresponding signal to the eyewear. The eyewear does not have to be fully turned on, but can be in a listen mode to receive the signal, and the eyewear could act according to the signal received.


For example, the apparatus could be a stove with, for example, a status indicating a burner being turned on or off; a refrigerator with, for example, a status indicating its door being opened or shut; a door (such as a garage door, a front door or a patio door) with, for example, a status indicating it being opened or closed; a faucet with, for example, a status indicating it being turned on or off; and a shower with, for example, a status indicating it being turned on or off.


In one embodiment, as the user changes a status of an apparatus, the corresponding indoor device could send a signal to the eyewear of the user (which could be worn by the user), alerting the eyewear of the change. In another embodiment, if the status is changed in a way different from its norm, an alert signal could be sent to the eyewear. Depending on the alert signal received, the eyewear could react accordingly.


For example, when the user turns on a burner at a stove, an indoor device of the stove could send a wireless signal regarding the burner being turned on to the eyewear of the user. In one approach, a digital assistant at the eyewear could alert the user if the eyewear has not received from the indoor device another wireless signal regarding the burner being turned off after a preset amount of time.
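
A sketch of this timeout logic on the eyewear side; the message names, the assistant interface, and the 5-minute preset (taken from a later example in this description) are illustrative:

```python
import threading

BURNER_TIMEOUT_S = 300   # preset amount of time; 5 minutes, per a later example

class BurnerWatch:
    """On a 'burner on' signal from the stove's indoor device, start a
    timer; if no 'burner off' signal arrives before it fires, have the
    digital assistant alert the user."""

    def __init__(self, assistant):
        self.assistant = assistant
        self._timer = None

    def on_signal(self, message):
        if message == "burner_on":
            self._timer = threading.Timer(BURNER_TIMEOUT_S, self._alert)
            self._timer.start()
        elif message == "burner_off" and self._timer is not None:
            self._timer.cancel()
            self._timer = None

    def _alert(self):
        self.assistant.say("The stove burner has been on for a while. "
                           "Did you mean to leave it on?")
```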


As another example, when the user opens a door, an indoor device at the door could send a wireless signal regarding the door being opened to the eyewear of the user. In one approach, an alert signal at the eyewear could notify the user if the eyewear does not receive from the indoor device another wireless signal regarding the door being closed, such as after a preset amount of time.


In one embodiment, an indoor device can be used to open the door of a user's house. For example, the corresponding system could include at least two local e-devices, such as a microphone/speaker device at the door and a lock-control device controlling the lock of the door. When a visitor, for example, rings the door bell, the user could turn on the eyewear and ask the digital assistant to ask the visitor questions, such as the visitor's identity and the purpose of the visit, via the speaker at the door. The visitor's response can be captured by the microphone at the door, and presented to the user via the eyewear. If the user wants to open the door, the user could instruct the digital assistant to remotely open the door via the lock-control device. The digital assistant could re-confirm with the user before remotely activating the lock-control device to open the door. In one embodiment, the door bell also could be a local e-device. When the visitor rings the door bell, the eyewear could be notified, which, in turn, could notify the user accordingly.


In another embodiment, the corresponding system to open the door of a user's house could include a video camera to capture video/images of the visitor and transmit such information to the user via, for example, a display. In one embodiment, the display could be at the eyewear. In another embodiment, the display could be a monitor external to the eyewear. The eyewear could alert the user to watch the captured video/images.


In one embodiment, an indoor device can be used to lock a door, and the lock could be controlled electrically. In one embodiment, the device could be electrically coupled to the lock; and in another embodiment, the device includes the lock. A user could lock the door via the user's eyewear, which can be wirelessly coupled to the indoor device.


In one embodiment, an indoor device could be used to adjust one or more thermostats in the house. In one embodiment, an indoor device could be used to turn on/off or dim one or more lights in the house. In one embodiment, an indoor device could be used to control one or more other appliances in the house. In one embodiment, the adjustment/control could be through a digital assistant in the eyewear.


One embodiment previously described includes a medicine box sending signals to the eyewear. In one embodiment, a medicine holder could include an indoor device. When the medicine holder is opened, the indoor device could send a message to the eyewear. If the user opens the medicine holder again the same day, based on another message from the indoor device, the eyewear could alert the user that the holder has been opened before that day.


As yet another example, an indoor device could monitor one or more statuses of a car, such as the location of the car. After the car has been parked for a preset amount of time, such as 15 minutes, inside a garage of the house of the user, if a digital assistant in the eyewear has not received a signal from the indoor device that the car has been turned off, the digital assistant could alert the user.


In one embodiment, a signal from the eyewear to the user, such as an alert signal, could be a beeping sound. In another example, such a signal could be optical, such as via light from an LED at the eyewear. In yet another example, such a signal could be a voice message from a digital assistant in the eyewear.


In one embodiment, an indoor device of an apparatus could restrict the user from using the apparatus unless the user could satisfy certain predetermined criteria. For example, the apparatus could be a gun or a chain saw, which could be stored in a container. When the user wants to open the container or when the user wants to operate the apparatus, the user could, for example, be asked a question, such as the address of the house with the apparatus. Whether the user could answer the question correctly determines if the user could use the apparatus. In one embodiment, the indoor device could have the apparatus send a signal to the eyewear, requesting the eyewear to ask the question and to monitor the answer. Or, the indoor device could have the apparatus ask the question and monitor the answer, and send information regarding the interrogation, including the monitoring, to the eyewear.


In one embodiment, an alert or a signal from the eyewear regarding the status of a corresponding apparatus could be deactivated. For example, a digital assistant could alert the user after a burner at a stove has been turned on for 5 minutes. To illustrate, it turns out the user is making a stew, and the user could tell the digital assistant not to send any more alerts regarding this episode. Or, the user could ask the digital assistant to send a reminder after a certain amount of time, such as 30 minutes. In another embodiment, the digital assistant could be configured to override the user's request of not sending any more alerts if, after an unreasonable or predetermined amount of time, the eyewear still has not received any signal regarding the burner being turned off.


In one embodiment, a local e-device could be an outdoor device in the vicinity of the eyewear.


In one embodiment, an outdoor device could be coupled to an apparatus or a physical object out in the open; and the apparatus includes at least a status that could be changed by the user. As the status changes, a signal could be sent to or received by the outdoor device, which could wirelessly send a corresponding signal to the eyewear.


For example, the apparatus could be a car door with at least a status indicating the door being opened or closed. When the user opens the car door, an outdoor device at the door could send a wireless signal regarding the door being opened to the eyewear of the user. In one approach, a signal at the eyewear could notify the user if the eyewear has not received from the outdoor device another wireless signal regarding the door being closed, such as after a preset amount of time.


In one embodiment, an outdoor device could be in a car, and could be electrically coupled to batteries in the car and a position identifying system in the car that could track locations (such as via a navigation satellite system). An eyewear could be wirelessly coupled to the outdoor device via, for example, a wifi network or a cellular network. To illustrate, the car is in an outdoor parking lot, and a user could not find it. The user could ask a digital assistant in her eyewear for guidance. The digital assistant could send a wireless signal to the outdoor device. The outdoor device would determine its position (or the position of the car) via the position identifying system in the car. Then the outdoor device could wirelessly message the position to the digital assistant. The digital assistant could determine the position of the eyewear via a position identifying system in the eyewear. Based on the positions of the outdoor device and the eyewear, the digital assistant could guide the user back to the car.
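
Once the digital assistant has both positions, guiding the user back reduces to a distance and initial-bearing computation; a standard great-circle sketch, not specific to this description:

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (meters) and initial bearing (degrees)
    from the eyewear's position to the car's reported position, which
    a digital assistant could turn into walking directions."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    dist = 2 * R * math.asin(math.sqrt(a))
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing
```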


In one embodiment, a local e-device could couple to an object, and operate both indoor and outdoor, in the vicinity of the eyewear.


A number of examples have been described regarding local e-devices. In one embodiment, at least each local e-device that is configured to interact with the eyewear has a unique identifier relative to other local e-devices that are configured to interact with the eyewear. This could serve as a way for the eyewear to distinguish signals from one local e-device from signals from another local e-device.


In one embodiment, the eyewear could keep track of its interactions with the one or more local e-devices in its vicinity. The eyewear could analyze the interactions; or information regarding the interactions could be sent, such as wirelessly, to another computer to be analyzed. In one embodiment, based on the analysis, norms could be established regarding typical, normal, or regular behavior of the user toward different physical objects indoor or outside in the open.


In one embodiment, if a user's behavior toward a physical object indoor or outside in the open is significantly different from the corresponding norm, a digital assistant in the eyewear could ask the user whether there is any concern. Based on the user's response, the digital assistant could notify different persons of interest, and could present to them what has happened. In one embodiment, significantly different from the norm could mean more than one and a half standard deviations from the norm or mean. In another embodiment, significantly different from the norm could mean two or more standard deviations from the norm or mean.
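
A minimal sketch of this deviation test, using the standard-deviation limits mentioned above; the history here could be, for example, how many minutes a door is typically left open:

```python
import statistics

def deviates_from_norm(history, observed, sigma_limit=1.5):
    """Flag a behavior that falls more than `sigma_limit` standard
    deviations from the user's own mean; 1.5 and 2+ sigma are the
    limits mentioned above."""
    if len(history) < 2:
        return False                      # not enough data for a norm yet
    mean = statistics.fmean(history)
    sd = statistics.stdev(history)
    return sd > 0 and abs(observed - mean) / sd > sigma_limit
```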


In one embodiment, the eyewear includes an imaging system, which could include one or more cameras, such as optical cameras and/or infrared cameras. The imaging system could be used to gather images, which could be videos, of the environment of the eyewear. The images could be analyzed to help the user. The analysis could be done at the eyewear. In another embodiment, the eyewear could be wirelessly coupled to a computing device, and the eyewear could send the images to the computing device to perform the analysis. The computing device could be coupled to the eyewear via a local network (e.g. a local area network, or a wifi network). After the analysis, the computing device could send the results back to the eyewear.


For example, suppose the user is standing in front of an oven and does not know how to operate its different buttons. The user could ask the digital assistant in the eyewear, and the digital assistant could have the imaging system take images and then have them analyzed to help the user. If the user is not satisfied with the help, the user could ask the digital assistant to send the images to a live person and connect the user to that person for help. In one embodiment, the imaging system could keep taking images, which could be videos, of the oven while the user talks to the live person, who could help the user operate the different buttons on the oven.


In another embodiment, when the user is standing in front of a stove and tries to turn it on, an indoor device at the stove could, for example, ask the user a question before the stove can be turned on. In yet another embodiment, the indoor device could send a signal to the eyewear, which could be in a listening mode. The signal could activate the eyewear, which could analyze the signal and ask the user a question. For example, the question could be, "Why do you want to use the stove?" The user could respond, "I want to heat up some soup." The response can be analyzed, based on, for example, natural-language processing techniques, to determine whether it describes an allowed activity in the context of operating a stove. The analysis could use different pre-configured rules. For example, a rule could be time-based, with certain activities allowed at one time of day but not at another, as in the sketch below.
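
A minimal sketch of such a time-based rule table follows; the activities, allowed hours, and the deliberately crude keyword match (standing in for natural-language processing) are all illustrative assumptions.

    from datetime import datetime, time

    # activity -> (earliest, latest) time of day during which it is allowed
    RULES = {
        "heat soup": (time(7, 0), time(21, 0)),
        "boil water": (time(6, 0), time(22, 0)),
    }

    def is_allowed(response_text, now=None):
        """Match the user's response to a known activity and check
        whether the current time falls within its allowed window."""
        now = now or datetime.now()
        words = response_text.lower().split()
        for activity, (start, end) in RULES.items():
            if all(w in words for w in activity.split()):  # crude intent match
                return start <= now.time() <= end
        return False  # unrecognized activity: deny by default

    print(is_allowed("I want to heat up some soup"))  # True during 7:00-21:00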


In one embodiment, if the response is deemed appropriate, the requested action would be permitted; for example, the user could turn on the stove. If the response is deemed not appropriate, the user would not be able to turn on the stove, and the eyewear, the indoor device at the stove, or the stove itself could inform the user that, for example, it is not a good time to use the stove to heat soup.


In another embodiment, the imaging system could include additional sensors to assist the user. For example, the imaging system could include a flame detector to detect flames, such as the flames of a stove, and the eyewear can interact with the user to ensure the stove is used appropriately. To illustrate, one rule could be that two burners should not be turned on and left on beyond a preset amount of time. Based on such a rule, if the imaging system determines that two burners have been on for more than the preset amount of time, such as 10 seconds, the eyewear could react accordingly, such as by alerting the user. For example, the eyewear could interact with a corresponding indoor device at the stove, which could react in turn, such as by turning down the flames at both burners or turning both burners off.
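
The two-burner rule could be realized as a simple polling monitor, sketched below; the timing values and the callbacks for reading the flame detector and reacting are placeholders.

    import time

    MAX_TWO_BURNER_SECONDS = 10  # preset amount of time from the example above

    def monitor_burners(read_lit_burners, react, poll_s=1.0):
        """Poll the flame detector; if two or more burners stay lit
        beyond the preset time, invoke the reaction (alert the user,
        or have the indoor device turn the burners down or off)."""
        started = None
        while True:  # runs for as long as monitoring is desired
            lit = read_lit_burners()  # e.g., set of burner IDs with flames
            if len(lit) >= 2:
                if started is None:
                    started = time.monotonic()
                elif time.monotonic() - started > MAX_TWO_BURNER_SECONDS:
                    react(lit)
                    started = None
            else:
                started = None
            time.sleep(poll_s)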


In yet another embodiment, the eyewear can also take and analyze images to determine whether the user is actually using the stove appropriately, such as to heat soup. Based on the analysis, the eyewear could act accordingly, for example, by sending signals to an indoor device at the stove to have the stove operated appropriately.


In one embodiment, instead of performing the analysis, the eyewear could communicate with a supervising person to determine whether the user is allowed to perform a certain activity. For example, when the user wants to turn on a stove, a corresponding indoor device could alert the eyewear, which could take images of the user intending to turn on the stove and send the images to the supervising person. Alternatively, the eyewear could send the person a text message asking whether it is okay for the user to use the stove, such as to heat soup. Based on the supervisor's response, the eyewear could send a corresponding signal to the indoor device to act accordingly; for example, if the response is no, the indoor device could prevent the stove from being turned on.
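
A sketch of this approval flow follows; the messaging functions and the indoor device interface are placeholders for whatever channel and hardware are actually used.

    def request_approval(send_text, wait_for_reply, indoor_device):
        """Ask the supervising person for permission and relay the
        answer to the indoor device at the stove."""
        send_text("Is it OK for the user to turn on the stove to heat soup? "
                  "Reply yes or no.")
        reply = wait_for_reply(timeout_s=120)  # supervisor's text response
        allowed = reply is not None and reply.strip().lower().startswith("yes")
        indoor_device.set_stove_enabled(allowed)  # forbid power-on on "no"
        return allowed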


A number of embodiments have been described regarding an eyewear interacting with the one or more local e-devices in its vicinity. In one embodiment, instead of an eyewear, the embodiment could include a wearable device interacting with the one or more local e-devices in its vicinity. The wearable device could be in the form of a watch, a wristband, a piece of jewelry, a chest strap, a piece of clothing, or another type of device wearable on the body of the user.


Different embodiments regarding an eyewear have been described. The eyewear could be, for example, a pair of sunglasses, auxiliary frames, fit-over glasses, prescription glasses, safety glasses, a swim mask, or goggles (such as ski goggles). In one embodiment, the eyewear could be incorporated in a helmet or another type of headgear.


Different embodiments of electronics in an eyewear are described in (a) U.S. Pat. No. 9,488,520, entitled, “Eyewear With Radiation Detection,” which is incorporated herein by reference; (b) U.S. Pat. No. 8,905,542, entitled, “Eyewear Supporting Bone Conducting Speaker,” which is incorporated herein by reference; and (c) U.S. Pat. No. 8,337,013, entitled, “Eyeglasses with RFID Tags or with a Strap,” which is incorporated herein by reference.


The various embodiments, implementations, and features of the invention noted above can be combined in various ways or used separately. Those skilled in the art will understand from the description that the invention can be equally applied to or used in various other settings with respect to the combinations, embodiments, implementations, or features provided in the description herein.


Numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, it will become obvious to those skilled in the art that the invention may be practiced without these specific details. The description and representation herein are the common means used by those experienced or skilled in the art to most effectively convey the substance of their work to others skilled in the art. In other instances, well-known methods, procedures, components, and circuitry have not been described in detail to avoid unnecessarily obscuring aspects of the present invention.


Also, in this specification, reference to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Further, the order of blocks in process flowcharts or diagrams representing one or more embodiments of the invention does not inherently indicate any particular order nor imply any limitations on the invention.


Other embodiments of the invention will be apparent to those skilled in the art from a consideration of this specification or practice of the invention disclosed herein. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the invention being indicated by the following claims.

Claims
  • 1. An eyewear system for a user comprising: an eyewear frame comprising: a wireless communication component; an activity detector configured to monitor at least a user activity; a microphone; a memory storing at least a portion of voice-recognition instructions to enable recognizing voice depending on at least a voice recognition model previously trained based on machine learning in at least a server separate from the eyewear system, to allow recognizing voice at the eyewear frame without the need to wirelessly access a server; a re-chargeable battery; a speaker; a controller configured to detect a fall of the user based on at least the activity detector, in view of such a fall, generate a voice output that is configured to solicit a voice response, via at least the speaker in view of the detected fall of the user, and analyze reaction of the user to the voice output to determine a course of action, with the analyze including analyzing voice input received via at least the microphone, as recognized at least by the at least a portion of voice-recognition instructions if the reaction of the user includes the voice input from the user; and a case configured for the eyewear frame to be placed in, with the case configured to charge the re-chargeable battery based on the eyewear frame being placed in the case.
  • 2. An eyewear system as recited in claim 1, wherein the controller is configured to identify a location of the eyewear frame based on at least an electronic component in the eyewear frame.
  • 3. An eyewear system as recited in claim 1, wherein the course of action includes sending a message to an interested party via at least the wireless communication component, at least in view of the detected fall.
  • 4. An eyewear system as recited in claim 3, wherein the controller is configured to identify a location of the eyewear frame based on at least an electronic component in the eyewear frame, and wherein the message includes at least the location of the eyewear frame.
  • 5. An eyewear system as recited in claim 1, wherein the frame comprises a coil of wire, wherein the case comprises a coil of wire, wherein the charging of the re-chargeable battery is at least via induction, with at least the coil of wire in the frame becoming in proximity to at least the coil of wire in the case, and without the need for the eyewear frame to have a physical connector to be physically and electrically connected to another physical connector at the case for charging, and wherein at least the coil of wire in the frame is around a lens holder of the eyewear frame.
  • 6. An eyewear system as recited in claim 1, wherein the frame comprises a coil of wire, wherein the case comprises a coil of wire, wherein the charging of the re-chargeable battery is at least via induction, with at least the coil of wire in the frame becoming in proximity to at least the coil of wire in the case, and without the need for the eyewear frame to have a physical connector to be physically and electrically connected to another physical connector at the case for charging, wherein the eyewear frame includes a first and a second lens holders, wherein the eyewear frame comprises another coil of wire, wherein at least the coil of wire in the frame is around the first lens holder, and wherein at least the another coil of wire is around the second lens holder.
  • 7. An eyewear system as recited in claim 1, wherein the frame comprises a coil of wire, wherein the case comprises a coil of wire, wherein the charging of the re-chargeable battery is at least via induction, with at least the coil of wire in the frame becoming in proximity to at least the coil of wire in the case, and without the need for the eyewear frame to have a physical connector to be physically and electrically connected to another physical connector at the case for charging, and wherein at least the coil of wire in the frame is in an arm of the eyewear frame, with the coil having at least a loop configured to charge the re-chargeable battery.
  • 8. An eyewear system as recited in claim 1, wherein the eyewear is eyeglasses with at least one lens holder.
  • 9. An eyewear system as recited in claim 1, wherein the controller is configured to enhance hearing of the user based on at least an electrical component in the eyewear frame.
  • 10. An eyewear system as recited in claim 1, wherein the eyewear frame includes an electrical component configured to monitor heart rate of the user, at least via the electrical component touching skin of the user, and wherein the controller is configured to send a message to an interested party via at least the wireless communication component, in view of a monitored heart rate of the user.
  • 11. An eyewear system as recited in claim 1, wherein the eyewear frame is configured to be activated by voice input received via at least the microphone, as recognized by the at least a portion of voice-recognition instructions.
  • 12. An eyewear system as recited in claim 1, wherein the course of action includes sending a message to an interested party via at least the wireless communication component, in view of the analyze the reaction of the user to identify no voice response from the user to the voice output.
  • 13. An eyewear system as recited in claim 1, wherein the eyewear frame includes an electrical component configured to monitor heart rate of the user.
  • 14. An eyewear system as recited in claim 13, wherein the controller is configured to send a message to an interested party via at least the wireless communication component, in view of a monitored heart rate of the user.
  • 15. A non-transitory computer readable storage medium comprising a plurality of instructions in an eyewear frame of an eyewear system, with the eyewear frame comprising a wireless communication component, a microphone, a re-chargeable battery, a speaker, and a controller, with the plurality of instructions including voice-recognition instructions to enable recognizing voice depending on at least a voice recognition model previously trained based on machine learning in at least a server separate from the eyewear system, to allow recognizing voice at the eyewear frame without the need to wirelessly access a server, the eyewear system also including a case configured for the eyewear frame to be placed in, with the case configured to charge the re-chargeable battery at least based on the eyewear frame being placed in the case, the plurality of instructions, when executed at least by the controller, result in the controller: detect a fall of the user based on sensing at least an activity of user, in view of such a fall; generate a voice output that is configured to solicit a voice response, via at least the speaker in view of the detected fall of the user; and analyze reaction of the user to the voice output to determine a course of action, with the analyze including analyzing voice input received via at least the microphone, as recognized at least by the at least a portion of voice-recognition instructions if the reaction of the user includes the voice input from the user.
  • 16. A non-transitory computer readable storage medium as recited in claim 15, wherein the plurality of instructions, when executed at least by the controller, result in the controller identifying a location of the eyewear frame at least based on at least an electronic component in the eyewear frame.
  • 17. A non-transitory computer readable storage medium as recited in claim 15, wherein the plurality of instructions, when executed at least by the controller, result in the controller sending a message to an interested party via at least the wireless communication component, at least in view of the detected fall.
  • 18. A non-transitory computer readable storage medium as recited in claim 15, wherein the plurality of instructions, when executed at least by the controller, result in the controller using at least a hearing characteristic of the user to enhance hearing of the user.
  • 19. A non-transitory computer readable storage medium as recited in claim 15, wherein the plurality of instructions, when executed at least by the controller, result in the controller monitoring heart rate of the user based on at least an electrical component in the eyewear frame configured to monitor heart rate, with the user wearing the eyewear frame.
  • 20. A non-transitory computer readable storage medium as recited in claim 15, wherein the plurality of instructions, when executed at least by the controller, result in the controller: sending a message to an interested party via at least the wireless communication component, at least in view of the detected fall; and using at least a hearing characteristic of the user to enhance hearing of the user.
  • 21. A non-transitory computer readable storage medium as recited in claim 20, wherein the plurality of instructions, when executed at least by the controller, result in the controller identifying a location of the eyewear frame at least based on at least an electronic component in the eyewear frame.
  • 22. An eyewear frame for a user comprising: a wireless communication component; a microphone; a memory storing at least a portion of voice-recognition instructions to enable recognizing voice depending on at least a voice recognition model previously trained based on machine learning in at least a server separate from the eyewear system, to allow recognizing voice at the eyewear frame without the need to wirelessly access a server; a re-chargeable battery; a speaker; and a controller configured to detect a fall of the user based on sensing at least an activity of user, in view of such a fall; generate a voice output that is configured to solicit a voice response, via at least the speaker in view of the detected fall of the user, and analyze reaction of the user to the voice output to determine a course of action, with the analyze including analyzing voice input received via at least the microphone, as recognized at least by the at least a portion of voice-recognition instructions if the reaction of the user includes the voice input from the user, wherein the re-chargeable battery of the eyewear frame is configured to be charged by placing the eyewear frame into a case.
  • 23. An eyewear frame for a user as recited in claim 22, wherein the controller is configured to send a message to an interested party via at least the wireless communication component, at least in view of the detected fall.
  • 24. An eyewear frame for a user as recited in claim 22 wherein the course of action includes sending a message to an interested party via at least the wireless communication component, in view of the analyze the reaction of the user to identify no voice response from the user to the voice output.
  • 25. An eyewear frame for a user as recited in claim 22 wherein the controller is configured to identify a location of the eyewear frame based on at least an electronic component in the eyewear frame.
  • 26. An eyewear frame for a user as recited in claim 25, wherein the controller is configured to send a message to an interested party via at least the wireless communication component, at least in view of the detected fall, and wherein the message includes at least the location of the eyewear frame.
  • 27. An eyewear frame for a user as recited in claim 22, wherein the frame comprises a coil of wire, wherein the case comprises a coil of wire, wherein the charging of the re-chargeable battery is at least via induction, with at least the coil of wire in the frame becoming in proximity to at least the coil of wire in the case, and without the need for the eyewear frame to have a physical connector to be physically and electrically connected to another physical connector at the case for charging, and wherein at least the coil of wire in the frame is around a lens holder of the eyewear frame.
  • 28. An eyewear frame for a user as recited in claim 22, wherein the eyewear frame is configured to be activated by voice input received via at least the microphone, as recognized by the at least a portion of voice-recognition instructions software code, with the voice input including at least a preset word.
  • 29. An eyewear frame for a user as recited in claim 22, wherein the controller is configured to enhance hearing of the user based on at least an electrical component in the frame and at least a hearing characteristic of the user.
  • 30. An eyewear frame for a user as recited in claim 29, wherein the controller is configured to send a message to an interested party via at least the wireless communication component, at least in view of the detected fall, wherein the controller is configured to identify a location of the eyewear frame based on at least an electronic component in the eyewear frame, and wherein the eyewear frame is configured to be activated by voice input received via at least the microphone, as recognized by the at least a portion of voice-recognition instructions, with the voice input including at least a preset word.
  • 31. An eyewear frame for a user as recited in claim 30, wherein the message includes at least the location of the eyewear frame.
CROSS REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 16/382,036, filed on Apr. 11, 2019, now U.S. Pat. No. 10,777,048, entitled “Methods and Apparatus Regarding Electronic Eyewear Applicable for Seniors,” which is hereby incorporated herein by reference and which in turn claims benefit of (a) U.S. Provisional Patent Application No. 62/656,621, filed on Apr. 12, 2018, entitled “Electronic Eyewear for Seniors,” which is hereby incorporated herein by reference; (b) U.S. Provisional Patent Application No. 62/668,762, filed on May 8, 2018, entitled “Methods and Apparatus regarding Electronic Eyewear Applicable for Seniors,” which is hereby incorporated herein by reference; (c) U.S. Provisional Patent Application No. 62/681,292, filed on Jun. 6, 2018, entitled “Methods and Apparatus regarding Electronic Eyewear Applicable for Seniors,” which is hereby incorporated herein by reference; (d) U.S. Provisional Patent Application No. 62/686,174, filed Jun. 18, 2018, entitled “Methods and Apparatus Regarding Electronic Eyewear Applicable for Seniors,” which is hereby incorporated herein by reference; and (e) U.S. Provisional Patent Application No. 62/718,597, filed on Aug. 14, 2018, entitled “Methods and Apparatus Regarding Electronic Eyewear Applicable for Seniors,” which is hereby incorporated herein by reference.

US Referenced Citations (450)
Number Name Date Kind
320558 Hull Jun 1885 A
669949 Underwood Mar 1901 A
1255265 Zachara Feb 1918 A
1917745 Weiss Jul 1933 A
2249572 Lieber Jul 1941 A
2638532 Brady May 1953 A
2725462 Vorgang Nov 1955 A
2794085 Angelis May 1957 A
2818511 Ullery et al. Dec 1957 A
2830132 Borg Jul 1958 A
2874230 Carlson Feb 1959 A
2904670 Calmes Sep 1959 A
3060308 Fortuna Oct 1962 A
3104290 Rosemond et al. Sep 1963 A
3119903 Rosemond et al. Jan 1964 A
3597054 Winter Aug 1971 A
3710115 Jubb Jan 1973 A
3858001 Bonne Dec 1974 A
3883701 Delorenzo May 1975 A
4165487 Corderman Aug 1979 A
4254451 Cochran, Jr. Mar 1981 A
4283127 Rosenwinkel et al. Aug 1981 A
4322585 Liautaud Mar 1982 A
4348664 Boschetti et al. Sep 1982 A
4389217 Baughman et al. Jun 1983 A
4526473 Zahn, III Jul 1985 A
4535244 Burnham Aug 1985 A
4608492 Burnham Aug 1986 A
4683587 Silverman Jul 1987 A
4751691 Perera Jun 1988 A
4757714 Purdy et al. Jul 1988 A
4773095 Zwicker et al. Sep 1988 A
4806011 Bettinger Feb 1989 A
4822160 Tsai Apr 1989 A
4822161 Jimmy Apr 1989 A
4851686 Pearson Jul 1989 A
4856086 McCullough Aug 1989 A
4859047 Badewitz Aug 1989 A
4882769 Gallimore Nov 1989 A
4904078 Gorike Feb 1990 A
4942629 Stadlmann Jul 1990 A
4962469 Ono et al. Oct 1990 A
4967268 Lipton et al. Oct 1990 A
4985632 Bianco et al. Jan 1991 A
5008548 Gat Apr 1991 A
5015086 Okaue et al. May 1991 A
5020150 Shannon May 1991 A
5026151 Waltuck et al. Jun 1991 A
5036311 Moran et al. Jul 1991 A
5050150 Ikeda Sep 1991 A
5064410 Frenkel et al. Nov 1991 A
5093576 Edmond et al. Mar 1992 A
5106179 Kamaya et al. Apr 1992 A
5144344 Takahashi et al. Sep 1992 A
5148023 Hayashi et al. Sep 1992 A
5151600 Black Sep 1992 A
5161250 Ianna et al. Nov 1992 A
5172256 Sethofer et al. Dec 1992 A
5264877 Hussey Nov 1993 A
5306917 Black et al. Apr 1994 A
5353378 Hoffman et al. Oct 1994 A
5359370 Mugnier Oct 1994 A
5359444 Piosenka et al. Oct 1994 A
5367345 da Silva Nov 1994 A
5379464 Schleger et al. Jan 1995 A
5382986 Black et al. Jan 1995 A
5394005 Brown et al. Feb 1995 A
5452026 Marcy, III Sep 1995 A
5452480 Ryden Sep 1995 A
5455637 Kallman et al. Oct 1995 A
5455640 Gertsikov Oct 1995 A
5457751 Such Oct 1995 A
5463428 Lipton et al. Oct 1995 A
5475798 Handles Dec 1995 A
5500532 Kozicki Mar 1996 A
D369167 Hanson et al. Apr 1996 S
5510961 Peng Apr 1996 A
5513384 Brennan et al. Apr 1996 A
5519781 Kurkurudza May 1996 A
5533130 Staton Jul 1996 A
5541641 Shimada Jul 1996 A
5581090 Goudjil Dec 1996 A
5585871 Linden Dec 1996 A
5589398 Krause et al. Dec 1996 A
5590417 Rydbeck Dec 1996 A
5606743 Vogt et al. Feb 1997 A
5608808 da Silva Mar 1997 A
5634201 Mooring May 1997 A
5671035 Barnes Sep 1997 A
5686727 Reenstra et al. Nov 1997 A
5694475 Boyden Dec 1997 A
5715323 Walker Feb 1998 A
5737436 Boyden et al. Apr 1998 A
5777715 Kruegle et al. Jul 1998 A
5818381 Williams Oct 1998 A
5819183 Voroba et al. Oct 1998 A
5835185 Kallman et al. Nov 1998 A
5900720 Kallman et al. May 1999 A
5903395 Rallison et al. May 1999 A
5923398 Goldman Jul 1999 A
5941837 Amano et al. Aug 1999 A
5946071 Feldman Aug 1999 A
5949516 McCurdy Sep 1999 A
5966746 Reedy et al. Oct 1999 A
5980037 Conway Nov 1999 A
5988812 Wingate Nov 1999 A
5991085 Rallison et al. Nov 1999 A
5992996 Sawyer Nov 1999 A
5995592 Shirai et al. Nov 1999 A
6010216 Jesiek Jan 2000 A
6013919 Schneider et al. Jan 2000 A
6028627 Helmsderfer Feb 2000 A
6046455 Ribi et al. Apr 2000 A
6060321 Hovorka May 2000 A
6061580 Altschul et al. May 2000 A
6091546 Spitzer Jul 2000 A
6091832 Shurman et al. Jul 2000 A
6115177 Vossler Sep 2000 A
6132681 Faran et al. Oct 2000 A
6145983 Schiffer Nov 2000 A
6154552 Koroljow et al. Nov 2000 A
6176576 Green et al. Jan 2001 B1
6225897 Doyle et al. May 2001 B1
6231181 Swab May 2001 B1
6236969 Ruppert et al. May 2001 B1
6243578 Koike Jun 2001 B1
6259367 Klein Jul 2001 B1
6270466 Weinstein et al. Aug 2001 B1
6292213 Jones Sep 2001 B1
6292685 Pompei Sep 2001 B1
6301050 DeLeon Oct 2001 B1
6301367 Boyden et al. Oct 2001 B1
6307526 Mann Oct 2001 B1
6311155 Vaudrey et al. Oct 2001 B1
6343858 Zelman Feb 2002 B1
6346929 Fukushima et al. Feb 2002 B1
6349001 Spitzer Feb 2002 B1
6349422 Schleger et al. Feb 2002 B1
6409335 Lipawsky Jun 2002 B1
6409338 Jewell Jun 2002 B1
6426719 Nagareda et al. Jul 2002 B1
6431705 Linden Aug 2002 B1
6474816 Butler et al. Nov 2002 B2
6478736 Mault Nov 2002 B1
6506142 Itoh et al. Jan 2003 B2
6511175 Hay et al. Jan 2003 B2
6513532 Mault et al. Feb 2003 B2
6517203 Blum et al. Feb 2003 B1
6539336 Vock et al. Mar 2003 B1
6542081 Torch Apr 2003 B2
6546101 Murray et al. Apr 2003 B1
6554763 Amano et al. Apr 2003 B1
6582075 Swab et al. Jun 2003 B1
6619799 Blum et al. Sep 2003 B1
6629076 Haken Sep 2003 B1
6678381 Manabe Jan 2004 B1
6717737 Haglund Apr 2004 B1
6729726 Miller et al. May 2004 B2
6736759 Stubbs et al. May 2004 B1
6764194 Cooper Jul 2004 B1
6769767 Swab et al. Aug 2004 B2
6771423 Geist Aug 2004 B2
6788309 Swan et al. Sep 2004 B1
6792401 Nigro et al. Sep 2004 B1
6816314 Shimizu et al. Nov 2004 B2
6824265 Harper Nov 2004 B1
6857741 Blum et al. Feb 2005 B2
6871951 Blum et al. Mar 2005 B2
6879930 Sinclair et al. Apr 2005 B2
6912386 Himberg et al. Jun 2005 B1
6929365 Swab et al. Aug 2005 B2
6932090 Reschke et al. Aug 2005 B1
6947219 Ou Sep 2005 B1
7004582 Jannard et al. Feb 2006 B2
7013009 Warren Mar 2006 B2
7023594 Blum et al. Apr 2006 B2
7030902 Jacobs Apr 2006 B2
7031667 Horiguchi Apr 2006 B2
7033025 Winterbotham Apr 2006 B2
7059717 Bloch Jun 2006 B2
7073905 Da Pra' Jul 2006 B2
7079876 Levy Jul 2006 B2
7123215 Nakada Oct 2006 B2
7192136 Howell et al. Mar 2007 B2
7255437 Howell et al. Aug 2007 B2
7265358 Fontaine Sep 2007 B2
7274292 Velhal et al. Sep 2007 B2
7289767 Lai Oct 2007 B2
7312699 Chornenky Dec 2007 B2
7331666 Swab et al. Feb 2008 B2
7376238 Rivas et al. May 2008 B1
7380936 Howell et al. Jun 2008 B2
7401918 Howell et al. Jul 2008 B2
7405801 Jacobs Jul 2008 B2
7429965 Weiner Sep 2008 B2
7438409 Jordan Oct 2008 B2
7438410 Howell et al. Oct 2008 B1
7445332 Jannard et al. Nov 2008 B2
7481531 Howell et al. Jan 2009 B2
7500746 Howell et al. Mar 2009 B1
7500747 Howell et al. Mar 2009 B2
7512414 Jannard et al. Mar 2009 B2
7517083 Blum et al. Apr 2009 B2
7527374 Chou May 2009 B2
7543934 Howell et al. Jun 2009 B2
7581833 Howell et al. Sep 2009 B2
7621634 Howell et al. Nov 2009 B2
7648236 Dobson Jan 2010 B1
7677723 Howell et al. Mar 2010 B2
7760898 Howell Jul 2010 B2
7771046 Howell et al. Aug 2010 B2
7792552 Thomas et al. Sep 2010 B2
7801570 Cheung et al. Sep 2010 B2
7806525 Howell et al. Oct 2010 B2
7922321 Howell et al. Apr 2011 B2
7976159 Jacobs et al. Jul 2011 B2
8109629 Howell et al. Feb 2012 B2
8142015 Paolino Mar 2012 B2
8174569 Tanijiri et al. May 2012 B2
8337013 Howell et al. Dec 2012 B2
8430507 Howell et al. Apr 2013 B2
8434863 Howell et al. May 2013 B2
8465151 Howell et al. Jun 2013 B2
8485661 Yoo et al. Jul 2013 B2
8500271 Howell et al. Aug 2013 B2
8770742 Howell et al. Jul 2014 B2
8849185 Cheung et al. Sep 2014 B2
8905542 Howell et al. Dec 2014 B2
9033493 Howell et al. May 2015 B2
9244292 Swab et al. Jan 2016 B2
9400390 Osterhout et al. Jul 2016 B2
9405135 Sweis et al. Aug 2016 B2
9488520 Howell et al. Nov 2016 B2
9547184 Howell et al. Jan 2017 B2
9690121 Howell et al. Jun 2017 B2
9922236 Moore Mar 2018 B2
10042186 Chao et al. Aug 2018 B2
10060790 Howell et al. Aug 2018 B2
10061144 Howell et al. Aug 2018 B2
10310296 Howell et al. Jun 2019 B2
10345625 Howell et al. Jul 2019 B2
10359311 Howell et al. Jul 2019 B2
10359459 Gorin et al. Jul 2019 B1
10515623 Grizzel Dec 2019 B1
10571715 Rizzo, III et al. Feb 2020 B2
10624790 Chao et al. Apr 2020 B2
10777048 Howell et al. Sep 2020 B2
10802582 Clements Oct 2020 B1
10964190 Peyrard Mar 2021 B2
11042045 Chao et al. Jun 2021 B2
11069358 Harper Jul 2021 B1
11086147 Howell et al. Aug 2021 B2
11204512 Howell et al. Dec 2021 B2
11243416 Howell et al. Feb 2022 B2
11326941 Howell et al. May 2022 B2
11513371 Howell et al. Nov 2022 B2
11536988 Howell et al. Dec 2022 B2
11630331 Howell et al. Apr 2023 B2
11644361 Howell et al. May 2023 B2
11644693 Howell et al. May 2023 B2
20010005230 Ishikawa Jun 2001 A1
20010028309 Torch Oct 2001 A1
20010050754 Hay et al. Dec 2001 A1
20020017997 Felkowitz Feb 2002 A1
20020021407 Elliott Feb 2002 A1
20020081982 Schwartz et al. Jun 2002 A1
20020084990 Peterson, III Jul 2002 A1
20020089639 Starner et al. Jul 2002 A1
20020090103 Calisto, Jr. Jul 2002 A1
20020098877 Glezerman Jul 2002 A1
20020101568 Eberl et al. Aug 2002 A1
20020109600 Mault et al. Aug 2002 A1
20020136414 Jordan et al. Sep 2002 A1
20020140899 Blum et al. Oct 2002 A1
20020159023 Swab Oct 2002 A1
20020197961 Warren Dec 2002 A1
20030018274 Takahashi et al. Jan 2003 A1
20030022690 Beyda et al. Jan 2003 A1
20030032449 Giobbi Feb 2003 A1
20030062046 Wiesmann et al. Apr 2003 A1
20030064746 Rader et al. Apr 2003 A1
20030065257 Mault et al. Apr 2003 A1
20030067585 Miller et al. Apr 2003 A1
20030068057 Miller et al. Apr 2003 A1
20030083591 Edwards et al. May 2003 A1
20030091200 Pompei May 2003 A1
20030214630 Winterbotham Nov 2003 A1
20030226978 Ribi et al. Dec 2003 A1
20030231293 Blum et al. Dec 2003 A1
20040000733 Swab et al. Jan 2004 A1
20040005069 Buck Jan 2004 A1
20040029582 Swab et al. Feb 2004 A1
20040059212 Abreu Mar 2004 A1
20040063378 Nelson Apr 2004 A1
20040096078 Lin May 2004 A1
20040100384 Chen et al. May 2004 A1
20040101178 Fedorovskaya et al. May 2004 A1
20040104864 Nakada Jun 2004 A1
20040114770 Pompei Jun 2004 A1
20040128737 Gesten Jul 2004 A1
20040150986 Chang Aug 2004 A1
20040156012 Jannard et al. Aug 2004 A1
20040157649 Jannard et al. Aug 2004 A1
20040160571 Jannard Aug 2004 A1
20040160572 Jannard Aug 2004 A1
20040160573 Jannard et al. Aug 2004 A1
20040197002 Atsumi et al. Oct 2004 A1
20040227219 Su Nov 2004 A1
20050036103 Bloch Feb 2005 A1
20050067580 Fontaine Mar 2005 A1
20050078274 Howell et al. Apr 2005 A1
20050088365 Yamazaki et al. Apr 2005 A1
20050201585 Jannard et al. Sep 2005 A1
20050213026 Da Pra' Sep 2005 A1
20050230596 Howell et al. Oct 2005 A1
20050238194 Chornenky Oct 2005 A1
20050239502 Swab et al. Oct 2005 A1
20050248717 Howell et al. Nov 2005 A1
20050248718 Howell et al. Nov 2005 A1
20050248719 Howell et al. Nov 2005 A1
20050264752 Howell et al. Dec 2005 A1
20050278446 Bryant Dec 2005 A1
20060001827 Howell et al. Jan 2006 A1
20060003803 Thomas et al. Jan 2006 A1
20060023158 Howell et al. Feb 2006 A1
20060034478 Davenport Feb 2006 A1
20060107822 Bowen May 2006 A1
20060132382 Jannard Jun 2006 A1
20060291667 Watanabe et al. Dec 2006 A1
20070030442 Howell et al. Feb 2007 A1
20070035830 Matveev et al. Feb 2007 A1
20070046887 Howell et al. Mar 2007 A1
20070055888 Miller et al. Mar 2007 A1
20070098192 Sipkema May 2007 A1
20070109491 Howell et al. May 2007 A1
20070186330 Howell et al. Aug 2007 A1
20070189548 Croft, III Aug 2007 A1
20070200927 Krenik Aug 2007 A1
20070208531 Darley et al. Sep 2007 A1
20070211574 Croft, III Sep 2007 A1
20070248238 Abreu et al. Oct 2007 A1
20070270663 Ng et al. Nov 2007 A1
20070271065 Gupta et al. Nov 2007 A1
20070271116 Wysocki et al. Nov 2007 A1
20070271387 Lydon et al. Nov 2007 A1
20070279584 Howell et al. Dec 2007 A1
20080062338 Herzog et al. Mar 2008 A1
20080068559 Howell et al. Mar 2008 A1
20080089545 Jannard et al. Apr 2008 A1
20080100792 Blum et al. May 2008 A1
20080144854 Abreu Jun 2008 A1
20080151175 Gross Jun 2008 A1
20080151179 Howell et al. Jun 2008 A1
20080158506 Fuziak Jul 2008 A1
20080211921 Sako et al. Sep 2008 A1
20080218684 Howell et al. Sep 2008 A1
20080262392 Ananny et al. Oct 2008 A1
20080278678 Howell et al. Nov 2008 A1
20090059159 Howell et al. Mar 2009 A1
20090059381 Jannard Mar 2009 A1
20090073375 Nakada Mar 2009 A1
20090141233 Howell et al. Jun 2009 A1
20090147215 Howell et al. Jun 2009 A1
20090156128 Franson et al. Jun 2009 A1
20090251660 Figler et al. Oct 2009 A1
20090251661 Fuziak, Jr. Oct 2009 A1
20090296044 Howell et al. Dec 2009 A1
20100045928 Levy Feb 2010 A1
20100061579 Rickards et al. Mar 2010 A1
20100079356 Hoellwarth Apr 2010 A1
20100105445 Brunton et al. Apr 2010 A1
20100110368 Chaum May 2010 A1
20100245754 Matsumoto et al. Sep 2010 A1
20100296045 Agnoli et al. Nov 2010 A1
20100309426 Howell et al. Dec 2010 A1
20110102734 Howell et al. May 2011 A1
20110164122 Hardacker Jul 2011 A1
20110187990 Howell et al. Aug 2011 A1
20110241976 Boger et al. Oct 2011 A1
20110273365 West et al. Nov 2011 A1
20110292333 Kozaki et al. Dec 2011 A1
20120033061 Ko et al. Feb 2012 A1
20120050668 Howell et al. Mar 2012 A1
20120062357 Slamka Mar 2012 A1
20120101411 Hausdorff Apr 2012 A1
20120133885 Howell et al. May 2012 A1
20120176580 Sonsino Jul 2012 A1
20120283894 Naboulsi Nov 2012 A1
20130072828 Sweis et al. Mar 2013 A1
20130077175 Hotta et al. Mar 2013 A1
20130143519 Doezema Jun 2013 A1
20130172691 Tran Jul 2013 A1
20130201440 Howell et al. Aug 2013 A1
20130308089 Howell et al. Nov 2013 A1
20140132913 Sweis et al. May 2014 A1
20140176902 Sweis et al. Jun 2014 A1
20140198293 Sweis et al. Jul 2014 A1
20140226838 Wingate Aug 2014 A1
20140268008 Howell et al. Sep 2014 A1
20140268013 Howell et al. Sep 2014 A1
20140268017 Sweis et al. Sep 2014 A1
20140361185 Howell et al. Dec 2014 A1
20150085245 Howell et al. Mar 2015 A1
20150230988 Chao et al. Aug 2015 A1
20150253590 Howell et al. Sep 2015 A1
20150277123 Chaum Oct 2015 A1
20150338677 Block Nov 2015 A1
20160098874 Handville et al. Apr 2016 A1
20160246075 Howell et al. Aug 2016 A9
20160302992 Sweis et al. Oct 2016 A1
20170068117 Howell et al. Mar 2017 A9
20170074721 Howell et al. Mar 2017 A1
20170090219 Howell et al. Mar 2017 A1
20170131575 Howell et al. May 2017 A1
20170146829 Howell et al. May 2017 A1
20170303187 Crouthamel et al. Oct 2017 A1
20180122208 Peyrard May 2018 A1
20180314079 Chao et al. Nov 2018 A1
20180335650 Howell et al. Nov 2018 A1
20180348050 Howell et al. Dec 2018 A1
20190004325 Connor Jan 2019 A1
20190033622 Olgun Jan 2019 A1
20190033623 Howell et al. Jan 2019 A1
20190187492 Howell et al. Jun 2019 A1
20190272800 Tao et al. Sep 2019 A1
20190278110 Howell et al. Sep 2019 A1
20190285913 Howell et al. Sep 2019 A1
20190310132 Howell et al. Oct 2019 A1
20190318589 Howell et al. Oct 2019 A1
20190369402 Woodman Dec 2019 A1
20190378493 Kim et al. Dec 2019 A1
20190387351 Lyren et al. Dec 2019 A1
20200012127 Howell et al. Jan 2020 A1
20200218094 Howell et al. Jul 2020 A1
20210000347 Stump Jan 2021 A1
20210026146 Harder et al. Jan 2021 A1
20210271116 Chao et al. Sep 2021 A1
20210364827 Howell et al. Nov 2021 A9
20210364828 Howell et al. Nov 2021 A1
20210379425 Tran Dec 2021 A1
20220008763 Saleh et al. Jan 2022 A1
20220011603 Howell et al. Jan 2022 A1
20220034542 Peters et al. Feb 2022 A1
20220054092 Howell et al. Feb 2022 A1
20220178743 Howell et al. Jun 2022 A1
20220335792 Howell et al. Oct 2022 A1
20220415388 Howell et al. Dec 2022 A1
20230017634 Howell et al. Jan 2023 A1
20230033660 Howell et al. Feb 2023 A1
20230057654 Howell et al. Feb 2023 A1
Foreign Referenced Citations (33)
Number Date Country
2 487 391 Dec 2003 CA
88203065 Nov 1988 CN
89214222.7 Mar 1990 CN
90208199.3 Nov 1990 CN
10123226 Nov 2002 DE
1134491 Sep 2001 EP
1027626 Mar 2023 EP
2290433 Apr 2023 EP
2530039 Jan 1984 FR
1467982 Mar 1977 GB
58-113912 Jul 1983 JP
58-113914 Jul 1983 JP
02-181722 Jul 1990 JP
09-017204 Jan 1997 JP
10-161072 Jun 1998 JP
2000-039595 Feb 2000 JP
2002-02511706 Apr 2002 JP
2002 341059 Nov 2002 JP
2005-151292 Jun 2005 JP
2005-167902 Jun 2005 JP
2002-0044416 Jun 2002 KR
484711 Jun 2001 TW
WO 199712205 Apr 1997 WO
WO 9950706 Oct 1999 WO
WO 200106298 Jan 2001 WO
WO 200124576 Apr 2001 WO
WO 200206881 Jan 2002 WO
WO 2003069394 Aug 2003 WO
WO 2003100368 Dec 2003 WO
WO 2003100503 Dec 2003 WO
WO 2004012477 Feb 2004 WO
WO 2004025554 Mar 2004 WO
WO 100141514 Dec 2010 WO
Non-Patent Literature Citations (119)
Entry
Notice of Allowance for U.S. Appl. No. 16/031,046 dated Jul. 6, 2020.
Office Action for U.S. Appl. No. 14/072,784, dated Jul. 27, 2015.
Office Action for U.S. Appl. No. 14/072,784, dated Oct. 29, 2015.
Notice of Allowance for U.S. Appl. No. 14/072,784, dated Jan. 14, 2016.
Notice of Allowance for U.S. Appl. No. 14/072,784, dated Apr. 7, 2016.
Office Action for U.S. Appl. No. 15/193,155, dated Sep. 26, 2016.
Office Action for U.S. Appl. No. 15/193,155, dated Jun. 8, 2017.
Office Action for U.S. Appl. No. 14/190,352, dated Oct. 26, 2016.
Office Action for U.S. Appl. No. 14/190,352, dated May 4, 2017.
Office Action for U.S. Appl. No. 14/703,875, dated Oct. 5, 2016.
Office Action for U.S. Appl. No. 14/703,875, dated May 17, 2017.
Office Action for U.S. Appl. No. 14/703,875, dated Apr. 12, 2018.
Office Action for U.S. Appl. No. 14/703,875, dated Oct. 2, 2018.
Office Action for U.S. Appl. No. 14/703,875, dated Mar. 15, 2019.
Office Action for U.S. Appl. No. 14/703,875, dated Jul. 26, 2019.
Advisory Action for U.S. Appl. No. 14/703,875, dated Sep. 27, 2019.
Notice of Allowance for U.S. Appl. No. 14/703,875, dated Jan. 13, 2020.
Restriction Requirement for U.S. Appl. No. 14/217,174, dated Mar. 28, 2016.
Office Action for U.S. Appl. No. 14/217,174, dated Jul. 28, 2016.
Office Action for U.S. Appl. No. 14/217,174, dated Feb. 10, 2017.
Restriction Requirement for U.S. Appl. No. 14/211,491, dated Jul. 16, 2015.
Office Action for U.S. Appl. No. 14/211,491, dated Oct. 19, 2015.
Office Action for U.S. Appl. No. 14/211,491, dated Feb. 23, 2016.
Notice of Allowance for U.S. Appl. No. 14/211,491, dated Nov. 9, 2016.
Notice of Allowance for U.S. Appl. No. 14/211,491, dated May 26, 2017.
Office Action for U.S. Appl. No. 14/211,491, dated Oct. 11, 2017.
Notice of Allowance for U.S. Appl. No. 14/211,491, dated Apr. 9, 2018.
Restriction Requirement for U.S. Appl. No. 16/031,046, dated Feb. 11, 2019.
Office Action for U.S. Appl. No. 16/031,046, dated Apr. 16, 2019.
Office Action for U.S. Appl. No. 16/031,046, dated Jul. 1, 2019.
Office Action for U.S. Appl. No. 16/031,046, dated Sep. 10, 2019.
Notice of Allowance for U.S. Appl. No. 16/031,046 dated Jan. 15, 2020.
Notice of Allowance for U.S. Appl. No. 16/031,046 dated Mar. 24, 2020.
Notice of Allowance for U.S. Appl. No. 16/382,036, dated Feb. 24, 2020.
Notice of Allowance for U.S. Appl. No. 16/382,036, dated May 18, 2020.
“±1.5g Dual Axis Micromachined Accelerometer”, Freescale Semiconductor, Inc., Motorola Semiconductor Technical Data, MMA6260Q, Jun. 2004, pp. 1-7.
“APA Announces Shipment of the SunUV™ Personal UV Monitor”, Press Release, Nov. 7, 2003, pp. 1-3.
“Camera Specs Take Candid Snaps”, BBC News, Sep. 18, 2003, pp. 1-3.
“Cardo Wireless Attaching Clips and Wearing Headset”, Cardo Systems, Inc., http://www.cardowireless.com/clips.php, downloaded Nov. 27, 2004, pp. 1-3.
“Environmental Health Criteria 14: Ultraviolet Radiation”, International Programme on Chemical Safety, World Health Organization Geneva, 1979 http://www.ichem.org., pp. 1-102.
“Exclusive Media Event Marks Debut of Oakley Thump: World's First Digital Audio Eyewear”, Oakley Investor Relations, Press Release, Nov. 15, 2004, pp. 1-2.
“Eyetop, Product-Features”, eyetop eyewear, eyetop belt worn, http://www.eyetop.net/products/eyetop/features.asp., downloaded Nov. 6, 2003, pp. 1-2.
“Heart Rate Monitors”, http://www.healthgoods.com, downloaded Dec. 4, 2004.
“How is the UV Index Calculated”, SunWise Program, U.S. Environmental Protection Agency, http://www.epa.gov/sunwise/uvcalc.html, downloaded Oct. 14, 2004, pp. 1-2.
“Industrial UV Measurements”, APA Optics, Inc., http://www.apaoptics.com/uv/, downloaded Jul. 12, 2004, p. 1.
“Motorola and Oakley Introduce First Bluetooth Sunglasses—Cutting Edge RAZRWire Line Offers Consumers On-The-Go Connections”, Motorola Mediacenter-Press Release, Feb. 14, 2005, pp. 1-2.
“Oakley Thump: Sunglasses Meet MP3 Player”, with image, http://news.designtechnica.com/article4665.html, Jul. 13, 2004.
“Personal UV monitor,” Optics.org, http://optics.org/articles/news/6/6/7/1 (downloaded Dec. 20, 2003), Jun. 9, 2000, pp. 1-2.
“SafeSun Personal Ultraviolet Light Meter”, http://healthchecksystems.com/safesun.htm, downloaded Jul. 12, 2004, pp. 1-4.
“SafeSun Personal UV Meter”, Introduction, OptixTech Inc., http://www.safesun.com, downloaded Feb. 5, 2004, pp. 1-2.
SafeSun Personal UV Meter, features, Optix Tech Inc., http://www.safesun.com/features.html, downloaded May 1, 2004, pp. 1-2.
“Sharper Image—The FM Pedometer”, e-Corporate Gifts.com, http://www.e-corporategifts.com/sr353.html, downloaded Jan. 22, 2005, pp. 1-2.
“Sun UV™ Personal UV Monitor”, APA Optics, Inc., http://www.apaoptics.com/sunuv/uvfacts.html, downloaded Dec. 20, 2003, pp. 1-3.
“Ultraviolet Light and Sunglasses”, Oberon's Frequently Asked Questions, http://www.oberoncompany.com/OBEnglish/FAQUV.html, downloaded Feb. 5, 2004, pp. 1-2.
“Ultraviolet Light Sensor”, Barrett & Associates Engineering, http://www.barrettengineering.com/project_uvs.htm, downloaded Feb. 5, 2004, pp. 1-3.
“Ultraviolet Radiation (UVR)”, Forum North, Ontario Ministry of Labour, http://www3.mb.sympatico.ca/~ericc/ULTRAVIOLET%20RADIATION.htm, downloaded Feb. 5, 2004, pp. 1-6.
“What Are Grippies?”, Gripping Eyewear, Inc., http://www.grippingeyewear.com/whatare.html, downloaded Nov. 2, 2005.
“With Racing Heart”, Skaloud et al., GPS World, Oct. 1, 2001, http://www.gpsworld.com/gpsworld/content/printContentPopup.jsp?id=1805, pp. 1-5.
Abrisa Product Information: Cold Mirrors, Abrisa, Jun. 2001, p. 1.
Abrisa Product Information: Commercial Hot Mirror, Abrisa, Jun. 2001, p. 1.
Alps Spectacle, Air Conduction Glass, Bone Conduction Glass, http://www.alps-inter.com/spec.htm, downloaded Dec. 10, 2003, pp. 1-2.
Altimeter and Compass Watches, http://store.yahoo.com/snowshack/altimeter-watches.html, downloaded May 3, 2004, pp. 1-2.
Pediatric Eye Disease Group, “Randomized Trial of Treatment of Amblyopia in Children Aged 7 to 17 Years,” Roy W. Beck, M.D., Ph.D., Section Ed., Originally Published and Reprinted from Arch Ophthalmol, v. 123, Apr. 2005, pp. 437-447, http://archopht.jamanetwork.com/ by a New England College of Optometry user on Dec. 20, 2012.
Bone Conduction Headgear HG16 Series, “Voiceducer,” http://www.temco-j.co.jp/html/English/HG16.html, downloaded Dec. 10, 2003, pp. 1-3.
Carney, David, “The Ultimate MP3 Player for Athletes? Could be.”, CNET Reviews, May 14, 2004, pp. 1-4.
Clifford, Michelle A., “Accelerometers Jump into the Consumer Goods Market”, Sensors Online, http://www.sensorsmag.com, Aug. 2004.
Comfees.com, Adjustable Sports Band Style No. 1243, http://shop.store.yahoo.com/comfees/adsportbansty.html, downloaded Apr. 18, 2003, pp. 1-2.
Cool Last Minute Gift Ideas!, UltimateFatBurner Reviews and Articles, http://www.ultimatefatburner.com/gift-ideas.html, downloaded May 10, 2005, pp. 1-3.
Dickie et al. “Eye Contact Sensing Glasses for Attention-Sensitive Wearable Video Blogging,” Human Media Lab, Queen's University, Kingston, ON K7L3N6, Canada, est. Apr. 2004, pp. 1-2.
Dixen, Brian, “ear-catching”, Supertesten, Mobil, Apr. 2003 (estimated), pp. 37-41.
Global Solar UV Index, A Practical Guide, World Health Organization, 2002, pp. 1-28.
Grobart, Sam, “Digit-Sizing Your Computer Data”, News Article, Sep. 2004, p. 1.
Holmes, JM et al. “A randomized trial of prescribed patching regimens for treatment of severe amblyopia in children.” Ophthalmology, v. 110, Iss. 11, Nov. 2003, pp. 2075-2087.
Inductive Charging Set, https://adafruit.com/product/1459/, downloaded Apr. 10, 2019, pp. 1-2.
Life Monitor V1.1, Rhusoft Technologies Inc., http://www.rhusoft.com/lifemonitor/, Mar. 1, 2003, pp. 1-6.
Manes, Stephen, “Xtreme Cam”, Forbes Magazine, Sep. 5, 2005, p. 96.
Mio, PhysiCal, http://www.gophysical.com/, downloaded Jan. 27, 2004, 5 pages.
Monitoring Athletes Performance—2002 Winter Olympic News from KSL, Jan. 23, 2002, http://2002.ksl.com/news-3885i, pp. 1-3.
Niwa, “UV Index Information”, http://www.niwa.cri.nz/services/uvozone/uvi-info, downloaded Jul. 15, 2004, pp. 1-2.
NuVision 60GX Stereoscopic Wireless Glasses, Product Information, NuVision by MacNaughton, c. 1997, MacNaughton, Inc., pp. 1-2.
Pärkkä, Juha, et al., “A Wireless Wellness Monitor for Personal Weight Management”, VII Information Technology, Tampere, Finland, Nov. 2000, p. 1.
Pedometer, Model HJ-112, Omron Instruction Manual, Omron Healthcare, Inc., 2003, pp. 1-27.
PNY Announces Executive Attaché USB 2.0 Flash Drive and Pen Series, Press Release, PNY Technologies, Las Vegas, Jan. 8, 2004, pp. 1-2.
PNY Technologies, “Executive Attaché”, http://www.pny.com/products/flash/execattache.asp, downloaded Nov. 16, 2005.
Polar WM41 and 42 weight management monitor, http://www.simplysports/polar/weight management/wm41-42.htm, downloaded Jan. 28, 2004, pp. 1-3.
Questions Answers, Pedometer.com, http://www.pedometer.com, downloaded May 5, 2005.
Radio Frequency Wireless Charging: How RF Charging Works?, ©2010-2015 Humavox Ltd., Nov. 3, 2016, http://www.humavox.com/blog/rf-wireless-charging-works/, downloaded Apr. 11, 2019, pp. 1-10.
Razrwire, copyright Motorola, Inc., Jul. 2005, 1 page.
Repka MX et al. “A randomized trial of patching regimens for treatment of moderate amblyopia in children.” Arch Ophthalmology v. 121, No. 5, May 2003, pp. 603-611.
Roberts, Catherine. “How to Choose a Medical Alert System,” © 2006-2016 Consumer Reports, https://www.consumerreports.org/medical-alert-systems/how-to-choose-a-medical-alert-system/, Feb. 7, 2019, downloaded Apr. 10, 2019, pp. 1-22.
SafeSun Personal UV Meter, Scientific Data, Optix Tech Inc., http://www.safesun.com/scientific.html, downloaded May 1, 2004, pp. 1-3.
SafeSun Sensor, User's Manual, Optix Tech Inc., Jun. 1998, 2 pages.
Safesun, Personal UV Meter, “Technical Specifications”, Optix Tech Inc., http://www.safesun.com/technical.html, downloaded Jul. 12, 2004, pp. 1-2.
Safesun, Personal UV Meter, Experiments, Optix Tech Inc., http://www.safesun.com/experiments.html, downloaded Feb. 5, 2004, pp. 1-2.
Shades of Fun, Blinking Light Glasses, http://www.shadesoffun.com/Nov/Novpgs-14.html, downloaded Jul. 9, 2005, pp. 1-4.
SportLine Fitness Pedometer-Model 360, UltimateFatBurner Superstore, http://www.ultimatefatburner-store.com/ac_004.html, downloaded May 10, 2005, pp. 1-2.
Steele, Bonnie G et al., “Bodies in motion: Monitoring daily activity and exercise with motion sensors in people with chronic pulmonary disease”, VA Research & Development, Journal of Rehabilitation Research & Development, vol. 40, No. 5, Sep./Oct. 2003, Supplement 2, pp. 45-58.
Stevens, Kathy, “Should I Use a Pedometer When I Walk?”, Healtheon/WebMD, Apr. 14, 2000.
Sundgot, Jørgen “2nd-gen Motorola Bluetooth headset”, InfoSync World, Mar. 1, 2003, http://www.infosync.no/news/2002/n/2841.html, pp. 1-2.
Sunsensors, Segan Industries, Inc., http://www.segan-ind.com/sunsensor.htm, downloaded Feb. 5, 2004, pp. 1-3.
SunUV™, Personal UV Monitor User's Guide, APA Optics, Inc., 2003 pp. 1-52.
SunUV™, Personal UV Monitor, APA Optics, Inc., http://www.apaoptics.com/sunuv/models.html, downloaded Dec. 20, 2003.
Talking Pedometer, Sportline, Inc., Jun. 2001 (Possibly earlier), 1 page.
The unofficial Elsa 3D Revelator page, Dec. 30, 1999, pp. 1-15.
Top Silicon PIN Photodiode, PD93-21C, Technical Data Sheet, Everlight Electronics Co., Ltd., 2004, pp. 1-9.
Universal Qi Wireless Charging Module, https://www.adafruit.com/product/1901/, downloaded Oct. 15, 2015, pp. 1-3.
Universal Qi Wireless Charging Transmitter, https://www.adafruit.com/product/2162/, downloaded Apr. 10, 2019, pp. 1-2.
UV Light Meter, UVA and UVB measurement, UV-340, Instruction Manual, Lutron, Jun. 2003 (estimated), pp. 1-5.
UV-Smart, UVA/B Monitor, Model EC-960-PW, Instruction Manual, Tanita Corporation of America, Inc., downloaded Nov. 16, 2001.
Vitaminder Personal Carb Counter, http://www.auravita.com/products/AURA/ORBU11420.asp. Downloaded Nov. 15, 2005, pp. 1-4.
Wallace DK et al. “A randomized trial to evaluate 2 hours of daily patching for strabismic and anisometropic amblyopia in children.” Ophthalmology v. 113, No. 6, Jun. 2006, pp. 904-912.
Wireless Charging for the Smallest Wearables Available, © 2010-2015 Humavox Ltd., Apr. 7, 2016, http://www.humavox.com/blog/wireless-charging-smallest-wearables-available/, downloaded Apr. 11, 2019, pp. 1-7.
Yamada et al. “Development of an eye-movement analyser possessing functions for wireless transmission and autocalibration,” Med. Biol. Eng. Comput., No. 28, v.4, Jul. 28, 1990, http://link.springer.com/article/10.1007%2FBF02446149?LI=true, pp. 1-2.
Notice of Allowance for U.S. Appl. No. 16/031,046 dated Apr. 5, 2021.
Office Action for U.S. Appl. No. 17/856,954, dated Aug. 18, 2022.
Office Action for U.S. Appl. No. 17/856,954, dated Oct. 11, 2022.
Office Action for U.S. Appl. No. 17/856,954, dated Jul. 2, 2022.
Office Action for U.S. Appl. No. 17/856,954, dated Dec. 7, 2022.
Office Action for U.S. Appl. No. 17/856,954, dated Apr. 14, 2023.
Related Publications (1)
Number Date Country
20200364992 A1 Nov 2020 US
Provisional Applications (5)
Number Date Country
62718597 Aug 2018 US
62686174 Jun 2018 US
62681292 Jun 2018 US
62668762 May 2018 US
62656621 Apr 2018 US
Continuations (1)
Number Date Country
Parent 16382036 Apr 2019 US
Child 16987781 US