The present invention relates to wearable devices such as wireless earpieces. More particularly, but not exclusively, the present invention relates to wearable devices such as wireless earpieces with near field gesture based control.
Although various wireless earpieces and wearable devices exist, there has not been widespread adoption due to numerous issues and deficiencies. What is needed is an improved wearable device such as an improved wireless earpiece.
It is a primary object, feature, or advantage of the present invention to provide a wearable device which provides for detection of a user's gestures.
It is a further object, feature, or advantage to provide an earpiece which detects a user's gestures through an IR LED interface.
It is a still further object, feature, or advantage of the present invention to provide an earpiece which is impervious to water and operable in high IR environments.
Another object, feature, or advantage of the present invention is to receive user gestures in the form of swipes and determine the directionality of the swipe through algorithmic analysis.
Yet another object, feature, or advantage of the present invention is to provide audio feedback to a user based on interactions with an IR LED control system.
A still further object, feature, or advantage of the present invention is to provide the ability to accommodate “white out” IR situations through the use of an algorithmic reversal of a primary input methodology.
Another object, feature, or advantage of the present invention is to provide ultrasound sensor capability as an input control methodology.
Yet another object, feature, or advantage of the present invention is to provide a user interface that is fully functional even in situations where there is no option for visual interaction with the user.
Another object, feature, or advantage of the present invention is to provide for maximum user benefit in situations where input control is minimal due to physical space limitations.
Yet another object, feature, or advantage of the present invention is to provide a user interface which is functional even when the user and device are completely submerged in water, the user is wearing gloves, or the device is being used in areas of extremely bright sunlight or other types of diverse use conditions.
One or more of these and/or other objects, features, or advantages of the present invention will become apparent from the specification and claims that follow. No single embodiment need provide each and every object, feature, or advantage as different embodiments may have different objects, features, or advantages. Therefore, the invention is not to be limited by or to any object, feature, or advantage set forth herein.
According to one aspect, an earpiece includes an earpiece housing, a processor disposed within the earpiece housing, and a gesture based interface operatively connected to the processor and configured to detect changes in an energy field associated with user gestures. The processor is configured to interpret the changes in the energy field to determine the user gestures. The gesture based interface may include one or more IR LEDs and one or more IR receivers. Alternatively, the earpiece may include one or more ultrasound emitters and one or more ultrasound receivers. Alternative types of energy fields such as radar may also be used instead of light or sound. The user gestures may include swipe gestures. The processor may be further configured to determine directionality of the swipe gestures. The user gestures may include tap gestures, holds, or combinations of gestures. The earpiece may further include a speaker operatively connected to the processor, wherein the processor is configured to provide audio feedback to a user through the speaker. Where the energy field is an infrared light field, the processor may be configured to reverse modes when the infrared light field exceeds a threshold. The processor may also be configured to alter the rate of energy emission upon detection of an object within the field. The earpiece housing may be water resistant or impervious to water to allow a user to swim while wearing the earpiece.
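By way of a non-limiting illustration, directionality of a swipe may be determined from the relative timing at which separate receivers register the disturbance. The following sketch assumes two IR receiver channels sampled at a fixed rate; the names, threshold, and two-receiver layout are illustrative assumptions rather than particulars of this disclosure:

```python
# Minimal sketch of swipe-direction detection from two IR receiver channels.
# Assumes (hypothetically) one receiver mounted fore and one aft on the
# housing, each producing a stream of reflected-light samples.

def peak_time(samples: list[float], threshold: float) -> int | None:
    """Return the index of the first sample exceeding the threshold, else None."""
    for i, level in enumerate(samples):
        if level > threshold:
            return i
    return None

def swipe_direction(front: list[float], rear: list[float],
                    threshold: float = 0.5) -> str:
    """Classify a swipe by which receiver saw the reflection first."""
    t_front = peak_time(front, threshold)
    t_rear = peak_time(rear, threshold)
    if t_front is None or t_rear is None:
        return "no-gesture"
    if t_front < t_rear:
        return "front-to-rear"
    if t_rear < t_front:
        return "rear-to-front"
    return "tap"  # a simultaneous rise suggests a tap rather than a swipe

# Example: the reflection reaches the front receiver one sample earlier.
print(swipe_direction([0.1, 0.7, 0.9, 0.4], [0.1, 0.2, 0.7, 0.9]))  # front-to-rear
```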
According to another aspect, an earpiece is provided. The earpiece includes an earpiece housing, an infrared (IR) light emitting diode (LED) interface operatively connected to the earpiece housing, and a processor disposed within the earpiece housing and operatively connected to the IR LED interface. The IR LED interface may include at least one IR LED and at least one IR receiver. The processor may be configured to detect user gestures based on changes in infrared light. The earpiece may further include a speaker disposed within the earpiece housing and operatively connected to the processor, wherein the processor may be configured to provide audio feedback through the speaker. The earpiece may be configured to detect proximity of an object to the IR LED interface and adjust sampling speed based upon detection of the object. The processor may be configured to reverse modes when infrared light levels are above (or below) a threshold.
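One possible form of the sampling-speed adjustment is sketched below: the interface polls slowly while the field is undisturbed and polls quickly once an object is detected near it. The rates, margin, and function names are assumptions made for illustration only:

```python
# Illustrative sketch of proximity-driven sampling-rate adjustment. The
# specific rates and margin are assumed values, not taken from this disclosure.

IDLE_HZ = 10      # slow polling while nothing is near the interface
ACTIVE_HZ = 200   # fast polling once an object enters the field

def next_sample_rate(ir_level: float, baseline: float,
                     proximity_margin: float = 0.2) -> int:
    """Raise the sampling rate when reflected IR departs from the baseline."""
    if abs(ir_level - baseline) > proximity_margin:
        return ACTIVE_HZ
    return IDLE_HZ

print(next_sample_rate(0.75, baseline=0.4))  # 200: object detected in the field
print(next_sample_rate(0.45, baseline=0.4))  # 10: field undisturbed
```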
According to yet another aspect, an earpiece is provided. The earpiece includes an earpiece housing, an ultrasound interface operatively connected to the earpiece housing, and a processor disposed within the earpiece housing and operatively connected to the ultrasound interface. The ultrasound interface may include at least one ultrasound emitter and at least one ultrasound detector. The processor may be configured to detect user gestures based on changes in ultrasound energy. The earpiece may further include a speaker disposed within the earpiece housing and operatively connected to the processor, wherein the processor is configured to provide audio feedback through the speaker. The processor may be configured to detect proximity of an object to the ultrasound interface and adjust sampling speed based upon detection of the object.
According to another aspect, an earpiece is provided. The earpiece includes an earpiece housing, a radar interface operatively connected to the earpiece housing, and a processor disposed within the earpiece housing and operatively connected to the radar interface. The radar interface may include at least one radar emitter and at least one radar detector. The processor may be configured to detect user gestures based on changes in radar energy. The earpiece may further include a speaker disposed within the earpiece housing and operatively connected to the processor, wherein the processor is configured to provide audio feedback through the speaker. The processor may be configured to detect proximity of an object to the radar interface and adjust sampling speed based upon detection of the object.
According to another aspect, a wearable device is provided. The wearable device includes a wearable device housing, an infrared (IR) light emitting diode (LED) interface operatively connected to the wearable device housing, and a processor disposed within the wearable device housing and operatively connected to the IR LED interface. The processor is configured to interpret user gestures.
According to another aspect, a method for interacting with a user of a wearable device is provided. The method includes generating an energy field and detecting changes in the energy field associated with interactions of the user of the wearable device with the energy field. The interactions of the user of the wearable device with the energy field may be user gestures. The energy field may be an infrared light field. The method may further include providing audio feedback to the user of the wearable device. The wearable device may be an earpiece.
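The detecting step of the method may be understood as comparing each sample of the energy field against a running baseline of the undisturbed field and flagging departures as user interactions. The sketch below is illustrative only; the smoothing factor and margin are assumed values, and the sample list stands in for a hypothetical receiver driver:

```python
# A minimal sketch of the method: sample the generated energy field and flag
# user interactions as departures from a running baseline of the quiet field.

def detect_interactions(samples, alpha: float = 0.05, margin: float = 0.3):
    """Yield sample indices where the field departs from its running baseline."""
    baseline = None
    for i, level in enumerate(samples):
        if baseline is None:
            baseline = level
            continue
        if abs(level - baseline) > margin:
            yield i  # an interaction with the field
        else:
            # Update the baseline only during quiet periods so that a gesture
            # does not drag the baseline toward itself.
            baseline = (1 - alpha) * baseline + alpha * level

print(list(detect_interactions([0.4, 0.41, 0.9, 0.95, 0.42, 0.4])))  # [2, 3]
```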
According to another aspect, an earpiece includes an earpiece housing, at least one speaker and at least one microphone operatively connected to the earpiece housing, an infrared (IR) light emitting diode (LED) interface operatively connected to the earpiece housing, and a processor disposed within the earpiece housing and operatively connected to the infrared (IR) light emitting diode (LED) interface. The IR LED interface comprises at least one IR LED and at least one IR receiver. The processor is configured to detect user gestures based on changes in infrared light. The processor is configured to detect proximity of an object to the IR LED interface and adjust sampling speed based upon detection of the object. The processor is configured to reverse modes when infrared light is above a threshold.
The wearable device may provide for a plurality of different modes of operation. One mode of operation of the device relates to gestural movements. For example, where a user performs a gestural movement which is interpreted by the device, the device may light or activate one or more lighting elements to confirm the gestural movement or to indicate that the gestural movement could not be interpreted. In addition, audio feedback may be used to confirm a gestural movement or to indicate that the gestural movement could not be interpreted. One or more detectors or receivers 24A, 24B may also be present to detect changes in energy fields associated with gestures performed by a user. The receivers 24A, 24B in combination with one or more emitters provide a gesture based user interface.
The wearable device may be a wireless earpiece designed to fit into the external ear and concha cavum segment of the pinna. The system may be responsive in a number of harsh environments, ranging from complete submersion in water to operation while the user is wearing gloves, among others. Note that capacitive touch sensors would not be appropriate for these types of use cases.
The wearable device provides a near field control system. Such a system is responsive to the user in multiple environments where current physiologic interfaces are incapable of functioning. These environments include, but are not limited to, situations where the user and device are completely submerged in water, where the user is wearing gloves, and where there is extremely bright sunlight, among others. The system may function with no screen for visual feedback expected or anticipated. A gesture based control system may integrate audio signals for transmission of feedback to the individual. Audio based feedback provides a reliable and efficient human/device interface, and such a system requires no tactile feedback.
This can be accomplished in a number of ways. As shown in the figures, one or more IR LEDs may emit infrared light into a field adjacent the device, and one or more IR receivers may detect infrared light reflected back from a user's finger as it moves within the field, the reflection being interpreted as input relative to a baseline reading.
Alternatively, the system may be designed so that, if placed in a position where there is extreme IR exposure, the converse of the previously described methodology is employed. For example, in a situation where there is massive IR exposure (such as at a beach or while walking through snow on a sunny day), the finger creates a shadow rather than a reflection; the shadow is then correctly interpreted as input opposite in sense relative to the baseline.
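A minimal sketch of this reversal, under assumed calibration constants that are not taken from this disclosure, is as follows: below a saturation level a finger raises the received IR by reflection, while above it a finger lowers the received IR by shadowing, and negating the deflection in the latter case yields a consistent gesture signal:

```python
# Hedged sketch of the "white out" reversal: under normal ambient IR the
# finger raises the received level (reflection); under saturating ambient IR
# it lowers the level (shadow). WHITE_OUT_LEVEL is an assumed constant.

WHITE_OUT_LEVEL = 0.9   # ambient IR above this saturates the receivers

def gesture_signal(ambient: float, received: float, baseline: float) -> float:
    """Return a positive deflection for a finger regardless of ambient IR."""
    deflection = received - baseline
    if ambient > WHITE_OUT_LEVEL:
        return -deflection   # shadow mode: a dip now means a finger
    return deflection        # reflection mode: a rise means a finger

print(gesture_signal(ambient=0.3, received=0.7, baseline=0.4))   # reflection: +0.3
print(gesture_signal(ambient=0.95, received=0.6, baseline=0.9))  # shadow: +0.3
```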
Alternatively, the system may be further designed so that other sensors are brought into use to further clarify and quantify the data presented to the intelligent control. For example, inertial sensor data may be used to further improve the resolution and accuracy of the reading. Such additional features and benefits are not limited to the examples cited.
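One illustrative way inertial data could qualify an optical reading is sketched below: a candidate gesture is accepted only if the accelerometer indicates the head was roughly still, reducing false triggers caused by the earpiece itself moving. The thresholds, the 1 g stability test, and the gating strategy are assumptions for the sketch, not particulars of this disclosure:

```python
# Illustrative sketch of using inertial sensor data to qualify an optical
# gesture reading. All thresholds here are assumed values.

import math

def is_stable(accel_xyz: tuple[float, float, float],
              tolerance_g: float = 0.15) -> bool:
    """True when acceleration magnitude stays near 1 g (no abrupt head motion)."""
    magnitude = math.sqrt(sum(a * a for a in accel_xyz))
    return abs(magnitude - 1.0) < tolerance_g

def qualify_gesture(optical_confidence: float,
                    accel_xyz: tuple[float, float, float],
                    threshold: float = 0.6) -> bool:
    """Accept the gesture only if optics are confident and the head was still."""
    if not is_stable(accel_xyz):
        return False
    return optical_confidence >= threshold

print(qualify_gesture(0.8, (0.0, 0.05, 0.98)))  # True: still head, confident optics
print(qualify_gesture(0.8, (0.9, 0.4, 1.3)))    # False: the earpiece itself moved
```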
The system may alternatively be constructed using one or more ultrasound sensors, creating a sound wave field in place of the infrared field, as shown in the figures.
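In the sound wave variant, proximity may follow from echo time of flight. The sketch below uses the standard 343 m/s speed of sound in air; the field radius and the idea of treating any in-radius echo as an interaction are assumptions for illustration:

```python
# Sketch of ultrasound proximity sensing for the sound-wave field variant.
# Round-trip echo times would come from an emitter/detector pair; here they
# are supplied directly as example values.

SPEED_OF_SOUND_M_S = 343.0  # standard speed of sound in air

def echo_distance_m(round_trip_s: float) -> float:
    """Convert an echo's round-trip time to a one-way distance in meters."""
    return round_trip_s * SPEED_OF_SOUND_M_S / 2.0

def finger_in_field(round_trip_s: float, field_radius_m: float = 0.05) -> bool:
    """Treat any echo inside the assumed field radius as a user interaction."""
    return echo_distance_m(round_trip_s) <= field_radius_m

print(echo_distance_m(0.0002))  # ~0.034 m: a few centimeters from the earpiece
print(finger_in_field(0.0002))  # True: inside the assumed 5 cm field
```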
In operation, a user may wear the earpiece. The user may make a gesture near the IR LED interface (or other type of interface). The gesture may be in the form of a tap, a double tap, a triple tap, a swipe (such as a swipe with a particular directionality), a hold, or another gesture. Note that different functionalities may be associated with different gestures, and different functionalities may be associated with the same gesture when the device is operating in different modes of operation. Although it is generally preferred that gestures be simple, it is contemplated that complex gestures may be built from combinations of simple gestures. It is further contemplated that the earpiece may be trained to identify swipes or taps from different fingers of a user, and that swipes or taps of different durations may be interpreted differently. In addition, directionality of user gestures may be used to define the gestures.
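By way of illustration, this gesture vocabulary may be classified from the timing of field interruptions, with tap count and duration distinguishing taps from holds. In the sketch below each interruption is a (start, end) pair in seconds, and the timing constants are assumed values rather than particulars of this disclosure:

```python
# Hedged sketch of classifying taps, multi-taps, and holds from the timing of
# field interruptions. All timing constants are assumptions.

HOLD_MIN_S = 0.5        # an interruption at least this long reads as a hold
TAP_GAP_MAX_S = 0.3     # taps closer together than this group into one gesture

def classify(events: list[tuple[float, float]]) -> str:
    """Classify a burst of (start, end) field interruptions as one gesture."""
    if not events:
        return "none"
    if any(end - start >= HOLD_MIN_S for start, end in events):
        return "hold"
    taps = 1
    for (_, prev_end), (next_start, _) in zip(events, events[1:]):
        if next_start - prev_end <= TAP_GAP_MAX_S:
            taps += 1
    return {1: "tap", 2: "double-tap", 3: "triple-tap"}.get(taps, "complex")

print(classify([(0.0, 0.1)]))                              # tap
print(classify([(0.0, 0.1), (0.25, 0.35)]))                # double-tap
print(classify([(0.0, 0.8)]))                              # hold
```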
It is also contemplated that more than one wearable device may be used. For example, two earpieces may be used, each with its own user interface. Where multiple devices are used, it is to be understood that a gesture performed at one device may be associated with one function while the same gesture performed at the other device may be associated with a different function. Alternatively, the same gesture may perform the same function regardless of the device at which the gesture is performed.
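Such per-device assignment may be represented as a mapping keyed by both the device and the gesture, as in the illustrative sketch below. The device names and function names are placeholders, not commands defined by this disclosure:

```python
# Illustrative mapping of the same gesture to different functions depending
# on which earpiece receives it. All names are hypothetical placeholders.

GESTURE_MAP = {
    ("left", "swipe-forward"): "previous-track",
    ("right", "swipe-forward"): "next-track",
    ("left", "tap"): "volume-down",
    ("right", "tap"): "volume-up",
}

def dispatch(device: str, gesture: str) -> str:
    """Resolve a (device, gesture) pair to its assigned function."""
    return GESTURE_MAP.get((device, gesture), "unassigned")

print(dispatch("left", "tap"))   # volume-down
print(dispatch("right", "tap"))  # volume-up: same gesture, other earpiece
```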
It is further contemplated that audio feedback may be provided to the user in response to gestures made. For example, the audio feedback may simply indicate that the gesture was received or may specify the functionality associated with the gesture. Alternatively, the audio feedback may request additional gestures, such as an additional gesture to confirm the gesture previously made or to confirm that the function associated with the gesture is to be performed. It is contemplated that whether audio feedback is used, and/or the type of audio feedback used, may be controlled through user settings of the device. For example, audio feedback may always be used, audio feedback may only be used where the confidence level associated with identifying the gesture is not sufficiently high, or audio feedback may only be used in certain modes of operation.
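These settings may be captured as a small feedback policy, as in the hedged sketch below. The policy names, confidence cutoff, and mode list are assumptions for illustration rather than settings defined by this disclosure:

```python
# Sketch of an audio-feedback policy: feedback always, never, only when the
# recognition confidence is low, or only in certain modes. All setting names
# and the 0.8 cutoff are assumed for illustration.

def should_speak(policy: str, confidence: float, mode: str,
                 low_confidence_cutoff: float = 0.8,
                 feedback_modes: frozenset = frozenset({"menu", "setup"})) -> bool:
    """Decide whether to play audio feedback for a recognized gesture."""
    if policy == "always":
        return True
    if policy == "low-confidence-only":
        return confidence < low_confidence_cutoff
    if policy == "mode-restricted":
        return mode in feedback_modes
    return False  # policy == "never"

print(should_speak("low-confidence-only", 0.65, "playback"))  # True: uncertain read
print(should_speak("mode-restricted", 0.95, "menu"))          # True: menu mode
```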
Note that the user interface provides a number of advantages which may be of particular importance. For example, where the device is an earpiece, the earpiece may be resistant or impervious to water; thus, a user may wear the earpiece while swimming. In such a situation other types of user interfaces, such as capacitive touch, may not be appropriate. In addition, because the use of IR may be reversed when there is massive IR exposure, as previously explained, the user interface may be used even in high IR environments.
One of the other significant advantages that the gesture based user interface provides is that a user may fully interact with the system even in situations where there is no option for visual interaction with the user. Another significant advantage is that the user interface may be used in situations where input control is minimal due to physical space limitations. A further benefit of the user interface is that voice commands are not required and thus issues associated with voice control can be avoided.
Therefore, various apparatus, systems, and methods have been shown and described. Differences in the type of energy detection, the algorithms used, the gestures used, and other options, variations, and alternatives are contemplated.
This application claims priority to U.S. Provisional Patent Application No. 62/211,728, filed Aug. 29, 2015, and is a continuation of U.S. patent application Ser. No. 15/244,917, filed Aug. 23, 2016; both applications are hereby incorporated by reference in their entirety.