The present invention relates to wearable devices. More particularly, but not exclusively, the present invention relates to earpieces.
Wearable technology is a fast-developing field, and significant developments are therefore needed in how users interact and interface with these technologies. Various alternatives exist for determining user intent in wearable technology. One such alternative is to use touch-based interfaces. Examples of touch-based interfaces may include capacitive touch screens, buttons, switches, pressure sensors, and fingerprint sensors. Another alternative is to use audio interfaces, such as through use of key-word vocal commands or natural language spoken commands. Another alternative is to use a gesture-based interface in which hand motions may be measured by a sensor and then classified as certain gestures. Yet another alternative is to use a computer-vision based interface, such as by recognition of a specific individual, of a user's presence in general, or of two or more people.
Wearable technology presents particular challenges in that user interfaces that are successful for established technologies are in some cases no longer the most natural, convenient, appropriate, or simple interfaces for users. For example, large capacitive touchscreens are widely used in mobile devices, but the inclusion of such a user interface may not be appropriate for discrete ear-worn devices.
Therefore, improved user interfaces for wearable devices are needed.
Therefore, it is a primary object, feature, or advantage of the present invention to improve over the state of the art.
It is a further object, feature, or advantage of the present invention to provide for triggering an event after determining a user's attention or intention.
Another object, feature, or advantage is to provide an improved user interface for a wearable such as an earpiece wearable.
It is a still further object, feature, or advantage of the present invention to provide for an interface which uses audio menus.
Another object, feature, or advantage of the present invention is to use sensor data such as inertial sensor data, biometric sensor data, and environmental sensor data to determine a user's attention or intention.
Yet another object, feature, or advantage of the present invention is to interact with a user without requiring manual input on a device and without requiring voice input to the device.
One or more of these and/or other objects, features, or advantages of the present invention will become apparent from the specification and claims that follow. No single embodiment need provide each and every object, feature, or advantage. Different embodiments may have different objects, features, or advantages. Therefore, the present invention is not to be limited to or by any objects, features, or advantages stated herein.
According to one aspect an earpiece includes an earpiece housing, an intelligent control system disposed within the earpiece housing, a speaker operatively connected to the intelligent control system, a microphone operatively connected to the intelligent control system, and at least one sensor operatively connected to the intelligent control system for providing sensor data. The intelligent control system of the earpiece is configured to interface with a user of the earpiece by determining at least one of attention or intention of the user using the sensor data without receiving manual input at the earpiece and without receiving voice input from the user. The earpiece may be further configured to present an audio menu and use the attention or intention of the user to select one or more items from the audio menu. The at least one sensor may include an inertial sensor and the step of determining the attention or intention of the user may be based at least in part on head orientation and/or head movement. The at least one sensor may further include at least one biometric sensor and the step of determining the attention or intention of the user may be based at least in part on biometric data from the at least one biometric sensor. The at least one sensor may further include at least one environmental sensor and the step of determining the attention or intention of the user may be based at least in part on environmental data from the at least one environmental sensor.
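By way of illustration only, and not by way of limitation, the following is a minimal sketch of how an intelligent control system might combine inertial, biometric, and environmental sensor data to derive an attention or intention estimate without manual or voice input. The class names, thresholds, and weighting scheme are illustrative assumptions and do not describe any particular embodiment.

```python
from dataclasses import dataclass


@dataclass
class SensorSample:
    """Illustrative bundle of sensor data available to the intelligent control system."""
    head_yaw_deg: float        # from the inertial sensor
    head_pitch_deg: float      # from the inertial sensor
    heart_rate_bpm: float      # from a biometric sensor
    ambient_noise_db: float    # from an environmental sensor


def estimate_attention(sample: SensorSample, baseline_hr: float) -> float:
    """Return a 0..1 attention score derived only from sensor data.

    Illustrative heuristic: a deliberate head orientation combined with a modest
    rise in heart rate over the user's baseline is treated as increased attention.
    """
    # Head held away from neutral suggests the user is attending to something.
    orientation_score = min(1.0, (abs(sample.head_yaw_deg) + abs(sample.head_pitch_deg)) / 60.0)
    # A small elevation over the baseline heart rate suggests engagement.
    arousal_score = max(0.0, min(1.0, (sample.heart_rate_bpm - baseline_hr) / 20.0))
    # Discount the estimate slightly in very noisy environments (assumed weighting).
    noise_penalty = 0.2 if sample.ambient_noise_db > 80 else 0.0
    return max(0.0, 0.6 * orientation_score + 0.4 * arousal_score - noise_penalty)


if __name__ == "__main__":
    sample = SensorSample(head_yaw_deg=35.0, head_pitch_deg=5.0,
                          heart_rate_bpm=78.0, ambient_noise_db=55.0)
    print(f"attention score: {estimate_attention(sample, baseline_hr=70.0):.2f}")
```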
According to another aspect, an earpiece includes an earpiece housing, an intelligent control system disposed within the earpiece housing, a speaker operatively connected to the intelligent control system, a microphone operatively connected to the intelligent control system, and at least one inertial sensor operatively connected to the intelligent control system for providing inertial sensor data. The intelligent control system of the earpiece may be configured to interface with a user of the earpiece by providing audio cues associated with a menu containing a plurality of selections and receiving a selection of one of the plurality of the selections within the menu at least partially based on the inertial sensor data. The menu may have a plurality of levels. Each of the selections within a given level of the menu may be associated with different head positions although a user may otherwise communicate their attention or intention. The earpiece may be further configured to interface with the user of the earpiece by receiving a confirmation of the selection of one of the plurality of the selections within the menu based on the inertial sensor data or other sensor data.
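As one hedged illustration of the menu interaction described above, the sketch below maps coarse head positions derived from inertial sensor data to selections within a single menu level and requires a nod-like confirmation before an item is accepted. The position bins, menu items, and confirmation gesture are assumptions made only for the example.

```python
from typing import Optional

# Illustrative mapping from coarse head positions to selections within one menu level.
MENU_LEVEL = {
    "left": "previous track",
    "right": "next track",
    "up": "volume up",
    "down": "volume down",
}


def classify_head_position(yaw_deg: float, pitch_deg: float) -> str:
    """Bin a head orientation from the inertial sensor into a coarse position."""
    if pitch_deg > 20:
        return "up"
    if pitch_deg < -20:
        return "down"
    if yaw_deg < -20:
        return "left"
    if yaw_deg > 20:
        return "right"
    return "neutral"


def select_item(yaw_deg: float, pitch_deg: float, confirmed_by_nod: bool) -> Optional[str]:
    """Return the selected item once the head position and a confirmation gesture agree."""
    position = classify_head_position(yaw_deg, pitch_deg)
    if position == "neutral" or not confirmed_by_nod:
        return None
    return MENU_LEVEL.get(position)


if __name__ == "__main__":
    # The user turns right and nods, so the "next track" selection is returned.
    print(select_item(yaw_deg=30.0, pitch_deg=0.0, confirmed_by_nod=True))
```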
According to yet another aspect, a system includes a first earpiece and a second earpiece wherein each of the first earpiece and the second earpiece comprises an earpiece housing, a speaker, a microphone, and a transceiver. At least one of the first earpiece and the second earpiece further includes at least one sensor for providing sensor data. At least one of the first earpiece and the second earpiece may further include an intelligent control system to interface with a user of the earpiece by determining at least one of attention or intention of the user using the sensor data without receiving manual input at the earpiece and without receiving voice input from the user. The system may be configured to present an audio menu and use the attention or intention of the user to select one or more items from the audio menu. The audio menu may include a plurality of audio cues, and the audio cues may be processed with a psychoacoustic model to virtually place or move sounds in 3D space relative to the user. The at least one sensor may include an inertial sensor and the step of determining the attention or intention of the user may be based at least in part on head orientation and/or head movement. The at least one sensor may further include one or more biometric sensors and/or one or more environmental sensors.
The present invention provides for methods, apparatus, and systems to allow for triggering of an event through determining the attention and/or intention of a user. In other words, instead of using conventional user interfaces, the triggering of events may be performed by determining the attention and/or intention of a user. Although specific embodiments are shown and described with respect to earpieces or ear-worn computers and sensor packages, it is to be understood that the methodologies shown and described may be applied to other types of wearable devices.
Focusing on a specific entity may be performed in various ways. For example, focusing may be performed through particular motions. Thus, for example, a user may turn to face something, gesture toward something, or move toward something in order to place a positive focus on an entity. Alternatively, a user may turn away from something, move away from something, gesture to reject something, gesture to shield oneself from something, or invoke an involuntary reaction (e.g. a fight-or-flight response) to something, which are all examples of placing a negative focus on an entity. Focus may also be determined based on subconscious emotional responses. For example, changes in facial expression may be used as input. Thus, for example, a smile may be used as one form of attention while a scowl may be used as another form of attention. Focus may also be determined based on subconscious physiological information. This may include, for example, changes in heart rate, changes in heart rate variability, changes in perspiration levels, changes in skin conductance, and changes in evoked potentials in brain waves. Focus may also be determined by listening to a particular sound source. For example, a user may consciously listen to a particular source of sound in an environment where more than one sound source is present. Alternatively, a user may subconsciously listen to a particular source of sound in such an environment.
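Purely as a hedged sketch of how positive and negative focus signals of the kinds listed above might be combined, the following assumes simple named inputs (a head-turn flag, a facial-expression label, and a change in heart rate); the signal names and weights are illustrative assumptions rather than a required implementation.

```python
def focus_score(turned_toward: bool, turned_away: bool,
                expression: str, heart_rate_delta_bpm: float) -> float:
    """Combine conscious motions and subconscious responses into a signed focus score.

    Positive values indicate positive focus on an entity; negative values indicate
    negative focus (rejection or avoidance). Weights are illustrative only.
    """
    score = 0.0
    if turned_toward:
        score += 0.5          # facing, gesturing toward, or moving toward the entity
    if turned_away:
        score -= 0.5          # turning away from or moving away from the entity
    if expression == "smile":
        score += 0.3          # subconscious emotional response used as input
    elif expression == "scowl":
        score -= 0.3
    # Subconscious physiological information, e.g. a change in heart rate.
    score += max(-0.2, min(0.2, heart_rate_delta_bpm / 50.0))
    return max(-1.0, min(1.0, score))


if __name__ == "__main__":
    print(focus_score(turned_toward=True, turned_away=False,
                      expression="smile", heart_rate_delta_bpm=6.0))
```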
The interface may use one or more different sensors to determine metrics of the user's state from which the user's attention and/or intention may be derived. Any number of different sensors may be used including microphones, image sensors, time of flight sensors, inertial sensors, physiological sensors, or other types of sensors.
A spectrometer 16 is also shown. The spectrometer 16 may be an infrared (IR) through ultraviolet (UV) spectrometer, although it is contemplated that any number of wavelengths in the infrared, visible, or ultraviolet spectrums may be detected. The spectrometer 16 is preferably adapted to measure environmental wavelengths for analysis and recommendations and thus is preferably located on or at the external facing side of the device. An image sensor 88 may be present and a depth or time of flight camera 89 may also be present. A gesture control interface 36 may also be operatively connected to or integrated into the intelligent control system 30. The gesture control interface 36 may include one or more emitters 82 and one or more detectors 84 for sensing user gestures. The gestures may be performed through contact with a surface of the earpiece or may be performed near the earpiece. The emitters may be of any number of types including infrared LEDs. The device may include a transceiver 35 which may allow for induction transmissions such as through near field magnetic induction. A short range transceiver 34 using Bluetooth, BLE, UWB, or other means of radio communication may also be present. The short range transceiver 34 may be used to communicate with other devices including mobile devices. The various sensors 32, the intelligent control system 30, and other electronic components may be located on one or more printed circuit boards of the device. One or more speakers 73 may also be operatively connected to the intelligent control system 30. A magnetic induction electric conduction electromagnetic (E/M) field transceiver 37 or other type of electromagnetic field receiver may also be operatively connected to the intelligent control system 30 to link it to the electromagnetic field of the user. The use of the E/M transceiver 37 allows the device to link electromagnetically into a personal area network, body area network, or other device. It is contemplated that sensors associated with other devices, including other wearable devices or internet of things (IoT) devices, may be used to provide or add to sensor data which may be used in determining user attention or intention in any number of different ways and in any number of different contexts or situations.
It is contemplated that the interface may have different modes of operation which may include a sleep mode to conserve battery life and/or reduce power usage. The interface may be awakened in any number of ways, such as through a deliberate interaction between the user and the interface or through a behavior recognized by the interface. For example, movement of the head may serve to awaken the interface.
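A minimal sketch of such a sleep/wake policy is shown below, assuming the interface is awakened when recent inertial readings exceed a motion threshold; the window length and threshold values are arbitrary illustrative choices.

```python
from collections import deque


class WakePolicy:
    """Illustrative sleep-mode controller: wake the interface on deliberate head movement."""

    def __init__(self, threshold_deg_per_s: float = 40.0, window: int = 10):
        self.threshold = threshold_deg_per_s
        self.recent = deque(maxlen=window)   # recent angular-rate magnitudes
        self.awake = False

    def update(self, angular_rate_deg_per_s: float) -> bool:
        """Feed one inertial reading; return True while the interface is awake."""
        self.recent.append(abs(angular_rate_deg_per_s))
        if not self.awake and max(self.recent) > self.threshold:
            self.awake = True          # a recognized head movement awakens the interface
        elif self.awake and max(self.recent) < 5.0:
            self.awake = False         # return to sleep mode to conserve battery life
        return self.awake


if __name__ == "__main__":
    policy = WakePolicy()
    for rate in [2.0, 3.0, 55.0, 10.0, 1.0, 1.0]:
        print(policy.update(rate))
```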
When the interface is awake and/or active, the user may be presented with different audio prompts or feedback based on the orientation of their head, thereby allowing them to trigger an event in the interface. Thus, all possible orientations of the head (or any subset thereof) may be used as input channels to the interface. In this example, audio prompts or audio feedback may be presented to the user and the intention of the user may be determined by the interface via a confirmation gesture or otherwise. For example, in one alternative a user may simply continue to attend to a presented audio cue.
In one alternative, sounds may be played to the user according to their (the user's) orientation.
It is also to be understood that the menus provided may be built dynamically to present the items in an order generated to present the most likely selections first. A determination of the most likely selections may be performed in various ways, including based on user history, user preferences, and/or other contextual information including sensor data.
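The following sketch illustrates one possible way to build such a dynamically ordered menu, ranking items by a blend of user history and a contextual relevance score; the scoring inputs and weights are assumptions made for illustration only.

```python
from typing import Dict, List


def build_dynamic_menu(items: List[str],
                       selection_history: Dict[str, int],
                       context_scores: Dict[str, float]) -> List[str]:
    """Order menu items so the most likely selections are presented first.

    selection_history: how often the user has chosen each item before.
    context_scores: 0..1 relevance derived from other contextual or sensor information.
    """
    total = sum(selection_history.values()) or 1

    def likelihood(item: str) -> float:
        history_term = selection_history.get(item, 0) / total
        context_term = context_scores.get(item, 0.0)
        return 0.7 * history_term + 0.3 * context_term   # illustrative weighting

    return sorted(items, key=likelihood, reverse=True)


if __name__ == "__main__":
    items = ["play music", "read messages", "start workout"]
    history = {"play music": 12, "read messages": 3, "start workout": 5}
    context = {"start workout": 0.9}   # e.g. sensor data suggests the user is at the gym
    print(build_dynamic_menu(items, history, context))
```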
According to another example with a more natural attention-detection mechanism, the user may be presented various audio cues or selections at particular locations. Audio feedback or cues may be processed with a psychoacoustic model to virtually place or move sounds in 3D space relative to the user. Thus, for example, different audio cues or selections may be placed in different locations, such as up, down, right, left, up and to the right, down and to the right, or down and to the left. Of course, any number of other locations may be used. It should be understood that in this example the audio cues need not include position information. Instead, the position is associated with the perceived location or direction of the sound source. In addition to placing audio cues, audio feedback, or selections at different locations, these sounds may also be moved in 3D space relative to a user. Thus, for example, a sound may be introduced at one location and may be perceived as moving from that location to another location. This is another tool by which a user may convey their interest in a particular selection, as their head movement may track the movement of a sound. In addition, after a user has selected a sound, one manner in which confirmation of the selection may be made is to move the sound to another location and confirm that the user remains focused on that sound and thus intends to make that selection. This may be accomplished by having the user select the same sound again (but in a different location) or by confirming that the user has begun to track the sound after selection. If the user does not intend to make a particular selection, then the user would not select the same sound again or would not track that sound, such as by exhibiting no head movement or by moving in a different direction. These are examples of inferring intention when the user continues to maintain attention on a presented audio cue.
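As a hedged sketch of the selection-and-confirmation flow just described, the example below places audio cues at assumed azimuths, treats the cue nearest the user's head orientation as the candidate selection, then moves that cue and checks whether the head tracks it. The angles, tolerance, and cue names are illustrative assumptions, and the actual psychoacoustic placement of the sounds is outside the scope of the sketch.

```python
from typing import Dict, List, Optional

# Illustrative audio cues perceived at different azimuths (degrees, 0 = straight ahead).
CUE_AZIMUTHS: Dict[str, float] = {"messages": -60.0, "music": 0.0, "navigation": 60.0}


def candidate_selection(head_yaw_deg: float, tolerance_deg: float = 25.0) -> Optional[str]:
    """Return the cue whose perceived location the user's head is oriented toward."""
    best = min(CUE_AZIMUTHS, key=lambda cue: abs(CUE_AZIMUTHS[cue] - head_yaw_deg))
    return best if abs(CUE_AZIMUTHS[best] - head_yaw_deg) <= tolerance_deg else None


def confirmed_by_tracking(new_azimuth_deg: float,
                          head_yaw_trace_deg: List[float],
                          tolerance_deg: float = 25.0) -> bool:
    """After moving the selected cue, confirm that the user's head has tracked it."""
    return abs(head_yaw_trace_deg[-1] - new_azimuth_deg) <= tolerance_deg


if __name__ == "__main__":
    cue = candidate_selection(head_yaw_deg=55.0)          # user looks toward "navigation"
    print("candidate:", cue)
    # The cue is then re-placed at -30 degrees; the user's head follows it.
    print("confirmed:", confirmed_by_tracking(new_azimuth_deg=-30.0,
                                              head_yaw_trace_deg=[55.0, 20.0, -25.0]))
```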
In addition to or instead of inertial sensors, other types of sensors may also be used, including biometric sensors. Biometric sensors may be used to ascertain subconscious and/or unintentional information about the state of the user. Biometric data may be used to further complement the logic for determining the user's intent and to provide a greater depth of contextual information to the interface, such that the information presented is more relevant to the user. Examples of biometric data may include pulse oximetry data, heart rate, heart rate variability, perspiration level, skin conductance, evoked potentials in brain waves, and other types of biometric data.
In addition to inertial sensors and biometric sensors, other types of sensors may also be used to sense environmental conditions or information. Such environmental data may be used to enable the information presented to be more relevant and may include, without limitation, data such as geographic location, such as may be determined using a GPS sensor, or location in a localized sense, such as whether the user is indoors or outdoors. Other types of sensors may include depth or time of flight cameras, air pressure sensors, barometric sensors, volatile organic compound sensors, small-particle sensors, temperature sensors, photometers, image sensors or cameras, or other types of sensors.
It is to be further understood that context may be based in part on the relative location to other objects. Other objects may be identified in any number of ways. For example, where the sensors include imaging sensors, imagery may be acquired and image processing algorithms may be performed to detect and classify objects upon which a user's attention may be focused, such as may be determined based on the direction their head is pointed. Similarly, audio sources may be identified and classified based on data sensed with external microphones, and the user's attention on a particular audio source may be determined in part based on the orientation of the user's head or other information, such as whether the user has made adjustments to audio settings to focus on that individual. Wireless transceivers of various types associated with the earpiece may also be used to identify objects where the object is in wireless communication with the user.
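By way of illustration, the sketch below assumes that external sensors have already detected and classified nearby objects, each with a bearing relative to the user; it simply returns the object closest to the direction the user's head is pointed. The object list, bearings, and tolerance are hypothetical.

```python
from typing import List, Optional, Tuple


def focused_object(head_yaw_deg: float,
                   detected: List[Tuple[str, float]],
                   tolerance_deg: float = 20.0) -> Optional[str]:
    """Return the classified object (label, bearing_deg) nearest the head direction.

    `detected` is assumed to come from image processing of imaging-sensor data or
    from classification of audio sources sensed with external microphones.
    """
    if not detected:
        return None
    label, bearing = min(detected, key=lambda obj: abs(obj[1] - head_yaw_deg))
    return label if abs(bearing - head_yaw_deg) <= tolerance_deg else None


if __name__ == "__main__":
    objects = [("person speaking", 15.0), ("doorway", -70.0), ("television", 100.0)]
    print(focused_object(head_yaw_deg=10.0, detected=objects))
```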
Although various examples have been shown and described throughout, it is to be understood that numerous variations, options, and alternatives are contemplated. This includes variations in the sensors used, the placement of sensors, the manner in which audio menus are constructed, and other variations, options, and alternatives.
This application claims priority to U.S. Provisional Patent Application No. 62/464,337, filed Feb. 27, 2017, hereby incorporated by reference in its entirety.