The present invention relates to audio-based social media communication using wearable devices. More particularly, but not exclusively, the present invention relates to an audio-based social media platform.
Social media provides users with a computer-mediated tool to create, share, and exchange information. Computer-mediated tools such as Facebook chats, Instagram posts, LinkedIn, and text messaging all necessitate the use of a screen for the exchange of information. Teenagers in the United States spend about nine hours per day in front of a screen. Therefore, what is needed is a new form of communication with a completely screen-free messenger interface.
Therefore, it is a primary object, feature, or advantage of the present invention to improve over the state of the art.
It is a further object, feature, or advantage of the present invention to provide enhanced auditory communication using wearable devices.
It is a still further object, feature, or advantage of the present invention to provide users with a social media network of people using audio-based communication.
Another object, feature, or advantage is to allow users to interact with others without the need to use keyboards or screens.
It is still a further object, feature, or advantage of the present invention to provide the user with a means of sending audio-based communication via a gesture.
It is another object, feature, or advantage of the present invention to provide an individual with the ability to share audio messaging with an individual person, a group of people, or even strangers.
Yet another object, feature, or advantage of the present invention is to associate an audio message with a particular location.
A further object, feature, or advantage of the present invention is to provide for a social media experience that emulates real life.
A still further object, feature, or advantage is to allow for personal and immediate communications between people.
Another object, feature, or advantage is to allow for a social media experience that promotes conversational interaction and authentic interactions.
Yet another object, feature, or advantage is to provide for a social media experience which may be used for purely social or business-related interactions, including those which promote customer engagement.
A further object, feature, or advantage is to provide for a social media experience which need not result in interactions being permanent and allows for interactions which are private.
One or more of these and/or other objects, features, or advantages of the present invention will become apparent from the specification and claims that follow. No single embodiment need provide each and every object, feature, or advantage. Different embodiments may have different objects, features, or advantages. Therefore, the present invention is not to be limited to or by any objects, features, or advantages stated herein.
According to one aspect, a method for audio-based social media messaging for a wearable device includes generating a first audio message using a wearable device, the wearable device having at least one microphone and at least one speaker, and conveying the first audio message from the wearable device to an audio-based social media platform using a wireless radio transceiver within the wearable device. The method may further include generating social media data associated with the first audio message at the wearable device and conveying the social media data to the audio-based social media platform using the wireless radio transceiver within the wearable device. The social media data may include delivery data indicating one or more users on the social media platform to receive the first audio message, and a location associated with where the first audio message is generated or where the wearable device is located when the audio message is sent. The method may further include receiving, at the wearable device, a notification that a second audio message is available. The method may further include playing the second audio message using the wearable device.
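The flow described in this aspect may be illustrated with a minimal, hypothetical sketch. The data structures and the `send_to_platform` function below are assumptions made for illustration only and are not part of the claimed method.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class SocialMediaData:
    # Delivery data: one or more users on the platform who should receive the message.
    recipients: List[str]
    # Optional geolocation (latitude, longitude) where the message was generated
    # or where the wearable device was located when the message was sent.
    location: Optional[Tuple[float, float]] = None


@dataclass
class AudioMessage:
    audio: bytes                     # raw audio captured by the wearable's microphone
    social_media_data: SocialMediaData


def send_to_platform(message: AudioMessage) -> None:
    """Stand-in for conveying the message via the wearable's wireless radio transceiver."""
    print(f"Sending {len(message.audio)} bytes to {message.social_media_data.recipients} "
          f"at {message.social_media_data.location}")


# Example: generate a first audio message and convey it with its social media data.
msg = AudioMessage(
    audio=b"\x00" * 1024,            # placeholder for recorded audio
    social_media_data=SocialMediaData(recipients=["user_42"], location=(37.7989, -122.3770)),
)
send_to_platform(msg)
```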
According to another aspect, a wearable device configured to support an audio-based social media network is provided. The wearable device may include a wearable device housing, at least one speaker associated with the wearable device housing, at least one microphone associated with the wearable device housing, and a wireless transceiver disposed within the wearable device housing. The wearable device may be configured to connect to the audio-based social media network through the wireless transceiver to send and receive audio messages.
According to another aspect, a method for providing audio-based social media messaging to a plurality of wearable devices, each of the wearable devices having a speaker and a microphone, includes steps of receiving audio messages and audio message identification and delivery data from the plurality of wearable devices, storing the audio messages and the audio message identification and delivery data on a server platform, and delivering one or more of the audio messages to one or more of the plurality of wearable devices based on the audio message identification and the delivery data.
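A minimal in-memory sketch of this server-side aspect follows. The class name, storage layout, and use of per-user inboxes are assumptions for illustration; an actual server platform could store and deliver messages in any number of ways.

```python
from collections import defaultdict


class BubbleServer:
    """Hypothetical server platform: receive, store, and deliver audio messages."""

    def __init__(self):
        self.messages = {}                 # message_id -> (audio, delivery_data)
        self.inboxes = defaultdict(list)   # user_id -> list of message_ids awaiting playback

    def receive(self, message_id, audio, delivery_data):
        # Store the audio message together with its identification and delivery data.
        self.messages[message_id] = (audio, delivery_data)

    def deliver(self, message_id):
        # Deliver the stored message to each user identified in the delivery data.
        _, delivery_data = self.messages[message_id]
        for user_id in delivery_data["recipients"]:
            self.inboxes[user_id].append(message_id)


server = BubbleServer()
server.receive("m1", b"...", {"recipients": ["alice", "bob"]})
server.deliver("m1")
print(server.inboxes["alice"])   # ['m1']
```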
According to another aspect, a method for audio-based social media messaging for a wearable device is provided. The method includes generating a first audio message using a first wearable device, the first wearable device having at least one microphone and at least one speaker and no screen display, and sending the first audio message from the first wearable device over a network to an audio-based social media server platform using a wireless radio transceiver within the first wearable device. The method may further include storing the first audio message on the audio-based social media server platform for a time period, and communicating the first audio message on the audio-based social media server platform to a second wearable device, the second wearable device having at least one microphone and at least one speaker and no screen display. The method may further include generating social media data associated with the first audio message at the first wearable device and conveying the social media data to the audio-based social media platform using the wireless radio transceiver within the first wearable device.
Illustrated embodiments of the disclosure are described in detail below with reference to the attached drawing figures, which are incorporated by reference herein.
The present invention relates to an audio-based social media messaging platform or experience and related systems and methods. Current social media applications generally provide users with a screen-based method of sharing information that focuses on typing and reading information. These social media sites are limited in the manner and location in which the user may communicate information. Users must be in places where they can type out a message and have their electronic devices with them. One of the criticisms of such social media usage is that users can miss experiencing life around them because of their fixation on a display. Another problem relates to safety. If users stare at screens while they walk, drive, or engage in other activities, the users put themselves and/or others in danger. Therefore, an audio-based social media messaging platform as shown and described herein removes the need to rely on a display and promotes more natural social media communications. Thus, chats may occur without keyboards or screens and while a user is otherwise interacting with their environment. The term “bubble” is sometimes used herein when referring to aspects of the invention. As used herein, a “bubble” is associated with an audio message that may have a particular reach. It may include the content of one or more audio messages as well as convey inclusiveness or reach (e.g., bubbles may grow to include others or to add to the messaging) and may also convey the temporary nature of the communication (e.g., a bubble may be popped).
To trigger the audio-based social media application or functionality, a gesture may be used. The gesture may be used as input to a wearable device such as an earpiece having one or more microphones. Once triggered or activated, an audio message or bubble may be sent by simply speaking. The audio message or bubble may be of different types and have different delivery options associated with it. For example, a user may send an audio message in the form of a bubble to other users in their social media network, to a group or sub-group within their social media network, or to a list of contacts within the social media network. Alternatively, the user may send an audio message or bubble to a single recipient. A user may also make the bubble “fly” and allow the bubble to be shared with other users of the audio-based social media platform who are not within the list of contacts associated with the user. Alternatively, a user may “stick” the audio message or bubble to a particular location (such as a geolocation associated with the user's current location or another location) so that other users may only hear the audio message or join the bubble when present at the location.
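One way to represent these delivery options is sketched below. The enumeration names and the `resolve_recipients` helper are hypothetical conveniences and not claimed features; they simply mirror the four options described above.

```python
from enum import Enum, auto


class BubbleReach(Enum):
    CONTACTS = auto()   # share with the user's contacts, group, or sub-group
    DIRECT = auto()     # share with a single recipient
    FLY = auto()        # share with platform users outside the sender's list of contacts
    STICK = auto()      # attach to a location; heard only when a user is present there


def resolve_recipients(reach, sender_contacts, target=None):
    """Hypothetical dispatch over the delivery options described above."""
    if reach is BubbleReach.CONTACTS:
        return list(sender_contacts)
    if reach is BubbleReach.DIRECT:
        return [target]
    if reach is BubbleReach.FLY:
        return ["*"]   # placeholder meaning "any open user on the platform"
    if reach is BubbleReach.STICK:
        return []      # recipients determined later by proximity to the location
    raise ValueError(reach)


print(resolve_recipients(BubbleReach.DIRECT, ["ann", "ben"], target="ann"))   # ['ann']
```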
A list may be maintained locally and/or on a server associated with the audio-based social media platform. The list may include a plurality of different contacts associated with a user. It is contemplated that a user may build this list or add to it individually. In addition, or alternatively, the audio-based social media platform may scan any number of different social media channels associated with a user in order to add contacts into this list. This may include email contacts, social media contacts, or other types of contacts. A user may invite someone to message or join a bubble in one tap or less.
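A simple sketch of aggregating contacts scanned from several channels into a single de-duplicated list is shown below; the merge-by-lowercased-identifier rule is an assumption chosen only to keep the example short.

```python
def aggregate_contacts(*sources):
    """Merge contacts gathered from several channels (email, other social media, etc.)
    into one de-duplicated list for the audio-based platform (hypothetical sketch)."""
    merged = {}
    for source in sources:
        for contact in source:
            merged.setdefault(contact.lower(), contact)   # de-duplicate case-insensitively
    return sorted(merged.values())


email_contacts = ["ann@example.com", "Ben@example.com"]
social_contacts = ["ben@example.com", "cleo@example.com"]
print(aggregate_contacts(email_contacts, social_contacts))
```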
In addition, artificial intelligence may be used to generate an audio message around a user so that the user does not feel alone and to make the social media platform a more inclusive and positive experience for all users. The audio message may relate to the location of the user, the environment of the user, or otherwise be of interest to the user. For example, the audio message may contain good news or interesting information. In addition, a user may create a bubble which is directed to an artificial intelligence system, and the response may then become part of the bubble.
There are numerous different types of messages. For example, family members may leave messages for one another. Thus, for example, a mother could create a bubble stating, “There is some lasagna in the fridge; just heat it up. Mom.” The mother could share this directly with one or more children. Alternatively, the mother could attach this to a location at or near the refrigerator or within the kitchen so that the bubble would be available at an appropriate time when the children are in the kitchen. This is an example of an indoor bubble.
Similarly, a bubble may be attached to an outdoor location. For example, a bubble may be attached to a location that states, “From here you have the best view of the Bay Bridge.” Thus, when individuals who are open to receiving messages are at the location, they can hear the message of this bubble, which promotes active engagement with their environment.
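Playing a location-stuck bubble implies some proximity check between the user and the bubble's geolocation. A minimal sketch using the haversine great-circle distance is given below; the 50-meter radius and the coordinate values are assumptions for illustration only.

```python
import math


def within_range(user, bubble, radius_m=50.0):
    """Return True when a user open to receiving messages is close enough to a
    location-stuck bubble to hear it (haversine distance, assumed 50 m radius)."""
    lat1, lon1 = map(math.radians, user)
    lat2, lon2 = map(math.radians, bubble)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    distance_m = 2 * 6371000 * math.asin(math.sqrt(a))
    return distance_m <= radius_m


bay_bridge_view = (37.7989, -122.3770)           # hypothetical stuck-bubble location
print(within_range((37.7990, -122.3771), bay_bridge_view))   # True: play the bubble
```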
A bubble may also be epic in nature. For example, a bubble may be sent which states, “Around Cape Canaveral you can find a lot of bubbles from Neil Armstrong bubbling about Apollo 11.” Of course, bubbles may be otherwise epic in nature depending upon their content or environment.
A bubble may also be a service bubble. For example, a restaurant guidebook, online review site, or other service provider may leave a message associated with a restaurant such as, “In this lovely restaurant everything is good. Try the red Thai curry, their most famous dish.” Thus, a person may have information available to them in the form of a bubble at the place and time of need.
In addition, messages may be created with artificial intelligence, or messages which are directed to artificial intelligence may be created. For example, a message of “I wonder where the next good Indian place is?” or “Where is the next good Indian place?” may be created, and these messages may be directed towards an artificial intelligence system for responding, such as by selecting a pre-existing bubble (e.g., a service bubble previously created) or creating a response to add to a bubble.
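The routing decision described above, reusing an existing service bubble when one matches and otherwise generating a reply, could take many forms; the keyword-overlap matching below is only a hypothetical stand-in for an actual artificial intelligence system.

```python
def answer_query(query, service_bubbles):
    """Route a query bubble: select a matching pre-existing service bubble,
    otherwise fall back to a generated response (hypothetical keyword matching)."""
    words = set(query.lower().split())
    for bubble in service_bubbles:
        if words & set(bubble["tags"]):
            return bubble["message"]                       # pre-existing service bubble
    return "Sorry, no recommendation was found nearby."    # generated fallback reply


service_bubbles = [
    {"tags": {"indian", "restaurant"},
     "message": "This lovely Indian restaurant two blocks away has great reviews."},
]
print(answer_query("Where is the next good Indian place?", service_bubbles))
```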
A group bubble is another type of bubble that may be created. For example, a bubble may be sent to or shared with a plurality of recipients that states, “Let's create a Saturday barbecue bubble!” Once created, messaging associated with that event may be associated with the bubble.
A live bubble may be sent, such as “Let's link with BUBBLE!”, thereby inviting one or more additional individuals to communicate with one another using the platform.
A “spread bubble” such as “Random Jokes & Wisdom” may be sent. The spread bubble may be used to collect messages from any number of different users, and such a bubble will grow over time.
A “love bubble” such as “seeking a good night, male 28” may be sent. This type of bubble may be sent to recipients open to receiving this type of bubble.
It is to be understood that the above examples of different types of bubbles are not limiting and that any number of different types of bubbles are contemplated based on the content of the bubble, the manner in which the bubble is to be disseminated, the location of the bubble, the recipient(s) of the bubble, the source of the bubble, or otherwise.
It is further contemplated that bubbles may be popped. Thus, unlike certain forms of social media, the contents of bubbles need not be public or permanent but may be fleeting and ephemeral. Once popped, a bubble may simply cease to exist, no longer stored anywhere or accessible by anyone. Alternatively, once popped, the bubble may be temporarily stored and marked for deletion according to a bubble retention policy or based on user preferences, with the possibility of restoring the bubble, such as if the bubble was inadvertently or prematurely popped. It is contemplated that different rules may be implemented regarding who has permission or authority to pop a bubble. For example, anyone who creates the bubble may pop it, anyone who receives the bubble may pop it, only the creator may pop it, only the receiver may pop it, if it is a group bubble only the creator may pop it, the bubble may automatically pop after a set time period or at a certain time and date or after a certain number of listens, or any number of different rules may be applied. Note that the rules may be based in whole or in part on user preferences and that different rules may apply for different types of bubbles. For example, a service bubble may pop after it has been listened to a set number of times if a service provider only pays for a set number of listens. It is also contemplated that metadata about a bubble may be maintained after a bubble has been popped, such as information about who was a part of the bubble or other information which does not include the actual audio message. It is also contemplated that some types of bubbles may be effectively permanent in nature where desired.
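A few of the example popping rules above can be expressed as a small policy function. This is a minimal sketch under assumed field names (`expires_at`, `max_listens`, and so on); any subset or combination of the rules described above could be implemented instead.

```python
import time


def may_pop(bubble, requester, now=None):
    """Evaluate a few of the example popping rules (hypothetical policy sketch)."""
    now = now or time.time()
    if bubble.get("expires_at") and now >= bubble["expires_at"]:
        return True                                    # automatic pop after a set time
    if bubble.get("max_listens") and bubble["listens"] >= bubble["max_listens"]:
        return True                                    # pop after a set number of listens
    if bubble["type"] == "group":
        return requester == bubble["creator"]          # group bubble: only the creator may pop
    return requester == bubble["creator"] or requester in bubble["recipients"]


bubble = {"type": "direct", "creator": "mom", "recipients": ["kid"],
          "listens": 0, "max_listens": None, "expires_at": None}
print(may_pop(bubble, "kid"))   # True: a recipient may pop a direct bubble under this rule set
```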
The ability to pop a bubble may be advantageous in a number of different situations and for a variety of different reasons. For example, the ability to pop a bubble allows for interactions between individuals to be temporary in nature rather than permanent and potentially public. Therefore, the use of bubbles may serve to better emulate real life person-to-person conversations and be more authentic. This may also be advantageous for customer engagement, such as interactions between an individual and customer support or other types of business interactions where it is helpful to promote authenticity and allow for conversational interaction.
The servers may provide for any number of different functions. For example, the servers may be configured to filter the messages from the bubble. Thus, if audio messages are negative or inappropriate based on policies of the social media platform, or according to user preferences, the audio messages may be deleted or returned to the sender with the sender being notified that the message will not be permitted within a bubble. Such content filtering allows for a social media platform which promotes positive communications.
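A minimal sketch of this server-side filtering follows. The blocked-term list and the transcript-based check are assumptions for illustration; the platform could equally filter on audio features, machine-learned classifiers, or user preferences.

```python
BLOCKED_TERMS = {"hateful", "abusive"}   # placeholder policy terms, not an actual policy


def filter_message(transcript, sender, notify):
    """Delete or return an inappropriate message and notify the sender that the
    message will not be permitted within a bubble (hypothetical sketch)."""
    if any(term in transcript.lower() for term in BLOCKED_TERMS):
        notify(sender, "Your message was not permitted within the bubble.")
        return None          # message deleted / returned to sender
    return transcript        # message passes the platform's content policy


print(filter_message("There is some lasagna in the fridge", "mom", print))
```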
Note that the wearable devices 10A, 10B may access the network 16 in any number of ways depending upon their hardware configuration. For example, the wearable devices may include Bluetooth or Bluetooth Low Energy (BLE) wireless transceivers which communicate with mobile devices (not shown) such as phones or tablets which then access the network through a cellular transceiver or Wi-Fi transceiver. Alternatively, the wearable devices may include a cellular transceiver or Wi-Fi transceiver, or the network may be otherwise accessed.
Although the wearable devices shown are in the form of earpieces, which may be ear bud earpieces, over-the-ear earphones, or other form factors, other types of wearable devices may be used such as articles of clothing, jewelry items, watches, eyeglasses, or other wearable devices. In addition, note that no screen displays are required for the wearable devices because the social media network is an audio messaging social media platform. Thus, the audio messages may be sent, received, stored, shared, deleted, or other actions may be taken without use of a screen display by a user.
As shown in
If a user is not available to hear the BUBBLE, the BUBBLE may be stored until the user can access it and listen to it. A wearable device allows the user to operate in a hands-free situation. Since the messages sent are spoken by the user instead of being typed, the user has the option to send, receive, and manage the BUBBLES in more convenient situations, such as while walking, driving, or otherwise actively engaged in other activities.
It is also to be understood that the audio-based social media platform may maintain databases of its own users, or it may interact with another social network to reach more users. Other social networks include, but are not limited to, FACEBOOK, FACEBOOK MESSENGER, GOOGLE, GOOGLE+, SNAPCHAT, LINKEDIN, YOUTUBE, VIBER, TUMBLR, TWITTER, BAIDU TIEBA, PINTEREST, INSTAGRAM, WHATSAPP, or others. The audio-based social media platform may interact with other social media services in various manners. For example, a user's contacts on various different social media platforms may be integrated into a list of contacts on the audio-based social media platform. In addition, various other social media platforms may integrate with the audio-based social media platform in one manner or another. For example, a user may select to have other types of social media messages converted into bubbles.
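Converting another platform's message into a bubble could be done in several ways; the sketch below assumes a text-to-speech step, which is an illustrative assumption and not a statement of the platform's actual mechanism.

```python
def convert_post_to_bubble(post_text, synthesize_speech):
    """Hypothetical conversion of a text post from another social network into an
    audio bubble; text-to-speech is assumed here, but other mechanisms could be used."""
    audio = synthesize_speech(post_text)          # any text-to-speech backend could be used
    return {"audio": audio, "origin": "imported", "transcript": post_text}


fake_tts = lambda text: text.encode("utf-8")      # stand-in for a real TTS engine
bubble = convert_post_to_bubble("Great sunset at the pier tonight!", fake_tts)
print(len(bubble["audio"]), bubble["origin"])
```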
Preferably each BUBBLE contains good news or interesting information. Because of filtering that may be performed using artificial intelligence to delete hateful or otherwise negative or inappropriate messages, users will be encouraged to leave messages which feel playful and good.
The invention is not to be limited to the particular embodiments described herein. In particular, the invention contemplates numerous variations in the server platform used, the manner in which messages are conveyed, the manner in which messages are created, the manner in which messages are delivered, and other variations. The foregoing description has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. It is contemplated that other alternatives or exemplary aspects are considered included in the invention. The description is merely an example of embodiments, processes, or methods of the invention. It is understood that any other modifications, substitutions, and/or additions can be made, which are within the intended spirit and scope of the invention.
This application is a continuation of U.S. Non-provisional patent application Ser. No. 15/716,204, filed on Sep. 26, 2017, which claims priority to U.S. Provisional Patent Application 62/400,391, filed on Sep. 27, 2016, both entitled “Audio-based social media platform”, hereby incorporated by reference in their entirety.