I. Field of the Disclosure
The illustrative embodiments relate to audio communications within virtual reality systems. More specifically, but not exclusively, the illustrative embodiments relate to interactions between one or more users, wireless earpieces, and virtual reality systems.
II. Description of the Art
Virtual reality technology is growing nearly exponentially. This growth is fostered by the decreasing size of microprocessors, circuit boards, projectors, displays, chips, and other components. Virtual reality systems, such as headsets, are decreasing in size and increasing in functionality, but are still bulky and heavy. The additional mass of headphone units worn by a user may further unbalance motion of the user's head when utilizing a virtual reality system. Tracking the location, position, motion, acceleration, and orientation of the user, such as the user's head movements relative to the virtual reality environment, may also be difficult. In addition, some existing systems and devices, such as external microphones, have significant latency when sending and receiving audio communications.
Therefore, it is a primary object, feature, or advantage to improve over the state of the art.
It is a further object, feature, or advantage to provide ear-worn microphones which deliver voice inputs with reduced latency.
It is a still further object, feature, or advantage
One embodiment provides a system, method, and wireless earpieces for communicating with a virtual reality headset. A position and an orientation of a head of a user are detected utilizing at least wireless earpieces. Audio content is received. The audio content is enhanced utilizing the position and the orientation of the head of the user. The audio content is immediately delivered to the user. Another embodiment provides wireless earpieces. The wireless earpieces include a processor for executing a set of instructions and a memory for storing the set of instructions. The set of instructions are executed to perform the method described above.
Yet another embodiment provides a virtual reality system. The virtual reality system includes a virtual reality headset for displaying a virtual reality environment to a user. The virtual reality system also includes wireless earpieces that include sensors that detect a position and an orientation of a head of a user and two or more microphones, including an ear-bone microphone and an external microphone, that receive first audio content from the user. The wireless earpieces receive second audio content from the virtual reality headset, enhance the second audio content utilizing the position and the orientation of the head of the user, and play the second audio content to the user.
Illustrated embodiments of the present invention are described in detail below with reference to the attached drawing figures, which are incorporated by reference herein, and where:
The illustrative embodiments provide a system and method for audio communications between wireless earpieces and a virtual reality system, such as a virtual reality headset. The virtual reality headset may communicate with the wireless earpieces wirelessly or utilizing a wired connection. For example, ports and interfaces, such as micro-USB connectors may be utilized to connect the wireless earpieces to the virtual reality system. In another example, a wireless protocol, standard, connection, or link, such as Bluetooth or BLE may be utilized.
The illustrative embodiments may be utilized for entertainment, scientific, educational, or commercial applications. Virtual reality headsets, such as those produced by Google, HTC, Samsung, Oculus, Sony, Microsoft, and so forth, may present any number of two-dimensional or three-dimensional visualizations to the user. The illustrative embodiments minimize the mass problems associated with bulky over-ear headphones or other audio systems. As a result, the angular momentum associated with the user's head is not increased significantly, decreasing the effects of torque and the neck and head strain that may be associated with such virtual reality systems.
In addition, the user may not be required to utilize microphones that sit on a desk or are otherwise externally positioned from the user, positions which decrease the effectiveness of the microphone. For example, many remotely positioned microphones do not have any additional sensors, and their remote position may increase latency and delay when sensing various audio inputs from the user, environment, or so forth. Such a placement of the microphones may also introduce noise issues. According to one aspect, a virtual reality system is provided to incorporate position, orientation, movement, and acceleration (e.g., angular, linear, etc.) as part of the user input, responses, and feedback. As a result, the audio and visual information presented to the user may be adjusted in response to audio input received from the user as well as the corresponding user information, including position, orientation, movement, and acceleration.
The wireless earpieces may include any number of sensors that may communicate with the sensors, systems, and components of the virtual reality headset to further enhance the user's experience. In one embodiment, the sensors of the wireless earpieces may include accelerometers, gyroscopes, magnetometers, optical sensors, pulse oximeters, GPS chips, thermometers, and so forth. The data acquired by the sensors may be utilized to determine the user's condition, characteristics, position, orientation, movement, acceleration, location, or so forth. As a result, the data may be utilized to enhance the user's experience within the virtual reality environment. In addition, the sensors provide data that enhances the sensor measurements of the virtual reality headset. The precise determination of the user's location, orientation, movement, and position may also be utilized to provide more accurate three-dimensional spatial sound imaging for the user. For example, allowable or communicated content, actions, and processes implemented by the virtual reality headset may vary based on the applicable user information. In addition, the sensors may be utilized to sense any number of biometric readings or information, such as heart rate, respiratory rate, blood or skin physiology, or other biometric data. This information may be utilized to determine whether the user is safe in the virtual reality environment, enjoying a game, or stressed or fatigued. Besides being integrated with the virtual reality headset, the wireless earpieces may be utilized to make and receive communications (e.g., telephone calls, transcribed text messages, audio/tactile alerts, etc.), play music, filter or block sound, amplify sounds, or so forth.
The wireless earpieces may be utilized for daily activities, such as gaming, business communications, exercising, phone calls, travel, and so forth. The wireless earpieces may then also serve a dual purpose by integrating as the audio portion of a virtual reality system. As a result, more expensive audio components are not required, reducing the cost and weight of the virtual reality system. The user may be relieved of significant weight and strain by utilizing the reduced footprint of the wireless earpieces. In addition, the virtual reality system may include a stand-alone power source or battery that may be utilized to power the wireless earpieces on the fly. The voice and audio inputs sensed by the wireless earpieces are processed with minimal latency due to the positions of the microphones at the lateral and medial segments of the wireless earpieces as positioned within the ears of the user. The microphone and other sensor inputs provide enhanced input modality to the programs and processes implemented by the wireless earpieces.
The description may also refer to components and functionality of each of the wireless earpieces 102 collectively or individually. In one embodiment, the wireless earpieces 102 include a left earpiece and a right earpiece configured to fit into ears of a user 101. The wireless earpieces 102 are shown separately from their positioning within the ears of the user 101 for purposes of simplicity.
The wireless earpieces 102 are configured to play audio associated with visual content presented by the virtual reality headset 110. The wireless earpieces 102 may be configured to play music or audio, receive and make phone calls or other communications, determine ambient environmental readings (e.g., temperature, altitude, location, speed, heading, etc.), read user biometrics and actions (e.g., heart rate, motion, sleep, blood oxygenation, calories burned, etc.), and communicate content audibly, tactilely, and visually.
The wireless earpieces 102 may include interchangeable parts that may be adapted to fit the needs of the user 101. For example, sleeves of the wireless earpieces 102 that fit into the ear of the user 101 may be interchangeable to find a suitable shape and configuration. The wireless earpieces 102 may include a number of sensors and input devices including, but not limited to, pulse oximeters, microphones, pulse rate monitors, accelerometers, gyroscopes, light sensors, global positioning sensors, and so forth. Sensors of the virtual reality headset 110 may also be configured to wirelessly communicate with the wireless earpieces 102.
The virtual reality headset 110 replicates or displays an environment simulating physical presence in places in the real world or imagined worlds and lets the user 101 interact in that environment. Virtual reality may also be referred to as immersive multimedia and may be utilized to create sensory experiences which may include sight, hearing, touch, smell, and taste. The virtual reality headset 110 may be powered by a power plug, battery, or other connection (e.g., USB connection to a computing or gaming device). The virtual reality headset 110 may also communicate (send and receive) data utilizing a wired or wireless connection to any number of computing, communications, or entertainment devices.
The visor 112 may be utilized to display visual and graphical information to the user 101. The visor 112 may include one or more displays (e.g., liquid crystal displays, light emitting diode (LED) displays, organic LED, etc.) or projectors (direct, indirect, or refractive) for displaying information to the eyes of the user 101. Although not shown, the virtual reality headset 110 may also include touch screens, tactile interfaces, vibration components, smell interfaces, or tasting interfaces for enhancing the experience of the user 101. The size and shape of the virtual reality headset 110, visor 112, and the strap 114 may vary by make, model, manufacturer as well as user configuration of the virtual reality headset 110, such as those produced by Google, HTC, Sony, Oculus, Epson, Samsung, LG, Microsoft, Durovis, Valve, Avegant, and others. In one embodiment, the visor 112 may be transparent or see-through, allowing the user to interact and function in the real world while still communicating virtual information. For example, the wireless earpieces 102, visor 112, and virtual reality headset 110 may be configured for augmented reality functionality, processes, displays, and so forth as are herein described.
The strap 114 extends between sides of the visor 112 and is configured to secure the virtual reality headset 110 to the head of the user 101. The strap 114 may be formed of any number of materials, such as cotton, polyester, nylon, rubber, plastic, or so forth. The strap 114 may include buckles, loops, or other adjustment mechanisms for fitting the virtual reality headset 110 to the head of the user 101. The strap 114 may be flexible to comfortably fit to the head of the user 101.
The wireless earpieces 102 may communicate utilizing any number of wireless connections, standards, or protocols (e.g., near field communications, Bluetooth, Wi-Fi, ANT+, etc.). The virtual reality headset 110 may locally or remotely implement and utilize any number of operating systems, kernels, instructions, or applications that may make use of the sensor data measured by the wireless earpieces 102. For example, the virtual reality headset 110 may utilize any number of Android, iOS, Windows, open platform, or other systems. Similarly, the virtual reality headset 110 may include a number of applications that utilize the biometric data from the wireless earpieces 102 to display applicable information and data. For example, the biometric information (including high, low, average, or other values) may be processed by the wireless earpieces 102 or the virtual reality headset 110 to display heart rate, blood oxygenation, altitude, speed, distance traveled, calories burned, or other applicable information.
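By way of a non-limiting illustration, the following sketch shows how such high, low, and average biometric values might be summarized for display; the container and field names are hypothetical and do not appear in this disclosure.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class BiometricLog:
    """Hypothetical container for readings relayed by the wireless earpieces."""
    heart_rate_bpm: list = field(default_factory=list)
    spo2_percent: list = field(default_factory=list)

    def summary(self, name: str) -> dict:
        samples = getattr(self, name)
        if not samples:
            return {}
        # High, low, and average values as described above.
        return {"high": max(samples), "low": min(samples), "avg": mean(samples)}

log = BiometricLog()
log.heart_rate_bpm.extend([72, 75, 71, 80])
print(log.summary("heart_rate_bpm"))  # {'high': 80, 'low': 71, 'avg': 74.5}
```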
In one embodiment, the virtual reality headset 110 may include any number of sensors (e.g., similar to those described with regard to the wireless earpieces 102) that may be utilized to augment the sensor readings of the wireless earpieces 102. For example, a microphone of the virtual reality headset 110 may determine an amount and type of ambient noise. The noise may be analyzed and utilized to filter the sensor readings made by the wireless earpieces 102 to maximize the accuracy and relevance of the sensor measurements of the wireless earpieces 102. Filtering, tuning, and adaptation of the sensor measurements may be made for signal noise, electronic noise, or acoustic noise, all of which are applicable in the system 100. The virtual reality headset 110 may also include accelerometers, gyroscopes, magnetometers, radar sensors, and so forth that determine the location, position, and orientation of the user 101 within the system 100, which may represent a number of indoor or outdoor environments. Sensor measurements made by the wireless earpieces 102, the virtual reality headset 110, or sensor devices of the user 101 may be communicated with one another in the system 100.
With respect to the wireless earpieces 102, sensor measurements may refer to measurements made by one or both of the wireless earpieces 102. For example, the wireless earpieces 102 may determine that the sensor signal for the pulse oximeter of the right wireless earpiece is very noisy and, as a result, may utilize the sensor signal from the pulse oximeter of the left wireless earpiece as the primary measurement. The wireless earpieces 102 may also switch back and forth between pulse oximeters of the left earpiece 106 and the right earpiece 104 in response to varying noise for both of the wireless earpieces. As a result, the clearest sensor signal may be utilized at any given time. In one embodiment, the wireless earpieces 102 may switch sensor measurements in response to the sensor measurements exceeding or dropping below a specified threshold.
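A minimal sketch of the threshold-based switching described above follows; the noise estimator and the threshold value are assumptions chosen for illustration rather than values specified by this disclosure.

```python
import numpy as np

NOISE_THRESHOLD = 0.15  # assumed noise-power threshold; not specified herein

def estimate_noise(signal: np.ndarray) -> float:
    """Crude noise estimate: mean power of the signal's first difference."""
    return float(np.mean(np.diff(signal) ** 2))

def select_primary(left: np.ndarray, right: np.ndarray) -> str:
    """Prefer the quieter pulse-oximeter channel; flag both if too noisy."""
    noise = {"left": estimate_noise(left), "right": estimate_noise(right)}
    primary = min(noise, key=noise.get)
    return primary if noise[primary] < NOISE_THRESHOLD else "unreliable"

t = np.linspace(0, 1, 200)
clean = np.sin(2 * np.pi * 1.2 * t)                # ~72 bpm pulse waveform
noisy = clean + np.random.normal(0, 0.5, t.shape)  # motion-corrupted channel
print(select_primary(left=clean, right=noisy))     # left
```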
The user 101 may also be wearing or carrying any number of sensor-enabled devices, such as heart rate monitors, pacemakers, smart glasses, smart watches or bracelets (e.g., Apple watch, Fitbit, etc.), or other sensory devices that may be worn, attached to, or integrated with the user 101. The data and information from the external sensor devices may be communicated to the wireless earpieces 102. In another embodiment, the data and information from the external sensor devices may be utilized to perform additional processing of the information sent from the wireless earpieces 102 to the virtual reality headset 110.
The sensors of the wireless earpieces 102 may also be positioned at enantiomeric locations. For example, a number of colored light emitting diodes may be positioned to provide variable data and information, such as heart rate, respiratory rate, and so forth. The data gathered by the LED arrays may be sampled and used alone or in aggregate with other sensors. As a result, sensor readings may be enhanced and strengthened with additional data.
As shown, the wireless earpieces 202 may be physically or wirelessly linked to the virtual reality headset 204. User input and commands may be received from either the wireless earpieces 202 or the virtual reality headset 204 for implementation on either of the devices of the virtual reality system 200 (or other externally connected devices). As previously noted, the wireless earpieces 202 may be referred to or described herein as a pair (wireless earpieces) or singularly (wireless earpiece). The description may also refer to components and functionality of each of the wireless earpieces 202 collectively or individually.
The wireless earpieces 202 play the audio corresponding to the virtual reality content displayed by the virtual reality headset 204. In one embodiment, the wireless earpieces 202 may play the sounds and audio received from the virtual reality headset 204 based on the sensed location, position, orientation, speed, and acceleration of the user as measured by the sensors 217. For example, if the user's head is inclined as if he is riding or simulating riding a bicycle, the corresponding audio may be played to simulate actually riding a bike and the way sounds and noises may strike the ears of the user. In another example, the audio and sounds may be played as if the user's head were turned a particular direction. For example, the sounds and audio may be more prominent in the left ear rather than the right ear based on the position and orientation of the user as sensed by the wireless earpieces 202. In addition, the wireless earpieces 202 may provide additional biometric and user data that may be further utilized by the virtual reality headset 204 or connected computing, entertainment, or communications devices.
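As one non-limiting illustration of how audio prominence might shift between the ears with head orientation, the sketch below applies a constant-power pan law; the pan law and angle conventions are assumptions rather than requirements of this disclosure.

```python
import math

def pan_gains(source_azimuth_deg: float, head_yaw_deg: float) -> tuple:
    """Constant-power pan: return (left_gain, right_gain) for a sound source,
    shifting prominence between ears as the head turns."""
    relative = math.radians(source_azimuth_deg - head_yaw_deg)
    # Map the relative azimuth onto a 0..pi/2 pan angle.
    pan = (math.sin(relative) + 1.0) * math.pi / 4.0
    return math.cos(pan), math.sin(pan)

left, right = pan_gains(source_azimuth_deg=-40.0, head_yaw_deg=0.0)
print(f"left={left:.2f} right={right:.2f}")  # louder in the left ear
```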
In some embodiments, the virtual reality headset 204 may act as a logging tool for receiving information, data, or measurements made by the wireless earpieces 202. For example, the virtual reality headset 204 may be worn by the user to download data from the wireless earpieces in real-time. As a result, the virtual reality headset 204 may be utilized to store, display, and synchronize data to the wireless earpieces 202. For example, the virtual reality headset 204 may display pulse, oxygenation, distance, calories burned, and so forth as measured by the wireless earpieces 202. The wireless earpieces 202 and the virtual reality headset 204 may have any number of electrical configurations, shapes, and colors and may include various circuitry, connections, and other components.
In one embodiment, the wireless earpieces 202 may include a battery 208, a logic engine or processor 210, a memory 212, a user interface 214, a physical interface 215, a transceiver 216, and sensors 217. Similar components within the virtual reality headset 204 may be similarly structured to provide analogous functionality, features, and processes. Likewise, the virtual reality headset 204 may have a battery 218, a memory 220, a user interface 222, sensors 224, a logic engine 226, a display 228, and a transceiver 230. The battery 208 is a power storage device configured to power the wireless earpieces 202. Likewise, the battery 218 is a power storage device configured to power the virtual reality headset 204. The battery 218 may represent a converter, inverter, or interface for receiving power and/or communications from a virtual reality processing system (not shown). In other embodiments, the batteries 208 and 218 may represent a fuel cell, thermal electric generator, piezoelectric charger, solar charger, ultra-capacitor, or other existing or developing power storage technologies.
The logic engine 210 is the logic that controls the operation and functionality of the wireless earpieces 202. The logic engine 210 may include one or more processors as well as circuitry, chips, and other digital logic. The logic engine 210 may also include programs, scripts, and instructions that may be implemented to operate the logic engine 210. The logic engine 210 may represent hardware, software, firmware, or any combination thereof, and may also represent an application specific integrated circuit (ASIC) or field programmable gate array (FPGA). The logic engine 210 may utilize information from the sensors 217 to determine the biometric information, data, and readings of the user. The logic engine 210 may utilize this information and other criteria to inform the user of the biometrics (e.g., audibly, through an application of a connected device, tactilely, etc.). The logic engine 210 may also determine the location, orientation, position, speed, and acceleration of the user utilizing the sensors 217. For example, the sensors 217 may include accelerometers, gyroscopes, optical sensors, or miniaturized radar that may be utilized to determine associated user information. The logic engine 210 may also control how audio information is both sent and received from the transceiver 216 of the wireless earpieces 202.
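One conceivable way for the logic engine 210 to fuse accelerometer and gyroscope readings into a head-orientation estimate is a complementary filter, sketched below; the blending weight is an assumed value and not part of this disclosure.

```python
ALPHA = 0.98  # assumed gyroscope/accelerometer blending weight

def update_pitch(pitch_deg: float, gyro_rate_dps: float,
                 accel_pitch_deg: float, dt: float) -> float:
    """Integrate the gyroscope for fast response and pull gently toward the
    accelerometer-derived angle to cancel long-term drift."""
    return ALPHA * (pitch_deg + gyro_rate_dps * dt) + (1.0 - ALPHA) * accel_pitch_deg

# Example: one 10 ms update while the head pitches downward.
pitch = update_pitch(pitch_deg=0.0, gyro_rate_dps=-30.0, accel_pitch_deg=-0.4, dt=0.01)
print(f"estimated pitch: {pitch:.2f} degrees")  # ~-0.30 degrees
```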
The logic engine 210 may also process user input to determine commands implemented by the wireless earpieces 202 or sent to the virtual reality headset 204 through the transceiver 216. The user input may be sensed by the sensors 217 to determine specific actions to be taken. In one embodiment, the logic engine 210 may implement a macro allowing the user to associate user input, as sensed by the sensors 217, with commands.
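A macro of the kind described might be represented as a simple lookup table; the gesture names and command strings below are placeholders, not identifiers from this disclosure.

```python
from typing import Optional

# Hypothetical macro table mapping sensed inputs to commands.
MACROS = {
    ("double_tap", "right"): "play_pause",
    ("swipe_forward", "right"): "next_track",
    ("nod", None): "confirm_selection",
}

def dispatch(gesture: str, side: Optional[str]) -> Optional[str]:
    """Resolve a sensed gesture to a command for local execution or for
    transmission to the virtual reality headset 204."""
    return MACROS.get((gesture, side)) or MACROS.get((gesture, None))

print(dispatch("double_tap", "right"))  # play_pause
print(dispatch("nod", "left"))          # confirm_selection (side-agnostic macro)
```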
In one embodiment, a processor included in the logic engine 210 is circuitry or logic enabled to control execution of a set of instructions. The processor may be one or more microprocessors, digital signal processors, application-specific integrated circuits (ASIC), central processing units, or other devices suitable for controlling an electronic device, including one or more hardware and software elements, executing software, instructions, programs, and applications, converting and processing signals and information, and performing other related tasks. The processor may be a single chip or integrated with other computing or communications elements of the wireless earpieces 202.
The memory 212 is a hardware element, device, or recording media configured to store data for subsequent retrieval or access at a later time. The memory 212 may be or include static and/or dynamic memory. The memory 212 may include one or more of a hard disk, random access memory, cache, removable media drive, mass storage, or configuration suitable as storage for data, instructions, and information. In one embodiment, the memory 212 and the logic engine 210 may be integrated. The memory may use any type of volatile or non-volatile storage techniques and mediums. The memory 212 may store information related to the status of a user, the wireless earpieces 202, the virtual reality headset 204, and other peripherals, such as a wireless device, smart case for the wireless earpieces 202, smart watch, and so forth. In one embodiment, the memory 212 may store instructions or programs for controlling the user interface 214, including one or more LEDs or other light emitting components, speakers, tactile generators (e.g., a vibrator), and so forth. The memory 212 may also store the user input information associated with each command.
The transceiver 216 is a component comprising both a transmitter and receiver which may be combined and share common circuitry on a single housing. The transceiver 216 may communicate utilizing Bluetooth, Wi-Fi, ZigBee, Ant+, near field communications, wireless USB, infrared, mobile body area networks, ultra-wideband communications, cellular (e.g., 3G, 4G, 5G, PCS, GSM, etc.) or other suitable radio frequency standards, networks, protocols, or communications. The transceiver 216 may also be a dual or hybrid transceiver that supports a number of different communications. For example, the transceiver 216 may communicate with the virtual reality headset 204 or other systems utilizing wired interfaces (e.g., wires, traces, etc.), NFC or Bluetooth communications.
The components of the wireless earpieces 202 (or of the virtual reality system 200) may be electrically connected utilizing any number of wires, contact points, leads, busses, wireless interfaces, or so forth. In addition, the wireless earpieces 202 may include any number of computing and communications components, devices, or elements, which may include busses, motherboards, circuits, chips, sensors, ports, interfaces, cards, converters, adapters, connections, transceivers, displays, antennas, and other similar components. The physical interface 215 is a hardware interface of the wireless earpieces 202 for connecting and communicating with the virtual reality headset 204 or other electrical components.
The physical interface 215 may include any number of pins, arms, or connectors for electrically interfacing with the contacts or other interface components of external devices or other charging or synchronization devices. For example, the physical interface 215 may be a micro USB port. In one embodiment, the physical interface 215 is a magnetic interface that automatically couples to contacts or an interface of the virtual reality headset 204. In another embodiment, the physical interface 215 may include a wireless inductor for charging the wireless earpieces 202 without a physical connection to a charging device.
The user interface 214 is a hardware interface for receiving commands, instructions, or input through the touch (haptics) of the user, voice commands, or predefined motions. The user interface 214 may be utilized to control the other functions of the wireless earpieces 202. Although not shown, the one or more speakers of the user interface 214 may include a number of speaker components (e.g., signal generators, amplifiers, drivers, and other circuitry) configured to generate sound waves at distinct frequency ranges (e.g., bass, woofer, tweeter, midrange, etc.) or to vibrate at specified frequencies to be perceived by the user as sound waves. The speakers may also generate sound waves to provide three-dimensional stereo sound to the user. All or portions of the speakers may be activated or directed within the wireless earpieces 202 to generate various effects. The speakers may quickly respond to content sent from the virtual reality headset 204 or other portions of the virtual reality system 200 to add to the realistic effects and processing experienced by the user. The user interface 214 may include an LED array, one or more touch-sensitive buttons, screens, portions, or sensors, a miniature screen or display, or other input/output components. The user interface 214 may be controlled by the user or based on commands received from the virtual reality headset 204 or a linked wireless device. The user interface 214 may also include traditional software interfaces, such as a graphical user interface or applications, that may be executed by the logic engine 210 for communication by the user interface 214. For example, the speakers may simulate users, devices, or sounds spatially positioned relative to the user wearing the wireless earpieces 202. As a result, a person or animal that appears to be forward and to the left of the user will also sound like they are so positioned based on sounds received and played by the wireless earpieces 202 relative to the communicated media content.
In one embodiment, the user may provide feedback by tapping the user interface 214 once, twice, three times, or any number of times. Similarly, a swiping motion may be utilized across or in front of the user interface 214 (e.g., the exterior surface of the wireless earpieces 202, or proximate the exterior surface for optical sensors) to implement a predefined action. Swiping motions in any number of directions may be associated with specific activities, such as play music, pause, fast forward, rewind, or activate a digital assistant (e.g., Siri, Cortana, a smart assistant, etc.). The swiping motions may also be utilized to control actions and functionality of the virtual reality headset 204 or other external devices (e.g., smart television, camera array, smart watch, etc.). The user may also provide user input by moving her head in a particular direction or motion or based on the user's position or location. For example, the user may utilize voice commands, head gestures, or touch commands to change the content displayed by the virtual reality headset 204. The voice and audio input from the user and received from the virtual reality system 200 may be enhanced to accurately determine the position, location, orientation, motion, and acceleration of the user or the user's head within a three-dimensional space. As a result, audio or sound effects, such as loudness, masking, pitch (including changes, such as the Doppler effect), timbre, and localization, heard and perceived by the user may be adjusted accordingly.
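The sketch below illustrates one way tap events might be grouped into a single gesture and mapped to predefined actions; the timing window and the action mapping are assumed values for illustration only.

```python
TAP_WINDOW_S = 0.4  # assumed maximum gap between taps within one gesture

def count_taps(timestamps: list) -> int:
    """Count consecutive taps whose gaps stay inside the grouping window."""
    if not timestamps:
        return 0
    count = 1
    for prev, cur in zip(timestamps, timestamps[1:]):
        if cur - prev <= TAP_WINDOW_S:
            count += 1
        else:
            break
    return count

ACTIONS = {1: "pause", 2: "play", 3: "activate_assistant"}  # illustrative mapping
print(ACTIONS.get(count_taps([0.00, 0.21, 0.38])))  # activate_assistant
```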
The sensors 217 may include pulse oximeters, accelerometers, gyroscopes, magnetometers, inertial sensors, photo detectors, microphones (e.g., ear-bone or bone conduction microphones, exterior microphones, etc.), miniature cameras, and other similar instruments for detecting location, orientation, motion, and so forth. The sensors 217 may also be utilized to gather optical images, data, and measurements and determine an acoustic noise level, electronic noise in the environment, ambient conditions, and so forth. The sensors 217 may provide measurements or data that may be utilized to filter or select images for display by the virtual reality headset 204. For example, motion or sound detected on the left side of the user may be utilized to command the virtual reality headset 204 to display camera images from the left side of the user. Motion or sound is one example; however, any number of triggers may be utilized to send commands to the virtual reality headset 204.
The microphones of the sensors 217 may immediately receive and process audio signals and sounds from the user thereby minimizing latency and delay. As a result, the virtual reality system 200 may perform effectively for real-time scenarios, simulations, games, communications, or so forth. The microphones may sense verbal feedback from the user as well as audio input associated with the user and environment (e.g., foot falls, breaths, grunts, wind, etc.) to provide relevant information to the virtual reality system 200.
The virtual reality headset 204 may include components similar in structure and functionality to those shown for the wireless earpieces 202, including a battery 218, a memory 220, a user interface 222, sensors 224, a logic engine 226, a display 228, and a transceiver 230. The virtual reality headset 204 may include the logic engine 226 for executing and implementing the processes and functions as are herein described. The battery 218 of the virtual reality headset 204 may be integrated into the frames of the virtual reality headset 204 and may have extra capacity which may be utilized to charge the wireless earpieces 202. For example, the wireless earpieces 202 may be magnetically coupled or connected to the virtual reality headset 204 so that the battery 218 may be charged. All or a portion of the logic engine 226, user interface 222, sensors 224, display 228, and transceiver 230 may be integrated in the frame and/or lenses of the virtual reality headset 204.
The user interface 222 of the virtual reality headset 204 may include a touch interface or display for indicating the status of the virtual reality headset 204. For example, an external LED light may indicate the battery status of the virtual reality headset 204 as well as the connected wireless earpieces 202, connection status (e.g., linked to the wireless earpieces 202, wireless device, etc.), download/synchronization status (e.g., synchronizing, complete, last synchronization, etc.), or other similar information.
The display 228 may be integrated into the lenses of the virtual reality headset 204 or represent one or more projectors that may project content directly or reflectively to the eyes of the user. For example, the display 228 may represent a transparent organic light emitting diode lens that is see through and may be utilized to display content. Projectors of the display 228 may utilize any number of wavelengths or light sources to display data, images, or other content to the user. The virtual reality headset 204 may also be utilized for augmented reality displays. The virtual reality headset 204 may take any number of forms including regular glasses, disposable headsets, and so forth. The virtual reality headset 204 may be very small and unobtrusive. In one embodiment, the virtual reality headset 204 may be integrated in smart contact lenses that communicate with the wireless earpieces 202 as described herein.
An LED array of the user interface 222 may also be utilized for display actions. For example, an LED may be activated in response to someone or something being in the user's blind spot while riding a bicycle. In another embodiment, device status indications may emanate from the LED array of the wireless earpieces 202 themselves, triggered for display by the user interface 222 of the virtual reality headset 204. The battery 218 may itself be charged through a physical interface of the user interface 222. The physical interface may be integrated with the user interface 222 or may be a separate interface. For example, the user interface 222 may also include a hardware interface (e.g., port, connector, etc.) for connecting the virtual reality headset 204 to a power supply or other electronic device. The user interface 222 may be utilized for charging as well as communications with externally connected devices. For example, the user interface 222 may represent a mini-USB, micro-USB or other similar miniature standard connector. In another embodiment, a wireless inductive charging system may be utilized to initially replenish power to the wireless earpieces 202. The virtual reality headset 204 may also be charged utilizing inductive charging.
In another embodiment, the virtual reality headset 204 may also include sensors for detecting the location, orientation, and proximity of the wireless earpieces 202. For example, the virtual reality headset 204 may include optical sensors or cameras for capturing images and other content around the periphery of the user (e.g., front, sides, behind, etc.). The virtual reality headset 204 may detect any number of wavelengths and spectra to provide distinct images, enhancement, data, and content to the user. The virtual reality headset 204 may also include an LED array, galvanic linkage or other touch sensors, battery, solar charger, actuators or vibrators, one or more touch screens or displays, an NFC chip, or other components. The sensors 224 may include integrated sensors that are part of the virtual reality headset 204 as well as external sensors that communicate with the virtual reality headset 204. For example, the sensors 224 may also measure the position, location, orientation, motion, and acceleration of other portions of the user's body including arms, legs, torso, and so forth.
As originally packaged, the wireless earpieces 202 and the virtual reality headset 204 may include peripheral devices such as charging cords, power adapters, inductive charging adapters, solar cells, batteries, lanyards, additional light arrays, speakers, smart case covers, transceivers (e.g., Wi-Fi, cellular, etc.), or so forth.
In one embodiment, the process may begin by detecting a position and an orientation of a head of a user (step 302). The position and the orientation of the user's head may be determined utilizing one or more accelerometers, gyroscopes, proximity sensors, optical sensors, or other sensors of the wireless earpieces or the virtual reality headset. The wireless earpieces may also determine information based on a user selected activity or an activity detected by the wireless earpieces. For example, if the user has selected a biking simulation or activity, the wireless earpieces may expect a corresponding head position and orientation. The position and orientation may include global positioning information, spatial positioning within a room or other environment, x, y, and z orientation of the user's head utilizing any number of planes or axes, and distances between objects (e.g., the user's head and the floor/wall, etc.).
Next, the wireless earpieces receive audio content (step 304). The audio content may be received from the virtual reality headset. In one embodiment, the audio content may be integrated material or content from a simulation, game, broadcast, or other media. As previously noted, the audio content may be received by a transceiver of the wireless earpieces through a physical or wireless connection for processing. The audio content may also be received from the user by one or more microphones of the wireless earpieces including ear-bone and external microphones to detect the voice, sounds, or other audio input from the user. The microphones may also sense content associated with the user's environment, such as other users proximate the user, organic, mechanical, or electric sounds, or so forth.
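One conceivable way to combine the ear-bone and external microphones is a spectral crossover that blends the noise-immune low band of the bone-conduction path with the clearer high band of the external path; the crossover frequency and FFT-based approach below are assumptions, not a prescribed implementation.

```python
import numpy as np

def mix_mics(ear_bone: np.ndarray, external: np.ndarray, sr: int,
             crossover_hz: float = 1000.0) -> np.ndarray:
    """Keep the low band from the ear-bone microphone and the high band
    from the external microphone (equal-length inputs assumed)."""
    spectrum_bone = np.fft.rfft(ear_bone)
    spectrum_ext = np.fft.rfft(external)
    freqs = np.fft.rfftfreq(len(ear_bone), 1.0 / sr)
    mixed = np.where(freqs < crossover_hz, spectrum_bone, spectrum_ext)
    return np.fft.irfft(mixed, n=len(ear_bone))

sr = 16000
t = np.arange(sr) / sr
ear_bone = np.sin(2 * np.pi * 200 * t)         # strong low-frequency pickup
external = 0.5 * np.sin(2 * np.pi * 3000 * t)  # clearer high-frequency pickup
voice = mix_mics(ear_bone, external, sr)
```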
Next, the wireless earpieces enhance the audio content utilizing the position and orientation of the user (step 306). In one embodiment, the position, orientation, and audio content received from the user (e.g. verbal commands, indicators, stimuli, etc.) may be associated with specific commands or actions implemented by the wireless earpieces, the virtual reality headset, or other computing or communications systems in communication with the wireless earpieces. For example, a combination of the position, orientation, and audio content may be stored in a database with associated actions, commands, communications, scripts, applications, or processes that may be implemented or executed. The audio content may also be enhanced utilizing filtering, amplification, signal processing, and other processes to remove unwanted noise, jitter, latency, or so forth.
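As a further non-limiting sketch of enhancement utilizing the user's orientation, the code below approximates interaural time differences to place a mono source relative to the head; the head-radius constant and the Woodworth-style delay model are assumptions for illustration.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s
HEAD_RADIUS = 0.0875    # m, assumed average head radius

def interaural_delays(azimuth_deg: float, sr: int) -> tuple:
    """Return (left, right) sample delays; the far ear lags the near ear."""
    az = np.radians(azimuth_deg)
    itd = HEAD_RADIUS / SPEED_OF_SOUND * (az + np.sin(az))  # signed seconds
    shift = int(round(abs(itd) * sr))
    return (shift, 0) if itd > 0 else (0, shift)

def spatialize(mono: np.ndarray, azimuth_deg: float, sr: int = 48000) -> tuple:
    """Delay the far ear relative to the near ear to place a mono source."""
    dl, dr = interaural_delays(azimuth_deg, sr)
    left = np.pad(mono, (dl, 0))[: len(mono)]
    right = np.pad(mono, (dr, 0))[: len(mono)]
    return left, right
```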
Next, the wireless earpieces determine whether to send or receive the audio content (step 308). The wireless earpieces determine whether the audio content was received from the user through one or more microphones (e.g., ear-bone, external, etc.) for communication to the virtual reality headset/system or from the virtual reality headset/system to be played by the wireless earpieces.
If the wireless earpieces determine to receive the audio content, the wireless earpieces immediately deliver the audio content to the user (step 310). The audio content is delivered without significant delay or latency. As a result, any potential video, pictures, or other visual content may be synchronized with the audio content delivered by the wireless earpieces to prevent unwanted dizziness, disorientation, or motion sickness due to differing inputs. The immediate delivery of the audio content may ensure that real-time or time sensitive applications, such as communications, gaming, simulations, or so forth, are implemented without delay. As previously noted, the audio content may be received from the virtual reality headset/system utilizing any number of connections. Any number of pins or connectors may interconnect the components. In another embodiment, the virtual reality headset is magnetically coupled to the wireless earpieces, allowing for inductive power transfer and communication between the connectors and the wireless earpieces. In yet other embodiments, short-range wireless signals, such as Bluetooth, ANT+, or other radio frequency protocols, standards, or connections may be utilized.
The playback and communication of audio content may be coordinated based on user actions, conditions, position, location, or so forth. For example, specific three dimensional noises may be played in each of the wireless earpieces corresponding to the left and right ears of the user to make the environment seem more realistic. Likewise, the volume and other audio effects may be varied to match the orientation of the user's head (or avatar) within a virtual environment. The audio content may include flags, timestamps, or other information for synchronizing playback. The synchronization of the audio and visual content may ensure that the user does not become disoriented, motion sick, or otherwise adversely affected. The audio content may be delivered, played, or otherwise communicated based on synchronization information determined between the virtual reality headset and the wireless earpieces. For example, the left and right wireless earpieces may play distinct content based on the virtual reality environment with which the user is interacting. Distinct sounds, volumes, and audio effects may be utilized by each of the wireless earpieces. As a result, the user is able to experience a unique virtual environment with corresponding sounds without significant increases in weight or other forces imposed upon the user by much larger sound systems.
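A minimal sketch of timestamp-based synchronization between the audio and visual streams follows; the alignment tolerance is an assumed budget and is not specified by this disclosure.

```python
SYNC_TOLERANCE_S = 0.020  # assumed audio/visual alignment budget (~20 ms)

def playback_offset(audio_ts: float, video_ts: float) -> float:
    """Positive result: hold the audio back; negative: the audio is late."""
    return video_ts - audio_ts

offset = playback_offset(audio_ts=12.458, video_ts=12.470)
if abs(offset) > SYNC_TOLERANCE_S:
    print(f"resynchronize by {offset * 1000:.1f} ms")
else:
    print("within tolerance; play immediately")
```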
If the wireless earpieces determine to send the audio content, the wireless earpieces communicate the audio content received from the user through a virtual reality system (step 312). The audio content may be sent from a transceiver of the wireless earpieces to the virtual reality system/headset for processing. For example, the audio content may be communications, commands, or other audio/verbal feedback. The enhanced reception of quality audio input received through the microphones of the wireless earpieces may enhance the interactive functionality of the wireless earpieces due to reduced delay, latency, and so forth.
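The send/receive branch of steps 308 through 312 might be expressed as in the following sketch; the transceiver and speaker objects are hypothetical stand-ins for the components described above.

```python
def route_audio(content: bytes, from_user: bool, transceiver, speakers) -> None:
    """User-originated audio goes to the headset over the transceiver
    (step 312); headset-originated audio is played immediately (step 310)."""
    if from_user:
        transceiver.send(content)
    else:
        speakers.play(content)

class _Stub:  # placeholder endpoints for demonstration
    def send(self, data): print(f"to headset: {len(data)} bytes")
    def play(self, data): print(f"to speakers: {len(data)} bytes")

route_audio(b"\x00" * 480, from_user=True, transceiver=_Stub(), speakers=_Stub())
```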
Although not specifically described, the process of
The wireless earpieces may also synchronize playback or communication of the audio content with visual content of the virtual reality system. As previously noted, the wireless earpieces may utilize any number of sensors to determine the location, velocity (e.g. linear, angular, etc.), position of the user (and the user's head), orientation, acceleration, biometric condition (e.g., heart rate, blood oxygenation, temperature, etc.), and other information to adjust the exact timing, volume, tuning, balance, fade, and other audio effects communicated to the user by the speakers of the wireless earpieces. The wireless earpieces may also send or receive commands for synchronizing and managing the audio content played or communicated by the wireless earpieces with associated visual content.
The illustrative embodiments provide a system, method, and wireless earpiece(s) for enhancing audio communications utilizing a virtual reality system. Audio input that is both sent and received through the wireless earpieces may be enhanced and delivered based on user information that may include the location, position, orientation, motion, and acceleration of the user. The user information may be utilized to more effectively deliver audio content for the user as well as to interact with media content of the virtual reality or augmented reality system. The illustrative embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, embodiments of the inventive subject matter may take the form of a computer program product embodied in any tangible medium of expression having computer usable program code embodied in the medium. The described embodiments may be provided as a computer program product, or software, that may include a non-transitory machine-readable medium having stored thereon instructions, which may be used to program a computing system (or other electronic device(s)) to perform a process according to embodiments, whether presently described or not, since every conceivable variation is not enumerated herein.
Computer program code for carrying out operations of the embodiments may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on a user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN), a personal area network (PAN), or a wide area network (WAN), or the connection may be made to an external computer (e.g., through the Internet using an Internet Service Provider).
The illustrative embodiments are not to be limited to the particular embodiments described herein, and it is to be understood that features from different embodiments may, but need not, be combined. In particular, the illustrative embodiments contemplate numerous variations in the ways in which embodiments may be applied. The foregoing description has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. It is contemplated that other alternatives or exemplary aspects are considered included in the disclosure. The description is merely examples of embodiments, processes, or methods of the invention. It is understood that any other modifications, substitutions, and/or additions may be made, which are within the intended spirit and scope of the disclosure. From the foregoing, it can be seen that the disclosure accomplishes at least all of the intended objectives.
The previous detailed description is of a small number of embodiments for implementing the invention and is not intended to be limiting in scope. The following claims set forth a number of the embodiments of the invention disclosed with greater particularity.
This application claims priority to U.S. Provisional Patent Application 62/358,985, filed on Jul. 6, 2016, and entitled Audio Response Based on User Worn Microphones to Direct or Adapt Program Responses System and Method, hereby incorporated by reference in its entirety.