The described embodiments relate generally to electronic devices. More particularly, the present disclosure relates to electronic charging devices and user interfaces.
Advances in portable computing have enabled wearable devices, such as wireless headphones, to be wirelessly connected to one or more other electronic devices. Audio outputs from these other electronic devices can be listened to by the user through the headphones. Wireless headphones are generally small and lightweight so that the user can conveniently wear the headphones in or on the user's ears without hassle. However, the small form factor of headphones can limit the space that is available on the headphone to incorporate user input features and audio control functionalities.
Therefore, there is a need to design devices, systems, and methods to increase the user input and control functionalities of wireless headphones and associated devices.
The present disclosure relates to electronic devices. In particular, the present disclosure relates to electronic devices having user interface features.
According to one example of the present disclosure, an electronic device includes a housing having a cavity configured to receive an earbud, an input device configured to generate a signal in response to detecting a user input, and circuitry coupled to the input device. The circuitry can be configured to detect the signal and, in response to detecting the signal, send an instruction to the earbud to change at least one of a source or a perceived location of audio content output at the earbud.
In one example, the input device can be further configured to generate a graphical user interface. In one example, the graphical user interface includes a user selectable icon corresponding to an audio source and the audio source includes at least one of a music application, a calendar application, an email application, a message application, or a weather application. In one example, the input device can include a capacitive touch surface. In one example, the user input can include a gesture applied in a direction along the capacitive touch surface. In one example, in response to detecting the signal, the input device sends an instruction to the earbud to change the perceived location of audio content output at the earbud and the direction corresponds to a perceived source location of the audio content. In one example, the instruction changes the perceived source location.
In one example of the present disclosure, an electronic system includes a display device and a case defining an external surface and including a cavity configured to receive an electronic device. The display device is configured to generate a first user interface at the external surface and, in response to detecting a user input at the external surface, generate a second user interface.
In one example, the first user interface includes a graphical user interface. In one example, the first user interface includes an audio user interface. In one example, the first user interface includes a virtual user interface. In one example, the case includes a capacitive touch surface at least partially defining a user interface region of the external surface and the electronic device includes an earbud. In one example, the user input includes a gesture input. In one example, the gesture input includes a touch input at the capacitive touch surface. In one example, the touch input contacts the capacitive touch surface at a location corresponding to a user selectable icon of the first user interface and the second user interface includes a second user selectable icon.
In one example of the present disclosure, a head mountable display can include a processor, memory, and a program stored in the memory, the program including instructions which, when executed by the processor, cause the head mountable display to display a virtual user interface on an external surface of a housing of an electronic device defining a cavity, detect a user input at the external surface while displaying the virtual user interface, and in response to detecting the input, alter the virtual user interface.
In one example, the virtual user interface corresponds virtually to a user interface region defined by the external surface of the electronic device. In one example, the user interface region of the external surface includes a capacitive touch surface. In one example, displaying the virtual user interface includes displaying a first user selectable icon and altering the virtual user interface includes displaying a second user selectable icon. In one example, the cavity is shaped to receive an earbud.
The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:
Reference will be made in detail below to representative embodiments and examples illustrated in the accompanying drawings. However, the following descriptions are not intended to limit the embodiments to one preferred embodiment. Rather, it is intended to cover alternatives, modifications, and equivalents as can be included within the spirit and scope of the described embodiments as defined by the appended claims.
The present disclosure relates to electronic devices. In particular, the present disclosure relates to electronic charging devices having user interface features. In one example, the present disclosure includes an electronic charging device having a housing that defines an internal volume and an external surface. The electronic device can also include a cavity and a charging system disposed within the internal volume and electrically connected to the cavity. In one example, the electronic device also includes a user interface region, which in some examples can include a touchpad defining a portion of the external surface. In some examples, the housing can form the user interface region.
In such an example, the electronic charging device can be a charging case for wireless headphones, also referred to herein as earbuds. Charging cases can be used to store and charge earbuds when one or more of the earbuds are not in use. During use, the earbuds can be removed from the charging case and placed in or on the user's ears to listen to audio. In one example, the charging case itself can facilitate a wireless connection between the earbuds and one or more other devices streaming or transmitting the audio. The user can remove the earbuds and place them back into the charging case when the earbuds are running low on power or when the user is done using the earbuds.
Some earbuds can include user input components, such as buttons or capacitive touch sensors, that enable the user to control the audio being listened to at the point of the earbud itself. For example, an earbud may include a capacitive touch sensor that registers a tapping or pressing of the earbud by the user's hand or finger. Various combinations of taps or touches of the earbuds themselves can cause the audio output to pause, increase or decrease in volume, turn on, turn off, or the like. Earbuds and other wireless headphones may be designed with small form factors that enable the user to wear the earbuds conveniently and comfortably. However, this small form factor limits the available space, such as the surface area of the earbuds themselves, to include control input capabilities or other user interface features.
Advantageously, the user interface region of a charging case described herein can be utilized by the user to input control commands for controlling the earbuds or the audio being transmitted thereto. Such a user interface region, located with the charging case of the earbuds, provides increased surface area or space, in place of or in addition to that of the earbuds themselves, to include sensors, buttons, or other interface components that expand user input capabilities, user interface outputs, and control functionalities.
For example, at least some of the user interface regions of electronic charging devices and input devices described herein can include one or more capacitive touch surfaces configured to receive gesture controls from the user. For example, while wearing the earbuds and listening to audio content, a user can swipe a finger on the user interface region of the charging case in a certain direction, motion path, or for a certain duration, to indicate any number of controls to manipulate the audio being listened to through the earbuds. In one example, a user can swipe left or right to move back and forth from one track of audio to another. In another example, a user can swipe a finger in a circular motion on the user interface region to indicate a volume change. The surface area provided by the user interface regions described herein can thus receive user touch gestures as command inputs to control audio content output by the earbuds. As used herein, the terms “gesture,” “contact gesture,” “touch gesture,” or other terms including “gesture,” can include any path, motion, or contact profile of the user's finger as it contacts the user interface region. Gestures can include the direction, shape, duration, pressing force, or other contact characteristics between the user's finger or hand and the user interface region. For example, gestures can include swipe paths in any direction, shapes such as circles, triangles, rectangles or other shapes, taps, hard or soft presses, or any combination thereof, as formed by the path of the finger along the region and as detected by the region.
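As an illustration only, and not as part of the disclosed embodiments, gesture recognition of the kind described above could be sketched in Python as follows. The gesture names, thresholds, and classification rules here are assumptions chosen for the sketch, not values taken from this disclosure:

```python
import math

def classify_gesture(points, tap_max_dist=5.0, circle_closure=15.0):
    """Classify a touch path (a list of (x, y) samples) into a simple gesture.

    Returns one of "tap", "swipe_left", "swipe_right", "swipe_up",
    "swipe_down", or "circle". All thresholds are illustrative.
    """
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    # Total distance traveled by the finger along the path.
    path_len = sum(
        math.hypot(bx - ax, by - ay)
        for (ax, ay), (bx, by) in zip(points, points[1:])
    )
    if path_len < tap_max_dist:
        return "tap"
    # A long path that ends near where it began suggests a closed shape.
    if math.hypot(dx, dy) < circle_closure and path_len > 3 * circle_closure:
        return "circle"
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_up" if dy > 0 else "swipe_down"
```

A real implementation would also consider gesture duration and pressing force, which the disclosure lists among the contact characteristics a gesture can include.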
The foregoing examples of gesture controls and their effects on the audio being listened to through the earbuds are given as non-limiting examples only. One will appreciate that the user interface region of the electronic charging and input devices described herein can be configured to register any gesture of any path, direction, shape, or combination thereof. The larger surface area of the charging case provides increased resolution for more complicated and varied user input gestures and associated commands. In general, the user interface features of electronic charging and other input devices described herein, including user interface regions and touchpads, expand the available surface area that can be used to receive user control inputs, which in turn expands the number and variety of controls and user inputs available to the user beyond what may be available on just the earbuds themselves.
In addition to the expanded control functionalities described above, the electronic charging and input devices described herein reduce the frequency of interaction between the user's hands or fingers and the earbuds themselves. Examples of electronic input devices described herein can provide a more convenient point of interaction for the user as he or she controls the audio output of the earbuds. For example, an electronic charging input device can be configured to be placed in a user's hand or pocket such that the user can touch the charging device near where the user's hands may be positioned at rest in order to control the audio output or other functions of the earbuds. In this way, the user does not need to repeatedly reach up to his or her ears, where the earbuds are located, to control the audio output.
Electronic input devices described herein, such as earbud charging cases, can be used to provide expanded user output functionalities of audio devices such as earbuds. For example, electronic input devices of the present disclosure can include visual, audio, or tactile outputs to relay information to the user.
In addition, in at least one example, electronic input devices described herein can include additional components such as processors, memory components, antennas, proximity sensors, or other components that expand the earbud control functionalities and features. Such components can enable electronic input devices to communicate with one or more other electronic devices, such as other computing devices that include audio outputs. Electronic input devices of the present disclosure can be configured to advantageously control which device is connected to the earbuds such that a user can seamlessly switch from receiving the audio output of one device to receiving the audio output from another.
Current wireless headphones on the market may generally be considered as accessories to other devices to which they connect. For example, wireless headphones may be considered an accessory product to a mobile phone where the headphones have limited capacity for receiving audio controls. That is, in many instances, the audio streamed to the headphones by the mobile phone is still largely controlled using the user interface of the phone. Thus, the earpieces themselves are at least partially dependent on the mobile phone and, in this way, can be considered accessory items. However, wireless earpieces, such as the earbuds and charging cases described herein, can be configured as standalone electronic devices utilizing input and control interfaces at the case itself. In this way, in at least some examples, the audio devices and systems described herein can be used to create an immersive lifestyle device, which can include music streaming, message and calendar notifications, driving directions, and the like, without being dependent or accessory to another device.
These and other embodiments are discussed below with reference to
Turning now to the FIGS.,
In at least one example, device 100 includes a user interface region 116 that defines at least a portion of the exterior surface 106. In at least one example, housing 102 can define at least a portion of user interface region 116. User interface region 116 can take many forms and is generally configured to receive a contact input when contact is made at exterior surface 106 of device 100 defined by user interface region 116. In one example, user interface region 116 includes a touchpad, such as a capacitive touchpad having a surface area that at least partially defines exterior surface 106.
In at least one example, charging system 112 can be configured to deliver electrical current to cavity 108 such that electrical current is carried to one or more interior housing surfaces 110 defining cavity 108. In this way, an object received into cavity 108 and making contact with interior housing surface 110 electrically connected to charging system 112 can be charged. For example, as shown in
Earbud 218 can include one or more internal batteries (not shown) that power earbud 218 when the user removes earbud 218 from cavity 208 of device 200 and places earbud 218 on or in his or her ear to listen to audio output from earbud 218. When the user is finished using earbud 218 or when the one or more batteries of earbud 218 are running low on power, the user can insert earbud 218 into cavity 208 of device 200, as shown in
Device 200 shown in
Accordingly, in at least one example, user interface region 316 includes one or more sensors, for example touch sensors. Device 300 can include circuitry coupled to device 330 or to a processor of device 300 that can be electrically connected to the one or more sensors of user interface region 316 such that the position and changes in position of a user's touch, for example when a user taps, swipes, or otherwise gestures while contacting user interface region 316, can be detected and identified.
In the illustrated schematic of
Accordingly, in at least one example, user interface region 316 includes a capacitive touch surface that defines at least a portion of the exterior surface 306 of device 300. In one example, such a touch surface can be a distinct component separate from other portions of housing 302, but defining a portion of exterior surface 306 along with the rest of housing 302. Alternatively, such a capacitive touch surface of user interface region 316 can be defined by housing 302 such that a defined portion of housing 302 acts to receive a touch input. In this way, housing 302 can form a dielectric layer or plate of a capacitive sensor stack.
Along these lines, user interface region 316 can include multiple components, including multiple layers, configured to sense a change in capacitance of one or more of the layers when the user contacts the portion of exterior surface 306 of housing 302 that corresponds to user interface region 316. In one example, housing 302 forms a dielectric layer disposed between the user's skin or finger during contact with user interface region 316 and one or more conductive layers disposed inwardly from exterior surface 306 of housing 302, for example, within internal volume 304. Such layers can be configured to hold an electric charge. In one example, such an inner conductive layer of a capacitive touch sensor stack forming user interface region 316 can include one or more conductive plates or electrodes. Sensing circuitry can electrically connect such an electrically conductive layer with one or more processors within device 300. The processor can be configured to determine a change in the charge of the internal conductive layer. This change in electrical charge can occur when a user's finger comes near to or contacts user interface region 316 at exterior surface 306 of housing 302, with the user's finger acting as an opposing charged object to be sensed.
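As an illustration only, the touch-detection logic described above, in which a processor watches for a change in the charge of the internal conductive layer, could be sketched as follows. The threshold, adaptation rate, and raw-count units are assumptions for the sketch, not values from this disclosure:

```python
class CapacitiveSensor:
    """Minimal sketch of touch detection on one capacitive electrode.

    A nearby finger adds capacitance, raising the measured reading above
    the electrode's untouched baseline. The baseline adapts slowly so
    that temperature drift is not mistaken for a touch.
    """

    def __init__(self, threshold=50, alpha=0.01):
        self.threshold = threshold  # counts above baseline that register as touch
        self.alpha = alpha          # baseline adaptation rate
        self.baseline = None

    def update(self, reading):
        if self.baseline is None:
            self.baseline = float(reading)
        touched = (reading - self.baseline) > self.threshold
        # Only adapt the baseline while untouched, so a long press is
        # not gradually absorbed into the baseline.
        if not touched:
            self.baseline += self.alpha * (reading - self.baseline)
        return touched
```

In practice, the sensing circuitry would supply the raw readings and the processor would run logic of this kind per electrode of the conductive layer.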
In one or more other examples, user interface region 316 can include one or more other types of touch sensors and components thereof. For example, user interface region 316 can include one or more pressure sensors and components thereof, one or more resistive touch sensors and components thereof, or other touch sensor configurations and components thereof.
Additionally, or alternatively, one or more examples of user interface regions 316 described herein can include one or more depressible buttons defining a portion of housing 302. For example, user interface region 316 can include one or more buttons that can be depressed below a level or plane defined by exterior surface 306 of housing 302. In this way, user interface region 316 can provide tactile feedback from the physical depression of a button while the user inputs audio control commands, such as gestures, with his or her fingers or hands at device 300. At least one example of device 300 can include a combination of one or more depressible buttons and one or more other areas defining user interface region 316, such as the capacitive sensor touchpads described herein.
All of the features and components, or combinations thereof, described with reference to device 300 shown in
One or more other examples of device 400 can include two or more processors 420 and/or two or more antennas 422 disposed at various locations within internal volume 404. In at least one example, antenna 422 is configured to transmit and receive electromagnetic signals to and from device 400. For example, antenna 422 can be configured to send electromagnetic signals to one or more earbuds separated from device 400 and being used by a user. Also, for example, antenna 422 can be configured to send and receive signals between device 400 or earbuds and other electronic devices, such as a mobile phone or other computing device that may transmit audio signals or content to the earbuds. In response to detecting or sending one or more electromagnetic signals, device 400 can send an instruction to the earbuds or other electronic devices.
Processor 420 can be electrically coupled to a user interface region 416 and antenna 422 via circuitry. Processor 420 can be configured to cause antenna 422, via the circuitry, to send and receive signals to and from various other devices, including one or more earbuds that are configured to be received into cavity 408, based on user inputs received by user interface region 416. A user can input a command via user interface region 416, for example, by tapping, gesturing, swiping, or otherwise contacting user interface region 416, such that the command indicates an intended action of device 400. For example, a certain gesture input at user interface region 416 can indicate that the user wants to skip from one song being listened to through an earbud to the next song. Processor 420 can be configured to recognize the input command and cause antenna 422 to send one or more signals communicating with another device, such as a mobile phone, from which the earbud is streaming music. As another example, the user can input the gesture or other touch command at user interface region 416 to indicate an intent to start or stop the audio content transmitted to the earbuds. Processor 420 can be configured to recognize any variety of such commands at user interface region 416 and cause antenna 422 or any other component of device 400 to carry out the action or function desired by the user.
One will appreciate that any number of contact gestures, swipes, taps, or other touch commands input by the user at user interface region 416 can be recognized by processor 420. Processor 420 can thus be configured to carry out any such command that is input at user interface region 416 by the user. Carrying out such a command can include causing antenna 422 to send or receive signals with one or more other devices or causing one or more other components of device 400 to carry out the command. Other commands that can be input by the user include, as non-limiting examples, skipping audio tracks, speeding up or slowing down audio inputs, switching from one audio stream to another, connecting or switching to or from various other devices providing audio streams, increasing or decreasing volume, or any other audio control command.
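As an illustration only, the mapping from recognized gestures to audio commands described above could be sketched as a simple dispatch table. The gesture names and command identifiers below are hypothetical placeholders, not identifiers from this disclosure:

```python
# Hypothetical mapping from recognized gestures to audio control commands.
GESTURE_COMMANDS = {
    "tap": "play_pause",
    "swipe_right": "next_track",
    "swipe_left": "previous_track",
    "circle_cw": "volume_up",
    "circle_ccw": "volume_down",
}

def dispatch(gesture, send_to_earbuds):
    """Translate a gesture into a command and transmit it.

    `send_to_earbuds` stands in for the processor-to-antenna path the
    text describes; unrecognized gestures are ignored rather than
    guessed at. Returns the command sent, or None.
    """
    command = GESTURE_COMMANDS.get(gesture)
    if command is not None:
        send_to_earbuds(command)
    return command
```

A table of this kind is easy to extend with the additional commands the text lists, such as switching audio streams or connecting to other devices.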
Antenna 422 can include one or more components configured to send and receive electromagnetic signals, including digital audio content signals and the like. For example, antenna 422 can include multiple antenna modules including Bluetooth modules and circuitry, ultra-wideband stacks, or other transmitter-receiver modules or combinations thereof.
All of the features and components, or combinations thereof, described with reference to device 400 shown in
Advantageously, device 500 including memory component 524 can be used as a stand-alone audio content device for streaming audio content to the user via earbuds without connecting to, or being accessory to, other devices such as phones, computers, digital music players, and the like. For example, audio can be downloaded and stored directly onto memory component 524 of device 500, and processor 520 can be configured to stream content stored on memory component 524 to one or more earbuds via antenna 522 or other components. In such an example, user interface region 516 can be used to control content from memory component 524 as it is transmitted to the earbuds.
The components and features of device 500 shown in
As noted above, device 600 can be a charging case for earbuds 618. That is, in addition to the other features and functionalities of device 600 described above, device 600, or charging case 600, can include one or more cavities for receiving earbuds 618 and a charging system configured to charge or recharge one or more batteries of earbuds 618. Advantageously, the charging functionality of device 600 can be combined with the other control features, components, and functionalities described herein, for example user interface regions, processors, circuitry, antennas, memory components, proximity sensors, and so forth, in one simple and compact device.
In addition, as the user interacts with device 600 to control the audio output of the earbuds 618, device 600 and other devices described herein can include one or more output features to communicate information to the user. That is, in addition to the user interface regions described herein, which are configured to receive command inputs from the user, one or more examples of devices of the present disclosure can include user interface output features and components. For example,
In one example, one or more output feature 728 can include a visual icon, such as a light or backlit image that can turn on or off to relay information to the user. For example, one output feature 728 can include a backlit form of an envelope that indicates an e-mail notification to the user. Once indicated by the output feature 728, the user can input touch commands at user interface region 716 so that a processor of device 700 can cause an audio output of the mail notification or mail contents to be streamed to the earbuds. In the foregoing example of an e-mail notification, one or more components of device 700, such as antennas or other transmitters and receivers of devices described herein, can relay audio output from an e-mail message or notification from a separate connected device such as a mobile phone or computer.
One or more output features 728 can include other visual or tactile outputs to notify the user of various other notifications, statuses of device 700, or other information. For example, one or more output features 728 can alert the user with visual icons representing text messages received, upcoming calendar events, missed calls from a connected mobile phone, or any other information relayed to device 700 from other connected electronic devices. Output feature 728 can also include one or more light indicators without specific forms of images. As shown, at least one output feature can utilize an area of exterior surface 706 occupied by user interface region 716. For example, output feature 728 located at user interface region 716 can include a diffuse backlit portion of exterior surface 706. In addition, any of the lit output features 728 of device 700 can include multiple colors, each of which can indicate a unique meaning to the user.
In response to receiving information from output features 728, the user can swipe or otherwise gesture on the user interface region 716 of housing 702 to change the audio outputs of the earbuds based on information relayed by the output features 728. Using the mail envelope example from above, which may indicate an incoming e-mail message to the user, a user can then touch or swipe on the user interface region 716 in a certain way that causes a processor of device 700 to switch the audio being transmitted to the earbuds to an audio reading of the contents of the e-mail. For example, the user could make a swiping gesture with his or her finger on user interface region 716 with the direction of the swipe or gesture aimed at the given output feature 728. In this way, the user can indicate which output feature 728 he or she is interested in, and can switch the audio output by the earbuds accordingly. This is one example of an interaction between the user and device 700 that includes one or more output features 728 relaying information to the user, and the user subsequently reacting to that information by inputting controls via device 700 to manipulate the content of audio received through wirelessly connected earbuds.
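As an illustration only, selecting an output feature by aiming a swipe at it, as described above, could be sketched by comparing the swipe's direction against each feature's direction from the user interface region. The feature names and coordinates below are hypothetical assumptions for the sketch:

```python
import math

def select_output_feature(swipe_vector, feature_positions, origin=(0.0, 0.0)):
    """Pick the output feature whose direction best matches a swipe.

    `feature_positions` maps a feature name to its (x, y) location on
    the case surface; the swipe is compared by angle from `origin`.
    """
    sx, sy = swipe_vector
    swipe_angle = math.atan2(sy, sx)

    def angular_distance(pos):
        fx, fy = pos[0] - origin[0], pos[1] - origin[1]
        diff = math.atan2(fy, fx) - swipe_angle
        # Wrap into [-pi, pi] so angles near 0 and near 2*pi compare correctly.
        return abs(math.atan2(math.sin(diff), math.cos(diff)))

    return min(feature_positions, key=lambda name: angular_distance(feature_positions[name]))
```

In the e-mail envelope example, a swipe aimed at the lit envelope icon would resolve to that feature, and the processor could then switch the earbud audio to a reading of the message.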
One will appreciate that any number of output features 728 or combinations of output features 728 can communicate any number of notifications, statuses, or other information to the user. Other interactions between device 700 and the user can include output features 728 indicating traffic directions. Accordingly, if the user is listening to music via earbuds wirelessly connected to device 700, one or more output features 728 can indicate to the user that he or she needs to listen to an upcoming traffic navigation instruction. In reaction, the user can swipe or gesture on the user interface region 716 in such a way that device 700 then interrupts the music with the navigation instruction audio output at the earbuds. The same process can occur with output features 728 indicating text messages, weather conditions and forecasts, news headlines, stock price updates, missed calls from a mobile phone, or any other piece of information that can be relayed by audio to the earbuds.
In addition to, or as an alternative to, the output features 728 of device 700 described herein, device 700 can include one or more haptic output features or components for interfacing with the user and conveying non-audio information from device 700 in a tactile manner. For example, device 700 can include a motor or other vibration producing component that can vibrate device 700 to alert the user of a status change, notification, or other output information, as described above. Vibrational or other tactile feedback mechanisms can be configured to provide unique movements or vibrations of device 700, each conveying unique information to the user.
The components and features of device 700 shown in
In the illustrated example of
In any case, transition surface 830 can serve to provide a physical feature that indicates to a user where the bounds or outer perimeter of user interface region 816 is located. In this way, if a user stores device 800 out of sight within a pocket or purse, as shown in
Another example of a charging case having one or more user interface regions incorporated onto an exterior surface thereof is shown in
One will appreciate from the foregoing examples of devices shown in
One or more processors of the devices shown can cause one or more other components of the devices to transmit or receive various commands to and from wirelessly connected earbuds or other devices to control the audio output of the earbuds being used. In each of the features or components of devices shown in
As noted above, in particular with reference to device 500 shown in
Along these lines,
Both mobile phone 1432 and laptop computer 1434 can output audio content to one or more earbuds 1418 worn by the user. As shown in
Advantageously, as described herein with reference to other figures, case 1400 can include one or more proximity sensors configured to sense a presence of other electronic devices. For example, a proximity sensor of case 1400 can detect the presence of nearby mobile phone 1432 and laptop computer 1434. In addition, the one or more proximity sensors of case 1400 can be configured to sense a distance between case 1400 and other electronic devices. For example, as shown in
Once the relative position between case 1400 and either mobile phone 1432 or laptop computer 1434 is determined, case 1400 can be configured to provide the user with an option to receive the audio output of the nearest electronic device, which in the example illustrated in
In another example, when the user brings case 1400 into closer proximity with laptop computer 1434, and thus farther away from mobile phone 1432, the user can have an option to switch audio content being streamed to earbuds 1418 from the mobile phone 1432 audio output signal 1436 to the audio output signal 1438 of laptop computer 1434. Again, case 1400 can include one or more proximity sensors and processors that enable the detection of, and relative position with, external electronic devices. The one or more processors can cause case 1400 to switch the audio content that is transmitted to earbuds 1418 based on that relative position and one or more commands given by the user to case 1400 via one or more user interface regions of case 1400.
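As an illustration only, the proximity-based switching logic described above could be sketched as follows. The device names, distance units, and hysteresis margin are assumptions for the sketch; the disclosure does not specify them:

```python
def nearest_audio_source(distances):
    """Return the device closest to the case from a distance map.

    `distances` maps device names to estimated distances, such as
    values derived from the case's proximity sensors.
    """
    if not distances:
        return None
    return min(distances, key=distances.get)

def maybe_offer_switch(current_source, distances, margin=0.5):
    """Suggest switching only when another device is clearly nearer.

    `margin` (here assumed to be in meters) adds hysteresis so the
    suggestion does not flap when two devices are nearly equidistant.
    Returns the suggested device, or None.
    """
    nearest = nearest_audio_source(distances)
    if nearest is None or nearest == current_source:
        return None
    if distances[current_source] - distances[nearest] > margin:
        return nearest
    return None
```

The returned suggestion would then be presented to the user, who confirms the switch with a gesture at the case's user interface region rather than having the source change automatically.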
As described above, when the user switches from the transmission of one audio content to another, the processor of devices described herein, such as earbud charging cases, can cause a smooth transition from one audio source to the other as heard by the user through the earbuds. Such a transition can include a fading in and out between different audio sources. In one example, one audio source may be presented to the listener at a lower volume than the other. In another example, the volume of one audio source can be decreased but not completely removed when another audio source is provided. For example, notifications from other devices regarding text messages, e-mails, calendar events, and so forth, can slowly fade in to audibly overlay an audio track already being listened to while that audio track is reduced in volume or faded out.
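As an illustration only, the fade between sources described above could be sketched with a crossfade gain curve. The equal-power curve and the `floor` parameter (which lets the outgoing source be ducked but not fully removed, as in the notification-overlay example) are assumptions chosen for the sketch:

```python
import math

def crossfade_gains(progress, floor=0.0):
    """Gains for the outgoing and incoming sources during a transition.

    `progress` runs from 0.0 (old source only) to 1.0 (new source only).
    An equal-power curve keeps the perceived loudness roughly constant
    through the transition; `floor` prevents the old source from being
    completely removed when a nonzero value is given.
    """
    progress = min(max(progress, 0.0), 1.0)
    old_gain = math.cos(progress * math.pi / 2)
    new_gain = math.sin(progress * math.pi / 2)
    return max(old_gain, floor), new_gain
```

Stepping `progress` from 0.0 to 1.0 over the duration of the transition produces the fade-out of one source overlapping the fade-in of the other.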
In addition to the fading or volume transitions between audio sources, charging cases and devices described herein can cause earbuds worn by a user to produce the various audio content and sources provided to the user and to change their perceived spatial locations.
In the example shown in
In at least one example, when switching from one audio source to another, the one or more processors of a charging case of earbuds 1518a, 1518b can cause earbuds 1518a, 1518b to change a perceived location or direction of the two or more different audio sources as the user transitions from one source to the other, as commanded by the user at a user interface region of the charging case. For example, if a user is listening to music and wants to switch from one track of music to another, the user can manipulate an interface surface (such as with a left-to-right swipe on the user interface region of the charging case). Accordingly, one or more processors of the charging case can cause the earbuds to move the first track from left to right along spatial audio band 1540, as perceived by the user, and move the second track onto spatial audio band 1540 from left to right. For example, as shown in
Along these lines, one will also note that the sound propagation wave lines associated with sounds 1542, 1544, 1545, and 1546, shown in
The perceived positions of each sound 1542, 1544, 1545, and 1546 can be changed, in combination with the volume of each sound 1542, 1544, 1545, and 1546, as shown in
The foregoing is one non-limiting example of how user touch and finger gesture commands at the user interface region of an earbud charging case can be used to manipulate a spatial perception of audio content listened to through earbuds 1518a, 1518b. One will appreciate that any number of other swipes and/or gesture inputs by the user at the user interface region of an earbud charging case can manipulate the audio spatial perception of the audio output at earbuds 1518a, 1518b. For example, swiping right to left can move the perceived location of audio outputs from right to left or front to back on spatial audio band 1540. Transitions from one source of audio content to another, for example, as the user switches from audio content transmitted by one or more other devices as described above, can be spatially expressed as shown in
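The gesture-to-position mapping described above can be sketched as follows. This is a simplified model under stated assumptions: swipe progress is taken as a 0-to-1 value, the spatial audio band is approximated as an azimuth arc from left (-90 degrees) to right (+90 degrees), and constant-power stereo panning stands in for whatever spatialization the earbuds actually perform:

```python
import math

def band_position(swipe_progress, start=-90.0, end=90.0):
    """Map swipe progress (0 -> 1) to a perceived azimuth on the spatial
    audio band, in degrees (negative = left, positive = right)."""
    swipe_progress = min(max(swipe_progress, 0.0), 1.0)
    return start + (end - start) * swipe_progress

def pan_gains(azimuth_deg):
    """Constant-power stereo gains approximating a source perceived at the
    given azimuth. Returns (left_gain, right_gain)."""
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2)  # 0 .. pi/2
    return math.cos(theta), math.sin(theta)
```

For a left-to-right swipe, the outgoing track's azimuth would be advanced from `start` toward `end` (sliding off the band) while the incoming track follows behind it; a right-to-left swipe simply reverses `start` and `end`.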
For example, if the user is listening to music through the earbuds and receives a notification from an output feature of the charging case, for example, a text message notification, the user can swipe or gesture at the user interface region of the charging case to indicate a desire to listen to an audio output of the text message. In one example, referring to
In at least one example, using the devices and charging cases having processors and antennas described herein, multiple audio contents from multiple electronic devices having audio outputs can be simultaneously transmitted to earbuds 1518a, 1518b to be heard by the user. In such an example, one audio source can be spatially perceived at sound 1542, another audio source can be perceived at sound 1544, and another audio content can be perceived at sound 1546. In this way, multiple sources and audio contents can be perceived simultaneously as if they are coming from different directions along spatial band 1540, as shown in
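The simultaneous, directional presentation described above can be sketched as a simple spatial mix. The representation below is an illustrative assumption: each source is a (sample, azimuth, distance) tuple, direction is approximated by constant-power panning, and distance attenuates level so the spatial band can also be perceived nearer or farther, per the paragraph that follows:

```python
import math

def mix_spatial_sources(sources):
    """Mix several mono samples into one stereo frame. Each source is a
    (sample, azimuth_deg, distance) tuple: azimuth sets the perceived
    direction along the band, and larger distances reduce level so the
    source seems farther away. Returns (left, right)."""
    left = right = 0.0
    for sample, azimuth_deg, distance in sources:
        gain = sample / max(distance, 1.0)  # farther sources are quieter
        theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2)
        left += gain * math.cos(theta)
        right += gain * math.sin(theta)
    return left, right
```

In this sketch, three concurrent sources at azimuths of, say, -60, 0, and +60 degrees would each contribute to the stereo frame from a distinct perceived direction, analogous to sounds 1542, 1544, and 1546 arriving from different points along the band.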
As noted above, spatial audio band 1540 represents a continuous set of locations from which various sounds 1542, 1544, 1545, and 1546 can be perceived. However, by varying the volume and other characteristics of the sounds output through earbuds 1518a, 1518b, spatial audio band 1540 can be perceived as farther away or closer than what is shown in
In this way, charging cases described herein can cause the user to experience multiple audio sources and contents as if they were in a room, for example, with multiple people talking or multiple devices providing audio outputs from different directions. In some examples, devices and systems described herein can mimic real world audio environments where the user perceives different audio content from different locations and can pay attention to what he or she chooses.
In at least one example, the various sensors and spatial audio manipulation functionalities of devices described herein, such as earbud charging cases, can be utilized within an augmented reality (AR) or virtual reality (VR) environment. Such an example is shown in
Alternatively, or additionally, the external area of device 1600 on which the virtual representation of user interface region 16 is presented to the user through display 1648 of AR/VR device 1604 is not actually configured to receive gesture commands from the user. In such an example, AR/VR device 1604 can include a processor, circuitry, and sensors configured to visually detect the gestures performed by the user at the virtual representation of user interface region 16 on device 1600. Then, the processor and other components of AR/VR device 1604, including one or more antennas, can communicate the command and perform the associated function at connected earbuds 1618.
In one example, a gaze detection capability of AR/VR device 1604, which can include a head-mounted device, can be used in conjunction with the spatial audio manipulation of devices described herein. For example, as the user's gaze shifts from one thing to another within an AR/VR environment, devices described herein can spatially manipulate the audio outputs from the earbuds to match the user's gaze within the AR/VR environment. Advantageously, this can create a more immersive and realistic AR/VR experience for the user.
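The gaze-matched spatialization described above reduces to a simple angular offset: a scene-anchored source is rendered at its world azimuth minus the gaze azimuth, so the sound stays fixed in the environment as the user turns. The sketch below is an illustrative simplification assuming yaw-only gaze angles in degrees:

```python
def gaze_adjusted_azimuth(source_azimuth_deg, gaze_azimuth_deg):
    """Return the azimuth at which a scene-anchored source should be
    rendered given the direction the user is currently looking.
    The result is wrapped into the range [-180, 180)."""
    delta = source_azimuth_deg - gaze_azimuth_deg
    return (delta + 180.0) % 360.0 - 180.0
```

For example, a source the user is looking directly at is rendered at 0 degrees (straight ahead), and the wrap-around keeps sources behind the user at sensible angles rather than values beyond 180 degrees.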
In addition to the various features, components, and advantages of devices described herein, in at least one example, a device can include one or more microphones configured to receive user commands from the user. In such examples, one or more processors of the device can be configured to detect or recognize speech of the user through the one or more microphones. Speech or other audio commands can be used in conjunction with, or separately from, the touch gestures input at the user interface regions of charging cases and devices described herein.
Additionally or alternatively to examples of devices and charging cases described herein, one or more devices can include one or more sensors including fitness or biometric sensors. Such sensors can be incorporated within the housing of devices described herein to track biometric data or fitness data of the user. For example, as a user carries an earbud charging case in his or her pocket, one or more sensors of the charging case can detect the number of steps taken by the user, a temperature of the user, or other fitness and biometric data. The one or more processors of the charging case device can cause the charging case device to relay the sensed or detected biometric and fitness data to the user via one or more output features as described herein. Additionally or alternatively, the charging case device can be configured to transmit audio information containing the fitness and biometric data sensed or detected by the sensors to the earbuds.
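The step-counting capability mentioned above can be sketched as a basic threshold-crossing pedometer over accelerometer magnitude samples. The threshold and minimum step gap below are illustrative values, not parameters from the disclosure:

```python
def count_steps(accel_magnitudes, threshold=11.0, min_gap=3):
    """Count steps as upward crossings of an accelerometer-magnitude
    threshold (in m/s^2), requiring at least min_gap samples between
    successive steps to reject jitter around the threshold."""
    steps = 0
    last_step = -min_gap
    for i in range(1, len(accel_magnitudes)):
        crossed = accel_magnitudes[i - 1] < threshold <= accel_magnitudes[i]
        if crossed and i - last_step >= min_gap:
            steps += 1
            last_step = i
    return steps
```

A real implementation would typically low-pass filter the signal and adapt the threshold per user, but this captures the idea of deriving fitness data from a case carried in the user's pocket.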
One or more other sensors can be included in one or more other examples of devices described herein. For example, one or more environmental sensing sensors can be included in a device such as an earbud charging case. Environmental sensors can include sensors configured to detect or sense other objects, electronic devices, or people within the environment or other environmental characteristics. In one example, one or more sensors can sense the temperature, humidity, or other physical characteristics of the environment and relay such information to the user through the earbuds wirelessly connected to charging case device or via one or more visual or tactile output features of the device. In examples where one or more sensors are configured to detect other objects or people within an environment, one or more processors of the charging case device can be configured to cause audio content relating to those other objects, devices, or people, to be transmitted to the user through the earbuds.
Additionally or alternatively to examples of devices and charging cases described herein, one or more devices can include one or more memory components, as discussed above, so that the device is configured as a stand-alone radio or music playlist device. Furthermore, one or more examples of devices described herein can include one or more wireless internet connection components or modules configured to connect the device to the internet for streaming audio content.
In at least one example of the devices described herein, such as earbud charging case devices described herein, multiple people can receive the transmission of audio content from the same device, for example a television or mobile phone, with each user's charging case adapting the audio content to that user's needs. This can be done either automatically or by command from the user via at least the user interface region of the charging case. For example, if two or more people are watching and listening to a television through their respective earbuds, each user can uniquely adapt the audio output to their needs. For example, some users may need or want the volume to be louder or softer. As another example, some users may prefer turning up the bass or treble components of the audio output. Advantageously, each user utilizing the devices described herein can change the audio output to meet their own needs without affecting the audio output transmitted to others. This can have unique and advantageous accessibility implications for providing altered or enhanced audio experiences to those with hearing impairments or other hearing disabilities.
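The per-user adaptation described above can be sketched as a listening profile applied by each case independently. The two-band split and the `ListeningProfile` fields are illustrative assumptions; a real implementation would use proper filters rather than pre-split band samples:

```python
from dataclasses import dataclass

@dataclass
class ListeningProfile:
    volume: float = 1.0   # overall level for this listener
    bass: float = 1.0     # gain applied to the low band
    treble: float = 1.0   # gain applied to the high band

def personalize(low_band, high_band, profile):
    """Apply one user's profile to a shared audio frame that has been
    split into low and high bands; other listeners' frames are unaffected
    because each case applies its own profile locally."""
    return profile.volume * (profile.bass * low_band
                             + profile.treble * high_band)
```

Because each charging case applies its own profile to the same shared source, one listener boosting treble or volume for intelligibility does not alter what any other listener hears, which is the accessibility benefit noted above.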
In some examples, personal information data can be gathered with the present systems and methods, and such personal information should be gathered pursuant to authorized and well-established secure privacy policies and practices that are appropriate for the type of data collected. Such personal information can be used to practice and improve on the various examples described herein. The disclosed technology is not, however, rendered inoperable in the absence of such personal information data.
It will be understood that the various details of the present systems and methods provided above can be combined in various combinations and with alternative components. The foregoing descriptions of the specific examples described herein are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. Rather, many modifications and variations are possible in view of the above teachings.
This application is a National Stage filing based on PCT Application No. PCT/US2023/068386, filed 13 Jun. 2023, and entitled “ELECTRONIC CHARGING DEVICE AND USER INTERFACE,” which claims priority to U.S. Provisional Patent Application No. 63/366,403, filed 14 Jun. 2022, and entitled “ELECTRONIC CHARGING DEVICE AND USER INTERFACE,” the entire disclosure of which is hereby incorporated by reference.
| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/US2023/068386 | 6/13/2023 | WO | |
| Number | Date | Country |
|---|---|---|
| 63366403 | Jun 2022 | US |