This relates generally to electronic devices, including electronic devices with wireless communications capabilities.
Electronic devices often have displays that are used to display images to users. Such devices can include head-mounted displays. It can be challenging to incorporate wireless communications circuitry in head-mounted displays.
A head-mounted device may have a housing. The device may have first and second displays at a rear side of the housing. The device may have a cover layer at a front side of the housing. The device may have a third display that overlaps a central region of the cover layer. The cover layer may have a peripheral region surrounding the central region. The device may have a head strap attached to a left side and a right side of the housing.
The device may include wireless circuitry. The wireless circuitry may include a first transceiver, a second transceiver, and antennas. The first transceiver may transmit audio data to left and right earbuds using one or more of the antennas and using a non-Bluetooth, low-latency-audio communications protocol. The second transceiver may transmit wireless signals using other communications protocols such as a wireless local area network protocol and the Bluetooth protocol.
If desired, the first transceiver may use a single antenna to transmit the audio data to the left and right earbuds concurrently. The antenna may be mounted between the third display and the bottom of the device overlapping the peripheral region of the cover layer. In other configurations, the transceiver may use first and second antennas to concurrently transmit audio data to the left and right earbuds, respectively. The first and second antennas may be mounted at bottom-left and bottom-right corners of the device and overlapping the peripheral region of the cover layer, may be mounted to the head strap, or may be mounted at the rear side of the device. The first and second antennas may be tilted to place the left and right earbuds into the fields of view of the antennas, to match the radio-frequency polarization of the antennas to that of the earbuds, and/or to configure the first and second antennas to use orthogonal polarizations.
Mounting the antennas in this way may serve to optimize wireless performance given the other components of the device while meeting stringent requirements associated with the low-latency-audio communications protocol and while also satisfying requirements on radio-frequency energy exposure. If desired, the first antenna may be coupled to a phase shifter to boost signal quality using a cross-head channel. If desired, a duplexer may be used to allow the first or second antenna to also convey Bluetooth data for the second transceiver.
Electronic devices may be provided with components such as antennas. The electronic devices may include portable electronic devices, wearable devices, desktop devices, embedded systems, and other electronic equipment. Illustrative configurations in which the electronic devices include a head-mounted device may sometimes be described herein as an example. The head-mounted device may have first and second rear-facing displays and a front-facing display. The device may have a housing with a cover layer at a front side of the device. The cover layer may have a central region overlapping the front-facing display and a peripheral region surrounding the central region.
The device may include wireless circuitry with a first transceiver and a second transceiver. The first transceiver may convey wireless local area network (WLAN) signals and Bluetooth signals. The second transceiver may convey audio data with left and right earbuds using a non-Bluetooth low-latency-audio communications protocol. The second transceiver may convey the audio data using one or more antennas. The one or more antennas may be mounted at a bottom side of the device and overlapping the peripheral region, may be mounted at a rear side of the device, or may be mounted to a head strap for the device. The one or more antennas may be tilted at different angles to optimize the field(s) of view of the antenna(s), to match the polarization of the earbuds, and/or to allow multiple antennas to exhibit orthogonal polarizations. Mounting the antennas in this way may serve to optimize wireless performance given the other components of the device while meeting stringent requirements associated with the low-latency-audio communications protocol and while also satisfying requirements on radio-frequency energy exposure.
The antennas may include a single antenna that concurrently conveys packets of audio data to both earbuds. Alternatively, the antennas may include first and second antennas that concurrently convey respective packets of audio data to the left and right earbuds or that concurrently convey the same packets of audio data to the left and right earbuds using a signal splitter. A phase shifter may be coupled to one of the antennas to boost a cross-head channel for the earbuds. A duplexer may be used to configure one of the antennas to also convey Bluetooth data for the first transceiver.
As shown in
The head-mounted support structures in housing structures 12 may have the shape of glasses or goggles and may support one or more lenses that align with one or more of the user's eyes while the user is wearing device 10. The head-mounted support structures in housing structures 12 may include one or more rigid frames that help to provide mechanical integrity, rigidity, and/or strength to device 10 during use. In some implementations that are described herein as an example, the one or more rigid frames are formed from conductive material. The rigid frame(s) may therefore sometimes be referred to herein as conductive frame(s).
If desired, housing structures 12 may include other housing structures or housing members disposed on (e.g., layered on or over, affixed to, etc.) and/or overlapping some or all of the conductive frame(s) (e.g., dielectric structures, rubber structures, ceramic structures, glass structures, fiber composite structures, foam structures, sapphire structures, plastic structures, cosmetic structures, etc.). These other housing structures may, for example, support one or more components in device 10, may help to protect the components of device 10 from damage or contaminants, may help to allow device 10 to be worn comfortably on the user's head, may help to hide portions of the conductive frame from view, may contribute to the cosmetic or aesthetic appearance of device 10, etc.
Device 10 may include input/output (I/O) components such as I/O components 14. I/O components 14 may allow device 10 to provide output and/or other information to the user of device 10 or other entities and/or may allow device 10 to receive user input and/or other information from the user and/or other entities. I/O components 14 may include one or more displays such as displays 18. Displays 18 may emit light (sometimes referred to herein as image light) that is provided to the user's eyes for viewing. The light may contain images. The images may contain pixels. Many images may be provided over time in a sequence (e.g., as a video). The displays 18 in device 10 may include, for example, left and right displays. The left display may provide light to a user's left eye whereas the right display may provide light to the user's right eye while the user wears device 10 on their head.
I/O components 14 may also include wireless circuitry such as wireless circuitry 16 (sometimes referred to herein as wireless communication circuitry 16). Wireless circuitry 16 may transmit radio-frequency signals 24 to external equipment 22 and/or may receive radio-frequency signals 24 from external equipment 22. External equipment 22 may include another device such as device 10 (e.g., another head-mounted device, a desktop computer, a laptop computer, a cellular telephone, a tablet computer, a tethered computer, etc.), a peripheral device or accessory device (e.g., a user input device, a stylus, a device that identifies user inputs associated with gestures or motions made by a user, a gaming controller, headphones, etc.), remote computing equipment such as a remote server or cloud computing segment, a wireless base station, a wireless access point, and/or any other desired equipment with wireless communications capabilities. In implementations that are described herein as an example, external equipment 22 includes at least first and second peripheral devices such as left and right headphone speakers or earbuds. The earbuds may be worn by a user to provide audio content to the user's ears while the user is wearing device 10 on their head. Wireless circuitry 16 may transmit the audio content to the earbuds using radio-frequency signals 24.
I/O components 14 may also include other components (not shown) such as sensors, haptic output devices (e.g., one or more vibrators), non-display light sources such as light-emitting diodes, audio devices such as speakers for producing audio output, wireless charging circuitry for receiving wireless power for charging a battery on device 10 and/or for transmitting wireless power for charging a battery on other devices, batteries and/or other energy storage devices, buttons, mechanical adjustment components (e.g., components for adjusting one or more housing structures 12 to allow device 10 to be worn comfortably on a user's head and/or on other user's heads, which may have different geometries), and/or other components.
Sensors in I/O components 14 may include image sensors (e.g., one or more visible and/or infrared light cameras, binocular three-dimensional image sensors that gather three-dimensional images using two or more cameras in a binocular configuration, sensors that emit beams of light and that use two-dimensional image sensors to gather image data for three-dimensional images from light spots that are produced when a target is illuminated by the beams, light detection and ranging (lidar) sensors, etc.), acoustic sensors such as microphones or ultrasonic sensors, gaze tracking sensors (e.g., an optical system that emits one or more beams of infrared light that are tracked using the image sensor after reflecting from a user's eyes while wearing device 10), touch sensors, force sensors (e.g., capacitive force sensors, strain gauges, resistive force sensors, etc.), proximity sensors (e.g., capacitive proximity sensors and/or optical proximity sensors), ambient light sensors, contact sensors, pressure sensors, moisture sensors, gas sensors, magnetic sensors, motion sensors for sensing motion, position, and/or orientation (e.g., gyroscopes, accelerometers, compasses, and/or inertial measurement units (IMUs) that include two or more of these), and/or any other desired sensors.
Device 10 may also include one or more controllers 20 (sometimes referred to herein as control circuitry 20). Controller(s) 20 may include processing circuitry and storage circuitry. The processing circuitry may be used to control the operation of device 10 and may include one or more processors such as microprocessors, digital signal processors, microcontrollers, host processors, application specific integrated circuits, baseband processors, graphics processing units, central processing units (CPUs), etc. The storage circuitry in controller(s) 20 may include one or more hard disk drives, nonvolatile memory (e.g., electrically-programmable-read-only memory configured to form a solid-state drive), volatile memory (e.g., static or dynamic random-access-memory), etc. If desired, controller(s) 20 may be configured to perform operations in device 10 using hardware (e.g., dedicated hardware or circuitry), firmware, and/or software. Software code for performing operations in device 10 may be stored on storage and may be executed by processing circuitry in controller(s) 20.
Controller(s) 20 run software on device 10 such as one or more software applications, internet browsers, gaming programs, voice-over-internet-protocol (VOIP) telephone call applications, social media applications, driving or navigation applications, email applications, media playback applications, operating system functions, etc. To support interactions with external equipment 22, controller(s) 20 may implement one or more communications protocols associated with (wireless) radio-frequency signals 24. The communications protocols may include internet protocols, wireless local area network protocols (e.g., IEEE 802.11 protocols, sometimes referred to as Wi-Fi®), protocols for other short-range wireless communications links such as the Bluetooth® protocol or other wireless personal area network (WPAN) protocols, IEEE 802.11ad protocols, cellular telephone protocols, multiple-input and multiple-output (MIMO) protocols, antenna diversity protocols, satellite navigation system protocols, IEEE 802.15.4 ultra-wideband communications protocols or other ultra-wideband communications protocols, non-Bluetooth protocols for ultra-low-latency audio streaming, etc. Each communications protocol may be associated with a corresponding radio access technology (RAT) that specifies the physical connection methodology used in implementing the protocol.
During operation, wireless circuitry 16 may be used to support communication between device 10 and external equipment 22 (e.g., using radio-frequency signals 24). For example, device 10 and/or external equipment 22 may transmit video data, application data, audio data, user input commands, and/or other data to each other (e.g., in one or both directions). If desired, device 10 and/or external equipment 22 may use wired and/or wireless communications circuitry to communicate through one or more communications networks (e.g., the internet, local area networks, etc.). If desired, device 10 may communicate with other end hosts over the internet via radio-frequency signals 24 and external equipment 22. Wireless circuitry 16 may allow data to be received by device 10 from external equipment 22 and/or to provide data to external equipment 22.
While controller(s) 20 are shown separately from wireless circuitry 16 for the sake of clarity, wireless circuitry 16 may include processing circuitry and/or storage circuitry that forms part of controller(s) 20 (e.g., portions of controller(s) 20 may be implemented on wireless circuitry 16). As an example, controller(s) 20 may include baseband circuitry (e.g., one or more baseband processors), digital control circuitry, analog control circuitry, and/or other control circuitry that forms part of wireless circuitry 16. The baseband circuitry may, for example, access a communication protocol stack on controller(s) 20 to: perform user plane functions at a PHY layer, MAC layer, RLC layer, PDCP layer, SDAP layer, and/or PDU layer, and/or to perform control plane functions at the PHY layer, MAC layer, RLC layer, PDCP layer, RRC layer, and/or non-access stratum layer.
Housing structures 12 may include one or more frame members such as outer chassis 12A and inner chassis 12B. Outer chassis 12A may be an outer frame surrounding the interior of device 10 and may, if desired, form exterior surfaces of device 10 (e.g., portions of outer chassis 12A may form one or more housing walls of device 10 such as housing walls that run around a periphery of device 10). Inner chassis 12B may be disposed within the interior of device 10 and may be mounted to outer chassis 12A (e.g., outer chassis 12A may surround inner chassis 12B in the X-Z plane). Strap 12C may be attached to outer chassis 12A at right side 36 of device 10 and left side 34 of device 10 (e.g., using attachment structures such as a joint, a hinge, screws, fasteners, snaps, magnets, etc.). Strap 12C may be permanently attached to outer chassis 12A or may be removable. Right side 36 may sometimes be referred to herein as right edge 36, right face 36, or right wall 36 of device 10. Left side 34 may extend opposite right side 36 and may sometimes be referred to herein as left edge 34, left face 34, or left wall 34 of device 10. Right side 36 and left side 34 may extend from front side 30 to rear side 32 of device 10.
Outer chassis 12A may be formed from conductive material such as aluminum, stainless steel, or titanium. Outer chassis 12A may therefore sometimes be referred to herein as conductive chassis 12A, conductive outer chassis 12A, conductive outer frame 12A, conductive frame 12A, conductive housing 12A, or conductive outer housing 12A. Inner chassis 12B may be formed from conductive material such as magnesium, aluminum, stainless steel, or titanium. Inner chassis 12B may therefore sometimes be referred to herein as conductive chassis 12B, conductive inner chassis 12B, conductive inner frame 12B, conductive frame 12B, conductive housing 12B, conductive inner housing 12B, or conductive support plate 12B.
Outer chassis 12A and inner chassis 12B may provide mechanical support and rigidity for device 10. In addition, one or more components within the interior of device 10 may be mounted or affixed to outer chassis 12A and/or inner chassis 12B. For example, a substrate such as logic board 38 may be mounted to inner chassis 12B. Logic board 38 may, for example, form a main logic board (MLB) for device 10. Other components in device 10 (e.g., portions of I/O components 14 and/or controller(s) 20 of
When device 10 is worn on a user's head, the user's head 33 faces rear side 32 of device 10 and the user's eyes are aligned with displays 18B, as shown by arrows 40. Displays 18B may include a left display that aligns with the user's left eye and a right display that aligns with the user's right eye (e.g., the user's left and right eyes may be located within left and right eye boxes of displays 18B). The left and right displays may include respective pixel arrays (or a single shared pixel array) and optics (e.g., one or more lenses) for directing images from the pixel arrays to the user's eyes (e.g., as binocularly fusible content).
The housing structures 12 of device 10 may also include housing structures at the front side 30 of device 10 opposite rear side 32. Front side 30 may sometimes also be referred to herein as front edge 30, front wall 30, or front face 30 of device 10. Housing structures 12 may include a cover glass assembly (CGA) 28 mounted to outer chassis 12A at front side 30 of device 10. CGA 28 may sometimes also be referred to herein as cover 28 or front cover 28 of device 10. CGA 28 may be fully or partially transparent.
CGA 28 may include multiple layers (sometimes referred to herein as cover layers). For example, CGA 28 may include an outer cover layer for device 10 such as a glass cover layer (sometimes referred to herein as a display cover layer or a cover glass). The glass cover layer may form the exterior surface of device 10 at front side 30. CGA 28 may also include one or more dielectric layers behind and overlapping the glass cover layer (e.g., at an interior side of the glass cover layer). The dielectric layer(s) may include one or more polymer layers, plastic layers, glass layers, ceramic layers, and/or other dielectric layers. If desired, some or all of the dielectric layer(s) may be formed in a ring shape that runs along the periphery of CGA 28 in the X-Z plane and the glass cover layer (e.g., at peripheral edge portions 42 of CGA 28) or may overlap substantially all of the glass cover layer. The dielectric layer(s) behind the glass cover layer may sometimes also be referred to as a cover layer, dielectric member, dielectric cover layer, shroud, trim, and/or canopy. Peripheral edge portions 42 may sometimes also be referred to herein as peripheral region 42 or edge region 42.
CGA 28 may also include a forward-facing display such as display 18A (e.g., a flexible display panel formed from a pixel array based on organic light-emitting diodes or other display panel). CGA 28 may have a central portion or region 44 that overlaps display 18A. Peripheral edge portions 42 of CGA 28 may extend around the lateral periphery of CGA 28 and central portion 44. Display 18A may emit light (e.g., images) through central portion 44 of the dielectric layer(s) and the glass cover layer of CGA 28 (as shown by arrow 46) for view by persons other than the wearer of device 10. The central region 44 of the glass cover layer and the dielectric layer(s) of CGA 28 that overlap display 18A may be fully transparent or partly transparent to help hide display 18A from view when the display is not emitting light. The peripheral edge regions 42 of the glass cover layer and the dielectric layer(s) of CGA 28 may be opaque or transparent. Display 18A may sometimes be referred to herein as a front-facing display or a publicly viewable display.
Housing structures 12 may also include cosmetic covering members, polymer layers (e.g., fully or partly transparent polymer layers), and/or dielectric housing walls layered onto or over outer chassis 12A (e.g., at the exterior of device 10) if desired. Housing structures 12 may also include one or more fabric members, rubber members, ceramic members, dielectric members, curtain members, or other structures at rear side 32 of device 10 that help to accommodate the user's face while wearing device 10 and/or to block external, ambient, or scene light from the environment around the user from interfering with the light from displays 18B being viewed by the user.
Some or all of the lateral surface of CGA 28 may exhibit a curved cross-sectional profile. Within CGA 28, some or all of one or more lateral surfaces of the glass cover layer and/or some or all of one or more of the lateral surfaces of the dielectric layer(s) in CGA 28 may be characterized by a three-dimensional curvature (e.g., spherical curvature, aspherical curvature, freeform curvature, etc.). The three-dimensional curvature may be a compound curvature (e.g., the surfaces exhibiting the curvature may be non-developable surfaces).
In the areas of compound curvature, at least some portions of the curved surface(s) in CGA 28 may be characterized by a radius of curvature R of 4 mm to 250 mm, 8 mm to 200 mm, 10 mm to 150 mm, at least 5 mm, at least 12 mm, at least 16 mm, at least 20 mm, at least 30 mm, less than 200 mm, less than 100 mm, less than 75 mm, less than 55 mm, less than 35 mm, and/or other suitable amounts of curvature. The compound curvature may be, for example, a three-dimensional curvature in which the surface(s) have non-zero radii of curvature about two or more different axes (e.g., non-parallel axes, intersecting axes, non-intersecting axes, perpendicular axes such as the X-axis and Z-axis, etc.) and/or two or more different points within or behind device 10. If desired, one or more of the surfaces of the dielectric layer(s) in CGA 28 may be a developable surface. Display 18A may be a flexible display panel that is bent into a curved shape (e.g., a curved shape following the curved face of a user, a curved shape following the compound curvature of CGA 28, a curved shape characterized by inner and outer developable surfaces, etc.). The compound curvature may serve to provide device 10 with an attractive cosmetic appearance, may help device 10 to exhibit a compact and light weight form factor, may serve to maximize the mechanical strength of device 10, and/or may accommodate easy interaction with device 10 by the user, as examples.
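The distinction between compound and developable curvature noted above can be stated in terms of Gaussian curvature. The following is an illustrative note rather than language from this description, using one of the radii recited above only as an example value.

```latex
% Gaussian curvature K for principal radii of curvature R_1 and R_2
% about two perpendicular axes:
K = \frac{1}{R_1 R_2}
% Developable surface (e.g., a cylinder): one principal radius is infinite, so K = 0.
% Compound (non-developable) curvature: both radii are finite, so K \neq 0.
% Example with R_1 = R_2 = 50\ \mathrm{mm}:
K = \frac{1}{(50\ \mathrm{mm})^2} = 4\times 10^{-4}\ \mathrm{mm}^{-2}
```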
During operation, device 10 may receive image data (e.g., image data for video, still images, etc.) and may present this information on displays 18B and/or 18A. Device 10 may also receive other data, control commands, user input, etc. Device 10 may also transmit data to accessories and other electronic equipment (e.g., external equipment 22 of
Communications such as these may be supported using wired and/or wireless communications. In an illustrative configuration, wireless circuitry 16 (
External equipment 22 of
While operating device 10, the user wears device 10 on head 33. At the same time, the user wears left earbud 22L on and/or within their left ear (at the left side of head 33) and wears right earbud 22R on and/or within their right ear (at the right side of head 33). Earbuds 22L and 22R may each include a speaker, a battery, one or more processors, and wireless circuitry having one or more antennas. Earbuds 22L and 22R may be wireless earbuds having batteries that are rechargeable when earbuds 22L and 22R are plugged into a power adapter, placed on or within a charging dock, or placed within a charging case, for example.
One or more antennas in device 10 may transmit audio data in radio-frequency signals 24A to earbuds 22R and 22L. Earbuds 22L and 22R may play the audio data over the speakers in earbuds 22L and 22R. The audio data may include a first stream of audio data (e.g., left audio data) for playback by left earbud 22L and a second, different, stream of audio data (e.g., right audio data) for playback by right earbud 22R (e.g., to provide the user with stereo, three-dimensional, spatial, and/or surround sound). One or more antennas in device 10 may also convey other wireless data in radio-frequency signals 24W.
Additionally or alternatively, one or both of earbuds 22L and 22R may include one or more sensors that generate sensor data. The sensors may include a microphone, a touch sensor, a force sensor, an orientation sensor (e.g., a gyroscope, inertial measurement unit, motion sensor, etc.), an ambient light sensor, a proximity sensor, a magnetic sensor, a temperature sensor, and/or other sensors. The microphone may generate microphone data (e.g., voice data from the user speaking while wearing the earbuds). The touch sensor may generate touch sensor data and the force sensor may generate force sensor data (e.g., indicative of a user input provided to device 10 via the earbuds, indicative of the earbuds being presently located in the ears of the user, etc.). The ambient light sensor may generate ambient light sensor data (e.g., indicative of the location of device 10 and/or lighting conditions around the user). In general, the sensors may generate any desired sensor data. Earbuds 22L and 22R may transmit the sensor data to one or more antennas in device 10 using radio-frequency signals 24A and/or using radio-frequency signals 24W.
The frequency bands handled by transceiver 66 may include wireless personal area network (WPAN) frequency bands such as the 2.4 GHz Bluetooth® band or other WPAN communications bands, cellular telephone communications bands such as a cellular low band (600-960 MHz), a cellular low-midband (1400-1550 MHz), a cellular midband (1700-2200 MHz), a cellular high band (2300-2700 MHz), a cellular ultra-high band (3300-5000 MHz), or other cellular communications bands between about 600 MHz and about 5000 MHz, 3G bands, 4G LTE bands, 3GPP 5G New Radio Frequency Range 1 (FR1) bands below 10 GHz, 3GPP 5G New Radio (NR) Frequency Range 2 (FR2) bands between 20 and 60 GHz, other centimeter or millimeter wave frequency bands between 10-300 GHz, wireless local area network (WLAN) frequency bands (e.g., Wi-Fi® (IEEE 802.11) or other WLAN communications bands) such as a 2.4 GHz WLAN band (e.g., from 2400 to 2480 MHz), a 5 GHz WLAN band (e.g., from 5180 to 5825 MHz), a Wi-Fi® 6E band (e.g., from 5925-7125 MHz), and/or other Wi-Fi® bands (e.g., from 1875-5160 MHz), near-field communications frequency bands (e.g., at 13.56 MHz), satellite navigation frequency bands such as the Global Positioning System (GPS) bands, Global Navigation Satellite System (GLONASS) bands, and BeiDou Navigation Satellite System (BDS) bands, ultra-wideband (UWB) frequency bands that operate under the IEEE 802.15.4 protocol and/or other ultra-wideband communications protocols (e.g., a first UWB communications band at 6.5 GHz and/or a second UWB communications band at 8.0 GHz), communications bands under the family of 3GPP wireless communications standards, communications bands under the IEEE 802.XX family of standards, satellite communications bands, unlicensed bands such as an unlicensed band at 2.4 GHz and/or an unlicensed band between 5-6 GHz, emergency and/or public services bands, and/or any other desired frequency bands of interest. Transceiver 66 may also be used to perform spatial ranging operations if desired (e.g., using a radar scheme).
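Purely as an illustration of how such a band plan might be represented in software (the names below are hypothetical and only a subset of the ranges recited above is shown), a simple lookup can map a frequency to the listed bands that contain it:

```python
# Illustrative sketch only: a subset of the frequency ranges recited above as a
# lookup table, with a helper that reports which listed bands contain a frequency.
BANDS_MHZ = {
    "cellular_low": (600, 960),
    "cellular_low_mid": (1400, 1550),
    "cellular_mid": (1700, 2200),
    "cellular_high": (2300, 2700),
    "cellular_ultra_high": (3300, 5000),
    "wlan_2_4_ghz": (2400, 2480),
    "wlan_5_ghz": (5180, 5825),
    "wifi_6e": (5925, 7125),
}

def bands_containing(freq_mhz: float) -> list[str]:
    """Return the names of all listed bands whose range contains freq_mhz."""
    return [name for name, (lo, hi) in BANDS_MHZ.items() if lo <= freq_mhz <= hi]

# Example: 2440 MHz falls in both the cellular high band and the 2.4 GHz WLAN band.
print(bands_containing(2440))
```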
As shown in
Antenna 50 may have an antenna feed coupled between antenna resonating element 52 and antenna ground 54. The antenna feed may have a first (positive or signal) antenna feed terminal 56 coupled to antenna resonating element 52. The antenna feed may also have a second (ground or negative) antenna feed terminal 58 coupled to antenna ground 54. Antenna resonating element 52 may be separated from antenna ground 54 by a dielectric (non-conductive) gap. Antenna resonating element 52 and antenna ground 54 may be formed from separate pieces of metal or other conductive materials or may, if desired, be formed from separate portions of the same integral piece of metal. If desired, antenna 50 may include additional antenna conductors that are not coupled to antenna feed terminals 56 and 58 (e.g., parasitic elements).
Each antenna feed and thus each antenna 50 in wireless circuitry 16 may be coupled to one or more transceivers 66 in wireless circuitry 16 over a corresponding radio-frequency transmission line 60. Radio-frequency transmission line 60 may include a signal conductor such as signal conductor 62 (e.g., a positive signal conductor) and a ground conductor such as ground conductor 64. Ground conductor 64 may be coupled to antenna feed terminal 58 of antenna 50. Signal conductor 62 may be coupled to antenna feed terminal 56 of antenna 50. Radio-frequency transmission line 60 may include one or more of a stripline, microstrip, coaxial cable, coaxial probes, edge-coupled microstrip, edge-coupled stripline, waveguide, radio-frequency connector, combinations of these, etc. Radio-frequency transmission line 60 may also sometimes be referred to herein as a radio-frequency transmission line path. If desired, filter circuitry, tuning components, switching circuitry, impedance matching circuitry, phase shifter circuitry, amplifier circuitry, and/or other circuitry may be disposed on radio-frequency transmission line 60 and/or may be coupled between two or more of the antenna conductors in antenna 50.
The term “convey radio-frequency signals” as used herein means the transmission and/or reception of the radio-frequency signals (e.g., for performing unidirectional and/or bidirectional wireless communications with external wireless communications equipment). During transmission of radio-frequency signals 24, transceiver 66 transmits radio-frequency signals 24 (e.g., as modulated using wireless data such as audio data, control data, etc.) over radio-frequency transmission line 60. The radio-frequency signals may excite antenna currents to flow around the edges of antenna resonating element 52 and antenna ground 54 (via antenna feed terminals 56 and 58). The antenna currents may radiate radio-frequency signals 24 into free space (e.g., based at least on a resonance established by the radiating length of antenna resonating element 52 and/or antenna ground 54).
During the reception of radio-frequency signals 24 (e.g., as modulated by external equipment using wireless data such as voice data, sensor data, image data, etc.), incident radio-frequency signals 24 may excite antenna currents to flow around the edges of antenna resonating element 52 and antenna ground 54. The antenna currents may pass radio-frequency signals 24 to transceiver 66 over radio-frequency transmission line 60. Transceiver 66 may downconvert the radio-frequency signals to baseband and may demodulate wireless data from the signals (e.g., using baseband circuitry such as one or more baseband processors).
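As a rough, generic illustration of the downconversion step described above (not a description of transceiver 66 itself), a received passband waveform can be mixed with a complex exponential at the carrier frequency and low-pass filtered to recover the baseband signal; the sample rate, carrier frequency, and filter below are arbitrary example values.

```python
# Minimal downconversion sketch with synthetic samples (illustrative only).
import numpy as np

fs = 40e6                      # sample rate in Hz (arbitrary example value)
fc = 10e6                      # carrier frequency in Hz (arbitrary example value)
t = np.arange(4096) / fs

# Synthetic passband signal: a 100 kHz complex baseband tone modulated onto the carrier.
baseband = np.exp(1j * 2 * np.pi * 100e3 * t)
passband = np.real(baseband * np.exp(1j * 2 * np.pi * fc * t))

# Mix down by the carrier and low-pass filter (simple moving average) to strip the
# double-frequency term, leaving a scaled copy of the original baseband tone.
mixed = passband * np.exp(-1j * 2 * np.pi * fc * t)
lowpass = np.convolve(mixed, np.ones(64) / 64, mode="same")

# "lowpass" now approximates baseband / 2; a demodulator would recover wireless data from it.
```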
Antennas 50 may be formed using any suitable antenna structures. For example, antennas 50 may include antennas with antenna resonating elements that are formed from patch antenna structures (e.g., shorted patch antenna structures), slot antenna structures, loop antenna structures, stacked patch antenna structures, antenna structures having parasitic elements, inverted-F antenna structures, planar inverted-F antenna structures, helical antenna structures, monopole antenna structures, dipole antenna structures, Yagi (Yagi-Uda) antenna structures, surface integrated waveguide structures, hybrids of two or more of these designs, etc. If desired, one or more antennas 50 may be cavity-backed antennas. Two or more antennas 50 may be arranged in a phased antenna array if desired (e.g., for conveying centimeter and/or millimeter wave signals within a signal beam formed in a desired beam pointing direction that may be steered/adjusted over time). Earbuds 22R and 22L may also have wireless circuitry such as wireless circuitry 16 of
Device 10 may include a first set of one or more antennas that convey radio-frequency signals 24A with earbuds 22R and 22L (
Antennas 50W-1 and 50W-2 may be coupled to a first transceiver 66W over radio-frequency transmission lines 60-1 and 60-2, respectively. Antenna 50A may be coupled to a second transceiver 66A over radio-frequency transmission line 60-3. Transceivers 66W and 66A may be formed using different respective radios, modems, chips, integrated circuits, integrated circuit (IC) packages, and/or modules. Transceiver 66W may convey radio-frequency signals 24W (
Transceiver 66W may convey radio-frequency signals 24W using at least a first communications protocol, at least a first RAT, and a first set of frequency bands. An implementation in which radio-frequency signals 24W include WLAN signals conveyed using a WLAN protocol (e.g., a Wi-Fi protocol), the WLAN RAT, and WLAN frequency bands is described herein as an example. If desired, radio-frequency signals 24W may also include Bluetooth signals conveyed using a Bluetooth protocol and Bluetooth frequency bands. Transceiver 66W may therefore sometimes be referred to herein as WLAN transceiver 66W, Wi-Fi transceiver 66W, or WLAN/Bluetooth transceiver 66W. Radio-frequency signals 24W may sometimes be referred to herein as WLAN or Wi-Fi signals 24W. This is merely illustrative and, in general, radio-frequency signals 24W may be conveyed using any desired protocol(s).
In some scenarios, Bluetooth signals conveyed by transceiver 66W are used to convey streams of audio data between device 10 and earbuds 22L and 22R. However, Bluetooth signaling can involve an excessive amount of latency and an excessive glitch rate. This can be disruptive to the user experience while listening to audio on earbuds 22L and 22R, particularly for audio data with a relatively high data rate (e.g., as required for immersive, high definition, three-dimensional audio presented to the user along with virtual reality content on displays 18B of
To mitigate these issues, transceiver 66A may convey radio-frequency signals 24A using a second communications protocol, a second RAT, and a second set of frequency bands different from those used by transceiver 66W. For example, transceiver 66A may convey radio-frequency signals 24A using a non-Bluetooth, ultra-low-latency audio communications protocol optimized to support low latency and high data rate audio streaming from device 10 to earbuds 22L and 22R. Radio-frequency signals 24A may be conveyed in different frequency bands than radio-frequency signals 24W. For example, radio-frequency signals 24A may be conveyed using an unlicensed band at 2.4 GHz and/or an unlicensed band between 5-6 GHz. The band between 5-6 GHz may allow for a larger bandwidth than the 2.4 GHz band. In addition, the band between 5-6 GHz may allow for fewer coexistence/interference issues than the 2.4 GHz band, which coexists with the Bluetooth band, household appliances such as microwaves that emit around 2.4 GHz, etc.
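Using only the band edges recited earlier in this description (2400 to 2480 MHz for the 2.4 GHz WLAN band and 5180 to 5825 MHz for the 5 GHz WLAN band) as a rough proxy for the two unlicensed regions, the difference in available spectrum is:

```latex
\mathrm{BW}_{2.4\,\mathrm{GHz}} \approx 2480 - 2400 = 80\ \mathrm{MHz}
\qquad
\mathrm{BW}_{5\text{--}6\,\mathrm{GHz}} \approx 5825 - 5180 = 645\ \mathrm{MHz}
```

Actual usable bandwidth depends on regional regulations and the channelization of the low-latency-audio protocol, but the comparison illustrates why the 5-6 GHz region can support wider audio channels with fewer coexistence constraints.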
The ultra-low-latency audio protocol may involve communications without performing time division duplexing between earbuds 22L and 22R and may involve communications with a lower packet re-transmission count limit, lower latency, lower glitch rate (e.g., 1 glitch per hour or fewer), more stability, and less interference than the Bluetooth protocol. Further, the ultra-low-latency audio protocol requires both earbuds 22R and 22L to convey radio-frequency signals 24A directly with device 10 rather than relaying signals or data between earbuds 22R and 22L and has a wireless fading channel selected to have a tighter distribution and shorter tail at the low power end than the Bluetooth protocol. Transceiver 66A may therefore sometimes be referred to herein as audio transceiver 66A. Radio-frequency signals 24A may sometimes be referred to herein as audio signals 24A. The example in which transceiver 66A conveys audio data is merely illustrative and, in general, transceiver 66A may use radio-frequency signals 24A to convey any desired wireless data.
During transmission, transceiver 66A may transmit audio data AUD in radio-frequency signals 24A (e.g., radio-frequency signals 24A may be modulated to carry audio data AUD). Antenna 50A may transmit the radio-frequency signals 24A including audio data AUD. Audio data AUD may include a stream of audio data packets. The stream of audio data packets may include a first set of audio data packets (or any desired first portion of the stream of audio data as distributed across one or more packets) for playback by left earbud 22L (e.g., a stream of left speaker audio data). The stream of audio data packets may also include a second set of audio data packets (or any desired second portion of the stream of audio data as distributed across one or more packets) for playback by right earbud 22R (e.g., a stream of right speaker audio data). The first and second sets may be interspersed or interleaved in time, for example.
Since the ultra-low-latency audio communications protocol governing transmission of radio-frequency signals 24A does not involve time division duplexing (TDD) between earbuds 22R and 22L, the same audio data AUD (e.g., the stream of audio data packets including both left and right speaker audio data) is concurrently (e.g., simultaneously) transmitted to both earbuds 22R and 22L and is concurrently received by both earbuds 22R and 22L. The controllers on earbuds 22R and 22L may demodulate the received audio data to recover the first and second sets of audio data packets. Left earbud 22L may then play the first set of audio data packets without playing (e.g., while discarding) the received second set of audio data packets. Right earbud 22R may play the second set of audio data packets without playing (e.g., while discarding) the received first set of audio data packets. Earbuds 22L and 22R may also transmit radio-frequency signals 24A to antenna 50A on device 10 to confirm/acknowledge receipt of audio data AUD, to convey voice/sensor data to device 10, etc. Since the sensor data gathered by earbuds 22R and 22L may not be subject to the same strict latency requirements as the audio data conveyed by transceiver 66A, earbuds 22L and 22R may, if desired, include additional wireless circuitry that transmits some or all of the sensor data to device 10 using the Bluetooth protocol or other protocols.
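The single-antenna scheme described above, in which both earbuds receive the same interleaved stream and each plays only its own packets while discarding the others, can be sketched as follows; the packet fields and function names are hypothetical illustrations rather than part of this description.

```python
# Illustrative sketch: both earbuds receive the same interleaved packet stream
# and each keeps only the packets tagged for its own channel. Field names are hypothetical.
from dataclasses import dataclass

@dataclass
class AudioPacket:
    seq: int          # sequence number within the stream
    channel: str      # "L" for left-speaker audio, "R" for right-speaker audio
    samples: bytes    # encoded audio payload

def broadcast(stream: list[AudioPacket]) -> tuple[list[AudioPacket], list[AudioPacket]]:
    """Model concurrent reception: both earbuds see every packet."""
    return list(stream), list(stream)

def play_own_channel(received: list[AudioPacket], channel: str) -> list[AudioPacket]:
    """Each earbud plays its own packets and discards the other channel's packets."""
    return [pkt for pkt in received if pkt.channel == channel]

stream = [AudioPacket(i, "L" if i % 2 == 0 else "R", b"") for i in range(8)]
left_rx, right_rx = broadcast(stream)
left_playback = play_own_channel(left_rx, "L")    # packets 0, 2, 4, 6
right_playback = play_own_channel(right_rx, "R")  # packets 1, 3, 5, 7
```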
In the example of
In the example of
The radio-frequency signals 24A conveyed by antenna 50A-L may sometimes be referred to herein as radio-frequency signals 24A-L. The radio-frequency signals 24A conveyed by antenna 50A-R may sometimes be referred to herein as radio-frequency signals 24A-R. During transmission, transceiver 66A may transmit a first (left) stream of audio data AUDL in radio-frequency signals 24A-L. Transceiver 66A may concurrently transmit a second (right) stream of audio data AUDR in radio-frequency signals 24A-R. Antenna 50A-R may transmit radio-frequency signals 24A-R and thus audio data AUDR to earbud 22R. Antenna 50A-L may concurrently transmit radio-frequency signals 24A-L and thus audio data AUDL to earbud 22L. Audio data AUDL may include a first stream of audio packets (e.g., a first set of audio packets) for playback by left earbud 22L. Audio data AUDR may include a second stream of audio packets (e.g., a second set of audio packets) for concurrent playback by right earbud 22R.
Earbud 22L may also transmit radio-frequency signals 24A-L to antenna 50A-L on device 10 to confirm/acknowledge receipt of audio data AUDL, to convey voice/sensor data to device 10, etc. Similarly, earbud 22R may also transmit radio-frequency signals 24A-R to antenna 50A-R on device 10 to confirm/acknowledge receipt of audio data AUDR, to convey voice/sensor data to device 10, etc.
Conveying respective audio data streams to earbuds 22R and 22L using separate antennas 50A-R and 50A-L may serve to improve link quality and reduce glitch rate relative to using the same antenna 50A (
To allow for a simpler low-latency-audio protocol without requiring timing configuration for separate streams of audio data, transceiver 66A may transmit the same stream of audio data AUD over both antennas 50A-R and 50A-L.
As shown in
Wireless circuitry 16 may include a radio-frequency signal splitter/combiner 70 having a first port coupled to radio-frequency transmission line 60-3. Splitter/combiner 70 may have a second port coupled to antenna 50A-R over radio-frequency transmission line 60-4. Splitter/combiner 70 may have a third port coupled to antenna 50A-L over radio-frequency transmission line 60-5. Splitter/combiner 70 may sometimes be referred to herein simply as signal splitter 70 or combiner 70.
During transmission, transceiver 66A may transmit audio data AUD over radio-frequency transmission line 60-3. Splitter/combiner 70 may act as a radio-frequency signal splitter that transmits the same audio data AUD from radio-frequency transmission line 60-3 onto both radio-frequency transmission line 60-4 (in radio-frequency signals 24A-R) and radio-frequency transmission line 60-5 (in radio-frequency signals 24A-L). Antenna 50A-R may transmit the radio-frequency signals 24A-R including audio data AUD. Antenna 50A-L may concurrently transmit the radio-frequency signals 24A-L including the same audio data AUD. For example, antennas 50A-R and 50A-L may concurrently and sequentially transmit each audio packet in the stream of audio packets from audio data AUD (e.g., antennas 50A-R and 50A-L may concurrently or simultaneously transmit a first packet from audio data AUD, may then concurrently or simultaneously transmit a second packet from audio data AUD, may then concurrently or simultaneously transmit a third audio packet from audio data AUD, etc.). Earbuds 22R and 22L may thereby concurrently receive the same stream of audio data AUD, may extract their respective portions of audio data AUD for playback, and may play their respective portions of audio data AUD on the corresponding earbud speakers.
Since the same stream of audio data AUD is transmitted by both antennas 50A-R and 50A-L, there is no concern for interference between radio-frequency signals 24A-R and 24A-L or between earbuds 22L and 22R in this configuration. Earbud 22L may also transmit radio-frequency signals 24A-L to antenna 50A-L on device 10 to confirm/acknowledge receipt of audio data AUDL, to convey voice/sensor data to device 10, etc. Similarly, earbud 22R may also transmit radio-frequency signals 24A-R to antenna 50A-R on device 10 to confirm/acknowledge receipt of audio data AUDR, to convey voice/sensor data to device 10, etc. Splitter/combiner 70 may serve as a radio-frequency combiner that combines the received radio-frequency signals 24A-R from antenna 50A-R and the received radio-frequency signals 24A-L from antenna 50A-L onto radio-frequency transmission line 60-3.
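As a general note on the splitter (not a limitation of splitter/combiner 70), an ideal lossless two-way split delivers half of the transmit power to each of antennas 50A-R and 50A-L:

```latex
P_{\mathrm{port}} = \frac{P_{\mathrm{in}}}{2}
\qquad
\mathrm{IL} = 10\log_{10}\!\frac{P_{\mathrm{in}}}{P_{\mathrm{port}}} = 10\log_{10} 2 \approx 3\ \mathrm{dB}
```

A practical splitter adds some excess insertion loss beyond this 3 dB split, which the link budget for radio-frequency signals 24A-R and 24A-L would need to accommodate.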
In practice, there may exist a cross-head channel over which earbud 22R receives radio-frequency signals 24A-L transmitted by antenna 50A-L and/or a cross-head channel over which earbud 22L receives radio-frequency signals 24A-R transmitted by antenna 50A-R. In implementations where antennas 50A-L and 50A-R both concurrently transmit the same stream of audio data AUD, the cross-head channels may be used to boost signal reception at earbuds 22R and/or 22L. However, the cross-head channels are usually at least 10 dB lower than the direct wireless channels between left earbud 22L and antenna 50A-L and between right earbud 22R and antenna 50A-R. If desired, wireless circuitry 16 may include a phase shifter that serves to boost the cross-head channel to further boost signal quality at the earbuds.
Phase shifter 72 may be controlled (e.g., using a control signal received from controller(s) 20 of
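One illustrative way to think about selecting the phase rotation (a sketch with synthetic channel values, not firmware from this description) is to sweep the phase applied to one antenna until the cross-head path adds constructively with the direct path at an earbud:

```python
# Illustrative sketch of selecting a phase-shifter setting that makes the
# cross-head path add constructively with the direct path at one earbud.
# Channel values are synthetic; this is not firmware from this description.
import numpy as np

h_direct = 1.0 * np.exp(1j * 0.3)          # direct channel to the earbud (synthetic)
h_cross = 0.3 * np.exp(1j * 2.1)           # weaker cross-head channel (roughly 10 dB down)

phases = np.linspace(0, 2 * np.pi, 360, endpoint=False)
combined_power = np.abs(h_direct + np.exp(1j * phases) * h_cross) ** 2
best_phase = phases[np.argmax(combined_power)]

# best_phase is approximately angle(h_direct) - angle(h_cross), i.e. the setting
# that rotates the cross-head contribution into phase with the direct path.
```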
In practice, it may be desirable for transceivers 66W and 66A to be located as far away from each other in device 10 as possible to minimize coupling and interference between the transceivers when operating at similar frequencies. It may be desirable for transceiver 66W to have a dedicated path or port for conveying Bluetooth data that is not shared with a path or port used to convey WLAN data. If desired, rather than adding a dedicated Bluetooth antenna to device 10 (which can consume excessive space in device 10), one of antennas 50A such as antenna 50A-L may also be used to convey Bluetooth data for transceiver 66W.
During transmission, transceiver 66W may transmit radio-frequency signals 24W at a Bluetooth frequency and including Bluetooth data BT over radio-frequency transmission line 60-6. Duplexer 74 may (passively) pass the radio-frequency signals 24W having Bluetooth data BT onto radio-frequency transmission line 60-7. Duplexer 74 may also pass the radio-frequency signals 24A-L having audio data AUD onto radio-frequency transmission line 60-7. Antenna 50A-L may transmit radio-frequency signals 24A-L containing audio data AUD and may transmit radio-frequency signals 24W containing Bluetooth data BT (e.g., at different times, concurrently, etc.).
During reception, antenna 50A-L may receive radio-frequency signals 24A-L containing audio data AUD and may receive radio-frequency signals 24W containing Bluetooth data BT. Duplexer 74 may (passively) split the radio-frequency signals received over radio-frequency transmission line 60-7 by frequency, such that radio-frequency signals 24W at Bluetooth frequencies are passed to radio-frequency transmission line 60-6 and radio-frequency signals 24A-L at an ultra-low-latency audio frequency are passed to radio-frequency transmission line 60-5. Alternatively, transceiver 66W may use antenna 50A-R or both antennas 50A-R and 50A-L (e.g., using multiple duplexers and transmission lines) to convey Bluetooth data BT.
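Behaviorally, duplexer 74 routes signals between transmission lines by frequency. The sketch below models that routing with hypothetical band edges (around 2.4 GHz for the Bluetooth signals and the 5-6 GHz region for the low-latency-audio signals); it is an illustration, not a description of the duplexer hardware.

```python
# Very rough behavioral model of the duplexer described above: received energy is
# routed to one of two transmission-line ports based on its frequency. Band edges
# are hypothetical placeholders, not values from this description.
def duplexer_route(freq_ghz: float) -> str:
    """Return which port an ideal duplexer would pass a signal at freq_ghz to."""
    if 2.4 <= freq_ghz <= 2.5:
        return "port_60_6_bluetooth"            # toward transceiver 66W
    if 5.0 <= freq_ghz <= 6.0:
        return "port_60_5_low_latency_audio"    # toward transceiver 66A
    return "rejected"

assert duplexer_route(2.44) == "port_60_6_bluetooth"
assert duplexer_route(5.6) == "port_60_5_low_latency_audio"
```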
It can be challenging to place antennas 50W-1, 50W-2, 50A, 50A-L, and 50A-R at locations in device 10 that allow the antennas to exhibit satisfactory levels of radio-frequency performance, particularly given the lightweight and compact form factor of device 10. For example, if care is not taken, the presence of conductive structures such as outer chassis 12A, inner chassis 12B, conductive material on logic board 38, displays 18B, display 18A, and other conductive components can undesirably block, load, detune, or distort the radiation pattern of the antennas. In addition, some locations may be subject to excessive on-body channel fading, some locations may lead to excessive wireless interference or coexistence issues with other antennas and/or circuitry in device 10, some locations may produce excessive levels of radio-frequency exposure to the user of device 10 (potentially implicating regulatory requirements on radio-frequency exposure or absorption), some locations may undesirably impact radio link budget, some locations may involve excessive RF cable path loss (e.g., when the antennas are located excessively far from the corresponding transceiver), some locations may involve poor polarization matching with the antennas on earbuds 22R and 22L, etc.
As shown in
Device 10 may have a top side 80 and a bottom side 82 opposite top side 80. Top side 80 may sometimes also be referred to herein as top edge 80, top wall 80, or top face 80 of device 10. Bottom side 82 may sometimes also be referred to herein as bottom edge 82, bottom wall 82, or bottom face 82 of device 10. Right side 36 and left side 34 may extend from top side 80 to bottom side 82 of device 10.
Display 18A may include pixel circuitry and other conductive components that can block radio-frequency signals conveyed by the antennas in device 10. As such, antennas 50W-1, 50W-2, and 50A may be disposed within device 10 at locations overlapping peripheral edge portions 42 of CGA 28. Antennas 50W-1 and 50W-2 may be mounted within device 10 and overlapping an upper region or area of peripheral edge portions 42 (e.g., antennas 50W-1 and 50W-2 may be interposed between display 18A and top side 80 of device 10). Antennas 50W-1 and 50W-2 may convey radio-frequency signals 24W through the dielectric material in CGA 28 and/or the top, bottom, right, left, and/or rear sides of device 10. Antennas 50W-1 and 50W-2 may be disposed at opposing sides of device 10 (e.g., antenna 50W-1 may be disposed at or adjacent right side 36 whereas antenna 50W-2 is disposed at or adjacent left side 34 of device 10) to maximize spatial diversity for transceiver 66W. Antennas 50W-1 and 50W-2 may, for example, be mounted at opposing sides of nose bridge region 85 of device 10. Nose bridge region 85 may rest on the user's nose while wearing device 10 on their head. Nose bridge region 85 may be laterally interposed between the left and right displays 18B in device 10 (
Antenna 50A may be mounted within device 10 and overlapping a lower region or area of peripheral edge portions 42 (e.g., antenna 50A may be interposed between display 18A and bottom side 82 of device 10). Antenna 50A may convey radio-frequency signals 24A with both earbuds 22R and 22L through the dielectric material in CGA 28 and/or the top, bottom, right, left, and/or rear sides of device 10. Disposing antenna 50A along the bottom edge of device 10 may serve to minimize the amount of conductive material in device 10 that lies between antenna 50A and the location of earbuds 22R and 22L while device 10 is being worn by the user. If desired, antenna 50A may overlap nose bridge portion 85 of device 10 (e.g., antenna 50A may be disposed at the center of device 10 along the X-axis). This may allow antenna 50A to exhibit optimal and balanced channel conditions with both right earbud 22R at right side 36 of device 10 and left earbud 22L at left side 34 of device 10.
As shown in
Antennas 50A-R and 50A-L may be mounted within device 10 at different locations overlapping peripheral edge portions 42 of CGA 28. Antenna 50A-R may, for example, be mounted at or adjacent to corner 84R of display 18A and/or corner 86R of device 10 (e.g., antenna 50A-R may be laterally interposed between corner 84R of display 18A and corner 86R of device 10). Antenna 50A-L may be mounted at or adjacent to corner 84L of display 18A and/or corner 86L of device 10 (e.g., antenna 50A-L may be laterally interposed between corner 84L of display 18A and corner 86L of device 10). In this way, display 18A may be vertically interposed between the antennas 50W (
Device 10 may have a central longitudinal axis 88 extending from right side 36 to left side 34 (parallel to the X-axis and perpendicular to nose bridge region 85 of
The example of
As shown in
In configurations in which antennas 50A-R and 50A-L are mounted within device 10 at locations overlapping CGA 28 (e.g., as shown in the front view of
As shown in
When placed and oriented in this way, the antenna resonating element 52 may exhibit optimal channel characteristics in conveying radio-frequency signals 24A or 24A-R with right earbud 22R (e.g., with minimal blockage by the user's head, display 18A, and/or the other conductive structures of device 10). Mounting antenna resonating element 52 at rear side 32 of device 10 (as shown in
Antenna 50A-L (e.g., the antenna resonating element 52 of antenna 50A-L) may be rotated, tilted, or oriented at a non-parallel and non-perpendicular angle 94 with respect to longitudinal axis 88 of device 10 (e.g., the X-axis of
Antenna 50A-L may exhibit an angular field of view (FOV) 100 (e.g., an angular/spatial region around or facing the antenna resonating element of the antenna in which the antenna exhibits a gain or antenna efficiency that exceeds a threshold gain or antenna efficiency). Similarly, antenna 50A-R may exhibit a FOV 98. Angle 94 may be selected such that the expected location of left earbud 22L lies within FOV 100 of antenna 50A-L. This may allow antenna 50A-L to convey radio-frequency signals 24A-L with left earbud 22L (e.g., while minimizing blockage by outer chassis 12A and/or other conductive components). Similarly, angle 96 may be selected such that the expected location of right earbud 22R lies within FOV 98 of antenna 50A-R. This may allow antenna 50A-R to convey radio-frequency signals 24A-R with right earbud 22R (e.g., while minimizing blockage by outer chassis 12A and/or other conductive components). If desired, the geometry of CGA 28 may be altered to enhance the size of FOV 100 and FOV 98. For example, CGA 28 may exhibit greater curvatures (e.g., greater radii of curvature) within peripheral edge portions 42 than overlapping display 18A to effectively maximize the size of FOV 100 and FOV 98.
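The field-of-view condition described above (the expected earbud location falling within FOV 98 or FOV 100) reduces to an angular test between an antenna's boresight direction and the direction toward the earbud; the vectors and half-angle below are made-up example values, not geometry from this description.

```python
# Illustrative geometric check: is the expected earbud direction within an antenna's
# angular field of view? Vectors and the FOV half-angle are made-up example values.
import numpy as np

def within_fov(boresight: np.ndarray, target_dir: np.ndarray, half_angle_deg: float) -> bool:
    """Return True if target_dir lies within half_angle_deg of the antenna boresight."""
    cos_angle = np.dot(boresight, target_dir) / (
        np.linalg.norm(boresight) * np.linalg.norm(target_dir))
    return bool(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))) <= half_angle_deg)

boresight_left = np.array([-0.7, 0.0, -0.7])      # tilted toward the left ear (example)
toward_left_earbud = np.array([-1.0, -0.2, -0.9])
print(within_fov(boresight_left, toward_left_earbud, half_angle_deg=60.0))  # True here
```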
Tilting antenna 50A-L at angle 94 may serve to match the polarization of the radio-frequency signals 24A-L conveyed by antenna 50A-L with the polarization of the antenna(s) on left earbud 22L while worn by the user, thereby maximizing efficiency. Similarly, tilting antenna 50A-R at angle 96 may serve to match the polarization of the radio-frequency signals 24A-R conveyed by antenna 50A-R with the polarization of the antenna(s) on right earbud 22R while worn by the user. Additionally or alternatively, the transceiver may electronically adjust the polarizations of antennas 50A-L and 50A-R to help match the polarizations of the antennas to the polarizations of the earbuds.
If desired, angles 94 and 96 may be selected such that the sum of the magnitudes of angles 94 and 96 (e.g., the relative angle between the surfaces containing the antenna resonating elements 52 of antennas 50A-L and 50A-R) is approximately equal to 90 degrees (e.g., 80-100 degrees, 70-110 degrees, 85-95 degrees, 88-92 degrees, 89-91 degrees, 89.5-90.5 degrees, or other angles around 90 degrees). Put differently, the antenna resonating elements 52 of antennas 50A-L and 50A-R may be oriented at approximately 90 degrees with respect to each other. Angle 94 may, for example, have a magnitude equal to that of angle 96. Angles 94 and 96 may each have a magnitude of 45 degrees, as one example. This may help to configure antenna 50A-R to convey radio-frequency signals 24A-R with a polarization that is orthogonal to the polarization with which antenna 50A-L conveys radio-frequency signals 24A-L. This may help to minimize destructive interference between radio-frequency signals 24A-R and 24A-L in configurations where radio-frequency signals 24A-R and 24A-L concurrently convey the same stream of audio data AUD to both earbuds 22L and 22R.
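The polarization matching and orthogonal-polarization behavior described above follows the usual linear-polarization mismatch relationship; the sketch below illustrates that relationship with generic values and is not a measured characterization of antennas 50A-L and 50A-R.

```python
# Illustrative only: polarization loss factor for two linearly polarized antennas
# misaligned by delta_deg. 0 degrees -> perfect match, 90 degrees -> orthogonal (deep null).
import numpy as np

def polarization_loss_db(delta_deg: float) -> float:
    plf = np.cos(np.radians(delta_deg)) ** 2      # polarization loss factor, 0..1
    return 10 * np.log10(max(plf, 1e-12))         # in dB (large negative when orthogonal)

print(polarization_loss_db(0.0))    # ~0 dB: polarizations matched
print(polarization_loss_db(45.0))   # ~-3 dB
print(polarization_loss_db(90.0))   # very large negative: orthogonal polarizations isolate the links
```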
As used herein, the term “concurrent” means at least partially overlapping in time. In other words, first and second events are referred to herein as being “concurrent” with each other if at least some of the first event occurs at the same time as at least some of the second event (e.g., if at least some of the first event occurs during, while, or when at least some of the second event occurs). First and second events can be concurrent if the first and second events are simultaneous (e.g., if the entire duration of the first event overlaps the entire duration of the second event in time) but can also be concurrent if the first and second events are non-simultaneous (e.g., if the first event starts before or after the start of the second event, if the first event ends before or after the end of the second event, or if the first and second events are partially non-overlapping in time). As used herein, the term “while” is synonymous with “concurrent.”
As described above, one aspect of the present technology is the gathering and use of information such as information from input-output devices. The present disclosure contemplates that in some instances, data may be gathered that includes personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, twitter ID's, home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, username, password, biometric information, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information, in the present technology, can be used to the benefit of users. For example, the personal information data can be used to deliver targeted content that is of greater interest to the user. Accordingly, use of such personal information data enables users to have control of the delivered content. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure. For instance, health and fitness data may be used to provide insights into a user's general wellness, or may be used as positive feedback to individuals using technology to pursue wellness goals.
The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. Such policies should be easily accessible by users, and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur after receiving the informed consent of the users. Additionally, such entities should consider taking any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted for the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the United States, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA), whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence different privacy practices should be maintained for different personal data types in each country.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services or anytime thereafter. In another example, users can select not to provide certain types of user data. In yet another example, users can select to limit the length of time user-specific data is maintained. In addition to providing “opt in” and “opt out” options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an application (“app”) that their personal information data will be accessed and then reminded again just before personal information data is accessed by the app.
Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way to minimize risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
Therefore, although the present disclosure broadly covers use of information that may include personal information data to implement one or more various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data.
Physical environment: A physical environment refers to a physical world that people can sense and/or interact with without aid of electronic systems. Physical environments, such as a physical park, include physical articles, such as physical trees, physical buildings, and physical people. People can directly sense and/or interact with the physical environment, such as through sight, touch, hearing, taste, and smell.
Computer-generated reality: In contrast, a computer-generated reality (CGR) environment refers to a wholly or partially simulated environment that people sense and/or interact with via an electronic system. In CGR, a subset of a person's physical motions, or representations thereof, are tracked, and, in response, one or more characteristics of one or more virtual objects simulated in the CGR environment are adjusted in a manner that comports with at least one law of physics. For example, a CGR system may detect a person's head turning and, in response, adjust graphical content and an acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment. In some situations (e.g., for accessibility reasons), adjustments to characteristic(s) of virtual object(s) in a CGR environment may be made in response to representations of physical motions (e.g., vocal commands). A person may sense and/or interact with a CGR object using any one of their senses, including sight, sound, touch, taste, and smell. For example, a person may sense and/or interact with audio objects that create a 3D or spatial audio environment that provides the perception of point audio sources in 3D space. In another example, audio objects may enable audio transparency, which selectively incorporates ambient sounds from the physical environment with or without computer-generated audio. In some CGR environments, a person may sense and/or interact only with audio objects. Examples of CGR include virtual reality and mixed reality.
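Purely as an illustrative aid (and not a description of any embodiment), the following sketch shows one way the acoustic-field adjustment described above could be modeled: when the head yaws, the azimuth of a world-locked virtual sound source is re-computed in head coordinates so that the source appears to stay fixed in the physical environment. The function and values are hypothetical.

```python
# Hypothetical sketch: re-compute a world-locked audio source's azimuth in head
# coordinates after a head yaw, so the acoustic field behaves like a physical
# environment would.
def source_azimuth_relative_to_head(source_azimuth_world_deg, head_yaw_deg):
    """Azimuth of a world-fixed audio source in head coordinates, in (-180, 180]."""
    relative = (source_azimuth_world_deg - head_yaw_deg) % 360.0
    return relative - 360.0 if relative > 180.0 else relative

# A source fixed 30 degrees to the user's right; after the head turns 90 degrees
# to the right, the source should be rendered 60 degrees to the left.
print(source_azimuth_relative_to_head(30.0, 0.0))    # -> 30.0
print(source_azimuth_relative_to_head(30.0, 90.0))   # -> -60.0
```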
Virtual reality: A virtual reality (VR) environment refers to a simulated environment that is designed to be based entirely on computer-generated sensory inputs for one or more senses. A VR environment comprises a plurality of virtual objects with which a person may sense and/or interact. For example, computer-generated imagery of trees, buildings, and avatars representing people are examples of virtual objects. A person may sense and/or interact with virtual objects in the VR environment through a simulation of the person's presence within the computer-generated environment, and/or through a simulation of a subset of the person's physical movements within the computer-generated environment.
Mixed reality: In contrast to a VR environment, which is designed to be based entirely on computer-generated sensory inputs, a mixed reality (MR) environment refers to a simulated environment that is designed to incorporate sensory inputs from the physical environment, or a representation thereof, in addition to including computer-generated sensory inputs (e.g., virtual objects). On a virtuality continuum, a mixed reality environment is anywhere between, but not including, a wholly physical environment at one end and a virtual reality environment at the other end. In some MR environments, computer-generated sensory inputs may respond to changes in sensory inputs from the physical environment. Also, some electronic systems for presenting an MR environment may track location and/or orientation with respect to the physical environment to enable virtual objects to interact with real objects (that is, physical articles from the physical environment or representations thereof). For example, a system may account for movements so that a virtual tree appears stationary with respect to the physical ground. Examples of mixed realities include augmented reality and augmented virtuality.
Augmented reality: An augmented reality (AR) environment refers to a simulated environment in which one or more virtual objects are superimposed over a physical environment, or a representation thereof. For example, an electronic system for presenting an AR environment may have a transparent or translucent display through which a person may directly view the physical environment. The system may be configured to present virtual objects on the transparent or translucent display, so that a person, using the system, perceives the virtual objects superimposed over the physical environment. Alternatively, a system may have an opaque display and one or more imaging sensors that capture images or video of the physical environment, which are representations of the physical environment. The system composites the images or video with virtual objects, and presents the composition on the opaque display. A person, using the system, indirectly views the physical environment by way of the images or video of the physical environment, and perceives the virtual objects superimposed over the physical environment. As used herein, a video of the physical environment shown on an opaque display is called “pass-through video,” meaning a system uses one or more image sensor(s) to capture images of the physical environment, and uses those images in presenting the AR environment on the opaque display. Further alternatively, a system may have a projection system that projects virtual objects into the physical environment, for example, as a hologram or on a physical surface, so that a person, using the system, perceives the virtual objects superimposed over the physical environment. An augmented reality environment also refers to a simulated environment in which a representation of a physical environment is transformed by computer-generated sensory information. For example, in providing pass-through video, a system may transform one or more sensor images to impose a select perspective (e.g., viewpoint) different than the perspective captured by the imaging sensors. As another example, a representation of a physical environment may be transformed by graphically modifying (e.g., enlarging) portions thereof, such that the modified portions may be representative but not photorealistic versions of the originally captured images. As a further example, a representation of a physical environment may be transformed by graphically eliminating or obfuscating portions thereof.
Augmented virtuality: An augmented virtuality (AV) environment refers to a simulated environment in which a virtual or computer-generated environment incorporates one or more sensory inputs from the physical environment. The sensory inputs may be representations of one or more characteristics of the physical environment. For example, an AV park may have virtual trees and virtual buildings, but people with faces photorealistically reproduced from images taken of physical people. As another example, a virtual object may adopt a shape or color of a physical article imaged by one or more imaging sensors. As a further example, a virtual object may adopt shadows consistent with the position of the sun in the physical environment.
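Purely for illustration (and not as a description of any embodiment), the pass-through compositing step mentioned above can be sketched as a per-pixel alpha blend of rendered virtual content over a captured camera frame. The frame sizes, colors, and function names below are hypothetical.

```python
# Hypothetical pass-through compositing sketch: camera images of the physical
# environment are alpha-blended with rendered virtual content before being
# shown on an opaque display.
import numpy as np

def composite_passthrough(camera_frame, virtual_frame, virtual_alpha):
    """Alpha-blend a rendered virtual layer over a captured camera frame."""
    cam = camera_frame.astype(np.float32)
    virt = virtual_frame.astype(np.float32)
    a = virtual_alpha.astype(np.float32)[..., None]   # per-pixel opacity, 0..1
    return (a * virt + (1.0 - a) * cam).astype(np.uint8)

# Hypothetical 4x4 RGB frames: a gray "camera" image and a red virtual object
# covering the upper-left quadrant of the view.
camera = np.full((4, 4, 3), 128, dtype=np.uint8)
virtual = np.zeros((4, 4, 3), dtype=np.uint8)
virtual[:2, :2] = (255, 0, 0)
alpha = np.zeros((4, 4), dtype=np.float32)
alpha[:2, :2] = 1.0
print(composite_passthrough(camera, virtual, alpha)[0, 0])  # red virtual pixel
```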
Hardware: There are many different types of electronic systems that enable a person to sense and/or interact with various CGR environments. Examples include head mounted systems, projection-based systems, heads-up displays (HUDs), vehicle windshields having integrated display capability, windows having integrated display capability, displays formed as lenses designed to be placed on a person's eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, input systems (e.g., wearable or handheld controllers with or without haptic feedback), smartphones, tablets, and desktop/laptop computers. A head mounted system may have one or more speaker(s) and an integrated opaque display. Alternatively, a head mounted system may be configured to accept an external opaque display (e.g., a smartphone). The head mounted system may incorporate one or more imaging sensors to capture images or video of the physical environment, and/or one or more microphones to capture audio of the physical environment. Rather than an opaque display, a head mounted system may have a transparent or translucent display. The transparent or translucent display may have a medium through which light representative of images is directed to a person's eyes. The display may utilize digital light projection, OLEDs, LEDs, μLEDs, liquid crystal on silicon, laser scanning light sources, or any combination of these technologies. The medium may be an optical waveguide, a hologram medium, an optical combiner, an optical reflector, or any combination thereof. In one embodiment, the transparent or translucent display may be configured to become opaque selectively. Projection-based systems may employ retinal projection technology that projects graphical images onto a person's retina. Projection systems also may be configured to project virtual objects into the physical environment, for example, as a hologram or on a physical surface.
The foregoing is merely illustrative and various modifications can be made by those skilled in the art without departing from the scope and spirit of the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application claims the benefit of U.S. Provisional Patent Application No. 63/505,532, filed Jun. 1, 2023, which is hereby incorporated by reference herein in its entirety.
Number | Date | Country
---|---|---
63/505,532 | Jun. 1, 2023 | US