The accompanying drawings illustrate a number of exemplary embodiments and are a part of the specification. Together with the following description, these drawings demonstrate and explain various principles of the present disclosure.
Throughout the drawings, identical reference characters and descriptions indicate similar, but not necessarily identical, elements. While the exemplary embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, the exemplary embodiments described herein are not intended to be limited to the particular forms disclosed. Rather, the present disclosure covers all modifications, equivalents, and alternatives falling within the scope of the appended claims.
The present disclosure is generally directed to an antenna design for a mobile electronic device. In some cases, the mobile electronic device may be implemented as a computational hub or a communications hub for peripheral devices. The mobile electronic device may have multiple different types of antennas, as well as cameras, sensors, and other electronic components. In some other types of mobile devices, sufficient space exists to accommodate various antennas, sensors, and other components without the components interfering with the antennas. However, in at least some of the embodiments described herein, form factor constraints or other size constraints may limit the amount of space available to position antennas. Moreover, the large number of antennas implemented in the mobile electronic device described herein may impose transmission and reception constraints that are not experienced by other devices.
For example, each of the antenna classes that may be implemented in the mobile electronic device described herein, including, for example (and not by way of limitation), 53-65 GHz antennas (e.g., line-of-sight or LOS antennas), FR2 antennas (e.g., 24-53 GHz), and FR1 antennas (e.g., 0.5-8 GHz), may require a minimum level of spherical coverage. At the same time, this minimum level of spherical coverage (i.e., the spherical radiation level) may be required not to interfere with other antennas (or at least to keep interference below a specified maximum level). The antennas may also need to be placed so as to avoid occlusion by a user's fingers or hand while in use.
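To make the band plan above concrete, the following sketch encodes the three antenna classes and their example frequency ranges as a small lookup table. The class records and the classify helper are illustrative assumptions added for exposition; the disclosure itself specifies only the frequency ranges.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AntennaClass:
    name: str
    freq_low_ghz: float   # lower edge of the example operating range
    freq_high_ghz: float  # upper edge of the example operating range

# Example ranges taken directly from the description above. Note that
# LOS and FR2 meet at 53 GHz; classify() returns the first match.
ANTENNA_CLASSES = [
    AntennaClass("LOS", 53.0, 65.0),  # line-of-sight intralink antennas
    AntennaClass("FR2", 24.0, 53.0),  # mmWave cellular
    AntennaClass("FR1", 0.5, 8.0),    # sub-8 GHz cellular/WiFi/GNSS
]

def classify(freq_ghz: float) -> str:
    """Return the name of the antenna class whose range contains freq_ghz."""
    for c in ANTENNA_CLASSES:
        if c.freq_low_ghz <= freq_ghz <= c.freq_high_ghz:
            return c.name
    raise ValueError(f"{freq_ghz} GHz is outside the example band plan")

print(classify(60.0))  # LOS
print(classify(28.0))  # FR2
```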
For instance, the mobile electronic devices described herein may be used in multiple different scenarios such as augmented phone calls, in which the mobile electronic device may be laid on a platform with direct line of sight to another mobile device (e.g., a virtual reality head mounted device (HMD), a pair of artificial reality glasses, a smartwatch, an internet of things (IoT) device, etc.). The mobile electronic devices described herein may also be used in “augmented world” scenarios in which the mobile electronic device may act as a computer processing system and/or communications device for other local devices. In such cases, the mobile electronic device may be placed in a user's pocket or backpack or may be held in one of the user's hands as a gaming input device. In other scenarios, the mobile electronic devices described herein may be implemented in an ultra-low friction communication scenario in which the mobile electronic device may be used to facilitate real-time video and/or audio communication with other individuals or entities. In such cases, the mobile electronic device may be held in landscape position with two hands for messaging or point of view (POV) capture.
In each of these use case scenarios, or in different scenarios, different antennas may be used. In each scenario, the user's hands may be in different positions relative to the mobile electronic device. Thus, in light of space constraints, power and interference constraints, and use scenarios in which the user's hands may be placed in different positions on the device (leading to possible occlusions), the embodiments herein may provide an antenna design or antenna architecture that optimally places each antenna of the different antenna types: where it will fit into an underlying support structure in relation to other components, where and how it will function in relation to interference caused by other components, and where it will operate substantially without occlusion, or with only minimal occlusion, in the various use case scenarios. These embodiments will be described in greater detail below. Other embodiments, including details regarding materials used in constructing or manufacturing the mobile electronic device, material thicknesses, and dimensions for the mobile device's support structure, may also be provided. Still further, at least in some cases, these embodiments may include ultrawideband (UWB) antennas, as will be explained in greater detail below.
Features from any of the embodiments described herein may be used in combination with one another in accordance with the general principles described herein. These and other embodiments, features, and advantages will be more fully understood upon reading the following detailed description in conjunction with the accompanying drawings and claims.
Moreover, it will be understood that different shapes or sizes of the mobile electronic devices 100A-100G may accommodate different components, and may allow different placements for antennas or other components. For instance, the rectangular-shaped mobile electronic device 100A may allow antenna and component placements that differ from those of the other form factors shown in the drawings.
In some cases, the mobile electronic devices 100A-100G may be designed to operate in conjunction with other mobile or stationary electronic devices. These electronic devices may include smartphones, smartwatches, VR HMDs, artificial reality glasses, laptops, tablets, personal computers, IoT devices (e.g., smart doorbells, refrigerators, coffee makers), or other electronic devices that are capable of wired or wireless communication. The mobile electronic devices 100A-100G may include different types of antennas to communicate on intralinks (e.g., wireless communications between local devices) or on interlinks (e.g., wireless communications between remote devices including wireless connections to the internet). In some cases, the mobile electronic devices 100A-100G may include processors, controllers, or other processing means to perform at least some amount of processing for the local devices connected via intralinks.
Thus, for instance, the mobile electronic devices 100A-100G may provide processing capabilities for connected VR HMDs or artificial reality devices (e.g., augmented reality glasses) or smartwatches. In such cases, the HMDs, glasses, or smartwatches may turn over processing tasks to the mobile electronic devices 100A-100G where those tasks will be processed. Upon completion of those tasks, the mobile electronic devices 100A-100G may then return the processed results to the local devices. In this manner, the mobile electronic devices 100A-100G may communicate with local electronic devices, perform processing for those devices, and return the results of the processing to those devices. Moreover, the mobile electronic devices 100A-100G may connect to cellular, global navigation satellite system (GNSS), or other remote computer networks to retrieve information and pass that information to the local devices. In this manner, the mobile electronic devices 100A-100G may function as a processing and/or communications hub for these local electronic devices.
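The hub behavior described above amounts to a receive-process-return loop over the intralink. The sketch below is a minimal, self-contained illustration; the Task type, the queue-based links, and the process_task stand-in are all hypothetical, as the disclosure does not specify a transport or task format.

```python
from dataclasses import dataclass
from queue import Queue

@dataclass
class Task:
    sender: str     # which local device offloaded the task
    payload: bytes  # opaque work item

def process_task(task: Task) -> bytes:
    # Stand-in for the heavy computation performed on the hub
    # (e.g., rendering, tracking, or inference for an HMD).
    return task.payload[::-1]

def hub_loop(intralink_rx: Queue, intralink_tx: Queue) -> None:
    """Serve offloaded tasks from local devices until the queue is drained."""
    while not intralink_rx.empty():
        task = intralink_rx.get()                # task arrives over the intralink
        result = process_task(task)              # computation runs on the hub
        intralink_tx.put((task.sender, result))  # result returned to the sender

# Example: one task offloaded by a pair of AR glasses.
rx, tx = Queue(), Queue()
rx.put(Task(sender="glasses", payload=b"frame-0001"))
hub_loop(rx, tx)
print(tx.get())  # ('glasses', b'1000-emarf')
```

In a real system the receive call would block on the radio link, and the hub could also consult the interlink (e.g., cellular or GNSS) for remote data before replying, as the paragraph above describes.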
In some cases, the local electronic devices may include artificial reality devices. These artificial reality devices may, themselves, include many different types of electronic hardware. In some cases, for example, artificial reality devices may include head-mounted displays that provide a virtual reality environment or augmented reality glasses that provide an augmented reality environment. In such cases, these HMDs may fully cover the user's eyes, and the user may be entirely enveloped in the virtual environment. In other cases, artificial reality devices may include augmented reality glasses or other similar devices. In such cases, the augmented reality glasses may allow the user to still see the world around them, but may project virtual objects into the physical world. As such, the wearer of the augmented reality glasses may see real world objects as well as virtual objects that are projected onto the user's eyes by the augmented reality glasses. Smartphones, smartwatches, and other mobile electronic devices may be used in conjunction with these artificial reality devices and/or with the mobile electronic devices 100A-100G.
As noted above, the mobile electronic devices 100A-100G may include many types of antennas, sensors, and other electronic components. These antennas may include WiFi antennas, Bluetooth antennas, global navigation satellite system (GNSS) or global positioning system (GPS) antennas, cellular antennas (e.g., 5G, 6G, 7G, etc.), ultrawideband (UWB) antennas, near-field communication (NFC) antennas, or other types of antennas. The mobile electronic devices 100A-100G may also include microphones, speakers, batteries, cameras, printed circuit boards (PCBs), touch sensors, buttons, insulating or heat conducting materials for thermal management, or other components. For example, the mobile electronic devices 100A-100G may include an outer housing 101A-101G. The housing 101A-101G may cover or provide protection for one or more interior components including processors, memory, antennas, or other electronic components embedded on a PCB. In some places, the housing 101A-101G may include cutouts that allow placement of sensors or similar components on the ends or edges of the device.
For example, the housing 101A-101G may include a rectangular cutout that allows placement of a sensor structure 110A-110G. The sensor structure may be shaped differently in different applications. For instance, the sensor structure 110A may be rectangular in some embodiments, while other embodiments may implement differently shaped sensor structures.
In some cases, the mobile electronic device housings 101A-101G may include cutouts for SLAM sensors 103A-103G, buttons (e.g., power, volume, etc.), communication ports (e.g., USB ports), power ports (e.g., for wired charging), or other cutouts. The mobile electronic device housings 101A-101G may be provided in a specified thickness that allows for structural stability while still allowing for cutouts for sensors, antennas, or other components. In some cases, certain types of antennas (e.g., FR2 antennas) may have improved antenna reception or transmission with a thicker housing. Other types of antennas (e.g., LOS, 60 GHz) may transmit or receive more efficiently with a thinner housing. The embodiments herein may use various levels of thickness and, in some cases, may implement a thickness that is thinner than preferred for FR2 antennas, but thicker than preferred for LOS antennas. This intermediate thickness value may additionally allow for sensor or component cutouts while still maintaining structural integrity.
The mobile electronic device housings 101A-101G may be formed using plastic, glass, metal, ceramic, or a combination of such materials. In some cases, the antennas (e.g., the LOS antennas) may be tuned to operate at a lower frequency than they otherwise would when placed behind glass, as the glass may detune the electromagnetic waves that travel through it. In some cases, specific distances may be established between glass portions and antennas, or between plastic portions and antennas. In some cases, grounded components including, for example, communication ports, may be placed between antennas to reduce correlation between those antennas. The housing may thus accommodate antennas, sensors, communication ports, or other mechanical or electrical components.
In augmented calling, the artificial reality glasses 306A may project a moving, lifelike image of another caller (or group of callers) as being in the same room as the wearer of the glasses 306A, even if those callers are located far away from the wearer. In such cases, the mobile electronic device 300A may establish an interlink to other callers' devices and an intralink to the artificial reality glasses 306A. Then, the mobile electronic device 300A may facilitate an augmented call between the wearer of the glasses and one or more remote callers, projecting moving images of those callers onto the glasses wearer's current environment. In such a scenario, multiple different types of antennas may be simultaneously operating within the mobile electronic device 300A to allow augmented calling.
Additionally or alternatively, the mobile electronic device 400A may include at least one line-of-sight (LOS) antenna 402A. The LOS1 antenna may be placed on the front face of the support structure 401A of the mobile electronic device 400A when viewed in landscape mode and may be placed on the left side of the mobile electronic device 400A when viewed in portrait mode. The LOS1 antenna may be enclosed in plastic or other RF-transparent material that will not attenuate or distort its radiation pattern. The LOS1 antenna may operate as noted in chart 500B.
Each antenna may have minimum operational specifications indicating a minimum amount of power needed to operate. Additionally, or alternatively, some or all of the antennas may specify a minimum amount of 3D spherical radiation coverage needed to operate properly or may specify a maximum amount of radiation coverage that can be provided by that antenna. Still further, some or all of the antennas may have specifications regarding heat dissipation or minimum distances between components for heat regulation. The embodiments herein, including the antennas described above, may be designed to satisfy each of these operational specifications.
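One way to picture these per-antenna specifications is as a record that a placement tool could check candidate operating points against. The following sketch is purely illustrative: the field names and every numeric value are invented for the example, since the disclosure does not give concrete figures.

```python
from dataclasses import dataclass

@dataclass
class AntennaSpec:
    name: str
    min_power_mw: float        # minimum power needed to operate
    min_coverage_sr: float     # minimum 3D spherical coverage, steradians
    max_coverage_sr: float     # maximum radiation coverage allowed
    min_thermal_gap_mm: float  # minimum spacing to neighbors for heat regulation

def meets_spec(spec: AntennaSpec, power_mw: float,
               coverage_sr: float, nearest_gap_mm: float) -> bool:
    """Check one candidate placement/operating point against its spec."""
    return (power_mw >= spec.min_power_mw
            and spec.min_coverage_sr <= coverage_sr <= spec.max_coverage_sr
            and nearest_gap_mm >= spec.min_thermal_gap_mm)

# Example with hypothetical numbers (none are given in the disclosure):
los1 = AntennaSpec("LOS1", min_power_mw=10.0, min_coverage_sr=4.0,
                   max_coverage_sr=12.0, min_thermal_gap_mm=2.0)
print(meets_spec(los1, power_mw=12.0, coverage_sr=6.3, nearest_gap_mm=3.0))  # True
```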
However, at least in some cases, sensors (e.g., SLAM sensors) may prevent placement of the LB1 antenna in the top left corner of the mobile electronic device 400G. In such cases, the LB1 antenna may be positioned elsewhere on the support structure.
In at least some of the preceding figures, the antennas incorporated in the various mobile electronic device embodiments, and the placement of those antennas within the mobile devices, were primarily based on positions that would allow each antenna to operate in isolation and would allow the placement of other components. In the embodiments described below, antenna placement additionally accounts for how a user's hands may occlude the antennas during use.
For example, if the user 702 is holding the mobile electronic device 701 in portrait mode, as in embodiment 700A, the user's hand or thumb may cover the LB1 or HB3 antennas.
Accordingly, in different environments, in different use cases, or when being held by different users that hold the devices in different manners, the embodiments herein may turn certain antennas off and use other antennas that are designed to operate in the same frequency range. As the user 702 switches applications and potentially switches how they are holding the device 701, the mobile electronic device 701 may stop transmitting or receiving on certain antennas and may start transmitting or receiving on other antennas that are not being occluded. In some cases, some antennas (e.g., HB1 and HB2) may be primarily transmit antennas, while other antennas (e.g., HB3 and HB4) may be primarily receive antennas. Other configurations are also possible.
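The switching behavior described above can be sketched as a band-keyed lookup that drops occluded antennas and falls back to the full set if everything in a band is blocked. The antenna names follow the description; the occlusion input (e.g., derived from grip or proximity sensing) and the fallback policy are assumptions added for the example.

```python
# Grip-aware antenna selection. Antenna names (LB1, HB1, ...) follow the
# description above; `occluded` is a hypothetical set of antenna names
# reported by grip/proximity sensing.

BANDS = {
    "low":  ["LB1", "LB2"],
    "high": ["HB1", "HB2", "HB3", "HB4"],
}

def select_active(band: str, occluded: set[str]) -> list[str]:
    """Keep only unoccluded antennas in a band; fall back to all candidates
    if every one is blocked (a degraded link beats no link)."""
    candidates = [a for a in BANDS[band] if a not in occluded]
    return candidates or BANDS[band]

# Portrait grip covering LB1 and HB3 (as in embodiment 700A above):
print(select_active("low", {"LB1", "HB3"}))   # ['LB2']
print(select_active("high", {"LB1", "HB3"}))  # ['HB1', 'HB2', 'HB4']
```

A fuller policy could also respect the transmit/receive bias noted above, e.g., preferring HB1/HB2 for transmission and HB3/HB4 for reception when both remain unoccluded.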
Method of manufacturing 1200 may include, at step 1210, providing a support structure (e.g., support structure 401F) configured to house one or more electronic components. The method may also include, at step 1220, mounting a first antenna (e.g., the LOS1 antenna) to the support structure, where the first antenna is configured to provide a wireless intralink on a first frequency to a local mobile electronic device.
The method of manufacturing 1200 may next include, at step 1230, providing one or more second antennas configured to establish wireless interlinks on different frequencies to various external wireless networks (e.g., cellular, GNSS, or other external networks). These other antennas may be positioned a specified minimum distance away from the LOS1 antenna. Some of these other antennas may be configured to establish intralinks to local devices. For instance, LB1 and LB2 antennas may establish Bluetooth, WiFi, NFC, or other local intralink connections to local devices. In some cases, the LB1 and LB2 antennas may be positioned at an at least partially opposing angle to each other. As such, the second antennas may provide at least a minimum threshold amount of spherical radiation to transmit and receive data using the established wireless interlinks while limiting correlation between the two low band antennas.
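The placement constraints recited in these steps, a specified minimum distance from the LOS1 antenna and an at least partially opposing angle between the low band antennas, can be expressed as simple geometric checks. The sketch below is illustrative only; the coordinates, the 20 mm clearance, and the 120-degree opposition threshold are assumed values, not figures from the disclosure.

```python
import math

MIN_LOS_CLEARANCE_MM = 20.0  # assumed "specified minimum distance" from LOS1
MIN_OPPOSITION_DEG = 120.0   # assumed "at least partially opposing" threshold

def far_enough(p, q, min_mm=MIN_LOS_CLEARANCE_MM) -> bool:
    """True if two antenna positions (x, y) in mm are at least min_mm apart."""
    return math.dist(p, q) >= min_mm

def partially_opposing(theta1_deg, theta2_deg, min_deg=MIN_OPPOSITION_DEG) -> bool:
    """True if two boresight angles differ by at least min_deg (mod 360)."""
    diff = abs(theta1_deg - theta2_deg) % 360.0
    return min(diff, 360.0 - diff) >= min_deg

# Illustrative layout: LOS1 near the device center, LB1/LB2 on one edge
# facing roughly opposite directions.
los1 = (80.0, 30.0)
lb1, lb1_angle = (10.0, 55.0), 90.0   # boresight faces "up"
lb2, lb2_angle = (10.0, 5.0), 270.0   # boresight faces "down"

ok = (far_enough(lb1, los1) and far_enough(lb2, los1)
      and partially_opposing(lb1_angle, lb2_angle))
print(ok)  # True for these illustrative positions
```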
In another embodiment, an alternative method of manufacturing may be provided. In the alternative method of manufacturing, the process may begin by assembling or producing a housing or support structure. The method may next include disposing one or more antenna carriers into the housing. These antenna carriers may include any of the antennas described above.
In this manner, the systems, methods, and mobile electronic devices described herein may provide multiple different types of antennas that may operate simultaneously or in an alternate manner. The embodiments herein may include different types of antennas designed to operate in different frequency bands. These antennas may be placed, among other electronic and mechanical components, on various portions of a mobile device's support structure. The placement of these antennas may allow the mobile device to form intralinks and interlinks with many different types of local and remote devices, and may further allow each of these different types of antennas to operate with at least a minimum power level while not interfering with the other surrounding antennas and other components.
Example 1: A system may include: a support structure configured to house one or more electronic components, a first antenna mounted to the support structure, wherein the first antenna is configured to provide a wireless intralink on a first frequency to a local mobile electronic device, and a plurality of second antennas configured to establish wireless interlinks on at least a second different frequency to one or more external wireless networks, wherein the second antennas are positioned a specified minimum distance away from the first antenna and are positioned at an at least partially opposing angle to each other, such that the second antennas provide at least a minimum threshold amount of spherical radiation to transmit and receive data using the established wireless interlinks.
Example 2: The system of Example 1, wherein the first antenna establishes a line-of-sight connection to communicate with the local mobile electronic device on the wireless intralink.
Example 3: The system of Example 1 or Example 2, wherein the line-of-sight connection on the wireless intralink between the first antenna and the local mobile electronic device is established using a frequency of at least 53 GHz.
Example 4: The system of any of Examples 1-3, wherein the local mobile electronic device comprises an artificial reality device.
Example 5: The system of any of Examples 1-4, wherein the first antenna is positioned in a front-facing, centered position, and wherein the second antennas are positioned on front-facing and rear-facing positions on a first side of the support structure.
Example 6: The system of any of Examples 1-5, wherein the second antennas comprise low band antennas.
Example 7: The system of any of Examples 1-6, further comprising first and second high band antennas, wherein the first high band antenna is positioned between the first antenna and the front-facing second antenna, and wherein the second high band antenna is positioned on a rear-facing portion of the system between the center of the system and the rear-facing second antenna.
Example 8: The system of any of Examples 1-7, further comprising third and fourth high band antennas, wherein the third high band antenna is positioned between the first antenna and a top portion of the support structure, and wherein the fourth high band antenna is positioned between the second high band antenna and the top portion of the support structure.
Example 9: The system of any of Examples 1-8, further comprising a second intralink antenna that is configured to provide the wireless intralink on the first frequency to the local mobile electronic device.
Example 10: The system of any of Examples 1-9, further comprising a global navigation satellite system (GNSS) antenna positioned on a rear-facing portion of the support structure between the second intralink antenna and a high band antenna.
Example 11: The system of any of Examples 1-10, wherein a front-facing portion of the support structure includes first and second cameras mounted thereto, and wherein the first antenna is positioned between the first and second cameras.
Example 12: The system of any of Examples 1-11 may further include a third antenna positioned in an upper portion of a side portion of the support structure, and a fourth antenna positioned in a lower portion of the side portion of the support structure.
Example 13: A mobile electronic device may include a support structure configured to house one or more electronic components, a first antenna mounted to the support structure, wherein the first antenna is configured to provide a wireless intralink on a first frequency to a local mobile electronic device, and a plurality of second antennas configured to establish wireless interlinks on at least a second different frequency to one or more external wireless networks, wherein the second antennas are positioned a specified minimum distance away from the first antenna and are positioned at an at least partially opposing angle to each other, such that the second antennas provide at least a minimum threshold amount of spherical radiation to transmit and receive data using the established wireless interlinks.
Example 14: The mobile electronic device of Example 13 may further include one or more sensors positioned between the first antenna and at least one of the plurality of second antennas.
Example 15: The mobile electronic device of Example 13 or Example 14, wherein the one or more sensors comprises at least one of a simultaneous location and mapping (SLAM) sensor, a camera, a depth sensor, an inertial measurement unit (IMU), an altimeter, a communication port, a touchpad, or an ambient light sensor.
Example 16: The mobile electronic device of any of Examples 13-15, wherein a communication port is positioned between at least two of the plurality of second antennas.
Example 17: The mobile electronic device of any of Examples 13-16, further comprising at least two high band antennas, wherein the at least two high band antennas are offset from each other, such that at least one of the high band antennas remains unoccluded when the mobile electronic device is held in different manners.
Example 18: The mobile electronic device of any of Examples 13-17, wherein the at least two high band antennas comprise third and fourth high band antennas, wherein the third and fourth high band antennas operate as transmission-biased or reception-biased antennas.
Example 19: The mobile electronic device of any of Examples 13-18, further comprising a touchpad positioned on an upper, side portion of the support structure.
Example 20: A method of manufacturing may include providing a support structure configured to house one or more electronic components, mounting a first antenna to the support structure, wherein the first antenna is configured to provide a wireless intralink on a first frequency to a local mobile electronic device, and providing a plurality of second antennas configured to establish wireless interlinks on at least a second different frequency to one or more external wireless networks, wherein the second antennas are positioned a specified minimum distance away from the first antenna and are positioned at an at least partially opposing angle to each other, such that the second antennas provide at least a minimum threshold amount of spherical radiation to transmit and receive data using the established wireless interlinks.
Embodiments of the present disclosure may include or be implemented in conjunction with various types of artificial-reality systems. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, for example, a virtual reality, an augmented reality, a mixed reality, a hybrid reality, or some combination and/or derivative thereof. Artificial-reality content may include completely computer-generated content or computer-generated content combined with captured (e.g., real-world) content. The artificial-reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional (3D) effect to the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, for example, create content in an artificial reality and/or are otherwise used in (e.g., to perform activities in) an artificial reality.
Artificial-reality systems may be implemented in a variety of different form factors and configurations. Some artificial-reality systems may be designed to work without near-eye displays (NEDs). Other artificial-reality systems may include an NED that also provides visibility into the real world (such as, e.g., augmented-reality system 1300) or that visually immerses a user in an artificial reality (such as, e.g., virtual-reality system 1400).
In some embodiments, augmented-reality system 1300 may include one or more sensors, such as sensor 1340. Sensor 1340 may generate measurement signals in response to motion of augmented-reality system 1300 and may be located on substantially any portion of frame 1310. Sensor 1340 may represent one or more of a variety of different sensing mechanisms, such as a position sensor, an inertial measurement unit (IMU), a depth camera assembly, a structured light emitter and/or detector, or any combination thereof. In some embodiments, augmented-reality system 1300 may or may not include sensor 1340 or may include more than one sensor. In embodiments in which sensor 1340 includes an IMU, the IMU may generate calibration data based on measurement signals from sensor 1340. Examples of sensor 1340 may include, without limitation, accelerometers, gyroscopes, magnetometers, other suitable types of sensors that detect motion, sensors used for error correction of the IMU, or some combination thereof.
In some examples, augmented-reality system 1300 may also include a microphone array with a plurality of acoustic transducers 1320(A)-1320(J), referred to collectively as acoustic transducers 1320. Acoustic transducers 1320 may represent transducers that detect air pressure variations induced by sound waves. Each acoustic transducer 1320 may be configured to detect sound and convert the detected sound into an electronic format (e.g., an analog or digital format).
In some embodiments, one or more of acoustic transducers 1320(A)-(J) may be used as output transducers (e.g., speakers). For example, acoustic transducers 1320(A) and/or 1320(B) may be earbuds or any other suitable type of headphone or speaker.
The configuration of acoustic transducers 1320 of the microphone array may vary. While augmented-reality system 1300 is described with ten acoustic transducers 1320(A)-1320(J), the number and arrangement of acoustic transducers may be greater or less than ten.
Acoustic transducers 1320(A) and 1320(B) may be positioned on different parts of the user's ear, such as behind the pinna, behind the tragus, and/or within the auricle or fossa. Alternatively, there may be additional acoustic transducers 1320 on or surrounding the ear in addition to acoustic transducers 1320 inside the ear canal. Having an acoustic transducer 1320 positioned next to an ear canal of a user may enable the microphone array to collect information on how sounds arrive at the ear canal. By positioning at least two of acoustic transducers 1320 on either side of a user's head (e.g., as binaural microphones), augmented-reality system 1300 may simulate binaural hearing and capture a 3D stereo sound field around a user's head. In some embodiments, acoustic transducers 1320(A) and 1320(B) may be connected to augmented-reality system 1300 via a wired connection 1330, and in other embodiments acoustic transducers 1320(A) and 1320(B) may be connected to augmented-reality system 1300 via a wireless connection (e.g., a BLUETOOTH connection). In still other embodiments, acoustic transducers 1320(A) and 1320(B) may not be used at all in conjunction with augmented-reality system 1300.
Acoustic transducers 1320 on frame 1310 may be positioned in a variety of different ways, including along the length of the temples, across the bridge, above or below display devices 1315(A) and 1315(B), or some combination thereof. Acoustic transducers 1320 may also be oriented such that the microphone array is able to detect sounds in a wide range of directions surrounding the user wearing the augmented-reality system 1300. In some embodiments, an optimization process may be performed during manufacturing of augmented-reality system 1300 to determine relative positioning of each acoustic transducer 1320 in the microphone array.
In some examples, augmented-reality system 1300 may include or be connected to an external device (e.g., a paired device), such as neckband 1305. Neckband 1305 generally represents any type or form of paired device. Thus, the following discussion of neckband 1305 may also apply to various other paired devices, such as charging cases, smart watches, smart phones, wrist bands, other wearable devices, hand-held controllers, tablet computers, laptop computers, other external compute devices, etc.
As shown, neckband 1305 may be coupled to eyewear device 1302 via one or more connectors. The connectors may be wired or wireless and may include electrical and/or non-electrical (e.g., structural) components. In some cases, eyewear device 1302 and neckband 1305 may operate independently without any wired or wireless connection between them.
Pairing external devices, such as neckband 1305, with augmented-reality eyewear devices may enable the eyewear devices to achieve the form factor of a pair of glasses while still providing sufficient battery and computation power for expanded capabilities. Some or all of the battery power, computational resources, and/or additional features of augmented-reality system 1300 may be provided by a paired device or shared between a paired device and an eyewear device, thus reducing the weight, heat profile, and form factor of the eyewear device overall while still retaining desired functionality. For example, neckband 1305 may allow components that would otherwise be included on an eyewear device to be included in neckband 1305 since users may tolerate a heavier weight load on their shoulders than they would tolerate on their heads. Neckband 1305 may also have a larger surface area over which to diffuse and disperse heat to the ambient environment. Thus, neckband 1305 may allow for greater battery and computation capacity than might otherwise have been possible on a stand-alone eyewear device. Since weight carried in neckband 1305 may be less invasive to a user than weight carried in eyewear device 1302, a user may tolerate wearing a lighter eyewear device and carrying or wearing the paired device for greater lengths of time than a user would tolerate wearing a heavy standalone eyewear device, thereby enabling users to more fully incorporate artificial-reality environments into their day-to-day activities.
Neckband 1305 may be communicatively coupled with eyewear device 1302 and/or to other devices. These other devices may provide certain functions (e.g., tracking, localizing, depth mapping, processing, storage, etc.) to augmented-reality system 1300.
Acoustic transducers 1320(I) and 1320(J) of neckband 1305 may be configured to detect sound and convert the detected sound into an electronic format (analog or digital).
Controller 1325 of neckband 1305 may process information generated by the sensors on neckband 1305 and/or augmented-reality system 1300. For example, controller 1325 may process information from the microphone array that describes sounds detected by the microphone array. For each detected sound, controller 1325 may perform a direction-of-arrival (DOA) estimation to estimate a direction from which the detected sound arrived at the microphone array. As the microphone array detects sounds, controller 1325 may populate an audio data set with the information. In embodiments in which augmented-reality system 1300 includes an inertial measurement unit, controller 1325 may compute all inertial and spatial calculations from the IMU located on eyewear device 1302. A connector may convey information between augmented-reality system 1300 and neckband 1305 and between augmented-reality system 1300 and controller 1325. The information may be in the form of optical data, electrical data, wireless data, or any other transmittable data form. Moving the processing of information generated by augmented-reality system 1300 to neckband 1305 may reduce weight and heat in eyewear device 1302, making it more comfortable to the user.
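As one illustration of how a DOA estimate might be computed, the sketch below uses a common time-difference-of-arrival approach: cross-correlate two transducer signals, convert the peak lag to a delay, and map the delay to an arrival angle under a far-field model. The disclosure does not specify which DOA algorithm controller 1325 uses, and the sample rate and microphone spacing below are assumptions.

```python
import numpy as np

FS = 48_000         # sample rate, Hz (assumed)
C = 343.0           # speed of sound, m/s
MIC_SPACING = 0.14  # distance between the two transducers, meters (assumed)

def doa_degrees(left: np.ndarray, right: np.ndarray) -> float:
    """Estimate arrival angle from two microphone signals via TDOA."""
    corr = np.correlate(left, right, mode="full")
    # Lag (in samples) by which `right` trails `left`; positive means the
    # sound reached the left transducer first.
    lag = (len(right) - 1) - np.argmax(corr)
    tdoa = lag / FS
    # Far-field model: tdoa = (d / c) * sin(theta)
    sin_theta = np.clip(tdoa * C / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))

# Synthetic check: an impulse arriving 5 samples earlier at the left mic.
left = np.zeros(1024)
left[100] = 1.0           # impulse reaching the left mic first
right = np.roll(left, 5)  # same impulse, 5 samples later at the right mic
print(round(doa_degrees(left, right), 1))  # ≈ 14.8 degrees off-axis
```

A production system would typically use a more robust estimator (e.g., generalized cross-correlation with phase transform) and more than two transducers, but the geometry is the same.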
Power source 1335 in neckband 1305 may provide power to eyewear device 1302 and/or to neckband 1305. Power source 1335 may include, without limitation, lithium ion batteries, lithium-polymer batteries, primary lithium batteries, alkaline batteries, or any other form of power storage. In some cases, power source 1335 may be a wired power source. Including power source 1335 on neckband 1305 instead of on eyewear device 1302 may help better distribute the weight and heat generated by power source 1335.
As noted, some artificial-reality systems may, instead of blending an artificial reality with actual reality, substantially replace one or more of a user's sensory perceptions of the real world with a virtual experience. One example of this type of system is a head-worn display system, such as virtual-reality system 1400.
Artificial-reality systems may include a variety of types of visual feedback mechanisms. For example, display devices in augmented-reality system 1300 and/or virtual-reality system 1400 may include one or more liquid crystal displays (LCDs), light emitting diode (LED) displays, microLED displays, organic LED (OLED) displays, digital light projector (DLP) micro-displays, liquid crystal on silicon (LCoS) micro-displays, and/or any other suitable type of display screen. These artificial-reality systems may include a single display screen for both eyes or may provide a display screen for each eye, which may allow for additional flexibility for varifocal adjustments or for correcting a user's refractive error. Some of these artificial-reality systems may also include optical subsystems having one or more lenses (e.g., concave or convex lenses, Fresnel lenses, adjustable liquid lenses, etc.) through which a user may view a display screen. These optical subsystems may serve a variety of purposes, including to collimate (e.g., make an object appear at a greater distance than its physical distance), to magnify (e.g., make an object appear larger than its actual size), and/or to relay (to, e.g., the viewer's eyes) light. These optical subsystems may be used in a non-pupil-forming architecture (such as a single lens configuration that directly collimates light but results in so-called pincushion distortion) and/or a pupil-forming architecture (such as a multi-lens configuration that produces so-called barrel distortion to nullify pincushion distortion).
In addition to or instead of using display screens, some of the artificial-reality systems described herein may include one or more projection systems. For example, display devices in augmented-reality system 1300 and/or virtual-reality system 1400 may include micro-LED projectors that project light (using, e.g., a waveguide) into display devices, such as clear combiner lenses that allow ambient light to pass through. The display devices may refract the projected light toward a user's pupil and may enable a user to simultaneously view both artificial-reality content and the real world. The display devices may accomplish this using any of a variety of different optical components, including waveguide components (e.g., holographic, planar, diffractive, polarized, and/or reflective waveguide elements), light-manipulation surfaces and elements (such as diffractive, reflective, and refractive elements and gratings), coupling elements, etc. Artificial-reality systems may also be configured with any other suitable type or form of image projection system, such as retinal projectors used in virtual retina displays.
The artificial-reality systems described herein may also include various types of computer vision components and subsystems. For example, augmented-reality system 1300 and/or virtual-reality system 1400 may include one or more optical sensors, such as two-dimensional (2D) or 3D cameras, structured light transmitters and detectors, time-of-flight depth sensors, single-beam or sweeping laser rangefinders, 3D LiDAR sensors, and/or any other suitable type or form of optical sensor. An artificial-reality system may process data from one or more of these sensors to identify a location of a user, to map the real world, to provide a user with context about real-world surroundings, and/or to perform a variety of other functions.
The artificial-reality systems described herein may also include one or more input and/or output audio transducers. Output audio transducers may include voice coil speakers, ribbon speakers, electrostatic speakers, piezoelectric speakers, bone conduction transducers, cartilage conduction transducers, tragus-vibration transducers, and/or any other suitable type or form of audio transducer. Similarly, input audio transducers may include condenser microphones, dynamic microphones, ribbon microphones, and/or any other type or form of input transducer. In some embodiments, a single transducer may be used for both audio input and audio output.
In some embodiments, the artificial-reality systems described herein may also include tactile (i.e., haptic) feedback systems, which may be incorporated into headwear, gloves, body suits, handheld controllers, environmental devices (e.g., chairs, floormats, etc.), and/or any other type of device or system. Haptic feedback systems may provide various types of cutaneous feedback, including vibration, force, traction, texture, and/or temperature. Haptic feedback systems may also provide various types of kinesthetic feedback, such as motion and compliance. Haptic feedback may be implemented using motors, piezoelectric actuators, fluidic systems, and/or a variety of other types of feedback mechanisms. Haptic feedback systems may be implemented independent of other artificial-reality devices, within other artificial-reality devices, and/or in conjunction with other artificial-reality devices.
By providing haptic sensations, audible content, and/or visual content, artificial-reality systems may create an entire virtual experience or enhance a user's real-world experience in a variety of contexts and environments. For instance, artificial-reality systems may assist or extend a user's perception, memory, or cognition within a particular environment. Some systems may enhance a user's interactions with other people in the real world or may enable more immersive interactions with other people in a virtual world. Artificial-reality systems may also be used for educational purposes (e.g., for teaching or training in schools, hospitals, government organizations, military organizations, business enterprises, etc.), entertainment purposes (e.g., for playing video games, listening to music, watching video content, etc.), and/or for accessibility purposes (e.g., as hearing aids, visual aids, etc.). The embodiments disclosed herein may enable or enhance a user's artificial-reality experience in one or more of these contexts and environments and/or in other contexts and environments.
As noted, artificial-reality systems 1300 and 1400 may be used with a variety of other types of devices to provide a more compelling artificial-reality experience. These devices may be haptic interfaces with transducers that provide haptic feedback and/or that collect haptic information about a user's interaction with an environment. The artificial-reality systems disclosed herein may include various types of haptic interfaces that detect or convey various types of haptic information, including tactile feedback (e.g., feedback that a user detects via nerves in the skin, which may also be referred to as cutaneous feedback) and/or kinesthetic feedback (e.g., feedback that a user detects via receptors located in muscles, joints, and/or tendons).
Haptic feedback may be provided by interfaces positioned within a user's environment (e.g., chairs, tables, floors, etc.) and/or interfaces on articles that may be worn or carried by a user (e.g., gloves, wristbands, etc.). As an example, vibrotactile system 1500 may include a wearable glove (haptic device 1510) and/or a wristband (haptic device 1520) formed as part of a flexible, wearable textile material 1530.
One or more vibrotactile devices 1540 may be positioned at least partially within one or more corresponding pockets formed in textile material 1530 of vibrotactile system 1500. Vibrotactile devices 1540 may be positioned in locations to provide a vibrating sensation (e.g., haptic feedback) to a user of vibrotactile system 1500. For example, vibrotactile devices 1540 may be positioned against the user's finger(s), thumb, or wrist.
A power source 1550 (e.g., a battery) for applying a voltage to the vibrotactile devices 1540 for activation thereof may be electrically coupled to vibrotactile devices 1540, such as via conductive wiring 1552. In some examples, each of vibrotactile devices 1540 may be independently electrically coupled to power source 1550 for individual activation. In some embodiments, a processor 1560 may be operatively coupled to power source 1550 and configured (e.g., programmed) to control activation of vibrotactile devices 1540.
Vibrotactile system 1500 may be implemented in a variety of ways. In some examples, vibrotactile system 1500 may be a standalone system with integral subsystems and components for operation independent of other devices and systems. As another example, vibrotactile system 1500 may be configured for interaction with another device or system 1570. For example, vibrotactile system 1500 may, in some examples, include a communications interface 1580 for receiving and/or sending signals to the other device or system 1570. The other device or system 1570 may be a mobile device, a gaming console, an artificial-reality (e.g., virtual-reality, augmented-reality, mixed-reality) device, a personal computer, a tablet computer, a network device (e.g., a modem, a router, etc.), a handheld controller, etc. Communications interface 1580 may enable communications between vibrotactile system 1500 and the other device or system 1570 via a wireless (e.g., Wi-Fi, BLUETOOTH, cellular, radio, etc.) link or a wired link. If present, communications interface 1580 may be in communication with processor 1560, such as to provide a signal to processor 1560 to activate or deactivate one or more of the vibrotactile devices 1540.
Vibrotactile system 1500 may optionally include other subsystems and components, such as touch-sensitive pads 1590, pressure sensors, motion sensors, position sensors, lighting elements, and/or user interface elements (e.g., an on/off button, a vibration control element, etc.). During use, vibrotactile devices 1540 may be configured to be activated for a variety of different reasons, such as in response to the user's interaction with user interface elements, a signal from the motion or position sensors, a signal from the touch-sensitive pads 1590, a signal from the pressure sensors, a signal from the other device or system 1570, etc.
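The activation logic described above can be sketched as an event-to-actuator mapping gated by the processor. The trigger names, actuator labels, and boolean power model below are all hypothetical; the disclosure states only that processor 1560 controls activation of vibrotactile devices 1540 in response to such signals.

```python
# Sketch of processor 1560 gating power from power source 1550 to
# individual vibrotactile devices 1540 in response to input events.

TRIGGERS = {
    "ui_button":    ["thumb"],             # user interface element
    "touch_pad":    ["index", "middle"],   # touch-sensitive pads 1590
    "remote_event": ["wrist"],             # signal from device/system 1570
}

class VibrotactileController:
    def __init__(self, devices):
        self.devices = {name: False for name in devices}  # name -> powered?

    def handle(self, event: str) -> None:
        for name in TRIGGERS.get(event, []):
            if name in self.devices:
                self.devices[name] = True  # apply voltage to this actuator

    def idle(self) -> None:
        for name in self.devices:
            self.devices[name] = False     # remove voltage from all actuators

ctrl = VibrotactileController(["thumb", "index", "middle", "wrist"])
ctrl.handle("touch_pad")
print([n for n, on in ctrl.devices.items() if on])  # ['index', 'middle']
```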
Although power source 1550, processor 1560, and communications interface 1580 are illustrated as being positioned in particular locations, the present disclosure is not so limited. For example, one or more of these components may be positioned elsewhere within vibrotactile system 1500.
Haptic wearables, such as those shown in and described above, may be used in conjunction with a variety of artificial-reality systems.
Head-mounted display 1602 generally represents any type or form of virtual-reality system, such as virtual-reality system 1400.
While haptic interfaces may be used with virtual-reality systems, haptic interfaces may also be used with augmented-reality systems. For example, a haptic device 1730 that includes a plurality of band elements 1732 may be worn by a user while interacting with an augmented-reality system.
One or more of band elements 1732 may include any type or form of actuator suitable for providing haptic feedback. For example, one or more of band elements 1732 may be configured to provide one or more of various types of cutaneous feedback, including vibration, force, traction, texture, and/or temperature. To provide such feedback, band elements 1732 may include one or more of various types of actuators. In one example, each of band elements 1732 may include a vibrotactor (e.g., a vibrotactile actuator) configured to vibrate in unison or independently to provide one or more of various types of haptic sensations to a user. Alternatively, only a single band element or a subset of band elements may include vibrotactors.
Haptic devices 1510, 1520, 1604, and 1730 may include any suitable number and/or type of haptic transducer, sensor, and/or feedback mechanism. For example, haptic devices 1510, 1520, 1604, and 1730 may include one or more mechanical transducers, piezoelectric transducers, and/or fluidic transducers. Haptic devices 1510, 1520, 1604, and 1730 may also include various combinations of different types and forms of transducers that work together or independently to enhance a user's artificial-reality experience.
Dongle portion 1920 may include antenna 1952, which may be configured to communicate with antenna 1950 included as part of wearable portion 1910. Communication between antennas 1950 and 1952 may occur using any suitable wireless technology and protocol, non-limiting examples of which include radiofrequency signaling and BLUETOOTH. As shown, the signals received by antenna 1952 of dongle portion 1920 may be provided to a host computer for further processing, display, and/or for effecting control of a particular physical or virtual object or objects.
As detailed above, the computing devices and systems described and/or illustrated herein broadly represent any type or form of computing device or system capable of executing computer-readable instructions, such as those contained within the modules described herein. In their most basic configuration, these computing device(s) may each include at least one memory device and at least one physical processor.
In some examples, the term “memory device” generally refers to any type or form of volatile or non-volatile storage device or medium capable of storing data and/or computer-readable instructions. In one example, a memory device may store, load, and/or maintain one or more of the modules described herein. Examples of memory devices include, without limitation, Random Access Memory (RAM), Read Only Memory (ROM), flash memory, Hard Disk Drives (HDDs), Solid-State Drives (SSDs), optical disk drives, caches, variations or combinations of one or more of the same, or any other suitable storage memory.
In some examples, the term “physical processor” generally refers to any type or form of hardware-implemented processing unit capable of interpreting and/or executing computer-readable instructions. In one example, a physical processor may access and/or modify one or more modules stored in the above-described memory device. Examples of physical processors include, without limitation, microprocessors, microcontrollers, Central Processing Units (CPUs), Field-Programmable Gate Arrays (FPGAs) that implement softcore processors, Application-Specific Integrated Circuits (ASICs), portions of one or more of the same, variations or combinations of one or more of the same, or any other suitable physical processor.
Although illustrated as separate elements, the modules described and/or illustrated herein may represent portions of a single module or application. In addition, in certain embodiments one or more of these modules may represent one or more software applications or programs that, when executed by a computing device, may cause the computing device to perform one or more tasks. For example, one or more of the modules described and/or illustrated herein may represent modules stored and configured to run on one or more of the computing devices or systems described and/or illustrated herein. One or more of these modules may also represent all or portions of one or more special-purpose computers configured to perform one or more tasks.
In addition, one or more of the modules described herein may transform data, physical devices, and/or representations of physical devices from one form to another. For example, one or more of the modules recited herein may receive data to be transformed, transform the data, output a result of the transformation, and store the result of the transformation. Additionally or alternatively, one or more of the modules recited herein may transform a processor, volatile memory, non-volatile memory, and/or any other portion of a physical computing device from one form to another by executing on the computing device, storing data on the computing device, and/or otherwise interacting with the computing device.
In some embodiments, the term “computer-readable medium” generally refers to any form of device, carrier, or medium capable of storing or carrying computer-readable instructions. Examples of computer-readable media include, without limitation, transmission-type media, such as carrier waves, and non-transitory-type media, such as magnetic-storage media (e.g., hard disk drives, tape drives, and floppy disks), optical-storage media (e.g., Compact Disks (CDs), Digital Video Disks (DVDs), and BLU-RAY disks), electronic-storage media (e.g., solid-state drives and flash media), and other distribution systems.
The process parameters and sequence of the steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed. The various exemplary methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.
The preceding description has been provided to enable others skilled in the art to best utilize various aspects of the exemplary embodiments disclosed herein. This exemplary description is not intended to be exhaustive or to be limited to any precise form disclosed. Many modifications and variations are possible without departing from the spirit and scope of the present disclosure. The embodiments disclosed herein should be considered in all respects illustrative and not restrictive. Reference should be made to the appended claims and their equivalents in determining the scope of the present disclosure.
Unless otherwise noted, the terms “connected to” and “coupled to” (and their derivatives), as used in the specification and claims, are to be construed as permitting both direct and indirect (i.e., via other elements or components) connection. In addition, the terms “a” or “an,” as used in the specification and claims, are to be construed as meaning “at least one of.” Finally, for ease of use, the terms “including” and “having” (and their derivatives), as used in the specification and claims, are interchangeable with and have the same meaning as the word “comprising.”
This application claims priority to and the benefit of U.S. Provisional Patent Application No. 63/344,343, filed May 20, 2022, which application is incorporated by reference herein in its entirety.