This disclosure relates generally to methods, apparatus and systems for providing extended reality effects.
The term “extended reality” (XR) refers to all real-and-virtual combined environments and human-machine interactions, including augmented reality (AR), mixed reality (MR) and virtual reality (VR). The levels of virtuality in XR may range from sensory inputs that augment a user's experience of the real world to immersive virtuality, also called VR. Although some existing XR systems provide acceptable performance under some conditions, improved methods and devices would be desirable.
The systems, methods and devices of the disclosure each have several innovative aspects, no single one of which is solely responsible for the desirable attributes disclosed herein.
One innovative aspect of the subject matter described in this disclosure may be implemented in an apparatus. According to some examples, the apparatus may include a structure, such as a headset or an eyeglass frame, that is configured to provide extended reality effects. The extended reality effects may include augmented reality effects, mixed reality effects, virtual reality effects, or combinations thereof.
In some examples, the apparatus may include an ultrasound-based haptic system including one or more arrays of ultrasonic transducers, which in some examples may include piezoelectric micromachined ultrasonic transducers (PMUTs), mounted in or on the structure. In some examples, the apparatus may include a control system configured for communication with (such as electrically or wirelessly coupled to) the structure and the ultrasound-based haptic system. In some examples, the control system may include a memory, whereas in other examples the control system may be configured for communication with a memory that is not part of the control system. The control system may include one or more general purpose single- or multi-chip processors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs) or other programmable logic devices, discrete gates or transistor logic, discrete hardware components, or combinations thereof.
According to some examples, the control system may be configured to control the one or more arrays of ultrasonic transducers to create haptic effects via ultrasonic waves. In some examples, the control system may be configured to control the one or more arrays of ultrasonic transducers to create haptic effects via air-coupled ultrasonic waves. According to some examples, the control system may be configured to control the one or more arrays of ultrasonic transducers to create one or more haptic effects associated with at least one of the extended reality effects. In some examples, the control system may be configured to control the one or more arrays of ultrasonic transducers to create one or more haptic effects synchronized with at least one of the extended reality effects.
In some implementations, at least one array of the one or more arrays of ultrasonic transducers may include ultrasonic transducers grouped into superpixels. In some such implementations, each of the superpixels may include a plurality of ultrasonic transducers.
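By way of illustration only, the following Python sketch shows one possible way of grouping the elements of a transducer array into superpixels that are addressed as units. The array dimensions, superpixel size and all names are assumptions introduced for this sketch and are not taken from this disclosure.

    # Illustrative sketch: group an 8 x 8 array of transducer elements into
    # 2 x 2 superpixels; all dimensions are assumed, not from the disclosure.
    import numpy as np

    ARRAY_SIDE = 8   # elements per array side (assumed)
    SUPER_SIDE = 2   # elements per superpixel side (assumed)

    element_ids = np.arange(ARRAY_SIDE * ARRAY_SIDE).reshape(ARRAY_SIDE, ARRAY_SIDE)

    superpixels = {}
    for row in range(0, ARRAY_SIDE, SUPER_SIDE):
        for col in range(0, ARRAY_SIDE, SUPER_SIDE):
            block = element_ids[row:row + SUPER_SIDE, col:col + SUPER_SIDE]
            superpixels[(row // SUPER_SIDE, col // SUPER_SIDE)] = block.ravel().tolist()

    # superpixels[(0, 0)] -> [0, 1, 8, 9]: these four elements would receive a
    # common drive signal when their superpixel is actuated as a unit.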
According to some examples, the control system may be configured to control at least one array of the one or more arrays of ultrasonic transducers to create haptic effects via beam steering of transmitted ultrasonic waves. In some examples, a beam steering distance of the beam steering may be in a range from 5 mm to 2 cm.
In some examples, the control system may be configured to control at least one array of the one or more arrays of ultrasonic transducers to create haptic effects by modifying a focus area of transmitted ultrasonic waves, modifying a focus depth of transmitted ultrasonic waves, or a combination thereof. In some such examples, modifying the focus area may involve modifying the focus area in a range from 2 mm to 5 cm. In some examples, modifying the focus depth may involve modifying the focus depth in a range from 5 mm to 5 cm.
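By way of illustration only, the following Python sketch shows one way in which per-element transmit delays could be computed to steer and focus a beam from a planar array of ultrasonic transducers toward a target point, so that the waves from all elements arrive at that point in phase. The element layout, the speed of sound in air, and all function and variable names are assumptions introduced for this sketch, not details taken from this disclosure.

    # Illustrative sketch of a focal-law computation for a planar array.
    import numpy as np

    SPEED_OF_SOUND_AIR = 343.0  # meters per second, approximate at 20 C (assumed)

    def focal_delays(element_positions, focus_point, c=SPEED_OF_SOUND_AIR):
        """Return per-element delays (seconds) so that transmitted waves
        arrive at focus_point simultaneously.
        element_positions: (N, 3) array of element coordinates in meters.
        focus_point: (3,) target coordinates in meters."""
        distances = np.linalg.norm(element_positions - focus_point, axis=1)
        # Elements farther from the focus fire earlier; normalize so that
        # the smallest delay is zero.
        return (distances.max() - distances) / c

    # Example: a 4 x 4 array on a 1 mm pitch, focused 2 cm above its center.
    xs, ys = np.meshgrid(np.arange(4) * 1e-3, np.arange(4) * 1e-3)
    elements = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(16)])
    delays = focal_delays(elements, np.array([1.5e-3, 1.5e-3, 2e-2]))

Moving the focus point laterally between successive transmissions would implement beam steering of the kind described above, within whatever steering range the array geometry supports.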
According to some examples, the control system may be configured to control at least one array of the one or more arrays of ultrasonic transducers to create haptic effects by transmitting a focused beam of ultrasonic waves by the at least one array at a first time and transmitting an unfocused beam of ultrasonic waves by the at least one array at a second time. In some examples, the control system may be configured to control at least one array of the one or more arrays of ultrasonic transducers to create haptic effects by modifying a peak frequency of transmitted ultrasonic waves.
In some examples, the control system may be configured to control at least one array of the one or more arrays of ultrasonic transducers to create haptic effects via amplitude modulation of transmitted ultrasonic carrier waves. In some such examples, a frequency of amplitude modulation may be in a range of 40 Hz to 300 Hz. In some examples, a peak frequency of the transmitted ultrasonic carrier waves may be in a range of 20 kHz to 600 kHz.
According to some examples, the control system may be configured to control at least one array of the one or more arrays of ultrasonic transducers to create haptic effects via ultrasonic waves transmitted to a wearer of the apparatus via solid material. In some such examples, the solid material may include a portion of the structure that may be configured to be in contact with the wearer of the apparatus. According to some examples, the control system may be further configured to control at least one array of the one or more arrays of ultrasonic transducers to create haptic effects by modifying a focus area of transmitted ultrasonic waves in a range of 1 mm to 5 mm, moving a focus area of transmitted ultrasonic waves within a steering range of 1 cm, modifying a focus depth of transmitted ultrasonic waves in a range from 5 mm to 5 cm, or a combination thereof.
In some examples, the control system may be further configured to control at least one array of the one or more arrays of ultrasonic transducers to create haptic effects corresponding to a motion along a trajectory.
According to some examples, the one or more arrays of ultrasonic transducers may include one or more piezoelectric micromachined ultrasonic transducers (PMUTs). The one or more PMUTs may, in some examples, include one or more scandium-doped aluminum nitride PMUTs.
Other innovative aspects of the subject matter described in this disclosure may be implemented in a method. In some examples, the method may involve providing extended reality effects. According to some examples, the method may involve controlling, by a control system, a structure to provide extended reality effects. In some examples, the method may involve controlling, by the control system, one or more arrays of ultrasonic transducers mounted in or on the structure to create haptic effects via transmitted ultrasonic waves. In some examples, creating haptic effects via transmitted ultrasonic waves may involve transmitting air-coupled ultrasonic waves. Alternatively, or additionally, creating haptic effects via transmitted ultrasonic waves may involve transmitting ultrasonic waves through solid material. In some examples, one or more of the haptic effects may be associated with at least one of the extended reality effects, synchronized with at least one of the extended reality effects, or combinations thereof.
Some or all of the operations, functions or methods described herein may be performed by one or more devices according to instructions (such as software) stored on one or more non-transitory media. Such non-transitory media may include memory devices such as those described herein, including but not limited to random access memory (RAM) devices, read-only memory (ROM) devices, etc. Accordingly, some innovative aspects of the subject matter described in this disclosure can be implemented in one or more non-transitory media having software stored thereon.
For example, the software may include instructions for controlling one or more devices to perform a method. In some examples, the method may involve providing extended reality effects. According to some examples, the method may involve controlling, by a control system, a structure to provide extended reality effects. In some examples, the method may involve controlling, by the control system, one or more arrays of ultrasonic transducers mounted in or on the structure to create haptic effects via transmitted ultrasonic waves. In some examples, creating haptic effects via transmitted ultrasonic waves may involve transmitting air-coupled ultrasonic waves. Alternatively, or additionally, creating haptic effects via transmitted ultrasonic waves may involve transmitting ultrasonic waves through solid material. In some examples, one or more of the haptic effects may be associated with at least one of the extended reality effects, synchronized with at least one of the extended reality effects, or combinations thereof.
Details of one or more implementations of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages will become apparent from the description, the drawings, and the claims. Note that the relative dimensions of the following figures may not be drawn to scale.
Like reference numbers and designations in the various drawings indicate like elements.
The following description is directed to certain implementations for the purposes of describing the innovative aspects of this disclosure. However, a person having ordinary skill in the art will readily recognize that the teachings herein may be applied in a multitude of different ways. The described implementations may be implemented in any device, apparatus, or system that includes an ultrasound-based haptic system as disclosed herein. In addition, it is contemplated that the described implementations may be included in or associated with a variety of electronic devices such as, but not limited to: mobile telephones, multimedia Internet enabled cellular telephones, mobile television receivers, wireless devices, smartphones, smart cards, wearable devices such as bracelets, armbands, wristbands, rings, headbands, patches, etc., Bluetooth® devices, personal data assistants (PDAs), wireless electronic mail receivers, hand-held or portable computers, netbooks, notebooks, smartbooks, tablets, printers, copiers, scanners, facsimile devices, global positioning system (GPS) receivers/navigators, cameras, digital media players (such as MP3 players), camcorders, game consoles, wrist watches, clocks, calculators, television monitors, flat panel displays, electronic reading devices (such as e-readers), mobile health devices, computer monitors, automobile components, including but not limited to automobile displays (such as odometer and speedometer displays, etc.), cockpit controls or displays, camera view displays (such as the display of a rear view camera in a vehicle), electronic photographs, electronic billboards or signs, projectors, architectural structures, microwaves, refrigerators, stereo systems, cassette recorders or players, DVD players, CD players, VCRs, radios, portable memory chips, washers, dryers, washer/dryers, parking meters, packaging (such as in electromechanical systems (EMS) applications including microelectromechanical systems (MEMS) applications, as well as non-EMS applications), aesthetic structures (such as display of images on a piece of jewelry or clothing) and a variety of EMS devices. The teachings herein also may be used in applications such as, but not limited to, electronic switching devices, radio frequency filters, sensors, accelerometers, gyroscopes, motion-sensing devices, magnetometers, inertial components for consumer electronics, parts of consumer electronics products, steering wheels or other automobile parts, varactors, liquid crystal devices, electrophoretic devices, drive schemes, manufacturing processes and electronic test equipment. Thus, the teachings are not intended to be limited solely to the implementations depicted in the Figures, but instead have wide applicability as will be readily apparent to one having ordinary skill in the art.
Providing haptic feedback, in addition to audio and video effects, can create a relatively more immersive extended reality (XR) experience. For example, there is an existing device for creating touch sensations as part of providing an XR experience. This device is about half the size of a typical laptop computer. Accordingly, the device is rather bulky and heavy, and consumes a relatively large amount of power during use.
Some disclosed implementations include an ultrasound-based haptic system for use with, or which may be configured as part of, an XR system. Some implementations may provide an ultrasound-based haptic system that includes one or more arrays of ultrasonic transducer elements, which may in some examples include piezoelectric micromachined ultrasonic transducers (PMUTs), mounted in or on a structure configured to provide XR effects. In some such implementations, a control system may be configured to control the one or more arrays of ultrasonic transducer elements to create haptic effects via ultrasonic waves. In some examples, the control system may be configured to control the one or more arrays of ultrasonic transducer elements to create haptic effects via air-coupled ultrasonic waves. The haptic effect(s) may be associated with at least one of the extended reality effects, such as at least one visual extended reality effect. In some instances, the haptic effect(s) may be synchronized with at least one of the extended reality effects, such as at least one visual extended reality effect.
Particular implementations of the subject matter described in this disclosure may be implemented to realize one or more of the following potential advantages. In some implementations, an apparatus may include an ultrasound-based haptic system that is smaller and lighter than, and that may consume less power than, prior haptic systems provided for use with, or deployed as part of, an XR system. Some such ultrasound-based haptic system implementations are small enough and light enough to deploy as part of an XR headset or an eyeglass frame without appreciably increasing the weight of the headset or eyeglass frame. In some implementations, haptic effects may be provided via air-coupled ultrasonic waves. Such implementations may be capable of providing haptic effects even to areas of a user's head that are not in contact with the XR headset or eyeglass frame. Alternatively, or additionally, some implementations provide haptic effects via ultrasonic waves transmitted to a wearer of the apparatus via solid material, such as a portion of the structure that is configured to be in contact with the wearer of the apparatus. An ultrasound-based haptic system can provide sensations to a device wearer without disturbing other nearby people. Some ultrasound-based haptic systems may be configured to produce a variety of different sensations. In some such implementations, each of the different sensations may correspond with an intended use case, a particular type of XR experience, or combinations thereof.
In some examples, the ultrasound-based haptic system 102 may include one or more arrays of ultrasonic transducer elements, such as one or more arrays of piezoelectric micromachined ultrasonic transducers (PMUTs), one or more arrays of capacitive micromachined ultrasonic transducers (CMUTs), etc. According to some examples, the ultrasonic transducer elements may include one or more piezoelectric layers, such as one or more layers of polyvinylidene fluoride (PVDF) polymer, polyvinylidene fluoride-trifluoroethylene (PVDF-TrFE) copolymer, scandium-doped aluminum nitride (ScAlN), or a combination thereof. In some such examples, PMUT elements in a single-layer array of PMUTs or CMUT elements in a single-layer array of CMUTs may be used as ultrasonic transmitters as well as ultrasonic receivers. However, in some examples the PMUTs, CMUTs or combinations thereof may be configured to transmit ultrasonic waves, but not to provide signals to the control system 106 corresponding to received ultrasonic waves.
The touch sensor system 103 (if present) may be, or may include, a resistive touch sensor system, a surface capacitive touch sensor system, a projected capacitive touch sensor system, a surface acoustic wave touch sensor system, an infrared touch sensor system, or any other suitable type of touch sensor system.
The control system 106 may include one or more general purpose single- or multi-chip processors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs) or other programmable logic devices, discrete gates or transistor logic, discrete hardware components, or combinations thereof. According to some examples, the control system 106 also may include one or more memory devices, such as one or more random access memory (RAM) devices, read-only memory (ROM) devices, etc.
In this example, the control system 106 is configured for communication with, and configured for controlling, elements of the structure 105 to provide XR effects. The XR effects may include visual effects provided by the display system 110, audio effects provided by the loudspeaker system 116, or combinations thereof. For example, the structure 105 may be an XR headset and the control system 106 may be configured for controlling elements of the XR headset to provide XR effects. In other examples, the structure 105 may be an eyeglass frame and the control system 106 may be configured for controlling elements of the eyeglass frame to provide XR effects. According to some examples, the control system 106 is configured for communication with, and for controlling, the ultrasound-based haptic system 102 to provide haptic effects. In some such examples, the control system 106 may be configured to control one or more arrays of ultrasonic transducer elements, such as PMUTs, of the ultrasound-based haptic system 102 to create one or more haptic effects associated with at least one of the XR effects, e.g., associated with at least one visual XR effect, associated with at least one audio XR effect, or a combination thereof. In some examples, the control system 106 may be configured to control one or more arrays of ultrasonic transducer elements of the ultrasound-based haptic system 102 to create one or more haptic effects synchronized with at least one of the XR effects, e.g., synchronized with at least one visual XR effect, synchronized with at least one audio XR effect, or a combination thereof.
In implementations where the apparatus includes a touch sensor system 103, the control system 106 is configured for communication with, and for controlling, the touch sensor system 103. In implementations where the apparatus includes a memory system 108 that is separate from the control system 106, the control system 106 also may be configured for communication with the memory system 108. In implementations where the apparatus includes a microphone system 112, the control system 106 is configured for communication with, and for controlling, the microphone system 112. According to some examples, the control system 106 may include one or more dedicated components for controlling the ultrasound-based haptic system 102, the touch sensor system 103, the memory system 108, the display system 110 or the microphone system 112. In some implementations, functionality of the control system 106 may be partitioned between one or more controllers or processors, such as between a dedicated sensor controller and an applications processor of a mobile device.
In some examples, the memory system 108 may include one or more memory devices, such as one or more RAM devices, ROM devices, etc. In some implementations, the memory system 108 may include one or more computer-readable media or storage media. Computer-readable media include both computer storage media and communication media including any medium that may be enabled to transfer a computer program from one place to another. Storage media may be any available media that may be accessed by a computer. In some examples, the memory system 108 may include one or more non-transitory media. By way of example, and not limitation, non-transitory media may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), compact disc ROM (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer.
Some implementations of the apparatus 101 may include an interface system 104. In some examples, the interface system 104 may include a wireless interface system. In some implementations, the interface system 104 may include a user interface system, one or more network interfaces, one or more interfaces between the control system 106 and the ultrasound-based haptic system 102, one or more interfaces between the control system 106 and the touch sensor system 103, one or more interfaces between the control system 106 and the memory system 108, one or more interfaces between the control system 106 and the display system 110, one or more interfaces between the control system 106 and the microphone system 112, one or more interfaces between the control system 106 and the loudspeaker system 116, one or more interfaces between the control system 106 and one or more external device interfaces (such as ports or applications processors), or combinations thereof.
The interface system 104 may be configured to provide communication (which may include wired or wireless communication, electrical communication, radio communication, etc.) between components of the apparatus 101. In some such examples, the interface system 104 may be configured to provide communication between the control system 106 and the ultrasound-based haptic system 102. According to some such examples, the interface system 104 may couple at least a portion of the control system 106 to the ultrasound-based haptic system 102 and the interface system 104 may couple at least a portion of the control system 106 to the touch sensor system 103, such as via electrically conducting material (for example, via conductive metal wires or traces). According to some examples, the interface system 104 may be configured to provide communication between the apparatus 101 and one or more other devices. In some examples, the interface system 104 may be configured to provide communication between the apparatus 101 and a human being. In some such examples, the interface system 104 may include one or more user interfaces. In some examples, the user interface(s) may be provided via the touch sensor system 103, the display system 110, the microphone system 112, a gesture sensor system (if present), or combinations thereof. The interface system 104 may, in some examples, include one or more network interfaces or one or more external device interfaces (such as one or more universal serial bus (USB) interfaces or a serial peripheral interface (SPI)).
In some examples, the apparatus 101 may include a display system 110 having one or more displays. In some examples, the display system 110 may be, or may include, a light-emitting diode (LED) display, such as an organic light-emitting diode (OLED) display. In some such examples, the display system 110 may include layers, which may be referred to collectively as a “display stack.”
In some implementations, the apparatus 101 may include a microphone system 112. The microphone system 112 may include one or more microphones.
In some implementations, the apparatus 101 may include a loudspeaker system 116. The loudspeaker system 116 may be, or may include, one or more loudspeakers or groups of loudspeakers. In some examples, the loudspeaker system 116 may include one or more loudspeakers, or one or more groups of loudspeakers, corresponding to a left ear and one or more loudspeakers, or one or more groups of loudspeakers, corresponding to a right ear. In some implementations, at least a portion of the loudspeaker system 116 may reside within an earcup, an earbud, etc. In some examples, at least a portion of the loudspeaker system 116 may reside in or on a portion of an eyeglass frame that is intended to reside near a wearer's ear or touching the wearer's ear.
The apparatus 101 may be used in a variety of different contexts, some examples of which are disclosed herein. For example, in some implementations a mobile device may include at least a portion of the apparatus 101. In some implementations, a wearable device may include at least a portion of the apparatus 101. The wearable device may, for example, be a headset or an eyeglass frame. In some implementations, the control system 106 may reside in more than one device. For example, a portion of the control system 106 may reside in a wearable device and another portion of the control system 106 may reside in another device, such as a mobile device (for example, a smartphone), a server, etc. The interface system 104 also may, in some such examples, reside in more than one device.
According to this example, the apparatus 101 is a mobile device, such as a cellular telephone.
In this implementation, the apparatus 101 includes arrays of ultrasonic transducer elements 202a, 202b, 202c and 202d. According to this implementation, the arrays of ultrasonic transducer elements 202a-202d are components of an ultrasound-based haptic system, which is an instance of the ultrasound-based haptic system 102 that is described elsewhere herein.
Although the arrays of ultrasonic transducer elements 202a-202d are illustrated as circles in the accompanying drawing, other array shapes, sizes and arrangements may be used in alternative implementations.
In this example, the control system is configured for controlling the arrays of ultrasonic transducer elements 202a-202d to provide haptic effects. In some such examples, the control system may be configured for controlling the arrays of ultrasonic transducer elements 202a-202d to create one or more haptic effects associated with at least one of the XR effects provided by the structure 105, e.g., associated with at least one visual XR effect, associated with at least one audio XR effect, or a combination thereof. In some examples, the control system may be configured for controlling the arrays of ultrasonic transducer elements 202a-202d to create one or more haptic effects synchronized with at least one of the XR effects provided by the structure 105, e.g., synchronized with at least one visual XR effect, synchronized with at least one audio XR effect, or a combination thereof.
In this implementation, the control system is configured to control one or more of the arrays of ultrasonic transducer elements 202a-202d (for example, the arrays of ultrasonic transducer elements 202a and 202b) to provide haptic effects via air-coupled ultrasonic waves. Such implementations may be capable of providing haptic effects to areas of the wearer 205's head that are not in contact with the eyeglass frame, such as the wearer 205's eyebrow area, forehead area, cheek area, the area surrounding the wearer 205's eyes, the area between the wearer 205's eyes and the wearer 205's temples, etc.
According to this implementation, the control system is also configured to control one or more of the arrays of ultrasonic transducer elements 202a-202d (for example, the arrays of ultrasonic transducer elements 202c and 202d) to provide haptic effects via ultrasonic waves transmitted to a wearer of the apparatus via one or more portions of the structure 105 that are configured to be in contact with the wearer of the apparatus. For example, the arrays of ultrasonic transducer elements 202c and 202d may reside in portions of the structure 105 that are configured to be in contact with the wearer 205's temple and an area of the wearer 205's head that is behind the wearer 205's ear, respectively. According to some implementations, the array of ultrasonic transducer elements 202d may reside in a portion of the structure 105 that is configured to be in contact with a “backside” portion of the wearer 205's ear that is facing the wearer 205's head. According to some such implementations, the array of ultrasonic transducer elements 202d may reside in or on an outward-facing portion of the structure 105 that is configured to face the backside portion of the wearer 205's ear. In some implementations, there may be only a thin layer or a thin stack of material (such as one or more protective layers, one or more impedance-matching layers, etc.) between the arrays of ultrasonic transducer elements 202c and 202d and the wearer 205's skin.
Advantageously, transmit and receive electrodes may be formed in the same electrode layer during a common fabrication process of deposition, masking and etching, for example. In some implementations, one or more piezoelectric layers and associated electrode layers (not shown) may be included in the piezoelectric layer 315, in which case the piezoelectric layer 315 may be referred to as a piezoelectric stack. According to some examples, the piezoelectric layer 315 may include polyvinylidene fluoride (PVDF) polymer, polyvinylidene fluoride-trifluoroethylene (PVDF-TrFE) copolymer, scandium-doped aluminum nitride (ScAlN), or a combination thereof.
It should be noted that in the illustrated arrangement, portions of the piezoelectric layer 315 that are proximate to the outer electrodes 314 are in an opposite state of mechanical stress compared to portions of the piezoelectric layer 315 that are proximate to the inner electrode 313 during vibrations of the PMUT diaphragm. More particularly, an inflection zone in which the sense of the stress changes lies between the portions of the piezoelectric layer 315 that are proximate to the inner electrode 313 and the portions that are proximate to the outer electrodes 314.
To maximize the transmitter and receiver efficiencies, it is desirable to cover the maximum possible area on the suspended portion having a common sense of stress (e.g. either tensile or compressive). Thus, transmitter and receiver efficiencies may be improved by positioning the outer perimeter of the inner electrode 313 and the inner perimeter of the outer electrode 314 close to the inflection zone. For other shapes such as rectangular or square diaphragms, a similar approach may be applied to optimize the electrode shapes. An outer edge of the outer electrode 314 may be substantially aligned with a perimeter of the cavity 320 or may (as illustrated) extend beyond the walls of the cavity 320.
The PMUT diaphragm may be supported by an anchor structure 370 that allows the diaphragm to extend over the cavity 320. The diaphragm may undergo flexural motion when the PMUT receives or transmits ultrasonic signals. The PMUT diaphragm may operate in a first flexural mode when receiving or transmitting ultrasonic signals. In some implementations, when operating in the first flexural mode, the inner and outer electrodes may experience a respective first and second oscillating load cycle that includes alternating periods of tensile and compressive stress. The first and second oscillating load cycles may be out of phase, that is, one being tensile while the other is compressive on each side of the inflection zone.
As suggested by the plus and minus signs in the accompanying drawing, charges of opposite polarity may develop on the inner electrode 313 and the outer electrodes 314 during flexure of the PMUT diaphragm, consistent with the opposite states of mechanical stress on each side of the inflection zone.
According to some examples, one or more (and in some cases, all) of the dashes 405 shown in the accompanying drawings may represent individual ultrasonic transducer elements or superpixels, each of which may include a plurality of individual ultrasonic transducer elements.
In these examples, the arrays of ultrasonic transducer elements 402 shown in the accompanying drawings are instances of the arrays of the ultrasound-based haptic system 102 that is described elsewhere herein.
In some implementations, the arrays of ultrasonic transducer elements 402 may include PMUTs. However, in some implementations, the arrays of ultrasonic transducer elements 402 may include one or more other types of ultrasonic transducer elements, such as CMUTs.
In some examples, each of the individual ultrasonic transducer elements 405 (or each of the individual ultrasonic transducer elements in an array of superpixels) may have a diameter in the range of hundreds of microns, such as 200 microns, 300 microns, 400 microns, 500 microns, 600 microns, 700 microns, 800 microns, etc. According to some examples, some arrays of ultrasonic transducer elements 402 may include different sizes of individual ultrasonic transducer elements 405. Such examples may be configured to produce more than one peak frequency of ultrasonic waves. For example, relatively larger ultrasonic transducer elements 405 may be configured for producing relatively lower peak frequencies of ultrasonic waves than relatively smaller ultrasonic transducer elements 405, because the peak frequency is inversely proportional to the diameter squared. According to one such example, an array of ultrasonic transducer elements 402 may include some ultrasonic transducer elements 405 having a diameter of 400 microns and other ultrasonic transducer elements 405 having a diameter of 800 microns. Other examples may include larger ultrasonic transducer elements, smaller ultrasonic transducer elements, or a combination thereof. In some examples, each of the individual ultrasonic transducer elements in a superpixel may have the same diameter.
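By way of illustration only, the following Python sketch applies the inverse-square scaling described above to estimate how element diameter affects peak frequency. The reference diameter and reference frequency are assumed values chosen for illustration, not measurements from this disclosure.

    # Illustrative sketch of f ~ 1 / d**2 scaling; reference values assumed.
    REF_DIAMETER_UM = 400.0   # assumed reference element diameter (microns)
    REF_PEAK_KHZ = 200.0      # assumed peak frequency at the reference diameter

    def peak_frequency_khz(diameter_um):
        """Peak frequency under inverse-square diameter scaling from the
        assumed reference point."""
        return REF_PEAK_KHZ * (REF_DIAMETER_UM / diameter_um) ** 2

    print(peak_frequency_khz(400.0))  # 200.0 kHz at the reference diameter
    print(peak_frequency_khz(800.0))  # 50.0 kHz: doubling the diameter lowers
                                      # the peak frequency by a factor of four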
According to some implementations, a control system (not shown) may control the array of ultrasonic transducer elements 402 to create haptic effects via amplitude modulation of transmitted ultrasonic carrier waves. In some such implementations, a frequency of the ultrasonic carrier wave may be in the range of 20 kHz to 600 kHz. In some implementations, the ultrasonic carrier wave may be an amplitude-modulated carrier wave. According to some such implementations, the frequency of amplitude modulation may be in a range of 40 Hz to 300 Hz.
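By way of illustration only, the following Python sketch generates an amplitude-modulated drive signal of the kind described above: a 40 kHz carrier (within the stated 20 kHz to 600 kHz range) modulated at 200 Hz (within the stated 40 Hz to 300 Hz range). The sample rate, modulation depth and duration are assumptions introduced for this sketch.

    # Illustrative sketch of an amplitude-modulated ultrasonic drive signal.
    import numpy as np

    SAMPLE_RATE = 1_000_000   # 1 MHz sample rate (assumed)
    CARRIER_HZ = 40_000       # within the stated 20 kHz to 600 kHz range
    MODULATION_HZ = 200       # within the stated 40 Hz to 300 Hz range
    DURATION_S = 0.05         # assumed burst duration

    t = np.arange(int(SAMPLE_RATE * DURATION_S)) / SAMPLE_RATE
    carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
    # Full-depth amplitude modulation; skin mechanoreceptors respond to the
    # low-frequency envelope rather than to the ultrasonic carrier itself.
    envelope = 0.5 * (1 + np.sin(2 * np.pi * MODULATION_HZ * t))
    drive_signal = envelope * carrier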
According to some examples, movement of the focus area may create a haptic effect of motion along a trajectory corresponding to differing positions of a range of focus areas, which may include a first focus area 415a and a second focus area 415b, over time. In some examples, the trajectory may be a linear trajectory, a curved trajectory, an oval trajectory, a circular trajectory, a sinusoidal trajectory, or combinations thereof.
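By way of illustration only, the following Python sketch generates a sequence of focus-point positions along a circular trajectory; supplying each successive point to a focusing routine (such as the focal_delays() sketch above) would sweep the haptic sensation around the circle over time. The radius, depth and step count are assumed values introduced for this sketch.

    # Illustrative sketch of focus-point generation along a circular trajectory.
    import numpy as np

    def circular_trajectory(center, radius_m, depth_m, n_steps):
        """Yield successive (x, y, z) focus points around a circle."""
        for k in range(n_steps):
            angle = 2 * np.pi * k / n_steps
            yield np.array([center[0] + radius_m * np.cos(angle),
                            center[1] + radius_m * np.sin(angle),
                            depth_m])

    # Example: a 5 mm radius circle at a 2 cm focus depth, traced in 32 steps.
    for focus_point in circular_trajectory((0.0, 0.0), 5e-3, 2e-2, 32):
        pass  # compute per-element delays for focus_point and transmit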
According to some examples, the control system may control the array of ultrasonic transducer elements 402 to produce at least the focus area 415d, and in some examples both the focus area 415c and the focus area 415d, on or within a person's skin, such as on or in the skin of the wearer 205.
In some examples, a relatively larger focus area 415e may disperse the energy of a beam of ultrasonic waves 410e to the extent that little or no haptic effect is produced, whereas a relatively smaller focus area 415f may concentrate the energy of a beam of ultrasonic waves 410f to the extent that a noticeable haptic effect is produced. By controlling the array of ultrasonic transducer elements 402 to alternate between transmitting relatively less focused and relatively more focused beams of ultrasonic waves, a control system may produce intermittent haptic effects, or haptic effects that change over time.
In this example, the control system is controlling the array of ultrasonic transducer elements 402 to modify both a focus area and a focus depth of transmitted ultrasonic waves. In some such examples, the focus area may be modified in a range from 2 mm to 5 cm. However, alternative examples may involve modifying the focus area in a smaller or a larger range. According to this example, the focus depth changes by at least a distance 420a, which is the distance between the focus area 415e and the focus area 415f. Some such examples may involve modifying the focus depth in a range from 5 mm to 5 cm. However, alternative examples may involve modifying the focus depth in a smaller or a larger range.
In some examples, the control system may control the array of ultrasonic transducer elements 402 to produce at least the focus area 415f, and in some examples the focus areas 415e and 415f, on or in a person's skin, such as the skin of the wearer 205.
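By way of illustration only, the following Python sketch builds a simple transmit schedule that alternates between a tightly focused beam (which may be felt) and a defocused beam (which may disperse energy below the perception threshold), as described above. The focus-area values fall within the stated 2 mm to 5 cm range; the repetition rate is an assumption introduced for this sketch.

    # Illustrative sketch of an alternating focused/defocused transmit schedule.
    TIGHT_FOCUS_AREA_M = 2e-3   # 2 mm focus, within the stated 2 mm to 5 cm range
    LOOSE_FOCUS_AREA_M = 5e-2   # 5 cm focus, energy dispersed below perception
    PULSE_PERIOD_S = 0.1        # 10 Hz on/off alternation (assumed)

    def intermittent_schedule(n_periods):
        """Return a list of (focus_area_m, dwell_s) transmit steps."""
        steps = []
        for _ in range(n_periods):
            steps.append((TIGHT_FOCUS_AREA_M, PULSE_PERIOD_S / 2))  # felt
            steps.append((LOOSE_FOCUS_AREA_M, PULSE_PERIOD_S / 2))  # not felt
        return steps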
In this example, the beam of ultrasonic waves 410g corresponds to relatively lower-frequency transmitted ultrasonic waves and the beam of ultrasonic waves 410h corresponds to relatively higher-frequency transmitted ultrasonic waves. According to this example, the beam of ultrasonic waves 410g is transmitted by ultrasonic transducer elements (or groups of transducer elements) 405a and the beam of ultrasonic waves 410h is transmitted by ultrasonic transducer elements (or groups of transducer elements) 405b. According to some examples, the ultrasonic transducer elements 405a, the ultrasonic transducer elements 405b, or both, may be, or may include, superpixels. In this example, the focus area 415h is relatively smaller than the focus area 415g based, at least in part, on the relatively higher-frequency ultrasonic waves in the beam of ultrasonic waves 410h.
In this example, in addition to modifying a peak frequency of transmitted ultrasonic waves, the control system is controlling the array of ultrasonic transducer elements 402 to modify both a focus area and a focus depth of transmitted ultrasonic waves. In some such examples, the focus area may be modified in a range from 2 mm to 5 cm. However, alternative examples may involve modifying the focus area in a smaller or a larger range. According to this example, the focus depth changes by at least a distance 420b, which is the distance between the focus area 415g and the focus area 415h. Some such examples may involve modifying the focus depth in a range from 5 mm to 5 cm. However, alternative examples may involve modifying the focus depth in a smaller or a larger range.
In some examples, the control system may control the array of ultrasonic transducer elements 402 to produce at least the focus area 415h, and in some examples the focus areas 415g and 415h, on or in a person's skin, such as the skin of the wearer 205.
According to this example, method 500 involves providing extended reality effects. In some examples, method 500 may involve controlling elements in or on a headset, in or on an eyeglass frame, or elements in or on one or more other devices, to provide extended reality effects. The extended reality effects may include augmented reality effects, mixed reality effects, virtual reality effects, or combinations thereof.
In this example, block 505 involves controlling, by a control system, a structure to provide extended reality effects. In some examples, block 505 may involve controlling a display system of a headset, an eyeglass frame, or another device, to provide images corresponding to the extended reality effects. According to some examples, block 505 may involve controlling a loudspeaker system of a headset, an eyeglass frame, or another device, to provide sounds corresponding to the extended reality effects.
According to this example, block 510 involves controlling, by the control system, one or more arrays of ultrasonic transducer elements mounted in or on the structure to create haptic effects via transmitted ultrasonic waves. In some examples, block 510 may involve controlling, by the control system, one or more arrays of PMUTs mounted in or on the structure to create haptic effects via transmitted ultrasonic waves. In some examples, creating haptic effects via transmitted ultrasonic waves may involve transmitting air-coupled ultrasonic waves. Alternatively, or additionally, creating haptic effects via transmitted ultrasonic waves may involve transmitting ultrasonic waves to a wearer of the apparatus via solid material. The solid material may, for example, include a portion of the structure (for example, a portion of the headset or the eyeglass frame) that is configured to be in contact with the wearer of the apparatus.
In some examples, one or more of the haptic effects may be associated with at least one of the extended reality effects. Alternatively, or additionally, one or more of the haptic effects may be synchronized with at least one of the extended reality effects.
According to some examples, method 500 may involve creating haptic effects via beam steering of transmitted ultrasonic waves. In some examples, a beam steering distance of the beam steering may be in a range from 5 mm to 2 cm. According to some examples, method 500 may involve creating haptic effects via beam steering of transmitted ultrasonic waves corresponding to a motion (such as motion of a focus area of the transmitted ultrasonic waves) along a trajectory. For example, method 500 may involve creating haptic effects corresponding to a motion along a linear trajectory, a curved trajectory, an oval trajectory, a circular trajectory, a sinusoidal trajectory or combinations thereof.
Some examples of method 500 may involve controlling at least one array of the one or more arrays of PMUTs to create haptic effects by modifying a focus area of transmitted ultrasonic waves, a focus depth of transmitted ultrasonic waves, or a combination thereof. In some examples, modifying the focus area may involve modifying the focus area in a range from 2 mm to 5 cm. In some examples, modifying the focus depth may involve modifying the focus depth in a range from 5 mm to 5 cm. Some examples of method 500 may involve transmitting a focused beam of ultrasonic waves by the at least one array at a first time and transmitting an unfocused beam of ultrasonic waves by the at least one array at a second time.
Some examples of method 500 may involve creating haptic effects by modifying a peak frequency of transmitted ultrasonic waves. Some examples of method 500 may involve creating haptic effects via amplitude modulation of transmitted ultrasonic carrier waves. According to some such examples, a frequency of amplitude modulation may be in a range of 40 Hz to 300 Hz. In some such examples, a peak frequency of the transmitted ultrasonic carrier waves may be in a range of 20 kHz to 600 kHz.
As used herein, a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. As an example, “at least one of: a, b, or c” is intended to cover: a, b, c, a-b, a-c, b-c, and a-b-c.
The various illustrative logics, logical blocks, modules, circuits and algorithm processes described in connection with the implementations disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. The interchangeability of hardware and software has been described generally, in terms of functionality, and illustrated in the various illustrative components, blocks, modules, circuits and processes described above. Whether such functionality is implemented in hardware or software depends upon the particular application and design constraints imposed on the overall system.
The hardware and data processing apparatus used to implement the various illustrative logics, logical blocks, modules and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose single- or multi-chip processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, or any conventional processor, controller, microcontroller, or state machine. A processor also may be implemented as a combination of computing devices, such as a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. In some implementations, particular processes and methods may be performed by circuitry that is specific to a given function.
In one or more aspects, the functions described may be implemented in hardware, digital electronic circuitry, computer software, firmware, including the structures disclosed in this specification and their structural equivalents, or in any combination thereof. Implementations of the subject matter described in this specification also may be implemented as one or more computer programs, i.e., one or more modules of computer program instructions, encoded on computer storage media for execution by, or to control the operation of, data processing apparatus.
If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium, such as a non-transitory medium. The processes of a method or algorithm disclosed herein may be implemented in a processor-executable software module which may reside on a computer-readable medium. Computer-readable media include both computer storage media and communication media including any medium that may be enabled to transfer a computer program from one place to another. Storage media may be any available media that may be accessed by a computer. By way of example, and not limitation, non-transitory media may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer. Also, any connection may be properly termed a computer-readable medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media. Additionally, the operations of a method or algorithm may reside as one or any combination or set of codes and instructions on a machine-readable medium and computer-readable medium, which may be incorporated into a computer program product.
Various modifications to the implementations described in this disclosure may be readily apparent to those having ordinary skill in the art, and the generic principles defined herein may be applied to other implementations without departing from the spirit or scope of this disclosure. Thus, the disclosure is not intended to be limited to the implementations presented herein, but is to be accorded the widest scope consistent with the claims, the principles and the novel features disclosed herein. The word “exemplary” is used exclusively herein, if at all, to mean “serving as an example, instance, or illustration.” Any implementation described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other implementations.
Certain features that are described in this specification in the context of separate implementations also may be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation also may be implemented in multiple implementations separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination.
Similarly, while operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order presented or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems may generally be integrated together in a single software product or packaged into multiple software products. Additionally, other implementations are within the scope of the following claims. In some cases, the actions recited in the claims may be performed in a different order and still achieve desirable results.
It will be understood that unless features in any of the particular described implementations are expressly identified as incompatible with one another or the surrounding context implies that they are mutually exclusive and not readily combinable in a complementary or supportive sense, the totality of this disclosure contemplates and envisions that specific features of those complementary implementations may be selectively combined to provide one or more comprehensive, but slightly different, technical solutions. It will therefore be further appreciated that the above description has been given by way of example only and that modifications in detail may be made within the scope of this disclosure.