The present invention is directed to a contextual haptic-enabled wearable device, and to a method and apparatus for providing a haptic effect in a context-dependent manner, and has application in gaming, consumer electronics, entertainment, and other situations.
As virtual reality, augmented reality, mixed reality, and other immersive reality environments come into wider use for providing a user interface, haptic feedback has been implemented to augment a user's experience in such environments. Examples of such haptic feedback include kinesthetic haptic effects on a joystick or other gaming peripheral used to interact with the immersive reality environments.
One aspect of the embodiments herein relates to a processing unit or a non-transitory computer-readable medium having instructions stored thereon that, when executed by the processing unit, cause the processing unit to perform a method of providing haptic effects for an immersive reality environment. The method comprises receiving, by the processing circuit, an indication that a haptic effect is to be generated for an immersive reality environment being generated by an immersive reality module. The method further comprises determining, by the processing circuit, a type of immersive reality environment being generated by the immersive reality module, or a type of device on which the immersive reality module is being executed. In the method, the processing circuit controls a haptic output device of a haptic-enabled wearable device to generate the haptic effect based on the type of immersive reality environment being generated by the immersive reality module, or the type of device on which the immersive reality module is being executed.
In an embodiment, the device on which the immersive reality module is being executed has no haptic generation capability.
In an embodiment, the type of the immersive reality environment is one of a two-dimensional (2D) environment, a three-dimensional (3D) environment, a mixed reality environment, a virtual reality (VR) environment, or an augmented reality (AR) environment.
In an embodiment, when the type of the immersive reality environment is a 3D environment, the processing circuit controls the haptic output device to generate the haptic effect based on a 3D coordinate of a hand of a user in a 3D coordinate system of the 3D environment, or based on a 3D gesture in the 3D environment.
In an embodiment, when the type of immersive reality environment is a 2D environment, the processing circuit controls the haptic output device to generate the haptic effect based on a 2D coordinate of a hand of a user in a 2D coordinate system of the 2D environment, or based on a 2D gesture in the 2D environment.
In an embodiment, when the type of immersive reality environment is an AR environment, the processing circuit controls the haptic output device to generate the haptic effect based on a simulated interaction between a virtual object of the AR environment and a physical environment depicted in the AR environment.
In an embodiment, the type of the immersive reality environment is determined to be a second type of immersive reality environment, and the step of controlling the haptic output device to generate the haptic effect comprises retrieving a defined haptic effect characteristic associated with a first type of immersive reality environment, and modifying the defined haptic effect characteristic to generate a modified haptic effect characteristic, wherein the haptic effect is generated with the modified haptic effect characteristic.
In an embodiment, the defined haptic effect characteristic includes a haptic driving signal or a haptic parameter value, wherein the haptic parameter value includes at least one of a drive signal magnitude, a drive signal duration, or a drive signal frequency.
In an embodiment, the defined haptic effect characteristic includes at least one of a magnitude of vibration or deformation, a duration of vibration or deformation, a frequency of vibration or deformation, a coefficient of friction for an electrostatic friction effect or ultrasonic friction effect, or a temperature.
In an embodiment, the type of device on which the immersive reality module is being executed is one of a game console, a mobile phone, a tablet computer, a laptop, a desktop computer, a server, or a standalone head-mounted display (HMD).
In an embodiment, the type of the device on which the immersive reality module is being executed is determined to be a second type of device, and the step of controlling the haptic output device to generate the haptic effect comprises retrieving a defined haptic effect characteristic associated with a first type of device for executing any immersive reality module, and modifying the defined haptic effect characteristic to generate a modified haptic effect characteristic, wherein the haptic effect is generated with the modified haptic effect characteristic.
In an embodiment, the processing unit further determines whether a user who is interacting with the immersive reality environment is holding a haptic-enabled handheld controller configured to provide electronic signal input for the immersive reality environment, wherein the haptic effect generated on the haptic-enabled wearable device is based on whether the user is holding a haptic-enabled handheld controller.
In an embodiment, the haptic effect is further based on what software other than the immersive reality module is being executed or is installed on the device.
In an embodiment, the haptic effect is further based on a haptic capability of the haptic output device of the haptic-enabled wearable device.
In an embodiment, the haptic output device is a second type of haptic output device, and controlling the haptic output device comprises retrieving a defined haptic effect characteristic associated with a first type of haptic output device, and modifying the defined haptic effect characteristic to generate a modified haptic effect characteristic, wherein the haptic effect is generated based on the modified haptic effect characteristic.
One aspect of the embodiments herein relates to a processing unit, or a non-transitory computer-readable medium having instructions thereon that, when executed by the processing unit, cause the processing unit to perform a method of providing haptic effects for an immersive reality environment. The method comprises detecting, by the processing circuit, a simulated interaction between an immersive reality environment and a physical object being controlled by a user of the immersive reality environment. The method further comprises determining, by the processing circuit, that a haptic effect is to be generated for the simulated interaction between the immersive reality environment and the physical object. The method additionally comprises controlling, by the processing circuit, a haptic output device of a haptic-enabled wearable device to generate the haptic effect based on the simulated interaction between the physical object and the immersive reality environment.
In an embodiment, the physical object is a handheld object being moved by a user of the immersive reality environment.
In an embodiment, the handheld object is a handheld user input device configured to provide electronic signal input for the immersive reality environment.
In an embodiment, the handheld object has no ability to provide electronic signal input for the immersive reality environment.
In an embodiment, the simulated interaction includes simulated contact between the physical object and a virtual surface of the immersive reality environment, and the haptic effect is based on a virtual texture of the virtual surface.
In an embodiment, the processing unit further determines a physical characteristic of the physical object, wherein the haptic effect is based on the physical characteristic of the physical object, and wherein the physical characteristic includes at least one of a size, color, or shape of the physical object.
In an embodiment, the processing unit further assigns a virtual characteristic to the physical object, wherein the haptic effect is based on the virtual characteristic, and wherein the virtual characteristic includes at least one of a virtual mass, a virtual shape, a virtual texture, or a magnitude of virtual force between the physical object and a virtual object of the immersive reality environment.
In an embodiment, the haptic effect is based on a physical relationship between the haptic-enabled wearable device and the physical object.
In an embodiment, the haptic effect is based on proximity between the haptic-enabled wearable device and a virtual object of the immersive reality environment.
In an embodiment, the haptic effect is based on a movement characteristic of the physical object.
In an embodiment, the physical object includes a memory that stores profile information describing one or more characteristics of the physical object, wherein the haptic effect is based on the profile information.
In an embodiment, the immersive reality environment is generated by a device that is able to generate a plurality of different immersive reality environments. The processing unit further selects the immersive reality environment from among the plurality of immersive reality environments based on a physical or virtual characteristic of the physical object.
In an embodiment, the processing unit further applies an image classification algorithm to a physical appearance of the physical object to determine an image classification of the physical object, wherein selecting the immersive reality environment from among the plurality of immersive reality environments is based on the image classification of the physical object.
In an embodiment, the physical object includes a memory that stores profile information describing a characteristic of the physical object, wherein selecting the immersive reality environment from among the plurality of immersive reality environments is based on the profile information stored in the memory.
One aspect of the embodiments herein relates to a processing unit, or a non-transitory computer-readable medium having instructions thereon that, when executed by the processing unit, cause the processing unit to perform a method of providing haptic effects for an immersive reality environment. The method comprises determining, by a processing circuit, that a haptic effect is to be generated for an immersive reality environment. The method further comprises determining, by the processing circuit, that the haptic effect is a defined haptic effect associated with a first type of haptic output device. In the method, the processing circuit determines a haptic capability of a haptic-enabled device in communication with the processing circuit, wherein the haptic capability indicates that the haptic-enabled device has a haptic output device that is a second type of haptic output device. In the method, the processing circuit further modifies a haptic effect characteristic of the defined haptic effect based on the haptic capability of the haptic-enabled device in order to generate a modified haptic effect with a modified haptic effect characteristic.
In an embodiment, the haptic capability of the haptic-enabled device indicates at least one of what type(s) of haptic output device are in the haptic-enabled device, how many haptic output devices are in the haptic-enabled device, what type(s) of haptic effect each of the haptic output device(s) is able to generate, a maximum haptic magnitude that each of the haptic output device(s) is able to generate, a frequency bandwidth for each of the haptic output device(s), a minimum ramp-up time or brake time for each of the haptic output device(s), a maximum temperature or minimum temperature for any thermal haptic output device of the haptic-enabled device, or a maximum coefficient of friction for any ESF or USF haptic output device of the haptic-enabled device.
In an embodiment, modifying the haptic effect characteristic includes modifying at least one of a haptic magnitude, haptic effect type, haptic effect frequency, temperature, or coefficient of friction.
One aspect of the embodiments herein relates to a processing unit, or a non-transitory computer-readable medium having instructions thereon that, when executed by the processing unit, cause the processing unit to perform a method of providing haptic effects for an immersive reality environment. The method comprises determining, by a processing circuit, that a haptic effect is to be generated for an immersive reality environment being generated by the processing circuit. The method further comprises determining, by the processing circuit, respective haptic capabilities for a plurality of haptic-enabled devices in communication with the processing circuit. In the method, the processing circuit selects a haptic-enabled device from the plurality of haptic-enabled devices based on the respective haptic capabilities of the plurality of haptic-enabled devices. The method further comprises controlling the haptic-enabled device that is selected to generate the haptic effect, such that no unselected haptic-enabled device generates the haptic effect.
One aspect of the embodiments herein relates to a processing unit, or a non-transitory computer-readable medium having instructions thereon that, when executed by the processing unit, cause the processing unit to perform a method of providing haptic effects for an immersive reality environment. The method comprises tracking, by a processing circuit, a location or movement of a haptic-enabled ring or haptic-enabled glove worn by a user of an immersive reality environment. The method further comprises determining, based on the location or movement of the haptic-enabled ring or haptic-enabled glove, an interaction between the user and the immersive reality environment. In the method, the processing circuit controls the haptic-enabled ring or haptic-enabled glove to generate a haptic effect based on the interaction that is determined between the user and the immersive reality environment.
In an embodiment, the haptic effect is based on a relationship, such as proximity, between the haptic-enabled ring or haptic-enabled glove and a virtual object of the immersive reality environment.
In an embodiment, the relationship indicates proximity between the haptic-enabled ring or haptic-enabled glove and a virtual object of the immersive reality environment.
In an embodiment, the haptic effect is based on a virtual texture or virtual hardness of the virtual object.
In an embodiment, the haptic effect is triggered in response to the haptic-enabled ring or the haptic-enabled glove crossing a virtual surface or virtual boundary of the immersive reality environment, and the haptic effect is a micro-deformation effect that approximates a kinesthetic effect.
In an embodiment, tracking the location or movement of the haptic-enabled ring or the haptic-enabled glove comprises the processing circuit receiving from a camera an image of a physical environment in which the user is located, and applying an image detection algorithm to the image to detect the haptic-enabled ring or haptic-enabled glove.
One aspect of the embodiments herein relates to a system comprising an immersive reality generating device and a haptic-enabled wearable device. The immersive reality generating device has a memory configured to store an immersive reality module for generating an immersive reality environment; a processing unit configured to execute the immersive reality module; and a communication interface for performing wireless communication, wherein the immersive reality generating device has no haptic output device and no haptic generation capability. The haptic-enabled wearable device has a haptic output device and a communication interface configured to wirelessly communicate with the communication interface of the immersive reality generating device, wherein the haptic-enabled wearable device is configured to receive, from the immersive reality generating device, an indication that a haptic effect is to be generated, and to control the haptic output device to generate the haptic effect.
The foregoing and other features, objects and advantages of the invention will be apparent from the following detailed description of embodiments hereof as illustrated in the accompanying drawings. The accompanying drawings, which are incorporated herein and form a part of the specification, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention. The drawings are not to scale.
The following detailed description is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description.
One aspect of the embodiments herein relates to providing a haptic effect for an immersive reality environment, such as a virtual reality environment, an augmented reality environment, or a mixed reality environment, in a context-dependent manner. In some cases, the haptic effect may be based on a context of a user's interaction with the immersive reality environment. One aspect of the embodiments herein relates to providing the haptic effect with a haptic-enabled wearable device, such as a haptic-enabled ring worn on a user's hand. In some cases, the haptic-enabled wearable device may be used in conjunction with an immersive reality platform (also referred to as an immersive reality generating device) that has no haptic generation capability. For instance, the immersive reality platform may have no built-in haptic actuator. Such cases allow the immersive reality platform, such as a mobile phone, to have a slimmer profile and/or less weight. When the mobile phone needs to provide a haptic alert to a user, the mobile phone may provide an indication to the haptic-enabled wearable device that a haptic effect needs to be generated, and the haptic-enabled wearable device may generate the haptic effect. The haptic-enabled wearable device may thus provide a common haptic interface for different immersive reality environments or different immersive reality platforms. The haptic alert generated by the haptic-enabled wearable device may relate to user interaction with an immersive reality environment, or may relate to other situations, such as a haptic alert regarding an incoming phone call or text message being received by the mobile phone. One aspect of the embodiments herein relates to using the haptic-enabled wearable device, such as the haptic-enabled ring, to track a location or movement of a hand of a user, so as to track a gesture or other form of interaction by the user with an immersive reality environment.
As stated above, one aspect of the embodiments herein relates to generating a haptic effect based on a context of a user's interaction with an immersive reality environment. In some cases, the context may refer to what type of immersive reality environment the user is interacting with. For instance, the type of immersive reality environment may be one of a virtual reality (VR) environment, an augmented reality (AR) environment, a mixed reality (MR) environment, a 3D environment, a 2D environment, or any combination thereof. The haptic effect that is generated may differ based on the type of immersive reality environment with which the user is interacting. For example, a haptic effect for a 2D environment may be based on a 2D coordinate of a hand of a user, or motion of a hand of a user, along two coordinate axes of the 2D environment, while a haptic effect for a 3D environment may be based on a 3D coordinate of a hand of the user, or motion of a hand of a user, along three coordinate axes of the 3D environment.
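By way of a non-limiting illustration, the following sketch shows one way the environment type could gate how many coordinate axes of the hand position feed into the haptic effect. All names here (EnvironmentType, HapticEffect, effect_for_hand) and the distance-based policy are hypothetical, not part of any embodiment described herein.

```python
from dataclasses import dataclass
from enum import Enum, auto

class EnvironmentType(Enum):
    TWO_D = auto()
    THREE_D = auto()
    VR = auto()
    AR = auto()
    MR = auto()

@dataclass
class HapticEffect:
    magnitude: float      # normalized 0.0 to 1.0
    duration_ms: int
    frequency_hz: float

def effect_for_hand(env: EnvironmentType, hand_pos: tuple) -> HapticEffect:
    """Derive an effect from the hand coordinate, using only the axes the
    environment type actually tracks (two for 2D, three otherwise)."""
    if env is EnvironmentType.TWO_D:
        x, y = hand_pos[:2]                      # 2D: ignore any depth axis
        distance = (x * x + y * y) ** 0.5
    else:
        x, y, z = hand_pos[:3]                   # 3D/VR/AR/MR: full 3D coordinate
        distance = (x * x + y * y + z * z) ** 0.5
    # Example policy: the effect weakens as the hand moves from the origin.
    return HapticEffect(magnitude=max(0.0, 1.0 - distance),
                        duration_ms=50,
                        frequency_hz=170.0)
```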
In some instances, a context-dependent haptic effect functionality may be implemented as a haptic control module that is separate from an immersive reality module for providing an immersive reality environment. Such an implementation may allow a programmer to create an immersive reality module (also referred to as an immersive reality application) without having to program context-specific haptic effects into the immersive reality module. Rather, the immersive reality module may later incorporate the haptic control module (e.g., as a plug-in) or communicate with the haptic control module to ensure that haptic effects are generated in a context-dependent manner. If an immersive reality module is programmed with, e.g., a generic haptic effect characteristic that is not context-specific, or one that is specific to only one context, the haptic control module may modify the haptic effect characteristic to be specific to other, different contexts. In some situations, an immersive reality module may be programmed without instructions for specifically triggering a haptic effect, or without haptic functionality in general. In such situations, the haptic control module may monitor events occurring within an immersive reality environment and determine when a haptic effect is to be generated.
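As a minimal sketch of this separation, the plug-in-style haptic control module below subscribes to generic events from an immersive reality module and owns the entire event-to-haptics mapping. The class, method, and event names are assumptions for illustration only.

```python
class HapticControlModule:
    """Separate from the immersive reality module: it listens to generic
    events and decides when and how to render haptics, so the immersive
    reality application needs no haptic-specific code of its own."""

    def __init__(self, wearable):
        self.wearable = wearable

    def on_event(self, event: dict) -> None:
        # Monitor events occurring within the immersive reality environment
        # and trigger an effect when one warrants it.
        if event.get("type") == "virtual_collision":
            self.wearable.vibrate(magnitude=0.8, duration_ms=40)

class FakeWearable:
    """Stand-in for a haptic-enabled wearable device driver."""
    def vibrate(self, magnitude: float, duration_ms: int) -> None:
        print(f"vibrate {magnitude} for {duration_ms} ms")

# The immersive reality module only publishes events; the mapping lives here.
module = HapticControlModule(FakeWearable())
module.on_event({"type": "virtual_collision"})
```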
Regarding context-dependent haptic effects, in some cases a context may refer to a type of device on which the immersive reality module (which may also be referred to as an immersive reality application) is being executed. The type of device may be, e.g., a mobile phone, a tablet computer, a laptop computer, a desktop computer, a server, or a standalone head-mounted display (HMD). The standalone HMD may have its own display and processing capability, such that it does not need another device, such as a mobile phone, to generate an immersive reality environment. In some cases, an immersive reality module for each type of device may have a defined haptic effect characteristic (which may also be referred to as a pre-defined haptic effect characteristic) that is specific to that type of device. For instance, an immersive reality module being executed on a tablet computer may have been programmed with a haptic effect characteristic that is specific to tablet computers. If a haptic effect is to be generated on a haptic-enabled wearable device, a pre-existing haptic effect characteristic may have to be modified by a haptic control module in accordance herewith so as to be suitable for the haptic-enabled wearable device. A haptic control module in accordance herewith may thus need to modify the haptic effect based on the type of device on which an immersive reality module is executing.
In an embodiment, a context may refer to what software is being executed or installed on a device executing an immersive reality module (the device may be referred to as an immersive reality generating device, or an immersive reality platform). The software may refer to the immersive reality module itself, or to other software on the immersive reality platform. For instance, a context may refer to an identity of the immersive reality module, such as its name and version, or to a type of immersive reality module (e.g., a first-person shooting game). In another example, a context may refer to what operating system (e.g., Android™, Mac OS®, or Windows®) or other software is running on the immersive reality platform. In an embodiment, a context may refer to what hardware component is on the immersive reality platform. The hardware component may refer to, e.g., a processing circuit, a haptic output device (if any), a memory, or any other hardware component.
In an embodiment, a context of a user's interaction with an immersive reality environment may refer to whether a user is using a handheld gaming peripheral such as a handheld game controller to interact with the immersive reality environment, or whether the user is interacting with the immersive reality environment with only his or her hand and any haptic-enabled wearable device worn on the hand. The handheld game controller may be, e.g., a gamepad such as an Oculus® or Razer® game controller, or a wand such as the Wii® remote device. For instance, a haptic effect on a haptic-enabled wearable device may be generated with a stronger drive signal magnitude if a user is not holding a handheld game controller, relative to a drive signal magnitude for when a user is holding a handheld game controller. In one example, a context may further refer to a haptic capability (if any) of a handheld game controller.
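A hedged sketch of the magnitude policy just described might look as follows; the scaling factors are chosen arbitrarily for illustration.

```python
def wearable_magnitude(base: float,
                       holding_controller: bool,
                       controller_has_haptics: bool = False) -> float:
    """Boost the wearable's drive signal when no handheld controller is
    held, since the wearable is then the user's only haptic channel; back
    off further when a haptic-capable controller can carry the effect."""
    if not holding_controller:
        return min(1.0, base * 1.5)
    return base * (0.5 if controller_has_haptics else 0.8)
```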
In an embodiment, a context may refer to whether and how a user is using a physical object to interact with an immersive reality environment. The physical object may be an everyday object that is not an electronic game controller and has no capability for providing electronic signal input for an immersive reality environment. For instance, the physical object may be a toy car that a user picks up to interact with a virtual race track of an immersive reality environment. The haptic effect may be based on presence of the physical object, and/or how the physical object is interacting with the immersive reality environment. In an embodiment, a haptic effect may be based on a physical characteristic of a physical object, and/or a virtual characteristic assigned to a physical object. In an embodiment, a haptic effect may be based on a relationship between a physical object and a haptic-enabled wearable device, and/or a relationship between a physical object and a virtual object of an immersive reality environment.
In an embodiment, a physical object may be used to select which immersive reality environment of a plurality of immersive reality environments is to be generated on an immersive reality platform. The selection may be based on, e.g., a physical appearance (e.g., size, color, shape) of the physical object. For instance, if a user picks up a physical object that is a Hot Wheels® toy, an immersive reality platform may use an image classification algorithm to classify a physical appearance of the physical object as that of a car. As a result, an immersive reality environment related to cars may be selected to be generated. The selection need not rely on image classification alone, or may not rely on image classification at all. For instance, a physical object may in some examples have a memory that stores a profile that indicates characteristics of the physical object. The characteristics in the profile may, e.g., identify a classification of the physical object as a toy car.
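The selection logic might be sketched as below, where `classify` stands in for any image classification algorithm and the environment names are hypothetical.

```python
def select_environment(image, classify, modules: dict):
    """Map the image class of a picked-up physical object (e.g., "car" for
    a toy car) to an immersive reality module; fall back to a default
    module when no entry matches the label."""
    label = classify(image)
    return modules.get(label, modules["default"])

# Usage with any classifier callable, e.g.:
# module = select_environment(camera_frame, my_cnn.predict_label,
#                             {"car": race_track_env, "default": menu_env})
```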
In an embodiment, a context may refer to which haptic-enabled devices are available to generate a haptic effect for an immersive reality environment, and/or capabilities of the haptic-enabled devices. The haptic-enabled devices may be wearable devices, or other types of haptic-enabled devices. In some instances, a particular haptic-enabled device may be selected from among a plurality of haptic-enabled devices based on its haptic capability. In some instances, a haptic effect characteristic may be modified so as to be better suited to a haptic capability of a selected device.
In an embodiment, a haptic-enabled wearable device may be used to perform hand tracking in an immersive reality environment. For instance, an image recognition algorithm may detect a location, orientation, or movement of a haptic-enabled ring or haptic-enabled glove, and use that location, orientation, or movement of the haptic-enabled wearable device to determine, or as a proxy for, a location, orientation, or movement of a hand of a user. The haptic-enabled wearable device may thus be used to determine interaction between a user and an immersive reality environment.
In an embodiment, the immersive reality generating device 110A may have no haptic generation capability. For instance, the immersive reality generating device 110A may be a mobile phone that has no haptic output device. The omission of the haptic output device may allow the mobile phone to have a reduced thickness, reduced weight, and/or longer battery life. Thus, some embodiments herein relate to a combination of an immersive reality generating device and a haptic-enabled wearable device in which the immersive reality generating device has no haptic generation capability and relies on the haptic-enabled wearable device to generate a haptic effect.
In
In
In the embodiment of
In some cases, the immersive reality generating device 110A further includes a sensor 117 that captures information from which the context is determined. In an embodiment, the sensor 117 may include a camera, an infrared detector, an ultrasound detection sensor, a Hall effect sensor, a lidar or other laser-based sensor, radar, or any combination thereof. If the sensor 117 is an infrared detector, the system 100A may include, e.g., a set of stationary infrared emitters (e.g., infrared LEDs) that are used to track user movement within an immersive reality environment. In an embodiment, the sensor 117 may be part of a simultaneous localization and mapping (SLAM) system. In an embodiment, the sensor 117 may include a device that is configured to generate an electromagnetic field and detect movement within the field due to changes in the field. In an embodiment, the sensor may include devices configured to transmit wireless signals to determine position or movement via triangulation. In an embodiment, the sensor 117 may include an inertial sensor, such as an accelerometer, gyroscope, or any combination thereof. In an embodiment, the sensor 117 may include a global positioning system (GPS) sensor. In an embodiment, the haptic-enabled wearable device 120A, such as a haptic-enabled ring, may include a camera or other sensor for the determination of context information.
In an embodiment, the context determination module 111b may be configured to determine context based on data from the sensor 117. For instance, the context determination module 111b may be configured to apply a convolutional neural network or other machine learning algorithm, or more generally an image processing algorithm, to a camera image or other data from the sensor 117. The image processing algorithm may, for instance, detect presence of a physical object, as described above, and/or determine a classification of a physical appearance of a physical object. In another example, the image processing algorithm may detect a location of a haptic-enabled device worn on a user's hand, or of the user's hand directly, in order to perform hand tracking or hand gesture recognition. In an embodiment, the context determination module 111b may be configured to communicate with the immersive reality module 111a and/or an operating system of the device 110A in order to determine, e.g., a type of immersive reality environment being generated by the immersive reality module 111a, or a type of device 110A on which the immersive reality module 111a is being executed.
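One possible shape for the context record assembled by such a context determination module is sketched below; `detect_objects` and `platform` stand in for the sensor-processing and system-query steps and are assumptions rather than a defined API.

```python
def determine_context(frame, detect_objects, platform) -> dict:
    """Assemble a context record from a camera frame plus queries to the
    immersive reality module / operating system."""
    return {
        "physical_objects": detect_objects(frame),        # e.g. ["toy_car"]
        "environment_type": platform.environment_type(),  # e.g. "AR"
        "device_type": platform.device_type(),            # e.g. "mobile_phone"
    }
```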
In an embodiment, the haptic control module 111c may be configured to control a manner in which to generate a haptic effect, and to do so based on, e.g., a context of a user's interaction with the immersive reality environment. In the embodiment of
As illustrated in
In an embodiment, the haptic-enabled wearable device 120A may be a type of body-grounded haptic-enabled device. In an embodiment, the haptic-enabled wearable device 120A may be a device worn on a user's hand or wrist, such as a haptic-enabled ring, haptic-enabled glove, haptic-enabled watch or wrist band, or a fingernail attachment. Haptic-enabled rings are discussed in more detail in U.S. Patent Appl. No. (IMM753), titled “Haptic Ring,” the entire content of which is incorporated by reference herein. In an embodiment, the haptic-enabled wearable device 120A may be a head band, a gaming vest, a leg strap, an arm strap, an HMD, a contact lens, or any other haptic-enabled wearable device.
In an embodiment, the haptic output device 127 may be configured to generate a haptic effect in response to a haptic command. In some instances, the haptic output device 127 may be the only haptic output device on the haptic-enabled wearable device 120A, or may be one of a plurality of haptic output devices on the haptic-enabled wearable device 120A. In some cases, the haptic output device 127 may be an actuator configured to output a vibrotactile haptic effect. For instance, the haptic output device 127 may be an eccentric rotating mass (ERM) actuator, a linear resonant actuator (LRA), a solenoid resonant actuator (SRA), an electromagnet actuator, a piezoelectric actuator, a macro-fiber composite (MFC) actuator, or any other vibrotactile haptic actuator. In some cases, the haptic output device 127 may be configured to generate a deformation haptic effect. For instance, the haptic output device 127 may use a smart material such as an electroactive polymer (EAP), a macro-fiber composite (MFC) piezoelectric material (e.g., a MFC ring), a shape memory alloy (SMA), a shape memory polymer (SMP), or any other material that is configured to deform when a voltage, heat, or other stimulus is applied to the material. The deformation effect may alternatively be created in any other manner. In an embodiment, the deformation effect may squeeze, e.g., a user's finger, and may be referred to as a squeeze effect. In some cases, the haptic output device 127 may be configured to generate an electrostatic friction (ESF) haptic effect or an ultrasonic friction (USF) effect. In such cases, the haptic output device 127 may include one or more electrodes, which may be exposed on a surface of the haptic-enabled wearable device 120A or may be slightly electrically insulated beneath the surface, along with a signal generator for applying a signal to the one or more electrodes. In some cases, the haptic output device 127 may be configured to generate a temperature-based haptic effect. For instance, the haptic output device 127 may be a Peltier device configured to generate a heating effect or a cooling effect. In some cases, the haptic output device may be, e.g., an ultrasonic device that is configured to project air toward a user.
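The variety of haptic output devices suggests a dispatch layer that maps one abstract intensity to device-appropriate commands. The sketch below is illustrative only; the command strings stand in for real driver calls, and the numeric mappings are assumptions.

```python
from enum import Enum, auto

class ActuatorType(Enum):
    LRA = auto()      # linear resonant actuator (vibrotactile)
    ERM = auto()      # eccentric rotating mass (vibrotactile)
    SMA = auto()      # shape memory alloy (deformation / squeeze)
    PELTIER = auto()  # thermal
    ESF = auto()      # electrostatic friction

def command_for(actuator: ActuatorType, intensity: float) -> str:
    """Translate one abstract intensity (0.0 to 1.0) into an
    actuator-appropriate command string."""
    if actuator in (ActuatorType.LRA, ActuatorType.ERM):
        return f"vibrate amplitude={intensity:.2f}"
    if actuator is ActuatorType.SMA:
        return f"squeeze displacement={intensity:.2f}"
    if actuator is ActuatorType.PELTIER:
        # Offset from a roughly skin-neutral 33 C, warmer with intensity.
        return f"set_temperature {33.0 + 10.0 * intensity:.1f}C"
    return f"set_friction coefficient={0.2 + 0.6 * intensity:.2f}"
```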
In an embodiment, one or more components of the immersive reality generating device 110A may be supplemented with or replaced by an external component. For instance,
In
In an embodiment, the haptic control functionality may reside at least partially in a haptic-enabled wearable device. For instance,
In an embodiment, the context determination functionality may reside at least partially on a haptic-enabled wearable device. For instance,
In an embodiment, the functionality of a haptic control module may be implemented on a device that is external to both an immersive reality generating device and to a haptic-enabled wearable device. For instance,
As stated above, in some cases a context of a user's interaction with an immersive reality environment may refer to a type of immersive reality environment being generated, or a type of immersive reality generating device on which the immersive reality environment is being generated.
In an embodiment, the hand H and/or the haptic-enabled wearable device 270 may be used as a proxy for a virtual cursor that is used to interact with an immersive reality environment. For instance,
Similarly,
Further, FIG. 2D illustrates an example of an AR environment displayed on the HMD 250. The AR environment may display a physical environment, such as a park in which a user of the AR environment is located, and may display a virtual object 289 superimposed on an image of the physical environment. In an embodiment, the virtual object 289 may be controlled based on movement of the user's hand H and/or of the haptic-enabled wearable device 270. The embodiments in
In an embodiment, the method 300 begins at step 301, in which the processing circuit 113/123/153 receives an indication that a haptic effect is to be generated for an immersive reality environment being generated by an immersive reality module, such as the immersive reality module 111a. The indication may include a command from the immersive reality module 111a, or may include an indication that a particular event (e.g., a virtual collision) within the immersive reality environment has occurred, wherein the event triggers a haptic effect.
In step 303, the processing circuit 113/123/153 may determine a type of immersive reality environment being generated by the immersive reality module 111a, or a type of device on which the immersive reality module 111a is being executed. In an embodiment, the types of immersive reality environment may include a two-dimensional (2D) environment, a three-dimensional (3D) environment, a mixed reality environment, a virtual reality environment, or an augmented reality environment. In an embodiment, the types of device on which the immersive reality module 111a is executed may include a desktop computer, a laptop computer, a server, a standalone HMD, a tablet computer, or a mobile phone.
In step 305, the processing circuit 113/123/153 may control the haptic-enabled wearable device 120A/120C/120D/270 to generate the haptic effect based on the type of immersive reality environment being generated by the immersive reality module 111a, or on the type of device on which the immersive reality module is being executed.
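Steps 301 through 305 might be strung together as in the following sketch, in which every method name on `processing_circuit` and `wearable` is a hypothetical placeholder rather than a defined interface.

```python
def provide_contextual_haptics(processing_circuit, wearable):
    """Sequence steps 301-305: receive the indication, determine the
    context (environment type and/or device type), then render the
    context-appropriate effect on the wearable."""
    indication = processing_circuit.wait_for_haptic_indication()  # step 301
    env_type = processing_circuit.environment_type()              # step 303
    device_type = processing_circuit.device_type()                # step 303
    effect = processing_circuit.lookup_effect(indication,         # step 305:
                                              env_type,           # pick the
                                              device_type)        # variant
    wearable.render(effect)
```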
For instance,
In another example of step 305,
In another example of step 305, FIG. 2D illustrates an example of an AR environment displayed on the HMD 250. In this example, the AR environment displays a virtual object 289 superimposed on an image of a physical environment, such as a park. In an embodiment, the processing circuit 113/123/153 may control a haptic output device of the haptic-enabled wearable device 270 to generate a haptic effect based on a simulated interaction between the virtual object 289 and the image of the physical environment, such as the virtual object 289 driving over the grass of the park in the image of the physical environment. The haptic effect may be based on, e.g., a simulated traction (or, more generally, friction) between the virtual object 289 and the grass in the image of the physical environment, a velocity of the virtual object 289 within a coordinate system of the AR environment, a virtual characteristic (also referred to as a virtual property) of the virtual object 289, such as a virtual tire quality, or any other characteristic.
In an embodiment, a haptic effect of the method 300 may be based on whether the user of the immersive reality environment is holding a handheld user input device, such as a handheld game controller or other gaming peripheral. For instance, a drive signal magnitude of the haptic effect on the haptic-enabled wearable device 270 may be higher if the user is not holding a handheld user input device.
In an embodiment, a haptic effect may be further based on a haptic capability of the haptic-enabled wearable device. In an embodiment, the haptic capability indicates at least one of a type or strength of haptic effect the haptic-enabled wearable device 270 is capable of generating thereon, wherein the strength may refer to, e.g., maximum acceleration, deformation, pressure, or temperature. In an embodiment, the haptic capability of the haptic-enabled wearable device 270 indicates at least one of what type(s) of haptic output device are in the haptic-enabled device, how many haptic output devices are in the haptic-enabled device, what type(s) of haptic effect each of the haptic output device(s) is able to generate, a maximum haptic magnitude that each of the haptic output device(s) is able to generate, a frequency bandwidth for each of the haptic output device(s), a minimum ramp-up time or brake time for each of the haptic output device(s), a maximum temperature or minimum temperature for any thermal haptic output device of the haptic-enabled device, or a maximum coefficient of friction for any ESF or USF haptic output device of the haptic-enabled device.
In an embodiment, step 305 may involve modifying a haptic effect characteristic, such as a haptic parameter value or a haptic driving signal, used to generate a haptic effect. For instance, step 305 may involve the haptic-enabled wearable device 270, which may be a second type of haptic-enabled device, such as a haptic-enabled ring. In such an example, step 305 may involve retrieving a defined haptic driving signal or a defined haptic parameter value associated with a first type of haptic-enabled wearable device, such as a haptic wrist band. The step 305 may involve modifying the defined haptic driving signal or the defined haptic parameter value based on a difference between the first type of haptic-enabled device and the second type of haptic-enabled device.
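A minimal sketch of such a modification, assuming arbitrary scaling factors between a wrist-band-defined effect and a ring, is shown below.

```python
# Hypothetical scaling from a wrist-band-defined effect to a ring: the ring
# has less actuator mass and skin contact area, so (as an assumed policy)
# drive it harder but more briefly.
SCALING = {("wrist_band", "ring"): (1.4, 0.75)}   # (magnitude, duration)

def adapt(effect: dict, from_type: str, to_type: str) -> dict:
    """Modify a defined haptic effect characteristic based on the
    difference between the first and second device types."""
    mag_scale, dur_scale = SCALING.get((from_type, to_type), (1.0, 1.0))
    return {
        "magnitude": min(1.0, effect["magnitude"] * mag_scale),
        "duration_ms": int(effect["duration_ms"] * dur_scale),
    }

# ring_effect = adapt({"magnitude": 0.5, "duration_ms": 80},
#                     "wrist_band", "ring")
```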
As stated above, a context of user interaction may refer to a manner in which a user is using a physical object to interact with an immersive reality environment.
In an embodiment, the physical object P may be detected or otherwise recognized based on sensor data from the sensor 230. For instance, the sensor 230 may be a camera configured to capture an image of a user's forward field of view. In the embodiment, the context determination module 111b may be configured to apply an image recognition algorithm to detect the presence of the physical object P.
In an embodiment, the method 500 may begin at step 501, in which the processing circuit 113/123/153 may detect a simulated interaction between a physical object and an immersive reality environment. For instance, step 501 may involve the processing circuit 113 detecting a simulated interaction between the physical toy car and a virtual racetrack of the immersive reality environment depicted in
In step 503, the processing circuit 113/123/153 may determine a haptic effect to be generated based on the simulated interaction between the physical object and the immersive reality environment. For instance, step 503 may involve the processing circuit 113 adjusting a haptic effect magnitude based on a level of the simulated friction between the physical toy car and the virtual racetrack of
In step 505, the processing circuit 113/123/153 may control a haptic output device in communication with the processing circuit 113/123/153, such as a haptic output device of the haptic-enabled wearable device 270, to generate the haptic effect based on the simulated interaction.
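The magnitude adjustment of step 503 might reduce to a policy like the following sketch, where the baseline and clamping values are arbitrary illustrations.

```python
def interaction_magnitude(speed: float,
                          virtual_friction: float,
                          base: float = 0.2) -> float:
    """Scale the effect with the simulated interaction: faster motion of
    the physical object (e.g., a toy car) over a rougher virtual texture
    (e.g., the racetrack) yields a stronger effect, clamped to the
    drivable range."""
    return min(1.0, base + speed * virtual_friction)
```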
In an embodiment, the haptic effect may be based on a physical relationship between the physical object P and a haptic-enabled wearable device 270/271. For instance, the haptic effect may have a magnitude (e.g., magnitude of deformation, vibration, friction, or temperature effect) that is based on a proximity between the physical object P and the haptic-enabled wearable device 271 in
As stated above, a haptic effect may be based on a physical characteristic of the physical object P, such as a size, weight, or physical appearance of the physical object. For instance, a physical object having a first size may be associated with a first haptic magnitude, and a physical object having a second, larger size may be associated with a second, higher haptic magnitude. In an embodiment, the haptic effect may be based on an image classification of the physical appearance of the physical object. The image classification may be performed via an image classification algorithm. For instance, the image classification algorithm may classify the physical object as a car, which may affect the haptic effect that is generated. In some instances, the image classification may affect what immersive reality environment is generated.
In an embodiment, a physical object may be assigned one or more virtual properties (also referred to as virtual characteristics), such as a virtual mass, a virtual appearance (e.g., virtual shape), a virtual texture, a virtual charge, or a virtual force of attraction or repulsion. For instance, with reference to
In an embodiment, a physical object may be used to determine which immersive reality module to execute, or more generally which immersive reality environment to generate. For instance, with reference to
As stated above, a physical object may have a storage medium that stores a profile describing characteristics of the physical object. In an embodiment, a selection of the immersive reality environment may be based on the profile. In one example, the profile may describe a physical object as being a car. As a result, the first immersive reality environment noted above may be selected to be generated.
As stated above, a context of user interaction in an immersive reality environment may in an embodiment refer to a haptic capability of a haptic-enabled device (e.g., a haptic-enabled wearable device), and a haptic effect may be generated based on the haptic capability of a haptic-enabled device. For instance,
In an embodiment, the method 600 may begin at step 601, in which a processing circuit determines that a haptic effect is to be generated for an immersive reality environment. The processing circuit may be, e.g., processing circuit 113 executing haptic control module 111c.
In step 603, the processing circuit may determine that the haptic effect is a defined haptic effect (also referred to as a pre-defined haptic effect) associated with a first type of haptic output device. For instance, the processing circuit may determine that the haptic effect is associated with an ERM actuator, and has haptic effect characteristics associated with the ERM actuator.
In step 605, the processing circuit may determine a haptic capability of the haptic-enabled device in communication with the processing circuit, wherein the haptic capability indicates that the haptic-enabled device has a haptic output device that is a second type of haptic output device different from the first type of haptic output device. For instance, with reference to
In step 607, the processing circuit may modify a haptic effect characteristic of the defined haptic effect based on the haptic capability of the haptic-enabled device in order to generate a modified haptic effect with a modified haptic effect characteristic. For instance, with reference to
In an embodiment, the haptic capability may indicate, e.g., at least one of a type of haptic effect the haptic-enabled user interface device is capable of generating thereon, a maximum magnitude that the haptic-enabled user interface device is capable of generating for the haptic effect, a total number of haptic output devices included in the haptic-enabled user interface device, a bandwidth or frequency band of haptic effects that the haptic-enabled user interface device is able to generate, or a minimum response time that the haptic-enabled user interface device can achieve for ramping up the haptic effect to a steady state or for braking the haptic effect to a substantially complete stop.
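Given such a capability indication, the modification of step 607 might be sketched as follows for the ERM-to-LRA example above. The field names, dataclass, and numeric values are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HapticCapability:
    actuator_type: str                             # e.g. "ERM" or "LRA"
    max_magnitude: float                           # normalized 0.0 to 1.0
    resonant_frequency_hz: Optional[float] = None  # LRAs only

def retarget(erm_effect: dict, capability: HapticCapability) -> dict:
    """An ERM couples frequency and amplitude through motor speed, while an
    LRA plays amplitude at a fixed resonant frequency, so pin the frequency
    to resonance and clamp the magnitude to what the device can produce."""
    effect = dict(erm_effect)
    if capability.actuator_type == "LRA":
        effect["frequency_hz"] = capability.resonant_frequency_hz
    effect["magnitude"] = min(erm_effect["magnitude"],
                              capability.max_magnitude)
    return effect

# modified = retarget(
#     {"magnitude": 0.9, "frequency_hz": 120.0, "duration_ms": 60},
#     HapticCapability("LRA", 0.7, resonant_frequency_hz=175.0))
```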
In an embodiment, the haptic capabilities of various haptic-enabled devices may be used to select between them. For example, step 607 above may be supplemented or replaced by a step in which the processing circuit selects a haptic-enabled device other than device 120A to generate a haptic effect. Referring to
As stated above, a haptic-enabled wearable device may in an embodiment facilitate hand tracking or hand gesture detection.
In an embodiment, the method 700 begins at step 701, in which the processing circuit tracks a location or movement of a haptic-enabled ring (e.g., 270) or haptic-enabled glove worn by a user of an immersive reality environment. In some cases, step 701 may involve applying an image processing algorithm that is configured to detect a shape of the haptic-enabled ring or haptic-enabled glove. In some instances, step 701 may involve tracking a wireless signal emitted by the haptic-enabled ring or haptic-enabled glove. In some instances, step 701 may involve performing infrared detection to detect any heat emitted by the haptic-enabled ring or haptic-enabled glove.
In step 703, the processing circuit determines, based on the location or movement of the haptic-enabled ring or haptic-enabled glove, an interaction between a user and the immersive reality environment. For instance, the processing circuit may use a location of the haptic-enabled ring or haptic-enabled glove to determine a location of the user in a coordinate system of the immersive reality environment. This determination may indicate how far away the hand is from, e.g., a virtual object of the immersive reality environment. In another example, the processing circuit may detect a hand gesture based on movement of the haptic-enabled ring or haptic-enabled glove (e.g., a hand gesture for switching between a VR environment and a mixed reality environment). Gesture detection is discussed in more detail in U.S. patent application Ser. No. 15/958,617, titled “Systems, Devices, and Methods for Providing Immersive Reality Interface Modes,” the entire content of which is incorporated by reference herein. In an additional example, the processing circuit may use movement of the haptic-enabled ring or haptic-enabled glove as an approximation of movement of a physical object, such as physical object P in
In step 705, the processing circuit controls the haptic-enabled ring or haptic-enabled glove to generate a haptic effect based on the interaction that is determined between the user and the immersive reality environment. In an embodiment, the haptic effect is based on a relationship, such as proximity, between the haptic-enabled ring or haptic-enabled glove and a virtual object of the immersive reality environment. In an embodiment, the haptic effect is based on a virtual texture or virtual hardness of the virtual object. In an embodiment, the haptic effect is triggered in response to the haptic-enabled ring or the haptic-enabled glove crossing a virtual surface or virtual boundary of the immersive reality environment, and the haptic effect is a micro-deformation effect that approximates a kinesthetic effect.
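Steps 701 through 705 might be combined as in the sketch below, where `tracker` and `ring` are hypothetical stand-ins for the tracking source and the wearable's haptic interface, and the boundary test is an assumed simplification to a single axis.

```python
def ring_boundary_haptics(tracker, ring, boundary_z: float = 0.0):
    """Track the ring's location (step 701), detect when it crosses a
    virtual boundary (step 703), and trigger a micro-deformation that
    approximates a kinesthetic effect (step 705)."""
    previous_z = None
    for x, y, z in tracker.positions():            # step 701: tracking
        if previous_z is not None:
            # Step 703: a sign change means the ring crossed the boundary.
            if (previous_z - boundary_z) * (z - boundary_z) < 0:
                ring.micro_deform(magnitude=0.6,   # step 705: render the
                                  duration_ms=30)  # micro-deformation
        previous_z = z
```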
In one example of the above embodiments, a user may be wearing a haptic-enabled ring. The user may have a mobile device, such as a mobile phone, with a small form factor. To achieve that size, the mobile device may have no built-in haptic actuator; it instead communicates with the haptic-enabled ring. When the user receives alerts or otherwise interacts with the mobile device, haptic effects (e.g., haptic sensations) are rendered on the haptic-enabled ring. When the user puts the mobile device into an HMD shell, the user's hands can be tracked while the user is interacting with a virtual reality (VR) world. As the user interacts with physical or virtual objects, haptic effects can be rendered on the haptic-enabled ring. With a quick gesture, the user's VR experience can turn into a mixed reality experience. The user can now interact with physical objects and virtual objects. A system that is generating the mixed reality environment may use camera recognition to be aware of what objects the user is interacting with and whether those objects need haptic rendering. For example, the user may pick up a small Hot Wheels® car and load a virtual race track. As the user moves the physical car, haptic effects are rendered on the user's haptic-enabled ring (on the hand moving the car). The haptic effect may be based on a property such as a velocity of the car's motion and the virtual texture beneath the car. If the user is wearing the haptic-enabled ring on his or her off hand (the hand that is not holding the car), the haptic effects may render differently on that hand (which may be referred to more generally as an endpoint) based on the interaction that the user is performing as well as on any spatial interaction occurring around the user.
In an embodiment, the haptic-enabled wearable device of
While various embodiments have been described above, it should be understood that they have been presented only as illustrations and examples of the present invention, and not by way of limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the invention. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the appended claims and their equivalents. It will also be understood that each feature of each embodiment discussed herein, and of each reference cited herein, can be used in combination with the features of any other embodiment. All patents and publications discussed herein are incorporated by reference herein in their entirety.