Embodiments hereof relate to systems, devices and methods for providing interface modes for performing actions in immersive reality environments. In particular, embodiments hereof provide haptic and visual cues and indicators to facilitate and improve interactions with immersive reality environments.
Immersive reality environments, including augmented reality, mixed reality, merged reality, and virtual reality, are increasing in popularity and use. The modes and systems within which users interact with these environments are not standardized, are often clumsy and non-intuitive, and rely primarily on visual and audio feedback. The inadequacy of existing interface modes limits the use and appeal of interactive immersive reality systems.
These and other drawbacks exist with conventional interactive immersive reality systems. These drawbacks are addressed by the inventions described herein.
Embodiments of the invention include systems configured to generate varying interface modes to facilitate and improve user interaction with immersive reality environments. The interface modes provided are specifically configured for ease of interaction and for training human users to interact with the interface modes through gestures that are easily trained and stored in muscle memory. The interface modes are aided by the provision of haptic, visual, and audio feedback to provide users with a full sensory experience during interaction with the interface modes discussed herein. The systems further include one or more of an immersive reality display, one or more haptic output devices, and one or more gesture detection systems. The gesture detection systems are used to capture gestures made by the user. The immersive reality display and the haptic output devices are used to provide visual and haptic feedback, respectively. Audio outputs may further be included as a standalone device or as an additional feature of any of the other described devices.
A computing unit of the system provides interface modes that can be accessed by the user based on an initiation gesture. During interaction, the haptic output devices provide haptic feedback to guide the user through the interface mode by “feel.” The haptic output assists a user in learning the appropriate gestures to work with the interface mode without requiring full reliance on visuospatial skills in the immersive reality environment.
In an embodiment, a system for providing haptic effects during an immersive reality interaction is provided. The system comprises at least one sensor configured to detect user gesture information, at least one haptic output device configured to provide a haptic effect to the user, an immersive reality display device, and at least one processor. The at least one processor is configured to receive the user gesture information, determine an initiation gesture from the gesture information, activate an interface mode according to the initiation gesture, the interface mode including an interactive visual display provided via the immersive reality display device, anchor the interface mode according to a location of the initiation gesture, determine an interaction gesture from the user gesture information, provide a dynamic haptic effect in response to the interaction gesture, determine a selection gesture from the gesture information, determine a selection haptic effect in response to the selection gesture, and provide a selection visual display via the immersive reality display device.
In an embodiment, a computer-implemented method of providing haptic effects during an immersive reality interaction is provided. The method comprises detecting user gesture information by at least one sensor, providing an immersive reality display via an immersive reality display device, receiving, by at least one processor, the user gesture information, determining, by the at least one processor, an initiation gesture from the gesture information, activating, by the at least one processor, an interface mode according to the initiation gesture, the interface mode including an interactive visual display provided via the immersive reality display device, anchoring, by the at least one processor, the interface mode according to a location of the initiation gesture, determining, by the at least one processor, an interaction gesture from the user gesture information, providing a dynamic, static, or discrete haptic effect by at least one haptic output device in response to the interaction gesture, determining, by the at least one processor, a selection gesture from the gesture information, determining, by the at least one processor, a selection haptic effect in response to the selection gesture, delivering the selection haptic effect by the at least one haptic output device, and providing a selection visual display by the immersive reality display device.
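By way of illustration only, the following Python sketch outlines the above sequence of operations as a single processing loop. The class, attribute, and method names (e.g., InterfaceModeController, play_discrete) are hypothetical and show one possible arrangement under assumed sensor, haptic, and display interfaces; they are not part of the claimed method.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class GestureSample:
    position: Tuple[float, float, float]   # tracked location of the gesture entity
    pose: str                              # e.g. "fist", "open_hand"


class InterfaceModeController:
    def __init__(self, sensor, haptic_device, display):
        self.sensor = sensor               # assumed gesture detection interface
        self.haptic = haptic_device        # assumed haptic output interface
        self.display = display             # assumed immersive reality display interface
        self.anchor: Optional[Tuple[float, float, float]] = None
        self.active = False

    def step(self) -> None:
        sample = self.sensor.read()                        # receive user gesture information
        if not self.active and sample.pose == "fist":      # initiation gesture (assumed pose)
            self.anchor = sample.position                  # anchor at the initiation location
            self.display.show_menu(self.anchor)            # interactive visual display
            self.haptic.play_discrete("initiation")        # initiation haptic effect
            self.active = True
        elif self.active and sample.pose == "fist":        # interaction gesture while active
            self.haptic.play_dynamic(sample.position)      # dynamic haptic effect
            self.display.move_cursor(sample.position)
        elif self.active and sample.pose == "open_hand":   # selection gesture (assumed pose)
            self.haptic.play_discrete("selection")         # selection haptic effect
            self.display.show_selection()                  # selection visual display
            self.active = False
```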
The foregoing and other features and advantages of the invention will be apparent from the following description of embodiments hereof as illustrated in the accompanying drawings. The accompanying drawings, which are incorporated herein and form a part of the specification, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention. The drawings are not to scale.
Specific embodiments of the present invention are now described with reference to the figures. The following detailed description is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description.
Embodiments of the present invention are directed to immersive reality interface modes involving mixed visual and haptic effects. Immersive reality, as used herein, describes visual display systems that provide altered reality viewing to a user. Immersive reality environments include virtual reality environments, augmented reality environments, mixed reality environments, and merged reality environments, as well as other similar visual environments. Immersive reality environments are designed to provide visual display environments that mimic a realistic viewing experience and include panoramic imaging where a user's movements determine the display. As a user turns their head or body, the images displayed to the user are adjusted as if the user were inside the immersive reality environment. Immersive reality environments frequently include stereoscopic or other three-dimensional imaging technologies to improve realism. Immersive reality environments may include any mix of real and virtual objects that may or may not interact with one another.
When engaged in immersive reality environments, users may interact with the environment via various means. Users may operate control devices, such as joysticks and other objects that provide input to the immersive reality environment through buttons, triggers, and/or keys on the objects. Users may also interact with the environment through gestures. Gesture interactions may be performed with the hands, fingers, or other body parts and may be aided by wearable devices such as gloves, wristbands, and other items, as well as handheld devices such as wands or other handheld tracking devices.
Gesture interaction may be facilitated via the gesture detection system 130. The gesture detection system 130 uses one or more sensors 131 to capture information related to movement and positioning of a gesture entity 150. The gesture entity 150 is the object or body part employed by the user to perform gestures recognizable by the system. The gesture entity 150 may be a physical hand-held object, such as a wand or controller. The gesture entity 150 may be a wearable device, such as a glove, a ring, a wrist-band, a watch, or any other wearable device that may be trackable. Finally, the gesture entity 150 may be the user's body parts, such as their hands or fingers. When a user's body parts function as the gesture entity 150, tracking by the gesture detection system 130 may be augmented by attachments, as further described below. In embodiments, the gesture entity 150 may include multiple gesture entities 150—e.g., both hands, a hand and an object, etc.
The gesture detection system 130 includes one or more sensors 131 to track movement and positioning of the gesture entity 150. The sensors 131 may include any device or technology capable of tracking movement and location. For example, the sensors 131 may include imaging technologies such as cameras and infrared detectors. In these embodiments, the gesture entity 150 may include attachments that stand out to the imaging technology and facilitate tracking. The sensors 131 may further include lidar, radar, and ultrasound detection sensors. In such embodiments, the gesture entity 150 may include attachments configured to increase reflectivity. The sensors 131 may further include devices that generate electromagnetic fields and detect movement within the fields based on changes in the field. In such embodiments, the gesture entities 150 may include attachments, such as magnets or circuits that increase disturbances to the electromagnetic fields. The sensors 131 may further include devices designed to transmit wireless signals to antennas attached to the gesture entity 150 and determine positioning and movement via triangulation. The sensors 131 may further include inertial sensors, such as accelerometers, attached to the gesture entity 150. In embodiments, the sensors 131 may include any combination of the above-described sensors.
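As a non-limiting illustration of combining several of the sensors 131 described above, the following sketch averages position readings weighted by a per-sensor confidence value. The function name and the weighting scheme are assumptions for illustration only, not a required fusion method.

```python
from typing import List, Tuple

Vec3 = Tuple[float, float, float]


def fuse_positions(readings: List[Tuple[Vec3, float]]) -> Vec3:
    """Each reading is ((x, y, z), confidence); returns a confidence-weighted average."""
    total = sum(weight for _, weight in readings) or 1.0
    return tuple(sum(pos[i] * weight for pos, weight in readings) / total
                 for i in range(3))


# Example: an optical fix weighted more heavily than an inertial dead-reckoning estimate.
print(fuse_positions([((0.10, 0.20, 0.30), 0.8), ((0.12, 0.18, 0.33), 0.2)]))
```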
In embodiments, the gesture detection system 130 is a wearable system, designed to be attached to the user. The gesture detection system 130 may, for example, be coupled to or integrated with a wearable device that also includes the haptic output devices. The gesture detection system 130 may also be coupled to or integrated with an immersive reality display device 120.
In embodiments, the gesture detection system 130 may be a stationary system. In embodiments, the gesture detection system 130 may be room sized and configured to detect motion and positioning of the gesture entity 150 anywhere within the confines of a room. In embodiments, the gesture detection system 130 may be desktop sized and configured to detect motion and positioning of the gesture entity 150 within the confines of a workstation or cubicle.
The foregoing are merely examples of technologies that the gesture detection system 130 may employ; the list is neither exclusive nor exhaustive. The gesture detection system 130 may include any technology capable of detecting the location and movement of an object or body part designated as a gesture entity 150.
The gesture detection system 130 may provide raw data to the computing unit 105 of the system 100, and/or may include one or more processors capable of processing captured information before providing it to the computing unit 105.
Immersive reality environments require the display of an immersive reality to a user. Immersive reality display devices 120 include headsets, glasses, contact lenses and other display technologies configured for virtual, augmented and/or mixed reality display. Immersive reality display devices 120 may also include projection devices configured to project images in the air, onto surfaces, or directly into a user's eyes. Immersive reality display devices 120 may be small and portable, i.e., capable of being worn, and/or may be larger and stationary. In further embodiments, immersive reality display devices 120 may interact directly with a user's optic nerves or other neurological structures to provide the required visual display.
The system 100 further includes at least one haptic output device 110. In embodiments, one or more haptic output devices 110 may be provided in wearable haptic devices configured to provide haptic effects to one or more body parts of a user. Wearable haptic devices compatible with the system 100 include bracelets, rings, gloves, finger-nail attachments, finger-tip attachments, and any other haptically enabled devices that may provide a haptic effect to the body. The one or more haptic output devices 110 may also be provided in handheld haptically enabled devices, such as tablet computers, smart-phones, PDAs, game controllers, and any other haptically enabled device sized and shaped for being held. Throughout the description herein, the one or more haptic output devices 110 may be referred to in the singular for convenience sake. It is understood that any effect discussed herein as provided by a haptic output device 110 may also be provided by multiple haptic output devices 110.
Possible haptic output devices 110 include but are not limited to eccentric rotating mass (“ERM”) actuators in which an eccentric mass is moved by a motor, linear resonant actuators (“LRAs”) in which a mass attached to a spring is driven back and forth, piezoelectric actuators, electromagnetic actuators, vibrotactile actuators, inertial actuators, shape memory alloys, electro-active polymers that deform in response to signals, mechanisms for changing stiffness, electrostatic friction (ESF), ultrasonic surface friction (USF), any other actuator known in the art, and/or any combination of actuators described above.
With reference to
The at least one processor 140 (also interchangeably referred to herein as processors 140, processor(s) 140, or processor 140 for convenience) functions as part of the computing unit or computer 105, which includes at least one memory unit 141 and/or other components. The processor 140 is programmed by one or more computer program instructions stored on the memory unit 141. As shown in
The processor 140 may be configured as part of a server (e.g., having one or more server blades, processors, etc.), a personal computer (e.g., a desktop computer, a laptop computer, etc.), a smartphone, a tablet computing device, a wearable haptic device and/or any other device that may be programmed to carry out aspects of the system as described herein. Although often described herein as a singular processor 140 for convenience sake, the at least one processor 140 may include several processors, each configured to carry out various aspects of the system. Where multiple processors 140 are employed, the multiple processors 140 of the system 100 may each be associated with different hardware aspects of the system, may be housed together or separately, and/or may be in communication with any one or more of the other multiple processors 140.
For example, in an embodiment, the computing unit 105 may house one or more processors 140 in a single computer that are in communication with the immersive reality display device 120, the haptic output devices 110, and the gesture detection system 130. In this embodiment, the processors 140 generate the visuals for the immersive reality display device 120, generate the haptic output signals for the haptic output devices 110, and receive and interpret information from the gesture detection system 130. The processors 140 act as a central hub to operate the entire system 100 and are also configured to generate and operate the immersive reality interface modes. In this embodiment, the various hardware aspects of the system act as peripherals of the central computing unit 105. For example, a user may carry a tablet computer or smartphone configured to run application software. In this embodiment, the tablet computer or smartphone outputs audiovisual data to the immersive reality display device 120 and haptic command signals to the haptic output devices 110, and receives raw gesture information from the gesture detection system 130.
In another embodiment, each of the immersive reality display device 120, the haptic output devices 110, and the gesture detection system 130 have an associated one or more processors 140 configured to carry out the necessary tasks to operate the respective hardware. Each of these systems has standalone capabilities and is configured to receive high-level instructions from a processor 140 configured to operate the interface mode and coordinate the information flow and actions of the hardware devices. The processor 140 may be one of the processors 140 associated with the hardware units and/or may be an additional processor 140 housed in an additional device.
In another embodiment, the computing unit 105 is configured with the processor 140 to carry out all of the actions described herein to create an interface mode. The computing unit 105 may further be configured to determine the capabilities of any immersive reality display device 120, haptic output device 110, and gesture detection system 130 to which it is connected. The computing unit 105 is further configured to adjust aspects of the generated interface modes so as to fully take advantage of the available hardware capabilities without any loss of functionality. Thus, the computing unit 105 is configured to generate and operate interface modes as discussed herein with third-party peripheral hardware devices.
Various other configurations of hardware and allocations of processing duties exist between one or more processors 140 located in various places among the hardware devices of the system 100. The specific examples provided herein are not intended to limit the implementation of the immersive reality interface modes as discussed herein.
The gesture determination module 142 is a software module in operation on processor 140. The gesture determination module 142 is configured to receive user gesture information from the gesture detection system 130. As discussed above, the gesture detection system 130 includes one or more sensors 131 configured to capture gestures or movements of a user made by a gesture entity 150. The gesture entity 150 includes any body part or device which a user may use to perform gestures including fingers and hands, active handheld devices such as controllers or wands, and passive handheld devices such as pens, pencils, pointers, and other objects. The user gesture information received by the gesture determination module 142 may include raw or processed data as captured by the sensors 131. In an embodiment, user gesture information is the raw data captured by the sensors 131 and is transmitted directly to the gesture determination module 142 for interpretation. In further embodiments, the gesture detection system 130 refines the data captured by the sensors 131 to provide processed user gesture information to the gesture determination module 142. Processed user gesture information may include information at various levels of abstraction from the raw data. For example, processed user gesture information may include information describing the movements of the gesture entity 150. In another example, the processed gesture information may include information matching the movements of the gesture entity 150 to specific identifiable gestures.
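The following sketch illustrates, by way of assumption only, how raw and processed user gesture information might be represented at the different levels of abstraction described above. The type and field names are hypothetical and are not mandated by the gesture detection system 130 or the gesture determination module 142.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class RawGestureInfo:
    """Unprocessed samples as captured by the sensors 131."""
    samples: List[Tuple[float, float, float]]


@dataclass
class ProcessedGestureInfo:
    """Movement of the gesture entity 150 described at a higher level of abstraction."""
    displacement: Tuple[float, float, float]
    speed: float
    matched_gesture: Optional[str] = None   # e.g. "initiation", if already classified
```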
The gesture determination module 142 is configured to detect and classify identifiable gestures based on the user gesture information received. Identifiable gestures detectable by the gesture determination module 142 include at least initiation gestures, interaction gestures, selection gestures, holding gestures, and closing gestures. Initiation gestures are movements of the gesture entity 150 selected to initiate an interface mode within an immersive reality environment. Interaction gestures are movements of the gesture entity 150 used for interaction and navigation within an interface mode. Selection gestures are movements of the gesture entity 150 used for selecting options or elements within an interface mode and, in some embodiments, gestures used for exiting an interface mode. Holding gestures are movements of the gesture entity 150 used for freezing an aspect of an interface mode. Closing gestures are gestures used for exiting an interface mode. Examples of each of these gestures are provided in greater detail below.
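The following sketch shows one purely hypothetical mapping from detected poses and motion to the gesture categories listed above. The specific pose names and the speed threshold are illustrative assumptions, not requirements of the gesture determination module 142.

```python
from enum import Enum, auto
from typing import Optional


class GestureType(Enum):
    INITIATION = auto()
    INTERACTION = auto()
    SELECTION = auto()
    HOLDING = auto()
    CLOSING = auto()


def classify(pose: str, speed: float) -> Optional[GestureType]:
    if pose == "fist":
        return GestureType.INITIATION       # e.g. clenching a fist opens an interface mode
    if pose == "open_hand":
        return GestureType.SELECTION        # e.g. releasing the fist selects
    if pose == "index_bent":
        return GestureType.HOLDING          # e.g. bending the index finger freezes the display
    if pose == "swipe_away":
        return GestureType.CLOSING          # e.g. a dismissive swipe exits the mode
    if speed > 0.05:
        return GestureType.INTERACTION      # sustained movement navigates the display
    return None
```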
Display generation module 144 is a software module in operation on the processor 140. The display generation module 144 is configured to provide an interactive visual display of an interface mode to the user via the immersive reality display device 120. The interactive visual display includes at least an interactive option display and a selection visual display. The interactive option display presents a user with one or more selectable or adjustable objects while the selection visual display provides a user with visual confirmation that an object has been selected or adjusted.
The display generation module 144 is also configured to anchor the interactive visual display based on the interface mode that is initiated. For example, in some interface mode embodiments, the interactive visual display may be anchored to a specific location or fixed position within the immersive reality environment according to a location at which the initiation gesture is made. After initiation of an interface mode via a detected initiation gesture, the associated interactive visual display is anchored at a location within the environment associated with the initiation gesture. As the user rotates their vision or moves around in the environment, the interactive visual display remains anchored to the original location and thus may pass from the user's view. In embodiments, the interactive visual display may be persistently anchored, permitting the user to leave the area, log out of the system and/or perform other tasks while the interactive visual display remains anchored in place. In another interface mode embodiment, the interactive visual display may be anchored to a specific location within a user's field of view. As the user moves through the environment and rotates their view, the anchored interactive visual display follows their movements, always remaining in the same portion of the user's field of view. In another interface mode embodiment, the interactive visual display is anchored to an anchor object, physical or virtual, within the immersive reality environment. For example, an initiation gesture may cause the interactive visual display to be anchored to a real or virtual book. As the user moves the book around within the environment, the interactive visual display moves with the book. In embodiments, an interactive visual display instantiated by one user may be visible to other users that are interacting with the same immersive reality environment. In embodiments, an interactive visual display instantiated by one user may be private to the initiating user.
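The following sketch illustrates, under assumed vector types and mode names, how the three anchoring strategies described above might be resolved to a display position each frame. It is an illustrative sketch rather than the implementation of the display generation module 144.

```python
from typing import Tuple

Vec3 = Tuple[float, float, float]


def resolve_anchor(mode: str, world_point: Vec3, head_position: Vec3,
                   view_offset: Vec3, object_position: Vec3) -> Vec3:
    if mode == "world":      # remains where the initiation gesture was made
        return world_point
    if mode == "view":       # follows the user's field of view
        return tuple(h + o for h, o in zip(head_position, view_offset))
    if mode == "object":     # follows a real or virtual anchor object
        return object_position
    raise ValueError(f"unknown anchoring mode: {mode}")
```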
The effects generation module 146 is a software module in operation on the processor 140. The effects generation module 146 is configured to cause the output of haptic command signals from the computing unit 105 to cause haptic effects in the one or more haptic output devices 110 associated with the system 100. The output haptic command signals may be command signals configured to cause a power system associated with the haptic output devices 110 to provide the appropriate electrical signal to drive a haptic output device 110. The output haptic command signals may also include direct drive signals configured to drive a haptic output device 110 with no requirement for an intervening power system.
The effects generation module 146 is configured to cause the output of haptic command signals to provide at least initiation haptic effects, dynamic haptic effects, and selection haptic effects. The initiation haptic effects are configured to provide a user with confirmation that an interface mode has been initiated. The dynamic haptic effects are configured to provide various sensations to a user to facilitate the use and navigation of an interface mode. The selection haptic effects are configured to provide a user with confirmation that a selection within an interface mode has been made. In embodiments, the initiation haptic effects and the selection haptic effects are discrete effects delivered over a short period of time to emphasize or confirm an occurrence. The dynamic haptic effects, in contrast, are on-going effects that vary to provide haptic perceptions during an interaction with the interface mode. Greater detail regarding the various haptic effects is provided below.
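The following sketch shows one possible, assumed representation of the discrete initiation/selection effects and the ongoing dynamic effects described above. The field names and default values are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class DiscreteEffect:
    """Short confirmation pulse, e.g. on initiation or selection."""
    magnitude: float
    duration_ms: int = 50


@dataclass
class DynamicEffect:
    """Ongoing effect whose parameters vary throughout an interaction."""
    magnitude: float
    frequency_hz: float

    def update(self, magnitude: float, frequency_hz: float) -> None:
        # Re-tuned continuously as the user navigates the interface mode.
        self.magnitude, self.frequency_hz = magnitude, frequency_hz
```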
The interface generation module 148 is a software module in operation on the processor 140. The interface generation module 148 operates to generate the interface mode and to coordinate the actions of the gesture determination module 142, the display generation module 144, and the effects generation module 146. The interface generation module 148 further interacts with software applications operating on the system 100 for which interface modes are provided. When an interface mode is provided by the interface generation module 148, it permits the user to select options, adjust parameters, and otherwise interact with a software application operating on the system 100.
The interface generation module 148 receives information about determined or recognized gestures from the gesture determination module 142 and outputs commands to the display generation module 144 to provide the interactive visual display and to the effects generation module 146 to provide the haptic effects.
In response to an initiation gesture of the gesture entity 150 detected in the gesture information by the gesture determination module 142, the interface generation module 148 activates an interface mode associated with the detected initiation gesture. The interface mode is provided to allow the user to interact with a software application operating on the system 100. The interface mode permits the users to adjust parameters, select options, and perform other actions associated with the software application. The interface generation module 148 causes the display generation module 144 to provide an interactive visual display of the interface mode via the immersive reality display device 120. In embodiments, the interface generation module 148 causes the effects generation module 146 to cause the output of an initiation haptic effect via the haptic output devices 110. As discussed above, the interactive visual display of the interface mode may be anchored to a physical or virtual object, to a point in the immersive reality environment, or to a point in the user's field of vision within the immersive reality environment. For example, a user may clench a fist to initiate a particular interface mode. The user's hands are the gesture entity 150 in this embodiment. The clenched fist is captured by the gesture detection system 130 and determined as an initiation gesture by the gesture determination module 142. The interface generation module 148 activates an interface mode associated with the clenched fist gesture and causes the display generation module 144 to provide an interactive visual display within the immersive reality environment. Upon clenching a fist, the user may receive a brief haptic sensation as an initiation haptic effect via the haptic output device 110 as confirmation that the gesture has been interpreted by the system 100 as an initiation gesture. The interactive visual display is created and anchored at a location near the user's clenched fist. If the user turns his/her head or moves the clenched fist, the interactive visual display remains in place. The user now interacts with the interactive visual display of the interface mode.
In embodiments, the initiation gesture may involve two actions. For example, a first action may cause display of the interactive visual display while a second action may cause the anchoring of the interactive visual display and activation of the interface mode. For example, a clenched fist may be a first initiation gesture. In response to a determination of the first initiation gesture, the interface generation module 148 causes display of the interactive visual display. The user may then move their clenched fist through the immersive reality environment to a location where they wish to anchor the display. Opening the clenched fist to a flat hand may be a second initiation gesture. In response to a determination of the second initiation gesture, the interface generation module 148 may anchor the interactive visual display at the location of the hand during the second initiation gesture and activate the interface mode. A two-part initiation gesture system may permit the user to quickly peruse menu options without entering the interface mode, i.e., by only performing the first initiation gesture and not the second initiation gesture. The two-part initiation gesture system may also permit a user to see the size and shape of a menu before determining where they wish to anchor it.
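The following sketch models the two-part initiation described above as a small state machine. The gesture names ("fist", "open_hand", "close") are illustrative assumptions only.

```python
class TwoPartInitiation:
    def __init__(self):
        self.state = "idle"
        self.anchor = None

    def on_gesture(self, gesture: str, location):
        if self.state == "idle" and gesture == "fist":            # first initiation gesture
            self.state = "previewing"                             # menu shown, not yet anchored
        elif self.state == "previewing" and gesture == "open_hand":  # second initiation gesture
            self.anchor = location                                # anchor here and activate
            self.state = "active"
        elif self.state == "previewing" and gesture == "close":   # peruse-only, never activated
            self.state = "idle"
        return self.state
```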
In response to one or more interaction gestures detected in the gesture information by the gesture determination module 142, the interface generation module 148 causes the display generation module 144 to provide an interactive visual display of selectable or adjustable options, navigational actions, or menu items associated with the detected interaction gestures. For example, movement of the user's hand, either rotationally or translationally, causes a cursor or other visual selection object to move through the interactive visual display. While navigating the interactive visual display, the user's visuospatial understanding of the interactive visual display is augmented by dynamic haptic effects provided by the effects generation module 146. For example, as the cursor nears a selectable object or item within the interactive visual display, a haptic effect grows in intensity by increasing at least one of its frequency and its magnitude. In another example, the haptic effect is tuned to provide a sensation of greater resistance as a cursor nears a selectable object. When the cursor is appropriately focused on a selectable object, the effects generation module 146 provides a selection haptic effect via the haptic output device 110 to indicate to the user that a selectable object is highlighted or ready for selection in the interactive visual display. By providing dynamic haptic effects and triggered selection haptic effects during navigation of the interactive visual display, a user can navigate through the interface mode by feel as well as by sight and can develop familiarity and facility with the interface mode quickly.
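The following sketch shows one plausible mapping, assumed for illustration, from cursor-to-target distance to dynamic haptic effect parameters: intensity and frequency rise as the cursor approaches a selectable object, and a flag indicates when the object is focused and ready for selection. The specific ranges and thresholds are hypothetical.

```python
def dynamic_effect_params(distance: float, focus_radius: float = 0.02,
                          max_range: float = 0.30):
    """Map distance (in meters, assumed) to dynamic haptic parameters."""
    closeness = max(0.0, 1.0 - distance / max_range)    # 0 far away, 1 at the target
    magnitude = 0.2 + 0.8 * closeness                   # effect grows in magnitude
    frequency_hz = 60.0 + 240.0 * closeness             # and in frequency
    focused = distance <= focus_radius                  # triggers a selection-ready pulse
    return magnitude, frequency_hz, focused
```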
After locating a cursor appropriately over the desired selectable object, the user performs a selection gesture with the gesture entity 150. The gesture determination module 142 determines the selection gesture in the gesture information received from the gesture detection system 130. The interface generation module 148 receives information that the selection gesture was performed. In response to the selection gesture, the interface generation module 148 provides information about the object selection to the software application operating on the system 100 associated with the activated interface mode. Further in response to the selection gesture, the interface generation module 148 deactivates the interface mode and causes the effects generation module 146 to provide the user with a selection haptic effect, via the haptic output device 110, as confirmation that the selection gesture was recognized and accepted. In embodiments, the selection gesture may be a release of the initiation gesture, e.g., the opening of the clenched fist that initiated the interface mode. In embodiments, the selection gesture may be a lack of movement of the gesture entity, e.g., the user may hold the gesture entity still for a period of time to indicate selection.
In an embodiment, a simple interface mode for adjusting a volume of audio associated with a running software application may operate as follows. The user performs an initiation gesture of pinching a thumb to tips of his/her index and middle fingers as if grasping a dial. The interface generation module 148 recognizes the initiation gesture, outputs a brief haptic buzz effect to confirm, and causes the image of a volume dial to appear between the user's pinched fingers. The user quickly rotates their hand to “turn” the virtual volume dial while receiving dynamic haptic effects in the form of clicks each time the volume is increased or decreased by one level. When satisfied, the user releases their pinched fingers, receives a selection haptic effect to confirm, and the volume dial disappears. This simple interaction can easily become part of a user's muscle memory, aided by the repeatable gestures and haptic effects. A change in volume becomes an action performed in less than a second and with very little thought or attention diverted from a main task or action that the user may be performing.
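The following sketch illustrates the volume-dial example under the assumption that the hand's rotation is quantized into discrete levels, with a detent "click" effect played at each level change. The function names and the fifteen-degree detent spacing are hypothetical.

```python
def on_dial_rotation(angle_deg: float, last_level: int,
                     degrees_per_level: float = 15.0) -> int:
    """Quantize rotation into volume levels and play one click per level crossed."""
    level = int(angle_deg // degrees_per_level)
    for _ in range(abs(level - last_level)):
        play_click()                 # hypothetical discrete haptic detent
    return level


def play_click():
    print("click")                   # stand-in for a real haptic command signal
```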
Alternative embodiments of the layer cake interface mode may be provided. For example, a user's hand may act as gesture entity 300. In further variations, one or both of the selection haptic effects may be eliminated. In further variations, an audible confirmation may accompany one or both of the selection haptic effects. In further variations, a selection gesture may be performed by holding the gesture entity 300 in place for a specific time period. For example, holding the gesture entity 300 in a position to select a specific selectable visual display object 301 for a set period of time, e.g., 3 seconds, may be interpreted as a selection gesture. In another embodiment, the initiation gesture of the layer cake interface mode may be a two-part gesture. A first initiation gesture may be recognized as a tilting of the gesture entity 300, as described above. The first initiation gesture may permit the user to continue moving the gesture entity 300 and the associated interactive visual display 302 before choosing a location in the immersive reality environment to anchor it. After moving the gesture entity 300 to the anchoring location, a second initiation gesture, such as the press of a button on a haptic peripheral constituting the gesture entity 300 or a movement of a specific finger of a hand constituting the gesture entity 300 may be performed. The second initiation gesture may then cause the interface generation module 148 to anchor the interactive visual display 302 at the location the second initiation gesture was made and activate the interface mode.
Any combination of the above variations may also be implemented.
Alternative embodiments of the quick action interface mode may be provided. For example, a user's hand may act as first gesture entity 400 while the user's other hand acts as the second gesture entity 404. In further variations, one or both of the selection haptic effects may be eliminated. In further variations, an audible confirmation may accompany one or both of the selection haptic effects. In another embodiment, the initiation gesture of the quick action interface mode may be a two-part gesture. The first initiation gesture may permit the user to continue moving the gesture entity 400 and the associated interactive visual display 402 before choosing a location in the immersive reality environment to anchor it. After moving the gesture entity 400 to the anchoring location, a second initiation gesture, such as the press of a button on a haptic peripheral constituting the gesture entity 400 or a movement of a specific finger of a hand constituting the gesture entity 400 may be performed. The second initiation gesture may then cause the interface generation module 148 to anchor the interactive visual display 402 at the location the second initiation gesture was made and activate the interface mode. Any combination of the above variations may also be implemented.
Alternative embodiments of the pivoting interface mode may be provided. In an embodiment, for example, a holding gesture of the gesture entity 500 may be recognized as a command to maintain the relative positions of the selectable visual display objects 503 in the interactive visual display 502 even after the gesture entity 500 is removed to do other things. A holding gesture may include, for example, a bending movement of the index finger of the gesture entity 500, which can be made while the thumb of the gesture entity 500 maintains contact with the anchor object 501. Use of a holding gesture permits the user to continue viewing the selectable visual display objects 503 while using his/her hand for other tasks. In an embodiment, as noted above, the anchor object 501 may be a virtual object. In another embodiment, the gesture entity 500 may include a first hand and a second hand of the user. In this two-handed embodiment, placement of the thumb of one hand at an anchor interaction point while the index finger of the other hand is placed at a moveable gesture point serves as the initiation gesture. Pivoting the index finger and the moveable gesture point around the thumb at the anchor interaction point serves as the interface gesture. The pivoting interface mode may be particularly useful in permitting a user to compare one or more visual display objects at the same time. By rotating and spreading or fanning out the selectable visual display objects 503, the user may position them so that more than one is visible at the same time, thus permitting a quick and easy comparison. In further variations, once the holding gesture is determined, the user is free to set the anchor object 501 down or hand it to another user. Any combination of the above variations may also be implemented.
Alternative embodiments of the slide-out interface mode may be provided. For example, a holding gesture of the gesture entity 600 may be recognized as a command to maintain the position of the images in the interactive visual display 601 even after the gesture entity 600 is removed to do other things. The holding gesture may include a bending of the index finger of the second hand of the gesture entity 600, which may be performed while the thumb of the first hand remains anchored. This permits the user to continue viewing the selectable visual display objects 605 while using their hands for other tasks. In further variations, once the holding gesture is determined, the user is free to set the physical anchor object 601 down or hand it to another user. Any combination of the above variations may also be implemented.
Alternative embodiments of the proportional interface mode may be provided. In variations, a selection haptic effect may be provided when the user reaches a limit of the available range. In further variations, an audible confirmation may accompany the selection haptic effects. In another embodiment, the initiation gesture of the proportional interface mode may be a two-part gesture. The first initiation gesture may permit the user to continue moving the gesture entity 700 and the associated interactive visual display 702 before choosing a location in the immersive reality environment to anchor it. After moving the gesture entity 700 to the anchoring location, a second initiation gesture, such as the movement of a specific finger of a hand constituting the gesture entity 700 may be performed. The second initiation gesture may then cause the interface generation module 148 to anchor the interactive visual display 702 at the location the second initiation gesture was made and activate the interface mode. Any combination of the above variations may also be implemented.
Alternative embodiments of the linear menuing interface mode may be provided. For example, one or both of the selection haptic effects may be eliminated. In another embodiment, the initiation gesture of the linear menuing interface mode may be a two-part gesture. The first initiation gesture may permit the user to continue moving the gesture entity 800 and the associated interactive visual display 802 before choosing a location in the immersive reality environment to anchor it. After moving the gesture entity 800 to the anchoring location, a second initiation gesture, such as the movement of a specific finger of a hand constituting the gesture entity 800, may be performed. The second initiation gesture may then cause the interface generation module 148 to anchor the interactive visual display 802 at the location the second initiation gesture was made and activate the interface mode. In further variations, an audible confirmation may accompany one or both of the selection haptic effects. Any combination of the above variations may also be implemented.
In embodiments, one or more of the various interface modes discussed above may be combined to form interface modes of greater complexity. That is, a second interface mode may be initiated based on a selection made in a first interface mode. For example, the layer cake interface mode may be combined with either the pivoting interface mode or the slide-out interface mode. The user may initiate the layer cake interface mode, make a selection, and then launch either the pivoting interface mode or the slide-out interface mode from the selected visual display object of the layer cake interface mode. Other potential combinations are also within the scope of this description. In embodiments, more than two interface modes may be combined. In this way, multi-level menuing may be provided through a combination of easily navigable interactive visual displays.
In embodiments, the various interaction gestures, selection gestures, holding gestures, and closing gestures, as well as the associated dynamic haptic effects and selection haptic effects are adjustable based on user preferences. A user may alter the gesture required to initiate or interface with any of the above-described interface modes. In embodiments, the system 100 may provide gesture training functionality. Gesture training functionality may be used to train both the user in performing the required gestures and the gesture determination module 142 in recognizing user performed gestures. Machine learning techniques may be applied to user gesture information to improve gesture determination by the gesture determination module 142.
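As one simple, assumed illustration of such gesture training functionality, the following sketch personalizes recognition with a nearest-centroid classifier built from user-recorded examples. The description does not require any particular machine learning technique, and the class shown here is hypothetical.

```python
from collections import defaultdict
from typing import Dict, List, Sequence


class GestureTrainer:
    def __init__(self):
        # Feature vectors recorded per gesture label during a training session.
        self.examples: Dict[str, List[Sequence[float]]] = defaultdict(list)

    def record(self, label: str, features: Sequence[float]) -> None:
        self.examples[label].append(features)

    def classify(self, features: Sequence[float]) -> str:
        def centroid(rows):
            return [sum(col) / len(rows) for col in zip(*rows)]

        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))

        # Return the label whose recorded examples lie closest to the new gesture.
        return min(self.examples,
                   key=lambda lbl: dist(features, centroid(self.examples[lbl])))
```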
In an operation 1002, the interface mode process 1000 includes detecting user gesture information by at least one sensor 131 of the gesture detection system 130. The user gesture information is then transmitted to the gesture determination module 142 of the computing unit 105. The gesture detection system 130 continues to detect user gesture information and transmit user gesture information to the gesture determination module 142 throughout the interface mode process 1000.
In an operation 1004, the interface mode process 1000 includes providing an immersive reality environment via an immersive reality display device 120. The display generation module 144 of the computing unit 105 transmits visual information to the immersive reality display device 120 for providing an immersive reality environment to the user.
In an operation 1006, the interface mode process 1000 includes receiving the user gesture information by the gesture determination module 142 of the computing unit 105. The gesture determination module 142 uses the user gesture information to determine an initiation gesture from the gesture information, at an operation 1008.
In an operation 1010, the interface mode process 1000 includes activating, by the interface generation module 148, an interface mode according to and in response to the initiation gesture determined at the operation 1008. The interface mode includes an interactive visual display provided via the immersive reality display device. The interactive visual display varies according to the interface mode that is activated.
In an operation 1012, the interface mode process 1000 includes anchoring the interactive visual display of the interface mode. The interactive visual display may be anchored according to a location of the initiation gesture, according to a location of a physical or virtual object within the immersive reality environment, and/or according to a specific location within a user's field of view.
In an operation 1014, the interface mode process 1000 includes determining an interaction gesture from the user gesture information by the gesture determination module 142. Information about the interaction gesture is provided by the gesture determination module 142 to the interface generation module 148, which, in turn, coordinates actions of the effects generation module 146 and the display generation module 144. In response to the interaction gesture, the effects generation module 146 causes the haptic output device 110 to output a dynamic haptic effect at an operation 1016. In response to the interaction gesture, the display generation module 144 adjusts the interactive visual display to show the interaction and navigational movements associated with the interaction gesture.
In an operation 1018, the interface mode process 1000 includes determining, by the gesture determination module 142, a selection gesture from the user gesture information. Information about the selection gesture is provided by the gesture determination module 142 to the interface generation module 148, which coordinates actions of the effects generation module 146 and the display generation module 144. In response to the selection gesture, the effects generation module 146 causes the haptic output device 110 to provide a selection haptic effect at an operation 1020. In response to the selection gesture, the display generation module 144 adjusts the interactive visual display to provide a selection visual display associated with the selection gesture at an operation 1022.
In an operation 1024, the interface mode process 1000 includes exiting the interface mode by the interface generation module 148 of the computing unit 105. Upon exiting the interface mode, the interface generation module 148 reports any selections made by the user during the interface mode to the software application or applications associated with the selection. The interface generation module 148 also causes the display generation module 144 to cease display of the interactive visual display in the immersive environment.
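The following sketch maps the numbered operations above to the modules described as performing them. The module roles follow the description, while the function and parameter names are assumptions for illustration only.

```python
def interface_mode_process(sensors, gesture_mod, display_mod, effects_mod, interface_mod):
    info = sensors.detect()                                # operation 1002: detect gesture info
    display_mod.render_environment()                       # operation 1004: immersive display
    gesture_mod.receive(info)                              # operation 1006: receive gesture info
    initiation = gesture_mod.determine_initiation()        # operation 1008: initiation gesture
    mode = interface_mod.activate(initiation)              # operation 1010: activate mode
    interface_mod.anchor(mode, initiation.location)        # operation 1012: anchor display
    while not gesture_mod.selection_detected():            # operations 1014-1016
        interaction = gesture_mod.determine_interaction()
        effects_mod.play_dynamic(interaction)              # dynamic haptic effect
        display_mod.update(interaction)
    selection = gesture_mod.determine_selection()          # operation 1018: selection gesture
    effects_mod.play_selection(selection)                  # operation 1020: selection haptic effect
    display_mod.show_selection(selection)                  # operation 1022: selection visual display
    interface_mod.exit(mode)                               # operation 1024: exit interface mode
```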
The above describes an illustrative flow of an example interface mode generation and use process according to embodiments described herein. The process as illustrated in
Thus, there are provided systems, devices, and methods for generating and using various interface modes within an immersive reality environment. While various embodiments according to the present invention have been described above, it should be understood that they have been presented by way of illustration and example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the invention. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments. Further embodiments and combinations are set forth in the numbered paragraphs below. It will also be understood that each feature of each embodiment discussed herein, and of each reference cited herein, can be used in combination with the features of any other embodiment. The aspects of the above methods of rendering haptic effects may be used in any combination with other methods described herein or the methods can be used separately. All patents and publications discussed herein are incorporated by reference herein in their entirety.