The present disclosure generally relates to simulated or virtual reality environments, and more particularly to interactive input controls therein.
Advances in technology, driven, for example, by improved processing power and speed, ever more efficient software rendering techniques, consumer convenience, and the like, have supported a growing movement in developing and designing simulated or three-dimensional (3D) virtual reality (VR) environments. VR systems which support such VR environments generally include hardware such as headsets, goggles, handheld or wearable devices, and the like. Operatively, such hardware continuously tracks user movements, updates user positions, orientations, and the like, and receives interactive input from users in a VR environment.
While certain VR systems can include complex and often expensive hardware and other equipment, such as an omni-directional treadmill, for tracking and translating real-world user movement into the VR environment, an average user may not have the requisite capital (or physical space) to support such complex and often expensive equipment. Accordingly, certain challenges arise when designing and creating intuitive and interactive controls for users in a VR environment. Therefore, there is a need for improved interactive processes and techniques operable by simple VR equipment.
In one exemplary embodiment, this disclosure provides an interactive control process/service. The interactive control process/service may be performed as a method of steps that include detecting movement of a controller associated with a virtual reality (VR) environment. The method further includes steps to determine an angle of rotation, a magnitude of force (e.g., acceleration), and the like, based on the movement. For example, the movement may be mapped or represented by vectors, each having a direction and a magnitude. The method further includes steps for determining a path in the VR environment that corresponds to the vectors (e.g., the angle(s) of rotation, magnitudes of force, etc.) as well as for projecting the path in the VR environment by displaying, for example, graphical elements that represent the path in the VR environment. In some aspects, the method may further include intersection or collision processes that determine when an object in the VR environment intersects (or is in close proximity to) a portion of the path. In such aspects, the method may further include steps to modify or adjust the path so as to select such an object, display menu options, retrieve the object along the path, move the user toward the object along the path, and the like.
In another embodiment, a virtual reality (VR) system employs the above-discussed interactive control process/service. For example, the VR system includes a network interface to communicate in a communication network, a processor coupled to the network interface and adapted to execute one or more processes, and a memory configured to store a process executable by the processor. The process, when executed by the processor, is operable to detect movement of a controller associated with a virtual reality (VR) environment and determine angles of rotation, magnitudes of force (e.g., acceleration), and the like, based on the movement. The VR system further determines a path in the VR environment that corresponds to the movement and projects the path in the VR environment (e.g., displays graphical objects/elements that represent the path, etc.).
In yet another embodiment, a tangible, non-transitory, computer-readable medium includes software or instructions such as an exemplary interactive control process. The software/instructions, when executed by a processor, cause the processor to detect movement of a controller associated with a virtual reality (VR) environment and determine angles of rotation, magnitudes of force (e.g., acceleration), and the like, based on the movement. The processor also determines a path in the VR environment that corresponds to the movement and projects the path in the VR environment (e.g., displays graphical objects/elements that represent the path, etc.).
The embodiments herein may be better understood by referring to the following description in conjunction with the accompanying drawings in which like reference numerals indicate identical or functionally similar elements. Understanding that these drawings depict only exemplary embodiments of the disclosure and are not therefore to be considered to be limiting of its scope, the principles herein are described and explained with additional specificity and detail through the use of the accompanying drawings in which:
Various embodiments of the disclosure are discussed in detail below. While specific implementations are discussed, it should be understood that this is done for illustration purposes only. A person skilled in the relevant art will recognize that other components and configurations may be used without departing from the spirit and scope of the disclosure.
As discussed herein, the subject disclosure relates to interactive controls particularly suitable for virtual reality (VR) environments. These interactive controls are employed by simple controllers without requiring complex and often expensive equipment. For example, the interactive controls may include an interactive control process/service that can be employed by simple controllers, headsets, and/or VR consoles. Such an interactive control process may, for example, detect movement of a controller associated with a virtual reality (VR) environment and include steps to determine angles of rotation, magnitudes of force (e.g., acceleration), and the like, based on the movement. For example, the interactive control process may map or assign portions of the movement to vectors in a coordinate system and determine a path in the VR environment that corresponds to the vectors (e.g., based on averages, unions, differences, superposition, or other combinations of vectors/vector-elements). The path may be used as a guide to move the user in the 3D environment and/or as a selection tool to select or retrieve objects, as discussed in greater detail below.
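Purely by way of illustration, the mapping of raw controller motion to such vectors might be sketched as follows; the sample format, function names, and units below are assumptions made for illustration only and are not prescribed by this disclosure. The fragment converts successive tracked controller positions into per-segment direction and magnitude (speed) vectors:

# Illustrative sketch: map successive controller positions to vectors
# (direction + magnitude). The sample format and names are assumptions.
import numpy as np

def movement_to_vectors(positions, timestamps):
    """Return (direction, magnitude) pairs for each movement segment.

    positions  : (N, 3) array of tracked controller positions
    timestamps : (N,) array of sample times in seconds
    """
    positions = np.asarray(positions, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)
    deltas = np.diff(positions, axis=0)             # displacement per segment
    dts = np.diff(timestamps)                       # elapsed time per segment
    speeds = np.linalg.norm(deltas, axis=1) / dts   # magnitude (speed) per segment
    directions = deltas / np.linalg.norm(deltas, axis=1, keepdims=True)
    return directions, speeds

# Example: a short swing of the controller sampled at 100 Hz
positions = [(0.0, 1.0, 0.0), (0.02, 1.05, -0.01), (0.06, 1.12, -0.03)]
timestamps = [0.00, 0.01, 0.02]
directions, speeds = movement_to_vectors(positions, timestamps)
print(directions)  # unit direction of each segment
print(speeds)      # magnitude (m/s) of each segment

Each resulting (direction, magnitude) pair may then be combined, as discussed below, to parameterize a path.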
Referring now to the figures, VR environment 100 includes equipment such as a console 110, controller(s) 120, and a headset 125. Console 110 represents centralized hardware/software that communicates with controller 120 and/or headset 125, as well as with various other devices, servers, databases, and the like over a communication network 130 (e.g., the Internet), as is appreciated by those skilled in the art.
Communication network 130 represents a network of devices/nodes interconnected over network interfaces/links/segments/etc. and operable to exchange data such as a data packet 140 and transport data to/from end devices/nodes (e.g., console 110, controller 120, and/or headset 125).
Data packets 140 include network traffic/messages which are exchanged between devices over communication network 130 using predefined network communication protocols such as certain known wired protocols, wireless protocols (e.g., IEEE Std. 802.15.4, WiFi, Bluetooth®, etc.), PLC protocols, or other shared-media protocols where appropriate.
Controller 120 wirelessly communicates with console 110 over network 130, or (in some embodiments) it may be coupled to console 110 over another network (not shown). Controller 120 facilitates user interaction with and within VR environment 100 and is operable to, for example, detect, track, or otherwise monitor movement and biometric information, communicate data signals with headset 125 and console 110, and provide feedback (e.g., tactile, audible, etc.) to a user 105. In this fashion, controller 120 can comprise any number of sensors, gyros, radios, processors, touch detectors, transmitters, receivers, feedback circuitry, and the like.
Headset 125, similar to controller 120, wirelessly communicates with console 110. Headset 125 displays or projects graphical elements that form the simulated VR environment to user 105, tracks eye movements, and measures biometric data from user 105.
With respect to the devices discussed above, it is appreciated that certain devices may be adapted to include (or exclude) certain functionality and that the components are shown for purposes of discussion, not limitation. As discussed, console 110, controller 120, and headset 125 cooperate to provide an immersive and interactive VR environment to user 105.
Network interface(s) 210 contain mechanical, electrical, and signaling circuitry for communicating data between devices over a network such as communication network 130. Input interface 215 includes hardware/software that receives user commands, detects movement or gestures, and may also be configured to provide user feedback (e.g., tactile, visual, audio, etc.). For example, input interface 215 can include switches, buttons, accelerometers, sensors, processors, radios, display elements, and the like. Memory 240 comprises a plurality of storage locations that are addressable by processor 220 for storing software programs and data structures associated with the embodiments described herein.
Processor 220 may comprise necessary elements or logic adapted to execute the software programs and manipulate data structures 245. An operating system 242, portions of which are typically resident in memory 240 and executed by processor 220, functionally organizes the device by, inter alia, invoking operations in support of software processes and/or services executing on the device. These software processes and/or services may comprise an illustrative “interactive control” process/service 244. Note that while process/service 244 is shown in centralized memory 240, it may be configured to operate in a distributed network.
It will be apparent to those skilled in the art that other processor and memory types, including various computer-readable media, may be used to store and execute program instructions pertaining to the techniques described herein. Also, while the description illustrates various processes, it is expressly contemplated that various processes may be embodied as modules configured to operate in accordance with the techniques herein (e.g., according to the functionality of a similar process). Further, while the processes have been shown separately, those skilled in the art will appreciate that processes may be routines or modules within other processes. For example, processor 220 can include one or more programmable processors, e.g., microprocessors or microcontrollers, or fixed-logic processors. In the case of a programmable processor, any associated memory, e.g., memory 240, may be any type of tangible processor readable memory, e.g., random access, read-only, etc., that is encoded with or stores instructions that can implement program modules, e.g., a module having interactive control process 244 encoded thereon. Processor 220 can also include a fixed-logic processing device, such as an application specific integrated circuit (ASIC) or a digital signal processor that is configured with firmware comprised of instructions or logic that can cause the processor to perform the functions described herein. Thus, program modules may be encoded in one or more tangible computer readable storage media for execution, such as with fixed logic or programmable logic, e.g., software/computer instructions executed by a processor, and any processor may be a programmable processor, programmable digital logic, e.g., field programmable gate array, or an ASIC that comprises fixed digital logic, or a combination thereof. In general, any process logic may be embodied in a processor or computer readable medium that is encoded with instructions for execution by the processor that, when executed by the processor, are operable to cause the processor to perform the functions described herein.
In operation, controller 120 detects movement from the first position to the second position, determines an angle of rotation (α) based on the movement, and determines a magnitude of force (e.g., an acceleration force, etc.) associated with the movement and/or the respective positions. Controller 120 further determines a path in the VR environment—here, path 440—that corresponds to the angle of rotation and the magnitude of force, and headset 125 projects the path to user 105 in the VR environment. While the foregoing operations are described with respect to specific devices/controllers, it is appreciated that any combination of devices may perform the same or substantially similar functionality.
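By way of a non-limiting sketch, the angle of rotation (α) might be computed as the angle between the controller's forward direction at the first and second positions, and the magnitude of force as the peak accelerometer magnitude over the movement. The input conventions below (forward vectors, accelerometer samples in m/s^2) are assumptions for illustration only and do not reflect a specific controller API:

# Illustrative sketch: derive an angle of rotation and a force magnitude
# from two controller orientations and accelerometer samples.
import numpy as np

def angle_of_rotation(forward_start, forward_end):
    """Angle (radians) between the controller's forward direction
    at the first and second positions."""
    a = np.asarray(forward_start, dtype=float)
    b = np.asarray(forward_end, dtype=float)
    cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

def magnitude_of_force(accel_samples):
    """Peak acceleration magnitude (m/s^2) over the movement."""
    accel = np.asarray(accel_samples, dtype=float)
    return float(np.max(np.linalg.norm(accel, axis=1)))

# Example: controller tips up roughly 45 degrees during a quick flick
alpha = angle_of_rotation((0.0, 0.0, -1.0), (0.0, 0.707, -0.707))
force = magnitude_of_force([(0.0, 2.0, -6.0), (0.0, 9.0, -14.0), (0.0, 4.0, -5.0)])
print(np.degrees(alpha), force)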
As discussed, the movement of controller 120 from the first position to the second position may be mapped into a 3D coordinate system where vectors represent respective direction and magnitude at each position. The 3D coordinate system may further include a real-world coordinate system and/or a VR environment coordinate system. With respect to the real-world coordinate system, additional processing may be needed to translate the movement, and the path calculated therefrom, into the VR environment coordinate system.
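The additional processing mentioned above may, for example, amount to applying a rigid transform (with an optional scale) from the real-world tracking space into the VR environment coordinate system. A minimal sketch follows, assuming a simple yaw rotation, uniform scale, and origin offset; the particular parameters are illustrative assumptions only:

# Illustrative sketch: translate a real-world (tracking-space) point into
# VR-environment coordinates using a yaw rotation, uniform scale, and offset.
import numpy as np

def tracking_to_vr(point, yaw_degrees=0.0, scale=1.0, vr_origin=(0.0, 0.0, 0.0)):
    """Map a 3D point from tracking space into the VR coordinate system."""
    yaw = np.radians(yaw_degrees)
    # Rotation about the vertical (y) axis
    rot = np.array([[ np.cos(yaw), 0.0, np.sin(yaw)],
                    [ 0.0,         1.0, 0.0        ],
                    [-np.sin(yaw), 0.0, np.cos(yaw)]])
    p = np.asarray(point, dtype=float)
    return scale * (rot @ p) + np.asarray(vr_origin, dtype=float)

# Example: the play area is rotated 90 degrees and offset within the VR scene
print(tracking_to_vr((1.0, 1.5, -0.5), yaw_degrees=90.0, scale=1.0,
                     vr_origin=(10.0, 0.0, 4.0)))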
As shown here, path 440 is represented by a graphical dashed line that terminates in a 3D graphical “X” element, which represents anchor point 445, and controller 120 is represented by a 3D graphical controller element. Collectively, these graphical components/elements show an exemplary first-person perspective view of user 105 within VR environment 400. As mentioned, path 440 is calculated, in part, from changes in controller orientation as well as changes in force (e.g., acceleration). In one embodiment, path 440 represents a curved path or an arcing path and may be generated by a simulated casting motion similar to casting, for example, a fishing line. The distance and direction of the curved path are derived from the above-mentioned changes in controller orientation/force, as is appreciated by those skilled in the art. Notably, in some embodiments, the casting motion may operate in conjunction with other input controls (e.g., button press/release/etc.) to indicate the user's intention to generate path 440 in environment 400.
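One way such a curved or arcing path could be approximated, offered only as an illustrative sketch, is as a simple ballistic arc whose launch direction follows the controller's orientation and whose launch speed scales with the measured force. The constants below (gravity, force-to-speed scaling, time step) are assumptions; the last sampled point, where the arc meets a ground plane, plays the role of anchor point 445:

# Illustrative sketch: sample an arcing "cast" path from a launch direction
# and a speed scaled by the measured force. Constants are assumptions.
import numpy as np

GRAVITY = np.array([0.0, -9.81, 0.0])

def casting_path(origin, launch_dir, force, force_to_speed=0.5,
                 dt=0.05, max_steps=200):
    """Return sampled points of a ballistic arc; the last point (where the
    arc meets the ground plane y=0) acts as the anchor point."""
    pos = np.asarray(origin, dtype=float)
    vel = np.asarray(launch_dir, dtype=float)
    vel = vel / np.linalg.norm(vel) * (force * force_to_speed)
    points = [pos.copy()]
    for _ in range(max_steps):
        vel = vel + GRAVITY * dt
        pos = pos + vel * dt
        points.append(pos.copy())
        if pos[1] <= 0.0:          # reached the ground plane
            break
    return np.array(points)

# Example: cast forward and slightly upward with a moderate flick
path = casting_path(origin=(0.0, 1.2, 0.0),
                    launch_dir=(0.0, 0.5, -1.0),
                    force=14.0)
anchor_point = path[-1]
print(len(path), anchor_point)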
Here, path 440 represents a selection path that selects interactive object 605. The selection path may guide movement between the user and the selected object in the VR environment and/or the selection path may retrieve the selected object to the user (e.g., move the object from its current location to a current location of the user). As is appreciated by those skilled in the art, user 105 may provide additional input to prompt subsequent movement of the user and/or the object in VR environment 300.
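Retrieving a selected object along the path can be pictured, again only as an illustrative sketch, as stepping the object back through the sampled path points toward the user over successive frames. The helper below and its interpolation rate are assumptions made for illustration:

# Illustrative sketch: move a selected object back along the sampled path
# toward the user, one interpolation step per frame. Names are assumptions.
import numpy as np

def retrieve_along_path(path_points, steps_per_segment=4):
    """Yield successive object positions from the far end of the path
    (where the object was selected) back to the user's end."""
    pts = np.asarray(path_points, dtype=float)[::-1]  # far end first
    for a, b in zip(pts[:-1], pts[1:]):
        for i in range(steps_per_segment):
            t = (i + 1) / steps_per_segment
            yield (1.0 - t) * a + t * b               # linear interpolation

# Example: animate an object back along a short three-point path
path = [(0.0, 1.2, 0.0), (0.0, 1.5, -2.0), (0.0, 0.0, -4.0)]
for position in retrieve_along_path(path):
    print(position)   # feed each frame's position to the renderer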
With respect to corresponding vectors, in addition to vectors 120′ and 120″ which correspond to controller 120,
In operation, controllers 120, 820 detect respective movements, determine respective angles of rotation (α, β) based on the movements, and determine magnitudes of force (e.g., acceleration forces, etc.) associated with the movements and/or the respective positions. Controllers 120, 820 further determine respective paths in the VR environment—paths 440, 840—that correspond to the respective angles of rotation and magnitudes of force.
As discussed above, the paths 440, 840 may represent travel paths to guide movement of user 105 in the VR environment. While embodiments having one controller and one travel path (as well as corresponding interactive operations) are discussed above, here, a synergy exists between multiple paths—i.e., path 440 and path 840—that supports further interactive operations. In particular, an analogy may be made to operating a kite with two strings, whereby each path represents one string (or a cable) and the kite is represented by a fixed bounding surface (a ground surface in
The VR system also determines, in step 1330, a path for the VR environment that corresponds to the one or more vectors (e.g., angles of rotation, magnitudes of force, etc.). For example, the VR system can derive the path based on averages, differences, summations, or other combinations of the vectors, which correspond to respective angles of rotation, magnitudes of force (changes of force over time), etc., as is appreciated by those skilled in the art.
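As a concrete, purely illustrative example of such a combination, the per-segment direction vectors may be averaged and re-normalized to give the path's launch direction, while their magnitudes are averaged to set its extent; other weightings are of course possible:

# Illustrative sketch: combine per-segment vectors into a single launch
# direction and magnitude for the path. The weighting is an assumption.
import numpy as np

def combine_vectors(directions, magnitudes):
    """Average the segment directions and magnitudes into one (dir, mag)."""
    directions = np.asarray(directions, dtype=float)
    magnitudes = np.asarray(magnitudes, dtype=float)
    mean_dir = directions.mean(axis=0)
    mean_dir = mean_dir / np.linalg.norm(mean_dir)   # re-normalize
    return mean_dir, float(magnitudes.mean())

# Worked example with two segment vectors
dirs = [(0.0, 0.2, -0.98), (0.0, 0.6, -0.8)]
mags = [4.0, 10.0]
direction, magnitude = combine_vectors(dirs, mags)
print(direction, magnitude)   # combined launch direction and magnitude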
Procedure 1300 continues to step 1335 where, as discussed above, the VR system detects an intersection between portions of the path and an object in the VR environment and selects the object. As mentioned, selecting the object may further cause the object to be indicated as selected (e.g., by a bounding box), cause menu options associated with the object to be displayed, cause the object to be retrieved or moved to a current position of the user, or cause the user to be moved toward the object. The VR system further projects (e.g., using headset 125) the path in the VR environment, such as the figures illustrate for paths 440/840.
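A proximity-based intersection test of the kind described in step 1335 could, as one illustrative sketch, compare each sampled path point against an object's bounding sphere; the object representation and tolerance below are assumptions:

# Illustrative sketch: detect whether any sampled path point falls within
# (or near) an object's bounding sphere. The object format is an assumption.
import numpy as np

def path_intersects(path_points, center, radius, tolerance=0.1):
    """True if the path passes through, or close to, the object's bounds."""
    pts = np.asarray(path_points, dtype=float)
    center = np.asarray(center, dtype=float)
    distances = np.linalg.norm(pts - center, axis=1)
    return bool(np.any(distances <= radius + tolerance))

# Example: a path point passes within 0.15 of an object of radius 0.5
path = [(0.0, 1.2, 0.0), (0.0, 1.0, -2.0), (0.0, 0.4, -3.9)]
print(path_intersects(path, center=(0.0, 0.5, -4.0), radius=0.5))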
Procedure 1300 subsequently ends at step 1345, but may begin again at step 1310 where it detects movement of the first controller. Collectively, the steps in procedure 1300 describe interactive control processes and techniques particularly suitable for VR environments without requiring expensive and often complex equipment. It should be noted that certain steps within procedure 1300 may be optional, and further, the steps shown in
The techniques described herein, therefore, provide interactive control processes that complement immersive simulated VR environments without requiring expensive and complex equipment. These interactive controls define simple and intuitive gestures that can be quickly and efficiently learned by any user.
While there have been shown and described illustrative embodiments of the interactive control processes for VR environments, it is to be understood that various other adaptations and modifications may be made within the spirit and scope of the embodiments herein. For example, the embodiments and certain functionality have been shown and described herein with relation to certain systems, platforms, hardware, devices, and modules. However, the embodiments in their broader sense are not as limited, and may, in fact, be employed in non-VR environments as well as employed by any combination of the devices or components discussed herein.
The foregoing description has been directed to specific embodiments. It will be apparent, however, that other variations and modifications may be made to the described embodiments, with the attainment of some or all of their advantages. For instance, it is expressly contemplated that the components and/or elements described herein can be implemented as software being stored on a tangible (non-transitory) computer-readable medium, devices, and memories (e.g., disks/CDs/RAM/EEPROM/etc.) having program instructions executing on a computer, hardware, firmware, or a combination thereof. Further, methods describing the various functions and techniques described herein can be implemented using computer-executable instructions that are stored or otherwise available from computer readable media. Such instructions can comprise, for example, instructions and data which cause or otherwise configure a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. Portions of computer resources used can be accessible over a network. The computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, firmware, or source code. Examples of computer-readable media that may be used to store instructions, information used, and/or information created during methods according to described examples include magnetic or optical disks, flash memory, USB devices provided with non-volatile memory, networked storage devices, and so on. In addition, devices implementing methods according to these disclosures can comprise hardware, firmware and/or software, and can take any of a variety of form factors. Typical examples of such form factors include laptops, smart phones, small form factor personal computers, personal digital assistants, and so on. Functionality described herein also can be embodied in peripherals or add-in cards. Such functionality can also be implemented on a circuit board among different chips or different processes executing in a single device, by way of further example. Instructions, media for conveying such instructions, computing resources for executing them, and other structures for supporting such computing resources are means for providing the functions described in these disclosures. Accordingly this description is to be taken only by way of example and not to otherwise limit the scope of the embodiments herein. Therefore, it is the object of the appended claims to cover all such variations and modifications as come within the true spirit and scope of the embodiments herein.