Virtualization systems, such as virtual-reality and augmented-reality systems, allow a user to view a virtual environment with virtualized components. The virtualized components may be presented to the user in the virtual environment through a head-mounted display. In some examples, the user may manipulate various virtualized components. For example, a user may control the movement of a virtualized vehicle in a virtual-reality game.
For a more complete understanding of various examples, reference is now made to the following description taken in connection with the accompanying drawings in which:
Virtual environments can be displayed in a headset worn by a user. While wearing the headset, it is difficult for the user to interact with other devices, such as a mobile device, for example. The user generally exits the virtual environment by, for example, removing the headset to access the mobile device. For example, if the user receives an email on the mobile device, the user removes the headset to view the email on the mobile device.
Various examples described herein relate to virtual environments. In various examples, a user device may be coupled to the virtualization system and presented in the virtual environment. The virtualization system includes tracker devices that are associated with the virtualization system and may be worn by a user, for example, on a hand. The tracker devices may include inertial measurement units to measure, or determine, inertial data for the tracker devices. The virtualization system can broadcast the inertial data of the tracker devices. The broadcast data may be received by a user device, such as a mobile device or any of a variety of other devices. In this regard, various mobile devices include motion tracking systems to, for example, support changing screen orientation or monitoring physical activity of a user for a health application. The user device may compare the inertial data of the tracker devices with similar data of the user device. Based on this comparison, the user device may determine motion matching between the user device and a tracker device. For example, if the user is holding a mobile device in his hand, the inertial data of the mobile device may match the inertial data of the tracker device worn on that hand. The user device may then send an indicator to the virtualization system indicating the motion matching, and the virtualization system may couple the user device and present virtualization of the user device in the virtual environment. In various examples, the broadcasting of the inertial data of the tracker device eliminates the need for various user devices to continuously transmit similar data. Thus, user devices can conserve power (e.g., battery power) by merely responding to detected broadcast signals when motion matching is determined.
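The flow described above can be sketched in a few lines. This is an illustrative simulation only; the function names, the dictionary fields, and the equality check standing in for a real inertial-data comparison are all assumptions, not taken from the examples themselves.

```python
# End-to-end sketch (all names assumed): the virtualization system
# broadcasts the tracker's inertial data, each user device compares it
# with its own data, and only a matching device responds.
def broadcast(tracker_inertial, user_devices):
    """Return identifiers of user devices reporting motion matching."""
    return [d["id"] for d in user_devices if d["inertial"] == tracker_inertial]

user_devices = [
    {"id": "phone", "inertial": [1.0, 2.0, 0.5]},   # held in the tracked hand
    {"id": "watch", "inertial": [0.0, 0.0, 0.0]},   # resting on a table
]
matched = broadcast([1.0, 2.0, 0.5], user_devices)   # ["phone"]
```

Because the broadcast is one-to-many, only the device whose motion matches ever transmits, which is the power-saving property noted above.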
Referring now to FIG. 1, an example system 100 for coupling a user device to a virtualization system is illustrated.
In various examples, the tracker device inertial data determination portion 110 determines the inertial data of the tracker device by receiving information from the tracker device. For example, the tracker device may include an inertial measurement unit (IMU) and may transmit information from the IMU to the tracker device inertial data determination portion 110. The information from the IMU may include, or be used to determine, inertial data such as the motion of the tracker device and the direction of gravity. Further, additional information, such as a change in position or orientation, may be derived or calculated from the information from the IMU. In various examples, an IMU can measure acceleration via accelerometers and may be able to provide velocity (by integrating acceleration over a time interval) and a change in position (by integrating the velocity over the time interval). In some examples, the tracker device inertial data determination portion 110 may include other tracking capability, such as an optical tracking system, to provide direct measurements of position and/or orientation.
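The integration step described above can be illustrated with a short numerical sketch. The trapezoidal rule and the sample values here are illustrative assumptions, not taken from the examples.

```python
# Illustrative numerical integration: recovering a velocity change from
# evenly spaced acceleration samples, as described above.
def integrate(samples, dt):
    """Trapezoidal integration of evenly spaced samples with time step dt."""
    return sum(0.5 * (a + b) * dt for a, b in zip(samples, samples[1:]))

accel = [0.0, 1.0, 1.0, 1.0, 0.0]        # m/s^2, one sample every 0.1 s
velocity_change = integrate(accel, 0.1)  # 0.3 m/s
```

Integrating the resulting velocity series a second time in the same way yields the change in position.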
In various examples, the tracker device is either a part of or coupled to a virtualization system. In this regard, the tracker device can communicate with the tracker device inertial data determination portion 110 wirelessly via any of a variety of wireless protocols.
The example system 100 of FIG. 1 may broadcast the inertial data for the tracker device for receipt by various user devices.
The example system 100 of FIG. 1 may further receive, from a user device, an indicator of motion matching between the user device and the tracker device and, in response, couple the user device to the system 100 and present a virtualization of the user device in the virtual environment.
Referring now to FIG. 2, another example system 200 is illustrated. The example system 200 includes a controller 210.
Further, the example system 200 of FIG. 2 includes a headset 220 which may be worn by a user.
The headset 220 includes a head-mounted display 230. In various examples, the head-mounted display 230 may include a screen or a screen portion for each eye. In one example, the head-mounted display 230 includes a screen that includes a left-eye portion and a right-eye portion corresponding to each eye of the user. The head-mounted display 230 may display a virtual environment to the user in accordance with instructions from the controller 210.
In various examples, the controller 210 may include a virtual environment display portion. The virtual environment display portion is provided to generate a virtualized environment to be displayed on the head-mounted display 230. As used herein, virtualized environment includes virtual reality, as well as augmented reality in which a virtual environment and the physical environment are displayed together. In some examples of augmented reality systems, the user is provided with a direct view of the physical environment, and virtual elements are overlaid onto the physical environment via, for example, a half-silvered mirror. In this regard, virtual elements may augment the physical environment of the user.
In one example, the virtual environment display portion generates two corresponding images, one for the left-eye portion of the head-mounted display 230 and another for the right-eye portion of the head-mounted display 230.
The example system 200 of FIG. 2 further includes a tracker device 240 which may be worn by a user, for example, on a hand.
The tracker device 240 includes an inertial data portion 250. In one example, the inertial data portion 250 includes an inertial measurement unit (IMU) or other such component to measure or detect an inertial parameter, such as acceleration in each of the three spatial axes of the tracker device 240. In various examples, the inertial data portion 250 includes accelerometers, gyroscopes, magnetometers or a combination thereof. The tracker device 240 is coupled to the controller 210 and transmits inertial data from the inertial data portion 250 to the controller 210. In this regard, the inertial data may be transmitted at regular intervals or upon any change in the inertial data.
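The "regular intervals or upon any change" transmission policy can be sketched as a simple predicate. The interval and change threshold here are assumed values for illustration only.

```python
# Sketch of the transmission policy described above; the interval and
# epsilon threshold are assumptions, not taken from the examples.
def should_transmit(last_sent, current, elapsed, interval=0.05, epsilon=1e-3):
    """Send when the regular interval has elapsed, or sooner if any axis
    of the IMU reading has changed beyond a small threshold."""
    if elapsed >= interval:
        return True
    return any(abs(c - l) > epsilon for c, l in zip(current, last_sent))
```

For example, an unchanged reading before the interval elapses produces no transmission, while any significant change on any axis triggers one immediately.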
In the example illustrated in FIG. 2, the environment of the user includes various user devices 260, 270, such as a smart phone or a smart watch.
In one example, the controller 210 is to receive inertial data from the inertial data portion 250 of the tracker device 240. The controller 210 then broadcasts inertial data for the tracker device 240 for receipt by the user devices 260, 270. In one example, the broadcast of the inertial data may be broadcast for receipt by user devices 260, 270 that are nearby, such as within the same room as the controller 210. Each user device 260, 270 may then determine whether the inertial data of the tracker device 240 indicates motion matching with the user device 260, 270. In this regard, each user device 260, 270 may compare the inertial data of the tracker device 240 with similar inertial data of the user device 260, 270.
In various examples, the comparison performed by the user device 260, 270 includes comparing data from the inertial data portion 250 (e.g., IMU) of the tracker device 240 with inertial data from an IMU of the user device 260, 270. In this regard, the user device 260, 270 may evaluate the change in magnitude within certain time intervals. In various examples, the time intervals are sufficiently large to allow for possible communication delays or other delays. The comparison process may include multiple comparisons involving different sets of data from the inertial data portion 250 of the tracker device 240, for example. If a correlation is determined between the inertial data of the tracker device 240 and the inertial data of the user device 260, 270, motion matching may be indicated.
If the comparison indicates motion matching, the user device 260, 270 may transmit a signal to the controller 210. In the example of FIG. 2, the controller 210 may then couple the motion matching user device 260, 270 and present a virtualization of the user device 260, 270 in the virtual environment.
Referring now to FIG. 3, an example is illustrated in which a user wearing a headset 320 has various user devices 340, 360.
The user devices 340, 360 transmit a signal indicating motion matching to the controller and may then be coupled to the controller. The controller may then present a virtualization of the motion matching user device 340, 360 in a virtual environment presented to the user in the headset 320, an example of which is illustrated in FIG. 4.
In the example virtual environment 400 of FIG. 4, a virtualization 460 of the smart phone 360 is presented to the user.
In some examples, the coupling of the smart phone 360 with the controller includes sharing of content displayed on a display screen of the smart phone 360. The controller may then present the content on a virtual display screen 462 of the virtualization 460 of the smart phone 360. Thus, a user may access content, such as an email or a text message, on the smart phone 360 while viewing the virtualization 460 of the smart phone 360 in the virtual environment 400.
Referring now to FIG. 5, a flow chart illustrates an example method 500 which may be performed by a controller of a virtualization system. The example method 500 includes determining inertial data for a tracker device associated with the virtualization system (block 510).
The example method 500 further includes broadcasting the inertial data for the tracker device (block 520). As noted above, the inertial data may be broadcast for receipt by various user devices in a region.
At block 530 of the example method 500, the controller may receive a signal from a user device indicating motion matching of the user device with the tracker device. In various examples, the signal from the user device is in response to the broadcast of the inertial data for the tracker device by the controller.
In various examples, upon receiving indication of motion matching of a user device, the controller may couple the user device with the controller and present a virtualization of the user device in a virtual environment associated with the tracker device. In this regard, the coupling of the device may allow the controller to identify the type of device (e.g., smart phone or smart watch) and other details associated with the user device (e.g., size). This information may be used to present the virtualization of the user device.
In one example, the user device may transmit a motion matching status change indication when, for example, motion matching is terminated. For example, the user may set down a smart phone while a virtualization of the smart phone is being presented. At this point, the motion matching may be halted. The virtualization of the smart phone may continue in a static manner until the smart phone transmits a status change with respect to motion matching. The status change may result in the removal of the virtualization of the smart phone from the virtual environment presented to the user.
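The lifecycle just described amounts to a small state machine. This is a minimal sketch; the class and state names are assumptions for illustration.

```python
# Sketch of the virtualization lifecycle described above: rendered live
# while motion matching holds, frozen when matching halts, and removed
# on an explicit status change from the device.
class DeviceVirtualization:
    def __init__(self):
        self.state = "live"            # rendered and tracking the device

    def on_motion_match_lost(self):
        if self.state == "live":
            self.state = "static"      # still shown, but no longer moving

    def on_status_change(self):
        self.state = "removed"         # taken out of the virtual environment
```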
Referring now to FIG. 6, a flow chart illustrates an example method 600 which may be performed by a user device. The example method 600 includes receiving, at the user device, broadcast inertial data for a tracker device (block 610). The user device, or a controller thereof, may further receive inertial data of the user device from an inertial measurement unit (IMU) of the user device (block 620). As noted above, various user devices may include accelerometers or other components to measure, or allow calculation of, various inertial parameters.
The example method 600 of FIG. 6 further includes comparing the inertial data of the user device with the broadcast inertial data for the tracker device to determine motion matching of the user device with the tracker device (block 630).
Upon determination of motion matching, the user device transmits a signal to the controller indicating the motion matching (block 640). The signal may include an identification of the user device, and the matching tracker device, to the controller and allow the controller to couple with the user device.
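A signal carrying the device and tracker identification might look like the following. The wire format and field names are hypothetical illustrations, not a format specified in the examples.

```python
import json

# Hypothetical wire format for the motion-matching signal; every field
# name here is illustrative only.
def matching_indicator(device_id, device_type, tracker_id):
    return json.dumps({
        "event": "motion_match",
        "device_id": device_id,       # identifies the user device
        "device_type": device_type,   # e.g. "smart_phone", "smart_watch"
        "tracker_id": tracker_id,     # the tracker device that matched
    })

msg = matching_indicator("phone-01", "smart_phone", "tracker-1")
```

Including a device type and identifier in the signal gives the controller what it needs to couple the device and size its virtualization appropriately.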
Referring now to FIG. 7, an example non-transitory computer-readable medium is illustrated with example instructions stored thereon, the instructions being executable by a processor of a virtualization system.
The example instructions include determine tracker device inertial data for a tracker device instructions 721. As noted above, determining of the inertial data of the tracker device may include receiving the inertial data from the tracker device. The inertial data may be based on an inertial measurement unit (IMU) or similar component in the tracker device.
The example instructions further include broadcast tracker device inertial data instructions 722. As described above, a controller of a virtualization system may broadcast the inertial data for a tracker device for receipt by various user devices.
The example instructions further include receive motion matching indicator from a user device instructions 723. As noted above, the indicator received from the user device may be indicative of motion matching of the user device and the tracker device. The motion matching may include matching of movement and orientation.
The example instructions further include couple the user device to a virtualization system instructions 724. As described above, a motion matching user device may be coupled to a controller associated with the tracker device with which the user device is motion matching.
In some examples, a virtualization of the user device may then be presented in a virtual environment presented to the user. For example, the virtualization of the user device and the virtual environment may be presented to the user in a head-mounted display.
Thus, in various examples, a user device may be determined to be motion matching with a tracking device while conserving battery power. By broadcasting the inertial data of the tracker device, user devices are not required to expend battery power by transmitting their own inertial data. The user devices can compare the broadcast inertial data with their own inertial data and transmit a signal if motion matching is determined.
Software implementations of various examples can be accomplished with standard programming techniques using rule-based logic and other logic to accomplish various database searching, correlation, comparison, and decision steps or processes.
The foregoing description of various examples has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the examples disclosed; modifications and variations are possible in light of the above teachings or may be acquired from practice of various examples. The examples discussed herein were chosen and described in order to explain the principles and the nature of various examples of the present disclosure and their practical application, to enable one skilled in the art to utilize the present disclosure in various examples and with various modifications as are suited to the particular use contemplated. The features of the examples described herein may be combined in all possible combinations of methods, apparatus, modules, systems, and computer program products.
It is also noted herein that while the above describes examples, these descriptions should not be viewed in a limiting sense. Rather, there are several variations and modifications which may be made without departing from the scope as defined in the appended claims.
Filing Document: PCT/US2018/041517
Filing Date: 7/10/2018
Country: WO
Kind: 00