This application claims the benefit of Taiwan application Serial No. 104140569, filed Dec. 3, 2015, the disclosure of which is incorporated by reference herein in its entirety.
The disclosure relates in general to a mobile virtual reality (VR) operation method, system and storage media.
In a virtual reality (VR) environment, a computer generates a three-dimensional (3D) virtual world, and a user may explore and observe objects in the 3D virtual world in real time and without limitation through sight, hearing, touch and so on. When the user moves, the computer may generate corresponding 3D images through complex real-time computation, so that the user feels the environment is moving. Therefore, a VR system may meet the user's needs as much as possible.
Most VR systems provide a visual experience to the user. The user may communicate with the VR system via input devices such as a keyboard, a mouse and a wired glove. VR technology is limited by computer processing capability, image resolution and communication bandwidth. However, as technology develops, computer processing capability, image resolution and communication bandwidth continue to improve while costs decrease, so these limitations on VR technology will lessen in the future.
The disclosure is directed to a mobile virtual reality (VR) operation method, system and storage media, in which a movement status of the mobile VR system is determined to control the display of the VR image.
According to one embodiment, a mobile virtual reality (VR) system is provided. The mobile VR system includes a display unit, a sensing unit, a photographing unit and a VR application. The sensing unit is for sensing a physical movement variation of the mobile VR system. The photographing unit is for photographing the environment to generate a photograph image. The VR application is for determining a movement status of the mobile VR system, based on the physical movement variation sensed by the sensing unit and the photograph image from the photographing unit, to adjust a VR display image on the display unit.
According to another embodiment, a mobile VR operation method for a mobile VR system is provided. A physical movement variation of the mobile VR system is sensed. The environment is photographed to generate a photograph image. A movement status of the mobile VR system is determined, based on the physical movement variation and the photograph image, to adjust a VR display image on the mobile VR system.
According to an alternative embodiment, a non-transitory computer-readable storage medium is provided. When the non-transitory computer-readable storage medium is read by a computer, the computer executes the above mobile VR operation method.
In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. It will be apparent, however, that one or more embodiments may be practiced without these specific details. In other instances, well-known structures and devices are schematically shown in order to simplify the drawing.
Technical terms of the disclosure are based on their general definitions in the technical field of the disclosure. If the disclosure describes or explains one or more terms, the definitions of those terms are based on the description or explanation in the disclosure. Each of the disclosed embodiments has one or more technical features. In possible implementations, one skilled in the art may selectively implement part or all of the technical features of any embodiment of the disclosure or selectively combine part or all of the technical features of the embodiments of the disclosure.
The display unit 110 displays, in real time, the VR image from the VR application 160.
The human-machine operation interface 120 provides an operation interface for the user to operate the mobile VR system 100.
The photographing unit 130 photographs the environment to generate a photograph image. The photographing unit 130 is, for example but not limited to, a rear camera of a smart mobile device. In one embodiment, the rear camera refers to the camera on the back side of the smart mobile device, whereas the display unit 110 is on the front side of the smart mobile device. That is, the display unit 110 and the photographing unit 130 are on opposite sides of the smart mobile device. The photograph image from the photographing unit 130 is sent to the VR application 160. Accordingly, the VR application 160 determines whether the image is zoomed in or zoomed out, to further determine whether the mobile VR system 100 tilts (or moves) forward or backward, or is still.
The acceleration unit 140 is for sensing an acceleration sensing value of the mobile VR system 100. The acceleration unit 140 is, for example but not limited to, a G-sensor. The acceleration sensing value sensed by the acceleration unit 140 may be sent to the VR application 160 to further determine whether the mobile VR system 100 tilts (or moves) forward or backward, or is still.
The direction sensing unit 150 is for sensing an angle sensing value of the mobile VR system 100. The direction sensing unit 150 is, for example but not limited to, a gyroscope. The angle sensing value sensed by the direction sensing unit 150 may be sent to the VR application 160 to further determine whether the mobile VR system 100 tilts (or moves) forward or backward, or is still.
Based on the photograph image from the photographing unit 130, the acceleration sensing value sensed by the acceleration unit 140 and the angle sensing value sensed by the direction sensing unit 150, the VR application 160 determines whether the mobile VR system 100 tilts (or moves) forward or backward, or is still. Further, based on the determination result, the VR application 160 displays the VR image in real time on the display unit 110, and accordingly the user may view the VR image in real time. The VR images are stored in a memory (not shown), read out by the VR application 160 and displayed on the display unit 110.
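The disclosure does not fix a particular fusion rule for combining the three sensing results. As one illustrative sketch only (a simple majority vote; the function name, cue encoding and voting rule are assumptions, not taken from the disclosure), the combination could look like:

```python
def fuse_cues(zoom_cue: str, accel_cue: str, angle_cue: str) -> str:
    """Combine the three cues named above (photograph-image zoom,
    acceleration sensing value, angle sensing value) by majority vote.
    Each cue is one of 'forward', 'backward' or 'still'.
    Voting is an illustrative assumption; the disclosure does not
    specify how the three signals are fused."""
    votes = [zoom_cue, accel_cue, angle_cue]
    for status in ("forward", "backward"):
        if votes.count(status) >= 2:
            return status
    # No agreement between at least two cues: treat as still.
    return "still"
```

Under this assumed rule, the system reports movement only when at least two of the three sensing paths agree, which illustrates why the combined determination is less easily affected by noise in any single sensor.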
Further, the mobile VR system 100 may optionally include a communication unit.
Similarly, in step 210, during initial setting, the VR application 160 commands the user to move a second predetermined distance along a second direction (for example but not limited to, backward). In response to the command, the user moves the mobile VR system 100 along the second direction by a second initial movement amount, and the VR application 160 records the detected second initial movement amount as a second threshold. In the following, for example but not limited to, the VR application 160 predicts the second initial movement amount of the mobile VR system 100 based on the acceleration sensing value sensed by the acceleration unit 140. For example, in response to the command from the VR application 160, the user may tilt/move the mobile VR system 100 backward by 15 cm. Besides, although the user is commanded to move the mobile VR system 100 by 15 cm, the second initial movement of the mobile VR system 100 caused by the user may not be precisely 15 cm; it may be, for example, 14 cm or 16 cm.
In other possible embodiments of the application, the first threshold and the second threshold may be obtained via computation. That is, steps 205 and 210 may be skipped, and the VR application 160 obtains the first threshold and the second threshold via computation. Alternatively, after the first threshold and the second threshold are obtained in steps 205 and 210, they may be further processed.
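The disclosure states only that the initial movement amount is predicted from the acceleration sensing value. One minimal sketch of such a prediction (double integration of the acceleration samples; the function name, sampling model and integration scheme are assumptions) is:

```python
def calibrate_threshold(accel_samples, dt):
    """Initial setting (steps 205/210): estimate how far the user
    actually moved the device from a series of acceleration samples
    (m/s^2) taken every dt seconds, and record that distance as the
    threshold. Naive double integration is an illustrative assumption;
    real G-sensor data would also need filtering and drift correction."""
    velocity = 0.0
    distance = 0.0
    for a in accel_samples:
        velocity += a * dt          # integrate acceleration -> velocity
        distance += abs(velocity) * dt  # integrate speed -> distance moved
    return distance
```

For example, ten samples of 1.0 m/s² at dt = 0.1 s yield an estimated movement of 0.55 m with this scheme, which would then be stored as the user's personal threshold.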
In step 215, the VR application 160 displays the VR image on the display unit 110 and enables the photographing unit 130. In other words, the user may view the VR image on the display unit 110 to have a VR experience.
In the embodiment of the application, the image from the photographing unit 130 may be used to determine whether the image is zoomed in or zoomed out.
In step 220, the VR application 160 determines whether the photograph image from the photographing unit 130 is zoomed in or zoomed out. The details of how to make this determination are not specified here. If the VR application 160 determines that the photograph image is zoomed in, the flow proceeds to step 225. On the contrary, if the VR application 160 determines that the photograph image is zoomed out, the flow proceeds to step 240. In the embodiment of the application, in operation, if the mobile VR system 100 is tilted forward, the photograph image from the photographing unit 130 will be zoomed in because the photographing unit 130 moves nearer to the objects being photographed. On the contrary, if the mobile VR system 100 is tilted backward, the photograph image will be zoomed out because the photographing unit 130 moves away from the objects being photographed. Thus, in the embodiment of the application, the determination of whether the photograph image is zoomed in or zoomed out is used to determine whether the mobile VR system 100 tilts forward or backward.
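Although the disclosure leaves the zoom determination unspecified, one illustrative criterion (assumed here, not the disclosure's method) is to track matched feature points across consecutive frames: when the camera moves toward the scene, points spread outward from the image center; when it moves away, they contract.

```python
def zoom_direction(prev_pts, curr_pts, center=(0.0, 0.0)):
    """Classify consecutive frames as zoomed in or zoomed out by
    comparing the mean distance of matched feature points from the
    image center. Feature detection/matching is assumed to be done
    elsewhere; the 2% tolerance band is an illustrative assumption.
    Returns 'zoom_in', 'zoom_out' or 'none'."""
    def mean_radius(pts):
        return sum(((x - center[0]) ** 2 + (y - center[1]) ** 2) ** 0.5
                   for x, y in pts) / len(pts)

    r_prev, r_curr = mean_radius(prev_pts), mean_radius(curr_pts)
    if r_curr > r_prev * 1.02:   # points spread outward -> camera nearer
        return "zoom_in"
    if r_curr < r_prev * 0.98:   # points contract -> camera farther
        return "zoom_out"
    return "none"
```

Under this sketch, a zoom_in result corresponds to the forward-tilt branch (step 225) and a zoom_out result to the backward-tilt branch (step 240).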
When the user operates the mobile VR system 100, the user may tilt forward or backward. Thus, the mobile VR system 100 may detect its own movement amount. In step 225, the VR application 160 determines whether the physical movement amount of the mobile VR system 100 along the first direction is over the first threshold (that is, the VR application 160 determines whether the mobile VR system 100 tilts forward and, if yes, whether the physical forward movement amount is over the first threshold). If the result of step 225 is yes (that is, the physical forward movement amount is over the first threshold), it is determined that the user tilts forward (i.e. tilts toward the first direction), and in step 230 the VR application 160 displays the moving-along-first-direction VR image on the display unit 110.
If the result of step 225 is no (that is, although the mobile VR system 100 tilts forward, the physical forward movement amount is not over the first threshold), it is determined that the user is still. In step 235, the VR application 160 displays the still VR image on the display unit 110.
In step 240, the VR application 160 determines whether the physical movement amount of the mobile VR system 100 along the second direction is over the second threshold (that is, the VR application 160 determines whether the mobile VR system 100 tilts backward and, if yes, whether the physical backward movement amount is over the second threshold). If the result of step 240 is yes (that is, the physical backward movement amount is over the second threshold), it is determined that the user tilts backward (i.e. tilts toward the second direction), and in step 245 the VR application 160 displays the moving-along-second-direction VR image on the display unit 110.
If the result of step 240 is no (that is, although the mobile VR system 100 tilts backward, the physical backward movement amount is not over the second threshold), it is determined that the user is still. In step 250, the VR application 160 displays the still VR image on the display unit 110.
That is, in the embodiment of the application, during operation of the mobile VR system 100, if the user wants to view the moving-forward VR image, the user may tilt or move forward by a sufficient physical movement amount, so that the physical forward movement amount of the mobile VR system 100 is over the first threshold. If the user wants to view the still VR image, the user may stand still (moving neither forward nor backward). If the user wants to view the moving-backward VR image, the user may tilt or move backward by a sufficient physical movement amount, so that the physical backward movement amount of the mobile VR system 100 is over the second threshold.
In step 255, the VR application 160 determines whether the user operation is ended. If yes, the flow ends. If not, the flow returns to step 220.
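One pass of the decision logic in steps 220 through 250 can be sketched as follows (the function name, the string labels and the assumption that the movement amount is a non-negative magnitude along the direction indicated by the zoom determination are all illustrative):

```python
def select_vr_image(zoom_in: bool, movement_amount: float,
                    first_threshold: float, second_threshold: float) -> str:
    """One iteration of steps 220-250: choose which VR image to display.

    zoom_in         -- result of step 220 (True: zoomed in, False: zoomed out)
    movement_amount -- physical movement amount along the indicated
                       direction (assumed non-negative)
    Returns 'forward', 'backward' or 'still'.
    """
    if zoom_in:
        # Step 225: forward tilt; over the first threshold -> step 230,
        # otherwise step 235 (still VR image).
        return "forward" if movement_amount > first_threshold else "still"
    # Step 240: backward tilt; over the second threshold -> step 245,
    # otherwise step 250 (still VR image).
    return "backward" if movement_amount > second_threshold else "still"
```

In operation this function would be called once per sensing cycle until step 255 ends the flow, so small, below-threshold movements leave the still VR image on the display.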
Further, in the embodiment of the application, if the user wants to view the VR image on his/her right-hand or left-hand side, the user may remain still and then turn toward the desired direction. Then the user may execute the flow chart in
As shown in
As shown in
As shown in
As shown in
In an embodiment of the application, the details of how the VR application 160 determines whether the mobile VR system 100 moves/tilts forward or backward are as follows. For example, if the frame rate of the photographing unit 130 is 18-35 FPS (frames per second) and the sampling rate of the acceleration unit 140 is 15-197 Hz, the embodiment of the application may achieve a better and more precise determination by adopting image scaling detection together with the angle sensing value from the direction sensing unit 150 and the acceleration sensing value from the acceleration unit 140.
Further, in an embodiment of the application, each pixel is defined by a motion vector. The motion vector is classified into four directions. If the motion vector of a pixel points along any of the four directions, the motion vector value of this pixel is 1 (here, 1/0 values are used as an example for explanation, but the disclosure is not limited thereto). On the contrary, if the motion vector of the pixel points along none of the four directions, the motion vector value of this pixel is 0. A histogram is obtained by gathering the motion vectors of all pixels. Then the pattern of the histogram is judged to determine whether the mobile VR system 100 moves/tilts forward or backward.
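The histogram-gathering step above can be sketched as follows. The binning rule (a vector counts toward a direction only when that axis component dominates, otherwise it contributes 0, matching the 1/0 scheme of the text) is an illustrative assumption; the disclosure does not define the four directions or the binning precisely.

```python
def histogram_of_directions(motion_vectors):
    """Gather per-pixel motion vectors (dx, dy) into a four-direction
    histogram (right, up, left, down). A vector is binned (value 1)
    only when one axis component strictly dominates; otherwise it has
    no dominant direction and contributes 0. Illustrative assumption."""
    hist = {"right": 0, "up": 0, "left": 0, "down": 0}
    for dx, dy in motion_vectors:
        if abs(dx) > abs(dy):
            hist["right" if dx > 0 else "left"] += 1
        elif abs(dy) > abs(dx):
            hist["up" if dy > 0 else "down"] += 1
        # |dx| == |dy|: no dominant direction, pixel contributes 0
    return hist
```

The pattern of the resulting histogram (for example, motion spreading outward in all four directions when the device moves toward the scene) would then be judged to decide forward versus backward movement, as the text describes.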
Of course, the VR application 160 may use other algorithms to determine whether the mobile VR system 100 moves/tilts forward or backward, and the details are omitted here.
That is, in the embodiment of the application, in operation, if the user tilts forward enough (over the first threshold), the user may view the moving-forward VR image on the display unit 110. When the user is still, or tilts backward enough (over the second threshold), the user may view the still or moving-backward VR image on the display unit 110, respectively.
The head-mounted case 800 includes an elastic band 810, an adjustable camera hole 820, a recess 830, two lenses 840 and a soft cushion 850.
The elastic band 810 extends from two sides of the head-mounted case 800 and is fastened to the user's head. The adjustable camera hole 820 may be adjusted based on the size and location of the photographing unit 130 to expose the photographing unit 130. The recess 830 is for receiving a smart mobile device. The lenses 840 correspond to the left eye and the right eye of the user, respectively, and to the left half and the right half of the display unit 110, respectively. The lenses 840 enlarge the VR images displayed on the left half and the right half of the display unit 110, respectively. The soft cushion 850 surrounds the lenses 840. The soft cushion 850 is, for example, a sponge, which gives the user a soft feeling when it touches the user's face.
Based on the sensing results from the sensing units (which represent the physical movement variation of the mobile VR system 100), the mobile VR system 100 of the embodiment of the application may determine the movement status (moving forward, moving backward or still) and accordingly adjust the VR images.
In the embodiment of the application, sensing the user operation (tilting forward, tilting backward or remaining still) by the acceleration unit, the direction sensing unit and the photographing unit is sufficient for controlling the display of the VR image. The acceleration unit, the direction sensing unit and the photographing unit are common in modern smart phones. That is, the mobile VR system 100 of the embodiment of the application may control the display of the VR image without additional control means. Thus, the mobile VR system 100 of the embodiment of the application has a cost advantage.
Besides, in detecting and determining the user operation (tilting forward, tilting backward or remaining still), the mobile VR system 100 considers whether the photograph image is zoomed in or zoomed out, together with the acceleration sensing value and the angle sensing value. Therefore, the detection result is more accurate and is not easily affected by noise.
Further, in the initial setting of the mobile VR system 100 of the embodiment of the application, each user sets his/her own first/second thresholds (that is, respective thresholds reflecting the moving/tilting habits of that user). That is, the mobile VR system 100 of the embodiment of the application may fine-tune the first/second thresholds for each user. Thus, even though each user may have a different forward or backward tilting angle, after fine-tuning of the first/second thresholds, the forward or backward tilting detection of the mobile VR system 100 of the embodiment of the application is not easily affected by user differences. Thus, the detection is more precise.
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments. It is intended that the specification and examples be considered as exemplary only, with a true scope of the disclosure being indicated by the following claims and their equivalents.
Number | Date | Country | Kind
---|---|---|---
104140569 | Dec 2015 | TW | national