A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent disclosure, as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright rights whatsoever.
The present invention relates generally to virtual reality viewers, and methods for providing touch, i.e. tactile, feedback to users using the virtual reality viewer for interacting with virtual reality content. More particularly it relates to integrating touch feedback from active and inert touch interfaces in a virtual reality environment. Still more particularly it relates to systems and methods for providing touch, i.e. tactile, feedback to users using a virtual reality viewer for interacting with virtual reality gaming content.
Virtual reality consoles, i.e. viewers, are known for providing an immersive interactive video experience to a user. These viewers typically are worn on the user's head and position a stereo-optical display for the user to view. The content is presented in an auto-stereo, three-dimensional rendition. Virtual reality content can be authored video content such as interactive games, or can be pre-recorded or live video streams captured by virtual reality capable cameras which can capture a 360° view of the environment. The content may be provided to the viewer through a wireless network, e.g. an ultra-high frequency band assigned for mobile cellular communication such as 2G, 3G and 4G, WiFi or the like. The viewers can include location and position sensors as well as gyroscopes and accelerometers such that the content is rendered based upon the user turning or dipping their head. Katz et al, US Pub. App. 2015/0193949 filed Jan. 5, 2015 and titled “Calibration of Multiple Rigid Bodies in a Virtual Reality System”, the disclosure of which is incorporated by reference, discloses such a viewer and supporting system. Perry, WO 2014/197230A1 filed May 23, 2014 and titled “Systems and Methods for Using Reduced Hops to Generate Virtual-Reality Scene Within a Head Mounted System”, the disclosure of which is incorporated by reference, discloses a gaming virtual reality (VR) headset using a handheld controller to provide user input. The head mounted display may include a forward looking digital camera to capture images of hand or finger gestures to provide user input and to provide real environmental context which may be considered in the VR rendition.
In environments where a VR viewer does not have a handheld “wired” or wireless controller, and instead relies on hand or finger gestures made in front of the viewer and captured by the forward looking camera, there is no tactile feedback. For example, if the VR content requires the user to provide a button press or a finger slide input, a captured gesture in the air in front of the viewer does not provide physical touch feedback. Further, where there are several buttons from which one must be selected and depressed, the confirmatory tactile feedback of a physical button press is not present. With specific reference to gaming and physical gaming machines which have numerous selection buttons, a VR rendition of the gaming machine would not provide the player with the touch, i.e. tactile, feedback associated with a button selection and press or touch. Without a wired or wireless active button panel, the tactile feedback cannot be had. It would be advantageous to provide a system and method where a physical, inactive, communicatively inert button panel can be synchronized with a VR viewer to provide VR rendered buttons corresponding to buttons on the physical panel and to detect, through gesture recognition, a touch at the physical button. By “communicatively inert” what is meant is that the button panel is not connected by wired or wireless communication to a network or system, as a computer keyboard or wired/wireless controller providing user inputs would be. The viewer recognizes the touch gesture, and the touch at the physical panel provides the touch feedback to the user. According to this arrangement the physical panel can be a printed button panel to lay on a rigid surface such as a desk top, a projected button panel or an inert button panel with depressible buttons. It would be advantageous to use the same approach for finger “slide”, touch pointer or touch gesture inputs to provide touch feedback to the user.
In this fashion tactile feedback can be provided without connected keyboards or controllers.
According to one aspect of the present invention, a system for providing tactile feedback for a user of a virtual reality headpiece viewer is provided where the system includes one or more servers to package and control virtual reality (“VR”) content delivered to the viewer responsive to user inputs and a transceiver to deliver the content to the viewer and receive and transmit inputs from the user to the one or more servers through a communication network. The system further includes a physical button panel for buttons associable with different user inputs, the user touch at a button providing a tactile feedback to the user where the panel is, with respect to user touches, communicatively inert. A viewer video camera captures real-time image data of the panel. A controller at the viewer and/or the one or more servers receives the image data and transmits the data to the one or more servers. The one or more servers and/or controller synchronize the virtual reality content for defining virtual buttons to substantially correspond with the positions of one or more physical button locations described on the panel and allocate a virtual reality input function to each virtual button. The viewer forward looking camera captures a user's touch gesture at the physical button on the panel for generating an input signal associated with the button and its allocated input function which is provided as one or more signals to the transceiver. The viewer controller and/or servers based upon the camera data signals integrate a view of the physical panel into the virtual reality environment such that the user may recognize and touch the physical panel to obtain the tactile feedback while the camera interprets the touch gesture as an input for controlling one or more features associated with the virtual reality content.
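The synchronization described above, pairing each detected physical button location with a virtual button and an allocated input function, can be sketched as follows. This is a minimal illustration only; the class and function names are hypothetical and not part of the disclosure:

```python
# Minimal sketch of synchronizing virtual buttons with detected
# physical button locations and allocating input functions.
# All names here are illustrative, not part of the disclosure.

from dataclasses import dataclass

@dataclass
class VirtualButton:
    label: str   # input function allocated to this button
    x: float     # panel-relative horizontal position
    y: float     # panel-relative vertical position
    w: float     # width of the touch region
    h: float     # height of the touch region

def synchronize_buttons(detected_locations, input_functions):
    """Pair each detected physical button location with an input
    function so the rendered virtual button coincides with it."""
    buttons = []
    for (x, y, w, h), label in zip(detected_locations, input_functions):
        buttons.append(VirtualButton(label, x, y, w, h))
    return buttons

# Example: three buttons found by the forward looking camera.
locations = [(0.10, 0.80, 0.15, 0.10),
             (0.40, 0.80, 0.15, 0.10),
             (0.70, 0.80, 0.15, 0.10)]
functions = ["SPIN", "BET_UP", "CASH_OUT"]
panel = synchronize_buttons(locations, functions)
```

The rendered virtual buttons then occupy the same panel-relative regions the camera detected, so a touch at the physical panel lands on the corresponding virtual control.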
Where the viewer includes position sensors to sense the direction and angle of view to include the physical button panel, perhaps resting on a desktop, the controller and/or one or more servers are configured to render into the virtual reality environment the corresponding view of the button panel such as perspective and orientation as would be expected in the physical world. Thus when a user tips their head to bring the physical button panel into view, the controller and/or servers in real time render a virtual reality version of the button panel into the virtual reality environment viewed by the user.
In an embodiment the button panel could be printed, or a video display with or without haptic feedback (as described in Rosenberg et al, U.S. Pat. No. 7,982,720 issued Jul. 19, 2011 and titled “Haptic Feedback for Touchpads and other Touch Controls” and Kelly et al, US Pub App 2014/0235320A1 filed Apr. 15, 2014 and titled “Dynamic Palpable Controls for a Gaming Device”, the disclosures of which is incorporated by reference), or include depressible buttons such as an elastomeric button panel.
There is also set forth a system for providing tactile feedback for a player of a virtual reality headpiece viewer for playing a virtual gaming device such as devices commonly referred to as slot machines where the system includes one or more servers to package and control virtual reality gaming content for delivery to the player responsive to user inputs and a transceiver to deliver the content to the viewer and receive and transmit inputs from the player. The system further includes a physical button panel for buttons associable with different user inputs, the player touch at a button providing a tactile feedback to the player where the panel is, with respect to player touches, communicatively inert. For example, a player may touch a button to prompt play of the gaming device, change wagers or change wagering propositions, e.g. how many pay lines to wager upon, or provide sliding gestures to spin a wheel or reel as part of the virtual reality game environment. A video camera captures real-time image data of the physical button panel and the player's touch. The video camera may be mounted to the headpiece. A controller at the headpiece and/or the one or more servers receives the image data and transmits the data to the one or more servers. The one or more servers and/or controller synchronize the virtual reality content for defining player observed virtual buttons to substantially correspond with the positions of one or more physical button locations on the physical button panel, allocate a virtual reality input function to each virtual button and determine a player's touch at a physical button for generating an input signal associated with the button and its allocated input function and provide the signal to the transceiver.
The viewer controller and/or servers based upon the camera data integrate a view of the physical panel into the virtual reality environment such that the player may touch the physical panel to obtain the tactile feedback while the camera interprets the touch gesture as an input for controlling one or more features associated with the virtual reality content.
Related to the foregoing where the player is in a casino environment, a player virtual reality station may be provided which includes a cash validator and ticket validator and printer which are associated with the physical button panel to enable the player to establish credits for wagering and to receive a physical instrument when cashing out credits as is provided in current casino environments. Additionally or alternatively the headpiece may communicate with a credit account for downloading value for gaming credits.
In an embodiment the physical button panel may include electromagnetic beacons or visual beacons to enable the controller and/or one or more servers to recognize the location, orientation, type or configuration, size and/or shape of the physical button panel for appropriately rendering the virtual reality environment to include the panel and buttons.
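Where two visual beacons with a known physical separation are placed on the panel, the controller could recover the panel's position, scale and orientation from their pixel coordinates alone. The following is an illustrative sketch under that assumed two-beacon scheme; the function name and geometry are not taken from the disclosure:

```python
# Sketch: recovering panel center, scale and orientation from two
# visual beacons with a known physical separation. The two-beacon
# scheme and all names here are illustrative assumptions.
import math

def locate_panel(beacon_a, beacon_b, physical_separation_cm):
    """beacon_a and beacon_b are (x, y) pixel coordinates of the
    two beacons in the camera image; returns the panel center in
    pixels, a pixels-per-cm scale, and the rotation in degrees."""
    ax, ay = beacon_a
    bx, by = beacon_b
    center = ((ax + bx) / 2.0, (ay + by) / 2.0)
    pixel_dist = math.hypot(bx - ax, by - ay)
    scale = pixel_dist / physical_separation_cm
    angle = math.degrees(math.atan2(by - ay, bx - ax))
    return center, scale, angle

# Two beacons 40 cm apart appearing 400 px apart, level in the image.
center, scale, angle = locate_panel((100, 200), (500, 200), 40.0)
```

A production system would likely use a full fiducial-marker pose estimate, but even this two-point version suffices to place and size a rendered panel in the VR scene.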
In an embodiment the controller may be a smart phone mounted to headgear to define the headpiece. A software client application provided to the smart phone configures it to be the controller or cooperate with one or more servers to be the controller and to use the smart phone camera as the video camera.
In an embodiment the camera may capture the button panel and finger touches in other than visual light such as infrared.
There is also set forth a method for providing tactile feedback for a user of a virtual reality headpiece viewer, the method including one or more servers for packaging and controlling virtual reality content for delivery to the viewer and for responding to user inputs. A transceiver is provided for delivering the virtual reality content to the viewer and receiving and transmitting inputs from the user to the one or more servers through a communication network. The transceiver may be a wireless transceiver such as a WiFi or broadband communication device and network in communication with the viewer. A video camera is provided for capturing real-time video data of a physical, communicatively inert, button panel and a user's touch at the panel, the touch at the panel providing tactile feedback to the user. A controller receiving the image data transmits the data to the one or more servers via the transceiver where the one or more servers and/or controller synchronize the virtual reality content to define virtual buttons substantially corresponding with the positions of one or more physical button locations on the physical button panel, allocate a virtual reality input function to each virtual button and determine a user's touch at a physical button on the panel, generate an input signal associated with the button and its allocated input function and provide the signal to the transceiver for transmission to the one or more servers for controlling an aspect of the virtual reality environment being experienced by the user.
There is also set forth a method for integrating tactile feedback into a player's experience of playing a virtual gaming device using a virtual reality viewer. The method includes accessing, through a transceiver in communication with the viewer, one or more servers which package and control virtual reality gaming content for delivery to the viewer responsive to user inputs. The method includes capturing in real-time video image data of a physical, communicatively inert, button panel positioned for touching by the player to provide tactile feedback as well as capturing images of the player's touches. A controller receiving the image data transmits the image data to the one or more servers for synchronizing the virtual reality content for defining player observed virtual buttons to substantially coincide with the positions of one or more physical buttons defined on the panel, allocating a virtual reality input function to each virtual button and determining a player's touch at a physical button on the panel for generating an input signal associated with the button and its allocated input function and providing the signal to the transceiver for transmitting to the servers. The viewer controller and/or servers based upon the camera data provide for integration of a view of the physical panel into the virtual reality environment such that the player touching the physical panel obtains the tactile feedback while the camera interprets the touch gesture as an input for controlling one or more features associated with the virtual reality content.
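The determining step, interpreting a camera-detected fingertip coordinate as a touch at a particular physical button, amounts to a point-in-region test against the synchronized button locations. A minimal sketch, with hypothetical names and panel-relative coordinates assumed:

```python
# Sketch: interpreting a detected fingertip coordinate as a press
# of a particular button. Names and coordinates are illustrative.

def hit_test(touch_xy, buttons):
    """Return the input function of the button whose region contains
    the camera-detected fingertip coordinate, else None."""
    tx, ty = touch_xy
    for label, (x, y, w, h) in buttons.items():
        if x <= tx <= x + w and y <= ty <= y + h:
            return label
    return None

# Button regions as (x, y, width, height) in panel-relative units.
buttons = {"SPIN": (0.10, 0.80, 0.15, 0.10),
           "BET_UP": (0.40, 0.80, 0.15, 0.10)}
```

The label returned by the hit test would become the allocated input function carried in the input signal sent to the transceiver.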
Additional aspects of the invention will be apparent to those of ordinary skill in the art in view of the detailed description of various embodiments, which is made with reference to the drawings, a brief description of which is provided below.
While the invention is susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. It should be understood, however, that the invention is not intended to be limited to the particular forms disclosed. Rather, the invention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.
While this invention is susceptible of embodiment in many different forms, there is shown in the drawings and will herein be described in detail preferred embodiments of the invention with the understanding that the present disclosure is to be considered as an exemplification of the principles of the invention and is not intended to limit the broad aspect of the invention to the embodiments illustrated. For purposes of the present detailed description, the singular includes the plural and vice versa (unless specifically disclaimed); the words “and” and “or” shall be both conjunctive and disjunctive; the word “all” means “any and all”; the word “any” means “any and all”; and the word “including” means “including without limitation.”
For purposes of illustrating an embodiment of the invention, it will, unless otherwise indicated, be described with reference to a virtual reality environment for casino games. It should be understood that the invention has utility outside of gaming, in environments having user button or other touch inputs to control an aspect of, respond to queries from, and provide other inputs relevant to the virtual reality experience of the user. For example, button-type inputs where tactile feedback to the user may enhance the experience are found in other computer gaming, publishing, digital photo-processing or other environments susceptible to virtual reality (“VR”) viewing by a viewer.
For purposes of the present detailed description, the terms “wagering game,” “casino wagering game,” “gambling,” “slot game,” “casino game,” and the like include games in which a player places at risk a sum of money or other representation of value, whether or not redeemable for cash, on an event with an uncertain outcome, including without limitation those having some element of skill. In some embodiments, the wagering game involves wagers of real money, as found with typical land-based or online casino games. These types of games are sometimes referred to as pay-to-play (P2P) gaming. In other embodiments, the wagering game additionally, or alternatively, involves wagers of non-cash values, such as virtual currency, and therefore may be considered a social or casual game, such as would be typically available on a social networking web site, other web sites, across computer networks, or applications on mobile devices (e.g., phones, tablets, etc.). These types of games are sometimes referred to as play-for-fun (P4F) gaming. When provided in a social or casual game format, the wagering game may closely resemble a traditional casino game, or it may take another form that more closely resembles other types of social/casual games.
Referring to
The present invention can also apply to wired networks as well where the VRV 10 is connected by a cable to, for example, a game console, personal computer or the like. In a gaming environment such as a casino where gaming supporting VR content is provided to a VRV 10 provided by or to the player, the communication network will typically be wireless.
As suggested in
The VRV 10 may also include a gyroscope, accelerometer, compass and other devices. Modern smart phones often include these devices. As such the VRV 10 can detect movement, direction and speed of movement of the player's head. To provide the player with an immersive VR experience, the VRV 10 may be controlled to alter the VR view of the player as he/she turns their head or looks up or down. The VRV 10 provides signals responsive to detecting such movements to one or more VR rendering sources (discussed below) to alter in real-time the VR view experienced by the player. For example, the player viewing the gaming device video display 20 may turn their head to the right resulting in the VR content streaming to the VRV 10 being altered to show a view of neighboring gaming devices, other people, or other scenery. VR cameras can acquire a 360°, live video image at a location such as in front of a gaming machine.
To provide VR content to the VRV 10, the provider may record the VR environment or a portion thereof to be rendered to the player. For example, and with continuing reference to
As discussed above, it is known to provide for gesture recognition for VRVs such as recognition of hand gestures and finger gestures. In environments where there is no hand-held or hand actuated controller, there are no means, or insufficient means, for a user to have tactile feedback for inputs such as finger touches or finger slides on buttons. In the illustration of
To provide tactile feedback to the player, a physical, communicatively inert, button panel 30 is provided, an example of which is shown at
To coordinate the physical button panel 30 with the generated VR environment viewed by the player such as the gaming machine display of
As can be appreciated, to provide tactile feedback to the player, the player touches the communicatively inert physical button panel 30 as shown at 40 (
The physical button panel 30 can be of any configuration. Where, for example, the physical button is a laser projected button panel projected on a rigid surface, the physical button panel 30 can change based upon the game content being presented. However, the button touches are captured by the camera 14 of the VRV 10 and are not, with respect to the VR content, input via the laser projected button panel. That is, the player may wish to play a hypothetical game of “Queen's Treasure” and may so indicate through their VRV 10. The system would package for delivery to the player the associated VR content and may send a signal through the network to a laser projector to project the corresponding button panel configuration on a rigid surface to define the appropriate physical button panel 30. The VRV 10 captures the laser projected button panel and synchronizes the physical button panel 30 into the VR content for the play of “Queen's Treasure”. The player's touches at the laser projected physical button panel 30 are detected by the camera 14 which interprets the same as an appropriate input.
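The game-selection workflow above, where the system chooses which button-panel configuration to project for the selected game, could be modeled as a simple lookup. The game titles and layouts in this sketch are hypothetical:

```python
# Sketch: selecting a game determines the button-panel layout sent
# to the laser projector. Titles and layouts are illustrative.
PANEL_LAYOUTS = {
    "Queen's Treasure": ["SPIN", "BET_UP", "LINES", "CASH_OUT"],
    "Default": ["SPIN", "CASH_OUT"],
}

def panel_for_game(game_title):
    """Return the button layout to project for the selected game,
    falling back to a default layout if none is registered."""
    return PANEL_LAYOUTS.get(game_title, PANEL_LAYOUTS["Default"])
```

Once projected, the same layout record can drive the synchronization of virtual buttons so the rendered panel matches what the projector drew.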
The acquisition and incorporation of a virtual replica of the physical button panel 30 into the VR content may be through augmented reality in a fashion as described in Lyons, et al U.S. Pat. No. 9,269,219 issued Feb. 23, 2016, published Oct. 24, 2013 and titled “System and Method for Augmented Reality with Complex Augmented Reality Video Images” the disclosure of which is incorporated by reference.
To provide the VR content to the VRV 10 in a casino environment according to an embodiment of the invention the VRV 10 is in communication with a system 300 as illustrated in
To configure the VRV 10 according to an embodiment of the present invention where the VRV 10 is a player's smart phone, CCS 302a may be configured to store downloadable configuration software client applications adapted to be downloaded by the player for configuring their smart phone device to receive and process data as described herein. During download this software client may also return to CCS 302a data such as data related to the smart phone configuration, e.g. display size and resolution, video camera resolution and capabilities, e.g. infrared enabled, processing capabilities and operating system and accessory links to receive data from the smart phone device such as the camera, gyroscope, compass and accelerometer for determining the view direction and movement of the VRV 10.
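The configuration data returned by the software client might take the form of a structured device profile such as the following sketch; the field names are illustrative assumptions, not a defined protocol of the disclosure:

```python
# Sketch of the configuration payload a client application might
# return to the content control server (CCS) on download. Field
# names and example values are illustrative assumptions.

def build_device_profile(phone):
    """Assemble the smart phone capabilities the CCS uses to tailor
    VR content: display, camera and available motion sensors."""
    return {
        "display": {"width_px": phone["width_px"],
                    "height_px": phone["height_px"]},
        "camera": {"resolution_mp": phone["camera_mp"],
                   "infrared": phone.get("infrared", False)},
        "sensors": [s for s in ("gyroscope", "compass", "accelerometer")
                    if phone.get(s, False)],
        "os": phone["os"],
    }

profile = build_device_profile({
    "width_px": 2560, "height_px": 1440, "camera_mp": 12,
    "gyroscope": True, "compass": True, "accelerometer": True,
    "os": "Android 7",
})
```

The sensor list tells the server which view-direction and movement data streams it can expect from the headpiece.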
The smart phone 400 typically includes peripherals such as the camera 14, a gyroscope 418, compass 420 and speaker 422. Other peripherals such as one or more accelerometers may be provided to determine acceleration associated with the movement of the smartphone 400.
To configure the smart phone 400 into the VRV 10 and with reference to
The player accesses the system 300 and during the process confirms their credentials and acquires at 700 the appropriate software client application through a download from the CCS 302a to their smart phone 400 to arrange the controller 402 and various modules 404, 406, 410, 412, 414 at the smart phone 400 for configuration as the controller for the VRV 10 to support the features of this invention. As shown in
At 702 the player launches or initiates the client application to receive VR content. In an embodiment a video instruction may tell the player to move their head such that the VRV 10 camera 14 captures at 704 an image of the physical button panel 30. In an embodiment the controller 402 alone or with processing at the system 300 at 706 synchronizes the view of the physical button panel 30 into the VR content for the gaming device as described with reference to
To provide the player with a platform to play the virtual reality supported game, as shown in
One or more features of the present invention may be provided to a player who is remotely located from a casino enterprise by an iGaming system for either P2P or P4F gaming. That is, a player at home may desire to have a VR gaming experience to play a game for fun wagering virtual credits or, where legal, actually wagering value consideration.
The website 800 also accesses a player-centric iGaming platform level account module 804 at 806 for the player to establish and confirm credentials for play and, where permitted, access an electronic funds account (eWallet) for wagering. The account module may include or access data related to the player profile (player-centric information desired to be retained and tracked by the host), the player's eWallet and deposit and withdrawal records, registration and authentication information such as username and password, name and address information, date of birth, a copy of a government issued identification document such as a driver's license or passport, biometric identification criteria (such as fingerprint or facial recognition data) and a responsible gaming module containing information such as self-imposed (or jurisdictionally imposed) gaming restraints such as loss limits, daily limits and duration limits. The account module 804 may also contain and enforce geo-location limits such as geographic areas where the player may play P2P games, user device IP address confirmation and the like.
The account module 804 communicates at 805 with a game module 816 for completing log-ins, registrations and other activities. The game module 816 may also store or access a player's gaming history such as player tracking and loyalty club account information. The game module 816 may provide static web pages to the VRV 10 through line 818 whereas, as stated above, the live VR content is provided from the gaming server 814 to the web game client through line 811.
The VR game server 814 is configured to provide interaction between the game and the player such as receiving wager information, game selection, button interaction gesture recognition, inter-game player selections or choices to play a game to its conclusion, as well as the random selection of game outcomes and graphics packages which, alone or in conjunction with the downloadable game client 808/web game client 802 and game module 816, provide for the display of game graphics and player interactive interfaces. At 818 player account and log-in information is provided to the gaming server 814 from the account module 804 to enable gaming. 820 provides wager/credit information between the account module 804 and gaming server 814 for the play of the game and may display credits/eWallet availability. 822 provides player tracking information for the gaming server 814 for tracking the player's play. The tracking of play may be used for purposes of providing loyalty rewards to a player, determining preferences and the like.
All or portions of the features of
In a further embodiment where a player at a physical gaming machine would like to continue gaming elsewhere in the casino in a VR environment, the player may elect to move the game being played for play using the VRV 10 at another location such as a player station 600 in a bar or restaurant. This may be advantageous where, for example, the casino venue is limited to a number of gaming machines. The player using their smart phone 400 would go through the steps to transfer the game to the mobile device such as disclosed in Hedrick et al, US Pub App 2015/0228153A1 published Aug. 13, 2015 and titled “System and Method for Remote Control Gaming Sessions Using a Mobile Device” the disclosure of which is incorporated by reference. The system 300 recognizes the request to transfer and thereafter moves the game experience to a VR experience as described above.
The acquisition of the physical button panel 30 for integration into the VR content may be through augmented reality technology as described in Lyons, et al U.S. Pat. No. 8,469,260 issued Jun. 25, 2013 and titled “System and Method for Assisted Maintenance in a Gaming Machine Using a Mobile Device” the disclosure of which is incorporated by reference. The player with the VRV 10 camera 14 acquires a video of the physical button panel 30 and in an embodiment the bar code 34. The controller 402 and/or system 300 receive the video data and use that information to overlay function graphics for the buttons.
A generic physical input device other than a button panel may take the form of a compressible ball or a cube or other multi-faceted object that fits in the player's hand. For example, the object may be constructed of foam or rubber. The object can be squeezed and released, acting as a button, when the camera 14 of VRV 10 detects the player's hand so acting on the object. In some embodiments, the blank object (as illustrated in
The above examples of buttons and a button panel may be extended to any number of tangible physical objects which are also within the scope of the various embodiments of the invention. One example of a VR game which may be made available in accordance with one or more embodiments of a system as described by
As described above with respect to buttons, the camera 14 of VRV 10 acquires a video of each physical object and its orientation on the table or in each player's hands. The controller 402 and/or system 300 receive the video data and use that information to overlay values on the cards and chips, position an avatar of each player around a virtual table and mimic their movements, etc. The values of the playing cards do not matter, nor does the color of the chips, the size of the table, etc. The inclusion of the physical objects in play of the game provides individualized tactile feedback to each player while playing a virtual game presented on the VRV 10. Once the objects are detected, the system overlays all relevant markings, such as backs and rank and suit, on the cards and colors or values on the chips according to their orientation in physical space. For example, if a card is face up, its face is shown. If not, its back is shown. Similarly, if a player “peeks” at his physical cards by physically lifting up a corner, tucks his cards under his chips to signal “staying” in Blackjack, moves one or more chips into a betting circle or the like, these actions will be represented in the virtual world via that player's presentation on the VRV 10 and also in the virtual worlds of any other players of the game.
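The overlay logic described above, rendering the engine-assigned face when a physical card is detected face up and a generic back otherwise, can be sketched as follows; the data layout and rank assignments are hypothetical:

```python
# Sketch: choosing the overlay to render for each physical card
# based on its detected orientation. The engine-assigned ranks and
# the dictionary layout are illustrative assumptions.

def card_overlay(card):
    """Show the engine-assigned face when the physical card is face
    up; otherwise show a generic back."""
    if card["face_up"]:
        return f"{card['rank']}{card['suit']}"
    return "BACK"

# A two-card hand: one face up, one face down on the table.
hand = [{"rank": "A", "suit": "S", "face_up": True},
        {"rank": "K", "suit": "H", "face_up": False}]
overlays = [card_overlay(c) for c in hand]
```

Because the markings are overlaid, any physical deck serves; the game engine, not the printed faces, determines what each player sees.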
In accordance with one or more embodiments, system 300 can also detect if any of the required objects is missing and suspend game play until all required objects are provided and ready for use. Similarly, some embodiments may require the placement of certain objects in certain locations in order for game play to start or continue. For example, the game may direct a player to place his two playing cards in a space marked by a rectangle or to place one of his chips in an ante circle depicted by VRV 10.
In accordance with still other embodiments, a single die or two or more dice may be used. Again, the player has dice he can physically hold, shake and throw in order to provide tactile feedback to his VR game. The VRV 10 tracks the dice on a tabletop or floor and represents their location on its display so they can be picked up again by the player. As with the card example above, when the player throws the dice, the face that actually lands upright is irrelevant as the image provided in the virtual world will show the outcome determined by the game engine. In accordance with some embodiments, to avoid having to track the dice and have them be picked up by the player, they may be kept in a sealed cup. When it is time to roll the dice, the player can still shake and feel the dice in the cup, but when the player makes a throwing motion, the virtual dice appear thrown while the physical dice remain in the cup. The cup is next used when it is time to throw the dice again.
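The sealed-cup arrangement separates the tactile event from the game outcome: the camera detects the throwing motion, and the faces shown in VR come from the game engine rather than the physical dice. A minimal sketch with an illustrative function name:

```python
# Sketch: engine-determined dice faces rendered when the camera
# detects a throwing motion; the physical dice stay in the cup.
import random

def throw_dice(rng, count=2):
    """Return engine-determined faces to render as virtual dice
    when the player's throwing gesture is recognized."""
    return [rng.randint(1, 6) for _ in range(count)]

# Seeded generator stands in for the game engine's RNG.
faces = throw_dice(random.Random(7))
```

In a regulated gaming context the outcome would of course come from the certified game-engine RNG, not a client-side generator; the point is only that the rendered faces are decoupled from the physical dice.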
In accordance with one or more embodiments, a floor space may become a source of feedback for the player. If a player has an open floor space available in a room, camera 14 of VRV 10 captures an image of the space and determines its size in order to determine a scale usable in the virtual scene. The floor space then becomes akin to a touchscreen surface and the location of the player's feet within the space determines where a “touch” occurs. For example, in a game of virtual roulette, the player may not be sitting at a table but, instead, be represented as an avatar in a large virtual world who can walk around on the betting layout, placing wagers or issuing commands with his feet by stepping on virtual buttons portrayed by the VRV 10. A selection or wagering action, for example, may occur if the player jumps up and down, taps his foot, etc. In these cases, the input signal includes not only an activation signal, but position information, such as x-, y- and z-coordinates, as well, all of which may be combined by the system in evaluating the nature of the input.
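Mapping the player's foot position within the captured floor space onto the virtual betting layout is analogous to scaling a touchscreen coordinate. A sketch under assumed dimensions, with illustrative names:

```python
# Sketch: scaling a foot position in the captured floor space onto
# the virtual betting layout, like a touch on a touchscreen.
# Dimensions and names are illustrative assumptions.

def floor_to_layout(foot_xy, floor_size_m, layout_size_px):
    """Scale a foot position (meters from the floor-space origin)
    into layout pixel coordinates."""
    fx, fy = foot_xy
    fw, fh = floor_size_m
    lw, lh = layout_size_px
    return (fx / fw * lw, fy / fh * lh)

# A foot at (1.0 m, 0.5 m) in a 2 m x 2 m space, 800 x 600 px layout.
touch = floor_to_layout((1.0, 0.5), (2.0, 2.0), (800, 600))
```

A height coordinate from the camera could be appended to form the x-, y- and z-position information mentioned above, so a jump or foot tap is distinguishable from a step.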
Similarly, in accordance with still other embodiments, a surface such as a blank tabletop provided by the player can become the scene for a 3D world portrayed by the VRV 10. The player can walk around the edges of the table and see the virtual world or game from different perspectives. In accordance with still other embodiments, the tabletop may also serve as a touchscreen over which the player can “walk” around the surface 1000 of the game space with his fingers, as shown in the accompanying figure.
Alternately, the player may stay in place and, by using hand gestures or pressing virtual controls on the surface of the table, rotate or otherwise alter the presentation of the table in order to view it from different angles. In some embodiments, a haptic feedback pad may be placed on the table to provide additional feedback when dice, cards, chips and the like hit the table.
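The stay-in-place alternative above amounts to rotating the rendered scene rather than the player. A minimal sketch of that rotation, under hypothetical names (`TabletopScene`, `rotate`, `world_to_table`), is:

```python
import math


class TabletopScene:
    """Sketch: the tabletop hosts the 3D game space; instead of walking
    around the table, the player rotates the presentation via a hand
    gesture or a virtual control rendered on the table surface."""

    def __init__(self):
        self.rotation_deg = 0.0

    def rotate(self, delta_deg):
        # Invoked when the viewer recognizes the rotation gesture.
        self.rotation_deg = (self.rotation_deg + delta_deg) % 360.0

    def world_to_table(self, x, y):
        # Rotate a scene point into the current table-relative view.
        r = math.radians(self.rotation_deg)
        return (x * math.cos(r) - y * math.sin(r),
                x * math.sin(r) + y * math.cos(r))


scene = TabletopScene()
scene.rotate(90)
px, py = scene.world_to_table(1.0, 0.0)
```

After a 90° rotation, a point at (1, 0) in the scene appears at (0, 1) relative to the table, i.e. the player now sees it from the adjacent side without having moved.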
A VR game may use existing buttons and controls on an existing device such as a gaming machine. At various points in the VR game, more controls may be required than the existing device provides, and the game may ask a player to dynamically assign objects he can both feel and see in the rendered scene as the new controls. For example, an extra button may be required. In accordance with one or more embodiments, the game may ask the player to select a visible object that he can also feel for use as the button or control. The VRV 10 may display portions of the player's body visible to camera 14, such as his wrist, and the player may select the face of his wristwatch as the additional control. During the game, any time the player touches the face of his wristwatch, the control is activated. Similarly, various surfaces on the gaming machine itself may be selected. In another non-limiting example, the player may elect to use the center top edge of gaming machine cabinet 502 as a control. Again, during the game, any time the camera 14 of VRV 10 detects the player touching the center top edge of the gaming machine cabinet, the control is activated.
In accordance with some embodiments, the VR game may assign certain unused spaces on a physical device such as a physical button panel 602 or gaming machine cabinet 502 and, using augmented reality, overlay the new control in that space. When the player touches the overlaid control, the underlying surface provides tactile feedback that the additional control has been touched.
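The dynamic control assignment described in the two paragraphs above can be sketched as a registry mapping camera-tracked surface regions to game actions. This is an illustrative sketch; the region identifiers and class name (`DynamicControlRegistry`) are assumptions, and the camera-side region tracking is abstracted away.

```python
class DynamicControlRegistry:
    """Sketch: the player assigns a visible, touchable object (a wristwatch
    face, a cabinet edge, an unused panel space) as an extra control; the
    underlying surface supplies the tactile feedback when touched."""

    def __init__(self):
        self._controls = {}  # tracked region id -> bound game action

    def assign(self, region_id, action):
        # region_id identifies the surface region the player selected.
        self._controls[region_id] = action

    def on_touch_detected(self, region_id):
        # Called when camera 14 sees the player touch a tracked region.
        action = self._controls.get(region_id)
        if action is None:
            return False
        action()
        return True


events = []
registry = DynamicControlRegistry()
registry.assign("wristwatch_face", lambda: events.append("extra_button_pressed"))
registry.on_touch_detected("wristwatch_face")
```

Touching an unassigned surface simply produces no action, so stray contacts with untracked objects are ignored.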
At block 1110 of the illustrated process, the physical object to be used for tactile feedback is determined.
In accordance with some embodiments, the viewer includes position sensors to detect when the user's field of view includes the physical object. In these cases, the optional step of generating an augmented image of the physical object to, from the user's viewpoint, overlay one or more images on a live image of the physical object, is performed at block 1120.
At block 1130, since the physical object does not itself provide any signal to the one or more servers responsive to a touch by the user, a virtual reality system input function and a signal corresponding to a detected touch of the object are assigned.
At block 1140, the camera captures real-time image data corresponding to the user's touch of the physical object determined in block 1110 and the image data is sent to and received by the system's server(s) for processing at block 1150.
At block 1150, the user's touch of the physical object is synchronized with a generated virtual reality image corresponding to the touch of the object to provide visual feedback to the user. As noted above, tactile feedback to the user is provided by the physical object itself.
Finally, at block 1160, the signal assigned in block 1130 is provided to the transceiver and sent to the server.
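The sequence of blocks 1110 through 1160 can be sketched end to end as follows. The class and method names are illustrative placeholders, and the server is modeled as a simple log; the sketch only shows the ordering and data flow of the blocks described above.

```python
class TouchFeedbackPipeline:
    """Sketch of the process blocks above: identify the object (1110),
    optionally overlay augmented imagery (1120), assign an input function
    and signal (1130), capture and send touch image data (1140-1150),
    synchronize visual feedback and emit the signal (1150-1160)."""

    def __init__(self, server_log):
        self.server_log = server_log
        self.assigned_signal = None

    def identify_object(self):                        # block 1110
        return "physical_button_panel"

    def overlay_augmented_image(self, obj):           # block 1120 (optional)
        return f"augmented:{obj}"

    def assign_input_function(self, obj):             # block 1130
        self.assigned_signal = f"touch:{obj}"

    def capture_and_send(self, obj):                  # blocks 1140-1150
        frame = {"object": obj, "touched": True}
        self.server_log.append(("image_data", frame))
        return frame

    def synchronize_and_signal(self, frame):          # blocks 1150-1160
        # Visual feedback is synchronized with the detected touch; the
        # tactile feedback comes from the physical object itself.
        self.server_log.append(("signal", self.assigned_signal))


server_log = []
pipeline = TouchFeedbackPipeline(server_log)
obj = pipeline.identify_object()
pipeline.assign_input_function(obj)
frame = pipeline.capture_and_send(obj)
pipeline.synchronize_and_signal(frame)
```

Running the sketch leaves the server log with the image data followed by the assigned touch signal, mirroring the order of blocks 1140 through 1160.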
The order of actions shown is illustrative and not intended to be limiting.
While the above invention has been described with reference to a gaming environment, it has applications to VR users in other environments where touch feedback would be advantageous. For example, at home, a user may want to engage in online banking or other eCommerce activity. The user could print a physical button panel and acquire the client application for providing the VR environment. The user could then virtually walk through a store or mall and use the buttons, supported by tactile feedback, to make selections.
Each of these embodiments and obvious variations thereof is contemplated as falling within the spirit and scope of the claimed invention, which is set forth in the following claims. Moreover, the present concepts expressly include any and all combinations and subcombinations of the preceding elements and aspects.
This application is a non-provisional application of U.S. Provisional Application 62/323,301 filed Apr. 15, 2016, hereby incorporated by reference in its entirety for all purposes.
Number | Date | Country
---|---|---
62323301 | Apr 2016 | US