The present disclosure relates to a platform for virtual reality movement.
In the past, virtual reality (VR) systems lacked abilities that would allow natural human movement, such as walking, jogging, running, etc., and navigation of a VR environment was typically experienced via pointing, gesturing, or manipulating a joystick, a trackball, or a mouse. Moreover, some VR environments simulate human movement while a user sits or stands. Furthermore, there is a growing demand for exercise machines that are entertaining, enhancing a physical exertion experience through three-dimensional visual graphics. Further, multi-directional, virtually unrestricted movement within a VR environment while within a physically restricted space has also been desirable.
In one or more embodiments, one or more systems may include a first platform that has a top surface, which is substantially planar and configured to rotate and support a user, and the one or more systems may provide first images to the user; determine a rotation of a head of the user; provide second images, based at least on the rotation, to the user; and rotate the first platform in accordance with the rotation.
For a more complete understanding of the present invention and its features and advantages, reference is now made to the following description, taken in conjunction with the accompanying drawings, in which:
In the following description, details are set forth by way of example to facilitate discussion of the disclosed subject matter. It should be apparent to a person of ordinary skill in the field, however, that the disclosed embodiments are exemplary and not exhaustive of all possible embodiments.
As used herein, a reference numeral followed by a letter refers to a specific instance of an element and the numeral only form of the reference numeral refers to the collective element. Thus, for example, device ‘12A’ refers to an instance of a device class, which may be referred to collectively as devices ‘12’ and any one of which may be referred to generically as a device ‘12’.
In one or more embodiments, a virtual reality (VR) environment may be or include a computer-generated illusion that allows and/or permits a user (e.g., a person) to experience and/or interact with an artificially created world and/or a simulation of an environment. For example, a VR environment may stimulate naturally occurring senses such as one or more of sight, sound, touch, and movement. In one or more embodiments, the user may perform one or more actions, which may be translated by a computer system into one or more actions that affect the VR environment. In one or more embodiments, one or more systems, methods, and/or processes described herein may allow and/or permit the user to wear a VR head mounted display (HMD) that may allow and/or permit the person to move around in a VR environment. For example, the person may walk or run in the VR environment.
In one or more embodiments, one or more systems, methods, and/or processes described herein may include a virtual reality movement system (VRMS) that may include multiple planes. For example, the VRMS may include a first platform that may include a first plane and may include a second platform that may include a second plane that may rotate on the first plane. For instance, when the user moves the VR HMD left or right, the second plane may rotate on the first plane left or right, respectively, in accordance with rotational movement of the VR HMD. In one or more embodiments, the VRMS may provide virtual infinite movement (e.g., walking, jogging, running, etc.) on the first platform. For example, the VRMS may include a movement belt, which may permit the user to walk, jog, run, etc. For instance, the movement belt may be similar to a treadmill. In one or more embodiments, forward and/or backward movement may be determined utilizing the movement belt. For example, sensors within and/or coupled to the movement belt may determine if the user is moving. In one instance, the sensors within and/or coupled to the movement belt may determine if the user is moving forward. In another instance, the sensors within and/or coupled to the movement belt may determine if the user is moving backward.
In one or more embodiments, the first platform may include the movement belt, and the movement belt may be rotated when there is rotational movement to a left or a right direction for the VR HMD. For example, the VR HMD may include one or more sensors that may sense and/or determine rotational movement. In one or more embodiments, the movement belt may move (e.g., forward or backward) once the user moves. In one example, the VRMS may include one or more sensors that may determine movement of one or more feet of the user. In another example, the VRMS may include one or more sensors that may determine movement of one or two legs of the user. In one or more embodiments, the VRMS may include one or more pillars and/or rails that may aid the user in maintaining balance before and/or after a movement is commenced. In one example, a rail may be parallel to the first platform. In another example, a pillar may be perpendicular to the first platform. In one or more embodiments, one or more pillars and/or rails may include one or more controller devices. For example, the one or more controller devices may permit the user to control the VR HMD and/or the VRMS. For instance, the user may actuate the one or more controller devices. In one or more embodiments, the VRMS may include an embedded game pad. For example, the embedded game pad may receive input from the user. For instance, the user may “click on” one or more options via the game pad to control the VR HMD, the VRMS, the VR environment, a game, and/or a simulation, among others.
In one or more embodiments, the VRMS and the VR HMD may allow and/or permit the user to experience a three-dimensional VR environment. For example, the user may walk through a virtual area (e.g., a virtual walking tour). In one instance, when a VR recording of a path is available, the user may be able to walk along the recorded path. In another instance, a tourist (e.g., the user) may utilize the VRMS to explore one or more three hundred and sixty (360) degree views of an area (e.g., an area of a city, an area of a country, etc.) utilizing pre-recorded VR paths and/or routes, among others.
Turning now to
As shown, VRMS 100 may include a platform 106 that may rotate upon a platform 108. In one or more embodiments, platform 106 may rotate upon platform 108 when user 102 rotates his or her head. For example, platform 106 may rotate upon platform 108 as indicated via the dashed arc, while platform 108 may stay fixed. In one instance, when user 102 rotates his or her head left, platform 106 may rotate left upon platform 108. In another instance, when user 102 rotates his or her head right, platform 106 may rotate right upon platform 108. In one or more embodiments, VR HMD 104 may include one or more sensors that may be utilized in determining a rotation of the head of user 102. In one example, the one or more sensors may include one or more gyroscopes. In another example, the one or more sensors may include one or more accelerometers.
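For illustration only, a head rotation of the kind sensed above might be derived by integrating gyroscope yaw-rate samples over time; the function name, units, and fixed sampling interval in this sketch are assumptions and are not part of the disclosure.

```python
def yaw_from_gyro(yaw_rates_deg_per_s, dt_s):
    """Integrate gyroscope yaw-rate samples (degrees/second), taken at a
    fixed interval dt_s seconds, into a net head rotation in degrees.
    Positive values indicate rightward rotation, negative leftward."""
    return sum(rate * dt_s for rate in yaw_rates_deg_per_s)

# A half-second of 100 Hz samples, each reporting 90 deg/s rightward,
# integrates to a net rotation of approximately 45 degrees.
rotation = yaw_from_gyro([90.0] * 50, 0.01)
```

An accelerometer could supplement such an estimate (e.g., for drift correction), but a minimal sketch using the gyroscope alone suffices to show the idea.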
As illustrated, VRMS 100 may include pillars 110 and rails 112. In one or more embodiments, pillars 110 and rails 112 may aid user 102 in maintaining balance. For example, user 102 may grasp one or more of rails 112 to maintain balance. In one or more embodiments, one or more of rails 112 may include one or more controller devices. In one example, the one or more controller devices may permit user 102 to control one or more of VR HMD 104 and VRMS 100, among others. In one instance, the one or more controller devices may permit user 102 to control one or more rotations of platform 106 upon platform 108. In another instance, the one or more controller devices may include an emergency brake that may be actuated by user 102. In another example, the one or more controller devices may include a game pad. For instance, the game pad may permit user 102 to control one or more of VR HMD 104 and VRMS 100, among others.
In one or more embodiments, pillars 110 and/or rails 112 may be removable and/or optional. For example,
As shown, VRMS 100 may include a movement belt 114. In one or more embodiments, movement belt 114 may permit user 102 to walk and/or run. For example, movement belt 114 may be similar to a treadmill. In one instance, movement belt 114 may include a top layer that may include rubber, natural rubber, fabric reinforced rubber, etc. In a second instance, a surface of platform 106 may be planar or substantially planar, and movement belt 114 may be planar or substantially planar and may be included in platform 106 via the top surface of platform 106. In a third instance, travel in a VR of user 102 may not be limited via utilization of movement belt 114. In another instance, movement belt 114 may permit a simulated infinite travel of user 102. In one or more embodiments, forward and/or backward movement may be determined utilizing movement belt 114. For example, sensors within and/or coupled to movement belt 114 may determine if user 102 is moving. In one instance, the sensors within and/or coupled to movement belt 114 may determine if user 102 is moving forward. In a second instance, the sensors within and/or coupled to the movement belt may determine if user 102 is moving backward. In another instance, movement belt 114 may include one or more pressure sensors and/or one or more torque sensors that may sense and/or determine if user 102 is moving and/or information associated with movements of user 102 (e.g., a speed of movement, a direction of movement, an acceleration of movement, etc.). In one or more embodiments, movement belt 114 may move (e.g., forward or backward) once user 102 moves.
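As a rough sketch of how pressure sensors coupled to the movement belt might distinguish forward from backward movement, the following compares hypothetical front and rear pressure readings; the sensor arrangement, units, and threshold are assumptions, not details of the disclosure.

```python
def belt_direction(front_pressure, rear_pressure, threshold=5.0):
    """Infer a movement direction from the difference between pressure
    readings (arbitrary units) near the front and rear of the belt.
    Returns 'forward', 'backward', or 'stationary'."""
    delta = front_pressure - rear_pressure
    if delta > threshold:
        return "forward"
    if delta < -threshold:
        return "backward"
    return "stationary"

belt_direction(20.0, 5.0)   # weight shifted forward -> 'forward'
belt_direction(10.0, 9.0)   # within threshold -> 'stationary'
```

The threshold guards against noise in the readings; a real system would likely filter the signals over time as well.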
As illustrated in
In one or more embodiments, one or more of sensors 116 and 118 may include one or more sound producing devices and/or one or more sound receiving devices. For example, the one or more sound receiving devices may receive sounds produced by the one or more sound producing devices, and one or more distances, positions, and/or motions of a leg and/or a foot of user 102 may be determined from sound interruption and/or sound reflection. For instance, one or more of sensors 116 and 118 may include one or more sonar devices. In one or more embodiments, one or more of sensors 116 and 118 may be utilized in determining one or more gestures of user 102. In one example, user 102 may utilize one or more gestures to control VRMS 100. In another example, user 102 may utilize one or more gestures to control and/or interact with a VR environment.
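The sonar-style sensing mentioned above can be illustrated with a standard time-of-flight calculation; this sketch assumes a pulse-echo arrangement and a nominal speed of sound, neither of which is specified in the disclosure.

```python
def echo_distance_m(round_trip_s, speed_of_sound_m_per_s=343.0):
    """Estimate the distance to a leg or foot from the round-trip time
    (seconds) of an emitted sound pulse. The pulse travels to the
    target and back, hence the division by 2."""
    return speed_of_sound_m_per_s * round_trip_s / 2.0

echo_distance_m(0.002)  # a 2 ms round trip corresponds to ~0.343 m
```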
As illustrated in
As illustrated, user 102C may wear a device 126. In one or more embodiments, device 126 may monitor and/or measure one or more biometrics of user 102. For example, device 126 may monitor and/or measure one or more of a heart rate, a blood pressure, and an oxygen saturation. For instance, device 126 may include one or more of a light-emitting diode (LED) and a photodiode, among others, which may be utilized in monitoring and/or measuring one or more biometrics of user 102. In one or more embodiments, device 126 may include a form of a watch and/or bracelet. As illustrated, user 102D may wear a chest belt 128. In one or more embodiments, chest belt 128 may monitor and/or measure one or more biometrics of user 102D. For example, chest belt 128 may monitor and/or measure one or more of a heart rate, a blood pressure, and an oxygen saturation of user 102D. For instance, chest belt 128 may include one or more of an LED and a photodiode, among others, which may be utilized in monitoring and/or measuring one or more biometrics of user 102D.
In one or more embodiments, one or more of glove 120, controller 124, device 126, and chest belt 128 may communicate with VRMS 100 in a wireless fashion. For example, one or more of glove 120, controller 124, device 126, and chest belt 128 may receive data from VRMS 100 in a wireless fashion and/or may provide data to VRMS 100 in a wireless fashion.
As illustrated in
As illustrated in
As illustrated in
As illustrated in
Turning now to
In one or more embodiments, communication device 206 may be configured to communicate with another system. For example, communication device 206 may be configured to communicate data with VRMS 100. In one instance, communication device 206 may be configured to communicate data with VRMS 100 in a wired fashion and/or a wireless fashion. In another instance, communication device 206 may enable and/or permit VR HMD 104 to be wirelessly coupled to VRMS 100 and/or one or more portions of VRMS 100. In one or more embodiments, communication device 206 may be configured to communicate, in a wireless fashion, with one or more other systems and/or one or more other devices utilizing a radio frequency (RF) carrier and/or an optical carrier. In one example, communication device 206 may be configured to communicate, in a wireless fashion, with one or more other systems and/or one or more other devices utilizing a licensed and/or an unlicensed RF spectrum. For instance, the unlicensed spectrum may include one or more industrial, scientific, and medical (ISM) RF bands. In a second example, communication device 206 may be configured to communicate, in a wireless fashion, with one or more other systems and/or one or more other devices utilizing wireless Ethernet (e.g., based on IEEE 802.11), Bluetooth (e.g., based on IEEE 802.15), IEEE 802.15.4, and/or ZigBee, among others. In another example, communication device 206 may be configured to communicate, in a wireless fashion, with one or more other systems and/or one or more other devices utilizing infrared (IR) signaling. In one or more embodiments, communication device 206 may include multiple communication devices configured to communicate with multiple systems and/or multiple networks. In one example, the multiple systems may be or include multiple different systems. In another example, the multiple networks may be or include multiple different networks.
In one or more embodiments, sound output devices 208 may be configured to produce one or more sounds that user 102 may hear. For example, sound output devices 208 may include one or more speakers. In one or more embodiments, sound input device 210 may be configured to receive one or more sounds. For example, sound input device 210 may include a microphone that may transform one or more sounds into one or more electrical signals (e.g., voltage signals, current signals, etc.). In one or more embodiments, optical output devices 212 may be configured to produce images and/or videos. For example, optical output devices 212 may include one or more displays that may produce images and/or videos viewable by user 102.
In one or more embodiments, user input device 214 may be configured to receive input from user 102. In one example, user input device 214 may include one or more buttons and/or switches that may be actuated by user 102. In a second example, user input device 214 may include one or more dials that may be utilized by user 102. In another example, user input device 214 may include one or more capacitive sensing devices. In one instance, the one or more capacitive sensing devices may include one or more capacitive sensing buttons. In another instance, the one or more capacitive sensing devices may include one or more sliders (e.g., an area that may detect sliding of a finger of user 102, etc.).
As shown, VRMS 100 may include a processor 222. In one example, processor 222 may include a microprocessor, a system on a chip (SoC), or in general, any type of circuit that includes processing logic that executes instructions from a memory medium in implementing one or more systems, methods, and/or processes described herein. In another example, processor 222 may include a field-programmable gate array (FPGA) and/or an application-specific integrated circuit (ASIC), which may be configured to implement one or more systems, methods, and/or processes described herein. As illustrated, VRMS 100 may include a memory medium 224, a communication device 226, a sound output device 230, a sound input device 232, an optical output device 234, a user input device 236, and an optical sensor device 238, which may be communicatively coupled to processor 222. In one or more embodiments, memory medium 224 may include data and/or instructions, executable by processor 222, to implement one or more methods, processes, and/or systems described herein. For example, memory medium 224 may include one or more volatile memory media and/or one or more non-volatile memory media.
In one or more embodiments, communication device 226 may be configured to communicate with another system. For example, communication device 226 may be configured to communicate data with VR HMD 104. For instance, communication device 226 may be configured to communicate data with VR HMD 104 in a wired fashion and/or a wireless fashion. In one or more embodiments, communication device 226 may be configured to communicate, in a wireless fashion, with one or more other systems and/or one or more other devices utilizing a RF carrier and/or an optical carrier. In one example, communication device 226 may be configured to communicate, in a wireless fashion, with one or more other systems and/or one or more other devices utilizing a licensed and/or an unlicensed RF spectrum. For instance, the unlicensed spectrum may include one or more ISM RF bands. In a second example, communication device 226 may be configured to communicate, in a wireless fashion, with one or more other systems and/or one or more other devices utilizing wireless Ethernet (e.g., based on IEEE 802.11), Bluetooth (e.g., based on IEEE 802.15), IEEE 802.15.4, and/or ZigBee, among others. In another example, communication device 226 may be configured to communicate, in a wireless fashion, with one or more other systems and/or one or more other devices utilizing IR signaling. In one or more embodiments, communication device 226 may be configured to communicate with one or more devices and/or computer systems via a network. For example, the network may include and/or be coupled to a local area network (LAN), a wide area network (WAN), an Internet, a public switched telephone network (PSTN), a cellular telephone network, a satellite telephone network, or a combination of the foregoing, among others. For instance, the WAN may be or include a private WAN, a corporate WAN, and/or a public WAN, among others.
In one or more embodiments, communication device 226 may include multiple communication devices configured to communicate with multiple systems and/or multiple networks. In one example, the multiple systems may be or include multiple different systems. In another example, the multiple networks may be or include multiple different networks.
In one or more embodiments, physical sensor device 228 may include one or more of a pressure sensor, a torque sensor, a micro-electro-mechanical systems (MEMS) sensor, an accelerometer, a magnetometer, and a gyroscope, among others. For example, physical sensor device 228 may sense and/or determine one or more interactions of user 102 with VRMS 100. In one instance, pressure sensors 182 may be or include physical sensor device 228, which may provide pressure data to processor 222. In another instance, movement sensing device 184 may be or include physical sensor device 228, which may provide movement data to processor 222. In one or more embodiments, sound output devices 230 may be configured to produce one or more sounds that user 102 may hear. For example, sound output devices 230 may include one or more speakers. In one or more embodiments, sound input device 232 may be configured to receive one or more sounds. For example, sound input device 232 may include a microphone that may transform one or more sounds into one or more electrical signals (e.g., voltage signals, current signals, etc.).
In one or more embodiments, optical output device 234 may be configured to produce images and/or videos. For example, optical output device 234 may include one or more displays that may produce images and/or videos viewable by user 102. In one or more embodiments, optical output device 234 may be configured to convey information to user 102 in an optical fashion. For example, optical output device 234 may include one or more light emitting diodes (LEDs) that may convey information to user 102 in an optical fashion. For instance, the one or more LEDs may convey information such as a functionality of VRMS 100, an error with VRMS 100, etc.
In one or more embodiments, user input device 236 may be configured to receive input from user 102. In one example, user input device 236 may include one or more buttons and/or switches that may be actuated by user 102. In a second example, user input device 236 may include one or more dials that may be utilized by user 102. In a third example, user input device 236 may include one or more capacitive sensing devices. In one instance, the one or more capacitive sensing devices may include one or more capacitive sensing buttons. In another instance, the one or more capacitive sensing devices may include one or more sliders (e.g., an area that may detect sliding of a finger of user 102, etc.). In another example, user input device 236 may include a game pad. In one or more embodiments, optical sensor device 238 may be configured to determine movements associated with user 102. For example, optical sensor device 238 may include one or more of sensors 116 and 118, among others.
Turning now to
In one or more embodiments, user 102 may have multiple dimensions of freedom when using VRMS 100 and VR environment 300. In one example, the multiple dimensions of freedom may include movement forwards, movement backwards, movement leftwards, and/or movement rightwards, among others. In another example, the multiple dimensions of freedom may include jumping, jogging, and/or running, among others. In one or more embodiments, user 102 may utilize a virtual tool (e.g., a gamepad, glove 120, controller 124, etc.) to move and/or manipulate one or more objects in three dimensions. In one or more embodiments, sensor data may be applied in VRMS 100. For example, a sensor (e.g., device 126, chest belt 128, etc.) may monitor and/or determine a heart rate of user 102, and VRMS 100 and/or VR HMD 104 may display different views of one or more images with respect to the heart rate of user 102. In one or more embodiments, user 102 may have a three hundred and sixty (360) degree view, utilizing VRMS 100, along path 320 and/or proximate to path 320 utilizing the multiple dimensions of freedom.
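Heart-rate-dependent display, as mentioned above, might reduce to mapping a measured rate onto a normalized parameter that the VR application consumes; the mapping below, its name, and its bounds are illustrative assumptions, since the disclosure does not specify how views vary with heart rate.

```python
def view_intensity(heart_rate_bpm, resting_bpm=60, max_bpm=180):
    """Map a measured heart rate to a normalized intensity in [0, 1]
    that a VR application could use to vary a displayed view
    (e.g., softer imagery as exertion rises). Values are clamped."""
    level = (heart_rate_bpm - resting_bpm) / (max_bpm - resting_bpm)
    return min(1.0, max(0.0, level))

view_intensity(120)  # midway between resting and maximum -> 0.5
```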
Turning now to
If the rotation is not greater than zero (0) degrees, the method may proceed to 405. If the rotation is greater than zero (0) degrees, a platform may be rotated by a number of degrees of rotation, at 415. For example, VRMS 100 may rotate platform 106 by a number of degrees of rotation indicated by the rotation data. In one or more embodiments, the rotation data may indicate a direction of rotation. In one example, a positive number of degrees may indicate a rightward rotation by the number of degrees. In a second example, a negative number of degrees may indicate a leftward rotation by the number of degrees. In another example, the rotation data may include one or more numbers and/or characters that indicate leftward rotation or rightward rotation. In one instance, the rotation data may include “left” or “right”. In another instance, the rotation data may include “L” or “R”. In one or more embodiments, VRMS 100 may rotate platform 106 leftward or rightward, in accordance with a direction of rotation indicated by the rotation data.
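The rotation-data encodings described above (a signed number of degrees, or a label such as "L"/"R" with a magnitude) can be normalized into one internal form; this sketch is a plausible reading of the paragraph, and the function name and return shape are assumptions.

```python
def rotation_command(rotation_data):
    """Normalize rotation data into a (direction, degrees) pair.
    Accepts a signed number (positive = rightward, negative = leftward)
    or a (label, degrees) pair where the label is 'L'/'left' or
    'R'/'right'."""
    if isinstance(rotation_data, (int, float)):
        direction = "right" if rotation_data >= 0 else "left"
        return direction, abs(rotation_data)
    label, degrees = rotation_data
    direction = "left" if str(label).lower().startswith("l") else "right"
    return direction, degrees

rotation_command(-30)        # ('left', 30)
rotation_command(("R", 15))  # ('right', 15)
```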
At 420, a rotation confirmation may be sent to a VR application. For example, the VR application may provide a VR environment for user 102. For instance, VRMS 100 may send the rotation confirmation to the VR application. In one or more embodiments, the VR application may be executed via a computer system. In one example, the computer system may be included in a cloud environment. In another example, VRMS 100 may include the computer system. In one or more embodiments, the VR application may include a game and/or a simulation.
Turning now to
If the VRMS is not operational, operations may cease at 515. If the VRMS is operational, speed data may be received at 520. For example, speed data may be associated with a speed of movement belt 114. At 525, movement sensor data may be received. In one example, a movement sensor may be coupled to movement belt 114. For instance, the movement sensor data may be associated with movement of movement belt 114. In a second example, the movement sensor data may be received from one or more of sensors 116 and 118. For instance, the movement sensor data may be associated with movement of one or more of legs of user 102 and/or one or more of feet of user 102. In another example, the movement sensor data may be associated with movement of a head of user 102. For instance, the movement sensor data may include rotation data.
At 530, it may be determined if an emergency brake is engaged. For example, user 102 may engage the emergency brake. For instance, one or more of rails 112 may include a button and/or switch that may be actuated that engages the emergency brake. If the emergency brake is engaged, operations may cease at 515. If the emergency brake is not engaged, it may be determined if a speed is greater than zero (0) or if there is movement, at 535. In one example, the speed may be or include a distance per time period measurement. In another example, the speed may be or include an angular measurement per time period measurement. For instance, movement belt 114 may include one or more devices that may rotate, and the speed may be an angular measurement per time period measurement of at least one of the one or more devices of movement belt 114 that may rotate.
If the speed is not greater than zero (0) and if there is no movement, the method may proceed to 510. If the speed is greater than zero (0) or if there is movement, a movement belt may be moved, at 540. For example, movement belt 114 may be moved. For instance, movement belt 114 may be moved in accordance with the speed. At 545, heart rate data may be received. For example, the heart rate data may be associated with a heart rate of user 102. In one or more embodiments, user 102 may wear a device that measures a heart rate of user 102. For example, the device that measures the heart rate of user 102 may provide the heart rate data to VRMS 100. For instance, VRMS 100 may wirelessly receive the heart rate data from the device that measures the heart rate of user 102.
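The decision flow at 530 through 540 (brake check, then speed/movement check, then belt actuation) can be sketched as a single decision function; the function name and its returned action labels are hypothetical conveniences for this sketch.

```python
def belt_step(brake_engaged, speed, movement_detected):
    """One pass of the belt-control decisions: cease operations if the
    emergency brake is engaged; otherwise move the belt only when the
    speed is positive or user movement is detected; otherwise idle."""
    if brake_engaged:
        return "stop"
    if speed > 0 or movement_detected:
        return "move_belt"
    return "idle"

belt_step(False, 1.2, False)  # positive speed -> 'move_belt'
belt_step(True, 1.2, True)    # brake overrides everything -> 'stop'
```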
At 550, the heart rate data and game pad input data may be sent to a VR application. In one or more embodiments, the VR application may be executed via a computer system. In one example, the computer system may be included in a cloud environment. In another example, VRMS 100 may include the computer system. In one or more embodiments, the method may proceed to 510.
Turning now to
At 615, a rotation of a head of the user may be determined. For example, a rotation of a head of user 102 may be determined. For instance, VR HMD 104 may determine the rotation of the head of user 102. In one or more embodiments, VR HMD 104 may include sensors that may determine the rotation of the head of user 102. For example, the sensors may include one or more of a gyroscope and an accelerometer. In one or more embodiments, VRMS 100 may determine the rotation of the head of user 102. For example, VRMS 100 may receive data from one or more sensors (e.g., a gyroscope, an accelerometer, etc.) of VR HMD 104 and determine the rotation of the head of user 102 based at least on the data from the one or more sensors of VR HMD 104.
At 620, second images may be provided to the user. For example, the second images may be or include images of the VR environment in a direction that user 102 has rotated. For instance, VR HMD 104 may provide the second images to user 102. At 625, a platform that supports the user may be rotated in accordance with the rotation. For example, VRMS 100 may rotate platform 106 in accordance with the rotation. For instance, platform 106 may be rotated on platform 108. In one embodiment, rotating platform 106 in accordance with the rotation may include a one-to-one correspondence. For example, if the rotation is forty-seven (47) degrees, platform 106 may be rotated forty-seven (47) degrees. In one or more embodiments, rotating platform 106 in accordance with the rotation may include a ratio. For example, if the rotation is a number of degrees, platform 106 may be rotated by the number of degrees modified by the ratio.
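The one-to-one and ratio-based rotation mappings described above amount to scaling the head rotation by a factor; the function below is a minimal sketch of that relationship, with the name and default chosen for illustration.

```python
def platform_rotation(head_rotation_deg, ratio=1.0):
    """Compute a platform rotation from a head rotation. A ratio of 1.0
    gives a one-to-one correspondence; other ratios scale the
    platform's rotation relative to the head's."""
    return head_rotation_deg * ratio

platform_rotation(47.0)       # one-to-one: 47.0 degrees
platform_rotation(47.0, 0.5)  # ratio of 0.5: 23.5 degrees
```

A ratio below 1.0 could, for example, keep the platform's mechanical motion smaller than the user's head motion while the VR view still turns fully.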
At 630, one or more of a walking movement and a running movement of the user may be determined. For example, VRMS 100 may determine one or more of a walking movement and a running movement of user 102 via one or more sensors. In one instance, the sensors may include one or more of sensors 116 and 118. In another instance, the sensors may include one or more sensors coupled to movement belt 114 (e.g., a torque sensor, a pressure sensor, etc.). At 635, a movement belt may be moved in accordance with the one or more of the walking movement and the running movement of the user. For example, VRMS 100 may move movement belt 114 in accordance with the one or more of the walking movement and the running movement of user 102.
In one or more embodiments, one or more of the method elements described herein and/or one or more portions of an implementation of any method element may be repeated, may be performed in varying orders, may be performed concurrently with one or more of the other method elements and/or one or more portions of an implementation of any method element, or may be omitted. Further, one or more of the system elements described herein may be omitted and/or additional system elements may be added as desired, according to one or more embodiments. Moreover, supplementary, additional, and/or duplicated method elements may be instantiated and/or performed as desired, according to one or more embodiments. The above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended claims are intended to cover all such modifications, enhancements, and other embodiments which fall within the true spirit and scope of the present disclosure. Thus, to the maximum extent allowed by law, the scope of the present disclosure is to be determined by the broadest permissible interpretation of the following claims and their equivalents, and shall not be restricted or limited by the foregoing detailed description.
20170336860 | Smoot | Nov 2017 | A1 |
20180033321 | Clarke | Feb 2018 | A1 |
20180036880 | Stephens, Jr. | Feb 2018 | A1 |
20180050256 | Buvid | Feb 2018 | A1 |
20180147442 | Moon | May 2018 | A1 |
20180161620 | Rothschild | Jun 2018 | A1 |
20180170678 | Leong | Jun 2018 | A1 |
20180217586 | Stephens, Jr. | Aug 2018 | A1 |
20180224928 | Ross | Aug 2018 | A1 |
20180326286 | Rathi | Nov 2018 | A1 |
20180360340 | Rehse | Dec 2018 | A1 |
Entry |
---|
Brain and Spine Team. "Virtual Reality Treadmill a High-Tech Help for Patients (Video)." Health Essentials from Cleveland Clinic. Retrieved from <https://health.clevelandclinic.org/2014/05/virtual-reality-treadmill-helps-patients-with-parkinsons-ms/>, Jul. 17, 2015; 10 pages. |
"Virtuix Omni Package." Virtuix Omni. Retrieved from <http://www.virtuix.com/product/omni-package/#tabs_20141211-190917-6490>, Jul. 18, 2017; 3 pages. |
Wong, Philip. "Virtual Reality Treadmill: Run with a View." CNET. Retrieved from <https://www.cnet.com/news/virtual-reality-treadmill-run-with-a-view/#>, Jul. 30, 2008; 1 page. |
"Kat Walk—A New VR Omni-Directional Treadmill." Kickstarter. Retrieved from <https://www.kickstarter.com/projects/katvr/kat-walk-a-new-virtual-reality-locomotion-device>, Jul. 18, 2017; 24 pages. |
"The World's First True Commercially Viable Omnidirectional Treadmill." Infinadeck Omnidirectional Treadmill. Retrieved from <http://www.infinadeck.com/#introduction>, Jul. 18, 2017; 6 pages. |
"Virtual Reality Setups." Cyberith. Retrieved from <http://cyberith.com/product/>, Jul. 18, 2017; 4 pages. |
"Virtual Reality's Locomotion Problem." Motherboard. Retrieved from <http://motherboard.vice.com/read/virtual-realitys-locomotion-problem>, Jul. 18, 2017; 14 pages. |
Sheik-Nainar, Mohamed A., and David B. Kaber. "The Utility of a Virtual Reality Locomotion Interface for Studying Gait Behavior." Human Factors 49.4 (2007): 696-709; 14 pages. |
"Omnidirectional Treadmill." Wikipedia. Wikimedia Foundation, Jul. 12, 2017. Retrieved from <https://en.wikipedia.org/wiki/Omnidirectional_treadmill>; 2 pages. |
"Cave Automatic Virtual Environment." Wikipedia. Wikimedia Foundation, Jul. 16, 2017. Retrieved from <https://en.wikipedia.org/wiki/Cave_automatic_virtual_environment>; 6 pages. |
Marco, John. "List of Omnidirectional Treadmills Under Development." Virtual Reality Times, Dec. 6, 2016. Retrieved from <http://www.virtualrealitytimes.com/2015/04/09/list-of-omnidirectional-treadmills-under-development/>; 11 pages. |
Number | Date | Country | Kind |
---|---|---|---|
20190086996 | Mar 2019 | US | A1 |