The present disclosure relates generally to a sensory feedback system, and, more particularly, to a virtual and augmented reality glove design for receiving and transmitting signals between a user environment and a virtual or augmented reality environment and a method for the same.
Virtual, augmented and mixed reality (collectively, “XR”) applications continue to evolve and become more capable every day. Currently, the primary use for XR technology is virtual reality gaming. As the capabilities of XR technology continue to evolve, humans spend more and more time playing, working, shopping, and being entertained in XR environments. With continued evolution of XR technology, some common productivity applications or tasks may become more effective, or even more cost effective, to perform in the XR environment than in the real world environment (“user's environment”). For example, training for high risk scenarios, such as surgery or military operations, artistic endeavors, such as painting or playing an instrument, or even remote monitoring of servers and robotics are all reasonable use cases for XR technology.
Maximizing the potential of XR technology platforms, whether for work or entertainment, requires optimal interfaces for the various tasks. One such interface is the human hand. The hand interacts and works with physical objects in the user's environment, so the hand provides an optimal interface to interact and work with virtual objects in the XR environment. The hands have great dexterity, accuracy, and sensitivity. The hands type, mix, wave, grab, perform sign language, and do much more. This is why current XR technology platforms utilize hand controllers to interface between the XR environment and the user's environment.
Common hand controllers are hand-held controllers similar to those used for gaming consoles. Hand-held controllers tend to be bulky and rely on push buttons rather than the natural movement of the hand. Hand-held controllers provide limited accuracy when using individual fingers, which is necessary in order to successfully employ XR environments for productivity applications, such as surgery or military operations, that demand precision. Without accurate tracking of finger position and movement, the dexterity, accuracy, and sensitivity of the hand are lost between the user's environment and the XR environment. These deficiencies in current hand-held controllers prompt the need for an XR glove controller design and method that accurately tracks and transmits not only hand position, but individual finger position and movement.
The present invention provides a sensory feedback system and method which overcomes the deficiencies described above, and has other advantages.
In one embodiment, a sensory feedback system is provided. The sensory feedback system comprises a central processing unit, a glove, a sensor system, and a microcontroller. The glove has a palm portion, a dorsal portion, and at least one finger member having a top side and a bottom side. The sensor system comprises at least one flex sensor secured to the finger member, a set of input and output sensors, and an inertial movement unit secured to the dorsal portion of the glove. The set of input and output sensors comprises at least one of an input sensor or an output sensor secured to the bottom side of the finger member, and an input sensor, an output sensor, or both an input and output sensor secured to the palm portion of the glove. The microcontroller is secured to the dorsal portion of the glove and facilitates transmission of signals between the sensor system and the central processing unit.
In some embodiments, the sensory feedback system includes a light-emitting diode tracking system. The light-emitting diode tracking system comprises at least one light-emitting diode secured to the glove and at least one light sensor. The at least one light sensor receives a tracking input from the light-emitting diode and transmits the tracking input to the central processing unit.
Another embodiment provides a method for managing and controlling the virtual interaction between a user and a central processing unit. The virtual interaction method includes controlling the transmission of one or more signals between a glove and the central processing unit. The controlled signals provide interaction between a user environment and a virtual environment. Control of the signals between the glove and the central processing unit includes the steps: providing a microcontroller secured to the glove for facilitating the transmission of the one or more signals between the glove and the central processing unit; transmitting at least one flexure signal from the microcontroller to the central processing unit, wherein the flexure signal represents a degree of bend detected by a flex sensor secured to a finger member of the glove in the user environment; transmitting at least one impact signal from the microcontroller to the central processing unit, wherein the impact signal represents an impact in the user environment by at least one input sensor secured to the glove; transmitting at least one haptic signal from the central processing unit to the microcontroller, wherein the haptic signal represents an impact in the virtual environment by at least one output sensor secured to the glove; and transmitting at least one rotation signal from the microcontroller to the central processing unit, wherein the at least one rotation signal represents a detected rotation of the glove in at least one of the x, y, and z dimensions by an inertial movement unit secured to the glove.
In some embodiments, the method of managing and controlling the virtual interaction between the glove and the central processing unit may further comprise the steps of: transmitting at least one position signal from the microcontroller to the central processing unit, wherein the at least one position signal represents an identified position of the glove in a user environment by a light-emitting diode tracking system, the light-emitting diode tracking system comprising: at least one light-emitting diode secured to the glove; and at least one light sensor for receiving the position signal from the at least one light-emitting diode and transmitting the position signal to the central processing unit.
The drawings included with this application illustrate certain aspects of the embodiments described herein. However, the drawings should not be viewed as exclusive embodiments. In the Figures, the input and output sensors and light-emitting diodes are generically depicted as darkened circles, but the input and output sensors and the light-emitting diodes may take any configuration as suitable for use in connection with the various disclosed embodiments. The subject matter disclosed is capable of considerable modifications, alterations, combinations, and equivalents in form and function, as will occur to those skilled in the art with the benefit of this disclosure.
The present disclosure may be understood more readily by reference to these detailed descriptions. For simplicity and clarity of illustration, where appropriate, reference numerals may be repeated among the different figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the various embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein can be practiced without these specific details. In other instances, methods, procedures, and components have not been described in detail so as not to obscure the relevant features being described. Also, the description is not to be considered as limiting the scope of the embodiments described herein. The drawings are not necessarily to scale, and the proportions of certain parts have been exaggerated to better illustrate details and features of the present disclosure.
Referring to
As shown in
In some embodiments, some or all of the components of sensor system 40 are secured to the outer surface of glove 30. In other embodiments, some or all of the components of sensor system 40 are secured to the inner surface of glove 30 such that sensor system 40 is not readily visible to the naked eye. In alternative embodiments, some components of sensor system 40 are secured to the outer surface of glove 30 and some components of sensor system 40 are secured to the inner surface of glove 30. In other embodiments, sensor system 40 is incorporated directly into the material comprising glove 30. For example, when incorporated directly into the material of glove 30, sensor system 40 may be positioned and secured between two pieces of material making up glove 30. As used herein, “secured” means attached to, carried by, or otherwise incorporated into.
Sensor system 40 includes one or more flex sensors 42, a set 44 of input and output sensors (referred to herein as set 44), and an inertial movement unit 46. As depicted in
Flex sensors 42 are secured to or incorporated into finger members 36 of glove 30. With reference to
Set 44 of input and output sensors is secured to glove 30 at various locations. Set 44 may include an input sensor 45a, an output sensor 45b, or a combination of the same. Typically, set 44 includes a plurality of input and output sensors 45a and 45b. In the Figures, input sensors 45a and output sensors 45b are depicted generically as darkened circles. However, it is to be understood that input sensors 45a and output sensors 45b may take any shape or geometric configuration.
Input sensors 45a register a user-actuated signal in the user environment for transmission as an input into the XR environment. Output sensors 45b receive a signal from the XR environment and generate a user-experienced sensation, i.e., an output, in the user environment. In some embodiments, input sensor 45a or output sensor 45b is secured at fingertip 37c of bottom side 37b of finger member 36. In other embodiments, input sensor 45a or output sensor 45b is secured to palm portion 32 of glove 30. Additional embodiments may include input sensor 45a or output sensor 45b secured to joint 37d of bottom side 37b of finger members 36. In further embodiments, input sensor 45a or output sensor 45b may be secured at each fingertip 37c of bottom side 37b of finger members 36, at joint 37d of bottom side 37b of finger members 36, and to palm portion 32 of glove 30.
Input sensors 45a may be impact sensors. Types of impact sensors include, but are not limited to, piezoelectric sensors, piezo film sensing material, force sensing resistors, and flat buttons. Piezoelectric sensors are analog sensors suitable for registering a user-perceived input, i.e., an impact experienced by the user in the real world. Such sensors require conversion from analog to digital form using an analog-to-digital converter. The structure and use of piezoelectric sensors are well known and will not be discussed in detail. Flat button sensors operate similarly to keys on a keyboard in that they are user-actuated in the user environment. Flat button sensors register an input reflecting the application of pressure to the sensor, or the lack of pressure on the sensor. Force sensing resistors alter their resistance when a force or pressure is applied.
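By way of a non-limiting illustration only, the following Arduino-style C++ sketch shows one way a piezoelectric impact sensor and a flat button might be read; the pin assignments, baud rate, threshold value, and message text are assumptions made for this example and do not form part of the disclosure.

// Illustrative sketch only; pin numbers and the impact threshold are assumptions.
const int PIEZO_PIN = A0;          // piezo element wired through a load resistor to an analog input
const int BUTTON_PIN = 2;          // flat button wired to a digital input with a pull-up
const int IMPACT_THRESHOLD = 100;  // ADC counts treated as a registered impact (10-bit range, 0-1023)

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  Serial.begin(115200);
}

void loop() {
  // The piezo element produces an analog voltage spike on impact; the on-chip
  // analog-to-digital converter turns that spike into a digital reading.
  int piezoReading = analogRead(PIEZO_PIN);
  if (piezoReading > IMPACT_THRESHOLD) {
    Serial.print("IMPACT,");
    Serial.println(piezoReading);  // magnitude of the impact
  }

  // The flat button simply reports pressed or not pressed, like a key on a keyboard.
  if (digitalRead(BUTTON_PIN) == LOW) {
    Serial.println("BUTTON_PRESSED");
  }
}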
Output sensors 45b may be haptic feedback sensors. Types of haptic feedback sensors include haptic motors such as small vibrational motors. Other haptic feedback sensors are possible; for example, enamel-insulated resistance heating wire or insulated nichrome resistance heating wire (like those used in electric heating pads) could be used to simulate touching objects of different warmth in the virtual environment. In some embodiments, where both output sensors 45b and input sensors 45a are utilized, input sensors 45a may be secured to the fingertip 37c portions on bottom side 37b of finger members 36 and output sensors 45b may be secured at joint 37d distal from fingertip 37c.
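As a non-limiting illustration, the short Arduino-style C++ sketch below shows one way a small vibrational motor serving as output sensor 45b might be driven; the PWM pin, intensity values, and timing are assumptions for this example, and a heating wire output could be driven in a similar pulse-width-modulated fashion through a suitable driver circuit.

// Illustrative sketch only; the PWM pin and strength values are assumptions.
// A small vibration motor is assumed to be driven through a transistor from a PWM-capable pin.
const int HAPTIC_PIN = 5;

void setup() {
  pinMode(HAPTIC_PIN, OUTPUT);
}

// Drive the haptic motor at a given strength (0 = off, 255 = full vibration).
void setHapticStrength(uint8_t strength) {
  analogWrite(HAPTIC_PIN, strength);
}

void loop() {
  setHapticStrength(180);  // simulate a firm contact in the XR environment
  delay(100);
  setHapticStrength(0);    // release
  delay(900);
}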
Inertial movement unit 46 is secured at a central portion of glove 30. Inertial movement unit 46 may be secured to palm portion 32 or dorsal portion 34 of glove 30. Inertial movement unit 46 utilizes a combination of accelerometers and gyroscopes to track motion and orientation of glove 30 in the user's environment in three dimensions (x, y, and z). Inertial movement units 46 are commercially available, as represented by the Adafruit 9-DOF Absolute Orientation IMU Fusion Breakout (BNO055) and others known in the art.
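As a non-limiting illustration, the following Arduino-style C++ sketch shows how fused orientation data might be read from a BNO055-based inertial movement unit, assuming the publicly available Adafruit BNO055 library and default I2C wiring; the sample interval and serial output format are assumptions made for this example.

// Illustrative sketch only, assuming the Adafruit BNO055 library and default I2C wiring.
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

Adafruit_BNO055 bno = Adafruit_BNO055(55);  // sensor ID value is arbitrary

void setup() {
  Serial.begin(115200);
  if (!bno.begin()) {
    Serial.println("BNO055 not detected");
    while (true) {}
  }
}

void loop() {
  // Fused orientation (heading, roll, pitch) in degrees from the unit's on-board sensor fusion.
  imu::Vector<3> euler = bno.getVector(Adafruit_BNO055::VECTOR_EULER);
  Serial.print(euler.x()); Serial.print(",");
  Serial.print(euler.y()); Serial.print(",");
  Serial.println(euler.z());
  delay(20);  // roughly 50 updates per second
}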
In some embodiments, as depicted in
LED emitters 62 are secured to glove 30. The structure and use of LED emitters 62 are well known and will not be discussed in detail. LED emitters 62 may emit any spectrum of light detectable by light sensors 64. However, it is generally preferable to use non-visible light, such as infrared, for user convenience and comfort.
In some embodiments, as depicted in
In other embodiments, LED emitters 62 are secured to glove 30 in a pattern. For example, the pattern may be a straight line, a circle, a triangle, a square, or an arrow. The pattern may be the same or different on palm portion 32 and dorsal portion 34. Differing patterns allow for easier recognition between palm portion 32 and dorsal portion 34 by light sensors 64. For example, LED emitters 62 may be a circle in the center portion of palm portion 32 and a straight line across the knuckles portion 38 of dorsal portion 34.
Light sensors 64, positioned throughout the user's room and/or integrated into other XR technology worn or utilized by the user, monitor the position of the LED emitters 62 to define tracking space 66. For example, in some embodiments, the user's room may include light sensors 64 positioned in the corners of the room. In other embodiments, light sensors 64 are integrated into central processing unit 20. In additional embodiments, light sensors 64 are located adjacent to central processing unit 20. As depicted in
LED tracking system 60 includes a view field or tracking space 66 as defined by light sensors 64. With reference to
Microcontroller 50 is secured to glove 30. In some embodiments, microcontroller 50 is secured at a central portion 35 of dorsal portion 34 of glove 30. Microcontroller 50 is datalinked to sensor system 40 and central processing unit 20. Microcontroller 50, sensor system 40 and central processing unit 20 may be datalinked via wired or wireless connections. Microcontroller 50 communicates inputs and outputs to and from sensor system 40 and central processing unit 20.
Microcontroller 50 translates a user's action in the user environment to the XR environment and vice versa. For example, if a user makes a peace sign in the user environment, sensor system 40 sends an input to microcontroller 50 signaling that the peace sign has been made. Microcontroller 50 transmits the input to central processing unit 20, which creates an output in the XR environment. In this example, the output in the XR environment may be an avatar in the XR environment making a peace sign. In another example, the output may control a robotic hand in the real environment, driving the robotic hand to form the peace sign in the user environment. In an additional example, the user's avatar in the XR environment may create an input through impact with an object in the XR environment. The output may be output sensors 45b vibrating to signal the impact in the user environment.
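As a non-limiting illustration, the Arduino-style C++ sketch below shows how such a gesture might be detected from flex sensor readings and reported toward central processing unit 20; the pin assignments, the bend threshold, and the deliberately simplified peace-sign test are assumptions made for this example rather than the actual gesture logic of the system.

// Illustrative sketch only; pins, threshold, and the simplified gesture test are assumptions.
const int FLEX_PINS[4] = {A1, A2, A3, A4};  // index, middle, ring, and little finger members
const int BEND_THRESHOLD = 500;             // ADC counts above which a finger is treated as bent

void setup() {
  Serial.begin(115200);
}

void loop() {
  bool bent[4];
  for (int i = 0; i < 4; i++) {
    bent[i] = analogRead(FLEX_PINS[i]) > BEND_THRESHOLD;
  }

  // Index and middle fingers extended, ring and little fingers bent: report a peace sign
  // so that central processing unit 20 can mirror it with an avatar or a robotic hand.
  if (!bent[0] && !bent[1] && bent[2] && bent[3]) {
    Serial.println("GESTURE,PEACE_SIGN");
  }
  delay(50);
}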
As depicted in
The flexure signal represents the degree of bend detected by flex sensors 42. Flex sensors 42 measure the amount of deflection or bending of finger members 36. When a user bends a finger member 36, flex sensor 42 measures the degree of bend and transmits the flexure signal to microcontroller 50. Microcontroller 50 transmits the flexure signal to central processing unit 20 which configures the flexure signal into an output to provide an action within the XR environment or an action to perform a specific task by a robotic hand in the real world. For example, if a user bends finger members 36 in a gripping motion, microcontroller 50 transmits the flexure signal, i.e. the degree of bend of finger member 36, to central processing unit 20 which creates an output to cause an avatar in the XR environment to grip a virtual object. In another example, the output may be to grip an object in the user environment with a robotic hand.
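As a non-limiting illustration, the Arduino-style C++ sketch below shows one way the flexure signal might be produced, with the flex sensor wired as one leg of a voltage divider; the analog pin, the calibration endpoints, and the 0 to 90 degree output range are assumptions made for this example.

// Illustrative sketch only; divider wiring, calibration endpoints, and output range are assumptions.
const int FLEX_PIN = A1;       // flex sensor in a voltage divider with a fixed resistor
const int ADC_STRAIGHT = 300;  // assumed ADC reading with the finger member straight
const int ADC_BENT = 700;      // assumed ADC reading with the finger member fully bent

void setup() {
  Serial.begin(115200);
}

void loop() {
  int raw = analogRead(FLEX_PIN);

  // Convert the raw reading into an approximate bend angle and clamp it to the
  // calibrated range before reporting it as the flexure signal.
  int angle = map(raw, ADC_STRAIGHT, ADC_BENT, 0, 90);
  angle = constrain(angle, 0, 90);

  Serial.print("FLEX,");
  Serial.println(angle);  // flexure signal forwarded toward central processing unit 20
  delay(10);
}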
The impact signal represents an impact in the user environment by at least one input sensor 45a. Input sensors 45a may be impact sensors which register an input, the impact signal, when a user makes contact or impact with an object in the user environment, such as tapping a table or clapping the user's hands together. Microcontroller 50 receives the impact signal and transmits the impact signal to central processing unit 20 to provide an action in the XR environment. The use of impact sensors as input sensors 45a allows the exact timing of when contact or impact was made to be captured for accurate transmission to the XR environment. For example, if playing a virtual piano in the XR environment using sensory feedback system 10, a user can play a piano in the user environment and input sensors 45a register the timing of the keys of the piano being played as an impact signal. Microcontroller 50 receives the impact signal and transmits the impact signal to central processing unit 20. Central processing unit 20 then accurately reflects the user's impact with the piano in the user environment on the virtual piano in the XR environment.
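As a non-limiting illustration, the Arduino-style C++ sketch below shows how the moment of contact might be time-stamped so that impact timing can be reproduced in the XR environment; the pin, threshold, and message format are assumptions made for this example.

// Illustrative sketch only; pin, threshold, and message format are assumptions.
const int PIEZO_PIN = A0;
const int IMPACT_THRESHOLD = 100;
bool impactActive = false;  // prevents reporting the same impact more than once

void setup() {
  Serial.begin(115200);
}

void loop() {
  int reading = analogRead(PIEZO_PIN);

  if (reading > IMPACT_THRESHOLD && !impactActive) {
    impactActive = true;
    // Record the moment of contact so central processing unit 20 can reproduce
    // the timing of the impact (for example, a piano key strike) accurately.
    unsigned long timestampMicros = micros();
    Serial.print("IMPACT,");
    Serial.print(timestampMicros);
    Serial.print(",");
    Serial.println(reading);
  } else if (reading <= IMPACT_THRESHOLD) {
    impactActive = false;  // ready for the next impact
  }
}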
The haptic signal represents an impact in the virtual environment by at least one output sensor 45b. Central processing unit 20 transmits a haptic signal received from the XR environment to microcontroller 50. Microcontroller 50 translates the haptic signal into a haptic output. Microcontroller 50 transmits the haptic output to output sensors 45b, which are haptic feedback sensors. When the haptic feedback sensors are haptic motors, the haptic output is a set vibration amount. The strength of the haptic output, i.e., the vibration, will vary depending on the haptic signal created by the impact in the XR environment. For example, when playing a virtual piano in the XR environment, the haptic feedback sensors produce a set vibration amount when a piano key is touched in the XR environment, based upon the density of the piano key in the XR environment that creates the haptic signal. Such vibration provides a resistive sensation in the user environment, giving the effect that the piano key is resisting finger members 36.
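As a non-limiting illustration, the Arduino-style C++ sketch below shows one way the haptic output might be applied on the glove side; the one-byte command format and the PWM pin are assumptions made for this example and are not the disclosed signaling scheme.

// Illustrative sketch only; the one-byte command format and the PWM pin are assumptions.
const int HAPTIC_PIN = 5;  // PWM pin driving a vibration motor used as output sensor 45b

void setup() {
  pinMode(HAPTIC_PIN, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  // The haptic signal from central processing unit 20 is assumed to arrive as a single
  // byte whose value (0-255) encodes the strength of the impact in the XR environment,
  // for example derived from the density of the touched virtual object.
  if (Serial.available() > 0) {
    uint8_t strength = Serial.read();
    analogWrite(HAPTIC_PIN, strength);  // set vibration amount
  }
}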
The rotation signal represents a detected rotation of glove 30 in at least one of the x, y, and z dimensions. One or more accelerometers in inertial movement unit 46 detect the linear acceleration of glove 30 relative to an initial position of glove 30 as a user's hand moves in the user environment. For example, the one or more accelerometers in inertial movement unit 46 measure forward and back, up and down, and left and right movement of glove 30 in the user environment. One or more gyroscopes in inertial movement unit 46 detect the rotational rate of glove 30 relative to an initial position of glove 30 as a user's hand moves in the user environment. For example, the one or more gyroscopes of inertial movement unit 46 measure pitch, yaw, and roll of glove 30 in the user environment.
Inertial movement unit 46 receives measurements from the accelerometers and gyroscopes as raw data to generate a rotation signal. In some embodiments, the rotation signal is the estimated position of glove 30 in the user environment. Inertial movement unit 46 translates the raw data into an estimated position of glove 30 in the user environment. Inertial movement unit 46 transmits the rotation signal to microcontroller 50 which transmits the rotation signal, i.e., the position of glove 30, to central processing unit 20. Central processing unit 20 displays the rotation signal as movement or action in the XR environment. For example, the movement or action may be an avatar rotating its hand in the XR environment to open a door. In other embodiments, the rotation signal is the raw data from the accelerometers and gyroscopes. Inertial movement unit 46 transmits the rotation signal, i.e. the raw data from the accelerometers and gyroscopes, to microcontroller 50. Microcontroller 50 transmits the rotation signal to central processing unit 20 which estimates the position of glove 30 in the user environment. Central processing unit 20 displays the rotation signal as movement or action in the XR environment.
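As a non-limiting illustration of how raw accelerometer and gyroscope data might be combined into an orientation estimate, the C++ snippet below applies a simple complementary filter to one axis; it assumes raw readings are already available and stands in for, rather than reproduces, the fusion performed by inertial movement unit 46 or central processing unit 20.

// Illustrative snippet only; a complementary filter standing in for the actual fusion algorithm.
#include <math.h>

float pitchDeg = 0.0f;  // running estimate of rotation about one axis, in degrees

// Call once per sample with the latest raw readings and the elapsed time in seconds.
void updatePitch(float accelY, float accelZ, float gyroXDegPerSec, float dt) {
  // Orientation implied by gravity alone (noisy, but it does not drift).
  float accelPitch = atan2f(accelY, accelZ) * 180.0f / 3.14159265f;

  // Orientation from integrating the gyroscope rate (smooth, but it drifts over time).
  float gyroPitch = pitchDeg + gyroXDegPerSec * dt;

  // Blend the two: trust the gyroscope in the short term, the accelerometer in the long term.
  const float alpha = 0.98f;
  pitchDeg = alpha * gyroPitch + (1.0f - alpha) * accelPitch;
}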
In some embodiments, the method for providing virtual interaction between a user and central processing unit 20 further comprises the steps of: transmitting at least one position signal of glove 30 to central processing unit 20. The position signal represents an identified position of glove 30 in the user environment by LED tracking system 60. Light sensors 64 identify light of a specific pattern and spectrum emitted from LED emitters 62 within tracking space 66 to create the position signal. Light sensors 64 transmit the position signal to central processing unit 20. The pattern of LED emitters 62 allows for light sensors 64 to detect light emitted from LED emitters 62 without regard to the position of the user's hand. For example, when LED emitters 62 are secured to palm portion 32 and dorsal portion 34 of glove 30, light sensors 64 detect light from LED emitters 62 regardless of whether the user's hand is facing upwards or downwards. Central processing unit 20 receives the position signal from light sensors 64 and estimates the position of glove 30 in the user environment.
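As a non-limiting illustration of the underlying geometry only, the C++ snippet below triangulates a two-dimensional position from two light sensors on a known baseline, each reporting a bearing angle to an LED emitter; the baseline, the angles, and the assumption that each light sensor can recover a bearing are hypothetical and do not represent the disclosed tracking algorithm.

// Illustrative snippet only; the two-sensor bearing geometry is an assumption for this example.
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

struct Point2D { double x; double y; };

// Two light sensors sit on a common baseline (y = 0), separated by `baseline` meters.
// Each reports the bearing angle, in radians measured from the baseline, to the LED emitter.
Point2D triangulate(double baseline, double angleLeft, double angleRight) {
  // The LED emitter lies at the intersection of the two bearing rays.
  double x = baseline * std::tan(angleRight) / (std::tan(angleLeft) + std::tan(angleRight));
  double y = x * std::tan(angleLeft);
  return {x, y};
}

int main() {
  // Example: sensors 4 m apart; LED seen at 60 degrees from the left sensor
  // and 70 degrees from the right sensor.
  Point2D p = triangulate(4.0, 60.0 * kPi / 180.0, 70.0 * kPi / 180.0);
  std::printf("estimated glove position: x=%.2f m, y=%.2f m\n", p.x, p.y);
  return 0;
}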
Although the disclosed invention has been shown and described in detail with respect to a preferred embodiment, it will be understood by those skilled in the art that various changes in form and detail may be made without departing from the spirit and scope of this invention as claimed. Thus, the present invention is well adapted to carry out the objects and advantages mentioned as well as those which are inherent therein. While numerous changes may be made by those skilled in the art, such changes are encompassed within the spirit of this invention as defined by the appended claims.