Method and apparatus for detecting hand gestures with a handheld controller

Information

  • Patent Grant
  • Patent Number
    9,990,045
  • Date Filed
    Thursday, November 12, 2015
  • Date Issued
    Tuesday, June 5, 2018
Abstract
A method for detecting a user's hand gestures with a handheld controller. The method includes monitoring a first sensor and a second sensor. Each sensor is capable of sensing the presence of the user's thumb. The method further includes sensing the presence of the thumb on the first sensor and then sensing when the thumb is no longer present on the first sensor. A time period is monitored beginning when the thumb is no longer present on the first sensor and ending when the presence of the thumb is sensed on one of the first sensor and the second sensor. Once the time period exceeds a threshold time value, a thumbs-up gesture is registered.
Description
TECHNICAL FIELD

This patent application is directed to handheld controllers and, more specifically, to virtual reality handheld controllers.


BACKGROUND

In a virtual reality system, a user wears a head-mounted display that presents a selected virtual reality (VR) environment in front of the user's eyes. In some VR systems, a user can manipulate items in the virtual environment with handheld controllers. The controllers include tracking patterns, such as a pattern of lights. The system monitors the movement of the tracking patterns with a tracking camera and reproduces the user's hand movements in the virtual environment. However, the buttons traditionally used on game controllers do not typically detect detailed hand movements. For example, individual finger movements and gestures, as well as open or closed hand positions, are not captured with traditional button configurations.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the methods and apparatus for detecting a user's hand gestures introduced herein may be better understood by referring to the following Detailed Description in conjunction with the accompanying drawings, in which like reference numerals indicate identical or functionally similar elements:



FIG. 1 is an isometric view of a pair of handheld controllers each operative to detect a user's hand gestures according to a representative embodiment.



FIG. 2A is an isometric view of a user's right hand grasping the right-hand controller of FIG. 1.



FIG. 2B is an isometric view of the right-hand controller as shown in FIG. 2A with the user's thumb in a thumbs-up gesture.



FIG. 3 is an isometric view of the right-hand controller shown in FIGS. 1-2B as viewed from the top of the controller.



FIG. 4 is a flowchart illustrating a representative method for detecting a user's hand gestures.





The headings provided herein are for convenience only and do not necessarily affect the scope or meaning of the claimed embodiments. Further, the drawings have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be expanded or reduced to help improve the understanding of the embodiments. Moreover, while the disclosed technology is amenable to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and are described in detail below. The intention, however, is not to limit the scope to the particular embodiments described. On the contrary, the embodiments are intended to cover all modifications, equivalents, and alternatives falling within the scope of the embodiments as defined by the appended claims.


DETAILED DESCRIPTION

Overview


A method for detecting a user's hand gestures with a handheld controller is disclosed. In an embodiment, the method includes monitoring a first sensor and a second sensor. Each sensor is capable of sensing the presence of the user's thumb. The method further includes sensing the presence of the thumb on the first sensor and then sensing when the thumb is no longer present on the first sensor. A time period is monitored beginning when the thumb is no longer present on the first sensor and ending when the presence of the thumb is sensed on one of the first sensor and the second sensor. Once the time period exceeds a threshold time value, a thumbs-up gesture is registered.


A handheld controller operative to detect a user's hand gestures is also disclosed. In an embodiment, the handheld controller comprises a main body and a handle extending from the main body. A first sensor capable of sensing the presence of a user's finger is disposed on the main body. The controller also includes a second sensor capable of sensing the presence of the finger. A processor monitors the first and second sensors. The processor registers when the first sensor senses the presence of the finger and also registers when the finger is no longer sensed on the first sensor. The processor monitors a time period beginning when the finger is no longer sensed on the first sensor and ending when the presence of the finger is sensed on one of the first sensor and the second sensor. When the time period exceeds a threshold time value, the processor registers a gesture, such as a thumbs-up gesture.


General Description

Various examples of the devices introduced above will now be described in further detail. The following description provides specific details for a thorough understanding and enabling description of these examples. One skilled in the relevant art will understand, however, that the techniques discussed herein may be practiced without many of these details. Likewise, one skilled in the relevant art will also understand that the technology can include many other features not described in detail herein. Additionally, some well-known structures or functions may not be shown or described in detail below so as to avoid unnecessarily obscuring the relevant description.


The terminology used below is to be interpreted in its broadest reasonable manner, even though it is being used in conjunction with a detailed description of some specific examples of the embodiments. Indeed, some terms may even be emphasized below; however, any terminology intended to be interpreted in any restricted manner will be overtly and specifically defined as such in this section.



FIG. 1 illustrates a pair of handheld controllers 100 according to a representative embodiment. The pair of handheld controllers 100 includes a right-hand controller 102 and a left-hand controller 104. The primary structures of the right-hand controller 102 and the left-hand controller 104, when the controllers are held adjacent to each other in a similar orientation as illustrated, are substantially symmetric with respect to each other. Both controllers 102/104 are described herein with respect to the right-hand controller 102, as both controllers include the same or similar features, albeit in mirror image. The right-hand controller 102 includes a main body 106 and a handle 108 extending from the main body 106. In some embodiments, a surrounding ring portion 110 extends from the main body 106. The controllers 102/104 can be part of a VR system 1, such as the Rift™ available from Oculus™. Each of the controllers 102/104 includes a plurality of tracking features positioned in a corresponding tracking pattern, such as the right-hand controller's tracking pattern 112. The tracking features in the tracking patterns are configured to be accurately tracked by a tracking camera 3 to determine the motion, orientation, and/or spatial position of the controller for reproduction in a virtual environment. The tracking features can include, for example, fiducial markers or light-emitting diodes (LEDs).


As shown in FIG. 2A, the handle portion 108 of the right-hand controller 102 is grasped by a user's right hand 5. When the controller 102 is grasped, the user's thumb 7 (i.e., the user's first finger) is positioned above the main body 106 and rests on a thumbstick 114. The user's second or index finger 9 is positioned on a trigger button 116. The user's third or middle finger 11 is positioned to contact a third finger button 118 on the handle portion 108. The controllers 102 and 104 are configured to detect a user's hand and/or finger gestures, such as a thumbs-up gesture as shown in FIG. 2B. As described more fully below, a thumbs-up gesture is indicated when the user removes his or her thumb 7 from the thumbstick 114 for a threshold period of time without touching the same or another button. Although the various embodiments are described with respect to a thumbs-up gesture, other hand and/or finger gestures can be detected with the disclosed technology. For example, a pointing gesture can be detected with the trigger button 116. A hand or finger gesture, such as an open hand, can be detected with the third finger button 118. The presence of a gesture can be a signal to the VR system to initiate a command or to include the gesture in a corresponding apparition or avatar.
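As a purely illustrative aid (not part of the patent disclosure), the following Python sketch tabulates the sensor-to-gesture associations described in the preceding paragraph; the sensor keys, gesture labels, and function name are hypothetical.

    from typing import Optional

    # Hypothetical mapping from a sensor that has been released (and not touched
    # again within the threshold time) to the gesture it indicates, following the
    # examples above: thumbstick 114 -> thumbs-up, trigger button 116 -> pointing,
    # third finger button 118 -> open hand.
    RELEASE_GESTURES = {
        "thumbstick": "thumbs_up",
        "trigger": "pointing",
        "third_finger_button": "open_hand",
    }

    def gesture_for_release(sensor_name: str) -> Optional[str]:
        """Return the gesture suggested when sensor_name has been released for
        longer than the threshold time without being touched again."""
        return RELEASE_GESTURES.get(sensor_name)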


With further reference to FIG. 3, the main body 106 of the right-hand controller 102 includes a thumb surface 120 from which the thumbstick 114 extends. The thumbstick 114 can include a thumb sensor 122 to detect the presence of the user's thumb or other finger. The presence of a finger can include touching the sensor or hovering over the sensor within a selected distance. In some embodiments, the thumb sensor 122 can be a capacitive touch sensor. The thumb surface 120 can include a capacitive-touch sensor area 124 positioned next to the thumbstick 114 in order to detect when a user is resting a thumb on the thumb surface 120. In some embodiments, the entire thumb surface 120 is a capacitive touch sensor. The thumb surface 120 may include additional buttons 126, 130, and 134, each of which may include a capacitive touch sensor 128, 132, and 136, respectively. A hand gesture, such as a thumbs-up gesture, can be detected when the user lifts the thumb off a first sensor, such as the thumb sensor 122, and does not place it on the first sensor or a second sensor, such as the sensor area 124, for a selected threshold period of time. It should be understood that a touch sensor can comprise either a capacitive touch sensor or a push button, such as one of the buttons 126, 130, and 134. In some embodiments, the trigger button 116 (see FIG. 2A) and the third finger button 118 can each include a capacitive touch sensor to facilitate finger and gesture detection according to the disclosed techniques. The third finger button 118 is described further in U.S. patent application Ser. No. 14/939,431, titled “HANDHELD CONTROLLER WITH THIRD FINGER GRIP DETECTION,” which is hereby incorporated by reference in its entirety.
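For illustration only, the sketch below shows one way a capacitive reading could be interpreted as "presence" in the broad sense used above, where presence includes either touching the sensor or hovering over it within a selected distance. The raw-count thresholds are assumptions and are not specified in the patent.

    # Hypothetical raw-capacitance thresholds; actual values would depend on the
    # sensor hardware and its calibration.
    TOUCH_THRESHOLD = 800   # assumed count indicating direct contact
    HOVER_THRESHOLD = 300   # assumed count indicating a hover within the selected distance

    def finger_present(raw_capacitance: int) -> bool:
        """Treat either a touch or a close hover as presence of the finger."""
        return raw_capacitance >= HOVER_THRESHOLD

    def finger_touching(raw_capacitance: int) -> bool:
        """Stricter check for direct contact with the sensor surface."""
        return raw_capacitance >= TOUCH_THRESHOLD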


In some embodiments, the controller 102 includes an inertial measurement unit (IMU) 142 to monitor rotation and orientation of the controller. Thus, if the user makes a thumbs-up gesture and the IMU 142 determines that the user's hand has been rotated, then the registered hand gesture can be modified to represent a thumbs-down gesture, for example. In some embodiments, the controller 102 includes a processor or microcontroller 140 to perform the timing and monitoring of the buttons (114, 116, 118, 126, 130, 134), sensors (122, 124, 128, 132, 136), and IMU 142. In other embodiments, a computer included in the VR system 1 (see FIG. 1) can be used to monitor the buttons, sensors, and IMU 142 and to determine the hand gesture indicated by the sensors and buttons. In some embodiments, the rotation and orientation of the handheld controller is determined by the IMU 142 and/or by monitoring the tracking pattern 112 with the tracking camera 3 (see FIG. 1).



FIG. 4 illustrates a representative method 200 for determining a hand gesture with a handheld controller as described above. The method 200 starts at 202, and a first sensor S1, such as the thumb sensor 122, is monitored at step 204. At step 206, a determination is made as to whether a finger, such as a thumb, is present on the first sensor S1. If no finger is present on the first sensor S1 at step 206, the method returns to step 204 and the first sensor S1 continues to be monitored. If a finger is present on the first sensor S1, the method moves to step 208.


At step 208, the first sensor S1 is again monitored to determine whether the finger remains present or is removed from the sensor. A determination is made at step 210 as to whether the finger is still present on the sensor. If the finger is still present, the sensor continues to be monitored at step 208. However, if the finger is no longer present on the sensor (i.e., the user has removed his or her thumb or other finger from the sensor), a timer is started at step 212.


Once the timer is started at step 212, the first sensor S1 and one or more second sensors S2, such as any of the buttons (114, 116, 118, 126, 130, 134) or sensors (122, 124, 128, 132, 136), are monitored at step 214 to determine whether the user places the thumb back down onto a sensor. At step 216, a determination is made as to whether the finger is present on the first sensor S1. If the finger is not present on the first sensor S1, a determination is made at step 218 as to whether the finger is present on a second sensor S2. If no finger is present on sensor S2, a check is made at step 220 to determine whether the timer has exceeded a threshold time value. If the timer has exceeded the threshold time value, a gesture is registered at step 222 for use by the VR system 1. In other words, if the thumb is raised off the first sensor S1 for a selected period of time (e.g., two seconds), it is determined that the user is gesturing a thumbs-up.


Once the gesture is registered at step 222, sensors S1 and S2 continue to be monitored at step 214. Also, if the timer has not reached the threshold time value, the method returns to step 214 to continue monitoring sensors S1 and S2. If the finger is detected at either sensor S1 or sensor S2, the method resets and returns to step 204 to begin monitoring the first sensor S1 again. When the method resets, the timer is stopped at step 224, the timer is cleared at step 226, and the gesture is cleared at step 228.
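The following Python sketch summarizes the flow of FIG. 4 described in the preceding paragraphs. It is a minimal single-threaded illustration, assuming hypothetical callables s1_present() and s2_present() that report finger presence on the first and second sensors, callback functions for registering and clearing the gesture, and the two-second threshold given above as an example; none of these names come from the patent.

    import time

    THRESHOLD_S = 2.0  # example threshold time value (e.g., two seconds)

    def run_gesture_detection(s1_present, s2_present, register_gesture, clear_gesture):
        while True:
            # Steps 204-206: monitor the first sensor S1 until a finger is present.
            while not s1_present():
                time.sleep(0.01)
            # Steps 208-210: monitor S1 until the finger is removed.
            while s1_present():
                time.sleep(0.01)
            # Step 212: start the timer when the finger leaves S1.
            start = time.monotonic()
            registered = False
            # Steps 214-220: monitor S1 and S2 until the finger is sensed again.
            while not (s1_present() or s2_present()):
                if not registered and time.monotonic() - start > THRESHOLD_S:
                    register_gesture("thumbs_up")  # step 222
                    registered = True
                time.sleep(0.01)
            # Steps 224-228: the finger was sensed again, so the timer is stopped
            # and cleared and any registered gesture is cleared before the loop
            # returns to step 204.
            if registered:
                clear_gesture()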


In some embodiments, the gesture registered at step 222 can be modified based on the orientation of the controller, such as measured by the IMU 142 or via the tracking pattern 112. For example, if the controller is rotated approximately 90 degrees from vertical, the thumbs-up gesture can be modified to indicate a sideways thumb gesture. In another example, if the controller is rotated 180 degrees from vertical, the thumb gesture can be modified to indicate a thumbs-down gesture. In some embodiments, a resting position is registered if the thumb is sensed on the second sensor S2 prior to the time period exceeding the threshold time value. In some embodiments, the trigger button 116 and the third finger button 118 can be used to determine other hand or finger gestures, whether through a button press or through their capacitive touch sensors.
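As a non-authoritative illustration of the orientation-based modification just described, the sketch below maps the controller's measured rotation from vertical (as reported by the IMU 142 or derived from the tracking pattern 112) onto a thumb gesture. The angle bands are assumptions chosen to reflect the examples above (roughly 90 degrees from vertical indicating a sideways thumb and roughly 180 degrees indicating a thumbs-down); the patent does not specify exact tolerances.

    def modify_thumb_gesture(rotation_from_vertical_deg: float) -> str:
        """Return the gesture to report for a registered thumbs-up, adjusted for
        controller rotation. Band edges are illustrative assumptions."""
        angle = abs(rotation_from_vertical_deg) % 360.0
        if angle > 180.0:
            angle = 360.0 - angle  # fold into the 0-180 degree range
        if angle < 45.0:
            return "thumbs_up"
        if angle < 135.0:
            return "sideways_thumb"
        return "thumbs_down"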


REMARKS

The techniques introduced here can be embodied as special-purpose hardware (e.g., circuitry), as programmable circuitry appropriately programmed with software and/or firmware, or as a combination of special-purpose and programmable circuitry. Hence, embodiments may include a machine-readable medium having stored thereon instructions which may be used to program a computer, a microprocessor, processor, and/or microcontroller (or other electronic devices) to perform a process. The machine-readable medium may include, but is not limited to, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, ROMs, random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or other type of media/machine-readable medium suitable for storing electronic instructions.


The above description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding of the disclosure. However, in some instances, well-known details are not described in order to avoid obscuring the description. Further, various modifications may be made without deviating from the scope of the embodiments. Accordingly, the embodiments are not limited except as by the appended claims.


Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not for other embodiments.


The terms used in this specification generally have their ordinary meanings in the art, within the context of the disclosure, and in the specific context where each term is used. It will be appreciated that the same thing can be said in more than one way. Consequently, alternative language and synonyms may be used for any one or more of the terms discussed herein, and any special significance is not to be placed upon whether or not a term is elaborated or discussed herein. Synonyms for some terms are provided. A recital of one or more synonyms does not exclude the use of other synonyms. The use of examples anywhere in this specification, including examples of any term discussed herein, is illustrative only and is not intended to further limit the scope and meaning of the disclosure or of any exemplified term. Likewise, the disclosure is not limited to various embodiments given in this specification. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. In the case of conflict, the present document, including definitions, will control.

Claims
  • 1. A handheld controller, comprising: a main body; a handle extending from the main body; a first sensor disposed on the main body capable of sensing the presence of a user's finger; a second sensor capable of sensing the presence of the finger; and a processor operative to: monitor the first sensor and the second sensor; register when the first sensor senses the presence of the finger; after registering the presence of the finger, register when the finger is no longer sensed on the first sensor; monitor a time period beginning when the finger is no longer sensed on the first sensor and ending when the presence of the finger is sensed on one of the first sensor and the second sensor; and initiate a command to display a gesture when the time period exceeds a threshold time value and until the finger is sensed on one of the first sensor and the second sensor.
  • 2. The handheld controller of claim 1, wherein the main body includes a thumb surface and further comprising a thumbstick extending from the thumb surface, wherein the first sensor is disposed on the thumbstick.
  • 3. The handheld controller of claim 2, wherein the second sensor is disposed on the thumb surface.
  • 4. The handheld controller of claim 3, wherein the first sensor and the second sensor are capacitive sensors.
  • 5. The handheld controller of claim 2, wherein the second sensor is a button mounted in the thumb surface.
  • 6. The handheld controller of claim 2, wherein the second sensor is a capacitive sensor disposed on a button mounted in the thumb surface.
  • 7. The handheld controller of claim 2, wherein the processor is housed in the main body.
  • 8. A method for detecting a user's hand gestures with a handheld controller, the method comprising: monitoring a first sensor and a second sensor, each sensor capable of sensing the presence of a finger of the user; sensing the presence of the finger on the first sensor; after registering the presence of the finger, sensing when the finger is no longer present on the first sensor; monitoring a time period beginning when the finger is no longer present on the first sensor and ending when the presence of the finger is sensed on one of the first sensor and the second sensor; and displaying a gesture when the time period exceeds a threshold time value and until the finger is sensed on one of the first sensor and the second sensor.
  • 9. The method of claim 8, further comprising registering a resting position if the finger is sensed on the second sensor prior to the time period exceeding the threshold time value.
  • 10. The method of claim 8, further comprising sensing rotation of the handheld controller and modifying the gesture according to the sensed rotation.
  • 11. The method of claim 10, wherein sensing rotation of the handheld controller comprises monitoring an inertial measuring unit.
  • 12. A method for detecting a user's hand gestures with a handheld controller, the method comprising: monitoring a first sensor and a second sensor, each sensor capable of sensing the presence of a thumb of the user; sensing the presence of the thumb on the first sensor; after sensing the presence of the thumb, sensing when the thumb is no longer present on the first sensor; and monitoring a time period beginning when the thumb is no longer present on the first sensor and ending when the presence of the thumb is sensed on one of the first sensor and the second sensor; and displaying a thumbs-up gesture when the time period exceeds a threshold time value and until the thumb is sensed on one of the first sensor and the second sensor.
  • 13. The method of claim 12, further comprising registering a resting position if the thumb is sensed on the second sensor prior to the time period exceeding the threshold time value.
  • 14. The method of claim 12, further comprising sensing rotation of the handheld controller and modifying the thumbs-up gesture according to the sensed rotation.
  • 15. The method of claim 14, wherein the thumbs-up gesture is modified to register a thumbs-down gesture.
US Referenced Citations (110)
Number Name Date Kind
4518164 Hayford, Jr. et al. May 1985 A
4552360 Schenck et al. Nov 1985 A
5087825 Ingraham et al. Feb 1992 A
5181009 Perona Jan 1993 A
5207426 Inoue et al. May 1993 A
5265009 Colavita et al. Nov 1993 A
5421590 Robbins et al. Jun 1995 A
D363320 Barthelemy et al. Oct 1995 S
5479163 Samulewicz Dec 1995 A
5551701 Bouton et al. Sep 1996 A
5616078 Oh et al. Apr 1997 A
5645277 Cheng Jul 1997 A
5796354 Cartabiano et al. Aug 1998 A
5982355 Jaeger et al. Nov 1999 A
6173203 Barkley et al. Jan 2001 B1
6192253 Charlier et al. Feb 2001 B1
6430110 Baroche Aug 2002 B2
6544124 Woodward et al. Apr 2003 B2
6572108 Bristow Jun 2003 B1
6590835 Farine et al. Jul 2003 B2
6652383 Sonoda et al. Nov 2003 B1
6970157 Siddeeq et al. Nov 2005 B2
7004469 von Goeben et al. Feb 2006 B2
7106197 Gaiotto et al. Sep 2006 B2
7331793 Hernandez et al. Feb 2008 B2
7345670 Armstrong et al. Mar 2008 B2
8064972 Russo et al. Nov 2011 B2
D656996 Mikhailov et al. Apr 2012 S
8188842 Otsuka et al. May 2012 B2
8267786 Ikeda Sep 2012 B2
8439753 Nagata et al. May 2013 B2
8795078 Musick, Jr. Aug 2014 B1
8882596 Takahashi et al. Nov 2014 B2
8994643 Goodwin et al. Mar 2015 B2
9141087 Brown et al. Sep 2015 B2
9386662 Krueger et al. Jul 2016 B1
9421472 Buller et al. Aug 2016 B2
9678566 Webb et al. Jun 2017 B2
D795959 Hubler et al. Aug 2017 S
D800841 Hubler et al. Oct 2017 S
9804693 Long Oct 2017 B2
D802055 Chen et al. Nov 2017 S
9839840 Long et al. Dec 2017 B2
20010015718 Hinckley et al. Aug 2001 A1
20010045938 Willner et al. Nov 2001 A1
20020072415 Kikukawa et al. Jun 2002 A1
20030100367 Cooke et al. May 2003 A1
20040222963 Guo et al. Nov 2004 A1
20040222970 Martinez et al. Nov 2004 A1
20050248544 Adam et al. Nov 2005 A1
20050255915 Riggs et al. Nov 2005 A1
20060287089 Addington et al. Dec 2006 A1
20070049374 Ikeda et al. Mar 2007 A1
20070066394 Ikeda et al. Mar 2007 A1
20070084293 Kaiserman et al. Apr 2007 A1
20070293318 Tetterington et al. Dec 2007 A1
20080261693 Zalewski et al. Oct 2008 A1
20080261695 Coe et al. Oct 2008 A1
20090005164 Chang et al. Jan 2009 A1
20090143110 Armstrong et al. Jun 2009 A1
20090149256 Lui et al. Jun 2009 A1
20090290345 Shaner et al. Nov 2009 A1
20090295721 Yamamoto et al. Dec 2009 A1
20090298590 Marks et al. Dec 2009 A1
20100009760 Shimamura et al. Jan 2010 A1
20100085321 Pundsack et al. Apr 2010 A1
20100118195 Eom et al. May 2010 A1
20100144436 Marks et al. Jun 2010 A1
20100184513 Mukasa et al. Jul 2010 A1
20110294579 Marks et al. Dec 2011 A1
20120088582 Wu et al. Apr 2012 A1
20120202597 Yee et al. Aug 2012 A1
20120261551 Rogers et al. Oct 2012 A1
20130162450 Leong et al. Jun 2013 A1
20130324254 Huang et al. Dec 2013 A1
20140015813 Numaguchi et al. Jan 2014 A1
20140141891 Georgy et al. May 2014 A1
20140203953 Moser et al. Jul 2014 A1
20140228124 Plagge et al. Aug 2014 A1
20140273546 Harmon et al. Sep 2014 A1
20140361977 Mao et al. Dec 2014 A1
20140362110 Stafford Dec 2014 A1
20140364212 Osman et al. Dec 2014 A1
20140378227 Lee Dec 2014 A1
20150077398 Yairi Mar 2015 A1
20150094142 Stafford Apr 2015 A1
20150155445 Crowder et al. Jun 2015 A1
20150234477 Watson et al. Aug 2015 A1
20150253574 Thurber Sep 2015 A1
20150258431 Strafford et al. Sep 2015 A1
20150258432 Tokubo et al. Sep 2015 A1
20150268920 Schapiro Sep 2015 A1
20150370320 Connor et al. Dec 2015 A1
20160351362 Gassoway et al. Dec 2016 A1
20160357249 Webb et al. Dec 2016 A1
20160357261 Webb et al. Dec 2016 A1
20160361637 Bristol et al. Dec 2016 A1
20160361638 Rogoza et al. Dec 2016 A1
20160363996 Rogoza et al. Dec 2016 A1
20160364910 Katz et al. Dec 2016 A1
20170128828 Long May 2017 A1
20170131767 Long May 2017 A1
20170136351 Long May 2017 A1
20170168303 Petrov et al. Jun 2017 A1
20170177102 Long et al. Jun 2017 A1
20170189798 Rogoza et al. Jul 2017 A1
20170189799 Anderson et al. Jul 2017 A1
20170189802 Rogoza et al. Jul 2017 A1
20170192495 Drinkwater et al. Jul 2017 A1
20170192506 Andersen et al. Jul 2017 A1
Non-Patent Literature Citations (39)
Entry
U.S. Appl. No. 14/934,073 by Long, C., et al., filed Nov. 5, 2015.
U.S. Appl. No. 14/934,090 by Long, C., et al., filed Nov. 5, 2015.
U.S. Appl. No. 14/939,431 by Long, C., et al., filed Nov. 12, 2015.
U.S. Appl. No. 14/975,049 by Long, C., et al., filed Dec. 18, 2015.
U.S. Appl. No. 29/529,915 by Chen, Y., et al., filed Jun. 11, 2015.
U.S. Appl. No. 29/571,025 by Chen, Y., et al., filed Jul. 13, 2016.
U.S. Appl. No. 29/571,027 by Chen, Y., et al., filed Jul. 13, 2016.
U.S. Appl. No. 29/571,030 by Chen, Y., et al., filed Jul. 13, 2016.
U.S. Appl. No. 14/991,875 by Drinkwater, J., et al., filed Jan. 8, 2016.
U.S. Appl. No. 15/172,099 by Rogoza, B., et al., filed Jun. 2, 2016.
U.S. Appl. No. 15/173,474 by Rogoza, B., et al., filed Jun. 3, 2016.
U.S. Appl. No. 15/173,558 by Andersen, B., et al., filed Jun. 3, 2016.
U.S. Appl. No. 15/177,121 by Anderson, B., et al., filed Jun. 2, 2016.
U.S. Appl. No. 29/579,091 by Chen, Y., et al., filed Sep. 27, 2016.
Notice of Allowance dated Jun. 29, 2016, for U.S. Appl. No. 29/529,915 by Chen, Y., et al., filed Jun. 11, 2015.
Restriction Requirement dated Apr. 8, 2016, for U.S. Appl. No. 29/529,915 by Chen, Y., et al., filed Jun. 11, 2015.
Notice of Allowance dated Sep. 27, 2016, for U.S. Appl. No. 29/529,915 by Chen, Y., et al., filed Jun. 11, 2015.
Non-Final Office Action dated Mar. 23, 2017 for U.S. Appl. No. 14/934,073 by Long, C., et al., filed Nov. 5, 2015.
Non-Final Office Action dated Apr. 7, 2017 for U.S. Appl. No. 14/975,049 by Long, C., et al., filed Dec. 18, 2015.
Ex Parte Quayle Action dated May 5, 2017 for U.S. Appl. No. 29/571,027 by Chen, Y., et al., filed Jul. 13, 2016.
Ex Parte Quayle Action dated May 5, 2017 for U.S. Appl. No. 29/571,030 by Chen, Y., et al., filed Jul. 13, 2016.
Ex Parte Quayle Action dated May 8, 2017 for U.S. Appl. No. 29/571,025 by Chen, Y., et al., filed Jul. 13, 2016.
Notice of Allowance dated Jun. 15, 2017 of U.S. Appl. No. 29/571,030 by Chen, Y., et al., filed Jul. 13, 2016.
Notice of Allowance dated Jun. 21, 2017 for U.S. Appl. No. 29/571,025 by Chen, Y., et al., filed Jul. 13, 2016.
Notice of Allowance dated Jun. 22, 2017 for U.S. Appl. No. 29/571,027 by Chen, Y., et al., filed Jul. 13, 2016.
Supplemental Notice of Allowability dated Jul. 10, 2017 of U.S. Appl. No. 29/571,030 by Chen, Y., et al., filed Jul. 13, 2016.
Supplemental Notice of Allowability dated Jul. 6, 2017 for U.S. Appl. No. 29/571,025 by Chen, Y., et al., filed Jul. 13, 2016.
Non-Final Office Action dated Jul. 17, 2017 for U.S. Appl. No. 14/939,431 by Long, C., et al., filed Nov. 12, 2015.
Non-Final Office Action dated Aug. 24, 2017 for U.S. Appl. No. 14/991,875 by Drinkwater, J., et al., filed Jan. 8, 2016.
Notice of Allowance dated Sep. 15, 2017 for U.S. Appl. No. 14/975,049 by Long, C. et al., filed Dec. 15, 2015.
U.S. Appl. No. 29/611,924 by Chen, Y., et al., filed Jul. 26, 2017.
Notice of Allowance dated Dec. 22, 2017 for U.S. Appl. No. 14/991,875 by Drinkwater, J., et al., filed Jan. 8, 2016.
“STEM System” accessed and printed from URL <http://sixense.com/wireless>, 5 pages.
Final Office Action dated Nov. 2, 2017 for U.S. Appl. No. 14/934,073 by Long, C., et al., filed Nov. 5, 2015.
Notice of Allowance dated Oct. 20, 2017 for U.S. Appl. No. 14/934,090 by Long, C., et al., filed Nov. 5, 2015.
Restriction Requirement dated Oct. 12, 2017 for U.S. Appl. No. 29/579,091 by Chen, Y., et al., filed Sep. 27, 2016.
Supplemental Notice of Allowability dated Sep. 29, 2017 for U.S. Appl. No. 29/571,027 by Chen, Y., et al., filed Jul. 13, 2016.
Tested, “Hands-On with Sixense STEM VR Motion-Tracking System” accessed and printed from URL <https://www.youtube.com/watch?v=C8z-On6FBTM>, 5 pages.
Non-Final Office Action dated Nov. 1, 2017 for U.S. Appl. No. 15/173,558 by Andersen, B., et al., filed Jun. 3, 2016.
Related Publications (1)
Number Date Country
20170139481 A1 May 2017 US