The present invention relates to the field of video games. More specifically, the present invention relates to plug-and-play video games with wireless controllers.
Electronic video games have enjoyed wide acceptance in the marketplace, and many have incorporated wireless configurations to increase their ease of use and enjoyment. Conventional wireless video games, however, require a user to stand near the console and limit the types of motion that the game can detect. Conventional video games also require a user to hold one or more active elements that increase the cost of the video device. The user registers a motion in the game by performing a preset motion. One drawback of such an approach is that the user cannot easily switch between the different types of activities included within the video game. Many additional game components must be stored when the game is not in use. Another drawback is that game play is limited to the predetermined number of motions. The game cannot recognize foot placement or hand motion except in precisely defined positions, nor can the game recognize or incorporate movement as a game play feature. Still other games require the user to carry or hold an electronic controller with buttons or with accelerometers and radio transmitters. Such complex controllers can increase the complexity and price of a game and require the user to interact with the game in an artificial and unnatural way.
An ideal video game control device would merely sense the position of each of the user's feet or hands without requiring the user to hold additional active components, such as those that use expensive radio frequency oscillators. Further, such a device would determine an activity of the user by sensing the type of movement performed. Such a device would be extremely easy for the user to operate and would greatly simplify and enhance the playing of video games.
The present invention allows a user to interact with a video game or other electronic device using natural motions. The system includes a light (e.g., optical) transmitter or a plurality of such transmitters. The system also includes at least one optical receiver. Preferably, the receiver is an array of receivers, such as in a CCD (charge coupled device) camera. An optical system, such as one or more lenses, can optionally be mounted to the transmitters. Likewise, an optical system, such as one or more lenses, can optionally be mounted to the receivers. The user places her body into the field of view of the transmitters, and optical radiation is reflected from her body onto the receivers. In some embodiments, the user wears or affixes to her clothes or shoes a retroreflector to enhance the radiation that is received by the receivers. The user can interact with the game using natural movements. Alternatively, a user can hold an object that simulates a real-world object, such as a retroreflective baseball bat or a hairbrush. Further, a user can optionally hold a real baseball bat to simulate batting in a game.
In a first aspect of the invention, an apparatus for tracking motion includes a transmitter configured to irradiate one or more objects, a receiver that includes an array of detectors arranged to receive reflected radiation from the one or more objects, a controller programmed to use the reflected radiation to track a motion of the one or more objects, and one or more retroreflectors designed for attachment to corresponding hands or feet of a user. The apparatus includes a monitor configured to display an image, such as a pictorial representation of the one or more objects, substantially reproducing the motion of the one or more objects.
In one embodiment, the transmitter includes multiple light emitting diodes arranged to substantially uniformly illuminate an area containing the one or more objects. The array of detectors includes an array of charge coupled devices, typically arranged in a rectangle, although the array can also be square, otherwise uniform, or even non-uniform. Preferably, the receiver includes one or more lenses arranged to focus the reflected radiation onto the array of charge coupled devices.
In one embodiment, the one or more objects include feet, and the controller is further programmed to separately track motion of both feet.
In one embodiment, the controller is also programmed to compare the motion to a predetermined target motion. In this embodiment, when the apparatus is used to track dance steps, the steps can be compared to the correct steps, allowing the user to measure her progress.
In other embodiments, the one or more objects include a hairbrush or an item of sporting equipment, such as a bat, a club, a racket, a glove, a ball, a bow, a gun, or a fishing reel.
In a second aspect of the invention, a method of tracking motion includes irradiating one or more objects, each supporting a corresponding retroreflector attached to a hand or foot, receiving radiation reflected from the one or more retroreflectors onto an array of detectors, and tracking motion of the one or more objects using the reflected radiation. An image reproducing the motion is displayed on a computer monitor. The image includes a picture of the one or more objects, such as a picture of dancing feet, a rolling bowling ball, a gun, or a swinging golf club or bat.
In a third aspect of the invention, an apparatus includes a transceiver, a controller, and a monitor. The transceiver includes a transmitter and an array of detectors. The transmitter is arranged to irradiate one or more retroreflectors attached to feet, and the array of detectors is positioned to receive reflected radiation from the one or more retroreflectors. The controller is programmed to use the reflected radiation to determine motion of the one or more retroreflectors. The monitor displays an image substantially reproducing the motion in a virtual game environment.
The novel features of the invention are set forth in the appended claims. However, for purposes of explanation, several embodiments of the invention are set forth in the following figures.
In the following description, numerous details and alternatives are set forth for purposes of explanation. However, one of ordinary skill in the art will realize that the invention can be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order not to obscure the description of the invention with unnecessary detail.
To be detected, the strength of the reflected radiation must exceed background radiation levels.
Preferably, the detection array 24 is configured as a two dimensional array of receiver detectors, such as an array of CCD devices. In some embodiments, there can be receive optics positioned over the detection array 24. The detection array 24 can be positioned to view an area of a surface, such as the floor. A user wearing retroreflectors 25 within the field of view on the floor will reflect radiation from the retroreflectors 25 to the detection array 24. Certain elements in the detection array 24 will correspond to certain positions within the field of view on the floor. When a user's foot wearing a retroreflector 25 is in a location in the field of view, light radiation will be reflected and impinge on the corresponding elements in the detection array 24. In this way, the detection array 24 can operate in a manner analogous to a digital camera. It will be appreciated that the detection array 24 can be configured to identify a user's foot when the sensed radiation exceeds a predetermined threshold for each of the elements in the detection array 24.
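By way of illustration only, the following sketch shows how the per-element thresholding described above might locate a retroreflector on the detection array 24. The frame size, threshold value, and function name are hypothetical assumptions made for this example, not limitations of the invention.

```python
# Illustrative sketch: locate a retroreflector on a 2D detection array by
# flagging elements whose reading exceeds a threshold, then taking the
# centroid of the flagged elements. The 8x8 frame size and the threshold
# value are assumptions chosen for this example.

THRESHOLD = 200  # counts; sensed radiation must exceed background levels

def locate_reflector(frame):
    """Return the (row, col) centroid of above-threshold elements, or None."""
    bright = [(r, c)
              for r, row in enumerate(frame)
              for c, value in enumerate(row)
              if value > THRESHOLD]
    if not bright:
        return None  # reflected signal did not exceed the threshold
    row = sum(r for r, _ in bright) / len(bright)
    col = sum(c for _, c in bright) / len(bright)
    return row, col

# A mostly dark frame with one bright spot where a foot-worn reflector sits.
frame = [[30] * 8 for _ in range(8)]
frame[6][1] = frame[6][2] = frame[5][1] = 250
print(locate_reflector(frame))  # approximately (5.67, 1.33)
```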
The intelligence module 28 can detect a wide range of user motions or activities. The intelligence module 28 comprises a microprocessor configured to interact with the transmitters 14, 16, 18, 20 and the detection array 24. The intelligence module 28 interprets reflected radiation from the retroreflectors 25 and determines the user motion. The intelligence module 28 is configured to mimic an “intuitive” controller since multiple user activities can be determined. For example, the user can simulate a baseball swing and the intelligence module 28 determines the user motion to be a baseball swing. Alternatively, the user can simulate a golf swing and the intelligence module 28 determines the user motion to be a golf swing. The intelligence module 28 can be configured to distinguish the action of the user to be a baseball swing or a golf swing. The intelligence module 28 can determine patterns of reflected radiation received from the detection array 24 since certain elements in the detection array 24 correspond to certain positions within the three dimensional field of view of the detection array 24. The intelligence module 28 can also determine the strength of reflected radiation and detect if the user motion is mostly a vertical motion as in a golf swing or a horizontal motion as in a baseball swing.
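A minimal sketch of how the vertical-versus-horizontal determination described above might be made from a sequence of tracked positions follows; the position format and the simple extent comparison are assumptions made for illustration.

```python
# Illustrative sketch: decide whether a tracked swing is mostly vertical
# (golf-like) or mostly horizontal (baseball-like) by comparing the extent
# of motion along each axis. Positions are assumed to be (x, y) samples.

def classify_swing(positions):
    xs = [x for x, _ in positions]
    ys = [y for _, y in positions]
    horizontal_extent = max(xs) - min(xs)
    vertical_extent = max(ys) - min(ys)
    return "golf swing" if vertical_extent > horizontal_extent else "baseball swing"

golf_arc = [(0, 0), (1, 3), (2, 6), (3, 8)]      # mostly vertical motion
baseball_arc = [(0, 4), (3, 5), (6, 4), (9, 3)]  # mostly horizontal motion
print(classify_swing(golf_arc))      # golf swing
print(classify_swing(baseball_arc))  # baseball swing
```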
While the array of detectors 200 is an 8×8 square, detectors of other sizes and configurations can also be used. Some examples include larger square arrays, such as a 256×256 array; smaller square arrays, such as a 4×4 array; rectangular arrays; other uniform arrays; and non-uniform arrays. Those skilled in the art will recognize that different size and different configuration arrays can be selected to fit the application at hand.
Referring to the device 10, a video game can be played according to a sequence of steps 300.
In the step 330, the method determines whether the game is finished. If the game is finished, the method proceeds to the step 335, where the game ends. Otherwise, the method loops back to the step 315.
The steps 300 are only exemplary. Some steps can be added, some steps can be deleted, and the steps can be performed in different orders than the one shown.
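For explanatory purposes only, the steps 300 can be pictured as a simple loop like the sketch below; the helper names and the mapping of calls onto the numbered steps are assumptions, since only the steps 315, 330, and 335 are identified above.

```python
# Illustrative sketch of the exemplary steps 300 as a loop. The device
# class and its methods are hypothetical stand-ins for the video game
# device 10; the step correspondences noted in comments are assumptions.

def run_game(device):
    while True:
        motion = device.track_motion()  # sense the reflectors (cf. step 315)
        device.update_display(motion)   # reproduce the motion on the screen 12
        if device.game_finished():      # cf. step 330: is the game finished?
            break                       # cf. step 335: the game ends

class DemoDevice:
    """Hypothetical stand-in for the video game device 10."""
    def __init__(self, frames=3):
        self.frames = frames
    def track_motion(self):
        self.frames -= 1
        return {"left_foot": (0, 0), "right_foot": (1, 0)}
    def update_display(self, motion):
        print("displaying", motion)
    def game_finished(self):
        return self.frames == 0

run_game(DemoDevice())
```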
The retroreflectors 25 located within the volume of space sensed by embodiments of the present invention will be represented on the display screen 12 at a particular location. As the retroreflectors 25 move and are positioned in a new location, the corresponding analogue change in position will be displayed on the screen 12. More precise position identification can be obtained through the use of precision components, such as optical lenses and circuitry.
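One simple way to produce this analogue change in displayed position is a linear mapping from detection-array coordinates to screen coordinates, sketched below; the array and screen dimensions are assumptions for illustration.

```python
# Illustrative sketch: map a reflector's location on the detection array
# to a location on the display screen 12 so that movement in the field of
# view produces an analogous movement on screen. Dimensions are assumed.

ARRAY_W, ARRAY_H = 8, 8        # detection array elements (assumed)
SCREEN_W, SCREEN_H = 640, 480  # display resolution (assumed)

def to_screen(col, row):
    x = col / (ARRAY_W - 1) * (SCREEN_W - 1)
    y = row / (ARRAY_H - 1) * (SCREEN_H - 1)
    return round(x), round(y)

print(to_screen(0, 0))  # (0, 0): one corner of the field of view
print(to_screen(7, 7))  # (639, 479): the opposite corner of the screen
```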
In an exemplary embodiment of the present invention, it is desirable for the video game device 10 to sense the location of more than one object. Each of the player's feet can be sensed separately.
The video game device 10 separately identifies a left foot movement and a right foot movement. It can sense forward, backward and sideways movement. When utilizing embodiments of the present invention, the location of each foot of the player 40 can be uniquely determined by having a retroreflector 25 attached to each foot of the player 40.
When utilizing embodiments of the present invention with this game, the control circuitry 26 can be set to register movement of the retroreflectors 25 after a particular threshold of reflected signal is received. This signifies that the player's feet are at least as close as some predefined limit to the detection array 24. In the event that the player's feet are farther from the detection array 24 than allowable to achieve the appropriate threshold, no foot movement is indicated on the game screen 12. When the player's feet and the retroreflectors 25 approach the detection array 24 sufficiently close that the threshold is crossed, the display screen 12 will then indicate movement of the player's left or right foot.
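A sketch of such a signal-strength gate appears below; the threshold value and the frame format are assumptions chosen for illustration.

```python
# Illustrative sketch: register foot movement only when the total
# reflected signal exceeds a proximity threshold, indicating the player's
# feet are close enough to the detection array. Values are assumptions.

PROXIMITY_THRESHOLD = 1500  # summed counts over the whole array (assumed)

def movement_registered(frame):
    return sum(sum(row) for row in frame) > PROXIMITY_THRESHOLD

near = [[40] * 8 for _ in range(8)]  # strong reflection: 2560 counts total
far = [[10] * 8 for _ in range(8)]   # weak reflection: 640 counts total
print(movement_registered(near))  # True  -> screen 12 indicates movement
print(movement_registered(far))   # False -> no foot movement indicated
```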
The transmitters 14, 16, 18, 20 and the detection array 24 can be used to sense the reflected signal from the retroreflectors 25 and avoid the problem of a left foot being misinterpreted as a right foot. Accordingly, the video game device 10 can distinguish the player's left foot from her right foot using kinematic rules, which make assumptions such as that at least one foot is always on the ground, in both static and dynamic states, with the exception of jumps.
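One way such a kinematic rule might be applied is sketched below: because at least one foot is assumed to stay on the ground between frames, the detection nearest a foot's previous position keeps that foot's label. The data format and the nearest-neighbor assignment are illustrative assumptions.

```python
# Illustrative sketch: keep left/right foot labels consistent across
# frames. The grounded foot barely moves, so the labeling that minimizes
# total displacement from the previous frame is preferred. The blob
# format and this simple assignment rule are assumptions.

def dist2(a, b):
    return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

def label_feet(prev, detections):
    """prev: {'left': (x, y), 'right': (x, y)}; detections: two (x, y) blobs."""
    a, b = detections
    keep = dist2(prev["left"], a) + dist2(prev["right"], b)
    swap = dist2(prev["left"], b) + dist2(prev["right"], a)
    if keep <= swap:
        return {"left": a, "right": b}
    return {"left": b, "right": a}

prev = {"left": (2.0, 0.0), "right": (5.0, 0.0)}
print(label_feet(prev, [(5.1, 0.2), (2.3, 1.5)]))
# -> {'left': (2.3, 1.5), 'right': (5.1, 0.2)}: the nearly stationary blob
#    is taken to be the grounded right foot; the moving blob is the left.
```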
The detection array 24 can be positioned to view an area, such as the three dimensional space in front of the video display device 12. A user 40 wearing the first and second pair of retroreflectors 25a, 25b within the field of view in the area will reflect light radiation from the retroreflectors 25a, 25b to the detection array 24. Certain elements in the detection array 24 will correspond to certain positions within the field of view in the area. When a user's feet wearing the retroreflectors 25a or the user's hands wearing the retroreflectors 25b are in a location in the field of view, radiation will be reflected and impinge on the corresponding elements in the detection array 24. In this way, the detection array 24 can operate in a manner analogous to a digital camera. It will be appreciated that the detection array 24 can be configured to identify a user's feet or hands when the sensed reflected radiation exceeds a predetermined threshold for each of the elements in the detection array 24.
The intelligence module 28 interprets reflected radiation from the first and second pairs of retroreflectors 25a, 25b and determines the user motion. The intelligence module 28 is configured to mimic an intuitive controller since multiple user activities can be determined. For example, the user can simulate a baseball swing and the intelligence module 28 determines the user motion to be a baseball swing. The intelligence module 28 can determine patterns of reflected radiation received from the detection array 24 since certain elements in the detection array correspond to certain positions within the three dimensional field of view of the detection array 24. The intelligence module 28 and the control circuit 26 are configured to detect and determine if reflected radiation 36a is from the first pair of retroreflectors 25a or reflected radiation 36b is from the second pair of retroreflectors 25b. Identifying the source of reflected radiation 36a, 36b can be facilitated with the filter elements 27a, 27b. The filter elements 27a, 27b can be active or passive devices that modify the transmitted radiation 34. Alternatively, the intelligence module 28 and the control circuit 26 can similarly be configured to distinguish the movement of the user's right hand from the left hand or the right foot from the left foot.
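Assuming, purely for illustration, that the filter elements 27a, 27b cause the two pairs of retroreflectors to respond differently in two filtered measurement channels, the classification might be sketched as follows; the channel arrangement and values are hypothetical.

```python
# Illustrative sketch: attribute a detected blob to the foot reflectors
# 25a or the hand reflectors 25b by comparing its strength in two
# filtered channels. That the filters 27a, 27b separate the reflections
# into such channels is an assumption made for this example.

def classify_blob(channel_a, channel_b):
    """Return which retroreflector pair a blob most likely belongs to."""
    return "feet (25a)" if channel_a > channel_b else "hands (25b)"

print(classify_blob(channel_a=220, channel_b=35))  # feet (25a)
print(classify_blob(channel_a=30, channel_b=210))  # hands (25b)
```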
The detection array 24 can be positioned to receive radiation within an area, such as the three dimensional space in front of a display device configured as a pretend mirror 11. A user wearing the retroreflectors 25 within the field of view in the area will reflect radiation from the retroreflectors 25 to the detection array 24. Certain elements in the detection array 24 will correspond to certain positions within the field of view in the area. When a user's hands wearing the retroreflectors 25 are in a location in the field of view, light radiation will be reflected and impinge on the corresponding elements in the detection array 24. In this way, the detection array 24 can operate in a manner analogous to a digital camera. It will be appreciated that the detection array 24 can be configured to identify a user's hands when the sensed radiation exceeds a predetermined threshold for each of the elements in the detection array 24.
The intelligence module 28 interprets reflected radiation from the retroreflectors 25 and determines the user motion. The intelligence module 28 is configured to mimic an intuitive controller since multiple user activities can be determined. The intelligence module 28 can determine patterns of reflected radiation received from the detection array 24 since certain elements in the detection array correspond to certain positions within the three dimensional field of view of the detection array 24. The intelligence module 28 and the control circuit 26 can be configured to distinguish the movement of the user's right hand from the left hand. For example, the user's hand motion can be determined to be a grooming activity, such as combing the hair or brushing the teeth. In this way, the video gaming device can facilitate learning proper grooming habits as a grooming game.
The intelligence module 128 interprets reflected radiation from the user motion. The intelligence module 128 can determine patterns of reflected radiation received from the transceivers 115, 119, 123 within the three dimensional field of view. The intelligence module 128 and the control circuit 126 can be configured to distinguish the movement of the user's right hand from the left hand. In an alternative embodiment, the brush 132 can include a filter element as in previous embodiments. In still another embodiment, the user can wear retroreflectors as in previous embodiments.
In an alternative embodiment, a cooking game with multiple venues can be substituted for the grooming game of the previous embodiment. In another embodiment, driving a car or flying a plane can be simulated using a device in accordance with the present invention. In still another embodiment, electronic devices such as personal computers or DVD players can be controlled by interpreting a user's movements as certain commands.
In some embodiments, there can be receive optics positioned over the detection array 24. The detection array 24 can be positioned to view an area in front of the detection array 24. A user holding the bat 42 within the field of view will reflect light radiation from the bat 42 to the detection array 24. Certain elements in the detection array 24 will correspond to certain positions within the field of view. When the bat 42 is in a location in the field of view, radiation will be reflected and impinge on the corresponding elements in the detection array 24. In this way, the detection array 24 can operate in a manner analogous to a digital camera. It will be appreciated that the detection array 24 can be configured to identify the bat 42 when the sensed radiation exceeds a predetermined threshold for each of the elements in the detection array 24.
The intelligence module 28 in the console 44 interprets reflected radiation from the bat 42 and determines the user motion. The intelligence module 28 is configured to mimic an “intuitive” controller since multiple user activities can be determined. The intelligence module 28 can determine the strength of reflected radiation and detect if the user motion is mostly a vertical motion as in a golf swing or a horizontal motion as in a baseball swing. The intelligence module 28 interprets and determines a swing arc “A” to be a baseball swing and registers a response on the display 12 by manipulating the cursor or presentation 32.
In some embodiments, there can be receive optics positioned over the detection array 24. The detection array 24 can be positioned to view an area in front of the detection array 24. A user holding the golf club 44 within the field of view will reflect light radiation from the golf club 44 to the detection array 24. Certain elements in the detection array 24 will correspond to certain positions within the field of view. When the golf club 44 is in a location in the field of view, radiation will be reflected and impinge on the corresponding elements in the detection array 24. In this way, the detection array 24 can operate in a manner analogous to a digital camera. It will be appreciated that the detection array 24 can be configured to identify the golf club 44 when the sensed radiation exceeds a predetermined threshold for each of the elements in the detection array 24.
The intelligence module 28 in the console 44 interprets reflected radiation from the golf club 44 and determines the user motion. The intelligence module 28 is configured to mimic an “intuitive” controller since multiple user activities can be determined. The intelligence module 28 can determine the strength of reflected radiation and detect if the user motion is mostly a vertical motion as in a golf swing or a horizontal motion as in a baseball swing. The intelligence module 28 interprets and determines a swing arc “B” to be a golf swing and registers a response on the display 12 by manipulating the cursor or presentation 32.
Still other embodiments exploit the fact that when flesh is close to an optical receiver (e.g., within one foot or less), the reflective nature of flesh approximates that of a retroreflector. In these embodiments, flesh, such as an exposed hand or foot, can substitute for a retroreflector. This permits a user's hands to be imaged for gaming and control applications. As one example, players use a computer to compete against one another in the game Rock, Scissors, Paper. The gestures for rock, scissors, and paper are visually different enough for a computer to robustly recognize their shapes and process them in real time.
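A deliberately crude sketch of such recognition is given below, using only the size of the thresholded hand silhouette; the cutoff values and the area heuristic are assumptions, and a robust recognizer would examine shape rather than area alone.

```python
# Illustrative sketch: distinguish rock / paper / scissors from a
# thresholded binary silhouette of a nearby hand by its bright-pixel
# fraction (fist small, two fingers medium, open hand large). The cutoff
# values and the area-only heuristic are assumptions for this example.

def classify_gesture(mask):
    total = sum(sum(row) for row in mask)
    fraction = total / (len(mask) * len(mask[0]))
    if fraction < 0.25:
        return "rock"      # closed fist: small silhouette
    if fraction < 0.45:
        return "scissors"  # two extended fingers: medium silhouette
    return "paper"         # open hand: large silhouette

# A 2x2 bright patch on an 8x8 mask: 4/64 of the area, read as a fist.
fist = [[1 if 3 <= r <= 4 and 3 <= c <= 4 else 0 for c in range(8)]
        for r in range(8)]
print(classify_gesture(fist))  # rock
```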
While the examples illustrate using embodiments of the invention in various games and activities, it will be appreciated that embodiments can also be used in other games, including, but not limited to, sword games, ping pong, billiards, archery, rifle shooting, aviation (e.g., flight simulation), and race car driving, to name only a few. Further, while some embodiments describe transmitting and receiving light energy for tracking objects, other types of radiant energy can be used. Further, while the examples discussed are generally directed to video games, it will be appreciated that the invention finds use in applications other than games. One other embodiment, for example, includes a self-contained electronic device that tracks motion as described above and provides audio feedback.
While the invention has been described with reference to numerous specific details, one of ordinary skill in the art will recognize that the invention can be embodied in other specific forms without departing from the spirit of the invention. Thus, one of ordinary skill in the art will understand that the invention is not to be limited by the foregoing illustrative details, but rather is to be defined by the appended claims.
This application claims priority under 35 U.S.C. §119(e) of the U.S. provisional patent application Ser. No. 61/113,933, filed Nov. 12, 2008, and titled “Plug and Play Wireless Video Game,” which is hereby incorporated by reference.