The present invention is directed to the field of analyzing motion and more specifically to an apparatus, system and method for interpreting and reproducing physical motion.
Motion sensing devices and systems, including their use in virtual reality devices, are known in the art; see U.S. Pat. App. Pub. No. 2003/0024311 to Perkins; U.S. Pat. App. Pub. No. 2002/0123386 to Perlmutter; U.S. Pat. No. 5,819,206 to Horton et al.; U.S. Pat. No. 5,898,421 to Quinn; U.S. Pat. No. 5,694,340 to Kim; and U.S. Pat. No. RE37,374 to Roston et al., all of which are incorporated herein by reference.
Accordingly, there is a need for an apparatus, system and method that can facilitate the interpretation and reproduction of sensed physical motion.
An apparatus, system and method are disclosed for turning physical motion into an interpretable language which, when formed into sentences, represents the original motion. This system may be referred to herein as a “Motion Description System.” Physical motion is defined as motion in one, two or three dimensions, with anywhere from one to six degrees of freedom. Language is defined as meaning applied to an abstraction.
The invention is described with reference to the FIGURE of the drawing, in which:
The FIGURE is a schematic illustration of a system used to turn physical motion into an interpretable language, according to various embodiments of the invention.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed. It may be noted that, as used in the specification and the appended claims, the singular forms “a”, “an” and “the” include plural referents unless the context clearly dictates otherwise. References cited herein are hereby incorporated by reference in their entirety, except to the extent that they conflict with teachings explicitly set forth in this specification.
Referring now to the FIGURE of the drawing, the FIGURE constitutes a part of this specification and illustrates exemplary embodiments of the invention. It is to be understood that in some instances various aspects of the invention may be shown schematically or may be exaggerated to facilitate an understanding of the invention.
The FIGURE is a schematic illustration of a system 1000 used to turn physical motion into an interpretable language, according to various embodiments of the present invention. When formed into sentences, the interpretable language may be used to abstractly replace the original physical motion. Embodiments of system components are described below.
Motion Sensing
In one embodiment, a motion sensing unit 10 is described as follows. Physical motion is captured using a motion capture device 12 such as, but not limited to, one or more of the following: accelerometers, gyroscopes, RF tags, magnetic sensors, compasses, global positioning units, fiber-optic interferometers, piezo sensors, strain gauges, cameras, etc. Data is received from the motion capture device 12 and transferred to the motion interpretation unit 100, for example via a data reception and transmission device 14. As shown by the multiple embodiments illustrated, the motion data may be transferred directly to the motion interpretation unit 100 or indirectly via an external application 20, such as a program that utilizes the raw motion data as well as the commands received from the motion interpretation unit 100 (described below). Data transfer may be accomplished by direct electrical connection, by wireless data transmission or by other data transfer mechanisms known to those of ordinary skill in the art.
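By way of illustration only, the flow of data from the motion capture device 12 through the data reception and transmission device 14 may be sketched as follows. This is a minimal Python sketch; the device interface, the polling loop and the sample rate are assumptions made for the illustration and are not part of the specification.

```python
import time

class MotionSensingUnit:
    """Illustrative sketch of motion sensing unit 10: polls a motion
    capture device 12 and forwards raw samples through a data
    reception and transmission device 14 (both interfaces hypothetical)."""

    def __init__(self, capture_device, transmitter, sample_rate_hz=100):
        self.capture_device = capture_device  # e.g., accelerometer, gyroscope
        self.transmitter = transmitter        # wired or wireless link
        self.period = 1.0 / sample_rate_hz    # assumed fixed sampling period

    def run(self):
        while True:
            sample = self.capture_device.read()  # raw motion data
            self.transmitter.send(sample)        # to motion interpretation unit 100
            time.sleep(self.period)
```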
Motion Interpretation
In one embodiment, a motion interpretation unit 100 contains the following components:
Data Processor 110
Raw motions are periodically sampled from the one or more physical motion capture devices 12 of the motion sensing unit 10.
Raw non-motion data is periodically sampled and input from a non-motion data device 112 (e.g., keyboard, voice, mouse, etc.).
A single sample of Complex Motion data is preliminarily processed. The Complex Motion data is defined as the combined sample of all raw physical motion captured by the motion capture device(s) and all non-motion data as defined above.
All the Single Degree Motion (SDM) components are identified from the Complex Motion data. The Single Degree Motion components are defined as the expression of a multi-dimensional motion in terms of single dimension vectors in a given reference frame.
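For illustration, the decomposition performed by the data processor 110 might look like the following sketch, in which a Complex Motion sample is split into (axis, magnitude) pairs, one per degree of freedom. The six-axis field layout and the names below are assumptions for the sketch, not a claimed data structure.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class ComplexMotionSample:
    """Combined sample of all raw motion and all non-motion data."""
    motion: Dict[str, float]  # e.g., {"x": 0.1, "y": -0.2, "yaw": 0.4, ...}
    non_motion: Dict[str, str] = field(default_factory=dict)  # e.g., {"key": "shift"}

def single_degree_components(sample: ComplexMotionSample) -> List[Tuple[str, float]]:
    """Express a multi-dimensional motion as single-dimension vectors
    in the given reference frame: one (axis, magnitude) pair per degree."""
    return list(sample.motion.items())
```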
Token Identifier (TI) or Tokenizer 120
The tokenizer 120 receives as input a stream of Single Degree Motion component samples.
Each time-sequential subset of samples is marked as a possible token.
A token dictionary 122 exists. The token dictionary is defined as a list of simple meanings given to SDM components. The token dictionary is editable.
Sample groups marked for tokenization are compared against the token dictionary 122 and are either discarded (as bad syntax) or given token status.
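A minimal sketch of the tokenizer 120 operating against an editable token dictionary 122 follows. The (axis, direction) keying and the sign-based matching rule are placeholders invented for the illustration; the specification does not prescribe a particular matching scheme.

```python
# Hypothetical token dictionary 122: simple meanings given to SDM
# components, keyed here by (axis, direction). The dictionary is editable.
TOKEN_DICTIONARY = {
    ("x", "+"): "RIGHT",
    ("x", "-"): "LEFT",
    ("y", "+"): "UP",
    ("y", "-"): "DOWN",
}

def tokenize(sdm_components):
    """Compare marked sample groups against the dictionary; keep those
    that match (token status) and discard the rest (bad syntax)."""
    tokens = []
    for axis, magnitude in sdm_components:
        direction = "+" if magnitude >= 0 else "-"
        token = TOKEN_DICTIONARY.get((axis, direction))
        if token is not None:  # unmatched groups are discarded as bad syntax
            tokens.append(token)
    return tokens
```

The `sdm_components` input here is the stream of (axis, magnitude) pairs produced by the data processor sketch above.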
Parser 130
The parser 130 receives as input a stream of tokenized 3D Complex Motion/Non-Motion data.
Using a language specification 132, the tokens are grouped into sentences. In one embodiment, the system contains a default language specification.
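For illustration, a language specification 132 could be represented as a mapping from valid token sequences to sentences, with the parser performing a longest-match grouping. The default grammar below is an assumption made for the sketch.

```python
# Hypothetical default language specification 132: token sequences that
# form valid sentences.
LANGUAGE_SPEC = {
    ("LEFT", "RIGHT"): "WAVE",
    ("UP", "UP"): "RAISE",
}

def parse(tokens):
    """Group a token stream into sentences by greedy longest match
    against the language specification."""
    sentences, i = [], 0
    while i < len(tokens):
        for length in (2, 1):  # try the longest candidate first
            candidate = tuple(tokens[i:i + length])
            if candidate in LANGUAGE_SPEC:
                sentences.append(LANGUAGE_SPEC[candidate])
                i += length
                break
        else:
            i += 1  # no sentence starts here; skip this token
    return sentences
```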
Command Generator 140
The command generator 140 receives as input a sentence and outputs commands based on sentences and non-motion related inputs.
At any time a user may create or teach the system new language (e.g., tokens, sentences) by associating a raw motion with an output command. Output commands can include, but are not limited to, application context specific actions, keystrokes and mouse movements. In one embodiment, the output command is sent to the external application 20.
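The sentence-to-command mapping, and the run-time teaching of new associations, might be sketched as follows. The default command names and the `execute` callback on the external application 20 are assumptions made for the illustration.

```python
class CommandGenerator:
    """Illustrative sketch of command generator 140: maps sentences to
    output commands and accepts user-taught associations."""

    def __init__(self, external_application=None):
        # Hypothetical default associations.
        self.commands = {"WAVE": "mouse_move_left_right",
                         "RAISE": "keystroke_page_up"}
        self.external_application = external_application

    def teach(self, sentence, command):
        """Associate a sentence (derived from a raw motion) with a new
        output command, e.g., a keystroke or mouse movement."""
        self.commands[sentence] = command

    def generate(self, sentences):
        """Emit the output command for each recognized sentence."""
        for sentence in sentences:
            command = self.commands.get(sentence)
            if command is not None and self.external_application is not None:
                self.external_application.execute(command)  # to application 20
```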
Languages may be context driven and created for any specific application.
In one embodiment, for example golf, motions of the club may be interpreted to mean “good swing,” “fade to the right,” etc.
The Motion Description System is suitable for a number of applications, such as the golf example described above.
Other embodiments of the invention will be apparent to those skilled in the art from a consideration of the specification or practice of the invention disclosed herein. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the invention being indicated by the following claims.
The present application claims priority to U.S. Provisional Application No. 60/660,261, filed Mar. 10, 2005, entitled “System and Method for Interpreting and Reproducing Physical Motion,” which is incorporated herein by reference.
U.S. Patent Documents

Number | Name | Date | Kind |
---|---|---|---|
3792863 | Evans | Feb 1974 | A |
3806131 | Evans | Apr 1974 | A |
4839838 | LaBiche et al. | Jun 1989 | A |
4940236 | Allen | Jul 1990 | A |
4991850 | Wilhelm | Feb 1991 | A |
5067717 | Harlan et al. | Nov 1991 | A |
5233544 | Kobayashi | Aug 1993 | A |
5337758 | Moore et al. | Aug 1994 | A |
5472205 | Bouton | Dec 1995 | A |
5592401 | Kramer | Jan 1997 | A |
5598187 | Ide et al. | Jan 1997 | A |
5638300 | Johnson | Jun 1997 | A |
5694340 | Kim | Dec 1997 | A |
5697791 | Nashner et al. | Dec 1997 | A |
5779555 | Nomura et al. | Jul 1998 | A |
5791351 | Curchod | Aug 1998 | A |
5819206 | Horton et al. | Oct 1998 | A |
5826578 | Curchod | Oct 1998 | A |
5826874 | Teitell et al. | Oct 1998 | A |
5875257 | Marrin et al. | Feb 1999 | A |
5898421 | Quinn | Apr 1999 | A |
5903228 | Ohgaki et al. | May 1999 | A |
5907819 | Johnson | May 1999 | A |
6001014 | Ogata et al. | Dec 1999 | A |
6224493 | Lee et al. | May 2001 | B1 |
RE37374 | Roston et al. | Sep 2001 | E |
6441745 | Gates | Aug 2002 | B1 |
6529144 | Nilsen et al. | Mar 2003 | B1 |
6821211 | Otten et al. | Nov 2004 | B2 |
6908386 | Suzuki et al. | Jun 2005 | B2 |
7158118 | Liberty | Jan 2007 | B2 |
7176886 | Marvit et al. | Feb 2007 | B2 |
7236156 | Liberty et al. | Jun 2007 | B2 |
7239301 | Liberty et al. | Jul 2007 | B2 |
7262760 | Liberty | Aug 2007 | B2 |
7263462 | Funge et al. | Aug 2007 | B2 |
20010024973 | Meredith | Sep 2001 | A1 |
20020077189 | Tuer et al. | Jun 2002 | A1 |
20020107085 | Lee et al. | Aug 2002 | A1 |
20020123386 | Perlmutter | Sep 2002 | A1 |
20030024311 | Perkins | Feb 2003 | A1 |
20050032582 | Mahajan et al. | Feb 2005 | A1 |
20050164678 | Rezvani et al. | Jul 2005 | A1 |
20050212753 | Marvit et al. | Sep 2005 | A1 |
20060025229 | Mahajan et al. | Feb 2006 | A1 |
20060033711 | Kong | Feb 2006 | A1 |
20060264260 | Zalewski et al. | Nov 2006 | A1 |
20060287084 | Mao et al. | Dec 2006 | A1 |
20060287086 | Zalewski et al. | Dec 2006 | A1 |
20060287087 | Zalewski et al. | Dec 2006 | A1 |
20070015558 | Zalewski et al. | Jan 2007 | A1 |
20070015559 | Zalewski et al. | Jan 2007 | A1 |
20070026869 | Dunko | Feb 2007 | A1 |
20070050597 | Ikeda | Mar 2007 | A1 |
20070052177 | Ikeda et al. | Mar 2007 | A1 |
20070060228 | Akasaka et al. | Mar 2007 | A1 |
20070060391 | Ikeda et al. | Mar 2007 | A1 |
20070066394 | Ikeda et al. | Mar 2007 | A1 |
20070072680 | Ikeda | Mar 2007 | A1 |
20070178974 | Masuyama et al. | Aug 2007 | A1 |
Foreign Patent Documents

Number | Date | Country |
---|---|---|
WO 02/35184 | Feb 2002 | WO |
PCT/US02/20119 | Jan 2003 | WO |
Prior Publication Data

Number | Date | Country |
---|---|---|
20060202997 A1 | Sep 2006 | US |
Related U.S. Application Data

Number | Date | Country |
---|---|---|
60660261 | Mar 2005 | US |