1. Field of the Invention
This invention relates generally to video processing, and more particularly to an interface that enables controlling a virtual camera through a user's head motion in order to adjust the view being presented during an interactive entertainment application.
2. Description of the Related Art
The interactive entertainment industry strives to give users as realistic an experience as possible when playing an interactive video game. Currently, the scene views presented on screen during execution of the interactive application do not allow for the definition of a scene view according to actual tracked movement where the movement is captured without the use of markers. The requirement that a user wear the sometimes awkward markers is a nuisance that has prevented wide-scale acceptance of the applications associated with the markers.
One attempt to provide a realistic experience is to provide a canned response to a detected movement. That is, a user may be monitored, and if the user ducks or jumps, a corresponding character of the application ducks or jumps. However, there is no correlation between the user's movement and the scene view being presented on the display screen viewed by the user. Thus, in order to change a scene view being presented, the user is left with manipulating a joystick to change the scene view. Moreover, a user is required to remember a number of abstract commands in order to access the various scene movement capabilities. For example, in order to peer around a corner within a scene, the user may have to key a button sequence in combination with manipulation of the joystick in order to achieve the desired functionality. As can be appreciated, this manipulation is wholly unrelated to the physical movement, i.e., peering around a corner, that it attempts to emulate.
In view of the foregoing, there is a need for a method and apparatus configured to tie the actual movement of a user to the modification of a scene view being presented, without requiring the user to wear markers, during execution of an interactive entertainment application.
Broadly speaking, the present invention fills these needs by providing a method and apparatus that tracks head motion of a user without markers in order to adjust a view-frustum associated with a scene being displayed. It should be appreciated that the present invention can be implemented in numerous ways, including as a method, a system, a computer readable medium, or a device. Several inventive embodiments of the present invention are described below.
In one embodiment, a method for processing interactive user control with a scene of a video clip is provided. The method initiates with identifying a head of a user that is to interact with the scene of the video clip. Then, the identified head of the user is tracked during display of the video clip, where the tracking enables detection of a change in position of the head of the user. Next, a view-frustum is adjusted in accordance with the change in position of the head of the user.
In another embodiment, a method for processing interactive user control with a scene of a video clip is provided. The method initiates with identifying a head of a user that is to interact with the scene of the video clip. Then, the identified head of the user is tracked during display of the video clip, where the tracking enables detection of a change in position of the head of the user. Next, a view-frustum is translated in accordance with the change in position of the head of the user.
In still another embodiment, a method for managing a visible volume displayed through a view port is provided. The method initiates with locating a head of a user. Then, a location of the head of the user is tracked relative to the view port. Next, the visible volume is adjusted based upon the location of the head of the user relative to the view port.
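By way of illustration only, the locate, track, and adjust operations of this embodiment might be sketched as follows in Python; the plane-shifting adjustment, the HeadPosition fields, and the frustum representation are assumptions made for this sketch rather than the claimed implementation.

```python
# Illustrative sketch only.  The sign and scale of the mapping from head
# movement to frustum movement are assumptions; as noted in the description,
# the mapping is application-dependent.
from dataclasses import dataclass


@dataclass
class HeadPosition:
    x: float  # horizontal offset from the view-port centre
    y: float  # vertical offset from the view-port centre
    z: float  # distance from the view port


def adjust_visible_volume(base_frustum, head):
    """Translate the side planes of a symmetric base frustum so the visible
    volume follows the tracked head location relative to the view port."""
    left, right, bottom, top, near, far = base_frustum
    return (left - head.x, right - head.x,
            bottom - head.y, top - head.y, near, far)


# Example: the tracker (not shown here) reports the head 0.1 units right of centre.
print(adjust_visible_volume((-1.0, 1.0, -1.0, 1.0, 0.1, 100.0),
                            HeadPosition(x=0.1, y=0.0, z=0.6)))
```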
In another embodiment, a computer readable medium having program instructions for processing interactive user control with a scene of a video clip is provided. The computer readable medium includes program instructions for identifying a head of a user that is to interact with the scene of the video clip. Program instructions for tracking the identified head of the user during display of the video clip, the tracking enabling detection of a change in position of the head of the user, and program instructions for adjusting a view-frustum in accordance with the change in position of the head of the user are also included.
In yet another embodiment, a computer readable medium having program instructions for processing interactive user control with a scene of a video clip is provided. The computer readable medium includes program instructions for identifying a head of a user that is to interact with the scene of the video clip. Program instructions for tracking the identified head of the user during display of the video clip, the tracking enabling detection of a change in position of the head of the user, and program instructions for translating a view-frustum in accordance with the change in position of the head of the user are also included.
In still yet another embodiment, a computer readable medium having program instructions for managing a visible volume displayed through a view port is provided. The computer readable medium includes program instructions for locating a head of a user. Program instructions for tracking a location of the head of the user relative to the view port, and program instructions for adjusting the visible volume based upon the location of the head of the user relative to the view port, are also included.
In another embodiment, a system enabling interactive user control for defining a visible volume being displayed is provided. The system includes a computing device. A display screen in communication with the computing device is included. The display screen is configured to display image data defined through a view-frustum. A tracking device in communication with the computing device is included. The tracking device is capable of capturing a location change of a control object, wherein the location change of the control object effects an alignment of the view-frustum relative to the display screen.
In yet another embodiment, a computing device is provided. The computing device includes a memory configured to store a template of a control object. A processor capable of receiving a video signal tracking the control object is included. The processor includes logic for comparing a portion of a frame of the video signal to the template, logic for identifying a change in a location of the control object in the portion of the frame relative to a location of the control object associated with the template, and logic for translating the change in the location of the control object to adjust a view-frustum associated with an original location of the control object.
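One way the comparison and location logic could be realized is with normalized cross-correlation template matching. The sketch below assumes OpenCV (cv2) and NumPy are available and that both the stored template and the incoming frame are gray-scale images; the function names are illustrative and not part of the disclosure.

```python
# A minimal sketch, assuming OpenCV (cv2) is available.  The full-frame search
# and the function names are illustrative assumptions only.
import cv2
import numpy as np


def locate_head(gray_frame: np.ndarray, template: np.ndarray) -> tuple[int, int]:
    """Return the top-left (x, y) of the best match of the template in the frame."""
    scores = cv2.matchTemplate(gray_frame, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, best_xy = cv2.minMaxLoc(scores)
    return best_xy


def head_location_change(gray_frame, template, original_xy):
    """Change in head location relative to where the template was captured."""
    x, y = locate_head(gray_frame, template)
    return x - original_xy[0], y - original_xy[1]
```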
Other aspects and advantages of the invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the principles of the invention.
The invention, together with further advantages thereof, may best be understood by reference to the following description taken in conjunction with the accompanying drawings.
An invention is disclosed for adjusting a point of view for a scene being displayed during an interactive entertainment application according to the head movement of a user. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without some or all of these specific details. In other instances, well known process steps have not been described in detail in order not to obscure the present invention.
The embodiments of the present invention modify a point of view associated with a virtual camera during an interactive entertainment application through the marker-less tracking of a control object. Thus, the visible scene being presented on a display screen is tied to the actual movement of the control object. That is, the control object is tracked and the movement of the control object is translated to modify a view-frustum defining a visible scene presented on a display screen. For illustrative purposes, the embodiments described herein designate the control object as a user's head. Of course, certain features of the user's head may be tracked, e.g., the face or any other suitable facial feature. Accordingly, rather than using a joystick controller to move a virtual camera that defines the point of view for the visible scene being presented, a change in the coordinates of the user's head, which is being tracked through an image capture device, results in the definition of a new view point and the subsequent display of the image data associated with the new view point. As mentioned above, the tracking of the control object is performed without markers affixed to the control object. Thus, the user is not required to wear a device for tracking purposes. One skilled in the art will appreciate that the image capture device may be any suitable camera, e.g., a web camera.
In one embodiment, the physical movement in the real world that is associated with the control object being tracked is transformed into a virtual movement of a virtual camera defining a visible scene. The visible scene in the virtual world is viewed through a virtual window and then rendered onto a rectangular area with screen coordinates, referred to as the view port. As used herein, the view port may be any suitable display screen, e.g., a television monitor, a computer monitor, etc. While the embodiments described herein refer to a video game application, the embodiments may be applied to any suitable interactive entertainment application. Accordingly, with respect to a video game application, any suitable video game console, such as the “PLAYSTATION 2”® manufactured by Sony Computer Entertainment Inc., may be used with the embodiments described below. However, the embodiments described with reference to a video game console may also use any suitable computing device in place of the video game console. For example, with reference to on-line gaming applications, the computing device may be a server.
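One common way such a head-to-view mapping could be realized for a planar view port is an off-axis (asymmetric) view-frustum computed from the tracked head position. The sketch below, in Python with NumPy, is illustrative only; the coordinate convention and parameter names are assumptions rather than the disclosed implementation.

```python
# A minimal sketch of a head-coupled, off-axis view-frustum for a planar view
# port centred at the origin.  Coordinate convention and names are assumptions.
import numpy as np


def head_coupled_frustum(head_xyz, half_width, half_height, near, far):
    """Build an OpenGL-style off-axis projection matrix with the viewer's head
    at head_xyz = (hx, hy, hz) in view-port coordinates (hz > 0 is the
    distance from the screen plane)."""
    hx, hy, hz = head_xyz
    # Project the view-port edges toward the head to get the near-plane extents.
    left = (-half_width - hx) * near / hz
    right = (half_width - hx) * near / hz
    bottom = (-half_height - hy) * near / hz
    top = (half_height - hy) * near / hz
    return np.array([
        [2 * near / (right - left), 0, (right + left) / (right - left), 0],
        [0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])


# Example: head 0.1 m right of the centre of a 0.8 m x 0.5 m view port, 0.6 m away.
print(head_coupled_frustum((0.1, 0.0, 0.6), 0.4, 0.25, 0.1, 100.0))
```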
In one embodiment, a video camera is set proximate to a graphical display and pointed at the user to detect user movements. In particular, a change in location associated with the user's head or face is detected by the video camera. Each frame of video is processed to locate the position of the user's head in the image by matching a portion of the video frame with a face template captured from a specific user, or with a canonical face template. The face template is initially captured by the user placing his or her face within a capture region defined on a display screen. Once the user's face is within the capture region, the system is signaled so that an image of the user's face may be stored in memory as gray-scale image data or some other suitable image data. The virtual viewpoint and view-frustum used for displaying the scene are modified to correspond to the user's tracked head or face location during execution of the interactive entertainment application. In addition, the distance of the user's head from the camera may be determined from the scale of the face/head features in the video image. The mapping from the head location to the virtual view is dependent on the application. For example, a game developer may decide on the factors defining the mapping of the head location to the virtual view.
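As an illustration of the distance estimate mentioned above, a pinhole-camera model implies that the apparent width of the face scales inversely with its distance from the camera; the reference values in the sketch below are hypothetical.

```python
# A minimal sketch, assuming a pinhole-camera model.  The reference values
# (face width at template capture, capture distance) are hypothetical.
def estimate_head_distance(face_width_px, reference_width_px, reference_distance_m):
    """Estimate head-to-camera distance from the scale of the matched face."""
    return reference_distance_m * reference_width_px / face_width_px


# The face now appears half as wide as when the template was captured 0.6 m
# from the camera, so the head is roughly 1.2 m away.
print(estimate_head_distance(80, 160, 0.6))
```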
In summary, the above-described embodiments enable the tracking of a user's head in order to move a point of view in correlation with the head movement. The tracking is performed without markers, thereby removing previous restrictions on the user, especially with reference to virtual reality applications. While exemplary view-control applications have been provided, it should be appreciated that numerous other suitable applications may use the embodiments described herein. For example, additional specific uses include: directing the view change in a 3D cut-scene, movie, or replay; judging distances using the depth cue provided by head-motion parallax, e.g., what distance to jump in a platformer game; a scary effect of restricting the normal field of view, e.g., showing the small area visible with a flashlight and requiring the user to move their head to see more; and using head motion as a trigger for events related to the user's view, such as triggering a warping effect when the user looks over a precipice to indicate vertigo, or triggering a game character's reaction when the user looks at something. In another embodiment, a sniper mode may be provided where the virtual camera is finely moved according to fine movements of the user's head, similar to peering through the crosshairs of a rifle when aiming the rifle.
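A minimal sketch of such a sniper-mode mapping, with a purely illustrative gain, might look as follows.

```python
# Illustrative only: tracked head offsets are scaled by a tunable gain so that
# head movements produce correspondingly fine adjustments of the aimed view.
def sniper_view_offset(head_dx, head_dy, gain=0.25):
    """Map tracked head offsets (pixels) to fine virtual-camera offsets."""
    return head_dx * gain, head_dy * gain


print(sniper_view_offset(4, 0))  # a 4-pixel head shift yields a (1.0, 0.0) offset
```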
It should be appreciated that the embodiments described herein may also apply to on-line gaming applications. That is, the embodiments described above may occur at a server that sends a video signal to multiple users over a distributed network, such as the Internet, to enable players at remote locations to interact with each other. It should be further appreciated that the embodiments described herein may be implemented through either a hardware or a software implementation. That is, the functional descriptions discussed above may be synthesized to define a microchip configured to perform the functional tasks of locating and tracking a user's head or facial region and translating the tracked movement to define a scene for presentation.
With the above embodiments in mind, it should be understood that the invention may employ various computer-implemented operations involving data stored in computer systems. These operations include operations requiring physical manipulation of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. Further, the manipulations performed are often referred to in terms such as producing, identifying, determining, or comparing.
The above-described invention may be practiced with other computer system configurations, including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
The invention can also be embodied as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data which can thereafter be read by a computer system. Examples of the computer readable medium include hard drives, network attached storage (NAS), read-only memory, random-access memory, CD-ROMs, CD-Rs, CD-RWs, magnetic tapes, and other optical and non-optical data storage devices. The computer readable medium can also be distributed over network-coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
Although the foregoing invention has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications may be practiced within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims. In the claims, elements and/or steps do not imply any particular order of operation, unless explicitly stated in the claims.