The present invention relates to simulations in a virtual environment.
Virtual reality is a technology which allows a user or “actor” to interact with a computer-simulated environment, be it a real or imagined one. Most current virtual reality environments are primarily visual experiences, displayed either on a computer screen or through special stereoscopic displays. An actor can interact with a virtual reality environment or a virtual artifact within the virtual reality environment either through the use of standard input devices, such as a keyboard and mouse, or through multimodal devices, such as a wired glove. The actor is disposed in a three-dimensional, physical space, known as a studio, wherein the actor interacts with one or more physical objects within the studio and/or with one or more virtual artifacts of the virtual reality environment.
One particular shortcoming of conventional virtual reality environments is that only one computer simulation can be conducted within a studio at any given time. Thus, if a studio is being utilized for a first simulation and the need arises to run a second, different simulation, the first simulation must be terminated in favor of the second simulation or the execution of the second simulation must be delayed until the first simulation has been completed.
While ways of conducting virtual reality simulations are well known in the art, considerable shortcomings remain.
The novel features believed characteristic of the invention are set forth in the appended claims. However, the invention itself, as well as a preferred mode of use and further objectives and advantages thereof, will best be understood by reference to the following detailed description when read in conjunction with the accompanying drawings, in which the leftmost significant digit(s) of a reference numeral denote(s) the first figure in which that reference numeral appears.
While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and are herein described in detail. It should be understood, however, that the description herein of specific embodiments is not intended to limit the invention to the particular forms disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.
Illustrative embodiments of the invention are described below. In the interest of clarity, not all features of an actual implementation are described in this specification. It will of course be appreciated that in the development of any such actual embodiment, numerous implementation-specific decisions must be made to achieve the developer's specific goals, such as compliance with system-related and business-related constraints, which will vary from one implementation to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming but would nevertheless be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure.
In the specification, reference may be made to the spatial relationships between various components and to the spatial orientation of various aspects of components as the devices are depicted in the attached drawings. However, as will be recognized by those skilled in the art after a complete reading of the present application, the devices, members, apparatuses, etc. described herein may be positioned in any desired orientation. Thus, the use of terms such as “above,” “below,” “upper,” “lower,” or other like terms to describe a spatial relationship between various components or to describe the spatial orientation of aspects of such components should be understood to describe a relative relationship between the components or a spatial orientation of aspects of such components, respectively, as the device described herein may be oriented in any desired direction.
Referring to FIG. 1 of the drawings, one or more actors 101 are disposed within a studio 111, wherein the one or more actors 101 interact with one or more physical objects, such as physical objects 103 and 105, and with one or more virtual artifacts of a simulation operating within a motion capture environment 113.
A virtual representation of studio 111 exists in motion capture environment 113, which hosts the virtual reality environment. The one or more actors 101 use display devices, for example, headset viewers, such as a headset viewer 201 of FIG. 2, to view the virtual reality environment.
Physical objects, such as physical objects 103 and 105, that are disposed within studio 111 and that are moved by the one or more actors 101 are tracked using motion capture environment 113. These “tracked objects” may be tracked by a variety of sensor methodologies, including, but not limited to, reflectors, such as reflectors 123 and 125 and reflector 203 of FIG. 2; inertial measurement units; and the like.
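By way of illustration only, and not as part of the disclosed system, the following Python sketch shows one way software in the motion capture environment might model a tracked object and its attached reflectors or sensors; the class names, fields, and the centroid heuristic are assumptions introduced here for clarity.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Marker:
    """A reflector or sensor attached to a tracked object (hypothetical model)."""
    marker_id: int
    position: tuple[float, float, float]  # studio coordinates, in meters

@dataclass
class TrackedObject:
    """A physical object in the studio, located via its attached markers."""
    name: str
    markers: list[Marker] = field(default_factory=list)

    def centroid(self) -> tuple[float, float, float]:
        """Estimate the object's location as the mean of its marker positions."""
        if not self.markers:
            raise ValueError("tracked object has no markers")
        xs, ys, zs = zip(*(m.position for m in self.markers))
        return (mean(xs), mean(ys), mean(zs))

# Example: a hand-held object carrying two reflectors.
obj = TrackedObject("wand", [Marker(1, (0.0, 1.0, 2.0)), Marker(2, (0.2, 1.0, 2.0))])
print(obj.centroid())  # (0.1, 1.0, 2.0)
```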
Tracker-sensors, such as tracker-sensors 119, interface with motion capture environment 113 and determine where a tracked object, such as physical object 103 or 105, is located within the physical space of the studio. Such tracker-sensors may comprise a single unit or a plurality of units. The tracker-sensors may be attached to a framework, such as framework 117, which defines the physical limits of the studio; to the tracked objects; or to both. While tracker-sensors may utilize various methodologies for tracking tracked objects, certain tracker-sensors use inertial acceleration with subsequent integration to provide rate and displacement information; ultrasonic measurement; optical measurement; near-infrared measurement; and methods that use other bands of radiation within the electromagnetic spectrum.
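As a minimal sketch of the inertial methodology mentioned above (not the disclosed implementation): integrating sampled acceleration once yields rate, and integrating again yields displacement. The function name, sampling scheme, and rectangular-rule integration below are assumptions.

```python
def integrate_inertial(accels: list[float], dt: float) -> tuple[float, float]:
    """Integrate acceleration samples (m/s^2) taken every dt seconds:
    once to obtain rate (velocity, m/s), again to obtain displacement (m)."""
    velocity = 0.0
    displacement = 0.0
    for a in accels:
        velocity += a * dt             # first integration: acceleration -> rate
        displacement += velocity * dt  # second integration: rate -> displacement
    return velocity, displacement

# Example: constant 1 m/s^2 acceleration sampled at 100 Hz for 1 second.
v, d = integrate_inertial([1.0] * 100, 0.01)
print(v, d)  # v = 1.0 m/s; d ~= 0.5 m (0.505 exactly, due to the rectangular rule)
```

Dead reckoning of this kind accumulates drift, which is one reason a tracker-sensor suite may combine inertial measurement with the optical, ultrasonic, or radiation-based methods listed above.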
Referring now to FIGS. 3 and 4, to conduct multiple, simultaneous, independent simulations within a single studio, motion capture environment 113 comprises a server 301 and a plurality of clients, such as a first client 303 and a second client 305, each of which executes its own simulation. A first actor 307 participates in a first simulation executed by first client 303, and a second actor 309 participates in a second, independent simulation executed by second client 305. In a conventional motion capture environment, the locations of the tracked objects are provided to a single simulation.
In the present invention, however, the locations of all of the virtual and physical objects are sent to server 301. Server 301 sends copies of the locations of sensors 311a-311e, shown generally at 312, that are operably associated with first actor 307 to first client 303 and sends copies of the locations of sensors 313a-313e, shown generally at 314, that are operably associated with second actor 309 to second client 305. Alternatively, the locations of all sensors 311a-311e and 313a-313e are sent to each of clients 303 and 305 along with instructions to process only those sensor locations that are part of the simulation being performed by a particular client, as shown in FIG. 4.
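To make the two distribution schemes concrete, here is a minimal Python sketch assuming a server that records which sensor IDs belong to which client's simulation; the class, method names, and payload shapes are hypothetical and are not taken from the disclosure.

```python
from collections import defaultdict

class SimulationServer:
    """Hypothetical stand-in for server 301: routes sensor locations to clients."""
    def __init__(self) -> None:
        # Maps each client ID to the set of sensor IDs used by its simulation.
        self.assignments: dict[str, set[int]] = defaultdict(set)

    def assign(self, client_id: str, sensor_ids: set[int]) -> None:
        self.assignments[client_id] |= sensor_ids

    def distribute_filtered(self, locations: dict[int, tuple]) -> dict[str, dict]:
        """Scheme 1: each client receives only the locations of its own sensors."""
        return {client: {s: locations[s] for s in sensors if s in locations}
                for client, sensors in self.assignments.items()}

    def distribute_broadcast(self, locations: dict[int, tuple]) -> dict[str, dict]:
        """Scheme 2: each client receives every location plus the set of sensor
        IDs it should process; the client ignores the rest."""
        return {client: {"locations": locations, "process_only": sensors}
                for client, sensors in self.assignments.items()}

# Example: sensors 311a-311e mapped to IDs 1-5 (client 303), 313a-313e to 6-10 (client 305).
server = SimulationServer()
server.assign("client_303", {1, 2, 3, 4, 5})
server.assign("client_305", {6, 7, 8, 9, 10})
payload = server.distribute_filtered({1: (0.0, 1.7, 0.0), 6: (2.0, 1.6, 1.0)})
print(payload["client_303"])  # {1: (0.0, 1.7, 0.0)}
```

Scheme 1 minimizes network traffic and hides each simulation's data from the others; scheme 2 simplifies the server at the cost of sending every client the full location set.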
It should be noted that while the disclosure provided above describes the actors' hands and heads as being tracked, any portion of an actor's body, or any appendage to any portion of an actor's body, may be tracked.
Performing multiple simultaneous independent simulations in a single motion capture environment provides many advantages. For example, a single motion capture environment can support more than one computer simulation at a time, extending the utility of a valuable resource along the time dimension. Moreover, performing multiple simultaneous independent simulations in a single motion capture environment reduces the cost of each simulation. Furthermore, in some situations, additional physical motion capture environments need not be built to meet the demand for such environments.
It should be noted that motion capture environment 113 comprises one or more computers, such as computer 115, executing software embodied in a computer-readable medium that is operable to produce and control the virtual reality environment. The scope of the invention encompasses, among other things, the motion capture environment, such as motion capture environment 113 of FIG. 1; the software, embodied in a computer-readable medium, that is operable to produce and control the virtual reality environment; and the methods carried out by the motion capture environment.
The particular embodiments disclosed above are illustrative only, as the invention may be modified and practiced in different but equivalent manners apparent to those skilled in the art having the benefit of the teachings herein. Furthermore, no limitations are intended to the details of construction or design herein shown, other than as described in the claims below. It is therefore evident that the particular embodiments disclosed above may be altered or modified and all such variations are considered within the scope and spirit of the invention. Accordingly, the protection sought herein is as set forth in the claims below. It is apparent that an invention with significant advantages has been described and illustrated. Although the present invention is shown in a limited number of forms, it is not limited to just these forms, but is amenable to various changes and modifications without departing from the spirit thereof.