This application relates to a virtual reality system that generates a virtual three-dimensional (3D) environment from a real-world environment and renders a virtual 3D object in the virtual 3D environment.
3D graphics may be used to implement a virtual reality system. Conventional virtual reality systems “inject” a user into a fictitious virtual 3D environment. In that environment, the user can interact with objects, characters, and the like as if in the real world.
By way of example, a user may generate a virtual 3D version of a real-world living room. The user may then furnish the resulting virtual living room with virtual 3D objects, such as furniture, artwork, and the like. The virtual objects may be rearranged, as desired, in order to obtain a pleasing layout of the room. Thus, the virtual reality system described herein provides a 3D preview of a real-world space augmented with computer-generated virtual elements, hence the name “augmented reality”. The virtual reality system has other applications as well.
Process 16 generates (22) a virtual 3D environment. To generate the virtual 3D environment, process 16 surveys (24) a real-world environment, such as a room. The user walks camera 12 around the room, capturing frames of two-dimensional (2D) video data from different positions in the room. Process 16 uses these frames to generate a 3D model of the room.
To this end, process 16 extracts features of the room (the real-world environment, in this example) from the frames of 2D video data. These features include planes and reference points, such as corners, in the real-world environment.
Process 16 locates (32) corners 30 in room 26 using standard corner detection processes and performs standard position (“pose”) estimation processes to determine the location of camera 12 in the room. Process 16 tracks the motion of the corners over a predetermined time frame (which corresponds to a given number of frames of 2D video). The corners themselves do not move within the room; their apparent motion relative to camera 12 is due solely to the motion of camera 12, and it is this relative motion that is tracked. Based on the locations of the corners and their movement over time, process 16 determines the position of camera 12 relative to the corners for each frame of video.
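The application does not include an implementation of this step, but a minimal sketch is shown below using OpenCV's standard feature-tracking and pose-estimation routines; the function and variable names, and the assumption that the 3D corner positions are already known, are illustrative and not taken from the application.

```python
import cv2
import numpy as np

def track_corners_and_pose(prev_gray, gray, prev_pts, corner_xyz, K):
    """Track corner features between two frames of 2D video and
    estimate the camera's position ("pose") from them.

    prev_pts:   N x 1 x 2 corner locations detected in the prior frame
    corner_xyz: N x 3 known 3D positions of those corners in the room
    K:          3 x 3 camera intrinsic matrix
    """
    # Follow each corner from the previous frame into the current one;
    # the corners are fixed in the room, so any apparent motion is due
    # solely to the motion of the camera.
    pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, prev_pts, None)
    good = status.ravel() == 1

    # Standard pose estimation: find the rotation/translation that maps
    # the 3D corner positions onto their tracked 2D image locations.
    ok, rvec, tvec = cv2.solvePnP(corner_xyz[good], pts[good], K, None)
    return pts[good], rvec, tvec

# Corners would initially be located with a standard detector, e.g.:
# prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=100,
#                                    qualityLevel=0.01, minDistance=10)
```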
The camera position is used when constructing a virtual 3D version of the room. That is, knowing the camera position allows process 16 to know the perspective from which each frame was taken. Knowing the perspective of each frame allows process 16 to determine where in the virtual 3D environment the additional virtual elements should be positioned.
Process 16 recognizes (34) planes in the real-world environment that are bounded by the corners. Process 16 recognizes planes by identifying clusters of three or more points (e.g., pixels) of the 2D video that behave similarly during motion of camera 12. For example, as camera 12 moves toward a cluster of pixels, the pixels may appear to “grow”, i.e., they may appear larger because they become closer to the camera. Pixels on the same plane may appear to “grow” by about the same amount. Examples of planes that may be identified include a floor, ceiling, and walls of a room.
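The application does not spell out the clustering rule, but one hedged reading is sketched below: tracked points whose apparent scale change (“growth”) between frames is similar are grouped as candidates for the same plane. The tolerance value and the grouping strategy are illustrative assumptions.

```python
import numpy as np

def group_coplanar_points(growth, tol=0.05):
    """Group tracked points into candidate planes by similar apparent
    "growth" (scale change) as the camera moves toward them.

    growth: 1D array, per-point ratio of feature size between frames
    tol:    how close two growth ratios must be to share a plane
    """
    order = np.argsort(growth)
    clusters, current = [], [order[0]]
    for i in order[1:]:
        # Points whose growth ratios are within tol of the cluster's
        # first member are treated as lying on the same plane.
        if abs(growth[i] - growth[current[0]]) <= tol:
            current.append(i)
        else:
            clusters.append(current)
            current = [i]
    clusters.append(current)
    # Keep only clusters of three or more points, per the text above.
    return [c for c in clusters if len(c) >= 3]

# Example: floor points grow ~1.20x, wall points ~1.05x as the camera moves.
print(group_coplanar_points(np.array([1.21, 1.04, 1.19, 1.05, 1.20, 1.06])))
```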
Once process 16 identifies the corners and planes from the 2D frames of video, process 16 generates (36) 3D data that defines the corners and the planes of the 3D environment relative to the camera positions. The 3D data may define Cartesian XYZ coordinates of pixels that make up the corners and planes of the virtual 3D environment. Other types of 3D data may alternatively be used.
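As a minimal sketch of how such XYZ data might be generated, the snippet below triangulates matched corner locations from two frames using the camera poses recovered above. It assumes projection matrices built from those poses and uses one conventional OpenCV approach, not necessarily the one in the application.

```python
import cv2
import numpy as np

def corners_to_xyz(pts1, pts2, P1, P2):
    """Recover Cartesian XYZ coordinates for corners matched across
    two frames of 2D video.

    pts1, pts2: 2 x N matched corner locations in the two frames
    P1, P2:     3 x 4 projection matrices (intrinsics times the camera
                pose estimated for each frame)
    """
    # Triangulate to homogeneous coordinates, then divide through by
    # the fourth component to obtain XYZ points.
    xh = cv2.triangulatePoints(P1, P2, pts1, pts2)  # 4 x N
    return (xh[:3] / xh[3]).T                       # N x 3 XYZ
```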
Process 16 renders (38) the virtual 3D environment (e.g., virtual living room) from the 3D data. The virtual 3D environment 40 is rendered on the display screen 42 of computer 14. A user can then populate this virtual 3D environment with virtual objects retrieved by computer 14.
In more detail, the user selects a virtual 3D object from a database, along with a location in the virtual 3D environment where the selected virtual 3D object is to be displayed. The selections may be made using a light pen, stylus on a touch screen, or any other type of computer interface. As noted above, the virtual 3D objects may be virtual 3D representations of furniture or the like. Process 16 retrieves (44) the selected virtual 3D object from the database and positions (46) it at the appropriate location. Positioning is performed in response to user input.
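A minimal sketch of this retrieve-and-position step is shown below; the catalog, object fields, and scene representation are hypothetical stand-ins for the database and user interface the application describes.

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    dimensions: tuple                  # (length, width, height)
    position: tuple = (0.0, 0.0, 0.0)  # XYZ location in the virtual room

# Hypothetical stand-in for the database of virtual 3D objects.
CATALOG = {"table": VirtualObject("table", (1.6, 0.9, 0.75))}

def place_object(scene, name, location):
    """Retrieve the selected virtual 3D object and position it at the
    location the user selected (e.g., via stylus or light pen)."""
    obj = CATALOG[name]
    obj.position = location  # positioning is driven by user input
    scene.append(obj)
    return obj

scene = []
place_object(scene, "table", (2.0, 0.0, 3.5))
```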
In order to achieve a realistic effect, process 16 may scale (48) the selected virtual 3D object (i.e., model) before rendering. In this context, scaling may include changing the size of the virtual 3D object so that the virtual 3D object is appropriate given the size of the virtual 3D environment.
Process 16 scales the virtual 3D object by obtaining (50) the size of a target object in the real-world environment and changing (52) the size of the virtual 3D object in accordance with the size of the target. For example, the size of a target (e.g., the height of a ceiling, the distance between two objects, etc.) in the real-world environment may be captured beforehand. Using the size of the target as a reference, process 16 may make the virtual 3D object smaller or larger so that its size correlates substantially to the size of the target. Process 16 then renders (49) the virtual objects in the virtual environment.
By way of example, process 16 may retrieve a virtual 3D model for a table from a database. Data for the table may include its dimensions, such as length, width and height. Knowing these dimensions and the size of the target, process 16 can scale the table to its appropriate size within the virtual 3D environment and then render the virtual table.
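A minimal sketch of this scaling computation follows; it assumes the reference measurement is a ceiling height known in both real-world units and virtual units, matching one of the examples given above.

```python
def scale_to_environment(obj_dims, target_real_size, target_virtual_size):
    """Scale a virtual 3D object so its size fits the virtual environment.

    obj_dims:            (length, width, height) of the object's model
    target_real_size:    known real-world size of the reference target,
                         e.g., ceiling height in meters
    target_virtual_size: size of that same target in virtual units
    """
    # Virtual units per meter, derived from the reference target.
    scale = target_virtual_size / target_real_size
    return tuple(d * scale for d in obj_dims)

# A 1.6 m x 0.9 m x 0.75 m table in a room whose 2.4 m ceiling spans
# 240 virtual units comes out at 160 x 90 x 75 virtual units.
print(scale_to_environment((1.6, 0.9, 0.75), 2.4, 240.0))
```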
Process 16 continuously tracks the position of the camera during movement throughout the real-world 3D environment and updates the position of the camera periodically in order to ensure that virtual objects are placed at correct locations within the virtual 3D environment. That is, process 16 uses the position of the camera to further refine (and render) the definition of the virtual 3D environment and to place the virtual objects within the virtual 3D environment.
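A hedged sketch of this per-frame update loop is shown below; the three callables are placeholders for the pose-estimation, refinement, and rendering steps already described, not APIs from the application.

```python
def run_update_loop(frames, estimate_pose, refine_environment, render):
    """For each frame of 2D video: re-estimate the camera position,
    refine the virtual 3D environment with it, and re-render so that
    virtual objects stay at the correct locations."""
    pose = None
    for frame in frames:
        pose = estimate_pose(frame, pose)  # track the moving camera
        refine_environment(frame, pose)    # update corner/plane geometry
        render(pose)                       # redraw from the new viewpoint
```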
Process 16 may also illuminate the virtual 3D object in the virtual 3D environment to simulate lighting from one or more light sources in the real-world environment. This may be done using well-known processes, such as that described in Sato, I., Sato, Y., and Ikeuchi, K., “Adaptive Estimation of Illumination Distribution with Unknown Reflectance Properties in Shadow Regions”, Proceedings of the Seventh IEEE International Conference on Computer Vision (ICCV), Vol. 2, pp. 875–882 (1999).
Process 16 can change the illumination of the objects by altering the positions of virtual light sources in the virtual 3D environment and/or by adding virtual light sources. Process 16 can also affect how the lighting hits the virtual objects by changing the positions of normal vectors on the virtual 3D object. Thus, process 16 can simulate light hitting an object from an inside light source, such as a lamp, from outside light, such as a window, or from both, which provides a more realistic overall effect in the resulting simulation. The colors of the various light sources may also be varied.
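The application does not specify a shading model, but a minimal diffuse (Lambertian) sketch below illustrates how light-source positions, light colors, and surface normals all feed into the lighting result; the specific model and the example values are illustrative assumptions.

```python
import numpy as np

def lambertian_shade(point, normal, lights):
    """Diffuse shading of a surface point from several virtual lights.

    Moving a light's position, changing its color, or perturbing the
    surface normal all change how light appears to hit the object.

    lights: list of (position, rgb_color) tuples
    """
    n = normal / np.linalg.norm(normal)
    color = np.zeros(3)
    for pos, rgb in lights:
        to_light = pos - point
        to_light = to_light / np.linalg.norm(to_light)
        # Lambert's cosine law: brightness falls off with the angle
        # between the surface normal and the light direction.
        color += np.asarray(rgb) * max(np.dot(n, to_light), 0.0)
    return np.clip(color, 0.0, 1.0)

# An inside light source (a lamp) plus cooler outside light (a window).
lights = [(np.array([1.0, 2.5, 1.0]), (1.0, 0.9, 0.7)),
          (np.array([-4.0, 1.5, 0.0]), (0.6, 0.7, 1.0))]
print(lambertian_shade(np.zeros(3), np.array([0.0, 1.0, 0.0]), lights))
```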
Process 16 also permits a user to re-position virtual 3D objects in the virtual 3D environment. For example, a user may drag and drop a virtual 3D object from one location in the virtual 3D environment to another location. This allows the user to experiment with several different layouts.
Process 16 is not limited to use with the hardware and software described above; it may find applicability in any computing or processing environment.
Each such program may be implemented in a high-level procedural or object-oriented programming language to communicate with a computer system. However, the programs can be implemented in assembly or machine language. The language may be a compiled or an interpreted language.
Each computer program may be stored on a storage medium (e.g., a CD-ROM, hard disk, or magnetic diskette) that is readable by a general or special purpose programmable computer, for configuring and operating the computer when the storage medium is read by the computer to perform process 16. Process 16 may also be implemented as an article of manufacture, such as a machine-readable storage medium, configured with a computer program where, upon execution, instructions in the computer program cause a machine to operate in accordance with process 16.
The process described herein is not limited to the embodiments set forth herein. The order of the blocks described above may be changed to achieve the same result.
Other embodiments not described herein are also within the scope of the following claims.