Claims
- 1. A method of modeling the visible world using full-surround image data, said method comprising:
selecting a view point within a p-surface;
selecting a direction of view within the p-surface;
texture mapping full-surround image data onto said p-surface such that the resultant texture map is substantially equivalent to projecting full-surround image data onto the p-surface from said view point to thereby generate a texture mapped p-surface; and
displaying a predetermined portion of said texture mapped p-surface.
- 2. The method as recited in claim 1, further comprising rotating said texture mapped p-surface so as to simulate rotating the direction of view in the opposite direction.
- 3. The method as recited in claim 1, wherein said method further comprises interactively changing said direction of view to thereby expose a corresponding portion of said texture mapped p-surface.
- 4. The method as recited in claim 1, wherein a viewer is allowed to interactively alter at least one of a focal length or an angle of view relative to said texture mapped p-surface to thereby vary the displayed portion of said texture mapped p-surface.
- 5. The method as recited in claim 1, further comprising:
selecting a new viewpoint;
repeating said texture mapping step using said new viewpoint; and
redisplaying said predetermined portion of said p-surface,
whereby a first image portion occupying said predetermined portion displayed during the displaying step is different than a second image portion occupying said predetermined portion during the redisplaying step.
- 6. The method as recited in claim 5, wherein said selecting step comprises interactively selecting said new viewpoint.
- 7. The method as recited in claim 5, wherein a first said texture mapped p-surface is replaced by a second texture mapped p-surface by interactively selecting said new viewpoint from viewpoints within said second texture mapped p-surface.
- 8. The method as recited in claim 5, wherein the new viewpoint is close to the surface of said p-surface.
- 9. The method as recited in claim 1, further comprising:
selecting a new viewpoint; and
redisplaying said predetermined portion of said p-surface,
whereby a first image portion occupying said predetermined portion displayed during the displaying step is different than a second image portion occupying said predetermined portion during the redisplaying step.
- 10. The method as recited in claim 9, wherein said selecting step comprises interactively selecting said new viewpoint.
- 11. The method as recited in claim 9, wherein the new viewpoint is close to the surface of said p-surface.
- 12. A method for interactively viewing a model of the visible world formed from full-surround image data, comprising:
providing said full-surround image data;
selecting a view point within a p-surface;
establishing a first direction of view within the p-surface;
texture mapping full-surround image data onto said p-surface such that the resultant texture map is substantially equivalent to projecting full-surround image data onto the p-surface from said view point to thereby generate a texture mapped p-sphere;
interactively changing said direction of view to thereby select a second direction of view; and
displaying a predetermined portion of said texture mapped p-sphere as said texture mapped p-sphere moves between the first and second directions of view.
- 13. The method as recited in claim 12, wherein the interactively changing step results in rotating said texture mapped p-sphere so as to simulate rotating the direction of view in the opposite direction.
- 14. An apparatus for interactively viewing a model of the visible world formed from full-surround image data stored in memory, comprising:
means for selecting a view point within a p-surface;
means for establishing a first direction of view within the p-surface;
means for texture mapping full-surround image data onto said p-surface such that the resultant texture map is substantially equivalent to projecting full-surround image data onto the p-surface from said view point to thereby generate a texture mapped p-sphere;
means for interactively changing said direction of view to thereby select a second direction of view; and
means for displaying a predetermined portion of said texture mapped p-sphere as said texture mapped p-sphere moves between the first and second directions of view.
- 15. The apparatus as recited in claim 14, wherein the interactively changing means effectively rotates said texture mapped p-sphere so as to simulate rotating the direction of view in the opposite direction.
- 16. The apparatus as recited in claim 14, wherein said selecting means, said establishing means, said texture mapping means, and said interactively changing means comprise software devices.
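The sketch below is illustrative only and is not the code of the patent. It shows one way the steps recited in claims 1-16 might be realized: it assumes the full-surround image data is in equirectangular (latitude/longitude) form, uses a tessellated unit sphere as the p-surface (a "p-sphere"), and targets legacy fixed-function OpenGL with GLUT. `loadEquirectTexture()` is a hypothetical image loader, and the key bindings are an arbitrary choice.

```c
/* A minimal sketch, NOT the patent's actual code: texture-map a
 * full-surround equirectangular image onto a tessellated sphere
 * standing in for the p-sphere, place the view point inside it, and
 * display the portion selected by the direction and angle of view. */
#include <GL/glut.h>
#include <math.h>
#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

static GLuint tex;                      /* full-surround image          */
static float  yaw, pitch;               /* direction of view (degrees)  */
static float  eye[3];                   /* view point inside p-sphere   */

/* Latitude/longitude tessellation; (s,t) are chosen so the mapping is
 * equivalent to projecting the image onto the sphere from its center. */
static void drawTexturedSphere(int slices, int stacks)
{
    for (int i = 0; i < stacks; ++i) {
        float lat0 = (float)M_PI * (-0.5f + (float)i / stacks);
        float lat1 = (float)M_PI * (-0.5f + (float)(i + 1) / stacks);
        glBegin(GL_QUAD_STRIP);
        for (int j = 0; j <= slices; ++j) {
            float lon = 2.0f * (float)M_PI * (float)j / slices;
            glTexCoord2f((float)j / slices, 0.5f + lat0 / (float)M_PI);
            glVertex3f(cosf(lat0)*cosf(lon), sinf(lat0), cosf(lat0)*sinf(lon));
            glTexCoord2f((float)j / slices, 0.5f + lat1 / (float)M_PI);
            glVertex3f(cosf(lat1)*cosf(lon), sinf(lat1), cosf(lat1)*sinf(lon));
        }
        glEnd();
    }
}

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    /* Rotating the p-sphere by the negated angles simulates rotating
     * the direction of view in the opposite direction (claims 2, 13). */
    glRotatef(-pitch, 1.0f, 0.0f, 0.0f);
    glRotatef(-yaw,   0.0f, 1.0f, 0.0f);
    glTranslatef(-eye[0], -eye[1], -eye[2]);  /* view point (claims 5-11) */
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex);
    drawTexturedSphere(64, 32);
    glutSwapBuffers();
}

/* Interactive changes to the direction of view and the view point
 * (claims 3 and 5-11); key bindings are an arbitrary choice. */
static void keyboard(unsigned char key, int x, int y)
{
    (void)x; (void)y;
    switch (key) {
    case 'a': yaw   -= 5.0f; break;
    case 'd': yaw   += 5.0f; break;
    case 'w': pitch += 5.0f; break;
    case 's': pitch -= 5.0f; break;
    case 'z': eye[2] -= 0.05f; break;  /* |eye| approaching 1 nears the */
    case 'x': eye[2] += 0.05f; break;  /* surface of the p-sphere       */
    }
    glutPostRedisplay();               /* the "redisplaying" step       */
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutCreateWindow("p-sphere viewer (sketch)");
    glMatrixMode(GL_PROJECTION);
    /* The field of view (equivalently, focal length) sets the displayed
     * portion of the texture mapped p-sphere (claim 4). */
    gluPerspective(60.0, 4.0 / 3.0, 0.01, 10.0);
    /* tex = loadEquirectTexture("pano.ppm");   hypothetical loader */
    glutDisplayFunc(display);
    glutKeyboardFunc(keyboard);
    glutMainLoop();
    return 0;
}
```

Rotating the model rather than a notional camera, as in claims 2 and 13, is the conventional fixed-function idiom; the two are equivalent because only the relative orientation of the p-sphere and the viewing frustum determines the displayed portion.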
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This is a Continuation of U.S. Ser. No. 09/871,903 (filed Jun. 4, 2001 and now abandoned), which is a Continuation of U.S. Pat. No. 6,243,099 (filed Jan. 12, 1999 as U.S. Ser. No. 09/228,760), the '099 patent being a Continuation-in-Part of U.S. Pat. No. 5,903,782 (filed Nov. 14, 1996 as U.S. Ser. No. 08/749,166), which application claims priority from U.S. Provisional Patent Application Ser. No. 60/006,800 (filed Nov. 15, 1995), the '099 patent also claiming priority from U.S. Provisional Patent Application Ser. No. 60/071,148 (filed Jan. 12, 1998).
Provisional Applications (1)

| Number | Date | Country |
| --- | --- | --- |
| 60/006,800 | Nov 1995 | US |
Continuations (3)

| Parent | Date | Country | Child | Date | Country |
| --- | --- | --- | --- | --- | --- |
| 09/871,903 | Jun 2001 | US | 10/602,666 | Jun 2003 | US |
| 09/228,760 | Jan 1999 | US | 09/871,903 | Jun 2001 | US |
| 08/749,166 | Nov 1996 | US | 09/228,760 | Jan 1999 | US |