The present invention relates to the field of film and video photography. In particular the present invention discloses a camera device that captures a 360 degree panoramic image and display systems for displaying the panoramic image captured by the camera device.
Most cameras only provide a small viewing angle. Thus, a typical conventional camera only captures an image in the direction that the camera is aimed. Limited view cameras force viewers to look only at what the camera operator chooses to focus on. Some cameras use a specialized wide angle lens to capture a wider panoramic image, but such panoramic cameras still have a limited field of view.
It would be desirable to have a camera system that would capture the light from all directions such that a full 360 degree panoramic image can be created. A full 360 degree panoramic image would allow the viewer to choose what she would like to look at. Furthermore, a full 360 degree panoramic image allows multiple viewers to simultaneously view the world from the same point, with each being able to independently choose their viewing direction and field of view.
At the present time, there are some known methods of creating 360 degree panoramic images. However, most current methods are subject to limitations due to their physical movements and mechanical complexity. For example, some of the current methods operate by combining a series of individual photographs taken in different directions into a single panoramic image. Some panoramic cameras spin a lens and film to capture a panoramic view in a single sweeping motion.
There is a market for panoramic photos to be used in multimedia applications, typically provided on CD-ROMs. In the last few years, some software manufacturers have introduced standards for digital storage and computer playback of panoramic datasets. One example is QuickTime® VR, introduced by Apple® Computer, Inc. Apple® Computer's QuickTime® VR standard governs the file storage format and the playback software needed to view their datasets.
Currently, Apple Computer recommends and provides software tools to implement a labor-intensive process for capturing these panoramic datasets. In the Apple QuickTime® VR (QTVR) process a standard 35 mm camera is mounted vertically on a leveled tripod and equipped with an extreme wide angle lens (e.g. 15-18 mm focal length). A sequence of twelve or more overlapping still photographs is taken at roughly 30 degree intervals as the camera is turned on the tripod around a vertical axis. These photographs are developed, digitized and then fed into a semi-automated software program called a “stitcher” that merges the overlapping still photographs into one long panoramic strip.
The labor-intensive process suffers from a number of shortcomings. First, the process is time-consuming since many steps require human intervention and guidance. Furthermore, the recommended process is prone to temporal artifacts since it captures each individual photo at a different time. This means that the “stitched” pan image is not instantaneous but rather is made up of individual photos taken at different times. The time change during the series of photographs makes it nearly impossible to create panoramic images in changing scenes containing shorelines, urban crowds and traffic, windblown trees, etc. Finally, it is difficult to see how the image capture method recommended by Apple QuickTime® VR (QTVR) can be extended from a single still panoramic image into a continuous frame, or motion picture panoramic image capture.
The present invention discloses a camera device that instantaneously captures a 360 degree panoramic image. Furthermore, the present invention discloses various different systems for displaying the panoramic images.
In the camera device, virtually all of the light that converges on a point in space is captured. Specifically, in the camera of the present invention, light striking this point in space is captured if it comes from any direction, 360 degrees around the point and from angles 50 degrees or more above and below the horizon as illustrated in
Other objects, features and advantages of the present invention will be apparent from the accompanying drawings and from the detailed description that follows below.
The objects, features and advantages of the present invention will be apparent to one skilled in the art, in view of the following detailed description in which:
A method and apparatus for a camera device that instantaneously captures 360 degree panoramic images is disclosed. In the following description, for purposes of explanation, specific nomenclature is set forth to provide a thorough understanding of the present invention. However, it will be apparent to one skilled in the art that these specific details are not required in order to practice the present invention. For example, the present invention has been described with reference to Charge Coupled Devices. However, the panoramic camera system can easily be implemented with other types of electronic image capture systems.
The panoramic camera design of the present invention captures light from all directions within 50 to 60 degrees above and below the horizon simultaneously.
The Mirror
Referring to
The distortion in the image is partly due to the fact that the convex mirror 210 of the imaging system effectively converts the surrounding panorama to a polar coordinate system. By adjusting the shape of the convex mirror 210, the mapping of the elevation angle of incoming light to radial distance in the annular image can be controlled.
In a preferred embodiment, the convex mirror 210 is a parabolic mirror that creates an annular image wherein the radial distance from the center of the annular image is linearly proportional to the angle of incident light. A panoramic camera system with a parabolic mirror is illustrated in
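The linear mapping can be illustrated with a short sketch. The scale constant and horizon-circle radius used below are illustrative assumptions, not values from this disclosure; only the proportionality between elevation angle and radial distance reflects the parabolic-mirror embodiment described above.

```python
def elevation_to_radius_mm(elevation_deg, r_horizon_mm=8.0, k_mm_per_deg=0.08):
    """Radius in the annular image at which light from a given elevation
    (degrees above the horizon) lands, under the linear mapping produced
    by the parabolic mirror.  r_horizon_mm and k_mm_per_deg are
    illustrative assumptions, not design values; the sign of k (whether
    higher elevations map toward the inner or the outer rim) depends on
    the optical path."""
    return r_horizon_mm + k_mm_per_deg * elevation_deg


if __name__ == "__main__":
    # Light from 50 degrees below to 50 degrees above the horizon fills an
    # annular band of radii; equal steps in elevation give equal steps in radius.
    for elevation in (-50, -25, 0, 25, 50):
        print(f"elevation {elevation:+3d} deg -> radius "
              f"{elevation_to_radius_mm(elevation):4.1f} mm")
```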
The Astigmatism Correction Lens
The convex mirror of the present invention introduces other image defects that require careful correction. One particular problem is astigmatism. Specifically, the light reflected downward from the convex mirror 210 of the present invention will not meet at a single focal point. To correct for this problem, an astigmatism correction lens 220 is added to correctly focus the light from the convex mirror 210.
The astigmatism correction lens 220 comprises a group of two or more lenses whose group focal length is long but whose individual elements have strong and opposite power. Thus, the astigmatism lens group may be made of the same optical material without introducing significant lateral color. Since the beam size associated with any object point in space to be imaged is quite small compared to the field of the beam, the strong elements do not tend to introduce deleterious amounts of spherical aberration or coma into the final image.
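The point that a long group focal length can result from strong individual elements of opposite power can be checked with the thin-lens-in-contact approximation; the element powers below are arbitrary illustrative values, not a prescription from this disclosure.

```python
def combined_focal_length_mm(f1_mm, f2_mm):
    """Group focal length of two thin lenses in contact: 1/f = 1/f1 + 1/f2."""
    return 1.0 / (1.0 / f1_mm + 1.0 / f2_mm)


if __name__ == "__main__":
    # Two strong elements of nearly equal and opposite power (illustrative
    # values only) combine into a weak group with a long focal length.
    print(combined_focal_length_mm(30.0, -33.0))  # ~ +330 mm
```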
The Objective Lens
The next component is a standard camera objective lens 230. The standard camera objective lens 230 forms an image using the astigmatism-corrected, reflected light from the convex mirror 210. In the present embodiment, a standard off-the-shelf camera lens is used that is optimized for cost and performance in the conventional photography market. The current embodiment relies upon a pre-defined focal length.
The focal length of the standard objective lens is selected based on two factors. The first factor is the maximum angular field of view presented by the convex mirror and astigmatism correction lens group. This factor is determined by the largest angle away from the horizon of an object to be captured. The second factor is the maximum diameter of the circular image to be recorded. In an embodiment that uses 35 mm film, this value would not exceed 24 mm. In an embodiment that uses Charge Coupled Device (CCD) arrays, the objective lens must keep the circular image within the bounds of the CCD array.
For one preferred embodiment, the appropriate focal length is 22 mm. Since there are many objective lenses available with focal lengths in the 18 mm to 24 mm range, this focal length provides many off-the-shelf lens choices.
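The interaction of the two factors can be sketched with the paraxial relation h ≈ f·tan(θ) between field angle and image height. The cone half-angle assumed below is an illustrative value, not a figure from this disclosure; it is used only to show how an image-circle limit and a field angle together bound the focal length.

```python
import math

def max_focal_length_mm(image_radius_limit_mm, max_half_angle_deg):
    """Largest objective focal length that keeps the circular image inside
    the recording area, using the paraxial approximation h = f * tan(theta).

    image_radius_limit_mm: half of the allowed image-circle diameter
                           (12 mm for a 24 mm circle on 35 mm film).
    max_half_angle_deg:    largest field angle presented to the objective by
                           the mirror and correction group (an assumption
                           here, not a value from this disclosure)."""
    return image_radius_limit_mm / math.tan(math.radians(max_half_angle_deg))


if __name__ == "__main__":
    # With a 24 mm image circle and an assumed half-angle of about 29 degrees,
    # the bound falls near the 18-24 mm range of off-the-shelf objectives.
    print(round(max_focal_length_mm(12.0, 29.0), 1))  # ~21.6
```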
To allow a standard off-the-shelf camera lens to be used, the present invention “false focuses” the image beyond the normal focal plane. This allows the next optical element (field flattening lens) to fit between the objective lens 230 and the image plane 250.
The Field Flattening Lens
Another optical problem created by the curve of the parabolic mirror is a curved image field. The curved image field problem is solved by adding yet another lens 240. This final lens is a “field flattening” lens that flattens the field of optimal focus onto a flat two dimensional image plane. The field flattening lens 240 must be kept as close to the image plane as practical to eliminate the need for a focal plane shutter.
In one embodiment, the material SFL6 is used to create the field flattening lens 240. Due to its high index of refraction, SFL6 allows the field flattening lens 240 to be approximately 2 millimeters thick. If the field flattening lens 240 was created using more traditional materials, the field flattening lens 240 would be approximately 4.5 millimeters thick.
The Image Capture System
The final major component of the panoramic camera design is the image capture mechanism 250. The image capture mechanism 250 is placed at the image plane just beneath the field flattening lens 240. This mechanism captures the optimized two dimensional annular image of the surrounding panorama. An example of a captured panorama stored as a two dimensional annular representation is shown in
In one embodiment of the present invention, the image capture mechanism can be a frame of photographic film as illustrated in
In the preferred embodiment of the present invention, a high resolution digital image capture system is used to capture the annular image created by the optical elements. In one embodiment of the present invention, a Charge Coupled Device (CCD) array 450 is placed in the image plane to capture the image as illustrated in
To generate an annular image of sufficient quality to be used in the Apple QuickTime® VR market, it has been determined that the image plane must be sampled with an array having at least 2K by 2K elements. To meet this requirement, one embodiment of the present invention uses a CCD array produced by Loral-Fairchild, Inc. However, the high resolution CCD array sold by Loral-Fairchild, Inc., adds a significant cost to the panoramic camera of the present invention. Furthermore, large CCD arrays such as the Loral-Fairchild array have difficulty handling the extreme differences in light intensity that are produced by the optical system of the present invention. Specifically, one area of the image may have direct sunlight and other areas may receive comparatively little light.
To reduce the production cost of the panoramic camera, alternate embodiments of the present invention use a set of lower resolution CCD arrays. Specifically, consumer grade CCD devices that are targeted at the consumer electronics market are used. Consumer electronics grade CCD arrays have the distinct advantages of lower cost, availability of highly integrated support circuitry, high speed read-out, and robustness to extreme lighting and other environmental conditions.
No individual consumer grade CCD array meets the high resolution requirements needed by the present invention (at least 2K by 2K elements). Therefore, a method of obtaining a greater image resolution is required if consumer grade CCDs are used.
One method of creating an acceptable image capture mechanism using consumer grade CCD arrays is to use multiple low resolution CCD chips to cover the image plane using a mosaic pattern. Referring to
A disadvantage of the mosaic technique is the image capture variation that will exist between the different CCD chips. The image variation can be compensated for by having overlapping CCD array coverage. The overlapping area is used to cross calibrate the image variation between adjacent CCD arrays.
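A minimal sketch of the cross-calibration idea follows, assuming a simple gain-and-offset response model between two adjacent CCD tiles and a least-squares fit over their overlapping coverage; the model, the NumPy implementation, and the sample values are illustrative, not the calibration procedure of this disclosure.

```python
import numpy as np

def cross_calibrate(overlap_a, overlap_b):
    """Estimate the gain and offset that map tile B's pixel values onto
    tile A's, using samples both tiles recorded in their overlapping
    region.  Assumes a linear response model a ~= gain * b + offset
    (an illustrative assumption)."""
    b = overlap_b.ravel().astype(float)
    a = overlap_a.ravel().astype(float)
    gain, offset = np.polyfit(b, a, 1)
    return gain, offset


def apply_calibration(tile_b, gain, offset):
    """Bring a whole tile into the reference tile's response before the
    tiles are assembled into the mosaic image."""
    return gain * tile_b.astype(float) + offset


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scene = rng.uniform(0, 255, size=(32, 32))                    # overlap region
    overlap_a = scene + rng.normal(0, 1, scene.shape)             # reference tile
    overlap_b = 0.9 * scene + 12 + rng.normal(0, 1, scene.shape)  # mismatched tile
    gain, offset = cross_calibrate(overlap_a, overlap_b)
    print(f"gain={gain:.3f} offset={offset:.1f}")  # roughly 1.11 and -13
```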
Folded Optics Configuration
Transparent Block Configuration
Another alternative embodiment is shown in
The solid transparent block approach has a number of significant advantages. First, the mirrored inner surface of the transparent block material can be well protected. This technique overcomes the disadvantages of front surface mirrors. Specifically, when front surface mirrors are exposed to the outside world they are susceptible to damage and degradation. In the above described embodiment, the mirrored surface is fully protected since it is encased between a protective backing material and the transparent block material. Another advantage of the solid block approach is that the skin of the camera is incorporated into the optical system. Thus only one surface would need to be multicoated to prevent internal reflections.
The transparent block technique can also be implemented using the folded optics scheme described in the previous section. Specifically,
Different methods can be used to construct a transparent block panoramic camera system. One method would be to create the transparent block, then polish the transparent block, and finally add a mirrored surface where appropriate. An alternate method of constructing a transparent block panoramic camera system would start with the convex mirror. Then, the convex mirror would be encapsulated within the transparent block. This method would be simpler to construct since a concave surface would not require polishing. Furthermore, the convex mirror would be protected by the transparent block.
Center Support Configuration
Another alternative embodiment addresses the problem of how to align and support the optical elements of the panoramic camera illustrated in
External Support Configuration
Another scheme for supporting the parabolic mirror above the other optical elements is to use several side supports. This can be accomplished by cutting the parabolic mirror into “pie-pieces” after fabrication. For example, the parabolic mirror can be quartered as illustrated in
If the parabolic mirror is split into four sections, then the annular image will appear as four quadrants at the image plane. To correct for this, the gaps can be removed during the polar-to-rectangular coordinate conversion, thereby restoring the continuity of the panoramic image. The gaps between the mirror sections should be kept as small as possible, however, since the optical system is degraded by the loss of rotational symmetry.
As illustrated in
Still Image Presentation as a Rectangular Panoramic Image
The most common method of displaying a panoramic image is to display the image as a rectangle where the horizontal direction represents the view angle. An example of this type of panoramic image presentation is illustrated in
With the panoramic camera system of the present invention, such rectangular panoramic images can easily be created. First, the panoramic camera system of the present invention is used to capture an annular image of the surrounding panorama. Then the annular image is digitized and loaded into a computer system. (The image will already be in digital form if a CCD version of the panoramic camera system was used to capture the image.)
A custom conversion program is then executed on the computer system. The custom conversion program scans around the annular image starting at an arbitrarily chosen sampling line 310. Points along the sampling line 310 are sampled, and their positions are then remapped using a polar-to-rectangular coordinate conversion.
While sampling the annular image, it is important to vary the sampling according to where in the annular image the samples are taken. The following three rules must be observed:
Since there is a greater resolution around the outer perimeter of the annular image, the corresponding rectangular image portion will have better image clarity. The outer perimeter of the annular image may be the top or the bottom of the rectangular image depending on the optical path. (Compare
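A minimal sketch of the annular-to-rectangular conversion follows, assuming the annular image is already centered in the array and using nearest-neighbor sampling; the inner and outer radii, output dimensions, and top/bottom orientation are illustrative assumptions rather than parameters of the conversion program described above.

```python
import numpy as np

def annular_to_rectangular(annular, r_inner, r_outer, out_width=1024, out_height=256):
    """Unwrap a centered annular image into a rectangular panoramic strip.

    Each output column corresponds to one view angle around the full 360
    degrees; each output row samples the annulus at one radius between
    r_inner and r_outer.  Nearest-neighbor sampling and the orientation
    (outer perimeter mapped to the top of the strip) are illustrative
    assumptions."""
    h, w = annular.shape[:2]
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    out = np.zeros((out_height, out_width) + annular.shape[2:], dtype=annular.dtype)
    for row in range(out_height):
        r = r_outer - (r_outer - r_inner) * row / (out_height - 1)
        for col in range(out_width):
            theta = 2.0 * np.pi * col / out_width   # view angle around the panorama
            x = int(round(cx + r * np.cos(theta)))
            y = int(round(cy + r * np.sin(theta)))
            if 0 <= x < w and 0 <= y < h:
                out[row, col] = annular[y, x]
    return out
```

Because each output row is sampled at a single radius, rows taken from the outer perimeter draw on more source pixels per degree than rows taken from the inner perimeter, which is the resolution difference noted above.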
Once the panoramic image has been converted from an annular image to a rectangular image on a computer system, the rectangular image can be presented to viewers in a number of different formats. For example, the rectangular image may be distributed electronically as a JPEG image and viewed with JPEG image viewers. Alternatively, the rectangular image can be printed out with a color printer. It should be noted that since the rectangular image is in digital form, it can quickly be added to a publication being created with a Desktop Publishing Layout Program such as QuarkXPress or Adobe's PageMaker.
Image Presentation as a Virtual Reality Image
Apple Computer introduced a standard known as QuickTime® VR for storing and displaying virtual reality images. Apple Computer's QuickTime® VR standard governs the data storage format and the playback software needed to view the QuickTime® VR datasets. The camera system of the present invention can be used to quickly create QuickTime® VR datasets.
The QuickTime® VR format stores the image as a cylindrical image as illustrated in
After the digital version of the annular image is available on the computer system, a transformation program is executed on the computer system at step 1130 to transform the digitized annular image into a QuickTime® VR dataset. The annular image produced by the camera system of the present invention stores the panoramic image information in a polar coordinate system. In contrast, Apple®'s QuickTime® VR uses a cylindrical coordinate system as illustrated in
Once the coordinate transform is complete, the transformed image can be viewed using Apple's QTVR player program as stated in step 1140.
Still Image Presentation on a Computer Network
Since the present invention can store the annular image in digital form, a very useful method of distributing panoramic images is through a computer network. In particular, the hypertext transport protocol (http) of the World Wide Web (WWW) on the Internet can be used to distribute still annular images. The still annular images would be stored on a World Wide Web server. To access the still annular images, any user coupled to the Internet would use a World Wide Web browser program.
One method of transporting the images would be to define a new panoramic image annular data format. The images could then be downloaded in this panoramic image annular data format. A helper application would then display the images once downloaded.
A better method of displaying images using the hypertext transport protocol (http) of the World Wide Web (WWW) would be to implement a “plug-in” application that would work with the browser program.
Image Presentation as Video
One of the most interesting presentation systems for the present invention is a video presentation system.
Referring to
After being received through the panoramic camera interface 1210, the digitized annular images are stored in an Annular “Video” Storage system 1230. The Annular “Video” comprises a series of consecutive annular images taken with a CCD version of the panoramic camera system 1205.
To display the Annular Video as normal video, the annular frames must be converted from the annular image format into normal video images. In one embodiment of the present invention, only a portion of the annular image is converted into normal video. One reason for this is that the aspect ratio of video does not allow for good viewing of wide but short rectangular panoramic images. Furthermore, by only transforming a portion of the annular image into normal video, the transformation can be done in real-time without requiring exceedingly fast computer equipment. The transformation of annular video to normal video is done by annular to video conversion units 1240 and 1243.
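A sketch of this partial transformation follows, restricting the sampling to the angular range of the current viewing window so that only a small fraction of each annular frame is converted per video frame; the window parameters, output size, and nearest-neighbor sampling are illustrative assumptions rather than the internals of the conversion units 1240 and 1243.

```python
import numpy as np

def annular_window_to_video(annular, r_inner, r_outer, pan_deg, fov_deg=60.0,
                            out_width=640, out_height=480):
    """Convert only the portion of an annular frame that lies inside the
    current viewing window (pan angle +/- half the field of view) into a
    normal video frame.  Restricting the angular range is what keeps the
    per-frame work small enough for real-time conversion; all numeric
    parameters here are illustrative assumptions."""
    h, w = annular.shape[:2]
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    out = np.zeros((out_height, out_width) + annular.shape[2:], dtype=annular.dtype)
    start = np.radians(pan_deg - fov_deg / 2.0)
    step = np.radians(fov_deg) / (out_width - 1)
    for row in range(out_height):
        r = r_outer - (r_outer - r_inner) * row / (out_height - 1)
        for col in range(out_width):
            theta = start + step * col
            x = int(round(cx + r * np.cos(theta)))
            y = int(round(cy + r * np.sin(theta)))
            if 0 <= x < w and 0 <= y < h:
                out[row, col] = annular[y, x]
    return out
```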
To display the normal video, existing video streaming software 1260 and 1263 can be used. For example, using a standard transmission protocol like MPEG or proprietary protocols such as StreamWorks produced by Xing Technology Corporation of Arroyo Grande, Calif., or VDOLive produced by VDOnet Corporation of Santa Clara, Calif., the video can be provided to computer users coupled to a network. One skilled in the art will understand that one way to transmit information (for example, still or video digital data; or plug-ins or other computer software) is by embodying the data in a carrier wave that is transmitted over the network.
To change the view angle, the user can select a pan right arrow 1347 or a pan left arrow 1343 with a cursor 1320. Alternatively, the user can simply move the position of the locator window 1315 within the still panoramic image 1310. In the embodiment of
Referring back to
Referring back to
To more completely convey the experience of being at a different location, the present invention can be combined with a three-dimensional sound system. Referring to
To add three dimensional sound, the sound from the various directional microphones is mixed depending on the viewing angle that a user has selected. For example, if a viewer that is seeing a real-time image from camera 1400 of
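A minimal sketch of view-angle-dependent mixing follows, assuming four directional microphones facing outward at 90-degree spacing and a cosine falloff toward the current view direction; the microphone layout and the weighting law are illustrative assumptions, not the mixing scheme of this disclosure.

```python
import math

def mix_weights(view_angle_deg, mic_angles_deg=(0.0, 90.0, 180.0, 270.0)):
    """Per-microphone mixing weights for one viewer.

    Each microphone's weight falls off with the angular distance between the
    direction it faces and the viewer's current view direction (cosine
    falloff, clipped at zero), and the weights are normalized to sum to 1.
    The four-microphone layout and the falloff law are illustrative
    assumptions."""
    raw = [max(0.0, math.cos(math.radians(view_angle_deg - mic_deg)))
           for mic_deg in mic_angles_deg]
    total = sum(raw) or 1.0
    return [wgt / total for wgt in raw]


if __name__ == "__main__":
    # Looking straight at the 90-degree microphone, the viewer hears mostly it;
    # looking at 45 degrees mixes the 0- and 90-degree microphones equally.
    print(mix_weights(90.0))   # ~[0, 1, 0, 0]
    print(mix_weights(45.0))   # ~[0.5, 0.5, 0, 0]
```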
By adding sound to the system, the user is provided with cues as to which direction they should be viewing. For example, if the user hears a sound from “behind”, then the user can change the view angle to look backward.
The foregoing has described a camera device that captures 360 degree panoramic images and presentation systems for displaying such images. It is contemplated that changes and modifications may be made by one of ordinary skill in the art, to the materials and arrangements of elements of the present invention without departing from the scope of the invention.
This is a divisional of co-pending application Ser. No. 09/521,652, filed Mar. 8, 2000, which is a divisional of co-pending application Ser. No. 08/872,525, filed Jun. 11, 1997, which claims the benefit of U.S. Provisional Application No. 60/020,292, filed Jun. 24, 1996.

This application is a broadening Reissue application of U.S. patent application Ser. No. 10/419,283, filed Apr. 17, 2003 (now U.S. Pat. No. 7,486,324, granted Feb. 3, 2009), which is a Divisional of U.S. patent application Ser. No. 09/521,652, filed Mar. 8, 2000 (now U.S. Pat. No. 6,593,969), which is a Divisional of U.S. patent application Ser. No. 08/872,525, filed Jun. 11, 1997 (now U.S. Pat. No. 6,459,451), which claims the benefit of U.S. Provisional Application No. 60/020,292, filed Jun. 24, 1996.
Number | Date | Country
---|---|---
60020292 | Jun 1996 | US

Relation | Number | Date | Country
---|---|---|---
Parent | 09521652 | Mar 2000 | US
Child | 10419283 | | US
Parent | 08872525 | Jun 1997 | US
Child | 09521652 | | US

Relation | Number | Date | Country
---|---|---|---
Parent | 10419283 | Apr 2003 | US
Child | 13015142 | | US