Method and apparatus for implementing a panoptic camera system

Information

  • Patent Grant
  • Patent Number
    6,356,296
  • Date Filed
    Thursday, May 8, 1997
  • Date Issued
    Tuesday, March 12, 2002
Abstract
A panoptic camera system that can be used to capture all the light from a hemisphere viewing angle is disclosed. The panoptic camera comprises a main reflecting mirror that reflects light from an entire hemisphere onto an image capture mechanism. The main reflecting mirror consists of a paraboloid shape with a dimple on an apex. The surface area around the dimple allows the main reflector to capture light from behind an image capture mechanism or a second reflector. When two panoptic camera systems that capture the light from an entire hemisphere are placed back to back, a camera system that “sees” light from all directions is created. A stereo vision panoramic camera system is also disclosed. The stereo vision panoramic camera system comprises two panoramic camera systems that are separated by a known distance. The two panoramic camera systems are each placed in a “blind spot” of the other panoramic camera system. By using the different images generated by the two panoramic camera systems and the known distance between the two panoramic camera systems, the range to objects within the panoramic images can be determined.
Description




FIELD OF THE INVENTION




The present invention relates to the field of film and video photography. In particular, the present invention discloses a panoptic camera device that captures virtually all the light that converges on a single point in space.




BACKGROUND OF THE INVENTION




Most cameras only record a small viewing angle. Thus, a typical conventional camera only captures an image in the direction that the camera is aimed. Such conventional cameras force viewers to look only at what the camera operator chooses to focus on. Some cameras use a specialized wide angle lens or “fish-eye” lens to capture a wider panoramic image. However, such panoramic cameras still have a relatively limited field of view.




In many situations, it would be much more desirable to have a camera system that captures light from all directions. For example, a conventional surveillance camera can be compromised by a perpetrator that approaches the camera from a direction that is not within the viewing angle of the camera. An ideal surveillance camera would capture light from all directions such that the camera would be able to record an image of a person that approaches the camera from any direction.




It would be desirable to have a camera system that would capture the light from all directions such that a full 360 degree panoramic image can be created. A full 360 degree panoramic image would allow the viewer to choose what she would like to look at. Furthermore, a full 360 degree panoramic image allows multiple viewers to simultaneously view the world from the same point, with each being able to independently choose their viewing direction and field of view.




SUMMARY OF THE INVENTION




The present invention introduces a panoptic camera system that can be used to capture all the light from a hemisphere viewing angle. The panoptic camera comprises a main reflecting mirror that reflects light from an entire hemisphere onto an image capture mechanism. The main reflecting mirror consists of a paraboloid shape with a dimple on an apex. The surface area around the dimple allows the main reflector to capture light from behind an image capture mechanism or a second reflector. When two panoptic camera systems that capture the light from an entire hemisphere are placed back to back, a camera system that “sees” light from all directions is created.




A stereo vision panoramic camera system is also disclosed. The stereo vision panoramic camera system comprises two panoramic camera systems that are separated by a known distance. The two panoramic camera systems are each placed in a “blind spot” of the other panoramic camera system. By using the different images generated by the two panoramic camera systems and the known distance between the two panoramic camera systems, the range to objects within the panoramic images can be determined.




Other objects, features, and advantages of the present invention will be apparent from the accompanying drawings and from the detailed description that follows below.











BRIEF DESCRIPTION OF THE DRAWINGS




The objects, features, and advantages of the present invention will be apparent to one skilled in the art in view of the following detailed description in which:





FIG. 1 illustrates one embodiment of a panoramic camera system.


FIG. 2a illustrates an annular image that is recorded by the panoramic camera system of FIG. 1.


FIG. 2b illustrates how the annular image of FIG. 2a appears after it has been unwrapped by polar to rectangular mapping software.


FIG. 3 graphically illustrates the 360 degree band of light that is captured by the panoramic camera system of FIG. 1.


FIG. 4a illustrates an embodiment of a panoramic camera system that captures all the light from a hemisphere above and around the panoramic camera system.


FIG. 4b is a conceptual diagram used to illustrate the shape of the panoptic camera system in FIG. 4a.


FIG. 5 illustrates an embodiment of a panoramic camera system that captures light from all directions around the panoramic camera system.


FIG. 6 illustrates a first embodiment of a panoramic camera with stereo vision.


FIG. 7 illustrates a second embodiment of a panoramic camera with stereo vision.


FIG. 8 illustrates an embodiment of a panoramic camera system that shields unwanted light and limits the amount of light that reaches the image plane.


FIG. 9 illustrates an embodiment of a panoramic camera that is constructed using a solid transparent material such that the inner components are protected.


FIGS. 10a and 10b graphically illustrate a method of locating the center of an annular panoramic image.


FIGS. 11a and 11b illustrate a flow diagram describing the method of locating the center of an annular panoramic image.











DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT




A method and apparatus for implementing a panoptic camera is disclosed. In the following description, for purposes of explanation, specific nomenclature is set forth to provide a thorough understanding of the present invention. However, it will be apparent to one skilled in the art that these specific details are not required in order to practice the present invention.




A Panoptic Camera





FIG. 1 illustrates a cross section view of panoramic camera system 100 that captures an image of the surrounding panorama. It should be noted that the camera system is cylindrically symmetrical such that it captures light from a 360 degree band around a point.


The panoramic camera system 100 operates by reflecting all the light from a 360 degree band with a parabolic reflector 110 to a second reflector 115 through a set of lenses 120, 130, and 140 to an image capture mechanism 150. The set of lenses corrects various optical artifacts created by the parabolic mirror. The image capture mechanism 150 may be a chemical based film image capture mechanism or an electronic based image capture mechanism such as a CCD. Details on how to construct such a panoramic camera can be found in the U.S. patent application titled “Panoramic Camera” filed on May 8, 1997, with Ser. No. 08/853,048.





FIG. 2a illustrates how an image captured by the panoramic camera system 100 of FIG. 1 appears. As illustrated in FIG. 2a, the surrounding panorama is captured as an annular image on a two dimensional surface. The annular image can later be processed by an optical or electronic image processing system to display the image in a more familiar format. FIG. 2b illustrates how the annular image of FIG. 2a appears after it has been geometrically transformed from the annular image into a rectangular image by image processing software. In one embodiment, the transformation approximates a transform from polar coordinates to rectangular coordinates.
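The annular-to-rectangular transform described above is essentially a polar-to-rectangular resampling. A minimal sketch in Python with NumPy, assuming the annulus center and inner/outer radii are already known; the function name and parameters are illustrative, not from the patent:

```python
import numpy as np

def unwrap_annular(img, cx, cy, r_inner, r_outer, out_w=1024, out_h=256):
    """Resample an annular panoramic image into a rectangular panorama.

    Each output column corresponds to a viewing azimuth around the annulus
    center (cx, cy); each output row corresponds to a radius between
    r_inner and r_outer. Nearest-neighbor sampling keeps the sketch short.
    """
    theta = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
    radius = np.linspace(r_outer, r_inner, out_h)  # top row = outer edge
    # Polar -> rectangular source coordinates, rounded to nearest pixel.
    xs = np.rint(cx + np.outer(radius, np.cos(theta))).astype(int)
    ys = np.rint(cy + np.outer(radius, np.sin(theta))).astype(int)
    xs = np.clip(xs, 0, img.shape[1] - 1)
    ys = np.clip(ys, 0, img.shape[0] - 1)
    return img[ys, xs]
```

Production software would interpolate between source pixels rather than rounding, but the geometry of the mapping is the same.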





FIG. 3 graphically illustrates the band of light that is captured by the panoramic camera system of FIG. 1. As illustrated in FIG. 3, the panoramic camera system of FIG. 1 captures a 360 degree band of light that is 60 degrees above and below the horizon.




Camera System that Collects all Light from a Hemisphere




In certain applications, it would be desirable to have a camera system that collects all the light from a full hemisphere around the camera. For example, a camera system that collects all the light from a full hemisphere could be used by astronomers to capture an image of the entire night sky.





FIG. 4a illustrates a camera system similar to the camera system of FIG. 1 except that the camera system of FIG. 4a captures light from the horizon line all the way to the zenith. Thus, the camera system of FIG. 4a captures light from the entire hemisphere above the camera system.




The camera system operates by having a main reflector 410 that reflects light from the entire hemisphere above the camera system to a second reflector 415. The second reflector 415 reflects the light down through a lens system 420 to an image capture mechanism 440.




To be able to collect light from a full hemisphere, the main reflector of the camera system consists of a cylindrically symmetric mirror with a cross section that consists of an offset parabola. FIG. 4b illustrates the shape of a full parabola 450 that is then cut shortly after the apex on the side of the parabola near the center of the main reflector. The offset parabola reflects light from a slightly greater than 90 degree band that starts at the horizon (see light ray 481) and continues to the zenith (see light rays 485 and 489) and beyond. The short section of parabola near the center of the main reflector allows the main reflector to direct light from the zenith and beyond to the second reflector 470 and down into the image capture mechanism 440. This is illustrated by light ray 489.
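The cross-section geometry lends itself to a quick numeric sketch. The following samples a parabola segment z = r²/(4f), cut a short distance past the vertex near the rotation axis as FIG. 4b describes; the focal length, cut distance, and rim radius here are hypothetical values chosen only for illustration:

```python
def offset_parabola_profile(focal_len=25.0, cut=2.0, rim=60.0, step=0.5):
    """Sample the main-reflector cross section: z = r**2 / (4 * focal_len).

    A parabolic mirror focuses rays arriving parallel to its axis onto its
    focus, which is why a parabola segment works as the main reflector.
    The segment starts at radius `cut` (leaving the small dimple region
    near the rotation axis) and extends out to the rim.
    """
    samples = []
    r = cut
    while r <= rim:
        samples.append((r, r * r / (4.0 * focal_len)))
        r += step
    return samples
```

Rotating this sampled segment about the central axis yields the cylindrically symmetric mirror surface with the dimple at its apex.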




Although the main reflector 410 of FIG. 4a captures light from the zenith and beyond, the main reflector has a slight “blind spot.” The blind spot is limited to being a small cone of space behind the second reflector 415 inside light ray 439. This small area in the blind spot can be used to implement a support fixture for the mirror. Alternatively, the small area in the blind spot can be used to implement supplemental lighting.




Camera System that Collects Light from all Directions




For some applications, it would be desirable to have a camera system that collects all the light that converges on a point from all directions. For example, an ideal security camera system would be able to “see” in all directions such that no perpetrator could approach the camera from an angle not seen by the camera. Thus, no perpetrator could sneak up on the camera and disable it without having his image captured.




To construct a panoptic camera system that collects light from all directions, the present invention discloses an arrangement of two hemisphere camera systems joined together as illustrated in FIG. 5. The arrangement of FIG. 5 will produce two annular images: one annular image for the upper hemisphere and one annular image for the lower hemisphere. Since the two camera systems are aligned with each other, the two annular images can be optically or electronically combined to generate an image of the entire surroundings of the panoptic camera system.
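The electronic combination step can be sketched simply, assuming both hemisphere images have already been unwrapped into rectangular panoramas of equal width (azimuth along columns, elevation along rows). How the lower camera's image must be flipped depends on the actual mounting, so the vertical flip below is an assumption for illustration:

```python
import numpy as np

def combine_hemispheres(upper_rect, lower_rect):
    """Stack two unwrapped hemisphere panoramas into one full panorama.

    The lower camera points down, so its unwrapped image is assumed to be
    vertically inverted relative to the upper one and is flipped before
    stacking. Rows then run from zenith (top) to nadir (bottom).
    """
    if upper_rect.shape[1] != lower_rect.shape[1]:
        raise ValueError("panoramas must share the same azimuth resolution")
    return np.vstack([upper_rect, lower_rect[::-1]])
```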




A Panoramic Camera System with Stereo Vision




To gauge the range of visible objects, humans use stereo vision. Specifically, the two different view angles provided by two eyes enable a human to determine the relative distance of visible objects. The same principle can be used to implement a panoramic camera system that has stereo vision.




A First Embodiment




Referring back to FIG. 1, the original panoramic camera has a blind spot above the second reflector. The blind spot is clearly illustrated in FIG. 3 wherein the area above 60 degrees above the horizon and the area below 60 degrees below the horizon are not captured by the panoramic camera system. A second panoramic camera can be placed in a blind spot of a first panoramic camera. FIG. 6 illustrates a stereo vision panoramic camera system constructed according to this technique.




The stereo panoramic camera system of FIG. 6 comprises a first panoramic camera 605 and a second inverted panoramic camera 635. Each panoramic camera system 605 and 635 is in a blind spot of the other panoramic camera system. By spatially separating the two panoramic camera systems, each panoramic camera system will record a slightly different annular image of the surrounding panorama. Using the known distance between the two panoramic camera systems and the two different annular images, the distance to objects within the annular images can be determined.
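The range computation follows from simple triangulation. A sketch under the assumption that the two cameras share a vertical axis and are separated by a known baseline, with a point seen at a different elevation angle by each camera; the function name and vertical-baseline geometry are illustrative, not specified by the patent:

```python
import math

def range_from_elevations(baseline, elev_upper_deg, elev_lower_deg):
    """Horizontal range to a point seen by two vertically separated cameras.

    The lower camera sees the point at a steeper elevation. With B the
    baseline and a_upper, a_lower the two elevation angles, geometry gives
    R = B / (tan(a_lower) - tan(a_upper)).
    """
    t_up = math.tan(math.radians(elev_upper_deg))
    t_low = math.tan(math.radians(elev_lower_deg))
    if t_low <= t_up:
        raise ValueError("no usable parallax between the two views")
    return baseline / (t_low - t_up)
```

For example, a point 10 m away and 2 m above the upper camera, viewed across a 0.5 m baseline, is seen at elevations of roughly 11.3 and 14.0 degrees, from which the 10 m range is recovered.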




In the embodiment displayed in FIG. 6, the two panoramic camera systems use a single dual sided reflector 615 to reflect the panoramic image from the main reflector into the respective image capture mechanisms. In an alternate embodiment (not shown), two panoramic camera systems can be placed in the other blind spot such that the two panoramic camera systems are arranged in a manner similar to the arrangement of FIG. 5.




Another Stereo Vision Embodiment





FIG. 7 illustrates yet another embodiment of a stereo vision panoramic camera system. In the embodiment of FIG. 7, a single image capture mechanism 750 is used to capture two slightly different panoramic images.




The stereo vision panoramic camera of FIG. 7 captures a first panoramic annular image using a first main reflector 735 and a second reflector 715 in the same manner described with reference to FIG. 1. However, the second reflector 715 in the stereo vision panoramic camera system of FIG. 7 is an electrically activated mirror. The stereo vision panoramic camera system of FIG. 7 also features a second main reflector 760 that is positioned at the correct position for an optical path that does not require a second reflector. Thus, by deactivating the electrically activated second reflector 715, the stereo vision panoramic camera system captures a second panoramic annular image using the second main reflector 760. The two panoramic annular images can be combined to deliver a stereo image of the surrounding panorama.




Panoramic Camera System with Protective Shield




When collecting light reflected off the main reflector of a panoramic camera system, it is desirable to eliminate any influence from light from other sources. For example, ambient light should not be able to enter the optical system that is intended only to collect the panoramic image reflected off of the main reflector.





FIG. 8 illustrates a cross-section view of an improved panoramic camera system 800 that collects only the light from the reflected panoramic image. It should be noted that the real panoramic camera system is cylindrically symmetrical. The panoramic camera system 800 uses two light shields (875 and 877) to block all light that is not from the reflected image off of the main reflector.




The first light shield 875 is mounted on top of the second reflector 815 that reflects the panoramic image on the main reflector 810 down into the optical path of the camera system. The first light shield 875 prevents light from above the panoramic camera's maximum vertical viewing angle from entering the optical path. In one embodiment, the panoramic camera's maximum vertical viewing angle is 50 degrees such that the first light shield 875 prevents light coming from an angle greater than 50 degrees from entering the panoramic camera's optical path.




The second light shield 877 is placed around the opening of the panoramic camera's lens system. The second light shield prevents light from entering the camera's optical path unless that light has reflected off the main reflector 810 and has reflected off the second reflector 815 down into the optical path.





FIG. 8

also illustrates that the second reflector


815


can be constructed using a convex mirror instead of the flat mirror. By using a convex mirror as the second reflector, the second reflector can be placed closer to the main body of the camera system.




Overexposure Control




A panoramic camera system must be able to handle a much wider variety of lighting conditions than a conventional (limited viewing angle) camera system. A conventional camera system only captures light from a small viewing angle such that the intensity of light from the viewing angle will probably not vary a great amount. However, a panoramic camera system captures light from all directions, so a wide variety of lighting conditions must be handled. For example, in a panoramic camera system, light from a first direction may come directly from the sun while light from a second direction may consist of ambient light reflected off of an object in a shadow. To capture a high quality panoramic image, it would be desirable to adjust the amount of light captured from each viewing direction such that the light exposure from the different directions does not vary wildly.





FIG. 8 illustrates a panoramic camera constructed to limit the light received from the different directions. To adjust the amount of light captured from each direction, the panoramic camera system 800 includes an adaptive light filter 890 in the optical path of the panoramic camera system. The adaptive light filter 890 limits the amount of light that reaches the image capture mechanism 850.




In the illustration of FIG. 8, the adaptive light filter 890 is placed just before the image capture mechanism 850. This position minimizes the detrimental effects caused by any scattering of light by the adaptive light filter 890. However, the adaptive light filter 890 can be placed at any point in the optical path of the panoramic camera system.




A Passive Filtering System




One method of implementing an adaptive light filter 890 is to use a normally transparent light sensitive material that darkens when the material is exposed to large quantities of light. For example, a refractive neutral lens made of photogray material would automatically limit the amount of light from high intensity viewing directions. Examples of photogray glass include PhotoGray Extra and PhotoGray II made by Corning Glass Works of Corning, New York.




An Active Filtering System




Another method of implementing an adaptive light filter 890 is to use an electronically controlled Liquid Crystal Display (LCD) array as the adaptive light filter 890. Ideally, the LCD array would be capable of selectively adjusting the amount of light that passes through any point of the LCD array.




To control the LCD array, an LCD control circuit (not shown) would be coupled to the electronic image capture mechanism 850 of the panoramic camera system 800. The electronic image capture mechanism 850 would determine the relative light intensity at each point on the electronic image capture mechanism. The light intensity information from the electronic image capture mechanism 850 is passed to the LCD control circuit that determines how the LCD array should limit the light that passes through. Specifically, when the electronic image capture mechanism 850 detects an area that is receiving high intensity light, the LCD control circuit would darken the corresponding area on the LCD array. Thus, the LCD array would selectively reduce the amount of light that reaches the image capture mechanism from high light intensity directions. The “flattening” of the light intensity results in captured panoramic annular images with greater contrast.
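The control loop can be sketched as a per-pixel attenuation map computed from the sensor's measured intensities. All thresholds and ranges below are hypothetical, chosen only to illustrate the darken-where-bright behavior:

```python
import numpy as np

def lcd_attenuation_mask(intensity, target=0.5, max_attenuation=0.9):
    """Compute a per-pixel LCD transmission map from measured intensity.

    `intensity` is the normalized (0..1) light level at each sensor point.
    Regions brighter than `target` are darkened proportionally, up to
    `max_attenuation`, flattening the exposure across viewing directions.
    Returns the fraction of light each LCD cell should transmit.
    """
    excess = np.clip(intensity - target, 0.0, None) / max(1.0 - target, 1e-9)
    transmission = 1.0 - max_attenuation * excess
    return np.clip(transmission, 1.0 - max_attenuation, 1.0)
```

In a real system this map would be recomputed continuously from each captured frame and written back to the LCD array.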




A Solid Camera Embodiment




The panoramic camera system illustrated in FIG. 8 uses an outer surface mirror for the main reflector. An outer surface mirror is used since an inner surface mirror protected by a transparent material would have refractive effects caused when the light enters the transparent material and when the light exits the transparent material. Since the panoramic camera system illustrated in FIG. 8 uses an outer surface mirror, the camera must be used cautiously to prevent damage to the outer mirror surface. It would therefore be desirable to implement a panoramic camera that protects the main reflector.





FIG. 9 illustrates an embodiment of a panoramic camera system constructed of a solid transparent block. In the embodiment of FIG. 9, the main reflector 910 is protected by a transparent material 912. The transparent material 912 is shaped such that all the light that will be used to create the annular reflection of the surrounding panorama enters the transparent material 912 at a normal to the surface of the transparent material 912, as illustrated by the right angle marks on the light rays. Since the light rays that create the annular image enter at a normal to the surface, there is no refractive effect as the light enters the transparent material 912. The outer surface of the transparent material 912 is coated with a multicoat material such that internal reflections are prevented.
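The no-refraction claim for normal incidence follows directly from Snell's law, n1 sin θ1 = n2 sin θ2: when the incident angle is zero, the transmitted angle is zero regardless of the refractive indices. A quick check (the index 1.5 is a typical value for glass, not a figure from the patent):

```python
import math

def refraction_angle(theta_incident_deg, n1=1.0, n2=1.5):
    """Transmitted angle from Snell's law: n1*sin(t1) = n2*sin(t2).

    At normal incidence (0 degrees) the ray passes undeviated, which is
    why the block's entry surface is shaped normal to the image rays.
    """
    s = (n1 / n2) * math.sin(math.radians(theta_incident_deg))
    return math.degrees(math.asin(s))
```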




Once a light ray that will form part of the panoramic image enters the transparent material 912, the light ray reflects off the main reflector 910, then reflects off the second reflector 915, and then exits the transparent material 912 at surface 920. Thus, the light remains in the transparent material 912 until it enters the lens system. The surface 920 can be shaped such that all light that is part of the annular image exits at a normal to the surface 920 such that the transparent material 912 has no refractive effect on the light. Alternatively, the surface 920 can be shaped such that surface 920 is part of the lens system.




The embodiment in FIG. 9 includes two light shields 975 and 977 to prevent undesired light from entering the optical path. It should be noted that the panoramic camera system can also be constructed without the light shields 975 and 977.




Annular Image Processing




As previously described, the panoramic annular images can be geometrically transformed from the annular image into more conventional rectangular projections. One method of performing this operation is to use digital image processing techniques as described in the related U.S. patent application titled “Panoramic Camera” filed on May 8, 1997, with Ser. No. 08/853,048.




When photographic film is used to capture the annular images, the annular images will not always be recorded in the exact same position on the film. One reason for this is that sprockets used to advance film through a camera are slightly smaller than the corresponding holes in the film. Thus, the film alignment between exposures tends to vary. This effect is known as “gate weave.”




To process an annular image, the center coordinate of the digitized annular image must be known in order to rotate a selected viewport into a standard view. Since gate weave causes the center coordinate to vary, the center coordinate must be determined for each annular image that originated from photographic film. FIGS. 10a, 10b, 11a and 11b illustrate a method of determining the center coordinate of a digitized panoramic annular image that originated from photographic film.




Referring to the flow diagram of FIG. 11a, step 1110 selects an initial proposed center point along a first axis. Referring to FIG. 10a, an initial proposed center point PC1 is illustrated along a Y axis (the first axis). Next, at step 1120, the annular video to standard video conversion software finds a first pixel along an orthogonal second axis that passes through the first proposed center point and exceeds a threshold value. In FIG. 10a this is illustrated as FP1 on an X axis. As illustrated in FIG. 10a, the threshold value is selected to locate the first pixel along the edge of the annular image. Next, a last pixel that exceeds the threshold and is located along the second axis that passes through the first proposed center point (PC1) is selected. In FIG. 10a, that last pixel is LP1 along an X axis. Next, at step 1130, the converter selects the midpoint between the first pixel FP1 and the last pixel LP1 along the second axis as a second proposed center point. In FIG. 10a, the second proposed center point is illustrated as PC2. The second proposed center point is closer to the actual center than the first proposed center point.




This process is repeated again after switching axes. Specifically, in step 1140 a first pixel along the first axis that passes through the second proposed center point and exceeds a threshold value is selected. This is illustrated in FIG. 10b as point FP2 along a Y axis. Then a last pixel along the first axis that passes through the second proposed center point and exceeds the threshold value is selected. In FIG. 10b this is illustrated as LP2. Then a midpoint between the first pixel FP2 and the last pixel LP2 is selected as the third proposed center point. This is illustrated in FIG. 10b as third proposed center point PC3. The third proposed center point is also referred to as the first proposed center point for purposes of repeating the method steps.




The method proceeds to step 1160 where it determines if the first/third proposed center point is equal to the second proposed center point. This test determines whether the same center point has been selected again. If this occurs, then the method proceeds down to step 1180 where the second proposed center point is selected as the center point of the annular image. If the first proposed center point is not the same as the second proposed center point, the method proceeds to step 1170 where the method determines if a minimum number of iterations have been performed. If this has not occurred, then the method proceeds back up to step 1120 where it can repeat additional iterations of the method to determine a more accurate center point.
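The iterative bisection described in FIGS. 10a through 11b can be sketched directly in code. The sketch below assumes a digitized grayscale frame in which annulus pixels exceed a brightness threshold; the names and the iteration cap are illustrative, not from the patent:

```python
import numpy as np

def find_annulus_center(img, threshold=128, start=None, max_iter=32):
    """Locate the center of a bright annulus by alternating-axis bisection.

    Scan the line through the current estimate, take the first and last
    pixels above `threshold`, and move the estimate to their midpoint;
    alternate between the X and Y axes. Stops when the same center is
    selected again. `img` is a 2-D array; `start` is (row, col).
    Returns (cx, cy) as (column, row).
    """
    h, w = img.shape
    cy, cx = start if start is not None else (h // 2, w // 2)
    for _ in range(max_iter):
        # Bisect along the horizontal line (second axis) through row cy.
        cols = np.flatnonzero(img[cy, :] > threshold)
        new_cx = (cols[0] + cols[-1]) // 2 if cols.size else cx
        # Bisect along the vertical line (first axis) through the new column.
        rows = np.flatnonzero(img[:, new_cx] > threshold)
        new_cy = (rows[0] + rows[-1]) // 2 if rows.size else cy
        if (new_cx, new_cy) == (cx, cy):
            break  # same center point selected again; converged
        cx, cy = new_cx, new_cy
    return cx, cy
```

On a clean synthetic annulus the estimate typically converges in one or two passes, mirroring the small iteration counts the flow diagram anticipates.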




The foregoing disclosure has described several panoramic camera embodiments. It is contemplated that changes and modifications may be made by one of ordinary skill in the art, to the materials and arrangements of elements of the present invention without departing from the scope of the invention.



Claims
  • 1. A camera apparatus, said camera apparatus comprising: an image capture mechanism; and a main reflector, said main reflector reflecting light from a full hemisphere view onto said image capture mechanism; wherein said main reflector comprises a cylindrically symmetrical shape of a parabola segment rotated about an axis, said parabola segment comprising a vertex, a first side of said parabola segment, and a second side of said parabola segment shorter than said first side and adjacent to said axis.
  • 2. The apparatus as claimed in claim 1 further comprising: a second reflector, said second reflector positioned such that said light is reflected from said main reflector onto said second reflector and then from said second reflector onto said image capture mechanism.
  • 3. The apparatus as claimed in claim 1 wherein said light passes through a set of lenses before landing on said image capture mechanism.
  • 4. A camera apparatus, said camera apparatus comprising: an image capture mechanism; and a main reflector, said main reflector comprising a paraboloid shape with a dimple on an apex; wherein said main reflector comprises a cylindrically symmetrical shape of a parabola segment rotated about an axis, said parabola segment comprising a vertex, a first side of said parabola segment, and a second side of said parabola segment shorter than said first side and adjacent to said axis.
  • 5. The apparatus as claimed in claim 4 further comprising: a second reflector, said second reflector positioned such that said light is reflected from said main reflector onto said second reflector and then from said second reflector onto said image capture mechanism.
  • 6. The apparatus as claimed in claim 5 wherein said light passes through a set of lenses before landing on said image capture mechanism.
US Referenced Citations (138)
Number Name Date Kind
2146662 Van Albada Feb 1939 A
2244235 Ayres Jun 1941 A
2628529 Braymer Feb 1953 A
2654286 Cesar Oct 1953 A
3205777 Brenner Sep 1965 A
3692934 Herndon Sep 1972 A
3723805 Scarpino et al. Mar 1973 A
3785715 Mechlenborg Jan 1974 A
3832046 Mecklenborg Aug 1974 A
3846809 Pinzone et al. Nov 1974 A
3872238 Herndon Mar 1975 A
3934259 Krider Jan 1976 A
3998532 Dykes Dec 1976 A
4012126 Rosendahl et al. Mar 1977 A
4017145 Jerie Apr 1977 A
4038670 Seitz Jul 1977 A
4058831 Smith Nov 1977 A
4078860 Globus et al. Mar 1978 A
4157218 Gordon et al. Jun 1979 A
4190866 Luknar Feb 1980 A
4241985 Globus et al. Dec 1980 A
D263716 Globus et al. Apr 1982 S
4326775 King Apr 1982 A
4395093 Rosendahl et al. Jul 1983 A
4429957 King Feb 1984 A
4463380 Hooks, Jr. Jul 1984 A
4484801 Cox Nov 1984 A
4518898 Tarnowski et al. May 1985 A
4549208 Kamejima et al. Oct 1985 A
4561733 Kreischer Dec 1985 A
4566763 Greguss Jan 1986 A
4578682 Hooper et al. Mar 1986 A
4593982 Rosset Jun 1986 A
4602857 Woltz et al. Jul 1986 A
4656506 Ritchey Apr 1987 A
4661855 Gulck Apr 1987 A
4670648 Hall et al. Jun 1987 A
4728839 Coughlan et al. Mar 1988 A
4736436 Yasukawa et al. Apr 1988 A
4742390 Francke et al. May 1988 A
4751660 Hedley Jun 1988 A
4754269 Kishi et al. Jun 1988 A
4761641 Schreiber Aug 1988 A
4772942 Tuck Sep 1988 A
4807158 Blanton et al. Feb 1989 A
4835532 Fant May 1989 A
4858002 Zobel Aug 1989 A
4858149 Quarendon Aug 1989 A
4864335 Corrales Sep 1989 A
4868682 Shimizu et al. Sep 1989 A
4899293 Dawson et al. Feb 1990 A
4901140 Lang et al. Feb 1990 A
4907084 Nagufusa Mar 1990 A
4908874 Gabriel Mar 1990 A
4918473 Blackshear Apr 1990 A
4924094 Moore May 1990 A
4943821 Gelphman et al. Jul 1990 A
4943851 Lang et al. Jul 1990 A
4945367 Blackshear Jul 1990 A
4965844 Oka et al. Oct 1990 A
D312263 Charles Nov 1990 S
4974072 Hasegawa Nov 1990 A
4985762 Smith Jan 1991 A
4991020 Zwirn Feb 1991 A
5005083 Grage et al. Apr 1991 A
5016109 Gaylord May 1991 A
5020114 Fujioka et al. May 1991 A
5021813 Corrales Jun 1991 A
5023725 McCutchen Jun 1991 A
5038225 Maeshima Aug 1991 A
5040055 Smith Aug 1991 A
5048102 Tararine et al. Sep 1991 A
5067019 Juday et al. Nov 1991 A
5068735 Tuchiya et al. Nov 1991 A
5097325 Dill Mar 1992 A
5115266 Troje May 1992 A
5130794 Ritchey Jul 1992 A
5142354 Suzuki et al. Aug 1992 A
5153716 Smith Oct 1992 A
5157491 Kassatly Oct 1992 A
5166878 Poelstra Nov 1992 A
5173948 Blackham et al. Dec 1992 A
5175808 Sayre Dec 1992 A
5185667 Zimmermann Feb 1993 A
5187571 Braun et al. Feb 1993 A
5189528 Takashima et al. Feb 1993 A
5200818 Neta et al. Apr 1993 A
5231673 Elenga Jul 1993 A
5259584 Wainwright Nov 1993 A
5262852 Eouzan et al. Nov 1993 A
5262867 Kojima Nov 1993 A
5280540 Addeo et al. Jan 1994 A
5289312 Hashimoto et al. Feb 1994 A
5305035 Schonherr et al. Apr 1994 A
5311572 Freides et al. May 1994 A
5313306 Kuban et al. May 1994 A
5315331 Ohshita May 1994 A
5341218 Kaneko et al. Aug 1994 A
5359363 Kuban et al. Oct 1994 A
5384588 Martin et al. Jan 1995 A
5396583 Chen et al. Mar 1995 A
5422987 Yamada Jun 1995 A
5432871 Novik Jul 1995 A
5444476 Conway Aug 1995 A
5446833 Miller et al. Aug 1995 A
5452450 Delory Sep 1995 A
5473474 Powell Dec 1995 A
5479203 Kawai et al. Dec 1995 A
5490239 Myers Feb 1996 A
5495576 Ritchey Feb 1996 A
5530650 Bifero et al. Jun 1996 A
5539483 Nalwa Jul 1996 A
5601353 Naimark et al. Feb 1997 A
5606365 Maurinus et al. Feb 1997 A
5610391 Ringlien Mar 1997 A
5612533 Judd et al. Mar 1997 A
5633924 Kaish et al. May 1997 A
5649032 Burt et al. Jul 1997 A
5682511 Sposato et al. Oct 1997 A
5686957 Baker et al. Nov 1997 A
5714997 Anderson et al. Feb 1998 A
5729471 Jain et al. Mar 1998 A
5748194 Chen May 1998 A
5760826 Nayar Jun 1998 A
5761416 Mandet et al. Jun 1998 A
5764276 Martin et al. Jun 1998 A
5774569 Waldenmaier Jun 1998 A
5796426 Gullichsen et al. Aug 1998 A
5841589 Davis et al. Nov 1998 A
5844520 Guppy et al. Dec 1998 A
5850352 Moezzi et al. Dec 1998 A
5854713 Kuroda et al. Dec 1998 A
5877801 Martin et al. Mar 1999 A
5920337 Glassman et al. Jul 1999 A
5920376 Bruckstein et al. Jul 1999 A
5990941 Jackson et al. Nov 1999 A
6002430 McCall et al. Dec 1999 A
6144406 Girard et al. Nov 2000 A
Foreign Referenced Citations (2)
Number Date Country
2 221 118 Jan 1990 GB
2289820 Nov 1995 GB
Non-Patent Literature Citations (54)
Entry
Supplemental Information Disclosure Statement in re: the Application of Steven D. Zimmerman, et al., Application No. 08/662,410, Filed Jul. 12, 1996; 08 Pages including PTO 1449 Form citing 19 references.
Heckbert, P., "Survey of Texture Mapping". IEEE CG&A, Nov. 1986. pp. 56-67.
Defendants IPI's Notice of Reliance of Prior Art and Witnesses, Civil Action of Interactive Pictures Corporation, F/K/A Omniview, Inc. v. Infinite Pictures, Inc. and Bill Tillman, Case No. 3-96-849; 05 Pages. Filed: Dec. 8, 1997, in U.S.D.C., Eastern District of Tennessee.
Defendants IPI's Composite Exhibit List, Civil Action of Interactive Pictures Corporation, F/K/A Omniview, Inc. v. Infinite Pictures, Inc. and Bill Tillman, Case No. 3-96-849; 02 Pages. Filed: Jan. 5, 1998, in U.S.D.C., Eastern District of Tennessee.
Plaintiff's Rule 26(a)(3) Disclosures, Civil Action of Interactive Pictures Corporation, F/K/A Omniview, Inc. v. Infinite Pictures, Inc. and Bill Tillman, Case No. 3-96-849; 31 Pages. Filed: Dec. 8, 1997, in U.S.D.C., Eastern District of Tennessee.
Plaintiff's Supplemental Trial Exhibit List, Civil Action of Interactive Pictures Corporation, F/K/A Omniview, Inc. v. Infinite Pictures, Inc. and Bill Tillman, Case No. 3-96-849; 41 Pages. Filed: Jan. 2, 1998, in U.S.D.C., Eastern District of Tennessee.
Ripley, G. David, "DVI-A Digital Multimedia Technology". Communications of the ACM. Jul. 1989. vol. 32. No. 07. pp. 811-820.
Onoe, M. and Kuno, Y., "Digital Processing Of Images Taken By Fish-Eye Lens". IEEE. pp. 105-108.
Hamit, F., "Near-Fisheye CCD Camera Broadens View for Imaging". Advanced Imaging. Mar. 1993. pp. 50-52.
Dixon, D., Golin, S., and Hasfield, I., “DVI Video/Graphics”. Computer Graphics World reprinted from the Jul. 1987 edition of Computer Graphics World. p. 4.
Upstill, Steve. “Building Stronger Images”. UNIX Review. Oct. 1988. vol. 06. No. 10. pp. 63-73.
Greene, N., "Environment Mapping and Other Applications of World Projections". Computer Graphics and Applications, Nov. 1986. IEEE Computer Society. vol. 06. No. 11. pp. 21-29.
Heckbert, P., "The PMAT and Poly User's Manual". Computer Graphics Lab. N.Y.I.T., Feb. 18, 1983. pp. 1-29.
Heckbert, P., Fundamentals of Texture Mapping and Image Warping. Master's Thesis, p. 86. Dated: Jun. 17, 1989.
Rebiai, M., Mansouri, S., Pinson, F., and Tichit, B., "Image Distortion From Zoom Lenses: Modeling and Digital Correction". International Broadcasting Convention. IEEE, Dated: Jul. 1992.
Charles Jeffery, R., “All-Sky Reflector with “Invisible” Camera Support”. Images from 1988 RTMC Proceedings, pp. 79-80.
Roger W. Sinnott, "Scientific Library: Gleanings for ATMs". Sky & Telescope. Aug. 1986. p. 186.
Charles et al., “How to Build and Use an All-Sky Camera.” Astronomy. Apr. 1987. pp. 64-70.
Deutsch, Claudia H., "One Camera That Offers Many Views". The New York Times.
Johnson, Colin R., “Imaging System Sees All”. Electronic Engineering Times. Dec. 25, 1996. pp. 1&98.
“Panospheric Camera Expands Horizon”. p. 01.
"Panospheric Camera Developed at Carnegie Mellon Expands Horizon". p. 01.
Castleman, K., “Digital Image Processing”. Prentice Hall. 1979. pp. 110-135, 383-400, 408.
Castleman, K., “Digital Image Processing”. Prentice Hall. 1996. pp. 125-127, 140-141.
Shah, S., A Simple Calibration Procedure For Fish-Eye (High Distortion) Lens. IEEE. 1994. pp. 3422-3427.
“Gnomonic Projection”. Map Projections-A Working Manual. pp. 164-168.
Greene, N., and Heckbert, P., "Creating Raster Omnimax Images From Multiple Perspective Views Using The Elliptical Weighted Average Filter". IEEE. 1986. pp. 21-27.
Fant, K., "A Nonaliasing, Real-Time Spatial Transform Technique". IEEE. 1986. pp. 71-80.
Green, William B., “Qualitative Image Processing Techniques”. Digital Image Processing, A Systems Approach. 2nd Edition. 1989. Van Nostrand Reinhold. pp. 92-112.
Wolberg, George. Digital Image Warping (introduction). 1990. IEEE Computer Society Press. p. 2.
Fu, K.S. et al., "Low-Level Vision". Robotics: Control, Sensing, Vision, and Intelligence. 1987. McGraw Hill Inc., pp. 313-315.
Carlbom, Ingrid et al., "Planar Geometric Projections and Viewing Transformations". Computing Surveys. vol. 10. No. 04. Dec. 1978. pp. 465-502.
Anderson, R.L., et al., “Omnidirectional Real time Imaging Using Digital Restoration”. High Speed Photography SPIE. vol. 348. San Diego, CA. 1982. pp. 807-814.
Laikin, Milton. "Wide Angle Lens Systems". 1980 International Lens Design Conference (OSA). SPIE. vol. 237. 1980. pp. 530-532, 815-816.
Shah, Shishir et al., "Depth Estimation using Fish-Eye Lenses". IEEE. Department Of Electrical and Computer Engineering. University of Texas. 1994. pp. 740-744.
Tsai, Roger Y., “A Versatile Camera Calibration Technique for High Accuracy 3-D Machine Vision Using Off-the-Shelf TV Cameras and Lenses”. IEEE. Journal of Robotics and Automation. vol. RA-3. No. 04. Aug. 1987. pp. 323-344.
Chang, Yuh-Lin et al., "Calibrating a Mobile Camera's Parameters". Pattern Recognition. vol. 26. No. 01. Dated: 1993. pp. 75-88.
Weng, Juyang. "Camera Calibration With Distortion Models and Accuracy Evaluation". IEEE. Transactions On Pattern Analysis and Machine Intelligence. vol. 14. No. 10. Oct. 1992. pp. 965-980.
Lenz, Reimer K. et al., "Techniques for Calibration of the Scale Factor and Image Center for High Accuracy 3-D Machine Vision Metrology". IEEE. Transactions on Pattern Analysis and Machine Intelligence. vol. 10. No. 05. Sep. 1988. pp. 713-720.
Nomura, Yoshihiko, et al., "A Simple Calibration Algorithm for High-Distortion Lens Camera". IEEE. Transactions on Pattern Analysis and Machine Intelligence. vol. 14. No. 11. Nov. 1992. pp. 1095-1099.
International Broadcasting Convention, Venue: RAI Congress And Exhibition Centre, Amsterdam, The Netherlands. Jul. 3-7, 1992. 06 Pages, including the title page.
Telerobotics International, Inc., "Optimizing The Camera And Positioning System For Telerobotic Worksite Viewing".
Miyamoto, K., “Fish Eye Lens”. JOSA. vol. 54. pp. 1060-1061. Dated: Aug. 1964.
Defendant's IPI's Composite Exhibit List, Civil Action of Interactive Pictures Corporation, F/K/A Omniview, Inc. v. Infinite Pictures, Inc. and Bill Tillman. Case No. 3-96-849. Filed: Jan. 5, 1998 in U.S.D.C., Eastern District Of Tennessee. p. 20.
Baltes, M., "Brevet D'Invention". Ref. No.: N 1.234.341.
Verity, John W. (edited by): Information Processing. Business Week. p. 134E. Dated Jul. 13, 1992.
Marbach, William D. (edited by): Developments To Watch. Business Week. p. 83. Dated Sep. 26, 1988.
Lu Carnevale, Mary. Video Camera Puts The Viewer in Control. Wall Street Journal. Dated: Nov. 25, 1992.
Popular Science. Electronic Panning Camera System. pp. 36-37. Dated Sep. 1992.
Tulloch, Martha. “New Video Camera . . . ” Photonics Spectra. pp. 18-20. Dated Oct. 1992.
Fisher, Timothy E., "A Programmable Video Image Remapper". SPIE. vol. 938. pp. 122-128. Dated: 1988.
Lippman, Andrew. Movie-Map: An Application Of The Optical Videodisc To Computer Graphics. p. 43. Dated: 1980.
Yelick, Steven. Anamorphic Image Processing. pp. 1-37, Including Acknowledgement Page. Dated: 1980.
Chen, Shenchang Eric. QuickTime VR-An Image-Based Approach To Virtual Environment Navigation. p. 39. Dated: 1995.