Method and apparatus for implementing a panoptic camera system

Information

  • Patent Grant
  • Patent Number
    6,313,865
  • Date Filed
    Monday, January 10, 2000
  • Date Issued
    Tuesday, November 6, 2001
Abstract
A panoptic camera system that can be used to capture all the light from a hemisphere viewing angle is disclosed. The panoptic camera comprises a main reflecting mirror that reflects light from an entire hemisphere onto an image capture mechanism. The main reflecting mirror consists of a paraboloid shape with a dimple on an apex. The surface area around the dimple allows the main reflector to capture light from behind an image capture mechanism or a second reflector. When two panoptic camera systems that capture the light from an entire hemisphere are placed back to back, a camera system that “sees” light from all directions is created. A stereo vision panoramic camera system is also disclosed. The stereo vision panoramic camera system comprises two panoramic camera systems that are separated by a known distance. The two panoramic camera systems are each placed in a “blind spot” of the other panoramic camera system. By using the different images generated by the two panoramic camera systems and the known distance between the two panoramic camera systems, the range to objects within the panoramic images can be determined.
Description




FIELD OF THE INVENTION




The present invention relates to the field of film and video photography. In particular the present invention discloses a panoptic camera device that captures virtually all the light that converges on a single point in space.




BACKGROUND OF THE INVENTION




Most cameras only record a small viewing angle. Thus, a typical conventional camera only captures an image in the direction that the camera is aimed. Such conventional cameras force viewers to look only at what the camera operator chooses to focus on. Some cameras use a specialized wide angle lens or "fish-eye" lens to capture a wider panoramic image. However, such panoramic cameras still have a relatively limited field of view.




In many situations, it would be much more desirable to have a camera system that captures light from all directions. For example, a conventional surveillance camera can be compromised by a perpetrator that approaches the camera from a direction that is not within the viewing angle of the camera. An ideal surveillance camera would capture light from all directions such that the camera would be able to record an image of a person that approaches the camera from any direction.




It would be desirable to have a camera system that would capture the light from all directions such that a full 360 degree panoramic image can be created. A full 360 degree panoramic image would allow the viewer to choose what she would like to look at. Furthermore, a full 360 degree panoramic image allows multiple viewers to simultaneously view the world from the same point, with each being able to independently choose their viewing direction and field of view.




SUMMARY OF THE INVENTION




The present invention introduces a panoptic camera system that can be used to capture all the light from a hemisphere viewing angle. The panoptic camera comprises a main reflecting mirror that reflects light from an entire hemisphere onto an image capture mechanism. The main reflecting mirror consists of a paraboloid shape with a dimple on an apex. The surface area around the dimple allows the main reflector to capture light from behind an image capture mechanism or a second reflector. When two panoptic camera systems that capture the light from an entire hemisphere are placed back to back, a camera system that “sees” light from all directions is created.




A stereo vision panoramic camera system is also disclosed. The stereo vision panoramic camera system comprises two panoramic camera systems that are separated by a known distance. The two panoramic camera systems are each placed in a “blind spot” of the other panoramic camera system. By using the different images generated by the two panoramic camera systems and the known distance between the two panoramic camera systems, the range to objects within the panoramic images can be determined.




Other objects, features, and advantages of the present invention will be apparent from the accompanying drawings and from the detailed description that follows below.











BRIEF DESCRIPTION OF THE DRAWINGS




The objects, features and advantages of the present invention will be apparent to one skilled in the art, in view of the following detailed description in which:





FIG. 1 illustrates one embodiment of a panoramic camera system.

FIG. 2a illustrates an annular image that is recorded by the panoramic camera system of FIG. 1.

FIG. 2b illustrates how the annular image of FIG. 2a appears after it has been unwrapped by polar to rectangular mapping software.

FIG. 3 graphically illustrates the 360 degree band of light that is captured by the panoramic camera system of FIG. 1.

FIG. 4a illustrates an embodiment of a panoramic camera system that captures all the light from a hemisphere above and around the panoramic camera system.

FIG. 4b is a conceptual diagram used to illustrate the shape of the panoptic camera system in FIG. 4a.

FIG. 5 illustrates an embodiment of a panoramic camera system that captures light from all directions around the panoramic camera system.

FIG. 6 illustrates a first embodiment of a panoramic camera with stereo vision.

FIG. 7 illustrates a second embodiment of a panoramic camera with stereo vision.

FIG. 8 illustrates an embodiment of a panoramic camera system that shields unwanted light and limits the amount of light that reaches the image plane.

FIG. 9 illustrates an embodiment of a panoramic camera that is constructed using a solid transparent material such that the inner components are protected.

FIGS. 10a and 10b graphically illustrate a method of locating the center of an annular panoramic image.

FIGS. 11a and 11b illustrate a flow diagram describing the method of locating the center of an annular panoramic image.











DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT




A method and apparatus for implementing a panoptic camera is disclosed. In the following description, for purposes of explanation, specific nomenclature is set forth to provide a thorough understanding of the present invention. However, it will be apparent to one skilled in the art that these specific details are not required in order to practice the present invention. For example, the invention is described with reference to particular reflector shapes, lens arrangements, and image capture mechanisms; the same techniques can readily be applied to other optical configurations.




A Panoptic Camera





FIG. 1 illustrates a cross section view of panoramic camera system 100 that captures an image of the surrounding panorama. It should be noted that the camera system is cylindrically symmetrical such that it captures light from a 360 degree band around a point.




The panoramic camera system 100 operates by reflecting all the light from a 360 degree band with a parabolic reflector 110 to a second reflector 115, and then through a set of lenses 120, 130, and 140 to an image capture mechanism 150. The set of lenses corrects various optical artifacts created by the parabolic mirror. The image capture mechanism 150 may be a chemical based film image capture mechanism or an electronic based image capture mechanism such as a CCD. Details on how to construct such a panoramic camera can be found in the U.S. patent application titled "Panoramic Camera" filed on May 8, 1997, with Ser. No. 08/853,048.





FIG. 2a illustrates how an image captured by the panoramic camera system 100 of FIG. 1 appears. As illustrated in FIG. 2a, the surrounding panorama is captured as an annular image on a two dimensional surface. The annular image can later be processed by an optical or electronic image processing system to display the image in a more familiar format. FIG. 2b illustrates how the annular image of FIG. 2a appears after it has been geometrically transformed from the annular image into a rectangular image by image processing software. In one embodiment, the transformation approximates a transform from polar coordinates to rectangular coordinates.
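The geometric transform described above can be sketched in a few lines of code. The following is only an illustrative sketch of a polar-to-rectangular unwrapping, assuming a digitized annular image whose center coordinate and inner and outer radii are already known; the function name and parameters are hypothetical and are not taken from the patent.

    import numpy as np

    def unwrap_annular(annular, cx, cy, r_inner, r_outer, out_w=1024, out_h=256):
        """Map an annular (donut shaped) image to a rectangular panorama.

        Each output column corresponds to an azimuth angle and each output row
        to a radius between r_inner and r_outer; nearest-neighbor sampling
        keeps the sketch short.
        """
        theta = np.linspace(0.0, 2.0 * np.pi, out_w, endpoint=False)
        radius = np.linspace(r_outer, r_inner, out_h)  # top row samples the outer edge
        T, R = np.meshgrid(theta, radius)
        xs = np.clip(np.round(cx + R * np.cos(T)).astype(int), 0, annular.shape[1] - 1)
        ys = np.clip(np.round(cy + R * np.sin(T)).astype(int), 0, annular.shape[0] - 1)
        return annular[ys, xs]

Whether the outer or inner edge of the annulus corresponds to the horizon depends on the optical layout of a particular camera, so the radius ordering in such a sketch may need to be reversed.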





FIG. 3 graphically illustrates the band of light that is captured by the panoramic camera system of FIG. 1. As illustrated in FIG. 3, the panoramic camera system of FIG. 1 captures a 360 degree band of light that extends from 60 degrees above the horizon to 60 degrees below the horizon.




Camera System That Collects All Light From A Hemisphere




In certain applications, it would be desirable to have a camera system that collects all the light from a full hemisphere around the camera. For example, a camera system that collects all the light from a full hemisphere could be used by astronomers to capture an image of the entire night sky.





FIG. 4a illustrates a camera system similar to the camera system of FIG. 1 except that the camera system of FIG. 4a captures light from the horizon line all the way to the zenith. Thus, the camera system of FIG. 4a captures light from the entire hemisphere above the camera system.




The camera system operates by having a main reflector 410 that reflects light from the entire hemisphere above the camera system to a second reflector 415. The second reflector 415 reflects the light down through a lens system 420 to an image capture mechanism 440.




To be able to collect light from a full hemisphere, the main reflector of the camera system consists of a cylindrically symmetric mirror with a cross section that consists of an offset parabola. FIG. 4b illustrates the shape of a full parabola 450 that is then cut shortly after the apex on the side of the parabola near the center of the main reflector. The offset parabola reflects light from a slightly greater than 90 degree band that starts at the horizon (see light ray 481) and continues to the zenith (see light rays 485 and 489) and beyond. The short section of parabola near the center of the main reflector allows the main reflector to direct light from the zenith and beyond to the second reflector 470 and down into the image capture mechanism 440. This is illustrated by light ray 489.
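To make the geometry of the offset parabola concrete, the short sketch below generates a cylindrically symmetric surface of revolution from a parabola segment that is cut shortly past its apex on the side nearest the rotation axis. The focal length and extents are illustrative values only and are not dimensions taken from the patent.

    import numpy as np

    # Illustrative parameters (not from the patent): generating parabola z = s**2 / (4*f).
    f = 1.0        # focal length of the generating parabola
    s_inner = 0.1  # short section retained past the apex (the central "dimple" region)
    s_outer = 2.0  # outer extent of the parabola segment

    # Cross-section profile: s is the offset from the parabola's own apex.
    s = np.linspace(-s_inner, s_outer, 200)
    z = s ** 2 / (4.0 * f)
    r = s + s_inner            # radial distance from the rotation axis (r = 0 at the cut)

    # Sweep the profile around the rotation axis to obtain the mirror surface.
    theta = np.linspace(0.0, 2.0 * np.pi, 90)
    R, T = np.meshgrid(r, theta)
    X, Y = R * np.cos(T), R * np.sin(T)
    Z = np.tile(z, (theta.size, 1))

Because the rotation axis passes through the cut rather than through the apex of the parabola, the swept surface has a shallow dimple at its center, consistent with the abstract's description of a paraboloid with a dimple on its apex.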




Although the main reflector 410 of FIG. 4a captures light from the zenith and beyond, the main reflector has a slight "blind spot." The blind spot is limited to a small cone of space behind the second reflector 415, inside light ray 439. This small area in the blind spot can be used to implement a support fixture for the mirror. Alternatively, the small area in the blind spot can be used to implement supplemental lighting.




Camera System That Collects Light From All Directions




For some applications, it would be desirable to have a camera system that collects all the light that converges on a point from all directions. For example, an ideal security camera system would be able to “see” in all directions such that no perpetrator could sneak up on the camera from an angle not seen by the camera. Thus, no perpetrator could sneak up on the camera and disable the camera without having his image captured by the camera.




To construct a panoptic camera system 500 that collects light from all directions, the present invention discloses an arrangement of two hemisphere camera systems (one hemisphere system comprising 510, 515, and 540, and the other hemisphere system comprising 560, 565, and 590) joined together as illustrated in FIG. 5. The arrangement of FIG. 5 will produce two annular images: one annular image for the upper hemisphere and one annular image for the lower hemisphere. Since the two camera systems are aligned with each other, the two annular images can be optically or electronically combined to generate an image of the entire surroundings of the panoptic camera system.
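As a rough illustration of the electronic combination step, each annular image could first be unwrapped (for example with the hypothetical unwrap_annular sketch shown earlier) and the two bands stacked into a single panorama. This is a sketch under those assumptions rather than the patent's own processing chain; the exact flips needed depend on the optical layout of the inverted lower camera.

    import numpy as np

    def combine_hemispheres(upper_band, lower_band):
        """Stack two unwrapped hemisphere bands into one full panorama.

        upper_band -- unwrapped annular image from the upward-facing camera system
        lower_band -- unwrapped annular image from the downward-facing (inverted) camera system
        Both bands are assumed to share the same width (azimuth sampling) and to
        be rotationally aligned, since the two camera systems are aligned with
        each other.
        """
        lower = np.flipud(lower_band)          # undo the inversion of the second camera
        return np.vstack([upper_band, lower])  # horizon meets horizon at the seam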




A Panoramic Camera System With Stereo Vision




To gauge the range of visible objects, humans use stereo vision. Specifically, the two different viewing angles provided by the two eyes enable a human to determine the relative distance of visible objects. The same principle can be used to implement a panoramic camera system that has stereo vision.




A First Embodiment




Referring back to FIG. 1, the original panoramic camera has a blind spot above the second reflector. The blind spot is clearly illustrated in FIG. 3, wherein the area more than 60 degrees above the horizon and the area more than 60 degrees below the horizon are not captured by the panoramic camera system. A second panoramic camera can be placed in a blind spot of a first panoramic camera. FIG. 6 illustrates a stereo vision panoramic camera system constructed according to this technique.




The stereo panoramic camera system of FIG. 6 comprises a first panoramic camera 605 and a second inverted panoramic camera 635. Each panoramic camera system 605 and 635 is in a blind spot of the other panoramic camera system. By spatially separating the two panoramic camera systems, each panoramic camera system will record a slightly different annular image of the surrounding panorama. Using the known distance between the two panoramic camera systems and the two different annular images, the distance to objects within the annular images can be determined.
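The range computation can be illustrated with simple triangulation. The sketch below assumes the two cameras of FIG. 6 share a vertical axis separated by a known baseline and that the elevation angle of an object has already been measured in each camera's unwrapped image; the function and its parameters are hypothetical and are not part of the patent.

    import math

    def range_from_elevations(baseline_m, elev_upper_deg, elev_lower_deg):
        """Estimate the horizontal range to an object seen by two vertically
        separated panoramic cameras (hypothetical helper, not from the patent).

        baseline_m      -- known vertical distance between the two cameras
        elev_upper_deg  -- elevation of the object measured by the upper camera
        elev_lower_deg  -- elevation of the same object measured by the lower camera

        For an object at horizontal distance d and height z above the lower camera:
            tan(elev_lower) = z / d
            tan(elev_upper) = (z - baseline) / d
        Subtracting gives d = baseline / (tan(elev_lower) - tan(elev_upper)).
        """
        disparity = math.tan(math.radians(elev_lower_deg)) - math.tan(math.radians(elev_upper_deg))
        if abs(disparity) < 1e-9:
            raise ValueError("no measurable disparity; object is effectively at infinity")
        return baseline_m / disparity

    # Example: cameras 0.5 m apart; an object appears 10 degrees above the lower
    # camera's horizon and 7 degrees above the upper camera's horizon.
    print(range_from_elevations(0.5, elev_upper_deg=7.0, elev_lower_deg=10.0))  # roughly 9.3 m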




In the embodiment displayed in FIG. 6, the two panoramic camera systems 635 and 605 use a single dual sided reflector 615 to reflect the panoramic image from the main reflectors 610 and 660 into the respective image capture mechanisms. In an alternate embodiment (not shown), two panoramic camera systems can be placed in the other blind spot such that the two panoramic camera systems are arranged in a manner similar to the arrangement of FIG. 5.




Another Stereo Vision Embodiment





FIG. 7 illustrates yet another embodiment of a stereo vision panoramic camera system. In the embodiment of FIG. 7, a single image capture mechanism 750 is used to capture two slightly different panoramic images.

The stereo vision panoramic camera of FIG. 7 captures a first panoramic annular image using a first main reflector 735 and a second reflector 715 in the same manner described with reference to FIG. 1. However, the second reflector 715 in the stereo vision panoramic camera system of FIG. 7 is an electrically activated mirror. The stereo vision panoramic camera system of FIG. 7 also features a second main reflector 760 that is positioned at the correct position for an optical path that does not require a second reflector. Thus, by deactivating the electrically activated second reflector 715, the stereo vision panoramic camera system captures a second panoramic annular image using the second main reflector 760. The two panoramic annular images can be combined to deliver a stereo image of the surrounding panorama.




Panoramic Camera System With Protective Shield




When collecting light reflected off the main reflector of a panoramic camera system, it is desirable to eliminate any influence from light from other sources. For example, ambient light should not be able to enter the optical system that is intended only to collect the panoramic image reflected off of the main reflector.





FIG. 8 illustrates a cross-section view of an improved panoramic camera system 800 that collects only the light from the reflected panoramic image. It should be noted that the real panoramic camera system is cylindrically symmetrical. The panoramic camera system 800 uses two light shields (875 and 877) to block all light 881, 883, 885 that is not from the image reflected off of the main reflector.




The first light shield 875 is mounted on top of the second reflector 815 that reflects the panoramic image on the main reflector 810 down into the optical path of the camera system. The first light shield 875 prevents light from above the panoramic camera's maximum vertical viewing angle from entering. In one embodiment, the panoramic camera's maximum vertical viewing angle is 50 degrees such that the first light shield 875 prevents light coming from an angle greater than 50 degrees from entering the panoramic camera's optical path.




The second light shield 877 is placed around the opening of the panoramic camera's lens system 820, 830, 840. The second light shield prevents light from entering the camera's optical path unless that light has reflected off the main reflector 810 and has reflected off the second reflector 815 down into the optical path.





FIG. 8 also illustrates that the second reflector 815 can be constructed using a convex mirror instead of a flat mirror. By using a convex mirror as the second reflector, the second reflector can be placed closer to the main body of the camera system.




Overexposure Control




A panoramic camera system must be able to handle a much wider variety of lighting conditions than a conventional (limited viewing angle) camera system. A conventional camera system only captures light from a small viewing angle, so the intensity of the light within that viewing angle will probably not vary by a great amount. A panoramic camera system, however, captures light from all directions, so a wide variety of lighting conditions must be handled. For example, in a panoramic camera system the light from a first direction may come directly from the sun, while the light from a second direction may consist of ambient light reflected off of an object in a shadow. To capture a high quality panoramic image, it would be desirable to adjust the amount of light captured from each viewing direction such that the light exposure from the different directions does not vary wildly.





FIG. 8 illustrates a panoramic camera constructed to limit the light received from the different directions. To adjust the amount of light captured from each direction, the panoramic camera system 800 includes an adaptive light filter 890 in the optical path of the panoramic camera system. The adaptive light filter 890 limits the amount of light that reaches the image capture mechanism 850.




In the illustration of FIG. 8, the adaptive light filter 890 is placed just before the image capture mechanism 850. This position minimizes the detrimental effects caused by any scattering of light by the adaptive light filter 890. However, the adaptive light filter 890 can be placed at any point in the optical path of the panoramic camera system.




A Passive Filtering System




One method of implementing an adaptive light filter 890 is to use a normally transparent light sensitive material that darkens when the material is exposed to large quantities of light. For example, a refractive neutral lens made of photogray material would automatically limit the amount of light from high intensity viewing directions. Examples of photogray glass include PhotoGray Extra and PhotoGray II made by Corning Glass Works of Corning, N.Y.




An Active Filtering System




Another method of implementing an adaptive light filter 890 is to use an electronically controlled Liquid Crystal Display (LCD) array as the adaptive light filter 890. Ideally, the LCD array would be capable of selectively adjusting the amount of light that passes through any point of the LCD array.




To control the LCD array, an LCD control circuit (not shown) would be coupled to the electronic image capture mechanism 850 of the panoramic camera system 800. The electronic image capture mechanism 850 would determine the relative light intensity at each point on the electronic image capture mechanism. The light intensity information from the electronic image capture mechanism 850 is passed to the LCD control circuit that determines how the LCD array should limit the light that passes through. Specifically, when the electronic image capture mechanism 850 detects an area that is receiving high intensity light, the LCD control circuit would darken the corresponding area on the LCD array. Thus, the LCD array would selectively reduce the amount of light that reaches the image capture mechanism from high light intensity directions. The "flattening" of the light intensity results in captured panoramic annular images with greater contrast.
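A minimal control loop for such an LCD array might look like the sketch below, which darkens cells in proportion to how far the measured intensity exceeds a target exposure. The function, parameters, and update rule are hypothetical illustrations, not details disclosed in the patent.

    import numpy as np

    def update_lcd_transmission(intensity, target=0.6, gain=0.5, prev=None):
        """One control step for a hypothetical LCD adaptive light filter.

        intensity -- normalized (0..1) light intensity map measured by the image
                     capture mechanism, already registered to the LCD cell grid
        target    -- desired maximum normalized exposure
        gain      -- how aggressively the filter reacts on each update
        prev      -- previous per-cell transmission map (1.0 = fully transparent)

        Returns the new transmission map: cells that see intensity above the
        target are darkened so that intensity * transmission approaches the
        target; dimly lit cells stay transparent.
        """
        if prev is None:
            prev = np.ones_like(intensity, dtype=float)
        desired = np.clip(target / np.maximum(intensity, 1e-6), 0.0, 1.0)
        return prev + gain * (desired - prev)   # move gradually to avoid flicker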




A Solid Camera Embodiment




The panoramic camera system illustrated in FIG. 8 uses an outer surface mirror for the main reflector. An outer surface mirror is used since an inner surface mirror protected by a transparent material would have refractive effects caused when the light enters the transparent material and when the light exits the transparent material. Since the panoramic camera system illustrated in FIG. 8 uses an outer surface mirror, the camera must be used cautiously to prevent damage to the outer mirror surface. It would therefore be desirable to implement a panoramic camera that protects the main reflector.





FIG. 9 illustrates an embodiment of a panoramic camera system constructed of a solid transparent block. In the embodiment of FIG. 9, the main reflector 910 is protected by a transparent material 912. The transparent material 912 is shaped such that all the light that will be used to create the annular reflection of the surrounding panorama enters the transparent material 912 at a normal to the surface of the transparent material 912, as illustrated by the right angle marks on the light rays. Since the light rays that create the annular image enter at a normal to the surface, there is no refractive effect as the light enters the transparent material 912. The outer surface of the transparent material 912 is coated with a multicoat material such that internal reflections are prevented.
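The reason normal incidence avoids refraction follows directly from Snell's law: when the angle of incidence is zero, the angle of refraction is also zero, so the ray is not bent. The short worked example below uses illustrative refractive indices (air and a typical glass or optical plastic); the values are not taken from the patent.

    import math

    def refraction_angle_deg(incidence_deg, n1=1.0, n2=1.5):
        """Snell's law: n1 * sin(theta_i) = n2 * sin(theta_t).
        n1 and n2 default to air and a typical glass or optical plastic
        (illustrative values only)."""
        return math.degrees(math.asin((n1 / n2) * math.sin(math.radians(incidence_deg))))

    def deviation_deg(incidence_deg, n1=1.0, n2=1.5):
        """How much the ray is bent as it crosses the surface."""
        return incidence_deg - refraction_angle_deg(incidence_deg, n1, n2)

    print(deviation_deg(0.0))   # 0.0 -- a ray entering along the surface normal is not bent
    print(deviation_deg(30.0))  # about 10.5 degrees of bending for an oblique ray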




Once a light ray that will form part of the panoramic image enters the transparent material 912, the light ray reflects off the main reflector 910, then reflects off the second reflector 915, and then exits the transparent material 912 at surface 920. Thus, the light remains in the transparent material 912 until it enters the lens system 930, 940. The surface 920 can be shaped such that all light that is part of the annular image exits at a normal to the surface 920 such that the transparent material 912 has no refractive effect on the light. Alternatively, the surface 920 can be shaped such that surface 920 is part of the lens system. The light is captured by an image capture mechanism 950.




The embodiment in FIG. 9 includes two light shields 975 and 977 to prevent undesired light 981 from entering the optical path. It should be noted that the panoramic camera system can also be constructed without the light shields 975 and 977.




Annular Image Processing




As previously described, the panoramic annular images can be geometrically transformed from the annular image into more conventional rectangular projections. One method of performing this operation is to use digital image processing techniques as described in the related U.S. patent application titled "Panoramic Camera" filed on May 8, 1997, with Ser. No. 08/853,048.




When photographic film is used to capture the annular images, the annular images will not always be recorded in the exact same position on the film. One reason for this is that the sprockets used to advance film through a camera are slightly smaller than the corresponding holes in the film. Thus, the film alignment between exposures tends to vary. This effect is known as "gate weave."




To process an annular image, the center coordinate of the digitized annular image must be known in order to rotate a selected viewport into a standard view. Since gate weave causes the center coordinate to vary, the center coordinate must be determined for each annular image that originated from photographic film. FIGS. 10a, 10b, 11a and 11b illustrate a method of determining the center coordinate of a digitized panoramic annular image that originated from photographic film.




Referring to the flow diagram of FIG. 11a, step 1110 selects an initial proposed center point along a first axis. Referring to FIG. 10a, an initial proposed center point PC1 is illustrated along a Y axis (the first axis). Next, at step 1120, the annular video to standard video conversion software finds a first pixel along an orthogonal second axis that passes through the first proposed center point and exceeds a threshold value. In FIG. 10a this is illustrated as FP1 on an X axis. As illustrated in FIG. 10a, the threshold value is selected to locate the first pixel along the edge of the annular image. Next, a last pixel that exceeds the threshold and is located along the second axis that passes through the first proposed center point (PC1) is selected. In FIG. 10a, that last pixel is LP1 along the X axis. Next, at step 1130, the converter selects the midpoint between the first pixel FP1 and the last pixel LP1 along the second axis as a second proposed center point. In FIG. 10a, the second proposed center point is illustrated as PC2. The second proposed center point is closer to the actual center than the first proposed center point.




This process is repeated again after switching axes. Specifically, in step 1140 a first pixel along the first axis that passes through the second proposed center point and exceeds a threshold value is selected. This is illustrated in FIG. 10b as point FP2 along a Y axis. Then a last pixel along the first axis that passes through the second proposed center point and exceeds the threshold value is selected. In FIG. 10b this is illustrated as LP2. Then (at step 1150) a midpoint is selected between the first pixel FP2 and the last pixel LP2 as the third proposed center point. This is illustrated in FIG. 10b as third proposed center point PC3. The third proposed center point is also referred to as the first proposed center point for purposes of repeating the method steps.




The method proceeds to step 1160 where it determines if the first/third proposed center point is equal to the second proposed center point. This test determines whether the same center point has been selected again. If this occurs, then the method proceeds down to step 1180 where the second proposed center point is selected as the center point of the annular image. If the first proposed center point is not the same as the second proposed center point, the method proceeds to step 1170 where the method determines if a minimum number of iterations have been performed. If this has not occurred, then the method proceeds back up to step 1120 where it can repeat additional iterations of the method to determine a more accurate center point.
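The center-finding loop described above can be summarized as the following sketch, which alternates between the two axes, taking the midpoint of the first and last above-threshold pixels along each scan, until the same center is selected twice or an iteration bound is reached. Variable and function names are hypothetical, and the fixed iteration bound stands in for the iteration test of step 1170.

    import numpy as np

    def find_annulus_center(img, threshold, max_iters=10):
        """Sketch of the iterative center-finding method described above.
        img is a 2-D grayscale array containing the bright annular image and
        threshold marks the edge of the annulus.
        Returns (cx, cy), the estimated center column and row."""
        h, w = img.shape
        cx, cy = w // 2, h // 2              # initial proposed center point

        def midpoint_along_row(row):
            cols = np.where(img[row, :] > threshold)[0]
            return (cols[0] + cols[-1]) // 2 if cols.size else None

        def midpoint_along_col(col):
            rows = np.where(img[:, col] > threshold)[0]
            return (rows[0] + rows[-1]) // 2 if rows.size else None

        for _ in range(max_iters):
            new_cx = midpoint_along_row(cy)      # first/last pixel along the X axis
            if new_cx is None:
                break
            new_cy = midpoint_along_col(new_cx)  # first/last pixel along the Y axis
            if new_cy is None:
                break
            if (new_cx, new_cy) == (cx, cy):     # same center selected again: done
                break
            cx, cy = new_cx, new_cy
        return cx, cy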




The foregoing disclosure has described several panoramic camera embodiments. It is contemplated that changes and modifications may be made by one of ordinary skill in the art to the materials and arrangements of elements of the present invention without departing from the scope of the invention.



Claims
  • 1. A panoptic camera apparatus, said panoptic camera apparatus comprising: a first panoptic camera, said first panoptic camera comprising a first image capture mechanism, a first main reflector, said first main reflector reflecting light from a first hemisphere view onto said first image capture mechanism via a first secondary reflector; and a second panoptic camera, said second panoptic camera comprising: a second image capture mechanism, a second main reflector, said second main reflector reflecting light from a second hemisphere view onto said second image capture mechanism via a second secondary reflector; such that light from an entire spherical view is recorded by capturing a first half of said spherical view using said first panoptic camera and capturing a second half of said spherical view using said second panoptic camera.
  • 2. The apparatus as claimed in claim 1 wherein both said first main reflector and said second main reflector comprise a cylindrically symmetrical shape of a parabola segment rotated about an axis, said parabola segment comprising a vertex, a first side of said parabola segment, and a second side of said parabola segment shorter than said first side and adjacent to said axis.
  • 3. The apparatus as claimed in claim 1 further comprising: a dual-sided reflector, said dual-sided reflector having a reflective surface on both a first side and a second side, said dual-sided reflector positioned between said first main reflector and said second main reflector such that light is reflected from said first main reflector onto said first side of said dual-sided reflector then onto said first image capture mechanism and light is reflected from said second main reflector onto said second side of said dual-sided reflector then onto said second image capture mechanism, said first side corresponding to said first secondary reflector and said second side corresponding to said second secondary reflector.
  • 4. A stereo panoramic camera apparatus, said stereo panoramic camera apparatus comprising: a first panoramic camera, said first panoramic camera comprising a first image capture mechanism, a first main reflector for reflecting a first image onto said first image capture mechanism via a first secondary reflector; and a second panoramic camera, said second panoramic camera comprising a second image capture mechanism, a second main reflector for reflecting a second image onto said second image capture mechanism via a second secondary reflector.
  • 5. The apparatus as claimed in claim 4 wherein said first panoramic camera and said second panoramic camera are separated by a known distance.
  • 6. The apparatus as claimed in claim 4 wherein said first panoramic camera is located in a blind spot of said second panoramic camera and said second panoramic camera is located in a blind spot of said first panoramic camera.
  • 7. A stereo panoramic camera apparatus, said stereo panoramic camera apparatus comprising: an image capture mechanism; an electronically controlled reflector; a first main reflector for reflecting a first panoramic image onto said electronically controlled reflector, said electronically controlled reflector reflects the first panoramic image onto said image capture mechanism when said electronically controlled reflector is on; and a second main reflector for reflecting a second panoramic image onto said image capture mechanism through said electronically controlled reflector when said electronically controlled reflector is off.
  • 8. The apparatus as claimed in claim 7 wherein said first main reflector and said second main reflector are separated by a known distance.
Parent Case Info

This application is a continuation of U.S. Ser. No. 08/853,048 filed May 8, 1997.

Continuations (1)
Number Date Country
Parent 08/853048 May 1997 US
Child 09/480311 US