System and method for displaying images in 3-D stereo

Information

  • Patent Grant
  • Patent Number
    10,110,876
  • Date Filed
    Tuesday, May 2, 2017
  • Date Issued
    Tuesday, October 23, 2018
  • Inventors
    • Carlson; Kenneth L. (Salt Lake City, UT, US)
  • Examiners
    • Pontius; James
  • Agents
    • Durham Jones & Pinegar, P.C., Intellectual Property Law Group
Abstract
A system and method of capturing a stereoscopic pair of images for use in forming a 3-D image of an object at a desired perceived position in a scene projected onto a dome surface. The first one of the stereoscopic pair of images is captured when the object is offset to the right of the desired perceived position in the scene. The second one of the stereoscopic pair of images is captured when the object is offset to the left of the desired perceived position in the scene. In this manner, positive parallax can be captured in front of a viewer, upward in an arc through the zenith of the dome, and beyond to the back of the dome. The system and method allows scenes projected onto a dome surface to contain positive parallax, and therefore allows objects to appear to be located beyond the dome surface when viewed in 3-D stereo, which was previously not possible.
Description
TECHNICAL FIELD

The present disclosure relates generally to generating images for display, and more particularly, but not necessarily entirely, to a method and system for generating 3-D images for display.


RELATED ART

Stereopsis is the visual ability to perceive the world in three dimensions (3-D). Stereopsis in humans is primarily achieved by the horizontal offset, known as interocular offset, between the two eyes. Interocular offset leads to two slightly different projections of the world onto the retinas of the two eyes. The human mind perceives the viewed object in 3-D from the two slightly different projections projected onto the two retinas.


One of the main ways in which human eyes perceive distance is called parallax. Parallax is an apparent displacement or difference in the apparent position of an object viewed along two different lines of sight. Nearby objects have a larger parallax than more distant objects when observed from different positions, so parallax can be used to determine distances. In humans, the two eyes have overlapping visual fields that use parallax to gain depth perception; that is, each eye views the object along a different line of sight. The brain exploits the parallax due to the different view from each eye to gain depth perception and estimate distances to objects.


This same method of parallax is used to give the illusion of distance in 3-D stereo images, including still images, videos, and movies, whether captured by camera or computer generated. 3-D stereo images simulate real-world perception by displaying a slightly different image for each eye—a slightly different perspective of the same scene—where the viewing position is offset slightly in the horizontal direction (interocular distance). The two images that are displayed independently to the right and left eyes are sometimes referred to as a “stereo pair.”


There are many methods for displaying a different image to each eye to generate the perception of a 3-D image. For still images, display methods may include a lenticular display surface, or a special viewing device. For movies and videos, the display method may involve the viewer wearing glasses which permit a different color space or polarization to reach each eye, or which shutter alternating frames between right-eye views and left-eye views.


The perceived depth of an object may be determined by the angle at which the viewer's eyes converge. This is also the case when viewing a 3-D image that is displayed on a surface. Where both eyes view the same object in the same location, the object will appear to be positioned at the same distance as the display surface. This is because the eyes are converged at that distance just as they would be if an actual object were placed at that distance. When there is no separation between the images for the left eye and the right eye, this is referred to as zero parallax.


If the position of an object in the left eye's view is located to the right, and the position of the object in the right eye's view is located to the left, this is called negative parallax, and the eyes have to rotate inward (cross-eyed) to converge the images into a single image. In this case, the object is perceived to be located in front of the display surface.


If the position of an object in the left eye's view is located to the left, and the position of the object in the right eye's view is located to the right, this is called positive parallax, and the eyes have to rotate outward (more wall-eyed) to converge the images into a single image. In this case, the object is perceived to be located beyond the display surface. In short, when viewed in stereo pairs, an object must have negative parallax to appear closer than the display surface, and an object must have positive parallax to appear further away than the display surface. An object with zero parallax will appear to be at the distance of the display surface. Referring now to FIGS. 1A, 1B and 2, there are shown examples of how parallax allows a human to perceive distance.


In FIG. 1A, a distant object 10 is perceived by a human as a single image from the two images viewed by the left and right eyes. The index finger 12 is seen as a double image while viewing the distant object 10. In particular, the left eye sees the index finger 12 offset to the right by a distance 16 and the right eye sees the index finger 12 offset to the left by a distance 14. In this case, there is a relatively small negative parallax.


In FIG. 1B, an index finger 20 is perceived by a human as a single image from the two images viewed by the left and right eyes. A distant object 22 is seen as a double image while viewing the index finger 20. The left eye sees the distant object 22 offset by a distance 24 to the left of the index finger 20 while the right eye sees the distant object 22 offset by a distance 26 to the right of the index finger 20.


Referring now to FIG. 2, there is shown an example of parallax and perceived distance of an image shown on a display surface 30. For purposes of this example, a triangle 32, a circle 34, and a square 36 are shown in the perceived locations, i.e., the locations where they are perceived to be by the human mind. In regard to the triangle 32, both the left eye and the right eye see the single image of the triangle 32 at the same location. In the case of the triangle 32, there is zero parallax as the eyes converge to see the single image at the distance of the display surface 30, so the location of the triangle 32 is perceived to be at the distance of the display surface 30.


In regard to the circle 34, the perceived location of the circle 34 is created by images 34A and 34B on the display surface 30. In particular, the left eye views the image 34A and the right eye views the image 34B such that the location of the circle 34 is perceived in front of the display surface 30. In this case, the eyes rotate inward (cross-eyed) to converge the images 34A and 34B into a single image, which is defined as negative parallax.


In regard to the square 36, the left-eye views the image 36A on the display surface 30 and the right eye views the image 36B on the display surface 30 such that the location of the square 36 is perceived beyond the display surface 30. In this case, the eyes rotate outward to converge the images 36A and 36B into a single image so that the square 36 appears to be further away than the display surface 30, which is defined as positive parallax.
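To make the geometry of FIG. 2 concrete, the perceived distance of a point can be computed from the signed separation (parallax) between its right-eye and left-eye images using similar triangles. The short sketch below is illustrative only and is not part of the patent; it assumes a flat display viewed head-on, a 65 mm interocular distance, and a 2 m viewing distance.

    import math

    def perceived_depth(parallax_mm, interocular_mm=65.0, screen_mm=2000.0):
        """Perceived distance from the viewer (mm) of a point displayed with a signed
        horizontal separation between its right-eye and left-eye images on a flat
        screen viewed head-on (positive = right image to the right of the left image)."""
        if parallax_mm >= interocular_mm:
            return math.inf  # the eyes would have to diverge; no finite convergence point
        # Similar triangles between the eye baseline and the on-screen separation.
        return interocular_mm * screen_mm / (interocular_mm - parallax_mm)

    print(perceived_depth(0.0))    # 2000.0 -> at the display surface (zero parallax)
    print(perceived_depth(-20.0))  # ~1529  -> in front of the surface (negative parallax)
    print(perceived_depth(20.0))   # ~2889  -> beyond the surface (positive parallax)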


3-D images of real-world objects may be captured by use of a stereoscopic camera having two lenses, one for capturing the right-eye image and one for capturing the left-eye image. If a scene is to be viewed on a flat display surface in front of the viewer (such as on a television or movie screen), positive parallax can be captured by aiming the two cameras slightly toward each other (with a slight toe-in). The two cameras would both aim at a point along a central viewing axis. Optimally, this point would be the same distance from the cameras as the display surface would be from the audience. This way, the scene would appear correctly when viewed in 3-D stereo, with close objects having a negative parallax, objects at the distance of the display surface having zero parallax, and distant objects having positive parallax. These concepts are depicted in FIG. 3 as will now be explained.
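As a numerical illustration of this toe-in arrangement (not taken from the patent), the inward rotation of each camera follows directly from the half interocular offset and the chosen convergence distance, which is here assumed to equal the distance from the audience to the display surface:

    import math

    def toe_in_degrees(half_interocular_mm, convergence_distance_mm):
        """Inward rotation of each camera so that both aim at a point on the
        central viewing axis at the chosen convergence distance."""
        return math.degrees(math.atan2(half_interocular_mm, convergence_distance_mm))

    # Cameras offset +/-32.5 mm from the centerline, screen (and convergence point) 2 m away.
    print(toe_in_degrees(32.5, 2000.0))  # roughly 0.93 degrees of toe-in per camera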



FIG. 3 depicts a top view of a 3-D scene when captured for display on a flat display surface 50. The 3-D scene may be filmed using a left-eye camera 58 and a right-eye camera 60. The left-eye camera 58 and the right-eye camera 60 may each be offset from a centerline, or y axis, by an amount c, so that the separation 2c represents the interocular distance needed to create a 3-D image. There is shown a desired perceived position of a triangle 52, a circle 54, and a square 56 from the perspective of the viewer and in relation to the display surface 50.


In order for the perceived position of the triangle 52 to be at the same distance as the display surface 50, the triangle 52 is positioned along the y axis at the same distance as the display surface 50, and the aim of the left-eye camera 58 and the aim of the right-eye camera 60 converge at a distance equal to the distance of the display surface 50 from the viewer.


Circle 54 will appear to be located in front of display surface 50 because it is offset to the right in the left-eye view and offset to the left in the right-eye view (defined as negative parallax). Square 56 will appear to be located beyond the display surface 50 because it is offset to the left in the left-eye view and offset to the right in the right-eye view (defined as positive parallax).


Converging the aim of the cameras as described above will create positive parallax only in the direction of the camera convergence. In FIG. 3, for example, the cameras converge along the y axis. This will produce positive parallax in the direction of the y axis and enable display of 3-D objects that appear to be located beyond a flat display surface. This method works for the flat display surface 50 because the position of the display surface 50 is offset from the cameras in the direction of the y axis. But the aforementioned method is not suitable for images that will be viewed on a dome surface, as explained below.


In a dome environment, images are projected onto the inside of a hemispherical dome display surface. The images are captured with a dome camera that uses a circular fisheye lens to yield 180-degree views, for example an astronomy image of the entire night sky. The majority of the image viewed is high in the dome above, behind, and to the sides of the viewer, rather than just in front of the viewer as with a flat display surface. To capture 3-D objects that will appear to be located beyond a dome display surface, the right-eye and left-eye dome cameras capturing the scene must be aimed in a direction parallel to the central viewing axis, or undesirable effects will be produced. If the two dome cameras were aimed with a toe-in as described above for a flat display surface, positive parallax would only be produced in an area of the dome in the direction of the y axis directly in front of the viewer. As the viewer looks upward in the dome at angles above the y axis, the positive parallax effect diminishes and then reverses in areas of the scene overhead and behind the viewer. For example, consider FIG. 3 in three dimensions. If circle 54 were raised a great distance above the plane of the drawing (in the z axis direction, above the viewer), positive parallax could never be achieved by the depicted camera convergence. Even at a great distance, circle 54 would always appear to be located in front of the dome display surface because it would be offset to the right in the left-eye view and offset to the left in the right-eye view (negative parallax). Positive parallax can only be created in the direction of camera convergence, in this case the y axis, and regardless of actual distance, the effect will actually be reversed for any object whose y component of distance places it in front of the point of camera convergence. In addition, if the dome cameras are angled to create the convergence on the y axis in front of the viewer, objects overhead would be captured at different angles by the right-eye camera and the left-eye camera, so these overhead objects in the resulting image would appear to crisscross. The minds of the viewers would not be able to make sense of these anomalies, and the illusion of 3-D would be destroyed.


Therefore, cameras capturing 3-D stereo to be rendered on a dome surface must be parallel to each other (parallel to the central viewing axis). As a result, positive parallax cannot be captured from the original scene. So objects in the stereo images can only appear to be located at the dome surface or closer to the viewer, and none will appear to be located beyond the dome surface.


The prior art is thus characterized by several disadvantages that are addressed by the present disclosure. The present disclosure minimizes, and in some aspects eliminates, the above-mentioned failures, and other problems, by utilizing the methods and structural features described herein. The method described by the present disclosure allows positive parallax to be captured in front of the viewer, upward in an arc through the zenith of the dome, and beyond to the back of the dome. The method allows scenes to contain positive parallax on a dome surface (and therefore allows objects to appear to be located beyond the dome surface when viewed in 3-D stereo), which was previously not possible.


The features and advantages of the present disclosure will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by the practice of the present disclosure without undue experimentation. The features and advantages of the present disclosure may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims.





BRIEF DESCRIPTION OF THE DRAWINGS

In the drawings:



FIGS. 1A & 1B are illustrations of convergence and parallax;



FIG. 2 depicts the concepts of parallax and perceived distance;



FIG. 3 is a top view of a 3-D stereo scene when captured for display on a flat display surface;



FIG. 4 is a top view of a 3-D stereo scene when captured for display on a dome surface;



FIG. 5 depicts a calculation of offset to create positive parallax to simulate real-world distances;



FIG. 6 is a flow chart showing a process of creating positive parallax on distant objects;



FIG. 7 depicts a system for creating a 3-D image pursuant to an embodiment of the present disclosure;



FIG. 8 depicts a system for creating a 3-D image pursuant to an embodiment of the present disclosure; and



FIG. 9 depicts a projection system for displaying 3-D images pursuant to an embodiment of the present disclosure.





DETAILED DESCRIPTION

For the purposes of promoting an understanding of the important principles in accordance with this disclosure, reference will now be made to the embodiments illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the disclosure is thereby intended. Any alterations and further modifications of the inventive features illustrated herein, and any additional applications of the principles of the disclosure as illustrated herein, which would normally occur to one skilled in the relevant art and having possession of this disclosure, are to be considered within the scope of the disclosure claimed.


It must be noted that, as used in this specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. As used herein, the terms “comprising,” “including,” “containing,” “characterized by,” and grammatical equivalents thereof are inclusive or open-ended terms that do not exclude additional, unrecited elements or method steps.


As used herein, the term “object” refers to a scene element, which may include, but is not limited to, a dot, a line, a sprite, a complete computer-graphic model, a part of a computer-graphic model, a virtual surface, a vertex on a virtual polygonal surface, or a texture mapping coordinate on a computer-graphic model. Any of these scene elements can be offset in the manner described herein to achieve positive parallax.


Applicant has discovered a method and system for generating a stereoscopic pair of images for use in creating 3-D images on a surface, such as a dome surface. Referring now to FIG. 7, there is depicted a system 250 for generating a computer-generated 3-D image for display on a surface using a projector. In an embodiment of the present disclosure, the surface may be a dome surface. The system 250 may comprise a computer 252. The computer 252 may comprise a display 254 as is known to one having ordinary skill in the art. The computer 252 may further comprise a processor 256 for executing programming instructions. The processor 256 may be coupled to a memory 258. The memory 258 may store programs for execution by the processor 256. The computer 252 may further comprise an input device 262 for allowing a user to provide input for creating the necessary 3-D image. The input device 262 may comprise a computer mouse and a keyboard. The computer 252 may also be connected to a data storage device 264, such as a non-volatile memory device, e.g., a hard drive, for storing image data.


In an embodiment of the present disclosure, the memory 258 may have stored therein a 3-D modeling program 260. The 3-D modeling program 260 may provide suitable software tools for 3-D modeling, visual effects, and 3-D rendering. A commercially available program may be suitable as the 3-D modeling program 260. Such commercially available 3-D modeling programs include the AUTODESK® MAYA® 3D computer animation software and the AUTODESK® 3DS MAX® computer animation software. Using the 3-D modeling program 260, a user may create a computer-generated model using the system 250. The computer-generated model created by the user may comprise one or more virtual objects or scene elements that the user desires to be perceived by viewers in 3-D on a dome surface.


As explained above, in order to generate a 3-D image, a stereoscopic pair of images is created from a virtual scene. The user may create the virtual scene using the 3-D modeling program 260 running on the processor 256 as is known to one having ordinary skill in the art. In order to create a stereoscopic pair of images, the 3-D modeling program 260 may allow the user to position one virtual camera or a pair of virtual cameras within the computer-generated virtual scene. In an embodiment, one of the virtual cameras may be designated as the left-eye camera while the other may be designated as the right-eye camera. Alternatively, a single virtual camera may be moved between the left-eye and right-eye camera positions.


The heading or central optical axis of the left-eye virtual camera and the right-eye virtual camera are parallel or substantially parallel. Stated another way, the heading or central optical axis of the left-eye virtual camera and the right-eye virtual camera are in parallel parallax. In an embodiment of the present disclosure, when creating stereo pairs for a video scene or still image, the frames from the left-eye virtual camera are typically rendered separately from the frames from the right-eye virtual camera.


In an embodiment of the present disclosure, before the processor 256 renders the frames from the left-eye virtual camera, any distant objects that the animator wishes to be seen with positive parallax are positioned or offset to the left of their desired position in the virtual scene. Before the processor 256 renders frames from the right-eye virtual camera, those objects are positioned or offset to the right of their desired position by the same distance in the virtual scene. It will be appreciated that these objects can be simple objects within the scene, or hemispherical, flat or curved surfaces textured with background images.


The processor 256 may store images captured by the left-eye virtual camera as a left-eye image file 266 on the data storage device 264, and the processor 256 may store images captured by the right-eye virtual camera as a right-eye image file 268 on the data storage device 264. The left-eye image file 266 and the right-eye image file 268 may contain the appropriate data or formatting to render the images to a viewer's left eye or right eye depending on the desired 3-D methodology, e.g., polarization filtering (passive or active), shutter filtering (mechanical shutters), color filtering (anaglyph), autostereoscope, etc.


In an embodiment of the present disclosure, objects may be offset from their original positions manually by the user. Alternatively, the user may simply select the object to be offset, and the processor 256 may automatically offset the object from its original position prior to rendering based upon the desired location of where the object will appear in 3-D.


In an embodiment of the present disclosure, the images captured by the left-eye and right-eye virtual cameras, stored as the left-eye image file 266 and the right-eye image file 268 in the data storage device 264, respectively, can then be combined by the processor 256 into a master image file 270, which is then stored in the data storage device 264. The master image file 270 may be formatted depending on the 3-D display technology being used. In an embodiment of the present disclosure, the left-eye image file 266 and the right-eye image file 268 are maintained as separate image files.
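The patent leaves the format of the master image file 270 to the chosen 3-D display technology. As one illustration only, the color-filtering (anaglyph) option listed above could be produced as sketched below; the NumPy-based helper and its channel layout are assumptions, not the disclosed implementation:

    import numpy as np

    def red_cyan_anaglyph(left_rgb: np.ndarray, right_rgb: np.ndarray) -> np.ndarray:
        """Combine a stereo pair into one red-cyan anaglyph frame: the red channel is
        taken from the left-eye image and the green/blue channels from the right-eye
        image (arrays are H x W x 3 with channel order R, G, B)."""
        out = right_rgb.copy()
        out[..., 0] = left_rgb[..., 0]
        return out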


When played, the master image file 270, or the left-eye image file 266 and the right-eye image file 268, may generate images of a movie, art piece, video game, film, simulator, television program, still image, or animation suitable for display on a dome surface. Again, the processor 256 executing the instructions of the 3-D modeling program 260 may assist the user in creating the master image file 270, the left-eye image file 266, and the right-eye image file 268.


Referring now to FIG. 4, there is depicted a top view of a 3-D stereo scene when captured for display on a dome surface. In embodiments of the present disclosure, the 3-D stereo scene may be a virtual scene, captured with virtual cameras, or a real-world scene, captured with real-world cameras. The virtual scene may be generated and filmed using a computer, such as the computer 252. For purposes of convenience, a coordinate system with the x axis pointing to the right, the y axis pointing straight ahead, and the z axis pointing up is designated.


The 3-D scene may be filmed using a left-eye camera 100 and a right-eye camera 102. The left-eye camera 100 and the right-eye camera 102 may be offset from a centerline, or y axis, by an amount c, where 2c represents an interocular distance needed to create a 3-D image. There is shown a desired perceived position of objects in the scene, namely, a circle 104 and a square 106, from the perspective of a viewer and in relation to a dome display surface 108 (the dome display surface 108 is not actually present during the filming of the scene, but its location and distance from the viewer are needed in order to generate 3-D images in the proper perceived position). The heading or aim 100A of the left-eye camera 100 and the heading or aim 102A of the right-eye camera 102 are parallel or substantially parallel with each other and the y axis in the scene.


The square 106, which represents an exemplary object in the scene, is located on the y axis in its desired viewing position or original position, which is also the position where the object will be perceived by viewers, but in 3-D. Thus, in FIG. 4, the square 106 is desired to appear beyond the dome display surface 108. For images captured by the left-eye camera 100, the square 106 is offset to the left of the y axis by a distance o to the offset position 106B. For images captured by the right-eye camera 102, the square 106 is offset to the right of the y axis by a distance o to the offset position 106A.


It will be appreciated that the images captured by the left-eye camera 100 and the right-eye camera 102 may be rendered separately from each other. The separately captured renderings may then be prepared for presentation in 3-D. In an embodiment of the present disclosure, the images captured by the left-eye camera 100 and the right-eye camera 102 may be stored as two image files on an electronic data storage medium. The image files may be processed for 3-D display.


Referring now to FIG. 5, there is shown a diagram of the variables used to calculate the necessary offset, o, to create positive parallax that simulates real-world distances pursuant to an embodiment of the present disclosure, where like reference numerals depict like components. An assumption may be made that the viewer is seated in the center of the dome. TABLE 1, below, lists the variables needed to calculate the offset, o.











TABLE 1

Variable  Status    Description
d         Known     Distance along the y axis from observer to distant object
s         Known     Distance from observer to dome display surface
i         Known     Interocular distance of cameras and observer
r         Known     Offset of observer's right eye and of right-eye camera (= i/2)
Ø         Unknown   Angle from distant object to right eye of observer
ν         Unknown   Offset distance in x direction where right eye of observer would see distant object at distance of the dome surface
α         Unknown   Angle of positive parallax
o         Unknown   Distance of object's offset in x direction for right-eye virtual camera

The unknown variables in TABLE 1 may be determined using the following equations:

Ø = tan⁻¹(r/d)
ν = (d − s) tan Ø
α = tan⁻¹(ν/s)
o = (d tan α) + r


The above can also be applied to the left-eye camera 100 and the object's left offset from the y axis in the x direction.
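As a worked example (not part of the patent text), the four relations above can be evaluated directly. The sketch below assumes consistent length units for d, s, and i, and returns the x offset o applied to the object for the right-eye view; the same magnitude is applied to the left for the left-eye view:

    import math

    def right_eye_offset(d, s, i):
        """Offset o for the right-eye view, following the relations of FIG. 5.
        d: distance along the y axis from observer to the distant object
        s: distance from observer to the dome display surface
        i: interocular distance of the cameras/observer"""
        r = i / 2.0                     # offset of the right eye (and right camera) from the y axis
        phi = math.atan2(r, d)          # angle from the distant object to the right eye
        v = (d - s) * math.tan(phi)     # x offset where the right eye would see the object at the dome distance
        alpha = math.atan2(v, s)        # angle of positive parallax
        return d * math.tan(alpha) + r  # object's x offset for the right-eye view

    # Example: object 50 m away, dome surface 10 m away, 65 mm interocular distance.
    print(round(right_eye_offset(d=50.0, s=10.0, i=0.065), 4))  # about 0.1625 m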


Referring now to FIG. 6, there is depicted a flow diagram of a process for creating positive parallax on distant objects within a virtual environment for display on a dome surface. The process in FIG. 6 may utilize a computer system, such as the computer system 250 shown in FIG. 7. At step 200, a user may create a virtual environment using a 3-D modeling program running on a computer. At step 202, the desired interocular distance between a left-eye virtual camera and a right-eye virtual camera is determined. The interocular distance, or distance along the x axis between the left-eye and right-eye virtual cameras (or actual cameras for filming real-world scenes), is somewhat arbitrary. The distance may depend on the distance from the cameras to the closest object in the scene and on the strength of stereo effect desired. To closely simulate a human view in an everyday environment, the left and right cameras should be spaced at the same distance that human eyes are spaced, roughly 65 mm apart. If rendering a small-scale scene, the distance would be much smaller, and if rendering a large-scale scene, the distance would be much greater. Also, the greater the spacing, the greater the 3-D effect. The heading or aim of the left-eye camera 100 and the heading or aim of the right-eye camera 102 are parallel or substantially parallel with each other and the y axis in the scene.


At step 204, the offsets for distant objects to produce the desired positive parallax are determined. The amount of positive parallax one would want to introduce to distant objects is somewhat arbitrary as well. The more positive parallax the eye sees, the further away the object will appear. So the most distant object in a scene should be given the greatest positive parallax. Objects just beyond the distance of the viewing surface (the dome surface) should be given the smallest positive parallax. The amount of positive parallax given to objects in between these objects should fall in between these amounts, proportional to their distance. And the actual distance of the offset in the x direction to produce a certain amount of positive parallax is dependent on the distance that object lies away from the eye point (in the y direction), so the easiest way to consider positive parallax is by angle. Positive parallax must be below about 5 degrees for the human mind to process it. And it is most comfortable for the viewer at 3 degrees or less.


If the simulation of real-world distances is desired, then simple trigonometry can yield the amount of offset that should be given to distant objects. FIG. 5 illustrates such a calculation. It is likely, however, that the artist may want to exaggerate the positive parallax of distant objects to enhance the 3-D effect of a scene. In this case, the calculated offset values would be multiplied by a factor which would increase distant offsets, while keeping the largest positive parallax (of the most distant object in the scene) below 3 degrees.
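A brief sketch of this exaggeration step, again illustrative rather than from the patent: scale the real-world offset (computed as in FIG. 5) by an artistic factor, then clamp it so the resulting positive parallax stays below roughly 3 degrees:

    import math

    def parallax_angle_deg(o, d, i):
        """Positive-parallax angle (degrees) produced by an x offset o on an object at
        distance d, as seen from an eye offset i/2 from the centerline."""
        return math.degrees(math.atan2(o - i / 2.0, d))

    def exaggerated_offset(o_real, d, i, factor, max_parallax_deg=3.0):
        """Scale a real-world offset by an artistic factor, then clamp it so the
        resulting positive parallax stays below the stated comfort limit."""
        o = factor * o_real
        if parallax_angle_deg(o, d, i) > max_parallax_deg:
            o = d * math.tan(math.radians(max_parallax_deg)) + i / 2.0
        return o

    # Triple the computed offset of an object 50 m away, 65 mm interocular distance.
    print(exaggerated_offset(o_real=0.1625, d=50.0, i=0.065, factor=3.0))  # 0.4875, well under 3 degrees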


At step 206, the virtual camera is offset to the left of its original position by half of the interocular distance. At this point, the virtual camera is functioning as the left-eye virtual camera. At step 208, each distant object is moved to the left of its original position by the offset distance determined at step 204. At step 210, the virtual scene is rendered with the virtual camera in the left-eye position. At steps 212 and 214, the left-eye images are prepared for stereo display to the left eyes of viewers.


At step 216, the virtual dome camera is offset to the right of its original position by half of the interocular distance. At this point, the virtual camera is functioning as the right-eye virtual camera. At step 218, each distant object is moved to the right of its original position by the offset distance determined at step 204. At step 220, the virtual scene is rendered with the virtual camera in the right-eye position. At steps 222 and 224, the right-eye images are prepared for stereo display to the right eyes of viewers. Once the left-eye and right-eye images are prepared and formatted according to the 3-D display technology, they may be projected by a projector onto a dome surface for viewing by an audience.
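The overall loop of steps 206 through 224 can be sketched as follows. The camera, object, and render placeholders stand in for whatever modeling package is used; they are assumptions for illustration, not the patent's or any vendor's actual API:

    def render_dome_stereo(camera, objects, offsets, interocular, render):
        """Steps 206-224 in outline: for each eye, shift the camera by half the
        interocular distance and each distant object by its own offset, render the
        view, then restore every position before the other eye is rendered."""
        frames = {}
        for eye, sign in (("left", -1.0), ("right", +1.0)):
            camera.x += sign * interocular / 2.0        # steps 206 / 216
            for obj, o in zip(objects, offsets):
                obj.x += sign * o                       # steps 208 / 218 (o is 0 for near objects)
            frames[eye] = render(camera, objects, eye)  # steps 210 / 220
            for obj, o in zip(objects, offsets):
                obj.x -= sign * o                       # restore desired perceived positions
            camera.x -= sign * interocular / 2.0
        return frames  # steps 212/214 and 222/224: format per the chosen 3-D display technology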


Referring now to FIG. 8, there is depicted a system 300 for generating a computer-generated 3-D image for display on a surface using a projector. In an embodiment of the present disclosure, the surface may be a dome surface. The system 300 may comprise a computer 302. The computer 302 may comprise a display 304 as is known to one having ordinary skill in the art. The computer 302 may further comprise a processor 306 for executing programming instructions. The processor 306 may be coupled to a memory 308. The memory 308 may store programs, or instructions, for execution by the processor 306. The computer 302 may further comprise an input device 312 for allowing a user to provide input for creating the necessary 3-D image. The input device 312 may comprise a computer mouse and a keyboard. The computer 302 may also be connected to a data storage device 314, such as a non-volatile storage device, for storing image data.


In an embodiment of the present disclosure, the memory 308 may have stored therein a post-production editing program 310. Commercially available post-production editing programs may be suitable such as ADOBE® AFTER EFFECTS®. The program 310, when executed by the processor 306, may allow scene elements to be offset in post-production. For example, scene elements that have been rendered separately in a modeling program as described above, can be composited together into a scene in a compositing program, sliding distant objects in the left-eye view to the left, and sliding distant objects in the right-eye view to the right by the same distance, to generate the desired positive parallax.
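A minimal sketch of that compositing step, assuming each rendered scene element is available as an RGBA layer and that the offset has already been converted to whole pixels (this NumPy helper is illustrative and is not the disclosed implementation or ADOBE® AFTER EFFECTS® functionality):

    import numpy as np

    def slide_layer(layer_rgba: np.ndarray, pixels: int) -> np.ndarray:
        """Shift a rendered scene element horizontally by a whole number of pixels
        (positive = right, negative = left), padding the exposed edge with
        transparent pixels, before compositing it over the rest of the scene."""
        out = np.zeros_like(layer_rgba)
        if pixels > 0:
            out[:, pixels:] = layer_rgba[:, :-pixels]
        elif pixels < 0:
            out[:, :pixels] = layer_rgba[:, -pixels:]
        else:
            out = layer_rgba.copy()
        return out

    # Distant layer slid 12 px left for the left-eye composite, 12 px right for the right-eye composite.
    layer = np.zeros((1080, 1920, 4), dtype=np.uint8)
    left_view_layer, right_view_layer = slide_layer(layer, -12), slide_layer(layer, +12)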


In an embodiment of the present disclosure, the invention described herein could also be used when capturing stereo pair frames of real-world scenes by video or still camera. Frames for the right-eye camera could be captured at a separate time than frames for the left-eye camera. Distant objects could be moved, as described above, before capturing each camera view. Alternatively, scene elements could be filmed separately, then given their left and right offset when combined in a compositing program (as described above) to achieve positive parallax.


It will be appreciated that the processes and systems described herein may create stereo-pair “dome masters” (or hemispherical images) to be displayed and viewed on a dome surface. These stereo pairs may be for a single still image, or a series of frames that, when displayed in sequence, constitute a video segment. When displaying real-time computer-generated graphics, the horizontal offset of distant objects to create positive parallax could be applied in software by a processor of a computer at the time that the scene elements are displayed in the separate left-eye and right-eye views.


Whether being displayed in real-time, or as a rendered still or video segment, the left-eye view of the stereo pair is displayed to the left eye of the viewer, and the right-eye view of the stereo pair is displayed to the right eye of the viewer in order for the 3-D stereo effect to occur. As mentioned above, the processes by which these images are then processed or independently displayed to the eyes will not be described in detail herein as they are readily known to one having ordinary skill in the art.


Referring now to FIG. 9, there is depicted a system 400 for displaying an image. The system 400 may comprise a dome surface 402. A projector 404 having a projection lens 406 may be positioned to project images onto the dome surface 402. A data storage device 408 may be connected to the projector 404. The data storage device 408 may provide image data to the projector 404. The projector 404 may project 3-D images onto the dome surface 402. The 3-D images displayed on the dome surface 402 may be generated by the methods described above, including the method described in connection with FIG. 6. A viewer may use a pair of 3-D glasses 410 to view the 3-D images on the dome surface 402. The system 400 projects scenes containing positive parallax onto the dome surface 402 and therefore allows objects to appear to be located beyond the dome surface 402 when viewed in 3-D stereo, which was previously not possible. For example, based upon the image data, the projector 404 may project a left-eye image 412 and a right-eye image 414 onto the dome surface 402. A viewer, wearing the 3-D glasses 410, may perceive an object formed from the left-eye image 412 and the right-eye image 414 as being positioned behind the dome surface 402 as shown by object 416. In this case, the eyes of the viewer wearing the 3-D glasses 410 are in positive parallax. It will be appreciated that the system 400 may be adapted to display 3-D images using any desired technology, including polarization filtering (passive or active), shutter filtering (mechanical shutters), color filtering (anaglyph), autostereoscope, etc.


It will be appreciated that the structure and apparatus disclosed herein is merely one example of a means for generating and displaying a distant 3-D stereo image on a dome surface, and it should be appreciated that any structure, apparatus or system for generating and displaying a distant 3-D stereo image on a dome surface which performs functions the same as, or equivalent to, those disclosed herein are intended to fall within the scope of a means for generating and displaying a distant 3-D stereo image on a dome surface, including those structures, apparatus or systems for generating and displaying a distant 3-D stereo image on a dome surface which are presently known, or which may become available in the future. Anything which functions the same as, or equivalently to, a means for generating and displaying a distant 3-D stereo image on a dome surface falls within the scope of this element.


The cameras disclosed herein may be real-world cameras especially adapted for filming in 3-D or virtual cameras for capturing scenes in virtual worlds as is known to one having ordinary skill in the art.


Those having ordinary skill in the relevant art will appreciate the advantages provided by the features of the present disclosure. For example, it is a feature of the present disclosure to provide a system for generating a 3-D image for display on a dome surface. Another feature of the present disclosure is to provide such a projection system for displaying distant 3-D stereo on a dome surface. It is a further feature of the present disclosure, in accordance with one aspect thereof, to provide a process of capturing positive parallax for a dome scene by leaving the cameras parallel, and simply moving distant objects to the right in the right-camera view, and to the left in the left-camera view.


In the foregoing Detailed Description, various features of the present disclosure are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed disclosure requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the following claims are hereby incorporated into this Detailed Description by this reference, with each claim standing on its own as a separate embodiment of the present disclosure.


It is to be understood that the above-described arrangements are only illustrative of the application of the principles of the present disclosure. Numerous modifications and alternative arrangements may be devised by those skilled in the art without departing from the spirit and scope of the present disclosure and the appended claims are intended to cover such modifications and arrangements. Thus, while the present disclosure has been shown in the drawings and described above with particularity and detail, it will be apparent to those of ordinary skill in the art that numerous modifications, including, but not limited to, variations in size, materials, shape, form, function and manner of operation, assembly and use may be made without departing from the principles and concepts set forth herein.

Claims
  • 1. A method of generating a stereoscopic pair of images for use in forming a 3-D image of an object at a desired perceived position in a scene, said method comprising: moving the object to a left offset position in the scene, the left offset position being located to a left side of the desired perceived position in the scene; capturing a left image of the object in the left offset position using a camera in a left camera position; moving the object to a right offset position in the scene, the right offset position being located to a right side of the desired perceived position in the scene; capturing a right image of the object in the right offset position using a camera in a right camera position; and preparing the left image and the right image for stereoscopic display of the scene.
  • 2. The method of claim 1, further comprising: determining a distance between the desired perceived position in the scene and each of the left offset position and the right offset position.
  • 3. The method of claim 1, further comprising: determining a distance between the left camera position and the right camera position.
  • 4. The method of claim 1, wherein the object is a scene element.
  • 5. The method of claim 1, further comprising: orienting the camera in the left camera position to a left camera heading and orienting the camera in the right camera position to a right camera heading, wherein the left camera heading and the right camera heading are parallel to one another.
  • 6. The method of claim 1, wherein preparing the left image and the right image for stereoscopic display comprises generating a master image file from the left image and the right image.
  • 7. The method of claim 1, wherein the object is a real-world object.
  • 8. The method of claim 1, wherein the object is a virtual object, the scene is a virtual scene, each camera is a virtual camera capable of generating an image, moving the object to the left offset position and moving the object to the right offset position comprise virtually moving the object, and capturing the left image and capturing the right image comprise digitally preparing the left image and digitally preparing the right image.
  • 9. The method of claim 1, wherein preparing the left image and the right image comprises preparing the left image and the right image for stereoscopic display of the scene on a concave surface that at least partially surrounds an audience and on which the object appears to be in positive parallax due to the left offset position of the object in the left image and the right offset position of the object in the right image.
  • 10. The method of claim 9, further comprising: displaying the left image and the right image on the concave surface such that the object is perceived by a viewer in the audience as a 3-D image of the object positioned at the desired perceived position.
  • 11. The method of claim 10, wherein the concave surface is a dome surface.
  • 12. The method of claim 11, wherein the desired perceived position is beyond the dome surface.
  • 13. A method of generating a stereoscopic pair of images for use in forming a 3-D image, said method comprising: creating a virtual environment using a computer; determining a desired interocular distance between a left camera position and a right camera position in the virtual environment; determining an offset distance for an object in the virtual environment from a desired perceived position to produce a desired positive parallax; and generating a left image in which the object appears in the virtual environment at a left offset position at the offset distance from a left side of the desired perceived position; generating a right image in which the object appears in the virtual environment at a right offset position at the offset distance from a right side of the desired perceived position; and preparing the left image and the right image for stereoscopic display.
  • 14. The method of claim 13, wherein the object is a scene element.
  • 15. The method of claim 13, further comprising: displaying the left image and the right image on a surface such that the object is perceived by a viewer as a 3-D image of the object positioned at the desired perceived position.
  • 16. The method of claim 15, wherein: preparing the left image and the right image for stereoscopic display comprises preparing the left image and the right image for stereoscopic display on a concave surface to the viewer at least partially surrounded by the concave surface; and displaying the left image and the right image comprises displaying the left image and the right image on an inside of the concave surface.
  • 17. The method of claim 16, wherein the desired perceived position is beyond the inside of the concave surface.
  • 18. A system for generating a 3-D image, comprising: a processor; a memory accessible by the processor, the memory comprising instructions that, when executed by the processor, cause the processor to identify a desired perceived position in a scene, generate a left image of a scene with an object in a left offset position at an offset distance to a left side of the desired perceived position, generate a right image of the scene with the object at the offset distance to a right side of the desired perceived position, store the left image of the scene in a left image file and store the right image of the scene in a right image file.
  • 19. The system of claim 18, wherein the memory further comprises instructions that, when executed by the processor, cause the processor to allow a user to define the scene.
  • 20. The system of claim 18, wherein instructions cause the processor to generate the left image and the right image as a stereoscopic pair of images.
  • 21. The system of claim 20, wherein the memory further comprises the instructions that, when executed by the processor, cause the processor to prepare the left image and the right image for stereoscopic display.
  • 22. The system of claim 21, wherein the instructions, when executed by the processor, cause the processor to prepare the left image and the right image for stereoscopic display such that the object appears beyond a surface onto which the left image and the right image are to be projected.
  • 23. The system of claim 21, wherein the instructions, when executed by the processor, cause the processor to prepare the left image and the right image for stereoscopic display on a concave surface.
  • 24. The system of claim 23, wherein the instructions, when executed by the processor, cause the processor to prepare the left image and the right image for display on an inside of at least a portion of a dome surface.
  • 25. A method of generating a 3-D image on a dome surface from a stereoscopic pair of images, the dome surface capable of at least partially surrounding an audience, the method comprising: projecting the stereoscopic pair of images onto an inside of the dome surface with a projection system such that a viewer perceives a location of an object in the 3-D image formed from the stereoscopic pair of images as being beyond the dome surface.
  • 26. A system for displaying a 3-D image, comprising: a concave screen; a processor; a memory accessible by the processor, the memory storing: a plurality of left images of a scene with an object at a left offset position from a desired perceived position in the scene; and a plurality of right images of the scene with the object at a right offset position from the desired perceived position in the scene; instructions that, when executed by the processor, cause the processor to output a signal of a stereoscopic image including a corresponding pair of images including a left image of the plurality of left images and a right image of the plurality of right images; a projector capable of receiving the signal from the processor, the projector including a left image projection element capable of projecting the left image of the scene onto the concave screen and a right image projection element capable of projecting the right image of the scene onto the concave screen, the left image projection element and the right image projection element oriented to project the left image of the scene and the right image of the scene in such a way that the object appears to be located beyond a surface of the concave screen.
  • 27. The system of claim 26, wherein the concave screen comprises a dome-shaped screen.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 13/545,948, filed on Jul. 10, 2012, titled SYSTEM AND METHOD FOR DISPLAYING DISTANT 3-D STEREO ON A DOME SURFACE, now U.S. Pat. No. 9,641,826, issued May 2, 2017 (“the '948 Application”), which claims the benefit of the Oct. 6, 2011 filing date of U.S. Provisional Patent Application No. 61/544,110 and of the Oct. 12, 2011 filing date of U.S. Provisional Patent Application No. 61/546,152 pursuant to 35 U.S.C. § 119(e). The entire disclosure of each of the foregoing patent applications is hereby incorporated herein.

5410371 Lambert Apr 1995 A
5412796 Olive May 1995 A
5422986 Neely Jun 1995 A
5430888 Witek et al. Jul 1995 A
5432863 Benati et al. Jul 1995 A
5444839 Silverbrook et al. Aug 1995 A
5451765 Gerber Sep 1995 A
5459610 Bloom et al. Oct 1995 A
5459835 Trevett Oct 1995 A
5465121 Blalock et al. Nov 1995 A
5465368 Davidson et al. Nov 1995 A
5471545 Negami et al. Nov 1995 A
5471567 Soderberg et al. Nov 1995 A
5473373 Hwung et al. Dec 1995 A
5473391 Usui Dec 1995 A
5479597 Fellous Dec 1995 A
5480305 Montag et al. Jan 1996 A
5487665 Lechner et al. Jan 1996 A
5488687 Rich Jan 1996 A
5489920 Kaasila Feb 1996 A
5490238 Watkins Feb 1996 A
5490240 Foran et al. Feb 1996 A
5493439 Engle Feb 1996 A
5493629 Stange Feb 1996 A
5495563 Winser Feb 1996 A
5499194 Prestidge et al. Mar 1996 A
5500747 Tanide et al. Mar 1996 A
5500761 Goossen et al. Mar 1996 A
5502482 Graham Mar 1996 A
5502782 Smith Mar 1996 A
5504496 Tanaka et al. Apr 1996 A
5506949 Perrin Apr 1996 A
5519518 Watanabe et al. May 1996 A
5535374 Olive Jul 1996 A
5536085 Li et al. Jul 1996 A
5537159 Suematsu et al. Jul 1996 A
5539577 Si et al. Jul 1996 A
5541769 Ansley et al. Jul 1996 A
5544306 Deering et al. Aug 1996 A
5544340 Doi et al. Aug 1996 A
5550960 Shirman et al. Aug 1996 A
5551283 Manaka et al. Sep 1996 A
5557297 Sharp et al. Sep 1996 A
5557733 Hicok et al. Sep 1996 A
5559952 Fujimoto Sep 1996 A
5559954 Sakoda et al. Sep 1996 A
5561745 Jackson et al. Oct 1996 A
5566370 Young Oct 1996 A
5572229 Fisher Nov 1996 A
5574847 Eckart et al. Nov 1996 A
5579456 Cosman Nov 1996 A
5584696 Walker et al. Dec 1996 A
5586291 Lasker et al. Dec 1996 A
5590254 Lippincott et al. Dec 1996 A
5594854 Baldwin et al. Jan 1997 A
5598517 Watkins Jan 1997 A
5604849 Artwick et al. Feb 1997 A
5610665 Berman Mar 1997 A
5612710 Christensen et al. Mar 1997 A
5614961 Gibeau et al. Mar 1997 A
5625768 Dye Apr 1997 A
5627605 Kim May 1997 A
5629801 Staker et al. May 1997 A
5630037 Schindler May 1997 A
5633750 Nogiwa et al. May 1997 A
5638208 Walker Jun 1997 A
5648860 Ooi et al. Jul 1997 A
5650814 Florent et al. Jul 1997 A
5651104 Cosman Jul 1997 A
5657077 DeAngelis et al. Aug 1997 A
5658060 Dove Aug 1997 A
5659490 Imamura Aug 1997 A
5659671 Tannenbaum et al. Aug 1997 A
5661592 Bornstein et al. Aug 1997 A
5661593 Engle Aug 1997 A
5665942 Williams et al. Sep 1997 A
5677783 Bloom et al. Oct 1997 A
5684939 Foran et al. Nov 1997 A
5684943 Abraham et al. Nov 1997 A
5689437 Nakagawa Nov 1997 A
5691999 Ball et al. Nov 1997 A
5694180 Deter et al. Dec 1997 A
5696892 Redmann et al. Dec 1997 A
5696947 Johns et al. Dec 1997 A
5699497 Erdahl et al. Dec 1997 A
5703604 McCutchen Dec 1997 A
5706061 Marshall et al. Jan 1998 A
5715021 Gibeau et al. Feb 1998 A
5719951 Shackleton et al. Feb 1998 A
5724561 Tarolli et al. Mar 1998 A
5726785 Chawki et al. Mar 1998 A
5734386 Cosman Mar 1998 A
5734521 Fukudome et al. Mar 1998 A
5739819 Bar-Nahum Apr 1998 A
5740190 Moulton Apr 1998 A
5742749 Foran et al. Apr 1998 A
5748264 Hegg May 1998 A
5748867 Cosman et al. May 1998 A
5761709 Kranich Jun 1998 A
5764280 Bloom et al. Jun 1998 A
5764311 Bonde et al. Jun 1998 A
5768443 Michael et al. Jun 1998 A
5781666 Ishizawa et al. Jul 1998 A
5793912 Boord et al. Aug 1998 A
5798743 Bloom Aug 1998 A
5808797 Bloom et al. Sep 1998 A
5818456 Cosman et al. Oct 1998 A
5818998 Harris et al. Oct 1998 A
5821944 Watkins Oct 1998 A
5825363 Anderson Oct 1998 A
5825538 Walker Oct 1998 A
5835256 Huibers Nov 1998 A
5837996 Keydar Nov 1998 A
5838328 Roller Nov 1998 A
5838484 Goossen Nov 1998 A
5841443 Einkauf Nov 1998 A
5841447 Drews Nov 1998 A
5841579 Bloom et al. Nov 1998 A
5850225 Cosman Dec 1998 A
5854631 Akeley et al. Dec 1998 A
5854865 Goldberg Dec 1998 A
5860721 Bowron et al. Jan 1999 A
5864342 Kajiya et al. Jan 1999 A
5867166 Myhrvold et al. Feb 1999 A
5867301 Engle Feb 1999 A
5870097 Snyder et al. Feb 1999 A
5870098 Gardiner Feb 1999 A
5874967 West et al. Feb 1999 A
5889529 Jones et al. Mar 1999 A
5900881 Ikedo May 1999 A
5903272 Otto May 1999 A
5905504 Barkans et al. May 1999 A
5908300 Walker et al. Jun 1999 A
5909225 Schinnerer et al. Jun 1999 A
5912670 Lipscomb et al. Jun 1999 A
5912740 Zare et al. Jun 1999 A
5917495 Doi et al. Jun 1999 A
5920361 Gibeau et al. Jul 1999 A
5923333 Stroyan Jul 1999 A
5930740 Mathisen Jul 1999 A
5943060 Cosman et al. Aug 1999 A
5946129 Xu et al. Aug 1999 A
5963788 Barron et al. Oct 1999 A
5969699 Balram et al. Oct 1999 A
5969721 Chen et al. Oct 1999 A
5969726 Rentschler et al. Oct 1999 A
5974059 Dawson Oct 1999 A
5977977 Kajiya et al. Nov 1999 A
5980044 Cannon et al. Nov 1999 A
5982553 Bloom et al. Nov 1999 A
5987200 Fleming et al. Nov 1999 A
5988814 Rohlfing et al. Nov 1999 A
5990935 Rohlfing Nov 1999 A
5999549 Freitag et al. Dec 1999 A
6002454 Kajiwara et al. Dec 1999 A
6002505 Kraenert et al. Dec 1999 A
6005580 Donovan Dec 1999 A
6005611 Gullichsen et al. Dec 1999 A
6014144 Nelson et al. Jan 2000 A
6014163 Houskeeper Jan 2000 A
6021141 Nam et al. Feb 2000 A
6031541 Lipscomb et al. Feb 2000 A
6034739 Rohlfing et al. Mar 2000 A
6038057 Brazas, Jr. et al. Mar 2000 A
6042238 Blackham et al. Mar 2000 A
6052125 Gardiner et al. Apr 2000 A
6052485 Nelson et al. Apr 2000 A
6057909 Yahav et al. May 2000 A
6064392 Rohner May 2000 A
6064393 Lengyel et al. May 2000 A
6069903 Zanger et al. May 2000 A
6072500 Foran et al. Jun 2000 A
6072544 Gleim et al. Jun 2000 A
6078333 Wittig et al. Jun 2000 A
6084610 Ozaki et al. Jul 2000 A
6094226 Ke et al. Jul 2000 A
6094267 Levenson et al. Jul 2000 A
6094298 Luo et al. Jul 2000 A
6100906 Asaro et al. Aug 2000 A
6101036 Bloom Aug 2000 A
6108054 Heizmann et al. Aug 2000 A
6111616 Chauvin et al. Aug 2000 A
6122413 Jiang et al. Sep 2000 A
6124647 Marcus et al. Sep 2000 A
6124808 Budnovitch Sep 2000 A
6124922 Sentoku Sep 2000 A
6124989 Oode et al. Sep 2000 A
6126288 Hewlett Oct 2000 A
6128019 Crocker, III et al. Oct 2000 A
6128021 van der Meulen et al. Oct 2000 A
6130770 Bloom Oct 2000 A
6134339 Luo Oct 2000 A
6137565 Ecke et al. Oct 2000 A
6137932 Kim et al. Oct 2000 A
6141013 Nelson et al. Oct 2000 A
6141025 Oka et al. Oct 2000 A
6141034 McCutchen Oct 2000 A
6144481 Kowarz et al. Nov 2000 A
6147690 Cosman Nov 2000 A
6147695 Bowen et al. Nov 2000 A
6147789 Gelbart Nov 2000 A
6154259 Hargis et al. Nov 2000 A
6175579 Sandford et al. Jan 2001 B1
6184888 Yuasa et al. Feb 2001 B1
6184891 Blinn Feb 2001 B1
6184926 Khosravi et al. Feb 2001 B1
6188427 Anderson et al. Feb 2001 B1
6188712 Jiang et al. Feb 2001 B1
6191827 Segman et al. Feb 2001 B1
6195099 Gardiner Feb 2001 B1
6195484 Brennan, III et al. Feb 2001 B1
6195609 Pilley et al. Feb 2001 B1
6204859 Jouppi et al. Mar 2001 B1
6204955 Chao et al. Mar 2001 B1
6215579 Bloom et al. Apr 2001 B1
6219015 Bloom et al. Apr 2001 B1
6222937 Cohen et al. Apr 2001 B1
6229650 Reznichenko et al. May 2001 B1
6229827 Fernald et al. May 2001 B1
6233025 Wallenstein May 2001 B1
6236408 Watkins May 2001 B1
6240220 Pan et al. May 2001 B1
6262739 Migdal et al. Jul 2001 B1
6262810 Bloomer Jul 2001 B1
6263002 Hsu et al. Jul 2001 B1
6266068 Kang et al. Jul 2001 B1
6268861 Sanz-Pastor et al. Jul 2001 B1
6282012 Kowarz et al. Aug 2001 B1
6282220 Floyd Aug 2001 B1
6283598 Inami et al. Sep 2001 B1
6285407 Yasuki et al. Sep 2001 B1
6285446 Farhadiroushan Sep 2001 B1
6292165 Lin et al. Sep 2001 B1
6292268 Hirota et al. Sep 2001 B1
6292310 Chao Sep 2001 B1
6297899 Romanovsky Oct 2001 B1
6298066 Wettroth et al. Oct 2001 B1
6301370 Steffens et al. Oct 2001 B1
6304245 Groenenboom Oct 2001 B1
6307558 Mao Oct 2001 B1
6307663 Kowarz Oct 2001 B1
6308144 Bronfeld et al. Oct 2001 B1
6320688 Westbrook et al. Nov 2001 B1
6323984 Trisnadi Nov 2001 B1
6333792 Kimura Dec 2001 B1
6333803 Kurotori et al. Dec 2001 B1
6335765 Daly et al. Jan 2002 B1
6335941 Grubb et al. Jan 2002 B1
6340806 Smart et al. Jan 2002 B1
6356683 Hu et al. Mar 2002 B1
6360042 Long Mar 2002 B1
6361173 Vlahos et al. Mar 2002 B1
6362817 Powers et al. Mar 2002 B1
6362818 Gardiner et al. Mar 2002 B1
6363089 Fernald et al. Mar 2002 B1
6366721 Hu et al. Apr 2002 B1
6369936 Moulin Apr 2002 B1
6370312 Wagoner et al. Apr 2002 B1
6374011 Wagoner et al. Apr 2002 B1
6374015 Lin Apr 2002 B1
6375366 Kato et al. Apr 2002 B1
6381072 Burger Apr 2002 B1
6381385 Watley et al. Apr 2002 B1
6384828 Arbeiter et al. May 2002 B1
6388241 Ang May 2002 B1
6393036 Kato May 2002 B1
6393181 Bulman et al. May 2002 B1
6396994 Philipson et al. May 2002 B1
6404425 Cosman Jun 2002 B1
6407736 Regan Jun 2002 B1
6411425 Kowarz et al. Jun 2002 B1
6421636 Cooper et al. Jul 2002 B1
6424343 Deering et al. Jul 2002 B1
6429876 Morein Aug 2002 B1
6429877 Stroyan Aug 2002 B1
6433823 Nakamura et al. Aug 2002 B1
6433838 Chen Aug 2002 B1
6433840 Poppleton Aug 2002 B1
6437789 Tidwell et al. Aug 2002 B1
6445362 Tegreene Sep 2002 B1
6445433 Levola Sep 2002 B1
6449071 Farhan et al. Sep 2002 B1
6449293 Pedersen et al. Sep 2002 B1
6452667 Fernald et al. Sep 2002 B1
6456288 Brockway et al. Sep 2002 B1
6466206 Deering Oct 2002 B1
6466224 Nagata et al. Oct 2002 B1
6470036 Bailey et al. Oct 2002 B1
6473090 Mayer Oct 2002 B1
6476848 Kowarz et al. Nov 2002 B2
6480513 Kapany et al. Nov 2002 B1
6480634 Corrigan Nov 2002 B1
6490931 Fernald et al. Dec 2002 B1
6496160 Tanner et al. Dec 2002 B1
6507706 Brazas et al. Jan 2003 B1
6510272 Wiegand Jan 2003 B1
6511182 Agostinelli et al. Jan 2003 B1
6512892 Montgomery et al. Jan 2003 B1
RE37993 Zhang Feb 2003 E
6519388 Fernald et al. Feb 2003 B1
6522809 Takabayashi et al. Feb 2003 B1
6525740 Cosman Feb 2003 B1
6529310 Huibers et al. Mar 2003 B1
6529531 Everage et al. Mar 2003 B1
6534248 Jain et al. Mar 2003 B2
6538656 Cheung et al. Mar 2003 B1
6549196 Taguchi et al. Apr 2003 B1
6554431 Binsted et al. Apr 2003 B1
6556627 Kitamura et al. Apr 2003 B2
6563968 Davis et al. May 2003 B2
6574352 Skolmoski Jun 2003 B1
6575581 Tsurushima Jun 2003 B2
6577429 Kurtz et al. Jun 2003 B1
6580430 Hollis et al. Jun 2003 B1
6591020 Klassen Jul 2003 B1
6594043 Bloom et al. Jul 2003 B1
6597363 Duluk, Jr. et al. Jul 2003 B1
6598979 Yoneno Jul 2003 B2
6600460 Mays, Jr. Jul 2003 B1
6600830 Lin et al. Jul 2003 B1
6600854 Anderegg et al. Jul 2003 B2
6603482 Tidwell Aug 2003 B1
6643299 Lin Nov 2003 B1
6646645 Simmonds et al. Nov 2003 B2
6650326 Huber et al. Nov 2003 B1
6671293 Kopp et al. Dec 2003 B2
6678085 Kowarz et al. Jan 2004 B2
6690655 Miner et al. Feb 2004 B1
6692129 Gross et al. Feb 2004 B2
6711187 Tanner et al. Mar 2004 B2
6727918 Nason Apr 2004 B1
6738105 Hannah et al. May 2004 B1
6741384 Martin et al. May 2004 B1
6747649 Sanz-Pastor et al. Jun 2004 B1
6747781 Trisnadi Jun 2004 B2
6751001 Tanner et al. Jun 2004 B1
6760036 Tidwell Jul 2004 B2
6763042 Williams et al. Jul 2004 B2
6773142 Rekow Aug 2004 B2
6776045 Fernald et al. Aug 2004 B2
6782205 Trisnadi et al. Aug 2004 B2
6788304 Hart et al. Sep 2004 B1
6788307 Coleman et al. Sep 2004 B2
6789903 Parker et al. Sep 2004 B2
6791562 Cosman et al. Sep 2004 B2
6793350 Raskar et al. Sep 2004 B1
6798418 Sartori et al. Sep 2004 B1
6799850 Hong et al. Oct 2004 B2
6801205 Gardiner et al. Oct 2004 B2
6809731 Muffler et al. Oct 2004 B2
6811267 Allen et al. Nov 2004 B1
6816169 Cosman Nov 2004 B2
6831648 Mukherjee et al. Dec 2004 B2
6840627 Olbrich Jan 2005 B2
6842298 Shafer et al. Jan 2005 B1
6856449 Winkler et al. Feb 2005 B2
6868212 DeWitte et al. Mar 2005 B2
6871958 Streid et al. Mar 2005 B2
6897878 Cosman et al. May 2005 B2
6943803 Cosman et al. Sep 2005 B1
6956582 Tidwell Oct 2005 B2
6956878 Trisnadi Oct 2005 B1
6971576 Tsikos et al. Dec 2005 B2
6984039 Agostinelli Jan 2006 B2
6985663 Catchmark et al. Jan 2006 B2
7012669 Streid et al. Mar 2006 B2
7030883 Thompson Apr 2006 B2
7038735 Coleman et al. May 2006 B2
7043102 Okamoto et al. May 2006 B2
7053911 Cosman May 2006 B2
7053912 Cosman May 2006 B2
7053913 Cosman May 2006 B2
7054051 Bloom May 2006 B1
7091980 Tidwell Aug 2006 B2
7095423 Cosman et al. Aug 2006 B2
7110153 Sakai Sep 2006 B2
7110624 Williams et al. Sep 2006 B2
7111943 Agostinelli et al. Sep 2006 B2
7113320 Tanner Sep 2006 B2
7133583 Marceau et al. Nov 2006 B2
7169630 Moriwaka Jan 2007 B2
7193765 Christensen et al. Mar 2007 B2
7193766 Bloom Mar 2007 B2
7197200 Marceau et al. Mar 2007 B2
7210786 Tamura et al. May 2007 B2
7215840 Marceau et al. May 2007 B2
7237916 Mitomori Jul 2007 B2
7257519 Cosman Aug 2007 B2
7267442 Childers et al. Sep 2007 B2
7277216 Bloom Oct 2007 B2
7286277 Bloom et al. Oct 2007 B2
7317464 Willis Jan 2008 B2
7327909 Marceau et al. Feb 2008 B2
7334902 Streid et al. Feb 2008 B2
7354157 Takeda et al. Apr 2008 B2
7364309 Sugawara et al. Apr 2008 B2
7400449 Christensen et al. Jul 2008 B2
7420177 Williams et al. Sep 2008 B2
7583437 Lipton et al. Sep 2009 B2
7594965 Tanaka Sep 2009 B2
20010002124 Mamiya et al. May 2001 A1
20010027456 Lancaster et al. Oct 2001 A1
20010047251 Kemp Nov 2001 A1
20020005862 Deering Jan 2002 A1
20020021462 Delfyett et al. Feb 2002 A1
20020030769 Bae Mar 2002 A1
20020042674 Mochizuki et al. Apr 2002 A1
20020067465 Li Jun 2002 A1
20020067467 Dorval et al. Jun 2002 A1
20020071453 Lin Jun 2002 A1
20020075202 Fergason Jun 2002 A1
20020101647 Moulin Aug 2002 A1
20020136121 Salmonsen et al. Sep 2002 A1
20020145615 Moore Oct 2002 A1
20020145806 Amm Oct 2002 A1
20020146248 Herman et al. Oct 2002 A1
20020154860 Fernald et al. Oct 2002 A1
20020176134 Vohra Nov 2002 A1
20020196414 Manni et al. Dec 2002 A1
20030035190 Brown et al. Feb 2003 A1
20030038807 Demos et al. Feb 2003 A1
20030039443 Catchmark et al. Feb 2003 A1
20030048275 Ciolac Mar 2003 A1
20030081303 Sandstrom et al. May 2003 A1
20030086647 Willner et al. May 2003 A1
20030142319 Ronnekleiv et al. Jul 2003 A1
20030160780 Lefebvre et al. Aug 2003 A1
20030174312 Leblanc Sep 2003 A1
20030214633 Roddy et al. Nov 2003 A1
20030235304 Evans et al. Dec 2003 A1
20040017518 Stern et al. Jan 2004 A1
20040017608 Lantz Jan 2004 A1
20040085283 Wang May 2004 A1
20040136074 Ford et al. Jul 2004 A1
20040165154 Kobori et al. Aug 2004 A1
20040179007 Bower et al. Sep 2004 A1
20040183954 Hannah et al. Sep 2004 A1
20040184013 Raskar et al. Sep 2004 A1
20040196660 Usami Oct 2004 A1
20040207618 Williams et al. Oct 2004 A1
20050018309 McGuire, Jr. et al. Jan 2005 A1
20050024722 Agostinelli et al. Feb 2005 A1
20050047134 Mueller et al. Mar 2005 A1
20050093854 Kennedy et al. May 2005 A1
20050243389 Kihara Nov 2005 A1
20060039051 Baba et al. Feb 2006 A1
20060114544 Bloom et al. Jun 2006 A1
20060176912 Anikitchev Aug 2006 A1
20060221429 Christensen et al. Oct 2006 A1
20060238851 Bloom Oct 2006 A1
20060255243 Kobayashi et al. Nov 2006 A1
20070183473 Bicknell et al. Aug 2007 A1
20080037125 Takamiya Feb 2008 A1
20080218837 Yang et al. Sep 2008 A1
20120019528 Ugawa Jan 2012 A1
20120069143 Chu Mar 2012 A1
20120133748 Chung May 2012 A1
20130257857 Kakizawa Oct 2013 A1
20140300708 Iversen Oct 2014 A1
20150163475 Krisman Jun 2015 A1
Foreign Referenced Citations (51)
Number Date Country
2 325 028 Dec 1974 DE
197 21 416 Jan 1999 DE
0 155 858 Sep 1985 EP
0 306 308 Mar 1989 EP
0 319 165 Jun 1989 EP
0 417 039 Mar 1991 EP
0 480 570 Apr 1992 EP
0 488 326 Jun 1992 EP
0 489 594 Jun 1992 EP
0 528 646 Feb 1993 EP
0 530 760 Mar 1993 EP
0 550 189 Jul 1993 EP
0 610 665 Aug 1994 EP
0 621 548 Oct 1994 EP
0 627 644 Dec 1994 EP
0 627 850 Dec 1994 EP
0 643 314 Mar 1995 EP
0 654 777 May 1995 EP
0 658 868 Jun 1995 EP
0 689 078 Dec 1995 EP
0 801 319 Oct 1997 EP
0 880 282 Nov 1998 EP
1 365 584 Nov 2003 EP
2 118 365 Oct 1983 GB
2 144 608 Mar 1985 GB
2 179 147 Feb 1987 GB
2 245 806 Jan 1992 GB
2 251 770 Jul 1992 GB
2 251 773 Jul 1992 GB
2 266 385 Oct 1993 GB
2 293 079 Mar 1996 GB
63-305323 Dec 1988 JP
2-219092 Aug 1990 JP
2000305481 Nov 2000 JP
8701571 Mar 1987 WO
9212506 Jul 1992 WO
9302269 Feb 1993 WO
9309472 May 1993 WO
9318428 Sep 1993 WO
9511473 Apr 1995 WO
9527267 Oct 1995 WO
9641217 Dec 1996 WO
9641224 Dec 1996 WO
9726569 Jul 1997 WO
9815127 Apr 1998 WO
0146248 Jun 2001 WO
0157581 Aug 2001 WO
0212925 Feb 2002 WO
0223824 Mar 2002 WO
0231575 Apr 2002 WO
03001281 Jan 2003 WO
Non-Patent Literature Citations (124)
Gupta et al., “A VLSI Architecture for Updating Raster-Scan Displays,” Computer Graphics, Aug. 1981, pp. 71-78, vol. 15, No. 3.
Halevi, “Bimorph piezoelectric flexible mirror: graphical solution and comparison with experiment,” J. Opt. Soc. Am., Jan. 1983, pp. 110-113, vol. 73, No. 1.
Hanbury, “The Taming of the Hue, Saturation and Brightness Colour Space,” Centre de Morphologie Mathematique, Ecole des Mines de Paris, date unknown, pp. 234-243.
Hearn et al., Computer Graphics, 2nd ed., 1994, pp. 143-181.
Heckbert, “Survey of Texture Mapping,” IEEE Computer Graphics and Applications, Nov. 1986, pp. 56-67.
Heckbert, “Texture Mapping Polygons in Perspective,” New York Institute of Technology, Computer Graphics Lab, Technical Memo No. 13, Apr. 28, 1983.
Heidrich et al., "Applications of Pixel Textures in Visualization and Realistic Image Synthesis," Symposium on Interactive 3D Graphics, 1999, pp. 127-135, Atlanta, Georgia.
Holten-Lund, Design for Scalability in 3D Computer Graphics Architectures, Ph.D. thesis, Computer Science and Technology, Informatics and Mathematical Modelling, Technical University of Denmark, Jul. 2001.
Integrating Sphere, www.crowntech-inc.com, 010-82781750/82782352/68910917, date unknown.
INTEL740 Graphics Accelerator Datasheet, Apr. 1998.
INTEL740 Graphics Accelerator Datasheet, Architectural Overview, at least as early as Apr. 30, 1998.
Jacob, “Eye Tracking in Advanced Interface Design,” ACM, 1995.
Kelley et al., “Hardware Accelerated Rendering of CSG and Transparency,” SIGGRAPH '94, in Computer Graphics Proceedings, Annual Conference Series, 1994, pp. 177-184.
Klassen, “Modeling the Effect of the Atmosphere on Light,” ACM Transactions on Graphics, Jul. 1987, pp. 215-237, vol. 6, No. 3.
Kleiss, “Tradeoffs Among Types of Scene Detail for Simulating Low-Altitude Flight,” University of Dayton Research Institute, Aug. 1, 1992, pp. 1141-1146.
Kudryashov et al., "Adaptive Optics for High Power Laser Beam Control," Springer Proceedings in Physics, 2005, pp. 237-248, vol. 102.
Lewis, “Algorithms for Solid Noise Synthesis,” SIGGRAPH '89, Computer Graphics, Jul. 1989, pp. 263-270, vol. 23, No. 3.
Lindstrom et al., “Real-Time, Continuous Level of Detail Rendering of Height Fields,” SIGGRAPH '96, Aug. 1996.
McCarty et al., “A Virtual Cockpit for a Distributed Interactive Simulation,” IEEE Computer Graphics & Applications, Jan. 1994, pp. 49-54.
Microsoft Flight Simulator 2004, Aug. 9, 2000. http://www.microsoft.com/games/flightsimulator/fs2000_devdesk.sdk.asp.
Miller et al., "Illumination and Reflection Maps: Simulated Objects in Simulated and Real Environments," SIGGRAPH '84, Course Notes for Advanced Computer Graphics Animation, Jul. 23, 1984.
Mitchell, “Spectrally Optimal Sampling for Distribution Ray Tracing,” SIGGRAPH '91, Computer Graphics, Jul. 1991, pp. 157-165, vol. 25, No. 4.
Mitsubishi Electronic Device Group, “Overview of 3D-RAM and Its Functional Blocks,” 1995.
Montrym et al., “InfiniteReality: A Real-Time Graphics System,” Computer Graphics Proceedings, Annual Conference Series, 1997.
Mooradian et al., “High Power Extended Vertical Cavity Surface Emitting Diode Lasers and Arrays and Their Applications,” Micro-Optics Conference, Tokyo, Nov. 2, 2005.
Musgrave et al., “The Synthesis and Rendering of Eroded Fractal Terrains,” SIGGRAPH '89, Computer Graphics, Jul. 1989, pp. 41-50, vol. 23, No. 3.
Nakamae et al., “Compositing 3D Images with Antialiasing and Various Shading Effects,” IEEE Computer Graphics Applications, Mar. 1989, pp. 21-29, vol. 9, No. 2.
Newman et al., Principles of Interactive Computer Graphics, 2nd ed., 1979, McGraw-Hill Book Company, San Francisco, California.
Niven, “Trends in Laser Light Sources for Projection Display,” Novalux International Display Workshop, Session LAD2-2, Dec. 2006.
Oshima et al., “An Animation Design Tool Utilizing Texture,” International Workshop on Industrial Applications of Machine Intelligence and Vision, Tokyo, Apr. 10-12, 1989, pp. 337-342.
Parke, “Simulation and Expected Performance Analysis of Multiple Processor Z-Buffer Systems,” Computer Graphics, 1980, pp. 48-56.
Peachey, “Solid Texturing of Complex Surfaces,” SIGGRAPH '85, 1985, pp. 279-286, vol. 19, No. 3.
Peercy et al., “Efficient Bump Mapping Hardware,” Computer Graphics Proceedings, 1997.
Perlin, “An Image Synthesizer,” SIGGRAPH '85, 1985, pp. 287-296, vol. 19, No. 3.
Pineda, “A Parallel Algorithm for Polygon Rasterization,” SIGGRAPH '88, Aug. 1988, pp. 17-20, vol. 22, No. 4.
Polis et al., “Automating the Construction of Large Scale Virtual Worlds,” Digital Mapping Laboratory, School of Computer Science, Carnegie Mellon University, date unknown.
Porter et al., “Compositing Digital Images,” SIGGRAPH '84, Computer Graphics, Jul. 1984, pp. 253-259, vol. 18, No. 3.
Poulton et al., “Breaking the Frame-Buffer Bottleneck with Logic-Enhanced Memories,” IEEE Computer Graphics Applications, Nov. 1992, pp. 65-74.
Rabinovich et al., “Visualization of Large Terrains in Resource-Limited Computing Environments,” Computer Science Department, Technion—Israel Institute of Technology, pp. 95-102, date unknown.
Reeves et al., “Rendering Antialiased Shadows with Depth Maps,” SIGGRAPH '87, Computer Graphics, Jul. 1987, pp. 283-291, vol. 21, No. 4.
Regan et al., “Priority Rendering with a Virtual Reality Address Recalculation Pipeline,” Computer Graphics Proceedings, Annual Conference Series, 1994.
Rhoades et al., “Real-Time Procedural Textures,” ACM, Jun. 1992, pp. 95-100, 225.
Rockwood et al., “Blending Surfaces in Solid Modeling,” Geometric Modeling: Algorithms and New Trends, 1987, pp. 367-383, Society for Industrial and Applied Mathematics, Philadelphia, Pennsylvania.
Röttger et al., “Real-Time Generation of Continuous Levels of Detail for Height Fields,” WSCG '98, 1998.
Safronov, “Bimorph Adaptive Optics: Elements, Technology and Design Principles,” SPIE, 1996, pp. 494-504, vol. 2774.
Saha et al., “Web-based Distributed VLSI Design,” IEEE, 1997, pp. 449-454.
Salzman et al., “VR's Frames of Reference: A Visualization Technique for Mastering Abstract Multidimensional Information,” CHI 99 Papers, May 1999, pp. 489-495.
Sandejas, Silicon Microfabrication of Grating Light Valves, Doctor of Philosophy Dissertation, Stanford University, Jul. 1995.
Scarlatos, “A Refined Triangulation Hierarchy for Multiple Levels of Terrain Detail,” presented at the IMAGE V Conference, Phoenix, Arizona, Jun. 19-22, 1990, pp. 114-122.
Schilling, “A New Simple and Efficient Antialiasing with Subpixel Masks,” SIGGRAPH '91, Computer Graphics, Jul. 1991, pp. 133-141, vol. 25, No. 4.
Schumacker, “A New Visual System Architecture,” Proceedings of the Second Interservices/Industry Training Equipment Conference, Nov. 18-20, 1990, Salt Lake City, Utah.
Segal et al., “Fast Shadows and Lighting Effects Using Texture Mapping,” SIGGRAPH '92, Computer Graphics, Jul. 1992, pp. 249-252, vol. 26, No. 2.
SICK AG, S3000 Safety Laser Scanner Operating Instructions, Aug. 25, 2005.
Silicon Light Machines, "White Paper: Calculating Response Characteristics for the 'Janis' GLV Module, Revision 2.0," Oct. 1999.
Solgaard, "Integrated Semiconductor Light Modulators for Fiber-Optic and Display Applications," Ph.D. Dissertation submitted to the Department of Electrical Engineering and the Committee on Graduate Studies of Stanford University, Feb. 1992.
Sollberger et al., “Frequency Stabilization of Semiconductor Lasers for Applications in Coherent Communication Systems,” Journal of Lightwave Technology, Apr. 1987, pp. 485-491, vol. LT-5, No. 4.
Steinhaus et al., “Bimorph Piezoelectric Flexible Mirror,” Journal of the Optical Society of America, Mar. 1979, pp. 478-481, vol. 69, No. 3.
Stevens et al., “The National Simulation Laboratory: The Unifying Tool for Air Traffic Control System Development,” Proceedings of the 1991 Winter Simulation Conference, 1991, pp. 741-746.
Stone, “High-Performance Computer Architecture,” 1987, pp. 278-330, Addison-Wesley Publishing Company, Menlo Park, California.
Tanner et al., “The Clipmap: A Virtual Mipmap,” Silicon Graphics Computer Systems; Proceedings of the 25th Annual Conference on Computer Graphics and Interactive Techniques, Jul. 1998.
Tanriverdi et al., “Interacting with Eye Movements in Virtual Environments,” CHI Letters, Apr. 2000, pp. 265-272, vol. 2, No. 1.
Texas Instruments, DLP® 3-D HDTV Technology, 2007.
Torborg et al., “Talisman: Commodity Realtime 3D Graphics for the PC,” Computer Graphics Proceedings, Annual Conference Series, 1996, pp. 353-363.
Trisnadi, “Hadamard Speckle Contrast Reduction,” Optics Letters, 2004, vol. 29, pp. 11-13.
Trisnadi et al., “Overview and Applications of Grating Light Valve™ Based Optical Write Engines for High-Speed Digital Imaging,” proceedings of conference “MOEMS Display and Imaging Systems II,” Jan. 2004, vol. 5328.
Tseng et al., “Development of an Aspherical Bimorph PZT Mirror Bender with Thin Film Resistor Electrode,” Advanced Photo Source, Argonne National Laboratory, Sep. 2002, pp. 271-278.
Vinevich et al., “Cooled and Uncooled Single-Channel Deformable Mirrors for Industrial Laser Systems,” Quantum Electronics, 1998, pp. 366-369, vol. 28, No. 4.
Whitton, “Memory Design for Raster Graphics Displays,” IEEE Computer Graphics & Applications, Mar. 1984, pp. 48-65.
Williams, “Casting Curved Shadows on Curved Surfaces,” Computer Graphics Lab, New York Institute of Technology, 1978, pp. 270-274.
Williams, “Pyramidal Parametrics,” Computer Graphics, Jul. 1983, pp. 1-11, vol. 17, No. 3.
Willis et al., “A Method for Continuous Adaptive Terrain,” Presented at the 1996 IMAGE Conference, Jun. 23-28, 1996.
Woo et al., “A Survey of Shadow Algorithms,” IEEE Computer Graphics & Applications, Nov. 1990, pp. 13-32, vol. 10, No. 6.
Wu et al., “A Differential Method for Simultaneous Estimation of Rotation, Change of Scale and Translation,” Signal Processing: Image Communication, 1990, pp. 69-80, vol. 2, No. 1.
Youbing et al., “A Fast Algorithm for Large Scale Terrain Walkthrough,” CAD/Graphics, Aug. 22-24, 2001.
Abrash, “The Quake Graphics Engine,” CGDC Quake Talk taken from Computer Game Developers Conference, Apr. 2, 1996. http://gamers.org/dEngine/quake/papers/mikeab-cgdc.html.
Akeley, “RealityEngine Graphics,” Computer Graphics Proceedings, Annual Conference Series, 1993.
Allen, J. et al., “An Interactive Learning Environment for VLSI Design,” Proceedings of the IEEE, Jan. 2000, pp. 96-106, vol. 88, No. 1.
Allen, W. et al., “47.4: Invited Paper: Wobulation: Doubling the Addressed Resolution of Projection Displays,” SID 05 Digest, 2005, pp. 1514-1517.
Amm, et al., “5.2: Grating Light Valve™ Technology: Update and Novel Applications,” Presented at Society for Information Display Symposium, May 19, 1998, Anaheim, California.
Apgar et al., “A Display System for the Stellar™ Graphics Supercomputer Model GS1000T™,” Computer Graphics, Aug. 1988, pp. 255-262, vol. 22, No. 4.
Apte, “Grating Light Valves for High-Resolution Displays,” Ph.D. Dissertation—Stanford University, 1994 (abstract only).
Baer, Computer Systems Architecture, 1980, Computer Science Press, Inc., Rockville, Maryland.
Barad et al., “Real-Time Procedural Texturing Techniques Using MMX,” Gamasutra, May 1, 1998, http://www.gamasutra.com/features/19980501/mmxtexturing_01.htm.
Bass, “4K GLV Calibration,” E&S Company, Jan. 8, 2008.
Becker et al., “Smooth Transitions between Bump Rendering Algorithms,” Computer Graphics Proceedings, 1993, pp. 183-189.
Bishop et al., “Frameless Rendering: Double Buffering Considered Harmful,” Computer Graphics Proceedings, Annual Conference Series, 1994.
Blinn, “Simulation of Wrinkled Surfaces,” Siggraph '78 Proceedings, 1978, pp. 286-292.
Blinn, “A Trip Down the Graphics Pipeline: Subpixelic Particles,” IEEE Computer Graphics & Applications, Sep./Oct. 1991, pp. 86-90, vol. 11, No. 5.
Blinn et al., “Texture and Reflection in Computer Generated Images,” Communications of the ACM, Oct. 1976, pp. 542-547, vol. 19, No. 10.
Bloom, “The Grating Light Valve: revolutionizing display technology,” Silicon Light Machines, date unknown.
Boyd et al., "Parametric Interaction of Focused Gaussian Light Beams," Journal of Applied Physics, Jul. 1968, pp. 3597-3639, vol. 39, No. 8.
Brazas et al., "High-Resolution Laser-Projection Display System Using a Grating Electromechanical System (GEMS)," MOEMS Display and Imaging Systems II, Proceedings of SPIE, 2004, pp. 65-75, vol. 5348.
Bresenham, “Algorithm for Computer Control of a Digital Plotter,” IBM Systems Journal, 1965, pp. 25-30, vol. 4, No. 1.
Carlson, “An Algorithm and Data Structure for 3D Object Synthesis Using Surface Patch Intersections,” Computer Graphics, Jul. 1982, pp. 255-263, vol. 16, No. 3.
Carpenter, “The A-buffer, an Antialiased Hidden Surface Method,” Computer Graphics, Jul. 1984, pp. 103-108, vol. 18, No. 3.
Carter, “Re: Re seams and creaseAngle (long),” posted on the GeoVRML.org website Feb. 2, 2000, http://www.ai.sri.com/geovrml/archive/msg00560.html.
Catmull, “An Analytic Visible Surface Algorithm for Independent Pixel Processing,” Computer Graphics, Jul. 1984, pp. 109-115, vol. 18, No. 3.
Chasen, “Geometric Principles and Procedures for Computer Graphic Applications,” 1978, pp. 11-123, Upper Saddle River, New Jersey.
Choy et al., “Single Pass Algorithm for the Generation of Chain-Coded Contours and Contours Inclusion Relationship,” Communications, Computers and Signal Processing—IEEE Pac Rim '93, 1993, pp. 256-259.
Clark et al., “Photographic Texture and CIG: Modeling Strategies for Production Data Bases,” 9th VITSC Proceedings, Nov.-Dec., 1987, pp. 274-283.
Corbin et al., “Grating Light Valve™ and Vehicle Displays,” Silicon Light Machines, Sunnyvale, California, date unknown.
Corrigan et al., “Grating Light Valve™ Technology for Projection Displays,” Presented at the International Display Workshop—Kobe, Japan, Dec. 9, 1998.
Crow, "Shadow Algorithms for Computer Graphics," Siggraph '77, Jul. 1977, San Jose, California, pp. 242-248.
Deering et al., “FBRAM: A New Form of Memory Optimized for 3D Graphics,” Computer Graphics Proceedings, Annual Conference Series, 1994.
Drever et al., “Laser Phase and Frequency Stabilization Using an Optical Resonator,” Applied Physics B: Photophysics and Laser Chemistry, 1983, pp. 97-105, vol. 31.
Duchaineau et al., “ROAMing Terrain: Real-Time Optimally Adapting Meshes,” Los Alamos National Laboratory and Lawrence Livermore National Laboratory, 1997.
Duff, “Compositing 3-D Rendered Images,” Siggraph '85, Jul. 22-26, 1985, San Francisco, California, pp. 41-44.
Ellis, “Low-Cost Bimorph Mirrors in Adaptive Optics,” Ph.D. Thesis, Imperial College of Science, Technology and Medicine—University of London, 1999.
Faux et al., Computational Geometry for Design and Manufacture, 1979, Ellis Horwood, Chicester, United Kingdom.
Feiner et al., “Dial: A Diagrammatic Animation Language,” IEEE Computer Graphics & Applications, Sep. 1982, pp. 43-54, vol. 2, No. 7.
Fiume et al., “A Parallel Scan Conversion Algorithm with Anti-Aliasing for a General-Purpose Ultracomputer,” Computer Graphics, Jul. 1983, pp. 141-150, vol. 17, No. 3.
Foley et al., “Computer Graphics: Principles and Practice,” 2nd ed., 1990, Addison-Wesley Publishing Co., Inc., Menlo Park, California.
Foley et al., “Fundamentals of Interactive Computer Graphics,” 1982, Addison-Wesley Publishing Co., Inc., Menlo Park, California.
Fox et al., “Development of Computer-Generated Imagery for a Low-Cost Real-Time Terrain Imaging System,” IEEE 1986 National Aerospace and Electronic Conference, May 19-23, 1986, pp. 986-991.
Gambotto, “Combining Image Analysis and Thermal Models for Infrared Scene Simulations,” Image Processing Proceedings, ICIP-94, IEEE International Conference, 1994, vol. 1, pp. 710-714.
Gardiner, “A Method for Rendering Shadows,” E&S Company, Sep. 25, 1996.
Gardiner, “Shadows in Harmony,” E&S Company, Sep. 20, 1996.
Gardner, “Simulation of Natural Scenes Using Textured Quadric Surfaces,” Computer Graphics, Jul. 1984, pp. 11-20, vol. 18, No. 3.
Gardner, “Visual Simulation of Clouds,” Siggraph '85, Jul. 22-26, 1985, San Francisco, California, pp. 297-303.
Giloi, Interactive Computer Graphics: Data Structures, Algorithms, Languages, 1978, Prentice-Hall, Inc., Englewood Cliffs, New Jersey.
Glaskowsky, “Intel Displays 740 Graphics Chip: Auburn Sets New Standard for Quality—But Not Speed,” Microprocessor Report, Feb. 16, 1998, pp. 5-9, vol. 12, No. 2.
Goshtasby, “Registration of Images with Geometric Distortions,” IEEE Transactions on Geoscience and Remote Sensing, Jan. 1988, pp. 60-64, vol. 26, No. 1.
Great Britain Health & Safety Executive, The Radiation Safety of Lasers Used for Display Purposes, Oct. 1996.
Gupta et al., “Filtering Edges for Gray-Scale Displays,” Computer Graphics, Aug. 1981, pp. 1-5, vol. 15, No. 3.
Provisional Applications (2)
Number Date Country
61546152 Oct 2011 US
61544110 Oct 2011 US
Continuations (1)
Number Date Country
Parent 13545948 Jul 2012 US
Child 15585085 US