The present invention relates to a three-dimensional imaging system using a micromirror array lens.
There are many different three-dimensional imaging systems, as well as many different ways to represent an object in three-dimensional space. The most popular is the stereoscopic imaging system, which acquires depth information from a scene by simulating the parallax of human vision. When human eyes view a scene, the right and left eyes have two different perspectives due to their separation. The brain fuses these two perspectives and assesses the visual depth. Similarly, stereoscopic three-dimensional imaging systems take two perspective images with two parallel cameras disposed to view the scene from different angles at the same time, as disclosed in U.S. Pat. No. 5,432,712 to Chan. These devices, however, tend to be large, heavy, and costly due to the multiple camera systems and their optical axis separation requirement. Also, when stereoscopic images are displayed for three-dimensional viewing, many technical problems can arise, involving the arbitrary distribution of viewer positions, viewing by multiple viewers, binocular disparity due to deviations in the distance between the two eyes, vergence, fatigue accumulation in the eye, accommodation, the relative position change of the three-dimensional image due to viewer movement, etc.
U.S. Pat. No. 6,503,195 to Keller discloses a structured light depth extraction system in which a projector projects a structured light pattern, such as a grid, in visible or invisible form onto an object, and an image processor then calculates depth information based on the reflected light pattern. When visible light is used, image quality can be degraded, while invisible light requires an additional sensor system. Also, the performance of the structured light depth extraction system depends on the reflectivity of the object.
U.S. Pat. No. 3,506,327 to Leith discloses a holographic imaging system, which uses coherent radiation to produce an object-bearing beam and a reference beam. These two beams produce an interference fringe pattern on the detector, wherein the intensity and phase information of the light are recorded. A three-dimensional image can be reconstructed by illuminating the interference fringe pattern with the reference beam. The maximum image depth is limited mainly by the coherence length of the beam. The holographic imaging system requires an expensive, high-power-consuming coherent light source such as a laser, and near-darkroom conditions for imaging. Therefore, the holographic imaging system is not applicable to portable imaging devices and may raise safety concerns when used in public areas.
U.S. Pat. No. 5,032,720 to White and U.S. Pat. No. 6,949,069 to Farkas disclose three-dimensional confocal systems in which a point of interest is illuminated by a light source through a pinhole aperture. The confocal system can provide a high resolution three-dimensional image with a single camera system, but most of the illuminating light is wasted and causes noise problems. To overcome this, U.S. Pat. No. 6,749,346 to Dickensheets and U.S. Pat. No. 6,563,105 to Seibel use a single optical fiber to scan and collect reflected light, but point-by-point scanning can lead to a slow image refresh rate.
The depth from focus criterion is well known for three-dimensional imaging, wherein a sequence of images is taken by changing the camera focus, and in-focus regions are extracted from the images. Camera focus can be changed in many different ways. U.S. Pat. No. 5,986,811 to Wohlstadter discloses a three-dimensional imaging method and system using conventional motorized optics having an input lens and an output lens to change the focal length of the imaging system. Conventional motorized optics has a slow response time and requires complex driving mechanisms to control the relative position of the lenses. Therefore, it is difficult to use such optics in a real-time imaging system or to miniaturize the imaging system.
U.S. Pat. No. 6,344,930 to Kaneko discloses a total-focus imaging system using a sealed liquid lens actuated by a piezoelectric actuator to change the focal length of the imaging system. The proposed liquid lens has a slow focal length change speed of several hundred Hz. The system can make only a half dozen focal length changes for each three-dimensional image at the standard video or movie rate. Besides, the lens has a small focal length variation range. These problems limit the possible range of depth and the depth resolution of the three-dimensional image.
Among the most advanced variable focal length lenses is the liquid crystal variable focal length lens, whose focal length is changed by modulating the refractive index. However, it requires a complex control mechanism and has a slow response time, typically on the order of hundreds of milliseconds; even the fastest liquid crystal lens has a response time of tens of milliseconds, which still yields a low depth resolution three-dimensional image. Also, it has a small focal length variation and a low focusing efficiency.
A variable focal length lens with high speed, a large variation of numerical aperture, and a large diameter is necessary to obtain a real-time three-dimensional image with a large range of depth and high depth resolution.
An objective of the invention is to provide a real-time three-dimensional imaging system that can provide an all-in-focus image and depth information for each pixel of the all-in-focus image.
Another objective of the invention is to provide a real-time three-dimensional imaging system that can provide depthwise images and depth information for each depthwise image.
Another objective of the invention is to provide a three-dimensional imaging system that can compensate for various optical distortions or aberrations.
In the present invention, three-dimensional imaging is achieved based on the aforementioned depth from focus criterion. The three-dimensional imaging system of the present invention comprises at least one variable focal length MicroMirror Array Lens (MMAL), an imaging unit, and an image processing unit.
The variable focal length MMAL comprises a plurality of micromirrors, each of which functions as a mirror. The micromirrors in the MMAL are arranged on a substantially flat plane in a shape that depends on the geometry of the imaging system. The MMAL works as a reflective focusing lens by making all light scattered from one point of an object have the same periodic phase and converge at one point on the image plane. Each micromirror in the MMAL is controlled to have the desired translation and rotation to satisfy the convergence and phase matching conditions for forming an image of the object, wherein each micromirror of the MMAL is actuated by electrostatic and/or electromagnetic force.
The MMAL has many advantages over conventional variable focus lenses: (1) a very fast response time, because each micromirror has a tiny mass; (2) a large focal length variation, because large numerical aperture variations can be achieved by increasing the maximum rotational angle of the micromirrors; (3) a high optical focusing efficiency; (4) a large aperture without loss of optical performance; (5) low cost, owing to the mass productivity of microelectronics manufacturing technology; (6) the capability of compensating for phase errors introduced by the medium between the object and the image and/or correcting defects of the lens system that cause its image to deviate from the rules of paraxial imagery; (7) a simpler focusing mechanism; and (8) low power consumption when electrostatic actuation is used to control it.
The following U.S. patents and applications describe the MMAL: U.S. Pat. No. 6,934,072 to Kim, U.S. Pat. No. 6,934,073 to Kim, U.S. Pat. No. 6,970,284 to Kim, U.S. Pat. No. 6,999,226 to Kim, U.S. Pat. No. 7,031,046 to Kim, U.S. patent application Ser. No. 10/857,714 filed May 28, 2004, U.S. patent application Ser. No. 10/893,039 filed Jul. 16, 2004, and U.S. patent application Ser. No. 10/983,353 filed Nov. 8, 2004, all of which are hereby incorporated by reference.
The variable focal length MMAL changes its surface profile to change its focal length by controlling the rotation and translation of each micromirror in the MMAL. The focal length of the variable focal length MMAL is changed with a plurality of steps in order to scan the whole object.
The imaging unit comprising at least one two-dimensional image sensor captures images formed on the image plane by the variable focal length MMAL. As the focal length of the variable focal length MMAL is changed, the in-focus regions of the object are also changed accordingly.
The image processing unit extracts the substantially in-focus pixels of each captured image received from the imaging unit and generates a corresponding depthwise image using the extracted in-focus pixels. Depth information, or the distance between the imaging system and the in-focus region of the object, is determined from known imaging system parameters, including the focal length and the distance between the MMAL and the image plane. The image processing unit can combine these depthwise images to make an all-in-focus image. There are several methods by which the image processing unit can obtain depthwise images or an all-in-focus image (e.g. an edge detection filter). Recent advances in both image sensors and image processing units make them as fast as required. Depending on the display method of the three-dimensional display system, the three-dimensional imaging system can provide either depthwise images with depth information for each depthwise image or one all-in-focus image with depth information for each pixel of the all-in-focus image. All the processes are completed within a unit time, which is less than or equal to the persistence rate of the human eye.
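The extraction of in-focus pixels described above can be illustrated with a minimal sketch (not part of the disclosed system; the Laplacian-based focus measure, window size, and function names are illustrative assumptions): each pixel is assigned the depth of the focal step at which it appears sharpest, and the all-in-focus image is assembled from those sharpest pixels.

```python
import numpy as np

def box_filter(img, k):
    """Simple box blur with edge padding, used to smooth the focus measure."""
    pad = k // 2
    p = np.pad(img, pad, mode='edge')
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def depth_from_focus(image_stack, depths, window=9):
    """Estimate per-pixel depth from a focal stack.

    image_stack: (N, H, W) grayscale frames, one per focal length step.
    depths: length-N sequence of object distances for each step.
    Returns an all-in-focus image and a per-pixel depth map.
    """
    stack = np.asarray(image_stack, dtype=float)
    n, h, w = stack.shape
    measures = np.empty_like(stack)
    for i in range(n):
        # Focus measure: smoothed |Laplacian|, high where the frame is sharp.
        # np.roll gives periodic boundaries; fine for this illustration.
        lap = (np.roll(stack[i], 1, 0) + np.roll(stack[i], -1, 0)
               + np.roll(stack[i], 1, 1) + np.roll(stack[i], -1, 1)
               - 4 * stack[i])
        measures[i] = box_filter(np.abs(lap), window)
    best = measures.argmax(axis=0)            # sharpest focal step per pixel
    rows, cols = np.indices((h, w))
    all_in_focus = stack[best, rows, cols]    # take each pixel from its sharpest frame
    depth_map = np.asarray(depths, dtype=float)[best]
    return all_in_focus, depth_map
```

The depth map here is simply the step's object distance; in the described system that distance would come from the known focal length and MMAL-to-image-plane distance at each step.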
The three-dimensional imaging system of the present invention compensates for aberrations using the MMAL. Since the MMAL is an adaptive optical component, it compensates for phase errors of light introduced by the medium between an object and its image and/or corrects defects of the three-dimensional imaging system that may cause the image to deviate from the rules of paraxial imagery, by controlling the individual micromirrors in the MMAL.
Because a conventional refractive lens is positioned perpendicular to the optical axis, its surface profile is generally axis-symmetric. The variable focal length MMAL, however, is a reflective lens. The three-dimensional imaging system can further comprise a beam splitter positioned in the path of light between the imaging unit and the MMAL to obtain normal-incidence optical geometry onto the MMAL. Alternatively, in order to deflect the light into a sensor without using a beam splitter, the MMAL can be positioned so that the path of the light reflected by the MMAL is not blocked. When the MMAL is tilted about an axis (the tilting axis) perpendicular to the optical axis, the surface profile of the MMAL is symmetric about an axis perpendicular to both the optical axis and the tilting axis. The tilted MMAL can cause non-axis-symmetric aberrations. To obtain the desired focal length and compensate for non-axis-symmetric aberrations, the motion of each micromirror is controlled independently, with two rotational degrees of freedom and one translational degree of freedom.
For an axis-symmetric MMAL, micromirrors at the same radius have the same motion. Therefore, the micromirrors at the same radius are controlled dependently, and the motion of each micromirror can have one rotational degree of freedom and one translational degree of freedom.
Also, an object which does not lie on the optical axis can be imaged by the MMAL without macroscopic mechanical movement of the three-dimensional imaging system.
In order to obtain a color image, the MMAL is controlled to compensate for chromatic aberration by satisfying the phase matching condition for each wavelength of Red, Green, and Blue (RGB), or Yellow, Cyan, and Magenta (YCM), respectively. The three-dimensional imaging system may further comprise a plurality of bandpass filters for color imaging. Also, the three-dimensional imaging system may further comprise a photoelectric sensor comprising Red, Green, and Blue sensors or Yellow, Cyan, and Magenta sensors, wherein color images are obtained by processing the electrical signals from each sensor. The processing of the electrical signals from the sensors is synchronized and/or matched with the control of the MMAL to satisfy the phase matching condition for each wavelength of Red, Green, and Blue or Yellow, Cyan, and Magenta, respectively.
Furthermore, the MMAL can be controlled to satisfy the phase matching condition at an optimal wavelength to minimize chromatic aberration, wherein optimal wavelength phase matching is used to obtain a color image. Alternatively, the MMAL is controlled to satisfy the phase matching condition for the least common multiple of the wavelengths of the Red, Green, and Blue or Yellow, Cyan, and Magenta lights to obtain a color image.
The three-dimensional imaging system can further comprise an optical filter or filters for image quality enhancement.
The three-dimensional imaging system can further comprise an auxiliary lens or group of lenses for image quality enhancement.
The three-dimensional imaging system can further comprise additional MMAL or MMALs for imaging with magnification.
The three-dimensional imaging system can further comprise extra MMAL or MMALs to compensate for aberrations of the imaging system, including chromatic aberration.
In summary, the three-dimensional imaging system of the present invention has the following advantages: (1) the system can make a real-time three-dimensional image because the MMAL has a fast focusing time; (2) the system has a large range of depth because the MMAL has a large focal length variation; (3) the system has a high optical efficiency; (4) the system can have high depth resolution by controlling the focal length of the MMAL in small steps; (5) the cost is low because the MMAL is inexpensive; (6) the system can compensate for aberrations of the imaging system; (7) the system is very simple because there is no macroscopic mechanical displacement or deformation of the lens system; (8) the system is compact; and (9) the system has low power consumption when the MMAL is actuated by electrostatic force.
Although the present invention is briefly summarized herein, the full understanding of the invention can be obtained by the following drawings, detailed description, and appended claims.
These and other features, aspects and advantages of the present invention will become better understood with reference to the accompanying drawings, wherein:
As the MMAL 23 changes its focal length according to selected steps, the imaging unit 24 comprising at least one two-dimensional image sensor captures two-dimensional images 25 at an image plane. The image processing unit 26 extracts the substantially in-focus pixels of each captured image and generates a corresponding depthwise image using the extracted in-focus pixels. Also, depending on the display method of the three-dimensional display system, the image processing unit 26 can generate an all-in-focus image and depth information for each pixel of the all-in-focus image. All the processes are completed within a unit time, which is less than or equal to the persistence rate of the human eye.
The number of focal length change steps is determined by the required depth resolution and the range of depth of the object to be imaged. To provide real-time three-dimensional video images, the whole object is scanned within the unit time, which is less than or equal to the persistence rate of the human eye. The necessary focal length changing rate of the MMAL is the number of required focal length changes multiplied by the persistence rate of the human eye. The image sensing and processing speed needs to be equal to or faster than the focal length change speed of the MMAL to provide real-time three-dimensional imaging. Recent advances in both image sensors and image processing units make them as fast as required.
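The rate arithmetic above can be sketched as follows (a minimal illustration: the 30 Hz frame rate and the derivation of the step count as depth range divided by depth resolution are assumptions, not values from the disclosure):

```python
def required_focal_rate(depth_range_m, depth_resolution_m, frame_rate_hz=30.0):
    """Focal length changes per second needed for real-time depth from focus.

    Number of steps = depth range / depth resolution; the MMAL must sweep
    all steps once per frame, so rate = steps * frame rate.
    """
    steps = int(round(depth_range_m / depth_resolution_m))
    return steps, steps * frame_rate_hz

# Example: a 1 m depth range at 1 cm resolution for 30 Hz video
steps, rate = required_focal_rate(1.0, 0.01, 30.0)
# steps == 100, so the MMAL must change focal length 3000 times per second
```

This illustrates why a fast-focusing lens is essential: even modest depth resolution multiplies the frame rate by the number of steps.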
Alternatively, as shown in
The focal length f of the MMAL 61 is changed by controlling the rotation and/or translation of the micromirrors 64. The MMAL 61 can be operated by controlling rotation only, without controlling translation, even though it then cannot satisfy the phase condition. In this case, the imaging quality of the lens 61 is degraded by aberration. By Fresnel diffraction theory, pure translation without rotation can also satisfy the two imaging conditions; however, the lens generated by controlling translation only has aberration as well. The smaller the size of the micromirrors 64, the less the aberration. Even though the quality of a lens with one motion is lower than that of a lens with both rotation and translation, the lens with one motion has the advantage that its control and fabrication are easier.
It is desirable for each micromirror 64 to have a curvature, because the ideal shape of a conventional reflective lens has a curvature. However, the aberration of a lens with flat micromirrors 64 is not much different from that of a lens with curved micromirrors if the size of each micromirror is small enough. For most applications, flat micromirrors can be used.
As shown in
The
As shown in
The MMAL is an adaptive optical component because the phase of light can be changed by the translations and/or rotations of the micromirrors. As an adaptive optical component, the MMAL can correct phase errors of light introduced by the medium between the object and its image and/or correct defects of a lens system that cause its image to deviate from the rules of paraxial imagery. For example, the MMAL can correct the phase error caused by optical tilt by adjusting the translations and/or rotations of the micromirrors.
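The tilt-correction example can be sketched as follows (an illustration under assumed conventions, not the disclosed control law): in reflection, translating a micromirror by d changes the optical path by 2d, and translations need only be applied modulo half a wavelength, because the phase matching condition holds modulo one wavelength of optical path.

```python
import numpy as np

def piston_corrections(x_positions, tilt_rad, wavelength):
    """Per-micromirror translations that cancel a linear wavefront tilt.

    A tilt of angle `tilt_rad` adds an optical path error of x * tan(tilt)
    across the aperture. In reflection, translating a micromirror by d
    changes the path by 2d, so d = path_error / 2. Translations are
    wrapped modulo half a wavelength, which leaves the phase (mod 2*pi)
    unchanged and keeps the required stroke small.
    """
    path_error = np.asarray(x_positions, dtype=float) * np.tan(tilt_rad)
    d = path_error / 2.0
    return np.mod(d, wavelength / 2.0)
```

The wrapping step is what makes micromirror arrays practical as adaptive lenses: the physical stroke stays below half a wavelength regardless of aperture size, at the cost of assuming monochromatic (or phase-matched) light.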
The phase condition satisfied by the MMAL assumes monochromatic light. Therefore, to obtain a color image, the MMAL of the three-dimensional imaging system is controlled to satisfy the phase condition for each wavelength of Red, Green, and Blue (RGB), or Yellow, Cyan, and Magenta (YCM), respectively, and the three-dimensional imaging system can use bandpass filters to make monochromatic light with the wavelengths of Red, Green, and Blue or Yellow, Cyan, and Magenta.
If a color photoelectric sensor is used as an image sensor in the three-dimensional imaging system using a variable focal length MMAL, a color image can be obtained by processing electrical signals from Red, Green, and Blue sensors with or without bandpass filters, which should be synchronized with the control of the MMAL.
To image the Red light scattered from an object, the MMAL is controlled to satisfy the phase condition for the Red light while the Red, Green, and Blue image sensors measure the intensity of the Red, Green, and Blue light scattered from the object. Among them, only the intensity of the Red light is stored as image data, because only the Red light is imaged properly. To image the Green and Blue light, the MMAL and the imaging sensors work in the same manner as for the Red light. Therefore, the MMAL is synchronized with the Red, Green, and Blue imaging sensors.
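The synchronization described above can be sketched as a simple sequential loop (`mmal.set_phase_matching` and `sensor.expose` are hypothetical interfaces standing in for hardware control, not part of the disclosure):

```python
def capture_color_frame(mmal, sensor, channels=("red", "green", "blue")):
    """Sequentially image each color channel in sync with the MMAL.

    For each channel, the MMAL is first set to satisfy the phase matching
    condition for that wavelength; the sensor then exposes all channels,
    but only the properly imaged channel is kept as image data.
    """
    frame = {}
    for channel in channels:
        mmal.set_phase_matching(channel)   # phase condition for this wavelength
        raw = sensor.expose()              # per-channel intensities from the sensor
        frame[channel] = raw[channel]      # keep only the in-focus channel
    return frame
```

In a real system the same loop would be nested inside the focal length sweep, so each depthwise image is itself built from three phase-matched exposures.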
While the invention has been shown and described with reference to different embodiments thereof, it will be appreciated by those skilled in the art that variations in form, detail, composition, and operation may be made without departing from the spirit and scope of the invention as defined by the appended claims.
This application is a continuation-in-part of, and claims priority to U.S. patent application Ser. No. 10/822,414 filed Apr. 12, 2004, U.S. patent application Ser. No. 10/855,715 filed May 27, 2004, U.S. patent application Ser. No. 10/872,241 filed Jun. 18, 2004, U.S. patent application Ser. No. 10/893,039 filed Jul. 16, 2004, U.S. patent application Ser. No. 10/896,146 filed Jul. 21, 2004, U.S. patent application Ser. No. 10/979,612 filed Nov. 2, 2004, U.S. patent application Ser. No. 10/983,353 filed Nov. 8, 2004, U.S. patent application Ser. No. 11/072,597 filed Mar. 4, 2005, U.S. patent application Ser. No. 11/076,616 filed Mar. 10, 2005, U.S. patent application Ser. No. 11/191,886 filed Jul. 28, 2005, U.S. patent application Ser. No. 11/208,115 filed Aug. 19, 2005, U.S. patent application Ser. No. 11/218,814 filed Sep. 2, 2005, U.S. patent application Ser. No. 11/286,971 filed Nov. 23, 2005, U.S. patent application Ser. No. 11/294,944 filed Dec. 6, 2005, U.S. patent application Ser. No. 11/300,205 filed Dec. 13, 2005, U.S. patent application Ser. No. 11/314,408 filed Dec. 20, 2005, U.S. patent application Ser. No. 11/319,987 filed Dec. 28, 2005, U.S. patent application Ser. No. 11/341,214 filed Jan. 28, 2006, and U.S. patent application Ser. No. 11/369,797 filed Mar. 6, 2006, all of which are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
2002376 | Mannheimer | May 1935 | A |
4407567 | Michelet | Oct 1983 | A |
4834512 | Austin | May 1989 | A |
5004319 | Smither | Apr 1991 | A |
5212555 | Stoltz | May 1993 | A |
5369433 | Baldwin | Nov 1994 | A |
5402407 | Eguchi | Mar 1995 | A |
5467121 | Allcock | Nov 1995 | A |
5612736 | Vogeley | Mar 1997 | A |
5696619 | Knipe | Dec 1997 | A |
5881034 | Mano | Mar 1999 | A |
5897195 | Choate | Apr 1999 | A |
5986811 | Wohlstadter | Nov 1999 | A |
6025951 | Swart | Feb 2000 | A |
6028689 | Michaliek | Feb 2000 | A |
6064423 | Geng | May 2000 | A |
6084843 | Abe | Jul 2000 | A |
6104425 | Kanno | Aug 2000 | A |
6111900 | Suzudo | Aug 2000 | A |
6123985 | Robinson | Sep 2000 | A |
6282213 | Gutin et al. | Aug 2001 | B1 |
6315423 | Yu | Nov 2001 | B1 |
6329737 | Jerman | Dec 2001 | B1 |
6498673 | Frigo | Dec 2002 | B1 |
6507366 | Lee | Jan 2003 | B1 |
6549730 | Hamada | Apr 2003 | B1 |
6625342 | Staple et al. | Sep 2003 | B2 |
6649852 | Chason | Nov 2003 | B2 |
6650461 | Atobe | Nov 2003 | B2 |
6658208 | Watanabe | Dec 2003 | B2 |
6711319 | Hoen | Mar 2004 | B2 |
6741384 | Martin | May 2004 | B1 |
6784771 | Fan | Aug 2004 | B1 |
6833938 | Nishioka | Dec 2004 | B2 |
6885819 | Shinohara | Apr 2005 | B2 |
6900901 | Harada | May 2005 | B2 |
6900922 | Aubuchon | May 2005 | B2 |
6906848 | Aubuchon | Jun 2005 | B2 |
6906849 | Mi | Jun 2005 | B1 |
6914712 | Kurosawa | Jul 2005 | B2 |
6919982 | Nimura | Jul 2005 | B2 |
6934072 | Kim | Aug 2005 | B1 |
6934073 | Kim | Aug 2005 | B1 |
6943950 | Lee | Sep 2005 | B2 |
6958777 | Pine | Oct 2005 | B1 |
6970284 | Kim | Nov 2005 | B1 |
6995909 | Hayashi | Feb 2006 | B1 |
6999226 | Kim | Feb 2006 | B2 |
7023466 | Favalora | Apr 2006 | B2 |
7031046 | Kim | Apr 2006 | B2 |
7046447 | Raber | May 2006 | B2 |
7068416 | Gim | Jun 2006 | B2 |
7077523 | Seo et al. | Jul 2006 | B2 |
7161729 | Kim | Jan 2007 | B2 |
20020018407 | Komoto | Feb 2002 | A1 |
20020102102 | Watanabe | Aug 2002 | A1 |
20020135673 | Favalora | Sep 2002 | A1 |
20030058520 | Yu | Mar 2003 | A1 |
20030071125 | Yoo | Apr 2003 | A1 |
20030174234 | Kondo | Sep 2003 | A1 |
20030184843 | Moon | Oct 2003 | A1 |
20040009683 | Hiraoka | Jan 2004 | A1 |
20040012460 | Cho | Jan 2004 | A1 |
20040021802 | Yoshino | Feb 2004 | A1 |
20040052180 | Hong | Mar 2004 | A1 |
20040246362 | Konno | Dec 2004 | A1 |
20040252958 | Abu-Ageel | Dec 2004 | A1 |
20050024736 | Bakin | Feb 2005 | A1 |
20050057812 | Raber | Mar 2005 | A1 |
20050136663 | Terence Gan | Jun 2005 | A1 |
20050174625 | Huiber | Aug 2005 | A1 |
20050180019 | Cho | Aug 2005 | A1 |
20050212856 | Temple | Sep 2005 | A1 |
20050224695 | Mushika | Oct 2005 | A1 |
20050225884 | Gim | Oct 2005 | A1 |
20050231792 | Alain | Oct 2005 | A1 |
20050264226 | Cho | Dec 2005 | A1 |
20050264867 | Cho | Dec 2005 | A1 |
20050264870 | Kim | Dec 2005 | A1 |
20060012766 | Klosner | Jan 2006 | A1 |
20060012852 | Cho | Jan 2006 | A1 |
20060028709 | Cho | Feb 2006 | A1 |
20060187524 | Sandstrom | Aug 2006 | A1 |
20060232498 | Seo et al. | Oct 2006 | A1 |
Number | Date | Country |
---|---|---|
08-043881 | Feb 1996 | JP |
11-069209 | Mar 1999 | JP |
2002-288873 | Oct 2002 | JP |
Number | Date | Country | |
---|---|---|---|
20060209439 A1 | Sep 2006 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 10822414 | Apr 2004 | US |
Child | 11382707 | US | |
Parent | 10855715 | May 2004 | US |
Child | 10822414 | US | |
Parent | 10872241 | Jun 2004 | US |
Child | 10855715 | US | |
Parent | 10893039 | Jul 2004 | US |
Child | 10872241 | US | |
Parent | 10896146 | Jul 2004 | US |
Child | 10893039 | US | |
Parent | 10979612 | Nov 2004 | US |
Child | 10896146 | US | |
Parent | 10983353 | Nov 2004 | US |
Child | 10979612 | US | |
Parent | 11072597 | Mar 2005 | US |
Child | 10983353 | US | |
Parent | 11076616 | Mar 2005 | US |
Child | 11072597 | US | |
Parent | 11191886 | Jul 2005 | US |
Child | 11076616 | US | |
Parent | 11208115 | Aug 2005 | US |
Child | 11191886 | US | |
Parent | 11218814 | Sep 2005 | US |
Child | 11208115 | US | |
Parent | 11286971 | Nov 2005 | US |
Child | 11218814 | US | |
Parent | 11294944 | Dec 2005 | US |
Child | 11286971 | US | |
Parent | 11300205 | Dec 2005 | US |
Child | 11294944 | US | |
Parent | 11314408 | Dec 2005 | US |
Child | 11300205 | US | |
Parent | 11319987 | Dec 2005 | US |
Child | 11314408 | US | |
Parent | 11341214 | Jan 2006 | US |
Child | 11319987 | US | |
Parent | 11369797 | Mar 2006 | US |
Child | 11341214 | US |