High-resolution digital cameras are able to resolve small features within a scene from a large distance. Such cameras are useful for creating detailed panoramic, or wide-angle, images of a scene. The resolution of a digital image is limited by the number of pixels (photo-sensing elements) in the camera's imaging sensor. As the number of pixels in an imaging sensor increases, the cost and size of the imaging sensor also increase, typically exponentially with the number of pixels. State-of-the-art 50 megapixel (MP) imaging sensors are about 20-25 mm by 14-36 mm in size and typically cost upwards of $20,000. On the other hand, a 1.3 MP imaging sensor may cost as little as $2, and five 10 MP imaging sensors may cost as little as $50.
One method of making high-resolution images using a camera with a smaller number of pixels is to capture, with a single camera, a number of images at locations that cover the desired field of view and then combine the images to form a high-resolution composite, or mosaic, image. However, because time elapses between starting and ending the image capture, moving objects in the scene lead to undesirable effects. For example, if a person is walking down the street while a mosaic of the street is captured, that person may appear multiple times in the mosaic. Moving objects may also be split both horizontally and vertically, depending on their speed and position.
Another high-resolution imaging method employs an optical system that includes a number of cameras positioned to cover a desired field of view. The cameras may be synchronized to capture the image at the same time, and the images may then be combined. However, each camera has an optical axis that is displaced from that of its neighboring camera. As a result, each lens has a different perspective of the subject. In order to create a seamless composite image, the perspectives of the captured images must be corrected so that they match. Existing methods to correct perspective involve enlarging or reducing portions of an image using interpolation, which results in a loss of sharpness. Therefore, there is a need for an improved, low-cost, high-resolution digital imaging system.
The systems and methods described herein relate to high resolution imaging. In particular, the systems include two or more lens assemblies for imaging a particular scene. Each lens assembly has image sensors disposed behind the lens assembly to image only a portion of the scene viewable through the lens assembly. Image sensors behind different lens assemblies image different portions of the scene. When the imaged portions from all the sensors are combined, a high resolution image of the scene is formed. Thus, multiple sensors can be combined into a high resolution image sensor without suffering the shortcomings associated with requiring each of the sensors to be positioned adjacent to each other, namely, image quality deterioration near the border regions of each sensor because of the constraints imposed by packaging of the individual sensors.
According to one aspect of the invention, a system for imaging a scene is provided. The system may include a first lens assembly with a first field of view and a second lens assembly with a second field of view. The first field of view may be substantially the same as the second field of view. The system may also include a first sensor disposed behind the first lens assembly to image only a portion of the first field of view and a second sensor disposed behind the second lens assembly to image only a portion of the second field of view. The imaged portion of the first field of view may be substantially different from the imaged portion of the second field of view. In certain embodiments, the first lens assembly includes a first optical axis, and the second lens assembly includes a second optical axis, and the first optical axis is substantially parallel to the second optical axis.
In some embodiments, an active imaging area of the first sensor may be smaller than the first field of view. In certain embodiments, the first sensor may be disposed in a first focal area of the first lens assembly.
According to another aspect of the invention, a system for imaging a scene is provided. The system may include a plurality of lens assemblies, each with substantially the same field of view. The system may also include a plurality of image sensors, each disposed behind one of the plurality of lens assemblies to image only a portion of the field of view of the respective lens assembly. Each imaged portion of the field of view may be substantially different from the other imaged portions of the field of view, and every portion of the entire field of view may be included in at least one imaged portion. In certain embodiments, each of the plurality of lens assemblies includes an optical axis and each of the plurality of optical axes are substantially parallel to each other.
In some embodiments, the active imaging area of one of the image sensors disposed behind one of the lens assemblies may be smaller than the field of view of the respective lens assembly. In other embodiments, the active imaging area of one of the image sensors may be substantially the same size as the field of view of the respective lens assembly. In certain embodiments, the plurality of image sensors may include a first sensor, the plurality of lens assemblies may include a first lens assembly, and the first sensor may be disposed in a first focal area of the first lens assembly. Optionally, the plurality of image sensors may also include a second sensor, and the plurality of lens assemblies may include a second lens assembly.
In any of the aspects and embodiments described above, the first focal area may be divided into at least a first focal area portion and a second focal area portion, and the first sensor may be disposed in the first focal area portion. Optionally, the active imaging area of the first sensor may be substantially the same size as the first focal area portion.
In some of the aspects and embodiments described above, the first focal area may be divided into an imaging array of imaging cells disposed in rows and columns, where the first focal area portion may correspond to a first imaging cell and the second focal area portion may correspond to a second imaging cell. In any of the aspects and embodiments described above, the second lens assembly may have a second focal area divided into at least a third focal area portion and a fourth focal area portion. The third focal area portion may have substantially the same field of view as the first focal area portion, and the fourth focal area portion may have substantially the same field of view as the second focal area portion. In these embodiments, the second sensor may be disposed in the fourth focal area portion. In some of the aspects and embodiments described above, one or more other sensors may be disposed behind the first lens assembly, and each sensor behind the first lens assembly may not be contiguous to any other sensor.
According to yet another aspect of the invention, a method of imaging a scene is provided. A first portion of the scene may be imaged with a first sensor array assembly having a first field of view. A second portion of the scene, substantially different from the first portion of the scene, may be imaged with a second sensor array assembly having a second field of view. The second field of view may be substantially the same as the first field of view. A processor may combine at least the first portion and the second portion to generate an image of the scene.
In some embodiments, the first sensor array assembly may image the first portion of the scene through a first lens assembly with the first field of view and the second sensor array assembly may image the second portion of the scene through a second lens assembly with the second field of view. In certain embodiments, the imaged first portion of the scene may include only incontiguous sections of the scene. In some embodiments, the imaged second portion of the scene may include only incontiguous sections of the scene, at least one of which is different from one of the incontiguous sections in the imaged first portion of the scene. Optionally, at least one of the incontiguous sections of the imaged first portion is substantially contiguous to at least one of the incontiguous sections of the imaged second portion. In certain embodiments, at least one of the incontiguous sections of the imaged first portion partially overlaps with at least one of the incontiguous sections of the imaged second portion.
In some embodiments, the first sensor assembly may be disposed adjacent to the second sensor assembly. Optionally, the first and second sensor assemblies may be disposed such that there is a gap between the two sensor assemblies.
In any of the aspects and embodiments described above, the first lens assembly may have a nonplanar focal surface, and the curvature of the first focal area may substantially match the curvature of the nonplanar focal surface. In some of the aspects and embodiments described above, the first sensor may have a sensor plane different from the curvature of the first focal area, and may be disposed such that the sensor plane is perpendicular to the chief ray of the first lens assembly at the first focal area. Optionally, light from the first lens assembly to the first sensor may be refracted before it reaches the first sensor such that the chief ray of the light is perpendicular to a sensor plane of the first sensor.
The foregoing and other objects and advantages of the invention will be appreciated more fully from the following further description thereof, with reference to the accompanying drawings.
The systems and methods described herein relate to high resolution imaging. In particular, the systems include two or more lens assemblies for imaging a particular scene. Each lens assembly has image sensors disposed behind the lens assembly to image only a portion of the scene viewable through the lens assembly. Image sensors behind different lens assemblies image different portions of the scene. When the imaged portions from all the sensors are combined, a high resolution image of the scene is formed. Thus, multiple sensors can be combined into a high resolution image sensor without the shortcomings associated with the border regions and packaging of the individual sensors, such as image gaps.
In certain embodiments, the system includes a plurality of lenses arranged in a one- or two-dimensional array, each lens having a focal area (i.e., a portion of its focal plane) that may be larger than an individual imaging sensor. A plurality of imaging sensors may be located behind each lens to cover that lens's focal area and capture the entire field of view. The field of view, or focal area, captured behind a lens may be represented by an array having rows and columns of cell regions. Each cell region in this array may be sized to match the size of the active imaging area of an imaging sensor. In addition to an active imaging area, an imaging sensor may include a black-level correction boundary region and/or an imaging sensor package. Thus, the active imaging area of an imaging sensor may be substantially smaller than the overall size, or footprint, of the imaging sensor itself. Because of this disparity between the active imaging area and the overall footprint, it may be a challenge to place multiple imaging sensors in adjacent cell regions of the focal area cell array behind a particular lens.
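The packaging constraint above can be illustrated numerically; the figures below are assumptions for illustration, not values from this specification:

```python
# Illustrative numbers only (not from the specification): an imaging sensor
# with a 10 mm active area inside a 16 mm package. Adjacent cell regions are
# one active-area width apart, so two packages in adjacent cells would have
# to physically overlap.
active_width = 10.0   # mm, active imaging area width = width of one cell region
package_width = 16.0  # mm, overall sensor footprint including border and packaging

# Packages collide whenever the footprint exceeds the cell pitch.
adjacent_fit = package_width <= active_width
assert not adjacent_fit  # adjacent placement is impossible for this sensor
```

This is why the sparse sensor placement described next becomes necessary whenever the footprint exceeds the active area.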
In certain embodiments, the imaging sensors behind each lens may be arranged in a sparse array, in which each sensor is placed in a cell region that is not adjacent to any cell region containing another sensor, resulting in an array of incontiguous sensors. In these embodiments, the focal area array of a particular lens contains fewer imaging sensors than there are cells within the array. These focal area arrays of incontiguous sensors may be known as sparse arrays. The sparse arrays behind different lenses may have different configurations. For example, the sparse array behind one lens may have more or fewer sensors than the sparse array behind another lens. Also, the sparse arrays behind different lenses may be arranged in different physical configurations. For example, the sparse array behind a first lens may have sensors arranged in certain locations on the focal area of the first lens, while the sparse array behind another lens may have sensors arranged in different positions complementary to the positions of the sensors behind the first lens. One advantage of this approach is that the perspective between “adjacent” imaging sensors may be matched, even if the “adjacent” sensors are actually positioned behind different lenses and are not contiguous.
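The complementary arrangement described above can be sketched as follows. The checkerboard pattern and the 4×4 grid are illustrative assumptions; the specification allows any complementary configuration of sparse arrays:

```python
# Sketch of two complementary sparse arrays: two lenses share the same field
# of view, each carries sensors only in non-adjacent cells, and together the
# two patterns tile the full focal-area grid.

def sparse_layout(rows, cols, parity):
    """Return the set of (row, col) cells holding a sensor.

    parity=0 and parity=1 give complementary checkerboard patterns:
    no cell with parity 0 is horizontally or vertically adjacent to
    another parity-0 cell.
    """
    return {(r, c) for r in range(rows) for c in range(cols)
            if (r + c) % 2 == parity}

rows, cols = 4, 4
lens_a = sparse_layout(rows, cols, 0)  # sensors behind the first lens
lens_b = sparse_layout(rows, cols, 1)  # complementary positions, second lens

# No cell is imaged behind both lenses, and every cell is covered exactly once.
assert lens_a.isdisjoint(lens_b)
assert lens_a | lens_b == {(r, c) for r in range(rows) for c in range(cols)}
```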
In certain embodiments, a plurality of imaging sensors may be arranged adjacent to each other in a row, or in an array, to increase the resolution of the imaging sensor system. In such a configuration, the plurality of sensors may be smaller than the focal area of the lens and therefore configured to capture a portion of the field of view captured by the lens.
To reduce the gap, a sensor 204d as shown in
In some embodiments, the system includes a plurality of lenses arranged in a one or two dimensional array, each lens having a focal plane that may be larger than an individual imaging sensor.
In certain embodiments, a composite image may be created by combining one or more images into a single image having limited, if any, visible seams. In certain embodiments, the first step in combining such images is to align the images spatially using calibration data that describes the overlap regions between adjacent imaging sensors. In such embodiments the exposures and the color balance may be blended. Next, the system may correct for perspective (if required) and optical distortion between adjacent images.
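The alignment-and-blend step can be sketched as below. The purely translational offsets and simple averaging of overlaps are assumptions for illustration; the actual system may use richer calibration data, exposure and color blending, and the perspective and distortion corrections described above:

```python
import numpy as np

def composite(tiles, offsets, height, width):
    """Place grayscale tiles at calibrated (row, col) offsets, averaging overlaps."""
    acc = np.zeros((height, width))
    weight = np.zeros((height, width))
    for tile, (y, x) in zip(tiles, offsets):
        h, w = tile.shape
        acc[y:y + h, x:x + w] += tile       # accumulate pixel values
        weight[y:y + h, x:x + w] += 1.0     # count how many tiles cover each pixel
    weight[weight == 0] = 1.0               # uncovered gaps stay zero (avoid /0)
    return acc / weight

# Two 2x2 tiles overlapping by one column; the shared column is averaged.
tiles = [np.ones((2, 2)), np.full((2, 2), 3.0)]
mosaic = composite(tiles, [(0, 0), (0, 1)], 2, 3)
# mosaic is [[1, 2, 3], [1, 2, 3]]: the overlap column averages 1 and 3.
```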
In certain embodiments, the lenses and/or imaging sensors may be arranged in two-dimensional arrays.
The imaging systems described herein may include any number of lenses and sensors as desired without departing from the scope of the invention. The lenses and sensors may be positioned in arrays of any desired dimension. The focal areas of one or more lenses may be divided into any number of regions depending on, among other things, the desired resolution of the image, the number and/or specifications of the imaging sensors, the number and/or specifications of the lenses, and other components and considerations. Lenses may have focal areas of any shape, and imaging sensors may be mounted so that light exiting the lens impinges on the image sensor surface in a desired manner. Image sensor mounting positions and angles may be chosen to reduce color crosstalk and vignetting in the composite image.
In certain embodiments, if the total size of the imaging sensor (i.e., the total size of the imaging sensor 104 including active area 112 and border region 110) is smaller than twice the active area size in both dimensions, then the imaging sensors may be spaced at intervals equal to the dimensions of the active area. Positioning imaging sensors in this manner results in the minimum number of lenses required to cover a desired field of view. An example of such an embodiment is illustrated in
In certain embodiments, the ratio of sensors to lenses depends on the active area size, the total sensor size, and the field of view to be covered. The ratio of sensors to lenses may depend on other aspects of the systems and methods described herein, without departing from the scope of the invention. In certain embodiments, if either of the total sensor size dimensions exceeds about twice the corresponding active area dimension, the sensor spacing may need to be increased to the next integral multiple of the corresponding active area size, and more lenses will be required to map to the sparser array of imaging sensors.
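One plausible formalization of this spacing rule, with hypothetical names and numbers (the specification does not prescribe a formula): the sensor-spacing multiple along each axis is the sensor footprint divided by the active-area size, rounded up, and the same multiple gives the number of lenses needed along that axis.

```python
import math

def lenses_per_dimension(active_mm, total_mm):
    """Sensor-spacing multiple (and lenses needed) along one axis."""
    return math.ceil(total_mm / active_mm)

def lenses_required(active, total):
    """Total lenses needed so the combined sparse arrays tile the field of view."""
    return (lenses_per_dimension(active[0], total[0])
            * lenses_per_dimension(active[1], total[1]))

# Footprint under twice the active area in both dimensions: 2 x 2 = 4 lenses,
# the minimum-lens case described earlier.
assert lenses_required((10.0, 8.0), (18.0, 14.0)) == 4

# One footprint dimension over twice the active area: spacing grows to the
# next integral multiple, so 3 x 2 = 6 lenses are needed.
assert lenses_required((10.0, 8.0), (25.0, 14.0)) == 6
```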
For many lenses, the focal plane may be a spherical surface. In certain embodiments, image sensors may be mounted on a flat substrate or a curved substrate that matches the field curvature of the lens and results in more properly focused images. In particular,
In other embodiments, the image sensors can also be mounted normal to the chief ray of the lens at the image sensor position in the focal area. For example,
In another embodiment, image sensors can be mounted on a flat substrate and a prism can be mounted on the image sensor to bend the incident rays and make them normal to the image sensor.
Light meters 808a and 808b are connected to the sensors 802a and 802b for determining incident light on the sensors. The light meters 808a and 808b and the sensors 802a and 802b are connected to exposure circuitry 810. The exposure circuitry 810 is configured to determine an exposure value for each of the sensors 802a and 802b. In certain embodiments, the exposure circuitry 810 determines the best exposure value for a sensor for imaging a given scene. The exposure circuitry 810 is optionally connected to miscellaneous mechanical and electronic shuttering systems 818 for controlling the timing and intensity of incident light and other electromagnetic radiation on the sensors 802a and 802b. The sensors 802a and 802b may optionally be coupled with one or more filters 822. In certain embodiments, filters 822 may preferentially amplify or suppress incoming electromagnetic radiation in a given frequency range.
In some embodiments, imaging system 800 may include mechanisms (not shown) to actuate one or more of sensors 802a and 802b. For example, imaging system 800 may include mechanisms to tilt or slide sensors 802a and/or 802b with respect to each other, the lens focal plane, or any other suitable axis. In certain embodiments, imaging system 800 may include one or more refractors (not shown), such as prism 706 (
In certain embodiments, sensor 802a includes an array of photosensitive elements (or pixels) 806a distributed in an array of rows and columns. The sensor 802a may include a charge-coupled device (CCD) imaging sensor. In certain embodiments, the sensor 802a includes a CMOS imaging sensor. In certain embodiments, the sensor 802b is similar to the sensor 802a. The sensor 802b may include a CCD and/or CMOS imaging sensor. The sensors 802a and 802b may be positioned adjacent to each other, either vertically or horizontally. The sensors 802a and 802b may be included in an optical head of an imaging system. In certain embodiments, the sensors 802a and 802b may be configured, positioned or oriented to capture different fields-of-view of a scene. The sensors 802a and 802b may be angled depending on the desired extent of the field-of-view. During operation, incident light from a scene being captured may fall on the sensors 802a and 802b. In certain embodiments, the sensors 802a and 802b may be coupled to a shutter and, when the shutter opens, the sensors 802a and 802b are exposed to light. The light may then be converted to a charge in each of the photosensitive elements 806a and 806b.
The sensors can be of any suitable type and may include CCD imaging sensors, CMOS imaging sensors, or any other analog or digital imaging sensor. The sensors may be color sensors. The sensors may be responsive to electromagnetic radiation outside the visible spectrum, and may include thermal, gamma, multi-spectral and x-ray sensors. The sensors, in combination with other components in the imaging system 700, may generate a file in any suitable format, such as raw data, GIF, JPEG, TIFF, PBM, PGM, PPM, EPSF, X11 bitmap, Utah Raster Toolkit RLE, PDS/VICAR, Sun Rasterfile, BMP, PCX, PNG, IRIS RGB, XPM, Targa, XWD, PostScript, or PM formats used on workstations and terminals running the X11 Window System, or any other image file format suitable for import into the data processing system. Additionally, the system may be employed for generating video images, including digital video images in the .AVI, .WMV, .MOV, .RAM and .MPG formats.
In certain embodiments, once the shutter closes, light is blocked and the charge may then be transferred from an imaging sensor and converted into an electrical signal. In such embodiments, charge from each column is transferred along the column to an output amplifier 812, a technique typically referred to as a rolling shutter. The term “rolling shutter” may also be used to refer to other processes which generally occur column-wise or row-wise at each sensor, including charge transfer and exposure adjustment. Charge may first be transferred from each pixel in the columns 804a and 804b. In certain embodiments, after this is completed, charges from the columns 824a and 824b (adjacent to the columns 804a and 804b, respectively) are transferred to the columns 804a and 804b, respectively, and then transferred along the columns 804a and 804b to the output amplifier 812. Similarly, charges from each of the remaining columns are moved over by one column towards the columns 804a and 804b and then transferred to the output amplifier 812. The process may repeat until all or substantially all charges are transferred to the output amplifier 812.
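The column-wise transfer can be modeled with a toy simulation. The NumPy representation and function name are illustrative only; in a real sensor this shifting happens in the analog domain on the chip itself:

```python
import numpy as np

def column_readout(charge):
    """Read a 2-D charge array out column by column.

    On each step, the readout column goes to the output amplifier and the
    remaining columns each shift over by one, mirroring the transfer
    sequence described above. Returns the columns in readout order.
    """
    charge = charge.copy()
    out = []
    for _ in range(charge.shape[1]):
        out.append(charge[:, 0].copy())       # readout column -> output amplifier
        charge = np.roll(charge, -1, axis=1)  # remaining columns shift by one
        charge[:, -1] = 0                     # trailing column is now empty
    return out

frame = np.arange(6).reshape(2, 3)  # a tiny 2-row, 3-column "sensor"
columns = column_readout(frame)
# columns[i] equals the i-th column of the original frame.
```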
In a further embodiment, the rolling shutter's column-wise transfer of charge is achieved by orienting a traditional imaging sensor vertically. Generally, traditional imaging sensors are designed for row-wise transfer of charge, instead of a column-wise transfer as described above. However, these traditional imaging sensors may be oriented on their sides such that rows now function as columns and allow for column-wise transfer. The output amplifier 812 may be configured to transfer charges and/or signals to a processor 814.
The processor 814 may include microcontrollers and microprocessors programmed to receive data from the output amplifier 812 and exposure values from the exposure circuitry 810, and to determine interpolated exposure values for each column in each of the sensors 802a and 802b. In particular, processor 814 may include a central processing unit (CPU), a memory, and an interconnect bus (not shown). The CPU may include a single microprocessor or a plurality of microprocessors for configuring the processor 814 as a multi-processor system. The memory may include a main memory and a read-only memory. The processor 814 may also be coupled to mass storage 816 having, for example, various disk drives, tape drives, FLASH drives, etc. The main memory may also include dynamic random access memory (DRAM) and high-speed cache memory. In operation, the main memory stores at least portions of instructions and data for execution by the CPU.
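One plausible reading of "interpolated exposure values for each column" is a linear ramp between the exposure chosen for one sensor and that of its neighbor, so that brightness varies smoothly across the seam. This sketch and its names are assumptions, not the specification's stated method:

```python
import numpy as np

def column_exposures(exposure_a, exposure_b, n_cols):
    """Linearly interpolate a per-column exposure value across one sensor.

    exposure_a is the value at the sensor's first column, exposure_b the
    value at its last column (e.g., the neighboring sensor's exposure).
    """
    return np.linspace(exposure_a, exposure_b, n_cols)

vals = column_exposures(1.0, 2.0, 5)
# vals ramps evenly from 1.0 at the first column to 2.0 at the last.
```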
The mass storage 816 may include one or more magnetic disk or tape drives or optical disk drives for storing data and instructions for use by the processor 814. At least one component of the mass storage system 816, possibly in the form of a disk drive or tape drive, stores the database used for processing the signals measured from the sensors 802a and 802b. The mass storage system 816 may also include one or more drives for various portable media, such as a floppy disk, a compact disc read-only memory (CD-ROM), a DVD, or an integrated circuit non-volatile memory adapter (e.g., a PCMCIA adapter), to input and output data and code to and from the processor 814.
The processor 814 may also include one or more input/output interfaces for data communications. The data interface may be a modem, a network card, serial port, bus adapter, or any other suitable data communications mechanism for communicating with one or more local or remote systems. The data interface may provide a relatively high-speed link to a network, such as the Internet. The communication link to the network may be, for example, optical, wired, or wireless (e.g., via satellite or cellular network). Alternatively, the processor 814 may include a mainframe or other type of host computer system capable of communications via the network.
The processor 814 may also include suitable input/output ports or use the interconnect bus for interconnection with other components, a local display 820, and keyboard or other local user interface for programming and/or data retrieval purposes (not shown).
In certain embodiments, the processor 814 includes circuitry for an analog-to-digital converter and/or a digital-to-analog converter. In such embodiments, the analog-to-digital converter circuitry converts analog signals received at the sensors to digital signals for further processing by the processor 814.
The components of the processor 814 are those typically found in imaging systems intended for both portable and fixed use. In certain embodiments, the processor 814 includes general purpose computer systems used as servers, workstations, personal computers, network terminals, and the like. Certain aspects of the systems and methods described herein may relate to the software elements, such as the executable code and database for the server functions of the imaging system 800.
Generally, the methods described herein may be executed on a conventional data processing platform, such as an IBM PC-compatible computer running a Windows operating system, a Sun workstation running a UNIX operating system, or another equivalent personal computer or workstation. Alternatively, the data processing system may comprise a dedicated processing system that includes an embedded programmable data processing unit.
Certain embodiments of the systems and processes described herein may also be realized as a software component operating on a conventional data processing system such as a UNIX workstation. In such embodiments, the processes may be implemented as a computer program written in any of several languages well-known to those of ordinary skill in the art, such as (but not limited to) C, C++, FORTRAN, Java or BASIC. The processes may also be executed on commonly available clusters of processors, such as Western Scientific Linux clusters, which may allow parallel execution of all or some of the steps in the process.
Certain embodiments of the methods described herein may be performed in either hardware, software, or any combination thereof, as those terms are currently known in the art. In particular, these methods may be carried out by software, firmware, or microcode operating on a computer or computers of any type, including pre-existing or already-installed image processing facilities capable of supporting any or all of the processor's functions. Additionally, software embodying these methods may comprise computer instructions in any form (e.g., source code, object code, interpreted code, etc.) stored in any computer-readable medium (e.g., ROM, RAM, magnetic media, punched tape or card, compact disc (CD) in any form, DVD, etc.). Furthermore, such software may also be in the form of a computer data signal embodied in a carrier wave, such as that found within the well-known Web pages transferred among devices connected to the Internet. Accordingly, these methods and systems are not limited to any particular platform, unless specifically stated otherwise in the present disclosure.
The systems described herein may include additional electronic, electrical, and optical hardware and software elements for capturing images without departing from the scope of the invention. For example, the system may include single-shot systems, which, in turn, may include one or more color filters coupled with the imaging sensors (e.g., CCD or CMOS). In certain embodiments, the imaging sensor is exposed to the light from a scene a desired number of times. The system may be configured to capture images using one or more imaging sensors with a Bayer filter mosaic, or three or more imaging sensors (for one or more spectral bands) which are exposed to the same image via a beam splitter. In another embodiment, the system includes multi-shot systems in which the sensor may be exposed to light from a scene in a sequence of three or more openings of the lens aperture. In such embodiments, one or more imaging sensors may be combined with three filters (e.g., red, green, and blue) passed in front of the sensor in sequence to obtain the additive color information. In other embodiments, the systems described herein may be combined with computer systems for operating the lenses and/or sensors and processing captured images.
In step 904, a second image, or a second portion of the scene, is captured by a second sensor array. The second sensor array may also be a sparse sensor array that includes one or more sensors disposed in one or more cells of a second imaging array. In some embodiments, the number of cells in the second imaging array may exceed the number of sensors disposed in the second array. If the second sensor array includes more than one sensor, the sensors may be disposed in nonadjacent cells of the second imaging array. In these embodiments, the captured second portion of the scene may include two or more nonadjacent, or incontiguous, sections of the scene. In certain embodiments, the one or more sensors in the second imaging array may be disposed in cells that are adjacent to cells corresponding to the sensor-containing cells of the first imaging array. Thus, while the captured second portion of the scene may include only incontiguous sections of the scene, these sections may be contiguous to sections of the scene in the captured first portion.
In step 906, the captured first and second portions of the scene may be combined to form a high-resolution image of the scene by, for example, the processor 814.
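Step 906 can be sketched under simplifying assumptions: each sparse capture is held as a full-size array that is valid only in the cells its sensors imaged (NaN elsewhere), and two complementary captures are merged cell by cell. This representation is an illustration, not the specification's data model:

```python
import numpy as np

def combine_sparse(first, second):
    """Merge two complementary sparse captures into one image.

    Each input is a full-size array with NaN marking cells that capture's
    sensors did not image; every cell must be covered by at least one input.
    """
    out = np.where(np.isnan(first), second, first)
    if np.isnan(out).any():
        raise ValueError("some scene cells were imaged by neither capture")
    return out

# Complementary 2x2 captures: each images the cells the other missed.
first = np.array([[1.0, np.nan], [np.nan, 4.0]])
second = np.array([[np.nan, 2.0], [3.0, np.nan]])
combined = combine_sparse(first, second)
# combined is [[1, 2], [3, 4]]: the full scene, assembled from both captures.
```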
Those skilled in the art will know, or be able to ascertain using no more than routine experimentation, many equivalents to the embodiments and practices described herein. Variations, modifications, and other implementations of what is described may be employed without departing from the spirit and scope of the invention. More specifically, any of the method, system, and device features described above or incorporated by reference may be combined with any other suitable method, system, or device features disclosed herein or incorporated by reference, and such combinations are within the scope of the contemplated inventions. The systems and methods may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The foregoing embodiments are therefore to be considered in all respects illustrative, rather than limiting, of the invention. The teachings of all references cited herein are hereby incorporated by reference in their entirety.
This application claims the benefit of U.S. Provisional Patent Application No. 61/197,204, entitled “Systems And Methods For High Resolution Imaging” filed on Oct. 24, 2008, the entire disclosure of which is hereby incorporated by reference as if set forth herein in its entirety.