The present invention relates to imaging systems and, more particularly, to imaging systems having image sensors with multiple lenses of varying polarizations.
Electronic devices such as cellular telephones, cameras, and computers often use digital camera modules to capture images. Typically, digital camera modules capture light that has passed through a lens. The lens is typically unpolarized (e.g., allows light of all polarizations to reach the camera module). Occasionally, the lens is polarized (e.g., allows light of only a single polarization to reach the camera module). Camera modules with these types of conventional lenses are unsatisfactory when imaging scenes illuminated by polarized light, by structured light, or by a combination of polarized and structured light.
Digital camera modules are widely used in electronic devices. An electronic device with a digital camera module is shown in
Still and video image data from camera sensor 14 may be provided to image processing and data formatting circuitry 16 via path 26. Image processing and data formatting circuitry 16 may be used to perform image processing functions such as adjusting white balance and exposure and implementing video image stabilization, image cropping, image scaling, etc. Image processing and data formatting circuitry 16 may also be used to compress raw camera image data.
If desired, camera sensor 14 may be sensitive to light of varying polarizations. As one example, a first portion of camera sensor 14 may be sensitive to unpolarized light (e.g., light of any polarization) and a second portion of camera sensor 14 may be sensitive to polarized light (e.g., light of a particular polarization such as a particular orientation for linearly polarized light or a particular handedness for circularly polarized light). As another example, a first portion of camera sensor 14 may be sensitive to a first type of polarized light and a second portion of camera sensor 14 may be sensitive to a second type of polarized light. If desired, the first type of polarized light may be linearly polarized light or may be circularly polarized light. Similarly, the second type of polarized light may be linearly polarized light or may be circularly polarized light. Differences in the type of polarized light received by the first and second portions of camera sensor 14 may include differences in the kind of polarization (e.g., linear versus circular polarizations), in the handedness (if both types are circular polarization), or in the orientation (if both types are linear polarization). In general, camera sensor 14 may be divided into any desired number of regions, with each region being sensitive to light of a different polarization (or to unpolarized light). If desired, the number of regions that camera sensor 14 is divided into may equal the number of pixels, or some fraction thereof, in camera sensor 14.
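By way of illustration only, the sketch below (in Python) shows one way a sensor of this kind might be modeled as a set of regions, each associated with a polarization label. The region boundaries, labels, and helper names are assumptions made for clarity rather than details taken from the figures.

```python
# Illustrative sketch only: a simple model of a camera sensor divided into
# regions with different polarization sensitivities. The layout, names, and
# polarization labels below are assumptions, not part of the application.

from dataclasses import dataclass

@dataclass
class SensorRegion:
    row_range: tuple      # (first_row, last_row) of pixels covered by the region
    col_range: tuple      # (first_col, last_col) of pixels covered by the region
    polarization: str     # e.g. "unpolarized", "linear_0deg", "linear_90deg",
                          # "circular_left", "circular_right"

# Example: a sensor split into two halves, one unpolarized and one sensitive
# to vertically (90 degree) linearly polarized light.
regions = [
    SensorRegion(row_range=(0, 1079), col_range=(0, 959),    polarization="unpolarized"),
    SensorRegion(row_range=(0, 1079), col_range=(960, 1919), polarization="linear_90deg"),
]

def region_for_pixel(row, col):
    """Return the polarization label of the region containing a given pixel."""
    for region in regions:
        r0, r1 = region.row_range
        c0, c1 = region.col_range
        if r0 <= row <= r1 and c0 <= col <= c1:
            return region.polarization
    return None

print(region_for_pixel(500, 100))    # -> "unpolarized"
print(region_for_pixel(500, 1500))   # -> "linear_90deg"
```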
In a typical arrangement, which is sometimes referred to as a system on chip or SOC arrangement, camera sensor 14 and image processing and data formatting circuitry 16 are implemented on a common integrated circuit 15. The use of a single integrated circuit to implement camera sensor 14 and image processing and data formatting circuitry 16 can help to minimize costs. If desired, however, multiple integrated circuits may be used to implement circuitry 15.
Circuitry 15 conveys data to host subsystem 20 over path 18. Circuitry 15 may provide acquired image data such as captured video and still digital images to host subsystem 20.
Electronic device 10 typically provides a user with numerous high level functions. In a computer or advanced cellular telephone, for example, a user may be provided with the ability to run user applications. To implement these functions, electronic device 10 may have input-output devices 22 such as projectors, keypads, input-output ports, and displays, as well as storage and processing circuitry 24. Storage and processing circuitry 24 may include volatile and nonvolatile memory (e.g., random-access memory, flash memory, hard drives, solid state drives, etc.). Storage and processing circuitry 24 may also include processors such as microprocessors, microcontrollers, digital signal processors, application specific integrated circuits, etc.
Device 10 may include position sensing circuitry 23. Position sensing circuitry 23 may include, as examples, global positioning system (GPS) circuitry and radio-frequency-based positioning circuitry (e.g., cellular-telephone positioning circuitry).
An example of an arrangement for sensor array 14 is shown in
Address generator circuitry 32 may generate signals on paths 34 as desired. For example, address generator circuitry 32 may generate reset signals on reset lines in paths 34, transfer signals on transfer lines in paths 34, and row select (e.g., row readout) signals on row select lines in paths 34 to control the operation of array 14. If desired, address generator circuitry 32 and array 14 may be integrated together in a single integrated circuit (as an example).
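As an illustrative sketch only, the code below shows one possible ordering of the reset, transfer, and row-select signals when a row is read out. The function names and the simple per-row sequence are assumptions made for clarity and are not taken from the figures.

```python
# Illustrative sketch only: one possible per-row control sequence that address
# generator circuitry might drive. Names and timing are assumptions.

def read_out_row(row, drive_signal):
    drive_signal("reset", row, True)       # reset the pixels in this row
    drive_signal("reset", row, False)
    # ... exposure (integration) time elapses here ...
    drive_signal("transfer", row, True)    # transfer accumulated charge
    drive_signal("transfer", row, False)
    drive_signal("row_select", row, True)  # place the row's signals on the column lines
    # column readout circuitry samples the row here
    drive_signal("row_select", row, False)

def print_signal(name, row, asserted):
    print(f"row {row}: {name} {'asserted' if asserted else 'deasserted'}")

for row in range(3):                       # read out the first three rows
    read_out_row(row, print_signal)
```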
Signals 34 generated by address generator circuitry 32 may, as an example, include signals that dynamically adjust the resolution of array 14. For example, signals 34 may include binning signals that cause pixels 28 in a first region of array 14 to be binned together (e.g., with a 2-pixel binning scheme, with a 3-pixel binning scheme, or with a pixel binning scheme of 4 or more pixels) and that cause pixels 28 in a second region of array 14 to either not be binned together or to be binned together to a lesser extent than the first region. In addition, signals 34 may cause pixels 28 in any number of additional (e.g., third, fourth, fifth, etc.) regions of array 14 to be binned together to any number of different, or identical, degrees (e.g., 2-pixel binning schemes, 3-or-more-pixel binning schemes, etc.).
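The sketch below, provided only for illustration, shows 2x2 binning applied to one region of an image while a second region is left at full resolution. Whether pixels are combined by summing or averaging, and the region boundaries used, are assumptions made for clarity.

```python
# Illustrative sketch only: 2x2 pixel binning in a first region of an image
# while a second region is read out at full resolution.

import numpy as np

def bin_2x2(block):
    """Combine each non-overlapping 2x2 group of pixels into one value."""
    h, w = block.shape
    return block.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

image = np.arange(8 * 8, dtype=np.int32).reshape(8, 8)

left_region = image[:, :4]        # first region: binned 2x2 (resolution reduced)
right_region = image[:, 4:]       # second region: not binned (full resolution)

binned_left = bin_2x2(left_region)

print(binned_left.shape)          # (4, 2): one quarter as many samples
print(right_region.shape)         # (8, 4): unchanged
```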
Image readout circuitry 30 may include circuitry 42 and image processing and data formatting circuitry 16. Circuitry 42 may include sample and hold circuitry, analog-to-digital converter circuitry, and line buffer circuitry (as examples). As one example, circuitry 42 may be used to measure signals in pixels 28 and may be used to buffer the signals while analog-to-digital converters in circuitry 42 convert the signals to digital signals. In a typical arrangement, circuitry 42 reads signals from rows of pixels 28 one row at a time over lines 40. With another suitable arrangement, circuitry 42 reads signals from groups of pixels 28 (e.g., groups formed from pixels located in multiple rows and columns of array 14) one group at a time over lines 40. The digital signals read out by circuitry 42 may be representative of charges accumulated by pixels 28 in response to incident light. The digital signals produced by the analog-to-digital converters of circuitry 42 may be conveyed to image processing and data formatting circuitry 16 and then to host subsystem 20 (
As shown in
With some suitable arrangements, image sensor 14 may include microlenses that each cover a single light-sensitive pixel 28. If desired, each microlens may cover a group of two, three, four, or more pixels. As shown in the example of
Pixel 54 may include a filter that passes either unpolarized light or light of a particular polarization. In some arrangements, the filter in pixel 54 may vary depending on the location of pixel 54 within the larger image sensor array 14. As shown by the P(x,y) label for pixel 54 in
An example of an arrangement in which the microlens for each pixel 54 varies across image sensor array 14 is shown in
As shown in
In some arrangements, image sensor 14 may include at least one polarized lens such as lens 46B that passes (to the underlying sensor 14) the structured and/or polarized light originally emitted by display 22 or light source 60. Image sensor 14 may then be able to capture light emitted by display 22 or light source 60 that has scattered off of nearby objects (e.g., that has illuminated those nearby objects). In arrangements in which the light emitted by display 22 or light source 60 includes near-infrared wavelengths, image sensor 14 may be able to capture images of objects regardless of the visible-wavelength ambient lighting conditions (e.g., regardless of whether the ambient environment is visibly bright and regardless of the visible-spectrum brightness of display 22).
A flowchart of illustrative steps involved in using image sensor 14 is shown in
In step 56, image sensor 14 may capture one or more images of a scene. Image sensor 14 may be divided into at least two regions, a first of which may be sensitive to a first type of light (e.g., unpolarized light or light of a first particular polarization) and a second of which may be sensitive to a second type of light (e.g., unpolarized light or light of a second particular polarization).
In step 58, image processor circuitry such as image processing circuitry 15 in camera module 12 and/or processing circuitry 24 in host subsystem 20 may analyze the image or images captured in step 56. As an example, device 10 may identify sources of polarized light in the image(s) and may identify the polarization of light emitted or reflected by those sources. In step 58, device 10 may create one or more images from incident light collected by image sensor 14.
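For illustration only, the sketch below shows one possible analysis of the kind described in step 58, in which an image captured through an unpolarized region of the sensor is compared with an image captured through a linearly polarized region to estimate how strongly each scene location is polarized. The formula and threshold are assumptions made for clarity and are not the method claimed in the application.

```python
# Illustrative sketch only: comparing an unpolarized-region image with a
# polarized-region image to roughly identify strongly polarized sources.

import numpy as np

def polarized_fraction(unpolarized_img, polarized_img, eps=1e-6):
    """Rough per-pixel estimate of how much of the incident light is polarized.

    For an ideal linear polarizer, a fully unpolarized source contributes about
    half its intensity through the polarizer, while a source polarized along
    the polarizer's axis contributes all of it.
    """
    ratio = polarized_img / (unpolarized_img + eps)
    return np.clip(2.0 * ratio - 1.0, 0.0, 1.0)

unpolarized = np.array([[100.0, 80.0], [60.0, 40.0]])
polarized   = np.array([[ 95.0, 42.0], [31.0, 20.0]])

fraction = polarized_fraction(unpolarized, polarized)
sources = fraction > 0.5          # crude mask of strongly polarized sources
print(fraction)
print(sources)
```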
CMOS imager 200 is operated by a timing and control circuit 206, which controls decoders 203, 205 for selecting the appropriate row and column lines for pixel readout, and row and column driver circuitry 202, 204, which apply driving voltages to the drive transistors of the selected row and column lines. The pixel signals, which typically include a pixel reset signal Vrst and a pixel image signal Vsig for each pixel, are sampled by sample and hold circuitry 207 associated with the column driver 204. A differential signal Vrst-Vsig is produced for each pixel, amplified by amplifier 208, and digitized by analog-to-digital converter 209. Analog-to-digital converter 209 converts the analog pixel signals to digital signals, which are fed to image processor 210, which forms a digital image.
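The following illustrative sketch expresses the differential (correlated double sampling) step described above in numeric form: a reset level Vrst and an image level Vsig are sampled for each pixel, and their difference is amplified and digitized. The gain, reference voltage, and bit depth are assumptions made for clarity.

```python
# Illustrative sketch only: forming the Vrst - Vsig differential for a pixel,
# amplifying it, and converting it to a digital code.

def digitize_pixel(v_rst, v_sig, gain=2.0, v_ref=1.0, bits=10):
    """Form Vrst - Vsig, amplify it, and convert it to a digital code."""
    differential = v_rst - v_sig              # removes pixel-to-pixel reset offsets
    amplified = gain * differential
    code = int(round((amplified / v_ref) * (2 ** bits - 1)))
    return max(0, min(code, 2 ** bits - 1))   # clamp to the converter's range

# A brighter pixel discharges further, giving a lower Vsig and a larger code.
print(digitize_pixel(v_rst=0.95, v_sig=0.60))   # brighter pixel -> 716
print(digitize_pixel(v_rst=0.95, v_sig=0.90))   # darker pixel  -> 102
```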
Processor system 300, which may be a digital still or video camera system, may include a lens such as lens 396 for focusing an image onto a pixel array such as pixel array 201 when shutter release button 397 is pressed. Processor system 300 may include a central processing unit such as central processing unit (CPU) 395. CPU 395 may be a microprocessor that controls camera functions and one or more image flow functions and communicates with one or more input/output (I/O) devices 391 over a bus such as bus 393. Imaging device 200 may also communicate with CPU 395 over bus 393. System 300 may include random access memory (RAM) 392 and removable memory 394. Removable memory 394 may include flash memory that communicates with CPU 395 over bus 393. Imaging device 200 may be combined with CPU 395, with or without memory storage, on a single integrated circuit or on a different chip. Although bus 393 is illustrated as a single bus, it may be one or more buses or bridges or other communication paths used to interconnect the system components.
Various embodiments have been described illustrating imaging systems that may include multiple lenses of varying polarizations.
A camera sensor may be divided into two or more regions. Each region of the camera sensor may include a lens that passes light of a particular type (e.g., unpolarized light, light of a particular linear polarization, or light of a particular circular polarization). At least some light sensitive pixels within each region may receive white light (e.g., light of all visible wavelengths) or near-infrared light of the polarization passed by the lens of the region.
The camera sensor may be formed from an array of light-sensitive pixels. In some arrangements, the camera sensor may include a microlens over each pixel. Some of the microlenses may pass red, green, or blue light to the underlying pixels. Still other microlenses may pass light such as unpolarized white light, unpolarized infrared light, white light of a particular polarization, or near-infrared light of a particular polarization to the underlying pixels. If desired, the type of polarization passed by these microlenses may vary within the array that forms the camera sensor (e.g., may vary depending on the location within the array).
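As a purely illustrative sketch, the code below shows one possible repeating microlens pattern of this kind, in which three pixels of each 2x2 group receive red, green, and blue light and the fourth receives near-infrared light whose polarization orientation varies with position in the array. The pattern and the orientation rule are assumptions made for clarity and are not taken from the figures.

```python
# Illustrative sketch only: a hypothetical microlens/filter mosaic in which the
# polarization orientation of the near-infrared pixels varies across the array.

def filter_for_pixel(row, col):
    """Return a label describing the light passed to the pixel at (row, col)."""
    if row % 2 == 0 and col % 2 == 0:
        return "green"
    if row % 2 == 0 and col % 2 == 1:
        return "red"
    if row % 2 == 1 and col % 2 == 0:
        return "blue"
    # Fourth pixel of each 2x2 group: polarized near-infrared, with the
    # polarization orientation alternating across the array.
    orientation = 0 if ((row // 2) + (col // 2)) % 2 == 0 else 90
    return f"near_infrared_linear_{orientation}deg"

for row in range(4):
    print([filter_for_pixel(row, col) for col in range(4)])
```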
The electronic device may include a component that emits structured or polarized light. In such arrangements, the camera sensor may have lenses that are mapped to the light emitted by the component. In particular, the component may emit light of a particular polarization and the lenses may pass light having the same polarization. As examples, the component may be a display device or an illumination device (e.g., a light source that emits polarized, structured, visible, and/or near-infrared light).
The foregoing is merely illustrative of the principles of this invention and various modifications can be made by those skilled in the art without departing from the scope and spirit of the invention. The foregoing embodiments may be implemented individually or in any combination.
This application claims the benefit of provisional patent application No. 61/537,548, filed Sep. 21, 2011, which is hereby incorporated by reference herein in its entirety.