This invention relates to an image capture device that includes a two-dimensional image sensor that produces two video signals that have different frame rates and are used for different functions.
An image capture device depends on an electronic image sensor to create an electronic representation of a visual image. Examples of such electronic image sensors include charge coupled device (CCD) image sensors and active pixel sensor (APS) devices (APS devices are often referred to as CMOS sensors because they can be fabricated in a Complementary Metal Oxide Semiconductor process). Typically, in addition to creating the final visual image, an electronic image sensor is used for multiple pre-photography functions that operate on its video signal. Based on a brightness measurement of the subject, automatic exposure control processing (hereinafter referred to as “AE processing”) is carried out to obtain a suitable exposure value. Then, automatic focus detection processing (hereinafter referred to as “AF processing”) is carried out to drive a focus-adjusting lens that focuses the subject on the image capture device. The subject brightness value is then measured from the video signal again, and the photographic exposure conditions are thereby determined. In addition to AE, AF, and other analytical processing, image capture devices often display a visual electronic image of the scene to be captured. This visual image is updated frequently, for example at 30 frames per second, and is referred to as a preview image or a stream of preview images.
Commonly, a single electronic image sensor is used for creating the electronic representation of a visual image, for AE processing, and for AF processing. These tasks are performed sequentially because the same electronic image sensor is utilized for different functions. Typically, the rate at which the AE processing and AF processing can be performed is restricted by the rate at which a visual image can be read and processed from the electronic image sensor. This can cause a considerable delay between when processing is initiated and when the final capture is acquired.
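A rough, purely illustrative calculation shows why: every metering or focus measurement costs at least one frame time when a single full-resolution readout path is shared. The frame rate and iteration counts below are assumptions chosen for illustration, not figures from this disclosure.

```python
# Illustrative arithmetic only; rates and iteration counts are assumed.
full_frame_rate_hz = 30.0             # assumed full-resolution readout rate
frame_time_s = 1.0 / full_frame_rate_hz
af_iterations = 12                    # assumed through-focus steps
ae_iterations = 3                     # assumed metering refinements
delay_s = (af_iterations + ae_iterations) * frame_time_s
print(f"pre-capture delay of roughly {delay_s * 1000:.0f} ms")  # ~500 ms
```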
In the prior art, the user adjusts the zoom setting and points the camera to compose the image, and then actuates the capture device through user inputs. The camera focus is adjusted to a mid-range position, and the sensor is cleared of any charge. With a CCD sensor, for example, this is done using a fast flush technique. An image, to be used for focusing the camera lens, is then integrated for a period of time, for example 10 milliseconds, during the focusing mode. The vertical clock sequence is then set to a line-skipping operation (e.g., read two lines, dump six lines, read two, dump six, and so on), or to read only selected lines in the central area of the image.
After data acquisition, the average absolute value output (average contrast) of a horizontal spatial bandpass filter processing these image lines is used to determine how well the image is focused. The system controller stores this average contrast value, and the lens focus is adjusted while the remainder of the sensor charge is cleared out using fast flush timing. Fast flush timing for the top and bottom of the sensor is required with a CCD in order to reduce the time spent reading out each focus image. Sensor lines that are flushed are not available for any purpose, such as exposure analysis or video signal output. The process of integrating and reading out the focus image is then repeated for a second focusing cycle. If the average contrast increases, the lens focus position is stepped again in the same direction. If the average contrast decreases, the focus position is moved in the opposite direction. These focusing cycles are repeated numerous times as the lens focus is adjusted until it provides the maximum average contrast. Once the average contrast has reached its maximum value, the focus is acceptable. At this point, the entire sensor is cleared, the final image is integrated for a period of time, and the final image is read out from the sensor.
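The focusing cycle described above amounts to a hill-climbing search on the average contrast. The following is a minimal sketch under stated assumptions: the bandpass filter is a simple center-surround difference, and `focus_lines_at` is a hypothetical stand-in for integrating and reading focus lines at a given lens position, with sharpness peaking at an arbitrary "best" position.

```python
# Minimal sketch of contrast-maximizing focus search; not camera firmware.

def average_contrast(lines):
    # Average absolute output of a horizontal spatial bandpass filter.
    total, count = 0.0, 0
    for row in lines:
        for x in range(1, len(row) - 1):
            total += abs(-row[x - 1] + 2 * row[x] - row[x + 1])
            count += 1
    return total / max(count, 1)

def focus_lines_at(position, best=5.0):
    # Hypothetical stand-in for integrating and reading a focus image at a
    # lens position; edge amplitude peaks when position == best.
    amp = 1.0 / (1.0 + (position - best) ** 2)
    return [[amp * ((x + y) % 2) for x in range(16)] for y in range(2)]

def hill_climb_focus(position=0.0, step=1.0):
    contrast = average_contrast(focus_lines_at(position))
    while abs(step) > 0.1:
        trial = average_contrast(focus_lines_at(position + step))
        if trial > contrast:
            position, contrast = position + step, trial  # keep direction
        else:
            step = -step / 2.0  # contrast fell: reverse direction, refine
    return position

print(f"focus position ~ {hill_climb_focus():.2f}")  # converges near 5.00
```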
The prior art also includes focus analysis techniques other than the average contrast calculation described above. They nevertheless rely on a through-focus operation, acquiring multiple images at different focus positions.
To solve this timing problem, some capture devices have two image sensors: one that operates at a fast frame rate to provide for AE or AF processing, and another that operates at a slow frame rate to produce a visual image signal. This, of course, introduces the complexity of a second sensor and its control; the added complexity is optical and mechanical as well as electronic.
It is an object of the present invention to provide improved techniques for reading data for multiple functions from a single image sensor.
This objective is achieved in a method for using a capture device to capture at least two video signals corresponding to a scene, comprising:
By providing the first and second video signals for adjusting the capture device parameters, the time required between initiating the capture and acquiring the final capture can be reduced.
Because digital cameras employing imaging devices and related circuitry for signal capture and correction and for exposure control are well known, the present description will be directed in particular to elements forming part of, or cooperating more directly with, method and apparatus in accordance with the present invention. Elements not specifically shown or described herein are selected from those known in the art. Certain aspects of the embodiments to be described are provided in software. Given the system as shown and described according to the invention in the following materials, software not specifically shown, described or suggested herein that is useful for implementation of the invention is conventional and within the ordinary skill in such arts.
Turning now to
The image sensor 20 receives light 10 from a subject scene. The resulting electrical signal from each pixel of the image sensor 20 is typically related to both the intensity of the light reaching the pixel and the length of time the pixel is allowed to accumulate or integrate the signal from incoming light. This time is called the integration time or exposure time. In this context, the integration time is the time during which the shutter 18 allows light to reach the image sensor 20 and the image sensor is simultaneously operating to record the light. The combination of overall light intensity and integration time is called exposure. It is to be understood that equivalent exposures can be achieved by various combinations of light intensity and integration time. For example, a long integration time can be used with a scene of very low light intensity in order to achieve the same exposure as using a short integration time with a scene of high light intensity.
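By way of illustration only, the following sketch shows this reciprocity numerically; the intensity and time values are assumed, in arbitrary units, and are not taken from this disclosure.

```python
# Worked example of exposure reciprocity: exposure is the product of light
# intensity at the sensor and integration time, so different combinations
# can yield the same exposure. All values are illustrative.
pairs = [
    (100.0, 0.020),  # low intensity, long integration time (20 ms)
    (400.0, 0.005),  # 4x the intensity, 1/4 the integration time (5 ms)
]
for intensity, t_int in pairs:
    print(f"intensity={intensity:g}, time={t_int * 1000:g} ms, "
          f"exposure={intensity * t_int:g}")  # both lines print exposure=2
```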
Although
As previously mentioned, equivalent exposures can be achieved by various combinations of light intensity and integration time. Although the exposures are equivalent, a particular combination of light intensity and integration time may be preferred over other equivalent exposures for capturing a given scene image. For example, a short integration time is generally preferred when capturing sporting events in order to avoid images blurred by the motion of athletes running or jumping during the integration time. In this case, the iris block can provide a large aperture for high light intensity and the shutter can provide a short integration time. This case serves as an example of a scene mode, specifically a sports scene mode that favors short integration times over small apertures. In general, scene modes are preferences for selecting and controlling the elements that combine to make an exposure in order to capture certain scene types optimally. Another example of a scene mode is a landscape scene mode. In this scene mode, preference is given to a small aperture to provide good depth of focus, with the integration time being adjusted to provide optimum exposure. Yet another example is a general scene mode that favors small apertures for good depth of focus, with the integration time increasing as scene light levels fall until, at certain light levels, the integration time becomes long enough that handheld camera shake becomes a concern; at that point the integration time remains fixed and the iris provides larger apertures to increase the light intensity at the sensor.
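As a hedged sketch, the biasing described above can be expressed as a simple exposure program. The mode logic, limits, and units below are assumptions for illustration, not the camera's actual firmware; intensity is treated as proportional to aperture area in arbitrary units.

```python
# Illustrative scene-mode exposure program; all numbers are assumed.

def exposure_program(scene_mode, required_exposure, max_handheld_time=1/30):
    # required_exposure = intensity * integration time (arbitrary units).
    if scene_mode == "sports":
        t = 1 / 500  # favor a short integration time to freeze motion
        return required_exposure / t, t
    if scene_mode == "landscape":
        intensity = 1.0  # favor a small aperture for depth of focus
        return intensity, required_exposure / intensity
    # General mode: small aperture until the integration time would exceed
    # the handheld limit, then hold the time fixed and open the iris.
    intensity = 1.0
    t = required_exposure / intensity
    if t > max_handheld_time:
        t = max_handheld_time
        intensity = required_exposure / t
    return intensity, t

for mode in ("sports", "landscape", "general"):
    print(mode, exposure_program(mode, required_exposure=0.1))
```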
The system controller 50 in
The system controller 50 also receives inputs from the user inputs 74. Scene mode as described above is generally provided by the user as a user input. When taking multiple image captures in quick succession, scene lighting intensity for the next capture can also be estimated from the digitized image data of the previous capture. This image data, passing through the digital signal processor 36, can be used by the system controller 50 to augment or override digital signals from the brightness sensor 16.
The system controller 50 uses the light intensity signal(s) from the brightness sensor 16, the user inputs 74 (including scene mode), and other inputs to the system controller 50 to determine how to control the exposure regulating elements to provide an appropriate exposure. The system controller 50 can determine automatically how to control or adjust all the exposure regulating elements to produce a correct exposure. Alternatively, by way of the user inputs 74, the user can manually control or adjust the exposure regulating elements to produce a user-selected exposure. Furthermore, the user can manually control or adjust only some exposure regulating elements while allowing the system controller 50 to control the remaining elements automatically. The system controller 50 also provides information regarding the exposure to the user through the viewfinder display 70 and the exposure display 72. This information for the user includes the automatically or manually determined integration time, aperture, and other exposure regulating elements. This information can also include the degree to which an image capture will be underexposed or overexposed in case the correct exposure cannot be achieved within the limits of operation of the various exposure regulating elements.
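A related sketch, again with assumed limits and units, shows how the degree of under- or overexposure might be reported once the exposure regulating elements reach their limits; the limit values and the allocation strategy are illustrative assumptions.

```python
# Illustrative under/overexposure report in stops; all limits assumed.
import math

def exposure_report(target_exposure, max_intensity=8.0, max_time=1/15,
                    min_intensity=1.0, min_time=1/4000):
    # Favor the longest permitted integration time, then set the intensity
    # (aperture) to whatever the target requires, clamped to its limits.
    intensity = min(max(target_exposure / max_time, min_intensity),
                    max_intensity)
    t = min(max(target_exposure / intensity, min_time), max_time)
    achieved = intensity * t
    stops = math.log2(achieved / target_exposure)
    if abs(stops) < 1e-9:
        return "correct exposure"
    return f"{abs(stops):.1f} stops {'under' if stops < 0 else 'over'}exposed"

print(exposure_report(0.2))  # within limits -> correct exposure
print(exposure_report(4.0))  # scene too dark -> ~2.9 stops underexposed
```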
Referring again to the embodiment shown in
Referring again to the embodiment shown in
The image capture device, shown in
The analog signal from image sensor 20 is processed by analog signal processor 22 and applied to analog to digital (A/D) converter 24. Timing generator 26 produces various clocking signals to select rows and pixels and synchronizes the operation of analog signal processor 22 and A/D converter 24. The image sensor stage 28 includes the image sensor 20, the analog signal processor 22, the A/D converter 24, and the timing generator 26. The components of image sensor stage 28 are separately fabricated integrated circuits, or they are fabricated as a single integrated circuit as is commonly done with CMOS image sensors. The resulting stream of digital pixel values from A/D converter 24 is stored in memory 32 associated with digital signal processor (DSP) 36.
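The following is a schematic sketch, not the actual circuit, of the readout chain just described: a stand-in for the timing generator 26 selects rows, an assumed analog gain stands in for the analog signal processor 22, and a simple quantizer stands in for the A/D converter 24.

```python
# Schematic model of the image sensor stage readout chain; illustrative only.

def a_to_d(value, bits=10, full_scale=1.0):
    # Quantize an analog level into a digital code (assumed 10-bit range).
    code = int(round(value / full_scale * (2 ** bits - 1)))
    return max(0, min(code, 2 ** bits - 1))

def read_rows(sensor_rows, selected_rows, analog_gain=2.0):
    # "Timing generator" behavior: visit only the selected rows, in order;
    # each pixel passes through the analog gain and then the A/D converter.
    stream = []
    for r in selected_rows:
        for pixel in sensor_rows[r]:
            stream.append(a_to_d(pixel * analog_gain))
    return stream

rows = [[0.05 * (r + 1)] * 4 for r in range(4)]  # tiny synthetic sensor
print(read_rows(rows, selected_rows=[0, 2]))     # row-skipping readout
```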
Digital signal processor 36 is a second processor, in addition to system controller 50. Although this partitioning of camera functional control among multiple controllers and processors is typical, these controllers or processors can be combined in various ways without affecting the functional operation of the camera or the application of the present invention. These controllers or processors can comprise one or more digital signal processor devices, microcontrollers, programmable logic devices, or other digital logic circuits. Although a combination of two such controllers or processors has been described, it should be apparent that one controller or processor, or more than two, could be designated to perform all of the needed functions. All of these variations can perform the same function and fall within the scope of this invention, and the term “processing stage” will be used as needed to encompass all of this functionality within one phrase, for example, as in processing stage 38 in
In the illustrated embodiment, DSP 36 manipulates the digital image data in its memory 32 according to a software program permanently stored in program memory 54 and copied to memory 32 for execution during image capture. DSP 36 executes the software necessary for practicing image processing shown in
System controller 50 controls the overall operation of the camera based on a software program stored in program memory 54, which can include Flash EEPROM or other nonvolatile memory. This memory can also be used to store image sensor calibration data, user setting selections and other data which must be preserved when the camera is turned off. System controller 50 controls the sequence of image capture by directing the focus control 8, zoom control 9, lens 12, filter assembly 13, iris 14, and shutter 18 as previously described, directing the timing generator 26 to operate the image sensor 20 and associated elements, and directing DSP 36 to process the captured image data. After an image is captured and processed, the final image file stored in memory 32 is transferred to a host computer via interface 57, stored on a removable memory card 64 or other storage device, and displayed for the user on image display 88.
A bus 52 includes a pathway for address, data, and control signals, and connects system controller 50 to DSP 36, program memory 54, system memory 56, host interface 57, memory card interface 60, and other related devices. Host interface 57 provides a high-speed connection to a personal computer (PC) or other host computer for transfer of image data for display, storage, manipulation, or printing. This interface is an IEEE 1394 or USB 2.0 serial interface or any other suitable digital interface. Memory card 64 is typically a CompactFlash (CF) card inserted into socket 62 and connected to the system controller 50 via memory card interface 60. Other types of storage that can be utilized include, without limitation, PC Cards, MultiMedia Cards (MMC), and Secure Digital (SD) cards.
Processed images are copied to a display buffer in system memory 56 and continuously read out via video encoder 80 to produce a video signal. This signal is output directly from the camera for display on an external monitor, or processed by display controller 82 and presented on image display 88. This display is typically an active matrix color liquid crystal display (LCD), although other types of displays are used as well.
The user interface, including all or any combination of viewfinder display 70, exposure display 72, status display 76, image display 88, and user inputs 74, is controlled by a combination of software programs executed on system controller 50. The viewfinder display 70, the exposure display 72, the status display 76, and the user inputs 74 together form a user control and status interface 68. User inputs 74 typically include some combination of buttons, rocker switches, joysticks, rotary dials, or touchscreens. System controller 50 operates light metering, scene mode, autofocus, and other exposure functions. The system controller 50 also manages the graphical user interface (GUI) presented on one or more of the displays, e.g., on image display 88. The GUI typically includes menus for making various option selections and review modes for examining captured images.
The ISO speed rating is an important attribute of a digital still camera. The exposure time, the lens aperture, the lens transmittance, the level and spectral distribution of the scene illumination, and the scene reflectance determine the exposure level of a digital still camera. When an image from a digital still camera is obtained using an insufficient exposure, proper tone reproduction can generally be maintained by increasing the electronic or digital gain, but the image will contain an unacceptable amount of noise. As the exposure is increased, the gain is decreased, and therefore the image noise can normally be reduced to an acceptable level. If the exposure is increased excessively, the resulting signal in bright areas of the image can exceed the maximum signal level capacity of the image sensor or camera signal processing. This can cause image highlights to be clipped to form a uniformly bright area, or to bloom into surrounding areas of the image. It is important to guide the user in setting proper exposures. An ISO speed rating is intended to serve as such a guide. In order to be easily understood by photographers, the ISO speed rating for a digital still camera should directly relate to the ISO speed rating for photographic film cameras. For example, if a digital still camera has an ISO speed rating of ISO 200, then the same exposure time and aperture should be appropriate for an ISO 200 rated film/process system.
The ISO speed ratings are intended to harmonize with film ISO speed ratings. However, there are differences between electronic and film-based imaging systems that preclude exact equivalency. Digital still cameras can include variable gain, and can provide digital processing after the image data has been captured, enabling tone reproduction to be achieved over a range of camera exposures. It is therefore possible for digital still cameras to have a range of speed ratings. This range is defined as the ISO speed latitude. To prevent confusion, a single value is designated as the inherent ISO speed rating, with the ISO speed latitude upper and lower limits indicating the speed range, that is, a range including effective speed ratings that differ from the inherent ISO speed rating. With this in mind, the inherent ISO speed is a numerical value calculated from the exposure provided at the focal plane of a digital still camera to produce specified camera output signal characteristics. The inherent speed is usually the exposure index value that produces peak image quality for a given camera system for normal scenes, where the exposure index is a numerical value that is inversely proportional to the exposure provided to the image sensor.
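As a numerical illustration of the last sentence only, with an assumed proportionality constant (the actual calibration is defined by the relevant ISO standard, not by this sketch):

```python
# Illustrative inverse relation between exposure index and focal-plane
# exposure: halving the exposure doubles the index. K is assumed.
K = 10.0  # assumed proportionality constant (arbitrary units)

def exposure_index(focal_plane_exposure):
    return K / focal_plane_exposure

print(exposure_index(0.05))   # 200.0 -> behaves like an ISO 200 setting
print(exposure_index(0.025))  # half the exposure -> 400.0
```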
The digital camera as described can be configured and operated to capture a single image or to capture a stream of images. For example, the image sensor stage 28 can be configured to capture single full-resolution images, with the mechanical shutter 18 used to control the integration time. This case is well suited to single image capture for still photography. Alternatively, the image sensor stage can be configured to capture a stream of limited-resolution images, with the image sensor controlling the integration time electronically. In this case a continuous stream of images can be captured without being limited by the readout speed of the sensor or the actuation speed of the mechanical shutter. This case is useful, for example, for capturing a stream of images that will be used to provide a video signal, as in the case of a video camera. These cases are examples of the configurations employed for single capture and for capturing a stream of images, but alternative configurations can be used for either. The present invention can be practiced in image capture devices providing either for single image capture or for capturing a stream of images. Furthermore, image capture devices incorporating the present invention can allow the user to select between the two.
The image sensor 20 shown in
Whenever general reference is made to an image sensor in the following description, it is understood to be representative of the image sensor 20 from
An advantage of the present invention is that efficiency and parallelism in reading pixels allow a faster frame rate without a faster pixel conversion rate. As used herein, the term “frame rate” is the reciprocal of the time from the beginning of one frame of video signal to the beginning of the next frame of video signal. Similarly, “pixel rate” is the reciprocal of the time from the readout of one pixel to the readout of the next pixel.
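Restating these definitions as arithmetic, with assumed example intervals:

```python
# Frame rate and pixel rate as reciprocals of the frame-to-frame and
# pixel-to-pixel readout intervals; both intervals below are assumed.
frame_interval_s = 0.0333  # start of one frame to start of the next
pixel_interval_s = 25e-9   # readout of one pixel to the next
print(f"frame rate ~ {1 / frame_interval_s:.0f} frames/s")          # ~30
print(f"pixel rate ~ {1 / pixel_interval_s / 1e6:.0f} Mpixels/s")   # 40
```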
The first number in each box refers to the group of pixels (such as the first or second group). In the prior art, there is no grouping of pixels, and the group number is designated by 0, as in G01. The second number in each box refers to the frame number. In some boxes there is a third number (preceded by a decimal point); this number refers to a part of a frame for the group of pixels. For example, box G21.2 refers to the time interval required to read the second group of pixels, first frame, second part of frame. The timing advantage of the present invention is illustrated by the fact that boxes G11 and G21 (or G12 and G22, etc.) are shorter than box G01.
Referring again to
A person skilled in the art will see that a frame of the first group of pixels can be split in many different ways in order to optimize the frequency of reading frames of the first group of pixels and frames of the second group of pixels. Usually the optimal interleaving is such that a complete frame of the first group of pixels is read at a standard video rate, such as every 33 milliseconds. The frames of the second group of pixels are interleaved to provide a consistent time interval between each frame of the second group of pixels, which helps in synchronization with other camera operations, such as lens movement.
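A minimal simulation of one possible interleaving is sketched below. The row counts, the four-way split of the first-group frame, and the 33.3 ms combined frame time are assumptions chosen for illustration, not parameters taken from this disclosure.

```python
# Simulated interleaving of two pixel groups on one readout path: parts of
# each first-group (video) frame alternate with complete second-group
# (AE/AF) frames, so the second group arrives at evenly spaced intervals.

first_group_rows = 480   # assumed rows in the first (video) group
second_group_rows = 60   # assumed rows per second-group (AE/AF) frame
parts = 4                # first-group frame split into 4 parts (assumed)
row_time_ms = 33.3 / (first_group_rows + parts * second_group_rows)

schedule, t = [], 0.0
for part in range(1, parts + 1):
    t += (first_group_rows / parts) * row_time_ms
    schedule.append((f"G11.{part} done", round(t, 1)))  # video frame part
    t += second_group_rows * row_time_ms
    schedule.append((f"G2{part} done", round(t, 1)))    # AE/AF frame
for event in schedule:
    print(event)  # second-group frames complete every ~8.3 ms
```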
In order to produce a color image, the array of pixels in an image sensor typically has a pattern of color filters placed over the pixels. The set of color photoresponses selected for use in a sensor usually has three colors, but it can also include four or more. As used herein, a panchromatic photoresponse refers to a photoresponse having a wider spectral sensitivity than those spectral sensitivities represented in the selected set of color photoresponses. A panchromatic photosensitivity can have high sensitivity across the entire visible spectrum. The term panchromatic pixel will refer to a pixel having a panchromatic photoresponse. Although the panchromatic pixels generally have a wider spectral sensitivity than the set of color photoresponses, each panchromatic pixel can have an associated filter; such a filter is either a neutral density filter or a color filter.
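One illustrative layout mixing color and panchromatic pixels is sketched below. This particular tile is an assumption for illustration only and is not asserted to be the pattern of this disclosure or of the related applications.

```python
# Illustrative repeating filter tile mixing three color photoresponses
# (R, G, B) with panchromatic pixels (P); the arrangement is assumed.
CFA_TILE = [
    ["P", "G", "P", "R"],
    ["G", "P", "R", "P"],
    ["P", "B", "P", "G"],
    ["B", "P", "G", "P"],
]

def photoresponse_at(x, y):
    # The tile repeats across the full two-dimensional sensor array.
    return CFA_TILE[y % 4][x % 4]

print([photoresponse_at(x, 0) for x in range(8)])  # P G P R P G P R
```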
After the capture device parameters have been appropriately adjusted, all pixels of the two-dimensional image sensor can be read out to provide a final capture of an image of the scene.
Those skilled in the art will appreciate that the number of pixels employed can vary from zone to zone and be optimized for a particular application.
The invention has been described in detail with particular reference to certain preferred embodiments thereof, but it will be understood that variations and modifications can be effected within the spirit and scope of the invention.
This application is a continuation of U.S. application Ser. No. 11/538,599, filed 4 Oct. 2006, and claims priority therefrom under 35 U.S.C. §120. The priority application is still pending. The present application is related to U.S. application Ser. No. 11/191,538, filed 28 Jul. 2005, of John F. Hamilton, Jr. and John T. Compton, entitled “PROCESSING COLOR AND PANCHROMATIC PIXELS,” and U.S. application Ser. No. 11/191,729, filed 28 Jul. 2005, of John T. Compton and John F. Hamilton, Jr., entitled “IMAGE SENSOR WITH IMPROVED LIGHT SENSITIVITY.”
Number | Date | Country
---|---|---
20110310279 A1 | Dec 2011 | US

Relation | Number | Date | Country
---|---|---|---
Parent | 11538599 | Oct 2006 | US
Child | 13220512 | | US