Systems and methods for detecting defective camera arrays and optic arrays

Information

  • Patent Number
    10,334,241
  • Date Filed
    Monday, October 30, 2017
  • Date Issued
    Tuesday, June 25, 2019
Abstract
Systems and methods for detecting defective camera arrays, optic arrays and/or sensors are described. One embodiment includes capturing image data using a camera array; dividing the captured images into a plurality of corresponding image regions; identifying the presence of localized defects in any of the cameras by evaluating the image regions in the captured images; and detecting a defective camera array using the image processing system when the number of localized defects in a specific set of image regions exceeds a predetermined threshold, where the specific set of image regions is formed by: a common corresponding image region from at least a subset of the captured images; and any additional image region in a given image that contains at least one pixel located within a predetermined maximum parallax shift distance along an epipolar line from a pixel within said common corresponding image region within the given image.
Description
FIELD OF THE INVENTION

The present invention generally relates to systems and methods for screening cameras for defects, and more specifically to systems and methods for screening for defects in camera arrays and in the optic arrays used in the construction of camera arrays.


BACKGROUND

In response to the constraints placed upon a traditional digital camera based upon the camera obscura, a new class of cameras that can be referred to as array cameras has been proposed. Array cameras are characterized in that they include an imager array that has multiple arrays of pixels, where each pixel array is intended to define a focal plane, and each focal plane has a separate lens stack. Typically, each focal plane includes a plurality of rows of pixels that also forms a plurality of columns of pixels, and each focal plane is contained within a region of the imager that does not contain pixels from another focal plane. An image is typically formed on each focal plane by its respective lens stack. In many instances, the array camera is constructed using an imager array that incorporates multiple focal planes and an optic array of lens stacks.


SUMMARY OF THE INVENTION

Systems and methods in accordance with embodiments of the invention detect defective camera arrays, optic arrays, and/or sensors used in the construction of array camera modules. One embodiment of the method of the invention includes: capturing image data of a known target using a plurality of cameras, where the image data forms a plurality of images; dividing each of the plurality of images into a plurality of corresponding image regions using an image processing system; identifying the presence of at least one localized defect in at least one of the plurality of cameras by evaluating the image regions in the plurality of images in accordance with at least one predetermined localized defect criterion using the image processing system; and detecting a defective camera array using the image processing system when the number of localized defects in a specific set of image regions exceeds a predetermined threshold. In addition, the specific set of image regions is formed by: a common corresponding image region from at least a subset of the plurality of images; and any additional image region in a given image that contains at least one pixel located within a predetermined maximum parallax shift distance along an epipolar line from a pixel within said common corresponding image region within the given image, where the epipolar line is defined by the relative location of the center of the camera that captured the given image and a predetermined viewpoint.
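By way of illustration only, the screening decision described above can be sketched in a few lines of Python. The function, argument names, and tolerance value below are hypothetical placeholders rather than part of the claimed method, and the sketch assumes that localized-defect detection has already been run on every image region:

```python
def detect_defective_array(defective_regions, uncertainty_zones, max_defects=2):
    """defective_regions: set of (camera, region) pairs that failed a
    localized-defect criterion (e.g. defective-pixel count or MTF).
    uncertainty_zones: one specific set of (camera, region) pairs per
    region of the image to be synthesized, built from the common
    corresponding region plus every region reachable within the maximum
    parallax shift along the relevant epipolar lines.
    Returns True when any set contains more defects than tolerated."""
    return any(len(defective_regions & zone) > max_defects
               for zone in uncertainty_zones)

# Example: three defects fall inside the same uncertainty zone.
defects = {(1, 4), (5, 4), (9, 4)}
zone = {(cam, 4) for cam in range(16)}
print(detect_defective_array(defects, [zone]))  # True
```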


In a further embodiment of the method of the invention, identifying the presence of at least one localized defect in at least one of the plurality of the cameras by evaluating the image regions in the plurality of images in accordance with at least one predetermined localized defect criterion using the image processing system comprises identifying a plurality of defective pixels within an image region that satisfies at least one predetermined criterion.


In another embodiment of the method of the invention, the predetermined criterion is that the number of defective pixels within the image region exceeds a predetermined number of defective pixels.


In a still further embodiment of the method of the invention, the predetermined criterion is that the plurality of defective pixels includes a cluster of defective pixels that exceeds a predetermined size.
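An illustrative sketch of this cluster criterion follows, assuming SciPy is available; the connectivity choice and the tolerated cluster size are placeholders, not values from the patent:

```python
import numpy as np
from scipy import ndimage

def cluster_criterion_failed(defect_mask, max_cluster_size=1):
    """defect_mask: boolean array marking defective pixels in one image
    region. Fails when any 8-connected cluster of defective pixels is
    larger than the predetermined size."""
    labels, n = ndimage.label(defect_mask, structure=np.ones((3, 3), bool))
    if n == 0:
        return False
    sizes = np.bincount(labels.ravel())[1:]  # drop the background count
    return sizes.max() > max_cluster_size

# A two-pixel cluster fails when only isolated defective pixels are tolerated.
mask = np.zeros((8, 8), bool)
mask[3, 3] = mask[3, 4] = True
print(cluster_criterion_failed(mask))  # True
```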


In still another embodiment of the method of the invention, defective pixels comprise hot pixels, bright pixels and dark pixels.


In a yet further embodiment of the method of the invention, identifying the presence of at least one localized defect in at least one of the plurality of the cameras by evaluating the image regions in the plurality of images in accordance with at least one predetermined localized defect criterion using the image processing system comprises: measuring the Modulation Transfer Function (MTF) within an image region; and determining that the MTF of the image region fails to satisfy a predetermined criterion.


In yet another embodiment of the method of the invention, the predetermined criterion is that the on-axis MTF at a predetermined spatial frequency exceeds a first predetermined threshold, the off-axis tangential MTF at a predetermined spatial frequency exceeds a second predetermined threshold, and the off-axis sagittal MTF at a predetermined spatial frequency exceeds a third predetermined threshold.
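As a minimal sketch of this three-part MTF criterion (all threshold values are placeholders chosen for illustration, not figures from the patent):

```python
def mtf_region_passes(on_axis, off_tangential, off_sagittal,
                      thresholds=(0.5, 0.35, 0.35)):
    """MTF values measured at the predetermined spatial frequency for one
    image region; each component must exceed its own threshold."""
    t_on, t_tan, t_sag = thresholds
    return on_axis > t_on and off_tangential > t_tan and off_sagittal > t_sag

print(mtf_region_passes(0.62, 0.41, 0.28))  # False: sagittal MTF too low
```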


In a further embodiment again of the method of the invention, said plurality of corresponding image regions forms a first plurality of corresponding image regions and the method further comprises: dividing each of the plurality of images into a second plurality of corresponding image regions using the image processing system, where the number of image regions in the first plurality of corresponding image regions differs from the number of image regions in the second plurality of corresponding image regions; and identifying the presence of at least one localized defect in at least one of the plurality of cameras by evaluating the image regions in the second plurality of corresponding image regions in accordance with at least one additional predetermined localized defect criterion using the image processing system.


In another embodiment again of the method of the invention, the plurality of images forms a reference image and a plurality of alternate view images; the specific set of image regions is formed by: a specific image region from the reference image; the image regions from each of the alternate view images that correspond to the specific image region from the reference image; and any additional image region in a given alternate view image from the plurality of alternate view images that contains at least one pixel located within a predetermined maximum parallax shift distance along an epipolar line from a pixel within the image region of the given alternate view image that corresponds to the selected image region from the reference image, where the epipolar line is defined by the relative location of the center of the camera that captured the reference image and the center of the camera that captured the given alternate view image.


In a further additional embodiment of the method of the invention, the plurality of images forms a plurality of images in each of a plurality of color channels; and a specific set of image regions is formed by image regions from the plurality of images within one of the plurality of color channels.


In another additional embodiment of the method of the invention, the plurality of images forms a reference image and a plurality of alternate view images and said plurality of images from one of the plurality of color channels does not include the reference image; and the specific set of image regions is further formed by: the image regions from each of the alternate view images within said one of the plurality of color channels that correspond to a specific image region from the reference image; and any additional image region in a given alternate view image from said one of the plurality of color channels that contains at least one pixel located within a predetermined maximum parallax shift distance along an epipolar line from a pixel within the image region of the given alternate view image that corresponds to the selected image region from the reference image, where the epipolar line is defined by the relative location of the center of the camera that captured the reference image and the center of the camera that captured the given alternate view image.


A still yet further embodiment of the method of the invention also includes detecting a defective camera array using the image processing system when the number of localized defects in a second set of image regions exceeds a second predetermined threshold, where the second set of image regions is formed by image regions from the plurality of images within a second of the plurality of color channels.


In still yet another embodiment of the method of the invention, said predetermined criterion used with respect to said specific set of image regions from said one of the plurality of color channels is different from said second predetermined criterion used with respect to said second set of image regions from said second of the plurality of color channels.


A still further embodiment again of the method of the invention includes dividing the image field of each of the plurality of lens stacks into a plurality of corresponding regions using an optical test instrument; measuring the Modulation Transfer Function (MTF) of a known target using the optical test instrument in each of the regions; identifying the presence of at least one localized defect in at least one of the plurality of lens stacks by evaluating the MTF measurements of the regions in the plurality of lens stacks in accordance with at least one predetermined localized defect criterion using the optical test instrument; and detecting a defective optic array using the optical test instrument when the number of localized defects in a specific set of regions exceeds a predetermined threshold. In addition, the specific set of regions is formed by: a common corresponding region from at least a subset of the plurality of lens stacks; and any additional region in a given lens stack that forms an image within a predetermined maximum parallax shift distance along an epipolar line from said common corresponding region within the given lens stack, where the epipolar line is defined by the relative location of the center of the given lens stack and a predetermined viewpoint.


Another further embodiment of the method of the invention includes capturing image data using a camera array comprising a plurality of cameras, where at least one of the plurality of cameras includes a known localized defect impacting image data captured by the camera; disregarding image data within a region of an image captured by the at least one of the plurality of cameras that includes a known localized defect using a processor configured by a super-resolution image processing application, where the discarded image data is from a region of the camera that is known to include said known localized defect; and synthesizing a super-resolution image from the remaining image data captured by the cameras in the camera array using a super-resolution process performed by the processor configured using the super-resolution image processing application.


In still another further embodiment of the method of the invention, the camera array comprises at least one camera known to include at least one defective pixel, and the method further comprises disregarding image data captured by the pixels in the at least one camera that are known to be defective.


Another embodiment of the invention includes: an array camera module comprising a plurality of cameras formed by an imager array comprising a plurality of focal planes and an optic array comprising a plurality of lens stacks, where at least one of the plurality of cameras formed by the imager array and optic array includes a known localized defect impacting image data captured by the camera; a processor; and memory containing a super-resolution image processing application and defect data identifying said at least one of the plurality of cameras that includes a known localized defect and a region of the camera that contains the known localized defect. In addition, the super-resolution processing application configures the processor to: capture image data using the array camera module; with respect to each of said at least one of the plurality of cameras that includes a known localized defect, disregard image data within at least one region identified by the defect data; and synthesize a super-resolution image from the remaining image data.


In still another embodiment of the invention, the memory further comprises defect data identifying at least one defective pixel within the imager array and the super-resolution processing application configures the processor to also disregard image data captured by the at least one pixel identified as defective by said defect data.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 conceptually illustrates a camera array implemented in the form of an array camera.



FIG. 2 conceptually illustrates an array camera module constructed from an optic array and an imager array.



FIG. 3 illustrates the circuitry in an imager array that can be utilized in the construction of an array camera module.



FIG. 4 illustrates circuitry utilized in the independent control and read out of pixel sub-arrays that form focal planes on an imager array that can be utilized in the construction of an array camera module.



FIGS. 5A-5E conceptually illustrate a process for determining whether a camera array is capable of synthesizing an image having acceptable image quality from image data captured by the cameras in the camera array in accordance with embodiments of the invention.



FIG. 6 is a process for determining whether a camera array is defective due to the presence of localized defects in predetermined parallax uncertainty zones in accordance with an embodiment of the invention.



FIG. 7 is a process for determining whether a region of a camera is defective based upon the presence of defective pixels within the region of the camera in accordance with an embodiment of the invention.



FIG. 8 is a process for determining whether a region of a camera is defective based upon measurements of the MTF of the camera in the region in accordance with an embodiment of the invention.



FIG. 9 is a process for selecting a reference camera within a camera array based upon the reference camera being free from defective regions in accordance with an embodiment of the invention.



FIG. 10 is a process for determining whether an optic array is defective based upon the presence of localized regions of MTF that do not pass an acceptance criterion in predetermined parallax uncertainty zones in accordance with an embodiment of the invention.



FIG. 11 is a process for synthesizing a super-resolution image from image data captured by a camera array including defects using information concerning the location of the defective regions to disregard captured image data that are impacted by the defects in accordance with an embodiment of the invention.





DETAILED DESCRIPTION OF THE DRAWINGS

Turning now to the drawings, systems and methods for detecting defective camera arrays, optic arrays, and/or sensors used in the construction of array camera modules in accordance with embodiments of the invention are illustrated. A variety of defects can arise during the manufacture of a conventional digital camera that captures images using a single aperture, including (but not limited to) defects in the camera optics that result in unacceptable Modulation Transfer Function (MTF) performance, defective pixels in the camera's sensor, and/or defects in the assembly of the optics and sensor to form the camera. With respect to the discussion below, the term defect is used to refer to any aspect of a camera (including the sensor, optics, and/or assembly) or optic array that negatively impacts the image formed and/or image data captured by the camera. Even when a defect is localized, the defect can render the camera unsuitable for use, as the localized defect will result in unsatisfactory image quality in the impacted region of every image captured by the camera. As is discussed below, systems and methods in accordance with embodiments of the invention utilize knowledge of the image processing used to synthesize images from images captured by camera arrays to determine whether localized defects in specific cameras in an array can be tolerated. In this way, yield can be improved in the manufacture of camera arrays by utilizing camera arrays that contain defects that will not impact the performance of the camera array.


A variety of camera arrays and processes for manufacturing camera arrays are disclosed in U.S. patent application Ser. No. 12/935,504, entitled “Capturing and Processing of Images using Monolithic Camera Array with Heterogeneous Imagers”, filed May 20, 2009, the disclosure of which is incorporated by reference herein in its entirety. Multiple images of a scene can be captured by a camera array and utilized to synthesize a higher resolution (super-resolution) image of the scene. Fusion and super-resolution processes that can be utilized to generate super-resolution images using images captured by a camera array are disclosed in U.S. patent application Ser. No. 12/967,807, entitled “Systems and Methods for Synthesizing High Resolution Images Using Super-Resolution Processes”, filed Dec. 14, 2010, the disclosure of which is incorporated herein by reference in its entirety.


A portion of an image synthesized using a super-resolution process typically includes image data from multiple images captured by a camera array. In many instances, the complete set of images captured by a camera array is not required to achieve acceptable image quality in a region of a synthesized image. Manufacture of camera arrays results in many of the same defects that are experienced during the manufacture of conventional cameras. One approach to determining whether a camera array is defective is to identify cameras within the camera array that contain defects and to identify the entire camera array as defective when a predetermined threshold number of defective cameras is exceeded. For array cameras that include sets of cameras that form different color channels, such as those disclosed in U.S. patent application Ser. No. 12/935,504, the number of defective cameras in each color channel can be evaluated with respect to separate predetermined thresholds in order to determine whether the camera array as a whole is defective. For color channels that include fewer cameras, a smaller number of defective cameras may be tolerated. Although rejecting camera arrays as defective based on the number of cameras containing defects is effective, the process may reject camera arrays that could still be utilized to synthesize images of acceptable image quality (despite the presence of a predetermined number of localized defects within a color channel). Increased manufacturing yield can be achieved by identifying the portions of the images captured by the camera array that are impacted by defects of some cameras and evaluating whether sufficient reliable image data remains for that region from all the remaining cameras to synthesize an image. If sufficient reliable image data remains to synthesize an image, then the camera array can be utilized irrespective of the total number of cameras impacted by localized defects.


In several embodiments, a determination that a camera array is defective is made by dividing each of the images of a scene (typically a known target) captured by the cameras in the camera array into corresponding regions and determining which regions contain pixels whose image data is likely to be fused during image processing to form regions of a synthesized image. In many embodiments, the image processing involves performing a super-resolution process involving parallax detection and correction to synthesize a super-resolved image. In other embodiments, any of a variety of image processing techniques can be utilized, including (but not limited to) processes that synthesize stereo-pairs of super-resolution images, video sequences synthesized using a subset of cameras in the camera array, and/or high frame rate video sequences where different frames of video are synthesized using image data captured by different subsets of the camera array. In the event that a predetermined number of the regions that are likely to be fused to form a region of a synthesized image are impacted by localized defects, then the camera array can be determined to be defective. In the context of a super-resolution process, regions that are likely to be fused to form a specific region of a synthesized image can be identified using the maximum parallax shifts that are likely to be observed between images captured by the cameras in the camera array. In several embodiments, one of the cameras in a camera array is selected as a reference camera and the remaining cameras are considered alternate view cameras. In certain embodiments, the reference camera is selected in accordance with criteria including that the reference camera does not include any defects. When a region of an image captured by the reference camera is considered, the maximum parallax shifts along epipolar lines of the pixels in the region define so-called “parallax uncertainty zones” within each of the alternate view images. A determination concerning whether a camera array is defective can be made by counting the number of defects impacting pixels within the parallax uncertainty zones associated with each region within the image captured by the reference camera. Where the cameras in a camera array form multiple color channels, separate criteria based upon parallax shifts can be applied to evaluate the impact of the localized defects present in the cameras of each color channel.


As indicated above, a variety of defects can occur during component manufacture and assembly of a camera or camera array. In several embodiments, the process of evaluating whether a camera array is defective can involve evaluating the cameras in the camera array for several different types of defects, including (but not limited to) defects in camera optics, defects in the pixels of camera sensors, and defects in the assembly of the camera optics and sensors. In a number of embodiments, the size of the regions of the images considered when evaluating the impact of specific types of localized defects can differ. In many embodiments, the regions considered when evaluating a camera's optics in a given region of an image are larger than the regions considered when evaluating the impact of defective pixels in the camera's sensor. In general, the smaller the regions considered (i.e. the larger the number of regions considered) during the defect detection process, the higher the anticipated yield, up to the point at which the process identifies all camera arrays in which the defects present can be tolerated by the super-resolution processing, and rejects as defective all camera arrays where the defects result in insufficient reliable image data for reliably performing super-resolution processing.


Evaluating the likely impact of localized defects based upon anticipated parallax shifts can improve overall manufacturing yields, because camera arrays are not rejected as defective simply based upon a predetermined number of cameras in the camera array containing defects. Similar techniques can be utilized to evaluate optic arrays utilized in the construction of array camera modules (similar to the array camera modules discussed in U.S. patent application Ser. No. 12/935,504). In many embodiments, a Modulation Transfer Function (MTF) measurement can be made with respect to different regions of the images formed by each lens stack in an optic array. MTF is generally the most relevant measurement of optical performance, and is taken as an objective measurement of the ability of an optical system to transfer various levels of detail (or spatial frequency) from an object to an image. The MTF is measured in terms of contrast (degrees of gray), or of modulation, produced from a perfect source of that detail level (thus it is the ratio of contrast between the object and the image). The amount of detail in an image is given by the resolution of the optical system, and is customarily specified as spatial frequency in line pairs per millimeter (lp/mm). A line pair is one cycle of a light bar and dark bar of equal width and has a contrast of unity. Contrast can be defined as the ratio of the difference between maximum intensity (Imax) and minimum intensity (Imin) to the sum of Imax and Imin, where Imax is the maximum intensity produced by an image (white) and Imin is the minimum intensity (black). The MTF, then, is the plot of contrast, measured in percent, against spatial frequency measured in lp/mm. The impact of errors in the lens such as (but not limited to) centering errors, form errors, and/or thickness errors that negatively impact MTF to the point at which a region of a lens stack is considered to contain a defect (i.e. MTF measurements that fail to satisfy one or more predetermined criteria) can be evaluated based upon anticipated parallax shifts during super-resolution processing. In this way, manufacturing yield can be increased by considering the regions of images impacted by defects as opposed to simply the number of defects in an optic array. Systems and methods for detecting defective camera arrays and/or optic arrays, and techniques for synthesizing super-resolution images from images captured by array cameras containing localized defects in accordance with embodiments of the invention, are discussed further below.
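In the notation used above, the contrast and MTF definitions can be written compactly as

```latex
\mathrm{Contrast} = \frac{I_{\max} - I_{\min}}{I_{\max} + I_{\min}},
\qquad
\mathrm{MTF}(\nu) = \frac{\mathrm{Contrast}_{\mathrm{image}}(\nu)}{\mathrm{Contrast}_{\mathrm{object}}(\nu)},
```

where the spatial frequency ν is expressed in lp/mm and the MTF is conventionally reported as a percentage.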


Camera Arrays


While much of the discussion that follows refers to systems and methods for screening for defective camera arrays, it is worthwhile initially reviewing the construction of camera arrays, the defects that can occur in the construction of camera arrays, and the manner in which information concerning localized defects can be utilized when synthesizing super-resolution images to avoid corruption of the synthesized super-resolution image by pixels impacted by the localized defects. Camera arrays can be implemented in a variety of ways including (but not limited to) as a set of discrete cameras, or as an array camera. Array cameras typically include an array camera module and a processor.


An array camera that is configured to synthesize super-resolution images in a manner that involves disregarding image data impacted by localized defects in the cameras in the array camera in accordance with an embodiment of the invention is illustrated in FIG. 1. The array camera 100 includes an array camera module 102 including an array of individual cameras 104, where an array of individual cameras refers to a plurality of cameras in a particular arrangement, such as (but not limited to) the square arrangement utilized in the illustrated embodiment. The array camera module 102 is connected to the processor 106 and the processor 106 is connected to a memory 108. In a number of embodiments, the memory contains a super-resolution image processing application that is configured to synthesize a super-resolution image using image data captured by the camera module 102 using a process such as (but not limited to) one of the processes outlined in U.S. patent application Ser. No. 12/967,807. In several embodiments, the memory 108 contains information concerning image data captured by the camera module 102 that is unreliable due to localized defects in the individual cameras 104 within the camera module 102. The information can be in the form of regions of images that can be disregarded and/or individual pixels or clusters of pixels that can be disregarded. The super-resolution image processing application can utilize the information concerning the image data that is unreliable in the captured images to disregard the unreliable image data when performing super-resolution processing.
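A minimal sketch of how such stored defect information could be applied before fusion follows; the array shapes and the defect-data layout are assumptions made for illustration, not the patent's data format:

```python
import numpy as np

def validity_mask(image_stack, defect_data):
    """image_stack: (num_cameras, height, width) captured image data.
    defect_data: mapping from camera index to a list of (row_slice,
    col_slice) regions known to be impacted by localized defects.
    Returns a boolean mask the super-resolution fusion step can consult
    to disregard unreliable pixels."""
    valid = np.ones(image_stack.shape, dtype=bool)
    for cam, regions in defect_data.items():
        for rows, cols in regions:
            valid[cam, rows, cols] = False
    return valid

# Example: disregard the center region of camera 3 in a 4x4 array.
stack = np.zeros((16, 120, 160))
mask = validity_mask(stack, {3: [(slice(40, 80), slice(53, 107))]})
```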


Although a specific array camera is illustrated in FIG. 1, any of a variety of different array camera configurations can be utilized in accordance with many different embodiments of the invention. Furthermore, the basic configuration shown in FIG. 1 can also be utilized in an image processing system that can be utilized to detect the presence of localized defects in a camera module and to determine whether the localized defects result in the overall camera module being defective for the purposes of synthesizing super-resolution images using the image data captured by the individual cameras within the camera module.


Array Camera Modules


The defects that can be present in array cameras typically arise from the manner in which array camera modules are constructed. Array camera modules, such as the array camera modules discussed above with respect to FIG. 1, can be constructed from an imager array and an optic array in the manner illustrated in FIG. 2. The camera module 200 includes an imager array 230 including an array of focal planes 240 along with a corresponding optic array 210 including an array of lens stacks 220. Within the array of lens stacks, each lens stack 220 creates an optical channel that forms an image of the scene on an array of light sensitive pixels within a corresponding focal plane 240. Each pairing of a lens stack 220 and focal plane 240 forms a single camera 104 within the array camera module. Each pixel within a focal plane 240 of a camera 104 generates image data that can be sent from the camera 104 to the processor 106. In many embodiments, the lens stack within each optical channel is configured so that pixels of each focal plane 240 sample the same object space or region within the scene. In several embodiments, the lens stacks are configured so that the pixels that sample the same object space do so with sub-pixel offsets to provide sampling diversity that can be utilized to recover increased resolution through the use of super-resolution processes.


In several embodiments, color filters in individual cameras can be used to form multiple color channels within the array camera module. In this way, cameras can be used to capture data with respect to different colors, or a specific portion of the spectrum. In contrast to applying color filters to the pixels of the camera, color filters in many embodiments of the invention can be included in the lens stack. For example, a green color camera can include a lens stack with a green light filter that allows green light to pass through the optical channel. In many embodiments, the pixels in each focal plane are the same and the light information captured by the pixels is differentiated by the color filters in the corresponding lens stack for each focal plane. Although a specific construction of a camera module with an optic array including color filters in the lens stacks is described above, camera modules can be implemented in a variety of ways including (but not limited to) by applying color filters to the pixels of the focal planes of the camera module similar to the manner in which color filters are applied to the pixels of a camera that uses a conventional Bayer color filter pattern. In several embodiments, at least one of the cameras in the camera module can include uniform color filters applied to the pixels in its focal plane. In many embodiments, a Bayer filter pattern is applied to the pixels of one of the cameras in a camera module. In a number of embodiments, camera modules are constructed in which color filters are utilized in both the lens stacks and on the pixels of the imager array.


The defects that can be present in a camera module include (but are not limited to) defective pixels, a lens stack including one or more lens surfaces that deviate from the relevant prescriptions for the surfaces, and defects associated with the manner in which the sensor and the optic array are combined to form the camera module. The types of defective pixels that may be present can include (but are not limited to) hot pixels (pixels that generate a signal above a predetermined mean dark signal when the sensor array is not illuminated), bright pixels (pixels that produce values that exceed a predetermined threshold above the values produced by neighboring pixels under similar illumination conditions), and dark pixels (pixels that produce values lower than a predetermined threshold below the values produced by neighboring pixels under similar illumination conditions). The specific types of pixel defects that can be detected in accordance with embodiments of the invention typically depend upon the requirements of a specific application. As noted above, a variety of characteristics of the optics of a camera can result in sufficient deterioration to be considered defects in accordance with embodiments of the invention. In many embodiments, defects in a region of a lens can be detected by measuring whether one or both of the tangential and/or sagittal MTF components (sometimes referred to as the horizontal and vertical components) fail to exceed one or more predefined thresholds. Additional defects that can be detected include (but are not limited to) blemishes in the optics and/or that result from assembly. Although specific categories of localized defect are described above, processes in accordance with embodiments of the invention can evaluate the impact of any of a variety of defects that are localized to a region of a camera within a camera array on the performance of the camera array using information concerning the manner in which images will be synthesized from the image data captured by the camera array.
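The three pixel-defect categories described above could be detected with calibration captures along the following lines; this is a sketch under assumed test conditions, and every threshold and window size is a placeholder:

```python
import numpy as np
from scipy.ndimage import uniform_filter  # assumed available

def classify_defective_pixels(dark_frame, flat_frame,
                              hot_offset=50, bright_offset=40, dark_offset=40):
    """dark_frame: capture with the sensor not illuminated.
    flat_frame: capture under uniform illumination.
    Returns a boolean map of hot, bright, and dark pixels."""
    # Hot: signal above a predetermined offset over the mean dark signal.
    hot = dark_frame > dark_frame.mean() + hot_offset
    # Bright/dark: deviate beyond a threshold from neighboring pixels under
    # similar illumination; a local mean stands in for the neighborhood.
    local = uniform_filter(flat_frame.astype(float), size=5)
    bright = flat_frame > local + bright_offset
    dark = flat_frame < local - dark_offset
    return hot | bright | dark
```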


Although specific array camera modules and defects that can occur during the manufacture of array camera modules are discussed above, many different array camera modules can be constructed and systems and methods in accordance with embodiments of the invention can detect the presence of the types of defects that typically arise in the construction of a specific type of array camera module. In order to provide some additional background concerning the operation of imager arrays and the manner in which the imager arrays capture image data for use in super-resolution processing, imager arrays that can be utilized in the construction of array camera modules in accordance with embodiments of the invention are discussed further below.


Imager Arrays


An imager array that can be utilized in the construction of an array camera module and in which the image capture settings of a plurality of focal planes can be independently configured is illustrated in FIG. 3. The imager array 300 includes a focal plane array core 302 that includes an array of focal planes 304 and all analog signal processing, pixel level control logic, signaling, and analog-to-digital conversion (ADC) circuitry. The imager array also includes focal plane timing and control circuitry 306 that is responsible for controlling the capture of image information using the pixels. In a number of embodiments, the focal plane timing and control circuitry utilizes reset and read-out signals to control the integration time of the pixels. In other embodiments, any of a variety of techniques can be utilized to control integration time of pixels and/or to capture image information using pixels. In many embodiments, the focal plane timing and control circuitry 306 provides flexibility of image information capture control, which enables features including (but not limited to) high dynamic range imaging, high speed video, and electronic image stabilization. In various embodiments, the imager array includes power management and bias generation circuitry 308. The power management and bias generation circuitry 308 provides current and voltage references to analog circuitry such as the reference voltages against which an ADC measures the signal to be converted. In many embodiments, the power management and bias circuitry also includes logic that turns off the current/voltage references to certain circuits when they are not in use for power saving reasons. In several embodiments, the imager array includes dark current and fixed pattern noise (FPN) correction circuitry 310 that increases the consistency of the black level of the image data captured by the imager array and can reduce the appearance of row temporal noise and column fixed pattern noise. In several embodiments, each focal plane includes reference pixels for the purpose of calibrating the dark current and FPN of the focal plane and the control circuitry can keep the reference pixels active when the rest of the pixels of the focal plane are powered down in order to increase the speed with which the imager array can be powered up by reducing the need for calibration of dark current and FPN.


In many embodiments, a single self-contained chip imager array includes focal plane framing circuitry 312 that packages the data captured from the focal planes into a container file and can prepare the captured image data for transmission. In several embodiments, the focal plane framing circuitry includes information identifying the focal plane and/or group of pixels from which the captured image data originated. In a number of embodiments, the imager array also includes an interface for transmission of captured image data to external devices. In the illustrated embodiment, the interface is a MIPI CSI 2 output interface (as specified by the non-profit MIPI Alliance, Inc.) supporting four lanes that can support read-out of video at 30 fps from the imager array and incorporating data output interface circuitry 318, interface control circuitry 316 and interface input circuitry 314. Typically, the bandwidth of each lane is optimized for the total number of pixels in the imager array and the desired frame rate. The use of various interfaces including the MIPI CSI 2 interface to transmit image data captured by an array of imagers within an imager array to an external device in accordance with embodiments of the invention is described in U.S. Pat. No. 8,305,456, entitled “Systems and Methods for Transmitting Array Camera Data”, issued Nov. 6, 2012, the disclosure of which is incorporated by reference herein in its entirety.


Although specific components of an imager array architecture are discussed above with respect to FIG. 3, any of a variety of imager arrays can be constructed in accordance with embodiments of the invention that enable the capture of images of a scene at a plurality of focal planes. Independent focal plane control that can be included in imager arrays in accordance with embodiments of the invention is discussed further below.


Independent Focal Plane Control


Imager arrays in accordance with embodiments of the invention can include an array of focal planes that can independently be controlled. In this way, the image capture settings for each focal plane in an imager array can be configured differently. An imager array including independent control of image capture settings and independent control of pixel readout in an array of focal planes in accordance with an embodiment of the invention is illustrated in FIG. 4. The imager array 400 includes a plurality of focal planes or pixel sub-arrays 402. Control circuitry 403, 404 provides independent control of the exposure timing and amplification gain applied to the individual pixels within each focal plane. Each focal plane 402 includes independent row timing circuitry 406, 408, and independent column readout circuitry 410, 412. In operation, the control circuitry 403, 404 determines the image capture settings of the pixels in each of the active focal planes 402. The row timing circuitry 406, 408 and the column readout circuitry 410, 412 are responsible for reading out image data from each of the pixels in the active focal planes. The image data read from the focal planes is then formatted for output using an output and control interface 416.


Although specific imager array configurations are discussed above with reference to FIG. 4, any of a variety of imager array configurations including independent and/or related focal plane control can be utilized in accordance with embodiments of the invention including those outlined in U.S. patent application Ser. No. 13/106,797, entitled “Architectures for Imager Arrays and Array Cameras”, filed May 12, 2011, the disclosure of which is incorporated by reference herein in its entirety. As is discussed further below, the image data captured by an imager array can be utilized to detect localized defects in the cameras formed by an array camera module and to evaluate whether the defects will ultimately render the entire array camera module defective.


Evaluating Defects in Camera Arrays


Camera arrays can capture information in multiple color channels or spectral bands, where specific cameras are configured to only capture image data within a single color channel or spectral band. A 4×4 camera array that is configured to capture red, green, and blue image data is conceptually illustrated in FIG. 5A. As noted above, super-resolution processes including (but not limited to) the process disclosed in U.S. patent application Ser. No. 12/967,807 can be utilized to take image data captured by each of the cameras in the camera array and synthesize a super-resolution image. The process of synthesizing a super-resolution image from images captured by multiple cameras having different viewpoints involves identifying pixel shifts that can be applied to the image data to shift all of the captured image data to a single viewpoint. Referring to the camera array illustrated in FIG. 5A, a reference camera 500 can be designated and all of the remaining cameras can be considered alternate view cameras. One approach to determining the appropriate shifts to apply to the pixels of the images captured by the alternate view cameras to shift the pixels into the viewpoint of the reference camera is to determine the distance to objects within the scene captured by the reference camera. These distances can then be used to determine the anticipated parallax shifts in the alternate view images, which can then be corrected. The parallax shifts in the alternate view images will typically occur along epipolar lines, which are determined based upon the relative locations of the centers of the reference camera and the alternate view camera. Processes for detecting and correcting for parallax shifts are disclosed in U.S. Provisional Patent Application Ser. No. 61/780,906, entitled “Systems and Methods for Parallax Detection and Correction in Images Captured Using Array Cameras”, filed Mar. 13, 2013, the disclosure of which is incorporated by reference herein in its entirety. As is discussed in U.S. Provisional Patent Application Ser. No. 61/780,906, disparity searches performed when conducting parallax detection and correction can be bounded based upon a maximum observed parallax shift. As is discussed further below with reference to FIGS. 5B-5E, an appreciation of these bounds can be utilized to determine whether localized defects in the cameras of a camera array will impact the image quality of images synthesized using images captured by the camera array.
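The bound on these disparity searches follows from the standard pinhole-camera disparity relation: for a pair of cameras with baseline b and focal length f imaging an object at distance z, the parallax shift d along the epipolar line is

```latex
d = \frac{b\,f}{z}, \qquad d_{\max} = \frac{b\,f}{z_{\min}},
```

so the maximum observed parallax shift corresponds to the minimum object distance the camera array is designed to support.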


Using Bounds on Parallax Shifts to Evaluate Impact of Localized Defects


Systems and methods for screening camera arrays for defects in accordance with many embodiments of the invention attempt to evaluate whether the image data captured by a camera array includes sufficient reliable image data to reliably synthesize a super-resolution image. In several embodiments, the sufficiency of the captured image data is determined by considering the super-resolution image as a set of regions synthesized from image data captured by pixels in regions in each of the images captured by the camera array. While the locations of the regions in the images correspond with the locations of the regions in the super-resolution image, it should be noted that the effects of parallax can mean that a region of a super-resolution image can be synthesized from image data captured from more than just the corresponding regions of the images captured by the camera array. The process of synthesizing a region of the super-resolution image involves shifting all of the image data captured by the cameras in the camera array to the viewpoint from which the super-resolution image is synthesized, which can include shifting image data captured by pixels from multiple regions of a camera. Although much of the discussion that follows assumes that the super-resolution image is synthesized from the viewpoint of a reference camera, super-resolution images can also be synthesized from virtual viewpoints, in which case parallax corrections are applied to all of the image data.


The reliability with which the region of the super-resolution image can be synthesized based upon captured image data can be evaluated by identifying pixels in the image data that could be utilized to synthesize the super-resolution image and which may be impacted by a localized defect. As noted above, the parallax shifts that are likely to be observed in captured image data are typically bounded. Therefore, these maximum parallax shift bounds can be utilized to identify pixels in image data captured by specific cameras within an array that could be utilized to synthesize a region of a super-resolution image depending upon the nature of a scene. The specific pixels that will be utilized to synthesize a region of a super-resolution image will typically depend upon the distance(s) to objects within the scene that are visible within the synthesized region of the super-resolution image. Regions of the images captured by specific cameras within a camera array that contain pixels that could be utilized to synthesize a region of a super-resolution image (identified based upon the parallax shift bounds) can be referred to as parallax uncertainty zones with respect to the region of the super-resolution image. These parallax uncertainty zones contain the pixels that could be utilized to synthesize the associated region of the super-resolution image under all possible imaging conditions (i.e. across all possible object distances). By identifying the number of localized defects (if any) that impact pixels contained within the parallax uncertainty zones, systems and methods in accordance with embodiments of the invention can identify the amount of image data that must be disregarded during the synthesis of the region of the super-resolution image. If the amount of image data that must be disregarded (i.e. the number of localized defects impacting pixels contained within the parallax uncertainty zones) exceeds a predetermined amount, then the camera array can be determined to be defective for the purpose of synthesizing super-resolution images. Although much of the discussion that follows focuses on synthesizing super-resolution images, similar processes can be utilized to evaluate whether sufficient reliable image data is available to synthesize other types of image(s) such as (but not limited to) stereo-pairs of super-resolution images, and/or video sequences synthesized using a subset of cameras in the camera array.


In order to provide a concrete example of the processes outlined above, a process for determining whether localized defects in the cameras of the 4×4 camera array illustrated in FIG. 5A render the camera array defective for the purpose of synthesizing super-resolution images from the viewpoint of the reference camera 500 in accordance with an embodiment of the invention is conceptually illustrated in FIGS. 5B-5E. The 4×4 camera array illustrated in FIG. 5A includes cameras that capture red, green, and blue image data. In evaluating the camera array, each color channel can be considered separately. Referring first to FIG. 5B, the sufficiency of the reliable image data captured by cameras within the green color channel is considered with respect to a region of the super-resolution image defined by dividing the super-resolution image into a 3×3 grid and dividing each of the images captured by the camera array into corresponding 3×3 grids. Although regions defined using 3×3 grids are utilized to illustrate the process shown in FIGS. 5B-5E, the number of regions can be selected based upon the requirements of specific applications and, as is discussed further below, the size of the regions can differ when considering different types of defects that may be present within a given camera array. Due to the fact that the super-resolution image is synthesized from the viewpoint of the reference camera 500, the region of the super-resolution image that is being considered corresponds to a region 502 of the reference camera (i.e. the anticipated parallax shifts to shift image data captured by pixels of the reference camera into the viewpoint of the synthesized super-resolution image are zero). In the illustrated embodiment, epipolar lines 504 and maximum parallax shifts are utilized to identify regions within the alternate view green cameras that contain pixels that could potentially capture image data that could be utilized to synthesize the region of the super-resolution image under all possible imaging conditions. In the illustrated embodiment, a maximum parallax shift is assumed that is approximately equal to the relevant dimension of one of the regions (i.e. somewhere between the length of the diagonal and the length of an edge of the region depending on the location of the camera within the array). In actual applications, the maximum parallax shift that is observed typically depends upon the location of an alternate view camera relative to the reference camera. In certain embodiments, different maximum parallax shifts are utilized based upon camera location. In other embodiments, the same maximum parallax shift is utilized irrespective of the camera location to simplify analysis. The specific parallax shift(s) that are assumed typically depend upon the spacing and focal length of the cameras in a specific camera array.
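For this 3×3 example, the set of regions in one alternate-view camera that can contribute to a given reference region could be approximated as follows. The geometry here is deliberately simplified for illustration: the sketch tracks only the region center along the epipolar line, whereas a full implementation would consider every pixel in the region, and all names are hypothetical:

```python
import numpy as np

def approx_uncertainty_zone(region, cam_pos, ref_pos, max_shift, grid=3):
    """region: index into the grid x grid tiling of the reference image.
    cam_pos/ref_pos: 2-D camera-center positions defining the epipolar
    direction; max_shift: maximum parallax shift in region widths."""
    row, col = divmod(region, grid)
    direction = np.subtract(ref_pos, cam_pos).astype(float)
    direction /= np.linalg.norm(direction)
    zone = set()
    for s in np.linspace(0.0, max_shift, 32):  # sample candidate disparities
        r = int(np.clip(row + s * direction[1], 0, grid - 1))
        c = int(np.clip(col + s * direction[0], 0, grid - 1))
        zone.add(r * grid + c)
    return zone

# Center region, alternate-view camera one baseline to the left of the
# reference, maximum shift of one region width.
print(approx_uncertainty_zone(4, cam_pos=(0, 0), ref_pos=(1, 0), max_shift=1.0))
```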


The identified regions within the alternate view green cameras that contain at least one pixel that could potentially capture image data used to synthesize a given region of the super-resolution image define the parallax uncertainty zones 506 for the given region of the super-resolution image. In FIGS. 5B-5E, parallax uncertainty zones are illustrated as shaded regions within each camera. Once the parallax uncertainty zones are identified, the process of determining whether the camera array captures sufficient reliable image data to reliably synthesize super-resolution images under all imaging conditions involves simply counting the number of defects that impact pixels within the parallax uncertainty zones. When the counted number of defects within any of the parallax uncertainty zones exceeds a predetermined number, then the camera array is considered defective. In a number of embodiments, fewer than three localized defects impacting regions within the parallax uncertainty zones can be tolerated. Referring again to FIG. 5B, localized defects are indicated using the symbol “X”. As can be readily appreciated, the presence of three localized defects (X) within the uncertainty zones of the region of the super-resolution image being evaluated would result in the illustrated camera array being considered defective.


The ability to define parallax uncertainty zones in a predetermined manner can simplify processes for detecting defective camera arrays in accordance with embodiments of the invention during the manufacturing of camera arrays. Processes for determining whether camera arrays are defective can simply involve determining regions of the cameras that contain localized defects and then using lookup tables to identify the parallax uncertainty zones to consider when evaluating whether the localized defects render the overall camera array defective for the purpose of synthesizing a desired type of image(s).
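A hypothetical sketch of that lookup-table approach follows; the zone contents are placeholders, since in practice the table would be precomputed from the epipolar geometry of the specific array design:

```python
GRID = 3  # 3x3 regions, as in the example above

def precompute_zone(region, num_cameras=16):
    # Placeholder: a real table would hold, for each reference region, the
    # (camera, region) pairs reachable within the maximum parallax shift.
    return {(cam, region) for cam in range(num_cameras)}

ZONE_LUT = {r: precompute_zone(r) for r in range(GRID * GRID)}

def screen_unit(defective_regions, max_defects=2):
    """Production-line check: look up predetermined uncertainty zones
    rather than recomputing epipolar geometry for every unit tested."""
    return any(len(defective_regions & ZONE_LUT[r]) > max_defects
               for r in ZONE_LUT)
```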


In many embodiments of the invention, the process of evaluating whether a camera array is defective involves evaluating whether regions within parallax uncertainty zones contain localized defects. The regions of a camera that are considered part of a parallax uncertainty zone for a given region of a super-resolution image are regions that contain at least one pixel that could potentially capture image data that could be utilized to synthesize the given region of the super-resolution image under all possible imaging conditions. It is also worth noting that a region that is part of a parallax uncertainty zone can also include pixels that capture image data that will not be utilized by the super-resolution process to synthesize the given region of the super-resolution image under any imaging conditions (i.e. pixels that shift along epipolar lines to different regions of the super-resolution image). For example, region 508 contains such pixels. In the event that the localized defect impacting region 508 does not impact pixels that could potentially capture image data used in the synthesis of the given region of the super-resolution image, then the camera array could theoretically still be used to synthesize super-resolution images of acceptable image quality despite failing to satisfy the criterion outlined above. Accordingly, yield can be further increased by reducing the size of the regions (i.e. using more than a 3×3 grid, e.g. a 6×8 grid and/or any other grid appropriate to the requirements of the invention). As is discussed further below, the size of the regions considered can depend upon the specific type of defect being detected. For example, defective pixels can be identified individually, and so very small regions can be considered when evaluating the impact of defective pixels. By contrast, MTF calculations typically require image data captured by a larger number of pixels. Therefore, larger regions may be utilized when evaluating the impact of defects in the lens stacks of the optic array of an array camera module. In addition, the size, number, and location of the regions used when testing the optic array alone may already be defined by the setup of the MTF testing equipment, e.g. an optical test instrument that uses 9 reticles and corresponding cameras in the tester (one on-axis, 4 at intermediate field heights on the horizontal and vertical axes of the image, and 4 in the corners). Accordingly, a single screening process can utilize different sized regions when evaluating the impact of different types of defects as part of the process of determining whether a camera array is sufficiently reliable to be utilized in synthesizing a desired type of image.


The manner in which parallax uncertainty zones are defined with respect to various regions of a super-resolution image can be further appreciated with reference to FIGS. 5C-5E. With specific regard to FIGS. 5C and 5D, different regions of the super-resolution image are selected corresponding to region 510 and region 520 in the reference camera (shown in FIG. 5C and FIG. 5D respectively). Epipolar lines 512, 522 and maximum parallax shift bounds are utilized to identify parallax uncertainty regions 514, 524 and the number of localized defects impacting pixels within the parallax uncertainty zones 514, 524 can then be determined. When the number of localized defects impacting pixels within the parallax uncertainty zones exceeds a predetermined threshold number, then the camera array can be considered defective for the purpose of synthesizing super-resolution images.


The reference camera 500 in the camera array illustrated in FIG. 5A is a camera that captures image data within a green color channel. The process of synthesizing a super-resolution image can also involve shifting image data captured by cameras within other color channels to the viewpoint of the reference camera. In several embodiments, the process of evaluating whether a camera array can reliably synthesize a super-resolution image involves evaluating whether defects in cameras that are part of a color channel that does not contain the reference camera are likely to result in unacceptable image quality in a super-resolution image synthesized using image data captured by the camera array. The process for evaluating the likely impact of localized defects in cameras that are part of a color channel that does not contain the reference camera is similar to the process outlined above. Epipolar lines and maximum parallax shift bounds are utilized to identify regions within the alternate view cameras within the color channel that constitute parallax uncertainty zones for a specific region of a synthesized super-resolution image. In many instances, the number of cameras within the camera array used to capture image data in different color channels may vary. Therefore, a different threshold may be utilized to determine whether an array camera is defective in each color channel.


A process for evaluating whether the camera array illustrated in FIG. 5A is defective for the purpose of synthesizing a full-color super-resolution image due to localized defects present in cameras that are part of a blue color channel in accordance with embodiments of the invention is conceptually illustrated in FIG. 5E. The process involves selecting a region of the super-resolution image that corresponds to a region 530 of the reference camera 500. Although the region 530 corresponding to the selected region of the super-resolution image is shown in FIG. 5E, the reference camera does not capture image data in the blue color channel and so the reference camera is not considered for the purposes of evaluating the cameras in the blue color channel. The region 530 is simply shown for the purpose of illustrating the manner in which the parallax uncertainty regions within the cameras that are part of the blue channel are determined. Epipolar lines and maximum parallax shift bounds are utilized to identify parallax uncertainty regions in the cameras within the blue channel with respect to the selected region of the super-resolution image. A determination can be made as to the number of localized defects (of any type) that impact regions within the parallax uncertainty zones of the cameras within the blue color channel. Where the number of regions within the parallax uncertainty zones impacted by localized defects exceeds a predetermined threshold, then the camera array can be determined to be defective for the purpose of synthesizing full-color super-resolution images. With specific regard to the 4×4 camera array illustrated in FIG. 5A, the number of defects that can be tolerated in the parallax uncertainty zones of a specific region of a super-resolution image before the camera array is considered defective is a single defect. In other embodiments, the number of localized defects that are tolerated within the parallax uncertainty zones can be determined based upon the requirements of a specific application.


Although the processes discussed above with respect to FIG. 5E are discussed in the context of the cameras that form the blue color channel in the camera array shown in FIG. 5A, similar processes can be applied to determine whether defects in the cameras that form the red color channel compromise the reliability with which super-resolution images can be synthesized using image data captured by the camera array. Furthermore, the above discussion of color channels that do not contain a reference camera is presented primarily with reference to red and blue color channels. Any color channel, however, can be evaluated using processes similar to those outlined above as appropriate to the requirements of specific applications in accordance with embodiments of the invention. Indeed, all color channels captured by a camera array can be evaluated using processes similar to those described above with reference to FIG. 5E when a super-resolution image is synthesized from a virtual viewpoint (i.e. none of the color channels include a camera that captures image data from the viewpoint from which the super-resolution image is synthesized).


Processes for Detecting Defective Camera Arrays


Processes for manufacturing camera arrays, including (but not limited to) camera arrays implemented using an array camera module, can incorporate processes that screen for defective camera arrays. In many embodiments, the screening processes identify defects and the regions within the cameras of the camera array that are impacted by the identified defects. The process can then count the number of regions impacted by defects within specific sets of regions, where each set of regions constitutes the parallax uncertainty zones for a specific region of a super-resolution image that can be synthesized using image data captured by the camera array. In many embodiments, the specific sets of regions can include different sets for each color channel used to synthesize a region of a super-resolution image. In this way, predetermined parallax uncertainty zones can effectively be defined as a set of look up tables (or similar data structures) without the need to continuously perform the calculations to determine the parallax uncertainty zones (which are typically the same for each similar camera array being manufactured and tested).
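Because the geometry is fixed for a given camera array design, the zones can be computed once and stored. A minimal sketch of such a precomputation follows, assuming a helper such as the parallax_uncertainty_zone() function sketched earlier; all names and the data layout are illustrative.

    def build_zone_lut(cameras_by_channel, regions, zone_fn):
        """Precompute, for every color channel and every region of the
        super-resolution image, the set of (camera_id, region) pairs that
        form its parallax uncertainty zone.

        zone_fn(bounds, camera) is assumed to return the zone regions for
        one alternate-view camera (e.g. the earlier sketch).
        """
        lut = {}
        for channel, cameras in cameras_by_channel.items():
            lut[channel] = {}
            for region_id, bounds in regions.items():
                zone = set()
                for camera in cameras:
                    for r in zone_fn(bounds, camera):
                        zone.add((camera["id"], r))
                lut[channel][region_id] = zone
        return lut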


A process for screening camera arrays to identify defective camera arrays in accordance with an embodiment of the invention is illustrated in FIG. 6. The process 600 includes capturing (602) image data of a known target using multiple focal planes. In many embodiments, the target includes features that enable evaluation of captured image data for the purpose of detecting localized defects within the array camera. In a number of embodiments, a target is used that enables local measurement of MTF at multiple field locations such as (but not limited to) targets that incorporate slanted edge targets (for both tangential and sagittal components), bar targets (for both tangential and sagittal components) and/or Siemens stars. In certain embodiments, the specific types of targets are arranged repeatedly so that they are imaged into different regions. The captured image data is divided (604) into regions and any localized defects are identified (606) within the regions. Processes in accordance with embodiments of the invention can screen for multiple different types of defects including (but not limited to) defects in the lens stack of a camera, defects in a camera's sensor, and defects resulting from the incorrect assembly of the camera optics and sensor. As is discussed further below, the process of dividing the captured image data into regions can involve dividing the captured image data into different sized regions for the purpose of evaluating the impact of different types of defects on the image quality of super-resolution images synthesized by the camera array.


In a number of embodiments, a reference camera is selected (608). As is discussed further below, processes in accordance with many embodiments of the invention require that the reference camera utilized in the synthesis of super-resolution images be free of localized defects. Accordingly, the process of selecting a reference camera can involve selecting candidate reference cameras and evaluating whether any of the candidate reference cameras are free from defects. In the event that none of the candidate reference cameras are free from defects, the camera array may be rejected.


The process of screening the camera array can then involve identifying (610) defects that impact image data captured by regions within the parallax uncertainty zones of each region of a super-resolution image that can be synthesized using image data captured by the camera array. As noted above, this can involve utilizing look up tables (or similar rapidly accessible data structures) to count the number of defects that occur in specific sets of regions corresponding to the parallax uncertainty zones (in each color channel) for each region of a super-resolution image that can be synthesized using the camera array. The number of defects in each of the specific sets of regions can then be evaluated to determine (612) whether the number exceeds a predetermined threshold. In many embodiments, different thresholds can be defined for different sets of regions. In several embodiments, different thresholds apply to the sets in each of the different color channels supported by the camera array. When the number of defects in each instance is sufficiently low to satisfy the thresholds, the camera array is determined to be capable of synthesizing super-resolution images of acceptable image quality, and information concerning the defects can be stored for use by the camera array in the subsequent synthesis of super-resolution images. In this way, information concerning defects can be utilized to disregard image data captured by regions of cameras impacted by the defects during the synthesis of super-resolution images. Processes for synthesizing super-resolution images in this manner are discussed further below. In the event that at least one of the defect counts with respect to a specific set of regions exceeds the predetermined threshold, then the camera array is determined to be defective for the purpose of synthesizing super-resolution images having acceptable image quality.
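Once the parallax uncertainty zones are available as a lookup table, the counting and thresholding step can be expressed compactly. A minimal sketch with illustrative names, assuming per-channel thresholds and a set of (camera, region) pairs flagged by the defect tests:

    def screen_camera_array(zone_lut, defective_regions, thresholds):
        """Decide whether a camera array passes screening.

        zone_lut: {channel: {region_id: set of (camera_id, region)}} as
        precomputed above. defective_regions: set of (camera_id, region)
        pairs flagged by the per-region defect tests. thresholds:
        per-channel maximum tolerated defect counts.
        """
        for channel, zones in zone_lut.items():
            for region_id, zone in zones.items():
                if len(zone & defective_regions) > thresholds[channel]:
                    # too many defective regions feed this region of the
                    # super-resolution image
                    return False
        return True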


Although specific processes for determining whether a camera array is defective are described above with respect to FIG. 6, any of a variety of processes can be utilized in accordance with embodiments of the invention, including processes that define sets of regions based upon any of a variety of criteria appropriate to a specific application, including sets that evaluate the reliability of image data used to synthesize other types of images including (but not limited to) stereo pairs of super-resolution images, sequences of images synthesized from sub-arrays of cameras within the camera array, and/or high speed video sequences including successive frames synthesized from different sub-arrays of cameras within the camera array. As can be readily appreciated in view of the above discussion, the specific regions included within the specific sets of regions can be determined based upon the cameras used to synthesize each type of image and using epipolar lines and maximum parallax shift bounds to identify the regions in those cameras that fall within parallax uncertainty zones. Processes for identifying defects in accordance with embodiments of the invention are discussed further below.


Identifying Defective Regions Based Upon Pixel Defects


A region of a camera can be considered defective due to the presence of defective pixels within the region. Pixels can be considered defective for reasons including (but not limited to) the pixels being determined to be hot pixels, bright pixels, or dark pixels. Any of a variety of criteria appropriate to the requirements of specific applications can be utilized to determine whether the presence of defective pixels within a region renders the entire region defective for the purpose of evaluating the camera array. In a number of embodiments, the presence of a predetermined number of defective pixels results in the entire region being considered defective. In several embodiments, the presence of a cluster of defective pixels exceeding a predetermined size within a region results in the entire region being considered defective. In certain embodiments, clusters of pixels that are equal to or smaller than a 2×2 cluster of pixels can be tolerated; however, a cluster of pixels that includes three or more pixels in one dimension results in the entire region being considered defective. In other embodiments, the size of defective pixel clusters that can be tolerated is determined based upon the requirements of specific applications.


A process for determining whether the presence of defective pixels results in a region being considered defective for the purpose of evaluating the performance of a camera array in accordance with an embodiment of the invention is illustrated in FIG. 7. The process 700 includes detecting (702) defective pixels using image data captured within a specific region. A determination (704) is made concerning whether the number of defective pixels exceeds a threshold. If the threshold is exceeded, then the region is considered to be defective (706) for the purpose of evaluating the camera array. In the event that the number of defective pixels does not exceed the predetermined threshold, a separate determination (708) is made concerning whether the size of any clusters of defective pixels exceeds a predetermined threshold. In the event that one or more clusters are present that exceed the maximum size criterion, then the region is considered to be defective (706) for the purpose of evaluating the camera array. Otherwise the region is treated (710) as not being defective despite the presence of defective pixels. In many embodiments, information concerning the defective pixels is stored for use when synthesizing images using image data captured by the region so that the image data captured by the defective pixels can be disregarded.
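A minimal sketch of the region test of FIG. 7 follows, using scipy.ndimage for connected-component labeling of defective pixels; the threshold values shown are illustrative placeholders rather than values prescribed above.

    from scipy import ndimage

    def region_defective(defect_mask, max_defects=8, max_cluster_extent=2):
        """Apply the two tests of FIG. 7 to one region.

        defect_mask: boolean array marking hot, bright, or dark pixels.
        The region fails if the defective pixel count exceeds max_defects,
        or if any cluster of defective pixels spans more than
        max_cluster_extent pixels in either dimension (so 2x2 clusters are
        tolerated but a run of three pixels in one dimension is not).
        """
        if int(defect_mask.sum()) > max_defects:
            return True
        labels, _ = ndimage.label(defect_mask)
        for sl in ndimage.find_objects(labels):
            height = sl[0].stop - sl[0].start
            width = sl[1].stop - sl[1].start
            if max(height, width) > max_cluster_extent:
                return True
        return False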


Although specific processes for determining whether a region of a camera is defective based upon the characteristics of defective pixels present within the region of the camera are described above, any of a variety of processes utilizing any of a variety of criteria appropriate to the requirements of specific applications can be utilized to determine whether a region of a camera is defective for the purpose of evaluating a camera array based upon the number, type, and/or location of defective pixels within the region in accordance with embodiments of the invention. Processes for evaluating whether a region of a camera is defective utilizing MTF measurements in accordance with embodiments of the invention are discussed below.


Identifying Defective Regions Using MTF Measurements


Defects in the optics of a camera can be identified by performing MTF measurements. In several embodiments, defects in regions of a camera that are attributable to defects in the lens stack of the camera can be detected by performing an MTF measurement for the region. Where the MTF measurement diverges from the anticipated MTF of the optics, then MTF failure can be considered to have occurred within the region and the region can be treated as defective for the purpose of evaluating the overall reliability of the camera array.


A process for determining whether a region of a camera is defective when evaluating the overall reliability of a camera array in accordance with an embodiment of the invention is illustrated in FIG. 8. The process 800 includes measuring (802) the MTF of a region of an image captured by the camera. A determination (804) is made concerning whether the MTF measurement indicates that the MTF within the region falls below a predetermined threshold. In many embodiments, when an MTF measurement with respect to a region does not meet a threshold for a certain contrast at a certain spatial frequency, the region of the camera is determined to be defective for the purpose of evaluating the overall performance of the camera array. In the event that the MTF measurement for the region satisfies the predetermined acceptance criterion, then the region is determined (808) not to be defective for the purpose of evaluating the overall performance of the camera array.
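The MTF acceptance test reduces to comparing a measured contrast value against a threshold at the test spatial frequency. A minimal sketch with illustrative names; the MTF measurement itself (e.g. from a slanted-edge computation or from test equipment) is assumed to be available:

    def defective_regions_by_mtf(mtf_by_region, min_contrast):
        """Flag regions whose measured MTF (contrast at the test spatial
        frequency) falls below the acceptance threshold.

        mtf_by_region: {region_id: measured contrast}. Returns the set of
        region ids considered defective for evaluating the camera array.
        """
        return {rid for rid, mtf in mtf_by_region.items()
                if mtf < min_contrast}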


Although specific processes are described above with reference to FIG. 8, any of a variety of processes can be utilized to screen regions of cameras based upon the characteristics of the optics of a camera as appropriate to the requirements of specific applications in accordance with embodiments of the invention.


Processes for Selecting a Reference Camera


In many embodiments, the process of synthesizing a super-resolution image involves selection of a reference camera and synthesizing the super-resolution image from the viewpoint of the reference camera. The camera selected as the reference camera plays an important role in the synthesis of the super-resolution image. Therefore, processes in accordance with a number of embodiments of the invention attempt to select a reference camera that is free from defects and will discard a camera array when none of the cameras that can serve as a reference camera are free of defects.


A process for selecting a reference camera in a camera array in accordance with an embodiment of the invention is illustrated in FIG. 9. The process 900 includes selecting (902) an initial reference camera. A determination (904) is made concerning whether the selected camera is free from defects. In the event that the selected camera is free of defects, then the camera is selected (906) as the reference camera. In the event that the selected camera incorporates one or more defective regions, then the process of selecting (902) candidate reference cameras and evaluating (904) the candidate reference cameras repeats until either a candidate reference camera is found that is free from defects and selected (906) as the reference camera, or all potential candidates are exhausted, in which case the camera array is rejected (910) as defective. Depending upon the construction of the camera array and the requirements of a specific application, there is typically only a subset of cameras in the camera array that can serve as a reference camera.
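A minimal sketch of the selection loop of FIG. 9, with illustrative names, assuming the per-region defect results have already been collected:

    def select_reference_camera(candidates, defective_regions):
        """Return the first candidate camera with no defective regions, or
        None when every candidate has defects and the camera array should
        be rejected.

        candidates: iterable of camera ids, typically only the subset of
        cameras that can serve as a reference camera. defective_regions:
        set of (camera_id, region_id) pairs.
        """
        defective_cameras = {cam for cam, _ in defective_regions}
        for cam in candidates:
            if cam not in defective_cameras:
                return cam
        return None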


Although specific processes for selecting a reference camera are discussed above with reference to FIG. 9, any of a variety of processes appropriate to the requirements of specific applications can be utilized in accordance with an embodiment of the invention.


Screening Optic Arrays


While much of the discussion above has focused on systems and methods for screening camera arrays for defects that will prevent the synthesis of images having acceptable image quality, similar techniques can be utilized to screen optic arrays manufactured for use in array camera modules in accordance with embodiments of the invention. The MTF in multiple regions of each of the lens stacks in an optic array can be measured using an optical test instrument designed to perform MTF testing, such as (but not limited to) the ImageMaster® PRO line of products manufactured by Trioptics GmbH of Wedel, Germany. Furthermore, scripts can be executed on such optical test instruments to detect defective optic arrays using processes that consider the impact that defective regions within the optic array would have on images synthesized using image data captured by a hypothetical array camera incorporating the optic array in accordance with embodiments of the invention. Defects in a lens stack can be localized by separately measuring the MTF of each of a number of regions of each lens stack. Parallax uncertainty zones can be defined with respect to the regions of the lens stacks in the optic array in the same way in which they are defined for regions of cameras in a camera array. By counting regions in the parallax uncertainty zones that have MTF measurements that fail to satisfy one or more predetermined MTF criteria, a determination can be made concerning whether the defects in the optics are likely to result in the construction of an array camera module that is incapable of capturing image data from which super-resolution images can be synthesized with acceptable image quality. As with camera arrays, the specific set of regions that forms each parallax uncertainty zone can be stored in a lookup table (or similar data structure) to enable rapid retrieval. In this way, counts can be generated and the appropriate threshold applied with respect to each set to determine whether the optic array is defective.


A process for determining whether an optic array is defective in accordance with an embodiment of the invention is illustrated in FIG. 10. The process 1000 includes measuring (1002) MTFs for different regions of each lens stack in the optic array. Localized defects can be identified (1004) by comparing the MTF measurements to at least one predetermined criterion such as (but not limited to) any of the following thresholds: on-axis MTF at 227 lp/mm > 0.3; all regions at 0.6 relative field height having S-MTF at 227 lp/mm > 0.2 and T-MTF at 227 lp/mm > 0.2; all regions at 0.8 relative field height having S-MTF at 227 lp/mm > 0.15 and T-MTF at 227 lp/mm > 0.1. In other embodiments, any thresholds appropriate to the requirements of specific applications can be utilized. In a number of embodiments, a lens stack is selected as a reference lens stack. As is discussed above, several embodiments of the invention require that a reference camera be free from defects. Accordingly, a lens stack selected as a reference lens stack can also be subject to a requirement that it be free from defects. In the event that no lens stack that can serve as the lens stack of a reference camera is free from defects, then certain embodiments of the invention involve rejecting the optic array as defective.
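A minimal sketch of applying example thresholds of this kind to per-region MTF measurements of one lens stack follows; the measurement data layout and field names are assumptions, not the format of any particular test instrument.

    def lens_stack_defects(measurements):
        """Flag defective regions of one lens stack using the example
        thresholds quoted above (contrast at 227 lp/mm).

        measurements: list of dicts with keys 'field_height', 's_mtf' and
        't_mtf'. Returns the indices of regions failing their threshold.
        """
        defects = []
        for i, m in enumerate(measurements):
            fh = m["field_height"]
            if fh == 0.0:  # on-axis: sagittal and tangential coincide
                failed = m["s_mtf"] <= 0.3
            elif fh == 0.6:
                failed = m["s_mtf"] <= 0.2 or m["t_mtf"] <= 0.2
            elif fh == 0.8:
                failed = m["s_mtf"] <= 0.15 or m["t_mtf"] <= 0.1
            else:
                failed = False  # no threshold defined at this position
            if failed:
                defects.append(i)
        return defects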


The process of screening the optic array can then involve identifying (1008) defects that will impact image data captured within parallax uncertainty zones of each region of a super-resolution image that can be synthesized from the viewpoint of the reference lens stack. As noted above, this can involve utilizing look up tables (or similar rapidly accessible data structures) to count the number of defects that occur in specific sets of regions corresponding to the parallax uncertainty zones (in each color channel) for each region of the reference lens stack. The number of defects in each of the specific sets of regions can then be evaluated to determine (1010) whether the number exceeds a predetermined threshold. In many embodiments, different thresholds can be defined for different sets. In several embodiments, different thresholds apply to the sets in each of the different color channels that will ultimately be formed using the optic array. When the number of defects in each instance is sufficiently low to satisfy the thresholds, the optic array is determined to be suitable for use in the construction of an array camera module. Furthermore, information concerning defects captured during the screening process can be subsequently utilized to disregard image data captured by regions of cameras impacted by the defects in the optic array during the synthesis of super-resolution images. Processes for synthesizing super-resolution images in this manner are discussed further below. In the event that at least one of the defect counts with respect to a specific set of regions exceeds the predetermined threshold, then the optic array is determined to be defective for the purpose of constructing an array camera module.


Although specific processes for determining whether an optic array is defective are described above with respect to FIG. 10, any of a variety of processes can be utilized in accordance with embodiments of the invention, including processes that define sets of regions based upon any of a variety of criteria appropriate to a specific application, including sets that evaluate the reliability of optic arrays based upon synthesizing other types of images including (but not limited to) stereo pairs of super-resolution images, sequences of images synthesized from sub-arrays of cameras within the camera array, and/or high speed video sequences including successive frames synthesized from different sub-arrays of cameras within the camera array. Furthermore, material binning can be utilized to further improve yield by combining optic arrays and sensors based upon the defects present in each component. In this way, combinations can be created that match regions in which localized defects are present in an optic array with localized defects in a sensor to minimize the total number of camera regions that contain localized defects in an array camera module assembled using the optic array and the sensor.


Synthesizing Images Using Camera Arrays Incorporating Defects


Camera arrays that are screened utilizing processes similar to those outlined above, and/or that include optic arrays and/or sensors screened utilizing processes similar to those outlined above, can still contain defects. When image data captured by pixels impacted by the defects is utilized to synthesize an image, a degradation in image quality can result. The process of screening the camera array and/or the optics yields information concerning the regions in the cameras, lens stacks, or sensors containing defects. In several embodiments of the invention, information concerning the defective regions is maintained by the camera array and utilized in the processing of captured image data to synthesize images. In several embodiments, image data captured in a defective region can be disregarded. Where a region is identified as defective, but the location of the specific pixels impacted by the defect is known, only the impacted pixels can be disregarded.


A process for synthesizing a super-resolution image that involves disregarding image data captured by regions and/or pixels impacted by defects in accordance with an embodiment of the invention is illustrated in FIG. 11. The process 1100 includes capturing (1102) image data using the cameras in the camera array. Information concerning defective regions in specific cameras and/or lens stacks, which can take the form of arbitrarily formatted defect data, can be utilized to disregard (1104) image data captured by pixels in the impacted regions of the identified cameras. It is worth noting that when an entire region is disregarded, even the pixels within the region that are not impacted by the defect are disregarded. In many embodiments, the defect data can also contain information concerning individual defective pixels, and image data captured by these pixels can also be disregarded (1106). A super-resolution process can be applied (1108) to the remaining image data and yield a super-resolution image (1110) as an output.
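A minimal sketch of the masking step of FIG. 11, with illustrative names and data layout, builds a per-camera validity mask from stored defect data so that a synthesis process can disregard the affected samples:

    import numpy as np

    def build_validity_masks(images, defect_data, region_bounds):
        """Build per-camera boolean masks marking which samples may be
        used by the synthesis process.

        images: {camera_id: HxW array}. defect_data: {'regions': set of
        (camera_id, region_id), 'pixels': set of (camera_id, y, x)}.
        region_bounds: {region_id: (top, bottom, left, right)}.
        """
        valid = {cam: np.ones(img.shape[:2], dtype=bool)
                 for cam, img in images.items()}
        for cam, region_id in defect_data["regions"]:
            top, bottom, left, right = region_bounds[region_id]
            # disregard the whole region, including unaffected pixels
            valid[cam][top:bottom, left:right] = False
        for cam, y, x in defect_data["pixels"]:
            valid[cam][y, x] = False  # disregard individual defective pixels
        return valid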


Although specific processes for synthesizing super-resolution images are discussed above with respect to FIG. 11, any of a variety of processes for synthesizing images from image data captured by camera arrays that utilize information concerning regions of specific cameras within the camera array that contain defects can be utilized in accordance with embodiments of the invention, including (but not limited to) processes that involve synthesizing stereo pairs of super-resolution images, sequences of images synthesized from sub-arrays of cameras within the camera array, and/or high speed video sequences including successive frames synthesized from different sub-arrays of cameras within the camera array.


While the above description contains many specific embodiments of the invention, these should not be construed as limitations on the scope of the invention, but rather as an example of one embodiment thereof. It is therefore to be understood that the present invention may be practiced otherwise than specifically described, without departing from the scope and spirit of the present invention. Thus, embodiments of the present invention should be considered in all respects as illustrative and not restrictive.

Claims
  • 1. A method for evaluating a camera's suitability as a reference camera to be used in screening a camera array having a plurality of cameras for defectiveness, the method comprising:
capturing image data of a known target using a plurality of cameras, where the known target image data forms a plurality of known target images;
identifying, using an image processing system, localized defects in each of the plurality of known target images;
identifying, using the image processing system, corresponding regions between target images captured by different cameras of the plurality of cameras, wherein the corresponding image regions between the plurality of known target images are determined by searching for correspondence along an epipolar line up to a predetermined maximum parallax shift distance, where the epipolar line is defined parallel to the relative locations of the center of a first camera and the center of a second camera;
identifying, using the image processing system, for at least one region of a target image captured by the first camera, localized defects in corresponding regions of target images captured by a set of at least one other camera of the plurality of cameras that correspond to the at least one region of the target image captured by the first camera; and
evaluating, using the image processing system, the corresponding image regions in accordance with a set of one or more localized defect criteria to determine whether the first camera is suitable as a reference camera.
  • 2. The method of claim 1, wherein evaluating the corresponding image regions comprises discerning whether a set of one or more defective pixels exists in a corresponding region that satisfies a criterion of the set of localized defect criteria.
  • 3. The method of claim 2, wherein the criterion of the set of localized defect criteria is that a number of defective pixels in the set of defective pixels within the corresponding image regions exceeds a predetermined number of defective pixels.
  • 4. The method of claim 2, wherein the criterion of the set of localized defect criteria is that the set of defective pixels includes a cluster of defective pixels that exceeds a predetermined size.
  • 5. The method of claim 2, wherein the set of defective pixels comprises at least one pixel selected from the group consisting of: a hot pixel, a bright pixel, and a dark pixel.
  • 6. The method of claim 1, wherein evaluating the corresponding image regions comprises:
measuring the Modulation Transfer Function (MTF) within each of the corresponding image regions; and
determining whether the MTF of each of the corresponding image regions fails to satisfy at least one criterion of the set of localized defect criteria.
  • 7. The method of claim 6, wherein the at least one defect criterion is that an on-axis MTF at a predetermined spatial frequency exceeds a first threshold, an off-axis tangential MTF at a predetermined spatial frequency exceeds a second threshold, and an off-axis sagittal MTF at a predetermined spatial frequency exceeds a third threshold.
  • 8. The method of claim 5, wherein determining whether the first camera is suitable as a reference camera comprises detecting whether a number of localized defects for the corresponding image regions according to the set of localized defect criteria is greater than zero.
  • 9. The method of claim 5, wherein determining whether the first camera is suitable as a reference camera comprises detecting whether a number of localized defects for the corresponding image regions according to the set of localized defect criteria is greater than a predetermined threshold, wherein the predetermined threshold is one of one, three, five, and ten.
  • 10. The method of claim 5 further comprising:
detecting, using the image processing system, at least one localized defect in the target images captured by the first camera of the plurality of cameras; and
utilizing, using the image processing system, image data corresponding to the location of the at least one localized defect that is captured by at least one other camera of the plurality of cameras when evaluating whether the first camera is suitable as the reference camera.
  • 11. The method of claim 5 further comprising iteratively evaluating each camera of at least a subset of the plurality of cameras to determine each camera's suitability as the reference camera.
  • 12. A non-transitory computer-readable medium including instructions that, when executed by a processing unit, evaluate a camera's suitability as a reference camera to be used in screening a camera array having a plurality of cameras for defectiveness, the instructions comprising:
retrieving captured image data of a known target that was captured using a plurality of cameras, where the known target image data forms a plurality of known target images;
identifying localized defects in each of the plurality of known target images;
identifying corresponding regions between target images captured by different cameras of the plurality of cameras, wherein the corresponding image regions between the plurality of known target images are determined by searching for correspondence along an epipolar line up to a predetermined maximum parallax shift distance, where the epipolar line is defined parallel to the relative locations of the center of a first camera and the center of a second camera;
identifying, for at least one region of a target image captured by the first camera, localized defects in corresponding regions of target images captured by a set of at least one other camera of the plurality of cameras that correspond to the at least one region of the target image captured by the first camera; and
evaluating the corresponding image regions in accordance with a set of one or more localized defect criteria to determine whether the first camera is suitable as a reference camera.
  • 13. The non-transitory computer-readable medium of claim 12, wherein evaluating the corresponding image regions comprises discerning whether a set of one or more defective pixels exists in a corresponding region that satisfies a criterion of the set of localized defect criteria.
  • 14. The non-transitory computer-readable medium of claim 12, wherein evaluating the corresponding image regions comprises:
measuring the Modulation Transfer Function (MTF) within each of the corresponding image regions; and
determining whether the MTF of each of the corresponding image regions fails to satisfy at least one criterion of the set of localized defect criteria.
  • 15. The non-transitory computer-readable medium of claim 12, wherein determining whether the first camera is suitable as a reference camera comprises detecting whether a number of localized defects for the corresponding image regions according to the set of localized defect criteria is greater than zero.
  • 16. The non-transitory computer-readable medium of claim 12, wherein determining whether the first camera is suitable as a reference camera comprises detecting whether a number of localized defects for the corresponding image regions according to the set of localized defect criteria is greater than a predetermined threshold, wherein the predetermined threshold is one of one, three, five, and ten.
  • 17. The non-transitory computer-readable medium of claim 12, wherein the instructions further comprise:
detecting at least one localized defect in the target images captured by the first camera of the plurality of cameras; and
utilizing image data corresponding to the location of the at least one localized defect that is captured by at least one other camera of the plurality of cameras when evaluating whether the first camera is suitable as the reference camera.
  • 18. The non-transitory computer-readable medium of claim 12, wherein the instructions further comprise iteratively evaluating each camera of at least a subset of the plurality of cameras to determine each camera's suitability as the reference camera.
  • 19. A method for evaluating a camera's suitability as a reference camera to be used in screening a camera array having a plurality of cameras for defectiveness, the method comprising:
capturing image data of a known target using a plurality of cameras, where the known target image data forms a plurality of known target images;
identifying, using an image processing system, localized defects in each of the plurality of known target images;
identifying, using the image processing system, corresponding regions between target images captured by different cameras of the plurality of cameras;
identifying, using the image processing system, for at least one region of a target image captured by a first camera, localized defects in corresponding regions of target images captured by a set of at least one other camera of the plurality of cameras that correspond to the at least one region of the target image captured by the first camera; and
evaluating, using the image processing system, the corresponding image regions in accordance with a set of one or more localized defect criteria to determine whether the first camera is suitable as a reference camera, wherein evaluating the corresponding image regions comprises:
measuring the Modulation Transfer Function (MTF) within each of the corresponding image regions; and
determining whether the MTF of each of the corresponding image regions fails to satisfy at least one criterion of the set of localized defect criteria.
  • 20. A non-transitory computer-readable medium including instructions that, when executed by a processing unit, evaluate a camera's suitability as a reference camera to be used in screening a camera array having a plurality of cameras for defectiveness, the instructions comprising:
retrieving captured image data of a known target that was captured using a plurality of cameras, where the known target image data forms a plurality of known target images;
identifying localized defects in each of the plurality of known target images;
identifying corresponding regions between target images captured by different cameras of the plurality of cameras;
identifying, for at least one region of a target image captured by a first camera, localized defects in corresponding regions of target images captured by a set of at least one other camera of the plurality of cameras that correspond to the at least one region of the target image captured by the first camera; and
evaluating the corresponding image regions in accordance with a set of one or more localized defect criteria to determine whether the first camera is suitable as a reference camera, wherein evaluating the corresponding image regions comprises:
measuring the Modulation Transfer Function (MTF) within each of the corresponding image regions; and
determining whether the MTF of each of the corresponding image regions fails to satisfy at least one criterion of the set of localized defect criteria.
CROSS-REFERENCE TO RELATED APPLICATIONS

The current application is a continuation of application Ser. No. 14/805,412 filed Jul. 21, 2015, which is a divisional of application Ser. No. 13/931,724 filed Jun. 28, 2013, which claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application Ser. No. 61/665,724, filed Jun. 28, 2012, the disclosures of which are incorporated herein by reference.

US Referenced Citations (1062)
Number Name Date Kind
4124798 Thompson Nov 1978 A
4198646 Alexander et al. Apr 1980 A
4323925 Abell et al. Apr 1982 A
4460449 Montalbano Jul 1984 A
4467365 Murayama et al. Aug 1984 A
4652909 Glenn Mar 1987 A
4899060 Lischke Feb 1990 A
4962425 Rea Oct 1990 A
5005083 Grage Apr 1991 A
5070414 Tsutsumi Dec 1991 A
5144448 Hornbaker et al. Sep 1992 A
5157499 Oguma et al. Oct 1992 A
5325449 Burt Jun 1994 A
5327125 Iwase et al. Jul 1994 A
5463464 Ladewski Oct 1995 A
5488674 Burt Jan 1996 A
5629524 Stettner et al. May 1997 A
5638461 Fridge Jun 1997 A
5757425 Barton et al. May 1998 A
5793900 Nourbakhsh et al. Aug 1998 A
5801919 Griencewic Sep 1998 A
5808350 Jack et al. Sep 1998 A
5832312 Rieger et al. Nov 1998 A
5833507 Woodgate et al. Nov 1998 A
5880691 Fossum et al. Mar 1999 A
5911008 Niikura et al. Jun 1999 A
5933190 Dierickx et al. Aug 1999 A
5963664 Kumar et al. Oct 1999 A
5973844 Burger Oct 1999 A
6002743 Telymonde Dec 1999 A
6005607 Uomori et al. Dec 1999 A
6034690 Gallery et al. Mar 2000 A
6069351 Mack May 2000 A
6069365 Chow et al. May 2000 A
6095989 Hay et al. Aug 2000 A
6097394 Levoy et al. Aug 2000 A
6124974 Burger Sep 2000 A
6130786 Osawa et al. Oct 2000 A
6137100 Fossum et al. Oct 2000 A
6137535 Meyers Oct 2000 A
6141048 Meyers Oct 2000 A
6160909 Melen Dec 2000 A
6163414 Kikuchi et al. Dec 2000 A
6172352 Liu et al. Jan 2001 B1
6175379 Uomori et al. Jan 2001 B1
6205241 Melen Mar 2001 B1
6239909 Hayashi et al. May 2001 B1
6292713 Jouppi et al. Sep 2001 B1
6340994 Margulis et al. Jan 2002 B1
6358862 Ireland et al. Mar 2002 B1
6373518 Sogawa Apr 2002 B1
6419638 Hay et al. Jul 2002 B1
6443579 Myers Sep 2002 B1
6476805 Shum et al. Nov 2002 B1
6477260 Shimomura Nov 2002 B1
6502097 Chan et al. Dec 2002 B1
6525302 Dowski, Jr. et al. Feb 2003 B2
6552742 Seta Apr 2003 B1
6563537 Kawamura et al. May 2003 B1
6571466 Glenn et al. Jun 2003 B1
6603513 Berezin Aug 2003 B1
6611289 Yu Aug 2003 B1
6627896 Hashimoto et al. Sep 2003 B1
6628330 Lin Sep 2003 B1
6635941 Suda Oct 2003 B2
6639596 Shum et al. Oct 2003 B1
6647142 Beardsley Nov 2003 B1
6657218 Noda Dec 2003 B2
6671399 Berestov Dec 2003 B1
6674892 Melen Jan 2004 B1
6750904 Lambert Jun 2004 B1
6765617 Tangen et al. Jul 2004 B1
6771833 Edgar Aug 2004 B1
6774941 Boisvert et al. Aug 2004 B1
6788338 Dinev Sep 2004 B1
6795253 Shinohara Sep 2004 B2
6801653 Wu et al. Oct 2004 B1
6819328 Moriwaki et al. Nov 2004 B1
6819358 Kagle et al. Nov 2004 B1
6879735 Portniaguine et al. Apr 2005 B1
6897454 Sasaki et al. May 2005 B2
6903770 Kobayashi et al. Jun 2005 B1
6909121 Nishikawa Jun 2005 B2
6917702 Beardsley Jul 2005 B2
6927922 George et al. Aug 2005 B2
6958862 Joseph Oct 2005 B1
6985175 Iwai et al. Jan 2006 B2
7015954 Foote et al. Mar 2006 B1
7085409 Sawhney Aug 2006 B2
7161614 Yamashita et al. Jan 2007 B1
7199348 Olsen et al. Apr 2007 B2
7206449 Raskar et al. Apr 2007 B2
7215364 Wachtel et al. May 2007 B2
7235785 Hornback et al. Jun 2007 B2
7245761 Grossberg et al. Jul 2007 B2
7262799 Suda Aug 2007 B2
7292735 Blake et al. Nov 2007 B2
7295697 Satoh Nov 2007 B1
7333651 Kim et al. Feb 2008 B1
7369165 Bosco et al. May 2008 B2
7391572 Jacobowitz et al. Jun 2008 B2
7408725 Sato Aug 2008 B2
7425984 Chen Sep 2008 B2
7430312 Gu Sep 2008 B2
7471765 Jaffray et al. Dec 2008 B2
7496293 Shamir et al. Feb 2009 B2
7564019 Olsen Jul 2009 B2
7599547 Sun et al. Oct 2009 B2
7606484 Richards et al. Oct 2009 B1
7620265 Wolff Nov 2009 B1
7633511 Shum et al. Dec 2009 B2
7639435 Chiang et al. Dec 2009 B2
7646549 Zalevsky et al. Jan 2010 B2
7657090 Omatsu et al. Feb 2010 B2
7667824 Moran Feb 2010 B1
7675080 Boettiger Mar 2010 B2
7675681 Tomikawa et al. Mar 2010 B2
7706634 Schmitt et al. Apr 2010 B2
7723662 Levoy et al. May 2010 B2
7738013 Galambos et al. Jun 2010 B2
7741620 Doering et al. Jun 2010 B2
7782364 Smith Aug 2010 B2
7826153 Hong Nov 2010 B2
7840067 Shen et al. Nov 2010 B2
7912673 Hébert et al. Mar 2011 B2
7924321 Mitsunaga et al. Apr 2011 B2
7956871 Fainstain et al. Jun 2011 B2
7965314 Miller et al. Jun 2011 B1
7973834 Yang Jul 2011 B2
7986018 Rennie Jul 2011 B2
7990447 Honda et al. Aug 2011 B2
8000498 Shih et al. Aug 2011 B2
8013904 Tan et al. Sep 2011 B2
8027531 Wilburn et al. Sep 2011 B2
8044994 Vetro et al. Oct 2011 B2
8055466 Bryll Nov 2011 B2
8077245 Adamo et al. Dec 2011 B2
8089515 Chebil et al. Jan 2012 B2
8098297 Crisan et al. Jan 2012 B2
8098304 Pinto et al. Jan 2012 B2
8106949 Tan et al. Jan 2012 B2
8111910 Tanaka Feb 2012 B2
8126279 Marcellin et al. Feb 2012 B2
8130120 Kawabata et al. Mar 2012 B2
8131097 Lelescu et al. Mar 2012 B2
8149323 Li Apr 2012 B2
8164629 Zhang Apr 2012 B1
8169486 Corcoran et al. May 2012 B2
8180145 Wu et al. May 2012 B2
8189065 Georgiev et al. May 2012 B2
8189089 Georgiev May 2012 B1
8194296 Compton Jun 2012 B2
8212914 Chiu Jul 2012 B2
8213711 Tam Jul 2012 B2
8231814 Duparre Jul 2012 B2
8242426 Ward et al. Aug 2012 B2
8244027 Takahashi Aug 2012 B2
8244058 Intwala et al. Aug 2012 B1
8254668 Mashitani et al. Aug 2012 B2
8279325 Pitts et al. Oct 2012 B2
8280194 Wong et al. Oct 2012 B2
8284240 Saint-Pierre et al. Oct 2012 B2
8289409 Chang Oct 2012 B2
8289440 Pitts et al. Oct 2012 B2
8290358 Georgiev Oct 2012 B1
8294099 Blackwell, Jr. Oct 2012 B2
8294754 Jung et al. Oct 2012 B2
8300085 Yang et al. Oct 2012 B2
8305456 McMahon Nov 2012 B1
8315476 Georgiev et al. Nov 2012 B1
8345144 Georgiev et al. Jan 2013 B1
8360574 Ishak et al. Jan 2013 B2
8400555 Georgiev Mar 2013 B1
8406562 Bassi et al. Mar 2013 B2
8411146 Twede Apr 2013 B2
8446492 Nakano et al. May 2013 B2
8456517 Mor et al. Jun 2013 B2
8493496 Freedman et al. Jul 2013 B2
8514291 Chang Aug 2013 B2
8514491 Duparre Aug 2013 B2
8541730 Inuiya Sep 2013 B2
8542933 Venkataraman Sep 2013 B2
8553093 Wong et al. Oct 2013 B2
8559756 Georgiev et al. Oct 2013 B2
8565547 Strandemar Oct 2013 B2
8576302 Yoshikawa Nov 2013 B2
8577183 Robinson Nov 2013 B2
8581995 Lin et al. Nov 2013 B2
8619082 Ciurea et al. Dec 2013 B1
8648918 Kauker et al. Feb 2014 B2
8655052 Spooner et al. Feb 2014 B2
8682107 Yoon et al. Mar 2014 B2
8687087 Pertsel et al. Apr 2014 B2
8692893 McMahon Apr 2014 B2
8754941 Sarwari et al. Jun 2014 B1
8773536 Zhang Jul 2014 B1
8780113 Ciurea et al. Jul 2014 B1
8804255 Duparre Aug 2014 B2
8830375 Ludwig Sep 2014 B2
8831367 Venkataraman Sep 2014 B2
8831377 Pitts et al. Sep 2014 B2
8836793 Kriesel et al. Sep 2014 B1
8842201 Tajiri Sep 2014 B2
8854462 Herbin et al. Oct 2014 B2
8861089 Duparre Oct 2014 B2
8866912 Mullis Oct 2014 B2
8866920 Venkataraman et al. Oct 2014 B2
8866951 Keelan Oct 2014 B2
8878950 Lelescu et al. Nov 2014 B2
8885059 Venkataraman et al. Nov 2014 B1
8885922 Ito et al. Nov 2014 B2
8896594 Xiong et al. Nov 2014 B2
8896719 Venkataraman et al. Nov 2014 B1
8902321 Venkataraman et al. Dec 2014 B2
8928793 McMahon Jan 2015 B2
8977038 Tian et al. Mar 2015 B2
9001226 Ng et al. Apr 2015 B1
9019426 Han et al. Apr 2015 B2
9025894 Venkataraman May 2015 B2
9025895 Venkataraman May 2015 B2
9030528 Pesach et al. May 2015 B2
9031335 Venkataraman May 2015 B2
9031342 Venkataraman May 2015 B2
9031343 Venkataraman May 2015 B2
9036928 Venkataraman May 2015 B2
9036931 Venkataraman et al. May 2015 B2
9041823 Venkataraman et al. May 2015 B2
9041824 Lelescu et al. May 2015 B2
9041829 Venkataraman et al. May 2015 B2
9042667 Venkataraman et al. May 2015 B2
9047684 Lelescu et al. Jun 2015 B2
9049367 Venkataraman et al. Jun 2015 B2
9055233 Venkataraman et al. Jun 2015 B2
9060120 Venkataraman et al. Jun 2015 B2
9060124 Venkataraman et al. Jun 2015 B2
9077893 Venkataraman et al. Jul 2015 B2
9094661 Venkataraman et al. Jul 2015 B2
9100586 McMahon et al. Aug 2015 B2
9100635 Duparre et al. Aug 2015 B2
9123117 Ciurea et al. Sep 2015 B2
9123118 Ciurea et al. Sep 2015 B2
9124815 Venkataraman et al. Sep 2015 B2
9124831 Mullis Sep 2015 B2
9124864 Mullis Sep 2015 B2
9128228 Duparre Sep 2015 B2
9129183 Venkataraman et al. Sep 2015 B2
9129377 Ciurea et al. Sep 2015 B2
9143711 McMahon Sep 2015 B2
9147254 Ciurea et al. Sep 2015 B2
9185276 Rodda et al. Nov 2015 B2
9188765 Venkataraman et al. Nov 2015 B2
9191580 Venkataraman et al. Nov 2015 B2
9197821 McMahon Nov 2015 B2
9210392 Nisenzon et al. Dec 2015 B2
9214013 Venkataraman et al. Dec 2015 B2
9235898 Venkataraman et al. Jan 2016 B2
9235900 Ciurea et al. Jan 2016 B2
9240049 Ciurea et al. Jan 2016 B2
9253380 Venkataraman et al. Feb 2016 B2
9256974 Hines Feb 2016 B1
9264592 Rodda et al. Feb 2016 B2
9264610 Duparre Feb 2016 B2
9361662 Lelescu et al. Jun 2016 B2
9374512 Venkataraman et al. Jun 2016 B2
9412206 McMahon et al. Aug 2016 B2
9413953 Maeda Aug 2016 B2
9426343 Rodda et al. Aug 2016 B2
9426361 Venkataraman et al. Aug 2016 B2
9438888 Venkataraman et al. Sep 2016 B2
9445003 Lelescu et al. Sep 2016 B1
9456134 Venkataraman et al. Sep 2016 B2
9456196 Kim et al. Sep 2016 B2
9462164 Venkataraman et al. Oct 2016 B2
9485496 Venkataraman et al. Nov 2016 B2
9497370 Venkataraman et al. Nov 2016 B2
9497429 Mullis et al. Nov 2016 B2
9516222 Duparre et al. Dec 2016 B2
9519972 Venkataraman et al. Dec 2016 B2
9521319 Rodda et al. Dec 2016 B2
9521416 McMahon et al. Dec 2016 B1
9536166 Venkataraman et al. Jan 2017 B2
9576369 Venkataraman et al. Feb 2017 B2
9578237 Duparre et al. Feb 2017 B2
9578259 Molina Feb 2017 B2
9602805 Venkataraman et al. Mar 2017 B2
9633442 Venkataraman et al. Apr 2017 B2
9635274 Lin et al. Apr 2017 B2
9638883 Duparre May 2017 B1
9661310 Deng et al. May 2017 B2
9706132 Nisenzon et al. Jul 2017 B2
9712759 Venkataraman et al. Jul 2017 B2
9733486 Lelescu et al. Aug 2017 B2
9741118 Mullis Aug 2017 B2
9743051 Venkataraman et al. Aug 2017 B2
9749547 Venkataraman et al. Aug 2017 B2
9749568 McMahon Aug 2017 B2
9754422 McMahon et al. Sep 2017 B2
9766380 Duparre et al. Sep 2017 B2
9769365 Jannard Sep 2017 B1
9774789 Ciurea et al. Sep 2017 B2
9774831 Venkataraman et al. Sep 2017 B2
9787911 McMahon et al. Oct 2017 B2
9794476 Nayar et al. Oct 2017 B2
9800856 Venkataraman et al. Oct 2017 B2
9800859 Venkataraman et al. Oct 2017 B2
9807382 Duparre et al. Oct 2017 B2
9811753 Venkataraman et al. Nov 2017 B2
9813616 Lelescu et al. Nov 2017 B2
9813617 Venkataraman et al. Nov 2017 B2
9858673 Ciurea et al. Jan 2018 B2
9864921 Venkataraman et al. Jan 2018 B2
9888194 Duparre Feb 2018 B2
9898856 Yang et al. Feb 2018 B2
9917998 Venkataraman et al. Mar 2018 B2
9924092 Rodda et al. Mar 2018 B2
9936148 McMahon Apr 2018 B2
9955070 Lelescu et al. Apr 2018 B2
9986224 Mullis May 2018 B2
10009538 Venkataraman et al. Jun 2018 B2
10019816 Venkataraman et al. Jul 2018 B2
10027901 Venkataraman et al. Jul 2018 B2
10089740 Srikanth et al. Oct 2018 B2
10091405 Molina Oct 2018 B2
20010005225 Clark et al. Jun 2001 A1
20010019621 Hanna et al. Sep 2001 A1
20010028038 Hamaguchi et al. Oct 2001 A1
20010038387 Tomooka et al. Nov 2001 A1
20020012056 Trevino Jan 2002 A1
20020015536 Warren Feb 2002 A1
20020027608 Johnson Mar 2002 A1
20020028014 Ono et al. Mar 2002 A1
20020039438 Mori et al. Apr 2002 A1
20020057845 Fossum May 2002 A1
20020061131 Sawhney et al. May 2002 A1
20020063807 Margulis May 2002 A1
20020075450 Aratani Jun 2002 A1
20020087403 Meyers et al. Jul 2002 A1
20020089596 Suda Jul 2002 A1
20020094027 Sato et al. Jul 2002 A1
20020101528 Lee Aug 2002 A1
20020113867 Takigawa et al. Aug 2002 A1
20020113888 Sonoda et al. Aug 2002 A1
20020118113 Oku et al. Aug 2002 A1
20020120634 Min et al. Aug 2002 A1
20020122113 Foote et al. Sep 2002 A1
20020163054 Suda et al. Nov 2002 A1
20020167537 Trajkovic Nov 2002 A1
20020177054 Saitoh et al. Nov 2002 A1
20020190991 Efran et al. Dec 2002 A1
20020195548 Dowski, Jr. et al. Dec 2002 A1
20030025227 Daniell Feb 2003 A1
20030086079 Barth et al. May 2003 A1
20030124763 Fan et al. Jul 2003 A1
20030140347 Varsa Jul 2003 A1
20030156189 Utsumi et al. Aug 2003 A1
20030179418 Wengender et al. Sep 2003 A1
20030188659 Merry et al. Oct 2003 A1
20030190072 Adkins et al. Oct 2003 A1
20030198377 Ng et al. Oct 2003 A1
20030211405 Venkataraman Nov 2003 A1
20040003409 Berstis et al. Jan 2004 A1
20040008271 Hagimori et al. Jan 2004 A1
20040012689 Tinnerino Jan 2004 A1
20040027358 Nakao Feb 2004 A1
20040047274 Amanai Mar 2004 A1
20040050104 Ghosh et al. Mar 2004 A1
20040056966 Schechner et al. Mar 2004 A1
20040061787 Liu et al. Apr 2004 A1
20040066454 Otani et al. Apr 2004 A1
20040071367 Irani et al. Apr 2004 A1
20040075654 Hsiao et al. Apr 2004 A1
20040096119 Williams May 2004 A1
20040100570 Shizukuishi May 2004 A1
20040105021 Hu et al. Jun 2004 A1
20040114807 Lelescu et al. Jun 2004 A1
20040141659 Zhang Jul 2004 A1
20040151401 Sawhney et al. Aug 2004 A1
20040165090 Ning Aug 2004 A1
20040169617 Yelton et al. Sep 2004 A1
20040170340 Tipping et al. Sep 2004 A1
20040174439 Upton Sep 2004 A1
20040179008 Gordon et al. Sep 2004 A1
20040179834 Szajewski Sep 2004 A1
20040196379 Chen et al. Oct 2004 A1
20040207600 Zhang et al. Oct 2004 A1
20040207836 Chhibber et al. Oct 2004 A1
20040213449 Safaee-Rad et al. Oct 2004 A1
20040218809 Blake et al. Nov 2004 A1
20040234873 Venkataraman Nov 2004 A1
20040239782 Equitz et al. Dec 2004 A1
20040239885 Jaynes et al. Dec 2004 A1
20040240052 Minefuji et al. Dec 2004 A1
20040251509 Choi Dec 2004 A1
20040264806 Herley Dec 2004 A1
20050006477 Patel Jan 2005 A1
20050007461 Chou et al. Jan 2005 A1
20050009313 Suzuki et al. Jan 2005 A1
20050010621 Pinto et al. Jan 2005 A1
20050012035 Miller Jan 2005 A1
20050036778 DeMonte Feb 2005 A1
20050047678 Jones et al. Mar 2005 A1
20050048690 Yamamoto Mar 2005 A1
20050068436 Fraenkel et al. Mar 2005 A1
20050083531 Millerd et al. Apr 2005 A1
20050084179 Hanna et al. Apr 2005 A1
20050128509 Tokkonen et al. Jun 2005 A1
20050128595 Shimizu Jun 2005 A1
20050132098 Sonoda et al. Jun 2005 A1
20050134698 Schroeder Jun 2005 A1
20050134699 Nagashima Jun 2005 A1
20050134712 Gruhlke et al. Jun 2005 A1
20050147277 Higaki et al. Jul 2005 A1
20050151759 Gonzalez-Banos et al. Jul 2005 A1
20050168924 Wu et al. Aug 2005 A1
20050175257 Kuroki Aug 2005 A1
20050185711 Pfister et al. Aug 2005 A1
20050205785 Hornback et al. Sep 2005 A1
20050219264 Shum et al. Oct 2005 A1
20050219363 Kohler Oct 2005 A1
20050224843 Boemler Oct 2005 A1
20050225654 Feldman et al. Oct 2005 A1
20050265633 Piacentino et al. Dec 2005 A1
20050275946 Choo et al. Dec 2005 A1
20050286612 Takanashi Dec 2005 A1
20050286756 Hong et al. Dec 2005 A1
20060002635 Nestares et al. Jan 2006 A1
20060007331 Izumi et al. Jan 2006 A1
20060013318 Webb et al. Jan 2006 A1
20060018509 Miyoshi Jan 2006 A1
20060023197 Joel Feb 2006 A1
20060023314 Boettiger et al. Feb 2006 A1
20060028476 Sobel et al. Feb 2006 A1
20060029270 Berestov et al. Feb 2006 A1
20060029271 Miyoshi et al. Feb 2006 A1
20060033005 Jerdev et al. Feb 2006 A1
20060034003 Zalevsky Feb 2006 A1
20060034531 Poon et al. Feb 2006 A1
20060035415 Wood Feb 2006 A1
20060038891 Okutomi et al. Feb 2006 A1
20060039611 Rother Feb 2006 A1
20060046204 Ono et al. Mar 2006 A1
20060049930 Zruya et al. Mar 2006 A1
20060050980 Kohashi et al. Mar 2006 A1
20060054780 Garrood et al. Mar 2006 A1
20060054782 Olsen Mar 2006 A1
20060055811 Frtiz et al. Mar 2006 A1
20060069478 Iwama Mar 2006 A1
20060072029 Miyatake et al. Apr 2006 A1
20060087747 Ohzawa et al. Apr 2006 A1
20060098888 Morishita May 2006 A1
20060103754 Wenstrand et al. May 2006 A1
20060125936 Gruhike et al. Jun 2006 A1
20060138322 Costello et al. Jun 2006 A1
20060152803 Provitola Jul 2006 A1
20060157640 Perlman et al. Jul 2006 A1
20060159369 Young Jul 2006 A1
20060176566 Boettiger et al. Aug 2006 A1
20060187338 May et al. Aug 2006 A1
20060197937 Bamji et al. Sep 2006 A1
20060203100 Ajito et al. Sep 2006 A1
20060203113 Wada et al. Sep 2006 A1
20060210146 Gu Sep 2006 A1
20060210186 Berkner Sep 2006 A1
20060214085 Olsen Sep 2006 A1
20060221250 Rossbach et al. Oct 2006 A1
20060239549 Kelly et al. Oct 2006 A1
20060243889 Farnworth et al. Nov 2006 A1
20060251410 Trutna Nov 2006 A1
20060274174 Tewinkle Dec 2006 A1
20060278948 Yamaguchi et al. Dec 2006 A1
20060279648 Senba et al. Dec 2006 A1
20060289772 Johnson et al. Dec 2006 A1
20070002159 Olsen Jan 2007 A1
20070008575 Yu et al. Jan 2007 A1
20070009150 Suwa Jan 2007 A1
20070024614 Tam Feb 2007 A1
20070030356 Yea et al. Feb 2007 A1
20070035707 Margulis Feb 2007 A1
20070036427 Nakamura et al. Feb 2007 A1
20070040828 Zalevsky et al. Feb 2007 A1
20070040922 McKee et al. Feb 2007 A1
20070041391 Lin et al. Feb 2007 A1
20070052825 Cho Mar 2007 A1
20070083114 Yang et al. Apr 2007 A1
20070085917 Kobayashi Apr 2007 A1
20070092245 Bazakos et al. Apr 2007 A1
20070102622 Olsen et al. May 2007 A1
20070126898 Feldman Jun 2007 A1
20070127831 Venkataraman Jun 2007 A1
20070139333 Sato et al. Jun 2007 A1
20070140685 Wu Jun 2007 A1
20070146503 Shiraki Jun 2007 A1
20070146511 Kinoshita et al. Jun 2007 A1
20070153335 Hosaka Jul 2007 A1
20070158427 Zhu et al. Jul 2007 A1
20070159541 Sparks et al. Jul 2007 A1
20070160310 Tanida et al. Jul 2007 A1
20070165931 Higaki Jul 2007 A1
20070171290 Kroger Jul 2007 A1
20070177004 Kolehmainen et al. Aug 2007 A1
20070182843 Shimamura et al. Aug 2007 A1
20070201859 Sarrat et al. Aug 2007 A1
20070206241 Smith et al. Sep 2007 A1
20070211164 Olsen et al. Sep 2007 A1
20070216765 Wong et al. Sep 2007 A1
20070225600 Weibrecht et al. Sep 2007 A1
20070228256 Mentzer Oct 2007 A1
20070236595 Pan et al. Oct 2007 A1
20070242141 Ciurea Oct 2007 A1
20070247517 Zhang et al. Oct 2007 A1
20070257184 Olsen et al. Nov 2007 A1
20070258006 Olsen et al. Nov 2007 A1
20070258706 Raskar et al. Nov 2007 A1
20070263113 Baek et al. Nov 2007 A1
20070263114 Gurevich et al. Nov 2007 A1
20070268374 Robinson Nov 2007 A1
20070296721 Chang et al. Dec 2007 A1
20070296832 Ota et al. Dec 2007 A1
20070296835 Olsen Dec 2007 A1
20070296847 Chang et al. Dec 2007 A1
20070297696 Hamza Dec 2007 A1
20080006859 Mionetto et al. Jan 2008 A1
20080019611 Larkin Jan 2008 A1
20080024683 Damera-Venkata et al. Jan 2008 A1
20080025649 Liu et al. Jan 2008 A1
20080030592 Border et al. Feb 2008 A1
20080030597 Olsen et al. Feb 2008 A1
20080043095 Vetro et al. Feb 2008 A1
20080043096 Vetro et al. Feb 2008 A1
20080054518 Ra et al. Mar 2008 A1
20080056302 Erdal et al. Mar 2008 A1
20080062164 Bassi et al. Mar 2008 A1
20080079805 Takagi et al. Apr 2008 A1
20080080028 Bakin et al. Apr 2008 A1
20080084486 Enge et al. Apr 2008 A1
20080088793 Sverdrup et al. Apr 2008 A1
20080095523 Schilling-Benz et al. Apr 2008 A1
20080099804 Venezia et al. May 2008 A1
20080106620 Sawachi et al. May 2008 A1
20080112059 Choi et al. May 2008 A1
20080112635 Kondo et al. May 2008 A1
20080117289 Schowengerdt et al. May 2008 A1
20080118241 Tekolste et al. May 2008 A1
20080131019 Ng Jun 2008 A1
20080131107 Ueno Jun 2008 A1
20080151097 Chen et al. Jun 2008 A1
20080152215 Horie et al. Jun 2008 A1
20080152296 Oh et al. Jun 2008 A1
20080156991 Hu et al. Jul 2008 A1
20080158259 Kempf et al. Jul 2008 A1
20080158375 Kakkori et al. Jul 2008 A1
20080158698 Chang et al. Jul 2008 A1
20080165257 Boettiger et al. Jul 2008 A1
20080174670 Olsen et al. Jul 2008 A1
20080187305 Raskar et al. Aug 2008 A1
20080193026 Horie et al. Aug 2008 A1
20080211737 Kim et al. Sep 2008 A1
20080218610 Chapman et al. Sep 2008 A1
20080218611 Parulski et al. Sep 2008 A1
20080218612 Border et al. Sep 2008 A1
20080218613 Janson et al. Sep 2008 A1
20080219654 Border et al. Sep 2008 A1
20080239116 Smith Oct 2008 A1
20080240598 Hasegawa Oct 2008 A1
20080247638 Tanida et al. Oct 2008 A1
20080247653 Moussavi et al. Oct 2008 A1
20080272416 Yun Nov 2008 A1
20080273751 Yuan et al. Nov 2008 A1
20080278591 Barna et al. Nov 2008 A1
20080278610 Boettiger et al. Nov 2008 A1
20080284880 Numata Nov 2008 A1
20080291295 Kato et al. Nov 2008 A1
20080298674 Baker et al. Dec 2008 A1
20080310501 Ward et al. Dec 2008 A1
20090027543 Kanehiro et al. Jan 2009 A1
20090050946 Duparre et al. Feb 2009 A1
20090052743 Techmer Feb 2009 A1
20090060281 Tanida et al. Mar 2009 A1
20090066693 Carson Mar 2009 A1
20090079862 Subbotin Mar 2009 A1
20090086074 Li et al. Apr 2009 A1
20090091645 Trimeche et al. Apr 2009 A1
20090091806 Inuiya Apr 2009 A1
20090092363 Daum et al. Apr 2009 A1
20090096050 Park Apr 2009 A1
20090102956 Georgiev Apr 2009 A1
20090103792 Rahn et al. Apr 2009 A1
20090109306 Shan Apr 2009 A1
20090127430 Hirasawa et al. May 2009 A1
20090128644 Camp et al. May 2009 A1
20090128833 Yahav May 2009 A1
20090129667 Ho et al. May 2009 A1
20090140131 Utagawa et al. Jun 2009 A1
20090141933 Wagg Jun 2009 A1
20090147919 Goto et al. Jun 2009 A1
20090152664 Klem et al. Jun 2009 A1
20090167922 Perlman et al. Jul 2009 A1
20090167934 Gupta Jul 2009 A1
20090175349 Ye et al. Jul 2009 A1
20090179142 Duparre et al. Jul 2009 A1
20090180021 Kikuchi et al. Jul 2009 A1
20090200622 Tai et al. Aug 2009 A1
20090201371 Matsuda et al. Aug 2009 A1
20090207235 Francini et al. Aug 2009 A1
20090219435 Yuan et al. Sep 2009 A1
20090225203 Tanida et al. Sep 2009 A1
20090237520 Kaneko et al. Sep 2009 A1
20090245573 Saptharishi et al. Oct 2009 A1
20090256947 Ciurea et al. Oct 2009 A1
20090263017 Tanbakuchi Oct 2009 A1
20090268192 Koenck et al. Oct 2009 A1
20090268970 Babacan et al. Oct 2009 A1
20090268983 Stone Oct 2009 A1
20090273663 Yoshida et al. Nov 2009 A1
20090274387 Jin Nov 2009 A1
20090279800 Uetani et al. Nov 2009 A1
20090284651 Srinivasan Nov 2009 A1
20090290811 Imai Nov 2009 A1
20090297056 Lelescu et al. Dec 2009 A1
20090302205 Olsen et al. Dec 2009 A9
20090317061 Jung et al. Dec 2009 A1
20090322876 Lee et al. Dec 2009 A1
20090323195 Hembree et al. Dec 2009 A1
20090323206 Oliver et al. Dec 2009 A1
20090324118 Maslov et al. Dec 2009 A1
20100002126 Wenstrand et al. Jan 2010 A1
20100002313 Duparre et al. Jan 2010 A1
20100002314 Duparre Jan 2010 A1
20100007714 Kim et al. Jan 2010 A1
20100013927 Nixon Jan 2010 A1
20100044815 Chang et al. Feb 2010 A1
20100045809 Packard Feb 2010 A1
20100053342 Hwang et al. Mar 2010 A1
20100053600 Tanida Mar 2010 A1
20100060746 Olsen et al. Mar 2010 A9
20100073463 Momonoi et al. Mar 2010 A1
20100074532 Gordon et al. Mar 2010 A1
20100085351 Deb et al. Apr 2010 A1
20100085425 Tan Apr 2010 A1
20100086227 Sun et al. Apr 2010 A1
20100091389 Henriksen et al. Apr 2010 A1
20100097491 Farina et al. Apr 2010 A1
20100103175 Okutomi et al. Apr 2010 A1
20100103259 Tanida et al. Apr 2010 A1
20100103308 Butterfield et al. Apr 2010 A1
20100111444 Coffman May 2010 A1
20100118127 Nam May 2010 A1
20100128145 Pitts et al. May 2010 A1
20100129048 Pitts et al. May 2010 A1
20100133230 Henriksen et al. Jun 2010 A1
20100133418 Sargent et al. Jun 2010 A1
20100141802 Knight Jun 2010 A1
20100142839 Lakus-Becker Jun 2010 A1
20100157073 Kondo et al. Jun 2010 A1
20100165152 Lim Jul 2010 A1
20100166410 Chang et al. Jul 2010 A1
20100171866 Brady et al. Jul 2010 A1
20100177411 Hegde et al. Jul 2010 A1
20100182406 Benitez et al. Jul 2010 A1
20100194860 Mentz et al. Aug 2010 A1
20100194901 van Hoorebeke et al. Aug 2010 A1
20100195716 Klein Gunnewiek et al. Aug 2010 A1
20100201809 Oyama et al. Aug 2010 A1
20100201834 Maruyama et al. Aug 2010 A1
20100202054 Niederer Aug 2010 A1
20100202683 Robinson Aug 2010 A1
20100208100 Olsen et al. Aug 2010 A9
20100220212 Perlman et al. Sep 2010 A1
20100223237 Mishra et al. Sep 2010 A1
20100225740 Jung et al. Sep 2010 A1
20100231285 Boomer et al. Sep 2010 A1
20100238327 Griffith et al. Sep 2010 A1
20100244165 Lake et al. Sep 2010 A1
20100245684 Xiao et al. Sep 2010 A1
20100254627 Panahpour Tehrani et al. Oct 2010 A1
20100259610 Petersen et al. Oct 2010 A1
20100265346 Iizuka Oct 2010 A1
20100265381 Yamamoto et al. Oct 2010 A1
20100265385 Knight et al. Oct 2010 A1
20100281070 Chan et al. Nov 2010 A1
20100289941 Ito et al. Nov 2010 A1
20100290483 Park et al. Nov 2010 A1
20100302423 Adams, Jr. et al. Dec 2010 A1
20100309292 Ho et al. Dec 2010 A1
20100309368 Choi et al. Dec 2010 A1
20100321595 Chiu et al. Dec 2010 A1
20100321640 Yeh et al. Dec 2010 A1
20100329556 Mitarai et al. Dec 2010 A1
20110001037 Tewinkle Jan 2011 A1
20110018973 Takayama Jan 2011 A1
20110019048 Raynor et al. Jan 2011 A1
20110019243 Constant, Jr. et al. Jan 2011 A1
20110031381 Tay et al. Feb 2011 A1
20110032341 Ignatov et al. Feb 2011 A1
20110032370 Ludwig Feb 2011 A1
20110033129 Robinson Feb 2011 A1
20110038536 Gong Feb 2011 A1
20110043661 Podoleanu Feb 2011 A1
20110043665 Ogasahara Feb 2011 A1
20110043668 McKinnon et al. Feb 2011 A1
20110044502 Liu et al. Feb 2011 A1
20110051255 Lee et al. Mar 2011 A1
20110055729 Mason et al. Mar 2011 A1
20110064327 Dagher et al. Mar 2011 A1
20110069189 Venkataraman et al. Mar 2011 A1
20110080487 Venkataraman et al. Apr 2011 A1
20110085028 Samadani et al. Apr 2011 A1
20110090217 Mashitani et al. Apr 2011 A1
20110108708 Olsen et al. May 2011 A1
20110115886 Nguyen May 2011 A1
20110121421 Charbon May 2011 A1
20110122308 Duparre May 2011 A1
20110128393 Tavi et al. Jun 2011 A1
20110128412 Milnes et al. Jun 2011 A1
20110129165 Lim et al. Jun 2011 A1
20110141309 Nagashima et al. Jun 2011 A1
20110142138 Tian et al. Jun 2011 A1
20110149408 Hahgholt et al. Jun 2011 A1
20110149409 Haugholt et al. Jun 2011 A1
20110150321 Cheong et al. Jun 2011 A1
20110153248 Gu et al. Jun 2011 A1
20110157321 Nakajima et al. Jun 2011 A1
20110157451 Chang Jun 2011 A1
20110169994 DiFrancesco et al. Jul 2011 A1
20110176020 Chang Jul 2011 A1
20110181797 Galstian et al. Jul 2011 A1
20110193944 Lian et al. Aug 2011 A1
20110200319 Kravitz et al. Aug 2011 A1
20110206291 Kashani et al. Aug 2011 A1
20110207074 Hall-Holt et al. Aug 2011 A1
20110211068 Yokota Sep 2011 A1
20110211077 Nayar Sep 2011 A1
20110211824 Georgiev et al. Sep 2011 A1
20110221599 Högasten Sep 2011 A1
20110221658 Haddick et al. Sep 2011 A1
20110221939 Jerdev Sep 2011 A1
20110221950 Oostra Sep 2011 A1
20110222757 Yeatman, Jr. et al. Sep 2011 A1
20110228142 Brueckner Sep 2011 A1
20110228144 Tian et al. Sep 2011 A1
20110234841 Akeley et al. Sep 2011 A1
20110241234 Duparre Oct 2011 A1
20110242342 Goma et al. Oct 2011 A1
20110242355 Goma et al. Oct 2011 A1
20110242356 Aleksic et al. Oct 2011 A1
20110243428 Das Gupta et al. Oct 2011 A1
20110255592 Sung Oct 2011 A1
20110255745 Hodder et al. Oct 2011 A1
20110261993 Weiming et al. Oct 2011 A1
20110267264 McCarthy et al. Nov 2011 A1
20110267348 Lin Nov 2011 A1
20110273531 Ito et al. Nov 2011 A1
20110274175 Sumitomo Nov 2011 A1
20110274366 Tardif Nov 2011 A1
20110279705 Kuang et al. Nov 2011 A1
20110279721 McMahon Nov 2011 A1
20110285701 Chen et al. Nov 2011 A1
20110285866 Bhrugumalla et al. Nov 2011 A1
20110285910 Bamji et al. Nov 2011 A1
20110292216 Fergus et al. Dec 2011 A1
20110298898 Jung et al. Dec 2011 A1
20110298917 Yanagita Dec 2011 A1
20110300929 Tardif et al. Dec 2011 A1
20110310980 Mathew Dec 2011 A1
20110316968 Taguchi et al. Dec 2011 A1
20110317766 Lim et al. Dec 2011 A1
20120012748 Pain et al. Jan 2012 A1
20120014456 Martinez Bauza et al. Jan 2012 A1
20120019530 Baker Jan 2012 A1
20120019700 Gaber Jan 2012 A1
20120023456 Sun et al. Jan 2012 A1
20120026297 Sato Feb 2012 A1
20120026342 Yu et al. Feb 2012 A1
20120026366 Golan et al. Feb 2012 A1
20120026451 Nystrom Feb 2012 A1
20120039525 Tian et al. Feb 2012 A1
20120044249 Mashitani et al. Feb 2012 A1
20120044372 Côté et al. Feb 2012 A1
20120051624 Ando Mar 2012 A1
20120056982 Katz et al. Mar 2012 A1
20120057040 Park et al. Mar 2012 A1
20120062697 Treado et al. Mar 2012 A1
20120062702 Jiang et al. Mar 2012 A1
20120062756 Tian Mar 2012 A1
20120069235 Imai Mar 2012 A1
20120081519 Goma Apr 2012 A1
20120086803 Malzbender et al. Apr 2012 A1
20120105590 Fukumoto et al. May 2012 A1
20120105691 Waqas et al. May 2012 A1
20120113232 Joblove May 2012 A1
20120113318 Galstian et al. May 2012 A1
20120113413 Miahczylowicz-Wolski et al. May 2012 A1
20120114224 Xu et al. May 2012 A1
20120127275 Von Zitzewitz et al. May 2012 A1
20120147139 Li et al. Jun 2012 A1
20120147205 Lelescu et al. Jun 2012 A1
20120153153 Chang et al. Jun 2012 A1
20120154551 Inoue Jun 2012 A1
20120155830 Sasaki et al. Jun 2012 A1
20120163672 Mckinnon Jun 2012 A1
20120163725 Fukuhara Jun 2012 A1
20120169433 Mullins Jul 2012 A1
20120170134 Bolis et al. Jul 2012 A1
20120176479 Mayhew et al. Jul 2012 A1
20120176481 Lukk et al. Jul 2012 A1
20120188235 Wu et al. Jul 2012 A1
20120188341 Klein Gunnewiek et al. Jul 2012 A1
20120188389 Lin et al. Jul 2012 A1
20120188420 Black et al. Jul 2012 A1
20120188634 Kubala et al. Jul 2012 A1
20120198677 Duparre Aug 2012 A1
20120200669 Lai Aug 2012 A1
20120200726 Bugnariu Aug 2012 A1
20120200734 Tang Aug 2012 A1
20120206582 DiCarlo et al. Aug 2012 A1
20120219236 Ali et al. Aug 2012 A1
20120224083 Jovanovski et al. Sep 2012 A1
20120229602 Chen et al. Sep 2012 A1
20120229628 Ishiyama et al. Sep 2012 A1
20120237114 Park et al. Sep 2012 A1
20120249550 Akeley et al. Oct 2012 A1
20120249750 Izzat et al. Oct 2012 A1
20120249836 Ali et al. Oct 2012 A1
20120249853 Krolczyk et al. Oct 2012 A1
20120262601 Choi et al. Oct 2012 A1
20120262607 Shimura et al. Oct 2012 A1
20120268574 Gidon et al. Oct 2012 A1
20120274626 Hsieh Nov 2012 A1
20120287291 McMahon et al. Nov 2012 A1
20120290257 Hodge et al. Nov 2012 A1
20120293489 Chen et al. Nov 2012 A1
20120293624 Chen et al. Nov 2012 A1
20120293695 Tanaka Nov 2012 A1
20120307093 Miyoshi Dec 2012 A1
20120307099 Yahata et al. Dec 2012 A1
20120314033 Lee et al. Dec 2012 A1
20120314937 Kim et al. Dec 2012 A1
20120327222 Ng et al. Dec 2012 A1
20130002828 Ding et al. Jan 2013 A1
20130003184 Duparre Jan 2013 A1
20130010073 Do Jan 2013 A1
20130016245 Yuba Jan 2013 A1
20130016885 Tsujimoto et al. Jan 2013 A1
20130022111 Chen et al. Jan 2013 A1
20130027580 Olsen et al. Jan 2013 A1
20130033579 Wajs Feb 2013 A1
20130033585 Li et al. Feb 2013 A1
20130038696 Ding et al. Feb 2013 A1
20130047396 Au et al. Feb 2013 A1
20130050504 Safaee-Rad et al. Feb 2013 A1
20130050526 Keelan Feb 2013 A1
20130057710 McMahon Mar 2013 A1
20130070060 Chatterjee Mar 2013 A1
20130076967 Brunner et al. Mar 2013 A1
20130077859 Stauder et al. Mar 2013 A1
20130077880 Venkataraman et al. Mar 2013 A1
20130077882 Venkataraman et al. Mar 2013 A1
20130083172 Baba Apr 2013 A1
20130088489 Schmeitz et al. Apr 2013 A1
20130088637 Duparre Apr 2013 A1
20130093842 Yahata Apr 2013 A1
20130107061 Kumar et al. May 2013 A1
20130113888 Koguchi May 2013 A1
20130113899 Morohoshi et al. May 2013 A1
20130113939 Strandemar May 2013 A1
20130120536 Song et al. May 2013 A1
20130120605 Georgiev et al. May 2013 A1
20130121559 Hu May 2013 A1
20130128068 Georgiev et al. May 2013 A1
20130128069 Georgiev et al. May 2013 A1
20130128087 Georgiev et al. May 2013 A1
20130128121 Agarwala et al. May 2013 A1
20130135315 Bares May 2013 A1
20130135448 Nagumo et al. May 2013 A1
20130147979 McMahon et al. Jun 2013 A1
20130169754 Aronsson et al. Jul 2013 A1
20130176394 Tian et al. Jul 2013 A1
20130208138 Li Aug 2013 A1
20130215108 McMahon et al. Aug 2013 A1
20130215231 Hiramoto et al. Aug 2013 A1
20130222556 Shimada Aug 2013 A1
20130222656 Kaneko Aug 2013 A1
20130223759 Nishiyama et al. Aug 2013 A1
20130229540 Farina et al. Sep 2013 A1
20130230237 Schlosser et al. Sep 2013 A1
20130250123 Zhang et al. Sep 2013 A1
20130250150 Malone Sep 2013 A1
20130258067 Zhang et al. Oct 2013 A1
20130259317 Gaddy Oct 2013 A1
20130265459 Duparre et al. Oct 2013 A1
20130274596 Azizian et al. Oct 2013 A1
20130274923 By et al. Oct 2013 A1
20130286236 Mankowski Oct 2013 A1
20130293760 Nisenzon Nov 2013 A1
20130321581 El-Ghoroury et al. Dec 2013 A1
20130335598 Gustavsson Dec 2013 A1
20140002674 Duparre et al. Jan 2014 A1
20140002675 Duparre et al. Jan 2014 A1
20140009586 McNamer et al. Jan 2014 A1
20140013273 Ng et al. Jan 2014 A1
20140037137 Broaddus et al. Feb 2014 A1
20140037140 Benhimane et al. Feb 2014 A1
20140043507 Wang et al. Feb 2014 A1
20140059462 Wernersson Feb 2014 A1
20140076336 Clayton et al. Mar 2014 A1
20140078333 Miao Mar 2014 A1
20140079336 Venkataraman et al. Mar 2014 A1
20140081454 Nuyujukian et al. Mar 2014 A1
20140085502 Lin et al. Mar 2014 A1
20140092281 Nisenzon et al. Apr 2014 A1
20140098266 Nayar et al. Apr 2014 A1
20140098267 Tian et al. Apr 2014 A1
20140104490 Hsieh et al. Apr 2014 A1
20140118493 Sali et al. May 2014 A1
20140118584 Lee et al. May 2014 A1
20140125771 Grossmann et al. May 2014 A1
20140132810 McMahon May 2014 A1
20140146132 Bagnato et al. May 2014 A1
20140146201 Knight et al. May 2014 A1
20140176592 Wilburn et al. Jun 2014 A1
20140183334 Wang et al. Jul 2014 A1
20140186045 Poddar et al. Jul 2014 A1
20140192154 Jeong et al. Jul 2014 A1
20140192253 Laroia Jul 2014 A1
20140198188 Izawa Jul 2014 A1
20140204183 Lee et al. Jul 2014 A1
20140218546 McMahon Aug 2014 A1
20140232822 Venkataraman et al. Aug 2014 A1
20140240528 Venkataraman et al. Aug 2014 A1
20140240529 Venkataraman et al. Aug 2014 A1
20140253738 Mullis Sep 2014 A1
20140267243 Venkataraman et al. Sep 2014 A1
20140267286 Duparre Sep 2014 A1
20140267633 Venkataraman et al. Sep 2014 A1
20140267762 Mullis et al. Sep 2014 A1
20140267829 McMahon et al. Sep 2014 A1
20140267890 Lelescu et al. Sep 2014 A1
20140285675 Mullis Sep 2014 A1
20140300706 Song Oct 2014 A1
20140313315 Shoham et al. Oct 2014 A1
20140321712 Ciurea et al. Oct 2014 A1
20140333731 Venkataraman et al. Nov 2014 A1
20140333764 Venkataraman et al. Nov 2014 A1
20140333787 Venkataraman et al. Nov 2014 A1
20140340539 Venkataraman et al. Nov 2014 A1
20140347509 Venkataraman et al. Nov 2014 A1
20140347748 Duparre Nov 2014 A1
20140354773 Venkataraman et al. Dec 2014 A1
20140354843 Venkataraman et al. Dec 2014 A1
20140354844 Venkataraman et al. Dec 2014 A1
20140354853 Venkataraman et al. Dec 2014 A1
20140354854 Venkataraman et al. Dec 2014 A1
20140354855 Venkataraman et al. Dec 2014 A1
20140355870 Venkataraman et al. Dec 2014 A1
20140368662 Venkataraman et al. Dec 2014 A1
20140368683 Venkataraman et al. Dec 2014 A1
20140368684 Venkataraman et al. Dec 2014 A1
20140368685 Venkataraman et al. Dec 2014 A1
20140368686 Duparre Dec 2014 A1
20140369612 Venkataraman et al. Dec 2014 A1
20140369615 Venkataraman et al. Dec 2014 A1
20140376825 Venkataraman et al. Dec 2014 A1
20140376826 Venkataraman et al. Dec 2014 A1
20150002734 Lee Jan 2015 A1
20150003752 Venkataraman et al. Jan 2015 A1
20150003753 Venkataraman et al. Jan 2015 A1
20150009353 Venkataraman et al. Jan 2015 A1
20150009354 Venkataraman et al. Jan 2015 A1
20150009362 Venkataraman et al. Jan 2015 A1
20150015669 Venkataraman et al. Jan 2015 A1
20150035992 Mullis Feb 2015 A1
20150036014 Lelescu et al. Feb 2015 A1
20150036015 Lelescu et al. Feb 2015 A1
20150042766 Ciurea et al. Feb 2015 A1
20150042767 Ciurea et al. Feb 2015 A1
20150042833 Lelescu et al. Feb 2015 A1
20150049915 Ciurea et al. Feb 2015 A1
20150049916 Ciurea et al. Feb 2015 A1
20150049917 Ciurea et al. Feb 2015 A1
20150055884 Venkataraman et al. Feb 2015 A1
20150085073 Bruls et al. Mar 2015 A1
20150085174 Shabtay et al. Mar 2015 A1
20150091900 Yang et al. Apr 2015 A1
20150098079 Montgomery et al. Apr 2015 A1
20150104076 Hayasaka Apr 2015 A1
20150104101 Bryant et al. Apr 2015 A1
20150122411 Rodda et al. May 2015 A1
20150124059 Georgiev et al. May 2015 A1
20150124113 Rodda et al. May 2015 A1
20150124151 Rodda et al. May 2015 A1
20150138346 Venkataraman et al. May 2015 A1
20150146029 Venkataraman et al. May 2015 A1
20150146030 Venkataraman et al. May 2015 A1
20150161798 Lelescu et al. Jun 2015 A1
20150199793 Venkataraman et al. Jul 2015 A1
20150199841 Venkataraman et al. Jul 2015 A1
20150235476 McMahon et al. Aug 2015 A1
20150243480 Yamada et al. Aug 2015 A1
20150244927 Laroia et al. Aug 2015 A1
20150248744 Hayasaka et al. Sep 2015 A1
20150254868 Srikanth et al. Sep 2015 A1
20150264337 Lelescu et al. Sep 2015 A1
20150296137 Duparre et al. Oct 2015 A1
20150312455 Venkataraman et al. Oct 2015 A1
20150326852 Duparre et al. Nov 2015 A1
20150332468 Hayasaka et al. Nov 2015 A1
20150373261 Rodda et al. Dec 2015 A1
20160037097 Duparre Feb 2016 A1
20160044252 Molina Feb 2016 A1
20160044257 Venkataraman et al. Feb 2016 A1
20160057332 Ciurea et al. Feb 2016 A1
20160065934 Kaza et al. Mar 2016 A1
20160163051 Mullis Jun 2016 A1
20160165106 Duparre Jun 2016 A1
20160165134 Lelescu et al. Jun 2016 A1
20160165147 Nisenzon et al. Jun 2016 A1
20160165212 Mullis Jun 2016 A1
20160195733 Lelescu et al. Jul 2016 A1
20160198096 McMahon et al. Jul 2016 A1
20160227195 Venkataraman et al. Aug 2016 A1
20160249001 McMahon Aug 2016 A1
20160255333 Nisenzon et al. Sep 2016 A1
20160266284 Duparre et al. Sep 2016 A1
20160267665 Venkataraman et al. Sep 2016 A1
20160267672 Ciurea et al. Sep 2016 A1
20160269626 McMahon Sep 2016 A1
20160269627 McMahon Sep 2016 A1
20160269650 Venkataraman et al. Sep 2016 A1
20160269651 Venkataraman et al. Sep 2016 A1
20160269664 Duparre Sep 2016 A1
20160316140 Nayar et al. Oct 2016 A1
20170006233 Venkataraman et al. Jan 2017 A1
20170048468 Pain et al. Feb 2017 A1
20170053382 Lelescu et al. Feb 2017 A1
20170054901 Venkataraman et al. Feb 2017 A1
20170070672 Rodda et al. Mar 2017 A1
20170070673 Lelescu et al. Mar 2017 A1
20170078568 Venkataraman et al. Mar 2017 A1
20170085845 Venkataraman et al. Mar 2017 A1
20170094243 Venkataraman et al. Mar 2017 A1
20170099465 Mullis et al. Apr 2017 A1
20170163862 Molina Jun 2017 A1
20170178363 Venkataraman et al. Jun 2017 A1
20170187933 Duparre Jun 2017 A1
20170244960 Ciurea et al. Aug 2017 A1
20170257562 Venkataraman et al. Sep 2017 A1
20170365104 McMahon et al. Dec 2017 A1
20180007284 Venkataraman et al. Jan 2018 A1
20180013945 Ciurea et al. Jan 2018 A1
20180024330 Laroia Jan 2018 A1
20180035057 McMahon et al. Feb 2018 A1
20180040135 Mullis Feb 2018 A1
20180048830 Venkataraman et al. Feb 2018 A1
20180048879 Venkataraman et al. Feb 2018 A1
20180081090 Duparre et al. Mar 2018 A1
20180097993 Nayar et al. Apr 2018 A1
20180124311 Lelescu et al. May 2018 A1
20180139382 Venkataraman et al. May 2018 A1
20180197035 Venkataraman et al. Jul 2018 A1
20180211402 Ciurea et al. Jul 2018 A1
20180240265 Yang et al. Aug 2018 A1
20180270473 Mullis Sep 2018 A1
Foreign Referenced Citations (189)
Number Date Country
1669332 Sep 2005 CN
1839394 Sep 2006 CN
101010619 Aug 2007 CN
101064780 Oct 2007 CN
101102388 Jan 2008 CN
101147392 Mar 2008 CN
101427372 May 2009 CN
101606086 Dec 2009 CN
101883291 Nov 2010 CN
102037717 Apr 2011 CN
102375199 Mar 2012 CN
104081414 Oct 2014 CN
104508681 Apr 2015 CN
104662589 May 2015 CN
104685513 Jun 2015 CN
104685860 Jun 2015 CN
104081414 Aug 2017 CN
107230236 Oct 2017 CN
107346061 Nov 2017 CN
104685513 Apr 2018 CN
0677821 Oct 1995 EP
0840502 May 1998 EP
1201407 May 2002 EP
1355274 Oct 2003 EP
1734766 Dec 2006 EP
1243945 Jan 2009 EP
2026563 Feb 2009 EP
2104334 Sep 2009 EP
2244484 Oct 2010 EP
0957642 Apr 2011 EP
2336816 Jun 2011 EP
2339532 Jun 2011 EP
2381418 Oct 2011 EP
2652678 Oct 2013 EP
2761534 Aug 2014 EP
2867718 May 2015 EP
2873028 May 2015 EP
2888698 Jul 2015 EP
2888720 Jul 2015 EP
2901671 Aug 2015 EP
2973476 Jan 2016 EP
3066690 Sep 2016 EP
2652678 Sep 2017 EP
2817955 Apr 2018 EP
3328048 May 2018 EP
3075140 Jun 2018 EP
2482022 Jan 2012 GB
2708CHENP2014 Aug 2015 IN
59025483 Feb 1984 JP
64037177 Feb 1989 JP
02285772 Nov 1990 JP
06129851 May 1994 JP
0715457 Jan 1995 JP
09171075 Jun 1997 JP
09181913 Jul 1997 JP
10253351 Sep 1998 JP
11142609 May 1999 JP
11223708 Aug 1999 JP
11325889 Nov 1999 JP
2000209503 Jul 2000 JP
2001008235 Jan 2001 JP
2001194114 Jul 2001 JP
2001264033 Sep 2001 JP
2001277260 Oct 2001 JP
2001337263 Dec 2001 JP
2002195910 Jul 2002 JP
2002205310 Jul 2002 JP
2002250607 Sep 2002 JP
2002252338 Sep 2002 JP
2003094445 Apr 2003 JP
2003139910 May 2003 JP
2003163938 Jun 2003 JP
2003298920 Oct 2003 JP
2004221585 Aug 2004 JP
2005116022 Apr 2005 JP
2005181460 Jul 2005 JP
2005295381 Oct 2005 JP
2005303694 Oct 2005 JP
2005341569 Dec 2005 JP
2005354124 Dec 2005 JP
2006033228 Feb 2006 JP
2006033493 Feb 2006 JP
2006047944 Feb 2006 JP
2006258930 Sep 2006 JP
2007520107 Jul 2007 JP
2007259136 Oct 2007 JP
2008039852 Feb 2008 JP
2008055908 Mar 2008 JP
2008507874 Mar 2008 JP
2008172735 Jul 2008 JP
2008258885 Oct 2008 JP
2009064421 Mar 2009 JP
2009132010 Jun 2009 JP
2009300268 Dec 2009 JP
2010139288 Jun 2010 JP
2011017764 Jan 2011 JP
2011030184 Feb 2011 JP
2011109484 Jun 2011 JP
2011523538 Aug 2011 JP
2011203238 Oct 2011 JP
2012504805 Feb 2012 JP
2013509022 Mar 2013 JP
2013526801 Jun 2013 JP
2014521117 Aug 2014 JP
2014535191 Dec 2014 JP
2015522178 Aug 2015 JP
2015534734 Dec 2015 JP
2016524125 Aug 2016 JP
6140709 May 2017 JP
2017163550 Sep 2017 JP
2017163587 Sep 2017 JP
2017531976 Oct 2017 JP
20110097647 Aug 2011 KR
20170063827 Jun 2017 KR
101824672 Feb 2018 KR
101843994 Mar 2018 KR
191151 Jul 2013 SG
200828994 Jul 2008 TW
200939739 Sep 2009 TW
2005057922 Jun 2005 WO
2006039906 Apr 2006 WO
2006039906 Sep 2006 WO
2007013250 Feb 2007 WO
2007083579 Jul 2007 WO
2007134137 Nov 2007 WO
2008045198 Apr 2008 WO
2008050904 May 2008 WO
2008108271 Sep 2008 WO
2008108926 Sep 2008 WO
2008150817 Dec 2008 WO
2009073950 Jun 2009 WO
2009151903 Dec 2009 WO
2009157273 Dec 2009 WO
2010037512 Apr 2010 WO
2011008443 Jan 2011 WO
2011046607 Apr 2011 WO
2011055655 May 2011 WO
2011063347 May 2011 WO
2011105814 Sep 2011 WO
2011116203 Sep 2011 WO
2011063347 Oct 2011 WO
2011143501 Nov 2011 WO
2012057619 May 2012 WO
2012057620 May 2012 WO
2012057621 May 2012 WO
2012057622 May 2012 WO
2012057623 May 2012 WO
2012057620 Jun 2012 WO
2012074361 Jun 2012 WO
2012078126 Jun 2012 WO
2012082904 Jun 2012 WO
2012155119 Nov 2012 WO
2013003276 Jan 2013 WO
2013043751 Mar 2013 WO
2013043761 Mar 2013 WO
2013049699 Apr 2013 WO
2013055960 Apr 2013 WO
2013119706 Aug 2013 WO
2013126578 Aug 2013 WO
2013166215 Nov 2013 WO
2014004134 Jan 2014 WO
2014005123 Jan 2014 WO
2014031795 Feb 2014 WO
2014052974 Apr 2014 WO
2014032020 May 2014 WO
2014078443 May 2014 WO
2014130849 Aug 2014 WO
2014133974 Sep 2014 WO
2014138695 Sep 2014 WO
2014138697 Sep 2014 WO
2014144157 Sep 2014 WO
2014145856 Sep 2014 WO
2014149403 Sep 2014 WO
2014149902 Sep 2014 WO
2014150856 Sep 2014 WO
2014153098 Sep 2014 WO
2014159721 Oct 2014 WO
2014159779 Oct 2014 WO
2014160142 Oct 2014 WO
2014164550 Oct 2014 WO
2014164909 Oct 2014 WO
2014165244 Oct 2014 WO
2014133974 Apr 2015 WO
2015048694 Apr 2015 WO
2015070105 May 2015 WO
2015074078 May 2015 WO
2015081279 Jun 2015 WO
2015134996 Sep 2015 WO
2016054089 Apr 2016 WO
Non-Patent Literature Citations (308)
Joshi et al., “Synthetic Aperture Tracking: Tracking Through Occlusions”, ICCV IEEE 11th International Conference on Computer Vision; Publication [online]. Oct. 2007 [retrieved Jul. 28, 2014]. Retrieved from the Internet: <URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4409032&isnumber=4408819>; pp. 1-8.
Kang et al., “Handling Occlusions in Dense Multi-view Stereo”, Computer Vision and Pattern Recognition, 2001, vol. 1, pp. I-103-I-110.
Kim et al., “Scene reconstruction from high spatio-angular resolution light fields”, ACM Transactions on Graphics (TOG)—SIGGRAPH 2013 Conference Proceedings, vol. 32, Issue 4, Article 73, Jul. 21, 2013, 11 pages.
Kitamura et al., “Reconstruction of a high-resolution image on a compound-eye image-capturing system”, Applied Optics, Mar. 10, 2004, vol. 43, No. 8, pp. 1719-1727.
Konolige, Kurt, “Projected Texture Stereo”, 2010 IEEE International Conference on Robotics and Automation, May 3-7, 2010, pp. 148-155.
Krishnamurthy et al., “Compression and Transmission of Depth Maps for Image-Based Rendering”, Image Processing, 2001, pp. 828-831.
Kubota et al., “Reconstructing Dense Light Field From Array of Multifocus Images for Novel View Synthesis”, IEEE Transactions on Image Processing, vol. 16, No. 1, Jan. 2007, pp. 269-279.
Kutulakos et al., “Occluding Contour Detection Using Affine Invariants and Purposive Viewpoint Control”, Computer Vision and Pattern Recognition, Proceedings CVPR 94, Seattle, Washington, Jun. 21-23, 1994, 8 pgs.
Lai et al., “A Large-Scale Hierarchical Multi-View RGB-D Object Dataset”, Proceedings—IEEE International Conference on Robotics and Automation, Conference Date May 9-13, 2011, 8 pgs., DOI:10.1109/ICRA.2011.5980382.
Lane et al., “A Survey of Mobile Phone Sensing”, IEEE Communications Magazine, vol. 48, Issue 9, Sep. 2010, pp. 140-150.
Lee et al., “Automatic Upright Adjustment of Photographs”, IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2012, pp. 877-884.
Lee et al., “Electroactive Polymer Actuator for Lens-Drive Unit in Auto-Focus Compact Camera Module”, ETRI Journal, vol. 31, No. 6, Dec. 2009, pp. 695-702.
Lee et al., “Nonlocal matting”, CVPR 2011, Jun. 20-25, 2011, pp. 2193-2200.
LensVector, “How LensVector Autofocus Works”, 2010, printed Nov. 2, 2012 from http://www.lensvector.com/overview.html, 1 pg.
Levin et al., “A Closed Form Solution to Natural Image Matting”, Pattern Analysis and Machine Intelligence, Dec. 18, 2007, vol. 30, Issue 2, 8 pgs.
Levin et al., “Spectral Matting”, 2007 IEEE Conference on Computer Vision and Pattern Recognition, Jun. 17-22, 2007, Minneapolis, MN, USA, pp. 1-8.
Levoy, “Light Fields and Computational Imaging”, IEEE Computer Society, Sep. 1, 2006, vol. 39, Issue No. 8, pp. 46-55.
Levoy et al., “Light Field Rendering”, Proc. ACM SIGGRAPH '96, 1996, pp. 1-12.
Li et al., “A Hybrid Camera for Motion Deblurring and Depth Map Super-Resolution”, Jun. 23-28, 2008, IEEE Conference on Computer Vision and Pattern Recognition, 8 pgs. Retrieved from www.eecis.udel.edu/~jye/lab_research/08/deblur-feng.pdf on Feb. 5, 2014.
Li et al., “Fusing Images With Different Focuses Using Support Vector Machines”, IEEE Transactions on Neural Networks, vol. 15, No. 6, Nov. 8, 2004, pp. 1555-1561.
Lim, Jongwoo, “Optimized Projection Pattern Supplementing Stereo Systems”, 2009 IEEE International Conference on Robotics and Automation, May 12-17, 2009, pp. 2823-2829.
Liu et al., “Virtual View Reconstruction Using Temporal Information”, 2012 IEEE International Conference on Multimedia and Expo, 2012, pp. 115-120.
Lo et al., “Stereoscopic 3D Copy & Paste”, ACM Transactions on Graphics, vol. 29, No. 6, Article 147, Dec. 2010, pp. 147:1-147:10.
Martinez et al., “Simple Telemedicine for Developing Regions: Camera Phones and Paper-Based Microfluidic Devices for Real-Time, Off-Site Diagnosis”, Analytical Chemistry (American Chemical Society), vol. 80, No. 10, May 15, 2008, pp. 3699-3707.
McGuire et al., “Defocus video matting”, ACM Transactions on Graphics (TOG)—Proceedings of ACM SIGGRAPH 2005, vol. 24, Issue 3, Jul. 2005, pp. 567-576.
Merkle et al., “Adaptation and optimization of coding algorithms for mobile 3DTV”, Mobile3DTV Project No. 216503, Nov. 2008, 55 pgs.
Mitra et al., “Light Field Denoising, Light Field Superresolution and Stereo Camera Based Refocussing using a GMM Light Field Patch Prior”, Computer Vision and Pattern Recognition Workshops (CVPRW), 2012 IEEE Computer Society Conference on Jun. 16-21, 2012, pp. 22-28.
Moreno-Noguer et al., “Active Refocusing of Images and Videos”, ACM Transactions on Graphics (TOG)—Proceedings of ACM SIGGRAPH 2007, vol. 26, Issue 3, Jul. 2007, 10 pages.
Muehlebach, “Camera Auto Exposure Control for VSLAM Applications”, Studies on Mechatronics, Swiss Federal Institute of Technology Zurich, Autumn Term 2010 course, 67 pgs.
Nayar, “Computational Cameras: Redefining the Image”, IEEE Computer Society, Aug. 14, 2006, pp. 30-38.
Ng, “Digital Light Field Photography”, Thesis, Jul. 2006, 203 pgs.
Ng et al., “Light Field Photography with a Hand-held Plenoptic Camera”, Stanford Tech Report CTSR 2005-02, Apr. 20, 2005, pp. 1-11.
Ng et al., “Super-Resolution Image Restoration from Blurred Low-Resolution Images”, Journal of Mathematical Imaging and Vision, 2005, vol. 23, pp. 367-378.
Nguyen et al., “Error Analysis for Image-Based Rendering with Depth Information”, IEEE Transactions on Image Processing, vol. 18, Issue 4, Apr. 2009, pp. 703-716.
Nguyen et al., “Image-Based Rendering with Depth Information Using the Propagation Algorithm”, Proceedings. (ICASSP '05). IEEE International Conference on Acoustics, Speech, and Signal Processing, 2005, vol. 5, Mar. 23-23, 2005, pp. II-589-II-592.
Nishihara, H.K., “PRISM: A Practical Real-Time Imaging Stereo Matcher”, Massachusetts Institute of Technology, A.I. Memo 780, May 1984, 32 pgs.
Nitta et al., “Image reconstruction for thin observation module by bound optics by using the iterative backprojection method”, Applied Optics, May 1, 2006, vol. 45, No. 13, pp. 2893-2900.
Nomura et al., “Scene Collages and Flexible Camera Arrays”, Proceedings of Eurographics Symposium on Rendering, Jun. 2007, 12 pgs.
Park et al., “Multispectral Imaging Using Multiplexed Illumination”, 2007 IEEE 11th International Conference on Computer Vision, Oct. 14-21, 2007, Rio de Janeiro, Brazil, pp. 1-8.
Park et al., “Super-Resolution Image Reconstruction”, IEEE Signal Processing Magazine, May 2003, pp. 21-36.
Parkkinen et al., “Characteristic Spectra of Munsell Colors”, Journal of the Optical Society of America A, vol. 6, Issue 2, Feb. 1989, pp. 318-322.
Perwass et al., “Single Lens 3D-Camera with Extended Depth-of-Field”, printed from www.raytrix.de, Jan. 22, 2012, 15 pgs.
Pham et al., “Robust Super-Resolution without Regularization”, Journal of Physics: Conference Series 124, Jul. 2008, pp. 1-19.
Philips 3D Solutions, “3D Interface Specifications, White Paper”, Feb. 15, 2008, 2005-2008 Philips Electronics Nederland B.V., retrieved from www.philips.com/3dsolutions, 29 pgs.
Polight, “Designing Imaging Products Using Reflowable Autofocus Lenses”, printed Nov. 2, 2012 from http://www.polight.no/tunable-polymer-autofocus-lens-html--11.html, 1 pg.
Pouydebasque et al., “Varifocal liquid lenses with integrated actuator, high focusing power and low operating voltage fabricated on 200 mm wafers”, Sensors and Actuators A: Physical, vol. 172, Issue 1, Dec. 2011, pp. 280-286.
Protter et al., “Generalizing the Nonlocal-Means to Super-Resolution Reconstruction”, IEEE Transactions on Image Processing, Dec. 2, 2008, vol. 18, No. 1, pp. 36-51.
Radtke et al., “Laser lithographic fabrication and characterization of a spherical artificial compound eye”, Optics Express, Mar. 19, 2007, vol. 15, No. 6, pp. 3067-3077.
Rajan et al., “Simultaneous Estimation of Super Resolved Scene and Depth Map from Low Resolution Defocused Observations”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 25, No. 9, Sep. 8, 2003, pp. 1-16.
Rander et al., “Virtualized Reality: Constructing Time-Varying Virtual Worlds From Real World Events”, Proc. of IEEE Visualization '97, Phoenix, Arizona, Oct. 19-24, 1997, pp. 277-283, 552.
Rhemann et al., “Fast Cost-Volume Filtering for Visual Correspondence and Beyond”, IEEE Trans. Pattern Anal. Mach. Intell., 2013, vol. 35, No. 2, pp. 504-511.
Rhemann et al., “A perceptually motivated online benchmark for image matting”, 2009 IEEE Conference on Computer Vision and Pattern Recognition, Jun. 20-25, 2009, Miami, FL, USA, pp. 1826-1833.
Robertson et al., “Dynamic Range Improvement Through Multiple Exposures”, In Proc. of the Int. Conf. on Image Processing, 1999, 5 pgs.
Robertson et al., “Estimation-theoretic approach to dynamic range enhancement using multiple exposures”, Journal of Electronic Imaging, Apr. 2003, vol. 12, No. 2, pp. 219-228.
Roy et al., “Non-Uniform Hierarchical Pyramid Stereo for Large Images”, Computer and Robot Vision, 2002, pp. 208-215.
Sauer et al., “Parallel Computation of Sequential Pixel Updates in Statistical Tomographic Reconstruction”, ICIP 1995 Proceedings of the 1995 International Conference on Image Processing, Date of Conference: Oct. 23-26, 1995, pp. 93-96.
Scharstein et al., “High-Accuracy Stereo Depth Maps Using Structured Light”, IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2003), Jun. 2003, vol. 1, pp. 195-202.
Extended European Search Report for European Application No. 14865463.5, Search completed May 30, 2017, dated Jun. 8, 2017, 6 Pgs.
Extended European Search Report for European Application No. 15847754.7, Search completed Jan. 25, 2018, dated Feb. 9, 2018, 8 Pgs.
File Wrapper for U.S. Appl. No. 61/527,007, filed Aug. 24, 2011, 21 pgs.
International Preliminary Report on Patentability for International Application No. PCT/US2011/064921, Report dated Jun. 18, 2013, dated Jun. 27, 2013, 14 Pgs.
International Preliminary Report on Patentability for International Application No. PCT/US2012/056151, Report dated Mar. 25, 2014, 9 pgs.
International Preliminary Report on Patentability for International Application No. PCT/US2012/056166, Report dated Mar. 25, 2014, dated Apr. 3, 2014, 8 pgs.
International Preliminary Report on Patentability for International Application No. PCT/US2012/058093, dated Sep. 18, 2013, dated Oct. 22, 2013, 40 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/029052, dated Sep. 15, 2015, dated Sep. 24, 2015, 9 Pgs.
International Preliminary Report on Patentability for International Application PCT/US2015/053013, dated Apr. 4, 2017, dated Apr. 13, 2017, 9 Pgs.
International Search Report and Written Opinion for International Application No. PCT/US2012/056166, Completed Nov. 10, 2012, dated Nov. 20, 2012, 9 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2013/027146, completed Apr. 2, 2013, dated Apr. 19, 2013, 11 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2015/053013, completed Dec. 1, 2015, dated Dec. 30, 2015, 9 Pgs.
International Search Report and Written Opinion for International Application PCT/US2014/017766, completed May 28, 2014, dated Jun. 18, 2014, 9 Pgs.
International Search Report and Written Opinion for International Application PCT/US2014/018084, completed May 23, 2014, dated Jun. 10, 2014, 12 pgs.
International Search Report and Written Opinion for International Application PCT/US2014/018116, completed May 13, 2014, dated Jun. 2, 2014, 13 Pgs.
International Search Report and Written Opinion for International Application PCT/US2010/057661, completed Mar. 9, 2011, dated Mar. 17, 2011, 14 pgs.
International Search Report and Written Opinion for International Application PCT/US2014/029052, completed Jun. 30, 2014, dated Jul. 24, 2014, 10 Pgs.
Collins et al., “An Active Camera System for Acquiring Multi-View Video”, IEEE 2002 International Conference on Image Processing, Date of Conference: Sep. 22-25, 2002, Rochester, NY, 4 pgs.
Drulea et al., “Motion Estimation Using the Correlation Transform”, IEEE Transactions on Image Processing, Aug. 2013, vol. 22, No. 8, pp. 3260-3270, first published May 14, 2013.
Holoeye Photonics AG, “LC 2012 Spatial Light Modulator (transmissive)”, Sep. 18, 2013, retrieved from https://web.archive.org/web/20130918151716/http://holoeye.com/spatial-light-modulators/lc-2012-spatial-light-modulator/ on Oct. 20, 2017, 3 pages.
Joshi, Neel S., “Color Calibration for Arrays of Inexpensive Image Sensors”, Master's with Distinction in Research Report, Stanford University, Department of Computer Science, Mar. 2004, 30 pgs.
Robert et al., “Dense Depth Map Reconstruction : A Minimization and Regularization Approach which Preserves Discontinuities”, European Conference on Computer Vision (ECCV), pp. 439-451, 1996.
Van Der Wal et al., “The Acadia Vision Processor”, Proceedings Fifth IEEE International Workshop on Computer Architectures for Machine Perception, Sep. 13, 2000, Padova, Italy, pp. 31-40.
International Search Report and Written Opinion for International Application PCT/US2009/044687, completed Jan. 5, 2010, dated Jan. 13, 2010, 9 pgs.
International Search Report and Written Opinion for International Application PCT/US2010/057661, completed Mar. 9, 2011, 14 pgs.
International Search Report and Written Opinion for International Application PCT/US2012/037670, completed Jul. 5, 2012, dated Jul. 18, 2012, 9 pgs.
International Search Report and Written Opinion for International Application PCT/US2012/044014, completed Oct. 12, 2012, 15 pgs.
International Search Report and Written Opinion for International Application PCT/US2012/056151, completed Nov. 14, 2012, 10 pgs.
International Search Report and Written Opinion for International Application PCT/US2012/058093, Report completed Nov. 15, 2012, 12 pgs.
International Search Report and Written Opinion for International Application PCT/US2012/059813, completed Dec. 17, 2012, 8 pgs.
International Search Report and Written Opinion for International Application PCT/US2014/022123, completed Jun. 9, 2014, dated Jun. 25, 2014, 5 pgs.
International Search Report and Written Opinion for International Application PCT/US2014/023762, Completed May 30, 2014, dated Jul. 3, 2014, 6 Pgs.
International Search Report and Written Opinion for International Application PCT/US2014/024903, completed Jun. 12, 2014, dated Jun. 27, 2014, 13 pgs.
International Search Report and Written Opinion for International Application PCT/US2014/024947, Completed Jul. 8, 2014, dated Aug. 5, 2014, 8 Pgs.
International Search Report and Written Opinion for International Application PCT/US2014/028447, completed Jun. 30, 2014, dated Jul. 21, 2014, 8 Pgs.
International Search Report and Written Opinion for International Application PCT/US2014/030692, completed Jul. 28, 2014, dated Aug. 27, 2014, 7 Pgs.
International Search Report and Written Opinion for International Application PCT/US2014/064693, Completed Mar. 7, 2015, dated Apr. 2, 2015, 15 pgs.
International Search Report and Written Opinion for International Application PCT/US2014/066229, Completed Mar. 6, 2015, dated Mar. 19, 2015, 9 Pgs.
International Search Report and Written Opinion for International Application PCT/US2014/067740, Completed Jan. 29, 2015, dated Mar. 3, 2015, 10 pgs.
Notice of Allowance for U.S. Appl. No. 12/935,504, dated Jul. 18, 2014, 12 pgs.
Office Action for U.S. Appl. No. 12/952,106, dated Aug. 16, 2012, 12 pgs.
Supplementary European Search Report for EP Application No. 13831768.0, Search completed May 18, 2016, dated May 30, 2016, 13 Pgs.
“Exchangeable image file format for digital still cameras: Exif Version 2.2”, Japan Electronics and Information Technology Industries Association, Prepared by Technical Standardization Committee on AV & IT Storage Systems and Equipment, JEITA CP-3451, Apr. 2002, Retrieved from: http://www.exif.org/Exif2-2.PDF, 154 pgs.
“File Formats Version 6”, Alias Systems, 2004, 40 pgs.
“Light fields and computational photography”, Stanford Computer Graphics Laboratory, Retrieved from: http://graphics.stanford.edu/projects/lightfield/, Earliest publication online: Feb. 10, 1997, 3 pgs.
Notice of Allowance for U.S. Appl. No. 12/952,134, report completed Jul. 24, 2014, dated Jul. 24, 2014, 8 pgs.
Aufderheide et al., “A MEMS-based Smart Sensor System for Estimation of Camera Pose for Computer Vision Applications”, Research and Innovation Conference 2011, Jul. 29, 2011, pp. 1-10.
Baker et al., “Limits on Super-Resolution and How to Break Them”, IEEE Transactions on Pattern Analysis and Machine Intelligence, Sep. 2002, vol. 24, No. 9, pp. 1167-1183.
Barron et al., “Intrinsic Scene Properties from a Single RGB-D Image”, 2013 IEEE Conference on Computer Vision and Pattern Recognition, Jun. 23-28, 2013, Portland, OR, USA, pp. 17-24.
Bennett et al., “Multispectral Bilateral Video Fusion”, 2007 IEEE Transactions on Image Processing, vol. 16, No. 5, May 2007, published Apr. 16, 2007, pp. 1185-1194.
Bennett et al., “Multispectral Video Fusion”, Computer Graphics (ACM SIGGRAPH Proceedings), Jul. 25, 2006, published Jul. 30, 2006, 1 pg.
Bertalmio et al., “Image Inpainting”, Proceedings of the 27th Annual Conference on Computer Graphics and Interactive Techniques, 2000, ACM Pres/Addison-Wesley Publishing Co., pp. 417-424.
Bertero et al., “Super-resolution in computational imaging”, Micron, Jan. 1, 2003, vol. 34, Issues 6-7, 17 pgs.
Bishop et al., “Full-Resolution Depth Map Estimation from an Aliased Plenoptic Light Field”, ACCV Nov. 8, 2010, Part II, LNCS 6493, pp. 186-200.
Bishop et al., “Light Field Superresolution”, Computational Photography (ICCP), 2009 IEEE International Conference, Conference Date Apr. 16-17, published Jan. 26, 2009, 9 pgs.
Bishop et al., “The Light Field Camera: Extended Depth of Field, Aliasing, and Superresolution”, IEEE Transactions on Pattern Analysis and Machine Intelligence, May 2012, vol. 34, No. 5, published Aug. 18, 2011, pp. 972-986.
Borman, “Topics in Multiframe Superresolution Restoration”, Thesis of Sean Borman, Apr. 2004, 282 pgs.
Borman et al., “Image Sequence Processing”, Dekker Encyclopedia of Optical Engineering, Oct. 14, 2002, 81 pgs.
Borman et al., “Block-Matching Sub-Pixel Motion Estimation from Noisy, Under-Sampled Frames—An Empirical Performance Evaluation”, Proc SPIE, Dec. 28, 1998, vol. 3653, 10 pgs.
Borman et al., “Image Resampling and Constraint Formulation for Multi-Frame Super-Resolution Restoration”, Proc. SPIE, published Jul. 1, 2003, vol. 5016, 12 pgs.
Borman et al., “Linear models for multi-frame super-resolution restoration under non-affine registration and spatially varying PSF”, Proc. SPIE, May 21, 2004, vol. 5299, 12 pgs.
Borman et al., “Nonlinear Prediction Methods for Estimation of Clique Weighting Parameters in NonGaussian Image Models”, Proc. SPIE, Sep. 22, 1998, vol. 3459, 9 pgs.
Borman et al., “Simultaneous Multi-Frame MAP Super-Resolution Video Enhancement Using Spatio-Temporal Priors”, Image Processing, 1999, ICIP 99 Proceedings, vol. 3, pp. 469-473.
Borman et al., “Super-Resolution from Image Sequences—A Review”, Circuits & Systems, 1998, pp. 374-378.
Bose et al., “Superresolution and Noise Filtering Using Moving Least Squares”, IEEE Transactions on Image Processing, Aug. 2006, vol. 15, Issue 8, published Jul. 17, 2006, pp. 2239-2248.
Boye et al., “Comparison of Subpixel Image Registration Algorithms”, Proc. of SPIE—IS&T Electronic Imaging, Feb. 3, 2009, vol. 7246, pp. 72460X-1-72460X-9; doi: 10.1117/12.810369.
Bruckner et al., “Artificial compound eye applying hyperacuity”, Optics Express, Dec. 11, 2006, vol. 14, No. 25, pp. 12076-12084.
Bruckner et al., “Driving microoptical imaging systems towards miniature camera applications”, Proc. SPIE, Micro-Optics, May 13, 2010, 11 pgs.
Bruckner et al., “Thin wafer-level camera lenses inspired by insect compound eyes”, Optics Express, Nov. 22, 2010, vol. 18, No. 24, pp. 24379-24394.
Bryan et al., “Perspective Distortion from Interpersonal Distance Is an Implicit Visual Cue for Social Judgments of Faces”, PLOS One, vol. 7, Issue 9, Sep. 26, 2012, e45301, doi:10.1371/journal.pone.0045301, 9 pages.
Capel, “Image Mosaicing and Super-resolution”, Retrieved on Nov. 10, 2012, Retrieved from the Internet at URL:<http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.226.2643&rep=rep1&type=pdf>, 2001, 269 pgs.
Carroll et al., “Image Warps for Artistic Perspective Manipulation”, ACM Transactions on Graphics (TOG), vol. 29, No. 4, Jul. 26, 2010, Article No. 127, 9 pgs.
Chan et al., “Extending the Depth of Field in a Compound-Eye Imaging System with Super-Resolution Reconstruction”, Proceedings—International Conference on Pattern Recognition, Jan. 1, 2006, vol. 3, pp. 623-626.
Chan et al., “Investigation of Computational Compound-Eye Imaging System with Super-Resolution Reconstruction”, IEEE ICASSP, Jun. 19, 2006, pp. 1177-1180.
Chan et al., “Super-resolution reconstruction in a computational compound-eye imaging system”, Multidim Syst Sign Process, published online Feb. 23, 2007, vol. 18, pp. 83-101.
Chen et al., “Image Matting with Local and Nonlocal Smooth Priors”, CVPR '13 Proceedings of the 2013 IEEE Conference on Computer Vision and Pattern Recognition, Jun. 23, 2013, pp. 1902-1907.
Chen et al., “Interactive deformation of light fields”, In Proceedings of SIGGRAPH I3D, Apr. 3, 2005, pp. 139-146.
Chen et al., “KNN matting”, 2012 IEEE Conference on Computer Vision and Pattern Recognition, Jun. 16-21, 2012, Providence, RI, USA, pp. 869-876.
Chen et al., “KNN Matting”, IEEE Transactions on Pattern Analysis and Machine Intelligence, Sep. 2013, vol. 35, No. 9, pp. 2175-2188.
Cooper et al., “The perceptual basis of common photographic practice”, Journal of Vision, vol. 12, No. 5, Article 8, May 25, 2012, pp. 1-14.
Crabb et al., “Real-time foreground segmentation via range and color imaging”, 2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, Anchorage, AK, USA, Jun. 23-28, 2008, pp. 1-5.
Debevec et al., “Recovering High Dynamic Range Radiance Maps from Photographs”, Computer Graphics (ACM SIGGRAPH Proceedings), Aug. 16, 1997, 10 pgs.
Do, Minh N., “Immersive Visual Communication with Depth”, Presented at Microsoft Research, Jun. 15, 2011, Retrieved from: http://minhdo.ece.illinois.edu/talks/ImmersiveComm.pdf, 42 pgs.
Do et al., “Immersive Visual Communication”, IEEE Signal Processing Magazine, vol. 28, Issue 1, Jan. 2011, DOI: 10.1109/MSP.2010.939075, Retrieved from: http://minhdo.ece.illinois.edu/publications/ImmerComm_SPM.pdf, pp. 58-66.
Drouin et al., “Fast Multiple-Baseline Stereo with Occlusion”, Fifth International Conference on 3-D Digital Imaging and Modeling (3DIM'05), Ottawa, Ontario, Canada, Jun. 13-16, 2005, pp. 540-547.
Drouin et al., “Geo-Consistency for Wide Multi-Camera Stereo”, 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05), vol. 1, Jun. 20-25, 2005, pp. 351-358.
Drouin et al., “Improving Border Localization of Multi-Baseline Stereo Using Border-Cut”, International Journal of Computer Vision, Jul. 5, 2006, vol. 83, Issue 3, 8 pgs.
Duparre et al., “Artificial apposition compound eye fabricated by micro-optics technology”, Applied Optics, Aug. 1, 2004, vol. 43, No. 22, pp. 4303-4310.
Duparre et al., “Artificial compound eye zoom camera”, Bioinspiration & Biomimetics, Nov. 21, 2008, vol. 3, pp. 1-6.
Duparre et al., “Artificial compound eyes—different concepts and their application to ultra flat image acquisition sensors”, MOEMS and Miniaturized Systems IV, Proc. SPIE 5346, Jan. 24, 2004, pp. 89-100.
Duparre et al., “Chirped arrays of refractive ellipsoidal microlenses for aberration correction under oblique incidence”, Optics Express, Dec. 26, 2005, vol. 13, No. 26, pp. 10539-10551.
Duparre et al., “Micro-optical artificial compound eyes”, Bioinspiration & Biomimetics, Apr. 6, 2006, vol. 1, pp. R1-R16.
Duparre et al., “Microoptical artificial compound eyes—from design to experimental verification of two different concepts”, Proc. of SPIE, Optical Design and Engineering II, vol. 5962, Oct. 17, 2005, pp. 59622A-1-59622A-12.
Duparre et al., “Microoptical Artificial Compound Eyes—Two Different Concepts for Compact Imaging Systems”, 11th Microoptics Conference, Oct. 30-Nov. 2, 2005, 2 pgs.
Duparre et al., “Microoptical telescope compound eye”, Optics Express, Feb. 7, 2005, vol. 13, No. 3, pp. 889-903.
Duparre et al., “Micro-optically fabricated artificial apposition compound eye”, Electronic Imaging—Science and Technology, Prod. SPIE 5301, Jan. 2004, pp. 25-33.
Duparre et al., “Novel Optics/Micro-Optics for Miniature Imaging Systems”, Proc. of SPIE, Apr. 21, 2006, vol. 6196, pp. 619607-1-619607-15.
Duparre et al., “Theoretical analysis of an artificial superposition compound eye for application in ultra flat digital image acquisition devices”, Optical Systems Design, Proc. SPIE 5249, Sep. 2003, pp. 408-418.
Duparre et al., “Thin compound-eye camera”, Applied Optics, May 20, 2005, vol. 44, No. 15, pp. 2949-2956.
Duparre et al., “Ultra-Thin Camera Based on Artificial Apposition Compound Eyes”, 10th Microoptics Conference, Sep. 1-3, 2004, 2 pgs.
Eng, Wei Yong et al., “Gaze correction for 3D tele-immersive communication system”, 2013 IEEE 11th IVMSP Workshop, IEEE, Jun. 10, 2013.
Fanaswala, “Regularized Super-Resolution of Multi-View Images”, Retrieved on Nov. 10, 2012. Retrieved from the Internet at URL:<http://www.site.uottawa.ca/edubois/theses/Fanaswala_thesis.pdf>, 2009, 163 pgs.
Fang et al., “Volume Morphing Methods for Landmark Based 3D Image Deformation”, SPIE vol. 2710, Proc. 1996 SPIE Intl Symposium on Medical Imaging, Newport Beach, CA, Feb. 10, 1996, pp. 404-415.
Farrell et al., “Resolution and Light Sensitivity Tradeoff with Pixel Size”, Proceedings of the SPIE Electronic Imaging 2006 Conference, Feb. 2, 2006, vol. 6069, 8 pgs.
Farsiu et al., “Advances and Challenges in Super-Resolution”, International Journal of Imaging Systems and Technology, Aug. 12, 2004, vol. 14, pp. 47-57.
Farsiu et al., “Fast and Robust Multiframe Super Resolution”, IEEE Transactions on Image Processing, Oct. 2004, published Sep. 3, 2004, vol. 13, No. 10, pp. 1327-1344.
Farsiu et al., “Multiframe Demosaicing and Super-Resolution of Color Images”, IEEE Transactions on Image Processing, Jan. 2006, vol. 15, No. 1, date of publication Dec. 12, 2005, pp. 141-159.
Fecker et al., “Depth Map Compression for Unstructured Lumigraph Rendering”, Proc. SPIE 6077, Proceedings Visual Communications and Image Processing 2006, Jan. 18, 2006, pp. 60770B-1-60770B-8.
Feris et al., “Multi-Flash Stereopsis: Depth Edge Preserving Stereo with Small Baseline Illumination”, IEEE Trans on PAMI, 2006, 31 pgs.
Fife et al., “A 3D Multi-Aperture Image Sensor Architecture”, Custom Integrated Circuits Conference, 2006, CICC '06, IEEE, pp. 281-284.
Fife et al., “A 3MPixel Multi-Aperture Image Sensor with 0.7μm Pixels in 0.11μm CMOS”, ISSCC 2008, Session 2, Image Sensors & Technology, 2008, pp. 48-50.
Fischer et al., “Optical System Design”, 2nd Edition, SPIE Press, Feb. 14, 2008, pp. 191-198.
Fischer et al., “Optical System Design”, 2nd Edition, SPIE Press, Feb. 14, 2008, pp. 49-58.
Gastal et al., “Shared Sampling for Real-Time Alpha Matting”, Computer Graphics Forum, Eurographics 2010, vol. 29, Issue 2, May 2010, pp. 575-584.
Georgiev et al., “Light Field Camera Design for Integral View Photography”, Adobe Systems Incorporated, Adobe Technical Report, 2003, 13 pgs.
Georgiev et al., “Light-Field Capture by Multiplexing in the Frequency Domain”, Adobe Systems Incorporated, Adobe Technical Report, 2003, 13 pgs.
Goldman et al., “Video Object Annotation, Navigation, and Composition”, In Proceedings of UIST 2008, Oct. 19-22, 2008, Monterey CA, USA, pp. 3-12.
Gortler et al., “The Lumigraph”, In Proceedings of SIGGRAPH 1996, published Aug. 1, 1996, pp. 43-54.
Gupta et al., “Perceptual Organization and Recognition of Indoor Scenes from RGB-D Images”, 2013 IEEE Conference on Computer Vision and Pattern Recognition, Jun. 23-28, 2013, Portland, OR, USA, pp. 564-571.
Hacohen et al., “Non-Rigid Dense Correspondence with Applications for Image Enhancement”, ACM Transactions on Graphics, vol. 30, No. 4, Aug. 7, 2011, pp. 70:1-70:10.
Hamilton, “JPEG File Interchange Format, Version 1.02”, Sep. 1, 1992, 9 pgs.
Hardie, “A Fast Image Super-Algorithm Using an Adaptive Wiener Filter”, IEEE Transactions on Image Processing, Dec. 2007, published Nov. 19, 2007, vol. 16, No. 12, pp. 2953-2964.
Hasinoff et al., “Search-and-Replace Editing for Personal Photo Collections”, 2010 International Conference: Computational Photography (ICCP) Mar. 2010, pp. 1-8.
Hernandez-Lopez et al., “Detecting objects using color and depth segmentation with Kinect sensor”, Procedia Technology, vol. 3, Jan. 1, 2012, pp. 196-204, XP055307680, ISSN: 2212-0173, DOI: 10.1016/j.protcy.2012.03.021.
Holoeye Photonics AG, “Spatial Light Modulators”, Oct. 2, 2013, Brochure retrieved from https://web.archive.org/web/20131002061028/http://holoeye.com/wp-content/uploads/Spatial_Light_Modulators.pdf on Oct. 13, 2017, 4 pgs.
Holoeye Photonics AG, “Spatial Light Modulators”, Sep. 18, 2013, retrieved from https://web.archive.org/web/20130918113140/http://holoeye.com/spatial-light-modulators/ on Oct. 13, 2017, 4 pgs.
Horisaki et al., “Irregular Lens Arrangement Design to Improve Imaging Performance of Compound-Eye Imaging Systems”, Applied Physics Express, Jan. 29, 2010, vol. 3, pp. 022501-1-022501-3.
Horisaki et al., “Superposition Imaging for Three-Dimensionally Space-Invariant Point Spread Functions”, Applied Physics Express, Oct. 13, 2011, vol. 4, pp. 112501-1-112501-3.
Horn et al., “LightShop: Interactive Light Field Manipulation and Rendering”, In Proceedings of I3D, Jan. 1, 2007, pp. 121-128.
Isaksen et al., “Dynamically Reparameterized Light Fields”, In Proceedings of SIGGRAPH 2000, pp. 297-306.
Izadi et al., “KinectFusion: Real-time 3D Reconstruction and Interaction Using a Moving Depth Camera”, UIST'11, Oct. 16-19, 2011, Santa Barbara, CA, pp. 559-568.
Janoch et al., “A category-level 3-D object dataset: Putting the Kinect to work”, 2011 IEEE International Conference on Computer Vision Workshops (ICCV Workshops), Nov. 6-13, 2011, Barcelona, Spain, pp. 1168-1174.
Jarabo et al., “Efficient Propagation of Light Field Edits”, In Proceedings of SIACG 2011, pp. 75-80.
Jiang et al., “Panoramic 3D Reconstruction Using Rotational Stereo Camera with Simple Epipolar Constraints”, 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06), vol. 1, Jun. 17-22, 2006, New York, NY, USA, pp. 371-378.
Seitz et al., “Plenoptic Image Editing”, International Journal of Computer Vision, vol. 48; Sixth International Conference on Computer Vision, Conference Date Jan. 7, 1998, 29 pgs., DOI: 10.1109/ICCV.1998.710696.
Shotton et al., “Real-time human pose recognition in parts from single depth images”, CVPR 2011, Jun. 20-25, 2011, Colorado Springs, CO, USA, pp. 1297-1304.
Shum et al., “A Review of Image-based Rendering Techniques”, Visual Communications and Image Processing 2000, May 2000, 12 pgs.
Shum et al., “Pop-Up Light Field: An Interactive Image-Based Modeling and Rendering System”, Apr. 2004, ACM Transactions on Graphics, vol. 23, No. 2, pp. 143-162. Retrieved from http://131.107.65.14/en-us/um/people/jiansun/papers/PopupLightField_TOG.pdf on Feb. 5, 2014.
Silberman et al., “Indoor segmentation and support inference from RGBD images”, ECCV'12 Proceedings of the 12th European conference on Computer Vision, vol. Part V, Oct. 7-13, 2012, Florence, Italy, pp. 746-760.
Stober, “Stanford researchers developing 3-D camera with 12,616 lenses”, Stanford Report, Mar. 19, 2008, Retrieved from: http://news.stanford.edu/news/2008/march19/camera-031908.html, 5 pgs.
Stollberg et al., “The Gabor superlens as an alternative wafer-level camera approach inspired by superposition compound eyes of nocturnal insects”, Optics Express, Aug. 31, 2009, vol. 17, No. 18, pp. 15747-15759.
Sun et al., “Image Super-Resolution Using Gradient Profile Prior”, 2008 IEEE Conference on Computer Vision and Pattern Recognition, Jun. 23-28, 2008, 8 pgs.; DOI: 10.1109/CVPR.2008.4587659.
Taguchi et al., “Rendering-Oriented Decoding for a Distributed Multiview Coding System Using a Coset Code”, Hindawi Publishing Corporation, EURASIP Journal on Image and Video Processing, vol. 2009, Article ID 251081, Online: Apr. 22, 2009, 12 pages.
Takeda et al., “Super-resolution Without Explicit Subpixel Motion Estimation”, IEEE Transactions on Image Processing, Sep. 2009, vol. 18, No. 9, pp. 1958-1975.
Tallon et al., “Upsampling and Denoising of Depth Maps Via Joint-Segmentation”, 20th European Signal Processing Conference, Aug. 27-31, 2012, 5 pgs.
Tanida et al., “Color imaging with an integrated compound imaging system”, Optics Express, Sep. 8, 2003, vol. 11, No. 18, pp. 2109-2117.
Tanida et al., “Thin observation module by bound optics (TOMBO): concept and experimental verification”, Applied Optics, Apr. 10, 2001, vol. 40, No. 11, pp. 1806-1813.
Tao et al., “Depth from Combining Defocus and Correspondence Using Light-Field Cameras”, ICCV '13 Proceedings of the 2013 IEEE International Conference on Computer Vision, Dec. 1, 2013, pp. 673-680.
Taylor, “Virtual camera movement: The way of the future?”, American Cinematographer, vol. 77, No. 9, Sep. 1996, pp. 93-100.
Tseng et al., “Automatic 3-D depth recovery from a single urban-scene image”, 2012 Visual Communications and Image Processing, Nov. 27-30, 2012, San Diego, CA, USA, pp. 1-6.
Vaish et al., “Reconstructing Occluded Surfaces Using Synthetic Apertures: Stereo, Focus and Robust Measures”, 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06), vol. 2, Jun. 17-22, 2006, pp. 2331-2338.
Vaish et al., “Synthetic Aperture Focusing Using a Shear-Warp Factorization of the Viewing Transform”, IEEE Workshop on A3DISS, CVPR, 2005, 8 pgs.
Vaish et al., “Using Plane + Parallax for Calibrating Dense Camera Arrays”, IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2004, 8 pgs.
Veilleux, “CCD Gain Lab: The Theory”, University of Maryland, College Park, Observational Astronomy (ASTR 310), Oct. 19, 2006, pp. 1-5 [online], [retrieved on May 13, 2014]. Retrieved from the Internet: <URL: http://www.astro.umd.edu/~veilleux/ASTR310/fall06/ccd_theory.pdf>.
Venkataraman et al., “PiCam: An Ultra-Thin High Performance Monolithic Camera Array”, ACM Transactions on Graphics (TOG), ACM, US, vol. 32, No. 6, Nov. 1, 2013, pp. 1-13.
Vetro et al., “Coding Approaches for End-To-End 3D TV Systems”, Mitsubishi Electric Research Laboratories, Inc., TR2004-137, Dec. 2004, 6 pgs.
Viola et al., “Robust Real-time Object Detection”, Cambridge Research Laboratory, Technical Report Series, Compaq, CRL 2001/01, Feb. 2001, Printed from: http://www.hpl.hp.com/techreports/Compaq-DEC/CRL-2001-1.pdf, 30 pgs.
Vuong et al., “A New Auto Exposure and Auto White-Balance Algorithm to Detect High Dynamic Range Conditions Using CMOS Technology”, Proceedings of the World Congress on Engineering and Computer Science 2008, WCECS 2008, Oct. 22-24, 2008.
Wang, “Calculation Image Position, Size and Orientation Using First Order Properties”, Dec. 29, 2010, OPTI521 Tutorial, 10 pgs.
Wang et al., “Automatic Natural Video Matting with Depth”, 15th Pacific Conference on Computer Graphics and Applications, PG '07, Oct. 29-Nov. 2, 2007, Maui, HI, USA, pp. 469-472.
Wang et al., “Image and Video Matting: A Survey”, Foundations and Trends in Computer Graphics and Vision, vol. 3, No. 2, 2007, pp. 91-175.
Wang et al., “Soft scissors: an interactive tool for realtime high quality matting”, ACM Transactions on Graphics (TOG), Proceedings of ACM SIGGRAPH 2007, vol. 26, Issue 3, Article 9, Jul. 2007, 6 pgs., published Aug. 5, 2007.
Wetzstein et al., “Computational Plenoptic Imaging”, Computer Graphics Forum, 2011, vol. 30, No. 8, pp. 2397-2426.
Wheeler et al., “Super-Resolution Image Synthesis Using Projections Onto Convex Sets in the Frequency Domain”, Proc. SPIE, Mar. 11, 2005, vol. 5674, 12 pgs.
Wieringa et al., “Remote Non-invasive Stereoscopic Imaging of Blood Vessels: First In-vivo Results of a New Multispectral Contrast Enhancement Technology”, Annals of Biomedical Engineering, vol. 34, No. 12, Dec. 2006, pp. 1870-1878, Published online Oct. 12, 2006.
Wikipedia, “Polarizing Filter (Photography)”, retrieved from http://en.wikipedia.org/wiki/Polarizing_filter_(photography) on Dec. 12, 2012, last modified on Sep. 26, 2012, 5 pgs.
Wilburn, Bennett, “High Performance Imaging Using Arrays of Inexpensive Cameras”, Ph.D. Thesis, Stanford University, Dec. 2004, 128 pgs.
Wilburn et al., “High Performance Imaging Using Large Camera Arrays”, ACM Transactions on Graphics, Jul. 2005, vol. 24, No. 3, pp. 1-12.
Wilburn et al., “High-Speed Videography Using a Dense Camera Array”, Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2004. CVPR 2004, vol. 2, Jun. 27-Jul. 2, 2004, pp. 294-301.
Wilburn et al., “The Light Field Video Camera”, Proceedings of Media Processors 2002, SPIE Electronic Imaging, 2002, 8 pgs.
Wippermann et al., “Design and fabrication of a chirped array of refractive ellipsoidal micro-lenses for an apposition eye camera objective”, Proceedings of SPIE, Optical Design and Engineering II, Oct. 15, 2005, 59622C-1-59622C-11.
Wu et al., “A virtual view synthesis algorithm based on image inpainting”, 2012 Third International Conference on Networking and Distributed Computing, Hangzhou, China, Oct. 21-24, 2012, pp. 153-156.
Xu, “Real-Time Realistic Rendering and High Dynamic Range Image Display and Compression”, Dissertation, School of Computer Science in the College of Engineering and Computer Science at the University of Central Florida, Orlando, Florida, Fall Term 2005, 192 pgs.
Yang et al., “A Real-Time Distributed Light Field Camera”, Eurographics Workshop on Rendering (2002), published Jul. 26, 2002, pp. 1-10.
Yang et al., “Superresolution Using Preconditioned Conjugate Gradient Method”, Proceedings of SPIE—The International Society for Optical Engineering, Jul. 2002, 8 pgs.
Yokochi et al., “Extrinsic Camera Parameter Estimation Based-on Feature Tracking and GPS Data”, 2006, Nara Institute of Science and Technology, Graduate School of Information Science, LNCS 3851, pp. 369-378.
Zhang et al., “A Self-Reconfigurable Camera Array”, Eurographics Symposium on Rendering, published Aug. 8, 2004, 12 pgs.
Zhang et al., “Depth estimation, spatially variant image registration, and super-resolution using a multi-lenslet camera”, Proceedings of SPIE, vol. 7705, Apr. 23, 2010, pp. 770505-1-770505-8, XP055113797, ISSN: 0277-786X, DOI: 10.1117/12.852171.
Zheng et al., “Balloon Motion Estimation Using Two Frames”, Proceedings of the Asilomar Conference on Signals, Systems and Computers, IEEE Comp. Soc. Press, US, vol. 2, Nov. 4, 1991, pp. 1057-1061.
Zhu et al., “Fusion of Time-of-Flight Depth and Stereo for High Accuracy Depth Maps”, 2008 IEEE Conference on Computer Vision and Pattern Recognition, Jun. 23-28, 2008, Anchorage, AK, USA, pp. 1-8.
Zomet et al., “Robust Super-Resolution”, IEEE, 2001, pp. 1-6.
International Preliminary Report on Patentability for International Application PCT/US2013/039155, completed Nov. 4, 2014, dated Nov. 13, 2014, 10 pgs.
International Preliminary Report on Patentability for International Application PCT/US2013/046002, issued Dec. 31, 2014, dated Jan. 8, 2015, 6 pgs.
International Preliminary Report on Patentability for International Application PCT/US2013/048772, issued Dec. 31, 2014, dated Jan. 8, 2015, 8 pgs.
International Preliminary Report on Patentability for International Application PCT/US2013/056502, issued Feb. 24, 2015, dated Mar. 5, 2015, 7 pgs.
International Preliminary Report on Patentability for International Application PCT/US2013/069932, issued May 19, 2015, dated May 28, 2015, 12 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/017766, issued Aug. 25, 2015, dated Sep. 3, 2015, 8 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/018084, issued Aug. 25, 2015, dated Sep. 3, 2015, 11 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/018116, issued Sep. 15, 2015, dated Sep. 24, 2015, 12 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/021439, issued Sep. 15, 2015, dated Sep. 24, 2015, 9 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/022118, issued Sep. 8, 2015, dated Sep. 17, 2015, 4 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/022123, issued Sep. 8, 2015, dated Sep. 17, 2015, 4 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/022774, issued Sep. 22, 2015, dated Oct. 1, 2015, 5 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/023762, issued Mar. 2, 2015, dated Mar. 9, 2015, 10 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/024407, issued Sep. 15, 2015, dated Sep. 24, 2015, 8 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/024903, issued Sep. 15, 2015, dated Sep. 24, 2015, 12 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/024947, issued Sep. 15, 2015, dated Sep. 24, 2015, 7 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/025100, issued Sep. 15, 2015, dated Sep. 24, 2015, 4 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/025904, issued Sep. 15, 2015, dated Sep. 24, 2015, 5 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/028447, issued Sep. 15, 2015, dated Sep. 24, 2015, 7 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/030692, issued Sep. 15, 2015, dated Sep. 24, 2015, 6 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/064693, issued May 10, 2016, dated May 19, 2016, 14 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/066229, issued May 24, 2016, dated Jun. 6, 2016, 9 pgs.
International Preliminary Report on Patentability for International Application PCT/US2014/067740, issued May 31, 2016, dated Jun. 9, 2016, 9 pgs.
International Preliminary Report on Patentability for International Application PCT/US2015/019529, issued Sep. 13, 2016, dated Sep. 22, 2016, 9 pgs.
International Preliminary Report on Patentability for International Application PCT/US13/62720, issued Mar. 31, 2015, dated Apr. 9, 2015, 8 pgs.
International Search Report and Written Opinion for International Application No. PCT/US13/46002, completed Nov. 13, 2013, dated Nov. 29, 2013, 7 pgs.
International Search Report and Written Opinion for International Application No. PCT/US13/56065, completed Nov. 25, 2013, dated Nov. 26, 2013, 8 pgs.
International Search Report and Written Opinion for International Application No. PCT/US13/59991, completed Feb. 6, 2014, dated Feb. 26, 2014, 8 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2009/044687, completed Jan. 5, 2010, dated Jan. 13, 2010, 9 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2011/064921, completed Feb. 25, 2012, dated Mar. 6, 2012, 17 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2013/024987, completed Mar. 27, 2013, dated Apr. 15, 2013, 14 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2013/027146, completed Apr. 2, 2013, 11 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2013/039155, completed Jul. 1, 2013, dated Jul. 11, 2013, 11 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2013/048772, completed Oct. 21, 2013, dated Nov. 8, 2013, 6 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2013/056502, completed Feb. 18, 2014, dated Mar. 19, 2014, 7 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2013/069932, completed Mar. 14, 2014, dated Apr. 14, 2014, 12 pgs.
International Search Report and Written Opinion for International Application No. PCT/US2015/019529, completed May 5, 2015, dated Jun. 8, 2015, 11 pgs.
International Search Report and Written Opinion for International Application PCT/US11/36349, dated Aug. 22, 2011, 11 pgs.
International Search Report and Written Opinion for International Application PCT/US13/62720, completed Mar. 25, 2014, dated Apr. 21, 2014, 9 pgs.
International Search Report and Written Opinion for International Application PCT/US14/17766, completed May 28, 2014, dated Jun. 18, 2014, 9 pgs.
International Search Report and Written Opinion for International Application PCT/US14/18084, completed May 23, 2014, dated Jun. 10, 2014, 12 pgs.
International Search Report and Written Opinion for International Application PCT/US14/18116, completed May 13, 2014, dated Jun. 2, 2014, 12 pgs.
International Search Report and Written Opinion for International Application PCT/US14/21439, completed Jun. 5, 2014, dated Jun. 20, 2014, 10 pgs.
International Search Report and Written Opinion for International Application PCT/US14/22118, completed Jun. 9, 2014, dated Jun. 25, 2014, 5 pgs.
International Search Report and Written Opinion for International Application PCT/US14/22774, completed Jun. 9, 2014, dated Jul. 14, 2014, 6 pgs.
International Search Report and Written Opinion for International Application PCT/US14/24407, completed Jun. 11, 2014, dated Jul. 8, 2014, 9 pgs.
International Search Report and Written Opinion for International Application PCT/US14/25100, completed Jul. 7, 2014, dated Aug. 7, 2014, 5 pgs.
International Search Report and Written Opinion for International Application PCT/US14/25904, completed Jun. 10, 2014, dated Jul. 10, 2014, 6 pgs.
Extended European Search Report for EP Application No. 11781313.9, completed Oct. 1, 2013, dated Oct. 8, 2013, 6 pgs.
Extended European Search Report for EP Application No. 13810429.4, completed Jan. 7, 2016, dated Jan. 15, 2016, 6 pgs.
Extended European Search Report for European Application EP12782935.6, completed Aug. 28, 2014, dated Sep. 4, 2014, 7 pgs.
Extended European Search Report for European Application EP12804266.0, completed Jan. 27, 2015, dated Feb. 3, 2015, 6 pgs.
Extended European Search Report for European Application EP12835041.0, completed Jan. 28, 2015, dated Feb. 4, 2015, 7 pgs.
Extended European Search Report for European Application EP13751714.0, completed Aug. 5, 2015, dated Aug. 18, 2015, 8 pgs.
Extended European Search Report for European Application EP13810229.8, completed Apr. 14, 2016, dated Apr. 21, 2016, 7 pgs.
Extended European Search Report for European Application No. 13830945.5, completed Jun. 28, 2016, dated Jul. 7, 2016, 14 pgs.
Extended European Search Report for European Application No. 13841613.6, completed Jul. 18, 2016, dated Jul. 26, 2016, 8 pgs.
Extended European Search Report for European Application No. 14763087.5, completed Dec. 7, 2016, dated Dec. 19, 2016, 9 pgs.
Extended European Search Report for European Application No. 14860103.2, completed Feb. 23, 2017, dated Mar. 3, 2017, 7 pgs.
International Preliminary Report on Patentability for International Application No. PCT/US2012/059813, dated Apr. 15, 2014, 7 pgs.
International Preliminary Report on Patentability for International Application No. PCT/US2013/059991, issued Mar. 17, 2015, dated Mar. 26, 2015, 8 pgs.
International Preliminary Report on Patentability for International Application PCT/US10/057661, issued May 22, 2012, dated May 31, 2012, 10 pgs.
International Preliminary Report on Patentability for International Application PCT/US11/036349, issued Nov. 13, 2012, dated Nov. 22, 2012, 9 pgs.
International Preliminary Report on Patentability for International Application PCT/US2013/056065, issued Feb. 24, 2015, dated Mar. 5, 2015, 4 pgs.
International Preliminary Report on Patentability for International Application PCT/US2013/024987, dated Aug. 12, 2014, 13 pgs.
International Preliminary Report on Patentability for International Application PCT/US2013/027146, dated Aug. 26, 2014, 10 pgs.
Extended European Search Report for European Application No. 18151530.5, completed Mar. 28, 2018, dated Apr. 20, 2018, 11 pgs.
International Preliminary Report on Patentability for International Application No. PCT/US2009/044687, completed Jul. 30, 2010, 9 pgs.
Supplementary European Search Report for European Application 09763194.9, completed Nov. 7, 2011, dated Nov. 29, 2011, 9 pgs.
Related Publications (1)
Number: 20180109782 A1; Date: Apr. 2018; Country: US
Provisional Applications (1)
Number: 61/665,724; Date: Jun. 2012; Country: US
Divisions (1)
Parent: 13/931,724 (Jun. 2013, US); Child: 14/805,412 (US)
Continuations (1)
Parent: 14/805,412 (Jul. 2015, US); Child: 15/797,126 (US)