This background description is provided for the purpose of generally presenting the context of the disclosure. Unless otherwise indicated herein, material described in this section is neither expressly nor impliedly admitted to be prior art to the present disclosure or the appended claims.
Current sensor arrays for capturing images have partially addressed the need for a small form factor in the Z dimension for cameras and other imaging devices. These conventional sensor arrays, however, have various limitations. First, images captured by each sensor of the array must be combined in some manner through computational effort to construct the final image, which has varied success and requires computing resources. Second, this construction of the final image can be scene-dependent, meaning that some scenes result in relatively poor image quality. Third, these conventional sensor arrays often struggle to provide high-resolution images, especially if there are any flaws in the sensors or lenses.
Apparatuses of and techniques using an asymmetric sensor array for capturing images are described with reference to the following drawings. The same numbers are used throughout the drawings to reference like features and components:
Conventional sensor arrays use an array of equivalent image sensors to realize a final image. These sensor arrays enable a camera to have a low Z-height relative to the quality of the final image. Compared to a single sensor that provides a similar image quality, for example, sensor arrays have a low Z-height. This is due to a relationship between sensor size and the Z-height of the lens that focuses the image onto the sensor. Thus, a four-megapixel single sensor requires, assuming similar lens characteristics, a much taller Z-height than an array of four one-megapixel sensors. Each of the four one-megapixel sensors is smaller and thus uses a shorter Z-height.
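To make this relationship concrete, the following is a minimal sketch assuming a simple pinhole model, a fixed field of view, and hypothetical sensor widths (at the same pixel pitch, a four-megapixel sensor is twice as wide as a one-megapixel sensor); none of these numbers come from the disclosure itself:

```python
import math

def focal_length_mm(sensor_width_mm: float, fov_deg: float) -> float:
    """Focal length needed to cover a sensor of the given width at the
    given horizontal field of view; Z-height scales with this value."""
    return sensor_width_mm / (2 * math.tan(math.radians(fov_deg) / 2))

# Hypothetical widths: same pixel pitch, so a 4 MP sensor is 2x as wide.
print(focal_length_mm(4.0, 70.0))  # 4 MP sensor: ~2.9 mm focal length
print(focal_length_mm(2.0, 70.0))  # 1 MP sensor: ~1.4 mm focal length
```

Halving the sensor's linear dimension halves the required focal length, which is why an array of four small sensors sits lower than one large sensor of equal total pixel count.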
These conventional sensor arrays, however, have various limitations, such as failing to realize sharp optics, depth of color, scene-independent image reconstruction, or low-light sensitivity.
Consider, for example, a conventional sensor array having a 2×2 grid of sensors, the sensors having red, green, green, and blue pixels to capture images. Each of the four sensors in the array includes small repeating squares having four pixels each: one pixel that senses red, one blue, and two green. The two green are used to determine resolution (e.g., sharpness) in addition to the color green. Mathematically, a one-megapixel sensor is then capable of one-half-megapixel resolution. Through various computational processes, which are not the topic of this disclosure, this one-half-megapixel resolution can be interpolated to improve the resolution (again, with varied success) by about 20%. Thus, the one-megapixel red, green, green, blue sensor can result in a final resolution of about 0.7 megapixels, though this final resolution has limitations as noted above.
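The arithmetic above can be summarized in a short sketch. The 20% figure is read here as a fraction of the sensor's total pixel count, which is the reading consistent with the approximately 0.7-megapixel result; this interpretation is an assumption, as the disclosure does not spell out the interpolation:

```python
sensor_mp = 1.0                # one-megapixel R, G, G, B (Bayer) sensor
base_mp = sensor_mp * 0.5      # two green pixels of four carry the detail
gain_mp = sensor_mp * 0.2      # ~20% of pixel count recovered by interpolation
print(base_mp + gain_mp)       # ~0.7 MP effective final resolution
```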
To maximize this resolution, conventional sensor arrays use small color pixels to increase the number of pixels in a sensor, and thus keep the size of the sensor down, which in turn keeps the Z-height relatively low. Small color pixels, however, often fail to handle noise well, as each pixel's light-gathering ability is limited by its size, and thus small pixels have poorer signal-to-noise ratios than large pixels. Conventional sensor arrays often forgo use of large pixels, however, because doing so increases the Z-height or reduces the final resolution of the image.
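The pixel-size effect can be illustrated with a shot-noise-limited model, ignoring read noise and dark current; the photon counts below are hypothetical and used only to show the scaling:

```python
import math

def shot_noise_snr(photons: float) -> float:
    """Photon shot noise grows as sqrt(signal), so SNR = sqrt(photons)."""
    return photons / math.sqrt(photons)

# A pixel with four times the area collects roughly four times the photons.
print(shot_noise_snr(1_000))   # small pixel: ~31.6
print(shot_noise_snr(4_000))   # large pixel: ~63.2 (double the SNR)
```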
Consider instead, however, an example asymmetric sensor array for capturing images. This asymmetric sensor array, instead of using small color pixels and equivalent sensors, uses a central monochrome sensor for resolution and peripheral, relatively large-pixel color sensors for color. The central monochrome-pixel sensor provides high resolution using small pixels. The peripheral, large-pixel color sensors provide color and, due to their size, have excellent signal-to-noise ratios, and thus provide truer color, better color in low-light situations, or other benefits described below. While these peripheral color sensors have lower resolution than the central sensor, the human eye distinguishes less detail in color than it does in greyscale (e.g., the image's resolution or sharpness). Therefore, this asymmetric sensor array provides a final image that conforms to the human eye's characteristics, with high sharpness and truer color, as well as less sensitivity to low light and other adverse scene characteristics.
The following discussion first describes an operating environment, then example asymmetric sensor arrays, then a detailed description of an example imaging device, followed by techniques that may be employed in this environment and imaging device, and ends with an example electronic device.
Example Environment
Asymmetric sensor array 110 includes a main sensor 112 having a main resolution and angled at a main angle 114. Here main angle 114, as shown relative to object 116 of scene 104, is at ninety degrees. Asymmetric sensor array 110 also includes multiple peripheral sensors 118. These peripheral sensors 118 have peripheral resolutions or colors that are asymmetric to the main resolution or colors of main sensor 112. Sensors can be asymmetric to each other by having different numbers of pixels, different color sensing of pixels, different pixel sizes, or different sensor sizes.
Peripheral sensors 118 (shown at 118-1 and 118-2) can be positioned at peripheral angles 120 (shown as peripheral angles 120-1 and 120-2, respectively), which are different from main angle 114 of main sensor 112. This difference in angles enables depth information to be created for an image of a scene based on peripheral data sensed by one of the peripheral sensors, main data sensed by the main sensor of the same scene, different peripheral data sensed by a different peripheral sensor of the same scene, or some combination of these. Thus, peripheral sensor 118-1, at peripheral angle 120-1 of 5° off of main angle 114, captures an image of object 116 that differs from the image of object 116 captured by main sensor 112 at main angle 114.
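How differing viewpoints yield depth can be sketched with standard two-view triangulation; the focal length, baseline, and disparity values below are hypothetical, and a fixed angular offset such as the 5° above would appear as a constant disparity bias that calibration removes:

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Two-view triangulation: depth = f * B / d."""
    return focal_px * baseline_m / disparity_px

# Hypothetical values: 1200 px focal length, 2 cm sensor separation.
print(depth_from_disparity(1200.0, 0.02, 12.0))  # object at ~2.0 meters
```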
Consider
As mentioned above, main sensor 112 may include various resolutions and types. In the example of
This illustration shows resolutions of main sensor 112 and peripheral sensors 118 in terms of a number and size of squares, which are here assumed to be pixels. While simplified for visual clarity (showing millions of pixels is not possible for this type of illustration), main sensor 112 includes four times the number of pixels of each of peripheral sensors 118, and peripheral sensors 118 include pixels that are four times as large as those of main sensor 112.
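These proportions can be captured in a simple configuration sketch. The pixel counts and pitches are hypothetical, and "four times as large" is read here as referring to pixel area (i.e., twice the linear pitch):

```python
from dataclasses import dataclass

@dataclass
class SensorConfig:
    pixels_wide: int
    pixels_tall: int
    pixel_pitch_um: float    # linear pixel size
    monochrome: bool
    angle_deg: float         # offset from main angle 114

# Main sensor: 4x the pixel count; peripheral pixels: 4x the area.
main = SensorConfig(2000, 2000, 1.0, monochrome=True, angle_deg=0.0)
peripherals = [SensorConfig(1000, 1000, 2.0, False, a) for a in (-5.0, 5.0)]
```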
The estimation of a depth map for images (e.g., a per-pixel estimation of the distance between a camera and a scene) improves with image SNR. Given this, and the use of peripheral sensors 118 at some angle relative to main sensor 112 for depth mapping, the larger size of the pixels of peripheral sensors 118 can improve depth mapping by improving SNR. In more detail, smaller pixels have less capacity to absorb photons and thus produce a weaker signal relative to noise. Therefore, the larger pixels allow for a better signal-to-noise ratio, which aids depth mapping, accurate color representation, and imaging of low-light scenes.
In addition to the example shown in
Each of these asymmetric sensor arrays 302 and 110 is an example, rather than a limitation on the types of asymmetric sensor arrays contemplated by this disclosure. This description now turns to lens stacks 108.
As noted for
While not shown, the various imagers may include, or imaging device 102 may include separate from the various imagers, an auto-focus device capable of determining a focus of the main sensor in part using depth data captured by the peripheral sensors. This is not required, as in some cases no auto-focus is needed. Use of an auto-focus device can depend on a desired image quality and a size and resolution of sensors used to deliver this image quality. This can be balanced, however, against an undesirable focus lag of many current auto-focus mechanisms. The asymmetric sensor array, however, can reduce this focus lag by decreasing the iterative adjust-and-sense operation of current auto-focus systems. The iterative adjust-and-sense operation is decreased by using depth information captured by the peripheral sensors to guide the auto-focus system, thereby reducing the number of iterations required to achieve focus.
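One way such depth-guided focusing might look, as a sketch only; the calibration mapping, focus range, and region-of-interest handling below are all hypothetical rather than taken from the disclosure:

```python
import statistics

def lens_position_for_depth(depth_m: float) -> int:
    """Hypothetical calibration: map subject distance to an actuator step.
    A real module would use a measured per-unit calibration table."""
    near, far = 0.1, 5.0                  # assumed focus range, meters
    d = min(max(depth_m, near), far)
    return round(1023 * (1 / d - 1 / far) / (1 / near - 1 / far))

def autofocus_seed(depth_map, roi):
    """Start the adjust-and-sense loop near focus by seeding the lens
    position from peripheral-sensor depth data for the subject region."""
    subject_depth = statistics.median(depth_map[y][x] for (x, y) in roi)
    return lens_position_for_depth(subject_depth)
```

Seeding the starting position this way means the auto-focus system begins near the correct focus rather than sweeping the full range, which is the iteration reduction described above.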
Furthermore, these various imagers can be structured to be capable of focusing on objects in scenes beyond about two meters from the lens of the main sensor without a focusing mechanism. If focusing on objects within two meters is desired, a simpler optical system that adjusts focus only for near-field scenes (objects within one to two meters) can be used. This simpler optical system can be a near-far toggle, for example.
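This fixed-focus behavior corresponds to setting the lens at its hyperfocal distance. A sketch with hypothetical small-module numbers (not drawn from the disclosure) shows how a roughly two-meter near limit can fall out:

```python
def hyperfocal_m(focal_mm: float, f_number: float, coc_mm: float) -> float:
    """Hyperfocal distance H = f^2 / (N * c) + f; a lens fixed at H is
    acceptably sharp from H/2 to infinity (c is the circle of confusion)."""
    return (focal_mm ** 2 / (f_number * coc_mm) + focal_mm) / 1000.0

h = hyperfocal_m(focal_mm=3.0, f_number=2.4, coc_mm=0.001)
print(h, h / 2)  # H ~ 3.75 m; sharp from ~1.9 m to infinity
```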
Having generally described asymmetric sensor arrays and imagers, this discussion now turns to
In some cases, imaging device 102 is in communication with, but may not necessarily include, imager 106 or elements thereof. Captured images are then received by imaging device 102 from imager 106 via the one or more I/O ports 516. I/O ports 516 can include a variety of ports, such as by way of example and not limitation, high-definition multimedia interface (HDMI), digital video interface (DVI), display port, fiber-optic or light-based, audio ports (e.g., analog, optical, or digital), USB ports, serial advanced technology attachment (SATA) ports, peripheral component interconnect (PCI) express-based ports or card slots, serial ports, parallel ports, or other legacy ports. Imaging device 102 may also include network interface(s) 518 for communicating data over wired, wireless, or optical networks. By way of example and not limitation, network interface 518 may communicate data over a local-area-network (LAN), a wireless local-area-network (WLAN), a personal-area-network (PAN), a wide-area-network (WAN), an intranet, the Internet, a peer-to-peer network, a point-to-point network, a mesh network, and the like.
Example Methods
The following discussion describes methods by which techniques are implemented to enable use of asymmetric sensor arrays for capturing images. These methods can be implemented utilizing the previously described environment and example sensor arrays and imagers, such as shown in
At 604, peripheral sensor data including multiple color images of the scene are received from peripheral sensors. One or more of the multiple color images can be sensed at an angle different from the angle at which the sensor data of the main sensor is sensed. As noted above, this different angle enables creation of a depth map along with other uses also described above. Also, in some example asymmetric sensor arrays, such as asymmetric sensor arrays 110, 302-1, and 302-2 (but not 302-3), of
Continuing the ongoing example, peripheral sensor data from peripheral sensors 118 include two low-resolution color images of scene 104, both of which are sensed at angles different from those of main sensor 112, namely by five degrees (see
At 606, a depth map is determined based on the multiple color images. This depth map includes information relating to distances of surfaces in a scene (such as object 116 of scene 104 of
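The disclosure does not prescribe a particular stereo algorithm for step 606; as one conventional possibility, a minimal block-matching sketch over rectified grayscale versions of two peripheral images, with hypothetical parameters:

```python
import numpy as np

def depth_map(left: np.ndarray, right: np.ndarray, focal_px: float,
              baseline_m: float, max_disp: int = 32, block: int = 7):
    """For each block in `left`, find the horizontal shift into `right`
    minimizing sum-of-absolute-differences, then triangulate depth."""
    left = left.astype(np.float32)    # avoid unsigned-integer wraparound
    right = right.astype(np.float32)
    h, w = left.shape
    half = block // 2
    depth = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            costs = [np.abs(patch - right[y - half:y + half + 1,
                                          x - d - half:x - d + half + 1]).sum()
                     for d in range(1, max_disp)]
            d = 1 + int(np.argmin(costs))
            depth[y, x] = focal_px * baseline_m / d   # depth = f * B / d
    return depth
```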
At 608, a final image is constructed using the depth map, the multiple color images, and the high-resolution image. Image manager 512, for example, may “paint” the low-resolution color images from peripheral sensors 118 onto the high-resolution, monochromatic image from main sensor 112, in part with use of the depth map. By so doing, methods 600 create a final image having object 116 in focus, with high sharpness and accurate color and depth of color, in many cases using fewer computational resources or completing more quickly (in focusing or processing).
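A minimal sketch of this painting step, assuming the low-resolution color image has already been warped into the main sensor's viewpoint using the depth map; only the chroma transfer is shown, and the nearest-neighbor upsampling is an illustrative simplification, not the disclosure's method:

```python
import numpy as np

def paint_color(mono_hi: np.ndarray, color_lo: np.ndarray) -> np.ndarray:
    """Transfer low-resolution chroma onto the sharp monochrome image:
    upsample the color image, keep its per-pixel color ratios, and
    replace its luminance with the monochrome measurement."""
    h, w = mono_hi.shape
    ys = np.arange(h) * color_lo.shape[0] // h    # nearest-neighbor rows
    xs = np.arange(w) * color_lo.shape[1] // w    # nearest-neighbor cols
    color_up = color_lo[ys][:, xs].astype(np.float32)   # (h, w, 3)
    luma = color_up.mean(axis=2, keepdims=True) + 1e-6
    return np.clip(color_up / luma * mono_hi[..., None], 0, 255)
```

This split plays to the human eye's characteristics noted above: fine detail comes entirely from the monochrome sensor, while the lower-resolution but high-SNR color data supplies hue.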
Example Electronic Device
Electronic device 700 includes communication transceivers 702 that enable wired and/or wireless communication of device data 704, such as received data, transmitted data, or sensor data as described above. Example communication transceivers include NFC transceivers, WPAN radios compliant with various IEEE 802.15 (Bluetooth™) standards, WLAN radios compliant with any of the various IEEE 802.11 (WiFi™) standards, WWAN (3GPP-compliant) radios for cellular telephony, wireless metropolitan area network (WMAN) radios compliant with various IEEE 802.16 (WiMAX™) standards, and wired local area network (LAN) Ethernet transceivers.
Electronic device 700 may also include one or more data input ports 706 via which any type of data, media content, and/or inputs can be received, such as user-selectable inputs, messages, music, television content, recorded video content, and any other type of audio, video, and/or image data received from any content and/or data source (e.g., other image devices or imagers). Data input ports 706 may include USB ports, coaxial cable ports, and other serial or parallel connectors (including internal connectors) for flash memory, DVDs, CDs, and the like. These data input ports may be used to couple the electronic device to components (e.g., imager 106), peripherals, or accessories such as keyboards, microphones, or cameras.
Electronic device 700 of this example includes processor system 708 (e.g., any of application processors, microprocessors, digital-signal-processors, controllers, and the like), or a processor and memory system (e.g., implemented in a SoC), which process (i.e., execute) computer-executable instructions to control operation of the device. Processor system 708 (processor(s) 708) may be implemented as an application processor, embedded controller, microcontroller, and the like. A processing system may be implemented at least partially in hardware, which can include components of an integrated circuit or on-chip system, digital-signal processor (DSP), application-specific integrated circuit (ASIC), field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon and/or other hardware.
Alternatively or in addition, electronic device 700 can be implemented with any one or combination of software, hardware, firmware, or fixed logic circuitry that is implemented in connection with processing and control circuits, which are generally identified at 710 (processing and control 710). Hardware-only devices in which an asymmetric sensor array for capturing images may be embodied include those that convert, without computer processors, sensor data into voltage signals by which to control focusing systems (e.g., focusing module 514).
Although not shown, electronic device 700 can include a system bus, crossbar, or data transfer system that couples the various components within the device. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
Electronic device 700 also includes one or more memory devices 712 that enable data storage, examples of which include random access memory (RAM), non-volatile memory (e.g., read-only memory (ROM), flash memory, EPROM, EEPROM, etc.), and a disk storage device. Memory device(s) 712 provide data storage mechanisms to store the device data 704, other types of information and/or data, and various device applications 720 (e.g., software applications). For example, operating system 714 can be maintained as software instructions within memory device 712 and executed by processors 708. In some aspects, image manager 512 is embodied in memory devices 712 of electronic device 700 as executable instructions or code. Although represented as a software implementation, image manager 512 may be implemented as any form of a control application, software application, signal-processing and control module, or hardware or firmware installed on imager 106.
Electronic device 700 also includes audio and/or video processing system 716 that processes audio data and/or passes through the audio and video data to audio system 718 and/or to display system 722 (e.g., a screen of a smart phone or camera). Audio system 718 and/or display system 722 may include any devices that process, display, and/or otherwise render audio, video, display, and/or image data. Display data and audio signals can be communicated to an audio component and/or to a display component via an RF (radio frequency) link, S-video link, HDMI (high-definition multimedia interface), composite video link, component video link, DVI (digital video interface), analog audio connection, or other similar communication link, such as media data port 724. In some implementations, audio system 718 and/or display system 722 are external components to electronic device 700. Alternatively or additionally, display system 722 can be an integrated component of the example electronic device, such as part of an integrated touch interface. Electronic device 700 includes, or has access to, imager 106, which includes lens stacks 108 and asymmetric sensor array 110 (or 302 or 410). Sensor data is received from imager 106 and/or asymmetric sensor array 110 by image manager 512, here shown stored in memory devices 712, which when executed by processor 708 constructs a final image as noted above.
Although embodiments of an asymmetric sensor array for capturing images have been described in language specific to features and/or methods, the subject of the appended claims is not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as example implementations of an asymmetric sensor array for capturing images.
This application claims priority under 35 U.S.C. §119(e) to U.S. Provisional Patent Application No. 61/856,449, entitled “Asymmetric Array Camera” and filed on Jul. 19, 2013, the disclosure of which is incorporated in its entirety by reference herein.