This disclosure relates generally to image or video capture devices, including processing of image frames from multiple image sensors by an image signal processor.
Many devices include multiple image sensors that may be used for capturing one or more image frames. For example, a smartphone or tablet includes multiple image sensors to be used in generating images or video for different imaging applications. A plurality of image signal processors, with each image signal processor coupled to a different image sensor, processes the image frames from the multiple image sensors. The processed image frames may then be used for the imaging application (such as generating user photographs, recording video, performing augmented reality operations, and so on).
This Summary is provided to introduce in a simplified form a selection of concepts that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to limit the scope of the claimed subject matter.
Some aspects of the present disclosure relate to processing image frames from multiple image sensors by a single image signal processor. An example device includes a memory and an image signal processor coupled to the memory. The image signal processor is configured to provide a first trigger to a first image sensor (the first image sensor being coupled to the image signal processor), receive a first frame from the first image sensor at a first time in response to the first trigger being received by the first image sensor, process the first frame, provide a second trigger to a second image sensor (the second image sensor being coupled to the image signal processor), receive a second frame from the second image sensor at a second time in response to the second trigger being received by the second image sensor (with the second time subsequent to the first time), and process the second frame. The memory is configured to store the processed first frame and the processed second frame received from the image signal processor.
In some implementations, capture of the first frame by the first image sensor is based on when the first image sensor receives the first trigger, and capture of the second frame by the second image sensor is based on when the second image sensor receives the second trigger. The image signal processor may be configured to determine when to provide the second trigger to the second image sensor based on an imaging parameter associated with the first frame. The first image sensor begins capture of the first frame upon receiving the first trigger, and the second image sensor begins capture of the second frame upon receiving the second trigger. In some implementations, the imaging parameter includes one or more of an exposure window size for the first frame or a size of the first image sensor for readout for the first frame.
The second trigger may be associated with a blanking factor to delay readout of the second frame by the second image sensor to the image signal processor. In some implementations, the first frame and the second frame are captured concurrently, and the second frame is prevented from being received at the same time as the first frame at the image signal processor based on the delayed readout of the second frame by the second image sensor. The image signal processor may be configured to provide an indication of the blanking factor to the second image sensor, wherein the second image sensor applies the blanking factor to delay readout based on the indication.
In some implementations, the image signal processor is configured to provide a third trigger to the first image sensor, receive a third frame from the first image sensor at a third time in response to the third trigger being received by the first image sensor (with the third time subsequent to the first time and different than the second time), and process the third frame. The memory may be configured to store the processed third frame from the image signal processor. The image signal processor may also be configured to provide a fourth trigger to the second image sensor. A timing between the first trigger and the third trigger is associated with a first frame rate of the first image sensor, and a timing between the second trigger and the fourth trigger is associated with a second frame rate of the second image sensor different from the first frame rate. The image signal processor may also be configured to receive a fourth frame from the second image sensor at a fourth time in response to the fourth trigger being received by the second image sensor (with the fourth time subsequent to the second time and the third time) and process the fourth frame. The memory may be configured to store the processed fourth frame from the image signal processor.
In some implementations, the device includes one or more processors coupled to the memory, and the one or more processors are configured to obtain the processed first frame and the processed second frame from memory. The device may also include the first image sensor to capture the first frame and the second image sensor to capture the second frame. In some implementations, the device includes a display configured to display the processed first frame and the processed second frame.
An example method includes providing, by an image signal processor, a first trigger to a first image sensor (the first image sensor being coupled to the image signal processor), receiving, by the image signal processor, a first frame from the first image sensor at a first time in response to the first trigger being received by the first image sensor, processing the first frame, providing, by the image signal processor, a second trigger to a second image sensor (the second image sensor being coupled to the image signal processor), receiving, by the image signal processor, a second frame from the second image sensor at a second time in response to the second trigger being received by the second image sensor (with the second time subsequent to the first time), and processing the second frame.
In some implementations, capture of the first frame by the first image sensor is based on when the first image sensor receives the first trigger, and capture of the second frame by the second image sensor is based on when the second image sensor receives the second trigger. The method may include determining when to provide the second trigger to the second image sensor based on an imaging parameter associated with the first frame. The first image sensor begins capture of the first frame upon receiving the first trigger, and the second image sensor begins capture of the second frame upon receiving the second trigger. In some implementations, the imaging parameter includes one or more of an exposure window size for the first frame or a size of the first image sensor for readout for the first frame.
The second trigger may be associated with a blanking factor to delay readout of the second frame by the second image sensor to the image signal processor. In some implementations, the first frame and the second frame are captured concurrently, and the second frame is prevented from being received at the same time as the first frame at the image signal processor based on the delayed readout of the second frame by the second image sensor. The method may include providing, by the image signal processor, an indication of the blanking factor to the second image sensor, wherein the second image sensor applies the blanking factor to delay readout based on the indication.
In some implementations, the method includes providing, by the image signal processor, a third trigger to the first image sensor, receiving, by the image signal processor, a third frame from the first image sensor at a third time in response to the third trigger being received by the first image sensor (with the third time subsequent to the first time and different than the second time), and processing the third frame. The method may include providing, by the image signal processor, a fourth trigger to the second image sensor. A timing between the first trigger and the third trigger is associated with a first frame rate of the first image sensor, and a timing between the second trigger and the fourth trigger is associated with a second frame rate of the second image sensor different from the first frame rate. The method may also include receiving, by the image signal processor, a fourth frame from the second image sensor at a fourth time in response to the fourth trigger being received by the second image sensor (with the fourth time subsequent to the second time and the third time) and processing the fourth frame.
An example non-transitory, computer-readable medium stores instructions that, when executed by one or more processors of a device, cause the device to provide, by an image signal processor, a first trigger to a first image sensor (the first image sensor being coupled to the image signal processor), receive, by the image signal processor, a first frame from the first image sensor at a first time in response to the first trigger being received by the first image sensor, process the first frame, provide, by the image signal processor, a second trigger to a second image sensor (the second image sensor being coupled to the image signal processor), receive, by the image signal processor, a second frame from the second image sensor at a second time in response to the second trigger being received by the second image sensor (with the second time subsequent to the first time), and process the second frame.
In some implementations, capture of the first frame by the first image sensor is based on when the first image sensor receives the first trigger, and capture of the second frame by the second image sensor is based on when the second image sensor receives the second trigger. Execution of the instructions may also cause the device to determine when to provide the second trigger to the second image sensor based on an imaging parameter associated with the first frame. The first image sensor begins capture of the first frame upon receiving the first trigger, and the second image sensor begins capture of the second frame upon receiving the second trigger. In some implementations, the imaging parameter includes one or more of an exposure window size for the first frame or a size of the first image sensor for readout for the first frame.
The second trigger may be associated with a blanking factor to delay readout of the second frame by the second image sensor to the image signal processor. In some implementations, the first frame and the second frame are captured concurrently, and the second frame is prevented from being received at the same time as the first frame at the image signal processor based on the delayed readout of the second frame by the second image sensor. Execution of the instructions may also cause the device to provide, by the image signal processor, an indication of the blanking factor to the second image sensor, wherein the second image sensor applies the blanking factor to delay readout based on the indication.
In some implementations, execution of the instructions causes the device to provide, by the image signal processor, a third trigger to the first image sensor, receive, by the image signal processor, a third frame from the first image sensor at a third time in response to the third trigger being received by the first image sensor (with the third time subsequent to the first time and different than the second time), and process the third frame. Execution of the instructions may cause the device to provide, by the image signal processor, a fourth trigger to the second image sensor. A timing between the first trigger and the third trigger is associated with a first frame rate of the first image sensor, and a timing between the second trigger and the fourth trigger is associated with a second frame rate of the second image sensor different from the first frame rate. Execution of the instructions may also cause the device to receive, by the image signal processor, a fourth frame from the second image sensor at a fourth time in response to the fourth trigger being received by the second image sensor (with the fourth time subsequent to the second time and the third time) and process the fourth frame.
Another example device for image signal processing includes means for providing, by an image signal processor, a first trigger to a first image sensor (the first image sensor being coupled to the image signal processor), means for receiving, by the image signal processor, a first frame from the first image sensor at a first time in response to the first trigger being received by the first image sensor, means for processing the first frame, means for providing, by the image signal processor, a second trigger to a second image sensor (the second image sensor being coupled to the image signal processor), means for receiving, by the image signal processor, a second frame from the second image sensor at a second time in response to the second trigger being received by the second image sensor (with the second time subsequent to the first time), and means for processing the second frame.
In some implementations, capture of the first frame by the first image sensor is based on when the first image sensor receives the first trigger, and capture of the second frame by the second image sensor is based on when the second image sensor receives the second trigger. The device may include means for determining when to provide the second trigger to the second image sensor based on an imaging parameter associated with the first frame. The first image sensor begins capture of the first frame upon receiving the first trigger, and the second image sensor begins capture of the second frame upon receiving the second trigger. In some implementations, the imaging parameter includes one or more of an exposure window size for the first frame or a size of the first image sensor for readout for the first frame.
The second trigger may be associated with a blanking factor to delay readout of the second frame by the second image sensor to the image signal processor. In some implementations, the first frame and the second frame are captured concurrently, and the second frame is prevented from being received at the same time as the first frame at the image signal processor based on the delayed readout of the second frame by the second image sensor. The device may include means for providing, by the image signal processor, an indication of the blanking factor to the second image sensor, wherein the second image sensor applies the blanking factor to delay readout based on the indication.
In some implementations, the device includes means for providing, by the image signal processor, a third trigger to the first image sensor, means for receiving, by the image signal processor, a third frame from the first image sensor at a third time in response to the third trigger being received by the first image sensor (with the third time subsequent to the first time and different than the second time), and means for processing the third frame. The device may include means for providing, by the image signal processor, a fourth trigger to the second image sensor. A timing between the first trigger and the third trigger is associated with a first frame rate of the first image sensor, and a timing between the second trigger and the fourth trigger is associated with a second frame rate of the second image sensor different from the first frame rate. The device may also include means for receiving, by the image signal processor, a fourth frame from the second image sensor at a fourth time in response to the fourth trigger being received by the second image sensor (with the fourth time subsequent to the second time and the third time) and means for processing the fourth frame.
Aspects of the present disclosure are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements.
Aspects of the present disclosure may be used for image capture and processing devices including or coupled to multiple image sensors. Some aspects include processing image frames from the multiple image sensors by a single image signal processor.
Many devices include multiple image sensors, and the image sensors may capture frames during a same time period. For example, a smartphone may include a configuration of three cameras or a configuration of two cameras on a backside of the device, and the configuration of cameras may be used to capture image frames for a bokeh effect in images, a portrait mode for imaging, stereoscopic imaging, or other applications utilizing multiple image sensors. In another example, all of the image sensors of the configuration may be initialized to read out frames concurrently. Devices with multiple image sensors include one or more image signal processors to process the images provided by the image sensors. The one or more image signal processors provide the processed image frames to a memory, and an application processor may access the memory to obtain the processed image frames for further processing (such as for encoding or other manipulation).
As used herein, an image sensor may refer to the image sensor itself and any other suitable components coupled to the image sensor. For example, an image sensor may also refer to other components of a camera, including a shutter, buffer, or other readout circuitry. The image sensor may further refer to an analog front end or other circuitry for converting analog signals to digital representations for the frame. Therefore, the term “image sensor” herein may refer to any suitable components for capture and readout of an image frame to an image signal processor.
If a device includes a dedicated image signal processor for each image sensor and the number of image sensors per device increases, the additional image signal processors require additional space, and operation of the additional image signal processors requires additional power resources. The device may thus include fewer image signal processors than image sensors. In this manner, one image signal processor processes frames from two or more image sensors. However, an image signal processor is configured to receive one image frame and begin processing the image frame before receiving the next image frame. If multiple image sensors are coupled to the same image signal processor, frame data from the multiple image sensors may be provided to the image signal processor at the same time. For example, two image sensors may perform a readout of frames at the same time, and the data read out from the image sensors are thus provided to the image signal processor at the same time (even though the image signal processor is only able to receive one frame at a time for processing).
For many devices, one of the image sensors is a master and the other image sensors are slaves to the master image sensor. The master image sensor may synchronize frame captures (such as a start of exposure (SoE) or start of frame (SoF)) among the image sensors via a synchronization signal provided to each of the image sensors, and the image sensors may read out frame data to an image signal processor concurrently as a result of the synchronized frame captures. To prevent frame data from different image sensors from being received at the same time by an image signal processor, a device may include coordination circuitry to receive the frames from the different image sensors that are read out concurrently and then provide the frames sequentially to an image signal processor.
At time 204, the coordination circuitry 106 coordinates sending the frames received from the image sensors 102 and 104 to the image signal processor 108 for processing. The coordination is in response to receiving the frames from the image sensors (202). The coordination circuitry 106 may include a first buffer or other storage element to temporarily store frame data from the first image sensor 102, and the coordination circuitry 106 may include a second buffer or other storage element to temporarily store frame data from the second image sensor 104. The coordination circuitry 106 also includes logic to determine the order in which to provide the frames to the image signal processor 108, when to provide each frame to the image signal processor 108, and other operations to prevent any portion of the frames from the different image sensors from being provided to the image signal processor 108 concurrently. The circuitry 106 may also include one or more switches and decision logic to determine which buffer is to receive the current incoming frame data from an image sensor.
At time 206, the coordination circuitry 106 sends the frame from the first image sensor 102 to the image signal processor 108. The coordination circuitry 106 also prevents the frame from the second image sensor 104 from being sent to the image signal processor 108 at time 206. The image signal processor 108 receives the frame captured by the first image sensor 102 and begins processing the frame at time 208. The coordination circuitry 106 completes sending the frame to the image signal processor 108 before beginning to send the next frame to the image signal processor. In this manner, the frame captured by the second image sensor 104 is not sent to the image signal processor 108 before sending the previous image frame to the image signal processor 108 is completed. The coordination circuitry 106 may also delay sending the frame captured by the second image sensor 104 until the image signal processor 108 completes processing the previous frame (such as illustrated in
With the frame from the first image sensor 102 being completely sent to the image signal processor 108 (and the image signal processor 108 completing processing of the frame), the coordination circuitry 106 sends the frame captured by the second image sensor 104 to the image signal processor 108 at time 212. The image signal processor 108 begins processing the received frame at time 214, and the next frame is delayed from being sent by the coordination circuitry 106 to the image signal processor 108 at this time. Such a process may continue for any number of frames to be processed, and the coordination circuitry 106 may be expanded to receive frames from additional image sensors.
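For illustration only, the buffering behavior of the coordination circuitry described above may be modeled in software. The following Python sketch is a simplified model (real coordination circuitry is dedicated hardware with buffers, switches, and decision logic); the class and method names are hypothetical.

```python
from collections import deque

class CoordinationCircuitry:
    """Simplified software model of coordination circuitry: buffer frames
    that are read out concurrently and forward them one at a time."""

    def __init__(self, num_sensors):
        # One dedicated buffer per image sensor, as described above.
        self.buffers = [deque() for _ in range(num_sensors)]

    def receive(self, sensor_index, frame):
        # Frames from different sensors may arrive concurrently; each is
        # stored in its sensor's buffer until the ISP can accept it.
        self.buffers[sensor_index].append(frame)

    def next_frame_for_isp(self):
        # Forward at most one buffered frame at a time so the image signal
        # processor never receives portions of two frames concurrently.
        for buffer in self.buffers:
            if buffer:
                return buffer.popleft()
        return None
```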
A problem with the inclusion of coordination circuitry between the image signal processor and the multiple image sensors is the space required for the coordination circuitry. For example, the circuitry requires logic, buffers, and other integrated circuits to perform the coordination. In addition, as the number of image sensors to be coupled to a single image signal processor increases, the space requirements for the coordination circuitry increase exponentially. Furthermore, the coordination circuitry is always on during operation of the image sensors and thus requires power resources to operate.
Another problem with coordination circuitry is that a delay between readout and processing of frames may exist. For example, if two image sensors concurrently read out frames to the same image signal processor, only one of the frames can be processed when received by the image signal processor, and the other frame is delayed in being processed. The coordination circuitry may also introduce an inherent latency caused by the frames passing through one or more components (such as a buffer) associated with a latency for data passing through the component. Such delays may be too large for some imaging applications, especially for image sensors operating at a high frame rate. High frame rate (HFR) video and other HFR imaging applications (including near real time depth enhancement or other applications) may not be performed as a result of the latencies, or the latencies may cause a significant lag in video, depth enhancement, or other imaging applications, negatively impacting the user experience.
A further problem with coordination circuitry (and with one of the image sensors being a master to the other image sensors) is that the image sensors are required to operate at a static frame rate. In one example, if a video's frame rate is to be adjusted during capture, the frame rate of the image sensor must be adjusted. Adjusting the frame rate requires a pause in operation of the image sensor, which causes a pause or gap in the video.
In some implementations, an image signal processor is configured to coordinate the reception of frames from multiple image sensors coupled to the image signal processor. In this manner, frames are received sequentially by the image signal processor even if frames are captured concurrently by the image sensors. To coordinate the reception of frames, the image signal processor coordinates the readout of the frames by the image sensors to the image signal processor. Coordinating frame readout may be performed by triggering frame capture (such as a SoE) or delaying readout of a frame to ensure frames are received sequentially at the image signal processor. In this manner, the image signal processor does not require coordination circuitry or other logic at the front end of the image signal processor to coordinate providing image frames sequentially that would otherwise be received concurrently (since image frames are ensured to be received in a sequential manner). Furthermore, coordinating the readout of frames allows for reducing or removing a delay between receiving a frame and processing the frame.
The image sensors may all be slaves to the image signal processor (instead of being slaves to one of the image sensors). The image signal processor may provide a trigger to each image sensor to indicate when to capture (or readout) an image frame. In this manner, the image signal processor also controls if and when an image sensor is to capture an image frame. As a result, in addition to coordinating readout of the image frames so that they are received sequentially by the image signal processor, a device's power consumption may be reduced when frame capture is not required by an application, as the image signal processor may not provide a trigger to one or more image sensors during such time (and the image sensors do not consume power capturing and reading out image frames when not needed). The time to reconfigure an image sensor may also be reduced based on the image sensor being a slave to the image signal processor.
Another benefit is that gaps in video or delays in an imaging application associated with a frame rate change may be reduced or removed. For example, if an image sensor's frame capture is based on triggers provided by the image signal processor, a frame rate adjustment may be controlled by the image signal processor by adjusting the timing between triggers. In this manner, an image sensor's frame rate may be increased by reducing the time between triggers, and an image sensor's frame rate may be decreased by increasing the time between triggers (without requiring disabling the image sensor to change the frame rate). Other benefits of the present disclosure may also become evident in the provided examples and description herein.
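For illustration only, the following Python sketch shows how an image signal processor acting as master might pace triggers to a slave image sensor, with the frame rate adjusted simply by changing the time between triggers. The hooks send_trigger, get_frame_rate, and stop_event are hypothetical and do not represent an actual driver interface.

```python
import time

def trigger_loop(send_trigger, get_frame_rate, stop_event):
    """Issue triggers to a slave image sensor at the requested frame rate.

    Changing the frame rate only changes the delay between triggers; the
    sensor is never disabled, so there is no gap from reconfiguration.
    """
    while not stop_event.is_set():
        send_trigger()  # the sensor begins frame capture (such as a SoE) on receipt
        # Re-read the rate each iteration so a rate change takes effect on
        # the very next trigger (e.g., 1/30 s at 30 fps, 1/60 s at 60 fps).
        time.sleep(1.0 / get_frame_rate())
```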
In the following description, numerous specific details are set forth, such as examples of specific components, circuits, and processes to provide a thorough understanding of the present disclosure. The term “coupled” as used herein means connected directly to or connected through one or more intervening components or circuits. Also, in the following description and for purposes of explanation, specific nomenclature is set forth to provide a thorough understanding of the present disclosure. However, it will be apparent to one skilled in the art that these specific details may not be required to practice the teachings disclosed herein. In other instances, well known circuits and devices are shown in block diagram form to avoid obscuring teachings of the present disclosure. Some portions of the detailed descriptions which follow are presented in terms of procedures, logic blocks, processing, and other symbolic representations of operations on data bits within a computer memory. In the present disclosure, a procedure, logic block, process, or the like, is conceived to be a self-consistent sequence of steps or instructions leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, although not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated in a computer system.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussions, it is appreciated that throughout the present application, discussions utilizing the terms such as “accessing,” “receiving,” “sending,” “using,” “selecting,” “determining,” “normalizing,” “multiplying,” “averaging,” “monitoring,” “comparing,” “applying,” “updating,” “measuring,” “deriving,” “settling,” “generating” or the like, refer to the actions and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
In the figures, a single block may be described as performing a function or functions; however, in actual practice, the function or functions performed by that block may be performed in a single component or across multiple components, and/or may be performed using hardware, using software, or using a combination of hardware and software. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps are described below generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure. Also, the example devices may include components other than those shown, including well-known components such as a processor, memory, and the like.
Aspects of the present disclosure are applicable to any suitable electronic device including or coupled to two or more image sensors capable of capturing image frames (also referred to as frames) for video (such as security systems, smartphones, tablets, laptop computers, digital video cameras, and so on). Further, aspects of the present disclosure may be implemented in devices having or coupled to image sensors of the same or different capabilities and characteristics (such as resolution, shutter speed, sensor type, and so on).
The terms “device” and “apparatus” are not limited to one or a specific number of physical objects (such as one smartphone, one camera controller, one processing system and so on). As used herein, a device may be any electronic device with one or more parts that may implement at least some portions of the disclosure. While the below description and examples use the term “device” to describe various aspects of the disclosure, the term “device” is not limited to a specific configuration, type, or number of objects. As used herein, an apparatus may include a device or a portion of the device for performing the described operations.
The first image sensor 301 and the second image sensor 302 are configured to capture one or more image frames. For example, the first image sensor 301 and the second image sensor 302 may be included in one multiple camera configuration or in separate single cameras or separate multiple camera configurations (such as a dual camera configuration, a triple camera configuration, and so on for a smartphone or other suitable device). The image sensors 301 and 302 may also include or be coupled to one or more lenses for focusing light, one or more apertures for receiving light, one or more shutters for blocking light when outside an exposure window, one or more color filter arrays (CFAs) for filtering light outside of specific frequency ranges, one or more analog front ends for converting analog measurements to digital information, or other suitable components for imaging. The device 300 may also include a flash, a depth sensor, a GPS, or other suitable components for imaging.
The image sensors 301 and 302 may be configured to be slaves to the image signal processor 312 (with neither image sensor nor any other image sensor coupled to the image signal processor 312 being a master to the image sensors 301 and 302). In this manner, instead of the image sensors 301 and 302 being coupled to one another or to any other image sensor for a master-slave relationship between image sensors, the image sensors 301 and 302 are coupled to the image signal processor 312 for such a relationship. With the image sensors 301 and 302 as slaves of the image signal processor 312, the image sensors 301 and 302 are configured to wait for a trigger from the image signal processor 312 to begin image frame capture or readout of the image frame. For example, a trigger provided from the image signal processor 312 to one of the image sensors 301 or 302 may cause the image sensor to begin a SoE for image frame capture. The image sensors 301 and 302 may also be configured to read out their respective image frames to the image signal processor 312.
The image signal processor 312 is a single image signal processor that processes captured image frames provided by the image sensors 301 and 302. The image signal processor 312 may also be configured to provide the triggers to the image sensors 301 and 302 to control capture or readout of the image frames from the image sensors 301 and 302. In this manner, the device 300 is able to control the image sensors 301 and 302 to read out image frames sequentially to the image signal processor 312 (and thus does not require coordination circuitry between the image signal processor 312 and the image sensors 301 and 302).
While
In some aspects, the image signal processor 312 may execute instructions from a memory (such as instructions 308 from the memory 306, instructions stored in a separate memory coupled to or included in the image signal processor 312, or instructions provided by the processor 304). In addition or alternative to the image signal processor 312 being configured to execute software, the image signal processor 312 may include specific hardware (such as one or more integrated circuits (ICs)) to perform one or more operations described in the present disclosure.
In some implementations, the device 300 includes a memory 306. The memory 306 may include a non-transient or non-transitory computer readable medium storing computer-executable instructions 308 to perform all or a portion of one or more operations described in this disclosure. In some implementations, the instructions 308 include a camera application (or other suitable application) to be executed by the device 300 for generating images or videos. The instructions 308 may also include other applications or programs executed by the device 300 (such as an operating system and specific applications other than for image or video generation). Execution of the camera application (such as by the processor 304) may cause the device 300 to generate images using the image sensors 301 or 302 and the image signal processor 312. The memory 306 may also be accessed by the image signal processor 312 to store processed frames or may be accessed by the processor 304 to obtain the processed frames. In some other implementations, the device 300 does not include the memory 306. For example, the device 300 may be a circuit including the image signal processor 312, and the memory is outside the device 300. The device 300 may be coupled to the memory and configured to access the memory for writing processed frames.
In some implementations, the device 300 includes a processor 304. The processor 304 may include one or more general purpose processors capable of executing scripts or instructions of one or more software programs (such as instructions 308) stored within the memory 306. For example, the processor 304 may include one or more application processors configured to execute the camera application (or other suitable application for generating images or video) stored in the memory 306. In executing the camera application, the processor 304 may be configured to instruct the image signal processor 312 to perform one or more operations with reference to the image sensors 301 or 302. Execution of instructions 308 outside of the camera application by the processor 304 may also cause the device 300 to perform any number of functions or operations. In some implementations, the processor 304 may include ICs or other hardware in addition to the ability to execute software to cause the device 300 to perform a number of functions or operations (including the operations described herein). In some other implementations, the device 300 does not include the processor 304. For example, if the device 300 is a circuit including the image signal processor 312, the device 300 may be coupled to a processor for performing one or more of the described operations.
In some implementations, the device 300 includes a display 314. The display 314 may include one or more suitable displays or screens allowing for user interaction and/or to present items to the user (such as a preview of the image frames being captured by the image sensors 301 and 302). In some aspects, the display 314 is a touch-sensitive display. The device 300 may also include I/O components 316, and the I/O components 316 may be or include any suitable mechanism, interface, or device to receive input (such as commands) from the user and to provide output to the user. For example, the I/O components 316 may include (but are not limited to) a graphical user interface (GUI), keyboard, mouse, microphone and speakers, a squeezable bezel, one or more buttons (such as a power button), a slider or switch, and so on.
While shown to be coupled to each other via the processor 304 in the example of
As noted above, multiple image sensors are coupled to an image signal processor, and the image signal processor processes the frames from the multiple image sensors. Coordinating readout of frames from the multiple image sensors to the image signal processor may be based on triggers provided by the image signal processor (as the master) to the image sensors (as the slaves) to coordinate when the frames are provided by the image sensors to the image signal processor.
The trigger controller 406 is configured to provide a trigger to cause an image sensor to begin capture of a frame (such as a SoE for a frame) or to trigger readout of the frame by the image sensor. In some implementations, a trigger may refer to a defined signal value or level. For example, a trigger may include a high signal instance (such as an increased voltage, increased current, or other suitable indication) or, conversely, a low signal instance (such as a decreased voltage, decreased current, or other suitable indication) of a signal provided by the image signal processor coupled to the image sensor. In this manner, each image sensor coupled to the image signal processor may receive a different signal from the image signal processor. In some other implementations, a trigger may refer to a distinct signal from the image signal processor. For example, a defined signal for an amount of time may be sent by the image signal processor to an image sensor to trigger the image sensor to begin exposure for a frame. Nothing may be transmitted when the image signal processor is not to trigger the image sensor. In this manner, separate signals are sent each time the image signal processor is to trigger an image sensor to capture or read out an image frame. The examples provided herein describe and illustrate a trigger as a high signal instance of a signal provided by the image signal processor to the image sensor, but any suitable trigger may be used, including a predefined string of bits, a pattern of voltage or current fluctuations, or any other suitable trigger. The present disclosure is not limited to the example triggers described in the examples herein.
The image sensors 404A-404N are slaves to the image signal processor 402. As a result, the image sensors 404A-404N do not read out a frame to the image signal processor 402 until triggered to do so via a trigger provided by the image signal processor 402 (such as via the trigger controller 406). In this manner, the image signal processor 402 may control the triggers being sent to the image sensors 404A-404N to prevent two or more of the image sensors from reading out frames to the image signal processor 402 at the same time. Controlling readout may include a trigger being used to indicate a beginning of a frame capture (such as a SoE for a frame). If a frame is not to be captured by an image sensor, no frame readout occurs.
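For illustration only, the following conservative Python sketch staggers trigger times so that no two frame readouts can overlap at the image signal processor. The Sensor fields and the timings are assumptions, and a tighter schedule (hiding part of the wait inside the next sensor's exposure window) is sketched later in this description.

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    name: str
    exposure_s: float  # exposure window duration
    readout_s: float   # time to read the full frame out to the ISP

def schedule_triggers(sensors, start_s=0.0):
    """Assign each slave sensor a trigger time so frames arrive at the
    image signal processor strictly one at a time (conservative: each
    sensor's full capture-plus-readout window is serialized)."""
    schedule, send_time = [], start_s
    for sensor in sensors:
        schedule.append((sensor.name, send_time))
        send_time += sensor.exposure_s + sensor.readout_s
    return schedule

# Example: two sensors with 10 ms exposure and 20 ms readout each.
print(schedule_triggers([Sensor("first", 0.010, 0.020),
                         Sensor("second", 0.010, 0.020)]))
# [('first', 0.0), ('second', ~0.03)]
```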
For the example operation 500, a first trigger to cause a first frame to be received from a first image sensor and a second trigger to cause a second frame to be received from a second image sensor are provided by the image signal processor 402 to the first image sensor and the second image sensor. The triggers are configured (such as sent by the image signal processor 402 at different times or associated with different blanking factors (described below)) so that the first image sensor and the second image sensor are prevented from providing the first image frame and the second image frame to the image signal processor at the same time. In some implementations, a trigger causes an image sensor to begin capture of a frame (such as a SoE for the frame). The frame may be captured and read out to the image signal processor as typically performed by the image sensor. In this manner, the image signal processor 402 may coordinate when the triggers are sent to the image sensors 404A-404N so that frames are received sequentially from the image sensors 404A-404N (without multiple frames arriving concurrently at the image signal processor 402). For example, referring back to
Referring back to
Alternative or additional to the trigger causing a SoE for frame capture at an image sensor, the trigger may cause an image sensor to read out a current frame to the image signal processor. For example, the image sensor may continuously capture frames, but the frames are not read out to the image signal processor 402 as a result of not receiving a trigger. In this manner, the buffers storing pixel data for the current frame may be cleared so that a next frame may be captured (effectively dumping the current frame). In response to the image sensor receiving a trigger, the image sensor may begin readout of the current frame being captured or that was just captured.
In addition or alternative to the image signal processor 402 coordinating when to send the triggers to the image sensors, one or more image sensors may be associated with a delay in readout of a frame to the image signal processor. The lengths of the delays may be configured to prevent concurrent readout of frames from multiple image sensors, even if triggers are sent to multiple image sensors concurrently. Timing of the triggers provided to the image sensors is described in more detail with reference to
Referring back to
The image signal processor 402 also provides a second trigger to a second image sensor (508). As noted above, in some implementations, the second trigger is provided to the second image sensor at a different time than providing the first trigger to the first image sensor. In some other implementations, the triggers may be provided at the same time (or close to the same time) to the multiple image sensors, and one or more blanking factors associated with the image sensors cause the multiple frames to be provided to the image signal processor 402 one at a time. In some further implementations, the triggers may be sent at different times and the image sensors may be associated with different blanking factors so that the image sensors are controlled to provide the frames to the image signal processor 402 one at a time.
The image signal processor 402 receives the second frame from the second image sensor at a second time in response to the second trigger being received by the second image sensor (510), and the image signal processor 402 processes the second frame in response to receiving the second frame (512). The second time is subsequent to the first time. In this manner, the image signal processor 402 processes the first image frame before processing the second image frame. In addition, the second image frame is received by the image signal processor 402 at a time after the first time so that there is no delay in beginning to process the second image frame. As a result, coordination circuitry or other coordination components are not required at the input to the image signal processor 402 (such as to delay the image signal processor 402 from receiving the second frame before being ready to process the frame). In some implementations, the second time may be based on when the second trigger is provided to the second image sensor. In addition or in the alternative, the second time may be based on a blanking factor associated with the second image sensor. Coordination of the frames may also apply to subsequent frames, such that a subsequent frame from the first image sensor is not received while the image signal processor 402 is still processing the second frame or any other previous frame.
While not shown in
Referring to coordinating when the triggers are to be provided to the image sensors,
The first bump 608 (high signal instance) of the first signal 602 is a trigger provided to a first image sensor, and the first image sensor may begin frame capture (such as a SoE). The first bump 610 of the second signal 604 is a trigger provided to a second image sensor, and the second image sensor may begin frame capture (such as a SoE). The image signal processor may determine the time period between the first bump 608 and the first bump 610 to be period 614 to prevent the second frame from the second image sensor from being received at the same time as the first frame from the first image sensor. Period 614 may be based on the amount of time for the first image sensor to capture and complete readout of the first frame to the image signal processor. Period 614 may also be based on an amount of time to capture and read out the second frame by the second image sensor. For example, the exposure window before readout at the second image sensor may allow the image signal processor to provide the first bump 610 before the first image sensor completes readout of the first frame (as the readout is completed before the exposure window ends at the second image sensor or readout begins at the second image sensor). In this manner, the image signal processor may reduce the time between frames being received from different image sensors.
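For illustration only, the timing of period 614 can be expressed as a small calculation: because the second sensor does not begin readout until its exposure window ends, the second trigger may be sent before the first readout completes. The durations in the following Python sketch are hypothetical.

```python
def trigger_offset_s(first_exposure_s, first_readout_s,
                     second_exposure_s, margin_s=0.001):
    """Earliest time after the first trigger (period 614) at which the
    second trigger can be sent without the two readouts overlapping."""
    first_readout_done = first_exposure_s + first_readout_s
    # The second readout starts second_exposure_s after its trigger, so
    # that much of the wait can be hidden inside the second exposure.
    return max(0.0, first_readout_done + margin_s - second_exposure_s)

# Example: 10 ms exposures and a 20 ms first readout. The second trigger
# can go out ~21 ms after the first, even though the first readout does
# not complete until 30 ms.
print(trigger_offset_s(0.010, 0.020, 0.010))  # ~0.021
```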
Similar processes described above may be performed for determining the period 616 and providing the first bump 612 of the third signal 606 (which prevents receiving a third frame from the third image sensor at the same time as receiving the second frame), determining the period 618 and providing the second bump 608 for the first signal 602 (which prevents receiving another frame from the first image sensor at the same time as receiving the third frame), and so on in timing the triggers to the image sensors. For example, with the first image sensor (which has already been triggered to capture the first frame) to capture another frame, the image signal processor determines when to send another trigger that prevents the new frame from being received from the first image sensor at the same time as another frame is received from another image sensor (such as the second frame from the second image sensor, the third frame from the third image sensor, or another frame from a different image sensor coupled to the image signal processor). The trigger is provided to the first image sensor after the first trigger (previously provided to trigger the first image sensor to provide the first frame) and after the second trigger (previously provided to trigger the second image sensor to provide the second frame). Based on the trigger, the image signal processor receives the new frame from the first image sensor at a time after receiving the first frame and the second frame, and the image signal processor processes the received frame.
Referring back to
As noted above, in addition or alternative to coordinating when the triggers are provided to the image sensors, one or more image sensors may be associated with a blanking factor to delay readout of a frame after receiving a trigger. Some imaging applications require multiple image sensors to capture corresponding image frames. For example, stereoscopic imaging includes two image sensors capturing frames concurrently. In this manner, the parallax between the image sensors may be used to generate a three dimensional image from the two frames captured by the two image sensors. Since the image sensors may capture frames concurrently, at least a portion of one frame may be read out at the same time a portion of the other frame is read out. To prevent a frame from an image sensor from being provided to the image signal processor at the same time a frame from a different image sensor is provided to the image signal processor, the image sensor may be associated with a blanking factor.
As used herein, a blanking factor may be a value or other indication of how long an image sensor is to delay readout of a frame to the image signal processor. The blanking factor may indicate an amount of time, a multiple of a base unit of time, a number of clock cycles, or any other suitable indication of a period of time. In some implementations, the blanking factor is provided by the image signal processor or another suitable device component, and the blanking factor may be based on a frame rate for image capture, previous readout delays, a tolerance to ensure no overlap, or any other suitable factors that may impact when a frame would be typically provided to an image signal processor. In some other implementations, the blanking factor may be pre-defined (such as during device calibration after production), user-defined, or stored for the image sensor. In some implementations, each image sensor may be associated with its own unique value or amount of time. The blanking factor may indicate a time to delay readout beginning from when the trigger is received. In some other implementations, the blanking factor may indicate a time to delay readout beginning from the end of an exposure window or other suitable starting point. While some examples of a blanking factor are provided, any suitable implementation of a blanking factor may be used, and the present disclosure is not limited to a specific example of a blanking factor. With one or more image sensors being associated with a unique blanking factor, multiple image sensors may capture frames concurrently, but the frames are read out at different times for the different image sensors so that the image signal processor sequentially receives the frames from the image sensors.
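For illustration only, one possible interpretation of a blanking factor is sketched below in Python: the factor is treated as a count of blank line times to wait before readout. The disclosure permits other interpretations (an absolute time, a multiple of a base unit, or a number of clock cycles), and the 18 microsecond line time is an assumed value.

```python
def blanking_period_s(blanking_factor, line_time_s=18e-6):
    """Convert a blanking factor into a readout delay, assuming the factor
    counts blank line times (one possible interpretation among several)."""
    return blanking_factor * line_time_s

# A sensor given blanking_factor=1200 with an assumed 18 us line time
# delays its readout by about 21.6 ms after the trigger (or after the
# end of its exposure window, depending on the implementation).
print(blanking_period_s(1200))  # ~0.0216
```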
Upon receiving the first trigger, the first image sensor 301 may begin capture of a first frame (704). Upon receiving the second trigger, the second image sensor 302 may begin capture of the second frame (706). In some implementations, capture of the first frame and capture of the second frame may be at the same time (or close to the same time). For example, providing the first and second triggers at the same time (or close to the same time) may cause the first frame and the second frame to be captured at the same time (or close to the same time).
At 708, the first image sensor 301 begins readout of the first frame to the image signal processor 312. For example, if the first image sensor 301 includes a rolling shutter, one or more lines of the first frame may be read out to the image signal processor, and additional lines are read out after readout of the first one or more lines. In another example, if the first image sensor 301 includes a global shutter, the exposure window ends for all pixels of the image sensor, and the frame is read out to the image signal processor 312 after the exposure window ends. If the second image sensor 302 is similar to the first image sensor 301, the second image sensor 302 may also be ready to begin readout at or near 708. To prevent the second image sensor 302 from reading out the second frame when the first frame is being read out to the image signal processor 312, the second image sensor performs blanking (710). For example, a blanking factor is used to delay a column buffer and row buffer (used to collect pixel data from the image sensor pixels) from collecting pixel data for a period of time or from providing the collected pixel data to the image signal processor for a period of time. The blanking period 714 (during which the second image sensor 302 prevents readout of the second frame to the image signal processor 312) is based on when the first frame is read out to the image signal processor (708). For example, the blanking period 714 may be a defined amount of time based on the blanking factor that is universally long enough to prevent the second frame from being provided to the image signal processor 312 at the same time the first frame is provided to the image signal processor 312. In another example, the blanking period 714 is a variable amount of time based on the blanking factor (which varies based on a frame readout of the first frame, frame rate, or other factors that may impact when the second frame is ready for readout and when the first frame is completely read out to the image signal processor). In some implementations, the blanking factor may be provided with the trigger or may be provided in a separate control signal (such as from the processor 304 or the image signal processor 312) to the image sensor, and the image sensor determines the blanking period based on the blanking factor. For example, the image signal processor 312 may provide an indication of a blanking factor (such as a number, flag, or value that may be used by the image sensor to determine the blanking period), and the image sensor applies the blanking factor (performs the blanking) to delay readout of a frame based on the indication of the blanking factor.
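For illustration only, the sensor-side sequence for the second image sensor 302 might look like the following Python sketch; expose and read_out are hypothetical hardware hooks, and the timing values would come from the blanking factor as described above.

```python
import time

def capture_with_blanking(expose, read_out, exposure_s, blanking_s):
    """Second-sensor flow: begin exposure on the trigger, then hold the
    readout for the blanking period (714) so that the first sensor's
    readout to the image signal processor completes first."""
    expose(exposure_s)      # SoE begins when the trigger is received
    time.sleep(blanking_s)  # blanking: row/column buffers withhold data
    read_out()              # begins at or near completion of the first readout
```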
At 712, the first image sensor 301 completes readout of the first frame to the image signal processor 312. As illustrated, the blanking period 714 may end at or near the time when readout of the first frame is completed (716). While blanking is illustrated as being for the blanking period 714 from when the second image sensor 302 receives the second trigger to when the first image sensor 301 completes readout of the first frame, blanking may begin and end at any suitable times.
With the blanking period 714 ending, the second image sensor 302 begins to read out the second frame to the image signal processor 312 (718). While a space is illustrated between when the blanking period 714 ends and when the second frame begins to be read out (718) in the example in
In some implementations, an error may occur where an image signal processor receives at least part of a first frame concurrently with at least part of a second frame. For example, the timing of the triggers may be incorrect or an incorrect blanking factor may have been used. If conflicting frames are received or cause interference in receiving image frames at the image signal processor, the image signal processor may disregard the data received for those frames and perform example operation 500 (
Since the image signal processor is the master to the image sensors coupled to the image signal processor, the described operations above may also be used in reducing device power consumption. For example, when a device executes a typical imaging application, all of the image sensors are enabled, and all of the image sensors capture image frames and read out each image frame. The imaging application may only require frames from one image sensor or from a subset of all of the image sensors of the device. Conventionally, a device may disable one or more of the image sensors as not needed (such as placing the image sensors in a low power state). If the device needs the image sensor to begin capturing images, the device removes the image sensor from the low power state, initializes the image sensor (such as by performing autofocus (AF), autoexposure (AE), and automatic white balance (AWB) operations), and configures the initialized image sensor to begin capturing image frames. Initialization and configuration require an amount of time that delays when the image sensor is ready to be used, and the delay may be noticeable to the user (such as half a second or more).
Frame readout consumes a large portion of the power required to keep an image sensor enabled. If frame readout can be prevented, power consumption is reduced without placing the image sensor into a low power state. With the image sensors as slaves to the image signal processor, the image sensors are prevented from performing frame readout until receiving a trigger from the image signal processor. In this manner, power consumption of the image sensors may be reduced by not providing triggers to one or more image sensors when they are not needed. Since the image sensors are not placed into a low power state, an image sensor does not need to be initialized or configured before capturing an image frame. As a result, a trigger can be provided to the image sensor when appropriate, and the image sensor captures one or more image frames without the delay typically required to remove an image sensor from a low power state.
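A minimal sketch of this trigger gating follows, assuming (as described above) that an enabled but untriggered sensor performs no readout; the Sensor and ImageSignalProcessor classes and their methods are illustrative stand-ins rather than a real driver API.

```python
# Illustrative stand-ins only; a real sensor driver API would differ.

class Sensor:
    def __init__(self, name):
        self.name = name
        self.frames_read_out = 0

    def on_trigger(self):
        # The sensor stays initialized and configured; it captures and reads
        # out a frame only in response to a trigger.
        self.frames_read_out += 1

class ImageSignalProcessor:
    def __init__(self, sensors):
        self.sensors = sensors
        self.needed = set()          # sensors the imaging application requires

    def request(self, names):
        self.needed = set(names)

    def frame_tick(self):
        # Once per frame period, trigger only the needed sensors; untriggered
        # sensors skip readout (the dominant power cost) while remaining out
        # of any low power state.
        for sensor in self.sensors:
            if sensor.name in self.needed:
                sensor.on_trigger()

sensors = [Sensor("wide"), Sensor("tele"), Sensor("ultrawide")]
isp = ImageSignalProcessor(sensors)
isp.request({"wide"})
for _ in range(30):                  # one second of frame periods at 30 fps
    isp.frame_tick()
print([(s.name, s.frames_read_out) for s in sensors])
# [('wide', 30), ('tele', 0), ('ultrawide', 0)]
```

Because the untriggered sensors were never placed in a low power state, triggering them later would produce frames immediately, without re-running AF, AE, or AWB initialization.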
In addition to, or as an alternative to, reducing power consumption without placing one or more image sensors into a low power state, the operations described above may be used to improve reconfiguration of an image sensor. An image sensor may be reconfigured, such as by changing the frame rate, changing the resolution (such as by changing the remosaicing), or changing other aspects of the frames from the image sensor. Conventionally, an image sensor (such as the first image sensor 301) to be reconfigured is active, with the image sensor performing readouts of frames being captured. Referring to
In the present disclosure, since the image sensors are slaves to the image signal processor, the image sensors do not read out frames to the image signal processor without a trigger. In some implementations, the image sensors do not capture a frame without receiving a trigger. In this manner, an image sensor is not required to be deactivated before being reconfigured. Instead, the image signal processor refrains from sending a trigger to the image sensor while the image sensor is being reconfigured. As a result, the image sensor may be reconfigured without requiring deactivation and reactivation, thus reducing the amount of time required to reconfigure the image sensor.
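The following sketch illustrates this reconfiguration flow under the stated assumptions: triggers to the sensor are withheld while new settings are applied, and the sensor remains powered and initialized throughout. The class, field, and setting names are hypothetical.

```python
from contextlib import contextmanager

class TriggeredSensor:
    def __init__(self):
        self.config = {"fps": 30, "resolution": "12MP"}
        self.triggers_enabled = True     # ISP sends per-frame triggers

    def apply(self, **settings):
        self.config.update(settings)

@contextmanager
def reconfigure(sensor):
    # Withhold triggers during reconfiguration; the sensor stays powered and
    # initialized the entire time.
    sensor.triggers_enabled = False
    try:
        yield sensor
    finally:
        # Resume triggering immediately: no deactivation/reactivation cycle
        # (and no half-second-plus re-initialization delay) is incurred.
        sensor.triggers_enabled = True

sensor = TriggeredSensor()
with reconfigure(sensor) as s:
    s.apply(fps=60, resolution="3MP")    # e.g., a remosaicing change
print(sensor.config, sensor.triggers_enabled)
```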
Reconfiguring an image sensor without deactivating or powering down at least a portion of the image sensor may be particularly beneficial for video capture and generation. Conventional reconfiguration of an image sensor (requiring powering down or otherwise deactivating the image sensor) may cause a pause or blank space of half a second or more in a video (during which no frames are captured by the image sensor). Reconfiguring the image sensor without powering down (such as by refraining from sending triggers to the image sensor during reconfiguration) reduces the length of the pause or blank space in the video attributed to reconfiguring the image sensor.
The above implementations and techniques may apply to different types of image sensors and configurations. For example, in addition to being applicable to image sensors that capture or read out frames at 30 frames per second (fps), 60 fps, or other typical frame rates for image or video capture, the techniques may also be applicable to image sensors configured in a fast shutter or fast readout mode. In some implementations, one or more of the image sensors are configured to capture or read out frames at 120 fps (or more).
In one example, an image signal processor is coupled to four image sensors. For instance, referring back to
An image sensor capturing frames at 30 fps captures a frame approximately every 32 ms (1/30 of a second is approximately 33 ms; 32 ms is used in this example so that the period divides evenly into four 8 ms subperiods). The image signal processor 402 may be configured to trigger each of the image sensors 404A-404D to be active during 8 ms of each 32 ms period. The image sensor may be blanked for the other 24 ms of the 32 ms period. In this manner, a different image sensor is active during each 8 ms subperiod of the 32 ms period. As used herein, an image sensor being active may refer to the image sensor performing readout of a frame captured during the 8 ms subperiod, and an image sensor being blanked may refer to the image sensor being prevented from performing readout of frames captured during the 24 ms. In this manner, each image sensor may continue to capture four frames per 32 ms period, but only one frame is read out to the image signal processor 402. In some other implementations, the image sensor being active may refer to the image sensor capturing one or more image frames, and the image sensor being blanked may refer to the image sensor not capturing image frames.
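Using the numbers from this example, a round-robin schedule over the four 8 ms subperiods might look like the following sketch; the scheduling function and sensor labels are illustrative, not the disclosure's required implementation.

```python
SENSORS = ["404A", "404B", "404C", "404D"]
PERIOD_MS = 32        # one frame per sensor delivered every 32 ms (~30 fps)
SUBPERIOD_MS = 8      # fast readout: a full frame fits in an 8 ms slot

def active_sensor(t_ms: float) -> str:
    """Sensor permitted to read out at time t_ms; the others are blanked."""
    slot = int(t_ms % PERIOD_MS) // SUBPERIOD_MS
    return SENSORS[slot]

# One full 32 ms period in 8 ms steps: each sensor gets exactly one slot, so
# each still delivers one frame per period to the image signal processor.
for t in range(0, PERIOD_MS, SUBPERIOD_MS):
    print(f"t={t:2d} ms: {active_sensor(t)} reads out, others blanked")
```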
With the image sensors 404A-404D each to be active during an 8 ms subperiod of each 32 ms period, the image signal processor 402 may time when to send the triggers and/or configure a blanking factor for one or more image sensors to coordinate when each image sensor is to be active (and when it is to be blanked). While the above example describes four image sensors configured for fast readout (thus able to capture and read out frames at 120 fps) that each effectively provide 30 fps, any suitable number of image sensors, effective frame rates, fast readout rates, or other configurations for when an image sensor is to be active or is to be blanked may be used. The above example is provided for clarity in describing the techniques of the present disclosure as applicable to fast readout image sensors, and does not limit the scope of the present disclosure.
Various techniques for an image signal processor to coordinate the reception of frames from multiple image sensors are described herein. As noted, the image signal processor does not require coordination circuitry that precedes the image signal processor and receives frames from the image sensors. As described above, based on the image sensors being slaves to the image signal processor, different operations and aspects of image capture and processing may be improved. The techniques described herein may be implemented in hardware, software, firmware, or any combination thereof, unless specifically described as being implemented in a specific manner. Any features described as modules or components may also be implemented together in an integrated logic device or separately as discrete but interoperable logic devices. If implemented in software, the techniques may be realized at least in part by a non-transitory processor-readable storage medium (such as the memory 306 in the example device 300 of
The non-transitory processor-readable storage medium may comprise random access memory (RAM) such as synchronous dynamic random access memory (SDRAM), read only memory (ROM), non-volatile random access memory (NVRAM), electrically erasable programmable read-only memory (EEPROM), FLASH memory, other known storage media, and the like. The techniques additionally, or alternatively, may be realized at least in part by a processor-readable communication medium that carries or communicates code in the form of instructions or data structures and that can be accessed, read, and/or executed by a computer or other processor.
The various illustrative logical blocks, modules, circuits, and instructions described in connection with the embodiments disclosed herein may be executed by one or more processors, such as the processor 304 or the image signal processor 312 in the example device 300 of
As noted above, while the present disclosure shows illustrative aspects, various changes and modifications could be made herein without departing from the scope of the appended claims. For example, while one trigger is described as being used to indicate that a frame is to be captured or read out, any suitable number of triggers may be used. In a specific example, a first trigger may cause frame capture to begin, and a second trigger may cause readout of the captured frame. In another example, one trigger may be used to cause an image sensor to capture and read out a plurality of frames. As such, any suitable triggers and blanking factors may be used in coordinating the reception of multiple frames by the image signal processor.
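For instance, the two-trigger variant could be modeled as a small state machine, sketched below with hypothetical trigger and state names; the disclosure does not mandate this structure.

```python
class TwoTriggerSensor:
    def __init__(self):
        self.state = "idle"

    def trigger_capture(self):
        # First trigger: begin the exposure window.
        assert self.state == "idle"
        self.state = "exposing"

    def trigger_readout(self):
        # Second trigger: begin readout of the captured frame to the ISP.
        assert self.state == "exposing"
        self.state = "reading_out"
        # ... frame lines would be transferred here ...
        self.state = "idle"

sensor = TwoTriggerSensor()
sensor.trigger_capture()
sensor.trigger_readout()
print(sensor.state)        # "idle" again: one frame captured and read out
```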
Additionally, the functions, steps, or actions of the method claims in accordance with aspects described herein need not be performed in any particular order unless expressly stated otherwise. For example, blocks 504 and 510 in