The present disclosure is generally related to image processing and, more particularly, to adaptive power saving for multi-frame processing.
Unless otherwise indicated herein, approaches described in this section are not prior art to the claims listed below and are not admitted to be prior art by inclusion in this section.
Multiple-frame (herein interchangeably referred to as “multi-frame”) applications are generally applications that use image processing technology/technologies to generate one or more output image frames from multiple captured image frames. The multiple captured image frames can be captured by a single camera at different times, by multiple cameras at a same time, or by multiple cameras at different times. The multiple captured image frames can go through multi-frame processing (MFP), which generates at least one video frame from the multiple captured image frames for quality improvement. The quality improvement may be with respect to brightness, color, contrast, noise, sharpness, texture, frame rate, temporal smoothness, and so forth. MFP can be applied to still images, video recording files, as well as a video preview shown on a display.
With respect to MFP, existing approaches typically use the same image capture condition and video frame processing algorithm for each video frame. As a result, power consumption tends to be similar in the generation of each video frame with MFP. However, for portable apparatuses that operate on a limited power supply such as a battery (e.g., smartphones, tablets, laptop computers and any battery-powered portable apparatus), the power consumption related to MFP can be excessive and is thus undesirable. Moreover, a high thermal condition (e.g., a high temperature in one or more components) can result from the high power consumption and, undesirably, lead to shutdown of the portable apparatus.
The following summary is illustrative only and is not intended to be limiting in any way. That is, the following summary is provided to introduce concepts, highlights, benefits and advantages of the novel and non-obvious techniques described herein. Select implementations are further described below in the detailed description. Thus, the following summary is not intended to identify essential features of the claimed subject matter, nor is it intended for use in determining the scope of the claimed subject matter.
An objective of the present disclosure is to propose a novel scheme for adaptive power saving for MFP. In one aspect, a method in accordance with the present disclosure may involve monitoring for at least one condition associated with an apparatus. The method may also involve dynamically adjusting image processing performed on a plurality of input image frames to provide one or more output image frames in response to a result of the monitoring.
In another aspect, an apparatus in accordance with the present disclosure may include a processor configured to monitor for at least one condition associated with the apparatus and, in response to a result of the monitoring, dynamically adjust image processing performed on a plurality of input image frames to provide one or more output image frames.
The accompanying drawings are included to provide a further understanding of the disclosure, and are incorporated in and constitute a part of the present disclosure. The drawings illustrate implementations of the disclosure and, together with the description, serve to explain the principles of the disclosure. It is appreciable that the drawings are not necessarily to scale, as some components may be shown out of proportion to their size in actual implementations in order to clearly illustrate the concept of the present disclosure.
Detailed embodiments and implementations of the claimed subject matters are disclosed herein. However, it shall be understood that the disclosed embodiments and implementations are merely illustrative of the claimed subject matters, which may be embodied in various forms. The present disclosure may be embodied in many different forms and should not be construed as limited to the exemplary embodiments and implementations set forth herein. Rather, these exemplary embodiments and implementations are provided so that the description of the present disclosure is thorough and complete and will fully convey the scope of the present disclosure to those skilled in the art. In the description below, details of well-known features and techniques may be omitted to avoid unnecessarily obscuring the presented embodiments and implementations.
Under the proposed scheme of the present disclosure, power consumption in connection with image frame processing may be reduced adaptively, or dynamically, based on one or more real-time conditions being monitored at the time of image frame processing. In particular, under the proposed scheme, utilization of MFP may be dynamically enabled/disabled, adjusted, throttled or otherwise controlled to result in power saving by adapting to the one or more real-time conditions of the apparatus. MFP may be utilized to improve quality in output image frames that are generated by processing input image frames, at least in terms of one or more aspects in the following non-exhaustive list: denoising, deblurring, super-resolution, better high dynamic range, better sharpness, better texture, better brightness, better color and better contrast.
The condition being monitored may be any condition of concern with respect to the apparatus associated with the camera. It is noteworthy that, although in examples described herein the condition being monitored may be the thermal condition of one or more components of the apparatus, in various implementations in accordance with the present disclosure the condition being monitored may be one or more non-thermal conditions with respect to the apparatus. It is noteworthy that more than one condition with respect to the apparatus may be monitored for dynamically controlling MFP in accordance with the present disclosure. For example and without limitation, one or more conditions being monitored may include at least one of the following: one or more temperatures associated with the apparatus reaching or exceeding one or more respective thermal thresholds, one or more temperatures associated with a camera of the apparatus reaching or exceeding one or more respective thermal thresholds, an amount of time that the apparatus has been in use reaching or exceeding a respective temporal threshold, an amount of time that the camera has been in use reaching or exceeding a respective temporal threshold, and an amount of time that an application has been in execution on the apparatus reaching or exceeding a respective temporal threshold. It is further noteworthy that, in addition to or in lieu of controlling MFP, one or more other actions may be taken to achieve power saving in accordance with the present disclosure. For example and without limitation, power saving may be achieved by lowering camera input frame rate, disabling MFP, lowering computation precision in hardware and/or in software (e.g., from 32 bits to 16 bits), and/or turning off one or more parallel hardware tasks and/or one or more parallel software tasks.
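By way of illustration only, the monitoring of one or more conditions and the selection of one or more power-saving actions described above may be sketched in Python as follows. The function names, threshold values and the particular mapping from occurred conditions to actions are hypothetical assumptions for illustration and do not limit the disclosure.

```python
# Hypothetical sketch: monitor one or more conditions and select
# power-saving actions; names and thresholds are illustrative only.

THERMAL_THRESHOLD_C = 45.0      # example thermal threshold for the apparatus
CAMERA_USE_THRESHOLD_S = 600.0  # example temporal threshold for camera use

def monitor_conditions(apparatus_temp_c, camera_use_s):
    """Return the set of monitored conditions that have occurred."""
    occurred = set()
    if apparatus_temp_c >= THERMAL_THRESHOLD_C:
        occurred.add("thermal")
    if camera_use_s >= CAMERA_USE_THRESHOLD_S:
        occurred.add("camera_usage_time")
    return occurred

def select_actions(occurred):
    """Map occurred conditions to power-saving actions (non-exhaustive)."""
    if not occurred:
        return []  # normal mode: full MFP, full frame rate, full precision
    actions = ["disable_mfp"]
    if "thermal" in occurred:
        actions.append("lower_input_frame_rate")
        actions.append("lower_precision_32_to_16_bits")
    return actions
```

For instance, a monitored apparatus temperature of 50 degrees C with only 100 seconds of camera use would, in this sketch, trigger the thermal condition and select all three example actions.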
The term “thermal condition” herein may refer to the temperature(s) of one or more components of the apparatus such as, for example and without limitation, one or more processors, one or more electronic components and/or a casing of the apparatus. For example, when the temperature of a given component being monitored is below a first threshold, the thermal condition with respect to that component (and/or the apparatus) may be deemed as being low; and when the temperature of that component is above a second threshold, the thermal condition with respect to that component (and/or the apparatus) may be deemed as being high. In some implementations, the first threshold and the second threshold may be the same threshold. Alternatively, the first threshold may be different from, and lower than, the second threshold. In such cases, the thermal condition may be deemed as being medium when the temperature of that component is between the first threshold and the second threshold.
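By way of illustration only, the two-threshold classification described above may be sketched in Python as follows. The function name and the choice of an inclusive comparison at the second threshold are assumptions made for illustration.

```python
def classify_thermal(temp_c, first_threshold_c, second_threshold_c):
    """Classify a monitored temperature as a low/medium/high thermal
    condition per the two-threshold scheme described above. When the two
    thresholds are the same threshold, the result is binary (low/high)."""
    if temp_c < first_threshold_c:
        return "low"
    if temp_c >= second_threshold_c:
        return "high"
    return "medium"
```

For example, with a first threshold of 40 degrees C and a second threshold of 50 degrees C, a component at 45 degrees C is deemed medium; with both thresholds set to 45 degrees C, only low or high can result.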
The terms “input frame” and “input image frame” are used interchangeably herein. The terms “output frame” and “output image frame” are used interchangeably herein.
In case 1 in scenario 200 and under an existing approach, MFP would be utilized by generating each output frame with multiple input frames, respectively. That is, input frames may be captured at a constant rate (e.g., 120 frames per second (fps) or another fps), and multiple input frames (e.g., four input frames or any number greater than 1) may be processed to generate a corresponding output frame (e.g., video frame). For example, as shown in the upper portion of
In contrast, under the proposed scheme and in case 2 of scenario 200, the utilization of MFP may be dynamically enabled/disabled, adjusted, throttled or otherwise controlled to result in power saving by adapting to real-time thermal condition(s) (and/or one or more other types of conditions) of the apparatus. For example, as shown in the lower portion of
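By way of illustration only, the grouping of a constant-rate input stream into per-output-frame groups, with the group size adapted between a normal mode and a power-saving mode, may be sketched in Python as follows. The function name and the numeric examples are hypothetical and for illustration only.

```python
def group_frames_for_mfp(input_frames, frames_per_output):
    """Group a constant-rate input stream into per-output groups.

    In a normal mode, frames_per_output may be e.g. 4 (MFP enabled);
    in a power-saving mode it may be 1 (MFP effectively bypassed).
    Trailing frames that do not fill a complete group are dropped
    in this sketch.
    """
    n = frames_per_output
    return [input_frames[i:i + n]
            for i in range(0, len(input_frames) - n + 1, n)]
```

With eight input frames and four frames per output, two output frames result (e.g., a 120 fps input stream yielding a 30 fps output stream); with one frame per output, each input frame maps to its own output frame.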
It is noteworthy that the condition(s) being monitored in scenario 200 may include, for example and without limitation, thermal condition, usage time, bandwidth and/or battery power level associated with the apparatus. It is also noteworthy that the condition(s) being monitored may be user-defined and, accordingly, a user may define which mode (e.g., normal mode or power-saving mode) corresponds to which condition(s).
In case 1 in scenario 300 and under an existing approach, MFP would be disabled and output frames would be provided at a dynamic frame rate according to certain condition(s) (e.g., some frames needing more exposure time). That is, input frames may be captured at a dynamic frame rate (e.g., at 120 fps under one condition and at 30 fps under another condition) and processed to generate corresponding output frames (e.g., video frames). For example, as shown in the upper portion of
In contrast, under the proposed scheme and in case 2 of scenario 300, the utilization of MFP may be dynamically enabled/disabled, adjusted, throttled or otherwise controlled to result in power saving by adapting to real-time thermal condition(s) (and/or one or more other types of conditions being monitored) of the apparatus. For example, as shown in the lower portion of
It is noteworthy that the condition(s) being monitored in scenario 300 may include, for example and without limitation, thermal condition, usage time, bandwidth and/or battery power level associated with the apparatus. It is also noteworthy that the condition(s) being monitored may be user-defined and, accordingly, a user may define which mode (e.g., normal mode or power-saving mode) corresponds to which condition(s).
In some implementations, the multiple cameras may use the same lens module or different lens modules. In some implementations, the multiple cameras may have different resolutions (e.g., 13 million pixels and 5 million pixels), different color filter arrays (e.g., Bayer and mono), different f-numbers (e.g., 2.0 and 2.4), and/or different fields of view (FOVs). In some implementations, to save power for the multiple cameras, at least one camera of the multiple cameras may be disabled. Alternatively or additionally, MFP computation may be disabled. Alternatively or additionally, lower computation precision in hardware and/or software may be utilized (e.g., from 32 bits to 16 bits).
In case 1 in scenario 400 and under an existing approach, MFP would be utilized by generating each output frame with multiple input frames, respectively. That is, input frames may be captured at a constant rate (e.g., 120 fps), and multiple input frames (e.g., two input frames or any number greater than 1) may be processed to generate a corresponding output frame (e.g., video frame). For example, as shown in the upper portion of
In contrast, under the proposed scheme and in case 2 of scenario 400, the utilization of MFP may be dynamically enabled/disabled, adjusted, throttled or otherwise controlled to result in power saving by adapting to real-time thermal condition(s) (and/or one or more other types of conditions) of the apparatus. For example, as shown in the lower portion of
It is noteworthy that the condition(s) being monitored in scenario 400 may include, for example and without limitation, thermal condition, usage time, bandwidth and/or battery power level associated with the apparatus. It is also noteworthy that the condition(s) being monitored may be user-defined and, accordingly, a user may define which mode (e.g., normal mode or power-saving mode) corresponds to which condition(s).
In the example shown in
In the example shown in
In the example shown in
Memory 820 may be a storage device configured to store one or more sets of processor-executable codes, programs and/or instructions 822 as well as image data 824 of input image frames and output image frames. For example, memory 820 may be operatively coupled to processor 810 and/or imaging device 830 to receive image data 824. Memory 820 may be implemented by any suitable technology and may include volatile memory and/or non-volatile memory. For example, memory 820 may include a type of random access memory (RAM) such as dynamic RAM (DRAM), static RAM (SRAM), thyristor RAM (T-RAM) and/or zero-capacitor RAM (Z-RAM). Alternatively or additionally, memory 820 may include a type of read-only memory (ROM) such as mask ROM, programmable ROM (PROM), erasable programmable ROM (EPROM) and/or electrically erasable programmable ROM (EEPROM). Alternatively or additionally, memory 820 may include a type of non-volatile random-access memory (NVRAM) such as flash memory, solid-state memory, ferroelectric RAM (FeRAM), magnetoresistive RAM (MRAM) and/or phase-change memory.
Imaging device 830 may include one or more cameras 835(1)-835(N), where N is a positive integer greater than or equal to 1. Each of the one or more cameras 835(1)-835(N) may include a digital camera which may be implemented with, for example and without limitation, semiconductor charge-coupled device(s) (CCD) and/or active pixel sensors in complementary metal-oxide-semiconductor (CMOS) or N-type metal-oxide-semiconductor (NMOS) technologies. Each of the one or more cameras 835(1)-835(N) may be configured to capture one or more input image frames at any given time, and provide data representative of the captured input image frame(s) to processor 810 and/or memory 820 for processing and/or storage.
Sensing device 840 may include one or more sensors 845(1)-845(M), where M is a positive integer greater than or equal to 1. Each of the one or more sensors 845(1)-845(M) may be configured to sense or otherwise detect a respective condition with respect to one or more aspects of apparatus 800. In some implementations, the one or more sensors 845(1)-845(M) may include one or more temperature sensors. For instance, the one or more temperature sensors may sense one or more temperatures associated with one or more components of apparatus 800 (e.g., temperature of processor 810 and/or temperature of a casing of apparatus 800). In some implementations, the one or more sensors 845(1)-845(M) may include one or more power sensors. For instance, the one or more power sensors may sense a power level of a power supply associated with apparatus 800 such as an internal power supply (e.g., battery).
In one aspect, processor 810 may be implemented in the form of one or more single-core processors, one or more multi-core processors, or one or more CISC processors. That is, even though a singular term “a processor” is used herein to refer to processor 810, processor 810 may include multiple processors in some implementations and a single processor in other implementations in accordance with the present disclosure. In another aspect, processor 810 may be implemented in the form of hardware (and, optionally, firmware) with electronic components including, for example and without limitation, one or more transistors, one or more diodes, one or more capacitors, one or more resistors, one or more inductors, one or more memristors and/or one or more varactors that are configured and arranged to achieve specific purposes in accordance with the present disclosure. In other words, in at least some implementations, processor 810 is a special-purpose machine specifically designed, arranged and configured to perform specific tasks including adaptive power saving for multi-frame processing in accordance with various implementations of the present disclosure.
Processor 810 may be operably coupled to memory 820, imaging device 830 and sensing device 840. Processor 810 may access memory 820 to execute the one or more processor-executable codes 822 stored in memory 820. Upon executing the one or more processor-executable codes 822, processor 810 may be configured to perform operations pertaining to adaptive power saving for multi-frame processing. Processor 810 may be also operably coupled to imaging device 830 to receive input image frames, captured by the one or more cameras 835(1)-835(N), from imaging device 830. Processor 810 may be further operatively coupled to sensing device 840 to receive one or more signals from sensing device 840, with the one or more signals representative of one or more conditions sensed or otherwise detected by the one or more sensors 845(1)-845(M) of sensing device 840.
Processor 810, as a special-purpose machine, may include non-generic and specially-designed hardware circuits that are designed, arranged and configured to perform specific tasks pertaining to adaptive power saving for multi-frame processing in accordance with various implementations of the present disclosure. For instance, processor 810 may include a monitoring circuit 812 and an adjustable image processing circuit 814 that, together, perform specific tasks and functions to render adaptive power saving for multi-frame processing in accordance with various implementations of the present disclosure. For instance, monitoring circuit 812 may monitor for at least one condition associated with apparatus 800, and, in response to a result of the monitoring, adjustable image processing circuit 814 may dynamically adjust image processing performed on multiple input image frames received from imaging device 830 to provide one or more output image frames.
In some implementations, monitoring circuit 812 may, based on one or more signals received from sensing device 840, monitor one or more temperatures associated with apparatus 800 and determine whether the one or more monitored temperatures has/have reached or exceeded one or more respective thermal thresholds. For example and without limitation, monitoring circuit 812 may monitor and determine whether the temperature(s) of processor 810 (and/or one or more other circuits of apparatus 800) and/or a casing of apparatus 800 has/have reached or exceeded respective thermal threshold(s). Alternatively or additionally, monitoring circuit 812 may, based on one or more signals received from sensing device 840, monitor one or more temperatures associated with at least one of the one or more cameras 835(1)-835(N) of imaging device 830 and determine whether the one or more monitored temperatures has/have reached or exceeded one or more respective thermal thresholds. Alternatively or additionally, monitoring circuit 812 may, based on one or more signals received from sensing device 840, monitor a power level of a battery associated with apparatus 800 and determine whether the monitored power level has reached or dropped below a respective power level threshold.
In some implementations, monitoring circuit 812 may, based on signal(s), data and/or information received from one or more other hardware components of processor 810, one or more firmware components of processor 810 and/or one or more software applications executed by processor 810, monitor and determine whether an amount of time that apparatus 800 has been in use has reached or exceeded a respective temporal threshold. Alternatively or additionally, monitoring circuit 812 may, based on signal(s), data and/or information received from one or more other hardware components of processor 810, one or more firmware components of processor 810 and/or one or more software applications executed by processor 810, monitor and determine whether an amount of time that at least one of the one or more cameras 835(1)-835(N) of imaging device 830 has been in use has reached or exceeded a respective temporal threshold. Alternatively or additionally, monitoring circuit 812 may, based on signal(s), data and/or information received from one or more other hardware components of processor 810, one or more firmware components of processor 810 and/or one or more software applications executed by processor 810, monitor and determine whether an amount of time that an application in execution on apparatus 800 has reached or exceeded a respective temporal threshold. Alternatively or additionally, monitoring circuit 812 may, based on signal(s), data and/or information received from one or more other hardware components of processor 810, one or more firmware components of processor 810 and/or a communication device of apparatus 800, monitor and determine whether a bandwidth associated with apparatus 800 has reached or dropped below a respective bandwidth threshold. 
Alternatively or additionally, monitoring circuit 812 may, based on signal(s), data and/or information received from one or more other hardware components of processor 810, one or more firmware components of processor 810 and/or a user interface device of apparatus 800, monitor and determine whether a user input, which changes a mode of the image processing performed on the multiple input image frames, has been received.
In some implementations, in dynamically adjusting the image processing, adjustable image processing circuit 814 may be configured to perform multi-frame processing (MFP) on the multiple input image frames. For instance, adjustable image processing circuit 814 may perform MFP to achieve at least one of the following: denoising, deblurring, super-resolution imaging, high dynamic range improvement, sharpness improvement, texture improvement, brightness improvement, color improvement and contrast improvement.
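By way of illustration only, one elementary form of MFP, namely per-pixel temporal averaging of the multiple input frames to reduce noise, may be sketched in Python as follows. This simple sketch is illustrative only and does not represent any particular MFP algorithm of the present disclosure.

```python
def mfp_temporal_average(frames):
    """Fuse multiple input frames into one output frame by per-pixel
    averaging, one elementary way MFP can achieve denoising. Each frame
    is represented as a flat list of pixel intensities of equal length."""
    if not frames:
        raise ValueError("at least one input frame is required")
    num = len(frames)
    return [sum(pixels) / num for pixels in zip(*frames)]
```

For example, averaging the two frames [10, 20] and [30, 40] yields [20.0, 30.0]; for independent zero-mean noise, averaging N frames attenuates the noise roughly by the square root of N.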
In some implementations, processor 810 may receive the multiple input image frames from a single camera of imaging device 830, where the multiple input image frames may be captured by the single camera at different times. In such cases, in dynamically adjusting image processing performed on the multiple input image frames to provide the one or more output image frames, adjustable image processing circuit 814 may be configured to perform a number of operations. For instance, adjustable image processing circuit 814 may perform a first mode of image processing on the multiple input image frames to provide the one or more output image frames when there is no occurrence of the at least one condition. Moreover, adjustable image processing circuit 814 may perform a second mode of image processing on the multiple input image frames to provide the one or more output image frames when there is an occurrence of the at least one condition.
In performing the first mode of image processing on the multiple input image frames to provide the one or more output image frames, adjustable image processing circuit 814 may be configured to perform either of the following: (i) generating each output image frame of the one or more output image frames using a first number of respective input image frames of the multiple input image frames, and (ii) generating the first number of output image frames of the one or more output image frames using a second number of respective input image frames of the multiple input image frames. In performing the second mode of image processing on the multiple input image frames to provide the one or more output image frames, adjustable image processing circuit 814 may be configured to perform either of the following: (i) generating each output image frame of the one or more output image frames using the second number of respective input image frames of the multiple input image frames, and (ii) generating a third number of output image frames of the one or more output image frames using the second number of respective input image frames of the multiple input image frames. Here, the second number may be less than the first number, and the third number may be less than, or otherwise not equal to, the first number.
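By way of illustration only, the relationship among the first number, the second number and the resulting workload may be sketched in Python as follows. The function names and the default values of 4 and 2 are hypothetical assumptions for illustration.

```python
def select_frames_per_output(condition_occurred, first_number=4, second_number=2):
    """Number of input frames fused into each output frame under the
    current mode: the first number when no condition has occurred,
    otherwise the smaller second number (second number < first number),
    reducing the MFP workload per output frame."""
    assert second_number < first_number
    return second_number if condition_occurred else first_number

def output_fps(input_fps, frames_per_output):
    """Resulting output frame rate when each output frame consumes a
    fixed number of input frames from a constant-rate input stream."""
    return input_fps / frames_per_output
```

For example, at a 120 fps input rate, fusing four frames per output in the first mode yields 30 output fps, while fusing only two frames per output in the second mode halves the fusion workload per output frame.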
In some implementations, processor 810 may receive the multiple input image frames from multiple cameras of imaging device 830, where the multiple input image frames may be captured by the multiple cameras in batches at different times with each batch of input image frames captured simultaneously by the multiple cameras at a respective time. In such cases, in dynamically adjusting the image processing performed on multiple input image frames to provide one or more output image frames, adjustable image processing circuit 814 may be configured to perform a number of operations. For instance, adjustable image processing circuit 814 may perform a first mode of image processing on the multiple input image frames to provide the one or more output image frames when there is no occurrence of the at least one condition. Furthermore, adjustable image processing circuit 814 may perform a second mode of image processing on the multiple input image frames to provide the one or more output image frames when there is an occurrence of the at least one condition.
In performing the first mode of image processing on the multiple input image frames to provide the one or more output image frames, adjustable image processing circuit 814 may be configured to perform either of the following: (i) generating each output image frame of the one or more output image frames using a respective batch of input image frames of the multiple input image frames, and (ii) generating each output image frame of the one or more output image frames using more than one respective batch of input image frames of the multiple input image frames. In performing the second mode of image processing on the multiple input image frames to provide the one or more output image frames, adjustable image processing circuit 814 may be configured to generate each output image frame of the one or more output image frames using a respective input image frame captured by one of the multiple cameras at a respective time.
In some implementations, processor 810 may receive the multiple input image frames from one or more cameras of imaging device 830, where the multiple input image frames may be captured by a single camera of the one or more cameras at different times, by more than one camera of the one or more cameras in batches at different times, or by a combination thereof. Each batch of input image frames may be captured simultaneously by the more than one camera of the one or more cameras at a respective time. In such cases, in dynamically adjusting the image processing performed on the multiple input image frames to provide the one or more output image frames, adjustable image processing circuit 814 may be configured to perform a respective mode of multiple modes of image processing on the multiple input image frames to provide the one or more output image frames under a respective condition of a number of conditions. For instance, under a first condition, adjustable image processing circuit 814 may generate each output image frame of the one or more output image frames using a first number of respective input image frames of the multiple input image frames captured by the single camera or the more than one camera of the one or more cameras at different times. Additionally, under a second condition, adjustable image processing circuit 814 may generate each output image frame of the one or more output image frames using a second number of respective input image frames of the multiple input image frames captured by the single camera or the more than one camera of the one or more cameras at different times. Moreover, under a third condition, adjustable image processing circuit 814 may generate each output image frame of the one or more output image frames using a third number of respective input image frames of the multiple input image frames captured by the single camera or the more than one camera of the one or more cameras at different times. 
Here, the second number may be less than the first number, and the third number may be less than, or otherwise not equal to, the first number.
At 910, process 900 may involve processor 810 of apparatus 800 receiving multiple input images from a single camera. Process 900 may proceed from 910 to 930.
At 920, process 900 may involve processor 810 of apparatus 800 receiving multiple input images from multiple cameras. Process 900 may proceed from 920 to 930.
At 930, process 900 may involve processor 810 of apparatus 800 monitoring for at least one condition associated with apparatus 800. Process 900 may proceed from 930 to 940.
At 940, process 900 may involve processor 810 of apparatus 800, in response to a result of the monitoring, dynamically adjusting image processing performed on the multiple input image frames to provide one or more output image frames.
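By way of illustration only, one iteration of process 900 (receiving at 910/920, monitoring at 930, and adjusting at 940) may be sketched in Python as follows. The function and its caller-supplied processing callbacks are hypothetical and illustrative only.

```python
def process_900_step(input_frames, condition_occurred, mfp_fuse, passthrough):
    """One iteration of process 900: frames are received (910/920), at
    least one condition is monitored (930), and the image processing
    applied to the frames is dynamically adjusted in response (940).

    `mfp_fuse` and `passthrough` are caller-supplied processing
    functions; their names are illustrative only.
    """
    if condition_occurred:                  # 930: result of the monitoring
        return passthrough(input_frames)    # 940: power-saving processing
    return mfp_fuse(input_frames)           # 940: full MFP processing
```

For instance, a caller may supply a fusing callback that averages the received frames and a passthrough callback that simply forwards the most recent frame.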
In some implementations, in monitoring for the at least one condition related to apparatus 800, process 900 may involve processor 810 monitoring for an occurrence of one or more conditions of a number of conditions related to apparatus 800. For instance, such conditions may include the following: one or more temperatures associated with apparatus 800 reaching or exceeding one or more respective thermal thresholds, one or more temperatures associated with a camera of apparatus 800 reaching or exceeding one or more respective thermal thresholds, an amount of time that apparatus 800 has been in use reaching or exceeding a respective temporal threshold, an amount of time that the camera of apparatus 800 has been in use reaching or exceeding a respective temporal threshold, and an amount of time that an application has been in execution on apparatus 800 reaching or exceeding a respective temporal threshold.
In some implementations, in monitoring for the at least one condition related to apparatus 800, process 900 may involve processor 810 monitoring for an occurrence of one or more conditions of a number of conditions related to apparatus 800. For instance, such conditions may include the following: a bandwidth associated with apparatus 800 reaching or dropping below a respective bandwidth threshold, a power level of a battery associated with apparatus 800 reaching or dropping below a respective power level threshold, and receipt of a user input that changes a mode of the image processing performed on the plurality of input image frames.
In some implementations, the multiple input image frames may be received from a single camera and captured by the single camera at different times. In such cases, in dynamically adjusting the image processing performed on the multiple input image frames to provide the one or more output image frames, process 900 may involve processor 810 performing a first mode of image processing on the multiple input image frames to provide the one or more output image frames when there is no occurrence of the at least one condition. For instance, processor 810 may generate each output image frame of the one or more output image frames using a first number of respective input image frames of the multiple input image frames. Process 900 may also involve processor 810 performing a second mode of image processing on the multiple input image frames to provide the one or more output image frames when there is an occurrence of the at least one condition. For instance, processor 810 may generate each output image frame of the one or more output image frames using a second number of respective input image frames of the multiple input image frames, where the second number may be less than the first number.
Alternatively, in performing the first mode of image processing on the multiple input image frames to provide the one or more output image frames, process 900 may involve processor 810 generating a first number of output image frames of the one or more output image frames using a second number of respective input image frames of the multiple input image frames. Moreover, in performing the second mode of image processing on the multiple input image frames to provide the one or more output image frames, process 900 may involve processor 810 generating a third number of output image frames of the one or more output image frames using the second number of respective input image frames of the multiple input image frames. Here, the second number may be less than the first number, and the third number may be less than or not equal to the first number.
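The single-camera two-mode scheme above can be sketched as follows. The specific frame counts (four input frames per output frame in the first mode, two in the second) are illustrative assumptions, and `blend` merely stands in for whatever MFP quality-improvement algorithm is used; the point is only that the second mode consumes fewer input frames per output frame, and hence less power.

```python
from typing import List, Sequence

Frame = List[float]  # a minimal stand-in for pixel data


def blend(frames: Sequence[Frame]) -> Frame:
    """Placeholder MFP step: average the input frames element-wise."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]


def process(frames: Sequence[Frame], condition_occurred: bool) -> List[Frame]:
    """First mode: each output frame is generated from first_number
    input frames. Second mode (when at least one condition has
    occurred): each output frame uses second_number < first_number
    input frames, reducing per-output-frame computation."""
    first_number, second_number = 4, 2  # illustrative counts only
    n = second_number if condition_occurred else first_number
    return [blend(frames[i:i + n]) for i in range(0, len(frames) - n + 1, n)]
```

Note that with fewer input frames consumed per output frame, the same captured sequence also yields a different number of output frames, which corresponds to the alternative formulation in the preceding paragraph.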
In some implementations, the multiple input image frames may be received from multiple cameras and captured in batches by the multiple cameras at different times, with each batch of input image frames being captured simultaneously by the multiple cameras at a respective time. In such cases, in dynamically adjusting the image processing performed on the multiple input image frames to provide the one or more output image frames, process 900 may involve processor 810 performing a first mode of image processing on the multiple input image frames to provide the one or more output image frames when there is no occurrence of the at least one condition. For instance, processor 810 may generate each output image frame of the one or more output image frames using one or more respective batches of input image frames of the multiple input image frames. Process 900 may also involve processor 810 performing a second mode of image processing on the multiple input image frames to provide the one or more output image frames when there is an occurrence of the at least one condition. For instance, processor 810 may generate each output image frame of the one or more output image frames using a respective input image frame captured by one of the multiple cameras at a respective time.
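The multi-camera case above can be sketched as follows, with hypothetical per-camera frame streams. In the first mode each output frame fuses a full batch (one frame from each camera captured at the same time); in the power-saving second mode each output frame is simply the frame from a single designated camera, skipping the cross-camera fusion entirely. The camera names and the averaging fusion are assumptions for illustration.

```python
from typing import Dict, List, Sequence

Frame = List[float]  # a minimal stand-in for pixel data


def fuse_batch(batch: Sequence[Frame]) -> Frame:
    """Placeholder fusion of a batch captured simultaneously by
    multiple cameras (element-wise average)."""
    n = len(batch)
    return [sum(px) / n for px in zip(*batch)]


def process_batches(streams: Dict[str, List[Frame]],
                    condition_occurred: bool,
                    primary: str = "cam0") -> List[Frame]:
    """First mode: fuse each batch across all cameras.
    Second mode: pass through the primary camera's frame at each
    capture time, avoiding multi-camera fusion to save power."""
    num_times = len(streams[primary])
    if condition_occurred:
        return [streams[primary][t] for t in range(num_times)]
    return [fuse_batch([streams[cam][t] for cam in streams])
            for t in range(num_times)]
```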
In some implementations, the multiple input image frames may be received from one or more cameras. The multiple input image frames may be captured by a single camera of the one or more cameras at different times, by more than one camera of the one or more cameras in batches at different times, or by a combination thereof. In such cases, each batch of input image frames may be captured simultaneously by the more than one camera of the one or more cameras at a respective time. Moreover, in dynamically adjusting the image processing performed on the multiple input image frames to provide the one or more output image frames, process 900 may involve processor 810 performing a respective mode of multiple modes of image processing on the multiple input image frames to provide the one or more output image frames under a respective condition of a plurality of conditions. In some implementations, in performing the respective mode of the multiple modes of image processing on the multiple input image frames to provide the one or more output image frames under the respective condition of the multiple conditions, process 900 may involve processor 810 performing a number of operations. For instance, under a first condition, process 900 may involve processor 810 generating each output image frame of the one or more output image frames using a first number of respective input image frames of the multiple input image frames captured by the single camera or the more than one camera of the one or more cameras at different times. Under a second condition, process 900 may involve processor 810 generating each output image frame of the one or more output image frames using a second number of respective input image frames of the multiple input image frames captured by the single camera or the more than one camera of the one or more cameras at different times. 
Under a third condition, process 900 may involve processor 810 generating each output image frame of the one or more output image frames using a third number of respective input image frames of the multiple input image frames captured by the single camera or the more than one camera of the one or more cameras at different times. Here, the second number may be less than the first number, and the third number may be less than or not equal to the first number.
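The multi-mode scheme of the last two paragraphs can be sketched as a simple dispatch from an observed condition to a per-output-frame input count. The condition labels and the numbers are hypothetical; what the sketch preserves is the relationship stated above, namely that the second number is less than the first and the third differs from the first.

```python
from typing import Dict

# Illustrative mapping from an observed condition to the number of
# input image frames used to generate each output image frame.
FRAMES_PER_OUTPUT: Dict[str, int] = {
    "normal": 6,       # first condition  -> first number
    "warm": 3,         # second condition -> second number (< first)
    "low_battery": 2,  # third condition  -> third number (!= first)
}


def select_mode(condition: str) -> int:
    """Return how many input image frames each output image frame
    should use under the given condition, defaulting to the first
    (full-quality) mode for unrecognized conditions."""
    return FRAMES_PER_OUTPUT.get(condition, FRAMES_PER_OUTPUT["normal"])
```

A processing loop would call `select_mode` each time the monitored conditions change, so that the per-frame workload, and thus the power consumption, tracks the apparatus state dynamically.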
The herein-described subject matter sometimes illustrates different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely examples, and that in fact many other architectures can be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively “associated” such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated can also be viewed as being “operably connected” or “operably coupled” to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable” to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting and/or logically interactable components.
Further, with respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.
Moreover, it will be understood by those skilled in the art that, in general, terms used herein, and especially in the appended claims, e.g., bodies of the appended claims, are generally intended as “open” terms, e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc. It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to implementations containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an,” e.g., “a” and/or “an” should be interpreted to mean “at least one” or “one or more;” the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should be interpreted to mean at least the recited number, e.g., the bare recitation of “two recitations,” without other modifiers, means at least two recitations, or two or more recitations. 
Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention, e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc. In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention, e.g., “a system having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc. It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase “A or B” will be understood to include the possibilities of “A” or “B” or “A and B.”
From the foregoing, it will be appreciated that various implementations of the present disclosure have been described herein for purposes of illustration, and that various modifications may be made without departing from the scope and spirit of the present disclosure. Accordingly, the various implementations disclosed herein are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
The present disclosure is part of a non-provisional application claiming the priority benefit of U.S. Patent Application No. 62/260,352, filed on 27 Nov. 2015, which is incorporated by reference in its entirety.