This application claims priority under 35 U.S.C. § 119 to Chinese Patent Application No. 202311227530.5, filed on Sep. 22, 2023, the disclosure of which is incorporated by reference herein in its entirety.
Some example embodiments of the inventive concept relate to the field of image processing, and more particularly, to image processing methods and/or electronic devices designed to increase the speed of camera stream (or image stream) processing.
Original images captured by an image sensor undergo various image processing steps, including Lens Shading Correction (LSC), Automatic White Balance (AWB), and the like. The LSC processing and the AWB processing are interdependent. Generally, the LSC gain for a current image frame depends on a Correlated Color Temperature (CCT) determined by the AWB processing of the previous image frame. In turn, the LSC gain for the current image frame influences the result of the AWB processing for the current image frame, which subsequently impacts the LSC processing of the next image frame.
When processing an image stream which includes a plurality of sensor images, the CCT used in previous image processing steps may no longer be applicable if the illumination changes or the image sensor itself changes. In such cases, achieving a stable CCT may require iterative processing over several image frames, which can lead to extended processing times.
Therefore, there is a need for an image processing method to improve the speed of camera stream processing.
According to some example embodiments of the inventive concept, there is provided a method for image processing including: obtaining a sensor image; performing a lens shading correction on the sensor image to obtain a first image; performing a weighted masking on the first image to obtain a second image; and performing an automatic white balance processing on the second image to obtain a third image.
According to some example embodiments of the inventive concept, there is provided an electronic device including: a sensor module configured to obtain a sensor image; and a processor configured to perform a lens shading correction on the sensor image to obtain a first image, perform a weighted masking on the first image to obtain a second image, and perform an automatic white balance processing on the second image to obtain a third image.
According to some example embodiments of the inventive concept, there is provided a non-transitory computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to execute the method disclosed above.
The method of image processing and the electronic device according to some example embodiments of the inventive concept may perform a weighted masking processing between the lens shading correction processing and the automatic white balance processing when the light source changes or the imaging sensor changes. In this way, a converged correlated color temperature may be obtained using fewer images, thereby increasing the speed of camera stream processing.
The above and other features of the inventive concept will become clearer through the following detailed description taken together with the accompanying drawings in which:
Example embodiments of the inventive concept will be described hereinafter with reference to the accompanying drawings. However, various changes and modifications to the methods, apparatuses, and/or systems described herein will be apparent to one of ordinary skill in the art. For example, the sequences of operations described herein are merely examples, and are not intended to be limited to those set forth herein, but may be changed as determined by one of ordinary skill in the art.
Although the terms “first” or “second” may be used to explain various components, the components are not limited by the terms. These terms are merely used to distinguish one component from another component. For example, a “first” component may be referred to as a “second” component, or similarly, the “second” component may be referred to as the “first” component.
It will be understood that when a component is referred to as being “connected to” another component, the component may be directly connected or coupled to the other component or intervening components may be present.
As used herein, the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, components or a combination thereof, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Unless otherwise defined, all terms including technical or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention pertains. It will be further understood that terms, such as those defined in commonly-used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The electronic device according to various example embodiments of the present inventive concept may include or may be, for example, at least one of a camera, a mobile phone, a tablet personal computer (PC), a personal digital assistant (PDA), a portable multimedia player (PMP), an augmented reality (AR) device, a virtual reality (VR) device, various wearable devices (e.g., a smart watch, smart glasses, a smart bracelet, etc.). However, example embodiments are not limited thereto, and the electronic device according to the present inventive concept may be any electronic device having an image processing function.
As shown in the figure, the electronic device 100 may include a sensor module 110 and a processor 120.
The sensor module 110 may capture an object and generate a plurality of sensor images. The sensor images output from the sensor module are also called raw images.
The processor 120 may receive the sensor images from the sensor module 110 and perform image processing on them. The processor 120 may perform various operations on the sensor images, including pre-processing, lens shading correction (LSC), automatic white balance (AWB), and other post-processing tasks. Although various processing operations performed on the images by the processor 120 are described herein, these are merely examples, and the present inventive concept is not limited thereto.
According to example embodiments, the processor 120 may obtain a sensor image; perform a lens shading correction on the sensor image to obtain a first image; perform a weighted masking on the first image to obtain a second image; and perform an automatic white balance on the second image to obtain a third image.
The memory unit may obtain images from the outside or receive images generated by the sensor module 110. The memory unit may store images used to perform image processing tasks.
The memory unit may also store data and/or software or instructions for implementing a method of image processing according to some example embodiments. When the processor 120 executes the software or instructions, the method of image processing according to some example embodiments may be implemented. The memory unit may be implemented as a part of the processor 120 or separately from the processor 120 within the electronic device 100.
The processor 120 may be implemented as a general-purpose processor, an application processor (AP), an integrated circuit dedicated to image processing, a field programmable gate array, or a combination of hardware and software.
Referring to the figure, an example of the image processing performed by the processor 120 on the sensor images is illustrated.
According to some example embodiments, the pre-processing may include removal of dark current, linear correction, bad pixel compensation, and other tasks. The LSC corrects the pixels toward the edge of the field of view according to the physical characteristics of the lens, such that the brightness of the image is consistent across the frame. The gain of the LSC is associated with the correlated color temperature (CCT) of the image. The AWB involves calculating the gains of the RGB (red, green, and blue) channels and determining the CCT of the captured scene, based on the scene's color statistical information. Other processing after the AWB may include noise removal, sharpness enhancement, gamut mapping, etc. Although various image processing operations may be performed on the image, these are merely examples, and the present inventive concept is not limited thereto.
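By way of illustration only, the following Python sketch applies a per-pixel lens shading gain map and then derives white balance gains using a gray-world heuristic. The function names and the gray-world heuristic are assumptions for illustration and are not the specific algorithms prescribed by the example embodiments; the returned gains would simply be multiplied into the image channels.

import numpy as np

def apply_lsc(raw_rgb: np.ndarray, gain_map: np.ndarray) -> np.ndarray:
    """Multiply each pixel by its lens shading gain.

    raw_rgb:  (H, W, 3) linear sensor image.
    gain_map: (H, W, 3) or (H, W, 1) gains, typically larger toward the
              corners, looked up from calibration data for the current CCT.
    """
    return raw_rgb * gain_map

def gray_world_awb_gains(image: np.ndarray) -> np.ndarray:
    """Gray-world AWB: scale R and B so the channel means match G.

    A stand-in only; the embodiments do not specify the AWB statistics.
    """
    means = image.reshape(-1, 3).mean(axis=0)   # per-channel mean
    return means[1] / (means + 1e-9)            # gains normalized to green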
The sensor module 110 may capture a plurality of sensor images which are continuous or consecutive (for example, a plurality of frames of sensor images), and the processor 120 may sequentially perform the pre-processing, LSC, AWB and other processing for each sensor image.
The right side of the figure illustrates the feedback of the CCT: the CCT determined by the AWB processing of one frame of the sensor image is used for the LSC processing of the next frame.
In an example, the LSC processing of the first frame of the sensor image is based on the initial CCT. A CCT obtained based on the AWB processing of the first frame of the sensor image may be used as a CCT estimation value of the second frame of the sensor image and used for the LSC processing of the second frame of the sensor image. By iteratively performing the image processing shown in the figure on a plurality of sensor images, a CCT that converges to a stable value may be obtained.
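A minimal sketch of this frame-to-frame feedback is given below. Here, lsc_gain_for_cct and estimate_cct are hypothetical stand-ins for the calibration lookup and the AWB statistics, which the embodiments do not spell out; iterating process_stream over a sequence of (H, W, 3) arrays yields the corrected frames together with the evolving CCT estimate.

import numpy as np

def lsc_gain_for_cct(cct: float, shape) -> np.ndarray:
    """Hypothetical stand-in for a calibration lookup: a radial gain map
    whose corner boost varies mildly with the CCT estimate."""
    h, w = shape[0], shape[1]
    yy, xx = np.mgrid[0:h, 0:w]
    r2 = ((yy - h / 2) / (h / 2)) ** 2 + ((xx - w / 2) / (w / 2)) ** 2
    strength = 0.4 + 0.1 * (cct / 6500.0)        # toy CCT dependence
    return (1.0 + strength * r2)[..., None]      # (H, W, 1), broadcasts over RGB

def estimate_cct(image: np.ndarray) -> float:
    """Hypothetical stand-in for AWB: map the mean blue/red ratio to a
    CCT-like value (bluer scenes read as higher color temperature)."""
    means = image.reshape(-1, 3).mean(axis=0)
    return 6500.0 * float(means[2] / (means[0] + 1e-9))

def process_stream(frames, initial_cct: float = 5000.0):
    """The CCT produced by frame n's AWB drives frame n+1's LSC."""
    cct = initial_cct                            # first frame uses the initial CCT
    for raw in frames:
        corrected = raw * lsc_gain_for_cct(cct, raw.shape)  # LSC with the last CCT
        cct = estimate_cct(corrected)                        # AWB yields the next CCT
        yield corrected, cct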
When the illumination scene changes or the imaging sensor changes, the CCT used in the previous image processing step may no longer be applicable to the current scene. In this case, iterative processing is performed on several frames of images to obtain the CCT that converges to a stable state. This, however, can lead to extended processing times.
Hereinafter, a method for image processing according to example embodiments of the present disclosure will be described with reference to the accompanying drawings.
Referring to the figure, in the method for image processing according to example embodiments, a weighted masking processing may be performed between the lens shading correction processing and the automatic white balance processing.
Generally, when the LSC gain is calculated based on the CCT, an error in the CCT causes a larger LSC gain error at the corner or peripheral pixels of the image, because the lens shading is more pronounced at the corners. Conversely, a CCT error has less influence on the LSC gain at the central pixels of the image. Because the calculation of the AWB is not sensitive to the corner pixels, when the CCT is calculated by performing the AWB processing based on the result of the LSC processing, the influence of the pixels located at the corners of the image may be appropriately masked.
The weighted masking processing may be triggered based on various conditions. For example, when the light source of the scene used to capture the images changes, or the image sensor used for imaging changes, the weighted masking processing may be triggered (e.g., may be executed).
Determining whether to trigger the weighted masking processing may be based on the difference between the CCT values of two adjacent frames of images. For example, when the difference between the CCT values of two adjacent frames of images is greater than a predetermined threshold, it may be determined that the image is in a transitional state. At this time, a weighted masking processing may be performed, using a weighted mask, on the image on which the LSC processing has been performed. Hereinafter, the weighted masking processing is described in detail with reference to the accompanying figures.
Referring to the figure, at step S410, the processor 120 may obtain a sensor image.
In some example embodiments, an image on which the lens shading correction processing is to be performed may be referred to as the sensor image. In other words, the pre-processing (e.g., removal of dark current, linear correction, bad pixel compensation, etc.) may be performed on the captured image before the lens shading correction processing is performed.
At step S420, the processor 120 may perform the lens shading correction processing on the sensor image, to obtain a first image.
At step S430, the processor 120 may perform the weighted masking processing on the first image to obtain a second image.
At step S440, the processor 120 may perform the automatic white balance processing on the second image to obtain a third image.
The processor 120 may perform the lens shading correction processing, the weighted masking processing and the automatic white balance processing on each of the sensor images obtained, so as to obtain respective first images, respective second images and respective third images corresponding to respective ones of the sensor images.
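Put together, one pass of steps S420 to S440 might look like the sketch below. The inputs are assumed to be precomputed, and gray-world statistics stand in for the embodiments' unspecified AWB computation.

import numpy as np

def process_frame(sensor_image: np.ndarray,
                  lsc_gain: np.ndarray,
                  weighted_mask: np.ndarray) -> np.ndarray:
    """One frame of the method: S420 LSC, S430 weighted masking, S440 AWB.

    sensor_image: (H, W, 3) raw image; lsc_gain: (H, W, 1) or (H, W, 3)
    gains; weighted_mask: (H, W) per-pixel weights.
    """
    first_image = sensor_image * lsc_gain                   # S420: lens shading correction
    second_image = first_image * weighted_mask[..., None]   # S430: weighted masking
    means = second_image.reshape(-1, 3).mean(axis=0)        # S440: AWB statistics
    awb_gains = means[1] / (means + 1e-9)
    third_image = second_image * awb_gains                  # AWB applied to the second image
    return third_image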
Referring to the figure, in the image processing according to example embodiments, the weighted masking processing is inserted between the lens shading correction processing and the automatic white balance processing of each frame.
The method for image processing according to example embodiments of the inventive concept may increase the speed of camera stream processing by performing the weighted masking processing between the lens shading correction processing and the automatic white balance processing when the light source changes or the imaging sensor changes, such that a converged CCT (for example, a CCT corresponding to the changed image capture environment) may be obtained through a smaller number of images.
Referring to the figure, at step S431, it may be determined whether the first image is in a transitional state.
According to some example embodiments, whether the first image (for example, the first image corresponding to the current sensor image) is in the transitional state may be determined based on whether the difference of CCT estimated values between the first image corresponding to the current sensor image and the first image corresponding to the previous sensor image is greater than a predetermined threshold. In this case, the CCT estimated value of the first image corresponding to the current sensor image may be the CCT of the third image corresponding to the previous sensor image.
In response to the difference of CCT estimated values between the first image for the current sensor image and the first image for the previous sensor image being greater than the threshold, it may be determined that the first image corresponding to the current sensor image is in a transitional state (YES in S431). Therefore, at step S432, a transition weighted mask may be determined.
In response to the difference not being greater than the threshold, it may be determined that the first image corresponding to the current sensor image is in a stable state (NO in S431). Therefore, a stable weighted mask may be selected at step S433.
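The decision at step S431 thus reduces to a threshold test on the frame-to-frame CCT difference, as in the sketch below. The 300 K default is an assumed value; the embodiments leave the threshold unspecified.

def is_transitional(cct_current: float, cct_previous: float,
                    threshold: float = 300.0) -> bool:
    """S431: treat the frame as transitional when the CCT estimate jumps
    by more than a threshold between adjacent frames. Note the chaining
    described above: cct_current is the CCT of the third image obtained
    from the previous sensor image."""
    return abs(cct_current - cct_previous) > threshold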
At step S434, a second image may be generated based on the weighted mask and the first image.
According to example embodiments, the weighted mask may include a plurality of weights, each of which corresponds to one of a plurality of pixels in the first image. Each of pixel values of the second image may be calculated based on a respective weight among the weights in the weighted mask and a pixel value of a respective pixel in the first image. For example, each of the pixel values of the second image may be obtained by multiplying a respective weight in the weighted mask with the pixel value of a corresponding pixel in the first image. The pixel values of the second image may be used for subsequent AWB processing.
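In code, this per-pixel weighting is a single element-wise multiplication, as in the sketch below, assuming the mask holds one weight per pixel.

import numpy as np

def apply_weighted_mask(first_image: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """S434: each pixel value of the second image is the product of the
    corresponding first-image pixel value and its weight in the mask."""
    return first_image * mask[..., None]  # one weight per pixel, broadcast over RGB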
In response to determining that the first image is in the stable state, the stable weighted mask may be selected. In the stable weighted mask, the value of each of the weights is 1.
In response to determining that the first image is in the transitional state, the transition weighted mask may be determined. In the transition weighted mask, the value of each of the weights ranges from 0 to 1.
The LSC gain error due to a wrong CCT has less influence on the pixels located in the central position of the image. Therefore, in the transition weighted mask, a value of the weight corresponding to a pixel located in the central position of the image may be large, and a value of the weight corresponding to a pixel located in the peripheral position of the image may be small.
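One plausible construction of such a mask, assuming a radial falloff (the embodiments do not prescribe a specific weight profile), is:

import numpy as np

def make_transition_mask(h: int, w: int, corner_weight: float = 0.2) -> np.ndarray:
    """A hypothetical transition mask: weight 1.0 at the image center,
    falling off toward corner_weight at the corners, so the corner pixels,
    where a wrong CCT distorts the LSC gain most, contribute little to
    the subsequent AWB statistics."""
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r = np.sqrt(((yy - cy) / (h / 2.0)) ** 2 + ((xx - cx) / (w / 2.0)) ** 2)
    r = np.clip(r / np.sqrt(2.0), 0.0, 1.0)   # 0 at the center, ~1 at the corners
    return 1.0 - (1.0 - corner_weight) * r    # weights in [corner_weight, 1.0]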
According to an example embodiment, in response to the first image being in the transitional state, transition weighted masks may be sequentially selected from a predetermined set of weighted masks. The selected transition weighted mask may be used for each of a plurality of consecutive or continuous first images, and a transition weighted mask of the first image corresponding to the current sensor image may be different from that of the first image corresponding to the previous sensor image.
For example, in response to a previous first image being in the stable state and the current first image being in the transitional state, a first weighted mask may be selected from the predetermined set of weighted masks.
In another example, the weighted mask may be determined according to the difference of estimated CCT values between the first image corresponding to the current sensor image and the first image corresponding to the previous sensor image.
For example, a plurality of weighted masks, which correspond to different differences of CCTs respectively, may be determined in advance. When the difference of estimated CCT values between two first images is obtained at step S431, a weighted mask corresponding to the determined difference may be selected at step S432. In one example, the greater the difference in the estimated CCT values, the smaller the weight assigned to the peripheral pixels in the weighted mask.
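A sketch of this difference-indexed selection is given below, reusing the hypothetical make_transition_mask helper from the earlier sketch; the thresholds and corner weights in the table are assumptions for illustration only.

import numpy as np

# Hypothetical correspondence between CCT-difference ranges and corner weights.
MASK_TABLE = [
    (200.0, 0.8),          # small CCT jump:  mild corner suppression
    (500.0, 0.5),          # moderate jump:   stronger suppression
    (float("inf"), 0.2),   # large jump:      corners almost fully masked
]

def mask_for_cct_difference(cct_diff: float, h: int, w: int) -> np.ndarray:
    """Pick the corner weight from the difference of estimated CCT values,
    then build the radial transition mask."""
    for upper_limit, corner_weight in MASK_TABLE:
        if abs(cct_diff) <= upper_limit:
            return make_transition_mask(h, w, corner_weight)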
As shown in the figure, the mobile terminal 1000 may include a communication unit 1010, an input unit 1020, an image processing unit 1030, a display unit 1040, a storage unit 1050, a control unit 1060, and an image sensor 1070.
The communication unit 1010 may perform a communication operation of the mobile terminal 1000. The communication unit 1010 may establish a communication channel to the communication network and/or may perform a communication associated with, for example, a voice call, a video call, and/or a data call.
The input unit 1020 is configured to receive various input information and various control signals, and to transmit the input information and control signals to the control unit 1060. The input unit 1020 may be realized by various input devices such as keypads and/or keyboards, touch screens and/or styluses, mice, etc. However, example embodiments are not limited thereto.
The image processing unit 1030 is connected to the image sensor 1070. The image sensor 1070 may capture images and transmit the captured images to the image processing unit 1030. The image processing unit 1030 may perform image processing on the images (for example, using the method of image processing described above).
The display unit 1040 is used to display various information, and may be realized, for example, by a touch screen. However, example embodiments are not limited thereto.
The storage unit 1050 may include volatile memory and/or nonvolatile memory. The storage unit 1050 may store various data generated and used by the mobile terminal 1000. For example, the storage unit 1050 may store an operating system and applications (e.g., applications associated with the methods according to example embodiments of the inventive concept) for controlling the operation of the mobile terminal 1000.

The control unit 1060 may control the overall operation of the mobile terminal 1000 and may control part or all of the internal elements of the mobile terminal 1000. The control unit 1060 may be implemented as a general-purpose processor, an application processor (AP), an application specific integrated circuit, a field programmable gate array, etc., but example embodiments are not limited thereto.
In some example embodiments, the image processing unit 1030 and the control unit 1060 may be implemented by the same device and/or integrated in a single chip.
The apparatuses, units, modules, devices, and other components described herein may be implemented by hardware components. Examples of hardware components that may be used to perform the operations described in this application include controllers, sensors, generators, drivers, memories, comparators, arithmetic logic units, adders, subtractors, multipliers, dividers, integrators, and any other electronic components configured to perform the operations described in this application. In other examples, one or more of the hardware components that perform the operations described in this application may be implemented by computing hardware, for example, by one or more processors or computers. A processor or computer may be implemented by one or more processing elements, such as an array of logic gates, a controller and an arithmetic logic unit, a digital signal processor, a microcomputer, a programmable logic controller, a field-programmable gate array, a programmable logic array, a microprocessor, or any other device or combination of devices that is configured to respond to and execute instructions in a defined manner to achieve a desired result. In one example, a processor or computer includes, or is connected to, one or more memories storing instructions or software that are executed by the processor or computer. Hardware components implemented by a processor or computer may execute instructions or software, such as an operating system (OS) and one or more software applications that run on the OS, to perform the operations described in this application. The hardware components may also access, manipulate, process, create, and store data in response to execution of the instructions or software. For simplicity, the singular term “processor” or “computer” may be used to describe the examples in this application, but in other examples multiple processors or computers may be used, or a processor or computer may include multiple processing elements, or multiple types of processing elements, or both. For example, a single hardware component or two or more hardware components may be implemented by a single processor, or two or more processors, or a processor and a controller. One or more hardware components may be implemented by one or more processors, or a processor and a controller, and one or more other hardware components may be implemented by one or more other processors, or another processor and another controller. One or more processors, or a processor and a controller, may implement a single hardware component, or two or more hardware components. A hardware component may have any one or more of different processing configurations, examples of which include a single processor, independent processors, parallel processors, a single-instruction single-data (SISD) multiprocessor, a single-instruction multiple-data (SIMD) multiprocessor, a multiple-instruction single-data (MISD) multiprocessor, and a multiple-instruction multiple-data (MIMD) multiprocessor.
The methods that perform the operations described in this application may be performed by computing hardware, for example, by one or more processors or computers, implemented as described above executing instructions or software to perform the operations described in this application that are performed by the methods. For example, a single operation or two or more operations may be performed by a single processor, or two or more processors, or a processor and a controller. One or more operations may be performed by one or more processors, or a processor and a controller, and one or more other operations may be performed by one or more other processors, or another processor and another controller. One or more processors, or a processor and a controller, may perform a single operation, or two or more operations.
Instructions or software to control a processor or computer to implement the hardware components and perform the methods as described above may be written as computer programs, code segments, instructions or any combination thereof, for individually or collectively instructing or configuring the processor or computer to operate as a machine or special-purpose computer to execute the operations performed by the hardware components and the methods as described above. In one example, the instructions and/or software include machine code that is directly executed by the processor or computer, such as machine code produced by a compiler. In another example, the instructions or software include higher-level code that is executed by the processor or computer using an interpreter. Persons and/or programmers of ordinary skill in the art may readily write the instructions and/or software based on the block diagrams and the flow charts illustrated in the drawings and the corresponding descriptions in the specification, which disclose algorithms for performing the operations performed by the hardware components and the methods as described above.
The instructions or software to control a processor or computer to implement the hardware components and perform the methods as described above, and any associated data, data files, and data structures, may be recorded, stored, or fixed in or on one or more non-transitory computer-readable storage media. Examples of a non-transitory computer-readable storage medium include at least one of read-only memory (ROM), programmable read only memory (PROM), electrically erasable programmable read-only memory (EEPROM), random-access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), flash memory, non-volatile memory, CD-ROMs, CD-Rs, CD+Rs, CD-RWs, CD+RWs, DVD-ROMs, DVD-Rs, DVD+Rs, DVD-RWs, DVD+RWs, DVD-RAMs, BD-ROMs, BD-Rs, BD-R LTHs, BD-REs, Blu-ray or optical disk storage, hard disk drive (HDD), solid state drive (SSD), a card type memory such as a multimedia card or a micro card (for example, secure digital (SD) or extreme digital (XD)), magnetic tapes, floppy disks, magneto-optical data storage devices, optical data storage devices, hard disks, solid-state disks, and any other device that is configured to store the instructions or software and any associated data, data files, and data structures in a non-transitory manner and provide the instructions or software and any associated data, data files, and data structures to a processor or computer so that the processor or computer can execute the instructions.
While various example embodiments of the inventive concept have been described, it will be apparent to one of ordinary skill in the art that various changes in form and detail may be made thereto without departing from the spirit and scope of inventive concept as set forth in the claims.