The present disclosure relates to the technical field of image processing, and in particular to an image processing method, an image processing system, an electronic device, and a readable storage medium.
Electronic devices such as mobile phones may be provided with a camera to realize a camera function. An image sensor for receiving light may be arranged in the camera. A filter array may be provided in the image sensor. In order to improve the signal-to-noise ratio of images acquired by electronic devices such as mobile phones, an image sensor with a four-in-one pixel arrangement may be used.
Embodiments of the present disclosure provide a method and system for image processing, an electronic device, and a computer-readable storage medium.
In a first aspect, a method for image processing is provided according to some embodiments of the present disclosure. The method is for an image sensor, the image sensor comprises a pixel array, the pixel array comprises a plurality of panchromatic photosensitive pixels and a plurality of colorful photosensitive pixels, the plurality of colorful photosensitive pixels comprise a first color photosensitive pixel, a second color photosensitive pixel and a third color photosensitive pixel with different spectral responses, the plurality of colorful photosensitive pixels have a narrower spectral response than the plurality of panchromatic photosensitive pixels, and both the first color photosensitive pixels and the third color photosensitive pixels have a narrower spectral response than the second color photosensitive pixels; the pixel array comprises a plurality of minimal repeating units, each of the minimal repeating units comprises a plurality of subunits, and each of the subunits comprises at least one single-color photosensitive pixel and at least one panchromatic photosensitive pixel. The method for image processing comprises: obtaining an original image by exposing the pixel array, the image pixels in the original image comprising colorful image pixels and panchromatic image pixels; obtaining a colorful image based on all the colorful image pixels in each subunit, and obtaining a panchromatic image based on all the panchromatic image pixels in each subunit, the image pixels in the colorful image being in a Bayer array arrangement; and obtaining a first color target image, a second color target image and a third color target image by processing the colorful image based on the panchromatic image, wherein all image pixels in the first color target image are first color image pixels, all image pixels in the second color target image are second color image pixels, and all image pixels in the third color target image are third color image pixels.
In a second aspect, a system for image processing is provided according to some embodiments of the present disclosure. The system comprises: an image sensor, the image sensor comprises a pixel array, the pixel array comprises a plurality of panchromatic photosensitive pixels and a plurality of colorful photosensitive pixels, the plurality of colorful photosensitive pixels comprise a first color photosensitive pixel, a second color photosensitive pixel and a third color photosensitive pixel with different spectral responses, the plurality of colorful photosensitive pixels have a narrower spectral response than the plurality of panchromatic photosensitive pixels, and both the first color photosensitive pixels and the third color photosensitive pixels have a narrower spectral response than the second color photosensitive pixels; the pixel array comprises a plurality of minimal repeating units, each of the minimal repeating units comprises a plurality of subunits, and each of the subunits comprises at least one single-color photosensitive pixel and at least one panchromatic photosensitive pixel; and a processor, the processor is configured to: obtain an original image by exposing the pixel array, the image pixels in the original image comprising colorful image pixels and panchromatic image pixels; obtain a colorful image based on all the colorful image pixels in each subunit, and obtain a panchromatic image based on all the panchromatic image pixels in each subunit, the image pixels in the colorful image being in a Bayer array arrangement; and obtain a first color target image, a second color target image and a third color target image by processing the colorful image based on the panchromatic image, wherein all image pixels in the first color target image are first color image pixels, all image pixels in the second color target image are second color image pixels, and all image pixels in the third color target image are third color image pixels.
In a third aspect, an electronic device is provided according to some embodiments of the present disclosure. The electronic device comprises a lens, a housing, and the system for image processing described above; the lens and the system for image processing are coupled to the housing, and the lens and the image sensor of the system for image processing cooperate to form an image.
In a fourth aspect, a non-transitory computer-readable storage medium storing computer instructions is provided according to some embodiments of the present disclosure, when the computer instructions are executed by a processor, the processor executes the method for image processing described above.
The above additional aspects and advantages of the present disclosure will become apparent and easily understood from the description of the embodiments in conjunction with the following drawings, wherein:
Embodiments of the present disclosure are described in detail below, examples of which are shown in the drawings, wherein the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the drawings are examples, are only for explaining the embodiments of the present disclosure, and should not be construed as limiting the embodiments of the present disclosure.
Please refer to
At block 01: obtaining an original image by exposing the pixel array 11, the image pixels in the original image comprising colorful image pixels and panchromatic image pixels;
At block 02: obtaining a colorful image based on all the colorful image pixels in the subunit, and obtaining a panchromatic image based on all the panchromatic image pixels in the subunit, the image pixels in the colorful image being in a Bayer array arrangement; and
At block 03: obtaining a first color target image, a second color target image and a third color target image by processing the colorful image based on the panchromatic image, wherein all image pixels in the first color target image are first color image pixels, all image pixels in the second color target image are second color image pixels, and all image pixels in the third color target image are third color image pixels.
Please refer to
At block 04: obtaining a first color initial image A, a second color initial image B, and a third color initial image C by separating the first color image pixels, the second color image pixels, and the third color image pixels in the colorful image. At block 03, the obtaining a first color target image, a second color target image and a third color target image by processing the colorful image based on the panchromatic image comprises:
At block 031: if the panchromatic image pixel W0 in the panchromatic image corresponding to the image pixel D0 to be updated in the second color initial image is not overexposed, calculating an updated pixel value of the image pixel D0 to be updated based on the panchromatic image and the second color initial image, and identifying the updated pixel value of the image pixel to be updated as a target pixel value of the second color pixel; if the panchromatic image pixel W0 in the panchromatic image corresponding to the image pixel D0 to be updated in the second color initial image is overexposed, calculating an updated pixel value of the image pixel to be updated based on the first color initial image and the second color initial image, or based on the third color initial image and the second color initial image, and identifying the updated pixel value of the image pixel to be updated as the target pixel value of the second color pixel; and obtaining the second color target image when the target pixel values of the second color pixels corresponding to all image pixels in the second color initial image are obtained;
At block 032: obtaining the first color target image by processing the second color target image and the first color initial image, and obtaining the third color target image by processing the second color target image and the third color initial image.
Please refer to
At block 0311: selecting a first window C1 centered on the image pixel D0 to be updated in the second color initial image, and selecting a second window C2 corresponding to the first window C1 in the panchromatic image, wherein the panchromatic image pixel at the center of the second window C2 corresponds to the image pixel D0 to be updated and is identified as a mapped panchromatic image pixel W0;
At block 0312: obtaining a first matrix I1 based on pixel information of all image pixels in the first window C1;
At block 0313: obtaining a second matrix I2 based on the pixel values of the mapped panchromatic image pixels W0, the pixel values of all panchromatic image pixels W in the second window C2, the first matrix I1, and a preset weight function F(x); and
At block 0314: obtaining the updated target pixel value of the second color pixel of the image pixel D0 to be updated based on the pixel values of the mapped panchromatic image pixels W0, the pixel values of all image pixels in the first window C1, the pixel values of all panchromatic image pixels W in the second window C2, and the second matrix I2.
Please refer to
At block 03121: mapping the array arrangement of the image pixels in the first window C1 to the array arrangement of the first matrix I1;
At block 03122: identifying the value in the first matrix I1 corresponding to the null pixel in the first window C1 as a first value; and
At block 03123: identifying the value of the position in the first matrix I1 corresponding to the second color image pixel B in the first window as a second value.
In some embodiments, the image pixels in the first window and the second window are arranged in M*M, M representing an odd number and the first matrix being arranged in M*M.
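As an illustrative sketch only (not part of the disclosed embodiments), blocks 03121-03123 can be modeled as building an M*M matrix that mirrors the first window: positions of null pixels N receive the first value and positions of second color image pixels B receive the second value. The concrete values 0 and 1 and the function name `build_first_matrix` are assumptions for illustration; the disclosure does not fix them in this excerpt.

```python
import numpy as np

def build_first_matrix(first_window, first_value=0, second_value=1):
    """Map an M*M first window C1 of the second color initial image to the
    first matrix I1: null pixels N -> first value, second color image
    pixels B -> second value. `None` marks a null pixel here."""
    m = len(first_window)
    i1 = np.full((m, m), first_value, dtype=float)  # null pixels -> first value
    for r in range(m):
        for c in range(m):
            if first_window[r][c] is not None:      # second color pixel B present
                i1[r, c] = second_value
    return i1
```

With B pixels on the diagonal of a 3*3 window, I1 becomes the corresponding 0/1 mask.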
Please refer to
At block 03131: mapping the array arrangement of the image pixels in the second window C2 to the array arrangement of the second matrix I2;
At block 03132: obtaining the deviation value L1 corresponding to the position of the panchromatic image pixel W in the second matrix I2 based on the pixel value of each panchromatic image pixel W in the second window C2 and the mapped pixel value of the panchromatic image pixel W0;
At block 03133: obtaining the value of the corresponding position in the second matrix I2 based on the deviation value L1, the preset weight function F(x), and the value of the same position in the first matrix I1.
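A minimal sketch of blocks 03131-03133, under stated assumptions: the deviation L1 is taken as the absolute difference between each panchromatic pixel W and the mapped pixel W0, the weight function F(x) is a Gaussian, and I2 combines F(L1) with I1 by elementwise product. None of these specific choices is fixed by the disclosure in this excerpt.

```python
import numpy as np

def build_second_matrix(second_window, w0, i1, weight_fn):
    """Second matrix I2 sketch: deviation L1 of each panchromatic pixel W
    in the second window C2 from the mapped pixel value W0, passed through
    the preset weight function F(x) and combined with the first matrix I1."""
    sw = np.asarray(second_window, dtype=float)
    l1 = np.abs(sw - w0)                      # assumed: deviation = |W - W0|
    return weight_fn(l1) * np.asarray(i1, dtype=float)  # assumed: elementwise product

# Illustrative weight function F(x); sigma is an assumed parameter.
gaussian = lambda x, sigma=8.0: np.exp(-(x ** 2) / (2 * sigma ** 2))
```

When every W equals W0, F(0) = 1 and I2 reduces to I1, so only the B-pixel positions carry weight.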
Please refer to
At block 03141, calculating the first weighted value M1 based on the first window matrix N1 formed by the pixel values of all image pixels in the first window C1 and the second matrix I2, and calculating a second weighted value M2 based on a second window matrix N2 formed by pixel values of all image pixels in the second window C2 and the second matrix I2;
At block 03142, obtaining the updated target pixel value of the second color pixel of the image pixel D0 to be updated based on the mapped pixel value of the panchromatic image pixel W0, the first weighted value M1, and the second weighted value M2.
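Blocks 03141-03142 can be sketched as two weighted sums followed by a combination with W0. The specific combination below (scaling W0 by the ratio of the weighted color sum M1 to the weighted panchromatic sum M2) is an assumption for illustration; the disclosure only states that the update depends on W0, M1, and M2.

```python
import numpy as np

def update_pixel(w0, first_window_vals, second_window_vals, i2, eps=1e-6):
    """Sketch: M1 = sum(N1 * I2) over the first window matrix N1,
    M2 = sum(N2 * I2) over the second window matrix N2, then the updated
    second color value is assumed to be W0 * M1 / M2."""
    n1 = np.asarray(first_window_vals, dtype=float)
    n2 = np.asarray(second_window_vals, dtype=float)
    m1 = float(np.sum(n1 * i2))
    m2 = float(np.sum(n2 * i2))
    return w0 * m1 / (m2 + eps)  # eps guards against an all-zero weight sum
```

In the flat case (color window all 50, panchromatic window all 100, W0 = 100), the update recovers the color level 50, which is the behavior a luminance-guided update should show on uniform regions.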
Please refer to
At block 0315: if the image pixel D0 to be updated is the second color image pixel B, identifying the original pixel value of the second color image pixel B as the updated target pixel value D1 of the second color pixel of the image pixel D0 to be updated;
At block 0316: if the position corresponding to the image pixel D0 to be updated in the first color initial image is the first color image pixel A, performing interpolation processing on the second color initial image based on the first color initial image, and obtaining the updated target pixel value D1 of the second color pixel of the image pixel D0 to be updated;
At block 0317: if the position corresponding to the image pixel D0 to be updated in the third color initial image is the third color image pixel C, performing interpolation processing on the second color initial image based on the third color initial image, and obtaining the updated target pixel value D1 of the second color pixel of the image pixel D0 to be updated.
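For the overexposed fallback of blocks 0316-0317, one common way to interpolate a second color value guided by another color plane is color-difference interpolation, sketched below. This is an assumed interpolation scheme, not the one claimed; `guide_img` stands in for a dense version of the first (or third) color initial image, and NaN marks null pixels in the second color plane.

```python
import numpy as np

def interpolate_second_color(b_img, guide_img, r, c):
    """Assumed sketch: estimate the second color value at (r, c) as the
    guide value plus the mean color difference (B - guide) over the valid
    4-neighbors holding second color pixels."""
    h, w = guide_img.shape
    diffs = []
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < h and 0 <= cc < w and not np.isnan(b_img[rr, cc]):
            diffs.append(b_img[rr, cc] - guide_img[rr, cc])
    corr = sum(diffs) / len(diffs) if diffs else 0.0
    return guide_img[r, c] + corr
```

On a flat guide with B-neighbors all equal, the interpolated value matches the neighbors, as expected of any reasonable scheme.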
Please refer to
At block 0321: obtaining the first color target image by performing a bilateral filtering process on the first color initial image based on the second color target image, and obtaining the third color target image by performing a bilateral filtering process on the third color initial image based on the second color target image.
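Block 0321 describes filtering one color plane while using another image as the guide; a standard realization of this is the joint (cross) bilateral filter, sketched below under assumed parameters (`radius`, `sigma_s`, `sigma_r` are illustrative, and NaN marks null pixels in the plane being filtered).

```python
import numpy as np

def joint_bilateral(sparse, guide, radius=2, sigma_s=1.5, sigma_r=10.0):
    """Joint bilateral sketch: spatial weight from pixel distance, range
    weight from differences in the guide image (the second color target
    image in block 0321), applied to the valid pixels of `sparse`."""
    h, w = guide.shape
    out = np.zeros_like(guide, dtype=float)
    for r in range(h):
        for c in range(w):
            acc = wsum = 0.0
            for dr in range(-radius, radius + 1):
                for dc in range(-radius, radius + 1):
                    rr, cc = r + dr, c + dc
                    if not (0 <= rr < h and 0 <= cc < w):
                        continue
                    if np.isnan(sparse[rr, cc]):  # skip null pixels
                        continue
                    ws = np.exp(-(dr * dr + dc * dc) / (2 * sigma_s ** 2))
                    wr = np.exp(-((guide[r, c] - guide[rr, cc]) ** 2)
                                / (2 * sigma_r ** 2))
                    acc += ws * wr * sparse[rr, cc]
                    wsum += ws * wr
            out[r, c] = acc / wsum if wsum > 0 else 0.0
    return out
```

Because the range weight comes from the guide, edges present in the second color target image are preserved in the filtered first/third color planes.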
Please refer to
At block 05: obtaining a processed colorful image by performing colorful image processing on the colorful image, and obtaining a processed panchromatic image by performing panchromatic image processing on the panchromatic image. At block 03, the obtaining a fully arranged first color target image, a fully arranged second color target image and a fully arranged third color target image by processing the colorful image based on the panchromatic image comprises:
At block 033: obtaining a first color target image, a second color target image and a third color target image by processing the processed colorful image based on the processed panchromatic image.
Please refer to
At block 06: obtaining a color converted target image by performing color conversion based on the first color target image, the second color target image and the third color target image.
Please refer to
Please refer to
Please refer to
Please refer to
In some embodiments, the image pixels in the first window C1 and the second window C2 are arranged in M*M, M representing an odd number and the first matrix I1 being arranged in M*M.
Please refer to
Please refer to
Please refer to
Please refer to
Please refer to
Please refer to
Please refer to
Please refer to
In the method for image processing and the system 100 for image processing in the embodiments of the present disclosure, all colorful image pixels in the same subunit are fused into a colorful image arranged in a Bayer array, and all panchromatic image pixels W in the same subunit are fused into a panchromatic image; the panchromatic image and the colorful image are then fused directly, so that the first color target image, the second color target image and the third color target image, which contain panchromatic image information and are fully arranged, can be output directly. In this way, the resolution and signal-to-noise ratio of the image can be improved, and the overall photographing effect can be improved.
In some embodiments, the image sensor 10 may use a complementary metal oxide semiconductor (CMOS) photosensitive element or a charge-coupled device (CCD) photosensitive element.
In some embodiments, the pixel array 11 comprises a plurality of photosensitive pixels 110 (shown in
In some embodiments, the vertical driving unit 12 comprises a shift register and an address decoder. The vertical driving unit 12 performs readout scanning and reset scanning functions. Readout scanning refers to sequentially scanning the photosensitive pixels 110 row by row, and reading signals from these photosensitive pixels 110 row by row. In some embodiments, the signal output by each photosensitive pixel 110 in the selected and scanned photosensitive pixel row is transmitted to the column processing unit 14. Reset scanning is for resetting the charge: the photo charge of the photoelectric conversion element is discarded so that accumulation of new photo charge can be started.
In some embodiments, the signal processing performed by the column processing unit 14 is correlated double sampling (CDS) processing. In the CDS process, the reset level and signal level output from each photosensitive pixel 110 in the selected photosensitive pixel row are taken out, and the level difference is calculated. Thus, signals of photosensitive pixels 110 in one row are obtained. The column processing unit 14 may have an analog-to-digital (A/D) conversion function for converting analog pixel signals into digital format.
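The CDS computation described above reduces to a per-pixel level difference; a minimal sketch follows. Taking the output as reset level minus signal level follows the usual CMOS convention (the pixel output drops as charge accumulates); the disclosure itself only specifies that the level difference is calculated.

```python
def correlated_double_sampling(reset_levels, signal_levels):
    """CDS sketch: for each photosensitive pixel in the selected row, the
    output signal is the difference between its sampled reset level and
    signal level, cancelling offsets common to both samples."""
    return [reset - signal for reset, signal in zip(reset_levels, signal_levels)]
```

A row of reset levels [2.0, 2.1] V and signal levels [1.2, 1.0] V yields pixel signals of about 0.8 V and 1.1 V, which the column processing unit 14 may then A/D convert.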
In some embodiments, the horizontal driving unit 15 comprises a shift register and an address decoder. The horizontal driving unit 15 sequentially scans the pixel array 11 column by column. Through the selection scanning operation performed by the horizontal driving unit 15, each photosensitive pixel column is sequentially processed by the column processing unit 14, and is sequentially output.
In some embodiments, the control unit 13 configures timing signals according to the operation mode, and uses various timing signals to control the vertical driving unit 12, the column processing unit 14 and the horizontal driving unit 15 to work together.
As illustrated in
In some embodiments, the photoelectric conversion element 1111 includes a photodiode, and the anode of the photodiode is connected to the ground. Photodiodes convert received light into electric charges. The cathode of the photodiode is connected to the floating diffusion unit FD via an exposure control circuit (eg, transfer transistor 1112). The floating diffusion unit FD is connected to the gate of the amplification transistor 1114 and the source of the reset transistor 1113.
In some embodiments, the exposure control circuit is the transfer transistor 1112, and the control terminal TG of the exposure control circuit is the gate of the transfer transistor 1112. When a pulse of an active level (for example, VPIX level) is transmitted to the gate of the transfer transistor 1112 through the exposure control line, the transfer transistor 1112 is turned on. The transfer transistor 1112 transfers the photoelectrically converted charge of the photodiode to the floating diffusion unit FD.
In some embodiments, the drain of the reset transistor 1113 is connected to the pixel power supply VPIX. The source of the reset transistor 1113 is connected to the floating diffusion unit FD. Before the charge is transferred from the photodiode to the floating diffusion unit FD, a pulse of effective reset level is transmitted to the gate of the reset transistor 1113 via the reset line, and the reset transistor 1113 is turned on. The reset transistor 1113 resets the floating diffusion unit FD to the pixel power supply VPIX.
In some embodiments, the gate of the amplification transistor 1114 is connected to the floating diffusion unit FD. The drain of the amplification transistor 1114 is connected to the pixel power supply VPIX. After the floating diffusion unit FD is reset by the reset transistor 1113, the amplification transistor 1114 outputs a reset level through the output terminal OUT via the selection transistor 1115. After the charge of the photodiode is transferred by the transfer transistor 1112, the amplification transistor 1114 outputs a signal level through the output terminal OUT via the selection transistor 1115.
In some embodiments, the drain of the selection transistor 1115 is connected to the source of the amplification transistor 1114. The source of the selection transistor 1115 is connected to the column processing unit 14 in
The pixel structure of the pixel circuit 111 in the embodiment of the present disclosure is not limited to the structure shown in
In some embodiments,
W represents a panchromatic photosensitive pixel; A represents a first color photosensitive pixel among a plurality of colorful photosensitive pixels; B represents a second color photosensitive pixel among a plurality of colorful photosensitive pixels; C represents a third color photosensitive pixel among a plurality of colorful photosensitive pixels.
In some embodiments, as shown in
In some embodiments, as shown in
In some embodiments, the first diagonal direction D1 may also be the direction connecting the upper right corner and the lower left corner, and the second diagonal direction D2 may also be the direction connecting the upper left corner and the lower right corner. In addition, the "direction" here is not a single point, but can be understood as the concept of a "straight line" indicating the arrangement, and such a line can point in both directions at its two ends. The explanation of the first diagonal direction D1 and the second diagonal direction D2 in
In some embodiments,
W represents a panchromatic photosensitive pixel; A represents a first color photosensitive pixel among a plurality of colorful photosensitive pixels; B represents a second color photosensitive pixel among a plurality of colorful photosensitive pixels; C represents a third color photosensitive pixel among a plurality of colorful photosensitive pixels.
In some embodiments, as shown in
In some embodiments, as shown in
In some embodiments,
W represents a panchromatic photosensitive pixel; A represents a first color photosensitive pixel among a plurality of colorful photosensitive pixels; B represents a second color photosensitive pixel among a plurality of colorful photosensitive pixels; C represents a third color photosensitive pixel among a plurality of colorful photosensitive pixels.
In some embodiments, as shown in
In some embodiments, as shown in
In some embodiments, in the minimum repeating unit shown in
In some embodiments, in the minimum repeating unit shown in
In some embodiments, in the minimum repeating unit shown in
In some embodiments, the response wavelength band of the panchromatic photosensitive pixel W may be a visible light band (for example, 400 nm-760 nm). For example, an infrared filter is arranged on the panchromatic photosensitive pixel W to filter out infrared light. In some other embodiments, the response bands of the panchromatic photosensitive pixels W are visible light bands and near-infrared bands (for example, 400 nm-1000 nm), and the photoelectric conversion elements 1111 (shown in
For the convenience of description, the following embodiments assume that the first color photosensitive pixels A are red photosensitive pixels R, the second color photosensitive pixels B are green photosensitive pixels G, and the third color photosensitive pixels C are blue photosensitive pixels Bu.
Please refer to
In some embodiments, please refer to
The processor 20 takes the average of the pixel values of the colorful image pixel P1(1,1) and the colorful image pixel P1(2,2) in the subunit U1 as the pixel value of the fused colorful image pixel P2(1,1), which is located in the first row and first column of the colorful image. In the same manner, the processor 20 takes the average of the pixel values of P1(1,3) and P1(2,4) in the subunit U2 as the pixel value of P2(1,2) in the first row and second column; the average of P1(1,5) and P1(2,6) in the subunit U3 as the pixel value of P2(1,3) in the first row and third column; the average of P1(1,7) and P1(2,8) in the subunit U4 as the pixel value of P2(1,4) in the first row and fourth column; the average of P1(1,9) and P1(2,10) in the subunit U5 as the pixel value of P2(1,5) in the first row and fifth column; the average of P1(1,11) and P1(2,12) in the subunit U6 as the pixel value of P2(1,6) in the first row and sixth column; the average of P1(1,13) and P1(2,14) in the subunit U7 as the pixel value of P2(1,7) in the first row and seventh column; and the average of P1(1,15) and P1(2,16) in the subunit U8 as the pixel value of P2(1,8) in the first row and eighth column. At this point, the processor 20 has fused the colorful image pixels of the multiple subunits in the first row of the original image. Subsequently, the processor 20 fuses the multiple colorful image pixels corresponding to the multiple subunits in the second row to obtain the corresponding fused colorful image pixels; the specific method is the same as that for the first row and will not be repeated here. By analogy, the processor 20 continues until the colorful image pixels of the multiple subunits in the eighth row of the original image have been fused. In this way, all the colorful image pixels in the same subunit are fused to obtain one fused colorful image pixel, and the plurality of fused colorful image pixels are arranged to form a colorful image. The colorful image pixels in the colorful image are arranged in a Bayer array.
Of course, the processor 20 may also average the colorful image pixels in multiple subunits at the same time to obtain multiple fused colorful image pixels, and then arrange the multiple fused colorful image pixels to generate a colorful image, which is not limited here.
Since each fused colorful image pixel in the colorful image is obtained by averaging all the colorful image pixels in the same subunit of the original image, that is, the mean value of all the colorful image pixels in the same subunit of the original image is used as the pixel value of the corresponding fused colorful image pixel, the obtained colorful image has a larger dynamic range than the original image, so that the dynamic range of the images obtained from subsequent processing of the colorful image can also be expanded.
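The per-subunit averaging described above can be sketched compactly. This illustration assumes a 2*2 subunit whose two colorful pixels and two panchromatic pixels W each lie on one diagonal, matching the P1(1,1)/P1(2,2) example; `color_mask` marks the colorful pixel positions and is an illustrative device, not a claimed structure.

```python
import numpy as np

def fuse_subunits(original, color_mask):
    """Blocks 01-02 sketch: for each 2x2 subunit of the original image,
    average its colorful pixels into one fused colorful image pixel and its
    panchromatic pixels into one panchromatic image pixel, yielding a
    Bayer-arranged colorful image and a panchromatic image of equal size."""
    h, w = original.shape
    color = np.zeros((h // 2, w // 2))
    pan = np.zeros((h // 2, w // 2))
    for r in range(0, h, 2):
        for c in range(0, w, 2):
            block = original[r:r + 2, c:c + 2]
            mask = color_mask[r:r + 2, c:c + 2]
            color[r // 2, c // 2] = block[mask].mean()   # mean of colorful pixels
            pan[r // 2, c // 2] = block[~mask].mean()    # mean of W pixels
    return color, pan
```

For one subunit with colorful values 10 and 20 on the diagonal and W values of 100, the fused colorful pixel is 15 and the panchromatic pixel is 100.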
In some embodiments, referring to
Please refer to
At block 04: the first color image pixel A, the second color image pixel B, and the third color image pixel C in the colorful image are separated to obtain a first color initial image, a second color initial image, and a third color initial image.
Please refer to
In some embodiments, referring to
The following takes the acquisition of the first color initial image as an example. After the processor 20 obtains the colorful image, the processor 20 extracts the first color image pixels A in the colorful image and sets each extracted first color image pixel A at the corresponding position of the first color initial image. For example, if the extracted first color image pixel A is located in the first row and first column of the colorful image, the processor 20 sets the first color image pixel A in the first row and first column of the first color initial image; the processor 20 then extracts the next first color image pixel A in the colorful image, and repeats the above operations until all the first color image pixels A in the colorful image have been extracted once. The processor 20 then places a null pixel N at every position of the first color initial image where no first color image pixel A is set. It should be noted that the null pixel N (NULL) is neither a panchromatic pixel nor a colorful pixel; the position of the null pixel N in the first color initial image can be regarded as having no pixel at this position, or the pixel value of the null pixel N can be treated as zero. The specific implementation manner for the processor 20 to obtain the second color initial image and the third color initial image is the same as that for obtaining the first color initial image, and will not be described one by one.
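The separation at block 04 amounts to copying one color's pixels into place and marking every other position as a null pixel N; a minimal sketch follows. NaN is used here to represent null pixels (the disclosure allows treating them as "no pixel" or as zero), and `bayer_pattern` is an illustrative label array, not a claimed structure.

```python
import numpy as np

def separate_channel(color_img, bayer_pattern, channel):
    """Block 04 sketch: keep the pixels of one color at their positions in
    the colorful image and place null pixels N (NaN) everywhere else,
    producing that color's initial image."""
    out = np.full(color_img.shape, np.nan)
    sel = (bayer_pattern == channel)  # positions of the requested color
    out[sel] = color_img[sel]
    return out
```

Separating channel 'B' from a 2*2 Bayer tile keeps only the two B positions and nulls the A and C positions.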
After the processor 20 obtains the first color initial image, the second color initial image and the third color initial image, please refer to
At block 031: if the panchromatic image pixel W0 in the panchromatic image corresponding to the image pixel D0 to be updated in the second color initial image is not overexposed, calculating an updated pixel value of the image pixel D0 to be updated based on the panchromatic image and the second color initial image, and identifying the updated pixel value of the image pixel to be updated as a target pixel value of the second color pixel; if the panchromatic image pixel W0 in the panchromatic image corresponding to the image pixel D0 to be updated in the second color initial image is overexposed, calculating an updated pixel value of the image pixel to be updated based on the first color initial image and the second color initial image, or based on the third color initial image and the second color initial image, and identifying the updated pixel value of the image pixel to be updated as the target pixel value of the second color pixel; and obtaining the second color target image when the target pixel values of the second color pixels corresponding to all image pixels in the second color initial image are obtained;
At block 032: obtaining the first color target image by processing the second color target image and the first color initial image, and obtaining the third color target image by processing the second color target image and the third color initial image.
Please refer to
After the processor 20 obtains the first color initial image, the second color initial image, and the third color initial image, the processor 20 processes each image pixel in the second color initial image based on the panchromatic image, or based on the first color initial image and the third color initial image, to obtain the target pixel value of the second color pixel corresponding to each image pixel, and the target pixel values of the multiple second color pixels are arranged to form the second color target image.
In some embodiments, the processor 20 extracts any image pixel in the second color initial image as the image pixel D0 to be updated. The processor 20 first determines whether the panchromatic image pixel W0 in the panchromatic image corresponding to the image pixel D0 to be updated in the second color initial image is overexposed. If the corresponding panchromatic image pixel W0 is not overexposed, the processor 20 processes the second color initial image according to the panchromatic image to obtain the target pixel value of the second color pixel corresponding to the image pixel D0 to be updated; if the corresponding panchromatic image pixel W0 is overexposed, the processor 20 processes the second color initial image according to the first color initial image and the third color initial image to obtain the target pixel value of the second color pixel corresponding to the image pixel D0 to be updated. Subsequently, the processor 20 extracts the next image pixel in the second color initial image as the image pixel D0 to be updated for processing, and repeats the above operations until all the image pixels in the second color initial image have been processed. At this time, the processor 20 can obtain the target pixel value of the second color pixel corresponding to each image pixel in the second color initial image, and the multiple target pixel values of the second color pixels are arranged to form the second color target image. Of course, the processor 20 can also extract the image pixels in the second color initial image in a certain order; for example, the processor 20 first extracts the first image pixel located in the upper left corner of the second color initial image for processing, then extracts the next image pixel to be processed until that row of image pixels is processed, and then processes the image pixels of the next row of the second color initial image, which is not limited here.
In some embodiments, the processor 20 first determines whether the panchromatic image pixel W0 corresponding to the image pixel D0 to be updated in the second color initial image is overexposed. For example, referring to
If the panchromatic image pixel W0 in the panchromatic image corresponding to the image pixel D0 to be updated in the second color initial image is not overexposed, an updated pixel value of the image pixel D0 to be updated is calculated according to the panchromatic image and the second color initial image and identified as the target pixel value of the second color pixel. For example, please refer to
At block 0311: selecting a first window C1 centered on the image pixel D0 to be updated in the second color initial image, and selecting a second window C2 corresponding to the first window C1 in the panchromatic image, wherein the panchromatic image pixel W0 at the center of the second window C2 corresponds to the image pixel D0 to be updated and is identified as the mapped panchromatic image pixel W0;
At block 0312: obtaining a first matrix I1 based on pixel information of all image pixels in the first window C1;
At block 0313: obtaining a second matrix I2 based on the pixel value of the mapped panchromatic image pixel W0, the pixel values of all panchromatic image pixels W in the second window C2, the first matrix I1, and a preset weight function F(x); and
At block 0314: obtaining the updated target pixel value of the second color pixel of the image pixel D0 to be updated based on the pixel value of the mapped panchromatic image pixel W0, the pixel values of all image pixels in the first window C1, the pixel values of all panchromatic image pixels W in the second window C2, and the second matrix I2.
Please refer to
Please refer to
In some embodiments, the first window C1 and the second window C2 are virtual computing windows rather than actual structures, and the sizes of the first window C1 and the second window C2 can be changed according to actual needs. In some embodiments, the image pixels in the first window C1 and the second window C2 are arranged in M*M, where M is an odd number; for example, M can be 3, 5, 7, 9, etc., and the corresponding first window C1 and second window C2 can be 3*3, 5*5, 7*7, 9*9, etc., which is not limited here. For the convenience of description, the following embodiments are described with the first window C1 and the second window C2 both having a size of 5*5.
In some embodiments, if the image pixel D0 to be updated is located at the edge of the second color initial image, that is, when the first window C1 is set with the image pixel D0 to be updated as the center, part of the first window C1 falls outside the second color initial image. In this case, the image pixels in the first window C1 centered on the image pixel D0 to be updated may be filled by copying the boundary pixels of the second color initial image. For example, as shown in
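The boundary-copying described above can be sketched as follows. This is an illustrative sketch, not the claimed implementation: the function name and the index-clamping approach (which is equivalent to replicating border pixels) are assumptions.

```python
# Illustrative sketch: select an M*M first window C1 centered on the pixel to
# be updated, replicating boundary pixels when the window extends past the
# image edge (clamping indices to the valid range copies the border pixels).

def window_with_edge_copy(img, ci, cj, m=5):
    """Return an m*m window centered at (ci, cj); out-of-range
    coordinates are clamped to the nearest valid pixel."""
    h, w = len(img), len(img[0])
    r = m // 2
    win = []
    for di in range(-r, r + 1):
        row = []
        for dj in range(-r, r + 1):
            i = min(max(ci + di, 0), h - 1)   # clamp row index
            j = min(max(cj + dj, 0), w - 1)   # clamp column index
            row.append(img[i][j])
        win.append(row)
    return win

# Centering the window on the top-left pixel of a 4x4 image repeats the border:
img = [[r * 4 + c for c in range(4)] for r in range(4)]
c1 = window_with_edge_copy(img, 0, 0)
```

The same clamping serves for the second window C2 in the panchromatic image.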
After the processor 20 sets the first window C1 in the second color initial image and the second window C2 in the panchromatic image, the processor 20 obtains the first matrix I1 according to the pixel information of all image pixels in the first window C1. Specifically, please refer to
At block 03121: mapping the array arrangement of the image pixels in the first window C1 to the array arrangement of the first matrix I1;
At block 03122: identifying the value in the first matrix I1 corresponding to the null pixel in the first window C1 as a first value; and
At block 03123: identifying the value of the position in the first matrix I1 corresponding to the second color image pixel B in the first window as a second value.
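Blocks 03121 to 03123 can be sketched as building a mask over the window. This is an assumption-laden sketch: the disclosure does not fix the first and second values, so 0 and 1 are used here for illustration, and the 'B'/'N' labels are hypothetical markers for second-color and null pixels.

```python
# Illustrative sketch of blocks 03121-03123: build the first matrix I1 from a
# first window C1. Positions holding a second color image pixel B receive the
# second value (assumed 1 here); null pixels N receive the first value
# (assumed 0 here).

def build_first_matrix(window):
    """window: m*m grid of labels, 'B' for second-color pixels, 'N' for null."""
    return [[1 if px == 'B' else 0 for px in row] for row in window]

# In a checkerboard-like layout the B pixels sit on alternating positions:
c1 = [['B' if (i + j) % 2 == 0 else 'N' for j in range(5)] for i in range(5)]
i1 = build_first_matrix(c1)
```

With these values, I1 later acts as an indicator that restricts the weighted sums to positions carrying second-color information.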
Please refer to
Please refer to
Please refer to
At block 0313: obtaining a second matrix I2 based on the pixel values of the mapped panchromatic image pixels W0, the pixel values of all panchromatic image pixels W in the second window C2, the first matrix I1, and a preset weight function F(x), comprises:
At block 03131: mapping the matrix arrangement of image pixels in the second window C2 to the array arrangement of the second matrix I2;
At block 03132: obtaining the deviation value L1 corresponding to the position of each panchromatic image pixel W in the second matrix I2 based on the pixel value of that panchromatic image pixel W in the second window C2 and the pixel value of the mapped panchromatic image pixel W0;
At block 03133: obtaining the value of the corresponding position in the second matrix I2 based on the deviation value L1, the preset weight function F(x), and the value of the same position in the first matrix I1.
Please refer to
After the processor 20 acquires the first matrix I1, the processor 20 maps the array arrangement of the image pixels in the second window C2 to the array arrangement of the second matrix I2. In some embodiments, the number of rows of elements in the second matrix I2 is the same as the number of rows of image pixels in the second window C2, and the number of columns of elements in the second matrix I2 is the same as the number of columns of image pixels in the second window C2, any image pixel in the second window C2 has an element corresponding to it in the second matrix I2.
The processor 20 obtains the deviation value L1 of the position corresponding to each panchromatic image pixel W in the second matrix I2 according to the pixel value of that panchromatic image pixel W in the second window C2 and the pixel value of the mapped panchromatic image pixel W0. Specifically, the deviation value L1 of the position in the second matrix I2 corresponding to a panchromatic image pixel W in the second window C2 is equal to the absolute value of the difference between the pixel value of that panchromatic image pixel W and the pixel value of the mapped panchromatic image pixel W0. For example, the deviation value L1(1,2) corresponding to row 1, column 2 of the second matrix I2 is equal to the absolute value of the difference between the pixel value of the panchromatic image pixel W(1,2) arranged in row 1, column 2 of the second window C2 and the pixel value of the mapped panchromatic image pixel W0.
After the processor 20 obtains the deviation values L1 corresponding to all positions in the second matrix I2, the processor 20 obtains the value of each position in the second matrix I2 according to the deviation value L1, the preset weight function F(x), and the value of the same position in the first matrix I1. Specifically, the deviation value L1 corresponding to the position to be calculated in the second matrix I2 is substituted into the preset weight function F(x) to obtain a first result F(L1), and the first result F(L1) is then multiplied by the value of the same position in the first matrix I1 to obtain the value of the corresponding position in the second matrix I2. For example, please refer to
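The construction of the second matrix I2 can be sketched as follows. The specific weight function F(x) is not fixed by the disclosure; a Gaussian kernel and its sigma are assumptions for illustration only.

```python
import math

# Illustrative sketch of block 0313: the deviation L1 at each position is the
# absolute difference between that panchromatic pixel W and the mapped pixel
# W0; the value of I2 at that position is F(L1) multiplied by the value at the
# same position in I1. F is assumed Gaussian here.

def weight_f(x, sigma=10.0):
    """Assumed preset weight function F(x): large deviations get small weight."""
    return math.exp(-(x * x) / (2 * sigma * sigma))

def build_second_matrix(win2, w0, i1):
    """win2: m*m panchromatic pixel values; w0: mapped panchromatic pixel
    value; i1: first matrix of the same shape."""
    return [[weight_f(abs(w - w0)) * i1[r][c]
             for c, w in enumerate(row)]
            for r, row in enumerate(win2)]

# With zero deviation everywhere, F(0) = 1 and I2 reduces to I1:
win2 = [[100, 100, 100], [100, 100, 100], [100, 100, 100]]
i1 = [[1, 0, 1], [0, 1, 0], [1, 0, 1]]
i2 = build_second_matrix(win2, 100, i1)
```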
Please refer to
At block 03141, calculating a first weighted value M1 based on the second matrix I2 and a first window matrix N1 formed by the pixel values of all image pixels in the first window C1, and calculating a second weighted value M2 based on the second matrix I2 and a second window matrix N2 formed by the pixel values of all panchromatic image pixels W in the second window C2;
At block 03142, obtaining the updated target pixel value of the second color pixel of the image pixel D0 to be updated based on the mapped pixel value of the panchromatic image pixel W0, the first weighted value M1, and the second weighted value M2.
Please refer to
After the processor 20 obtains the second matrix I2, the processor 20 forms the first window matrix N1 from the pixel values of all the image pixels in the first window C1, and forms the second window matrix N2 from the pixel values of all the panchromatic image pixels W in the second window C2. It should be noted that the value at any position in the first window matrix N1 is the same as the pixel value of the image pixel at the corresponding position in the first window C1, and the value at any position in the second window matrix N2 is the same as the pixel value of the panchromatic image pixel at the corresponding position in the second window C2.
The processor 20 calculates the first weighted value M1 according to the first window matrix N1 and the second matrix I2. In some embodiments, the first weighted value M1 can be obtained by the formula M1=sum(sum(N1×I2)); that is, the pixel value of each image pixel in the first window C1 is multiplied by the value at the corresponding position in the second matrix I2 to obtain multiple new pixel values, and the multiple new pixel values are added to obtain the first weighted value M1. The second weighted value M2 can be obtained by the formula M2=sum(sum(N2×I2)); that is, the pixel value of each panchromatic image pixel W in the second window C2 is multiplied by the value at the corresponding position in the second matrix I2 to obtain multiple new pixel values, and the multiple new pixel values are added to obtain the second weighted value M2.
After the processor 20 obtains the first weighted value M1 and the second weighted value M2, the processor 20 obtains the updated target pixel value of the second color pixel of the image pixel D0 to be updated according to the pixel value of the mapped panchromatic image pixel W0, the first weighted value M1, and the second weighted value M2. For example, the target pixel value of the second color pixel can be calculated by the formula B0′=W0′×M1/M2, where B0′ is the target pixel value of the second color pixel and W0′ is the pixel value of the mapped panchromatic image pixel W0; that is, the pixel value of the mapped panchromatic image pixel W0 is multiplied by the first weighted value M1 and then divided by the second weighted value M2 to obtain the updated target pixel value of the second color pixel of the image pixel D0 to be updated.
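Blocks 03141 and 03142 can be sketched together. This is a minimal sketch on a small 2*2 window; the window contents are invented for illustration, and null pixels are assumed to carry the value 0 so they drop out of the weighted sum.

```python
# Illustrative sketch of blocks 03141-03142: M1 weights the color window N1
# by I2, M2 weights the panchromatic window N2 by I2, and the target value is
# B0' = W0' * M1 / M2.

def weighted_sum(window, i2):
    """Elementwise product of window and I2, summed: sum(sum(window * I2))."""
    return sum(v * w for row, wrow in zip(window, i2) for v, w in zip(row, wrow))

def second_color_target(n1, n2, i2, w0):
    m1 = weighted_sum(n1, i2)    # first weighted value M1
    m2 = weighted_sum(n2, i2)    # second weighted value M2
    return w0 * m1 / m2          # B0' = W0' * M1 / M2

n1 = [[40, 0], [0, 40]]          # color window (null pixels assumed 0)
n2 = [[80, 90], [90, 80]]        # panchromatic window
i2 = [[1, 0], [0, 1]]            # only second-color positions contribute
b0 = second_color_target(n1, n2, i2, 85)
```

Here M1 = 80 and M2 = 160, so the mapped panchromatic value 85 is scaled by the local color-to-panchromatic ratio, giving B0′ = 42.5.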
If the panchromatic image pixel W0 in the panchromatic image corresponding to the image pixel D0 to be updated in the second color initial image is overexposed, an updated pixel value of the image pixel D0 to be updated is calculated according to the first color initial image and the third color initial image, and the updated pixel value of the image pixel to be updated is identified as the target pixel value of the second color pixel. For example, please refer to
At block 0315: if the image pixel D0 to be updated is a second color image pixel B, identifying the original pixel value of the second color image pixel B as the updated target pixel value D1 of the second color pixel of the image pixel D0 to be updated;
At block 0316: if the position corresponding to the image pixel D0 to be updated in the first color initial image is the first color image pixel A, performing interpolation processing on the second color initial image based on the first color initial image, and obtaining the updated target pixel value D1 of the second color pixel of the image pixel D0 to be updated;
At block 0317: if the position corresponding to the image pixel D0 to be updated in the third color initial image is a third color image pixel C, performing interpolation processing on the second color initial image based on the third color initial image, and obtaining the updated target pixel value D1 of the second color pixel of the image pixel D0 to be updated.
Please refer to
The processor 20 first determines whether the image pixel D0 to be updated is a second color image pixel B. If the image pixel D0 to be updated is a second color image pixel B, the original pixel value of the second color image pixel B is used as the updated target pixel value D1 of the second color pixel of the image pixel D0 to be updated. In some embodiments, as shown in
If the image pixel D0 to be updated is not a second color image pixel B, that is, the image pixel D0 to be updated is a null pixel N, it is determined whether the image pixel at the position corresponding to the image pixel D0 to be updated in the first color initial image is a first color image pixel A. If so, interpolation processing is performed on the second color initial image according to the first color initial image to obtain the updated target pixel value D1 of the second color pixel of the image pixel D0 to be updated. For example, as shown in
In some embodiments, the color-difference-constancy theory can be used to obtain the updated target pixel value D1 of the second color pixel of the image pixel D0 to be updated from the first color initial image. For example, please refer to
At Block 03161: calculating, according to the pixel values of the image pixels in the second color initial image and the pixel values of the image pixels in the first color initial image, a first difference E1 in the first direction H1 and a second difference E2 in the second direction H2 corresponding to each image pixel in the second color initial image;
At Block 03162: calculating the first direction difference V1 corresponding to each image pixel in the second color initial image according to the first differences E1 corresponding to the two image pixels adjacent to that image pixel in the first direction H1, and calculating the second direction difference V2 corresponding to each image pixel according to the second differences E2 corresponding to the two image pixels adjacent to that image pixel in the second direction H2;
At Block 03163: calculating a first weight value g1, a second weight value g2, a third weight value g3, and a fourth weight value g4 according to the first direction difference V1 of the image pixel D0 to be updated, the second direction difference V2 of the image pixel D0 to be updated, and the first direction differences V1 and the second direction differences V2 of the surrounding image pixels;
At Block 03164: obtaining a total difference K according to the first direction difference V1 of the image pixel D0 to be updated, the second direction difference V2 of the image pixel D0 to be updated, the first direction differences V1 of the four image pixels adjacent to the first side of the image pixel D0 to be updated in the first direction H1, the first direction differences V1 of the four image pixels adjacent to the second side of the image pixel D0 to be updated in the first direction H1, the second direction differences V2 of the four image pixels adjacent to the first side of the image pixel D0 to be updated in the second direction H2, the second direction differences V2 of the four image pixels adjacent to the second side of the image pixel D0 to be updated in the second direction H2, and the first weight g1, the second weight g2, the third weight g3, and the fourth weight g4; and
At Block 03165: acquiring the target pixel value D1 of the second color pixel corresponding to the image pixel D0 to be updated according to the mapped first color image pixel A0 and the total difference K corresponding to the image pixel D0 to be updated.
Please refer to
The processor 20 calculates, according to the pixel values of the image pixels in the second color initial image and the pixel values of the image pixels in the first color initial image, the first difference E1 in the first direction H1 and the second difference E2 in the second direction H2 corresponding to each image pixel in the second color initial image. It should be noted that the first direction H1 is perpendicular to the second direction H2. For convenience of illustration, in the following embodiments the first direction H1 is taken as the direction parallel to the columns of the image pixels, and the second direction H2 as the direction parallel to the rows of the image pixels. For example, assuming that the image pixel to be calculated is located in row i, column j of the second color initial image, the corresponding first difference E1 can be obtained by the formula E1(i,j)=(B(i,j−1)+B(i,j+1))/2+(2×A(i,j)−A(i,j−2)−A(i,j+2))/4−A(i,j), where B(i,j−1) represents the pixel value of the image pixel located in row i, column j−1 of the second color initial image, B(i,j+1) represents the pixel value of the image pixel located in row i, column j+1 of the second color initial image, A(i,j) represents the pixel value of the image pixel located in row i, column j of the first color initial image, A(i,j−2) represents the pixel value of the image pixel located in row i, column j−2 of the first color initial image, and A(i,j+2) represents the pixel value of the image pixel located in row i, column j+2 of the first color initial image.
In other words, the first difference E1 is obtained by adding the mean of the pixel values of the two image pixels on both sides of the image pixel to be calculated in the first direction H1 to one quarter of the difference between twice the pixel value of the corresponding image pixel in the first color initial image and the pixel values of the two image pixels spaced two positions away on both sides in the first direction H1, and then subtracting the pixel value of the corresponding image pixel in the first color initial image. The second difference E2 of the image pixel to be calculated can be obtained by the formula E2(i,j)=(B(i−1,j)+B(i+1,j))/2+(2×A(i,j)−A(i−2,j)−A(i+2,j))/4−A(i,j), where B(i−1,j) represents the pixel value of the image pixel located in row i−1, column j of the second color initial image, B(i+1,j) represents the pixel value of the image pixel located in row i+1, column j of the second color initial image, A(i,j) represents the pixel value of the image pixel located in row i, column j of the first color initial image, A(i−2,j) represents the pixel value of the image pixel located in row i−2, column j of the first color initial image, and A(i+2,j) represents the pixel value of the image pixel located in row i+2, column j of the first color initial image. The second difference E2 is obtained analogously in the second direction H2. For example, please refer to
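The E1/E2 formulas above can be sketched directly. This is an illustrative sketch under the stated formulas; the flat test images are invented, and on such images both differences reduce to the constant color difference B − A, which is the intuition behind the color-difference-constancy theory.

```python
# Illustrative sketch of block 03161, following the formulas above:
# E1(i,j) = (B(i,j-1)+B(i,j+1))/2 + (2*A(i,j)-A(i,j-2)-A(i,j+2))/4 - A(i,j)
# E2(i,j) = (B(i-1,j)+B(i+1,j))/2 + (2*A(i,j)-A(i-2,j)-A(i+2,j))/4 - A(i,j)

def first_difference(B, A, i, j):
    return ((B[i][j - 1] + B[i][j + 1]) / 2
            + (2 * A[i][j] - A[i][j - 2] - A[i][j + 2]) / 4
            - A[i][j])

def second_difference(B, A, i, j):
    return ((B[i - 1][j] + B[i + 1][j]) / 2
            + (2 * A[i - 2][j] * 0 + 2 * A[i][j] - A[i - 2][j] - A[i + 2][j]) / 4
            - A[i][j])

# On flat images (all A equal, all B equal) both differences reduce to B - A:
A = [[100] * 5 for _ in range(5)]
B = [[120] * 5 for _ in range(5)]
e1 = first_difference(B, A, 2, 2)
e2 = second_difference(B, A, 2, 2)
```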
After the processor 20 obtains the first difference E1 and the second difference E2 corresponding to each image pixel in the second color initial image, the processor 20 calculates, for each image pixel in the second color initial image, the first direction difference V1 corresponding to that image pixel according to the first differences E1 corresponding to the two image pixels adjacent to it in the first direction H1, and the second direction difference V2 corresponding to that image pixel according to the second differences E2 corresponding to the two image pixels adjacent to it in the second direction H2. For example, assuming that the image pixel to be calculated is located in row i, column j of the second color initial image, its corresponding first direction difference V1 can be calculated by the formula V1(i,j)=|E1(i,j−1)−E1(i,j+1)|, where E1(i,j−1) represents the first difference E1 corresponding to the image pixel located in row i, column j−1 of the second color initial image, and E1(i,j+1) represents the first difference E1 corresponding to the image pixel located in row i, column j+1 of the second color initial image. That is, the first direction difference V1 of the image pixel to be calculated is equal to the absolute value of the difference between the first differences E1 corresponding to the two image pixels adjacent to it in the first direction H1.
The second direction difference V2 corresponding to the image pixel to be calculated can be calculated by the formula V2(i,j)=|E2(i−1,j)−E2(i+1,j)|, where E2(i−1,j) represents the second difference E2 corresponding to the image pixel located in row i−1, column j of the second color initial image, and E2(i+1,j) represents the second difference E2 corresponding to the image pixel located in row i+1, column j of the second color initial image. That is, the second direction difference V2 of the image pixel to be calculated is equal to the absolute value of the difference between the second differences E2 corresponding to the two image pixels adjacent to it in the second direction H2.
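The direction differences can be sketched as one-line absolute differences. The small E1/E2 grids below are invented for illustration only.

```python
# Illustrative sketch of block 03162: each direction difference is the
# absolute difference of E1 (or E2) at the two neighbors of the pixel.

def direction_differences(E1, E2, i, j):
    v1 = abs(E1[i][j - 1] - E1[i][j + 1])   # V1(i,j) = |E1(i,j-1) - E1(i,j+1)|
    v2 = abs(E2[i - 1][j] - E2[i + 1][j])   # V2(i,j) = |E2(i-1,j) - E2(i+1,j)|
    return v1, v2

E1 = [[0, 1, 2], [3, 4, 5], [6, 7, 8]]
E2 = [[0, 1, 2], [3, 4, 5], [6, 7, 8]]
v1, v2 = direction_differences(E1, E2, 1, 1)
```

A small V1 or V2 indicates the color difference varies little along that direction, so that direction is favored during interpolation.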
After the processor 20 obtains the first direction difference V1 and the second direction difference V2 of the image pixels in the second color initial image, the processor 20 calculates the first weight value g1, the second weight value g2, the third weight value g3, and the fourth weight value g4 according to the first direction difference V1 of the image pixel D0 to be updated, the second direction difference V2 of the image pixel D0 to be updated, and the first direction differences V1 and second direction differences V2 of the surrounding image pixels. For example, for an image pixel D0 to be updated located in row i, column j, the first weight value g1 can be calculated by the formula g1=1/(Σ_{a=i−4}^{i}Σ_{b=j−2}^{j+2}V2(a,b))²; that is, the second direction differences V2 corresponding to the image pixels within the region spanning the four rows above the image pixel D0 to be updated and the two columns on each side of it are summed, and the reciprocal of the square of the result is taken as the first weight value g1. The second weight value g2 can be calculated by the formula g2=1/(Σ_{a=i}^{i+4}Σ_{b=j−2}^{j+2}V2(a,b))²; that is, the second direction differences V2 corresponding to the image pixels within the region spanning the four rows below the image pixel D0 to be updated and the two columns on each side of it are summed, and the reciprocal of the square of the result is taken as the second weight value g2. The third weight value g3 can be calculated by the formula g3=1/(Σ_{a=i−2}^{i+2}Σ_{b=j−4}^{j}V1(a,b))²; that is, the first direction differences V1 corresponding to the image pixels within the region spanning the four columns to the left of the image pixel D0 to be updated and the two rows on each side of it are summed, and the reciprocal of the square of the result is taken as the third weight value g3. The fourth weight value g4 can be calculated by the formula g4=1/(Σ_{a=i−2}^{i+2}Σ_{b=j}^{j+4}V1(a,b))²; that is, the first direction differences V1 corresponding to the image pixels within the region spanning the four columns to the right of the image pixel D0 to be updated and the two rows on each side of it are summed, and the reciprocal of the square of the result is taken as the fourth weight value g4.
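The four weights can be sketched as reciprocal squared region sums. This sketch assumes the summation bounds reconstructed above; the constant difference fields are invented test data.

```python
# Illustrative sketch of block 03163: each weight is the reciprocal of the
# squared sum of direction differences over a 5x5 region on one side of the
# pixel D0 at (i, j).

def region_sum(V, r0, r1, c0, c1):
    return sum(V[a][b] for a in range(r0, r1 + 1) for b in range(c0, c1 + 1))

def four_weights(V1, V2, i, j):
    g1 = 1 / region_sum(V2, i - 4, i, j - 2, j + 2) ** 2   # region above D0
    g2 = 1 / region_sum(V2, i, i + 4, j - 2, j + 2) ** 2   # region below D0
    g3 = 1 / region_sum(V1, i - 2, i + 2, j - 4, j) ** 2   # region left of D0
    g4 = 1 / region_sum(V1, i - 2, i + 2, j, j + 4) ** 2   # region right of D0
    return g1, g2, g3, g4

# On a constant difference field every region sums to 25, so all weights agree:
V1 = [[1.0] * 9 for _ in range(9)]
V2 = [[1.0] * 9 for _ in range(9)]
g1, g2, g3, g4 = four_weights(V1, V2, 4, 4)
```

Directions with large accumulated differences (strong edges) thus receive small weights.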
After the processor 20 obtains the first weight value g1, the second weight value g2, the third weight value g3, and the fourth weight value g4 of the image pixel D0 to be updated, the processor 20 calculates the total difference K according to the first direction difference V1 of the image pixel D0 to be updated, the second direction difference V2 of the image pixel D0 to be updated, the first direction differences V1 of the four image pixels adjacent to the first side of the image pixel D0 to be updated in the first direction H1, the first direction differences V1 of the four image pixels adjacent to the second side of the image pixel D0 to be updated in the first direction H1, the second direction differences V2 of the four image pixels adjacent to the first side of the image pixel D0 to be updated in the second direction H2, the second direction differences V2 of the four image pixels adjacent to the second side of the image pixel D0 to be updated in the second direction H2, and the first weight g1, the second weight g2, the third weight g3, and the fourth weight g4.
For example, the first weight matrix S1 is formed from the second differences E2 of the image pixel D0 to be updated and the four image pixels adjacent below it, the second weight matrix S2 is formed from the second differences E2 of the image pixel D0 to be updated and the four image pixels adjacent above it, the third weight matrix S3 is formed from the first differences E1 of the image pixel D0 to be updated and the four image pixels adjacent to its left, and the fourth weight matrix S4 is formed from the first differences E1 of the image pixel D0 to be updated and the four image pixels adjacent to its right. The total difference K can then be calculated by the formula K=(g1×f×S1+g2×f×S2+g3×f′×S3+g4×f′×S4)/(g1+g2+g3+g4), where f represents a preset matrix and f′ represents the transpose of the preset matrix; in some embodiments, the preset matrix f=[1 1 1 1 1]/5. For example, referring to
After the processor 20 obtains the total difference K, the processor 20 obtains the target pixel value D1 of the second color pixel corresponding to the image pixel D0 to be updated according to the mapped first color image pixel A0 and the total difference K corresponding to the image pixel D0 to be updated. For example, the target pixel value D1 of the second color pixel corresponding to the image pixel D0 to be updated is equal to the sum of the pixel value of the mapped first color image pixel A0 and the total difference K corresponding to the image pixel D0 to be updated. Of course, in some embodiments, other interpolation methods may also be used to obtain the target pixel value D1 of the second color pixel corresponding to the image pixel D0 to be updated, and no more examples are given here.
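Blocks 03164 and 03165 can be sketched as a weighted average of the four directional color differences. This is a sketch under the assumptions that f = [1 1 1 1 1]/5 (so f×S is the mean of a 5-element vector) and that D1 = A0 + K; the sample vectors and weights are invented.

```python
# Illustrative sketch of blocks 03164-03165: S1..S4 are 5-element vectors of
# E2/E1 values at D0 and its four neighbors on one side; with the preset
# matrix f = [1,1,1,1,1]/5, the product f*S is simply the mean of the vector.

def mean5(s):
    return sum(s) / 5            # f * S with f = [1, 1, 1, 1, 1] / 5

def total_difference(s1, s2, s3, s4, g1, g2, g3, g4):
    num = g1 * mean5(s1) + g2 * mean5(s2) + g3 * mean5(s3) + g4 * mean5(s4)
    return num / (g1 + g2 + g3 + g4)   # K, normalized by the weight sum

s = [2.0, 2.0, 2.0, 2.0, 2.0]    # all directional color differences equal 2
k = total_difference(s, s, s, s, 1.0, 1.0, 1.0, 1.0)
a0 = 100.0                       # mapped first color pixel value A0
d1 = a0 + k                      # target second color value D1 = A0 + K
```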
If the image pixel D0 to be updated is not a second color image pixel B, that is, the image pixel D0 to be updated is a null pixel N, and the image pixel at the position corresponding to the image pixel D0 to be updated in the third color initial image is a third color image pixel C, interpolation processing is performed on the second color initial image according to the third color initial image to obtain the updated target pixel value D1 of the second color pixel of the image pixel D0 to be updated. For example, as shown in
After the processor 20 obtains the target pixel values of the second color pixels corresponding to all the image pixels in the second color initial image, the multiple second color pixel target pixel values are arranged to form the second color target image. The position of each target pixel value of the second color pixel in the second color target image is the same as the position of the corresponding image pixel in the second color initial image.
Please refer to
At block 0321: performing bilateral filtering processing on the first color initial image according to the second color target image to obtain a first color target image, and performing bilateral filtering processing on the third color initial image according to the second color target image to obtain a third color target image.
Please refer to
In the following, performing bilateral filtering processing on the first color initial image according to the second color target image to obtain the first color target image is taken as an example for description. In some embodiments, referring to
wherein Jp=(1/kp)Σq∈Ω Iq×f(∥p−q∥)×g(∥Ip′−Iq′∥) and kp=Σq∈Ω f(∥p−q∥)g(∥Ip′−Iq′∥), where Jp is the output pixel value, kp is the sum of weights, Ω is the filter window, p is the coordinate of the pixel to be filtered in the first color initial image, q is the coordinate of a pixel within the filter window in the first color initial image, Iq is the pixel value corresponding to point q, Ip′ is the pixel value corresponding to the pixel to be filtered in the second color target image, Iq′ is the pixel value corresponding to point q in the second color target image, and f and g are weight distribution functions, which may include Gaussian functions.
In some embodiments, the joint bilateral filtering algorithm determines the first distance weight f(∥p−q∥) from the difference between the coordinates of the pixel point p to be filtered and the coordinates of a pixel point q in the filter window, and determines the second distance weight g(∥Ip′−Iq′∥) from the difference between the corresponding pixel values in the second color target image; the output pixel value Jp is then determined according to the first distance weight and the second distance weight of each pixel in the filter window, the pixel value Iq corresponding to point q in the first color initial image, and the weight sum kp.
In some embodiments, in the first color initial image, a position where no first color image pixel A is set, that is, a null pixel N, has a pixel value of 0. The output pixel value Jp is set at the position corresponding to the pixel point p to be filtered, and after one output is completed, the filter window moves to the next image pixel position until all image pixels in the first color initial image are filtered, so that a first color target image including only first color image pixels A is obtained. Please refer to
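The joint bilateral filtering above can be sketched as follows. This is a sketch under stated assumptions: Gaussian weight functions, a 5*5 window (radius 2), and invented sigma values; it evaluates one output pixel rather than sliding over the whole image.

```python
import math

# Illustrative sketch of the joint bilateral filtering in block 0321: spatial
# weights f come from pixel distance, range weights g from pixel-value
# differences in the guide image (the second color target image).

def gauss(x, sigma):
    return math.exp(-(x * x) / (2 * sigma * sigma))

def joint_bilateral(I, guide, pi, pj, radius=2, s_spat=2.0, s_rng=10.0):
    num = kp = 0.0
    for qi in range(pi - radius, pi + radius + 1):
        for qj in range(pj - radius, pj + radius + 1):
            f = gauss(math.hypot(pi - qi, pj - qj), s_spat)       # f(||p-q||)
            g = gauss(abs(guide[pi][pj] - guide[qi][qj]), s_rng)  # g(||Ip'-Iq'||)
            num += f * g * I[qi][qj]
            kp += f * g
    return num / kp   # Jp = (1/kp) * sum over the window of f * g * Iq

# Filtering a constant image returns the constant unchanged:
I = [[50.0] * 7 for _ in range(7)]
guide = [[80.0] * 7 for _ in range(7)]
jp = joint_bilateral(I, guide, 3, 3)
```

Because the guide image drives the range weight g, edges present in the second color target image are preserved in the filtered first color image.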
The image processing system 100 in the embodiments of the present disclosure fuses all the color image pixels in the same subunit into a colorful image arranged in a Bayer array, fuses all the panchromatic image pixels in the same subunit into a panchromatic image, and directly fuses the panchromatic image with the colorful image, so that the first color target image, the second color target image, and the third color target image, which contain panchromatic image information and are fully arranged, can be directly output. In this way, the resolution and signal-to-noise ratio of the image can be improved, and the overall photographing effect can be improved.
Please refer to
At block 05: obtaining a processed colorful image by performing colorful image processing on the colorful image, obtaining a processed panchromatic image by performing panchromatic image processing on the panchromatic image; and
At block 033: obtaining a first color target image, a second color target image, and a third color target image by processing the processed colorful image based on the processed panchromatic image.
Please refer to
After the processor 20 obtains the colorful image and the panchromatic image, the processor 20 performs colorful image processing on the colorful image to obtain a processed colorful image, and performs panchromatic image processing on the panchromatic image to obtain a processed panchromatic image. It should be noted that, in some embodiments, the colorful image processing includes at least one of dead point compensation processing, vignetting compensation processing, and white balance processing; the panchromatic image processing includes dead point compensation processing.
After obtaining the processed panchromatic image and the processed colorful image, the processor 20 processes the processed colorful image according to the processed panchromatic image to obtain the first color target image, the second color target image and the third color target image. The specific implementation manner of processing the processed colorful image based on the processed panchromatic image is the same as the specific implementation manner of processing the colorful image based on the panchromatic image in the above embodiment, and will not be repeated here.
Since the colorful image undergoes colorful image processing and the panchromatic image undergoes panchromatic image processing, the image quality of each can be improved. Processing the processed colorful image according to the processed panchromatic image to obtain the first color target image, the second color target image and the third color target image can therefore improve the image quality of the first color target image, the second color target image and the third color target image.
Please refer to
At block 06: obtaining a color converted target image by performing color conversion based on the first color target image, the second color target image and the third color target image.
Please refer to
The processor 20 performs color conversion processing after obtaining the first color target image, the second color target image and the third color target image, so as to obtain a color converted target image. Color conversion processing converts an image from one color space (such as the RGB color space) to another color space (such as the YUV color space), so as to provide a wider range of application scenarios or a more efficient transmission format. In a specific embodiment, the color conversion processing can convert the R, G and B channel pixel values of all pixels in the image into Y, U and V channel pixel values by the following formulas: (1) Y=0.30 R+0.59 G+0.11 B; (2) U=0.493 (B−Y); (3) V=0.877 (R−Y); thereby converting the image from the RGB color space to the YUV color space. Since the luminance signal Y and the chrominance signals U and V in the YUV color space are separated, and the human eye is more sensitive to luminance than to chrominance, the present disclosure performs color conversion on the first color target image, the second color target image and the third color target image, and then transmits the color converted target image to the image processor (not shown) for subsequent processing, which can reduce the amount of information in the image without affecting the viewing effect of the image, thereby improving the transmission efficiency of the image.
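Formulas (1)-(3) above can be applied per pixel as follows; only the coefficients come from the text, the function name and values are illustrative.

```python
# Color conversion of block 06, per pixel, using formulas (1)-(3) above.

def rgb_to_yuv(r, g, b):
    y = 0.30 * r + 0.59 * g + 0.11 * b   # (1) luminance
    u = 0.493 * (b - y)                  # (2) blue-difference chrominance
    v = 0.877 * (r - y)                  # (3) red-difference chrominance
    return y, u, v

# A gray pixel carries luminance only: U and V are (numerically) zero,
# which is what lets the chrominance channels be transmitted more compactly.
y, u, v = rgb_to_yuv(100, 100, 100)
```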
Please refer to
The electronic device 1000 may be a mobile phone, a tablet computer, a notebook computer, a smart wearable device (such as a smart watch, a smart bracelet, smart glasses, a smart helmet), a drone, a head-mounted display device, etc., which is not limited here.
The electronic device 1000 in the embodiment of the present disclosure fuses, in the system 100 for image processing, all the colorful image pixels in the same subunit into a colorful image arranged in a Bayer array, fuses all the panchromatic image pixels in the same subunit into a panchromatic image, and directly uses the panchromatic image and the colorful image for fusion, so that the first color target image, the second color target image and the third color target image, which contain panchromatic image information and are fully arranged, can be directly output. In this way, the resolution and signal-to-noise ratio of the image can be improved, and the overall photographing effect can be improved.
Please refer to
For example, please refer to
At block 01: obtain the original image by exposing the pixel array 11, the image pixels in the original image comprising colorful image pixels and panchromatic image pixels;
At block 02: obtain a colorful image based on all the colorful image pixels in the same subunit, and obtain a panchromatic image based on all the panchromatic image pixels in the same subunit, the image pixels in the colorful image being arranged in a Bayer array; and
At block 03: process the colorful image according to the panchromatic image to obtain the first color target image, the second color target image and the third color target image, wherein all the image pixels in the first color target image are the first color image pixels, all image pixels in the second color target image are the second color image pixels, and all image pixels in the third color target image are the third color image pixels.
In some embodiments, the processor 60 can be the same processor as the processor 20 arranged in the system 100 for image processing, or the processor 60 can be separately arranged in the electronic device 1000, that is, the processor 60 and the processor 20 in the system 100 for image processing are not the same processor, which is not limited here.
In the description of this specification, reference to the terms “one embodiment”, “some embodiments”, “exemplary embodiments”, “example”, “specific examples” or “some examples”, etc., means that the specific features, structures, materials or characteristics described in connection with the embodiment or example are included in at least one embodiment or example of the present disclosure.
In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the described specific features, structures, materials or characteristics may be combined in any suitable manner in any one or more embodiments or examples. In addition, those skilled in the art can combine different embodiments or examples, and features of different embodiments or examples, described in this specification, provided they do not conflict with each other.
Any process or method descriptions in flowcharts or otherwise described herein may be understood to represent modules, segments or portions of code comprising one or more executable instructions for implementing specific logical functions or blocks of the process, and the scope of preferred embodiments of the present disclosure includes additional implementations in which functions may be performed out of the order shown or discussed, including in substantially simultaneous fashion or in reverse order depending on the functions involved, which should be understood by those skilled in the art to which the embodiments of the present disclosure belong.
Although the implementation of the present disclosure has been shown and described above, it can be understood that the above-mentioned implementation is exemplary and should not be construed as limiting the disclosure, and those skilled in the art can make changes, modifications, substitutions and variations to the above-mentioned embodiments.
Number | Date | Country | Kind |
---|---|---|---|
202011582503.6 | Dec 2020 | CN | national |
This application is a continuation of International Patent Application No. PCT/CN2021/075315, filed Feb. 4, 2021, which claims the priority of Chinese Patent Application No. 202011582503.6, filed Dec. 28, 2020, both of which are herein incorporated by reference in their entireties.
Number | Date | Country | |
---|---|---|---|
Parent | PCT/CN21/75315 | Feb 2021 | US |
Child | 18341509 | US |