This patent document claims the priority and benefits of Korean patent application No. 10-2023-0092057, filed on Jul. 14, 2023, in the Korean Intellectual Property Office, the disclosure of which is incorporated by reference in its entirety as part of the disclosure of this patent document.
The technology and implementations disclosed in this patent document generally relate to an image signal processor capable of performing image conversion and an image signal processing method for the same.
An image sensing device is a device for capturing optical images by converting light into electrical signals using a photosensitive semiconductor material which reacts to light. With the development of automotive, medical, computer and communication industries, the demand for high-performance image sensing devices is increasing in various fields such as smart phones, digital cameras, game machines, IoT (Internet of Things), robots, surveillance cameras and medical micro cameras.
An original image captured by the image sensing device may include a plurality of pixels corresponding to different colors (e.g., red, blue, and green). The plurality of pixels included in the original image may be arranged according to a certain color pattern arrangement, such as a Bayer pattern arrangement. In order to convert the original image into a complete image (e.g., an RGB image), an operation of interpolating pixels may be performed according to a predetermined algorithm. Since this algorithm basically interpolates pixels having lost (or missing) information using information of the neighboring pixels, serious noise may occur in images with specific patterns due to the limitations of such an algorithm.
In accordance with an embodiment of the disclosed technology, an image signal processor may include a first determiner configured to compare a target gradient combination for a target kernel including a target pixel with a reference gradient combination for each of a plurality of corner patterns to calculate a similarity score, and determine at least one corner pattern based on the calculated similarity score; and a pixel interpolator configured to interpolate the target pixel using pixel data of interpolation pixels determined based on the corner pattern.
In accordance with another embodiment of the disclosed technology, an image signal processing method may include comparing a target gradient combination for a target kernel including a target pixel with a reference gradient combination for each of a plurality of corner patterns to calculate a similarity score; determining a corner pattern corresponding to the target kernel based on the similarity score; and interpolating the target pixel using pixel data of interpolation pixels determined based on the corner pattern.
The above and other features and beneficial aspects of the disclosed technology will become readily apparent with reference to the following detailed description when considered in conjunction with the accompanying drawings.
This patent document provides implementations and examples of an image signal processor and an image signal processing method capable of increasing the accuracy of correction for defective pixels or the like, which may be used in configurations to substantially address one or more technical or engineering issues and to mitigate limitations or disadvantages encountered in some image signal processors in the art. In recognition of the issues above, the image signal processor based on some implementations of the disclosed technology can interpolate the target pixel by determining a corner pattern corresponding to a target kernel, and can increase the accuracy of correction for defective pixels even when the target kernel corresponds to the corner pattern.
Reference will now be made in detail to some embodiments of the disclosed technology, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings. However, the disclosure should not be construed as being limited to the embodiments set forth herein.
Hereinafter, various embodiments will be described with reference to the accompanying drawings. However, it should be understood that the disclosed technology is not limited to specific embodiments, but includes various modifications, equivalents and/or alternatives of the embodiments. The embodiments of the disclosed technology may provide a variety of effects capable of being directly or indirectly recognized through the disclosed technology.
Various embodiments of the disclosed technology relate to an image signal processor capable of increasing the accuracy of correction for defective pixels or the like, and an image signal processing method for the same.
It is to be understood that both the foregoing general description and the following detailed description of the disclosed technology are illustrative and explanatory and are intended to provide further explanation of the disclosure as claimed.
Referring to
The image data (IDATA) may be generated by an image sensing device that captures an optical image of a scene, but the scope of the disclosed technology is not limited thereto. The image sensing device may include a pixel array, a control circuit, and a readout circuit. The pixel array may include a plurality of pixels configured to sense incident light provided from a scene. The control circuit may be configured to control the pixel array to generate an analog pixel signal. The readout circuit may be configured to output digital image data (IDATA) by converting the analog pixel signal into the digital image data (IDATA). In some implementations of the disclosed technology, it is assumed that the image data (IDATA) is generated by the image sensing device.
The pixel array of the image sensing device may include at least one defective pixel that cannot normally capture a color image due to process limitations or temporary noise inflow. In addition, the pixel array may include at least one phase difference detection pixel. As is well known, the phase difference detection pixel may be configured to acquire phase difference-related information to improve autofocus performance. For example, the phase difference detection pixel may have a shape different from that of a normal imaging pixel. Like defective pixels, the phase difference detection pixels cannot normally sense color images, such that the phase difference detection pixels can be treated as defective pixels from the point of view of color images. In some implementations, for convenience of description and better understanding of the disclosed technology, the defective pixel and the phase difference detection pixel, each of which cannot normally sense the color image, will hereinafter be collectively referred to as "defective pixels".
In order to increase the quality of color images, the image signal processor 100 is required to accurately correct the defective pixels. To this end, the image signal processor 100, based on some implementations of the disclosed technology, may include a defective pixel detector 200 and a defective pixel corrector 300.
In exemplary embodiments, the defective pixel detector 200 may detect pixel data of the defective pixel among a plurality of pixels of a pixel array based on the image data (IDATA). In some implementations of the disclosed technology, for convenience of description, digital data corresponding to an output signal (hereinafter, a pixel signal) of each pixel will be defined as pixel data, and a set of pixel data corresponding to a predetermined unit (e.g., a frame or kernel) will be defined as image data (IDATA). Here, the frame may correspond to the entire pixel array including the plurality of pixels. The kernel may refer to a unit for image signal processing. For example, the kernel may refer to a group of pixels on which the image signal processing is performed at one time. In addition, an actual value of the pixel data may be defined as a "pixel value".
In some implementations, the defective pixel detector 200 may detect pixel data of the defective pixel based on the image data (IDATA). For example, the defective pixel detector 200 may compare pixel data of a target pixel with an average value of pixel data of the pixels in the kernel (hereinafter, the average value of the pixel data of the kernel). The defective pixel detector 200 may determine whether the target pixel is a defective pixel based on a difference between the pixel data of the target pixel and the average value of the pixel data of the kernel. For example, the defective pixel detector 200 may determine that the target pixel is a defective pixel when the difference is equal to or greater than a threshold value.
The threshold value to be described later may be a fixed constant or may be a specific ratio of a brightness value (e.g., a green average value) of the kernel. In addition, in some implementations, when a measurement value (for example, the difference) is compared with the threshold value, the case in which the measurement value is equal to or greater than the threshold value will hereinafter be denoted by "Large", and the case in which the measurement value is less than the threshold value will hereinafter be denoted by "Small", for convenience of description.
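By way of a non-limiting illustration only, the following Python sketch shows how such a threshold comparison might look; the function name, the use of the overall kernel mean, and the fixed threshold value are assumptions made for this example rather than the exact detection rule of the defective pixel detector 200.

```python
import numpy as np

def is_defective(kernel: np.ndarray, threshold: float) -> bool:
    """Illustrative sketch: flag the center (target) pixel of a kernel as
    defective when its value deviates from the kernel average by at least
    the threshold ("Large"); otherwise the deviation is "Small"."""
    target = float(kernel[kernel.shape[0] // 2, kernel.shape[1] // 2])
    kernel_avg = float(kernel.mean())
    diff = abs(target - kernel_avg)
    return diff >= threshold  # "Large" -> treat the target pixel as defective

# Example usage with an arbitrary 5x5 kernel of pixel values.
kernel = np.full((5, 5), 100.0)
kernel[2, 2] = 180.0  # the target pixel deviates strongly from its neighbors
print(is_defective(kernel, threshold=40.0))  # True
```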
In some other implementations, the defective pixel detector 200 may receive pre-stored position information of defective pixels obtained based on a previous process for correcting the defective pixel or a pixel test process. Further, the defective pixel detector 200 may determine whether the target pixel is a defective pixel based on the position information of the defective pixels. For example, the image sensing device may determine position information of inherently defective pixels as the position information of the defective pixels. Further, the image sensing device may store the position information of the defective pixels in an internal storage (e.g., one time programmable (OTP) memory), and may provide the position information of the defective pixels to the image signal processor 100.
When the target pixel is determined to be a defective pixel by the defective pixel detector 200, the defective pixel corrector 300 may correct the pixel data of the target pixel based on image data of a kernel including the target pixel.
Referring to
The corner pattern determiner 310 may determine a corner pattern corresponding to a target kernel including a target pixel.
Image data (IDATA) corresponding to one kernel (or one frame) may include textures of various sizes and shapes. The texture may refer to a set (or aggregate) of pixels having similarity. For example, a subject having a unified color included in a captured scene may be recognized as a texture. The boundary of the texture may include at least one corner. For example, the corner may divide pixels into pixels inside the corner (hereinafter, inside pixels) and pixels outside the corner (hereinafter, outside pixels). A difference between pixel data of the inside pixels and pixel data of the outside pixels may be larger than a difference between pixel data of neighboring pixels not bounded by the corner.
In
In this case, when the target pixel of the kernel is a defective pixel, the image signal processor 100 may correct a target pixel DP based on pixel data of adjacent pixels arranged according to the corner pattern.
Referring to
In addition, shaded pixels of
Referring to
In more detail, the kernels K1 to K8 may each include at least one of the corner patterns PT_A to PT_H. For example, the kernels K1 to K4 may each include two of the corner patterns PT_A to PT_D based on the cross-point of the corner. The kernels K5 to K8 may each include one of the corner patterns PT_E to PT_H. The corner patterns PT_A, PT_D, PT_E, PT_F, PT_G, and PT_H may include the target pixel DP located at each of their vertices. Meanwhile, the target pixel DP might not be located within the corner patterns PT_B and PT_C.
Although the embodiment of the disclosed technology assumes that there are eight corner patterns in the (5×5) kernel for convenience of description, other implementations are also possible, and it should be noted that more diverse corner patterns may exist in a kernel larger than the (5×5) kernel as needed. The defective pixel correction method based on some implementations of the disclosed technology can also be applied in substantially the same way to these corner patterns.
Referring back to
The first determiner 320 may compare a target gradient combination with a reference gradient combination. The target gradient combination may be related to a target kernel including a target pixel DP. The reference gradient combination may be related to each of the corner patterns. The first determiner 320 may generate a similarity score by the comparison result. Furthermore, the first determiner 320 may determine a corner pattern corresponding to the target kernel based on the similarity score.
In some implementations, the reference gradient combination may be a list of reference gradients, each of which is a result of comparing a difference value between pixel data of the two pixels of one of a plurality of pixel pairs with a predetermined threshold value. That is, the reference gradients may be arranged so as to correspond to the plurality of pixel pairs, respectively. For example, a pixel pair may include two pixels that have the same color but are located at different positions in the corner pattern.
In some implementations, the target gradient combination may be a list of target gradients arranged so as to correspond to the plurality of pixel pairs. Here, each target gradient may be obtained by comparing a difference value between pixel data of the pixels of each of the plurality of pixel pairs with a predetermined threshold value, based on the image data (IDATA) input to the target kernel, using the positions of the pixels of the plurality of pixel pairs used in the reference gradient combination without change. That is, the positions of the pixel pairs in the target kernel may be the same as the positions of the pixel pairs used for obtaining the reference gradient combination. The plurality of pixel pairs will be described in more detail below.
When a plurality of corner patterns is determined by the first determiner 320, the second determiner 330 may determine a corner pattern corresponding to the target kernel based on a complex gradient combination corresponding to each of the corner patterns determined by the first determiner 320 with respect to the target kernel.
In some implementations, the complex gradient combination may refer to a result of comparing average values of differences in pixel data between the two pixels allocated to each of the pixel combinations.
When a plurality of corner patterns is determined by the second determiner 330, the third determiner 340 may determine a corner pattern corresponding to the target kernel based on a result of applying a Laplacian filter corresponding to each of the corner patterns determined by the second determiner 330 to the target kernel.
In some implementations, the Laplacian filter may refer to, for each pixel combination, a difference between a value obtained by doubling pixel data of one of three pixels and a value corresponding to the sum of pixel data values of the remaining two pixels other than the one pixel.
When the corner pattern corresponding to the target kernel is determined by the corner pattern determiner 310, the pixel interpolator 350 may interpolate the target pixel DP using pixel data of pixels based on the corner pattern determined by the corner pattern determiner 310.
More detailed operations of the defective pixel corrector 300 will be described later with reference to the attached drawings below.
Referring to
In
Although the embodiment of the disclosed technology assumes, for convenience of description, a (5×5)-sized kernel having 25 pixels arranged in a Bayer pattern, the technical idea of the disclosed technology can also be applied to a kernel in which color pixels are arranged in other patterns such as a quad-Bayer pattern, a nona-Bayer pattern, a hexa-Bayer pattern, an RGBW pattern, a mono pattern, etc.; the types of image patterns are not limited thereto and can be changed as needed. In addition, a kernel having another size (e.g., a (10×10) size) other than the (5×5) size may be used depending on the performance of the image signal processor 100, the required correction accuracy, the arrangement method of color pixels, and the like.
The first determiner 320 may compare a reference gradient combination with a target gradient combination of the target kernel in which the target pixel DP determined to be a defective pixel by the defective pixel detector 200 is located at the center of the target kernel (S100). The first determiner 320 may determine at least one corner pattern corresponding to the target kernel based on the result of comparison.
When the first determiner 320 determines the presence of only one corner pattern corresponding to the target kernel (e.g., the number of corner patterns in S100=singular number), the pixel interpolator 350 may interpolate the target pixel DP based on pixel data of pixels that have the same color as the target pixel DP and belong to the same region (texture or non-texture region) of the corner pattern as the target pixel DP (S140). For example, when the target pixel DP belongs to the texture region of the corner pattern and the target pixel DP is a blue pixel, the pixel interpolator 350 may set an average value of pixel data of blue pixels located in the texture region of the corner pattern belonging to the target kernel as pixel data of the target pixel DP.
When the first determiner 320 determines the presence of a plurality of corner patterns corresponding to the target kernel (e.g., the number of corner patterns in S100=plural number), the second determiner 330 may calculate a complex gradient combination for the target kernel to determine the number of corner patterns corresponding to the target kernel (S110). For example, the second determiner 330 may determine whether the number of corner patterns of the target kernel is 1, based on the complex gradient combination.
When the second determiner 330 determines the presence of only one corner pattern corresponding to the target kernel (e.g., the number of corner patterns in S110=singular number), the pixel interpolator 350 may interpolate the target pixel DP in the same manner as before (S140).
When the second determiner 330 determines the presence of the plurality of corner patterns corresponding to the target kernel (e.g., the number of corner patterns in S110=plural number), the third determiner 340 may apply the Laplacian filter to the target kernel in which the target pixel DP determined to be a defective pixel by the defective pixel detector 200 is located at the center of the target kernel (S120). The third determiner 340 may determine whether the number of corner patterns corresponding to the target kernel is 1.
When the third determiner 340 determines the presence of only one corner pattern corresponding to the target kernel (e.g., the number of corner patterns in S120=singular number), the pixel interpolator 350 may interpolate the target pixel DP in the same manner as before (S140).
When the third determiner 340 determines the presence of the plurality of corner patterns corresponding to the target kernel (e.g., the number of corner patterns in S120=plural number), it can be seen that no single corner pattern corresponding to the target kernel can be specified, such that the pixel interpolator 350 may interpolate the target pixel DP based on the pixel data of pixels having the same color as the target pixel DP belonging to the target kernel (S130). For example, when the target pixel DP is a blue pixel, the pixel interpolator 350 may determine an average value of pixel data of all blue pixels belonging to the target kernel to be pixel data of the target pixel DP.
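The decision flow of steps S100 to S140 may be summarized, purely as an illustrative sketch and not as the actual implementation, by the cascade below; the stand-in determiner callables, the candidate pattern names, and the simple fallback average are hypothetical placeholders for the gradient, complex gradient, and Laplacian logic described above.

```python
from typing import Callable, List, Sequence

import numpy as np

# A determiner takes the target kernel and the current candidate corner patterns
# and returns the narrowed-down candidates.
Determiner = Callable[[np.ndarray, Sequence[str]], List[str]]

def narrow_corner_patterns(kernel: np.ndarray,
                           candidates: Sequence[str],
                           determiners: Sequence[Determiner]) -> List[str]:
    """Sketch of S100-S120: each determiner narrows the candidate corner
    patterns, and the cascade stops as soon as a single pattern remains."""
    remaining = list(candidates)
    for narrow in determiners:
        remaining = narrow(kernel, remaining)
        if len(remaining) <= 1:
            break
    return remaining

def fallback_interpolation(kernel: np.ndarray, same_color_mask: np.ndarray) -> float:
    """Sketch of S130: average of all same-color pixels in the target kernel."""
    return float(kernel[same_color_mask].mean())

# Toy usage with stand-in determiners (not the real gradient/Laplacian logic).
kernel = np.arange(25, dtype=float).reshape(5, 5)
first = lambda k, c: [p for p in c if p in ("PT_A", "PT_H")]   # S100
second = lambda k, c: list(c)                                  # S110: fails to narrow down
third = lambda k, c: list(c)[:1]                               # S120: settles on one pattern
print(narrow_corner_patterns(kernel, ["PT_A", "PT_B", "PT_H"], [first, second, third]))

same_color = np.zeros((5, 5), dtype=bool)
same_color[::2, ::2] = True   # assumed positions of pixels sharing the target color
print(fallback_interpolation(kernel, same_color))
```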
Referring to
The first determiner 320 may determine a corner pattern corresponding to the target kernel based on the similarity score. In addition, when the number of corner patterns corresponding to the target kernel is 1 based on the similarity score (i.e., ‘Yes’ in S210), the first determiner 320 may transmit pixel data of the corner pattern including the target pixel DP to the pixel interpolator 350. The target pixel DP may be positioned at the texture or non-texture region of the corner pattern.
For example, when the target pixel DP is positioned at the texture region of the corner pattern, the first determiner 320 may transmit pixel data of pixels of the texture region to the pixel interpolator 350.
When the number of corner patterns corresponding to the target kernel is determined to be a plural number based on the similarity score (i.e., ‘No’ in S210), the first determiner 320 may transmit identification information for identifying the corner patterns to the second determiner 330.
For example, when the corner patterns are PT_A, PT_B, PT_C, and PT_D, the first determiner 320 may transmit the identification information of the corner patterns PT_A, PT_B, PT_C, and PT_D to the second determiner 330. The second determiner 330 may use at least one of the corner patterns PT_A, PT_B, PT_C, and PT_D as candidate corner patterns.
The reference gradient combination 500 for each of the corner patterns may refer to a result of comparing a difference value between pixel data of each of the plurality of pixel pairs with a preset threshold value. The reference gradient combination 500 may be obtained by combining the results for the plurality of pixel pairs. In addition, each pixel pair to be used in the reference gradient may include two pixels that have the same color but are located at different positions in the kernel.
In other words, the reference gradient combination 500 may mean that a plurality of different pixel pairs located at predetermined positions with respect to a kernel is selected, a difference value between pixel data of the pixels of each of the selected pixel pairs is compared with a threshold value, and one of the two results denoted by "Large" and "Small" is listed for each pixel pair. In addition, the reference gradient combination 500 may vary depending on the type and number of corner patterns. For example, for pixel pair (1), the result value for the corner pattern PT_A is "Large" because abs(P11-P15) is equal to or greater than the threshold value, whereas the result value for the target kernel 400 is "Small" because abs(P11-P15) is less than the threshold value.
A target gradient combination 510 for the target kernel 400 may refer to a result of listing target gradients according to the plurality of pixel pairs. Each target gradient may be obtained by comparing a difference value between pixel data of the pixels of each of the plurality of pixel pairs with a predetermined threshold value, based on the image data (IDATA) input to the target kernel 400, using the positions of the pixels of the plurality of pixel pairs used in the reference gradient combination 500 without change.
For example, in a situation where the eleventh pixel P11 and the twenty-third pixel P23 are set as one pixel pair and the third pixel P3 and the fifteenth pixel P15 are set as another pixel pair to be used in the reference gradient combination 500, results such as "Large" and "Small" can be calculated for each pixel pair. Here, the gradients may be obtained by comparing the threshold value with each of a first difference between pixel data of the eleventh pixel P11 and pixel data of the twenty-third pixel P23, and a second difference between pixel data of the third pixel P3 and pixel data of the fifteenth pixel P15.
For example, when 22 pairs of pixels of
In addition, the target gradient combination 510 for the target kernel 400 may refer to a result in which one of the two results "Large" and "Small" is listed for each pixel pair, obtained by comparing the difference value between pixel data of each of the plurality of pixel pairs with the threshold value, based on the image data (IDATA) input to the target kernel 400 and using the pixel positions of the plurality of pixel pairs used in the reference gradient combination 500. The results "Large" and "Small" may be listed according to the 22 pairs of pixels.
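A rough Python illustration of how a target gradient combination might be compared with a reference gradient combination is given below; the three pixel pairs, the reference list chosen for the example pattern, and the match-counting similarity score are assumptions made for this sketch (the described embodiment uses 22 predefined pairs), and the pixel numbering P1 to P25 is assumed to be row-major in the (5×5) kernel.

```python
import numpy as np

# Illustrative pixel pairs given as (row, col) coordinates in a 5x5 kernel,
# assuming row-major numbering: (P11, P23), (P3, P15), (P11, P15).
PIXEL_PAIRS = [((2, 0), (4, 2)), ((0, 2), (2, 4)), ((2, 0), (2, 4))]

def gradient_combination(kernel: np.ndarray, threshold: float) -> list:
    """List "Large" or "Small" per pixel pair by comparing the pair's
    absolute pixel-data difference against the threshold."""
    results = []
    for (r1, c1), (r2, c2) in PIXEL_PAIRS:
        diff = abs(float(kernel[r1, c1]) - float(kernel[r2, c2]))
        results.append("Large" if diff >= threshold else "Small")
    return results

def similarity_score(target_comb: list, reference_comb: list) -> int:
    """Assumed scoring rule: count how many entries of the target gradient
    combination match the reference gradient combination of a corner pattern."""
    return sum(t == r for t, r in zip(target_comb, reference_comb))

# Example: compare a target kernel against one pattern's reference combination.
kernel = np.random.default_rng(0).integers(0, 255, size=(5, 5)).astype(float)
reference_for_one_pattern = ["Large", "Small", "Small"]  # illustrative values only
target = gradient_combination(kernel, threshold=64.0)
print(target, similarity_score(target, reference_for_one_pattern))
```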
In
Referring to
The second determiner 330 may determine a corner pattern corresponding to the target kernel based on a result of calculating the complex gradient combination with respect to the target kernel. In addition, when the number of corner patterns corresponding to the target kernel is determined to be only one (1) based on the complex gradient combination (‘Yes’ in S310), the second determiner 330 may transmit pixel data of pixels belonging to the same region as the texture or non-texture region of the corner pattern including the target pixel DP to the pixel interpolator 350.
For example, when the target pixel DP is included in the texture region of a determined corner pattern, the second determiner 330 may transmit pixel data of pixels included in the texture region of the determined corner pattern to the pixel interpolator 350.
When the number of corner patterns corresponding to the target kernel is determined to be a plural number based on the complex gradient combination (‘No’ in S310), the second determiner 330 may transmit identification information of each of the determined corner patterns to the third determiner 340.
For example, when the corner patterns are PT_B and PT_C, the second determiner 330 may transmit identification information of the corner patterns PT_B and PT_C to the third determiner 340, and the third determiner 340 may determine the corner pattern using the corner patterns PT_B and PT_C as candidate corner patterns.
Two pixel combinations may be included in each cell of the table of
The input image data (IDATA) may be different for each kernel. Thus, even when the complex gradient combination obtained using the same pixel combinations is used, the result of calculating the complex gradient combination for each kernel may be different. In this case, the target pixel DP may correspond to a blue or red pixel.
The complex gradient combination may refer to a result of comparison between average values of difference values between pixel data of the two pixels of each of the pixel combinations corresponding to the corner pattern. In addition, the complex gradient combination may mean that an average of difference values between pixel data of the pixel pair for each of the pixel combinations corresponding to the corner pattern is compared with a predetermined threshold value so as to indicate one of two results denoted by "Large" and "Small".
Pixel combinations used in the complex gradient may refer to pixel pairs in which pixels of the same color are paired with each other, and pixel combinations to be used in the complex gradient combination corresponding to each corner pattern may be determined in advance for each corner pattern.
For example, referring to
At this time, in a situation where the first corner pattern and the second corner pattern are candidate corner patterns, when a complex gradient combination corresponding to the first corner pattern and the second corner pattern is calculated for an arbitrary kernel, only one result from among the results denoted by “avg{abs(P11-P23)+abs(P3-P15)}<avg{abs(P12-P18)+abs(P8-P14)}”, “avg{abs(P11-P23)+abs(P3-P15)}=avg{abs(P12-P18)+abs(P8-P14)}” and “avg{abs(P11-P23)+abs(P3-P15)}>avg{abs(P12-P18)+abs(P8-P14)}” can be obtained. Here, “avg” is an operator for finding average values and “abs” is an operator for finding absolute values.
In addition, the second determiner 330 may calculate a complex gradient combination, and may determine the corner pattern corresponding to pixel combinations having the same or smaller average value to be the corner pattern corresponding to the target kernel. Therefore, in a situation where the first corner pattern and the second corner pattern are used as candidate patterns, when the result of calculating a complex gradient combination corresponding to the first corner pattern and the second corner pattern with respect to the target kernel is denoted by “avg{abs(P11-P23)+abs(P3-P15)}<avg{abs(P12-P18)+abs(P8-P14)}”, the first corner pattern may be determined to be a corner pattern corresponding to the target kernel, and when the result of calculating a complex gradient combination corresponding to the first corner pattern and the second corner pattern with respect to the target kernel is denoted by “avg{abs(P11-P23)+abs(P3-P15)}=avg{abs(P12-P18)+abs(P8-P14)}”, the first corner pattern and the second corner pattern may be determined to be corner patterns corresponding to the target kernel.
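The selection rule just described, in which the candidate pattern whose pixel combinations yield the same or smaller average difference is kept, may be sketched as follows; the particular pixel combinations, the tie handling (a tie keeps both patterns), and the pattern names are illustrative assumptions, with the pixel numbering again assumed to be row-major.

```python
import numpy as np

def avg_abs_diff(kernel: np.ndarray, pairs) -> float:
    """Average of |Pa - Pb| over the pixel pairs assigned to one corner pattern."""
    return float(np.mean([abs(float(kernel[a]) - float(kernel[b])) for a, b in pairs]))

def select_by_complex_gradient(kernel: np.ndarray, candidates: dict) -> list:
    """Assumed selection rule: keep the candidate pattern(s) whose pixel
    combinations yield the smallest (or tied smallest) average difference."""
    scores = {name: avg_abs_diff(kernel, pairs) for name, pairs in candidates.items()}
    best = min(scores.values())
    return [name for name, score in scores.items() if score == best]

# Illustrative pixel combinations for two candidate patterns, given as (row, col)
# coordinates of same-color pixels, e.g. (P11, P23), (P3, P15) and (P12, P18), (P8, P14).
candidates = {
    "first_pattern":  [((2, 0), (4, 2)), ((0, 2), (2, 4))],
    "second_pattern": [((2, 1), (3, 2)), ((1, 2), (2, 3))],
}
kernel = np.arange(25, dtype=float).reshape(5, 5)
print(select_by_complex_gradient(kernel, candidates))
```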
Referring back to
When the complex gradient combination corresponding to PT_B and PT_C is calculated, only one result from among the results denoted by “avg{abs(P11-P21)+abs(P21-P23)}<avg{abs(P3-P5)+abs(P5-P15)}”, “avg{abs(P11-P21)+abs(P21-P23)}=avg{abs(P3-P5)+abs(P5-P15)}” and “avg{abs(P11-P21)+abs(P21-P23)}>avg{abs(P3-P5)+abs(P5-P15)}” can be obtained.
At this time, the second determiner 330 may calculate the complex gradient combination, and may determine the corner pattern corresponding to pixel combinations having the same or smaller average value to be the corner pattern corresponding to the target kernel. Therefore, in a situation where the corner pattern PT_B and the corner pattern PT_C are used as candidate patterns, when the result of calculating a complex gradient combination corresponding to PT_B and PT_C with respect to the target kernel is denoted by “avg{abs(P11-P21)+abs(P21-P23)}<avg{abs(P3-P5)+abs(P5-P15)}”, the corner pattern PT_B may be determined to be a corner pattern corresponding to the target kernel, and when the result of calculating a complex gradient combination corresponding to PT_B and PT_C with respect to the target kernel is denoted by “avg{abs(P11-P21)+abs(P21-P23)}=avg{abs(P3-P5)+abs(P5-P15)}”, the corner patterns PT_B and PT_C may be determined to be corner patterns corresponding to the target kernel.
Referring back to
When the complex gradient combination corresponding to PT_A and PT_C is calculated, only one result from among the plurality of results denoted by “avg{abs(P8-P12)+abs(P14-P18)}<avg{abs(P12-P18)+abs(P8-P14)}”, “avg{abs(P8-P12)+abs(P14-P18)}=avg{abs(P12-P18)+abs(P8-P14)}”, and “avg{abs(P8-P12)+abs(P14-P18)}>avg{abs(P12-P18)+abs(P8-P14)}” can be obtained.
When the complex gradient combination corresponding to PT_A and PT_H is calculated, only one result from among the plurality of results denoted by “avg{abs(P23-P25)+abs(P25-P15)}<avg{abs(P1-P11)+abs(P1-P3)+abs(P23-P25)+abs(P25-P15)}”, “avg{abs(P23-P25)+abs(P25-P15)}=avg{abs(P1-P11)+abs(P1-P3)+abs(P23-P25)+abs(P25-P15)}”, and “avg{abs(P23-P25)+abs(P25-P15)}>avg{abs(P1-P11)+abs(P1-P3)+abs(P23-P25)+abs(P25-P15)}” can be obtained.
When the complex gradient combination corresponding to PT_C and PT_H is calculated, only one result from among the plurality of results denoted by “avg{abs(P3-P5)+abs(P5-P8)}<avg{abs(P1-P11)+abs(P1-P3)+abs(P23-P25)+abs(P25-P15)}”, “avg{abs(P3-P5)+abs(P5-P8)}=avg{abs(P1-P11)+abs(P1-P3)+abs(P23-P25)+abs(P25-P15)}”, and “avg{abs(P3-P5)+abs(P5-P8)}>avg{abs(P1-P11)+abs(P1-P3)+abs(P23-P25)+abs(P25-P15)}” can be obtained.
At this time, the second determiner 330 may calculate the complex gradient combination, and may determine the corner pattern corresponding to pixel combinations having the same or smaller average value to be the corner pattern corresponding to the target kernel. Therefore, in a situation where the corner patterns (PT_A, PT_C, PT_H) are used as candidate patterns, i) when the result of calculating a complex gradient combination corresponding to PT_A and PT_C with respect to the target kernel is denoted by “avg{abs(P8-P12)+abs(P14-P18)}<avg{abs(P12-P18)+abs(P8-P14)}”, ii) when the result of calculating a complex gradient combination corresponding to PT_A and PT_H with respect to the target kernel is denoted by “avg{abs(P23-P25)+abs(P25-P15)}=avg{abs(P1-P11)+abs(P1-P3)+abs(P23-P25)+abs(P25-P15)}”, and iii) when the result of calculating a complex gradient combination corresponding to PT_C and PT_H with respect to the target kernel is denoted by “avg{abs(P3-P5)+abs(P5-P8)}>avg{abs(P1-P11)+abs(P1-P3)+abs(P23-P25)+abs(P25-P15)}”, the corner patterns PT_A and PT_H may be determined to be corner patterns corresponding to the target kernel.
At this time, the second determiner 330 may calculate the complex gradient combination, and may determine the corner patterns (PT_A, PT_H) corresponding to pixel combinations each having “Small” to be corner patterns corresponding to the target kernel, and may transmit identification information of the corner patterns (PT_A, PT_H) to the third determiner 340.
Referring to
The third determiner 340 may determine a corner pattern corresponding to the target kernel based on the result of applying the Laplacian filter to the target kernel. In addition, when the number of corner patterns corresponding to the target kernel is determined to be only one based on the Laplacian filter ('Yes' in S410), the third determiner 340 may transmit pixel data of pixels belonging to the same region as the texture or non-texture region of the corner pattern including the target pixel DP to the pixel interpolator 350 (S420).
For example, when the target pixel DP is included in the texture region of one determined corner pattern, pixel data of pixels included in the texture region of the determined corner pattern may be transmitted to the pixel interpolator 350 (S420).
When the number of corner patterns corresponding to the target kernel is determined to be a plural number based on the Laplacian filter (‘No’ in S410), the third determiner 340 may transmit, to the pixel interpolator 350, information indicating that the corner pattern cannot be specified.
For example, when the determined corner patterns are PT_B and PT_C, the third determiner 340 may transmit, to the pixel interpolator 350, information indicating that the corner pattern cannot be specified.
Each cell of the table shown in
The input image data (IDATA) may be different for each kernel. Thus, even when the Laplacian filter obtained using the same pixel combination is used, the result of applying the Laplacian filter to each kernel may be different. In this case, the target pixel DP may correspond to a blue or red pixel.
The Laplacian filter may refer to a result indicating, for each pixel combination corresponding to the corner pattern, a difference between a value obtained by doubling pixel data of one of three pixels and a value corresponding to the sum of pixel data values of the remaining two pixels other than the one pixel.
In addition, the Laplacian filter may refer to one of the two results "Large" and "Small" obtained by comparing a predetermined threshold value with the difference, for each pixel combination corresponding to the corner pattern, between the value obtained by doubling pixel data of one of the three pixels and the value corresponding to the sum of pixel data values of the remaining two pixels other than the one pixel.
The pixel combination used in the Laplacian filter may mean that pixels of the same color are grouped together, and the pixel combination used in the Laplacian filter corresponding to each pattern may be determined in advance for each pattern.
For example, a combination of the eleventh pixel P11, the third pixel P3, and the fifteenth pixel P15, and a combination of the third pixel P3, the fifteenth pixel P15, and the twenty-third pixel P23 may be determined to be pixel combinations corresponding to a first corner pattern, and a combination of the eleventh pixel P11, the twenty-third pixel P23, and the fifteenth pixel P15, and a combination of the twenty-third pixel P23, the fifteenth pixel P15, and the third pixel P3 may be determined to be pixel combinations corresponding to a second corner pattern.
At this time, in a situation where the first corner pattern and the second corner pattern are candidate patterns, when the Laplacian filter corresponding to the first corner pattern is calculated for an arbitrary kernel, the result of comparing each of "abs(P3*2-P11-P15)" and "abs(P15*2-P3-P23)" with a predetermined threshold value can be obtained based on the calculated Laplacian filter. Further, when the Laplacian filter corresponding to the second corner pattern is calculated, the result of comparing each of "abs(P23*2-P11-P15)" and "abs(P15*2-P23-P3)" with a predetermined threshold value based on the calculated Laplacian filter can be obtained.
In addition, the third determiner 340 may calculate the Laplacian filter, and may determine a corner pattern corresponding to the target kernel based on the results listed according to the pixel combinations. Therefore, in a situation where the first corner pattern and the second corner pattern are candidate patterns, i) when the result of applying the Laplacian filter corresponding to the first corner pattern to the target kernel is denoted by "(Small, Small)" and the result corresponding to the second corner pattern is denoted by "(Small, Large)", and ii) when the result of applying the Laplacian filter corresponding to the first corner pattern to the target kernel is denoted by "(Small, Small)" and the result of applying the Laplacian filter corresponding to the second corner pattern to the target kernel is denoted by "(Small, Small)", the first corner pattern may be determined to be a corner pattern corresponding to the target kernel.
Referring back to
When the Laplacian filter corresponding to PT_D is used, the result of comparing each of “abs(P8*2-P12-P4)” and “abs(P12*2-P16-P8)” with a predetermined threshold value can be obtained, and when the Laplacian filter corresponding to PT_H is used, the result of comparing each of “abs(P14*2-P18-P10)” and “abs(P18*2-P22-P14)” with a predetermined threshold value can be obtained.
At this time, when all the results obtained by comparing the resultant values of the respective pixel combinations with a threshold value using the Laplacian filter are denoted by “Small”, the third determiner 340 may determine corner patterns of the corresponding pixel combination to be corner patterns corresponding to the target kernel. Therefore, in a situation where the corner patterns PT_D and PT_H are candidate patterns, i) when the result obtained by applying the Laplacian filter corresponding to PT_D to the target kernel is denoted by “(Small, Small)”, and ii) when the result obtained by applying the Laplacian filter corresponding to PT_H to the target kernel is denoted by “(Small, Large)”, the corner pattern PT_D may be determined to be a corner pattern corresponding to the target kernel.
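A rough sketch of this selection is given below, reusing the PT_D and PT_H pixel combinations from the example above; the threshold value, the convention that the doubled pixel is listed first in each triple, and the "all results are Small" acceptance rule are assumptions taken from that example, with the pixel numbering assumed to be row-major.

```python
import numpy as np

def laplacian_results(kernel: np.ndarray, triples, threshold: float) -> list:
    """For each (doubled, a, b) triple, compare |2*doubled - a - b| with the
    threshold and record "Large" or "Small"."""
    results = []
    for doubled, a, b in triples:
        value = abs(2.0 * float(kernel[doubled]) - float(kernel[a]) - float(kernel[b]))
        results.append("Large" if value >= threshold else "Small")
    return results

def select_by_laplacian(kernel: np.ndarray, candidates: dict, threshold: float) -> list:
    """Assumed rule from the example above: keep the candidate pattern(s) whose
    pixel combinations all evaluate to "Small"."""
    return [
        name for name, triples in candidates.items()
        if all(r == "Small" for r in laplacian_results(kernel, triples, threshold))
    ]

# (row, col) coordinates of the combinations (P8, P12, P4), (P12, P16, P8) for PT_D
# and (P14, P18, P10), (P18, P22, P14) for PT_H, assuming row-major numbering.
candidates = {
    "PT_D": [((1, 2), (2, 1), (0, 3)), ((2, 1), (3, 0), (1, 2))],
    "PT_H": [((2, 3), (3, 2), (1, 4)), ((3, 2), (4, 1), (2, 3))],
}
kernel = np.full((5, 5), 100.0)  # a flat kernel makes every result "Small"
print(select_by_laplacian(kernel, candidates, threshold=20.0))  # both patterns remain
```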
Referring back to
When the Laplacian filter corresponding to PT_D is used, the result of comparing each of "abs(P7*2-P11-P3)" and "abs(P7*2-P17-P9)" with a predetermined threshold value can be obtained, and when the Laplacian filter corresponding to PT_H is used, the result of comparing each of "abs(P19*2-P17-P9)" and "abs(P19*2-P23-P15)" with the predetermined threshold value can be obtained.
At this time, in a situation where the resultant value of each pixel combination is compared with a threshold value using the Laplacian filter, when the result of a pixel combination corresponding to each solid arrow is denoted by “Small” and the result of a pixel combination corresponding to each dotted arrow is denoted by “Large”, the corner pattern of the corresponding pixel combination may be determined to be a corner pattern corresponding to the target kernel. Therefore, in a situation where PT_D and PT_H are candidate patterns, the result of applying the Laplacian filter corresponding to PT_D to the target kernel is denoted by “(Small, Large)” according to the order of pixel combinations shown in
At this time, the third determiner 340 may determine the corner pattern PT_D to be a corner pattern corresponding to the target kernel according to the result of applying the Laplacian filter to the target kernel, and may transmit information about the corner pattern PT_D to the pixel interpolator 350.
Referring to
For example, when no weight is used and the corner pattern corresponding to the target kernel is determined to be PT_A, the pixel interpolator 350 may calculate an average value of pixel data of some pixels (i.e., the first pixel P1, the fifteenth pixel P15, the twenty-third pixel P23, and the twenty-fifth pixel P25) that belong to the same region as the corner pattern region including the target pixel DP while having the same color as the target pixel DP, and may determine the calculated average value to be pixel data of the target pixel DP.
In addition, when weights are used, the pixel interpolator 350 may calculate a weighted average value of a first value obtained when '3' is multiplied by pixel data of the fifteenth pixel P15 and the twenty-third pixel P23 from among the pixels (P1, P15, P23, P25) and a second value obtained when '1' is multiplied by pixel data of the first pixel P1 and the twenty-fifth pixel P25 from among the pixels (P1, P15, P23, P25), and may determine the calculated weighted average value to be pixel data of the target pixel DP. Here, the pixels P15 and P23 used to calculate the first value may be located closer to the target pixel DP, and the other pixels P1 and P25 used to calculate the second value may be located farther from the target pixel DP.
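As a concrete but hypothetical sketch of the two interpolation options just described, the snippet below computes both the plain average and the weighted average of the pixels P1, P15, P23, and P25, with weights of 3 for the nearer pixels (P15, P23) and 1 for the farther pixels (P1, P25); the row-major positions P1=(0,0), P15=(2,4), P23=(4,2), and P25=(4,4) are an assumption of this example.

```python
import numpy as np

# Assumed row-major positions of the same-color pixels used for corner pattern PT_A.
P1, P15, P23, P25 = (0, 0), (2, 4), (4, 2), (4, 4)

def interpolate_pt_a(kernel: np.ndarray, use_weights: bool = False) -> float:
    """Sketch: interpolate the target pixel from P1, P15, P23, P25, where the
    nearer pixels (P15, P23) get weight 3 and the farther pixels (P1, P25) get
    weight 1 when weighting is enabled."""
    values = [float(kernel[P15]), float(kernel[P23]), float(kernel[P1]), float(kernel[P25])]
    weights = [3.0, 3.0, 1.0, 1.0] if use_weights else [1.0, 1.0, 1.0, 1.0]
    return float(np.average(values, weights=weights))

kernel = np.arange(25, dtype=float).reshape(5, 5)
print(interpolate_pt_a(kernel))                    # plain average becomes the target pixel data
print(interpolate_pt_a(kernel, use_weights=True))  # weighted average becomes the target pixel data
```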
Referring to
For example, when no weight is used and the corner patterns corresponding to the target kernel are determined to be PT_A and PT_B, the pixel interpolator 350 may determine an average value of the pixel data of some pixels (i.e., P1, P3, P5, P11, P15, P21, P23, P25) having the same color as the target pixel DP to be pixel data of the target pixel DP.
In addition, when weights are used, the pixel interpolator 350 may calculate a weighted average value of a first value obtained when '3' is multiplied by pixel data of some pixels (P3, P11, P15, P23) located closer to the target pixel DP from among the pixels (P1, P3, P5, P11, P15, P21, P23, P25) and a second value obtained when '1' is multiplied by pixel data of some pixels (P1, P5, P21, P25) located farther from the target pixel DP from among the pixels (P1, P3, P5, P11, P15, P21, P23, P25), and may determine the calculated weighted average value to be pixel data of the target pixel DP.
Referring to
The computing device 1400 may be mounted on a chip that is independent from the chip on which the image sensing device is mounted. According to one embodiment, the chip on which the image sensing device is mounted and the chip on which the computing device 1400 is mounted may be implemented in one package, for example, a multi-chip package (MCP), but the scope of the disclosed technology is not limited thereto.
Additionally, the internal configuration or arrangement of the image sensing device and the image signal processor 100 described in
The computing device 1400 may include a Defect Pixel Processor 1410, a Defect Pixel Memory 1420, an input/output interface 1430, and a communication interface 1440.
The Defect Pixel Processor 1410 may process data and/or instructions required to perform the operations of the components (200, 300) of the image signal processor 100 described in
The Defect Pixel Memory 1420 may store data and/or instructions required to perform operations of the components (200, 300) of the image signal processor 100, and may be accessed by the Defect Pixel Processor 1410. For example, the Defect Pixel Memory 1420 may be volatile memory (e.g., Dynamic Random Access Memory (DRAM), Static Random Access Memory (SRAM), etc.) or non-volatile memory (e.g., Programmable Read Only Memory (PROM), Erasable PROM (EPROM), Electrically Erasable PROM (EEPROM), flash memory, etc.).
That is, the computer program for performing the operations of the image signal processor 100 disclosed in this document is recorded in the Defect Pixel Memory 1420 and executed and processed by the Defect Pixel Processor 1410, thereby implementing the operations of the image signal processor 100.
The input/output interface 1430 is an interface that connects an external input device (e.g., keyboard, mouse, touch panel, etc.) and/or an external output device (e.g., display) to the Defect Pixel Processor 1410 to allow data to be transmitted and received.
The communication interface 1440 is a component that can transmit and receive various data with an external device (e.g., an application processor, external memory, etc.), and may be a device that supports wired or wireless communication.
As is apparent from the above description, the image signal processor and the image signal processing method based on some implementations of the disclosed technology can interpolate the target pixel by determining a corner pattern corresponding to a target kernel, and can increase the accuracy of correction for defective pixels even when the target kernel corresponds to the corner pattern.
The embodiments of the disclosed technology may provide a variety of effects capable of being directly or indirectly recognized through the above-mentioned patent document.
Although a number of illustrative embodiments have been described, it should be understood that modifications and enhancements to the disclosed embodiments and other embodiments can be devised based on what is described and/or illustrated in this patent document.
Number | Date | Country | Kind
---|---|---|---
10-2023-0092057 | Jul. 14, 2023 | KR | national