The present disclosure relates to the field of imaging technologies, and more particularly, to an image sensor, an imaging apparatus, an electronic device, an image processing system, and a signal processing method.
Electronic devices such as mobile phones are typically equipped with cameras to provide photographing functions. An image sensor is provided in a camera. In order to capture color images, an image sensor is typically provided with a filter array arranged in the form of a Bayer array, such that a plurality of pixels in the image sensor can receive light passing through the corresponding filters, thereby generating pixel signals of different color channels. However, image sensors have poor light sensitivity in low-luminance environments, making it difficult to obtain high-definition images.
The present disclosure provides an image sensor, an imaging apparatus, an electronic device, an image processing system, and a signal processing method.
An image sensor according to an embodiment of the present disclosure includes a filter array, a pixel array, and a processing circuit. The filter array includes a plurality of filter regions each including a plurality of filter units. Each filter unit includes at least one first color filter, at least one second color filter, and at least one third color filter. The pixel array includes a plurality of pixels each corresponding to one filter in the filter array and configured to receive light passing through the corresponding filter to generate an electrical signal. The processing circuit is provided on a substrate having the pixel array and configured to: combine the electrical signals generated by the pixels corresponding to each filter unit for outputting as a combined luminance value and forming a first intermediate image, the combined luminance value representing luminance of light applied to the pixels corresponding to the filter unit; generate a first color signal, a second color signal, and a third color signal based on the electrical signals generated by the pixels corresponding to each filter region, the first color signal representing a value in a first color channel of light applied to the pixels corresponding to the filter region, the second color signal representing a value in a second color channel of the light applied to the pixels corresponding to the filter region, and the third color signal representing a value in a third color channel of the light applied to the pixels corresponding to the filter region; and process the first color signal, the second color signal, and the third color signal to obtain a plurality of second intermediate images representing chrominance values of the filter region, and fuse the first intermediate image and the plurality of second intermediate images to obtain a first target image.
An imaging apparatus according to an embodiment of the present disclosure includes an image sensor and a processor. The image sensor includes a filter array and a pixel array. The filter array includes a plurality of filter regions each including a plurality of filter units. Each filter unit includes at least one first color filter, at least one second color filter, and at least one third color filter. The pixel array includes a plurality of pixels each corresponding to one filter in the filter array and configured to receive light passing through the corresponding filter to generate an electrical signal. The electrical signals generated by the pixels corresponding to each filter unit are combined for outputting as a combined luminance value and forming a first intermediate image. The combined luminance value represents luminance of light applied to the pixels corresponding to the filter unit. The processor is configured to: generate a first color signal, a second color signal, and a third color signal based on the electrical signals generated by the pixels corresponding to each filter region, the first color signal representing a value in a first color channel of the light applied to the pixels corresponding to the filter region, the second color signal representing a value in a second color channel of the light applied to the pixels corresponding to the filter region, and the third color signal representing a value in a third color channel of the light applied to the pixels corresponding to the filter region; and process the first color signal, the second color signal, and the third color signal to obtain a plurality of second intermediate images representing chrominance values of the filter region, and fuse the first intermediate image and the plurality of second intermediate images to obtain a first target image.
An electronic device according to an embodiment of the present disclosure includes an imaging apparatus and a processor. The imaging apparatus includes an image sensor. The image sensor includes a filter array and a pixel array. The filter array includes a plurality of filter regions each including a plurality of filter units. Each filter unit includes at least one first color filter, at least one second color filter, and at least one third color filter. The pixel array includes a plurality of pixels each corresponding to one filter in the filter array and configured to receive light passing through the corresponding filter to generate an electrical signal. The electrical signals generated by the pixels corresponding to each filter unit are combined for outputting as a combined luminance value and forming a first intermediate image. The combined luminance value represents luminance of light applied to the pixels corresponding to the filter unit. The processor is configured to: generate a first color signal, a second color signal, and a third color signal based on the electrical signals generated by the pixels corresponding to each filter region, the first color signal representing a value in a first color channel of the light applied to the pixels corresponding to the filter region, the second color signal representing a value in a second color channel of the light applied to the pixels corresponding to the filter region, and the third color signal representing a value in a third color channel of the light applied to the pixels corresponding to the filter region; and process the first color signal, the second color signal, and the third color signal to obtain a plurality of second intermediate images representing chrominance values of the filter region, and fuse the first intermediate image and the plurality of second intermediate images to obtain a first target image.
An image processing system according to an embodiment of the present disclosure includes an electronic device and a processor. The electronic device includes an imaging apparatus. The imaging apparatus includes an image sensor. The image sensor includes a filter array and a pixel array. The filter array includes a plurality of filter regions each including a plurality of filter units. Each filter unit includes at least one first color filter, at least one second color filter, and at least one third color filter. The pixel array includes a plurality of pixels each corresponding to one filter in the filter array and configured to receive light passing through the corresponding filter to generate an electrical signal. The electrical signals generated by the pixels corresponding to each filter unit are combined for outputting as a combined luminance value and forming a first intermediate image. The combined luminance value represents luminance of light applied to the pixels corresponding to the filter unit. The processor is configured to: generate a first color signal, a second color signal, and a third color signal based on the electrical signals generated by the pixels corresponding to each filter region, the first color signal representing a value in a first color channel of light applied to the pixels corresponding to the filter region, the second color signal representing a value in a second color channel of the light applied to the pixels corresponding to the filter region, and the third color signal representing a value in a third color channel of the light applied to the pixels corresponding to the filter region; and process the first color signal, the second color signal, and the third color signal to obtain a plurality of second intermediate images representing chrominance values of the filter region, and fuse the first intermediate image and the plurality of second intermediate images to obtain a first target image.
A signal processing method according to an embodiment of the present disclosure is applied in an image sensor. The image sensor includes a filter array and a pixel array. The filter array includes a plurality of filter regions each including a plurality of filter units. Each filter unit includes at least one first color filter, at least one second color filter, and at least one third color filter. The pixel array includes a plurality of pixels each corresponding to one filter in the filter array and configured to receive light passing through the corresponding filter to generate an electrical signal. The signal processing method includes: combining the electrical signals generated by the pixels corresponding to each filter unit for outputting as a combined luminance value and forming a first intermediate image, the combined luminance value representing luminance of light applied to the pixels corresponding to the filter unit; generating a first color signal, a second color signal, and a third color signal based on the electrical signals generated by the pixels corresponding to each filter region, the first color signal representing a value in a first color channel of light applied to the pixels corresponding to the filter region, the second color signal representing a value in a second color channel of the light applied to the pixels corresponding to the filter region, and the third color signal representing a value in a third color channel of the light applied to the pixels corresponding to the filter region; processing the first color signal, the second color signal, and the third color signal to obtain a plurality of second intermediate images representing chrominance values of the filter region; and fusing the first intermediate image and the plurality of second intermediate images to obtain a first target image.
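As a rough, non-limiting illustration of the signal flow described above, the following Python sketch (not part of the disclosure; the function names, the NumPy representation, and the assumed 2*2-pixel filter units within 4*4-pixel filter regions are hypothetical) shows how pixel signals could be combined into per-unit luminance values and averaged into per-region color signals. The chrominance conversion and fusion steps are sketched further below.

```python
import numpy as np

def combine_luminance(raw, unit=2):
    """Sum the electrical signals of the unit*unit pixels behind each filter
    unit into one combined luminance value, forming the first intermediate
    image (one value per filter unit)."""
    h, w = raw.shape
    return raw.reshape(h // unit, unit, w // unit, unit).sum(axis=(1, 3))

def region_color_signals(raw, cfa, region=4):
    """Average, within each region*region filter region, the pixels behind
    filters of the same color to obtain the first, second, and third color
    signals (one value per color channel per filter region)."""
    h, w = raw.shape
    signals = {}
    for color in ("A", "B", "C"):  # first, second, and third color filters
        mask = (cfa == color)
        vals = np.where(mask, raw, 0.0).reshape(h // region, region, w // region, region)
        counts = mask.reshape(h // region, region, w // region, region)
        signals[color] = vals.sum(axis=(1, 3)) / counts.sum(axis=(1, 3))
    return signals
```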
Additional aspects and advantages of the embodiments of the present disclosure will be given at least in part in the following description, or become apparent at least in part from the following description, or can be learned from practicing of the present disclosure.
The above and/or additional aspects and advantages of the present disclosure will become more apparent and more understandable from the following description of embodiments taken in conjunction with the accompanying drawings, in which:
The embodiments of the present disclosure will be described in detail below with reference to examples thereof as illustrated in the accompanying drawings, throughout which same or similar elements, or elements having same or similar functions, are denoted by same or similar reference numerals. The embodiments described below with reference to the drawings are illustrative only, and are intended to explain, rather than limiting, the present disclosure.
Referring to
In the image sensor 10 according to the embodiment of the present disclosure, since the combined luminance value is a combined output of the electrical signals from all the pixels 120 corresponding to each filter unit 1111, the combined output of the electrical signals from the pixels 120 is equivalent to increasing the photosensitive area of the pixels, such that the photosensitive capability of the pixels can be improved. Therefore, the obtained combined luminance value is more accurate, and the first target image formed by using the combined luminance value is also more accurate. In addition, since the color signals (the first color signal, the second color signal, and the third color signal) are generated by the pixels 120 corresponding to the filter region 111, the color signals in the present disclosure are also more accurate than color signals generated by one single pixel 120, especially in a low-luminance environment.
Referring to
In the image sensor 10 according to the embodiment of the present disclosure, the ratio between the number of first color filters A, the number of second color filters B, and the number of third color filters C may be the same in each of the plurality of filter units 1111. For example, the ratio between the number of first color filters A, the number of second color filters B, and the number of third color filters C in each filter unit 1111 may be 2:1:1. In another example, the ratio between the number of first color filters A, the number of second color filters B, and the number of third color filters C in each filter unit 1111 may be 5:2:2.
In the embodiment of the present disclosure, each first color filter A may be a green filter G, each second color filter B may be a red filter R, and each third color filter C may be a blue filter Bu. In the example where the ratio between the number of first color filters A, the number of second color filters B, and the number of third color filters C in each filter unit 1111 is 2:1:1, the ratio between the number of green filters G, the number of red filters R, and the number of blue filters Bu in each filter unit 1111 is 2:1:1. That is, the combined luminance value is R+2G+B. Since this weighting is similar to that of the standard luminance signal (Y = 0.299R + 0.587G + 0.114B), the combined luminance value obtained in this case is more accurate.
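As a purely numerical illustration (not from the disclosure; the example channel values are arbitrary), normalizing the combined value R+2G+B of a 2:1:1 filter unit gives green the largest weight, roughly tracking the standard luminance formula:

```python
r, g, b = 120, 200, 80                        # arbitrary example channel values
combined = (r + 2 * g + b) / 4                # normalized R+2G+B -> 150.0
standard = 0.299 * r + 0.587 * g + 0.114 * b  # standard luminance -> 162.4
print(combined, standard)
```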
In some embodiments, the filter array 11 may include a plurality of first sets of filters 1131 and a plurality of second sets of filters 1132. Each of the plurality of first sets of filters 1131 may include a plurality of first color filters A and a plurality of second color filters B, and the number of first color filters A is equal to the number of second color filters B in each of the plurality of first sets of filters 1131. Each of the plurality of second sets of filters 1132 may include a plurality of first color filters A and a plurality of third color filters C, and the number of first color filters A is equal to the number of third color filters C in each of the plurality of second sets of filters 1132. A sub-array formed by the arrangement of all the first sets of filters 1131 and all the second sets of filters 1132 is a part of the filter array 11; alternatively, a sub-array formed by the arrangement of all the filter regions 111 is a part of the filter array 11. In a low-luminance mode, the image sensor 10 may be configured to obtain the first target image. In a clear texture mode, the processing circuit 13 may be configured to combine the electrical signals generated by the pixels 120 corresponding to each of the plurality of first sets of filters 1131 to generate a first pixel signal and a third pixel signal. The first pixel signal represents a value in the first color channel of light applied to the pixels 120 corresponding to the first set of filters 1131, and the third pixel signal represents a value in the second color channel of the light applied to the pixels 120 corresponding to the first set of filters 1131. The processing circuit 13 may be further configured to combine the electrical signals generated by the pixels 120 corresponding to each of the plurality of second sets of filters 1132 to generate a second pixel signal and a fourth pixel signal. The second pixel signal represents a value in the first color channel of the light applied to the pixels 120 corresponding to the second set of filters 1132, and the fourth pixel signal represents a value in a third color channel of light applied to the pixels 120 corresponding to the second set of filters 1132. The processing circuit 13 may be further configured to obtain a second target image based on the first pixel signal, the second pixel signal, the third pixel signal, and the fourth pixel signal.
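A minimal sketch of the clear-texture-mode binning described above, assuming k*k sets of filters aligned with the top-left corner of the raw pixel array and assuming the combination is an average (the disclosure only states that the electrical signals are combined); the function and variable names are hypothetical:

```python
import numpy as np

def set_pixel_signals(raw, cfa, k=2):
    """For each k*k set of filters, combine the pixels behind the first color
    filters A into one pixel signal (the first or second pixel signal) and the
    pixels behind the other color (B or C) into another (the third or fourth
    pixel signal)."""
    h, w = raw.shape
    blocks = raw.reshape(h // k, k, w // k, k).astype(float)
    is_a = (cfa == "A").reshape(h // k, k, w // k, k)
    a_signal = np.where(is_a, blocks, 0).sum(axis=(1, 3)) / is_a.sum(axis=(1, 3))
    other_signal = np.where(~is_a, blocks, 0).sum(axis=(1, 3)) / (~is_a).sum(axis=(1, 3))
    return a_signal, other_signal
```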
In some embodiments, if the filter array 11 is manufactured in units of filter regions 111, all the first sets of filters 1131 and all the second sets of filters 1132 are arranged to form a sub-array which is a part of the filter array 11. When acquiring the first target image, the electrical signals generated by all the pixels 120 are read. When acquiring the second target image, the electrical signals generated by the pixels 120 in the first row and/or the first column are not read. Taking the filter array 11 of
For example, in the example where the ratio between the number of first color filters A, the number of second color filters B, and the number of third color filters C in each filter unit 1111 is 2:1:1, one first color filter A and one second color filter B in one filter unit 1111 and one first color filter A and one second color filter B in another filter unit 1111 are combined into a first set of filters 1131. One first color filter A and one third color filter C in one filter unit 1111 and one first color filter A and one third color filter C in another filter unit 1111 are combined into a second set of filters 1132.
In some embodiments, if the filter array 11 is manufactured in units of first sets of filters 1131 and second sets of filters 1132, all the filter regions 111 are arranged to form a sub-array which is a part of the filter array 11. When acquiring the first target image, the electrical signals generated by the pixels 120 in the first row and/or the first column are not read. When acquiring the second target image, the electrical signals generated by all the pixels 120 are read. Taking the filter array 11 of
In an embodiment of the present disclosure, the user can switch between the modes by selecting the desired mode. For example, the display interface may display the low-luminance mode and the clear texture mode. When the user selects the low-luminance mode, the first target image is outputted, and when the user selects the clear texture mode, the second target image is outputted. In this way, the switching between the low-luminance mode and the clear texture mode can be achieved with the same filter array 11. In the low-luminance mode, since the combined luminance value is outputted by combining the electrical signals of all the pixels 120 corresponding to each filter unit 1111, and the color signals are generated by the pixels 120 corresponding to the filter region 111, images can still be acquired accurately and effectively in a low-luminance environment. That is, the low-luminance mode can be applied in the low-luminance environment. In the clear texture mode, since both the first set of filters 1131 and the second set of filters 1132 have the first color filters A, both the pixels 120 corresponding to the first set of filters 1131 and the pixels 120 corresponding to the second set of filters 1132 are capable of generating pixel signals having values in the first color channel. Therefore, in the process of generating the second target image, no interpolation is needed for the values in the first color channel, and the color reproduction of the color image can be more accurate.
Referring to
The filter array 11 includes a plurality of first sets of filters 1131 and a plurality of second sets of filters 1132. Each of the plurality of first sets of filters 1131 includes a plurality of first color filters A and a plurality of second color filters B. The number of first color filters A and the number of second color filters B are the same in each of the plurality of first sets of filters 1131. Each of the plurality of second sets of filters 1132 includes a plurality of first color filters A and a plurality of third color filters C. The number of first color filters A and the number of third color filters C are the same in each of the plurality of second sets of filters 1132.
The pixel array 12 includes a plurality of pixels 120 each corresponding to one filter 110 in the filter array 11 and configured to receive light passing through the corresponding filter 110 to generate an electrical signal.
The microlens array 15 includes a plurality of sets of microlenses 151. Each set of microlenses 151 in the microlens array 15 corresponds to one set of filters 113 (the first set of filters 1131 or the second set of filters 1132), and to the pixels 120 corresponding to the one set of filters 113. As illustrated in
Here, each first color filter A can be a green filter G, each second color filter B can be a red filter R, and each third color filter C can be a blue filter Bu.
Here, the plurality of first sets of filters 1131 may be arranged in a first diagonal direction D1, and the plurality of second sets of filters 1132 may be arranged in a second diagonal direction D2 different from the first diagonal direction D1. In an example, when the plurality of first sets of filters 1131 is arranged in the first diagonal direction D1 and the plurality of second sets of filters 1132 is arranged in the second diagonal direction D2, the first sets of filters 1131 and the second sets of filters 1132 may be arranged adjacently to each other in a vertical direction and a horizontal direction of the image sensor 10.
Here, the number of filters 110 in each of the plurality of first sets of filters 1131 is K*K, and the number of filters 110 in each of the plurality of second sets of filters 1132 is K*K, where K is an integer greater than or equal to 2. For example, the value of K may be 2, 3, 4, 5, 6, 7, 8, 9, 10, 15, 20, etc., and the present disclosure is not limited to any of these values.
Here, the arrangement of the filters 110 in each first set of filters 1131 may be: (1) referring to
Here, the arrangement of the filters 110 in each second set of filters 1132 may be: (1) referring to
Here, A denotes a first color filter, B denotes a second color filter, and C denotes a third color filter. The number of filters 110 in each first set of filters 1131 is 2*2, and the number of filters 110 in each second set of filters 1132 is 2*2.
As illustrated in
It is to be noted that the first diagonal direction D1 and the second diagonal direction D2 are not limited to diagonals, but may also include directions parallel to the diagonals. The term “direction” here is not a single pointing direction, but can be understood as a concept of a “straight line” indicating an arrangement and having two pointing directions at both ends of the straight line. In addition, in other embodiments, the first diagonal direction D1 may alternatively be the direction connecting the lower left corner and the upper right corner of the filter array 11, and the second diagonal direction D2 may alternatively be the direction connecting the upper left corner and the lower right corner of the filter array 11. In this case, positions of the first sets of filters 1131 and the second sets of filters 1132 are changed correspondingly to a change in the diagonal directions.
As illustrated in
It is to be noted that the arrangement in which the first sets of filters 1131 and the second sets of filters 1132 are arranged adjacently to each other in the vertical direction V and adjacently to each other in the horizontal direction H is not limited to the one illustrated in
As illustrated in
In the filter array 11 shown in
In this case, each filter region 111 may include 2*2 filter units 1111, and each filter unit 1111 includes 2*2 filters 110. Each filter unit 1111 includes two first color filters A, one second color filter B, and one third color filter C.
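For illustration only, the following sketch builds a hypothetical 4*4 filter region from a 2*2 filter unit with the counts stated above (two A, one B, one C per unit); the actual positions of the filters, and the way adjacent units differ so that first and second sets of filters are formed, are defined by the figures and are not reproduced here:

```python
import numpy as np

# Hypothetical 2*2 filter unit: two first color filters A, one second color
# filter B, one third color filter C (positions are illustrative only).
filter_unit = np.array([["A", "B"],
                        ["C", "A"]])
# Tiling 2*2 such units yields one 4*4 filter region with 8 A, 4 B, and 4 C.
filter_region = np.tile(filter_unit, (2, 2))
values, counts = np.unique(filter_region, return_counts=True)
print(dict(zip(values, counts)))  # {'A': 8, 'B': 4, 'C': 4}
```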
In some embodiments, the arrangement of some filters 110 in the filter array 11 may alternatively be:
Here, A denotes a first color filter, B denotes a second color filter, and C denotes a third color filter. The number of filters 110 in each first set of filters 1131 is 3*3, and the number of filters 110 in each second set of filters 1132 is 3*3.
In this arrangement, the plurality of first sets of filters 1131 is arranged in the first diagonal direction D1 (for example, the direction connecting the upper left corner and the lower right corner of the filter array 11), and the plurality of second sets of filters 1132 is arranged in the second diagonal direction D2 (for example, the direction connecting the lower left corner and the upper right corner of the filter array 11). The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal direction D1 may be perpendicular to the second diagonal direction D2.
In this arrangement, the first sets of filters 1131 and the second sets of filters 1132 are arranged adjacently to each other in the vertical direction V of the image sensor 10 (illustrated in
In this arrangement, in each first set of filters 1131, the plurality of first color filters A and the plurality of second color filters B are arranged adjacently to each other in the vertical direction V and the horizontal direction H. That is, in the vertical direction V, the first color filters A and the second color filters B are arranged alternately, and in the horizontal direction H, the first color filters A and the second color filters B are arranged alternately. In each second set of filters 1132, the plurality of first color filters A and the plurality of third color filters C are arranged adjacently to each other in the vertical direction V and the horizontal direction H. That is, in the vertical direction V, the first color filters A and the third color filters C are arranged alternately, and in the horizontal direction H, the first color filters A and the third color filters C are arranged alternately.
In the above filter array 11, if the filters 110 in the first row are omitted, the arrangement of the filters 110 may be:
In this case, each filter region 111 may include 2*2 filter units 1111, and each filter unit 1111 includes 3*3 filters 110. Some filter units 1111 may include five first color filters A, three second color filters B, and one third color filter C. Some filter units 1111 may include five first color filters A, one second color filter B, and three third color filters C.
If the filters 110 in the first column are omitted, the arrangement of the filters 110 may be:
In this case, each filter region 111 may include 2*2 filter units 1111, and each filter unit 1111 includes 3*3 filters 110. Some filter units 1111 may include five first color filters A, three second color filters B, and one third color filter C. Some filter units 1111 may include five first color filters A, one second color filter B, and three third color filters C.
In some embodiments, the arrangement of some filters 110 in the filter array 11 may alternatively be:
Here, A denotes a first color filter, B denotes a second color filter, and C denotes a third color filter. The number of filters 110 in each first set of filters 1131 is 4*4, and the number of filters 110 in each second set of filters 1132 is 4*4.
In this arrangement, the plurality of first sets of filters 1131 is arranged in the first diagonal direction D1 (for example, the direction connecting the upper left corner and the lower right corner of the filter array 11), and the plurality of second sets of filters 1132 is arranged in the second diagonal direction D2 (for example, the direction connecting the lower left corner and the upper right corner of the filter array 11). The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal direction D1 may be perpendicular to the second diagonal direction D2.
In this arrangement, the first sets of filters 1131 and the second sets of filters 1132 are arranged adjacently to each other in the vertical direction V of the image sensor 10 (illustrated in
In this arrangement, in each first set of filters 1131, the plurality of first color filters A and the plurality of second color filters B are arranged adjacently to each other in the vertical direction V and the horizontal direction H. That is, in the vertical direction V, the first color filters A and the second color filters B are arranged alternately, and in the horizontal direction H, the first color filters A and the second color filters B are arranged alternately. In each second set of filters 1132, the plurality of first color filters A and the plurality of third color filters C are arranged adjacently to each other in the vertical direction V and the horizontal direction H. That is, in the vertical direction V, the first color filters A and the third color filters C are arranged alternately, and in the horizontal direction H, the first color filters A and the third color filters C are arranged alternately.
In the above filter array 11, if the filters 110 in the first row are omitted, the arrangement of the filters 110 may be:
In this case, each filter region 111 may include 2*2 filter units 1111, and each filter unit 1111 includes 4*4 filters 110. Some filter units 1111 may include eight first color filters A, six second color filters B, and two third color filters C. Some filter units 1111 may include eight first color filters A, two second color filters B, and six third color filters C.
If the filters 110 in the first column are omitted, the arrangement of the filters 110 may be:
In this case, each filter region 111 may include 2*2 filter units 1111, and each filter unit 1111 includes 4*4 filters 110. Some filter units 1111 may include eight first color filters A, six second color filters B, and two third color filters C. Some filter units 1111 may include eight first color filters A, two second color filters B, and six third color filters C.
Here, A denotes a first color filter, B denotes a second color filter, and C denotes a third color filter. The number of filters 110 in each first set of filters 1131 is 2*2, and the number of filters 110 in each second set of filters 1132 is 2*2.
As illustrated in
As illustrated in
As illustrated in
In the filter array 11 shown in
In this case, each filter region 111 may include 2*2 filter units 1111, and each filter unit 1111 includes 2*2 filters 110. Each filter unit 1111 includes two first color filters A, one second color filter B, and one third color filter C.
In some embodiments, the arrangement of some filters 110 in the filter array 11 may alternatively be:
Here, A denotes a first color filter, B denotes a second color filter, and C denotes a third color filter. The number of filters 110 in each first set of filters 1131 is 3*3, and the number of filters 110 in each second set of filters 1132 is 3*3.
In this arrangement, the plurality of first sets of filters 1131 is arranged in the first diagonal direction D1 (for example, the direction connecting the upper left corner and the lower right corner of the filter array 11), and the plurality of second sets of filters 1132 is arranged in the second diagonal direction D2 (for example, the direction connecting the lower left corner and the upper right corner of the filter array 11). The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal direction D1 may be perpendicular to the second diagonal direction D2.
In this arrangement, the first sets of filters 1131 and the second sets of filters 1132 are arranged adjacently to each other in the vertical direction V of the image sensor 10 (illustrated in
In this arrangement, in each first set of filters 1131, the filters 110 are arranged in rows, and the filters 110 in each row have a same color. For example, the filters 110 in the first row are all first color filters A, the filters 110 in the second row are all second color filters B, and the filters 110 in the third row are all first color filters A. In each second set of filters 1132, the filters 110 are arranged in rows, and the filters 110 in each row have a same color. For example, the filters 110 in the first row are all first color filters A, the filters 110 in the second row are all third color filters C, and the filters 110 in the third row are all first color filters A.
In the above filter array 11, if the filters 110 in the first column are omitted, the arrangement of the filters 110 may be:
In this case, each filter region 111 may include 2*2 filter units 1111, and each filter unit 1111 includes 3*3 filters 110. Some filter units 1111 may include five first color filters A, three second color filters B, and one third color filter C. Some filter units 1111 may include five first color filters A, one second color filter B, and three third color filters C.
In some embodiments, the arrangement of some filters 110 in the filter array 11 may alternatively be:
Here, A denotes a first color filter, B denotes a second color filter, and C denotes a third color filter. The number of filters 110 in each first set of filters 1131 is 4*4, and the number of filters 110 in each second set of filters 1132 is 4*4.
In this arrangement, the plurality of first sets of filters 1131 is arranged in the first diagonal direction D1 (for example, the direction connecting the upper left corner and the lower right corner of the filter array 11), and the plurality of second sets of filters 1132 is arranged in the second diagonal direction D2 (for example, the direction connecting the lower left corner and the upper right corner of the filter array 11). The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal direction D1 may be perpendicular to the second diagonal direction D2.
In this arrangement, the first sets of filters 1131 and the second sets of filters 1132 are arranged adjacently to each other in the vertical direction V of the image sensor 10 (illustrated in
In this arrangement, in each first set of filters 1131, the filters 110 are arranged in rows, and the filters 110 in each row have a same color. For example, the filters 110 in the first row are all first color filters A, the filters 110 in the second row are all second color filters B, the filters 110 in the third row are all first color filters A, and the filters 110 in the fourth row are all second color filters B. In each second set of filters 1132, the filters 110 are arranged in rows, and the filters 110 in each row have a same color. For example, the filters 110 in the first row are all first color filters A, the filters 110 in the second row are all third color filters C, the filters 110 in the third row are all first color filters A, and the filters 110 in the fourth row are all third color filters C.
In the above filter array 11, if the filters 110 in the first column are omitted, the arrangement of the filters 110 may be:
In this case, each filter region 111 may include 2*2 filter units 1111, and each filter unit 1111 includes 4*4 filters 110. Some filter units 1111 may include eight first color filters A, six second color filters B, and two third color filters C. Some filter units 1111 may include eight first color filters A, two second color filters B, and six third color filters C.
Here, A denotes a first color filter, B denotes a second color filter, and C denotes a third color filter. The number of filters 110 in each first set of filters 1131 is 2*2, and the number of filters 110 in each second set of filters 1132 is 2*2.
As illustrated in
As illustrated in
As illustrated in
In the filter array 11 shown in
In this case, each filter region 111 may include 2*2 filter units 1111, and each filter unit 1111 includes 2*2 filters 110. Each filter unit 1111 includes two first color filters A, one second color filter B, and one third color filter C.
In some embodiments, the arrangement of some filters 110 in the filter array 11 may alternatively be:
Here, A denotes a first color filter, B denotes a second color filter, and C denotes a third color filter. The number of filters 110 in each first set of filters 1131 is 3*3, and the number of filters 110 in each second set of filters 1132 is 3*3.
In this arrangement, the plurality of first sets of filters 1131 is arranged in the first diagonal direction D1 (for example, the direction connecting the upper left corner and the lower right corner of the filter array 11), and the plurality of second sets of filters 1132 is arranged in the second diagonal direction D2 (for example, the direction connecting the lower left corner and the upper right corner of the filter array 11). The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal direction D1 may be perpendicular to the second diagonal direction D2.
In this arrangement, the first sets of filters 1131 and the second sets of filters 1132 are arranged adjacently to each other in the vertical direction V of the image sensor 10 (illustrated in
In this arrangement, in each first set of filters 1131, the filters 110 are arranged in columns, and the filters 110 in each column have a same color. For example, the filters 110 in the first column are all first color filters A, the filters 110 in the second column are all second color filters B, and the filters 110 in the third column are all first color filters A. In each second set of filters 1132, the filters 110 are arranged in columns, and the filters 110 in each column have a same color. For example, the filters 110 in the first column are all first color filters A, the filters 110 in the second column are all third color filters C, and the filters 110 in the third column are all first color filters A.
In the above filter array 11, if the filters 110 in the first row are omitted, the arrangement of the filters 110 may be:
In this case, each filter region 111 may include 2*2 filter units 1111, and each filter unit 1111 includes 3*3 filters 110. Some filter units 1111 may include five first color filters A, three second color filters B, and one third color filter C. Some filter units 1111 may include five first color filters A, one second color filter B, and three third color filters C.
In some embodiments, the arrangement of some filters 110 in the filter array 11 may alternatively be:
Here, A denotes a first color filter, B denotes a second color filter, and C denotes a third color filter. The number of filters 110 in each first set of filters 1131 is 4*4, and the number of filters 110 in each second set of filters 1132 is 4*4.
In this arrangement, the plurality of first sets of filters 1131 is arranged in the first diagonal direction D1 (for example, the direction connecting the upper left corner and the lower right corner of the filter array 11), and the plurality of second sets of filters 1132 is arranged in the second diagonal direction D2 (for example, the direction connecting the lower left corner and the upper right corner of the filter array 11). The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal direction D1 may be perpendicular to the second diagonal direction D2.
In this arrangement, the first sets of filters 1131 and the second sets of filters 1132 are arranged adjacently to each other in the vertical direction V of the image sensor 10 (illustrated in
In this arrangement, in each first set of filters 1131, the filters 110 are arranged in columns, and the filters 110 in each column have a same color. For example, the filters 110 in the first column are all first color filters A, the filters 110 in the second column are all second color filters B, the filters 110 in the third column are all first color filters A, and the filters 110 in the fourth column are all second color filters B. In each second set of filters 1132, the filters 110 are arranged in columns, and the filters 110 in each column have a same color. For example, the filters 110 in the first column are all first color filters A, the filters 110 in the second column are all third color filters C, the filters 110 in the third column are all first color filters A, and the filters 110 in the fourth column are all third color filters C.
In the above filter array 11, if the filters 110 in the first row are omitted, the arrangement of the filters 110 may be:
In this case, each filter region 111 may include 2*2 filter units 1111, and each filter unit 1111 includes 4*4 filters 110. Some filter units 1111 may include eight first color filters A, six second color filters B, and two third color filters C. Some filter units 1111 may include eight first color filters A, two second color filters B, and six third color filters C.
Here, A denotes a first color filter, B denotes a second color filter, and C denotes a third color filter. The number of filters 110 in each first set of filters 1131 is 2*2, and the number of filters 110 in each second set of filters 1132 is 2*2.
As illustrated in
As illustrated in
As illustrated in
In the filter array 11 shown in
In this case, each filter region 111 may include 2*2 filter units 1111, and each filter unit 1111 includes 2*2 filters 110. Each filter unit 1111 includes two first color filters A, one second color filter B, and one third color filter C.
In some embodiments, the arrangement of some filters 110 in the filter array 11 may alternatively be:
Here, A denotes a first color filter, B denotes a second color filter, and C denotes a third color filter. The number of filters 110 in each first set of filters 1131 is 3*3, and the number of filters 110 in each second set of filters 1132 is 3*3.
In this arrangement, the plurality of first sets of filters 1131 is arranged in the first diagonal direction D1 (for example, the direction connecting the upper left corner and the lower right corner of the filter array 11), and the plurality of second sets of filters 1132 is arranged in the second diagonal direction D2 (for example, the direction connecting the lower left corner and the upper right corner of the filter array 11). The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal direction D1 may be perpendicular to the second diagonal direction D2.
In this arrangement, the first sets of filters 1131 and the second sets of filters 1132 are arranged adjacently to each other in the vertical direction V of the image sensor 10 (illustrated in
In this arrangement, in each first set of filters 1131, the filters 110 are arranged in rows, and the filters 110 in each row have a same color. For example, the filters 110 in the first row are all first color filters A, the filters 110 in the second row are all second color filters B, and the filters 110 in the third row are all first color filters A. In each second set of filters 1132, the filters 110 are arranged in columns, and the filters 110 in each column have a same color. For example, the filters 110 in the first column are all first color filters A, the filters 110 in the second column are all third color filters C, and the filters 110 in the third column are all first color filters A.
In the above filter array 11, if the filters 110 in the first row and the first column are omitted, the arrangement of the filters 110 may be:
In this case, each filter region 111 may include 2*2 filter units 1111, and each filter unit 1111 includes 3*3 filters 110. Some filter units 1111 may include five first color filters A, three second color filters B, and one third color filter C. Some filter units 1111 may include five first color filters A, one second color filter B, and three third color filters C.
In some embodiments, the arrangement of some filters 110 in the filter array 11 may alternatively be:
Here, A denotes a first color filter, B denotes a second color filter, and C denotes a third color filter. The number of filters 110 in each first set of filters 1131 is 4*4, and the number of filters 110 in each second set of filters 1132 is 4*4.
In this arrangement, the plurality of first sets of filters 1131 is arranged in the first diagonal direction D1 (for example, the direction connecting the upper left corner and the lower right corner of the filter array 11), and the plurality of second sets of filters 1132 is arranged in the second diagonal direction D2 (for example, the direction connecting the lower left corner and the upper right corner of the filter array 11). The first diagonal direction D1 is different from the second diagonal direction D2. For example, the first diagonal direction D1 may be perpendicular to the second diagonal direction D2.
In this arrangement, the first sets of filters 1131 and the second sets of filters 1132 are arranged adjacently to each other in the vertical direction V of the image sensor 10 (illustrated in
In this arrangement, in each first set of filters 1131, the filters 110 are arranged in columns, and the filters 110 in each column have a same color. For example, the filters 110 in the first column are all first color filters A, the filters 110 in the second column are all second color filters B, the filters 110 in the third column are all first color filters A, and the filters 110 in the fourth column are all second color filters B. In each second set of filters 1132, the filters 110 are arranged in rows, and the filters 110 in each row have a same color. For example, the filters 110 in the first row are all first color filters A, the filters 110 in the second row are all third color filters C, the filters 110 in the third row are all first color filters A, and the filters 110 in the fourth row are all third color filters C.
In the above filter array 11, if the filters 110 in the first row and the first column are omitted, the arrangement of the filters 110 may be:
In this case, each filter region 111 may include 2*2 filter units 1111, and each filter unit 1111 includes 4*4 filters 110. Some filter units 1111 may include eight first color filters A, six second color filters B, and two third color filters C. Some filter units 1111 may include eight first color filters A, two second color filters B, and six third color filters C.
Referring to
In the example illustrated in
Thus, the four pixels 120 corresponding to each first set of filters 1131 can form a first combined pixel, and each first combined pixel can generate a first pixel signal and a third pixel signal. The four pixels 120 corresponding to each second set of filters 1132 can form a second combined pixel, and each second combined pixel can generate a second pixel signal and a fourth pixel signal. Each combined pixel can output a pixel signal (the first pixel signal or the second pixel signal) having a value in the first color channel, but only some of the combined pixels can output the third pixel signal having a value in the second color channel, and only some of the combined pixels can output the fourth pixel signal having a value in the third color channel. Therefore, a combined pixel that cannot output the third pixel signal needs to be subjected to an interpolation process to calculate a value in the second color channel of the combined pixel, and a combined pixel that cannot output the fourth pixel signal also needs to be subjected to the interpolation process to calculate a value in the third color channel of the combined pixel. In this way, each combined pixel can obtain the values in the first color channel, the second color channel, and the third color channel, and a color image can be generated by means of color space calculation.
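The interpolation step can be sketched as a simple bilinear fill (an assumption; the disclosure does not fix the interpolation method, and the function name and mask representation are hypothetical): a combined pixel missing a value in the second or third color channel takes the average of its available neighbours, which in the checkerboard-like layout of first and second combined pixels are typically four.

```python
import numpy as np

def interpolate_missing(channel, have_mask):
    """Fill positions where a combined pixel has no value in this color channel
    with the average of the 4-neighbours that do have one (edge pixels use
    whichever neighbours exist)."""
    h, w = channel.shape
    out = channel.astype(float)
    for i in range(h):
        for j in range(w):
            if have_mask[i, j]:
                continue
            neighbours = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
            vals = [channel[y, x] for y, x in neighbours
                    if 0 <= y < h and 0 <= x < w and have_mask[y, x]]
            out[i, j] = sum(vals) / len(vals) if vals else 0.0
    return out
```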
Referring to
Referring to
In addition, referring to
In summary, in the image sensor 10 according to the embodiments of the present disclosure, each first set of filters 1131 and each second set of filters 1132 have the first color filters A, such that the pixels 120 corresponding to each first set of filters 1131 and the pixels 120 corresponding to each second set of filters 1132 can generate pixel signals having a value in the first color channel. Therefore, in the process of generating the color image, the values in the first color channel do not need to be obtained by interpolation, and the color reproduction of the color image can be more accurate. Moreover, when performing the interpolation process on values in the second color channel (or values in the third color channel), most of the values in the second color channel to be generated by interpolation (or the values in the third color channel to be generated by interpolation) can be calculated from four adjacent pixel signals each having a value in the second color channel (or four adjacent pixel signals each having a value in the third color channel), such that the pixel signal having a value in the second color channel (or the third color channel) generated by interpolation can be more accurate, thereby further improving the accuracy of the color reproduction of the color image.
The plurality of filter regions 111 includes a plurality of first filter regions 1112 and a plurality of second filter regions 1114, and the arrangement of the filter units 1111 in each first filter region 1112 is different from the arrangement of the filter units 1111 in each second filter region 1114.
For example, the arrangement of the first filter region 1112 in an embodiment of the present disclosure may be:
The arrangement of the second filter region 1114 in an embodiment of the present disclosure may be:
The plurality of first filter regions 1112 may be arranged in a third diagonal direction, and the plurality of second filter regions 1114 may be arranged in a fourth diagonal direction different from the third diagonal direction.
The arrangement of the plurality of first filter regions 1112 and the plurality of second filter regions 1114 may be: (1) referring to
Here, A denotes a first color filter, B denotes a second color filter, and C denotes a third color filter.
As shown in
It is to be noted that the third diagonal direction D3 and the fourth diagonal direction D4 are not limited to diagonals, but may also include directions parallel to the diagonals. The term “direction” here is not a single pointing direction, but can be understood as a concept of a “straight line” indicating an arrangement and having two pointing directions at both ends of the straight line. In addition, in other embodiments, the third diagonal direction D3 may alternatively be the direction connecting the lower left corner and the upper right corner of the filter array 11, and the fourth diagonal direction D4 may alternatively be the direction connecting the upper left corner and the lower right corner of the filter array 11. In this case, positions of the first filter regions 1112 and the second filter regions 1114 are changed correspondingly to a change in the diagonal directions.
As illustrated in
It is to be noted that the arrangement in which the first filter regions 1112 and the second filter regions 1114 are arranged adjacently to each other in the vertical direction V and adjacently to each other in the horizontal direction H is not limited to the one illustrated in
Here, A denotes a first color filter, B denotes a second color filter, and C denotes a third color filter.
As shown in
Here, A denotes a first color filter, B denotes a second color filter, and C denotes a third color filter.
As shown in
In some embodiments, the processing circuit 13 is configured to calculate an average value of the electrical signals generated by the pixels 120 corresponding to the first color filters A in each filter region 111 as the first color signal, an average value of the electrical signals generated by the pixels 120 corresponding to the second color filters B in each filter region 111 as the second color signal, and an average value of the electrical signals generated by the pixels 120 corresponding to the third color filters C in each filter region 111 as the third color signal.
In an example where each filter region 111 includes 2*2 filter units 1111 and each filter unit 1111 includes two first color filters A, one second color filter B and one third color filter C, each filter region 111 includes eight first color filters A, four second color filters B and four third color filters C. If the pixel values corresponding to the electrical signals generated by the pixels 120 corresponding to the eight first color filters A are 41, 47, 43, 37, 45, 39, 35, and 33, respectively, the first color signal is (41+47+43+37+45+39+35+33)/8=40. If the pixel values corresponding to the electrical signals generated by the pixels 120 corresponding to the four second color filters B are 24, 26, 25, and 27, respectively, the second color signal is (24+26+25+27)/4=25.5. If the pixel values corresponding to the electrical signals generated by the pixels 120 corresponding to the four third color filters C are 52, 54, 54, and 52, respectively, the third color signal is (52+54+54+52)/4=53.
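A minimal check of the arithmetic in the example above (the pixel values are those given in the text):

```python
import numpy as np

a_pixels = [41, 47, 43, 37, 45, 39, 35, 33]  # pixels behind the eight first color filters A
b_pixels = [24, 26, 25, 27]                  # pixels behind the four second color filters B
c_pixels = [52, 54, 54, 52]                  # pixels behind the four third color filters C

first_color_signal = np.mean(a_pixels)   # 40.0
second_color_signal = np.mean(b_pixels)  # 25.5
third_color_signal = np.mean(c_pixels)   # 53.0
```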
In some embodiments, the processing circuit 13 may be configured to perform a gamma correction process on the first intermediate image. With the gamma correction process performed on the first intermediate image, the luminance of the processed first intermediate image, when displayed, will be more suitable for viewing by the user. Specifically, f(I) = I^γ can be used for gamma correction, where f(I) is the luminance value of the first intermediate image after correction, I is the luminance value of the first intermediate image before correction, and γ is a first correction coefficient. When γ<1, the dynamic range of a low-luminance area of the first intermediate image can be increased, the dynamic range of a high-luminance area of the first intermediate image can be reduced, and the overall luminance of the first intermediate image can be increased. When γ>1, the dynamic range of the low-luminance area of the first intermediate image can be reduced, the dynamic range of the high-luminance area of the first intermediate image can be increased, and the overall luminance of the first intermediate image can be reduced. The user can set the corresponding correction coefficient γ depending on his/her own viewing requirements, such that the display effect of the adjusted first intermediate image can be improved.
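A minimal gamma-correction sketch, assuming the luminance values are first normalized to [0, 1] (the normalization range and the function name are assumptions, not stated in the disclosure):

```python
import numpy as np

def gamma_correct(luminance, gamma, max_value=255.0):
    """Apply f(I) = I**gamma to a normalized copy of the image.  gamma < 1
    raises dark values (brightens), gamma > 1 lowers them (darkens)."""
    normalized = np.clip(luminance / max_value, 0.0, 1.0)
    return (normalized ** gamma) * max_value
```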
In some embodiments, the processing circuit 13 may be configured to perform a white balance process, a color correction matrix process, and a gamma correction process on the first color signal, the second color signal, and the third color signal, and convert the processed first color signal, the processed second color signal, and the processed third color signal into a chrominance-luminance separation space to obtain the second intermediate images. After the white balance process, the color correction matrix process, and the gamma correction process, the first color signal, the second color signal, and the third color signal are more accurate, such that the second intermediate images obtained by converting them into the chrominance-luminance separation space are also more accurate. The white balance process can be implemented by using algorithms such as the gray world method, the automatic white balance method based on a dynamic threshold, or the mirror method. Different white balance algorithms can be used depending on different scenes or the user's choice, such that the processed color signals can be more accurate and more in line with the user's viewing requirements.
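As one example of the white balance algorithms mentioned above, a gray-world sketch could look as follows (a simplified illustration; the function name is hypothetical, and the per-channel inputs are the per-region color signals):

```python
import numpy as np

def gray_world_white_balance(r, g, b):
    """Gray-world assumption: scale the second (R) and third (B) color signals
    so that the mean of each channel matches the mean of the first (G) channel."""
    gain_r = np.mean(g) / np.mean(r)
    gain_b = np.mean(g) / np.mean(b)
    return r * gain_r, g, b * gain_b
```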
The color correction matrix process can be expressed as:
R′ = CMC11*R + CMC12*G + CMC13*B,
G′ = CMC21*R + CMC22*G + CMC23*B,
B′ = CMC31*R + CMC32*G + CMC33*B,
where R′ refers to the second color signal corrected based on the color correction matrix, G′ refers to the first color signal corrected based on the color correction matrix, B′ refers to the third color signal corrected based on the color correction matrix, CMC11, CMC12, CMC13, CMC21, CMC22, CMC23, CMC31, CMC32, and CMC33 together form the correction matrix, R refers to the second color signal before correction based on the color correction matrix, and G refers to the first color signal before correction based on the color correction matrix, and B refers to the third color signal before correction based on the color correction matrix. The correction matrix can be selected according to the color temperature corresponding to the first color signal, the second color signal and the third color signal before adjustment, such that the first color signal, the second color signal and the third color signal processed based on the color correction matrix can be more accurate.
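A small sketch of applying such a 3*3 correction matrix to the per-region color signals (the matrix values themselves would be selected per color temperature, as described above; the function name and array layout are assumptions):

```python
import numpy as np

def apply_ccm(r, g, b, ccm):
    """Apply a 3*3 color correction matrix with rows ordered (R', G', B') to
    stacked (R, G, B) signals: R' = CMC11*R + CMC12*G + CMC13*B, and so on."""
    rgb = np.stack([r, g, b], axis=-1)    # shape (..., 3)
    corrected = rgb @ np.asarray(ccm).T   # row k of ccm produces output channel k
    return corrected[..., 0], corrected[..., 1], corrected[..., 2]
```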
The gamma correction can be based on f(I) = I^γ, where f(I) is the corrected first color signal, second color signal, or third color signal, I is the corresponding first color signal, second color signal, or third color signal before correction, and γ is a second correction coefficient, which may be the same as or different from the first correction coefficient; the present disclosure is not limited in this regard.
The processed first color signal, the processed second color signal, and the processed third color signal are converted into a chrominance-luminance separation space to obtain the second intermediate images. Specifically, the following equations can be used for conversion: Y=0.257*R+0.504*G+0.098*B+16, Cb=-0.148*R-0.291*G+0.439*B+128, Cr=0.439*R-0.368*G-0.071*B+128, where Y is the luminance value, Cb is the blue component, Cr is the red component, R is the processed second color signal, G is the processed first color signal, and B is the processed third color signal. Here, the second intermediate images may include two frames, and the two frames of the second intermediate images are the second intermediate image corresponding to the blue component and the second intermediate image corresponding to the red component, respectively.
Alternatively, the following equations can be used to convert the processed first color signal, the processed second color signal, and the processed third color signal into the chrominance-luminance separation space: Y=0.29900*R+0.58700*G+0.11400*B, Cb=−0.16874*R−0.33126*G+0.50000*B+128, Cr=0.50000*R−0.41869*G−0.08131*B+128, and the present disclosure is not limited to this.
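Purely as an illustration, the two sets of conversion equations above can be wrapped in one helper; the assumption of an 8-bit signal scale and the function name are not part of the disclosure.

```python
import numpy as np

def rgb_to_ycbcr(r, g, b, offset_variant=True):
    """Convert processed R/G/B signals (8-bit scale) to Y/Cb/Cr.

    offset_variant=True uses the equations with the +16 offset on Y;
    offset_variant=False uses the alternative coefficient set.
    """
    if offset_variant:
        y  =  0.257 * r + 0.504 * g + 0.098 * b + 16
        cb = -0.148 * r - 0.291 * g + 0.439 * b + 128
        cr =  0.439 * r - 0.368 * g - 0.071 * b + 128
    else:
        y  =  0.29900 * r + 0.58700 * g + 0.11400 * b
        cb = -0.16874 * r - 0.33126 * g + 0.50000 * b + 128
        cr =  0.50000 * r - 0.41869 * g - 0.08131 * b + 128
    return y, cb, cr

# Hypothetical processed color signals on an 8-bit scale.
r = np.random.rand(1000, 750) * 255
g = np.random.rand(1000, 750) * 255
b = np.random.rand(1000, 750) * 255
y, cb, cr = rgb_to_ycbcr(r, g, b)
```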
When each filter region 111 includes 2*2 filter units 1111, and each filter unit 1111 includes two first color filters A, one second color filter B, and one third color filter C, each filter unit 1111 can correspond to one combined luminance value, each filter region 111 can correspond to 4 combined luminance values, and each filter region 111 can only correspond to one first color signal, one second color signal, and one third color signal. Taking the filter array 11 of
Generally, an image in Y/Cb/Cr format can allow the resolution of Cb/Cr to be lower than that of Y. For example, the format of Y/Cb/Cr is 4:2:2 or 4:2:0. In a Y/Cb/Cr image in a format of 4:2:0, every four Ys correspond to one Cr and one Cb. Therefore, in the embodiment of the present disclosure, the first intermediate image representing the luminance Y, the second intermediate image representing the red component Cr, and the second intermediate image representing the blue component Cb may be fused to form the first target image.
In some embodiments, the processing circuit 13 may be configured to perform an up-sampling process on the second intermediate images such that one filter region 111 corresponds to a plurality of color values, so as to form a third intermediate image, and fuse the first intermediate image and the third intermediate image to obtain the first target image. For example, when each filter region 111 includes 2*2 filter units 1111, and each filter unit 1111 includes two first color filters A, one second color filter B, and one third color filter C, the resolution of the first intermediate image is 4 times that of the second intermediate images. Therefore, the second intermediate images can be up-sampled, for example, enlarged by a factor of four, such that one filter region 111 corresponds to four Crs and four Cbs to obtain the third intermediate image. The resolution of the third intermediate image is the same as the resolution of the first intermediate image, such that the first target image obtained by fusing the first intermediate image and the third intermediate image will have more abundant color information.
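A minimal sketch of this up-sampling and fusion, assuming each filter region spans 2*2 filter units so that the Cb/Cr planes are half the luminance resolution in each dimension; nearest-neighbor repetition is used purely for brevity, and interpolation-based up-sampling would serve equally well.

```python
import numpy as np

def fuse_luma_chroma(y: np.ndarray, cb: np.ndarray, cr: np.ndarray) -> np.ndarray:
    """Up-sample Cb/Cr by 2 in each dimension and stack them with Y.

    y holds one value per filter unit; cb/cr hold one value per filter
    region (2*2 filter units), so each chroma sample is repeated over a
    2*2 block to match the luminance resolution before fusing.
    """
    cb_up = np.repeat(np.repeat(cb, 2, axis=0), 2, axis=1)
    cr_up = np.repeat(np.repeat(cr, 2, axis=0), 2, axis=1)
    return np.stack([y, cb_up, cr_up], axis=-1)   # first target image in Y/Cb/Cr

# Hypothetical plane sizes: 2000x1500 luminance, 1000x750 chroma.
y  = np.random.rand(2000, 1500)
cb = np.random.rand(1000, 750)
cr = np.random.rand(1000, 750)
first_target = fuse_luma_chroma(y, cb, cr)   # shape (2000, 1500, 3)
```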
In some embodiments, the processing circuit 13 may be configured to perform a high-pass filtering process on the first intermediate image. In this way, the high-pass filtering process can effectively retain detailed information and remove noise, such that the processed first intermediate image can be more accurate.
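The disclosure does not prescribe a particular filter kernel. As one possible sketch, the high-frequency detail layer can be extracted with a box low-pass filter and added back to the luminance image; the use of SciPy's uniform_filter, the window size, and the amount parameter are assumptions made only for this example.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def highpass_enhance(luma: np.ndarray, size: int = 3, amount: float = 1.0) -> np.ndarray:
    """Extract the high-frequency detail of the luminance image and add it back.

    A box blur approximates the low-pass component; the difference between the
    image and its blur is the high-pass detail layer that carries edges and texture.
    """
    lowpass = uniform_filter(luma, size=size)
    detail = luma - lowpass            # high-pass component
    return luma + amount * detail

first_intermediate = np.random.rand(2000, 1500)
filtered = highpass_enhance(first_intermediate, size=3, amount=0.5)
```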
Referring to
Referring to
In some embodiments, the imaging apparatus 100 further includes a processor 20. The processor 20 may be configured to generate a first color signal, a second color signal, and a third color signal based on the electrical signals generated by the pixels 120 corresponding to each filter region 111, the first color signal representing a value in a first color channel of the light applied to the pixels 120 corresponding to the filter region 111, the second color signal representing a value in a second color channel of the light applied to the pixels 120 corresponding to the filter region 111, and the third color signal representing a value in a third color channel of the light applied to the pixels 120 corresponding to the filter region 111. The processor 20 may be further configured to process the first color signal, the second color signal, and the third color signal to obtain a plurality of second intermediate images representing chrominance values of the filter region 111, and fuse the first intermediate image and the plurality of second intermediate images to obtain a first target image.
In the above embodiment of the present disclosure, the functions implemented by the processing circuit 13 can be implemented by the processor 20, and details thereof will be omitted here.
Referring to
Referring to
Here, the processor 20 can be located in a server responsible for cloud computing, or in a server responsible for edge computing. In this way, the subsequent processing of the pixel signals outputted by the image sensor 10 can be offloaded to the server for execution, which can reduce the power consumption of the imaging apparatus 100 or the electronic device 1000.
Referring to
01: combining the electrical signals generated by the pixels 120 corresponding to each filter unit 1111 for outputting as a combined luminance value and forming a first intermediate image, the combined luminance value representing luminance of light applied to the pixels 120 corresponding to the filter unit 1111; and
02: generating a first color signal, a second color signal, and a third color signal based on the electrical signals generated by the pixels 120 corresponding to each filter region 111, the first color signal representing a value in a first color channel of the light applied to the pixels 120 corresponding to the filter region 111, the second color signal representing a value in a second color channel of the light applied to the pixels 120 corresponding to the filter region 111, and the third color signal representing a value in a third color channel of the light applied to the pixels 120 corresponding to the filter region 111;
03: processing the first color signal, the second color signal, and the third color signal to obtain a plurality of second intermediate images representing chrominance values of the filter region 111; and
04: fusing the first intermediate image and the plurality of second intermediate images to obtain a first target image.
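Step 01 amounts to combining (e.g., binning) the pixel signals under each filter unit 1111 into one combined luminance value. The following is a minimal sketch only, assuming 2*2 filter units and summation as the combining operation; the disclosure does not limit combining to summation, and the frame size is hypothetical.

```python
import numpy as np

def combine_per_filter_unit(raw: np.ndarray, unit: int = 2) -> np.ndarray:
    """Combine the pixel signals under each unit x unit filter unit into one
    combined luminance value, forming the first intermediate image."""
    h, w = raw.shape
    blocks = raw.reshape(h // unit, unit, w // unit, unit)
    return blocks.sum(axis=(1, 3))   # one combined luminance value per filter unit

# Hypothetical 4000x3000 raw frame -> 2000x1500 first intermediate image.
raw = np.random.randint(0, 1024, size=(4000, 3000)).astype(np.float32)
first_intermediate = combine_per_filter_unit(raw)
```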
In some embodiments, the filter array 11 may include a plurality of first sets of filters 1131 and a plurality of second sets of filters 1132. Each of the plurality of first sets of filters 1131 may include a plurality of first color filters A and a plurality of second color filters B, and the number of first color filters A is the same as the number of second color filters B in each of the plurality of first sets of filters 1131. Each of the plurality of second sets of filters 1132 may include a plurality of first color filters A and a plurality of third color filters C, and the number of first color filters A is the same as the number of third color filters C in each of the plurality of second sets of filters 1132. A sub-array formed by arrangement of all the first sets of filters 1131 and all the second sets of filters 1132 is a part of the filter array 11, or a sub-array formed by arrangement of all the filter regions 111 is a part of the filter array 11. The signal processing method may include the following (an illustrative sketch of the clear-texture-mode combination is given after these steps):
in a low-luminance mode, obtaining the first target image;
in a clear texture mode:
combining the electrical signals generated by the pixels 120 corresponding to each of the plurality of first sets of filters 1131 to generate a first pixel signal and a third pixel signal, the first pixel signal representing a value in the first color channel of the light applied to the pixels 120 corresponding to the first set of filters 1131, and the third pixel signal representing a value in the second color channel of the light applied to the pixels 120 corresponding to the first set of filters 1131, and
combining the electrical signals generated by the pixels 120 corresponding to each of the plurality of second sets of filters 1132 to generate a second pixel signal and a fourth pixel signal, the second pixel signal representing a value in the first color channel of the light applied to the pixels 120 corresponding to the second set of filters 1132, and the fourth pixel signal representing a value in the third color channel of the light applied to the pixels 120 corresponding to the second set of filters 1132; and
obtaining a second target image according to the first pixel signal, the second pixel signal, the third pixel signal, and the fourth pixel signal.
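A minimal sketch of the clear-texture-mode combination referenced above, assuming (only for illustration) that each set of filters is a 2*2 block described by a boolean mask of which pixels sit under the first color filters A, and that mean-combining is the combining operation; the disclosure does not limit the arrangement or the combining operation to these choices.

```python
import numpy as np

def combine_first_set(pixels: np.ndarray, a_mask: np.ndarray):
    """Combine one first set of filters 1131 (A and B filters in equal number).

    Returns (first_pixel_signal, third_pixel_signal): the first color channel
    value and the second color channel value of the light applied to the set.
    """
    return pixels[a_mask].mean(), pixels[~a_mask].mean()

def combine_second_set(pixels: np.ndarray, a_mask: np.ndarray):
    """Combine one second set of filters 1132 (A and C filters in equal number).

    Returns (second_pixel_signal, fourth_pixel_signal): the first color channel
    value and the third color channel value of the light applied to the set.
    """
    return pixels[a_mask].mean(), pixels[~a_mask].mean()

# Hypothetical 2*2 first set with the A filters on the diagonal.
pixels = np.array([[120.0, 80.0], [85.0, 118.0]])
a_mask = np.array([[True, False], [False, True]])
first_pixel_signal, third_pixel_signal = combine_first_set(pixels, a_mask)
```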
In some embodiments, the operation of obtaining the second target image according to the first pixel signal, the second pixel signal, the third pixel signal, and the fourth pixel signal may include: calculating an average value of the electrical signals generated by the pixels 120 corresponding to the first color filters A in each filter region 111 as the first color signal, calculating an average value of the electrical signals generated by the pixels 120 corresponding to the second color filters B in each filter region 111 as the second color signal, and calculating an average value of the electrical signals generated by the pixels 120 corresponding to the third color filters C in each filter region 111 as the third color signal.
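The per-region averaging described above can be sketched as follows, assuming each filter region 111 is presented as a block of pixel values plus a label array marking whether each pixel sits under an A, B, or C filter; the 4*4 region layout and the names are illustrative only.

```python
import numpy as np

def region_color_signals(region_pixels: np.ndarray, filter_labels: np.ndarray):
    """Average the electrical signals per color filter within one filter region 111.

    Returns (first_color_signal, second_color_signal, third_color_signal),
    i.e. the mean pixel value under the A, B, and C filters respectively.
    """
    first_color_signal  = region_pixels[filter_labels == 'A'].mean()
    second_color_signal = region_pixels[filter_labels == 'B'].mean()
    third_color_signal  = region_pixels[filter_labels == 'C'].mean()
    return first_color_signal, second_color_signal, third_color_signal

# Hypothetical 4*4 filter region made of 2*2 filter units, each holding
# two A filters, one B filter, and one C filter.
labels = np.array([['A', 'B', 'A', 'B'],
                   ['A', 'C', 'A', 'C'],
                   ['A', 'B', 'A', 'B'],
                   ['A', 'C', 'A', 'C']])
pixels = np.random.randint(0, 1024, size=(4, 4)).astype(np.float32)
first_sig, second_sig, third_sig = region_color_signals(pixels, labels)
```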
In some embodiments, the signal processing method may include: performing a gamma correction process on the first intermediate image.
In some embodiments, the signal processing method may include: performing a white balance process, a color correction matrix process, and a gamma correction process on the first color signal, the second color signal, and the third color signal. The step 03 may include: converting the processed first color signal, the processed second color signal, and the processed third color signal into a chrominance-luminance separation space to obtain the plurality of second intermediate images.
In some embodiments, the step 04 may include: performing an up-sampling process on the plurality of second intermediate images such that one filter region corresponds to a plurality of color values, so as to form a third intermediate image, and fusing the first intermediate image and the third intermediate image to obtain the first target image.
In some embodiments, the signal processing method may include: performing a high-pass filtering process on the first intermediate image.
An embodiment of the present disclosure further provides a computer device. The computer device may be the electronic device 1000.
The above computer device includes an image processing circuit, which may be implemented by hardware and/or software components, and may include various processing units that define an Image Signal Processing (ISP) pipeline.
As illustrated in
Image data captured by the imaging apparatus 910 is first processed by the ISP processor 940. The ISP processor 940 analyzes the image data to capture image statistics information that can be used to determine one or more control parameters for the imaging apparatus 910. The imaging apparatus 910 may include a camera having one or more lenses 912 and an image sensor 914. The image sensor 914 may be the image sensor 10. The image sensor 914 may include a filter array. The image sensor 914 may obtain light intensity and wavelength information captured by each pixel of the image sensor 914, and provide a set of raw image data that can be processed by the ISP processor 940, e.g., raw image data composed of a plurality of first pixel signals, a plurality of second pixel signals, a plurality of third pixel signals, and a plurality of fourth pixel signals. The sensor 920 (e.g., a gyroscope) may provide collected processing parameters (e.g., anti-shake parameters) to the ISP processor 940 based on the type of the interface of the sensor 920. The interface of the sensor 920 may use a Standard Mobile Imaging Architecture (SMIA) interface, another serial or parallel camera interface, or any combination thereof.
In addition, the image sensor 914 may transmit the raw image data to the sensor 920, and the sensor 920 may provide the raw image data to the ISP processor 940 based on the type of the interface of the sensor 920, or the sensor 920 may store the raw image data in the image memory 930.
The ISP processor 940 can process the raw image data pixel by pixel in any of a variety of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits. The ISP processor 940 may perform one or more image processing operations on the raw image data, and collect statistical information about the image data. Here, the image processing operations can be performed with the same or different bit depth accuracies.
The ISP processor 940 may alternatively receive image data from the image memory 930. For example, the interface of the sensor 920 can transmit the raw image data to the image memory 930, and the raw image data in the image memory 930 can be provided to the ISP processor 940 for processing. The image memory 930 may be a part of a memory device, a storage device, or an independent dedicated memory in an electronic device, and may include Direct Memory Access (DMA) features.
Upon receiving the raw image data from an interface of the image sensor 914, the interface of the sensor 920, or the image memory 930, the ISP processor 940 can perform one or more image processing operations, such as time-domain filtering, or, as another example, processing the first pixel signal, the second pixel signal, the third pixel signal, and the fourth pixel signal to obtain a color image, etc. The processed image data (for example, the color image) can be transmitted to the image memory 930 for further processing before being displayed. The ISP processor 940 can receive the processed data from the image memory 930, and perform image data processing on the processed data in an original domain and in RGB and YCbCr color spaces. The image data processed by the ISP processor 940 may be outputted to the display 970 for viewing by the user and/or further processing by a graphics engine or a Graphics Processing Unit (GPU). In addition, the output of the ISP processor 940 can also be transmitted to the image memory 930, and the display 970 can read the image data from the image memory 930. In one embodiment, the image memory 930 can be configured to implement one or more frame buffers. In addition, the output of the ISP processor 940 can be transmitted to an encoder/decoder 960 for encoding/decoding the image data. The encoded image data can be saved and decompressed before being displayed on the display 970. The encoder/decoder 960 may be implemented by a CPU, a GPU, or a co-processor. For example, when the computer device is in a preview mode or a video recording mode, the ISP processor 940 can process an image signal including a plurality of image signal units U (illustrated in
The statistical data determined by the ISP processor 940 can be transmitted to the control logic unit 950. For example, the statistical data may include statistical information of the image sensor 914 for automatic exposure, automatic white balance, automatic focusing, flicker detection, black level compensation, and shading correction for the lens 912. The control logic unit 950 may include a processor and/or a micro-controller that executes one or more routines (such as firmware). The one or more routines can determine control parameters for the imaging apparatus 910 and control parameters for the ISP processor 940 based on the received statistical data. For example, the control parameters for the imaging apparatus 910 may include control parameters for the sensor 920 (such as gain, integration time of exposure control, anti-shake parameters, etc.), flash control parameters for the camera, control parameters for the lens 912 (such as focus or focal length for zooming), or any combination thereof. The control parameters for the ISP processor 940 may include gain levels and color correction matrices for automatic white balance and color adjustment (for example, during RGB processing), and shading correction parameters for the lens 912.
In the present disclosure, the description with reference to the terms “one embodiment”, “some embodiments”, “an example”, “a specific example”, or “some examples”, etc., means that specific features, structures, materials, or characteristics described in conjunction with the embodiment(s) or example(s) are included in at least one embodiment or example of the present disclosure. In the present disclosure, any illustrative reference of the above terms does not necessarily refer to the same embodiment(s) or example(s). Moreover, the specific features, structures, materials or characteristics as described can be combined in any one or more embodiments or examples as appropriate. In addition, those skilled in the art can combine and integrate different embodiments or examples, or features thereof, as described in the present disclosure, provided that they do not contradict each other.
In addition, the terms “first” and “second” are only used for the purpose of description, and should not be construed as indicating or implying any relative importance or implicitly indicating the number of defined technical features. Therefore, the features defined with “first” and “second” may explicitly or implicitly include at least one of the features. In the description of the present disclosure, “a plurality of” means at least two, e.g., two, three, etc., unless specifically defined otherwise.
Any process or method described in the flowchart or described otherwise herein can be understood as a module, segment or part of codes that include one or more executable instructions for implementing steps of specific logical functions or processes. It can be appreciated by those skilled in the art that the scope of the preferred embodiments of the present disclosure includes additional implementations where functions may not be performed in the order as shown or discussed, including implementations where the involved functions are performed substantially in parallel or even in a reverse order.
Although the embodiments of the present disclosure have been shown and described above, it can be appreciated that the above embodiments are exemplary only, and should not be construed as limiting the present disclosure. Various changes, modifications, replacements and variants can be made to the above embodiments by those skilled in the art without departing from the scope of the present disclosure.
The present application is a continuation of International Application No. PCT/CN2020/078212 filed on Mar. 6, 2020, which is hereby incorporated by reference in its entirety.
Number | Date | Country
---|---|---
Parent: PCT/CN2020/078212 | Mar 2020 | US
Child: 17903872 | | US