The subject matter herein generally relates to an image processing device, and more particularly to an image processing device for a lens module.
Referring to
Implementations of the present disclosure will now be described, by way of embodiments, with reference to the attached figures.
It will be appreciated that for simplicity and clarity of illustration, where appropriate, reference numerals have been repeated among the different figures to indicate corresponding or analogous elements. Additionally, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein can be practiced without these specific details. In other instances, methods, procedures and components have not been described in detail so as not to obscure the related relevant feature being described. The drawings are not necessarily to scale and the proportions of certain parts may be exaggerated to better illustrate details and features. The description is not to be considered as limiting the scope of the embodiments described herein.
Several definitions that apply throughout this disclosure will now be presented.
The term “coupled” is defined as connected, whether directly or indirectly through intervening components, and is not necessarily limited to physical connections. The connection can be such that the objects are permanently coupled or releasably connected. The term “comprising” means “including, but not necessarily limited to”; it specifically indicates open-ended inclusion or membership in a so-described combination, group, series, and the like.
Referring to
The pixel array 10 includes a plurality of pixels 11. The pixels 11 form an N*M array, where N and M are both positive integers and may be equal or unequal. For example, in one embodiment, N and M are both equal to 4, so the pixels 11 form a 4*4 array. A unit size of each pixel 11 may be less than 1 micron. In one embodiment, the unit size of each pixel 11 is 0.8 microns.
The filter array 20 is arranged corresponding to the pixel array 10. In one embodiment, a shape and size of the filter array 20 correspond to a shape and size of the pixel array 10. The filter array 20 includes a plurality of filter units. Each filter unit includes at least one filter for filtering incident light, so that a colored light enters the corresponding pixel 11 through the filter array 20.
In one embodiment, the filter array 20 includes four filter units, namely a first filter unit 21, a second filter unit 22, a third filter unit 23, and a fourth filter unit 24. The first filter unit 21, the second filter unit 22, the third filter unit 23, and the fourth filter unit 24 are arranged adjacent to each other and form a 2*2 filter array.
In one embodiment, each of the filter units allows only one type of colored light to pass through. For example, the first filter unit 21 and the fourth filter unit 24, located in opposite corners of the filter array 20, allow only light of a first color, such as green, to pass through; the second filter unit 22, located at a third corner of the filter array 20, allows only light of a second color, such as red, to pass through; and the third filter unit 23, located at the remaining corner of the filter array 20, allows only light of a third color, such as blue, to pass through. In this way, the first filter unit 21, the second filter unit 22, the third filter unit 23, and the fourth filter unit 24 form a four-Bayer color filter array in the GRBG format. In other embodiments, the four-Bayer color filter array formed by the filter array 20 is not limited to the GRBG format described above and may be in other formats, such as RGGB or BGGR. In addition, the arrangement of the filter units in the filter array 20 is not limited to the 2*2 arrangement described above. In other embodiments, the filter units of the filter array 20 may be arranged in a 3*3 array of filter units.
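The layout described above can be pictured concretely. The following is a minimal illustrative sketch (not part of the disclosure) that builds the GRBG four-Bayer color layout for an arbitrary pixel array; the function name and the use of Python/NumPy are assumptions made only for this example.

```python
import numpy as np

def quad_bayer_pattern(n_rows: int, n_cols: int) -> np.ndarray:
    """Return an (n_rows, n_cols) array of color labels ('G', 'R', 'B')
    laid out as a GRBG four-Bayer filter array: each 2*2 pixel unit
    shares one color, and the units repeat in the order G R / B G."""
    unit_colors = np.array([['G', 'R'],
                            ['B', 'G']])
    # Expand each filter unit to a 2*2 block of identically filtered pixels.
    tile = np.repeat(np.repeat(unit_colors, 2, axis=0), 2, axis=1)
    # Tile the 4*4 pattern to cover the whole pixel array.
    reps = (-(-n_rows // 4), -(-n_cols // 4))   # ceiling division
    return np.tile(tile, reps)[:n_rows, :n_cols]

print(quad_bayer_pattern(4, 4))
# [['G' 'G' 'R' 'R']
#  ['G' 'G' 'R' 'R']
#  ['B' 'B' 'G' 'G']
#  ['B' 'B' 'G' 'G']]
```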
Referring to
In one embodiment, a shape and size of the filter array 20 correspond to a shape and size of the pixel array 10, and each filter unit corresponds to one pixel unit 12. Therefore, the pixel array 10 is divided into a number of pixel units 12 corresponding to the number of filter units, that is, the pixel array 10 is divided into four pixel units 12. Each pixel unit 12 is composed of an M1*M2 array of pixels 11, where M1 and M2 are both positive integers greater than 1 and may be equal or unequal. For example, in one embodiment, both M1 and M2 are equal to 2.
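As an illustration of this grouping (again an assumed Python/NumPy sketch, not the disclosed hardware), an N*M pixel array can be regrouped into its M1*M2 pixel units with a simple reshape:

```python
import numpy as np

# Hypothetical 4*4 array of pixel intensity values (N = M = 4).
pixels = np.arange(16).reshape(4, 4)

M1, M2 = 2, 2  # size of each pixel unit
# Regroup the N*M pixel array into (N/M1)*(M/M2) pixel units,
# each holding an M1*M2 block of pixels.
units = (pixels
         .reshape(4 // M1, M1, 4 // M2, M2)
         .swapaxes(1, 2))          # shape: (2, 2, M1, M2)

print(units[0, 0])  # the 2*2 block of pixels forming the first pixel unit
```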
In one embodiment, since the first filter unit 21, the second filter unit 22, the third filter unit 23, and the fourth filter unit 24 respectively correspond to four adjacent pixel units 12 on the pixel array 10, each pixel unit 12 includes 2*2 pixels 11, and all of the pixels 11 in a given pixel unit 12 receive light of the same color.
Specifically, in one embodiment, when the filter array 20 is arranged as the GRBG four-Bayer color filter array, each filter unit transmits only light of a specific wavelength band (red light, green light, or blue light), so that the pixel array 10 outputs a first Bayer image (see
Referring to
In the pixel array in the related art, each pixel corresponds to one micro lens, and a gap exists between each two adjacent micro lenses. When incident light falls on the gap between the micro lenses, a portion of the incident light cannot be converted into electrical signals, which reduces the utilization rate of the incident light. In the present disclosure, the micro lenses 31 are arranged corresponding to the filter units and the pixel units 12, so that the plurality of pixels 11 in one pixel unit 12 share one micro lens 31, which effectively reduces the gaps between adjacent micro lenses 31 and increases utilization of the incident light.
Referring to
It can be understood that incident light passes through the micro lens array 30, the filter array 20, and the pixel array 10 in sequence. The incident light is first condensed by the micro lens array 30, each filter unit in the filter array 20 then filters the condensed light, and the filtered light enters the pixel array 10, so that the pixel unit 12 corresponding to each filter unit is illuminated by one of the three colors of RGB light. Then, the photodiode 13 and the readout circuit 14 of each pixel 11 obtain the light intensity value of the colored light received by that pixel 11 to generate the first Bayer image.
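The capture of the first Bayer image can be sketched with an idealized model. The example below (assumed names and NumPy representation; no noise, crosstalk, or lens effects are modeled) samples one color channel per pixel from an RGB scene according to the GRBG four-Bayer layout described above:

```python
import numpy as np

def capture_first_bayer(scene_rgb: np.ndarray) -> np.ndarray:
    """Sample an H*W*3 RGB scene through a GRBG four-Bayer filter array.

    Each pixel keeps only the channel its filter unit passes, mimicking
    the photodiode/readout values that form the first Bayer image.
    Idealized model: no lens falloff, noise, or crosstalk."""
    h, w, _ = scene_rgb.shape
    # Channel index (0=R, 1=G, 2=B) of each 2*2 pixel unit in the 4*4 tile.
    unit_channels = np.array([[1, 0],
                              [2, 1]])
    tile = np.repeat(np.repeat(unit_channels, 2, axis=0), 2, axis=1)
    channel_map = np.tile(tile, (-(-h // 4), -(-w // 4)))[:h, :w]
    rows, cols = np.indices((h, w))
    return scene_rgb[rows, cols, channel_map]

# Example: a flat gray scene produces a uniform first Bayer image.
scene = np.full((8, 8, 3), 0.5)
first_bayer = capture_first_bayer(scene)
print(first_bayer.shape)  # (8, 8)
```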
Referring to
In one embodiment, the image signal processor 80 includes a switching module 50, a first processing module 60, and a second processing module 70. The switching module 50 is electrically coupled to the image sensor 40. The first processing module 60 and the second processing module 70 are electrically coupled to the switching module 50. The switching module 50 is configured to receive the first Bayer image output by the image sensor 40, and select or trigger the first processing module 60 or the second processing module 70 according to the current mode of the image signal processor 80, so that the first processing module 60 or the second processing module 70 processes the first Bayer image, and then outputs the first image or the second image.
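A rough sketch of this dispatch logic is given below; the mode names, function names, and callback style are assumptions for illustration only and do not represent the actual switching circuit:

```python
import numpy as np

REMOSAIC_MODE = "remosaic"   # first mode: full-resolution path
BINNING_MODE = "binning"     # second mode: high-sensitivity path

def switch_and_process(first_bayer: np.ndarray, mode: str,
                       remosaic_path, binning_path) -> np.ndarray:
    """Route the first Bayer image to the processing path matching the
    current mode, mirroring the role of the switching module."""
    if mode == REMOSAIC_MODE:
        return remosaic_path(first_bayer)   # yields the first image
    if mode == BINNING_MODE:
        return binning_path(first_bayer)    # yields the second image
    raise ValueError(f"unknown mode: {mode!r}")

# Usage with trivial stand-in paths:
out = switch_and_process(np.zeros((8, 8)), BINNING_MODE,
                         remosaic_path=lambda x: x,
                         binning_path=lambda x: x[::2, ::2])
print(out.shape)  # (4, 4)
```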
For example, when the switching module 50 receives the first Bayer image and determines that the image signal processor 80 is in the first mode, the first processing module 60 will be selected or triggered. The first processing module 60 receives the first Bayer image transmitted by the switching module 50 and performs Remosaic processing on the first Bayer image to obtain the second Bayer image (see
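The disclosure does not fix a particular Remosaic algorithm beyond interpolation. Purely as an illustrative stand-in, the sketch below rearranges pixels inside every 4*4 tile (a nearest-neighbour pixel shuffle rather than interpolation) so that the GRBG four-Bayer layout becomes a standard GRBG Bayer layout; the shuffle table and function name are assumptions for the example:

```python
import numpy as np

# Within each 4*4 tile, output pixel (row, col) takes its value from input
# pixel SRC[(row, col)]; positions not listed already match the standard
# GRBG Bayer pattern, the listed ones swap with a nearby same-color pixel.
SRC = {
    (0, 1): (0, 2), (0, 2): (0, 1),
    (1, 0): (2, 0), (2, 0): (1, 0),
    (1, 3): (2, 3), (2, 3): (1, 3),
    (3, 1): (3, 2), (3, 2): (3, 1),
    (1, 2): (2, 1), (2, 1): (1, 2),
}

def remosaic(first_bayer: np.ndarray) -> np.ndarray:
    """Rearrange a GRBG four-Bayer image into a standard GRBG Bayer image
    by shuffling pixels inside every 4*4 tile (a simple stand-in for the
    interpolation-based Remosaic processing described above)."""
    h, w = first_bayer.shape
    assert h % 4 == 0 and w % 4 == 0, "sketch assumes dimensions divisible by 4"
    second_bayer = first_bayer.copy()
    for (orow, ocol), (irow, icol) in SRC.items():
        second_bayer[orow::4, ocol::4] = first_bayer[irow::4, icol::4]
    return second_bayer
```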
Referring to
The Demosaic processing refers to processing the second Bayer image into the first image. The first image is an RGB image where each pixel has three RGB color channels. The Remosaic processing and Demosaic processing can be implemented by different interpolation algorithms, such as linear interpolation, mean interpolation, etc., which will not be repeated here.
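As one concrete example of such interpolation (a basic bilinear-style scheme, assumed for illustration and consistent with the GRBG layout used above), the Demosaic step could be sketched as follows:

```python
import numpy as np

def bilinear_demosaic(bayer: np.ndarray) -> np.ndarray:
    """Interpolate a standard GRBG Bayer image into an H*W*3 RGB image.

    Directly sampled values are kept; missing channel values are filled
    with the average of the same-channel samples in the 3*3 neighbourhood,
    i.e. a basic bilinear-style Demosaic (real pipelines use richer
    algorithms)."""
    h, w = bayer.shape
    masks = np.zeros((h, w, 3))
    masks[0::2, 1::2, 0] = 1            # R samples
    masks[0::2, 0::2, 1] = 1            # G samples (even rows)
    masks[1::2, 1::2, 1] = 1            # G samples (odd rows)
    masks[1::2, 0::2, 2] = 1            # B samples
    samples = bayer[..., None] * masks  # per-channel sparse samples

    padded_v = np.pad(samples, ((1, 1), (1, 1), (0, 0)))
    padded_m = np.pad(masks, ((1, 1), (1, 1), (0, 0)))
    rgb = np.empty((h, w, 3))
    for c in range(3):
        num = sum(padded_v[i:i + h, j:j + w, c]
                  for i in range(3) for j in range(3))
        den = sum(padded_m[i:i + h, j:j + w, c]
                  for i in range(3) for j in range(3))
        avg = num / np.maximum(den, 1)
        # Keep directly measured values, interpolate only the gaps.
        rgb[..., c] = np.where(masks[..., c] > 0, samples[..., c], avg)
    return rgb
```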
The image signal processor 80 further includes a filter unit 61. The filter unit 61 is electrically coupled to the first processing module 60. The filter unit 61 is configured to perform mean filtering on each pixel unit 12 in the first Bayer image before generating the second Bayer image. In this way, the influence of scattered light and dispersive light on the first Bayer image is reduced, thereby effectively reducing color crosstalk of pixels in the generated second Bayer image.
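One plausible reading of this mean filtering, sketched below under the assumption that every pixel of a pixel unit is replaced by the mean of that unit (the exact filtering performed by the filter unit 61 is not detailed here), is:

```python
import numpy as np

def unit_mean_filter(first_bayer: np.ndarray, m1: int = 2, m2: int = 2) -> np.ndarray:
    """Replace every pixel of each M1*M2 pixel unit with the mean of that
    unit. Because all pixels in a unit sit behind the same color filter,
    this averages out scattered/stray light without mixing colors."""
    h, w = first_bayer.shape
    assert h % m1 == 0 and w % m2 == 0, "sketch assumes full pixel units"
    units = first_bayer.reshape(h // m1, m1, w // m2, m2)
    unit_means = units.mean(axis=(1, 3), keepdims=True)   # one mean per unit
    return np.broadcast_to(unit_means, units.shape).reshape(h, w)
```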
When the switching module 50 receives the first Bayer image and determines that the image signal processor 80 is in the second mode, the second processing module 70 will be selected or triggered. The second processing module 70 receives the first Bayer image transmitted by the switching module 50 and performs pixel binning processing on the first Bayer image to obtain a third Bayer image (shown in
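A minimal sketch of this binning step is shown below; it assumes the GRBG four-Bayer layout above and simply sums (or averages) the pixels of each pixel unit, which yields a quarter-resolution image that is already a standard GRBG Bayer image:

```python
import numpy as np

def bin_pixels(first_bayer: np.ndarray, m1: int = 2, m2: int = 2) -> np.ndarray:
    """Combine the M1*M2 pixels of every pixel unit into one larger pixel.

    Because all pixels of a unit share one filter color, summing them
    yields a quarter-resolution standard GRBG Bayer image (the third
    Bayer image in this sketch), with a per-pixel signal roughly m1*m2
    times larger."""
    h, w = first_bayer.shape
    assert h % m1 == 0 and w % m2 == 0, "sketch assumes full pixel units"
    units = first_bayer.reshape(h // m1, m1, w // m2, m2)
    return units.sum(axis=(1, 3))   # or .mean(...) for averaged binning
```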
Referring to
Because the first image is obtained from the first Bayer image through the Remosaic and Demosaic processing, no pixel binning is performed during generation of the first image, so the number of pixels in the first image is consistent with the number of pixels in the first Bayer image, and the area of each pixel in the first image is equal to the area of each pixel in the first Bayer image. The second image is obtained by combining every four pixels of the first Bayer image, so the number of pixels in the second image is the same as the number of pixel units 12 in the first Bayer image, and the area of each pixel in the second image is consistent with the area of each pixel unit 12 of the first Bayer image. In this way, the number of pixels in the first image is four times the number of pixels in the second image, but the second image and the first image have the same size. Generally, for images of the same size, the more pixels there are, the higher the image resolution and the clearer the image. Similarly, for an image of the same size, the larger the area of each pixel, the more light each pixel absorbs. Therefore, an image resolution of the first image is higher than an image resolution of the second image, but a brightness of the second image is higher than a brightness of the first image.
In one embodiment, the first mode is a Remosaic mode, and the second mode is a Binning mode. The Remosaic mode performs processing based on each pixel of the first Bayer image, and the first image has a higher resolution. The first Bayer image is filtered by the filter unit 61 to improve a stray light margin and reduce color crosstalk between pixels.
The Binning mode combines several pixels of each pixel unit 12 corresponding to each filter unit into one pixel for processing, thereby increasing the area of each pixel, improving sensitivity, increasing the stray light margin, and reducing color crosstalk between pixels.
In summary, in the image processing device 100, each filter unit corresponds to one pixel unit 12, each filter unit allows only one type of colored light to pass through, and each micro lens 31 corresponds to one filter unit and one pixel unit 12. In the Remosaic mode of the image signal processor 80, the arrangement of the pixel array 10 is restored to the standard Bayer array arrangement, and the stray light margin is improved through the filtering processing, so that the color crosstalk between pixels is reduced and the image has a high resolution. In the Binning mode of the image signal processor 80, light is incident on a pixel with a larger area, so that the stray light margin and the sensitivity are improved, and a lens with a larger aperture can be used. That is, the image processing device 100 can adapt to various focal lengths and scenes and overcome the problems of low image resolution, low brightness, scattered light between pixels, and color crosstalk between pixels in an existing periscope lens caused by its small aperture and large pixel area, thereby effectively improving image quality.
Referring to
The periscope lens 90 may operate at a telephoto end and/or a wide-angle end. It can be understood that, whether the periscope lens 90 is at the telephoto end or the wide-angle end, the image processing device 100 can output either the first image or the second image.
It can be understood that when the periscope lens 90 is at the telephoto end, the focusing distance is long, the incident light angle is small, and the amount of incident light is small. When the image signal processor 80 is in the first mode, the pixel arrangement of the first image is restored to the standard Bayer array, and the resolution of the first image is improved. When the image signal processor 80 is in the second mode, pixel binning reduces stray light and improves the sensitivity of the second image, and the second image has less color crosstalk.
It can be understood that when the periscope lens 90 is at the wide-angle end, the focusing distance is short, the incident light angle is large, and the amount of incident light is large. When the image signal processor 80 is in the first mode, the pixel arrangement of the first image is restored to the standard Bayer array, the resolution of the first image is improved, and the color crosstalk between pixels is reduced through the mean filtering, so the first mode is more suitable for bright scenes. When the image signal processor 80 is in the second mode, pixel binning improves the sensitivity of the second image, so the second mode is more suitable for dark scenes.
The lens module 200 can effectively overcome the problems of low image resolution and low brightness in the existing periscope lens through the configuration of the image processing device 100.
Referring to
At block S1, a first Bayer image is obtained.
The first Bayer image may be obtained by the image sensor 40 described above. The specific structure and working principle of the image sensor 40 are described above, and will not be repeated here.
At block S2, a corresponding processing module is selected or triggered according to a current mode.
The image signal processor 80 is described above, and will not be repeated here. When the switching module 50 receives the first Bayer image and determines that the image signal processor 80 is in the first mode, the first processing module 60 is selected or triggered. When the switching module 50 receives the first Bayer image and determines that the image signal processor 80 is in the second mode, the second processing module 70 is selected or triggered.
At block S3, image processing is performed on the first Bayer image to obtain a first image or a second image.
When the image signal processor 80 is in the first mode, the first processing module 60 receives the first Bayer image transmitted by the switching module 50 and performs Remosaic processing on the first Bayer image to obtain a second Bayer image. Then, the first processing module 60 performs Demosaic processing on the second Bayer image to obtain the first image.
When the image signal processor 80 is in the second mode, the second processing module 70 receives the first Bayer image transmitted by the switching module 50 and performs pixel binning processing on the first Bayer image to obtain a third Bayer image. Then, the second processing module 70 performs Demosaic processing on the third Bayer image to obtain the second image.
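For completeness, blocks S1 to S3 can be strung together. The sketch below combines the mode dispatch, the pixel-shuffle Remosaic stand-in, and the binning from the earlier examples under the same assumptions; the concluding Demosaic step is omitted for brevity:

```python
import numpy as np

# Pixel shuffle inside each 4*4 tile used by the Remosaic stand-in (see the
# earlier sketch): output (row, col) <- input (row, col) within the tile.
_SHUFFLE = {(0, 1): (0, 2), (0, 2): (0, 1), (1, 0): (2, 0), (2, 0): (1, 0),
            (1, 3): (2, 3), (2, 3): (1, 3), (3, 1): (3, 2), (3, 2): (3, 1),
            (1, 2): (2, 1), (2, 1): (1, 2)}

def process(first_bayer: np.ndarray, mode: str) -> np.ndarray:
    """Blocks S1 to S3 in miniature: take the captured first Bayer image and
    produce the intermediate Bayer image of the selected mode (the final
    Demosaic step is omitted here; see the earlier Demosaic sketch)."""
    if mode == "remosaic":                       # first mode
        second = first_bayer.copy()
        for (orow, ocol), (irow, icol) in _SHUFFLE.items():
            second[orow::4, ocol::4] = first_bayer[irow::4, icol::4]
        return second                            # full-resolution Bayer image
    if mode == "binning":                        # second mode
        h, w = first_bayer.shape
        units = first_bayer.reshape(h // 2, 2, w // 2, 2)
        return units.sum(axis=(1, 3))            # quarter-resolution Bayer image
    raise ValueError(f"unknown mode: {mode!r}")

raw = np.random.default_rng(0).random((8, 8))    # stand-in first Bayer image
print(process(raw, "remosaic").shape, process(raw, "binning").shape)  # (8, 8) (4, 4)
```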
The embodiments shown and described above are only examples. Even though numerous characteristics and advantages of the present technology have been set forth in the foregoing description, together with details of the structure and function of the present disclosure, the disclosure is illustrative only, and changes may be made in the detail, including in matters of shape, size and arrangement of the parts within the principles of the present disclosure up to, and including, the full extent established by the broad general meaning of the terms used in the claims.
Number | Date | Country | Kind |
---|---|---|---|
202011553202.0 | Dec 2020 | CN | national |