This application claims priority of Taiwan Application No. 109109643 filed on Mar. 23, 2020.
The present invention is related to an optical recognition system for use in computer visual processing, and more particularly, to an optical recognition system having 4×4 kernel image sensors for use in computer visual processing.
Image sensors are widely used in consumer products for converting optical images into electrical signals, thereby generating color images. An image sensor typically includes photo-sensitive devices such as charge-coupled devices (CCD) or CMOS active pixel sensors for light detection, as well as a filter array arranged in a specific pattern for gathering the brightness information of each color. Next, full-color images may be provided by performing interpolation and correction on the brightness information.
However, the prior art recognition system is designed for human eyes, wherein many line buffers are required for storing the brightness information of multiple scan lines so as to perform interpolation on RGB images and IR images. Also, the prior art recognition system needs to implement complicated algorithms in order to provide sufficient image characteristics for human eyes to perform image recognition.
The present invention provides an optical recognition system for use in computer visual processing. The optical recognition system includes an image capturing device, a buffer unit and an interpolation unit. The image capturing device includes a 4×4 kernel image sensor which includes a first red pixel, a second red pixel, a first through an eighth green pixels, a first blue pixel, a second blue pixel, and a first through a fourth IR pixels forming a first through a fourth scan lines adjacent to each other. The buffer unit is configured to store brightness information of at least two scan lines among the first through the fourth scan lines. The interpolation unit is configured to provide missing components in each pixel according to the brightness information stored in the buffer unit, thereby outputting an image data which includes full-color brightness information associated with each pixel.
These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
In the optical recognition systems 100 and 200, the image capturing device 10 includes one or multiple 4×4 kernel image sensors, each consisting of optical sensors and filter arrays. Each 4×4 kernel image sensor includes a plurality of red pixels, a plurality of green pixels, a plurality of blue pixels, and a plurality of IR pixels arranged in a Bayer pattern and forming four adjacent scan lines.
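For illustration, one such 4×4 kernel may be sketched as follows. The layout shown is inferred from the per-coordinate interpolation equations given later in this disclosure; the labeling convention (row index first, column index second) is an assumption of this sketch, not a limitation of the invention:

```python
from collections import Counter

# One 4x4 kernel of the RGB-IR pattern, as inferred from the
# per-coordinate interpolation equations in this disclosure.
KERNEL = [
    ["G",  "R", "G",  "B"],  # scan line 4n
    ["IR", "G", "IR", "G"],  # scan line 4n+1
    ["G",  "B", "G",  "R"],  # scan line 4n+2
    ["IR", "G", "IR", "G"],  # scan line 4n+3
]

def channel_at(row, col):
    """Return the color channel captured at absolute coordinate (row, col)."""
    return KERNEL[row % 4][col % 4]

# The kernel contains 2 red, 8 green, 2 blue and 4 IR pixels.
counts = Counter(c for line in KERNEL for c in line)
print(counts)  # e.g. Counter({'G': 8, 'IR': 4, 'R': 2, 'B': 2})
```

The kernel tiles the sensor, so the channel at any coordinate is recovered by reducing the row and column indices modulo 4.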
In an embodiment of the present invention, the buffer unit 30 in the optical recognition systems 100 and 200 includes two line buffers. Therefore, for the coordinate of a red pixel in the 4×4 kernel image sensor PX(n,m), its red component may be provided based on the brightness information associated with the coordinate of the red pixel, its green component may be provided by performing interpolation based on the brightness information associated with four green pixels adjacent to the coordinate of the red pixel, its blue component may be provided by performing interpolation based on the brightness information associated with two blue pixels nearest to the coordinate of the red pixel along the horizontal direction, and its IR component may be provided by performing interpolation based on the brightness information associated with four IR pixels nearest to the coordinate of the red pixel.
For the coordinate of a green pixel in the 4×4 kernel image sensor PX(n,m), its red component may be provided by performing interpolation based on the brightness information associated with the red pixel adjacent to the coordinate of the green pixel along the horizontal direction or the vertical direction, its green component may be provided based on the brightness information associated with the coordinate of the green pixel, its blue component may be provided by performing interpolation based on the brightness information associated with the blue pixel adjacent to the coordinate of the green pixel along the horizontal direction or the vertical direction, and its IR component may be provided by performing interpolation based on the brightness information associated with the two IR pixels adjacent to the coordinate of the green pixel along the horizontal direction or the vertical direction.
For the coordinate of a blue pixel in the 4×4 kernel image sensor PX(n,m), its red component may be provided by performing interpolation based on the brightness information associated with two red pixels nearest to the coordinate of the blue pixel along the horizontal direction, its green component may be provided by performing interpolation based on the brightness information associated with four green pixels adjacent to the coordinate of the blue pixel along the horizontal direction and the vertical direction, its blue component may be provided based on the brightness information associated with the coordinate of the blue pixel, and its IR component may be provided by performing interpolation based on the brightness information associated with four IR pixels nearest to the coordinate of the blue pixel.
For the coordinate of an IR pixel in the 4×4 kernel image sensor PX(n,m), its red component may be provided by performing interpolation based on the brightness information associated with the two red pixels nearest to the coordinate of the IR pixel, its green component may be provided by performing interpolation based on the brightness information associated with the four green pixels adjacent to the coordinate of the IR pixel along the horizontal direction and the vertical direction, its blue component may be provided by performing interpolation based on the brightness information associated with the two blue pixels nearest to the coordinate of the IR pixel, and its IR component may be provided based on the brightness information associated with the coordinate of the IR pixel.
More specifically, the interpolation method of providing the red component R′(4n,4m), the green component G′(4n,4m), the blue component B′(4n,4m) and the IR component IR′(4n,4m) for the green pixel at the coordinate (4n,4m) may be illustrated by the following equations:
R′(4n,4m)=R(4n,4m+1)
G′(4n,4m)=G(4n,4m)
B′(4n,4m)=B(4n,4m−1)
IR′(4n,4m)=[IR(4n−1,4m)+IR(4n+1,4m)]/2
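As a minimal sketch, the four equations above may be written as a function operating on a hypothetical two-dimensional array `raw` of brightness values; the function name is illustrative and boundary handling is omitted:

```python
def interpolate_green_4n_4m(raw, n, m):
    """Full-color components at the green pixel (4n, 4m), per the
    equations above. Boundary handling is omitted for clarity."""
    r, c = 4 * n, 4 * m
    return {
        "R": raw[r][c + 1],                         # nearest red, horizontal
        "G": raw[r][c],                             # native green sample
        "B": raw[r][c - 1],                         # nearest blue, horizontal
        "IR": (raw[r - 1][c] + raw[r + 1][c]) / 2,  # vertical IR average
    }
```

Only one scan line above and one below the current line are touched, consistent with the two-line-buffer configuration described above.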
The interpolation method of providing the red component R′(4n,4m+1), the green component G′(4n,4m+1), the blue component B′(4n,4m+1) and the IR component IR′(4n,4m+1) for the red pixel at the coordinate (4n,4m+1) may be illustrated by the following equations:
R′(4n,4m+1)=R(4n,4m+1)
G′(4n,4m+1)=[G(4n−1,4m+1)+G(4n,4m)+G(4n,4m+2)+G(4n+1,4m+1)]/4
B′(4n,4m+1)=[B(4n,4m−1)+B(4n,4m+3)]/2
IR′(4n,4m+1)=[IR(4n−1,4m)+IR(4n−1,4m+2)+IR(4n+1,4m)+IR(4n+1,4m+2)]/4
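The cross-shaped green average, horizontal blue average and diagonal IR average for the red pixel above may likewise be sketched (hypothetical function name, boundary handling omitted):

```python
def interpolate_red_4n_4m1(raw, n, m):
    """Full-color components at the red pixel (4n, 4m+1), per the
    equations above. Boundary handling is omitted for clarity."""
    r, c = 4 * n, 4 * m + 1
    return {
        "R": raw[r][c],                                        # native red sample
        "G": (raw[r - 1][c] + raw[r][c - 1]
              + raw[r][c + 1] + raw[r + 1][c]) / 4,            # four adjacent greens
        "B": (raw[r][c - 2] + raw[r][c + 2]) / 2,              # two horizontal blues
        "IR": (raw[r - 1][c - 1] + raw[r - 1][c + 1]
               + raw[r + 1][c - 1] + raw[r + 1][c + 1]) / 4,   # four diagonal IRs
    }
```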
The interpolation method of providing the red component R′(4n,4m+2), the green component G′(4n,4m+2), the blue component B′(4n,4m+2) and the IR component IR′(4n,4m+2) for the green pixel at the coordinate (4n,4m+2) may be illustrated by the following equations:
R′(4n,4m+2)=R(4n,4m+1)
G′(4n,4m+2)=G(4n,4m+2)
B′(4n,4m+2)=B(4n,4m+3)
IR′(4n,4m+2)=[IR(4n−1,4m+2)+IR(4n+1,4m+2)]/2
The interpolation method of providing the red component R′(4n,4m+3), the green component G′(4n,4m+3), the blue component B′(4n,4m+3) and the IR component IR′(4n,4m+3) for the blue pixel at the coordinate (4n,4m+3) may be illustrated by the following equations:
R′(4n,4m+3)=[R(4n,4m+1)+R(4n,4m+5)]/2
G′(4n,4m+3)=[G(4n−1,4m+3)+G(4n,4m+2)+G(4n,4m+4)+G(4n+1,4m+3)]/4
B′(4n,4m+3)=B(4n,4m+3)
IR′(4n,4m+3)=[IR(4n−1,4m+2)+IR(4n−1,4m+4)+IR(4n+1,4m+2)+IR(4n+1,4m+4)]/4
The interpolation method of providing the red component R′(4n+1,4m), the green component G′(4n+1,4m), the blue component B′(4n+1,4m) and the IR component IR′(4n+1,4m) for the IR pixel at the coordinate (4n+1,4m) may be illustrated by the following equations:
R′(4n+1,4m)=[R(4n,4m+1)+R(4n+2,4m−1)]/2
G′(4n+1,4m)=[G(4n,4m)+G(4n+1,4m−1)+G(4n+1,4m+1)+G(4n+2,4m)]/4
B′(4n+1,4m)=[B(4n,4m−1)+B(4n+2,4m+1)]/2
IR′(4n+1,4m)=IR(4n+1,4m)
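For the IR pixel above, the red and blue components come from diagonal neighbors; a sketch in the same style (hypothetical function name, boundary handling omitted):

```python
def interpolate_ir_4n1_4m(raw, n, m):
    """Full-color components at the IR pixel (4n+1, 4m), per the
    equations above. Boundary handling is omitted for clarity."""
    r, c = 4 * n + 1, 4 * m
    return {
        "R": (raw[r - 1][c + 1] + raw[r + 1][c - 1]) / 2,  # two diagonal reds
        "G": (raw[r - 1][c] + raw[r][c - 1]
              + raw[r][c + 1] + raw[r + 1][c]) / 4,        # four adjacent greens
        "B": (raw[r - 1][c - 1] + raw[r + 1][c + 1]) / 2,  # two diagonal blues
        "IR": raw[r][c],                                   # native IR sample
    }
```

The remaining twelve kernel positions follow the same nearest-neighbor and averaging rules with shifted coordinates.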
The interpolation method of providing the red component R′(4n+1,4m+1), the green component G′(4n+1,4m+1), the blue component B′(4n+1,4m+1) and the IR component IR′(4n+1,4m+1) for the green pixel at the coordinate (4n+1,4m+1) may be illustrated by the following equations:
R′(4n+1,4m+1)=R(4n,4m+1)
G′(4n+1,4m+1)=G(4n+1,4m+1)
B′(4n+1,4m+1)=B(4n+2,4m+1)
IR′(4n+1,4m+1)=[IR(4n+1,4m)+IR(4n+1,4m+2)]/2
The interpolation method of providing the red component R′(4n+1,4m+2), the green component G′(4n+1,4m+2), the blue component B′(4n+1,4m+2) and the IR component IR′(4n+1,4m+2) for the IR pixel at the coordinate (4n+1,4m+2) may be illustrated by the following equations:
R′(4n+1,4m+2)=[R(4n,4m+1)+R(4n+2,4m+3)]/2
G′(4n+1,4m+2)=[G(4n,4m+2)+G(4n+1,4m+1)+G(4n+1,4m+3)+G(4n+2,4m+2)]/4
B′(4n+1,4m+2)=[B(4n,4m+3)+B(4n+2,4m+1)]/2
IR′(4n+1,4m+2)=IR(4n+1,4m+2)
The interpolation method of providing the red component R′(4n+1,4m+3), the green component G′(4n+1,4m+3), the blue component B′(4n+1,4m+3) and the IR component IR′(4n+1,4m+3) for the green pixel at the coordinate (4n+1,4m+3) may be illustrated by the following equations:
R′(4n+1,4m+3)=R(4n+2,4m+3)
G′(4n+1,4m+3)=G(4n+1,4m+3)
B′(4n+1,4m+3)=B(4n,4m+3)
IR′(4n+1,4m+3)=[IR(4n+1,4m+2)+IR(4n+1,4m+4)]/2
The interpolation method of providing the red component R′(4n+2,4m), the green component G′(4n+2,4m), the blue component B′(4n+2,4m) and the IR component IR′(4n+2,4m) for the green pixel at the coordinate (4n+2,4m) may be illustrated by the following equations:
R′(4n+2,4m)=R(4n+2,4m−1)
G′(4n+2,4m)=G(4n+2,4m)
B′(4n+2,4m)=B(4n+2,4m+1)
IR′(4n+2,4m)=[IR(4n+1,4m)+IR(4n+3,4m)]/2
The interpolation method of providing the red component R′(4n+2,4m+1), the green component G′(4n+2,4m+1), the blue component B′(4n+2,4m+1) and the IR component IR′(4n+2,4m+1) for the blue pixel at the coordinate (4n+2,4m+1) may be illustrated by the following equations:
R′(4n+2,4m+1)=[R(4n+2,4m−1)+R(4n+2,4m+3)]/2
G′(4n+2,4m+1)=[G(4n+1,4m+1)+G(4n+2,4m)+G(4n+2,4m+2)+G(4n+3,4m+1)]/4
B′(4n+2,4m+1)=B(4n+2,4m+1)
IR′(4n+2,4m+1)=[IR(4n+1,4m)+IR(4n+1,4m+2)+IR(4n+3,4m)+IR(4n+3,4m+2)]/4
The interpolation method of providing the red component R′(4n+2,4m+2), the green component G′(4n+2,4m+2), the blue component B′(4n+2,4m+2) and the IR component IR′(4n+2,4m+2) for the green pixel at the coordinate (4n+2,4m+2) may be illustrated by the following equations:
R′(4n+2,4m+2)=R(4n+2,4m+3)
G′(4n+2,4m+2)=G(4n+2,4m+2)
B′(4n+2,4m+2)=B(4n+2,4m+1)
IR′(4n+2,4m+2)=[IR(4n+1,4m+2)+IR(4n+3,4m+2)]/2
The interpolation method of providing the red component R′(4n+2,4m+3), the green component G′(4n+2,4m+3), the blue component B′(4n+2,4m+3) and the IR component IR′(4n+2,4m+3) for the red pixel at the coordinate (4n+2,4m+3) may be illustrated by the following equations:
R′(4n+2,4m+3)=R(4n+2,4m+3)
G′(4n+2,4m+3)=[G(4n+1,4m+3)+G(4n+2,4m+2)+G(4n+2,4m+4)+G(4n+3,4m+3)]/4
B′(4n+2,4m+3)=[B(4n+2,4m+1)+B(4n+2,4m+5)]/2
IR′(4n+2,4m+3)=[IR(4n+1,4m+2)+IR(4n+1,4m+4)+IR(4n+3,4m+2)+IR(4n+3,4m+4)]/4
The interpolation method of providing the red component R′(4n+3,4m), the green component G′(4n+3,4m), the blue component B′(4n+3,4m) and the IR component IR′(4n+3,4m) for the IR pixel at the coordinate (4n+3,4m) may be illustrated by the following equations:
R′(4n+3,4m)=[R(4n+2,4m−1)+R(4n+4,4m+1)]/2
G′(4n+3,4m)=[G(4n+2,4m)+G(4n+3,4m−1)+G(4n+3,4m+1)+G(4n+4,4m)]/4
B′(4n+3,4m)=[B(4n+2,4m+1)+B(4n+4,4m−1)]/2
IR′(4n+3,4m)=IR(4n+3,4m)
The interpolation method of providing the red component R′(4n+3,4m+1), the green component G′(4n+3,4m+1), the blue component B′(4n+3,4m+1) and the IR component IR′(4n+3,4m+1) for the green pixel at the coordinate (4n+3,4m+1) may be illustrated by the following equations:
R′(4n+3,4m+1)=R(4n+4,4m+1)
G′(4n+3,4m+1)=G(4n+3,4m+1)
B′(4n+3,4m+1)=B(4n+2,4m+1)
IR′(4n+3,4m+1)=[IR(4n+3,4m)+IR(4n+3,4m+2)]/2
The interpolation method of providing the red component R′(4n+3,4m+2), the green component G′(4n+3,4m+2), the blue component B′(4n+3,4m+2) and the IR component IR′(4n+3,4m+2) for the IR pixel at the coordinate (4n+3,4m+2) may be illustrated by the following equations:
R′(4n+3,4m+2)=[R(4n+2,4m+3)+R(4n+4,4m+1)]/2
G′(4n+3,4m+2)=[G(4n+2,4m+2)+G(4n+3,4m+1)+G(4n+3,4m+3)+G(4n+4,4m+2)]/4
B′(4n+3,4m+2)=[B(4n+2,4m+1)+B(4n+4,4m+3)]/2
IR′(4n+3,4m+2)=IR(4n+3,4m+2)
The interpolation method of providing the red component R′(4n+3,4m+3), the green component G′(4n+3,4m+3), the blue component B′(4n+3,4m+3) and the IR component IR′(4n+3,4m+3) for the green pixel at the coordinate (4n+3,4m+3) may be illustrated by the following equations:
R′(4n+3,4m+3)=R(4n+2,4m+3)
G′(4n+3,4m+3)=G(4n+3,4m+3)
B′(4n+3,4m+3)=B(4n+4,4m+3)
IR′(4n+3,4m+3)=[IR(4n+3,4m+2)+IR(4n+3,4m+4)]/2
After performing interpolation on all pixels, the interpolation unit 20 is configured to output an image data DI which includes full-color brightness information associated with each pixel.
In another embodiment of the present invention, the buffer unit 30 in the optical recognition systems 100 and 200 may include more than two line buffers. Under such circumstances, the interpolation unit 20 can provide the missing components of each pixel by performing interpolation based on the brightness information of all neighboring pixels.
In the optical recognition systems 100 and 200, the correction unit 40 is configured to calibrate each pixel channel in the image data DI outputted by the interpolation unit 20 according to a configurable RGB-IR correction matrix, thereby outputting the RGB image and the IR image. The configurable RGB-IR correction matrix may be depicted by the following equations:

RT=C11×R+C12×G+C13×B+C14×IR
GT=C21×R+C22×G+C23×B+C24×IR
BT=C31×R+C32×G+C33×B+C34×IR
IRT=C41×R+C42×G+C43×B+C44×IR

In the configurable RGB-IR correction matrix, R, G, B, and IR represent the red pixel value, the green pixel value, the blue pixel value and the IR pixel value in the image data DI before calibration. RT, GT, BT, and IRT represent the red pixel value, the green pixel value, the blue pixel value and the IR pixel value in the RGB image and the IR image after calibration. C11~C44 represent correction coefficients which may be acquired by shooting color cards with different optical brightness, thereby generating the calibrated RGB image and the calibrated IR image under different optical brightness. However, the implementation of the configurable RGB-IR correction matrix does not limit the scope of the present invention.
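A minimal sketch of applying such a 4×4 correction matrix to one pixel follows; the coefficient values shown are placeholders, not calibrated values, and the function name is illustrative:

```python
def apply_rgbir_correction(pixel, C):
    """Apply a configurable 4x4 RGB-IR correction matrix to one pixel.

    `pixel` is (R, G, B, IR) before calibration; `C` is a 4x4 list of
    correction coefficients C11..C44. Returns (RT, GT, BT, IRT).
    """
    return tuple(
        sum(C[i][j] * pixel[j] for j in range(4))
        for i in range(4)
    )

# An identity matrix leaves the pixel unchanged; real coefficients would
# be acquired by shooting color cards under different optical brightness.
identity = [[1 if i == j else 0 for j in range(4)] for i in range(4)]
print(apply_rgbir_correction((100, 120, 90, 40), identity))  # (100, 120, 90, 40)
```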
In the optical recognition system 100, the image signal processor 70 is configured to receive and analyze the RGB image and the IR image outputted by the correction unit 40, thereby providing a brightness parameter Y. The output decision unit 50 is configured to output one of the RGB image and the IR image to the computer visual processing unit 60 based on the brightness parameter Y.
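The decision step may be sketched as a simple threshold on the brightness parameter Y; the threshold value and function name here are hypothetical, as the disclosure does not specify the decision criterion:

```python
def select_output(rgb_image, ir_image, y, y_threshold=50):
    """Select the RGB image in sufficiently bright scenes, the IR image
    otherwise.

    `y` is the brightness parameter from the image signal processor;
    `y_threshold` is a hypothetical cut-off, not specified in this
    disclosure.
    """
    return rgb_image if y >= y_threshold else ir_image
```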
In the optical recognition system 200, the output decision unit 50 is configured to receive the RGB image and the IR image directly from the correction unit 40. After analyzing the RGB image and the IR image, the output decision unit 50 is configured to output one of the RGB image and the IR image to the computer visual processing unit 60.
In conclusion, the present optical recognition system may be used in computer visual processing, wherein a minimum of two line buffers is used for performing interpolation on RGB images and IR images using the 4×4 kernel image sensor scheme. Therefore, the present invention can provide image characteristics for computers to perform image recognition without the need to implement complicated algorithms.
Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.
Number | Date | Country | Kind |
---|---|---|---
109109643 | Mar 2020 | TW | national |