This application claims priority of Taiwan Patent Application No. 104137821, filed on Nov. 17, 2015, the entirety of which is incorporated by reference herein.
Field of the Invention
The present invention relates to an image generation method, and more particularly, to a method for generating a high-resolution image from known images.
Description of the Related Art
With the progress and development of technology, most handheld electronic devices, such as mobile phones, personal digital assistants and tablets, are equipped with a camera unit to implement a camera function. In general, the camera module embedded in a handheld electronic device cannot be replaced, which means that the resolution of the images taken by the camera module cannot be changed. This is inconvenient for users who require high-resolution images.
With the popularity of 3D images, more and more handheld electronic devices are equipped with a dual-lens camera module. The invention uses two images captured by the dual-lens camera module to generate a high-resolution image.
An embodiment of the invention provides an electronic device including a camera module, a control unit and a computing unit. The control unit controls the camera module to capture a first image and a second image. The computing unit receives the first image and the second image to generate a third image, wherein a resolution of the third image is higher than a resolution of the first image and the second image.
Another embodiment of the invention provides an image processing method for an electronic device with a camera module, comprising the steps of: capturing a first image and a second image via the camera module; applying an image comparison procedure to the first image and the second image to find a corresponding relation between the first image and the second image; and combining the first image and the second image according to the corresponding relation to generate a third image, wherein the resolution of the third image is higher than the resolutions of the first image and the second image.
A detailed description is given in the following embodiments with reference to the accompanying drawings.
The present invention can be more fully understood by reading the subsequent detailed description and examples with references made to the accompanying drawings, wherein:
The following description is of the best-contemplated mode of carrying out the invention. This description is made for the purpose of illustrating the general principles of the invention and should not be taken in a limiting sense. The scope of the invention is best determined by reference to the appended claims.
The image sensor 13 transmits the first image and the second image to the computing unit 14 to generate a third image. In one embodiment, the computing unit 14 first applies an image comparison procedure to the first image and the second image to find a corresponding relation between the first image and the second image, and then combines the first image and the second image according to the corresponding relation to generate the third image.
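The text does not prescribe a particular comparison algorithm, so the following is only a minimal sketch of one possible image comparison procedure, assuming grayscale images held in NumPy arrays of equal size. It models the corresponding relation as a single global offset found by brute-force search; the function name and the size of the search window are illustrative.

```python
import numpy as np

def find_corresponding_relation(first, second, max_shift=8):
    """Return the (dy, dx) offset that best aligns `second` to `first`.

    Both inputs are assumed to be grayscale images of the same shape.
    The offset minimizing the mean squared difference of the overlapping
    regions is taken as the corresponding relation between the images.
    """
    h, w = first.shape
    best_offset, best_cost = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Overlapping regions of the two images for this candidate offset.
            a = first[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)]
            b = second[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
            cost = np.mean((a.astype(float) - b.astype(float)) ** 2)
            if cost < best_cost:
                best_cost, best_offset = cost, (dy, dx)
    return best_offset
```

With such an offset in hand, the combination step knows how the pixels of the second image line up with the pixels of the first image before the two are merged into the third image.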
In one embodiment, the computing unit 14 may be a graphics processor, a controller of the electronic device, or software or firmware executed by a processor or a controller. To clearly explain the operation of the computing unit 14, please refer to the accompanying figures.
Then the computing unit 14 generates the pixel values of the unknown pixels according to the pixel values of the known pixels (labeled as P(i,j) in the corresponding figure).
If the pixel X(i,j) is on one of the four edges of the third image, such as in the first pixel column or the first pixel row, the pixel value of the pixel X(i,j) is generated by one of the following equations:
X(i,j)=(1/3)*[P(i−1,j)+P(i+1,j)+P(i,j±1)], or
X(i,j)=(1/3)*[P(i,j−1)+P(i,j+1)+P(i±1,j)]
If the pixel X(i,j) is at one of the four corners of the third image, such as the pixel X(2m,0) or X(0,2n), the pixel value of the pixel X(i,j) is generated by the following equation:
X(i,j)=(1/2)*[P(i±1,j)+P(i,j±1)]
If the pixel X(i,j) is at an internal position of the third image, the pixel value of the pixel X(i,j) is generated by the following equation:
X(i,j)=(1/4)*[P(i+1,j)+P(i−1,j)+P(i,j+1)+P(i,j−1)]
According to the paragraphs above, the invention can combine two images with resolution (N×M) to generate the third image with resolution (2N×2M).
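The following is a sketch of the neighbor-averaging rules above, written for NumPy arrays. It assumes the known pixels P(i,j) taken from the two source images occupy a checkerboard (quincunx) pattern of the 2N×2M grid; this layout is an assumption, since the referenced figures are not reproduced here. Each unknown pixel X(i,j) is set to the mean of its available known neighbors: four in the interior, three on an edge and two at a corner, matching the equations above.

```python
import numpy as np

def fill_unknown_pixels(grid, known_mask):
    """grid: 2N x 2M array holding the known pixel values P(i,j);
    known_mask: boolean array, True where the pixel is known."""
    out = grid.astype(float)
    rows, cols = grid.shape
    for i in range(rows):
        for j in range(cols):
            if known_mask[i, j]:
                continue  # already a known pixel P(i, j)
            neighbors = []
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols and known_mask[ni, nj]:
                    neighbors.append(out[ni, nj])
            # 4 known neighbors in the interior, 3 on an edge, 2 at a corner
            # (the checkerboard assumption guarantees at least 2).
            out[i, j] = sum(neighbors) / len(neighbors)
    return out
```

One layout consistent with the corner examples X(2m,0) and X(0,2n) places the known pixels at positions where i+j is odd, for instance the first image at even rows and odd columns and the second image at odd rows and even columns, so that every unknown pixel has two, three or four known neighbors, exactly the corner, edge and interior cases above.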
Furthermore, the pixel value generation for the pixel X(i,j) above is for illustration only and does not limit the invention thereto. A person skilled in the art can apply weight values, determined according to the view-angle difference between the first image and the second image, to generate the pixel value of the pixel X(i,j).
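As a sketch of that weighted variant, the routine below down-weights known neighbors that originate from the second image. The mapping from the view-angle difference to a weight is not specified in the text, so the parameter second_image_weight and the mask from_second_mask are hypothetical stand-ins used only for illustration.

```python
def weighted_fill(grid, known_mask, from_second_mask, second_image_weight=0.8):
    """Like the averaging sketch above, but known neighbors coming from the
    second image contribute with weight `second_image_weight` (hypothetical),
    while neighbors from the first image contribute with weight 1."""
    out = grid.astype(float)
    rows, cols = grid.shape
    for i in range(rows):
        for j in range(cols):
            if known_mask[i, j]:
                continue
            total, weight_sum = 0.0, 0.0
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols and known_mask[ni, nj]:
                    w = second_image_weight if from_second_mask[ni, nj] else 1.0
                    total += w * out[ni, nj]
                    weight_sum += w
            out[i, j] = total / weight_sum  # weighted average of known neighbors
    return out
```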
In another embodiment, the third image is generated by arranging each pixel column of the first image and each pixel column of the second image alternately or by arranging each pixel row of the first image and each pixel row of the second image alternately. For example, the first pixel column of the third image is the first pixel column of the first image, the second pixel column of the third image is the first pixel column of the second image and so on. In another example, the first pixel row of the third image is the first pixel row of the first image, the second pixel row of the third image is the first pixel row of the second image and so on.
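A sketch of this interleaving arrangement for NumPy arrays of equal size (N×M) is shown below; the column variant produces an N×2M image and the row variant a 2N×M image, each doubling the resolution along one axis.

```python
import numpy as np

def interleave_columns(first, second):
    """1st column of the result is the 1st column of `first`, 2nd column is
    the 1st column of `second`, and so on."""
    rows, cols = first.shape[:2]
    third = np.empty((rows, 2 * cols) + first.shape[2:], dtype=first.dtype)
    third[:, 0::2] = first   # odd-numbered columns of the third image
    third[:, 1::2] = second  # even-numbered columns of the third image
    return third

def interleave_rows(first, second):
    """Same arrangement applied to pixel rows instead of pixel columns."""
    rows, cols = first.shape[:2]
    third = np.empty((2 * rows, cols) + first.shape[2:], dtype=first.dtype)
    third[0::2] = first      # odd-numbered rows of the third image
    third[1::2] = second     # even-numbered rows of the third image
    return third
```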
In this embodiment, the resolution of the images captured by the first camera module 66 and the second camera module 67 is (N×M). The control unit 61 may be implemented by a controller or a processor to control the first camera module 66 and the second camera module 67, such as their focus, shutter or aperture ratio. In another embodiment, the control unit 61 handles all the operations of the electronic device 60.
The memory unit 63 stores the first image and the second image captured by the first camera module 66 and the second camera module 67. Furthermore, the memory unit 63 stores programs executed by the control unit 61. The I/O unit 65 is provided for the user to input data, control the electronic device 60 or output the data of the electronic device 60. In one embodiment, the I/O unit 65 generates a control interface displayed on the display unit 64. In this embodiment, the display unit 64 shows the images captured by the first camera module 66 and the second camera module 67 or the image generated by the computing unit 62.
The computing unit 62 receives the first image and the second image captured by the first camera module 66 and the second camera module 67, and applies an image comparison procedure to the first image and the second image to find a corresponding relation between the first image and the second image and thereby generate image information, such as that shown in the corresponding figure.
In step S62, the first image and the second image are stored in the memory unit of the electronic device. In step S63, the computing unit of the electronic device accesses the first image and the second image, and applies an image comparison procedure to the first image and the second image to find a corresponding relation between the first image and the second image, such as the image information shown in the corresponding figure.
In step S64, the third image is stored in the memory of the electronic device and displayed by the display device of the electronic device. Note that when the user chooses to have the computing unit of the electronic device generate a high-resolution image, the first image and the second image are not displayed on the display device of the electronic device. In step S65, the third image is displayed by the display device of the electronic device or transmitted to another device for display.
In this embodiment, the resolution of the first image or the second image captured by the camera module 76 is (N×M). The control unit 71 is implemented by a controller or a processor to control the camera module 76, such as its focus, shutter or aperture ratio. In another embodiment, the control unit 71 handles all the operations of the electronic device 70. In this embodiment, the control unit 71 controls the camera module 76 to capture the first image at a first time point. Then, the control unit 71 transmits a control signal to the MEMS 77 to shift the camera module 76 by a predetermined distance or angle, and the camera module 76 captures the second image at a second time point.
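The capture sequence of this embodiment can be sketched as follows. The camera and mems objects and their methods are hypothetical stand-ins for the drivers of the camera module 76 and the MEMS 77; only the order of operations follows the text.

```python
import time

def capture_image_pair(camera, mems, shift_distance):
    """Capture two images, shifting the camera module between captures."""
    first_image = camera.capture()   # first image at the first time point
    mems.shift(shift_distance)       # shift by a predetermined distance or angle
    time.sleep(0.01)                 # allow the actuator to settle (assumed delay)
    second_image = camera.capture()  # second image at the second time point
    return first_image, second_image
```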
The memory unit 73 stores the first image and the second image captured by the camera module 76. Furthermore, the memory unit 73 stores programs executed by the control unit 71. The I/O unit 75 is provided for the user to input data, control the electronic device 70 or output the data of the electronic device 70. In one embodiment, the I/O unit 75 generates a control interface displayed on the display unit 74. In this embodiment, the display unit 74 shows the images captured by the camera module 76 or the image generated by the computing unit 72.
The computing unit 72 receives the first image and the second image captured by the camera module 76, and applies an image comparison procedure to the first image and the second image to find a corresponding relation between the first image and the second image and thereby generate image information, such as that shown in the corresponding figure.
In step S73, the first image and the second image are stored in the memory of the electronic device. In step S74, the computing unit accesses the first image and the second image, and applies an image comparison procedure to the first image and the second image to find a corresponding relation between the first image and the second image, such as the image information shown in the corresponding figure.
In step S75, the third image is stored in the memory of the electronic device and displayed by the display device of the electronic device. Note that when the user chooses to have the computing unit of the electronic device generate a high-resolution image, the first image and the second image are not displayed on the display device of the electronic device. In step S76, the third image is displayed by the display device of the electronic device or transmitted to another device for display.
In this embodiment, the resolution of the images captured by the camera module 86 is (N×M). The control unit 81 is implemented by a controller or a processor to control the camera module 86, such as its focus, shutter or aperture ratio. In another embodiment, the control unit 81 handles all the operations of the electronic device 80. In this embodiment, the control unit 81 controls the camera module 86 to capture a first image at a first time point. Then, the control unit 81 transmits a control signal to the MEMS 87 to shift the photosensitive element 88 of the camera module 86 by a predetermined distance or angle, and the camera module 86 captures the second image at a second time point.
The memory unit 83 stores the first image and the second image captured by the camera module 86. Furthermore, the memory unit 83 stores programs executed by the control unit 81. The I/O unit 85 is provided for the user to input data, control the electronic device 80 or output the data of the electronic device 80. In one embodiment, the I/O unit 85 generates a control interface displayed on the display unit 84. In this embodiment, the display unit 84 shows the images captured by the camera module 86 or the image generated by the computing unit 82.
The computing unit 82 receives the first image and the second image captured by the camera module 86, and applies an image comparison procedure to the first image and the second image to find a corresponding relation between the first image and the second image and thereby generate image information, such as that shown in the corresponding figure.
In step S83, the first image and the second image are stored in the memory of the electronic device. In step S84, the computing unit accesses the first image and the second image, and applies an image comparison procedure to the first image and the second image to find a corresponding relation between the first image and the second image, such as the image information shown in the corresponding figure.
In step S85, the third image is stored in the memory of the electronic device and displayed by the display device of the electronic device. Note that when the user chooses to have the computing unit of the electronic device generate a high-resolution image, the first image and the second image are not displayed on the display device of the electronic device. In step S86, the third image is displayed by the display device of the electronic device or transmitted to another device for display.
While the invention has been described by way of example and in terms of the preferred embodiments, it is to be understood that the invention is not limited to the disclosed embodiments. To the contrary, it is intended to cover various modifications and similar arrangements (as would be apparent to those skilled in the art). Therefore, the scope of the appended claims should be accorded the broadest interpretation so as to encompass all such modifications and similar arrangements.
Foreign Application Priority Data

Number | Date | Country | Kind
---|---|---|---
104137821 A | Nov 2015 | TW | national

References Cited: U.S. Patent Documents

Number | Name | Date | Kind
---|---|---|---
20160212332 | Tang | Jul 2016 | A1

References Cited: Foreign Patent Documents

Number | Date | Country
---|---|---
462567 | Nov 2001 | TW
569617 | Jan 2004 | TW

References Cited: Other Publications

Taiwan Patent Office, Office Action, Patent Application Serial No. 104137821, dated May 23, 2016, Taiwan.

Publication Data

Number | Date | Country
---|---|---
20170142346 A1 | May 2017 | US