The present invention relates generally to imaging lenses. More particularly, this invention relates to a lens module and to a system for producing an image that incorporates such a lens module.
A related camera module employs many lens elements to correct field curvature and distortion; a high-quality lens may contain as many as seven elements. With seven elements, the thickness of such a camera module usually exceeds 5 mm. Related camera modules therefore have thick lens stacks, which are ill-suited to new mobile terminals, which become thinner and thinner.
Therefore, it is necessary to provide a new lens module and a system that solve the above-mentioned problem.
In one aspect of the present disclosure, a lens module comprises: a printed circuit board; a spacer attached onto the printed circuit board, the spacer having a through hole; a lens assembly supported by the spacer and covering the through hole; and an image sensor mounted on the printed circuit board and electrically connected with the printed circuit board; wherein the lens assembly is configured to capture light reflected by an object and transmit the light to the image sensor, and the image sensor is configured to convert the light into raw image signals.
In another aspect of the present disclosure, a system for producing an image comprises: a lens module comprising: a printed circuit board; a spacer attached onto the printed circuit board, the spacer having a through hole; a lens assembly supported by the spacer and covering the through hole; and an image sensor mounted on the printed circuit board and electrically connected with the printed circuit board; wherein the lens assembly is configured to capture light reflected by an object and transmit the light to the image sensor, and the image sensor is configured to convert the light into raw image signals; a processor electrically connected with the image sensor, the processor configured to receive the raw image signals from the image sensor and to pre-store object image signals; and a neural network coupled to the processor and configured to: receive the raw image signals; receive the object image signals; be trained to correct the raw image signals according to the object image signals; and output an object image.
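The correction step recited above (a neural network trained so that the corrected raw image signals approach the pre-stored object image signals) can be sketched in miniature. The sketch below is purely illustrative and not part of the disclosure: a single linear layer trained by gradient descent on synthetic, flattened images stands in for the neural network, and the distortion matrix is an assumed toy model of the optics.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix = 16  # pixels per (flattened) toy image

# Pre-stored "object image signals" (training targets) and simulated
# "raw image signals" produced by a small, unknown linear distortion.
target = rng.standard_normal((200, n_pix))
distortion = np.eye(n_pix) + 0.01 * rng.random((n_pix, n_pix))
raw = target @ distortion

def mse(a, b):
    return float(np.mean((a - b) ** 2))

# One linear layer stands in for the neural network; training drives
# the corrected output toward the object image signals.
W = np.eye(n_pix)
lr = 0.1
before = mse(raw, target)
for _ in range(500):
    grad = raw.T @ (raw @ W - target) / len(raw)  # gradient of the MSE loss
    W -= lr * grad
after = mse(raw @ W, target)  # error of the corrected output
```

After training, the learned weights approximate the inverse of the assumed distortion, so the corrected output closely matches the pre-stored object images; a practical implementation would use a deeper network operating on two-dimensional image data.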
Hereinafter, the present invention will be further described with reference to the accompanying drawings and embodiments.
As shown in
The spacer 12 may be cylindrical, and the through hole 120 penetrates through the spacer 12. One end of the spacer 12 is attached onto the circuit board 11, and the other end of the spacer 12 is configured to support the lens assembly 13.
The lens assembly 13 includes a first lens 131, a second lens 133 and an aperture sheet 132 sandwiched between the first lens 131 and the second lens 133. The first lens 131 and the second lens 133 each have a fixing portion 1311, 1331 fixed to the end of the spacer 12 and an imaging portion 1312, 1332 for capturing the light. The fixing portions 1311, 1331 surround the imaging portions 1312, 1332, respectively. The aperture sheet 132 defines an aperture window 1320 for transmitting the light, corresponding to the imaging portions 1312, 1332. The aperture sheet 132 may be coated with an infrared-cut filter coating. The lens array may be curved. The aperture window 1320 may be a transparent area made of transparent material.
With such a configuration, the lens module 100 has only two lenses, and the image captured by the lens module is then processed by a neural network. More field curvature and distortion can be tolerated because of the neural-network processing. Thus, the lens module can be made thinner.
In this embodiment, as shown in
In EMBODIMENT 2, the lens assembly 23 further includes a micro-lens array 234. The micro-lens array 234 includes a plurality of micro-lenses 2342 and a periphery 2341 surrounding the micro-lenses 2342. The micro-lens array 234 is received in the through hole 220 of the spacer 22 and fixed to the spacer 22 via the periphery 2341. The micro-lens array 234 is spaced apart from the second lens 233, with the micro-lenses 2342 disposed corresponding to the imaging portions 2312, 2332. The incident light from the first lens 231 and the second lens 233 is captured by the micro-lenses 2342 and then transmitted to the image sensor 24.
With such a configuration, the images captured by the micro-lenses can be combined into a composite image by the neural network, which allows more field curvature and distortion to be tolerated.
In this embodiment, as shown in
In EMBODIMENT 3, the lens assembly 33 includes only the micro-lens array 334. The micro-lens array 334 includes the plurality of micro-lenses 3342 and the periphery 3341 surrounding the micro-lenses 3342. The micro-lens array 334 may serve as the aperture sheet. The micro-lenses 3342 are received in the through hole 320 and correspond to the image sensor 34. In this embodiment, the image sensor 34 is a flat image sensor. The incident light is captured by the micro-lenses 3342 and then transmitted to the image sensor 34. The image sensor 34 obtains a plurality of image signals, and the individual images formed through the micro-lens array can then be combined into the composite image by the neural network.
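The combination of per-micro-lens images into one composite can likewise be sketched. As a purely illustrative stand-in for the neural-network combination (not the disclosed method), the snippet below averages several aligned, noisy sub-images, one per micro-lens, into a composite whose error is lower than that of any single sub-image.

```python
import numpy as np

rng = np.random.default_rng(1)
scene = rng.random((32, 32))  # synthetic ground-truth scene patch

# Assumption: each micro-lens images the same aligned patch with
# independent sensor noise.
sub_images = [scene + 0.05 * rng.standard_normal(scene.shape) for _ in range(9)]

# Averaging stands in for the learned combination into a composite image.
composite = np.mean(sub_images, axis=0)

err_single = float(np.mean((sub_images[0] - scene) ** 2))
err_composite = float(np.mean((composite - scene) ** 2))
```

Note that plain averaging only works for aligned views; off-axis micro-lenses would first require registration, which is one reason a trained network is attractive for this step.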
In this embodiment, as shown in
In this embodiment, the lens assembly 43 includes the micro-lens array 434 and the aperture sheet 433. The micro-lens array 434 includes a plurality of micro-lenses 4342 and the periphery 4341 surrounding the micro-lenses 4342. The micro-lens array 434 is fixed to the spacer 42 via the periphery 4341, with the micro-lenses 4342 received in the through hole 420 of the spacer 42. The aperture sheet 433 is attached on a side of the micro-lens array 434 close to the image sensor 44. The aperture sheet 433 also has a plurality of windows 4330 for passing the incident light, each window disposed corresponding to one of the micro-lenses 4342. The windows 4330 are spaced apart from each other at an even interval. In some use cases, the scaling of the micro-lenses 4342 and the windows 4330 can vary, which may allow more light to reach the corresponding surface area of the image sensor 44.
In this embodiment, as shown in
In this embodiment, the lens assembly 53 includes two micro-lens arrays 534 and the aperture sheet 533. The aperture sheet 533 is sandwiched between the two micro-lens arrays 534. Each micro-lens array 534 has a plurality of micro-lenses 5342 and the periphery 5341 surrounding the micro-lenses 5342. The micro-lenses 5342 are received in the through hole 520, and the peripheries 5341 are fixed to the spacer 52. The micro-lenses of one micro-lens array face the micro-lenses of the other micro-lens array one by one. The aperture sheet 533 has a plurality of windows 5330, which correspond to the micro-lenses one by one. The incident light is captured by the micro-lenses, passes through the windows, is captured by the micro-lenses of the other micro-lens array, and is then transmitted to the image sensor 54.
In this embodiment, as shown in
In this embodiment, the image sensor 64 is curved. The image sensor 64 has a plurality of curved surfaces 641. The incident light transmitted via the lens assembly 63 is sensed by the plurality of curved surfaces 641, producing a large number of image signals. The neural network processes these image signals and combines them into a composite image. The image sensor 64 may be a mosaic image sensor.
In this embodiment, as shown in
In this embodiment, the lens assembly 73 includes the first lens 731, a third lens 736 and the micro-lens array 734. The shape of the first lens 731 can be controlled by a piezo-film or SMA (shape-memory alloy) actuator. The first lens 731 is fixed to the spacer 72 via the fixing portion 7311, with the imaging portion 7312 received in the through hole 720. The lens assembly 73 can perform an autofocus (AF) function because the actuator reshapes (changes the curvature of) the first lens 731 and the third lens 736. The third lens 736 is a cavity filled with a flexible optical material (e.g., a liquid polymer gel). The third lens 736 is sandwiched between the first lens 731 and the micro-lens array 734. The fixing portion 7311 of the first lens 731 carries the piezo-film or SMA actuator to perform the AF function and correct lens errors. The micro-lens array 734 has a plurality of micro-lenses 7342 and the periphery 7341 surrounding the micro-lenses 7342. The micro-lenses 7342 are disposed corresponding to the imaging portion 7312. In this embodiment, the micro-lens array 734 is used as the aperture sheet.
In this embodiment, as shown in
In this embodiment, the lens assembly 83 includes the first lens 831, the third lens 836, the micro-lens array 834 and the aperture sheet 833. The first lens 831, the third lens 836, the micro-lens array 834 and the aperture sheet 833 are sequentially stacked upon each other in a direction toward the image sensor 84. The aperture sheet 833 has a plurality of windows 8330, each window 8330 facing one of the micro-lenses 8342. The incident light is captured by the first lens 831, passes through the third lens 836 and the micro-lenses 8342, and is incident on the image sensor 84 via the windows 8330. The aperture sheet 833 may be coated with an infrared-cut filter coating.
The individual images captured by the micro-lenses can be processed by the neural network, which combines them into a composite image.
In the present invention, as shown in
The processor 200 and the neural network 300 operate as follows:
S1: a number of raw image signals are captured by the above-mentioned lens module 100 and then sent to the processor 200; object image signals are also pre-stored in the processor 200;
S2: the neural network processes all of the image signals and combines them into a composite image; specifically, the neural network is trained until it produces images similar to the object image;
S3: the composite image is displayed on an electronic terminal, such as a mobile phone. The composite image processed by the neural network exhibits less blur and distortion.
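The three steps above can be strung together as a toy pipeline. All function names below, and the per-pixel median used for the combination, are illustrative assumptions; they are not the claimed processor or neural network.

```python
import numpy as np

rng = np.random.default_rng(2)

def capture(scene, n_views=4, noise=0.05):
    """S1 (toy): the lens module yields several noisy raw image signals."""
    return [scene + noise * rng.standard_normal(scene.shape) for _ in range(n_views)]

def combine(raws):
    """S2 (toy): a per-pixel median stands in for the trained network."""
    return np.median(raws, axis=0)

def to_display(img):
    """S3 (toy): quantise the composite for an 8-bit display."""
    return np.clip(np.rint(img * 255), 0, 255).astype(np.uint8)

scene = rng.random((16, 16))     # synthetic ground truth
raws = capture(scene)
composite = combine(raws)        # lower error than any single raw view
frame = to_display(composite)    # 8-bit frame for the electronic terminal
```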
In the present invention, the lens module may be a simple optics module, thinner than a related lens module. The lens module has built-in removal of the field curvature and distortion of the optics, which enables images with narrow fields of view (FOV) to be stitched into a full-FOV composite image.
The above describes only embodiments of the present invention and does not limit the patent scope of the present invention; equivalent structures or equivalent process transformations made by utilizing the contents of the present specification and drawings, applied directly or indirectly in other related technical fields, are likewise included within the scope of patent protection of the present invention.
Number | Date | Country
---|---|---
20220082820 A1 | Mar 2022 | US