This disclosure relates generally to the field of optics, and in particular but not exclusively, relates to fixed focus lens assemblies and camera modules.
Conventional digital imaging devices or cameras often include a lens assembly (which includes multiple lens elements) that focuses image light onto an image sensor that measures the image light and generates an image based on the measurements.
Lens assembly 115 has a z-axis height H1, which is also referred to as the optical total track length (“TTL”). The optical TTL is typically influenced by the field of view (“FOV”) and the size of image sensor 110, as well as other design choices. A larger optical TTL may limit the applications or uses of digital imaging device 100 because of space constraints. Therefore, a lens assembly that allows a digital imaging device to capture high resolution images with a large FOV while reducing the optical TTL can be advantageous. A lens assembly that provides a large depth of field (“DOF”), in which both far and near objects are brought into focus, is also desirable.
Non-limiting and non-exhaustive embodiments of the invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles being described.
Embodiments of a system and apparatus for a lens assembly that provides improved near-field image recognition capabilities with a large field of view (“FOV”) and short optical total track length (“TTL”) are described herein. In the following description numerous specific details are set forth to provide a thorough understanding of the embodiments. One skilled in the relevant art will recognize, however, that the techniques described herein can be practiced without one or more of the specific details, or with other methods, components, materials, etc. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Head wearable displays are becoming portable computing devices that are used for everyday tasks such as capturing far-field (e.g., greater than 1 m object distance) full color pictures for personal use (e.g., memory savers, social media sharing, etc.) and capturing near-field (e.g., less than 1 m) images for image recognition tasks. Near-field image recognition tasks can include optical character recognition (“OCR”), bar code scanning, 2D code recognition (e.g., QR codes, data matrix codes, etc.), feature recognition, object recognition, etc. Accordingly, a camera module capable of acquiring high quality full color far-field images while also being capable of acquiring high quality near-field images acceptable for image recognition is desirable for head wearable displays.
Camera modules for use in head wearable displays should also have a FOV ranging between 80 and 110 degrees (and in particular between 88 and 100 degrees). In contrast, cell phone cameras typically have a FOV in the 64 to 75 degree range and do not have form factor constraints as tight. Other conventional lens assemblies, such as fisheye lenses, are designed to have very wide fields of view, typically above 120 degrees, but are not well suited for typical wearable computing tasks. Finally, such camera modules should also be compact and lightweight. As such, fixed focus lenses provide the smallest and lightest form factors.
Lens assembly 210 provides a fixed focus camera module 200 with a large FOV in a compact form factor along the z-axis (the axis running parallel to the depth of field) while achieving good optical characteristics (e.g., acceptable optical distortion, well controlled field curvatures along the tangential and sagittal directions, well controlled lateral color, etc.). Various embodiments of lens assembly 210 may have a FOV ranging between 80 and 110 degrees. In one embodiment, camera module 200 has a diagonal FOV of 90 degrees with an optical TTL of 4.09 mm for image sensor 220 having a full image circle size of 5.6 mm, and stop aperture 205 providing an F-number of 2.4. In this embodiment, IRCF 215 is implemented as a blue glass IRCF having a thickness of 0.25 mm. Of course, camera module 200 may be implemented with its constituent components having other dimensions. For example, the F-number may typically vary between 2.0 and 2.4 for use in a head wearable display, though greater variances may be implemented.
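As a rough paraxial check of the figures above, the diagonal FOV and full image circle size imply an effective focal length, and the F-number then gives an entrance pupil diameter. The sketch below assumes an ideal, distortion-free pinhole model (not the actual lens prescription), so the numbers are only approximate:

```python
import math

# Rough paraxial sketch (assuming an ideal, distortion-free lens): relate
# the 90 degree diagonal FOV and 5.6 mm image circle quoted above to an
# effective focal length, then use F/2.4 to estimate the pupil diameter.

def effective_focal_length_mm(image_circle_mm, diag_fov_deg):
    """EFL such that half the image circle subtends half the diagonal FOV."""
    return (image_circle_mm / 2.0) / math.tan(math.radians(diag_fov_deg) / 2.0)

efl = effective_focal_length_mm(5.6, 90.0)  # ~2.8 mm
pupil = efl / 2.4                           # F-number 2.4 -> ~1.17 mm
print(round(efl, 2), round(pupil, 2))
```
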
Lens assembly 210 is purposefully designed to induce axial chromatic aberration (see
In one embodiment, lens assembly 210 is designed to provide an object focal distance for green light that is greater than twice an object focal distance for blue light. For example, green light may have an object focal distance of approximately 1 m, while blue light has an object focal distance of approximately 0.4 m. Of course, lens assembly 210 may be designed with other object focal distances, but typically the object focal distance for green light will range between 0.7 m to 1.8 m, while the object focal distance for blue light will range between 0.2 m to 0.6 m to provide the above recited near-field and far-field characteristics in a fixed focus camera module. Thus, the blue channel with its shorter focal distance provides a sharper image in the near-field than the green channel with its longer focal distance.
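The channel selection implied by these focal distances can be sketched as follows; this is an illustrative model only, using the approximate example distances above (the function name and dictionary values are assumptions for illustration, not the patent's implementation):

```python
# Illustrative sketch (not from the patent): pick the color channel whose
# object focal distance is closest to a given object distance, using the
# example distances described above (green ~1 m, blue ~0.4 m).

CHANNEL_FOCAL_DISTANCE_M = {"green": 1.0, "blue": 0.4}

def sharpest_channel(object_distance_m):
    """Return the channel whose focal distance best matches the object
    distance -- a simple proxy for which channel is in best focus."""
    return min(CHANNEL_FOCAL_DISTANCE_M,
               key=lambda ch: abs(CHANNEL_FOCAL_DISTANCE_M[ch] - object_distance_m))

print(sharpest_channel(0.3))  # near-field object -> 'blue'
print(sharpest_channel(2.0))  # far-field object  -> 'green'
```
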
Returning to
In the illustrated embodiment, lens L1 has positive optical power, lens L2 has positive optical power, lens L3 has negative optical power, lens L4 has positive optical power, lens L5 has positive optical power, and lens L6 has negative optical power. Thus, lens assembly 210 includes six total lenses with four lenses having varying degrees of positive optical power and two lenses having varying degrees of negative optical power. Accordingly, the total optical power of lens assembly 210 is distributed across the six lenses L1 to L6. The illustrated embodiments of lenses L1 to L6 are discrete elements with intervening air gaps. These discrete lenses can be fabricated of a variety of materials (e.g., plastic, glass, etc.). In one embodiment, lenses L1 to L6 are fabricated of injection molded plastics for high volume manufacturing.
Lens L1 is the first inline lens in the optical train and is positive, contributing to the total positive optical power. Lens L1 operates to reduce the large ray angle of the upper marginal ray for large off-axis field heights. This reduction of ray angle helps to reduce optical aberration for the upper marginal rays for the large off-axis field heights. The Abbe number is a measure of a transparent material's dispersion in relation to its index of refraction. In one embodiment, lens L1 is made of a material having the lowest Abbe number of the lenses with positive optical power, thereby contributing substantially to axial color aberration.
Lens L2 is the second inline lens in the optical train and is strongly positive, therefore contributing to the total positive optical power. Lens L2 is designed to have a high Abbe number thereby reducing its contribution to axial color aberration (dispersion).
Lens L3 is the third inline lens in the optical train and has negative optical power. The negative optical power of lens L3 reduces the field curvature of lens assembly 210. Lens L3 may also be made of a material having a low Abbe number, which partially offsets the axial color aberration induced by the positive power lenses.
Lens L4 is the fourth inline lens in the optical train and is weakly positive, but contributes to the overall positive optical power. In one embodiment, lens L4 has the weakest positive power of all the positive lenses.
Lens L5 is the fifth inline lens in the optical train and has positive optical power, contributing to the overall positive optical power. Lens L5 is strongly positive. In one embodiment, lens L5 has the strongest positive optical power (i.e., shortest focal length of the positive lenses). Lens L5 is made of a material having a high Abbe number thereby reducing its contribution to axial color aberration.
Lens L6 is the sixth inline lens in the optical train and has negative optical power. In one embodiment, lens L6 has the strongest negative optical power of the lenses having negative optical power. In the illustrated embodiment, lens L6 is the largest lens in lens assembly 210. Lens L6 has an inflection point in the curvature of surface S13 and no inflection point in the curvature of surface S12. Lens L6 operates as a field corrector. The rays from different field heights fall on different regions of lens L6, which serves to correct field curvature, control optical distortion, and control the chief ray angle in the image space. For some implementations of image sensor 220 (e.g., CMOS image sensors), the chief ray angle in the image space should be kept below 32 degrees to maintain desirable quantum efficiency and low cross-talk. To achieve this, the chief ray angle for large field heights should be constrained. Lens L6 serves as a field corrector to maintain reasonable chief ray angles.
In the illustrated embodiment, lens L6 is the largest lens and larger than lens L5. The diameter of lens L6 is sufficiently large while the diameter of lens L5 is sufficiently small, relative to lens L6, such that the convex shape of surface S11 extends into a recess formed by the concave surface S12 of lens L6. This design feature contributes to the overall compactness of the optical TTL.
IRCF 215 may be implemented using a variety of different types of filters to cut out the infrared spectrum. For example, IRCF 215 may be implemented as a pigmented or absorptive color filter (e.g., a blue glass filter) or an interference filter. However, since an interference filter operates by reflecting the IR wavelengths back into lens assembly 210, these reflections may bounce back towards image sensor 220 off the refractive index interfaces at each lens surface. Accordingly, an absorptive type IRCF may be more effective at removing infrared wavelengths, which are not visible to the human eye but may nevertheless be picked up by image sensor 220. In one embodiment, IRCF 215 is a blue glass infrared cut filter having a thickness of 0.25 mm.
Image sensor 220 is positioned such that its light sensitive surface S16 is coincident with the image plane 230 of lens assembly 210. Image sensor 220 may be implemented using a variety of technologies including charged coupled devices (“CCD”) sensors or complementary metal-oxide-semiconductor (“CMOS”) sensors. In one embodiment, image sensor 220 is a 1/3.2″ 5 megapixel CMOS sensor.
Z(r) = cr² / (1 + √(1 − (1 + k)c²r²)),

where c represents the curvature of the surface (the reciprocal of its radius of curvature), k represents the conic constant, and r represents the radial distance from the optical axis.
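Using only the parameters defined above (c, k, and r), and omitting any higher-order aspheric coefficients, the conic portion of the standard surface sag equation can be sketched as:

```python
import math

# Sketch of the conic term of the standard asphere sag equation, using
# only the parameters defined in the text above: c = curvature, k = conic
# constant, r = radial distance from the optical axis. Higher-order
# aspheric terms, if any, are omitted.

def conic_sag(c, k, r):
    """Surface sag z(r) of a conic section with curvature c and conic k."""
    return (c * r * r) / (1.0 + math.sqrt(1.0 - (1.0 + k) * c * c * r * r))

# Sanity check: for k = 0 this reduces to a sphere; a sphere of radius
# 2 mm has sag 2 - sqrt(3) at r = 1 mm.
print(round(conic_sag(c=1.0 / 2.0, k=0.0, r=1.0), 4))  # -> 0.2679
```
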
Conventional lenses have an on-axis axial color aberration of approximately 10 μm. Conventional lens designs typically seek to minimize axial color aberration, since it separates the color images and reduces overall image quality.
Lens assembly 910 packages the discrete lens elements (L1 through L6) into a barrel style form factor, which threads into lens holder 905. In one embodiment, lens assembly 910 includes male threads around its perimeter, which mate to female threads on the inside edge of lens holder 905. This thread design facilitates offset adjustment to align the image plane 230 of lens assembly 910 with the light sensitive side S16 of image sensor 925. Lens holder 905 also serves to seal image sensor 925 on top of substrate 930 and prevent dust or other contaminants from accumulating on image sensor 925. It should be appreciated that lens assembly 210 may be implemented in other form factors than the barrel style illustrated in
In one embodiment, controller 1005 includes hardware logic (or executes software logic stored in memory 1010) to identify near-field objects. In one embodiment, the presence of a near-field object may be assumed when a user inputs a request to perform image recognition. Alternatively, image processing techniques may be used to identify when an object is a near-field object upon which image recognition is to be performed. When it is determined that image recognition is to be performed upon a near-field object, controller 1005 ignores the red and green image signal channels output from camera module 200 and analyzes only the blue channel when performing image recognition. When capturing images of far-field objects, all three RGB color channels of the image signal are used.
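The channel-selection behavior described above can be sketched as follows; the image layout (rows of (R, G, B) pixel tuples) and function name are assumptions for illustration, not the controller's actual implementation:

```python
# Illustrative sketch (not the patent's implementation) of the channel
# selection described above: recognition of a near-field object uses only
# the blue channel, while far-field capture uses all three RGB channels.
# The image layout (rows of (R, G, B) pixel tuples) is an assumption.

def recognition_input(rgb_rows, near_field):
    """Return per-pixel data for downstream processing."""
    if near_field:
        # Keep only the blue component of each pixel.
        return [[pixel[2] for pixel in row] for row in rgb_rows]
    return rgb_rows  # full RGB for far-field images

frame = [[(10, 20, 30), (40, 50, 60)],
         [(70, 80, 90), (100, 110, 120)]]
print(recognition_input(frame, near_field=True))   # [[30, 60], [90, 120]]
print(recognition_input(frame, near_field=False))  # full RGB frame
```
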
The see-through displays 1101 are mounted to a frame assembly, which includes a nose bridge 1105, left ear arm 1110, and right ear arm 1115. Camera system 1000 may be disposed in any portion of the frame assembly with a forward facing perspective. In other embodiments, camera system 1000 may be a rear facing camera positioned to capture eye images for eye tracking functionality, in addition to, or in place of, a forward facing camera. Although
The see-through displays 1101 are secured into an eye glass arrangement or head wearable display that can be worn on the head of a user. The left and right ear arms 1110 and 1115 rest over the user's ears while nose bridge 1105 rests over the user's nose. The frame assembly is shaped and sized to position each display in front of a corresponding eye of the user. Other frame assemblies having other shapes may be used (e.g., a visor with ear arms and a nose bridge support, a single contiguous headset member, a headband, goggles type eyewear, etc.).
The above description of illustrated embodiments of the invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize.
These modifications can be made to the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification. Rather, the scope of the invention is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation.
Number | Name | Date | Kind |
---|---|---|---|
RE35310 | Moskovich | Aug 1996 | E |
5959785 | Adachi | Sep 1999 | A |
6804066 | Ha et al. | Oct 2004 | B1 |
7649693 | Kuroda et al. | Jan 2010 | B2 |
7663814 | Kitahara | Feb 2010 | B2 |
8000030 | Tang | Aug 2011 | B2 |
8395852 | Tsai et al. | Mar 2013 | B2 |
8441746 | Hsieh et al. | May 2013 | B2 |
8456758 | Huang et al. | Jun 2013 | B1 |
8456763 | Hsieh et al. | Jun 2013 | B2 |
8462449 | Hsu et al. | Jun 2013 | B2 |
8508865 | Teraoka | Aug 2013 | B2 |
20040165090 | Ning | Aug 2004 | A1 |
20120026285 | Yoshida et al. | Feb 2012 | A1 |
20120188657 | Hsu et al. | Jul 2012 | A1 |
20130002908 | Ben-Eliezer et al. | Jan 2013 | A1 |
20140085513 | Tashiro et al. | Mar 2014 | A1 |
20140085615 | Pretorius et al. | Mar 2014 | A1 |
Entry |
---|
Guichard, F et al. “Extended depth-of-field using sharpness transport across color channels”, SPIE Proceedings vol. 7250, Digital Photography V, 72500N, Jan. 19, 2009, 13 pages. |