The present technology relates to a technical field of an imaging lens including a plurality of lenses and an imaging device in which such imaging lens is used.
As imaging devices such as digital still cameras have become thinner year after year, with card-sized types now widespread, compactness is increasingly required. Furthermore, imaging devices incorporated in mobile phones such as smartphones and in mobile terminals such as tablets are also required to be thin for design reasons and to be compact in order to secure space for the multiple functions used for differentiation. As a result, there is an increasing demand to make the imaging lenses mounted on these imaging devices even more compact.
Furthermore, as imaging elements such as charge coupled devices (CCD) and complementary metal oxide semiconductors (CMOS) become more compact, the number of pixels is increasing owing to the finer pixel pitch of the imaging elements, and correspondingly high performance is required of the imaging lenses used in these imaging devices.
Moreover, there is a demand for a large-diameter bright imaging lens that enables high-sensitivity imaging while preventing deterioration in image quality due to noise in imaging in a dark place.
Therefore, an imaging lens including a plurality of lenses is needed to be both compact and high-performance, and various types including five to seven lenses have been proposed as such imaging lenses (refer to, for example, Patent Documents 1 to 3).
Patent Document 1 discloses a high-resolution imaging lens including five lenses, which achieves the recently required wide angle of view with a half angle of view of 38 degrees.
However, its f-number is about 2.0 to 2.8, and if the f-number is reduced further in the disclosed lens configuration, the single positive lens closest to the object side cannot sufficiently suppress the increase in spherical aberration caused by the larger diameter, so that it is difficult to achieve both reduced eccentric sensitivity and excellent optical performance.
Patent Document 2 discloses a high-resolution imaging lens including six lenses, which achieves the recently required wide angle of view with a half angle of view of 39 to 45 degrees.
However, its f-number is about 1.8 to 2.3, and if the f-number is reduced further in the disclosed lens configuration, the single positive lens closest to the object side cannot sufficiently suppress the increase in spherical aberration caused by the larger diameter, so that it is difficult to achieve both reduced eccentric sensitivity and excellent optical performance.
Patent Document 3 discloses a high-resolution imaging lens including seven lenses, which achieves the recently required low f-number, such as an f-number of 1.6.
However, its half angle of view is about 32 degrees, and the disclosed lens configuration has difficulty fully responding to the recently required wide angle of view.
Therefore, an object of the imaging lens and imaging device of the present technology is to improve performance.
First, an imaging lens according to the present technology is provided with a first lens having a positive refractive power with a convex surface facing an object side, a second lens having a positive refractive power, a third lens having a negative refractive power, a fourth lens having a positive refractive power or a negative refractive power with a concave surface facing the object side, a fifth lens having a positive refractive power or a negative refractive power, a sixth lens having a positive refractive power with a concave surface facing an image side, and a seventh lens having a negative refractive power in the vicinity of an optical axis, a surface on the image side of which is formed into an aspherical shape having an inflection point, in order from the object side to the image side.
Therefore, eccentric sensitivity is reduced and the aberration correction capability is improved.
Second, it is desirable that the above-described imaging lens satisfy following conditional expression (1).
1.0<f1/f<225.0 (1)
where
f is a focal length of an entire system, and
f1 is a focal length of the first lens.
Therefore, the focal length of the first lens is optimized, and an appropriate refractive power with respect to an incident light beam may be obtained.
Third, it is desirable that the above-described imaging lens satisfy following conditional expression (2).
0.7<f2/f<4.0 (2)
where
f is a focal length of an entire system, and
f2 is a focal length of the second lens.
Therefore, the focal length of the second lens is optimized, and an appropriate refractive power with respect to an incident light beam may be obtained.
Fourth, it is desirable that the above-described imaging lens satisfy following conditional expression (3).
4.0<|f45|/f (3)
where
f is a focal length of an entire system, and
f45 is a composite focal length of the fourth and fifth lenses.
Therefore, the composite focal length of the fourth and fifth lenses is optimized, and an appropriate refractive power with respect to an incident light beam may be obtained.
Fifth, it is desirable that the above-described imaging lens satisfy following conditional expression (4).
2.8<f6/f<215.0 (4)
where
f is a focal length of an entire system, and
f6 is a focal length of the sixth lens.
Therefore, the focal length of the sixth lens is optimized, and an appropriate refractive power with respect to an incident light beam may be obtained.
Sixth, it is desirable that the above-described imaging lens satisfy following conditional expression (5).
−46.0<f6/f7<−0.3 (5)
where
f6 is a focal length of the sixth lens, and
f7 is a focal length of the seventh lens.
Therefore, the balance between the focal lengths of the sixth and seventh lenses is optimized, and an appropriate refractive power with respect to an incident light beam may be obtained.
Seventh, it is desirable that the above-described imaging lens satisfy following conditional expression (6).
5.4<|R7/f|<220.0 (6)
where
f is a focal length of an entire system, and
R7 is a curvature radius of a surface on the object side of the fourth lens.
Therefore, the curvature radius of the surface on the object side of the fourth lens is optimized, and an appropriate refractive power of an air lens between the third lens and the fourth lens may be obtained.
Eighth, it is desirable that the above-described imaging lens satisfy following conditional expression (7).
5.8<|(R11+R12)/(R11−R12)|<320.0 (7)
where
R11 is a curvature radius of a surface on the object side of the sixth lens, and
R12 is a curvature radius of a surface on the image side of the sixth lens.
This makes it possible to sufficiently correct spherical aberration and higher-order aberration with respect to an off-axis light beam.
Ninth, it is desirable that the above-described imaging lens satisfy following conditional expression (8).
18.0<νd(L3)<νd(L5)<νd(L6)<30.0 (8)
where
νd (L3) is an Abbe number on a d-line of the third lens,
νd (L5) is an Abbe number on a d-line of the fifth lens, and
νd (L6) is an Abbe number on a d-line of the sixth lens.
This makes it possible to sufficiently correct axial chromatic aberration and magnification chromatic aberration.
Tenth, in the above-described imaging lens, it is desirable that an aperture diaphragm be arranged on the object side of the first lens or between the first lens and the second lens.
Therefore, the aperture diaphragm functions on the object side of the entire system.
Eleventh, in the above-described imaging lens, it is desirable that the sixth lens have an inflection point on a surface on the image side.
This makes it possible to have different aberration correction effects in the vicinity of the optical axis and outside the vicinity of the optical axis.
Twelfth, an imaging device according to the present technology is provided with an imaging lens and an imaging element that converts an optical image formed by the imaging lens into an electric signal, in which the imaging lens is provided with a first lens having a positive refractive power with a convex surface facing an object side, a second lens having a positive refractive power, a third lens having a negative refractive power, a fourth lens having a positive refractive power or a negative refractive power with a concave surface facing the object side, a fifth lens having a positive refractive power or a negative refractive power, a sixth lens having a positive refractive power with a concave surface facing an image side, and a seventh lens having a negative refractive power in the vicinity of an optical axis, a surface on the image side of which is formed into an aspherical shape having an inflection point, in order from the object side to the image side.
Therefore, the eccentric sensitivity is reduced and the aberration correction function is improved in the imaging lens.
A mode for carrying out an imaging lens and an imaging device according to the present technology is hereinafter described.
The imaging lens of the present technology is provided with a first lens having a positive refractive power with a convex surface facing an object side, a second lens having a positive refractive power, a third lens having a negative refractive power, a fourth lens having a positive refractive power or a negative refractive power with a concave surface facing the object side, a fifth lens having a positive refractive power or a negative refractive power, a sixth lens having a positive refractive power with a concave surface facing an image side, and a seventh lens having a negative refractive power in the vicinity of an optical axis, a surface on the image side of which is formed into an aspherical shape having an inflection point in order from the object side to the image side.
In this manner, the imaging lens has a configuration of seven lenses in total, in which each lens is arranged to have an optimal refractive power and is formed into a shape that makes effective use of aspheric surfaces, so that various aberrations can be excellently corrected while a large diameter and miniaturization are secured, and performance is improved.
Furthermore, regarding each lens, it becomes possible to correct the various aberrations more excellently while securing the large diameter and miniaturization by combining optimum glass materials.
Especially, by forming the surface on the object side of the fourth lens into a concave shape, it is possible to excellently correct coma aberration and field curvature while reducing an eccentric sensitivity.
It is desirable that following conditional expression (1) be satisfied in an imaging lens according to one embodiment of the present technology.
1.0<f1/f<225.0 (1)
where
f is a focal length of an entire system, and
f1 is a focal length of the first lens.
Conditional expression (1) is an expression that defines a ratio between the focal length of the first lens and the focal length of the entire system.
If f1/f exceeds the upper limit of conditional expression (1), the focal length of the first lens becomes long and the refractive power with respect to an incident light beam becomes weak, so that the total length of the lens becomes long and it becomes difficult to achieve miniaturization.
On the other hand, if f1/f falls below the lower limit of conditional expression (1), the focal length of the first lens becomes short and the refractive power with respect to the incident light beam becomes strong; the miniaturization may be achieved and the coma aberration may be easily corrected, but the eccentric sensitivity at the time of lens assembly increases.
Therefore, by satisfying conditional expression (1), it is possible to reduce the eccentric sensitivity at the time of assembly of the lens while securing the miniaturization, and secure an excellent optical performance.
Furthermore, in order to further improve the above-described effect, it is desirable that a range of conditional expression (1) be set to a range of following conditional expression (1)′.
2.45<f1/f<43.50 (1)′
where
f is a focal length of an entire system, and
f1 is a focal length of the first lens.
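As a sketch, the bound check in conditional expression (1) and its narrower range (1)' amounts to a strict range test on the ratio f1/f. The focal lengths below are hypothetical placeholder values chosen for illustration, not figures from the numerical examples.

```python
def satisfies(value, lower, upper):
    """Strict range test lower < value < upper, as used by the conditional expressions."""
    return lower < value < upper

# Hypothetical placeholder values (not taken from the numerical examples):
f = 4.6    # focal length of the entire system, mm (assumed)
f1 = 14.0  # focal length of the first lens, mm (assumed)

print(satisfies(f1 / f, 1.0, 225.0))   # conditional expression (1): True for these values
print(satisfies(f1 / f, 2.45, 43.50))  # narrower range (1)': True for these values
```

The same helper applies unchanged to the other ratio-type conditional expressions in this section.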
It is desirable that following conditional expression (2) be satisfied in the imaging lens according to one embodiment of the present technology.
0.7<f2/f<4.0 (2)
where
f is a focal length of an entire system, and
f2 is a focal length of the second lens.
Conditional expression (2) is an expression that defines a ratio between the focal length of the second lens and the focal length of the entire system.
If f2/f exceeds the upper limit of conditional expression (2), the focal length of the second lens becomes long and the refractive power with respect to the incident light beam becomes weak, so that the total length of the lens becomes long and it becomes difficult to achieve the miniaturization.
On the other hand, if f2/f falls below the lower limit of conditional expression (2), the focal length of the second lens becomes short and the refractive power with respect to the incident light beam becomes strong; the miniaturization may be achieved and the coma aberration may be easily corrected, but the eccentric sensitivity at the time of lens assembly increases.
Therefore, by satisfying conditional expression (2), it is possible to reduce the eccentric sensitivity at the time of lens assembly while securing the miniaturization, and secure the excellent optical performance.
Furthermore, in order to further improve the above-described effect, it is desirable that a range of conditional expression (2) be set to a range of following conditional expression (2)′.
0.78<f2/f<1.23 (2)′
where
f is a focal length of an entire system, and
f2 is a focal length of the second lens.
It is desirable that following conditional expression (3) be satisfied in the imaging lens according to one embodiment of the present technology.
4.0<|f45|/f (3)
where
f is a focal length of an entire system, and
f45 is a composite focal length of the fourth and fifth lenses.
Conditional expression (3) is an expression that defines a ratio between the composite focal length of the fourth and fifth lenses and the focal length of the entire system.
If |f45|/f falls below the lower limit of conditional expression (3), the composite focal length of the fourth and fifth lenses becomes short and the refractive power with respect to the incident light beam becomes strong; the miniaturization may be achieved and the coma aberration may be easily corrected, but the eccentric sensitivity at the time of lens assembly increases.
Therefore, by satisfying conditional expression (3), it is possible to reduce the eccentric sensitivity at the time of lens assembly while securing the miniaturization, and secure the excellent optical performance.
Furthermore, in order to further improve the above-described effect, it is desirable that a range of conditional expression (3) be set to a range of following conditional expression (3)′.
4.04<|f45|/f<42.63 (3)′
where
f is a focal length of an entire system, and
f45 is a composite focal length of the fourth and fifth lenses.
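In the paraxial thin-lens approximation, the composite focal length of two lenses can be estimated from their individual focal lengths and the axial separation between them. The actual f45 values come from the lens data in the tables; the figures below are hypothetical and serve only to illustrate the check of conditional expression (3).

```python
def composite_focal_length(fa, fb, d):
    """Paraxial thin-lens combination: 1/f = 1/fa + 1/fb - d/(fa*fb)."""
    return 1.0 / (1.0 / fa + 1.0 / fb - d / (fa * fb))

# Hypothetical values (not from the numerical examples):
f4, f5, d45 = 30.0, -40.0, 0.5  # focal lengths of the fourth and fifth lenses and their separation, mm
f45 = composite_focal_length(f4, f5, d45)

f = 4.6  # assumed focal length of the entire system, mm
print(abs(f45) / f > 4.0)  # conditional expression (3): True for these values
```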
It is desirable that following conditional expression (4) be satisfied in the imaging lens according to one embodiment of the present technology.
2.8<f6/f<215.0 (4)
where
f is a focal length of an entire system, and
f6 is a focal length of the sixth lens.
Conditional expression (4) is an expression that defines a ratio between the focal length of the sixth lens and the focal length of the entire system.
If f6/f exceeds the upper limit of conditional expression (4), the focal length of the sixth lens becomes long and the refractive power with respect to the incident light beam becomes weak, so that the total length of the lens becomes long and it becomes difficult to achieve the miniaturization.
On the other hand, if f6/f falls below the lower limit of conditional expression (4), the focal length of the sixth lens becomes short and the refractive power with respect to the incident light beam becomes strong; the miniaturization may be achieved and the coma aberration may be easily corrected, but the back focus becomes too short and it becomes difficult to secure a space for arranging an infrared cut filter and the like.
Therefore, by satisfying conditional expression (4), it is possible to elongate the back focus to secure a sufficient space for arranging the infrared cut filter and the like while securing the miniaturization, and secure the excellent optical performance.
Furthermore, in order to further improve the above-described effect, it is desirable that a range of conditional expression (4) be set to a range of following conditional expression (4)′.
2.87<f6/f<133.14 (4)′
where
f is a focal length of an entire system, and
f6 is a focal length of the sixth lens.
It is desirable that following conditional expression (5) be satisfied in the imaging lens according to one embodiment of the present technology.
−46.0<f6/f7<−0.3 (5)
where
f6 is a focal length of the sixth lens, and
f7 is a focal length of the seventh lens.
Conditional expression (5) is an expression that defines a ratio between the focal length of the sixth lens and the focal length of the seventh lens.
If f6/f7 falls below the lower limit of conditional expression (5), the focal length of the sixth lens becomes long and the refractive power with respect to the incident light beam becomes weak, so that the total length of the lens becomes long and it becomes difficult to achieve the miniaturization.
On the other hand, if f6/f7 exceeds the upper limit of conditional expression (5), the focal length of the sixth lens becomes short and the refractive power with respect to the incident light beam becomes strong; the miniaturization may be achieved and the coma aberration may be easily corrected, but the back focus becomes too short and it becomes difficult to secure the space for arranging the infrared cut filter and the like.
Therefore, by satisfying conditional expression (5), it is possible to elongate the back focus to secure a sufficient space for arranging the infrared cut filter and the like while securing the miniaturization, and secure the excellent optical performance.
Furthermore, in order to further improve the above-described effect, it is desirable that a range of conditional expression (5) be set to a range of following conditional expression (5)′.
−21.62<f6/f7<−0.35 (5)′
where
f6 is a focal length of the sixth lens, and
f7 is a focal length of the seventh lens.
It is desirable that following conditional expression (6) be satisfied in the imaging lens according to one embodiment of the present technology.
5.4<|R7/f|<220.0 (6)
where
f is a focal length of an entire system, and
R7 is a curvature radius of a surface on the object side of the fourth lens.
Conditional expression (6) is an expression that defines a ratio between the curvature radius of the surface on the object side of the fourth lens and the focal length of the entire system.
If |R7/f| exceeds the upper limit of conditional expression (6), the curvature radius of the surface on the object side of the fourth lens becomes large and the refractive power of the air lens between the third and fourth lenses becomes weak, so that the angle at which the light beam is bent upward becomes small, the total length of the lens becomes long, and it becomes difficult to achieve the miniaturization.
On the other hand, if |R7/f| falls below the lower limit of conditional expression (6), the curvature radius of the surface on the object side of the fourth lens becomes small and the refractive power of the air lens between the third and fourth lenses becomes strong, so that the angle at which the light beam is bent upward becomes large and it becomes difficult to correct the coma aberration and field curvature.
Therefore, by satisfying conditional expression (6), it is possible to improve an effect of correcting the coma aberration and field curvature while securing the miniaturization, and secure the excellent optical performance.
Furthermore, in order to further improve the above-described effect, it is desirable that a range of conditional expression (6) be set to a range of following conditional expression (6)′.
17.07<|R7/f|<218.52 (6)′
where
f is a focal length of an entire system, and
R7 is a curvature radius of a surface on the object side of the fourth lens.
It is desirable that following conditional expression (7) be satisfied in the imaging lens according to one embodiment of the present technology.
5.8<|(R11+R12)/(R11−R12)|<320.0 (7)
where
R11 is a curvature radius of a surface on the object side of the sixth lens, and
R12 is a curvature radius of a surface on the image side of the sixth lens.
Conditional expression (7) is an expression that defines the shape of the sixth lens by the paraxial curvature radii of its surface on the object side and its surface on the image side.
If the value falls outside the upper or lower limit of conditional expression (7), it becomes difficult to sufficiently correct spherical aberration and high-order aberration with respect to an off-axis light beam.
Therefore, by satisfying conditional expression (7), it is possible to sufficiently correct the spherical aberration and the high-order aberration with respect to the off-axis light beam, and secure the excellent optical performance.
Furthermore, in order to further improve the above-described effect, it is desirable that a range of conditional expression (7) be set to a range of following conditional expression (7)′.
5.87<|(R11+R12)/(R11−R12)|<38.63 (7)′
where
R11 is a curvature radius of a surface on the object side of the sixth lens, and
R12 is a curvature radius of a surface on the image side of the sixth lens.
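The quantity in conditional expression (7) is a shape (bending) factor computed from the two curvature radii of the sixth lens. The radii below are hypothetical values for illustration, not entries from the lens-data tables.

```python
def shape_factor(r11, r12):
    """|(R11 + R12) / (R11 - R12)| for the sixth lens, per conditional expression (7)."""
    return abs((r11 + r12) / (r11 - r12))

# Hypothetical paraxial curvature radii, mm (not from the tables):
q = shape_factor(9.0, 11.0)
print(q)                 # 10.0
print(5.8 < q < 320.0)   # conditional expression (7): True
print(5.87 < q < 38.63)  # narrower range (7)': True
```

Note that the factor grows without bound as the two radii approach each other, which is why the upper limit in (7) is large.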
It is desirable that following conditional expression (8) be satisfied in the imaging lens according to one embodiment of the present technology.
18.0<νd(L3)<νd(L5)<νd(L6)<30.0 (8)
where
νd (L3) is an Abbe number on a d-line of the third lens,
νd (L5) is an Abbe number on a d-line of the fifth lens, and
νd (L6) is an Abbe number on a d-line of the sixth lens.
Conditional expression (8) is an expression that defines magnitudes of the Abbe number of the third lens, the Abbe number of the fifth lens, and the Abbe number of the sixth lens.
If the value falls outside the upper or lower limit of conditional expression (8), it becomes difficult to sufficiently correct axial chromatic aberration and magnification chromatic aberration.
Therefore, by satisfying conditional expression (8), it is possible to sufficiently correct the axial chromatic aberration and the magnification chromatic aberration, and secure the excellent optical performance.
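Unlike the ratio-type expressions, conditional expression (8) is a chained ordering of three Abbe numbers within a common range, which can be tested directly. The Abbe numbers below are hypothetical, not values from the glass data of the numerical examples.

```python
def satisfies_cond8(vd3, vd5, vd6):
    """Chained inequality 18.0 < vd(L3) < vd(L5) < vd(L6) < 30.0 of conditional expression (8)."""
    return 18.0 < vd3 < vd5 < vd6 < 30.0

# Hypothetical Abbe numbers on the d-line (not from the tables):
print(satisfies_cond8(19.2, 21.5, 26.0))  # True: ordering holds within the range
print(satisfies_cond8(19.2, 26.0, 21.5))  # False: vd(L5) > vd(L6) breaks the chain
```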
In the imaging lens according to one embodiment of the present technology, it is desirable that an aperture diaphragm be arranged on the object side of the first lens or between the first lens and the second lens.
By arranging the aperture diaphragm on the object side of the first lens or between the first lens and the second lens, the aperture diaphragm functions on the object side of the entire system, and it is possible to make the amount of light incident on the first lens or the second lens appropriate and secure the excellent optical performance.
It is desirable that the sixth lens have an inflection point on the surface on the image side in the imaging lens according to one embodiment of the present technology.
By having the inflection point on the surface on the image side of the sixth lens, it is possible to obtain different aberration correction effects in the vicinity of the optical axis and outside the vicinity of the optical axis, and secure the excellent optical performance. Especially, by making the shape of the sixth lens concave in the vicinity of the optical axis and convex in the peripheral portion, it is possible to secure the excellent optical performance and suppress the incident angle of light on the image surface.
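An inflection point of a rotationally symmetric surface lies where the second derivative of the sag profile changes sign, so its position can be located by a simple numerical scan. The polynomial sag profile below is hypothetical and stands in for an actual aspheric surface from the tables.

```python
def second_derivative(f, y, h=1e-4):
    """Central-difference estimate of f''(y)."""
    return (f(y + h) - 2.0 * f(y) + f(y - h)) / (h * h)

# Hypothetical even-polynomial sag profile whose curvature changes sign off-axis:
sag = lambda y: 0.2 * y**2 - 0.05 * y**4  # f''(y) = 0.4 - 0.6*y^2, zero at y = sqrt(2/3)

# Scan outward from the axis for a sign change of the surface curvature:
ys = [i * 0.01 for i in range(1, 200)]
flips = [b for a, b in zip(ys, ys[1:])
         if second_derivative(sag, a) * second_derivative(sag, b) < 0]
print(round(flips[0], 2))  # 0.82, close to the analytic sqrt(2/3) ~ 0.8165
```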
Hereinafter, a specific embodiment of the imaging lens of the present technology and a numerical value example in which specific numerical values are applied to the embodiment are described with reference to drawings and tables.
Note that, the meanings of the symbols and the like in the following respective tables and descriptions are as described below.
When a surface number of the lens and the like is set to i, “R” represents a paraxial curvature radius of an i-th surface, “T” represents an on-axis surface spacing between the i-th surface and an (i+1)-th surface (a thickness of the lens center or an air interval), “refractive index” represents a refractive index on the d-line (λ=587.6 nm) of the lens and the like starting from the i-th surface, and “Abbe number” represents the Abbe number on the d-line of the lens and the like starting from the i-th surface.
In a “lens” field, “L1, L2, . . . ” represent the first lens, the second lens, . . . , respectively, “R1” represents the surface on the object side, and “R2” represents the surface on the image side. Regarding “R”, “infinity” represents that the surface is a flat surface.
“Focal length” represents the focal length of the entire optical system, “f-number” represents an f-number, “total length” represents the total length of the entire optical system, and “ω” represents a half angle of view.
Note that, in each table illustrating aspherical coefficients as follows, "E-n" is an exponential representation with a base of 10, that is, "×10⁻ⁿ"; for example, "0.12345E-05" represents "0.12345×10⁻⁵".
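The "E-n" convention described above is standard scientific notation, so table entries in this form can be parsed directly; for example:

```python
# "E-n" means a factor of 10 to the negative n-th power:
value = float("0.12345E-05")  # 0.12345 x 10^-5
print(value == 0.12345e-5)    # True
```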
Some of the imaging lenses used in the embodiments include lens surfaces formed as aspherical surfaces. The aspherical shape is defined by following mathematical expression 1, where "x" is a distance (sag amount) from the apex of the lens surface in the optical axis direction, "y" is a height in the direction orthogonal to the optical axis, "c" is a paraxial curvature at the apex of the lens (reciprocal of the curvature radius), "K" is a conic constant, and "A", "B", . . . are the fourth-order, sixth-order, . . . aspherical coefficients, respectively.
x = cy²/[1 + {1 − (1 + K)c²y²}^(1/2)] + Ay⁴ + By⁶ + · · ·  [Mathematical Expression 1]
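Mathematical expression 1 can be implemented directly as a sag function. The coefficients in the usage line below are hypothetical placeholders, not values from the aspherical-coefficient tables.

```python
import math

def aspheric_sag(y, c, k, coeffs=()):
    """Sag x(y) per mathematical expression 1:
    x = c*y^2 / [1 + sqrt(1 - (1 + K)*c^2*y^2)] + A*y^4 + B*y^6 + ...
    y: height from the optical axis, c: paraxial curvature (1/R),
    k: conic constant K, coeffs: (A, B, ...) for the y^4, y^6, ... terms."""
    x = c * y * y / (1.0 + math.sqrt(1.0 - (1.0 + k) * c * c * y * y))
    for i, a in enumerate(coeffs):
        x += a * y ** (4 + 2 * i)
    return x

# Sanity check: with K = 0 and no polynomial terms this reduces to the
# spherical sag R - sqrt(R^2 - y^2):
assert math.isclose(aspheric_sag(0.8, 0.5, 0.0), 2.0 - math.sqrt(4.0 - 0.64))

# Hypothetical surface (R = 2.0 mm, K = -1, small A and B), not from the tables:
print(aspheric_sag(0.8, 0.5, -1.0, (1.2345e-3, -0.5e-4)))
```

The conic term uses the conventional form with K in the square root; for K = −1 (a parabola) the denominator collapses to 2 and the conic contribution becomes cy²/2.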
Note that, in each example, the aperture diaphragm is represented as “diaphragm” to the right of the surface number, and a seal glass is represented as “SG” to the right of the surface number.
The imaging lens 1 includes a first lens L1 having a positive refractive power with a convex surface facing an object side, a second lens L2 having a positive refractive power, a third lens L3 having a negative refractive power, a fourth lens L4 having a positive refractive power with a concave surface facing the object side, a fifth lens L5 having a negative refractive power, a sixth lens L6 having a positive refractive power with a concave surface facing an image side, and a seventh lens L7 having a negative refractive power in the vicinity of an optical axis arranged in order from the object side to the image side. The seventh lens L7 includes a surface on the image side formed into an aspherical shape with an inflection point.
A seal glass SG is arranged between the seventh lens L7 and an image surface IMG. An aperture diaphragm STO is arranged on the object side of the first lens L1.
Table 1 illustrates lens data of a numerical value example 1 in which specific numerical values are applied to the imaging lens 1.
The aspherical coefficient and the like of each surface of the lens in the numerical value example 1 are illustrated in Tables 2-1 and 2-2 together with a conic constant K.
The focal length, f-number, total length, and half angle of view ω of the numerical value example 1 are illustrated in Table 3.
A focal length of each lens in the numerical value example 1 is illustrated in Table 4.
From each aberration diagram, it is clear that various aberrations are excellently corrected in the numerical value example 1, which thus has excellent image forming performance.
The imaging lens 2 includes a first lens L1 having a positive refractive power with a convex surface facing an object side, a second lens L2 having a positive refractive power, a third lens L3 having a negative refractive power, a fourth lens L4 having a positive refractive power with a concave surface facing the object side, a fifth lens L5 having a negative refractive power, a sixth lens L6 having a positive refractive power with a concave surface facing an image side, and a seventh lens L7 having a negative refractive power in the vicinity of an optical axis arranged in order from the object side to the image side. The seventh lens L7 includes a surface on the image side formed into an aspherical shape with an inflection point.
A seal glass SG is arranged between the seventh lens L7 and an image surface IMG. An aperture diaphragm STO is arranged on the object side of the first lens L1.
Table 5 illustrates lens data of a numerical value example 2 in which specific numerical values are applied to the imaging lens 2.
The aspherical coefficient and the like of each surface of the lens in the numerical value example 2 are illustrated in Tables 6-1 and 6-2 together with a conic constant K.
The focal length, f-number, total length, and half angle of view ω of the numerical value example 2 are illustrated in Table 7.
A focal length of each lens in the numerical value example 2 is illustrated in Table 8.
From each aberration diagram, it is clear that various aberrations are excellently corrected in the numerical value example 2, which thus has excellent image forming performance.
The imaging lens 3 includes a first lens L1 having a positive refractive power with a convex surface facing an object side, a second lens L2 having a positive refractive power, a third lens L3 having a negative refractive power, a fourth lens L4 having a positive refractive power with a concave surface facing the object side, a fifth lens L5 having a positive refractive power, a sixth lens L6 having a positive refractive power with a concave surface facing an image side, and a seventh lens L7 having a negative refractive power in the vicinity of an optical axis arranged in order from the object side to the image side. The seventh lens L7 includes a surface on the image side formed into an aspherical shape with an inflection point.
A seal glass SG is arranged between the seventh lens L7 and an image surface IMG. An aperture diaphragm STO is arranged on the object side of the first lens L1.
Table 9 illustrates lens data of a numerical value example 3 in which specific numerical values are applied to the imaging lens 3.
The aspherical coefficient and the like of each surface of the lens in the numerical value example 3 are illustrated in Tables 10-1 and 10-2 together with a conic constant K.
The focal length, f-number, total length, and half angle of view ω of the numerical value example 3 are illustrated in Table 11.
A focal length of each lens in the numerical value example 3 is illustrated in Table 12.
From each aberration diagram, it is clear that various aberrations are excellently corrected in the numerical value example 3, which thus has excellent image forming performance.
The imaging lens 4 includes a first lens L1 having a positive refractive power with a convex surface facing an object side, a second lens L2 having a positive refractive power, a third lens L3 having a negative refractive power, a fourth lens L4 having a positive refractive power with a concave surface facing the object side, a fifth lens L5 having a negative refractive power, a sixth lens L6 having a positive refractive power with a concave surface facing an image side, and a seventh lens L7 having a negative refractive power in the vicinity of an optical axis arranged in order from the object side to the image side. The seventh lens L7 includes a surface on the image side formed into an aspherical shape with an inflection point.
A seal glass SG is arranged between the seventh lens L7 and an image surface IMG. An aperture diaphragm STO is arranged on the object side of the first lens L1.
Table 13 illustrates lens data of a numerical value example 4 in which specific numerical values are applied to the imaging lens 4.
The aspherical coefficient and the like of each surface of the lens in the numerical value example 4 are illustrated in Tables 14-1 and 14-2 together with a conic constant K.
The focal length, f-number, total length, and half angle of view ω of the numerical value example 4 are illustrated in Table 15.
A focal length of each lens in the numerical value example 4 is illustrated in Table 16.
From each aberration diagram, it is clear that various aberrations are excellently corrected in the numerical value example 4 and that the lens has excellent image-forming performance.
The imaging lens 5 includes a first lens L1 having a positive refractive power with a convex surface facing an object side, a second lens L2 having a positive refractive power, a third lens L3 having a negative refractive power, a fourth lens L4 having a positive refractive power with a concave surface facing the object side, a fifth lens L5 having a negative refractive power, a sixth lens L6 having a positive refractive power with a concave surface facing an image side, and a seventh lens L7 having a negative refractive power in the vicinity of an optical axis arranged in order from the object side to the image side. The seventh lens L7 includes a surface on the image side formed into an aspherical shape with an inflection point.
A seal glass SG is arranged between the seventh lens L7 and an image surface IMG. An aperture diaphragm STO is arranged on the object side of the first lens L1.
Table 17 illustrates lens data of a numerical value example 5 in which specific numerical values are applied to the imaging lens 5.
The aspherical coefficient and the like of each surface of the lens in the numerical value example 5 are illustrated in Tables 18-1 and 18-2 together with a conic constant K.
The focal length, f-number, total length, and half angle of view ω of the numerical value example 5 are illustrated in Table 19.
A focal length of each lens in the numerical value example 5 is illustrated in Table 20.
From each aberration diagram, it is clear that various aberrations are excellently corrected in the numerical value example 5 and that the lens has excellent image-forming performance.
The imaging lens 6 includes a first lens L1 having a positive refractive power with a convex surface facing an object side, a second lens L2 having a positive refractive power, a third lens L3 having a negative refractive power, a fourth lens L4 having a negative refractive power with a concave surface facing the object side, a fifth lens L5 having a positive refractive power, a sixth lens L6 having a positive refractive power with a concave surface facing an image side, and a seventh lens L7 having a negative refractive power in the vicinity of an optical axis arranged in order from the object side to the image side. The seventh lens L7 includes a surface on the image side formed into an aspherical shape with an inflection point.
A seal glass SG is arranged between the seventh lens L7 and an image surface IMG. An aperture diaphragm STO is arranged on the object side of the first lens L1.
Table 21 illustrates lens data of a numerical value example 6 in which specific numerical values are applied to the imaging lens 6.
The aspherical coefficient and the like of each surface of the lens in the numerical value example 6 are illustrated in Tables 22-1 and 22-2 together with a conic constant K.
The focal length, f-number, total length, and half angle of view ω of the numerical value example 6 are illustrated in Table 23.
A focal length of each lens in the numerical value example 6 is illustrated in Table 24.
From each aberration diagram, it is clear that various aberrations are excellently corrected in the numerical value example 6 and that the lens has excellent image-forming performance.
The imaging lens 7 includes a first lens L1 having a positive refractive power with a convex surface facing an object side, a second lens L2 having a positive refractive power, a third lens L3 having a negative refractive power, a fourth lens L4 having a negative refractive power with a concave surface facing the object side, a fifth lens L5 having a negative refractive power, a sixth lens L6 having a positive refractive power with a concave surface facing an image side, and a seventh lens L7 having a negative refractive power in the vicinity of an optical axis arranged in order from the object side to the image side. The seventh lens L7 includes a surface on the image side formed into an aspherical shape with an inflection point.
A seal glass SG is arranged between the seventh lens L7 and an image surface IMG. An aperture diaphragm STO is arranged on the object side of the first lens L1.
Table 25 illustrates lens data of a numerical value example 7 in which specific numerical values are applied to the imaging lens 7.
The aspherical coefficient and the like of each surface of the lens in the numerical value example 7 are illustrated in Tables 26-1 and 26-2 together with a conic constant K.
The focal length, f-number, total length, and half angle of view ω of the numerical value example 7 are illustrated in Table 27.
A focal length of each lens in the numerical value example 7 is illustrated in Table 28.
From each aberration diagram, it is clear that various aberrations are excellently corrected in the numerical value example 7 and that the lens has excellent image-forming performance.
The imaging lens 8 includes a first lens L1 having a positive refractive power with a convex surface facing an object side, a second lens L2 having a positive refractive power, a third lens L3 having a negative refractive power, a fourth lens L4 having a negative refractive power with a concave surface facing the object side, a fifth lens L5 having a negative refractive power, a sixth lens L6 having a positive refractive power with a concave surface facing an image side, and a seventh lens L7 having a negative refractive power in the vicinity of an optical axis arranged in order from the object side to the image side. The seventh lens L7 includes a surface on the image side formed into an aspherical shape with an inflection point.
A seal glass SG is arranged between the seventh lens L7 and an image surface IMG. An aperture diaphragm STO is arranged on the object side of the first lens L1.
Table 29 illustrates lens data of a numerical value example 8 in which specific numerical values are applied to the imaging lens 8.
The aspherical coefficient and the like of each surface of the lens in the numerical value example 8 are illustrated in Tables 30-1 and 30-2 together with a conic constant K.
The focal length, f-number, total length, and half angle of view ω of the numerical value example 8 are illustrated in Table 31.
A focal length of each lens in the numerical value example 8 is illustrated in Table 32.
From each aberration diagram, it is clear that various aberrations are excellently corrected in the numerical value example 8 and that the lens has excellent image-forming performance.
The imaging lens 9 includes a first lens L1 having a positive refractive power with a convex surface facing an object side, a second lens L2 having a positive refractive power, a third lens L3 having a negative refractive power, a fourth lens L4 having a positive refractive power with a concave surface facing the object side, a fifth lens L5 having a negative refractive power, a sixth lens L6 having a positive refractive power with a concave surface facing an image side, and a seventh lens L7 having a negative refractive power in the vicinity of an optical axis arranged in order from the object side to the image side. The seventh lens L7 includes a surface on the image side formed into an aspherical shape with an inflection point.
A seal glass SG is arranged between the seventh lens L7 and an image surface IMG. An aperture diaphragm STO is arranged between the first lens L1 and the second lens L2.
Table 33 illustrates lens data of a numerical value example 9 in which specific numerical values are applied to the imaging lens 9.
The aspherical coefficient and the like of each surface of the lens in the numerical value example 9 are illustrated in Tables 34-1 and 34-2 together with a conic constant K.
The focal length, f-number, total length, and half angle of view ω of the numerical value example 9 are illustrated in Table 35.
A focal length of each lens in the numerical value example 9 is illustrated in Table 36.
From each aberration diagram, it is clear that various aberrations are excellently corrected in the numerical value example 9 and that the lens has excellent image-forming performance.
The imaging lens 10 includes a first lens L1 having a positive refractive power with a convex surface facing an object side, a second lens L2 having a positive refractive power, a third lens L3 having a negative refractive power, a fourth lens L4 having a positive refractive power with a concave surface facing the object side, a fifth lens L5 having a negative refractive power, a sixth lens L6 having a positive refractive power with a concave surface facing an image side, and a seventh lens L7 having a negative refractive power in the vicinity of an optical axis arranged in order from the object side to the image side. The seventh lens L7 includes a surface on the image side formed into an aspherical shape with an inflection point.
A seal glass SG is arranged between the seventh lens L7 and an image surface IMG. An aperture diaphragm STO is arranged on the object side of the first lens L1.
Table 37 illustrates lens data of a numerical value example 10 in which specific numerical values are applied to the imaging lens 10.
The aspherical coefficient and the like of each surface of the lens in the numerical value example 10 are illustrated in Tables 38-1 and 38-2 together with a conic constant K.
The focal length, f-number, total length, and half angle of view ω of the numerical value example 10 are illustrated in Table 39.
A focal length of each lens in the numerical value example 10 is illustrated in Table 40.
From each aberration diagram, it is clear that various aberrations are excellently corrected in the numerical value example 10 and that the lens has excellent image-forming performance.
Each value in the conditional expressions of the imaging lens of the present technology is described below.
Table 41 illustrates each value in conditional expressions (1) to (8) in the numerical value examples 1 to 10 of the imaging lenses 1 to 10.
As is clear from Table 41, the imaging lenses 1 to 10 are designed to satisfy conditional expressions (1) to (8).
In the imaging device of the present technology, the imaging lens is provided with the first lens having the positive refractive power with the convex surface facing the object side, the second lens having the positive refractive power, the third lens having the negative refractive power, the fourth lens having the positive refractive power or the negative refractive power with the concave surface facing the object side, the fifth lens having the positive refractive power or the negative refractive power, the sixth lens having the positive refractive power with the concave surface facing the image side, and the seventh lens having the negative refractive power in the vicinity of the optical axis, the surface on the image side of which is formed into the aspherical shape having the inflection point, arranged in order from the object side to the image side.
In this manner, in the imaging device, the imaging lens has a lens configuration of seven lenses in total, each lens arranged to have an optimal refractive power and formed into a lens shape in which an aspheric surface is effectively used, so that it becomes possible to excellently correct various aberrations while securing a large diameter and miniaturization and improve performance.
Furthermore, regarding each lens, it becomes possible to correct the various aberrations more excellently while securing the large diameter and miniaturization by combining optimum glass materials.
Especially, by forming the surface on the object side of the fourth lens into a concave shape, it is possible to correct the coma aberration and field curvature while reducing the eccentric sensitivity.
An imaging device 100 includes an imaging element 10 having a photoelectric conversion function of converting captured light into an electric signal, a camera signal processing unit 20 that performs signal processing such as analog-to-digital conversion of an imaged image signal, and an image processing unit 30 that performs record/reproduction processing of the image signal. Furthermore, the imaging device 100 is provided with a display unit 40 that displays an imaged image and the like, a reader/writer (R/W) 50 that writes and reads the image signal in and from a memory 90, a central processing unit (CPU) 60 that controls an entire imaging device 100, an input unit 70 such as various switches on which a user performs a required operation, and a lens drive control unit 80 that controls drive of a lens group (movable group).
The camera signal processing unit 20 performs various types of signal processing such as conversion of an output signal from the imaging element 10 into a digital signal, noise removal, image quality correction, and conversion into a luminance/color difference signal.
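The conversion into a luminance/color difference signal mentioned above is typically a YCbCr conversion. As one hedged example, assuming the common BT.601 full-range coefficients (the source does not specify which matrix the camera signal processing unit 20 uses):

```python
def rgb_to_ycbcr(r, g, b):
    """BT.601 full-range RGB -> luminance/color-difference (Y, Cb, Cr).

    Inputs are 0..255 channel values; Cb and Cr are offset so that
    neutral gray maps to 128.
    """
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128.0
    cr = 0.5 * r - 0.418688 * g - 0.081312 * b + 128.0
    return y, cb, cr

# White maps to maximum luminance with neutral color difference:
y, cb, cr = rgb_to_ycbcr(255, 255, 255)
```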
The image processing unit 30 performs compression encoding/decompression decoding processing of the image signal based on a predetermined image data format, conversion processing of data specifications such as resolution, and the like.
The display unit 40 has a function of displaying various data, such as the user's operation state on the input unit 70 and the imaged image.
The R/W 50 writes the image data encoded by the image processing unit 30 in the memory 90 and reads the image data recorded in the memory 90.
The CPU 60 serves as a control processing unit that controls each circuit block provided in the imaging device 100, and controls each circuit block on the basis of an instruction input signal and the like from the input unit 70.
The input unit 70 outputs the instruction input signal according to the user operation to the CPU 60.
The lens drive control unit 80 controls a motor and the like (not illustrated) that drive the lens group on the basis of a control signal from the CPU 60.
The memory 90 is, for example, a semiconductor memory attachable to and detachable from a slot connected to the R/W 50. Note that, it is also possible that the memory 90 is not attachable to and detachable from the slot, and is incorporated in the imaging device 100.
The operation in the imaging device 100 is hereinafter described.
In a standby state for imaging, under the control of the CPU 60, the imaged image signal is output to the display unit 40 via the camera signal processing unit 20 and displayed as a camera through image.
When imaging is performed by the instruction input signal from the input unit 70, the imaged image signal is output from the camera signal processing unit 20 to the image processing unit 30, subjected to the compression encoding processing, and converted into digital data in a predetermined data format. The converted data is output to the R/W 50 and written in the memory 90.
Focusing is performed by moving a focus lens group by the lens drive control unit 80 on the basis of the control signal from the CPU 60.
In a case of reproducing the image data recorded in the memory 90, predetermined image data is read from the memory 90 by the R/W 50 in response to the operation on the input unit 70, and the decompression decoding processing is performed thereon by the image processing unit 30, then a reproduced image signal is output to the display unit 40 and a reproduced image is displayed.
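The record and reproduce paths described above can be sketched as a simple data flow. All class and function names below are hypothetical stand-ins for the blocks of the imaging device 100, not APIs from the source:

```python
# Hypothetical sketch of the flow: imaging element -> camera signal
# processing unit 20 -> image processing unit 30 -> R/W 50 -> memory 90,
# and the reverse path for reproduction.

class Memory:
    """Stand-in for the memory 90 connected to the R/W 50."""

    def __init__(self):
        self._store = {}

    def write(self, key, data):
        self._store[key] = data

    def read(self, key):
        return self._store[key]

def record(raw_signal, memory, key):
    digital = f"digital({raw_signal})"   # camera signal processing unit 20
    encoded = f"encoded({digital})"      # compression encoding (unit 30)
    memory.write(key, encoded)           # R/W 50 writes to memory 90
    return encoded

def reproduce(memory, key):
    encoded = memory.read(key)           # R/W 50 reads from memory 90
    # decompression decoding by the image processing unit 30:
    decoded = encoded.replace("encoded(", "decoded(")
    return decoded                       # output to the display unit 40
```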
Note that, in the present technology, the term "imaging" refers to a part or all of a series of processing: the photoelectric conversion processing of converting the captured light into the electric signal by the imaging element 10; the processing by the camera signal processing unit 20, such as the conversion of the output signal from the imaging element 10 into the digital signal, the noise removal, the image quality correction, and the conversion into the luminance/color difference signal; the compression encoding/decompression decoding processing of the image signal based on a predetermined image data format and the conversion processing of the data specification such as the resolution by the image processing unit 30; and the writing processing of the image signal in the memory 90 by the R/W 50.
That is, the term "imaging" may refer only to the photoelectric conversion processing by the imaging element 10; it may refer to the processing from the photoelectric conversion processing by the imaging element 10 to the processing by the camera signal processing unit 20, such as the conversion of the output signal into the digital signal, the noise removal, the image quality correction, and the conversion into the luminance/color difference signal; it may refer to the processing from the photoelectric conversion processing, through the processing by the camera signal processing unit 20, to the compression encoding/decompression decoding processing of the image signal based on a predetermined image data format and the conversion processing of the data specification such as the resolution by the image processing unit 30; or it may refer to the processing through all of the above up to the writing processing of the image signal in the memory 90 by the R/W 50. In the above-described processing, the order of each processing may be appropriately changed.
Furthermore, in the present technology, the imaging device 100 may include a part or all of the imaging element 10, the camera signal processing unit 20, the image processing unit 30, and the R/W 50 that perform the above-described processing.
In the imaging lens of the present technology and the imaging device of the present technology, other optical elements such as a lens having no refractive power may be arranged in addition to the first lens L1 to the seventh lens L7. In this case, the lens configuration of the imaging lens of the present technology is still substantially a seven-lens configuration of the first lens L1 to the seventh lens L7.
Note that, the imaging device described above may be widely applied to a digital still camera, a digital video camera, a camera unit of a digital input/output device such as a mobile phone with an incorporated camera, a mobile terminal such as a tablet with an incorporated camera, and the like.
The technology according to the present disclosure may be applied to various products. For example, the technology according to the present disclosure may be applied to a capsule endoscope.
Configurations and functions of the capsule endoscope 5401 and the external control device 5423 are described in further detail. As illustrated, the capsule endoscope 5401 has functions of a light source unit 5405, an imaging unit 5407, an image processing unit 5409, a wireless communication unit 5411, a power feed unit 5415, a power supply unit 5417, a state detection unit 5419, and a control unit 5421 in a capsule-shaped casing 5403.
The light source unit 5405 includes a light source such as, for example, a light emitting diode (LED), and irradiates an imaging visual field of the imaging unit 5407 with light.
The imaging unit 5407 includes an optical system including an imaging element and a plurality of lenses provided in a stage preceding the imaging element. Reflected light (hereinafter referred to as observation light) of the light applied to body tissue being an observation target is condensed by the optical system and is incident on the imaging element. The imaging element receives the observation light and photoelectrically converts the same to generate an electric signal corresponding to the observation light, that is, an image signal corresponding to an observation image. The image signal generated by the imaging unit 5407 is provided to the image processing unit 5409. Note that, as the imaging element of the imaging unit 5407, various types of well-known imaging elements such as a complementary metal oxide semiconductor (CMOS) image sensor or a charge coupled device (CCD) image sensor may be used.
The image processing unit 5409 includes a processor such as a central processing unit (CPU) or a graphics processing unit (GPU), and performs various types of signal processing on the image signal generated by the imaging unit 5407. The signal processing may be the minimum processing required for transmitting the image signal to the external control device 5423 (for example, image data compression, frame rate conversion, data rate conversion, and/or format conversion). Since the image processing unit 5409 performs only the requisite minimum processing, it may be realized with a smaller size and lower power consumption, which is preferable for the capsule endoscope 5401. However, in a case where there is extra space in the casing 5403 and extra power consumption is allowable, further signal processing (for example, noise removal processing, other high image quality processing, and the like) may be performed in the image processing unit 5409. The image processing unit 5409 provides the image signal subjected to the signal processing to the wireless communication unit 5411 as RAW data. Note that, in a case where information regarding a state (movement, attitude, and the like) of the capsule endoscope 5401 is obtained by the state detection unit 5419, the image processing unit 5409 may provide the image signal to the wireless communication unit 5411 in association with the information. This makes it possible to associate a position in the body at which the image is imaged, an imaging direction of the image, and the like with the imaged image.
The wireless communication unit 5411 includes a communication device capable of transmitting/receiving various types of information to/from the external control device 5423. The communication device includes an antenna 5413 and a processing circuit that performs modulation processing and the like for transmitting and receiving signals. The wireless communication unit 5411 performs predetermined processing such as modulation processing on the image signal subjected to the signal processing by the image processing unit 5409 and transmits the image signal to the external control device 5423 via the antenna 5413. Furthermore, the wireless communication unit 5411 receives a control signal regarding drive control of the capsule endoscope 5401 from the external control device 5423 via the antenna 5413. The wireless communication unit 5411 provides the received control signal to the control unit 5421.
The power feed unit 5415 includes an antenna coil for power reception, a power regeneration circuit for regenerating electric power from current generated in the antenna coil, a booster circuit and the like. In the power feed unit 5415, the electric power is generated using a so-called non-contact charging principle. Specifically, a magnetic field (electromagnetic wave) of a predetermined frequency is externally given to the antenna coil of the power feed unit 5415, so that induced electromotive force is generated in the antenna coil. The electromagnetic wave may be a carrier wave transmitted from the external control device 5423 via an antenna 5425, for example. The electric power is regenerated from the induced electromotive force by the power regeneration circuit, and electric potential thereof is appropriately adjusted in the booster circuit, so that electric power for storage is generated. The electric power generated by the power feed unit 5415 is stored in the power supply unit 5417.
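The induced electromotive force described here follows Faraday's law, EMF = -N dΦ/dt; for a sinusoidal external field, the peak EMF scales with the turn count, coil area, field amplitude, and frequency. A back-of-envelope sketch with assumed parameter values (none are from the source):

```python
import math

def peak_induced_emf(turns, coil_area_m2, b_peak_tesla, freq_hz):
    """Peak EMF of a coil in a sinusoidal field B(t) = B_peak * sin(2*pi*f*t).

    Faraday's law gives emf = -N * dPhi/dt, so the peak magnitude is
    N * 2*pi*f * A * B_peak.
    """
    return turns * 2.0 * math.pi * freq_hz * coil_area_m2 * b_peak_tesla

# Assumed illustrative parameters: 100 turns, 1 cm^2 coil, 0.1 mT field
# amplitude at 100 kHz.
emf = peak_induced_emf(turns=100, coil_area_m2=1e-4,
                       b_peak_tesla=1e-4, freq_hz=100e3)
```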
The power supply unit 5417 includes a secondary battery and stores the electric power generated by the power feed unit 5415.
The state detection unit 5419 includes a sensor for detecting the state of the capsule endoscope 5401 such as an acceleration sensor and/or a gyro sensor. The state detection unit 5419 may obtain the information regarding the state of the capsule endoscope 5401 from a detection result by the sensor. The state detection unit 5419 provides the obtained information regarding the state of the capsule endoscope 5401 to the image processing unit 5409. As described above, in the image processing unit 5409, the information regarding the state of the capsule endoscope 5401 may be associated with the image signal.
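The association of state information with the image signal can be sketched as simple data tagging; the types below are hypothetical illustrations, not structures from the source:

```python
from dataclasses import dataclass

@dataclass
class CapsuleState:
    """State info from the state detection unit (accel/gyro readings)."""
    acceleration: tuple
    angular_velocity: tuple

@dataclass
class TaggedImage:
    """Image signal associated with the capsule state at capture time."""
    pixels: bytes
    state: CapsuleState

def associate(image_signal: bytes, state: CapsuleState) -> TaggedImage:
    # The image processing unit provides the image signal to the wireless
    # communication unit together with the state information.
    return TaggedImage(pixels=image_signal, state=state)
```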
The control unit 5421 includes a processor such as a CPU, and comprehensively controls an operation of the capsule endoscope 5401 by operating according to a predetermined program. The control unit 5421 appropriately controls drive of the light source unit 5405, the imaging unit 5407, the image processing unit 5409, the wireless communication unit 5411, the power feed unit 5415, the power supply unit 5417, and the state detection unit 5419 according to the control signal transmitted from the external control device 5423, thereby realizing the function in each unit as described above.
The external control device 5423 may be a microcomputer, a control board, or the like on which a processor such as a CPU and a GPU is mounted, or on which the processor and a storage element such as a memory are mixedly mounted. The external control device 5423 includes the antenna 5425 and is configured to be able to transmit and receive various types of information to and from the capsule endoscope 5401 via the antenna 5425. Specifically, the external control device 5423 controls the operation of the capsule endoscope 5401 by transmitting the control signal to the control unit 5421 of the capsule endoscope 5401. For example, an irradiation condition of the light to the observation target in the light source unit 5405 may be changed by the control signal from the external control device 5423. Furthermore, an imaging condition (for example, a frame rate, exposure value, and the like in the imaging unit 5407) may be changed by the control signal from the external control device 5423. Furthermore, a content of the processing in the image processing unit 5409 and a condition (for example, a transmission interval, the number of transmitted images, and the like) for the wireless communication unit 5411 to transmit the image signal may be changed by the control signal from the external control device 5423.
Furthermore, the external control device 5423 applies various types of image processing to the image signal transmitted from the capsule endoscope 5401 and generates the image data for displaying the imaged in-vivo image on the display device. As the image processing, for example, various types of well-known signal processing such as development processing (demosaic processing), high image quality processing (such as band enhancement processing, super-resolution processing, noise reduction (NR) processing, and/or camera shake correction processing), and/or scaling processing (electronic zoom processing) may be performed. The external control device 5423 controls drive of the display device (not illustrated) to display the imaged in-vivo image on the basis of the generated image data. Alternatively, the external control device 5423 may allow a recording device (not illustrated) to record the generated image data or allow a printing device (not illustrated) to print out the same.
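The scaling processing (electronic zoom processing) mentioned above can be sketched as a center crop followed by a nearest-neighbor resize; this is a simplified illustration, not the processing actually used by the external control device 5423:

```python
def electronic_zoom(image, zoom):
    """Electronic zoom sketch: crop the center region by factor `zoom`,
    then resize back to the original size by nearest-neighbor sampling.

    `image` is a list of rows of pixel values; `zoom` >= 1.
    """
    h, w = len(image), len(image[0])
    ch, cw = max(1, int(h / zoom)), max(1, int(w / zoom))
    top, left = (h - ch) // 2, (w - cw) // 2
    crop = [row[left:left + cw] for row in image[top:top + ch]]
    return [
        [crop[int(y * ch / h)][int(x * cw / w)] for x in range(w)]
        for y in range(h)
    ]
```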
An example of the in-vivo information obtaining system 5400 to which the technology according to the present disclosure may be applied is described above. The technology according to the present disclosure may be preferably applied to the capsule endoscope 5401 out of the configurations described above. Specifically, this is applicable to an imaging lens in the capsule endoscope 5401 and the capsule endoscope 5401 provided with the imaging lens. Since a clearer surgical site image may be obtained by applying the technology according to the present disclosure to the capsule endoscope 5401, the accuracy of the examination may be improved and the capsule endoscope 5401 may be made further smaller, so that a burden on the patient may be further reduced.
The technology according to the present disclosure may be applied to various products. For example, the technology according to the present disclosure is applicable to an imaging device or an imaging lens mounted on any type of mobile body such as an automobile, an electric automobile, a hybrid electric automobile, a motorcycle, a bicycle, a personal mobility, an airplane, a drone, a ship, a robot, a construction machine, and an agricultural machine (tractor).
Each control unit is provided with a microcomputer that performs arithmetic processing according to various programs, a storage unit that stores the programs executed by the microcomputer, parameters used for various arithmetic operations, and the like, and a drive circuit that drives various devices to be controlled. Each control unit is provided with a network I/F for communicating with other control units via the communication network 7010, and a communication I/F for communicating by wired communication or wireless communication with devices, sensors, or the like inside and outside the vehicle.
The drive system control unit 7100 controls an operation of a device related to a drive system of a vehicle according to various programs. For example, the drive system control unit 7100 serves as a control device of a driving force generating device for generating driving force of the vehicle such as an internal combustion engine, a driving motor, or the like, a driving force transmitting mechanism for transmitting the driving force to wheels, a steering mechanism that adjusts a steering angle of the vehicle, a braking device that generates braking force of the vehicle, and the like. The drive system control unit 7100 may have a function as a control device of an antilock brake system (ABS), electronic stability control (ESC), or the like.
A vehicle state detection unit 7110 is connected to the drive system control unit 7100. The vehicle state detection unit 7110 includes, for example, at least one of a gyro sensor that detects an angular velocity of axial rotational movement of a vehicle body, an acceleration sensor that detects acceleration of the vehicle, or a sensor for detecting an operation amount of an accelerator pedal, an operation amount of a brake pedal, a steering angle of a steering wheel, an engine speed, a wheel rotational speed, or the like. The drive system control unit 7100 performs arithmetic processing using a signal input from the vehicle state detection unit 7110 to control the internal combustion engine, the driving motor, an electric power steering device, a brake device, or the like.
The body system control unit 7200 controls operations of various devices mounted on the vehicle body in accordance with various programs. For example, the body system control unit 7200 serves as a control device of a keyless entry system, a smart key system, a power window device, or various lights such as a head light, a backing light, a brake light, a blinker, a fog light or the like. In this case, a radio wave transmitted from a portable device that substitutes for a key or signals of various switches may be input to the body system control unit 7200. The body system control unit 7200 receives an input of the radio wave or signals and controls a door lock device, the power window device, the lights and the like of the vehicle.
The battery control unit 7300 controls a secondary battery 7310 that is a power supply source of the driving motor according to various programs. For example, information such as a battery temperature, a battery output voltage, or a remaining battery capacity is input to the battery control unit 7300 from a battery device provided with the secondary battery 7310. The battery control unit 7300 performs arithmetic processing using these signals to control temperature adjustment of the secondary battery 7310, a cooling device provided in the battery device, and the like.
The vehicle exterior information detection unit 7400 detects information outside the vehicle on which the vehicle control system 7000 is mounted. For example, the vehicle exterior information detection unit 7400 is connected to at least one of an imaging unit 7410 or a vehicle exterior information detection unit 7420. The imaging unit 7410 includes at least one of a time of flight (ToF) camera, a stereo camera, a monocular camera, an infrared camera, or other cameras. The vehicle exterior information detection unit 7420 includes, for example, at least one of an environmental sensor for detecting current weather or meteorological phenomenon, or an ambient information detection sensor for detecting other vehicles, obstacles, pedestrians or the like around the vehicle on which the vehicle control system 7000 is mounted.
The environmental sensor may be, for example, at least one of a raindrop sensor that detects rainy weather, a fog sensor that detects fog, a sunshine sensor that detects sunlight intensity, or a snow sensor that detects snowfall. The ambient information detection sensor may be at least one of an ultrasonic sensor, a radar device, or a light detection and ranging or laser imaging detection and ranging (LIDAR) device. The imaging unit 7410 and the vehicle exterior information detection unit 7420 may be provided as independent sensors or devices, or may be provided as a device in which a plurality of sensors or devices is integrated.
Vehicle exterior information detection units 7920, 7922, 7924, 7926, 7928, and 7930 provided on the front, rear, sides, and corners of the vehicle 7900 and on the upper portion of the windshield in the vehicle interior may be, for example, ultrasonic sensors or radar devices. The vehicle exterior information detection units 7920, 7926, and 7930 provided on the front nose, the rear bumper, the rear door, and the upper portion of the windshield in the vehicle interior of the vehicle 7900 may be, for example, LIDAR devices. These vehicle exterior information detection units 7920 to 7930 are principally used for detecting a preceding vehicle, a pedestrian, an obstacle or the like.
Furthermore, the vehicle exterior information detection unit 7400 may perform image recognition processing for recognizing a person, a vehicle, an obstacle, a sign, a character on a road surface or the like, or distance detection processing, on the basis of the received image. The vehicle exterior information detection unit 7400 may perform processing such as distortion correction or alignment on the received image data, and combine the image data captured by different imaging units 7410 to generate a bird's-eye view image or a panoramic image. The vehicle exterior information detection unit 7400 may perform viewpoint conversion processing using the image data captured by the different imaging units 7410.
The vehicle interior information detection unit 7500 detects information in the vehicle. The vehicle interior information detection unit 7500 is connected to, for example, a driver state detection unit 7510 for detecting a state of a driver. The driver state detection unit 7510 may include a camera that images the driver, a biometric sensor that detects biometric information of the driver, a microphone that collects sound in the vehicle interior or the like. The biometric sensor is provided, for example, on a seat surface, a steering wheel or the like, and detects biometric information of a passenger sitting on the seat or of the driver holding the steering wheel. The vehicle interior information detection unit 7500 may calculate a driver's fatigue level or concentration level, or may determine whether the driver is dozing, on the basis of detection information input from the driver state detection unit 7510. The vehicle interior information detection unit 7500 may perform processing such as noise canceling processing on the collected audio signal.
The integrated control unit 7600 controls the overall operation in the vehicle control system 7000 according to various programs. An input unit 7800 is connected to the integrated control unit 7600. The input unit 7800 is implemented by a device that the passenger can operate for input, such as a touch panel, a button, a microphone, a switch, or a lever, for example. Data obtained by audio recognition of audio input via the microphone may be input to the integrated control unit 7600. The input unit 7800 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device such as a mobile phone or a personal digital assistant (PDA) that supports the operation of the vehicle control system 7000. The input unit 7800 may be, for example, a camera, in which case the passenger may input information by gesture. Alternatively, data obtained by detecting movement of a wearable device worn by the passenger may be input. Moreover, the input unit 7800 may include, for example, an input control circuit and the like that generates an input signal on the basis of the information input by the passenger and the like using the input unit 7800 described above and outputs the input signal to the integrated control unit 7600. The passenger and the like operate the input unit 7800 to input various data to the vehicle control system 7000 or to instruct a processing operation.
The storage unit 7690 may include a read only memory (ROM) that stores various programs executed by the microcomputer, and a random access memory (RAM) that stores various parameters, operation results, sensor values or the like. Furthermore, the storage unit 7690 may be realized by a magnetic storage device such as a hard disk drive (HDD), a semiconductor storage device, an optical storage device, a magneto-optical storage device or the like.
The general-purpose communication I/F 7620 is a general-purpose communication I/F that mediates communication with various devices present in an external environment 7750. The general-purpose communication I/F 7620 may implement a cellular communication protocol such as global system for mobile communications (GSM (registered trademark)), WiMAX (registered trademark), long term evolution (LTE (registered trademark)), or LTE-advanced (LTE-A), or other wireless communication protocols such as wireless LAN (also referred to as Wi-Fi (registered trademark)), Bluetooth (registered trademark) and the like. The general-purpose communication I/F 7620 may be connected to a device (for example, an application server or a control server) present on an external network (for example, the Internet, a cloud network, or an operator-specific network) via, for example, a base station or an access point. Furthermore, the general-purpose communication I/F 7620 may use, for example, a peer to peer (P2P) technology to connect to a terminal present in the vicinity of the vehicle (for example, a terminal of a driver, a pedestrian, or a store, or a machine type communication (MTC) terminal).
The dedicated communication I/F 7630 is a communication I/F that supports a communication protocol planned for use in a vehicle. The dedicated communication I/F 7630 may implement a standard protocol such as wireless access in vehicle environment (WAVE), which is a combination of IEEE 802.11p as a lower layer and IEEE 1609 as an upper layer, dedicated short range communications (DSRC), or a cellular communication protocol, for example. The dedicated communication I/F 7630 typically executes V2X communication, a concept including one or more of vehicle-to-vehicle communication, vehicle-to-infrastructure communication, vehicle-to-home communication, and vehicle-to-pedestrian communication.
The positioning unit 7640 receives a GNSS signal from a global navigation satellite system (GNSS) satellite (for example, a GPS signal from a global positioning system (GPS) satellite) to execute positioning, and generates positional information including the latitude, longitude, and altitude of the vehicle, for example. Note that, the positioning unit 7640 may specify a current position by exchanging signals with the wireless access point, or may obtain the positional information from a terminal such as a mobile phone, PHS, or smartphone having a positioning function.
For example, the beacon reception unit 7650 receives radio waves or electromagnetic waves transmitted from a wireless station and the like installed on the road, and obtains information such as the current position, traffic congestion, road closures, or required time. Note that the function of the beacon reception unit 7650 may be included in the dedicated communication I/F 7630 described above.
The in-vehicle device I/F 7660 is a communication interface that mediates connections between the microcomputer 7610 and various in-vehicle devices 7760 present in the vehicle. The in-vehicle device I/F 7660 may establish a wireless connection using a wireless communication protocol such as wireless LAN, Bluetooth (registered trademark), near field communication (NFC), or wireless USB (WUSB). Furthermore, the in-vehicle device I/F 7660 may establish a wired connection such as universal serial bus (USB), high-definition multimedia interface (HDMI (registered trademark)), or mobile high-definition link (MHL) via a connection terminal (and a cable if necessary) not illustrated. The in-vehicle devices 7760 may include, for example, at least one of a mobile device or a wearable device that the passenger has, or an information device carried in or attached to the vehicle. Furthermore, the in-vehicle devices 7760 may include a navigation device that searches for a route to an arbitrary destination. The in-vehicle device I/F 7660 exchanges control signals or data signals with these in-vehicle devices 7760.
The on-vehicle network I/F 7680 is an interface that mediates communication between the microcomputer 7610 and the communication network 7010. The on-vehicle network I/F 7680 transmits and receives signals and the like in accordance with a predetermined protocol supported by the communication network 7010.
The microcomputer 7610 of the integrated control unit 7600 controls the vehicle control system 7000 according to various programs on the basis of information obtained via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon reception unit 7650, the in-vehicle device I/F 7660, or the on-vehicle network I/F 7680. For example, the microcomputer 7610 may calculate a control target value of the driving force generating device, the steering mechanism, or the braking device on the basis of the obtained information inside and outside the vehicle and output a control instruction to the drive system control unit 7100. For example, the microcomputer 7610 may perform cooperative control for realizing functions of an advanced driver assistance system (ADAS) including collision avoidance or impact attenuation of the vehicle, following travel based on the distance between vehicles, vehicle speed maintaining travel, vehicle collision warning, vehicle lane departure warning or the like. Furthermore, the microcomputer 7610 may perform cooperative control for realizing automatic driving and the like, in which the vehicle travels autonomously independently of the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device or the like on the basis of the obtained information around the vehicle.
The microcomputer 7610 may generate three-dimensional distance information between the vehicle and surrounding objects such as structures and persons on the basis of information obtained via at least one of the general-purpose communication I/F 7620, the dedicated communication I/F 7630, the positioning unit 7640, the beacon reception unit 7650, the in-vehicle device I/F 7660, or the on-vehicle network I/F 7680 to create local map information including peripheral information of the current position of the vehicle. Furthermore, the microcomputer 7610 may generate a warning signal by predicting a danger such as a vehicle collision, approach of a pedestrian or the like, or entry onto a closed road on the basis of the obtained information. The warning signal may be, for example, a signal for generating a warning sound or lighting a warning light.
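The danger-prediction behavior described above can be illustrated with a minimal sketch. This is not the disclosed implementation; the function names, the time-to-collision criterion, and the 2-second threshold are all illustrative assumptions:

```python
def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until the gap to an object closes; infinite if the gap is not closing."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return distance_m / closing_speed_mps


def warning_signal(distance_m: float, closing_speed_mps: float,
                   ttc_threshold_s: float = 2.0) -> bool:
    """True requests a warning sound or warning light (hypothetical threshold)."""
    return time_to_collision(distance_m, closing_speed_mps) < ttc_threshold_s


# A vehicle 10 m ahead, closing at 10 m/s, triggers a warning;
# the same vehicle 100 m ahead does not.
print(warning_signal(10.0, 10.0))
print(warning_signal(100.0, 10.0))
```

In a real system the distance and closing speed would come from the three-dimensional distance information mentioned above (e.g., stereo camera or LIDAR data fused by the vehicle exterior information detection unit 7400).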
The audio image output unit 7670 transmits an output signal of at least one of audio or image to an output device capable of visually or audibly notifying the passenger of the vehicle or the outside of the vehicle of information.
The present technology may also have the following configuration.
<1>
An imaging lens provided with:
a first lens having a positive refractive power with a convex surface facing an object side;
a second lens having a positive refractive power;
a third lens having a negative refractive power;
a fourth lens having a positive refractive power or a negative refractive power with a concave surface facing the object side;
a fifth lens having a positive refractive power or a negative refractive power;
a sixth lens having a positive refractive power with a concave surface facing an image side; and
a seventh lens having a negative refractive power in the vicinity of an optical axis, a surface on the image side of which is formed into an aspherical shape having an inflection point,
in order from the object side to the image side.
<2>
The imaging lens according to <1>
that satisfies following conditional expression (1):
1.0<f1/f<225.0 (1)
where
f is a focal length of an entire system, and
f1 is a focal length of the first lens.
<3>
The imaging lens according to <1> or <2>
that satisfies following conditional expression (2):
0.7<f2/f<4.0 (2)
where
f is a focal length of an entire system, and
f2 is a focal length of the second lens.
<4>
The imaging lens according to any one of <1> to <3>
that satisfies following conditional expression (3):
4.0<|f45|/f (3)
where
f is a focal length of an entire system, and
f45 is a composite focal length of the fourth and fifth lenses.
<5>
The imaging lens according to any one of <1> to <4>
that satisfies following conditional expression (4):
2.8<f6/f<215.0 (4)
where
f is a focal length of an entire system, and
f6 is a focal length of the sixth lens.
<6>
The imaging lens according to any one of <1> to <5>
that satisfies following conditional expression (5):
−46.0<f6/f7<−0.3 (5)
where
f6 is a focal length of the sixth lens, and
f7 is a focal length of the seventh lens.
<7>
The imaging lens according to any one of <1> to <6>
that satisfies following conditional expression (6):
5.4<|R7/f|<220.0 (6)
where
f is a focal length of an entire system, and
R7 is a curvature radius of a surface on the object side of the fourth lens.
<8>
The imaging lens according to any one of <1> to <7>
that satisfies following conditional expression (7):
5.8<|(R11+R12)/(R11−R12)|<320.0 (7)
where
R11 is a curvature radius of a surface on the object side of the sixth lens, and
R12 is a curvature radius of a surface on the image side of the sixth lens.
<9>
The imaging lens according to any one of <1> to <8>
that satisfies following conditional expression (8):
18.0<νd(L3)<νd(L5)<νd(L6)<30.0 (8)
where
νd (L3) is an Abbe number on a d-line of the third lens,
νd (L5) is an Abbe number on a d-line of the fifth lens, and
νd (L6) is an Abbe number on a d-line of the sixth lens.
<10>
The imaging lens according to any one of <1> to <9>, in which
an aperture diaphragm is arranged on the object side of the first lens or between the first lens and the second lens.
<11>
The imaging lens according to any one of <1> to <10>, in which
the sixth lens has an inflection point on a surface on the image side.
<12>
An imaging device provided with:
an imaging lens and an imaging element that converts an optical image formed by the imaging lens into an electric signal, in which
the imaging lens is provided with:
a first lens having a positive refractive power with a convex surface facing an object side;
a second lens having a positive refractive power;
a third lens having a negative refractive power;
a fourth lens having a positive refractive power or a negative refractive power with a concave surface facing the object side;
a fifth lens having a positive refractive power or a negative refractive power;
a sixth lens having a positive refractive power with a concave surface facing an image side; and
a seventh lens having a negative refractive power in the vicinity of an optical axis, a surface on the image side of which is formed into an aspherical shape having an inflection point,
in order from the object side to the image side.
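As a reading aid for the conditional expressions above, the following sketch evaluates expressions (1) through (5) for a given set of focal lengths. The numerical values used in the example call are purely illustrative and are not design data from the disclosure:

```python
def check_conditions(f: float, f1: float, f2: float,
                     f45: float, f6: float, f7: float) -> dict:
    """Evaluate conditional expressions (1)-(5) of the configuration above.

    f   : focal length of the entire system
    f1  : focal length of the first lens
    f2  : focal length of the second lens
    f45 : composite focal length of the fourth and fifth lenses
    f6  : focal length of the sixth lens
    f7  : focal length of the seventh lens
    Returns a dict mapping expression number to True (satisfied) or False.
    """
    return {
        1: 1.0 < f1 / f < 225.0,     # (1)
        2: 0.7 < f2 / f < 4.0,       # (2)
        3: 4.0 < abs(f45) / f,       # (3)
        4: 2.8 < f6 / f < 215.0,     # (4)
        5: -46.0 < f6 / f7 < -0.3,   # (5)
    }


# Hypothetical focal lengths (in mm) that happen to satisfy (1)-(5):
print(check_conditions(f=5.0, f1=25.0, f2=6.0, f45=30.0, f6=20.0, f7=-4.0))
```

Expressions (6) through (8) are omitted here because they additionally require the curvature radii R7, R11, R12 and the d-line Abbe numbers of the third, fifth, and sixth lenses; they could be checked in the same manner.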
Number | Date | Country | Kind |
---|---|---|---|
2018-243648 | Dec 2018 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2019/045965 | 11/25/2019 | WO | 00 |