The present disclosure relates to an optical system suitable for image pickup apparatuses such as digital still cameras, digital video cameras, on-board (in-vehicle) cameras, mobile phone cameras, surveillance cameras, wearable cameras, and medical cameras.
Optical systems for image pickup apparatuses such as on-board cameras are demanded to have a wide angle of view. Japanese Patent Laid-Open No. 2010-9028 discloses a wide-angle lens with a projection characteristic close to that of an orthogonal projection method.
However, the wide-angle lens disclosed in Japanese Patent Laid-Open No. 2010-9028 does not satisfy the demand for miniaturization.
An optical system according to one aspect of the disclosure includes, in order from an object side to an image side, a front group, and a rear group. The front group includes, in order from the object side to the image side, a first lens having negative refractive power and a second lens having an aspheric surface. The rear group includes, in order from the object side to the image side, a third lens having positive refractive power, a fourth lens having positive refractive power, a fifth lens having negative refractive power, and a sixth lens having an aspheric surface. On an optical axis, an object-side surface of the first lens is convex, an object-side surface of the second lens is convex, and an object-side surface of the sixth lens is concave. An image pickup apparatus and an imaging system each having the above optical system also constitute another aspect of the disclosure.
Further features of various embodiments of the disclosure will become apparent from the following description of embodiments with reference to the attached drawings.
Referring now to the accompanying drawings, a detailed description will be given of embodiments according to the disclosure. Each drawing may be drawn at a scale different from the actual scale for convenience. Corresponding elements in respective figures will be designated by the same reference numerals, and a duplicate description thereof will be omitted.
Optical systems for image pickup apparatuses such as on-board cameras are demanded to have a wide angle of view, and fisheye lenses are mainly used. Known projection methods for fisheye lenses include an orthogonal projection method, an equidistant projection method, and a stereographic projection method. Each projection method is expressed by one of the following expressions:
Orthogonal projection method: Y=f×sin θ
Equidistant projection method: Y=f×θ
Stereographic projection method: Y=2×f×tan(θ/2)
where Y is an image height on a projection plane, f is a focal length of the entire optical system, and θ is a half angle of view.
The orthogonal projection method has a characteristic of more strongly compressing an image at a periphery of a screen relative to an image near an optical axis. The equidistant projection method has a constant resolution regardless of an angle of view. The stereographic projection method has a characteristic of more strongly compressing an image around the optical axis than an image at the periphery of the screen, which is opposite to the orthogonal projection method.
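As an illustrative sketch (not part of the disclosed examples), the compression behavior of the three projection methods above can be checked numerically by comparing the local resolution dY/dθ, i.e. how much image height one unit of half angle of view occupies; the focal length is normalized to 1:

```python
import math

def orthogonal(f, theta):
    """Orthogonal projection: Y = f * sin(theta)."""
    return f * math.sin(theta)

def equidistant(f, theta):
    """Equidistant projection: Y = f * theta."""
    return f * theta

def stereographic(f, theta):
    """Stereographic projection: Y = 2 * f * tan(theta / 2)."""
    return 2.0 * f * math.tan(theta / 2.0)

def local_resolution(proj, f, theta, dtheta=1e-6):
    """Central-difference estimate of dY/dtheta, the local resolution."""
    return (proj(f, theta + dtheta) - proj(f, theta - dtheta)) / (2.0 * dtheta)

f = 1.0  # normalized focal length
for deg in (10, 40, 80):
    th = math.radians(deg)
    # Orthogonal: dY/dtheta = f*cos(theta), falls toward the periphery.
    # Equidistant: dY/dtheta = f, constant over the field.
    # Stereographic: dY/dtheta = f/cos(theta/2)^2, rises toward the periphery.
    print(deg,
          round(local_resolution(orthogonal, f, th), 3),
          round(local_resolution(equidistant, f, th), 3),
          round(local_resolution(stereographic, f, th), 3))
```

The falling, constant, and rising dY/dθ curves correspond directly to the compression characteristics described above.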
A detailed description will be given of the characteristic of the optical system according to each example.
The optical system according to each example includes, in order from the object side to the image side, a front group G1 having negative refractive power, and a rear group G2 having positive refractive power. Here, the cover glass CG is not regarded as part of each optical system. The front group G1 includes, in order from the object side to the image side, a first lens L1 having negative refractive power, and a second lens L2 having an aspheric surface. An object-side surface of the first lens L1 is a convex surface. The second lens L2 has negative refractive power near the optical axis OA (on the optical axis). An object-side surface of the second lens L2 is a convex surface near the optical axis.
The rear group G2 includes, in order from the object side to the image side, a third lens L3 having positive refractive power, a fourth lens L4 having positive refractive power, a fifth lens L5 having negative refractive power, and a sixth lens L6 having an aspheric surface. The fourth lens L4 and the fifth lens L5 are cemented together to form a cemented lens CL. An object-side surface of the sixth lens L6 is a concave surface near the optical axis. Here, a lens means an optical element with refractive power, and does not include optical elements such as parallel plate glass that do not have refractive power. The rear group G2 further includes an aperture stop STO.
In the optical system according to each example, the second lens L2, disposed at a position away from the aperture stop STO, is an aspheric lens. By adopting the above lens configuration, each example realizes a wide angle of view together with a projection characteristic that provides high resolution in the central area of the screen (near the optical axis). Configuring the rear group G2 as described above can satisfactorily correct spherical aberration, on-axis aberration, and curvature of field that occur in a case where the optical system has a wide angle of view.
In each example, the following inequality (1) may be satisfied:
where f3 is a focal length of the third lens L3, and f is a focal length of the optical system (entire system).
In order to reduce the overall length of the optical system, it is effective to increase the power of a lens having positive refractive power disposed at a position where the on-axis ray height is high. In each example, since the first lens L1 and the second lens L2 have negative refractive power, the on-axis ray diverges and thus its incident height on the third lens L3 is high. Properly setting the positive refractive power of the third lens L3 can therefore reduce the overall length of the optical system. In a case where f3/f becomes higher than the upper limit of inequality (1), the positive refractive power of the third lens L3 becomes weak, and the effect of reducing the overall length is reduced. On the other hand, in a case where f3/f becomes lower than the lower limit of inequality (1), the positive refractive power of the third lens L3 becomes too strong, and it becomes difficult to correct curvature of field.
Inequality (1) may be replaced with inequality (1a) or (1b) below:
The optical system according to each example can obtain the effects of the present disclosure as long as it satisfies at least the above configuration, and the front group G1 may have a lens other than the first lens L1 and the second lens L2 (three or more lenses).
In each example, inequality (2) below may be satisfied:
where θ [degree] is a half angle of view of the optical system, y(θ) is a projection characteristic that expresses a relationship between the half angle of view θ and an image height y, and θ max is a maximum half angle of view of the optical system.
Details will be described later, but configuring the optical system to satisfy inequality (2) can increase the resolution of an object image at an angle of view near the optical axis OA (near the optical axis) while a wide angle of view is maintained.
Inequality (2) may be replaced with inequality (2a) or (2b) below:
In each example, the following inequality (3) may be satisfied:
A ratio of the image height y(θ max) at a maximum half angle of view θ max to an image height y(θ max/2) at an angle of view θ max/2, which is half the maximum half angle of view θ max, is set within the range of inequality (3). Thereby, the resolution of an object image can be increased at an angle of view near the optical axis OA while a wide angle of view is maintained.
Inequality (3) may be replaced with inequality (3a) or (3b) below:
An image pickup apparatus such as an on-board camera described later is demanded to have a wide angle of view, and to increase an imaging magnification near the optical axis (central area). For example, in a case where the image pickup apparatus is disposed at the rear of a movable apparatus (vehicle), an image corresponding to the central area, which is the main target area (reference area or area of interest), may be enlarged and displayed on an electronic rearview mirror, and an entire image including an area other than the central area (peripheral area) may be displayed on the in-vehicle display. Thus, the imaging magnification (focal length) of the optical system may be different between the central area and the other areas.
The object-side surface (lens object-side surface) of the second lens L2, which is an aspheric lens, can largely refract a light ray from the peripheral portion in the radial direction of a light beam from the first lens L1 toward the optical axis OA. Thereby, it becomes easy to make the imaging magnification different between the central area and the peripheral area of the optical system. In this case, the object-side surface of the second lens L2 may have an inflection point in a section having the optical axis OA, as illustrated in
The following inequality (4) may be satisfied:
where t23 is a distance (gap) on the optical axis from the image-side surface of the second lens L2 to the object-side surface of the third lens L3, and fR is a focal length of the rear group.
Generally, in a case where an optical system is divided into a front group and a rear group, the power (refractive power) Φ of the optical system (entire system) is expressed by the following equation (5):
Φ=ΦF+ΦR−d×ΦF×ΦR  (5)
where ΦF (=1/fF) is the power of the front group, ΦR (=1/fR) is the power of the rear group, fF is a focal length of the front group, and d is a distance on the optical axis between the front group and the rear group.
In each example, the refractive power of the front group G1 is negative, and the refractive power of the rear group G2 is positive. Thus, the third term of equation (5) is positive, and the value of the third term increases as the distance between the front group G1 and the rear group G2 increases. In setting the power (or focal length) of the optical system (entire system) to a desired value, as the distance between the front group G1 and the rear group G2 increases, the value of the third term increases, and thus it is not necessary to excessively strengthen the power of the rear group G2 having positive refractive power. This is beneficial to aberration correction.
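The sign argument for the third term can be checked numerically. A minimal sketch, assuming the thin-group combination formula Φ = ΦF + ΦR − d×ΦF×ΦR and using group focal lengths taken from Example 1 (−5.01 mm for G1, 5.53 mm for G2); the distance values are illustrative:

```python
def combined_power(phi_f, phi_r, d):
    """Power of two groups separated by distance d on the axis (thin-group model):
    Phi = Phi_F + Phi_R - d * Phi_F * Phi_R."""
    return phi_f + phi_r - d * phi_f * phi_r

phi_f = 1.0 / -5.01  # front group G1 (negative), Example 1 value
phi_r = 1.0 / 5.53   # rear group G2 (positive), Example 1 value

for d in (0.5, 1.0, 2.0):
    # With Phi_F < 0 < Phi_R, the third term -d*Phi_F*Phi_R is positive
    # and grows with d, so less positive power is demanded of the rear group.
    third_term = -d * phi_f * phi_r
    print(d, round(third_term, 4), round(combined_power(phi_f, phi_r, d), 4))
```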
In a case where t23/fR becomes lower than the lower limit of inequality (4), the distance between the front group G1 and the rear group G2 becomes insufficient, the refractive power of the rear group G2 becomes too strong, and aberration correction becomes difficult. On the other hand, in a case where t23/fR becomes higher than the upper limit of inequality (4), the distance between the front group G1 and the rear group G2 becomes too wide, and the size of the optical system increases.
Inequality (4) may be replaced with (4a) or (4b) below:
The following inequality (6) may be satisfied:
where f4 is a focal length of the fourth lens L4.
The third lens L3 and the fourth lens L4, which have positive refractive power, can suppress the occurrence of aberrations in each lens by sharing the optical power. In a case where f3/f4 becomes lower than the lower limit or higher than the upper limit of inequality (6), the power becomes biased toward one of the lenses, and it becomes difficult to correct spherical aberration and astigmatism.
Inequality (6) may be replaced with (6a) or (6b) below:
The following inequality (7) may be satisfied:
where ν3 is an Abbe number based on the d-line of the third lens L3.
In a case where ν3 becomes lower than the lower limit of inequality (7), lateral chromatic aberration occurs too strongly, and aberration correction becomes difficult. On the other hand, in a case where ν3 becomes higher than the upper limit of inequality (7), the material has a small dispersion, and manufacturing becomes difficult.
Inequality (7) may be replaced with (7a) or (7b) below:
The object-side surface of the sixth lens L6 is aspheric, and it may not have an extreme value in the radial direction in a section having the optical axis OA. As illustrated in
The tilt of the object-side surface of the sixth lens L6 (tilt relative to a plane perpendicular to the optical axis OA) may increase monotonically in the radial direction in a section having the optical axis OA. Off-axis rays from the fifthth lens L5 at higher image heights are more likely to be obliquely incident on this surface, and as illustrated in
The image-side surface of the sixth lens L6 may be aspheric and have no extreme values in the radial direction in a section having the optical axis OA. As illustrated in
The tilt of the image-side surface of the sixth lens L6 may monotonically increase in the radial direction in a section having the optical axis OA. Off-axis rays from the object-side surface of the sixth lens L6 are obliquely incident on this surface, but as illustrated in
The following inequality (8) may be satisfied:
where R31 is a radius of curvature of an object-side surface of the third lens L3, and R32 is a radius of curvature of the image-side surface of the third lens L3.
In a case where (R32+R31)/(R32−R31) becomes higher than the upper limit of inequality (8), the spherical aberration generated by the third lens L3 is not shared between the object-side surface and the image-side surface, so higher-order aberrations occur more easily and the surfaces become sensitive to manufacturing errors. On the other hand, in a case where (R32+R31)/(R32−R31) becomes lower than the lower limit of inequality (8), coma occurs more strongly and becomes difficult to correct.
Inequality (8) may be replaced with (8a) or (8b) below:
The following inequality (9) may be satisfied:
where f1 is a focal length of the first lens L1.
In a case where f1/f becomes higher than the upper limit of inequality (9), the negative refractive power of the first lens L1 becomes too strong, and various aberrations excessively occur in the first lens L1. On the other hand, in a case where f1/f becomes lower than the lower limit of inequality (9), the negative refractive power of the first lens L1 becomes too weak, pupil aberration is insufficiently suppressed, and the amount of off-axis light that can be captured is reduced.
Inequality (9) may be replaced with (9a) or (9b) below:
The image pickup apparatus includes an optical system according to each example configured to form an object image, and an image sensor configured to photoelectrically convert an object image (capture the object via the optical system). A plurality of pixels are two-dimensionally arranged on the imaging surface of the image sensor.
A description will be given of a detailed configuration of the optical system according to each example.
An optical system 100 according to Example 1 illustrated in
The imaging surface of an image sensor such as a Complementary Metal-Oxide-Semiconductor (CMOS) sensor is disposed on the image plane IMG, and a cover glass CG for the image sensor is provided. In the image pickup apparatus, image data is generated from an output of the image sensor.
The fourth lens L4 and the fifth lens L5 are cemented together to form a cemented lens CL.
The object-side surface of the second lens L2 is convex near the optical axis OA (on the optical axis), and the image-side surface is concave. As illustrated in
The first lens L1 and the second lens L2 constitute a front group G1, and the focal length of the front group G1 is −5.01 mm. The third lens L3 to the sixth lens L6 constitute the rear group G2, and the focal length of the rear group G2 is 5.53 mm.
As illustrated in
Numerical example 1 illustrates specific numerical values of the optical system 100 according to this example. Table 1 illustrates the numerical values of each inequality. The optical system 100 according to this example satisfies inequalities (1) to (4) and (6) to (9).
An optical system 200 according to Example 2 illustrated in
The imaging surface of an image sensor such as a CMOS sensor is disposed on the image plane IMG, and a cover glass CG for the image sensor is provided. In the image pickup apparatus, image data is generated from an output of the image sensor.
The fourth lens L4 and the fifth lens L5 are cemented together to form a cemented lens CL.
The object-side surface of the second lens L2 is convex near the optical axis, and the image-side surface is concave. As illustrated in
The first lens L1 and the second lens L2 constitute the front group G1, and the focal length of the front group G1 is −5.52 mm. The third lens L3 to the sixth lens L6 constitute the rear group G2, and the focal length of the rear group G2 is 6.75 mm.
As illustrated in
Numerical example 2 illustrates specific numerical values of the optical system 200 according to this example. Table 1 illustrates the numerical values of each inequality. The optical system 200 according to this example satisfies inequalities (1) to (4) and (6) to (9).
An optical system 300 according to Example 3 illustrated in
The imaging surface of an image sensor such as a CMOS sensor is disposed on the image plane IMG, and a cover glass CG for the image sensor is provided. In the image pickup apparatus, image data is generated from an output of the image sensor.
The fourth lens L4 and the fifth lens L5 are cemented together to form a cemented lens CL.
The object-side surface of the second lens L2 is a convex surface near the optical axis, and the image-side surface is a concave surface. As illustrated in
The first lens L1 and the second lens L2 constitute the front group G1, and the focal length of the front group G1 is −5.07 mm. The third lens L3 to the sixth lens L6 constitute the rear group G2, and the focal length of the rear group G2 is 5.58 mm.
As illustrated in
Numerical example 3 illustrates specific numerical values of the optical system 300 according to this example. Table 1 illustrates the numerical values of each inequality. The optical system 300 according to this example satisfies inequalities (1) to (4) and (6) to (9).
An optical system 400 according to Example 4 illustrated in
The image pickup apparatus includes an imaging surface of an image sensor such as a CMOS sensor disposed on the image plane IMG, and a cover glass CG of the image sensor. Image data is generated from an output of the image sensor in the image pickup apparatus.
The fourth lens L4 and the fifth lens L5 are cemented together to form a cemented lens CL.
The object-side surface of the second lens L2 is convex near the optical axis, and the image-side surface is concave. As illustrated in
The first lens L1 and the second lens L2 constitute the front group G1, and the focal length of the front group G1 is −6.53 mm. The third lens L3 to the sixth lens L6 constitute the rear group G2, and the focal length of the rear group G2 is 5.59 mm.
As illustrated in
Numerical example 4 illustrates specific numerical values of the optical system 400 according to this example. Table 1 illustrates the numerical values of each inequality. The optical system 400 according to this example satisfies the inequalities (1) to (4) and (6) to (9).
An optical system 500 according to Example 5 illustrated in
The imaging surface of an image sensor such as a CMOS sensor is disposed on the image plane IMG, and a cover glass CG for the image sensor is provided. In the image pickup apparatus, image data is generated from an output of the image sensor.
The fourth lens L4 and the fifth lens L5 are cemented together to form a cemented lens CL.
The object-side surface of the second lens L2 is convex near the optical axis, and the image-side surface is concave. As illustrated in
The first lens L1 and the second lens L2 constitute the front group G1, and the focal length of the front group G1 is −6.16 mm. The third lens L3 to the sixth lens L6 constitute the rear group G2, and the focal length of the rear group G2 is 5.39 mm.
As illustrated in
Numerical example 5 illustrates specific numerical values of the optical system 500 according to this example. Table 1 illustrates the numerical values of each inequality. The optical system 500 according to this example satisfies inequalities (1) to (4) and (6) to (9).
A description will now be given of numerical examples 1 to 5 corresponding to Examples 1 to 5, respectively. In each numerical example, i represents the order of a surface (optical surface) counted from the object side. ri represents a radius of curvature of an i-th surface (unit: mm), di represents a distance between i-th and (i+1)-th surfaces (unit: mm), and ndi and νdi represent a refractive index and an Abbe number of the i-th optical member based on the d-line (wavelength 587.6 nm). The Abbe number νd is defined by the following equation:
νd=(nd−1)/(nF−nC)
where nF, nd, and nC are refractive indices for the F-line, d-line, and C-line, respectively.
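As a short check of the definition νd = (nd − 1)/(nF − nC), using published catalog indices for Schott N-BK7 (an illustrative glass, not one from the numerical examples):

```python
def abbe_number(nd, nF, nC):
    """Abbe number based on the d-line: nu_d = (nd - 1) / (nF - nC)."""
    return (nd - 1.0) / (nF - nC)

# Catalog refractive indices for Schott N-BK7 (illustrative values).
nd, nF, nC = 1.51680, 1.52238, 1.51432
print(round(abbe_number(nd, nF, nC), 1))  # ~64.1, consistent with the catalog nu_d
```

A larger νd means smaller dispersion, which is why the upper bound of inequality (7) corresponds to low-dispersion materials.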
The surface distance (or separation) is positive if the direction moves toward the image side along the optical path, and the surface distance is negative if the direction moves toward the object side. In numerical examples 1 to 5, the two surfaces closest to the image plane are flat surfaces that correspond to the optical block.
In a case where an optical surface is aspheric, an asterisk “*” is added to the right of the surface number. Each of the aspheric optical surfaces in this example is rotationally symmetric about the optical axis, and is expressed by the following aspheric expression:
where z is a sag amount (mm) of an aspheric shape in the optical axis direction, c is a curvature (1/mm) on the optical axis OA, k is a cone coefficient, h is a distance (mm) in the radial direction from the optical axis OA, and A, B, C, . . . are aspheric coefficients of the fourth order term, sixth order term, eighth order term, . . . respectively. In this aspheric equation, the first term indicates the sag amount of a base sphere, and a radius of curvature of this base sphere is R=1/c. The second and subsequent terms indicate the sag amount of the aspheric component applied to the base sphere. In each numerical example, "E±P" means "×10±P."
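The even-asphere sag described above, z = c·h²/(1 + √(1 − (1+k)·c²·h²)) + A·h⁴ + B·h⁶ + …, can be sketched as follows; the surface parameters below are illustrative and are not taken from the numerical examples:

```python
import math

def sag(c, k, coeffs, h):
    """Sag z(h) of a rotationally symmetric even aspheric surface:
    z = c*h^2 / (1 + sqrt(1 - (1+k)*c^2*h^2)) + A*h^4 + B*h^6 + ...
    c: on-axis curvature (1/mm), k: cone (conic) coefficient,
    coeffs: (A, B, C, ...) for the 4th, 6th, 8th, ... order terms,
    h: radial distance (mm) from the optical axis OA."""
    base = c * h * h / (1.0 + math.sqrt(1.0 - (1.0 + k) * c * c * h * h))
    aspheric = sum(a * h ** (4 + 2 * i) for i, a in enumerate(coeffs))
    return base + aspheric

# Illustrative surface: base radius R = 10 mm, parabolic base (k = -1).
c = 1.0 / 10.0
k = -1.0
coeffs = (1.0e-5, -2.0e-8)
print(round(sag(c, k, coeffs, 2.0), 6))  # 0.200159
```

With k = 0 and no higher-order coefficients, the first term reduces exactly to the spherical sag R − √(R² − h²), which is a convenient sanity check.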
The optical system according to each numerical example is a single focus optical system in which a focal length is fixed (no zooming is performed) and no focusing is performed. In other words, a distance between the lenses in the optical system according to each numerical example is always fixed. Thereby, fluctuations in optical performance accompanying lens movement can be avoided. However, if necessary, the optical system may be configured to perform at least one of zooming and focusing, and a distance between the lenses may be configured to change for this purpose.
In a case where the image pickup apparatus 70 is used as a distance measuring apparatus, for example, an image sensor (imaging-surface phase-difference sensor) having pixels that can split a light beam from the object side into two and photoelectrically convert it can be used as the light receiving element 72. In a case where the object is located on a front focal plane of the optical system 71, no positional shift occurs between images corresponding to the two split light beams on the image plane of the optical system 71. However, in a case where the object is located at a position other than the front focal plane of the optical system 71, a positional shift occurs between the images. In this case, a positional shift of each image corresponds to a displacement amount from the front focal plane of the object, so a distance to the object can be measured by acquiring a positional shift amount and a positional shift direction of each image using an imaging-surface phase-difference sensor.
The optical system 71 and the camera body 73 may be attachable to and detachable from each other. That is, the optical system 71 and the lens barrel may be configured as an interchangeable lens (lens apparatus). The optical system according to each of the above examples can be applied not only to image pickup apparatuses such as digital still cameras, film-based cameras, video cameras, on-board cameras, and surveillance cameras, but also to various optical apparatuses such as telescopes, binoculars, projectors (projection apparatuses), and digital copiers.
A detailed description will now be given of the optical characteristic of the optical system 201. A left diagram in
As illustrated in
In the left diagram of
The optical system 201 is configured such that the projection characteristic y(θ) in the first area 201a is different from that of the equidistant projection method and is also different from that of the second area 201b. In this case, the projection characteristic y(θ) of the optical system 201 may satisfy inequality (2).
Satisfying inequality (2) can reduce the resolution in the second area 201b, and thereby realize a wide angle of view of the optical system 201. The resolution can be higher in the first area 201a than in the central area of a general fisheye lens that employs the orthogonal projection method. In a case where the value becomes lower than the lower limit of inequality (2), the resolution in the first area 201a becomes lower than that of a fisheye lens that employs the orthogonal projection method, or a maximum image height becomes larger. In a case where the value becomes higher than the upper limit of inequality (2), the resolution in the first area 201a becomes too high, and it becomes difficult to achieve a wide angle of view equivalent to that of a fisheye lens that employs the orthogonal projection method, or good optical performance cannot be maintained.
As described above, in the first area 201a, the distortion of the optical system 201 is small and the resolution is high, so that a higher definition image can be obtained than that in the second area 201b. Therefore, good visibility can be obtained by setting the first area 201a (first angle of view 30) to be a target area of the user 40. For example, as illustrated in
The processing apparatus 220 includes an image processing unit 221, a display-angle-of-view (DAV) determining unit 224, a user setting change unit 226 (first change unit), a rear vehicle distance detector 223 (first detector), a reverse gear detector 225 (second detector), and a DAV change unit 222 (second change unit). The processing apparatus 220 is a computer such as a microcomputer including a Central Processing Unit (CPU), and functions as a control unit that controls the operation of each component based on a computer program. At least one of the components of the processing apparatus 220 may be realized by hardware such as an Application Specific Integrated Circuit (ASIC) or a Programmable Logic Array (PLA).
The image processing unit 221 generates image data by performing image processing such as Wide Dynamic Range (WDR) correction, gamma correction, Look Up Table (LUT) processing, and distortion correction on the image data acquired from the imaging unit 210. The distortion correction is performed on at least the image data corresponding to the second area 201b. Thereby, the user 40 can more easily recognize the image when it is displayed on the display apparatus 230, and the detection rate of the rear vehicle in the rear vehicle distance detector 223 also improves. The distortion correction does not have to be performed on the image data corresponding to the first area 201a. The image processing unit 221 outputs the image data generated by executing the image processing as described above to the DAV change unit 222 and the rear vehicle distance detector 223.
The rear vehicle distance detector 223 acquires information on a distance to a rear vehicle included in the image data corresponding to a range of the second angle of view 31 that does not include the first angle of view 30, using the image data output from the image processing unit 221. For example, the rear vehicle distance detector 223 can detect a rear vehicle based on image data corresponding to the second area 201b among the image data, and calculate a distance to the rear vehicle from changes in the position and size of the detected rear vehicle. The rear vehicle distance detector 223 outputs information on the calculated distance to the DAV determining unit 224.
The rear vehicle distance detector 223 may further determine a vehicle type of the rear vehicle based on data on characteristic information such as a shape and color for each vehicle type, output as a result of machine learning (deep learning) based on images of a large number of vehicles. At this time, the rear vehicle distance detector 223 may output information on the vehicle type of the rear vehicle to the DAV determining unit 224. The reverse gear detector 225 detects whether the transmission of the movable apparatus 10 (user's vehicle) is in the reverse gear, and outputs the detection result to the DAV determining unit 224.
The DAV determining unit 224 determines whether the angle of view (display angle of view) of the image to be displayed on the display apparatus 230 is to be the first angle of view 30 or the second angle of view 31 based on an output from at least one of the rear vehicle distance detector 223 and the reverse gear detector 225. Then, the DAV determining unit 224 outputs a predetermined result to the DAV change unit 222 according to the determination result. For example, the DAV determining unit 224 can determine that the display angle of view is to be the second angle of view 31 in a case where a distance value in the distance information is equal to or smaller than a certain threshold value (e.g., 3 m), and can determine that the display angle of view is to be the first angle of view 30 in a case where the distance value is larger than the threshold value. Alternatively, the DAV determining unit 224 can determine that the display angle of view is to be the second angle of view 31 in a case where the reverse gear detector 225 detects that the transmission of the movable apparatus 10 is in the reverse gear. The DAV determining unit 224 can determine that the display angle of view is to be the first angle of view 30 in a case where the vehicle is not in the reverse gear.
The DAV determining unit 224 can determine that the display angle of view is to be the second angle of view 31 in a case where the transmission of the movable apparatus 10 is in the reverse gear, regardless of the result of the rear vehicle distance detector 223. The DAV determining unit 224 can determine that the display angle of view is to be determined according to the detection result of the rear vehicle distance detector 223 in a case where the transmission of the movable apparatus 10 is not in the reverse gear. The DAV determining unit 224 may change the determination criterion for changing the angle of view according to the vehicle type of the movable apparatus 10 by receiving vehicle type information from the rear vehicle distance detector 223. For example, in a case where the movable apparatus 10 is a large vehicle such as a truck, its braking distance is longer than that of a standard vehicle, so the above threshold value may be set larger than that of the standard vehicle (for example, 10 m).
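The decision logic of the DAV determining unit 224 described above can be sketched as follows; the function, its parameter names, and the return values are illustrative, while the 3 m and 10 m thresholds follow the examples given in the text:

```python
def determine_display_angle(reverse_gear, rear_vehicle_distance_m=None,
                            vehicle_type="standard"):
    """Illustrative sketch of the DAV determining unit 224's decision.
    Returns "second" for the wide second angle of view 31, or
    "first" for the first angle of view 30."""
    if reverse_gear:
        # Reverse gear forces the wide view regardless of the distance result.
        return "second"
    # Larger vehicles have longer braking distances, so a larger threshold.
    threshold_m = 10.0 if vehicle_type == "truck" else 3.0
    if rear_vehicle_distance_m is not None and rear_vehicle_distance_m <= threshold_m:
        return "second"
    return "first"

print(determine_display_angle(False, 5.0))           # first
print(determine_display_angle(False, 2.0))           # second
print(determine_display_angle(True))                 # second
print(determine_display_angle(False, 5.0, "truck"))  # second
```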
The user setting change unit 226 allows the user 40 to change the determination criteria for determining whether or not the display angle of view is changed to the second angle of view 31 by the DAV determining unit 224. The determination criteria set (changed) by the user 40 are input from the user setting change unit 226 to the DAV determining unit 224. The DAV change unit 222 generates a display image to be displayed on the display apparatus 230 according to the determination result by the DAV determining unit 224. For example, in a case where it is determined that the first angle of view 30 is to be used, the DAV change unit 222 cuts out a rectangular image (first image) from the image data corresponding to the first angle of view 30 and outputs it to the display apparatus 230. In a case where a rear vehicle that satisfies a predetermined condition is present in image data corresponding to the second angle of view 31, the DAV change unit 222 outputs an image (second image) including the rear vehicle to the display apparatus 230. The second image may include an image corresponding to the first area 201a. The DAV change unit 222 functions as a display control unit configured to perform display control for switching between a first display state in which the display apparatus 230 displays a first image and a second display state in which the display apparatus 230 displays a second image.
The DAV change unit 222 cuts out an image by storing the image data output from the image processing unit 221 in a storage unit (memory) such as a RAM, and by reading out the image to be cut out from there. An area in the image data that corresponds to the first image is a rectangular area at the first angle of view 30 that corresponds to the first area 201a. An area in the image data that corresponds to the second image is a rectangular area including the rear vehicle at the second angle of view 31 that corresponds to the second area 201b.
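A minimal sketch of this cut-out step, assuming the image data is buffered as a NumPy array read back from memory; the rectangle coordinates are hypothetical and stand in for the areas corresponding to the first area 201a and the second area 201b.

```python
import numpy as np

def cut_out(frame: np.ndarray, rect: tuple) -> np.ndarray:
    """Read a rectangular region (x, y, width, height) out of the buffered frame."""
    x, y, w, h = rect
    return frame[y:y + h, x:x + w]

# Stand-in for image data stored in a storage unit (memory) such as a RAM.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)

# Hypothetical rectangles: the first image corresponds to the first area 201a;
# the second image is a region including the rear vehicle in the second area 201b.
first_image = cut_out(frame, (480, 270, 960, 540))
second_image = cut_out(frame, (0, 0, 1920, 1080))
```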
The display apparatus 230 includes a display unit such as a liquid crystal display or an organic EL display, and displays a display image output from the DAV change unit 222. For example, the display apparatus 230 includes a first display unit as an electronic rearview mirror disposed above the windshield (front glass) of the movable apparatus 10, and a second display unit as an operation panel (monitor) disposed below the windshield of the movable apparatus 10. This configuration can display the first image and the second image generated from the image data described above on the first display unit and the second display unit, respectively. The first display unit may include a half-mirror so that it can function as a mirror when not used as a display unit. The second display unit may serve as a display unit for a navigation system or an audio system, for example.
The movable apparatus 10 is not limited to a vehicle such as an automobile, but may be a movable unit such as a ship, an aircraft, an industrial robot, or a drone. The on-board system 2 according to this embodiment is used to display an image to the user 40, but is not limited to this example. For example, the on-board system 2 may also be used for driving assistance such as cruise control (including an adaptive cruise control function) and automatic driving. The on-board system 2 is not limited to a movable unit and is applicable to various devices that use object recognition, such as an intelligent transport system (ITS).
While the disclosure has been described with reference to example embodiments, it is to be understood that the disclosure is not limited to the disclosed embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
For example, the optical system according to each example is assumed to be used in the visible range and is configured to perform good aberration correction in the entire visible range, but the wavelength range in which the aberration correction is performed may be changed, as necessary. For example, each optical system may be configured to perform aberration correction only in a specific wavelength range in the visible range, or may be configured to perform aberration correction in a wavelength range in the infrared range other than the visible range.
The on-board system 2 may employ the distance measuring apparatus described above as the image pickup apparatus 20. In this case, the on-board system 2 may include a determining unit configured to determine a collision likelihood with an object based on information on a distance to the object acquired by the image pickup apparatus 20. A stereoscopic camera having two imaging units 210 may be adopted as the image pickup apparatus 20. In this case, even if an imaging-surface phase-difference sensor is not used, image data can be simultaneously acquired by each of the synchronized imaging units, and the two sets of image data can be used to perform the same processing as described above. As long as the difference in imaging time between the imaging units is known, the imaging units do not need to be synchronized.
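With a stereoscopic camera, the distance to an object can be recovered from the disparity between the two images using the standard pinhole-stereo relation Z = f·B/d. The sketch below illustrates this relation with assumed values; the function name and parameters are not part of the disclosure.

```python
def stereo_distance(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z from the pinhole-stereo relation Z = f * B / d,
    where f is the focal length in pixels, B the baseline between the
    two imaging units in meters, and d the disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Assumed values: 1000 px focal length, 0.2 m baseline, 25 px disparity
# yield a distance of 8.0 m to the object.
z = stereo_distance(1000.0, 0.2, 25.0)
```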
Each example can provide an optical system that has a reduced size and a wide angle of view.
This application claims priority to Japanese Patent Application No. 2023-204465, which was filed on Dec. 4, 2023, and which is hereby incorporated by reference herein in its entirety.