One of the aspects of the embodiments relates to an optical apparatus that receives light reflected from an illuminated object and detects the object.
Light Detection and Ranging (LiDAR) is one known method for measuring a distance to an object by calculating the distance from the time it takes to receive reflected light from an illuminated object or a phase of the reflected light. Japanese Patent Laid-Open No. 2009-098111 discloses a configuration that includes a generator for generating a laser beam, a rotation deflector for rotating around a central axis, and a direction changer for deflecting a laser beam relative to a direction of the central axis. In the configuration disclosed in Japanese Patent Laid-Open No. 2009-098111, the laser beam scans the object by the direction changer and the rotation deflector, and the reflected light from the object passes through the rotation deflector and is guided to a photodetector without passing through the direction changer.
In the configuration disclosed in Japanese Patent Laid-Open No. 2009-098111, the reflected light from the object reaches different positions on the photodetector for each angle defined by the direction changer, so the light receiving size of the photodetector becomes large and unnecessary light other than reflected light is also received. Thereby, noise components increase, the distance measurement accuracy decreases, and this accuracy decrease becomes remarkable as the distance to the object increases.
An optical apparatus according to one aspect of the disclosure includes a deflector configured to scan an object by deflecting illumination light from a light source unit and to deflect reflected light from the object, and a light receiver configured to receive the reflected light from the deflector. The deflector includes a first scanning element configured to scan the object in a first direction by deflecting the illumination light from the light source unit, and a second scanning element configured to scan the object in a second direction by deflecting the illumination light from the first scanning element. The light receiver includes a plurality of light-receiving elements arranged along a direction corresponding to the first direction. An on-board system and a moving apparatus each having the above optical apparatus also constitute other aspects of the disclosure.
Further features of the disclosure will become apparent from the following description of embodiments with reference to the attached drawings.
Referring now to the accompanying drawings, a detailed description will be given of embodiments according to the disclosure. Corresponding elements in respective figures will be designated by the same reference numerals, and a duplicate description thereof will be omitted.
An optical apparatus using LiDAR includes an illumination system configured to illuminate an object (target) and a light receiving system configured to receive reflected light and scattered light from the object. LiDAR includes a coaxial system in which the optical axes of the illumination system and the light receiving system partially coincide with each other, and a non-coaxial system in which the optical axes of the illumination system and the light receiving system do not coincide with each other. In each example, a non-coaxial LiDAR is used as the optical apparatus, but a coaxial LiDAR may also be used.
The optical apparatus according to each example is used, for example, as an automatic driving support system in a vehicle such as an automobile. The object is, for example, a pedestrian, an obstacle, a vehicle, etc., and approximately 1 to 300 meters away. The optical apparatus according to each example measures the distance to the object, and controls the direction and speed of the vehicle based on the measurement result.
The coordinate system in this example is determined as illustrated in
The optical apparatus 1 can be used as a detection apparatus (image pickup apparatus) that detects (images) the object 100 by receiving the reflected light 61 or as a distance measuring apparatus that obtains the distance (distance information) to the object 100. The optical apparatus 1 uses a technology called Light Detection and Ranging (LiDAR) that calculates the distance to the object 100 based on the time it takes to receive the reflected light 61 or the phase of the reflected light 61.
The light source unit 10 includes a light source 11 and a collimator lens 12. The light source 11 uses a semiconductor laser, which is a laser with high energy concentration and excellent directivity, or a vertical cavity surface emitting laser (VCSEL). In a case where the optical apparatus 1 is applied to an on-board system (in-vehicle system) as described below, the object 100 may be a person. Therefore, the light source 11 may use an infrared light emitter because infrared light has little influence on human eyes. In this example, the wavelength of the illumination light 60 emitted by the light source 11 is 905 nm, which is included in the near-infrared region.
The illumination light (divergent light) 60 emitted from the light source 11 is collimated by the collimator lens 12 and becomes parallel light. The parallel light here includes not only a strictly parallel light beam but also weakly diverging light and weakly converging light. After passing through the collimator lens 12, the illumination light 60 travels to the deflector 20.
The deflector 20 includes a first scanning element 21 and a second scanning element 22. The first scanning element 21 is rotatable about a rotation axis parallel to the Z-axis. The second scanning element 22 is rotatable about a rotation axis parallel to the Y-axis.
The first scanning element 21 scans the object 100 in the first direction (the Y-axis direction in this example) by deflecting the illumination light 60 from the light source unit 10 that travels in the +Y-axis direction. While a galvano mirror is used as the first scanning element 21 in this example, a Micro Electro Mechanical System (MEMS) mirror or the like may also be used.
The second scanning element 22 scans the object 100 in a second direction different from the first direction (the Z-axis direction in this example) by deflecting the illumination light 60 that travels in the +X-axis direction from the first scanning element 21, and deflects the reflected light 61 and guides it to the light receiver 30. In this example, a polygon mirror having four reflective surfaces is used as the second scanning element 22, and one surface 221 of the polygon mirror reflects the illumination light 60 and irradiates it onto the object 100. The reflected light 61 enters a surface 222 of the polygon mirror that is different from the surface which the illumination light 60 enters, and the surface 222 deflects the reflected light 61 to the light receiver 30.
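As a rough, non-limiting illustration of how the two scanning elements cooperate, the sketch below generates a raster of deflection angles with the first scanning element as the slow axis (first direction) and the polygon mirror as the fast axis (second direction). The angular ranges, step counts, and function names are assumptions introduced for illustration only and are not taken from this disclosure.

```python
import numpy as np

def raster_scan_angles(slow_range_deg=10.0, fast_range_deg=60.0,
                       slow_steps=16, fast_steps=200):
    """Hypothetical raster pattern: for each first-scanning-element (slow)
    angle, the polygon mirror (fast) sweeps its full range.  All ranges and
    step counts are placeholder assumptions."""
    slow_angles = np.linspace(-slow_range_deg / 2, slow_range_deg / 2, slow_steps)
    fast_angles = np.linspace(-fast_range_deg / 2, fast_range_deg / 2, fast_steps)
    # One (slow, fast) angle pair per emitted pulse direction.
    return [(s, f) for s in slow_angles for f in fast_angles]

angles = raster_scan_angles()
print(len(angles), angles[0], angles[-1])
```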
The light receiver 30 includes a condenser lens (condenser optical system) 31, an optical filter 32, and a light-receiving element group 33.
The condenser lens 31 condenses the reflected light 61 from the second scanning element 22 onto the light-receiving surface of the light-receiving element included in the light-receiving element group 33. The optical filter 32 transmits only desired light and blocks (absorbs) other unnecessary light. In this example, the optical filter 32 is a bandpass filter that transmits only light in a wavelength band corresponding to the illumination light 60 emitted from the light source 11. The configurations of the condenser lens 31 and the optical filter 32 are not limited to the configurations of this example. For example, the order of arrangement of each member may be changed or a plurality of each member may be provided, as necessary.
The light-receiving element group 33 receives the light from the condenser lens 31, performs photoelectric conversion, and outputs a signal. The light-receiving element group 33 can use one including a Photo Diode (PD), an Avalanche Photo Diode (APD), a Single Photon Avalanche Diode (SPAD), or the like.
The reflected light 61 from the object 100 is deflected by the second scanning element 22, condensed by the condenser lens 31, and enters the light-receiving element group 33 via the optical filter 32.
As illustrated in
The control unit 40 is, for example, a processing apparatus (processor) such as a Central Processing Unit (CPU), or a calculation apparatus (computer) including the same, and controls the light source 11, the first scanning element 21, the second scanning element 22, the light-receiving element group 33, and the like. The control unit 40 drives each of the light source 11, the first scanning element 21, and the second scanning element 22 at a predetermined driving voltage and a predetermined driving frequency. The control unit 40 can, for example, control the light source 11 to convert the illumination light 60 into pulsed light, or perform intensity modulation for the illumination light 60 to generate signal light.
The control unit 40 can acquire distance information about the object 100 based on the time from the emission of the illumination light 60 by the light source 11 (light emission time) to the reception of the reflected light 61 from the object 100 by the light-receiving element 331 (light reception time). At this time, the control unit 40 may acquire the signal from the light-receiving element group 33 at a specific frequency. The distance information may be acquired based on the phase of the reflected light 61 from the object 100 instead of the reception time of the reflected light 61 from the object 100. More specifically, the control unit 40 finds the difference (phase difference) between the phase of the signal from the light source 11 and the phase of the signal output from the light-receiving element group 33, multiplies the phase difference by the speed of light, and acquires the distance information about the object 100.
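The two measurement principles described above can be summarized with a minimal numerical sketch. The disclosure only states that distance is obtained from the emission/reception times or from the phase difference; the modulation frequency, the timing values, and the standard phase-to-distance relation used below are assumptions added for illustration.

```python
C = 299_792_458.0  # speed of light [m/s]

def distance_from_tof(t_emit_s, t_receive_s):
    """Round-trip time of flight -> one-way distance to the object."""
    return C * (t_receive_s - t_emit_s) / 2.0

def distance_from_phase(phase_diff_rad, mod_freq_hz):
    """Phase difference of intensity-modulated signal light -> one-way distance.
    The modulation frequency is an assumption; it is not given in the text."""
    round_trip_s = (phase_diff_rad / (2.0 * 3.141592653589793)) / mod_freq_hz
    return C * round_trip_s / 2.0

print(distance_from_tof(0.0, 2.0e-6))        # 2 microsecond round trip -> ~300 m
print(distance_from_phase(3.14159, 1.0e6))   # half a cycle at 1 MHz -> ~75 m
```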
The light source 11 may be disposed so that the x-axis in
The following inequality (1) may be satisfied:
where f is a focal length of the condenser lens 31, P is a pixel pitch (distance between the centers of the light-receiving elements 331 in the Y-axis direction) in the Y-axis direction (direction corresponding to the first direction) of the light-receiving element group 33, and θ is a divergent angle of the illumination light 60 in the Y-axis direction (direction corresponding to the first direction) in a case where the object 100 is irradiated.
By satisfying inequality (1), the reflected light 61 condensed in the Y-axis direction on the light-receiving element group 33 can enter one light-receiving element 331, and improve the resolution.
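Because the body of inequality (1) is not reproduced above, the following sketch only illustrates the kind of check it implies, assuming the condition amounts to the condensed spot width in the Y-axis direction, roughly f·tan(θ), not exceeding the pixel pitch P. This reading and all numerical values are assumptions, not statements from the disclosure.

```python
import math

def spot_fits_one_pixel(f_m, pitch_m, theta_rad):
    """Assumed reading of inequality (1): the Y-axis width of the condensed
    spot, roughly f * tan(theta), should fit within one light-receiving
    element of pitch P.  The exact form of the inequality is an assumption."""
    spot_width = f_m * math.tan(theta_rad)
    return spot_width <= pitch_m

# Placeholder values: 20 mm focal length, 0.1 mm pixel pitch, 2 mrad divergence.
print(spot_fits_one_pixel(0.020, 0.0001, 0.002))  # True: ~40 um spot < 100 um pitch
```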
The width of the area of the object 100 that is irradiated with the illumination light 60 is different in two orthogonal directions. The direction of the narrow width of the area may be a direction deflected by the first scanning element 21 (the Y-axis direction in this example), that is, the scanning range of the first scanning element 21 may be narrower than that of the second scanning element 22. This configuration can reduce the number of light-receiving elements in the light-receiving element group 33.
The rotation axis of the second scanning element 22 may be located outside the illumination optical path from the first scanning element 21 to the second scanning element 22. In this example, the rotation axis of the second scanning element 22 is parallel to the Y-axis, and the illumination optical path from the first scanning element 21 to the second scanning element 22 is deflected by the moving first scanning element 21 that swings about an axis parallel to the Z-axis. Therefore, the rotation axis of the second scanning element 22 is located outside the illumination optical path from the first scanning element 21 to the second scanning element 22. Thereby, the size of the surface which the reflected light 61 of the second scanning element 22 enters can be reduced.
The second scanning element 22 may be a polygon mirror having a plurality of reflective surfaces. This configuration allows scanning to be performed a plurality of times in one rotation, and thereby can improve the frame rate. The illumination light 60 and the reflected light 61 may be deflected by different surfaces of the polygon mirror. Thereby, the area per surface of the polygon mirror can be reduced, and the first scanning element 21 and the second scanning element 22 can be smaller.
The reflected light 61 from the object 100 may be deflected in an area of the second scanning element 22 through which the illumination light 60 corresponding to the reflected light 61 does not pass (an area different from the area where the illumination light 60 is deflected). Thereby, even a coaxial LiDAR can suppress the loss of light amount that might occur at an overlap portion between the illumination optical path and the light-receiving optical path.
The second scanning element 22 may be placed at the entrance pupil of the condenser lens 31. Thereby, the scanning surface of the second scanning element 22 can be smaller, the load on the driving unit for the second scanning element 22 can be reduced, and the optical apparatus 1 can be made smaller.
As explained above, the configuration of this example can realize highly accurate long-distance measurement.
This example will discuss only the configurations that are different from Example 1, and will omit a description of the same configurations.
In this example, the light source 11 uses a VCSEL or a fiber laser instead of a semiconductor laser. Therefore, the cross section of the illumination light 60 is not elliptical but circular.
The light receiver 30 is disposed on the +Y-axis side of the light source unit 10. The illumination light 60 from the light source unit 10 is deflected by the first scanning element 21 and then deflected by the second scanning element 22 to illuminate the object. The reflected light 61 from the object is reflected by the surface of the second scanning element 22 that was used for the illumination, and is guided to the light receiver 30. Thereby, the size of the optical apparatus 1 in the lateral direction (X-axis direction) can be reduced. The second scanning element 22 is larger in the Y-axis direction than that of Example 1, but its radius of rotation is smaller. Therefore, the load on the driving unit for the second scanning element 22 can be reduced.
This example will describe only the configurations that are different from Example 1, and will omit a description of the same configurations.
As described above, by arranging a plurality of light sources and light-receiving elements, the optical apparatus 1 according to this example can have an angle of view wider than that of the optical apparatus 1 according to each of Examples 1 and 2.
In this example, the optical apparatus 1 has two light sources 11a and 11b and two light-receiving elements 351 and 352, but the number of light sources and light-receiving elements is not limited to two.
This example will describe only the configurations that are different from Example 1, and will omit a description of the same configurations.
The optical system 50 is an optical system configured to change the beam diameter of the light from the deflector 20. In this example, the optical system 50 is an optical system (telescope) that enlarges the beam diameter of the illumination light from the deflector 20 and reduces the beam diameter of the reflected light from the object. The optical system 50 is an afocal system that includes a plurality of optical elements (lenses) having refractive power, and has no refractive power as a whole. More specifically, the optical system 50 includes, in order from the deflector 20 side to the object side, a first lens 51 having positive power and a second lens 52 having positive power. The number of lenses is not limited to this example, and the optical system 50 may include three or more lenses, as necessary. The optical system 50 may be an optical system that reduces the beam diameter of the illumination light from the deflector 20 and enlarges the beam diameter of the reflected light from the object.
The deflector 20 is placed at the entrance pupil of the optical system 50. The absolute value of the optical magnification (lateral magnification) β of the optical system 50 is larger than 1 (|β|>1). Thereby, a deflection angle of a principal ray of the illumination light emitted from the optical system 50 becomes smaller than a deflection angle of a principal ray of the illumination light that is deflected by the deflector 20 and enters the optical system 50, and the resolution for detecting the object can be improved.
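For an afocal telescope of this kind, the beam diameter scales by |β| while the deflection angle of the principal ray scales by 1/|β|, which is the effect described above. The following sketch illustrates this relationship under the assumption of a two-lens afocal system with |β| = f2/f1; the specific focal lengths and input values are placeholders, since the disclosure gives no numerical values.

```python
def afocal_output(beam_diameter_in_mm, angle_in_deg, f1_mm, f2_mm):
    """Afocal telescope with deflector-side lens f1 and object-side lens f2
    (both positive).  Beam diameter scales by |beta| = f2 / f1 and the
    principal-ray deflection angle scales by 1 / |beta|.
    The focal lengths below are placeholder assumptions."""
    beta = f2_mm / f1_mm
    return beam_diameter_in_mm * beta, angle_in_deg / beta

d_out, a_out = afocal_output(beam_diameter_in_mm=5.0, angle_in_deg=30.0,
                             f1_mm=50.0, f2_mm=150.0)
print(d_out, a_out)  # 15.0 mm beam, 10.0 deg deflection (|beta| = 3)
```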
The illumination light from the light source unit 10 is deflected by the deflector 20, enlarged by the optical system 50 according to the optical magnification β, and irradiated onto the object. The reflected light from the object is reduced by the optical system 50 according to the optical magnification 1/β, deflected by the deflector 20, and condensed by the condenser lens 31 to reach the light-receiving element group 33.
The optical system 50 disposed on the object side of the deflector 20 can enlarge the diameter of the illumination light. This allows the diameter of the illumination light to be further enlarged and a divergent angle to be further reduced, and thus sufficient illuminance and resolution can be secured even when the object is located at a distant position. Enlarging the pupil diameter using the optical system 50 can capture more reflected light from the object, extending the measurable distance and improving the distance measurement accuracy.
As illustrated in
First, in step S1, the light source unit 10 in the optical apparatus 1 illuminates an object around the vehicle, and the control unit 40 acquires the distance information on the object based on the signal output from the light receiver 30 by receiving the reflected light from the object. In step S2, the vehicle information acquiring apparatus 200 acquires vehicle information including the speed, yaw rate, steering angle of the vehicle, and the like. Next, in step S3, the control unit 40 determines whether the distance to the object is included within a preset distance range using the distance information acquired in step S1 and the vehicle information acquired in step S2.
This configuration can determine whether or not the object exists within the set distance range around the vehicle, and determine whether a collision is likely to occur between the vehicle and the object. Steps S1 and S2 may be performed in the reverse order of the above order or in parallel. The control unit 40 determines that the collision is likely in a case where the object exists within the set distance (step S4) and determines that the collision is unlikely in a case where the object does not exist within the set distance (step S5).
Next, in the case where the control unit 40 determines that the collision is likely, the control unit 40 notifies (transmits) the determination result to the control apparatus 300 and the warning apparatus 400. At this time, the control apparatus 300 controls the vehicle based on the determination result of the control unit 40 (step S6), and the warning apparatus 400 warns the user (driver) of the vehicle based on the determination result of the control unit 40 (step S7). The determination result may be notified to at least one of the control apparatus 300 and the warning apparatus 400.
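A minimal sketch of the decision flow in steps S3 through S7 is given below. The class names, method names, and threshold handling are hypothetical placeholders; in the actual system the distance information comes from the optical apparatus 1 and the determination result is notified to the control apparatus 300 and the warning apparatus 400.

```python
def collision_check(distance_to_object_m, set_distance_m,
                    control_apparatus, warning_apparatus):
    """Outline of steps S3-S7: decide whether the object lies within the set
    distance range and, if so, notify the control and warning apparatuses.
    All names and interfaces here are illustrative assumptions."""
    collision_likely = distance_to_object_m <= set_distance_m  # steps S3-S5
    if collision_likely:
        control_apparatus.brake_and_steer()   # step S6 (hypothetical method)
        warning_apparatus.warn_driver()       # step S7 (hypothetical method)
    return collision_likely

class DummyControl:
    def brake_and_steer(self):
        print("control apparatus: generating braking/steering control signal")

class DummyWarning:
    def warn_driver(self):
        print("warning apparatus: sound / display / vibration warning")

print(collision_check(12.0, 20.0, DummyControl(), DummyWarning()))  # True
```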
The control apparatus 300 can control the vehicle by generating a control signal, for example, to apply the brakes, release the accelerator, turn the steering wheel, suppress the output of the engine or motor, and generate a braking force at each wheel. The warning apparatus 400 warns the driver by, for example, emitting a warning sound, displaying warning information on the screen of a car navigation system, or applying vibration to the seat belt or steering wheel.
Thus, the on-board system 1000 according to this example can detect the object and measure the distance to the object by the above processing, and avoid the collision between the vehicle and the object. In particular, applying the optical apparatus according to each of the examples to the on-board system 1000 can realize high distance measuring accuracy, so that object detection and collision determination can be performed with high accuracy.
This example applies the on-board system 1000 to the driving support (collision damage mitigation), but the on-board system 1000 is not limited to this example and is applicable to cruise control (including adaptive cruise control) and automatic driving. The on-board system 1000 is applicable not only to a vehicle such as an automobile but also to a moving body such as a ship, an aircraft, or an industrial robot. It can be applied not only to moving objects but also to various devices that utilize object recognition such as intelligent transportation systems (ITS) and monitoring systems.
The on-board system 1000 and the moving apparatus may include a notification apparatus (notifying unit) for notifying the manufacturer of the on-board system, the seller (dealer) of the moving apparatus, or the like of any collisions between the moving apparatus and the obstacle. For example, the notification apparatus may use an apparatus that transmits information (collision information) on the collision between the moving apparatus and the obstacle to a preset external notification destination by e-mail or the like.
Thus, the configuration for automatically notifying the collision information through the notification apparatus can promote processing such as inspection and repair after the collision. The notification destination of the collision information may be an insurance company, a medical institution, the police, or another arbitrary destination set by the user. The notification apparatus may notify the notification destination of not only the collision information but also the failure information on each component and consumption information on consumables. The presence or absence of the collision may be detected based on the distance information acquired by the output from the above light receiver or by another detector (sensor).
While the disclosure has been described with reference to embodiments, it is to be understood that the disclosure is not limited to the disclosed embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
Each example can provide an optical apparatus that can provide distance measurement with high accuracy.
This application claims the benefit of Japanese Patent Application No. 2023-059445, filed on Mar. 31, 2023, which is hereby incorporated by reference herein in its entirety.