The embodiments described herein relate to a three-dimensional (3D) sensing system.
Radar devices have been proposed that detect the distance to, and the shape of, a measuring object existing around a vehicle or the like.
For example, conventional radar devices using the Light Detection and Ranging (LiDAR) method have problems in size, weight, accuracy, reliability, service life, and the like, due to the mechanical moving parts involved in beam scanning. In particular, when mounted on a vehicle, it is difficult to satisfy all requirements at the same time, since not only accuracy, reliability, and service life are demanded, but strict restrictions on size and weight are also often imposed by the space available for mounting.
In addition to a circuit for driving and controlling the laser light source, a driving circuit for beam scanning and a control circuit therefor are also required. In some cases, mechanisms and circuits for monitoring the beam emitting direction are further required.
Since the beam emitted from a laser light source has a certain angle of spread, a condensing optical system such as a lens is required before the beam is incident onto the beam scanning portion, and the size, weight, and mounting accuracy of this optical system are further problems.
In a simple raster-based operation, the beam dwell time density is high at both ends of the scanned range, and therefore the time density in the center portion, where the interest level of sensing is high, is reduced. Furthermore, although it would be desirable to change the detection region in accordance with the moving situation or environment, so that only that region is scanned or a plurality of regions are scanned simultaneously, such operation is difficult to achieve with a simple beam scanning technology.
A so-called flash LiDAR method, which calculates the distance for each pixel by emitting pulsed illumination light toward the entire sensing space and receiving the reflected light therefrom with an image sensor, is also promising as a sensing method, but it cannot handle long distances such as those required for sensing during automatic driving, for example. The structured light method using light pattern projection is also unsuitable for sensing at long distances. Although these methods have in common that each uses a light source and an imaging device, the light source cannot be shared, since the requirements for the light source differ from method to method.
Each method has its own advantages and disadvantages, and it is practical to choose an appropriate method in accordance with the situation.
On the other hand, photonic crystal (PC) surface emitting lasers (SELs) have been proposed as a next-generation semiconductor laser light source.
The embodiments provide a 3D sensing system having higher accuracy, higher output, smaller size, and greater robustness, as well as higher adaptability to sensing regions and sensing objects, and capable of supporting a plurality of sensing modes.
According to one aspect of the embodiments, there is provided a three-dimensional sensing system comprising: a photonic crystal laser array in which photonic crystal laser elements are arranged on a plane; a control unit configured to control an operation mode of a laser light source; a driving unit configured to execute a drive control of the photonic crystal laser array in accordance with the operation mode controlled by the control unit; a light receiving unit configured to receive reflected light, which is laser light emitted from the photonic crystal laser array and reflected from a measuring object; a signal processing unit configured to execute signal processing of the reflected light received by the light receiving unit in accordance with the operation mode; and a distance calculation unit configured to execute calculation processing of a distance to the measuring object on a signal processed by the signal processing unit, in accordance with the operation mode, and to output a calculation result as distance data.
According to another aspect of the embodiments, there is provided a three-dimensional sensing system comprising: a signal transmitting unit comprising a two-dimensional photonic crystal surface emitting laser cell array configured to emit laser light to a measuring object; a signal receiving unit comprising an optical system and an image sensor configured to receive reflected light that has been emitted from the signal transmitting unit and reflected from the measuring object; a control unit configured to control an operation mode of a light source of the laser light; a transmission direction recognition unit configured to recognize an emitting direction of the laser light emitted from the two-dimensional photonic crystal surface emitting laser cell array; a two-dimensional photonic crystal cell array driving unit configured to execute a drive control of the two-dimensional photonic crystal surface emitting laser cell array on the basis of the emitting direction of the laser light recognized by the transmission direction recognition unit, in accordance with the operation mode; and a signal processing unit comprising a distance detection unit configured to calculate a distance to the measuring object on the basis of a light receiving position on an imaging surface of the image sensor and a time from light emission to light reception, in accordance with the operation mode.
According to still another aspect of the embodiments, there is provided a three-dimensional sensing system comprising: a flash light source configured to emit laser light to an entire surface of a specific region; a two-dimensional photonic crystal surface emitting laser cell array configured to emit laser light to a target region within the specific region; a control unit configured to control an operation mode of a laser light source; a flash driving unit configured to execute a drive control of the flash light source, and a two-dimensional photonic crystal cell array driving unit configured to execute a drive control of the two-dimensional photonic crystal surface emitting laser cell array, in accordance with the operation mode controlled by the control unit; a signal receiving unit configured to receive reflected light that is the laser light emitted from the flash light source and reflected from a measuring object included in the specific region, and to receive reflected light that is the laser light emitted from the two-dimensional photonic crystal surface emitting laser cell array and reflected from a measuring object included in the target region; a signal processing unit configured to execute signal processing of the reflected light received by the signal receiving unit in accordance with the operation mode; and a distance detection unit configured to execute calculation processing of the distance to the measuring object on the signal processed by the signal processing unit in accordance with the operation mode, wherein the signal processing unit determines whether or not there is any region in the specific region where the signal to noise ratio of the reflected light originating from the flash light source is lower than a predetermined threshold value, and, if there is a region where the signal to noise ratio is lower than the predetermined threshold value, the signal processing unit controls the two-dimensional photonic crystal cell array driving unit so that only the region where the signal to noise ratio is lower than the predetermined threshold value is irradiated, as a target, with spot laser light from the two-dimensional photonic crystal surface emitting laser cell array.
According to the embodiments, there can be provided a 3D sensing system having higher accuracy, higher output, smaller size, and greater robustness, as well as higher adaptability to sensing regions and sensing objects, and capable of supporting a plurality of sensing modes.
In the 2D-PC SEL cell applicable to the 3D sensing system according to the embodiments,
In the 2D-PC SEL cell applicable to the 3D sensing system according to the embodiments,
In the 2D-PC SEL cell applicable to the 3D sensing system according to the embodiments,
In the 2D-PC SEL cell applicable to the 3D sensing system according to the embodiments,
In the LiDAR operation mode executed in the 3D sensing system according to the embodiments,
In a flash LiDAR operation mode executed in the 3D sensing system according to the embodiments,
In a light-section method operation mode executed in the 3D sensing system according to the embodiments,
In an example of the twin beam using the closest-packing pattern of circles emitted from the 2D-PC SEL cell array applicable to the 3D sensing system according to the embodiments,
In the 3D sensing system according to the embodiments,
As schematic diagrams for explaining an example in which differences occur in light intensity in accordance with the direction (position) even if an equal current value is injected into each cell of the 2D-PC SEL cell array, in the 3D sensing system according to a comparative example,
As schematic diagrams for explaining an example in which the light intensity is made uniform in accordance with the direction (position) by injecting a different current value for each position into each cell of the 2D-PC SEL cell array, in the 3D sensing system according to the embodiments,
In an example of emitting beam control of the 2D-PC SEL cell array applicable to the 3D sensing system according to the embodiments,
In an example of emitting beam control of the 2D-PC SEL cell array applicable to the 3D sensing system according to the embodiments,
In an example of emitting beam control of the 2D-PC SEL cell array applicable to the 3D sensing system according to the embodiments,
In an example of emitting beam control of the 2D-PC SEL cell array applicable to the 3D sensing system according to the embodiments,
In the 2D-PC SEL cell applicable to the 3D sensing system according to the embodiments,
Embodiments will now be described with reference to the drawings. In the following description of the drawings, identical or similar parts are denoted by identical or similar reference signs. However, it should be noted that the drawings are schematic, and the relationships between thickness and planar dimensions, the thickness ratios of the layers, and the like differ from actual ones. Accordingly, specific thicknesses and dimensions should be determined in consideration of the following description. Of course, the drawings also include portions whose mutual dimensional relationships and ratios differ between the drawings.
Moreover, the embodiments described hereinafter merely exemplify devices and methods for embodying the technical idea; the embodiments do not limit the material, shape, structure, placement, and the like of each component part to those described below. The embodiments may be changed without departing from the spirit or scope of the claims.
The embodiments disclose a 3D sensing system formed by combining a two-dimensional photonic crystal (2D-PC) surface emitting laser (SEL) element, or a two-dimensional (2D) arrayed element thereof, with an imaging device.
The 3D sensing system according to the embodiments calculates the distance to and the direction of a measuring object by irradiating the measuring object with laser light and receiving the scattered light from the measuring object. Since the photonic crystal provides flexible controllability of the laser beam, the beam direction (emitting direction of the laser light) can be controlled flexibly without providing any mechanical operating unit (solid state).
In particular, it becomes possible to realize a light source for 3D sensing having a plurality of operation modes by utilizing characteristics of the photonic crystal, such as a flexible emission control function (time, intensity, and direction), higher output, a higher quality beam, small size, robustness (hard to break), and affordable price.
Moreover, it becomes possible to realize a control method for the symmetrical emitting beams that are characteristic of the PC laser (a beam arrangement design method for covering a region as a sensing object, and an emission pattern control corresponding thereto, including the case of a single beam emitted in the normal direction of the device). By changing the operation mode of the laser light source, the 3D sensing system according to the embodiments can provide the following three sensing modes (1) to (3) in one 3D sensing system.
The beam scanning type LiDAR scans the beam (transmitted signal) within the range in which the measuring object is to be detected, captures the scattered reflected light (reflected signal) from the measuring object, determines the direction by recognizing from which direction the light is reflected, and calculates the distance on the basis of the time until the light is reflected and returned (Time of Flight (TOF)).
Various laser radar technologies differ in the signal processing logic for calculating the distance and direction, the beam scanning method corresponding thereto, and the spatial modulation method for realizing such scanning. As means for the spatial modulation, there are methods such as a polygon mirror, a galvanometer mirror, and Micro Electro Mechanical Systems (MEMS), a method of arraying laser light sources subjected to light control (Vertical Cavity Surface Emitting Laser (VCSEL) or the like), and an optical phased array.
The 3D sensing system according to the embodiments is capable of scanning (e.g., rotational scanning) differently from conventional raster scanning even in the beam scanning type sensing mode. Moreover, the arrayed light-receiving element can distinguish the reflected light from a plurality of laser beams, so the system can also function as a flash LiDAR. Also in the flash type sensing mode, it is possible to sense only a certain region.
Since the emission control function of the 3D sensing system according to the embodiments has high compatibility (sensing region, cooperative operation of a plurality of systems, learning control) with software control (program control), it can easily support adaptive sensing that incorporates learning functions and the like. Owing to these characteristics, encoding of the emitting beam, a cooperative operation of a plurality of systems, and the like can also be easily supported.
The 3D sensing system according to the embodiments is a solid-state type system, which is small, is hard to break, and allows greater flexibility in setting positions. Moreover, it has resistance to noise and interference (utilizing excellent controllability in hardware and software).
Since the light emitting unit of the 3D sensing system according to the embodiments requires no beam scanning mechanism, its size is at the semiconductor package level, and no optical system (collimating optical system) for converging the emitting beam is required either. Therefore, light can be emitted independently in flexible directions under any driving condition, and a cooperative operation of a plurality of devices can also be realized. Moreover, since no beam scanning mechanism such as a rotating mirror or a MEMS mirror, required for conventional LiDAR applications, is necessary, a system that is ultra-compact, robust, and freely installable can be realized.
In the following description, the 2D-PC SEL described in Patent Literature 3 is used, but the modulated photonic crystal (PC) laser described in Patent Literature 2 may be used instead. The beam control principle is the same for both, and either can be used in the present embodiments.
The PC laser applicable to the 3D sensing system according to the embodiments is formed by laminating a transparent electrode 251T, a lower substrate 241, a first cladding layer 231, a two-dimensional photonic crystal (2D-PC) layer 221, an active layer 222, a second cladding layer 232, an upper substrate 242, and a window-shaped electrode 252, in this order. In the PC laser of the embodiments, the laser beam (laser light A, B) is emitted through the window provided in a center portion of the window-shaped electrode 252, in a direction inclined by an emitting angle θ from the normal to the surface of the upper substrate 242 on the window-shaped electrode 252 side. It should be noted that the order of the 2D-PC layer 221 and the active layer 222 may be reversed. For convenience, the words “upper” and “lower” are used in the embodiments, but these words do not define the actual orientation (upward or downward) of the PC laser.
In the embodiments, p type semiconductor gallium arsenide (GaAs) is used for the lower substrate 241, n type GaAs is used for the upper substrate 242, p type semiconductor aluminum gallium arsenide (AlGaAs) is used for the first cladding layer 231, and n type AlGaAs is used for the second cladding layer 232. A layer including a Multiple-Quantum Well (MQW) composed of indium gallium arsenide/gallium arsenide (InGaAs/GaAs) is used for the active layer 222. Gold (Au) is used as the material of the window-shaped electrode 252. SnO2, In2O3, or the like is used as the material of the transparent electrode 251T. Instead of the transparent electrode 251T, a Distributed Bragg Reflector (DBR) layer that transmits the laser light, formed as a multilayered structure of insulating layers, may also be used. It should be noted that the materials of these layers are not limited to the above-mentioned materials, and it is possible to use the same materials as those used for the respective layers of conventional photonic crystal surface emitting lasers (PC SELs). Moreover, other layers, such as a spacer layer, may be inserted between the above-mentioned layers.
The 2D-PC layer 221 is formed by periodically arranging holes (different refractive index regions) 211 at the below-mentioned lattice points in a plate-shaped base material (slab) 214. p type GaAs is used as the material of the slab 214 in the embodiments. Although the shape of the hole 211 is an equilateral triangle in the embodiments, another shape, such as a circular shape, may also be used. It should be noted that the material of the slab 214 is not limited to the above-mentioned material, and any material used for the base member in conventional PC lasers can also be used. Moreover, a member having a refractive index different from that of the slab 214 (different refractive index member) may be used for the different refractive index region instead of the hole 211. Holes are advantageous in that they can be easily processed, while different refractive index members are preferable in the case where the base member may be deformed by processing heat or other factors.
In the 2D-PC SEL 120 cell applicable to the 3D sensing system according to the embodiments,
The lattice points where the holes 211 are arranged in the 2D-PC layer 221 will now be described with reference to
The lattice 212A for forming the optical resonance state is a square lattice having a lattice constant a. Hereinafter, in the square lattice, one of the two directions in which the lattice points 213A are aligned at interval a is referred to as the x direction and the other as the y direction. The x-y coordinates of a lattice point 213A are therefore expressed as (ma, na) using integers m and n.
In contrast, the lattice 212B for light-emitting is an orthorhombic lattice having basic translation vectors c1↑ = (r1, 1)a and c2↑ = (r2, 1)a. The lattice constants c1 and c2 of this orthorhombic lattice, i.e., the magnitudes of the basic translation vectors c1↑ and c2↑, are (r1^2 + 1)^0.5 × a and (r2^2 + 1)^0.5 × a, respectively; and the angle α between c1↑ and c2↑ satisfies the relationship cos α = (r1r2 + 1) × (r1^2 + 1)^−0.5 × (r2^2 + 1)^−0.5. The lattice points are aligned in the y direction at interval a in both the lattice 212A for forming the optical resonance state and the lattice 212B for light-emitting.
In the embodiments, the emission wavelength λ is 980 nm. Moreover, the effective refractive index neff of the 2D-PC layer 221 is determined by the refractive index (3.55) of the p type GaAs which is the material of the slab 214 and the proportion of the slab 214 occupied by the holes 211 (refractive index 1). In the embodiments, the effective refractive index neff of the 2D-PC layer 221 is set to 3.5 by adjusting the area of the holes 211. Accordingly, the lattice constant a in the embodiments is set to 2^−0.5 × 980 nm/3.4 ≈ 200 nm from the equation (3) described below.
In the 2D-PC layer 221 of the embodiments, the PC structure is formed by arranging the hole 211 at the lattice point of the lattice 212C (
In the 2D-PC layer 221 of the embodiments, the laser beam is emitted in the direction for which r1 and r2, which are the parameters indicating the positions of the lattice points 213B, satisfy the equations (1) and (2) described below.
In the 2D-PC SEL cell 120 applicable to the 3D sensing system according to the embodiments,
By applying a periodic drive modulation to the PC structure and providing a diffraction effect toward the upper side in addition to the resonance effect, beam emitting direction control (beam scanning) covering a biaxial angular range can be executed.
As illustrated in
As illustrated in
As a specific example of the twin beams A and B, the spreading angle of one beam (beam A or B) is 0.34°, the output of the twin beams A and B is 1 to 10 W, the modulation frequency is several hundred MHz, and |θ| < 50°. Strictly speaking, not all beams are emitted from the origin point O, but since the offset is at most on the order of μm, the beams may be regarded as emitted from the same point in LiDAR applications for sensing from several meters to several hundred meters.
In the 2D-PC SEL cell 120 applicable to the 3D sensing system according to the embodiments,
The photonic crystal used herein is a photonic crystal for beam scanning illustrated in
In the 2D-PC layer 221 of the embodiments, the laser beam is emitted in the direction for which r1 and r2, which are the parameters indicating the positions of the lattice points, satisfy the following equations (1) and (2):
where θ is the inclination angle with respect to the normal line of the PC layer, φ is the azimuth angle with respect to the x direction, and neff is the effective refractive index.
Moreover, the lattice constant a in the 2D-PC layer 221 of the embodiments is obtained by the following equation (3):

a = 2^−0.5 × λ/neff  (3)

where λ is the emission wavelength.
The 2D-PC layer 221 of the embodiments designed in this way enables the beams A and B to be emitted in the biaxial directions.
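As a concrete check of these relationships, the following sketch (a minimal illustration, not part of the embodiments) evaluates equation (3) and the orthorhombic lattice quantities of the lattice 212B for light-emitting. The wavelength and the divisor 3.4 are the values quoted above; the pair (r1, r2) is an arbitrary illustrative choice, not a design value.

```python
import math

def lattice_constant(wavelength_nm: float, n_eff: float) -> float:
    """Equation (3): a = 2^-0.5 * lambda / n_eff."""
    return wavelength_nm / (math.sqrt(2.0) * n_eff)

def light_emitting_lattice(r1: float, r2: float, a: float):
    """Magnitudes of the basic translation vectors c1 = (r1, 1)a and
    c2 = (r2, 1)a, and the angle alpha between them (lattice 212B)."""
    c1 = math.sqrt(r1 ** 2 + 1.0) * a
    c2 = math.sqrt(r2 ** 2 + 1.0) * a
    cos_alpha = (r1 * r2 + 1.0) / (math.sqrt(r1 ** 2 + 1.0) * math.sqrt(r2 ** 2 + 1.0))
    return c1, c2, math.degrees(math.acos(cos_alpha))

a = lattice_constant(980.0, 3.4)  # ~204 nm, i.e. approximately 200 nm
c1, c2, alpha = light_emitting_lattice(0.1, -0.1, a)  # (r1, r2): illustrative only
print(f"a = {a:.1f} nm, c1 = {c1:.1f} nm, c2 = {c2:.1f} nm, alpha = {alpha:.2f} deg")
```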
The feedback control mechanism includes: the 2D-PC SEL cell array 120AR; the 2D-PD cell array 118PDAR configured to detect the feedback laser light C (FB) from each cell, emitted from a back side surface of the 2D-PC SEL cell array 120AR; a feedback control unit 130 configured to control a 2D-PD array driving unit 140AR on the basis of the detection result of the 2D-PD cell array 118PDAR; and the 2D-PD array driving unit 140AR configured to drive the 2D-PC SEL cell array 120AR in accordance with the control by the feedback control unit 130.
For example, even if the same current I is injected into each cell of the 2D-PC SEL cell array 120AR, the light intensity L may differ from cell to cell depending on the direction (position). However, the light intensity L can be made uniform by configuring the feedback control mechanism as illustrated in
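A minimal sketch of such a loop, under the assumption of a simple proportional update, is as follows; the callback names (measure_intensity, set_current) are hypothetical stand-ins for the 2D-PD cell array 118PDAR readout and the 2D-PD array driving unit 140AR.

```python
import numpy as np

def equalize_cell_intensity(measure_intensity, set_current, currents,
                            target, gain=0.1, tol=0.02, max_iter=50):
    """Hypothetical proportional feedback: raise the injection current of
    cells whose measured intensity L is below target, lower it where L is
    above, until the 2D-PD array reads a uniform intensity."""
    for _ in range(max_iter):
        set_current(currents)             # drive each cell with its own current
        intensity = measure_intensity()   # per-cell reading from the PD array
        error = (target - intensity) / target
        if np.max(np.abs(error)) < tol:   # uniform enough: stop
            break
        currents = currents * (1.0 + gain * error)
    return currents
```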
As illustrated in
As illustrated in
As illustrated in
The PC laser array 10 is a device in which the PC laser elements as illustrated in
The control unit 14 executes the operation control of each unit on the basis of three operation modes (i.e., a LiDAR operation mode, a flash LiDAR operation mode, and a light-section method operation mode).
The driving unit 12 executes the drive control of beam emitted from the PC laser array 10 in accordance with the operation modes (LiDAR operation mode/flash LiDAR operation mode/light-section method operation mode) controlled by the control unit 14.
In addition, a feedback control illustrated in
The light receiving unit, which includes the imaging lens 16 and the image sensor (or arrayed light-receiving element) 18, receives, with the image sensor (or arrayed light-receiving element) 18 through the imaging lens 16, the scattered reflected light emitted from the PC laser array 10 and reflected from the object.
The signal processing unit 20 executes signal processing of the reflected laser beam received by the light receiving unit in accordance with the operation mode (LiDAR operation mode/flash LiDAR operation mode/light-section method operation mode) controlled by the control unit 14, and transmits the result to the distance calculation unit 22. The measurement principle of the LiDAR operation mode and that of the flash LiDAR operation mode are the same in that the reflections of the plurality of beams emitted from the PC laser array 10 are captured by the image sensor and the distances are calculated by the time measurement function. The difference between the two lies in the resolution (positional accuracy) of the measured space: the resolution in the LiDAR operation mode depends on the emitting direction accuracy of the emitted beam, whereas the resolution in the flash LiDAR operation mode depends on the number of pixels covering a certain angle of view.
The distance calculation unit 22 calculates the distance to the measuring object on the basis of the light receiving position in the imaging surface of the image sensor 18 and the time from light emission to light reception (arrival time) in accordance with the operation mode (LiDAR operation mode/flash LiDAR operation mode/light-section method operation mode) controlled by the control unit 14, and outputs the calculation result as the distance data DM.
The distance calculation processing is started in Step S0.
In Step S1, in accordance with the operation mode controlled by the control unit 14, the driving unit 12 executes the drive control of the PC laser array 10, and the laser beam is emitted from the PC laser array 10. More specifically, one of twin beam emission, regional emission, and optical pattern emission is selected in accordance with the three operation modes (LiDAR operation mode (M1)/flash LiDAR operation mode (M2)/light-section method operation mode (M3)), and the beam is emitted from the PC laser array 10. Namely, when the operation mode is the LiDAR operation mode, one laser element is driven and the twin beam (A, B) is emitted in a certain direction; when the operation mode is the flash LiDAR operation mode, a certain region (sensing space) is irradiated with the light for a given length of time; and when the operation mode is the light-section method operation mode, striped pattern light is projected onto the measuring object.
In Step S2, the light receiving unit (16, 18) receives the scattered reflected light emitted from the PC laser array 10 and then reflected from the measuring object by the image sensor (or arrayed light-receiving element) 18 through the imaging lens 16.
In Step S3, the control unit 14 allocates the processing, in accordance with the operation mode, to one of Step S4, Step S5, and Step S6. More specifically, when the operation mode is the LiDAR operation mode (M1) (emitting twin beam), the processing shifts to Step S4; when the operation mode is the flash LiDAR operation mode (M2) (emitting in regional shape), the processing shifts to Step S5; and when the operation mode is the light-section method operation mode (M3) (emitting optical pattern), the processing shifts to Step S6.
When the operation mode is the LiDAR operation mode (M1) (emitting twin beam), in Step S4, the distance calculation unit 22 separates the reflected light returned from the measuring object for each beam, and calculates the distance to the measuring object on the basis of the reflected light arrival time (TOF). As a result, information on the distance to and the direction of the measuring object existing in the direction in which the beam is emitted is obtained, and the distance calculation unit 22 outputs the obtained information as the distance data DM1 (
When the operation mode is the flash LiDAR operation mode (M2) (emitting in regional shape), in Step S5, the distance calculation unit 22 calculates the distance for each pixel on the basis of the pixel position and the reflected light arrival time for each pixel. As a result, distance information for each pixel on the measuring object existing in the emission region can be obtained, and the distance calculation unit 22 outputs the obtained information as the distance data DM2 (
When the operation mode is the light-section method operation mode (M3) (emitting optical pattern), in Step S6, the distance calculation unit 22 executes triangular ranging with the stripe-shaped imaging pattern projected onto the measuring object, and thereby calculates the distance to the measuring object. As a result, the three-dimensional (3D) data of the measuring object can be obtained from the distance information by sweeping the line of the projected striped pattern light, and the distance calculation unit 22 outputs the obtained information as the distance data DM3 (
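The allocation of Steps S3 to S6 can be summarized in the following sketch. The data layouts (dictionaries of arrival times keyed by beam or by pixel) are assumptions for illustration; only the mode split and the relation distance = light velocity × arrival time/2 are taken from the description above.

```python
from enum import Enum

C = 299_792_458.0  # light velocity [m/s]

class Mode(Enum):
    M1_LIDAR = 1          # emitting twin beam
    M2_FLASH_LIDAR = 2    # emitting in regional shape
    M3_LIGHT_SECTION = 3  # emitting optical pattern

def calculate_distance(mode: Mode, measurement):
    """Dispatch corresponding to Steps S3 to S6 (hypothetical data layout)."""
    if mode is Mode.M1_LIDAR:
        # Step S4: reflected light separated per beam, TOF per beam
        return {beam: C * t / 2.0 for beam, t in measurement.items()}
    if mode is Mode.M2_FLASH_LIDAR:
        # Step S5: distance for each pixel from its own arrival time
        return {pixel: C * t / 2.0 for pixel, t in measurement.items()}
    # Step S6: triangular ranging on the striped imaging pattern;
    # see the light-section equations (4) to (6) sketched further below.
    raise NotImplementedError("light-section path sketched separately")
```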
In Step S11, the control unit 14 executes operation control of the driving unit 12 and the distance calculation unit 22 on the basis of three operation modes (i.e., the LiDAR operation mode, the flash LiDAR operation mode, the light-section method operation mode).
In Step S12, the driving unit 12 executes the drive control of the PC laser array 10 in accordance with the following three operation modes.
Then, when the light receiving unit (16, 18) receives the scattered reflected light emitted from the PC laser array 10 and reflected from the measuring object, the signal processing unit 20 and the distance calculation unit 22 execute processing in accordance with the operation mode.
More specifically, when the operation mode is M1 or M2 in Step S13, the distance calculation unit 22 measures the arrival time of the reflected light for each pixel in Step S14. When the operation mode is M1 in Step S16, the distance calculation unit 22, in Step S17, executes correspondence processing to the emitting beam, calculates the distance to the measuring object on the basis of the arrival time of the emitted beam, and outputs the calculated distance as the distance data DM1. In contrast, when the operation mode is M2 in Step S16, the distance calculation unit 22 outputs the information obtained on the basis of the arrival time of the reflected light measured in Step S14 as the distance data DM2.
On the other hand, when the operation mode is M3 in Step S13, the distance calculation unit 22 obtains a reflected light image (imaging pattern) (pixel) in Step S15, and the distance calculation unit 22, in Step S18, executes triangular ranging with the imaging pattern, calculates the distance to the measuring object, and outputs the calculated distance as the distance data DM3.
From one element of the PC laser array 10, two beams (twin beam) A and B are emitted in accordance with an angle specification based on the design thereof. In an example of
When the reflected light RA and the reflected light RB are detected, the distance calculation unit 22 determines from which of the emitted beams A and B the reflected light originates, on the basis of the light receiving position (x, y) on the imaging surface of the image sensor 18, and measures the time from light emission to light reception. For example, it can be identified that the light receiving position 24I1 in the image sensor 18 illustrated in
The distance calculation unit 22 calculates the distance to each of the measuring objects 24T1 and 24T2 existing in the emitting direction of the twin beams A and B, on the basis of the above information. Herein, the distance to each of the measuring objects 24T1 and 24T2 = the light velocity × the arrival time/2.
In the LiDAR operation mode in the 3D sensing system according to the embodiments, the above distance calculation is repeatedly executed with respect to different emitting directions.
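A minimal sketch of this repeated calculation follows, assuming the simplest possible attribution rule: the imaging surface is divided at a hypothetical boundary column, each side corresponding to one of the beams A and B (the actual correspondence follows from the designed emitting angles).

```python
C = 299_792_458.0  # light velocity [m/s]

def identify_beam(x_pixel: int, boundary: int) -> str:
    """Attribute a detected spot to beam A or B from its receiving position.
    'boundary' is a hypothetical dividing column of the imaging surface."""
    return "A" if x_pixel < boundary else "B"

def twin_beam_distances(detections, boundary=640):
    """detections: iterable of (x_pixel, arrival_time_s) per detected spot;
    returns distance = light velocity x arrival time / 2 for each beam."""
    return {identify_beam(x, boundary): C * t / 2.0 for x, t in detections}

# Example: reflections arriving 0.4 us and 1.0 us after emission
print(twin_beam_distances([(100, 0.4e-6), (900, 1.0e-6)]))
# -> {'A': ~60 m, 'B': ~150 m}
```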
From a plurality of elements in the PC laser array 10, laser light FL is simultaneously emitted to a specific region. In an example illustrated in
The distance calculation unit 22 measures the arrival time (time from light emission to light reception) of the reflected light for each pixel, when the reflected light RFL is detected. For example, it can be identified that the light receiving position 24I1 in the illumination area ILL of the image sensor 18 illustrated in
The distance calculation unit 22 calculates the distance to each of the measuring objects 24T1 and 24T2 existing in the imaging range for each pixel on the basis of the above information. In the flash LiDAR operation mode, the distance information according to the number of the pixels in the illumination area ILL can be acquired at once.
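Because every pixel carries its own arrival time, the whole depth image is one vectorized operation, as in the following sketch (the HxW array and the NaN convention for pixels without a detected reflection are assumptions for illustration).

```python
import numpy as np

C = 299_792_458.0  # light velocity [m/s]

def depth_map(arrival_times_s: np.ndarray) -> np.ndarray:
    """Flash LiDAR mode: one distance per pixel, acquired at once.
    arrival_times_s: HxW array of emission-to-reception times [s];
    NaN marks pixels where no reflected light was detected."""
    return C * arrival_times_s / 2.0

times = np.full((4, 4), np.nan)
times[1, 1] = 0.2e-6      # an object ~30 m away imaged at pixel (1, 1)
print(depth_map(times))
```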
An example of 3D measurement operation by means of the structured light projection will now be described as a light-section method operation, with reference to
In the light-section method operation mode, the measuring object 24T is irradiated with stripe-shaped laser light ST generated by the PC laser array 10. In an example illustrated in
The distance calculation unit 22, when the reflected light RST is detected, obtains a reflected light image (imaging pattern) (pixel), executes triangular ranging with the imaging pattern, calculates the distance to the measuring object 24T, and obtains 3D distance data for one line of the stripe-shaped light.
Furthermore, as illustrated in
A positional relationship between the PC laser array 10, the measuring object 24T, the imaging lens 16, and the image sensor 18, which are illustrated in
X = D cos θa sin θb/sin(θa + θb)  (4)

Y = D sin θa sin θb/sin(θa + θb)  (5)

Z = D tan φa/sin θa  (6)

where θa = tan^−1(f/Xa), φa = tan^−1(Ya cos θa/Xa), D is the baseline length, f is the focal length of the imaging lens 16, and Xa and Ya are the positions of the spot light image on the image sensor 18.
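Equations (4) to (6) can be evaluated directly, as in the following sketch; the numerical arguments in the usage example (baseline 0.1 m, focal length 8 mm, and so on) are illustrative values only, not parameters taken from the embodiments.

```python
import math

def light_section_xyz(D, f, Xa, Ya, theta_b):
    """Equations (4) to (6): coordinates of a point on the measuring object
    from baseline length D, focal length f, spot image position (Xa, Ya)
    on the image sensor, and projection angle theta_b (radians)."""
    theta_a = math.atan(f / Xa)
    phi_a = math.atan(Ya * math.cos(theta_a) / Xa)
    s = math.sin(theta_a + theta_b)
    X = D * math.cos(theta_a) * math.sin(theta_b) / s  # (4)
    Y = D * math.sin(theta_a) * math.sin(theta_b) / s  # (5)
    Z = D * math.tan(phi_a) / math.sin(theta_a)        # (6)
    return X, Y, Z

print(light_section_xyz(0.1, 8e-3, 2e-3, 1e-3, math.radians(60.0)))
```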
First, in Step S101, the twin beams A and B are emitted in a specific direction from one element (specific element) in the PC laser array 10.
Next, in Step S102, the reflected light RA and the reflected light RB, emitted from the PC laser array 10 and respectively reflected from the measuring objects 24T1 and 24T2, are captured by the image sensor 18 through the imaging lens 16. In this case, if there is no reflected light, it can be recognized that no object (measuring object) exists in the corresponding direction.
Next, in Step S103, the distance calculation unit 22 distinguishes, from the light receiving position (pixel position) on the imaging surface of the image sensor 18, from which of the emitted beams A and B the reflected light originates.
Next, in Step S104, the distance calculation unit 22 measures the arrival time of the reflected light from each of the measuring objects 24T1 and 24T2 to the pixel of the image sensor 18.
Next, in Step S105, the distance calculation unit 22 calculates the distance to each of the measuring objects 24T1 and 24T2 existing in the emitting directions of the laser light A and B, on the basis of the information on the emitted light A and the emitted light B distinguished by the pixel position in the image sensor 18 and the information on the arrival time of each of the reflected light RA and the reflected light RB from the measuring object to the pixel in the image sensor 18.
The above-described distance calculation is repeated for different emitting directions (Step S106).
First, in Step S201, from a plurality of elements in the PC laser array 10, laser light FL is simultaneously emitted to a specific region.
Next, in Step S202, the reflected light RFL emitted from the PC laser array 10 and reflected from the measuring objects 24T1 and 24T2 is captured by the image sensor 18 through the imaging lens 16. In this case, if there is no reflected light, it can be recognized that no object (measuring object) exists in the corresponding direction.
Next, in Step S203, when the reflected light RFL is detected, the distance calculation unit 22 measures the arrival time (time from light emission to light reception) of the reflected light in each pixel.
Next, in Step S204, the distance calculation unit 22 calculates the distance to each of the measuring objects 24T1 and 24T2 existing in the imaging range for each pixel. In the flash LiDAR operation mode, the distance information according to the number of the pixels in the illumination area ILL can be acquired at once.
First, in Step S301, the measuring object 24T is irradiated with stripe-shaped light ST generated by the PC laser array 10.
Next, in Step S302, the reflected light RST emitted from the PC laser array 10 and reflected from the measuring object 24T is received by the image sensor 18 through the imaging lens 16. The distance calculation unit 22 obtains a reflected light image (imaging pattern) (pixel), executes triangular ranging with the imaging pattern, calculates the distance to the measuring object 24T, and obtains 3D distance data for one line of the stripe-shaped light.
Next, in Step S303, 3D data of the entire measuring object 24T is obtained by rotationally scanning stripe-shaped light ST (ROT).
As illustrated in
The signal transmitting unit 200 further includes the FBPD array 204 configured to execute a feedback control of the emitted laser light, and the transmission direction recognition unit 404 recognizes an emitting direction of the laser light emitted from the signal transmitting unit 200 in accordance with feedback information provided from the FBPD array 204.
The signal processing unit 400 may also include a reception direction recognition unit 406 configured to recognize a reception direction of the reflected light from the light receiving position on the imaging surface of the image sensor 18; in that case, the 2D-PC cell array driving unit 402 executes a drive control of the 2D-PC cell array 202 on the basis of the emitting direction of the laser light recognized by the transmission direction recognition unit 404 and the reception direction of the reflected light recognized by the reception direction recognition unit 406.
The signal processing unit 400 further includes an object recognition logic 414 configured to identify the measuring object on the basis of a calculation result of the distance detection unit (TOF) 412.
More specifically, as illustrated in
The signal transmitting unit 200 includes a 2D-PC cell array 202 configured to emit laser light to a measuring object, and an FBPD array 204 configured to execute a feedback control of the emitted laser light. The 2D-PC cell array 202 corresponds to the PC laser array 10 illustrated in
The signal receiving unit 300 includes an optical system 304 and an image sensor (line/area) 302 configured to receive a scattered reflected light emitted from the signal transmitting unit 200 and reflected from the measuring object. The optical system 304 and the image sensor 302 respectively correspond to the imaging lens 16 and the image sensor 18 which are illustrated in
The signal processing unit 400 includes a 2D-PC cell array driving unit 402, a transmission direction recognition unit 404, a reception direction recognition unit 406, a CPU 408, a 3D image storage unit 410, a distance detection unit (TOF) 412, and an object recognition logic 414. The CPU 408 executes an operation control of each unit on the basis of three operation modes (i.e., LiDAR operation mode, flash LiDAR operation mode, light-section method operation mode). The CPU 408 corresponds to the control unit 14 illustrated in
The 2D-PC cell array driving unit 402 executes the drive control of the 2D-PC cell array 202 on the basis of the emitting direction of the laser light recognized by the transmission direction recognition unit 404 and the reception direction of the reflected light recognized by the reception direction recognition unit 406, in accordance with the operation mode (LiDAR operation mode/flash LiDAR operation mode/light-section method operation mode) controlled by the CPU 408, thereby executing a drive control of the beam emitted from the 2D-PC cell array 202.
The transmission direction recognition unit 404 recognizes the emitting direction of the laser light emitted from the signal transmitting unit 200 in accordance with the feedback information provided from the FBPD array 204, and provides the recognition result to the CPU 408 and the 2D-PC cell array driving unit 402. The reception direction recognition unit 406 recognizes the reception direction of the reflected light from the light receiving position on the imaging surface of the image sensor 18, and provides the recognition result to the CPU 408. The 3D image storage unit 410 stores image data captured by the image sensor 18 and provides the stored image data to the distance detection unit (TOF) 412 and the like.
The distance detection unit (TOF) 412 calculates a distance to the measuring object on the basis of the light receiving position on the imaging surface of the image sensor 18 and the time from light emission to light reception (arrival time), in accordance with the operation mode (LiDAR operation mode/flash LiDAR operation mode/light-section method operation mode) controlled by the CPU 408. The distance detection unit (TOF) 412 corresponds to the distance calculation unit 22 illustrated in
The object recognition logic 414 identifies the measuring object on the basis of a calculation result of the distance detection unit (TOF) 412.
The MCPU 500 controls the entire main system in which the 3D sensing system 100 according to the embodiments is mounted. For example, when the 3D sensing system 100 is mounted in a vehicle, the MCPU 500 corresponds to a main CPU provided on the vehicle side.
A user interface (I/F) unit 504 is connected to the MCPU 500. The user I/F unit 504 includes: an input unit 506 for a user to input instructions (e.g., start/end of sensing processing, selection of the operation mode, and the like) to the 3D sensing system 100; and an output unit 508 for presenting sensing information detected by the 3D sensing system 100 to the user. The sensing information detected by the 3D sensing system 100 may be output as an image depicting the measuring object, or as sound information, such as a warning sound.
On the basis of the image data stored and accumulated in the 3D image storage unit 410, the AI unit 502 learns the sensing results of the 3D sensing system 100, and more appropriately assists the sensing processing executed by the 3D sensing system 100.
The difference between the 3D sensing system 100 illustrated in
In the 3D sensing system 100 according to the modified example 1 of the embodiments, the 2D-PC cell array driving unit 402 executes the drive control of the 2D-PC cell array 202 on the basis of the emitting direction of laser light recognized by the transmission/reception direction recognition unit 405.
The block structural example of the 3D sensing system 100 according to the modified example 1 is the same as the block structural example of the 3D sensing system 100 according to the embodiments illustrated in
The 2D-PC cell array driving unit 402 includes an operation selection unit 4022, a LiDAR operation control unit 4024, a flash LiDAR control unit 4026, and a structured light-section control unit 4028, as illustrated in
The operation selection unit 4022 controls the LiDAR operation control unit 4024, the flash LiDAR control unit 4026, and the structured light-section control unit 4028, in accordance with the operation mode (LiDAR operation mode (M1)/flash LiDAR operation mode (M2)/light-section method operation mode (M3)) controlled by the CPU 408.
Specifically, when the operation mode is the LiDAR operation mode (M1), the LiDAR operation control unit 4024 executes the drive control of the 2D-PC cell array 202 so that one laser element is driven and the twin beam (A, B) is emitted. When the operation mode is the flash LiDAR operation mode (M2), the flash LiDAR control unit 4026 executes the drive control of the 2D-PC cell array 202 so that a certain region (sensing space) is irradiated with light for a given length of time. When the operation mode is the light-section method operation mode (M3), the structured light-section control unit 4028 executes the drive control of the 2D-PC cell array 202 so that the striped pattern light is projected onto the measuring object.
The operation selection unit 4022 executes a selection control of the three operation modes as follows, for example.
First, the flash LiDAR control unit 4026 is made to execute the drive control in accordance with the flash LiDAR operation mode (M2) (for example, a high output of approximately several hundred W). Next, the LiDAR operation control unit 4024 is made to execute the drive control in accordance with the LiDAR operation mode (M1) (for example, an output of approximately several W to several tens of W). Next, the structured light-section control unit 4028 is made to execute the drive control in accordance with the light-section method operation mode (M3).
Then, the operation selection unit 4022 may return the operation mode to the initial flash LiDAR operation mode (M2), or may terminate the processing. Moreover, the order of the processing of the flash LiDAR operation mode (M2) and the processing of the light-section method operation mode (M3) may be reversed. Alternatively, only one or two of the three operation modes may be used in combination.
In this way, the combination of the three operation modes can be selected arbitrarily, but in principle, the processing does not shift to the next operation mode until the sensing processing in the current operation mode is completed.
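A sketch of such sequencing by the operation selection unit 4022 follows; run_mode is a hypothetical blocking callback standing in for one complete sensing pass in the given operation mode.

```python
from enum import Enum

class Mode(Enum):
    M2_FLASH = "flash LiDAR (e.g., up to several hundred W)"
    M1_LIDAR = "LiDAR (e.g., several W to several tens of W)"
    M3_SECTION = "light-section method"

def run_sequence(run_mode, loop: bool = True):
    """Execute M2 -> M1 -> M3 in order; run_mode(mode) must block until
    sensing in that mode completes, so the next mode never starts early."""
    order = [Mode.M2_FLASH, Mode.M1_LIDAR, Mode.M3_SECTION]
    while True:
        for mode in order:
            run_mode(mode)  # no shift to the next mode until this returns
        if not loop:        # alternatively, return to the initial M2
            break
```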
The difference between the 3D sensing system 100 according to the modified example 2 and the 3D sensing system 100 according to the modified example 1 (
In the 3D sensing system 100 according to the modified example 2 of the embodiments, the AI unit 407 learns the sensing results of the 3D sensing system 100 on the basis of the image data stored and accumulated in the 3D image storage unit 410, and more appropriately controls the next and subsequent sensing processing executed by the 3D sensing system 100 (in particular, the transmission/reception direction recognition unit 405 and the distance detection unit (TOF) 412).
The block structural example of the 3D sensing system 100 according to the modified example 2 is the same as the block structural example of the 3D sensing system 100 according to the modified example 1 of the embodiments illustrated in
The TOF ranging system 600 irradiates a measuring object 700 with laser light A, B, measures the time until reflected light RA, RB is reflected and returned, and thereby measures the distance to the measuring object 700.
The TOF ranging system 600 includes a 2D-PC cell array 202, a PWM modulation control unit 203, a phase difference detection unit 205, an image sensor 302, an optical system 304, and a distance detection unit 412. It should be noted that, since this application uses the same time measurement principle as the flash LiDAR, a certain pulse width may be required, but an operation in which the pulse width is not changed can also be realized. Typically, in such measurement applications, pulses of several ns to ten-odd ns are generated repeatedly, as short as possible. The repetition frequency is determined in accordance with the distance to be detected: after the reflection of the first pulse from the set distance has returned and the processing is completed, the next pulse is output.
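The set distance therefore bounds the repetition frequency, as the following sketch shows; the processing-time term is an assumption, since the description states only that the next pulse waits for the return from the set distance and the completion of processing.

```python
C = 299_792_458.0  # light velocity [m/s]

def max_repetition_rate(set_distance_m: float, processing_time_s: float = 0.0) -> float:
    """Upper bound on the pulse repetition frequency: the next pulse is not
    output before the round trip 2d/c from the set distance has elapsed,
    plus an assumed per-pulse processing time."""
    round_trip_s = 2.0 * set_distance_m / C
    return 1.0 / (round_trip_s + processing_time_s)

print(f"{max_repetition_rate(150.0):.3e} Hz")  # ~1 MHz for a 150 m set distance
```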
The 2D-PC cell array 202 emits the twin beams A and B, the amplitude of which is modulated at the fundamental frequency (e.g., several hundred MHz) by the PWM modulation control unit 203. The emitted light A and the emitted light B are reflected from the measuring object 700 and received by the image sensor 302 through the optical system 304 as the reflected light RA and the reflected light RB, respectively. In this case, if there is no reflected light, it can be recognized that no object (measuring object) exists in the corresponding direction.
The phase difference detection unit 205 detects the phase differences between the emitted light A and the reflected light RA and between the emitted light B and the reflected light RB, respectively.
The distance detection unit 412 includes a distance calculation circuit 4121 configured to calculate the time on the basis of the phase difference detected by the phase difference detection unit 205, and a distance data detection unit 4122 configured to detect the distance to the measuring object 700 by multiplying the time calculated by the distance calculation circuit 4121 by the light velocity.
In the LiDAR operation mode of the TOF ranging system 600 according to the modified example 3, the above distance calculation is repeatedly executed for different emitting directions.
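A minimal sketch of the phase-difference ranging follows. Halving the round-trip path is the usual indirect-TOF convention and is assumed here; the description above states only that the time obtained from the phase difference is multiplied by the light velocity.

```python
import math

C = 299_792_458.0  # light velocity [m/s]

def phase_to_distance(delta_phi_rad: float, f_mod_hz: float) -> float:
    """Phase lag of the reflected modulation envelope -> round-trip time
    t = delta_phi / (2*pi*f_mod) -> distance C*t/2 (halving assumed).
    The unambiguous range of this scheme is C / (2*f_mod)."""
    t_round_trip = delta_phi_rad / (2.0 * math.pi * f_mod_hz)
    return C * t_round_trip / 2.0

# Example: a 90 degree lag at a 100 MHz fundamental -> ~0.37 m
print(phase_to_distance(math.pi / 2.0, 100e6))
```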
Although not illustrated, the TOF ranging system 600 according to the modified example 3 may also include the AI unit 502, the 3D image storage unit 410, the object recognition logic 414 and/or the user I/F unit 504 including the input unit 506 and the output unit 508 illustrated in
(Image Sensor applicable to 3D Sensing System (Area))
The image sensor (area) 302 is an image sensor configured to measure the distance to the measuring object by the TOF method, and outputs phase difference information of the light emission/reception timing using the PWM modulated laser light. As illustrated in
The 3D sensing system during sensing has the following functions.
In
The arrangement shown in
In an example of the twin beam using the closest-packing pattern of circles emitted from the 2D-PC SEL cell array applicable to the 3D sensing system according to the embodiments,
In
In
The beam specification of each photonic crystal surface emitting laser (PC SEL) composing the 2D array is determined by selecting a vertex pair symmetric about the origin point, as shown by the corresponding numbers in the example of the laser beam arrangement in
In the 3D sensing system according to the embodiments,
The light receiving system in the 3D sensing system according to the embodiments includes an imaging lens 16 and an image sensor (or arrayed light-receiving element) 18, and is configured to receive the reflected light R, as illustrated in
In a 3D sensing system according to a comparative example,
As illustrated in the comparative example in
On the other hand, in the 3D sensing system according to the embodiments,
As illustrated in
In an example of emitting beam control of the 2D-PC SEL cell array 120AR applicable to the 3D sensing system according to the embodiments,
The example of the arrangement state of the lattice 212A for forming optical resonance state and the lattice 212B for light-emitting illustrated in
As illustrated in
In an example of emitting beam control of the 2D-PC SEL cell array 120AR applicable to the 3D sensing system according to the embodiments,
In an example of emitting beam control of the 2D-PC SEL cell array 120AR applicable to the 3D sensing system according to the embodiments,
In an example of emitting beam control of the 2D-PC SEL cell array 120AR applicable to the 3D sensing system according to the embodiments,
In the 2D-PC SEL cell applicable to the 3D sensing system according to the embodiments,
Herein, the strip-shaped electrodes E1 to E19 described below are used as the lower electrode, instead of the lower electrode not illustrated in
The 2D-PC layer illustrated in
Strip-shaped electrodes (E1 to E5), (E6 to E12), and (E13 to E19) are provided on an upper surface of the upper substrate 242 as the upper electrode 252, as illustrated in
In the PC laser illustrated in
The 2D-PC SEL array and the image sensor, and their drive and control, have been described above, as has the support for a plurality of operation modes including the flash mode.
In the case of a system that always uses the flash operation fixedly for the entire surface of a specific region, a dedicated flash light source such as a laser or an LED can be used in addition to the PC SEL array, as another aspect of the embodiments. The details will be described hereinafter.
The 3D sensing system in the combination operation mode according to the embodiments includes a flash (FL) light source 250 for entire surface irradiation as a light source, and a 2D-PC SEL cell array 202 for irradiation of a target region.
The FL source 250 emits laser light FL to the entire surface of a specific region (sensing region). In an example illustrated in
Reflected light RVH, RVB, and RHM, emitted from the FL source 250 and reflected from the measuring objects VH, VB, and HM, is observed by a time-of-flight (TOF) camera 350 to measure the distance to each of the measuring objects VH, VB, and HM.
At this time, for example, the body color of the vehicle VH and the clothing color of the pedestrian HM are relatively bright (e.g., white based, yellow based, etc.), and the body color of the vehicle VB is relatively dark (e.g., black based, dark blue based, brown based, etc.). Since the reflectance of the vehicle VH and the pedestrian HM having relatively bright colors is relatively high and their signal to noise (S/N) ratio is also high, the TOF camera 350 can observe the vehicle VH and the pedestrian HM. However, since the reflectance of the vehicle VB having a relatively dark color is relatively low and its S/N ratio is also low, it is difficult for the TOF camera 350 to observe the vehicle VB.
Therefore, only the target region of the measuring object having low reflectance and an insufficient S/N ratio (in this case, the vehicle VB) is irradiated with a spot beam from the 2D-PC SEL cell array 202, and the reflected light is then observed. As a result, the distance can be measured with high sensitivity even for measuring objects such as the vehicle VB.
An operation flow of a flash LiDAR system according to a comparative example will now be described, with reference to
In Step S400, the entire surface of the specific region is irradiated with laser light FL from the FL source 250.
Next, in Step S401, the reflected light RVH, RVB, and RHM, emitted from the FL source 250 and respectively reflected from the measuring objects VH, VB, and HM, is observed by the TOF camera 350. In this case, if there is no reflected light, it can be recognized that no object (measuring object) exists in the corresponding direction.
Next, in Step S402, it is determined whether or not there is any region where the S/N ratio of the reflected light is lower than a predetermined threshold value T. As a result of the determination in Step S402, if there is no region where the S/N ratio of the reflected light is lower than the predetermined threshold value T (in the case of NO in Step S402), a distance image (3D image) is output in Step S403.
In contrast, as a result of the determination in Step S402, if there is any region where the S/N of the reflected light is lower than the predetermined threshold value T (in the case of YES in Step S402), a distance image to be obtained by calculating the distance of the region cannot be output. Accordingly, in Step S404, the irradiation intensity from the FL source 250 is increased, and the entire surface of a specific region is irradiated with the laser light FL again.
Next, in Step S405, the reflected light emitted from the FL source 250 and reflected from the measuring object is observed by the TOF camera 350. However, the regions other than the region where the S/N ratio was lower than the predetermined threshold value T become saturated, and therefore a distance image cannot be output. In this way, when a measuring object having low reflectance is included, distance measurement becomes difficult due to the reduction of the S/N ratio.
An operation flow of the 3D sensing system illustrated in
In Step S500, the entire surface of the specific region is irradiated with laser light FL from the FL source 250 (flash type).
Next, in Step S501, the reflected light RVH, RVB, RHM emitted from the FL source 250 and respectively reflected from the measuring objects VH, VB, HM is observed by the TOF camera 350.
In this case, if there is no reflected light, it can be recognized that no object (measuring object) exists in the corresponding direction.
Next, in Step S502, it is determined whether or not there is any region where the S/N ratio of the reflected light is lower than the predetermined threshold value T. As a result of the determination in Step S502, if there is no region where the S/N ratio of the reflected light is lower than the predetermined threshold value T (in the case of NO in Step S502), a distance image (3D image) is output in Step S503.
In contrast, as a result of the determination in Step S502, if there is any region where the S/N ratio of the reflected light is lower than the predetermined threshold value T (in the case of YES in Step S502), e.g., in the case of the vehicle VB having a relatively low reflectance and a relatively low S/N ratio, a distance image obtained by calculating the distance of that region cannot be output. Therefore, in Step S504, only the region where the S/N ratio is lower than the predetermined threshold value T is irradiated with a spot beam from the 2D-PC SEL cell array 202 (beam scanning type).
Next, in Step S505, the reflected light emitted from the 2D-PC SEL cell array 202 and reflected from the measuring object is observed by the TOF camera 350, and it is determined in Step S506 whether or not there is any region where the S/N ratio of the reflected light is lower than the predetermined threshold value T.
As a result of the determination in Step S506, if there is no region where the S/N ratio of the reflected light is lower than the predetermined threshold value T (in the case of NO in Step S506), a distance image (3D image) is output in Step S507.
In contrast, as a result of the determination in Step S506, if there is any region where the S/N ratio of the reflected light is lower than the predetermined threshold value T (in the case of YES in Step S506), although it is assumed that some kind of measuring object exists there, a distance image obtained by calculating the distance of that region cannot be output.
Therefore, in Step S508, the light intensity emitted from the 2D-PC SEL cell array 202 is increased, and the processing returns to Step S504, where only the region concerned is irradiated with a spot beam from the 2D-PC SEL cell array 202. Here, as an adjustment method for increasing the light intensity in Step S508, for example, a method of increasing the voltage supplied to the 2D-PC SEL cell array 202 can be applied.
The processing of Steps S504 to S508 is then repeated until there is no longer any region where the S/N ratio of the reflected light is lower than the predetermined threshold value T, i.e., until all the measuring objects in the region are detected.
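As a non-authoritative sketch, the overall control flow of Steps S500 to S508 can be summarized as follows. The flash_source, pc_cell_array, and tof_camera interfaces, and all of their methods, are hypothetical placeholders introduced for illustration, not elements disclosed in the embodiments.

```python
# Minimal sketch of the hybrid flash/spot-beam control flow (Steps S500 to S508).
# All driver/camera interfaces below are hypothetical placeholders.

def sense_scene(flash_source, pc_cell_array, tof_camera, snr_threshold):
    # Flash operation mode: irradiate the entire specific region (S500/S501).
    flash_source.emit()
    frame = tof_camera.observe()

    # Find regions whose reflected-light S/N falls below the threshold T (S502).
    low_snr = [r for r in frame.regions() if r.snr < snr_threshold]

    # LiDAR operation mode: spot-irradiate only the low-S/N regions (S504 to S508).
    while low_snr:
        for region in low_snr:
            pc_cell_array.emit_spot(region.direction)       # S504
        frame = tof_camera.observe()                        # S505
        low_snr = [r for r in frame.regions() if r.snr < snr_threshold]  # S506
        if low_snr:
            # S508: raise the drive level (e.g., supply voltage) and retry.
            pc_cell_array.increase_intensity()

    return frame.to_distance_image()  # S503/S507: output the 3D distance image
```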
In this way, by introducing an operation mode that combines the flash operation mode and the LiDAR operation mode into the 3D sensing system according to the embodiments, the distance to a measuring object having low reflectance can be measured even when such an object is included in the sensing region.
First, the entire surface of the specific region is irradiated with the laser light FL from the FL source 250, and the measuring objects included in the specific region are detected (flash operation mode); then, only the measuring objects that cannot be detected at that time are irradiated with the spot beam from the 2D-PC SEL cell array 202 and detected (LiDAR operation mode). Accordingly, the processing can be executed more efficiently than operating in the LiDAR operation mode from beginning to end.
The emitting angle is, for example, approximately 2° per point in a specified direction, and approximately ±60° in the horizontal direction and ±60° in the vertical direction for the entire array.
The output is, for example, approximately 0.2 W or more per point. Arranging a plurality of light sources B makes it possible to increase the output, but when the pulse width is long or the repetition frequency is high, heat-dissipation measures may be required.
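Purely as illustrative arithmetic based on the nominal figures above (the 2° step and ±60° field are examples from the text; the calculation itself is an added illustration), the number of addressable beam directions can be estimated as follows:

```python
# Illustrative arithmetic only, based on the nominal figures quoted above.
beam_step_deg = 2.0    # approx. emitting angle per point
field_deg = 120.0      # +/-60 degrees in each of the horizontal/vertical axes

points_per_axis = field_deg / beam_step_deg   # 60 addressable points per axis
total_points = points_per_axis ** 2           # ~3600 addressable directions

print(int(points_per_axis), int(total_points))  # 60 3600
```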
The laser used as the FL source 250 is, for example, a PC SEL of a surface-normal emission type or a VCSEL; the emitted laser light is appropriately spread by a lens or a diffuser and irradiates a range of ±60°. The lens used here is, for example, a ball lens, a graded-index (GI) lens, or a combination of a plurality of lenses. More expansively, irradiation within the range of ±60° can also be realized without using lenses or diffusers.
The PC laser light source for entire surface irradiation is an entire-surface irradiation type that covers, for example, a range of ±60° in the horizontal direction and ±60° in the vertical direction, and irradiates the entire surface of the sensing region with laser light FL by spreading the light emission of a single element. The emitting angle is, for example, approximately 2° per single element, and, more expansively, the source uniformly irradiates within a range of approximately ±60° in the horizontal direction and approximately ±60° in the vertical direction.
The output is, for example, approximately 5 W or more. When the pulse width is long or the repetition frequency is high, heat-dissipation measures may be required. The package used is, for example, a 5.6 mm φ stem.
As illustrated in the drawings, the signal processing unit 400 determines whether or not there is any region in the specific region where the S/N ratio of the reflected light emitted from the flash light source 250 and reflected therefrom is lower than the predetermined threshold value. When there is a region where the S/N ratio is lower than the predetermined threshold value, the signal processing unit 400 controls the 2D-PC cell array driving unit 402 so as to irradiate only the region where the S/N ratio is lower than the predetermined threshold value with spot laser light from the 2D-PC SEL cell array 202.
Moreover, the signal processing unit 400 determines whether or not there is any region where the S/N ratio of the spot reflected light emitted from the 2D-PC SEL cell array 202 and reflected is lower than the predetermined threshold value T. As a result of the determination, when there is a region where the S/N ratio is lower than the predetermined threshold value T, the signal processing unit 400 controls the 2D-PC cell array driving unit 402 to increase the light intensity emitted from the 2D-PC SEL cell array 202 and then to irradiate only the region concerned with the spot laser light from the 2D-PC SEL cell array 202.
In this case, the operation modes are the flash operation mode and the LiDAR operation mode: the flash driving unit 415 executes the drive control of the flash light source 250 when the operation mode is the flash operation mode, and the 2D-PC cell array driving unit 402 executes the drive control of the 2D-PC SEL cell array 202 when the operation mode is the LiDAR operation mode.
The 3D sensing system 100 according to the embodiments will now be described in more detail.
The 3D sensing system 100 according to the embodiments includes a signal transmitting unit 200, a signal receiving unit 300, and a signal processing unit 400, as illustrated in the drawings.
The signal transmitting unit 200 includes: a flash (FL) light source 250 for entire surface irradiation configured to emit laser light FL to the entire surface of a specific region, and a 2D-PC SEL cell array 202 configured to emit laser light to a target region in the specific region.
The signal receiving unit 300 includes an optical system 304 and an image sensor (line/area) 302 configured to receive reflected light, i.e., laser light emitted from the signal transmitting unit 200 and reflected from the measuring object.
The signal processing unit 400 includes: a control unit (CPU) 408 configured to control an operation mode of the laser light source; a transmission direction recognition unit 404 configured to recognize an emitting direction of the laser light emitted from the 2D-PC SEL cell array 202; a 2D-PC cell array driving unit 402 configured to execute a drive control of the 2D-PC SEL cell array 202, in accordance with the operation mode controlled by the CPU 408, on the basis of the emitting direction of the laser light recognized by the transmission direction recognition unit 404; an FL driving unit 415 configured to execute a drive control of the FL source 250; and a distance detection unit (TOF) 412 configured to calculate the distance to the measuring object on the basis of a light receiving position on an imaging surface of the image sensor 18 and the time from light emission to light reception, in accordance with the operation mode controlled by the CPU 408.
The FL source 250 first emits laser light FL to the entire surface of the specific region. The reflected light emitted from the FL source 250 and reflected from the measuring object is received in the signal receiving unit 300, and the distance to the measuring object is measured by the distance detection unit 412 in the signal processing unit 400.
At this time, the signal processing unit 400 determines whether or not there is any region where the S/N ratio of the reflected light is lower than the predetermined threshold value T. As a result of the determination, when there is no region where the S/N ratio of the reflected light is lower than the predetermined threshold value T, a distance image is output. In contrast, when there is a region where the S/N ratio is lower than the predetermined threshold value, the signal processing unit 400 controls the 2D-PC cell array driving unit 402 so as to irradiate only the target region where the S/N ratio is lower than the predetermined threshold value with a spot beam from the 2D-PC SEL cell array 202.
The reflected light emitted from the 2D-PC SEL cell array 202 and reflected from the measuring object is received in the signal receiving unit 300. The signal processing unit 400 determines whether or not there is any region where the S/N ratio of the reflected light is lower than the predetermined threshold value T. As a result of the determination, when there is no region where the S/N ratio of the reflected light is lower than the predetermined threshold value T, a distance image is output. In contrast, when there is a region where the S/N ratio of the reflected light is lower than the predetermined threshold value T, the signal processing unit 400 controls the 2D-PC cell array driving unit 402 to increase the light intensity emitted from the 2D-PC SEL cell array 202 and then to irradiate only the region concerned with the spot beam from the 2D-PC SEL cell array 202.
The signal transmitting unit 200 further includes the FBPD array 204 configured to execute a feedback control of the emitted laser light, and the transmission direction recognition unit 404 recognizes an emitting direction of the laser light emitted from the signal transmitting unit 200 in accordance with feedback information provided from the FBPD array 204.
The signal processing unit 400 may also include a reception direction recognition unit 406 configured to recognize a reception direction of the reflected light from the light receiving position on the imaging surface of the image sensor 18, and the 2D-PC cell array driving unit 402 executes a drive control of the 2D-PC SEL cell array 202 on the basis of the emitting direction of the laser light recognized by the transmission direction recognition unit 404 and the reception direction of the reflected light recognized by the reception direction recognition unit 406.
The signal processing unit 400 further includes an object recognition logic 414 configured to identify the measuring object on the basis of a calculation result of the distance detection unit (TOF) 412.
The 3D sensing system 100 according to the embodiments includes a signal transmitting unit 200, a signal receiving unit 300, a signal processing unit 400, a main controlling unit (MCPU) 500, and an artificial intelligence (AI) unit 502, as illustrated in the drawings.
The signal transmitting unit 200 includes: an FL source 250 for entire surface irradiation configured to emit laser light FL to the entire surface of the specific region; a 2D-PC SEL cell array 202 configured to emit laser light to the measuring object; and an FBPD array 204 configured to execute a feedback control of the emitted laser light. The FBPD array 204 corresponds to the PD 118PD described above.
The signal receiving unit 300 includes an optical system 304 and an image sensor (line/area) 302 configured to receive scattered reflected light, i.e., laser light emitted from the signal transmitting unit 200 and reflected from the measuring object.
The signal processing unit 400 includes a 2D-PC cell array driving unit 402, a transmission direction recognition unit 404, a reception direction recognition unit 406, a CPU 408, a 3D image storage unit 410, a distance detection unit (TOF) 412, and an object recognition logic 414. The CPU 408 executes an operation control of each unit on the basis of three operation modes (i.e., the LiDAR operation mode, the flash LiDAR operation mode, and the light-section method operation mode). The CPU 408 corresponds to the control unit 14 described above.
The 2D-PC cell array driving unit 402 executes a drive control of the 2D-PC SEL cell array 202 on the basis of the emitting direction of the laser light recognized by the transmission direction recognition unit 404 and the reception direction of the reflected light recognized by the reception direction recognition unit 406, in accordance with the operation mode (LiDAR operation mode/flash LiDAR operation mode/light-section method operation mode) controlled by the CPU 408. The FL driving unit 415 executes a drive control of the FL source 250.
The transmission direction recognition unit 404 recognizes an emitting direction of the laser light emitted from the signal transmitting unit 200 in accordance with the feedback information provided from the FBPD array 204, and provides a recognition result to the CPU 408, the 2D-PC cell array driving unit 402, and the FL driving unit 415. The reception direction recognition unit 406 recognizes a reception direction of the reflected light from the light receiving position on the imaging surface of the image sensor 18, and provides a recognition result to the CPU 408. The 3D image storage unit 410 stores the image data captured by the image sensor 18 and provides the stored image data to the distance detection unit (TOF) 412 and the like.
The distance detection unit (TOF) 412 calculates a distance to the measuring object on the basis of the light receiving position on the imaging surface of the image sensor 18 and the time from light emission to light reception (arrival time), in accordance with the operation mode (LiDAR operation mode/flash LiDAR operation mode/light-section method operation mode) controlled by the CPU 408. The distance detection unit (TOF) 412 corresponds to the distance calculation unit 22 described above.
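As a minimal sketch of the time-of-flight calculation performed by the distance detection unit (TOF) 412, the distance is half the round-trip time multiplied by the speed of light; the function below is an added illustration, not the disclosed implementation.

```python
# Minimal sketch of pulsed time-of-flight ranging as described above:
# the distance is half the round-trip time multiplied by the speed of light.
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(arrival_time_s: float) -> float:
    """Distance to the measuring object from the emission-to-reception time."""
    return C * arrival_time_s / 2.0

# Example: a reflection arriving 1 microsecond after emission corresponds
# to a measuring object roughly 150 m away.
print(tof_distance_m(1e-6))  # ~149.9 m
```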
The object recognition logic 414 identifies the measuring object on the basis of a calculation result of the distance detection unit (TOF) 412.
The MCPU 500 controls the entire main system in which the 3D sensing system 100 according to the embodiments is mounted. For example, when the 3D sensing system 100 is mounted in a vehicle, the MCPU 500 corresponds to a main CPU provided on the vehicle side.
A user interface (I/F) unit 504 is connected to the MCPU 500. The user I/F unit 504 includes: an input unit 506 for a user to input instructions (e.g., start/end of sensing processing, selection of operation mode, and the like) to the 3D sensing system 100; and an output unit 508 for presenting sensing information detected by the 3D sensing system 100 to the user. The sensing information detected by the 3D sensing system 100 may be output as an image depicting the measuring object, or as sound information, such as a warning sound.
On the basis of the image data stored and accumulated in the 3D image storage unit 410, the AI unit 502 learns the sensing results of the 3D sensing system 100 and more appropriately assists the sensing processing executed by the 3D sensing system 100.
The difference between the 3D sensing system 100 according to the modified example 4 and the 3D sensing system 100 described above is that the transmission direction recognition unit 404 and the reception direction recognition unit 406 are integrated into a transmission/reception direction recognition unit 405.
In the 3D sensing system 100 according to the modified example 4 of the embodiments, the 2D-PC cell array driving unit 402 executes the drive control of the 2D-PC SEL cell array 202 on the basis of the emitting direction of laser light recognized by the transmission/reception direction recognition unit 405.
The block structural example of the 3D sensing system 100 according to the modified example 4 is the same as the block structural example of the 3D sensing system 100 according to the embodiments described above.
In the same manner as described above, the FL driving unit 415 executes a drive control of the FL source 250 in accordance with the control by the CPU 408.
The difference between the 3D sensing system 100 according to the modified example 5 and the 3D sensing system 100 according to the modified example 4 is that an AI unit 407 is further provided in the signal processing unit 400.
In the 3D sensing system 100 according to the modified example 5 of the embodiments, the AI unit 407 learns a sensing result of the 3D sensing system 100 on the basis of the image data stored and accumulated in the 3D image storage unit 410, and more appropriately controls the next and subsequent sensing processing executed by the 3D sensing system 100 (in particular, the transmission/reception direction recognition unit 405 and the distance detection unit (TOF) 412).
The block structural example of the 3D sensing system 100 according to the modified example 5 is the same as that of the 3D sensing system 100 according to the modified example 4 of the embodiments described above.
In the flash operation mode, the TOF ranging system 600 irradiates a measuring object 700 with laser light FL and measures the time until the reflected light returns, thereby measuring the distance to the measuring object 700. In the LiDAR operation mode, the TOF ranging system 600 irradiates the measuring object 700 with laser light A, B and measures the time until the reflected light RA, RB returns, thereby measuring the distance to the measuring object 700.
The TOF ranging system 600 includes an FL light source 250, a 2D-PC cell array 202, a PWM modulation control unit 203, a phase difference detection unit 205, an image sensor 302, an optical system 304, and a distance detection unit 412. It should be noted that, in the LiDAR operation mode, since the same time measurement principle as that of the flash LiDAR is used, a certain amount of pulse width may be required, but an operation in which the pulse width is not changed can also be realized. Typically, in such measurement applications, pulses of several nanoseconds to over ten nanoseconds are generated repeatedly and kept as short as possible. The repetition frequency is determined in accordance with the distance to be detected: after the reflection of the first pulse from the set distance has returned and its processing has been completed, the next pulse is output.
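Since the next pulse is output only after the reflection of the first pulse from the set distance has returned, the repetition frequency is bounded by the round-trip time. The following illustrative calculation makes this concrete; the 150 m figure is an assumed example, not a value from the embodiments.

```python
# Illustrative calculation of the maximum pulse repetition frequency implied
# by the rule above: the next pulse waits for the round trip to the set distance.
C = 299_792_458.0  # speed of light in m/s

def max_repetition_hz(max_range_m: float) -> float:
    round_trip_s = 2.0 * max_range_m / C
    return 1.0 / round_trip_s  # ignores any per-pulse processing time

# Example: a 150 m set distance gives a ~1 us round trip, so the
# repetition frequency is capped at roughly 1 MHz.
print(max_repetition_hz(150.0))  # ~999,308 Hz
```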
In the LiDAR operation mode, the 2D-PC SEL cell array 202 emits the twin beams A and B, whose amplitude is modulated at a fundamental frequency (e.g., several hundred MHz) by the PWM modulation control unit 203. The emitting light A and the emitting light B are reflected from the measuring object 700 and are received by the image sensor 302 through the optical system 304 as the reflected light RA and the reflected light RB, respectively. In this case, if there is no reflected light, it can be recognized that no object (no measuring object) exists in the corresponding direction.
The phase difference detection unit 205 detects the phase difference between the emitting light A and the reflected light RA, and between the emitting light B and the reflected light RB, respectively.
The distance detection unit 412 includes a distance calculation circuit 4121 configured to calculate the time on the basis of the phase difference detected by the phase difference detection unit 205, and a distance data detection unit 4122 configured to detect the distance to the measuring object 700 by multiplying the time calculated by the distance calculation circuit 4121 by the speed of light.
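For reference, the phase-based distance calculation described above can be sketched using the standard indirect time-of-flight relation d = c * delta_phi / (4 * pi * f_mod). This is a generic textbook formulation rather than the disclosed circuit, and the 100 MHz example frequency is only an assumption following the "several hundred MHz" figure mentioned above.

```python
# Minimal sketch of phase-difference ranging using the standard indirect-TOF
# relation d = c * delta_phi / (4 * pi * f_mod). The factor of 1/2 accounts
# for the round trip; f_mod is the modulation (fundamental) frequency.
import math

C = 299_792_458.0  # speed of light in m/s

def phase_to_distance_m(delta_phi_rad: float, f_mod_hz: float) -> float:
    round_trip_s = delta_phi_rad / (2.0 * math.pi * f_mod_hz)
    return C * round_trip_s / 2.0

# Example: a phase shift of pi/2 at a 100 MHz modulation frequency
# corresponds to a distance of about 0.37 m.
print(phase_to_distance_m(math.pi / 2, 100e6))
```

Note that a single modulation frequency f_mod has an unambiguous range of c / (2 * f_mod), approximately 1.5 m at 100 MHz, which is one reason the modulation frequency is generally chosen in accordance with the required distance range.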
In the LiDAR operation mode of the TOF ranging system 600 according to the modified example 6, the above distance calculation is repeatedly executed for different emitting directions.
Although not illustrated, the TOF ranging system 600 according to the modified example 6 may also include the AI unit 502, the 3D image storage unit 410, the object recognition logic 414, and/or the user I/F unit 504 including the input unit 506 and the output unit 508, which are described above.
As described above, according to the embodiments, there can be provided the 3D sensing system, having higher accuracy, higher output, miniaturization, and robustness, as well as higher adaptability to sensing regions and sensing objects, and capable of supporting a plurality of sensing modes.
The present embodiments have been described above as a disclosure including the associated description and drawings, which are to be construed as illustrative, not restrictive. This disclosure will make clear to those skilled in the art a variety of alternative embodiments, working examples, and operational techniques.
Accordingly, the embodiments cover a variety of other embodiments, whether described herein or not.
The 3D sensing system according to the embodiments is available, for example, as sensing technology for assisting safe driving of vehicles, such as an in-vehicle sensor configured to detect the distance to and the shape of measuring objects existing around the vehicles; and is further available also as sensing technology for realizing advanced automatic driving systems. Moreover, it is applicable not only to vehicles but also to aircrafts, satellites, spacecraft, ships, etc. Furthermore, it is also applicable to a wide range of fields, including geology, seismology, and oceanography.
Foreign priority data:
Number: 2019-129075; Date: Jul. 2019; Country: JP; Kind: national
Number: 2020-059842; Date: Mar. 2020; Country: JP; Kind: national
This is a continuation application (CA) of PCT Application No. PCT/JP2020/027059, filed on Jul. 10, 2020, which is based upon and claims the benefit of priority from Japanese Patent Application Nos. P2019-129075, filed on Jul. 11, 2019, and P2020-059842, filed on Mar. 30, 2020; the entire contents of each of these applications are incorporated herein by reference.
Related US application data:
Parent: PCT/JP2020/027059; Date: Jul. 2020; Country: US
Child: 17569922; Country: US