2D AREA ARRAY CAMERA-LINE LASER 3D SENSOR JOINT CALIBRATION METHOD AND DEVICE

Information

  • Patent Application
  • 20250085101
  • Publication Number
    20250085101
  • Date Filed
    March 28, 2023
  • Date Published
    March 13, 2025
  • Inventors
  • Original Assignees
    • JIANGSU JITRI INTELLIGENT OPTOELECTRONIC SYSTEM RESEARCH INSTITUTE CO., LTD.
Abstract
A 2D area array camera-line laser 3D sensor joint calibration method, comprising: establishing a 3D coordinate system of the line laser 3D sensor, and obtaining 3D coordinates of a light bar point on the 2D area array camera; placing a target at a common measurement position, and allowing the 2D area array camera to acquire images of the intersection lines between the laser plane and three cylinders, together with the image coordinates of the three corresponding laser light bars; calculating 3D coordinates of the light bar points in the 2D area array camera coordinate system, and calculating three ellipse center position coordinates; directly acquiring, from the line laser 3D sensor, 3D coordinates of the three laser light bars in the 3D sensor coordinate system, and calculating three ellipse center position coordinates; and calculating a conversion distance between the two coordinate systems.
Description
TECHNICAL FIELD

The present invention relates to a 2D area array camera-line laser 3D sensor joint calibration method and device and belongs to the technical field of visual 3D measurement.


RELATED ART

3D sensors based on line lasers have become an important sensor type on the market, and there are many mature products. A line laser 3D sensor projects a line laser onto the surface of an object to be measured; after being modulated by the surface of the object, a light bar is formed. After the light bar is captured by an image sensor and processed by computation, the 3D point coordinates of the light bar on the object surface can be output directly. The measurement coordinate system is defined by the sensor itself, and its specific position, as well as the internal parameters of the 3D sensor, are typically not disclosed, which is inconvenient when building multi-type, heterogeneous visual measurement systems.


Currently, in a multi-view measurement system that uses two or more industrial cameras, the measurement results of each individual visual sensor need to be unified into a common 3D coordinate system, so the relative positional relationship between the sensors must be known, represented by a rotation matrix R and a translation vector t. The process of obtaining R and t is called extrinsic calibration of binocular or multi-camera systems. Generally, it relies on the intrinsic parameters of each industrial camera and is completed with the help of 2D (two-dimensional) planar targets, 3D (three-dimensional) targets, etc. However, integrated 3D sensors generally only output 3D coordinates in their own defined coordinate system and disclose neither their intrinsic parameters nor the images captured by their internal camera. Existing traditional methods therefore cannot be used for multi-sensor extrinsic calibration. For example, in a combined measurement system using a line laser 3D sensor and a 2D area array camera, the output of the line laser 3D sensor needs to be transformed into the coordinate system of the 2D area array camera, which requires the relative positional relationship between the 2D area array camera and the 3D sensor. However, since the 3D sensor discloses neither its intrinsic parameters nor its image information, the traditional binocular calibration method cannot be applied.


SUMMARY OF INVENTION

An objective of the present invention is to provide a 2D area array camera-line laser 3D sensor joint calibration method and device to solve the defects of the prior art.


A 2D area array camera-line laser 3D sensor joint calibration method, including:

    • combining a line laser of a line laser 3D sensor and a 2D area array camera into a line structured light vision model, and calibrating the line structured light vision model by using a 2D plane target method;
    • establishing a 3D coordinate system of the line laser 3D sensor with an origin at an optical center of the 2D area array camera, and obtaining 3D coordinates of a light bar point on the 2D area array camera;
    • placing a target with no less than three cylinder features at a common measurement position of the line laser 3D sensor and the 2D area array camera, and allowing the 2D area array camera to acquire images of the intersection lines between the laser plane and three cylinders, and then acquire the image coordinates of three laser light bars respectively;
    • calculating 3D coordinates of the light bar point in a 2D area array camera coordinate system according to the 3D coordinates of the light bar point, and implementing an ellipse fitting algorithm to calculate three ellipse center position coordinates respectively;
    • directly acquiring 3D coordinates of three sets of laser light bars in the 3D sensor coordinate system from the line laser 3D sensor, and implementing the ellipse fitting algorithm to calculate three ellipse center position coordinates respectively; and
    • according to the 3D coordinates in the 2D area array camera and the 3D coordinates in the line laser 3D sensor, calculating the coordinate transformation between the two coordinate systems, thus completing the joint calibration.


Further, the operation of obtaining the 3D coordinates of the light bar point on the 2D area array camera includes:

    • setting an X-axis of the line structured light vision model as a transverse axis of a photosensitive surface and a Y-axis as a longitudinal axis of the photosensitive surface;
    • obtaining intrinsic parameters of the 2D area array camera, expressed as







$$K=\begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix},$$






    • then obtaining a relationship between the image coordinates of a certain light bar point on the 2D area array camera and its 3D coordinates as follows:











$$s\cdot\begin{bmatrix} u & v & 1 \end{bmatrix}^T = K\cdot\begin{bmatrix} x & y & z \end{bmatrix}^T,$$






    • after calibration of the line structured light vision model, obtaining the equation of its laser plane in the 2D area array camera coordinate system, expressed as:












$$\begin{bmatrix} a & b & c & d \end{bmatrix}\cdot\begin{bmatrix} x & y & z & 1 \end{bmatrix}^T = 0,$$






    • from the above equations, obtaining 3D coordinates of the certain light bar point on the 2D area array camera as follows:












$$x = z\cdot(u-u_0)/f_x,$$
$$y = z\cdot(v-v_0)/f_y,$$
$$z = -d\cdot\big[\,a\cdot(u-u_0)/f_x + b\cdot(v-v_0)/f_y + c\,\big]^{-1},$$









    • where: fx represents an x-axis component of a camera focal length, fy represents a y-axis component of the camera focal length, u0 represents a u-axis coordinate of a camera image origin, v0 represents a v-axis coordinate of the camera image origin, u represents a u-axis coordinate of a camera image pixel point, v represents a v-axis coordinate of the camera image pixel point, x represents an x-axis coordinate of a measured point in a camera space coordinate system, y represents a y-axis coordinate of the measured point in the camera space coordinate system, and z represents a z-axis coordinate of the measured point in the camera space coordinate system; a, b, c, and d are laser plane coordinate coefficients; T represents a vector transpose and s represents a scale factor.
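These equations amount to intersecting the camera ray through pixel (u, v) with the calibrated laser plane. A minimal sketch in code (the function name and data layout are our own, assuming K and the plane coefficients a, b, c, d have already been obtained from the line structured light calibration):

```python
import numpy as np

def light_bar_point_3d(u, v, K, plane):
    """Back-project image point (u, v) of the laser light bar into 3D camera
    coordinates, assuming the point lies on the calibrated laser plane
    a*x + b*y + c*z + d = 0 (the ray must not be parallel to the plane)."""
    fx, fy = K[0, 0], K[1, 1]
    u0, v0 = K[0, 2], K[1, 2]
    a, b, c, d = plane
    xn = (u - u0) / fx          # normalized ray direction, x component
    yn = (v - v0) / fy          # normalized ray direction, y component
    # Substituting x = z*xn, y = z*yn into the plane equation gives the depth
    z = -d / (a * xn + b * yn + c)
    return np.array([z * xn, z * yn, z])
```

For instance, with a laser plane parallel to the image plane at depth 500 (coefficients (0, 0, 1, -500)), the principal point back-projects to (0, 0, 500).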





Further, the cylinder features are installed in parallel on a plane, the central axes of any three cylinders are not coplanar, and the relative positional relationship between the cylinders remains unchanged after installation.


Further, the target is placed at least once to complete the calibration.


Furthermore, the step of placing the target ensures that the intersection lines between the laser plane and the three cylinders are obtained and, at the same time, are within the acquisition ranges of the 3D sensor and the 2D area array camera.


A 2D area array camera-line laser 3D sensor joint calibration device, including a calibration target with no less than three cylinders, a line laser 3D sensor and a 2D area array camera;

    • wherein central axes of all the cylinders of the target are parallel to each other, and the central axes of any three cylinders are not in the same plane;
    • during calibration, the target with multi-cylinders is placed in a common measurement area of the line laser 3D sensor and the 2D area array camera, and then a line laser of the line laser 3D sensor intersects with the cylinders to form three arc-shaped laser light bars; the 2D area array camera and the line laser form a line structured light vision model, and the calibration is completed by using a plane target method.


Compared with the prior art, the present invention has the following beneficial effects.


The present invention provides a multi-cylinder target-based 2D area array camera-line laser 3D sensor joint calibration method and device. Compared with traditional methods, the present invention has the following advantages. First, the measurement results of the 3D sensor can be used directly, without requiring its intrinsic parameters to be disclosed or its 2D images to be obtained. Second, only the cylindricity of the cylinders needs to be guaranteed, rather than machining a particularly precise target; the difficulty of turning such cylinders is low, and there is no need to obtain the coordinates of target feature points in advance. Third, without considering random errors, the joint calibration can be completed with only one placement and acquisition, rather than relying on multiple placements.





BRIEF DESCRIPTION OF DRAWINGS


FIG. 1 is a schematic diagram of a device according to the present invention;





where 1—line laser 3D sensor; 2—2D area array camera; 3—target; 4—laser plane; 5—elliptical arc segments formed by the intersection of the laser plane and the cylinder features of the target.


DESCRIPTION OF EMBODIMENTS

In order to make the technical means, creative features, objectives and effects achieved by the present invention easy to understand, the present invention will be further elaborated below in conjunction with specific embodiments.


The present invention discloses a 2D area array camera-line laser 3D sensor joint calibration method, including the following steps.

    • (1) A target with no less than three cylinder features is made, wherein the cylinder features are installed in parallel on a plane, the central axes of any three cylinders are not coplanar, and the relative positional relationship between the cylinders remains unchanged after installation.
    • (2) The positions of a line laser 3D sensor and a 2D area array camera are fixed, wherein their measurement areas overlap.
    • (3) The line laser of the line laser 3D sensor and the 2D area array camera are combined into a separate line structured light vision model, and the line structured light vision model is calibrated by using a 2D plane target method; a 3D coordinate system with an origin at the optical center of the 2D area array camera is established, wherein the X-axis is the transverse axis of the photosensitive surface and the Y-axis is the longitudinal axis of the photosensitive surface.


The intrinsic parameters of the 2D area array camera are obtained as







$$K=\begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix},$$






    • then obtaining a relationship between the image coordinates of a certain light bar point on the 2D area array camera and its 3D coordinates as follows:











$$s\cdot\begin{bmatrix} u & v & 1 \end{bmatrix}^T = K\cdot\begin{bmatrix} x & y & z \end{bmatrix}^T,$$






    • after calibration of the line structured light vision model, obtaining the equation of its laser plane in the 2D area array camera coordinate system, expressed as:












$$\begin{bmatrix} a & b & c & d \end{bmatrix}\cdot\begin{bmatrix} x & y & z & 1 \end{bmatrix}^T = 0,$$






    • from the above equation, obtaining 3D coordinates of the certain light bar point on the 2D area array camera as follows:












$$x = z\cdot(u-u_0)/f_x,$$
$$y = z\cdot(v-v_0)/f_y,$$
$$z = -d\cdot\big[\,a\cdot(u-u_0)/f_x + b\cdot(v-v_0)/f_y + c\,\big]^{-1},$$









    • where: fx represents an x-axis component of a camera focal length, fy represents a y-axis component of the camera focal length, u0 represents a u-axis coordinate of a camera image origin, v0 represents a v-axis coordinate of the camera image origin, u represents a u-axis coordinate of a camera image pixel point, v represents a v-axis coordinate of the camera image pixel point, x represents an x-axis coordinate of a measured point in a camera space coordinate system, y represents a y-axis coordinate of the measured point in the camera space coordinate system, and z represents a z-axis coordinate of the measured point in the camera space coordinate system; a, b, c, and d are laser plane coordinate coefficients; T represents a vector transpose and s represents a scale factor.

    • (4) The target is placed at the common measurement position of the line laser 3D sensor and the 2D area array camera. The 2D area array camera acquires images of the intersection lines between the laser plane and the three cylinders, and then acquires the image coordinates of the three laser light bars respectively. Based on the above formulas, their 3D coordinates in the 2D area array camera coordinate system are obtained, and the ellipse fitting algorithm is implemented to calculate the three ellipse center position coordinates, denoted as C1, C2, and C3, in the form of column vectors. If there are more than three cylinders, the center position coordinates are denoted as C1, C2, C3, …, Cn.

    • (5) The 3D coordinates of the three sets of laser light bars in the 3D sensor coordinate system are directly obtained from the line laser 3D sensor, and the ellipse fitting algorithm is implemented to calculate the three ellipse center position coordinates, denoted as D1, D2, and D3, in the form of column vectors. If there are more than three cylinders, the center position coordinates are denoted as D1, D2, D3, …, Dn.

    • (6) The conversion from the 3D sensor to the 2D area array camera is set as (R, t), where R represents the rotation matrix and t represents the translation vector; then, according to the following formula, the least squares method is implemented to calculate R and t to complete the joint calibration, expressed as:










$$\begin{bmatrix} C_1 & C_2 & C_3 & \cdots & C_n \end{bmatrix} = R\begin{bmatrix} D_1 & D_2 & D_3 & \cdots & D_n \end{bmatrix} + t\cdot\begin{bmatrix} 1 & 1 & 1 & \cdots & 1 \end{bmatrix}.$$
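The patent calls for a least-squares solution of R and t from the corresponding ellipse centers but does not name a solver. One common closed-form choice is the SVD-based (Kabsch) method; the sketch below is illustrative (the function name and the 3×n column-vector layout are assumptions):

```python
import numpy as np

def rigid_transform(D, C):
    """Least-squares rigid transform (R, t) with C ≈ R @ D + t, where D and C
    are 3 x n arrays of corresponding center coordinates (column vectors) in
    the 3D sensor and 2D camera coordinate systems (Kabsch/SVD method)."""
    d_mean = D.mean(axis=1, keepdims=True)
    c_mean = C.mean(axis=1, keepdims=True)
    H = (D - d_mean) @ (C - c_mean).T            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guarantees a proper rotation (det(R) = +1, no reflection)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = c_mean - R @ d_mean
    return R, t
```

With exact correspondences, three non-collinear centers already determine R and t uniquely, which is consistent with the target requiring no fewer than three cylinders.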







As shown in FIG. 1, the present invention further discloses a 2D area array camera-line laser 3D sensor joint calibration device, including:

    • designing and using a calibration target with no less than three cylinders;
    • wherein the target can be made by only fixing the three machined cylinders on a flat base plate, without the need to obtain the relative positional relationship of the three cylinders in advance;
    • central axes of all the cylinders of the target are parallel to each other, and the central axes of any three cylinders are not in the same plane.


During calibration, the multi-cylinder target is placed in the common measurement area of a line laser 3D sensor and a 2D area array camera, and then the line laser of the line laser 3D sensor intersects with the cylinders to form three arc-shaped laser light bars. The 2D area array camera and the line laser form a line structured light vision model, and the calibration is completed by using a plane target method. Then, the 3D point coordinates of the three light bars in the 2D area array camera coordinate system can be obtained through the line structured light vision model.


The three light bars are all elliptical arc segments on the laser plane. The ellipse fitting algorithm can be implemented to obtain the ellipse center coordinates, which are the coordinates of the intersections between the laser plane and the central axes of the three cylinders, expressed in the 2D area array camera coordinate system. Meanwhile, the 3D point coordinates of the three light bars in the 3D sensor coordinate system can be obtained directly from the sensor, and the coordinates of the intersections between the laser plane and the central axes of the three cylinders are obtained by the same method, expressed in the 3D sensor coordinate system. Since these intersections are fixed and unique, the coordinate transformation between the two coordinate systems can be calculated using the coordinate values of the three intersections in the 2D area array camera coordinate system and the 3D sensor coordinate system, thus completing the joint calibration.
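The specific ellipse fitting algorithm is not prescribed. As an illustration, assuming the light bar points have already been expressed in a 2D coordinate frame on the laser plane, a general conic can be fitted by least squares and its center recovered analytically (names are ours; a production implementation would typically use a dedicated direct ellipse fit):

```python
import numpy as np

def ellipse_center(pts):
    """Fit a conic A*x^2 + B*x*y + C*y^2 + D*x + E*y + F = 0 to 2D points
    (an n x 2 array) by least squares and return the ellipse center."""
    x, y = pts[:, 0], pts[:, 1]
    M = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    # The smallest right singular vector minimizes ||M @ p|| subject to ||p|| = 1
    _, _, Vt = np.linalg.svd(M)
    A, B, C, D, E, F = Vt[-1]
    # The center is where the conic's gradient vanishes:
    # [2A  B; B  2C] @ [xc, yc]^T = [-D, -E]^T
    xc, yc = np.linalg.solve(np.array([[2 * A, B], [B, 2 * C]]), [-D, -E])
    return xc, yc
```

This works on arc segments as well as full ellipses, although short arcs make the fit increasingly sensitive to noise, which is one reason multiple target placements help reduce random error.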


In this embodiment, the calibration can be completed by placing the target at least once, and the impact of random errors can be reduced by placing the target multiple times.


The target placement only needs to ensure that the intersection lines between the laser plane and the three cylinders are obtained, and at the same time, are within the acquisition ranges of the 3D sensor and the 2D area array camera. There is no requirement for the target placement angle.


The above are only the preferred embodiments of the invention. It should be pointed out that those of ordinary skill in the art can also make several improvements and modifications without departing from the technical principles of the invention, and these improvements and modifications should also be regarded as falling within the scope of the invention.

Claims
  • 1. A two-dimensional area array camera-line laser three-dimensional sensor joint calibration method, characterized by comprising: combining a line laser three-dimensional sensor and a two-dimensional area array camera into a line structured light vision model, and calibrating the line structured light vision model by using a two-dimensional plane target method; establishing a three-dimensional coordinate system of the line laser three-dimensional sensor with an origin at an optical center of the two-dimensional area array camera, and obtaining three-dimensional coordinates of a light bar point on the two-dimensional area array camera; placing a target with no less than three cylinder features at a common measurement position of the line laser three-dimensional sensor and the two-dimensional area array camera, and allowing the two-dimensional area array camera to acquire images of intersection lines of three sets of laser plane and cylinders and then acquire image coordinates of three sets of laser light bars respectively; calculating three-dimensional coordinates of the light bar point in a two-dimensional area array camera coordinate system according to the three-dimensional coordinates of the light bar point, and implementing an ellipse fitting algorithm to calculate three ellipse center position coordinates respectively; directly acquiring three-dimensional coordinates of three sets of laser light bars in the three-dimensional sensor coordinate system from the line laser three-dimensional sensor, and implementing the ellipse fitting algorithm to calculate three ellipse center position coordinates respectively; and according to the three-dimensional coordinates in the two-dimensional area array camera and the three-dimensional coordinates in the line laser three-dimensional sensor, calculating a conversion distance between two coordinate systems, thus completing joint calibration.
  • 2. The two-dimensional area array camera-line laser three-dimensional sensor joint calibration method according to claim 1, wherein obtaining the three-dimensional coordinates of the light bar point on the two-dimensional area array camera comprises: setting an X-axis of the line structured light vision model as a transverse axis of a photosensitive surface and a Y-axis as a longitudinal axis of the photosensitive surface; obtaining intrinsic parameters of the two-dimensional area array camera, expressed as
  • 3. The two-dimensional area array camera-line laser three-dimensional sensor joint calibration method according to claim 1, wherein the cylinder features are installed in parallel on a plane, any three of central axes of the plurality of cylinders are not coplanar, and the relative positional relationship between the cylinders remains unchanged after installation.
  • 4. The two-dimensional area array camera-line laser three-dimensional sensor joint calibration method according to claim 1, wherein the target is placed at least once to complete the calibration.
  • 5. The two-dimensional area array camera-line laser three-dimensional sensor joint calibration method according to claim 1, wherein step of placing the target ensures that the intersection lines between the laser plane and the three cylinders are obtained, and at the same time, are within acquisition ranges of the three-dimensional sensor and the two-dimensional area array camera.
  • 6. A two-dimensional area array camera-line laser three-dimensional sensor joint calibration device, comprising a target for calibrating with no less than three cylinders, a line laser three-dimensional sensor and a two-dimensional area array camera; wherein central axes of all the cylinders of the target are parallel to each other, and the central axes of any three cylinders are not in a same plane; during calibration, the target with multi-cylinders is placed in a common measurement area of the line laser three-dimensional sensor and the two-dimensional area array camera, and then a line laser of the line laser three-dimensional sensor intersects with the cylinders to form three arc-shaped laser light bars; the two-dimensional area array camera and the line laser form a line structured light vision model, and the calibration is completed by using a plane target method.
Priority Claims (1)
Number Date Country Kind
202211706672.5 Dec 2022 CN national
PCT Information
Filing Document Filing Date Country Kind
PCT/CN2023/084250 3/28/2023 WO