This manuscript presents a comprehensive model-based mathematical method for calibration of a gantry robot via a dual-camera vision system. We assume the axes of the gantry system have angular misalignment in the horizontal plane, i.e., the axes are not perfectly perpendicular to each other. Moreover, motion along the X-axis and Y-axis is linearly scaled by unknown scaling factors. We also assume the upward looking camera (ULC) and downward looking camera (DLC) are installed at minute angles on the gantry system. The axes of the camera view are presumed perpendicular to each other but not perfectly aligned with the XY axes of the reference plate. It is further assumed that the tooling system is installed with some misalignment on the X-axis of the gantry system, such that the tool axis makes slight angles with the normal to the reference plate. The method presented in this manuscript provides a sufficiently accurate, fast, and affordable calibration of a gantry robot. Implementing this approach significantly reduces commissioning time: the time-consuming trial-and-error procedure on the manufacturing floor is replaced by a multi-step automatic procedure with no human interference.
Robotic systems have become increasingly important in the manufacturing of devices. These systems can perform repetitive tasks with greater accuracy and efficiency than human labor. Such robots are used in the manufacturing of products such as medical devices and smart gadgets, whose small components make picking and placing undesirable for human labor. The accuracy and precision of product assembly are directly related to the accuracy of the robotic system built for assembly. In the design of any basic or complicated assembly machine, accuracy requirements are inevitably set in advance and must be met in the design process to yield a reliable and accurate assembled product. To maintain the predefined machine accuracy within an acceptable range, a calibration process must be carried out. Eliminating or minimizing the factors that cause inaccuracy in the machine motion is a fundamental aspect of machine calibration.
Conventionally, calibration is performed using calibrators to establish the correlation between the motion commanded to the machine and known reference values. This time-consuming process, in essence, teaches the machine to produce more accurate results than it would otherwise. A method that minimizes the time spent while improving the accuracy of the machine and establishing reliable post-calibration results would therefore be advantageous.
U.S. Pat. No. 8,180,487 describes a method of calibrating a vision-based robotic system. The approach is based on trial and error. It moves a calibration pin to a calibration block that includes an optical transmitter-receiver pair, then moves the pin to the center point of the transmitted optical beam to determine a camera center position. This approach does not establish a mathematical model accounting for misalignment and scaling of the gantry system, camera installation error, or tool misalignment.
The method presented in U.S. Pat. No. 7,627,395 is also based on trial and error. It moves a calibration wafer supported by a robot to a reference position within the semiconductor processing system, views the calibration wafer to obtain wafer position data, and uses the wafer position data to correct the reference position. The method presented in U.S. Pat. No. 6,816,755 uses a tool and one camera mounted on the robot; at least six target features, which are normal features of the object, are selected. The features are used to train the robot in the frame of reference of the object so that when the same object is subsequently located, the robot's path of operation can be quickly transformed into that frame of reference.
There are also robot calibration methods based on geometry for a 6-degrees-of-freedom (DOF) robot manipulator. For example, one calibration device uses three laser beams. The coordinates of the laser spots on the table in the world coordinate system are obtained by processing image data from a CCD camera fixed above. The mapping between the world coordinate system and the robot base coordinate system can then be used to locate the robot within its environment through geometric analysis.
In U.S. Pat. No. 8,135,208, a method for calibrating a vision-based robotic system is presented. The machine includes two cameras, a calibration gauge with an alignment mark, and a robotic tool with an alignment fiducial. In this method, a camera center position is determined using the first camera and the calibration gauge, a robotic tool center position is determined using the second camera and the robotic tool, and a first camera-to-tool offset value is calculated.
A camera-based auto-alignment process for a robotic arm is presented in United States Patent Publication 2014/0100694. The gripper unit and camera unit are aligned on two roughly parallel axes. Images are analyzed to calibrate the camera's axis of view against the gripper axis, providing an XY calibration of the robotic arm. The gripper unit is calibrated on the Z-axis using optical calibration with landmarks provided on a second calibration gauge, by moving the gripper unit toward the work surface until it makes contact. Once calibrated, the camera can be used to identify one or more landmarks at known locations on the work surface to align the robotic arm with the work surface.
This manuscript focuses on a comprehensive model-based mathematical method for calibration of the gantry system via machine vision. The method is sufficiently accurate, fast, and affordable. We assume the axes of the gantry system have angular misalignments of α and β radians in the horizontal plane, and that movement along the X-axis and Y-axis is scaled by Sx and Sy, respectively. Moreover, the cameras are installed at minute angles on the gantry system; therefore, the axes of the camera view, which are perpendicular to each other, are not aligned perfectly with the reference plate axes. It is also assumed that the tool is installed with some deviation on the X-axis of the gantry system, such that the tool axis makes slight angles with the normal to the reference plate.
Our approach focuses on a comprehensive model-based mathematical method for calibration of the gantry system and tool via machine vision. The method is sufficiently accurate, fast, and affordable, and implementing it significantly reduces commissioning time.
A close-up perspective view of the gantry robotic system, designed for assembling products with tight tolerances at micrometer-level accuracy, is illustrated in
In the following, various embodiments of the invention will be described in detail. Such details are included to facilitate understanding and implementation of the invention. To find the above-mentioned unknown parameters of the gantry robot, multiple coordinate systems, {C}, {G″}, {G′}, {G} and {W}, are established. The coordinate system {C} is the camera coordinate system, established at the center of the camera with perfectly perpendicular X-Y axes; {G} is the gantry robot coordinate system; {W} is the world coordinate system; and {G″}, {G′} are intermediate coordinate systems between {G} and {W}. As shown in
In the gantry robot, two cameras are used for calibration, the DLC and the ULC. The upward looking camera (ULC), indexed with number (12), is installed at angle γ2 and the downward looking camera (DLC), indexed with number (11), is installed at angle γ1 radians with respect to the X-axis of the gantry system as shown in
It is also presumed that the tool is installed at a slight angle on the gantry system such that the relative distance between the end of the tool and the camera view center is δx along the X-axis and δy along the Y-axis, as shown in
A comprehensive model-based mathematical method for calibration of the gantry system and the tool via machine vision is developed here. In this automatic calibration process we determine the following parameters, which are the main causes of inaccuracy in the positioning and assembly of components:
In the following sections, a mathematical approach is employed and a solution for finding these unknown parameters of the gantry robot is described in detail.
In this section, with reference to
Y1×20 = S1 · X1×20
S1 = (XᵀX)⁻¹XᵀY  (1)
where X is the distance between two dots on the Dot-grid as given by the manufacturer and Y is the measured distance between the same two dots in mm.
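As a sketch, the least-squares estimate of equation (1) can be computed with NumPy. The distances below are hypothetical, not values from the manuscript; with a vector of 20 distances the normal-equation solution reduces to a ratio of dot products.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: X holds the 20 nominal dot-grid distances (mm) given
# by the manufacturer; Y holds the corresponding measured distances (mm),
# generated here with a true scale of 1.02 plus small measurement noise.
X = np.linspace(1.0, 20.0, 20)
Y = 1.02 * X + rng.normal(0.0, 1e-3, 20)

# Equation (1): the least-squares scale S1 = (X^T X)^-1 X^T Y, which for
# a vector X reduces to a ratio of dot products.
S1 = (X @ Y) / (X @ X)
```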
Again, set the working distance between camera lens and Dot-grid to Zmax and capture the image. Then, with the help of above-mentioned nine points and 20 distances between these points, the value of S2 can be calculated as:
S2 = (XᵀX)⁻¹XᵀY  (2)
Finally, set the working distance to Z and find the scale value S from Zmin, Zmax, S1 and S2 by linear interpolation/extrapolation as:

S = S1 + (S2 − S1)(Z − Zmin)/(Zmax − Zmin)  (3)
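The interpolation step can be sketched as follows; this is a minimal sketch assuming standard linear interpolation between the scale S1 calibrated at Zmin and S2 calibrated at Zmax, with hypothetical working distances and scales:

```python
def interp_scale(Z, Zmin, Zmax, S1, S2):
    """Linearly interpolate (or extrapolate) the pixel-to-mm scale at
    working distance Z from S1 measured at Zmin and S2 measured at Zmax."""
    return S1 + (S2 - S1) * (Z - Zmin) / (Zmax - Zmin)

# Hypothetical values: at the midpoint of the working range, the scale is
# the average of the two calibrated scales.
S = interp_scale(Z=110.0, Zmin=100.0, Zmax=120.0, S1=1.00, S2=1.04)
```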
It is assumed that the axes of gantry robot have angular misalignment α and β radians with respect to the world coordinate system in the horizontal plane as shown in
The gantry robot is calibrated via three precision points A, B, C on the reference plate. To do so, the coordinate systems {C}, {G″}, {G′}, {G} and {W} are established on camera, gantry system, medium coordinate systems and reference plate, respectively, and then transformation matrices are formed from the camera to world coordinate system. The solution for unknown parameters including α, β, γ, Sx, Sy will be found in an iterative computational formula.
From
{C}→{G″}→{G′}→{G}→{W}
Therefore, the transformation matrix from camera to world coordinate system would be written as:
WTC = WTG · GTG′ · G′TG″ · G″TC  (4)
The transformation from gantry coordinate system {G} to world coordinate system {W} is:
Transformation from {G′} to {G} could be written as:
Transformation from medium coordinate {G″} to {G′} is:
Transformation from camera coordinate system {C} to {G″} is:
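The chain {C}→{G″}→{G′}→{G}→{W} of equation (4) amounts to composing homogeneous transformation matrices. The sketch below uses 2-D homogeneous matrices; the particular assignment of rotation, scaling, and translation to each intermediate frame, and all numeric values, are illustrative assumptions, not the manuscript's elided transformation equations:

```python
import numpy as np

def rot(theta):
    """2-D homogeneous rotation by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def trans(tx, ty):
    """2-D homogeneous translation by (tx, ty)."""
    return np.array([[1.0, 0.0, tx],
                     [0.0, 1.0, ty],
                     [0.0, 0.0, 1.0]])

def scale(sx, sy):
    """Axis scaling, standing in for the Sx, Sy factors."""
    return np.diag([sx, sy, 1.0])

# Illustrative assignment of effects to frames (assumed, not from the text):
T_Gpp_C  = rot(0.005)            # {C} -> {G''}: camera installation angle
T_Gp_Gpp = scale(1.001, 0.999)   # {G''} -> {G'}: axis scaling
T_G_Gp   = rot(0.002)            # {G'} -> {G}: axis misalignment
T_W_G    = trans(10.0, 5.0)      # {G} -> {W}: gantry origin in world frame

# Equation (4): chain the transforms from camera to world.
T_W_C = T_W_G @ T_G_Gp @ T_Gp_Gpp @ T_Gpp_C

p_C = np.array([1.0, 2.0, 1.0])  # a point in camera coordinates
p_W = T_W_C @ p_C                # the same point in world coordinates
```

Because each factor is homogeneous, the composed matrix maps camera-frame points directly into the world frame in a single multiplication.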
The commanded position of the gantry that locates the center of the camera's image at the current position (CØ) is assumed to be GPCØ:
It is assumed that the calibrated output of the camera for the object's reference point (o) in camera coordinates (CPo) is available as:
The same point (Po) in the world coordinate system could be expressed as WPo and obtained as:
WPo = (WTG · GTG′ · G′TG″ · G″TC) · CPo  (11)
WTC represents the transformation matrix that maps the position of the object's reference point from the camera coordinate system (CPo) to the world coordinate system (WPo):

WTC = WTG · GTG′ · G′TG″ · G″TC  (12)
The displacement and the absolute position that the gantry needs to move by in order to allow the camera to see the object's reference point at the center of its image are G′Po and GPo, respectively:
Displacement of the gantry: G′Po = (G′TG″ · G″TC) · CPo
New position of the gantry: GPo = (GTG′ · G′TG″ · G″TC) · CPo  (13)
Moreover, once we know where the reference point is located in the world coordinate system (WPo), the calibration matrix K = (WTG)⁻¹ provides the position that must be commanded to the gantry to locate the center of the camera image at GPo:
GPo = K · WPo  (14)
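Equation (14) is a matrix inverse applied to a homogeneous world point. The sketch below assumes a particular parameterization of WTG in terms of Sx, Sy, α, β, consistent with the parameters introduced earlier but not taken from the manuscript's elided matrix, with hypothetical numeric values:

```python
import numpy as np

# Assumed parameterization of WTG (hypothetical form and values; the
# manuscript's exact matrix is not reproduced here).
alpha, beta = 0.002, -0.001     # axis misalignment angles (rad)
Sx, Sy = 1.001, 0.999           # axis scale factors
W_T_G = np.array([[Sx * np.cos(alpha), -Sy * np.sin(beta), 0.0],
                  [Sx * np.sin(alpha),  Sy * np.cos(beta), 0.0],
                  [0.0,                 0.0,               1.0]])

# Calibration matrix K = (WTG)^-1, as used in equation (14).
K = np.linalg.inv(W_T_G)

W_P_o = np.array([120.0, 80.0, 1.0])  # reference point in world coordinates
G_P_o = K @ W_P_o                     # position to command to the gantry

# Round trip: commanding G_P_o maps back onto the world-frame point.
assert np.allclose(W_T_G @ G_P_o, W_P_o)
```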
Let's expand GPo = GTC · CPo as follows:
And also, let's expand WPo = WTG · GPo:
For each of the three precision points on the reference plate, substitute o with A, B, or C. Each point is detected on the reference plate: the gantry is positioned above the point, where the center of the camera image is located at GPCØ|@A, GPCØ|@B, or GPCØ|@C, and the relative position of each point at the camera output is obtained as CPA, CPB, or CPC.
Let's put the three equations WPo = WTG · GPo for o = A, B, C together:
We can represent WTG in the following form:
After simplification, the equation becomes:
Subtracting WPA from all columns on both sides yields:
Since
thus:
Once H1, H2, H3 and H4 are obtained, we can compute Sx, Sy, α, and β as follows:
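Since the expressions for H1 through H4 are elided here, the sketch below assumes the estimated 2×2 linear part H has columns Sx·[cos α, sin α]ᵀ and Sy·[−sin β, cos β]ᵀ; under that assumed form, the four parameters follow from column norms and arctangents:

```python
import numpy as np

def recover_params(H):
    """Recover Sx, Sy, alpha, beta from a 2x2 linear part H, assuming
    H = [[Sx*cos(a), -Sy*sin(b)],
         [Sx*sin(a),  Sy*cos(b)]]
    (an assumed form, not the manuscript's elided equations)."""
    Sx = np.hypot(H[0, 0], H[1, 0])       # norm of the first column
    Sy = np.hypot(H[0, 1], H[1, 1])       # norm of the second column
    alpha = np.arctan2(H[1, 0], H[0, 0])  # angle of the first column
    beta = np.arctan2(-H[0, 1], H[1, 1])  # angle of the second column
    return Sx, Sy, alpha, beta

# Round trip with hypothetical ground-truth values.
Sx0, Sy0, a0, b0 = 1.002, 0.998, 0.004, -0.003
H = np.array([[Sx0 * np.cos(a0), -Sy0 * np.sin(b0)],
              [Sx0 * np.sin(a0),  Sy0 * np.cos(b0)]])
Sx, Sy, alpha, beta = recover_params(H)
```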
In the gantry robot, the downward looking camera (DLC) is installed at angle γ1 radians with respect to the X-axis of the gantry system as shown in
From the
where Δx and Δy as shown in
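As an illustrative sketch (the manuscript's two candidate equations are not reproduced here), the camera angle can be estimated from the apparent shift (Δx, Δy) of a fixed feature between two snapshots separated by a pure X-axis gantry move; the numeric readings are hypothetical:

```python
import numpy as np

def camera_angle(dx, dy):
    """Estimate the camera installation angle from the apparent shift
    (dx, dy) of a fixed feature across a pure X-axis gantry move: with a
    perfectly aligned camera dy would be zero, so the residual dy
    encodes the installation angle."""
    return np.arctan2(dy, dx)

# Hypothetical readings: a 15000 um commanded X move shows up in the
# camera frame as a 15000 um shift in x and a 131 um shift in y.
gamma1 = camera_angle(15000.0, 131.0)
print(np.degrees(gamma1))  # close to the 0.5 deg true value assumed below
```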
For an accurate measurement of the camera angle, it is better to perform an error analysis on both equations and select the one with less error.
Let's assume εG is the gantry accuracy along the X-axis, equal to 10 μm; εc is the camera accuracy, equal to 10 μm/pixel; and γt is the true value of γ, equal to 0.5°. Also, a is the linear motion of the gantry system along the X-axis, equal to 15000 μm. The deviations in the X and Y directions due to the camera angle are then:
The question now is: what is the effect of the camera angle measurement error on the accuracy of the camera reading in the second snapshot?
The camera accuracy in the second snapshot becomes:
max{ε′c2}=εc2=εc=21 μm (26)
The main problem in tool calibration is finding the distances δx and δy. To determine the location of the tool head with respect to the camera FOV center, a precisely machined tapered object, called the calibration workpiece, is manufactured for picking and placing and for finding the center of the tool head with respect to the upward looking camera.
To calibrate the tooling system via machine vision, we need to fabricate an object Q such that it can be precisely picked up by the tool. First, we pick up the object from an arbitrary point, then move and place it on a magnetic plate to make sure the part never moves along the X-Y axes while the tool head is releasing it, and then save the current location of the camera center [0 0 1]T in camera coordinates, G[xC01 yC01 1]T.
As shown in
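The offsets δx and δy then follow by differencing two gantry positions. The sketch below uses hypothetical readings and only illustrates the subtraction, not the full pick-place-observe procedure described above:

```python
import numpy as np

# Hypothetical gantry readings (mm): where the camera centers on the
# placed object Q, versus where the gantry was when the tool released Q.
cam_center_pos = np.array([250.120, 180.045])
tool_release_pos = np.array([249.905, 180.230])

# The tool-to-camera offsets are the difference of the two positions.
dx, dy = cam_center_pos - tool_release_pos
```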
From the foregoing, it will be seen that this invention is well adapted to attain all the ends and objects hereinabove set forth, together with other advantages which are obvious and inherent to the method and apparatus. It will be understood that certain features and subcombinations are of utility and may be employed without reference to other features and subcombinations. This is contemplated by and is within the scope of the claims. Since many possible embodiments of the invention may be made without departing from the scope thereof, it is also to be understood that all matters herein set forth or shown in the accompanying drawings are to be interpreted as illustrative and not limiting.
The constructions described above and illustrated in the drawings are presented by way of example only and are not intended to limit the concepts and principles of the present invention. As used herein, the terms “having” and/or “including” and other terms of inclusion are terms indicative of inclusion rather than requirement.
While the invention has been described with reference to preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof to adapt to particular situations without departing from the scope of the invention. Therefore, it is intended that the invention not be limited to the particular embodiments disclosed as the best mode contemplated for carrying out this invention, but that the invention will include all embodiments falling within the scope and spirit of the appended claims.
The present application claims the benefit of U.S. Provisional Patent Application No. 62/805,563, filed Feb. 14, 2019, which is hereby incorporated by reference in its entirety.