Cameras are heavily leveraged in robotic applications. Cameras allow robotic systems to locate objects of interest and provide the robot with accurate coordinates of those objects, identify suitable grasp points, provide an occupancy map of an area, etc.
Cameras can be mounted on a moving part of the robot or be external and stationary. Cameras may produce data from depth and RGB sensors, enabling a three-dimensional (3D) view of a workspace to be generated. Camera extrinsic calibration (or "calibration" hereinafter) is an important procedure to ensure that the relayed poses of 3D points are accurate. It is important not only for the transformation describing the pose of the camera with respect to the robot to be accurate, but also for the poses of two or more different camera sensors with respect to each other to be accurate.
Typically, calibration is a cumbersome process that requires fiducial markers to be installed on the robot and the robot to be moved by an operator to various poses within the camera field of view to capture pictures. The accuracy of the results is difficult to estimate, and much uncertainty about the success of the process remains. Typically, technicians work with a set of fixed joint angles for the data collection step. This raises safety challenges, since the robot being calibrated needs to have its ambient space clear of any obstacles. Furthermore, this process is not scalable across different product lines, which have different end effectors and work zones. Work zone variability is a challenge as well, since calibration accuracy typically should be evaluated within the specific workspace and region of interest in which the robot will operate, and that region may vary with the nature of the work the robot will be used to perform within the workspace.
Various embodiments of the invention are disclosed in the following detailed description and the accompanying drawings.
The invention can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor. In this specification, these implementations, or any other form that the invention may take, may be referred to as techniques. In general, the order of the steps of disclosed processes may be altered within the scope of the invention. Unless stated otherwise, a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task. As used herein, the term ‘processor’ refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.
A detailed description of one or more embodiments of the invention is provided below along with accompanying figures that illustrate the principles of the invention. The invention is described in connection with such embodiments, but the invention is not limited to any embodiment. The scope of the invention is limited only by the claims and the invention encompasses numerous alternatives, modifications and equivalents. Numerous specific details are set forth in the following description in order to provide a thorough understanding of the invention. These details are provided for the purpose of example and the invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in detail so that the invention is not unnecessarily obscured.
An automated camera calibration system is disclosed. In various embodiments, a “button click” camera calibration solution is provided that is applicable for a plurality of robotic applications and/or product lines, e.g., each of a plurality of different types of work a robotic system may be configured to perform, such as line kitting, palletization/depalletization, sortation, truck loading, etc. In various embodiments, the calibration process yields a report that studies the success of the calibration operation and that is displayed on a user interface, such as a computer or other display device.
As shown in
In various embodiments, the user 126 is presented with a graphical user interface that enables the user to select a robot, e.g., robot 104. In response, the user interface displays to the user 126 a variety of options to configure, administer, and/or control the robot 104, including an option to select a camera to be calibrated. Selection of the aforementioned option results in display of a list or set of buttons for selecting a camera from among those that the administration application and/or platform with which the user interface is associated has been configured to associate with the robot selected by the user 126, i.e., cameras 106 and 108 associated with robot 104, in this example.
In various embodiments, selection of a camera to be calibrated results in a call being made to a remote camera calibration service that includes one or more of the following: an identification of the robot (e.g., robot 104, in the above example); an identification of the camera selected by the user (e.g., camera 106, if selected); and data indicating the robotic application for which the robot (e.g., robot 104) is to be used (e.g., palletization/depalletization, etc.). In response, as described more fully below, the calibration service performs, without further action or input from the user 126, a camera calibration process as a service, which is specific at least in part to the robotic application indicated in the call to the service and the attributes of the workspace in which the robot is located.
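By way of illustration only, a minimal sketch of such a service call is shown below, assuming a hypothetical JSON/HTTP interface; the endpoint and the field names robot_id, camera_id, and application are illustrative assumptions rather than part of this disclosure.

```python
import json
import urllib.request

# Illustrative only: the endpoint and field names below are assumptions,
# not part of this disclosure.
calibration_request = {
    "robot_id": "robot-104",         # robot selected via the user interface
    "camera_id": "camera-106",       # camera selected to be calibrated
    "application": "palletization",  # robotic application the robot performs
}

request = urllib.request.Request(
    "https://calibration-service.example.com/calibrate",
    data=json.dumps(calibration_request).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(request) as response:
    calibration_result = json.loads(response.read())
```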
In various embodiments, other calls to the service, e.g., from other users with respect to other robots and/or cameras shown in
In various embodiments, an automated camera calibration system and/or process as disclosed herein may be used to calibrate one or more cameras used by a robotic system to perform a set of tasks associated with a robotic application, including without limitation cameras mounted in fixed locations in the workspace, cameras mounted on poles or other superstructure fixed to a same base or mobile chassis to which the robotic arm or other robot is mounted, and cameras mounted on a movable element of the robot itself, such as the end effector or arm segment.
In various embodiments, techniques disclosed herein may be used to calibrate cameras for use in robotic systems and applications, whether the cameras are mounted in a fixed location in the workspace, a fixed location relative to the robot or a structure on which the robot is mounted, or on the robot itself, as in the examples illustrated by
In the example shown, the calibration process is invoked by user 302 accessing a diagnostic tool in the system user interface (UI) 304. The user clicks on the robot they wish to calibrate, e.g., robot 306, and selects a camera to calibrate, e.g., camera 308. A request is sent from the UI 304 to the robot application, which kicks off a series of steps to set the robot up for calibration and provide the calibration library with the appropriate parameters. In various embodiments, as in the example shown in
In various embodiments, the architecture of the backend implementation leverages two components: (1) an application-specific component comprising a control computer, described below, configured to control the robot indicated in the service call to perform the specific robotic application (line kitting, palletization/depalletization, sortation, truck loading/unloading, etc.) that the robotic system will perform, and (2) a calibration library 318 that is generalized across robotic applications. In various embodiments, the UI component 304 of the robotic system implements the diagnostic API, which interacts with the calibration service 312.
Calibration service 312 uses the identity of the robot, as included in service call 310, to retrieve configuration data associated with the robot, enabling the calibration library 318 to determine relevant attributes of and/or context information for the robot. The calibration library 318 uses the information to connect with and interact with a control computer 320 configured to control the selected robot, i.e., robot 306 in the example shown. Control computer 320 includes a control stack 322, which includes a robotic application layer specific to the robotic application the robot 306 is to be used to perform, a motion planner component, and other layers used to control the robot 306 to perform the robotic application. The control computer further includes a vision stack 324 configured to interact with and control the cameras, e.g., camera 308, to be calibrated.
In various embodiments, the calibration service 312 interacts with the control computer 320 to retrieve the internal state of the system and, using a region of interest (ROI) provided by the user (e.g., via a CAD file or other configuration file), causes the robot to be moved to a safe zone. The calibration service 312 interacts with one or both of the control stack 322 and the vision stack 324 to prune the ROI and determine a set of sample points and poses at which to position and orient the fiducial marker(s) to be used for calibration within the field of view of the camera, i.e., camera 308 in this example. Subsequently, a motion module of the control stack 322 samples points in the pruned ROI and computes trajectories. In some embodiments, this step uses forward simulation, a tool for studying desired motions in simulation and heuristically planning trajectories, checking for possible collisions (with the robot itself or ambient objects) and further pruning the number of points (e.g., to ensure collisions are avoided). For each of the points in the sample set, the system (e.g., one or both of calibration service 312 and control computer 320) moves the robot to the point, captures an image from the camera, and records the corresponding robot pose. After this process, the calibration library 318 is called with the set of data (points, robot poses, and images) as input.
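A minimal sketch of the ROI sampling and pruning step described above is provided below; the helper functions in_camera_fov and is_collision_free are hypothetical stand-ins for the vision stack's visibility check and the forward-simulation collision check, respectively, and are not part of any specific disclosed implementation.

```python
import numpy as np

def sample_calibration_points(roi_min, roi_max, num_samples, in_camera_fov, is_collision_free):
    """Sample candidate fiducial positions within the region of interest (ROI),
    keep only those visible to the camera, then prune any whose simulated
    trajectory would result in a collision."""
    rng = np.random.default_rng(seed=0)
    candidates = rng.uniform(roi_min, roi_max, size=(num_samples, 3))  # x, y, z samples in the ROI

    # Prune the ROI: keep only points within the camera field of view.
    visible = [p for p in candidates if in_camera_fov(p)]

    # Forward-simulate a motion to each remaining point and discard points
    # whose trajectories would collide with the robot itself or ambient objects.
    return [p for p in visible if is_collision_free(p)]

# Example with trivial stand-in checks (real implementations would query the
# vision stack and a forward-simulation/motion-planning module).
points = sample_calibration_points(
    roi_min=np.array([0.0, -0.5, 0.2]),
    roi_max=np.array([1.0, 0.5, 1.0]),
    num_samples=100,
    in_camera_fov=lambda p: p[2] > 0.3,
    is_collision_free=lambda p: True,
)
```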
In various embodiments, the calibration library 318 performs computations based on the received input data (i.e., points, robot poses and image) and computes one or more transformation matrices to be used to transform image/depth data generated by the camera, in a camera from or reference, to a frame of reference used by the robotic system to control the robot to perform the specified robotic application in the associated workspace. The calibration service 312 returns a calibration result 326 to the user interface 304. In addition, in various embodiments, the calibration service 312 generates and returns to the user interface 304 one or more visualizations reflecting the results of the calibration process. For example, the user interface 304 may display the overall success or failure of the calibration process, and a provide a visual representation of the errors that remain after error minimization that was performed as part of the calibration process.
At 506, the robot is moved or caused to be moved through a sequence of calibration positions/poses, and at each position/pose one or more images to be used to perform calibration computations are generated. (In some embodiments, step 506 may be performed by a robotic application, e.g., based on position, pose, and/or trajectory information provided by the calibration service.) At 508, the sampled points (positions), poses, and corresponding images are processed to generate calibration results. In some embodiments, a ray-tracing technique is used to perform the calibration. At 510, calibration results are returned.
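Continuing the illustrative sketch, the data collection of step 506 might be expressed as follows; the functions move_robot_to, capture_image, and get_robot_pose are assumed placeholders for whatever robot control and vision stack calls a given implementation provides.

```python
def collect_calibration_data(points, move_robot_to, capture_image, get_robot_pose):
    """Step 506 sketch: move the robot through the sampled calibration
    positions/poses and, at each one, capture an image and record the
    corresponding robot pose for use in the calibration computation."""
    samples = []
    for point in points:
        move_robot_to(point)            # move the robot to the next sampled point/pose
        image = capture_image()         # image of the fiducial marker(s) from the camera
        robot_pose = get_robot_pose()   # robot pose reported by the control stack
        samples.append((point, robot_pose, image))
    return samples                      # input data set for step 508 (calibration computation)
```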
In various embodiments, the calibration process produces as output a 4×4 or other transformation matrix to be applied to image/depth data from the camera to produce calibration-adjusted values to be used by the robotic system to control the robot to perform assigned tasks.
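As a generic illustration of how such a 4×4 homogeneous transformation matrix is applied to a camera-frame point (a standard homogeneous-coordinates sketch, not the specific implementation disclosed herein):

```python
import numpy as np

def camera_point_to_robot_frame(point_camera, camera_to_robot):
    """Apply a 4x4 homogeneous transformation matrix (the calibration output)
    to a 3D point expressed in the camera frame, yielding the corresponding
    point in the robot/workspace frame of reference."""
    homogeneous = np.append(point_camera, 1.0)  # [x, y, z, 1]
    return (camera_to_robot @ homogeneous)[:3]

# Example: identity rotation with a 0.5 m translation along the x axis.
camera_to_robot = np.array([
    [1.0, 0.0, 0.0, 0.5],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])
print(camera_point_to_robot_frame(np.array([0.1, 0.2, 1.0]), camera_to_robot))  # -> [0.6 0.2 1. ]
```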
In various embodiments, a validation process may be performed to validate the calibration results. The validation process may follow the same steps as in the process of
In various embodiments, a set of AprilTags or other optical fiducial markers are used to perform calibration as disclosed herein. The fiducial markers may be mounted on the robot, e.g., on the robotic end effector and/or in the workspace.
In some embodiments, to achieve a successful calibration, point correspondences are established between the points detected by the AprilTag library and the corresponding points obtained using the forward kinematics of the robot. The latter can be obtained by finding the position of the points in the robot frame using the "marker2robot" transform, which comes from the recorded robot poses. The calibration process uses seed values for "marker2robot" and "robot2camera" from the CAD drawings or other configuration information. These values are jointly optimized as part of the process using applicable cost functions.
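A simplified sketch of such a joint optimization is shown below; it assumes a least-squares cost over 3D point residuals and uses SciPy's generic solver, which is an illustrative choice rather than the specific cost functions or solver of any particular embodiment. The seed values would normally come from the CAD-derived "marker2robot" and "robot2camera" estimates.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def to_matrix(rvec, tvec):
    """Build a 4x4 homogeneous transform from a rotation vector and a translation."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(rvec).as_matrix()
    T[:3, 3] = tvec
    return T

def residuals(params, marker_point, robot_poses, detections):
    """For each recorded sample, predict where the marker point should appear
    in the camera frame given the current "marker2robot" and "robot2camera"
    estimates, and compare against the AprilTag detection (camera frame)."""
    marker2robot = to_matrix(params[0:3], params[3:6])    # marker pose relative to the end effector (assumed)
    robot2camera = to_matrix(params[6:9], params[9:12])   # robot base frame -> camera frame (assumed)
    errors = []
    for T_base_flange, observed in zip(robot_poses, detections):
        predicted = robot2camera @ T_base_flange @ marker2robot @ np.append(marker_point, 1.0)
        errors.extend(predicted[:3] - observed)
    return np.asarray(errors)

# Synthetic example data for illustration; a real calibration uses many
# recorded samples, and the seed comes from CAD-derived transform estimates.
marker_point = np.zeros(3)                                   # marker center in the marker frame
translations = ([0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0], [0.0, 0.0, 0.1])
robot_poses = [to_matrix(np.zeros(3), t) for t in translations]
detections = [np.array(t) + np.array([0.0, 0.0, 1.0]) for t in translations]
seed = np.zeros(12)
fit = least_squares(residuals, seed, args=(marker_point, robot_poses, detections))
marker2robot_opt, robot2camera_opt = fit.x[:6], fit.x[6:]
```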
In various embodiments, a robot and/or end effector having fiducial markers mounted thereon, as in the example shown in
In various embodiments, the end effector shown in
In various embodiments, a calibration process as disclosed herein may generate data usable to present a visual representation of the error (difference) between the position of a location on a fiducial marker as perceived by the camera and the position of the corresponding point on the marker based on the robot's position and pose and the associated "marker2robot" transform, e.g., from stored configuration data. An example 640 of such a visualization is shown in
In a ray-based approach, after establishing point correspondences as described above, an error term is used that minimizes the distance "d" between the ray 702 (emanating from the image plane 704 and optical center 706 of the camera) and the point 708, i.e., the point in the world frame represented in the camera frame. An advantage of expressing the error term as a distance between points in the camera frame is that the distance is measured in millimeters instead of pixels. This error term is invariant with respect to the distance from the camera, in various embodiments.
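A minimal sketch of such a point-to-ray distance computation, assuming a standard pinhole camera model with hypothetical intrinsic parameters fx, fy, cx, and cy, follows:

```python
import numpy as np

def point_to_ray_distance_mm(point_camera_m, pixel, fx, fy, cx, cy):
    """Distance in millimeters between a 3D point expressed in the camera
    frame (in meters) and the ray that passes through the camera optical
    center and the detected pixel (generic pinhole-camera sketch)."""
    # Back-project the pixel into a unit ray direction in the camera frame.
    direction = np.array([(pixel[0] - cx) / fx, (pixel[1] - cy) / fy, 1.0])
    direction /= np.linalg.norm(direction)
    # Perpendicular distance from the point to the ray (ray origin = optical center).
    perpendicular = point_camera_m - np.dot(point_camera_m, direction) * direction
    return 1000.0 * np.linalg.norm(perpendicular)

# Example: a point 1 m in front of the camera, detected at the principal point,
# yields a distance of 10 mm.
print(point_to_ray_distance_mm(np.array([0.01, 0.0, 1.0]),
                               pixel=(640.0, 360.0), fx=900.0, fy=900.0, cx=640.0, cy=360.0))
```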
In some embodiments, the validation process is performed to validate camera calibration results. The validation process provides the following artifacts in a detailed report:
In some embodiments, techniques disclosed herein are applied to calibrate a camera that is mounted on the robot, such as at or near the robot “hand” or other end effector. In various embodiments, calibration of a camera mounted on the robot may include one or more of the following:
In some embodiments, techniques disclosed herein enable camera calibration to be performed automatically, e.g., by an operator selecting the robot and camera and then pushing a single button to calibrate. Automatic pruning of the ROI, sampling of points, and further pruning of the sample set, as disclosed herein, enable calibration to be performed without skilled technicians being required to select the sample points, move the robot to the respective points, etc.
In various embodiments, techniques disclosed herein enable an operator who may not be an expert in robotic system configuration and/or camera calibration to easily calibrate (or recalibrate) a camera to be used to enable a given robot or set of robots to perform tasks associated with a robotic application.
Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, the invention is not limited to the details provided. There are many alternative ways of implementing the invention. The disclosed embodiments are illustrative and not restrictive.
This application claims priority to U.S. Provisional Patent Application No. 63/418,380 entitled CAMERA CALIBRATION PROCESS AND INTERFACE filed Oct. 21, 2022, which is incorporated herein by reference for all purposes.