The present invention relates to, for example, a camera pose calibration device for a camera that is installed on and used with a robot, a camera pose calibration method, and a robot.
In a system using a robot installed with a camera, it is necessary to calibrate a relative pose of the camera and the robot.
Regarding a camera pose calibration method, PTL 1 describes “In a visual sensor correction method for a robot arm of correcting a mounted-state value of a visual sensor on a tip of the robot arm by recognizing a position of work by the visual sensor mounted on the tip of the robot arm, a visual sensor correction method for a robot arm according to the present invention includes: a movement amount determination processing step of determining a movement amount of the tip of the robot arm due to movement of the robot arm in reference coordinates; an image processing step of obtaining a processed image before and after movement by performing image processing on an image of the work acquired by the visual sensor before and after the movement; a position change amount extraction processing step of obtaining a position change amount of the work before and after the movement in association with a temporary parameter, by applying the temporary parameter as an estimated value of the mounted-state value of the visual sensor to the processed image before and after the movement and by converting the processed image before and after the movement to the reference coordinates; a state difference calculation processing step of calculating a difference between the movement amount and the position change amount as a state difference; and a state value extraction processing step of obtaining a plurality of state differences by changing the temporary parameter and repeating the position change amount extraction processing step a plurality of times, and extracting the temporary parameter corresponding to a smallest state difference among a plurality of the state differences, as a mounted-state value of the visual sensor”.
In the invention described in PTL 1, by using an image of a surrounding environment of the robot captured by the camera installed on the robot, it is possible to calibrate a relative pose of the camera and the robot without requiring manual work such as setting a marker.
However, the image used for the calibration is acquired by causing the robot to execute an operation dedicated to calibration, and the task that is set in the robot is interrupted while this dedicated operation is performed. Further, the dedicated calibration operation takes time to execute. That is, in this conventional method, the calibration operation and the task operation are executed sequentially in time series, as two separate steps, with the calibration operation performed before the task operation.
From the above, an object of the present invention is to provide: a camera pose calibration device, a camera pose calibration method, and a robot, capable of calibrating a relative pose of a camera and a robot without interrupting a task that is set in the robot.
From the above, the present invention includes “a camera pose calibration device including a task operation planning unit that plans an operation for executing a task that is set in a robot, from an image captured by a camera installed on the robot and a pose of the robot; a calibration operation planning unit that plans an operation necessary for calibrating a relative pose of the camera and the robot from the image and the pose of the robot; and an integration operation planning unit that plans an operation by integrating a task operation plan planned by the task operation planning unit and a calibration operation plan planned by the calibration operation planning unit”. Further, the camera pose calibration device operates the robot.
Further, the present invention includes a “robot operated by the camera pose calibration device”.
Further, the present invention includes “a camera pose calibration method including: obtaining a task operation plan in which an operation for executing a task that is set in a robot is planned from an image captured by a camera installed on the robot and a pose of the robot, a calibration operation plan in which an operation necessary for calibrating a relative pose of the camera and the robot is planned from the image and the pose of the robot, an integration operation plan that integrates the task operation plan and the calibration operation plan, and a status and an objective of the task; and calculating a priority of the calibration operation plan with respect to the task operation plan, based on the status and an objective of the task, selecting an operation plan based on the priority, and operating the robot in accordance with the selected operation plan”.
According to the present invention, it is possible to realize a camera pose calibration device capable of calibrating a relative pose of a camera and a robot without interrupting a task that is set in the robot.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
Hereinafter, a camera pose calibration device according to a first embodiment of the present invention will be described with reference to
First, an example of a typical robot to which the camera pose calibration device of the present invention can be applied will be described with reference to
The picking robot 200 is a robot that picks up works 221 in bulk on a work table 220 one by one and arranges the works 221 in a predetermined place. The picking robot 200 includes a plurality of joints 212 and links 214, and an end effector 215, and a camera 201 serving as a camera device is installed at an appropriate position on the robot 200 by a camera mounting jig 213. Further, each joint 212 of the robot 200 is appropriately provided with a pose sensor 202 that is a sensor to measure an angle of each joint 212.
The picking robot 200 recognizes the work 221 on the work table 220 by the camera 201, and controls an angle of the joint 212 to move the end effector 215 to a position where the work 221 can be lifted. The end effector 215 is, for example, a suction pad or a robot hand, and lifts the work 221 by suctioning or gripping. Note that a configuration such as the number of joints 212 and links 214 of the picking robot 200 may be freely adopted. Further, as the camera 201, an RGB-D camera capable of acquiring a distance in addition to an image may be used.
When the end effector 215 is brought close to the work 221, the picking robot 200 needs to check a pose of the end effector 215. The picking robot 200 uses the pose sensor 202 that is a sensor to measure an angle of each joint 212, to check the pose of the end effector 215 that is important when lifting the work 221, for example. Note that, by measuring the angle of each joint 212, the pose of the end effector 215 can be obtained by solving forward kinematics by a known method, based on a geometric model of the picking robot 200.
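As a minimal illustration of this forward-kinematics step, the sketch below computes the end-effector pose of a planar serial arm from measured joint angles. The two-link dimensions and the planar simplification are assumptions for illustration, not taken from the embodiment:

```python
import numpy as np

def forward_kinematics(joint_angles, link_lengths):
    """Pose (x, y, heading) of the end effector of a planar serial arm.

    joint_angles: angle of each joint 212 in radians (from the pose sensor 202).
    link_lengths: length of each link 214, in the same order as the joints.
    """
    x = y = heading = 0.0
    for theta, length in zip(joint_angles, link_lengths):
        heading += theta               # each joint rotates the remaining chain
        x += length * np.cos(heading)  # advance along the current link
        y += length * np.sin(heading)
    return x, y, heading

# Two-link arm with both joints at 45 degrees and links of 0.5 m and 0.3 m.
print(forward_kinematics([np.pi / 4, np.pi / 4], [0.5, 0.3]))
```

A real manipulator works in six degrees of freedom, but the same chaining of per-joint transforms applies.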
In the robot 200, in order to move the robot 200 to a place where it is possible to execute a task (in the example of the picking robot 200 in
However, the relative pose between the camera 201 and the robot 200 may be changed by vibrations during operation of the robot 200, or by collision of the camera 201 or the robot 200 with an object existing in the surrounding environment during operation, which may cause an error with respect to the initially set value. Since an error in the relative pose between the camera 201 and the robot 200 translates into an error in the pose of the robot 200, a large error will cause a task failure.
Therefore, a control device of the robot 200 includes a camera pose calibration device that has a function of calibrating the relative pose of the camera 201 and the robot 200. The camera pose calibration device plans an operation of the robot 200 that enables calibration of the pose of the camera without interrupting the task, by planning an operation that integrates an operation plan necessary for task execution and an operation plan necessary for camera pose calibration before the error in the relative pose of the camera 201 and the robot 200 becomes large enough to fail the task.
In the robot system of
In the robot system of
Hereinafter, the robot system of
Note that the pose sensor 202 may be a sensor that directly measures a pose of the robot 200 or a sensor that measures a state of the robot 200. In a case of using a sensor that measures a state of the robot 200, the robot 200 estimates a pose of the robot 200 from the state of the robot 200 and outputs it.
Next, the camera pose estimation unit 300 in the robot control device 10 of
A problem of estimating the relative pose of the camera 201 and the robot 200 from the image and the pose of the robot 200 is called Hand-Eye calibration. The camera pose estimation unit 300 according to the first embodiment of the present invention preferably uses a known Hand-Eye calibration method. However, for the present invention, a method that does not use a marker is suitable, such as J. Heller, M. Havlena, A. Sugimoto, T. Pajdla, Structure-from-motion based hand-eye calibration using L∞ minimization, IEEE Conf. on Computer Vision and Pattern Recognition, pp. 3497-3503 (2011). The robot 200 executes a task by using the camera pose estimated by the camera pose estimation unit 300.
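The marker-free method of Heller et al. is beyond a short sketch, but the underlying Hand-Eye problem (the classical AX = XB formulation) can be illustrated with OpenCV's built-in solver on synthetic data; everything below (the random ground truth, the number of poses) is an assumption for illustration:

```python
import cv2
import numpy as np

rng = np.random.default_rng(0)

def rand_rotation():
    # Random rotation matrix via Rodrigues of a random axis-angle vector.
    return cv2.Rodrigues(rng.normal(size=(3, 1)))[0]

# Hypothetical ground truth: camera pose in the gripper frame (gripper_T_cam)
# and a fixed scene frame in the robot base frame (base_T_target).
R_gc, t_gc = rand_rotation(), rng.normal(size=(3, 1))
R_bt, t_bt = rand_rotation(), rng.normal(size=(3, 1))

R_gripper2base, t_gripper2base, R_target2cam, t_target2cam = [], [], [], []
for _ in range(10):
    # Robot pose from the pose sensor 202 (base_T_gripper).
    R_bg, t_bg = rand_rotation(), rng.normal(size=(3, 1))
    # Camera observation implied by the geometry:
    # cam_T_target = (base_T_gripper * gripper_T_cam)^-1 * base_T_target.
    R_bc, t_bc = R_bg @ R_gc, R_bg @ t_gc + t_bg
    R_gripper2base.append(R_bg); t_gripper2base.append(t_bg)
    R_target2cam.append(R_bc.T @ R_bt); t_target2cam.append(R_bc.T @ (t_bt - t_bc))

# Solve AX = XB for the camera pose relative to the gripper (Tsai-Lenz default).
R_est, t_est = cv2.calibrateHandEye(R_gripper2base, t_gripper2base,
                                    R_target2cam, t_target2cam)
print(np.allclose(R_est, R_gc, atol=1e-6), np.allclose(t_est, t_gc, atol=1e-6))
```

In the embodiment, the target-to-camera motions would instead come from structure from motion on images of the surrounding environment, which is what makes the method marker-free.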
The task status/objective acquisition unit 310 acquires a status and an objective of a task that is set in the robot 200. Here, as will be described later in detail, the task is taking out and arranging the works 221 in a predetermined place in the example of the picking robot 200 in
The camera pose calibration device 100 plans an operation of the robot 200 from the image captured by the camera 201 installed on the robot 200, the pose of the robot acquired by the pose sensor 202 installed on the robot 200, and the task status and objective acquired by the task status/objective acquisition unit 310. The robot 200 executes the operation planned by the camera pose calibration device 100. Therefore, the robot includes an actuator (not shown) that drives joints and the like.
As a result, the camera pose calibration device 100 plans an operation of the robot 200 that enables calibration of the pose of the camera without interrupting the task, by planning an operation that integrates an operation plan necessary for task execution and an operation plan necessary for camera pose calibration before an error in the relative pose of the camera 201 and the robot 200 becomes large enough to fail the task.
The task operation planning unit 101 creates a task operation plan L1 that is an operation for executing a task that is set in the robot, from an image and pose data of the robot. The calibration operation planning unit 102 creates a calibration operation plan L2 that is an operation necessary for calibrating a camera pose, from the image and the pose of the robot. The integration operation planning unit 103 creates an operation plan L by integrating the task operation plan L1 planned by the task operation planning unit 101 and the calibration operation plan L2 planned by the calibration operation planning unit 102, based on the status and the objective of the task. The integration operation planning unit 103 includes the calibration priority calculation unit 104. The calibration priority calculation unit 104 calculates a priority of the calibration operation plan L2 with respect to the task operation plan L1.
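The division of labor among the units can be pictured with a minimal skeleton (all class, method, and field names here are illustrative, not from the embodiment):

```python
from dataclasses import dataclass, field

@dataclass
class OperationPlan:
    poses: list = field(default_factory=list)  # robot poses to pass through
    purpose: str = "task"                      # "task", "calibration", or "integrated"

class CameraPoseCalibrationDevice:
    """Skeleton mirroring units 101-104; method bodies are placeholders."""

    def plan_task_operation(self, image, robot_pose):         # unit 101 -> plan L1
        return OperationPlan(purpose="task")

    def plan_calibration_operation(self, image, robot_pose):  # unit 102 -> plan L2
        return OperationPlan(purpose="calibration")

    def calc_calibration_priority(self, status, objective):   # unit 104
        return {"output_type": "integrated", "usable_time": 0.0}

    def plan(self, image, robot_pose, status, objective):     # unit 103 -> plan L
        l1 = self.plan_task_operation(image, robot_pose)
        l2 = self.plan_calibration_operation(image, robot_pose)
        priority = self.calc_calibration_priority(status, objective)
        if priority["output_type"] == "calibration":
            return l2
        return OperationPlan(poses=l2.poses + l1.poses, purpose="integrated")
```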
A function of each unit of the camera pose calibration device 100 shown in
The task operation planning unit 101 creates the task operation plan L1 for executing a task that is set in the robot 200, from an image and pose data of the robot 200. In the picking robot 200, from an image of the work 221 captured by the camera 201 and pose data of the picking robot 200, the task operation planning unit 101 creates the task operation plan L1 that is an operation for lifting and arranging the works 221 at a predetermined place. For example, the task operation planning unit 101 calculates the pose of the picking robot 200 capable of lifting the work 221, by estimating a pose of the work 221 from an image by a known pattern matching method or machine learning method. When a plurality of works 221 are recognized, calculation is performed for a pose corresponding to each work 221, of the picking robot 200 capable of lifting the works 221.
Note that
The calibration operation planning unit 102 plans an operation (the calibration operation plan L2) necessary for calibrating a camera pose, from an image and a pose of the robot 200. Note that, when the operation necessary for calibrating the camera pose is illustrated in
The integration operation planning unit 103 plans an operation (the operation plan L) by integrating the task operation plan L1 planned by the task operation planning unit 101 and the calibration operation plan L2 planned by the calibration operation planning unit 102, based on the status and the objective of the task. Details of the integration operation planning process will be described later with reference to
Note that connection between the robot 200 and the camera pose calibration device 100 may be wired connection such as USB or Ethernet (registered trademark) or wireless connection via a wireless network. Further, the camera pose calibration device 100 may be provided in the robot 200, or may be provided in a PC or a server connected to the robot 200.
Next, contents of processing in the calibration operation planning unit 102 will be specifically described with reference to
The calibration operation planning unit 102 plans the calibration operation plan L2 that is an operation necessary for calibrating a camera pose, from an image and a pose of the robot 200. The calibration operation planning unit 102 includes an acquired operation database DB1 that stores acquired operation data D1 and a calibration operation database DB2 that stores calibration operation data D2. Further, the calibration operation planning unit 102 outputs an operation (the calibration operation plan L2) necessary for calibration as a calibration operation plan table TB1.
First, formation of the acquired operation database DB1 will be described. The acquired operation database DB1 is populated by acquiring images at various poses, at appropriate points during robot operation.
Note that, in the example of
In the acquired operation data D1 in
Therefore, when it is considered that the relative pose of the camera 201 and the robot 200 has changed, the acquired operation data D1 stored in the acquired operation database DB1 is re-formed by deleting all stored images and poses, initializing the database, and acquiring images again at various poses of the robot 200. For example, if the task fails or the camera 201 or the robot 200 comes into contact with an object in the surrounding environment, it is determined that the relative pose of the camera 201 and the robot 200 has changed.
In the picking robot 200, a task failure corresponds to a failure to lift the work 221. When a suction pad is used as the end effector 215, for example, the success or failure of lifting is determined from the pressure in the suction pad. In addition, contact of the camera 201 or the robot 200 with an object in the surrounding environment is determined to have occurred when a change other than the change expected from the task of the robot 200 occurs, for example, in a recognition result of the surrounding environment obtained using images.
In the success or failure of a task for each work target W, “∘” indicates success, “x” indicates failure, and “-” indicates that the task for the corresponding work target W is not attempted. In the picking robot 200, the work target W is the work 221, and the success or failure of the task is the success or failure of lifting the work 221. Hereinafter, an identifier of each row of the calibration operation database DB2 is defined as j, an identifier of a pose included in each row is defined as pj, and an identifier of the work target W is defined as wj. Note that a creating/updating method of the calibration operation database DB2 will be described later.
Specifically explaining the expression in
Similarly, in the pattern of ID=2, it is shown that the task for the work target Wa has been successful, but the task for the work target Wb has failed, and the task for the work target Wj has not been executed yet, when the camera pose estimation unit 300 in
In the pattern of ID=j, it is shown that the task for the work target Wj has been successful, but the task for the work target Wb has failed, and the task for the work target Wa has not been executed yet, when the camera pose estimation unit 300 in
Note that pose data p expressed in
Next, formation of the calibration operation database DB2 will be described. The calibration operation database DB2 is formed by performing a task with use of the estimated relative pose of the camera 201 and the robot 200 and the operation plan by the task operation planning unit 101, determining whether or not the task has been successful, and adding the pose used for calibration and the success or failure of the task to the calibration operation database DB2 when the task for any of the work targets has been successful.
Note that, here, time is required to actually execute the task and determine its success or failure. Therefore, in order to shorten the time for creating the calibration operation database DB2, it is preferable, for example, to first perform calibration by using a sufficient number of pairs of the image and the pose (the acquired operation data D1) stored in the acquired operation database DB1, and to register the result in the calibration operation database DB2 by the above method. Specifically, for example, a six-degree-of-freedom space of the work environment is divided into grids at equal intervals, and data is acquired at a pose corresponding to each grid. Next, some pairs are randomly extracted from the pairs of the image and the pose (the acquired operation data D1) previously used for the calibration, and calibration is performed using only the extracted pairs. The calibration result obtained using all pairs and the calibration result obtained using only the extracted pairs are compared in a manner similar to processing step S540 of the calibration operation planning unit 102 described later. Then, when the two match, the poses included in the extracted pairs and the work lifting results obtained when using all the pairs are added to the calibration operation database DB2. This makes it possible to increase the variety of operations included in the calibration operation database DB2 while reducing the number of trials of actual tasks.
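A hedged sketch of this bootstrap follows; the calibrate() and poses_match() callables stand in for the camera pose estimation unit 300 and the comparison of processing step S540, and the subset sizes are arbitrary:

```python
import random

def bootstrap_calibration_db(acquired_pairs, lift_results, calibrate, poses_match,
                             n_subsets=20, subset_size=5):
    """Grow the calibration operation database DB2 without extra task trials.

    acquired_pairs: (image, pose) pairs from DB1 used for a full calibration.
    lift_results:   success/failure per work target observed with that calibration.
    calibrate:      maps a list of (image, pose) pairs to an estimated camera pose.
    poses_match:    compares two camera poses within preset thresholds (cf. S540).
    """
    reference = calibrate(acquired_pairs)  # result when using all pairs
    db2_rows = []
    for _ in range(n_subsets):
        subset = random.sample(acquired_pairs, subset_size)
        if poses_match(calibrate(subset), reference):
            # The smaller pose set reproduces the full result, so record its poses
            # together with the lifting results observed with the full calibration.
            db2_rows.append({"poses": [pose for _, pose in subset],
                             "task_results": dict(lift_results)})
    return db2_rows
```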
The calibration operation database DB2 is updated when the camera pose estimation unit 300 performs calibration. Specifically, the pose included in the acquired operation database DB1 used by the camera pose estimation unit 300 for the calibration, and the success or failure of the task by using the camera pose estimated by the camera pose estimation unit 300 are added as one row of the calibration operation database DB2.
Note that the success or failure of a task for each work target is the same in the calibration operation database DB2 and the calibration operation plan table TB1. Moreover, in poses of the calibration operation plan table TB1, “-” indicates that the pose is not included. That is, in the example of
In the first processing step S500 in the calibration operation planning unit 102, the calibration operation plan table TB1 is initialized. Specifically, the calibration operation plan table TB1 is created so as to include no information of the pose p and have a blank, and to have the success or failure of a task for each work target W that is the same as that in the calibration operation database DB2. The calibration operation plan table TB1 created at this time has the contents shown in
In processing step S510a, loop processing for each row j of the calibration operation database DB2 is started. The loop processing is repeatedly executed by changing a condition between processing step S510a and processing step S510b.
In processing step S520a, loop processing for each pose pj included in the row j of the calibration operation database DB2 is started. The loop processing is repeatedly executed by changing a condition between processing step S520a and processing step S520b.
Information of each pose p in the calibration operation database DB2 of
In processing step S530a, loop processing for each row i of the acquired operation database DB1 is started. The loop processing is repeatedly executed by changing a condition between processing step S530a and processing step S530b. This causes pair information (the image information D11 and the pose information D12) of the acquired operation database DB1 of
In processing step S540, the pose pj included in the row j of the calibration operation database DB2 is compared with a pose included in a row i of the acquired operation database DB1. The loop processing is continued when the poses do not match. When the poses match, the process leaves the loop processing for each row i of the acquired operation database DB1, and proceeds to the loop processing for each pose pj included in the row j of the calibration operation database DB2.
In the example above, first, a process of comparing the pose information (0, 0, 0, 0, 0, 0) of the pose p1 and ID=1 in
In matching determination of poses, for example, when a difference between two positions is within a preset threshold value and a difference between two attitudes is within a preset threshold value, it is determined that the two poses match.
In processing step S550, the pose pj is added to the row j of the calibration operation plan table TB1. The calibration operation plan table TB1 shown in
Similarly, the acquired operation database DB1 of FIG. 4 is searched for the pose p1 (0, 0, 0, 10, 0, −5) and the pose p2 (−10, −5, 5, 0, 10, 10) of ID=2; since matching data exists, nothing is described in the row of ID=2 of the calibration operation plan table TB1. Further, the pose pj (0, 5, 0, 20, −20, 5), which does not exist in the acquired operation database DB1, is described in the row of ID=1 in the calibration operation plan table TB1.
All the pose information p extracted in the calibration operation plan table TB1 in
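Gathering processing steps S500 to S550 into one place, the following sketch builds the table; the position/attitude thresholds and the dictionary layout of the databases are assumptions:

```python
import numpy as np

def poses_match(p, q, pos_tol=1.0, att_tol=1.0):
    """Processing step S540: position and attitude each within a preset threshold."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return (np.all(np.abs(p[:3] - q[:3]) <= pos_tol) and
            np.all(np.abs(p[3:] - q[3:]) <= att_tol))

def build_calibration_plan_table(db1_poses, db2_rows):
    """Build the calibration operation plan table TB1 from DB1 and DB2.

    db1_poses: poses already acquired (pose information D12 of DB1).
    db2_rows:  rows of DB2, each with its poses and task results.
    """
    tb1 = []
    for row in db2_rows:                       # S510: loop over each row j of DB2
        missing = []
        for pj in row["poses"]:                # S520: loop over each pose pj
            # S530/S540: search DB1 for a matching pose; if none is found,
            # S550: the pose must still be visited, so it enters TB1.
            if not any(poses_match(pj, pi) for pi in db1_poses):
                missing.append(pj)
        tb1.append({"poses": missing, "task_results": row["task_results"]})
    return tb1

# Worked example from the text: (0, 5, 0, 20, -20, 5) is absent from DB1 and
# therefore appears in TB1, while the matching pose (0, 0, 0, 0, 0, 0) does not.
db1 = [(0, 0, 0, 0, 0, 0), (0, 0, 0, 10, 0, -5), (-10, -5, 5, 0, 10, 10)]
db2 = [{"poses": [(0, 0, 0, 0, 0, 0), (0, 5, 0, 20, -20, 5)],
        "task_results": {"Wa": "o"}}]
print(build_calibration_plan_table(db1, db2))
```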
Note that the calibration operation data D2 stored in the calibration operation database DB2 is created when the camera pose calibration device 100 is first used, and then is updated each time the camera pose is calibrated.
Next, contents of processing in the integration operation planning unit 103 will be described. The integration operation planning unit 103 creates the operation plan L by integrating the task operation plan L1 planned by the task operation planning unit 101 and the calibration operation plan L2 planned by the calibration operation planning unit 102, on the basis of a calibration priority calculated by the calibration priority calculation unit 104 from the status and the objective of the task.
Note that, to understand the contents of processing in the integration operation planning unit 103, it helps to first understand how the calibration priority calculation unit 104 calculates the calibration priority. Therefore, the calibration priority calculation unit 104 will be described first.
With reference to
The status and the objective of the task are managed by a task status/objective table TB2 provided by the task status/objective acquisition unit 310.
According to the example of the task status/objective table TB2 in
The task status/objective table TB2 is updated by the task status/objective acquisition unit 310. For example, in the picking robot 200, the management number D31 and the target end time D32 of the work target, which are the task objective, are inputted by an operator or a production management system.
The achievement level D33 and the number of failures D35, which are the task status, are updated based on a result of a picking operation by the picking robot 200. The scheduled end time D34 is calculated, for example, from a past operation time and an achievement level. The achievement level D33 is calculated from a total number of works and a number of remaining works. Further, for the scheduled end time D34, the time required for the operation may be calculated from the task operation plan L1 outputted by the task operation planning unit 101.
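For instance, the two derived fields might be computed as follows (a sketch; the function names and the example figures are illustrative):

```python
from datetime import datetime, timedelta

def achievement_level(total_works, remaining_works):
    """Achievement level D33 from the total and remaining work counts."""
    return (total_works - remaining_works) / total_works

def scheduled_end_time(now, remaining_works, avg_time_per_work):
    """Scheduled end time D34 extrapolated from past operation times."""
    return now + remaining_works * avg_time_per_work

# Example: 100 works, 40 remaining, 30 seconds per pick on average.
print(achievement_level(100, 40))  # 0.6
print(scheduled_end_time(datetime(2024, 1, 1, 9, 0), 40, timedelta(seconds=30)))
```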
The calibration priority calculation unit 104 calculates, for example, an output operation type and a calibration usable time as the priority of the calibration operation plan L2 with respect to the task operation plan L1. Here, the output operation type is a type of an operation plan outputted by the integration operation planning unit 103, and is the calibration operation plan L2 or the integration operation plan L. Note that the integration operation plan L is a plan for executing the task operation plan L1 and the calibration operation plan L2. The calibration usable time is a time that can be spent for the calibration operation.
In processing step S610, when the number of failures D35 for the task that is currently set in the robot 200 is equal to or greater than a preset threshold value, the process proceeds to processing step S620, otherwise the process proceeds to processing step S630.
In processing step S620, the output operation type is set to the calibration operation plan L2, and the calibration usable time is set to the work margin time. As a result, it is determined that the task cannot be continued without performing the calibration work, and setting is made so as to perform the calibration operation by using the work margin time. In the above example, setting is made so as to perform the calibration operation within five minutes.
In processing step S630, when the work margin time is greater than 0, the process proceeds to processing step S640, otherwise the process proceeds to processing step S650.
In processing step S640, the output operation type is set to the integration operation plan L, and the calibration usable time is set to the work margin time. This makes setting so as to use the work margin time to perform the integration operation while continuing the task. In the above example, setting is made so as to perform the integration operation by the calibration operation and the task operation so that the task is completed by 10:00.
In processing step S650, when the task is continued without performing the calibration work, an estimated failure time, which is the time estimated to be lost to operations that fail the task, is calculated. For example, the number of future task failures is estimated from the achievement level and the number of failures for the task that is currently set in the robot 200. Finally, the estimated failure time is calculated from that number of failures and either the average time taken for one operation (calculated from past results) or the average operation time when the task fails.
In processing step S660, the output operation type is set to the integration operation plan L, and the calibration usable time is set to the estimated failure time. As a result, even when the work margin time is 0, setting is made so as to perform the integration operation that can be executed within a shorter time than the time to be lost due to the task failure, when the task is continued without performing the calibration work.
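Putting processing steps S610 to S660 together, the branch structure can be sketched as below (the threshold value and the way the inputs are obtained are left to the surrounding text):

```python
def calc_calibration_priority(num_failures, failure_threshold,
                              work_margin_time, estimated_failure_time):
    """Return (output operation type, calibration usable time).

    num_failures:           failures D35 for the task currently set in the robot.
    failure_threshold:      preset threshold of processing step S610.
    work_margin_time:       target end time minus scheduled end time, in seconds.
    estimated_failure_time: time expected to be lost to future task failures if
                            the task continues without calibration (step S650).
    """
    if num_failures >= failure_threshold:        # S610 -> S620
        return "calibration", work_margin_time   # task cannot continue; calibrate now
    if work_margin_time > 0:                     # S630 -> S640
        return "integrated", work_margin_time    # calibrate within the slack time
    return "integrated", estimated_failure_time  # S660: spend less than would be lost

print(calc_calibration_priority(num_failures=1, failure_threshold=3,
                                work_margin_time=300, estimated_failure_time=120))
```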
In processing step S710, for each calibration operation plan included in the calibration operation plan table TB1, the integration operation plan L and a calibration operation time, which is the additional time required by the integration operation plan L compared with the task operation plan L1, are calculated. Details of the processing will be described later.
In processing step S720, an integration operation outputted by the integration operation planning unit 103 is selected based on the calibration operation plan table TB1 and based on the integration operation and the calibration operation time corresponding to each calibration operation included in the calibration operation plan table TB1.
For example, first, integration operations having a calibration operation time shorter than the calibration usable time are set as selection candidates. Next, by referring to the calibration operation plan table TB1 and the task status/objective table TB2, the integration operation that can succeed in the task for the work target W of the task that is furthest ahead is outputted from among the selection candidates. Note that, when there are a plurality of selection candidates that can succeed up to the task that is furthest ahead, the integration operation having the shortest calibration operation time is outputted. Further, when there is no selection candidate, the task operation outputted by the task operation planning unit 101 is outputted instead of the integration operation.
In processing step S730, a calibration operation time to be required for the calibration operation is calculated for each calibration operation plan included in the calibration operation plan table TB1. Details of the processing will be described later.
In processing step S740, a calibration operation to be outputted by the integration operation planning unit 103 is selected, based on the calibration operation plan table TB1 and based on the calibration operation time corresponding to each calibration operation included in the calibration operation plan table TB1.
For example, first, calibration operations having a calibration operation time shorter than the calibration usable time are set as selection candidates. Next, by referring to the calibration operation plan table TB1 and the task status/objective table TB2, the calibration operation that can succeed in the task for the work target of the task that is furthest ahead is outputted from among the selection candidates. Note that, when there are a plurality of selection candidates that can succeed up to the task that is furthest ahead, the calibration operation having the shortest calibration operation time is outputted. Further, when there is no selection candidate, the calibration operation with the shortest calibration operation time is outputted from among the calibration operations that can succeed in the current task.
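Both selections follow the same filter-then-prefer pattern, sketched below (the candidate representation is an assumption; returning None signals the fallback described above):

```python
def select_plan(candidates, usable_time, furthest_target):
    """Processing steps S720/S740: choose the plan to output.

    candidates:      dicts with 'time' (calibration operation time) and
                     'reachable_targets' (work targets whose task can succeed).
    usable_time:     calibration usable time from the priority calculation.
    furthest_target: work target of the task furthest ahead in the schedule.
    """
    # First filter: plans whose calibration operation time fits the usable time.
    fitting = [c for c in candidates if c["time"] < usable_time]
    # Prefer plans that keep the furthest scheduled task feasible, breaking
    # ties by the shortest calibration operation time.
    feasible = [c for c in fitting if furthest_target in c["reachable_targets"]]
    if feasible:
        return min(feasible, key=lambda c: c["time"])
    return None  # fall back to the task operation (S720) or shortest plan (S740)

plans = [{"time": 40, "reachable_targets": {"Wa", "Wb"}},
         {"time": 25, "reachable_targets": {"Wa"}}]
print(select_plan(plans, usable_time=60, furthest_target="Wb"))
```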
In processing step S711, loop processing for each row k of the calibration operation plan table TB1 is executed between S711a and S711b.
In processing step S712, the loop processing is started between S712a and S712b for an allocation pattern, to the task operation plan, of each pose pk included in the row k of the calibration operation plan table TB1.
The row k of the calibration operation plan table TB1 includes a plurality of poses pk. Further, the task operation planning unit 101 outputs a plurality of task operation plans. That is, when the plurality of poses pk are integrated with a plurality of task operation plans, there are a plurality of combination patterns of the poses pk and the task operation plans. Further, when the plurality of poses pk are integrated for one task operation plan, there are a plurality of patterns in the order of integrating the poses pk. In processing step S712, loop processing for all these patterns is started.
In processing step S713, an integration operation that integrates the pose pk included in the calibration operation plan table TB1 and the task operation plan is planned. Specifically, in a pattern to be processed, for each task operation plan to which the pose pk included in the calibration operation plan table TB1 is allocated, the task operation planning unit 101 plans an operation to start from a reference pose of the robot 200 to reach a pose at which the task outputted as the operation plan can be executed, via the pose pk included in the calibration operation plan table TB1, by a known operation planning method.
For example, as an operation planning method, J. J. Kuffner, and S. M. LaValle, RRT-connect: An efficient approach to single-query path planning, Int. Conf. on Robotics and Automation, pp. 995-1001 (2000) can be used.
In processing step S714, a time required for the calibration operation is calculated. Specifically, for each task operation plan to which the pose pk included in the calibration operation plan table TB1 is allocated, a time required when the robot 200 performs the integration operation planned in processing step S713 is calculated based on a specification of the robot 200.
Further, the operation time of the robot 200 without the calibration operation is calculated; that is, an operation that starts from the reference position of the robot 200 and reaches a place where the task can be executed is planned, and the time required for that operation is calculated.
Next, the difference between the time required when the robot 200 performs the integration operation and the operation time of the robot 200 without the calibration operation is set as the time required for the calibration operation. Finally, by adding up the times required for the calibration operation calculated for each task operation plan to which the pose pk included in the calibration operation plan table TB1 is allocated, the calibration operation time for the pattern to be processed is obtained.
In processing step S715, the integration operation corresponding to the shortest calibration operation time is selected from among all the calibration operation times calculated by the loop processing started in processing step S712 for the allocation patterns, to the task operation plan, of each pose pk included in row k of the calibration operation plan table TB1.
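The combinatorial search of steps S711 to S715 can be sketched with explicit enumeration; the plan_motion and motion_time callables stand in for the known operation planning method (e.g., RRT-connect) and the time calculation from the robot specification:

```python
from itertools import permutations, product

def best_integration(poses_k, task_goals, plan_motion, motion_time):
    """Steps S712-S715 for one row k of the calibration operation plan table TB1.

    poses_k:     calibration poses pk of row k.
    task_goals:  goal poses of the task operation plans from planning unit 101.
    plan_motion: maps a sequence of waypoints to a motion plan.
    motion_time: maps a motion plan to its execution time.
    """
    best_plans, best_extra = None, float("inf")
    baselines = [motion_time(plan_motion([g])) for g in task_goals]  # no calibration
    for order in permutations(poses_k):                  # ordering patterns
        for assign in product(range(len(task_goals)), repeat=len(order)):
            # assign[i] is the task plan through which pose order[i] is routed.
            extra, plans = 0.0, []
            for t, goal in enumerate(task_goals):
                via = [p for p, a in zip(order, assign) if a == t]
                plan = plan_motion(via + [goal])           # S713: integrated motion
                extra += motion_time(plan) - baselines[t]  # S714: added time
                plans.append(plan)
            if extra < best_extra:                         # S715: keep the shortest
                best_plans, best_extra = plans, extra
    return best_plans, best_extra
```

The enumeration grows quickly with the number of poses and task plans, so a practical implementation would prune it, but the structure mirrors the loops of steps S711 to S715.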
Returning to
Specifically, for each row of the calibration operation plan table TB1, for all patterns in the order of a plurality of poses included in the row, an operation plan for reaching the pose from the reference position of the robot 200 is planned by a known operation planning method, and set as the calibration operation plan. Further, the time required for executing the calibration operation plan based on the specification of the robot 200 is set as the calibration operation time. Finally, a calibration operation corresponding to the shortest calibration operation time is selected for each row of the calibration operation plan table TB1.
According to the above-described first embodiment of the present invention, the following working effects can be obtained.
(1) The camera pose calibration device 100 includes the task operation planning unit 101, the calibration operation planning unit 102, and the integration operation planning unit 103. The task operation planning unit 101 plans an operation for executing a task that is set in the robot 200, from an image and pose data of the robot 200. The calibration operation planning unit 102 plans an operation necessary for calibrating the camera pose from the image and the pose of the robot 200. The integration operation planning unit 103 plans an operation by integrating a task operation plan planned by the task operation planning unit 101 and a calibration operation plan planned by the calibration operation planning unit 102 (
(2) The calibration operation planning unit 102 plans an operation necessary for calibration on the basis of the acquired operation database DB1 including the acquired image and pose of the robot 200, and on the basis of the calibration operation database DB2 including the pose of the robot 200 used for the calibration and the success or failure of the task based on the calibration result, and outputs as the calibration operation plan table TB1 (
(3) The integration operation planning unit 103 includes the calibration priority calculation unit 104. The calibration priority calculation unit 104 calculates a priority of the calibration operation plan with respect to the task operation plan on the basis of the task status/objective table TB2 including the status and the objective of the task. The integration operation planning unit 103 plans an operation by integrating a task operation plan planned by the task operation planning unit 101 and a calibration operation plan planned by the calibration operation planning unit 102 (
(4) The calibration priority calculation unit 104 calculates a work margin time that is a margin time for the task, from the target end time and the scheduled end time of the task, and calculates, as the priority of the calibration operation plan, a calibration usable time that is a time that can be spent for the calibration operation, on the basis of the work margin time (
(5) The calibration priority calculation unit 104 calculates, as the priority of the calibration operation plan, an output operation type that is a type of the operation plan outputted by the integration operation planning unit 103, and the calibration usable time, on the basis of the number of failures for the task that is set in the robot 200 (
(6) In a case where the task is continued without performing calibration work based on the achievement level and the number of failures for the task that is set in the robot 200, the calibration priority calculation unit 104 calculates an estimated failure time that is a time estimated to be spent for an operation that fails the task, and calculates, as the priority of the calibration operation plan, a calibration usable time that is a time that can be spent for the calibration operation, on the basis of the estimated failure time (
(7) The integration operation planning unit 103 integrates each task operation plan planned by the task operation planning unit 101 and each calibration operation plan planned by the calibration operation planning unit 102 into a plurality of combination patterns, and the integration operation planning unit 103 selects, as an integration operation for each calibration operation plan, an integration operation with the shortest calibration operation time that is an operation time added to the task operation plan by the integration (
(8) The integration operation planning unit 103 selects the integration operation on the basis of the success or failure of the task based on a calibration result included in the calibration operation database DB2 and a schedule of a work target that is included in the task status/objective table TB2 and given as a task to the robot 200 (
Note that the following modified embodiment can be further adopted in realizing the first embodiment of the present invention.
First, when the camera pose calibration device 100 is first used, the calibration operation database DB2 has been created by acquiring images with various poses of the robot 200 and performing a task based on the calibration result using the acquired image and the pose of the robot 200. However, the method of creating the calibration operation database DB2 is not limited to this.
In addition, the integration operation planning unit may integrate each task operation plan planned by the task operation planning unit and each calibration operation plan planned by the calibration operation planning unit with a plurality of combination patterns, and may select, as an integration operation for each calibration operation plan, an integration operation with the shortest calibration operation time, which is an operation time added to the task operation plan by the integration.
Further, the integration operation planning unit may select the integration operation by using a calibration operation database and an objective of the plurality of scheduled tasks.
Moreover, the calibration operation database may be created by simulation.
The calibration operation database DB2 may be created by simulation without operating the robot 200. For example, the calibration operation database DB2 may be created by a method similar to a case of using actual data, by creating a simulation environment from a configuration and arrangement of the robot 200, a shape of the work target, or the like, and virtually acquiring a plurality of pairs of the image and the pose of the robot 200. Further, in order to match the simulation and the real environment, it is possible to acquire an image of the actual work environment, create a simulation environment that simulates the real environment by using a 3D restoration method using an image or an object recognition method, and use to create the calibration operation database DB2.
According to the above-described first modification, the following working effects can be obtained. That is, the camera pose calibration device 100 creates the calibration operation database DB2 by simulation. Therefore, since the time required to create the calibration operation database DB2 is shortened, it is possible to start using the camera pose calibration device 100 earlier.
Hereinafter, a camera pose calibration device according to a second embodiment of the present invention will be described with reference to
A configuration of the camera pose calibration device according to the second embodiment is basically the same as that of
In the second embodiment, the robot 200 is an autonomous driving vehicle.
In the example shown in
In the autonomous driving vehicle 230, the pose sensor 202 in the robot of
In the autonomous driving vehicle 230, the set task is traveling to the target pose 241. A task status is, for example, a type and shape of the target pose 241, a distance to the target pose 241, and the like. A task objective is, for example, a time to reach the target pose 241.
In the autonomous driving vehicle 230, the task operation planning unit 101 in
When a plurality of parking spaces are recognized, a target pose that enables parking corresponding to each parking space is set as the target pose 241. Note that the target pose 241 is not limited to the parking space, and may be any pose desired by a user. Further, the target pose 241 may be calculated using other sensors or map information, in addition to the camera 231.
Next, an operation and processing content of the calibration priority calculation unit 104 when the robot is the autonomous driving vehicle 230 will be described with reference to
In the task status/objective table TB2 of the autonomous driving vehicle 230, a work target D31 is the target pose 241. Further, a task failure D35 corresponds to a failure in traveling to the target pose 241. For example, it is determined as a failure when a driver senses a risk of contact with an object in the surrounding environment and applies a brake. In addition, when the risk of contact with an object in the surrounding environment is detected by an ultrasonic sensor or the like, it is determined as a failure.
The task status/objective table TB2 in
In processing step S970, when the number of passengers is 0, the process proceeds to processing step S980, otherwise the process proceeds to processing step S650.
In processing step S980, an output operation type is set to an integration operation plan L, and a calibration usable time is set to a preset value. When the number of passengers is 0, by setting the calibration usable time to the preset value, the calibration work is prioritized over finishing the task at the target end time.
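Assuming the passenger check is inserted ahead of processing step S650, the second embodiment's priority calculation becomes a thin extension of the first (a sketch; the preset calibration time of 600 seconds is hypothetical):

```python
def calc_priority_autonomous(num_passengers, num_failures, failure_threshold,
                             work_margin_time, estimated_failure_time,
                             preset_calibration_time=600.0):
    """Priority calculation with the passenger branch S970/S980 added."""
    if num_failures >= failure_threshold:            # S610 -> S620, as before
        return "calibration", work_margin_time
    if work_margin_time > 0:                         # S630 -> S640, as before
        return "integrated", work_margin_time
    if num_passengers == 0:                          # S970 -> S980
        # Nobody is on board, so calibration is prioritized over the end time.
        return "integrated", preset_calibration_time
    return "integrated", estimated_failure_time      # otherwise S650/S660, as before

print(calc_priority_autonomous(num_passengers=0, num_failures=0,
                               failure_threshold=3, work_margin_time=0,
                               estimated_failure_time=90))
```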
According to the above-described second embodiment, the following working effects can be obtained. That is, the calibration priority calculation unit 104 calculates, as the priority of the calibration operation plan, the calibration usable time that is the time that can be spent for the calibration operation, on the basis of the number of passengers of the autonomous driving vehicle 230. Therefore, by performing the calibration when the number of passengers is 0, the calibration operation time when there is a passenger can be shortened, which can reduce the burden on the passenger due to the calibration.
Note that the present invention is not limited to the above embodiments, and various modifications may be included. For example, the above embodiments have been illustrated in detail to facilitate description for easy understanding of the present invention, and are not necessarily limited to the embodiments that include all the illustrated configurations. Other aspects considered within the technical idea of the present invention are also included within the scope of the present invention. Additionally, a part of a configuration of an embodiment may be replaced with a configuration of another embodiment, and a configuration of an embodiment may be added with a configuration of another embodiment. Moreover, a part of a configuration of each embodiment may be deleted, replaced, or added with another configuration. In addition, each of the above-described configurations, functions, processing parts, processing units, and the like may be realized by hardware, for example, by designing part or all of them with an integrated circuit or the like. In addition, each of the above-described configurations, functions, and the like may be realized by software by interpreting and executing a program in which a processor realizes each function. Information such as a program, a table, and a file for realizing each function can be placed in a recording device such as a memory, a hard disk, or a solid state drive (SSD), or in a recording medium such as an IC card, an SD card, or a DVD.
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2018/021497 | 6/5/2018 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2019/234814 | 12/12/2019 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
4831549 | Red et al. | May 1989 | A |
20150224649 | Watanabe | Aug 2015 | A1 |
20160059419 | Suzuki | Mar 2016 | A1 |
20170136626 | Wang | May 2017 | A1 |
20170312918 | Huang et al. | Nov 2017 | A1 |
20180194008 | Namiki | Jul 2018 | A1 |
20180361589 | Paquin | Dec 2018 | A1 |
20190126487 | Benaim | May 2019 | A1 |
20200016757 | Sakuramoto | Jan 2020 | A1 |
Number | Date | Country |
---|---|---|
1-159186 | Jun 1989 | JP |
6-134691 | May 1994 | JP |
6-270020 | Sep 1994 | JP |
2007-61979 | Mar 2007 | JP |
2012-240174 | Dec 2012 | JP |
2015-150636 | Aug 2015 | JP |
6301045 | Mar 2018 | JP |
WO 2018043525 | Aug 2018 | WO |
Entry |
---|
International Search Report (PCT/ISA/210) issued in PCT Application No. PCT/JP2018/021497 dated Sep. 4, 2018 with English translation (four (4) pages). |
Japanese-language Written Opinion (PCT/ISA/237) issued in PCT Application No. PCT/JP2018/021497 dated Sep. 4, 2018 (three (3) pages). |
Heller J., et al., Structure-from-Motion Based Hand-Eye Calibration Using L∞ Minimization, IEEE Conference on Computer Vision and Pattern Recognition, 2011, pp. 3497-3503 (seven (7) pages). |
Kuffner J., et al., “RRT-Connect: An Efficient Approach to Single-Query Path Planning”, In Proc. 2000 IEEE Int'l Conf. on Robotics and Automation, 2000, pp. 1-7 (seven (7) pages). |
Japanese-language Office Action issued in Japanese Application No. 2020-523873 dated Nov. 2, 2021 with English translation (five (5) pages). |
Number | Date | Country
---|---|---
20210229290 A1 | Jul 2021 | US