The present technology relates to an information processing device, an information processing method, and a program, and more particularly, to an information processing device, an information processing method, and a program capable of appropriately controlling positioning of an operation object.
Various tasks using a robot hand, such as holding an object, are typically implemented by controlling the robot hand and determining the success or failure of the task on the basis of sensor values acquired by sensors.
For example, Patent Document 1 discloses a technique of detecting movement of a target object and a peripheral object on the basis of image information acquired by a vision sensor and force information acquired by a force sensor, and determining whether or not a robot is normally operating the target object.
Patent Document 2 discloses a technique of moving a sensor unit to a position where measurement of an object is easy, and then moving a robot hand on the basis of the position and orientation of the object measured by the sensor unit to hold the object.
In the technique described in Patent Document 1, depending on the positional relationship between the vision sensor and the robot hand, an object may be shielded by the robot hand, so that the vision sensor may not be able to observe the object. In this case, it is difficult to control the robot hand on the basis of the image information.
Furthermore, from the viewpoint of estimation accuracy, it is not preferable to estimate the positional relationship between the target object and the peripheral object, or the movement amount, on the basis of the force information acquired by the force sensor.
In the technique described in Patent Document 2, the measurement result of the position and orientation of the object includes an error, so that the actual position and orientation of the object may deviate from the estimated position and orientation in a case where the robot hand is moved on the basis of one measurement result.
Furthermore, there is a case where the information acquired by the sensor unit cannot be used for the success/failure determination of the task or for the control of the robot hand only by moving the sensor unit to a position where measurement is easy. For example, in a case where the robot hand brought close to an object to hold the object shields the object, the position and orientation of the object cannot be measured by the sensor unit.
The present technology has been made in view of such situations, and an object thereof is to appropriately control positioning of an operation object.
An information processing device according to one aspect of the present technology is an information processing device including a control unit configured to control a position of a hand portion that is used for execution of a task of moving a target object to a specific position and that has a plurality of sensors provided on a surface, on the basis of positional relationship information indicating a positional relationship between the target object and an object in surroundings of the target object, the positional relationship being estimated on the basis of distances measured by the sensors during movement of the hand portion.
In one aspect of the present technology, a position of a hand portion that is used for execution of a task of moving a target object to a specific position and that has a plurality of sensors provided on a surface is controlled on the basis of positional relationship information indicating a positional relationship between the target object and an object in surroundings of the target object, the positional relationship being estimated on the basis of distances measured by the sensors during movement of the hand portion.
MODE FOR CARRYING OUT THE INVENTION
Hereinafter, an embodiment for carrying out the present technology will be described. The description will be given in the following order.
<1. External Configuration of Robot>
As illustrated in the figure, the robot 1 includes a body portion 11 on which a head portion 12 is provided.
At the upper end of the body portion 11, arm portions 13-1 and 13-2 each including a manipulator with multiple degrees of freedom are provided. Hand portions 14-1 and 14-2 that are end effectors are provided at distal ends of the arm portions 13-1 and 13-2, respectively. The robot 1 has a function of holding an object with the hand portions 14-1 and 14-2.
Hereinafter, in a case where it is not necessary to distinguish the arm portions 13-1 and 13-2, they are collectively referred to as arm portions 13 as appropriate. Furthermore, in a case where it is not necessary to distinguish the hand portions 14-1 and 14-2, they are collectively referred to as hand portions 14. A plurality of other components may be described collectively as appropriate.
A carriage-type moving body portion 15 is provided at a lower end of the body portion 11. The robot 1 can move by rotating the wheels provided on the left and right of the moving body portion 15 or changing the direction of the wheels.
As described above, the robot 1 is a robot capable of executing various tasks such as holding an object by the hand portions 14 and carrying the object in a state of being held. In the example described below, a task of holding a card C1 placed on a desk D1 is executed.
Note that the robot 1 may be configured not as a dual arm robot as illustrated, but as a single arm robot having one arm portion 13.
As illustrated in the figure, the hand portion 14-1 is a two-finger type holding portion in which a left finger 22L and a right finger 22R, which are finger portions 22, are provided on a base portion 21 corresponding to a palm.
The left finger 22L is configured by connecting a plate-shaped portion 31L and a plate-shaped portion 32L that are plate-shaped members having a predetermined thickness. The plate-shaped portion 32L is provided on the distal end side of the plate-shaped portion 31L attached to the base portion 21. A coupling portion between the base portion 21 and the plate-shaped portion 31L and a coupling portion between the plate-shaped portion 31L and the plate-shaped portion 32L each have a predetermined movable range. A thin plate-shaped finger contact portion 33L is provided on an inner side of the plate-shaped portion 32L.
The right finger 22R has a configuration similar to that of the left finger 22L. That is, a plate-shaped portion 32R is provided on the distal end side of a plate-shaped portion 31R attached to the base portion 21. A coupling portion between the base portion 21 and the plate-shaped portion 31R and a coupling portion between the plate-shaped portion 31R and the plate-shaped portion 32R each have a predetermined movable range. A thin plate-shaped finger contact portion 33R (not illustrated) is provided on an inner side of the plate-shaped portion 32R.
The left finger 22L and the right finger 22R are opened and closed by moving the respective coupling portions. Various objects such as the card C1 are held so as to be sandwiched between the inner side of the plate-shaped portion 32L and the inner side of the plate-shaped portion 32R. The inner surface of the plate-shaped portion 32L provided with the finger contact portion 33L and the inner surface of the plate-shaped portion 32R provided with the finger contact portion 33R serve as contact surfaces with an object when the object is held.
As illustrated in the figure, a plurality of distance sensors is provided on the surfaces of the respective portions of the hand portion 14-1.
For example, on the upper surface of the base portion 21 corresponding to the palm, nine distance sensors 41-0 are provided side by side vertically and horizontally. The distance sensors 41-0 are provided at predetermined intervals.
Furthermore, distance sensors 41L-1, 41L-2, and 41L-3, each of which is a pair of two distance sensors, are provided on the inner side of the plate-shaped portion 32L, which is a contact surface with an object, in this order from the fingertip side. The two distance sensors constituting each of the distance sensors 41L-1, 41L-2, and 41L-3 are provided across the finger contact portion 33L, along the edges of the plate-shaped portion 32L.
Distance sensors 41L-4 are provided on the side surfaces of the plate-shaped portion 32L, and a distance sensor 41L-5 is provided on a semi-cylindrical surface serving as a fingertip. On the outer side of the plate-shaped portion 32L, a distance sensor 41L-6 (not illustrated) is provided similarly to the right finger 22R side.
On the inner side of the plate-shaped portion 31L, distance sensors 41L-7 and 41L-8 are provided side by side. On the outer side of the plate-shaped portion 31L, a distance sensor 41L-9 (not illustrated) is provided similarly to the right finger 22R side.
Also on the right finger 22R, distance sensors 41R-1 to 41R-9 are provided similarly to the left finger 22L. That is, the distance sensors 41R-1, 41R-2, and 41R-3 (not illustrated) are provided on the inner side of the plate-shaped portion 32R in this order from the fingertip side, and the distance sensors 41R-4 are provided on the side surfaces of the plate-shaped portion 32R.
The distance sensor 41R-5 is provided on the surface of the fingertip of the plate-shaped portion 32R, and the distance sensor 41R-6 is provided on the outer side of the plate-shaped portion 32R. The distance sensors 41R-7 and 41R-8 (not illustrated) are provided on the inner side of the plate-shaped portion 31R, and the distance sensor 41R-9 is provided on the outer side of the plate-shaped portion 31R.
Hereinafter, in a case where it is not necessary to distinguish the distance sensors 41L-1 to 41L-9 and the distance sensors 41R-1 to 41R-9, they are collectively referred to as distance sensors 41 as appropriate.
When a task is executed, as illustrated in the figure, distances are measured by the distance sensors 41 provided on the respective portions of the hand portions 14.
For example, the distance sensors 41-0, the distance sensors 41L-1 to 41L-3, 41L-7, and 41L-8, and the distance sensors 41R-1 to 41R-3, 41R-7, and 41R-8 are used to measure the distance to each position of the object held by the hand portions 14, and the like.
As described above, the distance sensor 41 is provided on each part of the hand portion 14-1, so that the distribution of the distances to the card C1 as the operation object (the distance to each position of the card C1) is measured in real time. Furthermore, the distribution of the distances to the desk D1 included in the environment surrounding the card C1 is measured in real time. The execution status of the task is monitored on the basis of the distance to each position of the card C1 and the desk D1.
The same components as the components of the hand portion 14-1 as described above are also provided in the hand portion 14-2.
Although the hand portions 14 are two-finger type holding portions, a multi-finger type holding portion having a different number of finger portions, such as a three-finger type holding portion or a five-finger type holding portion, may be provided. The degree of freedom of the finger portions, the number of the distance sensors 41, and the arrangement of the distance sensors 41 can be set in any way.
<2. Configuration of Robot>
Hardware Configuration
As illustrated in the figure, the control of the robot 1 is implemented by an information processing device 51.
The information processing device 51 includes a computer including a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), a flash memory, and the like. The information processing device 51 is housed in, for example, the body portion 11. The information processing device 51 executes a predetermined program by the CPU to control the overall operation of the robot 1.
The information processing device 51 recognizes the environment around the robot 1 on the basis of the detection result by the sensors, the images captured by the visual sensors, and the like, and executes a task according to the recognition result. Various sensors and cameras are provided in each of the body portion 11, the head portion 12, the arm portions 13, the hand portions 14, and the moving body portion 15. For example, the head portion 12 is provided with the visual sensors 12A including RGB cameras or the like. The hand portions 14 are provided with the distance sensors 41.
Functional Configuration
As illustrated in the figure, in the information processing device 51, an environment measurement unit 101, a task determination unit 102, a hand and finger initial position determination unit 103, an initial position database 104, an initial position movement control unit 105, a target value calculation unit 106, a geometric information estimation unit 107, a positioning control unit 108, a task success/failure condition calculation unit 109, and a task success/failure determination unit 110 are implemented. At least a part of these functional units is implemented by the CPU of the information processing device 51 executing a predetermined program.
The environment measurement unit 101 performs three-dimensional measurement on an operation object and objects included in the environment surrounding the operation object on the basis of the output of the visual sensors 12A. By performing the three-dimensional measurement, the position and shape of the operation object, the shape of the object included in the environment surrounding the operation object, and the like are calculated. The measurement result by the environment measurement unit 101 is output to the hand and finger initial position determination unit 103.
The task determination unit 102 determines a task to be executed and outputs information indicating the content of the task. The information indicating the content of the task includes information indicating what kind of information is used as the geometric information. As described later, the geometric information is information used for monitoring the execution status of the task as well as control of each unit. The information output from the task determination unit 102 is supplied to the hand and finger initial position determination unit 103, the target value calculation unit 106, and the geometric information estimation unit 107.
The hand and finger initial position determination unit 103 determines use sensors, which are the distance sensors 41 used for monitoring the execution status of the task, according to the task determined by the task determination unit 102. Among the plurality of distance sensors 41 provided on the hand portions 14, the distance sensors 41 suitable for monitoring the execution status of the task are determined as the use sensors.
Furthermore, the hand and finger initial position determination unit 103 calculates the initial positions of the distance sensors 41 determined as the use sensors on the basis of the measurement result by the environment measurement unit 101 and the content of the task determined by the task determination unit 102.
Candidates for the initial positions of the distance sensors 41 are set in advance for each type of task content. The information indicating the candidates for the initial positions is stored in the initial position database 104 and read by the hand and finger initial position determination unit 103 as appropriate. Information indicating the initial positions of the distance sensors 41 calculated by the hand and finger initial position determination unit 103 is output to the initial position movement control unit 105.
The initial position movement control unit 105 controls a drive unit 121 so that the distance sensors 41 are positioned at the initial positions calculated by the hand and finger initial position determination unit 103. The drive unit 121 corresponds to drive portions of the robot 1 including the arm portions 13, the hand portions 14, and the moving body portion 15.
The target value calculation unit 106 calculates a target value of the geometric information on the basis of the information supplied from the task determination unit 102, and outputs the target value to the positioning control unit 108.
The geometric information estimation unit 107 acquires distance distribution information measured by the distance sensors 41. The distance distribution information indicates a distance to each position of an operation object and an object included in the environment. The geometric information estimation unit 107 estimates the geometric information determined by the task determination unit 102 on the basis of the distance distribution information, and outputs the estimated geometric information to the task determination unit 102, the positioning control unit 108, and the task success/failure determination unit 110.
The positioning control unit 108 performs positioning control by controlling the drive unit 121 so that the geometric information estimated by the geometric information estimation unit 107 reaches the target value supplied from the target value calculation unit 106. The positioning control is control for moving the operation object to a predetermined position. Furthermore, the positioning control is also control for moving the arm portions 13 and the hand portions 14 so that the distance sensors 41 come to predetermined positions.
The task success/failure condition calculation unit 109 determines a success condition and a failure condition of the task, and outputs information indicating the success condition and the failure condition to the task success/failure determination unit 110.
The task success/failure determination unit 110 determines success or failure of the task being executed according to whether or not the geometric information estimated by the geometric information estimation unit 107 satisfies the condition determined by the task success/failure condition calculation unit 109. In a case where it is determined that the task has succeeded or the task has failed, the task success/failure determination unit 110 outputs a stop command to the positioning control unit 108. The result of success/failure determination of the task is also supplied to other processing units such as the task determination unit 102 as appropriate.
<3. Operation of Information Processing Device>
Example of Task of Holding Thin Object
The processing of the information processing device 51 will be described with reference to the flowchart in the figure.
Here, processing in a case where a task of holding the card C1 placed on the top plate of the desk D1 is executed will be described, referring as appropriate to the states at the time of execution of the task illustrated in the figures.
When a thin object such as the card C1 is held, a human often picks up the object by using a fingernail, or picks up the object after translating it to the front side of the desk D1. The latter operation, that is, translating the card C1 to the front side of the desk D1 and then picking it up, is implemented by the information processing device 51.
In step S1, the task determination unit 102 determines a task to be executed.
The task of holding the card C1 includes a task of translating the card C1 to the front side to allow the card C1 to be sandwiched between the left finger 22L and the right finger 22R, and a task of bringing the left finger 22L into contact with the lower surface of the card C1 (task of sandwiching the card C1 between the left finger 22L and the right finger 22R). The task determination unit 102 determines to first execute a task of translating the card C1 to the front side.
In step S2, the environment measurement unit 101 performs three-dimensional measurement of the environment on the basis of the output of the visual sensors 12A. By performing three-dimensional measurement, the position and shape of the card C1, the shape of the desk D1, and the like are recognized.
In step S3, the hand and finger initial position determination unit 103 calculates the initial positions of the hand portions 14 and the finger portions 22 on the basis of the content of the task and the measurement result by the environment measurement unit 101. Together with the positions, the respective orientations of the hand portions 14 and the finger portions 22 are also calculated.
In step S4, the hand and finger initial position determination unit 103 determines a use sensor.
Here, the initial positions of the hand portions 14 and the finger portions 22 calculated in step S3 and the use sensors determined in step S4 are selected so as to be suitable for monitoring the execution status of the task.
For example, as illustrated in A of the figure, the left finger 22L is inserted under the top plate of the desk D1, and the right finger 22R is positioned so as to press the upper surface of the card C1.
In the example in A of the figure, the distance sensors 41L-1 to 41L-3, 41L-7, and 41L-8 provided on the left finger 22L are determined as use sensors.
Such an initial position of each portion is selected according to the task determined by the task determination unit 102. Various methods can be used as a method of determining the initial positions in addition to programming in advance such that the determination according to the task is performed.
For example, initial positions or the like most suitable for detection of success or failure of a task may be determined using an inference model obtained by machine learning. In this case, an inference model is generated by performing machine learning using time-series sensor data recorded at the time of task success and at the time of task failure while executing the task with the hand portions 14 set at various positions.
Returning to the description of the flowchart, in step S5, the task determination unit 102 determines geometric information to be used for monitoring the execution status of the task.
The geometric information is information indicating at least a part of the state of an object, including a size (area), an inclination, a distance, and the like, and is obtained on the basis of the distances measured by the distance sensors 41 during the movement of the hand portions 14. Since the geometric information changes according to the positional relationship between the operation object and the objects in its surroundings, the geometric information is used not only for the control of the drive unit 121 but also for monitoring the execution status of the task, as information indicating the positional relationship between the operation object and the objects in its surroundings during execution of the task (during movement of the hand portions 14).
In a case where the task of translating the card C1 is executed, a contactable area Sest, which is the area where the left finger 22L can be brought into contact with the card C1, is determined as the geometric information. The contactable area Sest is expressed by Equation (1) below.
[Equation 1]
Sest = nA (1)
In Equation (1), n represents the number of the distance sensors 41 that have measured sensor values within the effective distance range. A represents a footprint area of the distance sensors 41.
The effective distance range is a range of a distance larger (longer) than the distance from the distance sensor 41 (the distance sensor 41 on the left finger 22L) to the desk D1 and smaller (shorter) than the distance from the distance sensor 41 to the right finger 22R, which is the finger portion 22 on the opposite side.
Since the left finger 22L is positioned under the desk D1, the contactable area Sest is an area, in the entire card C1, of a region protruding from the edge portion of the desk D1.
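As an illustration only, the estimate of Equation (1) can be sketched as follows. This is a minimal sketch, not the implementation of the robot 1; the function name, the distances, and the footprint area are assumed values for illustration.

```python
# Minimal sketch of the contactable-area estimate of Equation (1):
# S_est = n * A. The distances, ranges, and footprint area below are
# illustrative assumptions, not values of the actual robot 1.

def contactable_area(sensor_values, dist_to_desk, dist_to_opposite_finger,
                     footprint_area):
    """Estimate S_est from the distance sensors on the left finger 22L.

    A sensor value is within the effective distance range when it is
    longer than the distance to the desk D1 and shorter than the
    distance to the opposing finger (the right finger 22R).
    """
    n = sum(1 for d in sensor_values
            if dist_to_desk < d < dist_to_opposite_finger)
    return n * footprint_area

# Example: five sensor values in mm; the desk is 20 mm away, the right
# finger 60 mm away, and each sensor footprint is 25 mm^2.
values = [18.0, 19.5, 35.0, 36.0, 19.0]  # two beams hit the card C1
print(contactable_area(values, 20.0, 60.0, 25.0))  # -> 50.0
```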
In addition to the contactable area of the operation object, distances measured by the distance sensors 41, an orientation (inclination) of the operation object with respect to the environment, and the like can be used as the geometric information. An example of using information other than the contactable area of the operation object as the geometric information will be described later with reference to an example of executing another task.
In step S6, the initial position movement control unit 105 moves the hand portion 14 to the initial position calculated by the hand and finger initial position determination unit 103.
In step S7, the initial position movement control unit 105 brings the hand portion 14 into contact with the operation object. Specifically, as described with reference to A of the figure, the right finger 22R is brought into contact with the upper surface of the card C1 while the left finger 22L is positioned under the top plate of the desk D1.
After the left finger 22L and the right finger 22R are positioned at the initial positions illustrated in A of the figure, distances are measured by the distance sensors 41L-1 to 41L-3, 41L-7, and 41L-8, which are use sensors.
In the case of A of the figure, the card C1 does not yet protrude from the edge portion of the desk D1, and thus the use sensors measure distances to the desk D1, which are sensor values out of the effective distance range.
In step S8, the target value calculation unit 106 calculates a target value of the geometric information. The control of the hand portion 14 and the like is performed so as to make the geometric information close to the target value.
In step S9, the task success/failure condition calculation unit 109 determines a success condition and a failure condition of the task.
For example, the task success/failure condition calculation unit 109 sets, as a threshold value, an area that allows the left finger 22L to be in contact with the lower surface of the card C1, and determines, as a success condition, that the contactable area Sest is larger than the threshold value. The target value calculated by the target value calculation unit 106 in step S8 is an area used as a threshold of the success condition.
Furthermore, the task success/failure condition calculation unit 109 determines, as a failure condition, that the contactable area Sest is smaller than the target value and the sensor values of all the distance sensors 41 measuring sensor values that are out of the effective distance range are within an abnormal range. Of the distances out of the effective distance range, the distance from the distance sensor 41 provided on the left finger 22L to the right finger 22R is set as the abnormal range.
Note that a plurality of conditions may be determined as failure conditions. For example, together with the failure condition described above, a condition that the minimum value of the sensor values measured by the distance sensors 41 provided on the left finger 22L (the shortest distance to the desk D1) becomes smaller than that at the start of the task may be determined as a failure condition.
Satisfaction of this condition means that the hand portion 14 cannot slide parallel to the top plate of the desk D1 and that the right finger 22R is away from the card C1.
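A minimal sketch of this determination logic follows; the ranges and the threshold are illustrative assumptions rather than values defined by the present technology.

```python
# Sketch of the success/failure determination for the translating task.
# The effective and abnormal ranges and the target area are illustrative.

def judge_task(sensor_values, s_est, s_target, effective_range, abnormal_range):
    """Return 'success', 'failure', or 'continue'.

    Success: the contactable area S_est exceeds the target (threshold) area.
    Failure: S_est is below the target AND every sensor value outside the
    effective distance range lies within the abnormal range (the sensor
    sees the opposing finger rather than the desk D1 or the card C1).
    """
    if s_est > s_target:
        return 'success'
    lo, hi = effective_range
    out_of_range = [d for d in sensor_values if not lo < d < hi]
    a_lo, a_hi = abnormal_range
    if out_of_range and all(a_lo <= d <= a_hi for d in out_of_range):
        return 'failure'
    return 'continue'

# Example: every out-of-range sensor sees the right finger -> failure.
print(judge_task([58.0, 59.0, 35.0], s_est=25.0, s_target=70.0,
                 effective_range=(20.0, 55.0), abnormal_range=(55.0, 62.0)))
```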
In step S10, the geometric information estimation unit 107 calculates the contactable area Sest on the basis of the sensor values measured by the distance sensors 41.
In step S11, the positioning control unit 108 performs positioning control on the basis of the contactable area Sest estimated by the geometric information estimation unit 107.
Specifically, the positioning control unit 108 slides the hand portion 14-1 in the state illustrated in A of the figure toward the front side of the desk D1, so that the card C1 pressed by the right finger 22R is translated together with the hand portion 14-1.
In this case, as indicated by the light beam L3 indicated by an alternate long and short dash line, the distance sensors 41L-3 measure distances to the card C1, which are sensor values within the effective distance range. In this state, a predetermined area is obtained as the contactable area Sest.
In the example of B of the figure, the contactable area Sest has not yet reached the target value, and thus the sliding of the hand portion 14-1 is continued.
As described later, the control by the positioning control unit 108 is performed such that the hand portion 14-1 (drive unit 121) is moved while being decelerated as the contactable area Sest, which is geometric information, increases.
In step S12, the task success/failure determination unit 110 determines whether or not an abnormality has occurred.
In a case where it is determined in step S12 that no abnormality has occurred, in step S13, the task success/failure determination unit 110 determines whether or not the task has succeeded. Here, in a case where the contactable area Sest satisfies the success condition, it is determined that the task has succeeded.
In a case where it is determined in step S13 that the task has not succeeded, the processing returns to step S10, and the subsequent processing is repeated. The positioning control unit 108 continues to slide the hand portion 14-1 until the contactable area reaches the target value.
In a case where it is determined in step S13 that the task has succeeded since the contactable area Sest has reached the target value, the positioning control unit 108 stops the sliding of the hand portion 14-1 according to a stop command from the task success/failure determination unit 110 in step S14.
As indicated by the light beams L2 and L3 indicated by alternate long and short dash lines in A of the figure, when the task has succeeded, sensor values within the effective distance range are measured by the distance sensors 41L-2 and 41L-3.
In step S15, the task determination unit 102 determines whether or not all the tasks have been completed.
In a case where it is determined in step S15 that not all the tasks have been completed because, for example, the task of bringing the left finger 22L into contact with the lower surface of the card C1 remains, the target value calculation unit 106 calculates, in step S16, a command value for the next task on the basis of the current sensor values. The calculation of the command value for the next task is performed after the task of bringing the left finger 22L into contact with the lower surface of the card C1 is determined as the task to be executed next by the task determination unit 102.
For example, as illustrated in a word balloon in A of the figure, a movement amount for bringing the left finger 22L into contact with the lower surface of the card C1 is calculated as the command value on the basis of the current sensor values.
After the command value for the next task is calculated, the processing returns to step S3, and processing similar to the processing described above is performed.
That is, the processing of moving the left finger 22L while monitoring the execution status of the task using the geometric information is performed, so that the left finger 22L comes into contact with the lower surface of the region of the card C1 protruding from the edge portion of the desk D1 and the card C1 is held as illustrated in B of the figure.
On the other hand, in a case where the failure condition is satisfied in step S12, the task success/failure determination unit 110 determines that an abnormality has occurred.
In a case where the force of pressing the card C1 with the right finger 22R is weak, even if the hand portion 14-1 is slid to move the card C1, slipping occurs between the right finger 22R and the card C1 as illustrated in a word balloon in A of the figure.
In a case where the hand portion 14-1 is slid while the slipping occurs, the card C1 is slightly translated but does not protrude from the edge portion of the desk D1 as illustrated in B of the figure.
In this case, as indicated by the light beams L1 and L2 indicated by broken lines, sensor values out of the effective distance range are measured by the distance sensors 41L-1 and 41L-2, respectively. Furthermore, as indicated by the light beams L3 to L5 indicated by broken lines, the sensor values within the abnormal range are measured by the distance sensors 41L-3, 41L-7, and 41L-8, respectively.
Since neither the distance sensor 41L-1 nor the distance sensor 41L-2 measures a sensor value within the abnormal range, the sliding of the hand portion 14-1 is continued. As the sliding is continued, the card C1 is slightly translated further as illustrated in A of the figure.
In A of the figure, sensor values within the abnormal range are measured by all the use sensors, including the distance sensors 41L-1 and 41L-2, so that the failure condition described above is satisfied.
In a case where it is determined in step S12 that an abnormality has occurred due to satisfaction of the above-described failure condition, the task determination unit 102 sets, in step S17, a return operation task on the basis of the current sensor value. Thereafter, the processing returns to step S3, and the return operation task is executed by processing similar to the processing described above.
For example, the task determination unit 102 determines a return operation task of performing the task of translating the card C1 again as a task to be executed next. The target value calculation unit 106 calculates a target value of the movement amount of the return operation task by Equation (2) on the basis of the contactable area Sest.
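The body of Equation (2) does not survive here. From the definitions that follow, one consistent reconstruction, offered as an assumption rather than the original notation, is the following, where d denotes the interval between adjacent distance sensors 41 (an assumed symbol not defined in the surviving text):

[Equation 2]
xref = (nref − n)·d, where nref = ⌈Sref/A⌉ (2)

Under this reading, nref is the smallest number of distance sensors 41 whose combined footprint area nref × A is at least the target value Sref, and xref is the target value of the movement amount.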
In Equation (2), nref represents the number of the distance sensors 41 that measure sensor values within the effective distance range necessary for the contactable area to reach the target value. n represents the current number of the distance sensors 41 that measure sensor values within the effective distance range. Sref represents a target value of the contactable area.
Equation (2) indicates that the contactable area Sest becomes larger than the target value Sref if the region of the card C1 corresponding to the footprint area of the nref distance sensors 41 protrudes from the edge portion of the desk D1.
The positioning control unit 108 moves the hand portion 14-1 to a position where the number of the distance sensors 41 that measure distances to the desk D1 is nref. For example, as illustrated in B of the figure, the hand portion 14-1 is moved until distances to the desk D1 are measured by nref distance sensors 41.
As described above, the operation of moving the hand portion 14-1 by a distance corresponding to nref of the distance sensors 41 is performed as the return operation task. After the return operation task is performed, for example, the task of increasing the force of pressing the card C1 with the right finger 22R to translate the card C1 is performed again.
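A sketch of the return-operation movement amount under the reconstruction above; the sensor pitch is an assumed parameter for illustration.

```python
import math

# Sketch of the return operation: move the hand back until n_ref sensors
# measure distances to the desk D1, with n_ref = ceil(S_ref / A) as
# reconstructed above. The sensor pitch is an assumed parameter.

def return_movement(s_target, footprint_area, n_current, sensor_pitch):
    n_ref = math.ceil(s_target / footprint_area)
    return max(0, n_ref - n_current) * sensor_pitch

# Target area 70 mm^2, 25 mm^2 per footprint -> n_ref = 3; one sensor
# currently sees the desk, pitch 10 mm -> move 20 mm.
print(return_movement(70.0, 25.0, n_current=1, sensor_pitch=10.0))  # -> 20.0
```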
In a case where the task of translating the card C1 to the front side fails, there is a possibility that the card C1 has moved from the position at the start of the task. In this case, even if the hand portion 14-1 is moved to the initial position of the task, the hand portion 14-1 cannot be brought into contact with the card C1 in some cases.
In such a case, in conventional methods, the hand portion is once moved to a position away from the desk, the environment is measured again using a visual sensor, and the hand portion is then moved again according to the recognized position of the card.
With the above-described processing, it is possible to resume the task by moving the hand portion 14-1 by the minimum necessary movement amount without once moving the hand portion 14-1 away from the desk D1. That is, the information processing device 51 can quickly perform the task again.
Returning to the description of the flowchart, in a case where it is determined in step S15 that all the tasks have been completed, the processing ends.
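The loop of steps S10 to S17 can be summarized in the following sketch. The callables passed in are hypothetical stand-ins for the units of the information processing device 51, not interfaces defined by the present technology.

```python
# Sketch of the monitoring loop of steps S10 to S17. The callables are
# hypothetical stand-ins for the units of the information processing
# device 51.

def run_task(estimate_area, judge, slide, stop, set_return_task):
    while True:
        s_est = estimate_area()      # step S10: estimate geometric information
        slide(s_est)                 # step S11: positioning control
        status = judge(s_est)        # steps S12/S13: failure/success check
        if status == 'failure':
            set_return_task(s_est)   # step S17: set a return operation task
            return 'retry'
        if status == 'success':
            stop()                   # step S14: stop the sliding
            return 'done'

# Trivial stand-ins that exercise the loop once:
areas = iter([0.0, 30.0, 75.0])
print(run_task(lambda: next(areas),
               lambda s: 'success' if s > 70.0 else 'continue',
               slide=lambda s: None, stop=lambda: None,
               set_return_task=lambda s: None))  # -> 'done'
```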
Example of Control by Positioning Control Unit 108
As described above, the control of the hand portion 14 by the positioning control unit 108 is performed such that the hand portion 14 is moved while being decelerated as the contactable area Sest, which is the geometric information, increases.
The control by the positioning control unit 108 is implemented by a subtractor 131, a converter 132, a subtractor 133, and a controller 134. The subtractor 131, the converter 132, the subtractor 133, and the controller 134, which are surrounded by a broken line in the figure, correspond to the positioning control unit 108.
The subtractor 131 calculates a difference between the target value Sref of the contactable area and the contactable area Sest estimated by the geometric information estimation unit 107, and outputs the difference to the converter 132.
The converter 132 applies a conversion coefficient K to the difference supplied from the subtractor 131 to calculate a target value vref of the moving speed. For example, the smaller the difference between the target value Sref and the contactable area Sest, the smaller the value calculated as the target value vref of the moving speed. The converter 132 outputs the target value vref of the moving speed to the subtractor 133.
The subtractor 133 calculates a difference between the target value vref supplied from the converter 132 and the actual moving speed v of the drive unit 121, and outputs the difference to the controller 134.
The controller 134 controls the drive unit 121 so that the difference between the moving speeds supplied from the subtractor 133 becomes 0.
The actual moving speed v of the drive unit 121 is measured by a sensor provided in each unit and supplied to the subtractor 133. Furthermore, the three-dimensional coordinates p of each measurement point of the distance sensor 41 are supplied to the geometric information estimation unit 107. For example, in a case where N distance sensors 41 are provided, the three-dimensional coordinates p are represented by a 3×N matrix.
The geometric information estimation unit 107 estimates the contactable area Sest on the basis of the three-dimensional coordinates p, and outputs the contactable area Sest to the subtractor 131.
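As a minimal sketch of this feedback structure (the gains, control period, and simplified inner loop are illustrative assumptions, not parameters of the robot 1):

```python
# Sketch of the control structure around the positioning control unit 108:
# the outer loop converts the area error into a speed target via the
# conversion coefficient K (converter 132), and the inner loop drives the
# speed error to zero (controller 134). All gains and the control period
# are illustrative assumptions.

K = 0.02    # conversion coefficient: area error -> speed target
KP = 20.0   # inner-loop gain on the speed error
DT = 0.05   # control period [s]; KP * DT = 1 makes the demo settle in one step

def control_step(s_ref, s_est, v):
    v_ref = K * (s_ref - s_est)   # subtractor 131 and converter 132
    accel = KP * (v_ref - v)      # subtractor 133 and controller 134
    return v + accel * DT         # speed command sent to the drive unit 121

v = 0.0
for s_est in (0.0, 25.0, 50.0, 70.0):  # contactable area S_est increasing
    v = control_step(70.0, s_est, v)
    print(v)  # approx. 1.4, 0.9, 0.4, 0.0 -- the hand decelerates as S_est grows
```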
The upper part of the figure illustrates the transition of the contactable area Sest, and the lower part illustrates the transition of the command value of the moving speed of the hand portion 14-1.
As described with reference to A of the figure, in the period before the card C1 protrudes from the edge portion of the desk D1, the contactable area Sest remains 0, and the command value of the moving speed is kept at a large constant value.
As described with reference to B of the figure, when the card C1 starts to protrude from the edge portion of the desk D1, the contactable area Sest increases, and the command value of the moving speed decreases accordingly.
As described with reference to A of the figure, when the contactable area Sest reaches the target value Sref, the command value of the moving speed becomes 0, and the movement of the hand portion 14-1 is stopped.
As described above, the control of the hand portion 14 by the positioning control unit 108 is performed such that the moving speed of the hand portion 14 is adjusted according to the change in the contactable area Sest as the geometric information.
Effects
As described above, in the robot 1, the distance sensors 41 are positioned at positions where it is easy to measure the displacement between the operation object and the environment according to the content of the task to be executed. Therefore, it is possible to constantly monitor the progress status of a task in which a robot hand would shield the operation object from a camera at a fixed position to make measurement difficult.
The robot 1 can improve the accuracy and success rate of a task of moving an operation object to a target position.
Furthermore, in the robot 1, it is not necessary to measure the contact position between the card C1 and the hand portion 14-1, and it is possible to succeed in the task of translating the card C1 only by bringing the hand portion 14-1 into contact with an approximate target position.
Typically, in a case where a robot hand is controlled on the basis of information acquired by a camera at a fixed position, it is necessary to measure the positional relationship between a card and a desk at the start of the task and the positional relationship between the card and the robot hand.
In the robot 1, the visual sensors 12A are used only for measuring the positional relationship between the card C1 and the desk D1 at the start of the task. The control of the hand portion 14-1 and the success determination of the task are performed on the basis of the relative displacement between the card C1 and the desk D1 measured by the distance sensor 41 provided on the left finger 22L. Since the contact position between the card C1 and the hand portion 14-1 is not so important, the hand portion 14-1 does not need to contact the center of the card C1, and may contact the front side or the back side of the card C1.
Moreover, in the robot 1, the geometric information is calculated on the basis of the distance distribution information measured by the distance sensors 41, and the control of the hand portion 14 and the success determination of the task are performed according to the change in the geometric information. In a case of executing a task for which success or failure is determined by the positional relationship between the operation object and the environment, the robot 1 can control the hand portion 14 and determine the success of the task with easy observation and a simple algorithm.
<4. Examples of Other Tasks>
Examples in which the present technology is applied to other tasks will be described below. Note that the basic flow of the tasks is similar to the flow of the task of holding the card C1 described with reference to the flowchart above.
Example of Wiping Task
In this example, a task of wiping a window W1 using a cleaner C11 held by the hand portion 14-1 is executed. The operation object is the cleaner C11, and objects included in the environment are the window W1 and a frame F1.
At the start of the task of wiping the window W1, as illustrated in A of the figure, the left finger 22L and the right finger 22R are spread, and the cleaner C11 is pressed against the surface of the window W1 by the base portion 21 of the hand portion 14-1.
Here, the robot 1 brings the hand portion 14-1 roughly into contact with the cleaner C11. Since the base portion 21, the left finger 22L, and the right finger 22R are provided with the distance sensors 41, the robot 1 can detect which part of the hand portion 14-1 is in contact with the cleaner C11 on the basis of the sensor values measured by the distance sensors 41 when the hand portion 14-1 is brought roughly into contact with the cleaner C11.
After the left finger 22L and the right finger 22R are positioned to be spread, distances are measured by the distance sensors 41L-1 to 41L-3, 41L-7, and 41L-8, which are use sensors. In the case of A of the figure, as indicated by the light beam L1 indicated by an alternate long and short dash line, a sensor value within the effective distance range is measured by the distance sensor 41L-1.
Furthermore, as indicated by the light beams L2 to L4 indicated by broken lines, sensor values out of the effective distance range are measured by the distance sensors 41L-2, 41L-3, and 41L-7.
Since a portion of the left finger 22L provided with the distance sensor 41L-8 is pressed against the cleaner C11 together with the base portion 21, a sensor value within a contact range is measured by the distance sensor 41L-8. The contact range indicates that the sensor value is 0.
In a case where the task of wiping is executed, the progress status of the task is monitored and the hand portion 14-1 is controlled using, as the geometric information, an interval δx between the end of the cleaner C11 and the end of the window W1. The interval δx indicated by the bidirectional arrow in A of the figure is expressed by Equation (3) below.
[Equation 3]
δx = min(xe) − max(xc) (3)
In Equation (3), xe represents a set of positions of the distance sensors 41 that measure sensor values within the effective distance range. xc represents a set of positions of the distance sensors 41 that acquire sensor values within the contact range.
In other words, min(xe) represents the position of the left end of the frame F1 (the end on the side of the surface in contact with the window W1), and max(xc) represents the right end position of the cleaner C11. The positioning control by the hand portion 14-1 is performed such that the interval δx decreases.
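A minimal sketch of Equation (3), assuming sensor coordinates along the finger and an illustrative effective distance range:

```python
# Sketch of Equation (3): delta_x = min(x_e) - max(x_c). Sensor x
# coordinates run along the finger; the ranges are illustrative.

def interval_delta_x(sensor_x, sensor_values, effective_range, contact_eps=0.5):
    lo, hi = effective_range
    x_e = [x for x, d in zip(sensor_x, sensor_values) if lo < d < hi]      # window seen
    x_c = [x for x, d in zip(sensor_x, sensor_values) if d <= contact_eps]  # touching C11
    if not x_e or not x_c:
        return None  # the interval is not observable in this pose
    return min(x_e) - max(x_c)

# Five sensors at 10 mm pitch; the sensor at x = 0 touches the cleaner C11,
# the others see the window W1 within the effective range.
print(interval_delta_x([0, 10, 20, 30, 40],
                       [0.0, 12.0, 11.5, 11.8, 12.1],
                       effective_range=(10.0, 15.0)))  # -> 10
```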
Specifically, the robot 1 moves the hand portion 14-1 in the state illustrated in A of the figure in the +x direction while pressing the cleaner C11 against the surface of the window W1.
In this case, as indicated by the light beams L1 to L4 indicated by alternate long and short dash lines, the distance sensors 41L-1 to 41L-3 and 41L-7 measure sensor values within the effective distance range. In this state, it is assumed that min(xe) is a position corresponding to the position of the distance sensor 41L-7 and max(xc) is a position corresponding to the position of the distance sensor 41L-8, and an interval δx of a predetermined length is obtained.
In a case where the interval δx becomes sufficiently small, it is determined that the task has succeeded. At this time, as illustrated in the figure, the cleaner C11 has been moved to the vicinity of the end of the window W1 (the frame F1).
The sensor values measured by the distance sensors 41 when it is determined that the task has succeeded are used for a task of moving the hand portion 14-1 in the +y direction as the next operation. In the task of moving the hand portion 14-1 in the +y direction, the movement of the hand portion 14-1 is controlled such that the interval δx, which is the distance between the cleaner C11 and the frame F1, is maintained at a constant distance.
Typically, wiping a window using the hand portion is often performed with the cleaner held by the left and right finger portions of the hand portion.
In the robot 1, the left finger 22L and the right finger 22R are spread, the cleaner C11 is pressed against the surface of the window W1 by the base portion 21 of the hand portion 14-1, and wiping work is performed while measuring the interval δx as the geometric information. Since wiping is performed while measuring the interval δx, it is possible to move the cleaner C11 to almost the end of the window W1.
Note that the control of the hand portion 14-1 is performed such that the hand portion 14-1 is moved while being decelerated as the interval δx, which is the geometric information, decreases.
The upper part of the figure illustrates the transition of the interval δx, and the lower part illustrates the transition of the command value of the moving speed of the hand portion 14-1.
As described with reference to A of the figure, in the period before time t1, the hand portion 14-1 has not yet been moved in the +x direction, and the interval δx remains at its initial value.
A period from time t1 to time t2 is a period in which the hand portion 14-1 moves in the +x direction. In this period, the interval δx gradually decreases from time t1. Furthermore, the command value of the moving speed gradually decreases from time t1.
As described with reference to B of the figure, at time t2 the interval δx becomes sufficiently small, so that the command value of the moving speed becomes 0 and the movement of the hand portion 14-1 in the +x direction is stopped.
As illustrated in A of the figure, slipping may occur between the hand portion 14-1 and the cleaner C11 while the hand portion 14-1 is moved in the +x direction.
In this case, as indicated by the light beam L5 indicated by an alternate long and short dash line, the distance sensor 41L-8, which measured a sensor value within the contact range before the movement, comes to measure a sensor value out of the effective distance range. In a case where the contact position between the cleaner C11 and the hand portion 14-1, that is, the value of max(xc), changes, it is determined that slipping has occurred between the hand portion 14-1 and the cleaner C11 (the task has failed).
In a case where the sensor value of one of the distance sensors 41 that has measured a sensor value within the contact range changes, as illustrated in C of the figure, it is determined that the task has failed, and the cleaner C11 is pressed again by the hand portion 14-1 as a recovery operation.
Note that the control of the hand portion 14 may be performed using not only the sensor values measured by the distance sensors 41 but also a sensor value measured by a vibration sensor or a tactile sensor provided on the hand portion 14. For example, slipping that has occurred between the hand portion 14 and the operation object is detected on the basis of a sensor value measured by a vibration sensor or a tactile sensor, whereby failure of the task is determined. By using the measurement results of a plurality of types of sensors, the robot 1 can improve the accuracy of the failure determination of the task.
Example of Task of Cutting Object
In this example, a task of cutting an object Ob1 placed on the desk D1 using a kitchen knife K1 held by the hand portion 14-1 is executed. The operation object is the kitchen knife K1, and objects included in the environment are the object Ob1 and the desk D1.
After the blade portion of the kitchen knife K1 is brought into contact with the object Ob1, the distance sensor 41-5 at the fingertip is determined as a use sensor, and the distance is measured by the distance sensor 41-5. In the example of A of the figure, as indicated by the light beam L11 indicated by a broken line, the distance from the fingertip to the desk D1 is measured by the distance sensor 41-5.
In a case where the task of cutting the object Ob1 is executed, the progress status of the task is monitored and the hand portion 14-1 is controlled using, as the geometric information, the distance from the blade portion of the kitchen knife K1 to the desk D1 measured by the distance sensor 41-5. For example, in a case where the fingertip is at the same height as the blade portion of the kitchen knife K1, the distance measured by the distance sensor 41-5 is the same as the distance from the blade portion of the kitchen knife K1 to the desk D1 indicated by the bidirectional arrow in A of the figure.
Furthermore, as indicated by the light beams L12 and L13 indicated by broken lines, a plurality of distances to the handle portion of the kitchen knife K1 are measured by the distance sensors 41-0 provided on the base portion 21. As described with reference to the figure above, the distance sensors 41-0 are provided side by side on the upper surface of the base portion 21.
In the task of cutting the object Ob1, the inclination (orientation) of the kitchen knife K1 obtained on the basis of the sensor values measured by the distance sensors 41-0 is used as the geometric information together with the distance to the desk D1. The inclination of the kitchen knife K1 is represented by, for example, a difference between the distance measured by the light beam L12 and the distance measured by the light beam L13.
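As an illustration, the inclination can be recovered from two such sensor values; the baseline between the two footprints is an assumed parameter, not a value of the actual robot 1.

```python
import math

# Sketch: recover the inclination of the handle of the kitchen knife K1
# from two palm sensors 41-0. The baseline between the two footprints is
# an assumed illustrative parameter.

def handle_inclination(d_beam12, d_beam13, baseline):
    """Inclination [rad] of the handle relative to the palm, from the
    difference between the distances along light beams L12 and L13."""
    return math.atan2(d_beam13 - d_beam12, baseline)

# 3 mm difference over a 30 mm baseline -> about 5.7 degrees.
print(math.degrees(handle_inclination(8.0, 11.0, 30.0)))
```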
The positioning control for lowering the kitchen knife K1 is performed such that the distance to the desk D1 decreases. Specifically, the positioning control is performed by moving the hand portion 14-1 in the downward direction indicated by the white arrow in A of the figure.
In a case where the sensor value measured by the distance sensor 41-5 becomes sufficiently small, it is determined that the blade portion of the kitchen knife K1 has completely come into contact with the desk D1 and cannot be lowered any further, and thus it is determined that the task has succeeded.
In addition to the sensor value measured by the distance sensor 41-5, the sensor value measured by the force sensor provided at the wrist portion of the hand portion 14-1 may be used for the success determination of the task. The force sensor provided at the wrist portion of the hand portion 14-1 measures, for example, the reaction force when the blade portion of the kitchen knife K1 hits the desk D1. By using the measurement results by the plurality of types of sensors, the robot 1 can improve the accuracy of the success determination of the task.
In a case where it is determined that the task of cutting the object Ob1 has succeeded, positioning control for separating the kitchen knife K1 from the object Ob1 is performed. Specifically, as illustrated in C of the figure, the hand portion 14-1 is moved upward, so that the kitchen knife K1 is separated from the object Ob1.
Note that the present technology can also be applied to a task of positioning an object using a tool, such as a task of placing an object held by a tong at a specific position, by regarding the end effector to include the tool.
As illustrated in A of the figure, there is a case where the kitchen knife K1 is inclined with respect to the desk D1 while being pushed into the object Ob1.
In this case, as indicated by the light beams L12 and L13 indicated by alternate long and short dash lines, different sensor values are measured by the plurality of distance sensors 41-0. The inclination of the kitchen knife K1 is calculated on the basis of the sensor values measured by the distance sensors 41-0.
The robot 1 adjusts the orientation of pushing the kitchen knife K1 into the object Ob1 on the basis of the inclination of the kitchen knife K1 as the geometric information. Specifically, as indicated by the white arrow in C of the figure, the hand portion 14-1 is moved so as to push the kitchen knife K1 in a direction in which the inclination is canceled.
By adjusting the orientation of the kitchen knife K1, the robot 1 can resume the task of cutting the object Ob1 without performing the task from the beginning again.
Example of Task of Placing Operation Object Between Objects
In this example, a task of placing a book B1 held by the hand portion 14-1 between a book B11 and a book B12 arranged side by side is executed. The operation object is the book B1.
Furthermore, objects included in the environment surrounding the operation object are the book B11 and the book B12.
After the book B1 is held, the distance sensor 41L-5 at the fingertip of the left finger 22L and the distance sensor 41R-5 at the fingertip of the right finger 22R are determined as use sensors, and the distance sensor 41L-5 and the distance sensor 41R-5 measure distances. In the example in B of the figure, the distances from the fingertips to the gap between the book B11 and the book B12 are measured.
In a case where the task of placing the book B1 between the book B11 and the book B12 is executed, an average value of sensor values measured by the distance sensor 41L-5 and the distance sensor 41R-5 is used as geometric information to monitor the progress status of the task and control the hand portion 14-1.
The positioning control for placing the book B1 is performed such that the book B1 is inserted into the gap between the book B11 and the book B12 until the average value of the sensor values measured by the distance sensor 41L-5 and the distance sensor 41R-5 becomes 0.
When the book B1 is inserted, the book B1 may come into contact with the book B11 in the surroundings as illustrated in B of the figure.
In this case, as indicated by the light beam L21 indicated by an alternate long and short dash line, the sensor value measured by the distance sensor 41L-5 becomes a large value. Therefore, the average value of the sensor values measured by the distance sensor 41L-5 and the distance sensor 41R-5 also becomes a large value.
In a case where the average value of the sensor values measured by the distance sensor 41L-5 and the distance sensor 41R-5 becomes large, it is determined that the task has failed. In a case where it is determined that the task has failed, a recovery operation is performed. Specifically, as illustrated in D of the figure, the book B1 is pulled upward out of the gap as the recovery operation.
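A minimal sketch of this monitoring rule follows; the thresholds are illustrative assumptions, not values defined by the present technology.

```python
# Sketch of monitoring the insertion of the book B1: the average of the
# two fingertip sensors (41L-5, 41R-5) should shrink toward 0 as the book
# goes in; a jump above a threshold signals that a surrounding book moved.
# The thresholds are illustrative assumptions.

def insertion_status(d_left, d_right, done_eps=1.0, fail_threshold=30.0):
    avg = (d_left + d_right) / 2.0
    if avg <= done_eps:
        return 'success'       # book B1 fully inserted
    if avg >= fail_threshold:
        return 'failure'       # e.g., book B11 was pushed and moved away
    return 'continue'

print(insertion_status(12.0, 13.0))  # -> 'continue'
print(insertion_status(0.5, 0.8))    # -> 'success'
print(insertion_status(45.0, 20.0))  # -> 'failure'
```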
After performing the recovery operation, the robot 1 can perform the next task, such as retrying the same task.
As described above, the robot 1 can detect an abnormality in a case where an object in the surroundings of the operation object moves unexpectedly.
<5. Modifications>
Regarding Sensor
Although the geometric information is obtained on the basis of the sensor values measured by the distance sensors 41, the geometric information may be obtained on the basis of a measurement result by a different sensor such as a time of flight (ToF) camera or a stereo camera. As described above, various sensors capable of acquiring distance distribution information (distance information of multiple points) can be used for measurement.
Furthermore, the geometric information may be obtained on the basis of a map of the surroundings of the hand portion 14 created by moving the hand portion 14 to perform scanning and merging time-series measurement results of the distance sensors 41.
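A minimal sketch of such merging, assuming a planar (2-D) hand pose and fixed sensor offsets, both simplifications not specified by the present technology:

```python
import math

# Sketch of building a local map by scanning: at each time step the hand
# pose transforms fixed sensor offsets and measured distances into
# world-frame points, which are merged into one point set. The planar
# pose and the ray directions are simplifying assumptions.

def merge_scans(poses, scans, sensor_offsets):
    """poses: list of (x, y, theta) hand poses.
    scans: per-pose lists of measured distances, one per sensor.
    sensor_offsets: per-sensor (offset along the finger, ray direction)."""
    points = []
    for (hx, hy, th), distances in zip(poses, scans):
        for (off, ray), d in zip(sensor_offsets, distances):
            sx = hx + off * math.cos(th)               # sensor position
            sy = hy + off * math.sin(th)
            points.append((sx + d * math.cos(th + ray),  # measured point
                           sy + d * math.sin(th + ray)))
    return points

offsets = [(0.0, -math.pi / 2), (10.0, -math.pi / 2)]  # two downward rays
print(merge_scans([(0.0, 50.0, 0.0), (5.0, 50.0, 0.0)],
                  [[20.0, 20.0], [20.0, 18.0]], offsets))
```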
Therefore, even in a case where the number of the distance sensors 41 provided on the hand portion 14 is small, it is possible to acquire the distribution information of the distance necessary for obtaining the geometric information by moving the positions of the distance sensors 41.
The distance sensors 41 may be provided at portions other than the hand portion 14. For example, in a task of picking up the card C1, the other hand (the hand portion 14-2) is positioned under the desk D1, and positioning control is performed on the basis of sensor values measured by the distance sensors 41 provided on the hand portion 14-2.
In this case, the robot 1 can perform positioning control using the hand portion 14-2 at the same time as positioning the hand portion 14-1 at the initial position. Therefore, the robot 1 can shorten the time required for the task.
In a case where the target position to be the destination of movement of the operation object does not have any shape feature but a mark is provided instead, the operation object may be moved to the target position on the basis of information output from an RGB camera or a color sensor mounted on the hand portion 14.
In the example described below, a task of positioning an object Ob11 held by the hand portion 14-1 at a position P1 on the desk D1 is executed.
In a case where the task of positioning the object Ob11 at the position P1 is executed, first, a target distance from the end of the desk D1 to the position P1, indicated by the bidirectional arrow in A of the figure, is measured.
After the object Ob11 is held, the plurality of distance sensors 41 provided on the arm portion 13-1 is determined as use sensors, and distances are measured by those distance sensors 41. In the example of A of the figure, the distances to the desk D1 are measured by the distance sensors 41 provided side by side on the arm portion 13-1.
In the task of positioning the operation object at the position P1, the arm portion 13-1 is controlled using the distance from the end of the desk D1 to the object Ob11 as the geometric information. The distance from the end of the desk D1 to the object Ob11 is obtained on the basis of sensor values measured by the plurality of distance sensors 41 provided on the arm portion 13-1.
The positioning for positioning the object Ob11 at the position P1 is performed so that the difference between the distance from the end of the desk D1 to the object Ob11 and the target distance becomes small. Specifically, by moving the arm portion 13-1 in the left direction indicated by the white arrow in A of the figure, the object Ob11 is brought closer to the position P1.
In this case, as indicated by the light beams L31 to L33 indicated by broken lines, the distance sensors 41 provided on the arm portion 13-1 measure sensor values indicating the distance to the desk D1. When the difference between the distance from the end of the desk D1 to the object Ob11 and the target distance becomes sufficiently small, the arm portion 13-1 is moved downward so as to place the object Ob11 on the desk D1, so that the object Ob11 is positioned at the position P1.
Regarding Actuator
An actuator other than the electromagnetic motor may be mounted on the end effector. For example, a suction type end effector is mounted on the robot 1. The robot 1 can easily hold a light object such as a card using a suction type end effector.
In the case of holding a heavy object using a suction type end effector, similarly to the case of picking up the card C1, the robot 1 can once move the object to the front side of the desk while sucking the object, and then suck and hold the back of the object with a suction mechanism of another finger portion. By once moving a heavy object to the front side of the desk and then holding the object, the robot 1 can increase the stability of holding.
Regarding Control
The positioning control may be performed on the basis of the sensor values measured by the distance sensors 41 and information measured at the start of the task by the visual sensors 12A (a three-dimensional measuring instrument such as a depth camera). Since the distance sensors 41 are discretely arranged, the accuracy of the geometric information obtained on the basis of the sensor values measured by such distance sensors 41 may be low.
By complementing the distance information between the distance sensors 41 on the basis of the information measured by the visual sensors 12A, the robot 1 can obtain highly accurate geometric information, and the accuracy of positioning control of a small object or an object having a complicated shape can be improved.
The operation of moving the hand portion 14-1 to the initial position may be performed a plurality of times. For example, in a case where the quality of the sensor values measured by the distance sensors 41 is poor (a case where noise is large, a case where there are many measurement omissions, and the like) after the hand portion 14-1 is moved to the initial position, the robot 1 changes the orientation of the hand portion 14-1 from the initial position or moves the hand portion 14-1 to an initial position that is another candidate.
Regarding System Configuration
The system illustrated in the figure is configured by providing the information processing device 51 outside the robot 1, so that the robot 1 is controlled remotely.
Wireless communication of a predetermined standard such as a wireless LAN or long term evolution (LTE) is performed between the robot 1 and the information processing device 51 in this system.
Various types of information such as information indicating the state of the robot 1 and information indicating the detection result of the sensors are transmitted from the robot 1 to the information processing device 51. Information for controlling the operation of the robot 1 and the like are transmitted from the information processing device 51 to the robot 1.
The robot 1 and the information processing device 51 may be directly connected as illustrated in A of the figure, or may be connected via a network such as the Internet as illustrated in B of the figure.
Regarding Computer
The above-described series of processing can be performed by hardware or software. In a case where the series of processing is executed by software, a program implementing the software is installed from a program recording medium to a computer incorporated in dedicated hardware, a general-purpose personal computer, or the like.
A central processing unit (CPU) 1001, a read only memory (ROM) 1002, and a random access memory (RAM) 1003 are connected to each other by a bus 1004.
An input/output interface 1005 is further connected to the bus 1004. An input unit 1006 including a keyboard, a mouse, and the like, and an output unit 1007 including a display, a speaker, and the like are connected to the input/output interface 1005. Furthermore, a storage unit 1008 including a hard disk, a nonvolatile memory, and the like, a communication unit 1009 including a network interface and the like, and a drive 1010 that drives a removable medium 1011 are connected to the input/output interface 1005.
In the computer configured as described above, for example, the CPU 1001 loads a program stored in the storage unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executes the program, so that the above-described series of processing is performed.
The program executed by the CPU 1001 is provided, for example, by being recorded in the removable medium 1011 or via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting, and is installed in the storage unit 1008.
Note that the program executed by the computer may be a program that causes pieces of processing to be performed in time series in the order described in the present specification, or may be a program that causes the pieces of processing to be performed in parallel or at necessary timing such as when a call is made.
In the present specification, a system means a set of a plurality of components (devices, modules (parts), and the like), and it does not matter whether or not all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and one device having one housing in which a plurality of modules is housed are both systems.
Note that the effects described in the present specification are merely examples and are not limited, and other effects may be provided.
Embodiments of the present technology are not limited to the above-described embodiment, and various modifications can be made without departing from the gist of the present technology.
For example, the present technology can have a configuration of cloud computing in which one function is shared and processed in cooperation by a plurality of devices via a network.
Furthermore, steps described in the above-described flowcharts can be performed by one device or can be shared and executed by a plurality of devices.
Moreover, in a case where a plurality of pieces of processing is included in one step, the plurality of pieces of processing included in the one step can be executed by one device or can be shared and executed by a plurality of devices.
Combination Examples of Configurations
The present technology may also have the following configurations.
(1)
An information processing device including: a control unit configured to control a position of a hand portion that is used for execution of a task of moving a target object to a specific position and that has a plurality of sensors provided on a surface, on the basis of positional relationship information indicating a positional relationship between the target object and an object in surroundings of the target object, the positional relationship being estimated on the basis of distances measured by the sensors during movement of the hand portion.
(2)
The information processing device according to (1),
(3)
The information processing device according to (2),
(4)
The information processing device according to (3) further including
(5)
The information processing device according to (4),
(6)
The information processing device according to (5),
(7)
The information processing device according to (6),
(8)
The information processing device according to any one of (2) to (7) further including:
(9)
The information processing device according to (8),
(10)
The information processing device according to any one of (1) to (9) further including
(11)
The information processing device according to (4),
(12)
The information processing device according to (11),
(13)
The information processing device according to (4),
(14)
The information processing device according to (13),
(15)
The information processing device according to (4),
(16)
An information processing method performed by an information processing device, including: controlling a position of a hand portion that is used for execution of a task of moving a target object to a specific position and that has a plurality of sensors provided on a surface, on the basis of positional relationship information indicating a positional relationship between the target object and an object in surroundings of the target object, the positional relationship being estimated on the basis of distances measured by the sensors during movement of the hand portion.
(17)
A program that causes a computer to perform processing of: controlling a position of a hand portion that is used for execution of a task of moving a target object to a specific position and that has a plurality of sensors provided on a surface, on the basis of positional relationship information indicating a positional relationship between the target object and an object in surroundings of the target object, the positional relationship being estimated on the basis of distances measured by the sensors during movement of the hand portion.
Number | Date | Country | Kind
---|---|---|---
2020-131437 | Aug 2020 | JP | national
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2021/027079 | 7/20/2021 | WO |