Priority is claimed on Japanese Patent Application No. 2022-154741, filed Sep. 28, 2022, the content of which is incorporated herein by reference.
The present invention relates to an object gripping method, a storage medium, and an object gripping control device.
A control method has been proposed in which, for example, a state of a target object is estimated from an image captured by a camera attached to a head of a robot having an end effector, and a wrist position obtained from the estimated state of the target object is moved as a target when the robot is caused to perform a task (see Patent Document 1, for example). In gripping work, for example, after gripping, a position of the gripped target object is operated as an end effector position: the fingertip position is fixed with respect to the target object, the fingertip position at a desired target object position is calculated, and the fingertip position is controlled toward that position.
In gripping work, for example, a target object is moved by a finger which first touches the target object, and it is difficult to take a posture with a large angular margin at the time of an operation in advance. In the gripping work, when there are restrictions on an action, it is difficult to move only a position or a posture of the target object while maintaining a geometry (a positional relationship and a force balance) of the gripping. Further, in the gripping work, when force balance is considered on the basis of a target object position, there is a possibility that passivity cannot be guaranteed due to a position measurement error or the like. However, with a technology described in Patent Document 1, these problems cannot be solved.
Aspects of the present invention have been made in view of the above problems, and an object thereof is to provide an object gripping method, a storage medium, and an object gripping control device capable of stably realizing a target object operation after gripping using a multi-degree-of-freedom hand, with robustness against a measurement error.
In order to solve the above problems and achieve the object, the present invention adopts the following aspects.
(1) An object gripping method according to an aspect of the present invention includes: determining a gripping center coordinate system when gripping a target object assumed to be gripped by an end effector having a plurality of fingers for each of a plurality of gripping postures that are able to be taken by the end effector; determining an initial fingertip position from the determined gripping center coordinate system; instructing the end effector to grip the target object at the initial fingertip position, to thereby cause the end effector to grip the target object; fixing a fingertip position of the end effector with respect to the gripping center coordinate system; and determining an operation amount of the gripping center coordinate system according to a desired operation amount of the target object, and operating the target object according to an operation of the gripping center coordinate system.
(2) In the aspect (1), the gripping center coordinate system may be updated according to a position of the fingertip in a case in which the fingertip moves when the target object is operated.
(3) In the aspect (1), the initial fingertip position may be determined so that a variation of a gripping center coordinate system at the time of operating the target object is small.
(4) In any one of the aspects (1) to (3), when the target object is moved, the end effector may be controlled using an integral value of a deviation between a measurement object position based on a measured posture of the target object and a goal object position serving as a goal position of a movement destination of the target object.
(5) In any one of the aspects (1) to (4), when the target object is gripped using three fingers among the plurality of fingers included in the end effector, a gripping center position in the gripping center coordinate system may be a center of a circumscribed circle of a triangle connecting fingertip centers of the plurality of fingers included in the end effector, or a centroid of the triangle connecting the fingertip centers of the plurality of fingers included in the end effector.
(6) In the aspect (5), the center of the circumscribed circle of the triangle connecting the fingertip centers of the plurality of fingers may be selected when a distance between an index finger and a middle finger included in the end effector is smaller than a predetermined distance, and the centroid of the triangle connecting the fingertip centers of the plurality of fingers may be selected when a distance between the index finger and the middle finger included in the end effector is larger than the predetermined distance.
(7) In any one of the aspects (1) to (6), a gripping center position in the gripping center coordinate system may be a center of a circumscribed circle of a right triangle formed by a perpendicular line of a palm reference point and a fingertip of an index finger when gripping is performed using the index finger among the plurality of fingers included in the end effector and a palm.
(8) A storage medium according to an aspect of the present invention stores a program causing a computer to execute: determining a gripping center coordinate system when gripping a target object assumed to be gripped by an end effector having a plurality of fingers for each of a plurality of gripping postures that are able to be taken by the end effector; determining an initial fingertip position from the determined gripping center coordinate system; instructing the end effector to grip the target object at the initial fingertip position, to thereby cause the end effector to grip the target object; fixing a fingertip position of the end effector with respect to the gripping center coordinate system; and determining an operation amount of the gripping center coordinate system according to a desired operation amount of the target object, and operating the target object according to an operation of the gripping center coordinate system.
(9) An object gripping control device according to an aspect of the present invention includes: a real position correction unit configured to determine a gripping center coordinate system when gripping a target object assumed to be gripped by an end effector having a plurality of fingers for each of a plurality of gripping postures that are able to be taken by the end effector, and determine an initial fingertip position from the determined gripping center coordinate system; and a hand control unit configured to instruct the end effector to grip the target object at the initial fingertip position, to thereby cause the end effector to grip the target object, fix a fingertip position of the end effector with respect to the gripping center coordinate system, determine an operation amount of the gripping center coordinate system according to a desired operation amount of the target object, and operate the target object according to an operation of the gripping center coordinate system.
According to the aspects (1) to (9), it is possible to stably realize a target object operation after gripping using a multi-degree-of-freedom hand, with robustness against a measurement error.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the drawings used for the following description, a scale of each member is appropriately changed so that the member has a recognizable size.
In all drawings for describing the embodiments, components having the same functions are denoted by the same reference numerals, and repeated description thereof will be omitted.
“On the basis of XX” in the present application means “based on at least XX,” and includes a case of being based on another element in addition to XX. “On the basis of XX” is not limited to a case in which XX is used directly, and also includes a case of being based on something obtained by performing calculation or processing on XX. “XX” is an arbitrary element (for example, arbitrary information).
First, an example of a problem in a gripping action will be described.
Scene 1 is a work start state, and is a state in which a hexagonal screw is set near a screw hole and is extended.
Scene 2 is a state in which the hexagonal screw is about to be gripped. In this case, there is a problem that, for example, even when an imaging device is attached to a head of the robot, the hexagonal screw is blocked by the hand and a posture before gripping cannot be checked.
Scene 3 is a state in which the hexagonal screw is gripped and then moved to the screw hole. In this case, there is a problem that the screw is not inserted into the screw hole unless the screw is brought upright in the hand and brought close to the screw hole.
Scene 4 is a state in which the hexagonal screw is inserted into the screw hole and the screw tightening is performed. In this case, there is a problem that the screw cannot be manually turned and tightened unless the finger is moved to roll a contact point with respect to a screw center.
Scene 5 is a state in which screw tightening is performed.
The task or problem illustrated in
In gripping work, there are also the following problems, in addition to the above problem.
1. A target object is moved by a finger which first touches the target object.
2. It is difficult to take a posture with a large angular margin at the time of an operation in advance.
3. When there are restrictions on an action, it is difficult to move only a position or a posture of the target object while maintaining a geometry (a positional relationship and a force balance) of the gripping.
4. Considering force balance based on the position of the target object, there is a possibility that passivity cannot be guaranteed due to a position measurement error or the like.
Next, an overview of classification of gripping postures (taxonomy) of a person will be described.
It is said that actions of a person regarding gripping or operating an object can be broadly divided into three aspects including order in which parts are gripped, object gripping, and object operation. A finger shape in this case is selected depending on a geometry of the object or a task after gripping.
In object gripping or operation, gripping is classified depending on a required task, or a distribution or size of a gripping record (see Reference 1, for example).
The classification of gripping of the taxonomy illustrated in
Reference 1: Thomas Feix, Javier Romero, et al., "The GRASP Taxonomy of Human Grasp Types," IEEE Transactions on Human-Machine Systems, Vol. 46, Issue 1, February 2016, IEEE, pp. 66-77
Next, a configuration example of an object gripping system will be described.
The object gripping control device 1 includes, for example, a fingertip control unit 11, a real position correction unit 12, a calculation unit 13, a hand control unit 14, a joint angle conversion unit 15, and a storage unit 16.
The robot 2 includes an end effector 21, a photographing unit 22, and a sensor 23, for example.
An operator may remotely operate the robot 2. In such a case, the operator wears, for example, a head-mounted display (HMD) including a display device, a line-of-sight detection sensor, and the like on a head, and wears, for example, a data glove that detects a position or movement of the fingertips or the like on the hand. The end effector 21 is attached to an arm, for example, and has five fingers, for example. The number of fingers is not limited thereto, and may be four or the like. The end effector 21 includes actuators that move joints or fingers.
The photographing unit 22 includes, for example, an RGB sensor that captures an RGB (red-green-blue) image, and a depth sensor that captures a depth image including depth information. The photographing unit 22 is also attached to the head of the robot 2. Further, the photographing unit 22 may also be installed in a work environment.
The sensor 23 detects, for example, force of the fingertip and an angle of the fingertip.
The real position correction unit 12 corrects a real position on the basis of the image captured by the photographing unit 22 included in the hand (the end effector 21). The real position correction unit 12 determines the taxonomy on the basis of a joint angle, instruction content of the operator during execution, and the like. The real position correction unit 12 may perform acquisition from the image captured by the photographing unit 22 attached to the head of the robot 2, for example.
The real position correction unit 12 includes, for example, a target object goal position setting unit 121, a target object position detection unit 122, a calculation unit 123, and an integrator 124.
The target object goal position setting unit 121 sets a goal position of the target object using the image captured by the photographing unit 22 included in the hand. For example, when the target object is a screw, the goal position is a position of a hole that the screw enters, as illustrated in
The target object position detection unit 122 recognizes the target object using the image captured by the photographing unit 22 included in the hand, and detects a position (including a posture) of the recognized target object.
The calculation unit 123 subtracts the position of the target object detected by the target object position detection unit 122 from the goal position set by the target object goal position setting unit 121 to correct real position information of the target object. In other words, the calculation unit 123 calculates a deviation between the goal object position and the target object position, as will be described below.
The integrator 124 integrates the deviation output by the calculation unit 123. The integrator 124 multiplies an integral value by a gain on the basis of information stored in the storage unit 16.
The storage unit 16 stores a gripping center coordinate system for each taxonomy. The storage unit 16 stores a predetermined gain. The storage unit 16 may store gains for each taxonomy, for example. The storage unit 16 stores a shape and a reference position (measurement object position) in association with each other for each gripping. The object gripping control device 1, for example, may measure a shape and weight of the target object on the basis of a position of each finger at the time of first gripping, and a detection value of the sensor 23 when the object has been gripped.
The fingertip control unit 11 acquires a command value from an operator who operates the robot 2. The fingertip control unit 11 refers to the acquired command value and the information stored in the storage unit 16 to generate a fingertip command value for controlling the fingertip at the time of an in-hand operation. A control method may be feedback (FB) control or feedforward (FF) control. The fingertip control unit 11 generates the fingertip command value according to a goal posture of the gripping target object. The fingertip control unit 11 corrects a goal fingertip position according to a goal movement amount of the gripping target object.
The calculation unit 13 adds the corrected real position information output by the real position correction unit 12 to the corrected goal fingertip position output by the fingertip control unit 11, and outputs a conversion result as the fingertip command value.
The hand control unit 14 controls the hand on the basis of the fingertip command value output by the calculation unit 13 and a fingertip force and finger angle information output by the robot 2. The hand control unit 14 includes, for example, a fingertip compliance control unit 141, a contact point estimation unit 142, and a gripping force distribution calculation unit 143.
The fingertip compliance control unit 141 performs fingertip compliance control of the end effector 21 on the basis of the fingertip command value output by the calculation unit 13 and a contact force command value output by the gripping force distribution calculation unit 143.
The contact point estimation unit 142 estimates a contact point between the finger and the target object on the basis of the fingertip force and the finger angle information output by the robot 2.
The gripping force distribution calculation unit 143 calculates a distribution of the gripping force to generate the contact force command value, on the basis of the estimated contact point information output by the contact point estimation unit 142.
The joint angle conversion unit 15 performs inverse kinematics calculation to generate a finger angle command value on the basis of the fingertip command value output by the hand control unit 14.
Next, a configuration example of the end effector will be described.
Each of the finger portions 101 to 104 includes a force sensor at a fingertip, for example. The finger portion 101 corresponds to, for example, a thumb of the person, the finger portion 102 corresponds to, for example, an index finger of the person, the finger portion 103 corresponds to, for example, a middle finger of the person, and the finger portion 104 corresponds to, for example, a ring finger of the person. The arm 171 is also a mechanism portion that can change a wrist posture of the end effector 21.
The cameras 131 to 134, 181 to 184, and 161 are RGB-D cameras that can obtain RGB information and depth information. Alternatively, the cameras 181 to 184 may each be a combination of a charge coupled device (CCD) imaging device or a complementary MOS (CMOS) imaging device and a depth sensor, for example.
The camera 131 is installed, for example, at a position corresponding to a thumb ball of the person. The camera 131 may be installed, for example, at a position corresponding to the outer side rather than an index finger side of a base joint part of the thumb of the person.
The camera 132 is installed, for example, at a position corresponding to a thumb ball of the person. The camera 132 may be installed at a position corresponding to the index finger side of the base joint part of the thumb of the person.
The camera 133 is installed, for example, at a position corresponding to a side of four finger bases of the person on the thumb side, and sides of a thumb ball portion on the thumb and the index finger side.
The camera 134 is installed, for example, at a position corresponding to a side of the four finger bases of the person on the ring finger side, and a side of a little finger ball portion. The end effector 21 may include at least one of the cameras 131 to 134 on the palm.
The camera 181 is installed, for example, at a position corresponding to a fingertip of the thumb of the person. The camera 181 may be installed in a region including a fingertip, a distal joint portion, and a distal joint corresponding to the thumb of the person.
The camera 182 is installed, for example, at a position corresponding to a fingertip of the index finger of the person. The camera 182 may be installed in a region including a fingertip, a distal joint portion, and a distal joint corresponding to the index finger of the person.
The camera 183 is installed, for example, at a position corresponding to the fingertip of the middle finger or ring finger of the person. The camera 183 may be installed in a region including a fingertip, a distal joint portion, and a distal joint corresponding to the middle finger or ring finger of the person.
The camera 184 is installed, for example, at a position corresponding to a fingertip of the little finger or ring finger of the person. The camera 184 may be installed in a region including a fingertip, a distal joint portion, and a distal joint corresponding to the little finger or ring finger of the person.
The cameras 181 to 184 are installed near contact points of the fingers (places at which the finger portions touch the target object for a gripping purpose).
The camera 161 is installed, for example, at a position of the wrist.
6-axis sensors, position sensors, and the like are provided at the fingertips, joints, wrists, and the like. The fingertips include force sensors.
Here, a coordinate system, definitions of terms, and the like in the embodiment are described.
1) A concept of the gripping center is called a coordinate system because there is not only a position but also an orientation.
2) Even when the gripping center is simply described, the gripping center is used in the sense of the gripping center coordinate system including an orientation.
In the present embodiment, the gripping center according to the gripping taxonomy is defined in conjunction with the fingertip position as follows.
3) The gripping center coordinate system is defined for each taxonomy. However, the gripping center coordinate system is only an ideal value. The gripping center coordinate system is used for a wrist position determination or initial posture calculation, but is not used thereafter.
4) The gripping center coordinates in 3) are finely adjusted according to an actual task, and the fingertip position is determined accordingly.
5) At an initial stage, the ideal (nominal) gripping center coordinate system set in 4) substantially matches a gripping center coordinate system calculated from the finger positions. However, since each finger stops on a surface of the object depending on a shape of the target object, the finger deviates from the ideal position. Therefore, in control, the coordinate system calculated from the fingertip positions is always used.
In the present embodiment, for example, the following control is performed.
1. A gripping center position corresponding to a gripping taxonomy is defined in conjunction with the fingertip position, and an initial fingertip position is determined so that a variation of the position is minimized with respect to the movement of the fingertip. The initial fingertip position at this stage is a position when the hand is brought closer to the target object.
2. After the fingertip touches the target object, the fingertip position is set with respect to the gripping center, that is, the balance center of the gripping force in the gripping center coordinate system, so that passivity can be maintained even when the fingertip moves. By fixing the positions of the fingertips to the gripping center and operating the gripping center, the passivity can be ensured during the operation without hindering the contact.
3. An operation amount of the gripping center is determined depending on a desired target object operation amount, so that the target object is operated without collapsing stable gripping.
4. A visual error captured by the photographing unit 22 is corrected through integral control.
Thus, when the position of the fingertip is determined, control is performed so that the gripping center position is determined. Therefore, even when the position of the target object or a change in the position cannot be observed from an image or the like, the gripping can be maintained. It is also possible to obtain positions to which the fingertips should be moved for gripping by determining the gripping center position first. Since the gripping center position is determined, it is possible to calculate the force for balancing each fingertip. Since the gripping center position is in the middle of the fingertip as will be described below, that is, the fingertip is disposed to oppose the gripping center in a well-balanced manner, control for returning can be performed even when there is a disturbance. At the time of an operation, an operation amount is determined with a center of the operation set as the gripping center position so that the target object can be operated while stability is being maintained. Further, an error of a position of the gripping center is integrated so that a recognized error can be corrected.
A main processing procedure in the embodiment is as follows.
I. A gripping center coordinate system for each taxonomy determined in advance is called according to the task.
II. The gripping center coordinate system and the fingertip positions at an initial stage are calculated on the basis of I.
III. A command to grip at the fingertip position of II is issued; however, since the actual gripping place may differ, the fingertip is fixed at the actual gripping place and the gripping center coordinate system is calculated again from there.
IV. After III, the fingertips do not move, and a relationship between the gripping center and the fingertip remains unchanged. When the object is operated, a movement occurs for each gripping center.
V. When the fingertips move during the operation (power grip, or the like), the operation is performed while updating the gripping center coordinate system each time.
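The procedure I to V above can be sketched as follows. The data structures, function names, and numerical fingertip offsets are illustrative assumptions made for this example, not the original implementation:

```python
import numpy as np

# I. Gripping center coordinate system predetermined per taxonomy
#    (only a nominal center position is stored in this sketch).
NOMINAL_CENTERS = {"precision-3finger": np.array([0.0, 0.0, 0.10])}

def initial_fingertips(taxonomy):
    # II. Fingertip positions computed around the nominal gripping center
    #     (the offsets here are hypothetical placement values).
    c = NOMINAL_CENTERS[taxonomy]
    offsets = [np.array([0.03, 0.0, 0.0]),
               np.array([-0.015, 0.026, 0.0]),
               np.array([-0.015, -0.026, 0.0])]
    return [c + o for o in offsets]

def recompute_center(fingertips):
    # III. After contact, the gripping center is recalculated from the
    #      actual fingertip positions (sketched here as their mean).
    return np.mean(fingertips, axis=0)

def operate(fingertips, motion):
    # IV. Fingertips are fixed relative to the gripping center, so an
    #     object operation moves every fingertip by the center's motion.
    #     (V: if a fingertip slips, recompute_center is called again.)
    return [p + motion for p in fingertips]
```

For instance, translating the gripping center translates all fingertips identically, so the fingertip-to-center geometry, and hence the grip, is preserved.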
Next, an example of a method of calculating the gripping center position in the embodiment will be described.
First, a method of calculating the gripping center in the case of precision gripping with three or more fingers will be described.
A reference sign g31 represents a triangle connecting fingertip centers of the three fingers, and a reference sign g32 represents a centroid of the triangle g31. In this calculation scheme, when the triangle is an equilateral triangle, the centroid of the gripping target matches the gripping center, but when the triangle is another triangle, the centroid of the gripping target does not match the gripping center. However, the gripping center is always inside the triangle.
Each gripping center position described in
Therefore, in the present embodiment, the gripping center position in the precision gripping is calculated as a center position of the circumscribed circle when the distance between the index finger and the middle finger is small, and is obtained as a centroid position of the triangle when the distance between the index finger and the middle finger is large.
For example, when gripping is performed with four fingers, the three dominant fingers for gripping may be selected from among the four fingers on the basis of a taxonomy, for example, and the gripping center position may be determined on the basis of a triangle formed by the selected three fingers and a circumscribed circle of the triangle. Alternatively, when gripping is performed with the four fingers, the gripping center position may be determined on the basis of a polygon formed by the four fingers or a circumscribed circle of the polygon.
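The selection between the circumscribed-circle center and the triangle centroid described in aspects (5) and (6) can be sketched as follows; the function names and the threshold parameter are illustrative assumptions:

```python
import numpy as np

def circumcenter(a, b, c):
    # Center of the circumscribed circle of triangle abc (2-D),
    # via the standard determinant formula.
    ax, ay = a; bx, by = b; cx, cy = c
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return np.array([ux, uy])

def centroid(a, b, c):
    # Centroid of the triangle connecting the fingertip centers.
    return (np.asarray(a) + np.asarray(b) + np.asarray(c)) / 3.0

def precision_gripping_center(thumb, index, middle, threshold):
    # Circumcenter when the index-middle distance is small,
    # centroid when it is large; `threshold` is a tuning value.
    if np.linalg.norm(np.asarray(index) - np.asarray(middle)) < threshold:
        return circumcenter(thumb, index, middle)
    return centroid(thumb, index, middle)
```

For an equilateral triangle the two candidates coincide, so the selection switches continuously near that configuration.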
Next, a method of calculating the gripping center in the case of power gripping in which a palm is also used to grip will be described.
A point g41 is a reference point of the palm. A point g45 is a center of a circumscribed circle g44 of an isosceles right triangle g43 formed by a perpendicular line g42 of the reference point g41 of the palm and the fingertip of the index finger. Thus, in the present embodiment, the gripping center in the power gripping is determined from the palm and the fingertip of the index finger. This makes it possible to continuously calculate the center position according to a size of the circle even when the position of the index finger is moved at the time of gripping.
A position of the reference point is a position closer to the index finger in the case of light-tool (power gripping of a light target object), and is a position closer to the thumb in the case of power-sphere (power grip of a sphere) or medium-wrap (power grip in a wrapping state).
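Because the circumcenter of a right triangle lies at the midpoint of its hypotenuse, the power-grip center above reduces to the midpoint of the segment joining the palm reference point and the index fingertip. A minimal sketch (function name is an assumption):

```python
import numpy as np

def power_grip_center(palm_ref, index_tip):
    # The right angle is at the foot of the perpendicular from the palm
    # reference point, so the palm-reference-to-index-fingertip segment is
    # the hypotenuse; its midpoint is the circumscribed circle's center.
    return (np.asarray(palm_ref, dtype=float)
            + np.asarray(index_tip, dtype=float)) / 2.0
```

This form makes the continuity property stated above explicit: the center varies smoothly as the index fingertip moves during gripping.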
[Robustness Against Deviation]
Here, the robustness against deviation when the gripping center position obtained by using a scheme of the present embodiment is used will be described. In the present embodiment, when the fingertips move at the time of operating the target object due to the correction of the fingertip control unit 11, the gripping center coordinate system is updated according to the positions of the fingertips.
Next, a case in which the gripping fingers are switched will be described.
In this case, a gripping center position g71 of the images g70 and g80 is a center position of a circumscribed circle of a triangle formed by the thumb g11, the index finger g12, and the middle finger g13, and a gripping center position g72 of the image g90 is a center position of the circumscribed circle of the triangle formed by the thumb g11, the middle finger g13, and the ring finger g14.
Thus, according to the present embodiment, even when the fingers are switched, the gripping center position based on the center of the circumscribed circle and the centroid of the triangle described above is subjected to weighted averaging according to a transition of the gripping force, for example, so that the fingers can be moved continuously. That is, according to the present embodiment, the gripping center position is changed continuously so that the weight distribution automatically becomes small. According to the present embodiment, a force command can be set to 0 (zero) when internal force distribution is performed without involvement of the finger after the weight distribution becomes sufficiently small. According to the present embodiment, the fingers can be released smoothly because they are no longer involved in the balance at this stage.
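The weighted averaging during finger switching can be sketched as a simple blend; the weight schedule driven by the gripping-force transition is an assumption of this example:

```python
import numpy as np

def blended_center(center_old, center_new, w):
    # Blend the old and new gripping center positions with weight
    # w in [0, 1], where w follows the transition of the gripping
    # force from the released finger to the newly engaged finger.
    w = float(np.clip(w, 0.0, 1.0))
    return (1.0 - w) * np.asarray(center_old) + w * np.asarray(center_new)
```

At w = 0 the old center (e.g., thumb/index/middle) is used; at w = 1 the new center (e.g., thumb/middle/ring) is used, so the commanded center never jumps.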
Next, control when a position of the gripping target deviates will be described.
In this case, the deviation of the position is a difference between the goal object position g113 and the measurement object position g112.
As described above, the gripping center position g111 is determined from the fingertip. The deviation between the gripping center position g111 and the measurement object position g112 can be calculated using measurement results. A relationship between the gripping center position and the measurement object position does not change in a period in which the target object is stably gripped. Therefore, when the deviation between the measurement object position and the goal object position is known, this may be converted into the gripping center position. Further, a relationship between the gripping center position and the fingertip does not change as long as the target object is stably gripped.
For example, the deviation of the position of the target object is expressed by the following equation (1).
[Math. 1]
(Deviation of position of target object)=(goal object position)−(measurement object position) (1)
The gripping center position can be expressed as in the following equation (2). The gain is set in advance depending on, for example, a size, weight, taxonomy, or the like of the target object and stored in the storage unit 16.
[Math. 2]
(gripping center position)=(integral value of deviation of position of target object)×(gain) (2)
In the present embodiment, the gripping center position can be used as a fingertip position command.
This makes it possible to eliminate the position deviation of the target object while maintaining the stability of the target object and move the target object to a desired position while gripping the target object.
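Equations (1) and (2) together describe an integral controller on the object position deviation. A discrete-time sketch (class name, time step, and gain value are illustrative assumptions):

```python
import numpy as np

class DeviationIntegrator:
    # Integrates the deviation of equation (1) and scales it by the
    # pre-stored gain as in equation (2) to produce the gripping center
    # position used as the fingertip position command.
    def __init__(self, gain, dt):
        self.gain = gain        # gain stored per object size/weight/taxonomy
        self.dt = dt            # control period
        self.integral = np.zeros(3)

    def update(self, goal_pos, measured_pos):
        # Eq. (1): deviation = goal object position - measurement object position
        deviation = np.asarray(goal_pos) - np.asarray(measured_pos)
        self.integral = self.integral + deviation * self.dt
        # Eq. (2): gripping center position = integral of deviation x gain
        return self.gain * self.integral
```

Because the correction accumulates only while a deviation persists, a constant measurement offset is driven out over time rather than acted on instantaneously, which is what gives the scheme its robustness against measurement error.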
The fingertip control unit 11 performs the above-described calculation of the gripping center position. The real position correction unit 12 performs calculation of the deviation between the goal object position and the target object position.
Next, an example of a control processing procedure performed by the object gripping control device 1 will be described.
(Step S1) The fingertip control unit 11 acquires an operation instruction from the operator. The fingertip control unit 11 determines detection of the target object and the taxonomy on the basis of the image captured by the photographing unit 22, the information stored in the storage unit 16, and the acquired operation instruction. The object gripping control device 1 determines a gripping center coordinate system (the gripping center position and the orientation) when gripping a target object assumed to be gripped by the end effector 21 having a plurality of fingers for each of a plurality of gripping postures (taxonomies) that can be taken by the end effector 21. These pieces of information are stored in the storage unit 16 in advance, for example.
(Step S2) The fingertip control unit 11 determines the initial fingertip position on the basis of the acquired operation instruction, the determined taxonomy, and the gripping center coordinate system. That is, the fingertip control unit 11 determines the initial fingertip position from the acquired operation instruction and the information stored in the storage unit 16.
(Step S3) The hand control unit 14 instructs the end effector 21 to grip the target object from the initial fingertip position, for example, according to an operation of the operator or an autonomous operation, and grips the target object.
(Step S4) The hand control unit 14 fixes the fingertip position of the end effector 21 with respect to the gripping center coordinate system.
(Step S5) The hand control unit 14 determines an operation amount of the gripping center coordinate system according to a desired operation amount of the target object, and operates the target object according to an operation of the gripping center coordinate system.
The object gripping control device 1 repeats the above process to perform control of gripping of the target object, the balance of the target object after gripping, movement of the target object, movement of the fingertips, and the like.
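Steps S1 to S5 above can be sketched as a single pass of the control procedure. All function bodies below are illustrative stand-ins (the actual detection, gripping, and operation are performed by the units described in the embodiment), and the names and values are assumptions for illustration only.

```python
def detect_target(image, instruction):
    """S1: stand-in for detecting the target object and determining the
    taxonomy and gripping center coordinate system from stored information."""
    taxonomy = "precision_pinch"   # hypothetical taxonomy label
    grip_frame = {"position": (0.0, 0.0, 0.1), "orientation": (0.0, 0.0, 0.0)}
    return taxonomy, grip_frame

def control_steps(image, instruction, desired_operation):
    """One pass of steps S1-S5, with stand-in logic at each step."""
    taxonomy, grip_frame = detect_target(image, instruction)    # S1
    fingertips = grip_frame["position"]                         # S2: initial fingertip position
    gripped = True                                              # S3: grip from the initial position
    fixed_frame = grip_frame if gripped else None               # S4: fix fingertips to the frame
    # S5: operate the gripping center coordinate system by the desired amount;
    # the fingertips, being fixed to the frame, follow this operation.
    new_position = tuple(p + d for p, d in
                         zip(fixed_frame["position"], desired_operation))
    return taxonomy, new_position
```

The key structural point mirrored here is step S4: once the fingertip positions are fixed with respect to the gripping center coordinate system, operating that single frame (S5) moves the whole grip as a unit.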
Next, an example of a finger angle command value generation processing procedure performed by the object gripping control device 1 will be described.
(Step S11) The real position correction unit 12 uses the image captured by the photographing unit 22 to detect the position (measurement object position) and orientation of the target object.
(Step S12) The real position correction unit 12 uses the image captured by the photographing unit 22 to set the goal position (goal object position) and the orientation of the target object.
(Step S13) The real position correction unit 12 calculates an amount of deviation between the goal object position and the measurement object position. The real position correction unit 12 integrates the amount of deviation and multiplies a result thereof by the gain stored in the storage unit 16 to obtain the gripping center position.
(Step S14) The fingertip control unit 11 estimates the goal position and posture of the gripping center on the basis of the taxonomy. The fingertip control unit 11 corrects the goal fingertip position included in the acquired operation instruction according to the goal position and posture of the gripping center.
(Step S15) The calculation unit 13 adds the value output by the real position correction unit 12 to the value output by the fingertip control unit 11. For example, the calculation unit 13 performs a coordinate transformation of the relative movement amount to convert a change in the goal object position and posture into a change in the gripping center, and adds the results together.
(Step S16) The hand control unit 14 controls the hand on the basis of the fingertip command value output by the calculation unit 13.
(Step S17) The joint angle conversion unit 15 performs inverse kinematics calculation to generate a finger angle command value on the basis of the fingertip command value output by the hand control unit 14.
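Steps S11 to S17 above can be sketched as one cycle of command value generation: integrate the deviation (S13), correct the goal fingertip position (S14 and S15), and convert to a joint angle (S16 and S17). The function name is an assumption, and the final inverse kinematics step is a trivial single-joint stand-in rather than the actual kinematics of the end effector 21.

```python
import numpy as np

def finger_angle_command(goal_obj_pos, measured_obj_pos,
                         goal_fingertip, integral, gain, dt):
    """One cycle of steps S11-S17 (illustrative sketch)."""
    # S13: deviation between goal object position and measured object position,
    # integrated and multiplied by the gain to obtain the gripping center position
    deviation = goal_obj_pos - measured_obj_pos
    integral = integral + deviation * dt
    grip_center = gain * integral
    # S14-S15: correct the goal fingertip position by the gripping center change
    corrected_fingertip = goal_fingertip + grip_center
    # S16-S17: stand-in inverse kinematics for a single planar joint;
    # the real device would solve the full kinematics of the hand
    angle = np.arctan2(corrected_fingertip[1], corrected_fingertip[0])
    return angle, integral
```

The integral state is returned so the caller can carry it across control cycles, matching the repeated execution of the procedure described above.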
The processing procedure illustrated in the figure is an example.
An example in which the robot 2 is caused to perform screw tightening work will be described with reference to
In an initial state, a screw that is the target object is placed vertically (g201).
Next, the target object is gripped with fingertips. In this case, the posture (position) of the target object is tilted according to the fingertip grip positions (g202).
Next, the target object is moved to a screw hole while the position of the target object is being corrected (g203).
Next, the target object is inserted into the screw hole (g204).
Next, actions of the fingertips are controlled so that the screw is turned and the screw tightening work is performed (g205).
Next, the fingertips are removed from the target object in order to change angles or positions of the fingers to further perform screw tightening (g206).
Next, the target object is gripped with the fingertips again, and the target object is turned so that the screw tightening is performed (g207 and g208).
Then, the actions of g205 to g208 are repeated to perform the screw tightening work.
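The repeated actions of g205 to g208 form a regrip-and-turn loop, which can be sketched as follows. The callback names and the completion check are assumptions for illustration; the actual actions are performed by the fingertip control described above.

```python
def screw_tightening(turn, release, regrip, tightened):
    """Repeat the g205-g208 cycle until the screw is tightened.
    turn/release/regrip are stand-in callbacks for the fingertip actions;
    tightened(turns) is a stand-in completion check."""
    turns = 0
    while not tightened(turns):
        turn()      # g205: turn the screw with the fingertips
        release()   # g206: remove the fingertips to change finger angles/positions
        regrip()    # g207-g208: grip the target object again and turn it
        turns += 1
    return turns
```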
In a confirmation, a hexagonal screw standing vertically near a screw hole was photographed using the photographing unit 22 included in the hand, and it was confirmed that changing the screw posture, inserting the screw into the screw hole, and tightening the screw could be realized through the control based on the gripping center position described above.
With such control, according to the present embodiment, the gripping center position determined from the fingertips is obtained, and the balance of the object and the trajectory of the fingertip positions are controlled with reference to the obtained gripping center position. This enables control and work that are robust against errors while continuous control is performed.
A program for realizing some or all of functions of the object gripping control device 1 of the present invention may be recorded in a computer-readable recording medium, and the program recorded in this recording medium may be read into a computer system and executed to perform all or some of the processing performed by the object gripping control device 1. The “computer system” described herein includes an OS or hardware such as a peripheral device. Further, the “computer system” also includes a WWW system including a homepage providing environment (or display environment). Further, the “computer-readable recording medium” refers to a portable medium such as a flexible disk, a magneto-optical disc, a ROM, or a CD-ROM, or a storage device such as a hard disk built into the computer system. Further, the “computer-readable recording medium” may also include a recording medium that holds a program for a certain period of time, such as a volatile memory (RAM) inside a computer system including a server and a client when the program is transmitted over a network such as the Internet or a communication line such as a telephone line. Further, the program may be transmitted from a computer system in which the program is stored in a storage device or the like to another computer system via a transmission medium or by transmission waves in the transmission medium. Here, the “transmission medium” for transmitting the program refers to a medium having a function of transmitting information such as a network (a communication network) such as the Internet or a communication line such as a telephone line. Further, the program may be a program for realizing some of the above-described functions. Further, the program may be a so-called difference file (difference program) that can realize the above-described functions in combination with a program already recorded in a computer system.
Although a mode for carrying out the present invention has been described above using the embodiment, the present invention is not limited to the embodiment at all, and various modifications and substitutions can be made without departing from the gist of the present invention.
Number | Date | Country | Kind |
---|---|---|---|
2022-154741 | Sep 2022 | JP | national |