The present technology relates to a control device, a control method, and a program, and more particularly, to a control device, a control method, and a program which enable easy detection of contact of an object with a predetermined surface in a case where the object is gripped and arranged on the predetermined surface.
In conventional robot systems, when an object is gripped by a robot hand and the gripped object is arranged on a predetermined surface, a reaction force received from the surface on which the object is to be placed is calculated so as not to give an impact to the object, and the object is released from the robot hand when the reaction force exceeds a threshold (see, for example, Patent Document 1).
Furthermore, there is also a robot system that acquires three-dimensional shape information of a conveyance target from three-dimensional information of a mountain before and after gripping and three-dimensional information of the conveyance target in order to suitably perform arrangement even in a case where a three-dimensional size of the conveyance target is unknown when one object is gripped as the conveyance target by a gripping device from the mountain of a plurality of stacked objects and is arranged on a predetermined surface (see, for example, Patent Document 2).
However, the robot systems described above need to perform complicated processing such as the calculation of the reaction force and the acquisition of the three-dimensional shape information in order to grip an object and suitably arrange the object on a predetermined surface.
Therefore, it is desired to easily detect contact of an object with a predetermined surface and suitably and easily arrange the object in a case where the object is gripped and arranged on the predetermined surface.
The present technology has been made in view of such a situation, and enables easy detection of contact of an object with a predetermined surface in a case where the object is gripped and arranged on the predetermined surface.
A control device according to one aspect of the present technology is a control device including a detection unit that detects contact of an object with a predetermined surface, in a case where the object, gripped by a first finger portion including an ultrasonic transmitter that generates an ultrasonic wave and a second finger portion including an ultrasonic receiver that receives the ultrasonic wave output from the ultrasonic transmitter, is to be arranged on the predetermined surface, on the basis of a sound pressure of the ultrasonic wave received by the ultrasonic receiver.
A control method according to one aspect of the present technology is a control method including detecting, by a control device, contact of an object with a predetermined surface, in a case where the object, gripped by a first finger portion including an ultrasonic transmitter that generates an ultrasonic wave and a second finger portion including an ultrasonic receiver that receives the ultrasonic wave output from the ultrasonic transmitter, is to be arranged on the predetermined surface, on the basis of a sound pressure of the ultrasonic wave received by the ultrasonic receiver.
A program according to one aspect of the present technology is a program for causing a computer to function as a detection unit that detects contact of an object with a predetermined surface, in a case where the object, gripped by a first finger portion including an ultrasonic transmitter that generates an ultrasonic wave and a second finger portion including an ultrasonic receiver that receives the ultrasonic wave output from the ultrasonic transmitter, is to be arranged on the predetermined surface, on the basis of a sound pressure of the ultrasonic wave received by the ultrasonic receiver.
In one aspect of the present technology, in a case where an object, gripped by a first finger portion including an ultrasonic transmitter that generates an ultrasonic wave and a second finger portion including an ultrasonic receiver that receives the ultrasonic wave output from the ultrasonic transmitter, is to be arranged on the predetermined surface, contact of the object with the predetermined surface is detected on the basis of a sound pressure of the ultrasonic wave received by the ultrasonic receiver.
Hereinafter, modes for carrying out the present technology (hereinafter, referred to as embodiments) will be described. Note that the description will be given in the following order.
Note that the same or similar portions are denoted by the same or similar reference signs in the drawings referred to in the following description. However, the drawings are schematic, and the relationship between the thickness and the plane dimension or the like is different from the actual one. Furthermore, the drawings also include portions having mutually different dimensional relationships and ratios in some cases.
A robot 11 in
An eye portion 22a including a camera is provided in the upper portion of the head portion 22. Ear portions 22b including microphones are provided on the left and right of the head portion 22, respectively. A mouth portion 22c including a speaker is provided in a lower portion of the head portion 22.
The leg portion 23 is provided with a tray portion 32 for placing a target object, which is an object to be transported, in addition to four wheels 31 for enabling the robot 11 to move. The robot 11 places a target object on the tray portion 32, moves to a transport destination, grips the target object with the finger portions 26a and 26b, arranges the target object on a predetermined surface (hereinafter, referred to as an arrangement surface), and releases the target object.
In the example of
As illustrated in
In the robot 11, a central processing unit (CPU) 61, a read only memory (ROM) 62, and a random access memory (RAM) 63 are mutually connected by a bus 74. Furthermore, a microcontroller unit (MCU) 64 and an operation controller 67 are connected to the bus 74.
The CPU 61 is a control device that controls the entire robot 11, controls each portion, and performs various processes.
For example, the CPU 61 performs an arrangement process, that is, a process of gripping the target object with the finger portion 26a and the finger portion 26b, arranging the target object on the arrangement surface, and releasing the target object. Specifically, the CPU 61 instructs the MCU 64 to perform ultrasonic sensing. The CPU 61 acquires received wave information, which is information regarding the ultrasonic wave received by the ultrasonic receiving element 42, supplied from the MCU 64 in response to such an instruction. On the basis of the received wave information, the CPU 61 estimates a protrusion distance, which is the distance by which the target object protrudes from the gripping position toward the arrangement surface, and detects contact of the target object with the arrangement surface. The CPU 61 instructs the operation controller 67 to cause the robot 11 to perform a predetermined operation on the basis of an image acquired by the eye portion 22a, the protrusion distance, a detection result of the contact of the target object, and the like.
The MCU 64 is connected to a drive circuit 65 and an amplifier circuit 66, and performs ultrasonic sensing in response to an instruction from the CPU 61. Specifically, the MCU 64 drives the ultrasonic transmitting element 41 by supplying a rectangular pulse oscillating at a resonance frequency of the ultrasonic transmitting element 41 to the drive circuit 65 in response to an instruction from the CPU 61. The MCU 64 incorporates an analog/digital converter (AD converter), and samples a voltage corresponding to a sound pressure of the ultrasonic wave amplified by the amplifier circuit 66 by the AD converter. The MCU 64 performs signal processing on a digital signal obtained as a result of the sampling to calculate a reception time, a maximum voltage, and the like of the ultrasonic wave. Note that the reception time of the ultrasonic wave is a time from when the ultrasonic wave is output by the ultrasonic transmitting element 41 to a first peak of the ultrasonic waveform which is a waveform of the digital signal of the ultrasonic wave. The maximum voltage is a maximum value of the voltage in an ultrasonic waveform in a predetermined period. The MCU 64 supplies the CPU 61 with the reception time and the maximum voltage of the ultrasonic wave as the received wave information.
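The signal processing described above can be sketched as follows. This is a minimal illustration in Python, not the MCU 64 firmware itself; the function name and the noise-floor heuristic used to locate the first peak are assumptions.

```python
import numpy as np

def analyze_received_wave(samples, sample_rate_hz):
    """Compute the reception time (time from the ultrasonic output, taken as
    sample 0, to the first peak of the waveform) and the maximum voltage."""
    samples = np.asarray(samples, dtype=float)
    # Heuristic (assumption): the first peak is the first local maximum that
    # rises above a small fraction of the overall maximum, to skip noise.
    noise_floor = 0.05 * samples.max()
    reception_time_s = None
    for i in range(1, len(samples) - 1):
        if samples[i - 1] <= samples[i] > samples[i + 1] and samples[i] > noise_floor:
            reception_time_s = i / sample_rate_hz
            break
    return reception_time_s, float(samples.max())
```

In practice the maximum voltage would be taken only over the predetermined period described later, but the peak search above conveys the idea.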
The drive circuit 65 includes a circuit such as an H-bridge circuit, and converts the rectangular pulse voltage supplied from the MCU 64 into a driving voltage for the ultrasonic transmitting element 41. The drive circuit 65 supplies the rectangular pulse after the voltage conversion to the ultrasonic transmitting element 41. As a result, the ultrasonic transmitting element 41 generates an ultrasonic wave and outputs the ultrasonic wave in a predetermined direction.
The amplifier circuit 66 amplifies the signal of the ultrasonic wave received by the ultrasonic receiving element 42. The amplifier circuit 66 may amplify the received signal over the entire band, or may extract only the frequency component near the resonance frequency of the ultrasonic transmitting element 41 with a band pass filter (BPF) or the like and amplify only that component.
The operation controller 67 is connected to a body drive unit 68, a head drive unit 69, a leg drive unit 70, an arm drive unit 71, a hand drive unit 72, and a finger drive unit 73. In accordance with an instruction from the CPU 61, the operation controller 67 controls the body drive unit 68, the head drive unit 69, the leg drive unit 70, the arm drive unit 71, the hand drive unit 72, and the finger drive unit 73, and causes the robot 11 to perform a predetermined operation.
The body drive unit 68, the head drive unit 69, the leg drive unit 70, the arm drive unit 71, the hand drive unit 72, and the finger drive unit 73 drive the body portion 21, the head portion 22, the leg portion 23, the arm portion 24, the hand portion 25, and the finger portions 26a and 26b under the control of the operation controller 67, respectively, and cause these portions to perform predetermined operations.
For example, the body drive unit 68 drives the body portion 21 to incline the body portion 21 forward, backward, left, and right. The head drive unit 69 drives the head portion 22, and rotates the head portion 22 relative to the body portion 21 such that the eye portion 22a and the ear portions 22b can acquire information from desired directions, and the mouth portion 22c can output voice in a desired direction. The leg drive unit 70 drives the wheels 31 of the leg portion 23 to move the robot 11 from a transport source to the transport destination. The arm drive unit 71 drives the arm portion 24 to move the arm portion 24 up, down, left, and right relative to the body portion 21 such that the positions of the finger portions 26a and 26b are at desired positions (for example, positions where the target object can be gripped). The hand drive unit 72 drives the hand portion 25 to rotate the hand portion 25 relative to the arm portion 24 such that the positions of the finger portions 26a and 26b are at desired positions (for example, positions where the target object can be gripped). The finger drive unit 73 drives the finger portions 26a and 26b to cause the finger portions 26a and 26b to grip the target object.
The body drive unit 68, the head drive unit 69, the leg drive unit 70, the arm drive unit 71, the hand drive unit 72, and the finger drive unit 73 supply information such as current positions of the body portion 21, the head portion 22, the arm portion 24, the hand portion 25, and the finger portions 26a and 26b to the operation controller 67.
Furthermore, an input/output interface 75 is also connected to the bus 74. To the input/output interface 75, an input unit 76, an output unit 77, a storage unit 78, a communication unit 79, and a drive 80 are connected.
The input unit 76 includes the eye portion 22a, the ear portions 22b, and the like. The eye portion 22a acquires a surrounding image. The ear portions 22b acquire surrounding voice. The image acquired by the eye portion 22a and the voice acquired by the ear portions 22b are supplied to the CPU 61 via the input/output interface 75 and the bus 74.
The output unit 77 includes the mouth portion 22c and the like. The mouth portion 22c outputs voice supplied from the CPU 61 via the input/output interface 75 and the bus 74.
The storage unit 78 includes a hard disk and a non-volatile memory. The communication unit 79 includes a network interface. The drive 80 drives a removable medium 81 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
In the robot 11 configured as described above, for example, the CPU 61 loads the program stored in the storage unit 78 into the RAM 63 via the input/output interface 75 and the bus 74 and executes the program, to thereby perform a series of processes.
The program executed by the CPU 61 can be provided, for example, by being recorded on the removable medium 81 as a package medium or the like. Furthermore, the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
In the robot 11, the program can be installed in the storage unit 78 via the input/output interface 75 by mounting the removable medium 81 to the drive 80.
Furthermore, the program can be received by the communication unit 79 via a wired or wireless transmission medium, and installed in the storage unit 78. In addition, the program can be installed in the ROM 62 or the storage unit 78 in advance.
As illustrated in
The protrusion distance estimation unit 101 instructs the operation controller 67 to set the interval between the finger portions 26a and 26b to a predetermined width W0. Then, the protrusion distance estimation unit 101 instructs the MCU 64 to perform ultrasonic sensing, and acquires a reception time t0 of an ultrasonic wave from the MCU 64. The protrusion distance estimation unit 101 stores the reception time t0 and the width W0 in association with each other in the RAM 63. The protrusion distance estimation unit 101 performs the above processing while changing the width W0, and stores a table in which the reception times t0 and the widths W0 are associated with each other in the RAM 63.
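The calibration above can be sketched as follows. The two callables are hypothetical stand-ins for the operation controller 67 and MCU 64 interfaces; only the table-building logic is taken from the text.

```python
def build_reception_time_table(widths_m, set_finger_interval, sense_reception_time):
    """Associate each finger interval W0 with the reception time t0 measured
    at that interval, as in the table stored in the RAM 63.
    set_finger_interval(w) moves the fingers to width w (hypothetical);
    sense_reception_time() runs one ultrasonic sensing cycle and returns t0
    in seconds (hypothetical)."""
    table = {}
    for w0 in widths_m:
        set_finger_interval(w0)
        table[w0] = sense_reception_time()
    return table
```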
The protrusion distance estimation unit 101 reads, from the table stored in the RAM 63, the reception time t0 corresponding to the width W0 that is the same as an interval W1 between the finger portions 26a and 26b at the time of gripping the target object, the interval W1 being supplied from the movement control unit 103. Then, the protrusion distance estimation unit 101 instructs the MCU 64 to perform ultrasonic sensing, and acquires a reception time t1 of an ultrasonic wave from the MCU 64. The protrusion distance estimation unit 101 estimates a protrusion distance d of the target object on the basis of the reception time t0 and the reception time t1 according to the principle of time of flight (ToF), and supplies the protrusion distance d to the initial position determination unit 102. The protrusion distance estimation unit 101 also supplies the protrusion distance d and the reception time t1 to the detection unit 104.
The initial position determination unit 102 determines a position on the arrangement surface on which the target object is to be arranged on the basis of the image from the eye portion 22a and the like. On the basis of such a position and the protrusion distance d, the initial position determination unit 102 determines initial positions of the finger portions 26a and 26b at the time of an arrangement operation to be positions higher by (d+α) than a position on the arrangement surface on which the target object is to be arranged. Note that α is any value larger than zero, and is a margin determined in advance on the basis of estimation accuracy of the protrusion distance d. The initial position determination unit 102 supplies the initial positions to the movement control unit 103.
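As a minimal numeric illustration of the initial-position rule above (assuming heights measured in meters along a vertical axis; the function name is hypothetical):

```python
def initial_finger_height(arrangement_height_m, d_m, alpha_m):
    """Initial position of the finger portions in the arrangement operation:
    higher by (d + alpha) than the position on the arrangement surface on
    which the target object is to be arranged."""
    return arrangement_height_m + d_m + alpha_m
```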
The movement control unit 103 acquires an image of the target object captured by the eye portion 22a, and determines, on the basis of the image, an objective gripping position, which is the gripping position of the target object to be aimed at. On the basis of the objective gripping position, the movement control unit 103 calculates positions of the finger portions 26a and 26b that enable the finger portions 26a and 26b to grip the objective gripping position. The movement control unit 103 instructs the operation controller 67 to move the finger portions 26a and 26b to the calculated positions.
Then, the movement control unit 103 instructs the operation controller 67 to cause the finger portions 26a and 26b to grip the target object. As a result, the movement control unit 103 acquires an interval between the finger portions 26a and 26b when the target object is gripped from the operation controller 67, and supplies the interval to the protrusion distance estimation unit 101. The movement control unit 103 instructs the operation controller 67 to lift the target object gripped by the finger portions 26a and 26b.
The movement control unit 103 instructs the operation controller 67 to move the finger portions 26a and 26b to the initial positions supplied from the initial position determination unit 102. Then, the movement control unit 103 notifies the detection unit 104 of the completion of the movement. Thereafter, the movement control unit 103 instructs the operation controller 67 to move the finger portions 26a and 26b from the initial positions toward the arrangement surface at a predetermined speed. The movement control unit 103 instructs the operation controller 67 to stop the movement of the finger portions 26a and 26b and to release the target object from the finger portions 26a and 26b according to a detection result from the detection unit 104.
The detection unit 104 calculates the predetermined period corresponding to the maximum voltage on the basis of the protrusion distance d and the reception time t1 supplied from the protrusion distance estimation unit 101. In response to the notification from the movement control unit 103, the detection unit 104 starts instructing the MCU 64 to perform ultrasonic sensing and notifies the MCU 64 of the predetermined period corresponding to the maximum voltage, and as a result, acquires a maximum voltage Vmax from the MCU 64. The detection unit 104 detects that the target object has come into contact with the arrangement surface on the basis of the maximum voltage Vmax. The detection unit 104 supplies the detection result to the movement control unit 103, and ends the instruction of ultrasonic sensing to the MCU 64.
First, as illustrated in A of
Next, the arrangement process is started, and the finger portions 26a and 26b grip an objective gripping position of a target object 121 in response to an instruction from the movement control unit 103 as illustrated in B of
After the target object 121 is gripped, ultrasonic sensing using the ultrasonic transmitting element 41 and the ultrasonic receiving element 42 is performed in response to an instruction from the protrusion distance estimation unit 101. As a result, the protrusion distance estimation unit 101 acquires the reception time t1 of an ultrasonic wave received by the ultrasonic receiving element 42 through a path 132 directed from the ultrasonic transmitting element 41 toward the ultrasonic receiving element 42 while making a detour around the target object 121. The protrusion distance estimation unit 101 estimates the protrusion distance d of the target object 121 on the basis of the reception time t0 and the reception time t1.
Specifically, since sound has a diffraction effect, even in a case where the target object 121 is present between the ultrasonic transmitting element 41 and the ultrasonic receiving element 42, an ultrasonic wave makes a detour around the target object 121 and propagates through the path 132. At this time, the reception time t1 varies depending on the protrusion distance d and the interval W1 between the finger portions 26a and 26b, but the interval W1 is unknown until the finger portions 26a and 26b grip the target object 121. Therefore, the protrusion distance estimation unit 101 holds in advance the table in which various widths W0 are associated with the reception times t0. Then, the protrusion distance estimation unit 101 estimates the protrusion distance d by the following Formula (1) on the basis of the reception time t0 corresponding to the same width W0 as the interval W1 acquired after the gripping of the target object 121 and the reception time t1.

d = (v×t1 - v×t0)/2 ... (1)

In Formula (1), v represents the sound velocity. According to Formula (1), the protrusion distance d is estimated by subtracting the distance v×t0 of the path 131, through which the ultrasonic wave propagates directly without making a detour around the target object 121, from the distance v×t1 of the path 132 detouring around the target object 121, and dividing the result by two. The interval W1 may be used instead of the distance v×t0.
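The estimation by Formula (1), together with the worked numbers given later in the text (t0 = 300 μs, t1 = 500 μs, v = 340 m/s, giving d = 34 mm), can be checked with a few lines of Python:

```python
def estimate_protrusion_distance(t0_s, t1_s, v_mps=340.0):
    """Formula (1): d = (v*t1 - v*t0) / 2, half the difference between the
    detour-path length around the object and the direct-path length."""
    return (v_mps * t1_s - v_mps * t0_s) / 2.0

# Worked example from the text: t0 = 300 us, t1 = 500 us -> d = 0.034 m.
d = estimate_protrusion_distance(300e-6, 500e-6)
```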
After the protrusion distance d is estimated, the initial position determination unit 102 determines initial positions of the finger portion 26a and the finger portion 26b in an arrangement operation to be positions higher by (d+α) than a position on an arrangement surface 122 on which the target object is to be arranged on the basis of the positions on the arrangement surface on which the target object is to be arranged and the protrusion distance d. Therefore, the finger portions 26a and 26b move to the initial positions in accordance with an instruction from the movement control unit 103 as illustrated in C of
After the finger portions 26a and 26b are moved to the initial positions, the finger portions 26a and 26b are moved from the initial positions toward the arrangement surface 122 at a predetermined speed in response to an instruction of the movement control unit 103. At this time, in response to an instruction from the detection unit 104, ultrasonic sensing using the ultrasonic transmitting element 41 and the ultrasonic receiving element 42 is performed.
The detection unit 104 detects that the target object 121 has come into contact with the arrangement surface 122 on the basis of the maximum voltage Vmax obtained as a result of the ultrasonic sensing. Specifically, immediately before the target object 121 comes into contact with the arrangement surface 122, a gap between the surface of the target object 121 facing the arrangement surface 122 (the bottom surface in this example) and the arrangement surface 122 becomes small, so that the sound pressure of the ultrasonic wave propagating through the gap decreases and the maximum voltage Vmax falls. Therefore, the detection unit 104 detects the contact of the target object 121 with the arrangement surface 122 when the maximum voltage Vmax becomes smaller than a threshold Vth.
When the detection unit 104 detects that the target object 121 has come into contact with the arrangement surface 122, the movement control unit 103 stops the movement of the finger portions 26a and 26b, and the target object 121 is released from the finger portions 26a and 26b.
Therefore, it is possible to prevent the finger portions 26a and 26b from moving toward the arrangement surface 122 (descending in the example of
In a graph of
As illustrated in
In graphs of
Note that experiments in
In
Graphs on the left side of
Graphs on the right side of
Therefore, when the protrusion distance d is calculated with the reception time t0 as 300 μs, the reception time t1 as 500 μs, and the sound velocity v as 340 m/s in Formula (1) described above, an estimation value of the protrusion distance d is 34 mm.
Graphs on the left side of
Graphs on the right side of
Here, the predetermined period corresponding to the maximum voltage Vmax will be described. The graphs of
Specifically, the predetermined period corresponding to the maximum voltage Vmax is a period from the time when the ultrasonic transmitting element 41 outputs the ultrasonic wave to time t2 calculated by the following Formula (2).

t2 = t1 + 2α/v ... (2)
According to Formula (2), the time t2 is obtained by adding a time taken for the ultrasonic wave to move by a distance twice the margin α to the reception time t1 when the finger portions 26a and 26b grip the target object. That is, the time t2 is a value obtained by estimating a reception time in a case where the finger portions 26a and 26b are located at the initial positions on the basis of the reception time t1, the margin α, and the sound velocity v.
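This estimation of t2 can likewise be sketched and checked against the numbers used below (α = 2 cm, v = 340 m/s, t1 = 500 μs, giving t2 of about 618 μs):

```python
def reception_time_limit(t1_s, alpha_m, v_mps=340.0):
    """Formula (2): t2 = t1 + 2*alpha/v, the estimated reception time in the
    case where the finger portions are at the initial positions, i.e. raised
    by the margin alpha above the contact height."""
    return t1_s + 2.0 * alpha_m / v_mps
```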
When the finger portions 26a and 26b move from the initial positions toward the arrangement surface, the distance between the arrangement surface and each of the ultrasonic transmitting element 41 and the ultrasonic receiving element 42 decreases. Therefore, the ultrasonic wave propagating through the gap between the target object and the arrangement surface is received by the ultrasonic receiving element 42 at a time earlier than the time t2. Thus, erroneous detection by the detection unit 104 can be prevented by limiting the period corresponding to the maximum voltage Vmax to the period from the time when the ultrasonic transmitting element 41 outputs the ultrasonic wave to the time t2.
For example, assuming that α is 2 cm and the sound velocity v is 340 m/s in
However, in the graphs on the right side of
Therefore, the detection unit 104 limits the period for the maximum voltage Vmax to the period between the output of the ultrasonic wave from the ultrasonic transmitting element 41 and the time t2 (in this case, about 618 μs), so that it can detect that the target object is in contact with the arrangement surface on the basis of the maximum voltage Vmax.
In
Graphs on the left side of
Graphs on the right side of
Therefore, when the protrusion distance d is calculated with the reception time t0 as 400 μs, the reception time t1 as 800 μs, and the sound velocity v as 340 m/s in Formula (1) described above, an estimation value of the protrusion distance d is 68 mm.
Similarly to the graphs on the left side of
Graphs on the right side of
Note that the predetermined period corresponding to the maximum voltage Vmax is a period from the time when the ultrasonic transmitting element 41 outputs an ultrasonic wave to time t2. For example, assuming that α is 2 cm and the sound velocity v is 340 m/s in
In the graphs on the left side of
As described above, in the experimental results illustrated in
In step S11 of
In step S13, the movement control unit 103 calculates positions of the finger portions 26a and 26b that enable the finger portions 26a and 26b to grip the objective gripping position on the basis of the objective gripping position determined in step S12.
In step S14, the movement control unit 103 instructs the operation controller 67 to move the finger portions 26a and 26b to the positions calculated in step S13.
In step S15, the movement control unit 103 instructs the operation controller 67 to cause the finger portions 26a and 26b to grip the objective gripping position of the target object. Then, the movement control unit 103 acquires the interval W1 between the finger portions 26a and 26b from the operation controller 67, and supplies the interval W1 to the protrusion distance estimation unit 101.
In step S16, the movement control unit 103 instructs the operation controller 67 to cause the finger portion 26a and the finger portion 26b to lift the target object. A gripping operation is performed by the processing from steps S11 to S16.
In step S17, the protrusion distance estimation unit 101 reads the reception time t0 corresponding to the width W0 the same as the interval W1 supplied from the movement control unit 103 from the table stored in the RAM 63.
In step S18, the protrusion distance estimation unit 101 instructs the MCU 64 to perform ultrasonic sensing. The protrusion distance estimation unit 101 acquires the reception time t1 obtained as a result from the MCU 64.
In step S19, the protrusion distance estimation unit 101 estimates the protrusion distance d on the basis of the reception time t0 read in step S17 and the reception time t1 acquired in step S18. The protrusion distance estimation unit 101 supplies the protrusion distance d to the initial position determination unit 102, and supplies the protrusion distance d and the reception time t1 to the detection unit 104.
In step S20, the initial position determination unit 102 determines a position on the arrangement surface on which the target object is to be arranged on the basis of the image from the eye portion 22a and the like.
In step S21, on the basis of the position determined in step S20 and the protrusion distance d estimated in step S19, the initial position determination unit 102 determines initial positions of the finger portion 26a and the finger portion 26b to be positions higher by (d+α) than the position on the arrangement surface on which the target object is to be arranged.
In step S22, the movement control unit 103 instructs the operation controller 67 to move the finger portions 26a and 26b to the initial positions. Then, the movement control unit 103 notifies the detection unit 104 of the completion of the movement.
In step S23, the detection unit 104 calculates a predetermined period corresponding to the maximum voltage Vmax on the basis of the protrusion distance d and the reception time t1.
In step S24, the detection unit 104 instructs the MCU 64 to perform ultrasonic sensing in response to the notification from the movement control unit 103. At this time, the detection unit 104 notifies the MCU 64 of the predetermined period calculated in step S23. In step S25, the detection unit 104 acquires the maximum voltage Vmax from the MCU 64.
In step S26, the detection unit 104 determines whether or not the target object has come into contact with the arrangement surface on the basis of the maximum voltage Vmax acquired in step S25. Specifically, the detection unit 104 determines whether or not the maximum voltage Vmax is smaller than the threshold Vth. Then, in a case where it is determined that the maximum voltage Vmax is not smaller than the threshold Vth, the detection unit 104 determines that the target object is not in contact with the arrangement surface and advances the processing to step S27.
In step S27, the movement control unit 103 instructs the operation controller 67 to move the finger portions 26a and 26b toward the arrangement surface at a predetermined speed for a predetermined time. Then, the processing returns to step S24, and the detection unit 104 instructs the MCU 64 to perform ultrasonic sensing, and the processing proceeds to step S25.
On the other hand, in a case where it is determined in step S26 that the maximum voltage Vmax is smaller than the threshold Vth, the detection unit 104 determines that the target object has come into contact with the arrangement surface, and supplies a detection result indicating the contact of the target object with the arrangement surface to the movement control unit 103. Then, the processing proceeds to step S28.
In step S28, the movement control unit 103 instructs the operation controller 67 according to the detection result from the detection unit 104 to stop the movement of the finger portions 26a and 26b and release the target object from the finger portions 26a and 26b. Then, the arrangement process ends. The arrangement operation is performed by the processing from steps S17 to S28.
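The loop over steps S24 to S28 can be sketched as follows. The three callables are hypothetical stand-ins for the MCU 64 and operation controller 67 interfaces; only the control flow follows the process described above.

```python
def arrangement_loop(sense_max_voltage, lower_fingers, stop_and_release,
                     v_th, max_steps=1000):
    """Repeat: sense Vmax (steps S24-S25); if Vmax < Vth, contact is
    detected, so stop and release (steps S26, S28); otherwise move the
    fingers toward the arrangement surface a little (step S27)."""
    for _ in range(max_steps):
        v_max = sense_max_voltage()
        if v_max < v_th:
            stop_and_release()
            return True
        lower_fingers()
    return False  # safety cutoff (assumption, not part of the described process)
```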
<Description of Arrangement Process Performed without Using Present Technology>
In step S41 of
In step S42, the robot determines an objective gripping position on the basis of the size of the target object measured in step S41. In step S43, the robot determines a position of the finger portion at which the objective gripping position can be gripped by the finger portion on the basis of the objective gripping position determined in step S42.
In step S44, the robot moves the finger portion to the position determined in step S43. In step S45, the robot causes the finger portion to grip the objective gripping position of the target object. In step S46, the robot causes the finger portion to lift the target object. A gripping operation is performed by the processing from steps S41 to S46.
In step S47, the robot determines a position on an arrangement surface on which the target object is to be arranged. In step S48, the robot determines a position of the finger portion that enables the target object to be arranged at the position on the arrangement surface on the basis of the size of the target object measured in step S41, the objective gripping position, and the position on the arrangement surface determined in step S47. Specifically, the robot estimates a protrusion distance on the basis of the size of the target object and the objective gripping position. Then, the robot determines a position of the finger portion such that the finger portion is arranged to be higher by the protrusion distance than the position on the arrangement surface determined in step S47.
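The geometry of step S48 can be illustrated with a short sketch. It assumes, for illustration only, that the protrusion distance equals the object height minus the grip position measured from the top face; the function names are hypothetical.

```python
def protrusion_from_grip(object_height, grip_offset_from_top):
    """Distance by which the object protrudes below the gripping
    position: total height minus the grip position measured from the
    top face (an illustrative geometric assumption)."""
    return object_height - grip_offset_from_top

def finger_release_height(surface_z, object_height, grip_offset_from_top):
    """Finger height determined in step S48: higher than the position
    on the arrangement surface by the protrusion distance, so that the
    object bottom nominally meets the surface when released (step S50)."""
    return surface_z + protrusion_from_grip(object_height, grip_offset_from_top)
```

Any error in the estimated protrusion distance translates directly into the release height, which is why the text notes that the object may drop or be pressed against the surface.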
In step S49, the robot moves the finger portion to the position determined in step S48. In step S50, the robot releases the target object from the finger portion, and ends the arrangement process. An arrangement operation is performed by the processing from steps S47 to S50.
As described above, in the arrangement process of
In this case, when the robot arranges the finger portion to be higher by the protrusion distance than the position on the arrangement surface on which the target object is to be arranged and releases the target object, there is a possibility that the target object is released in a state of not being in contact with the arrangement surface and drops, or that the finger portion keeps moving toward the arrangement surface and presses the target object against the arrangement surface even after the target object comes into contact with the arrangement surface. When the target object drops, an impact is applied to the target object, or the target object moves or falls down after dropping, so that the target object cannot be arranged at a desired position in a desired posture. When the target object is pressed against the arrangement surface even after the contact, there is a possibility that an excessive force is applied to the target object and damages the target object. Therefore, it is necessary to move the target object to the arrangement surface at a low speed in order to prevent the damage to the target object.
On the other hand, in the arrangement process of
The calculation time required for the ultrasonic sensing is about 10 ms, and the computational load is low. Therefore, in the arrangement process of
In the arrangement process of
In the arrangement process of
As described above, in a case where the target object gripped by the finger portion 26a having the ultrasonic transmitting element 41 and the finger portion 26b having the ultrasonic receiving element 42 is to be arranged on the arrangement surface, the arrangement processing unit 100 detects contact of the target object with the arrangement surface on the basis of the sound pressure of the ultrasonic wave received by the ultrasonic receiving element 42.
Therefore, the arrangement processing unit 100 can easily detect the contact of the target object with the arrangement surface and suitably and easily arrange the target object only by performing the ultrasonic sensing. The arrangement processing unit 100 does not need to capture an image of the arrangement surface in order to detect the contact of the target object with the arrangement surface. Therefore, the arrangement processing unit 100 can accurately detect contact of the target object with the arrangement surface even in a case where the arrangement surface is present at a place (for example, inside a shelf at a high place or a low place, behind a shield, or the like) where image capturing with the eye portion 22a or the like is impossible. As a result, the target object can be suitably arranged at a high speed.
Since the arrangement processing unit 100 estimates the protrusion distance d on the basis of the sound pressure of the ultrasonic wave received by the ultrasonic receiving element 42, it is possible to easily estimate the protrusion distance d without performing a complicated process such as the image segmentation process.
Note that the predetermined period corresponding to the maximum voltage may be changed in accordance with current positions of the finger portions 26a and 26b. In this case, for example, the predetermined period corresponding to the maximum voltage is a period from the time when the ultrasonic transmitting element 41 outputs an ultrasonic wave to time t3 calculated by the following Formula (3).
In Formula (3), Δz is a distance between the initial position and the current position of each of the finger portions 26a and 26b, and is a value that is equal to or more than zero and less than α.
The table in which the reception times t0 and the widths W0 are associated with each other may be created immediately before the arrangement process, or may be created when the robot 11 is activated or the like. This table may be created at the time of shipment of the robot 11 from a factory and stored in the storage unit 78.
In the robot 11 in
The CPU 141 is a control device that controls the entire robot 11, controls each portion, and performs various processes.
For example, the CPU 141 performs an arrangement process. This arrangement process is similar to the arrangement process performed by the CPU 61 in
The MCU 142 is connected to the drive circuit 65 and the amplifier circuit 66, and performs ultrasonic sensing in response to an instruction from the CPU 141. This ultrasonic sensing is similar to the ultrasonic sensing performed by the MCU 64 of
In an arrangement processing unit 150 in
Similarly to the movement control unit 103 in
Then, the movement control unit 153 instructs the operation controller 67 to cause the finger portions 26a and 26b to grip a target object. As a result, the movement control unit 153 acquires an interval between the finger portions 26a and 26b when the target object is gripped from the operation controller 67, and supplies the interval to the protrusion distance estimation unit 101. The movement control unit 153 instructs the operation controller 67 to lift the target object gripped by the finger portions 26a and 26b.
The movement control unit 153 instructs the operation controller 67 to move the finger portions 26a and 26b to initial positions supplied from the initial position determination unit 102. Then, the movement control unit 153 notifies the detection unit 104 of the completion of the movement. Thereafter, the movement control unit 153 instructs the operation controller 67 to move the finger portions 26a and 26b toward an arrangement surface at a moving speed supplied from the surface distance estimation unit 155. The movement control unit 153 instructs the operation controller 67 to stop the movement of the finger portions 26a and 26b and to release the target object from the finger portions 26a and 26b according to a detection result from the detection unit 104.
The surface distance estimation unit 155 acquires a reception time supplied from the MCU 142 in response to an instruction of the detection unit 104 when the finger portions 26a and 26b are moving toward the arrangement surface. The surface distance estimation unit 155 estimates a surface distance dp, which is a distance between the arrangement surface and each of the finger portions 26a and 26b, according to the principle of ToF on the basis of the reception time. The surface distance estimation unit 155 determines the moving speed of the finger portions 26a and 26b on the basis of the surface distance dp such that the moving speed becomes slower as the surface distance dp becomes smaller. The surface distance estimation unit 155 supplies the moving speed to the movement control unit 153.
In
First, a table in which the reception times t0 and the widths W0 are associated with each other is stored in the RAM 63 before the arrangement process as illustrated in A of
Then, as illustrated in C of
Thereafter, as illustrated in E of
The surface distance estimation unit 155 calculates a moving speed vref on the basis of a reception time of the ultrasonic waveform received through the path 161. In the calculation of the moving speed vref, first, the surface distance estimation unit 155 estimates the surface distance dp according to the principle of ToF. Then, the surface distance estimation unit 155 calculates the moving speeds vref of the finger portion 26a and the finger portion 26b by the following Formula (4) using the surface distance dp and the protrusion distance d.
In Formula (4), G is a speed gain, and the upward direction in
The movement control unit 153 instructs the operation controller 67 to move the finger portions 26a and 26b toward the arrangement surface 122 at the moving speed vref. By repeating the ultrasonic sensing, the calculation of the moving speed vref, and the movement of the finger portions 26a and 26b as described above, the finger portions 26a and 26b move at a lower speed as approaching the arrangement surface 122 as illustrated in F of
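The repeated cycle of ToF estimation and speed calculation can be sketched as follows. Formula (4) is not reproduced in this excerpt, so the speed law below is a hypothetical proportional stand-in that merely reproduces the described behavior of slowing as the gap shrinks; the speed-of-sound value is likewise an assumption.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature (assumed value)

def surface_distance(reception_time_s):
    """ToF estimate of the surface distance dp: the ultrasonic wave
    travels down to the arrangement surface and back, so the one-way
    distance is half of the round-trip path."""
    return SPEED_OF_SOUND * reception_time_s / 2.0

def moving_speed(dp, d, gain):
    """Hypothetical proportional stand-in for Formula (4): the moving
    speed vref scales with the remaining gap (dp - d) between the
    object bottom and the surface, so the fingers decelerate as they
    approach the arrangement surface 122."""
    gap = max(dp - d, 0.0)   # clamp so the speed never reverses sign
    return gain * gap        # gain plays the role of the speed gain G
```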
Immediately before the target object 121 comes into contact with the arrangement surface 122, a gap between a surface (a bottom surface in the example of
When the detection unit 104 detects that the target object 121 has come into contact with the arrangement surface 122, the movement control unit 153 stops the movement of the finger portions 26a and 26b, and the target object 121 is released from the finger portions 26a and 26b.
As described above, the moving speed vref is decelerated as the finger portions 26a and 26b approach the arrangement surface 122 in the arrangement process performed by the arrangement processing unit 150. Therefore, the target object 121 can be arranged on the arrangement surface 122 without applying a strong impact to the target object 121 even in a case where an initial moving speed of the finger portions 26a and 26b is high.
In the arrangement process performed by the arrangement processing unit 150, the contact of the target object 121 is detected using the maximum voltage of only the ultrasonic wave received through the path 161 that is reflected by the arrangement surface 122 and directed to the ultrasonic receiving element 42. Therefore, detection accuracy of the contact can be improved.
Processing from steps S71 to S84 in
After processing of step S84, in step S85, the detection unit 104 acquires, from the MCU 64, a maximum voltage of only an ultrasonic wave reflected by an arrangement surface and received, which is obtained as a result of ultrasonic sensing. In step S86, the surface distance estimation unit 155 acquires, from the MCU 142, a reception time of only the ultrasonic wave reflected by the arrangement surface and received, which is obtained as a result of ultrasonic sensing.
In step S87, the detection unit 104 determines whether or not the target object has come into contact with the arrangement surface on the basis of the maximum voltage acquired in step S85, similarly to the processing in step S26 in
In a case where it is determined in step S87 that the target object is not in contact with the arrangement surface, the processing proceeds to step S88. In step S88, the surface distance estimation unit 155 estimates a surface distance according to the principle of ToF on the basis of the reception time acquired in step S86. In step S89, the surface distance estimation unit 155 calculates the moving speed vref by the above-described Formula (4) on the basis of the surface distance dp estimated in step S88. The surface distance estimation unit 155 supplies the moving speed vref to the movement control unit 153.
In step S90, the movement control unit 153 instructs the operation controller 67 to move the finger portions 26a and 26b toward the arrangement surface for a predetermined time at the moving speed vref. Then, the processing returns to step S84, and the detection unit 104 instructs the MCU 64 to perform ultrasonic sensing, and the processing proceeds to step S85.
On the other hand, in a case where it is determined in step S87 that the target object has come into contact with the arrangement surface, the detection unit 104 supplies a detection result indicating the contact of the target object with the arrangement surface to the movement control unit 153, and the processing proceeds to step S91. Since processing in step S91 is similar to the processing in step S28 in
Note that the same ones as those in
As illustrated in
A of
As illustrated in
In a case where the finger portion 170 includes the plurality of ultrasonic transmitting elements 171, a propagation direction of an ultrasonic wave can be changed by adjusting phases of driving voltages of the respective ultrasonic transmitting elements 171. For example, when the ultrasonic transmitting elements 171 are driven in the order of the ultrasonic transmitting elements 171-3, 171-2, and 171-1, the ultrasonic wave propagates in a direction of an arrow 172 as illustrated in
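The phase adjustment described above can be illustrated with the standard linear-array delay relation. The element pitch, the steering-angle parameterization, and the function name are assumptions; the text only states that the driving order of the elements is changed.

```python
import math

def firing_delays(num_elements, pitch_m, angle_rad, c=343.0):
    """Per-element trigger delays that steer the transmitted wavefront
    by angle_rad from the array normal (standard linear phased-array
    relation; the pitch and parameterization are assumptions).

    Delaying element i by i * pitch * sin(angle) / c tilts the combined
    wavefront, which is the effect obtained by driving the ultrasonic
    transmitting elements 171 in a chosen order."""
    step = pitch_m * math.sin(angle_rad) / c
    return [i * step for i in range(num_elements)]
```

Reversing the sign of the angle reverses the firing order, which corresponds to driving the elements in the order 171-3, 171-2, 171-1 instead of 171-1, 171-2, 171-3.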
Therefore, in the second embodiment, the protrusion distance d of the target object 181 in any direction can be estimated by changing the propagation direction of the ultrasonic wave. As a result, for example, even in a case where the target object 181 is gripped in an inclined manner as illustrated in
Furthermore, in the second embodiment, it is possible to detect contact of a position of the target object 181 in any direction with an arrangement surface 182 by changing the propagation direction of the ultrasonic wave. Therefore, for example, even in the case where the target object 181 is gripped in an inclined manner as illustrated in
Note that, in a robot 200 in
The CPU 201 is a control device that controls the entire robot 200, controls each portion, and performs various processes.
For example, the CPU 201 performs an arrangement process. This arrangement process is similar to the arrangement process performed by the CPU 61 in
The MCU 202 is connected to the drive circuits 203-1 to 203-3 and the amplifier circuit 66, and performs ultrasonic sensing in the predetermined directions in response to an instruction from the CPU 201. Specifically, in response to the instruction from the CPU 201, the MCU 202 drives the ultrasonic transmitting elements 171 by supplying rectangular pulses oscillating at resonance frequencies of the ultrasonic transmitting elements 171 to the drive circuits 203-1 to 203-3 in the order corresponding to the directions in which the ultrasonic sensing is performed, respectively. Furthermore, the MCU 202 generates received wave information using an ultrasonic wave amplified by the amplifier circuit 66 and supplies the received wave information to the CPU 201, similarly to the MCU 64.
The drive circuits 203-1 to 203-3 are connected to the ultrasonic transmitting elements 171-1 to 171-3, respectively. Therefore, an ultrasonic wave output timing can be controlled individually for each of the ultrasonic transmitting elements 171. Note that the drive circuits 203-1 to 203-3 will be collectively referred to as drive circuits 203 hereinafter in a case where it is not necessary to particularly distinguish them.
Each of the drive circuits 203 is configured similarly to the drive circuit 65, and converts a voltage of the rectangular pulse supplied from the MCU 202 into the driving voltage of each of the ultrasonic transmitting elements 171. Each of the drive circuits 203 supplies the rectangular pulse after the voltage conversion to each of the ultrasonic transmitting elements 171 connected to itself. Therefore, the ultrasonic transmitting elements 171-1 to 171-3 generate and output ultrasonic waves in the order corresponding to the directions in which the ultrasonic sensing is performed. As a result, the ultrasonic waves propagate in the directions in which the ultrasonic sensing is performed.
Note that, in an arrangement processing unit 220 in
The protrusion distance estimation unit 221 stores, in the RAM 63, a table in which the reception times t0 and the widths W0 are associated with each other, similarly to the protrusion distance estimation unit 101. The protrusion distance estimation unit 221 reads the reception time t0 from the table stored in the RAM 63, similarly to the protrusion distance estimation unit 101.
Then, the protrusion distance estimation unit 221 instructs the MCU 202 to perform ultrasonic sensing in a predetermined direction, and acquires the reception time t1 from the MCU 202. The protrusion distance estimation unit 221 estimates the protrusion distance d on the basis of the reception time t0 and the reception time t1, similarly to the protrusion distance estimation unit 101. The protrusion distance estimation unit 221 performs the above processing while changing the direction of the ultrasonic sensing, and estimates the protrusion distances d in the respective directions. The protrusion distance estimation unit 221 supplies a maximum protrusion distance dmax, which is a maximum value among the estimated protrusion distances d, to the initial position determination unit 222. The protrusion distance estimation unit 221 supplies the maximum protrusion distance dmax and a reception time t1max, used to estimate the maximum protrusion distance dmax, to the detection unit 224.
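A minimal sketch of this estimation, assuming that the detour around the protruding part lengthens the propagation path by roughly twice the protrusion distance d (a simplification of the relation applied to the reception times t0 and t1; the speed-of-sound value is likewise assumed):

```python
def protrusion_distance(t0_s, t1_s, c=343.0):
    """Estimate the protrusion distance d from the reference reception
    time t0 (read from the table) and the gripping-time reception time
    t1. Assumes the detour around the protruding part adds roughly
    2 * d to the propagation path, which is a simplification."""
    return c * (t1_s - t0_s) / 2.0

def max_protrusion(gripping_times_s, t0_s, c=343.0):
    """Maximum protrusion distance dmax over the sensed directions,
    as selected by the protrusion distance estimation unit 221."""
    return max(protrusion_distance(t0_s, t1, c) for t1 in gripping_times_s)
```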
The initial position determination unit 222 determines a position on the arrangement surface on which the target object is to be arranged, similarly to the initial position determination unit 102. On the basis of the position and the maximum protrusion distance dmax, the initial position determination unit 222 determines initial positions of the finger portion 170 and the finger portion 26b at the time of an arrangement operation to be positions higher by (dmax+α) than the position on the arrangement surface on which the target object is to be arranged. The initial position determination unit 222 supplies the initial positions to the movement control unit 103.
The detection unit 224 calculates a predetermined period corresponding to a maximum voltage on the basis of the maximum protrusion distance dmax and the reception time t1max supplied from the protrusion distance estimation unit 221, similarly to the detection unit 104.
In response to a notification from the movement control unit 103, the detection unit 224 starts an instruction of ultrasonic sensing in a predetermined direction to the MCU 202 and provides notification of the predetermined period corresponding to the maximum voltage, and as a result, acquires the maximum voltage from the MCU 202. The detection unit 224 determines whether or not a position of the target object corresponding to the direction of the ultrasonic sensing has come into contact with the arrangement surface on the basis of the maximum voltage. The detection unit 224 performs the above processing while changing the direction of the ultrasonic sensing, and determines whether or not positions of the target object in the respective directions have come into contact with the arrangement surface.
In a case where it is determined that a position of the target object in any direction has come into contact with the arrangement surface, the detection unit 224 detects the contact of the position with the arrangement surface. The detection unit 224 supplies such a detection result to the movement control unit 103, and ends the instruction of ultrasonic sensing with respect to the MCU 202.
As described above, the robot 200 includes the three ultrasonic transmitting elements 171, and thus, can perform the ultrasonic sensing in the predetermined directions by individually controlling the ultrasonic wave output timings of the respective ultrasonic transmitting elements 171. As a result, the robot 200 can estimate the protrusion distances in the predetermined directions and detect contact of the positions in the predetermined directions of the target object with the arrangement surface. Therefore, even in a case where the target object is gripped in an inclined manner or in a case where the arrangement surface is not a flat surface, the target object can be safely and suitably arranged on the arrangement surface without applying an impact to the target object.
Note that the robot 200 may interpolate an occlusion region of an image of the target object acquired by the eye portion 22a on the basis of the protrusion distances d in the respective directions.
Although the three ultrasonic transmitting elements 171 are provided in the robot 200, the number of ultrasonic transmitting elements is not limited to three as long as a plurality of the ultrasonic transmitting elements is provided.
Note that the same ones as those in
As illustrated in
A of
As illustrated in
In a case where the directivity of the ultrasonic transmitting element 41 is wide, an ultrasonic wave propagates in a wide range and makes a detour around the target object 181 in various directions. In such a case, when the finger portion 270 includes the plurality of ultrasonic receiving elements 271, the reception timing of the ultrasonic wave deviates among the ultrasonic receiving elements 271 depending on the direction from which the ultrasonic wave arrives. Thus, the arrival direction of the ultrasonic wave can be recognized from such deviations according to the principle of direction of arrival (DOA).
For example, in a case where not the center but the left side of the target object 181 is gripped by the finger portion 26a and the finger portion 270 as illustrated in
After the ultrasonic wave from the direction indicated by the arrow 281 arrives, another ultrasonic wave makes a detour around the target object 181 clockwise in the drawing and arrives at the ultrasonic receiving elements 271 from a direction indicated by an arrow 282. Since the ultrasonic waves arrive in the order of the ultrasonic receiving elements 271-3, 271-2, and 271-1, it is possible to recognize that the arrival direction of the ultrasonic wave is the direction indicated by the arrow 282 on the basis of deviations in peak times of the respective ultrasonic receiving elements 271 according to the principle of DOA. A distance of the path of the arriving ultrasonic wave can be known from the reception time of the ultrasonic wave received by the ultrasonic receiving element 271-3.
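The DOA computation can be sketched for a uniform linear array of receiving elements. The element pitch and the plane-wave model are illustrative assumptions; the actual unit works from the peak-time deviations in an equivalent way.

```python
import math

def arrival_angle(peak_times_s, pitch_m, c=343.0):
    """Estimate the arrival direction of a wave from the peak-time
    deviations of equally spaced ultrasonic receiving elements (the
    DOA principle; pitch and plane-wave model are assumptions).

    The sign of the returned angle indicates which end of the array
    the wave reaches first (e.g. element 271-3 first for arrow 282)."""
    n = len(peak_times_s)
    steps = [peak_times_s[i + 1] - peak_times_s[i] for i in range(n - 1)]
    dt = sum(steps) / len(steps)               # mean inter-element delay
    s = max(-1.0, min(1.0, c * dt / pitch_m))  # clamp against noisy inputs
    return math.asin(s)
```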
A direction in which an ultrasonic wave has arrived and a distance of a path can be calculated similarly for the ultrasonic wave from a direction other than the directions of the arrows 281 and 282. Therefore, in the third embodiment, it is possible to estimate a three-dimensional protrusion size which is a three-dimensional size of a portion protruding from a gripping position of the target object 181 to an arrangement surface side. As a result, initial positions can be determined more suitably on the basis of the three-dimensional protrusion size. Therefore, it is possible to more safely move the finger portion 26a and the finger portion 270 to the initial positions without applying an impact to the target object 181.
Furthermore, in the third embodiment, at the time of detecting the contact of the target object 181, it is possible to recognize a position in any direction of the target object 181 that has come into contact with the arrangement surface on the basis of the deviations of the reception timings in the respective ultrasonic receiving elements 271 when a voltage of the peak of the ultrasonic waveform becomes lower than a threshold. Therefore, it is possible to detect contact of a position in a predetermined direction of the target object 181 with the arrangement surface 182. Therefore, the target object 181 can be more safely and suitably arranged on the arrangement surface 182, for example, by releasing the target object 181 when a position in a desired direction of the target object 181 has come into contact with the arrangement surface 182.
Note that, in a robot 300 in
The CPU 301 is a control device that controls the entire robot 300, controls each portion, and performs various processes.
For example, the CPU 301 performs an arrangement process. This arrangement process is similar to the arrangement process performed by the CPU 61 in
The MCU 302 is connected to the drive circuit 65 and the amplifier circuits 303-1 to 303-3, and performs ultrasonic sensing in response to an instruction from the CPU 301. Specifically, the MCU 302 drives the ultrasonic transmitting element 41 in response to an instruction from the CPU 301, similarly to the MCU 64. Furthermore, the MCU 302 includes three built-in AD converters. The MCU 302 samples voltages corresponding to sound pressures of ultrasonic waves amplified by the amplifier circuits 303-1 to 303-3 in the AD converters, respectively. The MCU 302 performs signal processing on digital signals obtained as results of the sampling, thereby calculating the peak times and the peak voltages of the ultrasonic waves amplified by the respective amplifier circuits 303-1 to 303-3. The MCU 302 supplies the peak times, the peak voltages, and the like to the CPU 301 as the received wave information.
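The peak extraction performed on the sampled digital signals can be sketched as follows. A real implementation would band-pass filter and envelope-detect the waveform before searching for the peak, so this is only an illustrative reduction; the sample rate is an assumed parameter.

```python
def peak_of_waveform(samples, sample_rate_hz):
    """Return (peak_time_s, peak_voltage) of a sampled ultrasonic
    waveform, in the spirit of the signal processing the MCU 302
    applies to the AD-converted signals. A real implementation would
    filter and envelope-detect first; this just takes the maximum."""
    idx = max(range(len(samples)), key=lambda i: samples[i])
    return idx / sample_rate_hz, samples[idx]
```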
The amplifier circuits 303-1 to 303-3 are connected to the ultrasonic receiving elements 271-1 to 271-3, respectively. Note that the amplifier circuits 303-1 to 303-3 will be collectively referred to as the amplifier circuits 303 hereinafter in a case where it is not necessary to particularly distinguish them. Each of the amplifier circuits 303 is configured similarly to the amplifier circuit 66, and amplifies the ultrasonic wave received by each of the ultrasonic receiving elements 271 connected to itself.
Note that, in an arrangement processing unit 320 in
The three-dimensional size estimation unit 321 instructs the MCU 302 to perform ultrasonic sensing, and acquires peak times of ultrasonic waves received by the respective ultrasonic receiving elements 271 from the MCU 302. The three-dimensional size estimation unit 321 estimates a three-dimensional protrusion size on the basis of the peak times. The three-dimensional size estimation unit 321 supplies the maximum protrusion distance dmax of the three-dimensional protrusion size to the initial position determination unit 322.
The initial position determination unit 322 determines a position on the arrangement surface on which the target object is to be arranged, similarly to the initial position determination unit 102. On the basis of the position and the maximum protrusion distance dmax, the initial position determination unit 322 determines initial positions of the finger portion 26a and the finger portion 270 at the time of an arrangement operation to be positions higher by (dmax+α) than the position on the arrangement surface on which the target object is to be arranged. The initial position determination unit 322 supplies the initial positions to the movement control unit 103.
The detection unit 323 starts an instruction of ultrasonic sensing with respect to the MCU 302 in response to a notification from the movement control unit 103, and as a result, acquires the peak times and peak voltages in the respective ultrasonic receiving elements 271 from the MCU 302. On the basis of the peak times and the peak voltages, the detection unit 323 detects that a position of the target object in a predetermined direction has come into contact with the arrangement surface. The detection unit 323 supplies such a detection result to the movement control unit 103, and ends the instruction of ultrasonic sensing with respect to the MCU 302.
As described above, the robot 300 includes the three ultrasonic receiving elements 271, and thus, can recognize an arrival direction of an ultrasonic wave on the basis of deviations in ultrasonic wave reception timings of the ultrasonic receiving elements 271. As a result, the robot 300 can estimate the three-dimensional protrusion size and detect contact of a position in a predetermined direction of the target object with the arrangement surface. Therefore, even in a case where a gripping position of the target object is shifted from the center, the target object can be arranged at a desired position on the arrangement surface. Furthermore, it is possible to lower the possibility of applying an impact to the target object and to more safely and suitably arrange the target object on the arrangement surface as compared with the robot 11.
Although the three ultrasonic receiving elements 271 are provided in the robot 300, the number of ultrasonic receiving elements is not limited to three as long as a plurality of the ultrasonic receiving elements is provided.
Note that not the maximum protrusion distance dmax but a protrusion distance in a predetermined direction can be used as the protrusion distance used to calculate the initial positions in the second and third embodiments.
A tactile sensor may be provided on the finger portion 26a (170) or the finger portion 26b (270), or a force sensor may be provided at a position (root) where the hand portion 25 is connected with the arm portion 24. In this case, the detection unit 104 (224, 323) detects that a target object has come into contact with an arrangement surface by also using information regarding a reaction force received by the target object measured by the tactile sensor or the force sensor. Therefore, detection accuracy can be improved as compared with a case where detection is performed by using only an ultrasonic waveform.
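The fusion described above can be sketched as a simple OR of the two cues; the thresholds and the decision rule are hypothetical, since the text only states that the reaction-force information is also used.

```python
def contact_detected(v_max, v_th, reaction_force_n, f_th):
    """Fuse the ultrasonic cue (Vmax falling below Vth) with a
    tactile/force-sensor cue (reaction force exceeding a threshold).
    Accepting either cue keeps the fast ultrasonic response while the
    force sensor catches cases the acoustics miss; the thresholds and
    the OR rule are hypothetical."""
    return v_max < v_th or reaction_force_n > f_th
```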
The robot 200 (300) may have finger portions other than the finger portion 170 (26a) and the finger portion 26b (270), that is, a plurality of finger portions that do not grip a target object, and each of the finger portions may be provided with an ultrasonic transmitting element (ultrasonic receiving element). In this case, processing similar to that in the second embodiment (third embodiment) can be performed by moving the finger portions provided with the ultrasonic transmitting elements (ultrasonic receiving elements) to array the ultrasonic transmitting elements (ultrasonic receiving elements).
In a case where a three-dimensional size of a target object is known and an objective value of the protrusion distance d is known owing to high positional accuracy of movement of the finger portion 26a (170) and the finger portion 26b (270), the robot 11 (200, 300) may determine that gripping of an objective gripping position has failed when the objective value and an estimated value of the protrusion distance d greatly differ from each other. In this case, the robot 11 (200, 300) may perform a gripping operation again or perform calibration such that an error between an actual gripping position and the objective gripping position becomes zero.
A feature amount such as the number of peaks, a width of a peak, or a peak time of an ultrasonic waveform at the time of gripping a target object varies depending on a shape of the target object and the protrusion distance d (three-dimensional protrusion size). Therefore, the robot 11 (200, 300) may learn the relationship between shapes and the protrusion distances d (three-dimensional protrusion sizes) of objects assumed as the target object, and feature amounts of ultrasonic waveforms at the time of gripping the objects by using a deep neural network (DNN) or the like before the arrangement process. In this case, the robot 11 (200, 300) estimates the shape of the target object and the protrusion distance d (three-dimensional protrusion size) from the feature amount of the ultrasonic waveform at the time of gripping the target object. At this time, the robot 11 (200, 300) may measure information such as the shape and a three-dimensional size of the target object using a three-dimensional sensor, and estimate the shape and the protrusion distance d (three-dimensional protrusion size) of the target object using the information to improve the estimation accuracy.
A feature amount such as a maximum voltage or a peak voltage of an ultrasonic waveform at the time of contact of a target object with an arrangement surface varies depending on a shape and an area of the arrangement surface. Therefore, the robot 11 (200, 300) may learn the relationship between shapes and areas of surfaces assumed as the arrangement surface and feature amounts of ultrasonic waveforms at the time of contact of target objects with the surfaces by using a DNN or the like before the arrangement process. In this case, the robot 11 (200, 300) detects the contact of the target object with the arrangement surface from the feature amount of the ultrasonic waveform. Therefore, accurate contact detection can be performed regardless of shapes and areas of arrangement surfaces.
In a case where the protrusion distance d is long and a received ultrasonic wave is weak, the robot 11 (200, 300) may change a gripping position such that the protrusion distance d is shortened, or may increase a voltage of an ultrasonic waveform by adjusting an amplification factor of the amplifier circuit 66 (303) using a programmable gain amplifier (PGA). The robot 11 (200, 300) can also increase the voltage of the ultrasonic waveform by switching a rectangular pulse voltage to be supplied to the ultrasonic transmitting element 41 (171) or adjusting the number of rectangular pulses with an analog switch or the like.
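The signal-strengthening options listed above can be sketched as a simple selection rule. The PGA gain steps, target voltage, and pulse counts below are illustrative values, not parameters from the disclosure.

```python
# Hedged sketch: pick a larger PGA gain until the expected peak voltage
# reaches a target level; if even the maximum gain is insufficient, also
# increase the number of rectangular drive pulses to the transmitter.
PGA_GAINS = [1, 2, 4, 8, 16]  # assumed selectable amplification factors

def boost_received_signal(peak_voltage, target=1.0, gain=1, pulses=4):
    """Return a (gain, pulse count) pair expected to reach the target."""
    for g in PGA_GAINS:
        if g >= gain and peak_voltage * g / gain >= target:
            return g, pulses
    # Maximum gain alone is not enough: double the drive pulse count too.
    return PGA_GAINS[-1], pulses * 2
```

Changing the gripping position to shorten the protrusion distance d, as the text also suggests, would be handled by the motion planner rather than by this signal-path adjustment.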
The eye portion 22a may be a 3D sensor or the like. In this case, the eye portion 22a supplies information acquired by the 3D sensor to the CPU 61 (141, 201, 301).
The program executed by the CPU 61 (141, 201, 301) may be a program in which the processes are performed in time series in the order described in the present description, or may be a program in which the processes are performed in parallel or at a necessary timing such as when a call is made.
Although the series of processes in the robot 11 (200, 300) has been described above as being executed by software, it can also be executed by hardware.
Embodiments of the present technology are not limited to the above-described embodiments, and various modifications can be made within a scope not departing from the gist of the present technology.
For example, it is possible to adopt a mode obtained by combining all or some of the plurality of embodiments described above. In the second embodiment and the third embodiment, the surface distance dp can be estimated, and the moving speed of each of the finger portion 26a (170) and the finger portion 26b (270) toward an arrangement surface can be set to the moving speed vref.
The present technology may be configured as cloud computing in which one function is shared by a plurality of devices via a network and processed jointly.
Each of the steps in the flowcharts described above can be executed by one device or shared and executed by a plurality of devices.
In a case where one step includes a plurality of processes, the plurality of processes included in the one step can be executed by one device, and also shared and executed by a plurality of devices.
The effects described in the present specification are merely examples and are not limiting, and effects other than those described in the present specification may be provided.
Note that the present technology can have the following configurations.
| Number | Date | Country | Kind |
| --- | --- | --- | --- |
| 2021-102969 | Jun 2021 | JP | national |
| Filing Document | Filing Date | Country | Kind |
| --- | --- | --- | --- |
| PCT/JP2022/005154 | 2/9/2022 | WO | |