The technology disclosed in the present specification (hereinafter referred to as “the present disclosure”) relates to, for example, a robot device applied to transportation of loads and a method for controlling the robot device.
For example, in the field of logistics, introduction of mobile robots applied to transportation of loads is being promoted. This type of mobile robot is expected to implement, for example, autonomous operation of taking out and delivering a load placed on a shelf or a carriage.
A transport robot has been devised that uses an arm to take out a load (see, for example, Patent Documents 1 and 2). However, it is very difficult to lift and unload a heavy object with a small arm. Furthermore, if a high-power arm is used in consideration of taking out a heavy object, the size of the arm increases and the cost also increases. Furthermore, an arm that grips a load with a gripper using frictional force must grip the load with a large gripping force when taking out a heavy object, and in the case of a load packed in a soft box such as corrugated cardboard, there is a possibility of the gripping force of the gripper crushing the box.
Furthermore, since the last mile of logistics includes an outdoor environment, road surface conditions are not always constant, unlike in factories and warehouses. In a case where a load is placed on an installation surface inclined from the horizontal, the posture of the load is indefinite, so it is very difficult to grasp the load without using a multi-degree-of-freedom arm or the like. For example, a robot has been devised that performs force detection with an end effector at the tip of an arm and implements profiling operation of the end effector by force control (see Patent Document 3). However, to perform such profiling operation, an arm having six degrees of freedom capable of posture control is required, and the weight of the arm increases, which makes it difficult to mount the arm on a small robot.
An object of the technology according to the present disclosure is to provide a robot device and a method for controlling the robot device that implement unloading or loading of a load to a loading platform, with fewer degrees of freedom.
The technology according to the present disclosure has been made in consideration of the problems described above, and a first aspect thereof is
The robot device according to the first aspect further includes a taking-out unit that takes out the load placed on the load receiving surface and moves the load to the loading unit. Furthermore, the control unit performs force control of the posture changing unit to cause the loading unit to follow the load receiving surface, and then controls the taking-out unit to take out the load placed on the load receiving surface and move the load to the loading unit.
Furthermore, a second aspect of the technology according to the present disclosure is
With the technology according to the present disclosure, it is possible to provide a robot device and a method for controlling the robot device that implement unloading or loading of a load, with fewer degrees of freedom, by performing profile control of the posture of a loading platform by force control.
Note that, the effects described in the present specification are merely examples, and the effects brought about by the technology according to the present disclosure are not limited thereto. Furthermore, the technology according to the present disclosure may have additional effects other than the effects described above.
Still other objects, features, and advantages of the technology according to the present disclosure will become apparent from the detailed description based on embodiments and attached drawings to be described later.
Hereinafter, embodiments of the technology according to the present disclosure will be described in detail with reference to the drawings.
A. Device Configuration
The leg 110 includes two links 111 and 112, and a joint unit 113 connecting the link 111 with the link 112. The other end (lower end) of the link 111 corresponds to the sole of a foot and is grounded on the floor surface. Furthermore, the upper end of the link 112 is attached to the loading unit 101 via the joint unit 114. The joint unit 113 has a rotational degree of freedom around the pitch axis, and the link 111 can be driven around the pitch axis with respect to the link 112 by an actuator (not illustrated) such as a pitch axis rotation motor. Furthermore, the joint unit 114 has rotational degrees of freedom around at least the pitch axis and the roll axis, and the link 112 can be driven around the pitch axis and the roll axis with respect to the loading unit 101 by actuators (not illustrated) such as pitch axis and roll axis rotation motors. Note that, in the order of proximity to the loading unit 101, the link 112 is also referred to as the first link, and the link 111 is also referred to as the second link. Furthermore, in the order of proximity to the loading unit 101, the joint unit 114 corresponding to the hip joint is also referred to as the first joint, and the joint unit 113 corresponding to the knee is also referred to as the second joint.
Furthermore, the leg 120 includes two links 121 and 122, and a joint unit 123 connecting the link 121 with the link 122. The other end (lower end) of the link 121 corresponds to the sole of a foot and is grounded on the floor surface. Furthermore, the upper end of the link 122 is attached to the loading unit 101 via the joint unit 124. The joint unit 123 has a rotational degree of freedom around the pitch axis, and the link 121 can be driven around the pitch axis with respect to the link 122 by an actuator (not illustrated) such as a pitch axis rotation motor. Furthermore, the joint unit 124 has rotational degrees of freedom around at least the pitch axis and the roll axis, and the link 122 can be driven around the pitch axis and the roll axis with respect to the loading unit 101 by actuators (not illustrated) such as pitch axis and roll axis rotation motors. Note that, in the order of proximity to the loading unit 101, the link 122 is also referred to as the first link, and the link 121 is also referred to as the second link. Furthermore, in the order of proximity to the loading unit 101, the joint unit 124 corresponding to the hip joint is also referred to as the first joint, and the joint unit 123 corresponding to the knee is also referred to as the second joint.
Furthermore, the leg 130 includes two links 131 and 132, and a joint unit 133 connecting the link 131 with the link 132. The other end (lower end) of the link 131 corresponds to the sole of a foot and is grounded on the floor surface. Furthermore, the upper end of the link 132 is attached to the loading unit 101 via the joint unit 134. The joint unit 133 has a rotational degree of freedom around the pitch axis, and the link 131 can be driven around the pitch axis with respect to the link 132 by an actuator (not illustrated) such as a pitch axis rotation motor. Furthermore, the joint unit 134 has rotational degrees of freedom around at least the pitch axis and the roll axis, and the link 132 can be driven around the pitch axis and the roll axis with respect to the loading unit 101 by actuators (not illustrated) such as pitch axis and roll axis rotation motors. Note that, in the order of proximity to the loading unit 101, the link 132 is also referred to as the first link, and the link 131 is also referred to as the second link. Furthermore, in the order of proximity to the loading unit 101, the joint unit 134 corresponding to the hip joint is also referred to as the first joint, and the joint unit 133 corresponding to the knee is also referred to as the second joint.
Furthermore, the leg 140 includes two links 141 and 142, and a joint unit 143 connecting the link 141 with the link 142. The other end (lower end) of the link 141 corresponds to the sole of a foot and is grounded on the floor surface. Furthermore, the upper end of the link 142 is attached to the loading unit 101 via the joint unit 144. The joint unit 143 has a rotational degree of freedom around the pitch axis, and the link 141 can be driven around the pitch axis with respect to the link 142 by an actuator (not illustrated) such as a pitch axis rotation motor. Furthermore, the joint unit 144 has rotational degrees of freedom around at least the pitch axis and the roll axis, and the link 142 can be driven around the pitch axis and the roll axis with respect to the loading unit 101 by actuators (not illustrated) such as pitch axis and roll axis rotation motors. Note that, in the order of proximity to the loading unit 101, the link 142 is also referred to as the first link, and the link 141 is also referred to as the second link. Furthermore, in the order of proximity to the loading unit 101, the joint unit 144 corresponding to the hip joint is also referred to as the first joint, and the joint unit 143 corresponding to the knee is also referred to as the second joint.
The movable legs 110, 120, 130, and 140 each have three degrees of freedom: rotational degrees of freedom around the roll and pitch axes of the first joint and a rotational degree of freedom around the pitch axis of the second joint. The entire robot device 100 thus has twelve degrees of freedom. Although the robot device 100 illustrated in
Note that, the loading unit 101 is provided with a taking-out unit that scoops up a load placed on a shelf or a carriage and moves the load to the loading unit 101, a stopper that prevents the load from slipping down from the loading unit 101, and the like, but illustration of those is omitted for simplification in
In the robot device 100, as an external sensor unit 210, cameras 211L and 211R that function as left and right “eyes” of the robot device 100, a microphone 212 that functions as an “ear”, a touch sensor 213, and the like are arranged at predetermined positions, respectively. As the cameras 211L and 211R, for example, cameras each including an imaging element such as a complementary metal oxide semiconductor (CMOS) or a charge coupled device (CCD) are used.
Note that, although not illustrated, the external sensor unit 210 may further include other sensors. For example, the external sensor unit 210 includes a torque sensor that detects rotational torque acting on the first joint and the second joint of each of the legs 110, 120, 130, and 140, an encoder that detects a joint angle, and a sole sensor that measures floor reaction force acting on the sole of each of the legs 110, 120, 130, and 140. Each sole sensor includes, for example, a six-degrees-of-freedom (6DOF) force sensor or the like.
Furthermore, the external sensor unit 210 may include a sensor capable of measuring or estimating a direction and a distance of a predetermined target, such as Laser Imaging Detection and Ranging (LIDAR), a Time of Flight (TOF) sensor, and a laser range sensor. Furthermore, the external sensor unit 210 may include a Global Positioning System (GPS) sensor, an infrared sensor, a temperature sensor, a humidity sensor, an illuminance sensor, and the like.
Furthermore, in the robot device 100, a speaker 221 and a display unit 222 are respectively arranged at predetermined positions as output units. The speaker 221 outputs voice and functions to perform voice guidance, for example. Furthermore, the display unit 222 displays a status of the robot device 100 and responses to the user.
In a controller unit 230, a main control unit 231, a battery 232, an internal sensor unit 233 including a battery sensor 233A and an acceleration sensor 233B, an external memory 234, and a communication unit 235 are arranged.
The cameras 211L and 211R of the external sensor unit 210 image a surrounding situation and send an obtained image signal S1A to the main control unit 231. The microphone 212 collects voice input from the user and sends an obtained voice signal S1B to the main control unit 231. Input voice given to the robot device 100 by the user includes an activation word and various command voices (voice commands) such as “walk”, “turn right”, “hurry”, and “stop”. Note that, although only one microphone 212 is drawn in
Furthermore, the touch sensor 213 of the external sensor unit 210 is laid on a placement surface of the loading unit 101, for example, and detects a pressure received at a place where a load is placed, and a result of the detection is sent to the main control unit 231 as a pressure detection signal S1C.
The battery sensor 233A of the internal sensor unit 233 detects a remaining amount of energy of the battery 232 at predetermined cycles, and sends a detection result as a battery remaining amount detection signal S2A to the main control unit 231.
The acceleration sensor 233B detects, at predetermined cycles, acceleration of movement of the robot device 100 in three axis directions (the x (roll), y (pitch), and z (yaw) axes), and sends a result of the detection to the main control unit 231 as an acceleration detection signal S2B. The acceleration sensor 233B may be, for example, an Inertial Measurement Unit (IMU) equipped with a 3-axis gyro and a 3-direction acceleration sensor. By using the IMU, it is possible to measure an angle and an acceleration of the robot device 100 main body or the loading unit 101.
The external memory 234 stores programs, data, control parameters, and the like, and supplies the programs and data to a memory 231A built in the main control unit 231 as needed. Furthermore, the external memory 234 receives and stores data and the like from the memory 231A. Note that, the external memory 234 may be configured as a cartridge type memory card like an SD card, for example, and may be detachable from the robot device 100 main body (or the controller unit 230).
The communication unit 235 performs data communication with the outside on the basis of a communication method such as Wi-Fi (registered trademark) or Long Term Evolution (LTE). For example, a program such as an application executed by the main control unit 231 and data required for executing the program can be acquired from the outside via the communication unit 235.
The memory 231A is built in the main control unit 231. The memory 231A stores programs and data, and the main control unit 231 performs various types of processing by executing the programs stored in the memory 231A. That is, the main control unit 231 determines surrounding and internal situations of the robot device 100, presence or absence of a command from the user or an action from the user, or the like, on the basis of the image signal S1A, the voice signal S1B, and the pressure detection signal S1C (hereinafter, these are collectively referred to as external sensor signals S1) respectively supplied from the cameras 211L and 211R, the microphone 212, and the touch sensor 213 of the external sensor unit 210, and the battery remaining amount detection signal S2A and the acceleration detection signal S2B (hereinafter these are collectively referred to as internal sensor signals S2) respectively supplied from the battery sensor 233A and the acceleration sensor 233B of the internal sensor unit 233. Note that, information on the weight and the position of the center of gravity of the robot device 100 main body (in a state in which no load is placed on the loading unit 101) may be stored in the memory 231A in advance.
Then, the main control unit 231 determines an action of the robot device 100 and an expression action to be activated for the user, on the basis of the surrounding and internal situations of the robot device 100, a determination result of the presence or absence of the command from the user or the action from the user, a control program stored in advance in the internal memory 231A, or various control parameters and the like stored in the external memory 234 loaded at that time, and generates a control command based on a result of the determination and sends the control command to each of the sub-control units 241, 242, etc.
The sub-control units 241, 242, . . . are in charge of operation control of subsystems in the robot device 100, respectively, and drive the subsystems on the basis of the control command supplied from the main control unit 231. The above-mentioned movable legs 110, 120, 130, and 140, and the taking-out unit that scoops up a load (described above) correspond to the subsystems, and are driven and controlled by the corresponding sub-control units 241, 242, 243, 244, etc. Specifically, the sub-control units 241, 242, 243, 244, . . . perform drive control of the joint units 113, 123, 133, and 143, and drive control of the taking-out unit. The taking-out unit performs operation such as scooping up a load placed on a shelf or a carriage and moving the load to the loading unit 101.
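The division of roles described above can be illustrated with a short sketch. All class, method, and signal names below are hypothetical stand-ins assumed for this example; they do not appear in the present disclosure:

```python
from dataclasses import dataclass

@dataclass
class Command:
    subsystem: str  # which subsystem the command is addressed to
    action: str     # what the subsystem should do

class SubControlUnit:
    """Drives one subsystem (a movable leg or the taking-out unit)."""
    def __init__(self, name: str):
        self.name = name

    def drive(self, command: Command) -> None:
        print(f"{self.name}: executing '{command.action}'")

class MainControlUnit:
    """Aggregates external (S1) and internal (S2) sensor signals,
    decides an action, and sends control commands to the sub-control units."""
    def __init__(self, sub_units: dict):
        self.sub_units = sub_units

    def step(self, s1: dict, s2: dict) -> None:
        # A toy decision rule standing in for the situation assessment
        # based on S1A/S1B/S1C and S2A/S2B described in the text.
        action = "walk" if s2["battery_level"] > 0.2 else "return to charger"
        for name, unit in self.sub_units.items():
            unit.drive(Command(subsystem=name, action=action))

# Usage: four leg controllers plus the taking-out unit controller.
subs = {n: SubControlUnit(n) for n in
        ("leg 110", "leg 120", "leg 130", "leg 140", "taking-out unit")}
MainControlUnit(subs).step(s1={"pressure": 0.0}, s2={"battery_level": 0.8})
```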
B. Load Transportation Operation
It is assumed that the robot device 100 is applied to the field of logistics, and transports a load in the last mile from a final base to a delivery destination, for example. Thus, the robot device 100 autonomously performs operation of scooping up the load placed on the shelf or the carriage and placing the load on the loading unit 101, at the final base, and then moving the load to the delivery destination.
Since the last mile of logistics includes an outdoor environment, a road surface condition is not always constant unlike factories and warehouses. For this reason, the robot device 100 may have to scoop up a load placed on an installation surface inclined from the horizontal and having an indefinite posture.
For example, a robot has been devised that uses an arm to scoop up a load from a shelf or a carriage. However, to grasp a load in any posture, the arm requires a large number of degrees of freedom, so that the weight of the arm increases and it becomes difficult to miniaturize the robot. Furthermore, an arm that grips a load with a gripper using frictional force must grip the load with a large gripping force when taking out a heavy object, and in the case of a load packed in a soft box such as corrugated cardboard, there is a possibility of the gripping force of the gripper crushing the box.
Thus, in the technology according to the present disclosure, posture control of the robot device 100 is performed so that the loading unit 101 follows a surface on which a load is placed, and then the load is scooped up and the load is moved onto the loading unit 101. Here, the surface on which the load is placed is, for example, a shelf or a carriage on which the load is placed. Furthermore, by performing the posture control of the robot device 100 by force control using the plurality of movable legs 110 to 140, it is possible to cause the loading unit 101 to follow the surface on which the load is placed. Then, since the loading unit 101 on which the load is to be loaded has already followed the surface on which the load is placed such as the shelf or the carriage, it is possible to move the load to the loading unit 101 relatively easily by pulling the gripper into the loading unit 101.
Even a gripper with a simple configuration that has only a degree of freedom of opening and closing and a degree of freedom of movement in one direction can sufficiently move a load, so the robot device 100 does not require an arm with multiple degrees of freedom. In other words, the size and weight of the robot device 100 can be reduced, and the cost can also be reduced.
In a body unit 300, circuit components, such as the controller unit 230 and the sub-control units 241, 242, 243, 244, are built in. The front leg 110 is coupled to the body unit 300 by the first joint 114 corresponding to a shoulder joint or a hip joint, and the front leg 120 is also coupled to the body unit 300 by the first joint 124. Furthermore, although hidden in
The upper surface of the body unit 300 constitutes the loading unit 101 on which a load is placed. Although omitted in
As described later, the posture control of the robot device 100 is performed by force control so that the loading unit 101 follows the surface on which the load is placed, and then the load is scooped up and moved onto the loading unit 101. For this reason, it is sufficient that the gripper 310 has only a degree of freedom of opening and closing and a degree of freedom of movement in the front-rear direction, and other degrees of freedom are unnecessary. Note that the degree of freedom of movement of the gripper 310 is in a direction parallel to the loading surface of the loading unit 101. Furthermore, since the work of transferring the load from the load receiving surface to the loading unit 101 is performed in a state in which the loading unit 101 has followed the surface on which the load is placed, a large gripping force is not required even when a heavy object is taken out. For example, a cardboard box or a precision package packed with a load can be gently pulled out or placed without being crushed by the gripping force of the gripper.
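As an illustration of how few degrees of freedom suffice, the following minimal sketch models a gripper with exactly one opening/closing degree of freedom and one translation degree of freedom parallel to the loading surface. The class name and limits are invented for this example and are not part of the present disclosure:

```python
class TwoDofGripper:
    """Minimal 2-DOF gripper sketch: one opening/closing DOF and one
    translation DOF parallel to the loading surface; no other DOFs."""
    def __init__(self, max_opening=0.3, max_travel=0.4):  # [m], example limits
        self.opening = max_opening   # claw separation (open/close DOF)
        self.travel = 0.0            # extension along the loading surface (slide DOF)
        self.max_opening = max_opening
        self.max_travel = max_travel

    def close_until(self, load_width):
        # Close the claws just onto the load; a large gripping force is not
        # needed because the loading unit already follows the surface.
        self.opening = max(0.0, min(self.max_opening, load_width))

    def slide(self, distance):
        # Extend (+) toward the load or retract (-) toward the loading unit.
        self.travel = max(0.0, min(self.max_travel, self.travel + distance))
```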
Note that, although the cameras 211L and 211R (described above) that function as the left and right “eyes” of the robot device 100 are not illustrated in
The robot device 100 searches for a load to be a target of transportation on the basis of the images captured by the cameras 211L and 211R, and detection results by the LIDAR or the TOF sensor. In the examples illustrated in
First, as illustrated in
As illustrated in
When the gripper 310 comes into contact with the load receiving surface, external force Fb and a moment Mb acting on a base point of the body unit 300 can be obtained from the whole body force control of the robot device 100. The base point of the body unit 300 referred to here is a root portion of the gripper 310 attached to the body unit 300. The robot device 100 can estimate contact force Fc received by the gripper 310 from the load receiving surface at a contact point position x between the gripper 310 and the load receiving surface on the basis of dynamics (Fb, Mb) of the base point of the body unit 300. Here, a distance from the base point of the body unit 300 to a contact point is defined as the contact point position x. The contact point position x where the gripper 310 first comes into contact with the load receiving surface is defined as x1.
The robot device 100 drives the legs 110, 120, 130, and 140 until the contact force Fc reaches a predetermined set value F1. Note that, the contact force Fc is equal to the external force Fb acting on the base point, and furthermore, the contact point position x satisfies the following equation (1).
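The body of equation (1) is not reproduced above. A plausible reconstruction from the surrounding definitions, assuming the moment Mb about the base point is produced by the contact force Fc (= Fb) acting at the lever arm x, is:

\[
x = \frac{M_b}{F_c} \tag{1}
\]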
Thereafter, while the contact force Fc acting on the base point of the body unit 300 is kept at the predetermined set value F1, the legs 110, 120, 130, and 140 are driven, and force control of the posture of the body unit 300 is performed in the rotation direction of the already determined trajectory plan of the body unit 300.
When the force control of the posture of the body unit 300 is continuously performed so as to cause rotation in accordance with the trajectory plan while the contact force Fc is kept at the predetermined set value F1, the posture of the body unit 300 (or the loading unit 101 on the upper surface of the body unit 300) begins to follow the load receiving surface as illustrated in
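The follow behavior described above amounts to a simple two-phase force control loop. The sketch below is illustrative only; the robot interface and all numbers (the toy dynamics, the set value, the pad geometry) are invented assumptions for this example, not part of the present disclosure:

```python
F1 = 10.0          # predetermined contact force set value [N] (example value)
CENTER_TOL = 0.01  # contact point regarded as "near the center" within this [m]

class ToyRobot:
    """Toy stand-in for the whole body force control interface; every name
    and number here is invented for illustration."""
    def __init__(self):
        self.Fb = 0.0           # external force at the base point [N]
        self.x = 0.12           # contact point position x [m] (first contact at x1)
        self.pad_center = 0.05  # center of the gripper contact surface [m]

    def estimate_base_wrench(self):
        # Returns (Fb, Mb); Fc equals Fb, and Mb = Fc * x per equation (1).
        return self.Fb, self.Fb * self.x

    def drive_legs_toward_surface(self):
        self.Fb += 2.0          # pressing harder raises the contact force

    def regulate_and_rotate(self, target_force):
        self.Fb = target_force  # hold Fc at F1 while rotating the body posture
        self.x -= 0.01          # following the surface moves the contact point

def follow_load_receiving_surface(robot):
    # Phase 1: drive the legs until the contact force Fc reaches F1.
    while robot.estimate_base_wrench()[0] < F1:
        robot.drive_legs_toward_surface()
    # Phase 2: keep Fc at F1 and rotate per the trajectory plan until the
    # contact point x = Mb / Fc enters the vicinity of the pad center.
    Fb, Mb = robot.estimate_base_wrench()
    while abs(Mb / Fb - robot.pad_center) > CENTER_TOL:
        robot.regulate_and_rotate(target_force=F1)
        Fb, Mb = robot.estimate_base_wrench()

follow_load_receiving_surface(ToyRobot())
```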
Here, in
In the posture of the robot device 100 illustrated in
As can be seen from
First, the robot device 100 roughly detects the inclination of the load receiving surface of the shelf 501 on which the load 500 is placed, on the basis of image recognition of the images captured by the cameras 211L and 211R, for example (step S901). See, for example,
Next, the robot device 100 determines the rotation direction of the posture of the body unit 300 for causing the loading unit 101 to follow the load receiving surface after the gripper 310 integrally attached to the body unit 300 and the load receiving surface come into contact with each other, and formulates the trajectory plan for the gripper 310 (step S902). Then, the robot device 100 executes the trajectory plan (step S903). See, for example,
While rotating the posture of the body unit 300 in accordance with the formulated trajectory plan in a state in which the gripper 310 and the load receiving surface are in contact with each other, the robot device 100 estimates the contact force Fc received by the gripper 310 from the load receiving surface on the basis of the external force Fb and the moment Mb acting on the base point of the body unit 300 (step S904). See, for example,
Thereafter, the robot device 100 drives the legs 110, 120, 130, and 140 until the contact force Fc acting on the base point of the body unit 300 reaches the predetermined set value F1, and then, while keeping the contact force Fc at the predetermined set value F1, rotates the posture of the body unit 300 in the rotation direction of the trajectory plan of the body unit 300 determined in step S902, by the whole body force control of the robot device 100 (step S905). See, for example,
When the posture of the body unit 300 (or the loading unit 101 on the upper surface of the body unit 300) begins to follow the load receiving surface, the contact point position x between the gripper 310 and the load receiving surface begins to vary. Then, the robot device 100 determines that the body unit 300 or the loading unit 101 has almost completely followed the load receiving surface when the contact point position has entered the vicinity of the center of the contact surface (step S906). See, for example,
Next, the robot device 100 closes the two claws 311 and 312 of the gripper 310 until the claws come into contact with the load, to grip the load 500 (step S907).
Then, the robot device 100 pulls the gripper 310 in toward the loading unit 101, moving the gripped load 500 onto the loading unit 101.
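Putting steps S901 to S907 together, the take-out procedure can be outlined as below, reusing the ToyRobot stub and the follow_load_receiving_surface loop from the sketch above. All method names here are likewise invented for illustration:

```python
class ToyTakeOutRobot(ToyRobot):
    """Extends the ToyRobot stub with invented perception, planning, and
    gripper stubs, purely for illustration."""
    def detect_surface_inclination(self):
        return 0.1                  # [rad], rough estimate by image recognition

    def plan_body_rotation(self, inclination):
        # Rotation direction that tilts the loading unit toward the surface.
        return -1 if inclination > 0 else +1

    def execute_trajectory(self, plan):
        pass                        # approach until the gripper touches the surface

    def close_gripper_on_contact(self):
        pass                        # close the claws 311 and 312 onto the load

    def retract_gripper(self):
        pass                        # pull the gripper in toward the loading unit

def take_out_load(robot):
    inclination = robot.detect_surface_inclination()  # step S901
    plan = robot.plan_body_rotation(inclination)      # step S902
    robot.execute_trajectory(plan)                    # step S903
    follow_load_receiving_surface(robot)              # steps S904 to S906
    robot.close_gripper_on_contact()                  # step S907
    robot.retract_gripper()                           # slide the load onto the loading unit

take_out_load(ToyTakeOutRobot())
```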
According to the operation procedure illustrated in
The robot device 100 searches for a load to be carried out from the plurality of loads stored in the carriage 1000 on the basis of the images captured by the cameras 211L and 211R and the detection results by the LIDAR or the TOF sensor. Upon roughly detecting an inclination of the load receiving surface on which the target load is placed, the robot device 100 determines a rotation direction of the posture of the body unit 300 for causing the loading unit 101 to follow the load receiving surface, formulates a trajectory plan for the gripper 310, and executes the trajectory plan.
After the gripper 310 comes into contact with the load receiving surface, the robot device 100 estimates the external force Fb and the moment Mb acting on the base point of the body unit 300 on the basis of the whole body force control, and continues to perform the force control of the posture of the body unit 300 while keeping the contact force Fc received by the gripper 310 from the load receiving surface at the predetermined set value F1. Then, upon detecting that the contact point between the gripper 310 and the load receiving surface has entered the vicinity of the center of the contact surface, the robot device 100 determines that the body unit 300 or the loading unit 101 has almost completely followed the load receiving surface, and grips the load with the gripper 310.
Note that, in a case where the carriage 1000 is on an inclined road surface, the carriage 1000 may start moving even while the robot device 100 is approaching the load. Thus, as illustrated in
As can be seen from
C. Modification (1)
The robot device 100 first roughly detects the floor surface by, for example, image recognition using a camera, determines the rotation direction of the posture of the body unit 300 after a fork portion of the lift 1600 comes into contact with the floor surface, and formulates a trajectory plan for the lift 1600. The robot device 100 executes the trajectory plan, and when the fork portion of the lift 1600 comes into contact with the floor surface, estimates the contact force Fc received by the lift 1600 from the floor surface on the basis of the external force Fb and the moment Mb acting on the base point of the body unit 300 obtained from the whole body force control of the robot device 100.
Then, the robot device 100 rotates the posture of the body unit 300 by driving the legs 110, 120, 130, and 140 until the contact force Fc from the floor surface reaches the predetermined set value F1. Moreover, the robot device 100 rotates the posture of the body unit 300 so that the contact force Fc is kept at the predetermined set value F1, and when a contact point between the fork portion of the lift 1600 and the floor surface enters the vicinity of the center of a contact surface of the fork portion of the lift 1600, it is determined that the fork portion of the lift 1600 has followed the floor surface as the load receiving surface.
Thereafter, as illustrated in
D. Modification (2)
The robot device 100 illustrated in
The robot device 100 first roughly detects the back surface of the tray 2100 by, for example, image recognition using a camera, determines the rotation direction of the posture of the body unit 300 after the loading unit 101 on the upper surface of the body unit 300 comes into contact with the back surface of the tray 2100, and formulates a trajectory plan for the body unit 300. The robot device 100 executes the trajectory plan, and when the loading unit 101 comes into contact with the back surface of the tray 2100, estimates the contact force Fc received by the loading unit 101 from the back surface of the tray 2100 on the basis of the external force Fb and the moment Mb acting on the base point of the body unit 300 obtained from the whole body force control of the robot device 100.
Then, the robot device 100 rotates the posture of the body unit 300 by driving the legs 110, 120, 130, and 140 until the contact force Fc from the back surface of the tray 2100 reaches the predetermined set value F1. Moreover, the robot device 100 rotates the posture of the body unit 300 so that the contact force Fc is kept at the predetermined set value F1, and when a contact point between the loading unit 101 and the back surface of the tray 2100 enters the vicinity of the center of the loading unit 101, it is determined that the loading unit 101 has followed the back surface of the tray 2100 as the load receiving surface.
Thereafter, as illustrated in
E. Modification (3)
The robot device 100 illustrated in
The robot device 100 first roughly detects the wall surface of the load 2301 by, for example, image recognition using a camera, determines the rotation direction of the posture of the body unit 300 after a suction port of the suction unit 2300 comes into contact with the wall surface of the load 2301, and formulates a trajectory plan for the suction unit 2300. The robot device 100 executes the trajectory plan, and when the suction port of the suction unit 2300 comes into contact with the wall surface of the load 2301, estimates the contact force Fc received by the suction unit 2300 from the wall surface of the load 2301 on the basis of the external force Fb and the moment Mb acting on the base point of the body unit 300 obtained from the whole body force control of the robot device 100.
Then, the robot device 100 rotates the posture of the body unit 300 by driving the legs 110, 120, 130, and 140 until the contact force Fc from the wall surface of the load 2301 reaches the predetermined set value F1. Moreover, the robot device 100 rotates the posture of the body unit 300 so that the contact force Fc is kept at the predetermined set value F1, and when a contact point between the suction port of the suction unit 2300 and the wall surface of the load 2301 enters the vicinity of the center of a contact surface of the suction port of the suction unit 2300, it is determined that the suction port of the suction unit 2300 has followed the wall surface of the load 2301.
Thereafter, the robot device 100 causes the suction unit 2300 to suck the load 2301 by air pressure or the like, retreats from the shelf or the carriage on which the load 2301 is placed by walking operation using the legs 110, 120, 130, and 140, and moves to a transport destination of the load.
F. Modification (4)
On the other hand, in the robot device 100 illustrated in
The robot device 100 first roughly detects the load receiving surface by, for example, image recognition using a camera, determines a height of the lift 2400 for accessing the load receiving surface and the rotation direction of the posture of the body unit 300 after the gripper 2401 comes into contact with the load receiving surface, and formulates a trajectory plan for the body unit 300, including lifting operation of the lift 2400. The robot device 100 executes the formulated trajectory plan, causes the lift 2400 to perform lifting operation as illustrated in
Then, the robot device 100 rotates the posture of the body unit 300 by driving the legs 110, 120, 130, and 140 until the contact force Fc from the load receiving surface reaches the predetermined set value F1. Moreover, the robot device 100 rotates the posture of the body unit 300 so that the contact force Fc is kept at the predetermined set value F1, and when a contact point between the gripper 2401 and the load receiving surface enters the vicinity of the center of a contact surface of the gripper 2401, it is determined that the gripper 2401 has followed the load receiving surface.
The robot device 100 can lift the load from the load receiving surface by closing the gripper 2401 to grip the load and raising the lift 2400 a little further. Next, the robot device 100 retreats from the shelf or the like on which the load is placed by walking operation, then lowers the lift 2400 as illustrated in
G. Modification (5)
In all of the robot devices 100 described so far, a change of the posture of the body unit 300 or the loading unit 101 and movement of the robot device 100 main body have been implemented by driving the legs 110, 120, 130, and 140.
On the other hand, a robot device 2800 illustrated in
Note that, the parallel link includes a mechanism that supports an output end (corresponding to the loading unit 101 in the example illustrated in
H. Modification (6)
When a load 2900 is gripped by the gripper 310 in front of the robot device 100, the position of the center of gravity of the entire robot device 100 including the load 2900 shifts forward as compared with the case of the robot device 100 alone. As the load 2900 becomes heavier, the position of the center of gravity shifts further forward. As a result, the margin between the position of the center of gravity and the boundary of the support polygon becomes small, which increases the risk of the robot device 100 falling over. Thus, the robot device 100 moves the leg tips of the front legs 110 and 120 forward when gripping the load 2900, thereby expanding the support polygon forward and reducing the risk of falling over.
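The effect of expanding the support polygon can be checked numerically. The following is a minimal sketch with invented example coordinates (none of the numbers below appear in the present disclosure); it computes the margin between the projected center of gravity and the boundary of the support polygon before and after the front soles are moved forward:

```python
import math

def polygon_margin(point, polygon):
    """Distance from `point` to the nearest edge of the support polygon.
    Assumes `point` lies inside the (convex) polygon, so a larger value
    means a larger stability margin."""
    def distance_to_edge(p, a, b):
        # Distance from p to segment a-b (projection clamped to the segment).
        ax, ay = a
        bx, by = b
        px, py = p
        t = ((px - ax) * (bx - ax) + (py - ay) * (by - ay)) / \
            ((bx - ax) ** 2 + (by - ay) ** 2)
        t = max(0.0, min(1.0, t))
        cx, cy = ax + t * (bx - ax), ay + t * (by - ay)
        return math.hypot(px - cx, py - cy)
    return min(distance_to_edge(point, polygon[i], polygon[(i + 1) % len(polygon)])
               for i in range(len(polygon)))

# Sole positions (x forward, y left), in meters -- example values only.
normal_stance   = [(0.20, 0.15), (0.20, -0.15), (-0.20, -0.15), (-0.20, 0.15)]
extended_stance = [(0.35, 0.15), (0.35, -0.15), (-0.20, -0.15), (-0.20, 0.15)]

cog = (0.18, 0.0)  # center of gravity projection, shifted forward by the load

print(polygon_margin(cog, normal_stance))    # 0.02 -> small margin, risk of falling
print(polygon_margin(cog, extended_stance))  # 0.15 -> margin restored by the wider stance
```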
I. Modification (7)
The robot device 3000 first roughly detects the floor surface as the load receiving surface by, for example, image recognition using a camera, determines the rotation direction of the posture of the body unit after the gripper 3001 at the leg tip comes into contact with the load receiving surface, and formulates a trajectory plan for the body unit. The robot device 3000 executes the formulated trajectory plan and brings the gripper 3001 closer to the load 3002 on the floor by operation of the legs. Then, when the gripper 3001 comes into contact with the load receiving surface, the contact force Fc received by the gripper 3001 from the load receiving surface is estimated on the basis of the external force Fb and the moment Mb acting on the base point of the body unit obtained from the whole body force control of the robot device 3000.
Then, the robot device 3000 rotates the posture of the body unit by driving the legs until the contact force Fc from the floor surface reaches the predetermined set value F1. Moreover, the robot device 3000 rotates the posture of the body unit or the leg tip so that the contact force Fc is kept at the predetermined set value F1, and when a contact point between the gripper 3001 and the floor surface enters the vicinity of the center of a contact surface of the gripper 3001, it is determined that the gripper 3001 has followed the floor surface. Then, the robot device 3000 can lift the load 3002 from the floor surface by closing the gripper 3001 to grip the load and raising the leg tip.
The robot device 3000 utilizes the posture control of the body unit and the degrees of freedom of the legs to cause the gripper 3001 at the leg tip to follow the load receiving surface, grips the load 3002, and further moves the load to the body unit, whereby the range in which the load can be pulled out is expanded. The robot device 3000 can directly pull up the load 3002 placed on the floor if it has a space for loading the load in the front, the rear, or the lower part of the body unit.
J. Modification (8)
The robot device 100 first roughly detects the load receiving surface by, for example, image recognition using a camera, determines the rotation direction of the posture of the body unit 300 after the gripper 310 comes into contact with the load receiving surface, and formulates a trajectory plan for the body unit 300. The load receiving surface referred to here is the upper surface of the load 3201 directly under the load 3200.
The robot device 100 executes the formulated trajectory plan and brings the gripper 310 closer to the load 3200 on the load receiving surface by walking motion. Then, when the gripper 310 comes into contact with the load receiving surface, the contact force Fc received by the gripper 310 from the load receiving surface is estimated on the basis of the external force Fb and the moment Mb acting on the base point of the body unit 300 obtained from the whole body force control of the robot device 100.
Then, the robot device 100 rotates the posture of the body unit 300 by driving the legs 110, 120, 130, and 140 until the contact force Fc from the load receiving surface reaches the predetermined set value F1. Moreover, the robot device 100 rotates the posture of the body unit 300 so that the contact force Fc is kept at the predetermined set value F1, and when a contact point between the gripper 310 and the load receiving surface enters the vicinity of the center of the contact surface of the gripper 310, it is determined that the gripper 310 has followed the load receiving surface.
The robot device 100 closes the gripper 310 to grip the load 3200, and lifts the load 3200 from the load receiving surface. Next, after retreating from, for example, a place where the load 3200 is piled up in bulk by walking operation, the robot device 100 greatly tilts the body unit 300 and the gripper 310 to the rear of the robot device 100 by greatly bending the hind legs 130 and 140. Then, the lifted load slips down from the gripper 310 and moves to the loading unit 101. Furthermore, since the stopper 320 is provided on the rear end edge of the loading unit 101, the load 3200 that has slipped down from the gripper 310 does not further fall from the rear end of the loading unit 101.
K. Other Modification
All of the robot devices described so far basically have a structure in which the taking-out unit such as a gripper is provided in front of the robot device (or the body unit) to pull a load into the loading unit on the upper surface of the body unit. On the other hand, it is also possible to configure the robot device so that a space for accommodating a load is provided at the rear or lower part of the body unit, the posture of the bottom surface of the body unit is caused to follow the floor surface, and the load is pulled into the space.
L. Summary
Finally, a summary will be given of the effects brought about by the robot device to which the technology according to the present disclosure is applied.
(1) The robot device can individually take out the target load from the shelf, the carriage, or the like on which one or a plurality of loads is placed.
(2) Since the robot device can take out the load from the load receiving surface without using the arm with multiple degrees of freedom, the device cost can be reduced and a small and lightweight robot device can be configured.
(3) Even in an uncertain situation where the inclination of the load receiving surface is unknown or the accuracy of the bottom surface of the load is insufficient, the robot device can reliably scoop up the load by causing the loading unit to follow the load receiving surface by the whole body force control.
(4) The robot device can gently pull out or place a load or precision package packed in a soft box such as corrugated cardboard with a small gripping force, without crushing it, by causing the loading unit to follow the load receiving surface by the whole body force control.
(5) The robot device pulls the load gripped by the gripper or the like to the body unit, whereby the center of gravity of the entire robot device including the load moves to the vicinity of the center of the support polygon, so that the posture stabilizes and the risk of falling over is low.
(6) Since the robot device lifts the load from the lower side thereof, no additional gripper or pull-in mechanism is required, and the structure that takes out the load from the load receiving surface can be simplified.
(7) The robot device can reliably suck the wall surface of the load by combination with the suction unit having a suction function by air pressure or the like.
(8) The robot device can easily pull out the load regardless of whether the load receiving surface is high or low by attaching the gripper via a lift capable of lifting operation.
(9) The robot device moves the leg tips of the front legs forward when gripping the load to expand the support polygon forward, whereby the risk of falling over can be reduced even if the position of the center of gravity including the load shifts forward.
(10) The robot device can directly pull up the load placed on the floor if it has a space for loading the load in the front, the rear, or the lower part of the body unit.
(11) If the robot device is provided with the taking-out unit such as the gripper at the leg tip, it is possible to cause the leg tip to follow the load receiving surface to grip the load and then move the load to the body unit, whereby the range in which the load can be pulled out is expanded.
(12) By causing body postures of a plurality of robot devices to follow each other by the whole body force control, it is possible to stably carry a large load while cooperating.
(13) The robot device can reliably scoop up the target load from among loads piled in bulk by causing the upper surface of the load directly under the target load to be followed.
In the above, the technology according to the present disclosure has been described in detail with reference to specific embodiments. However, it is obvious that those skilled in the art can make modifications and substitutions of the embodiments without departing from the scope of the technology according to the present disclosure.
The technology according to the present disclosure can be similarly applied to a robot device or a mobile device provided with a plurality of moving means other than the legs. For example, a configuration may be adopted in which a wheel-type mobile device (including an autonomous vehicle) provided with a plurality of wheels is equipped with a parallel link including a plurality of mechanisms arranged in parallel, with the final output end serving as a loading platform.
In short, the technology according to the present disclosure has been described in the form of exemplification, and the description content of the present specification should not be interpreted restrictively. To determine the gist of the technology according to the present disclosure, the claims should be taken into consideration.
Note that, the technology according to the present disclosure can also have the following configuration.
(1) A robot device including:
(2) The robot device according to (1), in which the posture changing unit includes a plurality of link structures that supports the loading unit.
(3) The robot device according to (1) or (2), in which
(4) The robot device according to (1) or (2), in which
(5) The robot device according to any of (1) to (4), further including
(6) The robot device according to (5), in which
(7) The robot device according to (6), in which
(8) The robot device according to any of (5) to (7), in which
(9) The robot device according to (5), in which
(10) The robot device according to (5), in which
(11) The robot device according to (5), in which
(12) The robot device according to any of (5) to (11), in which
(13) The robot device according to (5), in which
(14) The robot device according to (5), in which
(15) A method for controlling a robot device including a loading unit on which a load is placed, a posture changing unit that changes a posture of the loading unit, and a moving unit that moves the loading unit,
Number | Date | Country | Kind
---|---|---|---
2019-187523 | Oct 2019 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2020/027055 | 7/10/2020 | WO |