The present disclosure relates to a moving apparatus and a moving apparatus control method. Specifically, the present disclosure relates to a moving apparatus and a moving apparatus control method that enable control to stop a walking (leg-driven) robot, such as a four-legged robot that travels by moving a plurality of legs, in a more suitable posture according to the shape of the traveling surface on various traveling surfaces.
Examples of a walking (leg-driven) robot that moves by moving legs (leg portions) back and forth include various robots such as a four-legged robot and a six-legged robot.
In a case where such a walking (leg-driven) robot stops, the robot needs to be in a state where at least three legs are landed on the traveling surface.
In some cases, the robot travels not only on a flat surface but also on traveling surfaces having various different shapes, such as an inclined surface and a staircase. The walking (leg-driven) robot cannot stop in a stable posture unless the arrangement of each leg and the posture of the robot are changed in accordance with the shape of the traveling surface.
An example of a conventional technique that discloses posture control of a walking robot is Patent Document 1 (Japanese Patent No. 3687076).
Patent Document 1 discloses a technique for controlling the posture to a posture that decreases consumed energy in a stationary posture. The robot is configured to measure the consumed energy of the robot to calculate an evaluation reference value, search for a posture that decreases the consumed energy when the robot is in a stationary state, control the posture to that posture, and stop the robot.
For example, when a stationary command is input from a controller to the robot during traveling, the robot stops at the position, and shifts to a stop posture with less energy consumption calculated in advance and stops.
However, this method can be used only in a case where the walking robot travels on a traveling surface of a limited shape, such as a flat surface, for which a stop posture with low energy consumption has been calculated in advance.
For example, there is a problem that the method cannot be used in a case where the robot is traveling on a traveling surface having various different shapes such as an inclined surface and a staircase.
Since a stable posture at the time of stopping the robot varies depending on the shape of the traveling surface, there is a limit to determination of the stop posture in advance.
In addition, in a case where a walking (leg-driven) robot stops as described above, the robot needs to be in a state where at least three legs are landed on the traveling surface. A polygon configured by three or more landing points is referred to as a support polygon.
After the support polygon is formed, even if an attempt is made to change the posture to a posture that decreases the consumed energy in a stationary state, there is a possibility that the robot falls down when the posture is changed depending on the shape of the traveling surface.
In addition, a walking (leg-driven) robot performs control to change the arrangement of the legs by driving a driving motor attached to each leg. However, in a case where the motor of one leg has failed, for example, forcing the robot into a posture calculated in advance to decrease consumed energy may increase the heat generation amount of that motor, and it may be difficult to change to the posture.
In such a manner, in order to stably stop the robot, processing in consideration of the shape of the traveling surface of the robot and the state of the robot is required.
Patent Document 1: Japanese Patent No. 3687076
The present disclosure has been made in view of the above problems, for example, and an object of the present disclosure is to provide a moving apparatus and a moving apparatus control method that achieve control for stopping a walking (leg-driven) robot in a suitable posture according to a shape of a traveling surface and a state of the robot.
A first aspect of the present disclosure is
Furthermore, a second aspect of the present disclosure is
Other objects, features, and advantages of the present disclosure will become apparent from a more detailed description based on an embodiment of the present disclosure described later and the accompanying drawings. Note that a system herein described is a logical set configuration of a plurality of devices, and is not limited to a system in which devices with respective configurations are in the same housing.
A configuration of an embodiment of the present disclosure achieves a configuration in which a suitable posture according to a shape of a traveling surface of a robot is stored in a storage, the suitable posture according to the shape of the traveling surface is acquired from the storage when the robot is to be stopped, and posture control is performed to stop the robot.
Specifically, for example, a suitable stop posture analyzer that analyzes the suitable stop posture of a walking robot executes processing of analyzing the shape of the traveling surface of the walking robot, stop posture analysis processing of analyzing stop postures according to the shape of the traveling surface, stop posture evaluation value calculation processing of calculating evaluation values of the stop postures, and processing of recording, in a storage, a stop posture having a highest evaluation value among the stop postures according to the traveling surface shape. Furthermore, a stop control unit acquires the suitable stop posture corresponding to a traveling surface shape clustering group from the storage, and performs posture control to match a posture of the robot to the suitable stop posture and stop the robot.
This configuration achieves a configuration in which the suitable posture according to the shape of the traveling surface of the robot is stored in the storage, the suitable posture according to the shape of the traveling surface is acquired from the storage when the robot is to be stopped, and the posture control is performed to stop the robot.
Note that the effects herein described are only examples and are not restrictive, and additional effects may also be provided.
Hereinafter, details of a moving apparatus and a moving apparatus control method of the present disclosure will be described with reference to the drawings. Note that the description will be made in accordance with the following items.
First, an overview of a robot apparatus of the present disclosure will be described with reference to
The robot apparatus 10 illustrated in
It is possible to perform leg movement, that is, walking travel by individually lifting each of the four legs and moving the legs back and forth.
As illustrated in
Furthermore, a plurality of sensors is attached to the robot apparatus 10.
The camera sensor 13 captures an image of a surrounding environment of the robot apparatus 10. The captured image is used to analyze a position of an obstacle, a shape of a robot traveling surface, and the like.
The IMU sensor 14 is an inertial measurement device, and acquires information for analyzing an inclination, acceleration, moving speed, and the like of the robot apparatus 10.
The leg end pressure sensor 15 is attached to an end of each of the four legs, and is used to measure whether or not each leg is landed and further measure a weight load amount of each leg in a landed state.
Note that the example of the sensors illustrated in
Note that both the LiDAR and TOF sensors are sensors capable of measuring an object distance.
Furthermore, a temperature sensor for measuring a temperature of a motor of each leg corresponding to a drive unit of the robot apparatus 10, a torque sensor for measuring a torque (load) of the motor of each leg, and the like are also individually attached to each motor unit.
A detailed configuration example of the robot apparatus 10 illustrated in
As understood from (a) the front view and (c) the rear view, an upper end of each of the leg portions 12 with respect to the body 11, that is, a leg attachment portion is pivotable in the left-right direction (around a roll axis) when viewed from the front surface and the rear surface of the robot. This pivot is performed by driving of a roll axis motor 21.
Furthermore, as understood from (b) the side view, an upper end portion of each of the leg portions 12 is also pivotable in the front-rear direction. This pivot in the front-rear direction is performed by driving of a hip axis motor 22.
In such a manner, the upper end portion of each of the leg portions 12 is pivotable with respect to the body 11 in both the front-rear direction and the left-right direction.
Furthermore, as understood from (b) the side view, a so-called knee joint portion is provided in a middle of each leg, and a lower half of each of the leg portions 12 is pivotable in the front-rear direction with respect to an upper portion of each leg. This pivot is performed by driving of a knee axis motor 23.
To a motor of each leg corresponding to a drive unit of the robot apparatus 10, a temperature sensor for measuring a temperature and a torque sensor for measuring a torque (load) of the motor of each leg are individually attached.
That is, a temperature sensor 21a and a torque sensor 21b are individually provided for each of the four roll axis motors 21 for the respective four legs, and are configured to measure and output a heat generation amount and a load (torque) at the time of driving the roll axis motor 21 when each leg is pivotally driven in the left-right direction.
In addition, a temperature sensor 22a and a torque sensor 22b are individually provided for each of the four hip axis motors 22 for the respective four legs, and are configured to measure and output a heat generation amount and a load (torque) at the time of driving the hip axis motor 22 when each leg is pivotally driven in the front-rear direction.
Similarly, a temperature sensor 23a and a torque sensor 23b are individually provided for each of the four knee axis motors 23 for the respective four legs, and are configured to measure and output a heat generation amount and a load (torque) at the time of driving the knee axis motor 23 when a lower half of each leg is pivotally driven in the front-rear direction.
Detection values of the temperature sensor and the torque sensor belonging to each motor are input to a control unit inside the robot apparatus 10, and are used, for example, for calculation of an evaluation value of stability of a stop posture and the like when the robot apparatus stops.
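As a concrete picture of these inputs, the per-motor detection values could be grouped into simple records like the following. This is a minimal sketch with assumed names (`MotorSensorReading`, `collect_readings`); the actual internal data format of the control unit is not described in the present disclosure.

```python
from dataclasses import dataclass

@dataclass
class MotorSensorReading:
    """Detection values for one joint motor (roll, hip, or knee axis)."""
    leg_id: int           # 0..3 for a four-legged robot
    axis: str             # "roll", "hip", or "knee"
    temperature_c: float  # temperature sensor value
    torque_nm: float      # torque (load) sensor value

def collect_readings(raw_tuples):
    """Gather the per-motor readings that the control unit uses, e.g.
    when evaluating the stability of a candidate stop posture."""
    return [MotorSensorReading(*t) for t in raw_tuples]

# Example readings for the three motors of one leg (values are illustrative).
readings = collect_readings([
    (0, "roll", 41.5, 2.3),
    (0, "hip", 44.0, 3.1),
    (0, "knee", 39.2, 1.8),
])
max_temp = max(r.temperature_c for r in readings)
```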
Details of this processing will be described later.
Although the robot apparatus 10 can autonomously travel under the control of the robot apparatus 10 itself, for example, as illustrated in
Note that a control device (information processing device) for controlling the robot is mounted on the body 11 of the robot apparatus 10. The control device inputs detection signals from various sensors and encoders attached to the robot apparatus 10, analyzes a position (three-dimensional position) and movement of each leg by analyzing the input signals, and drives and controls each leg on the basis of the analysis result.
Note that the robot apparatus 10 described with reference to
For example,
The robot apparatus 10 illustrated in
It is possible to perform walking travel by lifting each of the legs and moving the legs back and forth.
Similarly to the four-legged robot apparatus 10 described above with reference to
A detailed configuration of the robot apparatus 10 illustrated in
Similarly to the above description with reference to
The pivot in the left-right direction is performed by driving of a roll axis motor 21.
The pivot in the front-rear direction is performed by driving of a hip axis motor 22.
Furthermore, as understood from (b) the side view, a so-called knee joint portion is provided in a middle of each leg, and a lower half of each of the leg portions 12 is pivotable in the front-rear direction with respect to an upper portion of each leg. This pivot is performed by driving of a knee axis motor 23.
To a motor of each leg corresponding to a drive unit of the robot apparatus 10, a temperature sensor for measuring a temperature and a torque sensor for measuring a torque (load) of the motor of each leg are individually attached.
That is, a temperature sensor 21a and a torque sensor 21b are individually provided for each of the six roll axis motors 21 for the respective six legs.
In addition, a temperature sensor 22a and a torque sensor 22b are individually provided for each of the six hip axis motors 22 for the respective six legs.
Similarly, a temperature sensor 23a and a torque sensor 23b are individually provided for each of the six knee axis motors 23 for each of the six legs.
These sensors measure a heat generation amount and a load (torque) of each motor when each leg is driven.
Detection values of the temperature sensor and the torque sensor belonging to each motor are input to a control unit inside the robot apparatus 10, and are used, for example, for calculation of an evaluation value of stability of a stop posture and the like when the robot apparatus stops.
Details of this processing will be described later.
The configuration example of the robot apparatus 10 of the present disclosure has been described with reference to
The robot apparatus of the present disclosure is a walking robot capable of performing leg movement, that is, walking travel by lifting a plurality of legs and moving the legs back and forth in such a manner.
Next, a specific example of a travel environment of the robot apparatus will be described.
Note that, hereinafter, an embodiment to which the four-legged robot 10 described with reference to
As an environment in which the four-legged robot 10 moves, for example, there are various obstacles as illustrated in
The four-legged robot 10 repeats traveling and stopping while avoiding obstacles in such an environment, for example. There are various environments of the traveling surface at the time of stopping. It is necessary to stop at positions of various traveling surfaces such as a rough surface, a stepped surface, and an inclined surface.
As described above, the robot apparatus 10 needs to have at least three legs landed on the traveling surface in order to stop stably. A polygon configured by three or more landing points is referred to as a support polygon.
For example, in a case where the robot apparatus 10 cannot form a support polygon, there is a possibility that the robot apparatus loses balance and falls down to cause damage or failure.
In a case where the robot apparatus 10 stops in the environment as illustrated in
However, the leg arrangement and the robot posture capable of maintaining a stable stationary state differ depending on the shape of the traveling surface at the position where the robot apparatus 10 stops.
A plurality of examples of the shape of the traveling surface at the position where the robot apparatus 10 stops will be described with reference to
As illustrated in
In this case, the robot apparatus 10 of the present disclosure determines a leg arrangement capable of forming the support polygon configured by landing at least three legs at a robot position illustrated in
Furthermore, the robot apparatus 10 of the present disclosure executes posture control for changing the robot posture to the determined robot posture and performs processing of stopping the robot apparatus 10.
In this case, the robot apparatus 10 of the present disclosure determines a leg arrangement capable of forming the support polygon configured by landing at least three legs at a robot position illustrated in
Furthermore, the robot apparatus 10 of the present disclosure executes posture control for changing the robot posture to the determined robot posture and performs processing of stopping the robot apparatus 10.
In this case, the robot apparatus 10 of the present disclosure determines a leg arrangement capable of forming the support polygon configured by landing at least three legs at a robot position illustrated in
Furthermore, the robot apparatus 10 of the present disclosure executes posture control for changing the robot posture to the determined robot posture and performs processing of stopping the robot apparatus 10.
In some other cases, the robot apparatus 10 receives a stop command at a position where the traveling surface has one of various different shapes, such as, for example, an inclined surface as illustrated in
The robot apparatus 10 of the present disclosure performs processing of determining the robot posture including the leg arrangement capable of maintaining a stable stationary state in accordance with each shape of the traveling surfaces, performing the posture control for changing the robot posture to the determined robot posture, and stopping.
In such a manner, there is a possibility that the robot apparatus 10 travels on traveling surfaces having various shapes, and receives stop commands at various timings. Therefore, the robot apparatus needs to stop on traveling surfaces having various shapes.
However, as described above, the robot posture that can maintain a stable stationary state differs depending on the shape of the traveling surface.
When receiving the stop command during traveling, the robot apparatus 10 needs to perform processing of determining a highly stable posture according to the shape of the traveling surface, changing its posture to the determined posture, and stopping. However, it takes time to analyze the shape of the traveling surface at the time when the robot apparatus receives the stop command and to calculate an optimum stable posture according to the analysis result.
When such a time loss occurs, the robot apparatus 10 might not be able to be stopped at a target position. On the other hand, when the robot apparatus 10 is forcibly stopped at the target position, the robot apparatus 10 might stop in an unstable posture and fall.
The present disclosure solves such a problem, and executes processing of causing the robot apparatus 10 to travel while analyzing the shape of the traveling surface, causing the robot apparatus 10 to calculate a highly stable stop posture according to the shape of the traveling surface, and registering information of the calculated stop posture and an evaluation value of the information in the storage.
In a case where the robot apparatus 10 receives the stop command, processing is performed in which stop posture information having a high evaluation value according to the shape of the traveling surface at the time of receiving the stop command is acquired from the storage, the stop posture is set, and the robot apparatus is stopped.
By performing such processing, it is possible to immediately execute posture control to different stable stop postures according to various shapes of traveling surfaces, and stably stop the robot apparatus 10 at the target position without a time loss.
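The division of labor just described amounts to a small write/read protocol over the storage: the analyzer writes the best posture found so far per traveling surface group while traveling, and the stop control unit reads it back on a stop command. The sketch below is hypothetical (the dictionary-based storage and function names are assumptions, not the disclosed implementation), but it shows why serving a stop command needs no fresh analysis.

```python
# Hypothetical sketch of the store-and-look-up principle.
storage = {}  # clustering group id -> (evaluation value, stop posture)

def record_stop_posture(group_id, evaluation_value, posture):
    """Keep only the posture with the highest evaluation value per group."""
    best = storage.get(group_id)
    if best is None or evaluation_value > best[0]:
        storage[group_id] = (evaluation_value, posture)

def on_stop_command(current_group_id):
    """Return the pre-computed suitable stop posture for the current
    traveling surface shape, or None if no posture is registered yet."""
    entry = storage.get(current_group_id)
    return entry[1] if entry else None

# While traveling: two candidate postures observed for group "C11".
record_stop_posture("C11", 0.72, {"legs": "spread", "body_pitch": 0.0})
record_stop_posture("C11", 0.55, {"legs": "narrow", "body_pitch": 0.0})
# On a stop command: an immediate lookup, no time-consuming analysis.
posture = on_stop_command("C11")
```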
Hereinafter, specific examples of the configuration and processing of the robot apparatus of the present disclosure will be described.
Next, a configuration example of the robot apparatus of the present disclosure will be described.
As illustrated in
Note that the suitable stop posture analyzer 110 includes a traveling surface shape analyzer 111, a stop posture analyzer 112, a stop posture evaluation value calculator 113, and a suitable stop posture record updater 114, and the stop control unit 120 includes a suitable stop posture acquirer 121 and a stop posture control unit 122.
Note that the storage 105 records each of the following data:
These data will be described in detail later.
Note that the travel control unit 102 also includes processors such as a traveling surface shape analyzer and a traveling posture control unit. However, since conventional configurations are applicable to these processors, they are not used for the processing of the present disclosure and are omitted from the description.
The processing of the present disclosure includes processing of calculating a highly stable stop posture and a posture evaluation value according to the shape of the traveling surface and recording them in the storage 105, and processing of controlling the stop posture of the robot apparatus 10 by applying the data recorded in the storage 105. These processes are executed by the suitable stop posture analyzer 110 and the stop control unit 120 of the robot apparatus 10 illustrated in
Each component of the robot apparatus 10 illustrated in
The communication unit 101 executes, for example, communication with the user terminal 60 held by the user 50.
The user terminal 60 outputs a travel start command and a stop command to the robot apparatus 10. Alternatively, the user terminal 60 also provides travel route information and the like.
The communication unit 101 of the robot apparatus 10 receives these pieces of information from the user terminal 60, and outputs them to the travel control unit 102 and the suitable stop posture analyzer 110.
The travel control unit 102 and the suitable stop posture analyzer 110 execute travel control and stop control in accordance with reception information from the user terminal 60.
The travel control unit 102 controls the drive unit 103 in accordance with the reception information from the user terminal 60 to cause the robot apparatus 10 to travel.
Note that, as described above, the travel control unit 102 includes processors such as a traveling surface shape analyzer that analyzes the shape of the robot traveling surface, and a traveling posture control unit that controls the robot posture such as the leg arrangement during traveling of the robot and an orientation and inclination of the robot. However, since the conventional configurations are applicable, these processors are not used for the processing of the present disclosure and will not be described.
Specifically, the drive unit 103 includes, for example, a motor for driving each leg of the robot apparatus 10, and the like. The motor includes motors such as the roll axis motor 21, the hip axis motor 22, and the knee axis motor 23 described above with reference to
The sensor group 104 includes a camera, an IMU, a temperature sensor, a torque sensor, a pressure sensor, and the like.
The camera, the IMU, and the pressure sensor are the sensors described with reference to
As described above with reference to
Furthermore, the IMU sensor 14 illustrated in
The temperature sensor and the torque sensor are the sensors described above with reference to
That is, the temperature sensor 21a and the torque sensor 21b are individually provided for each of the roll axis motors 21 of the respective legs as described with reference to
In addition, a temperature sensor 22a and a torque sensor 22b are individually provided for each of the hip axis motors 22 for the respective legs.
Similarly, the temperature sensor 23a and the torque sensor 23b are individually provided for each of the knee axis motors 23 for the respective legs.
These sensors measure a heat generation amount and a load (torque) of each motor when each leg is driven.
Detection values of the sensors constituting the sensor group 104 are input to the travel control unit 102 and the suitable stop posture analyzer 110, and are used for travel control processing and stop control processing.
Note that various other sensors can be used as the sensors constituting the sensor group 104. For example, any of a stereo camera, an omnidirectional camera, an infrared camera, a light detection and ranging (LiDAR) sensor, a time of flight (TOF) sensor, or the like, or a combination thereof can be used.
Next, each component of the suitable stop posture analyzer 110 will be described.
The suitable stop posture analyzer 110 executes processing of calculating a stop posture and a posture evaluation value with high stability according to the shape of the traveling surface and recording the stop posture and the posture evaluation value in the storage 105.
Note that this processing can be executed without stopping the robot apparatus 10, and can be executed while the robot apparatus 10 is traveling.
The suitable stop posture analyzer 110 includes the traveling surface shape analyzer 111, the stop posture analyzer 112, the stop posture evaluation value calculator 113, and the suitable stop posture record updater 114.
First, processing executed by the traveling surface shape analyzer 111 will be described.
The traveling surface shape analyzer 111 performs processing of analyzing the shape of the traveling surface of the robot apparatus 10.
For example, processing of analyzing the shape of the traveling surface of the robot apparatus 10 is performed on the basis of an image captured by the camera and input from the sensor group 104, the inclination of the robot as detected by the IMU, the landing state and weight load amount of each leg detected by the pressure sensor at each leg end, and the like.
Note that traveling surface shape analysis processing executed by the traveling surface shape analyzer 111 of the suitable stop posture analyzer 110 is not highly accurate analysis processing of the traveling surface on which the robot apparatus 10 travels but rough classification processing of the shape of the traveling surface. That is, clustering processing of roughly classifying the type of the shape of the traveling surface on which the robot apparatus 10 is traveling is executed.
The traveling surface shape analyzer 111 refers to data for traveling surface shape clustering stored in the storage 105 and executes the clustering processing of the shape of the traveling surface on which the robot apparatus 10 is traveling.
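This clustering can be pictured as a coarse lookup over inclination-angle bands. The sketch below is hypothetical: the band boundaries and the group-identifier scheme (C11, C21, ...) are assumptions for illustration, while the real boundaries come from the data for traveling surface shape clustering in the storage 105.

```python
def classify_traveling_surface(front_rear_deg, left_right_deg):
    """Map a pair of coarse inclination angles (degrees) to a clustering
    group identifier. The band boundaries below are assumed; the actual
    ranges are defined by the stored clustering data."""
    def band(angle_deg):
        if 0 <= angle_deg <= 10:
            return 1
        if -10 <= angle_deg < 0:
            return 2
        return 3  # steeper or irregular surfaces

    return f"C{band(front_rear_deg)}{band(left_right_deg)}"

# A nearly flat surface rising slightly ahead and to the left.
group = classify_traveling_surface(5.0, 3.0)
```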
A specific example of classification data of the shape of the traveling surface (data for traveling surface shape clustering) applied to the clustering processing (classification processing) of the shape of the traveling surface executed by the traveling surface shape analyzer 111 will be described with reference to
As illustrated in
For example, the clustering groups (C11 to C13) are groups of inclined surface shapes in which the front-rear direction inclination angle of the shape of the traveling surface is in a range of 0° to +10°.
Note that the front-rear direction inclination angle=0° means that the traveling surface of the robot apparatus 10 is horizontal without inclination in the front-rear direction of the robot.
The front-rear direction inclination angle=+10° means that the inclination angle of the traveling surface of the robot apparatus 10 in the front-rear direction of the robot is +10°, that is, there is an inclined surface or a step that rises in the forward direction in which the robot apparatus 10 advances.
For example, the front-rear direction inclination angle is calculated as the angle between a horizontal line and the straight line connecting, along the advancing direction of the robot apparatus 10, the landing position of the foremost leg and the landing position of the rearmost leg.
In this processing example, calculation processing of the “front-rear direction inclination angle” is similarly performed in both the case of the traveling surface having the step illustrated in
However, this clustering processing example is an example, and clustering processing in which a step and an inclined surface are distinguished may be performed.
Note that, in a case where the traveling surface slopes downward in the forward direction in which the robot apparatus 10 advances, or there is a step that descends in the forward direction, the inclination angle is set to a negative value such as −10°.
For example, if the landing position of the leftmost leg and the landing position of the rightmost leg are at the same height, the inclination angle in the left-right direction=0°.
In a case where the landing position of the leftmost leg is higher than the landing position of the rightmost leg, the inclination angle in the left-right direction is a value of (+), and in a case where the landing position of the leftmost leg is lower than the landing position of the rightmost leg, the inclination angle in the left-right direction is a value of (−).
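Under these conventions, both angles reduce to the slope of the line between two landing positions. A minimal sketch follows, assuming landing positions expressed as (horizontal offset, height) pairs in meters; the coordinate handling in the actual apparatus is not specified in the present disclosure.

```python
import math

def inclination_deg(point_a, point_b):
    """Angle in degrees between the horizontal and the straight line
    connecting two leg landing positions, each (horizontal_m, height_m).
    Positive when point_a is higher than point_b."""
    dh = point_a[0] - point_b[0]
    dz = point_a[1] - point_b[1]
    return math.degrees(math.atan2(dz, abs(dh)))

# Front-rear direction: foremost vs. rearmost landing position. A surface
# that rises in the advancing direction yields a positive angle.
front_rear = inclination_deg((0.6, 0.1), (0.0, 0.0))

# Left-right direction: leftmost vs. rightmost landing position. A higher
# leftmost leg yields a (+) value, per the convention in the text.
left_right = inclination_deg((0.4, 0.0), (0.0, 0.0))
```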
For example, in the data for traveling surface shape clustering illustrated in
For example, the traveling surface shape analyzer 111 performs the processing of analyzing the shape of the traveling surface of the robot apparatus 10 on the basis of an image captured by the camera and input from the sensor group 104, the inclination of the robot as detected by the IMU, the landing state and weight load amount of each leg detected by the pressure sensor at each leg end, and the like.
As a result of this analysis processing, in a case where it is determined that the shape of the current traveling surface of the robot apparatus 10 is in the range of the inclination defined in the clustering group C11 described above, that is, in the range of the inclination of
An example of traveling surface shape analysis data generated by the traveling surface shape analyzer 111 will be described with reference to
That is, “(b1) the clustering group” indicates an identifier (C11, C12, C13 . . . ) of each clustering group (classification group) of the shape of the traveling surface.
In the data illustrated in
The data indicates that the shape of the traveling surface for the elapsed time from the start of traveling of the robot apparatus 10=000000 to 001233 has the shape of the traveling surface corresponding to the traveling surface shape clustering data indicated by
The second entry data of the data illustrated in
The data indicates that the shape of the traveling surface for the elapsed time from the start of traveling of the robot apparatus 10=001234 to 002231 has the shape of the traveling surface corresponding to the traveling surface shape clustering data indicated by
In such a manner, the traveling surface shape analyzer 111 analyzes the shape of the traveling surface during traveling of the robot apparatus 10. In a case where a change in the shape of the traveling surface occurs, the traveling surface shape analyzer 111 executes processing of analyzing the clustering group corresponding to the changed shape of the traveling surface.
The traveling surface shape clustering data including the clustering group analyzed by the traveling surface shape analyzer 111 is output to the suitable stop posture record updater 114 and the suitable stop posture acquirer 121 of the stop control unit 120.
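The resulting analysis data can be pictured as a list of entries, each pairing a clustering group with the elapsed-time range during which that surface shape was observed. The following is a hypothetical sketch; the function and field names are assumptions, and the actual stored format is the one described with reference to the figures.

```python
# Each entry: clustering group plus the elapsed-time range (from the
# start of traveling) during which that surface shape was observed.
analysis_log = []

def log_surface_change(elapsed_time, group_id):
    """Close the previous entry and open a new one when the clustering
    group of the traveling surface changes."""
    if analysis_log and analysis_log[-1]["group"] == group_id:
        return  # no change in surface shape
    if analysis_log:
        analysis_log[-1]["end"] = elapsed_time - 1
    analysis_log.append({"group": group_id, "start": elapsed_time, "end": None})

log_surface_change(0, "C11")
log_surface_change(500, "C11")   # same shape: log unchanged
log_surface_change(1234, "C32")  # shape changed: previous entry closed
```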
Next, processing executed by the stop posture analyzer 112 and the stop posture evaluation value calculator 113 will be described.
The stop posture analyzer 112 analyzes the posture of the robot apparatus 10 at the time of stopping, and acquires stop posture information.
The stop posture evaluation value calculator 113 evaluates the stop posture analyzed by the stop posture analyzer 112 and calculates an evaluation value (stop posture evaluation value).
The stop posture analyzer 112 determines whether or not at least three legs are landed and a support polygon is formed while the robot apparatus 10 is traveling. In a case where it determines that the support polygon is formed, the stop posture analyzer 112 analyzes the posture at the timing when the support polygon is formed as the "stop posture".
Furthermore, the stop posture evaluation value calculator 113 calculates an evaluation value of the “stop posture”.
Note that the above processing is executed without stopping the robot apparatus 10.
The stop posture analyzer 112 analyzes, as the “stop posture”, the posture of the robot apparatus 10 at the timing when at least three legs are landed and the support polygon is formed while the robot apparatus 10 is traveling.
When at least three legs are landed and the support polygon is formed, the robot apparatus 10 can be stopped. Thus, the stop posture analyzer 112 analyzes the posture at this timing as the “stop posture” without actually stopping the robot apparatus 10.
The stop posture analyzer 112 acquires detection values of various sensors input from the sensor group 104 at the timing when the support polygon is formed, and analyzes the posture at that timing as the “stop posture”. The stop posture evaluation value calculator 113 further calculates an evaluation value (stop posture evaluation value) by applying a predefined evaluation value calculation algorithm to the “stop posture”.
The stop posture analyzer 112 and the stop posture evaluation value calculator 113 analyze the “stop posture” on the basis of, for example, the inclination of the robot which is a detection value of the IMU, the load of the motor analyzed from the detection values of the temperature sensor and the torque sensor belonging to each motor, the landing and the weight amount of each leg detected from the pressure sensor of each leg end, and the like, and calculate the “stop posture evaluation value”.
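The support polygon determination described above can be sketched in code. This is a minimal, hypothetical illustration only: the function name, the pressure threshold, and the sensor data format are assumptions and not the actual on-robot implementation.

```python
# Hypothetical sketch: determine whether at least three legs are landed,
# i.e. whether a support polygon is formed, from leg-end pressure sensors.
# The threshold value and the data format are assumptions.

PRESSURE_THRESHOLD = 0.5  # assumed minimum leg-end pressure counted as "landed"

def support_polygon_formed(leg_end_pressures):
    """leg_end_pressures: dict mapping leg name (FL/FR/BL/BR) to pressure value."""
    landed = [leg for leg, p in leg_end_pressures.items() if p >= PRESSURE_THRESHOLD]
    return len(landed) >= 3

# Example: three of the four legs carry load, so a support polygon is formed.
pressures = {"FL": 1.2, "FR": 0.0, "BL": 0.9, "BR": 1.1}
print(support_polygon_formed(pressures))  # True
```

In this sketch the check runs on every sensor update while the robot travels, so the "stop posture" can be captured without actually stopping the robot, as described above.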
The “stop posture” analyzed by the stop posture analyzer 112 includes, for example, each of the following data:
The stop posture analyzer 112 analyzes the detection value of the sensor input from the sensor group 104 at the timing when at least three legs are landed and a support polygon is formed while the robot apparatus 10 is traveling, and calculates each data (stop posture data) of (a) to (c) described above.
Specific examples of the stop posture data (a) to (c) will be described with reference to
It is assumed that the stop posture analyzer 112 of the robot apparatus 10 determines that at least three legs are landed and the support polygon is formed at the timing illustrated in
As illustrated in
The stop posture analyzer 112 generates stop posture data as illustrated in
As described above, “(2a) the inclination of the body (trunk) (inclination in each of the front-rear direction and the left-right direction)” records the roll angle of the body (trunk) (=inclination of the body (trunk) in the left-right direction) and the pitch angle of the body (trunk) (=inclination of the body (trunk) in the front-rear direction).
For example, the data (aa°, bb°) of the support polygon detection timing=000233 indicates that the roll angle of the body (trunk) is aa° and the pitch angle of the body (trunk) is bb°.
FL represents the left front leg, FR represents the right front leg, BL represents the left rear leg, and BR represents the right rear leg. Three angle data in ( ) indicate rotation angle data about the roll axis, the pitch axis, and the knee axis of each leg.
The “(2c) whether or not each leg is landed and the pressure of the leg end” records whether or not each of the four legs is landed and the leg end pressure of each of the four legs.
FL is the left front leg, FR is the right front leg, BL is the left rear leg, and BR is the right rear leg, and the two data in ( ) include the preceding data indicating whether or not the leg is landed (1=landed, 0=not landed) and the succeeding data indicating the leg end pressure at the time of landing.
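The stop posture data (2a) to (2c) described above might be held in a record such as the following. This is a sketch under assumptions: the field names, units, and types are hypothetical, not the actual data format.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

# Hypothetical sketch of a "stop posture" record holding the data (2a)-(2c).
# Field names and types are assumptions, not an actual on-robot format.

@dataclass
class StopPosture:
    timing: int                                          # support polygon detection timing
    body_inclination: Tuple[float, float]                # (2a) (roll, pitch) of the body, degrees
    joint_angles: Dict[str, Tuple[float, float, float]]  # (2b) leg -> (roll, pitch, knee) angles
    leg_contact: Dict[str, Tuple[int, float]]            # (2c) leg -> (1=landed/0=not, pressure)

posture = StopPosture(
    timing=233,
    body_inclination=(2.0, -1.5),
    joint_angles={"FL": (10.0, 35.0, -60.0), "FR": (9.5, 34.0, -58.0),
                  "BL": (-8.0, 40.0, -65.0), "BR": (-7.5, 41.0, -64.0)},
    leg_contact={"FL": (1, 1.2), "FR": (1, 1.1), "BL": (1, 0.9), "BR": (0, 0.0)},
)
landed_count = sum(flag for flag, _ in posture.leg_contact.values())
print(landed_count)  # 3
```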
In such a manner, in a case where the stop posture analyzer 112 of the robot apparatus 10 determines that at least three legs are landed while the robot apparatus 10 is traveling and the support polygon is formed, the stop posture analyzer analyzes the stop posture at this timing and sequentially generates stop posture data as illustrated in
The stop posture data generated by the stop posture analyzer 112 is output to the stop posture evaluation value calculator 113.
The stop posture evaluation value calculator 113 inputs the stop posture data generated by the stop posture analyzer 112 and calculates an evaluation value of the stop posture.
The stop posture evaluation value calculator 113 performs calculation processing of the evaluation value (stop posture evaluation value) by applying a predefined evaluation value calculation algorithm on the basis of the stop posture data generated by the stop posture analyzer 112, that is, the stop posture data including each data of
As described above, the stop posture evaluation value calculator 113 calculates an evaluation value of the posture at the timing when at least three legs are landed and the support polygon is formed while the robot apparatus 10 is traveling, that is, the stop posture analyzed by the stop posture analyzer 112.
The stop posture evaluation value calculator 113 calculates an evaluation value by applying an evaluation value calculation algorithm defined in advance by using detection values of various sensors input from the sensor group 104 at the timing when the support polygon is formed.
For example, as illustrated in
Note that the stop posture evaluation value calculated by the stop posture evaluation value calculator 113 is an evaluation value that is higher as the stability of the posture of the robot apparatus 10 is higher and as the consumed energy calculated from the load of the motors and the like is lower.
As an algorithm for calculating the stop posture evaluation value by the stop posture evaluation value calculator 113, various algorithms are applicable.
For example, the following algorithms for calculating the stop posture evaluation value can be used:
(Example 1) an evaluation value calculation algorithm based on a total sum of torques which are detection values of torque sensors respectively belonging to the motors of the legs;
(Example 2) an evaluation value calculation algorithm based on a total sum of temperatures of the motors, which are detection values of the temperature sensors respectively belonging to the motors of the legs;
(Example 3) an evaluation value calculation algorithm based on a load balance of the leg calculated on the basis of the leg end pressure value which is a detection value of the leg end pressure sensor of each leg; and
(Example 4) an evaluation value calculation algorithm based on a rotation angle of the motor of each leg and a load balance of the leg calculated on the basis of the leg end pressure value which is a detection value of the leg end pressure sensor of each leg.
(Example 1) the evaluation value calculation algorithm based on the total sum of torques which are detection values of the torque sensors respectively belonging to the motors of the legs.
The graph illustrated in
The stop posture evaluation value increases as the total sum of the motor torques decreases, and the stop posture evaluation value decreases as the total sum of the motor torques increases.
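The monotone relationship of (Example 1) can be sketched as follows. The mapping 1/(1 + total torque) is an assumed example of a function that increases as the torque sum decreases; the actual algorithm is not specified here.

```python
# Hypothetical sketch of (Example 1): an evaluation value that increases as the
# total sum of the leg-motor torques decreases. The mapping 1/(1 + sum) is an
# assumed monotone function, not the actual algorithm.

def torque_sum_evaluation(motor_torques):
    """motor_torques: iterable of torque sensor values for all leg motors."""
    total = sum(abs(t) for t in motor_torques)
    return 1.0 / (1.0 + total)  # higher value for a lower total torque

relaxed = torque_sum_evaluation([0.1, 0.2, 0.1, 0.2])    # low motor load
strained = torque_sum_evaluation([1.5, 2.0, 1.8, 2.2])   # high motor load
print(relaxed > strained)  # True
```

The same shape of function could be applied to the motor temperature sum of (Example 2) by substituting temperature sensor values for the torques.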
(Example 2) the evaluation value calculation algorithm based on the total sum of temperatures of the motors, which are detection values of the temperature sensors respectively belonging to the motors of the legs.
The graph illustrated in
The stop posture evaluation value increases as the total sum of the motor temperatures decreases, and the stop posture evaluation value decreases as the total sum of the motor temperatures increases.
(Example 3) the evaluation value calculation algorithm based on the load balance of the leg calculated on the basis of the leg end pressure value which is the detection value of the leg end pressure sensor of each leg, and
(Example 4) the evaluation value calculation algorithm based on the rotation angle of the motor of each leg and the load balance of the leg calculated on the basis of the leg end pressure value which is the detection value of the leg end pressure sensor of each leg.
The graph illustrated in
The stop posture evaluation value decreases as the load balance worsens, and increases as the load balance improves (for example, when the load is applied equally to each leg).
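The load balance evaluation of (Example 3) can be sketched by measuring how evenly the leg-end pressures are distributed. Using the population standard deviation of the pressures as the balance measure is an assumption for illustration.

```python
from statistics import pstdev

# Hypothetical sketch of (Example 3): an evaluation value based on the load
# balance of the legs, computed here from the spread of the leg-end pressure
# values. The standard-deviation-based measure is an assumption.

def load_balance_evaluation(leg_end_pressures):
    """leg_end_pressures: leg-end pressure values of the landed legs."""
    spread = pstdev(leg_end_pressures)
    return 1.0 / (1.0 + spread)  # higher value when the load is applied equally

balanced = load_balance_evaluation([1.0, 1.0, 1.0, 1.0])      # equal load
unbalanced = load_balance_evaluation([2.5, 0.2, 1.1, 0.2])    # uneven load
print(balanced > unbalanced)  # True
```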
Note that, as the algorithm for calculating the stop posture evaluation value by the stop posture evaluation value calculator 113, various algorithms as described above in (Example 1) to (Example 4) are applicable.
Furthermore, a final stop posture evaluation value may be calculated by weighted addition of the evaluation values respectively calculated by the algorithms of (Example 1) to (Example 4) described above.
For example, in a case where the evaluation values calculated by respectively applying the algorithms of (Example 1) to (Example 4) described above are V1 to V4, respectively, a final stop posture evaluation value Vall is calculated in accordance with the following (Equation 1).
Vall=α·V1+β·V2+γ·V3+δ·V4 (Equation 1)
Note that α, β, γ, and δ are weighting coefficients.
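(Equation 1) can be written directly in code. The weight values below are placeholders; the actual coefficients α, β, γ, and δ would be chosen for the specific robot.

```python
# A minimal sketch of (Equation 1): the final stop posture evaluation value as
# a weighted sum of the per-algorithm evaluation values V1 to V4.
# The weighting coefficients used here are placeholder values.

def combined_evaluation(v1, v2, v3, v4, alpha=0.25, beta=0.25, gamma=0.25, delta=0.25):
    return alpha * v1 + beta * v2 + gamma * v3 + delta * v4

v_all = combined_evaluation(0.8, 0.6, 0.9, 0.7)
print(v_all)  # 0.75
```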
In such a manner, the stop posture evaluation value calculator 113 calculates an evaluation value of the posture at the timing when at least three legs are landed and the support polygon is formed while the robot apparatus 10 is traveling, that is, the stop posture analyzed by the stop posture analyzer 112.
That is, the data corresponds to the data of
That is, each of the following data is included:
In such a manner, the stop posture evaluation value calculator 113 calculates an evaluation value (stop posture evaluation value) corresponding to the “stop posture” at the timing when the stop posture analyzer 112 determines that at least three legs are landed and a support polygon is formed while the robot apparatus 10 is traveling.
As described above, the stop posture analyzer 112 and the stop posture evaluation value calculator 113 analyze the stop posture at the timing when at least three legs are landed and a support polygon is formed while the robot apparatus 10 is traveling, and calculate the evaluation value of the stop posture. Note that the above processing is executed at the first timing when the formation of a support polygon is confirmed after a change in the shape of the traveling surface is detected during traveling of the robot apparatus 10.
Note that a specific sequence of the processing will be described later with reference to flowcharts.
The stop posture information and the evaluation value (stop posture evaluation value) generated and calculated by the stop posture analyzer 112 and the stop posture evaluation value calculator 113 are output to the suitable stop posture record updater 114.
Next, processing executed by the suitable stop posture record updater 114 will be described.
The suitable stop posture record updater 114 inputs each of the following data:
“(2) The stop posture generated by the stop posture analyzer 112 and the stop posture evaluation value calculator 113 and the evaluation value of the stop posture (stop posture evaluation value)” described above are respectively the stop posture and the stop posture evaluation value at the timing when the support polygon is formed while the robot apparatus 10 is traveling.
By using these input data, the suitable stop posture record updater 114 executes processing of recording a “suitable stop posture and evaluation value data corresponding to a traveling surface shape clustering group” in the storage 105 and processing of updating the recorded data.
A data configuration example of the “suitable stop posture and evaluation value data corresponding to the traveling surface shape clustering group” recorded in the storage 105 will be described with reference to
As illustrated in
The above data are recorded by extracting data of generation timings of “(B) the suitable stop posture” and “(C) the stop posture evaluation value” from the data described above with reference to
That is, the “(B) the suitable stop posture” and the “(C) the stop posture evaluation value” are the stop posture and the stop posture evaluation value at the timing when a support polygon is formed while the robot apparatus 10 is traveling, and “(A) the traveling surface shape clustering data” at the same timing is recorded in association. The data is extracted at the same timing by collating (A) the time data of the data illustrated in
The suitable stop posture record updater 114 sequentially records, in the storage 105, the stop posture of the highest evaluation value corresponding to a plurality of different shapes of the traveling surface as “(B) the suitable stop posture” with “(C) the stop posture evaluation value” which is the evaluation value of the stop posture in association with “(A) the traveling surface shape clustering data”.
In a case where a support polygon is detected on a traveling surface having a new shape of a traveling surface on which data is not yet recorded and the “stop posture” and the “stop posture evaluation value” are calculated while the robot apparatus 10 is traveling, an entry in association with the following data:
The suitable stop posture record updater 114 further compares the “stop posture evaluation value” of the data already stored in the storage 105 with the newly calculated “stop posture evaluation value” in a case where the support polygon is detected on the traveling surface having the same shape as the shape of the traveling surface already recorded as the “suitable stop posture and evaluation value data corresponding to the traveling surface shape clustering group” while the robot apparatus 10 is traveling and the “stop posture” and the “stop posture evaluation value” are calculated.
As a result of the comparison processing, in a case where the newly calculated “stop posture evaluation value” is an evaluation value higher than the “stop posture evaluation value” of the data already stored in the storage 105, data update processing of replacing the “suitable stop posture” and the “stop posture evaluation value” already stored in the storage 105 with the data having a higher evaluation is executed.
By performing such data update processing, the “suitable stop posture and evaluation value data corresponding to the traveling surface shape clustering group” having the data configuration illustrated in
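The record and update processing described above, keeping only the highest-evaluated stop posture per traveling surface shape clustering group, can be sketched as follows. The dictionary-based storage and the function name are assumptions for illustration.

```python
# Hypothetical sketch of the suitable stop posture record update: for each
# traveling surface shape clustering group, record a new entry if none exists,
# or replace the entry only when the new evaluation value is higher.

def update_suitable_stop_posture(storage, cluster_group, stop_posture, evaluation_value):
    """storage: dict mapping clustering group -> (suitable stop posture, evaluation value)."""
    recorded = storage.get(cluster_group)
    if recorded is None or evaluation_value > recorded[1]:
        storage[cluster_group] = (stop_posture, evaluation_value)  # record or replace
    return storage

storage = {}
update_suitable_stop_posture(storage, "C01", "posture_a", 0.6)  # new entry is recorded
update_suitable_stop_posture(storage, "C01", "posture_b", 0.8)  # higher value -> replaced
update_suitable_stop_posture(storage, "C01", "posture_c", 0.5)  # lower value -> kept as-is
print(storage["C01"])  # ('posture_b', 0.8)
```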
When executing stop processing of the robot apparatus 10 illustrated in
That is, an entry having “(A) the traveling surface shape clustering data” matching the shape of the traveling surface at the time of execution of the stop processing is selected, “(B) the suitable stop posture” recorded in the selected entry is acquired, the posture control of the robot apparatus is performed so as to have a posture matching the acquired “(B) suitable stop posture”, and the robot apparatus 10 is stopped.
By performing such stop posture control, it is possible to perform stop processing with higher stability.
Processing executed by the stop control unit 120 of the robot apparatus 10 illustrated in
The stop control unit 120 of the robot apparatus 10 starts processing for performing stop processing with reception of a stop command as a trigger via the communication unit 101, for example.
As illustrated in
The suitable stop posture acquirer 121 receives input of current traveling surface shape data (traveling surface shape clustering data) during traveling of the robot apparatus 10 from the traveling surface shape analyzer 111.
The traveling surface shape data (traveling surface shape clustering data) input from the traveling surface shape analyzer 111 is the data described above with reference to
That is, the data is each of the following data illustrated in
Note that the traveling surface shape data (traveling surface shape clustering data) input from the traveling surface shape analyzer 111 by the suitable stop posture acquirer 121 is the traveling surface shape data (traveling surface shape clustering data) at the present time, that is, at the time when the stop command is input.
The suitable stop posture acquirer 121 acquires a clustering group (Cnn) in the traveling surface shape data (traveling surface shape clustering data) input from the traveling surface shape analyzer 111.
Furthermore, entry data of the same clustering group (Cnn) as the acquired clustering group (Cnn) is acquired from the “suitable stop posture and evaluation value data corresponding to the traveling surface shape clustering group” in the storage 105.
That is, the entry of the clustering group (Cnn) corresponding to the shape of the current traveling surface is selected from the “suitable stop posture and evaluation value data corresponding to the traveling surface shape clustering group” described above with reference to
The “suitable stop posture” is a stop posture recorded in the storage 105 as data having the highest “stop posture evaluation value” in travel processing of the robot apparatus 10 so far.
The suitable stop posture acquirer 121 acquires, from the storage 105, the “suitable stop posture” having the highest evaluation, which is data stored in the storage and corresponding to the shape of the current traveling surface of the robot apparatus 10, and outputs the posture information to the stop posture control unit 122.
The stop posture control unit 122 performs posture control by driving the drive unit 103 of the robot apparatus 10 so as to have a posture matching the “suitable stop posture” input from the suitable stop posture acquirer 121, and stops the robot apparatus 10 in a state where the posture matches the “suitable stop posture”.
By performing such stop posture control, it is possible to perform stop processing with higher stability.
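The acquisition step described above, selecting the recorded entry whose clustering group matches the current traveling surface, can be sketched as follows. The storage layout and names continue the assumptions used earlier and are hypothetical.

```python
# Hypothetical sketch of the suitable stop posture acquisition at stop time:
# look up the entry recorded for the current traveling surface shape clustering
# group and return the suitable stop posture it holds.

def acquire_suitable_stop_posture(storage, current_cluster_group):
    """storage: dict mapping clustering group -> (suitable stop posture, evaluation value)."""
    entry = storage.get(current_cluster_group)
    if entry is None:
        return None  # no suitable stop posture recorded yet for this surface shape
    suitable_posture, _evaluation_value = entry
    return suitable_posture

storage = {"C01": ("posture_flat", 0.9), "C02": ("posture_slope", 0.7)}
print(acquire_suitable_stop_posture(storage, "C02"))  # posture_slope
```

The returned posture would then be handed to the posture control step, which drives the robot into that posture before stopping.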
Next, a sequence of the processing executed by the robot apparatus 10 of the present disclosure will be described.
The flowchart shown in
The flowchart shown in
Note that these flowcharts can be implemented under the control of a data processor having a program execution function, such as a CPU, in accordance with a program stored in the storage in the robot apparatus 10.
First, the processing sequence executed by the suitable stop posture analyzer 110 of the robot apparatus 10 illustrated in
Hereinafter, details of processing of each step of the flow illustrated in
First, the robot apparatus 10 determines whether or not a travel execution command is input.
This determination processing is executed, for example, by the travel control unit 102 of the robot apparatus 10 illustrated in
In a case where the input of the travel execution command has been detected, the determination in step S101 is Yes, and the processing proceeds to step S102.
In a case where it is determined in step S101 that the input of the travel execution command is detected, the travel processing of the robot apparatus 10 is started in step S102.
The travel processing of the robot apparatus 10 is executed under the control of the travel control unit 102.
The robot apparatus 10 starts traveling in step S102, and then, the robot apparatus 10 determines whether or not a travel stop command has been input in step S103.
The travel control unit 102 determines, for example, whether or not a travel stop command has been input from the user terminal 60 operated by the user 50 via the communication unit 101.
In a case where the input of the travel stop command has been detected, the determination in step S103 is Yes, and the processing proceeds to step S201.
On the other hand, in a case where the input of the travel stop command has not been detected, the determination in step S103 is No, and the processing proceeds to step S104.
In a case where it is determined in step S103 that the input of the travel stop command is not detected, the robot apparatus 10 executes the processing of step S104 and subsequent steps.
In this case, in step S104, suitable stop posture analysis processing by the suitable stop posture analyzer 110 is started.
The processing of steps S104 to S110 is processing executed by the suitable stop posture analyzer 110, and is processing executed while the robot apparatus 10 is traveling.
First, in step S105, the suitable stop posture analyzer 110 detects the presence or absence of a change in the shape of the traveling surface of the robot apparatus 10.
This processing is executed by the traveling surface shape analyzer 111 of the suitable stop posture analyzer 110 of the robot apparatus 10 illustrated in
The traveling surface shape analyzer 111 inputs and analyzes detection information of various sensors of the sensor group 104, and detects the presence or absence of a change in the shape of the traveling surface of the robot apparatus 10.
In a case where a change in the shape of the traveling surface has been detected, the determination in step S105 is Yes, and the processing proceeds to step S106.
Next, in step S106, the suitable stop posture analyzer 110 determines whether or not at least three legs of the robot apparatus 10 are landed and a support polygon is formed.
This process is processing executed by the stop posture analyzer 112 of the suitable stop posture analyzer 110.
As described above, the stop posture analyzer 112 analyzes the posture of the robot apparatus 10 at the time of stopping, and acquires the stop posture information.
In a case where it is determined in step S106 that at least three legs of the robot apparatus 10 are landed and a support polygon is formed, the processing proceeds to step S107.
In a case where it is not determined that at least three legs of the robot apparatus 10 are landed and a support polygon is formed, the determination processing in steps S105 to S106 is continued.
In a case where the stop posture analyzer 112 of the suitable stop posture analyzer 110 determines that at least three legs of the robot apparatus 10 are landed and a support polygon is formed in step S106, the processing of step S107 is executed.
In this case, in step S107, the posture (stop posture) of the robot apparatus 10 at the time of forming the support polygon is analyzed by using the sensor detection information, and the evaluation value (stop posture evaluation value) of the stop posture is calculated.
This processing is executed by the stop posture analyzer 112 and the stop posture evaluation value calculator 113 of the suitable stop posture analyzer 110.
The stop posture analyzer 112 determines whether or not at least three legs are landed and a support polygon is formed while the robot apparatus 10 is traveling. In a case where determining that the support polygon is formed, the stop posture analyzer 112 analyzes the posture at a timing when the support polygon is formed as the “stop posture”.
This processing is executed without stopping the robot apparatus 10.
When at least three legs are landed and the support polygon is formed, the robot apparatus 10 can be stopped. Thus, the stop posture analyzer 112 analyzes the posture at this timing as the “stop posture” without actually stopping the robot apparatus 10.
The stop posture analyzer 112 acquires detection values of various sensors input from the sensor group 104 at the timing when the support polygon is formed, and analyzes the posture at that timing as the “stop posture”.
Furthermore, the stop posture evaluation value calculator 113 calculates an evaluation value (stop posture evaluation value) by applying a predefined evaluation value calculation algorithm to the “stop posture”.
The stop posture analyzer 112 and the stop posture evaluation value calculator 113 analyze the “stop posture” on the basis of, for example, the inclination of the robot which is a detection value of the IMU, the load of the motor analyzed from the detection values of the temperature sensor and the torque sensor belonging to each motor, the landing and the weight amount of each leg detected from the pressure sensor of each leg end, and the like, and calculate the “stop posture evaluation value”.
As described above with reference to
The stop posture analyzer 112 generates “stop posture” data including the above data (a) to (c) on the basis of the input value from the sensor group 104.
In such a manner, in a case where the stop posture analyzer 112 of the robot apparatus 10 determines that at least three legs are landed and the support polygon is formed while the robot apparatus 10 is traveling, the stop posture analyzer 112 analyzes the stop posture at this timing.
The stop posture data generated by the stop posture analyzer 112 is output to the stop posture evaluation value calculator 113.
The stop posture evaluation value calculator 113 inputs the stop posture data generated by the stop posture analyzer 112 and calculates an evaluation value of the stop posture.
The stop posture evaluation value calculator 113 performs calculation processing of the evaluation value (stop posture evaluation value) by applying a predefined evaluation value calculation algorithm on the basis of the stop posture data generated by the stop posture analyzer 112, that is, the stop posture data including each data of
The stop posture evaluation value calculator 113 calculates an evaluation value by applying an evaluation value calculation algorithm defined in advance by using detection values of various sensors input from the sensor group 104 at the timing when the support polygon is formed.
For example, as described above with reference to
Note that the stop posture evaluation value calculated by the stop posture evaluation value calculator 113 is an evaluation value that is higher as the stability of the posture of the robot apparatus 10 is higher and as the consumed energy calculated from the load of the motors and the like is lower.
As described above with reference to
(Example 1) an evaluation value calculation algorithm based on a total sum of torques which are detection values of torque sensors respectively belonging to the motors of the legs;
(Example 2) an evaluation value calculation algorithm based on a total sum of temperatures of the motors, which are detection values of the temperature sensors respectively belonging to the motors of the legs;
(Example 3) an evaluation value calculation algorithm based on a load balance of the leg calculated on the basis of the leg end pressure value which is a detection value of the leg end pressure sensor of each leg; and
(Example 4) an evaluation value calculation algorithm based on a rotation angle of the motor of each leg and a load balance of the leg calculated on the basis of the leg end pressure value which is a detection value of the leg end pressure sensor of each leg.
In addition, for example, the evaluation values calculated by respectively applying the algorithms of (Example 1) to (Example 4) described above may be set as V1 to V4, respectively, and a final stop posture evaluation value Vall may be calculated in accordance with the following (Equation 1).
Vall=α·V1+β·V2+γ·V3+δ·V4 (Equation 1)
Note that α, β, γ, and δ are weighting coefficients.
In such a manner, the stop posture evaluation value calculator 113 calculates an evaluation value of the posture at the timing when at least three legs are landed and the support polygon is formed while the robot apparatus 10 is traveling, that is, the stop posture analyzed by the stop posture analyzer 112.
In such a manner, in step S107, the stop posture analyzer 112 and the stop posture evaluation value calculator 113 analyze the stop posture at the timing when at least three legs are landed and a support polygon is formed while the robot apparatus 10 is traveling, and calculate the evaluation value of the stop posture.
The stop posture information and the evaluation value (stop posture evaluation value) generated and calculated by the stop posture analyzer 112 and the stop posture evaluation value calculator 113 are output to the suitable stop posture record updater 114.
In the next step S108, processing of comparing a newly calculated evaluation value (newly calculated stop posture evaluation value) calculated by the stop posture evaluation value calculator 113 in step S107 with an existing evaluation value already calculated and already stored in the storage 105 is executed.
This processing is processing executed by the suitable stop posture record updater 114.
As described above, the suitable stop posture record updater 114 inputs each of the following data:
By using these input data, the suitable stop posture record updater 114 executes processing of recording a “suitable stop posture and evaluation value data corresponding to a traveling surface shape clustering group” in the storage 105 and processing of updating the recorded data.
The processing of recording the “suitable stop posture and evaluation value data corresponding to the traveling surface shape clustering group” described above with reference to
In step S108, the suitable stop posture record updater 114 compares the “stop posture evaluation value” calculated in step S107 with the “stop posture evaluation value” of the data already stored in the storage 105.
The suitable stop posture record updater 114 selects an entry matching a traveling surface shape clustering group at a calculation timing of the newly calculated evaluation value (newly calculated stop posture evaluation value) of the stop posture evaluation value calculator 113 in step S107 from the “suitable stop posture and evaluation value data corresponding to the traveling surface shape clustering group” already stored in the storage 105, and compares the evaluation value of the selected entry with the value of the newly calculated evaluation value.
Next, the suitable stop posture record updater 114 executes the following determination processing in step S109.
That is, it is determined whether or not there is recorded data in the storage 105 that matches the traveling surface shape clustering group at the calculation timing of the newly calculated evaluation value (newly calculated stop posture evaluation value) of the stop posture evaluation value calculator 113 in step S107.
Furthermore, in a case where there is recorded data that matches the traveling surface shape clustering group, it is determined whether or not the newly calculated evaluation value is a value larger than the stop posture evaluation value of the recorded data in the storage 105.
In a case where recorded data that matches the traveling surface shape clustering group is not recorded in the storage 105, or in a case where the newly calculated evaluation value is a value larger than the stop posture evaluation value of the recorded data in the storage 105, the determination in step S109 is Yes, and the processing proceeds to step S110.
On the other hand, in a case where recorded data that matches the traveling surface shape clustering group is recorded in the storage 105 and the newly calculated evaluation value is not a value larger than the stop posture evaluation value of the recorded data in the storage 105, the determination in step S109 is No, and the processing returns to step S102.
In a case where the determination in step S109 is Yes, that is, in a case where it is determined that recorded data that matches the traveling surface shape clustering group at the calculation timing of the newly calculated evaluation value (newly calculated stop posture evaluation value) of the stop posture evaluation value calculator 113 in step S107 is not recorded in the storage 105, or in a case where the newly calculated evaluation value is a value larger than the stop posture evaluation value of the recorded data in the storage 105, the processing in step S110 is executed.
In this case, the suitable stop posture record updater 114 executes record and update processing of the recorded data in the storage 105 in step S110.
In a case where it is determined that recorded data that matches the traveling surface shape clustering group at the calculation timing of the newly calculated evaluation value (newly calculated stop posture evaluation value) of the stop posture evaluation value calculator 113 in step S107 is not recorded in the storage 105, an entry in association with the following data:
In addition, in a case where it is determined that recorded data that matches the traveling surface shape clustering group at the calculation timing of the newly calculated evaluation value (newly calculated stop posture evaluation value) of the stop posture evaluation value calculator 113 in step S107 is recorded in the storage 105, but the newly calculated evaluation value is a value larger than the stop posture evaluation value of the recorded data in the storage 105, the following processing is executed.
Data update processing of replacing the “suitable stop posture” and the “stop posture evaluation value” already stored in the storage 105 with the “stop posture” and the “stop posture evaluation value” calculated in step S107 is executed.
By performing such data update processing, the data stored in the storage 105, that is, the “suitable stop posture and evaluation value data corresponding to the traveling surface shape clustering group” described above, is sequentially updated so that the stop posture having the highest evaluation value is always recorded for each traveling surface shape clustering group.
When executing stop processing, the robot apparatus 10 uses the latest data recorded in the storage 105.
That is, an entry having “(A) the traveling surface shape clustering data” matching the shape of the traveling surface at the time of execution of the stop processing is selected, “(B) the suitable stop posture” recorded in the selected entry is acquired, the posture control of the robot apparatus is performed so as to have a posture matching the acquired “(B) suitable stop posture”, and the robot apparatus 10 is stopped.
By performing such stop posture control, it is possible to perform stop processing with higher stability.
After the data update processing in step S110 of the flowchart described above is completed, the processing returns to step S102, and the processing of step S102 and subsequent steps is repeated.
Next, a processing sequence executed by the stop control unit 120 of the robot apparatus 10 will be described.
This flow is executed when the robot apparatus 10 is to be stopped. For example, the flow is started in a case where it is determined in step S103 of the flowchart described above that a travel stop command has been input.
Hereinafter, details of the processing of each step of this flow will be described.
In a case where it is determined in step S103 that a travel stop command is input to the robot apparatus 10, for example, from the user terminal 60 operated by the user 50 via the communication unit 101, the stop control unit 120 of the robot apparatus 10 starts the stop control processing.
In step S202, the stop control unit 120 of the robot apparatus 10 inputs the traveling surface shape data (traveling surface shape clustering data) of the current traveling surface of the robot apparatus 10.
This processing is processing executed by the suitable stop posture acquirer 121 of the stop control unit 120 of the robot apparatus 10 illustrated in
The suitable stop posture acquirer 121 receives input of current traveling surface shape data (traveling surface shape clustering data) during traveling of the robot apparatus 10 from the traveling surface shape analyzer 111.
The traveling surface shape data (traveling surface shape clustering data) input from the traveling surface shape analyzer 111 is the data described above with reference to
That is, the data includes the clustering group (Cnn) indicating the type of the shape of the current traveling surface.
Note that the traveling surface shape data (traveling surface shape clustering data) that the suitable stop posture acquirer 121 receives from the traveling surface shape analyzer 111 is the traveling surface shape data (traveling surface shape clustering data) at the present time, that is, at the time when the stop command is input.
The suitable stop posture acquirer 121 acquires a clustering group (Cnn) in the traveling surface shape data (traveling surface shape clustering data) input from the traveling surface shape analyzer 111.
Next, in step S203, from the “suitable stop posture and evaluation value data corresponding to the traveling surface shape clustering group” in the storage 105, the suitable stop posture acquirer 121 acquires entry data of the same clustering group (Cnn) as the clustering group (Cnn) indicating the shape of the traveling surface on which the robot apparatus 10 is currently traveling, the clustering group (Cnn) being acquired from the traveling surface shape analyzer 111.
That is, the entry of the clustering group (Cnn) corresponding to the shape of the current traveling surface is selected from the “suitable stop posture and evaluation value data corresponding to the traveling surface shape clustering group” described above, and the “suitable stop posture” recorded in the selected entry is acquired.
The “suitable stop posture” is a stop posture recorded in the storage 105 as data having the highest “stop posture evaluation value” in travel processing of the robot apparatus 10 so far.
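The acquisition processing in steps S202 and S203 can be sketched as a lookup keyed by the clustering group (Cnn). The data layout and identifiers below are illustrative assumptions, not the actual data structure of the storage 105.

```python
# Sketch of steps S202-S203: select the entry whose clustering group
# matches the shape of the current traveling surface, and return the
# recorded suitable stop posture. Contents are hypothetical examples.
storage = {
    "C01": {"suitable_stop_posture": "flat_stance", "evaluation_value": 0.92},
    "C07": {"suitable_stop_posture": "slope_stance", "evaluation_value": 0.81},
}

def acquire_suitable_stop_posture(current_group):
    """Return the highest-evaluation stop posture recorded for the
    clustering group of the current traveling surface, if any."""
    entry = storage.get(current_group)
    if entry is None:
        return None  # no recorded data for this surface shape
    return entry["suitable_stop_posture"]
```

Because step S110 only ever stores the posture with the highest evaluation value per group, a single keyed lookup suffices here; no comparison is needed at stop time.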
Next, in step S204, the robot apparatus 10 sets the posture of the robot apparatus 10 to the suitable stop posture acquired from the storage 105.
This processing is processing executed by the stop posture control unit 122 of the stop control unit 120 of the robot apparatus 10 illustrated in
The stop posture control unit 122 receives, from the suitable stop posture acquirer 121, input of the “suitable stop posture” having the highest evaluation, which is the data in the storage corresponding to the shape of the current traveling surface of the robot apparatus 10, and performs posture control by driving the drive unit 103 of the robot apparatus 10 so that the robot apparatus 10 takes a posture matching the input “suitable stop posture”.
Next, in step S205, the stop posture control unit 122 determines whether or not the posture of the robot apparatus 10 matches the “suitable stop posture” having the highest evaluation which is data stored in the storage.
In a case where the posture does not match, the posture control in step S204 is continued, and in a case where the posture matches, the processing proceeds to step S206.
In a case where determining in step S205 that the posture of the robot apparatus 10 matches the “suitable stop posture” having the highest evaluation which is data stored in the storage, the stop posture control unit 122 stops the robot apparatus 10 in a state where the posture matches the “suitable stop posture” in step S206.
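The loop of steps S204 to S206 amounts to driving the joints toward the target posture, checking for a match each cycle, and stopping once the posture matches. The sketch below illustrates this; the joint-angle representation, per-cycle step size, and match tolerance are all hypothetical assumptions.

```python
# Sketch of the control loop of steps S204-S206. Postures are
# represented as lists of joint angles (an illustrative assumption).
def control_to_stop_posture(current, target, step=0.1, tol=1e-3):
    """Drive each joint toward the target suitable stop posture until
    every joint is within tol of its target (S205 match check)."""
    while any(abs(c - t) > tol for c, t in zip(current, target)):  # S205: No
        # S204: move each joint by at most `step` toward its target.
        current = [c + max(-step, min(step, t - c))
                   for c, t in zip(current, target)]
    return current  # S205: Yes -> the robot stops in this posture (S206)

final = control_to_stop_posture([0.0, 0.5], [0.3, 0.2])
```

Checking the match condition before every drive step mirrors the flow above, in which step S204 is continued while the determination in step S205 remains No.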
By performing such stop posture control, it is possible to perform stop processing with higher stability.
Next, other embodiments will be described.
In the embodiment described above, the “suitable stop posture” and the “stop posture evaluation value”, which is the evaluation value of the suitable stop posture, to be stored in the storage 105 are recorded for use in association with the “traveling surface shape clustering group” which is the type of the shape of the traveling surface.
However, the “suitable stop posture” and the “stop posture evaluation value”, which is the evaluation value of the suitable stop posture, to be stored in the storage 105 may be generated as data corresponding to not only the type of the shape of the traveling surface but also, for example, the environment in which the robot apparatus 10 travels, and may be recorded in the storage 105 for use.
For example, environment information such as “slipperiness” and “hardness” of the traveling surface on which the robot apparatus 10 travels, whether or not the travel environment of the robot apparatus 10 is an environment where water droplets of rain or the like fall, and a wind speed and a wind direction in the travel environment may be acquired. The acquired environment information may be clustered. The “suitable stop posture” and the “stop posture evaluation value” which is the evaluation value of the suitable stop posture may be calculated in association with an environment information clustering group, and stored in the storage 105 for use.
Note that, in this case, sensors for acquiring such environment information need to be attached to the robot apparatus 10.
In such a manner, the “suitable stop posture” and the “stop posture evaluation value”, which is the evaluation value of the suitable stop posture, are calculated in association with the environment information clustering group and stored in the storage 105. Then, in a case where a stop command is input, a suitable stop posture according to the travel environment at that time is acquired from the storage 105, and stop posture control is performed to match the acquired suitable stop posture, whereby the robot apparatus 10 can be stopped in a suitable posture according to the travel environment.
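Under this variant, the storage key simply becomes a composite of the traveling surface shape clustering group and the environment information clustering group, while the record/update and acquisition rules stay unchanged. A minimal sketch follows; all group labels and posture names are illustrative assumptions.

```python
# Sketch of the variant keyed by (surface shape group, environment group).
storage = {}

def record(surface_group, env_group, posture, value):
    """Keep only the highest-evaluation stop posture per composite key,
    as in steps S109-S110, but now per surface-and-environment pair."""
    key = (surface_group, env_group)
    stored = storage.get(key)
    if stored is None or value > stored[1]:
        storage[key] = (posture, value)

def acquire(surface_group, env_group):
    """Look up the suitable stop posture for the current surface shape
    and the current environment (slipperiness, wetness, wind, ...)."""
    entry = storage.get((surface_group, env_group))
    return None if entry is None else entry[0]

record("C01", "E_dry_calm", "wide_stance", 0.9)
record("C01", "E_wet_windy", "low_crouch", 0.7)  # same surface, different env
```

The same traveling surface shape can thus map to different stop postures depending on the clustered environment conditions, as described above.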
Next, a hardware configuration example of the robot apparatus of the present disclosure will be described.
A central processing unit (CPU) 301 functions as a data processing unit that performs various types of processing in accordance with a program stored in a read only memory (ROM) 302 or a storage 308. For example, processing according to the sequences described in the embodiment described above is performed. A random access memory (RAM) 303 stores programs to be executed by the CPU 301, data, and the like. The CPU 301, the ROM 302, and the RAM 303 are connected to each other by a bus 304.
The CPU 301 is connected to an input/output interface 305 via the bus 304. The input/output interface 305 is connected to an input unit 306, which includes various switches, a keyboard, a touch panel, a mouse, a microphone, and a status data acquisition unit including a camera and various sensors 321 such as LiDAR, and to an output unit 307, which includes a display, a speaker, and the like.
Furthermore, the output unit 307 also outputs drive information for a drive unit 322 that drives the robot and the like.
The CPU 301 inputs commands, status data, and the like input from the input unit 306, executes various types of processing, and outputs processing results to, for example, the output unit 307.
The storage 308 connected to the input/output interface 305 includes, for example, a flash memory, a hard disk, or the like, and stores a program executed by the CPU 301 or various types of data. A communication unit 309 functions as a transmitter and receiver for data communication via a network such as the Internet or a local area network, and communicates with an external device.
Furthermore, in addition to the CPU, a graphics processing unit (GPU) may be provided as a dedicated processing unit for image information and the like input from the camera.
A drive 310 connected to the input/output interface 305 drives a removable medium 311 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory such as a memory card, and records or reads data.
The embodiment of the present disclosure has been described above in detail with reference to specific embodiments. However, it is obvious that those skilled in the art can modify or substitute the embodiments without departing from the gist of the present disclosure. That is, the present invention has been disclosed in the form of exemplification, and should not be interpreted in a limited manner. In order to determine the gist of the present disclosure, the claims should be considered.
Note that the technology herein disclosed can have the following configurations.
Note that a series of processing herein described can be executed by hardware, software, or a combined configuration of the both. In a case where processing by software is executed, a program in which a processing sequence is recorded can be installed and executed in a memory in a computer incorporated in dedicated hardware, or the program can be installed and executed in a general-purpose computer capable of executing various types of processing. For example, the program can be recorded in advance in a recording medium. In addition to being installed on a computer from the recording medium, a program can be received via a network such as a local area network (LAN) or the Internet and installed on a recording medium such as an internal hard disk or the like.
Furthermore, the various types of processing herein described may be performed not only in time series as described, but also in parallel or individually in accordance with the processing capability of the device that performs the processing or as necessary. Furthermore, a system herein described is a logical set configuration of a plurality of devices, and is not limited to a system in which devices of respective configurations are in the same housing.
As described above, a configuration of an embodiment of the present disclosure achieves a configuration in which a suitable posture according to the shape of the traveling surface of the robot is stored in the storage, the suitable posture according to the shape of the traveling surface is acquired from the storage when the robot is to be stopped, and posture control is performed to stop the robot.
Specifically, for example, a suitable stop posture analyzer that analyzes the suitable stop posture of a walking robot executes processing of analyzing the shape of the traveling surface of the walking robot, stop posture analysis processing of analyzing stop postures according to the shape of the traveling surface, stop posture evaluation value calculation processing of calculating evaluation values of the stop postures, and processing of recording, in a storage, a stop posture having a highest evaluation value among the stop postures according to the traveling surface shape. Furthermore, a stop control unit acquires the suitable stop posture corresponding to a traveling surface shape clustering group from the storage, and performs posture control to match a posture of the robot to the suitable stop posture and stop the robot.
This configuration achieves a configuration in which the suitable posture according to the shape of the traveling surface of the robot is stored in the storage, the suitable posture according to the shape of the traveling surface is acquired from the storage when the robot is to be stopped, and the posture control is performed to stop the robot.
Number | Date | Country | Kind |
---|---|---|---|
2021-100692 | Jun 2021 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2022/003021 | 1/27/2022 | WO |