This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2018-133986 filed Jul. 17, 2018.
The present disclosure relates to a robot control system, a robot apparatus, and a non-transitory computer readable medium.
Japanese Unexamined Patent Application Publication No. 2006-247803 discloses an autonomous mobile robot that tilts the robot body to change the scanning range of an obstacle detection sensor.
Aspects of a non-limiting embodiment of the present disclosure relate to providing a robot control system, a robot apparatus, and a non-transitory computer readable medium that enable control information for controlling operation of a robot apparatus to reflect a control condition that is not determined unless the robot apparatus is observed from outside.
Aspects of a certain non-limiting embodiment of the present disclosure address the above advantages and/or other advantages not described above. However, aspects of the non-limiting embodiment are not required to address the advantages described above, and aspects of the non-limiting embodiment of the present disclosure may not address advantages described above.
According to an aspect of the present disclosure, there is provided a robot control system that includes a robot apparatus that operates autonomously in accordance with control information provided to the robot apparatus, the robot apparatus receiving update information to be used to update the control information and updating the control information in accordance with the received update information, an imaging apparatus that captures an image of the robot apparatus, and a control apparatus including a transmitting unit that transmits to the robot apparatus update information generated in accordance with the image captured by the imaging apparatus.
An exemplary embodiment of the present disclosure will be described in detail based on the following figures, wherein:
An exemplary embodiment of the present disclosure will be described in detail with reference to the drawings.
First,
As depicted in
For example, a control parameter set based on the external form (external dimensions) of the robot apparatus 10 carrying no load is provided to the robot apparatus 10. In accordance with this control parameter set, the robot apparatus 10 controls operation of the robot body and performs operations such as bypassing an obstacle and determining whether a narrow path or the like is passable for the robot body. In addition, when a path to a destination is searched for by using map information prepared in advance, the search can take into account the result of determining whether each path is passable as described above.
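As a minimal illustration of this kind of parameter-based control, the sketch below checks whether a gap is passable given the robot's external dimensions plus a safety margin. All names (`ControlParams`, `is_passable`) and the numeric values are hypothetical, not taken from the source.

```python
# Hypothetical sketch: deciding whether a gap is passable, given the
# robot's external dimensions from a control parameter set.
from dataclasses import dataclass

@dataclass
class ControlParams:
    width_mm: float    # external dimension in the width direction
    depth_mm: float    # external dimension in the depth direction
    height_mm: float   # external dimension in the height direction
    margin_mm: float   # safety margin kept between robot and obstacles

def is_passable(params: ControlParams, gap_width_mm: float, gap_height_mm: float) -> bool:
    """Return True if a gap is wide and tall enough for the robot plus margin."""
    need_w = params.width_mm + 2 * params.margin_mm   # margin on both sides
    need_h = params.height_mm + params.margin_mm      # margin above
    return gap_width_mm >= need_w and gap_height_mm >= need_h

unloaded = ControlParams(width_mm=600, depth_mm=700, height_mm=900, margin_mm=100)
print(is_passable(unloaded, gap_width_mm=850, gap_height_mm=2000))  # True
```

The same predicate could then feed a map-based path search, pruning paths whose narrowest gap fails the check.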
Next,
Thus, when the robot apparatus 10 performs an operation for bypassing an obstacle or turning around, if the margin between the obstacle and the robot body is set in accordance with a control parameter set based on the external form (external dimensions) of the robot body carrying no load, the load 80 placed on the robot body may come into contact with an obstacle around the robot body.
The robot control system according to the present exemplary embodiment has the following configuration so as to avoid such a situation. As depicted in
The robot apparatus 10 is configured to be connectable to the network 30 via a wireless local-area network (LAN) terminal 50.
The cameras 61 and 62 function as an imaging unit and capture images of the robot apparatus 10, which is positioned at a predetermined reference measurement point. The cameras 61 and 62 each capture, from a different direction, an image of the external appearance of the robot apparatus 10 positioned at the reference measurement point.
As depicted in
If a stereo camera or a distance measurement sensor capable of measuring the distance to an object, such as a laser range finder (LRF), is used as the cameras 61 and 62 instead of a typical red-green-blue (RGB) camera, the external form and other parameters of the robot apparatus 10 can be calculated without obtaining the positional information of each of the cameras 61 and 62 with respect to the reference measurement point.
The control server 20 generates update information in accordance with images captured by the cameras 61 and 62 and the positional information of each of the cameras 61 and 62 with respect to the reference measurement point described above. The update information is used to update control information for controlling operation of the robot apparatus 10, and the control server 20 transmits the generated update information to the robot apparatus 10.
The update information is information to update control information such as a control program and a control parameter set necessary for the robot apparatus 10 to move autonomously. Specifically, the update information is, for example, a new control parameter set and control program to replace the control parameter set and control program stored in the robot apparatus 10.
Alternatively, the update information may be instruction information providing instructions to update the control parameter set and control program stored in the robot apparatus 10. More specifically, the robot apparatus 10 may store in advance a plurality of pieces of control information having different control characteristics and may select, in accordance with the instruction information provided by the control server 20, one piece of control information from the plurality of pieces of stored control information. Then, the robot apparatus 10 may replace the control information for performing autonomous operation with the selected piece of control information.
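The selection variant can be sketched as follows: the robot stores several parameter sets in advance and switches among them by a key carried in the instruction information. The preset names, values, and the `Controller` class are illustrative assumptions, not details from the source.

```python
# Hypothetical sketch: preset control information stored in advance,
# selected by an instruction key received from the control server.
PRESET_PARAMS = {
    "unloaded": {"width_mm": 600, "height_mm": 900,  "margin_mm": 100},
    "loaded":   {"width_mm": 800, "height_mm": 1400, "margin_mm": 150},
}

class Controller:
    def __init__(self):
        # Default control information used for autonomous operation.
        self.active = PRESET_PARAMS["unloaded"]

    def apply_instruction(self, instruction: str) -> None:
        """Replace the active control information with the named preset."""
        if instruction not in PRESET_PARAMS:
            raise ValueError(f"unknown preset: {instruction}")
        self.active = PRESET_PARAMS[instruction]

ctl = Controller()
ctl.apply_instruction("loaded")
print(ctl.active["height_mm"])  # 1400
```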
Further, the control server 20 may transmit image information of the robot apparatus 10, whose images are captured by the cameras 61 and 62, to the robot apparatus 10 as the update information. In such a case, the robot apparatus 10 generates new control information in accordance with the image information received from the control server 20 and replaces the control information for performing autonomous operation with the generated control information.
In the following description, a configuration will mainly be described in which the control server 20 generates, in accordance with image information obtained by the cameras 61 and 62, a new control parameter set for controlling the movement operation of the robot apparatus 10 and transmits the generated control parameter set to the robot apparatus 10.
Next,
As depicted in
The CPU 11 performs predetermined processing in accordance with a control program stored in the memory unit 12 or in the storage unit 13 and controls operation of the robot apparatus 10. Although the description regarding the present exemplary embodiment will be provided assuming that the CPU 11 reads and executes the control program stored in the memory unit 12 or in the storage unit 13, it is also possible to provide the CPU 11 with a control program stored on a recording medium such as a compact-disc read-only memory (CD-ROM).
As depicted in
The wireless communication unit 14, which is connected to the network 30 via the wireless LAN terminal 50, transmits and receives data to and from the control server 20.
The movement unit 16 is controlled by the controller 31 and moves the body of the robot apparatus 10. The operation input unit 33 receives various pieces of operation information such as instructions from a user.
The detection unit 32 uses various sensors, such as an LRF, to detect an obstacle present around the robot apparatus 10, such as an object or a person, and determines the size of the obstacle, the distance to the obstacle, and the like.
The control-parameter storage unit 34 stores various control parameter sets for controlling the movement of the robot apparatus 10.
The controller 31 autonomously controls, in accordance with a provided control parameter set, operation of the robot apparatus 10 in which the controller 31 is installed. Specifically, in addition to referencing information detected by the detection unit 32, the controller 31 controls the movement unit 16 in accordance with a control parameter set stored in the control-parameter storage unit 34 and thereby controls the movement of the robot apparatus 10. More specifically, in accordance with a new control parameter set received from the control server 20, the controller 31 performs one or both of an operation for bypassing an obstacle to avoid a collision between the robot apparatus 10 and the obstacle and a determination of whether a path ahead of the robot apparatus 10 is passable for the robot apparatus 10.
Upon receiving a new control parameter set from the control server 20 as update information via the wireless communication unit 14, the controller 31 updates the control parameter set, which is stored in the control-parameter storage unit 34, in accordance with the received control parameter set. This update information is determined in accordance with a captured image of the external appearance of the robot apparatus 10 in which the controller 31 is installed.
Alternatively, the control-parameter storage unit 34 may store in advance a plurality of control parameter sets having different control characteristics. In such a case, the controller 31 receives from the control server 20 via the wireless communication unit 14 instruction information providing instructions to update the control parameter set to be used to control the robot apparatus 10 and selects, in accordance with the received instruction information, a control parameter set to be used from the plurality of control parameter sets stored in the control-parameter storage unit 34.
When image information obtained by the cameras 61 and 62 is received from the control server 20 instead of a new control parameter set, the controller 31 generates, in accordance with the received image information, a new control parameter set for controlling the robot apparatus 10. Then, the generated new control parameter set is stored in the control-parameter storage unit 34, and the robot apparatus 10 operates autonomously in accordance with the new control parameter set.
Next,
As depicted in
The CPU 21 performs predetermined processing in accordance with a control program stored in the memory unit 22 or in the storage unit 23 and controls operation of the control server 20. Although the description regarding the present exemplary embodiment will be provided assuming that the CPU 21 reads and executes the control program stored in the memory unit 22 or in the storage unit 23, it is also possible to provide the CPU 21 with a control program stored on a recording medium such as a CD-ROM.
As depicted in
The image-data receiving unit 41 receives captured image data of the robot apparatus 10 from the cameras 61 and 62.
The 3D model generation unit 42 generates a three-dimensional model (3D model) of the robot apparatus 10 from image data (image information) of the robot apparatus 10, the image data being received by the image-data receiving unit 41.
The control-parameter generation unit 43 generates a control parameter set for controlling the robot apparatus 10 from the 3D model of the robot apparatus 10, the 3D model being generated by the 3D model generation unit 42. In other words, the control-parameter generation unit 43 generates, in accordance with the images captured by the cameras 61 and 62, which constitute an imaging apparatus, a control parameter set for controlling the robot apparatus 10.
Specifically, the control parameter set is generated from positional information of each of the cameras 61 and 62 with respect to the position at which the robot apparatus 10 is placed and the respective images captured by the two cameras 61 and 62.
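One way such dimensions could be recovered from calibrated views is the pinhole model: an extent measured in pixels, together with the known camera-to-reference-point distance, yields a real-world extent. The sketch below is an illustrative assumption (the source does not specify the geometry); `focal_px` and the distances are hypothetical calibration values, with the front camera supplying width and height and the side camera supplying depth.

```python
# Hypothetical sketch: pinhole-model recovery of external dimensions
# from two calibrated views at known distances (positional information).
def dimension_mm(extent_px: float, distance_mm: float, focal_px: float) -> float:
    """Real-world size of an image-plane extent observed at a known distance."""
    return extent_px * distance_mm / focal_px

# Front view (camera 61): bounding box of the robot silhouette.
width_mm  = dimension_mm(extent_px=300, distance_mm=2000, focal_px=1000)
height_mm = dimension_mm(extent_px=450, distance_mm=2000, focal_px=1000)
# Side view (camera 62) supplies the depth direction.
depth_mm  = dimension_mm(extent_px=350, distance_mm=2000, focal_px=1000)
print(width_mm, height_mm, depth_mm)  # 600.0 900.0 700.0
```

With a stereo camera or LRF, as noted earlier, the distances would come from the sensor itself rather than from stored positional information.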
The transmitting unit 44 transmits to the robot apparatus 10 the control parameter set generated by the control-parameter generation unit 43.
In the description of the present exemplary embodiment, the control parameter set, which is information regarding the external dimensions of the robot apparatus 10, is generated by the control-parameter generation unit 43 and transmitted to the robot apparatus 10 by the transmitting unit 44, but information other than the information regarding the external dimensions may be transmitted to the robot apparatus 10 as a control parameter set.
The controller 45 may cause the transmitting unit 44 to transmit image information of the robot apparatus 10, the image information being received by the image-data receiving unit 41, to the robot apparatus 10 as the update information without processing the image information.
Alternatively, the controller 45 may transmit to the robot apparatus 10 instruction information, which provides instructions to update the control parameter set used to control the robot apparatus 10, as the update information.
The control-program storage unit 46 stores in advance a plurality of control programs having different control characteristics. The controller 45 identifies the type of the robot apparatus 10 by using images of the robot apparatus 10 captured by the cameras 61 and 62, selects a control program that corresponds to the identified type of the robot apparatus 10 from the plurality of control programs stored in the control-program storage unit 46, and causes the transmitting unit 44 to transmit the selected control program to the robot apparatus 10.
The control-program storage unit 46 may store in advance a plurality of control programs each of which corresponds to an individual robot apparatus 10. In such a case, the controller 45 identifies an individual robot apparatus 10 by using images of the robot apparatus 10 captured by the cameras 61 and 62, selects a control program that corresponds to the identified individual robot apparatus 10 from the plurality of control programs stored in the control-program storage unit 46, and causes the transmitting unit 44 to transmit the selected control program to the robot apparatus 10.
It is also possible to configure the robot apparatus 10 to transmit information to enable the type of the robot apparatus 10 or the individual robot apparatus 10 to be identified. In such a case, the controller 45 may identify the type of the robot apparatus 10 or the individual robot apparatus 10 by using the information received from the robot apparatus 10 instead of images of the robot apparatus 10 captured by the cameras 61 and 62.
Operation of the robot control system according to the present exemplary embodiment will be described in detail with reference to the drawings.
Operation of the robot control system according to the present exemplary embodiment will be described with reference to the sequence chart in
First, the robot apparatus 10 is placed at the reference measurement point described with reference to
Then, the 3D model generation unit 42 in the control server 20 generates a 3D model of the robot apparatus 10 from the two captured images (step S105).
It is also possible to transmit the 3D model data directly to the robot apparatus 10 from the control server 20 and to cause the robot apparatus 10 to control movement in accordance with the received 3D model data.
Next, the control-parameter generation unit 43 generates, as a control parameter set, for example, information regarding the external dimensions of the robot apparatus 10 in the width, depth, and height directions from the 3D model data generated as described above (step S106).
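Deriving the three dimensions from the 3D model can be as simple as taking the axis-aligned bounding box of its vertices. The sketch below assumes the model is a list of (x, y, z) points in millimetres; the function name and sample points are illustrative.

```python
# Hypothetical sketch: external dimensions as the axis-aligned bounding
# box of the generated 3D model's vertices.
def bounding_dimensions(points):
    xs, ys, zs = zip(*points)
    return (max(xs) - min(xs),   # width
            max(ys) - min(ys),   # depth
            max(zs) - min(zs))   # height

model = [(0, 0, 0), (600, 0, 0), (600, 700, 0),
         (0, 700, 1400), (300, 350, 1400)]
print(bounding_dimensions(model))  # (600, 700, 1400)
```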
For example, as depicted in
The new control parameter set generated by the control-parameter generation unit 43 is transmitted to the robot apparatus 10 (step S107).
The robot apparatus 10 replaces the provided control parameter set with the new control parameter set, which is received from the control server 20 (step S108).
The control parameter set provided to the robot apparatus 10 is replaced with a new control parameter set, and the external dimensions in the height, depth, and width directions are found to have increased.
In summary, updating the control parameter set enables the robot apparatus 10 to perform autonomous movement control in accordance with the external dimensions of the robot apparatus 10 carrying the load 80 and to perform processing such as bypassing an obstacle, ensuring a margin during a turn, and determining whether a path ahead of the robot apparatus 10 is passable.
A case where the two cameras 61 and 62 capture the images of the robot apparatus 10 is described with reference to
In the configuration as depicted in
Specifically, operation of the robot apparatus 10 is controlled by the control server 20, a controller, or the like (not depicted), and the robot apparatus 10 is operated so that the entire body of the robot apparatus 10 is captured by the camera 61.
Then, while the robot apparatus 10 is being operated, the camera 61 captures images of the external appearance of the robot apparatus 10 a plurality of times. Simultaneously, the distance traveled by the robot apparatus 10 is estimated by using the number of rotations of a wheel of the robot apparatus 10, and the control server 20 acquires, as odometry information, the information regarding the distance traveled by the robot apparatus 10 or the like. In the control server 20, a control parameter set is generated from the odometry information and the information regarding the plurality of captured images of the robot apparatus 10.
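The distance estimate from wheel rotations is straightforward: each rotation advances the robot by one wheel circumference. A minimal sketch, with an assumed wheel diameter:

```python
# Hypothetical sketch of wheel odometry: distance traveled estimated
# from a wheel rotation count, used to tag each captured image.
import math

def traveled_mm(rotations: float, wheel_diameter_mm: float) -> float:
    """Distance covered after a given number of wheel rotations."""
    return rotations * math.pi * wheel_diameter_mm  # circumference per rotation

print(round(traveled_mm(rotations=10, wheel_diameter_mm=150), 1))  # 4712.4
```

In practice slip and uneven floors make raw wheel odometry drift, which is one reason to combine it with the captured images rather than rely on it alone.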
The robot apparatus 10 may be operated manually or automatically by the control server 20 by using the captured images. When the control server 20 automatically controls operation of the robot apparatus 10, feature points or the like of the robot apparatus 10 are recognized by using object recognition technology, and operation of the robot apparatus 10 is controlled so that the recognized form of the robot apparatus 10 coincides with the form viewed in the direction from which an image is to be captured.
An operation of generating a control parameter set by capturing images of the robot apparatus 10 in operation by using the single camera 61 in this manner will be described with reference to the sequence chart in
The control server 20 provides the camera 61 with instructions to capture an image, and an image captured by the camera 61 is transmitted to the control server 20 (steps S201 and S202). Then, the control server 20 provides the robot apparatus 10 with instructions to operate (step S203) and receives as odometry information a piece of information such as the distance traveled by the robot apparatus 10, which has received the instructions to operate (step S204).
Then, the control server 20 provides the camera 61 with instructions to capture an image and acquires an image captured by the camera 61 (steps S205 and S206).
Repeating such processing a plurality of times enables the control server 20 to acquire image information of the robot apparatus 10 from various directions (steps S207 to S210).
Then, the control server 20 generates a 3D model of the robot apparatus 10 from the plurality of captured images by using a method similar to the method described above (step S211). A control parameter set is generated from the generated 3D model (step S212).
Finally, the generated control parameter set is transmitted from the control server 20 to the robot apparatus 10 (step S213). Then, the robot apparatus 10 replaces the provided control parameter set with the new control parameter set, which is received from the control server 20 (step S214).
In the exemplary embodiment described above, a case where the information regarding the external dimensions of the robot apparatus 10 is generated as a control parameter set has been described, but a control parameter set is not limited to such information.
For example, as depicted in
Specifically, the acceleration value or the angular acceleration value at which the robot apparatus 10 carrying the load 71 is operated is gradually increased, and the acceleration value or the angular acceleration value at the point when the load 71 falls is acquired as the allowable upper limit.
For example, when the robot apparatus 10 is used for an operation such as conveying the same load a plurality of times in a plant, the acceleration value or the angular acceleration value at which the robot apparatus 10 carrying the load is operated is first gradually increased. The acceleration value or the angular acceleration value at the point when the fall of the load is detected in a captured image is then determined to be the upper limit for the robot apparatus 10 carrying the load.
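The calibration loop above can be sketched as follows. The step size, limit, and the fall threshold are hypothetical; `falls_at` stands in for the camera-based fall detection, which in the real system would come from the captured images rather than a known number.

```python
# Hypothetical sketch of the calibration loop: acceleration (mm/s^2) is
# stepped up until the image-based check reports the load fell; the last
# safe value becomes the upper-limit control parameter.
def calibrate_acceleration(step: int, max_accel: int, falls_at: int) -> int:
    accel, safe = step, 0
    while accel <= max_accel:
        if accel >= falls_at:    # fall detected in the captured image
            return safe          # previous value is the allowable upper limit
        safe = accel
        accel += step
    return safe                  # no fall observed within the tested range

print(calibrate_acceleration(step=100, max_accel=2000, falls_at=750))  # 700
```

Because this runs before the conveying operation starts, the resulting limit can be delivered to the robot apparatus 10 as a control parameter ahead of actual use.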
Such calibration is performed before the operation of conveying the load is started, and thereby it is possible to provide a control parameter set to the robot apparatus 10 before the operation is actually started.
Consequently, the robot apparatus 10 whose control parameter set is replaced with such a control parameter set is capable of an operation for preventing the carried object from falling by using a new control parameter set received from the control server 20.
Further, as depicted in
In such a case, a control parameter set for controlling the robot arm 81 is transmitted from the control server 20 to the robot apparatus 10 or to the robot arm 81, and thereby the control parameter set for controlling the robot arm 81 may be updated.
A controller for controlling the robot arm 81 may be installed in the robot arm 81, or the robot apparatus 10 may execute a control program for controlling the robot arm 81 and control the robot arm 81.
Further, as depicted in
For example, in a case depicted in
In such a case, the camera 61 is caused to capture an image of the robot apparatus 10 equipped with the movable unit 91 while the movable unit 91 is gradually moved, and the angle information for the movable unit 91 at a point when the movable unit 91 comes into contact with the robot apparatus 10 is acquired by the control server 20 as a new control parameter set.
Then, the robot apparatus 10 acquires information regarding the allowable range of motion for the movable unit 91 from the control server 20 as a control parameter set and replaces the control parameter set for controlling the movable unit 91 with the acquired parameter set. As a result, the robot apparatus 10 is capable of controlling the movable unit 91 to operate so as not to come into contact with the robot apparatus 10.
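The sweep that determines the allowable range of motion can be sketched similarly to the acceleration calibration. Here `contact_at` stands in for the image-based contact check performed on the server side; all values are illustrative.

```python
# Hypothetical sketch: sweep the movable unit in angle steps and record
# the last contact-free angle as the allowable range of motion.
def find_motion_limit(step_deg: int, max_deg: int, contact_at: int) -> int:
    limit = 0
    for angle in range(step_deg, max_deg + 1, step_deg):
        if angle >= contact_at:   # contact observed in the captured image
            break
        limit = angle             # last angle with no contact
    return limit

print(find_motion_limit(step_deg=5, max_deg=180, contact_at=117))  # 115
```

The resulting limit, delivered as a control parameter, lets the robot apparatus 10 keep the movable unit 91 short of contact during normal operation.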
As depicted in
The foregoing description of the exemplary embodiment of the present disclosure has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiment was chosen and described in order to best explain the principles of the disclosure and its practical applications, thereby enabling others skilled in the art to understand the disclosure for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the disclosure be defined by the following claims and their equivalents.
Number | Date | Country | Kind
---|---|---|---
2018-133986 | Jul. 2018 | JP | national