The present invention relates to an object manipulation apparatus and an object manipulation method for an automatic machine that picks up and manipulates an object while sharing work space with a person(s).
When an automatic machine such as a robot works while sharing work space with a human(s), it is necessary to consider both safety against collisions with the human(s) and work efficiency for working quickly. For the purpose of achieving both safety and work efficiency, for example, the inventions of Patent Documents 1 and 2 have been disclosed.
Patent Document 1 discloses that, in order to achieve both safety and efficiency, an automatic machine detects a human's position, and changes its operation speed depending on the distance between the automatic machine and the human. When the human approaches, the operation speed of the automatic machine is reduced to achieve safe work. When the human is remote, the operation speed of the automatic machine is increased to achieve efficient work.
Patent Document 2 discloses that both safety and work efficiency are achieved by changing operating modes. That is, an automatic machine is stopped or operated at low power depending on the distance to a human (referred to as a "moving body" in Patent Document 2), thus achieving both safety and work efficiency.
PATENT DOCUMENT 1: Japanese Patent Laid-open Publication No. JP 2015-526309 A
PATENT DOCUMENT 2: Japanese Patent Laid-open Publication No. JP 2014-176934 A
Such automatic machines simply reduce the operation speed and output power depending on the positions of the automatic machine and the human. Therefore, there is a problem that, whenever the human is close to the automatic machine, the operation speed and/or the output power are reduced, and the work efficiency remains low.
An object of the present invention is to solve the above-described problem, and to provide an object manipulation apparatus and an object manipulation method capable of achieving both safety and efficiency, and capable of achieving efficient work even when a human approaches.
According to an aspect of the present invention, an object manipulation apparatus is provided with at least one sensor and at least one pickup device, for picking up and manipulating at least one target object using the pickup device. The object manipulation apparatus is further provided with: an object recognizer that recognizes a position and an attitude of the target object based on data measured by the sensor; a distance calculator that calculates a distance from the target object to a certain object which is other than the object manipulation apparatus and the target object; and a manipulation controller that controls the pickup device based on the position and the attitude of the target object, and based on the distance from the target object to the certain object. When there are a plurality of target objects to be selected by the object manipulation apparatus, the manipulation controller selects one of the target objects having distances to the certain object longer than a predetermined threshold, and manipulates the selected target object using the pickup device.
According to the present invention, the object manipulation apparatus selects a target object allowing efficient work, while determining the possibility of a collision with a human. Therefore, it is possible to achieve more efficient work than by simply reducing the operation speed of the apparatus.
The object manipulation apparatus of
The drive device 1 moves the sensor 2, the pickup device 3, and the sensor 5. The drive device 1 is used to move the sensors 2 and 5 to a point of view for measurement, and to move the pickup device 3 to a position for manipulating a target object. Here, the drive device refers to a device providing movement under automatic control based on action commands, and may be a robot, a manipulator, a movable carriage, a linear axis, or a combination of a plurality of drive axes.
The sensor 2 measures an area around the object manipulation apparatus, and recognizes a position and an attitude of a target object. The sensor 2 is a camera or a three-dimensional vision sensor. The sensor 2 may be a pinhole camera, a rangefinder camera, a view camera, a light field camera, a stereo camera, an active stereo camera, a passive stereo camera, a photometric stereo system, a sensor using the time-of-flight method, a sensor using the spatial encoding method, a sensor using the structured light method, or a laser scanner.
The pickup device 3 is a mechanism for manipulating a target object around the object manipulation apparatus. The pickup device 3 may be a suction-type pickup device that sucks and picks up a target object, or may be a pinch-type pickup device that pinches and picks up a target object. In addition, it is possible to arbitrarily select the number of nozzles of the suction-type pickup device, the number of fingers and nails of the pinch-type pickup device, and their shapes.
The sensor 5 measures an area around the object manipulation apparatus, and recognizes a position and an attitude of an object which is other than the object manipulation apparatus and a target object and which is present around the object manipulation apparatus, e.g., a human around the object manipulation apparatus. The sensor 5 may be a camera or a three-dimensional vision sensor, or may be a sensor that detects a sound wave, heat, light, vibration, or magnetism. The camera or three-dimensional vision sensor may be a pinhole camera, a rangefinder camera, a view camera, a light field camera, a stereo camera, an active stereo camera, a passive stereo camera, a photometric stereo system, a sensor using the time-of-flight method, a sensor using the spatial encoding method, a sensor using the structured light method, or a laser scanner, similar to the case of the sensor 2. The sensor that detects a sound wave, heat, light, vibration, or magnetism may be an optical sensor, a photoelectric element, a photodiode, an infrared sensor, a radiation sensor, a magnetic sensor, an ultrasonic range finder, a capacitive displacement sensor, or an optical position sensor. Obviously, the sensors 2 and 5 may be unified into a single sensor of one type, and only that sensor may be used to recognize the positions and attitudes of a target object and a human.
The object recognizer 4a recognizes the position and attitude of the target object based on data measured by the sensor 2. In this case, it is possible to extract a target object from an image and recognize a region, a position, and an attitude of the target object using any general computer vision technique, such as: deleting a background region from a camera's image, a three-dimensional vision sensor's image, a range image, point cloud data, etc., and then calculating the center of mass of the region of the target object; or fitting a model of the target object to an image, a range image, or point cloud data. It is also possible to define the position of the target object from the region of the target object in the image. For example, the center of mass of the region may be defined as the position of the target object, or the position of a point on a surface of the region, the point being closest to a specific position in space, may be defined as the position of the target object. In this case, it is possible to apply chamfer matching, template matching, the iterative closest point algorithm, feature extraction, hashing, machine learning techniques including deep learning, reinforcement learning techniques, or their derivations.
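As a concrete illustration of defining the position of the target object from its region, the center-of-mass computation described above can be sketched as follows. This is a minimal Python/NumPy sketch; the binary-mask representation and the function name are assumptions for illustration, not part of the disclosure:

```python
import numpy as np

def region_centroid(mask):
    """Center of mass of a binary region mask.

    `mask` is a 2-D boolean array marking the target object's
    pixels after the background region has been deleted; the
    returned (row, col) centroid serves as the object position.
    """
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        raise ValueError("empty region: no target object pixels")
    return float(ys.mean()), float(xs.mean())

# A 3x2 block of object pixels in a 5x5 image.
mask = np.zeros((5, 5), dtype=bool)
mask[1:4, 2:4] = True
print(region_centroid(mask))  # (2.0, 2.5)
```

The same centroid could equally be obtained from a range image or point cloud; only the mask extraction step would change.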
The human recognizer 4b recognizes the position and attitude of the human around the object manipulation apparatus, based on data measured for the human by the sensor 5. In this case, the human recognizer 4b can recognize the position of the human using any general computer vision technique, in a manner similar to that of the object recognizer 4a. In addition, the position and attitude of the human and a region of the human may be calculated by estimating an approximate position of the human using an optical sensor, a photoelectric element, a photodiode, an infrared sensor, a radiation sensor, a magnetic sensor, an ultrasonic range finder, a capacitive displacement sensor, an optical position sensor, etc., calculating a position of the human body closest to the drive device 1, and calculating a direction vector from this position to the drive device 1 as an attitude. The position of the human may be defined from the region of the human: the center of mass of the region may be defined as the position of the human, or the position of a point on a surface of the region, the point being closest to a specific position in space, may be defined as the position of the human. Thus, it is possible to calculate the position of the human.
The safe distance calculator 4c calculates the safe distance for the target object, based on the position and attitude of the human, and based on the position and attitude of the target object. Here, the safe distance indicates how close the human and the target object are to each other: the distance between the position of the human and the position of the target object is defined as the safe distance. When there are a plurality of humans and/or a plurality of target objects, a safe distance is calculated for each pair of a human and a target object.
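The safe-distance calculation can be sketched as follows. Treating positions as 3-D points and taking, for each target object, the minimum distance over all humans (the worst case) are assumptions of this sketch:

```python
import math

def safe_distances(humans, objects):
    """Safe distance for each target object: the Euclidean
    distance to the nearest human (positions are 3-D points).
    Returns a dict mapping the object index to its safe distance.
    """
    return {
        j: min(math.dist(human, obj) for human in humans)
        for j, obj in enumerate(objects)
    }

humans = [(0.0, 0.0, 0.0)]
objects = [(1.0, 0.0, 0.0), (0.0, 3.0, 4.0)]
print(safe_distances(humans, objects))  # {0: 1.0, 1: 5.0}
```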
The manipulation controller 4d generates action commands for controlling the pickup device 3 to pick up the target object, based on the position and attitude of the target object, and based on the safe distance for the target object. When there are a plurality of target objects to be selected by the object manipulation apparatus, the manipulation controller selects one of target objects having safe distances longer than a predetermined threshold, and manipulates the selected target object using the pickup device 3.
At step S11 of
The threshold is determined with reference to the operation speeds of the drive device 1 and the pickup device 3, the weights of the drive device 1 and the pickup device 3, and a force applied by the drive device 1 and the pickup device 3 upon collision. The threshold is determined such that, under conditions of high operation speeds, large weights, or a large force applied upon collision, a long safe distance is required so as to reduce the influence and damage of a collision with a human. The manipulation controller 4d determines the threshold, for example, so as to increase the threshold as the weight and speed of the pickup device 3 increase, and to decrease the threshold as the weight and speed of the pickup device 3 decrease. When the safe distance for a certain target object is shorter than or equal to the threshold, the manipulation controller 4d determines that the target object is not a candidate for pickup, and continues the manipulation planning process for a next target object. When the safe distance for a certain target object is longer than the threshold, the manipulation controller 4d determines that the target object is a candidate for pickup, and calculates a pickup attitude of the pickup device 3 for picking up the target object.
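A possible threshold rule reflecting the description above (heavier or faster pickup devices require a longer safe distance) can be sketched as follows; the base clearance and the momentum-like gain are assumed tuning constants, not values given in the disclosure:

```python
def safety_threshold(mass_kg, speed_mps, base_m=0.2, k=0.05):
    """Safe-distance threshold that grows with the pickup
    device's weight and operation speed: heavier or faster
    tools require a longer clearance. `base_m` and the
    momentum-like gain `k` are assumed tuning constants.
    """
    return base_m + k * mass_kg * speed_mps

light_slow = safety_threshold(1.0, 0.5)
heavy_fast = safety_threshold(5.0, 2.0)
print(light_slow < heavy_fast)  # True
```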
At step S13, the manipulation controller 4d calculates a pickup attitude of the pickup device 3 for picking up the target object, based on the position and attitude of the target object. At step S14, the manipulation controller 4d determines whether or not the pickup device 3 can reach from its current position to the pickup attitude calculated at step S13; if YES, the process proceeds to step S15, and if NO, the process proceeds to step S16. Step S14 is done by determining whether or not interference between the object manipulation apparatus and its surrounding environment occurs, and whether or not kinematics of the drive device 1 can be solved. At step S15, the manipulation controller 4d calculates a moving time of the pickup device 3 from its current position. At step S16, the manipulation controller 4d determines whether or not the calculation has been done for all the target objects; if YES, the process proceeds to step S18, and if NO, the process proceeds to step S17. At step S17, the manipulation controller 4d selects a next target object among the plurality of target objects. At step S18, the manipulation controller 4d selects a target object minimizing the moving time of the pickup device, among the target objects having safe distances longer than the threshold. At step S19, the manipulation controller 4d selects a pickup attitude of the pickup device 3 for achieving the most efficient work. Thus, the manipulation controller 4d determines a manipulation plan for moving the pickup device 3 to the target object selected at step S18 and to the pickup attitude selected at step S19.
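The selection logic of steps S11 through S19 can be condensed into the following sketch; the dictionary fields are hypothetical names for the quantities computed at the respective steps:

```python
def plan_pickup(targets, threshold):
    """Steps S11-S19 condensed: discard targets whose safe
    distance is at or below the threshold (S12) or that the
    pickup device cannot reach (S14), then pick the remaining
    target with the minimum moving time (S18).
    """
    candidates = [
        t for t in targets
        if t["safe_distance"] > threshold and t["reachable"]
    ]
    if not candidates:
        return None  # no safe, reachable target to pick up
    return min(candidates, key=lambda t: t["moving_time"])

targets = [
    {"id": "A", "safe_distance": 0.3, "reachable": True, "moving_time": 1.0},
    {"id": "B", "safe_distance": 0.9, "reachable": False, "moving_time": 0.8},
    {"id": "C", "safe_distance": 1.2, "reachable": True, "moving_time": 1.8},
    {"id": "D", "safe_distance": 0.7, "reachable": True, "moving_time": 2.4},
]
print(plan_pickup(targets, threshold=0.5)["id"])  # C
```

Target "A" is nearest in time but too close to the human, and "B" is unreachable, so "C" is selected as the fastest safe candidate.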
At step S8 of
At step S9 of
For step S7 of
The object manipulation apparatus according to the first embodiment selects a target object and a pickup method that allow efficient work, while determining the possibility of a collision with a human. Therefore, it is possible to achieve more efficient work than by simply reducing the operation speed of the apparatus.
In an environment including the object manipulation apparatus, a plurality of target objects, and a human, the object manipulation apparatus according to the first embodiment can determine which one of the plurality of target objects is to be picked up for safety and efficiency. Therefore, it is possible to achieve more efficient work than by simply reducing the operation speed when a human approaches.
According to the second embodiment, the information processing device 4A is provided with the auxiliary storage device 46, instead of being connected to a sensor for recognizing a human. The auxiliary storage device 46 stores in advance a distance from a target object to a certain object which is other than the object manipulation apparatus and the target object. The auxiliary storage device 46 stores, for example, information on any predetermined point(s) or plane in space for safety. The information processing device 4A refers to the information stored in the auxiliary storage device 46, instead of recognizing a human based on data measured by a sensor. Thus, it is possible to achieve safety while achieving more efficient work.
It is also possible to calculate a safe distance, even when using a predetermined “point” in space, instead of the plane 8.
The auxiliary storage device 46 stores in advance a distance from a target object to a certain object which is other than the object manipulation apparatus and the target object, instead of any predetermined plane and point in space.
By recognizing the position and attitude of a target object 6, it is possible to calculate a safe distance for the target object 6 based on a relationship with the plane 8. Then, it is possible to make a plan for manipulating the target object based on the safe distance for each target object, and based on the position and attitude of each target object, in a manner similar to that of the first embodiment.
As described above, according to the second embodiment, it is possible to make a safe and efficient manipulation plan based on data measured by the sensor 2, and based on information on any predetermined point(s) or plane in space, stored in the auxiliary storage device 46.
The object manipulation apparatus may be provided with a plurality of pickup devices to be selected, in order to manipulate various target objects. The plurality of pickup devices to be selected include different types of pickup devices, e.g., a pinch-type pickup device and a suction-type pickup device. According to the third embodiment, for the purpose of such work, the object manipulation apparatus is provided with two types of pickup devices, i.e., the pickup device 3 and the pickup device 10. In this case, it is determined not only which one of a plurality of target objects is to be manipulated, but also which one of the plurality of pickup devices is used to pick up the target object. In this case, an overall processing flow is similar to that of
At step S31 of
At step S34, the manipulation controller 4d calculates a direction of action of each pickup device for pickup, based on the safe distance for the target object. At step S35, the manipulation controller 4d calculates a safety factor of each pickup device in the calculated direction of action.
When the same target object is picked up by different pickup devices, these pickup devices may have different directions of action from each other, for example, as shown in
At step S36, the manipulation controller 4d determines whether or not the calculation has been done for all the target objects; if YES, the process proceeds to step S38, and if NO, the process proceeds to step S37. At step S37, the manipulation controller 4d selects a next target object among the plurality of target objects. At step S38, the manipulation controller 4d selects a pickup device having the safest direction of action, and selects a corresponding target object.
Thus, the manipulation controller 4d estimates the positions of a target object manipulated by the respective pickup devices, selects one of the pickup devices having safe distances from the estimated positions longer than the threshold, and manipulates the target object using the selected pickup device. The manipulation controller 4d selects the pickup device having the minimum moving time, among the pickup devices having the safe distances from the estimated positions longer than the threshold, and manipulates the target object using the selected pickup device. Thus, it is possible to make a manipulation plan for achieving safe and efficient work.
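One way to score the safety of a direction of action, consistent with the idea that an approach moving the tool toward the human is less safe, can be sketched as follows; the displaced-tip heuristic and the `reach` parameter are assumptions of this sketch, not part of the disclosure:

```python
import math

def direction_safety(tip_pos, action_dir, human_pos, reach=0.1):
    """Safety factor of a direction of action: the distance from
    the human to the tool tip displaced by `reach` along the unit
    action direction, so an approach that moves the pickup device
    toward the human receives a lower (less safe) score.
    """
    moved_tip = tuple(t + reach * a for t, a in zip(tip_pos, action_dir))
    return math.dist(moved_tip, human_pos)

human = (1.0, 0.0, 0.0)
tip = (0.5, 0.0, 0.0)
toward_human = (1.0, 0.0, 0.0)
away_from_human = (-1.0, 0.0, 0.0)
print(direction_safety(tip, toward_human, human)
      < direction_safety(tip, away_from_human, human))  # True
```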
Further,
At step S51 of
At step S54, the manipulation controller 4d calculates a direction of action of each pickup device for pickup, based on the safe distance for the target object. At step S55, the manipulation controller 4d calculates a safety factor of each pickup device in the calculated direction of action. At step S56, the manipulation controller 4d calculates a moving time of each pickup device from its current position.
At step S57, the manipulation controller 4d determines whether or not the calculation has been done for all the target objects; if YES, the process proceeds to step S59, and if NO, the process proceeds to step S58. At step S58, the manipulation controller 4d selects a next target object among the plurality of target objects. At step S59, the manipulation controller 4d selects a pickup device having a safe direction of action with a short moving time, and selects a corresponding target object.
To achieve both safety and a short moving time, an evaluation function F = aS + bT may be used. "S" denotes a safety factor dependent on the direction of action, and is defined by the safe distance. "T" denotes the moving time of the pickup device from its current position. "a" and "b" denote arbitrary weight coefficients, and are adjusted depending on whether to prioritize safety or work efficiency. The evaluation function may further include weights for normalizing the respective factors.
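The evaluation function F = aS + bT can be sketched as follows. The sign convention (a > 0 rewarding a long safe distance, b < 0 penalizing a long moving time, and selecting the candidate maximizing F) is an assumption of this sketch, since the disclosure leaves the weights arbitrary:

```python
def evaluate(safety, moving_time, a=1.0, b=-0.5):
    """Evaluation function F = a*S + b*T: S is the safety factor
    (here, the safe distance in the direction of action) and T the
    moving time from the current position. With a > 0 and b < 0,
    the best candidate is the one that maximizes F.
    """
    return a * safety + b * moving_time

# (name, safety factor S, moving time T) for two candidates.
candidates = [("pinch", 1.2, 2.0), ("suction", 0.8, 0.5)]
best = max(candidates, key=lambda c: evaluate(c[1], c[2]))
print(best[0])  # suction
```

Here the suction candidate wins (F = 0.55 versus 0.2) because its much shorter moving time outweighs its slightly lower safety factor under these weights.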
According to the manipulation planning process of
As described above, a manipulation plan is determined by automatically selecting a pickup device and a target object so as to achieve both safety and efficiency.
For example, in the case that a pinch-type pickup device picks up a target object, even when the pickup device cannot pinch the target object, the pickup device may press itself against the target object to "scrape out" the target object. Thus, one pickup device may have a plurality of different pickup methods. Also in this case, it is possible to select a pickup method and determine a manipulation plan in a manner similar to that described above.
At step S61 of
At step S64, the manipulation controller 4d calculates a direction of action of each pickup method for pickup, based on the safe distance for the target object. At step S65, the manipulation controller 4d calculates a safety factor of the direction of action of each pickup method. At step S66, the manipulation controller 4d calculates a moving time of each pickup device from its current position.
At step S67, the manipulation controller 4d determines whether or not the calculation has been done for all the target objects; if YES, the process proceeds to step S69, and if NO, the process proceeds to step S68. At step S68, the manipulation controller 4d selects a next target object among the plurality of target objects. At step S69, the manipulation controller 4d selects a pickup method having a safe direction of action with a short moving time, and selects a corresponding target object.
According to the manipulation planning process of
Thus, the manipulation controller 4d estimates the positions of a target object manipulated by the respective pickup methods, selects one of the pickup methods having safe distances from the estimated positions longer than the threshold, and manipulates the target object using the selected pickup method. The manipulation controller 4d selects the pickup method minimizing the moving time of the pickup device, among the pickup methods having the safe distances from the estimated positions longer than the threshold, and manipulates the target object using the selected pickup method.
In addition, when there are a plurality of target objects to be selected, a plurality of pickup devices to be selected, and a plurality of pickup methods to be selected, it is possible to similarly select a target object, a pickup device, and a pickup method, and determine a manipulation plan.
By the above-described method, it is possible to select a target object, a pickup device, and a pickup method in a safe and efficient manner.
The object manipulation apparatus of
According to the manipulation planning process of
When selecting a target object to pick up, a moving time may be considered in a manner similar to that of the first to third embodiments. With reference to
TA = p × T1 + (1 − p) × T2
where "p" denotes the successful pickup rate, "T1" denotes the moving time in the case that the target object is successfully picked up, and "T2" denotes the moving time in the case that the pickup of the target object fails and is to be retried. It is determined that the shorter the assumed moving time TA is, the higher the work efficiency is.
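The assumed moving time TA can be computed directly from the formula above; the numeric values below are illustrative only:

```python
def assumed_moving_time(p, t_success, t_retry):
    """TA = p * T1 + (1 - p) * T2: the moving time weighted by
    the successful pickup rate p, where T1 is the time on success
    and T2 the time when the pickup fails and is retried.
    """
    return p * t_success + (1 - p) * t_retry

# A farther target with a high pickup rate can still win over a
# nearer target that often needs a retry.
ta_reliable = assumed_moving_time(0.75, 2.0, 5.0)  # 2.75
ta_risky = assumed_moving_time(0.5, 1.5, 5.0)      # 3.25
print(min(ta_reliable, ta_risky))  # 2.75
```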
According to the manipulation planning process of
According to the manipulation planning processes of
Although the fourth embodiment describes selecting one of a plurality of target objects according to the priority related to the successful pickup rate, one of a plurality of pickup devices (see
The manager device 13 is a personal computer or a server device that runs software for managing the overall work process, in sites such as a factory or a warehouse where the object manipulation apparatus according to the fifth embodiment is used. The manager device 13 may be a warehouse management system (WMS) for a warehouse, or may be a production management system for a production site.
The object manipulation apparatus of
According to the manipulation planning process of
According to the manipulation planning process of
Although the fifth embodiment describes selecting one of a plurality of target objects according to the priority related to an object(s) prioritized for a subsequent process, one of a plurality of pickup devices (see
The features of the above-described first to fifth embodiments may be combined with one another.
The embodiments of the present invention have the following features.
In a robot provided with a sensor and a pickup device,
an object recognizer calculates the position and attitude of a target object from data measured by the sensor,
a safe distance calculator calculates the possibility of a collision between the target object and a human, and
a manipulation controller determines any one or more among a target object to be manipulated by the robot, a type of the pickup device, and a pickup method of the pickup device, based on the position and attitude of the target object, and based on a safe distance for the target object.
Thus, it is possible to achieve efficient and safe manipulation of the target object by changing a target object to be manipulated, a pickup device, and a pickup method, while considering the risk of a collision with a human.
The manipulation controller calculates action commands so as to select a target object according to a predetermined priority, among target objects having safe distances within an arbitrary range, and pick up the selected target object in a prioritized manner.
Thus, it is possible to achieve efficient and safe manipulation of the target object by changing a target object to be manipulated, a pickup device, and a pickup method, while considering the risk of a collision with a human.
The manipulation controller calculates action commands so as to select at least one of: a target object allowing the pickup device to move from the robot's current attitude to a position and an attitude so as to reduce a pickup operation time, a target object having a high successful pickup rate, and a target object prioritized for a subsequent process, among target objects having safe distances within an arbitrary range, and pick up the selected target object in a prioritized manner.
Thus, it is possible to select a target object manipulated in the most efficient manner, while achieving safety.
When the manipulation controller can calculate a plurality of positions and attitudes of the pickup device for picking up a target object, the manipulation controller estimates the risk of a collision with a human based on directions of action for pickup, and calculates action commands to pick up the target object using the pickup device at a position and an attitude capable of reducing the risk of a collision with the human, in a prioritized manner.
Thus, it is possible to select a safe, efficient, and stable pickup method (the position and attitude of the pickup device relative to the target object).
The robot has a plurality of pickup devices, and
the manipulation controller estimates the risk of a collision with a human based on directions of action for pickup, and calculates action commands to pick up the target object using a type of pickup device capable of reducing the risk of a collision with the human, in a prioritized manner.
Thus, it is possible to select a safe pickup device.
The robot has a plurality of pickup methods, and
the manipulation controller estimates the risk of a collision with a human based on directions of action for pickup, and calculates action commands to pick up the target object using a pickup method capable of reducing the risk of a collision with the human, in a prioritized manner.
Thus, it is possible to select a safe and efficient pickup method.
The safe distance calculator determines the possibility of a collision between the robot and a human, based on the position and attitude of the target object, or a tip position of the pickup device when manipulating the target object, and based on a position and an attitude of the human calculated by a human recognizer from data measured by the sensor.
Thus, it is possible to sequentially calculate the risk of a collision with a human, to achieve safer and more efficient manipulation of the target object.
The safe distance calculator determines the possibility of a collision between the robot and a human, based on a tip position of the pickup device when manipulating the target object, and based on information on any predetermined point(s) or plane in space.
Thus, it is possible to define in advance the risk of a collision with a human in the work space, to achieve safer and more efficient manipulation of the target object.
According to the embodiments of the present invention, it is possible to automatically select a target object to be manipulated by an automatic machine, a pickup device of the automatic machine for manipulating the target object, and/or a pickup method, depending on the possibility of a collision with a human, and therefore, it is possible to achieve work with high efficiency even if the human approaches.
The present invention is applicable to an object manipulation apparatus and an object manipulation method for an automatic machine sharing work space with a human, and achieving safe and efficient manipulation of a plurality of target objects, the automatic machine being provided with means for sensing a surrounding environment, and means for picking up and manipulating the target objects.
| Number | Date | Country | Kind |
| --- | --- | --- | --- |
| JP2016-086373 | Apr 2016 | JP | national |

| Filing Document | Filing Date | Country | Kind |
| --- | --- | --- | --- |
| PCT/JP2017/013407 | 3/30/2017 | WO | 00 |

| Publishing Document | Publishing Date | Country | Kind |
| --- | --- | --- | --- |
| WO2017/183414 | 10/26/2017 | WO | A |

| Number | Name | Date | Kind |
| --- | --- | --- | --- |
| 20140067121 | Brooks et al. | Mar 2014 | A1 |
| 20140277723 | Nishimura et al. | Sep 2014 | A1 |
| 20140277725 | Kouno et al. | Sep 2014 | A1 |
| 20140316573 | Iwatake | Oct 2014 | A1 |
| 20150120055 | Miyazawa | Apr 2015 | A1 |

| Number | Date | Country |
| --- | --- | --- |
| 2010-120139 | Jun 2010 | JP |
| 2014-176932 | Sep 2014 | JP |
| 2014-176934 | Sep 2014 | JP |
| 2015-526309 | Sep 2015 | JP |

| Entry |
| --- |
| International Search Report dated May 16, 2017, in PCT/JP2017/013407, filed Mar. 30, 2017. |
| Office Action dated Feb. 3, 2021, in corresponding Chinese Patent Application No. 201780024149.5, 22 pages. |

| Number | Date | Country |
| --- | --- | --- |
| 20190061159 A1 | Feb 2019 | US |