The present disclosure relates to a waste sorting robot for sorting waste objects.
In the waste management industry, industrial and domestic waste is increasingly being sorted in order to recover and recycle useful components. Each type of waste, or “fraction” of waste, can have a different use and value. If waste is not sorted, then it often ends up in landfill or is incinerated, which has an undesirable environmental and economic impact.
It is known to sort industrial and domestic waste using a waste sorting robot. The waste sorting robot picks waste objects from a conveyor with a gripper and moves the object to a sorting location depending on the type of waste object.
A problem with previous solutions is the limited speed at which waste sorting robots can operate. The speed of operation limits the flow of waste objects to be sorted, and ultimately the throughput and value of this type of automated recycling. Adding further waste sorting robots along the conveyor increases the cost of the waste sorting system, as well as the footprint and complexity of the system.
Examples described hereinafter aim to address the aforementioned problems.
In a first aspect of the disclosure, there is provided a waste sorting robot comprising a manipulator movable within a working area, a gripper connected to the manipulator, wherein the gripper is arranged to selectively grip a waste object in the working area at a picking position and throw the waste object to a target position. The waste sorting robot comprises a sensor configured to detect object parameters of the waste objects, the object parameters comprising the orientation and/or physical characteristics of the respective waste objects. The waste sorting robot comprises a controller in communication with the sensor and being configured to receive the detected object parameters, wherein, for the respective waste objects being selectively gripped by the gripper, the controller is configured to determine an associated throw trajectory of the gripped waste object towards the target position based on the detected object parameters of the gripped waste object, and send control instructions to the gripper and/or manipulator so that the gripper and/or manipulator accelerates the gripped waste object and releases the waste object at a throw position with a throw velocity and throw angle towards the target position so that the waste object is thrown along the determined throw trajectory associated with the waste object, from the throw position to the target position.
Optionally, the physical characteristics comprise geometrical and/or material characteristics of the respective waste objects.
Optionally, the geometrical and/or material characteristics of the waste objects comprise size, shape, density, surface properties, and/or composition of the waste objects.
Optionally, the sensor comprises any of an image sensor, a force sensor, a motion sensor, an electric current sensor, a temperature sensor, and/or a chemical sensor.
Optionally, the controller is configured to determine the throw trajectory based on the object parameters, the picking position, and/or the target position.
Optionally, the throw position is determined based on the object parameters.
Optionally, the throw position is off-set a distance from the picking position, in a direction towards the target position.
Optionally, the controller is configured to determine the throw velocity of the throw trajectory by calculating a parabola of a projectile motion of the gripped waste object.
Optionally, the controller is configured to determine a mass of the gripped waste object from the object parameters, and determine an acceleration of the gripped waste object based on the mass so that the waste object is thrown with the throw velocity from the throw position to the target position.
Optionally, accelerating the gripped waste object to the throw velocity comprises applying a force to the gripped waste object during a time by a movement of the gripper and/or manipulator, and/or by applying an air or gas flow to the gripped waste object, wherein the air or gas flow is ejected from the gripper.
Optionally, the controller is configured to determine the throw trajectory by determining a drag force of the waste objects based on the detected object parameters.
Optionally, the controller is configured to determine a shape and/or cross-sectional area of the waste objects based on the geometrical characteristics, and to determine the drag force as being proportional to a drag coefficient based on the shape, and/or as being proportional to the cross-sectional area.
Optionally, the waste sorting robot comprises a throw sensor (112) configured to determine the position of a waste object after being thrown to the target position, wherein the controller is configured to receive said position as throw data, associate the throw data and the detected object parameters of the thrown waste object with a waste object model to be applied to subsequently gripped waste objects, determine deviations in the position of the thrown waste object by comparing the throw data with the throw trajectory, determine control instructions to the gripper and/or manipulator based on the deviations, and associate the determined control instructions with the waste object model.
Optionally, the controller is configured to input the throw data and the object parameters to a machine learning-based model to determine the control instructions for subsequently gripped waste objects.
In a second aspect of the disclosure, there is provided a method of controlling a waste sorting robot comprising moving a manipulator within a working area, controlling a gripper connected to the manipulator to selectively grip a waste object in the working area at a picking position and throw the waste object to a target position, and determining object parameters of the waste objects, the object parameters comprising the orientation and/or physical characteristics of the respective waste objects, wherein, for the respective waste objects being selectively gripped by the gripper, the method comprises determining an associated throw trajectory of the gripped waste object towards the target position based on the determined object parameters and picking position of the gripped waste object, and sending control instructions to the gripper and/or manipulator so that the gripper and/or manipulator accelerates the gripped waste object and releases the waste object at a throw position with a throw velocity and throw angle towards the target position so that the waste object is thrown along the determined throw trajectory associated with the waste object, from the throw position to the target position.
In a third aspect of the disclosure, there is provided a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method according to the second aspect.
Various other aspects and further examples are also described in the following detailed description and in the attached claims with reference to the accompanying drawings, in which:
The waste sorting robot 100 comprises a manipulator 101 which is movable within a working area 102. The waste sorting robot 100 comprises a gripper 103 which is connected to the manipulator 101. The gripper 103 is arranged to selectively grip a waste object 104 which moves into the working area 102 on a conveyor belt 113. The gripper 103 may comprise a pneumatic suction gripper holding and releasing waste objects 104 by varying the air or gas pressure. Alternatively, or in addition, the gripper 103 may comprise movable jaws to pinch the waste object 104 with a releasable grip. The conveyor belt 113 may be a continuous belt, or a conveyor belt formed from overlapping portions. The conveyor belt 113 may be a single belt or alternatively a plurality of adjacent moving belts (not shown). In other examples, the waste object 104 can be conveyed into the working area 102 via other conveying means. The conveyor belt 113 can be any suitable means for moving the waste object 104 into the working area 102. For example, the waste object 104 may be fed under gravity via a slide (not shown) to the working area 102.
The working area 102 is an area within which the manipulator 101 and gripper 103 are able to reach and interact with the waste object 104. The working area 102 as shown in
The gripper 103 is arranged to grip the waste object 104 in the working area 102, at a momentary position referred to below as a picking position 105, and throw the waste object 104 to a target position 106.
The manipulator 101, and the gripper 103 connected thereto, are configured to move within a working volume defined by the height above the working area 102 where the waste sorting robot 100 can manipulate the waste object 104. In some examples, the manipulator 101 is moveable along a plurality of axes. In some examples, the manipulator 101 is moveable along three axes which are substantially at right angles to each other. In this way, the manipulator 101 is movable in an X-axis which is parallel with the longitudinal axis of the conveyor belt 113 (“beltwise”). Additionally, the manipulator 101 is movable across the conveyor belt 113 in a Y-axis which is perpendicular to the longitudinal axis of the conveyor belt 113 (“widthwise”). The manipulator 101 is movable in a Z-axis which is in a direction normal to the working area 102 and the conveyor belt 113 (“heightwise”). Optionally, the manipulator 101 and/or gripper 103 can rotate about one or more axes (W), as schematically indicated in
The waste sorting robot 100 is arranged to sort the waste object 104 into fractions according to one or more parameters of the waste object 104. The waste objects 104 can be any type of industrial waste, commercial waste, domestic waste or any other waste which requires sorting and processing. Unsorted waste material comprises a plurality of fractions of different types of waste. Industrial waste can comprise fractions, for example, of metal, wood, plastic, hardcore and one or more other types of waste. In other examples, the waste can comprise any number of different fractions of waste formed from any type or parameter of waste. The fractions can be further subdivided into more refined categories. For example, metal can be separated into steel, iron, aluminium, etc. Domestic waste also comprises different fractions of waste such as plastic, paper, cardboard, metal, glass and/or organic waste. A fraction is a category of waste that the waste can be sorted into by the waste sorting robot 100. A fraction can be a standard or homogeneous composition of material, such as aluminium, but alternatively a fraction can be a category of waste defined by a customer or user.
The waste sorting robot 100 comprises a sensor 107 configured to detect object parameters of the waste objects 104, and a controller 108 in communication with the sensor 107 which is configured to receive the detected object parameters. The controller 108 may thus be configured to send movement instructions to the manipulator 101 and gripper 103 for interacting with the waste objects 104 to be sorted, based on the detected object parameters. That is, the gripper 103 may selectively grip the waste objects 104 to be sorted as mentioned above. The controller 108 may thus be configured to send instructions to the X-axis, Y-axis and Z-axis drive mechanisms of the manipulator 101 and gripper 103 to control and interact with the waste objects 104 on the conveyor belt 113. Various information processing techniques can be adopted by the controller 108 for controlling the manipulator 101 and gripper 103. Such information processing techniques are described in WO2012/089928, WO2012/052615, WO2011/161304 and WO2008/102052, which are incorporated herein by reference. The control of the waste sorting robot 100 is discussed in further detail with reference to
The controller 108 is thus configured to receive the detected object parameters from the sensor 107. The object parameters comprise the orientation and/or physical characteristics of the respective waste objects 104. The orientation of a waste object 104 should be construed as the orientation in the working volume in the X, Y, Z-directions. For example, two waste objects 104 of identical size and shape may have different orientations when being transported on the conveyor belt 113, since the waste objects 104 may lie on different sides on the conveyor belt 113. The orientation of such waste objects 104 may thus also be different when being held in place in the gripper 103, since the gripper 103 typically grips the waste objects 104 from a top-down approach, regardless of the orientation of the waste objects 104 on the conveyor belt 113. The physical characteristics may comprise geometrical characteristics of the respective waste objects 104, such as the shape, size, and/or volume. Alternatively, or in addition, the physical characteristics may comprise material characteristics, such as from what material the waste object 104 is made, density, and/or surface properties of the waste object 104.
The controller 108 is configured to determine an associated throw trajectory 109 of the respectively gripped waste object 104 towards the target position 106 based on the detected object parameters of said gripped waste object 104.
In one example the waste objects 104 may have essentially the same object parameters with respect to the geometrical characteristics, i.e. same size and shape, but the material characteristics may be different. The densities of the waste objects 104 may vary, and accordingly the weight. Determining the throw trajectory 109 may thus take into account the different weights of the waste objects 104. E.g. a heavier object needs to be accelerated for a longer duration by the gripper 103 and/or manipulator 101 to reach a desired throw velocity (v0), compared to a lighter object, due to the increased inertia of the heavier object. Other material characteristics may include structural parameters such as the flexibility of the waste objects 104. E.g. the waste object 104b in
In a further example the waste objects 104 may have essentially the same object parameters with respect to the geometrical characteristics and the material characteristics, but the orientation of the waste objects 104 on the conveyor belt 113 may vary. The orientation of the waste objects 104 when held in place by the gripper 103 may thus also vary, if the waste objects 104 are gripped from the same top-down approach. For example, a rectangular waste object 104 which has one side significantly shorter than the remaining two, e.g. shaped like a textbook, may have different trajectories through the air depending on which side is facing the throw direction. The waste object 104 may experience less drag if the shortest side is facing the throw direction, thus cutting through the air with less air resistance. Hence, the detected object parameters may comprise information of the orientation of the waste objects 104 to determine the associated throw trajectories 109.
Determining a throw trajectory 109 of the respectively gripped waste objects 104 based on the detected object parameters provides for optimizing the useful time interval during which the manipulator 101 and gripper 103 interact with each waste object 104. E.g. determining the required throw velocity (v0) of a first waste object 104a to follow a throw trajectory 109 to the target position 106 allows for minimizing the amount of time the gripper 103 needs to carry the first waste object 104a before it is thrown. The first waste object 104a may be thrown quickly at a throw position 110 just after being accelerated to the throw velocity (v0), and the gripper 103 may immediately target the next identified waste object 104b. A subsequently gripped waste object 104 may have associated object parameters which dictate a different throw trajectory 109, and the gripper 103 and/or manipulator 101 may be controlled to throw the waste object 104 accordingly. The optimized throwing of the waste objects 104 to the target positions 106 as described in the examples above provides for a more effective waste sorting robot 100. This means that the sorting speed can be increased. The speed of the conveyor belt 113 and/or the number of waste objects 104 on the conveyor belt 113 may be increased. In one example, by increasing the speed of the conveyor belt 113, the objects to be sorted on the conveyor belt 113 are more singularized and less likely to be overlapping. This means that the manipulation and object recognition is easier. This increases the processing rate, e.g. tons/hour, because the number of objects per hour fed to the robot increases.
The sensor 107 may be positioned upstream of the working area 102 so that detected parameters of the waste objects 104 may be sent to the controller 108 before the waste objects 104 enter the working area 102. The sensor 107 may comprise a plurality of different types of sensors 107. The sensor 107 or plurality of sensors 107 may be arranged at different positions outside or inside the working volume. In some examples the sensor 107 or plurality of sensors 107 may be arranged on, or in communication with, the manipulator 101 and/or gripper 103.
The sensor 107 may comprise any sensor suitable to detect a parameter of the waste object 104, e.g. one or more of an image sensor, a force sensor, a gyroscopic sensor, a motion sensor, an electric current sensor, a hall sensor, a metal detector, a temperature sensor, a chemical sensor, a visual and/or infrared spectroscopic detector, a radioactivity sensor, and/or a laser, e.g. LIDAR. An image sensor may comprise one or more of an RGB camera, an infrared camera, a 3D imaging sensor, and/or a terahertz imaging system.
The object parameters of the waste objects 104 may be detected by any of the mentioned sensors. For example, the geometrical dimensions and orientation of a waste object 104 may be determined from image data of the waste object 104 received from an image sensor 107. The image data may be used to determine any one of a size, shape, and volume of the waste object 104. Further, the image data may be utilized in a machine learning-based model to build up an object recognition capability of the waste sorting robot 100, as described further below. Thus, the recorded image data may be utilized to distinguish physical characteristics such as from what material the waste object 104 is made, and the associated material characteristics, in addition to the geometrical dimensions and orientation of a waste object 104.
The image data may be combined with sensor data from any one of the aforementioned sensors. In one example, the position of the gripper 103 provides sensory input to the controller 108. E.g. the position of the gripper 103 in
In some examples, the gripper 103 comprises a gyroscopic sensor, such as an electrical MEMS gyroscope used as a velocity sensor. The controller 108 may thus determine the acceleration and velocity of the gripper 103 during operation. The velocity of the gripper 103 may thus be monitored and controlled at the throw position 110 so that the velocity of the gripper 103 translates to the desired throw velocity (v0) of the waste object 104 when being released from the gripper 103. The throw angle (φ0) may be controlled by the X-, Y-, Z-movement of the gripper 103 and manipulator 101. For example, an upwards acceleration of the waste object 104 as illustrated in
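By way of illustration, the following is a minimal sketch of how the monitored velocity components of the gripper 103 at the release moment translate to a throw angle (φ0), with the X-, Y- and Z-axes as defined above. The function name and the example values are illustrative assumptions only, not a prescribed implementation:

```python
import math

def throw_angle_deg(vx: float, vy: float, vz: float) -> float:
    """Throw angle phi0: the elevation of the gripper velocity vector above
    the horizontal X-Y plane at the throw position, with the Z-axis normal
    to the working area."""
    horizontal_speed = math.hypot(vx, vy)
    return math.degrees(math.atan2(vz, horizontal_speed))

# Example: 2 m/s beltwise (X) and 2 m/s upwards (Z) at release gives a
# 45 degree throw angle and a throw velocity |v0| of about 2.8 m/s.
angle = throw_angle_deg(2.0, 0.0, 2.0)
```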
The controller 108 may be configured to determine the throw trajectory 109 based on the object parameters and the picking position 105. The target position 106 may in some applications be at a defined distance, e.g. when having a single target chute extending along the working area 102. The picking position 105 may thus define the distance to the target position 106. The respective waste objects 104 may thus be accelerated and released with a throw velocity (v0) and throw angle (φ0) to be thrown along the different distances from the picking positions 105. This provides for a time-effective sorting of the waste objects 104. It is however conceivable that in some examples the throw position 110 is set to a defined distance from the target position 106, e.g. for waste objects 104 exceeding a certain weight, or having a high aerodynamic drag coefficient as described with reference to
The throw position 110 may be determined based on the detected object parameters as described above. For example, turning to
The throw position 110 may be off-set a distance 111 from the picking position 105, in a direction towards the target position 106.
The controller 108 may be configured to determine the throw velocity (v0) of the throw trajectory 109 by calculating a parabola of a projectile motion of the gripped waste object 104. The throw trajectory 109 may thus be estimated by a simplified model, i.e. a parabola of projectile motion, where the aerodynamic drag force is not taken into account. The distance (x) travelled by the waste object 104 depends in this model only on time, the throw angle (φ0) and the throw velocity (v0); x=v0·t·cos(φ0). The distance (x) may be determined from the picking position 105 and the target position 106. The throw angle (φ0) may in some examples be set to a defined value, such as in the range of 30-60°, e.g. 35, 45 or 55°. The throw velocity (v0) may consequently be determined from such a projectile motion model. This may be particularly advantageous in some applications, e.g. where the waste objects 104 exhibit a minimum of drag force when thrown.
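As a worked example of this projectile motion model, the sketch below solves for the throw velocity (v0) given the distance (x) and a set throw angle (φ0), under the additional simplifying assumption that the throw position and the target position are at the same height. The function name and the numbers are illustrative only:

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def throw_velocity(distance_m: float, throw_angle_deg: float) -> float:
    """Drag-free throw velocity v0 needed to cover `distance_m`.

    From x = v0*t*cos(phi0) and the flight time of the parabola, the range
    of a projectile released and landing at the same height is
    x = v0^2 * sin(2*phi0) / g, solved here for v0.
    """
    phi0 = math.radians(throw_angle_deg)
    return math.sqrt(G * distance_m / math.sin(2.0 * phi0))

# Example: a target position 1.5 m away and a 45 degree throw angle
# require a throw velocity of about 3.8 m/s.
v0 = throw_velocity(1.5, 45.0)
```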
The controller 108 may be configured to determine a mass (m) of the gripped waste object 104 from the object parameters, e.g. based on parameters as exemplified above. The controller 108 may be configured to determine an acceleration of the gripped waste object 104 based on the determined or estimated mass (m) so that the waste object 104 is thrown with the throw velocity (v0) from the throw position 110 to the target position 106. For example, the relationship F=m·a=m·dv/dt, where F is the force acting on the waste object 104 and a is the acceleration thereof, gives T=v0·m/F if the force F is constant during a time T. Thus, T is the time the gripper 103 needs to apply the force F to accelerate the waste object 104 with a mass (m) to the throw velocity (v0).
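A minimal sketch of this relationship, assuming a constant accelerating force; the force magnitude and masses below are illustrative values only:

```python
def acceleration_time(mass_kg: float, throw_velocity_ms: float, force_n: float) -> float:
    """Time T a constant force F must act to accelerate a gripped object
    of mass m from rest to the throw velocity v0 (T = m*v0/F)."""
    return mass_kg * throw_velocity_ms / force_n

# A 0.3 kg object needs three times longer than a 0.1 kg object
# to reach the same throw velocity under the same 5 N force.
t_heavy = acceleration_time(0.3, 3.8, 5.0)  # ~0.23 s
t_light = acceleration_time(0.1, 3.8, 5.0)  # ~0.08 s
```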
Thus, accelerating the gripped waste object 104 to the throw velocity (v0) may comprise applying a force (F) to the gripped waste object 104 during a time (T) by a movement of the gripper 103 and/or the manipulator 101. Alternatively, or in addition, the waste object 104 may be accelerated to the throw velocity (v0) by applying an airflow to the gripped waste object 104, where the airflow is ejected from the gripper 103. Hence, a pressure from a flow of air or gas, ejected from the gripper 103 onto the waste object 104, applies a force onto the waste object 104 to accelerate it. In some examples, the gripper 103 and/or manipulator 101 may accelerate the waste object 104 by a movement in the X-, Y-, Z-directions in combination with pushing the waste object 104 away from the gripper 103 by an airflow. The gripper 103 may in some examples comprise a suction gripper 103 comprising a suction cup configured to physically engage with a surface of the waste object 104. A negative pressure may be created in the suction cup so that the waste object 104 is held in place by the gripper 103 due to the force created by the negative pressure. The suction gripper 103 may be in fluid communication with a pneumatic system (not shown) connecting the suction gripper 103 to a compressed air or gas supply. The air or gas supply to the suction gripper 103 may be reversed so that the negative pressure is released and a positive pressure may be exerted onto the waste object 104 to throw the waste object 104 as described above. In a further example, the gripper 103 comprises movable jaws to grip the waste objects 104, and a gas or air connection to push and throw the waste objects 104 away from the gripper 103 when the jaws release their grip.
The controller 108 may be configured to determine the throw trajectory 109 by determining a drag force of the waste objects 104 based on the detected object parameters. This provides for determining a more accurate throw trajectory 109 of the waste objects 104. The efficiency of the waste sorting robot 100 may thus be further improved. This may be particularly advantageous where the waste objects 104 to be sorted include objects with shapes and/or cross-sectional areas which exhibit a non-negligible impact from aerodynamic drag when being thrown in the air.
The controller 108 may be configured to determine a shape and/or cross-sectional area of the waste objects 104 based on the detected geometrical characteristics. The controller 108 may be configured to determine the drag force as being proportional to a drag coefficient which is based on the shape of the waste objects 104. For example, a sphere may have a lower drag coefficient than an angular box, even though the cross-sectional area may be the same. Alternatively, or in addition, the controller 108 may be configured to determine the drag force as being proportional to the cross-sectional area of the waste object 104. Hence, a drag force Fdrag may be determined as Fdrag=−f(v)·v̂, where f(v) is a function characterising the magnitude of the drag force in dependence on the speed (v), v̂ is the unit vector along the velocity, and f(v) may be expressed as f(v)=kρA·v², in one example where a quadratic dependence on the speed is assumed, and where k is the drag coefficient and depends on the shape, ρ is the density of air, and A is the cross-sectional area of the waste object 104. The total force (F) on the waste object may thus be expressed as F=−mg·ŷ+Fdrag=−mg·ŷ−kρA·v²·v̂, with vertical unit vector ŷ and gravitational acceleration g.
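The effect of such a drag force on the throw trajectory 109 can be illustrated with a simple numerical integration. The sketch below uses the same quadratic drag model with an explicit Euler scheme; the masses, drag coefficients and cross-sectional areas are illustrative assumptions, chosen to show how a flat, high-drag object lands short of an otherwise identical compact object:

```python
import math

G = 9.81       # gravitational acceleration (m/s^2)
RHO_AIR = 1.2  # density of air (kg/m^3)

def throw_range_with_drag(v0: float, angle_deg: float, mass: float,
                          k: float, area: float, dt: float = 1e-4) -> float:
    """Horizontal distance travelled under gravity and a quadratic drag
    force Fdrag = -k*rho*A*|v|^2 * v_hat, integrated step by step until
    the object falls back to the release height."""
    phi = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(phi), v0 * math.sin(phi)
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        drag = k * RHO_AIR * area * speed  # f(v)/|v|, so acceleration = -(drag/m)*v
        vx += -(drag / mass) * vx * dt
        vy += (-G - (drag / mass) * vy) * dt
        x += vx * dt
        y += vy * dt
    return x

# Same mass and throw, but different shape and cross-sectional area:
flat = throw_range_with_drag(4.0, 45.0, mass=0.05, k=1.1, area=0.06)
compact = throw_range_with_drag(4.0, 45.0, mass=0.05, k=0.5, area=0.005)
# `flat` (e.g. a cardboard sheet) lands well short of `compact`
# (e.g. a small bottle), so it must be thrown harder or from closer by.
```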
The waste sorting robot 100 may comprise a throw sensor 112 configured to determine the position of a waste object 104 after being thrown to the target position 106.
In some examples, the throw data 109′ established for some categories of waste objects 104 may not agree with the desired throw trajectory 109 towards the correct target position 106.
Building of the waste object model as described in the example above may be part of a machine learning-based capability of the controller 108 and the waste sorting robot 100. Thus, the controller 108 may be configured to input the throw data 109′ and the object parameters to a machine learning-based model to determine the control instructions for subsequently gripped waste objects 104. In some examples, the sensor 107 may comprise an imaging sensor, such as a camera. Object parameters for the different waste objects 104 may be determined from the image data received from the sensor 107. Different image features, such as shapes, colours, geometrical relationships etc. of the detected waste objects 104 may be assigned as the characterising object parameters in a waste object model, to create e.g. different categories of waste objects. The waste object model may be continuously populated with the throw data 109′ for the respective categories of waste objects, to continuously adapt the associated control instructions. The categorization of the waste objects 104 may be continuously refined by analysing and comparing the image data of waste objects 104 having similar throw data 109′ for a similar set of control instructions. The same principle may be applied to any of the sensor data received from any of the aforementioned sensors 107.
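As a loose illustration of how such a model could be populated, the sketch below accumulates throw data 109′ per waste object category and derives a throw velocity correction for subsequent throws. The class, its square-root scaling rule (based on the drag-free range being proportional to v0²), and all names are hypothetical; the disclosure does not prescribe a specific model structure:

```python
import math
from collections import defaultdict

class WasteObjectModel:
    """Per-category record of landing deviations, used to adapt the
    control instructions (here: the throw velocity) for later throws."""

    def __init__(self):
        # category -> list of (landing position - target position) in metres
        self._deviations = defaultdict(list)

    def record_throw(self, category: str, deviation_m: float) -> None:
        """Associate observed throw data with the waste object model."""
        self._deviations[category].append(deviation_m)

    def corrected_velocity(self, category: str, v0: float,
                           throw_distance_m: float) -> float:
        """Scale the nominal throw velocity to compensate for the mean
        observed over- or undershoot of this category; in the drag-free
        model the range grows with v0^2, hence the square root."""
        history = self._deviations[category]
        if not history:
            return v0
        mean_dev = sum(history) / len(history)
        actual = throw_distance_m + mean_dev  # where this category lands
        return v0 * math.sqrt(throw_distance_m / actual)
```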
A computer program product is provided comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method 200.
In another example two or more examples are combined. Features of one example can be combined with features of other examples.
Examples of the present disclosure have been discussed with particular reference to the examples illustrated. However it will be appreciated that variations and modifications may be made to the examples described within the scope of the disclosure.
Number | Date | Country | Kind |
---|---|---|---|
2030327-7 | Oct 2020 | SE | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/FI2021/050722 | 10/26/2021 | WO |
Publishing Document | Publishing Date | Country | Kind |
---|---|---|---|
WO2022/090625 | 5/5/2022 | WO | A |
Number | Name | Date | Kind |
---|---|---|---|
3896940 | Birrell | Jul 1975 | A |
4305130 | Kelley et al. | Dec 1981 | A |
4616121 | Clocksin | Oct 1986 | A |
4679291 | Schmeal et al. | Jul 1987 | A |
4763941 | Sniderman | Aug 1988 | A |
4835730 | Shimano et al. | May 1989 | A |
4998442 | Brown et al. | Mar 1991 | A |
5100005 | Noble et al. | Mar 1992 | A |
5116190 | Silke | May 1992 | A |
5188411 | Golden | Feb 1993 | A |
5244242 | Goedecke et al. | Sep 1993 | A |
5299693 | Ubaldi et al. | Apr 1994 | A |
5322272 | Benz et al. | Jun 1994 | A |
5423431 | Westin | Jun 1995 | A |
5445247 | Sato | Aug 1995 | A |
5572785 | Tveit | Nov 1996 | A |
5617338 | Sugano | Apr 1997 | A |
5617898 | Nagai | Apr 1997 | A |
5626378 | Puhl et al. | May 1997 | A |
5636966 | Lyon et al. | Jun 1997 | A |
5733098 | Lyon et al. | Mar 1998 | A |
5735782 | Berg | Apr 1998 | A |
5934864 | Lyon et al. | Aug 1999 | A |
5987726 | Akeel | Nov 1999 | A |
5992691 | Post et al. | Nov 1999 | A |
6008636 | Miller et al. | Dec 1999 | A |
6024392 | Blatt | Feb 2000 | A |
6056108 | Buchi et al. | May 2000 | A |
6168220 | Schmalz et al. | Jan 2001 | B1 |
6213709 | Hebrank | Apr 2001 | B1 |
6256553 | Erikkila | Jul 2001 | B1 |
6304050 | Skaar et al. | Oct 2001 | B1 |
6331758 | Takanashi et al. | Dec 2001 | B1 |
6817639 | Schmalz et al. | Nov 2004 | B2 |
6967465 | Takenaka et al. | Nov 2005 | B2 |
7327112 | Hlynka et al. | Feb 2008 | B1 |
7415321 | Okakaki et al. | Aug 2008 | B2 |
7650203 | Maslov et al. | Jan 2010 | B2 |
7957580 | Ban et al. | Jun 2011 | B2 |
7966094 | Ban et al. | Jun 2011 | B2 |
7996114 | Ban et al. | Aug 2011 | B2 |
8098928 | Ban et al. | Jan 2012 | B2 |
8351681 | Koike | Jan 2013 | B2 |
8380342 | Ban et al. | Feb 2013 | B2 |
8606398 | Eakins et al. | Dec 2013 | B2 |
8777284 | Schaller et al. | Jul 2014 | B2 |
8880217 | Izumi et al. | Nov 2014 | B2 |
9082454 | Yao et al. | Jul 2015 | B2 |
9205558 | Zevenbergen et al. | Dec 2015 | B1 |
9272417 | Konolige | Mar 2016 | B2 |
9230329 | Lukka | May 2016 | B2 |
9486926 | Kawano | Nov 2016 | B2 |
9600798 | Battles et al. | Mar 2017 | B2 |
9713875 | Lukka | Jul 2017 | B2 |
9789517 | Doublet et al. | Oct 2017 | B2 |
9914213 | Vijayanarasimhan et al. | Mar 2018 | B2 |
10449572 | Ripley | Oct 2019 | B2 |
10464105 | Koistinen | Nov 2019 | B2 |
10482120 | Ripley | Nov 2019 | B2 |
10576630 | Diankov et al. | Mar 2020 | B1 |
10639790 | Bacon et al. | May 2020 | B1 |
11607807 | Khansari Zadeh | Mar 2023 | B2 |
11660762 | Holopainen | May 2023 | B2 |
11851292 | Lukka et al. | Dec 2023 | B2 |
20020190230 | Dworkowski | Dec 2002 | A1 |
20020193909 | Parker et al. | Dec 2002 | A1 |
20030012925 | Gorrell | Jan 2003 | A1 |
20030133775 | Specher | Jul 2003 | A1 |
20040094979 | Damhuis | May 2004 | A1 |
20050077856 | Takenaka et al. | Apr 2005 | A1 |
20050173164 | Maslov et al. | Aug 2005 | A1 |
20050218677 | Llich | Oct 2005 | A1 |
20050279612 | Boberg | Dec 2005 | A1 |
20060053624 | Maeda et al. | Mar 2006 | A1 |
20070131213 | Matsuda | Jun 2007 | A1 |
20070147678 | Gotting | Jun 2007 | A1 |
20070187299 | Valerio | Aug 2007 | A1 |
20070213874 | Oumi et al. | Sep 2007 | A1 |
20070276539 | Habibi et al. | Nov 2007 | A1 |
20070299559 | Jassen | Dec 2007 | A1 |
20080150965 | Bischoff et al. | Jun 2008 | A1 |
20080240511 | Ban et al. | Oct 2008 | A1 |
20090025502 | Nakamoto | Jan 2009 | A1 |
20100004778 | Arimatsu et al. | Jan 2010 | A1 |
20110076128 | Johnson | Mar 2011 | A1 |
20110231018 | Iwai et al. | Sep 2011 | A1 |
20120032461 | Hukelmann | Feb 2012 | A1 |
20130127192 | Regan et al. | May 2013 | A1 |
20130127194 | Regan et al. | May 2013 | A1 |
20140025197 | Mattern | Jan 2014 | A1 |
20140036276 | Gross et al. | Feb 2014 | A1 |
20140062112 | Ho-Young | Mar 2014 | A1 |
20150016933 | Ochiishi | Jan 2015 | A1 |
20150241203 | Jordil | Aug 2015 | A1 |
20150328779 | Bowman et al. | Nov 2015 | A1 |
20160332310 | Conall | Nov 2016 | A1 |
20170028562 | Yamazaki et al. | Feb 2017 | A1 |
20170073174 | Tanaka | Mar 2017 | A1 |
20170174439 | Ripley | Jun 2017 | A1 |
20170291308 | Junichi | Oct 2017 | A1 |
20170355083 | Wigren | Dec 2017 | A1 |
20180036774 | Lukka et al. | Feb 2018 | A1 |
20180050451 | Takanishi et al. | Feb 2018 | A1 |
20180127219 | Wagner et al. | May 2018 | A1 |
20190030571 | Horowitz et al. | Jan 2019 | A1 |
20190039838 | Curhan et al. | Feb 2019 | A1 |
20190084012 | McCoy et al. | Mar 2019 | A1 |
20190126478 | Scott | May 2019 | A1 |
20190130560 | Horowitz et al. | May 2019 | A1 |
20190217342 | Parr et al. | Jul 2019 | A1 |
20190291283 | Kurz | Sep 2019 | A1 |
20190361672 | Odhner et al. | Nov 2019 | A1 |
20190389082 | Higo | Dec 2019 | A1 |
20200048015 | Martin et al. | Feb 2020 | A1 |
20200077074 | Denenberg et al. | Mar 2020 | A1 |
20200087118 | Sato et al. | Mar 2020 | A1 |
20200130935 | Wagner | Apr 2020 | A1 |
20200269429 | Chavez | Aug 2020 | A1 |
20200290214 | Watanabe et al. | Sep 2020 | A1 |
20210061588 | Lukka et al. | Mar 2021 | A1 |
20210114062 | Liu | Apr 2021 | A1 |
20210206586 | Douglas | Jul 2021 | A1 |
20210237260 | Holopainen et al. | Aug 2021 | A1 |
20210237262 | Holopainen et al. | Aug 2021 | A1 |
20230144252 | Lukka et al. | May 2023 | A1 |
20230191608 | Horowitz | Jun 2023 | A1 |
20230241787 | Holopainen et al. | Aug 2023 | A1 |
20240042624 | Holopainen | Feb 2024 | A1 |
Number | Date | Country |
---|---|---|
1291537 | Apr 2001 | CN |
1651295 | Aug 2005 | CN |
101088720 | Dec 2007 | CN |
101471546 | Jul 2009 | CN |
101618444 | Jan 2010 | CN |
101508181 | Apr 2011 | CN |
102431787 | May 2012 | CN |
203212009 | Sep 2013 | CN |
103787059 | May 2014 | CN |
204057223 | Dec 2014 | CN |
104513012 | Apr 2015 | CN |
104589351 | May 2015 | CN |
105196302 | Dec 2015 | CN |
105215076 | Jan 2016 | CN |
105372510 | Mar 2016 | CN |
107363405 | Nov 2017 | CN |
107650139 | Feb 2018 | CN |
107738264 | Feb 2018 | CN |
106362957 | May 2018 | CN |
108013841 | May 2018 | CN |
108032324 | May 2018 | CN |
108971190 | Dec 2018 | CN |
109013384 | Dec 2018 | CN |
109176522 | Jan 2019 | CN |
109249402 | Jan 2019 | CN |
109433633 | Mar 2019 | CN |
110116415 | Aug 2019 | CN |
209866708 | Dec 2019 | CN |
2455284 | May 1976 | DE |
4127446 | May 1995 | DE |
4440748 | May 1996 | DE |
10319253 | Dec 2004 | DE |
102010029662 | Dec 2011 | DE |
102015009998 | Feb 2016 | DE |
102015220413 | Apr 2017 | DE |
3056289 | Jan 2019 | DK |
0253229 | Jan 1988 | EP |
0706838 | Apr 1996 | EP |
1466704 | Oct 2004 | EP |
1810795 | Jul 2007 | EP |
1918479 | May 2008 | EP |
2476813 | Jul 2012 | EP |
2585256 | May 2013 | EP |
2694224 | Feb 2014 | EP |
2758216 | Jul 2014 | EP |
2810901 | Mar 2016 | EP |
3056288 | Aug 2016 | EP |
3056289 | Aug 2016 | EP |
3236083 | Oct 2017 | EP |
3254998 | Dec 2017 | EP |
3496873 | Jun 2019 | EP |
3626412 | Mar 2020 | EP |
3658302 | Jun 2020 | EP |
3672764 | Jul 2020 | EP |
3674040 | Jul 2020 | EP |
3677388 | Jul 2020 | EP |
2325915 | Dec 1998 | GB |
2354752 | Apr 2001 | GB |
MI20 081 360 | Jan 2010 | IT |
S5045304 | Apr 1975 | JP |
61-249292 | Nov 1986 | JP |
H01 240287 | Sep 1989 | JP |
H03154793 | Jul 1991 | JP |
H4176583 | Jun 1992 | JP |
H0489687 | Aug 1992 | JP |
H05228780 | Sep 1993 | JP |
H05318369 | Dec 1993 | JP |
H0630857 | Apr 1994 | JP |
H0740273 | Feb 1995 | JP |
05089337 | Dec 1996 | JP |
H092682 | Jan 1997 | JP |
9131575 | May 1997 | JP |
H1069315 | Mar 1998 | JP |
10-202571 | Aug 1998 | JP |
H11198076 | Jul 1999 | JP |
H11320461 | Nov 1999 | JP |
2001138280 | May 2001 | JP |
2002301683 | Oct 2002 | JP |
2003031636 | Jan 2003 | JP |
2003223642 | Aug 2003 | JP |
2005117791 | Apr 2005 | JP |
3684278 | Aug 2005 | JP |
2007040273 | Feb 2007 | JP |
2010089238 | Apr 2010 | JP |
4947691 | Jun 2012 | JP |
2012115916 | Jun 2012 | JP |
2013252568 | Dec 2013 | JP |
2014516810 | Apr 2014 | JP |
5688924 | Mar 2015 | JP |
2016068034 | May 2016 | JP |
2016225336 | Dec 2016 | JP |
2020022929 | Feb 2020 | JP |
2020022930 | Feb 2020 | JP |
2020062633 | Apr 2020 | JP |
20190050145 | May 2019 | KR |
20190071387 | Jun 2019 | KR |
1 399 116 | May 1988 | SU |
WO 8908537 | Sep 1989 | WO |
WO 89012019 | Dec 1989 | WO |
WO 9524544 | Sep 1995 | WO |
WO 98019799 | May 1998 | WO |
WO 2008102052 | Aug 2008 | WO |
WO 2011161304 | Dec 2011 | WO |
WO 2012052615 | Apr 2012 | WO |
WO 2012089928 | Jul 2012 | WO |
WO 2012156579 | Nov 2012 | WO |
WO 2013068115 | May 2013 | WO |
WO 2014202998 | Dec 2014 | WO |
WO 2016070412 | May 2016 | WO |
WO 2019056102 | Mar 2019 | WO |
WO 2019207201 | Oct 2019 | WO |
WO 2019207202 | Oct 2019 | WO |
WO 2019215384 | Nov 2019 | WO |
WO 2020053195 | Mar 2020 | WO |
WO2020079125 | Apr 2020 | WO |
WO 2020082176 | Apr 2020 | WO |
Entry |
---|
May 2, 2023 Int'l Preliminary Report on Patentability from PCT/FI2021/050722 (5 pgs). |
International Search Report in PCT/FI2021/050722 dated Jan. 19, 2022. |
Office Action received in Swedish Application No. 2030327-7 dated Jun. 29, 2021. |
Boudaba et al., “Grasping of Planar Objects using Visual Perception”, Article, p. 605-611. |
Chinese Office Action, dated Apr. 3, 2015,in corresponding Chinese Patent Application No. 201280056743.X. |
Cort, “Robotic parts feeding,” Assembly, Jun. 2007, https://www.assemblymag.com/articles/86446-robotic-parts-feeding. |
Extended European Search Report issued in PCT/FI2019/050322 dated Mar. 29, 2022. |
Extended European Search Report issued in PCT/FI2019/050322 dated Aug. 31, 2022. |
Extended European Search Report issued in PCT/FI2019/050320 dated Jan. 24, 2022. |
Finnish Search Report dated Jun. 19, 2012, corresponding to the Foreign Priority Application No. 20115923. |
Fujimoto et al., “Image-Based Visual Servoing for Grasping Unknown Objects”, Article, p. 876-881. |
International Preliminary Report on Patentability issued in PCT/FI2021/050720 dated May 2, 2023. |
International Search Report issued in PCT/FI2012/050909 dated Mar. 4, 2013. |
International Search Report and Written Opinion of PCT/FI2019/050319, dated Jul. 29, 2019, in 17 pages. |
International Search Report and Written Opinion issued in PCT/FI2019/050320 dated Jul. 30, 2019. |
International Search Report and Written Opinion of PCT/FI2019/050321, dated Jul. 30, 2019, in 13 pages. |
International Search Report and Written Opinion issued in PCT/FI2019/050322 dated Aug. 28, 2019. |
International Search Report issued in PCT/FI2021/050453 dated Sep. 2, 2021. |
International Search Report and Written Opinion issued in PCT/FI2021/050088 dated May 4, 2021. |
International Search Report issued in PCT/FI2021/050720 dated Nov. 16, 2021. |
Jang et al., “Visibility-based spatial reasoning for object manipulation in cluttered environments”, Apr. 2008, pp. 42-438, vol. 40, Issue 4. |
Japanese Office Action dated Aug. 25, 2015; Application No. 2013-546749. |
Japanese Office Action dated Jul. 25, 2016; Application No. 2014-531283. |
Kristensen et al., “Bin-picking with a solid state range camera”, Jun. 30, 2001, pp. 143-151, vol. 35, Issues 3-4. |
Morales et al., “Vision-based three-finger grasp synthesis constrained by hand geometry,” Article, Jun. 30, 2006, p. 496-512, vol. 54, Issue 6. |
Search Report received in Swedish Application No. 2030211-3 dated Feb. 4, 2021. |
Office Action received in Swedish Application No. 2030325-1 dated Jun. 28, 2021. |
Wong et al., “Vision Strategies for Robotic Manipulation of Natural Objects,” Article, Dec. 2-4, 2009, p. 8, New Zealand. |
Yanagihara et al., “Parts-picking in Disordered Environment,” Article, Nov. 3-5, 1991, p. 517-522, Japan. |
Number | Date | Country | Kind
---|---|---|---|
20230405639 | Dec 2023 | US | A1