The invention generally relates to robotic systems, and relates in particular to robotic systems that are used in environments involving human activity.
Certain robotic systems are intended to be used in environments that also include people. In warehouse sortation environments, for example, human workers are responsible for taking objects and placing them onto shelves in correct locations. A common paradigm is to use workcells with put-to-light systems to facilitate this process by indicating the desired target location via lights. These systems typically employ either monochromatic lights or a display showing the number of items that the worker should place on the shelf. By using these lights as visual cues, valuable time is shaved off of the sortation task. Additionally, workers may have difficulty sorting unexpected, damaged, non-bar-coded, or otherwise problematic or unsortable items. When workers come across such objects, they may raise a small flag or activate a light to indicate to a manager or foreman that they have had difficulty processing an item. This allows them to continue sorting without leaving their work area.
Current solutions such as put-to-light, however, do not translate well for use in automated systems. Using lighting to show a robot where to place an object may, in fact, be fundamentally worse than sending placement locations directly from a Warehouse Management System or other database. As such, automated systems do not typically use lighting in determining object placement. Further, human workers who work in sortation facilities may have preconceived expectations of the role lighting plays in sortation. There remains a need, therefore, for a robotic system that is able to communicate quickly and efficiently, to human workers in the robotic environment, information that may help keep each human worker safe from injury.
In accordance with an embodiment, the invention provides a robotic system including an articulated arm with an end effector. The robotic system is for use in a robotic environment requiring interaction with persons in the robotic environment, and the robotic system includes a plurality of lights that are illuminated responsive to known near-future movements of the articulated arm to convey the known near-future movements of the articulated arm to the persons in the robotic environment.
In accordance with another embodiment, the invention provides a robotic system including an articulated arm with an end effector. The robotic system is for use in an environment requiring interaction with persons in the robotic environment, and includes a plurality of sortation locations and a plurality of lights that are each associated with a sortation location. The system provides that one or more of the plurality of lights is engageable to be illuminated to indicate that the system plans to move the end effector toward the sortation location that is associated with the one or more of the plurality of lights.
In accordance with a further embodiment, the invention provides a method of providing communication lighting in a robotic environment requiring interaction with persons in the robotic environment. The method includes the steps of providing, in the robotic environment, a robotic system having an end effector, and providing illumination indicative of a planned direction of movement of the end effector of the robotic system in the robotic environment.
The following description may be further understood with reference to the accompanying drawings.
The drawings are shown for illustrative purposes only.
In accordance with an embodiment, the invention provides a lighting system for use in a robotic sortation environment, as well as the use of such a system for conveying system state. In certain embodiments, an array of RGB LEDs is placed on shelves, and an array of RGB LEDs is mounted on an end effector or manipulator. In further embodiments, the invention provides a method of conveying robot state using these systems as well as a light pole.
The invention therefore provides systems and methods for conveying state and near-future information via LED arrays during robotic sortation. In certain embodiments, the invention provides systems and methods for facilitating communication with human workers.
In accordance with various embodiments, the invention provides a robotic system that includes an array of RGB LEDs mounted above or below shelves, and provides, in an example, information regarding where the automated system will place future objects, the location of previously placed objects, and general system state. An array of RGB LEDs may be mounted on a manipulator or end effector. The invention also provides for the use of the system in conveying information about the process of picking objects, the qualities of picked objects, the qualities of grasps on objects, and general system state, as well as the use of light poles in conveying automated system state for sortation.
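To make the shelf-array arrangement concrete, the following is a minimal illustrative sketch, in Python, of one possible way to associate sortation locations with segments of a shelf-mounted RGB LED strip. The class and method names (ShelfLightArray, set_location, LedSegment) are assumptions introduced for the example and are not taken from the disclosure.

```python
# Minimal sketch (illustrative assumption, not the disclosed implementation):
# map each sortation location to a segment of a shelf-mounted RGB LED strip.
from dataclasses import dataclass

RGB = tuple[int, int, int]

@dataclass
class LedSegment:
    start: int   # first LED index on the strip
    count: int   # number of LEDs adjacent to the sortation location

class ShelfLightArray:
    def __init__(self, segments: dict[str, LedSegment]):
        # Map each sortation-location ID (e.g., a bin label) to its LED segment.
        self.segments = segments
        self.pixels: dict[int, RGB] = {}

    def set_location(self, location_id: str, color: RGB) -> None:
        # Illuminate only the LEDs adjacent to the given sortation location.
        seg = self.segments[location_id]
        for i in range(seg.start, seg.start + seg.count):
            self.pixels[i] = color

# Example: two bins on a shelf, each backed by an 8-LED segment.
shelf = ShelfLightArray({"BIN-A": LedSegment(0, 8), "BIN-B": LedSegment(8, 8)})
shelf.set_location("BIN-A", (0, 0, 255))  # e.g., blue marks the planned place location
```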
In accordance with certain embodiments of the invention, therefore, the lighting system may convey the state of the robotic sortation task, as well as the state of the robot. For example, in an embodiment, after the system has selected a place location, a subset of the RGB LEDs 26 adjacent to the place location is illuminated in a pulsing color in order to demonstrate where the robot will place its next object. Once an object is placed, the same LEDs are illuminated in a different color in order to indicate a successful placement. Similarly, when performing tasks requiring caution or when an error has occurred, all of the lights may be set to a pulsing orange or red color, respectively. Conveying system state in this manner provides human workers with easily accessible and digestible information about the task at hand and allows for advanced collaborative interaction with automated systems.
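By way of illustration only, the sketch below shows one possible mapping from the sortation states described above to light colors and pulsing behavior. The state names, color values, and data structures are assumptions made for the example rather than part of the disclosed system.

```python
# Minimal sketch (hypothetical): map sortation/robot states to light patterns.
from dataclasses import dataclass
from enum import Enum, auto

RGB = tuple[int, int, int]

class SortationState(Enum):
    PLACE_SELECTED = auto()   # system has chosen where the next object will go
    PLACE_COMPLETE = auto()   # object successfully placed
    CAUTION = auto()          # a task requiring caution is underway
    ERROR = auto()            # an error has occurred

@dataclass
class LightPattern:
    color: RGB
    pulsing: bool

# Example palette: pulse at the selected location, show a solid color once the
# object is placed, and pulse orange or red (on all lights) for caution or error.
STATE_TO_PATTERN = {
    SortationState.PLACE_SELECTED: LightPattern((0, 0, 255), pulsing=True),
    SortationState.PLACE_COMPLETE: LightPattern((0, 255, 0), pulsing=False),
    SortationState.CAUTION: LightPattern((255, 120, 0), pulsing=True),
    SortationState.ERROR: LightPattern((255, 0, 0), pulsing=True),
}

def pattern_for(state: SortationState) -> LightPattern:
    return STATE_TO_PATTERN[state]
```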
In accordance with various embodiments, therefore, the plurality of lights may be multi-colored lights proximate to an end effector of the articulated arm. In further embodiments, the plurality of multi-colored lights may be indicative of an intended direction of movement of the end effector, or may be indicative of the end effector's grasp quality on an object. In further embodiments, the plurality of multi-colored lights may be provided on a wrist of the end effector, and may be indicative of the robotic system not having proper information regarding a required task for an object, or indicative of the robotic system not recognizing the object. In further embodiments, the plurality of multi-colored lights may be indicative of the robotic system not knowing where to put an object, or may be indicative of where the end effector is being directed. In certain embodiments, the plurality of multi-colored lights includes lights at potential target locations that are indicative of when a target location bin is full or otherwise completed.
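As a purely illustrative example of how a wrist-mounted multi-colored light might encode grasp quality, the sketch below blends from red (poor grasp) to green (secure grasp). The 0-to-1 quality scale and the function name are hypothetical and are not part of the disclosure.

```python
# Minimal sketch (assumption): encode a grasp-quality score on a wrist-mounted
# light by blending from red (poor grasp) to green (secure grasp).
RGB = tuple[int, int, int]

def grasp_quality_color(quality: float) -> RGB:
    """Map a grasp-quality score in [0, 1] to a red-to-green color."""
    q = max(0.0, min(1.0, quality))
    return (int(255 * (1.0 - q)), int(255 * q), 0)

# Example: a marginal grasp (0.3) yields a mostly red color, prompting a nearby
# worker or supervisor to take notice before the robot moves the object.
print(grasp_quality_color(0.3))  # -> (178, 76, 0)
```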
Those skilled in the art will appreciate that numerous modifications and variations may be made to the above disclosed embodiments without departing from the spirit and scope of the present invention.
The present application is a continuation of U.S. patent application Ser. No. 16/826,819, filed Mar. 23, 2020, now U.S. Pat. No. 11,117,271 issued Sep. 14, 2021, which is a continuation of U.S. patent application Ser. No. 16/243,753, filed Jan. 9, 2019, now U.S. Pat. No. 10,632,631, issued Apr. 28, 2020, which is a continuation of U.S. patent application Ser. No. 15/259,961, filed Sep. 8, 2016, now U.S. Pat. No. 10,265,872, issued Apr. 23, 2019, which claims priority to U.S. Provisional Patent Application Ser. No. 62/216,017 filed Sep. 9, 2015, the entire disclosures of which are hereby incorporated by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
4412775 | Molitor et al. | Nov 1983 | A |
4557659 | Scaglia | Dec 1985 | A |
4704694 | Czerniejewski | Nov 1987 | A |
4846335 | Hartlepp | Jul 1989 | A |
4896357 | Hatano et al. | Jan 1990 | A |
5794789 | Payson et al. | Aug 1998 | A |
5865487 | Gore et al. | Feb 1999 | A |
5881890 | Wiley | Mar 1999 | A |
6079570 | Oppliger et al. | Jun 2000 | A |
6865487 | Charron | Mar 2005 | B2 |
7313464 | Perreault et al. | Dec 2007 | B1 |
7474939 | Oda et al. | Jan 2009 | B2 |
7516848 | Shakes et al. | Apr 2009 | B1 |
7677622 | Dunkmann et al. | Mar 2010 | B2 |
7861844 | Hayduchok et al. | Jan 2011 | B2 |
8718814 | Clark et al. | May 2014 | B1 |
8874270 | Ando | Oct 2014 | B2 |
8952284 | Wong et al. | Feb 2015 | B1 |
9061868 | Paulsen et al. | Jun 2015 | B1 |
9120622 | Elazary et al. | Sep 2015 | B1 |
9227323 | Konolige et al. | Jan 2016 | B1 |
9259844 | Xu et al. | Feb 2016 | B2 |
9492923 | Wellman et al. | Nov 2016 | B2 |
9604363 | Ban | Mar 2017 | B2 |
20010045755 | Schick et al. | Nov 2001 | A1 |
20010056316 | Johnson et al. | Dec 2001 | A1 |
20020027652 | Paromtchik et al. | Mar 2002 | A1 |
20030135300 | Lewis | Jul 2003 | A1 |
20060177295 | Frueh et al. | Aug 2006 | A1 |
20060242785 | Cawley et al. | Nov 2006 | A1 |
20070005179 | Mccrackin et al. | Jan 2007 | A1 |
20080179224 | Van Bossuyt | Jul 2008 | A1 |
20080181485 | Beis et al. | Jul 2008 | A1 |
20090019818 | Gilmore et al. | Jan 2009 | A1 |
20100125361 | Mougin et al. | May 2010 | A1 |
20100180711 | Kilibarda et al. | Jul 2010 | A1 |
20100234857 | Itkowitz et al. | Sep 2010 | A1 |
20110054689 | Nielsen et al. | Mar 2011 | A1 |
20110144798 | Freudelsperger | Jun 2011 | A1 |
20110160910 | Preisinger et al. | Jun 2011 | A1 |
20110176148 | Briggs et al. | Jul 2011 | A1 |
20110184555 | Kosuge et al. | Jul 2011 | A1 |
20110320036 | Freudelsperger | Dec 2011 | A1 |
20120215346 | Gingher et al. | Aug 2012 | A1 |
20130232918 | Lomerson, Jr. | Sep 2013 | A1 |
20130245824 | Barajas et al. | Sep 2013 | A1 |
20130343640 | Buehler et al. | Dec 2013 | A1 |
20140005831 | Naderer et al. | Jan 2014 | A1 |
20140067121 | Brooks et al. | Mar 2014 | A1 |
20140067127 | Gotou | Mar 2014 | A1 |
20140195979 | Branton et al. | Jul 2014 | A1 |
20140291112 | Lyon et al. | Oct 2014 | A1 |
20140305847 | Kudrus | Oct 2014 | A1 |
20140316570 | Sun | Oct 2014 | A1 |
20150057793 | Kawano | Feb 2015 | A1 |
20150073589 | Khodl et al. | Mar 2015 | A1 |
20150081090 | Dong | Mar 2015 | A1 |
20150217937 | Marquez | Aug 2015 | A1 |
20150224650 | Xu et al. | Aug 2015 | A1 |
20150369618 | Barnard et al. | Dec 2015 | A1 |
20150375401 | Dunkmann et al. | Dec 2015 | A1 |
20160027093 | Crebier | Jan 2016 | A1 |
20160136816 | Pistorino | May 2016 | A1 |
20160199884 | Lykkegaard et al. | Jul 2016 | A1 |
20160243704 | Vakanski et al. | Aug 2016 | A1 |
20160244262 | O'Brien et al. | Aug 2016 | A1 |
20170043953 | Battles et al. | Feb 2017 | A1 |
20170050315 | Henry et al. | Feb 2017 | A1 |
20170066597 | Hiroi | Mar 2017 | A1 |
20170080566 | Stubbs et al. | Mar 2017 | A1 |
20170087731 | Wagner | Mar 2017 | A1 |
20170106532 | Wellman et al. | Apr 2017 | A1 |
20170276472 | Becker | Sep 2017 | A1 |
Number | Date | Country |
---|---|---|
701886 | Mar 2011 | CH |
104870147 | Aug 2015 | CN |
102007038834 | Feb 2009 | DE |
102013020137 | Jun 2015 | DE |
0613841 | Sep 1994 | EP |
1256421 | Jan 2008 | EP |
1995192 | Nov 2008 | EP |
2960024 | Dec 2015 | EP |
3006379 | Apr 2016 | EP |
H09244730 | Sep 1997 | JP |
2010034044 | Apr 2010 | WO |
2010099873 | Sep 2010 | WO |
2014111483 | Jul 2014 | WO |
2015162390 | Oct 2015 | WO |
Entry |
---|
Extended European Search Report issued by the European Patent Office in related European Patent Application No. 21214194.9 dated Mar. 4, 2022, 5 pages. |
International Preliminary Report on Patentability issued by the International Bureau of WIPO dated Mar. 13, 2018 for International Application No. PCT/US2016/050786, 7 pages. |
International Search Report and Written Opinion issued by the International Searching Authority dated Nov. 24, 2016 for International Application No. PCT/US2016/050786, 9 pages. |
Non-Final Office Action issued by the U.S. Patent and Trademark Office dated Jul. 10, 2017 in related U.S. Appl. No. 15/259,961, 31 pages. |
Final Office Action issued by the U.S. Patent and Trademark Office dated Feb. 13, 2018 in related U.S. Appl. No. 15/259,961, 33 pages. |
Non-Final Office Action issued by the U.S. Patent and Trademark Office dated Aug. 21, 2018 in related U.S. Appl. No. 15/259,961, 31 pages. |
Liu et al., “Hand Arm Coordination for a Tomato Harvesting Robot Based on Commercial Manipulator,” Proc. of the IEEE Int'l Conf. on Robotics and Biomimetics, Dec. 2013, pp. 2715-2720. |
Vitton et al., “A Flexible Robotic Gripper for Automation of Assembly Tasks,” Proc. of the ASME Dynamic Systems & Control Div., 2003, vol. 2, 7 pages. |
Herbert et al., “A Robotic Gripper System for Limp Material Manipulation: Hardware and Software Development and Integration,” Proc. of the 1997 IEEE Int'l Conf. on Robotics and Automation, Apr. 1997, pp. 15-21. |
Cipolla et al., “Visually Guided Grasping in Unstructured Environments,” J. of Robotics and Autonomous Sys., pp. 1-20. |
Klingbeil et al., “Grasping with Application to an Autonomous Checkout Robot,” Proc. of the IEEE Int'l Conf. on Robotics and Automation, Jun. 2011, 9 pages. |
Examiner's Report issued by the Canadian Intellectual Property Office dated May 16, 2019 in related Canadian Patent Application No. 2,998,403, 3 pages. |
Communication pursuant to Rules 161(1) and 162 EPC issued by the European Patent Office dated Apr. 17, 2018 in related European Patent Application No. 16767467.0, 3 pages. |
Non-Final Office Action issued by the U.S. Patent and Trademark Office dated Dec. 20, 2019 in related U.S. Appl. No. 16/243,753, 35 pages. |
First Office Action, and its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201680062748.1 dated Jul. 22, 2020, 20 pages. |
Second Office Action, and its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201680062748.1 dated Feb. 20, 2021, 18 pages. |
Non-Final Office Action issued by the U.S. Patent and Trademark Office in related U.S. Appl. No. 16/826,819 dated May 5, 2021, 16 pages. |
Notice on the Third Office Action and the Third Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 201680062748.1 dated Jun. 30, 2021, 23 pages. |
Notice on the First Office Action, along with its English translation, issued by the China National Intellectual Property Administration in related Chinese Patent Application No. 202210026563.2 dated Mar. 31, 2023, 25 pages. |
Number | Date | Country |
---|---|---|
20210362356 A1 | Nov 2021 | US |
Number | Date | Country |
---|---|---|
62216017 | Sep 2015 | US |
 | Number | Date | Country |
---|---|---|---|
Parent | 16826819 | Mar 2020 | US |
Child | 17395929 | | US |
Parent | 16243753 | Jan 2019 | US |
Child | 16826819 | | US |
Parent | 15259961 | Sep 2016 | US |
Child | 16243753 | | US |