Traditionally, the food industry employs human labor to manipulate ingredients, whether assembling a meal such as a salad or a bowl, packing a box of ingredients such as those used in grocery shopping, or preparing raw ingredients. Robots have not yet been able to assemble complete meals from prepared ingredients in a food-service setting such as a restaurant, largely because the ingredients are arranged unpredictably and change shape in difficult-to-predict ways, rendering traditional material-moving methods ineffective without extensive modifications to existing kitchens. Additionally, traditional material handling methods are ill-suited to moving cooked foods without altering their texture and taste profile. These difficulties arise because the friction, stiction, and viscosity of commonly consumed foods cause auger, conveyor, and suction mechanisms to become clogged and soiled, while these mechanisms simultaneously impart forces on the foodstuffs that alter their texture, consistency, and taste profile in unappetizing ways.
Widely available food serving tools (e.g., spoons, ladles, measuring cups, dishers (aka scoopers), tongs, etc.) are designed to be operated by human hands, not by robotic end effectors. There is a current need to develop custom robotic end effector solutions targeted at an increasingly automated food-service industry. Specifically, a need exists for an adaptor that allows existing serving tools to be attached to and controlled by a robot. The ability to convert existing utensils from human use to robot use is beneficial because existing utensils are already widely used and understood, and satisfy the required safety, sanitary, and regulatory standards, including NSF/ANSI 169. Providing an adaptor for existing tools and utensils, instead of designing and creating new robot-focused tools, allows for a simpler, cheaper, and safer transition to a robotic food service environment.
In an embodiment, an adaptor, and a corresponding method for its use, includes an element with a first interface component and a second interface component. The first interface component is configured to removably mate with a connector of a robot and the second interface component is configured to attach to a tool. The tool may be controllable by moving the connector when the first interface component is mated with the connector and the second interface component is attached to the tool. The tool may be a static tool such as a spatula, rake, peeler, whisk, strainer, knife, ladle, or spoon.
In an embodiment, the adaptor may further include an actuator coupled to the element such that the actuator can be controlled by the robot when the first interface component is mated with the connector. An actuatable tool is then controllable by moving the connector and the actuator when the first interface component is mated with the connector and the second interface component is attached to the actuatable tool. The actuatable tool may be a spice mill, egg beater, frother, crusher, tongs, disher, or ice cream scooper.
In such embodiments, the actuator may be coupled to the element at a port, the port configured to allow a connection between the connector and the actuator when the first interface component is mated with the connector. The actuator may be a linear actuator configured to move a joint of the attached actuatable tool. The actuator may be a rotary actuator configured to move a joint of the attached actuatable tool. The adaptor may include a spring configured to apply a tensile force opposing movement created by the actuator.
In an embodiment, when the actuator is a rotary actuator, it may include a drive shaft having a proximal end coupled to the element and a distal end having a pinion. A rack, coupled to the second interface component, is configured to mate with the pinion of the drive shaft, and is further configured to apply force to the joint of the actuatable tool through the second interface component when the drive shaft is rotated and the second interface component is attached to the actuatable tool.
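To make the rack-and-pinion relationship concrete, the rack's linear travel equals the pinion's pitch radius multiplied by the drive-shaft rotation angle. The following minimal sketch (the disclosure does not specify dimensions; the values below are hypothetical) computes the shaft rotation needed for a desired joint displacement:

```python
import math

def shaft_rotation_for_rack_travel(rack_travel_m: float,
                                   pinion_pitch_radius_m: float) -> float:
    """Return the drive-shaft rotation (radians) that moves the rack a
    given linear distance, using travel = radius * angle."""
    return rack_travel_m / pinion_pitch_radius_m

# Hypothetical example: a 10 mm pitch-radius pinion moving a tool joint 15 mm.
angle = shaft_rotation_for_rack_travel(0.015, 0.010)
print(f"rotate drive shaft {math.degrees(angle):.1f} degrees")  # ~85.9 degrees
```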
The adaptor may further comprise a visual marker, attached to the element, configured to provide, to an imaging system, at least one of the position and the orientation of the tool in space with respect to the connector.
The element of the adaptor may include at least one electrical feedthrough that enables power or data to be transmitted between the robot and the tool when the first interface component is mated with the connector and the second interface component is attached to the tool. The element of the adaptor may include a plurality of air feedthroughs which enable compressed air to pass between the robot and the tool when the first interface component is mated with the connector and the second interface component is attached to the tool. If the adaptor includes an actuator, the actuator may be driven by at least one of electricity, air pressure, water pressure, or magnetism.
In an embodiment, the adaptor can be formed of a single material by at least one of injection molding, 3D printing, or casting. The element, the second interface, and the tool may form a monolithic block.
In an embodiment, the first interface component may have an eccentric circular perimeter configured to constrain the location and position of the adaptor when the first interface component is mated with the connector. The first interface component may include a volume, enclosed by the element, configured to receive a protruding element of the connector when the first interface component is mated with the connector.
In an embodiment, a method for adapting a tool for robotic use includes providing an adaptor. The adaptor includes an element having a first interface component and a second interface component. The method further includes mating the first interface component to a connector of a robot. The method further includes attaching the second interface component to the tool.
The foregoing will be apparent from the following more particular description of example embodiments, as illustrated in the accompanying drawings in which like reference characters refer to the same parts throughout the different views. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating embodiments.
A description of example embodiments follows.
In an embodiment, the present disclosure provides an apparatus and corresponding method that solve problems relating to employing robotics in a food service environment. The apparatus enables safe manipulation of foodstuffs by connecting, to a robot, (1) a tool having a monolithic form factor or (2) an existing tool fitted with an adaptor allowing it to be controlled and used by the robot. The monolithic design enables ease of cleaning and can be designed to be dishwasher safe. The disclosure provides for a common or standardized geometry that allows multiple adaptors, for a range of tools, to attach to and detach from a common master robotic connector at a flange. In other words, the multiple adaptors, for the range of tools, are compatible with a common master robotic connector. The materials used are food-safe and waterproof, and can be compatible with an injection molding process, providing a route to cost-effective manufacturing. The present disclosure relates to an adaptor attachable to new and existing utensils so that they maintain their original function (e.g., manipulating, preparing, and serving food) while enabling them to be used by a robot.
Operating a robot in a food preparation environment, such as a quick service restaurant, can be challenging for several reasons. First, the end effectors (e.g., utensils) that the robot uses need to remain clean from contamination. Contamination can include allergens (e.g., peanuts), violations of dietary preferences (e.g., contamination from pork for a vegetarian or kosher customer), dirt/bacteria/viruses, or other non-ingestible materials (e.g., oil, plastic, or particles from the robot itself). Second, the robot should be operated within its design specifications, and not exposed to excessive temperatures or incompatible liquids, without sacrificing cleanliness. Third, the robot should be able to manipulate foodstuffs, which are often fracturable and deformable materials, and further the robot must be able to measure the amount of material controlled by its utensil in order to dispense specific portions. Fourth, the robot should be able to automatically and seamlessly switch utensils (e.g., switch between a ladle and salad tongs). Fifth, the utensils should be adapted to be left in an assigned food container and interchanged with the robot as needed, in situ. Sixth, the interchangeable parts (e.g., utensils) should be washable and dishwasher safe. Seventh, the robot should be able to autonomously generate a task plan and motion plan(s) to assemble all ingredients in a recipe, and execute that plan. Eighth, the robot should be able to modify or stop a motion plan based on detected interference or voice commands to stop or modify the robot's plan. Ninth, the robot should be able to minimize the applied torque based on safety requirements, the task context, or the task parameters (e.g., density and viscosity) of the material to be gathered. Tenth, the system should be able to receive an electronic order from a user, assemble the meal for the user, and place the meal in a designated area for pickup automatically, with minimal human involvement.
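As an illustration of the third requirement above (measuring the amount of material controlled by the utensil), one possible approach is to compare vertical wrist-force readings before and after gathering. The sketch below is a simplified, assumption-laden example; the sensor callable `read_force_z` and the stationary-arm assumption are hypothetical, not the disclosure's method:

```python
G = 9.81  # gravitational acceleration, m/s^2

def scooped_mass_kg(read_force_z, tare_force_n: float) -> float:
    """Estimate the mass held by the utensil from the change in vertical
    wrist force while the arm is held stationary (hypothetical interface)."""
    return (read_force_z() - tare_force_n) / G

def portion_within_tolerance(mass_kg: float, target_kg: float,
                             tol_kg: float = 0.010) -> bool:
    """Check a dispensed portion against its target (e.g., 150 g +/- 10 g)."""
    return abs(mass_kg - target_kg) <= tol_kg

# Hypothetical usage: the sensor reads 2.25 N empty and 3.72 N after scooping.
readings = iter([3.72])
mass = scooped_mass_kg(lambda: next(readings), tare_force_n=2.25)
print(mass, portion_within_tolerance(mass, target_kg=0.150))  # ~0.15 kg, True
```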
The food preparation area 102 includes a plurality of ingredient containers 106a-d, each holding a particular foodstuff (e.g., lettuce, chicken, cheese, tortilla chips, guacamole, beans, rice, various sauces or dressings, etc.). Each ingredient container 106a-d stores its corresponding ingredients in situ. Utensils 108a-e may be stored in situ in the ingredient containers or in a stand-alone tool rack 109. The utensils 108a-e can be spoons, ladles, tongs, dishers (scoopers), spatulas, or other utensils. Each utensil 108a-e is configured to mate with and disconnect from a tool changer interface 112 of a robot arm 110. While the term utensil is used throughout this application, a person having ordinary skill in the art can recognize that the principles described in relation to utensils can apply in general to end effectors in other contexts (e.g., end effectors for moving fracturable or deformable materials in construction with an excavator or backhoe, etc.); and a robot arm can be replaced with any computer-controlled actuatable system which can interact with its environment to manipulate a deformable material. The robot arm 110 includes sensor elements/modules such as stereo vision systems (SVS), 3D vision sensors (e.g., Microsoft Kinect™ or an Intel RealSense™), LIDAR sensors, audio sensors (e.g., microphones), and inertial sensors (e.g., inertial measurement unit (IMU), torque sensor, weight sensor, etc.) for sensing aspects of the environment, including the pose (i.e., X, Y, Z coordinates and roll, pitch, and yaw angles) of tools for the robot to mate with, the shape and volume of foodstuffs in the ingredient containers, the shape and volume of foodstuffs deposited into the food assembly container, moving or static obstacles in the environment, etc.
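The pose convention above (X, Y, Z coordinates plus roll, pitch, and yaw angles) is commonly packed into a 4x4 homogeneous transform for motion planning. A minimal sketch of that standard construction (a generic robotics convention, not specific to the disclosure) follows:

```python
import numpy as np

def pose_to_matrix(x, y, z, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from an XYZ position and
    ZYX (yaw-pitch-roll) Euler angles in radians."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # Rotation R = Rz(yaw) @ Ry(pitch) @ Rx(roll)
    R = np.array([
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ])
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = [x, y, z]
    return T
```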
To initiate an order, a patron in the patron area 120 enters an order 124 at an ordering station 122a-b, which is forwarded to a network 126. Alternatively, a patron on a mobile device 128 can, within or outside of the patron area 120, generate an order 132. Regardless of the source of the order, the network 126 forwards the order to a controller 114 of the robot arm 110. The controller 114 generates a task plan 130 for the robot arm 110 to execute.
The task plan 130 includes a list of motion plans 132a-d for the robot arm 110 to execute. Each motion plan 132a-d is a plan for the robot arm 110 to engage with a respective utensil 108a-e, gather ingredients from the respective ingredient container 106a-d, and empty the utensil 108a-e in an appropriate location of a food assembly container 104 for the patron, which can be a plate, bowl, or other container. The robot arm 110 then returns the utensil 108a-e to its respective ingredient container 106a-d, the tool rack 109, or another location as determined by the task plan 130 or motion plan 132a-d, and releases the utensil 108a-e. The robot arm executes each motion plan 132a-d in a specified order, causing the food to be assembled within the food assembly container 104 in a planned and aesthetic manner.
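A minimal sketch of the task-plan structure described above — an ordered list of motion plans, each of which engages a utensil, gathers an ingredient, deposits it, and returns the utensil. The class and method names (`engage_utensil`, `gather`, etc.) are illustrative assumptions, not the disclosure's API:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class MotionPlan:
    """One engage-gather-deposit cycle, mirroring motion plans 132a-d."""
    utensil_id: str
    container_id: str
    deposit_location: Tuple[float, float]  # (x, y) within assembly container 104

@dataclass
class TaskPlan:
    """Ordered motion plans generated from an order, mirroring task plan 130."""
    order_id: str
    motion_plans: List[MotionPlan]

def execute(task_plan: TaskPlan, robot) -> None:
    # `robot` is a hypothetical interface exposing the four steps in the text.
    for mp in task_plan.motion_plans:              # executed in the specified order
        robot.engage_utensil(mp.utensil_id)        # mate via tool changer interface 112
        robot.gather(mp.container_id)              # collect foodstuff from the bin
        robot.deposit(mp.deposit_location)         # empty into assembly container 104
        robot.return_utensil(mp.utensil_id)        # back to its bin or tool rack 109
```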
Within the above environment 100, various of the above-described problems can be solved.
The body 203 of adaptor 200 extends between a robot end 201 and a tool end.
A second interface (not shown) at the tool end is configured to attach to a tool.
The robot end 201 includes a first interface component 204 configured to removably mate with a robot connector. The robot connector may include a flange that is designed to couple with the first interface component 204. The first interface component 204 is sized and shaped to match the flange of the robot connector. A person having ordinary skill in the art can recognize that the first interface component 204 may have any geometry, size, shape, and location, and further can be composed of any material, depending on the needs of the food service environment. For adaptor 200, first interface component 204 is located on the upper base of circular truncated cone 205. The location, position, and orientation of the robot connector control the location, position, and orientation of the tool attached to adaptor 200 when the first interface component 204 mates with the flange of the robot connector.
In embodiments not illustrated here, the first interface component 204 may have other geometries and locations.
In some embodiments, adaptor 200 includes feedthroughs that enable power (e.g., electricity) or data to be transmitted between the robot and the tool when the first interface component is mated with the flange and the second interface component is attached to the utensil. The electrical feedthroughs may be located within the adaptor body 203, the first interface component 204, or the second interface. In some embodiments, adaptor 200 includes air feedthroughs that enable compressed air to pass between the robot and the utensil when the first interface component is mated with the flange and the second interface component is attached to the utensil. The air feedthroughs may be located within at least one of the adaptor body 203, the first interface component 204, or the second interface.
An adaptor 300 attaches a tool 108 to a robot connector 312 having a flange 314.
The adaptor 300 may also include a visual tag feature for easier recognition by the vision systems of the robot. A visual marker/tag 316 is attached to the top of body 303. The visual marker 316 provides the position, orientation, and/or rotation of the tool 108 and adaptor 300 in space with respect to the robot connector 312, allowing for easier mating of first interface component 304 with flange 314. In addition, the adaptor body 303 includes a bottom feature 317 that allows tool 108 and adaptor 300 to rest in a consistent location in a food container, even if the food container stores varying levels of food. The bottom feature 317 allows the robot to more easily find and attach to adaptor 300 because the adaptor rests in a standard position. In embodiments, the bottom feature 317 can include one or more members extending from the adaptor 300, tool 108, or tool handle configured to latch on to or rest against an edge of a material bin or other element in the environment.
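One common way to realize a visual marker such as marker 316 is a printed fiducial tag. The sketch below uses OpenCV's ArUco module to recover a tag's pose in camera coordinates; this is one possible implementation choice, not specified by the disclosure, and it assumes the opencv-contrib-python package (4.6-era API), pre-calibrated camera intrinsics, and a hypothetical tag size:

```python
import cv2
import numpy as np

# Assumptions: opencv-contrib-python <= 4.6 module-level ArUco API,
# placeholder intrinsics, and a 30 mm printed tag.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
camera_matrix = np.array([[900.0, 0.0, 640.0],
                          [0.0, 900.0, 360.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)
MARKER_SIDE_M = 0.03  # hypothetical printed tag side length

def tag_pose(frame):
    """Return (rvec, tvec) of the first detected tag in camera coordinates,
    or None if no tag is visible."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is None:
        return None
    rvecs, tvecs = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_SIDE_M, camera_matrix, dist_coeffs)[:2]
    return rvecs[0], tvecs[0]  # Rodrigues rotation and translation vectors
```

A real system would additionally transform the recovered camera-frame pose into the robot connector's frame using the camera's extrinsic calibration.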
The tool side 302 of the adaptor 300 attaches to the tool 108 through the second interface 315. In some embodiments, the tool 108 is removably attached to the second interface 315. Alternately, the tool 108 can be permanently attached to the second interface 315. In such embodiments, the tool 108 and the adaptor 300 may be composed of a single monolithic block. The tool 108 may be a previously existing utensil with known properties that meets established safety and cleanliness standards. The adaptor 300 allows the tool 108, previously designed for human use, to be used and controlled by a robot. Utilizing existing tools instead of designing robot-specific tools can provide a simpler, cheaper, and more intuitive way to transition a food service environment from human control to robotic control. Additionally, existing tools have undergone food safety testing and meet other requirements for safe and efficient use.
The tool 108 is a static tool without any joints that require manipulation, and it can be fully controlled without an actuator. Therefore, the adaptor 300, for use with tool 108, does not include an actuator. The tool 108, when the adaptor 300 is connected to the robot connector 312, can be controlled by changing the location, position, and orientation of the robot connector 312, and can be used by the robot to mimic any action a human could take with the tool 108. Specifically, the robot can be configured to use the tool 108 to deliver amounts of foodstuff in a food service environment such as the environment 100 described above.
Actuatable tools are tools that require internal movement to be effectively used, such as tongs or an ice cream scooper with a dispensing mechanism. To allow the robot to use actuatable tools, the adaptor includes an actuator employed to create movement of elements in the tool. Embodiments include specialized actuators designed to operate any manner of actuatable tools. The actuator may be a linear or rotary actuator configured to create either linear or rotational movement at a joint of an actuatable tool. The adaptor may also include a spring to apply a tensile force opposing movement created by the actuator. The tensile force can stabilize movement of the joint and/or ensure that the tool returns to a resting position after the desired movement induced by the actuator is completed. The actuator can be mechanical, pneumatic, hydraulic, or electrical, or can utilize any other known method of or apparatus for creating and controlling movement. When the adaptor includes an actuator, an attached tool is controlled by the actuator together with the position, location, and orientation of the robot connector when the first interface component is mated with the flange of the robot's connector and the second interface component is attached to the actuatable tool.
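The actuate-against-a-return-spring behavior described above suggests a simple two-state control interface. The sketch below is illustrative only; the class names and the `drive` motor interface are assumptions, not the disclosure's design:

```python
from abc import ABC, abstractmethod

class ToolActuator(ABC):
    """Moves one joint of an actuatable tool; an opposing spring restores
    the joint to its rest pose when the actuator releases."""

    @abstractmethod
    def engage(self) -> None:
        """Drive the joint against the opposing spring force."""

    @abstractmethod
    def release(self) -> None:
        """Remove the driving force; the spring returns the joint to rest."""

class RotaryToolActuator(ToolActuator):
    def __init__(self, drive):           # `drive` is a hypothetical motor interface
        self._drive = drive

    def engage(self) -> None:
        self._drive.rotate(direction=+1)  # e.g., close tongs or turn a disher rod

    def release(self) -> None:
        self._drive.stop()                # spring tension returns the joint
```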
In an embodiment, an adaptor 500 includes a pneumatic actuator 502 for operating a disher 108. Changing air pressure conditions cause the actuator to create a rotational force, inducing movement in a joint of disher 108 that moves the dispensing rod element. A second interface 515 attaches the disher 108 to the body 503 of the adaptor 500 and holds the rest of the disher 108 steady when the pneumatic actuator 502 creates movement. The robot controls the pneumatic actuator 502, when attached to the adaptor 500 at first interface 504, by utilizing air-pressure pass-throughs located in the ports 511. A valve located with the robot control equipment connects to pneumatic actuator 502 and passes air through the pass-throughs of ports 511 to actuate the pneumatic actuator 502. One position of the valve rotates the dispensing rod element of disher 108; the other position returns the dispensing rod element to its initial position. The adaptor 500 mates with the robot connector at the first interface 504 so that it is physically locked to the robot while also receiving air pressure pneumatically. The adaptor 500 can alternatively contain an electric actuator which utilizes electrical pass-throughs from the robot connector.
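A sketch of the two-position valve sequence described above follows; the digital-output setter and the dwell time are hypothetical, and a real controller would gate this on the tool changer being mated:

```python
import time

class TwoPositionValve:
    """Drives a pneumatic actuator such as actuator 502: one valve state
    rotates the disher's dispensing rod, the other vents and lets the
    rod return to its initial position."""

    def __init__(self, set_output):
        # set_output: callable taking True (pressurize) or False (vent);
        # hypothetical wiring to the robot's valve hardware.
        self._set_output = set_output

    def dispense(self, dwell_s: float = 0.5) -> None:
        self._set_output(True)    # pressurize: rotate the dispensing rod
        time.sleep(dwell_s)       # hold while the foodstuff is released
        self._set_output(False)   # vent: rod returns to its initial position

# Hypothetical usage with a stub digital output:
valve = TwoPositionValve(lambda on: print("pressurized" if on else "vented"))
valve.dispense()
```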
The adaptor 600 is attached to tool 108 through a second interface component 615. The adaptor 600 includes a first interface component 604 that can removably mate with a flange of robot connector 112. Adaptor 600 also includes an actuator 620 that can be attached to and manipulated by a robot through port 611. The tool 108 is a disher utensil used for serving food. Second interface component 615 can securely couple tool 108 to adaptor 600.
For tools that require actuation, such as disher 108, the first interface component 604 attachable to the robot master connector, the second interface component 615 attachable to the tool 108, and mounting features for the required actuator 620 may all be included in a monolithic design.
Adaptor 900, including first and second interfaces 904 and 915, can be monolithic. For tools that require actuation, such as tongs 108, the first interface component 904 attachable to the robot connector, the second interface component 915 attachable to the tool 108, and mounting features for the required actuator may all be included in a monolithic design.
Embodiments of the disclosure, including embodiments utilizing monolithic designs, can be manufactured using 3D printing technology. Complex geometries can be rapidly fabricated and combined into functional prototypes. Additional features (e.g., bin clocking and visual tag alignment) are cost effective and can be rapidly incorporated into designs. Embodiments of the disclosure can also use any other known printing machines and plastic materials. Further, food service applications are, generally speaking, low-load. This enables the use of 3D-printed plastic materials without the need for large supporting structures; many of the available tool changers are over-designed to work in higher-load applications. Compatible designs can also be fabricated with injection molding processes, and other food-safe and/or waterproof plastics, such as high-density polyethylene (HDPE), polyethylene terephthalate (PET), or polycarbonate, can be used for elements of, or the entirety of, the adaptor. For designs that do not require actuation, the tool and adaptor can be completely monolithic.
In one embodiment of the disclosure, a plastic, food-safe, and waterproof adaptor can be attached to an existing metal (or other material) tool. Alternatively, both the adaptor and the tool can be composed of a plastic, food-safe, and waterproof material, either as separate pieces or as a monolithic design incorporating both the adaptor and tool. For utensils requiring actuation, actuators can be moved from the tool and/or adaptor to a robot master connector that has these additional actuation capabilities. Removing the actuator from the tool/adaptor would enable actuatable tool designs such as the scooper and tongs to be truly monolithic. Finally, the adaptor and tool, in their entirety or in their individual elements, can be composed of at least one of nylon, stainless steel, aluminum, and/or silicone.
Currently, prior art adaptors such as the ATI QC-11 can be mated to spoons and ladles. However, unlike the adaptor of the disclosure, the prior art adaptors require significant reworking of the tools to be used. Other prior art solutions use new hand-like tools (e.g., the Shadow Hand from the Shadow Robot Company); however, these are currently not at a state of reliability to be used in the near future, especially in a food environment, and they cannot take advantage of the known qualities of existing tools and utensils.
Flippy, the burger-flipping robot by Miso Robotics, employs reworked spatulas and grill scrapers mounted to metallic tool changers. However, these reworked spatulas are not a cost-effective, long-term solution. Further, prior art adaptors that require multiple connectors to fasten multiple pieces together can be a cause of later mechanical failure, and in addition can create hard-to-clean areas that allow microbes to grow. Using many components, rather than a more monolithic, sealed design, makes the assembly more difficult to clean and exposes crevices for microbes to develop. The present disclosure's designs reduce the number of components required for functionality. This reduction in individual components improves assembly time, cost effectiveness, and cleanability (e.g., fewer crevices for microbes to develop). Changes to the tool geometry can be rapidly implemented and produced into a functional article if 3D printing fabrication methods are used. Designing an adaptor as a monolithic component further provides flexibility advantages; for example, the air pass-throughs can be used, blocked, or left exposed entirely.
The teachings of all patents, published applications, and references cited herein are incorporated by reference in their entirety.
While example embodiments have been particularly shown and described, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the scope of the embodiments encompassed by the appended claims.
This application claims the benefit of U.S. Provisional Application No. 62/730,947, filed on Sep. 13, 2018, U.S. Provisional Application No. 62/730,703, filed on Sep. 13, 2018, U.S. Provisional Application No. 62/730,933, filed on Sep. 13, 2018, U.S. Provisional Application No. 62/730,918, filed on Sep. 13, 2018, U.S. Provisional Application No. 62/730,934, filed on Sep. 13, 2018 and U.S. Provisional Application No. 62/731,398, filed on Sep. 14, 2018. The entire teachings of the above applications are incorporated herein by reference. This application is related to U.S. patent application Ser. No. 16/570,100, U.S. patent application Ser. No. 16/570,855, U.S. patent application Ser. No. 16/570,955, U.S. patent application Ser. No. 16/570,915, U.S. patent application Ser. No. 16/570,976, U.S. patent application Ser. No. 16/570,736, U.S. patent application Ser. No. 16/571,025, U.S. patent application Ser. No. 16/570,606, U.S. patent application Ser. No. 16/571,040, and U.S. patent application Ser. No. 16/571,041, all filed on the same day, Sep. 13, 2019. The entire teachings of the above applications are incorporated herein by reference.