The present technology is generally related to robotic systems with gripping mechanisms, and more specifically robotic systems with features for planning a packing operation and adjusting a gripper mechanism based on the packing operation.
With their ever-increasing performance and decreasing cost, many robots (e.g., machines configured to automatically/autonomously execute physical actions) are now extensively used in many fields. Robots, for example, can be used to execute various tasks (e.g., manipulate or transfer an object through space) in manufacturing and/or assembly, packing and/or packaging, transport and/or shipping, etc. In executing the tasks, the robots can replicate human actions, thereby replacing or reducing the human involvement that is otherwise required to perform dangerous or repetitive tasks.
However, despite the technological advancements, robots often lack the sophistication necessary to duplicate human interactions required for executing larger and/or more complex tasks. Accordingly, there remains a need for improved techniques and systems for managing operations of and/or interactions between robots.
The drawings have not necessarily been drawn to scale. Similarly, some components and/or operations can be separated into different blocks or combined into a single block for the purpose of discussion of some of the implementations of the present technology. Moreover, while the technology is amenable to various modifications and alternative forms, specific implementations have been shown by way of example in the drawings and are described in detail below. The intention, however, is not to limit the technology to the particular implementations described.
For ease of reference, the end effector and the components thereof are sometimes described herein with reference to top and bottom, upper and lower, upwards and downwards, a longitudinal plane, a horizontal plane, an x-y plane, a vertical plane, and/or a z-plane relative to the spatial orientation of the embodiments shown in the figures. It is to be understood, however, that the end effector and the components thereof can be moved to, and used in, different spatial orientations without changing the structure and/or function of the disclosed embodiments of the present technology.
Robotic systems with hybrid gripping mechanisms and related systems and methods are disclosed herein. In some embodiments, the robotic system includes a robotic arm and an end-of-arm tool coupled to the robotic arm. The end-of-arm tool can include a frame that has a first axis (e.g., a longitudinal axis) and a second axis (e.g., a transverse axis) at least partially orthogonal to the first axis. The end-of-arm tool can also include an actuator system and first, second, and third clamping components each coupled to the frame.
The first clamping component (sometimes also referred to herein as an external clamping component) extends along the first axis and along an outer edge of the frame. As a result, the first clamping component can define an outer wall of the end-of-arm tool. The second clamping component (sometimes also referred to herein as an internal clamping component) extends along the first axis and is operably coupled to the actuator system to move along the second axis toward and away from the first clamping component. During operation, the first and second clamping components can engage opposing side surfaces of one or more target objects to clamp the one or more target objects therebetween.
The third clamping component (sometimes also referred to herein as a support clamping component) extends along the first axis peripheral to the second clamping component with respect to the first clamping component (e.g., is positioned further from the first clamping component than the second clamping component) and includes one or more extension portions extending toward the first clamping component. The one or more extension portions are at an elevation below a lower edge of the second clamping component. The third clamping component is also operably coupled to the actuator system to move along the second axis toward and away from the first clamping component. During operation, the third clamping component can move toward the first clamping component, after the first and second clamping components have engaged the side surfaces of the one or more target objects, to position the one or more extension portions beneath a lower surface of the one or more target objects. As a result, the one or more extension portions can help support the one or more target objects while the robotic system transports the one or more target objects. Additionally, or alternatively, the third clamping component can help stabilize the end-of-arm tool after the first and second clamping components have engaged the side surfaces of the one or more target objects (e.g., to add rigidity, reinforce the engagement, and the like).
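For illustration only, the two-stage sequence described above (engage the sides first, then advance the support component beneath the object) can be modeled as a minimal Python sketch; the class, attribute, and event names are hypothetical and the positions are in arbitrary units, not part of the disclosed system:

```python
from dataclasses import dataclass, field

@dataclass
class EndOfArmTool:
    """Toy model of the two-stage clamp: sides first, then underside support."""
    second_position: float = 1.0   # internal clamping component (arbitrary units)
    third_position: float = 1.2    # support component starts peripheral to it
    events: list = field(default_factory=list)

    def clamp(self, object_width: float) -> None:
        # Step 1: the second component moves toward the first (fixed) component
        # until the object is clamped between their opposing surfaces.
        self.second_position = object_width
        self.events.append("sides engaged")
        # Step 2: only after the sides are engaged does the third component
        # advance so its extension portions slide beneath the object.
        self.third_position = object_width + 0.05
        self.events.append("underside supported")

tool = EndOfArmTool()
tool.clamp(object_width=0.4)
```

The ordering matters: the support component stays peripheral until the side grip is established, then closes the remaining distance.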
In some embodiments, the actuator system includes a track carried by the frame and extending along the second axis, a first carriage operably coupled between the track and the second clamping component, a second carriage operably coupled between the track and the third clamping component, and a driver operably coupled to the first and second carriages. The movement of the first carriage can control the movement of the second clamping component along the second axis. Similarly, the movement of the second carriage can control the movement of the third clamping component along the second axis.
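A minimal sketch of the track-and-carriage arrangement, assuming a single shared track with independently driven carriages; the class and carriage names are illustrative, not an actual control interface:

```python
class Track:
    """One track along the second axis with two independently driven carriages."""

    def __init__(self, length: float):
        self.length = length
        # One carriage for the second clamping component, one for the third.
        self.carriages = {"second": 0.0, "third": 0.0}

    def drive(self, carriage: str, target: float) -> float:
        # The driver moves the named carriage, limited to the track's extent.
        position = min(max(target, 0.0), self.length)
        self.carriages[carriage] = position
        return position

track = Track(length=0.5)
track.drive("second", 0.3)  # second clamping component advances 0.3 units
track.drive("third", 0.9)   # request beyond the track is limited to 0.5
```

Because each carriage is commanded separately, the two clamping components can move independently along the same axis, as the surrounding description requires.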
In some embodiments, the second clamping component and/or the third clamping component include a bracing bracket facilitating the coupling to the actuator system. The bracing bracket can extend along a third axis at least partially orthogonal to the first and second axes and help transmit motion from the actuator system throughout the second clamping component and/or the third clamping component. For example, the bracing bracket can help ensure that motion from the actuator system is not applied only to an upper edge of the second clamping component and/or the third clamping component (which would, for example, result in a pivoting force when the second clamping component engages the side surface of the one or more target objects). When the second clamping component includes a bracing bracket, the third clamping component can include an opening aligned with the bracing bracket along the second axis. The opening can be configured to nest with the bracing bracket when the third clamping component is adjacent to the second clamping component, thereby allowing the third clamping component to move closer to (and/or abut) the second clamping component.
In some embodiments, the second clamping component includes a gripping substrate disposed on the surface of the second clamping component facing the first clamping component. The gripping substrate can help reduce the amount that the one or more target objects can slip once engaged by the first and second clamping components.
In some embodiments, the first clamping component has a thickness of less than 3 centimeters. As discussed in more detail below, the relatively thin profile of the first clamping component can allow the end-of-arm tool to place the one or more target objects in close proximity to other objects (e.g., previously moved objects) and/or barriers (e.g., a wall of a shipping container). For example, during operation, the end-of-arm tool can be oriented such that the first clamping component is between the one or more target objects and previously placed objects at a destination. In this example, the thickness of the first clamping component is the limiting factor on how close the one or more target objects can be placed. Accordingly, when the first clamping component has a relatively thin profile, the one or more target objects can be placed in close proximity to the other objects (e.g., tightly packed at the destination).
In some embodiments, the end-of-arm tool includes a sensor system coupled to the frame and positioned to measure an environment around the end-of-arm tool. For example, the sensors can include an imaging system positioned to measure one or more parameters of the one or more target objects, objects surrounding the one or more target objects, objects at a destination for the target objects, an environment around the end-of-arm tool while transporting the one or more target objects, and the like.
In some embodiments, the end-of-arm tool includes one or more retractable arms carried by the frame. Each of the one or more retractable arms can include a suction gripping component and can be movable between a first position and a second position. In the first position, the suction gripping component can be at a first elevation above the lower edge of the second clamping component (e.g., out of the way of the first and second clamping components as they engage the one or more target objects). In the second position, the suction gripping component can be at a second elevation below the lower edge of the second clamping component (e.g., able to grab an object, such as a slip sheet or other divider).
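The two arm positions can be sketched as a simple state model; the elevation offsets (0.05 above the lower edge when retracted, 0.10 below it when extended) are hypothetical values chosen only to illustrate the relationship to the lower edge of the second clamping component:

```python
class RetractableArm:
    """Two-position arm carrying a suction gripping component."""

    def __init__(self, clamp_lower_edge: float):
        self.clamp_lower_edge = clamp_lower_edge
        self.retracted = True  # first position: clear of the clamping components

    @property
    def suction_elevation(self) -> float:
        # Hypothetical offsets: 0.05 above the clamp's lower edge when
        # retracted, 0.10 below it when extended (e.g., to grab a slip sheet).
        return self.clamp_lower_edge + (0.05 if self.retracted else -0.10)

    def extend(self) -> None:
        self.retracted = False  # second position

arm = RetractableArm(clamp_lower_edge=0.0)
first = arm.suction_elevation   # above the lower edge, out of the way
arm.extend()
second = arm.suction_elevation  # below the lower edge, able to reach a divider
```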
Several details describing structures or processes that are well-known and often associated with robotic systems and subsystems, but that can unnecessarily obscure some significant aspects of the disclosed techniques, are not set forth in the following description for purposes of clarity. Moreover, although the following disclosure sets forth several embodiments of different aspects of the present technology, several other embodiments can have different configurations or different components than those described in this section. Accordingly, the disclosed techniques can have other embodiments with additional elements or without several of the elements described below.
Many embodiments or aspects of the present disclosure described below can take the form of computer-executable or controller-executable instructions, including routines executed by a programmable computer or controller. Those skilled in the relevant art will appreciate that the disclosed techniques can be practiced on computer or controller systems other than those shown and described below. The techniques described herein can be embodied in a special-purpose computer or data processor that is specifically programmed, configured, or constructed to execute one or more of the computer-executable instructions described below. Accordingly, the terms “computer” and “controller” as generally used herein refer to any data processor and can include Internet appliances and handheld devices, including palm-top computers, wearable computers, cellular or mobile phones, multi-processor systems, processor-based or programmable consumer electronics, network computers, mini computers, and/or the like. Information handled by these computers and controllers can be presented at any suitable display medium, including a liquid crystal display (LCD). Instructions for executing computer- or controller-executable tasks can be stored in or on any suitable computer-readable medium, including hardware, firmware, or a combination of hardware and firmware. Instructions can be contained in any suitable memory device, including, for example, a flash drive, USB device, and/or other suitable medium.
The terms “coupled” and “connected,” along with their derivatives, can be used herein to describe structural relationships between components. It should be understood that these terms are not intended as synonyms for each other. Rather, in particular embodiments, “connected” can be used to indicate that two or more elements are in direct contact with each other. Unless otherwise made apparent in the context, the term “coupled” can be used to indicate that two or more elements are in either direct or indirect (with other intervening elements between them) contact with each other, or that the two or more elements co-operate or interact with each other (e.g., as in a cause-and-effect relationship, such as for signal transmission/reception or for function calls), or both.
In the example illustrated in
In some embodiments, the task can include interaction with a target object 112, such as manipulation, moving, reorienting, or a combination thereof, of the object. The target object 112 is the object that will be handled by the robotic system 100. More specifically, the target object 112 can be the specific object among many objects that is the target of an operation or task by the robotic system 100. For example, the target object 112 can be the object that the robotic system 100 has selected for handling or that is currently being handled, manipulated, moved, reoriented, or a combination thereof. The target object 112, as examples, can include boxes, cases, tubes, packages, bundles, an assortment of individual items, or any other object that can be handled by the robotic system 100.
As an example, the task can include transferring the target object 112 from an object source 114 to a task location 116. The object source 114 (e.g., a starting location) can be a receptacle for storage of objects. The object source 114 can include numerous configurations and forms. For example, the object source 114 can be a platform, with or without walls, on which objects can be placed or stacked, such as a pallet, a shelf, or a conveyor belt. As another example, the object source 114 can be a partially or fully enclosed receptacle with walls or a lid in which objects can be placed, such as a bin, cage, or basket. In some embodiments, the walls of the partially or fully enclosed object source 114 can be transparent or can include openings or gaps of various sizes such that portions of the objects contained therein can be visible or partially visible through the walls. In yet another example, the object source 114 can be a conveyor belt and/or any other suitable assembly line location.
For illustrative purposes, the robotic system 100 is described in the context of a shipping center; however, it is understood that the robotic system 100 can be configured to execute tasks in other environments or for other purposes, such as for manufacturing, assembly, packaging, healthcare, or other types of automation. It is also understood that the robotic system 100 can include other units, such as manipulators, service robots, modular robots, that are not shown in
The robotic system 100 can include a controller 109 configured to interface with and/or control one or more of the robotic units. For example, the controller 109 can include circuits (e.g., one or more processors, memory, etc.) configured to derive motion plans and/or corresponding commands, settings, and/or the like used to operate the corresponding robotic unit. The controller 109 can communicate the motion plans, the commands, settings, etc. to the robotic unit, and the robotic unit can execute the communicated plan to accomplish a corresponding task, such as to transfer the target object 112 from the object source 114 to the task location 116.
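As a rough sketch of the controller's role described above, the following toy planner linearly interpolates waypoints between a source and a task location and then "communicates" each one as a command; real motion planning is far more involved, and the function name, tuple format, and command shape are assumptions for illustration:

```python
def derive_motion_plan(source, destination, steps=4):
    """Toy planner: linearly interpolated 3-D waypoints from source to destination."""
    return [
        tuple(s + (d - s) * i / steps for s, d in zip(source, destination))
        for i in range(steps + 1)
    ]

# Derive a plan from a pick location to a place location, then communicate
# each waypoint to the robotic unit (stubbed here as a list of sent commands).
plan = derive_motion_plan((0.0, 0.0, 0.0), (1.0, 2.0, 0.5))
sent_commands = [("move_to", waypoint) for waypoint in plan]
```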
The control unit 202 can be implemented in a number of different ways. For example, the control unit 202 can be a processor, an application specific integrated circuit (ASIC), an embedded processor, a microprocessor, a hardware control logic, a hardware finite state machine (FSM), a digital signal processor (DSP), or a combination thereof. The control unit 202 can execute software and/or instructions to provide the intelligence of the robotic system 100.
The control unit 202 can be operably coupled to the user interface 210 to provide a user with control over the control unit 202. The user interface 210 can be used for communication between the control unit 202 and other functional units in the robotic system 100. The user interface 210 can also be used for communication that is external to the robotic system 100. The user interface 210 can receive information from the other functional units or from external sources, or can transmit information to the other functional units or to external destinations. The external sources and the external destinations refer to sources and destinations external to the robotic system 100.
The user interface 210 can be implemented in different ways and can include different implementations depending on which functional units or external units are being interfaced with the user interface 210. For example, the user interface 210 can be implemented with a pressure sensor, an inertial sensor, a microelectromechanical system (MEMS), optical circuitry, waveguides, wireless circuitry, wireline circuitry, application programming interface, or a combination thereof.
The storage unit 204 can store the software instructions, master data, tracking data, or a combination thereof. For illustrative purposes, the storage unit 204 is shown as a single element, although it is understood that the storage unit 204 can be a distribution of storage elements. Also for illustrative purposes, the robotic system 100 is shown with the storage unit 204 as a single hierarchy storage system, although it is understood that the robotic system 100 can have the storage unit 204 in a different configuration. For example, the storage unit 204 can be formed with different storage technologies forming a hierarchical memory system including different levels of caching, main memory, rotating media, or off-line storage.
The storage unit 204 can be a volatile memory, a nonvolatile memory, an internal memory, an external memory, or a combination thereof. For example, the storage unit 204 can be a nonvolatile storage such as non-volatile random access memory (NVRAM), Flash memory, disk storage, or a volatile storage such as static random access memory (SRAM). As a further example, the storage unit 204 can be a non-transitory computer-readable medium including the non-volatile memory, such as a hard disk drive, NVRAM, solid-state storage device (SSD), compact disk (CD), digital video disk (DVD), or universal serial bus (USB) flash memory devices. The software can be stored on the non-transitory computer-readable medium to be executed by the control unit 202.
The storage unit 204 can be operably coupled to the user interface 210. The user interface 210 can be used for communication between the storage unit 204 and other functional units in the robotic system 100. The user interface 210 can also be used for communication that is external to the robotic system 100. The user interface 210 can receive information from the other functional units or from external sources, or can transmit information to the other functional units or to external destinations. The external sources and the external destinations refer to sources and destinations external to the robotic system 100.
Similar to the discussion above, the user interface 210 can include different implementations depending on which functional units or external units are being interfaced with the storage unit 204, and can be implemented with technologies and techniques similar to those discussed above.
In some embodiments, the storage unit 204 is used to further store and provide access to processing results, predetermined data, thresholds, or a combination thereof. For example, the storage unit 204 can store the master data that includes descriptions of the one or more target objects 112 (e.g., boxes, box types, cases, case types, products, and/or a combination thereof). In one embodiment, the master data includes dimensions, predetermined shapes, templates for potential poses and/or computer-generated models for recognizing different poses, a color scheme, an image, identification information (e.g., bar codes, quick response (QR) codes, logos, and the like), expected locations, an expected weight, and/or a combination thereof, for the one or more target objects 112 expected to be manipulated by the robotic system 100.
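The master data described above can be pictured as keyed records; the object types, field names, and values below are hypothetical examples for illustration, not actual master data:

```python
# Hypothetical master-data records, keyed by object type.
MASTER_DATA = {
    "case-A": {
        "dimensions_m": (0.40, 0.30, 0.25),   # length, width, height
        "expected_weight_kg": 4.5,
        "identification": {"barcode": "0001112223334"},
    },
    "case-B": {
        "dimensions_m": (0.20, 0.20, 0.10),
        "expected_weight_kg": 0.8,
        "identification": {"qr": "QR-CASE-B"},
    },
}

def expected_weight(object_type: str) -> float:
    """Look up the stored expected weight for a recognized object type."""
    return MASTER_DATA[object_type]["expected_weight_kg"]
```

Storing expected dimensions and weights per object type lets the system compare live sensor measurements against the stored values during handling.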
In some embodiments, the master data includes manipulation-related information regarding the one or more objects that can be encountered or handled by the robotic system 100. For example, the manipulation-related information for the objects can include a center-of-mass location on each of the objects and expected sensor measurements (e.g., force, torque, pressure, and/or contact measurements) corresponding to one or more actions, maneuvers, or a combination thereof.
The communication unit 206 can enable external communication to and from the robotic system 100. For example, the communication unit 206 can enable the robotic system 100 to communicate with other robotic systems or units, external devices, such as an external computer, an external database, an external machine, an external peripheral device, or a combination thereof, through a communication path 218, such as a wired or wireless network.
The communication path 218 can span and represent a variety of networks and network topologies. For example, the communication path 218 can include wireless communication, wired communication, optical communication, ultrasonic communication, or a combination thereof. For example, satellite communication, cellular communication, Bluetooth, Infrared Data Association standard (IrDA), wireless fidelity (WiFi), and worldwide interoperability for microwave access (WiMAX) are examples of wireless communication that can be included in the communication path 218. Cable, Ethernet, digital subscriber line (DSL), fiber optic lines, fiber to the home (FTTH), and plain old telephone service (POTS) are examples of wired communication that can be included in the communication path 218. Further, the communication path 218 can traverse a number of network topologies and distances. For example, the communication path 218 can include direct connection, personal area network (PAN), local area network (LAN), metropolitan area network (MAN), wide area network (WAN), or a combination thereof. The robotic system 100 can transmit information between the various units through the communication path 218. For example, the information can be transmitted between the control unit 202, the storage unit 204, the communication unit 206, the I/O device 208, the actuation devices 212, the transport motors 214, the sensor units 216, or a combination thereof.
The communication unit 206 can also function as a communication hub allowing the robotic system 100 to function as part of the communication path 218, rather than being limited to an end point or terminal unit of the communication path 218. The communication unit 206 can include active and passive components, such as microelectronics or an antenna, for interaction with the communication path 218.
The communication unit 206 can include a communication interface. The communication interface can be used for communication between the communication unit 206 and other functional units in the robotic system 100. The communication interface can receive information from the other functional units or from external sources, or can transmit information to the other functional units or to external destinations. The external sources and the external destinations refer to sources and destinations external to the robotic system 100.
The communication interface can include different implementations depending on which functional units are being interfaced with the communication unit 206. The communication interface can be implemented with technologies and techniques similar to the implementation of the control interface.
The I/O device 208 can include one or more input sub-devices and/or one or more output sub-devices. Examples of the input sub-devices of the I/O device 208 can include a keypad, a touchpad, soft-keys, a keyboard, a microphone, sensors for receiving remote signals, a camera for receiving motion commands, or any combination thereof to provide data and communication inputs. Examples of the output sub-devices can include a display interface. The display interface can be any graphical user interface such as a display, a projector, a video screen, and/or any combination thereof.
The control unit 202 can operate the I/O device 208 to present or receive information generated by the robotic system 100. The control unit 202 can also execute the software and/or instructions for the other functions of the robotic system 100. The control unit 202 can further execute the software and/or instructions for interaction with the communication path 218 via the communication unit 206.
The robotic system 100 can include physical or structural members, such as robotic manipulator arms, that are connected at joints for motion, such as rotational displacement, translational displacement, or a combination thereof. The structural members and the joints can form a kinematic chain configured to manipulate an end-effector, such as a gripping element, to execute one or more tasks, such as gripping, spinning, or welding, depending on the use or operation of the robotic system 100. The robotic system 100 can include the actuation devices 212, such as motors, actuators, wires, artificial muscles, electroactive polymers, or a combination thereof, configured to drive, manipulate, displace, reorient, or a combination thereof, the structural members about or at a corresponding joint. In some embodiments, the robotic system 100 can include the transport motors 214 configured to transport the corresponding units from place to place.
The robotic system 100 can include the sensor units 216 configured to obtain information used to execute tasks and operations, such as for manipulating the structural members or for transporting the robotic units. The sensor units 216 can include devices configured to detect or measure one or more physical properties of the robotic system 100, such as a state, a condition, a location of one or more structural members or joints, information about objects or surrounding environment, or a combination thereof. As an example, the sensor units 216 can include imaging devices, system sensors, contact sensors, and/or any combination thereof.
In some embodiments, the sensor units 216 include one or more imaging devices 222. The imaging devices 222 are devices configured to detect and image the surrounding environment. For example, the imaging devices 222 can include 2-dimensional cameras, 3-dimensional cameras, both of which can include a combination of visual and infrared capabilities, lidars, radars, other distance-measuring devices, and other imaging devices. The imaging devices 222 can generate a representation of the detected environment, such as a digital image or a point cloud, used for implementing machine/computer vision for automatic inspection, robot guidance, or other robotic applications. As described in further detail below, the robotic system 100 can process the digital image, the point cloud, or a combination thereof via the control unit 202 to identify the target object 112 of
In some embodiments, the sensor units 216 can include system sensors 224. The system sensors 224 can monitor the robotic units within the robotic system 100. For example, the system sensors 224 can include units or devices to detect and monitor positions of structural members, such as the robotic arms and the end-effectors, corresponding joints of robotic units or a combination thereof. As a further example, the robotic system 100 can use the system sensors 224 to track locations, orientations, or a combination thereof of the structural members and the joints during execution of the task. Examples of the system sensors 224 can include accelerometers, gyroscopes, or position encoders.
In some embodiments, the sensor units 216 can include the contact sensors 226, such as pressure sensors, force sensors, strain gauges, piezoresistive/piezoelectric sensors, capacitive sensors, elastoresistive sensors, torque sensors, linear force sensors, other tactile sensors, and/or any other suitable sensors configured to measure a characteristic associated with a direct contact between multiple physical structures or surfaces. For example, the contact sensors 226 can measure the characteristic that corresponds to a grip of the end-effector on the target object 112 or measure the weight of the target object 112. Accordingly, the contact sensors 226 can output a contact measure that represents a quantified measure, such as a measured force or torque, corresponding to a degree of contact or attachment between the gripping element and the target object 112. For example, the contact measure can include one or more force or torque readings associated with forces applied to the target object 112 by the end-effector.
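A contact measure like the one described above can be reduced to a simple threshold check, sketched below; the function name and the threshold value are illustrative, and a real system would calibrate the threshold against the master data for the object being handled:

```python
def grip_is_secure(force_readings_n, min_force_n=5.0):
    """Judge grip adequacy from per-sensor contact force readings (newtons).

    Every reading must meet the minimum; the 5.0 N default is an
    illustrative placeholder, not a real calibration value.
    """
    return all(f >= min_force_n for f in force_readings_n)
```

For example, a reading well below the threshold on any one sensor would indicate an insecure grip and could trigger a re-grip before transport.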
The robotic unit 300 can be configured to clamp, pick up, grip, transport, release, load, and/or unload various types or categories of objects. For example, in the illustrated embodiment, the one or more joints 314 allow the robotic arm 310 to controllably position the end-of-arm tool 320 over and/or adjacent to one or more target objects (e.g., the target object 112 discussed above with respect to
In various embodiments, the actuator system 410 can be coupled to the first clamping component 420, the second clamping component 430, and/or the third clamping component 440 to move the first, second, and/or third clamping components 420, 430, 440 independently along a transverse axis (e.g., the x-axis and/or at least partially orthogonal (or orthogonal) to the longitudinal axis) of the frame 402 to grip and/or release one or more target objects. For example, in the illustrated embodiment, the actuator system 410 includes a track 412; two or more carriages 414 (one labeled) coupled between the second clamping component 430 and the track 412, as well as between the third clamping component 440 and the track 412; and one or more drivers 416 operably coupled to the carriages 414 to drive their motion along the track 412. As a result, the actuator system 410 can drive motion of the second and third clamping components 430, 440 along the transverse axis while the first clamping component 420 remains in place. The fixed coupling between the first clamping component 420 and the frame 402 can simplify the operation of the end-of-arm tool 400. However, it will be understood that, in some embodiments, the actuator system 410 is operably coupled to the first clamping component 420 in addition to (or instead of) the second and third clamping components 430, 440.
In various other embodiments, the actuator system 410 can include additional, or alternative, components to move the first, second, and/or third clamping components 420, 430, 440. For example, the actuator system 410 can include an extendable component (e.g., a telescoping component, piston, a linear actuator, and/or the like) in place of, or in addition to, the track 412 and carriages 414 described above. Additionally, or alternatively, the actuator system 410 can include electric track actuators, coiled actuators, fixed belt actuators, lead screw actuators, and/or any other suitable mechanism to move the first, second, and/or third clamping components 420, 430, 440 in accordance with the systems and methods described herein.
In the illustrated embodiment, the first clamping component 420 is a rigid, thin material coupled to a peripheral edge of the frame 402. As a result, the first clamping component 420 (sometimes also referred to herein as an “external clamping component,” a “clamping plate,” and/or the like) defines a constant (e.g., unchanging) edge of the clamping mechanism on the end-of-arm tool 400 for reference during a packing operation. As further illustrated in
As discussed in more detail below, during a packing operation, the first clamping component 420 abuts existing objects (e.g., previously placed objects, container walls and/or barriers, and/or the like) while the second and third clamping components 430, 440 release the target object(s) at the destination. As a result, the thickness T1 (along the x-axis) of the first clamping component 420 imposes a lower limit on how tightly the target object(s) can be packed at the destination. Accordingly, it is beneficial for the first clamping component 420 to be as thin as possible while still providing the rigidity necessary to grip the target object(s). In various embodiments, the thickness T1 can be between about 1 millimeter (mm) and about 3 centimeters (cm).
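The packing limit described above reduces to simple arithmetic, sketched below; the helper name and the optional clearance term are illustrative additions, not part of the disclosed system:

```python
def minimum_packing_gap(plate_thickness_m: float, clearance_m: float = 0.0) -> float:
    """Lower bound on the gap between a placed object and its neighbor.

    While releasing, the first clamping component sits between the target
    object and the previously placed objects, so the final gap can never be
    smaller than the plate thickness plus any retraction clearance.
    """
    return plate_thickness_m + clearance_m

# A 3 mm plate allows roughly ten-times-tighter packing than a 3 cm plate.
thin_gap = minimum_packing_gap(0.003)
thick_gap = minimum_packing_gap(0.03)
```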
The second clamping component 430 is carried by the frame 402 through the actuator system 410. Similar to the first clamping component 420, the second clamping component 430 (sometimes also referred to herein as an “internal clamping component,” a “clamping plate,” an “active clamping component,” and/or the like) includes a second clamping surface 432 oriented (e.g., facing, directed) toward the first clamping component 420 and one or more second openings 434 (two shown). The second clamping surface 432 can include a gripping substrate to help grip the target object(s), while the second opening(s) 434 can reduce the weight of the second clamping component 430. Further, as discussed above, the second clamping component 430 is operably coupled to one or more of the carriages 414 of the actuator system 410 to move the second clamping component 430 along the transverse axis (e.g., toward and away from the first clamping component 420). As a result, the first and second clamping surfaces 422, 432 can engage opposing (e.g., opposite) side surfaces of the target object(s), allowing the end-of-arm tool 400 to lift and transport the target object(s).
The third clamping component 440 provides optional, additional support to the target object(s) engaged by the end-of-arm tool 400. In the illustrated embodiment, the third clamping component 440 is carried by the frame 402 through the actuator system 410 peripheral to the second clamping component 430 along the transverse axis (e.g., the x-axis) of the frame 402 (sometimes also referred to herein as peripheral to the second clamping component 430 with respect to the first clamping component 420). The third clamping component 440 includes one or more extension portions 442 (sometimes also referred to as “object support components,” “lifting portions,” “protrusions,” “object-lifting components,” and/or the like) that project toward the first clamping component 420 and one or more openings 444 (one labeled, two illustrated). Further, when the third clamping component 440 abuts the second clamping component 430 (e.g., as illustrated), the extension portions 442 extend beneath a lower edge 433 of the second clamping component 430 and beyond the second clamping surface 432. As a result, the extension portions 442 can engage or otherwise support a lower surface of the target object(s) engaged by the first and second clamping surfaces 422, 432. Additionally, or alternatively, the third clamping component 440 can add stability to the end-of-arm tool 400 by increasing the rigidity of the movable side of the end-of-arm tool 400. The additional support and/or stability can help stabilize the target object(s) during transportation and/or allow the end-of-arm tool 400 to lift and transport heavier objects.
In the illustrated embodiment, the third clamping component 440 includes one or more bracing bracket(s) 446 (one illustrated in
As further illustrated in
As further illustrated in
As illustrated in
As further illustrated in
In some embodiments, the first and second clamping components 520, 530 engage the first target object 10a with sufficient force and/or security to move the first target object 10a without support from the third clamping component 540 (
For example, in the embodiment illustrated in
It will be understood that, although the second motion path B (
Further, as illustrated in
As discussed in more detail below, the target objects 711 can be placed in the shipping unit 710 sequentially by an end-of-arm tool similar to those discussed above with reference to
As illustrated in
For example, as illustrated in
In some embodiments, the third clamping component 736 is moved away from the first clamping component 732 before the end-of-arm tool 730 approaches a placement position on the pallet 720. As a result, the extension portions on the third clamping component 736 can be disengaged from the target objects and moved out of the way while placing the target objects. In other embodiments, however, the second and third clamping components 734, 736 are moved away from the first clamping component 732 at the same time at the placement location.
As illustrated in
The process 800 begins at block 802 by identifying one or more parameters of one or more target object(s). The parameters can include a length, width, height, weight, expected weight, weight distribution, expected weight distribution, wall strength, expected wall strength, rigidity, and/or any other suitable parameter. In some embodiments, the parameter(s) are identified by one or more sensors (e.g., sensor units 216 of
As an illustrative example, the robotic system 100 can obtain image data (e.g., a 2-dimensional and/or 3-dimensional representation of the object source 14) depicting the target object 112 and/or various objects in the surrounding environment. The robotic system 100 can identify edges, surface textures (e.g., images and/or characters printed on the surfaces of the objects), heights, or the like to identify candidate objects depicted in the image data. The robotic system 100 can compare the identified candidate objects and characteristics thereof (e.g., edge lengths, surface textures, etc.) to those of objects registered in the master data. Based on matching a registered object, the robotic system 100 can detect the object depicted in the image data by identifying a type or an identifier for the depicted object and by locating the depicted object.
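The matching step above can be sketched as a simple comparison against registered entries. The sketch below is a hedged illustration only: the class names, the texture signature, and the 5 mm tolerance are assumptions for the example and do not describe the robotic system 100's actual data structures.

```python
# Hypothetical sketch of matching detected candidates against master data by
# edge lengths (within a tolerance) and a surface-texture signature. All names
# and tolerances here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Candidate:
    edge_lengths_mm: tuple  # (length, width) measured from detected edges
    texture_id: str         # e.g., a signature of printed images/characters

@dataclass
class RegisteredObject:
    object_id: str
    edge_lengths_mm: tuple
    texture_id: str

def match_candidate(candidate, master_data, tol_mm=5.0):
    """Return the object_id of the first registered object whose edge lengths
    (within tol_mm) and texture signature match the candidate, else None."""
    for registered in master_data:
        dims_match = all(
            abs(c - r) <= tol_mm
            for c, r in zip(candidate.edge_lengths_mm, registered.edge_lengths_mm)
        )
        if dims_match and candidate.texture_id == registered.texture_id:
            return registered.object_id
    return None

master = [RegisteredObject("box-A", (300.0, 200.0), "tex-01")]
assert match_candidate(Candidate((302.0, 198.0), "tex-01"), master) == "box-A"
assert match_candidate(Candidate((302.0, 198.0), "tex-99"), master) is None
```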
At block 803, the process 800 includes planning a grasping operation based on the parameter(s) identified/detected at block 802, an environment surrounding the target object(s) at a picking location (e.g., on a conveyor belt and/or any other suitable location), and/or an environment at a placement location (e.g., on a pallet, in a container or other shipping unit, on a warehouse cart, and/or any other suitable location). Planning the grasping operation can include: planning an approach for the end-of-arm tool (e.g., as illustrated in
In planning the grasping operation, the process 800 can derive a grasping location on the target object(s) such that a side surface of the target object(s) is aligned with (or peripheral to) a longitudinal end of the first and second clamping components (e.g., as illustrated in
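The alignment described above can be reduced to a one-dimensional calculation along the tool's longitudinal axis. The sketch below is an illustrative assumption (the function name and coordinate convention are not from the disclosure): the grasp center is chosen so that the object's side surface sits flush with the longitudinal end of the clamping plates.

```python
# Minimal sketch of deriving a grasp location so that a side surface of the
# target object aligns with the longitudinal end of the first and second
# clamping components. Positions are 1-D coordinates along the longitudinal
# axis; names are illustrative, not the system's API.

def grasp_center_offset(object_length_m: float, clamp_length_m: float) -> float:
    """Offset (meters) from the clamp's longitudinal end to the object's
    center when the object's side surface is placed flush with that end.
    Requires the object to fit within the clamp span."""
    if object_length_m > clamp_length_m:
        raise ValueError("object extends beyond the clamping plates")
    # Flush with the clamp end, the object's center sits half an
    # object-length inward from that end.
    return object_length_m / 2.0

assert grasp_center_offset(0.2, 0.5) == 0.1
```

Aligning the side surface this way lets the tool place the object directly against a container wall or a previously placed object, with only the first clamping plate in between.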
At blocks 804-808, the process 800 then includes implementing the planned grasping operation. At block 804, the process 800 includes grasping the target object(s). Additional details on the grasping process are discussed below with respect to
At block 812, the process 810 includes aligning the end-of-arm tool with the target object(s) at the pick-up location. As illustrated in
At block 814, the process 810 includes actuating the second clamping component toward the first clamping component, for example as illustrated in
At block 816, the process 810 includes moving the end-of-arm tool to expose at least a portion of the lower surface of the target object(s), for example as illustrated in
At block 818, the process 810 includes actuating the third clamping component to engage the lower surface of the target object(s), for example as illustrated in
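The grasping sequence of blocks 812-818 can be summarized as an ordered command list such as a controller might issue. The command names below are illustrative assumptions; only the ordering (align, clamp, lift to expose the lower surface, engage the third clamping component) comes from the process described above.

```python
# Hedged sketch of the grasping sequence in blocks 812-818. The text fixes
# only the order of operations; the command names are invented for
# illustration.

def plan_grasp_commands():
    return [
        ("align_tool", "position the clamping surfaces beside the target object(s)"),
        ("actuate_second_clamp", "move toward the first clamping component to grip"),
        ("lift_tool", "expose at least a portion of the lower surface"),
        ("actuate_third_clamp", "extend the extension portions beneath the lower surface"),
    ]

commands = [name for name, _ in plan_grasp_commands()]
assert commands == [
    "align_tool", "actuate_second_clamp", "lift_tool", "actuate_third_clamp"
]
```

Note that the lift at block 816 precedes engaging the third clamping component, so its extension portions have clearance to slide beneath the object.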
At block 822, the process 820 includes aligning the end-of-arm tool with a placement location. The alignment can place the end-of-arm tool directly above and/or otherwise adjacent to the placement location. The alignment process can be executed by the robotic unit (e.g., the robotic unit 300 of
At block 824, the process 820 includes actuating the third clamping component to disengage the lower surface of the target object(s). The actuation can be driven by the actuator system under the control of the controller 109 of
At block 826, the process 820 includes moving the end-of-arm tool to the placement location, for example as illustrated in
At block 828, the process 820 includes actuating the second clamping component away from the first clamping component. The actuation can be driven by the actuator system under the control of the controller 109 of
At block 830, the process 820 includes lifting the end-of-arm tool clear of the target object(s) and departing from the placement location. The departure can be controlled by the robotic unit (e.g., the robotic unit 300 of
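The release sequence of blocks 822-830 mirrors the grasping sequence. As before, the command names in the sketch below are illustrative assumptions; the ordering (align, retract the third clamping component, lower to the placement position, open the second clamping component, lift clear) follows the process described above.

```python
# Hedged sketch of the release sequence in blocks 822-830. The third clamping
# component retracts before the tool reaches the placement position so its
# extension portions do not collide with previously placed objects.

def plan_release_commands():
    return [
        "align_tool_over_placement",   # block 822
        "retract_third_clamp",         # block 824
        "lower_to_placement",          # block 826
        "retract_second_clamp",        # block 828
        "lift_clear_and_depart",       # block 830
    ]

seq = plan_release_commands()
assert seq.index("retract_third_clamp") < seq.index("lower_to_placement")
assert seq[-1] == "lift_clear_and_depart"
```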
Once the process 820 is completed, the system can return to block 802 of
In the embodiment illustrated in
In the embodiment illustrated in
As further illustrated in
The second grippers 1165 each include an expandable arm 1166 carried by an outer surface of the third clamping component 1140 and a suction gripping component 1168 carried by a distal end of the expandable arm 1166. Similar to the pivotable arm 1162, the expandable arm 1166 (e.g., a telescoping component, extendable track, and the like) is movable along a sixth motion path F between a first position and a second position. When the expandable arm 1166 is in the first position, the suction gripping component 1168 is positioned above a lower edge of the end-of-arm tool 1100 (e.g., at an elevation above the lower edge of the first and/or second clamping components on the end-of-arm tool 1100) and therefore does not impede the motion of the end-of-arm tool 1100 during the grasping operations described above. When the expandable arm 1166 is in the second position, the suction gripping component 1168 is positioned below the lower edge of the end-of-arm tool 1100 (e.g., at an elevation below the lower edge of the first and/or second clamping components on the end-of-arm tool 1100).
As further illustrated in
The present technology is illustrated, for example, according to various aspects described below. Various examples of aspects of the present technology are described as numbered examples (1, 2, 3, etc.) for convenience. These are provided as examples and do not limit the present technology. It is noted that any of the dependent examples can be combined in any suitable manner, and placed into a respective independent example. The other examples can be presented in a similar manner.
1. An end-of-arm tool, comprising:
2. The end-of-arm tool of example 1 wherein the actuator system includes:
3. The end-of-arm tool of any of examples 1 and 2 wherein the second clamping component includes a bracing bracket coupled to the actuator system and extending along a third axis at least partially orthogonal to the first axis and the second axis.
4. The end-of-arm tool of example 3 wherein the third clamping component includes an opening aligned with the bracing bracket along the second axis and configured to nest with the bracing bracket when the third clamping component is adjacent to the second clamping component.
5. The end-of-arm tool of any of examples 1-4 wherein the second clamping component includes a clamping surface oriented toward the first clamping component, and wherein the second clamping component further includes a gripping substrate disposed on the clamping surface.
6. The end-of-arm tool of any of examples 1-5 wherein the first clamping component has a thickness of less than 3 centimeters.
7. The end-of-arm tool of any of examples 1-6, further comprising an imaging system carried by the frame and positioned to measure one or more parameters of a target object adjacent to the end-of-arm tool.
8. The end-of-arm tool of any of examples 1-7, further comprising one or more retractable arms carried by the frame, each of the one or more retractable arms including a suction gripping component and movable between a first position and a second position, wherein:
9. A method for operating an end-of-arm tool, the method comprising:
10. The method of example 9 wherein generating the commands for grasping the one or more target objects includes:
11. The method of example 10 wherein generating the commands for grasping the one or more target objects further includes generating commands to move the end-of-arm tool, after moving the second clamping component and before moving the third clamping component, to at least partially expose the lower surface of the one or more target objects to be engaged by the extension portion of the third clamping component.
12. The method of any of examples 9-11 wherein generating the commands for releasing the one or more target objects includes:
13. The method of example 12 wherein generating the commands to lift the end-of-arm tool away from the planned placement position includes generating commands to move the end-of-arm tool vertically to avoid moving the one or more target objects via contact with the first clamping component.
14. The method of any of examples 9-13 wherein the planned placement position is spaced apart from one or more previously placed target objects by a distance generally equal to a thickness of the first clamping component.
15. The method of example 14 wherein generating the commands for releasing the one or more target objects includes generating commands to orient the end-of-arm tool with the first clamping component between the one or more previously placed target objects and the one or more target objects.
16. A robotic system, comprising:
17. The robotic system of example 16 wherein the object-gripping mechanism further includes:
18. The robotic system of any of examples 16 and 17 wherein the actuator system is configured to move the external clamping component along the second axis toward and away from the internal clamping component.
19. The robotic system of any of examples 16-18 wherein the object-gripping mechanism further includes an imaging system carried by the frame and positioned to measure one or more parameters of a target object adjacent to the object-gripping mechanism.
20. The robotic system of any of examples 16-19 further comprising a controller operably coupled to the robotic arm and the object-gripping mechanism, the controller storing instructions that, when executed by the controller, cause the controller to:
21. The robotic system of any of examples 16-20, further comprising the robotic arm operably coupled to the object-gripping mechanism.
From the foregoing, it will be appreciated that specific embodiments of the technology have been described herein for purposes of illustration, but well-known structures and functions have not been shown or described in detail to avoid unnecessarily obscuring the description of the embodiments of the technology. To the extent any material incorporated herein by reference conflicts with the present disclosure, the present disclosure controls. Where the context permits, singular or plural terms may also include the plural or singular term, respectively. Moreover, unless the word “or” is expressly limited to mean only a single item exclusive from the other items in reference to a list of two or more items, then the use of “or” in such a list is to be interpreted as including (a) any single item in the list, (b) all of the items in the list, or (c) any combination of the items in the list. Furthermore, as used herein, the phrase “and/or” as in “A and/or B” refers to A alone, B alone, and both A and B. Additionally, the terms “comprising,” “including,” “having,” and “with” are used throughout to mean including at least the recited feature(s) such that any greater number of the same features and/or additional types of other features are not precluded. Further, the terms “approximately” and “about” are used herein to mean within at least 10 percent of a given value or limit. Purely by way of example, an approximate ratio means within ten percent of the given ratio.
From the foregoing, it will also be appreciated that various modifications may be made without deviating from the disclosure or the technology. For example, one of ordinary skill in the art will understand that various components of the technology can be further divided into subcomponents, or that various components and functions of the technology may be combined and integrated. In a specific example, as noted above, although the end-of-arm tool has been discussed primarily herein as having a stationary first clamping component, the first clamping component can be coupled to the actuator system to move toward and away from the second clamping component. In some embodiments, only the first clamping component is coupled to the actuator system. In such embodiments, the end-of-arm tool can be positioned with the second and third clamping components adjacent to a pick-up location (e.g., on a conveyor belt) and the first clamping component can be actuated to push one or more target objects into contact with the second clamping component and/or into a position supported by the third clamping component. In various other embodiments, any of the first, second, and third clamping components can be coupled to the actuator system to move along the transverse axis as needed. In addition, certain aspects of the technology described in the context of particular embodiments may also be combined or eliminated in other embodiments. Furthermore, although advantages associated with certain embodiments of the technology have been described in the context of those embodiments, other embodiments may also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages to fall within the scope of the technology. Accordingly, the disclosure and associated technology can encompass other embodiments not expressly shown or described herein.
The present application claims the benefit of U.S. Provisional Patent Application No. 63/327,811, filed Apr. 6, 2022, the entirety of which is incorporated herein by reference.