The present technology is generally directed to robotic systems and, more specifically, to systems, processes, and techniques for object detection. For example, several embodiments of the present technology are directed to robotic systems with dynamic motion planning for transferring unregistered objects (e.g., having initially unknown dimensions), such as robotic systems with dynamic approach, depart, and/or return path motion planning based on sensor data obtained using upward facing sensors.
With their ever-increasing performance and decreasing cost, many robots (e.g., machines configured to automatically/autonomously execute physical actions) are now extensively used in many fields. Robots, for example, can be used to execute various tasks (e.g., manipulate or transfer an object through space) in manufacturing and/or assembly, packing and/or packaging, transport and/or shipping, etc. In executing the tasks, the robots can replicate human actions, thereby replacing or reducing the human involvement that would otherwise be required to perform dangerous or repetitive tasks.
Despite the technological advancements, however, robots often lack the sophistication necessary to duplicate human interactions required for executing larger and/or more complex tasks. Accordingly, there remains a need for improved techniques and systems for managing operations and/or interactions between robots.
Robotic systems with dynamic motion planning for transferring unregistered objects (and associated systems, devices, and methods) are disclosed herein. Unregistered objects can include objects having one or more properties or characteristics that are not included, stored, or registered in master data of a robotic system employed to transfer the unregistered objects between a source location and a destination location. Additionally, or alternatively, unregistered objects can include objects having one or more properties or characteristics that may be erroneously detected, occluded, altered, and/or otherwise determined to be different from the features included in the master data. As a result, the unregistered objects can be (at least initially) ‘unknown’ to the robotic system. The unknown properties or characteristics of the unregistered objects can include physical dimensions (e.g., length and/or width of one or more sides of the objects), shape, center of mass location, weight, SKU, fragility rating, etc. A specific example of a property of an unregistered target object that may be unknown to a robotic system is height of the target object.
Without knowledge of such a property (e.g., the height of the target object), it may be difficult for a robotic system to place the target object at a destination location. For example, although it may be possible to (i) engage a top surface of the target object at a source location using an end-effector of the robotic system and (ii) transfer the target object toward the destination location (e.g., based on a maximum possible height value and/or a minimum possible height value for the target object), the robotic system may not be aware of a location of a bottom surface of the target object. Thus, the robotic system may not be able to determine how far it must lower the target object toward the destination location before disengaging (e.g., dropping) the target object at the destination location. Releasing a shorter object from a higher location increases the drop distance and increases the risk of damaging the object and the contents therein. Alternatively, excessively lowering the grasped object can crush the grasped object and the contents therein.
To address this concern, robotic systems of the present technology can include sensors (e.g., distance sensors) having vertically oriented fields of view. While transferring an unregistered target object between a source location and a destination location, a robotic system of the present technology can present the target object to a vertically oriented sensor by positioning the target object within the vertically oriented field of view of the sensor. In turn, the sensor can be used to determine a distance (e.g., a second distance) between the target object and the sensor. In addition, given that (i) the location of the sensor and (ii) the location of the end-effector gripping the target object are known to the robotic system, the robotic system can determine a distance (e.g., a first distance) between the end-effector and the sensor at the time the target object is presented to the sensor. Thus, the robotic system can determine a height of the target object by determining a difference between the first distance and the second distance.
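Purely for illustration, the height determination described above can be expressed as a short sketch. The following Python example is a minimal, hypothetical illustration; the function name, parameter names, and numeric values are assumptions introduced here and are not part of the disclosed robotic system.

```python
def measure_object_height(end_effector_z, sensor_z, sensor_reading):
    """Estimate the height of a grasped object using an upward-facing distance sensor.

    end_effector_z: known vertical position of the end-effector's bottom surface
    sensor_z: known vertical position of the upward-facing sensor
    sensor_reading: measured distance from the sensor to the object's bottom surface
    """
    # First distance: end-effector to sensor (known from the tracked robot state).
    first_distance = end_effector_z - sensor_z
    # Second distance: object bottom surface to sensor (reported by the sensor).
    second_distance = sensor_reading
    # The object height is the difference between the two distances.
    return first_distance - second_distance


# Example: end-effector at 1.50 m, sensor at 0.20 m, sensor reads 0.95 m
# -> object height = (1.50 - 0.20) - 0.95 = 0.35 m
height = measure_object_height(1.50, 0.20, 0.95)
```

With the height known in this way, the location of the object's bottom surface follows directly from the tracked end-effector position.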
Knowledge of the height of the target object and the location of the end-effector enables the robotic system to determine a location of a bottom surface of the target object. In turn, the robotic system can determine an approach path for a robotic arm and the end-effector of the robotic system to place the target object at the destination location. In some embodiments, the robotic system can optimize the approach path and/or a speed at which the robotic arm and the end-effector move along the approach path, such as to reduce or minimize time spent by the robotic system placing the target object at the destination location.
In addition, in some embodiments, the robotic system can determine a height (e.g., a release height) above the destination location at which the end-effector of the robotic system can safely disengage (e.g., drop) the target object for placing the target object at the destination location. The release height can depend on one or more properties of the target object. For example, the robotic system can determine a lower release height for a heavier or more fragile target object, and/or can determine a higher release height for a lighter or less fragile target object.
Furthermore, knowledge of the height of a target object enables the robotic system to determine the future location of the end-effector at the time when the bottom surface of the target object is positioned at the release height for the target object. Therefore, the robotic system can dynamically calculate a return path for returning the end-effector to a start location directly from that future location of the end-effector. Thus, the time spent by the robotic system returning the end-effector to the start location can be less than the time spent by a robotic system that, following placement of the target object at the destination location, first raises the end-effector to a precalculated/predetermined height (e.g., to avoid horizontal line sensors or other components of the robotic system) before moving the end-effector to the start location along a return path.
In the following description, numerous specific details are set forth to provide a thorough understanding of the presently disclosed technology. In some embodiments, the techniques introduced herein can be practiced without these specific details. In other instances, well-known features, such as specific functions or routines, are not described in detail herein in order to avoid unnecessarily obscuring the present disclosure. References in this description to “an embodiment,” “one embodiment,” or the like mean that a particular feature, structure, material, or characteristic being described is included in at least one embodiment of the present disclosure. Thus, the appearances of such phrases in this specification do not necessarily all refer to the same embodiment. On the other hand, such references are not necessarily mutually exclusive either. Furthermore, the particular features, structures, materials, or characteristics can be combined in any suitable manner in one or more embodiments. It is to be understood that the various embodiments shown in the figures are merely illustrative representations and are not necessarily drawn to scale.
Several details describing structures or processes that are well-known and often associated with robotic systems and subsystems, but that can unnecessarily obscure some significant aspects of the disclosed techniques, are not set forth in the following description for purposes of clarity. Moreover, although the following disclosure sets forth several embodiments of different aspects of the present technology, several other embodiments can have different configurations or different components than those described in this section. Accordingly, the disclosed techniques can have other embodiments with additional elements or without several of the elements described below.
Many embodiments or aspects of the present disclosure described below can take the form of computer- or controller-executable instructions, including routines executed by a programmable computer or controller. Those skilled in the relevant art will appreciate that the disclosed techniques can be practiced on computer or controller systems other than those shown and described below. The techniques described herein can be embodied in a special-purpose computer or data processor that is specifically programmed, configured, or constructed to execute one or more of the computer-executable instructions described below. Accordingly, the terms “computer” and “controller” as generally used herein refer to any data processor and can include Internet appliances and handheld devices (including palm-top computers, wearable computers, cellular or mobile phones, multi-processor systems, processor-based or programmable consumer electronics, network computers, mini computers, and the like). Information handled by these computers and controllers can be presented on any suitable display medium, including a liquid crystal display (LCD). Instructions for executing computer- or controller-executable tasks can be stored in or on any suitable computer-readable medium, including hardware, firmware, or a combination of hardware and firmware. Instructions can be contained in any suitable memory device, including, for example, a flash drive, USB device, and/or other suitable medium.
The terms “coupled” and “connected,” along with their derivatives, can be used herein to describe structural relationships between components. It should be understood that these terms are not intended as synonyms for each other. Rather, in particular embodiments, “connected” can be used to indicate that two or more elements are in direct contact with each other. Unless otherwise made apparent in the context, the term “coupled” can be used to indicate that two or more elements are in either direct or indirect (with other intervening elements between them) contact with each other, or that the two or more elements co-operate or interact with each other (e.g., as in a cause-and-effect relationship, such as for signal transmission/reception or for function calls), or both.
In the illustrated embodiment, the robotic system 100 can include an unloading unit 102, a transfer unit 104 (e.g., a palletizing robot and/or a piece-picker robot), a transport unit 106, a loading unit 108, or a combination thereof in a warehouse or a distribution/shipping hub. Each of the units in the robotic system 100 can be configured to execute one or more tasks. The tasks can be combined in sequence to perform an operation that achieves a goal, such as to unload objects from a truck or a van and store them in a warehouse or to unload objects from storage locations and prepare them for shipping. In some embodiments, the task can include placing the objects on a target location (e.g., on top of a pallet and/or inside a bin/cage/box/case). As described in detail below, the robotic system 100 can derive individual placement locations/orientations, calculate corresponding motion plans, or a combination thereof for placing and/or stacking the objects. Each of the units can be configured to execute a sequence of actions (e.g., operating one or more components therein) to execute a task.
In some embodiments, the task can include manipulation (e.g., moving and/or reorienting) of a target object 112 (e.g., one of the packages, boxes, cases, cages, pallets, etc. corresponding to the executing task) from a start/source location 114 to a task/destination location 118. For example, the unloading unit 102 (e.g., a devanning robot) can be configured to transfer the target object 112 from a location in a carrier (e.g., a truck) to a location on a conveyor 107. Also, the transfer unit 104 can be configured to transfer the target object 112 between one location (e.g., the conveyor 107, a pallet, or a bin) and another location (e.g., a pallet, a bin, another conveyor, etc.). For example, the transfer unit 104 (e.g., a palletizing robot) can be configured to transfer the target object 112 from a source location (e.g., a pallet, a bin, a pickup area, and/or a conveyor at which the transfer unit 104 engages the target object 112) to a destination location (e.g., a pallet, a bin, a dropoff area, and/or a conveyor at which the transfer unit 104 places or disengages the target object 112). The transport unit 106 (e.g., a conveyor, an automated guided vehicle (AGV), a shelf-transport robot, etc.) can transfer the target object 112 between (a) an area associated with the transfer unit 104 and (b) an area associated with the loading unit 108. The loading unit 108 can transfer the target object 112 (by, e.g., moving the pallet carrying the target object 112) between the transfer unit 104 and a storage location (e.g., a location on the shelves).
In some embodiments, the robotic system 100 can include sensors 116, such as two-dimensional imaging sensors and three-dimensional imaging sensors. For example, the robotic system 100 can include sensors 116 placed above a source location, such as one or more top-down facing sensors. The sensors 116 placed above the source location can be used to, for example, recognize objects 112 (e.g., unknown objects, unregistered objects, known objects, and/or registered objects) at the source location, and/or calculate dimensions (e.g., a length and/or a width of top surfaces) of the objects 112. In some embodiments, the robotic system 100 can process sensor information of a top surface of a target object 112 that is captured using the sensors 116 to calculate detection results that may or may not correspond with registered objects (e.g., objects having corresponding information included in master data).
For illustrative purposes, the robotic system 100 is described in the context of a packaging and/or shipping center. It is understood, however, that the robotic system 100 can be configured to execute tasks in other environments/for other purposes, such as for manufacturing, assembly, storage/stocking, healthcare, and/or other types of automation. It is also understood that the robotic system 100 can include other units, such as manipulators, service robots, modular robots, etc., not shown in
The processors 202 can include data processors (e.g., central processing units (CPUs), special-purpose computers, and/or onboard servers) configured to execute instructions (e.g., software instructions) stored on the storage devices 204 (e.g., computer memory). In some embodiments, the processors 202 can be included in a separate/stand-alone controller that is operably coupled to the other electronic/electrical devices illustrated in
The storage devices 204 can include non-transitory computer-readable mediums having stored thereon program instructions (e.g., software 210). Some examples of the storage devices 204 can include volatile memory (e.g., cache and/or random-access memory (RAM)) and/or non-volatile memory (e.g., flash memory and/or magnetic disk drives). Other examples of the storage devices 204 can include portable memory and/or cloud storage devices.
In some embodiments, the storage devices 204 can be used to further store and provide access to processing results and/or predetermined data/thresholds. For example, the storage devices 204 can store master data 246 that includes descriptions of objects (e.g., boxes, cases, and/or products) that may be manipulated by the robotic system 200. In one or more embodiments, the master data 246 can include a dimension, a shape (e.g., templates for potential poses and/or computer-generated models for recognizing the object in different poses), a color scheme, an image, identification information (e.g., bar codes, quick response (QR) codes, logos, etc., and/or expected locations thereof), an expected weight, other physical/visual characteristics, or a combination thereof for the objects expected to be manipulated by the robotic system 200. In some embodiments, the master data 246 can include manipulation-related information regarding the objects, such as a center-of-mass (CoM) location on each of the objects, expected sensor measurements (e.g., for force, torque, pressure, and/or contact measurements) corresponding to one or more actions/maneuvers, or a combination thereof.
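By way of a hedged illustration only, one possible in-memory representation of an entry in the master data 246 is sketched below in Python; the field names and example values are hypothetical and are not taken from the disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple


@dataclass
class MasterDataEntry:
    """Hypothetical record describing one registered object type."""
    sku: str
    dimensions_mm: Tuple[float, float, float]      # length, width, height
    expected_weight_kg: float
    center_of_mass_mm: Tuple[float, float, float]  # relative to a reference corner
    fragility_rating: int                          # e.g., 1 (robust) through 5 (fragile)
    barcode: Optional[str] = None
    surface_images: List[str] = field(default_factory=list)  # image paths or identifiers


# Example lookup: a detection result is "registered" if its SKU appears in the master data.
master_data = {
    "BOX-001": MasterDataEntry(
        sku="BOX-001",
        dimensions_mm=(400.0, 300.0, 250.0),
        expected_weight_kg=2.4,
        center_of_mass_mm=(200.0, 150.0, 125.0),
        fragility_rating=2,
        barcode="0123456789012",
    )
}
is_registered = "BOX-001" in master_data
```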
The communication devices 206 can include circuits configured to communicate with external or remote devices via a network. For example, the communication devices 206 can include communication input/output devices 248, such as receivers, transmitters, transceivers, modulators/demodulators (modems), signal detectors, signal encoders/decoders, connector ports, network cards, etc. The communication devices 206 can be configured to send, receive, and/or process electrical signals according to one or more communication protocols (e.g., the Internet Protocol (IP), wireless communication protocols, etc.). In some embodiments, the robotic system 200 can use the communication devices 206 to exchange information between units of the robotic system 200 and/or exchange information (e.g., for reporting, data gathering, analyzing, and/or troubleshooting purposes) with systems or devices external to the robotic system 200.
The input-output devices 208 can include user interface devices configured to communicate information to and/or receive information from human operators. For example, the input-output devices 208 can include a display 250 and/or other output devices (e.g., a speaker, a haptics circuit, or a tactile feedback device, etc.) for communicating information to the human operator. Also, the input-output devices 208 can include control or receiving devices, such as a keyboard, a mouse, a touchscreen, a microphone, a user interface (UI) sensor (e.g., a camera for receiving motion commands), a wearable input device, etc. In some embodiments, the robotic system 200 can use the input-output devices 208 to interact with the human operators in executing an action, a task, an operation, or a combination thereof.
The robotic system 200 can include physical or structural members (e.g., robotic manipulator arms) that are connected at joints for motion (e.g., rotational and/or translational displacements). The structural members and the joints can form a kinematic chain configured to manipulate an end-effector (e.g., a gripper) configured to execute one or more tasks (e.g., gripping, spinning, welding, etc.) depending on the use/operation of the robotic system 200. The actuation devices 212 (e.g., motors, actuators, wires, artificial muscles, electroactive polymers, etc.) can be configured to drive or manipulate (e.g., displace and/or reorient) the structural members about or at a corresponding joint. In some embodiments, the transport motors 214 can be configured to transport the corresponding units/chassis from place to place.
The sensors 216 can be configured to obtain information used to implement various tasks, such as manipulating the structural members and/or transporting objects. The sensors 216 can include devices configured to detect or measure one or more physical properties of the robotic system 200 (e.g., a state, a condition, and/or a location of one or more structural members/joints thereof), of one or more objects (e.g., individual objects 112 of
In some embodiments, for example, the sensors 216 can include one or more imaging devices 222 (e.g., visual and/or infrared cameras, 2D and/or 3D imaging cameras, distance measuring devices such as lidars or radars, etc.) configured to detect the surrounding environment. The imaging devices 222 can generate representations of the detected environment, such as digital images and/or point clouds, that may be processed via machine/computer vision (e.g., for automatic inspection, robot guidance, or other robotic applications). The robotic system 200 (via, e.g., the processors 202) can process the digital image and/or a point cloud to identify a target object, one or more dimensions (e.g., length, width, and/or height dimensions) of the target object, a pickup/start/source location, a drop/end/destination/task location, a pose of the target object, a confidence measure regarding the start location and/or the pose, or a combination thereof.
For manipulating the target object, the robotic system 200 (via, e.g., the various circuits/devices described above) can capture and analyze image data of a designated area (e.g., a pickup location, such as inside a truck, on a pallet, or on a conveyor belt) to identify the target object and a start location thereof. Similarly, the robotic system 200 can capture and analyze image data of another designated area (e.g., a drop location for placing objects on a conveyor, a location for placing objects inside a container, or a location on a pallet for stacking purposes) to identify a task location for the target object. For example, the imaging devices 222 can include one or more cameras configured to generate image data of the pickup area and/or one or more cameras configured to generate image data of the task area (e.g., drop area). Based on the image data, as described below, the robotic system 200 can determine the start location, the task location, the associated poses, a packing/placement location, and/or other processing results.
In some embodiments, the sensors 216 can include contact sensors 226 (e.g., pressure sensors, force sensors, strain gauges, piezoresistive/piezoelectric sensors, capacitive sensors, elastoresistive sensors, and/or other tactile sensors) configured to measure one or more characteristics associated with a direct contact between multiple physical structures or surfaces. The contact sensors 226 can measure characteristics that correspond to a grip of an end-effector (e.g., a gripper) on a target object. Accordingly, the contact sensors 226 can output a contact measure that represents a quantified measure (e.g., a measured force, torque, position, etc.) corresponding to a degree of contact or attachment between the gripper and the target object. For example, the contact measure can include one or more force or torque readings associated with forces applied to the target object by the end-effector.
In these and other embodiments, the sensors 216 can include position sensors 224 (e.g., position encoders, potentiometers, distance sensors, etc.) configured to detect positions of structural members (e.g., robotic arms and/or corresponding end-effectors of the robotic system 200), corresponding joints of the robotic system 200, and/or other objects (e.g., the individual objects 112 of
In some embodiments, the robotic system 300 can generate detection results corresponding to objects at the source location 314. For example, the robotic system 300 can image or monitor a predetermined area to identify and/or locate the source location 314. As a specific example, the robotic system 300 can include a source sensor (e.g., an instance of the sensors 116 of
The robotic system 300 can further image or monitor another predetermined area to identify the destination location 318. In some embodiments, for example, the robotic system 300 can include a destination sensor (e.g., another instance of the sensors 116 of
Using the identified source location 314 and/or the identified destination location 318, the robotic system 300 can operate one or more structures (e.g., the robotic arm 305 and/or the end-effector 309) of a corresponding unit (e.g., the transfer unit 304) to execute the task of transferring the selected target object 312 from the source location 314 to the destination location 318. More specifically, the robotic system 300 can derive or calculate (via, e.g., motion planning rules or algorithms) the motion plan 330 that corresponds to one or more actions that will be implemented by the corresponding unit to execute the task. In general, the motion plan 330 can include source trajectories associated with grasping a target object 312 at the source location 314, transfer trajectories associated with transferring the target object 312 from the source location 314 to the destination location 318, destination trajectories associated with releasing the target object 312 at the destination location 318, and/or return trajectories associated with a subsequent motion plan and/or with returning the corresponding unit to a start location.
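As a non-limiting sketch, the motion plan 330 could be organized as an ordered collection of trajectory segments such as those named above; the data layout, class names, and coordinates below are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import List, Tuple

Waypoint = Tuple[float, float, float]  # x, y, z in the robot's grid coordinates


@dataclass
class TrajectorySegment:
    name: str                  # e.g., "grasp_approach", "transfer", "destination_approach"
    waypoints: List[Waypoint]  # ordered end-effector positions
    speed: float               # commanded speed along this segment


@dataclass
class MotionPlan:
    segments: List[TrajectorySegment]

    def segment(self, name: str) -> TrajectorySegment:
        return next(s for s in self.segments if s.name == name)


# One transfer task chains source, transfer, destination, and return segments.
plan = MotionPlan(segments=[
    TrajectorySegment("source_approach", [(0.0, 0.0, 1.2), (0.5, 0.0, 1.2)], speed=0.8),
    TrajectorySegment("grasp_approach", [(0.5, 0.0, 1.2), (0.5, 0.0, 0.6)], speed=0.3),
    TrajectorySegment("grasp_depart", [(0.5, 0.0, 0.6), (0.5, 0.0, 1.2)], speed=0.5),
    TrajectorySegment("transfer", [(0.5, 0.0, 1.2), (1.5, 0.5, 1.2)], speed=0.8),
    TrajectorySegment("destination_approach", [(1.5, 0.5, 1.2), (1.5, 0.5, 0.5)], speed=0.3),
    TrajectorySegment("return", [(1.5, 0.5, 0.5), (0.0, 0.0, 1.2)], speed=0.8),
])
```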
In the specific example shown in
In some embodiments, the start location can be a default location for the end-effector 309. For example, the start location can be a location to which the end-effector 309 is returned by default after placing the target object 312 at the destination location 318. As another example, the start location can be a storage or idle location at which the end-effector 309 is positioned off to the side/out of the way, and/or a location at which the transfer unit 304 positions the end-effector 309 while the robotic system 300 derives or awaits further commands (e.g., for transferring a next target object between a source location and a destination location).
In these and other embodiments, the start location can be a location at which the transfer unit 304 positions the end-effector 309 to implement (or as part of implementing) a next source approach path and/or a next grasp approach path of a next motion plan derived for transferring a next target object between a source location and a destination location. For example, the start location can be a beginning location of a next source approach path and/or a next grasp approach path that may be implemented by the robotic system 300 to transfer a next target object between a source location and a destination location in accordance with a next motion plan. In other words, the return path 338 can be linked to a start of one or more paths of the next motion plan. Thus, after placing the target object 312 at the destination location 318, the robotic system 300 can implement the return path 338 in the motion plan 330 such that the robotic system 300 can implement the next source approach path and/or the next grasp approach path of the next motion plan for transferring the next target object. As another example, the next source approach path and/or the next grasp approach path of the next motion plan for the next target object can include at least part of the return path of the motion plan 330 for the target object 312. Thus, when the robotic system 300 implements the return path 338 of the motion plan 330, the robotic system 300 can also be implementing at least part of the next source approach path and/or the next grasp approach path of the next motion plan for the next target object. In either of these examples, the start location specified in the return path 338 can depend at least in part on the next target object (e.g., a position, pose, property of the next target object) and/or the next motion plan. Additionally, or alternatively, the next motion plan can depend at least in part on the return path 338.
In some embodiments, the robotic system 300 can derive or calculate the motion plan 330 by determining a sequence of commands and/or settings for one or more actuation devices (e.g., the actuation devices 212 of
In executing the actions associated with the motion plan 330, the robotic system 300 can track a current location (e.g., a set of coordinates corresponding to a grid used by the robotic system 300) and/or a current pose of the target object 312. For example, the robotic system 300 (via, e.g., one or more processors, such as processors 202 of
Transferring (Registered and/or Unregistered) Objects
The objects 412 at the source location 414 can include registered and/or unregistered objects. Registered objects include objects having one or more properties or characteristics that are included, stored, or registered in master data (e.g., the master data 246 of
As shown in
The robotic system 400 can generate detection results corresponding to objects at the source location 414 consistent with the discussion above. For example, the robotic system 400 can include scanners or sensors 416 placed at, above, or about the source location 414. As a specific example, the robotic system 400 can include two-dimensional and/or three-dimensional imaging sensors 416 placed above the source location 414 such that the objects 412 at the source location 414 are within field(s) of view of the imaging sensors 416. The robotic system 400 can utilize the sensors 416 at the source location 414 to determine one or more properties or characteristics of the objects 412 at the source location 414, and/or to detect or identify objects 412 at the source location 414.
For example, in the case of a registered object 412 at the source location 414, the robotic system 400 can utilize information corresponding to the registered object 412 (e.g., that is captured by the sensors 416 at the source location 414) to detect or identify the registered object 412 and/or retrieve corresponding properties and/or characteristics from the master data. Continuing with this example, the robotic system 400 can derive a motion plan (e.g., a motion plan similar to the motion plan 330 of
In the case of an unregistered object 412 at the source location 414, the robotic system 400 can utilize information corresponding to the unregistered object 412 that is captured by the sensors 416 at the source location 414 to detect or identify the unregistered object 412 and/or calculate one or more properties of the unregistered object 412. As a specific example, the robotic system 400 can utilize the sensors 416 at the source location 414 to image (e.g., a top surface of) an unregistered object 412 at the source location 414, and can use the image to estimate dimensions (e.g., a length and/or width of the top surface) of the unregistered object 412. In turn, the robotic system 400 can, based at least in part on the estimated dimensions of the unregistered object 412, derive a motion plan (e.g., similar to the motion plan 330 of
In some embodiments, the robotic system 400 can, with and/or without knowledge of one or more properties or characteristics of the objects 412, calculate motion plans for grasping the objects 412, transferring the objects 412 to or about the destination location 418, and/or placing the objects 412 at the destination location 418. As a specific example, it may be difficult to accurately determine a height of a target object 412 at the source location 414. Continuing with this example, the robotic system 400 can therefore derive, based at least in part on a maximum possible height value and/or a minimum possible height value (provided to the robotic system 400) for the objects 412 at the source location 414, a motion plan for engaging the target object 412 at the source location 414, transferring the target object 412 from the source location 414 toward the destination location 418, and/or placing the target object 412 at the destination location 418.
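For illustration, one conservative way to bound a default destination approach using a provided maximum possible height value is sketched below; the function name, clearance value, and numbers are hypothetical assumptions rather than values from the disclosure.

```python
def default_destination_approach_stop(destination_z, release_height,
                                       max_possible_height, clearance=0.10):
    """Compute a conservative stopping height for the end-effector along a default
    destination approach path, before the object's actual height is known.

    Lowering the end-effector no further than this height guarantees that even an
    object as tall as max_possible_height keeps its bottom surface at or above the
    release height (plus a safety clearance).
    """
    return destination_z + release_height + max_possible_height + clearance


# Example: destination at 0.0 m, release height 0.05 m, objects at most 0.60 m tall
# -> the default approach stops the end-effector no lower than 0.75 m.
stop_z = default_destination_approach_stop(0.0, 0.05, 0.60)
```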
For clarity, consider the partially schematic side view of the robotic system 400 shown in
Without knowledge of the actual height of the target object 412, however, it may be difficult for the robotic system 400 to place the target object 412 at the destination location 418, even with the precalculated default motion paths/speeds. For example, without knowledge of the height of a target object, it may be difficult for the robotic system 400 to determine how far the robotic arm 405 of the transfer unit 404 should lower the target object 412 along the default destination approach path 536 and toward the destination location 418 before releasing the target object 412. In addition, because the robotic system 400 does not know a position of a bottom surface of the target object 412 relative to the end-effector 409, it is difficult to calculate an optimized grasp approach path, an optimized grasp depart path, an optimized transfer path, an optimized destination approach path, an optimized destination depart path, an optimized return path, and/or one or more corresponding optimized motion speeds that reduce or minimize time spent transferring the target object 412 to the destination location 418 and/or returning the end-effector 409 to a start location.
Thus, the robotic system 400 can include one or more sensors in some embodiments for determining height measurements of the objects 412 and/or locations of bottom surfaces of the objects 412. For example,
As the robotic system 400 lowers the target object 412 toward the destination location 418 along a default destination approach path 536, the upper horizontal line sensor 617a and/or the lower horizontal line sensor 617b can be used to detect a bottom surface of the target object 412 and/or determine a height of the target object 412. For example, as discussed above, the robotic system 400 can track a position of (e.g., a bottom surface of) the end-effector 409. Thus, when (i) the robotic system 400 lowers the target object 412 toward the destination location 418 and (ii) the upper horizontal line sensor 617a detects (e.g., a bottom surface of) the target object 412, the known vertical positions of the end-effector 409 and the upper horizontal line sensor 617a at the time the upper horizontal line sensor 617a detects the bottom surface of the target object 412 can be used to determine a height of the target object 412 using Equation 1 below:
Height of Target Object = Vertical Position of End-Effector − Vertical Position of Horizontal Line Sensor        (Equation 1)
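A minimal sketch of Equation 1 follows, assuming the tracked end-effector position is sampled at the moment the line sensor reports a beam interruption; the names and numbers below are hypothetical.

```python
def height_from_line_sensor(end_effector_z_at_detection, line_sensor_z):
    """Equation 1: object height inferred when a horizontal line sensor detects the
    object's bottom surface while the object is being lowered.

    end_effector_z_at_detection: tracked vertical position of the end-effector at detection
    line_sensor_z: fixed, known vertical position of the horizontal line sensor
    """
    return end_effector_z_at_detection - line_sensor_z


# Example: the beam at 0.40 m is interrupted while the end-effector is at 0.72 m
# -> object height = 0.32 m
h = height_from_line_sensor(0.72, 0.40)
```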
In these and other embodiments, the robotic system 400 can use the lower horizontal line sensor 617b in addition to or in lieu of the upper horizontal line sensor 617a to determine the height of the target object 412 based on the known positions of the end-effector 409 and the lower horizontal line sensor 617b at the time the lower horizontal line sensor 617b detects the bottom surface of the target object 412.
Additionally, or alternatively, as the robotic system 400 lowers the target object 412 toward the destination location 418 along the destination approach path 536, the robotic system 400 can use the lower horizontal line sensor 617b to determine when to release or disengage the target object 412 to place the target object 412 at the destination location 418. For example, the lower horizontal line sensor 617b can be positioned at a location above the conveyor 407. The location can correspond to a specified distance (e.g., a release height) above the conveyor 407 at which the robotic system 400 can safely release the target object 412 to place the target object 412 at the destination location 418 (e.g., without damaging the target object 412, without risking the target object 412 falling off the conveyor 407, etc.). Continuing with this example, as the robotic system 400 lowers the target object 412 toward the destination location 418 along the destination approach path 536, the lower horizontal line sensor 617b can detect when a bottom surface of the target object 412 is positioned the specified distance above the conveyor 407. At this point, the robotic system 400 can release or disengage the target object 412 to place the target object 412 at the destination location 418.
There are several drawbacks, however, to utilizing horizontal line sensors similar to the upper horizontal line sensor 617a and/or the lower horizontal line sensor 617b for detecting heights of target objects 412 and/or determining when to release the target objects 412 to place them at the destination location 418. For example, the robotic system 400 is not able to detect a bottom surface of the target object 412 and/or is not able to calculate the height of the target object 412, until the bottom surface of the target object 412 is lowered to the height level of and detected by the upper horizontal line sensor 617a and/or by the lower horizontal line sensor 617b. Therefore, prior to lowering or otherwise placing the target object 412 within the field of view of the upper horizontal line sensor 617a and/or the field of view of the lower horizontal line sensor 617b, the robotic system 400 is unable to calculate optimized trajectories (e.g., a destination approach path, a destination depart path, and/or a return path) and/or corresponding optimized motion speeds that reduce or minimize time spent placing the target object 412 at the destination location 418 and/or returning the end-effector 409 to a start location.
In addition, the upper horizontal line sensor 617a and/or the lower horizontal line sensor 617b are generally located near or proximate to the destination location 418 on the conveyor 407. Thus, by the time (i) the upper horizontal line sensor 617a and/or the lower horizontal line sensor 617b detect a target object 412 and (ii) the robotic system 400 is able to determine the height of the target object 412 using the upper horizontal line sensor 617a and/or the lower horizontal line sensor 617b, the robotic system 400 may not have enough time to dynamically recalculate or adjust the default or precalculated trajectories (e.g., the precalculated destination approach path 536, a precalculated destination depart path, and/or a precalculated return path) and/or corresponding motion speeds to optimize such trajectories/speeds.
Furthermore, the position of the lower horizontal line sensor 617b relative to the conveyor 407 is typically fixed. Thus, without adjusting the position of the lower horizontal line sensor 617b relative to the conveyor 407, the robotic system 400 can be configured to release each target object 412 from a same height above the conveyor 407. In other words, the robotic system 400 is unable to adjust or tailor the release height for a target object 412 based on one or more properties or characteristics (e.g., weight, center of mass location, size, shape, etc.) of the target object 412.
Moreover, given the positions of the upper horizontal line sensor 617a and the lower horizontal line sensor 617b above the conveyor 407, the upper horizontal line sensor 617a and/or the lower horizontal line sensor 617b can pose obstacles to returning the end-effector 409 to a start location. Thus, before moving the end-effector 409 along a return path to return the end-effector 409 to a start location, the robotic system 400 may be required to first move the end-effector 409 along a precalculated destination depart path to raise the end-effector 409 to a specified height that will clear the upper horizontal line sensor 617a and/or the lower horizontal line sensor 617b. Such movement of the end-effector 409 can introduce delays in the process of returning the end-effector 409 to the start location after placing a target object at the destination location 418.
To address one or more of these concerns, the robotic system 400 may employ one or more vertically oriented sensors in addition to or in lieu of the upper horizontal line sensor 617a and/or the lower horizontal line sensor 617b.
Although shown beneath the conveyor 407 in the illustrated embodiment, the sensor 745 can be positioned at other locations within the robotic system 400. For example, the sensor 745 can be positioned at a location between the source location 414 (
For the sake of example, consider
As shown in
Height of Target Object = Distance Between End-Effector and Sensor − Distance Between Target Object and Sensor        (Equation 2)
or H3 = D1 − D2 in the example illustrated in
Referring now to
In some embodiments, the release height D5 can be constant across placement of multiple target objects at the destination location 418. For example, the release height D5 can be invariable across placement of all target objects (including the target object 812) at the destination location 418. As another example, the release height D5 can correspond to a group of target objects (including the target object 812) such that the robotic system 400 is configured to release all target objects of the group from the release height D5. In both of these examples, the release height D5 can correspond to a specified distance above the conveyor 407 at which the robotic system 400 can safely release the multiple target objects to place those target objects at the destination location 418 (e.g., without damaging those target objects, without risking those target objects falling off the conveyor 407, etc.).
In some embodiments, the release height D5 can vary across placement of different target objects at the destination location 418. For example, the release height D5 can be variable and/or can depend at least in part on one or more properties or characteristics (e.g., weight, shape, center of mass location, fragility rating, etc.) of a given target object. As a specific example, the release height D5 for the target object 812 can be smaller when the target object 812 is heavier in weight and/or more fragile, and can be larger when the target object 812 is lighter in weight and/or less fragile. As another specific example, the release height D5 for the target object 812 can be smaller when a shape of the target object 812 and/or a size/shape of the bottom surface of the target object 812 pose a risk of the target object 812 rolling or otherwise falling off of the conveyor 407, and can be larger when the shape of the target object 812 and/or the size/shape of the bottom surface of the target object 812 are relatively flat or do not pose much of a risk of the target object 812 falling off the conveyor 407. In other words, in some embodiments, the release height D5 can be unique to the target object 812 and/or can correspond to one or more properties/characteristics of the target object 812. In some embodiments, the robotic system 400 can utilize one or more sensors (e.g., weight sensors, force sensors, imaging sensors, etc.) for determining one or more of the properties or characteristics of target objects, and/or can (e.g., dynamically) determine release heights for target objects based on properties/characteristics of target objects.
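One possible way to encode such a property-dependent release height is sketched below; the thresholds, units, and values are purely illustrative assumptions and do not correspond to any specific values in the disclosure.

```python
def select_release_height(weight_kg, fragility_rating, default_m=0.05):
    """Pick a release height (in meters above the destination surface) for a target object.

    Heavier or more fragile objects are released from a lower height; lighter,
    robust objects may be released from a higher one. All numbers are placeholders.
    """
    release_m = default_m
    if weight_kg > 10.0 or fragility_rating >= 4:
        release_m = min(release_m, 0.02)   # shorten the drop for heavy/fragile items
    elif weight_kg < 1.0 and fragility_rating <= 2:
        release_m = max(release_m, 0.08)   # light, robust items tolerate a longer drop
    return release_m


# Example: a 12 kg fragile object -> 0.02 m; a 0.5 kg robust object -> 0.08 m
d5_heavy = select_release_height(12.0, 4)
d5_light = select_release_height(0.5, 1)
```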
The robotic system 400 may employ any one or more of several possible methods for determining when the bottom surface of the target object 812 is at the release height D5. For example, the robotic system 400 can determine that the bottom surface of the target object 812 is at the release height D5 by monitoring motion of the end-effector 409. More specifically, the position of the destination location 418 at the top of the rollers of the conveyor 407 may be known to the robotic system 400. Thus, the robotic system 400 may know the vertical distance between the bottom surface of the end-effector 409 and the top of the rollers of the conveyor 407. As such, the robotic system 400 can determine that the bottom surface of the target object 812 is at the release height D5 when the vertical distance of the bottom surface of the end-effector 409 above the rollers of the conveyor 407, less the actual height measurement H3 of the target object 812, is equivalent to the release height D5. This is represented by Equation 3 below:
Vertical Height of Target Object Above Destination Location = Vertical Height of Bottom Surface of End-Effector Above Destination Location − Actual Height Measurement of Target Object        (Equation 3)
Thus, using Equation 3 above, the robotic system 400 can determine that a bottom surface of the target object 812 is at the release height D5 when the Vertical Height of Target Object Above Destination Location value is equivalent to the specified and/or determined release height D5.
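For illustration, the release check based on Equation 3 could be implemented roughly as follows, assuming the end-effector's height above the destination location is continuously tracked; the function names and tolerance are hypothetical.

```python
def object_gap_above_destination(end_effector_height_above_destination, object_height):
    """Equation 3: height of the object's bottom surface above the destination location."""
    return end_effector_height_above_destination - object_height


def should_release(end_effector_height_above_destination, object_height,
                   release_height, tolerance=0.002):
    """Return True once the object's bottom surface has reached the release height."""
    gap = object_gap_above_destination(end_effector_height_above_destination, object_height)
    return gap <= release_height + tolerance


# Example: end-effector 0.42 m above the rollers, object 0.35 m tall, release height 0.05 m
# -> gap = 0.07 m, so the object is not yet at the release height.
ready = should_release(0.42, 0.35, 0.05)
```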
Additionally, or alternatively, the robotic system 400 can determine that the bottom surface of the target object 812 is at the release height D5 by monitoring motion of the end-effector 409 relative to the position of the end-effector 409 at a time (t0) the robotic system 400 determines the actual height measurement H3 of the target object 812 using the sensor 745 (e.g., relative to the position of the end-effector 409 shown in
Vertical Height of Target Object Above Destination Location = Vertical Height of Bottom Surface of End-Effector Above Destination Location at Time t0 − Vertical Distance Traversed by End-Effector Along Destination Approach Path Since Time t0 − Actual Height Measurement of Target Object        (Equation 4)
Thus, using Equation 4 above, the robotic system 400 can determine that a bottom surface of the target object 812 is at the release height D5 when the Vertical Height of Target Object Above Destination Location value is equivalent to the specified and/or determined release height D5.
In these and still other embodiments, a distance (represented by line segment D3 in
Vertical Height of Target Object Above Destination Location = Distance Between Sensor and Bottom Surface of Target Object − Distance Between Sensor and Destination Location        (Equation 5)
Vertical Height of Target Object Above Destination Location = Distance Between End-Effector and Sensor − Actual Height Measurement of Target Object − Distance Between Sensor and Destination Location        (Equation 6)
Thus, using Equation 5 and/or Equation 6 above, the robotic system 400 can determine that a bottom surface of the target object 812 is at the release height D5 when the Vertical Height of Target Object Above Destination Location value is equivalent to the specified and/or determined release height D5.
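A corresponding sketch of Equations 5 and 6 follows, assuming the sensor-referenced distances are available while the object is lowered; all names and values are hypothetical.

```python
def gap_from_sensor(distance_sensor_to_object_bottom, distance_sensor_to_destination):
    """Equation 5: object's height above the destination from two sensor-referenced distances."""
    return distance_sensor_to_object_bottom - distance_sensor_to_destination


def gap_from_end_effector(distance_end_effector_to_sensor, object_height,
                          distance_sensor_to_destination):
    """Equation 6: the same quantity expressed via the end-effector-to-sensor distance."""
    return (distance_end_effector_to_sensor
            - object_height
            - distance_sensor_to_destination)


# Example (meters): destination surface 0.30 m above the sensor, object bottom currently
# 0.38 m above the sensor -> the object is 0.08 m above the destination location.
gap = gap_from_sensor(0.38, 0.30)
```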
Returning to discussion of the destination approach path 836 illustrated in
Moreover, knowledge of the actual height measurement H3 of the target object 812 can facilitate the robotic system 400 dynamically calculating (e.g., dynamically recalculating) destination depart paths and/or return paths for the robotic system 400. For example, knowing the actual height H3 can enable the robotic system 400 to determine a location of a top surface of the target object 812 (and therefore the bottom surface of the end-effector 409) when the bottom surface of the target object 812 is positioned at the release height D5. Thus, knowledge of the actual height H3 of the target object 812 can facilitate calculating a destination depart path and/or a return path starting from a location that the end-effector 409 will be positioned when the bottom surface of the target object 812 is positioned at the release height D5 and/or when the end-effector 409 disengages (e.g., drops) the target object 812. Furthermore, as discussed above, in embodiments in which the actual height H3 of the target object 812 is calculated by the robotic system 400 at or near a start of the destination approach path 836 (e.g., prior to or while the robotic system 400 moves the target object 812 along the destination approach path 836), the robotic system 400 can have ample time to dynamically calculate the destination depart path and/or the return path.
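As a hedged sketch, the future end-effector position at the moment of release, and a return path started directly from it, might be derived as follows; plan_path stands in for whatever motion planner the system uses, and every name and number below is an assumption.

```python
def end_effector_z_at_release(destination_z, release_height, object_height):
    """Predict the end-effector's vertical position at release: the object's bottom
    surface sits at destination_z + release_height, and the end-effector grips the
    object's top surface, object_height above that."""
    return destination_z + release_height + object_height


def plan_return_from_release(release_xy, destination_z, release_height, object_height,
                             start_location, plan_path):
    """Build a return path that begins at the predicted release pose instead of at a
    precalculated clearance height; plan_path is a hypothetical planner callback."""
    release_pose = (release_xy[0], release_xy[1],
                    end_effector_z_at_release(destination_z, release_height, object_height))
    return plan_path(release_pose, start_location)


# Example with a trivial straight-line "planner":
straight_line = lambda start, goal: [start, goal]
return_path = plan_return_from_release((1.5, 0.5), 0.0, 0.05, 0.35,
                                        start_location=(0.0, 0.0, 1.2),
                                        plan_path=straight_line)
```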
Referring to
In other embodiments, such as (i) in embodiments in which the robotic system 400 does not include the upper horizontal line sensor 617a and/or the lower horizontal line sensor 617b, or (ii) in embodiments in which the upper horizontal line sensor 617a and/or the lower horizontal line sensor 617b do not pose obstacles to the end-effector 409, the robotic system 400 can (e.g., dynamically) calculate a hybrid return path 839. As shown in
In other words, use of the sensor 745 of the robotic system 400 to determine an actual height measurement H3 of the target object 812 can facilitate the robotic system 400 (e.g., dynamically) calculating optimized destination approach paths, optimized destination approach speeds, optimized destination depart paths, optimized return paths, and/or optimized hybrid ‘shortcut’ return paths. Additionally, in embodiments in which the sensor 745 is positioned at other locations (e.g., at or proximate the source location 414, between the source location 414 and the destination location 418, etc.), the robotic system 400 can utilize the sensor 745 to determine the actual height measurement H3 of the target object 812 at a point further upstream in the corresponding motion plan for the end-effector 409. In such embodiments, the robotic system 400 can (e.g., dynamically) calculate or optimize other paths (e.g., a source approach path, a grasp approach path, a grasp depart path, and/or a transfer path) for transferring the target object 812 from the source location 414 to the destination location 418.
As shown in
Referring now to
In some embodiments, the robotic system 400 can (e.g., dynamically) determine the release height D9 and/or the release height D8. For example, the robotic system 400 can determine the release height D9 and/or the release height D8 based at least in part on one or more properties or characteristics of the target object 912, consistent with the discussion of
Referring to
Accordingly, use of the sensor 745 in the robotic system 400 can facilitate realization of several advantages over robotic systems that lack such a sensor. For example, the robotic system 400 can use the sensor 745 to determine an actual height of a target object at an early stage of a corresponding motion plan (e.g., prior to or while moving the target object along a destination approach path). As such, the robotic system 400 can be provided sufficient time to (e.g., dynamically) calculate, recalculate, and/or optimize various motion paths and/or corresponding speeds (e.g., transfer paths, destination approach paths, destination approach speeds, release heights, destination depart paths, return paths, hybrid return paths, etc.) included in the motion plan. In turn, time spent by the robotic system 400 placing target objects at the destination location 418 can be reduced and/or minimized in comparison to robotic systems that lack a sensor similar to the sensor 745.
Furthermore, use of the vertically oriented sensor 745 to determine actual height measurements and/or positions of bottom surfaces of target objects relative to the destination location 418 at the top of the rollers of the conveyor 407 can facilitate the robotic system 400 altering, adjusting, tailoring, and/or customizing release heights for different target objects (e.g., based on one or more properties or characteristics of those target objects), and without needing to adjust a position of the sensor 745.
Moreover, the sensor 745 can be positioned beneath the conveyor 407 and/or out of the way of the end-effector 409, and/or can be used in lieu of horizontal line sensors (e.g., one or both of the upper horizontal line sensor 617a and the lower horizontal line sensor 617b of
The method 1070 begins at block 1071 by detecting a target object at a source location. The target object can be a registered or unregistered object. Additionally, or alternatively, the source location can be a pallet, a bin, a designated region on a conveyor, a stack of objects including the target object, etc.
Detecting the target object can include detecting the target object using one or more sensors of the robotic system. For example, detecting the target object can include using one or more imaging sensors to image a designated area and identify a source location. As another example, detecting the target object can include using one or more imaging sensors to image the target object. Based on one or more images of the designated area and/or on one or more images of the target object, the robotic system can identify the source location and/or the target object at the source location.
As shown at subblock 1081, detecting the target object can include estimating at least some of the dimensions for the target object. For example, detecting the target object can include using one or more imaging sensors to image a portion (e.g., a top surface) of the target object. Continuing with this example, detecting the target object can include estimating dimensions (e.g., a length, a width, etc.) of the portion of the target object based at least in part on images of the target object.
At block 1072, the method 1070 continues by deriving a motion plan for transferring the target object to a destination location, such as from the source location to the destination location. In some embodiments, deriving the motion plan can include deriving the motion plan based on one or more properties or characteristics of the target object registered in master data of the robotic system. In these and other embodiments, deriving the motion plan can include deriving the motion plan based on default values (e.g., provided to the robotic system), such as a maximum possible height value for the target object and/or a minimum possible height value for the target object. Additionally, or alternatively, deriving the motion plan for transferring the target object can include determining one or more motion paths and/or one or more corresponding motion speeds for moving the robotic system (e.g., a robotic arm and/or an end-effector of the robotic system) and/or the target object toward the destination location.
For example, referring to subblocks 1082-1084, deriving the motion plan can include deriving a source approach path for moving the end-effector to a location at or proximate the source location; deriving a grasp approach path for maneuvering the end-effector to the target object and operating the end-effector to engage (e.g., grip) the target object; and/or deriving a grasp depart path for moving/raising the target object away from the source location after the target object is engaged by the end-effector. Additionally, or alternatively, referring to subblock 1085, deriving the motion plan can include deriving one or more transfer paths for moving the target object between the source location and the destination location. In these and other embodiments, referring to subblocks 1086-1089, deriving the motion plan can include deriving a destination approach path for placing the target object at the destination location; deriving a destination depart path for moving the end-effector away from the destination location and/or to a specified height; and/or deriving a return path for moving the end-effector to a start location (e.g., at or proximate the source location, such as for transferring another object from the source location to the destination location).
At block 1073, the method 1070 continues by implementing a first portion of the motion plan for transferring the target object to the destination location. Implementing the first portion of the motion plan can include moving the robotic system (e.g., the robotic arm and/or the end-effector) toward the source location in accordance with the source approach path; moving the robotic system to the target object and/or operating the robotic system such that the end-effector engages the target object in accordance with the grasp approach path; and/or moving the robotic system and the target object away from the source location in accordance with the grasp depart path. Additionally, or alternatively, implementing the first portion of the motion plan can include moving the robotic system (e.g., the robotic arm and/or the end-effector) toward the destination location in accordance with the transfer path(s). In these and still other embodiments, implementing the first portion of the motion plan can include moving the target object toward the destination location in accordance with at least part of the destination approach path.
As shown in subblock 1089, implementing the first portion of the motion plan can include presenting the target object to a sensor, such as a distance sensor similar to the distance sensor 745 discussed in detail above. Presenting the target object to the sensor can include positioning the target object above the sensor and/or within a field of view of the sensor. In embodiments in which the sensor is positioned beneath a destination location located at a top surface of rollers of a conveyor, presenting the target object to the sensor can include positioning the target object above the destination location and within a field of view of the sensor that extends unobstructed through a gap between rollers of the conveyor. Alternatively, in embodiments in which the sensor is positioned at another location, such as at a location between the source location and the destination location, presenting the target object to the sensor can include positioning the target object at a location within the field of view of the sensor at the other location. In these and other embodiments, presenting the target object to the sensor includes positioning the target object such that (i) the target object is within a field of view of the sensor and (ii) the end-effector of the robotic system is positioned on a side of the target object opposite the sensor.
At block 1074, the method 1070 continues by determining a height of the target object. Determining the height of the target object can include determining a first distance between a portion of the robotic system and the sensor. For example, determining the height of the target object can include determining a first distance between a bottom surface of the end-effector of the robotic system and the sensor. Continuing with this example, determining the first distance can include tracking or otherwise determining the location of the bottom surface of the end-effector. Determining the height of the target object can additionally, or alternatively, include determining a second distance between the target object and the sensor. For example, determining the second distance can include receiving (e.g., from the sensor) sensor data indicative of the second distance. Additionally, or alternatively, determining the second distance can include determining, based at least in part on the sensor data, the second distance between a bottom surface of the target object and the sensor. In these and other embodiments, determining the height of the target object can include determining the height of the target object based at least in part on the first distance and/or the second distance. For example, determining the height of the target object can include determining the height of the target object as a difference between the first distance and the second distance.
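By way of a non-limiting numerical illustration, the height determination described above reduces to a subtraction of the two distances. The following Python sketch uses function and variable names that are assumptions for illustration only.

```python
# Illustrative sketch only; names are assumptions.
def object_height(effector_to_sensor: float, object_to_sensor: float) -> float:
    """Object height as the difference between (i) the tracked distance from
    the bottom surface of the end-effector to the sensor and (ii) the measured
    distance from the bottom surface of the object to the sensor, both taken
    along the sensor's vertical axis (meters)."""
    height = effector_to_sensor - object_to_sensor
    if height <= 0.0:
        raise ValueError("measured distances imply a non-positive height")
    return height

# Example: the end-effector bottom is tracked 0.95 m above the sensor and the
# sensor reports the object bottom 0.70 m above it, so the object is 0.25 m tall.
h = object_height(0.95, 0.70)  # 0.25
```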
At block 1075, the method 1070 continues by calculating (e.g., deriving) or updating (e.g., adjusting, altering, recalculating, etc.) a second portion of the motion plan for transferring the target object to the destination location. Calculating or updating the second portion of the motion plan can include calculating or updating the second portion of the motion plan based at least in part on the height of the target object determined at block 1074. In these and other embodiments, calculating or updating the second portion of the motion plan can include dynamically calculating or updating all or a subset of the second portion of the motion plan. In these and still other embodiments, calculating or updating the second portion of the motion plan includes calculating or updating the second portion of the motion plan prior to implementing all or a first subset of the second portion of the motion plan and/or while implementing all or a second subset of the second portion of the motion plan.
As shown in subblock 1091, calculating or updating the second portion of the motion plan can include calculating or updating a destination approach path and/or a corresponding destination approach speed. Calculating or updating the destination approach path can include determining a release height for the target object. Determining the release height for the target object can include determining the release height based at least in part on one or more properties or characteristics of the target object, such as the height of the target object determined at block 1074. Calculating or updating the destination approach path and/or the corresponding destination approach speed can include optimizing the destination approach path and/or the corresponding destination approach speed to minimize or reduce time spent by the robotic system placing the target object at the destination location.
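As a non-limiting illustration, once the height of the target object is known, a release height and a corresponding end-effector target for the destination approach can be derived as in the following sketch. The clearance value and names are assumptions for illustration, not values taken from the disclosure.

```python
# Illustrative sketch only; names and the clearance value are assumptions.
def destination_approach_target(destination_surface_z: float,
                                object_height: float,
                                drop_clearance: float = 0.02) -> dict:
    # Release the object when its bottom surface is a small clearance above
    # the destination surface: low enough to limit the drop distance, high
    # enough to avoid pressing the object into the surface.
    release_z_bottom = destination_surface_z + drop_clearance
    # The end-effector grips the top surface, so its commanded height is the
    # release height plus the measured object height.
    effector_target_z = release_z_bottom + object_height
    return {"release_z_bottom": release_z_bottom,
            "effector_target_z": effector_target_z}

# Example: a conveyor surface at z = 0.80 m and a 0.25 m tall object give an
# end-effector release target of 1.07 m with a 0.02 m drop clearance.
target = destination_approach_target(0.80, 0.25)
```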
As shown in subblock 1092, calculating or updating the second portion of the motion plan can include calculating or updating a destination depart path and/or a corresponding destination depart speed. Calculating or updating the destination depart path can include determining a height and/or location to which the robotic system raises the end-effector after placing the target object at the destination location. Determining the height and/or location can include determining a height and/or location at which the end-effector avoids (e.g., clears) horizontal line sensors and/or other components of the robotic system. Calculating or updating the destination depart path and/or the corresponding destination depart speed can include optimizing the destination depart path and/or the corresponding destination depart speed to minimize or reduce time spent by the robotic system moving to the determined height and/or location for the end-effector after placing the target object at the destination location.
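By way of non-limiting illustration, the destination depart height can be chosen to clear nearby components, as in the following sketch. The obstacle heights, the safety margin, and the names are assumptions introduced solely for illustration.

```python
# Illustrative sketch only; names, obstacle heights, and margin are assumptions.
def destination_depart_height(release_z: float,
                              obstacle_top_z: list[float],
                              margin: float = 0.05) -> float:
    # Raise the end-effector at least `margin` above the tallest nearby
    # obstacle (e.g., a horizontal line sensor), but never below the height
    # at which the object was released.
    highest_obstacle = max(obstacle_top_z, default=release_z)
    return max(release_z, highest_obstacle + margin)

# Example: after releasing at 1.07 m, clearing a line sensor at 1.10 m and a
# guard rail at 0.95 m with a 0.05 m margin gives a depart height of 1.15 m.
z_depart = destination_depart_height(1.07, [1.10, 0.95])
```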
As shown in subblock 1093, calculating or updating the second portion of the motion plan can include calculating or updating a return path and/or a corresponding return speed. Calculating or updating the return path can include determining or updating a path by which to return the end-effector of the robotic system to a start location (e.g., after raising the end-effector to the height and/or location specified by the destination depart path). Calculating or updating the return path and/or the corresponding return speed can include optimizing the return path and/or the corresponding return speed to minimize or reduce time spent by the robotic system moving the end-effector from the height/location specified by the destination depart path to the start location.
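As a non-limiting illustration, the return speed can be selected to reduce the time spent moving the end-effector back to the start location, for example by commanding the highest permitted return speed over the return path. The speed limit and names below are assumptions for illustration only.

```python
# Illustrative sketch only; the speed limit and names are assumptions.
def return_move(path_length_m: float, max_return_speed: float = 2.0):
    # For a fixed path length, minimizing time means commanding the largest
    # speed the robot is permitted to use for the return move.
    speed = max_return_speed
    duration_s = path_length_m / speed
    return speed, duration_s

# Example: a 1.5 m return path at a 2.0 m/s limit takes roughly 0.75 s,
# ignoring acceleration and deceleration.
speed, duration = return_move(1.5)
```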
Alternatively, calculating or updating the return path and/or a corresponding return speed can include determining a path by which to return the end-effector of the robotic system to a start location after placing the target object at the destination location. Calculating or updating the return path can include determining a path starting from a position of the end-effector at the time the end-effector disengages (e.g., drops) the target object at the destination location and ending at the start location (e.g., at or proximate the source location). For example, calculating or updating the return path can include calculating or updating a hybrid ‘shortcut’ return path representing a combination of a destination depart path and a return path. In such embodiments, the subblock 1092 can be omitted. As another example, calculating or updating the return path can include calculating or updating a return path such that the end-effector is (e.g., immediately) moved (e.g., horizontally) toward the start location after placing the target object at the destination location. In these and other embodiments, calculating or updating the return path can include calculating or updating a return path directly from a location at which the end-effector disengages the target object to the start location. Additionally, or alternatively, calculating or updating the return path and/or a corresponding return speed can include optimizing the return path and/or the return speed to minimize or reduce time spent by the robotic system moving the end-effector from the location of the end-effector at the time the end-effector disengages the target object to the start location.
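By way of non-limiting illustration, a hybrid ‘shortcut’ return path that merges the destination depart and return segments can be sketched as a path that starts at the pose at which the object is released and translates toward the start location while rising toward a clearance height. The names, the simple interpolation, and the example values are assumptions for illustration only.

```python
# Illustrative sketch only; names, interpolation, and values are assumptions.
def shortcut_return_path(release_pose, start_pose, clearance_z, steps=10):
    """Return (x, y, z) waypoints from the release pose to the start location."""
    waypoints = [release_pose]  # begin where the object was released
    z_target = max(start_pose[2], clearance_z)
    for i in range(1, steps + 1):
        t = i / steps
        x = release_pose[0] + t * (start_pose[0] - release_pose[0])
        y = release_pose[1] + t * (start_pose[1] - release_pose[1])
        # Blend the vertical rise into the horizontal travel (reaching the
        # clearance height by ~30% of the move) instead of first executing a
        # separate vertical destination depart segment.
        rise = min(1.0, t / 0.3)
        z = release_pose[2] + rise * (z_target - release_pose[2])
        waypoints.append((x, y, z))
    return waypoints

# Example: return from the release pose (1.2, 0.5, 1.07) to a start location
# at (0.2, 0.0, 1.20) while rising to at least 1.15 m during the move.
path = shortcut_return_path((1.2, 0.5, 1.07), (0.2, 0.0, 1.20), 1.15)
```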
As discussed above, the start location can be (i) a default location and/or (ii) a location at which to position the end-effector to implement (or as part of implementing) all or a subset of a next motion plan, such as for transferring a next target object between a source location and a destination location. In the event the start location is (e.g., at the time subblock 1088 is executed) a default location, calculating or updating the return path can include determining or updating a path by which to return the end-effector to the default location. Alternatively, calculating or updating the return path can include (i) updating the start location from the default location to another location different from the default location (e.g., a location that facilitates implementing all or a subset of the next motion plan), and/or (ii) determining or updating a path along which to move the end-effector to position the end-effector at the other location. In the event that the start location is (e.g., at the time subblock 1088 is executed) a location at which to position the end-effector to implement (or as part of implementing) the next motion plan, calculating or updating the return path can include determining or updating a path along which to move the end-effector to position the end-effector at the start location (e.g., such that the return path links into one or more paths derived for the next motion plan).
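As a non-limiting illustration, selecting between a default start location and a start location that feeds directly into the next motion plan can be sketched as follows. The names (default_start, next_plan) and the dictionary representation are assumptions for illustration only.

```python
# Illustrative sketch only; names and the plan representation are assumptions.
def select_start_location(default_start, next_plan=None):
    if next_plan is not None and next_plan.get("source_approach"):
        # Link the return path into the next plan by ending at the first
        # waypoint of the next plan's source approach path.
        return next_plan["source_approach"][0]
    # Otherwise, fall back to the default start location.
    return default_start

# Example: with a queued next plan, the end-effector returns directly to that
# plan's first source-approach waypoint instead of the default pose.
start = select_start_location((0.2, 0.0, 1.20),
                              {"source_approach": [(1.0, 0.3, 1.20)]})
```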
At block 1076, the method 1070 continues by implementing the second portion of the motion plan for transferring the target object to the destination location. Implementing the second portion of the motion plan can include moving the target object toward the destination location according to the destination approach path and/or the destination approach speed calculated and/or updated at subblock 1091. Implementing the second portion of the motion plan can include lowering (e.g., a portion, such as a bottom surface of) the target object to a release height. Implementing the second portion of the motion plan can include placing the target object at the destination location, such as by disengaging (e.g., dropping, releasing) the target object at the release height. Implementing the second portion of the motion plan can include raising the end-effector to the height and/or location specified by the destination depart path and/or in accordance with the destination depart speed. Implementing the second portion of the motion plan can include moving the end-effector to the start location from the height and/or location specified by the destination depart path and/or in accordance with the return path and/or return speed. Alternatively, implementing the second portion of the motion plan can include moving the end-effector to the start location in accordance with the hybrid ‘shortcut’ return path and/or an associated return speed. For example, implementing the second portion of the motion plan can include moving the end-effector to the start location along the hybrid ‘shortcut’ return path and from the location of the end-effector at the time the end-effector disengages the target object. In implementations in which the start location is initially (e.g., at the time subblock 1088 is executed) a first location or a default location and then is updated to a different location (e.g., at the time subblock 1093 is executed), implementing the second portion of the motion plan can include moving the end-effector to the different location, rather than the first/default location, along the return path or the hybrid return path. In these and other embodiments, implementing the second portion of the motion plan can include moving the end-effector to the start location to facilitate implementing, or as part of implementing, a next motion plan for a next target object.
Although the steps of the method 1070 are discussed and illustrated in a particular order, the method 1070 is not limited to that order. In other embodiments, the steps of the method 1070 can be performed in a different order, and/or one or more of the steps can be omitted, combined, or repeated.
The above detailed descriptions of embodiments of the technology are not intended to be exhaustive or to limit the technology to the precise form disclosed above. Although specific embodiments of, and examples for, the technology are described above for illustrative purposes, various equivalent modifications are possible within the scope of the technology as those skilled in the relevant art will recognize. For example, although steps are presented in a given order above, alternative embodiments may perform steps in a different order. Furthermore, the various embodiments described herein may also be combined to provide further embodiments.
From the foregoing, it will be appreciated that specific embodiments of the technology have been described herein for purposes of illustration, but well-known structures and functions have not been shown or described in detail to avoid unnecessarily obscuring the description of the embodiments of the technology. To the extent any material incorporated herein by reference conflicts with the present disclosure, the present disclosure controls. Where context permits, singular or plural terms may also include the plural or singular term, respectively. In addition, unless the word “or” is expressly limited to mean only a single item exclusive from the other items in reference to a list of two or more items, then the use of “or” in such a list is to be interpreted as including (a) any single item in the list, (b) all of the items in the list, or (c) any combination of the items in the list. Furthermore, as used herein, the phrase “and/or” as in “A and/or B” refers to A alone, B alone, and both A and B. Additionally, the terms “comprising,” “including,” “having,” and “with” are used throughout to mean including at least the recited feature(s) such that any greater number of the same features and/or additional types of other features are not precluded. Moreover, as used herein, the phrases “based on,” “depends on,” “as a result of,” and “in response to” shall not be construed as a reference to a closed set of conditions. For example, an exemplary step that is described as “based on condition A” may be based on both condition A and condition B without departing from the scope of the present disclosure. In other words, as used herein, the phrase “based on” shall be construed in the same manner as the phrase “based at least in part on” or the phrase “based at least partially on.” Also, the terms “connect” and “couple” are used interchangeably herein and refer to both direct and indirect connections or couplings. For example, where the context permits, element A “connected” or “coupled” to element B can refer (i) to A directly “connected” or directly “coupled” to B and/or (ii) to A indirectly “connected” or indirectly “coupled” to B.
From the foregoing, it will also be appreciated that various modifications may be made without deviating from the disclosure or the technology. For example, one of ordinary skill in the art will understand that various components of the technology can be further divided into subcomponents, or that various components and functions of the technology may be combined and integrated. In addition, certain aspects of the technology described in the context of particular embodiments may also be combined or eliminated in other embodiments. Furthermore, although advantages associated with certain embodiments of the technology have been described in the context of those embodiments, other embodiments may also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages to fall within the scope of the technology. Accordingly, the disclosure and associated technology can encompass other embodiments not expressly shown or described herein.
The present application claims the benefit of U.S. Provisional Patent Application Ser. No. 63/418,637, filed Oct. 24, 2022, which is incorporated herein by reference in its entirety.