This disclosure relates to a system and end effectors for use with automated loading devices. In one application, the system and end effectors are useful for loading travel passenger checked baggage into containers for loading onto passenger aircraft.
In today's global and fast-moving economies, passenger mass transit, and in particular air travel, continues to rapidly increase. With the increase in passenger airline travel, there is increased pressure on airlines and airports to move the passengers and their checked luggage through the airports as quickly and efficiently as possible.
Following check-in and security screening in a main terminal, each checked bag is routed, and further sorted and gathered, typically by flight number. The sorted bags are then loaded into movable containers, for example containers commonly called unit load devices (ULDs), for transfer onto the airplane. Where it is not possible or efficient to use ULDs, other containers such as baskets and/or trays are used to transfer the flight-sorted bags for loading into the designated airplane.
Even in the most sophisticated and automated baggage handling systems, at several places between baggage check-in and loading/unloading of the airplane, bags must be manually handled by operators for various reasons and purposes. Due in large part to the baggage size, weight and variations thereof, the level of human physical effort and complex ergonomic movements to complete these physical bag loading/unloading tasks are high.
One area typically requiring manual bag handling (or human intervention) is the loading of the flight-sorted bags into the containers (for example ULDs described above). This is due to many reasons, including the almost unlimited differences in the sizes, shapes, rigidity, volumes, and weights of passenger bags. For example, the high variation in the physical characteristics of passenger bags has made it very difficult to automate, for example using programmable robots, the physical transfer of high volumes of the flight-sorted bags into a container. Further difficulties in automating loading of the containers exist because each container has a definite size and interior volume, and the volume available for the next bag decreases, and changes in three-dimensional shape, as each bag is deposited into the container.
There is a need for devices and methods that would solve or improve on the difficulties and disadvantages in the area of loading objects into movable containers, for example checked airline passenger bags into ULDs, for further processing of the objects and/or bags.
Disclosed herein is an engaging end effector, a conveyor end effector, and a system for engaging and loading objects into a container using the engaging and conveyor end effectors.
In one example, an engaging end effector is used in an automated loading device for selectively engaging objects. The exemplary engaging end effector includes a mounting plate connected to an automated loading device, for example a multi-axis programmable robot. The engaging end effector further includes an actuator connected to a coupler and a base operable to selectively rotate the base relative to the mounting plate about an axis of rotation. The base is operable to selectively engage an object, for example a travel passenger bag, positioned within a path of travel of the base.
In one example, the engaging end effector base includes a vacuum pad operable to selectively generate a vacuum force to selectively engage and disengage an object positioned along the base path of travel adjacent to the vacuum pad. In another example, the engaging end effector includes an extension arm positioned between the mounting plate and the automated loading device to extend the reach or path of travel of the base. The exemplary extension arm includes an axis of rotation relative to the automated loading device which is independent of the actuator axis of rotation further increasing the path of travel of the base.
In one example, a conveyor end effector is used in an automated loading device for selectively loading objects into a container, for example an airport checked passenger bag. The exemplary conveyor end effector includes a mounting plate connected to an automated loading device, for example a multi-axis programmable robot. The conveying end effector further includes an actuator connected to a conveyor operable to selectively rotate the conveyor relative to the mounting plate.
In one example of the conveyor end effector, the conveyor end effector includes a base connected to the actuator and having a pair of opposing arms extending outward from the base. A first roller and a second roller are rotatably connected to the pair of arms. One of the first and second rollers comprises a power roller for selectively rotating a belt operable to transfer an object positioned on the belt to selectively position and deposit the object into the container within the conveyor path of travel.
In another example, the conveying end effector includes an extension arm positioned between the mounting plate and the automated loading device to extend the reach or path of travel of the base. The exemplary extension arm includes an axis of rotation relative to the automated loading device which is independent of the actuator axis of rotation further increasing the path of travel of the base.
In one example of a system for engaging and selectively positioning and depositing objects in a container, the system uses a first automated loading device including an engaging end effector having a path of travel and a second automated loading device including a conveyor end effector having a path of travel. In one example, the object is a travel passenger bag.
The exemplary system engaging end effector includes a mounting plate connected to an automated loading device, for example a multi-axis programmable robot. The engaging end effector further includes a base operable to selectively engage the object, for example a travel passenger bag, positioned within the engaging end effector path of travel. The system conveyor end effector further includes a mounting plate connected to a conveyor having a base and a powered belt operable to position and deposit an object on the conveyor into a container within the conveyor end effector path of travel. In one example, the engaging end effector path of travel is in overlapping communication with the conveyor end effector path of travel.
In one example of operation of the system, the engaging end effector autonomously and selectively engages an object and, in coordination with the conveyor end effector, disengages and deposits the object onto the conveyor end effector conveyor. The conveyor end effector autonomously positions the conveyor and transfers the object relative to the base to selectively position and deposit the object into an available space in the container.
In one example of the system engaging end effector and the conveyor end effector, each of the engaging and conveyor end effectors includes an actuator connected to the respective mounting plate and the base. Each actuator is operable to selectively rotate the respective base about an axis of rotation relative to the mounting plate to increase the path of travel of each of the engaging and conveyor end effectors. In another example of the system, each of the engaging end effector and the conveyor end effector includes an extension arm connected to the respective mounting plate and respective first or second automated device. Each extension arm includes an axis of rotation relative to the respective first or second automated device to increase the path of travel of the respective engaging end effector and the conveyor end effector.
In one example, systems and methods are disclosed for sequential loading of objects into a container. The systems and methods can include a first robot or automated loading device with an engaging end effector and a second robot or automated loading device with a conveyor end effector. The engaging end effector is operable to apply a vacuum force to engage an object and the conveyor end effector is operable to receive the object from the engaging end effector and convey the object into the container. A control system including a processor controls the first robot and the second robot to coordinate transfer of the object into the container. A memory stores an image of the object, object data for the object, and an identifier associated with the object.
In one example, the processor of the control system is programmed to extract data from the image corresponding to features of the object and store the extracted data.
In one example, the processor of the control system is programmed to execute a machine learning model to output a prediction as to whether the engaging end effector is capable of engaging and transferring the object.
In one example, the processor of the control system is programmed to operate in a training and test mode in which the processor controls the engaging end effector to attempt to engage and transfer the object independent of the prediction, determine an outcome of whether the engaging end effector was successful in engaging and transferring the object, and train the machine learning model based on the outcome.
In one example, the processor of the control system is programmed to operate in a production mode in which the processor controls the engaging end effector to attempt to engage and transfer the object in response to the prediction indicating that the engaging end effector can successfully engage and transfer the object.
In one example, the processor of the control system is programmed to determine an available interior volume of the container based on one or more images of the container; determine whether the object is capable of fitting in the available interior volume; and determine a location at which the conveyor end effector is to deposit the object in the container in response to determining that the object is capable of fitting in the available interior volume. The processor of the control system can be programmed to determine whether the object is capable of fitting in the available interior volume and a location at which the object is to be placed using a heuristic function and/or generate a point cloud of the interior volume of the container using the one or more images and determine the available interior volume based on the point cloud.
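For illustration only, the point-cloud-based volume determination described above can be sketched as follows; the voxel heuristic, function names, dimensions, and margin value are hypothetical and not part of the disclosure:

```python
import numpy as np

def free_volume(points, container_dims, cell=0.05):
    """Estimate the unoccupied container volume from an interior point cloud.

    points: (N, 3) array of sensed surface points inside the container.
    container_dims: (L, W, H) interior dimensions in meters (hypothetical).
    """
    L, W, H = container_dims
    grid = np.zeros((int(round(L / cell)), int(round(W / cell))))
    # For each floor cell, record the highest sensed point (the load surface).
    ix = np.clip((points[:, 0] / cell).astype(int), 0, grid.shape[0] - 1)
    iy = np.clip((points[:, 1] / cell).astype(int), 0, grid.shape[1] - 1)
    np.maximum.at(grid, (ix, iy), points[:, 2])
    # Free volume is the space remaining above the load surface in every cell.
    return float(np.sum(H - grid) * cell * cell)

def fits(bag_dims, points, container_dims, margin=1.2):
    """Heuristic: the bag fits if its volume, with a packing margin, is free."""
    return float(np.prod(bag_dims)) * margin <= free_volume(points, container_dims)
```

An empty 2 m by 2 m by 2 m container reports 8 m³ free, so a 1 m³ bag fits while a bag the size of the whole container does not; a production system would additionally check the bag's footprint against a contiguous free region.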
Any combination and/or permutation of the embodiments is envisioned. Other objects and features will become apparent from the following detailed description considered in conjunction with the accompanying drawings. However, it is to be understood that the drawings are designed as an illustration only and not as a definition of the limits of the present disclosure.
To assist those of skill in the art in making and using embodiments of the present disclosure, reference is made to the accompanying figures, wherein elements are not to scale so as to more clearly show the details, and wherein like reference numbers indicate like elements throughout the several views.
Referring to
In one example embodiment of system 10, the first automated load device 22 and second automated load device 24 are positioned in an automated loading cell 20. In one example, automated loading cell 20 is a portion of a make-up module in a large municipal or mass transit passenger airport. The make-up module is where, for example, passenger checked bags have already been pre-sorted by a predetermined metric, for example flight number, and are loaded into containers 18, for example unit load devices (ULDs). In one example, the make-up module may include a manual loading station where the containers are partially filled in the automated load station and then the remaining spaces are manually filled by bag handlers. The filled containers 18 are then transferred out of the make-up module to an aircraft stand where the filled containers 18, or individual bags 14, are loaded into an aircraft hold for flight.
In the example automated load cell 20, the first automated load device 22 and second automated load device 24 are positioned adjacent to, and in communication with, a path of travel 26 whereby, in one example, the bags 14 sequentially (one after another) in a single file line, travel into and through automated loading cell 20. In one example, a plurality of autonomously propelled and navigated devices, for example automated guided vehicles (AGVs) 28, each carry a bag 14 into the automated load cell 20. A central and/or one or more local control systems (individually and collectively referred to as a central control system 30) in electronic communication, through hardwire or wireless communication protocols, with the AGVs 28 and the first automated load devices 22 and second automated load devices 24 control the operation, movement and coordination as known by those skilled in the art.
In one example, the first automated device 22 includes a robot 34 (for example see
Referring to
Referring to
In one example of automated loading cell 20, the first automated device 22 and robot 34 can be in electronic and/or data communication with the central control system 30 to provide signals and/or instructions to operate and control the robot 34 and respective attached end effector 40 or 56 to guide the end effector 40 or 56 to positions along a longitudinal (x coordinate) direction 44, a lateral (y-coordinate) direction 46, and vertical (z coordinate) direction 48. Control system 30 may further receive feedback signals from sensors (e.g., accelerometers, gyroscopes, optical sensors, acoustic sensors, encoders, cameras, pressure and piezoelectric sensors) on the robot 34 and/or attached respective end effector 40 or 56.
Referring to
Exemplary extension arm 84 includes a first end 83 connected to the robot wrist 38. In one example of engaging end effector 40, extension arm 84 is selectively rotatable relative to the robot wrist 38 about an axis of rotation 86 by a robot actuator thereby moving the engaging end effector 40 about an arcuate path of travel 88 (
Exemplary engaging end effector 40 further includes an actuator 90 connected to the mounting plate 80. In one example, actuator 90 is an electric motor in communication with an electrical or other power source from the first automated load device 22, for example robot 34. In one example, actuator 90 is selectively activated or energized by communication signals or other instructions received from central control system 30 to selectively rotate engaging end effector 40 along arcuate path of travel 108 (
Exemplary engaging end effector 40 further includes a coupler 94 connected to the actuator 90 operable to selectively rotate the coupler 94 about an axis of rotation 100 relative to the mounting plate 80. In one example, a base 106 is rigidly connected to the coupler 94. On selected activation of the actuator 90, the coupler 94 and base 106 rotate relative to the mounting plate 80 about axis of rotation 100 thereby moving the base 106 about an arcuate path of travel 108 (
Referring to the
Vacuum air tubes secured to extension arm 84, in communication with the vacuum source and the vacuum air holes 112, are used. In one example, with reference to
In one example, the vacuum source selectively generates a vacuum force 120 through the plurality of vacuum air holes 112 perpendicular to the engaging surface 114 sufficient to engage, secure and support a bag 14 to the engagement pad 110 against the force of gravity when the engagement pad 110 is placed against bag 14, or in close proximity thereto. In one example, a manifold or plenum having two or more, or a plurality of, air channels in communication with certain vacuum holes 112 can be used to more evenly distribute the vacuum force 120 or the flow of air between an air tube 116 and certain vacuum air holes 112. Different devices and methods for creating a vacuum force 120 and directing the flow of air from the engaging surface 114 through the air tubes 116 can be used. Referring to
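The holding condition described above reduces to a simple check: the vacuum force (the pressure differential acting over the effective sealed pad area) must exceed the weight of the bag 14, typically with a safety margin. A minimal illustrative sketch, with hypothetical names and values:

```python
# Illustrative holding check: vacuum force F = dP * A must exceed the bag
# weight m * g by a safety factor. All values are hypothetical examples.
G = 9.81  # gravitational acceleration, m/s^2

def can_hold(bag_mass_kg, pressure_diff_pa, sealed_area_m2, safety=2.0):
    """True if the vacuum force exceeds the bag weight with the given margin."""
    vacuum_force = pressure_diff_pa * sealed_area_m2  # newtons
    return vacuum_force >= bag_mass_kg * G * safety

# A 30 kPa differential over a 0.02 m^2 sealed area yields 600 N, enough
# to hold a 23 kg bag (about 451 N required at a safety factor of 2).
```

In practice the sealed area depends on how well the compressible pad conforms to the bag surface, which is why the pins and concave compression described above matter for soft-sided bags.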
An example of an embedded pin 124 is shown in
In the example shown in
In one example of use of vacuum engaging pad 110, for example engaging a soft-sided bag 14, on contact of a bag 14 with engaging surface 114 and generation of a vacuum force, pad 110 will axially compress toward the base 106. Provided there is enough contact between the bag 14 and engaging surface 114, the vacuum force can force axial compression of the pad 110 until at least one or more of each set of first 124A, second 124B and third 124C pins contacts the bag 14. Due to the exemplary different length pins 124 and exemplary positions as described, this forms a concave shape of the compressed pad 110 thereby generating more contact surface (and friction) by engaging surface 114 with the bag 14 to more securely engage the bag 14 to the pad 110.
In one example of operation, the above described pad 110 and pins 124 further provide stability in the engagement of pad 110 with an engaged bag 14 during movement. For example, when engaging end effector 40 and engaged bag 14 are moved by first automated device 22, 34, for example along paths of travel 88 or 108, the pin 124 ends 128 provide contact and friction resistance to relative movement between the bag 14 and pad 110 in the lateral or shear direction. Other devices and methods than the described pins 124 and pad 110 may be used to engage and secure a bag to the engaging end effector 40 to suit the particular application.
Exemplary engaging end effector 40 further includes one or more sensors (e.g., optical sensors, acoustic sensors, cameras) connected to the engagement pad 110, and/or the base 106, operable to detect the presence of an AGV 28 and/or object, for example bag 14, positioned within one or both of the arcuate paths of travel 88 and/or 108 (collectively referred to as the range of travel). The one or more sensors can be in communication with the control system 30.
Referring to
As seen in the
As best seen in
Each exemplary first vacuum zone 140 and second vacuum zone 142 further includes an inner pad ring 164 connected to the base 106 as generally shown and described for outer pad ring 150. In one example, each of the outer pad ring 150 and inner pad ring 164 can be made from the same foam material and axially compress toward base 106, as described above for engaging pad 110. It is understood that outer pad ring 150 and inner pad ring 164 can be of other configurations, shapes, sizes and materials than that of pad 110 and/or to suit the particular application and performance specifications.
In the example pad 110B, each of the first vacuum zone 140 and second vacuum zone 142 can be independently operated from one another through one or more sensors included in the pad 110B and/or engaging end effector 40. For example as shown in
It is understood that the size, shape, configuration, and components of pad 110B may vary to suit the particular application. It is also understood that the vacuum zones 140, 142, including but not limited to the number of zones and the shape, configuration and orientation of the zones, may vary to suit the particular application and performance specification. As an example, with reference to
It is understood that the exemplary engaging vacuum end effector 40 may take different forms, include different components, and operate differently to suit the particular application and performance specifications. For example, engagement pad 110 or 110A may be circular, square, rectangular, polygonal, H-shaped, U-shaped, concave, convex, or other shapes and configurations to suit the application. It is further understood that engaging end effector 40 can take different forms other than a vacuum device to engage or grasp an object, for example bag 14, to suit the particular application and performance specifications. In one alternate example, engagement pad 110 or 110A may take the form of one or more pneumatic suction cups (e.g., as shown in
Referring to
In
In the example conveyor end effector 56, the second automated load device 24 can be generally similar to the first automated load device 22 and can include central control system 30, robot 34, robot arms 36, and wrist 38. Modifications to second automated device 24 to suit the conveyor end effector 56 can be used.
Referring to
Exemplary extension arm 184 includes a first end 183 connected to the robot wrist 38 (
Exemplary conveyor end effector 56 further includes an actuator 190 connected to the mounting plate. In one example, actuator 190 is an electric motor in communication with an electrical or other power source from the second automated load device 24, for example robot 34, as described above for actuator 90. Actuator 190 is in communication with, and is activated or energized by central control system 30 as described above for actuator 90 to selectively move conveyor end effector 56 along arcuate path of travel 208 (
Exemplary conveyor end effector 56 further includes a conveyor portion 192 including a coupler 194 connected to the actuator 190 operable to selectively rotate the coupler 194 about an axis of rotation 200 relative to the mounting plate 180. Conveyor portion 192 further includes a base 206 connected to the coupler 194. On selected activation of the actuator 190, the coupler 194 and base 206 rotate relative to the mounting plate 180 about axis of rotation 200 thereby moving the base 206 about an arcuate path of travel 208 (
Still referring to
As shown in
In one example, one of the first roller 216 or second roller 226 can be a powered or driven roller used to forcibly move a belt 240 engaged with the first 216 and second 226 rollers and circumferentially positioned therearound. In one example, the powered one of the first 216 or second 226 rollers is a drum-type roller including an electric motor and drive device inside the roller. The powered roller is in electrical communication with a power source provided by the second automated device 24, 34 and is selectively activated by central control system 30. In one example, both of the first roller 216 and second roller 226 can be powered rollers activated/deactivated in a synchronous manner.
In one example, the powered roller(s) include position sensors, for example encoders, which are in communication with the control system 30. In another example, the powered roller(s) include additional devices, for example, a gearbox and an electromechanical brake to quickly slow and stop the roller from rotating, providing further control and flexibility in the use of the conveyor end effector 56. Other forms, sizes, positions and configurations of powered rollers, and rollers 216, 226, may be used to suit the particular application and performance specifications. In the above-described example, as the same control source is used for the conveyor end effector 56 and second automated device, for example robot 34, coordinated movements can be made between the robot 34 and conveyor end effector 56.
In the above example, belt 240 is a continuous or endless form of belt similar in materials and construction as industrial conveyor belts. Exemplary belt 240 provides a friction surface to frictionally engage objects, for example bags 14, positioned thereon to thereby move the object relative to the base 206 and second automated device 24 as further described below. Other forms, configurations, shapes, sizes and materials for belt 240 can be used to suit the particular application and performance specifications. It is understood that conveyor end effector 56 can be of different sizes, shapes, configurations, components, materials, and functions than that described and illustrated to suit the particular application.
Referring to
Exemplary conveyor end effector 56 further includes one or more sensors 244 (shown schematically in
In an alternate example of conveyor end effector 56, conveyor 192 can include two (2) parallel, side-by-side belts. In the above example, belt 240 is shown and described as a single belt. In the alternate example, the two parallel belts can include separate and independent rollers, one of which would be a powered roller as described above, and can be activated and rotated independently of one another by control system 30. This two belt example can provide additional capability to reorient an object, for example bag 14, relative to the belt. For example, one belt can be rotated away from the base, and the other belt can be rotated in an opposite direction (toward the base) which can have the effect of rotating the bag 14 relative to the conveyor. Other forms and configurations of conveyor 192 to suit the particular application can be used.
In an alternate example of conveyor end effector 56, the conveyor 192, and belt 240, are transversely mounted relative to the mounting plate 180. Using
Referring to
As shown in
In the example, using a partially filled container 18 shown in
As shown in the
In one example of autonomous operation of conveyor end effector 56, bags 14 (or other objects) can be autonomously and sequentially positioned to fill the first rows 260A and B, and first columns 266A, B and C, prior to beginning to position bags 14 in a second row 270B. It is understood that alternate methods and sequences to fill the container 18 interior spaces/volumes in rows 260, columns 266, and vertical rows 270 can be used to suit the particular application. In one example, central control system 30 can have preprogrammed sequences, for example by row 260, column 266 and row 270, for the positioning of second automated device 24, 34 and conveyor end effector 56 for the deposit of bags 14 in the manner described. It is further understood that some human operator assistance and/or intervention can be employed to, for example, select the available spaces within container 18, and/or direct the movement of the conveyor end effector 56 to the available container spaces, and/or in other ways.
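For illustration, one such preprogrammed fill sequence (filling the rows and columns of the bottom vertical row before beginning the next) can be sketched as a simple nested iteration; the slot indexing is hypothetical:

```python
from itertools import product

def fill_sequence(n_vertical_rows, n_rows, n_columns):
    """Yield (vertical_row, row, column) deposit slots, bottom layer first."""
    # product iterates the rightmost index fastest, so every row and column
    # of a vertical row (layer) is visited before the next layer begins.
    yield from product(range(n_vertical_rows), range(n_rows), range(n_columns))

order = list(fill_sequence(2, 2, 3))
# The six slots of the bottom vertical row precede any slot of the row above.
```

A control system would map each slot index to a physical deposit pose for the conveyor end effector 56, and could reorder or skip slots based on sensed bag sizes and remaining space.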
In the example described and illustrated, second automated device 24 and conveyor end effector 56 can continue to receive and deposit bags 14 within empty spaces in container 18 until it is determined that container 18 is full and/or there are no more suitable spaces/volumes to deposit any more bags 14 by conveyor end effector 56. In response to determining or verifying, for example through the imaging sensor(s) (e.g., one or more of sensors 244) on the conveyor end effector 56 described herein, that container 18 is full, the control system 30 can send signals directing second automated load device 24 to return conveyor end effector 56 to a start position, for example within the reach range 88 and/or 108 of the first automated load device 22 to receive another object, for example bag 14, for loading into an empty container positioned within the reach range 188A of the second automated load device 24. Other devices and methods for implementation and use of inventive conveyor end effector 56 can be used. Although inventive conveyor end effector 56 is described in one example of use in system 10, and in coordination with engaging end effector 40, it is understood that inventive conveyor end effector 56 can be used independently of system 10 and engaging end effector 40, and with other automated devices, and in other applications than as shown and described herein.
Referring to
In the
Exemplary processor 300 can be any type of device that is able to process, calculate or manipulate information, including but not limited to digital information, that is currently known or may be developed in the future. As one example, the processor can be a central processing unit (CPU). As another example, the processor is a graphics processing unit (GPU). It is contemplated that multiple processors 300 and servers can be utilized to support, for example, automated loading cell 20. These processors can be on site at the airport, for example for security concerns, and/or in the “cloud” (cloud computing through remote servers and systems).
The exemplary data memory storage device 302 may include devices which store information, including but not limited to digital information, for immediate or future use by the processor 300. Examples of memory storage devices include either or both of random access memory (RAM) or read only memory (ROM) devices. The memory storage device may store information, such as program instructions that can be executed by the processor 300 and data that is stored by and recalled or retrieved by the processor 300. Additionally, an operating system and other applications can be stored in the data memory storage device 302. Non-limiting examples of memory storage device 302 include a hard disk drive or a solid state drive. Alternately, portions of the stored information may be stored in the cloud (remote storage devices or data centers) and selectively retrieved through hardwire and/or wireless protocols.
In one example of system 10, control system 30 includes a suitable software operating system and preprogrammed software to execute predetermined actions, functions or operations of the system 10 described herein. The operating system and various software may be stored in the data memory storage device 302, and processed and executed by the processor 300 through controller 304 and actuators 308.
In many of the above-described examples, system 10, or components thereof, for example automated devices 22, 24, engaging end effector 40 and conveyor end effector 56, sensors 310, AGV 28 and other system 10 devices described herein, receive operational instructions and commands through data signals hardwired or wirelessly streamed in real time from the central control system 30. Examples of communication networks that may be in use at an airport or other described applications include, but are not limited to, local area networks (LAN) or wide area networks (WAN). Examples of wireless communication networks, systems and protocols usable with system 10 include wireless access points for communication based on IEEE standard 802.11 (also known as Wi-Fi). Other wireless communication protocols, for example BLUETOOTH, radio frequency controlled, or 4G or 5G LTE communications, including predecessor and successor systems, suitable for the particular application and performance specifications can be used as known by those skilled in the art. Other wired communication systems and components based on IEEE standard 802.3 (also known as Ethernet) may be used in certain applications. Other forms of communication networks, wired and wireless communication protocols, systems and devices can be used.
In the example described above, the autonomous actions, positioning and movements of each of the automated devices 22, 24, and engaging 40 and conveyor 56 end effectors are, in one example, the result of receiving hard wired and/or wireless data signals from the central control system 30. In one example, the data signals from the central control system 30 can be supplemented or aided in part from data gathered by the individual automated devices 22, 24 and/or engaging effector 40 or conveyor effector 56 and communicated to the central control system 30. Data received from the above described sensors 310 can be sent by hard wire or wirelessly to the central control system 30 for analysis or calculations to aid, supplement and/or determine the signals sent from the central control system 30 to the automated devices 22, 24, and/or the engaging end effector 40 or conveyor end effector 56 as described herein. In one example, artificial intelligence and/or machine learning software and/or systems may be used to assist system 10 in engaging, moving and transferring different types of bags 14. Additional, and alternate, hardware and software to support the devices and functions described herein can be used.
The control system 30 can receive images captured from one or more sensors, e.g., imaging devices, as the bags 14 are processed (e.g., at check-in, as the bags 14 are being transported on one or more conveyors) before the bag reaches the engaging end effector, and the images can be stored and linked to identifiers encoded in a bag tag that is affixed to the bags (e.g., in memory 302). Additionally, bag data associated with the bags 14 can be collected and stored with the identifiers and data extracted from the images (e.g., in memory 302). The bag data can include a size, weight, and/or type of bag. The control system 30, via the processor 300, can perform one or more image processing and/or machine vision techniques to process the images and extract features from the images. For example, the control system 30 can use stitching/registration, filtering, thresholding, pixel counting, segmentation, inpainting, edge detection, color analysis, blob discovery and manipulation, neural net processing, pattern recognition, optical character recognition, blurring, normalized lighting, greyscaling, Otsu thresholding, erosion/dilation, convex hull computation, contour detection, blob/mass calculation normalization, and/or gauging/metrology. Using the image processing and/or machine vision techniques, the control system 30, via the processor 300, can extract data including one or more features of the bags, e.g., handles, zippers, contours, materials, dimensions, labels, and the like, and including one or more boundaries and/or transitions between the features.
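As one illustration of the listed techniques, Otsu thresholding (used, for example, to separate a bag region from the background in a greyscale image) can be sketched as follows; this generic implementation is illustrative only and is not the control system's actual software:

```python
import numpy as np

def otsu_threshold(gray):
    """Return the greyscale threshold maximizing between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    best_t, best_var = 0, -1.0
    w_b = sum_b = 0.0
    for t in range(256):
        w_b += hist[t]          # background weight: pixels with value <= t
        if w_b == 0:
            continue
        w_f = total - w_b       # foreground weight: pixels with value > t
        if w_f == 0:
            break
        sum_b += t * hist[t]
        mean_b = sum_b / w_b
        mean_f = (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (mean_b - mean_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Pixels above the returned threshold can then be segmented as the bag region.
```

For a bimodal image (dark background, bright bag) the threshold lands between the two intensity clusters, giving a clean mask for the downstream contour and feature extraction steps.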
In an example application, the processor 300 of the control system 30 can execute one or more machine learning models, trained and validated using the data extracted from images of bags and/or the bag data, to predict whether the engaging end effector 40 can successfully engage and move bags. The machine learning model(s) can be trained by the control system 30 to classify bags based on the predictions of whether the bags can be successfully engaged and transported using the end effector (as predicted pass or predicted fail). For bags which the trained machine learning model predicts that the engaging end effector can successfully engage and transport, the control system 30 can instruct and/or control the engaging end effector to engage and transport the bags. For bags which the trained machine learning model predicts that the engaging end effector cannot successfully engage and transport, the control system 30 can instruct and/or control the engaging end effector to skip the bags. The machine learning models can be continuously updated and/or optimized based on whether the machine learning model correctly predicts that a bag can be successfully engaged and transported by the engaging end effector.
In an example embodiment, the processor 300 of the control system 30 can execute the machine learning model in two modes: a training and test mode and a production mode. In the training and test mode, the control system 30 can extract and collect data from images of bags being processed using one or more image processing and/or machine vision techniques and collect bag data include, for example, a size, weight, and type of bag captured in each image. The control system 30 can link the images of the bags, data extracted from the images, and the respective bag data to identifiers assigned to the bags, which can be encoded in bag tags that are affixed to the bags. Based on data extracted from the images and the bag data, the machine learning algorithm can predict whether the engaging end effector 40 can successfully engage and move bags and can store the prediction with the identifier of the bag. When the bags reach the engaging end effector, the identifiers on the bag tags can be scanned/read to retrieve predictions for the bags. In the training and test mode, the control system 30 can instruct the engaging end effector to attempt to engage and transport the bags regardless of whether the machine learning algorithm predict that the engaging end effector predicted the engaging end effector would be successful or not. Initially, the ability to correctly predict whether or not the engaging end effector can successfully engage and transport a bag can be inaccurate until enough bags have been processed by the machine learning models and the actual outcomes (e.g., whether the engaging end effector was successful or not) is known. As more bags are processed, the accuracy of the machine learning models can improve. 
Once the machine learning model predicts, to a specified accuracy threshold, whether or not bags can be successfully handled by the engaging end effector, the machine learning model is considered sufficiently trained, at which time the control system 30 can transition from the training and test mode to the production mode.
In the production mode, the control system 30 can rely on the predicted outcome output by the machine learning model when determining whether to instruct and/or control the engaging end effector, such that when the trained machine learning model predicts that a bag can be successfully engaged and transported by the engaging end effector, the engaging end effector attempts to engage and transport the bag, and when the trained machine learning model predicts that a bag cannot be successfully engaged and transported by the engaging end effector, the engaging end effector does not attempt to engage and transport the bag. In one embodiment, because the trained machine learning model can predict whether the engaging end effector can successfully engage and transport a bag before the bag reaches the engaging end effector, the control system 30 can use the prediction to alter or adjust the routing of bags. For example, if the machine learning model predicts that the engaging end effector cannot engage and transport a bag, the bag can be routed to a manual load cell, where the bag can be loaded into a container by a human. In the production mode, the machine learning model can continue to be trained by the control system such that when the machine learning model incorrectly predicts that the engaging end effector can successfully engage and transport a bag, but the engaging end effector fails to engage or transport the bag, the incorrect prediction can be fed back into the machine learning model to adjust the outcome prediction. In the event that the machine learning model incorrectly predicts, for more than a specified percentage of bags, that the engaging end effector will be successful in engaging and/or transporting the bags, the control system 30 can transition operation of the machine learning model to the training and test mode to re-train the machine learning model.
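The mode-switching and routing behavior described above can be sketched as a small controller that routes bags by prediction, tracks recent accuracy, and falls back to the training and test mode when accuracy drops; the class name, threshold, and window size are hypothetical.

```python
class ModeController:
    """Hypothetical sketch of the production-mode feedback loop."""
    def __init__(self, retrain_below=0.9, window=100):
        self.retrain_below = retrain_below   # accuracy floor before re-training
        self.window = window                 # number of recent bags to evaluate
        self.outcomes = []                   # (predicted_pass, actual_success)
        self.mode = "production"

    def route(self, predicted_pass):
        """Return the destination for a bag given the model's prediction."""
        if self.mode == "training":
            return "engaging_end_effector"   # always attempt while training
        return "engaging_end_effector" if predicted_pass else "manual_load_cell"

    def record(self, predicted_pass, actual_success):
        """Log an outcome, update rolling accuracy, and switch modes if needed."""
        self.outcomes.append((predicted_pass, actual_success))
        recent = self.outcomes[-self.window:]
        accuracy = sum(p == a for p, a in recent) / len(recent)
        if self.mode == "production" and accuracy < self.retrain_below:
            self.mode = "training"           # too many wrong predictions
        elif (self.mode == "training" and accuracy >= self.retrain_below
              and len(recent) >= self.window):
            self.mode = "production"         # sufficiently accurate again
        return accuracy
```

The rolling-window design choice means a burst of failures on unusual bags triggers re-training without a full restart of the pipeline.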
In one non-limiting example, the machine learning algorithm from which the machine learning model is derived can be a convolutional neural network, although other machine learning algorithms can be employed. For example, the one or more machine learning algorithms utilized by the control system 30 can include, for example, supervised learning algorithms, unsupervised learning algorithms, artificial neural network algorithms, association rule learning algorithms, hierarchical clustering algorithms, cluster analysis algorithms, outlier detection algorithms, semi-supervised learning algorithms, reinforcement learning algorithms and/or deep learning algorithms. Examples of supervised learning algorithms can include, for example, AODE; Artificial neural networks, such as Backpropagation, Autoencoders, Hopfield networks, Boltzmann machines, Restricted Boltzmann machines, and/or Spiking neural networks; Bayesian statistics, such as Bayesian networks and/or Bayesian knowledge bases; Case-based reasoning; Gaussian process regression; Gene expression programming; Group method of data handling (GMDH); Inductive logic programming; Instance-based learning, such as Nearest Neighbor algorithms and/or Analogical modeling; Lazy learning; Learning Automata; Learning Vector Quantization; Logistic Model Trees; Minimum message length (decision trees, decision graphs, etc.); Probably approximately correct (PAC) learning; Ripple down rules, a knowledge acquisition methodology; Symbolic machine learning algorithms; Support vector machines; Random Forests; Ensembles of classifiers, such as Bootstrap aggregating (bagging) and/or Boosting (meta-algorithm); Ordinal classification; Information fuzzy networks (IFN); Conditional Random Fields; ANOVA; Linear classifiers, such as Fisher's linear discriminant, Linear regression, Logistic regression, Multinomial logistic regression, Naive Bayes classifier, Perceptron, and/or Support vector machines; Quadratic classifiers; k-nearest neighbor; Boosting; Decision trees, such as C4.5, Random forests, ID3, CART, SLIQ, and/or SPRINT; Bayesian networks, such as Naive Bayes; and/or Hidden Markov models. Examples of unsupervised learning algorithms can include the Expectation-maximization algorithm; Vector Quantization; Generative topographic maps; and/or the Information bottleneck method. Examples of artificial neural networks can include Self-organizing maps. Examples of association rule learning algorithms can include the Apriori algorithm; the Eclat algorithm; and/or the FP-growth algorithm. Examples of hierarchical clustering can include Single-linkage clustering and/or Conceptual clustering. Examples of cluster analysis can include the K-means algorithm; Fuzzy clustering; DBSCAN; and/or the OPTICS algorithm. Examples of outlier detection can include Local Outlier Factors. Examples of semi-supervised learning algorithms can include Generative models; Low-density separation; Graph-based methods; and/or Co-training. Examples of reinforcement learning algorithms can include Temporal difference learning; Q-learning; Learning Automata; and/or SARSA. Examples of deep learning algorithms can include Deep belief networks; Deep Boltzmann machines; Deep Convolutional neural networks; Deep Recurrent neural networks; and/or Hierarchical temporal memory.
The control system 30 can use images of the interior volume of a container to determine whether more bags can be inserted into the container and/or where to place the bags in the container. For example, one or more of the sensors 310 (for example sensor 244) can be used to capture images of the interior volume of the container, from which a point cloud of the interior volume can be generated. The point cloud can be used to determine whether there is any space available in the container, an available volume of space in the container, and/or where to place the next bag or a sequence of bags. For example, the processor 300 of the control system 30 can execute a heuristic function that receives bag dimensions and available interior volumes of the container as inputs and outputs a location at which a bag should be inserted into the container. In some embodiments, the control system can use the Jaccard similarity or index when determining whether one or more bags will fit within the available area of the container based on, for example, the dimensions of the available interior volume of the container and the dimensions of the one or more bags to be inserted into the available interior area.
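A minimal sketch of the fit check and a Jaccard-style overlap metric follows; reducing the available space and the bag to axis-aligned bounding boxes, and the clearance margin, are simplifying assumptions not specified in the disclosure.

```python
def fits(bag_dims, space_dims, margin=1.0):
    """True if each bag dimension (plus a clearance margin) fits within the
    corresponding dimension of an available interior space."""
    return all(b + margin <= s for b, s in zip(bag_dims, space_dims))

def jaccard_fit(bag_dims, space_dims):
    """Jaccard-style ratio between a bag's bounding volume and an available
    space, with the bag placed at the space's corner (hypothetical metric:
    1.0 means the bag exactly fills the space, values near 0 mean a loose fit)."""
    inter = bag_vol = space_vol = 1.0
    for b, s in zip(bag_dims, space_dims):
        inter *= min(b, s)       # overlap along each axis
        bag_vol *= b
        space_vol *= s
    union = bag_vol + space_vol - inter
    return inter / union
```

A heuristic could prefer placements with a high `jaccard_fit` score, packing bags tightly while `fits` guards against impossible placements.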
In an example embodiment, the processor 300 can execute the heuristic function in two modes: an ad-hoc mode and a batch mode. The ad-hoc mode is used to process one bag at a time without taking into account the parameters of subsequent bags to be processed. In the ad-hoc mode, the interior volume of the container is determined as described herein and the heuristic function determines whether the bag will fit in the container and, if so, the location at which the bag should be inserted into the container. The batch mode is used to process information (bag data) associated with multiple bags at once for determining whether the bags will fit in the container and the locations at which the bags should be placed in the container. In the batch mode, the control system 30 receives information about a specified number of bags (e.g., four bags) and information about the available interior volume of the container as inputs to the heuristic function, and the heuristic function outputs a determination of whether the bags will fit and the locations at which the bags will fit. In an example embodiment, when batch mode is used, after information for a batch of bags is processed by the heuristic function, the interior volume of the container can be imaged after each bag in the batch is placed in the container to generate an updated point cloud, and the heuristic function is re-run for each subsequent bag in the batch based on the updated point cloud. Re-scanning the interior volume, updating the point cloud, and re-executing the heuristic function for each bag allows the control system to accommodate imperfections in the container loading process. For example, when a bag is placed in the container, one or more bags can be compressed or distorted and/or can shift.
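The batch-mode loop above can be sketched as follows; the greedy height-stacking stands in for the point-cloud heuristic, and the mutable free-space list simulates the re-scan after each placement, so all names and the packing rule are illustrative assumptions.

```python
def place_batch(bags, free_space):
    """Greedily place a batch of bags into one free slot of a container.

    bags: list of (width, depth, height) tuples for the batch.
    free_space: mutable [width, depth, height] of the remaining slot;
    shrinking it after each placement simulates the post-placement re-scan.
    Returns (bag_index, remaining_height_at_placement) per bag, with None
    for bags that do not fit and would be routed elsewhere."""
    placements = []
    for i, (w, d, h) in enumerate(bags):
        if w <= free_space[0] and d <= free_space[1] and h <= free_space[2]:
            placements.append((i, free_space[2]))  # record space before placing
            free_space[2] -= h                     # simulated updated point cloud
        else:
            placements.append((i, None))           # e.g., send to manual load cell
    return placements
```

Because the free space is re-evaluated before each bag, a bag that compresses or shifts earlier bags would simply change the measured remaining space, mirroring the re-scan behavior described above.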
The amount of suction or vacuum force the engaging end effector applies to a bag can be controlled by the control system 30. For example, before the engaging end effector is instructed and/or controlled to engage and transport the bag, the bag tag of the bag can be scanned/read to extract the identifier encoded therein, and the control system 30 can retrieve the bag data for the bag (e.g., size, weight, and type of bag). Using the bag data, the processor 300 of the control system 30 can determine the amount of suction or vacuum force to apply to the bag. As an example, as the size and/or weight of the bags to be engaged by the engaging end effector decreases, less suction or vacuum force can be used, e.g., fewer of the vacuum zones can be used or suction is applied through fewer of the pins 124 or 134. As another example, as the size and/or weight of the bags to be engaged by the engaging end effector increases, more suction or vacuum force can be used, e.g., more of the vacuum zones can be used or suction is applied through more of the pins 124 or 134.
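A minimal sketch of mapping retrieved bag data to a number of active vacuum zones follows; the thresholds and the four-zone count are hypothetical values chosen for illustration, not taken from the disclosure.

```python
def vacuum_zones(size_cm, weight_kg, max_zones=4):
    """Scale the number of active suction zones with bag size and weight.
    Larger/heavier bags get more zones; small light bags get one."""
    if weight_kg > 25 or size_cm > 80:
        return max_zones                  # large or heavy: full suction
    if weight_kg > 15 or size_cm > 60:
        return max(1, max_zones - 1)
    if weight_kg > 8 or size_cm > 40:
        return max(1, max_zones - 2)
    return 1                             # small, light bag: minimal suction
```

In practice the same lookup could instead select which of the pins 124 or 134 receive suction, since the disclosure treats zones and pins as alternative granularities.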
Referring to
In one example of method 400, in a first exemplary step 405, the first automated device 22 and engaging end effector 40 are positioned in proximity to an object, for example a bag 14, positioned along a path of travel 30 within the first automated device reach range 108 as described above. In one example, sensors can be used to detect or verify the position of the bag 14 on AGV 28.
In exemplary step 410, the engaging end effector 40 engages bag 14 and removes the bag from AGV 28. As described in the example above, engaging end effector 40 can be a vacuum end effector which is selectively activated by control system 30 to generate a vacuum force 120.
In exemplary step 415, the second automated device 24 with conveyor end effector 56 is moved to position a conveyor portion 192 within the reach range 108 and/or 88 of the first automated device 22.
In exemplary step 420, the first automated device 22, engaging end effector 40 and the engaged bag 14 are moved and aligned to deposit bag 14 onto the conveyor portion 192 of the conveyor end effector 56. In one example, one or more sensors are used to detect and/or verify alignment of the engaging end effector 40 with the conveyor end effector 56. The engaging end effector 40 disengages bag 14 through, for example, cessation of the generated vacuum force, allowing bag 14 to fall by gravitational force onto conveyor portion 192.
In exemplary step 425, the second automated device 24 moves and positions the conveyor end effector and onboard bag 14 in proximity to a container 18 including interior cavity space or volume sufficient to receive onboard bag 14.
In one example, an optional step 430 includes one or more sensors connected to the conveyor end effector 56 scanning and/or imaging the interior cavity of container 18. The data obtained can be used by control system 30 to calculate or determine the available open space(s) or volume(s) within container 18 of sufficient size to receive onboard bag 14.
In exemplary step 435, the second automated device and conveyor end effector 56 are positioned to deposit the onboard bag 14 in the predetermined or calculated available open space in container 18.
In exemplary step 440, the engaging end effector 40 and conveyor end effector 56 can be moved to a start position, for example for the engaging end effector 40 to engage a bag 14, and for the conveyor end effector 56 to receive a bag 14 from the engaging end effector.
It is understood that method 400 can include additional or alternate steps, or the described steps in an alternate order, to suit the particular application and performance specifications as known by those skilled in the art.
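The sequence of method 400 can be sketched as an ordered list of steps; the actions below are logged by name rather than issued as real device commands, and the final return-to-start step is numbered 440 here to distinguish it from the optional scanning step 430.

```python
def method_400(log=None):
    """Hypothetical sketch of one bag-transfer cycle of method 400;
    a real system would command the automated devices at each step."""
    log = [] if log is None else log
    steps = [
        (405, "position engaging end effector near bag on AGV"),
        (410, "engage bag with vacuum force and remove from AGV"),
        (415, "move conveyor end effector into reach range"),
        (420, "align and release bag onto conveyor portion"),
        (425, "move conveyor end effector to container"),
        (430, "scan interior cavity for available open space (optional)"),
        (435, "deposit bag into calculated open space"),
        (440, "return both end effectors to start positions"),
    ]
    for number, action in steps:
        log.append((number, action))   # stand-in for a hardware command
    return log
```

The log form makes the ordering constraint explicit: each step depends on the completion of the one before it, matching the sequential description above.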
Referring to
Referring to
While the invention has been described in connection with certain embodiments, it is to be understood that the invention is not to be limited to the disclosed embodiments but, on the contrary, is intended to cover various modifications and equivalent arrangements included within the scope of the appended claims, which scope is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures as is permitted under the law.
The present application claims priority to and the benefit of U.S. Provisional Application No. 62/988,633, filed on Mar. 12, 2020, the disclosure of which is incorporated by reference herein in its entirety.
Number | Date | Country
---|---|---
62988638 | Mar 2020 | US