The invention generally relates to object processing systems, and relates in particular to shipping systems that accommodate orders from sending entities, and provide distribution and shipping services to destination entities.
Current object processing systems generally involve the processing of a large number of objects, where the objects are received in either organized or disorganized batches, and must be routed to desired destinations in accordance with a manifest or specific addresses on the objects (e.g., in a mailing system).
Current distribution center sorting systems, for example, generally assume an inflexible sequence of operations whereby a disorganized stream of input objects is first singulated into a single stream of isolated objects presented one at a time to a scanner that identifies the object. An induction element (e.g., a conveyor, a tilt tray, or manually movable bins) then transports the objects to the desired destination or to a further processing station, which may be a bin, a chute, a bag, a conveyor, etc.
In typical parcel sortation systems, human workers or automated systems retrieve parcels in an arrival order, and sort each parcel or object into a collection bin based on a set of given heuristics. For instance, all objects of like type might go to a collection bin, or all objects in a single customer order, or all objects destined for the same shipping destination, etc. The human workers or automated systems are required to receive objects and to move each to its assigned collection bin. If the number of different types of input (received) objects is large, a large number of collection bins is required.
Current state-of-the-art sortation systems rely on human labor to some extent. Most solutions rely on a worker who performs sortation by scanning an object from an induction area (chute, table, etc.) and placing the object in a staging location, conveyor, or collection bin. When a bin is full or the controlling software system determines that it needs to be emptied, another worker empties the bin into a bag, box, or other container, and sends that container on to the next processing step. Such a system has limits on throughput (i.e., how fast human workers can sort to or empty bins in this fashion) and on the number of diverts (i.e., for a given bin size, only so many bins may be arranged to be within efficient reach of human workers).
Adding to these challenges is the fact that information about an object may have been entered into the manifest or onto a shipping label incorrectly. For example, if a manifest in a distribution center includes a size or weight for an object that is not correct (e.g., because it was entered manually and incorrectly), or if a sender enters an incorrect size or weight on a shipping label, the processing system may reject the object as unknown. Additionally, with regard to incorrect information on a shipping label, the sender may have been undercharged due to the erroneous information, for example, if the size or weight was entered incorrectly by the sender.
There remains a need for more efficient and more cost-effective object processing systems that process objects of a variety of sizes and weights into appropriate collection bins or boxes, yet are efficient in handling objects of such varying sizes and weights.
In accordance with an embodiment, the invention provides a processing system for processing objects. The processing system includes a plurality of receiving stations for receiving a plurality of objects, each object being associated with prerecorded data, and a plurality of processing stations, each of which is in communication with at least one receiving station. Each processing station includes perception means for perceiving data regarding an identity of any of an object or a bin of objects, capture means for capturing characteristic data regarding an object to provide captured data, and comparison means for comparing the captured data with the prerecorded data to provide comparison data. The processing system further includes a plurality of distribution stations, each of which is in communication with at least one processing station for receiving objects from the at least one processing station responsive to the comparison data.
In accordance with another embodiment, the invention provides a method of processing objects. The method includes the steps of receiving a plurality of objects, each object being associated with prerecorded data; providing a plurality of processing stations, each of which is in communication with at least one receiving station, including perceiving data regarding an identity of any of an object or a bin of objects, and capturing characteristic data regarding an object to provide captured data; comparing the captured data with the prerecorded data to provide comparison data; and providing a plurality of distribution stations, each of which is in communication with at least one processing station for receiving objects from the at least one processing station responsive to the comparison data.
In accordance with a further embodiment, the invention provides an object processing verification system that includes: a data repository for storing information about objects, including identifying information, object weight, object volume, and destination information; a first detection system that detects identifying information associated with the object; a second detection system that detects a volume associated with the object; a third detection system that detects a weight associated with the object; a computer processing system for comparing the detected identifying information, volume, and weight with the volume and weight of the identified object that are stored in the data repository; and an object transportation system that routes an object to an advancement destination if the object's detected volume and weight match the stored volume and weight, and to an examination destination if the detected volume and weight do not closely enough match the stored volume and weight.
The following description may be further understood with reference to the accompanying drawings in which:
The drawings are shown for illustrative purposes only.
In accordance with an embodiment, the invention provides an object processing system that not only tracks objects (e.g., packages, envelopes, boxes, etc.), but also detects data regarding the objects at numerous points during processing, e.g., for pick validation and placement validation. The detected data is checked against a reference set of prerecorded data as provided by a manifest (manually or automatically generated) or a shipping label, etc. The detected data may represent estimated mass, weight, size or volume, and if significant discrepancies are found, the object may be held until the discrepancy is resolved, or the object may be re-routed to be returned to its original sender.
More specifically, the system may determine an object's identity and access the previously recorded data regarding the object. The previously recorded data may be provided by a manifest that provides, for each object, unique identity data, its mass or weight, its size, volume or density, as well as its distribution information, such as a delivery address or a destination location. Identifying indicia representative of the identity data, such as a barcode, QR code or RFID label, is applied to the object. The previously recorded data may also be provided by the sender, for example, if the sender (or shipping company personnel) provides data regarding the object's mass, weight, size, volume, density, etc. The shipping company personnel may then assign unique identity data to the object, and apply identifying indicia, such as a barcode, QR code or RFID label, that is representative of the identity data. The destination information, such as an address or destination location, is then associated with the object's identity data.
During processing, the system will determine an object's identity data, and will then determine the object's mass, weight, size, volume, density, etc. The system will then compare the determined data (e.g., mass, weight, size or volume) with the previously recorded data associated with the object's identity. If a discrepancy (e.g., of more than 2%-5%, or 10%) is found, the object is internally re-routed to a holding station until the discrepancy is resolved. The discrepancy may be resolved by having the shipping network contact the sender via the shipping company to have the sender's billing account either credited or debited the correct amount to accommodate the discrepancy. If the discrepancy is not resolved, the object may be returned to the sender, for example, by assigning the sender's address as the destination address associated with the object's identity data. In this case, the system may override the prerecorded data and assign the sender's address to be the destination address for the object. This provides that the object is returned to the sender, and the return may further include an explanation of the reason, for example, by including a stamp or adhesive label that reports the determined mass, weight, size or volume.
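As a non-limiting illustration of the comparison step described above, the following Python sketch shows one way the captured measurements might be checked against the prerecorded data within a configurable tolerance; the function and field names (e.g., `route_object`, `ObjectRecord`) are hypothetical and are not part of the disclosed system.

```python
from dataclasses import dataclass

@dataclass
class ObjectRecord:
    identity: str
    weight_kg: float        # prerecorded weight
    volume_cm3: float       # prerecorded volume
    destination: str
    sender_address: str

def route_object(record: ObjectRecord,
                 measured_weight_kg: float,
                 measured_volume_cm3: float,
                 tolerance: float = 0.05) -> str:
    """Return a routing decision by comparing captured data to prerecorded data.

    A relative discrepancy above `tolerance` (e.g., 2%-5%, or 10%) sends the
    object to a holding station; otherwise it advances toward its destination.
    """
    weight_err = abs(measured_weight_kg - record.weight_kg) / record.weight_kg
    volume_err = abs(measured_volume_cm3 - record.volume_cm3) / record.volume_cm3

    if weight_err <= tolerance and volume_err <= tolerance:
        return f"advance to {record.destination}"
    # Discrepancy found: hold until resolved (e.g., billing adjusted);
    # if unresolved, the destination may be overridden with the sender's address.
    return "hold for resolution"

# Example: a parcel recorded at 1.0 kg that actually weighs 1.2 kg is held.
parcel = ObjectRecord("OBJ-001", 1.0, 4000.0, "destination bin 42", "sender address")
print(route_object(parcel, measured_weight_kg=1.2, measured_volume_cm3=4010.0))
```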
In accordance with certain embodiments, the system may update the manifest if it is determined that the captured data more accurately reflects characteristics of the object. For example, the system may record known sizes and weights of common objects, and after multiple encounters with an object, the system may determine that the perceived data is more accurate than the original data in the manifest. The system may employ learning, in the sense of improving over time. The performance of picking as a function of object, pick station and handling parameters may not be known a priori. Furthermore, objects that have not been picked before will periodically be encountered. It is likely, however, that new objects that are similar to previously picked objects will have similar performance characteristics. For example, object X may be a kind of shampoo in a 20 ounce bottle, and object Y may be a conditioner in a 20 ounce bottle. If distributed by the same company, the shape of the bottles may be the same. The system includes processes that use observations of past performance on similar objects to predict future performance, and learns which characteristics of the objects available to the system are reliable predictors of future performance. In particular, the learning process (a) extrapolates the performance of newly seen objects, and (b) continually updates the data with which it learns to extrapolate, so as to continually improve performance.
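One plausible sketch of the manifest-update step, assuming repeated weight measurements per object identity, is shown below; the class name `ManifestLearner`, the sample thresholds, and the update rule are illustrative assumptions rather than the disclosed learning process.

```python
from statistics import mean, pstdev

class ManifestLearner:
    """Track repeated measurements per object identity and update the manifest
    entry once the captured data is judged more reliable than the original."""

    def __init__(self, min_samples: int = 5, max_spread: float = 0.02,
                 tolerance: float = 0.05):
        self.min_samples = min_samples   # encounters required before updating
        self.max_spread = max_spread     # relative spread allowed among samples
        self.tolerance = tolerance       # disagreement with manifest that triggers update
        self.observations: dict[str, list[float]] = {}

    def observe(self, identity: str, measured_weight: float,
                manifest_weight: float) -> float:
        """Record a measurement; return the (possibly updated) manifest weight."""
        samples = self.observations.setdefault(identity, [])
        samples.append(measured_weight)
        if len(samples) < self.min_samples:
            return manifest_weight
        m, s = mean(samples), pstdev(samples)
        consistent = (s / m) <= self.max_spread
        disagrees = abs(m - manifest_weight) / manifest_weight > self.tolerance
        # Repeated, mutually consistent measurements that disagree with the
        # manifest are taken to reflect the object more accurately.
        return m if (consistent and disagrees) else manifest_weight
```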
During this processing, data regarding the package is obtained and recorded. If the data is incorrect (e.g., the package weighs much more than was initially recorded or has a greater volume than was initially recorded), the sender is notified (via the shipping company) and a further charge is applied to the sender's billing account 16. In an embodiment, the package is not initially returned, but is only provided to a delivery company 22 when the account 16 is paid in full (or credited if overpaid) and the discrepancy is remedied. The package is then provided by the delivery company 22 to a recipient 24. If the discrepancy is not remedied (e.g., within 24 hours), the object is returned to the sender's address (e.g., by having the sender's address assigned as the shipping address).
The shipping network may include a variety of distribution systems such as the distribution system 30 shown in
With reference to
The programmable motion device 56 (e.g., a robotic articulated arm) of the processing station 36 includes an end effector 70 (e.g., a vacuum cup, grasping gripper, or other retention device) as well as a detection system 72 as shown in
Such robotic pickers are used in many different types of applications in material handling. In one case, a robotic picker may be employed to pick a single object from a collection of the same type of objects, and then transfer the picked object to a container, conveyor or other destination location. In some cases, the robotic picking technology uses cameras and 3D scanners to sense and analyze the pick face before it, automatically choosing the best place to pick an object based on a variety of criteria. Under certain circumstances, the robotic picking system can mistakenly pick two or more objects. This is undesirable behavior, as it impacts the accounting of goods at the receiver and results in a miscount in the tracking of the number of remaining objects in inventory. What is desired are methods to sense whether the robot has picked more than one object, either before the object is placed into the outgoing container or conveyor, so as to prevent the transfer of multiple objects, or after it has been placed, so that inventory counts can be updated. In certain further embodiments, again such as where the robotic picker is picking from a tote of homogeneous objects, the system may, upon detecting a double-pick, route the double-pick to an output destination that is scheduled to receive two such objects.
The processing station 36 also includes a capture system 78 that includes scanning and receiving units 80, 82, as well as edge detection units 84 for capturing a variety of characteristics of a selected object in the bin.
In particular, the contents of the bin are volumetrically scanned as shown in
In accordance with further embodiments, the system may additionally employ edge detection sensors 84 (again together with the processing system 42) to detect edges of any objects in a bin, for example using data regarding any of intensity, shadow detection, or echo detection, and these may be employed, for example, to determine any of the size, shape and/or contours as shown in
If the captured data (e.g., volume, density, size, etc.) is therefore confirmed within a reliable tolerance, the object continues to be processed in accordance with a manifest or a shipping label. If not, however, the object may be directed to a holding location (e.g., a box that is assigned as a holding location), where it may remain until the discrepancy is resolved. For example, in certain embodiments, weight measuring may be provided by certified postal weights, which would have a high reliability value and could be trusted for rerouting decisions for measurements near the tolerance threshold. On the other hand, for similar measurements near a tolerance threshold made with less reliable weight measuring, such as machines that may not be certified postal calibrated, objects would have to be re-routed for manual verification of weight (and appropriate further expense charging).
With reference again to
In accordance with further embodiments, the system may estimate a volume of an object while the object is being held by the end effector. Although with certain types of object processing systems (e.g., package sortation for shipping/mailing) volume may not be as helpful (for example, when handling deformable plastic bags), in other systems such as store replenishment or e-commerce applications, volumetric scanning would be very valuable. In particular, the system may estimate a volume of a picked object (or objects) while it is being held by the end effector, and compare the estimated volume with a known volume. To capture the estimated volume, one or more perception units (e.g., cameras or 3D scanners) are placed around a scanning volume in an embodiment to capture volume data.
With reference to
As shown in
The scanning volume may be the volume above the area where the objects are picked from; or the scanning volume may be strategically placed in between the picking location and the placing location to minimize travel time. Within the scanning volume, the system takes a snapshot of the volume of objects held by the gripper. The volume could be estimated in a variety of ways depending on the sensor type as discussed above.
For example, if the sensors are cameras, then two or more cameras may be placed in a ring around the volume, directed slightly upward towards a backlighting screen (as discussed above) that may be in the shape of sections of a torus, where the gripped volume is held in between all the cameras and the brightly lit white screen. The brightly lit screen backlights the one or more held objects, so that the interior volume appears black. Each perception unit and associated illumination source may be activated in a sequence so that no two illumination sources are on at the same time. This allows easy segmentation of the held volume in the image.
The object may be illuminated with ambient lighting, or the illumination may be provided at a particular wavelength that is not present in the room, or it may be modulated so that detectors can demodulate the received perception data and only illumination from the associated source is registered. The black region, once projected back into space, becomes a frustum, and the objects are known to lie within that solid frustum. Each camera generates a separate frustum, with the property that the volume of the objects is a subset of all of the frustums. The intersection of all the frustums yields an upper bound on the volume of the object(s). The addition of a camera improves the accuracy of the volume estimate. The gripper may be visible within the cameras, and because its position and size are known, its volume can be subtracted out of the frustum or volume estimate.
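The frustum-intersection idea described above can be illustrated with a short voxel-carving sketch; the helper names (`volume_upper_bound`, the per-camera `project` function, the `silhouette` mask) are hypothetical, and the camera calibration and silhouette segmentation are assumed to be supplied by the perception units.

```python
import numpy as np

def volume_upper_bound(voxel_centers: np.ndarray,
                       voxel_size: float,
                       cameras: list) -> float:
    """Estimate an upper bound on the held volume by frustum intersection.

    voxel_centers: (N, 3) array of points spanning the scanning volume.
    cameras: list of (project, silhouette) pairs, where project maps 3D points
        to integer pixel coordinates (u, v) and silhouette is a boolean image
        that is True where the backlit image is dark (the held object(s)).
    A voxel survives only if every camera sees it inside the silhouette, so the
    surviving voxels approximate the intersection of all camera frustums.
    """
    keep = np.ones(len(voxel_centers), dtype=bool)
    for project, silhouette in cameras:
        u, v = project(voxel_centers)              # pixel coordinates per voxel
        h, w = silhouette.shape
        inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        in_sil = np.zeros(len(voxel_centers), dtype=bool)
        in_sil[inside] = silhouette[v[inside], u[inside]]
        keep &= in_sil                             # intersect with this frustum
    return keep.sum() * voxel_size ** 3            # upper bound on object volume
```

The known volume of the gripper, whose position and size are known, could then be subtracted from the returned bound, as noted above.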
In accordance with other embodiments, 3D scanners that obtain 3D images of the scanning volume may be used. The volume estimates are then obtained in a similar way by fusing together the point clouds received from each sensor, but without the need for segmenting the images from the background using backlighting. Each 3D scanner returns a 3D image, which for each pixel in the image returns a depth, and may use any of light detection and ranging (LIDAR) scanners, pulsed time-of-flight cameras, continuous-wave time-of-flight cameras, structured light cameras, or passive stereo cameras, etc.
The system may therefore compare the object volume to the difference in volumes of the picking area before and after pick. Another approach is to analyze either or both of the picking or placing volumes using a 3D scanner, and then to estimate the amount of additional or subtracted volume observed in the perceived areas. For example, first, the picking area is scanned with a 3D scanner that recovers a 3D point cloud of the area. Second, the robot picker picks an object with the aim of picking a single object. Third, the picking area is re-scanned, and an estimate is formed of how much volume was taken away from the picking area. Fourth, using that volume estimate, as above, a decision is made in accordance with one or more defined thresholds as to whether that volume difference is believed to exceed the volume of a single quantity of the object by a predetermined threshold.
In accordance with further embodiments, the system may scan the picking volume before and after picking, and compare estimated volumes. In this case, the volume of the picking or placing area might be estimated in the following way. The 3D scanner is assumed to be looking approximately down at the picking area, or at a slight angle. The 3D scanner provides an image of the area, and for every pixel in the image it provides a range to the geometry in the direction of the pixel. With this array of range measurements a point cloud may be formed. This point cloud represents points in three-dimensional space that are estimated to be on the top surface of the pick face, where the pick face is the topmost surface of the objects to be picked. The area of the pick face can be discretized into a grid of vertical columns, and for each vertical column, an estimate of the height of the geometry within the vertical column can be obtained by taking the maximum, mean, median, minimum, or some other robust statistic of the heights of the points that lie within the column. Then, the volume of the picking area is the sum of the height values estimated for each vertical column.
For various reasons, such as resolution, reflections, or transparency, some vertical columns may not have any point cloud points, in which case the resolution may be changed adaptively so that the vertical columns are wide enough that none are empty of point cloud points. Statistics of various kinds may be obtained to determine a bound for the volume, such as employing the variance of the heights of the points within the columns to obtain an overall variance for the pick area volume. In this way, an estimate of the differential volume can be obtained in either the picking area, where the volume of the area would decrease by a single pick if a single object were indeed picked, or the placing area, where the volume of the area would increase by a single pick if a single object were indeed placed; or both a differential volume of picking and placing may be estimated and combined to combat noise and errors in estimation.
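For concreteness, a minimal sketch of the column-height estimate and the before/after comparison is shown below; the function names, the choice of the median as the robust statistic, and the decision factor are illustrative assumptions, and the adaptive handling of empty columns described above is omitted for brevity.

```python
import numpy as np

def pick_face_volume(points: np.ndarray, cell: float = 0.01,
                     floor_z: float = 0.0) -> float:
    """Estimate the volume under the pick face from a downward-looking point cloud.

    points: (N, 3) array of x, y, z samples of the topmost surface.
    The x-y plane is discretized into square columns of width `cell`; each
    column's height is a robust statistic (here, the median) of the z values of
    the points falling in it, and the volume is the sum of column heights times
    the column footprint area.
    """
    ij = np.floor(points[:, :2] / cell).astype(int)
    heights: dict = {}
    for key, z in zip(map(tuple, ij), points[:, 2]):
        heights.setdefault(key, []).append(z)
    column_heights = [max(np.median(zs) - floor_z, 0.0) for zs in heights.values()]
    return sum(column_heights) * cell * cell

def exceeds_single_pick(before: np.ndarray, after: np.ndarray,
                        unit_volume: float, factor: float = 1.5) -> bool:
    """Flag a possible multi-pick when the volume removed from the picking area
    exceeds the known single-object volume by the chosen factor."""
    removed = pick_face_volume(before) - pick_face_volume(after)
    return removed > factor * unit_volume
```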
With reference to
In any of these approaches to obtaining a volume estimate, the volume estimate is then compared to the known volume of the object. Because the sensor data may be noisy, and because the various algorithms for estimating volume may result in small errors, the thresholds for deciding whether more than one pick has occurred are tuned to balance the number of false positives (picks estimated to contain more than one object, but in actuality only one is held) and false negatives (picks estimated to contain a single object, but in actuality containing more than one), depending on the application. The tuned threshold may also depend on the object in question, and each object may have its own threshold. If it is determined that the threshold has been exceeded, the objects may be placed back into the area from which they were picked so that the robot may try again to pick a single object.
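One plausible way to tune such a per-object threshold from labeled historical picks is sketched below; the cost weights and the ratio-based parameterization are assumptions made for illustration only.

```python
def tune_threshold(single_pick_ratios: list[float],
                   multi_pick_ratios: list[float],
                   false_positive_cost: float = 1.0,
                   false_negative_cost: float = 1.0) -> float:
    """Choose a volume-ratio threshold for one object from labeled history.

    Each input is a list of (estimated_volume / known_single_volume) ratios
    observed when the pick was later confirmed to contain one object or more
    than one object, respectively. The threshold minimizing the weighted count
    of false positives and false negatives is returned; each object may carry
    its own tuned threshold.
    """
    candidates = sorted(set(single_pick_ratios) | set(multi_pick_ratios))
    best_threshold, best_cost = None, float("inf")
    for t in candidates:
        false_pos = sum(1 for r in single_pick_ratios if r > t)    # flagged but single
        false_neg = sum(1 for r in multi_pick_ratios if r <= t)    # missed multi-pick
        cost = false_positive_cost * false_pos + false_negative_cost * false_neg
        if cost < best_cost:
            best_threshold, best_cost = t, cost
    return best_threshold

# Example: historical ratios near 1.0 for single picks and near 2.0 for doubles.
print(tune_threshold([0.95, 1.02, 1.1, 1.05], [1.9, 2.1, 1.8]))
```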
In accordance with further embodiments, the system may detect multiple picks by automatically perceiving leaks from flow or pressure data. With reference to
The system may therefore detect multiple picks by automatically perceiving leaks from flow or pressure data. Another approach is to compute statistics from observations of flow and pressure while holding an object static, and to compare these with statistics collected when the same object was gripped before. In further embodiments, the system may compute, from time series data of flow and/or pressure while holding the object, the variance and other statistics with which to compare statistics from when the same or a similar object was previously gripped. In further embodiments, the system may compare the obtained values, and if the difference lies above a certain threshold, rule the pick an instance of picking more than one of the object. In further embodiments, the system may employ a linear classifier, support vector machine, or other machine learning-based classifier to discriminate between single or multiple picks using flow or pressure data. Additionally, the system may combine any subsets of the above approaches. The system may also use performance models of any of the above, where the system knows the probability of a single or multiple pick given the output of a detection, and may then combine any or all of the approaches above.
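A minimal sketch of the statistical comparison is given below, assuming a vacuum gripper whose flow sensor is sampled during the grasp; the margins and helper names are hypothetical, and a learned classifier could replace the fixed margins as noted above.

```python
from statistics import mean, variance

def flags_multi_pick(flow_series: list[float],
                     baseline_mean: float,
                     baseline_variance: float,
                     mean_margin: float = 0.10,
                     variance_margin: float = 0.50) -> bool:
    """Compare flow statistics during the current grasp with a baseline recorded
    when a single unit of the same (or a similar) object was held.

    A multi-pick that leaves part of the vacuum cup exposed tends to leak,
    raising both the mean flow and its variance; exceeding either margin flags
    the grasp for re-picking or re-routing.
    """
    current_mean = mean(flow_series)
    current_var = variance(flow_series)
    mean_shift = abs(current_mean - baseline_mean) / baseline_mean
    var_shift = abs(current_var - baseline_variance) / max(baseline_variance, 1e-9)
    return mean_shift > mean_margin or var_shift > variance_margin
```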
In certain applications, such as when picking from a homogeneous tote of objects, the system may identify active orders (e.g., from a manifest) that require two such objects (and do not yet have the total number of such objects, requiring at least two more). The system may then route an identified double-pick to the identified location, noting that two such objects have been delivered to the identified location. Such a system, however, must maintain active monitoring of the grasp, and may learn over time which types of objects are more likely to result in a multi-pick and which such multi-picks may be reliably delivered by the end effector and processed together further in the processing system. In particular, if a shuttle carriage is used, the system must know that two such objects will fit into the carriage at the same time. The presence of a two-object pick will be confirmed if the shuttle carriage includes weight sensing as discussed herein with reference to
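As a hypothetical sketch of the order-matching step (with made-up names such as `OpenOrder` and `route_double_pick`), a detected double-pick might be assigned to an active order that still requires at least two such objects and whose shuttle carriage can hold both at once:

```python
from dataclasses import dataclass

@dataclass
class OpenOrder:
    order_id: str
    sku: str
    remaining: int           # units of this SKU still needed
    carriage_capacity: int   # how many such objects fit in the shuttle carriage

def route_double_pick(sku: str, orders: list[OpenOrder]) -> str | None:
    """Find an active order that can accept a confirmed double-pick of `sku`.

    The order must still need at least two more units, and the carriage serving
    it must hold two such objects at once; otherwise the double-pick is placed
    back so the robot may try again to pick a single object.
    """
    for order in orders:
        if order.sku == sku and order.remaining >= 2 and order.carriage_capacity >= 2:
            order.remaining -= 2        # note that two objects are being delivered
            return order.order_id
    return None                          # no suitable order: return objects to pick area
```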
In accordance with further embodiments, the system may compare held or transferred weight with known object weight. In such a system, the approach is to measure the weight of the picked objects and compare the measured weight to the a priori known weight of the object. The weighing of the object might be implemented in any of a number of ways: at the gripper, by force or load sensing mechanisms in the gripper; at the pick or place area, where the underlying pick area container or conveyor is continually weighed by a scale to measure the change in weight before and after picking or placing; on a supplemental transfer mechanism such as a shuttle that transports singulated and picked objects to other locations; or on a temporary weighing mechanism within the workspace of the robot pick cell, where the robot places the one or more objects on a holding tray, where they are then weighed and the robot re-picks them.
The carriage also includes a pair of load cells 112, 114 that are coupled to the frame 106, and the carriage body 103 is mounted to (and is suspended by) the load cells. By locating the load cells on the body of the carriage close to the object(s) held therein, a highly reliable weight measurement may be obtained. Once an object is detected, for example by the beam-break transmitter and receiver pair 102, 104, the system will determine the weight of the object. In accordance with an embodiment, the system will add the weight values of the two load cells (W1, W2) together, and subtract the weight of the body 100. In this way, the weight of objects may also be obtained and checked (within a tolerance range of, for example, 2% to 5%) against a manifest or shipping information. In accordance with other embodiments, the load cells themselves may register a change, indicating that the carriage has received or expelled an object.
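The weight computation described above (summing the two load cell values, subtracting the carriage body weight, and checking the result against the manifest) might look like the following sketch; the function names and example values are illustrative only.

```python
def carriage_object_weight(w1: float, w2: float, body_weight: float) -> float:
    """Net weight of the object(s) in the carriage: the two load cell readings
    summed, minus the known weight of the suspended carriage body."""
    return (w1 + w2) - body_weight

def within_manifest(measured: float, recorded: float, tolerance: float = 0.05) -> bool:
    """Check the measured weight against the manifest within, e.g., a 2%-5% tolerance."""
    return abs(measured - recorded) / recorded <= tolerance

# Example: load cells read 0.9 kg and 1.1 kg with a 0.4 kg carriage body,
# giving a 1.6 kg object that is checked against a 1.58 kg manifest entry.
net = carriage_object_weight(0.9, 1.1, 0.4)
print(net, within_manifest(net, 1.58))
```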
The carriage body 103 is adapted to be rotatable about an axis 117 (to empty its contents), and the carriage body 103 is attached to a top portion 119 of the load cells 112, 114 above the axis of rotation 117.
The detection of weight is important for many industrial tasks. If an object in the carriage has a weight significantly different from that in a manifest or on a shipping label, the object may be held as discussed above until the discrepancy is resolved (e.g., additional charges are processed). In accordance with other embodiments, a carriage may be used to transport multiple uniform-weight objects. Because the uniform weight is known, the quantity of objects in the carriage may be determined by dividing the measured total weight by the known object weight.
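A short sketch of the quantity calculation follows; the rounding tolerance and the fallback to manual verification are assumptions rather than part of the disclosure.

```python
def count_uniform_objects(total_weight: float, unit_weight: float,
                          tolerance: float = 0.05) -> int | None:
    """Infer how many uniform-weight objects a carriage holds.

    The measured total weight divided by the known per-object weight is rounded
    to the nearest integer; if the remainder is larger than the tolerance (as a
    fraction of one object's weight), the count is treated as unreliable.
    """
    if unit_weight <= 0:
        raise ValueError("unit_weight must be positive")
    estimate = total_weight / unit_weight
    count = round(estimate)
    if abs(estimate - count) > tolerance:
        return None          # ambiguous reading: route for manual verification
    return count

# Example: 3.02 kg measured with 0.5 kg objects yields a count of 6.
print(count_uniform_objects(3.02, 0.5))
```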
The carriage 100′ is similarly mounted via load cells 112, 114 on a frame 106, and its motion along a rail and in tipping is controlled by the actuation system 108. Communication and electronic controls are provided by the electronic processing and communication system 110 (shown in
The processing station 36′ also includes the capture system 78 that includes scanning and receiving units 80, 82, as well as edge detection units 84 (as discussed above in reference to
In accordance with an embodiment, the volume captured in
The difference in volume between the scans shown in
In accordance with further embodiments and with reference to
The output boxes may be provided on a conveyor (e.g., rollers or belt), and may be biased (for example, by gravity) to urge all destination bins toward one end 680 as shown. With reference to
Following displacement of the box onto the conveyor, each of the output boxes may be urged together, and the system will record the change in position of any of the boxes that moved. This way, a new empty box may be added to the end, and the system will record the correct location and identified processing particulars of each of the destination bins.
Each of the above detection systems for detecting or estimating any of weight, size, mass, volume, density etc. may be used with a variety of object processing systems.
The system 302 includes a perception unit 308, and with further reference to
The carriage 314 may be any of the carriages discussed above with reference to
Further,
The system 402 includes a perception unit 408 that is the same as the perception unit 308, combining the functionalities of the detection system 72 and the capture system 78 discussed above, and including lights and perception units, scanning and receiving units, as well as edge detection units. Each perception unit 408 may therefore capture identifying indicia and provide volumetric 3D scanning as discussed above.
The carriage 414 may be any of the carriages discussed above with reference to
Those skilled in the art will appreciate that numerous modifications and variations may be made to the above disclosed embodiments without departing from the spirit and scope of the present invention.
The present application is a continuation application of U.S. patent application Ser. No. 16/661,820 filed Oct. 23, 2019, now U.S. Pat. No. 11,373,134, issued Jun. 28, 2022, which claims priority to U.S. Provisional Patent Application Ser. No. 62/749,509 filed Oct. 23, 2018, as well as U.S. Provisional Patent Application Ser. No. 62/884,351 filed Aug. 8, 2019, the disclosures of which are hereby incorporated by reference in their entireties.