A truck leaving a distribution center may contain numerous pallets, each loaded with goods. One or more of the loaded pallets may need to be delivered to each of a plurality of stores. Attempts are made to load the truck in reverse delivery sequence, that is, loading the last-to-be-delivered pallets first. Loading the pallets in the wrong sequence can reduce efficiency, and loading pallets onto the wrong truck can reduce efficiency significantly.
A delivery portal, which may be at a loading dock, includes a sensor configured to detect a pallet, platform or stack of goods as it passes through the portal. A computer is programmed to receive information from the sensor and to identify the pallet based upon the information. The computer is further programmed to compare the identified pallet to a database to determine if the identified pallet should be passing through the portal. For example, the computer determines whether the pallet is being loaded onto the wrong truck or onto the right truck but in the wrong sequence.
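The comparison described above can be sketched in a few lines. This is an illustrative sketch only, not the actual implementation; the function and parameter names (`check_pallet`, `assignments`, `loaded`) are hypothetical, and the loading plan is assumed to be an ordered list of pallet ids per dock.

```python
def check_pallet(pallet_id, dock_id, assignments, loaded):
    """Decide whether a pallet passing through the portal belongs there.

    assignments maps dock_id -> ordered list of pallet ids to load
    (last-to-be-delivered first); loaded is the set of pallet ids
    already on the truck at this dock.
    """
    plan = assignments.get(dock_id, [])
    if pallet_id not in plan:
        return "WRONG_TRUCK"
    # Every pallet scheduled before this one must already be on the truck.
    idx = plan.index(pallet_id)
    if any(p not in loaded for p in plan[:idx]):
        return "OUT_OF_SEQUENCE"
    return "OK"

# Last-to-be-delivered pallet P3 is loaded first.
assignments = {"dock1": ["P3", "P2", "P1"]}
print(check_pallet("P3", "dock1", assignments, loaded=set()))  # OK
print(check_pallet("P1", "dock1", assignments, loaded=set()))  # OUT_OF_SEQUENCE
print(check_pallet("P9", "dock1", assignments, loaded=set()))  # WRONG_TRUCK
```

In practice the plan and loaded set would come from the database the computer consults, with the portal sensor supplying `pallet_id`.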
The sensor for detecting the pallet may be an RFID sensor that reads an RFID tag on each pallet. The portal may be at a loading dock.
The database may indicate a sequence for loading a plurality of pallets including the identified pallet onto a truck at the loading dock.
The delivery portal may also include a camera and the computer may be programmed to receive images from the camera. The computer may also be programmed to identify a person moving the pallet through the portal, such as via facial recognition based on the image from the camera.
The computer may be programmed to determine a direction of travel of the pallet through the portal. The computer may determine the direction of travel based upon information from the camera, such as based upon a plurality of sequential images from the camera. In this manner, the computer can track whether the identified pallet is being moved onto the truck or off of the truck (for example, after it has been noted that a wrong pallet has been moved onto the truck).
The delivery portal may further include a presence sensor. The computer may be programmed to activate the RFID sensor and/or the camera based upon information from the presence sensor. The presence sensor may be a breakbeam sensor or a motion sensor.
Also disclosed herein is a delivery portal sensor tower, which can be used, for example, at a loading dock. The tower may include a housing and an RFID sensor, a camera, and a presence sensor all mounted to the housing. A computer may be in communication with the RFID sensor, the camera and the presence sensor. Based upon an indication of presence by the presence sensor, the computer is programmed to cause the RFID sensor to read an RFID tag and to cause the camera to generate at least one image.
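The trigger logic of the sensor tower might look like the following sketch, in which the presence sensor gates the RFID read and image capture. All class and method names here are assumptions for illustration; the real sensor drivers would supply the `detected`, `read_tag`, and `capture` behavior.

```python
import time

class SensorTower:
    """Illustrative sensor-tower controller: presence-gated capture."""

    def __init__(self, presence, rfid, camera):
        self.presence = presence   # break-beam or motion sensor
        self.rfid = rfid           # RFID reader
        self.camera = camera       # camera for sequential images

    def poll_once(self):
        """Run one polling cycle; return (tag, images) or None if idle."""
        if not self.presence.detected():
            return None
        tag = self.rfid.read_tag()                  # read the pallet's RFID tag
        images = [self.camera.capture() for _ in range(2)]  # sequential frames
        return tag, images

    def run(self, handler, interval=0.1):
        """Poll forever, passing each detection event to a handler."""
        while True:
            event = self.poll_once()
            if event is not None:
                handler(*event)
            time.sleep(interval)
```

Capturing at least two frames per event is what later allows the computer to infer direction of travel from sequential images.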
A computerized method for operating a portal is also disclosed herein. A platform carrying a plurality of items stacked thereon is identified near a truck. The identity of the platform is received in a computer. The computer compares the identified platform to a list indicating whether the identified platform should be loaded onto the truck. The computer generates an indication of whether the identified platform should be loaded onto the truck.
The platform may be a pallet. The list may indicate a sequence of loading a plurality of pallets including the identified pallet. The computer compares the identified pallet to the list to determine whether others of the plurality of pallets on the list should be loaded onto the truck before the identified pallet.
The platform or pallet may be identified by reading an RFID tag on the pallet or platform. The camera may be used to image the platform or pallet and a person moving the platform or pallet. The image may be used to validate the items on the pallet or platform, and may be used to identify the person.
The method may also include determining a direction of movement of the platform relative to the truck, e.g. whether the platform or pallet is being moved onto the truck or off of the truck.
Each distribution center 12 includes one or more pick stations 30, a plurality of validation stations 32, and a plurality of loading stations 34. Each loading station may be a loading dock for loading the trucks 18.
Each distribution center 12 includes a DC computer 26. The DC computer 26 receives orders 60 from the stores 16 and communicates with a central server 14. Each DC computer 26 receives orders and generates pick sheets 64, each of which stores SKUs and associates them with pallet ids. Alternatively, the orders 60 can be sent from the DC computer 26 to the central server 14 for generation of the pick sheets 64, which are synced back to the DC computer 26.
Some or all of the distribution centers 12 may include a training station 28 for generating image information and other information about new products 20 which can be transmitted to the central server 14 for analysis and future use.
The central server 14 may include a plurality of distribution center accounts 40, including DC1-DCn, each associated with a distribution center 12. Each DC account 40 includes a plurality of store accounts 42, including store 1-store n. The orders 60 and pick sheets 64 for each store are stored in the associated store account 42. The central server 14 further includes a plurality of machine learning models 44 trained as will be described herein based upon SKUs. The models 44 may be periodically synced to the DC computers 26.
The machine learning models 44 are used to identify SKUs. A “SKU” may be a single variation of a product that is available from the distribution center 12 and can be delivered to one of the stores 16. For example, each SKU may be associated with a particular package type, e.g. the number of containers (e.g. 12 pack) in a particular form (e.g. can or bottle) and of a particular size (e.g. 24 ounces) with a particular secondary container (cardboard vs. reusable plastic crate, cardboard tray with plastic overwrap, etc.). Each machine learning model 44 is trained to identify the possible package types.
Each SKU may also be associated with a particular “brand” (e.g. the manufacturer and the specific flavor). Each machine learning model 44 is trained to identify the possible brands, which are associated with the name of the product, a description of the product, dimensions of the product, and image information for the product. The central server 14 also stores the expected weight of each SKU. It is also possible that more than one variation of a product may share a single SKU, such as where only the packaging, aesthetics, and outward appearance of the product varies, but the content and quantity are the same. For example, sometimes promotional packaging may be utilized, which would have different image information for a particular SKU. In general, all the machine learning models 44 may be generated based upon image information generated through the training station 28.
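The attributes associated with each SKU above could be modeled with a simple record type. This is a hypothetical sketch of such a data model, not the actual schema; all field names are assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Sku:
    """Illustrative SKU record combining the attributes described above."""
    sku_id: str
    brand: str               # manufacturer and specific flavor
    package_type: str        # e.g. "12-pack cans, 24 oz, reusable plastic crate"
    name: str
    description: str
    dimensions: tuple        # (length, width, height)
    expected_weight: float   # stored on the central server
    image_refs: list = field(default_factory=list)  # training images, incl. any promotional packaging

cola = Sku(
    sku_id="SKU-1001",
    brand="ExampleCo Cola",
    package_type="12-pack cans, 24 oz, cardboard tray with plastic overwrap",
    name="Cola 12-pack",
    description="12 x 24 oz cans",
    dimensions=(16.0, 11.0, 7.5),
    expected_weight=19.2,
)
```

Promotional-packaging variants sharing a SKU would simply add entries to `image_refs` rather than creating a new record.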
Referring also to the flowchart in
Workers place items 20 on the pallets 22 according to the pick sheets 64, and report the pallet ids to the DC computer 26 in step 154. The DC computer 26 dictates merchandizing groups and subgroups for loading items 20a, 20b on the pallets 22 in order to make unloading easier at the store. In the example shown, the pick sheets 64 dictate that products 20a are on one pallet 22 while products 20b are on another pallet 22. For example, cooler items should be grouped together, and dry items should be grouped together. Splitting of package groups is also minimized to make unloading easier. This also makes the pallets 22 more stable.
After one pallet 22 is loaded, the next pallet 22 is brought to the pick station 30, until all of the SKUs required by the pick sheet 64 are loaded onto as many pallets 22 as required by that pick sheet 64. More pallets 22 are then loaded for the next pick sheet 64. The DC computer 26 records the pallet ids of the pallet(s) 22 that have been loaded with particular SKUs for each pick sheet 64. The pick sheet 64 may associate each pallet id with each SKU.
After being loaded, each loaded pallet 22 may be validated at the validation station 32, which may be adjacent to or part of the pick station 30. As will be described in more detail below, at least one still image, and preferably several still images or video, of the products 20 on the pallet 22 is taken at the validation station 32 in step 156. The pallet id of the pallet 22 is also read. The images are analyzed to determine the SKUs of the products 20 that are currently on the identified pallet 22 in step 158. The SKUs of the products 20 on the pallet 22 are compared to the pick sheet 64 by the DC computer 26 in step 160, to ensure that all the SKUs associated with the pallet id of the pallet 22 on the pick sheet 64 are present on the correct pallet 22, and that no additional SKUs are present. Several ways of performing the aforementioned steps are disclosed below.
First, referring to
In one implementation, the turntable 67 is rotating and when the camera 68 detects that the two outer ends of the pallet 22 are equidistant (or otherwise that the side of the pallet 22 facing the camera 68 is perpendicular to the camera 68 view), the camera 68 records a still image. The camera 68 can record four still images in this manner, one of each side of the pallet 22.
The RFID reader 70 (or barcode reader, or the like) reads the pallet id (a unique serial number) from the pallet 22. The wrapper 66a includes a local computer 74 in communication with the camera 68 and RFID reader 70. The computer 74 can communicate with the DC computer 26 (and/or server 14) via a wireless network card 76. The image(s) and the pallet id are sent to the server 14 via the network card 76 and associated with the pick list 64 (
As an alternative, the turntable 67, camera 68, RFID reader 70, and computer 74 of
Alternatively, the validation station can include a worker with a networked camera, such as on a mobile device (e.g. smartphone or tablet) for taking one or more images 62 of the loaded pallet 22, prior to wrapping the loaded pallet 22. Other ways can be used to gather images of the loaded pallet. In any of the methods, the image analysis and/or comparison to the pick list is performed on the DC computer 26, which has a copy of the machine learning models. Alternatively, the analysis and comparison can be done on the server 14, locally on a computer 74, or on the mobile device 78, or on another locally networked computer.
However the image(s) of the loaded pallet 22 are collected, the image(s) are then analyzed to determine the SKU of every item 20 on the pallet 22 in step 158 (
The computer vision-generated SKU count for that specific pallet 22 is compared against the pick list 64 to ensure the pallet 22 is built correctly. This may be done prior to the loaded pallet 22 being wrapped, thus avoiding having to unwrap the pallet 22 to audit and correct it. If the built pallet 22 does not match the pick list 64 (step 162), the missing or wrong SKUs are indicated to the worker (step 164). The worker can then correct the items 20 on the pallet 22 (step 166) and reinitiate the validation (i.e. initiate new images in step 156). If the loaded pallet 22 is confirmed, positive feedback is given to the worker, who then continues wrapping the loaded pallet 22 (step 168). The worker then moves the validated loaded pallet 22 to the loading station 34 (step 172).
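The comparison of detected SKU counts against the pick list can be sketched as a multiset difference. This is an illustrative sketch under the assumption that both sides are represented as SKU-to-count mappings; `validate_pallet` and its parameter names are hypothetical.

```python
from collections import Counter

def validate_pallet(detected, pick_sheet):
    """Diff vision-detected SKU counts against the pick sheet for one pallet.

    detected and pick_sheet each map SKU -> count.
    Returns (missing, extra): SKUs on the pick sheet but not seen,
    and SKUs seen but not on the pick sheet.
    """
    detected, expected = Counter(detected), Counter(pick_sheet)
    missing = expected - detected   # Counter subtraction drops non-positive counts
    extra = detected - expected
    return dict(missing), dict(extra)

missing, extra = validate_pallet(
    detected={"SKU-A": 4, "SKU-C": 1},
    pick_sheet={"SKU-A": 4, "SKU-B": 2},
)
print(missing)  # {'SKU-B': 2}  -> indicated to the worker as missing
print(extra)    # {'SKU-C': 1}  -> indicated to the worker as wrong/extra
```

An empty `missing` and `extra` would correspond to the positive-feedback path, after which wrapping proceeds.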
After the loaded pallet 22 has been validated, it is moved to a loading station 34 (
Referring to
If the wrong pallet 22 is moved through (or toward) the doorway 80, an audible and/or visual alarm alerts the workers. Optionally, the RFID reader 86 at the doorway 80 is able to determine the direction of movement of the RFID tag on the loaded pallet 22, i.e. it can determine whether the loaded pallet 22 is being moved onto the truck 18 or off of the truck 18. This is helpful if the wrong loaded pallet 22 is moved onto the truck 18: the worker is notified that the wrong pallet 22 was loaded, and the RFID reader 86 can confirm that the pallet was then moved back off the truck 18.
When a group of two or more loaded pallets 22 is going to the same store 16, the loaded pallets 22 within this group can be loaded onto the truck 18 in any order. The display 82 may indicate the group of loaded pallets 22, and the loaded pallets 22 within this group going to the same store 16 will be approved by the RFID reader 86 and display 82 in any order within the group.
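The grouped-sequence rule can be sketched as follows: pallets bound for the same store form a group whose members may be loaded in any order, while the groups themselves follow the reverse delivery sequence. This is an illustrative sketch; the `next_allowed` function and the list-of-sets representation are assumptions.

```python
def next_allowed(groups, loaded):
    """Return the set of pallet ids acceptable to load next.

    groups is an ordered list of sets of pallet ids, one set per store,
    in loading (reverse delivery) order; loaded is the set of pallet ids
    already on the truck.
    """
    for group in groups:
        remaining = group - loaded
        if remaining:
            return remaining   # any pallet in the current group is approved
    return set()               # loading complete

groups = [{"P1", "P2"}, {"P3"}]   # store A's two pallets, then store B's pallet
print(next_allowed(groups, loaded=set()))          # {'P1', 'P2'} in either order
print(next_allowed(groups, loaded={"P2"}))         # {'P1'}
print(next_allowed(groups, loaded={"P1", "P2"}))   # {'P3'}
```

A pallet outside the returned set would trigger the wrong-truck or out-of-sequence alarm described above.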
As shown in
As also shown in
The computer, such as the DC computer 26, the server 14, or a dedicated local computer (or some combination thereof) is programmed to perform the steps shown
Simultaneously with step 342, the camera 84 will start capturing images in step 356 (
In step 358, the two (or more) images are compared. Based upon this comparison, it is determined in step 376 whether a direction of travel can be determined. If so, the direction of the movement is recorded in step 362. If not, then “direction unknown” is recorded in step 360. The system then returns to waiting in step 354.
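One plausible way to implement the comparison in steps 358-362 is to track the pallet's position across the sequential frames and take the sign of the displacement. This is a hypothetical sketch, assuming an upstream vision step supplies a bounding box per frame; the function name, box format, and threshold are all illustrative.

```python
def direction_from_frames(box1, box2, threshold=10):
    """Infer direction of travel from the pallet's bounding box in two frames.

    Each box is (x, y, w, h) in pixels, with positive x pointing toward
    the truck. Returns 'loading', 'unloading', or 'unknown'.
    """
    cx1 = box1[0] + box1[2] / 2     # horizontal center in the first frame
    cx2 = box2[0] + box2[2] / 2     # horizontal center in the second frame
    dx = cx2 - cx1
    if dx > threshold:
        return "loading"            # moving toward the truck (step 362)
    if dx < -threshold:
        return "unloading"          # moving away from the truck (step 362)
    return "unknown"                # displacement too small to call (step 360)

print(direction_from_frames((100, 50, 80, 60), (160, 52, 80, 60)))  # loading
print(direction_from_frames((160, 50, 80, 60), (100, 52, 80, 60)))  # unloading
```

The threshold guards against jitter in the detected box; a sub-threshold displacement maps to the "direction unknown" branch.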
Referring to the example in
With the loaded pallet 22 identified by pallet RFID, and the direction (loading or unloading determined), the system can determine that the particular pallet 22 is being loaded onto a correct truck or an incorrect truck based upon the loading assignments previously determined as described above. The system also determines whether the particular pallet 22 is being loaded in the correct or incorrect sequence by comparing it to the previously-determined loading sequence described above. If the pallet 22 is being loaded onto the wrong truck, or out of sequence, an alert would be generated (visually such as via display 82 and/or audibly). The system can then verify that the same pallet 22 is subsequently unloaded from that truck based upon a determination that the pallet 22 is moved in the direction off the truck.
In
Additional features for post processing can be implemented after events are recorded. Visual indicators can affirm or deny the accuracy of asset movement. Additional audible alarms can be generated in cases where operator alerting is urgent or critical. Email/text alerts can be sent with photos of threshold events (e.g. a high-value asset being loaded onto an incorrect truck). Shipment claim processing can also be supported, such as photographic verification that items left the warehouse.
In accordance with the provisions of the patent statutes and jurisprudence, the exemplary configurations described above are considered to represent preferred embodiments of the inventions. However, it should be noted that the inventions can be practiced otherwise than as specifically illustrated and described without departing from their spirit or scope. Alphanumeric identifiers on method steps are solely for ease of reference in dependent claims, and such identifiers by themselves do not signify a required sequence of performance, unless otherwise explicitly specified.
International Search Report for International Application No. PCT/US2021/016007 dated Apr. 9, 2020.
Number | Date | Country
---|---|---
20210326544 A1 | Oct 2021 | US

Number | Date | Country
---|---|---
63012669 | Apr 2020 | US