Camera enabled portal

Information

  • Patent Grant
  • Patent Number
    11,922,253
  • Date Filed
    Tuesday, April 20, 2021
  • Date Issued
    Tuesday, March 5, 2024
Abstract
A delivery portal, which may be at a loading dock, includes a sensor configured to detect a pallet, platform or stack of goods as it passes through the portal. A computer is programmed to receive information from the sensor and to identify the pallet based upon the information. The computer is further programmed to compare the identified pallet to a database to determine if the identified pallet should be passing through the portal. For example, the computer determines whether the pallet is being loaded onto the wrong truck or onto the right truck but in the wrong sequence. The sensor for detecting the pallet may be an RFID sensor reading an RFID tag on the pallets. The portal may be a loading dock. The database may indicate a sequence for loading a plurality of pallets including the identified pallet onto a truck at the loading dock.
Description
BACKGROUND

A truck leaving a distribution center may contain numerous pallets each loaded with goods. One or more of the loaded pallets may be required to be delivered to each of a plurality of stores. Attempts are made to load the truck in reverse-sequence, that is, loading the last-to-be-delivered first. Loading the pallets in the wrong sequence can reduce efficiency. Loading pallets into the wrong truck can significantly reduce efficiency.


SUMMARY

A delivery portal, which may be at a loading dock, includes a sensor configured to detect a pallet, platform or stack of goods as it passes through the portal. A computer is programmed to receive information from the sensor and to identify the pallet based upon the information. The computer is further programmed to compare the identified pallet to a database to determine if the identified pallet should be passing through the portal. For example, the computer determines whether the pallet is being loaded onto the wrong truck or onto the right truck but in the wrong sequence.


The sensor for detecting the pallet may be an RFID sensor reading an RFID tag on the pallets. The portal may be a loading dock.


The database may indicate a sequence for loading a plurality of pallets including the identified pallet onto a truck at the loading dock.


The delivery portal may also include a camera and the computer may be programmed to receive images from the camera. The computer may also be programmed to identify a person moving the pallet through the portal, such as via facial recognition based on the image from the camera.


The computer may be programmed to determine a direction of travel of the pallet through the portal. The computer may determine the direction of travel based upon information from the camera, such as based upon a plurality of sequential images from the camera. In this manner, the computer can track whether the identified pallet is being moved onto the truck or off of the truck (for example, after it has been noted that a wrong pallet has been moved onto the truck).


The delivery portal may further include a presence sensor. The computer may be programmed to activate the RFID sensor and/or the camera based upon information from the presence sensor. The presence sensor may be a breakbeam sensor or a motion sensor.


Also disclosed herein is a delivery portal sensor tower, which can be used, for example, at a loading dock. The tower may include a housing and an RFID sensor, a camera, and a presence sensor all mounted to the housing. A computer may be in communication with the RFID sensor, the camera and the presence sensor. Based upon an indication of presence by the presence sensor, the computer is programmed to cause the RFID sensor to read an RFID tag and to cause the camera to generate at least one image.


A computerized method for operating a portal is also disclosed herein. A platform carrying a plurality of items stacked thereon is identified near a truck. The identity of the platform is received in a computer. The computer compares the identified platform to a list indicating whether the identified platform should be loaded onto the truck. The computer generates an indication whether the identified platform should be loaded onto the truck.


The platform may be a pallet. The list may indicate a sequence of loading a plurality of pallets including the identified pallet. The computer compares the identified pallet to the list to determine whether others of the plurality of pallets on the list should be loaded onto the truck before the identified pallet.
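
By way of illustration, the core comparison described above can be reduced to a few lines. The following is a minimal sketch (not the patented implementation), assuming the list is simply an ordered sequence of the pallet ids still to be loaded onto the truck; all names are hypothetical:

def should_load(pallet_id, remaining_sequence):
    # remaining_sequence: ordered pallet ids not yet loaded onto this
    # truck, earliest-to-load first
    if pallet_id not in remaining_sequence:
        return False, "wrong truck"
    if remaining_sequence[0] != pallet_id:
        return False, "right truck, wrong sequence"
    return True, "ok to load"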


The platform or pallet may be identified by reading an RFID tag on the pallet or platform. A camera may be used to image the platform or pallet and a person moving the platform or pallet. The image may be used to validate the items on the pallet or platform, and may be used to identify the person.


The method may also include determining a direction of movement of the platform relative to the truck, e.g. whether the platform or pallet is being moved onto the truck or off of the truck.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a schematic view of a delivery system.



FIG. 2 is a flowchart of one version of a method for assembling items for delivery.



FIG. 3 shows an example loading station of the delivery system of FIG. 1.



FIG. 4 shows an example validation station of the delivery system of FIG. 1.



FIG. 5 is another view of the example validation system of FIG. 4 with a loaded pallet thereon.



FIG. 6 shows an example loading station of the delivery system of FIG. 1.



FIG. 7 is another view of the example loading station of FIG. 6.



FIG. 8 illustrates a sensor tower at the loading dock of FIG. 7.



FIG. 9 shows a portion of the sensor tower of FIG. 8 partially broken away.



FIG. 10 shows a sensor tower positioned adjacent each doorway and a loaded pallet being brought toward the doorway.



FIGS. 11A and 11B show a flowchart for the operation of the sensor tower.



FIG. 12 shows break beam sensors detecting an outbound loaded pallet.



FIG. 13 shows the RFID reader detecting a tag.



FIG. 14 shows the camera capturing an image.



FIG. 15 illustrates the camera capturing images for the SKU validation and the load validation.



FIG. 16 shows the breakbeam sensor detecting movement of an inbound loaded pallet.



FIG. 17 shows the RFID sensor recording the RFID tag on the pallet.



FIG. 18 shows the camera imaging the loaded pallet and the driver.



FIG. 19 shows that the system has determined the direction, date/time, pallet id and identification of the driver.





DETAILED DESCRIPTION


FIG. 1 is a high-level view of a delivery system 10 including one or more distribution centers 12, a central server 14 (e.g. cloud computer), and a plurality of stores 16. A plurality of trucks 18 or other delivery vehicles each transport the products 20 on pallets 22 from one of the distribution centers 12 to a plurality of stores 16. Each truck 18 carries a plurality of pallets 22, which may be half pallets, each loaded with a plurality of goods 20 for delivery to one of the stores 16. A wheeled sled 24 is on each truck 18 to facilitate delivery of one or more pallets 22 of goods 20 to each store 16. Generally, the goods 20 could be loaded on the half pallets 22, full-size pallets, carts, hand carts, or dollies, all considered "platforms" herein.


Each distribution center 12 includes one or more pick stations 30, a plurality of validation stations 32, and a plurality of loading stations 34. Each loading station may be a loading dock for loading the trucks 18.


Each distribution center 12 includes a DC computer 26. The DC computer 26 receives orders 60 from the stores 16 and communicates with a central server 14. Each DC computer 26 receives orders and generates pick sheets 64, each of which stores SKUs and associates them with pallet ids. Alternatively, the orders 60 can be sent from the DC computer 26 to the central server 14 for generation of the pick sheets 64, which are synced back to the DC computer 26.
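
For illustration only, a pick sheet of the kind described above might be represented as a simple mapping from pallet ids to SKUs and quantities; the field names here are hypothetical, not the actual schema:

pick_sheet = {
    "order_id": "ORD-1001",           # order 60 from a store 16
    "store": "store 1",
    "pallets": {                      # pallet id -> SKU -> quantity
        "PAL-0001": {"SKU-A": 10, "SKU-B": 4},
        "PAL-0002": {"SKU-C": 6},
    },
}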


Some or all of the distribution centers 12 may include a training station 28 for generating image information and other information about new products 20 which can be transmitted to the central server 14 for analysis and future use.


The central server 14 may include a plurality of distribution center accounts 40, including DC1-DCn, each associated with a distribution center 12. Each DC account 40 includes a plurality of store accounts 42, including store 1-store n. The orders 60 and pick sheets 64 for each store are stored in the associated store account 42. The central server 14 further includes a plurality of machine learning models 44 trained as will be described herein based upon SKUs. The models 44 may be periodically synced to the DC computers 26.


The machine learning models 44 are used to identify SKUs. A “SKU” may be a single variation of a product that is available from the distribution center 12 and can be delivered to one of the stores 16. For example, each SKU may be associated with a particular package type, e.g. the number of containers (e.g. 12 pack) in a particular form (e.g. can or bottle) and of a particular size (e.g. 24 ounces) with a particular secondary container (cardboard vs. reusable plastic crate, cardboard tray with plastic overwrap, etc.). Each machine learning model 44 is trained to identify the possible package types.


Each SKU may also be associated with a particular “brand” (e.g. the manufacturer and the specific flavor). Each machine learning model 44 is trained to identify the possible brands, which are associated with the name of the product, a description of the product, dimensions of the product, and image information for the product. The central server 14 also stores the expected weight of each SKU. It is also possible that more than one variation of a product may share a single SKU, such as where only the packaging, aesthetics, and outward appearance of the product varies, but the content and quantity are the same. For example, sometimes promotional packaging may be utilized, which would have different image information for a particular SKU. In general, all the machine learning models 44 may be generated based upon image information generated through the training station 28.
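
As a sketch of the per-SKU data described above (the field names are assumptions for illustration):

from dataclasses import dataclass

@dataclass
class Sku:
    sku_id: str
    brand: str                # manufacturer and specific flavor
    package_type: str         # e.g. "12-pack 24 oz cans, cardboard tray"
    name: str
    description: str
    dimensions_mm: tuple      # (length, width, height)
    expected_weight_kg: float
    image_refs: list          # image information from the training station 28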


Referring also to the flowchart in FIG. 2, an order 60 may be received from a store 16 in step 150. As an example, an order 60 may be placed by a store employee using an app or mobile device 52. The order 60 is sent to the distribution center computer 26 (or alternatively to the server 14, and then relayed to the proper (e.g. closest) distribution center computer 26). The distribution center computer 26 analyzes the order 60 and creates a pick sheet 64 associated with that order 60 in step 152. The pick sheet 64 lists each of the SKUs (including the quantity of each SKU) from the order. The pick sheet 64 specifies how many pallets 22 will be necessary for that order (as determined by the DC computer 26). The DC computer 26 may also determine which SKUs should be loaded near one another on the same pallet 22, or if more than one pallet 22 will be required, which SKUs should be loaded together on the same pallet 22. For example, SKUs that go in the cooler may be together on the same pallet (or near one another on the same pallet), while SKUs that go on the shelf may be on another part of the pallet (or on another pallet, if there is more than one). If the pick sheet 64 is created on the DC computer 26, it is copied to the server 14. If it is created on the server 14, it is copied to the DC computer 26.
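
The grouping logic described above might look like the following sketch, which keeps each merchandising group (e.g. cooler vs. shelf) on its own pallet(s); the capacity figure and all names are assumptions for illustration:

from collections import defaultdict

def assign_pallets(order, sku_group, capacity=60):
    # order: SKU -> case quantity; sku_group: SKU -> "cooler"/"shelf"/...
    by_group = defaultdict(dict)
    for sku, qty in order.items():
        by_group[sku_group[sku]][sku] = qty
    pallets = []
    for group in sorted(by_group):
        current, used = {}, 0
        for sku, qty in by_group[group].items():
            if current and used + qty > capacity:
                pallets.append(current)   # pallet full; start another
                current, used = {}, 0
            current[sku] = qty
            used += qty
        if current:
            pallets.append(current)       # groups are never mixed on a pallet
    return pallets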



FIG. 3 shows the pick station 30 of FIG. 1. Referring to FIGS. 1 and 3, workers at the distribution center read the pallet id (e.g. via rfid, barcode, etc.) on the pallet(s) 22 on a pallet jack 24a, such as with a mobile device or a reader on the pallet jack 24a. Shelves may contain a variety of items 20 for each SKU, such as a first product 20a of a first SKU and a second product 20b of a second SKU (collectively “products 20”). A worker reading a computer screen or mobile device screen displaying the pick sheet 64 retrieves each product 20 and places that product 20 on the pallet 22. Alternatively, the pallet 22 may be loaded by automated handling equipment.


Workers place items 20 on the pallets 22 according to the pick sheets 64, and report the pallet ids to the DC computer 26 in step 154. The DC computer 26 dictates merchandizing groups and sub-groups for loading items 20a, 20b on the pallets 22 in order to make unloading easier at the store. In the example shown, the pick sheets 64 dictate that products 20a are on one pallet 22 while products 20b are on another pallet 22. For example, cooler items should be grouped, and dry items should be grouped. Splitting of package groups is also minimized to make unloading easier. This also makes the pallets 22 more stable.


After one pallet 22 is loaded, the next pallet 22 is brought to the pick station 30, until all of the SKUs required by the pick sheet 64 are loaded onto as many pallets 22 as required by that pick sheet 64. More pallets 22 are then loaded for the next pick sheet 64. The DC computer 26 records the pallet ids of the pallet(s) 22 that have been loaded with particular SKUs for each pick sheet 64. The pick sheet 64 may associate each pallet id with each SKU.


After being loaded, each loaded pallet 22 may be validated at the validation station 32, which may be adjacent to or part of the pick station 30. As will be described in more detail below, at least one still image, and preferably several still images or video, of the products 20 on the pallet 22 is taken at the validation station 32 in step 156. The pallet id of the pallet 22 is also read. The images are analyzed to determine the SKUs of the products 20 that are currently on the identified pallet 22 in step 158. The SKUs of the products 20 on the pallet 22 are compared to the pick sheet 64 by the DC computer 26 in step 160, to ensure that all the SKUs associated with the pallet id of the pallet 22 on the pick sheet 64 are present on the correct pallet 22, and that no additional SKUs are present. Several ways of performing the aforementioned steps are disclosed below.


First, referring to FIGS. 4 and 5, the validation station may include a CV/RFID semi-automated wrapper 66a with a turntable 67, specially fitted with a camera 68 and rfid reader 70 (and/or barcode reader). The wrapper 66a holds a roll of translucent, flexible, plastic wrap or stretch wrap 72. As is known, a loaded pallet 22 can be placed on the turntable 67, which rotates the loaded pallet 22 as stretch wrap 72 is applied. The camera 68 may be a depth camera. In this wrapper 66a, the camera 68 takes at least one image of the loaded pallet 22 while the turntable 67 is rotating the loaded pallet 22, prior to or while wrapping the stretch wrap 72 around the loaded pallet 22. Images/video of the loaded pallet 22 after wrapping may also be generated. As used herein, “image” or “images” refers broadly to any combination of still images and/or video, and “imaging” means capturing any combination of still images and/or video. Again, preferably 2 to 4 still images, or video, are taken.


In one implementation, as the turntable 67 rotates, when the camera 68 detects that the two outer ends of the pallet 22 are equidistant (or otherwise that the side of the pallet 22 facing the camera 68 is perpendicular to the camera 68 view), the camera 68 records a still image. The camera 68 can record four still images in this manner, one of each side of the pallet 22.
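
A minimal sketch of that trigger condition, assuming the depth camera reports the distances to the two outer ends of the pallet (the camera interface shown is hypothetical):

def side_is_square(depth_left_m, depth_right_m, tol_m=0.01):
    # the facing side is roughly perpendicular to the camera view
    # when its two outer ends are equidistant from the camera
    return abs(depth_left_m - depth_right_m) <= tol_m

# As the turntable 67 rotates, record a still each time the side squares up:
# if side_is_square(frame.depth_at(left_edge), frame.depth_at(right_edge)):
#     stills.append(frame.image)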


The rfid reader 70 (or barcode reader, or the like) reads the pallet id (a unique serial number) from the pallet 22. The wrapper 66a includes a local computer 74 in communication with the camera 68 and rfid reader 70. The computer 74 can communicate with the DC computer 26 (and/or server 14) via a wireless network card 76. The image(s) and the pallet id are sent to the server 14 via the network card 76 and associated with the pick list 64 (FIG. 1). Optionally, a weight sensor can be added to the turntable 67 and the known total weight of the products 20 and pallet 22 can be compared to the measured weight on the turntable 67 for confirmation. An alert is generated if the total weight on the turntable 67 does not match the expected weight.
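
The optional weight confirmation reduces to a simple comparison; the tare weight and tolerance below are illustrative assumptions:

def weight_ok(measured_kg, expected_counts, sku_weight_kg,
              pallet_tare_kg=20.0, tol_kg=1.0):
    # expected_counts: SKU -> quantity from the pick sheet;
    # sku_weight_kg: expected per-SKU weight stored on the server 14
    expected = pallet_tare_kg + sum(
        sku_weight_kg[sku] * qty for sku, qty in expected_counts.items())
    return abs(measured_kg - expected) <= tol_kg  # False -> generate alert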


As an alternative, the turntable 67, camera 68, rfid reader 70, and computer 74 of FIGS. 4 and 5 can be used without the wrapper. The loaded pallet 22 can be placed on the turntable 67 for validation only and can be subsequently wrapped either manually or at another station.


Alternatively, the validation station can include a worker with a networked camera, such as on a mobile device (e.g. smartphone or tablet) for taking one or more images 62 of the loaded pallet 22, prior to wrapping the loaded pallet 22. Other ways can be used to gather images of the loaded pallet. In any of the methods, the image analysis and/or comparison to the pick list is performed on the DC computer 26, which has a copy of the machine learning models. Alternatively, the analysis and comparison can be done on the server 14, locally on a computer 74, or on the mobile device 78, or on another locally networked computer.


However the image(s) of the loaded pallet 22 are collected, the image(s) are then analyzed to determine the SKU of every item 20 on the pallet 22 in step 158 (FIG. 2).


The computer vision-generated SKU count for that specific pallet 22 is compared against the pick list 64 to ensure the pallet 22 is built correctly. This may be done prior to the loaded pallet 22 being wrapped, thus avoiding having to unwrap the pallet 22 to audit and correct it. If the built pallet 22 does not match the pick list 64 (step 162), the missing or wrong SKUs are indicated to the worker (step 164). Then the worker can correct the items 20 on the pallet 22 (step 166) and reinitiate the validation (i.e. initiate new images in step 156). If the loaded pallet 22 is confirmed, positive feedback is given to the worker, who then continues wrapping the loaded pallet 22 (step 168). The worker then moves the validated loaded pallet 22 to the loading station 34 (step 172).
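
The comparison in steps 160-164 is essentially a multiset difference between what the computer vision detected and what the pick list expects. A minimal sketch (names assumed):

from collections import Counter

def audit_pallet(detected_counts, expected_counts):
    d, e = Counter(detected_counts), Counter(expected_counts)
    missing = dict(e - d)   # on the pick list but not seen on the pallet
    extra = dict(d - e)     # seen on the pallet but not on the pick list
    return {"ok": not missing and not extra,
            "missing": missing, "extra": extra}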


After the loaded pallet 22 has been validated, it is moved to a loading station 34 (FIG. 1). As explained in more detail below, at the loading station 34, the distribution center computer 26 ensures that the loaded pallets 22, as identified by each pallet id, are loaded onto the correct trucks 18 in the correct order. For example, pallets 22 that are to be delivered at the end of the route are loaded first.


Referring to FIGS. 1 and 6, a computer (DC computer 26, server 14, or another) determines efficient routes to be driven by each truck 18 to visit each store 16 in the most efficient sequence, the specific loaded pallets 22 that must go onto each truck 18, and the order in which the pallets 22 should be loaded onto the trucks 18. An optimized queue system is used to queue and load loaded pallets 22 onto the truck 18 in the correct reverse-stop sequence (last stop is loaded onto the truck 18 first) based upon the route planned for that truck 18. Each truck 18 will be at a different loading dock doorway 80. A list or database may indicate which pallets 22 are to be loaded into which trucks 18 and in which sequence.
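
The reverse-stop queue can be sketched directly from that description; the data structures are assumptions for illustration:

def loading_sequence(route_stops, pallets_by_stop):
    # route_stops: stores in delivery order; pallets_by_stop: store -> pallet ids.
    # The last store to be visited is loaded onto the truck first.
    order = []
    for stop in reversed(route_stops):
        order.extend(pallets_by_stop[stop])
    return order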



FIG. 7 shows an example loading station 34, such as a loading dock with a doorway 80. Based upon the sequence determined by the server 14 or DC computer 26 or other computer, an electronic visual display 82 proximate the doorway 80 shows which pallet 22 is to be loaded onto that truck 18 next. A sensor tower 310 having a housing 312 is mounted adjacent the doorway 80. A presence sensor 316 may be mounted to the housing 312. The sensor tower 310 may further include a camera 84 and/or rfid reader 86 adjacent the doorway 80. After being triggered by the presence sensor 316, the camera 84 and/or the rfid reader 86 image/read each loaded pallet 22 as it is being loaded onto the truck 18. The pallet 22 may be identified by the pallet id and/or based upon the products on the pallet as shown in the image. The computer compares that identified pallet 22 to the previously-determined lists.


If the wrong pallet 22 is moved through (or toward) the doorway 80, an audible and/or visual alarm alerts the workers. Optionally, the rfid reader 86 at the doorway 80 is able to determine the direction of movement of the rfid tag on the loaded pallet 22, i.e. it can determine if the loaded pallet 22 is being moved onto the truck 18 or off of the truck 18. This is helpful if the wrong loaded pallet 22 is moved onto the truck 18. The worker is notified that the wrong pallet 22 was loaded, and the rfid reader 86 can confirm that the pallet was then moved back off the truck 18.


When a group of loaded pallets 22 (two or more) is going to the same store 16, the loaded pallets 22 within this group can be loaded onto the truck 18 in any order. The display 82 may indicate the group of loaded pallets 22 and the loaded pallets 22 within this group going to the same store 16 will be approved by the rfid reader 86 and display 82 in any order within the group.
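
This relaxation means the portal check should accept any pending pallet bound for the same store as the next pallet in the queue. A hedged sketch (all names hypothetical):

def next_allowed(sequence, already_loaded, store_of):
    # sequence: pallet ids in planned loading order; store_of: pallet id -> store
    pending = [p for p in sequence if p not in already_loaded]
    if not pending:
        return set()
    store = store_of[pending[0]]
    return {p for p in pending if store_of[p] == store}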



FIG. 8 shows the sensor tower 310 that could be used, for example, at the doorway 80 at the loading dock of FIG. 7. FIG. 9 shows a portion of the sensor tower 310, partially broken away. The sensor tower 310 includes the housing 312, which supports, above the floor, the RFID reader 86 (which could be a UHF RFID reader), the presence sensor, such as a break beam sensor 316, and the camera 84 (which could be a depth camera, as above). The RFID reader 86, break beam sensor 316, and camera 84 may all be controlled by the DC computer 26. Alternatively, a local computer (e.g. in the tower 310) is programmed to control the operation of these devices and to communicate with the DC computer 26.


As shown in FIG. 10, the sensor tower 310 is positioned adjacent each doorway 80 at each loading station 34, with the RFID reader 86, break beam sensor 316 (which could be photo optic), and camera 84 all directed toward the doorway 80. The sensor tower 310 could also be mounted at any entrance or exit or any point where tracking asset movements would be beneficial. The display 82 is mounted near the doorway 80, such as above the doorway 80.


As also shown in FIG. 10, a forklift 328 (or pallet jack, pallet sled, or any machine for lifting and moving pallets) operated by an operator 330 is moving a pallet 22 having an RFID tag 94. The pallet 22 is loaded with products 20. As the loaded pallet 22 is moved through the doorway 80, it passes in front of the sensor tower 310.


The computer, such as the DC computer 26, the server 14, or a dedicated local computer (or some combination thereof), is programmed to perform the steps shown in FIGS. 11A and 11B. Referring to FIGS. 11A and 12, as the loaded pallet 22 passes through the doorway 80 (or as it approaches the doorway 80), the break beam sensor 316 detects presence in step 340, and the RFID reader 86 and the camera 84 are activated in steps 342 and 344, respectively. If the RFID reader 86 detects a tag 94 (FIGS. 11A and 13), the tag 94 is read in step 346 and checked against known tags. If the tag 94 is identified in the system in step 348, it is recorded in step 352. If the tag 94 is not identified, it is determined that there is no loading event in step 350. For example, perhaps a person or equipment passed in front of the break beam sensor 316 without a pallet 22.
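
Paraphrasing steps 340-352 as code, with hypothetical device and logging interfaces:

def on_presence(rfid_reader, camera, known_tags, log):
    camera.start_capture()              # step 344
    tag = rfid_reader.read()            # step 346; may return None
    if tag is None or tag not in known_tags:
        log.record("no loading event")  # step 350: e.g. a person walked by
        return None
    pallet_id = known_tags[tag]
    log.record(f"pallet {pallet_id} detected at portal")  # step 352
    return pallet_id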


Simultaneously with step 342, the camera 84 will start capturing images in step 356 (FIGS. 11A and 14). Two images taken at some short time interval apart (e.g. 1 second or less) are compared in step 358. Based upon the comparison of the two images, the direction of movement of the pallet 22, goods 20, and/or the lift 328 can be determined (such as by the DC computer 26, the server 14, or a local computer). It can also be determined by the computer whether the driver/operator 330 is in the image(s) in steps 364, 366. Referring to FIG. 11B, a person-shaped image within the image is identified in step 366. The person image is processed in step 368, e.g. via facial recognition. Alternatively, or additionally, the person may also have an RFID tag that can be read by the RFID reader 86. If a person is identified in step 370, then the known person is recorded in step 372. If not, then “person unknown” is recorded in step 374. The system may ensure that the person identified is authorized to be in that area and to handle those products. If the person is unknown or unauthorized, the system may sound an alarm and/or generate another alert.
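
One simple way to realize the direction determination in step 358 is to compare the pallet's bounding-box centroid in the two frames; the axis convention and threshold below are illustrative assumptions, not taken from the patent:

def direction_from_frames(box_t0, box_t1, min_shift_px=15):
    # boxes are (x, y, width, height); positive x is taken to point
    # through the doorway 80 toward the truck 18
    (x0, _, w0, _), (x1, _, w1, _) = box_t0, box_t1
    shift = (x1 + w1 / 2) - (x0 + w0 / 2)
    if shift > min_shift_px:
        return "outbound"    # onto the truck (recorded in step 362)
    if shift < -min_shift_px:
        return "inbound"     # off of the truck (recorded in step 362)
    return "unknown"         # recorded in step 360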


In step 358, the two (or more) images are compared. Based upon this comparison, it is determined whether a direction can be determined in step 376. If so, the direction of the movement is recorded in step 362. If not, then “direction unknown” is recorded in step 360. The system goes into waiting in step 354.


Referring to the example in FIG. 15, the system has determined the direction (outbound, i.e. onto the truck), the date/time, and the RFID of the pallet 22. The system may optionally also validate the load based upon the image(s) taken of the loaded pallet 22 (using the techniques described above but with the image(s) from the camera 84). In other words, the image(s) taken by the camera 84 could also operate as the validation station 32 described above, either instead of or in addition to the validation station 32. These images could be used to identify the products on the pallet 22. Alternatively, the image of the loaded pallet 22 could be compared by one of the computers to one or more of the images of the same loaded pallet 22 at the validation station 32 to make sure that there have been no changes (nothing has been removed or added). This could be done with or without specifically identifying every item on the pallet 22, e.g. just comparing the two images as a whole.
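
A whole-image comparison of that kind could be as simple as a normalized cross-correlation between the validation-station image and the dock image; this sketch assumes equally sized grayscale arrays and an illustrative threshold:

import numpy as np

def images_match(img_a, img_b, threshold=0.92):
    # high correlation suggests nothing was added to or removed from
    # the loaded pallet 22 between the two stations
    a = (img_a - img_a.mean()) / (img_a.std() + 1e-9)
    b = (img_b - img_b.mean()) / (img_b.std() + 1e-9)
    return float((a * b).mean()) >= threshold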


With the loaded pallet 22 identified by pallet RFID, and the direction (loading or unloading) determined, the system can determine that the particular pallet 22 is being loaded onto a correct truck or an incorrect truck based upon the loading assignments previously determined as described above. The system also determines whether the particular pallet 22 is being loaded in the correct or incorrect sequence by comparing it to the previously-determined loading sequence described above. If the pallet 22 is being loaded onto the wrong truck, or out of sequence, an alert is generated (visually, such as via the display 82, and/or audibly). The system can then verify that the same pallet 22 is subsequently unloaded from that truck based upon a determination that the pallet 22 is moved in the direction off the truck.
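
The mis-load bookkeeping described here amounts to a small state machine: flag a wrong-truck load, then clear the flag when the same pallet is seen moving back off the truck. A sketch under assumed data structures:

def handle_event(pallet_id, truck_id, direction, truck_for, misloads, alert):
    # truck_for: pallet id -> assigned truck; misloads: set of flagged pallets
    if direction == "outbound" and truck_for.get(pallet_id) != truck_id:
        misloads.add(pallet_id)
        alert(f"pallet {pallet_id} loaded onto wrong truck {truck_id}")
    elif direction == "inbound" and pallet_id in misloads:
        misloads.discard(pallet_id)   # correction confirmed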



FIGS. 16-18 show the system operating with respect to an inbound loaded pallet 22. In FIG. 16, the breakbeam sensor 316 is triggered. In FIG. 17, the RFID tag 94 is read by the RFID reader 86. In FIG. 18, the camera 84 takes a photo of the loaded pallet 22 and/or the driver/operator 330.


In FIG. 19 the system has determined that the loaded pallet was inbound, the date/time, the pallet id, and the identification of the operator.


Additional features for post processing can be implemented after events are recorded. Visual indicators can affirm or deny accuracy of asset movement. Additional audible alarms can be generated in cases where operator alerting is urgent or critical. Email/text alerts can be sent with photos of threshold events (e.g. a high-value asset being loaded onto an incorrect truck). Shipment claim processing can also be supported, such as photographic verification that items left the warehouse.


In accordance with the provisions of the patent statutes and jurisprudence, the exemplary configurations described above are considered to represent preferred embodiments of the inventions. However, it should be noted that the inventions can be practiced otherwise than as specifically illustrated and described without departing from their spirit or scope. Alphanumeric identifiers on method steps are solely for ease of reference in dependent claims, and such identifiers by themselves do not signify a required sequence of performance, unless otherwise explicitly specified.

Claims
  • 1. A delivery portal comprising: a sensor configured to detect a pallet as the pallet passes through or approaches the portal; and a computer programmed to receive information from the sensor and to identify the pallet based upon the information, the computer further programmed to compare the identified pallet to a list or database to determine if the identified pallet should be passing through the portal, wherein the delivery portal is proximate a loading dock, wherein the list or database indicates whether the identified pallet should be loaded onto a truck at the loading dock.
  • 2. The delivery portal of claim 1 wherein the sensor is an RFID sensor.
  • 3. The delivery portal of claim 1 wherein the list or database indicates a sequence for loading a plurality of pallets including the identified pallet onto the truck at the loading dock.
  • 4. The delivery portal of claim 3 further including a camera wherein the computer is programmed to receive images from the camera.
  • 5. The delivery portal of claim 3 wherein the computer is further programmed to generate an indication of whether the identified pallet should be loaded onto the truck.
  • 6. The delivery portal of claim 4 wherein the computer is programmed to identify a person moving the pallet through the portal.
  • 7. The delivery portal of claim 4 wherein the computer is programmed to determine a direction of travel of the pallet through the portal.
  • 8. The delivery portal of claim 7 wherein the computer determines the direction of travel based upon information from the camera.
  • 9. The delivery portal of claim 8 wherein the computer determines the direction of travel based upon a plurality of images from the camera.
  • 10. The delivery portal of claim 2 further including a presence sensor, wherein the computer is programmed to activate the RFID sensor based upon information from the presence sensor.
  • 11. The delivery portal of claim 10 wherein the presence sensor is a breakbeam sensor.
  • 12. The delivery portal of claim 1 further including a camera, wherein the computer is programmed to identify items on the pallet based upon an image from the camera.
  • 13. The delivery portal of claim 1 wherein the computer is further programmed to generate an indication of whether the identified pallet should be loaded onto the truck.
  • 14. The delivery portal of claim 13 wherein the sensor is an RFID sensor.
  • 15. A delivery portal sensor tower comprising: a housing at least partially defining a portal through which a loaded pallet can pass; an RFID sensor mounted to the housing; a camera mounted to the housing; a presence sensor mounted to the housing; and a computer in communication with the RFID sensor, the camera and the presence sensor, wherein the RFID sensor is configured to read an RFID tag of the loaded pallet as the loaded pallet passes through or approaches the RFID sensor and to cause the camera to generate an image based upon an indication of presence by the presence sensor.
  • 16. The delivery portal sensor tower of claim 15 wherein the computer is programmed to identify a plurality of items on a pallet based upon the image generated by the camera.
  • 17. The delivery portal sensor tower of claim 15 wherein the computer is programmed to identify the loaded pallet based upon the RFID sensor reading the RFID tag, and wherein the computer is programmed to compare the identified pallet to a list or database indicating whether the identified pallet should be passing the delivery portal sensor tower and to generate an indication whether the identified pallet should be passing the delivery portal sensor tower.
  • 18. A computerized method for operating a portal including: a) identifying a platform carrying a plurality of items stacked thereon as the platform is moving toward a truck; b) receiving the identity of the platform in a computer; c) the computer comparing the identified platform to a list or database indicating whether the identified platform should be loaded onto the truck; and d) based upon step c), the computer generating an indication whether the identified platform should be loaded onto the truck.
  • 19. The method of claim 18 wherein the platform is a pallet.
  • 20. The method of claim 19 wherein the list or database indicates a sequence of loading a plurality of pallets including the identified pallet and wherein step c) includes comparing the identified pallet to the list or database to determine whether others of the plurality of pallets on the list or database should be loaded onto the truck before the identified pallet.
  • 21. The method of claim 20 wherein said step a) includes reading an RFID tag on the pallet.
  • 22. The method of claim 21 further including the step of using a camera to image the platform and a person moving the platform.
  • 23. The method of claim 18 further including: e) determining a direction of movement of the platform relative to the truck; wherein step d) is performed based upon step e).
  • 24. The method of claim 18 further including the step of: using a camera to generate an image of the platform and a plurality of items on the platform, and identifying the plurality of items on the platform based upon the image.
  • 25. The method of claim 23 wherein in step e) the computer determines that the platform is moving toward the truck and wherein in step d) the computer generates an indication that the platform should not be loaded onto the truck based upon step c) and based upon step e).
  • 26. The method of claim 18 wherein in step d) the computer generates an indication that the identified platform should not be loaded onto the truck.
US Referenced Citations (203)
Number Name Date Kind
1086727 Palmer Feb 1914 A
5730252 Herbinet Mar 1998 A
6026378 Onozaki Feb 2000 A
6626634 Hwang et al. Sep 2003 B2
6721762 Levine et al. Apr 2004 B1
7097045 Winkler Aug 2006 B2
7548166 Roeder et al. Jun 2009 B2
7557714 Roeder et al. Jul 2009 B2
7602288 Broussard Oct 2009 B2
7698179 Leung et al. Apr 2010 B2
7739147 Branigan et al. Jun 2010 B2
7765668 Townsend et al. Aug 2010 B2
7865398 Schon Jan 2011 B2
7877164 Grunbach et al. Jan 2011 B2
7882366 Sen et al. Feb 2011 B2
8494673 Miranda et al. Jul 2013 B2
8718372 Holeva et al. May 2014 B2
8839132 Reichert Sep 2014 B2
8849007 Holeva et al. Sep 2014 B2
8885948 Holeva et al. Nov 2014 B2
8892241 Weiss Nov 2014 B2
8908995 Benos et al. Dec 2014 B2
8918976 Townsend et al. Dec 2014 B2
8934672 Holeva et al. Jan 2015 B2
8938126 Holeva et al. Jan 2015 B2
8965559 Pankratov et al. Feb 2015 B2
8977032 Holeva et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8995743 Holeva et al. Mar 2015 B2
9025827 Holeva et al. May 2015 B2
9025886 Holeva et al. May 2015 B2
9082195 Holeva et al. Jul 2015 B2
9087384 Holeva et al. Jul 2015 B2
9171278 Kong et al. Oct 2015 B1
9224120 Grabiner et al. Dec 2015 B2
9373098 Nashif et al. Jun 2016 B2
9488466 Hanson Nov 2016 B2
9488986 Solanki Nov 2016 B1
9489655 Lecky Nov 2016 B1
9503704 Ando Nov 2016 B2
9505554 Kong et al. Nov 2016 B1
9725195 Lancaster, III et al. Aug 2017 B2
9727840 Bernhardt Aug 2017 B2
9734367 Lecky et al. Aug 2017 B1
9811632 Grabiner et al. Nov 2017 B2
9821344 Zsigmond et al. Nov 2017 B2
9826213 Russell et al. Nov 2017 B1
9830485 Lecky Nov 2017 B1
9969572 Pankratov et al. May 2018 B2
9984339 Hance et al. May 2018 B2
9990535 Phillips et al. Jun 2018 B2
10005581 Lancaster, III et al. Jun 2018 B2
10026044 Wurman et al. Jul 2018 B1
10042079 Patnaik Aug 2018 B2
10055805 Satou Aug 2018 B2
10071856 Hance et al. Sep 2018 B2
10089509 Nachtrieb Oct 2018 B2
10133990 Hance et al. Nov 2018 B2
10134120 Jovanovski et al. Nov 2018 B2
10140724 Benos et al. Nov 2018 B2
10155199 Sakai et al. Dec 2018 B2
10198805 Halata Feb 2019 B2
10217075 Ward et al. Feb 2019 B1
10227152 Lancaster, III et al. Mar 2019 B2
10229487 Goyal et al. Mar 2019 B2
10265871 Hance et al. Apr 2019 B2
10266349 Pankratov et al. Apr 2019 B2
10328578 Holz Jun 2019 B2
10363664 Yoshii Jun 2019 B2
10346987 Landman Jul 2019 B1
10347095 Mattingly et al. Jul 2019 B2
10369701 Diankov et al. Aug 2019 B1
10402956 Jovanovski et al. Sep 2019 B2
10430969 Kopelke et al. Oct 2019 B2
10442640 Pankratov et al. Oct 2019 B2
10456915 Diankov Oct 2019 B1
10482401 Wurman et al. Nov 2019 B2
10491881 Russell et al. Nov 2019 B1
10504343 Mattingly et al. Dec 2019 B2
10518973 Hance et al. Dec 2019 B2
10562188 Diankov et al. Feb 2020 B1
10562189 Diankov et al. Feb 2020 B1
10569416 Diankov Feb 2020 B1
10569417 Diankov Feb 2020 B1
10576631 Diankov Mar 2020 B1
10592842 High et al. Mar 2020 B2
10596701 Diankov Mar 2020 B1
10607182 Shah et al. Mar 2020 B2
10614319 Douglas et al. Apr 2020 B2
10616553 Russell et al. Apr 2020 B1
10618172 Diankov et al. Apr 2020 B1
10621457 Schimmel Apr 2020 B2
10627244 Lauka et al. Apr 2020 B1
10628763 Hance et al. Apr 2020 B2
10643038 McCalib, Jr. et al. May 2020 B1
10643170 Lee et al. May 2020 B2
10655945 Nanda May 2020 B2
10679379 Diankov et al. Jun 2020 B1
10685197 Plummer et al. Jun 2020 B2
10724973 Paresi Jun 2020 B2
10703584 Diankov et al. Jul 2020 B2
10703585 Pankratov et al. Jul 2020 B2
10706571 Sugimura et al. Jul 2020 B2
10759599 Hance et al. Sep 2020 B2
10769806 Driegen et al. Sep 2020 B2
10796423 Goja Oct 2020 B2
10845184 Benos et al. Nov 2020 B2
10845499 Paresi Nov 2020 B2
10867275 Dholakia et al. Dec 2020 B1
10984207 Sone Apr 2021 B2
11046519 Martin, Jr. et al. Jun 2021 B2
11087273 Bergamo Aug 2021 B1
11227458 Farah Jan 2022 B1
20040069850 De Wilde Apr 2004 A1
20040220694 Stingel, III et al. Nov 2004 A1
20050071234 Schon Mar 2005 A1
20050246056 Marks et al. Nov 2005 A1
20060187041 Olsen, III Aug 2006 A1
20060242820 Townsend et al. Nov 2006 A1
20060255949 Roeder et al. Nov 2006 A1
20060255950 Roeder et al. Nov 2006 A1
20070126578 Broussard Jun 2007 A1
20070156281 Leung et al. Jul 2007 A1
20070163099 Townsend et al. Jul 2007 A1
20100202702 Benos et al. Aug 2010 A1
20120057022 Nechiporenko et al. Mar 2012 A1
20120175412 Grabiner et al. Jul 2012 A1
20120274784 Hofman Nov 2012 A1
20130101166 Holeva et al. Apr 2013 A1
20130101167 Holeva et al. Apr 2013 A1
20130101173 Holeva et al. Apr 2013 A1
20130101201 Holeva et al. Apr 2013 A1
20130101202 Holeva et al. Apr 2013 A1
20130101203 Holeva et al. Apr 2013 A1
20130101204 Holeva et al. Apr 2013 A1
20130101227 Holeva et al. Apr 2013 A1
20130101228 Holeva et al. Apr 2013 A1
20130101229 Holeva et al. Apr 2013 A1
20130101230 Holeva et al. Apr 2013 A1
20130282165 Pankratov et al. Oct 2013 A1
20140197926 Nikitin Jul 2014 A1
20150102100 Hattrup et al. Apr 2015 A1
20150149946 Benos et al. May 2015 A1
20150166272 Pankratov et al. Jun 2015 A1
20150325013 Patnaik Nov 2015 A1
20160104274 Jovanovski et al. Apr 2016 A1
20160104290 Patnaik Apr 2016 A1
20160110630 Heusch et al. Apr 2016 A1
20160154939 Grabiner et al. Jun 2016 A1
20160275441 Barber et al. Sep 2016 A1
20160371512 Hattrup et al. Dec 2016 A1
20170076469 Sonoura et al. Mar 2017 A1
20170132773 Toedtli May 2017 A1
20170154397 Satou Jun 2017 A1
20170161673 High et al. Jun 2017 A1
20170193432 Bernhardt Jul 2017 A1
20170316253 Phillips et al. Nov 2017 A1
20180025185 Hattrup et al. Jan 2018 A1
20180029797 Hance et al. Feb 2018 A1
20180043547 Hance et al. Feb 2018 A1
20180060630 Nachtrieb Mar 2018 A1
20180060764 Hance et al. Mar 2018 A1
20180089517 Douglas et al. Mar 2018 A1
20180218247 Lee et al. Aug 2018 A1
20180224569 Paresi Aug 2018 A1
20180225597 Hance et al. Aug 2018 A1
20180247404 Goyal et al. Aug 2018 A1
20180253857 Driegen et al. Sep 2018 A1
20180257879 Pankratov et al. Sep 2018 A1
20180273226 Lancaster, III et al. Sep 2018 A1
20180304468 Holz Oct 2018 A1
20180322424 Wurman et al. Nov 2018 A1
20180370046 Hance et al. Dec 2018 A1
20180370727 Hance et al. Dec 2018 A1
20190005668 Sugimura et al. Jan 2019 A1
20190034839 Hance et al. Jan 2019 A1
20190041341 Paresi Feb 2019 A1
20190049234 Benos et al. Feb 2019 A1
20190102874 Goja Apr 2019 A1
20190122173 Souder et al. Apr 2019 A1
20190156086 Plummer et al. May 2019 A1
20190026878 Jovanovski et al. Jun 2019 A1
20190206059 Landman Jul 2019 A1
20190220990 Goja et al. Jul 2019 A1
20190248604 Pankratov et al. Aug 2019 A1
20190295385 Mattingly et al. Sep 2019 A1
20200039765 Pankratov et al. Feb 2020 A1
20200087068 Hance et al. Mar 2020 A1
20200104785 Ehrman et al. Apr 2020 A1
20200105008 Ehrman et al. Apr 2020 A1
20200130961 Diankov et al. Apr 2020 A1
20200130962 Yu et al. Apr 2020 A1
20200134828 Diankov et al. Apr 2020 A1
20200134830 Yu et al. Apr 2020 A1
20200238517 Diankov Jul 2020 A1
20200238519 Diankov et al. Jul 2020 A1
20200273131 Martin, Jr. et al. Aug 2020 A1
20200294244 Diankov et al. Sep 2020 A1
20200302243 Fryshman Sep 2020 A1
20200311362 Plummer et al. Oct 2020 A1
20210082220 Boerger Mar 2021 A1
20210133666 Eckman May 2021 A1
20210198042 Martin, Jr. et al. Jul 2021 A1
Foreign Referenced Citations (2)
Number Date Country
20100051156 May 2010 KR
2010123458 Oct 2010 WO
Non-Patent Literature Citations (1)
Entry
International Search Report for International Application No. PCT/US2021/016007 dated Apr. 9, 2020.
Related Publications (1)
Number Date Country
20210326544 A1 Oct 2021 US
Provisional Applications (1)
Number Date Country
63012669 Apr 2020 US