A three-dimensional (3D) scanner is a device that analyzes a real-world object or environment to collect data on its shape and appearance. The data collected is then used to generate a digital three-dimensional (3D) model. One example of a 3D scanner is a non-contact scanner that emits light, such as infrared light, and detects the light reflected from the object being scanned.
The discussion above is merely provided for general background information and is not intended to be used as an aid in determining the scope of the claimed subject matter.
A scanning system for a retail store includes a plurality of electric beacons located in aisles of a retail store. A first electric beacon is located at a first end of an aisle and a second electric beacon is located at a second end of the aisle. A plurality of shopping carts having on board 3D scanning components are configured to scan items along aisles in the retail store, collect and process 3D information and transmit the 3D information to a store server. Each of the plurality of shopping carts includes 3D cameras for 3D scanning, sensors for sensing the electric beacons, at least one processor and a communications module for remotely communicating with the store server. The store server is configured to gather the 3D information transmitted from the plurality of shopping carts to generate a 3D model of the retail store. Each shopping cart begins collecting 3D information when the shopping cart senses one of the first or the second electric beacon in the aisle and stops collecting 3D information when the shopping cart senses the other of the first or the second electric beacon in the aisle.
A method of scanning an aisle in a retail store is provided. An indication is received from a sensor indicating that a shopping cart having on board 3D scanning components has entered an aisle. It is determined whether a second shopping cart having on board 3D scanning components is located within a threshold radius from the shopping cart, whether the second shopping cart is heading in a direction towards the shopping cart and whether a threshold amount of time has elapsed since the aisle was last 3D scanned. The aisle is 3D scanned if there are no other shopping carts within the threshold radius and the aisle has not been scanned within the threshold amount of time, or if the second shopping cart is within the threshold radius but is heading in an opposing direction and the aisle has not been scanned within the threshold amount of time.
A scanning system for a retail store includes a plurality of item electric beacons located in proximity to different types of items in an aisle. Each of the item electric beacons is activated when the type of item is out of stock or low on stock. A plurality of shopping carts have on board scanning components for sensing the electric beacons in the aisle. Each of the plurality of shopping carts includes at least one visual camera for taking photographs, sensors for sensing the item electric beacons, at least one processor and a communications module for remotely communicating with a store server. When one of the sensors senses one of the electric beacons, the visual camera takes a photograph of the type of item and its surrounding area and transmits the photograph to the store server so that stock can be replenished.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The claimed subject matter is not limited to implementations that solve any or all disadvantages noted in the background.
Described below is a system in which a fleet of customer-driven shopping carts equipped with 3D scanning equipment and components, when pushed around a retail store, provides real-time data for the construction and updating of a 3D map or model for implementation in a 3D virtual shopping environment. A virtual shopping environment may be accessed in many different ways, including being published to a website or being placed in a virtual reality application. In this way, a customer may shop in the retail store in real time using a user device, such as a mobile device (including a virtual reality device mounted to a person's head) or a personal computer, without the customer actually having to be in the store. Another way to utilize the generated 3D map or model is to give it to store planners or in-store team members who need to modify 3D space in the store. In yet another embodiment, the fleet of customer-driven shopping carts equipped with 3D scanning equipment or components can be used to detect out-of-stock items in the aisles and then instantly notify team members in the store when items need to be quickly replenished.
To generate a 3D map or model, each customer-driven shopping cart in the fleet is equipped with 3D cameras and other types of sensors and cameras located in sensor pods or packages on a front of the shopping cart and on the sides of the shopping cart. Each customer-driven shopping cart may also be equipped with a graphic processing unit, a central processing unit, a memory, a Wi-Fi module, an application running on the central processing unit to connect to a cloud, an accelerometer, a gyroscope, LIDAR (Light Detection and Ranging) and a power pack. Many of these components will be hidden underneath the bottom of the cart and will provide orientation information, obstacle detection and the compass direction in which the cart is headed.
In one embodiment and with reference to
In this embodiment, shopping cart 210 also includes a pair of right side sensor pods or right side sensor packages 218a and 218b and a pair of left side sensor pods or left side sensor packages 220a and 220b (shown in
Optionally each of the 3D scanning shopping carts 210 can be equipped with a LIDAR unit 240. This unit can assist in autonomously driving the 3D scanning cart around the store.
Each of the 3D scanning shopping carts 210 further includes a graphic processing unit (GPU) 226, a central processing unit (CPU) 228, a memory 230 having application 234 to connect to a server, a communications or Wi-Fi module 232, an accelerometer 236 and a gyroscope 238 as well as a separate power unit 242 for powering the various equipment and components on board 3D scanning shopping cart 210. For example, power unit 242 may be a rechargeable battery or battery pack. In accordance with one embodiment, all of these components 226, 228, 230, 232, 234, 236, 238, 240 and 242 are mounted to the bottom of the cart and out of view from customers pushing cart 210.
System 100 includes a store server 244 in communication with a plurality of 3D scanning shopping carts 210a and 210b using network 248. 3D data is received from the plurality of 3D scanning shopping carts 210a and 210b and is offloaded by store server 244 to a central server 246 via a network 249. Central server 246 acts as a central repository of 3D maps or models of each store in a chain of retail stores to provide virtual shopping environments to customers. As previously discussed, not only do the components and hardware on board the 3D scanning carts collect and process data to generate a 3D map, but the 3D cameras regularly collect new data to update the 3D maps. In accordance with some embodiments, the 3D maps are updated several times per day.
In particular, each of the pair of right side sensor packages 218a and 218b and each of the pair of left side sensor packages 220a and 220b include IR cameras for sensing the strip of IR illuminators or IR beacon to determine bearing. Upon sensing the IR beacon, CPU 228 on board the cart recognizes the space it is entering and the specific display area or display areas that it will be 3D scanning. For example, the right side sensor packages 218a and 218b will sense the display area on one side of the cart and left side sensor packages 220a and 220b will sense the display area on the other side of the cart. In
As soon as the 3D scanning cart 210a senses an infrared beacon, like infrared beacons 461a and 462a, CPU 228 knows which aisle in the store the cart has entered and proceeds to determine whether this particular aisle should be 3D scanned. First, 3D scanning cart 210a communicates with the other carts in the fleet that also have 3D scanning capability to share its location and determine the locations of those other carts. Second, CPU 228 checks with store server 244 to determine when 3D data for the aisle was last collected.
In one embodiment, to determine location of other 3D scanning carts, Wi-Fi triangulation is used. As illustrated in
3D scanning carts, such as carts 210a and 210b, include Wi-Fi module 232, accelerometer 236 and gyroscope 238. Together these components are used to determine a position of a 3D scanning cart and a direction the cart is heading. Using Wi-Fi module 232, CPU 228 is capable of informing other 3D scanning carts about the cart's position and direction via network 248 and store server 244. In particular, a distance between the cart and at least three Wi-Fi access points 470, 472 and 474 is determined. This is accomplished by measuring the power present in a received Wi-Fi signal (i.e., Wi-Fi signal strength) from each of the Wi-Fi access points. With the known distances between the cart and the Wi-Fi access points, trilateration algorithms may be used to determine the relative position of the cart using the known positions of the access points as references. Alternatively, the angle of arrival of signals at the cart can be employed to determine the cart's location based on triangulation algorithms. A combination of both of these techniques may be used to increase the accuracy of the system. It should be realized that other indoor positioning systems are possible, including systems making use of optical, radio and acoustic technologies. When a location of the 3D scanning cart is determined, it is sent to store server 244 and frequently updated so that it may be shared with other 3D scanning carts.
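The trilateration technique described above can be sketched as follows. This is an illustrative example only: the function name, the 2D coordinate system and the closed-form linearization are assumptions, not part of the described system, and a real deployment would fit noisy signal-strength distances over more than three access points.

```python
# Illustrative trilateration sketch: given the known 2D positions of three
# Wi-Fi access points and estimated distances to each (derived from signal
# strength), solve for the cart's position by subtracting the first circle
# equation from the other two, which yields a 2x2 linear system.

def trilaterate(aps, dists):
    """aps: three (x, y) access-point positions; dists: distances to each."""
    (x0, y0), (x1, y1), (x2, y2) = aps
    d0, d1, d2 = dists
    # Linear system A * (x, y) = b after eliminating the quadratic terms.
    a11, a12 = 2 * (x1 - x0), 2 * (y1 - y0)
    a21, a22 = 2 * (x2 - x0), 2 * (y2 - y0)
    b1 = d0 ** 2 - d1 ** 2 + x1 ** 2 - x0 ** 2 + y1 ** 2 - y0 ** 2
    b2 = d0 ** 2 - d2 ** 2 + x2 ** 2 - x0 ** 2 + y2 ** 2 - y0 ** 2
    det = a11 * a22 - a12 * a21  # assumes the access points are not collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```

For example, with access points at (0, 0), (10, 0) and (0, 10) and exact distances measured from a cart at (3, 4), the function recovers (3, 4).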
At block 304 in
If no 3D scanning carts are located within the threshold radius or if there is another 3D scanning cart within the threshold radius but that cart is not headed towards cart 210a, then method 300 passes to block 308. At block 308, CPU 228 requests information from store server 244 regarding the aisle that cart 210a has entered. For example, in regards to
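The decision logic of blocks 304 and 308 can be sketched as follows. The function and parameter names, the default threshold values and the dot-product heading test are illustrative assumptions, not part of the described method.

```python
import math
import time

# Illustrative sketch of the scan/no-scan decision: a cart 3D scans an
# aisle only if the aisle has not been scanned within a threshold time
# and no other 3D scanning cart is both within a threshold radius and
# heading toward this cart.

def should_scan(my_pos, other_carts, last_scan_time,
                threshold_radius=15.0, threshold_seconds=3600, now=None):
    """my_pos: (x, y); other_carts: list of ((x, y), (hx, hy)) tuples,
    where (hx, hy) is a heading vector; last_scan_time: epoch seconds."""
    now = time.time() if now is None else now
    # Skip if the aisle was already scanned within the threshold time.
    if now - last_scan_time < threshold_seconds:
        return False
    for (ox, oy), (hx, hy) in other_carts:
        dist = math.hypot(my_pos[0] - ox, my_pos[1] - oy)
        # A positive dot product means the other cart is moving toward us.
        toward = hx * (my_pos[0] - ox) + hy * (my_pos[1] - oy) > 0
        if dist < threshold_radius and toward:
            return False
    return True
```

A cart within the radius but heading in an opposing direction does not block the scan, matching the behavior described above.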
At the end of the aisle, strips of infrared illuminators or IR beacons 461b and 462b are also detected by right side sensor packages 218a and 218b and left side sensor packages 220a and 220b to indicate to CPU 228 that cart 210a is exiting the aisle. In particular, sensing the infrared illuminators or IR beacons at the end of the aisle causes CPU 228 to stop collecting data and to instruct the 3D structured-light cameras and the stereoscopic cameras to power down. By turning off the 3D structured-light cameras and the stereoscopic cameras, power provided by power unit 242 can be conserved.
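The beacon-driven start and stop of scanning described above can be sketched as a simple state machine. The class name, beacon identifiers and return values below are illustrative assumptions.

```python
# Illustrative sketch: scanning starts when the cart senses the beacon at
# one end of an aisle and stops (powering the cameras down to conserve
# power) when it senses the beacon at the other end.

class ScanController:
    def __init__(self):
        self.scanning = False
        self.entry_beacon = None

    def on_beacon(self, beacon_id):
        if not self.scanning:
            # Entering the aisle: remember which end was sensed and start.
            self.entry_beacon = beacon_id
            self.scanning = True
            return "start"
        if beacon_id != self.entry_beacon:
            # Reached the opposite end of the aisle: stop collecting data.
            self.scanning = False
            self.entry_beacon = None
            return "power_down_cameras"
        return "ignore"  # re-sensed the same entry beacon
```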
As 3D data is collected, GPU 226 processes and analyzes the 3D data to construct a 3D model of the display areas scanned by the cart. In particular, GPU 226 aligns multiple frames of 3D data and, from the aligned frames, forms a 3D model of the 3D space filled with the objects captured in the 3D data. CPU 228 sends the 3D models to store server 244. From there, store server 244 compiles data from all 3D scanning carts located in the retail store into a 3D model or map of the entire retail store and sends the compiled data to central server 246 over network 249. Network 249 can comprise different types of computer networks including the Internet, a LAN and a WAN. Central server 246 is the repository for 3D models and maps of the retail stores in a chain of retail stores. Each 3D model or map is accessible by a customer so that the customer may select a 3D map or model of a particular store, virtually shop in that store, add items to a virtual shopping cart, purchase those items and have them shipped to a different store in the chain of retail stores or to an address, such as a personal home address.
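The server-side compilation step can be sketched as follows. The data layout, with per-aisle models keyed by an aisle identifier and timestamped, is an illustrative assumption; the document does not specify how partial models from different carts are merged.

```python
# Illustrative sketch: the store server keeps the freshest 3D model of
# each aisle, replacing an aisle's model only when a cart submits a newer
# scan; the compiled store map is then the collection of aisle models.

class StoreMap:
    def __init__(self):
        self.aisles = {}  # aisle_id -> (timestamp, model)

    def submit_scan(self, aisle_id, timestamp, model):
        current = self.aisles.get(aisle_id)
        if current is None or timestamp > current[0]:
            self.aisles[aisle_id] = (timestamp, model)
            return True   # accepted: newer than the stored scan
        return False      # rejected: stale scan from another cart

    def compiled_map(self):
        return {aisle: model for aisle, (_, model) in self.aisles.items()}
```

This also reflects the frequent updates mentioned above: carts may rescan an aisle several times per day, and only newer data replaces the stored model.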
In another embodiment, 3D scanning system 200 includes an out-of-stock detection feature. Each cart 210a and 210b with 3D scanning capability is also equipped to detect out-of-stock events. Out-of-stock events are events where there is no product remaining on the shelf or where the amount of a product on the shelf is low.
In the embodiment illustrated in
Upon sensing the infrared light, certain visual cameras having fish-eye lenses on board 3D scanning cart 210a will take a zoomed-out snapshot of the aisle and the empty shelf. This zoomed-out photo is sent to an in-store team member via store server 244 to notify them that the product is out of stock. A further advantage of taking a snapshot of the aisle is to show whether other products in that same aisle or shelf are also being depleted, allowing for more efficient restocking of all products. It is also possible for the cart to scan a barcode of the price label on the shelf where the depleted product is located and send the barcode information to an in-store team member via store server 244.
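The out-of-stock notification flow described above can be sketched as follows. The function names, the camera and barcode-reader interfaces and the message format are all illustrative assumptions rather than part of the described system.

```python
# Illustrative sketch of the out-of-stock flow: when a cart senses an
# activated item beacon, it captures a zoomed-out snapshot of the aisle,
# optionally scans the shelf-label barcode, and forwards both to the
# store server so an in-store team member can replenish the item.

def handle_item_beacon(beacon_id, camera, barcode_reader, server):
    snapshot = camera.take_snapshot()            # zoomed-out aisle photo
    barcode = barcode_reader.scan_shelf_label()  # may return None if unreadable
    server.send({
        "event": "out_of_stock",
        "beacon": beacon_id,
        "photo": snapshot,
        "barcode": barcode,
    })
```

Sending the whole-aisle photo rather than a close-up matches the stated advantage: the team member can see at a glance whether neighboring products are also running low.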
As illustrated in
In additional embodiments, upon sensing that an item is out of stock, not only can a message be sent to in-store team members requesting that the out-of-stock item be replenished, but a message may also be sent to all 3D scanning carts in the fleet. The message may indicate which item is currently out of stock. In addition, the threshold time discussed in connection with
Although elements have been shown or described as separate embodiments above, portions of each embodiment may be combined with all or part of other embodiments described above.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
The present application is based on and claims the benefit of U.S. provisional patent application Ser. No. 62/384,321, filed Sep. 7, 2016, the content of which is hereby incorporated by reference in its entirety.