Motorized transport unit worker support systems and methods

Information

  • Patent Grant
  • Patent Number
    10,239,739
  • Date Filed
    Friday, March 4, 2016
  • Date Issued
    Tuesday, March 26, 2019
Abstract
Some embodiments provide systems and methods to assist product stocking on a sales floor of a retail shopping facility. In some implementations, a system comprises a plurality of motorized transport units that are each configured to perform multiple different types of tasks at a retail shopping facility; and a central computer system configured to coordinate the plurality of motorized transport units in performing the multiple different tasks, comprising: instructing a motorized transport unit to retrieve a specified stocking cart that is carrying a plurality of products that are to be restocked onto product supports that are positioned on the sales floor where customers travel in shopping for products, and further instructing the motorized transport unit to autonomously transport the stocking cart to a specified stocking location on the sales floor corresponding to at least one of the plurality of products carried by the stocking cart.
Description
TECHNICAL FIELD

These teachings relate generally to shopping environments and more particularly to devices, systems and methods for assisting workers and/or customers in those shopping environments.


BACKGROUND

In a modern retail store environment, there is a need to improve the customer experience and/or convenience for the customer. Whether shopping in a large format (big box) store or smaller format (neighborhood) store, customers often require assistance that employees of the store are not always able to provide. For example, particularly during peak hours, there may not be enough employees available to assist customers such that customer questions go unanswered. Additionally, due to high employee turnover rates, available employees may not be fully trained or have access to information to adequately support customers. Other routine tasks also are difficult to keep up with, particularly during peak hours. For example, shopping carts are left abandoned, aisles become messy, inventory is not displayed in the proper locations or is not even placed on the sales floor, shelf prices may not be properly set, and theft is hard to discourage. All of these issues can result in low customer satisfaction or reduced convenience to the customer. With increasing competition from non-traditional shopping mechanisms, such as online shopping provided by e-commerce merchants and alternative store formats, it can be important for “brick and mortar” retailers to focus on improving the overall customer experience and/or convenience.





BRIEF DESCRIPTION OF THE DRAWINGS

The above needs are at least partially met through provision of embodiments of systems, devices, and methods designed to provide assistance to workers and/or customers in a shopping facility, such as described in the following detailed description, particularly when studied in conjunction with the drawings, wherein:



FIG. 1 comprises a block diagram of a shopping assistance system as configured in accordance with various embodiments of these teachings;



FIGS. 2A and 2B are illustrations of a motorized transport unit of the system of FIG. 1 in a retracted orientation and an extended orientation in accordance with some embodiments;



FIGS. 3A and 3B are illustrations of the motorized transport unit of FIGS. 2A and 2B detachably coupling to a movable item container, such as a shopping cart, in accordance with some embodiments;



FIG. 4 comprises a block diagram of a motorized transport unit as configured in accordance with various embodiments of these teachings;



FIG. 5 comprises a block diagram of a computer device as configured in accordance with various embodiments of these teachings;



FIG. 6 illustrates a simplified block diagram of some components of an exemplary shopping facility assistance system, in accordance with some embodiments;



FIG. 7 illustrates a simplified flow diagram of an exemplary process of assisting in stocking of products, in accordance with some embodiments; and



FIG. 8 illustrates a simplified flow diagram of an exemplary process of controlling the movement of motorized transport units in supporting stocking at a shopping facility, in accordance with some embodiments.





Elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions and/or relative positioning of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of various embodiments of the present teachings. Also, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present teachings. Certain actions and/or steps may be described or depicted in a particular order of occurrence while those skilled in the art will understand that such specificity with respect to sequence is not actually required. The terms and expressions used herein have the ordinary technical meaning as is accorded to such terms and expressions by persons skilled in the technical field as set forth above except where different specific meanings have otherwise been set forth herein.


DETAILED DESCRIPTION

The following description is not to be taken in a limiting sense, but is made merely for the purpose of describing the general principles of exemplary embodiments. Reference throughout this specification to “one embodiment,” “an embodiment,” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment,” “in an embodiment,” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.


Generally speaking, pursuant to various embodiments, systems, devices and methods are provided for assistance of persons at a shopping facility. Generally, assistance may be provided to customers or shoppers at the facility and/or to workers at the facility. The facility may be any type of shopping facility at a location in which products for display and/or for sale are variously distributed throughout the shopping facility space. The shopping facility may be a retail sales facility, or any other type of facility in which products are displayed and/or sold. The shopping facility may include one or more of sales floor areas, checkout locations, parking locations, entrance and exit areas, stock room areas, stock receiving areas, hallway areas, common areas shared by merchants, and so on. Generally, a shopping facility includes areas that may be dynamic in terms of the physical structures occupying the space or area and the objects, items, machinery and/or persons moving in the area. For example, the shopping area may include product storage units, shelves, racks, modules, bins, etc., and other walls, dividers, partitions, etc. that may be configured in different layouts or physical arrangements. In another example, persons or other movable objects may be freely and independently traveling through the shopping facility space. In yet another example, the persons or movable objects move according to known travel patterns and timing. The facility may be of any size or format, and may include products from one or more merchants. For example, a facility may be a single store operated by one merchant or may be a collection of stores covering multiple merchants such as a mall. Generally, the system makes use of automated, robotic mobile devices, e.g., motorized transport units, that are capable of self-powered movement through a space of the shopping facility and of providing any number of functions. Movement and operation of such devices may be controlled by a central computer system or may be autonomously controlled by the motorized transport units themselves. Various embodiments provide one or more user interfaces to allow various users to interact with the system, including the automated mobile devices, and/or to directly interact with the automated mobile devices. In some embodiments, the automated mobile devices and the corresponding system serve to enhance a customer shopping experience in the shopping facility, e.g., by assisting shoppers and/or workers at the facility.


In some embodiments, a shopping facility personal assistance system comprises: a plurality of motorized transport units located in and configured to move through a shopping facility space; a plurality of user interface units, each corresponding to a respective motorized transport unit during use of the respective motorized transport unit; and a central computer system having a network interface such that the central computer system wirelessly communicates with one or both of the plurality of motorized transport units and the plurality of user interface units, wherein the central computer system is configured to control movement of the plurality of motorized transport units through the shopping facility space based at least on inputs from the plurality of user interface units.


System Overview


Referring now to the drawings, FIG. 1 illustrates embodiments of a shopping facility assistance system 100 that can serve to carry out at least some of the teachings set forth herein. It will be understood that the details of this example are intended to serve in an illustrative capacity and are not necessarily intended to suggest any limitations as regards the present teachings. It is noted that generally, FIGS. 1-5 describe the general functionality of several embodiments of a system, and FIGS. 6-8 expand on some functionalities of some embodiments of the system and/or embodiments independent of such systems.


In the example of FIG. 1, a shopping assistance system 100 is implemented in whole or in part at a shopping facility 101. Generally, the system 100 includes one or more motorized transport units (MTUs) 102; one or more item containers 104; a central computer system 106 having at least one control circuit 108, at least one memory 110 and at least one network interface 112; at least one user interface unit 114; a location detection system 116; at least one video camera 118; at least one motorized transport unit (MTU) dispenser 120; at least one motorized transport unit (MTU) docking station 122; at least one wireless network 124; at least one database 126; at least one user interface computer device 128; an item display module 130; and a locker or an item storage unit 132. It is understood that more or fewer of such components may be included in different embodiments of the system 100.


These motorized transport units 102 are located in the shopping facility 101 and are configured to move throughout the shopping facility space. Further details regarding such motorized transport units 102 appear further below. Generally speaking, these motorized transport units 102 are configured to either comprise, or to selectively couple to, a corresponding movable item container 104. A simple example of an item container 104 would be a shopping cart as one typically finds at many retail facilities, or a rocket cart, a flatbed cart or any other mobile basket or platform that may be used to gather items for potential purchase.


In some embodiments, these motorized transport units 102 wirelessly communicate with, and are wholly or largely controlled by, the central computer system 106. In particular, in some embodiments, the central computer system 106 is configured to control movement of the motorized transport units 102 through the shopping facility space based on a variety of inputs. For example, the central computer system 106 communicates with each motorized transport unit 102 via the wireless network 124, which may be one or more wireless networks of one or more wireless network types (such as a wireless local area network, a wireless personal area network, a wireless mesh network, a wireless star network, a wireless wide area network, a cellular network, and so on) capable of providing wireless coverage of the desired range of the motorized transport units 102, according to any known wireless protocols, including but not limited to a cellular, Wi-Fi, Zigbee or Bluetooth network.


By one approach the central computer system 106 is a computer-based device and includes at least one control circuit 108, at least one memory 110 and at least one wired and/or wireless network interface 112. Such a control circuit 108 can comprise a fixed-purpose hard-wired platform or can comprise a partially or wholly programmable platform, such as a microcontroller, an application specific integrated circuit, a field programmable gate array, and so on. These architectural options are well known and understood in the art and require no further description here. This control circuit 108 is configured (for example, by using corresponding programming stored in the memory 110 as will be well understood by those skilled in the art) to carry out one or more of the steps, actions, and/or functions described herein.


In this illustrative example the control circuit 108 operably couples to one or more memories 110. The memory 110 may be integral to the control circuit 108 or can be physically discrete (in whole or in part) from the control circuit 108 as desired. This memory 110 can also be local with respect to the control circuit 108 (where, for example, both share a common circuit board, chassis, power supply, and/or housing) or can be partially or wholly remote with respect to the control circuit 108 (where, for example, the memory 110 is physically located in another facility, metropolitan area, or even country as compared to the control circuit 108).


This memory 110 can serve, for example, to non-transitorily store the computer instructions that, when executed by the control circuit 108, cause the control circuit 108 to behave as described herein. (As used herein, this reference to “non-transitorily” will be understood to refer to a non-ephemeral state for the stored contents (and hence excludes when the stored contents merely constitute signals or waves) rather than volatility of the storage media itself, and hence includes both non-volatile memory (such as read-only memory (ROM)) and volatile memory (such as an erasable programmable read-only memory (EPROM)).)


Additionally, at least one database 126 may be accessible by the central computer system 106. Such databases may be integrated into the central computer system 106 or separate from it. Such databases may be at the location of the shopping facility 101 or remote from the shopping facility 101. Regardless of location, the databases comprise memory to store and organize certain data for use by the central computer system 106. In some embodiments, the at least one database 126 may store data pertaining to one or more of: shopping facility mapping data, customer data, customer shopping data and patterns, inventory data, product pricing data, and so on.


In this illustrative example, the central computer system 106 also wirelessly communicates with a plurality of user interface units 114. These teachings will accommodate a variety of user interface units including, but not limited to, mobile and/or handheld electronic devices such as so-called smart phones and portable computers such as tablet/pad-styled computers. Generally speaking, these user interface units 114 should be able to wirelessly communicate with the central computer system 106 via a wireless network, such as the wireless network 124 of the shopping facility 101 (such as a Wi-Fi wireless network). These user interface units 114 generally provide a user interface for interaction with the system. In some embodiments, a given motorized transport unit 102 is paired with, associated with, assigned to or otherwise made to correspond with a given user interface unit 114. In some embodiments, these user interface units 114 should also be able to receive verbally-expressed input from a user and forward that content to the central computer system 106 or a motorized transport unit 102 and/or convert that verbally-expressed input into a form useful to the central computer system 106 or a motorized transport unit 102.


By one approach at least some of the user interface units 114 belong to corresponding customers who have come to the shopping facility 101 to shop. By another approach, in lieu of the foregoing or in combination therewith, at least some of the user interface units 114 belong to the shopping facility 101 and are loaned to individual customers to employ as described herein. In some embodiments, one or more user interface units 114 are attachable to a given movable item container 104 or are integrated with the movable item container 104. Similarly, in some embodiments, one or more user interface units 114 may be those of shopping facility workers, belong to the shopping facility 101 and are loaned to the workers, or a combination thereof.


In some embodiments, the user interface units 114 may be general purpose computer devices that include computer programming code to allow them to interact with the system 100. For example, such programming may be in the form of an application installed on the user interface unit 114 or in the form of a browser that displays a user interface provided by the central computer system 106 or other remote computer or server (such as a web server). In some embodiments, one or more user interface units 114 may be special purpose devices that are programmed to primarily function as a user interface for the system 100. Depending on the functionality and use case, user interface units 114 may be operated by customers of the shopping facility or may be operated by workers at the shopping facility, such as facility employees (associates or colleagues), vendors, suppliers, contractors, etc.


By one approach, the system 100 optionally includes one or more video cameras 118. Captured video imagery from such a video camera 118 can be provided to the central computer system 106. That information can then serve, for example, to help the central computer system 106 determine a present location of one or more of the motorized transport units 102 and/or determine issues or concerns regarding automated movement of those motorized transport units 102 in the shopping facility space. As one simple example in these regards, such video information can permit the central computer system 106, at least in part, to detect an object in a path of movement of a particular one of the motorized transport units 102.


By one approach these video cameras 118 comprise existing surveillance equipment employed at the shopping facility 101 to serve, for example, various security purposes. By another approach these video cameras 118 are dedicated to providing video content to the central computer system 106 to facilitate the latter's control of the motorized transport units 102. If desired, the video cameras 118 can have a selectively movable field of view and/or zoom capability that the central computer system 106 controls as appropriate to help ensure receipt of useful information at any given moment.


In some embodiments, a location detection system 116 is provided at the shopping facility 101. The location detection system 116 provides input to the central computer system 106 useful to help determine the location of one or more of the motorized transport units 102. In some embodiments, the location detection system 116 includes a series of light sources (e.g., LEDs (light-emitting diodes)) that are mounted in the ceiling at known positions throughout the space and that each encode data in the emitted light that identifies the source of the light (and thus, the location of the light). As a given motorized transport unit 102 moves through the space, light sensors (or light receivers) at the motorized transport unit 102, on the movable item container 104 and/or at the user interface unit 114 receive the light and can decode the data. This data is sent back to the central computer system 106 which can determine the position of the motorized transport unit 102 by the data of the light it receives, since it can relate the light data to a mapping of the light sources to locations at the facility 101. Generally, such lighting systems are known and commercially available, e.g., the ByteLight system from ByteLight of Boston, Mass. In embodiments using a ByteLight system, a typical display screen of the typical smart phone device can be used as a light sensor or light receiver to receive and process data encoded into the light from the ByteLight light sources.
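
The mapping step in this location-detection approach can be pictured in a few lines of code. The following is a minimal sketch, assuming a simple lookup table that relates decoded light-source identifiers to known fixture coordinates; all identifiers, coordinates, and function names are hypothetical illustrations rather than part of these teachings.

```python
# Hedged sketch: resolving a motorized transport unit's position from a
# decoded light-source identifier. The table stands in for the mapping of
# light sources to facility locations kept by the central computer system.
LIGHT_SOURCE_MAP = {
    "LED-0413": (12.5, 48.0),  # hypothetical (x, y) in meters from a facility origin
    "LED-0414": (15.5, 48.0),
}

def resolve_position(decoded_light_id):
    """Return the (x, y) location associated with a decoded light ID, if known."""
    return LIGHT_SOURCE_MAP.get(decoded_light_id)

# Example: a light sensor on the unit decodes "LED-0413" from the emitted light
# and reports it; the central computer system resolves it to a position.
print(resolve_position("LED-0413"))  # -> (12.5, 48.0)
```

The same lookup pattern applies when the identifiers arrive via the low energy radio beacons or audio beacons described in the embodiments below.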


In other embodiments, the location detection system 116 includes a series of low energy radio beacons (e.g., Bluetooth low energy beacons) at known positions throughout the space and that each encode data in the emitted radio signal that identifies the beacon (and thus, the location of the beacon). As a given motorized transport unit 102 moves through the space, low energy receivers at the motorized transport unit 102, on the movable item container 104 and/or at the user interface unit 114 receive the radio signal and can decode the data. This data is sent back to the central computer system 106 which can determine the position of the motorized transport unit 102 by the location encoded in the radio signal it receives, since it can relate the location data to a mapping of the low energy radio beacons to locations at the facility 101. Generally, such low energy radio systems are known and commercially available. In embodiments using a Bluetooth low energy radio system, a typical Bluetooth radio of a typical smart phone device can be used as a receiver to receive and process data encoded into the Bluetooth low energy radio signals from the Bluetooth low energy beacons.


In still other embodiments, the location detection system 116 includes a series of audio beacons at known positions throughout the space and that each encode data in the emitted audio signal that identifies the beacon (and thus, the location of the beacon). As a given motorized transport unit 102 moves through the space, microphones at the motorized transport unit 102, on the movable item container 104 and/or at the user interface unit 114 receive the audio signal and can decode the data. This data is sent back to the central computer system 106 which can determine the position of the motorized transport unit 102 by the location encoded in the audio signal it receives, since it can relate the location data to a mapping of the audio beacons to locations at the facility 101. Generally, such audio beacon systems are known and commercially available. In embodiments using an audio beacon system, a typical microphone of a typical smart phone device can be used as a receiver to receive and process data encoded into the audio signals from the audio beacon.


Also optionally, the central computer system 106 can operably couple to one or more user interface computers 128 (comprising, for example, a display and a user input interface such as a keyboard, touch screen, and/or cursor-movement device). Such a user interface computer 128 can permit, for example, a worker (e.g., an associate, analyst, etc.) at the retail or shopping facility 101 to monitor the operations of the central computer system 106 and/or to attend to any of a variety of administrative, configuration or evaluation tasks as may correspond to the programming and operation of the central computer system 106. Such user interface computers 128 may be at or remote from the location of the facility 101 and may access one or more of the databases 126.


In some embodiments, the system 100 includes at least one motorized transport unit (MTU) storage unit or dispenser 120 at various locations in the shopping facility 101. The dispenser 120 provides for storage of motorized transport units 102 that are ready to be assigned to customers and/or workers. In some embodiments, the dispenser 120 takes the form of a cylinder within which motorized transport units 102 are stacked and released through the bottom of the dispenser 120. Further details of such embodiments are provided further below. In some embodiments, the dispenser 120 may be fixed in location, or may be mobile and capable of transporting itself to a given location or of being transported to that location by a motorized transport unit 102, and may then dispense one or more motorized transport units 102.


In some embodiments, the system 100 includes at least one motorized transport unit (MTU) docking station 122. These docking stations 122 provide locations to which motorized transport units 102 can travel and connect. For example, the motorized transport units 102 may be stored and charged at the docking station 122 for later use, and/or may be serviced at the docking station 122.


In accordance with some embodiments, a given motorized transport unit 102 detachably connects to a movable item container 104 and is configured to move the movable item container 104 through the shopping facility space under control of the central computer system 106 and/or the user interface unit 114. For example, a motorized transport unit 102 can move to a position underneath a movable item container 104 (such as a shopping cart, a rocket cart, a flatbed cart, or any other mobile basket or platform), align itself with the movable item container 104 (e.g., using sensors) and then raise itself to engage an undersurface of the movable item container 104 and lift a portion of the movable item container 104. Once the motorized transport unit is cooperating with the movable item container 104 (e.g., lifting a portion of the movable item container), the motorized transport unit 102 can continue to move throughout the facility space 101 taking the movable item container 104 with it. In some examples, the motorized transport unit 102 takes the form of the motorized transport unit 202 of FIGS. 2A-3B as it engages and detachably connects to a given movable item container 104. It is understood that in other embodiments, the motorized transport unit 102 may not lift a portion of the movable item container 104, but that it removably latches to, connects to or otherwise attaches to a portion of the movable item container 104 such that the movable item container 104 can be moved by the motorized transport unit 102. For example, the motorized transport unit 102 can connect to a given movable item container using a hook, a mating connector, a magnet, and so on.
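
For illustration only, the engagement sequence described above can be summarized as an ordered set of steps. The sketch below is a hypothetical outline; every step name is a placeholder for the unit's actual drive, sensing, and lifting routines rather than a specific implementation of these teachings.

```python
# Hedged sketch of the engage-a-container sequence as an ordered checklist.
ENGAGEMENT_STEPS = (
    "move_under_container",  # position beneath the movable item container
    "align_using_sensors",   # square up with the container's undersurface
    "raise_upper_body",      # extend to lift a portion of the container
    "confirm_engagement",    # verify the container now moves with the unit
)

def run_engagement(execute_step):
    """Run each engagement step; abort and report the first failure, if any."""
    for step in ENGAGEMENT_STEPS:
        if not execute_step(step):
            return f"failed at: {step}"
    return "engaged"

# Example with a stand-in executor that always succeeds:
print(run_engagement(lambda step: True))  # -> 'engaged'
```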


In addition to detachably coupling to movable item containers 104 (such as shopping carts), in some embodiments, motorized transport units 102 can move to and engage or connect to an item display module 130 and/or an item storage unit or locker 132. For example, an item display module 130 may take the form of a mobile display rack or shelving unit configured to house and display certain items for sale. It may be desired to position the display module 130 at various locations within the shopping facility 101 at various times. Thus, one or more motorized transport units 102 may move (as controlled by the central computer system 106) underneath the item display module 130, extend upward to lift the module 130 and then move it to the desired location. A storage locker 132 may be a storage device where items for purchase are collected and placed therein for a customer and/or worker to later retrieve. In some embodiments, one or more motorized transport units 102 may be used to move the storage locker to a desired location in the shopping facility 101. Similar to how a motorized transport unit engages a movable item container 104 or item display module 130, one or more motorized transport units 102 may move (as controlled by the central computer system 106) underneath the storage locker 132, extend upward to lift the locker 132 and then move it to the desired location.



FIGS. 2A and 2B illustrate some embodiments of a motorized transport unit 202, similar to the motorized transport unit 102 shown in the system of FIG. 1. In this embodiment, the motorized transport unit 202 takes the form of a disc-shaped robotic device having motorized wheels (not shown), a lower body portion 204 and an upper body portion 206 that fits over at least part of the lower body portion 204. It is noted that in other embodiments, the motorized transport unit may have other shapes and/or configurations, and is not limited to disc-shaped. For example, the motorized transport unit may be cubic, octagonal, triangular, or other shapes, and may be dependent on a movable item container with which the motorized transport unit is intended to cooperate. Also included are guide members 208. In FIG. 2A, the motorized transport unit 202 is shown in a retracted position in which the upper body portion 206 fits over the lower body portion 204 such that the motorized transport unit 202 is in its lowest profile orientation, which is generally the preferred orientation for movement when it is unattached to a movable item container 104, for example. In FIG. 2B, the motorized transport unit 202 is shown in an extended position in which the upper body portion 206 is moved upward relative to the lower body portion 204 such that the motorized transport unit 202 is in its highest profile orientation for movement when it is lifting and attaching to a movable item container 104, for example. The mechanism within the motorized transport unit 202 is designed to provide sufficient lifting force to lift the weight of the upper body portion 206 and other objects to be lifted by the motorized transport unit 202, such as movable item containers 104 and items placed within the movable item container, item display modules 130 and items supported by the item display module, and storage lockers 132 and items placed within the storage locker. The guide members 208 are embodied as pegs or shafts that extend horizontally from both the upper body portion 206 and the lower body portion 204. In some embodiments, these guide members 208 assist in docking the motorized transport unit 202 to a docking station 122 or a dispenser 120. In some embodiments, the lower body portion 204 and the upper body portion 206 are capable of moving independently of each other. For example, the upper body portion 206 may be raised and/or rotated relative to the lower body portion 204. That is, one or both of the upper body portion 206 and the lower body portion 204 may move toward/away from the other or rotate relative to the other. In some embodiments, in order to raise the upper body portion 206 relative to the lower body portion 204, the motorized transport unit 202 includes an internal lifting system (e.g., including one or more electric actuators or rotary drives or motors). Numerous examples of such motorized lifting and rotating systems are known in the art. Accordingly, further elaboration in these regards is not provided here for the sake of brevity.



FIGS. 3A and 3B illustrate some embodiments of the motorized transport unit 202 detachably engaging a movable item container embodied as a shopping cart 302. In FIG. 3A, the motorized transport unit 202 is in the orientation of FIG. 2A such that it is retracted and able to move in position underneath a portion of the shopping cart 302. Once the motorized transport unit 202 is in position (e.g., using sensors), as illustrated in FIG. 3B, the motorized transport unit 202 is moved to the extended position of FIG. 2B such that the front portion 304 of the shopping cart is lifted off of the ground by the motorized transport unit 202, with the wheels 306 at the rear of the shopping cart 302 remaining on the ground. In this orientation, the motorized transport unit 202 is able to move the shopping cart 302 throughout the shopping facility. It is noted that in these embodiments, the motorized transport unit 202 does not bear the weight of the entire cart 302 since the rear wheels 306 rest on the floor. It is understood that in some embodiments, the motorized transport unit 202 may be configured to detachably engage other types of movable item containers, such as rocket carts, flatbed carts or other mobile baskets or platforms.



FIG. 4 presents a more detailed example of some embodiments of the motorized transport unit 102 of FIG. 1. In this example, the motorized transport unit 102 has a housing 402 that contains (partially or fully) or at least supports and carries a number of components. These components include a control unit 404 comprising a control circuit 406 that, like the control circuit 108 of the central computer system 106, controls the general operations of the motorized transport unit 102. Accordingly, the control unit 404 also includes a memory 408 coupled to the control circuit 406 and that stores, for example, operating instructions and/or useful data.


The control circuit 406 operably couples to a motorized wheel system 410. This motorized wheel system 410 functions as a locomotion system to permit the motorized transport unit 102 to move within the aforementioned retail or shopping facility 101 (thus, the motorized wheel system 410 may more generically be referred to as a locomotion system). Generally speaking, this motorized wheel system 410 will include at least one drive wheel (i.e., a wheel that rotates (around a horizontal axis) under power to thereby cause the motorized transport unit 102 to move through interaction with, for example, the floor of the shopping facility 101). The motorized wheel system 410 can include any number of rotating wheels and/or other floor-contacting mechanisms as may be desired and/or appropriate to the application setting.


The motorized wheel system 410 also includes a steering mechanism of choice. One simple example in these regards comprises one or more of the aforementioned wheels that can swivel about a vertical axis to thereby cause the moving motorized transport unit 102 to turn as well.
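
As a point of comparison with the swivel-wheel example above, another common steering arrangement runs two drive wheels at different speeds (a differential drive). The sketch below works out the standard kinematics for that alternative; the formulas are textbook robotics math rather than anything specific to these teachings, and the wheel-base value and names are illustrative only.

```python
import math

def update_pose(x, y, heading, v_left, v_right, wheel_base, dt):
    """Advance the unit's pose given left/right wheel speeds (m/s) over dt seconds."""
    v = (v_left + v_right) / 2.0             # forward speed of the unit's center
    omega = (v_right - v_left) / wheel_base  # turn rate in radians per second
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    heading += omega * dt
    return x, y, heading

# Equal wheel speeds drive straight; unequal speeds produce a turn.
print(update_pose(0.0, 0.0, 0.0, 0.5, 0.5, wheel_base=0.4, dt=1.0))  # straight ahead
print(update_pose(0.0, 0.0, 0.0, 0.3, 0.5, wheel_base=0.4, dt=1.0))  # gentle left turn
```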


Numerous examples of motorized wheel systems are known in the art. Accordingly, further elaboration in these regards is not provided here for the sake of brevity save to note that the aforementioned control circuit 406 is configured to control the various operating states of the motorized wheel system 410 to thereby control when and how the motorized wheel system 410 operates.


In this illustrative example, the control circuit 406 also operably couples to at least one wireless transceiver 412 that operates according to any known wireless protocol. This wireless transceiver 412 can comprise, for example, a Wi-Fi-compatible and/or Bluetooth-compatible transceiver that can communicate with the aforementioned central computer system 106 via the aforementioned wireless network 124 of the shopping facility 101. So configured the control circuit 406 of the motorized transport unit 102 can provide information to the central computer system 106 and can receive information and/or instructions from the central computer system 106. As one simple example in these regards, the control circuit 406 can receive instructions from the central computer system 106 regarding movement of the motorized transport unit 102.


These teachings will accommodate using any of a wide variety of wireless technologies as desired and/or as may be appropriate in a given application setting. These teachings will also accommodate employing two or more different wireless transceivers 412 if desired.


The control circuit 406 also couples to one or more on-board sensors 414. These teachings will accommodate a wide variety of sensor technologies and form factors. By one approach at least one such sensor 414 can comprise a light sensor or light receiver. When the aforementioned location detection system 116 comprises a plurality of light emitters disposed at particular locations within the shopping facility 101, such a light sensor can provide information that the control circuit 406 and/or the central computer system 106 employs to determine a present location and/or orientation of the motorized transport unit 102.


As another example, such a sensor 414 can comprise a distance measurement unit configured to detect a distance between the motorized transport unit 102 and one or more objects or surfaces around the motorized transport unit 102 (such as an object that lies in a projected path of movement for the motorized transport unit 102 through the shopping facility 101). These teachings will accommodate any of a variety of distance measurement units including optical units and sound/ultrasound units. In one example, a sensor 414 comprises a laser distance sensor device capable of determining a distance to objects in proximity to the sensor. In some embodiments, a sensor 414 comprises an optical based scanning device to sense and read optical patterns in proximity to the sensor, such as bar codes variously located on structures in the shopping facility 101. In some embodiments, a sensor 414 comprises a radio frequency identification (RFID) tag reader capable of reading RFID tags in proximity to the sensor. Such sensors may be useful to determine proximity to nearby objects, avoid collisions, orient the motorized transport unit at a proper alignment orientation to engage a movable item container, and so on.


The foregoing examples are intended to be illustrative and are not intended to convey an exhaustive listing of all possible sensors. Instead, it will be understood that these teachings will accommodate sensing any of a wide variety of circumstances or phenomena to support the operating functionality of the motorized transport unit 102 in a given application setting.


By one optional approach an audio input 416 (such as a microphone) and/or an audio output 418 (such as a speaker) can also operably couple to the control circuit 406. So configured the control circuit 406 can provide a variety of audible sounds to thereby communicate with a user of the motorized transport unit 102, other persons in the vicinity of the motorized transport unit 102, or even other motorized transport units 102 in the area. These audible sounds can include any of a variety of tones and other non-verbal sounds. These audible sounds can also include, in lieu of the foregoing or in combination therewith, pre-recorded or synthesized speech.


The audio input 416, in turn, provides a mechanism whereby, for example, a user provides verbal input to the control circuit 406. That verbal input can comprise, for example, instructions, inquiries, or information. So configured, a user can provide, for example, a question to the motorized transport unit 102 (such as, “Where are the towels?”). The control circuit 406 can cause that verbalized question to be transmitted to the central computer system 106 via the motorized transport unit's wireless transceiver 412. The central computer system 106 can process that verbal input to recognize the speech content and to then determine an appropriate response. That response might comprise, for example, transmitting back to the motorized transport unit 102 specific instructions regarding how to move the motorized transport unit 102 (via the aforementioned motorized wheel system 410) to the location in the shopping facility 101 where the towels are displayed.
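
The question-to-response flow described above can be pictured with a short sketch. The product table, matching rule, and function name below are hypothetical placeholders, and speech recognition is assumed to have already produced the text of the question.

```python
# Hedged sketch: the central computer system maps a recognized customer question
# to a product location and returns a movement instruction for the unit.
PRODUCT_LOCATIONS = {"towels": "aisle 12, bay 3"}  # would come from database 126

def handle_verbal_query(recognized_text):
    """Return a movement instruction (or fallback response) for a recognized question."""
    for product, location in PRODUCT_LOCATIONS.items():
        if product in recognized_text.lower():
            return {"action": "navigate", "destination": location}
    return {"action": "respond", "message": "Let me find an associate to help."}

print(handle_verbal_query("Where are the towels?"))
# -> {'action': 'navigate', 'destination': 'aisle 12, bay 3'}
```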


In this example the motorized transport unit 102 includes a rechargeable power source 420 such as one or more batteries. The power provided by the rechargeable power source 420 can be made available to whichever components of the motorized transport unit 102 require electrical energy. By one approach the motorized transport unit 102 includes a plug or other electrically conductive interface that the control circuit 406 can utilize to automatically connect to an external source of electrical energy to thereby recharge the rechargeable power source 420.


By one approach the motorized transport unit 102 comprises an integral part of a movable item container 104 such as a grocery cart. As used herein, this reference to “integral” will be understood to refer to a non-temporary combination and joinder that is sufficiently complete so as to consider the combined elements to be as one. Such a joinder can be facilitated in a number of ways, including by securing the motorized transport unit housing 402 to the item container using bolts or other threaded fasteners as opposed to, for example, a clip.


These teachings will also accommodate selectively and temporarily attaching the motorized transport unit 102 to an item container 104. In such a case the motorized transport unit 102 can include a movable item container coupling structure 422. By one approach this movable item container coupling structure 422 operably couples to the control circuit 406 to thereby permit the latter to control, for example, the latched and unlatched states of the movable item container coupling structure 422. So configured, by one approach the control circuit 406 can automatically and selectively move the motorized transport unit 102 (via the motorized wheel system 410) towards a particular item container until the movable item container coupling structure 422 can engage the item container to thereby temporarily physically couple the motorized transport unit 102 to the item container. So latched, the motorized transport unit 102 can then cause the item container to move with the motorized transport unit 102. In embodiments such as illustrated in FIGS. 2A-3B, the movable item container coupling structure 422 includes a lifting system (e.g., including an electric drive or motor) to cause a portion of the body or housing 402 to engage and lift a portion of the item container off of the ground such that the motorized transport unit 102 can carry a portion of the item container. In other embodiments, the motorized transport unit latches to a portion of the movable item container without lifting a portion thereof off of the ground.


In either case, by combining the motorized transport unit 102 with an item container, and by controlling movement of the motorized transport unit 102 via the aforementioned central computer system 106, these teachings will facilitate a wide variety of useful ways to assist both customers and associates in a shopping facility setting. For example, the motorized transport unit 102 can be configured to follow a particular customer as they shop within the shopping facility 101. The customer can then place items they intend to purchase into the item container that is associated with the motorized transport unit 102.


In some embodiments, the motorized transport unit 102 includes an input/output (I/O) device 424 that is coupled to the control circuit 406. The I/O device 424 allows an external device to couple to the control unit 404. The function and purpose of connecting devices will depend on the application. In some examples, devices connecting to the I/O device 424 may add functionality to the control unit 404, allow the exporting of data from the control unit 404, allow the diagnosing of the motorized transport unit 102, and so on.


In some embodiments, the motorized transport unit 102 includes a user interface 426 including for example, user inputs and/or user outputs or displays depending on the intended interaction with the user. For example, user inputs could include any input device such as buttons, knobs, switches, touch sensitive surfaces or display screens, and so on. Example user outputs include lights, display screens, and so on. The user interface 426 may work together with or separate from any user interface implemented at a user interface unit 114 (such as a smart phone or tablet device).


The control unit 404 includes a memory 408 coupled to the control circuit 406 and that stores, for example, operating instructions and/or useful data. The control circuit 406 can comprise a fixed-purpose hard-wired platform or can comprise a partially or wholly programmable platform. These architectural options are well known and understood in the art and require no further description here. This control circuit 406 is configured (for example, by using corresponding programming stored in the memory 408 as will be well understood by those skilled in the art) to carry out one or more of the steps, actions, and/or functions described herein. The memory 408 may be integral to the control circuit 406 or can be physically discrete (in whole or in part) from the control circuit 406 as desired. This memory 408 can also be local with respect to the control circuit 406 (where, for example, both share a common circuit board, chassis, power supply, and/or housing) or can be partially or wholly remote with respect to the control circuit 406. This memory 408 can serve, for example, to non-transitorily store the computer instructions that, when executed by the control circuit 406, cause the control circuit 406 to behave as described herein. (As used herein, this reference to “non-transitorily” will be understood to refer to a non-ephemeral state for the stored contents (and hence excludes when the stored contents merely constitute signals or waves) rather than volatility of the storage media itself, and hence includes both non-volatile memory (such as read-only memory (ROM)) and volatile memory (such as an erasable programmable read-only memory (EPROM)).)


It is noted that not all components illustrated in FIG. 4 are included in all embodiments of the motorized transport unit 102. That is, some components may be optional depending on the implementation.



FIG. 5 illustrates a functional block diagram that may generally represent any number of various electronic components of the system 100 that are computer type devices. The computer device 500 includes a control circuit 502, a memory 504, a user interface 506 and an input/output (I/O) interface 508 providing any type of wired and/or wireless connectivity to the computer device 500, all coupled to a communication bus 510 to allow data and signaling to pass therebetween. Generally, the control circuit 502 and the memory 504 may be referred to as a control unit. The control circuit 502, the memory 504, the user interface 506 and the I/O interface 508 may be any of the devices described herein or as understood in the art. The functionality of the computer device 500 will depend on the programming stored in the memory 504. The computer device 500 may represent a high level diagram for one or more of the central computer system 106, the motorized transport unit 102, the user interface unit 114, the location detection system 116, the user interface computer 128, the MTU docking station 122 and the MTU dispenser 120, or any other device or component in the system that is implemented as a computer device.


Additional Features Overview


Referring generally to FIGS. 1-5, the shopping assistance system 100 may implement one or more of several different features depending on the configuration of the system and its components. The following provides a brief description of several additional features that could be implemented by the system. One or more of these features could also be implemented in other systems separate from embodiments of the system. This is not meant to be an exhaustive description of all features and not meant to be an exhaustive description of the details of any one of the features. Further details with regards to one or more features beyond this overview may be provided herein.


Tagalong Steering: This feature allows a given motorized transport unit 102 to lead or follow a user (e.g., a customer and/or a worker) throughout the shopping facility 101. For example, the central computer system 106 uses the location detection system 116 to determine the location of the motorized transport unit 102. For example, LED smart lights (e.g., the ByteLight system) of the location detection system 116 transmit a location number to smart devices which are with the customer (e.g., user interface units 114), and/or on the item container 104/motorized transport unit 102. The central computer system 106 receives the LED location numbers received by the smart devices through the wireless network 124. Using this information, in some embodiments, the central computer system 106 uses a grid placed upon a 2D CAD map and 3D point cloud model (e.g., from the databases 126) to direct, track, and plot paths for the other devices. Using the grid, the motorized transport unit 102 can drive a movable item container 104 in a straight path rather than zigzagging around the facility. As the user moves from one grid to another, the motorized transport unit 102 drives the container 104 from one grid to the other. In some embodiments, as the user moves towards the motorized transport unit, it stays still until the customer moves beyond an adjoining grid.
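
The grid-based "stay still until the user moves beyond an adjoining grid" behavior can be sketched as follows. The cell size, coordinates, and function names are hypothetical, and the sketch ignores obstacles and path plotting.

```python
GRID_CELL_M = 2.0  # hypothetical grid cell size in meters

def to_cell(position):
    """Map an (x, y) facility position onto integer grid coordinates."""
    x, y = position
    return int(x // GRID_CELL_M), int(y // GRID_CELL_M)

def sign(d):
    """Return -1, 0, or +1 depending on the sign of d."""
    return (d > 0) - (d < 0)

def next_move(mtu_position, user_position):
    """Return the next one-cell step toward the user, or None to stay put."""
    mtu_cell, user_cell = to_cell(mtu_position), to_cell(user_position)
    dx, dy = user_cell[0] - mtu_cell[0], user_cell[1] - mtu_cell[1]
    if max(abs(dx), abs(dy)) <= 1:  # user is in the same or an adjoining cell
        return None                 # stay still, per the tagalong behavior above
    return mtu_cell[0] + sign(dx), mtu_cell[1] + sign(dy)

print(next_move((4.0, 4.0), (5.5, 4.5)))   # None: the user is still adjoining
print(next_move((4.0, 4.0), (10.0, 4.5)))  # (3, 2): advance one cell toward the user
```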


Detecting Objects: In some embodiments, motorized transport units 102 detect objects through several sensors mounted on motorized transport unit 102, through independent cameras (e.g., video cameras 118), through sensors of a corresponding movable item container 104, and through communications with the central computer system 106. In some embodiments, with semi-autonomous capabilities, the motorized transport unit 102 will attempt to avoid obstacles, and if unable to avoid, it will notify the central computer system 106 of an exception condition. In some embodiments, using sensors 414 (such as distance measurement units, e.g., laser or other optical-based distance measurement sensors), the motorized transport unit 102 detects obstacles in its path, and will move to avoid, or stop until the obstacle is clear.
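
A minimal sketch of this avoid-or-report decision is shown below. The stopping threshold and function names are hypothetical, and a real unit would also feed the exception condition back to the central computer system 106 over the wireless network.

```python
STOP_DISTANCE_M = 0.5  # hypothetical minimum clearance before reacting

def react_to_obstacle(distance_readings_m, detour_available):
    """Decide the unit's reaction to the closest sensed obstacle."""
    if min(distance_readings_m) >= STOP_DISTANCE_M:
        return "continue"
    if detour_available:
        return "steer_around"
    return "stop_and_notify_central_computer"  # exception condition reported upstream

print(react_to_obstacle([2.1, 0.3, 1.8], detour_available=False))
# -> 'stop_and_notify_central_computer'
```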


Visual Remote Steering: This feature enables movement and/or operation of a motorized transport unit 102 to be controlled by a user on-site, off-site, or anywhere in the world. This is due to the architecture of some embodiments where the central computer system 106 outputs the control signals to the motorized transport unit 102. These control signals could have originated at any device in communication with the central computer system 106. For example, the movement signals sent to the motorized transport unit 102 may be movement instructions determined by the central computer system 106; commands received at a user interface unit 114 from a user; and commands received at the central computer system 106 from a remote user not located at the shopping facility space.


Determining Location: Similar to that described above, this feature enables the central computer system 106 to determine the location of devices in the shopping facility 101. For example, the central computer system 106 maps received LED light transmissions, Bluetooth low energy radio signals or audio signals (or other received signals encoded with location data) to a 2D map of the shopping facility. Objects within the area of the shopping facility are also mapped and associated with those transmissions. Using this information, the central computer system 106 can determine the location of devices such as motorized transport units.


Digital Physical Map Integration: In some embodiments, the system 100 is capable of integrating 2D and 3D maps of the shopping facility with physical locations of objects and workers. Once the central computer system 106 maps all objects to specific locations using algorithms, measurements and LED geo-location, for example, grids are applied which section off the maps into access ways and blocked sections. Motorized transport units 102 use these grids for navigation and recognition. In some cases, grids are applied to 2D horizontal maps along with 3D models. In some cases, grids start at a higher unit level and then can be broken down into smaller units of measure by the central computer system 106 when needed to provide more accuracy.
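
The sectioning of a facility map into access ways and blocked sections can be illustrated with a small occupancy grid. The grid contents below are a made-up example; in practice the cells would be derived from the 2D CAD map and 3D point cloud model referenced above.

```python
# 0 = access way, 1 = blocked section (shelving, wall, display), hypothetical layout
GRID = [
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
]

def is_navigable(cell):
    """Return True if the grid cell is an access way a motorized transport unit may enter."""
    row, col = cell
    in_bounds = 0 <= row < len(GRID) and 0 <= col < len(GRID[0])
    return in_bounds and GRID[row][col] == 0

print(is_navigable((0, 2)))  # True: open access way
print(is_navigable((1, 0)))  # False: blocked section
```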


Calling a Motorized Transport Unit: This feature provides multiple methods to request and schedule a motorized transport unit 102 for assistance in the shopping facility. In some embodiments, users can request use of a motorized transport unit 102 through the user interface unit 114. The central computer system 106 can check to see if there is an available motorized transport unit. Once a motorized transport unit is assigned to a given user, other users will not be able to control it. Workers, such as store associates, may also reserve multiple motorized transport units in order to accomplish a coordinated large job.
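
The request-and-assignment behavior described above reduces to keeping a pool of unassigned units and an assignment table. The sketch below is a hypothetical illustration of that bookkeeping, not the system's actual data model.

```python
available_mtus = {"MTU-7", "MTU-9"}  # hypothetical pool of unassigned units
assignments = {}                     # mtu_id -> user_id

def request_mtu(user_id):
    """Assign an available motorized transport unit to the requesting user, if any remain."""
    if not available_mtus:
        return None
    mtu_id = available_mtus.pop()
    assignments[mtu_id] = user_id
    return mtu_id

def may_control(user_id, mtu_id):
    """Only the assigned user may control an already-assigned unit."""
    return assignments.get(mtu_id) == user_id

mtu = request_mtu("customer-42")
print(mtu, may_control("customer-42", mtu), may_control("customer-77", mtu))
```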


Locker Delivery: In some embodiments, one or more motorized transport units 102 may be used to pick, pack, and deliver items to a particular storage locker 132. The motorized transport units 102 can couple to and move the storage locker to a desired location. In some embodiments, once delivered, the requestor will be notified that the items are ready to be picked up, and will be provided the locker location and locker security code key.


Route Optimization: In some embodiments, the central computer system automatically generates a travel route for one or more motorized transport units through the shopping facility space. In some embodiments, this route is based on one or more of a user provided list of items entered by the user via a user interface unit 114; user selected route preferences entered by the user via the user interface unit 114; user profile data received from a user information database (e.g., from one of databases 126); and product availability information from a retail inventory database (e.g., from one of databases 126). In some cases, the route is intended to minimize the time it takes to get through the facility, and in some cases, may route the shopper to the least busy checkout area. Frequently, there will be multiple possible optimum routes. The route chosen may take the user by things the user is more likely to purchase (in case they forgot something), and away from things they are not likely to buy (to avoid embarrassment). That is, routing a customer who, based on past customer behavior, has never purchased such products through sporting goods, women's lingerie, baby food, or feminine products would be non-productive, and potentially embarrassing to the customer. In some cases, a route may be determined from multiple possible routes based on past shopping behavior, e.g., if the customer typically buys a cold Diet Coke product, children's shoes or power tools, this information would be used to add weight to the best alternative routes, and determine the route accordingly.
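
One simple way to picture weighting candidate routes with past shopping behavior is shown below. The scoring rule and example data are hypothetical and stand in for whatever weighting the central computer system actually applies.

```python
def score_route(route_departments, purchase_history):
    """Prefer routes passing departments the shopper has bought from before."""
    return sum(purchase_history.get(dept, 0) for dept in route_departments)

# Hypothetical purchase counts per department and candidate routes
history = {"soft drinks": 12, "children's shoes": 3, "power tools": 5}
candidates = {
    "route A": ["soft drinks", "power tools", "checkout"],
    "route B": ["women's lingerie", "baby food", "checkout"],
}

best = max(candidates, key=lambda name: score_route(candidates[name], history))
print(best)  # -> 'route A': weighted toward departments the shopper frequents
```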


Store Facing Features: In some embodiments, these features enable functions to support workers in performing store tasks. For example, the system can assist workers to know what products and items are on the shelves and which ones need attention. For example, using 3D scanning and point cloud measurements, the central computer system can determine where products are supposed to be, enabling workers to be alerted to facing or zoning issues along with potential inventory issues.


Phone Home: This feature allows users in a shopping facility 101 to be able to contact remote users who are not at the shopping facility 101 and include them in the shopping experience. For example, the user interface unit 114 may allow the user to place a voice call, a video call, or send a text message. With video call capabilities, a remote person can virtually accompany an in-store shopper, visually sharing the shopping experience while seeing and talking with the shopper. One or more remote shoppers may join the experience.


Returns: In some embodiments, the central computer system 106 can task a motorized transport unit 102 to keep the returns area clear of returned merchandise. For example, the transport unit may be instructed to move a cart from the returns area to a different department or area. Such commands may be initiated from video analytics (the central computer system analyzing camera footage showing that a cart is full), from an associate command (digital or verbal), or on a schedule, as other priority tasks allow. The motorized transport unit 102 can first bring an empty cart to the returns area, prior to removing a full one.


Bring a Container: One or more motorized transport units can retrieve a movable item container 104 (such as a shopping cart) to use. For example, upon a customer or worker request, the motorized transport unit 102 can re-position one or more item containers 104 from one location to another. In some cases, the system instructs the motorized transport unit where to obtain an empty item container for use. For example, the system can recognize an empty and idle item container that has been abandoned or instruct that one be retrieved from a cart storage area. In some cases, the call to retrieve an item container may be initiated through a call button placed throughout the facility, or through the interface of a user interface unit 114.


Respond to Voice Commands: In some cases, control of a given motorized transport unit is implemented through the acceptance of voice commands. For example, the user may speak voice commands to the motorized transport unit 102 itself and/or to the user interface unit 114. In some embodiments, a voice print is used to authorize use of a motorized transport unit 102, allowing voice commands from a single user at a time.


Retrieve Abandoned Item Containers: This feature allows the central computer system to track movement of movable item containers in and around the area of the shopping facility 101, including both the sales floor areas and the back-room areas. For example, using visual recognition through store cameras 118 or through user interface units 114, the central computer system 106 can identify abandoned and out-of-place movable item containers. In some cases, each movable item container has a transmitter or smart device that sends a unique identifier and its position, using LED geo-location identification, to facilitate tracking and other tasks. Using LED geo-location identification with the Determining Location feature through smart devices on each cart, the central computer system 106 can determine the length of time a movable item container 104 is stationary.
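As a minimal sketch only, assuming each container periodically reports a timestamped position (the report format and movement tolerance below are hypothetical), stationary time might be estimated as follows.

```python
# Hypothetical sketch: estimating how long a movable item container has
# been stationary from timestamped position reports.
import math
from datetime import datetime, timedelta

MOVE_TOLERANCE_M = 1.0   # position jitter below this is treated as "stationary"

def stationary_duration(reports):
    """reports: list of (timestamp, x, y) tuples ordered oldest to newest.
    Returns how long the container has stayed within MOVE_TOLERANCE_M of
    its latest reported position."""
    if not reports:
        return timedelta(0)
    latest_time, lx, ly = reports[-1]
    start = latest_time
    for t, x, y in reversed(reports):
        if math.hypot(x - lx, y - ly) > MOVE_TOLERANCE_M:
            break
        start = t
    return latest_time - start

if __name__ == "__main__":
    now = datetime.now()
    reports = [(now - timedelta(minutes=30), 5.0, 5.0),
               (now - timedelta(minutes=20), 40.2, 18.0),
               (now - timedelta(minutes=10), 40.3, 18.1),
               (now, 40.25, 18.05)]
    print(stationary_duration(reports))  # roughly 20 minutes
```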


Stocker Assistance: This feature allows the central computer system to track movement of merchandise flow into and around the back-room areas. For example, using visual recognition and captured images, the central computer system 106 can determine if carts are loaded or not for moving merchandise between the back-room areas and the sales floor areas. Tasks or alerts may then be sent to workers to assign work accordingly.


Self-Docking: Motorized transport units 102 will run low on power or out of power when used. Before this happens, the motorized transport units 102 need to recharge to stay in service. According to this feature, motorized transport units 102 will self-dock and recharge (e.g., at an MTU docking station 122) to stay at maximum efficiency when not in use. When use is completed, the motorized transport unit 102 will return to a docking station 122. In some cases, if the power is running low during use, a replacement motorized transport unit can be assigned to move into position and replace the motorized transport unit with low power. The transition from one unit to the next can be seamless to the user.
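A minimal sketch, assuming simple battery thresholds (the values and action names below are illustrative only), of a charge-management policy consistent with the self-docking and replacement behavior described above:

```python
# Hypothetical sketch: simple charge-management policy for a motorized
# transport unit (thresholds and return values are illustrative only).
LOW_BATTERY = 0.20      # request a replacement below this while in use
IDLE_RECHARGE = 0.95    # return to a docking station when idle and below this

def charge_action(battery_level, in_use):
    """Return one of 'continue', 'request_replacement', or 'dock'."""
    if in_use:
        return "request_replacement" if battery_level < LOW_BATTERY else "continue"
    return "dock" if battery_level < IDLE_RECHARGE else "continue"

assert charge_action(0.15, in_use=True) == "request_replacement"
assert charge_action(0.50, in_use=False) == "dock"
```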


Item Container Retrieval: With this feature, the central computer system 106 can cause multiple motorized transport units 102 to retrieve abandoned item containers from exterior areas such as parking lots. For example, multiple motorized transport units are loaded into a movable dispenser, e.g., the motorized transport units are vertically stacked in the dispenser. The dispenser is moved to the exterior area and the transport units are dispensed. Based on video analytics, it is determined which item containers 104 are abandoned and for how long. A transport unit will attach to an abandoned cart and return it to a storage bay.


Motorized Transport Unit Dispenser: This feature provides the movable dispenser that contains and moves a group of motorized transport units to a given area (e.g., an exterior area such as a parking lot) to be dispensed for use. For example, motorized transport units can be moved to the parking lot to retrieve abandoned item containers 104. In some cases, the interior of the dispenser includes helically wound guide rails that mate with the guide member 208 to allow the motorized transport units to be guided to a position to be dispensed.


Specialized Module Retrieval: This feature allows the system 100 to track movement of merchandise flow into and around the sales floor areas and the back-room areas, including special modules that may need to be moved to the sales floor. For example, using video analytics, the system can determine if a modular unit is loaded or empty. Such modular units may house items that are of seasonal or temporary use on the sales floor. For example, when it is raining, it is useful to move a modular unit displaying umbrellas from a back room area (or a lesser accessed area of the sales floor) to a desired area of the sales floor.


Authentication: This feature uses a voice imprint with an attention code/word to authenticate a user to a given motorized transport unit. One motorized transport unit can be swapped for another using this authentication. For example, a token is used during the session with the user. The token is a unique identifier for the session and is dropped once the session is ended. A logical token may be a session id established by the application of the user interface unit 114 when the user logs on and decides to use the system 100. In some embodiments, communications throughout the session are encrypted using SSL or other methods at the transport level.
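As a hedged illustration only, the per-session token handling described above might resemble the following sketch; the function names are invented and the voice-print comparison is stubbed out.

```python
# Hypothetical sketch: per-session tokens bound to an authenticated user.
# The voice-print match itself is out of scope and stubbed out here.
import secrets

_sessions = {}   # token -> user id

def start_session(user_id, voice_sample):
    if not _voice_print_matches(user_id, voice_sample):   # placeholder check
        raise PermissionError("voice print did not match")
    token = secrets.token_urlsafe(16)   # unique identifier for the session
    _sessions[token] = user_id
    return token

def end_session(token):
    _sessions.pop(token, None)          # token is dropped once the session ends

def _voice_print_matches(user_id, voice_sample):
    return True                         # stand-in for a real voice-print comparison

tok = start_session("worker-42", b"...")
assert tok in _sessions
end_session(tok)
assert tok not in _sessions
```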


Further Details of Some Embodiments

Some embodiments provide systems to at least in part assist, enhance and/or enable product stocking on a sales floor of a retail shopping facility. Typically, the sales floor comprises the area of a shopping facility where customers travel in shopping for products. The system includes a plurality of motorized transport units 102 that are each configured to perform multiple different types of tasks at a retail shopping facility. As described above, the different tasks can include, but are not limited to, assisting customers, performing a clean-up, collecting one or more movable item containers, and other such tasks. Further, the motorized transport units can assist with stocking products on the sales floor, in part by transporting stocking carts carrying products to be stocked on the sales floor. A central computer system 106 is in communication with the motorized transport units and is configured to coordinate the plurality of motorized transport units 102 in performing the multiple different tasks. In some applications, the central computer system instructs one or more motorized transport units to retrieve a specified stocking movable item container 104, which is referred to below as a stocking cart. The stocking cart is configured to carry a plurality of products that are to be restocked onto one or more product supports (e.g., shelves, racks, modulars, etc.) that are positioned on the sales floor and/or product supports in a back storage area, overflow area, or the like. The stocking cart is intended to be used by workers to transport products to the sales floor so that workers can restock the product supports from the products carried by the stocking cart.


In some applications, the stocking cart may be a specific stocking cart and/or have a specific coupling structure with which a motorized transport unit can cooperate. The central computer system can further instruct the one or more motorized transport units to autonomously transport the specified stocking cart to a specified stocking location on the sales floor corresponding to at least one of the plurality of products carried by the stocking cart and intended to be restocked at or near the stocking location.


Typically, the control circuit 108 of the central computer system 106 can obtain location information for each of the motorized transport units 102. Similarly, the central computer system can obtain location information of one or more stocking carts. For example, a stocking cart may include a stocking cart control circuit and transceiver that can communicate location information to the central computer system. Additionally or alternatively, a worker can notify the central computer system when a stocking cart is being loaded and/or has completed a loading of a stocking cart, and can provide relevant location information. The stocking cart may determine and/or track its location similar to the motorized transport unit tracking its movement and location (e.g., detection of encoded location information emitted by one or more light sources, detecting one or more beacons, triangulation of beacons, Wi-Fi, cellular or other signals, inertial sensors, distance measurement sensors, distance travel detection systems, global positioning satellite information, information from a user interface unit, other such sources, or a combination of two or more of such sources). Similarly, the central computer system may task one or more workers with loading one or more stocking carts, and the central computer system may track the location of the workers as they load the stocking carts, and determine a location of one or more stocking carts based on the movements of the workers. The workers' locations may be determined based on information communicated from a user interface unit 114, video processing, and the like.
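As one non-limiting sketch, location estimates from several of the sources listed above might be reconciled by preferring the estimate with the smallest reported uncertainty; the source names and values are hypothetical.

```python
# Hypothetical sketch: picking a stocking-cart position from several
# location sources by preferring the estimate with the smallest
# reported uncertainty (source names and values are illustrative).
def best_position(estimates):
    """estimates: list of dicts with 'source', 'x', 'y', 'uncertainty_m'."""
    if not estimates:
        return None
    best = min(estimates, key=lambda e: e["uncertainty_m"])
    return best["x"], best["y"], best["source"]

estimates = [
    {"source": "led_geolocation", "x": 41.2, "y": 17.9, "uncertainty_m": 0.5},
    {"source": "wifi",            "x": 39.0, "y": 16.0, "uncertainty_m": 4.0},
    {"source": "worker_ui_unit",  "x": 42.0, "y": 18.5, "uncertainty_m": 2.0},
]
print(best_position(estimates))   # -> (41.2, 17.9, 'led_geolocation')
```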


The stocking location is typically dependent on the products loaded in the stocking cart. Further, in some instances, the stocking location and/or the products instructed to be loaded into and/or onto the stocking cart are dependent on one or more products the central computer system has been notified of and/or determines are in need of being restocked. Additionally, the stocking location may further be dependent on a location of a worker who has been tasked with restocking the one or more products in or on the stocking cart. Accordingly, the central computer system may direct a motorized transport unit to cooperate with and transport a stocking cart in response to identifying that a worker is at or moving toward a tasked stocking location (e.g., by tracking a user interface unit 114 associated with the worker, using image recognition to track the worker, receiving a notification from the worker acknowledging starting of the stocking task, other such methods, or a combination of two or more of such methods).


The one or more instructions communicated to the motorized transport unit to autonomously transport one or more stocking carts can include routing instructions based on a mapping of the retail shopping facility, the starting location of the stocking cart, and the intended stocking location. Using the routing instructions, the motorized transport unit 102 can autonomously transport the stocking cart to the specified stocking location. In some instances, the motorized transport unit includes a stocking cart coupler, which may be the same as or similar to the item container coupling structure 422. Further, some shopping facilities may have different types of stocking carts, which may be intended for different types of products. These different types of stocking carts may include different types of couplers with which the motorized transport unit is expected to couple to be able to transport the stocking cart. In some embodiments, the central computer system 106 is further configured to identify a type of mating coupler on a stocking cart with which one of the motorized transport units is to couple in transporting the stocking cart. For example, the stocking cart may be and/or include a pallet jack, where the mating coupler is configured to allow the motorized transport unit to cause the pallet jack to at least partially lift the products and/or one or more pallets of products. In other instances, a stocking cart may be similar to or may be a shopping cart, and the motorized transport unit may be intended to move in under at least a portion of the stocking cart and cooperate with one or more bars, latches or other such mating couplers with which a motorized transport unit can couple.


The central computer system can identify, from the plurality of motorized transport units, one or more motorized transport units that comprise one or more couplers that are consistent with the type of mating coupler on the stocking cart. Based at least in part on identifying one or more motorized transport units that comprise the coupler consistent with the type of mating coupler on the stocking cart, the central computer system can select one or more of the identified motorized transport units and direct the one or more motorized transport units to one or more stocking carts. Accordingly, the central computer system can select a motorized transport unit having a correct coupler to cooperate with and move the specific stocking cart. Further, one or more motorized transport units at the shopping facility may be incapable of coupling with a stocking cart and are not considered by the central computer system in selecting a motorized transport unit to transport the stocking cart. The central computer system may take into consideration other factors, such as but not limited to tasks that a motorized transport unit has already been assigned, availability of other motorized transport units that can couple with the stocking cart, availability of one or more other motorized transport units to which one or more tasks can be reassigned freeing up a particular motorized transport unit, a location of the one or more motorized transport units relative to the stocking cart, a distance a motorized transport unit has to travel, a distance through the sales floor a motorized transport unit has to travel, a battery charge level of a motorized transport unit, other such factors, and typically a combination of two or more of such factors.
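By way of illustration only, the selection logic described above might be sketched as follows, first filtering on coupler compatibility and then scoring the remaining candidates; the field names, weights, and example units are hypothetical.

```python
# Hypothetical sketch: selecting a motorized transport unit for a stocking
# cart by first filtering on coupler compatibility and then scoring the
# remaining candidates (field names and weights are illustrative).
from dataclasses import dataclass

@dataclass
class TransportUnit:
    unit_id: str
    couplers: set
    distance_m: float        # distance to the stocking cart
    battery_level: float     # 0.0 - 1.0
    queued_tasks: int = 0

def select_unit(units, required_coupler):
    compatible = [u for u in units if required_coupler in u.couplers]
    if not compatible:
        return None          # no unit can couple with this cart type
    def cost(u):
        return u.distance_m + 25.0 * u.queued_tasks + 50.0 * (1.0 - u.battery_level)
    return min(compatible, key=cost)

units = [
    TransportUnit("mtu-1", {"cart_bar"}, 15.0, 0.9, queued_tasks=2),
    TransportUnit("mtu-2", {"cart_bar", "pallet_jack"}, 40.0, 0.8),
    TransportUnit("mtu-3", {"pallet_jack"}, 5.0, 0.3),
]
print(select_unit(units, "pallet_jack").unit_id)   # -> mtu-3 (closest compatible unit)
```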


Some embodiments may further confirm correct products to be stocked are on or in the stocking cart. In some instances, the central computer system receives sensor data indicative of a weight of the stocking cart and the plurality of products carried by the stocking cart. For example, the central computer system may receive sensor data from the motorized transport unit 102 that is directed to transport a specified stocking cart. Additionally or alternatively, the stocking cart may include one or more sensors that can detect weight, distributed weight, temperature, or the like. For example, the stocking cart may include an array of piezoelectric elements on the stocking cart that can detect weight and/or weight variations. Temperature sensors can detect when and/or where some products may be placed in or on the stocking cart. Further, some embodiments may use image and/or video processing over time to detect products and/or whether a stocking cart includes all of the intended products to be stocked from the stocking cart. This weight information may be communicated to the central computer system. In other embodiments, the motorized transport unit tasked to move the stocking cart may determine an estimated weight. For example, the motorized transport unit may estimate weight based on forces used to initiate movement and/or maintain movement of the stocking cart. The central computer system can then confirm based on the sensor data that an expected plurality of products is carried by the stocking cart.
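A minimal sketch, assuming known per-product weights and a cart tare weight (all values are illustrative), of confirming an expected load against a measured weight:

```python
# Hypothetical sketch: confirming that a stocking cart carries the expected
# products by comparing measured weight with the expected total weight.
def load_matches(measured_kg, expected_items, cart_tare_kg, tolerance_kg=2.0):
    """expected_items: list of (unit_weight_kg, quantity) tuples."""
    expected_total = cart_tare_kg + sum(w * q for w, q in expected_items)
    return abs(measured_kg - expected_total) <= tolerance_kg

# e.g., 24 bottled drinks at 1.1 kg and 10 boxes at 0.6 kg on an 18 kg cart
expected = [(1.1, 24), (0.6, 10)]
print(load_matches(50.0, expected, cart_tare_kg=18.0))   # True: within tolerance
print(load_matches(38.0, expected, cart_tare_kg=18.0))   # False: items likely missing
```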


The motorized transport unit is configured to autonomously transport the stocking cart to a specified stocking location on the sales floor corresponding to at least one of the first plurality of products carried by the first stocking cart. In some implementations, the central computer system communicates one or more routing instructions to the motorized transport unit that the motorized transport unit implements to move the stocking cart. The motorized transport unit typically includes one or more sensors that can be used to determine and/or confirm a location, track movement of the motorized transport unit and/or the stocking cart, detect distances between the motorized transport unit and/or stocking cart and other elements in the shopping facility (e.g., shelves, racks, movable item containers, customers, workers, other stocking carts, pallets, pallet jacks, fork lifts, boxes, products, and other such elements).


In some embodiments, the motorized transport unit can detect, through one or more sensors on the motorized transport unit, a customer or other element on the sales floor and within a threshold distance of the motorized transport unit as the motorized transport unit transports the stocking cart through the retail shopping facility. Based on the detection, the motorized transport unit can be configured to take one or more actions to avoid the motorized transport unit and the stocking cart from contacting the customer or other element. The sensor data may be a distance measurement, data from a movement sensor, image processing data, other such sensor data, or a combination of two or more of such sensor data. Further, the motorized transport unit and/or the central computer system may obtain sensor data from one or more sensors outside of the motorized transport unit. For example, image and/or video processing can be performed on images and/or video captured by other motorized transport units, cameras mounted in the shopping facility, user interface units, and/or other sources. In some instances, the central computer system may communicate additional avoidance instructions to the motorized transport unit in response to an evaluation at the central computer system of sensor data from the motorized transport unit and/or other sources. The actions taken by the motorized transport unit can depend on one or more factors, such as but not limited to distance between the motorized transport unit and the customer or other element, speed at which the motorized transport unit is traveling, weight and/or estimated weight of the stocking cart and products on the stocking cart, estimated rate of movement of the customer and/or other element, determined direction of travel of the customer and/or element, other elements around the motorized transport unit, available space around the motorized transport unit, other such factors, and typically a combination of such factors. In some instances, the action is to stop the motorized transport unit and the stocking cart. In other instances, the motorized transport unit may slow and change direction to avoid the customer and/or element. Accordingly, the motorized transport units are configured to operate in a congested retail shopping facility to support the stocking of products, while still maintaining safety in the retail shopping facility. Further, the motorized transport units can operate in a retail shopping facility where movement of other elements is random and often unexpected. This is in distinction from some systems that limit the activity within an environment where robots operate so that a control system controls all movement within that environment and does not have to take into consideration such safety factors and/or such random activity within the environment.
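As a non-limiting sketch only, an avoidance decision of the kind described above might weigh distance, speed, and load as follows; the braking model and thresholds are invented for illustration.

```python
# Hypothetical sketch: choosing an avoidance action while towing a loaded
# stocking cart (the braking model and thresholds are illustrative only).
def avoidance_action(distance_m, speed_mps, load_kg, clear_side_space=True):
    # Heavier loads are assumed to need more distance to stop.
    stopping_distance = 0.5 + speed_mps * (1.0 + load_kg / 200.0)
    if distance_m <= stopping_distance:
        return "stop"
    if clear_side_space:
        return "slow_and_steer_around"
    return "slow_and_wait"

print(avoidance_action(distance_m=1.0, speed_mps=1.2, load_kg=100))   # stop
print(avoidance_action(distance_m=4.0, speed_mps=1.2, load_kg=100))   # slow_and_steer_around
```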


In some embodiments, one or more sensors on the motorized transport unit are utilized to track movement of a worker and to use that detected movement to better support the worker. This enhanced support can include moving the stocking cart to follow the worker, moving the stocking cart to a location toward which the worker is anticipated to be moving (e.g., based on movements of the worker and products being transported by the stocking cart), keeping the motorized transport unit and the stocking cart out of the way of the worker while still being positioned to limit the movement the worker needs to make to access additional products to be stocked, and other such actions. In some embodiments, the central computer system 106 receives sensor data, which can include sensor data from the motorized transport unit. Using the sensor data, the central computer system is further configured to identify a worker stocking one or more of the products from the stocking cart, and detect from the sensor data that the worker has moved at least a first threshold distance from the stocking cart. In response, the central computer system can direct the motorized transport unit to move the stocking cart to follow the worker and place the stocking cart within a second threshold distance of the worker. The threshold distances can vary depending on the products being stocked, the rate at which the identified worker is stocking products, the movement of the worker, other traffic (e.g., customers' movements, other motorized transport units, etc.) in the area, and the like. As such, in some instances, the central computer system can direct the motorized transport unit to follow the stocking worker.
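The follow-the-worker behavior with first and second threshold distances might be sketched as follows; this is an illustrative, assumption-laden sketch rather than a description of the actual control logic, and the coordinate handling and threshold values are hypothetical.

```python
# Hypothetical sketch: the "follow the worker" rule described above, with
# two thresholds (values and coordinate handling are illustrative).
import math

FIRST_THRESHOLD_M = 6.0    # worker has wandered this far: reposition the cart
SECOND_THRESHOLD_M = 2.0   # place the cart back within this distance

def follow_command(worker_xy, cart_xy):
    wx, wy = worker_xy
    cx, cy = cart_xy
    gap = math.hypot(wx - cx, wy - cy)
    if gap < FIRST_THRESHOLD_M:
        return None                       # close enough; stay put
    # Target a point on the line from the cart toward the worker, stopping
    # SECOND_THRESHOLD_M short of the worker's position.
    scale = (gap - SECOND_THRESHOLD_M) / gap
    return (cx + (wx - cx) * scale, cy + (wy - cy) * scale)

print(follow_command((20.0, 0.0), (10.0, 0.0)))   # -> (18.0, 0.0)
print(follow_command((12.0, 0.0), (10.0, 0.0)))   # -> None (within first threshold)
```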


In some embodiments, the central computer system is further configured to determine when a stocking cart is unloaded and/or empty of the products being stocked on the sales floor. This determination may be based on image and/or video processing from one or more cameras, a determined weight pulled by the motorized transport unit and/or carried by the stocking cart, a notification from a worker, data received from a product scanner used by the worker, detection or lack of detection of RFID tags, other such factors, or a combination of two or more of such factors. The central computer system can then direct the motorized transport unit to perform one or more actions and/or functions in response to determining that the multiple products carried by the stocking cart have been unloaded and/or the stocking cart is empty of products. The central computer system may further determine that the stocking cart is empty of the plurality of products and carries waste material. Based on the waste material, the central computer system can communicate instructions to the motorized transport unit to cause the motorized transport unit to transport the stocking cart to a waste disposal area. Workers at the waste disposal area can remove the waste material and properly dispose of the material. In other instances, the motorized transport unit may cause some or all of the waste material to be disposed in different and appropriate disposal bins (e.g., recycling bin, trash bin, etc.).
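As a brief illustrative sketch (the state flags and destination names are hypothetical), the post-unloading decision described above might reduce to:

```python
# Hypothetical sketch: deciding where an unloaded stocking cart goes next
# (state flags and destination names are illustrative).
def next_destination(products_remaining, carries_waste):
    if products_remaining > 0:
        return "next_stocking_location"
    if carries_waste:
        return "waste_disposal_area"
    return "empty_cart_drop_location"

assert next_destination(0, carries_waste=True) == "waste_disposal_area"
assert next_destination(0, carries_waste=False) == "empty_cart_drop_location"
assert next_destination(3, carries_waste=False) == "next_stocking_location"
```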


Additionally or alternatively, the central computer system may determine that the plurality of products have been unloaded from the stocking cart and communicate instructions to the motorized transport unit to cause the motorized transport unit to transport the unloaded and/or empty stocking cart to a drop location. The central computer system may further communicate instructions to the motorized transport unit instructing the motorized transport unit to retrieve a specified second stocking cart that is carrying a second plurality of products, and to autonomously transport the second stocking cart to the same or a different specified stocking location on the sales floor.


As described above, the motorized transport units are configured to perform multiple different types of tasks at the shopping facility. Accordingly, the central computer system 106 further instructs the motorized transport units to perform other tasks at the shopping facility. In some instances, the central computer system may determine based on the sensor data that the plurality of products have been unloaded from the stocking cart, and can direct the motorized transport unit to perform a second task, of the multiple different tasks, at the retail shopping facility that is unassociated with stocking products and retrieving products.



FIG. 6 illustrates a simplified block diagram of some components of an exemplary shopping facility assistance system 600, in accordance with some embodiments. The components in part support the stocking of products. The shopping facility assistance system 600 includes the central computer system 106 in communication with multiple motorized transport units 102. Multiple stocking carts 602 are available to be transported through the shopping facility to support the stocking of products on the sales floor. Multiple workers interact with the stocking carts 602, central computer system 106 and/or motorized transport units, in part, to retrieve products from the stocking carts and stock relevant shelves, racks and/or other such product supports. Some embodiments may further include sensors and/or sensor systems 608 that communicate with the central computer system. Further, one or more databases 126 and/or other such information sources can be accessed by the central computer system to obtain product information, stocking information, stocking schedules, product location information, product demand information, sales information and/or other such information that may be used by the central computer system in determining stocking needs, identifying products to be stocked, instructions for loading products on one or more stocking carts, routing information, and the like.


In some embodiments, the central computer system 106 further includes a location application 610, a routing application 612, a work queue application 614, and/or a replenishment application 616. The location application can receive sensor data from motorized transport units, sensor data from a stocking cart, shopping facility sensors, and the like. Based on the sensor data, the location application can determine a location of the motorized transport units, stocking cart, product locations and the like. The routing application 612 can determine optimum routing of the motorized transport units in going to a stocking cart, transporting a stocking cart, moving the stocking cart, returning the stocking cart, and the like. The work queue application can add and track tasks to be performed, including stocking tasks, transporting stocking carts, and the like. The work queue application may, in some instances, further maintain and track task queues for each motorized transport unit, including when a stocking task is added to a motorized transport unit task queue. The replenishment application can track product inventory, determine when and where to stock products, and schedule the stocking of products.
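By way of a non-limiting sketch, a work queue application maintaining a task queue per motorized transport unit might be skeletonized as follows; the class and method names are invented for illustration and do not reflect the described implementation.

```python
# Hypothetical skeleton of a work-queue application maintaining a task
# queue per motorized transport unit (names invented for illustration).
from collections import defaultdict, deque

class WorkQueueApplication:
    def __init__(self):
        self._queues = defaultdict(deque)       # unit id -> pending tasks

    def add_task(self, unit_id, task):
        self._queues[unit_id].append(task)

    def next_task(self, unit_id):
        queue = self._queues[unit_id]
        return queue.popleft() if queue else None

    def least_busy_unit(self, unit_ids):
        return min(unit_ids, key=lambda uid: len(self._queues[uid]))

wq = WorkQueueApplication()
wq.add_task("mtu-1", {"type": "transport_stocking_cart", "cart": "cart-7"})
unit = wq.least_busy_unit(["mtu-1", "mtu-2"])   # mtu-2 has an empty queue
wq.add_task(unit, {"type": "return_empty_cart", "cart": "cart-3"})
print(unit, wq.next_task("mtu-1"))
```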



FIG. 7 illustrates a simplified flow diagram of an exemplary process 700 of assisting in stocking of products, in accordance with some embodiments. In step 702, the central computer system 106 coordinates a plurality of motorized transport units 102 in performing the multiple different tasks. Many if not all of the motorized transport units are typically configured to perform multiple different types of tasks at the retail shopping facility.


In step 704, the central computer system communicates instructions to a selected motorized transport unit instructing the motorized transport unit to retrieve a specified stocking cart that is carrying a plurality of products that are to be restocked onto product supports that are positioned on the sales floor where customers travel in shopping for products. In some instances, the central computer system identifies a type of mating coupler on the stocking cart and with which one of the motorized transport units is to couple in transporting the stocking cart, and identifies, from the plurality of motorized transport units, that one or more motorized transport units comprise a coupler consistent with the type of mating coupler on the stocking cart. The central computer system can then select at least one of these motorized transport units based on the motorized transport unit comprising the coupler consistent with the type of mating coupler on the stocking cart. In step 706, the central computer system communicates instructions to the motorized transport unit instructing the motorized transport unit to autonomously transport the stocking cart to a specified stocking location on the sales floor corresponding to at least one of the plurality of products carried by the stocking cart.


In some embodiments, the central computer system may further confirm that a stocking cart has the correct products. For example, the central computer system may receive, from the motorized transport unit, sensor data indicative of a weight of the stocking cart and the plurality of products carried by the stocking cart. Based on the sensor data, the central computer system can confirm that an expected plurality of products is carried by the stocking cart. Additionally or alternatively, the central computer system may receive product scan data, RFID data and the like regarding products moved onto the stocking cart.


Sensor data from sensors on a motorized transport unit can further be used in routing the motorized transport unit and/or to avoid hitting customers and other objects in the shopping facility. Some embodiments detect, through sensor data from one or more sensors on the motorized transport unit, a customer on the sales floor who is within a threshold distance of the motorized transport unit as the motorized transport unit transports the stocking cart through the retail shopping facility. The motorized transport unit can be caused to take one or more actions to avoid the motorized transport unit and the stocking cart from contacting the customer. The action may be based on one or more instructions from the central computer system, may be determined by the motorized transport unit control circuit 406, or the like.


Similarly, sensor data may be utilized to cause the motorized transport unit to follow a worker as the worker stocks products from the stocking cart. Some embodiments detect, from sensor data comprising sensor data from the motorized transport unit, a worker stocking one or more of the products from the stocking cart, and detect from the sensor data that the worker has moved at least a first threshold distance from the stocking cart. The motorized transport unit can be directed to move the stocking cart to follow the worker and place the stocking cart within a second threshold distance of the worker.


Some embodiments further detect that all of the products are unloaded from the stocking cart and/or the stocking cart is empty of the plurality of products, and may further detect that the stocking cart carries waste material. In response, the central computer system can communicate instructions instructing the motorized transport unit to transport the stocking cart to a waste disposal area. Similarly, it may be detected that the plurality of products have been unloaded from the stocking cart, and the central computer system may communicate instructions instructing the motorized transport unit to transport the unloaded and/or empty stocking cart to a drop location. Further, the central computer system may communicate instructions instructing the motorized transport unit to retrieve a different specified stocking cart that is carrying a second plurality of products, and to autonomously transport that stocking cart to a specified second stocking location on the sales floor.


Again, many if not all of the motorized transport units are configured to perform multiple different tasks associated with the shopping facility. Some embodiments detect based on the sensor data that the first plurality of products have been unloaded from the stocking cart. Instructions can be communicated to the motorized transport unit, in response to determining the stocking cart is unloaded and/or empty, directing the motorized transport unit to perform a second task of the multiple different tasks at the retail shopping facility that is unassociated with stocking products and retrieving products.



FIG. 8 illustrates a simplified flow diagram of an exemplary process 800 of controlling the movement of motorized transport units in supporting stocking at a shopping facility, in accordance with some embodiments. In step 802, one or more stocking carts are loaded with products intended to be stocked on shelves, racks, and/or other such product supports. In some embodiments, the central computer system and/or a product loading system may track a sequence, order of placement and/or location of placement of the products on the stocking cart. The worker or system loading the stocking cart may scan products as they are placed, image processing may be used to track placement on the stocking cart, sensors on the stocking cart may detect placement (e.g., weight sensors, RFID detectors, etc.), and the like.


In step 804, the central computer system is notified that the stocking cart is ready to be transported to the sales floor. This may be based on sensor data from a worker (e.g., from a bar code scanner, RFID detector, etc.), notification from a worker (e.g., through a user interface unit, a notification that loading is complete, etc.), sensor data from the stocking cart and/or other sensors (e.g., cameras), and the like. In some embodiments, the notification may further include an identification of one or more departments and/or areas to which the products on the stocking cart are to be taken. The loaded stocking cart may contain products intended for different departments and/or parts of the shopping facility. Accordingly, the central computer system typically has knowledge of the products placed on the stocking cart (e.g., the central computer system may specify which products are placed on the stocking cart, the worker loading the stocking cart may scan products placed on the stocking cart, RFID tags of products placed on the stocking cart may be read, other such determinations may be made, or a combination of two or more of such sources of information may be used). Based on the products, the central computer system can direct the motorized transport unit to transport the products to each of the multiple locations in turn to allow one or more workers to retrieve the products from the stocking cart. The information may further identify specific shelves, hooks or the like on which products are to be placed.


In step 806, the central computer system uses the location of the stocking cart, the product information and product stocking locations to determine a routing that a motorized transport unit is to follow in transporting the stocking cart. The transport task can be entered into a task queue or work queue. In step 808, the central computer system identifies potentially available motorized transport units and selects a relevant motorized transport unit to cooperate with the loaded stocking cart, and adds the transport task to the selected motorized transport unit's queue. The selection of the motorized transport unit can be dependent on the type of motorized transport unit, capabilities of the motorized transport unit to move the loaded stocking cart, a coupler on the motorized transport unit that is capable of mating with a coupler on the stocking cart, the capability of the motorized transport unit to operate with the type of loaded stocking cart (e.g., flat bed, pallet, pallet jack, specialty cart, sampling cart, display cart, sorting cart, shopping cart, etc.), task queue, other such factors, and typically a combination of two or more of such factors.


In some instances, optional step 810 is included where the selected motorized transport unit may be directed to an unloaded and/or empty stocking cart to transport the empty stocking cart to a specified location as part of routing the motorized transport unit to the location of the loaded stocking cart. In returning the empty stocking cart, the motorized transport unit may further be directed to one or more disposal areas to dispose of waste on the stocking cart. In step 812, the motorized transport unit couples with the loaded stocking cart, and implements routing instructions to transport the loaded stocking cart. In some embodiments, the motorized transport unit may further notify the central computer system that the motorized transport unit is in transit and moving the stocking cart. Typically, the routing instructions are dependent on the product loading sequence. For example, the routing instructions can cause the motorized transport unit to transport the loaded stocking cart to a location where the products last loaded onto the stocking cart are to be stocked (i.e., last on, first off). As products are removed, the motorized transport unit may move the stocking cart (e.g., to follow a worker, to move to each of one or more different stocking locations as products are removed from the stocking cart, and the like).
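A minimal sketch of the last-on, first-off ordering described above, assuming a recorded loading sequence and a product-to-location mapping (the product and aisle names are illustrative only):

```python
# Hypothetical sketch: ordering stocking stops "last on, first off" from the
# recorded loading sequence (product and aisle names are illustrative).
def stop_order(loading_sequence, product_locations):
    """loading_sequence: product ids in the order they were loaded.
    Returns stocking locations ordered so the last-loaded products come first,
    without visiting the same location twice."""
    stops = []
    for product in reversed(loading_sequence):
        location = product_locations[product]
        if location not in stops:
            stops.append(location)
    return stops

loaded = ["cereal", "soda", "diapers", "soda"]
locations = {"cereal": "aisle 4", "soda": "aisle 9", "diapers": "aisle 12"}
print(stop_order(loaded, locations))   # ['aisle 9', 'aisle 12', 'aisle 4']
```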


In step 814, products are removed from the stocking cart by a worker, and in some instances the products removed are tracked (e.g., based on RFID readings, bar code scans by the worker, weight changes, etc.). In step 816, sensor data is monitored to identify when a stocking cart is ready to be moved and/or when a stocking cart is ready to be returned to a loading area. Again, in some instances, the central computer system may direct the motorized transport unit to transport the unloaded stocking cart to a waste disposal area (e.g., carton crusher area, recycle area, trash area, etc.). In step 818, the central computer system can select an appropriate motorized transport unit to transport the unloaded stocking cart back to a loading area when a motorized transport unit is not already cooperating with the stocking cart. In step 820, the return task can be placed in the task queue of the selected motorized transport unit. In step 822, the selected motorized transport unit is provided routing instructions to move to and cooperate with the unloaded stocking cart, and to transport the stocking cart to one or more disposal locations and/or an empty cart location. In step 824, the motorized transport unit is caused to transport the unloaded stocking cart to the one or more relevant locations, and may wait until waste is removed from the stocking cart, and then return the empty stocking cart to a stocking cart corral or other such location. In step 826, the central computer system may direct the motorized transport unit to perform one or more other tasks.


In some implementations, the central computer system can schedule the pulling of stocking carts. Often in retail shopping facilities, workers tend to pull numerous stocking carts out onto the sales floor so that they can proceed to perform stocking without interruption to go to a back storage area to retrieve additional products to be stocked. Such stocking cart pulling can cause congestion on the sales floor, inhibit customers' abilities to access some products, and otherwise interfere with customers' shopping experience. Further, the worker may have to hunt through the multiple pulled stocking carts to find products that the worker is trying to stock. In some embodiments, however, the central computer system can coordinate the pulling of stocking carts as one or more workers are ready for additional products. As such, workers do not need to return to back areas to pull additional stocking carts, and numerous stocking carts are not simultaneously on the sales floor inhibiting customers' shopping experiences. Similarly, the central computer system can direct one or more motorized transport units to return empty stocking carts to a storage area so that workers do not have to take time to move the empty stocking carts.


In some embodiments, systems, apparatuses, processes and methods are provided to assist product stocking on a sales floor of a retail shopping facility. Some embodiments comprise: a plurality of motorized transport units that are each configured to perform multiple different types of tasks at a retail shopping facility; a central computer system configured to coordinate the plurality of motorized transport units in performing the multiple different tasks comprising instruct a first motorized transport unit to retrieve a specified first stocking cart that is carrying a first plurality of products that are to be restocked onto product supports that are positioned on the sales floor where customers travel in shopping for products, and further instruct the first motorized transport unit to autonomously transport the first stocking cart to a specified first stocking location on the sales floor corresponding to at least one of the first plurality of products carried by the first stocking cart.


Further, some embodiments include methods to assist product stocking on a sales floor of a retail shopping facility, comprising: by a central computer system for a retail shopping facility: coordinating a plurality of motorized transport units in performing the multiple different tasks, wherein each of the plurality of motorized transport units is configured to perform multiple different types of tasks at the retail shopping facility; communicating an instruction instructing a first motorized transport unit to retrieve a specified first stocking cart that is carrying a first plurality of products that are to be restocked onto product supports that are positioned on the sales floor where customers travel in shopping for products; and communicating an instruction instructing the first motorized transport unit to autonomously transport the first stocking cart to a specified first stocking location on the sales floor corresponding to at least one of the first plurality of products carried by the first stocking cart.


Those skilled in the art will recognize that a wide variety of modifications, alterations, and combinations can be made with respect to the above described embodiments without departing from the scope of the invention, and that such modifications, alterations, and combinations are to be viewed as being within the ambit of the inventive concept.

Claims
  • 1. A system to assist product stocking on a sales floor of a retail shopping facility, comprising: a plurality of motorized transport units that are each configured to perform multiple different tasks at a retail shopping facility; and a central computer system configured to coordinate the plurality of motorized transport units in performing the multiple different tasks comprising instruct a first motorized transport unit to retrieve a specified first stocking cart that is carrying a first plurality of products that are to be restocked onto product supports that are positioned on the sales floor where customers travel in shopping for products, and further instruct the first motorized transport unit to autonomously transport the first stocking cart to a specified first stocking location on the sales floor corresponding to at least one of the first plurality of products carried by the first stocking cart; wherein the central computer system is further configured to detect, from sensor data comprising sensor data from the first motorized transport unit, a worker stocking one or more of the products from the first stocking cart, detect from the sensor data that the worker has moved at least a first threshold distance from the first stocking cart, and direct the first motorized transport unit to move the first stocking cart to follow the worker and place the first stocking cart within a second threshold distance of the worker; wherein each of the plurality of motorized transport units is configured to perform the multiple different tasks at the retail shopping facility comprising at least a first set of tasks to support customers comprising at least the first motorized transport unit being configured to temporarily couple with and transport a movable item container configured to receive items for potential purchase by a customer with which the first motorized transport unit is temporarily associated, a second set of tasks to support workers at the retail shopping facility comprising at least the first motorized transport unit being configured to temporarily couple with and transport the first stocking cart, and a third set of tasks regarding facility maintenance at the retail shopping facility to at least perform a clean-up.
  • 2. The system of claim 1, wherein the first motorized transport unit detects, through a sensor on the motorized transport unit, a customer on the sales floor within a threshold distance of the first motorized transport unit as the first motorized transport unit transports the first stocking cart through the retail shopping facility, and takes at least a first action to avoid the first motorized transport unit and the first stocking cart from contacting the customer.
  • 3. The system of claim 1, wherein the central computer system is further configured to receive, from the first motorized transport unit, sensor data indicative of a weight of the first stocking cart and the first plurality of products carried by the first stocking cart, and to confirm based on the sensor data that an expected plurality of products is carried by the first stocking cart.
  • 4. The system of claim 1, wherein the central computer system is further configured to determine based on the sensor data that the first plurality of products have been unloaded from the first stocking cart, and direct the first motorized transport unit to perform a second task, of the multiple different tasks, at the retail shopping facility and that is unassociated with stocking products and retrieving products.
  • 5. The system of claim 1, wherein the central computer system is further configured to determine that the first plurality of products have been unloaded from the first stocking cart, instruct the first motorized transport unit to transport the unloaded first stocking cart to a drop location, and instruct the first motorized transport unit to retrieve a specified second stocking cart that is carrying a second plurality of products, and to autonomously transport the second stocking cart to a specified second stocking location on the sales floor.
  • 6. The system of claim 1, wherein the central computer in directing the first motorized transport unit to follow the worker and place the first stocking cart within the second threshold distance of the worker utilizes the sensor data from the first motorized transport unit and facility sensor data received from sensors external to the first motorized transport unit in determining a distance between the first stocking cart and the worker.
  • 7. The system of claim 1, wherein the central computer is further configured to implement a work queue application that maintains a task queue for each of the plurality of motorized transport units, tracks the task queues in identifying available motorized transport units to perform different tasks, selects the first motorized transport unit to temporarily cooperate with the first stocking cart to perform the task of transporting the first stocking cart based on a first task queue associated with the first motorized transport unit, and adds the assigned transport task to the first task queue.
  • 8. A system to assist product stocking on a sales floor of a retail shopping facility, comprising: a plurality of motorized transport units that are each configured to perform multiple different tasks at a retail shopping facility; and a central computer system configured to coordinate the plurality of motorized transport units in performing the multiple different tasks comprising instruct a first motorized transport unit to retrieve a specified first stocking cart that is carrying a first plurality of products that are to be restocked onto product supports that are positioned on the sales floor where customers travel in shopping for products, and further instruct the first motorized transport unit to autonomously transport the first stocking cart to a specified first stocking location on the sales floor corresponding to at least one of the first plurality of products carried by the first stocking cart; wherein the central computer system is further configured to identify a mating coupler on the first stocking cart and with which one of the motorized transport units is to couple in transporting the first stocking cart, identify, from the plurality of motorized transport units, that the first motorized transport unit comprises a coupler consistent with the mating coupler on the first stocking cart, and select the first motorized transport unit based on the first motorized transport unit comprising the coupler consistent with the mating coupler on the first stocking cart; wherein each of the plurality of motorized transport units is configured to perform the multiple different tasks at the retail shopping facility comprising at least a first set of tasks to support customers comprising at least being configured to temporarily couple with and transport a movable item container configured to receive items for potential purchase by a customer with which the first motorized transport unit is temporarily associated, a second set of tasks to support workers at the retail shopping facility comprising at least the first motorized transport unit being configured to temporarily couple with and transport stocking carts, and a third set of tasks regarding facility maintenance at the retail shopping facility to at least perform a clean-up.
  • 9. A system to assist product stocking on a sales floor of a retail shopping facility, comprising: a plurality of motorized transport units that are each configured to perform multiple different tasks at a retail shopping facility; and a central computer system configured to coordinate the plurality of motorized transport units in performing the multiple different tasks comprising instruct a first motorized transport unit to retrieve a specified first stocking cart that is carrying a first plurality of products that are to be restocked onto product supports that are positioned on the sales floor where customers travel in shopping for products, and further instruct the first motorized transport unit to autonomously transport the first stocking cart to a specified first stocking location on the sales floor corresponding to at least one of the first plurality of products carried by the first stocking cart; wherein the central computer system is further configured to determine that the first plurality of products have been unloaded from the first stocking cart and the first stocking cart carries waste material, and instructs the first motorized transport unit to transport the first stocking cart to a waste disposal area; wherein each of the plurality of motorized transport units is configured to perform the multiple different tasks at the retail shopping facility comprising at least a first set of tasks to support customers comprising at least being configured to temporarily couple with and transport a movable item container configured to receive items for potential purchase by a customer with which the first motorized transport unit is temporarily associated, a second set of tasks to support workers at the retail shopping facility comprising at least the first motorized transport unit being configured to temporarily couple with and transport stocking carts, and a third set of tasks regarding facility maintenance at the retail shopping facility to at least perform a clean-up.
  • 10. A method to assist product stocking on a sales floor of a retail shopping facility, comprising: by a central computer system for a retail shopping facility: coordinating a plurality of motorized transport units in performing the multiple different tasks, wherein each of the plurality of motorized transport units are configured to perform multiple different tasks at the retail shopping facility comprising at least a first set of tasks to support customers comprising at least being configured to temporarily couple with and transport a movable item container configured to receive items for potential purchase by a customer with which the first motorized transport unit is temporarily associated, a second set of tasks to support workers at the retail shopping facility comprising at least the first motorized transport unit being configured to temporarily couple with and transport stocking carts, and a third set of tasks regarding facility maintenance at the retail shopping facility to at least perform a clean-up; communicating an instruction instructing a first motorized transport unit to retrieve a specified first stocking cart of the stocking carts that is carrying a first plurality of products that are to be restocked onto product supports that are positioned on the sales floor where customers travel in shopping for products; communicating an instruction instructing the first motorized transport unit to autonomously transport the first stocking cart to a specified first stocking location on the sales floor corresponding to at least one of the first plurality of products carried by the first stocking cart; detecting, from sensor data comprising sensor data from the first motorized transport unit, a worker stocking one or more of the products from the first stocking cart; detecting from the sensor data that the worker has moved at least a first threshold distance from the first stocking cart; and directing the first motorized transport unit to move the first stocking cart to follow the worker and place the first stocking cart within a second threshold distance of the worker.
  • 11. The method of claim 10, further comprising: detecting, through sensor data from a sensor on the motorized transport unit, a customer on the sales floor within a threshold distance of the first motorized transport unit as the first motorized transport unit transports the first stocking cart through the retail shopping facility; and causing the first motorized transport unit to take at least a first action to avoid the first motorized transport unit and the first stocking cart from contacting the customer.
  • 12. The method of claim 10, further comprising: receiving, from the first motorized transport unit, sensor data indicative of a weight of the first stocking cart and the first plurality of products carried by the first stocking cart; and confirming based on the sensor data that an expected plurality of products is carried by the first stocking cart.
  • 13. The method of claim 10, further comprising: determining based on the sensor data that the first plurality of products have been unloaded from the first stocking cart; and directing the first motorized transport unit, in response to determining the first stocking cart is unloaded, to perform a second task of the multiple different tasks at the retail shopping facility and that is unassociated with stocking products and retrieving products.
  • 14. The method of claim 10, further comprising: determining that the first plurality of products have been unloaded from the first stocking cart; communicating an instruction instructing the first motorized transport unit to transport the unloaded first stocking cart to a drop location; and communicating an instruction instructing the first motorized transport unit to retrieve a specified second stocking cart that is carrying a second plurality of products, and to autonomously transport the second stocking cart to a specified second stocking location on the sales floor.
  • 15. A method to assist product stocking on a sales floor of a retail shopping facility, comprising: by a central computer system for a retail shopping facility: coordinating a plurality of motorized transport units in performing the multiple different tasks, wherein each of the plurality of motorized transport units are configured to perform multiple different tasks at the retail shopping facility comprising at least a first set of tasks to support customers comprising at least being configured to temporarily couple with and transport a movable item container configured to receive items for potential purchase by a customer with which the first motorized transport unit is temporarily associated, a second set of tasks to support workers at the retail shopping facility comprising at least the first motorized transport unit being configured to temporarily couple with and transport stocking carts, and a third set of tasks regarding facility maintenance at the retail shopping facility to at least perform a clean-up; communicating an instruction instructing a first motorized transport unit to retrieve a specified first stocking cart of the stocking carts that is carrying a first plurality of products that are to be restocked onto product supports that are positioned on the sales floor where customers travel in shopping for products; communicating an instruction instructing the first motorized transport unit to autonomously transport the first stocking cart to a specified first stocking location on the sales floor corresponding to at least one of the first plurality of products carried by the first stocking cart; identifying a mating coupler on the first stocking cart and with which one of the motorized transport units is to couple in transporting the first stocking cart; identifying, from the plurality of motorized transport units, that the first motorized transport unit comprises a coupler consistent with the mating coupler on the first stocking cart, and selecting the first motorized transport unit based on the first motorized transport unit comprising the coupler consistent with the mating coupler on the first stocking cart.
  • 16. A method to assist product stocking on a sales floor of a retail shopping facility, comprising: by a central computer system for a retail shopping facility: coordinating a plurality of motorized transport units in performing the multiple different tasks, wherein each of the plurality of motorized transport units are configured to perform multiple different tasks at the retail shopping facility comprising at least a first set of tasks to support customers comprising at least being configured to temporarily couple with and transport a movable item container configured to receive items for potential purchase by a customer with which the first motorized transport unit is temporarily associated, a second set of tasks to support workers at the retail shopping facility comprising at least the first motorized transport unit being configured to temporarily couple with and transport stocking carts, and a third set of tasks regarding facility maintenance at the retail shopping facility to at least perform a clean-up; communicating an instruction instructing a first motorized transport unit to retrieve a specified first stocking cart of the stocking carts that is carrying a first plurality of products that are to be restocked onto product supports that are positioned on the sales floor where customers travel in shopping for products; communicating an instruction instructing the first motorized transport unit to autonomously transport the first stocking cart to a specified first stocking location on the sales floor corresponding to at least one of the first plurality of products carried by the first stocking cart; determining that the first plurality of products have been unloaded from the first stocking cart and the first stocking cart carries waste material; and communicating an instruction instructing the first motorized transport unit to transport the first stocking cart to a waste disposal area.
RELATED APPLICATIONS

This application claims the benefit of each of the following U.S. Provisional applications, each of which is incorporated herein by reference in its entirety: U.S. Provisional Application No. 62/129,726, filed Mar. 6, 2015; U.S. Provisional Application No. 62/129,727, filed Mar. 6, 2015; U.S. Provisional Application No. 62/138,877, filed Mar. 26, 2015; U.S. Provisional Application No. 62/138,885, filed Mar. 26, 2015; U.S. Provisional Application No. 62/152,421, filed Apr. 24, 2015; U.S. Provisional Application No. 62/152,465, filed Apr. 24, 2015; U.S. Provisional Application No. 62/152,440, filed Apr. 24, 2015; U.S. Provisional Application No. 62/152,630, filed Apr. 24, 2015; U.S. Provisional Application No. 62/152,711, filed Apr. 24, 2015; U.S. Provisional Application No. 62/152,610, filed Apr. 24, 2015; U.S. Provisional Application No. 62/152,667, filed Apr. 24, 2015; U.S. Provisional Application No. 62/157,388, filed May 5, 2015; U.S. Provisional Application No. 62/165,579, filed May 22, 2015; U.S. Provisional Application No. 62/165,416, filed May 22, 2015; U.S. Provisional Application No. 62/165,586, filed May 22, 2015; U.S. Provisional Application No. 62/171,822, filed Jun. 5, 2015; U.S. Provisional Application No. 62/175,182, filed Jun. 12, 2015; U.S. Provisional Application No. 62/182,339, filed Jun. 19, 2015; U.S. Provisional Application No. 62/185,478, filed Jun. 26, 2015; U.S. Provisional Application No. 62/194,131, filed Jul. 17, 2015; U.S. Provisional Application No. 62/194,119, filed Jul. 17, 2015; U.S. Provisional Application No. 62/194,121, filed Jul. 17, 2015; U.S. Provisional Application No. 62/194,127, filed Jul. 17, 2015; U.S. Provisional Application No. 62/202,744, filed Aug. 7, 2015; U.S. Provisional Application No. 62/202,747, filed Aug. 7, 2015; U.S. Provisional Application No. 62/205,548, filed Aug. 14, 2015; U.S. Provisional Application No. 62/205,569, filed Aug. 14, 2015; U.S. Provisional Application No. 62/205,555, filed Aug. 14, 2015; U.S. Provisional Application No. 62/205,539, filed Aug. 14, 2015; U.S. Provisional Application No. 62/207,858, filed Aug. 20, 2015; U.S. Provisional Application No. 62/214,826, filed Sep. 4, 2015; U.S. Provisional Application No. 62/214,824, filed Sep. 4, 2015; U.S. Provisional Application No. 62/292,084, filed Feb. 5, 2016; U.S. Provisional Application No. 62/302,547, filed Mar. 2, 2016; U.S. Provisional Application No. 62/302,567, filed Mar. 2, 2016; U.S. Provisional Application No. 62/302,713, filed Mar. 2, 2016; and U.S. Provisional Application No. 62/303,021, filed Mar. 3, 2016.

US Referenced Citations (611)
Number Name Date Kind
1774653 Marriott Sep 1930 A
2669345 Brown Feb 1954 A
3765546 Westerling Oct 1973 A
4071740 Gogulski Jan 1978 A
4158416 Podesta Jun 1979 A
4588349 Reuter May 1986 A
4672280 Honjo Jun 1987 A
4777416 George, II Oct 1988 A
4791482 Barry Dec 1988 A
4868544 Havens Sep 1989 A
4911608 Krappitz Mar 1990 A
5119087 Lucas Jun 1992 A
5279672 Betker Jan 1994 A
5287266 Malec Feb 1994 A
5295551 Sukonick Mar 1994 A
5363305 Cox Nov 1994 A
5380138 Kasai Jan 1995 A
5384450 Goetz, Jr. Jan 1995 A
5395206 Cerny, Jr. Mar 1995 A
5402051 Fujiwara Mar 1995 A
5548515 Pilley Aug 1996 A
5632381 Thust May 1997 A
5652489 Kawakami Jul 1997 A
5671362 Cowe Sep 1997 A
5777571 Chuang Jul 1998 A
5801340 Peter Sep 1998 A
5917174 Moore Jun 1999 A
5920261 Hughes Jul 1999 A
5969317 Espy Oct 1999 A
6018397 Cloutier Jan 2000 A
6199753 Tracy Mar 2001 B1
6201203 Tilles Mar 2001 B1
6240342 Fiegert May 2001 B1
6339735 Peless Jan 2002 B1
6365857 Maehata Apr 2002 B1
6374155 Wallach Apr 2002 B1
6394519 Byers May 2002 B1
6431078 Serrano Aug 2002 B2
6522952 Arai Feb 2003 B1
6525509 Petersson Feb 2003 B1
6535793 Allard Mar 2003 B2
6550672 Tracy Apr 2003 B1
6571693 Kaldenberg Jun 2003 B1
6584375 Bancroft Jun 2003 B2
6584376 VanKommer Jun 2003 B1
6600418 Francis Jul 2003 B2
6601759 Fife Aug 2003 B2
6606411 Loui Aug 2003 B1
6626632 Guenzi Sep 2003 B2
6633800 Ward Oct 2003 B1
6655897 Harwell Dec 2003 B1
6667592 Jacobs Dec 2003 B2
6672601 Hofheins Jan 2004 B1
6678583 Nasr Jan 2004 B2
6688435 Will Feb 2004 B1
6728597 Didriksen Apr 2004 B2
6731204 Lehmann May 2004 B2
6745186 Testa Jun 2004 B1
6752582 Garcia Jun 2004 B2
6810149 Squilla Oct 2004 B1
6816085 Haynes Nov 2004 B1
6832884 Robinson Dec 2004 B2
6841963 Song Jan 2005 B2
6883201 Jones Apr 2005 B2
6885736 Uppaluru Apr 2005 B2
6886101 Glazer Apr 2005 B2
6895301 Mountz May 2005 B2
6910828 Hughes Jun 2005 B1
6937989 McIntyre Aug 2005 B2
6954695 Bonilla Oct 2005 B2
6967455 Nakadai Nov 2005 B2
6975997 Murakami Dec 2005 B1
7039499 Nasr May 2006 B1
7066291 Martins Jun 2006 B2
7101113 Hughes Sep 2006 B2
7101139 Benedict Sep 2006 B1
7117902 Osborne Oct 2006 B2
7145562 Schechter Dec 2006 B2
7147154 Myers Dec 2006 B2
7177820 McIntyre Feb 2007 B2
7184586 Jeon Feb 2007 B2
7205016 Garwood Apr 2007 B2
7206753 Bancroft Apr 2007 B2
7222363 Rice May 2007 B2
7233241 Overhultz Jun 2007 B2
7234609 DeLazzer Jun 2007 B2
7261511 Felder Aug 2007 B2
7367245 Okazaki May 2008 B2
7381022 King Jun 2008 B1
7402018 Mountz Jul 2008 B2
7431208 Feldman Oct 2008 B2
7447564 Yasukawa Nov 2008 B2
7463147 Laffoon Dec 2008 B1
7474945 Matsunaga Jan 2009 B2
7487913 Adema Feb 2009 B2
7533029 Mallett May 2009 B2
7554282 Nakamoto Jun 2009 B2
7556108 Won Jul 2009 B2
7556219 Page Jul 2009 B2
7587756 Peart Sep 2009 B2
7613544 Park Nov 2009 B2
7627515 Borgs Dec 2009 B2
7636045 Sugiyama Dec 2009 B2
7648068 Silverbrook Jan 2010 B2
7653603 Holtkamp Jan 2010 B1
7658327 Tuchman Feb 2010 B2
7689322 Tanaka Mar 2010 B2
7693605 Park Apr 2010 B2
7693745 Pomerantz Apr 2010 B1
7693757 Zimmerman Apr 2010 B2
7706917 Chiappetta Apr 2010 B1
7716064 McIntyre May 2010 B2
7726563 Scott Jun 2010 B2
7762458 Stawar Jul 2010 B2
7783527 Bonner Aug 2010 B2
7787985 Tsujimoto Aug 2010 B2
7817394 Mukherjee Oct 2010 B2
7826919 D'Andrea Nov 2010 B2
7835281 Lee Nov 2010 B2
7894932 Mountz Feb 2011 B2
7894939 Zini Feb 2011 B2
7969297 Haartsen Jun 2011 B2
7996109 Zini Aug 2011 B2
8010230 Zini Aug 2011 B2
8032249 Shakes Oct 2011 B1
8041455 Thorne Oct 2011 B2
8050976 Staib Nov 2011 B2
8065032 Stifter Nov 2011 B2
8065353 Eckhoff-Hornback Nov 2011 B2
8069092 Bryant Nov 2011 B2
8083013 Bewley Dec 2011 B2
8099191 Blanc Jan 2012 B2
8103398 Duggan Jan 2012 B2
8195333 Ziegler Jun 2012 B2
8239276 Lin Aug 2012 B2
8244041 Silver Aug 2012 B1
8248467 Ganick Aug 2012 B1
8260456 Siegel Sep 2012 B2
8284240 Saint-Pierre Oct 2012 B2
8295542 Albertson Oct 2012 B2
8321303 Krishnamurthy Nov 2012 B1
8325036 Fuhr Dec 2012 B1
8342467 Stachowski Jan 2013 B2
8352110 Szybalski Jan 2013 B1
8359122 Koselka Jan 2013 B2
8380349 Hickman Feb 2013 B1
8393846 Coots Mar 2013 B1
8412400 DAndrea Apr 2013 B2
8423280 Edwards Apr 2013 B2
8425173 Lert Apr 2013 B2
8429004 Hamilton Apr 2013 B2
8430192 Gillett Apr 2013 B2
8433470 Szybalski Apr 2013 B1
8433507 Hannah Apr 2013 B2
8437875 Hernandez May 2013 B2
8444369 Watt May 2013 B2
8447863 Francis, Jr. May 2013 B1
8452450 Dooley May 2013 B2
8474090 Jones Jul 2013 B2
8494908 Herwig Jul 2013 B2
8504202 Ichinose Aug 2013 B2
8508590 Laws Aug 2013 B2
8510033 Park Aug 2013 B2
8511606 Lutke Aug 2013 B1
8515580 Taylor Aug 2013 B2
8516651 Jones Aug 2013 B2
8538577 Bell Sep 2013 B2
8544858 Eberlein Oct 2013 B2
8571700 Keller Oct 2013 B2
8572712 Rice Oct 2013 B2
8577538 Lenser Nov 2013 B2
8587662 Moll Nov 2013 B1
8588969 Frazier Nov 2013 B2
8594834 Clark Nov 2013 B1
8606314 Barnes, Jr. Dec 2013 B2
8606392 Wurman Dec 2013 B2
8639382 Clark Jan 2014 B1
8645223 Ouimet Feb 2014 B2
8649557 Hyung Feb 2014 B2
8656550 Jones Feb 2014 B2
8670866 Ziegler Mar 2014 B2
8671507 Jones Mar 2014 B2
8676377 Siegel Mar 2014 B2
8676420 Kume Mar 2014 B2
8676480 Lynch Mar 2014 B2
8700230 Hannah Apr 2014 B1
8708285 Carreiro Apr 2014 B1
8718814 Clark May 2014 B1
8724282 Hiremath May 2014 B2
8732039 Chen May 2014 B1
8744626 Johnson Jun 2014 B2
8751042 Lee Jun 2014 B2
8763199 Jones Jul 2014 B2
8770976 Moser Jul 2014 B2
8775064 Zeng Jul 2014 B2
8798786 Wurman Aug 2014 B2
8798840 Fong Aug 2014 B2
8814039 Bishop Aug 2014 B2
8818556 Sanchez Aug 2014 B2
8820633 Bishop Sep 2014 B2
8825226 Worley, III Sep 2014 B1
8831984 Hoffman Sep 2014 B2
8838268 Friedman Sep 2014 B2
8843244 Phillips Sep 2014 B2
8851369 Bishop Oct 2014 B2
8882432 Bastian, II Nov 2014 B2
8886390 Wolfe Nov 2014 B2
8892240 Vliet Nov 2014 B1
8892241 Weiss Nov 2014 B2
8899903 Saad Dec 2014 B1
8918202 Kawano Dec 2014 B2
8918230 Chen Dec 2014 B2
8930044 Peeters Jan 2015 B1
8965561 Jacobus Feb 2015 B2
8972045 Mountz Mar 2015 B1
8972061 Rosenstein Mar 2015 B2
8983647 Dwarakanath Mar 2015 B1
8989053 Skaaksrud Mar 2015 B1
9002506 Agarwal Apr 2015 B1
9008827 Dwarakanath Apr 2015 B1
9008829 Worsley Apr 2015 B2
9014848 Farlow Apr 2015 B2
9075136 Joao Jul 2015 B1
9129277 MacIntosh Sep 2015 B2
9170117 Abuelsaad Oct 2015 B1
9173816 Reinhardt Nov 2015 B2
9190304 MacKnight Nov 2015 B2
9278839 Gilbride Mar 2016 B2
9305280 Berg Apr 2016 B1
9329597 Stoschek May 2016 B2
9495703 Kaye Nov 2016 B1
9534906 High Jan 2017 B2
9550577 Beckman Jan 2017 B1
9573684 Kimchi Feb 2017 B2
9578282 Sills Feb 2017 B1
9607285 Wellman Mar 2017 B1
9623923 Riedel Apr 2017 B2
9649766 Stubbs May 2017 B2
9656805 Evans May 2017 B1
9658622 Walton May 2017 B2
9663292 Brazeau May 2017 B1
9663293 Wurman May 2017 B2
9663295 Wurman May 2017 B1
9663296 Dingle May 2017 B1
9747480 McAllister Aug 2017 B2
9757002 Thompson Sep 2017 B2
9801517 High Oct 2017 B2
9827678 Gilbertson Nov 2017 B1
9875502 Kay Jan 2018 B2
9875503 High Jan 2018 B2
1589225 High Feb 2018 A1
1589415 High Feb 2018 A1
9896315 High Mar 2018 B2
9908760 High Mar 2018 B2
9994434 High Jun 2018 B2
10017322 High Jul 2018 B2
10071891 High Sep 2018 B2
10071893 High Sep 2018 B2
10081525 High Sep 2018 B2
20010042024 Rogers Nov 2001 A1
20020060542 Song May 2002 A1
20020095342 Feldman Jul 2002 A1
20020154974 Fukuda Oct 2002 A1
20020156551 Tackett Oct 2002 A1
20020165638 Bancroft Nov 2002 A1
20020165643 Bancroft Nov 2002 A1
20020165790 Bancroft Nov 2002 A1
20020174021 Chu Nov 2002 A1
20030028284 Chirnomas Feb 2003 A1
20030152679 Garwood Aug 2003 A1
20030170357 Garwood Sep 2003 A1
20030185948 Garwood Oct 2003 A1
20030222798 Floros Dec 2003 A1
20040068348 Jager Apr 2004 A1
20040081729 Garwood Apr 2004 A1
20040093650 Martins May 2004 A1
20040098167 Yi May 2004 A1
20040117063 Sabe Jun 2004 A1
20040146602 Garwood Jul 2004 A1
20040216339 Garberg Nov 2004 A1
20040217166 Myers Nov 2004 A1
20040221790 Sinclair Nov 2004 A1
20040225613 Narayanaswami Nov 2004 A1
20040249497 Saigh Dec 2004 A1
20050008463 Stehr Jan 2005 A1
20050047895 Lert Mar 2005 A1
20050072651 Wieth Apr 2005 A1
20050080520 Kline Apr 2005 A1
20050104547 Wang May 2005 A1
20050149414 Schrodt Jul 2005 A1
20050177446 Hoblit Aug 2005 A1
20050216126 Koselka Sep 2005 A1
20050222712 Orita Oct 2005 A1
20050230472 Chang Oct 2005 A1
20050238465 Razumov Oct 2005 A1
20060107067 Safal May 2006 A1
20060147087 Goncalves Jul 2006 A1
20060163350 Melton Jul 2006 A1
20060178777 Park Aug 2006 A1
20060206235 Shakes Sep 2006 A1
20060220809 Stigall Oct 2006 A1
20060221072 Se Oct 2006 A1
20060231301 Rose Oct 2006 A1
20060235570 Jung Oct 2006 A1
20060241827 Fukuchi Oct 2006 A1
20060244588 Hannah Nov 2006 A1
20060279421 French Dec 2006 A1
20060293810 Nakamoto Dec 2006 A1
20070005179 Mccrackin Jan 2007 A1
20070017855 Pippin Jan 2007 A1
20070045018 Carter Mar 2007 A1
20070061210 Chen Mar 2007 A1
20070085682 Murofushi Apr 2007 A1
20070125727 Winkler Jun 2007 A1
20070150368 Arora Jun 2007 A1
20070152057 Cato Jul 2007 A1
20070222679 Morris Sep 2007 A1
20070269299 Ross Nov 2007 A1
20070284442 Herskovitz Dec 2007 A1
20070288123 D'Andrea Dec 2007 A1
20070293978 Wurman Dec 2007 A1
20080011836 Adema Jan 2008 A1
20080031491 Ma Feb 2008 A1
20080041644 Tudek Feb 2008 A1
20080042836 Christopher Feb 2008 A1
20080075566 Benedict Mar 2008 A1
20080075568 Benedict Mar 2008 A1
20080075569 Benedict Mar 2008 A1
20080077511 Zimmerman Mar 2008 A1
20080105445 Dayton May 2008 A1
20080131255 Hessler Jun 2008 A1
20080140253 Brown Jun 2008 A1
20080154720 Gounares Jun 2008 A1
20080201227 Bakewell Aug 2008 A1
20080226129 Kundu Sep 2008 A1
20080267759 Morency Oct 2008 A1
20080281515 Ann Nov 2008 A1
20080281664 Campbell Nov 2008 A1
20080294288 Yamauchi Nov 2008 A1
20080306787 Hamilton Dec 2008 A1
20080308630 Bhogal Dec 2008 A1
20080314667 Hannah Dec 2008 A1
20090074545 Lert Mar 2009 A1
20090132250 Chiang May 2009 A1
20090134572 Obuchi May 2009 A1
20090138375 Schwartz May 2009 A1
20090154708 Kolar Sunder Jun 2009 A1
20090155033 Olsen Jun 2009 A1
20090164902 Cohen Jun 2009 A1
20090210536 Allen Aug 2009 A1
20090240571 Bonner Sep 2009 A1
20090259571 Ebling Oct 2009 A1
20090265193 Collins Oct 2009 A1
20090269173 De Leo Oct 2009 A1
20090299822 Harari Dec 2009 A1
20090319399 Resta Dec 2009 A1
20100025964 Fisk Feb 2010 A1
20100030417 Fang Feb 2010 A1
20100076959 Ramani Mar 2010 A1
20100138281 Zhang Jun 2010 A1
20100143089 Hvass Jun 2010 A1
20100171826 Hamilton Jul 2010 A1
20100176922 Schwab Jul 2010 A1
20100211441 Sprigg Aug 2010 A1
20100222925 Anezaki Sep 2010 A1
20100268697 Karlsson Oct 2010 A1
20100295847 Titus Nov 2010 A1
20100299065 Mays Nov 2010 A1
20100302102 Desai Dec 2010 A1
20100324773 Choi Dec 2010 A1
20110010023 Kunzig Jan 2011 A1
20110022201 Reumerman Jan 2011 A1
20110098920 Chuang Apr 2011 A1
20110153081 Romanov Jun 2011 A1
20110163160 Zini Jul 2011 A1
20110176803 Song Jul 2011 A1
20110225071 Sano Sep 2011 A1
20110240777 Johns Oct 2011 A1
20110258060 Sweeney Oct 2011 A1
20110260865 Bergman Oct 2011 A1
20110279252 Carter Nov 2011 A1
20110288684 Farlow Nov 2011 A1
20110288763 Hui Nov 2011 A1
20110295424 Johnson Dec 2011 A1
20110301757 Jones Dec 2011 A1
20110320034 Dearlove Dec 2011 A1
20110320322 Roslak Dec 2011 A1
20120000024 Layton Jan 2012 A1
20120029697 Ota Feb 2012 A1
20120035823 Carter Feb 2012 A1
20120046998 Staib Feb 2012 A1
20120059743 Rao Mar 2012 A1
20120072303 Brown Mar 2012 A1
20120134771 Larson May 2012 A1
20120143726 Chirnomas Jun 2012 A1
20120192260 Kontsevich Jul 2012 A1
20120226556 Itagaki Sep 2012 A1
20120239224 McCabe Sep 2012 A1
20120255810 Yang Oct 2012 A1
20120259732 Sasankan Oct 2012 A1
20120272500 Reuteler Nov 2012 A1
20120294698 Villamar Nov 2012 A1
20120303263 Alam Nov 2012 A1
20120303479 Derks Nov 2012 A1
20120330458 Weiss Dec 2012 A1
20130016011 Harriman Jan 2013 A1
20130026224 Ganick Jan 2013 A1
20130051667 Deng Feb 2013 A1
20130054052 Waltz Feb 2013 A1
20130054280 Moshfeghi Feb 2013 A1
20130060461 Wong Mar 2013 A1
20130073405 Ariyibi Mar 2013 A1
20130096735 Byford Apr 2013 A1
20130103539 Abraham Apr 2013 A1
20130105036 Smith May 2013 A1
20130110671 Gray May 2013 A1
20130141555 Ganick Jun 2013 A1
20130145572 Schregardus Jun 2013 A1
20130151335 Avadhanam Jun 2013 A1
20130155058 Golparvar-Fard Jun 2013 A1
20130174371 Jones Jul 2013 A1
20130181370 Rafie Jul 2013 A1
20130211953 Abraham Aug 2013 A1
20130218453 Geelen Aug 2013 A1
20130235206 Smith Sep 2013 A1
20130238130 Dorschel Sep 2013 A1
20130276004 Boncyk Oct 2013 A1
20130300729 Grimaud Nov 2013 A1
20130302132 D'Andrea Nov 2013 A1
20130309637 Minvielle Nov 2013 A1
20130317642 Asaria Nov 2013 A1
20130333961 Odonnell Dec 2013 A1
20130338825 Cantor Dec 2013 A1
20140006229 Birch Jan 2014 A1
20140014470 Razumov Jan 2014 A1
20140032034 Raptopoulos Jan 2014 A1
20140032379 Schuetz Jan 2014 A1
20140037404 Hancock Feb 2014 A1
20140046512 Villamar Feb 2014 A1
20140058556 Kawano Feb 2014 A1
20140067564 Yuan Mar 2014 A1
20140081445 Villamar Mar 2014 A1
20140091013 Streufert Apr 2014 A1
20140100715 Mountz Apr 2014 A1
20140100768 Kessens Apr 2014 A1
20140100769 Wurman Apr 2014 A1
20140100998 Mountz Apr 2014 A1
20140100999 Mountz Apr 2014 A1
20140101690 Boncyk Apr 2014 A1
20140108087 Fukui Apr 2014 A1
20140124004 Rosenstein May 2014 A1
20140129054 Huntzicker May 2014 A1
20140133943 Razumov May 2014 A1
20140135984 Hirata May 2014 A1
20140143039 Branton May 2014 A1
20140149958 Samadi May 2014 A1
20140152507 McAllister Jun 2014 A1
20140156450 Ruckart Jun 2014 A1
20140156461 Lerner Jun 2014 A1
20140157156 Kawamoto Jun 2014 A1
20140164123 Wissner-Gross Jun 2014 A1
20140172197 Ganz Jun 2014 A1
20140172727 Abhyanker Jun 2014 A1
20140177907 Argue Jun 2014 A1
20140177924 Argue Jun 2014 A1
20140180478 Letsky Jun 2014 A1
20140180528 Argue Jun 2014 A1
20140180865 Argue Jun 2014 A1
20140180914 Abhyanker Jun 2014 A1
20140201041 Meyer Jul 2014 A1
20140207614 Ramaswamy Jul 2014 A1
20140209514 Gitschel Jul 2014 A1
20140211988 Fan Jul 2014 A1
20140214205 Kwon Jul 2014 A1
20140217242 Muren Aug 2014 A1
20140228999 D'Andrea Aug 2014 A1
20140229320 Mohammed Aug 2014 A1
20140244026 Neiser Aug 2014 A1
20140244207 Hicks Aug 2014 A1
20140246257 Jacobsen Sep 2014 A1
20140247116 Davidson Sep 2014 A1
20140250613 Jones Sep 2014 A1
20140254896 Zhou Sep 2014 A1
20140257928 Chen Sep 2014 A1
20140266616 Jones Sep 2014 A1
20140274309 Nguyen Sep 2014 A1
20140277693 Naylor Sep 2014 A1
20140277742 Wells Sep 2014 A1
20140277841 Klicpera Sep 2014 A1
20140285134 Kim Sep 2014 A1
20140289009 Campbell Sep 2014 A1
20140297090 Ichinose Oct 2014 A1
20140304107 McAllister Oct 2014 A1
20140306654 Partovi Oct 2014 A1
20140309809 Dixon Oct 2014 A1
20140330456 LopezMorales et al. Nov 2014 A1
20140330677 Boncyk Nov 2014 A1
20140344011 Dogin Nov 2014 A1
20140344118 Parpia Nov 2014 A1
20140350725 LaFary Nov 2014 A1
20140350851 Carter Nov 2014 A1
20140350855 Vishnuvajhala Nov 2014 A1
20140361077 Davidson Dec 2014 A1
20140369558 Holz Dec 2014 A1
20140371912 Passot Dec 2014 A1
20140379588 Gates Dec 2014 A1
20150006319 Thomas Jan 2015 A1
20150029339 Kobres Jan 2015 A1
20150032252 Galluzzo Jan 2015 A1
20150045992 Ashby Feb 2015 A1
20150046299 Yan Feb 2015 A1
20150066283 Wurman Mar 2015 A1
20150073589 Khodl Mar 2015 A1
20150098775 Razumov Apr 2015 A1
20150100439 Lu Apr 2015 A1
20150100461 Baryakar Apr 2015 A1
20150112826 Crutchfield Apr 2015 A1
20150120094 Kimchi Apr 2015 A1
20150123973 Larsen May 2015 A1
20150142249 Ooga May 2015 A1
20150203140 Holtan Jul 2015 A1
20150205298 Stoschek Jul 2015 A1
20150205300 Caver Jul 2015 A1
20150217449 Meier Aug 2015 A1
20150217790 Golden Aug 2015 A1
20150221854 Melz Aug 2015 A1
20150228004 Bednarek Aug 2015 A1
20150229906 Inacio De Matos Aug 2015 A1
20150231873 Okamoto Aug 2015 A1
20150277440 Kimchi Oct 2015 A1
20150278889 Qian Oct 2015 A1
20150325128 Lord Nov 2015 A1
20150336668 Pasko Nov 2015 A1
20150360865 Massey Dec 2015 A1
20160016731 Razumov Jan 2016 A1
20160023675 Hannah Jan 2016 A1
20160052139 Hyde Feb 2016 A1
20160101794 Fowler Apr 2016 A1
20160101936 Chamberlin Apr 2016 A1
20160110701 Herring Apr 2016 A1
20160114488 Mascorro Medina Apr 2016 A1
20160167557 Mecklinger Jun 2016 A1
20160167577 Simmons Jun 2016 A1
20160176638 Toebes Jun 2016 A1
20160196755 Navot Jul 2016 A1
20160207193 Wise Jul 2016 A1
20160210602 Siddique Jul 2016 A1
20160236867 Brazeau Aug 2016 A1
20160255969 High Sep 2016 A1
20160257212 Thompson Sep 2016 A1
20160257240 High Sep 2016 A1
20160257401 Buchmueller Sep 2016 A1
20160258762 Taylor Sep 2016 A1
20160258763 High Sep 2016 A1
20160259028 High Sep 2016 A1
20160259329 High Sep 2016 A1
20160259331 Thompson Sep 2016 A1
20160259339 High Sep 2016 A1
20160259340 Kay Sep 2016 A1
20160259341 High Sep 2016 A1
20160259342 High Sep 2016 A1
20160259343 High Sep 2016 A1
20160259344 High Sep 2016 A1
20160259345 McHale Sep 2016 A1
20160259346 High Sep 2016 A1
20160260049 High Sep 2016 A1
20160260054 High Sep 2016 A1
20160260055 High Sep 2016 A1
20160260142 Winkle Sep 2016 A1
20160260145 High Sep 2016 A1
20160260148 High Sep 2016 A1
20160260158 High Sep 2016 A1
20160260159 Atchley Sep 2016 A1
20160260161 Atchley Sep 2016 A1
20160261698 Thompson Sep 2016 A1
20160274586 Stubbs Sep 2016 A1
20160288601 Gehrke Oct 2016 A1
20160300291 Carmeli Oct 2016 A1
20160301698 Katara Oct 2016 A1
20160325932 Hognaland Nov 2016 A1
20160349754 Mohr Dec 2016 A1
20160355337 Lert Dec 2016 A1
20160364785 Wankhede Dec 2016 A1
20160364786 Wankhede Dec 2016 A1
20170009417 High Jan 2017 A1
20170010608 High Jan 2017 A1
20170010609 High Jan 2017 A1
20170010610 Atchley Jan 2017 A1
20170020354 High Jan 2017 A1
20170024806 High Jan 2017 A1
20170080846 Lord Mar 2017 A1
20170107055 Magens Apr 2017 A1
20170110017 Kimchi Apr 2017 A1
20170120443 Kang May 2017 A1
20170129602 Alduaiji May 2017 A1
20170137235 Thompson May 2017 A1
20170148075 High May 2017 A1
20170158430 Raizer Jun 2017 A1
20170166399 Stubbs Jun 2017 A1
20170176986 High Jun 2017 A1
20170178066 High Jun 2017 A1
20170178082 High Jun 2017 A1
20170183159 Weiss Jun 2017 A1
20170283171 High Oct 2017 A1
20170355081 Fisher Dec 2017 A1
20180020896 High Jan 2018 A1
20180068357 High Mar 2018 A1
20180075403 Mascorro Medina Mar 2018 A1
20180099846 High Apr 2018 A1
20180170729 High Jun 2018 A1
20180170730 High Jun 2018 A1
Foreign Referenced Citations (94)
Number Date Country
2524037 May 2006 CA
2625885 Apr 2007 CA
100999277 Jul 2007 CN
102079433 Jun 2011 CN
202847767 Apr 2013 CN
103136923 May 2013 CN
103213115 Jul 2013 CN
203166399 Aug 2013 CN
203191819 Sep 2013 CN
203401274 Jan 2014 CN
203402565 Jan 2014 CN
103625808 Mar 2014 CN
203468521 Mar 2014 CN
103696393 Apr 2014 CN
103723403 Apr 2014 CN
203512491 Apr 2014 CN
103770117 May 2014 CN
203782622 Aug 2014 CN
104102188 Oct 2014 CN
104102219 Oct 2014 CN
102393739 Dec 2014 CN
204054062 Dec 2014 CN
204309852 Dec 2014 CN
204331404 May 2015 CN
105460051 Apr 2016 CN
102013013438 Feb 2015 DE
861415 May 1997 EP
1136052 Sep 2001 EP
0887491 Apr 2004 EP
1439039 Jul 2004 EP
1447726 Aug 2004 EP
2148169 Jan 2010 EP
2106886 Mar 2011 EP
2309487 Apr 2011 EP
2050544 Aug 2011 EP
2498158 Sep 2012 EP
2571660 Mar 2013 EP
2590041 May 2013 EP
2608163 Jun 2013 EP
2662831 Nov 2013 EP
2730377 May 2014 EP
2886020 Jun 2015 EP
2710330 Mar 1995 FR
1382806 Feb 1971 GB
2530626 Mar 2016 GB
2542472 Mar 2017 GB
2542905 May 2017 GB
62247458 Oct 1987 JP
10129996 May 1998 JP
2003288396 Oct 2003 JP
2005350222 Dec 2005 JP
2009284944 Dec 2009 JP
2010105644 May 2010 JP
2010231470 Oct 2010 JP
20120100505 Sep 2012 KR
8503277 Aug 1985 WO
9603305 Jul 1995 WO
1997018523 May 1997 WO
9855903 Dec 1998 WO
2000061438 Oct 2000 WO
0132366 May 2001 WO
2004092858 Oct 2004 WO
2005102875 Nov 2005 WO
2006056614 Jun 2006 WO
2006120636 Nov 2006 WO
2006137072 Dec 2006 WO
2007007354 Jan 2007 WO
2007047514 Apr 2007 WO
2007149196 Dec 2007 WO
2008118906 Oct 2008 WO
2008144638 Nov 2008 WO
2008151345 Dec 2008 WO
2009022859 Feb 2009 WO
2009027835 Mar 2009 WO
2009103008 Aug 2009 WO
2011063527 Jun 2011 WO
2012075196 Jun 2012 WO
2013138193 Sep 2013 WO
2013138333 Sep 2013 WO
2013176762 Nov 2013 WO
2014022366 Feb 2014 WO
2014022496 Feb 2014 WO
2014045225 Mar 2014 WO
2014046757 Mar 2014 WO
2014101714 Jul 2014 WO
2014116947 Jul 2014 WO
2014138472 Sep 2014 WO
2014165286 Oct 2014 WO
2015021958 Feb 2015 WO
2015104263 Jul 2015 WO
2015155556 Oct 2015 WO
2016009423 Jan 2016 WO
2016015000 Jan 2016 WO
2016144765 Sep 2016 WO
Non-Patent Literature Citations (188)
Entry
KR20120100505A-preview.pdf—Translation of KR 2012-0100505 A obtained from IP.com on Jun. 7, 2017.
U.S. Appl. No. 15/060,953, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/061,025, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/061,054, filed Mar. 4, 2016, Kay.
U.S. Appl. No. 15/061,203, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/061,265, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/061,285, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/061,325, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/061,350, filed Mar. 4, 2016, Thompson.
U.S. Appl. No. 15/061,402, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/061,406, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/061,474, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/061,671, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/061,677, filed Mar. 4, 2016, Taylor.
U.S. Appl. No. 15/061,686, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/061,688, filed Mar. 4, 2016, Thompson.
U.S. Appl. No. 15/061,722, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/061,770, filed Mar. 4, 2016, Atchley.
U.S. Appl. No. 15/061,792, filed Mar. 4, 2016, Winkle.
U.S. Appl. No. 15/061,801, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/061,805, filed Mar. 4, 2016, Atchley.
U.S. Appl. No. 15/061,844, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/061,848, filed Mar. 4, 2016, McHale.
U.S. Appl. No. 15/061,908, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/061,980, filed Mar. 4, 2016, Thompson.
Abbrobotics; “ABB Robotics—Innovative Packaging Solutions”, https://www.youtube.com/watch?v=e5jif-IUvHY, published on May 16, 2013, pp. 1-5.
Ang, Fitzwatler, et al.; “Automated Waste Sorter With Mobile Robot Delivery Waste System”, De La Salle University Research Congress 2013, Mar. 7-9, 2013, pp. 1-7.
Ansari, Sameer, et al.; “Automated Trash Collection & Removal in Office Cubicle Environments”, Squad Collaborative Robots, Sep. 27, 2013, pp. 1-23.
Armstrong, Jean, et al.; “Visible Light Positioning: A Roadmap for International Standardization”, IEEE Communications Magazine, Dec. 2013, pp. 2-7.
Artal, J.S., et al.; “Autonomous Mobile Robot with Hybrid PEM Fuel-Cell and Ultracapacitors Energy System, Dedalo 2.0”, International Conference on Renewable Energies and Power Quality, Santiago de Compostela, Spain, Mar. 28-30, 2012, pp. 1-6.
Atherton, Kelsey D.; “New GPS Receiver Offers Navigation Accurate to an Inch”, Popular Science, www.popsci.com/technology/article/2013-08/global-positioning-down-inches, Aug. 16, 2013, pp. 1-2.
Avezbadalov, Ariel, et al.; “Snow Shoveling Robot”, engineering.nyu.edu/mechatronics/projects/ME3484/2006/Snow Shoveling Robot/Mechatronics Snow Robot Presentation Update 12-19-09.pdf, 2006, pp. 1-24.
Bares, John, et al.; “Designing Crash-Survivable Unmanned Vehicles”, AUVSI Symposium, Jul. 10, 2002, pp. 1-15.
Bohren; Jonathan et al.; “Towards Autonomous Robotic Butlers: Lessons Learned with the PR2”, Willow Garage, pp. 1-8.
Bouchard, Samuel; “A Robot to Clean Your Trash Bin!”, Robotiq, http://blog.robotiq.com/bid/41203/A-Robot-to-Clean-your-Trash-Bin, Aug. 22, 2011, pp. 1-7.
Burns, Tom; “irobot roomba 780 review best robot vacuum floor cleaning robot review video demo”, https://www.youtube.com/watch?v=MkwtIyVAaEY, published on Feb. 13, 2013, pp. 1-10.
Bytelight; “Scalable Indoor Location”, http://www.bytelight.com/, Dec. 12, 2014, pp. 1-2.
Canadian Manufacturing; “Amazon unleashes army of order-picking robots”, http://www.canadianmanufacturing.com/supply-chain/amazon-unleashes-army-order-picking-robots-142902/, Dec. 2, 2014, pp. 1-4.
Capel, Claudine; “Waste sorting—A look at the separation and sorting techniques in today's European market”, Waste Management World, http://waste-management-world.com/a/waste-sorting-a-look-at-the-separation-and-sorting-techniques-in-todayrsquos-european-market, Jul. 1, 2008, pp. 1-8.
Carnegie Mellon University; “AndyVision—The Future of Retail”, https:// www.youtube.com/watch?v=n5309ILTV2s, published on Jul. 16, 2012, pp. 1-9.
Carnegie Mellon University; “Robots in Retail”, www.cmu.edu/homepage/computing/2012/summer/robots-in-retail.shmtl, 2012, pp. 1.
Chopade, Jayesh, et al.; “Control of Spy Robot by Voice and Computer Commands”, International Journal of Advanced Research in Computer and Communication Engineering, vol. 2, Issue 4, Apr. 2013, pp. 1-3.
CNET; “iRobot Braava 380t—No standing ovation for this robotic floor mop”, https://www.youtube.com/watch?v=JAtCIxFtC6Q, published on May 7, 2014, pp. 1-6.
Coltin, Brian & Ventura, Rodrigo; “Dynamic User Task Scheduling for Mobile Robots”, Association for the Advancement of Artificial Intelligence, 2011, pp. 1-6.
Couceiro, Micael S., et al.; “Marsupial teams of robots: deployment of miniature robots for swarm exploration under communication constraints”, Robotica, Cambridge University Press, downloaded Jan. 14, 2014, pp. 1-22.
Coxworth, Ben; “Robot designed to sort trash for recycling”, Gizmag, http://www.gizmag.com/robot-sorts-trash-for-recycling/18426/, Apr. 18, 2011, pp. 1-7.
Davis, Jo; “The Future of Retail: In Store Now”, Online Brands, http://onlinebrands.co.nz/587/future-retail-store-now/, Nov. 16, 2014, pp. 1-5.
Denso; “X-mobility”, pp. 1.
DHL; “Self-Driving Vehicles in Logistics: A DHL perspective on implications and use cases for the logistics industry”, 2014, pp. 1-39.
Dorrier, Jason; “Service Robots Will Now Assist Customers at Lowe's Store”, SingularityHUB, http://singularityhub.com/2014/10/29/service-robots-will-now-assist-customers-at-lowes-store/, Oct. 29, 2014, pp. 1-4.
Dronewatch; “Weatherproof Drone XAircraft Has ‘Black Box’”, DroneWatch, http://www.dronewatch.nl/2015/02/13/weatherproof-drone-van-xaircraft-beschikt-over-zwarte-doos/, Feb. 13, 2015, pp. 1-5.
Dyson US; “See the new Dyson 360 Eye robot vacuum cleaner in action #DysonRobot”, https://www.youtube.com/watch?v=OadhulCDAjk, published on Sep. 4, 2014, pp. 1-7.
Edwards, Lin; “Supermarket robot to help the elderly (w/Video)”, Phys.Org, http://phys.org/news/2009-12-supermarket-robot-elderly-video.html, Dec. 17, 2009, pp. 1-5.
Elfes, Alberto; “Using Occupancy Grids for Mobile Robot Perception and Navigation”, IEEE, 1989, pp. 46-57.
Elkins, Herschel T.; “Important 2014 New Consumer Laws”, County of Los Angeles Department of Consumer Affairs Community Outreach & Education, updated Jan. 6, 2014, pp. 1-46.
Falconer, Jason; “HOSPI-R drug delivery robot frees nurses to do more important work”, Gizmag, http://www.gizmag.com/panasonic-hospi-r-delivery-robot/29565/, Oct. 28, 2013, pp. 1-6.
Falconer, Jason; “Toyota unveils helpful Human Support Robot”, Gizmag, http:/www.gizmag.com/toyota-human-support-robot/24246/, Sep. 22, 2012, pp. 1-6.
Farivar, Cyrus; “This in-store robot can show you the hammer aisle, but not the bathroom”, Ars Technica, http://arstechnica.com/business/2014/12/this-in-store-robot-can-show-you-the-hammer-aisle-but-not-the-bathroom/, Dec. 3, 2014, pp. 1-4.
Fellow Robots; “Meet OSHBOT”, http://fellowrobots.com/oshbot/, pp. 1-3.
Fellowrobots; “Oshbot Progress—Fellow Robots”, https://vimeo.com/139532370, published Sep. 16, 2015, pp. 1-5.
Fora.tv; “A Day in the Life of a Kiva Robot”, https://www.youtube.com/watch?v=6KRjuuEVEZs, published on May 11, 2011, pp. 1-11.
Gamma2Video; “FridayBeerBot.wmv”, https://www.youtube.com/watch?v=KXXIIDYatxQ, published on Apr. 27, 2010, pp. 1-7.
Glas, Dylan F., et al.; “The Network Robot System: Enabling Social Human-Robot Interaction in Public Spaces”, Journal of Human-Robot Interaction, vol. 1, No. 2, 2012, pp. 5-32.
Green, A., et al; “Report on evaluation of the robot trolley”, CommRob IST-045441, Advanced Behaviour and High-Level Multimodal Communications with and among Robots, pp. 10-67.
Gross, H.-M., et al.; TOOMAS: Interactive Shopping Guide Robots in Everyday Use—Final Implementation and Experiences from Long-term Field Trials, Proc. IEEE/RJS Intern. Conf. on Intelligent Robots and Systems (IROS'09), St. Louis, USA, pp. 2005-2012.
Habib, Maki K., “Real Time Mapping and Dynamic Navigation for Mobile Robots”, International Journal of Advanced Robotic Systems, vol. 4, No. 3, 2007, pp. 323-338.
HRJ3 Productions; “Japanese Automatic Golf Cart”, https://www.youtube.com/watch?v=8diWYtqb6C0, published on Mar. 29, 2014, pp. 1-4.
Huang, Edward Y.C.; “A Semi-Autonomous Vision-Based Navigation System for a Mobile Robotic Vehicle”, Thesis submitted to the Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science on May 21, 2003, pp. 1-76.
IEEE Spectrum; “Warehouse Robots at Work”, https://www.youtube.com/watch?v=IWsMdN7HMuA, published on Jul. 21, 2008, pp. 1-11.
Intelligent Autonomous Systems; “TUM James goes shopping”, https://www.youtube.com/watch?v=JS2zycc4AUE, published on May 23, 2011, pp. 1-13.
Katic, M., Dusko; “Cooperative Multi Robot Systems for Contemporary Shopping Malls”, Robotics Laboratory, Mihailo Pupin Institute, University of Belgrade, Dec. 30, 2010, pp. 10-17.
Kehoe, Ben, et al.; “Cloud-Based Robot Grasping with the Google Object Recognition Engine”, 2013, pp. 1-7.
Kendricks, Cooper; “Trash Disposal Robot”, https://prezi.com31acae05zf8i/trash-disposal-robot/, Jan. 9, 2015, pp. 1-7.
Kibria, Shafkat, “Speech Recognition for Robotic Control”, Master's Thesis in Computing Science, Umea University, Dec. 18, 2005, pp. 1-77.
King, Rachael; “Newest Workers for Lowe's: Robots”, The Wall Street Journal, http:/www.wsj.com/articles/newest-workers-for-lowes-robots-1414468866, Oct. 28, 2014, pp. 1-4.
Kitamura, Shunichi; “Super Golf Cart with Remote drive and NAVI system in Japan”, https://www.youtube.com/watch?v-2_3-dUR12F8, published on Oct. 4, 2009, pp. 1-6.
Kiva Systems; “Automated Goods-to-Man Order Picking System—Kiva Systems”, http://www.kivasystems.com/solutions/picking/, printed on Apr. 2, 2015, pp. 1-2.
Kiva Systems; “Frequently Asked Questions about Kiva Systems—Kiva Systems”, http://kivasystems.com/about-us-the-kiva-approach/faq/, printed on Apr. 2, 2015, pp. 1-2.
Kiva Systems; “how a Kiva system makes use of the vertical space—Kiva Systems”, http://www.kivasystems.com/solutions/vertical-storage/, printed on Apr. 2, 2015, pp. 1-2.
Kiva Systems; “How Kiva Systems and Warehouse Management Systems Interact”, 2010, pp. 1-12.
Kiva Systems; “Kiva's warehouse automation system is the most powerful and flexible A . . . ”, http://www.kivasystems.com/solutions/, printed on Apr. 2, 2015, pp. 1-2.
Kiva Systems; “Kiva replenishment is more productive and accurate than replenishing pick faces in traditional distribution operations”, http//www.kivasystems.com/solutions/replenishment/, printed on Apr. 2, 2015, pp. 1-2.
Kiva Systems; “Kiva warehouse control software, Kiva WCS—Kiva Systems”, http://www.kivasystems.com/solutions/software/, printed on Apr. 2, 2015, pp. 1-2.
Kiva Systems; “Shipping Sortation—Kiva Systems”, http://www.kivasystems.com/solutions/shipping-sortation/, printed on Apr. 2, 2015, pp. 1-2.
Kohtsuka, Takafumi, et al.; “Design of a Control System for Robot Shopping Carts”, Knowledge-Based and Intelligent Information and Engineering Systems, 15th International Conference, KES 2011, Kaiserslautern, Germany, Sep. 12-14, 2011, pp. 280-288.
Koubaa, Anis; “A Service-Oriented Architecture for Virtualizing Robots in Robot-as-a-Service Clouds”, pp. 1-13.
Kumar, Swagat; “Robotics-as-a-Service: Transforming the Future of Retail”, Tata Consultancy Services, http://www.tcs.com/resources/white_papers/Pages/Robolics-as-Service.aspx, printed on May 13, 2015, pp. 1-4.
Kumar Paradkar, Prashant; “Voice Controlled Robotic Project using interfacing of Andruino and Bluetooth HC-05”, Robotics_Projects_C/C++_Android.
Lejepekov, Fedor; “Yuki-taro. Snow recycle robot.”, https://www.youtube.com/watch?v=gl2j9PY4jGY, published on Jan. 17, 2011, pp. 1-4.
Liu, Xiaohan, et al.; “Design of an Indoor Self-Positioning System for the Visually Impaired—Simulation with RFID and Bluetooth in a Visible Light Communication System”, Proceedings of the 29th Annual International Conference of the IEEE EMBS, Cite Internationale, Lyon, France, Aug. 23-26, 2007, pp. 1655-1658.
Lowe's Home Improvement; “OSHbots from Lowe's Innovation Labs”, https://www.youtube.com/watch?v=W-RKAjP1dtA, published on Dec. 15, 2014, pp. 1-8.
Lowe's Innovation Labs; “Autonomous Retail Service Robots”, http://www.lowesinnovationlabs.com/innovation-robots/, printed on Feb. 26, 2015, pp. 1-4.
Matos, Luis; “wi-Go—The autonomous and self-driven shopping cart”; https://www.indiegogo.com/projects/wi-go-the-autonomous-and-self-driven-shopping-cart; printed on Feb. 27, 2015, pp. 1-16.
Meena, M., & Thilagavahi, P.; “Automatic Docking System with Recharging and Battery Replacement for Surveillance Robot”, International Journal of Electronics and Computer Science Engineering, pp. 1148-1154.
Murph, Darren; “B.O.S.S. shopping cart follows you around”, Engadget, http://www.engadget.com/2006/08/11/b-o-s-s-shopping-cart-follows-you-around/, Aug. 11, 2006, pp. 1-4.
Nakajima, Madoka & Haruyama, Shinichiro; “New indoor navigation system for visually impaired people using visible light communication”, EURASIP Journal on Wireless Communications and Networking, 2013, pp. 1-10.
Neurobtv; “Shopping Robot TOOMAS 2009”, https://www.youtube.com/watch?v=49Pkm30qmQU, published on May 8, 2010, pp. 1-7.
Nickerson, S.B., et al.; “An autonomous mobile robot for known industrial environments”, Autonomous Robot for a Known environment, Aug. 28, 1997, pp. 1-28.
O'Donnell, Jake; “Meet the Bluetooth-Connected Self-Following Robo-Caddy of the Future”, Sportsgrid; http://www.sportsgrid.com/uncategorized/meet-the-bluetooth-connected-self-following-robo-caddy-of-the-future/, Apr. 22, 2014, pp. 1-5.
Ogawa, Keisuke; “Denso Demos In-wheel Motor System for Baby Carriages, Shopping Carts”, Nikkei Technology, http://techon.nikkeiibp.co.jp/english/NEWS_EN/20141010/381880/?ST=english_PRINT, Oct. 10, 2014, pp. 1-2.
Orchard Supply Hardware; “Orchard Supply Hardware's OSHbot”, https://www.youtube.com/watch?v=Sp9176vm7Co, published on Oct. 28, 2014, pp. 1-9.
Osborne, Charlie; “Smart Cart Follows You When Grocery Shopping”, Smartplanet, http://www.smartplanet.com/blog/smart-takes/smart-cart-follows-you-when-grocery-shopping/, Feb. 29, 2012, pp. 1-4.
Poudel, Dev Bahadur; “Coordinating Hundreds of Cooperative, Autonomous Robots in a Warehouse”, Jan. 27, 2013, pp. 1-13.
Robotlab Inc.; “NAO robot drives autonomously it's own car”, https://www.youtube.com/watch?v=oBHYwYlo1UE, published on Sep. 8, 2014, pp. 1-6.
Rodriguez, Ashley; “Meet Lowe's Newest Sales Associate—OSHbot, the Robot”, Advertising Age, http://adage.com/article/cmo-strategy/meet-lowe-s-newest-sales-associate-oshbot-robot/295591/, Oct. 28, 2014, pp. 1-8.
Sebaali, G., et al.; “Smart Shopping Cart”, Department of Electrical and Computer Engineering, American University of Beirut, pp. 1-6.
Shukla, Neha; “SaviOne the Butler Bot: Service Robot for Hospitality Industry”, TechieTonics, http://www.techietonics.com/robo-tonics/savione-the-butler-bot-service-for-hospitality-industry.html, pp. 1-5.
Song, Guangming, et al.; “Automatic Docking System for Recharging Home Surveillance Robots”, http://www.academia.edu/6495007/Automatic_Docking_System_for_Recharging_Home_Surveillance_Robots, IEEE Transactions on Consumer Electronics, vol. 57, No. 2, May 2011, pp. 1-8.
Soper, Taylor; “Amazon vet's new robot-powered apparel startup aims to revolutionize how we buy clothes”, GeekWire, http://www.geekwire.com/2012/hointer-robot-jeans-clothing-apparel-store-startup/, Nov. 29, 2012, pp. 1-12.
Stewart Golf; “Introducing the New Stewart Golf X9 Follow”, https://www.youtube.com/watch?v=HHivFGtiuE, published on Apr. 9, 2014, pp. 1-9.
Sun, Eric; “Smart Bin & Trash Route” system—RMIT 2012 Green Inventors Competition, http://www.youtube.com/watch?v=OrTA57a1O0k, published on Nov. 14, 2012, pp. 1-8.
Superdroid Robots; “Cool Robots, Making Life Easier”, http://www.superdroidrobots.com/shop/custom.aspx/cool-robots-making-life-easier/83/, printed on Jun. 16, 2015, pp. 1-7.
Swisslog; “RoboCourier Autonomous Mobile Robot”, http://www.swisslog.com/en/Products/HCS/Automated-Material-Transport/RoboCourier-Autonomous-Mobile-Robot, pp. 1.
Tam, Donna; “Meet Amazon's busiest employee—the Kiva robot”, CNET, http://www.cnet.com/news/meet-amazons-busiest-employee-the-kiva-robot/, Nov. 30, 2014, pp. 1-6.
Universal Robotics; “Neocortex Enables Random Part Handling and Automated Assembly”, http://www.universalrobotics.com/random-bin-picking, printed on Dec. 22, 2015, pp. 1-3.
Uphigh Productions; “Behold the Future (E017 Robot Sales Assistant)”, https://www.youtube.com/watch?v=8WbvjaPm7d4, published on Nov. 19, 2014, pp. 1-7.
Urankar, Sandeep, et al.; “Robo-Sloth: A Rope-Climbing Robot”, Department of Mechanical Engineering, Indian Institute of Technology, 2003, pp. 1-10.
Vasilescu, Iuliu, et al.; “Autonomous Modular Optical Underwater Robot (AMOUR) Design, Prototype and Feasibility Study”, pp. 1-7.
Vmecavacuumtech; “VMECA Magic Suction Cup with ABB robot for pick and place (packaging application)”, https://www.youtube.com/watch?v=5btR9MLtGJA, published on Sep. 14, 2014, pp. 1-4.
Wang, Xuan; “2D Mapping Solutions for Low Cost Mobile Robot”, Master's Thesis in Computer Science, Royal Institute of Technology, KTH CSC, Stockholm, Sweden, 2013, pp. 1-60.
Webb, Mick; “Robovie II—the personal robotic shopping”, Gizmag, http://www.gizmag.com/robovie-ii-robotic-shopping-assistance/13664/, Dec. 23, 2009, pp. 1-5.
Weise, Elizabeth; “15,000 robots usher in Amazon's Cyber Monday”, USATODAY, http://www.usatoday.com/story/tech/2014/12/01/robots-amazon.kiva-fulfillment-centers-cyber-monday/19725229/, Dec. 2, 2014, pp. 1-3.
Weiss, C.C.; “Multifunctional hybrid robot shovels snow and mows your lawn”, Gizmag, http://www.gizmag.com/snowbyte-snow-shoveling-robot/32961/, Jul. 21, 2014, pp. 1-7.
Wikipedia; “Kiva Systems”, http://en.wikipedia.org/wiki/Kiva_Systems, printed on Apr. 2, 2015, pp. 1-3.
Wired; “High-Speed Robots Part 1: Meet BettyBot in “Human Exclusion Zone” Warehouses—The Window-Wired”, https://www.youtube.com/watch?v=8gy5tYVR-28, published on Jul. 2, 2013, pp. 1-6.
Wulf, O., et al.; “Colored 2D maps for robot navigation with 3D sensor data,” Institute for Systems Engineering, University of Hannover, Hannover, Germany, 2014, pp. 1-6.
YRF; “The Diamond Robbery—Scene Dhoom:2 Hrithik Roshan”, https://www.youtube.com/watch?v=3bMYgo_S0Kc, published on Jul. 12, 2012, pp. 1-7.
Kohtsuka, T. et al.; “Design of a Control System for Robot Shopping Carts”; KES'11 Proceedings of the 15th International Conference on Knowledge-Based and Intelligent Information and Engineering Systems; Sep. 12-14, 2011; pp. 280-288.
Nishimura, S. et al.; “Development of Attachable Modules for Robotizing Daily Items: Person Following Shopping Cart Robot”; Proceedings of the 2007 IEEE International Conference on Robotics and Biomimetics (Sanya, China); Dec. 14-18, 2007; pp. 1506-1511.
Scholz, J. et al.; “Cart Pushing with a Mobile Manipulation System: Towards Navigation with Moveable Objects”; Proceedings of the 2011 IEEE International Conference on Robotics and Automation (Shanghai, China); May 9-13, 2011; pp. 6115-6120.
UKIPO; App. No. 1703373.9; Office Action dated Aug. 31, 2017.
Bohren; Jonathan et al.; “Towards Autonomous Robotic Butlers: Lessons Learned with the PR2”, Willow Garage, May 9, 2011, pp. 1-8.
Denso; “X-mobility”, Oct. 10, 2014, pp. 1-2, including machine translation.
Fellow Robots; “Meet OSHBOT”, http://fellowrobots.com/oshbot/, May 19, 2015, pp. 1-3.
Green, A., et al; “Report on evaluation of the robot trolley”, CommRob IST-045441, Advanced Behaviour and High-Level Multimodal Communications with and among Robots, Jun. 14, 2010, pp. 10-67.
Koubaa, Anis; “A Service-Oriented Architecture for Virtualizing Robots in Robot-as-a-Service Clouds”, 2014, pp. 1-13.
Kumar Paradkar, Prashant; “Voice Controlled Robotic Project using interfacing of Ardruino and Bluetooth HC-05”, Robotics_Projects_C/C++_Android, Jan. 23, 2016, pp. 1-14.
Meena, M., & Thilagavathi, P.; “Automatic Docking System with Recharging and Battery Replacement for Surveillance Robot”, International Journal of Electronics and Computer Science Engineering, 2012, pp. 1148-1154.
Sebaali, G., et al.; “Smart Shopping Cart”, Department of Electrical and Computer Engineering, American University of Beirut, 2014, pp. 1-6.
Shukla, Neha; “SaviOne the Butler Bot: Service Robot for Hospitality Industry”, TechieTonics, http://www.techietonics.com/robo-tonics/savione-the-butler-bot-service-for-hospitality-industry.html, Aug. 14, 2014, pp. 1-5.
Swisslog; “RoboCourier Autonomous Mobile Robot”, http://www.swisslog.com/en/Products/ HCS/Automated-Material-Transport/RoboCourier-Autonornous-Mobile-Robot printed May 27, 2015, pp. 1.
Vasilescu, Iuliu, et al.; “Autonomous Modular Optical Underwater Robot (AMOUR) Design, Prototype and Feasibility Study”, Apr. 18, 2005, pp. 1-7.
Wikipedia; “Leeds Kirkgate Market”; https://en.wikipedia.org/wiki/Leeds_Kirkgate_Market; Retrieved on Apr. 5, 2017; 8 pages.
Garun, Natt; “Hop the hands-free suitcase follows you around like an obedient pet”; https://www.digitaltrends.com/cool-tech/hop-the-hands-free-suitcase-follows-you-around-like-an-obedient-pet/; Oct. 10, 2012; pp. 1-6.
Onozato, Taishi et al.; “A Control System for the Robot Shopping Cart”; 2010 IRAST International Congress on Computer Applications and Computational Science (CACS 2010); 2010; pp. 907-910.
SK Telecom Co.; “SK Telecom Launches Smart Cart Pilot Test in Korea”; http://www.sktelecom.com/en/press/press_detail.do?idx=971; Oct. 4, 2011; pp. 1-2.
Tam, Donna; “Meet Amazon's busiest employee—the Kiva robot”; http://www.cnet.com/news/meet-amazons-busiest-employee-the-kiva-robot/; Nov. 30, 2014; pp. 1-6.
Budgee; “The Robotic Shopping Cart Budgee ”; https://www.youtube.com/watch?v=2dYNdVPF4VM; published on Mar. 20, 2015; pp. 1-6.
Follow Inspiration; “wiiGO”; https://www.youtube.com/watch?v=dhHXIdpknC4; published on Jun. 16, 2015; pp. 1-7.
Office Action dated Mar. 26, 2018 for U.S. Appl. No. 15/061,688 (pp. 1-18).
Technion; “Autonomous Tracking Shopping Cart—Shopping Made Easy from Technion”; https://www.youtube.com/watch?v=pQcb9fofmXg; published on Nov. 23, 2014; pp. 1-10.
UKIPO; App. No. 1714769.5; Office Action dated Mar. 27, 2018.
UKIPO; App. No. 1715523.5; Office Action dated Mar. 26, 2018.
UKIPO; App. No. GB1613851.3; Examination Report dated Feb. 9, 2018.
UKIPO; App. No. GB1721298.6; Office Action dated Jan. 31, 2018.
USPTO; U.S. Appl. No. 15/061,203; Notice of Allowance dated May 8, 2018 (pp. 1-5).
USPTO; U.S. Appl. No. 15/061,406; Notice of Allowance dated May 15, 2018 (pp. 1-5).
USPTO; U.S. Appl. No. 15/061,443; Office Action dated Apr. 4, 2018 (pp. 1-13).
USPTO; U.S. Appl. No. 15/061,671; Office Action dated Apr. 18, 2018 (pp. 1-18).
USPTO; U.S. Appl. No. 15/061,801; Notice of Allowance dated Mar. 2, 2018 (pp. 1-5).
USPTO; U.S. Appl. No. 15/274,991; Office Action dated May 17, 2018 (pp. 1-9).
USPTO; U.S. Appl. No. 15/282,951; Office Action dated Mar. 30, 2018 (pp. 1-12).
USPTO; U.S. Appl. No. 15/471,278; Notice of Allowance dated Apr. 19, 2018 (pp. 1-7).
USPTO; U.S. Appl. No. 15/061,350; Notice of Allowance dated Apr. 4, 2018 (pp. 1-8).
U.S. Appl. No. 15/698,068, filed Sep. 7, 2017, Donald R. High.
U.S. Appl. No. 15/902,274, filed May 25, 2018, Donald R. High.
U.S. Appl. No. 16/001,774, filed Jun. 6, 2018, Donald R. High.
U.S. Appl. No. 16/059,431, filed Aug. 9, 2018, Donald R. High.
U.S. Appl. No. 16/100,064, filed Aug. 9, 2018, Donald R. High.
U.S. Appl. No. 16/109,290, filed Aug. 22, 2018, Donald R. High.
UKIPO; App. No. GB1703373.9; Examination Report dated Aug. 31, 2018.
U.S. Appl. No. 15/061,443, filed Mar. 4, 2016, High.
U.S. Appl. No. 15/275,009, filed Sep. 23, 2016, Donald R. High.
U.S. Appl. No. 15/275,019, filed Sep. 23, 2016, Donald R. High.
U.S. Appl. No. 15/275,047, filed Sep. 23, 2016, Donald R. High.
U.S. Appl. No. 15/829,951, filed Sep. 30, 2016, Donald R. High.
U.S. Appl. No. 15/288,923, filed Oct. 7, 2016, Donald R. High.
U.S. Appl. No. 15/423,812, filed Feb. 3, 2017, Donald R. High.
U.S. Appl. No. 15/446,914, filed Mar. 1, 2017, Donald R. High.
U.S. Appl. No. 15/447,175, filed Mar. 2, 2017, Donald R. High.
U.S. Appl. No. 15/447,202, filed Mar. 2, 2017, Donald R. High.
U.S. Appl. No. 15/712,278, filed Mar. 28, 2017, Donald R. High.
U.S. Appl. No. 15/692,226, filed Aug. 31, 2017, Donald R. High.
U.S. Appl. No. 15/836,708, filed Dec. 8, 2017, Donald R. High.
Daily Mail; “Dancing with your phone: The gyrating robotic dock that can move along with your music”, Sep. 12, 2012, http://www.dailymail.co.uk/sciencetech/article-2202164/The-intelligent-dancing-robot-controlled-mobile-phone.html, pp. 1-23.
Messieh, Nancy; “Humanoid robots will be roaming Abu Dhabi's malls next year”, The Next Web, Oct. 17, 2011, https://thenextweb.com/me/2011/10/17/humanoid-robots-will-be-roaming-abu-dhabis-malls-next-year/, pp. 1-6.
Owano, Nancy; “HEARBO robot can tell beeps, notes, and spoken word (w/ Video)”, Phys.org, Nov. 21, 2012, https://phys.org/news/2012-11-hearbo-robot-beeps-spoken-word.html, pp. 1-4.
Sales, Jorge, et al.; “CompaRob: The Shopping Cart Assistance Robot”, International Journal of Distributed Sensor Networks, vol. 2016, Article ID 4781280, Jan. 3, 2016, http://dx.doi.org/10.1155/2016/4781280, pp. 1-16.
Related Publications (1)
Number Date Country
20160259343 A1 Sep 2016 US
Provisional Applications (37)
Number Date Country
62129726 Mar 2015 US
62129727 Mar 2015 US
62138877 Mar 2015 US
62138885 Mar 2015 US
62152421 Apr 2015 US
62152465 Apr 2015 US
62152440 Apr 2015 US
62152630 Apr 2015 US
62152711 Apr 2015 US
62152610 Apr 2015 US
62152667 Apr 2015 US
62157388 May 2015 US
62165579 May 2015 US
62165416 May 2015 US
62165586 May 2015 US
62171822 Jun 2015 US
62175182 Jun 2015 US
62182339 Jun 2015 US
62185478 Jun 2015 US
62194131 Jul 2015 US
62194119 Jul 2015 US
62194121 Jul 2015 US
62194127 Jul 2015 US
62202744 Aug 2015 US
62202747 Aug 2015 US
62205548 Aug 2015 US
62205569 Aug 2015 US
62205555 Aug 2015 US
62205539 Aug 2015 US
62207858 Aug 2015 US
62214826 Sep 2015 US
62214824 Sep 2015 US
62292084 Feb 2016 US
62302547 Mar 2016 US
62302567 Mar 2016 US
62302713 Mar 2016 US
62303021 Mar 2016 US