These teachings relate generally to shopping environments and more particularly to devices, systems and methods for assisting customers and/or workers in those shopping environments.
In a modern retail store environment, there is a need to improve the customer experience and/or convenience for the customer. Whether shopping in a large format (big box) store or smaller format (neighborhood) store, customers often require assistance that employees of the store are not always able to provide. For example, particularly during peak hours, there may not be enough employees available to assist customers such that customer questions go unanswered. Additionally, due to high employee turnover rates, available employees may not be fully trained or have access to information to adequately support customers. Other routine tasks also are difficult to keep up with, particularly during peak hours. For example, shopping carts are left abandoned, aisles become messy, inventory is not displayed in the proper locations or is not even placed on the sales floor, shelf prices may not be properly set, and theft is hard to discourage. All of these issues can result in low customer satisfaction or reduced convenience to the customer. With increasing competition from non-traditional shopping mechanisms, such as online shopping provided by e-commerce merchants and alternative store formats, it can be important for “brick and mortar” retailers to focus on improving the overall customer experience and/or convenience.
The above needs are at least partially met through provision of embodiments of systems, devices, and methods designed to provide assistance to customers and/or workers in a shopping facility, such as described in the following detailed description, particularly when studied in conjunction with the drawings, wherein:
Elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions and/or relative positioning of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of various embodiments of the present teachings. Also, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present teachings. Certain actions and/or steps may be described or depicted in a particular order of occurrence while those skilled in the art will understand that such specificity with respect to sequence is not actually required. The terms and expressions used herein have the ordinary technical meaning as is accorded to such terms and expressions by persons skilled in the technical field as set forth above except where different specific meanings have otherwise been set forth herein.
The following description is not to be taken in a limiting sense, but is made merely for the purpose of describing the general principles of exemplary embodiments. Reference throughout this specification to “one embodiment,” “an embodiment,” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, appearances of the phrases “in one embodiment,” “in an embodiment,” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.
Generally speaking, pursuant to various embodiments, systems, devices and methods are provided for assistance of persons at a shopping facility. Generally, assistance may be provided to customers or shoppers at the facility and/or to workers at the facility. The facility may be any type of shopping facility at a location in which products for display and/or for sale are variously distributed throughout the shopping facility space. The shopping facility may be a retail sales facility, or any other type of facility in which products are displayed and/or sold. The shopping facility may include one or more of sales floor areas, checkout locations, parking locations, entrance and exit areas, stock room areas, stock receiving areas, hallway areas, common areas shared by merchants, and so on. Generally, a shopping facility includes areas that may be dynamic in terms of the physical structures occupying the space or area and the objects, items, machinery and/or persons moving in the area. For example, the shopping area may include product storage units, shelves, racks, modules, bins, etc., as well as walls, dividers, partitions, etc. that may be configured in different layouts or physical arrangements. In another example, persons or other movable objects may be traveling freely and independently through the shopping facility space. In yet another example, the persons or movable objects move according to known travel patterns and timing. The facility may be of any size or format, and may include products from one or more merchants. For example, a facility may be a single store operated by one merchant or may be a collection of stores covering multiple merchants, such as a mall. Generally, the system makes use of automated, robotic mobile devices, e.g., motorized transport units, that are capable of self-powered movement through a space of the shopping facility and of providing any number of functions. Movement and operation of such devices may be controlled by a central computer system or may be autonomously controlled by the motorized transport units themselves. Various embodiments provide one or more user interfaces to allow various users to interact with the system, including the automated mobile devices, and/or to directly interact with the automated mobile devices. In some embodiments, the automated mobile devices and the corresponding system serve to enhance a customer shopping experience in the shopping facility, e.g., by assisting shoppers and/or workers at the facility.
In some embodiments, a shopping facility personal assistance system comprises: a plurality of motorized transport units located in and configured to move through a shopping facility space; a plurality of user interface units, each corresponding to a respective motorized transport unit during use of the respective motorized transport unit; and a central computer system having a network interface such that the central computer system wirelessly communicates with one or both of the plurality of motorized transport units and the plurality of user interface units, wherein the central computer system is configured to control movement of the plurality of motorized transport units through the shopping facility space based at least on inputs from the plurality of user interface units.
System Overview
Referring now to the drawings, an exemplary shopping facility assistance system 100 is illustrated. In the example shown, the system 100 is deployed at a shopping facility 101 and includes one or more motorized transport units 102, one or more movable item containers 104, and a central computer system 106, along with the further components described below.
These motorized transport units 102 are located in the shopping facility 101 and are configured to move throughout the shopping facility space. Further details regarding such motorized transport units 102 appear further below. Generally speaking, these motorized transport units 102 are configured to either comprise, or to selectively couple to, a corresponding movable item container 104. A simple example of an item container 104 would be a shopping cart as one typically finds at many retail facilities, or a rocket cart, a flatbed cart or any other mobile basket or platform that may be used to gather items for potential purchase.
In some embodiments, these motorized transport units 102 wirelessly communicate with, and are wholly or largely controlled by, the central computer system 106. In particular, in some embodiments, the central computer system 106 is configured to control movement of the motorized transport units 102 through the shopping facility space based on a variety of inputs. For example, the central computer system 106 communicates with each motorized transport unit 102 via the wireless network 124, which may be one or more wireless networks of one or more wireless network types (such as a wireless local area network, a wireless personal area network, a wireless mesh network, a wireless star network, a wireless wide area network, a cellular network, and so on), capable of providing wireless coverage of the desired range of the motorized transport units 102 according to any known wireless protocols, including but not limited to a cellular, Wi-Fi, Zigbee or Bluetooth network.
By one approach the central computer system 106 is a computer-based device and includes at least one control circuit 108, at least one memory 110 and at least one wired and/or wireless network interface 112. Such a control circuit 108 can comprise a fixed-purpose hard-wired platform or can comprise a partially or wholly programmable platform, such as a microcontroller, an application-specific integrated circuit, a field programmable gate array, and so on. These architectural options are well known and understood in the art and require no further description here. This control circuit 108 is configured (for example, by using corresponding programming stored in the memory 110 as will be well understood by those skilled in the art) to carry out one or more of the steps, actions, and/or functions described herein.
In this illustrative example the control circuit 108 operably couples to one or more memories 110. The memory 110 may be integral to the control circuit 108 or can be physically discrete (in whole or in part) from the control circuit 108 as desired. This memory 110 can also be local with respect to the control circuit 108 (where, for example, both share a common circuit board, chassis, power supply, and/or housing) or can be partially or wholly remote with respect to the control circuit 108 (where, for example, the memory 110 is physically located in another facility, metropolitan area, or even country as compared to the control circuit 108).
This memory 110 can serve, for example, to non-transitorily store the computer instructions that, when executed by the control circuit 108, cause the control circuit 108 to behave as described herein. (As used herein, this reference to “non-transitorily” will be understood to refer to a non-ephemeral state for the stored contents (and hence excludes when the stored contents merely constitute signals or waves) rather than volatility of the storage media itself, and hence includes both non-volatile memory (such as read-only memory (ROM) or erasable programmable read-only memory (EPROM)) and volatile memory (such as random access memory (RAM)).)
Additionally, at least one database 126 may be accessible by the central computer system 106. Such databases may be integrated into the central computer system 106 or separate from it. Such databases may be at the location of the shopping facility 101 or remote from the shopping facility 101. Regardless of location, the databases comprise memory to store and organize certain data for use by the central computer system 106. In some embodiments, the at least one database 126 may store data pertaining to one or more of: shopping facility mapping data, customer data, customer shopping data and patterns, inventory data, product pricing data, and so on.
In this illustrative example, the central computer system 106 also wirelessly communicates with a plurality of user interface units 114. These teachings will accommodate a variety of user interface units including, but not limited to, mobile and/or handheld electronic devices such as so-called smart phones and portable computers such as tablet/pad-styled computers. Generally speaking, these user interface units 114 should be able to wirelessly communicate with the central computer system 106 via a wireless network, such as the wireless network 124 of the shopping facility 101 (such as a Wi-Fi wireless network). These user interface units 114 generally provide a user interface for interaction with the system. In some embodiments, a given motorized transport unit 102 is paired with, associated with, assigned to or otherwise made to correspond with a given user interface unit 114. In some embodiments, these user interface units 114 should also be able to receive verbally-expressed input from a user and forward that content to the central computer system 106 or a motorized transport unit 102 and/or convert that verbally-expressed input into a form useful to the central computer system 106 or a motorized transport unit 102.
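By way of illustration only, the following simplified sketch (in Python, using hypothetical names and data rather than any implementation from these teachings) models the correspondence just described: each motorized transport unit is paired with a user interface unit during use, and the central computer system translates inputs from a user interface unit into movement commands for the paired unit.

```python
from dataclasses import dataclass, field

@dataclass
class MotorizedTransportUnit:
    unit_id: str
    location: tuple[float, float] = (0.0, 0.0)
    paired_ui_id: str | None = None  # user interface unit assigned during use

@dataclass
class CentralComputerSystem:
    mtus: dict[str, MotorizedTransportUnit] = field(default_factory=dict)

    def pair(self, unit_id: str, ui_id: str) -> None:
        """Make a given transport unit correspond to a given user interface unit."""
        self.mtus[unit_id].paired_ui_id = ui_id

    def handle_ui_input(self, ui_id: str, destination: tuple[float, float]) -> None:
        """Translate a user interface input into a movement command."""
        for mtu in self.mtus.values():
            if mtu.paired_ui_id == ui_id:
                self.send_move_command(mtu, destination)

    def send_move_command(self, mtu: MotorizedTransportUnit, destination) -> None:
        # Stand-in for a command sent over the wireless network 124.
        print(f"move {mtu.unit_id} -> {destination}")

ccs = CentralComputerSystem({"mtu-1": MotorizedTransportUnit("mtu-1")})
ccs.pair("mtu-1", "ui-7")
ccs.handle_ui_input("ui-7", (18.0, 42.5))  # prints: move mtu-1 -> (18.0, 42.5)
```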
By one approach at least some of the user interface units 114 belong to corresponding customers who have come to the shopping facility 101 to shop. By another approach, in lieu of the foregoing or in combination therewith, at least some of the user interface units 114 belong to the shopping facility 101 and are loaned to individual customers to employ as described herein. In some embodiments, one or more user interface units 114 are attachable to a given movable item container 104 or are integrated with the movable item container 104. Similarly, in some embodiments, one or more user interface units 114 may be those of shopping facility workers, belong to the shopping facility 101 and are loaned to the workers, or a combination thereof.
In some embodiments, the user interface units 114 may be general purpose computer devices that include computer programming code allowing them to interact with the system 100. For example, such programming may be in the form of an application installed on the user interface unit 114 or in the form of a browser that displays a user interface provided by the central computer system 106 or other remote computer or server (such as a web server). In some embodiments, one or more user interface units 114 may be special purpose devices that are programmed to primarily function as a user interface for the system 100. Depending on the functionality and use case, user interface units 114 may be operated by customers of the shopping facility or may be operated by workers at the shopping facility, such as facility employees (associates or colleagues), vendors, suppliers, contractors, etc.
By one approach, the system 100 optionally includes one or more video cameras 118. Captured video imagery from such a video camera 118 can be provided to the central computer system 106. That information can then serve, for example, to help the central computer system 106 determine a present location of one or more of the motorized transport units 102 and/or determine issues or concerns regarding automated movement of those motorized transport units 102 in the shopping facility space. As one simple example in these regards, such video information can permit the central computer system 106, at least in part, to detect an object in a path of movement of a particular one of the motorized transport units 102.
By one approach these video cameras 118 comprise existing surveillance equipment employed at the shopping facility 101 to serve, for example, various security purposes. By another approach these video cameras 118 are dedicated to providing video content to the central computer system 106 to facilitate the latter's control of the motorized transport units 102. If desired, the video cameras 118 can have a selectively movable field of view and/or zoom capability that the central computer system 106 controls as appropriate to help ensure receipt of useful information at any given moment.
In some embodiments, a location detection system 116 is provided at the shopping facility 101. The location detection system 116 provides input to the central computer system 106 useful to help determine the location of one or more of the motorized transport units 102. In some embodiments, the location detection system 116 includes a series of light sources (e.g., LEDs (light-emitting diodes)) that are mounted in the ceiling at known positions throughout the space and that each encode data in the emitted light that identifies the source of the light (and thus, the location of the light). As a given motorized transport unit 102 moves through the space, light sensors (or light receivers) at the motorized transport unit 102, on the movable item container 104 and/or at the user interface unit 114 receive the light and can decode the data. This data is sent back to the central computer system 106, which can determine the position of the motorized transport unit 102 from the data encoded in the light it receives, since it can relate that data to a mapping of the light sources to locations at the facility 101. Generally, such lighting systems are known and commercially available, e.g., the ByteLight system from ByteLight of Boston, Massachusetts. In embodiments using a ByteLight system, a typical display screen of the typical smart phone device can be used as a light sensor or light receiver to receive and process data encoded into the light from the ByteLight light sources.
In other embodiments, the location detection system 116 includes a series of low energy radio beacons (e.g., Bluetooth low energy beacons) at known positions throughout the space, each of which encodes data in the emitted radio signal that identifies the beacon (and thus, the location of the beacon). As a given motorized transport unit 102 moves through the space, low energy receivers at the motorized transport unit 102, on the movable item container 104 and/or at the user interface unit 114 receive the radio signal and can decode the data. This data is sent back to the central computer system 106, which can determine the position of the motorized transport unit 102 from the location encoded in the radio signal it receives, since it can relate that location data to a mapping of the low energy radio beacons to locations at the facility 101. Generally, such low energy radio systems are known and commercially available. In embodiments using a Bluetooth low energy radio system, a typical Bluetooth radio of a typical smart phone device can be used as a receiver to receive and process data encoded into the Bluetooth low energy radio signals from the Bluetooth low energy beacons.
In still other embodiments, the location detection system 116 includes a series of audio beacons at known positions throughout the space, each of which encodes data in the emitted audio signal that identifies the beacon (and thus, the location of the beacon). As a given motorized transport unit 102 moves through the space, microphones at the motorized transport unit 102, on the movable item container 104 and/or at the user interface unit 114 receive the audio signal and can decode the data. This data is sent back to the central computer system 106, which can determine the position of the motorized transport unit 102 from the location encoded in the audio signal it receives, since it can relate that location data to a mapping of the audio beacons to locations at the facility 101. Generally, such audio beacon systems are known and commercially available. In embodiments using an audio beacon system, a typical microphone of a typical smart phone device can be used as a receiver to receive and process data encoded into the audio signals from the audio beacons.
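Whichever signal type carries the encoded data (LED light, low energy radio, or audio), the location determination follows the same pattern: a decoded source identifier is resolved against a stored mapping of sources to facility positions. The following minimal sketch illustrates that common pattern; the identifiers and coordinates are assumed for illustration only (e.g., as might be loaded from the databases 126).

```python
# Hypothetical mapping of source identifiers (light sources, radio beacons,
# audio beacons) to known facility positions.
SOURCE_POSITIONS = {
    "LED-0417": (12.5, 33.0),   # e.g., ceiling light over aisle 4, bay 17
    "BLE-A102": (40.0, 8.25),   # e.g., radio beacon near the entrance area
    "AUD-S003": (55.5, 61.0),   # e.g., audio beacon at the stock room doorway
}

def locate(decoded_source_id: str) -> tuple[float, float]:
    """Resolve a decoded source identifier to (x, y) facility coordinates."""
    try:
        return SOURCE_POSITIONS[decoded_source_id]
    except KeyError:
        raise ValueError(f"unknown location source: {decoded_source_id}")

# A motorized transport unit reports the identifier decoded from the nearest
# source; that source's mapped position approximates the unit's location.
print(locate("LED-0417"))  # -> (12.5, 33.0)
```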
Also optionally, the central computer system 106 can operably couple to one or more user interface computers 128 (comprising, for example, a display and a user input interface such as a keyboard, touch screen, and/or cursor-movement device). Such a user interface computer 128 can permit, for example, a worker (e.g., an associate, analyst, etc.) at the retail or shopping facility 101 to monitor the operations of the central computer system 106 and/or to attend to any of a variety of administrative, configuration or evaluation tasks as may correspond to the programming and operation of the central computer system 106. Such user interface computers 128 may be at or remote from the location of the facility 101 and may access one or more of the databases 126.
In some embodiments, the system 100 includes at least one motorized transport unit (MTU) storage unit or dispenser 120 at various locations in the shopping facility 101. The dispenser 120 provides for storage of motorized transport units 102 that are ready to be assigned to customers and/or workers. In some embodiments, the dispenser 120 takes the form of a cylinder within which motorized transport units 102 are stacked and released through the bottom of the dispenser 120. Further details of such embodiments are provided further below. In some embodiments, the dispenser 120 may be fixed in location, or may be mobile and capable of transporting itself to a given location (or of utilizing a motorized transport unit 102 to transport it) and then dispensing one or more motorized transport units 102.
In some embodiments, the system 100 includes at least one motorized transport unit (MTU) docking station 122. These docking stations 122 provide locations to which motorized transport units 102 can travel and connect. For example, the motorized transport units 102 may be stored and charged at the docking station 122 for later use, and/or may be serviced at the docking station 122.
In accordance with some embodiments, a given motorized transport unit 102 detachably connects to a movable item container 104 and is configured to move the movable item container 104 through the shopping facility space under control of the central computer system 106 and/or the user interface unit 114. For example, a motorized transport unit 102 can move to a position underneath a movable item container 104 (such as a shopping cart, a rocket cart, a flatbed cart, or any other mobile basket or platform), align itself with the movable item container 104 (e.g., using sensors) and then raise itself to engage an undersurface of the movable item container 104 and lift a portion of the movable item container 104. Once the motorized transport unit is cooperating with the movable item container 104 (e.g., lifting a portion of the movable item container), the motorized transport unit 102 can continue to move throughout the shopping facility 101, taking the movable item container 104 with it. In some examples, the motorized transport unit 102 takes the form of the motorized transport unit 202 described further below.
In addition to detachably coupling to movable item containers 104 (such as shopping carts), in some embodiments, motorized transport units 102 can move to and engage or connect to an item display module 130 and/or an item storage unit or locker 132. For example, an item display module 130 may take the form of a mobile display rack or shelving unit configured to house and display certain items for sale. It may be desired to position the display module 130 at various locations within the shopping facility 101 at various times. Thus, one or more motorized transport units 102 may move (as controlled by the central computer system 106) underneath the item display module 130, extend upward to lift the module 130 and then move it to the desired location. A storage locker 132 may be a storage device where items for purchase are collected and placed therein for a customer and/or worker to later retrieve. In some embodiments, one or more motorized transport units 102 may be used to move the storage locker to a desired location in the shopping facility 101. Similar to how a motorized transport unit engages a movable item container 104 or item display module 130, one or more motorized transport units 102 may move (as controlled by the central computer system 106) underneath the storage locker 132, extend upward to lift the locker 132 and then move it to the desired location.
The control circuit 406 operably couples to a motorized wheel system 410. This motorized wheel system 410 functions as a locomotion system to permit the motorized transport unit 102 to move within the aforementioned retail or shopping facility 101 (thus, the motorized wheel system 410 may more generically be referred to as a locomotion system). Generally speaking, this motorized wheel system 410 will include at least one drive wheel (i.e., a wheel that rotates (around a horizontal axis) under power to thereby cause the motorized transport unit 102 to move through interaction with, for example, the floor of the shopping facility 101). The motorized wheel system 410 can include any number of rotating wheels and/or other floor-contacting mechanisms as may be desired and/or appropriate to the application setting.
The motorized wheel system 410 also includes a steering mechanism of choice. One simple example in these regards comprises one or more of the aforementioned wheels that can swivel about a vertical axis to thereby cause the moving motorized transport unit 102 to turn as well.
Numerous examples of motorized wheel systems are known in the art. Accordingly, further elaboration in these regards is not provided here for the sake of brevity save to note that the aforementioned control circuit 406 is configured to control the various operating states of the motorized wheel system 410 to thereby control when and how the motorized wheel system 410 operates.
In this illustrative example, the control circuit 406 also operably couples to at least one wireless transceiver 412 that operates according to any known wireless protocol. This wireless transceiver 412 can comprise, for example, a Wi-Fi-compatible and/or Bluetooth-compatible transceiver that can communicate with the aforementioned central computer system 106 via the aforementioned wireless network 124 of the shopping facility 101. So configured the control circuit 406 of the motorized transport unit 102 can provide information to the central computer system 106 and can receive information and/or instructions from the central computer system 106. As one simple example in these regards, the control circuit 406 can receive instructions from the central computer system 106 regarding movement of the motorized transport unit 102.
These teachings will accommodate using any of a wide variety of wireless technologies as desired and/or as may be appropriate in a given application setting. These teachings will also accommodate employing two or more different wireless transceivers 412 if desired.
The control circuit 406 also couples to one or more on-board sensors 414. These teachings will accommodate a wide variety of sensor technologies and form factors. By one approach at least one such sensor 414 can comprise a light sensor or light receiver. When the aforementioned location detection system 116 comprises a plurality of light emitters disposed at particular locations within the shopping facility 101, such a light sensor can provide information that the control circuit 406 and/or the central computer system 106 employs to determine a present location and/or orientation of the motorized transport unit 102.
As another example, such a sensor 414 can comprise a distance measurement unit configured to detect a distance between the motorized transport unit 102 and one or more objects or surfaces around the motorized transport unit 102 (such as an object that lies in a projected path of movement for the motorized transport unit 102 through the shopping facility 101). These teachings will accommodate any of a variety of distance measurement units including optical units and sound/ultrasound units. In one example, a sensor 414 comprises a laser distance sensor device capable of determining a distance to objects in proximity to the sensor. In some embodiments, a sensor 414 comprises an optical-based scanning device to sense and read optical patterns in proximity to the sensor, such as bar codes variously located on structures in the shopping facility 101. In some embodiments, a sensor 414 comprises a radio frequency identification (RFID) tag reader capable of reading RFID tags in proximity to the sensor. Such sensors may be useful to determine proximity to nearby objects, avoid collisions, orient the motorized transport unit at a proper alignment orientation to engage a movable item container, and so on.
The foregoing examples are intended to be illustrative and are not intended to convey an exhaustive listing of all possible sensors. Instead, it will be understood that these teachings will accommodate sensing any of a wide variety of circumstances or phenomena to support the operating functionality of the motorized transport unit 102 in a given application setting.
By one optional approach an audio input 416 (such as a microphone) and/or an audio output 418 (such as a speaker) can also operably couple to the control circuit 406. So configured the control circuit 406 can provide a variety of audible sounds to thereby communicate with a user of the motorized transport unit 102, other persons in the vicinity of the motorized transport unit 102, or even other motorized transport units 102 in the area. These audible sounds can include any of a variety of tones and other non-verbal sounds. These audible sounds can also include, in lieu of the foregoing or in combination therewith, pre-recorded or synthesized speech.
The audio input 416, in turn, provides a mechanism whereby, for example, a user provides verbal input to the control circuit 406. That verbal input can comprise, for example, instructions, inquiries, or information. So configured, a user can provide, for example, a question to the motorized transport unit 102 (such as, “Where are the towels?”). The control circuit 406 can cause that verbalized question to be transmitted to the central computer system 106 via the motorized transport unit's wireless transceiver 412. The central computer system 106 can process that verbal input to recognize the speech content and to then determine an appropriate response. That response might comprise, for example, transmitting back to the motorized transport unit 102 specific instructions regarding how to move the motorized transport unit 102 (via the aforementioned motorized wheel system 410) to the location in the shopping facility 101 where the towels are displayed.
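By way of illustration only, the following simplified sketch (with a hypothetical product-location table and phrasing, not any actual speech-recognition pipeline) traces that flow: the recognized text of the verbalized question is matched against known product locations, and the response carries movement instructions for the motorized transport unit along with a spoken reply.

```python
# Assumed product-location data, e.g., as might come from the databases 126.
PRODUCT_LOCATIONS = {"towels": (18.0, 42.5), "batteries": (7.0, 12.0)}

def handle_verbal_input(recognized_text: str) -> dict:
    """Answer a recognized 'Where are the X?' query with movement instructions."""
    query = recognized_text.lower().rstrip("?")
    for product, location in PRODUCT_LOCATIONS.items():
        if product in query:
            return {"command": "move_to", "destination": location,
                    "speech_reply": f"Follow me to the {product}."}
    return {"command": "none", "speech_reply": "Sorry, I could not find that item."}

print(handle_verbal_input("Where are the towels?"))
# -> {'command': 'move_to', 'destination': (18.0, 42.5),
#     'speech_reply': 'Follow me to the towels.'}
```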
In this example the motorized transport unit 102 includes a rechargeable power source 420 such as one or more batteries. The power provided by the rechargeable power source 420 can be made available to whichever components of the motorized transport unit 102 require electrical energy. By one approach the motorized transport unit 102 includes a plug or other electrically conductive interface that the control circuit 406 can utilize to automatically connect to an external source of electrical energy to thereby recharge the rechargeable power source 420.
By one approach the motorized transport unit 102 comprises an integral part of a movable item container 104 such as a grocery cart. As used herein, this reference to “integral” will be understood to refer to a non-temporary combination and joinder that is sufficiently complete so as to consider the combined elements to be as one. Such a joinder can be facilitated in a number of ways, including by securing the motorized transport unit housing 402 to the item container using bolts or other threaded fasteners, as opposed to, for example, a clip.
These teachings will also accommodate selectively and temporarily attaching the motorized transport unit 102 to an item container 104. In such a case the motorized transport unit 102 can include a movable item container coupling structure 422. By one approach this movable item container coupling structure 422 operably couples to the control circuit 406 to thereby permit the latter to control, for example, the latched and unlatched states of the movable item container coupling structure 422. So configured, by one approach the control circuit 406 can automatically and selectively move the motorized transport unit 102 (via the motorized wheel system 410) towards a particular item container until the movable item container coupling structure 422 can engage the item container to thereby temporarily physically couple the motorized transport unit 102 to the item container. So latched, the motorized transport unit 102 can then cause the item container to move with the motorized transport unit 102.
In either case, by combining the motorized transport unit 102 with an item container, and by controlling movement of the motorized transport unit 102 via the aforementioned central computer system 106, these teachings will facilitate a wide variety of useful ways to assist both customers and associates in a shopping facility setting. For example, the motorized transport unit 102 can be configured to follow a particular customer as they shop within the shopping facility 101. The customer can then place items they intend to purchase into the item container that is associated with the motorized transport unit 102.
In some embodiments, the motorized transport unit 102 includes an input/output (I/O) device 424 that is coupled to the control circuit 406. The I/O device 424 allows an external device to couple to the control unit 404. The function and purpose of connecting devices will depend on the application. In some examples, devices connecting to the I/O device 424 may add functionality to the control unit 404, allow the exporting of data from the control unit 404, allow the diagnosing of the motorized transport unit 102, and so on.
In some embodiments, the motorized transport unit 102 includes a user interface 426 including, for example, user inputs and/or user outputs or displays, depending on the intended interaction with the user. For example, user inputs could include any input device such as buttons, knobs, switches, touch sensitive surfaces or display screens, and so on. Example user outputs include lights, display screens, and so on. The user interface 426 may work together with or separate from any user interface implemented at a user interface unit 114 (such as a smart phone or tablet device).
The control unit 404 includes a memory 408 coupled to the control circuit 406 and that stores, for example, operating instructions and/or useful data. The control circuit 406 can comprise a fixed-purpose hard-wired platform or can comprise a partially or wholly programmable platform. These architectural options are well known and understood in the art and require no further description here. This control circuit 406 is configured (for example, by using corresponding programming stored in the memory 408 as will be well understood by those skilled in the art) to carry out one or more of the steps, actions, and/or functions described herein. The memory 408 may be integral to the control circuit 406 or can be physically discrete (in whole or in part) from the control circuit 406 as desired. This memory 408 can also be local with respect to the control circuit 406 (where, for example, both share a common circuit board, chassis, power supply, and/or housing) or can be partially or wholly remote with respect to the control circuit 406. This memory 408 can serve, for example, to non-transitorily store the computer instructions that, when executed by the control circuit 406, cause the control circuit 406 to behave as described herein. (As used herein, this reference to “non-transitorily” will be understood to refer to a non-ephemeral state for the stored contents (and hence excludes when the stored contents merely constitute signals or waves) rather than volatility of the storage media itself, and hence includes both non-volatile memory (such as read-only memory (ROM) or erasable programmable read-only memory (EPROM)) and volatile memory (such as random access memory (RAM)).)
It is noted that not all components illustrated and described herein are included in all embodiments of the motorized transport unit 102; depending upon the implementation, some components may be optional.
Additional Features Overview
Referring generally to the foregoing figures, various embodiments of the system 100 provide one or more of the following features.
Tagalong Steering: This feature allows a given motorized transport unit 102 to lead or follow a user (e.g., a customer and/or a worker) throughout the shopping facility 101. For example, the central computer system 106 uses the location detection system 116 to determine the location of the motorized transport unit 102. For example, LED smart lights (e.g., the ByteLight system) of the location detection system 116 transmit a location number to smart devices which are with the customer (e.g., user interface units 114) and/or on the item container 104/motorized transport unit 102, and the smart devices relay the received LED location numbers to the central computer system 106 through the wireless network 124. Using this information, in some embodiments, the central computer system 106 uses a grid placed upon a 2D CAD map and 3D point cloud model (e.g., from the databases 126) to direct, track, and plot paths for the other devices. Using the grid, the motorized transport unit 102 can drive a movable item container 104 in a straight path rather than zigzagging around the facility. As the user moves from one grid section to another, the motorized transport unit 102 drives the container 104 from one grid section to the other. In some embodiments, as the user moves towards the motorized transport unit, it stays still until the user moves beyond an adjoining grid section.
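A minimal sketch of this grid-based tagalong rule follows; the cell size and positions are assumed for illustration. The unit holds its position while the user remains within the same or an adjoining grid section, and otherwise steps one section at a time toward the user, yielding the straight, cell-by-cell paths described above.

```python
GRID_SIZE = 2.0  # assumed grid-section size, in meters

def to_cell(pos):
    """Map an (x, y) facility position to integer grid-section coordinates."""
    return (int(pos[0] // GRID_SIZE), int(pos[1] // GRID_SIZE))

def is_adjoining(cell_a, cell_b):
    """True when the sections are the same or touch (including diagonally)."""
    return abs(cell_a[0] - cell_b[0]) <= 1 and abs(cell_a[1] - cell_b[1]) <= 1

def next_cell(mtu_pos, user_pos):
    """Hold position near the user; otherwise step one section toward the user."""
    mtu_cell, user_cell = to_cell(mtu_pos), to_cell(user_pos)
    if is_adjoining(mtu_cell, user_cell):
        return mtu_cell  # user is within an adjoining section: stay still
    step = lambda a, b: a + (b > a) - (b < a)  # move one section along each axis
    return (step(mtu_cell[0], user_cell[0]), step(mtu_cell[1], user_cell[1]))

print(next_cell((0.5, 0.5), (9.0, 0.5)))  # unit steps toward the user: (1, 0)
```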
Detecting Objects: In some embodiments, motorized transport units 102 detect objects through several sensors mounted on the motorized transport unit 102, through independent cameras (e.g., video cameras 118), through sensors of a corresponding movable item container 104, and through communications with the central computer system 106. In some embodiments, with semi-autonomous capabilities, the motorized transport unit 102 will attempt to avoid obstacles and, if unable to avoid them, will notify the central computer system 106 of an exception condition. In some embodiments, using sensors 414 (such as distance measurement units, e.g., laser or other optical-based distance measurement sensors), the motorized transport unit 102 detects obstacles in its path and will move to avoid them, or stop until the obstacle is clear.
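By way of a simplified sketch, the avoid-or-stop behavior can be expressed as a threshold rule on the nearest measured obstacle distance; the thresholds below are assumed values, not taken from these teachings.

```python
STOP_DISTANCE = 0.5    # assumed: halt and report if an obstacle is this close (m)
AVOID_DISTANCE = 1.5   # assumed: steer around obstacles inside this range (m)

def obstacle_response(nearest_obstacle_m: float) -> str:
    """Decide how the transport unit reacts to the nearest detected obstacle."""
    if nearest_obstacle_m <= STOP_DISTANCE:
        return "stop_and_notify"  # raise an exception condition to the central system
    if nearest_obstacle_m <= AVOID_DISTANCE:
        return "steer_around"     # attempt autonomous avoidance
    return "continue"

for d in (0.3, 1.0, 4.0):
    print(d, "->", obstacle_response(d))
# 0.3 -> stop_and_notify; 1.0 -> steer_around; 4.0 -> continue
```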
Visual Remote Steering: This feature enables movement and/or operation of a motorized transport unit 102 to be controlled by a user on-site, off-site, or anywhere in the world. This is due to the architecture of some embodiments where the central computer system 106 outputs the control signals to the motorized transport unit 102. These control signals can originate at any device in communication with the central computer system 106. For example, the movement signals sent to the motorized transport unit 102 may be movement instructions determined by the central computer system 106, commands received at a user interface unit 114 from a user, or commands received at the central computer system 106 from a remote user not located at the shopping facility space.
Determining Location: Similar to that described above, this feature enables the central computer system 106 to determine the location of devices in the shopping facility 101. For example, the central computer system 106 maps received LED light transmissions, Bluetooth low energy radio signals or audio signals (or other received signals encoded with location data) to a 2D map of the shopping facility. Objects within the area of the shopping facility are also mapped and associated with those transmissions. Using this information, the central computer system 106 can determine the location of devices such as motorized transport units.
Digital Physical Map Integration: In some embodiments, the system 100 is capable of integrating 2D and 3D maps of the shopping facility with physical locations of objects and workers. Once the central computer system 106 maps all objects to specific locations using algorithms, measurements and LED geo-location, for example, grids are applied which section off the maps into access ways and blocked sections. Motorized transport units 102 use these grids for navigation and recognition. In some cases, grids are applied to 2D horizontal maps along with 3D models. In some cases, grids start at a higher unit level and can then be broken down into smaller units of measure by the central computer system 106 when needed to provide more accuracy.
Calling a Motorized Transport Unit: This feature provides multiple methods to request and schedule a motorized transport unit 102 for assistance in the shopping facility. In some embodiments, users can request use of a motorized transport unit 102 through the user interface unit 114. The central computer system 106 can check to see if there is an available motorized transport unit. Once a transport unit is assigned to a given user, other users will not be able to control it. Workers, such as store associates, may also reserve multiple motorized transport units in order to accomplish a coordinated large job.
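A minimal sketch of this request-and-assignment rule follows (identifiers are hypothetical): once a unit is assigned, control checks for any other user fail until the unit is released.

```python
assignments: dict[str, str] = {}  # mtu_id -> user_id currently assigned

def request_mtu(available_mtus: list[str], user_id: str) -> str | None:
    """Assign the first unassigned transport unit to the requesting user, if any."""
    for mtu_id in available_mtus:
        if mtu_id not in assignments:
            assignments[mtu_id] = user_id
            return mtu_id
    return None  # no transport unit currently available

def control_allowed(mtu_id: str, user_id: str) -> bool:
    """Only the assigned user may control an already assigned unit."""
    return assignments.get(mtu_id) == user_id

unit = request_mtu(["mtu-1", "mtu-2"], "customer-42")
print(unit, control_allowed("mtu-1", "customer-42"), control_allowed("mtu-1", "worker-7"))
# -> mtu-1 True False
```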
Locker Delivery: In some embodiments, one or more motorized transport units 102 may be used to pick, pack, and deliver items to a particular storage locker 132. The motorized transport units 102 can couple to and move the storage locker to a desired location. In some embodiments, once the items are delivered, the requestor is notified that the items are ready to be picked up and is provided the locker location and locker security code.
Route Optimization: In some embodiments, the central computer system automatically generates a travel route for one or more motorized transport units through the shopping facility space. In some embodiments, this route is based on one or more of: a list of items entered by the user via a user interface unit 114; route preferences entered by the user via the user interface unit 114; user profile data received from a user information database (e.g., from one of the databases 126); and product availability information from a retail inventory database (e.g., from one of the databases 126). In some cases, the route is intended to minimize the time it takes to get through the facility, and in some cases, may route the shopper to the least busy checkout area. Frequently, there will be multiple possible optimum routes. The route chosen may take the user past items the user is more likely to purchase (in case they forgot something) and away from items they are not likely to buy (to avoid embarrassment). That is, routing a customer who, based on past shopping behavior, has never purchased sporting goods, women's lingerie, baby food, or feminine products through those departments would be non-productive and potentially embarrassing to the customer. In some cases, a route may be selected from multiple possible routes based on past shopping behavior; e.g., if the customer typically buys a cold Diet Coke product, children's shoes or power tools, this information is used to add weight to the corresponding alternative routes and determine the route accordingly.
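By way of illustration only, the following sketch scores candidate routes by weighting the departments along each route according to past purchase behavior and penalizing travel time; the departments, affinity values, and time penalty are assumed for illustration.

```python
# Assumed per-department purchase affinities derived from past shopping behavior.
PAST_PURCHASE_AFFINITY = {"beverages": 0.9, "tools": 0.7, "shoes": 0.6, "lingerie": 0.0}

def route_score(departments_on_route: list[str], minutes: float) -> float:
    """Higher is better: favor likely-purchase departments, penalize travel time."""
    affinity = sum(PAST_PURCHASE_AFFINITY.get(d, 0.0) for d in departments_on_route)
    return affinity - 0.1 * minutes  # assumed weight on time spent in the facility

candidates = {
    "route_a": (["beverages", "tools"], 12.0),
    "route_b": (["lingerie", "shoes"], 10.0),
}
best = max(candidates, key=lambda name: route_score(*candidates[name]))
print(best)  # route_a: high-affinity departments outweigh the extra two minutes
```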
Store Facing Features: In some embodiments, these features enable functions to support workers in performing store functions. For example, the system can assist workers to know what products and items are on the shelves and which ones need attention. For example, using 3D scanning and point cloud measurements, the central computer system can determine where products are supposed to be, enabling workers to be alerted to facing or zoning issues along with potential inventory issues.
Phone Home: This feature allows users in a shopping facility 101 to be able to contact remote users who are not at the shopping facility 101 and include them in the shopping experience. For example, the user interface unit 114 may allow the user to place a voice call, a video call, or send a text message. With video call capabilities, a remote person can virtually accompany an in-store shopper, visually sharing the shopping experience while seeing and talking with the shopper. One or more remote shoppers may join the experience.
Returns: In some embodiments, the central computer system 106 can task a motorized transport unit 102 to keep the returns area clear of returned merchandise. For example, the transport unit may be instructed to move a cart from the returns area to a different department or area. Such commands may be initiated from video analytics (the central computer system analyzing camera footage showing a full cart), from an associate command (digital or verbal), or on a schedule, as other priority tasks allow. The motorized transport unit 102 can first bring an empty cart to the returns area prior to removing a full one.
Bring a Container: One or more motorized transport units can retrieve a movable item container 104 (such as a shopping cart) for a user. For example, upon a customer or worker request, the motorized transport unit 102 can re-position one or more item containers 104 from one location to another. In some cases, the system instructs the motorized transport unit where to obtain an empty item container for use. For example, the system can recognize an empty and idle item container that has been abandoned, or instruct that one be retrieved from a cart storage area. In some cases, the call to retrieve an item container may be initiated through one of a number of call buttons placed throughout the facility, or through the interface of a user interface unit 114.
Respond to Voice Commands: In some cases, control of a given motorized transport unit is implemented through the acceptance of voice commands. For example, the user may speak voice commands to the motorized transport unit 102 itself and/or to the user interface unit 114. In some embodiments, a voice print is used to authorize use of a motorized transport unit 102, allowing voice commands from a single user at a time.
Retrieve Abandoned Item Containers: This feature allows the central computer system to track movement of movable item containers in and around the area of the shopping facility 101, including both the sales floor areas and the back-room areas. For example, using visual recognition through store cameras 118 or through user interface units 114, the central computer system 106 can identify abandoned and out-of-place movable item containers. In some cases, each movable item container has a transmitter or smart device that sends a unique identifier, along with its position determined using LED geo-location identification, to facilitate tracking and other tasks. Using LED geo-location identification with the Determining Location feature through the smart devices on each cart, the central computer system 106 can determine the length of time a movable item container 104 is stationary.
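A simplified sketch of the stationary-time determination follows; the abandonment threshold is an assumed value. Each reported position resets a container's timer only when the container has actually moved, so the elapsed time since the last movement indicates how long the container has been stationary.

```python
import time

ABANDON_AFTER_S = 900  # assumed: flag containers stationary for over 15 minutes

last_moved = {}  # container_id -> (last position, time the position last changed)

def report_position(container_id: str, position: tuple[float, float]) -> None:
    """Record a container's geo-located position; reset its timer only on movement."""
    prev = last_moved.get(container_id)
    if prev is None or prev[0] != position:
        last_moved[container_id] = (position, time.time())

def abandoned(now: float | None = None) -> list[str]:
    """Return containers that have not moved within the threshold."""
    now = time.time() if now is None else now
    return [cid for cid, (_, t) in last_moved.items() if now - t > ABANDON_AFTER_S]

report_position("cart-12", (60.2, 5.1))
print(abandoned(now=time.time() + 1000))  # ~17 idle minutes later: ['cart-12']
```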
Stocker Assistance: This feature allows the central computer system to track movement of merchandise flow into and around the back-room areas. For example, using visual recognition and captured images, the central computer system 106 can determine whether carts for moving merchandise between the back-room areas and the sales floor areas are loaded or not. Tasks or alerts may then be sent to workers.
Self-Docking: Motorized transport units 102 will run low on or out of power when used. Before this happens, the motorized transport units 102 need to recharge to stay in service. According to this feature, motorized transport units 102 will self-dock and recharge (e.g., at an MTU docking station 122) when not in use to stay at maximum efficiency. When use is completed, the motorized transport unit 102 will return to a docking station 122. In some cases, if the power is running low during use, a replacement motorized transport unit can be assigned to move into position and replace the motorized transport unit with low power. The transition from one unit to the next can be seamless to the user.
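By way of a simplified sketch (the battery threshold is an assumed value), the self-docking and replacement decision can be expressed as follows: idle units return to a docking station to recharge, while a unit running low during use triggers assignment of a charged replacement.

```python
REPLACE_THRESHOLD = 0.30  # assumed: request a fresh unit below 30% while in use

def power_action(battery_level: float, in_use: bool) -> str:
    """Decide the unit's charging behavior from its state of charge and use."""
    if in_use:
        if battery_level < REPLACE_THRESHOLD:
            return "request_replacement"  # charged unit moves in; handoff is seamless
        return "continue"
    return "self_dock_and_recharge"  # idle units return to a docking station 122

print(power_action(0.25, in_use=True))   # -> request_replacement
print(power_action(0.80, in_use=True))   # -> continue
print(power_action(0.60, in_use=False))  # -> self_dock_and_recharge
```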
Item Container Retrieval: With this feature, the central computer system 106 can cause multiple motorized transport units 102 to retrieve abandoned item containers from exterior areas such as parking lots. For example, multiple motorized transport units are loaded into a movable dispenser, e.g., the motorized transport units are vertically stacked in the dispenser. The dispenser is moved to the exterior area and the transport units are dispensed. Based on video analytics, it is determined which item containers 104 are abandoned and for how long. A transport unit will attach to an abandoned cart and return it to a storage bay.
Motorized Transport Unit Dispenser: This feature provides the movable dispenser that contains and moves a group of motorized transport units to a given area (e.g., an exterior area such as a parking lot) to be dispensed for use. For example, motorized transport units can be moved to the parking lot to retrieve abandoned item containers 104. In some cases, the interior of the dispenser includes helically wound guide rails that mate with the guide member 208 to allow the motorized transport units to be guided to a position to be dispensed.
Specialized Module Retrieval: This feature allows the system 100 to track movement of merchandise flow into and around the sales floor areas and the back-room areas, including specialized modules that may need to be moved to the sales floor. For example, using video analytics, the system can determine whether a modular unit is loaded or empty. Such modular units may house items that are of seasonal or temporary use on the sales floor. For example, when it is raining, it is useful to move a module unit displaying umbrellas from a back-room area (or a lesser-accessed area of the sales floor) to a desired area of the sales floor.
Authentication: This feature uses a voice print with an attention code/word to authenticate a user to a given motorized transport unit. One motorized transport unit can be swapped for another using this authentication. For example, a token is used during the session with the user. The token is a unique identifier for the session which is dropped once the session is ended. A logical token may be a session id established by the application of the user interface unit 114 when the user logs on and decides to use the system 100. In some embodiments, communications throughout the session are encrypted using SSL or other methods at the transport level.
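A minimal sketch of this session-token scheme follows; the voice-print check is stubbed out as an assumption. A unique token identifies the session, survives a swap from one transport unit to another, and is dropped once the session ends.

```python
import secrets

sessions = {}  # session_id -> (user_id, mtu_id)

def start_session(user_id: str, mtu_id: str, voiceprint_ok: bool) -> str | None:
    """Issue a unique session identifier once the voice print is verified."""
    if not voiceprint_ok:  # stand-in for actual voice-print matching
        return None
    session_id = secrets.token_hex(16)  # unique identifier for the session
    sessions[session_id] = (user_id, mtu_id)
    return session_id

def swap_unit(session_id: str, new_mtu_id: str) -> None:
    """Swap one motorized transport unit for another within the same session."""
    user_id, _ = sessions[session_id]
    sessions[session_id] = (user_id, new_mtu_id)

def end_session(session_id: str) -> None:
    """Drop the token once the session is ended."""
    sessions.pop(session_id, None)

sid = start_session("customer-42", "mtu-1", voiceprint_ok=True)
swap_unit(sid, "mtu-2")  # the same session token continues with the new unit
end_session(sid)
```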
Further Details of Some Embodiments
In accordance with some embodiments, further details are now provided for one or more of these and other features. For example, generally speaking, pursuant to various embodiments, systems, apparatuses, processes and methods are provided herein that allow for more accurate location determination, tracking and/or prediction relative to a shopping facility, such as a retail store location, shopping mall, distribution center, shopping campus or the like. The accurate location of motorized transport units, movable item containers, customers, associates and/or other objects allows for more accurate tracking, control and distribution of at least the motorized transport units and movable item containers. In some embodiments, the location detection system, in cooperation with the central computer system 106, allows the central computer system to determine a location of the motorized transport units 102 at the shopping facility. Further, the central computer system may also be configured to determine a location of one or more of the movable item containers, user interface units, and the like.
Further, some embodiments include a lighting system or network 616, which may be part of the location detection system 116 or separate from the location detection system. The lighting system 616 includes one or more light units that emit light with information encoded into the emitted light. The information can include light source identifier information, area identifier or number, location information, and/or other such information or combination of such information. In some implementations, the light sources of the lighting system are configured to further provide lighting to the shopping facility. In some embodiments, however, the lighting system may cause non-visible light to be emitted that can include the relevant information and can be detected. Typically, the motorized transport units include light detectors to detect the light from the lighting system and communicate at least some of the information to a location controller 602. Accordingly, in some embodiments, the motorized transport units are configured to wirelessly communicate with the location controller, or with the central computer system, which can forward relevant information to the location controller. Further, in some embodiments, the location controller 602 is configured to communicate with user interface units 114, such as through one or more wireless communication protocols (e.g., Wi-Fi, Bluetooth, etc.), which can be part of or separate from a distributed communication network (e.g., wireless network 124).
The location controller 602 may also be communicationally coupled with one or more databases 126. The databases 126 can store substantially any relevant information such as but not limited to store mapping information, lighting patterns, light source identifiers, light source mapping, motorized transport unit identifying information, capabilities of the motorized transport units, movable item container identifying information, product information, location information, commands, codes, code location mapping, software, applications, executables, log and/or historic information, customer information (e.g., preferences, log-in information, contact information, etc.), other such relevant information, and typically a combination of two or more of such information. Similarly, some or all of the information stored and/or accessible through the databases may be stored at one or more of the location controller 602, the central computer system 106, the motorized transport units 102, the movable item containers 104, the user interface units, and the like.
As described above, the motorized transport units 102 are self-propelled and configured to move themselves throughout at least some, if not all, of the shopping facility. In some embodiments, the motorized transport units 102 wirelessly receive commands from the location controller 602 (or the control circuit) to direct the motorized transport units to desired locations and/or along desired routes within or outside of the shopping facility. The motorized transport units may additionally or alternatively be configured to operate autonomously and/or at least partially autonomously from the central computer system (CCS). Further, in some embodiments, the motorized transport units 102 are configured to be fixed with or removably cooperated with the movable item containers 104 to move the movable item containers throughout authorized areas of the shopping facility, and in some instances outside of the shopping facility. The movable item containers 104 are configured to be used by customers and/or shopping facility associates or other employees in transporting products through the shopping facility. For example, in some embodiments, the movable item containers can be baskets, bins, wheeled carts, wheeled pallets, advertising systems, and/or other such movable item containers. For simplicity, the embodiments below are described with respect to carts or shopping carts. It will be appreciated by those skilled in the art, however, that the movable item containers are not limited to carts, but can be other objects configured to carry products.
In operation, the motorized transport units 102 and/or the movable item containers provide information to the location controller 602 to allow the location controller to determine, in association with one or more mappings of the shopping facility (and in some instances surrounding areas of the shopping facility), a location of the motorized transport units and/or movable item containers. In some embodiments, the motorized transport units 102 are configured with one or more detection systems that can provide relevant information to the location controller.
The control circuit 702 and the memory 704 may be integrated together, such as in a microcontroller, application-specific integrated circuit, field programmable gate array or other such device, or may be separate devices coupled together. The I/O device 708 allows wired and/or wireless communication coupling of the location controller to external components, such as the databases 126, the motorized transport units 102, the user interface units 114, the movable item containers 104, and other such components, including, when relevant, the video camera 118 or video system, the lighting system 616, and the like. Accordingly, the I/O device 708 may include any known wired and/or wireless interfacing device, circuit and/or connecting device. In some embodiments, a user interface 710 is included in and/or coupled with the location controller 602, which may be used for user input and/or output display. For example, the user interface 710 may include any known input devices, such as one or more buttons, knobs, selectors, switches, keys, touch input surfaces and/or displays, etc. Additionally, the user interface may include one or more output display devices, such as lights, visual indicators, display screens, etc. to convey information to a user, such as status information, location information, mapping information, product location information, product information, video content, operating status information, notifications, errors, conditions and/or other such information.
Generally, the location controller 602 and/or the control circuit 702 can comprise a fixed-purpose hard-wired platform or can comprise a partially or wholly programmable platform. These architectural options are well known and understood in the art and require no further description here. The location controller and/or control circuit can be configured (for example, by using corresponding programming as will be well understood by those skilled in the art) to carry out one or more of the steps, actions, and/or functions described herein.
The location controller 602 receives one or more inputs corresponding to one or more motorized transport units and identifies a relevant location of the one or more motorized transport units. The information may be received from the motorized transport unit, a movable item container, a user interface unit 114, databases 126, video cameras 118, the lighting system 616 and/or other such sources. Utilizing precise knowledge of the shopping facility and its layout, and in some instances product location information, the location controller is configured to determine a location of one or more motorized transport units and/or movable item containers. For example, in some embodiments, one or more motorized transport units and/or movable item containers are configured to detect light from light sources of the lighting system 616 and extract a unique light source identifier or other relevant location information (e.g., area number, mapping or grid coordinate information, zone identifier, etc.) from the light detected from one or more light sources. Similarly, in some embodiments, one or more of the motorized transport units and/or movable item containers are configured to measure distances and provide relative distance information to the location controller 602. Still further, in some implementations, one or more of the motorized transport units and/or movable item containers are configured to detect and/or read one or more location markers and/or codes, and provide that information to the location controller.
The control circuit 406 typically comprises one or more processors and/or microprocessors. Generally, the memory 408 stores the operational code or set of instructions that is executed by the control circuit 406 and/or processor to implement the functionality of the motorized transport unit 802. In some embodiments, the memory 408 may also store some or all of particular data that may be needed to make any of the determinations, measurements and/or communications described herein. Such data may be pre-stored in the memory or be determined, for example, from detected light, measurements, and the like, and/or communicated to the motorized transport unit, such as from the movable item container 104, a user interface unit 114, the location controller 602, another source or a combination of such sources. It is understood that the control circuit 406 and/or processor may be implemented as one or more processor devices as are well known in the art. Similarly, the memory 408 may be implemented as one or more memory devices as are well known in the art, such as one or more processor readable and/or computer readable media and can include volatile and/or nonvolatile media, such as RAM, ROM, EEPROM, flash memory and/or other memory technology. Further, the memory 408 is shown as internal to the motorized transport unit 802; however, the memory 408 can be internal, external or a combination of internal and external memory. Additionally, the motorized transport unit typically includes a power supply (not shown) or it may receive power from an external source.
Generally, the control circuit 406 and/or electronic components of the motorized transport unit 802 can comprise fixed-purpose hard-wired platforms or can comprise a partially or wholly programmable platform. These architectural options are well known and understood in the art and require no further description here. The motorized transport unit and/or control circuit can be configured (for example, by using corresponding programming as will be well understood by those skilled in the art) to carry out one or more of the steps, actions, and/or functions described herein.
The control circuit 406 and the memory 408 may be integrated together, such as in a microcontroller, application specific integrated circuit, field programmable gate array or other such device, or may be separate devices coupled together. The I/O device 424 allows wired and/or wireless communication coupling of the motorized transport unit to external components, such as the location controller 602, the user interface units 114, the movable item containers 104, and other such components. Typically, the I/O device 424 provides at least wireless communication, and in some instances may include any known wired and/or wireless interfacing device, circuit and/or connecting device, such as but not limited to one or more transmitters, receivers, transceivers, etc. In some embodiments, a user interface 426 is included in and/or coupled with the motorized transport unit 802, which may be used for user input and/or output display. For example, the user interface 426 may include any known input devices, such as one or more buttons, knobs, selectors, switches, keys, touch input surfaces and/or displays, etc. Additionally, the user interface may include one or more output display devices, such as lights, visual indicators, display screens, etc. to convey information to a user, such as status information, location information, mapping information, product location information, product information, video content, other communication information (e.g., text messages), operating status information, notifications, errors, conditions, shopping list, advertising, product recommendations, and/or other such information.
As introduced above, in some embodiments, the motorized transport unit includes one or more distance measurement units 808 configured to measure relative distances between the motorized transport unit and one or more external objects. For example, the distance measurement unit can be used to measure relative distances between the motorized transport unit and a shelf or rack within the shopping facility, another motorized transport unit, a wall, a structural support column, movable item containers, the customer associated with the motorized transport unit, other customers not associated with the motorized transport unit and/or substantially any other external object. In some implementations the motorized transport unit includes a laser distance measurement unit that uses a laser to measure distances between the motorized transport unit and an external object. Further, in some embodiments, the motorized transport unit includes multiple distance measurement units positioned to measure distances around the motorized transport unit (e.g., four distance measurement units positioned with a first measuring in a direction of travel, a second measuring in a direction 180 degrees away from the direction of travel, and third and fourth measuring at ninety degrees from the direction of travel). In other implementations, one or more distance measurement units may be capable of measuring distances at multiple different directions or angles. The measured relative distance information can be communicated to the remote location controller 602 allowing the remote location controller to track movement of the motorized transport unit and/or use the distance information to determine a current and/or predicted location of the motorized transport unit.
One or more of the distance measurement unit or units 808 may include, in some embodiments, a light emitter and a light detector that detects light reflected by one or more objects. For example, the distance measurement unit may comprise a laser emitter and detector that allows for accurate measurement of the distance between the distance measurement unit and the external object. The distance measurement unit can be configured to determine the relative distance from the light emitter to the external object. The distance measurement unit and/or the control circuit 406 may further modify distance information based on known dimensions of the motorized transport unit and/or a location of the detector relative to one or more exterior surfaces of the motorized transport unit.
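By way of a non-limiting illustration only, the following minimal sketch shows one way such a time-of-flight distance computation and detector-position adjustment might be carried out; the function name, the units, and the recess correction parameter are illustrative assumptions rather than a prescribed implementation:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_distance_m(round_trip_seconds: float, detector_recess_m: float = 0.0) -> float:
    """Distance from the unit's exterior surface to a reflecting object.

    Light travels out and back, so the one-way distance is half the
    round-trip time multiplied by the speed of light; `detector_recess_m`
    (an assumed parameter) subtracts how far the detector sits behind
    the unit's exterior surface, per the adjustment described above.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0 - detector_recess_m

# A 20 ns round trip corresponds to roughly 3.0 m to the object.
print(tof_distance_m(20e-9))
```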
In some embodiments, the motorized transport unit 802 includes one or more light receiver units and/or light source identifiers configured to detect light from one or more light sources (e.g., from the lighting system 616) and extract and/or determine a unique light source identifier from the detected light. The light is typically received from predefined light sources of the lighting system that encode unique light source identifiers within the emitted light. For example, a plurality of light sources can be overhead lights mounted and distributed across the ceiling of the shopping facility with the emitted light being directed down toward the floor. Further, the light sources can emit visible light providing lighting within the shopping facility. Typically, the encoding is implemented such that it is not perceptible to humans. The light receiver unit 804 detects the light and extracts the unique light source identifier encoded in the emitted light. As a further example, a signal can be encoded in the light output from one or more LED or bulb light sources. The light receiver unit 804, which in some instances can comprise one or more cameras, light sensors, photodiodes, etc., detects and decodes this signal to obtain a light source identifier and/or location information that can be used in determining a position relative to the light source. Similarly, other light receiver units or devices can alternatively or additionally be used, such as a camera on a user interface unit 114 or a light receiver unit on other devices (e.g., movable item container, detectors carried by shopping facility associates, etc.), to detect the light source identifiers and/or signals.
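As a non-limiting sketch of the extraction just described, the following assumes a deliberately simplified on-off-keyed encoding, frame-aligned sample capture, and a fixed identifier width; a practical visible-light-communication scheme would add preambles, synchronization, flicker mitigation, and error checking:

```python
def decode_light_source_id(samples, threshold, samples_per_bit, id_bits=16):
    """Recover a light source identifier from brightness samples.

    Assumes on-off keying at a rate imperceptible to the eye, one
    sample window per bit, and frame-aligned capture; a majority vote
    within each bit window smooths out sensor noise.
    """
    identifier = 0
    for i in range(id_bits):
        window = samples[i * samples_per_bit:(i + 1) * samples_per_bit]
        ones = sum(1 for s in window if s >= threshold)
        bit = 1 if 2 * ones >= len(window) else 0
        identifier = (identifier << 1) | bit
    return identifier

# Demo: an 8-bit identifier 0b10110010 sampled four times per bit.
raw = [level for bit in (1, 0, 1, 1, 0, 0, 1, 0)
       for level in [0.9 if bit else 0.1] * 4]
print(hex(decode_light_source_id(raw, threshold=0.5, samples_per_bit=4, id_bits=8)))
```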
The detected light source identifier can then be communicated to the location controller 602 to potentially be used in determining a location of the motorized transport unit based on a known location of the light source associated with the detected light source identifier. In other implementations, one or more light sources may be positioned closer to the floor and direct light parallel to or at an angle to the floor. Often, the light receiver unit 804 is configured to detect light source identifiers from multiple different light sources whose light, in at least some instances, is simultaneously impinging on and/or simultaneously detectable by the light receiver unit 804. In some implementations, the encoded information may provide location information in addition to or as an alternative to the light source identifier. The location information can include, for example, coordinates, grid information or other such information that typically corresponds with a shopping facility mapping. The location information to be encoded may be programmed into the light sources at installation, or communicated from the location controller 602, the central computer system 106 or other source.
The motorized transport unit 802 further includes the locomotion system 810 that includes and controls one or more motors of the motorized transport unit to at least cause the motorized transport unit to move throughout one or more areas within and/or exterior to the shopping facility. Typically, the locomotion system controls the one or more motors in accordance with one or more commands, position information, mapping coordinates, destination locations and the like. In some embodiments, the location controller 602 and/or central computer system 106 is configured to issue movement commands based on a determined and/or predicted location of the motorized transport unit. The locomotion system 810 can control the one or more motors to implement the one or more movement commands. In some embodiments, the motorized transport unit 802 further includes the movement tracker unit 812 that is configured to track one or more parameters corresponding to the movement and/or orientation of the motorized transport unit. For example, the movement tracker unit may include and/or communicate with one or more accelerometers, gyroscopes, compasses, wheel or tread velocity or rate meters, odometers based on wheel and/or tread movement, global positioning satellite (GPS) receivers, Wi-Fi signal evaluation, and/or other such sources of movement parameters. These parameters can be used in determining, predicting, and/or fine tuning a location of the motorized transport unit.
Substantially any number of light sources can be incorporated into the shopping facility to provide corresponding light or illumination areas 918. Similarly, the light sources are typically positioned such that the illumination areas 918 of two or more light sources overlap, achieving overlapping light areas 920. In some embodiments, the light receiver unit 804 is configured to detect and/or extract multiple different light source identifiers from an overlapping light area 920 corresponding to the multiple light sources emitting the light creating the overlapping light areas, such that the light from multiple light sources is simultaneously impinging on and/or detectable by the light receiver unit.
In some embodiments, the one or more extracted light source identifiers are communicated from the motorized transport unit to the location controller 602. The location controller can be configured with and/or have access to detailed orientation information about the location, relative to at least the floor space and/or a mapping of the shopping facility, of each light source emitting light with an encoded light source identifier, and/or about the area on the floor and/or within the shopping facility mapping covered by the illumination area 918. Utilizing the one or more light source identifiers, the location controller 602 can determine a location of the motorized transport unit within a degree of error, which can vary depending on the size of the illumination areas 918 and the size of overlapping light areas 920. Typically, the location controller is capable of obtaining greater precision with greater numbers of overlapping areas. Additionally or alternatively, in some embodiments, the motorized transport unit maintains some shopping facility mapping information and/or layout information, and can be configured to determine location information of the motorized transport unit.
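One simple, non-limiting way to realize such an overlap-based estimate is sketched below; the lamp table, identifiers, and coordinates are hypothetical, and the error bound is deliberately coarse (the centroid of the reporting lamps, bounded by the smallest illumination radius), tightening as more overlapping lamps are detected:

```python
# Hypothetical lamp table: unique light source identifier mapped to the
# lamp's position (meters, facility mapping) and illumination radius.
LAMPS = {
    0x01A4: ((10.0, 4.0), 2.5),
    0x01A5: ((13.0, 4.0), 2.5),
}

def estimate_from_light_ids(detected_ids):
    """Estimate position from simultaneously detected light source IDs.

    The unit must lie inside every reporting lamp's illumination area,
    so the centroid of those lamps is used as the estimate; the smallest
    illumination radius serves as a coarse error bound.
    """
    known = [LAMPS[i] for i in detected_ids if i in LAMPS]
    if not known:
        return None
    x = sum(center[0] for center, _ in known) / len(known)
    y = sum(center[1] for center, _ in known) / len(known)
    return (x, y), min(radius for _, radius in known)

print(estimate_from_light_ids({0x01A4, 0x01A5}))  # ((11.5, 4.0), 2.5)
```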
As described above, in some embodiments the motorized transport unit includes the machine readable code reader 806 configured to detect and read machine readable codes 1012 positioned at different locations distributed throughout at least a portion of the shopping facility, and to obtain code information from the codes read.
The obtained code information can be communicated from the I/O device 424 of the motorized transport unit (e.g., through a wireless transceiver (e.g., Wi-Fi, Bluetooth, cellular, etc.)) to the location controller 602 and/or the central computer system 106. The location controller 602 again can be configured with and/or have access to detailed location, position and/or orientation information corresponding to each of the plurality of machine readable codes relative to at least the floor space and/or a mapping of the shopping facility. Further, in some embodiments, the location controller may take into consideration the capabilities and/or limitations (e.g., range, angle, distance and/or other such limitations) of the machine readable code reader 806. Utilizing the code information, the location controller 602 can determine a location of the motorized transport unit 802 within a degree of error, which can vary depending on one or more factors, such as but not limited to the limitations of the code reader 806, the number, placement, distribution and/or orientation of the machine readable codes, and other such factors. Further, in some embodiments, the location controller is capable of obtaining greater location precision by combining location information determined based on the machine readable codes with other relevant location information, such as but not limited to location information corresponding to one or more of the light source identifiers, distance measurement information, GPS information, Wi-Fi information or other such information, and often a combination of such location information. Further, the distance measurement unit, the light receiver unit, code reader, movement tracker unit, and the like may not perform evaluations and/or determinations. Instead, these components may simply forward relevant information to the control circuit, where the control circuit can utilize the information to make the relevant determinations and/or identifications (e.g., extract light source IDs, extract bar code information, determine relevant distance, determine movement and/or other such functions). In other implementations, the control circuit communicates the processed or unprocessed information to the location controller for use by the location controller.
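As a non-limiting sketch of combining location information of differing precision, inverse-variance weighting is one standard technique that lets a precise machine-readable-code fix dominate a coarse light-area fix while still using both; the function name and numeric values below are illustrative assumptions:

```python
def fuse_estimates(estimates):
    """Fuse independent position estimates of differing precision.

    `estimates` is a sequence of ((x, y), sigma) pairs, where sigma is
    each source's error in meters.  Weighting each source by 1/sigma**2
    yields a combined estimate with a smaller error than any single one.
    """
    wsum = xsum = ysum = 0.0
    for (x, y), sigma in estimates:
        w = 1.0 / (sigma * sigma)
        wsum += w
        xsum += w * x
        ysum += w * y
    return (xsum / wsum, ysum / wsum), (1.0 / wsum) ** 0.5

# A light-area fix good to ~2.5 m fused with a barcode fix good to ~0.3 m:
print(fuse_estimates([((11.5, 4.0), 2.5), ((11.9, 3.7), 0.3)]))
```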
In some embodiments, the motorized transport unit (and/or the movable item container) is incapable of determining its own location and is dependent on the one or more commands from the location controller. Alternatively, however, some embodiments of the motorized transport unit may optionally include an internal location controller 816 configured to allow the motorized transport unit to determine its own approximate location and/or provide feedback to the location controller 602. The internal location controller 816 may take some or all of the information from one or more of the distance measurement unit 808, the light receiver unit 804, code reader 806, and/or movement tracker unit 812. Utilizing locally stored and/or remotely accessed mapping of the shopping facility, the internal location controller can determine and/or predict a location of the motorized transport unit. In some instances, the locally determined location information can be compared with location information determined through the location controller 602.
Again, the motorized transport unit can be configured to communicate the measured distance information, the unique light source identifier(s), the code information corresponding to one or more machine readable codes, the movement tracking information and/or other such relevant information to the separate and remote location controller 602. The location controller 602 receives communications from the motorized transport unit and extracts relevant location information. Typically, the location controller is in communication with multiple motorized transport units that operate at the shopping facility (e.g., more than four). Further, the location controller may receive additional location information from other sources, such as from one or more movable item containers 104, user interface units 114, video processing unit, shopping facility associate communications, RFID tag readers, sensors, or the like, or combinations of such information. In some instances, the location controller is further configured to determine additional location information, such as through the evaluation of video content, application of one or more movement prediction algorithms and the like.
The location controller obtains and/or extracts relevant location information from the communications from the motorized transport unit and/or from other sources. The location controller can utilize the relevant location information in combination with detailed knowledge of the shopping facility (e.g., layout information, mapping information, coordinates, product placement within the shopping facility, advertisement placement, and/or other such information) in determining a location of a relevant motorized transport unit. The location controller may process some or all of the location information relative to the mapping in determining location information. For example, the location controller can process the one or more unique light source identifiers and identify a location of the corresponding light source and/or a corresponding location of the illumination area 918. Similarly, the location controller can use the distance information to further define and/or obtain a more precise location of the motorized transport unit (e.g., knowing that the motorized transport unit is within an area defined by an illumination area 918, a more precise location can be determined based on distance measurements between the motorized transport unit and each of two shelves on opposing sides of the motorized transport unit). Still further precision may be obtained in the processing of the location information, such as identifying that the motorized transport unit is within an overlapping light area 920 of two or more light sources and/or the detection of code information from one or more machine readable codes, in cooperation with distance measurements. Similarly, the movement tracking information may further be used to obtain specific amounts of movement from one or more previous location determinations. Additionally, some embodiments utilize previous location information in cooperation with newly received location information to adjust and/or clarify a determined location (e.g., knowing from movement tracking information that the motorized transport unit traveled three inches and has now just detected entering an overlapping light area 920, a more precise identification of location may be determined). In some embodiments, the location controller can determine a location of the motorized transport unit to within two feet, and typically within less than half a foot. Further, when utilizing a combination of the location information, the location controller can determine a location of the motorized transport unit to within less than one inch, and in some instances to within less than 1/16 of an inch. Further, the location determination can be an automated process that is continuously performed such that a current location of the motorized transport unit is determined at least once every ten seconds, and typically at least once every second. For example, in some implementations, the location of the motorized transport unit can be continuously determined four to ten times a second.
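The shelf-distance refinement in the example above might, purely as a non-limiting sketch, be computed as follows; the straight-aisle geometry, the coordinate conventions, and the parameter names are assumptions for illustration:

```python
def refine_in_aisle(light_area_center, left_shelf_x, right_shelf_x,
                    dist_left, dist_right, unit_width):
    """Refine a coarse light-area fix using distances to opposing shelves.

    Assumes a straight aisle running along y with shelf faces at known
    x coordinates from the facility mapping (all values in meters).
    The two laser measurements over-determine x, so the two single-shelf
    solutions are averaged; y keeps the light-area estimate until some
    other cue (code information, movement tracking) refines it.
    """
    x_from_left = left_shelf_x + dist_left + unit_width / 2.0
    x_from_right = right_shelf_x - dist_right - unit_width / 2.0
    return (x_from_left + x_from_right) / 2.0, light_area_center[1]

# Shelves at x = 10.0 and x = 12.0, a 0.4 m wide unit measuring 0.55 m
# to the left face and 1.05 m to the right face: x refines to 10.75 m.
print(refine_in_aisle((11.5, 4.0), 10.0, 12.0, 0.55, 1.05, 0.4))
```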
Again, in some embodiments, the location information received at the location controller may include code information (e.g., based on machine readable codes 1012, images, product identification, etc.). Such code information can be obtained from the communications from the motorized transport unit and/or other sources (e.g., user interface unit, movable item container, etc.), and corresponds to one or more specific machine readable codes of a plurality of unique machine readable codes that are positioned at different locations distributed throughout at least a portion of the shopping facility and detected by the motorized transport unit. Based on the code information relative to the mapping, a location of the detected machine readable code within the shopping facility can be identified. The location controller can use this information individually or in combination with other location information in determining, relative to the mapping of the shopping facility, the location of the motorized transport unit within the shopping facility. Again, typically multiple sources of location information are utilized, for example, determining a location as a function of at least one unique light source identifier, relative distance information and an identified location of one or more machine readable codes.
Some embodiments take into consideration communication connections with one or more wireless network access points or antennas. Information regarding which of one or more wireless network access points the motorized transport unit is wirelessly coupled with can be communicated to the location controller. This information can be communicated from the motorized transport unit or by the one or more network access points. The location controller 602 can use the connection information in determining a location of the motorized transport unit relative to the shopping facility mapping. For example, trilateration and/or triangulation can be performed based on the connection information, which can include signal quality information, signal strength information and other such information about the wireless communication provided through the one or more network access points. In some instances, for example, location information can be determined by calculating one or more distances and angles between two or more reference nodes or access points whose positions are known.
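A minimal, non-limiting trilateration sketch consistent with this description follows; converting signal quality or strength into the distance values is out of scope here, so those inputs are simply assumed:

```python
import numpy as np

def trilaterate(anchors, distances):
    """Position from distances to access points at known positions.

    Subtracting the first anchor's circle equation from the others
    linearizes the system, which is then solved by least squares; with
    more than three anchors the extra measurements average out noise.
    """
    (x0, y0), d0 = anchors[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    solution, *_ = np.linalg.lstsq(np.array(A, dtype=float),
                                   np.array(b, dtype=float), rcond=None)
    return tuple(solution)

# Three access points at known map positions and three estimated ranges:
print(trilaterate([(0.0, 0.0), (20.0, 0.0), (0.0, 15.0)], [10.0, 12.2, 9.1]))
```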
Other location identifying information may be detected and/or used to determine location information and/or identify a location. For example, some embodiments utilize Bluetooth Low Energy Beacons (e.g., Bluetooth 4.0), which may provide a low energy mode in which a beacon device and/or access point emits a signal that includes a unique identifier, a major code, a minor code, a signal strength and/or other such information. For example, some embodiments employ iOS, Android or other such beacon receivers, which can be used to determine proximity to the beacon (at an unknown, far, near or immediate distance), or to determine precise location when correlating signals from multiple beacons. Similarly, some embodiments may utilize audible, ultrasonic or other such sound beacons that transmit sound, ultrasound, etc. The sound beacon can be detected by the motorized transport unit 102, a user interface unit 114 or the like through an onboard microphone and audio signal processor. Some embodiments may utilize magnetic resonance. For example, a compass or other magnetically sensitive device or system can be incorporated with the motorized transport unit 102, movable item container 104, and/or user interface unit 114 to detect variations in the magnetic fields (e.g., Earth's magnetic field, generated magnetic fields, etc.). The detected changes can be used to determine location with respect to the shopping facility's structure, the positioning of magnetic field generators, and/or a mapping of magnetic field variations through some or all of the shopping facility, and in some instances surrounding area(s).
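For the beacon-based proximity determination mentioned above, the log-distance path-loss model is one common, non-limiting way to turn received signal strength into a coarse range; the calibration constants and band thresholds below are illustrative assumptions, and only the coarse proximity bands should be read from the result:

```python
def beacon_distance_m(rssi_dbm, tx_power_dbm=-59.0, path_loss_exponent=2.0):
    """Coarse range to a BLE beacon via the log-distance path-loss model.

    `tx_power_dbm` is the calibrated received strength at 1 m that such
    beacons typically advertise; the exponent is about 2.0 in free space
    and higher indoors.  Both constants here are assumptions.
    """
    return 10.0 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))

def proximity_band(rssi_dbm):
    """Map an RSSI reading to the coarse bands discussed above."""
    d = beacon_distance_m(rssi_dbm)
    if d < 0.5:
        return "immediate"
    return "near" if d < 4.0 else "far"

print(proximity_band(-65))  # about 2 m away -> "near"
```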
Further, some embodiments utilize dead reckoning. For example, the motorized transport unit, movable item container, user interface unit, etc. can leverage onboard accelerometers to detect orientation, direction and speed of travel. As described above and further below, some embodiments utilize GPS. The motorized transport unit may include a GPS system or a smart device that enables GPS. Some embodiments may utilize video recognition as at least part of the location information that is used to determine relative location. The shopping facility may be configured with multiple cameras (within and outside the shopping facility) that can be used in part to determine and/or confirm a location of the motorized transport unit, movable item container or other objects. In some instances, for example, the motorized transport unit may include identifying markings (e.g., alphanumeric characters, code, coloring, etc.) that can be recognized when captured by a video camera and correlated to a location, and/or correlated with surrounding markings, products, structures and the like. In some embodiments, visual recognition can be utilized. For example, the motorized transport unit can be configured to visually recognize its location by comparing sensor input to known image information. Typically, one or more mappings (e.g., two-dimensional, three-dimensional, grid, etc.) are maintained. The two-dimensional mapping can be used to identify and/or show horizontal location, while the three-dimensional mapping can be used to identify and/or show vertical location, as some shopping facilities have multiple levels. As also introduced above, some embodiments may utilize sign posts, location tags and the like. These location sign posts, location tags, etc. (active RFID, NFC, UWB, Bluetooth, two-dimensional barcodes, images, etc.) can be placed throughout and around the shopping facility and detected by the motorized transport unit, movable item container, user interface unit and/or other devices. Again, some embodiments utilize location information from a combination of two or more of these sources, systems and/or techniques.
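A minimal dead-reckoning update consistent with the description above might look like the following sketch; the state representation and parameter names are illustrative assumptions, and the accumulated drift would be corrected whenever an absolute fix (light source identifier, machine readable code, GPS, etc.) arrives:

```python
import math

def dead_reckon(x, y, heading_rad, speed_m_s, dt_s):
    """Advance a position estimate one time step from heading and speed.

    Heading might come from a compass or gyroscope and speed from wheel
    odometry, per the sources listed above.  Drift accumulates, so the
    estimate should be re-anchored whenever an absolute fix arrives.
    """
    return (x + speed_m_s * dt_s * math.cos(heading_rad),
            y + speed_m_s * dt_s * math.sin(heading_rad))

# Half a second at 1.2 m/s heading along the map's +x axis:
print(dead_reckon(10.0, 4.0, 0.0, 1.2, 0.5))  # -> (10.6, 4.0)
```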
In some embodiments, the location controller 602 may additionally or alternatively receive information that can be used to determine location from the movable item container 104 (e.g., an interface unit or other smart unit cooperated with the movable item container). Further, in some instances, the movable item container 104 communicates with the motorized transport unit and one or both can relay information to the location controller.
The control circuit 1102 of the movable item container typically comprises one or more processors and/or microprocessors. Generally, the memory 1110 stores the operational code or set of instructions that is executed by the control circuit 1102 and/or processor to implement the functionality of the movable item container 1104. In some embodiments, the memory 1110 may also store some or all of particular data that may be needed to make any of the determinations, measurements and/or communications described herein. Such data may be pre-stored in the memory or be determined, for example, from detected light, measurements, and the like, and/or communicated to the movable item container 1104, such as from the motorized transport unit, a user interface unit 114, the location controller 602, another source or a combination of such sources. It is understood that the control circuit 1102 and/or processor may be implemented as one or more processor devices as are well known in the art. Similarly, the memory 1110 may be implemented as one or more memory devices as are well known in the art, such as one or more processor readable and/or computer readable media and can include volatile and/or nonvolatile media, such as RAM, ROM, EEPROM, flash memory and/or other memory technology. Further, the memory 1110 is shown as internal to the movable item container 1104 and/or an interface unit cooperated with the movable item container; however, the memory 1110 can be internal, external or a combination of internal and external memory. Additionally, a power supply (not shown) is typically included to power one or more of the components, or power may be received from an external source.
Generally, the control circuit 1102 and/or electronic components of the movable item container 1104 can comprise fixed-purpose hard-wired platforms or can comprise a partially or wholly programmable platform. These architectural options are well known and understood in the art and require no further description here. The movable item container and/or control circuit can be configured (for example, by using corresponding programming as will be well understood by those skilled in the art) to carry out one or more of the steps, actions, and/or functions described herein.
The control circuit can be configured, in part, to provide overall control and/or coordinate operation of the components of the movable item container. For example, the control circuit can instruct and/or activate one or more transmitters, receivers, transceivers of the I/O device 1124 to communicate with the location controller 602, the central computer system 106 of the shopping facility assistance system 100, one or more of the motorized transport units, user interface units 114, and the like. As another example, the control circuit may activate the light receiver unit 928, the code reader 930 or other components, which may be initiated in response to instructions from the location controller 602, a motorized transport unit, the user interface unit 114 or other device.
The distance measurement unit 1108 is configured to measure relative distances between the distance measurement unit and/or the movable item container 1104 and external objects. This distance information can be provided to the location controller and used in determining a location of the movable item container and/or a motorized transport unit cooperated with the movable item container. In some embodiments, the distance measurement unit 1108 is similar to the distance measurement unit 808 of the motorized transport unit.
In some embodiments, the distance measurement unit 1108, the light receiver unit 928, code reader 930 and movement tracker unit 1118 can be configured to operate similar to the distance measurement unit 808, the light receiver unit 804, code reader 806 and movement tracker unit 812 of the motorized transport unit 802, and provide similar distance, light source identifier, code and/or movement information.
Further, one or more of the distance measurement unit 1108, the light receiver unit 928, code reader 930 and movement tracker unit 1118 of the movable item container 1104 can be configured to operate independently of or in cooperation with one or more of the distance measurement unit 808, the light receiver unit 804, code reader 806 and movement tracker unit 812 of a motorized transport unit. For example, the light receiver unit 928 may be configured to operate in place of the light receiver unit 804 of a motorized transport unit when the motorized transport unit is cooperated with the movable item container because the sensors of the light receiver unit 804 of the motorized transport unit may be obstructed by the movable item container 1104 (e.g., the motorized transport unit may be configured to partially or fully position itself under the movable item container). Similarly, the distance measurement unit 1108 of the movable item container 1104 may be activated in response to an obstruction being detected by the motorized transport unit that is interfering with measurements by the distance measurement unit 808 of the motorized transport unit.
In some embodiments, the movable item container may optionally include an internal location controller 1116. Similar to the internal location controller 816 of a motorized transport unit, the internal location controller 1116 can be configured to determine an approximate location of the movable item container 1104 and/or provide feedback to the location controller 602. The internal location controller 1116 may take some or all of the information from one or more of the distance measurement unit 1108, the light receiver unit 928, code reader 930, and/or movement tracker unit 1118. Utilizing locally stored and/or remotely accessed mapping of the shopping facility, the internal location controller can determine and/or predict a location of the movable item container. In some instances, the locally determined location information can be compared with location information determined through the location controller 602.
In some embodiments, the user interface 932 is included in and/or coupled with the movable item container, which may be used for user input, audio output and/or output display. For example, the user interface 932 may include any known input devices, such as one or more buttons, knobs, selectors, switches, keys, touch input surfaces, audio input and/or displays, etc. Additionally, the user interface may include one or more output display devices, such as lights, visual indicators, display screens, etc. to convey information to a user, such as status information, location information, mapping information, product location information, product information, directions (e.g., to a product, to a parking space where the customer parked, etc.), video content, other communication information (e.g., text messages), operating status information, notifications, errors, conditions, shopping list, advertising, product recommendations, communications with other individuals (e.g., a video call or other visual view of one or more remote individuals, shopping facility associates, etc.), and/or other such information. In some implementations, at least a portion of the user interface 932 is positioned on the movable item container to be readily viewed and accessed by the customer. For example, at least a portion of the user interface 932 may be mounted on a handle bar of the movable item container that is used by the customer to push the movable item container through the shopping facility. In many instances, this makes the user interface 932 of the movable item container more visible and accessible to the customer than a user interface on the motorized transport unit. Accordingly, the user interface 932 cooperated with the movable item container may be used instead of the user interface of the motorized transport unit when the motorized transport unit is cooperated with a movable item container.
Location parameters and/or information that can be utilized in determining a location of the motorized transport unit and/or movable item container may also be obtained and/or received from other sources. For example, in some embodiments, information may be provided by the portable user interface unit 114, from cameras and/or other sensors distributed through the shopping facility, and other such sources. As another example, one or more RFID tags may be associated with each motorized transport unit 102 and/or the movable item containers. RFID tag sensors or readers can be positioned at one or more locations within and/or exterior to the shopping facility. With knowledge of a range and/or using signal quality and strength, a general location (e.g., within a six foot radius) of the motorized transport unit or movable item container may be determined and/or used in cooperation with other information in determining a more precise location. In some instances, the user interface unit 114 may be activated by a user, for example, through the activation of application software and/or a program (APP) that causes one or more cameras on the user interface unit to operate to capture images and/or video in attempts to allow the user interface unit to operate similar to or the same as the light receiver unit 804 of the motorized transport unit in detecting unique light source identifiers. Similarly, the user interface unit may utilize the camera to capture images of machine readable codes that can be forwarded to the location controller and/or to operate similar to the code reader 806. Still further, the user interface unit may track a current location through GPS, triangulation or the like based on signals from and/or to one or more wireless network access points, or other such information or combinations of such information. In some instances, the user interface unit may be configured to take distance measurements that can be communicated to the location controller.
This additional information may be utilized by the location controller 602 in determining a location of the motorized transport unit and/or movable item container. Further, as described above, sensors, cameras and the like may be distributed through the shopping facility and exterior to the shopping facility. Information from the sensors and cameras can be used in determining a location of the motorized transport unit. In some instances, the motorized transport unit and/or the movable item container may include unique markings (e.g., unique numbering, color combinations, and/or other such markings). Video content can be evaluated by a video processor of the location controller 602, or external to the location controller, to identify the motorized transport units and/or movable item containers within the video content. Based on the camera capturing the image and other information that may be extracted from the image (e.g., recognition of products on a shelf, other shelf identifiers (e.g., numbers or the like labeled on a top of the shelves), and other such identifiers), a location of the motorized transport unit may be determined within a margin of error. Again, the location information obtained from the video information may be used in combination with one or more other types of location information to obtain a more precise location of the motorized transport unit.
In step 1214, one or more unique light source identifiers and/or other location information (e.g., area identifier, grid identifier, map coordinate information, grid coordinate information, etc.) are obtained from the communications. Typically, each light source identifier exclusively corresponds to one light source within the shopping facility, is encoded in the light emitted by that light source, and is obtained from the light detected by the motorized transport unit, movable item container, user interface unit, etc. As described above, in some embodiments the light from the one or more light sources is detected through a light receiver unit of the motorized transport unit, and the unique light source identifier from the detected light of each light source is extracted at the motorized transport unit. In some embodiments, relative distance information is additionally or alternatively obtained from the communications. The distance information can include one or more distance measurements determined by the motorized transport unit through an optical measurement corresponding to a distance between the motorized transport unit and an external object. For example, the relative distance between the motorized transport unit and the external object can be determined through a distance measurement unit of the motorized transport unit. The external object can be substantially any object, such as a shelf, rack, customer, movable item container, wall, and the like. Additionally or alternatively, in some embodiments code information is obtained from the communications or separate communications. The code information corresponds to one or more machine readable codes of a plurality of unique machine readable codes positioned at different locations distributed throughout at least a portion of the shopping facility and detected by the motorized transport unit, the movable item container 104, the user interface unit 114 or the like. For example, the machine readable code can be optically read through a machine readable code reader of the motorized transport unit, and the code information can be identified and/or extracted at the motorized transport unit. The motorized transport unit can transmit the communications comprising the light source identifier, the relative distance, the code information, and/or other information, and typically a combination of two or more of such information.
In step 1216, the at least one unique light source identifier and/or the relative distance information are processed relative to a mapping of the shopping facility. In some embodiments, a location of the one or more machine readable codes is identified based on the identified code information relative to the mapping. The location of the one or more machine readable codes can be interior to the shopping facility, but in some instances may be exterior to the shopping facility. In step 1218, a location of the motorized transport unit within the shopping facility is determined, in response to the processing, as a function of the at least one unique light source identifier and the relative distance information. Some embodiments, in determining the location of the motorized transport unit, further determine, relative to the mapping of the shopping facility, the location of the motorized transport unit within the shopping facility as a function of a combination of at least two of: the one or more unique light source identifiers, the relative distance information and the identified location of the one or more machine readable codes.
In some embodiments, the location controller 602 and/or the central computer system 106 determine, relative to the mapping and the determined location of the motorized transport unit, one or more movement commands to control movement of the motorized transport unit to cause the motorized transport unit to move in a desired direction. The one or more movement commands can be communicated (e.g., wirelessly transmitted) to the motorized transport unit to cause the motorized transport unit to control its movements in accordance with the movement commands. Further, some embodiments determine a destination location within the shopping facility. A route or path can be identified, relative to the mapping, between the determined location of the motorized transport unit and the destination location. One or more movement commands can be determined to control movement of the motorized transport unit to cause the motorized transport unit to move to the destination location, which can be in accordance with the determined route. The one or more movement commands can be transmitted to the motorized transport unit to cause the motorized transport unit to control its movements in accordance with the movement commands. For example, a communications transceiver can be activated to transmit one or more movement commands (e.g., the control circuit 702 of the location controller and/or the central computer system 106 of the shopping facility assistance system 100 can cause a transceiver to communicate one or more commands).
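Purely as a non-limiting sketch, route identification over a grid-style facility mapping and its translation into movement commands might be realized as follows; the grid representation, the command vocabulary, and the breadth-first planner are illustrative assumptions rather than a prescribed design:

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first route over a grid mapping (True = traversable cell).

    Returns a list of (row, col) cells from start to goal, or None if
    the destination is unreachable.
    """
    parents = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for step in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= step[0] < len(grid) and 0 <= step[1] < len(grid[0])
                    and grid[step[0]][step[1]] and step not in parents):
                parents[step] = cell
                queue.append(step)
    return None

def to_movement_commands(path):
    """Translate consecutive cells into step commands for transmission."""
    names = {(1, 0): "MOVE_SOUTH", (-1, 0): "MOVE_NORTH",
             (0, 1): "MOVE_EAST", (0, -1): "MOVE_WEST"}
    return [names[(b[0] - a[0], b[1] - a[1])] for a, b in zip(path, path[1:])]

grid = [[True, True, True], [False, False, True], [True, True, True]]
route = plan_route(grid, (0, 0), (2, 0))
print(to_movement_commands(route))
```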
In some embodiments, the motorized transport unit receives light source identifiers and/or location transmissions from one or more light sources (e.g., LED lights) through the light receiver unit 804 (e.g., a photo eye, one or more photodiodes, etc.). The light source identifiers and/or location information can be sent to the location controller 602. Further, in some embodiments, the customer's mobile user interface unit 114 activates a camera, photo eye, photodiode array, etc. to detect and/or receive light source identifiers and/or location transmissions from the light sources, which can also be communicated (e.g., Wi-Fi, Bluetooth, cellular) to the location controller 602. The motorized transport unit, the movable item container 104, user interface unit 114 or the like can read one or more machine readable codes 1012, signposts, location tags or the like to obtain code information and/or receive relevant location information or data.
In some implementations, the location controller 602 links a customer to a motorized transport unit and/or movable item container. A map can be displayed on the motorized transport unit, movable item container and/or the customer's mobile user interface unit 114 (e.g., through an APP) indicating where the customer is currently located and where the motorized transport unit and/or movable item container is currently located. As described above, orientation information can be determined by the motorized transport unit through the use of GPS, compass, gyroscope, motion tracking and the like. The location controller can determine a route between the customer and the motorized transport unit, a desired destination, a desired product and the like, and using the shopping facility mapping can issue commands to navigate one or both of the customer and the motorized transport unit towards each other, which may be based upon a determined most efficient pathway, upon directing a customer through relevant portions of the shopping facility (e.g., based on products), or the like. In some embodiments, the motorized transport unit incorporates semi-autonomous location services based on, for example, Simultaneous Localization and Mapping (SLAM) algorithms or similar services. Location data is typically continuously updated by analyzing sensory input and comparing to available map data. The resulting self-maintained location data can improve accuracy and provide a level of fault tolerance.
In some embodiments, the shopping facility and/or the location controller have certain authorized areas designated for the motorized transport unit. Accordingly, when a customer enters an area that the motorized transport unit is prohibited from entering, the motorized transport unit will, in some instances, wait for the customer to leave the prohibited area, and may notify the customer, as the customer enters such an area, that it is prohibited from entering that area.
Some embodiments establish geographic location capabilities for the motorized transport unit, the movable item container, the customer, user interface unit 114, or other devices or objects at the shopping facility. Further, some embodiments utilize and/or establish a mapping of light sources (e.g., LED light transmissions) to a two-dimensional map of the shopping facility, and potentially surrounding areas (e.g., parking lot, loading dock, delivery bay, and the like). Further, objects within an area may also be mapped and associated with relevant and/or proximate light sources or illumination areas 918. Devices that receive light source transmissions (e.g., motorized transport unit, movable item container, user interface units 114, etc.) are able to transmit back the light source identifier or location number to the location controller 602. Using this light source identifier information, the location controller can identify a relevant location on a two-dimensional map and/or grid map.
The determined location of one or more motorized transport units, movable item containers, customers, shopping facility colleagues, associates and other objects allows the location controller to know where they are located, and/or allows motorized transport units, movable item containers, customers, shopping facility colleagues and associates to find each other. In some instances, multiple motorized transport units, movable item containers, customers, shopping facility colleagues and/or associates can be configured and instructed to cooperatively work together (e.g., as a team) to accomplish tasks that are assigned. In some embodiments, it is beneficial when the motorized transport units know where a customer associated with that motorized transport unit is located relative to a location of the motorized transport unit, and/or where one or more other motorized transport units, movable item containers, associates, etc. are located. For example, an instruction may be broadcasted to multiple motorized transport units, and the motorized transport unit or units nearest the project to be performed can accept the task or tasks for which it is best suited, based on its proximity or capability for the work. In other embodiments, the location controller 602 and/or the central computer system 106 of the shopping facility assistance system 100 can identify a task to be performed, and with knowledge of current locations of motorized transport units and their relevant capabilities can select and assign one or more specific motorized transport units to perform the desired task or tasks. In some implementations, multiple motorized transport units will be stationed throughout the shopping facility (inside, in the parking lot, in a storage area, at a loading dock, etc.), waiting for commands or instructions, or roaming assigned areas attempting to locate tasks to be performed. Accordingly, in some embodiments, the motorized transport units comprise high functionality and/or progressively intelligent systems with capabilities integrating "smart devices", Internet, cellular communication services, indoor and outdoor location determination technology, and many other features providing safe and fun customer, member, colleague, associate interaction. Further, in some embodiments, the location controller 602, the motorized transport unit and/or the movable item container are capable of determining a location through geo-location technology, a central computer system, video analytics, three-dimensional mapping, two-dimensional shopping facility mapping and/or other such information.
Further, the motorized transport unit and the movable item containers may be configured for use both inside and outside of the shopping facility. Accordingly, the location controller utilizes relevant location information from one or more of the motorized transport units, movable item containers, user interface units 114, sensors, video data, or other such information, and typically a combination of such information, to support precise indoor and outdoor positioning. Outdoor positioning may take advantage of GPS, Standard Positioning Service (SPS) as well as other location information. The nominal accuracy of GPS and/or SPS receivers, however, is often within a margin of error of +/−3 meters. As such, some embodiments utilize additional location information acquired through one or more cost-effective position augmentation systems. As described above, one or more of these cost-effective position augmentation systems can be incorporated into the motorized transport unit, movable item container, user interface unit 114 and other such devices. The location determination in part allows for precision control of the motorized transport units, and can further enhance guided tracking, movement commands, obstacle avoidance (e.g., people, vehicles, other carts, trash cans, curbs, etc.), and other such control.
The location determination, in some embodiments, further determines a current environment in which the motorized transport unit is positioned (e.g., based on a map). Further, a position within the environment (e.g., indoor, outdoor, gardening section, etc.) can be determined. In some embodiments, an orientation of the motorized transport unit and/or the customer is considered (e.g., north, south, east, west, etc.). Similarly, a direction of travel and/or speed may be considered. Some embodiments additionally consider how far away the motorized transport unit is from an assigned or associated customer, colleague, associate, etc. (e.g., the customer, colleague, or associate may be moving about the shopping facility). In some embodiments, a destination may further be taken into consideration in location determination, prediction and/or routing. Some embodiments take into consideration potential errors and/or error conditions (e.g., positioning errors, mapping is inaccessible or incorrect, communication errors, etc.). The location determination typically takes into consideration two or more types of location information. For example, the location controller 602 can consider location information based on visible light communication, Bluetooth low energy beacons, audible beacons, ultrasonic beacons, magnetic resonance, dead reckoning, GPS, Nationwide Differential GPS (NDGPS), Wide Area Augmentation System (WAAS), video recognition, two-dimensional and/or three-dimensional maps, Near Field Communications (NFC), Ultrawide Band (UWB), Radio Frequency Identification (RFID), trilateration and/or triangulation, Received Angle of Arrival (RAOA), sign posts and/or location tags (e.g., active RFID, NFC, UWB, Bluetooth, two-dimensional barcodes, images, etc.), and other such location information.
In operation, the motorized transport unit obtains location information, such as detecting light source identifiers, communications from light sources, distance measurement information, code and/or image recognition information, or other such information, and typically a combination of such information. This location information is communicated to the location controller 602 and utilized to determine a location of the motorized transport unit, at least with respect to the shopping facility mapping information.
The central computer system 106 further receives requests from customers to sign up to use the motorized transport unit and/or requests to be associated with a motorized transport unit. In some implementations, a user submits the registration and/or request through an APP on the customer's mobile user interface unit 114. Additionally or alternatively, a kiosk or other such system may be provided at the shopping facility to allow the customer to register. Similarly, the customer may register through a remote user interface unit, such as a home computer, and may in some implementations reserve a motorized transport unit.
As described above, in some embodiments the customer's user interface unit 114 (and/or other devices such as the movable item container 104) may detect and/or acquire location information. For example, the one or more cameras of a user interface unit may be activated through an APP to detect light source identifiers and/or communications from light sources. Similarly, the user interface unit may capture video that may be forwarded to the motorized transport unit and/or location controller, which may be used in determining a location of the user interface unit and/or the motorized transport unit. Further, some embodiments determine a location of the customer upon receiving a request to be associated with and use a motorized transport unit. As such, the customer's user interface unit can provide relevant location information (e.g., GPS, light source identifiers, triangulation, other such information, or combinations of such information). The location controller can utilize this information to determine a location of the motorized transport device and/or the customer. In some instances, the location controller determines a route or path between the motorized transport unit and the customer. The route may be configured to direct the customer through areas of the shopping facility that may be of more interest to the customer and/or where products are placed that the customer is more likely to purchase. The location controller can provide relevant information, such as directions, movement commands, mapping coordinate information, animation information to be displayed and/or other such information to the customer's user interface unit and/or the motorized transport unit in an effort to bring the customer and the associated motorized transport unit together. As the customer and/or the motorized transport unit move through the shopping facility, the location information is automatically and typically continuously updated. Further, the location controller can update a determined location of the motorized transport unit and/or the customer, update routes, redirect one or both of the customer and the motorized transport unit, and take other actions.
Further, in some instances, the location information may be updated over time. For example, the mapping can be updated based on repeated information and/or the determination of objects within the shopping facility. Errors may be identified relative to location information, and modifications may be made to correct or compensate for such errors. Other modifications can be made over time in an attempt to enhance the location determination. In some instances, for example, the location controller can adjust a mapping based on repeated location information obtained from one or more motorized transport units.
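As a simple illustration of such an adjustment, the sketch below nudges a mapped landmark position toward repeatedly observed positions, smoothing out positioning errors over time. The exponential moving average and the 0.1 learning rate are assumptions chosen for illustration, not a description of the actual correction logic.

```python
# Sketch of incrementally correcting a mapped landmark position from
# repeated location reports; the smoothing rule is an assumption.
def adjust_landmark(mapped: tuple, observed: tuple, alpha: float = 0.1) -> tuple:
    """Nudge the mapped (x, y) of a landmark toward a newly observed
    position; repeated consistent observations gradually correct the map."""
    mx, my = mapped
    ox, oy = observed
    return (mx + alpha * (ox - mx), my + alpha * (oy - my))

pos = (10.0, 5.0)
for obs in [(10.4, 5.1), (10.5, 5.2), (10.4, 5.1)]:
    pos = adjust_landmark(pos, obs)
print(pos)  # drifts toward roughly (10.4, 5.1) as observations repeat
```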
The determined location supports many situations, including shopping, stocking, delivering, searching, alerting, and communicating. It has been recognized that, in at least some instances, it can be beneficial to provide mobility support for customers, shopping facility associates and the like. Utilizing the determined location information, more precise control can be provided to at least the motorized transport unit, which can take the shopping experience and/or self-service shopping to new levels of convenience. It can further increase productivity in a shopping facility, distribution center, and/or the like, enabling teams to carry out more value-added activities.
It is noted that the above description generally refers to shopping facilities; however, it will be appreciated by those skilled in the art that the location determination and/or control is not limited to shopping facilities, but can be extended to other facilities, such as distribution centers, home office campuses, or the like.
In some embodiments, apparatuses and methods are provided for controlling movement of a motorized transport unit within a shopping facility. In some embodiments, an apparatus providing control over movement of a motorized transport unit within a shopping facility comprises: a location controller separate and distinct from a self-propelled motorized transport unit, wherein the location controller comprises: a transceiver configured to receive communications from the motorized transport unit located within a shopping facility; a control circuit coupled with the transceiver; and a memory coupled to the control circuit and storing computer instructions that when executed by the control circuit cause the control circuit to perform the steps of: obtaining, from the communications received from the motorized transport unit, at least one unique light source identifier of a light source within the shopping facility detected by the motorized transport unit from light emitted by the light source, and relative distance information determined by the motorized transport unit through an optical measurement corresponding to a distance between the motorized transport unit and an external object; processing the at least one unique light source identifier and the relative distance information relative to a mapping of the shopping facility; and determining, in response to the processing, a location of the motorized transport unit within the shopping facility as a function of the at least one unique light source identifier and the relative distance information.
In some embodiments, an apparatus providing control over movement of a motorized transport unit within a shopping facility comprises: a self-propelled motorized transport unit within a shopping facility, wherein the motorized transport unit comprises: a light receiver unit configured to detect light from one or more external light sources within the shopping facility and extract at least one unique light source identifier from the detected light; a distance measurement unit comprising a light emitter and a light detector, wherein the distance measurement unit is configured to determine, as a function of light detected from the light emitter, a relative distance from the light emitter to one or more remote objects; a control circuit; and a memory coupled to the control circuit and storing computer instructions that when executed by the control circuit cause the control circuit to control movement of the motorized transport unit as a function of a location of the motorized transport unit determined relative to a mapping of the shopping facility and based on the at least one unique light source identifier and the relative distance.
In some embodiments, a method of controlling movement of a motorized transport unit within a shopping facility comprises: receiving, at a location controller separate and distinct from a self-propelled motorized transport unit located within a shopping facility, communications from the motorized transport unit; obtaining from the communications at least one unique light source identifier of a light source within the shopping facility detected by the motorized transport unit from light emitted by the light source, and relative distance information determined by the motorized transport unit through an optical measurement corresponding to a distance between the motorized transport unit and an external object; processing the at least one unique light source identifier and the relative distance information relative to a mapping of the shopping facility; and determining, in response to the processing, a location of the motorized transport unit within the shopping facility as a function of the at least one unique light source identifier and the relative distance information.
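To make the claimed processing steps concrete, the following simplified sketch resolves a detected unique light source identifier against a facility mapping for a coarse position, then applies the optically measured relative distance to a mapped external object (here, a shelf face) to refine one coordinate. The mapping structures, coordinates, and refinement rule are illustrative assumptions only, not the actual processing of any embodiment.

```python
# Simplified sketch of determining a location as a function of a unique
# light source identifier and relative distance information; all data
# below are illustrative assumptions.
FIXTURE_MAP = {"fixture-1138": (12.0, 3.0)}  # light source id -> fixture (x, y)
SHELF_FACE_Y = 0.5                           # y-coordinate of a mapped shelf face

def locate(light_id: str, distance_to_shelf_m: float) -> tuple:
    """Coarse fix from the identified overhead fixture (aisle x), refined
    by the measured optical distance from the mapped shelf face (y)."""
    fx, _ = FIXTURE_MAP[light_id]
    y = SHELF_FACE_Y + distance_to_shelf_m
    return (fx, y)

print(locate("fixture-1138", 2.4))  # -> (12.0, 2.9)
```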
Those skilled in the art will recognize that a wide variety of modifications, alterations, and combinations can be made with respect to the above described embodiments without departing from the scope of the invention, and that such modifications, alterations, and combinations are to be viewed as being within the ambit of the inventive concept.
This application is a continuation of U.S. application Ser. No. 15/061,406, filed Mar. 4, 2016, issued as U.S. Pat. No. 10,071,892, which is incorporated herein by reference in its entirety, and which claims the benefit of each of the following U.S. Provisional applications, each of which is incorporated herein by reference in its entirety: U.S. Provisional Application No. 62/129,726, filed Mar. 6, 2015; U.S. Provisional Application No. 62/129,727, filed Mar. 6, 2015; U.S. Provisional Application No. 62/138,877, filed Mar. 26, 2015; U.S. Provisional Application No. 62/138,885, filed Mar. 26, 2015; U.S. Provisional Application No. 62/152,421, filed Apr. 24, 2015; U.S. Provisional Application No. 62/152,465, filed Apr. 24, 2015; U.S. Provisional Application No. 62/152,440, filed Apr. 24, 2015; U.S. Provisional Application No. 62/152,630, filed Apr. 24, 2015; U.S. Provisional Application No. 62/152,711, filed Apr. 24, 2015; U.S. Provisional Application No. 62/152,610, filed Apr. 24, 2015; U.S. Provisional Application No. 62/152,667, filed Apr. 24, 2015; U.S. Provisional Application No. 62/157,388, filed May 5, 2015; U.S. Provisional Application No. 62/165,579, filed May 22, 2015; U.S. Provisional Application No. 62/165,416, filed May 22, 2015; U.S. Provisional Application No. 62/165,586, filed May 22, 2015; U.S. Provisional Application No. 62/171,822, filed Jun. 5, 2015; U.S. Provisional Application No. 62/175,182, filed Jun. 12, 2015; U.S. Provisional Application No. 62/182,339, filed Jun. 19, 2015; U.S. Provisional Application No. 62/185,478, filed Jun. 26, 2015; U.S. Provisional Application No. 62/194,131, filed Jul. 17, 2015; U.S. Provisional Application No. 62/194,119, filed Jul. 17, 2015; U.S. Provisional Application No. 62/194,121, filed Jul. 17, 2015; U.S. Provisional Application No. 62/194,127, filed Jul. 17, 2015; U.S. Provisional Application No. 62/202,744, filed Aug. 7, 2015; U.S. Provisional Application No. 62/202,747, filed Aug. 7, 2015; U.S. Provisional Application No. 62/205,548, filed Aug. 14, 2015; U.S. Provisional Application No. 62/205,569, filed Aug. 14, 2015; U.S. Provisional Application No. 62/205,555, filed Aug. 14, 2015; U.S. Provisional Application No. 62/205,539, filed Aug. 14, 2015; U.S. Provisional Application No. 62/207,858, filed Aug. 20, 2015; U.S. Provisional Application No. 62/214,826, filed Sep. 4, 2015; U.S. Provisional Application No. 62/214,824, filed Sep. 4, 2015; U.S. Provisional Application No. 62/292,084, filed Feb. 5, 2016; U.S. Provisional Application No. 62/302,547, filed Mar. 2, 2016; U.S. Provisional Application No. 62/302,567, filed Mar. 2, 2016; U.S. Provisional Application No. 62/302,713, filed Mar. 2, 2016; and U.S. Provisional Application No. 62/303,021, filed Mar. 3, 2016.
Number | Name | Date | Kind |
---|---|---|---|
1506095 | Stevenson | Aug 1924 | A |
1506102 | Wise | Aug 1924 | A |
1506105 | Zerbe | Aug 1924 | A |
1506120 | Hardinge | Aug 1924 | A |
1506126 | Kuenz | Aug 1924 | A |
1506128 | Lauterbur | Aug 1924 | A |
1506132 | Oishei | Aug 1924 | A |
1506135 | Raschick | Aug 1924 | A |
1506140 | Smith | Aug 1924 | A |
1506144 | Weeks | Aug 1924 | A |
1506147 | Abbott | Aug 1924 | A |
1506150 | Beaty | Aug 1924 | A |
1506167 | Ellwood | Aug 1924 | A |
1506168 | Erikstrup | Aug 1924 | A |
1506172 | Fredette | Aug 1924 | A |
1506177 | Heintz | Aug 1924 | A |
1506179 | Howe | Aug 1924 | A |
1506180 | Humphreys | Aug 1924 | A |
1506184 | Kellner | Aug 1924 | A |
1506190 | Marcuse | Aug 1924 | A |
1506198 | Nordell | Aug 1924 | A |
1527499 | Woods | Feb 1925 | A |
1527500 | Woods | Feb 1925 | A |
1527501 | Zeh | Feb 1925 | A |
1527504 | Backhaus | Feb 1925 | A |
1528295 | Greenwood | Mar 1925 | A |
1528892 | Pigott | Mar 1925 | A |
1542381 | Gabriel | Jun 1925 | A |
1544691 | Smith | Jul 1925 | A |
1544717 | Behrman | Jul 1925 | A |
1544720 | Brandt | Jul 1925 | A |
1547127 | Metzger | Jul 1925 | A |
1569222 | Dent | Jan 1926 | A |
1583670 | Davol | May 1926 | A |
1774653 | Marriott | Sep 1930 | A |
2669345 | Brown | Feb 1954 | A |
3765546 | Westerling | Oct 1973 | A |
4071740 | Gogulski | Jan 1978 | A |
4158416 | Podesta | Jun 1979 | A |
4588349 | Reuter | May 1986 | A |
4672280 | Honjo | Jun 1987 | A |
4777416 | George, II | Oct 1988 | A |
4791482 | Barry | Dec 1988 | A |
4868544 | Havens | Sep 1989 | A |
4911608 | Krappitz | Mar 1990 | A |
5119087 | Lucas | Jun 1992 | A |
5279672 | Betker | Jan 1994 | A |
5287266 | Malec | Feb 1994 | A |
5295551 | Sukonick | Mar 1994 | A |
5363305 | Cox | Nov 1994 | A |
5380138 | Kasai | Jan 1995 | A |
5384450 | Goetz, Jr. | Jan 1995 | A |
5395206 | Cerny, Jr. | Mar 1995 | A |
5402051 | Fujiwara | Mar 1995 | A |
5548515 | Pilley | Aug 1996 | A |
5632381 | Thust | May 1997 | A |
5652489 | Kawakami | Jul 1997 | A |
5671362 | Cowe | Sep 1997 | A |
5777571 | Chuang | Jul 1998 | A |
5801340 | Peter | Sep 1998 | A |
5917174 | Moore | Jun 1999 | A |
5920261 | Hughes | Jul 1999 | A |
5969317 | Espy | Oct 1999 | A |
6018397 | Cloutier | Jan 2000 | A |
6199753 | Tracy | Mar 2001 | B1 |
6201203 | Tilles | Mar 2001 | B1 |
6240342 | Fiegert | May 2001 | B1 |
6339735 | Peless | Jan 2002 | B1 |
6365857 | Maehata | Apr 2002 | B1 |
6374155 | Wallach | Apr 2002 | B1 |
6394519 | Byers | May 2002 | B1 |
6431078 | Serrano | Aug 2002 | B2 |
6522952 | Arai | Feb 2003 | B1 |
6525509 | Petersson | Feb 2003 | B1 |
6535793 | Allard | Mar 2003 | B2 |
6550672 | Tracy | Apr 2003 | B1 |
6571693 | Kaldenberg | Jun 2003 | B1 |
6584375 | Bancroft | Jun 2003 | B2 |
6584376 | VanKommer | Jun 2003 | B1 |
6600418 | Francis | Jul 2003 | B2 |
6601759 | Fife | Aug 2003 | B2 |
6606411 | Loui | Aug 2003 | B1 |
6626632 | Guenzi | Sep 2003 | B2 |
6633800 | Ward | Oct 2003 | B1 |
6655897 | Harwell | Dec 2003 | B1 |
6667592 | Jacobs | Dec 2003 | B2 |
6672601 | Hofheins | Jan 2004 | B1 |
6678583 | Nasr | Jan 2004 | B2 |
6688435 | Will | Feb 2004 | B1 |
6728597 | Didriksen | Apr 2004 | B2 |
6731204 | Lehmann | May 2004 | B2 |
6745186 | Testa | Jun 2004 | B1 |
6752582 | Garcia | Jun 2004 | B2 |
6810149 | Squilla | Oct 2004 | B1 |
6816085 | Haynes | Nov 2004 | B1 |
6832884 | Robinson | Dec 2004 | B2 |
6841963 | Song | Jan 2005 | B2 |
6850899 | Chow | Feb 2005 | B1 |
6883201 | Jones | Apr 2005 | B2 |
6885736 | Uppaluru | Apr 2005 | B2 |
6886101 | Glazer | Apr 2005 | B2 |
6895301 | Mountz | May 2005 | B2 |
6910828 | Hughes | Jun 2005 | B1 |
6937989 | McIntyre | Aug 2005 | B2 |
6954695 | Bonilla | Oct 2005 | B2 |
6967455 | Nakadai | Nov 2005 | B2 |
6975997 | Murakami | Dec 2005 | B1 |
7039499 | Nasr | May 2006 | B1 |
7066291 | Martins | Jun 2006 | B2 |
7101113 | Hughes | Sep 2006 | B2 |
7101139 | Benedict | Sep 2006 | B1 |
7117902 | Osborne | Oct 2006 | B2 |
7145562 | Schechter | Dec 2006 | B2 |
7147154 | Myers | Dec 2006 | B2 |
7177820 | McIntyre | Feb 2007 | B2 |
7184586 | Jeon | Feb 2007 | B2 |
7205016 | Garwood | Apr 2007 | B2 |
7206753 | Bancroft | Apr 2007 | B2 |
7222363 | Rice | May 2007 | B2 |
7233241 | Overhultz | Jun 2007 | B2 |
7234609 | DeLazzer | Jun 2007 | B2 |
7261511 | Felder | Aug 2007 | B2 |
7367245 | Okazaki | May 2008 | B2 |
7381022 | King | Jun 2008 | B1 |
7402018 | Mountz | Jul 2008 | B2 |
7431208 | Feldman | Oct 2008 | B2 |
7447564 | Yasukawa | Nov 2008 | B2 |
7463147 | Laffoon | Dec 2008 | B1 |
7474945 | Matsunaga | Jan 2009 | B2 |
7487913 | Adema | Feb 2009 | B2 |
7533029 | Mallett | May 2009 | B2 |
7554282 | Nakamoto | Jun 2009 | B2 |
7556108 | Won | Jul 2009 | B2 |
7556219 | Page | Jul 2009 | B2 |
7587756 | Peart | Sep 2009 | B2 |
7613544 | Park | Nov 2009 | B2 |
7627515 | Borgs | Dec 2009 | B2 |
7636045 | Sugiyama | Dec 2009 | B2 |
7648068 | Silverbrook | Jan 2010 | B2 |
7653603 | Holtkamp, Jr. | Jan 2010 | B1 |
7658327 | Tuchman | Feb 2010 | B2 |
7689322 | Tanaka | Mar 2010 | B2 |
7693605 | Park | Apr 2010 | B2 |
7693745 | Pomerantz | Apr 2010 | B1 |
7693757 | Zimmerman | Apr 2010 | B2 |
7706917 | Chiappetta | Apr 2010 | B1 |
7716064 | McIntyre | May 2010 | B2 |
7726563 | Scott | Jun 2010 | B2 |
7762458 | Stawar | Jul 2010 | B2 |
7783527 | Bonner | Aug 2010 | B2 |
7787985 | Tsujimoto | Aug 2010 | B2 |
7817394 | Mukherjee | Oct 2010 | B2 |
7826919 | DAndrea | Nov 2010 | B2 |
7835281 | Lee | Nov 2010 | B2 |
7894932 | Mountz | Feb 2011 | B2 |
7894939 | Zini | Feb 2011 | B2 |
7957837 | Ziegler | Jun 2011 | B2 |
7969297 | Haartsen | Jun 2011 | B2 |
7996109 | Zini | Aug 2011 | B2 |
8010230 | Zini | Aug 2011 | B2 |
8032249 | Shakes | Oct 2011 | B1 |
8041455 | Thorne | Oct 2011 | B2 |
8050976 | Staib | Nov 2011 | B2 |
8065032 | Stiffer | Nov 2011 | B2 |
8065353 | Eckhoff-Hornback | Nov 2011 | B2 |
8069092 | Bryant | Nov 2011 | B2 |
8083013 | Bewley | Dec 2011 | B2 |
8099191 | Blanc | Jan 2012 | B2 |
8103398 | Duggan | Jan 2012 | B2 |
8195333 | Ziegler | Jun 2012 | B2 |
8239276 | Lin | Aug 2012 | B2 |
8244041 | Silver | Aug 2012 | B1 |
8248467 | Ganick | Aug 2012 | B1 |
8260456 | Siegel | Sep 2012 | B2 |
8284240 | Saint-Pierre | Oct 2012 | B2 |
8295542 | Albertson | Oct 2012 | B2 |
8321303 | Krishnamurthy | Nov 2012 | B1 |
8325036 | Fuhr | Dec 2012 | B1 |
8342467 | Stachowski | Jan 2013 | B2 |
8352110 | Szybalski | Jan 2013 | B1 |
8359122 | Koselka | Jan 2013 | B2 |
8380349 | Hickman | Feb 2013 | B1 |
8393846 | Coots | Mar 2013 | B1 |
8412400 | DAndrea | Apr 2013 | B2 |
8423280 | Edwards | Apr 2013 | B2 |
8425173 | Lert | Apr 2013 | B2 |
8429004 | Hamilton | Apr 2013 | B2 |
8430192 | Gillett | Apr 2013 | B2 |
8433470 | Szybalski | Apr 2013 | B1 |
8433507 | Hannah | Apr 2013 | B2 |
8437875 | Hernandez | May 2013 | B2 |
8444369 | Watt | May 2013 | B2 |
8447863 | Francis, Jr. | May 2013 | B1 |
8452450 | Dooley | May 2013 | B2 |
8474090 | Jones | Jul 2013 | B2 |
8494908 | Herwig | Jul 2013 | B2 |
8504202 | Ichinose | Aug 2013 | B2 |
8508590 | Laws | Aug 2013 | B2 |
8510033 | Park | Aug 2013 | B2 |
8511606 | Lutke | Aug 2013 | B1 |
8515580 | Taylor | Aug 2013 | B2 |
8516651 | Jones | Aug 2013 | B2 |
8538577 | Bell | Sep 2013 | B2 |
8544858 | Eberlein | Oct 2013 | B2 |
8571700 | Keller | Oct 2013 | B2 |
8572712 | Rice | Oct 2013 | B2 |
8577538 | Lenser | Nov 2013 | B2 |
8587662 | Moll | Nov 2013 | B1 |
8588969 | Frazier | Nov 2013 | B2 |
8594834 | Clark | Nov 2013 | B1 |
8606314 | Barnes, Jr. | Dec 2013 | B2 |
8606392 | Wurman | Dec 2013 | B2 |
8639382 | Clark | Jan 2014 | B1 |
8645223 | Ouimet | Feb 2014 | B2 |
8649557 | Hyung | Feb 2014 | B2 |
8656550 | Jones | Feb 2014 | B2 |
8670866 | Ziegler | Mar 2014 | B2 |
8671507 | Jones | Mar 2014 | B2 |
8676377 | Siegel | Mar 2014 | B2 |
8676420 | Kume | Mar 2014 | B2 |
8676480 | Lynch | Mar 2014 | B2 |
8700230 | Hannah | Apr 2014 | B1 |
8708285 | Carreiro | Apr 2014 | B1 |
8718814 | Clark | May 2014 | B1 |
8724282 | Hiremath | May 2014 | B2 |
8732039 | Chen | May 2014 | B1 |
8744626 | Johnson | Jun 2014 | B2 |
8751042 | Lee | Jun 2014 | B2 |
8763199 | Jones | Jul 2014 | B2 |
8770976 | Moser | Jul 2014 | B2 |
8775064 | Zeng | Jul 2014 | B2 |
8798786 | Wurman | Aug 2014 | B2 |
8798840 | Fong | Aug 2014 | B2 |
8814039 | Bishop | Aug 2014 | B2 |
8818556 | Sanchez | Aug 2014 | B2 |
8820633 | Bishop | Sep 2014 | B2 |
8825226 | Worley, III | Sep 2014 | B1 |
8831984 | Hoffman | Sep 2014 | B2 |
8838268 | Friedman | Sep 2014 | B2 |
8843244 | Phillips | Sep 2014 | B2 |
8851369 | Bishop | Oct 2014 | B2 |
8882432 | Bastian, II | Nov 2014 | B2 |
8886390 | Wolfe | Nov 2014 | B2 |
8892240 | Vliet | Nov 2014 | B1 |
8892241 | Weiss | Nov 2014 | B2 |
8899903 | Saad | Dec 2014 | B1 |
8918202 | Kawano | Dec 2014 | B2 |
8918230 | Chen | Dec 2014 | B2 |
8930044 | Peeters | Jan 2015 | B1 |
8965561 | Jacobus | Feb 2015 | B2 |
8972045 | Mountz | Mar 2015 | B1 |
8972061 | Rosenstein | Mar 2015 | B2 |
8983647 | Dwarakanath | Mar 2015 | B1 |
8989053 | Skaaksrud | Mar 2015 | B1 |
9002506 | Agarwal | Apr 2015 | B1 |
9008827 | Dwarakanath | Apr 2015 | B1 |
9008829 | Worsley | Apr 2015 | B2 |
9014848 | Farlow | Apr 2015 | B2 |
9075136 | Joao | Jul 2015 | B1 |
9129277 | MacIntosh | Sep 2015 | B2 |
9170117 | Abuelsaad | Oct 2015 | B1 |
9173816 | Reinhardt | Nov 2015 | B2 |
9190304 | MacKnight | Nov 2015 | B2 |
9278839 | Gilbride | Mar 2016 | B2 |
9305280 | Berg | Apr 2016 | B1 |
9329597 | Stoschek | May 2016 | B2 |
9495703 | Kaye, III | Nov 2016 | B1 |
9534906 | High | Jan 2017 | B2 |
9550577 | Beckman | Jan 2017 | B1 |
9573684 | Kimchi | Feb 2017 | B2 |
9578282 | Sills | Feb 2017 | B1 |
9607285 | Wellman | Mar 2017 | B1 |
9623923 | Riedel | Apr 2017 | B2 |
9649766 | Stubbs | May 2017 | B2 |
9656805 | Evans | May 2017 | B1 |
9658622 | Walton | May 2017 | B2 |
9663292 | Brazeau | May 2017 | B1 |
9663293 | Wurman | May 2017 | B2 |
9663295 | Wurman | May 2017 | B1 |
9663296 | Dingle | May 2017 | B1 |
9747480 | McAllister | Aug 2017 | B2 |
9757002 | Thompson | Sep 2017 | B2 |
9796093 | Mascorro Medina | Oct 2017 | B2 |
9801517 | High | Oct 2017 | B2 |
9827678 | Gilbertson | Nov 2017 | B1 |
9875502 | Kay | Jan 2018 | B2 |
9875503 | High | Jan 2018 | B2 |
9896315 | High | Mar 2018 | B2 |
9908760 | High | Mar 2018 | B2 |
9948917 | Inacio De Matos | Apr 2018 | B2 |
9994434 | High | Jun 2018 | B2 |
10017322 | High | Jul 2018 | B2 |
10071891 | High | Sep 2018 | B2 |
10071892 | High | Sep 2018 | B2 |
10071893 | High | Sep 2018 | B2 |
10081525 | High | Sep 2018 | B2 |
20010042024 | Rogers | Nov 2001 | A1 |
20020060542 | Song | May 2002 | A1 |
20020095342 | Feldman | Jul 2002 | A1 |
20020154974 | Fukuda | Oct 2002 | A1 |
20020156551 | Tackett | Oct 2002 | A1 |
20020165638 | Bancroft | Nov 2002 | A1 |
20020165643 | Bancroft | Nov 2002 | A1 |
20020165790 | Bancroft | Nov 2002 | A1 |
20020170961 | Dickson | Nov 2002 | A1 |
20020174021 | Chu | Nov 2002 | A1 |
20030028284 | Chirnomas | Feb 2003 | A1 |
20030152679 | Garwood | Aug 2003 | A1 |
20030170357 | Garwood | Sep 2003 | A1 |
20030185948 | Garwood | Oct 2003 | A1 |
20030222798 | Floros | Dec 2003 | A1 |
20040068348 | Jager | Apr 2004 | A1 |
20040081729 | Garwood | Apr 2004 | A1 |
20040093650 | Martins | May 2004 | A1 |
20040098167 | Yi | May 2004 | A1 |
20040117063 | Sabe | Jun 2004 | A1 |
20040146602 | Garwood | Jul 2004 | A1 |
20040216339 | Garberg | Nov 2004 | A1 |
20040217166 | Myers | Nov 2004 | A1 |
20040221790 | Sinclair | Nov 2004 | A1 |
20040225613 | Narayanaswami | Nov 2004 | A1 |
20040249497 | Saigh | Dec 2004 | A1 |
20050008463 | Stehr | Jan 2005 | A1 |
20050047895 | Lert | Mar 2005 | A1 |
20050072651 | Wieth | Apr 2005 | A1 |
20050080520 | Kline | Apr 2005 | A1 |
20050104547 | Wang | May 2005 | A1 |
20050149414 | Schrodt | Jul 2005 | A1 |
20050154265 | Miro | Jul 2005 | A1 |
20050177446 | Hoblit | Aug 2005 | A1 |
20050216126 | Koselka | Sep 2005 | A1 |
20050222712 | Orita | Oct 2005 | A1 |
20050230472 | Chang | Oct 2005 | A1 |
20050238465 | Razumov | Oct 2005 | A1 |
20060107067 | Safal | May 2006 | A1 |
20060147087 | Goncalves | Jul 2006 | A1 |
20060163350 | Melton | Jul 2006 | A1 |
20060178777 | Park | Aug 2006 | A1 |
20060206235 | Shakes | Sep 2006 | A1 |
20060210382 | Mountz | Sep 2006 | A1 |
20060220809 | Stigall | Oct 2006 | A1 |
20060221072 | Se | Oct 2006 | A1 |
20060231301 | Rose | Oct 2006 | A1 |
20060235570 | Jung | Oct 2006 | A1 |
20060241827 | Fukuchi | Oct 2006 | A1 |
20060244588 | Hannah | Nov 2006 | A1 |
20060279421 | French | Dec 2006 | A1 |
20060293810 | Nakamoto | Dec 2006 | A1 |
20070005179 | Mccrackin | Jan 2007 | A1 |
20070017855 | Pippin | Jan 2007 | A1 |
20070045018 | Carter | Mar 2007 | A1 |
20070061210 | Chen | Mar 2007 | A1 |
20070069014 | Heckel | Mar 2007 | A1 |
20070072662 | Templeman | Mar 2007 | A1 |
20070085682 | Murofushi | Apr 2007 | A1 |
20070125727 | Winkler | Jun 2007 | A1 |
20070150368 | Arora | Jun 2007 | A1 |
20070152057 | Cato | Jul 2007 | A1 |
20070222679 | Morris | Sep 2007 | A1 |
20070269299 | Ross | Nov 2007 | A1 |
20070284442 | Herskovitz | Dec 2007 | A1 |
20070288123 | D Andrea | Dec 2007 | A1 |
20070288127 | Haq | Dec 2007 | A1 |
20070293978 | Wurman | Dec 2007 | A1 |
20080011836 | Adema | Jan 2008 | A1 |
20080031491 | Ma | Feb 2008 | A1 |
20080041644 | Tudek | Feb 2008 | A1 |
20080042836 | Christopher | Feb 2008 | A1 |
20080075566 | Benedict | Mar 2008 | A1 |
20080075568 | Benedict | Mar 2008 | A1 |
20080075569 | Benedict | Mar 2008 | A1 |
20080077511 | Zimmerman | Mar 2008 | A1 |
20080105445 | Dayton | May 2008 | A1 |
20080131255 | Hessler | Jun 2008 | A1 |
20080140253 | Brown | Jun 2008 | A1 |
20080154720 | Gounares | Jun 2008 | A1 |
20080201227 | Bakewell | Aug 2008 | A1 |
20080226129 | Kundu | Sep 2008 | A1 |
20080267759 | Morency | Oct 2008 | A1 |
20080281515 | Ann | Nov 2008 | A1 |
20080281664 | Campbell | Nov 2008 | A1 |
20080294288 | Yamauchi | Nov 2008 | A1 |
20080306787 | Hamilton | Dec 2008 | A1 |
20080308630 | Bhogal | Dec 2008 | A1 |
20080314667 | Hannah | Dec 2008 | A1 |
20090074545 | Lert | Mar 2009 | A1 |
20090132250 | Chiang | May 2009 | A1 |
20090134572 | Obuchi | May 2009 | A1 |
20090138375 | Schwartz | May 2009 | A1 |
20090154708 | Kolar Sunder | Jun 2009 | A1 |
20090155033 | Olsen | Jun 2009 | A1 |
20090164902 | Cohen | Jun 2009 | A1 |
20090177323 | Ziegler | Jul 2009 | A1 |
20090210536 | Allen | Aug 2009 | A1 |
20090240571 | Bonner | Sep 2009 | A1 |
20090259571 | Ebling | Oct 2009 | A1 |
20090265193 | Collins | Oct 2009 | A1 |
20090269173 | De Leo | Oct 2009 | A1 |
20090299822 | Harari | Dec 2009 | A1 |
20090319399 | Resta | Dec 2009 | A1 |
20100025964 | Fisk | Feb 2010 | A1 |
20100030417 | Fang | Feb 2010 | A1 |
20100076959 | Ramani | Mar 2010 | A1 |
20100131103 | Herzog | May 2010 | A1 |
20100138281 | Zhang | Jun 2010 | A1 |
20100143089 | Hvass | Jun 2010 | A1 |
20100171826 | Hamilton | Jul 2010 | A1 |
20100176922 | Schwab | Jul 2010 | A1 |
20100211441 | Sprigg | Aug 2010 | A1 |
20100222925 | Anezaki | Sep 2010 | A1 |
20100262278 | Winkler | Oct 2010 | A1 |
20100268697 | Karlsson | Oct 2010 | A1 |
20100295847 | Titus | Nov 2010 | A1 |
20100299065 | Mays | Nov 2010 | A1 |
20100302102 | Desai | Dec 2010 | A1 |
20100316470 | Lert | Dec 2010 | A1 |
20100324773 | Choi | Dec 2010 | A1 |
20110010023 | Kunzig | Jan 2011 | A1 |
20110022201 | Reumerman | Jan 2011 | A1 |
20110098920 | Chuang | Apr 2011 | A1 |
20110153081 | Romanov | Jun 2011 | A1 |
20110163160 | Zini | Jul 2011 | A1 |
20110176803 | Song | Jul 2011 | A1 |
20110225071 | Sano | Sep 2011 | A1 |
20110238211 | Shirado | Sep 2011 | A1 |
20110240777 | Johns | Oct 2011 | A1 |
20110258060 | Sweeney | Oct 2011 | A1 |
20110260865 | Bergman | Oct 2011 | A1 |
20110279252 | Carter | Nov 2011 | A1 |
20110288684 | Farlow | Nov 2011 | A1 |
20110288763 | Hui | Nov 2011 | A1 |
20110295424 | Johnson | Dec 2011 | A1 |
20110301757 | Jones | Dec 2011 | A1 |
20110320034 | Dearlove | Dec 2011 | A1 |
20110320322 | Roslak | Dec 2011 | A1 |
20120000024 | Layton | Jan 2012 | A1 |
20120029697 | Ota | Feb 2012 | A1 |
20120035823 | Carter | Feb 2012 | A1 |
20120046998 | Staib | Feb 2012 | A1 |
20120059743 | Rao | Mar 2012 | A1 |
20120072303 | Brown | Mar 2012 | A1 |
20120134771 | Larson | May 2012 | A1 |
20120143726 | Chirnomas | Jun 2012 | A1 |
20120185094 | Rosenstein | Jul 2012 | A1 |
20120185355 | Kilroy | Jul 2012 | A1 |
20120192260 | Kontsevich | Jul 2012 | A1 |
20120197431 | Toebes | Aug 2012 | A1 |
20120226556 | Itagaki | Sep 2012 | A1 |
20120239224 | McCabe | Sep 2012 | A1 |
20120255810 | Yang | Oct 2012 | A1 |
20120259732 | Sasankan | Oct 2012 | A1 |
20120272500 | Reuteler | Nov 2012 | A1 |
20120294698 | Villamar | Nov 2012 | A1 |
20120303263 | Alam | Nov 2012 | A1 |
20120303479 | Derks | Nov 2012 | A1 |
20120330458 | Weiss | Dec 2012 | A1 |
20130016011 | Harriman | Jan 2013 | A1 |
20130026224 | Ganick | Jan 2013 | A1 |
20130051667 | Deng | Feb 2013 | A1 |
20130054052 | Waltz | Feb 2013 | A1 |
20130054280 | Moshfeghi | Feb 2013 | A1 |
20130060461 | Wong | Mar 2013 | A1 |
20130073405 | Ariyibi | Mar 2013 | A1 |
20130096735 | Byford | Apr 2013 | A1 |
20130103539 | Abraham | Apr 2013 | A1 |
20130105036 | Smith | May 2013 | A1 |
20130110671 | Gray | May 2013 | A1 |
20130141555 | Ganick | Jun 2013 | A1 |
20130145572 | Schregardus | Jun 2013 | A1 |
20130151335 | Avadhanam | Jun 2013 | A1 |
20130155058 | Golparvar-Fard | Jun 2013 | A1 |
20130174371 | Jones | Jul 2013 | A1 |
20130181370 | Rafie | Jul 2013 | A1 |
20130211953 | Abraham | Aug 2013 | A1 |
20130218453 | Geelen | Aug 2013 | A1 |
20130231779 | Purkayastha | Sep 2013 | A1 |
20130235206 | Smith | Sep 2013 | A1 |
20130238130 | Dorschel | Sep 2013 | A1 |
20130245810 | Sullivan | Sep 2013 | A1 |
20130276004 | Boncyk | Oct 2013 | A1 |
20130300729 | Grimaud | Nov 2013 | A1 |
20130302132 | DAndrea | Nov 2013 | A1 |
20130309637 | Minvielle Eugenio | Nov 2013 | A1 |
20130317642 | Asaria | Nov 2013 | A1 |
20130333961 | Odonnell | Dec 2013 | A1 |
20130338825 | Cantor | Dec 2013 | A1 |
20140006229 | Birch | Jan 2014 | A1 |
20140014470 | Razumov | Jan 2014 | A1 |
20140032034 | Raptopoulos | Jan 2014 | A1 |
20140032379 | Schuetz | Jan 2014 | A1 |
20140037404 | Hancock | Feb 2014 | A1 |
20140046512 | Villamar | Feb 2014 | A1 |
20140058556 | Kawano | Feb 2014 | A1 |
20140067564 | Yuan | Mar 2014 | A1 |
20140081445 | Villamar | Mar 2014 | A1 |
20140091013 | Streufert | Apr 2014 | A1 |
20140100715 | Mountz | Apr 2014 | A1 |
20140100768 | Kessens | Apr 2014 | A1 |
20140100769 | Wurman | Apr 2014 | A1 |
20140100998 | Mountz | Apr 2014 | A1 |
20140100999 | Mountz | Apr 2014 | A1 |
20140101690 | Boncyk | Apr 2014 | A1 |
20140108087 | Fukui | Apr 2014 | A1 |
20140124004 | Rosenstein | May 2014 | A1 |
20140129054 | Huntzicker | May 2014 | A1 |
20140133943 | Razumov | May 2014 | A1 |
20140135984 | Hirata | May 2014 | A1 |
20140136414 | Abhyanker | May 2014 | A1 |
20140143039 | Branton | May 2014 | A1 |
20140149958 | Samadi | May 2014 | A1 |
20140152507 | McAllister | Jun 2014 | A1 |
20140156450 | Ruckart | Jun 2014 | A1 |
20140156461 | Lerner | Jun 2014 | A1 |
20140157156 | Kawamoto | Jun 2014 | A1 |
20140164123 | Wissner-Gross | Jun 2014 | A1 |
20140172197 | Ganz | Jun 2014 | A1 |
20140172727 | Abhyanker | Jun 2014 | A1 |
20140177907 | Argue | Jun 2014 | A1 |
20140177924 | Argue | Jun 2014 | A1 |
20140180478 | Letsky | Jun 2014 | A1 |
20140180528 | Argue | Jun 2014 | A1 |
20140180865 | Argue | Jun 2014 | A1 |
20140180914 | Abhyanker | Jun 2014 | A1 |
20140201041 | Meyer | Jul 2014 | A1 |
20140207614 | Ramaswamy | Jul 2014 | A1 |
20140209514 | Gitschel | Jul 2014 | A1 |
20140211988 | Fan | Jul 2014 | A1 |
20140214205 | Kwon | Jul 2014 | A1 |
20140217242 | Muren | Aug 2014 | A1 |
20140228999 | D'Andrea | Aug 2014 | A1 |
20140229320 | Mohammed | Aug 2014 | A1 |
20140244026 | Neiser | Aug 2014 | A1 |
20140244207 | Hicks | Aug 2014 | A1 |
20140246257 | Jacobsen | Sep 2014 | A1 |
20140247116 | Davidson | Sep 2014 | A1 |
20140250613 | Jones | Sep 2014 | A1 |
20140254896 | Zhou | Sep 2014 | A1 |
20140257928 | Chen | Sep 2014 | A1 |
20140266616 | Jones | Sep 2014 | A1 |
20140267409 | Fein | Sep 2014 | A1 |
20140274309 | Nguyen | Sep 2014 | A1 |
20140277693 | Naylor | Sep 2014 | A1 |
20140277742 | Wells | Sep 2014 | A1 |
20140277841 | Klicpera | Sep 2014 | A1 |
20140285134 | Kim | Sep 2014 | A1 |
20140289009 | Campbell | Sep 2014 | A1 |
20140297090 | Ichinose | Oct 2014 | A1 |
20140304107 | McAllister | Oct 2014 | A1 |
20140306654 | Partovi | Oct 2014 | A1 |
20140309809 | Dixon | Oct 2014 | A1 |
20140330456 | LopezMorales | Nov 2014 | A1 |
20140330677 | Boncyk | Nov 2014 | A1 |
20140344011 | Dogin | Nov 2014 | A1 |
20140344118 | Parpia | Nov 2014 | A1 |
20140350725 | LaFary | Nov 2014 | A1 |
20140350851 | Carter | Nov 2014 | A1 |
20140350855 | Vishnuvajhala | Nov 2014 | A1 |
20140361077 | Davidson | Dec 2014 | A1 |
20140369558 | Holz | Dec 2014 | A1 |
20140371912 | Passot | Dec 2014 | A1 |
20140379588 | Gates | Dec 2014 | A1 |
20150006319 | Thomas | Jan 2015 | A1 |
20150029339 | Kobres | Jan 2015 | A1 |
20150032252 | Galluzzo | Jan 2015 | A1 |
20150045992 | Ashby | Feb 2015 | A1 |
20150046299 | Yan | Feb 2015 | A1 |
20150066283 | Wurman | Mar 2015 | A1 |
20150073589 | Khodl | Mar 2015 | A1 |
20150098775 | Razumov | Apr 2015 | A1 |
20150100439 | Lu | Apr 2015 | A1 |
20150100461 | Baryakar | Apr 2015 | A1 |
20150112826 | Crutchfield | Apr 2015 | A1 |
20150120094 | Kimchi | Apr 2015 | A1 |
20150123973 | Larsen | May 2015 | A1 |
20150142249 | Ooga | May 2015 | A1 |
20150203140 | Holtan | Jul 2015 | A1 |
20150205298 | Stoschek | Jul 2015 | A1 |
20150205300 | Caver | Jul 2015 | A1 |
20150217449 | Meier | Aug 2015 | A1 |
20150217790 | Golden | Aug 2015 | A1 |
20150221854 | Melz | Aug 2015 | A1 |
20150228004 | Bednarek | Aug 2015 | A1 |
20150229906 | Inacio De Matos | Aug 2015 | A1 |
20150231873 | Okamoto | Aug 2015 | A1 |
20150277440 | Kimchi | Oct 2015 | A1 |
20150278889 | Qian | Oct 2015 | A1 |
20150325128 | Lord | Nov 2015 | A1 |
20150336668 | Pasko | Nov 2015 | A1 |
20150360865 | Massey | Dec 2015 | A1 |
20160016731 | Razumov | Jan 2016 | A1 |
20160023675 | Hannah | Jan 2016 | A1 |
20160052139 | Hyde | Feb 2016 | A1 |
20160101794 | Fowler | Apr 2016 | A1 |
20160101936 | Chamberlin | Apr 2016 | A1 |
20160101940 | Grinnell | Apr 2016 | A1 |
20160110701 | Herring | Apr 2016 | A1 |
20160114488 | Mascorro Medina | Apr 2016 | A1 |
20160167557 | Mecklinger | Jun 2016 | A1 |
20160167577 | Simmons | Jun 2016 | A1 |
20160176638 | Toebes | Jun 2016 | A1 |
20160196755 | Navot | Jul 2016 | A1 |
20160207193 | Wise | Jul 2016 | A1 |
20160210602 | Siddique | Jul 2016 | A1 |
20160236867 | Brazeau | Aug 2016 | A1 |
20160255969 | High | Sep 2016 | A1 |
20160257212 | Thompson | Sep 2016 | A1 |
20160257240 | High | Sep 2016 | A1 |
20160257401 | Buchmueller | Sep 2016 | A1 |
20160258762 | Taylor | Sep 2016 | A1 |
20160258763 | High | Sep 2016 | A1 |
20160259028 | High | Sep 2016 | A1 |
20160259329 | High | Sep 2016 | A1 |
20160259331 | Thompson | Sep 2016 | A1 |
20160259339 | High | Sep 2016 | A1 |
20160259340 | Kay | Sep 2016 | A1 |
20160259341 | High | Sep 2016 | A1 |
20160259342 | High | Sep 2016 | A1 |
20160259343 | High | Sep 2016 | A1 |
20160259344 | High | Sep 2016 | A1 |
20160259345 | McHale | Sep 2016 | A1 |
20160259346 | High | Sep 2016 | A1 |
20160260049 | High | Sep 2016 | A1 |
20160260054 | High | Sep 2016 | A1 |
20160260055 | High | Sep 2016 | A1 |
20160260142 | Winkle | Sep 2016 | A1 |
20160260145 | High | Sep 2016 | A1 |
20160260148 | High | Sep 2016 | A1 |
20160260158 | High | Sep 2016 | A1 |
20160260159 | Atchley | Sep 2016 | A1 |
20160260161 | Atchley | Sep 2016 | A1 |
20160261698 | Thompson | Sep 2016 | A1 |
20160274586 | Stubbs | Sep 2016 | A1 |
20160288601 | Gehrke | Oct 2016 | A1 |
20160288687 | Scherle | Oct 2016 | A1 |
20160300291 | Carmeli | Oct 2016 | A1 |
20160301698 | Katara | Oct 2016 | A1 |
20160325932 | Hognaland | Nov 2016 | A1 |
20160349754 | Mohr | Dec 2016 | A1 |
20160355337 | Lert | Dec 2016 | A1 |
20160364785 | Wankhede | Dec 2016 | A1 |
20160364786 | Wankhede | Dec 2016 | A1 |
20170009417 | High | Jan 2017 | A1 |
20170010608 | High | Jan 2017 | A1 |
20170010609 | High | Jan 2017 | A1 |
20170010610 | Atchley | Jan 2017 | A1 |
20170020354 | High | Jan 2017 | A1 |
20170024806 | High | Jan 2017 | A1 |
20170080846 | Lord | Mar 2017 | A1 |
20170107055 | Magens | Apr 2017 | A1 |
20170110017 | Kimchi | Apr 2017 | A1 |
20170120443 | Kang | May 2017 | A1 |
20170129602 | Alduaiji | May 2017 | A1 |
20170137235 | Thompson | May 2017 | A1 |
20170148075 | High | May 2017 | A1 |
20170158430 | Raizer | Jun 2017 | A1 |
20170166399 | Stubbs | Jun 2017 | A1 |
20170176986 | High | Jun 2017 | A1 |
20170178066 | High | Jun 2017 | A1 |
20170178082 | High | Jun 2017 | A1 |
20170183159 | Weiss | Jun 2017 | A1 |
20170283171 | High | Oct 2017 | A1 |
20170355081 | Fisher | Dec 2017 | A1 |
20180020896 | High | Jan 2018 | A1 |
20180068357 | High | Mar 2018 | A1 |
20180075403 | Mascorro Medina | Mar 2018 | A1 |
20180099846 | High | Apr 2018 | A1 |
20180170729 | High | Jun 2018 | A1 |
20180170730 | High | Jun 2018 | A1 |
20180273292 | High | Sep 2018 | A1 |
20180282139 | High | Oct 2018 | A1 |
20180346299 | High | Dec 2018 | A1 |
20180346300 | High | Dec 2018 | A1 |
20190002256 | High | Jan 2019 | A1 |
Number | Date | Country |
---|---|---|
2524037 | May 2006 | CA |
2625885 | Apr 2007 | CA |
100999277 | Jul 2007 | CN |
102079433 | Jun 2011 | CN |
202847767 | Apr 2013 | CN |
103136923 | May 2013 | CN |
103213115 | Jul 2013 | CN |
203166399 | Aug 2013 | CN |
203191819 | Sep 2013 | CN |
203401274 | Jan 2014 | CN |
203402565 | Jan 2014 | CN |
103625808 | Mar 2014 | CN |
203468521 | Mar 2014 | CN |
103696393 | Apr 2014 | CN |
103723403 | Apr 2014 | CN |
203512491 | Apr 2014 | CN |
103770117 | May 2014 | CN |
203782622 | Aug 2014 | CN |
104102188 | Oct 2014 | CN |
104102219 | Oct 2014 | CN |
102393739 | Dec 2014 | CN |
204054062 | Dec 2014 | CN |
204309852 | Dec 2014 | CN |
204331404 | May 2015 | CN |
105460051 | Apr 2016 | CN |
102013013438 | Feb 2015 | DE |
861415 | May 1997 | EP |
1136052 | Sep 2001 | EP |
0887491 | Apr 2004 | EP |
1439039 | Jul 2004 | EP |
1447726 | Aug 2004 | EP |
2148169 | Jan 2010 | EP |
2106886 | Mar 2011 | EP |
2309487 | Apr 2011 | EP |
2050544 | Aug 2011 | EP |
2498158 | Sep 2012 | EP |
2571660 | Mar 2013 | EP |
2590041 | May 2013 | EP |
2608163 | Jun 2013 | EP |
2662831 | Nov 2013 | EP |
2730377 | May 2014 | EP |
2886020 | Jun 2015 | EP |
2710330 | Mar 1995 | FR |
1382806 | Feb 1971 | GB |
2530626 | Mar 2016 | GB |
2542472 | Mar 2017 | GB |
2542905 | May 2017 | GB |
62247458 | Oct 1987 | JP |
H10129996 | May 1998 | JP |
2003288396 | Oct 2003 | JP |
2005350222 | Dec 2005 | JP |
2009284944 | Dec 2009 | JP |
2010105644 | May 2010 | JP |
2010231470 | Oct 2010 | JP |
20120100505 | Sep 2012 | KR |
8503277 | Aug 1985 | WO |
9603305 | Jul 1995 | WO |
1997018523 | May 1997 | WO |
9855903 | Dec 1998 | WO |
2000061438 | Oct 2000 | WO |
0132366 | May 2001 | WO |
2004092858 | Oct 2004 | WO |
2005102875 | Nov 2005 | WO |
2006056614 | Jun 2006 | WO |
2006120636 | Nov 2006 | WO |
2006137072 | Dec 2006 | WO |
2007007354 | Jan 2007 | WO |
2007047514 | Apr 2007 | WO |
2007149196 | Dec 2007 | WO |
2008118906 | Oct 2008 | WO |
2008144638 | Nov 2008 | WO |
2008151345 | Dec 2008 | WO |
2009022859 | Feb 2009 | WO |
2009027835 | Mar 2009 | WO |
2009103008 | Aug 2009 | WO |
2011063527 | Jun 2011 | WO |
2012075196 | Jun 2012 | WO |
2013138193 | Sep 2013 | WO |
2013138333 | Sep 2013 | WO |
2013176762 | Nov 2013 | WO |
2014022366 | Feb 2014 | WO |
2014022496 | Feb 2014 | WO |
2014045225 | Mar 2014 | WO |
2014046757 | Mar 2014 | WO |
2014101714 | Jul 2014 | WO |
2014116947 | Jul 2014 | WO |
2014138472 | Sep 2014 | WO |
2014165286 | Oct 2014 | WO |
2015021958 | Feb 2015 | WO |
2015104263 | Jul 2015 | WO |
2015155556 | Oct 2015 | WO |
2016009423 | Jan 2016 | WO |
2016015000 | Jan 2016 | WO |
2016144765 | Sep 2016 | WO |
Entry |
---|
U.S. Appl. No. 15/060,953, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,025, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,054, filed Mar. 4, 2016, Kay. |
U.S. Appl. No. 15/061,203, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,265, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,285, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,325, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,350, filed Mar. 4, 2016, Thompson. |
U.S. Appl. No. 15/061,402, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,406, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,443, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,474, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,507, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,671, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,677, filed Mar. 4, 2016, Taylor. |
U.S. Appl. No. 15/061,686, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,688, filed Mar. 4, 2016, Thompson. |
U.S. Appl. No. 15/061,722, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,770, filed Mar. 4, 2016, Atchley. |
U.S. Appl. No. 15/061,792, filed Mar. 4, 2016, Winkle. |
U.S. Appl. No. 15/061,801, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,805, filed Mar. 4, 2016, Atchley. |
U.S. Appl. No. 15/061,844, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,848, filed Mar. 4, 2016, McHale. |
U.S. Appl. No. 15/061,908, filed Mar. 4, 2016, High. |
U.S. Appl. No. 15/061,980, filed Mar. 4, 2016, Thompson. |
U.S. Appl. No. 15/274,991, filed Jan. 12, 2017, Donald R. High. |
U.S. Appl. No. 15/275,009, filed Sep. 23, 2016, Donald R. High. |
U.S. Appl. No. 15/275,019, filed Sep. 23, 2016, Donald R. High. |
U.S. Appl. No. 15/275,047, filed Sep. 23, 2016, Donald R. High. |
U.S. Appl. No. 15/282,951, filed Sep. 30, 2016, Donald R. High. |
U.S. Appl. No. 15/288,923, filed Oct. 7, 2016, Donald R. High. |
U.S. Appl. No. 15/423,812, filed Feb. 3, 2017, Donald R. High. |
U.S. Appl. No. 15/446,914, filed Mar. 1, 2017, Donald R. High. |
U.S. Appl. No. 15/447,175, filed Mar. 2, 2017, Donald R. High. |
U.S. Appl. No. 15/447,202, filed Mar. 2, 2017, Donald R. High. |
U.S. Appl. No. 15/471,278, filed Mar. 28, 2017, Donald R. High. |
U.S. Appl. No. 15/692,226, filed Aug. 31, 2017, Donald R. High. |
U.S. Appl. No. 15/698,068, filed Sep. 7, 2017, Donald R. High. |
U.S. Appl. No. 15/836,708, filed Dec. 8, 2017, Donald R. High. |
U.S. Appl. No. 15/892,250, filed Feb. 8, 2018, Donald R. High. |
U.S. Appl. No. 15/894,155, filed Feb. 12, 2018, Donald R. High. |
U.S. Appl. No. 15/990,274, filed May 25, 2018, Donald R. High. |
U.S. Appl. No. 16/001,774, filed Jun. 6, 2018, Donald R. High. |
U.S. Appl. No. 16/059,431, filed Aug. 9, 2018, Donald R. High. |
U.S. Appl. No. 16/100,064, filed Aug. 9, 2018, Donald R. High. |
U.S. Appl. No. 16/109,290, filed Aug. 22, 2018, Donald R. High. |
ABBROBOTICS; “ABB Robotics—Innovative Packaging Solutions”, https://www.youtube.com/watch?v=e5jif-IUvHY, published on May 16, 2013, pp. 1-5. |
Ang, Fitzwatler, et al.; “Automated Waste Sorter With Mobile Robot Delivery Waste System”, De La Salle University Research Congress 2013, Mar. 7-9, 2013, pp. 1-7. |
Ansari, Sameer, et al.; “Automated Trash Collection & Removal in Office Cubicle Environments”, Squad Collaborative Robots, Sep. 27, 2013, pp. 1-23. |
Armstrong, Jean, et al.; “Visible Light Positioning: A Roadmap for International Standardization”, IEEE Communications Magazine, Dec. 2013, pp. 2-7. |
Artal, J.S., et al.; “Autonomous Mobile Robot with Hybrid PEM Fuel-Cell and Ultracapacitors Energy System, Dedalo 2.0”, International Conference on Renewable Energies and Power Quality, Santiago de Compostela, Spain, Mar. 28-30, 2012, pp. 1-6. |
Atherton, Kelsey D.; “New GPS Receiver Offers Navigation Accurate to an Inch”, Popular Science, www.popsci.com/technology/article/2013-08/global-positioning-down-inches, Aug. 16, 2013, pp. 1-2. |
Avezbadalov, Ariel, et al.; “Snow Shoveling Robot”, engineering.nyu.edu/mechatronics/projects/ME3484/2006/Snow Shoveling Robot/Mechatronics Snow Robot Presentation Update 12-19-06.pdf, 2006, pp. 1-24. |
Bares, John, et al.; “Designing Crash-Survivable Unmanned Vehicles”, AUVSI Symposium, Jul. 10, 2002, pp. 1-15. |
Bohren; Jonathan et al.; “Towards Autonomous Robotic Butlers: Lessons Learned with the PR2”, Willow Garage, May 9, 2011, pp. 1-8. |
Bouchard, Samuel; “A Robot to Clean Your Trash Bin!”, Robotiq, http://blog.robotiq.com/bid/41203/A-Robot-to-Clean-your-Trash-Bin, Aug. 22, 2011, pp. 1-7. |
BUDGEE; “The Robotic Shopping Cart Budgee”; https://www.youtube.com/watch?v=2dYNdVPF4VM; published on Mar. 20, 2015; pp. 1-6. |
Burns, Tom; “irobot roomba 780 review best robot vacuum floor cleaning robot review video demo”, https://www.youtube.com/watch?v=MkwtlyVAaEY, published on Feb. 13, 2013, pp. 1-10. |
BYTELIGHT; “Scalable Indoor Location”, http://www.bytelight.com/, Dec. 12, 2014, pp. 1-2. |
Canadian Manufacturing; “Amazon unleashes army of order-picking robots”, http://www.canadianmanufacturing.com/supply-chain/amazon-unleashes-army-order-picking-robots-142902/, Dec. 2, 2014, pp. 1-4. |
Capel, Claudine; “Waste sorting—A look at the separation and sorting techniques in today's European market”, Waste Management World, http://waste-management-world.com/a/waste-sorting-a-look-at-the-separation-and-sorting-techniques-in-todayrsquos-european-market, Jul. 1, 2008, pp. 1-8. |
Carnegie Mellon University; "AndyVision—The Future of Retail", https://www.youtube.com/watch?v=n5309ILTV2s, published on Jul. 16, 2012, pp. 1-9. |
Carnegie Mellon University; “Robots in Retail”, www.cmu.edu/homepage/computing/2012/summer/robots-in-retail.shmtl, 2012, p. 1. |
Chopade, Jayesh, et al.; “Control of Spy Robot by Voice and Computer Commands”, International Journal of Advanced Research in Computer and Communication Engineering, vol. 2, Issue 4, Apr. 2013, pp. 1-3. |
CNET; “iRobot Braava 380t—No standing ovation for this robotic floor mop”, https://www.youtube.com/watch?v=JAtClxFtC6Q, published on May 7, 2014, pp. 1-6. |
Coltin, Brian & Ventura, Rodrigo; “Dynamic User Task Scheduling for Mobile Robots”, Association for the Advancement of Artificial Intelligence, 2011, pp. 1-6. |
Couceiro, Micael S., et al.; “Marsupial teams of robots: deployment of miniature robots for swarm exploration under communication constraints”, Robotica, Cambridge University Press, downloaded Jan. 14, 2014, pp. 1-22. |
Coxworth, Ben; “Robot designed to sort trash for recycling”, Gizmag, http://www.gizmag.com/robot-sorts-trash-for-recycling/18426/, Apr. 18, 2011, pp. 1-7. |
Daily Mail; “Dancing with your phone: The gyrating robotic dock that can move along with your music”, Sep. 12, 2012, http://www.dailymail.co.uk/sciencetech/article-2202164/The-intelligent-dancing-robot-controlled-mobile-phone.html, pp. 1-23. |
Davis, Jo; “The Future of Retail: In Store Now”, Online Brands, http://onlinebrands.co.nz/587/future-retail-store-now/, Nov. 16, 2014, pp. 1-5. |
DENSO; “X-mobility”, Oct. 10, 2014, pp. 1-2, including machine translation. |
DHL; “Self-Driving Vehicles in Logistics: A DHL perspective on implications and use cases for the logistics industry”, 2014, pp. 1-39. |
Dorrier, Jason; “Service Robots Will Now Assist Customers at Lowe's Store”, SingularityHUB, http://singularityhub.com/2014/10/29/service-robots-will-now-assist-customers-at-lowes-store/, Oct. 29, 2014, pp. 1-4. |
DRONEWATCH; “Weatherproof Drone XAircraft has Black Box”, DroneWatch, http://www.dronewatch.nl/2015/02/13/weatherproof-drone-van-xaircraft-beschikt-over-zwarte-doos/, Feb. 13, 2015, pp. 1-5. |
Dyson US; “See the new Dyson 360 Eye robot vacuum cleaner in action #DysonRobot”, https://www.youtube.com/watch?v=OadhulCDAjk, published on Sep. 4, 2014, pp. 1-7. |
Edwards, Lin; “Supermarket robot to help the elderly (w/Video)”, Phys.Org, http://phys.org/news/2009-12-supermarket-robot-elderly-video.html, Dec. 17, 2009, pp. 1-5. |
Elfes, Alberto; “Using Occupancy Grids for Mobile Robot Perception and Navigation”, IEEE, 1989, pp. 46-57. |
Elkins, Herschel T.; “Important 2014 New Consumer Laws”, County of Los Angeles Department of Consumer Affairs Community Outreach & Education, updated Jan. 6, 2014, pp. 1-46. |
Falconer, Jason; “HOSPI-R drug delivery robot frees nurses to do more important work”, Gizmag, http://www.gizmag.com/panasonic-hospi-r-delivery-robot/29565/, Oct. 28, 2013, pp. 1-6. |
Falconer, Jason; “Toyota unveils helpful Human Support Robot”, Gizmag, http:/www.gizmag.com/toyota-human-support-robot/24246/, Sep. 22, 2012, pp. 1-6. |
Farivar, Cyrus; “This in-store robot can show you the hammer aisle, but not the bathroom”, Ars Technica, http://arstechnica.com/business/2014/12/this-in-store-robot-can-show-you-the-hammer-aisle-but-not-the-bathroom/, Dec. 3, 2014, pp. 1-4. |
Fellow Robots; “Meet OSHBOT”, http://fellowrobots.com/oshbot/, May 19, 2015, pp. 1-3. |
Fellow Robots; “Oshbot Progress—Fellow Robots”, https://vimeo.com/139532370, published Sep. 16, 2015, pp. 1-5. |
Follow Inspiration; “wiiGO”; https://www.youtube.com/watch?v=dhHXldpknC4; published on Jun. 16, 2015; pp. 1-7. |
FORA.TV; “A Day in the Life of a Kiva Robot”, https://www.youtube.com/watch?v=6KRjuuEVEZs, published on May 11, 2011, pp. 1-11. |
GAMMA2VIDEO; “FridayBeerBot.wmv”, https://www.youtube.com/watch?v=KXXIIDYatxQ, published on Apr. 27, 2010, pp. 1-7. |
Garun, Natt; “Hop the hands-free suitcase follows you around like an obedient pet”; https://www.digitaltrends.com/cool-tech/hop-the-hands-free-suitcase-follows-you-around-like-an-obedient-pet/; Oct. 10, 2012; pp. 1-6. |
Glas, Dylan F., et al.; “The Network Robot System: Enabling Social Human-Robot Interaction in Public Spaces”, Journal of Human-Robot Interaction, vol. 1, No. 2, 2012, pp. 5-32. |
Green, A., et al; “Report on evaluation of the robot trolley”, CommRob IST-045441, Advanced Behaviour and High-Level Multimodal Communications with and among Robots, Jun. 14, 2010, pp. 10-67. |
Gross, H.-M., et al.; "TOOMAS: Interactive Shopping Guide Robots in Everyday Use—Final Implementation and Experiences from Long-term Field Trials", Proc. IEEE/RJS Intern. Conf. on Intelligent Robots and Systems (IROS'09), St. Louis, USA, pp. 2005-2012. |
Habib, Maki K., “Real Time Mapping and Dynamic Navigation for Mobile Robots”, International Journal of Advanced Robotic Systems, vol. 4, No. 3, 2007, pp. 323-338. |
HRJ3 Productions; “Japanese Automatic Golf Cart”, https://www.youtube.com/watch?v=8diWYtqb6C0, published on Mar. 29, 2014, pp. 1-4. |
Huang, Edward Y.C.; “A Semi-Autonomous Vision-Based Navigation System for a Mobile Robotic Vehicle”, Thesis submitted to the Massachusetts Institute of Technology Department of Electrical Engineering and Computer Science on May 21, 2003, pp. 1-76. |
IEEE Spectrum; “Warehouse Robots at Work”, https://www.youtube.com/watch?v=lWsMdN7HMuA, published on Jul. 21, 2008, pp. 1-11. |
Intelligent Autonomous Systems; “TUM James goes shopping”, https://www.youtube.com/watch?v=JS2zycc4AUE, published on May 23, 2011, pp. 1-13. |
Katic, M., Dusko; “Cooperative Multi Robot Systems for Contemporary Shopping Malls”, Robotics Laboratory, Mihailo Pupin Institute, University of Belgrade, Dec. 30, 2010, pp. 10-17. |
Kehoe, Ben, et al.; “Cloud-Based Robot Grasping with the Google Object Recognition Engine”, 2013, pp. 1-7. |
Kendricks, Cooper; “Trash Disposal Robot”, https://prezi.com31acae05zf8i/trash-disposal-robot/, Jan. 9, 2015, pp. 1-7. |
Kibria, Shafkat, “Speech Recognition for Robotic Control”, Master's Thesis in Computing Science, Umea University, Dec. 18, 2005, pp. 1-77. |
King, Rachael; “Newest Workers for Lowe's: Robots”, The Wall Street Journal, http:/www.wsj.com/articles/newest-workers-for-lowes-robots-1414468866, Oct. 28, 2014, pp. 1-4. |
Kitamura, Shunichi; “Super Golf Cart with Remote drive and NAVI system in Japan”, https://www.youtube.com/watch?v=2_3-dUR12F8, published on Oct. 4, 2009, pp. 1-6. |
Kiva Systems; “Automated Goods-to-Man Order Picking System—Kiva Systems”, http://www.kivasystems.com/solutions/picking/, printed on Apr. 2, 2015, pp. 1-2. |
Kiva Systems; “Frequently Asked Questions about Kiva Systems—Kiva Systems”, http://kivasystems.com/about-us-the-kiva-approach/faq/, printed on Apr. 2, 2015, pp. 1-2. |
Kiva Systems; “how a Kiva system makes use of the vertical space—Kiva Systems”, http://www.kivasystems.com/solutions/vertical-storage/, printed on Apr. 2, 2015, pp. 1-2. |
Kiva Systems; “How Kiva Systems and Warehouse Management Systems Interact”, 2010, pp. 1-12. |
Kiva Systems; “Kiva replenishment is more productive and accurate than replenishing pick faces in traditional distribution operations”, http//www.kivasystems.com/solutions/replenishment/, printed on Apr. 2, 2015, pp. 1-2. |
Kiva Systems; “Kiva warehouse control software, Kiva WCS—Kiva Systems”, http://www.kivasystems.com/solutions/software/, printed on Apr. 2, 2015, pp. 1-2. |
Kiva Systems; “Kiva's warehouse automation system is the most powerful and flexible A . . . ”, http://www.kivasystems.com/solutions/, printed on Apr. 2, 2015, pp. 1-2. |
Kiva Systems; “Shipping Sortation—Kiva Systems”, http://www.kivasystems.com/solutions/shipping-sortation/, printed on Apr. 2, 2015, pp. 1-2. |
Kohtsuka, T. et al.; "Design of a Control System for Robot Shopping Carts"; KES'11 Proceedings of the 15th International Conference on Knowledge-Based and Intelligent Information and Engineering Systems; Sep. 12-14, 2011; pp. 280-288. |
Koubaa, Anis; “A Service-Oriented Architecture for Virtualizing Robots in Robot-as-a-Service Clouds”, 2014, pp. 1-13. |
Kumar Paradkar, Prashant; “Voice Controlled Robotic Project using interfacing of Ardruino and Bluetooth HC-05”, Robotics_Projects_C/C++_Android, Jan. 23, 2016, pp. 1-14. |
Kumar, Swagat; “Robotics-as-a-Service: Transforming the Future of Retail”, Tata Consultancy Services, http://www.tcs.com/resources/white_papers/Pages/Robotics-as-Service.aspx, printed on May 13, 2015, pp. 1-4. |
Lejepekov, Fedor; “Yuki-taro. Snow recycle robot.”, https://www.youtube.com/watch?v=gl2j9PY4jGY, published on Jan. 17, 2011, pp. 1-4. |
Liu, Xiaohan, et al.; “Design of an Indoor Self-Positioning System for the Visually Impaired—Simulation with RFID and Bluetooth in a Visible Light Communication System”, Proceedings of the 29th Annual International Conference of the IEEE EMBS, Cite Internationale, Lyon, France, Aug. 23-26, 2007, pp. 1655-1658. |
Lowe's Home Improvement; "OSHbots from Lowe's Innovation Labs", https://www.youtube.com/watch?v=W-RKAjP1dtA, published on Dec. 15, 2014, pp. 1-8. |
Lowe's Innovation Labs; "Autonomous Retail Service Robots", http://www.lowesinnovationlabs.com/innovation-robots/, printed on Feb. 26, 2015, pp. 1-4. |
Matos, Luis; “wi-GO—The autonomous and self-driven shopping cart”; https://www.indiegogo.com/projects/wi-go-the-autonomous-and-self-driven-shopping-cart; printed on Feb. 27, 2015, pp. 1-16. |
Meena, M., & Thilagavathi, P.; “Automatic Docking System with Recharging and Battery Replacement for Surveillance Robot”, International Journal of Electronics and Computer Science Engineering, 2012, pp. 1148-1154. |
Messieh, Nancy; “Humanoid robots will be roaming Abu Dhabi's malls next year”, The Next Web, Oct. 17, 2011, https://thenextweb.com/me/2011/10/17/humanoid-robots-will-be-roaming-abu-dhabis-malls-next-year/, pp. 1-6. |
Murph, Darren; “B.O.S.S. shopping cart follows you around”, Engadget, http://www.engadget.com/2006/08/11/b-o-s-s-shopping-cart-follows-you-around/, Aug. 11, 2006, pp. 1-4. |
Nakajima, Madoka & Haruyama, Shinichiro; “New indoor navigation system for visually impaired people using visible light communication”, EURASIP Journal on Wireless Communications and Networking, 2013, pp. 1-10. |
NEUROBTV; “Shopping Robot TOOMAS 2009”, https://www.youtube.com/watch?v=49Pkm30qmQU, published on May 8, 2010, pp. 1-7. |
Nickerson, S.B., et al.; “An autonomous mobile robot for known industrial environments”, Autonomous Robot for a Known environment, Aug. 28, 1997, pp. 1-28. |
Nishimura, S. et al.; “Development of Attachable Modules for Robotizing Daily Items: Person Following Shopping Cart Robot”; Proceedings of the 2007 IEEE International Conference on Robotics and Biomimetics (Sanya, China); Dec. 15-18, 2007; pp. 1506-1511. |
O'Donnell, Jake; “Meet the Bluetooth-Connected Self-Following Robo-Caddy of the Future”, Sportsgrid; http://www.sportsgrid.com/uncategorized/meet-the-bluetooth-connected-self-following-robo-caddy-of-the-future/, Apr. 22, 2014, pp. 1-5. |
Ogawa, Keisuke; “Denso Demos In-wheel Motor System for Baby Carriages, Shopping Carts”, Nikkei Technology, http://techon.nikkeiibp.co.jp/english/NEWS_EN/20141010/381880/?ST=english_PRINT, Oct. 10, 2014, pp. 1-2. |
Onozato, Taishi et al.; “A Control System for the Robot Shopping Cart”; 2010 IRAST International Congress on Computer Applications and Computational Science (CACS 2010); 2010; pp. 907-910. |
Orchard Supply Hardware; “Orchard Supply Hardware's OSHbot”, https://www.youtube.com/watch?v=Sp9176vm7Co, published on Oct. 28, 2014, pp. 1-9. |
Osborne, Charlie; “Smart Cart Follows You When Grocery Shopping”, Smartplanet, http://www.smartplanet.com/blog/smart-takes/smart-cart-follows-you-when-grocery-shopping/, Feb. 29, 2012, pp. 1-4. |
Owano, Nancy; “HEARBO robot can tell beeps, notes, and spoken word (w/ Video)”, Phys.org, Nov. 21, 2012, https://phys.org/news/2012-11-hearbo-robot-beeps-spoken-word.html, pp. 1-4. |
Poudel, Dev Bahadur; “Coordinating Hundreds of Cooperative, Autonomous Robots in a Warehouse”, Jan. 27, 2013, pp. 1-13. |
ROBOTLAB Inc.; “NAO robot drives autonomously it's own car”, https://www.youtube.com/watch?v=oBHYwYlo1UE, published on Sep. 8, 2014, pp. 1-6. |
Rodriguez, Ashley; “Meet Lowe's Newest Sales Associate—OSHbot, the Robot”, Advertising Age, http://adage.com/article/cmo-strategy/meet-lowe-s-newest-sales-associate-oshbot-robot/295591/, Oct. 28, 2014, pp. 1-8. |
Sales, Jorge, et al.; “CompaRob: The Shopping Cart Assistance Robot”, International Journal of Distributed Sensor Networks, vol. 2016, Article ID 4781280, Jan. 3, 2016, http://dx.doi.org/10.1155/2016/4781280, pp. 1-16. |
Scholz, J. et al.; “Cart Pushing with a Mobile Manipulation System: Towards Navigation with Moveable Objects”; Proceedings of the 2011 IEEE International Conference on Robotics and Automation (Shanghai, China); May 9-13, 2011; pp. 6115-6120. |
Sebaali, G., et al.; “Smart Shopping Cart”, Department of Electrical and Computer Engineering, American University of Beirut, 2014, pp. 1-6. |
Shukla, Neha; “SaviOne the Butler Bot: Service Robot for Hospitality Industry”, TechieTonics, http://www.techietonics.com/robo-tonics/savione-the-butler-bot-service-for-hospitality-industry.html, Aug. 14, 2014, pp. 1-5. |
SK Telecom Co.; “SK Telecom Launches Smart Cart Pilot Test in Korea”; http://www.sktelecom.com/en/press/press_detail.do?idx=971; Oct. 4, 2011; pp. 1-2. |
Song, Guangming, et al.; “Automatic Docking System for Recharging Home Surveillance Robots”, http://www.academia.edu/6495007/Automatic_Docking_System_for_Recharging_Home_Surveillance_Robots, IEEE Transactions on Consumer Electronics, vol. 57, No. 2, May 2011, pp. 1-8. |
Soper, Taylor; “Amazon vet's new robot-powered apparel startup aims to revolutionize how we buy clothes”, GeekWire, http://www.geekwire.com/2012/hointer-robot-jeans-clothing-apparel-store-startup/, Nov. 29, 2012, pp. 1-12. |
Stewart Golf; “Introducing the New Stewart Golf X9 Follow”, https://www.youtube.com/watch?v=HHivFGtiuE, published on Apr. 9, 2014, pp. 1-9. |
Sun, Eric; ““Smart Bin & Trash Route” system—RMIT 2012 Green Inventors Competition”, http://www.youtube.com/watch?v=OrTA57alO0k, published on Nov. 14, 2012, pp. 1-8. |
Superdroid Robots; “Cool Robots, Making Life Easier”, http://www.superdroidrobots.com/shop/custom.aspx/cool-robots-making-life-easier/83/, printed on Jun. 16, 2015, pp. 1-7. |
SWISSLOG; “RoboCourier Autonomous Mobile Robot”, http://www.swisslog.com/en/Products/HCS/Automated-Material-Transport/RoboCourier-Autonomous-Mobile-Robot, printed May 27, 2015, p. 1. |
Tam, Donna; “Meet Amazon's busiest employee—the Kiva robot”, CNET, http://www.cnet.com/news/meet-amazons-busiest-employee-the-kiva-robot/, Nov. 30, 2014, pp. 1-6. |
TECHNION; “Autonomous Tracking Shopping Cart—Shopping Made Easy from Technion”; https://www.youtube.com/watch?v=pQcb9fofmXg; published on Nov. 23, 2014; pp. 1-10. |
Universal Robotics; “Neocortex Enables Random Part Handling and Automated Assembly”, http://www.universalrobotics.com/random-bin-picking, printed on Dec. 22, 2015, pp. 1-3. |
UPHIGH Productions; “Behold the Future (E017 Robot Sales Assistant)”, https://www.youtube.com/watch?v=8WbvjaPm7d4, published on Nov. 19, 2014, pp. 1-7. |
Urankar, Sandeep, et al.; “Robo-Sloth: A Rope-Climbing Robot”, Department of Mechanical Engineering, Indian Institute of Technology, 2003, pp. 1-10. |
USPTO; U.S. Appl. No. 15/061,406; Notice of Allowance dated May 15, 2018. |
USPTO; U.S. Appl. No. 15/061,406; Office Action dated Dec. 19, 2017. |
Vasilescu, Iuliu, et al.; “Autonomous Modular Optical Underwater Robot (AMOUR) Design, Prototype and Feasibility Study”, Apr. 18, 2005, pp. 1-7. |
VMECAVACUUMTECH; “VMECA Magic Suction Cup with ABB robot for pick and place (packaging application)”, https://www.youtube.com/watch?v=5btR9MLtGJA, published on Sep. 14, 2014, pp. 1-4. |
Wang, Xuan; “2D Mapping Solutions for Low Cost Mobile Robot”, Master's Thesis in Computer Science, Royal Institute of Technology, KTH CSC, Stockholm, Sweden, 2013, pp. 1-60. |
Webb, Mick; “Robovie II—the personal robotic shopping”, Gizmag, http://www.gizmag.com/robovie-ii-robotic-shopping-assistance/13664/, Dec. 23, 2009, pp. 1-5. |
Weise, Elizabeth; “15,000 robots usher in Amazon's Cyber Monday”, USATODAY, http://www.usatoday.com/story/tech/2014/12/01/robots-amazon.kiva-fulfillment-centers-cyber-monday/19725229/, Dec. 2, 2014, pp. 1-3. |
Weiss, C.C.; “Multifunctional hybrid robot shovels snow and mows your lawn”, Gizmag, http://www.gizmag.com/snowbyte-snow-shoveling-robot/32961/, Jul. 21, 2014, pp. 1-7. |
Wikipedia; “Kiva Systems”, http://en.wikipedia.org/wiki/Kiva_Systems, printed on Apr. 2, 2015, pp. 1-3. |
Wikipedia; “Leeds Kirkgate Market”; https://en.wikipedia.org/wiki/Leeds_Kirkgate_Market; Retrieved on Apr. 5, 2017; 8 pages. |
WIRED; “High-Speed Robots Part 1: Meet BettyBot in “Human Exclusion Zone” Warehouses—The Window-WIRED”, https://www.youtube.com/watch?v=8gy5tYVR-28, published on Jul. 2, 2013, pp. 1-6. |
Wulf, O., et al.; “Colored 2D maps for robot navigation with 3D sensor data,” Institute for Systems Engineering, University of Hannover, Hannover, Germany, 2014, pp. 1-6. |
YRF; “The Diamond Robbery—Scene Dhoom:2 Hrithik Roshan”, https://www.youtube.com/watch?v=3bMYgo_S0Kc, published on Jul. 12, 2012, pp. 1-7. |
Number | Date | Country | |
---|---|---|---|
20180346299 A1 | Dec 2018 | US |
Number | Date | Country | |
---|---|---|---|
62138877 | Mar 2015 | US | |
62129726 | Mar 2015 | US | |
62129727 | Mar 2015 | US | |
62138885 | Mar 2015 | US | |
62152630 | Apr 2015 | US | |
62152711 | Apr 2015 | US | |
62152667 | Apr 2015 | US | |
62152610 | Apr 2015 | US | |
62152465 | Apr 2015 | US | |
62152440 | Apr 2015 | US | |
62152421 | Apr 2015 | US | |
62157388 | May 2015 | US | |
62165586 | May 2015 | US | |
62165579 | May 2015 | US | |
62165416 | May 2015 | US | |
62171822 | Jun 2015 | US | |
62175182 | Jun 2015 | US | |
62182339 | Jun 2015 | US | |
62185478 | Jun 2015 | US | |
62194119 | Jul 2015 | US | |
62194121 | Jul 2015 | US | |
62194127 | Jul 2015 | US | |
62194131 | Jul 2015 | US | |
62202747 | Aug 2015 | US | |
62202744 | Aug 2015 | US | |
62205555 | Aug 2015 | US | |
62205569 | Aug 2015 | US | |
62205548 | Aug 2015 | US | |
62205539 | Aug 2015 | US | |
62207858 | Aug 2015 | US | |
62214824 | Sep 2015 | US | |
62214826 | Sep 2015 | US | |
62292084 | Feb 2016 | US | |
62302713 | Mar 2016 | US | |
62302567 | Mar 2016 | US | |
62302547 | Mar 2016 | US | |
62303021 | Mar 2016 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15061406 | Mar 2016 | US |
Child | 16059431 | US |