The present disclosure relates to automated guided vehicles (AGVs). In a more particular example, the present disclosure relates to automated guided vehicles that automatically transport objects between locations.
Storage and retail facilities often rely on human workers to transport objects such as containers between different locations, and in some situations, to put away items stored in these containers to their storage places. For example, a human worker may manually transport multiple containers from a shelving unit to a floor surface in front of storage aisles, pick up a container, and put away various items stored in the container to their storage places in the storage aisles. As the items in the container may be placed in different storage aisles, the human worker may need to manually carry the container or push a cart including the container while walking through multiple storage aisles to place the items in their corresponding storage places. This approach is generally inefficient and often quickly fatigues the human workers due to the physical effort required to unload the containers from the shelving unit, as well as to manually carry the container or push the cart while walking back and forth between multiple storage aisles to put away the items. In addition, the human worker may need to carry a container in hand up a ladder in order to reach high storage locations at which to place items from the container. It is usually inconvenient and potentially dangerous for the human worker to carry the container and perform these movements at the same time.
An automated guided vehicle (AGV) can be configured to perform particular operations or actions by virtue of having software, firmware, hardware, or a combination thereof installed on the AGV that may cause the AGV to perform the operations or actions. According to one innovative aspect of the subject matter described in this disclosure, an AGV includes: an AGV body; one or more elevator mechanisms coupled to the AGV body; a support surface coupled to the AGV body and vertically movable along the AGV body by the one or more elevator mechanisms, the support surface supporting an object from underneath the object when the object is placed on the support surface; one or more arms coupled to the AGV body and vertically movable along the AGV body by the one or more elevator mechanisms, the one or more arms articulating to move the object from a first position of the object and place the object on the support surface; and an AGV controller configured to control one or more operations of the AGV.
In general, another innovative aspect of the subject matter described in this disclosure may be embodied in methods comprising: determining a first position of an object in a first area; aligning an automated guided vehicle (AGV) with the first position of the object, the AGV including a support surface and one or more arms coupled to an AGV body, the support surface and the one or more arms being vertically movable along the AGV body; vertically moving the support surface of the AGV along the AGV body to a first surface position relative to the first position of the object; vertically moving the one or more arms of the AGV along the AGV body to a first arm position relative to the first position of the object; retrieving the object from the first position of the object using the one or more arms, the one or more arms holding the object; and moving the object towards the support surface by the one or more arms holding the object to place the object on the support surface.
In general, another innovative aspect of the subject matter described in this disclosure may be embodied in an AGV including: an AGV body; one or more elevator mechanisms coupled to the AGV body; a support surface coupled to the AGV body and vertically movable along the AGV body by the one or more elevator mechanisms, the support surface supporting an object from underneath the object when the object is placed on the support surface; a plurality of arms coupled to the AGV body and vertically movable along the AGV body by the one or more elevator mechanisms, the plurality of arms articulating to move the object from a first position of the object and place the object on the support surface, an arm in the plurality of arms including a hand element comprising a paddle slideable along the hand element and a finger element mounted to an end of the paddle, the paddle adapted to hold the object between a plurality of paddles of the plurality of arms, the finger element adapted to pivot to rest against a second object surface of the object to grasp the object, a wrist element coupling the hand element to the arm, the wrist element adapted to horizontally pivot the hand element around the wrist element to adjust a distance between the plurality of paddles of the plurality of arms based on the object, and a shoulder element coupling the arm to the AGV body, the shoulder element adapted to horizontally pivot the arm around the shoulder element to adjust the distance between the plurality of paddles of the plurality of arms based on the object; one or more legs coupled to the AGV body, the one or more legs including one or more multi-directional wheels adapted to move the AGV; and an AGV controller configured to control one or more operations of the AGV.
These and other implementations of the AGV may each optionally include one or more of the following features: that the one or more arms includes a plurality of arms and an arm in the plurality of arms includes a hand element comprising a paddle slideable along the hand element, the paddle adapted to hold the object between a plurality of paddles of the plurality of arms, the paddle sliding along the hand element when the object is held between the plurality of paddles of the plurality of arms to move the object; that the hand element includes a finger element mounted to an end of the paddle, the finger element being pivotable to rest against a second object surface of the object to grasp the object; that the one or more arms includes a plurality of arms and an arm in the plurality of arms includes a shoulder element coupling the arm to the AGV body, the shoulder element adapted to horizontally pivot the arm around the shoulder element to adjust a distance between a plurality of paddles of the plurality of arms based on the object, and a wrist element coupling a hand element of the arm to the arm, the wrist element adapted to horizontally pivot the hand element around the wrist element to adjust the distance between the plurality of paddles of the plurality of arms based on the object; that the AGV transports the object to or from a shelving unit, and the AGV includes one or more legs coupled to the AGV body, the one or more legs satisfying a height threshold associated with a space underneath the shelving unit; a plurality of legs coupled to the AGV body and separated by a distance satisfying a distance threshold to accommodate the object between the plurality of legs; a user communication unit including a voice interface device, the voice interface device adapted to receive one or more of a voice command and a voice input from a user; that the AGV controller executes one or more instructions that cause the AGV to align the AGV with the first position of the object, vertically move the support surface along the AGV body to a first surface position relative to the first position of the object, vertically move the one or more arms along the AGV body to a first arm position relative to the first position of the object, retrieve the object from the first position of the object using the one or more arms, the one or more arms holding the object, and move the object towards the support surface by the one or more arms holding the object to place the object on the support surface.
These and other implementations of the method may each optionally include one or more of the following features: determining an object size of the object, adjusting the one or more arms based on the object size of the object, and responsive to moving the support surface to the first surface position and the one or more arms to the first arm position and responsive to adjusting the one or more arms, retrieving the object from the first position of the object using the one or more arms; that an arm in the one or more arms includes a shoulder element coupling the arm to the AGV body, the arm being horizontally pivotable around the shoulder element, a hand element including a paddle slideable along the hand element and a finger element mounted to an end of the paddle, the finger element being pivotable to rest against an object surface of the object, and a wrist element coupling the hand element to the arm, the hand element being horizontally pivotable around the wrist element; that the one or more arms include a plurality of arms, and determining an object size of the object and horizontally pivoting one or more of the arm and the hand element of the arm to adjust a distance between a plurality of paddles of the plurality of arms based on the object size of the object; that the one or more arms include a plurality of arms, and retrieving the object from the first position of the object and moving the object towards the support surface includes sliding the paddle forward to hold the object between a plurality of paddles of the plurality of arms, pivoting the finger element to rest the finger element against a second object surface of the object to grasp the object, and sliding the paddle backward along the hand element to move the object being held by the plurality of paddles of the plurality of arms towards the support surface and place the object on the support surface; determining the first surface position for the support surface based on the first position of the object, a distance between the first surface position of the support surface and a bottom surface of the object located at the first position satisfying a first distance threshold, and determining the first arm position for the one or more arms based on the first position of the object, a distance between the first arm position of the one or more arms and a top surface of the object located at the first position satisfying a second distance threshold; responsive to placing the object on the support surface of the AGV, transporting the object situated on the support surface to a second area, determining a second position for the object in the second area, aligning the AGV with the second position of the object, vertically moving the support surface of the AGV along the AGV body to a second surface position relative to the second position of the object, vertically moving the one or more arms of the AGV along the AGV body to a second arm position relative to the second position of the object, and moving the object situated on the support surface of the AGV by the one or more arms holding the object to place the object at the second position of the object; determining the second surface position for the support surface based on the second position of the object, a distance between the second surface position of the support surface and the second position of the object satisfying a first distance threshold, determining a moving direction and a moving distance for the support surface to move to the second surface position, 
and determining the second arm position for the one or more arms based on the moving direction and the moving distance of the support surface; that the one or more arms include a plurality of arms and moving the object to the second position of the object includes pivoting a finger element of an arm away from an object surface of the object against which the finger element rested, and sliding a paddle forward along a hand element of the arm to move the object being held by the plurality of arms from the support surface of the AGV to the second position of the object; monitoring a tracking device associated with a user, the tracking device indicating a current position of the user in an operating environment, placing a first object on the support surface of the AGV, and responsive to placing the first object on the support surface of the AGV, maintaining a following distance between the AGV and the tracking device to automatically follow the user in the operating environment; responsive to determining that a height distance of the current position of the user to a floor surface satisfies a height threshold, determining a surface position for the support surface based on the current position of the user, and vertically moving the support surface of the AGV on which the object is situated along the AGV body to the surface position.
Other implementations of one or more of these aspects include corresponding systems, apparatus, and computer programs, configured to perform the actions of the methods, encoded on computer storage devices.
It should be understood that the language used in the present disclosure has been principally selected for readability and instructional purposes, and not to limit the scope of the subject matter disclosed herein.
The disclosure is illustrated by way of example, and not by way of limitation in the figures of the accompanying drawings in which like reference numerals are used to refer to similar elements.
The technology presented in this disclosure improves upon the existing approaches and is advantageous in a number of respects. For example, the automated guided vehicle (AGV) described herein can replace or assist the human workers in the process of transporting objects such as containers between different locations and also facilitate the human workers in the process of putting away the items stored in the containers. As discussed elsewhere herein, the AGV may retrieve an object to be transported using one or more grabbing arms and place the object onto a support surface. Thus, the object may be supported from underneath the object by the support surface during its transportation. These implementations may be advantageous, because they may avoid dropping the object even if one or more components of the one or more grabbing arms holding the object encounter an unexpected failure. Similarly, these implementations may also decrease the wear and motor power consumed by the components that form the grabbing arm, because, in some implementations, the one or more grabbing arms move the object to or from the support surface on which the object is situated, which may assist the grabbing arms and/or support the object during movement of the grabbing arms and/or AGV.
As a further example, in some embodiments, the AGV may include a plurality of grabbing arms and the plurality of grabbing arms may be capable of moving along an AGV body of the AGV and adjusting a distance between the plurality of grabbing arms based on the object size of the object. As a result, in some embodiments, the AGV may be able to carry various objects having different shapes and sizes and transport them to and from various positions that are at different heights from the floor surface. Furthermore, each of the plurality of grabbing arms of the AGV may include a paddle slideable forward and backward along a hand element. The paddle may slide forward against an object surface of the object to hold the object between a plurality of paddles of the plurality of grabbing arms. Thus, for the objects that are placed on a shelving unit, the plurality of grabbing arms may be extended forward into the shelving unit with the paddles, and therefore an object located proximate to a back side of the shelving unit can be reached and retrieved by the plurality of grabbing arms of the AGV. As discussed elsewhere herein, the AGV may also include one or more legs that can fit into a space underneath the shelving unit, and thus the AGV can be positioned in close proximity to the shelving unit to facilitate its operation with the objects placed thereon.
In addition, as discussed elsewhere herein, the AGV may retrieve an object, place the object on its support surface, and automatically follow a user in the operating environment with the object situated on its support surface. These implementations may be advantageous, because they may eliminate the need for the user to manually carry the object or push the cart containing the object as the user proceeds in the operating environment. In some situations, the user may move from a position at which the user stands on a floor surface to a higher position or a lower position in order to reach the target position of the object or the target position of the item stored in the object (e.g., a user may climb a ladder to reach a high shelf). In these situations, the AGV may automatically raise or lower its support surface on which the object is situated to a position proximate to the current position of the user, thereby facilitating the user in reaching the object. For instance, this embodiment may be advantageous because it may eliminate or limit the need for the user to manually carry the object while moving to the higher position or to the lower position, thereby improving the safety of the user when performing these movements. Furthermore, some embodiments of the AGV described herein may also be capable of displaying information to the user via a visual interface device and receiving voice commands and/or voice inputs from the user via a voice interface device. As a result, the communication between the user and the AGV, as well as communication between the user and the server via the AGV, can be facilitated.
The technology described herein includes an example AGV that automatically transports objects between different locations in an operating environment. In some embodiments, the AGV may be a robotic vehicle operating in a storage facility. For example, the AGV may transport containers storing various inventory items between different locations in the storage facility. As the AGV can transport a container, the AGV may be referred to herein as the container-transport AGV, although it should be understood that the AGV may transport other types of objects (e.g., inventory items, facility equipment, etc.). In some embodiments, a container may be an object that is capable of storing items. In some embodiments, to place the items into the container, the management system of the storage facility (e.g., the management system implemented on a server, as described below) may reference the planogram that specifies the items stored in each storage aisle of the storage facility, and manage the placement of the items into the containers such that the items belonging to the same storage aisle may be placed in the same container. The storage aisle where the items in the container are stored may be referred to as the storage aisle associated with the container. Non-limiting examples of the container include a storage box, a tote, a pallet, a mini pallet, etc.
In some embodiments, the containers may be stackable. For example, one container may be placed on a top surface of another container to form a container stack including multiple containers. In some situations, the containers may be placed on a shelving unit. The shelving unit may include one or more shelves on which the containers are situated. In some embodiments, when the shelving unit is positioned on a floor surface, the shelving unit may have a space underneath the shelving unit that is in between the bottom surface of the shelving unit and the floor surface (e.g., the ground floor, the vehicle floor of the delivery vehicle, etc.).
An example container-transport AGV 100 is depicted in
As illustrated in
As discussed elsewhere herein, the container-transport AGV 100 may include one or more grabbing arms 120 adapted to retrieve and release the object being transported by the container-transport AGV 100. As illustrated in
In some embodiments, to couple the grabbing arm 120 to the AGV body 102, the shoulder element 122 may couple the grabbing arm 120 to the corresponding elevator mechanism 106, and thus the elevator mechanism 106 may vertically move the grabbing arm 120 upward and downward along the AGV body 102 via the shoulder element 122. In some embodiments, the shoulder element 122 may also be adapted to horizontally pivot the grabbing arm 120 around the shoulder element 122 to adjust the distance between the hand elements 124 of the plurality of grabbing arms 120 based on the object being transported (e.g., based on dimensions, weight, configuration, etc. of the object). In some embodiments, the shoulder element 122 may be motorized with one or more motors that can be actuated to horizontally pivot the grabbing arm 120. In some embodiments, the plurality of grabbing arms 120 may pivot relative to one another. For example, the plurality of grabbing arms 120 may horizontally pivot around the corresponding shoulder elements 122 with the same rotation angle (e.g., 30°) but in opposite directions (e.g., towards the left versus towards the right relative to the AGV body 102). Thus, the plurality of grabbing arms 120 may collaboratively operate to hold and move the object, thereby retrieving the object from the first position of the object or releasing the object to the second position of the object.
In some embodiments, the hand element 124 may include a paddle 132 that is slideable along the hand element 124. To illustrate the components and the operations of the hand element 124,
In some embodiments, as the paddle 132 slides forward along the channel 134, the paddle 132 may slide along an object surface of the object to be transported and rest against the object surface of the object. Thus, as the plurality of paddles 132 of the plurality of grabbing arms 120 slide forward against different object surfaces of the object, the object may be held between the plurality of paddles 132 of the plurality of grabbing arms 120. In some embodiments, as the object is held between the plurality of paddles 132 of the plurality of grabbing arms 120, the plurality of paddles 132 of the plurality of grabbing arms 120 may slide forward or backward along their corresponding channel 134 to move the object. These implementations of the slideable paddle 132 may be advantageous, because they may allow the hand element 124 to be extended with the paddle 132 to reach the position of the object that the container-transport AGV 100 cannot approach due to the shape and size of the AGV body 102 (e.g., the object may be located at the back side of a shelving unit). While some embodiments of the container-transport AGV 100 are described as having a paddle 132, some embodiments may additionally or alternatively include a suction device (e.g., suction cup) that adheres to a surface of the object to retrieve and hold the object. Other implementations of the paddle 132 and/or the hand element 124 are also possible and contemplated.
In some embodiments, as depicted in
In some embodiments, to grasp the object with the plurality of grabbing arms 120, the paddle 132 of the grabbing arm 120 may slide forward and rest against a first object surface of the object, and the finger element 128 of the grabbing arm 120 may bend at a folding angle and rest against a second object surface of the object. As the plurality of grabbing arms 120 perform these operations, the object may be securely gripped with the paddle 132 and the finger element 128 of the plurality of grabbing arms 120. In some embodiments, to release the object being held between the plurality of grabbing arms 120, the finger element 128 of the grabbing arm 120 may bend away from the second object surface of the object. As the finger element 128 no longer rests against the second object surface of the object, the object may not be gripped and the backward movement of the paddle 132 along the hand element 124 after the object is released from the paddles 132 can be facilitated. In some embodiments, the hand element 124 of the grabbing arm 120 may include multiple finger elements 128 bendable against multiple second object surfaces of the object. The finger element 128 may also be bendable at various folding angles to flexibly grasp various objects having different shapes and sizes. Other implementations of the finger element 128 are also possible and contemplated.
As depicted in
To illustrate the operations of the grabbing arm 120,
As depicted in
As discussed elsewhere herein, the container-transport AGV 100 may include the support surface 140 coupled to the AGV body 102 and vertically movable upward and downward along the AGV body 102. As depicted in
To illustrate the operations of the support surface 140,
As discussed elsewhere herein, the container-transport AGV 100 may include the one or more legs 110 coupled to the AGV body 102. As depicted in
As depicted in
As depicted in
In some embodiments, the container-transport AGV 100 may include a guidance unit (not shown) to reposition or navigate the container-transport AGV 100 in the operating environment. For example, the container-transport AGV 100 may use the guidance unit to reposition itself relative to the first position from which the object is retrieved or the second position to which the object is transported. In another example, the container-transport AGV 100 may use the guidance unit to navigate from a first area to a second area to transport an object from an initial position of the object in the first area to a new position of the object in the second area. In some embodiments, the guidance unit may include one or more sensors. Example sensors of the guidance unit include, but are not limited to, vision sensors (e.g., camera, etc.), reader devices (e.g., marker scanner, etc.), etc. Other types of sensors are also possible and contemplated.
In some embodiments, the vision sensors may be image sensors capable of recording images (e.g., video images and still images), recording frames of a video stream, etc. In some embodiments, the vision sensors may be mounted on the container-transport AGV 100 and may capture images of surrounding environments within their sensor range. In some embodiments, the container-transport AGV 100 may analyze the captured images to determine its current position and/or to detect various objects present in the surrounding environment (e.g., containers, human workers, other AGVs, etc.). The container-transport AGV 100 may then adaptively reposition itself relative to the detected object (e.g., align with the container placed on the shelving unit), and/or determine a travel path to navigate through the detected objects (e.g., avoid collision with human workers, other AGVs, etc.) and then proceed accordingly.
In some embodiments, the reader devices may be optical scanners capable of performing read operations to read graphic markers. Non-limiting examples of the graphic marker include a barcode, a Quick Response (QR) code, a Radio Frequency Identification (RFID) label, etc. In some embodiments, a graphic marker may be attached to various objects and/or various locations in the operating environment. For example, graphic markers may be attached to containers, shelving units, inventory items, AGVs, etc. In another example, graphic markers may be attached to the storage slots at which containers may be placed on a shelving unit, the designated positions at which containers may be placed on the floor surface in front of a storage aisle, etc. In some embodiments, the reader devices may read a graphic marker to obtain the unique ID of an object and/or the location to which the graphic marker is attached. The container-transport AGV 100 may then use these unique IDs to identify the object to be transported, align itself to the position of the object, navigate between different locations in the operating environment, etc.
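By way of illustration only, the mapping from a scanned graphic marker to the object or location it identifies may be sketched as a simple lookup, as in the following Python snippet; the marker payload strings and the lookup table are assumptions introduced for this sketch and are not prescribed by the guidance unit described above.

```python
# Illustrative sketch only: marker payloads and the lookup table are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class MarkerRecord:
    kind: str        # e.g., "container", "storage_slot", "floor_position"
    unique_id: str   # e.g., the container ID or slot ID encoded in the marker

# Hypothetical table mapping raw marker payloads to the entities they identify.
MARKER_TABLE = {
    "QR:C030387": MarkerRecord(kind="container", unique_id="C030387"),
    "QR:S115": MarkerRecord(kind="storage_slot", unique_id="S115"),
}

def resolve_marker(payload: str) -> Optional[MarkerRecord]:
    """Map a payload read by the reader device to the object or location it tags."""
    return MARKER_TABLE.get(payload)

if __name__ == "__main__":
    record = resolve_marker("QR:C030387")
    if record is not None:
        print(f"Marker identifies {record.kind} {record.unique_id}")
```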
The processor 152 may execute software instructions by performing various input, logical, and/or mathematical operations. The processor 152 may have various computing architectures to process data signals including, for example, a complex instruction set computer (CISC) architecture, a reduced instruction set computer (RISC) architecture, and/or an architecture implementing a combination of instruction sets. The processor 152, which may include one or more processors, may be physical and/or virtual, and may include a single core or a plurality of processing units and/or cores. In some embodiments, the processor 152 may be capable of generating and providing electronic signals to other computing entities (e.g., server, other AGVs, etc.), performing complex tasks such as image processing, AGV alignment and/or navigation, etc. In some embodiments, the processor 152 may be coupled to the memory 154 via the bus 190 to access data and instructions therefrom and store data therein. The bus 190 may couple the processor 152 to the other components of the computing device 150 including, for example, the AGV controller 160, the memory 154, the data store 156, the communication unit 158, and/or the input/output devices 170.
The memory 154 may store and provide access to data to the other components of the computing device 150. The memory 154 may be included in a single computing device or a plurality of computing devices. In some embodiments, the memory 154 may store instructions and/or data that may be executed by the processor 152. For example, the memory 154 may store computer logic executed by the AGV controller 160, depending on the configuration. The memory 154 is also capable of storing other instructions and data, including, for example, an operating system, hardware drivers, software applications, databases, etc. The memory 154 may be coupled to the bus 190 for communication with the processor 152 and other components of computing device 150. The memory 154 may include a non-transitory computer-usable (e.g., readable, writeable, etc.) medium, which can be any non-transitory apparatus or device that can contain, store, communicate, propagate, or transport instructions, data, computer programs, software, code, routines, etc., for processing by or in connection with the processor 152. In some embodiments, the memory 154 may include one or more of volatile memory and non-volatile memory (e.g., RAM, ROM, hard disk, optical disk, etc.). It should be understood that the memory 154 may be a single device or may include multiple types of devices and configurations.
The bus 190 may include a communication bus for transferring data between components of a computing device or between computing devices, a network bus system including a network or portions thereof, a processor mesh, a combination thereof, etc. In some embodiments, the server, the AGV controller 160, and various other components operating on the computing device 150 (operating systems, device drivers, etc.) may cooperate and communicate via a communication mechanism included in or implemented in association with the bus 190. The software communication mechanism may include and/or facilitate, for example, inter-method communication, local function or procedure calls, remote procedure calls, an object broker (e.g., CORBA), direct socket communication (e.g., TCP/IP sockets) among software modules, UDP broadcasts and receipts, HTTP connections, etc. In some embodiments, any or all of the communication can be secure (e.g., SSH, HTTPS, etc.).
The communication unit 158 may include one or more interface devices (I/F) for wired and wireless connectivity among the computing entities of the system (e.g., the server, the AGVs, etc.). For example, the communication unit 158 may include, but is not limited to, various types of known connectivity and interface options. The communication unit 158 may be coupled to other components of the computing device 150 via the bus 190. The communication unit 158 may be coupled to a network (e.g., the Internet, an intranet, etc.), depending on the configuration. In some embodiments, the communication unit 158 may link the processor 152 to the network, which may in turn be coupled to other processing systems. The communication unit 158 may provide other connections to a network, servers, and/or computing devices using various standard communication protocols.
The data store 156 may include a non-transitory storage medium that stores various types of data and provides access to the data. The data stored by the data store 156 may be organized and queried using various criteria. For example, the data store 156 may include data tables, databases, or other organized collections of data. In some embodiments, the data store 156 may be included in the computing device 150 or in another computing system and/or storage system distinct from but coupled to or accessible by the computing device 150. In some embodiments, the data store 156 may be incorporated with the memory 154 or may be distinct therefrom. In some embodiments, the data store 156 may store data associated with a database management system (DBMS) operable on the computing device 150. For example, the DBMS could include a structured query language (SQL) DBMS, a NoSQL DBMS, various combinations thereof, etc. In some instances, the DBMS may store data in multi-dimensional tables comprised of rows and columns, and manipulate, e.g., insert, query, update and/or delete, rows of data using programmatic operations.
In some embodiments, the data stored by the data store 156 may include, but is not limited to, AGV data, container data, location data, planogram data, map data, etc. For example, the data store 156 may store the AGV ID of the container-transport AGV 100, the container ID of the container transported by the container-transport AGV 100, the first position in the first area from which the container is retrieved, the second position in the second area to which the container is transported, the location attributes of the first position of the object and the second position of the object, etc. In some embodiments, the location attributes of the first position of the object or the second position of the object may indicate the distances between these positions and one or more points of reference (e.g., distance to the floor surface, distance to an end point of the shelving unit, etc.). In some embodiments, the data store 156 may also store the planogram and the facility map describing the operating environment. In the context of the storage facility, the planogram may specify the items stored in each storage aisle of the storage facility and the storage location of the items within the storage aisle. The planogram may also indicate the location attributes of these storage locations. In some embodiments, the facility map may indicate various areas of the storage facility (e.g., replenishment area, designated area for placing the shelving units to be unloaded, designated area for placing the containers in front of each storage aisle, etc.). Other types of data are also possible and contemplated.
The input/output devices 170 may include any device for inputting and outputting information to and from the computing device 150. The input/output devices 170 may be coupled to the computing device 150 directly or through intervening Input/Output (I/O) controllers. As depicted in
As depicted in
The AGV controller 160 may include computer logic executable by the processor 152 to control one or more operations of the container-transport AGV 100 in the operating environment. For example, the AGV controller 160 may control the operations of the container-transport AGV 100 to transport the container between different locations in the storage facility. In some embodiments, the AGV controller 160 may be implemented in the container-transport AGV 100. Alternatively, the AGV controller 160 may be implemented in a server. In some embodiments, the AGV controller 160 may be implemented using software executable by one or more processors of one or more computer devices, using hardware, such as but not limited to a field-programmable gate array (FPGA), an application-specific integrated circuit (ASIC), etc., and/or a combination of hardware and software, etc.
In some embodiments, the AGV controller 160 may communicate with other components of the computing device 150 via the bus 190 and/or the processor 152, and communicate with other entities of the system via the network. In some embodiments, the AGV controller 160 may be a set of instructions executable by the processor 152 to provide its functionality. In further embodiments, the AGV controller 160 may be storable in the memory 154 and accessible and executable by the processor 152 to provide its functionality. In any of the foregoing embodiments, the AGV controller 160 may be adapted for cooperation and communication with the processor 152 and other components of the computing device 150. For example, the AGV controller 160 may receive input data from one or more input/output devices 170 (e.g., a captured image, graphic marker ID, container ID, target position of the container, etc.), process the input data, and provide one or more outputs via the visual interface device 176 and the voice interface device 178 (e.g., voice message confirming user input, execution status of user command, etc.). The AGV controller 160 is described in additional detail below with reference to at least
The network 220 may be a conventional type, wired and/or wireless, and may have numerous different configurations including a star configuration, token ring configuration, or other configurations. Furthermore, the network 220 may include one or more local area networks (LAN), wide area networks (WAN) (e.g., the Internet), personal area networks (PAN), cellular networks, public networks, private networks, virtual networks, virtual private networks, peer-to-peer networks, near field networks (e.g., Bluetooth, NFC, etc.), and/or other interconnected data paths across which multiple devices may communicate.
The network 220 may also be coupled to or include portions of a telecommunications network for sending data in a variety of different communication protocols. Example protocols include, but are not limited to, transmission control protocol/Internet protocol (TCP/IP), user datagram protocol (UDP), transmission control protocol (TCP), hypertext transfer protocol (HTTP), secure hypertext transfer protocol (HTTPS), dynamic adaptive streaming over HTTP (DASH), real-time streaming protocol (RTSP), real-time transport protocol (RTP) and the real-time transport control protocol (RTCP), voice over Internet protocol (VOIP), file transfer protocol (FTP), WebSocket (WS), wireless access protocol (WAP), various messaging protocols (SMS, MMS, XMS, IMAP, SMTP, POP, WebDAV, etc.), or other suitable protocols. In some embodiments, the network 220 is a wireless network using a connection such as DSRC (Dedicated Short Range Communication), WAVE, 802.11p, a 3G, 4G, 5G+ network, WiFi™, satellite networks, or other suitable networks. Although
As depicted in
The server 250 includes a hardware and/or virtual server that includes a processor, a memory, and network communication capabilities (e.g., a communication unit). In some embodiments, the server 250 may implement a management system managing various operations in the storage facility. For example, the management system may manage the placement of inventory items into various containers. The server 250 may be communicatively coupled to the network 220, as reflected by signal line 202. In some embodiments, the server 250 may send and receive data to and from other entities of the system 200 (e.g., the container-transport AGVs 100, the shelf-transport AGVs 240, etc.). As depicted, the server 250 may include an instance of the AGV controller 160a. The server 250 may also include a data store 252 that stores various types of data for access and/or retrieval by the AGV controller 160a. In some embodiments, the data stored by the data store 252 may include, but is not limited to, planogram data, item data, container data, AGV data, etc. In some embodiments, the item data of an item may include item ID, item quantity, item order velocity, container IDs of one or more containers storing the item, etc. The container data of a container may include container ID, item ID and item quantity of one or more items stored in the container, container position of the container (e.g., storage aisle ID, slot ID), location attributes of the container position (e.g., distance from the container position to one or more points of reference, etc.), etc. In some embodiments, the AGV data of the AGV may include AGV ID, AGV position of the AGV, container ID of one or more containers transported by the AGV, execution status of the AGV task performed by the AGV, etc. Other types of data are also possible and contemplated.
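For illustration, the item data, container data, and AGV data described above could be represented with simple record structures such as the following Python data classes; the specific field names are assumptions introduced for this sketch rather than a required schema for the data store 252.

```python
# Illustrative data classes for the records described above; field names are assumptions.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ItemData:
    item_id: str
    quantity: int
    order_velocity: float                  # e.g., units ordered per day
    container_ids: List[str] = field(default_factory=list)

@dataclass
class ContainerData:
    container_id: str
    items: Dict[str, int]                  # item ID -> quantity stored in the container
    aisle_id: str
    slot_id: str
    location_attributes: Dict[str, float]  # e.g., {"distance_to_floor_cm": 120.0}

@dataclass
class AGVData:
    agv_id: str
    position: str                          # e.g., current floor-marker or slot reference
    container_ids: List[str] = field(default_factory=list)
    task_status: str = "idle"              # e.g., "in_progress", "completed"
```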
As illustrated in
In block 302, the AGV controller 160 may determine a first position of a container in a first area of the storage facility. In some embodiments, the AGV controller 160 may navigate the container-transport AGV 100 to the first area in which one or more containers are placed, determine a container to be transported from among the one or more containers, and determine the first position at which the container is located. In some embodiments, to determine the container to be transported, the reader device 174 of the container-transport AGV 100 may scan graphic markers (e.g., QR codes) attached to the containers or the storage slots that the containers occupy to obtain the container IDs of the containers and/or the slot IDs of the storage slots. The AGV controller 160 may then determine the container to be transported among these containers based on the container IDs and/or the slot IDs. In some embodiments, the AGV controller 160 may select, as the container to be transported, the container that has the shortest distance to a point of reference. As an example, the container-transport AGV 100 may unload multiple containers from a shelving unit to the floor surface. In this example, the AGV controller 160 may designate an end point of the shelving unit as the point of reference, identify the container on the shelving unit that has the shortest distance to that end point, and select this container as the container to be transported.
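The selection rule described above (choosing the container with the shortest distance to the designated point of reference) may be summarized, purely as a sketch, by the following Python snippet; the candidate record structure and the distance values are illustrative assumptions.

```python
# Sketch of selecting the container nearest a designated point of reference.
# The record structure and distance values are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class CandidateContainer:
    container_id: str
    slot_id: str
    distance_to_reference_cm: float   # distance to the designated end point of the shelving unit

def select_container(candidates: List[CandidateContainer]) -> CandidateContainer:
    """Return the candidate with the shortest distance to the point of reference."""
    return min(candidates, key=lambda c: c.distance_to_reference_cm)

if __name__ == "__main__":
    scanned = [
        CandidateContainer("C030387", "S112", 45.0),
        CandidateContainer("C030512", "S113", 90.0),
    ]
    print(select_container(scanned).container_id)  # -> C030387
```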
In some embodiments, to determine the first position of the container, the AGV controller 160 may retrieve the first position at which the container is located and the location attributes of the first position from the data store 156 and/or the data store 252. Additionally or alternatively, the AGV controller 160 may determine the first position of the container and its location attributes based on the sensor data captured by one or more sensors 172 of the container-transport AGV 100. For example, the vision sensor of the container-transport AGV 100 may capture one or more images of the container. The AGV controller 160 may analyze the captured images and determine the first position of the container and its location attributes based on the captured images and the sensor position of the vision sensor that captured these images. Other implementations for determining the first position of the container and its location attributes are also possible and contemplated.
In block 304, the AGV controller 160 may align the container-transport AGV 100 with the container in the first area. To align the container-transport AGV 100 with the container, the AGV controller 160 may reposition the container-transport AGV 100 relative to the first position of the container such that the container-transport AGV 100 may be aligned to a surface of the container. In some embodiments, the alignment position at which the container-transport AGV 100 is aligned with the container may be indicated by a first graphic marker attached to the container, the storage slot on the shelving unit, or the floor surface, etc., such that when the reader device 174 of the container-transport AGV 100 detects the first graphic marker (e.g., QR code), the container-transport AGV 100 is aligned with the container. In some embodiments, the vision sensor of the container-transport AGV 100 may capture one or more images of the container from the perspective of the container-transport AGV 100. The AGV controller 160 may determine the alignment position relative to the container based on the captured images, and relocate the container-transport AGV 100 to the alignment position, thereby aligning the container-transport AGV 100 with the container. Other implementations for aligning the container-transport AGV 100 with the container are also possible and contemplated.
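As one non-limiting illustration of the alignment step, the AGV controller 160 could reposition the container-transport AGV 100 in small increments until the reader device 174 reports the expected alignment marker. The sketch below assumes hypothetical reader and drive interfaces (reader.read() and drive.step_lateral()) that are not defined by this disclosure.

```python
# Hypothetical alignment loop: reposition until the expected marker is detected.
import time

def align_with_marker(reader, drive, expected_marker: str,
                      step_cm: float = 2.0, timeout_s: float = 30.0) -> bool:
    """Nudge the AGV laterally until `reader` reports `expected_marker`.

    `reader.read()` and `drive.step_lateral()` are assumed interfaces for the
    reader device and the drive actuators, respectively.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if reader.read() == expected_marker:
            return True                  # aligned with the container or storage slot
        # Small lateral correction; a real implementation would also use the
        # vision sensor to choose the direction of the correction.
        drive.step_lateral(step_cm)
    return False                         # alignment not achieved within the timeout
```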
Continuing the example depicted in
In block 306, the AGV controller 160 may determine a first surface position for the support surface 140 and a first arm position for the plurality of grabbing arms 120 of the container-transport AGV 100 based on the first position of the container. In some embodiments, when the container is located at the first position, the AGV controller 160 may determine the first surface position for the support surface 140 that has the distance between the first surface position and the bottom surface of the container satisfying a first distance threshold (e.g., less than 0.5 cm). Thus, the support surface 140 may be moved to the first surface position that is proximate to the bottom surface of the container, and therefore the movement of the container from the first position of the container to the support surface 140 of the container-transport AGV 100 can be facilitated. In some embodiments, when the container is located at the first position, the AGV controller 160 may determine the first arm position for the plurality of grabbing arms 120 that has the distance between the first arm position and the top surface of the container satisfying a second distance threshold (e.g., more than 3 cm). Thus, the plurality of grabbing arms 120 may be moved to the first arm position that is at a certain distance from the top surface of the container, and therefore the likelihood of the container being dropped when the plurality of grabbing arms 120 hold the container may be reduced.
In block 308, the AGV controller 160 may actuate the elevator mechanism 104 to vertically move the support surface 140 along the AGV body 102 of the container-transport AGV 100 to the first surface position relative to the first position of the container. Similarly, the AGV controller 160 may actuate the elevator mechanisms 106 to vertically move the plurality of grabbing arms 120 along the AGV body 102 of the container-transport AGV 100 to the first arm position relative to the first position of the container. Thus, the support surface 140 and the plurality of grabbing arms 120 may vertically move towards the first position at which the container is located.
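Blocks 306 and 308 may be summarized numerically: the support surface 140 is sent to within the first distance threshold of the container's bottom surface, and the plurality of grabbing arms 120 are offset from the container's top surface by at least the second distance threshold. A minimal sketch, using the illustrative threshold values given above and assuming heights measured from the floor surface, follows.

```python
# Sketch of blocks 306 and 308: compute vertical targets for the support surface
# and the grabbing arms from the container's first position. Threshold values and
# the assumption that heights are measured from the floor surface are illustrative.
from dataclasses import dataclass

SURFACE_GAP_MAX_CM = 0.5    # first distance threshold: support surface to container bottom
ARM_CLEARANCE_MIN_CM = 3.0  # second distance threshold: arms to container top

@dataclass
class ContainerPose:
    bottom_height_cm: float  # height of the container's bottom surface above the floor
    top_height_cm: float     # height of the container's top surface above the floor

def first_surface_position(pose: ContainerPose) -> float:
    """Place the support surface just below the container bottom, within the threshold."""
    return pose.bottom_height_cm - SURFACE_GAP_MAX_CM

def first_arm_position(pose: ContainerPose) -> float:
    """Offset the arms from the container top by at least the clearance threshold.

    Whether the offset is above or below the top surface depends on the arm
    geometry; this sketch assumes the paddles grip below the container's top edge.
    """
    return pose.top_height_cm - ARM_CLEARANCE_MIN_CM

if __name__ == "__main__":
    pose = ContainerPose(bottom_height_cm=90.0, top_height_cm=120.0)
    print(first_surface_position(pose), first_arm_position(pose))  # -> 89.5 117.0
```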
Continuing the example depicted in
In block 310, the AGV controller 160 may determine the object size of the container. The object size of the container may indicate one or more dimensions of the container (e.g., width, length, height, diameter, axis length, etc.). In some embodiments, the AGV controller 160 may retrieve the object size of the container from the data store 156 and/or the data store 252. Alternatively, the AGV controller 160 may determine the object size of the container based on the sensor data captured by one or more sensors 172 of the container-transport AGV 100. For example, the vision sensor of the container-transport AGV 100 may capture one or more images of the container. The AGV controller 160 may analyze the captured images, and determine the object size of the container based on the captured images. In another example, the container-transport AGV 100 may determine the object size of the container using dimension sensors integrated in the plurality of grabbing arms 120. Other implementations for determining the object size of the container are also possible and contemplated.
In block 312, the AGV controller 160 may adjust one or more grabbing arms 120 of the container-transport AGV 100 based on the object size of the container. In some embodiments, for a grabbing arm 120 in the plurality of grabbing arms 120, the AGV controller 160 may actuate the motors in the shoulder element 122 of the grabbing arm 120 to horizontally pivot the grabbing arm 120 around the shoulder element 122. The AGV controller 160 may also actuate the motors in the wrist element 126 of the grabbing arm 120 to horizontally pivot the hand element 124 of the grabbing arm 120 around the wrist element 126. Thus, by horizontally pivoting the grabbing arm 120 around the shoulder element 122 and/or horizontally pivoting the hand element 124 of the grabbing arm 120 around the wrist element 126, the distance between the plurality of paddles 132 of the plurality of grabbing arms 120 may be adjusted to be equal to the object size of the container, and thus the container can fit between the plurality of paddles 132 of the plurality of grabbing arms 120. It should be understood that the AGV controller 160 may adjust one grabbing arm 120 or multiple grabbing arms 120 in the plurality of grabbing arms 120. These grabbing arms 120 and/or their hand element 124 may be horizontally rotated with the same or different rotation angles (e.g., 15° v. 20°) in the same or different directions (e.g., towards the left v. towards the right relative to the AGV body 102). Other implementations for adjusting the one or more grabbing arms 120 of the container-transport AGV 100 are also possible and contemplated.
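For illustration, the relationship between the shoulder pivot angle and the resulting paddle spacing can be approximated with a simple planar model; the link length, the separation between the shoulder elements, and the assumption that the wrist element pivots by an equal and opposite angle to keep each hand element parallel are all simplifications introduced for this sketch.

```python
# Planar sketch of adjusting paddle spacing to match the container width.
# Link length, base separation, and the parallel-hand assumption are illustrative.
import math

ARM_LENGTH_CM = 40.0        # shoulder-to-wrist link length (assumed)
BASE_SEPARATION_CM = 70.0   # lateral distance between the two shoulder elements (assumed)

def shoulder_angle_for_width(target_width_cm: float) -> float:
    """Return the inward shoulder pivot angle (degrees) that sets the paddle
    spacing equal to `target_width_cm`, assuming the wrist pivots by the same
    angle in the opposite direction so each hand element stays parallel.
    """
    inward_offset_per_arm = (BASE_SEPARATION_CM - target_width_cm) / 2.0
    if not 0.0 <= inward_offset_per_arm <= ARM_LENGTH_CM:
        raise ValueError("container width outside the reachable range of the arms")
    return math.degrees(math.asin(inward_offset_per_arm / ARM_LENGTH_CM))

if __name__ == "__main__":
    # Example: a 50 cm wide container requires each arm to pivot inward by ~14.5 degrees.
    print(round(shoulder_angle_for_width(50.0), 1))
```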
Continuing the example depicted in
In block 314, responsive to moving the support surface 140 to the first surface position and moving the one or more grabbing arms 120 to the first arm position that are relative to the first position of the container and responsive to adjusting one or more grabbing arms 120 based on the object size of the container, the container-transport AGV 100 may retrieve the container from the first position of the container using the one or more grabbing arms 120. The one or more grabbing arms 120 may articulate to grasp the container from the first position of the container. The one or more grabbing arms 120 may hold the container and move the container towards the support surface 140 of the container-transport AGV 100 to place the container on the support surface 140.
In some embodiments, to retrieve the container from the first position of the container, a grabbing arm 120 in the plurality of grabbing arms 120 may slide the paddle 132 forward along a first object surface of the container. When the plurality of grabbing arms 120 slide their paddles 132 forward in parallel against multiple first object surfaces of the container, the container may be held between the plurality of paddles 132 of the plurality of grabbing arms 120. In some embodiments, the grabbing arm 120 may actuate the motors in the finger element 128 to bend the finger element 128 against a second object surface of the container to grasp the container. As the container is securely gripped by the paddle 132 and the finger element 128 of the plurality of grabbing arms 120, the plurality of grabbing arms 120 may slide their paddle 132 backward in parallel along their hand elements 124 to move the container towards the support surface 140 and place the container on the support surface 140.
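The retrieval sequence of block 314 may be expressed, as a sketch only, as an ordered series of actuator commands; the GrabbingArm interface below is a hypothetical placeholder rather than the actual control interface of the container-transport AGV 100.

```python
# Hypothetical actuator interface illustrating the retrieval sequence of block 314:
# slide the paddles forward, bend the fingers against the container, slide the paddles back.

class GrabbingArm:
    """Placeholder for one grabbing arm's paddle and finger actuators."""
    def slide_paddle(self, distance_cm: float) -> None:
        # Positive distances slide the paddle forward along the hand element,
        # negative distances slide it backward toward the support surface.
        print(f"paddle slide {distance_cm:+.1f} cm")

    def bend_finger(self, angle_deg: float) -> None:
        # Positive angles fold the finger element toward the container surface.
        print(f"finger bend {angle_deg:+.1f} deg")

def retrieve_container(arms, reach_cm: float, grip_angle_deg: float) -> None:
    """Grasp a container at its first position and draw it onto the support surface."""
    for arm in arms:
        arm.slide_paddle(+reach_cm)          # extend paddles alongside the container
    for arm in arms:
        arm.bend_finger(+grip_angle_deg)     # rest fingers against the second object surface
    for arm in arms:
        arm.slide_paddle(-reach_cm)          # draw the held container onto the support surface

if __name__ == "__main__":
    retrieve_container([GrabbingArm(), GrabbingArm()], reach_cm=35.0, grip_angle_deg=60.0)
```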
Continuing the example depicted in
In some embodiments, in stage 525 illustrated in
In stage 530 illustrated in
In stage 535 illustrated in
In block 316, responsive to placing the container on the support surface 140 of the container-transport AGV 100, the container-transport AGV 100 may transport the container situated on the support surface 140 to a second area where the container is released. In some embodiments, to transport the container to the second area, the guidance unit may reference the facility map of the storage facility, and determine a navigation path from the first area to the second area. To follow the navigation path to the second area, the reader device 174 may detect the graphic markers on the floor surface as the container-transport AGV 100 proceeds in the storage facility. In some embodiments, the guidance unit may map these graphic markers to the navigation path, and generate navigating instructions to follow the navigation path. The AGV controller 160 may then actuate the power motors and/or the driving actuators of the container-transport AGV 100 to move the container-transport AGV 100 based on the navigating instructions. As a result, the container-transport AGV 100 may follow the navigation path to transport the container situated on the support surface 140 from the first area to the second area. Other implementations for navigating the container-transport AGV 100 to the second area are also possible and contemplated.
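As a simplified illustration of the navigation in block 316, the planned path may be represented as an ordered list of floor markers that the reader device 174 confirms one by one; the marker identifiers and the drive interface assumed below are introduced solely for this sketch.

```python
# Simplified sketch of following a navigation path expressed as an ordered list
# of floor markers. Marker IDs and the drive interface are illustrative assumptions.
from typing import List

def follow_path(path_markers: List[str], reader, drive) -> None:
    """Advance along `path_markers`, issuing a drive segment toward each next marker.

    `reader.read()` is assumed to return the marker currently underneath the AGV;
    `drive.go_to_marker(marker)` is assumed to move the AGV toward the named marker.
    """
    for next_marker in path_markers:
        drive.go_to_marker(next_marker)      # move toward the next marker on the path
        if reader.read() != next_marker:     # confirm arrival before continuing
            raise RuntimeError(f"lost the path near marker {next_marker}")
```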
Continuing the example depicted in
In block 318, the AGV controller 160 may determine a second position for the container in the second area. In some embodiments, to determine the second position of the container, the AGV controller 160 may retrieve the second position to which the container needs to be transported and the location attributes of the second position from the data store 156 and/or the data store 252. Alternatively, the user communication unit 142 may receive a user command specifying the second position for the container. For example, the voice interface device 178 may receive a voice command “take container C030387 to storage slot S115 of storage aisle 3” from the human worker. The AGV controller 160 may then extract the second position of the container from the user command, and reference the planogram of the storage facility to determine the location attributes of the second position. In some embodiments, the AGV controller 160 may determine the second position of the container and its location attributes based on the sensor data captured by one or more sensors 172 of the container-transport AGV 100. For example, the vision sensor of the container-transport AGV 100 may capture one or more images of the second area. The AGV controller 160 may analyze the captured images, and determine the second position for the container and the location attributes of the second position based on the captured images.
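By way of example only, the quoted voice command could be reduced to a target position with a simple pattern match such as the following; the command grammar assumed here is illustrative, and a deployed system would likely use a more robust speech and language pipeline.

```python
# Illustrative parsing of a voice command such as
# "take container C030387 to storage slot S115 of storage aisle 3".
# The command grammar assumed here is an example only.
import re
from typing import Optional, Tuple

COMMAND_PATTERN = re.compile(
    r"take container (?P<container>\S+) to storage slot (?P<slot>\S+) of storage aisle (?P<aisle>\S+)",
    re.IGNORECASE,
)

def parse_transport_command(text: str) -> Optional[Tuple[str, str, str]]:
    """Return (container_id, slot_id, aisle_id), or None if the command does not match."""
    match = COMMAND_PATTERN.search(text)
    if match is None:
        return None
    return match.group("container"), match.group("slot"), match.group("aisle")

if __name__ == "__main__":
    print(parse_transport_command(
        "take container C030387 to storage slot S115 of storage aisle 3"))
    # -> ('C030387', 'S115', '3')
```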
Continuing the example depicted in
In block 320, the AGV controller 160 may align the container-transport AGV 100 with the second position of the container. To align the container-transport AGV 100 with the second position of the container, the AGV controller 160 may reposition the container-transport AGV 100 relative to the second position of the container such that the container-transport AGV 100 will be facing (e.g., located opposite to) the container when the container is placed at the second position. In some embodiments, the AGV controller 160 may align the container-transport AGV 100 with the second position of the container in the same manner as the AGV controller 160 may align the container-transport AGV 100 with the container when the container is located in the first position. For example, the AGV controller 160 may align the container-transport AGV 100 with the second position of the container based on graphic marker, captured images, etc.
Continuing the example depicted in
In block 322, the AGV controller 160 may determine a second surface position for the support surface 140 and a second arm position for the plurality of grabbing arms 120 of the container-transport AGV 100 based on the second position of the container. In some embodiments, the distance between the second surface position of the support surface 140 and the second position of the container may satisfy the first distance threshold (e.g., less than 0.5 cm). Thus, the support surface 140 will be moved to the second surface position that is proximate to the second position to which the container is released, and therefore the movement of the container from the support surface 140 of the container-transport AGV 100 to the second position of the container can be facilitated. In some embodiments, the AGV controller 160 may determine a moving direction and a moving distance for the support surface 140 to move to the second surface position, and determine the second arm position for the plurality of grabbing arms 120 based on the moving direction and the moving distance of the support surface 140. In some embodiments, the moving direction and the moving distance for the plurality of grabbing arms 120 to move to the second arm position may be the same as the moving direction and the moving distance for the support surface 140 to move to the second surface position. As a result, the relative position between the support surface 140 and the plurality of grabbing arms 120 may remain unchanged, and thus the container may continue to be held between the paddles 132 of the plurality of grabbing arms 120 while the container is vertically moved towards the second position by the support surface 140. This implementation is advantageous, because it enables the plurality of grabbing arms 120 to move the container when the support surface 140 reaches the second surface position or move the container simultaneously while the support surface 140 is vertically moving to the second surface position.
In block 324, the AGV controller 160 may actuate the elevator mechanism 104 to vertically move the support surface 140 along the AGV body 102 of the container-transport AGV 100 to the second surface position relative to the second position of the container. The AGV controller 160 may also actuate the elevator mechanisms 106 to vertically move the plurality of grabbing arms 120 along the AGV body 102 of the container-transport AGV 100 to the second arm position relative to the second position of the container. For example, the elevator mechanisms 106 may vertically move the plurality of grabbing arms 120 along the AGV body 102 in the same moving direction with the same moving distance as the support surface 140. Thus, the support surface 140 and the plurality of grabbing arms 120 may vertically move towards the second position to which the container is released.
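A sketch of blocks 322 and 324 under the example first distance threshold (less than 0.5 cm): the support surface target is planned just below the second position, the grabbing arms are assigned the same signed move so their position relative to the support surface is preserved, and both sets of elevator mechanisms are then actuated. The 0.2 cm standoff and the move_to(height_m) elevator driver call are assumptions for illustration, not the disclosed control interface.

```python
FIRST_DISTANCE_THRESHOLD_M = 0.005   # example: less than 0.5 cm

def plan_vertical_moves(surface_height_m, arm_height_m, second_position_height_m):
    """Plan matching surface and arm moves (same direction, same distance)."""
    surface_target_m = second_position_height_m - 0.002     # assumed offset within the threshold
    move_m = surface_target_m - surface_height_m            # signed: up (+) or down (-)
    return surface_target_m, arm_height_m + move_m

def execute_vertical_moves(surface_elevator, arm_elevators, surface_target_m, arm_target_m):
    """Drive elevator mechanism 104 and elevator mechanisms 106 to the planned positions."""
    surface_elevator.move_to(surface_target_m)               # moves the support surface 140
    for elevator in arm_elevators:                           # moves the plurality of grabbing arms 120
        elevator.move_to(arm_target_m)
```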
In block 326, the container-transport AGV 100 may move the container situated on the support surface 140 by the plurality of grabbing arms 120 holding the container to place the container at the second position of the container. In some embodiments, to move the container from the support surface 140 to the second position of the container, a grabbing arm 120 in the plurality of grabbing arms 120 may bend the finger element 128 of the grabbing arm 120 away from the second object surface of the container, and slide the paddle 132 forward along the hand element 124 of the grabbing arm 120. As the plurality of grabbing arms 120 slide their paddles 132 forward in parallel along their hand elements 124, the plurality of paddles 132 of the plurality of grabbing arms 120 may move the container being held between the plurality of paddles 132 from the support surface 140 towards the second position of the container and place the container at the second position of the container. Thus, the transportation of the container from the first position of the container in the first area to the second position of the container in the second area by the container-transport AGV 100 is completed.
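An illustrative release sequence for block 326, assuming each grabbing arm exposes hypothetical bend_finger_away() and slide_paddle(distance_m) actuator calls; the sequencing mirrors the description above, but the interface itself is an assumption.

```python
def place_container(grabbing_arms, forward_travel_m):
    """Slide the paddles forward in parallel to move the container to the second position."""
    for arm in grabbing_arms:
        arm.bend_finger_away()                 # bend the finger element 128 away from the container
    for arm in grabbing_arms:
        arm.slide_paddle(forward_travel_m)     # paddles 132 advance together along the hand elements 124
```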
In block 402, the AGV controller 160 may monitor a tracking device associated with the user. In some embodiments, the tracking device may be held, worn, clipped to, or otherwise attached to the human worker, and thus the position of the tracking device may indicate a current position of the user in the storage facility. In some embodiments, the tracking device may periodically transmit the current position of the user to the server 250 and/or to other entities located proximate to the user. For example, the tracking device may broadcast the current position of the user to the AGVs located within a predefined distance from the tracking device (e.g., 250 m) at a predefined interval (e.g., every 0.5 s). Non-limiting examples of the tracking device include a beacon, a Bluetooth™ device, a wearable device (e.g., a smart watch, smart glasses), a smart phone, etc.
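A minimal polling sketch of block 402 using the example interval (0.5 s) and range (250 m); the read_position() call on the tracking device and the in_range callback are assumptions for illustration.

```python
import time

BROADCAST_INTERVAL_S = 0.5      # example predefined interval
BROADCAST_RANGE_M = 250.0       # example predefined distance from the tracking device

def monitor_tracking_device(tracker, on_update, in_range):
    """Consume the tracking device's periodic position broadcasts while the AGV is in range."""
    while in_range(tracker, BROADCAST_RANGE_M):
        on_update(tracker.read_position())      # latest current position of the user
        time.sleep(BROADCAST_INTERVAL_S)
```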
In block 404, the container-transport AGV 100 may place a container on its support surface 140. In some embodiments, to place the container on the support surface 140, the container-transport AGV 100 may perform the operations described above with reference to blocks 302-314.
In block 406, responsive to placing the container on the support surface 140 of the container-transport AGV 100, the container-transport AGV 100 may maintain a following distance between the container-transport AGV 100 and the tracking device to automatically follow the user in the storage facility. In some embodiments, the AGV controller 160 may receive the current position of the user from the tracking device, and automatically relocate the container-transport AGV 100 to maintain the following distance to the tracking device (e.g., 0.5 m). As a result, the container-transport AGV 100 may automatically follow the user with the container situated on its support surface 140 as the user proceeds in the storage facility. This implementation is advantageous because it eliminates the need for the user to manually carry the container or push a cart including the container while walking back and forth along the storage aisle to put away the items stored in the container.
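A sketch of the following behavior in block 406 as a simple proportional controller that holds the example 0.5 m following distance; the drive(velocity, heading) callback and the gain are assumptions and not the disclosed motion-control interface.

```python
import math

FOLLOWING_DISTANCE_M = 0.5      # example following distance to the tracking device

def follow_step(agv_xy, user_xy, drive, gain=1.0):
    """One control step: move toward the user only when farther than the following distance."""
    ax, ay = agv_xy
    ux, uy = user_xy
    separation_m = math.hypot(ux - ax, uy - ay)
    error_m = separation_m - FOLLOWING_DISTANCE_M           # positive when the AGV falls behind
    heading = math.atan2(uy - ay, ux - ax)
    drive(velocity=max(gain * error_m, 0.0), heading=heading)
```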
In block 408, the AGV controller 160 may determine whether the user stopped moving in the storage facility. In some embodiments, the AGV controller 160 may determine the change in the current position of the user over a predefined time period (e.g., the last 5 s). The AGV controller 160 may determine that the change in the position of the user during the time period satisfies a moving distance threshold (e.g., less than 5 cm), and thus determine that the user stopped moving.
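A sketch of the stop test in block 408 using the example window (5 s) and moving distance threshold (5 cm); the timestamped position history is an assumed data structure.

```python
TIME_WINDOW_S = 5.0                    # example predefined time period
MOVING_DISTANCE_THRESHOLD_M = 0.05     # example: less than 5 cm

def user_stopped(position_history, now_s):
    """Return True if the user's position varied less than the threshold over the window."""
    recent = [(t, xy) for t, xy in position_history if now_s - t <= TIME_WINDOW_S]
    if len(recent) < 2:
        return False                   # not enough samples to decide
    xs = [xy[0] for _, xy in recent]
    ys = [xy[1] for _, xy in recent]
    spread_m = ((max(xs) - min(xs)) ** 2 + (max(ys) - min(ys)) ** 2) ** 0.5
    return spread_m < MOVING_DISTANCE_THRESHOLD_M
```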
In block 410, responsive to determining that the user stopped moving, the AGV controller 160 may relocate the container-transport AGV 100 to a position proximate to the current position of the user. In some embodiments, the AGV controller 160 may relocate the container-transport AGV 100 towards the current position of the user such that the distance between the container-transport AGV 100 and the current position of the user satisfies a reaching distance threshold (e.g., less than 15 cm), which may be advantageous because these operations may facilitate the user in reaching the items stored in the container when the user stops at the storage locations of the items. Thus, by automatically following the user with the container situated on the support surface 140 and relocating to the position proximate to the current position of the user when the user stopped moving, the container-transport AGV 100 can provide a hands-free experience to the user and facilitate the user in distributing inventory items stored in the container at their storage locations.
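A short sketch of the relocation in block 410 using the example 15 cm reaching distance threshold and the same assumed drive(velocity, heading) callback as in the sketch following block 406 above; the creep speed is likewise an assumption.

```python
import math

REACHING_DISTANCE_THRESHOLD_M = 0.15   # example: less than 15 cm

def relocate_to_user(agv_xy, user_xy, drive, creep_velocity=0.3):
    """Approach the stopped user until the AGV is within the reaching distance."""
    ax, ay = agv_xy
    ux, uy = user_xy
    if math.hypot(ux - ax, uy - ay) < REACHING_DISTANCE_THRESHOLD_M:
        return                                   # already within reach of the user
    drive(velocity=creep_velocity, heading=math.atan2(uy - ay, ux - ax))
```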
In block 452, the AGV controller 160 may receive the current position of the user from the tracking device. The AGV controller 160 may determine that the current position of the user is at a vertical distance from the floor surface that satisfies a height threshold (e.g., more than 120 cm or less than 30 cm), and thus determine that the user moved from a regular position at which the user stands on the floor surface to a higher position or to a lower position in order to reach the storage location of the items stored in the container. For example, the user may step onto a ladder or sit down on the floor surface. In some embodiments, the AGV controller 160 may then relocate the container-transport AGV 100 towards the current position of the user. As discussed elsewhere herein, the AGV controller 160 may relocate the container-transport AGV 100 to a position at which the distance between the container-transport AGV 100 and the current position of the user satisfies the reaching distance threshold (e.g., less than 15 cm).
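A sketch of the height test in block 452 using the example thresholds (more than 120 cm or less than 30 cm above the floor surface); the classification labels are chosen for illustration.

```python
HIGH_THRESHOLD_M = 1.20    # example: more than 120 cm above the floor surface
LOW_THRESHOLD_M = 0.30     # example: less than 30 cm above the floor surface

def classify_user_height(user_height_above_floor_m):
    """Classify the user's current position relative to a regular standing position."""
    if user_height_above_floor_m > HIGH_THRESHOLD_M:
        return "higher"    # e.g., the user stepped onto a ladder
    if user_height_above_floor_m < LOW_THRESHOLD_M:
        return "lower"     # e.g., the user sat down on the floor surface
    return "regular"
```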
In block 454, responsive to determining that the user moved to the higher position or the lower position as compared to the regular position of the user, the AGV controller 160 may determine a surface position for the support surface 140 on which the container is situated. In some embodiments, the AGV controller 160 may determine a surface position for the support surface 140 along the AGV body 102 such that the vertical distance between the surface position of the support surface 140 and the current position of the user satisfies a distance threshold (e.g., less than 5 cm).
In block 456, the AGV controller 160 may move the support surface 140 on which the container is situated to the surface position. In some embodiments, the AGV controller 160 may actuate the elevator mechanism 104 to vertically move the support surface 140 along the AGV body 102 of the container-transport AGV 100 to the surface position vertically proximate to the current position of the user. In some embodiments, the AGV controller 160 may also move the plurality of grabbing arms 120 in parallel with the support surface 140. For example, the AGV controller 160 may actuate the elevator mechanisms 106 to vertically move the plurality of grabbing arms 120 in the same moving direction with the same moving distance as the support surface 140, and thus the container may continue to be held between the plurality of grabbing arms 120 and supported from underneath by the support surface 140 as the support surface 140 reaches the surface position. As the support surface 140 reaches the surface position, the container situated on the support surface 140 may be vertically proximate to the current position of the user, thereby facilitating the user in reaching the items stored in the container. This may be advantageous because it may eliminate or reduce the need for the user to manually carry the object while moving to the higher position or to the lower position, thereby providing a hands-free experience that improves the safety of the user when performing these movements.
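A sketch of blocks 454 and 456 using the example 5 cm vertical distance threshold, reusing the assumed move_to(height_m) elevator driver call from the earlier sketch; the function returns the new surface and arm heights so a caller could track state.

```python
VERTICAL_DISTANCE_THRESHOLD_M = 0.05   # example: less than 5 cm

def track_user_height(surface_elevator, arm_elevators,
                      surface_height_m, arm_height_m, user_height_m):
    """Move the support surface (and the arms in parallel) to be vertically proximate to the user."""
    if abs(surface_height_m - user_height_m) < VERTICAL_DISTANCE_THRESHOLD_M:
        return surface_height_m, arm_height_m            # already vertically proximate to the user
    move_m = user_height_m - surface_height_m            # same signed move for surface and arms
    surface_elevator.move_to(user_height_m)              # elevator mechanism 104
    for elevator in arm_elevators:
        elevator.move_to(arm_height_m + move_m)          # elevator mechanisms 106
    return user_height_m, arm_height_m + move_m
```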
It should be understood that the container-transport AGV 100 can advantageously transport various objects between various locations in any operating environment. For example, the container-transport AGV 100 may perform the method 300 described above to transport objects between locations in operating environments other than the storage facility.
It should be noted that the components described herein may be further delineated or changed without departing from the techniques described herein. For example, the processes described throughout this disclosure may be performed by fewer, additional, or different components.
It should be understood that the methods described herein are provided by way of example, and that variations and combinations of these methods, as well as other methods, are contemplated. For example, in some implementations, at least a portion of one or more of the methods represent various segments of one or more larger methods and may be concatenated, or various steps of these methods may be combined, to produce other methods which are encompassed by the present disclosure. Additionally, it should be understood that various operations in the methods are iterative, and thus repeated as many times as necessary to generate the results described herein. Further, the ordering of the operations in the methods is provided by way of example, and it should be understood that various operations may occur earlier and/or later in the method without departing from the scope thereof.
In the above description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. However, it should be understood that the technology described herein can be practiced without these specific details in various cases. Further, various systems, devices, and structures are shown in block diagram form in order to avoid obscuring the description. For instance, various implementations are described as having particular hardware, software, and user interfaces. However, the present disclosure applies to any type of computing device that can receive data and commands, and to any peripheral devices providing services.
In some instances, various implementations may be presented herein in terms of algorithms and symbolic representations of operations on data bits within a computer memory. An algorithm is here, and generally, conceived to be a self-consistent set of operations leading to a desired result. The operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout this disclosure, discussions utilizing terms such as “processing,” “computing,” “calculating,” “determining,” “displaying,” or the like, refer to the action and methods of a computer system that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
A data processing system suitable for storing and/or executing program code, such as the computing system and/or devices discussed herein, may include at least one processor coupled directly or indirectly to memory elements through a system bus. The memory elements can include local memory employed during actual execution of the program code, bulk storage, and cache memories that provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution. Input and/or output devices can be coupled to the system either directly or through intervening I/O controllers. The data processing system may include an apparatus that may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer.
The foregoing description has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the specification to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. It is intended that the scope of the disclosure be limited not by this detailed description, but rather by the claims of this application. As will be understood by those familiar with the art, the specification may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. Likewise, the particular naming and division of the modules, routines, features, attributes, methodologies and other aspects may not be mandatory or significant, and the mechanisms that implement the specification or its features may have different names, divisions, and/or formats.
Furthermore, the modules, routines, features, attributes, methodologies and other aspects of the disclosure can be implemented as software, hardware, firmware, or any combination of the foregoing. The technology can also take the form of a computer program product accessible from a computer-usable or computer-readable medium providing program code for use by or in connection with a computer or any instruction execution system. Wherever a component, an example of which is a module or engine, of the specification is implemented as software, the component can be implemented as a standalone program, as part of a larger program, as a plurality of separate programs, as a statically or dynamically linked library, as a kernel loadable module, as firmware, as resident software, as microcode, as a device driver, and/or in every and any other way known now or in the future. Additionally, the disclosure is in no way limited to implementation in any specific programming language, or for any specific operating system or environment. Accordingly, the disclosure is intended to be illustrative, but not limiting, of the scope of the subject matter set forth in the following claims.