This disclosure relates generally to navigation of mobile robots.
A robot is a reprogrammable and multifunctional manipulator designed to move material, parts, tools, or specialized devices through variable programmed motions for the performance of tasks. Robots may be manipulators that are physically anchored (e.g., industrial robotic arms), mobile robots that move throughout an environment (e.g., using legs, wheels, or traction-based mechanisms), or some combination of a manipulator and a mobile robot. Robots are utilized in a variety of industries including, for example, manufacturing, transportation, hazardous environments, exploration, and healthcare. As such, the ability of robots to traverse environments with obstacles provides additional benefits to such industries.
An aspect of the present disclosure provides a computer-implemented method that when executed by data processing hardware causes the data processing hardware to perform operations. The operations include receiving a navigation route including a set of waypoints including a first waypoint and a second waypoint, generating a local obstacle map based on sensor data captured by a mobile robot, determining that the mobile robot is unable to execute a movement instruction along a path between the first waypoint and the second waypoint due to an obstacle obstructing the path, identifying a third waypoint based on the local obstacle map, and generating an alternative path to navigate the mobile robot to the third waypoint to avoid the obstacle.
Another aspect of the present disclosure provides a mobile robot including a locomotion structure and a navigation system configured to control the locomotion structure to coordinate movement of the mobile robot. The navigation system includes data processing hardware and memory hardware in communication with the data processing hardware, the memory hardware storing instructions that when executed on the data processing hardware cause the data processing hardware to perform operations. The operations include receiving a navigation route including a set of waypoints including a first waypoint and a second waypoint, generating a local obstacle map based on sensor data captured by the mobile robot, determining that the mobile robot is unable to execute a movement instruction along a path between the first waypoint and the second waypoint due to an obstacle obstructing the path, identifying a third waypoint based on the local obstacle map, and generating an alternative path to navigate the mobile robot to the third waypoint to avoid the obstacle.
Another aspect of the present disclosure provides a computer-implemented method that when executed by data processing hardware causes the data processing hardware to perform operations. The operations include receiving a navigation route for a mobile robot. The navigation route includes a sequence of waypoints connected by edges. Each edge corresponds to movement instructions that navigate the mobile robot between waypoints of the sequence of waypoints. While the mobile robot is traveling along the navigation route, the operations include determining that the mobile robot is unable to execute a respective movement instruction for a respective edge of the navigation route due to an obstacle obstructing the respective edge, generating an alternative path to navigate the mobile robot to an untraveled waypoint in the sequence of waypoints, and resuming travel by the mobile robot along the navigation route. The alternative path avoids the obstacle.
In some implementations, generating the alternative path to the untraveled waypoint in the sequence of waypoints includes determining that the untraveled waypoint exists within a local obstacle map generated from current sensor data captured by a sensor system of the mobile robot. The current sensor data represents an area immediately surrounding the mobile robot. In further implementations, the local obstacle map includes an occupancy grid designating whether each cell in the occupancy grid includes a respective obstacle for the mobile robot. In other further implementations, determining that the untraveled waypoint exists within the local obstacle map includes, for each untraveled waypoint in the sequence of waypoints, identifying whether the respective untraveled waypoint exists within the local obstacle map. When the respective untraveled waypoint exists within the local obstacle map, generating the alternative path includes generating the alternative path from a current position of the mobile robot to the respective untraveled waypoint. In even further implementations, when the respective untraveled waypoint fails to exist within the local obstacle map, the operations further include moving the mobile robot toward a position of the respective untraveled waypoint while avoiding the obstacle until the respective untraveled waypoint exists within the local obstacle map.
In some embodiments, resuming travel by the mobile robot along the navigation route includes executing the movement instructions of the edges connecting an untraveled portion of the sequence of waypoints of the navigation route starting from the untraveled waypoint. In some examples, the navigation route is derived from a topological map that represents an environment as a plurality of waypoints interconnected by corresponding edges. In further examples, the topological map corresponds to a high-level map of the environment abstracted to remove metric details that fail to correspond to waypoints and edges. In other further examples, determining that the mobile robot is unable to execute the respective movement instruction for the respective edge includes determining that the topological map fails to provide an avoidance path that avoids the obstacle obstructing the respective edge.
In some implementations, the operations further include establishing a topological map for an environment by using an initial mapping process that navigates the mobile robot through the environment generating a plurality of waypoints corresponding to features of the environment and generating the navigation route using the topological map. In some embodiments, the alternative path forms a respective path between two waypoints previously not connected in the navigation route.
In some examples, each waypoint in the sequence of waypoints was previously recorded during an initial mapping of an environment and the obstacle obstructing the respective edge was previously undiscovered during the initial mapping of the environment. In some implementations, the alternative path avoiding the obstacle deviates from the navigation route.
Another aspect of the disclosure provides a mobile robot. The mobile robot includes a body, two or more legs coupled to the body, and a navigation system configured to coordinate movement for the two or more legs. The navigation system includes data processing hardware and memory hardware in communication with the data processing hardware. The memory hardware stores instructions that when executed on the data processing hardware cause the data processing hardware to perform operations. The operations include receiving a navigation route for a mobile robot. The navigation route includes a sequence of waypoints connected by edges. Each edge corresponds to movement instructions that navigate the mobile robot between waypoints of the sequence of waypoints. While the mobile robot is traveling along the navigation route, the operations include determining that the mobile robot is unable to execute a respective movement instruction for a respective edge of the navigation route due to an obstacle obstructing the respective edge, generating an alternative path to navigate the mobile robot to an untraveled waypoint in the sequence of waypoints, and resuming travel by the mobile robot along the navigation route. The alternative path avoids the obstacle.
In some implementations, generating the alternative path to the untraveled waypoint in the sequence of waypoints includes determining that the untraveled waypoint exists within a local obstacle map generated from current sensor data captured by a sensor system of the mobile robot. The current sensor data represents an area immediately surrounding the mobile robot. In further implementations, the local obstacle map includes an occupancy grid designating whether each cell in the occupancy grid includes a respective obstacle for the mobile robot. In other further implementations, determining that the untraveled waypoint exists within the local obstacle map includes, for each untraveled waypoint in the sequence of waypoints, identifying whether the respective untraveled waypoint exists within the local obstacle map. When the respective untraveled waypoint exists within the local obstacle map, generating the alternative path includes generating the alternative path from a current position of the mobile robot to the respective untraveled waypoint. In even further implementations, when the respective untraveled waypoint fails to exist within the local obstacle map, the operations further include moving the mobile robot toward a position of the respective untraveled waypoint while avoiding the obstacle until the respective untraveled waypoint exists within the local obstacle map.
In some embodiments, resuming travel by the mobile robot along the navigation route includes executing the movement instructions of the edges connecting an untraveled portion of the sequence of waypoints of the navigation route starting from the untraveled waypoint. In some examples, the navigation route is derived from a topological map that represents an environment as a plurality of waypoints interconnected by corresponding edges. In further examples, the topological map corresponds to a high-level map of the environment abstracted to remove metric details that fail to correspond to waypoints and edges. In other further examples, determining that the mobile robot is unable to execute the respective movement instruction for the respective edge includes determining that the topological map fails to provide an avoidance path that avoids the obstacle obstructing the respective edge.
In some implementations, the operations further include establishing a topological map for an environment by using an initial mapping process that navigates the mobile robot through the environment generating a plurality of waypoints corresponding to features of the environment and generating the navigation route using the topological map. In some embodiments, the alternative path forms a respective path between two waypoints previously not connected in the navigation route.
In some examples, each waypoint in the sequence of waypoints was previously recorded during an initial mapping of an environment and the obstacle obstructing the respective edge was previously undiscovered during the initial mapping of the environment. In some implementations, the alternative path avoiding the obstacle deviates from the navigation route.
The details of the one or more implementations of the disclosure are set forth in the accompanying drawings and the description below. Other aspects, features, and advantages will be apparent from the description and drawings, and from the claims.
Like reference symbols in the various drawings indicate like elements.
To navigate about environments, mobile robots may use some degree of mapping. Maps store information that informs the robot about details of an environment.
One type of map for navigation is a topological map, which is a diagram that abstracts an area so that the area is represented by a particular type (or subset) of information about it. Using only that particular type or subset of information, a topological map defines the area relationally. For instance, a topological map represents a set of places by the relationships between those places. In this respect, a topological map may lack scale or other detailed metric data, such as distance and direction, while maintaining relationships between points or places. One such example of a topological map is a map of a train line, where the map depicts the train's route as a series of connected stops without further precise details such as the distance between stops.
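As a non-limiting illustration, such a map may be represented as nothing more than named places and the connections between them. The short sketch below (with hypothetical names, not an implementation taken from this disclosure) captures the train-line example: stops and their connections, with no distances or coordinates at all.

```python
from dataclasses import dataclass, field

@dataclass
class TopologicalMap:
    """Places and their relationships only; no scale, distance, or direction."""
    places: set = field(default_factory=set)
    connections: list = field(default_factory=list)  # (place, place) pairs

    def add_place(self, name: str):
        self.places.add(name)

    def connect(self, a: str, b: str):
        self.connections.append((a, b))

# A train line: an ordered series of connected stops, nothing more.
line = TopologicalMap()
for stop in ("North Station", "Central", "South Station"):
    line.add_place(stop)
line.connect("North Station", "Central")
line.connect("Central", "South Station")
```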
Another form of map representation is referred to as a metric map. Metric maps are derived from a more precise mapping framework than topological maps. Metric maps represent locations of objects (e.g., obstacles) in an environment based on precise geometric coordinates (e.g., two-dimensional coordinates or three-dimensional coordinates). To navigate effectively with a metric map, a robot typically operates with data indicating the robot's coordinate location with some degree of precision. Such data allows the robot to determine its relationship to other objects represented according to geometric coordinates.
Since metric maps represent objects according to detailed spatial information (e.g., geometric coordinates), metric maps may be inherently computationally expensive. For example, generally the more detail present in a map, the greater the number of computational units used for computing operations on the map (e.g., read, write, and/or storage operations). Therefore, when a robot performs navigational processes using a metric map (e.g., querying the metric map for information to make navigation decisions), these processes may demand greater computing resources or take longer when compared to navigational processes using a topological map. Topological maps avoid much of this cost because, by diagramming only particular relationships, a topological map generally includes less information than more detailed maps (i.e., metric maps).
Because topological maps are typically less computationally expensive than metric maps, it is desirable to employ topological maps as much as possible for robotic navigation. Unfortunately, during navigation, situations may arise where topological maps do not provide enough detailed information to guide the robot successfully to a specific destination.
One such situation that may occur during robotic navigation is that the robot encounters an object that obstructs the robot (referred to as an obstacle) in some unforeseeable way. When an unforeseeable obstacle appears, a topological map that has abstracted the environment to a set of locations and a relationship among these locations is ill-equipped to provide the robot with a path that avoids the obstacle while continuing to a specific destination. In other words, the presence of an unforeseeable obstacle may break or interfere with the relationship between the set of locations represented in a topological map. Without further detailed information to guide the robot, such a broken relationship may cause the robot to fail to reach a target destination using the topological map.
The situation with an unforeseeable obstacle is not uncommon because real-world environments change dynamically. That is, structures or objects within an environment may have some temporary or non-permanent nature. Known obstacles may move, and new obstacles may be introduced to the environment. For instance, a construction site is a changing environment that often includes temporary or non-permanent objects, such as tools, tool storage, machinery, and material. Dynamic changes (or changes over time) to an environment pose an issue for a robot deployed to perform some task within the dynamically changing environment. This may be especially true when the robot operates autonomously or semi-autonomously.
For localization and navigation purposes, a robot can be initially taught an environment by a mapping process. In one example of a mapping process, an operator of the robot initially guides the robot through the environment while collecting sensor data from sensors associated with the robot. The sensor data (e.g., image data or point cloud data) gathered during this initial mapping process allows the robot to construct a map of the environment. For example, the initial mapping process generates a topological map of locations recorded during the initial mapping process and the relationship between these locations to inform the robot how to navigate along a path from location to location.
When the robot proceeds to use a topological map for navigation in the mapped environment, an obstacle not recognized or discovered during the initial mapping process is an unforeseeable object that will not be accounted for in the topological map. Therefore, if the robot only navigates using a navigation route from the topological map, an unforeseeable object encountered by the robot will hinder (or even entirely prevent) complete execution of the navigation route.
To address some of these issues, the mobile robot may use a directed exploration approach for its navigation system. A directed exploration approach functions as a type of hybrid navigation system that utilizes both a topological map and a more detailed metric map. For instance, during navigation, a topological map guides the robot unless or until the robot encounters an unforeseeable obstacle. Here, once the robot encounters the unforeseeable obstacle, the navigation system switches temporarily to a local navigation map to determine whether the robot can find a path that temporarily deviates from the navigation route of the topological map, avoids the obstacle, and enables the robot to resume some portion of the navigation route (e.g., to travel to a destination specified by the navigation route).
For example, while the robot moves along a navigation route, the robot may be generating a local obstacle map for an area immediately surrounding the robot (e.g., a grid of some size adjacent to the robot). In this example, when the robot encounters an unforeseeable obstacle that impedes the navigation path, the robot uses the local obstacle map to determine a path that avoids the obstacle and continues the robot on the navigation route. By generating, with the local obstacle map, an obstacle avoidance path that merges with the original navigation path, the navigation system may once again guide the robot with the navigation path from the topological map instead of continuing to use (e.g., exclusively or predominantly) the local obstacle map after avoidance of the unforeseeable obstacle. This directed exploration approach may therefore minimize the computing cost associated with the navigation system.
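The hybrid control flow just described can be summarized in sketch form. In the sketch below, all interfaces (robot, route, plan_local_detour, and related names) are hypothetical stand-ins rather than an actual robot API; the sketch merely mirrors the behavior of defaulting to inexpensive topological navigation and taking a temporary metric detour on demand.

```python
def follow_route(robot, route, plan_local_detour):
    """Illustrative directed-exploration loop; all names are hypothetical."""
    edge = route.next_edge()
    while edge is not None:
        if robot.can_execute(edge):
            robot.execute(edge)               # default: topological navigation
        else:
            # An unforeseeable obstacle blocks the edge: consult the local map.
            local_map = robot.local_obstacle_map()
            detour = plan_local_detour(local_map, robot.position,
                                       route.untraveled_waypoints())
            robot.follow(detour)              # temporary metric deviation
            route.skip_to(detour.destination) # rejoin the route at a waypoint
        edge = route.next_edge()
```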
Referring to the example of
In order to traverse the terrain, each leg 120 has a distal end 124 that contacts a surface of the terrain (i.e., a traction surface). In other words, the distal end 124 of the leg 120 is the end of the leg 120 used by the robot 100 to pivot, plant, or generally provide traction during movement of the robot 100. For example, the distal end 124 of a leg 120 corresponds to a foot of the robot 100. In some examples, though not shown, the distal end 124 of the leg 120 includes an ankle joint such that the distal end 124 is articulable with respect to the lower member 122L of the leg 120.
In the examples shown, the robot 100 includes an arm 126 that functions as a robotic manipulator. The arm 126 may be configured to move about multiple degrees of freedom in order to engage elements of the environment 10 (e.g., objects within the environment 10). In some examples, the arm 126 includes one or more members 128, where the members 128 are coupled by arm joints JA (for example, JA1, JA2, JA3, JA4, JA5, and JA6 in
In some examples, such as
In some implementations, the arm 126 may include additional joints JA such as the fifth arm joint JA5 and/or the sixth arm joint JA6. The fifth arm joint JA5 may be located near the coupling of the upper member 128U to the hand member 128H and may function to allow the hand member 128H to twist or to rotate relative to the upper member 128U. In other words, the fifth arm joint JA5 may function as a twist joint similarly to the fourth arm joint JA4 or wrist joint of the arm 126 adjacent the hand member 128H. For instance, as a twist joint, one member coupled at the joint J may move or rotate relative to another member coupled at the joint J (e.g., a first member portion coupled at the twist joint is fixed while the second member portion coupled at the twist joint rotates). Here, the fifth arm joint JA5 may also enable the arm 126 to turn in a manner that rotates the hand member 128H such that the hand member 128H may yaw instead of pitch. For instance, the fifth arm joint JA5 allows the arm 126 to twist within a 180 degree range of motion such that the jaws associated with the hand member 128H may pitch, yaw, or some combination of both. This may be advantageous for hooking some portion of the arm 126 around objects or refining how the hand member 128H grasps an object. The sixth arm joint JA6 may function similarly to the fifth arm joint JA5 (e.g., as a twist joint). For example, the sixth arm joint JA6 also allows a portion of an arm member 128 (e.g., the upper arm member 128U) to rotate or twist within a 180 degree range of motion (e.g., with respect to another portion of the arm member 128 or another arm member 128). Here, a combination of the range of motion from the fifth arm joint JA5 and the sixth arm joint JA6 may enable 360 degree rotation. In some implementations, the arm 126 connects to the robot 100 at a socket on the body 110 of the robot 100. In some configurations, the socket is configured as a connector such that the arm 126 may attach to or detach from the robot 100 depending on whether the arm 126 is needed for operation. In some examples, the first and second arm joints JA1, JA2 may be located at, adjacent to, or form a portion of the socket that connects the arm 126 to the body 110.
The robot 100 has a vertical gravitational axis (e.g., shown as a Z-direction axis AZ) along a direction of gravity, and a center of mass CM, which is a position that corresponds to an average position of all parts of the robot 100 where the parts are weighted according to their masses (i.e., a point where the weighted relative position of the distributed mass of the robot 100 sums to zero). The robot 100 further has a pose P based on the CM relative to the vertical gravitational axis AZ (i.e., the fixed reference frame with respect to gravity) to define a particular attitude or stance assumed by the robot 100. The attitude of the robot 100 can be defined by an orientation or an angular position of the robot 100 in space. Movement by the legs 120 relative to the body 110 alters the pose P of the robot 100 (i.e., the combination of the position of the CM of the robot and the attitude or orientation of the robot 100). Here, a height generally refers to a distance along the z-direction (e.g., along the z-direction axis AZ). The sagittal plane of the robot 100 corresponds to the Y-Z plane extending in directions of a y-direction axis AY and the z-direction axis AZ. In other words, the sagittal plane bisects the robot 100 into a left and a right side. Generally perpendicular to the sagittal plane, a ground plane (also referred to as a transverse plane) spans the X-Y plane by extending in directions of the x-direction axis AX and the y-direction axis AY. The ground plane refers to a ground surface 14 where distal ends 124 of the legs 120 of the robot 100 may generate traction to help the robot 100 move within the environment 10. Another anatomical plane of the robot 100 is the frontal plane that extends across the body 110 of the robot 100 (e.g., from a left side of the robot 100 with a first leg 120a to a right side of the robot 100 with a second leg 120b). The frontal plane spans the X-Z plane by extending in directions of the x-direction axis AX and the z-direction axis AZ.
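Stated compactly (this is standard rigid-body mechanics rather than anything specific to this disclosure), the center of mass is the mass-weighted average of the part positions, which is equivalent to the zero-sum property noted above:

```latex
\[
\mathbf{r}_{\mathrm{CM}}
  = \frac{\sum_i m_i\,\mathbf{r}_i}{\sum_i m_i}
\qquad\Longleftrightarrow\qquad
\sum_i m_i\left(\mathbf{r}_i-\mathbf{r}_{\mathrm{CM}}\right)=\mathbf{0}
\]
```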
In order to maneuver within the environment 10 or to perform tasks using the arm 126, the robot 100 includes a sensor system 130 (also referred to as a vision system) with one or more sensors 132, 132a-n. For instance,
When surveying a field of view FV with a sensor 132, the sensor system 130 generates sensor data 134 (e.g., image data) corresponding to the field of view FV. The sensor system 130 may generate the field of view FV with a sensor 132 mounted on or near the body 110 of the robot 100 (e.g., sensor(s) 132a, 132b). The sensor system 130 may additionally and/or alternatively generate the field of view FV with a sensor 132 mounted at or near the end-effector 128H of the arm 126 (e.g., sensor(s) 132c). The one or more sensors 132 may capture sensor data 134 that defines a three-dimensional point cloud for the area within the environment 10 about the robot 100. In some examples, the sensor data 134 is image data that corresponds to a three-dimensional volumetric point cloud generated by a three-dimensional volumetric image sensor 132. Additionally or alternatively, when the robot 100 is maneuvering within the environment 10, the sensor system 130 gathers pose data for the robot 100 that includes inertial measurement data (e.g., measured by an IMU). In some examples, the pose data includes kinematic data and/or orientation data about the robot 100, for instance, kinematic data and/or orientation data about joints J or other portions of a leg 120 or arm 126 of the robot 100. Various systems of the robot 100 may use the sensor data 134 to define a current state of the robot 100 (e.g., of the kinematics of the robot 100) and/or a current state of the environment 10 about the robot 100. In other words, the sensor system 130 may communicate the sensor data 134 from one or more sensors 132 to any other system of the robot 100 in order to assist the functionality of that system.
In some implementations, the sensor system 130 includes sensor(s) 132 coupled to a joint J. Moreover, these sensors 132 may couple to a motor M that operates a joint J of the robot 100 (e.g., sensors 132, 132b-d). Here, these sensors 132 generate joint dynamics in the form of joint-based sensor data 134. Joint dynamics collected as joint-based sensor data 134 may include joint angles (e.g., an upper member 122U relative to a lower member 122L or hand member 128H relative to another member of the arm 126 or robot 100), joint speed, joint angular velocity, joint angular acceleration, and/or forces experienced at a joint J (also referred to as joint forces). Joint-based sensor data generated by one or more sensors 132 may be raw sensor data, data that is further processed to form different types of joint dynamics, or some combination of both. For instance, a sensor 132 measures joint position (or a position of member(s) 122 coupled at a joint J) and systems of the robot 100 perform further processing to derive velocity and/or acceleration from the positional data. In other examples, a sensor 132 is configured to measure velocity and/or acceleration directly.
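As one plausible form of the further processing noted above (a generic finite-difference treatment, not necessarily what the robot 100 actually uses), joint velocity and acceleration may be derived from sampled joint positions:

```python
import numpy as np

def joint_rates(positions, dt):
    """Derive joint velocity and acceleration from sampled joint positions
    by finite differences. positions: joint angles (rad) sampled every dt
    seconds; returns velocity (rad/s) and acceleration (rad/s^2)."""
    velocity = np.gradient(positions, dt)
    acceleration = np.gradient(velocity, dt)
    return velocity, acceleration

# Example: a joint sweeping sinusoidally, sampled at 100 Hz.
t = np.arange(0.0, 1.0, 0.01)
theta = 0.5 * np.sin(2 * np.pi * t)
vel, acc = joint_rates(theta, dt=0.01)
```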
As the sensor system 130 gathers sensor data 134, a computing system 140 stores, processes, and/or communicates the sensor data 134 to various systems of the robot 100 (e.g., the computing system 140, the control system 170, the perception system 180, and/or the navigation system 200). In order to perform computing tasks related to the sensor data 134, the computing system 140 of the robot 100 (which is schematically depicted in
In some examples, the computing system 140 is a local system located on the robot 100. When located on the robot 100, the computing system 140 may be centralized (e.g., in a single location/area on the robot 100, for example, the body 110 of the robot 100), decentralized (e.g., located at various locations about the robot 100), or a hybrid combination of both (e.g., including a majority of centralized hardware and a minority of decentralized hardware). To illustrate some differences, a decentralized computing system 140 may allow processing to occur at an activity location (e.g., at a motor that moves a joint of a leg 120) while a centralized computing system 140 may allow for a central processing hub that communicates to systems located at various positions on the robot 100 (e.g., communicate to the motor that moves the joint of the leg 120).
Additionally or alternatively, the computing system 140 utilizes computing resources that are located remotely from the robot 100. For instance, the computing system 140 communicates via a network 150 with a remote system 160 (e.g., a remote server or a cloud-based environment). Much like the computing system 140, the remote system 160 includes remote computing resources, such as remote data processing hardware 162 and remote memory hardware 164. Here, sensor data 134 or other processed data (e.g., data processed locally by the computing system 140) may be stored in the remote system 160 and may be accessible to the computing system 140. In additional examples, the computing system 140 is configured to utilize the remote resources 162, 164 as extensions of the computing resources 142, 144 such that resources of the computing system 140 may reside on resources of the remote system 160.
In some implementations, as shown in
A given controller 172 may control the robot 100 by controlling movement about one or more joints J of the robot 100. In some configurations, the given controller 172 is software with programming logic that controls at least one joint J or a motor M which operates, or is coupled to, a joint J. For instance, the controller 172 controls an amount of force that is applied to a joint J (e.g., torque at a joint J). As programmable controllers 172, the number of joints J that a controller 172 controls is scalable and/or customizable for a particular control purpose. A controller 172 may control a single joint J (e.g., control a torque at a single joint J), multiple joints J, or actuation of one or more members 122, 128 (e.g., actuation of the hand member 128H) of the robot 100. By controlling one or more joints J, actuators, or motors M, the controller 172 may coordinate movement for all different parts of the robot 100 (e.g., the body 110, one or more legs 120, the arm 126). For example, to perform a behavior with some movements, a controller 172 may be configured to control movement of multiple parts of the robot 100 such as, for example, two legs 120a-b, four legs 120a-d, the arm 126, or any combination of legs 120 and/or arm 126 (e.g., two or four legs 120 combined with the arm 126). In some examples, a controller 172 is configured as an object-based controller that is set up to perform a particular behavior or set of behaviors for interacting with an interactable object.
In some examples, the control system 170 includes at least one controller 172, a path generator 174, a step locator 176, and a body planner 178. The control system 170 may be configured to communicate with at least one sensor system 130 and any other system of the robot 100 (e.g., the perception system 180 and/or the navigation system 200). The control system 170 performs operations and other functions using the computing system 140. The controller 172 is configured to control movement of the robot 100 to traverse the environment 10 based on input or feedback from the systems of the robot 100 (e.g., the sensor system 130, the perception system 180, and/or the navigation system 200). This may include movement between poses and/or behaviors of the robot 100. For example, the controller 172 controls different footstep patterns, leg patterns, body movement patterns, or vision system-sensing patterns.
In some implementations, the control system 170 includes specialty controllers that are dedicated to a particular control purpose. These specialty controllers may include, but need not be limited to, the path generator 174, the step locator 176, and/or the body planner 178. Referring to
The perception system 180 is a system of the robot 100 that helps the robot 100 to move more precisely in a terrain with various obstacles. As the sensors 132 collect sensor data 134 for the space about the robot 100 (i.e., the robot's environment 10), the perception system 180 uses the sensor data 134 to form one or more perception maps 182 for the environment 10. Once the perception system 180 generates a perception map 182, the perception system 180 is also configured to add information to the perception map 182 (e.g., by projecting sensor data 134 on a preexisting map) and/or to remove information from the perception map 182.
In some examples, the one or more perception maps 182 generated by the perception system 180 are a ground height map 182, 182a, a no step map 182, 182b, and/or a body obstacle map 182, 182c (collectively 182a-c in
The no step map 182b generally refers to a perception map that defines regions where the robot 100 is not allowed to step in order to advise the robot 100 when the robot 100 may step at a particular horizontal location (i.e., location in the X-Y plane). In some examples, much like the body obstacle map 182c and the ground height map 182a, the no step map 182b is partitioned into a grid of cells where each cell represents a particular area in the environment 10 about the robot 100. For instance, each cell is a three centimeter square. For ease of explanation, each cell exists within an X-Y plane within the environment 10. When the perception system 180 generates the no step map 182b, the perception system 180 may generate a Boolean value map where the Boolean value map identifies no step regions and step regions. A no step region refers to a region of one or more cells where an obstacle exists while a step region refers to a region of one or more cells where an obstacle is not perceived to exist. The perception system 180 may further process the Boolean value map such that the no step map 182b includes a signed-distance field. Here, the signed-distance field for the no step map 182b includes a distance to a boundary of an obstacle (e.g., a distance to a boundary of the no step region) and a vector v pointing to the boundary of the obstacle (e.g., defining the nearest direction to the boundary of the no step region).
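For illustration, the Boolean-map-plus-distance-field structure described above can be sketched with standard array tooling. The sketch below is a simplified, unsigned variant under assumed parameters; the 3 cm cell size matches the example above, but the library choice (scipy) and the obstacle layout are illustrative, not taken from this disclosure. A signed field would additionally negate distances computed inside the no step region.

```python
import numpy as np
from scipy import ndimage

CELL = 0.03  # meters per cell (three centimeter squares, per the example)
no_step = np.zeros((100, 100), dtype=bool)  # True = no step region
no_step[40:60, 40:60] = True                # a perceived obstacle

# For each free cell: distance (meters) to the nearest no-step cell, plus
# the index of that nearest cell, from which the vector v follows.
dist, nearest = ndimage.distance_transform_edt(
    ~no_step, sampling=CELL, return_indices=True)

rows, cols = np.indices(no_step.shape)
v = np.stack([nearest[0] - rows, nearest[1] - cols], axis=-1) * CELL
# dist[r, c] ~ distance to the no-step boundary; v[r, c] points toward it.
```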
The body obstacle map 182c generally determines whether the body 110 of the robot 100 may overlap a location in the X-Y plane with respect to the robot 100. In other words, the body obstacle map 182c identifies obstacles for the robot 100 to indicate whether the robot 100, by overlapping at a location in the environment 10, risks collision with or potential damage from obstacles near or at the same location. As a map of obstacles for the body 110 of the robot 100, systems of the robot 100 (e.g., the control system 170) may use the body obstacle map 182c to identify boundaries adjacent, or nearest to, the robot 100 as well as to identify directions (e.g., an optimal direction) to move the robot 100 in order to avoid an obstacle. In some examples, much like other perception maps 182, the perception system 180 generates the body obstacle map 182c according to a grid of cells (e.g., a grid of the X-Y plane). Here, each cell within the body obstacle map 182c includes a distance from an obstacle and a vector pointing to the closest cell that is an obstacle (i.e., a boundary of the obstacle).
Referring further to
As the navigation system 200 guides the robot 100 through movements that follow the navigation route 202, the navigation system 200 is configured to determine whether the navigation route 202 becomes obstructed by an object. That is, the navigation route 202 is derived from a topological map 204 (see for example,
Since the environment 10 may dynamically change from the time of recording the locations to the topological map 204, the navigation system 200 is configured to determine whether the navigation route 202 becomes obstructed by an object that was not previously discovered in its obstructing location when the locations used by the navigation route 202 were recorded. Stated differently, an object obstructs the navigation route 202 during execution of the route 202 when the object exists in the path of the route 202 but was not present in that location when the locations of the navigation route 202 were recorded to the topological map 204. This results in the object being an “unforeseeable obstacle” in the navigation route 202 because the initial mapping process that informs the navigation route 202 did not recognize the object in the obstructing location. This may occur when an object is moved or introduced to a mapped environment.
When an unforeseeable obstacle obstructs the route 202, the navigation system 200 is configured to attempt to generate an alternative path 206 to a location in the route 202 that avoids the unforeseeable obstacle. This alternative path 206 may deviate from the navigation route 202 temporarily, but then resume the navigation route 202 after the deviation.
Unlike other approaches to generating an obstacle avoidance path, the navigation system 200 aims to deviate from the route 202 only temporarily to avoid the unforeseeable obstacle, such that the robot 100 may return to using coarse features (e.g., topological features from the topological map 204) for the navigation route 202.
In this sense, successful obstacle avoidance for the navigation system 200 occurs when an obstacle avoidance path both (i) avoids the unforeseeable obstacle and (ii) enables the robot 100 to resume some portion of the navigation route 202. This technique to merge back with the navigation route 202 after obstacle avoidance may be advantageous because the navigation route 202 may be important for task or mission performance for the robot 100 (or an operator of the robot 100). For instance, an operator of the robot 100 tasks the robot 100 to perform an inspection task at a location of the route 202. By generating an obstacle avoidance route that continues on the route 202 after obstacle avoidance, the navigation system 200 aims to promote task or mission success for the robot 100.
To illustrate,
In the example of
In this respect, the robot 100 may not be able to successfully navigate to one or more locations, such as the second location (second waypoint 212b), but may resume a portion of the navigation route 202 after avoiding the obstacle 20. For instance, the navigation route 202 includes more locations subsequent to the third location (waypoint 212c) and the alternative path 206 enables the robot 100 to continue to those locations after the alternative path 206 navigates the robot 100 to the third location (i.e., a location included in the route 202). The waypoint 212c is an untraveled waypoint along the navigation route 202 that in
In some implementations, an object that is an unforeseeable obstacle 20 (also referred to as obstacle 20) may be specific to the type of robot 100 performing navigation. That is, whether an object blocking a portion of the route 202 is truly an obstacle may vary based on the morphology of the robot 100 (e.g., the locomotion structures of the robot 100). For instance, some objects may not be considered “obstacles” because the robot 100 is implemented with a footpath range of motion that steps over the object. Here, if the robot 100 is able to step across the object without deviating from the route 202, the object is not an obstacle. This is true even when the robot 100 may need to deviate from a nominal step height, so long as the robot 100 can execute a step height within its capabilities. In contrast, if the robot 100 were a wheel-based or track-based robot, the robot 100 would not have the capability to step over the object, and most objects would cause these types of robots 100 to deviate from the route 202 to avoid the object. With a wheel-based or track-based robot, many objects that block the route 202 would be considered an unforeseeable obstacle 20. That said, in situations where a wheel-based or track-based robot is able to roll over the object, the object likewise may not be considered an obstacle for route navigation purposes. In some implementations, when the robot 100 is a quadruped robot 100 with four legs 120, there may be a situation where an object does not pose a foot placement issue (i.e., does not obstruct or hinder a foot touchdown), but nonetheless poses an obstruction for the body 110 of the robot 100. For instance, an object occupies a plane offset from the ground plane, but has open space underneath the plane. In this sense, for something that resembles a table or a platform, the robot 100 may be able to place one of its feet 124 underneath the occupied plane, but the body 110 of the robot 100 would collide with the plane. Here, due to the shape of the body 110 and the legs 120 (e.g., the legs 120 offset the body 110 from a support surface), this type of object would be considered an obstacle. For reasons such as this, the shape and/or design of the robot 100 may dictate whether an object actually functions as an obstacle 20.
Referring now to
In some implementations, the generator 210 builds the topological map 204 by executing at least one waypoint heuristic (e.g., waypoint search algorithm) that triggers the generator 210 to record a waypoint placement at a particular location in the topological map 204. For example, the waypoint heuristic is configured to trigger on a threshold level of feature detection within the image data 134 at a location of the robot 100 (e.g., when generating or updating the topological map 204).
The generator 210 (e.g., using a waypoint heuristic) may identify features within the environment 10 that function as reliable vision sensor features offering repeatability for the robot 100 to maneuver about the environment 10. For instance, a waypoint heuristic of the generator 210 is pre-programmed for feature recognition (e.g., programmed with stored features) or programmed to identify features where spatial clusters of volumetric image data 134 occur (e.g., corners of rooms or edges of walls). In response to the at least one waypoint heuristic triggering the waypoint placement, the generator 210 records the waypoint 212 on the topological map 204. This waypoint identification process may be repeated by the generator 210 as the robot 100 drives through an area (e.g., the robotic environment 10). For instance, an operator of the robot 100 manually drives the robot 100 through an area for an initial mapping process that establishes the waypoints 212 for the topological map 204.
When recording each waypoint 212, the generator 210 generally associates a waypoint edge 214 (also referred to as an edge 214) with a respective waypoint 212 such that the topological map 204 produced by the generator 210 includes both waypoints 212 and their respective edges 214. An edge 214 is configured to indicate how one waypoint 212 (e.g., a first waypoint 212a) is related to another waypoint 212 (e.g., a second waypoint 212b). For example, the edge 214 represents a positional relationship between waypoints 212 (e.g., adjacent waypoints 212). In other words, the edge 214 is a connection or designated path between two waypoints 212 (e.g., the edge 214a shown in
Pose transformations may describe a position and/or orientation of one coordinate frame within an environment relative to another coordinate frame. In other words, the pose transformations serve as movement instructions that dictate one or more positions and/or orientations for the robot 100 to assume to successfully navigate along the edge 214 from a source waypoint 212 to a destination waypoint 212. In some implementations, the edge 214 includes a full three-dimensional transform (e.g., six numbers). Some of these numbers include various estimates such as, for example, a dead reckoning pose estimation, a vision based estimation, or other estimations based on kinematics and/or inertial measurements of the robot 100.
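One conventional reading of such a full three-dimensional transform with six numbers is a rigid-body transform parameterized by three translational and three rotational components; whether the disclosure's six numbers are parameterized exactly this way is an assumption:

```latex
\[
T_{s\to d}=
\begin{bmatrix}
R(\phi,\theta,\psi) & \mathbf{t}\\
\mathbf{0}^{\top} & 1
\end{bmatrix}\in SE(3),
\qquad \mathbf{t}=(x,\,y,\,z)^{\top}
\]
```

Here, T maps coordinates expressed in the source waypoint's frame into the destination waypoint's frame, with the angles denoting roll, pitch, and yaw.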
In some examples, the edge 214 includes annotations that provide further indication/description of the environment 10. Some examples of annotations include a description or an indication that an edge 214 is associated with or located on some feature of the environment 10. For instance, an annotation for an edge 214 specifies that the edge 214 is located on stairs or crosses a doorway. These annotations may aid the robot 100 during maneuvering especially when visual information is missing or lacking (e.g., a void such as a doorway). In some configurations, the annotations include directional constraints (also may be referred to as pose constraints). A directional constraint of the annotation may specify an alignment and/or an orientation (e.g., a pose) of the robot 100 at a particular environment feature. For example, the annotation specifies a particular alignment or pose for the robot 100 before traveling along stairs or down a narrow corridor which may restrict the robot 100 from turning.
In some implementations, each waypoint 212 of the topological map 204 also includes sensor data 134 corresponding to data collected by the sensor system 130 of the robot 100 when the generator 210 recorded a respective waypoint 212 to the topological map 204. Here, the sensor data 134 at a waypoint 212 may enable the robot 100 to localize by comparing real-time sensor data 134 gathered as the robot 100 traverses the environment 10 according to the topological map 204 (e.g., a route 202 from the topological map 204) with sensor data 134 stored for the waypoints 212 of the topological map 204.
In some configurations, after the robot 100 moves along an edge 214 (e.g., with the intention to be at a target waypoint 212), the robot 100 is configured to localize by directly comparing real-time sensor data 134 with the topological map 204 (e.g., sensor data 134 associated with the intended target waypoint 212 of the topological map 204). Here, by storing raw or near-raw sensor data 134 with minimal processing for the waypoints 212 of the topological map 204, the robot 100 may use real-time sensor data 134 to localize efficiently as the robot 100 maneuvers within the mapped environment 10. In some examples, an iterative closest points (ICP) algorithm localizes the robot 100 with respect to a waypoint 212.
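ICP itself is a well-known algorithm; a minimal two-dimensional sketch appears below for illustration only, and is far simpler than a production localizer, which would add an initial pose estimate, outlier rejection, and convergence checks. The sketch aligns live sensor points to the points stored at a waypoint and returns the resulting pose correction.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_2d(source, target, iterations=20):
    """Minimal 2-D iterative closest point: align 'source' (live sensor
    points) to 'target' (points stored at a waypoint)."""
    src = np.asarray(source, dtype=float).copy()
    tgt = np.asarray(target, dtype=float)
    tree = cKDTree(tgt)
    R_total, t_total = np.eye(2), np.zeros(2)
    for _ in range(iterations):
        _, idx = tree.query(src)                 # nearest-point matches
        matched = tgt[idx]
        mu_s, mu_m = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_m)    # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T                           # best-fit rotation (Kabsch)
        if np.linalg.det(R) < 0:                 # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_m - R @ mu_s
        src = src @ R.T + t                      # apply the incremental fit
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total                      # estimated pose correction
```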
Because the generator 210 produces the topological map 204 using waypoints 212 and edges 214, the topological map 204 may be locally consistent (e.g., spatially consistent within an area due to neighboring waypoints), but does not need to be globally accurate and/or consistent. That is, as long as geometric relations (e.g., edges 214) between adjacent waypoints 212 are roughly accurate, the topological map 204 does not require precise global metric localization for the robot 100 and sensed objects within the environment 10. Without requiring this for the topological map 204, a navigation route 202 derived or built using the topological map 204 also does not need precise global metric information. Moreover, since the topological map 204 may be built only using waypoints 212 and a relationship between waypoints (e.g., edges 214), the topological map 204 may be considered an abstraction or high-level map in comparison to a metric map. That is, the topological map 204 can be devoid of other metric data about the mapped environment that does not relate to waypoints 212 or their corresponding edges 214. For instance, the mapping process (e.g., by the generator) that creates the topological map 204 may not store or record other metric data or the mapping process may remove recorded metric data to form a topological map 204 of waypoints 212 and edges 214.
Either way, navigating with the topological map 204 may simplify the hardware needed for navigation and/or the computational resources used during navigation. That is, topological-based navigation may operate with low-cost vision and/or low-cost inertial measurement unit (IMU) sensors when compared to navigation using metric localization that often requires expensive LIDAR sensors and/or expensive IMU sensors. Metric-based navigation tends to demand more computational resources than topological-based navigation because metric-based navigation often performs localization at a much higher frequency than topological navigation (e.g., with waypoints 212). For instance, the common navigation approach of Simultaneous Localization and Mapping (SLAM) using a global occupancy grid is constantly performing robot localization.
Referring to
The route executor 220 (also referred to as the executor 220) is configured to receive and to execute the navigation route 202. To execute the navigation route 202, the executor 220 may coordinate with other systems of the robot 100 to control the locomotion-based structures of the robot 100 (e.g., the legs) to drive the robot 100 along the route 202 through the sequence of waypoints 212. For instance, the executor 220 communicates the movement instructions of edges 214 connecting waypoints 212 in the sequence of waypoints 212 of the route 202 to the control system 170. The control system 170 may use these movement instructions to position the robot 100 (e.g., in an orientation) according to one or more pose transforms to successfully move the robot 100 along the edges 214 of the route 202.
While the robot 100 is traveling along the route 202, the executor 220 is also configured to determine whether the robot 100 is unable to execute a particular movement instruction for a particular edge 214. For instance, the robot 100 is unable to execute a movement instruction for an edge 214 because the robot 100 encounters an unforeseeable obstacle 20 while moving along the edge 214 to a waypoint 212. Here, the executor 220 recognizes that an unforeseeable obstacle 20 blocks the path of the robot 100 (e.g., using real-time or near real-time sensor data 134) or otherwise precludes the robot 100 from executing the movement instructions for the edge 214 and is configured to determine whether an alternative path 206 for the robot 100 exists to an untraveled waypoint 212, 212U in the sequence of the route 202. An untraveled waypoint 212U refers to a waypoint 212 of the waypoint sequence for the route 202 that the robot 100 has not already successfully traveled to. For instance, if the robot 100 had already traveled to three of the nine waypoints 212a-c of the route 202, the executor 220 would try to find an alternative path 206 to one of the remaining six waypoints 212d-i if possible.
The executor 220 may first attempt to find the alternative path 206 to the next untraveled waypoint 212U before attempting to find the alternative path 206 to subsequent untraveled waypoints 212U, such as attempting to find the alternative path 206 from waypoint 212c to waypoint 212d before finding the alternative path 206 from waypoint 212c to one of waypoints 212e-i. In this sense, the alternative path 206 is an obstacle avoidance path that avoids the unforeseeable obstacle 20 and also a path that allows the robot 100 to resume the navigation route 202 (e.g., toward a particular goal or task). This means that after the robot 100 travels along the alternative path 206 to a destination of an untraveled waypoint 212U, the executor 220 continues executing the route 202 from that destination of the alternative path 206. This approach enables the robot 100 to return to navigation using the sparse topological map 204. In other words, the alternative path 206 avoiding the obstacle 20 may deviate from the navigation route 202 to avoid the obstacle 20 and merge with or continue the navigation route 202 after avoiding the obstacle 20 to return to navigating using the less resource-intensive topological map 204.
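In sketch form (again with hypothetical interfaces rather than an actual API), this search order might look like the following: try the next untraveled waypoint first, then later ones, and fall back to exploration when none is reachable yet.

```python
def choose_detour(position, untraveled_waypoints, local_map):
    """Illustrative search for an alternative-path destination.

    untraveled_waypoints: waypoints in route order that the robot has not
    yet reached; local_map: the robot's finite local obstacle map.
    """
    for waypoint in untraveled_waypoints:          # route order
        if not local_map.contains(waypoint.position):
            continue                               # outside the local map
        path = local_map.plan(position, waypoint.position)
        if path is not None:                       # obstacle-free path found
            return waypoint, path
    return None, None   # no reachable untraveled waypoint: explore instead
```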
For example, referring to the example of
In some implementations, when the executor 220 determines that an unforeseeable obstacle 20 blocks an edge 214, the executor 220 identifies that the topological map 204 fails to provide an avoidance path around the unforeseeable obstacle 20. This is usually the case because the topological map 204 includes only waypoints 212 and edges 214 that were recorded during the mapping process (e.g., by the generator 210). Since the unforeseeable obstacle 20 was not present at the time of mapping, the topological map 204 on its own cannot supply an alternative path 206.
In other words, the generator 210 did not anticipate needing a path or edge 214 resembling the alternative path in
In some configurations, when an edge 214 is blocked by an unforeseeable object 20, the executor 220 resorts to other maps that are available from the systems of the robot 100. In some examples, the executor 220 uses or generates a local obstacle map 222 from current sensor data 134 captured by the sensor system 130 of the robot 100. Here, the local obstacle map 222 may refer to a more detailed map of the environment 10 than the topological map 204, but only for a local area surrounding the robot 100 (e.g., a three meter by three meter square area).
In some configurations, the local obstacle map 222 includes an occupancy grid where each cell within the grid designates whether an obstacle is present in that cell or not. The executor 220 may then generate the alternative path 206 using the unoccupied cells of the occupancy grid in combination with the positions of the untraveled waypoints 212U.
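One simple way to realize generating the alternative path 206 using the unoccupied cells is a breadth-first search over the occupancy grid; a real planner might instead use A* with robot-footprint inflation. The following self-contained sketch is illustrative only:

```python
from collections import deque

def grid_path(occ, start, goal):
    """Breadth-first search over an occupancy grid (True = occupied).
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(occ), len(occ[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:       # reconstruct start -> goal
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and not occ[nr][nc] and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None                           # no unoccupied route exists

# Example: a 5x5 grid with a wall across most of row 2; the path detours.
occ = [[False] * 5 for _ in range(5)]
for c in range(4):
    occ[2][c] = True
print(grid_path(occ, (0, 0), (4, 0)))
```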
In some examples, the local obstacle map 222 is formed in whole or in part using the perception maps 182 from the perception system 180 (e.g., the ground height map 182a, the no step map 182b, and/or the body obstacle map 182c) for the local area surrounding the robot 100.
With the local obstacle map 222 of finite size, the executor 220 may determine which untraveled waypoint 212U should be the destination of the alternative path 206 by determining which untraveled waypoints 212U exist within the bounds of the local obstacle map 222.
Referring to the example of
For reference,
In some examples, the executor 220 functions methodically such that, for each untraveled waypoint 212U, the executor 220 identifies whether a respective untraveled waypoint 212U exists within the local obstacle map 222. For instance, the executor 220 performs this identification for each untraveled waypoint sequentially following the waypoint sequence of the route 202. For the example of
In the example of
Thus, as the robot 100 moves along the exploration path 224, the local obstacle map 222, which moves with the robot 100, continues to span its finite area around the robot's current position. This means that an untraveled waypoint 212U previously outside the bounds of the local obstacle map 222 may come within the bounds of the local obstacle map 222. In this respect, the robot 100 explores along the exploration path 224 until an untraveled waypoint 212U exists within the bounds of the local obstacle map 222.
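In the same hypothetical sketch form as above, this exploration fallback reduces to a loop that advances toward the waypoint's recorded position until the waypoint falls within the moving local map:

```python
def explore_toward(robot, waypoint):
    """Illustrative exploration fallback; the robot API is hypothetical."""
    while not robot.local_obstacle_map().contains(waypoint.position):
        # Take an obstacle-avoiding step in the waypoint's general direction;
        # the local map travels with the robot, so its bounds shift too.
        robot.step_toward(waypoint.position, avoid_obstacles=True)
    # The waypoint now lies within the local map: plan directly to it.
    return robot.local_obstacle_map().plan(robot.position, waypoint.position)
```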
As shown in the example of
The example of
The computing device 400 includes a processor 410 (e.g., data processing hardware 142, 162), memory 420 (e.g., memory hardware 144, 164), a storage device 430, a high-speed interface/controller 440 connecting to the memory 420 and high-speed expansion ports 450, and a low-speed interface/controller 460 connecting to a low-speed bus 470 and the storage device 430. The components 410, 420, 430, 440, 450, and 460 are interconnected using various busses and may be mounted on a common motherboard or in other manners as appropriate. The processor 410 can process instructions for execution within the computing device 400, including instructions stored in the memory 420 or on the storage device 430, to display graphical information for a graphical user interface (GUI) on an external input/output device, such as a display 480 coupled to the high-speed interface 440. In other implementations, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple computing devices 400 may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).
The memory 420 stores information non-transitorily within the computing device 400. The memory 420 may be a computer-readable medium, a volatile memory unit(s), or non-volatile memory unit(s). The non-transitory memory 420 may be physical devices used to store programs (e.g., sequences of instructions) or data (e.g., program state information) on a temporary or permanent basis for use by the computing device 400. Examples of non-volatile memory include, but are not limited to, flash memory and read-only memory (ROM)/programmable read-only memory (PROM)/erasable programmable read-only memory (EPROM)/electronically erasable programmable read-only memory (EEPROM) (e.g., typically used for firmware, such as boot programs). Examples of volatile memory include, but are not limited to, random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), phase change memory (PCM) as well as disks or tapes.
The storage device 430 is capable of providing mass storage for the computing device 400. In some implementations, the storage device 430 is a computer-readable medium. In various different implementations, the storage device 430 may be a floppy disk device, a hard disk device, an optical disk device, a tape device, a flash memory or other similar solid-state memory device, or an array of devices, including devices in a storage area network or other configurations. In additional implementations, a computer program product is tangibly embodied in an information carrier. The computer program product contains instructions that, when executed, perform one or more methods, such as those described above. The information carrier is a computer- or machine-readable medium, such as the memory 420, the storage device 430, or memory on the processor 410.
The high-speed controller 440 manages bandwidth-intensive operations for the computing device 400, while the low-speed controller 460 manages less bandwidth-intensive operations. Such an allocation of duties is exemplary only. In some implementations, the high-speed controller 440 is coupled to the memory 420, the display 480 (e.g., through a graphics processor or accelerator), and the high-speed expansion ports 450, which may accept various expansion cards (not shown). In some implementations, the low-speed controller 460 is coupled to the storage device 430 and a low-speed expansion port 470. The low-speed expansion port 470, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.
The computing device 400 may be implemented in a number of different forms. For example, it may be implemented as a standard server 400a, multiple times in a group of such servers 400a, as a laptop computer 400b, as part of a rack server system 400c, as part of the robot 100, or as part of the remote control.
Various implementations of the systems and techniques described herein can be realized in digital electronic and/or optical circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software, software applications or code) include machine instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms “machine-readable medium” and “computer-readable medium” refer to any computer program product, non-transitory computer readable medium, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term “machine-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor.
The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit). Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
To provide for interaction with a user, one or more aspects of the disclosure can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube), LCD (liquid crystal display) monitor, or touch screen for displaying information to the user. In certain implementations, interaction is facilitated by a keyboard and a pointing device, e.g., a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser.
A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. For example, while processes or blocks are presented in a given order, alternative embodiments may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified. Each of these processes or blocks may be implemented in a variety of different ways. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed in parallel, or may be performed at different times. Furthermore, the elements and acts of the various embodiments described above can be combined to provide further embodiments. Indeed, the methods and systems described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the methods and systems described herein may be made without departing from the spirit of the disclosure. Accordingly, other implementations are within the scope of the following claims.
This application claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Application 63/202,289, filed on Jun. 4, 2021. The disclosure of this prior application is considered part of the disclosure of this application and is hereby incorporated by reference in its entirety.