A robot is generally a reprogrammable and multifunctional manipulator, often designed to move material, parts, tools, or specialized devices through variable programmed motions for performance of tasks. Robots may be manipulators that are physically anchored (e.g., industrial robotic arms), mobile robots that move throughout an environment (e.g., using legs, wheels, or traction-based mechanisms), or some combination of a manipulator and a mobile robot. Robots are utilized in a variety of industries including, for example, manufacturing, warehouse logistics, transportation, hazardous environments, exploration, and healthcare.
In some embodiments of the present disclosure, a mobile robot includes a robot body, one or more locomotion-based structures, coupled to the body, configured to move the mobile robot about an environment, one or more sensors, supported by the body, configured to output data concerning one or more sensed conditions of the environment, at least one processor, and at least one computer-readable medium encoded with instructions which, when executed by the at least one processor, cause the mobile robot to receive, by the one or more sensors, data corresponding to one or more locations of the mobile robot along a path the mobile robot is following within the environment on a first occasion, to determine, based on the data, that one or more stairs exist in a first region of the environment, to determine, when the mobile robot is at a position along the path the mobile robot is following on the first occasion, that the mobile robot is expected to enter the first region, and to control the mobile robot to operate in a first operational mode associated with traversal of stairs when it is determined that one or more stairs exist in the first region and the mobile robot is expected to enter the first region.
In some embodiments, a mobile robot includes a robot body, one or more locomotion-based structures, coupled to the body, configured to move the mobile robot about an environment, one or more sensors, supported by the body, configured to output data concerning one or more sensed conditions of the environment, at least one processor, and at least one computer-readable medium encoded with instructions which, when executed by the at least one processor, cause the mobile robot to determine, based on data received from the one or more sensors of the mobile robot operating within the environment, that one or more stairs exist in a first region of the environment, to determine, while an operator is guiding movement of the mobile robot about the environment, that the mobile robot is expected to enter the first region, and to control the mobile robot to operate in a first operational mode associated with traversal of stairs when it is determined that one or more stairs exist in the first region and the mobile robot is expected to enter the first region.
In some embodiments, a mobile robot includes a robot body, one or more locomotion-based structures, coupled to the body, configured to move the mobile robot about an environment, one or more sensors, supported by the body, configured to output data concerning one or more sensed conditions of the environment, at least one processor, and at least one computer-readable medium encoded with instructions which, when executed by the at least one processor, cause the mobile robot to receive, by the one or more sensors, data corresponding to one or more locations of the mobile robot along a path the mobile robot is following within the environment on a first occasion, to determine, based on the data, that a first condition exists for terrain in a first region of the environment, to determine, when the mobile robot is at a position along the path the mobile robot is following on the first occasion, that the mobile robot is expected to enter the first region, and to control the mobile robot to operate in a first operational mode associated with the first condition when it is determined that the first condition exists for terrain in the first region and the mobile robot is expected to enter the first region.
In some embodiments, a mobile robot includes a robot body, one or more locomotion-based structures, coupled to the body, configured to move the mobile robot about an environment, one or more sensors, supported by the body, configured to output data concerning one or more sensed conditions of the environment, at least one processor, and at least one computer-readable medium encoded with instructions which, when executed by the at least one processor, cause the mobile robot to determine, based on data received from the one or more sensors of the mobile robot operating within the environment, that a first condition exists for terrain in a first region of the environment, to determine, while an operator is guiding movement of the mobile robot about the environment, that the mobile robot is expected to enter the first region, and to control the mobile robot to operate in a first operational mode associated with traversal of terrain for which the first condition exists when it is determined that the first condition exists for terrain in the first region and the mobile robot is expected to enter the first region.
In some embodiments, a method involves receiving, by one or more sensors of a robot, data corresponding to one or more locations of the robot along a path the robot is following within an environment on a first occasion; determining, based on the data, that one or more stairs exist in a first region of the environment; determining, when the robot is at a position along the path the robot is following on the first occasion, that the robot is expected to enter the first region; and controlling the robot to operate in a first operational mode associated with traversal of stairs when it is determined that one or more stairs exist in the first region and the robot is expected to enter the first region.
In some embodiments, a method involves determining, based on data received from one or more sensors of a robot operating within an environment, that one or more stairs exist in a first region of the environment; while an operator is guiding movement of the robot about the environment, determining that the robot is expected to enter the first region; and controlling the robot to operate in a first operational mode associated with traversal of stairs when it is determined that one or more stairs exist in the first region and the robot is expected to enter the first region.
In some embodiments, a method involves receiving, by one or more sensors of a robot, data corresponding to one or more locations of the robot along a path the robot is following within an environment on a first occasion; determining, based on the data, that a first condition exists for terrain in a first region of the environment; determining, when the robot is at a position along the path the robot is following on the first occasion, that the robot is expected to enter the first region; and controlling the robot to operate in a first operational mode associated with the first condition when it is determined that the first condition exists for terrain in the first region and the robot is expected to enter the first region.
In some embodiments, a method involves determining, based on data received from one or more sensors of a robot operating within an environment, that a first condition exists for terrain in a first region of the environment; while an operator is guiding movement of the robot about the environment, determining that the robot is expected to enter the first region; and controlling the robot to operate in a first operational mode associated with traversal of terrain for which the first condition exists when it is determined that the first condition exists for terrain in the first region and the robot is expected to enter the first region.
In some embodiments, a system includes at least one processor, and at least one computer-readable medium encoded with instructions which, when executed by the at least one processor, cause the system to receive, by one or more sensors of a robot, data corresponding to one or more locations of the robot along a path the robot is following within an environment on a first occasion, to determine, based on the data, that one or more stairs exist in a first region of the environment, to determine, when the robot is at a position along the path the robot is following on the first occasion, that the robot is expected to enter the first region, and to control the robot to operate in a first operational mode associated with traversal of stairs when it is determined that one or more stairs exist in the first region and the robot is expected to enter the first region.
In some embodiments, a system includes at least one processor, and at least one computer-readable medium encoded with instructions which, when executed by the at least one processor, cause the system to determine, based on data received from one or more sensors of a robot operating within an environment, that one or more stairs exist in a first region of the environment, to determine, while an operator is guiding movement of the robot about the environment, that the robot is expected to enter the first region, and to control the robot to operate in a first operational mode associated with traversal of stairs when it is determined that one or more stairs exist in the first region and the robot is expected to enter the first region.
In some embodiments, a system includes at least one processor, and at least one computer-readable medium encoded with instructions which, when executed by the at least one processor, cause the system to receive, by one or more sensors of a robot, data corresponding to one or more locations of the robot along a path the robot is following within an environment on a first occasion, to determine, based on the data, that a first condition exists for terrain in a first region of the environment, to determine, when the robot is at a position along the path the robot is following on the first occasion, that the robot is expected to enter the first region, and to control the robot to operate in a first operational mode associated with the first condition when it is determined that the first condition exists for terrain in the first region and the robot is expected to enter the first region.
In some embodiments, a system includes at least one processor, and at least one computer-readable medium encoded with instructions which, when executed by the at least one processor, cause the system to determine, based on data received from one or more sensors of a robot operating within an environment, that a first condition exists for terrain in a first region of the environment, to determine, while an operator is guiding movement of the robot about the environment, that the robot is expected to enter the first region, and to control the robot to operate in a first operational mode associated with traversal of terrain for which the first condition exists when it is determined that the first condition exists for terrain in the first region and the robot is expected to enter the first region.
In some embodiments, at least one non-transitory, computer-readable medium is encoded with instructions which, when executed by at least one processor of a system, cause the system to receive, by one or more sensors of a robot, data corresponding to one or more locations of the robot along a path the robot is following within an environment on a first occasion, to determine, based on the data, that one or more stairs exist in a first region of the environment, to determine, when the robot is at a position along the path the robot is following on the first occasion, that the robot is expected to enter the first region, and to control the robot to operate in a first operational mode associated with traversal of stairs when it is determined that one or more stairs exist in the first region and the robot is expected to enter the first region.
In some embodiments, at least one non-transitory, computer-readable medium is encoded with instructions which, when executed by at least one processor of a system, cause the system to determine, based on data received from one or more sensors of a robot operating within an environment, that one or more stairs exist in a first region of the environment, to determine, while an operator is guiding movement of the robot about the environment, that the robot is expected to enter the first region, and to control the robot to operate in a first operational mode associated with traversal of stairs when it is determined that one or more stairs exist in the first region and the robot is expected to enter the first region.
In some embodiments, at least one non-transitory, computer-readable medium is encoded with instructions which, when executed by at least one processor of a system, cause the system to receive, by one or more sensors of a robot, data corresponding to one or more locations of the robot along a path the robot is following within an environment on a first occasion, to determine, based on the data, that a first condition exists for terrain in a first region of the environment, to determine, when the robot is at a position along the path the robot is following on the first occasion, that the robot is expected to enter the first region, and to control the robot to operate in a first operational mode associated with the first condition when it is determined that the first condition exists for terrain in the first region and the robot is expected to enter the first region.
In some embodiments, at least one non-transitory, computer-readable medium is encoded with instructions which, when executed by at least one processor of a system, cause the system to determine, based on data received from one or more sensors of a robot operating within an environment, that a first condition exists for terrain in a first region of the environment, to determine, while an operator is guiding movement of the robot about the environment, that the robot is expected to enter the first region, and to control the robot to operate in a first operational mode associated with traversal of terrain for which the first condition exists when it is determined that the first condition exists for terrain in the first region and the robot is expected to enter the first region.
The foregoing apparatus and method embodiments may be implemented with any suitable combination of aspects, features, and acts described above or in further detail below. These and other aspects, embodiments, and features of the present teachings can be more fully understood from the following description in conjunction with the accompanying drawings.
Various aspects and embodiments will be described with reference to the following figures. It should be appreciated that the figures are not necessarily drawn to scale. In the drawings, each identical or nearly identical component that is illustrated in various figures is represented by a like numeral. For purposes of clarity, not every component may be labeled in every drawing.
Various techniques that can be employed by robots to travel within environments that include stairs, among other features, are described in prior-filed patent applications by the Assignee of the present disclosure, including U.S. Patent Application Publication No. 2021/0323618 (“the '618 Publication”), U.S. Patent Application Publication No. 2021/0333804 (“the '804 Publication”), and U.S. Pat. No. 11,123,869 (“the '869 Patent”), the entire contents of each of which are incorporated herein by reference. As those applications describe, a robot may be configured so that it can be selectively put in an operational mode designed specifically to perceive and traverse stairs. That operational mode is referred to herein as “stairs mode.” Examples of particular settings for the robot that can (A) facilitate the perception of nuances of stairs, and (B) optimally control aspects of the robot to successfully traverse stairs, are described in the '804 Publication and the '869 Patent, and are also described in additional detail below.
The '618 Publication further describes how a robot may undergo an initial mapping process during which the robot may move about an environment (typically in response to commands input by a user to a tablet or other controller) to gather data (e.g., via one or more sensors) about the environment and may generate a topological map that defines “waypoints” of the robot as well as “edges” representing paths between respective pairs of such waypoints. Individual waypoints may, for example, represent sensor data, fiducials, and/or robot pose information at specific times and locations, whereas individual edges may connect waypoints topologically and define the local transform between the reference frames of the interconnected waypoints. When the robot detects stairs during such an initial mapping process, it may annotate the map accordingly to identify the region(s) in which the stairs were detected. After such a topological map has been generated, the robot may autonomously traverse a path including the waypoints identified on the map and, based on the annotations in the map that indicate the regions at which stairs were detected, may automatically enter the “stairs mode” before entering the indicated regions.
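The waypoint-and-edge structure described above can be sketched as a small annotated graph. This is an illustrative model only; the class, field, and method names below (Waypoint, TopologicalMap, mark_stairs, and so on) are hypothetical and are not identifiers drawn from the '618 Publication:

```python
from dataclasses import dataclass, field

@dataclass
class Waypoint:
    """A node on the topological map: a robot pose recorded at one location."""
    name: str
    pose: tuple                                    # (x, y, yaw) in a local frame (assumed)
    annotations: set = field(default_factory=set)  # e.g. {"stairs"} when stairs were detected

@dataclass
class Edge:
    """Connects two waypoints and stores the local transform between their frames."""
    src: str
    dst: str
    transform: tuple   # relative (dx, dy, dyaw) from src frame to dst frame (assumed)

class TopologicalMap:
    def __init__(self):
        self.waypoints: dict = {}
        self.edges: list = []

    def add_waypoint(self, wp: Waypoint):
        self.waypoints[wp.name] = wp

    def connect(self, src: str, dst: str, transform: tuple):
        # Edges connect waypoints topologically rather than in one global frame.
        self.edges.append(Edge(src, dst, transform))

    def mark_stairs(self, name: str):
        """Annotate a waypoint's region so stairs were recorded during mapping."""
        self.waypoints[name].annotations.add("stairs")

    def needs_stairs_mode(self, name: str) -> bool:
        """True if the map says to enter stairs mode before reaching this waypoint."""
        return "stairs" in self.waypoints[name].annotations
```

Annotating a waypoint during the initial mapping pass is what later allows autonomous traversal to switch into “stairs mode” before entering the annotated region, rather than reacting only once stairs are underfoot.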
The inventor of the present disclosure has recognized and appreciated that the foregoing approaches may be improved. For instance, in the absence of a previously generated map that identifies stair locations, the robot may not enter the “stairs mode” unless the operator of the robot explicitly instructs the robot to do so. For example, after using a joystick or the like on a controller to steer the robot to the bottom or top of a staircase, the operator may be required to hit a button or other element on the controller to transition the robot from its current operational mode and corresponding gait (e.g., “walk,” “crawl,” “jog,” etc.) to the “stairs mode” before again moving the joystick to steer the robot to begin traversing, e.g., ascending or descending, the staircase. Imposing such a requirement (i.e., to manually transition the robot into and out of “stairs mode”) on the operator of the robot is both inconvenient for the operator and potentially dangerous for the robot should the operator neglect to make such a transition. For instance, the robot could tumble down the stairs and become damaged or otherwise cause harm in various ways if not transitioned into “stairs mode” prior to attempting to traverse stairs. To facilitate manual toggling into and out of “stairs mode,” some previous systems included at least one additional, readily accessible user interface (UI) element on the controller for the robot, thus adding complexity and clutter to the user interface.
Some embodiments of the present disclosure provide a robot configured to automatically transition into and out of “stairs mode” as the robot is traveling along a path through an environment. As noted above, in some prior implementations, transitioning a robot into a “stairs mode” included configuring the robot, e.g., by adjusting one or more settings, to both (A) facilitate the perception of stairs, and (B) enable the robot to successfully traverse stairs. In accordance with some embodiments of the present disclosure, a robot may be configured to employ some or all of the “stair perception” functionality for identifying stairs on at least certain occasions when it is not also employing the “stair traversal” functionality. As explained in more detail below, in some implementations, by continuously employing at least certain aspects of the “stair perception” functionality, a robot may reliably detect stairs as it is moving about its environment and may be able to accurately predict when it needs to also implement the “stair traversal” functionality to enable the robot to successfully traverse stairs.
In some implementations, a robot may be configured to detect the presence or absence of stairs in the environment of the robot using sensor data that is acquired while the robot is traveling along a path. For example, a robot may be configured such that, should an operator use a joystick or the like to steer the robot in the direction of stairs, the robot may, as the robot is approaching the stairs, acquire and use sensor data to identify the existence of the stairs and then automatically switch the robot to “stairs mode” if the robot determines that its planned path will intersect the identified stairs. Further, as explained in more detail below, a similar technique may be employed to identify regions including other features (e.g., ice patches, steep hills, loose gravel, etc.) within an environment, and then automatically transition the robot to a specialized operational mode to enable effective and safe traversal over or through such regions, e.g., when the robot determines that its planned path will intersect a region in which such a feature is located.
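The mode-switching decision just described, i.e., detect a stairs region from live sensor data and switch modes only if the planned path will intersect that region, can be sketched as follows. The axis-aligned bounding-box representation of a region and the mode names "stairs" and "walk" are simplifying assumptions for illustration, not the disclosed implementation:

```python
def path_intersects_region(path, region):
    """True if any upcoming path point falls inside an axis-aligned region.

    path:   list of (x, y) points the robot plans to traverse
    region: (xmin, ymin, xmax, ymax) box around detected stairs (simplified)
    """
    xmin, ymin, xmax, ymax = region
    return any(xmin <= x <= xmax and ymin <= y <= ymax for x, y in path)

def select_mode(current_mode, detected_stair_regions, planned_path):
    """Enter 'stairs' mode before the path reaches a detected stairs region,
    and drop back to a default gait once no stairs lie on the planned path."""
    if any(path_intersects_region(planned_path, r) for r in detected_stair_regions):
        return "stairs"
    return "walk" if current_mode == "stairs" else current_mode
```

Re-evaluating such a check at a high rate as new sensor data arrives is one way to make the transition automatic regardless of whether the operator or an autonomous planner is steering.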
Stairs 20, 20a-n generally refer to a group of more than one stair 20 (i.e., a group of “n” stairs 20) designed to bridge a vertical distance. To bridge the vertical distance, stairs 20a-n typically run a horizontal distance with a given rise in vertical height over a pitch (or pitch line). Each stair 20 may include a tread 22 and a riser 24. The tread 22 of a stair 20 refers to a horizontal part of the stair 20 that is stepped on while a riser 24 refers to a vertical portion of the stair 20 between each tread 22. The tread 22 of each stair 20 spans a tread depth “d” measuring from an outer edge 26 of a stair 20 to the riser 24 between stairs 20. For a residential, a commercial, or an industrial structure, some stairs 20 also include a nosing as part of the edge 26 for safety purposes. A nosing, as shown in
A set of stairs 20 may be preceded by or include a platform or support surface 12 (e.g., a level support surface). For example, a “landing” refers to a level platform or support surface 12 at the top of a set of stairs 20 or at a location between stairs 20. For instance, a landing occurs where a direction of the stairs 20 changes or between a particular number of stairs 20 (e.g., a flight of stairs 20 that connects two floors).
Stair-like terrain more generally refers to terrain that varies in height over some distance. Stair-like terrain may resemble stairs in terms of a change in elevation (e.g., an inclined pitch with a gain in elevation or a declined pitch with a loss in elevation). However, with stair-like terrain the delineation of treads 22 and risers 24 is not as obvious. Rather, stair-like terrain may refer to terrain with tread-like portions that allow a robot to have enough traction to plant a stance limb and sequentially or simultaneously use a leading limb to ascend or to descend over an adjacent vertical obstruction (resembling a riser) within the terrain. For example, stair-like terrain may include rubble, an inclined rock scramble, damaged or deteriorating traditional stairs, etc.
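The tread and riser geometry described above determines the pitch line of a staircase. A minimal sketch, assuming uniform stairs (real stairs and stair-like terrain need not be uniform):

```python
import math

def stair_pitch_deg(riser_height, tread_depth):
    """Pitch angle (degrees) implied by one riser of rise over one tread of run."""
    return math.degrees(math.atan2(riser_height, tread_depth))

def staircase_span(n_stairs, riser_height, tread_depth):
    """Total vertical rise and horizontal run bridged by n uniform stairs."""
    return n_stairs * riser_height, n_stairs * tread_depth
```

For example, stairs with roughly 0.18 m risers and 0.28 m treads have a pitch of roughly 33 degrees, which is one quantity a stair-traversal controller might use when planning body attitude.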
Referring to
In order to traverse the terrain, each leg 120 may have a distal end 124 that contacts a surface 12 of the terrain (i.e., a traction surface). In other words, the distal end 124 of the leg 120 is the end of the leg 120 used by the robot 100 to pivot, plant, or generally provide traction during movement of the robot 100. For example, the distal end 124 of a leg 120 may correspond to a “foot” of the robot 100. In some examples, though not shown, the distal end 124 of the leg 120 may include an ankle joint JA such that the distal end 124 is articulable with respect to the lower member 122L of the leg 120.
The robot 100 may have a vertical gravitational axis (e.g., shown as a z-direction axis AZ) along a direction of gravity, and a center of mass CM, which is a point where the weighted relative position of the distributed mass of the robot 100 sums to zero. The robot 100 may further have a pose P based on the CM relative to the vertical gravitational axis AZ (i.e., the fixed reference frame with respect to gravity) to define a particular attitude or stance assumed by the robot 100. The attitude of the robot 100 can be defined by an orientation or an angular position of the robot 100 in space. Movement by the legs 120 relative to the body 110 alters the pose P of the robot 100 (i.e., the combination of the position of the CM of the robot and the attitude or orientation of the robot 100). Here, a height (i.e., vertical distance) generally refers to a distance along (e.g., parallel to) the z-direction (i.e., z-axis AZ). The sagittal plane of the robot 100 corresponds to the Y-Z plane extending in directions of the y-direction axis AY and the z-direction axis AZ. In other words, the sagittal plane bisects the robot 100 into a left and right side. Generally perpendicular to the sagittal plane, a ground plane (also referred to as a transverse plane) spans the X-Y plane by extending in directions of the x-direction axis AX and the y-direction axis AY. The ground plane refers to a support surface 12 where distal ends 124 of the legs 120 of the robot 100 may generate traction to help the robot 100 move about the environment 10. Another anatomical plane of the robot 100 is the frontal plane that extends across the body 110 of the robot 100 (e.g., from a left side of the robot 100 with a first leg 120a to a right side of the robot 100 with a second leg 120b). The frontal plane spans the X-Z plane by extending in directions of the x-direction axis AX and the z-direction axis AZ.
When a legged robot moves about the environment 10, the legs 120 of the robot may undergo a gait cycle. Generally, a gait cycle begins when a leg 120 touches down or contacts a support surface 12 and ends when that same leg 120 once again contacts the support surface 12. The touching down of a leg 120 may also be referred to as a “footfall” defining a point or position where the distal end 124 of a locomotion-based structure 120 falls into contact with the support surface 12. The gait cycle may predominantly be divided into two phases, a swing phase and a stance phase. During the swing phase, a leg 120 performs (i) lift-off from the support surface 12 (also sometimes referred to as toe-off and the transition between the stance phase and swing phase), (ii) flexion at a knee joint JK of the leg 120, (iii) extension of the knee joint JK of the leg 120, and (iv) touchdown (or footfall) back to the support surface 12. Here, a leg 120 in the swing phase is referred to as a swing leg 120SW. As the swing leg 120SW proceeds through the movement of the swing phase, another leg 120 performs the stance phase. The stance phase refers to a period of time where a distal end 124 (e.g., a foot) of the leg 120 is on the support surface 12. During the stance phase, a leg 120 performs (i) initial support surface contact which triggers a transition from the swing phase to the stance phase, (ii) loading response where the leg 120 dampens support surface contact, (iii) mid-stance support for when the contralateral leg (i.e., the swing leg 120SW) lifts off and swings to a balanced position (about halfway through the swing phase), and (iv) terminal-stance support from when the robot's CM is over the leg 120 until the contralateral leg 120 touches down to the support surface 12. Here, a leg 120 in the stance phase is referred to as a stance leg 120ST.
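The swing/stance alternation described above can be modeled as a per-leg two-state machine. The event and attribute names here are hypothetical, and the real gait cycle involves the intermediate sub-phases listed above (flexion, loading response, mid-stance, and so on); this sketch tracks only the top-level phase:

```python
class LegGait:
    """Minimal per-leg gait-phase tracker: stance <-> swing, driven by events."""
    def __init__(self):
        self.phase = "stance"        # a gait cycle is measured footfall to footfall

    def lift_off(self):
        """Toe-off: the transition from the stance phase to the swing phase."""
        assert self.phase == "stance"
        self.phase = "swing"

    def touch_down(self):
        """Footfall: initial surface contact ends the swing phase."""
        assert self.phase == "swing"
        self.phase = "stance"

def legs_in_stance(legs):
    """Names of legs currently in the stance phase (i.e., providing support)."""
    return [name for name, leg in legs.items() if leg.phase == "stance"]
```

A balance or step-timing module could consult `legs_in_stance` to decide, for instance, whether a contralateral leg may safely enter its swing phase.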
To enable the robot to perceive the environment 10, the robot 100 may include a sensor system 130 with one or more sensors 132, 132a-n. The sensors 132 may include vision/image sensors, inertial sensors (e.g., an inertial measurement unit (IMU)), force sensors, and/or kinematic sensors. Some examples of sensors 132 include a camera such as a stereo camera, a scanning light-detection and ranging (LIDAR) sensor, or a scanning laser-detection and ranging (LADAR) sensor. In some implementations, the robot 100 may include two stereo cameras as sensors 132 at a front end of the body 110 of the robot 100 (i.e., a “head” of the robot 100 adjacent the front legs 120a-b of the robot 100) and one stereo camera as a sensor 132 at a back end of the body 110 of the robot 100 adjacent rear legs 120c-d of the robot 100. In some implementations, the respective sensors 132 may have corresponding fields of view Fv, defining a sensing range or region corresponding to the sensor 132. For instance,
Referring to
When surveying a field of view Fv with a sensor 132, the sensor system 130 may generate sensor data 134 (also referred to herein as image data) corresponding to the field of view Fv. In some implementations, the sensor data 134 may be image data that corresponds to a three-dimensional volumetric point cloud generated by a three-dimensional volumetric image sensor 132. Additionally or alternatively, when the robot 100 is maneuvering about the environment 10, the sensor system 130 may gather pose data for the robot 100 that includes inertial measurement data (e.g., measured by an IMU). In some implementations, such pose data may include kinematic data and/or orientation data about the robot 100, for instance, kinematic data and/or orientation data about joints J or other portions of a leg 120 of the robot 100. With the sensor data 134, a perception system 180 of the robot 100 may generate perception maps 182 for the terrain about the environment 10.
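As a toy illustration of how a field of view FV bounds what an individual sensor 132 contributes to the sensor data 134, the sketch below filters a point cloud to a sensor's horizontal field of view. The flat two-dimensional frustum and world-frame points are deliberate simplifications of real three-dimensional sensing and frame transforms:

```python
import math

def points_in_fov(points, sensor_pos, sensor_yaw, fov_deg, max_range):
    """Keep only cloud points inside a sensor's horizontal field of view.

    points:     iterable of (x, y, z) points in the world frame (assumed)
    sensor_pos: (x, y, z) sensor location; sensor_yaw: viewing direction (rad)
    """
    half = math.radians(fov_deg) / 2.0
    sx, sy, _ = sensor_pos
    kept = []
    for x, y, z in points:
        dx, dy = x - sx, y - sy
        rng = math.hypot(dx, dy)
        if rng > max_range or rng == 0.0:
            continue
        # Signed angle between the sensor's viewing direction and the point bearing,
        # wrapped into (-pi, pi] so the comparison against the half-angle is valid.
        bearing = math.atan2(dy, dx)
        diff = (bearing - sensor_yaw + math.pi) % (2 * math.pi) - math.pi
        if abs(diff) <= half:
            kept.append((x, y, z))
    return kept
```

The union of such per-sensor contributions, together with pose data, is roughly what a perception system has to work with when mapping terrain.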
While the robot 100 maneuvers about the environment 10, the sensor system 130 may gather sensor data 134 relating to the terrain of the environment 10 and/or structure of the robot 100 (e.g., joint dynamics and/or odometry of the robot 100). For instance,
With continued reference to
Additionally or alternatively, the computing system 140 may employ and/or interact with computing resources that are located remotely from the robot 100. For instance, the computing system 140 may communicate via a network 150 with a remote system 160 (e.g., a remote computer/server or a cloud-based environment). Much like the computing system 140, the remote system 160 may include remote computing resources such as remote data processing hardware 162 and remote memory hardware 164. Here, sensor data 134 or other processed data (e.g., data processed locally by the computing system 140) may be stored in the remote system 160 and may be accessible to the computing system 140. In some implementations, the computing system 140 may be configured to utilize the remote resources 162, 164 as extensions of the computing resources 142, 144 such that resources of the computing system 140 may reside on resources of the remote system 160.
In some implementations, as shown in
In some implementations, the control system 170 may include at least one controller 172, a path generator 174, a step locator 176, and a body planner 178. The control system 170 may be configured to communicate with at least one sensor system 130 and any other system of the robot 100 (e.g., the perception system 180 and/or the stair tracker 200). The control system 170 may perform operations and other functions using hardware of the computing system 140. The controller(s) 172 may be configured to control movement of the robot 100 to traverse about the environment 10 based on input or feedback from the systems of the robot 100 (e.g., the control system 170, the perception system 180 and/or the stair tracker 200). This may include movement between poses and/or behaviors of the robot 100. For example, the controller(s) 172 may control different footstep patterns, leg patterns, body movement patterns, or vision system sensing patterns.
In some implementations, the controller(s) 172 may include a plurality of controllers 172, where each of the controllers 172 may be configured to operate the robot 100 at a fixed cadence. A fixed cadence refers to a fixed timing for a step or swing phase of a leg 120. For example, an individual controller 172 may instruct the robot 100 to move the legs 120 (e.g., take a step) at a particular frequency (e.g., step every 250 milliseconds, 350 milliseconds, etc.). With a plurality of controllers 172, where each controller 172 is configured to operate the robot 100 at a fixed cadence, the robot 100 can experience variable timing by switching between the different controllers 172. In some implementations, the robot 100 may continuously switch/select fixed cadence controllers 172 (e.g., re-select a controller 172 every three milliseconds) as the robot 100 traverses the environment 10.
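The idea of a bank of fixed-cadence controllers, re-selected at a high rate to approximate variable timing, can be sketched as below. The nearest-cadence criterion is an assumed stand-in for whatever selection logic a real control system applies:

```python
class FixedCadenceController:
    """A controller that always commands steps at one fixed cadence (ms per step)."""
    def __init__(self, cadence_ms):
        self.cadence_ms = cadence_ms

def select_controller(controllers, desired_step_ms):
    """Pick the fixed-cadence controller closest to the desired step timing.

    Calling this at a high rate (e.g., every few milliseconds) as the desired
    timing drifts lets the robot experience effectively variable cadence even
    though each individual controller runs at a fixed cadence.
    """
    return min(controllers, key=lambda c: abs(c.cadence_ms - desired_step_ms))
```

For example, with a bank of 250 ms, 350 ms, and 500 ms controllers, a desired step time of 320 ms selects the 350 ms controller.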
In some implementations, the control system 170 may additionally or alternatively include specialty controllers 172 that are dedicated to a particular control purpose. For example, the control system 170 may include one or more stair controllers 172 dedicated to planning and coordinating the robot's movement to traverse a set of stairs 20. For instance, a stair controller 172 may ensure the footpath for a swing leg 120SW maintains a swing height to clear a riser 24 and/or edge 26 of a stair 20. Other specialty controllers 172 may include the path generator 174, the step locator 176, and/or the body planner 178.
Referring to
The path generator 174 may communicate information concerning the currently planned path, as well as identified obstacles, to the step locator 176 such that the step locator 176 may identify foot placements for legs 120 of the robot 100 (e.g., locations to place the distal ends 124 of the legs 120 of the robot 100). The step locator 176 may generate the foot placements (i.e., locations where the robot 100 should step) using inputs from the perception system 180 (e.g., perception map(s) 182). The body planner 178, much like the step locator 176, may receive inputs from the perception system 180 (e.g., perception map(s) 182). Generally speaking, the body planner 178 may be configured to adjust dynamics of the body 110 of the robot 100 (e.g., rotation, such as pitch or yaw and/or height of CM) to successfully move about the environment 10.
The perception system 180 may enable the robot 100 to move more precisely in a terrain with various obstacles. As the sensors 132 collect sensor data 134 for the space about the robot 100 (i.e., the robot's environment 10), the perception system 180 may use the sensor data 134 to form one or more perception maps 182 for the environment 10. In some implementations, the perception system 180 may also be configured to modify an existing perception map 182 (e.g., by projecting sensor data 134 on a preexisting map) and/or to remove information from a perception map 182.
In some implementations, the one or more perception maps 182 generated by the perception system 180 may include a ground height map, a no step map, and a body obstacle map. The ground height map refers to a map 182 generated by the perception system 180 based on voxels from a voxel map. In some implementations, the ground height map may function such that, at each X-Y location within a grid of the map 182 (e.g., designated as a cell of the ground height map), the ground height map specifies a height. In other words, the ground height map may convey that, at a particular X-Y location in a horizontal plane, the robot 100 should step at a certain height.
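The ground height map's cell-lookup behavior can be sketched as follows. The cell size, the sparse-dictionary representation, and the heights are all hypothetical choices for illustration only.

```python
CELL_SIZE_M = 0.03  # hypothetical 3 cm grid cells in the X-Y plane

def world_to_cell(x_m, y_m, cell_size_m=CELL_SIZE_M):
    """Map a world X-Y position to the grid cell that contains it."""
    return (int(x_m // cell_size_m), int(y_m // cell_size_m))

# Hypothetical ground height map: cell -> height at which to step.
ground_height = {(0, 0): 0.0, (1, 0): 0.0, (2, 0): 0.18}  # 0.18 m: a tread

def height_at(x_m, y_m):
    """Height the robot should step at for a given X-Y location, if known."""
    return ground_height.get(world_to_cell(x_m, y_m))

print(height_at(0.07, 0.01))  # falls in cell (2, 0) -> 0.18
```

The essential idea is the same as in the text: a horizontal X-Y location indexes a cell, and the cell stores the height at which the robot should step.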
The no step map generally refers to a map 182 that defines regions where the robot 100 is not allowed to step in order to advise the robot 100 when the robot 100 may step at a particular horizontal location (i.e., location in the X-Y plane). In some implementations, much like the ground height map, the no step map may be partitioned into a grid of cells in which each cell represents a particular area in the environment 10 of the robot 100. For instance, each cell may correspond to a three centimeter square within an X-Y plane within the environment 10. When the perception system 180 generates the no step map, the perception system 180 may generate a Boolean value map where the Boolean value map identifies “no step” regions and “step” regions. A no step region refers to a region of one or more cells where an obstacle exists while a step region refers to a region of one or more cells where an obstacle is not perceived to exist. The perception system 180 may further process the Boolean value map such that the no step map includes a signed-distance field. Here, the signed-distance field for the no step map may include a distance to a boundary of an obstacle (e.g., a distance to a boundary of the no step region) and a vector “v” (e.g., defining nearest direction to the boundary of the no step region) to the boundary of an obstacle.
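The distance-field processing of the Boolean no step map can be sketched with a brute-force computation over a small grid. This is illustrative only: real implementations would use an efficient distance transform, and a true signed-distance field would report negative distances inside obstacles, whereas this sketch reports zero.

```python
import math

# Hypothetical 5x5 Boolean no step map: True marks a "no step" cell.
no_step = [[False] * 5 for _ in range(5)]
no_step[2][3] = True  # one obstacle cell

def distance_field(grid):
    """For each cell: distance and unit vector to the nearest obstacle cell.

    Brute-force sketch of the per-cell (distance, vector-to-boundary) data
    described for the no step map; obstacle cells get distance 0.
    """
    obstacles = [(r, c) for r, row in enumerate(grid)
                 for c, v in enumerate(row) if v]
    field = {}
    for r in range(len(grid)):
        for c in range(len(grid[0])):
            d, (orow, ocol) = min(
                (math.hypot(r - orow, c - ocol), (orow, ocol))
                for orow, ocol in obstacles)
            v = ((orow - r) / d, (ocol - c) / d) if d else (0.0, 0.0)
            field[(r, c)] = (d, v)
    return field

field = distance_field(no_step)
print(field[(2, 1)])  # two cells from the obstacle, pointing along +Y
```

Storing both the distance and the direction to the nearest "no step" boundary lets a step planner both score candidate footholds and push them away from obstacles.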
The body obstacle map generally determines whether the body 110 of the robot 100 overlaps a location in the X-Y plane with respect to the robot 100. In other words, the body obstacle map may identify obstacles for the robot 100 to indicate whether the robot 100, by overlapping at a location in the environment 10, risks collision with, or potential damage from, obstacles near or at the same location. As a map of obstacles for the body 110 of the robot 100, systems of the robot 100 (e.g., the control system 170) may use the body obstacle map to identify boundaries adjacent, or nearest to, the robot 100 as well as to identify directions (e.g., an optimal direction) to move the robot 100 in order to avoid an obstacle. In some implementations, much like other maps 182, the perception system 180 may generate the body obstacle map according to a grid of cells (e.g., a grid of cells in the X-Y plane). Here, each cell within the body obstacle map may include a distance from an obstacle and a vector pointing to the closest cell that is identified as a portion of an obstacle (i.e., a boundary of the obstacle).
Situations may arise where certain types of structures within the environment 10 may routinely result in poor sensor data 134. The robot 100 may, however, still attempt to navigate and/or to perform tasks within the environment 10 even when poor sensor data 134 exists. One type of structure that often leads to poor sensor data 134 is stairs 20. This is particularly problematic because stairs 20 are a fairly common structural feature in commercial and residential environments. Furthermore, poor sensor data 134 for stair navigation may be problematic because stairs also generally demand precise leg movement and foot placement for successful traversal. Since stairs may be a difficult feature to navigate from a coordination perspective, poor sensor data 134 may significantly compound the navigational challenges of the robot.
A sensor 132 may produce poor sensor data 134 for a variety of reasons. With regard to stairs 20, two separate problems may commonly occur. One problem generally pertains to stair ascent while the other problem pertains to stair descent. For stair ascent, open riser stairs 20 may pose issues for the robot 100. With open riser stairs 20, the sensor(s) 132 of the robot 100 may be at a sensing height equal to a height of one or more stairs 20. At this height, the sensor 132 may generate far sensor data 134 through the open riser 24 and near sensor data 134 for an edge 26 of a stair 20. In other words, when the sensor 132 cannot see the riser 24 on open riser stairs, the edges 26 of the treads 22 of the stairs 20 may appear to the robot 100 as floating rungs and may be falsely identified by the robot's perception system 180 as obstacles rather than stairs. When a robot 100 is about to descend, or is in the act of descending, a set of stairs 20, a sensor 132, such as a stereo camera, may produce poor sensor data 134 due to the repetitive structure and lines that define a typical staircase. For example, stereo cameras function by finding portions of two different images that correspond to the same object in the real world and using parallax to determine a distance to that object. Yet, given the repeating lines of a staircase viewed from top to bottom, a sensor 132 is more likely to mismatch image portions and thus generate poor sensor data 134. This is particularly common for industrial or grated staircases because the grating introduces more repeating lines that the sensor 132 is apt to mismatch. Although not all staircases are grated, this presents a problem to the navigation of the robot 100 because robots 100 may often be deployed in industrial environments 10.
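The stereo ambiguity described above can be made concrete with the standard pinhole parallax relation. The focal length, baseline, and candidate disparities below are hypothetical values chosen purely to illustrate why a mismatched repeating pattern yields a badly wrong depth.

```python
def disparity_to_depth(focal_px, baseline_m, disparity_px):
    """Stereo depth from parallax: depth = f * B / d (pinhole model)."""
    return focal_px * baseline_m / disparity_px

# A repeating pattern (e.g., stair edges or grating) can match equally well
# at several candidate disparities; picking the wrong one gives a wrong depth.
candidates_px = [20.0, 40.0, 60.0]  # hypothetical equally plausible matches
depths = [disparity_to_depth(600.0, 0.1, d) for d in candidates_px]
print(depths)  # [3.0, 1.5, 1.0] -- ambiguous matches span very different depths
```

Because depth varies as 1/d, a match that is off by one grating period can place a stair tread meters away from its true position, which is exactly the kind of poor sensor data 134 the text describes.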
Though these scenarios do not occur for every type of staircase, a robot 100 that struggles to ascend one type of staircase and to descend another may limit the robot's versatility and robustness to successfully traverse an environment.
To attempt to address some of these sensor data issues, as illustrated in
As shown in
To perform its tracking process, when the detection tracker 220 receives the second detected feature 212, 2122, the detection tracker 220 may determine whether the second detected feature 2122 received at the second time step t2 is similar to the first detected feature 2121 from the first time step t1 (now the tracked detection 222). When the first and the second detected features 212 are similar, the detection tracker 220 may merge the first and the second detected features 212 together to update the tracked detection 222. Here, during a merging operation, the detection tracker 220 may merge detected features 212 together with the tracked detection 222 using averaging (e.g., a weighted average weighted by a confidence error in the detected feature 212). When the second detected feature 2122 is not similar to the first detected feature 2121, the detection tracker 220 may determine whether an alternative tracked feature 224 exists for the stair feature corresponding to the second detected feature 2122 (i.e., has the detection tracker 220 previously identified a detected feature 212 as an alternative tracked feature 224). When an alternative tracked feature 224 does not exist, the detection tracker 220 may establish the second detected feature 2122 at the second time step t2 to be the alternative tracked feature 224. When an alternative tracked feature 224 already exists, the detection tracker 220 may determine whether the second detected feature 2122 at the second time step t2 is similar to the existing alternative tracked feature 224. When the second detected feature 2122 at the second time step t2 is similar to the existing alternative tracked feature 224, the detection tracker 220 may merge the second detected feature 2122 at the second time step t2 with the existing alternative tracked feature 224 (e.g., using averaging or weighted averaging).
When the second detected feature 2122 at the second time step t2 is not similar to the existing alternative tracked feature 224, the detection tracker 220 may generate another alternative tracked feature 224 equal to the second detected feature 2122 at the second time step t2. In some examples, the detection tracker 220 may be configured to track and/or store multiple alternative tracked features 224.
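The merge/alternative logic described in the two preceding paragraphs can be sketched as follows. This is a simplified illustration, not the disclosed implementation: stair features are reduced to scalars (e.g., an edge height), the similarity tolerance and merge weight are hypothetical, and real detections would be geometric.

```python
def similar(a, b, tol=0.05):
    """Hypothetical similarity test for a scalar stair feature."""
    return abs(a - b) < tol

class DetectionTracker:
    """Sketch of the tracked-detection / alternative-feature bookkeeping."""

    def __init__(self, first_detection):
        self.tracked = first_detection  # primary tracked detection
        self.alternatives = []          # alternative tracked features
        self.merge_count = 1            # detection/tracking cycles survived

    def update(self, detection, weight=0.5):
        if similar(detection, self.tracked):
            # Similar to the tracked detection: merge via weighted average.
            self.tracked = (1 - weight) * self.tracked + weight * detection
            self.merge_count += 1
            return
        for i, alt in enumerate(self.alternatives):
            if similar(detection, alt):
                # Similar to an existing alternative: merge into it.
                self.alternatives[i] = (1 - weight) * alt + weight * detection
                return
        # No similar alternative exists: establish a new one.
        self.alternatives.append(detection)

tracker = DetectionTracker(0.18)
tracker.update(0.20)   # similar -> merged into the tracked detection
tracker.update(0.40)   # dissimilar -> becomes an alternative tracked feature
print(tracker.tracked, tracker.alternatives)
```

A maturity counter like `merge_count` also suggests how a tracked detection could be admitted to a stair model only after surviving several detection/tracking cycles, as described later in the text.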
By using the tracking process of the detection tracker 220 in conjunction with the detector 210, the stair tracker 200 may vet each detection to prevent the stair tracker 200 from detrimentally relying on a detection. In other words, with the robot 100 constantly gathering sensor data 134 about its environment (e.g., at a frequency of 15 Hz), a reliance on a single detection from a snapshot of sensor data 134 may cause inaccuracy as to the actual location of features of the stairs 20. For example, a robot 100 may move or change its pose P between a first time and a second time, generating sensor data 134 for areas of the stairs 20 that were previously occluded, partially occluded, or poorly captured in general. Here, a system that only performed a single detection at the first time may suffer from incomplete sensor data 134 and inaccurately detect a feature. In contrast, by constantly tracking each detection based on the most recent sensor data 134 available to the stair tracker 200 over a period of time, the stair tracker 200 may generate a bimodal probability distribution for a detected stair feature (e.g., a primary detection and an alternative detection). With a bimodal probability distribution for a feature of a stair 20, the stair tracker 200 is able to generate an accurate representation for the feature of the stair 20 to include in the stair model 202. Furthermore, this detection and tracking process tolerates a detection at any particular instance in time that corresponds to arbitrarily poor sensor data 134 because that detection is tracked and averaged over time with other detections (e.g., presumably detections based on better data or based on a greater aggregate of data over multiple detections). Therefore, although a single detection may appear noisy at any moment in time, the merging and alternative swapping operations of the detection tracker 220 develop an accurate representation of stair features over time.
These stair features may then be incorporated into the stair model 202 that the stair tracker 200 generates and communicates to various systems of the robot 100 (e.g., systems that control the robot 100 to traverse the stairs 20). In some configurations, the stair tracker 200 may incorporate a tracked detection 222 into the stair model 202 once the tracked detection 222 has been detected by the detector 210 and tracked by the detection tracker 220 for some number of iterations. For example, when the detection tracker 220 has tracked the same feature for three to five detection/tracking cycles, the stair tracker 200 may incorporate the tracked detection 222 (i.e., a detection that has been updated for multiple detection cycles) for this feature into the stair model 202. Stated differently, the stair tracker 200 may determine that the tracked detection 222 has matured over the detection and tracking process into a most likely candidate for a feature for the stairs 20.
When a sensor 132 peers down a set of stairs 20, this descending vantage point for a sensor 132 produces a different quality of sensor data 134 than a sensor 132 peering up a set of stairs 20. For example, peering up a set of stairs 20 has a vantage point occluding the treads 22 of stairs 20 and some of the riser 24 while peering down the set of stairs 20 has a vantage point that occludes the risers 24 and a portion of the treads 22. Due to these differences, among other reasons, the stair tracker 200 may have separate functionality dedicated to stair ascent (e.g., a stair ascent tracker) and stair descent (e.g., a stair descent tracker). For example, each type of stair tracker may be part of the stair tracker 200, but may be implemented as separate software modules. In some configurations, each type of stair tracker, though implemented via separate modules, may coordinate with each other. For instance, the stair ascent tracker may pass information to the stair descent tracker (or vice versa) when the robot 100 changes directions during stair navigation (e.g., on the stairs 20).
As indicated by an arrow 304 in
Further, as indicated by arrows 306 and 308 in
In some embodiments, the stairs mode setting selector 302 may be configured to output setting values (indicated by arrows 312, 314, and 316 in
As noted previously, some prior systems provided an operator with the ability to manually toggle the “stairs mode” between an “on” state and an “off” state (e.g., to switch between the setting values of the columns 502 and 504) before and after the user directed the robot to traverse a staircase. The columns 506 and 508 show the same settings as the columns 502 and 504, but illustrate how the values of those settings can change dynamically when the robot 100 is operating in an automatic stairs mode (referred to herein as “auto” or “auto stairs” mode) in accordance with the present disclosure. In particular, the column 506 illustrates setting values for a scenario in which the robot 100 is operating in the “auto stairs” mode but has not yet determined that traversal of stairs 20 is imminent (referred to herein as “auto-passive” state), whereas the column 508 illustrates setting values for a scenario in which the robot 100 is likewise operating in the “auto stairs” mode but has, in fact, determined that traversal of stairs 20 is imminent (referred to herein as “auto-active” state). As explained below, in some implementations, the stairs mode setting selector 302 may additionally apply “auto-active” settings (per the column 508), rather than the “auto-passive” settings (per the column 506), when it determines that the robot 100 recently exited a staircase (e.g., in case it is on a landing between staircases), and/or that the robot 100 is currently on stairs 20 (e.g., in case something precluded the stair tracker 200 from identifying stairs 20).
As can be seen by comparing the columns 508 and 504, the various settings may have the same values when the robot 100 is operating in the “auto-active” state (per the column 508) as when the robot 100 is operating with the “stairs mode” turned “on” (per the column 504). On the other hand, as can be seen by comparing columns 502 and 506, when the robot 100 is operating in the “auto-passive” state (per the column 506), the values for only two of the illustrated settings (i.e., “pitch limiter” and “stair tracker”) are different than when the robot is operating with the “stairs mode” turned “off” (per the column 502). Additionally, as can be seen by comparing the columns 506 and 504, when the robot 100 is operating in the “auto-passive” state (per the column 506), the values for only two of the illustrated settings (i.e., “pitch limiter” and “stair tracker”) are the same as when the robot 100 is operating with the “stairs mode” turned “on” (per the column 504).
As explained in more detail below, when the robot 100 is operating in the “auto-passive” state (per the column 506), the indicated values of the “pitch limiter” and “stair tracker” settings (i.e., “pitch limiter=on” and “stair tracker=on”) may enable the robot 100 to identify stairs 20 within the environment 10, thus enabling the stairs mode setting selector 302 to determine whether traversal of the identified stairs by the robot 100 is imminent in view of the planned path 310 provided by the path generator 174. The values of the remaining settings may not be changed, so as to allow the robot 100 to continue moving around the environment 10 without taking any extra actions to enable traversal of stairs 20. In some implementations, only after the robot 100 has determined, while in the “auto-passive” state (per the column 506), that the traversal of stairs 20 within the environment 10 is imminent will the values of the remaining settings be switched to those shown in the column 508, thus enabling the robot 100 to take particular actions to ensure that the identified stairs 20 can be safely traversed. Advantageously, the operator of the robot 100 need not manually switch the robot 100 from one mode to another before and after traversing stairs 20. Instead, when the robot 100 is operating in the “auto stairs” mode, as disclosed herein, the operator may simply steer the robot 100 about the environment 10, and the robot 100 will itself automatically determine whether and when to enable its robust stair traversal capabilities.
As noted previously, the decision tree 400 shown in
As shown in
As shown, when the “off” operational mode has been selected (e.g., in response to the operator selecting the UI element 320c on the control device 318), the stairs mode setting selector 302 may apply the settings from the “off” column 502 of the table 500 (shown in
As shown in
As indicated in
As shown, when the stairs mode setting selector 302 determines (per the decision 414) that the robot 100 is not approaching stairs, is not currently on stairs, and has not recently exited stairs, the stairs mode setting selector 302 may proceed to the node 416 at which it may apply the setting values from the “auto-passive” column 506 of the table 500 (shown in
When, on the other hand, the stairs mode setting selector 302 determines (per the decision 414) that the robot 100 is approaching stairs, is currently on stairs, or has recently exited stairs, the stairs mode setting selector 302 may instead proceed to the node 418 at which it may apply the setting values from the “auto-active” column 508 of the table 500 (shown in
In some implementations, the first criteria and/or threshold(s) (per the block 410) and the second criteria and/or threshold(s) (per the block 412) may differ in one or more significant respects. For example, the respective criteria and/or threshold(s) may be set so as to introduce hysteresis into the process that prevents undesirable rapid switching between the “auto-active” and “auto-passive” states. Examples of threshold(s) and/or criteria that may be set in this manner will now be described.
As noted above, in some implementations, a stair model 202 generated by the stair tracker 200 may represent both a configuration of a staircase and a location of the staircase relative to the robot 100. Further, as also described above, in some implementations, the path generator 174 may determine a planned path 310 of the robot for some future period (e.g., for the next 1-1.5 seconds), with adjustments to the planned path 310 by the path generator 174 occurring rapidly, such as hundreds of times per second.
In some implementations, when the decision 414 uses the second criteria and/or threshold(s) applied per the block 412, the stairs mode setting selector 302 may determine whether the planned path 310 for the subsequent 1-1.5 seconds, if followed, would intersect the location of stairs 20 indicated by the stair model 202. Upon the stairs mode setting selector 302 determining (per the decision 414) that such a condition is satisfied, the stairs mode setting selector 302 may, per the node 418, apply the “auto-active” setting values from the column 508 of the table 500 (shown in
In some implementations, the first criteria and/or threshold(s) (per the block 410) may account for the reduced speed of the robot 100 after the "auto-active" setting values have been applied, such as by causing the stairs mode setting selector 302 to determine, e.g., at the decision 414, whether the planned path 310 for the subsequent 1-1.5 seconds, if followed, would result in the robot 100 moving along essentially the same planned path 304 that the stairs mode setting selector 302 previously determined (e.g., when the decision 414 used the second criteria and/or threshold(s) per the block 412) would intersect the stairs 20, albeit over a lesser distance along that path. To achieve such a result, the first criteria and/or threshold(s) may, for example, include a condition that forces a positive (i.e., "yes") outcome at the decision 414 if the stairs mode setting selector 302 determines that the planned path 310 for the subsequent 1-1.5 seconds, if followed, would result in the robot 100 moving along essentially the same planned path 304 that the stairs mode setting selector 302 previously determined would intersect the stairs 20. To make such a determination, the stairs mode setting selector 302 may, for example, determine whether the two planned paths are within a threshold degree of similarity.
In other implementations, a similar result may be obtained by using different time periods for determining the planned paths 304 that are evaluated at the decision 414 when the first criteria and/or threshold(s) and the second criteria and/or threshold(s) are used to make the decision 414. For example, in some implementations, use of the second criteria and/or threshold(s) (per the block 412) may cause the stairs mode setting selector 302 to determine, e.g., at the decision 414, whether the planned path 310 for the subsequent 500 milliseconds, if followed, would intersect the location of stairs 20 indicated by the stair model 202, whereas use of the first criteria and/or threshold(s) (per the block 410) may cause the stairs mode setting selector 302 to determine, e.g., at the decision 414, whether the planned path 310 for the subsequent 1-1.5 seconds, if followed, would intersect the location of stairs 20 indicated by the stair model 202.
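The asymmetric-horizon hysteresis described above can be sketched as follows. The horizons, the scalar path representation, and the state names are illustrative stand-ins: a short lookahead governs entry into the active state, while a longer lookahead must clear before dropping back, so the state does not flap near the stairs.

```python
def stairs_within(path, stair_location, lookahead_s):
    """Does the planned path reach the stairs within the lookahead window?

    `path` is a hypothetical list of (time_s, distance_m) samples along a
    straight line and `stair_location` a scalar distance, purely to sketch
    the path-intersects-stairs check.
    """
    return any(t <= lookahead_s and pos >= stair_location for t, pos in path)

def next_state(current_state, path, stair_location):
    """Hysteresis via asymmetric lookahead horizons (illustrative values)."""
    enter_horizon_s = 0.5  # second criteria: passive -> active
    exit_horizon_s = 1.5   # first criteria: active -> passive
    if current_state == "auto-passive":
        if stairs_within(path, stair_location, enter_horizon_s):
            return "auto-active"
        return "auto-passive"
    if not stairs_within(path, stair_location, exit_horizon_s):
        return "auto-passive"
    return "auto-active"

path = [(0.25, 0.3), (0.5, 0.6), (1.0, 1.2), (1.5, 1.8)]  # (time, distance)
print(next_state("auto-passive", path, stair_location=1.0))
print(next_state("auto-active", path, stair_location=1.0))
```

With this path the stairs lie beyond the short entry horizon but inside the long exit horizon, so a passive robot stays passive while an active robot stays active: the two thresholds overlap instead of sharing one boundary.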
In some implementations, the first criteria and/or threshold(s) used per the block 410 and/or second criteria and/or threshold(s) used per the block 412 may depend on the current state of the robot 100 and may be updated dynamically as the robot 100 is operating. For instance, in some implementations, the stairs mode setting selector 302 may set the first criteria and/or threshold(s) and/or the second criteria and/or threshold(s) based on the current speed and/or gait of the robot 100. As an example, if the robot 100 is currently operating in the “auto-passive” state (per the column 506 of the table 500 shown in
As another example of different criteria and/or threshold(s) that may be used pursuant to the blocks 410 and 412, it may be desirable to refrain from transitioning from the “auto-active” state to the “auto-passive” state during the brief periods when the robot 100 is on a landing between different sections of a staircase. Accordingly, in some implementations, the first criteria and/or threshold(s) used pursuant to the block 410 may regulate the circumstances in which the robot 100 will transition from the “auto-active” state to the “auto-passive” state. For instance, in some implementations, the first criteria and/or threshold(s) used pursuant to the block 410 may allow a negative (i.e., “no”) outcome at the decision 414 only if the robot 100 has traveled more than 1 meter (e.g., a typical dimension of a landing between staircases) with the planned path 310 for the subsequent 1-1.5 seconds not intersecting the location of stairs 20. Using a distance threshold, rather than a time threshold, to determine whether the robot 100 has recently exited stairs may be preferable, as it is common for operators to pause forward motion of the robot 100 after reaching a landing between staircases.
In some implementations, rapid switching between the “auto-active” and “auto-passive” states may additionally or alternatively be inhibited by configuring the first criteria and/or threshold(s) and/or the second criteria and/or threshold(s) to include a requirement that a threshold amount of time must elapse after switching from one state to the other (e.g., from the “auto-passive” state to the “auto-active” state), before allowing a transition back to the prior state (e.g., from the “auto-active” state to the “auto-passive” state).
Many other configurations of the first criteria and/or threshold(s) and/or the second criteria and/or threshold(s) to introduce hysteresis and/or other desired behavior into the operation of stairs mode setting selector 302 are likewise possible.
As noted above, in some implementations, the decision 414 may further involve determining whether the robot is currently on stairs 20. In some implementations, for example, the stair tracker 200 may be configured, based on sensor data 134 and/or kinematic data, to determine that the robot 100 is currently on stairs 20. When, at the decision 414, the stairs mode setting selector 302 determines that the robot 100 is currently on stairs (using the stair tracker 200 or otherwise), the stairs mode setting selector 302 may automatically place the robot 100 in the “auto-active” state (per the node 418 of the decision tree 400), even if the stairs mode setting selector 302 has not determined that the planned path 310 will intersect the location of stairs 20 indicated by a stair model 202. Taking this step may allow the robot 100 to successfully navigate stairs 20 in the event of a failure of the stair tracker 200 to accurately identify stairs 20 within the robot's planned path.
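The overall mode-selection logic described in this section — choosing among the "off," "on," "auto-passive," and "auto-active" settings, including the on-stairs override just discussed — can be sketched as a single hypothetical function. The boolean inputs stand in for the stair tracker and planned-path checks and are not part of the disclosure.

```python
def select_settings_column(mode, approaching, on_stairs, recently_exited):
    """Sketch of the stairs mode setting selection discussed in the text.

    Returns the name of the settings column to apply; names mirror the
    table columns described in the surrounding paragraphs.
    """
    if mode == "off":
        return "off"
    if mode == "on":
        return "on"
    # "auto" mode: apply the active settings when stairs are imminent,
    # currently underfoot (even if no stair model intersects the planned
    # path), or only recently exited (e.g., on a landing between flights).
    if approaching or on_stairs or recently_exited:
        return "auto-active"
    return "auto-passive"

print(select_settings_column("auto", approaching=False,
                             on_stairs=True, recently_exited=False))
```

Treating "currently on stairs" as sufficient by itself reflects the fail-safe described above: the robot keeps its stair-traversal settings even if the stair tracker failed to place stairs on the planned path.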
The purpose and function of the example settings listed in the table 500 (shown in
The value of the “gait” setting shown in the table 500 may determine allowable values for the current gait for the robot 100. Examples of possible “gait” setting values include (1) “crawl,” in which the robot 100 picks up one leg 120 at a time as it moves, (2) “walk” (which may alternatively be referred to as “trot”) in which the robot 100 picks up one diagonal pair of legs 120 at a time, (3) “jog,” in which the robot 100 also picks up one diagonal pair of legs 120 at a time but at a faster cadence than in the “walk” mode and also including a flight phase during which all four legs 120 are in the air, (4) “stairs trot,” in which the robot 100 picks up one diagonal pair of legs 120 at a time in a manner optimized for stair traversal, and (5) “hop,” in which the robot 100 takes five quick jumps with one diagonal pair of legs 120, followed by five quick jumps with the other diagonal pair of legs 120, and then repeats. As shown in the columns 504 and 508 of the table 500 (shown in
The “speed limits” setting may control the maximum speed at which the robot 100 is permitted to travel. When the value of the “speed limits” setting is set to “stairs” (e.g., per the columns 504 and 508 of the table 500 shown in
The “pitch limiter” setting, when set to “on,” may control the pitch of the robot 100 when the robot is backing up, i.e., moving in reverse, and the stair tracker 200 is unable to determine (e.g., due to poor sensor data quality or otherwise) whether stairs 20 are present behind the robot 100, thus ensuring that the field of view of the sensor 132 at the rear of the robot 100 points sufficiently downward to enable the robot 100 to accurately identify a downward going staircase 20. In particular, when the “pitch limiter” setting value is set to “on,” the robot 100 may be precluded from assuming a pose, while moving in reverse at a time that the stair tracker 200 is unable to determine whether stairs 20 are present, at which the pitch of the body 110 of the robot 100 causes the field of view of the rear sensor 132 to move upward by more than a threshold angle. When the “pitch limiter” setting is set to “off” (e.g., per the column 502 of the table 500 shown in
The “stair tracker” setting may determine whether the stair tracker 200 (described above) is actively operating. As indicated in the columns 502, 504, 506, and 508 of the table 500 (shown in
The “no step region adjustments” setting may determine whether special adjustments and/or filtering are to be made when identifying “no step regions” for the robot 100 while traversing stairs. As described above in connection with
As shown in the columns 504 and 508 of the table 500 (shown in
The “voxel map adjustments” setting may determine whether special assumptions are to be made by the perception system 180, e.g., to account for bad or missing sensor data, when generating voxel maps while traversing stairs 20. For instance, due to the pitch of the body 110 of the robot 100 when traveling up a staircase, the field of view of the sensors 132 may not include portions of the top landing, or, for open riser stairs, problems can arise because the field of view of the sensors 132 includes the bottom surface, rather than the top surface, of the top landing. In some implementations, when the value of the “voxel map adjustments” setting is “yes” (e.g., per the columns 504 and 508 of the table 500 shown in
The “multi-step mu estimate” setting may determine how the ground coefficient of friction (μ) is determined during movement of the robot 100. In some implementations, when the value of the “multi-step mu estimate” setting is “yes” (e.g., per the columns 502 and 506 of the table 500 shown in
The “contact normals” setting may control how the direction normal to the surface underneath the robot 100 is determined, e.g., to inform force allocation decisions based on friction. In some implementations, when the value of the “contact normals” setting is “measured” (e.g., per the columns 502 and 506 of the table 500 shown in
The “body offsets” setting may control whether the robot 100 may assume particular poses. In some implementations, when the value of the “body offsets” setting is “allowed” (e.g., per the columns 502 and 506 of the table 500 shown in
As shown in
The processor(s) 602 may operate as one or more general-purpose processors or special-purpose processors (e.g., digital signal processors, application specific integrated circuits, etc.). The processor(s) 602 may, for example, correspond to the data processing hardware 142 of the robot 100 described above. The processor(s) 602 can be configured to execute computer-readable program instructions 606 that are stored in the data storage 604 and are executable to provide the operations of the robotic device 600 described herein. For instance, the program instructions 606 may be executable to provide operations of controller 608, where the controller 608 may be configured to cause activation and/or deactivation of the mechanical components 614 and the electrical components 616. The processor(s) 602 may operate to enable the robotic device 600 to perform various functions, including the functions described herein.
The data storage 604 may exist as various types of storage media, such as a memory. The data storage 604 may, for example, correspond to the memory hardware 144 of the robot 100 described above. The data storage 604 may include or take the form of one or more non-transitory computer-readable storage media that can be read or accessed by processor(s) 602. The one or more computer-readable storage media can include volatile and/or non-volatile storage components, such as optical, magnetic, organic or other memory or disc storage, which can be integrated in whole or in part with processor(s) 602. In some implementations, the data storage 604 can be implemented using a single physical device (e.g., one optical, magnetic, organic or other memory or disc storage unit), while in other implementations, the data storage 604 can be implemented using two or more physical devices, which may communicate electronically (e.g., via wired or wireless communication). Further, in addition to the computer-readable program instructions 606, the data storage 604 may include additional data such as diagnostic data, among other possibilities.
The robotic device 600 may include at least one controller 608, which may interface with the robotic device 600 and may be either integral with the robotic device, or separate from the robotic device 600. The controller 608 may serve as a link between portions of the robotic device 600, such as a link between mechanical components 614 and/or electrical components 616. In some instances, the controller 608 may serve as an interface between the robotic device 600 and another computing device. Furthermore, the controller 608 may serve as an interface between the robotic system 600 and a user(s). The controller 608 may include various components for communicating with the robotic device 600, including one or more joysticks or buttons, among other features. The controller 608 may perform other operations for the robotic device 600 as well. Other examples of controllers may exist as well.
Additionally, the robotic device 600 may include one or more sensor(s) 610 such as image sensors, force sensors, proximity sensors, motion sensors, load sensors, position sensors, touch sensors, depth sensors, ultrasonic range sensors, and/or infrared sensors, or combinations thereof, among other possibilities. The sensor(s) 610 may, for example, correspond to the sensors 132 of the robot 100 described above. The sensor(s) 610 may provide sensor data to the processor(s) 602 to allow for appropriate interaction of the robotic system 600 with the environment as well as monitoring of operation of the systems of the robotic device 600. The sensor data may be used in evaluation of various factors for activation and deactivation of mechanical components 614 and electrical components 616 by controller 608 and/or a computing system of the robotic device 600.
The sensor(s) 610 may provide information indicative of the environment of the robotic device for the controller 608 and/or computing system to use to determine operations for the robotic device 600. For example, the sensor(s) 610 may capture data corresponding to the terrain of the environment or location of nearby objects, which may assist with environment recognition and navigation, etc. In an example configuration, the robotic device 600 may include a sensor system that may include a camera, RADAR, LIDAR, time-of-flight camera, global positioning system (GPS) transceiver, and/or other sensors for capturing information of the environment of the robotic device 600. The sensor(s) 610 may monitor the environment in real-time and detect obstacles, elements of the terrain, weather conditions, temperature, and/or other parameters of the environment for the robotic device 600.
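The terrain- and obstacle-sensing role described above can be illustrated with a minimal sketch that converts range-sensor readings into nearby obstacle positions. This is not the robot's actual perception pipeline; the function name, threshold, and data layout are assumptions for illustration.

```python
# Minimal sketch: flag range readings within a distance threshold as nearby
# obstacles and convert them to (x, y) points in the sensor frame.
# All names and the 1.5 m threshold are illustrative assumptions.
import math


def detect_obstacles(ranges, angles, max_distance=1.5):
    """Return (x, y) points for readings closer than max_distance.

    `ranges` holds distances in meters from a range sensor (e.g., LIDAR)
    and `angles` the corresponding beam angles in radians.
    """
    obstacles = []
    for r, a in zip(ranges, angles):
        if 0.0 < r <= max_distance:  # ignore invalid (<= 0) and far readings
            obstacles.append((r * math.cos(a), r * math.sin(a)))
    return obstacles
```

A navigation layer could then feed such points into path planning or use them to trigger a mode change, consistent with the environment-recognition role the sensors play here.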
Further, the robotic device 600 may include other sensor(s) 610 configured to receive information indicative of the state of the robotic device 600, including sensor(s) 610 that may monitor the state of the various components of the robotic device 600. The sensor(s) 610 may measure activity of systems of the robotic device 600 and receive information based on the operation of the various features of the robotic device 600, such as the operation of extendable legs, arms, or other mechanical and/or electrical features of the robotic device 600. The sensor data provided by the sensors may enable the computing system of the robotic device 600 to determine errors in operation as well as monitor overall functioning of components of the robotic device 600.
For example, the computing system may use sensor data to determine the stability of the robotic device 600 during operations as well as measurements related to power levels, communication activities, components that require repair, among other information. As an example configuration, the robotic device 600 may include gyroscope(s), accelerometer(s), and/or other possible sensors to provide sensor data relating to the state of operation of the robotic device. Further, sensor(s) 610 may also monitor the current state of a function, such as a gait, that the robotic system 600 may currently be operating. Additionally, the sensor(s) 610 may measure a distance between a given robotic leg of a robotic device and a center of mass of the robotic device. Other example uses for the sensor(s) 610 may exist as well.
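One concrete way a computing system might use gyroscope or accelerometer data to assess stability, as described above, is to estimate body tilt from the measured gravity direction. This is a hedged sketch under simplifying assumptions (a static robot, readings in m/s², hypothetical names); a real stability estimator would fuse multiple sensors.

```python
# Sketch: estimate body tilt from a single accelerometer reading by comparing
# the body z-axis to the measured gravity vector. Assumes the robot is
# near-static; names and the 30-degree threshold are illustrative.
import math


def body_tilt_deg(ax, ay, az):
    """Angle in degrees between the body z-axis and gravity (0 = level)."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0.0:
        raise ValueError("no gravity vector in reading")
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))


def is_stable(ax, ay, az, max_tilt_deg=30.0):
    """Flag the robot as unstable when tilt exceeds a threshold."""
    return body_tilt_deg(ax, ay, az) <= max_tilt_deg
```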
Additionally, the robotic device 600 may include one or more power source(s) 612 configured to supply power to various components of the robotic device 600. Among possible power systems, the robotic device 600 may include a hydraulic system, electrical system, batteries, and/or other types of power systems. As an example illustration, the robotic device 600 may include one or more batteries configured to provide power to components via a wired and/or wireless connection. Within examples, components of the mechanical components 614 and electrical components 616 may each connect to a different power source or may be powered by the same power source. Components of the robotic system 600 may connect to multiple power sources as well.
Within example configurations, any suitable type of power source may be used to power the robotic device 600, such as a gasoline and/or electric engine. Further, the power source(s) 612 may charge using various types of charging, such as wired connections to an outside power source, wireless charging, combustion, or other examples. Other configurations may also be possible. Additionally, the robotic device 600 may include a hydraulic system configured to provide power to the mechanical components 614 using fluid power. Components of the robotic device 600 may operate based on hydraulic fluid being transmitted throughout the hydraulic system to various hydraulic motors and hydraulic cylinders, for example. The hydraulic system of the robotic device 600 may transfer a large amount of power through small tubes, flexible hoses, or other links between components of the robotic device 600. Other power sources may be included within the robotic device 600.
Mechanical components 614 can represent hardware of the robotic system 600 that may enable the robotic device 600 to operate and perform physical functions. As a few examples, the robotic device 600 may include actuator(s), extendable leg(s) (“legs”), arm(s), wheel(s), one or multiple structured bodies for housing the computing system or other components, and/or other mechanical components. The mechanical components 614 may depend on the design of the robotic device 600 and may also be based on the functions and/or tasks the robotic device 600 may be configured to perform. As such, depending on the operation and functions of the robotic device 600, different mechanical components 614 may be available for the robotic device 600 to utilize. In some examples, the robotic device 600 may be configured to add and/or remove mechanical components 614, which may involve assistance from a user and/or other robotic device. For example, the robotic device 600 may be initially configured with four legs, but may be altered by a user or the robotic device 600 to remove two of the four legs to operate as a biped. Other examples of mechanical components 614 may be included.
The electrical components 616 may include various components capable of processing, transferring, or providing electrical charge or electric signals, for example. Among possible examples, the electrical components 616 may include electrical wires, circuitry, and/or wireless communication transmitters and receivers to enable operations of the robotic device 600. The electrical components 616 may interwork with the mechanical components 614 to enable the robotic device 600 to perform various operations. The electrical components 616 may be configured to provide power from the power source(s) 612 to the various mechanical components 614, for example. Further, the robotic device 600 may include electric motors. Other examples of electrical components 616 may exist as well.
In some implementations, the robotic device 600 may also include communication link(s) 618 configured to send and/or receive information. The communication link(s) 618 may transmit data indicating the state of the various components of the robotic device 600. For example, information read in by sensor(s) 610 may be transmitted via the communication link(s) 618 to a separate device. Other diagnostic information indicating the integrity or health of the power source(s) 612, mechanical components 614, electrical components 616, processor(s) 602, data storage 604, and/or controller 608 may be transmitted via the communication link(s) 618 to an external communication device.
In some implementations, the robotic device 600 may receive information at the communication link(s) 618 that is processed by the processor(s) 602. The received information may indicate data that is accessible by the processor(s) 602 during execution of the program instructions 606, for example. Further, the received information may change aspects of the controller 608 that may affect the behavior of the mechanical components 614 or the electrical components 616. In some cases, the received information indicates a query requesting a particular piece of information (e.g., the operational state of one or more of the components of the robotic device 600), and the processor(s) 602 may subsequently transmit that particular piece of information back out the communication link(s) 618.
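The query/response pattern described above can be sketched as a small message handler: a query arriving over a communication link names a component, and the handler replies with that component's operational state. The JSON message shape, component names, and state fields are all hypothetical assumptions; the source does not specify a wire format.

```python
# Sketch of the query/response pattern: an incoming query names a component,
# and the reply carries that component's operational state. The message
# format and all component/state names are hypothetical.
import json

COMPONENT_STATE = {
    "power_source": {"charge_pct": 82, "status": "ok"},
    "leg_actuators": {"status": "ok"},
}


def handle_message(raw: str) -> str:
    """Process one incoming message and return the reply to transmit."""
    msg = json.loads(raw)
    if msg.get("type") == "query":
        component = msg.get("component")
        state = COMPONENT_STATE.get(component)
        if state is None:
            return json.dumps({"type": "error", "reason": "unknown component"})
        return json.dumps({"type": "state", "component": component,
                           "state": state})
    return json.dumps({"type": "error", "reason": "unsupported message"})
```

For example, a query such as `{"type": "query", "component": "power_source"}` would produce a reply carrying that component's recorded state, mirroring the "query, then transmit the requested information back out" flow described above.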
In some cases, the communication link(s) 618 include a wired connection. The robotic device 600 may include one or more ports to interface the communication link(s) 618 to an external device. The communication link(s) 618 may include, in addition to or alternatively to the wired connection, a wireless connection. Some example wireless connections may utilize a cellular connection, such as CDMA, EVDO, GSM/GPRS, or 4G telecommunication, such as WiMAX or LTE. Alternatively or in addition, the wireless connection may utilize a Wi-Fi connection to transmit data to a wireless local area network (WLAN). In some implementations, the wireless connection may also communicate over an infrared link, radio, Bluetooth, or a near-field communication (NFC) device.
The above-described embodiments can be implemented in any of numerous ways. For example, the embodiments may be implemented using hardware, software or a combination thereof. When implemented in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single computer or distributed among multiple computers. It should be appreciated that any component or collection of components that perform the functions described above can be generically considered as one or more controllers that control the above-described functions. The one or more controllers can be implemented in numerous ways, such as with dedicated hardware or with one or more processors programmed using microcode or software to perform the functions recited above.
Various aspects of the present technology may be used alone, in combination, or in a variety of arrangements not specifically described in the embodiments described in the foregoing and are therefore not limited in their application to the details and arrangement of components set forth in the foregoing description or illustrated in the drawings. For example, aspects described in one embodiment may be combined in any manner with aspects described in other embodiments.
Also, some embodiments may be implemented as one or more methods, of which an example has been provided. The acts performed as part of the method(s) may be ordered in any suitable way. Accordingly, embodiments may be constructed in which acts are performed in an order different than illustrated, which may include performing some acts simultaneously, even though shown as sequential acts in illustrative embodiments.
Use of ordinal terms such as “first,” “second,” “third,” etc., in the claims to modify a claim element does not by itself connote any priority, precedence, or order of one claim element over another or the temporal order in which acts of a method are performed. Such terms are used merely as labels to distinguish one claim element having a certain name from another element having a same name (but for use of the ordinal term).
The phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” “having,” “containing,” “involving,” and variations thereof, is meant to encompass the items listed thereafter and additional items.
Having described several embodiments in detail, various modifications and improvements will readily occur to those skilled in the art. Such modifications and improvements are intended to be within the spirit and scope of the technology. Accordingly, the foregoing description is by way of example only, and is not intended as limiting.
What is claimed is:
This application claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application Ser. No. 63/354,854, filed Jun. 23, 2022, and entitled, “AUTOMATICALLY TRANSITIONING A ROBOT TO AN OPERATIONAL MODE OPTIMIZED FOR PARTICULAR TERRAIN,” the entire contents of which are incorporated herein by reference.
Number | Date | Country
---|---|---
63354854 | Jun 2022 | US