Some types of robots can be classified as legged robotic devices in that they are equipped with one or more legs by which they are able to move about an environment. Some examples of legged robotic devices include biped, or two-legged robots, and quadruped, or four-legged robots. Legged robots may move about an environment according to a gait, or pattern of movements during locomotion. Each cycle of this pattern may be referred to as a step. A robotic device may alter certain steps based on features of the environment.
The present disclosure generally relates to controlling a legged robotic device. More specifically, implementations described herein may involve adjusting a swing height of one or more legs of a robotic device. As the surroundings change while the robotic device moves through the environment, the legged robotic device may adjust to various swing heights to step over features or obstacles of the environment.
A first example implementation includes (i) receiving sensor data that indicates topographical features of an environment in which a robotic device is operating, (ii) processing the sensor data into a topographical map that includes a two-dimensional matrix of cells, the cells indicating sample heights of respective portions of the environment, (iii) determining, for a first foot of the robotic device, a first step path extending from a first lift-off location to a first touch-down location, (iv) identifying, within the topographical map, a first scan patch of cells that encompass the first step path, (v) determining a first high point among the first scan patch of cells; and (vi) during the first step, directing the robotic device to lift the first foot to a first swing height that is higher than the determined first high point.
In a second example implementation, a control system is configured to (i) receive sensor data that indicates topographical features of an environment in which a robotic device is operating, (ii) process the sensor data into a topographical map that includes a two-dimensional matrix of cells, the cells indicating sample heights of respective portions of the environment, (iii) determine, for a first foot of the robotic device, a first step path extending from a first lift-off location to a first touch-down location, (iv) identify, within the topographical map, a first scan patch of cells that encompass the first step path, (v) determine a first high point among the first scan patch of cells, and (vi) during the first step, direct the robotic device to lift the first foot to a first swing height that is higher than the determined first high point.
A third example implementation includes a robotic system having (a) a first leg ending with a first foot, (b) at least one perception sensor, and (c) a control system configured to perform operations. The operations include (i) receiving sensor data that indicates topographical features of an environment in which a robotic device is operating, (ii) processing the sensor data into a topographical map that includes a two-dimensional matrix of cells, the cells indicating sample heights of respective portions of the environment, (iii) determining, for a first foot of the robotic device, a first step path extending from a first lift-off location to a first touch-down location, (iv) identifying, within the topographical map, a first scan patch of cells that encompass the first step path, (v) determining a first high point among the first scan patch of cells; and (vi) during the first step, directing the robotic device to lift the first foot to a first swing height that is higher than the determined first high point.
A fourth example implementation may include a system. The system may include (i) a means for receiving sensor data that indicates topographical features of an environment in which a robotic device is operating, (ii) a means for processing the sensor data into a topographical map that includes a two-dimensional matrix of cells, the cells indicating sample heights of respective portions of the environment, (iii) a means for determining, for a first foot of the robotic device, a first step path extending from a first lift-off location to a first touch-down location, (iv) a means for identifying, within the topographical map, a first scan patch of cells that encompass the first step path, (v) a means for determining a first high point among the first scan patch of cells; and (vi) a means for directing the robotic device to lift the first foot to a first swing height that is higher than the determined first high point during the first step.
These as well as other aspects, advantages, and alternatives will become apparent to those of ordinary skill in the art by reading the following detailed description, with reference where appropriate to the accompanying drawings. This summary and other descriptions and figures provided herein are intended to illustrate implementations by way of example only and numerous variations are possible. For instance, structural elements and process steps can be rearranged, combined, distributed, eliminated, or otherwise changed, while remaining within the scope of the implementations as claimed.
The details of one or more implementations of the disclosure are set forth in the accompanying drawings and the description below. Other aspects, features, and advantages will be apparent from the description and drawings, and from the claims.
Like reference symbols in the various drawings indicate like elements.
Example apparatuses, systems, and methods are described herein. The words “example,” “exemplary,” and “illustrative” are used herein to mean “serving as an example, instance, or illustration.” Any implementation or feature described herein as being an “example,” being “exemplary,” or being “illustrative” is not necessarily to be construed as preferred or advantageous over other implementations or features. The example implementations described herein are not meant to be limiting. Thus, the aspects of the present disclosure, as generally described herein and illustrated in the figures, can be arranged, substituted, combined, separated, and designed in a wide variety of different configurations, all of which are contemplated herein. Further, unless otherwise noted, figures are not drawn to scale and are used for illustrative purposes only. Moreover, the figures are representational only and not all components are shown. For example, additional structural or restraining components might not be shown.
Overview
A legged robot may include a control system that adjusts the step path of the robot's feet based on the surrounding terrain (e.g., the terrain that the robot is currently traversing or terrain that is upcoming). Example legged robots include biped robots having two legs or quadruped robots having four legs, among other possible configurations. Such legged robots may move about an environment using their legs, perhaps by swinging their legs to move their feet. Viewed from the side, the step path of a given foot may appear to have a roughly quadrilateral shape which is created by the robot picking its foot up from a support surface, stepping forward, and setting its foot back to the support surface (with the fourth side being created by the support surface). Viewed from above, the step path of the foot may appear to be a line extending from the point where the robot picks up its foot to the point where the robot sets the foot down.
A robot may adjust the step path of its feet based on the terrain that it is traversing, which may help the robot avoid tripping on features of the terrain. For instance, a robot may adjust its step path to “high-step” over obstacles and other features of the environment. However, some example robots may use relatively more energy during a high-step as compared with a “regular” step. So, in some cases, the robot may operate more efficiently by high-stepping only as necessary to avoid tripping. Moreover, reducing the height of the step so as to not raise the robot's foot unnecessarily high (i.e., high enough to clear an obstacle by an acceptable margin, but not more) may further improve efficiency.
To sense the position of obstacles, a robot may be equipped with various sensors, such as a stereo vision system. A stereo vision system may be used by the robot's control system to create a topographical map of the environment surrounding the robot. An example map may include a discretized matrix of cells each representing an area (e.g., 5 sq. cm.) of the surrounding environment. In some examples, the “value” of each cell may be based on the average height of samples within the cell and one or more standard deviations of those samples. As the robot moves through the environment, additional sensor data may be incorporated into the map so that the robot's control systems maintain a topographical sense of the environment surrounding the robot, which may assist the robot in determining the right height for each step.
The robot may also include various sensors, such as an inertial measurement unit, that provide data indicative of the robot's speed and positioning, which the robot may use to anticipate the path of the foot during the step. However, because the robot's sensors might not perfectly sense the movement of the robot relative to the environment, some uncertainty may exist as to the actual path that the foot will take relative to the environment.
Given an anticipated step path for a given step, the robot may identify a particular area, or “scan patch,” of the topographical map through which the foot is likely to travel during the anticipated step. This scan patch includes discrete cells of the topographical map that surround the anticipated step path, as the robotic device might trip on the topographical features represented by those cells if it does not step high enough. Because of the uncertainty that may exist as to the actual path of the foot, the scan patch may extend some distance around the anticipated step path. Moreover, because the uncertainty may increase as the robot anticipates further ahead in time, the scan patch may widen along the step path from the anticipated foot lifting point to the anticipated foot landing point. For example, the scan patch may have an approximately trapezoidal shape in which the step path extends from the shorter of the two parallel sides to the longer of the two parallel sides. If the robot is undergoing lateral velocity, the robot may skew the scan patch in the direction of the lateral velocity, so as to shift or widen the scan patch in that direction. The amount of skew (or widening) may be proportional to the lateral velocity.
Example Robotic Systems
As shown in
Processor(s) 102 may operate as one or more general-purpose hardware processors or special purpose hardware processors (e.g., digital signal processors, application specific integrated circuits, etc.). The processor(s) 102 may be configured to execute computer-readable program instructions 106, and manipulate data 107, both of which are stored in the data storage 104. The processor(s) 102 may also directly or indirectly interact with other components of the robotic system 100, such as sensor(s) 112, power source(s) 114, mechanical components 110, and/or electrical components 116.
The data storage 104 may be one or more types of hardware memory. For example, the data storage 104 may include or take the form of one or more computer-readable storage media that can be read or accessed by processor(s) 102. The one or more computer-readable storage media can include volatile and/or non-volatile storage components, such as optical, magnetic, organic, or another type of memory or storage, which can be integrated in whole or in part with processor(s) 102. In some implementations, the data storage 104 can be a single physical device. In other implementations, the data storage 104 can be implemented using two or more physical devices, which may communicate with one another via wired or wireless communication. As noted previously, the data storage 104 may include the computer-readable program instructions 106 and the data 107. The data 107 may be any type of data, such as configuration data, sensor data, and/or diagnostic data, among other possibilities.
The controller 108 may include one or more electrical circuits, units of digital logic, computer chips, and/or microprocessors that are configured to (perhaps among other tasks) interface between any combination of the mechanical components 110, the sensor(s) 112, the power source(s) 114, the electrical components 116, the control system 118, and/or a user of the robotic system 100. In some implementations, the controller 108 may be a purpose-built embedded device for performing specific operations with one or more subsystems of the robotic device 100.
The control system 118 may monitor and physically change the operating conditions of the robotic system 100. In doing so, the control system 118 may serve as a link between portions of the robotic system 100, such as between mechanical components 110 and/or electrical components 116. In some instances, the control system 118 may serve as an interface between the robotic system 100 and another computing device. Further, the control system 118 may serve as an interface between the robotic system 100 and a user. For instance, the control system 118 may include various components for communicating with the robotic system 100, such as a joystick, buttons, and/or ports. The example interfaces and communications noted above may be implemented via a wired or wireless connection, or both. The control system 118 may perform other operations for the robotic system 100 as well.
During operation, the control system 118 may communicate with other systems of the robotic system 100 via wired or wireless connections, and may further be configured to communicate with one or more users of the robot. As one possible illustration, the control system 118 may receive an input (e.g., from a user or from another robot) indicating an instruction to perform a particular gait in a particular direction, and at a particular speed. A gait is a pattern of movement of the limbs of an animal, robot, or other mechanical structure.
Based on this input, the control system 118 may perform operations to cause the robotic device 100 to move according to the requested gait. As another illustration, a control system may receive an input indicating an instruction to move to a particular geographical location. In response, the control system 118 (perhaps with the assistance of other components or systems) may determine a direction, speed, and/or gait based on the environment through which the robotic system 100 is moving en route to the geographical location.
Operations of the control system 118 may be carried out by the processor(s) 102. Alternatively, these operations may be carried out by the controller 108, or a combination of the processor(s) 102 and the controller 108. In some implementations, the control system 118 may partially or wholly reside on a device other than the robotic system 100, and therefore may at least in part control the robotic system 100 remotely.
Mechanical components 110 represent hardware of the robotic system 100 that may enable the robotic system 100 to perform physical operations. As a few examples, the robotic system 100 may include physical members such as leg(s), arm(s), and/or wheel(s). The physical members or other parts of robotic system 100 may further include actuators arranged to move the physical members in relation to one another. The robotic system 100 may also include one or more structured bodies for housing the control system 118 and/or other components, and may further include other types of mechanical components. The particular mechanical components 110 used in a given robot may vary based on the design of the robot, and may also be based on the operations and/or tasks the robot may be configured to perform.
In some examples, the mechanical components 110 may include one or more removable components. The robotic system 100 may be configured to add and/or remove such removable components, which may involve assistance from a user and/or another robot. For example, the robotic system 100 may be configured with removable arms, hands, feet, and/or legs, so that these appendages can be replaced or changed as needed or desired. In some implementations, the robotic system 100 may include one or more removable and/or replaceable battery units or sensors. Other types of removable components may be included within some implementations.
The robotic system 100 may include sensor(s) 112 arranged to sense aspects of the robotic system 100. The sensor(s) 112 may include one or more force sensors, torque sensors, velocity sensors, acceleration sensors, position sensors, proximity sensors, motion sensors, location sensors, load sensors, temperature sensors, touch sensors, depth sensors, ultrasonic range sensors, infrared sensors, object sensors, and/or cameras, among other possibilities. Within some examples, the robotic system 100 may be configured to receive sensor data from sensors that are physically separated from the robot (e.g., sensors that are positioned on other robots or located within the environment in which the robot is operating).
The sensor(s) 112 may provide sensor data to the processor(s) 102 (perhaps by way of data 107) to allow for interaction of the robotic system 100 with its environment, as well as monitoring of the operation of the robotic system 100. The sensor data may be used in evaluation of various factors for activation, movement, and deactivation of mechanical components 110 and electrical components 116 by control system 118. For example, the sensor(s) 112 may capture data corresponding to the terrain of the environment or location of nearby objects, which may assist with environment recognition and navigation. In an example configuration, sensor(s) 112 may include RADAR (e.g., for long-range object detection, distance determination, and/or speed determination), LIDAR (e.g., for short-range object detection, distance determination, and/or speed determination), SONAR (e.g., for underwater object detection, distance determination, and/or speed determination), VICON® (e.g., for motion capture), one or more cameras (e.g., stereoscopic cameras for 3D vision), a global positioning system (GPS) transceiver, and/or other sensors for capturing information of the environment in which the robotic system 100 is operating. The sensor(s) 112 may monitor the environment in real time, and detect obstacles, elements of the terrain, weather conditions, temperature, and/or other aspects of the environment.
Further, the robotic system 100 may include sensor(s) 112 configured to receive information indicative of the state of the robotic system 100, including sensor(s) 112 that may monitor the state of the various components of the robotic system 100. The sensor(s) 112 may measure activity of systems of the robotic system 100 and receive information based on the operation of the various features of the robotic system 100, such as the operation of extendable legs, arms, or other mechanical and/or electrical features of the robotic system 100. The data provided by the sensor(s) 112 may enable the control system 118 to determine errors in operation as well as monitor overall operation of components of the robotic system 100.
As an example, the robotic system 100 may use force sensors to measure load on various components of the robotic system 100. In some implementations, the robotic system 100 may include one or more force sensors on an arm or a leg to measure the load on the actuators that move one or more members of the arm or leg. As another example, the robotic system 100 may use one or more position sensors to sense the position of the actuators of the robotic system. For instance, such position sensors may sense states of extension, retraction, or rotation of the actuators on arms or legs.
As another example, the sensor(s) 112 may include one or more velocity and/or acceleration sensors. For instance, the sensor(s) 112 may include an inertial measurement unit (IMU). The IMU may sense velocity and acceleration in the world frame, with respect to the gravity vector. The velocity and acceleration sensed by the IMU may then be translated to that of the robotic system 100 based on the location of the IMU in the robotic system 100 and the kinematics of the robotic system 100.
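As a rough illustration of the frame translation described above, the following sketch (in Python, with assumed names and a standard Z-Y-X Euler-angle convention; it is not the disclosure's implementation) rotates a world-frame quantity sensed by an IMU into the robot's body frame.

```python
# Minimal sketch (not from the disclosure): translating an IMU reading expressed
# in the world frame into the robot's body frame, given the body orientation.
# The Euler-angle convention and function name are illustrative assumptions.
import numpy as np

def world_to_body(vector_world, body_yaw_pitch_roll):
    """Rotate a world-frame vector (e.g., a velocity estimate) into the body frame."""
    yaw, pitch, roll = body_yaw_pitch_roll
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    # Rotation from body frame to world frame (Z-Y-X order).
    r_body_to_world = np.array([
        [cz * cy, cz * sy * sx - sz * cx, cz * sy * cx + sz * sx],
        [sz * cy, sz * sy * sx + cz * cx, sz * sy * cx - cz * sx],
        [-sy,     cy * sx,                cy * cx],
    ])
    # The transpose maps world-frame quantities into the body frame.
    return r_body_to_world.T @ np.asarray(vector_world, dtype=float)

# Example: a 1 m/s world-frame velocity along world x, seen by a robot yawed 90 degrees.
print(world_to_body([1.0, 0.0, 0.0], (np.pi / 2, 0.0, 0.0)))
```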
The robotic system 100 may include other types of sensors not explicitly discussed herein. Additionally or alternatively, the robotic system may use particular sensors for purposes not enumerated herein.
The robotic system 100 may also include one or more power source(s) 114 configured to supply power to various components of the robotic system 100. Among other possible power systems, the robotic system 100 may include a hydraulic system, electrical system, batteries, and/or other types of power systems. As an example illustration, the robotic system 100 may include one or more batteries configured to provide charge to components of the robotic system 100. Some of the mechanical components 110 and/or electrical components 116 may each connect to a different power source, may be powered by the same power source, or be powered by multiple power sources.
Any type of power source may be used to power the robotic system 100, such as electrical power or a gasoline engine. Additionally or alternatively, the robotic system 100 may include a hydraulic system configured to provide power to the mechanical components 110 using fluid power. Components of the robotic system 100 may operate based on hydraulic fluid being transmitted throughout the hydraulic system to various hydraulic motors and hydraulic cylinders, for example. The hydraulic system may transfer hydraulic power by way of pressurized hydraulic fluid through tubes, flexible hoses, or other links between components of the robotic system 100. The power source(s) 114 may charge using various types of charging, such as wired connections to an outside power source, wireless charging, combustion, or other examples.
The electrical components 116 may include various mechanisms capable of processing, transferring, and/or providing electrical charge or electric signals. Among possible examples, the electrical components 116 may include electrical wires, circuitry, and/or wireless communication transmitters and receivers to enable operations of the robotic system 100. The electrical components 116 may interwork with the mechanical components 110 to enable the robotic system 100 to perform various operations. The electrical components 116 may be configured to provide power from the power source(s) 114 to the various mechanical components 110, for example. Further, the robotic system 100 may include electric motors. Other examples of electrical components 116 may exist as well.
Although not shown in
The body and/or the other components may include or carry the sensor(s) 112. These sensors may be positioned in various locations on the robotic device 100, such as on the body and/or on one or more of the appendages, among other examples.
On its body, the robotic device 100 may carry a load, such as a type of cargo that is to be transported. The load may also represent external batteries or other types of power sources (e.g., solar panels) that the robotic device 100 may utilize. Carrying the load represents one example use for which the robotic device 100 may be configured, but the robotic device 100 may be configured to perform other operations as well.
As noted above, the robotic system 100 may include various types of legs, arms, wheels, and so on. In general, the robotic system 100 may be configured with zero or more legs. An implementation of the robotic system with zero legs may include wheels, treads, or some other form of locomotion. An implementation of the robotic system with two legs may be referred to as a biped, and an implementation with four legs may be referred to as a quadruped. Implementations with six or eight legs are also possible. For purposes of illustration, biped and quadruped implementations of the robotic system 100 are described below.
The robot 200 may be a physical representation of the robotic system 100 shown in
The configuration, position, and/or structure of the legs 204A-204D may vary in example implementations. The legs 204A-204D enable the robot 200 to move relative to its environment, and may be configured to operate in multiple degrees of freedom to enable different techniques of travel. In particular, the legs 204A-204D may enable the robot 200 to travel at various speeds according to the mechanics set forth within different gaits. The robot 200 may use one or more gaits to travel within an environment, which may involve selecting a gait based on speed, terrain, the need to maneuver, and/or energy efficiency.
Further, different types of robots may use different gaits due to variations in design. Although some gaits may have specific names (e.g., walk, trot, run, bound, gallop, etc.), the distinctions between gaits may overlap. The gaits may be classified based on footfall patterns—the locations on a surface for the placement of the feet 206A-206D. Similarly, gaits may also be classified based on ambulatory mechanics.
The body 208 of the robot 200 connects to the legs 204A-204D and may house various components of the robot 200. For example, the body 208 may include or carry sensor(s) 210. These sensors may be any of the sensors discussed in the context of sensor(s) 112, such as a camera, LIDAR, or an infrared sensor. Further, the locations of sensor(s) 210 are not limited to those illustrated in
For example, the robot 300 may include legs 304 and 306 connected to a body 308. Each leg may consist of one or more members connected by joints and configured to operate with various degrees of freedom with respect to one another. Each leg may also include a respective foot 310 and 312, which may contact a surface (e.g., the ground surface). Like the robot 200, the legs 304 and 306 may enable the robot 300 to travel at various speeds according to the mechanics set forth within gaits. The robot 300, however, may utilize different gaits from that of the robot 200, due at least in part to the differences between biped and quadruped capabilities.
The robot 300 may also include arms 318 and 320. These arms may facilitate object manipulation, load carrying, and/or balancing for the robot 300. Like legs 304 and 306, each arm may consist of one or more members connected by joints and configured to operate with various degrees of freedom with respect to one another. Each arm may also include a respective hand 322 and 324. The robot 300 may use hands 322 and 324 for gripping, turning, pulling, and/or pushing objects. The hands 322 and 324 may include various types of appendages or attachments, such as fingers, grippers, welding tools, cutting tools, and so on.
The robot 300 may also include sensor(s) 314, corresponding to sensor(s) 112, and configured to provide sensor data to its control system. In some cases, the locations of these sensors may be chosen in order to suggest an anthropomorphic structure of the robot 300. Thus, as illustrated in
Implementation 400 could be used with the robotic system 100 of
Within examples, operations of
In addition, for
At block 402 of
As noted above, sensor(s) 112 include one or more sensors that, in operation, generate data indicative of topographical features of an environment in which robotic system 100 is operating. As noted above, these sensors may include one or more cameras. For instance, some embodiments may include stereoscopic cameras to provide 3D vision data. As noted above, other example sensors include RADAR, LIDAR, SONAR, VICON®, a global positioning system (GPS) transceiver, and/or other sensors for capturing information of the environment in which the robotic system 100 is operating.
In some cases, a robotic device may maintain or have access to a map indicating topographical features of the environment. In such cases, the robotic device may receive sensor data indicating the device's location within the environment (e.g., GPS sensor data). This location may indicate topographical features of the environment by reference to the pre-existing map of the environment. For instance, a robotic device may receive sensor data indicating the robotic device's current location, and query the map for topographical features corresponding to that location of the environment.
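As an informal illustration of querying a pre-existing map by location, the sketch below assumes a simple grid layout, a 5 cm cell size, and hypothetical names such as cells_near; none of these details come from the disclosure.

```python
# Minimal sketch (illustrative assumptions only): querying a pre-existing
# topographical map for the cells near the robot's current location, e.g., a GPS fix.
import numpy as np

CELL_SIZE_M = 0.05                   # each cell covers a 5 cm x 5 cm patch (assumed)
MAP_ORIGIN = np.array([0.0, 0.0])    # world (x, y) of cell (0, 0) (assumed)

def cells_near(height_map, location_xy, radius_m=1.0):
    """Return the block of map cells within radius_m of the given (x, y) location."""
    col, row = np.floor((np.asarray(location_xy) - MAP_ORIGIN) / CELL_SIZE_M).astype(int)
    r = int(np.ceil(radius_m / CELL_SIZE_M))
    rows, cols = height_map.shape
    return height_map[max(row - r, 0):min(row + r + 1, rows),
                      max(col - r, 0):min(col + r + 1, cols)]

# Example: a flat 10 m x 10 m map with a 0.3 m obstacle near (2.0, 2.0).
heights = np.zeros((200, 200))
heights[40:44, 40:44] = 0.3
print(cells_near(heights, (2.0, 2.0), radius_m=0.25).max())   # 0.3
```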
In operation, one or more processors (e.g., processor(s) 102) of control system 118 may receive or otherwise gain access to the data generated by the sensor(s) 112. The one or more processors may analyze the data to detect obstacles, elements of the terrain, weather conditions, temperature, and/or other aspects of the environment.
At block 404 of
In some embodiments, the topographical map is represented by data indicating a matrix (or array) of discrete cells. Such a topographical map could be a two-dimensional matrix of discrete cells with each cell representing a portion (e.g., 5 sq. cm, 10 sq. cm, or another granularity) of the environment in which the robotic device is operating. Within examples, the processor(s) 102 may assign respective values to the discrete cells that indicate sample heights of respective portions of the environment. Topographical features of different heights may result in discrete cells indicating different sample heights corresponding to the different heights of the topographical features.
Within examples, the value of each cell may be based on a measure representing the average height of the cell. For instance, the processor(s) 102 may average (or otherwise process) samples from the sensor(s) 112 that correspond to the respective portions of the environment. The processor(s) 102 may also determine a standard deviation of such samples, which indicates the amount of variation within the samples of the portion. A sample height of a given discrete cell may be based on the average height of samples within the discrete cell and one or more standard deviations of those samples.
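The following sketch illustrates one way a cell value could be computed from the samples falling within that cell, using the mean plus a number of standard deviations as described above; the function name and the default of two standard deviations are assumptions for illustration.

```python
# Minimal sketch (assumed, not the disclosure's code): filling one cell of the
# topographical map from the height samples that fall inside that cell.
import numpy as np

def cell_value(samples, num_std_devs=2.0):
    """Return a conservative height for one map cell from its height samples."""
    samples = np.asarray(samples, dtype=float)
    if samples.size == 0:
        return float("nan")            # no data for this cell yet
    mean_height = samples.mean()
    spread = samples.std()             # variation of the samples within the cell
    return mean_height + num_std_devs * spread

# A cell containing a tall, narrow object: a few high samples raise the cell value
# further once the standard deviation term is included.
print(cell_value([0.02, 0.03, 0.02, 0.45, 0.44]))
```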
To illustrate the sample heights, the discrete cells of the two-dimensional topographical map are shown with a third dimension indicating their respective sample heights. As shown in
As noted above, in some cases, a robotic device may maintain or have access to a map indicating topographical features of the environment. In such cases, processing the sensor data into a topographical map may involve determining a portion of the map that corresponds to the robot's current location. Such a portion may indicate topographical features surrounding the robotic device.
At block 406, the example implementation involves determining a first step path extending from a first lift-off location to a first touch-down location. Topographical features intersecting the path of a foot of a robot may interfere with a robot's step, which may cause undesirable results, such as tripping. To aid in identifying which topographical features of the environment might interfere with the step, a control system determines a step path for the foot that is taking the step. For instance, referring to
To cause a legged robot to take a step, a control system may control actuators to perform a series of actuations in which a foot of the robot is lifted from the ground, swung forward (or backward), and lowered back to the ground. As noted above, robotic systems, such as robotic system 100, may include mechanical components (e.g., mechanical components 110) and electrical components (e.g., electrical components 116) to facilitate locomotion of the robotic system. As noted above, the pattern of movements that the legs undergo during a step can be referred to as the robot's gait. Some robots may use a variety of gaits, selecting a particular gait based on speed, terrain, the need to maneuver, and energetic efficiency, among other possible considerations.
During a step, a given foot may follow a step path. The step path for a given foot may extend from a lift-off location to a touch-down location. The lift-off location refers to the location in which the given foot is lifted off of the ground, while the touch-down location refers to the location in which the given foot is lowered back to the ground. As noted above, when viewed from the side, the step path of a given foot may appear to have a roughly quadrilateral shape which is created by the robot picking its foot up from a support surface, stepping forward, and setting its foot back to the support surface (with the fourth side of the quadrilateral being created by the support surface). Viewed from above, the step path of a given foot may appear to be a line extending from the lift-off location to the touch-down location.
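As an illustrative sketch of a step path viewed from above, the code below represents the path as a straight line from the lift-off location to a touch-down location anticipated from the body velocity and an assumed swing duration; the names and the simple kinematic estimate are assumptions rather than the disclosure's method.

```python
# Minimal sketch (illustrative assumptions): a step path viewed from above, as a
# line from the lift-off location to an anticipated touch-down location.
import numpy as np

def anticipated_step_path(lift_off_xy, body_velocity_xy, swing_duration_s,
                          num_points=10):
    """Return points along the straight line from lift-off to anticipated touch-down."""
    lift_off = np.asarray(lift_off_xy, dtype=float)
    touch_down = lift_off + np.asarray(body_velocity_xy, dtype=float) * swing_duration_s
    fractions = np.linspace(0.0, 1.0, num_points)[:, None]
    return lift_off + fractions * (touch_down - lift_off)

# Example: walking forward at 0.8 m/s with a 0.4 s swing phase.
print(anticipated_step_path([0.0, 0.0], [0.8, 0.0], 0.4))
```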
To vary the speed and manner of walking, the control system of a legged robot may adjust the respective step paths of its feet. For instance, to run, the control system may lengthen the step path (for a longer stride) and increase the rate at which the actuators of the legged robot swing a foot during a step. As noted above, in some examples, a robot might be instructed to perform a particular gait in a particular direction, perhaps to move through the environment at a particular speed. Alternatively, a robot may be instructed to navigate to a particular geographical location and determine a direction, speed, and/or gait based on the environment through which the robotic device is moving en route to the geographical location.
Some gaits may have pre-determined step paths. Using such a gait, while taking a step, a foot of the robotic device follows a pre-planned path from a lift-off location to a touch-down location. With pre-planned step paths, the control system might not adjust the step path on a step-by-step basis, but instead use a similar step path for multiple steps.
Other gaits might not have pre-determined step paths. In such cases, a control system may anticipate the step path of a given foot based on direction, speed, and/or gait, among other possible factors. As noted above, the control system may be aware of the present direction, speed, and/or gait, as the robot may have been instructed to move with a particular direction, speed, and/or gait or may have chosen a particular direction, speed, and/or gait. For instance, referring back to
In some implementations, a control system may determine step paths for the feet of a legged robot on a foot-by-foot basis. For instance, referring to
At block 408 of
Although some example control systems might analyze the entirety of the environment surrounding the robot rather than a portion of that environment (i.e., a scan patch encompassing the step path), such an approach may limit the robot. For instance, the control system may have to reduce the robot's speed in order to analyze such a large area prior to each step. Should the control system fail to analyze the environment prior to each step being taken, the risk arises that the environment will interfere with the next step. By reducing the area analyzed to a scan patch that encompasses the step path, processing time may be reduced, which may assist the control system in analyzing the relevant portion of the environment prior to each step.
Moreover, because the uncertainty may increase as the robot looks further ahead in time to anticipate the path of the foot, the control system may identify a scan patch that widens along the swing path from the lift-off location to the touch-down location, so as to include most or all cells through which the foot might travel. In other words, the identified scan patch of cells may be narrower proximate to the first lift-off location than proximate to the first touch-down location. In some cases, such a scan patch of cells may be trapezoidal, as illustrated by scan patch 600A in
As noted above, in some cases, a robotic device may use a gait with pre-planned step paths. With such gaits, less uncertainty as to the path of the robot's foot may exist as compared with a dynamic gait. In such cases, the control system may identify a relatively smaller scan patch of cells, as the robot's foot may be more likely to stay within a certain area during the step. In some embodiments, the control system may identify a rectangular scan patch, as the degree of uncertainty as to a foot's position during the step might not warrant a trapezoidal scan patch.
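The sketch below illustrates one possible way to identify a scan patch of cells around a step path, with a corridor that is narrow near lift-off and widens toward touch-down (approximating the trapezoidal patch described above); the cell size, widths, and point-to-segment test are illustrative assumptions.

```python
# Minimal sketch (not from the disclosure): selecting the map cells that make up a
# scan patch around a step path, widening toward the touch-down end.
import numpy as np

CELL_SIZE_M = 0.05   # assumed 5 cm cells

def scan_patch_cells(height_map, lift_off_xy, touch_down_xy,
                     start_half_width_m=0.05, end_half_width_m=0.15):
    """Return (row, col) indices of cells inside a widening corridor around the path."""
    a = np.asarray(lift_off_xy, dtype=float)
    b = np.asarray(touch_down_xy, dtype=float)
    ab = b - a
    length_sq = float(ab @ ab) or 1e-9
    rows, cols = height_map.shape
    cells = []
    for row in range(rows):
        for col in range(cols):
            p = np.array([col + 0.5, row + 0.5]) * CELL_SIZE_M   # cell center (x, y)
            t = np.clip((p - a) @ ab / length_sq, 0.0, 1.0)       # progress along path
            dist = np.linalg.norm(p - (a + t * ab))               # lateral distance
            allowed = start_half_width_m + t * (end_half_width_m - start_half_width_m)
            if dist <= allowed:
                cells.append((row, col))
    return cells

heights = np.zeros((40, 40))
print(len(scan_patch_cells(heights, (0.2, 1.0), (0.8, 1.0))))
```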
In some cases, the robot's movement may have a velocity component that is lateral to the primary direction of travel. Such lateral velocity might be caused by forces applied by the robot (e.g., side-stepping) or by external force (e.g., wind). Lateral velocity may cause the actual step path of the foot to travel through a portion of the environment that is outside of the scan patch. To avoid this, a control system of the robot may detect the velocity of the robotic device in the lateral direction and skew the scan patch in the lateral direction such that the scan patch still encompasses the first step path. In some examples, the amount of skew varies based on the magnitude of the lateral velocity. For instance, the control system may skew the scan patch in proportion to the detected velocity of the robotic device in the lateral direction.
In some cases, the lateral velocity may have a negligible effect on the actual path of the foot. In such cases, skewing the scan patch may be unnecessary. To avoid unnecessarily increasing the size of the scan patch, the control system may determine whether the detected velocity in the lateral direction exceeds a threshold lateral velocity. If so, the control system skews the scan patch in the lateral direction. If not, the control system might not skew the scan patch.
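The following sketch illustrates the lateral-velocity check and proportional skew described above; the threshold, gain, and corner convention are assumptions chosen for illustration.

```python
# Minimal sketch (assumptions throughout): skewing a scan patch laterally when the
# detected lateral velocity exceeds a threshold, in proportion to that velocity.
import numpy as np

LATERAL_VELOCITY_THRESHOLD = 0.1   # m/s, assumed
SKEW_GAIN_S = 0.4                  # metres of skew per (m/s) of lateral velocity, assumed

def skew_scan_patch(patch_corners_xy, lateral_dir_xy, lateral_velocity):
    """Shift the touch-down end of a scan patch along the lateral direction."""
    corners = np.asarray(patch_corners_xy, dtype=float)      # e.g., 4 x 2 trapezoid
    if abs(lateral_velocity) < LATERAL_VELOCITY_THRESHOLD:
        return corners                                        # negligible: no skew
    offset = SKEW_GAIN_S * lateral_velocity * np.asarray(lateral_dir_xy, dtype=float)
    skewed = corners.copy()
    skewed[2:] += offset   # assume the last two corners bound the touch-down end
    return skewed

# Trapezoid listed lift-off end first, touch-down end last; 0.3 m/s sidestep.
patch = [[0.0, -0.05], [0.0, 0.05], [0.6, 0.15], [0.6, -0.15]]
print(skew_scan_patch(patch, [0.0, 1.0], 0.3))
```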
In some cases, a planar surface may be formed by two or more adjacent cells of the topographical map. For instance, adjacent cells representing a portion of the topographical map that includes a step, or a set of stairs, may form planar surfaces. To illustrate, referring back to
Within embodiments, the control system may compare the planar surfaces formed by the cells to pre-determined geometric primitives (e.g., a cuboid, or other geometric shape). In doing so, the control system may determine, from among the pre-determined geometric primitives, a particular geometric primitive that corresponds to the one or more planar surfaces. For instance, the two planar surfaces formed by the set of cells 504 in
A geometric primitive may approximate the height of one or more sub-portions of the environment. For instance, with a cube (e.g., a stair), the control system may assume that the environment remains a constant height across the horizontal planar surface that forms the top of the cube. By identifying a geometric primitive, the control system may reduce the number of cells processed, as a group of cells forming a geometric primitive (or a portion thereof) may be considered to have a consistent height.
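As a loose illustration of grouping cells into a planar surface, the sketch below flood-fills adjacent cells whose heights lie within a tolerance of a seed cell, so the resulting region can be treated as a single flat surface (e.g., a stair top); the tolerance and names are assumptions, and this is a simplification of matching against pre-determined geometric primitives.

```python
# Minimal sketch (illustrative, not the disclosure's method): grouping adjacent cells
# of roughly constant height into one flat region treated as a single planar surface.
import numpy as np
from collections import deque

def flat_region(height_map, seed_rc, tolerance_m=0.01):
    """Flood-fill from a seed cell, collecting cells within tolerance of its height."""
    rows, cols = height_map.shape
    seed_height = height_map[seed_rc]
    seen, queue, region = {seed_rc}, deque([seed_rc]), []
    while queue:
        r, c = queue.popleft()
        region.append((r, c))
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in seen
                    and abs(height_map[nr, nc] - seed_height) <= tolerance_m):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return region

# A 0.15 m "stair" occupying the right half of a small map.
heights = np.zeros((6, 6))
heights[:, 3:] = 0.15
print(len(flat_region(heights, (0, 4))))   # 18 cells form the stair top
```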
In some cases, a step path may pass through a portion of the environment in which a geometric primitive has been identified. In such instances, a control system may identify a portion of the geometric primitive that encompasses the first step path (e.g., a portion of a stair, or a portion of a staircase). The control system may analyze that portion of the geometric primitive in determining how high to step.
Referring back to
As noted above, the discrete cells of the topographical map may indicate sample heights of respective portions of the environment. To determine the high point among the scan patch of cells, the control system 118 may determine a particular cell that has the greatest average sample height among the first trapezoidal scan patch of cells.
In some cases, the control system 118 may also base the high point on the respective standard deviations of the samples of each cell. As noted above, in some cases, the processor(s) 102 of the control system 118 may determine a standard deviation of the samples of a cell. Incorporating the standard deviation helps to identify a discrete cell having a low number of high samples (which might represent a tall but narrow object) that are offset by some very low samples. In such cases, the control system 118 may then determine the high point to be the greatest average sample height plus one or more standard deviations (e.g., two or three standard deviations) among the cells.
Legged robots may vary in design such that different types of legged robots may be able to clear obstacles of different heights. For instance, a given robot may have a particular amount of ground clearance. The control system 118 may adjust for such differences in robot design by including an offset in the determination of the high point. For instance, the control system 118 may determine the high point to be at least the average sample height plus an offset. Within examples, the offset may be proportional to a height of the robot, which may be related to the ground clearance of the robot.
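The sketch below combines the pieces described above into one possible high-point computation over a scan patch, taking each cell's average height plus a number of standard deviations and adding a clearance offset; the particular values are illustrative assumptions.

```python
# Minimal sketch (assumed form): determining the high point of a scan patch whose
# cells each store an average sample height and a standard deviation.
import numpy as np

def scan_patch_high_point(cell_means, cell_std_devs, num_std_devs=2.0,
                          ground_clearance_offset_m=0.03):
    """Return a conservative high point over the cells of a scan patch."""
    means = np.asarray(cell_means, dtype=float)
    stds = np.asarray(cell_std_devs, dtype=float)
    conservative_heights = means + num_std_devs * stds
    return float(conservative_heights.max()) + ground_clearance_offset_m

# Three cells in a patch; the second hides a tall, narrow object (large spread).
print(scan_patch_high_point([0.02, 0.10, 0.05], [0.005, 0.12, 0.01]))
```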
Referring back to
As noted above, during a step, a given foot may follow a step path that involves the robot picking its foot up from a support surface, stepping forward, and setting its foot back to the support surface. To lift the foot higher than the determined high point, the control system 118 may control actuators of the robot to lift the foot to a height that is higher than the determined high point before or as the control system 118 controls the actuators to step the foot forward. For instance, control system 118 of quadruped robot 200 may cause actuators connected to leg 204B to rotate members of leg 204B at the hip and knee joints, which causes the foot 206B to lift off of the ground. Such control of the foot may prevent the topographical features within the scan patch from interfering with the step of the foot.
As noted above, because relatively more energy is used when the robot “high-steps” over obstacles, the robot may attempt to improve efficiency by high-stepping only as necessary to avoid tripping. In operation, a control system may determine whether the determined high point within the scan patch is greater than or less than a threshold obstacle height. If the determined high point within the scan patch is greater than the threshold obstacle height, the control system may direct the robotic device to lift the foot to a swing height that is higher than the determined high point. However, if the determined high point within the scan patch is less than the threshold obstacle height, the control system may direct the robotic device to lift the foot to a nominal swing height (perhaps as indicated by the gait and speed of the robot), so as to revert to a “normal” step (e.g., a step influenced by the gait and speed of the robot, rather than a particular topographical feature that the robot is attempting to avoid tripping over).
Moreover, with some example legged robots, more energy is used as the foot is lifted higher, as more work is done against the force of gravity. Accordingly, in some cases, efficiency is further improved by minimizing the height of the step so as to avoid raising the foot unnecessarily high (i.e., higher than necessary to clear the topographical feature represented by the determined high point). Thus, the control system 118 may direct the actuators to lift the foot to a swing height that is higher than the determined first high point by an acceptable margin that minimizes the step height while clearing the obstacle. The margin may vary based on the standard deviation of the determined high point cell, as greater variability within the cell may indicate a need to increase the margin to reduce the likelihood of a trip occurring.
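The following sketch illustrates one way the swing-height decision described above could look, with a threshold below which the nominal swing height is used and, above it, a margin over the high point that grows with the high-point cell's standard deviation; all values and names are assumptions.

```python
# Minimal sketch (values and names are assumptions, not the disclosure's): choosing
# a swing height for a step from the determined high point of the scan patch.
def choose_swing_height(high_point_m, high_point_std_m,
                        nominal_swing_height_m=0.08,
                        threshold_obstacle_height_m=0.06,
                        base_margin_m=0.02, margin_std_devs=1.0):
    """Return the swing height to command for this step."""
    if high_point_m < threshold_obstacle_height_m:
        return nominal_swing_height_m          # "normal" step: no high-stepping
    margin = base_margin_m + margin_std_devs * high_point_std_m
    return high_point_m + margin               # clear the obstacle, but not by much

print(choose_swing_height(0.04, 0.005))   # low terrain -> nominal swing height
print(choose_swing_height(0.15, 0.020))   # obstacle -> just clears the high point
```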
As noted above, in some implementations, a control system may determine step paths for the feet of a legged robot on a foot-by-foot basis. Accordingly, in operation, a control system may repeat certain operations of implementation 400 for a second, third, and/or fourth foot of a robotic system. For instance, the control system may determine, for a second foot of the robotic device, a second step path extending from a second lift-off location to a second touch-down location. The control system may then identify a second scan patch of cells that encompass the second step path. The control system may determine a second high point among the second scan patch of cells, and, during the second step, direct the robotic device to lift the second foot to a second swing height that is higher than the second high point. In some cases, the determined second swing height may be different from the determined first swing height, as the second high point may differ from the first high point.
The present disclosure is not to be limited in terms of the particular embodiments described in this application, which are intended as illustrations of various aspects. Many modifications and variations can be made without departing from its scope, as will be apparent to those skilled in the art. Functionally equivalent methods and apparatuses within the scope of the disclosure, in addition to those enumerated herein, will be apparent to those skilled in the art from the foregoing descriptions. Such modifications and variations are intended to fall within the scope of the appended claims.
The above detailed description describes various features and functions of the disclosed systems, devices, and methods with reference to the accompanying figures. The example embodiments described herein and in the figures are not meant to be limiting. Other embodiments can be utilized, and other changes can be made, without departing from the scope of the subject matter presented herein. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the figures, can be arranged, substituted, combined, separated, and designed in a wide variety of different configurations, all of which are explicitly contemplated herein.
With respect to any or all of the diagrams, scenarios, and flow charts in the figures and as discussed herein, each step, block, and/or communication can represent a processing of information and/or a transmission of information in accordance with example embodiments. Alternative embodiments are included within the scope of these example embodiments. In these alternative embodiments, for example, functions described as steps, blocks, transmissions, communications, requests, responses, and/or messages can be executed out of order from that shown or discussed, including substantially concurrent or in reverse order, depending on the functionality involved. Further, more or fewer blocks and/or functions can be used with any of the diagrams, scenarios, and flow charts discussed herein, and ladder diagrams, scenarios, and flow charts can be combined with one another, in part or in whole.
A step or block that represents a processing of information can correspond to circuitry that can be configured to perform the specific logical functions of a herein-described method or technique. Alternatively or additionally, a step or block that represents a processing of information can correspond to a module, a segment, or a portion of program code (including related data). The program code can include one or more instructions executable by a processor for implementing specific logical functions or actions in the method or technique. The program code and/or related data can be stored on any type of computer readable medium such as a storage device including a disk, hard drive, or other storage medium.
The computer readable medium can also include non-transitory computer readable media such as computer-readable media that store data for short periods of time like register memory, processor cache, and random access memory (RAM). The computer readable media can also include non-transitory computer readable media that store program code and/or data for longer periods of time. Thus, the computer readable media may include secondary or persistent long term storage, like read only memory (ROM), optical or magnetic disks, compact-disc read only memory (CD-ROM), for example. The computer readable media can also be any other volatile or non-volatile storage systems. A computer readable medium can be considered a computer readable storage medium, for example, or a tangible storage device.
Moreover, a step or block that represents one or more information transmissions can correspond to information transmissions between software and/or hardware modules in the same physical device. However, other information transmissions can be between software modules and/or hardware modules in different physical devices.
The particular arrangements shown in the figures should not be viewed as limiting. It should be understood that other embodiments can include more or less of each element shown in a given figure. Further, some of the illustrated elements can be combined or omitted. Yet further, an example embodiment can include elements that are not illustrated in the figures.
Additionally, any enumeration of elements, blocks, or steps in this specification or the claims is for purposes of clarity. Thus, such enumeration should not be interpreted to require or imply that these elements, blocks, or steps adhere to a particular arrangement or are carried out in a particular order.
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope being indicated by the following claims.
This patent application is a continuation of, and claims priority under 35 U.S.C. § 120 from U.S. patent application Ser. No. 17/453,270, filed on Nov. 2, 2021, which is a continuation of, and claims priority under 35 U.S.C. § 120 from U.S. patent application Ser. No. 16/703,261, filed on Dec. 4, 2019, which is a continuation of U.S. patent application Ser. No. 15/416,361 filed on Jan. 26, 2017, which is a continuation of U.S. patent application Ser. No. 14/709,830, filed on May 12, 2015. The disclosures of these prior applications are considered part of the disclosure of this application and are hereby incorporated by reference in their entireties.
This invention was made with U.S. Government support under Contract No. HR00011-10-C-0025 awarded by DARPA. The Government may have certain rights with regard to the invention.
Number | Name | Date | Kind |
---|---|---|---|
4834200 | Kajita | May 1989 | A |
5151859 | Yoshino | Sep 1992 | A |
5343397 | Yoshino | Aug 1994 | A |
5355064 | Yoshino | Oct 1994 | A |
5416393 | Gomi | May 1995 | A |
5432417 | Takenaka | Jul 1995 | A |
5459659 | Takenaka | Oct 1995 | A |
5513106 | Yoshino et al. | Apr 1996 | A |
5644204 | Nagle | Jul 1997 | A |
5737217 | Nishikawa et al. | Apr 1998 | A |
5762153 | Zamagni | Jun 1998 | A |
5808433 | Tagami et al. | Sep 1998 | A |
5838130 | Ozawa | Nov 1998 | A |
5974366 | Kawai et al. | Oct 1999 | A |
6021363 | Nishikawa et al. | Feb 2000 | A |
6064167 | Takenaka et al. | May 2000 | A |
6177776 | Kawai et al. | Jan 2001 | B1 |
6301524 | Takenaka | Oct 2001 | B1 |
6317652 | Osada | Nov 2001 | B1 |
6374157 | Takamura | Apr 2002 | B1 |
6484068 | Yamamoto | Nov 2002 | B1 |
6493607 | Bourne | Dec 2002 | B1 |
6534943 | Hornby | Mar 2003 | B1 |
6584377 | Saijo | Jun 2003 | B2 |
6640160 | Takahashi | Oct 2003 | B2 |
6697709 | Kuroki et al. | Feb 2004 | B2 |
6802382 | Hattori et al. | Oct 2004 | B2 |
6832132 | Ishida | Dec 2004 | B2 |
6943520 | Furuta | Sep 2005 | B2 |
6992455 | Kato et al. | Jan 2006 | B2 |
6992457 | Furuta | Jan 2006 | B2 |
6999851 | Kato | Feb 2006 | B2 |
7013201 | Hattori | Mar 2006 | B2 |
7076331 | Nagatsuka | Jul 2006 | B1 |
7096983 | Hirai | Aug 2006 | B2 |
7120518 | Takenaka | Oct 2006 | B2 |
7127326 | Lewis | Oct 2006 | B2 |
7236852 | Moridaira et al. | Jun 2007 | B2 |
7272474 | Stentz | Sep 2007 | B1 |
7278501 | Mori | Oct 2007 | B2 |
7386364 | Mikami | Jun 2008 | B2 |
7418312 | Hidai | Aug 2008 | B2 |
7603234 | Takenaka et al. | Oct 2009 | B2 |
7606634 | Takenaka et al. | Oct 2009 | B2 |
7657345 | Endo et al. | Feb 2010 | B2 |
7734377 | Hasegawa | Jun 2010 | B2 |
7734378 | Takenaka et al. | Jun 2010 | B2 |
7756607 | Ikeuchi | Jul 2010 | B2 |
7840308 | Matsunaga | Nov 2010 | B2 |
7865267 | Sabe | Jan 2011 | B2 |
7873436 | Takenaka | Jan 2011 | B2 |
7881824 | Nagasaka et al. | Feb 2011 | B2 |
7949430 | Pratt | May 2011 | B2 |
7957835 | Suga | Jun 2011 | B2 |
7964364 | Refseth | Jun 2011 | B2 |
8020649 | Ogawa | Sep 2011 | B2 |
8060253 | Goswami | Nov 2011 | B2 |
8108070 | Tajima | Jan 2012 | B2 |
8172013 | Shimada | May 2012 | B2 |
8195332 | Pratt | Jun 2012 | B2 |
8237390 | Godler | Aug 2012 | B2 |
8239084 | Yamamoto | Aug 2012 | B2 |
8306657 | Yoshiike et al. | Nov 2012 | B2 |
8311731 | Sugiura | Nov 2012 | B2 |
8332068 | Goswami et al. | Dec 2012 | B2 |
8355818 | Nielsen | Jan 2013 | B2 |
8386076 | Honda | Feb 2013 | B2 |
8396593 | Orita | Mar 2013 | B2 |
8401725 | Matsunaga | Mar 2013 | B2 |
8457830 | Goulding | Jun 2013 | B2 |
8532824 | Orita | Sep 2013 | B2 |
8554366 | Kajima | Oct 2013 | B2 |
8565921 | Doi | Oct 2013 | B2 |
8583283 | Takenaka et al. | Nov 2013 | B2 |
8630763 | Goulding | Jan 2014 | B2 |
8644987 | Kwon | Feb 2014 | B2 |
8676381 | Kwon | Mar 2014 | B2 |
8688307 | Sekiya | Apr 2014 | B2 |
8738178 | Choi | May 2014 | B2 |
8781628 | Kwak | Jul 2014 | B2 |
8798965 | Quan | Aug 2014 | B2 |
8805582 | Zaier | Aug 2014 | B2 |
8825391 | Urmson | Sep 2014 | B1 |
8849454 | Yun et al. | Sep 2014 | B2 |
8855820 | Watabe | Oct 2014 | B2 |
8855821 | Seo et al. | Oct 2014 | B2 |
8918213 | Rosenstein | Dec 2014 | B2 |
8924021 | Dariush | Dec 2014 | B2 |
8935005 | Rosenstein | Jan 2015 | B2 |
8942848 | Pratt | Jan 2015 | B2 |
8948956 | Takahashi | Feb 2015 | B2 |
8965573 | Maisonnier | Feb 2015 | B2 |
9037396 | Pack | May 2015 | B2 |
9044862 | Kim | Jun 2015 | B2 |
9102055 | Konolige | Aug 2015 | B1 |
9197862 | Asatani | Nov 2015 | B2 |
9207678 | Kim | Dec 2015 | B2 |
9266233 | Kornbluh | Feb 2016 | B2 |
9317743 | Datta | Apr 2016 | B2 |
9329598 | Pack et al. | May 2016 | B2 |
9352470 | da Silva | May 2016 | B1 |
9387588 | Blankespoor | Jul 2016 | B1 |
9387896 | Blankespoor | Jul 2016 | B1 |
9434430 | Moridaira | Sep 2016 | B2 |
9446518 | Blankespoor | Sep 2016 | B1 |
9499218 | Stephens | Nov 2016 | B1 |
9499219 | Jackowski | Nov 2016 | B1 |
9561592 | da Silva | Feb 2017 | B1 |
9594377 | Perkins | Mar 2017 | B1 |
9618937 | Blankespoor | Apr 2017 | B1 |
9789607 | Whitman | Oct 2017 | B1 |
9789919 | Blankespoor | Oct 2017 | B1 |
9833899 | Blankespoor | Dec 2017 | B1 |
9878751 | Thorne | Jan 2018 | B1 |
9895804 | Perkins | Feb 2018 | B1 |
9931753 | Rizzi | Apr 2018 | B1 |
9969086 | Whitman | May 2018 | B1 |
10017218 | Swilling | Jul 2018 | B1 |
10031524 | Su | Jul 2018 | B2 |
10081104 | Swilling | Sep 2018 | B1 |
10232508 | Lafaye | Mar 2019 | B2 |
10239208 | Swilling | Mar 2019 | B1 |
10518409 | Oleynik | Dec 2019 | B2 |
10528051 | Perkins | Jan 2020 | B1 |
10966897 | Bankowski | Apr 2021 | B2 |
11188081 | Perkins | Nov 2021 | B2 |
11287826 | Whitman | Mar 2022 | B2 |
11325260 | Yeo | May 2022 | B2 |
11416003 | Whitman | Aug 2022 | B2 |
11726481 | Perkins | Aug 2023 | B2 |
20020183897 | Kuroki | Dec 2002 | A1 |
20030009259 | Hattori | Jan 2003 | A1 |
20030154201 | Berestov | Aug 2003 | A1 |
20040044440 | Takenaka | Mar 2004 | A1 |
20040063382 | Randall | Apr 2004 | A1 |
20040099450 | Kwok | May 2004 | A1 |
20040138780 | Lewis | Jul 2004 | A1 |
20040167641 | Kawai | Aug 2004 | A1 |
20040172165 | Iribe | Sep 2004 | A1 |
20040193323 | Higaki | Sep 2004 | A1 |
20040205417 | Moridaira | Oct 2004 | A1 |
20040230340 | Fukuchi | Nov 2004 | A1 |
20040236467 | Sano | Nov 2004 | A1 |
20050021176 | Takenaka | Jan 2005 | A1 |
20050065650 | Lewis | Mar 2005 | A1 |
20050067993 | Kato | Mar 2005 | A1 |
20050075755 | Takenaka | Apr 2005 | A1 |
20050077856 | Takenaka | Apr 2005 | A1 |
20050110448 | Takenaka | May 2005 | A1 |
20050113973 | Endo | May 2005 | A1 |
20050120820 | Takenaka | Jun 2005 | A1 |
20050216097 | Rifkin | Sep 2005 | A1 |
20050228539 | Takenaka | Oct 2005 | A1 |
20050240307 | Kuroki | Oct 2005 | A1 |
20050267630 | Kajita | Dec 2005 | A1 |
20050283043 | Sisk | Dec 2005 | A1 |
20060025888 | Gutmann | Feb 2006 | A1 |
20060064203 | Goto | Mar 2006 | A1 |
20060076167 | Setrakian | Apr 2006 | A1 |
20060149465 | Park | Jul 2006 | A1 |
20060155436 | Matsunaga | Jul 2006 | A1 |
20060173578 | Takenaka | Aug 2006 | A1 |
20060247800 | Takenaka | Nov 2006 | A1 |
20070003915 | Templeman | Jan 2007 | A1 |
20070021870 | Nagasaka | Jan 2007 | A1 |
20070050047 | Ragnarsdottir | Mar 2007 | A1 |
20070126387 | Takenaka | Jun 2007 | A1 |
20070150095 | Zaier | Jun 2007 | A1 |
20070156283 | Takenaka | Jul 2007 | A1 |
20070193789 | Takenaka | Aug 2007 | A1 |
20070220637 | Endo et al. | Sep 2007 | A1 |
20070227786 | Hillis et al. | Oct 2007 | A1 |
20070241713 | Yamamoto | Oct 2007 | A1 |
20080065269 | Hasegawa | Mar 2008 | A1 |
20080133055 | Hasegawa | Jun 2008 | A1 |
20080160873 | Yoneda | Jul 2008 | A1 |
20080208391 | Hasegawa | Aug 2008 | A1 |
20090005906 | Tajima | Jan 2009 | A1 |
20090030530 | Martin | Jan 2009 | A1 |
20090171503 | Takenaka | Jul 2009 | A1 |
20090271037 | Hong | Oct 2009 | A1 |
20090306821 | Park | Dec 2009 | A1 |
20090312867 | Hasegawa | Dec 2009 | A1 |
20090325699 | Delgiannidis | Dec 2009 | A1 |
20100017028 | Suga et al. | Jan 2010 | A1 |
20100057253 | Kwon | Mar 2010 | A1 |
20100113980 | Herr | May 2010 | A1 |
20100126785 | Shimada | May 2010 | A1 |
20100161120 | Goswami | Jun 2010 | A1 |
20100174409 | Park | Jul 2010 | A1 |
20100252395 | Lehtonen | Oct 2010 | A1 |
20100274431 | Matsunaga | Oct 2010 | A1 |
20100277483 | Lee | Nov 2010 | A1 |
20100292838 | Goswami | Nov 2010 | A1 |
20110009241 | Lane | Jan 2011 | A1 |
20110022232 | Yoshiike | Jan 2011 | A1 |
20110054689 | Nielsen | Mar 2011 | A1 |
20110098856 | Yoshiike | Apr 2011 | A1 |
20110098857 | Yoshiike | Apr 2011 | A1 |
20110098860 | Yoshiike et al. | Apr 2011 | A1 |
20110172825 | Lee | Jul 2011 | A1 |
20110178637 | Lee | Jul 2011 | A1 |
20110224827 | Andoh | Sep 2011 | A1 |
20110231050 | Goulding | Sep 2011 | A1 |
20110257764 | Herr | Oct 2011 | A1 |
20110264264 | Shirokura | Oct 2011 | A1 |
20110288682 | Pinter | Nov 2011 | A1 |
20110288684 | Farlow | Nov 2011 | A1 |
20110301756 | Yoshiike et al. | Dec 2011 | A1 |
20120072026 | Takagi | Mar 2012 | A1 |
20120158175 | Lee | Jun 2012 | A1 |
20120185095 | Rosenstein | Jul 2012 | A1 |
20120185096 | Rosenstein | Jul 2012 | A1 |
20120197435 | Maisonnier | Aug 2012 | A1 |
20120203359 | Schimmels | Aug 2012 | A1 |
20120245734 | Yun | Sep 2012 | A1 |
20120259463 | Orita | Oct 2012 | A1 |
20120277907 | Kim | Nov 2012 | A1 |
20120303271 | Chowdhary | Nov 2012 | A1 |
20120310412 | Seo | Dec 2012 | A1 |
20120316682 | Seo | Dec 2012 | A1 |
20120316683 | Seo | Dec 2012 | A1 |
20120316684 | Lee | Dec 2012 | A1 |
20130079929 | Lim | Mar 2013 | A1 |
20130144439 | Lee | Jun 2013 | A1 |
20130178983 | Watabe | Jul 2013 | A1 |
20130184861 | Pratt | Jul 2013 | A1 |
20130206488 | Horinouchi | Aug 2013 | A1 |
20130231822 | Gouaillier | Sep 2013 | A1 |
20130238122 | Hodgins | Sep 2013 | A1 |
20130238183 | Goulding | Sep 2013 | A1 |
20140019082 | Lan | Jan 2014 | A1 |
20140200713 | Allen | Jul 2014 | A1 |
20150049910 | Ptucha et al. | Feb 2015 | A1 |
20150051734 | Zheng | Feb 2015 | A1 |
20150073592 | Kaneko | Mar 2015 | A1 |
20150073598 | Rosenstein | Mar 2015 | A1 |
20150120044 | Cory | Apr 2015 | A1 |
20150134079 | Yoon | May 2015 | A1 |
20150134080 | Roh | May 2015 | A1 |
20150202768 | Moridaira | Jul 2015 | A1 |
20150321342 | Smith | Nov 2015 | A1 |
20160059412 | Oleynik | Mar 2016 | A1 |
20160297072 | Williams | Oct 2016 | A1 |
20180004208 | Su | Jan 2018 | A1 |
20180120116 | Rombouts | May 2018 | A1 |
20180162469 | Blankespoor | Jun 2018 | A1 |
20190255701 | Blankespoor | Aug 2019 | A1 |
20190258274 | Perkins | Aug 2019 | A1 |
20190258275 | Saunders | Aug 2019 | A1 |
20200030971 | Oleynik | Jan 2020 | A1 |
20200117198 | Whitman | Apr 2020 | A1 |
20200241534 | Perkins | Jul 2020 | A1 |
20210147016 | Stephens | May 2021 | A1 |
20210171135 | Blankespoor | Jun 2021 | A1 |
20220055228 | Jackowski | Feb 2022 | A1 |
20220057800 | Perkins | Feb 2022 | A1 |
20220179420 | Whitman | Jun 2022 | A1 |
20220305648 | Oleynik | Sep 2022 | A1 |
20230347524 | Nelson | Nov 2023 | A1 |
Number | Date | Country |
---|---|---|
100815247 | Mar 2008 | KR |
Entry |
---|
Abe et al., “Multiobjective Control with Frictional Contacts,” Eurographics/ACM SIGGRAPH Symposium on Computer Animation, Aug. 4-5, 2007, San Diego, California, 10 pages. |
Bajracharya, et al., “High fidelity day/night stereo mapping with vegetation and negative obstacle detection for Vision-in-the-loop walking,” 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Nov. 2013, pp. 3663-3670, IEEE, Tokyo, Japan. |
Doshi et al., “Collision Detection in Legged Locomotion using Supervised Learning,” 2007, 6 pages, MIT, Cambridge, US. |
Hashlamon et al., “Simple Virtual Slip Force Sensor for Walking Biped Robots,” IEEE, 2013, pp. 1-5. |
Kim et al., “Landing Force Controller for a Humanoid Robot: Time-Domain Passivity Approach,” 2006 IEEE Conference on Systems, Man, and Cybernetics, Oct. 8-11, 2006, Taipei, Taiwan, pp. 4237-4242. |
Koolen et al., “Capturability-Based Analysis and Control of Legged Locomotion, Part 1: Theory and Application to Three Simple Gait Models,” The International Journal of Robotics Research, 2012, pp. 1094-1113, vol. 31, No. 9. |
Pratt et al., “Capturability-Based Analysis and Control of Legged Locomotion, Part 2: Application to M2V2, a Lower Body Humanoid,” The International Journal of Robotics Research, Apr. 2011, pp. 1-25. |
Pratt et al., “Capture Point: A Step Toward Humanoid Push Recovery,” 2006 6th IEEE-RAS International Conference on Humanoid Robots, Dec. 2-6, 2006, pp. 1-8, Genoa, Italy. |
Silva et al., “Goal-Oriented Biped Walking Based on Force Interaction Control,” Proceedings of the 2001 IEEE International Conference on Robotics & Automation, Seoul, Korea, May 21-26, 2001, pp. 4122-4127. |
Silva et al., “Towards Force Interaction Control of Biped Walking Robots,” Proceedings of 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems, Sendai, Japan, Sep. 28-Oct. 2, 2004, pp. 2568-2573. |
Complaint in Boston Dynamics, Inc. v. Ghost Robotics Corporation, Case No. 1:24-cv-00184-UNA, filed Feb. 12, 2024, in 45 pages (involving U.S. Pat. Nos. 9,594,377, 9,908,240, 9,789,611, and 11,287,819). |
Boston Dynamics, “LS3—Legged Squad Support System,” video screen shots taken from https://www.youtube.com/watch?v=R7ezXBEBE6U&t=1s, Sep. 10, 2012, downloaded Feb. 2, 2024, 5 pages. |
Stanford, “BigDog, the Rough-Terrain Robot,” video screen shots taken from https://www.youtube.com/watch?v=-Bi-tPO0OPs, Aug. 27, 2010, downloaded Apr. 26, 2024, 12 pages. |
“Boston Dynamics BIGDOG Robot,” video screen shots taken from https://www.youtube.com/watch?v=b2bExqhhWRI, Jul. 17, 2007, downloaded Oct. 5, 2023, 13 pages. |
“AlphaDog Proto,” video screen shots taken from https://www.youtube.com/watch?v=SSbZrQp-HOk, Sep. 29, 2011, downloaded Aug. 14, 2023, 24 pages. |
“Atlas Update,” video screen shots taken from https://www.youtube.com/watch?v=SD6Okylclb8, Oct. 3, 2013, downloaded Aug. 14, 2023, 18 pages. |
“BigDog Overview (Updated Mar. 2010),” video screen shots taken from https://www.youtube.com/watch?v=cNZPRsrwumQ, Apr. 22, 2010, downloaded Aug. 14, 2023, 25 pages. |
“BigDog Reflexes,” video screen shots taken from https://www.youtube.com/watch?v=3gi6Ohnp9x8, Jan. 27, 2009, downloaded Aug. 14, 2023, 15 pages. |
“Petman,” video screen shots taken from https://www.youtube.com/watch?v=mclbVTIYG8E, Oct. 30, 2011, downloaded Aug. 14, 2023, 12 pages. |
“Petman Prototype,” video screen shots taken from https://www.youtube.com/watch?v=67CUudkjEG4, Oct. 26, 2009, downloaded Aug. 14, 2023, 10 pages. |
Boston Dynamics, “Introducing Spot Classic (previously Spot),” video screen shots taken from https://www.youtube.com/watch?v=M8YjvHYbZ9w, Feb. 9, 2015, downloaded Aug. 10, 2023, 14 pages. |
Boston Dynamics, “Introducing Spot (Previously SpotMini),” video screen shots taken from https://www.youtube.com/watch?v=tf7IEVTDjng, Jun. 23, 2016, downloaded Jul. 31, 2023, 10 pages. |
Boston Dynamics, “Spot Autonomous Navigation,” video screen shots taken from https://www.youtube.com/watch?v=Ve9kWX_KXus, May 10, 2018, downloaded Sep. 5, 2023, 11 pages. |
Boston Dynamics, “SpotMini”, The Wayback Machine, http://web.archive.org/web/20171118145237/https://bostondynamics.com/spot-mini, downloaded Jul. 31, 2023, 3 pages. |
Boston Dynamics, “Testing Robustness,” video screen shots taken from https://www.youtube.com/watch?v=aFuA50H9uek, Feb. 20, 2018, downloaded Jul. 31, 2023, 3 pages. |
Boston Dynamics, “The New Spot,” video screen shots taken from https://www.youtube.com/watch?v=kgaO45SyaO4, Nov. 13, 2017, downloaded Jul. 31, 2023, 3 pages. |
“LS3 Follow Tight,” video screen shots taken from https://www.youtube.com/watch?v=hNUeSUXOc-w, published Dec. 19, 2012, 10 pages. |
Number | Date | Country | |
---|---|---|
20230333559 A1 | Oct 2023 | US |
 | Number | Date | Country |
---|---|---|---|
Parent | 17453270 | Nov 2021 | US |
Child | 18341388 | | US |
Parent | 16703261 | Dec 2019 | US |
Child | 17453270 | | US |
Parent | 15416361 | Jan 2017 | US |
Child | 16703261 | | US |
Parent | 14709830 | May 2015 | US |
Child | 15416361 | | US |