This disclosure relates to techniques for automated ceiling detection by a robotic device.
A robot is generally defined as a reprogrammable and multifunctional manipulator designed to move material, parts, tools, and/or specialized devices (e.g., via variable programmed motions) for performing tasks. Robots may include manipulators that are physically anchored (e.g., industrial robotic arms), mobile devices that move throughout an environment (e.g., using legs, wheels, or traction-based mechanisms), or some combination of one or more manipulators and one or more mobile devices. Robots are currently used in a variety of industries, including, for example, manufacturing, warehouse logistics, transportation, hazardous environments, exploration, and healthcare.
Mobile robots may be tasked with operating inside containers that have ceilings, such as truck cargo compartments or shipping containers. To avoid damage to the robot and/or the container, the robot should be aware of the location and/or geometry of the ceiling of the container within which it is operating. Some embodiments of the present disclosure relate to automated techniques for estimating one or more characteristics of the ceiling of the container (e.g., height, geometry, etc.), such that the robot can be controlled to operate safely within the container.
In one aspect, the invention features a method of estimating a ceiling location of a container within which a mobile robot is configured to operate. The method includes sensing distance measurement data associated with the ceiling of the container using one or more distance sensors arranged on an end effector of a mobile robot, and determining a ceiling estimate of the container based on the distance measurement data.
In some embodiments, the one or more distance sensors include a first distance sensor arranged on a first side of the end effector, the first distance sensor being configured to sense a first distance in a first direction. The method further comprises orienting, prior to sensing the distance measurement data, the first distance sensor such that the first direction is toward the ceiling of the container. In some embodiments, the one or more distance sensors include a second distance sensor arranged on a second side of the end effector, the second distance sensor being configured to sense a second distance in a second direction, the second direction being opposite the first direction, and sensing distance measurement data comprises sensing distance measurement data including the first distance and the second distance.
In some embodiments, sensing distance measurement data comprises controlling an arm of the mobile robot to move the arm through a scan trajectory, and sensing the distance measurement data as the arm is moved through the scan trajectory. In some embodiments, controlling an arm of the mobile robot to move the arm through a scan trajectory comprises moving the arm through a scan trajectory that includes a first direction and a second direction at an angle to the first direction. In some embodiments, the scan trajectory includes a first segment along the first direction, a second segment along the first direction, and a third segment along the second direction, the third segment connecting the first and second segments. In some embodiments, sensing distance measurement data is performed while a base of the mobile robot is outside of the container.
In some embodiments, determining a ceiling estimate based on the distance measurement data comprises fitting a plane based on at least one datum in the distance measurement data, and determining the ceiling estimate based on the plane. In some embodiments, determining a ceiling estimate based on the distance measurement data further comprises determining the at least one datum as a datum having a minimum distance in the distance measurement data. In some embodiments, determining a ceiling estimate based on the distance measurement data further comprises filtering the distance measurement data to generate filtered distance measurement data, and determining the ceiling estimate based on the filtered distance measurement data. In some embodiments, filtering the distance measurement data comprises sorting the distance measurement data in order of distance to generate sorted data, and excluding from the filtered distance measurement data, a threshold amount of the sorted data with the smallest distances.
In some embodiments, fitting a plane comprises fitting a flat plane to the at least one datum. In some embodiments, the flat plane is parallel to a floor of the container. In some embodiments, the flat plane is sloped relative to a floor of the container. In some embodiments, fitting a plane comprises fitting a curved plane using at least two pieces of data of the distance measurement data.
In some embodiments, determining a ceiling estimate based on the distance measurement data further comprises assigning a shape primitive to the plane, and controlling at least one operation of the mobile robot based, at least in part, on the shape primitive.
In some embodiments, the method further includes controlling at least one operation of the mobile robot based on the ceiling estimate of the container. In some embodiments, controlling at least one operation of the mobile robot based on the ceiling estimate of the container comprises determining a trajectory of an arm of the mobile robot based, at least in part, on the ceiling estimate, and moving the arm in accordance with the trajectory. In some embodiments, controlling at least one operation of the mobile robot based on the ceiling estimate of the container comprises determining a grasp strategy for grasping an object in the container based, at least in part, on the ceiling estimate, and grasping the object using the grasp strategy. In some embodiments, controlling at least one operation of the mobile robot based on the ceiling estimate of the container comprises controlling the mobile robot to move at least one component of the mobile robot without colliding with the ceiling of the container.
In some embodiments, the container comprises a truck cargo compartment. In some embodiments, the container comprises a shipping container. In some embodiments, the end effector comprises a suction-based gripper. In some embodiments, the one or more distance sensors include one or more time-of-flight point sensors.
In one aspect, the invention features a mobile robot. The mobile robot includes a mobile base, an arm coupled to the mobile base, a gripper coupled to the arm, wherein the gripper includes one or more distance sensors arranged thereon, and a controller. The controller is configured to determine a ceiling estimate of a container based on distance measurement data sensed by the one or more distance sensors.
In some embodiments, the one or more distance sensors include a first distance sensor arranged on a first side of the gripper, the first distance sensor being configured to sense a first distance in a first direction, and the controller is further configured to orient, prior to sensing the distance measurement data, the first distance sensor such that the first direction is toward the ceiling of the container. In some embodiments, the one or more distance sensors include a second distance sensor arranged on a second side of the gripper, the second distance sensor being configured to sense a second distance in a second direction, the second direction being opposite the first direction, and the distance measurement data includes distance measurement data sensed by the first distance sensor and the second distance sensor.
In some embodiments, the controller is further configured to control the arm of the mobile robot to move the arm through a scan trajectory when the first distance sensor is oriented toward the ceiling of the container. In some embodiments, controlling the arm of the mobile robot to move the arm through a scan trajectory comprises controlling the arm to move through a scan trajectory that includes a first direction and a second direction at an angle to the first direction. In some embodiments, the scan trajectory includes a first segment along the first direction, a second segment along the first direction, and a third segment along the second direction, the third segment connecting the first and second segments. In some embodiments, the distance measurement data is sensed while the mobile base of the mobile robot is outside of the container.
In some embodiments, determining a ceiling estimate of the container based on the distance measurement data comprises fitting a plane based on at least one datum in the distance measurement data, and determining the ceiling estimate of the container based on the plane. In some embodiments, determining a ceiling estimate of the container based on the distance measurement data further comprises determining the at least one datum as a datum having a minimum distance in the distance measurement data. In some embodiments, determining a ceiling estimate of the container based on the distance measurement data further comprises filtering the distance measurement data to generate filtered distance measurement data, and determining the ceiling estimate of the container based on the filtered distance measurement data. In some embodiments, filtering the distance measurement data comprises sorting the distance measurement data in order of distance to generate sorted data, and excluding from the filtered distance measurement data, a threshold amount of the sorted data with the smallest distances.
In some embodiments, fitting a plane comprises fitting a flat plane to the at least one datum. In some embodiments, the flat plane is parallel to a floor of the container. In some embodiments, the flat plane is sloped relative to a floor of the container. In some embodiments, fitting a plane comprises fitting a curved plane using at least two pieces of data of the distance measurement data.
In some embodiments, determining a ceiling estimate of the container based on the distance measurement data further comprises assigning a shape primitive to the plane, and controlling an operation of the mobile robot based, at least in part, on the shape primitive.
In some embodiments, the controller is further configured to control an operation of the mobile robot based, at least in part, on the ceiling estimate. In some embodiments, controlling an operation of the mobile robot based on the ceiling estimate comprises determining a trajectory of the arm of the mobile robot based, at least in part, on the ceiling estimate, and moving the arm in accordance with the trajectory. In some embodiments, controlling an operation of the mobile robot based on the ceiling estimate comprises determining a grasp strategy for grasping an object in the container based, at least in part, on the ceiling estimate, and grasping the object using the grasp strategy. In some embodiments, controlling an operation of the mobile robot based on the ceiling estimate comprises controlling the mobile robot to move at least one component of the mobile robot without colliding with the ceiling of the container.
In some embodiments, the container comprises a truck cargo compartment. In some embodiments, the container comprises a shipping container. In some embodiments, the gripper comprises a suction-based gripper. In some embodiments, the one or more distance sensors include one or more time-of-flight point sensors.
In one aspect, the invention features a component of a mobile robot. The component includes an arm configured to couple to a base of the mobile robot, and an end effector coupled to the arm, the end effector including one or more distance sensors. The arm is configured to move through a scan trajectory. The end effector is configured to orient a first distance sensor of the one or more distance sensors toward a ceiling of a container. The first distance sensor is configured to capture first distance measurements as the arm is moved through the scan trajectory. A ceiling height of the container is determined based, at least in part, on the first distance measurements.
In some embodiments, the first distance sensor is arranged on a first side of the end effector. In some embodiments, the one or more distance sensors include a second distance sensor arranged on a second side of the end effector opposite the first side, the second distance sensor being configured to sense second distance measurements as the arm is moved through the scan trajectory, and the ceiling height of the container is determined based, at least in part, on the first distance measurements and the second distance measurements.
In some embodiments, the scan trajectory includes a first direction and a second direction at an angle to the first direction. In some embodiments, the scan trajectory includes a first segment along the first direction, a second segment along the first direction, and a third segment along the second direction, the third segment connecting the first and second segments. In some embodiments, the first distance sensor is configured to capture the first distance measurements while the base of the mobile robot is outside of the container.
In some embodiments, the container comprises a truck cargo compartment. In some embodiments, the container comprises a shipping container. In some embodiments, the end effector is a gripper. In some embodiments, the gripper comprises a suction-based gripper. In some embodiments, the one or more distance sensors include one or more time-of-flight point sensors.
The advantages of the invention, together with further advantages, may be better understood by referring to the following description taken in conjunction with the accompanying drawings. The drawings are not necessarily to scale, and emphasis is instead generally placed upon illustrating the principles of the invention.
To operate safely in an environment, a mobile manipulator robot should be aware of characteristics of the environment within which it is operating and plan its movements appropriately to prevent damage to the robot or objects in the environment. One such environment in which mobile robots may operate is a container, such as a truck cargo compartment, that includes walls, a floor, and a ceiling. To ensure that the robot does not collide with the walls or the ceiling during operation (e.g., a “pick and place” operation described below), the robot may position itself inside the container and plan its movements appropriately by selecting trajectories that do not contact these surfaces. Determining a reliable estimate of the position of the walls and ceiling of the container relative to the robot is important to ensure that the movements of the robot are constrained sufficiently to avoid damage to the robot and/or the container, but not so constrained as to unnecessarily limit the workspace and/or operation of the robot within the container, which may slow down operation of the robot.
The location of the walls of the container can typically be measured with reasonable accuracy, after the robot has entered the container, using distance sensors (e.g., scanning LIDAR sensors) mounted on the base of the mobile robot. Such distance sensors may also be used, among other things, for robot navigation and obstacle avoidance. The inventors have recognized, however, that determining a reasonable estimate of the location and/or shape of the ceiling of a container is considerably more challenging. For instance, ceilings in containers are often reflective, which may limit the ability of color (e.g., RGB) cameras that are typically used for object detection/recognition to also accurately detect the ceiling. Due to the challenge of accurately measuring characteristics of the ceiling using existing sensors on the robot, the ceiling height of a container may be measured manually by a human prior to the robot entering the container, and the measured ceiling height may be entered by the human operator into a user interface to inform a controller of the robot how to plan and execute its movements accordingly. Manual measurement of the ceiling height by a human operator is time-consuming, error-prone, and does not account for variations in the ceiling geometry unless multiple measurements are made at different locations within the container. Some embodiments of the present disclosure relate to an automated technique for detecting the ceiling of a container using one or more distance sensors mounted on an end effector of a mobile robot.
Robots can be configured to perform a number of tasks in an environment in which they are placed. Exemplary tasks may include interacting with objects and/or elements of the environment. Notably, robots are becoming popular in warehouse and logistics operations. Before robots were introduced to such spaces, many operations were performed manually. For example, a person might manually unload boxes from a truck onto one end of a conveyor belt, and a second person at the opposite end of the conveyor belt might organize those boxes onto a pallet. The pallet might then be picked up by a forklift operated by a third person, who might drive to a storage area of the warehouse and drop the pallet for a fourth person to remove the individual boxes from the pallet and place them on shelves in a storage area. Some robotic solutions have been developed to automate many of these functions. Such robots may either be specialist robots (i.e., designed to perform a single task or a small number of related tasks) or generalist robots (i.e., designed to perform a wide variety of tasks). To date, both specialist and generalist warehouse robots have been associated with significant limitations.
For example, because a specialist robot is designed to perform a single task (e.g., unloading boxes from a truck onto a conveyor belt), it may be efficient at performing its designated task but unable to perform other, related tasks. As a result, either a person or a separate robot (e.g., another specialist robot designed for a different task) may be needed to perform the next task(s) in the sequence. As such, a warehouse may need to invest in multiple specialized robots to perform a sequence of tasks, or may need to rely on a hybrid operation in which there are frequent robot-to-human or human-to-robot handoffs of objects.
In contrast, while a generalist robot may be designed to perform a wide variety of tasks (e.g., unloading, palletizing, transporting, depalletizing, and/or storing), such generalist robots may be unable to perform individual tasks with high enough efficiency or accuracy to warrant introduction into a highly streamlined warehouse operation. For example, while mounting an off-the-shelf robotic manipulator onto an off-the-shelf mobile robot might yield a system that could, in theory, accomplish many warehouse tasks, such a loosely integrated system may be incapable of performing complex or dynamic motions that require coordination between the manipulator and the mobile base, resulting in a combined system that is inefficient and inflexible.
Typical operation of such a system within a warehouse environment may include the mobile base and the manipulator operating sequentially and (partially or entirely) independently of each other. For example, the mobile base may first drive toward a stack of boxes with the manipulator powered down. Upon reaching the stack of boxes, the mobile base may come to a stop, and the manipulator may power up and begin manipulating the boxes as the base remains stationary. After the manipulation task is completed, the manipulator may again power down, and the mobile base may drive to another destination to perform the next task.
In such systems, the mobile base and the manipulator may be regarded as effectively two separate robots that have been joined together. Accordingly, a controller associated with the manipulator may not be configured to share information with, pass commands to, or receive commands from a separate controller associated with the mobile base. As such, a poorly integrated mobile manipulator robot may be forced to operate both its manipulator and its base at suboptimal speeds or through suboptimal trajectories, as the two separate controllers struggle to work together. Additionally, while certain limitations arise from an engineering perspective, additional limitations must be imposed to comply with safety regulations. For example, if a safety regulation requires that a mobile manipulator must be able to be completely shut down within a certain period of time when a human enters a region within a certain distance of the robot, a loosely integrated mobile manipulator robot may not be able to act sufficiently quickly to ensure that both the manipulator and the mobile base (individually and in aggregate) do not threaten the human. To operate within required safety constraints, such loosely integrated systems are therefore forced to adopt even slower speeds or even more conservative trajectories than the engineering limitations alone would impose. As such, the speed and efficiency of generalist robots performing tasks in warehouse environments to date have been limited.
In view of the above, a highly integrated mobile manipulator robot with system-level mechanical design and holistic control strategies between the manipulator and the mobile base may provide certain benefits in warehouse and/or logistics operations. Such an integrated mobile manipulator robot may be able to perform complex and/or dynamic motions that are unable to be achieved by conventional, loosely integrated mobile manipulator systems. As a result, this type of robot may be well suited to perform a variety of different tasks (e.g., within a warehouse environment) with speed, agility, and efficiency.
In this section, an overview of some components of one embodiment of a highly integrated mobile manipulator robot configured to perform a variety of tasks is provided to explain the interactions and interdependencies of various subsystems of the robot. Each of the various subsystems, as well as the control strategies for operating the subsystems, is described in further detail in the following sections.
During operation, the perception mast of robot 20a (analogous to the perception mast 140 of robot 100 of
Also of note in
To pick some boxes within a constrained environment, the robot may need to carefully adjust the orientation of its arm to avoid contacting other boxes or the surrounding shelving. For example, in a typical “keyhole problem”, the robot may only be able to access a target box by navigating its arm through a small space or confined area (akin to a keyhole) defined by other boxes or the surrounding shelving. In such scenarios, coordination between the mobile base and the arm of the robot may be beneficial. For instance, being able to translate the base in any direction allows the robot to position itself as close as possible to the shelving, effectively extending the length of its arm (compared to conventional robots without omnidirectional drive which may be unable to navigate arbitrarily close to the shelving). Additionally, being able to translate the base backwards allows the robot to withdraw its arm from the shelving after picking the box without having to adjust joint angles (or minimizing the degree to which joint angles are adjusted), thereby enabling a simple solution to many keyhole problems.
The tasks depicted in
The robotic arm 430 of
Starting at the turntable 420, the robotic arm 430 includes a turntable offset 422, which is fixed relative to the turntable 420. A distal portion of the turntable offset 422 is rotatably coupled to a proximal portion of a first link 433 at a first joint 432. A distal portion of the first link 433 is rotatably coupled to a proximal portion of a second link 435 at a second joint 434. A distal portion of the second link 435 is rotatably coupled to a proximal portion of a third link 437 at a third joint 436. The first, second, and third joints 432, 434, and 436 are associated with first, second, and third axes 432a, 434a, and 436a, respectively.
The first, second, and third joints 432, 434, and 436 are additionally associated with first, second, and third actuators (not labeled) which are configured to rotate a link about an axis. Generally, the nth actuator is configured to rotate the nth link about the nth axis associated with the nth joint. Specifically, the first actuator is configured to rotate the first link 433 about the first axis 432a associated with the first joint 432, the second actuator is configured to rotate the second link 435 about the second axis 434a associated with the second joint 434, and the third actuator is configured to rotate the third link 437 about the third axis 436a associated with the third joint 436. In the embodiment shown in
In some embodiments, a robotic arm of a highly integrated mobile manipulator robot may include a different number of degrees of freedom than the robotic arms discussed above. Additionally, a robotic arm need not be limited to a robotic arm with three pitch joints and a 3-DOF wrist. A robotic arm of a highly integrated mobile manipulator robot may include any suitable number of joints of any suitable type, whether revolute or prismatic. Revolute joints need not be oriented as pitch joints, but rather may be pitch, roll, yaw, or any other suitable type of joint.
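As a rough illustration of how a chain of pitch joints maps joint angles to an end-effector position, the following Python sketch computes planar forward kinematics for revolute joints whose axes are parallel. This is an illustrative simplification only: the function name is hypothetical, and the sketch ignores the turntable, link offsets, and the 3-DOF wrist described above.

```python
from math import cos, sin

def planar_fk(link_lengths, joint_angles):
    """Forward kinematics for a serial chain of pitch joints moving in a
    vertical plane. Joint angles accumulate along the chain: each joint
    rotates its link relative to the preceding link, so the orientation of
    link n is the sum of the first n joint angles. Returns the (x, z)
    position of the distal end of the chain."""
    x = z = theta = 0.0
    for length, angle in zip(link_lengths, joint_angles):
        theta += angle          # accumulated orientation of this link
        x += length * cos(theta)
        z += length * sin(theta)
    return x, z
```

With all joint angles at zero, the chain is fully extended along the x axis, so three unit-length links reach x = 3.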
Returning to
In some embodiments, an end effector may be associated with one or more sensors. For example, a force/torque sensor may measure forces and/or torques (e.g., wrenches) applied to the end effector. Alternatively or additionally, a sensor may measure wrenches applied to a wrist of the robotic arm by the end effector (and, for example, an object grasped by the end effector) as the object is manipulated. Signals from these (or other) sensors may be used during mass estimation and/or path planning operations. In some embodiments, sensors associated with an end effector may include an integrated force/torque sensor, such as a 6-axis force/torque sensor. In some embodiments, separate sensors (e.g., separate force and torque sensors) may be employed. Some embodiments may include only force sensors (e.g., uniaxial force sensors, or multi-axis force sensors), and some embodiments may include only torque sensors. In some embodiments, an end effector may be associated with a custom sensing arrangement. For example, one or more sensors (e.g., one or more uniaxial sensors) may be arranged to enable sensing of forces and/or torques along multiple axes. An end effector (or another portion of the robotic arm) may additionally include any appropriate number or configuration of cameras, distance sensors, pressure sensors, light sensors, or any other suitable sensors, whether related to sensing characteristics of the payload or otherwise, as the disclosure is not limited in this regard.
As discussed above, accurately measuring the location of a ceiling of a container, such as a truck cargo compartment, within which a mobile manipulator robot is working may be challenging with existing sensors on the robot (e.g., perception modules 142 located on perception mast 140 of robot 100 shown in
In the example of
In some embodiments, distance sensors 530a-d and 540a-b are implemented as direct time-of-flight (TOF) sensors configured to detect signals reflected by an object located near (e.g., within 2 meters of) the robotic component 500. Other types of distance sensors including, but not limited to, acoustic-based (e.g., SONAR) distance sensors or laser-based distance sensors (e.g., laser range finders) may alternatively be used.
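Direct time-of-flight sensing, as referenced above, infers distance from the round-trip travel time of an emitted signal: the one-way distance is half the round-trip path length. The following is a minimal Python sketch of this relationship (real TOF sensors perform this computation internally; the timing value in the comment is illustrative):

```python
# Speed of light in a vacuum, in meters per second.
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_to_distance_m(round_trip_time_s: float) -> float:
    """One-way distance to the reflecting surface, in meters: the emitted
    signal travels out and back, so the distance is half the round trip."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2.0

# A round trip of roughly 13.3 ns corresponds to a distance of about 2 m,
# on the order of the sensing range described above.
```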
As described in more detail below, distance measurement signals sensed by the distance sensors arranged on the end effector of the robot may be provided to one or more computer processors. The one or more computer processors may be configured to process the distance measurement signals to detect one or more characteristics (e.g., distance, geometry) of a ceiling of a container within which the robot is operating or is intending to operate. The one or more characteristics of the ceiling of the container can then be used when determining how to operate the robot to perform a task, such as picking boxes from a stack of boxes within the container and placing them on a conveyor. For instance, the trajectory of the robotic arm and/or the end effector portion of the robotic arm may be selected in a way that avoids a collision with the ceiling of the container.
In some embodiments, the distance sensors 540a-b configured to detect objects in the Z direction (through the suction cup assemblies) may include a different type of distance sensor (or be configured differently) than the distance sensors 530a-d arranged on the sides of the gripper 520. For instance, because the distance sensors 540a-b are arranged on the suction cup assembly surface, they should have a transmit cone small enough to fit between the suction cup assemblies, whereas the distance sensors 530a-d may not have such restrictions. Accordingly, in some embodiments the distance sensors 530a-d are configured to have a larger field of view than the distance sensors 540a-b.
In some embodiments, locating distance sensors on a robotic component, such as an end effector that can be manipulated with dexterity, enables distance measurements to be made in ways that may not be possible with other sensors included in the perception system of the robot. For instance, the perception mast of the robot described in connection with
Process 600 then proceeds to act 620, where a ceiling estimate of the container is determined based, at least in part, on the distance measurement data. The ceiling estimate may include one or more characteristics of the ceiling, examples of which include, but are not limited to, ceiling height and ceiling geometry. For instance, the distance measurement data may include a point cloud of distance measurements sensed as the arm of the robot is moved through a scan trajectory. An example scan trajectory is described with reference to
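For a distance sensor oriented toward the ceiling, each sample in the distance measurement data can be converted to a ceiling height by adding the height of the sensor above the floor at the moment the sample was captured. The following minimal Python sketch shows this bookkeeping and a conservative scalar estimate; it assumes a vertically oriented beam, and the function and parameter names are illustrative rather than part of the disclosure:

```python
def ceiling_heights(samples):
    """samples: iterable of (sensor_height_m, measured_distance_m) pairs
    captured as the arm moves through the scan trajectory.
    Returns the per-sample ceiling height above the floor."""
    return [sensor_z + distance for sensor_z, distance in samples]

def min_ceiling_height(samples):
    """A conservative scalar ceiling estimate: the lowest ceiling point
    observed across the scan (e.g., a crossbar hanging below the ceiling)."""
    return min(ceiling_heights(samples))
```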
In some embodiments, distance measurement data may be sensed in act 610 and a ceiling estimate may be determined in act 620 only once, either prior to the robot entering the container or after the robot has entered the container, but prior to operation of the robot to interact with objects in the container. In other embodiments, distance measurement data may be periodically obtained (e.g., during the normal picking operation of the robot or at any other suitable time) when the end effector is oriented such that a distance measurement to the ceiling of the container can be sensed. For instance, the orientation of a gripper may be such that a distance sensor located on a top surface of the gripper when executing a face pick of an object (e.g., as shown in
Process 600 then proceeds to act 630, where an operation of the robot is controlled based on the ceiling estimate. As described above, determining an accurate ceiling estimate using the techniques described herein enables the robot to operate in a safe manner that reduces the likelihood that the robot will collide with the ceiling of a container within which it is operating. Accordingly, the ceiling estimate may be taken into consideration as a part of a collision avoidance process when deciding on a trajectory to pick and/or place an object within the container. Additionally or alternatively, the ceiling estimate may be taken into consideration when determining how the robot should grasp an object using its end effector. For instance, based on the location of an object to be grasped relative to the ceiling of the container, it may be determined to place the gripper at a location that reduces the likelihood of other objects (e.g., objects located below the object to be grasped) tipping over, while ensuring that the grasp quality of the gripper is sufficient to grasp and move the object. As an example, rather than placing the gripper in the middle of a box such that a maximum number of suction cups engage the box, it may be determined, when the box is close to the ceiling of the container, to offset the gripper relative to the center of the box such that fewer suction cups than the maximum possible number engage the box, while still maintaining a sufficient grasp on the box to move it. Other operations of the robot may additionally or alternatively be controlled based on the ceiling estimate in act 630, and embodiments of the present disclosure are not limited in this respect.
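A ceiling-aware grasp-placement rule of the kind described above might be sketched as follows. This is an illustrative heuristic only: the function and parameter names are hypothetical, and a real grasp planner would weigh suction engagement, payload, and tipping risk together rather than applying a single clearance rule.

```python
def choose_grip_offset(box_top_z, ceiling_z, required_clearance, max_offset):
    """Return a lateral gripper offset (meters) from the box center.

    Illustrative heuristic: when the gap between the top of the box and the
    ceiling is smaller than the clearance a centered gripper would need,
    shift the gripper away from center (capped at max_offset) so that fewer
    suction cups engage but the grasp remains feasible near the ceiling."""
    clearance = ceiling_z - box_top_z
    if clearance >= required_clearance:
        return 0.0  # enough headroom: center the gripper for maximum suction engagement
    shortfall = required_clearance - clearance
    return min(shortfall, max_offset)
```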
Process 700 then proceeds to act 720, where the arm of the robot is controlled to move through a scan trajectory in a plane parallel to the ceiling of the container. Process 700 then proceeds to act 730, where distance measurements are sensed by the distance sensors as the arm is moved through the scan trajectory. By moving the arm through the scan trajectory as distance measurements are sensed, a sampling of different points on the ceiling may be captured to help ensure that the distance measurements represent an accurate geometry of the ceiling of the container. For instance, capturing distance measurements at multiple points facilitates a determination of whether the ceiling of the container is flat, tapered (e.g., from back to front or front to back), bowed (e.g., taller in the middle of the container compared to the sides), or has some other geometry (e.g., crossbars extending across the width of the container).
The inventors have recognized and appreciated that it may be advantageous to design a scan trajectory that includes scanning in multiple directions to ensure that enough features of the ceiling are captured to facilitate an accurate modeling of the ceiling surface. For instance, if only a single line of distance measurements is captured along the width of the container, the measurements may fail to capture one or more crossbars hanging down from the ceiling of the container if the crossbars are not directly above the end effector during the scan. Accordingly, in some embodiments, the scan trajectory includes a first segment oriented in a first direction and a second segment oriented at an angle relative to the first segment. The example scan trajectory 930 shown in
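For illustration only, a scan trajectory with two segments oriented at an angle to one another (here, a cross of two perpendicular legs) might be generated as in the sketch below. The function name, the choice of perpendicular legs, and the waypoint parameterization are assumptions, not a description of any particular claimed trajectory.

```python
def cross_scan_waypoints(center_x, center_y, length, samples_per_leg):
    """Generate (x, y) waypoints for a cross-shaped scan in a plane
    parallel to the ceiling.

    Two perpendicular legs pass through (center_x, center_y); sampling
    in two directions reduces the chance of missing narrow features,
    such as crossbars, that a single-line scan could pass between.
    """
    pts = []
    for i in range(samples_per_leg):
        t = (i / (samples_per_leg - 1) - 0.5) * length
        pts.append((center_x + t, center_y))  # first segment, along x
    for i in range(samples_per_leg):
        t = (i / (samples_per_leg - 1) - 0.5) * length
        pts.append((center_x, center_y + t))  # second segment, 90 degrees to the first
    return pts
```

The arm would be driven through these waypoints while the distance sensors sample the ceiling at each point.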
Process 1000 then proceeds to act 1014, where a plane representing the ceiling of the container is fit using one or more of the filtered distance measurements. In some embodiments, a plane may be fit using a minimum distance measurement in the filtered distance measurements. For example, when the data is sorted to remove measurements close to the sensor during filtering in act 1012, the remaining smallest distance measurement may be determined as the minimum distance measurement, and a plane parallel to the floor may be fit at the minimum distance measurement. Any suitable type of plane including, but not limited to, a flat plane, a curved plane, an angled plane, or multiple planes (e.g., if there is a step in the ceiling) may be fit using one or more of the filtered distance measurements. As one example, a flat plane parallel to the floor of the container may be fit at the level of the minimum distance as described above. In another example, an angled plane may be fit to the filtered distance measurements when the ceiling of the container is tapered from back to front or front to back. In such an example, multiple distance measurements along the length of the container's ceiling may be used to define the angled plane. In another example, a curved plane may be fit when the ceiling of the container is determined to be bowed in the middle. In such an example, multiple distance measurements along the width of the container may be used to fit the curved plane, to characterize the bowed ceiling of the container. Planes having other geometrical shapes are also possible. For instance, the plane may be both angled and curved. In some embodiments, more complex geometries including, but not limited to, the presence of crossbars coupled to the ceiling of the container may be modeled using the sensed distance measurements.
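As a non-limiting sketch of act 1014, the conservative flat-plane case and the angled-plane case might be approximated as follows. The use of a least-squares fit, and all function names, are illustrative assumptions; the disclosure is not limited to this fitting method.

```python
import numpy as np

def fit_ceiling_plane(points):
    """Least-squares fit of the plane z = a*x + b*y + c to ceiling
    sample points (x, y, z) derived from the filtered distance
    measurements. A flat ceiling yields a and b near zero; a taper
    along the container's length appears as a nonzero slope.
    """
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return coeffs  # (a, b, c)

def conservative_flat_estimate(points):
    """Flat plane parallel to the floor, placed at the lowest sensed
    ceiling point -- i.e., at the minimum distance measurement -- as a
    conservative bound for collision avoidance."""
    return min(p[2] for p in points)
```

For a ceiling tapered along x, the fitted slope a would be nonzero, while the conservative estimate simply returns the lowest ceiling height observed.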
Process 1000 then proceeds to act 1016, where a shape primitive is associated with the plane to represent the ceiling estimate for the container. In some embodiments, a robot can use certain approximations (e.g., in determining candidate trajectories and/or conducting feasibility assessments) to help avoid collisions (e.g., between different portions of itself or with objects, such as the ceiling, in its surrounding environment). Such approximations are also referred to herein as “primitives” or “shape primitives,” and can be used to implement collision avoidance by limiting the distance between primitives (e.g., a primitive representing the gripper and a primitive representing the ceiling). The primitives can be simple geometric shapes (e.g., spheres, boxes, etc.) or more complicated geometric shapes. Separation vectors can be computed to determine distances between certain primitives of interest to provide quantifiable metrics to help ensure that collisions are avoided (e.g., by ensuring that the distances remain above a specified positive number). In some embodiments, two points (one on each primitive) that are closest together can be determined, and the vector connecting the two points can determine the corresponding separation vector.
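By way of illustration, a separation vector between a simple gripper primitive (a sphere) and a flat ceiling primitive (a horizontal plane) might be computed as in the sketch below. The sphere/plane pairing, function names, and margin value are assumptions chosen for simplicity; real primitives may be more complicated shapes.

```python
def separation_vector_sphere_plane(center, radius, ceiling_z):
    """Separation vector from a sphere primitive (e.g., bounding the
    gripper) to a flat ceiling plane at height ceiling_z.

    The closest point on the sphere to the ceiling is its topmost
    point; the vector connecting the closest points is vertical. A
    negative z-component indicates the primitives overlap.
    """
    cx, cy, cz = center
    top_of_sphere = cz + radius
    gap = ceiling_z - top_of_sphere
    return (0.0, 0.0, gap)

def is_collision_free(center, radius, ceiling_z, margin=0.05):
    """Require the separation distance to remain above a positive margin."""
    return separation_vector_sphere_plane(center, radius, ceiling_z)[2] >= margin
```

A trajectory planner could reject any candidate pose for which `is_collision_free` returns False.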
An orientation may herein refer to an angular position of an object. In some instances, an orientation may refer to an amount of rotation (e.g., in degrees or radians) about three axes. In some cases, an orientation of a robotic device may refer to the orientation of the robotic device with respect to a particular reference frame, such as the ground or a surface on which it stands. An orientation may describe the angular position using Euler angles, Tait-Bryan angles (also known as yaw, pitch, and roll angles), and/or quaternions. In some instances, such as on a computer-readable medium, the orientation may be represented by an orientation matrix and/or an orientation quaternion, among other representations.
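For illustration, a conversion from Tait-Bryan angles to an orientation quaternion might look as follows. The (w, x, y, z) ordering and the yaw-pitch-roll (z-y-x) rotation order are assumptions; other conventions are equally valid.

```python
import math

def ypr_to_quaternion(yaw, pitch, roll):
    """Convert Tait-Bryan angles (radians; yaw about z, then pitch
    about y, then roll about x) to a unit quaternion (w, x, y, z)."""
    cy, sy = math.cos(yaw / 2), math.sin(yaw / 2)
    cp, sp = math.cos(pitch / 2), math.sin(pitch / 2)
    cr, sr = math.cos(roll / 2), math.sin(roll / 2)
    return (cr * cp * cy + sr * sp * sy,
            sr * cp * cy - cr * sp * sy,
            cr * sp * cy + sr * cp * sy,
            cr * cp * sy - sr * sp * cy)
```

Zero angles map to the identity quaternion (1, 0, 0, 0), and a pure yaw of 90 degrees maps to a rotation about the z axis alone.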
In some scenarios, measurements from sensors on the base of the robotic device may indicate that the robotic device is oriented in such a way and/or has a linear and/or angular velocity that requires control of one or more of the articulated appendages in order to maintain balance of the robotic device. In these scenarios, however, it may be the case that the limbs of the robotic device are oriented and/or moving such that balance control is not required. For example, the body of the robotic device may be tilted to the left, and sensors measuring the body's orientation may thus indicate a need to move limbs to balance the robotic device; however, one or more limbs of the robotic device may be extended to the right, causing the robotic device to be balanced despite the sensors on the base of the robotic device indicating otherwise. The limbs of a robotic device may apply a torque on the body of the robotic device and may also affect the robotic device's center of mass. Thus, orientation and angular velocity measurements of one portion of the robotic device may be an inaccurate representation of the orientation and angular velocity of the combination of the robotic device's body and limbs (which may be referred to herein as the “aggregate” orientation and angular velocity).
In some implementations, the processing system may be configured to estimate the aggregate orientation and/or angular velocity of the entire robotic device based on the sensed orientation of the base of the robotic device and the measured joint angles. The processing system has stored thereon a relationship between the joint angles of the robotic device and the extent to which the joint angles of the robotic device affect the orientation and/or angular velocity of the base of the robotic device. The relationship between the joint angles of the robotic device and the motion of the base of the robotic device may be determined based on the kinematics and mass properties of the limbs of the robotic devices. In other words, the relationship may specify the effects that the joint angles have on the aggregate orientation and/or angular velocity of the robotic device. Additionally, the processing system may be configured to determine components of the orientation and/or angular velocity of the robotic device caused by internal motion and components of the orientation and/or angular velocity of the robotic device caused by external motion. Further, the processing system may differentiate components of the aggregate orientation in order to determine the robotic device's aggregate yaw rate, pitch rate, and roll rate (which may be collectively referred to as the “aggregate angular velocity”).
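The aggregate estimate described above might be sketched, in highly simplified form, as a base angular velocity corrected by per-joint contributions. The linear per-joint model and all names below are illustrative assumptions; in practice the stored relationship would be derived from the kinematics and mass properties of the limbs.

```python
def aggregate_angular_velocity(base_omega, joint_rates, joint_effects):
    """Estimate the aggregate angular velocity of the robotic device.

    base_omega: (wx, wy, wz) sensed at the base.
    joint_rates: angular rate of each measured joint.
    joint_effects: for each joint, the (wx, wy, wz) contribution per
        unit joint rate -- the stored "relationship" between joint
        motion and aggregate motion.
    """
    omega = list(base_omega)
    for rate, effect in zip(joint_rates, joint_effects):
        omega = [w + rate * e for w, e in zip(omega, effect)]
    return omega
```

Here, a limb swinging while the base is tilting contributes its own term, so the aggregate estimate can differ from what the base sensors alone report.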
In some implementations, the robotic device may also include a control system that is configured to control the robotic device on the basis of a simplified model of the robotic device. The control system may be configured to receive the estimated aggregate orientation and/or angular velocity of the robotic device, and subsequently control one or more jointed limbs of the robotic device to behave in a certain manner (e.g., maintain the balance of the robotic device).
In some implementations, the robotic device may include force sensors that measure or estimate the external forces (e.g., the force applied by a limb of the robotic device against the ground) along with kinematic sensors to measure the orientation of the limbs of the robotic device. The processing system may be configured to determine the robotic device's angular momentum based on information measured by the sensors. The control system may be configured with a feedback-based state observer that receives the measured angular momentum and the aggregate angular velocity, and provides a reduced-noise estimate of the angular momentum of the robotic device. The state observer may also receive measurements and/or estimates of torques or forces acting on the robotic device and use them, among other information, as a basis to determine the reduced-noise estimate of the angular momentum of the robotic device.
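A single scalar step of such a feedback-based state observer might be sketched as follows. The first-order model (torque as the rate of change of angular momentum) and the scalar gain are simplifying assumptions for illustration; a practical observer would operate on vector quantities with a gain matrix.

```python
def observer_update(h_est, h_meas, torque, gain, dt):
    """One observer step producing a reduced-noise angular momentum estimate.

    h_est: previous filtered estimate of angular momentum.
    h_meas: new (noisy) measured angular momentum.
    torque: measured/estimated external torque acting on the device.
    gain: observer gain in (0, 1]; larger values trust the measurement more.
    dt: time step in seconds.
    """
    h_pred = h_est + torque * dt               # propagate model: dh/dt = torque
    return h_pred + gain * (h_meas - h_pred)   # correct toward the measurement
```

With a gain of 0.5, an estimate of 1.0 and a measurement of 1.2 (no torque) blend to 1.1, attenuating measurement noise by half per step.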
In some implementations, multiple relationships between the joint angles and their effect on the orientation and/or angular velocity of the base of the robotic device may be stored on the processing system. The processing system may select a particular relationship with which to determine the aggregate orientation and/or angular velocity based on the joint angles. For example, one relationship may be associated with a particular joint being between 0 and 90 degrees, and another relationship may be associated with the particular joint being between 91 and 180 degrees. The selected relationship may more accurately estimate the aggregate orientation of the robotic device than the other relationships.
In some implementations, the processing system may have stored thereon more than one relationship between the joint angles of the robotic device and the extent to which the joint angles of the robotic device affect the orientation and/or angular velocity of the base of the robotic device. Each relationship may correspond to one or more ranges of joint angle values (e.g., operating ranges). In some implementations, the robotic device may operate in one or more modes. A mode of operation may correspond to one or more of the joint angles being within a corresponding set of operating ranges. In these implementations, each mode of operation may correspond to a certain relationship.
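The range-based selection among stored relationships might be sketched as a simple lookup; the data layout and function name are assumptions introduced for illustration.

```python
def select_relationship(joint_angle_deg, relationships):
    """Pick the stored joint-angle relationship whose operating range
    contains the current joint angle.

    relationships: list of ((lo, hi), model) pairs -- e.g., one model
    for a joint between 0 and 90 degrees and another for 91 to 180.
    """
    for (lo, hi), model in relationships:
        if lo <= joint_angle_deg <= hi:
            return model
    raise ValueError("joint angle outside all operating ranges")
```

Each mode of operation would thus resolve to the relationship whose operating range matches the current joint angles.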
The angular velocity of the robotic device may have multiple components describing the robotic device's orientation (e.g., rotational angles) along multiple planes. From the perspective of the robotic device, a rotational angle of the robotic device turned to the left or the right may be referred to herein as “yaw.” A rotational angle of the robotic device upwards or downwards may be referred to herein as “pitch.” A rotational angle of the robotic device tilted to the left or the right may be referred to herein as “roll.” Additionally, the rate of change of the yaw, pitch, and roll may be referred to herein as the “yaw rate,” the “pitch rate,” and the “roll rate,” respectively.
Processor(s) 1202 may operate as one or more general-purpose processors or special-purpose processors (e.g., digital signal processors, application-specific integrated circuits, etc.). The processor(s) 1202 can be configured to execute computer-readable program instructions 1206 that are stored in the data storage 1204 and are executable to provide the operations of the robotic device 1200 described herein. For instance, the program instructions 1206 may be executable to provide operations of controller 1208, where the controller 1208 may be configured to cause activation and/or deactivation of the mechanical components 1214 and the electrical components 1216. The processor(s) 1202 may operate and enable the robotic device 1200 to perform various functions, including the functions described herein.
The data storage 1204 may exist as various types of storage media, such as a memory. For example, the data storage 1204 may include or take the form of one or more computer-readable storage media that can be read or accessed by processor(s) 1202. The one or more computer-readable storage media can include volatile and/or non-volatile storage components, such as optical, magnetic, organic or other memory or disc storage, which can be integrated in whole or in part with processor(s) 1202. In some implementations, the data storage 1204 can be implemented using a single physical device (e.g., one optical, magnetic, organic or other memory or disc storage unit), while in other implementations, the data storage 1204 can be implemented using two or more physical devices, which may communicate electronically (e.g., via wired or wireless communication). Further, in addition to the computer-readable program instructions 1206, the data storage 1204 may include additional data such as diagnostic data, among other possibilities.
The robotic device 1200 may include at least one controller 1208, which may interface with the robotic device 1200. The controller 1208 may serve as a link between portions of the robotic device 1200, such as a link between mechanical components 1214 and/or electrical components 1216. In some instances, the controller 1208 may serve as an interface between the robotic device 1200 and another computing device. Furthermore, the controller 1208 may serve as an interface between the robotic device 1200 and a user(s). The controller 1208 may include various components for communicating with the robotic device 1200, including one or more joysticks or buttons, among other features. The controller 1208 may perform other operations for the robotic device 1200 as well. Other examples of controllers may exist as well.
Additionally, the robotic device 1200 includes one or more sensor(s) 1210 such as force sensors, proximity sensors, motion sensors, load sensors, position sensors, touch sensors, depth sensors, ultrasonic range sensors, and/or infrared sensors, among other possibilities. The sensor(s) 1210 may provide sensor data to the processor(s) 1202 to allow for appropriate interaction of the robotic device 1200 with the environment as well as monitoring of operation of the systems of the robotic device 1200. The sensor data may be used in evaluation of various factors for activation and deactivation of mechanical components 1214 and electrical components 1216 by controller 1208 and/or a computing system of the robotic device 1200.
The sensor(s) 1210 may provide information indicative of the environment of the robotic device for the controller 1208 and/or computing system to use to determine operations for the robotic device 1200. For example, the sensor(s) 1210 may capture data corresponding to the terrain of the environment or location of nearby objects, which may assist with environment recognition and navigation, etc. In an example configuration, the robotic device 1200 may include a sensor system that may include a camera, RADAR, LIDAR, time-of-flight camera, global positioning system (GPS) transceiver, and/or other sensors for capturing information of the environment of the robotic device 1200. The sensor(s) 1210 may monitor the environment in real-time and detect obstacles, elements of the terrain, weather conditions, temperature, and/or other parameters of the environment for the robotic device 1200.
Further, the robotic device 1200 may include other sensor(s) 1210 configured to receive information indicative of the state of the robotic device 1200, including sensor(s) 1210 that may monitor the state of the various components of the robotic device 1200. The sensor(s) 1210 may measure activity of systems of the robotic device 1200 and receive information based on the operation of the various features of the robotic device 1200, such as the operation of extendable legs, arms, or other mechanical and/or electrical features of the robotic device 1200. The sensor data provided by the sensors may enable the computing system of the robotic device 1200 to determine errors in operation as well as monitor overall functioning of components of the robotic device 1200.
For example, the computing system may use sensor data to determine the stability of the robotic device 1200 during operations as well as measurements related to power levels, communication activities, components that require repair, among other information. As an example configuration, the robotic device 1200 may include gyroscope(s), accelerometer(s), and/or other possible sensors to provide sensor data relating to the state of operation of the robotic device. Further, sensor(s) 1210 may also monitor the current state of a function that the robotic device 1200 may currently be operating. Additionally, the sensor(s) 1210 may measure a distance between a given robotic limb of a robotic device and a center of mass of the robotic device. Other example uses for the sensor(s) 1210 may exist as well.
Additionally, the robotic device 1200 may also include one or more power source(s) 1212 configured to supply power to various components of the robotic device 1200. Among possible power systems, the robotic device 1200 may include a hydraulic system, electrical system, batteries, and/or other types of power systems. As an example illustration, the robotic device 1200 may include one or more batteries configured to provide power to components via a wired and/or wireless connection. Within examples, components of the mechanical components 1214 and electrical components 1216 may each connect to a different power source or may be powered by the same power source. Components of the robotic device 1200 may connect to multiple power sources as well.
Within example configurations, any type of power source may be used to power the robotic device 1200, such as a gasoline and/or electric engine. Further, the power source(s) 1212 may charge using various types of charging, such as wired connections to an outside power source, wireless charging, combustion, or other examples. Other configurations may also be possible. Additionally, the robotic device 1200 may include a hydraulic system configured to provide power to the mechanical components 1214 using fluid power. Components of the robotic device 1200 may operate based on hydraulic fluid being transmitted throughout the hydraulic system to various hydraulic motors and hydraulic cylinders, for example. The hydraulic system of the robotic device 1200 may transfer a large amount of power through small tubes, flexible hoses, or other links between components of the robotic device 1200. Other power sources may be included within the robotic device 1200.
Mechanical components 1214 can represent hardware of the robotic device 1200 that may enable the robotic device 1200 to operate and perform physical functions. As a few examples, the robotic device 1200 may include actuator(s), extendable leg(s), arm(s), wheel(s), one or multiple structured bodies for housing the computing system or other components, and/or other mechanical components. The mechanical components 1214 may depend on the design of the robotic device 1200 and may also be based on the functions and/or tasks the robotic device 1200 may be configured to perform. As such, depending on the operation and functions of the robotic device 1200, different mechanical components 1214 may be available for the robotic device 1200 to utilize. In some examples, the robotic device 1200 may be configured to add and/or remove mechanical components 1214, which may involve assistance from a user and/or other robotic device.
The electrical components 1216 may include various components capable of processing, transferring, and providing electrical charge or electric signals, for example. Among possible examples, the electrical components 1216 may include electrical wires, circuitry, and/or wireless communication transmitters and receivers to enable operations of the robotic device 1200. The electrical components 1216 may interwork with the mechanical components 1214 to enable the robotic device 1200 to perform various operations. The electrical components 1216 may be configured to provide power from the power source(s) 1212 to the various mechanical components 1214, for example. Further, the robotic device 1200 may include electric motors. Other examples of electrical components 1216 may exist as well.
In some implementations, the robotic device 1200 may also include communication link(s) 1218 configured to send and/or receive information. The communication link(s) 1218 may transmit data indicating the state of the various components of the robotic device 1200. For example, information read in by sensor(s) 1210 may be transmitted via the communication link(s) 1218 to a separate device. Other diagnostic information indicating the integrity or health of the power source(s) 1212, mechanical components 1214, electrical components 1216, processor(s) 1202, data storage 1204, and/or controller 1208 may be transmitted via the communication link(s) 1218 to an external communication device.
In some implementations, the robotic device 1200 may receive information at the communication link(s) 1218 that is processed by the processor(s) 1202. The received information may indicate data that is accessible by the processor(s) 1202 during execution of the program instructions 1206, for example. Further, the received information may change aspects of the controller 1208 that may affect the behavior of the mechanical components 1214 or the electrical components 1216. In some cases, the received information indicates a query requesting a particular piece of information (e.g., the operational state of one or more of the components of the robotic device 1200), and the processor(s) 1202 may subsequently transmit that particular piece of information back out the communication link(s) 1218.
In some cases, the communication link(s) 1218 include a wired connection. The robotic device 1200 may include one or more ports to interface the communication link(s) 1218 to an external device. The communication link(s) 1218 may include, in addition to or alternatively to the wired connection, a wireless connection. Some example wireless connections may utilize a cellular connection, such as CDMA, EVDO, GSM/GPRS, or 4G telecommunication, such as WiMAX or LTE. Alternatively or in addition, the wireless connection may utilize a Wi-Fi connection to transmit data to a wireless local area network (WLAN). In some implementations, the wireless connection may also communicate over an infrared link, radio, Bluetooth, or a near-field communication (NFC) device.
A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure.
This application claims the benefit under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application No. 63/434,505, filed Dec. 22, 2022, and titled “METHODS AND APPARATUS FOR AUTOMATED CEILING DETECTION,” the entire contents of which is incorporated by reference herein.
Number | Date | Country
---|---|---
63434505 | Dec 2022 | US