The project leading to this application has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No. 818087.
The present disclosure relates generally to systems and methods for monitoring a shared workspace to ensure the safety of a human working closely with dangerous machinery, such as a robot. In particular, the disclosure relates to systems and methods for determining a maximum allowable speed for movements of the robot in the presence of a human to ensure proper safety protocols are implemented in a collaborative workspace.
Many industries have long relied on industrial robots and other automated equipment to complete various tasks in a safe and efficient manner. To ensure the safety of nearby human workers, this equipment is typically isolated or deployed in separate workspaces and positioned behind fences or other protective barriers. In some industries, humans and robots can work effectively in this manner, so these isolation methods do not impact overall performance. However, in other environments, there is a substantial benefit derived from the synergy of having humans and robots coexisting and working together in a shared workspace. For such arrangements to function properly and realize the impact of a joint human and robot workforce, fences or other physical barriers are ineffective since they would inhibit some or all interaction between the human and robot. Accordingly, other safety measures are required to ensure human safety in the shared workspace is maintained while also fostering a productive environment.
To this end, various virtual barriers have been designed for controlling the actions of the robot (or other automated machine) when the conditions may pose an elevated risk to human safety. For example, in some configurations, an optical sensor system including one or more light emitters and detectors creates a light curtain that defines the boundaries of a safety zone between the human and robot. When the optical sensors detect the presence of the human near or within the safety zone, the system slows down or deactivates the robot to prevent potential injury to the human. While these sensor systems are relatively low cost and easy to configure, the safety zone defined by the light curtain is often static and exists in a two-dimensional space. Because of these limitations, the system is difficult to adapt to a dynamic workplace where the human and robot may be constantly moving, or where the robot moves through a series of wide-ranging trajectories.
Some more recent designs have shifted to the use of three-dimensional optoelectronic sensors or other suitable equipment able to monitor three-dimensional space, such as stereo cameras and time-of-flight cameras. This technology may be used to accurately capture the relative distance between the human and robot while one or both are moving within the safety zone. These configurations allow for a closer working relationship between a human and robot since the systems can precisely identify the relative locations between the human and robot in a three-dimensional space, and can be used in dynamic workplaces where the human and robot may move across a range of trajectories and at different movement speeds. Moreover, the three-dimensional safety zones may be constantly adapted based on a real-time position and movement velocity of the robot and the human.
However, even with the implementation of three-dimensional monitoring systems, the development of safety zones, and constant monitoring of the shared workspace, issues may arise at any moment because both the robot and human are capable of moving throughout the workspace at varying speeds and directions. In typical configurations, a robot (or other dangerous machine) may include a stationary base and one or more linkages designed to move over various possible trajectories at different speeds across the workspace. In some configurations, the base may be mounted on rails to increase the overall range of motion of the robot linkages within the workspace. In these and other configurations, the robot linkages and/or the human may change positions and movement speed rapidly, thereby creating a potentially dangerous situation at any given time if adequate safety protocols and proper monitoring are not implemented. Typically, robots include encoders for determining position and movement velocity for the robot and all robot linkages, and controllers for issuing control commands to the robot linkages. As discussed above, position and velocity information is integral for ensuring the safety of a shared workspace because this information is used in conjunction with a robot's three-dimensional modeling and kinematics to define where the robot is in space and where it will be in the near future. With this information, it is possible to define one or more safety zones that can accurately track a position of the robot at all times and ensure that a human worker is safe while working within a close vicinity of the robot.
In some conventional designs, the robot controller may limit the full range of motion of the robot and/or decrease overall movement velocity in all directions when a human is present or near a safety zone so as to avoid a potential collision with a human in the shared workspace. In addition, robots may incorporate software applications that employ discrete and fixed speed limitations when the human is detected in the safety zone to ensure human safety at all times. While these conventional controls increase the safety of the shared workspace, the limitations tend to be overly conservative and unnecessarily limit the movement speed of the robot, which detrimentally impacts efficiency and overall performance.
Accordingly, the present inventors have identified a need for a system and method capable of dynamically controlling robot speed based on a precise location and movement of the human and the robot. As discussed in further detail below, the system and method adjust the robot movement speed proportionally to the distance between the human and robot, and along a robot movement direction where the robot moves toward the detected human within the safety zone, while allowing higher robot movement speeds along robot movement directions that would increase the separation distance between human and robot. This configuration optimizes overall performance while ensuring the workspace between the human and robot is safe. Additional aspects and advantages of such methods will be apparent from the following detailed description of example embodiments, which proceed with reference to the accompanying drawings.
Understanding that the drawings depict only certain embodiments and are not, therefore, to be considered limiting in nature, these embodiments will be described and explained with additional specificity and detail with reference to the drawings.
With reference to the drawings, this section describes particular embodiments and their detailed construction and operation. The embodiments described herein are set forth by way of illustration only and not limitation. The described features, structures, characteristics, and methods of operation may be combined in any suitable manner in one or more embodiments. In view of the disclosure herein, those skilled in the art will recognize that the various embodiments can be practiced without one or more of the specific details or with other methods, components, materials, or the like. In other instances, well-known structures, materials, or methods of operation are not shown or not described in detail to avoid obscuring more pertinent aspects of the embodiments.
In the following description of the figures and any example embodiments, certain embodiments may describe the disclosed subject matter in the context of a workspace shared between a human and a robot to protect personnel and reduce the likelihood of inadvertent injuries. It should be understood that these references are merely example uses for the described systems and methods and should not be considered as limiting. The techniques described herein apply not only to robots in a workspace, but also to any form of dangerous automated machinery, including such machinery for which a minimum safe separation distance may vary over time due to movements of the machine. Moreover, in other embodiments, the concepts described herein may be adapted for uses in other arrangements that may differ from the workspace examples described herein.
In the field of robotics, standards ISO 10218 and ISO/TS 15066, set forth by the International Organization for Standardization (ISO), provide speed and separation monitoring guidelines for ensuring a safe workspace between an industrial robot and a human worker. Risk of injury to the human worker may be reduced in these environments by monitoring the workspace to ensure that at least a protective separation distance, also referred to as a safety zone in this disclosure, is maintained between the human and robot to avoid collisions, and to guarantee a safe movement speed for the robot at all times while the human and robot move about the workspace. In various conventional designs, the robot system may be slowed down, stopped, or have its trajectory path altered to avoid injury to the human when a human presence is detected within the safety zone. However, as noted previously, conventional systems include constraints that may unduly limit the movement speed of the robot in these circumstances, which impacts overall efficiency and performance of the robot.
As is further described in detail below with collective reference to the figures, the following disclosure relates to systems and methods designed to prioritize human safety in shared workspaces, while also determining a maximum allowable speed for robot movement based on the proximity of the human to the robot, among other variables. As is described in further detail below, the present subject matter discloses a method for determining the maximum allowable speed for the robot using a three-dimensional approach, where speed limitations for the robot are applied along specific vectors of a three-dimensional Cartesian robot frame relating to a real-time position of the robot and the human. The disclosed concept aims to limit the robot speed only in the direction of motion toward the human, where a collision would be more likely, while allowing for higher robot speeds when the robot moves in directions that will increase the separation distance between the human and robot. In other words, when the human and robot are at a defined separation distance relative to one another, speed limitations for the robot are applied only for robot movement along a vector that would result in the robot moving closer to the human. For movement trajectories of the robot that move away from a position of the human, no robot speed limitations are applied since the risk of human injury is minimal given the position of the human. In this configuration, the safety of the human is prioritized without unduly impacting overall performance and efficiency of the robot in the shared workspace.
With reference to
Turning now to the figures,
The workspace 100 may include any number of sensors 102 needed to ensure the sensors 102 collectively monitor the target regions of the workspace 100 as desired. Preferably, the sensors 102 are arranged to minimize or avoid occlusions to the extent possible to obtain as complete a three-dimensional view as possible of the workspace shared between the human and robot, and to effectively monitor the workspace 100 with fewer sensors 102. After arranging the sensors 102 around the workspace 100, their position relative to one another may be registered by comparing images as between the sensors to ensure proper calibration and coverage of the workspace 100, and to retrieve the relative position and orientation between the sensors 102 and the base of the robot 20. The calibration step may be used to identify occlusions or static objects in the sensor field-of-view to ensure those objects are accounted for and not considered in future analysis steps. With the sensors 102 properly calibrated relative to one another, the sensor data can be reliably used to monitor positions and movements of the human 10 and robot 20 in the workspace 100.
The control system 104 further includes a network interface 118 to communicate with and receive data from the sensors 102. The network interface 118 may facilitate wired or wireless communication with other devices over a short distance (e.g., Bluetooth™) or nearly unlimited distances (e.g., the Internet). In the case of a wired connection, a data bus may be provided using any protocol, such as IEEE 802.3 (Ethernet), advanced technology attachment (ATA), personal computer memory card international association (PCMCIA), and USB. A wireless connection may use low- or high-powered electromagnetic waves to transmit data using any wireless protocol, such as Bluetooth™, IEEE 802.11b (or other Wi-Fi standards), infrared data association (IrDA), and radio frequency identification (RFID). In addition, a modem module (not shown) or Ethernet module (not shown) may be incorporated to facilitate a WAN networking environment. The control system 104 may also include an interface 120 coupled to a database or internal hard drive 122. Interface 120 may also be coupled to removable memory, such as flash memory, a magnetic floppy disk drive, an optical disk drive, or another drive. Further, the interface 120 may be configured for external drive implementations, such as over a USB, IEEE 1394, or PCMCIA connection.
In one embodiment, any number of program modules may be stored in one or more drives 122 and RAM 110, including an operating system 124, one or more application programs 126, or other program modules 128 (such as instructions to implement the methods described herein), and data 130. All or portions of the program modules may also be cached in RAM 110. Any suitable operating system 124 may be employed, such as Windows Embedded CE, Windows Embedded Handheld, Windows Desktop, Android, Linux, iOS, MacOS, or other commercially available or proprietary operating systems.
The above-described components, including the processing unit 106, memory 108, display controller 116, network interface 118, and interface 120 may be interconnected via a bus 130. While a bus-based architecture is illustrated in
As noted previously, data from the sensors 102 monitoring the workspace 100 is received by the control system 104 via any suitable communications means, such as the network interface 118, and stored in memory 108 for processing by an analysis module 134. The analysis module 134 may employ conventional computer-vision techniques, such as deep-learning algorithms or deterministic algorithms, to analyze the data from the sensors 102 and distinguish between humans and automated robots or other workpieces. As is further described in detail below with reference to
To establish a general frame of reference, the following briefly describes an example method for determining a safety zone via a workspace monitoring system 300 and monitoring the workspace 100 to ensure safety distances are maintained.
As mentioned previously, all aspects of the robot's movement, such as range of motion, movement pattern, and velocity, may be governed by a robot controller 302. The robot controller 302 also determines the instantaneous state of the robot 20, including the current orientation of any robot linkages or joints, their respective movement patterns, and movement velocities. The robot controller 302 also includes all instructions relating to the robot model that controls the behavioral aspects of the robot throughout its operational sequence.
In one example method for determining a safety zone, the analysis module 134 of the control system 104 first obtains data from the sensors 102 and uses this information to identify the location of the person 10, the location of the robot 20, and other objects of interest in the workspace 100. The analysis module 134 (or other component of the control system 104) also communicates (either directly or indirectly) with the robot controller 302 via the network interface 118 or any suitable wireless or wired communication protocols to obtain information relating to the robot model and robot data relating to the movement of the robot 20 and the planned trajectory of the robot 20. With the robot data and the sensor data, along with the safety parameters set forth by the safety zone, the analysis module 134 is able to assess whether the person 10 has or will enter the safety zone. If the analysis module 134 determines that the person 10 and the robot 20 are on a collision course, the control system 104 may communicate with the robot controller 302 to take a safe action, such as by deactivating the robot 20, slowing down the robot 20, or altering the movement pattern of the robot 20 to avoid the collision.
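The assessment described above can be sketched as a minimal distance-based policy. The zone radius, the positions, and the three-way action split below are illustrative assumptions, not the system's actual parameters:

```python
import math

# Toy sketch of the assessment described above: the analysis module fuses a
# human position (from the sensors) with the robot position (from the robot
# controller) and requests a safe action when the person is inside the
# safety zone. All numbers and names are illustrative assumptions.

SAFETY_ZONE_RADIUS = 1.5  # metres around the robot base (assumed value)

def assess(human_pos, robot_pos):
    """Return the safe action the control system would request, if any."""
    d = math.dist(human_pos, robot_pos)   # separation distance
    if d < SAFETY_ZONE_RADIUS * 0.5:
        return "stop"                     # deactivate the robot
    if d < SAFETY_ZONE_RADIUS:
        return "slow"                     # reduce movement speed
    return "none"                         # no intervention required

print(assess((0.4, 0.0, 0.0), (0.0, 0.0, 0.0)))  # human very close
print(assess((1.2, 0.0, 0.0), (0.0, 0.0, 0.0)))  # human inside the zone
print(assess((3.0, 0.0, 0.0), (0.0, 0.0, 0.0)))  # human well clear
```

In a full system the "slow" branch would also consider the planned trajectory of the robot 20, as described above; the sketch only shows the distance-triggered decision.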
With reference to
As noted above, standard ISO/TS 15066 describes a mathematical model for determining speed and separation parameters to maintain safety in a shared workspace and avoid a collision between the human 10 and robot 20. The concept behind the mathematical model aims to ensure that a minimum separation distance between the human and robot is maintained at all times. In configurations where the robot may move across a range of trajectories at various speeds, the movement speed of the human and robot, along with other variables, are used to determine the applicable protective separation distance that must be maintained at any given time. With some rearrangement of various variables in the model, it can be reformulated to instead determine the maximum allowed robot speed based on the human's speed and an actual separation distance between the robot and human. The following provides a brief summary of an example method for calculating the maximum allowable speed of the robot using the mathematical model outlined in ISO/TS 15066.
Based on the model, the protective separation distance, Sp(t0), may be calculated as follows:

Sp(t0) = Sh + Sr + Ss + C + Zd + Zr  (1)
where:
Sh is the contribution to the protective separation distance attributable to the human's change in location;
Sr is the contribution attributable to the robot system's reaction time;
Ss is the contribution due to the robot system's stopping distance;
C is the intrusion distance, as defined in ISO 13855;
Zd is the position uncertainty of the human in the workspace, resulting from the sensing system measurement tolerance; and
Zr is the position uncertainty of the robot system, resulting from the accuracy of the robot position measurement system.
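Equation (1) can be illustrated numerically; all component values below are assumptions chosen purely for illustration:

```python
# Numeric illustration of equation (1). Every value here is an assumed,
# illustrative figure, not a value prescribed by ISO/TS 15066.
S_h = 0.80   # human contribution (m): distance covered during Tr + Ts
S_r = 0.15   # robot reaction-time contribution (m)
S_s = 0.20   # robot stopping-distance contribution (m)
C   = 0.10   # intrusion distance (m)
Z_d = 0.05   # human position measurement uncertainty (m)
Z_r = 0.02   # robot position uncertainty (m)

# Equation (1): the protective separation distance is the sum of all terms.
S_p = S_h + S_r + S_s + C + Z_d + Z_r
print(f"Sp(t0) = {S_p:.2f} m")
```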
In the above equation, the contributions for Sr and Ss described in ISO/TS 15066 can be calculated as follows:
Sr = ∫[t0, t0+Tr] vr(t) dt  (2)
where vr is the directed speed of the robot in the direction of the human in the workspace and Tr is the robot reaction time. The directed speed may be positive or negative depending on whether the separation between the human and robot is increasing or decreasing. Here, vr may vary due to either the robot's speed or a change in direction of the robot. The system is designed to limit vr depending on the separation distance. If a safety-rated speed limit is in effect, the system may trust the speed limitation performed by the robot system.
Similarly, Ss can be calculated as follows:
Ss = ∫[t0+Tr, t0+Tr+Ts] vs(t) dt  (3)
where vs is the speed of the robot in the course of stopping, where the speed is calculated from the activation of the stop command until the robot has come to a full stop; Tr is the robot reaction time; and Ts is the robot stopping time. Here, vs is a function of time and can vary due to either the robot's speed or its change in direction. The system is designed to account for vs in the manner that most reduces the separation distance. For example, in the case where the robot's speed is not being monitored, the system assumes that equation (3) is the robot's stopping distance in the direction that most reduces the separation distance. On the other hand, if the robot's speed is being monitored, the system may use the robot's stopping distance based on the monitored speed, applied in the direction that most reduces the separation distance.
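Under the simplifying assumptions of a constant directed speed during the reaction time and a linear deceleration to zero during stopping, equations (2) and (3) can be sketched numerically as follows (all timings and speeds are assumed values):

```python
# Numerical illustration of equations (2) and (3): integrating the directed
# robot speed over the reaction time Tr, and the stopping speed over the
# stopping time Ts. Speed profile and timings are illustrative assumptions.

T_r = 0.10   # robot reaction time (s), assumed
T_s = 0.30   # robot stopping time (s), assumed
v_r = 1.0    # directed robot speed toward the human (m/s), constant here

# Equation (2): speed is unchanged during the reaction time, so the
# integral reduces to v_r * T_r.
S_r = v_r * T_r

# Equation (3): assume linear deceleration from v_r to 0 over Ts, and
# integrate numerically with the midpoint rule.
n = 1000
dt = T_s / n
S_s = sum((v_r * (1 - (i + 0.5) / n)) * dt for i in range(n))

print(f"Sr = {S_r:.3f} m, Ss = {S_s:.3f} m")
```

For this linear profile the stopping contribution reduces to v_r * T_s / 2, which the numerical integral reproduces.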
The concept of “directed speed” is also described in the ISO/TS 15066 standard, suggesting that an advanced approach can be employed rather than considering just the speed magnitude. A common way to ensure that the minimum separation distance is always observed is to derive from the above-referenced equations (1)-(3) a maximum allowable robot speed that can then be communicated to the robot controller to ensure safe operation. The maximum allowable speed, based on the parameters of the robot system and the sensing system as described above, can be summarized as follows:
where amax is the robot deceleration during the stopping phase, with the velocity of the human considered constant during the integration time. For the human speed, the standard suggests a value of 1.6 m/s, or 2.0 m/s if the separation distance is below 0.5 meters; alternatively, the human speed can be retrieved at runtime when a sensing system able to monitor the human speed is employed.
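The closed form of equation (4) is not reproduced above. A common rearrangement consistent with equations (1)-(3), assuming a constant human speed, a fixed stopping time, and a constant robot deceleration amax, can be sketched as follows; every parameter value is an illustrative assumption:

```python
import math

# Hedged sketch of the rearrangement behind equation (4): solve the
# protective-distance inequality
#   d >= v_h*(T_r + T_s) + v*T_r + v**2/(2*a_max) + C + Z_d + Z_r
# for the robot speed v. This is a sketch under the stated assumptions,
# not the standard's verbatim formula; all defaults are illustrative.

def v_max(d, v_h=1.6, T_r=0.1, T_s=0.3, a_max=2.0, C=0.1, Z_d=0.05, Z_r=0.02):
    """Maximum allowable robot speed (m/s) at separation distance d (m)."""
    # Margin consumed by the human's motion and the fixed error terms.
    K = v_h * (T_r + T_s) + C + Z_d + Z_r
    slack = d - K
    if slack <= 0:
        return 0.0  # distance already too short: robot must stand still
    # Positive root of v**2/(2*a_max) + v*T_r - slack = 0.
    return -a_max * T_r + math.sqrt((a_max * T_r) ** 2 + 2 * a_max * slack)

print(round(v_max(2.0), 3))  # ample separation: high speed allowed
print(round(v_max(0.9), 3))  # short separation: strongly limited
```

The allowed speed is zero below the fixed-margin threshold and grows with separation distance, matching the qualitative behavior described in the text.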
In
In one example embodiment, when the human 10 is detected by the sensor 402 in the furthest zone 410, little or no adjustment may be necessary for the robot movement speed since the human 10 is not in much danger of colliding with the robot 20. As the human 10 moves toward the robot 20 and into a closer zone 408, some safety action may be taken by the robot 20 since the separation distance has decreased and a potential collision is more probable as compared to when the human 10 was in zone 410. In this scenario, the speed of movement for the robot 20 is decreased in line with equation (4) above to slow down the robot 20 due to the shorter separation distance between the human 10 and robot 20. If the human 10 continues moving toward the robot 20 and enters the zone 406 adjacent the robot 20, the robot speed may be substantially decreased to its minimum speed, or all movement of the robot 20 may cease, to avoid potential injury for the human 10. As noted above, if the human 10 enters the zone 404, all robot movement is stopped. It should be understood that the illustrated zones and the example provided above are for ease of understanding. In other embodiments, the zones may have different boundaries created based on the robot speed, range of motion, and various other factors.
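The graded, zone-based response described above can be sketched as a simple lookup. The zone radii below are illustrative assumptions; only the ordering of the zones follows the text:

```python
# Sketch of the graded response described above. Zone numerals follow the
# text (404 innermost, 410 outermost); the radii are assumed values.
ZONES = [
    (0.5, "stop"),       # zone 404: all robot movement stopped
    (1.0, "minimum"),    # zone 406: minimum speed or full stop
    (2.0, "reduced"),    # zone 408: speed reduced per equation (4)
    (4.0, "monitor"),    # zone 410: little or no adjustment
]

def zone_action(separation_m):
    """Return the action for the innermost zone containing the human."""
    for radius, action in ZONES:
        if separation_m < radius:
            return action
    return "full speed"  # human outside all monitored zones

print(zone_action(0.3), zone_action(1.5), zone_action(5.0))
```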
One significant disadvantage of the above-referenced configuration employing the mathematical model highlighted in the ISO/TS 15066 standard is that the concept does not account for the position of the robot 20 at any given time. As such, the concept relies on statically defined monitored zones 404, 406, 408, 410 that are determined using the worst case scenario for the robot 20 to maximize safety. For example, the protection zone 404 is defined to account for the furthest reachable area by the robot 20 at any point in its entire range of motion. As such, the protection zone 404 may be overly inclusive and lead to unnecessary decreases in speed or movement stoppages for the robot 20 that could be avoided if dynamic positioning and movement for both the robot 20 and human 10 were taken into consideration.
Using the above-referenced vector approach, the speed limitations are not simply scalar values that limit robot speed in all directions, but rather focus the speed limitations on specific movement vectors to ensure that movement speed is limited only when the direction of motion of the robot 20 would decrease the separation distance relative to a position of the human 10. In one embodiment, the three-dimensional vector approach described herein uses a total of six values, one for each of the positive and negative directions of the three Cartesian robot axes. In other words, the three-dimensional vector, V3D, may be represented as:
V3D = [−Vx; −Vy; −Vz; +Vx; +Vy; +Vz]  (5)
It should be understood that while the approach described herein relates to a configuration with six vector values, other embodiments may use any suitable number of vectors in the calculation. For simplicity, the disclosure proceeds with reference to the above-noted configuration, but it should be understood that the example embodiment is for purposes of illustration and is not meant to be limiting.
The following description relates to a method 500 for calculating V3D to determine suitable speed limitations for the robot 20 to avoid potential collisions with the human 10 in the workspace 600. For purposes of the description below, V3D, may be initialized as:
V3D = [−∞; −∞; −∞; +∞; +∞; +∞]  (6)
With reference to
At step 506, the analysis module 134 uses the sets of human point cloud HPC and robot point cloud RPC to determine a separation distance between the human 10 and robot 20. For example, in one embodiment, the analysis module 134 selects from the sets of point clouds a combination of a robot point pr and a human point ph to determine the vector between the two points (this process is repeated for all points in the sets as described below):

sp = pr − ph  (7)
Then, the separation distance dp is determined as the length of the vector:

dp = |sp|  (8)
In some embodiments, the separation distance, dp, is compared to a minimum distance threshold to determine whether the human 10 and robot 20 are sufficiently close relative to one another such that the control system 104 should simply stop all robot movement (e.g., by setting velocities in all vectors to 0). In other words, the following equation may be considered:
if dp < (Zd + Zr + C + Sh), then return V3D = [0; 0; 0; 0; 0; 0]; else continue  (9)
The above-noted calculation implies that under a certain distance threshold, the analysis module 134 (or other suitable component of the control system 104) must stop all robot movement due to various factors, such as the uncertainty of the sensors, or a change of the position of the human during the response time of the sensor 102 and the robot system, or for other suitable reasons. Assuming the minimum distance threshold is satisfied based on a position of the human 10 and robot 20, then the method 500 continues.
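For a single pair of robot and human points, equations (7)-(9) can be sketched as follows; the threshold terms are assumed values:

```python
import math

# Sketch of equations (7)-(9) for one robot/human point pair: form the
# separation vector, take its length, and signal a full stop if the length
# falls below the distance threshold. Threshold values are illustrative.
Z_d, Z_r, C, S_h = 0.05, 0.02, 0.10, 0.80  # uncertainty/intrusion terms (m)

def check_pair(p_r, p_h):
    """Return (sp, dp) for a safe pair, or None to signal a full stop."""
    s_p = tuple(r - h for r, h in zip(p_r, p_h))  # equation (7)
    d_p = math.sqrt(sum(c * c for c in s_p))      # equation (8)
    if d_p < (Z_d + Z_r + C + S_h):               # equation (9)
        return None  # caller returns V3D = [0; 0; 0; 0; 0; 0]
    return s_p, d_p

print(check_pair((1.0, 0.0, 0.0), (0.0, 0.0, 0.0)))  # above the threshold
print(check_pair((0.5, 0.0, 0.0), (0.0, 0.0, 0.0)))  # below: full stop
```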
At step 508, the analysis module 134 determines the scalar value of the maximum allowed speed of the robot according to the standard ISO/TS 15066, as a function of dp calculated using equation (8) and of robot parameters Pr (including reaction time, stopping time, stopping performance, etc.) and of sensor parameters Ps (including latency, accuracy, etc.). The maximum speed may be calculated as:
vmax = f(dp, Pr, Ps)  (10)
At step 510, the analysis module 134 calculates the versor (unitary vector) of sp (from equation (7)) as:

ŝp = sp/dp  (11)

and decomposes the maximum allowable speed vmax into the three Cartesian directions as:

vmax·ŝp = [vmax·ŝp,x; vmax·ŝp,y; vmax·ŝp,z]  (12)
The above-referenced value can be interpreted as the speed limitation, decomposed among the Cartesian directions, that is required to ensure that a collision between the robot point pr and the human point ph does not occur.
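The versor calculation of step 510 and the decomposition of vmax into Cartesian components can be sketched as follows; the input values are illustrative:

```python
import math

# Sketch of step 510: compute the versor (unit vector) of the separation
# vector sp, then scale it by the scalar maximum allowable speed to obtain
# per-axis speed limitations. Input values are illustrative assumptions.

def decompose(s_p, v_cap):
    """Decompose a scalar speed cap into Cartesian components along s_p."""
    d_p = math.sqrt(sum(c * c for c in s_p))  # |sp|, equation (8)
    versor = tuple(c / d_p for c in s_p)      # unit vector of sp
    return tuple(v_cap * c for c in versor)   # per-axis limitation

limits = decompose((3.0, 0.0, 4.0), v_cap=1.0)
print(limits)  # components scale with the direction of sp
```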
At step 512, the analysis module 134 compares each component of the decomposed speed limitation from equation (12) against the values currently stored in V3D. At step 514, the components of V3D are updated as necessary. For example, if the result calculated from equation (12) is a lower speed than the currently stored value for one or more component vectors, this indicates that the robot 20 is moving too fast along that vector to maintain a safe distance and avoid a collision with the human 10. In that case, the component vector is updated with the lower speed value to avoid a potential collision between the human 10 and robot 20 and maintain a proper separation distance. If not, then no update is made for the component vector, indicating that the robot 20 may maintain its existing speed along that vector since it would result in no collision between the human 10 and robot 20. The selection of the positive or negative component for each axial direction in V3D is based on the sign of the components of sp.
At step 516, the process is repeated from step 506 to step 514 until all combinations of robot pr and human ph points in the point clouds RPC and HPC have been processed. As described previously, the result of the method 500 is a three-dimensional vector, V3D, containing the most restrictive speed limitations for each Cartesian axis (both positive and negative) as retrieved from the analysis of all combinations of points belonging to the human 10 and robot 20 in the workspace 600. This solution allows the robot 20 to maintain high movement speed in directions that will increase the separation distance between the human 10 and robot 20 while ensuring low speeds or a full stop in Cartesian directions that will decrease the minimum separation distance.
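The overall loop of steps 506 through 516 can be sketched end to end as follows. The speed law f_vmax is a hypothetical stand-in for the standard's function f(dp, Pr, Ps), and the convention for assigning each limitation to the positive or negative slot of V3D (based on the sign of the components of sp) is an assumption, since the text does not fix it explicitly:

```python
import math

# End-to-end sketch of method 500 (steps 506-516) under simplifying
# assumptions. f_vmax is a hypothetical monotone speed law, the stop
# threshold is an assumed lump sum of Zd + Zr + C + Sh, and the point
# clouds are tiny illustrative sets of 3-D points.

INF = float("inf")
STOP_THRESHOLD = 0.3  # Zd + Zr + C + Sh, assumed value (m)

def f_vmax(d_p):
    return 0.5 * d_p  # hypothetical speed law, not the standard's function

def speed_limits(rpc, hpc):
    # V3D = [-Vx, -Vy, -Vz, +Vx, +Vy, +Vz], initialised per equation (6).
    v3d = [-INF, -INF, -INF, INF, INF, INF]
    for p_r in rpc:
        for p_h in hpc:
            s_p = [r - h for r, h in zip(p_r, p_h)]   # equation (7)
            d_p = math.sqrt(sum(c * c for c in s_p))  # equation (8)
            if d_p < STOP_THRESHOLD:                  # equation (9)
                return [0.0] * 6                      # full stop
            # Equations (10)-(12): decompose the scalar cap along sp.
            v = [f_vmax(d_p) * c / d_p for c in s_p]
            for axis in range(3):
                mag = abs(v[axis])
                if s_p[axis] > 0:
                    # Human lies toward -axis: tighten the -axis slot.
                    v3d[axis] = max(v3d[axis], -mag)
                elif s_p[axis] < 0:
                    # Human lies toward +axis: tighten the +axis slot.
                    v3d[3 + axis] = min(v3d[3 + axis], mag)
    return v3d

rpc = [(1.0, 0.0, 0.5)]                    # robot point cloud (one point)
hpc = [(0.0, 0.0, 0.5), (2.0, 1.0, 0.5)]   # human point cloud (two points)
print(speed_limits(rpc, hpc))
```

With the illustrative clouds above, motion is limited along -x (toward the first human point), along +x and +y (toward the second), and remains unlimited along z, where no human point reduces the separation distance.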
At step 518, after all combinations of robot pr and human ph points in the point clouds RPC and HPC have been computed, the resulting speed limitations in V3D are communicated from the analysis module 134 to a robot controller (such as robot controller 302 of
With reference to
As described in the method 500, the objective of computing V3D is to find the maximum allowable robot speed for each of the vectors so as to maintain an adequate separation distance between the human 10 and robot 20 and avoid potential collisions. The end result of this approach is that robot speed is decreased only for vectors where movement of the robot 20 along that vector would decrease the separation distance, dp, between the human 10 and robot 20; in other words, along directions where the robot 20 approaches the human 10. For all other vectors, where the robot movement maintains or increases the separation distance, no change in speed is applied, so that the robot 20 can continue performing its functions without risking injury to the human 10. In
In some embodiments, certain of the steps described in method 500 may be combined, altered, varied, and/or omitted without departing from the principles of the disclosed subject matter. It is intended that subject matter disclosed in portion herein can be combined with the subject matter of one or more of other portions herein as long as such combinations are not mutually exclusive or inoperable. In addition, many variations, enhancements and modifications of the systems and methods described herein are possible.
The terms and descriptions used above are set forth by way of illustration only and are not meant as limitations. Those skilled in the art will recognize that many variations can be made to the details of the above-described embodiments without departing from the underlying principles of the invention.