This application claims priority to Taiwan Application Serial Number 105138684, filed Nov. 24, 2016, which is herein incorporated by reference.
The present disclosure relates to an anti-collision system and an anti-collision method. More particularly, the present disclosure relates to an anti-collision system and an anti-collision method applied to an automatic robotic arm.
In general, an automatic robotic arm is a kind of precision machinery composed of rigid bodies and servo motors. When an unexpected collision happens, the operation precision of each axis of the automatic robotic arm is impacted. Further, the unexpected collision may damage the servo motor or other components. The components of an automatic robotic arm are assembled as continuous structures; thus, all of the components need to be replaced at the same time when updating any component of the automatic robotic arm. Besides, the automatic robotic arm with a new servo motor or new components must also pass a strict test after the components are updated. Only when the test is passed can the automatic robotic arm with the new servo motor or new components be returned to work. Therefore, the time and cost of maintaining an automatic robotic arm are higher than those of other precision machinery.
Therefore, efficiently preventing the servo motor from damage helps decrease the maintenance cost of the automatic robotic arm. As such, how to detect whether an unexpected object enters the operation region of the automatic robotic arm while the automatic robotic arm is operating, and how to immediately adjust the operation status of the automatic robotic arm when the unexpected object enters the operation region so as to prevent the servo motor from damage, becomes a problem to be solved in the art.
To address the issues, one aspect of the present disclosure is to provide an anti-collision system for preventing an object from colliding with an automatic robotic arm. The automatic robotic arm comprises a controller. The anti-collision system comprises a first image sensor, a vision processing unit and a processing unit. The first image sensor is configured to capture a first image. The vision processing unit is configured to receive the first image, recognize the object of the first image and estimate an object movement estimation path of the object. The processing unit is coupled to the controller to access an arm movement path, estimate an arm estimation path of the automatic robotic arm, analyze the first image to establish a coordinate system, and determine whether the object will collide with the automatic robotic arm according to the arm estimation path of the automatic robotic arm and the object movement estimation path of the object. The processing unit adjusts an operation status of the automatic robotic arm when the processing unit determines that the object will collide with the automatic robotic arm.
Another aspect of the present disclosure is to provide an anti-collision method for preventing an object from colliding with an automatic robotic arm. The automatic robotic arm comprises a controller. The anti-collision method comprises: capturing a first image by a first image sensor; receiving the first image, recognizing the object of the first image, and estimating an object movement estimation path of the object by a vision processing unit; and accessing an arm movement path, estimating an arm estimation path of the automatic robotic arm, analyzing the first image to establish a coordinate system, and determining whether the object will collide with the automatic robotic arm according to the arm estimation path of the automatic robotic arm and the object movement estimation path of the object by a processing unit coupled to the controller. The processing unit adjusts an operation status of the automatic robotic arm when the processing unit determines that the object will collide with the automatic robotic arm.
Accordingly, the anti-collision system and the anti-collision method use the vision processing unit to recognize the object in the image and to estimate an object movement estimation path of the object. The processing unit can then determine whether the object will collide with the automatic robotic arm according to the arm estimation path of the automatic robotic arm and the object movement estimation path of the object. Besides, if the processing unit determines that an unexpected object enters the operation region while the automatic robotic arm is operating, the processing unit can immediately command the automatic robotic arm to stop moving or to enter an adaptation mode. In the adaptation mode, the rotation angle of the servo motor (that is, the displacement of the arm caused by the applied force or torque) is changed by the external force while the servo motor is not driven by internal electric force. This prevents the automatic robotic arm from suffering stress due to a reversing movement or counterforce. As such, the anti-collision system and the anti-collision method can achieve the effect of preventing the object from colliding with the automatic robotic arm and preventing the servo motor from breaking down.
It is to be understood that both the foregoing general description and the following detailed description are by examples, and are intended to provide further explanation of the disclosure as claimed.
The disclosure can be more fully understood by reading the following detailed description of the embodiment, with reference made to the accompanying drawings as follows:
Reference will now be made in detail to the present embodiments of the disclosure, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.
References are made to
In one embodiment, the anti-collision system 100 includes the image sensor 120 and the embedded system 130. In one embodiment, the embedded system 130 can be an external embedded system, and the external embedded system can be mounted on any part of the automatic robotic arm A1. In one embodiment, the embedded system 130 can be placed on the automatic robotic arm A1. In one embodiment, the embedded system 130 is connected to the controller 140 of the automatic robotic arm A1 by a wire or a wireless communication link, and the embedded system 130 is connected to the image sensor 120 by another wire or another wireless communication link.
In one embodiment, as shown in
In one embodiment, the anti-collision system 100 includes multiple image sensors 120, 121, and the automatic robotic arm A1 includes multiple motors M1, M2. The motors M1, M2 are coupled to the controller 140, and the vision processing unit 132 is coupled to the multiple image sensors 120, 121.
In one embodiment, the image sensor 120 can be mounted on the automatic robotic arm A1. Or, the image sensor 120 can be configured independently at any position from which the automatic robotic arm A1 can be captured in the coordinate system.
In one embodiment, the image sensors 120, 121 can be composed of at least one charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) sensor. The image sensors 120, 121 can be mounted on the automatic robotic arm A1 or separately configured at other positions in the coordinate system. In one embodiment, the processing unit 131 and the controller 140 can be implemented, separately or in combination, by a microcontroller, a microprocessor, a digital signal processor, an application-specific integrated circuit (ASIC), or a logic circuit. In one embodiment, the vision processing unit 132 is used for image analysis, such as image recognition, dynamic object tracking, distance measurement of a physical object, or depth measurement of the environment. In one embodiment, the image sensor 120 can be implemented as a three-dimensional camera, an infrared camera, or another depth camera for obtaining the depth information of the image. In one embodiment, the vision processing unit 132 can be implemented by multiple reduced instruction set computers (RISC), hardware accelerator units, a high-performance image signal processor, or a high-speed peripheral interface.
Next, reference is made to
In one embodiment, as shown in
In one embodiment, as shown in
In another embodiment, as shown in
In one embodiment, as shown in
Next, the following paragraphs describe the steps of the anti-collision method 400. The person skilled in the art can easily understand that the order of the following steps can be adjusted according to practical conditions.
In step 410, the image sensor 120 captures a first image.
In one embodiment, the image sensor 120 captures a region Ra1 of the selective compliance assembly robot arm A1 on an X-Y plane to obtain the first image.
It should be noticed that, for ease of description, the image(s) captured by the image sensor 120 at different time points is/are collectively called the first image in the following statements.
In one embodiment, as shown in
It should be noticed that, for ease of description, the image(s) captured by the image sensor 121 at different time points is/are collectively called the second image in the following statements.
Based on the above, the automatic robotic arm A2 comprises the first arm 110 and the second arm 111 when the automatic robotic arm A2 is the six-degrees-of-freedom robot arm. The image sensor 121 can be mounted on the joint between the first arm 110 and the second arm 111 to capture the operation of the second arm 111, so as to more precisely determine whether the second arm 111 will cause a collision. Besides, the image sensors 120, 121 can separately obtain the first image and the second image, and separately transmit the first image and the second image to the vision processing unit 132.
In step 420, the vision processing unit 132 receives the first image, recognizes the object OBJ in the first image, and estimates an object movement estimation path "a" of the object OBJ.
References are made to
In one embodiment, the vision processing unit 132 can estimate the object movement estimation path "a" of the object OBJ by optical flow. For example, the vision processing unit 132 compares the first captured first image (captured earlier) with the second captured first image (captured later). The vision processing unit 132 estimates that the object movement estimation path "a" of the object OBJ represents a movement to the right side if the position of the object OBJ in the second captured first image is to the right of its position in the first captured first image.
Therefore, the vision processing unit 132 can compare the first images captured at different time points to estimate the object movement estimation path "a" of the object OBJ, and transmit the object movement estimation path "a" of the object OBJ to the processing unit 131.
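As an illustrative sketch (not part of the original disclosure), the frame-to-frame comparison described above can be expressed as a simple displacement estimate; a full optical-flow implementation would operate on pixel fields, and all function and variable names here are hypothetical:

```python
# Hypothetical sketch: estimating an object's movement direction by
# comparing its recognized position in two consecutively captured images.

def estimate_movement_path(first_position, second_position):
    """Return the displacement vector of the object between two frames."""
    dx = second_position[0] - first_position[0]
    dy = second_position[1] - first_position[1]
    return (dx, dy)

def movement_direction(displacement):
    """Classify the dominant horizontal direction of the displacement."""
    dx, _ = displacement
    if dx > 0:
        return "right"
    if dx < 0:
        return "left"
    return "stationary"

# Object recognized at x=100 in the earlier frame and x=140 in the later one:
d = estimate_movement_path((100, 50), (140, 52))
print(movement_direction(d))  # → right
```

In this simplified form, the displacement vector plays the role of the object movement estimation path "a" that is transmitted to the processing unit.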
In one embodiment, when the processing unit 131 has stronger computing capability, the vision processing unit 132 can also transmit the information of the recognized object OBJ to the processing unit 131. Thus, the processing unit 131 can estimate the object movement estimation path "a" according to the positions of the object OBJ in the coordinate system corresponding to multiple time points.
In one embodiment, in the condition that the automatic robotic arm A2 is the six degrees of freedom robot arm (as shown in
In step 430, the processing unit 131 accesses an arm movement path, estimates an arm estimation path "b" of the automatic robotic arm A1, and analyzes the first image to establish a coordinate system.
In one embodiment, the processing unit 131 estimates the arm estimation path “b” of the automatic robotic arm A1 (as shown in
In one embodiment, the anti-collision system 100 includes a storage device for storing the motion control code. The motion control code can be predefined by a user and is used for controlling the operation direction, operation speed, and operation function (e.g., picking or rotating a target object) of the automatic robotic arm A1 at each time point. Therefore, the processing unit 131 can estimate the arm estimation path "b" of the automatic robotic arm A1 by accessing the motion control code stored in the storage device.
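As an illustrative sketch (not part of the original disclosure), predicting an arm path from stored motion commands can be modeled as integrating per-step directions and speeds; the command format and names below are hypothetical:

```python
# Hypothetical sketch: deriving an arm estimation path from motion control
# commands, where each command specifies a direction vector and a speed.

def estimate_arm_path(start, commands, dt=1.0):
    """start: (x, y, z) current coordinate of the arm.
    commands: list of ((dx, dy, dz), speed) pairs, one per time step.
    Returns the predicted coordinate at the end of each time step."""
    x, y, z = start
    path = []
    for (dx, dy, dz), speed in commands:
        x += dx * speed * dt
        y += dy * speed * dt
        z += dz * speed * dt
        path.append((x, y, z))
    return path

# Move along +X at speed 5, then along +Y at speed 2:
path = estimate_arm_path((0.0, 0.0, 0.0), [((1, 0, 0), 5.0), ((0, 1, 0), 2.0)])
print(path)  # → [(5.0, 0.0, 0.0), (5.0, 2.0, 0.0)]
```

The resulting list of coordinates corresponds to the arm estimation path "b" used in the later collision check.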
In one embodiment, the image sensor 120 can continuously capture multiple first images. The processing unit 131 analyzes one of the first images to determine a datum point object. Then, the processing unit 131 configures the datum point object as a center point coordinate and calibrates the center point coordinate according to another first image. In other words, the processing unit 131 can calibrate the center point coordinate according to the multiple first images captured at different time points. As shown in
Therefore, the processing unit 131 can analyze the first image to establish a coordinate system. The coordinate system can be used to determine the relative position of each object (e.g., the automatic robotic arm A1 or the object OBJ) in the first image.
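As an illustrative sketch (not part of the original disclosure), calibrating the center point coordinate from a datum point observed in multiple images can be modeled as averaging its observed positions; the function and names are hypothetical:

```python
# Hypothetical sketch: refining a center-point coordinate from the datum
# point's observed positions in multiple captured images.

def calibrate_center(observations):
    """observations: list of (x, y) positions of the datum point object,
    one per captured image. Returns the averaged center coordinate."""
    n = len(observations)
    cx = sum(p[0] for p in observations) / n
    cy = sum(p[1] for p in observations) / n
    return (cx, cy)

# Datum point observed at slightly different positions over time:
center = calibrate_center([(320.0, 240.0), (321.0, 239.0), (319.0, 241.0)])
print(center)  # → (320.0, 240.0)
```

Averaging is one simple choice; any filtering scheme that reduces per-frame measurement noise would serve the same calibration purpose.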
In one embodiment, after establishing the coordinate system, the processing unit 131 can receive a real-time signal from the controller 140 to obtain the current coordinate position of the first arm 110. Then, the processing unit 131 can estimate the arm estimation path "b" according to the coordinate position and the motion control code of the first arm 110.
In one embodiment, as shown in
In one embodiment, as shown in
In one embodiment, the order of the step 420 and the step 430 can be exchanged.
In step 440, the processing unit 131 determines whether the object OBJ will collide with the automatic robotic arm A1 according to the arm estimation path "b" of the automatic robotic arm A1 and the object movement estimation path "a" of the object OBJ. If the processing unit 131 determines that the object OBJ will collide with the automatic robotic arm A1, step 450 is performed. If the processing unit 131 determines that the object OBJ will not collide with the automatic robotic arm A1, step 410 is performed.
In one embodiment, the processing unit 131 determines whether the arm estimation path "b" of the automatic robotic arm A1 and the object movement estimation path "a" of the object OBJ overlap at a specific time point. The processing unit 131 determines that the object OBJ will collide with the automatic robotic arm A1 if the processing unit 131 determines that the arm estimation path "b" of the automatic robotic arm A1 and the object movement estimation path "a" of the object OBJ overlap at the specific time point.
For example, the processing unit 131 estimates that the position of the first arm 110 of the automatic robotic arm A1 is at coordinate (10, 20, 30) at 10:00 A.M. according to the arm estimation path "b". And, the processing unit 131 estimates that the position of the object OBJ is also at coordinate (10, 20, 30) at 10:00 A.M. according to the object movement estimation path "a". Therefore, the processing unit 131 determines that the paths of the automatic robotic arm A1 and the object OBJ will overlap at 10:00 A.M., and thus determines that the object OBJ will collide with the automatic robotic arm A1.
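As an illustrative sketch (not part of the original disclosure), the overlap test above can be expressed as comparing the two estimated paths time step by time step; the data layout and names are hypothetical, and a distance threshold stands in for exact coordinate equality:

```python
# Hypothetical sketch: predicting a collision when the arm's and the
# object's estimated positions coincide at the same time step.

def will_collide(arm_path, object_path, threshold=0.0):
    """arm_path / object_path: dicts mapping a time step to an (x, y, z)
    coordinate. Returns the first time step at which the two estimated
    positions coincide (within `threshold`), or None if no overlap."""
    for t in sorted(set(arm_path) & set(object_path)):
        ax, ay, az = arm_path[t]
        ox, oy, oz = object_path[t]
        dist = ((ax - ox) ** 2 + (ay - oy) ** 2 + (az - oz) ** 2) ** 0.5
        if dist <= threshold:
            return t
    return None

arm = {0: (0, 0, 0), 1: (5, 10, 15), 2: (10, 20, 30)}
obj = {0: (30, 20, 10), 1: (20, 20, 20), 2: (10, 20, 30)}
print(will_collide(arm, obj))  # → 2
```

In practice a nonzero `threshold` would account for the physical extent of the arm and the object rather than treating both as points.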
In one embodiment, when the automatic robotic arm A2 is a six degrees of freedom robot arm (as shown in
In step 450, the processing unit 131 adjusts an operation status of the automatic robotic arm A1.
In one embodiment, the processing unit 131 adjusts the operation status of the automatic robotic arm A1 to an adaptation mode (as shown in
In one embodiment, the processing unit 131 further determines whether a collision period is longer than a safety threshold (e.g., determining whether the collision period is longer than 2 seconds) when the processing unit 131 determines that the arm estimation path "b" of the automatic robotic arm A1 and the object movement estimation path "a" of the object OBJ overlap at a specific time point. If the processing unit 131 determines that the collision period is longer than the safety threshold, the processing unit 131 changes the current movement direction of the automatic robotic arm A1 (e.g., the processing unit 131 instructs the controller 140 to control the automatic robotic arm A1 to move to the opposite side). If the processing unit 131 determines that the collision period is not longer than the safety threshold, the processing unit 131 decreases the current movement speed of the automatic robotic arm A1.
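As an illustrative sketch (not part of the original disclosure), the decision between reversing direction and slowing down can be expressed as a simple threshold comparison; the names and the returned action labels are hypothetical:

```python
# Hypothetical sketch: choosing how to adjust the arm's operation status
# based on how long the predicted collision overlap lasts.

def adjust_operation(collision_period, safety_threshold=2.0):
    """collision_period / safety_threshold: seconds.
    A long predicted overlap triggers a direction reversal;
    a short one only reduces the movement speed."""
    if collision_period > safety_threshold:
        return "reverse_direction"
    return "decrease_speed"

print(adjust_operation(3.0))  # → reverse_direction
print(adjust_operation(1.5))  # → decrease_speed
```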
In this step, other operation methods of the automatic robotic arm A2 in
Accordingly, the anti-collision system and the anti-collision method use the vision processing unit to recognize the object in the image and to estimate an object movement estimation path of the object. The processing unit can then determine whether the object will collide with the automatic robotic arm according to the arm estimation path of the automatic robotic arm and the object movement estimation path of the object. Besides, if the processing unit determines that an unexpected object enters the operation region while the automatic robotic arm is operating, the processing unit can immediately command the automatic robotic arm to stop moving or to enter the adaptation mode. This prevents the automatic robotic arm from suffering stress due to a reversing movement or counterforce. As such, the anti-collision system and the anti-collision method can achieve the effect of preventing the object from colliding with the automatic robotic arm and preventing the servo motor from breaking down.
Although the present disclosure has been described in considerable detail with reference to certain embodiments thereof, other embodiments are possible. Therefore, the spirit and scope of the appended claims should not be limited to the description of the embodiments contained herein.
It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the present disclosure without departing from the scope or spirit of the disclosure. In view of the foregoing, it is intended that the present disclosure cover modifications and variations of this disclosure provided they fall within the scope of the following claims.
Number | Date | Country | Kind |
---|---|---|---|
105138684 | Nov 2016 | TW | national |