1. Technical Field
Embodiments of the present disclosure relate generally to robot control technologies and particularly to a system and method for controlling a robot using human motions.
2. Description of Related Art
Robots are widely employed to replace or assist humans in dangerous, dirty, or dull work, such as assembly and packing, transportation, earth exploration, and mass production of commercial and industrial goods. Robots may execute tasks according to real-time human commands, preset software programs, or principles established with the aid of artificial intelligence (AI) technologies. In a typical robot control method, a dedicated control device remotely controls the robot. However, this method requires operators to be trained in the use of the control device, which is inconvenient and time consuming.
The disclosure, including the accompanying drawings, is illustrated by way of example and not by way of limitation. It should be noted that references to “an” or “one” embodiment in this disclosure are not necessarily to the same embodiment, and such references mean at least one.
In the embodiment, the robot M1 may operate within the vision field of the operator M0, so that the operator M0 can control the robot M1 using appropriate motions according to the actual situation of the robot M1. In other embodiments, if the robot M1 is out of the vision field of the operator M0, the operator M0 may acquire real-time video of the robot M1 using an assistant device, such as a computer, and control the robot M1 according to the video.
The image capturing device 2 may be a digital camera, such as a time of flight (TOF) camera, that is positioned in front of the operator M0 to capture 3D images of the operator M0. In one example, as shown in
The image capturing module 101 captures the 3D images of the operator M0 using the image capturing device 2 in real-time.
The correlation module 102 determines different portions of the operator M0 in one of the 3D images according to the moveable joints of the robot M1, and correlates each of the determined portions with one of the moveable joints. In one example, as shown in
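The correlation described above can be sketched as a simple lookup table. The portion and joint names below are illustrative assumptions for the sketch; the disclosure does not fix particular names.

```python
# Hypothetical mapping between operator body portions and robot joints.
# All names here are illustrative, not taken from the disclosure.
PORTION_TO_JOINT = {
    "left_arm": "joint_left_shoulder",
    "right_arm": "joint_right_shoulder",
    "left_leg": "joint_left_hip",
    "right_leg": "joint_right_hip",
}

def correlate(portions, joints):
    """Pair each determined portion of the operator with a moveable joint
    of the robot, keeping only pairs where both sides are present."""
    return {
        p: PORTION_TO_JOINT[p]
        for p in portions
        if p in PORTION_TO_JOINT and PORTION_TO_JOINT[p] in joints
    }
```

A portion with no corresponding joint (for example, the operator's head on an armed robot) is simply skipped, which mirrors the idea that only joints of the robot M1 drive the correlation.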
The motion data obtaining module 103 obtains motion data of each of the determined portions of the operator M0 from the real-time 3D images of the operator M0. In the embodiment, the motion data may include a movement direction (X-Y-Z coordinates) of each of the determined portions of the operator M0, and a movement distance of each of the determined portions along the movement direction.
In one embodiment, the motion data obtaining module 103 may acquire a current 3D image and a previous 3D image of the operator M0 from the 3D images. In addition, the motion data obtaining module 103 may calculate the motion data of each of the determined portions by comparing position information (e.g., coordinate information) of each of the determined portions in the current 3D image and the previous 3D image. For example, the motion data obtaining module 103 may calculate a movement distance of the portion S1 along the Z-axis direction of
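The frame-to-frame comparison above can be sketched as follows, assuming each determined portion is reduced to a single (x, y, z) coordinate per 3D image; the function name is an assumption for illustration.

```python
import math

def motion_data(prev_pos, curr_pos):
    """Return (movement direction, movement distance) for one portion,
    given its (x, y, z) coordinates in the previous and current 3D image.
    The direction is a unit vector in X-Y-Z coordinates."""
    delta = tuple(c - p for c, p in zip(curr_pos, prev_pos))
    distance = math.sqrt(sum(d * d for d in delta))
    if distance == 0.0:
        # The portion did not move between the two images.
        return (0.0, 0.0, 0.0), 0.0
    direction = tuple(d / distance for d in delta)
    return direction, distance
```

For example, a portion that moves from (0, 0, 0) to (0, 0, 3) yields a movement direction of (0, 0, 1), i.e., purely along the Z-axis, with a movement distance of 3.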
In another embodiment, the motion data obtaining module 103 may input the real-time 3D images of the operator M0 into a software program, which may be middleware such as Open Natural Interaction (OpenNI), analyze the real-time 3D images using the middleware, and obtain the motion data of each of the determined portions from the middleware. OpenNI is middleware that can capture body movements and sounds of a user to allow more natural interaction between the user and computing devices in the context of a natural user interface.
The control module 104 generates a control command according to the motion data of each of the determined portions, and sends the control command to the robot M1 through the network 3, to control each moveable joint of the robot M1 to implement a motion of a determined portion of the operator M0 that is correlated with the moveable joint of the robot M1. In the embodiment, the control command includes the motion data of each of the determined portions of the operator M0. When the robot M1 receives the control command, the robot M1 may control the moveable joints to implement corresponding motions using its own driving system, such as a servomotor.
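The command generation described above can be sketched as packing the per-portion motion data under the correlated joint names and serializing the result for transmission over the network. The command structure and the use of JSON encoding are assumptions for illustration; the disclosure does not specify a wire format.

```python
import json

def build_control_command(motion_by_portion, portion_to_joint):
    """Build a control command mapping each robot joint to the motion data
    of its correlated operator portion. Field names are illustrative."""
    return {
        "type": "joint_motion",
        "joints": {
            portion_to_joint[portion]: {
                "direction": list(direction),
                "distance": distance,
            }
            for portion, (direction, distance) in motion_by_portion.items()
        },
    }

def serialize(command):
    """Encode the command as bytes for sending to the robot over the
    network (JSON over a socket is an assumption, not the disclosure)."""
    return json.dumps(command).encode("utf-8")
```

On the robot side, the driving system (e.g., a servomotor controller) would decode the bytes, look up each joint entry, and actuate the joint along the given direction by the given distance.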
In block S01, the image capturing module 101 captures the 3D images of the operator M0 in real-time using the image capturing device 2.
In block S02, the correlation module 102 determines different portions of the operator M0 in one of the 3D images according to the moveable joints of the robot M1, and correlates each of the determined portions with one of the moveable joints of the robot M1.
In block S03, the motion data obtaining module 103 obtains motion data of each of the determined portions of the operator M0 from the real-time 3D images of the operator M0. In the embodiment, the motion data may include a movement direction of each of the determined portions of the operator M0, and a movement distance of each of the determined portions along the movement direction. Details of obtaining the motion data are provided in paragraphs [0016] and [0017] above.
In block S04, the control module 104 generates a control command according to the motion data of each of the determined portions, and sends the control command to the robot M1 through the network 3, to control each moveable joint of the robot M1 to implement a motion of a determined portion that is correlated with the moveable joint. In the embodiment, the control command includes the motion data of each of the determined portions of the operator M0. When the robot M1 receives the control command, the robot M1 may control the moveable joints to implement corresponding motions using its own driving system.
Although certain embodiments of the present disclosure have been specifically described, the present disclosure is not to be construed as being limited thereto. Various changes or modifications may be made to the present disclosure without departing from the scope and spirit of the present disclosure.
Number | Date | Country | Kind |
---|---|---|---|
100130443 | Aug 2011 | TW | national |