Example embodiments of the present disclosure generally relate to camera management, and more specifically, to methods, apparatuses, systems and computer readable media for managing a camera system that is deployed in a robot system.
With the development of computer and automatic control technologies, robot systems have been widely used to process various types of objects in the manufacturing industry. For example, a tool may be equipped at a tip of a robot system for cutting, grabbing, and other operations. Typically, the robot system may have a plurality of mechanical arms, each of which may be rotated by a corresponding joint at an end of the arm. A camera system may be deployed in the robot system for monitoring an operation of the robot system. Usually, a field of view of a single camera cannot cover an entire workspace of the robot system; therefore, multiple cameras are provided in the camera system to collect images of various areas in the workspace. Further, these images may be merged for monitoring the robot system. At an initial stage of the robot system, the camera system should be calibrated such that images collected by these cameras may be properly merged for further processing.
There have been proposed several solutions for calibrating the camera system based on a calibrating board. However, with the increasing scale of the workspace, cameras may be distributed across a wide area, and therefore the size of the calibrating board also increases. It is understood that the calibrating board is required to be manufactured at a very high accuracy, and even a slight manufacturing error may cause a huge deviation in the calibration. However, compared with a small calibrating board, it is more difficult to ensure high manufacturing accuracy for a huge calibrating board. Therefore, a more efficient solution for calibrating the camera system is desired.
Example embodiments of the present disclosure provide solutions for managing a camera system.
In a first aspect, example embodiments of the present disclosure provide a method for managing a camera system, the camera system comprising at least a first camera and a second camera. Here, the method comprises: obtaining a first position and a second position for a first object and a second object from the first and second cameras, respectively, the first and second objects being used for calibrating the camera system; obtaining, after a movement of the first and second objects, a third position and a fourth position for the first and second objects from the first and second cameras, respectively, a relative object position between the first and second objects remaining unchanged during the movement; and determining a relative camera position between the first and second cameras based on the first, second, third, and fourth positions. With these embodiments, the first and second cameras in the camera system may be calibrated by the two individual calibrating objects. The two objects may be small in size with high accuracies, and the only requirement is that the relative object position between the two objects remains unchanged during the movement. At this point, the calibrating procedure does not require a huge calibrating board that covers fields of view of all the cameras; instead, the two individual calibrating objects may replace the huge calibrating board as long as the relative object position is fixed. Compared with a huge calibrating board, the two calibrating objects may be small and have high manufacturing accuracy. Therefore, the calibrating procedure may be implemented in a more convenient and effective way.
In some embodiments, determining the relative camera position comprises: generating an equation associated with the relative camera position and the first, second, third, and fourth positions based on a geometry relationship between the first and second objects and the first and second cameras; and determining the relative camera position by solving the equation. As the first, second, third, and fourth positions are easy to detect, the problem of determining the relative camera position may be converted into a problem of solving the equation. Compared with calibrating the camera system by a huge physical calibrating board, the two small calibrating objects with high accuracies provide a more effective way of calibrating based on mathematical operations.
In some embodiments, solving the equation comprises: representing the relative camera position by a transformation matrix including a plurality of unknown parameters; generating a group of equations including the plurality of unknown parameters based on the equation; and determining the plurality of unknown parameters by solving the group of equations. With these embodiments, the relative camera position may be represented by an RT transformation matrix including twelve unknown parameters. Further, based on a mathematical relationship for matrix multiplication, the one equation may be extended to a group of equations associated with the twelve unknown parameters. Further, the twelve unknown parameters may be easily determined based on mathematical operations, and thus the relative camera position may be obtained.
In some embodiments, the first and second objects are connected with a fixed connection such that the relative object position remains unchanged during the movement. In these embodiments, the first and second objects may be in various shapes as long as feature points in these objects may reflect the degrees of freedom (DOF) of the objects. With these embodiments, the first and second objects may be connected with various types of connections as long as the first and second objects may move together and their relative object position remains unchanged. Compared with manufacturing a huge calibrating board with high accuracy, connecting the two individual objects with a fixed connection is much simpler and more convenient.
In some embodiments, the first object is placed within a first field of view of the first camera, and the second object is placed within a second field of view of the second camera. Here, embodiments of the present disclosure do not require a huge calibrating object to cover fields of view of all the cameras. Instead, the calibrating object of the present disclosure may be in a small size such that the manufacturing accuracy for the calibrating object may be ensured in an easier manner.
In some embodiments, obtaining the first position comprises: obtaining a first image for the first object from the first camera; and determining the first position from the first image, the first position representing a relative position between the first object and the first camera. Nowadays, various types of cameras are capable of providing distance measurements. For example, some cameras are equipped with laser devices that may detect the position of the object directly. In another example, the position of the object may be calculated by processing the image of the object. With these embodiments, all inputs for determining the relative camera position may be collected in an effective and convenient way.
In some embodiments, the camera system further comprises a third camera, and the method further comprises: obtaining a fifth position for a third object from the third camera, the third object being used for calibrating the camera system; obtaining, after a movement of the third object together with the first and second objects, a sixth position for the third object from the third camera, a relative object position between the third object and any of the first and second objects remaining unchanged during the movement; and determining a relative camera position between the first and third cameras based on the first, fifth, third, and sixth positions. The above embodiments may be easily extended for managing multiple cameras. Specifically, the number of the calibrating objects may be determined based on the number of the to-be-calibrated cameras. By connecting the third object to the first object or the second object and moving these objects together, all three cameras may be calibrated in a simple and effective way. Therefore, more cameras may be added into the camera system, and an added camera may be calibrated with the other existing cameras in an easy and effective way.
In some embodiments, the method further comprises: calibrating the camera system based on the relative camera position. As the small calibrating objects are easily manufactured with high accuracy, the high manufacturing accuracy may ensure a high accuracy for the relative camera position. Accordingly, the first and second cameras may be calibrated on the basis of the accurate relative camera position.
In some embodiments, the camera system is deployed in a robot system, and the method further comprises: monitoring an operation of the robot system based on the calibrated camera system. The robot system may include multiple robot arms that move at high speeds. In order to increase the accuracy of movements of these robot arms, more cameras may be deployed in the robot system. For example, a new camera may be deployed at a position that is far from other cameras. At this point, the entire camera system may be calibrated by adding a new object to the existing objects; in turn, the accuracy of the robot system may be increased accordingly.
In a second aspect, example embodiments of the present disclosure provide an apparatus for managing a camera system, the camera system comprising at least a first camera and a second camera, the apparatus comprising: a first obtaining unit, being configured to obtain a first position and a second position for a first object and a second object from the first and second cameras, respectively, the first and second objects being used for calibrating the camera system; a second obtaining unit, being configured to obtain, after a movement of the first and second objects, a third position and a fourth position for the first and second objects from the first and second cameras, respectively, a relative object position between the first and second objects remaining unchanged during the movement; and a determining unit, being configured to determine a relative camera position between the first and second cameras based on the first, second, third, and fourth positions.
In some embodiments, the determining unit comprises: a generating unit, being configured to generate an equation associated with the relative camera position and the first, second, third, and fourth positions based on a geometry relationship between the first and second objects and the first and second cameras; and a solving unit, being configured to determine the relative camera position by solving the equation.
In some embodiments, the solving unit comprises: a representing unit, being configured to represent the relative camera position by a transformation matrix including a plurality of unknown parameters; an equation generating unit, being configured to generate a group of equations including the plurality of unknown parameters based on the equation; and a parameter determining unit, being configured to determine the plurality of unknown parameters by solving the group of equations.
In some embodiments, the first and second objects are connected with a fixed connection such that the relative object position remains unchanged during the movement.
In some embodiments, the first object is placed within a first field of view of the first camera and the second object is placed within a second field of view of the second camera.
In some embodiments, the first obtaining unit comprises: an image obtaining unit, being configured to obtain a first image for the first object from the first camera; and a position determining unit, being configured to determine the first position from the first image, the first position representing a relative position between the first object and the first camera.
In some embodiments, the camera system further comprises a third camera, the first obtaining unit being further configured to obtain a fifth position for a third object from the third camera, the third object being used for calibrating the camera system; the second obtaining unit being further configured to obtain, after a movement of the third object together with the first and second objects, a sixth position for the third object from the third camera, a relative object position between the third object and any of the first and second objects remaining unchanged during the movement; and the determining unit being further configured to determine a relative camera position between the first and third cameras based on the first, fifth, third, and sixth positions.
In some embodiments, the apparatus further comprises: a calibrating unit, being configured to calibrate the camera system based on the relative camera position.
In some embodiments, the camera system is deployed in a robot system and the apparatus further comprises: a monitoring unit, being configured to monitor an operation of the robot system based on the calibrated camera system.
In a third aspect, example embodiments of the present disclosure provide a system for managing a camera system. The system comprises: a computer processor coupled to a computer-readable memory unit, the memory unit comprising instructions that, when executed by the computer processor, implement the method for managing a camera system.
In a fourth aspect, example embodiments of the present disclosure provide a computer readable medium having instructions stored thereon, the instructions, when executed on at least one processor, cause the at least one processor to perform the method for managing a camera system.
Throughout the drawings, the same or similar reference symbols are used to indicate the same or similar elements.
Principles of the present disclosure will now be described with reference to several example embodiments shown in the drawings. Though example embodiments of the present disclosure are illustrated in the drawings, it is to be understood that the embodiments are described only to facilitate those skilled in the art in better understanding and thereby achieving the present disclosure, rather than to limit the scope of the disclosure in any manner.
For the sake of description, reference will be made to the accompanying drawings.
A tip of the end arm 144 may be equipped with a tool 130 for processing the target object 150 such as a raw material that is to be shaped by the robot system 100. Here, the tool may include, for example, a cutting tool for shaping the target object 150 into a desired shape. Before normal operations of the robot system 100, the camera system should be calibrated first, such that images collected by the first and second cameras 110 and 120 may be merged for further processing.
There have been proposed solutions for calibrating the camera system of the robot system. In some solutions, a calibrating board is used for calibration, as described below.
Usually, the calibrating board 210 is selected based on a distance between the first and second cameras 110 and 120. The farther the distance, the larger the size of the calibrating board 210. If the first and second cameras 110 and 120 are far from each other, then a huge calibrating board should be selected for the calibrating procedure. However, the calibrating board requires high manufacturing accuracy, and the bigger the calibrating board is, the more difficult it is to manufacture. Therefore, a huge calibrating board is hard to make, and it is difficult to ensure its manufacturing accuracy. Further, the robot system 100 may include multiple camera systems with different camera distances, and therefore multiple calibrating boards with different sizes should be prepared.
In order to at least partially solve the above and other potential problems, a new method for managing a camera system is provided according to embodiments of the present disclosure. In general, according to embodiments of the present disclosure, multiple calibrating objects are provided for the calibrating procedure, as described below.
In this solution, a first object 310 is placed within a field of view of the first camera 110, and a second object 320 is placed within a field of view of the second camera 120. The first and second objects 310 and 320 are connected in a fixed manner such that a relative object position between the two objects remains unchanged when they move together.
With these embodiments, the relative camera position may be determined by two individual calibrating objects. Therefore, the calibrating procedure does not require a huge calibrating board that covers fields of view of all the cameras. Instead, the two calibrating objects may be connected in any way as long as their relative position is fixed. Compared with the huge calibrating board, the first and second objects 310 and 320 may be small in size and have a higher manufacturing accuracy. Therefore, the relative camera position may be determined in a more convenient and effective way.
Reference will be made to a flowchart of a method 300 for managing the camera system according to embodiments of the present disclosure.
Although the above paragraph describes that the first object 310 is placed in the field of view of the first camera 110, it does not exclude a situation in which a portion of or all of the second object 320 is also within the field of view of the first camera 110. Therefore, the sizes of the first and second objects 310 and 320 may be selected in a flexible way. For example, the first and second objects 310 and 320 may be of the same size, of different sizes, of the same shape, or of different shapes. In these embodiments, the only requirement is that the first and second objects 310 and 320 are connected in a fixed manner such that the two objects move together and the relative object position therebetween remains unchanged during the movement.
At the beginning of the method 300, a first position 510 for the first object 310 may be obtained from the first camera 110. In some embodiments, a first image of the first object 310 may be obtained from the first camera 110, and the first position 510 may be determined from the first image, the first position 510 representing a relative position between the first object 310 and the first camera 110.
Nowadays, various types of cameras have functions for distance measurement. For example, some cameras are equipped with laser devices that may detect the position of the object directly. Specifically, a laser beam may be transmitted from a transceiver in the camera to the object, and then the position of the object may be determined based on a time point when the laser beam is transmitted and a time point when a reflected beam returns to the transceiver. In another example, for cameras that do not have a laser device, the position of the object may be calculated by processing the image of the object. For example, pixels for features (such as corners in a cube object) may be identified from the image, and then the position of the object may be determined. With these embodiments, the positions of the first and second objects 310 and 320 may be collected in an effective and convenient way. Having described the determination of the first position 510, other positions may be determined in a similar manner. For example, the second position 520 may be determined from an image for the second object 320 that is captured by the second camera 120.
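As a hedged illustration of the image-processing approach just described (this sketch is not part of the original disclosure; OpenCV, the known 3D layout of the feature points, and all names here are assumptions), the position of an object relative to a camera may be estimated from identified feature pixels as follows:

```python
# A minimal sketch: estimate the object-to-camera pose (a homogeneous
# transformation matrix) from 2D feature pixels and the known 3D positions
# of those features on the object, using OpenCV's perspective-n-point
# solver. Assumes the camera intrinsics (camera_matrix, dist_coeffs) were
# calibrated beforehand.
import cv2
import numpy as np

def object_pose_from_image(object_points, image_points, camera_matrix, dist_coeffs):
    """Return a 4x4 homogeneous matrix describing the object in the camera frame."""
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)   # rotation vector -> 3x3 rotation matrix
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = tvec.ravel()
    return H                     # e.g., the first position 510 for camera 110
```

Any comparable pose-estimation routine (including the laser-based measurement above) would serve equally well; the only output the method 300 needs is one such relative position per object per camera.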
Referring back to the connection between the two calibrating objects, in some embodiments, the first and second objects 310 and 320 are connected with a fixed connection such that the relative object position remains unchanged during the movement.
With these embodiments, the first and second objects 310 and 320 may be connected with various types of connections as long as the two objects may move together and the relative object position remains unchanged. Compared with producing a huge calibrating board with high accuracy, connecting the two individual objects with a fixed connection is much simpler and more convenient.
At a block 420 in the flowchart of the method 300, after a movement of the first and second objects 310 and 320, a third position 530 and a fourth position 540 may be obtained for the first and second objects from the first and second cameras 110 and 120, respectively.
Referring back to a block 430 in the flowchart of the method 300, a relative camera position between the first and second cameras 110 and 120 may be determined based on the first, second, third, and fourth positions 510 to 540. In some embodiments, a geometry relationship between the first and second objects and the first and second cameras leads to the following Equation 1:
$$\left(H^{Cam1}_{Obj1}\right)^{-1} \cdot H^{Cam1}_{Cam2} \cdot H^{Cam2}_{Obj2} = H^{Obj1}_{Obj2} \quad \text{(Equation 1)}$$

where $H^{Cam1}_{Obj1}$ represents a relative position between the first object 310 and the first camera 110, $H^{Cam1}_{Cam2}$ represents a transformation matrix for the relative camera position between the second camera 120 and the first camera 110, $H^{Cam2}_{Obj2}$ represents a relative position between the second object 320 and the second camera 120, and $H^{Obj1}_{Obj2}$ represents a transformation matrix for the relative object position between the second object 320 and the first object 310.
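For clarity, a short derivation of Equation 1 is added here; it uses only the symbols defined above and the standard frame-chaining property of homogeneous transformations:

```latex
% Chaining homogeneous transformations: the pose of Obj2 in the Obj1 frame
% may be reached via the Cam1 and Cam2 frames, and inverting a
% transformation swaps its two frames.
\begin{align*}
H^{Obj1}_{Obj2} &= H^{Obj1}_{Cam1} \cdot H^{Cam1}_{Cam2} \cdot H^{Cam2}_{Obj2}
  && \text{(frame chaining)} \\
H^{Obj1}_{Cam1} &= \left(H^{Cam1}_{Obj1}\right)^{-1}
  && \text{(inversion swaps frames)} \\
\Rightarrow\ H^{Obj1}_{Obj2} &= \left(H^{Cam1}_{Obj1}\right)^{-1}
  \cdot H^{Cam1}_{Cam2} \cdot H^{Cam2}_{Obj2}
  && \text{(Equation 1)}
\end{align*}
```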
It is to be understood that the above Equation 1 holds throughout movements of the first and second objects 310 and 320. Accordingly, the above geometry relationship between the first and second objects and the first and second cameras may be used to generate an equation associated with the relative camera position and the first, second, third, and fourth positions (510, 520, 530, and 540). Specifically, another Equation 2 may be obtained for the positions obtained after the movement.
$$\left(H'^{Cam1}_{Obj1}\right)^{-1} \cdot H^{Cam1}_{Cam2} \cdot H'^{Cam2}_{Obj2} = H^{Obj1}_{Obj2} \quad \text{(Equation 2)}$$

where $H'^{Cam1}_{Obj1}$ represents a relative position between the first object 310 and the first camera 110 after the movement, $H^{Cam1}_{Cam2}$ represents a transformation matrix for the relative camera position between the second camera 120 and the first camera 110, $H'^{Cam2}_{Obj2}$ represents a relative position between the second object 320 and the second camera 120 after the movement, and $H^{Obj1}_{Obj2}$ represents a transformation matrix for the relative object position between the second object 320 and the first object 310.
It is to be noted that the positions of the first and second cameras 110 and 120 are unchanged, and thus the relative camera position $H^{Cam1}_{Cam2}$ has the same value in Equations 1 and 2. Further, the first and second objects 310 and 320 move together, and thus the relative object position $H^{Obj1}_{Obj2}$ has the same value in Equations 1 and 2. Accordingly, the right sides of Equations 1 and 2 are equal, and thus the two equations may be combined into the following Equation 3.
$$\left(H^{Cam1}_{Obj1}\right)^{-1} \cdot H^{Cam1}_{Cam2} \cdot H^{Cam2}_{Obj2} = \left(H'^{Cam1}_{Obj1}\right)^{-1} \cdot H^{Cam1}_{Cam2} \cdot H'^{Cam2}_{Obj2} \quad \text{(Equation 3)}$$
Symbols in Equation 3 have the same meanings as those in the above Equations 1 and 2. In Equation 3, $H^{Cam1}_{Obj1}$, $H^{Cam2}_{Obj2}$, $H'^{Cam1}_{Obj1}$, and $H'^{Cam2}_{Obj2}$ have known values (i.e., the first, second, third, and fourth positions 510 to 540 as determined in the preceding steps), while $H^{Cam1}_{Cam2}$ is the only unknown value. Therefore, the relative camera position may be obtained by solving Equation 3.
In some embodiments, symbols in Equation 3 may be denoted in the form of an RT matrix

$$H = \begin{bmatrix} R & T \\ 0 & 1 \end{bmatrix}$$

where $R$ denotes a 3*3 rotation matrix, and $T$ denotes a 3*1 column vector. As $H^{Cam1}_{Obj1}$, $H^{Cam2}_{Obj2}$, $H'^{Cam1}_{Obj1}$, and $H'^{Cam2}_{Obj2}$ in Equation 3 have known values, the parameters in their respective RT matrices are known. Regarding the only unknown value, $H^{Cam1}_{Cam2}$ may be denoted by an RT matrix including twelve unknown parameters, where $R$ may be denoted by

$$R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix}$$

and $T$ may be denoted by

$$T = \begin{bmatrix} t_{1} \\ t_{2} \\ t_{3} \end{bmatrix}$$

Accordingly, the unknown relative camera position $H^{Cam1}_{Cam2}$ may be represented by a transformation matrix including the twelve unknown parameters $r_{11}$, $r_{12}$, $r_{13}$, $r_{21}$, $r_{22}$, $r_{23}$, $r_{31}$, $r_{32}$, $r_{33}$, $t_{1}$, $t_{2}$, and $t_{3}$ as below:

$$H^{Cam1}_{Cam2} = \begin{bmatrix} r_{11} & r_{12} & r_{13} & t_{1} \\ r_{21} & r_{22} & r_{23} & t_{2} \\ r_{31} & r_{32} & r_{33} & t_{3} \\ 0 & 0 & 0 & 1 \end{bmatrix} \quad \text{(Equation 4)}$$

Further, based on Equations 3 and 4, the following Equation 5 may be determined:

$$\left(H^{Cam1}_{Obj1}\right)^{-1} \cdot \begin{bmatrix} r_{11} & r_{12} & r_{13} & t_{1} \\ r_{21} & r_{22} & r_{23} & t_{2} \\ r_{31} & r_{32} & r_{33} & t_{3} \\ 0 & 0 & 0 & 1 \end{bmatrix} \cdot H^{Cam2}_{Obj2} = \left(H'^{Cam1}_{Obj1}\right)^{-1} \cdot \begin{bmatrix} r_{11} & r_{12} & r_{13} & t_{1} \\ r_{21} & r_{22} & r_{23} & t_{2} \\ r_{31} & r_{32} & r_{33} & t_{3} \\ 0 & 0 & 0 & 1 \end{bmatrix} \cdot H'^{Cam2}_{Obj2} \quad \text{(Equation 5)}$$

By moving the right side of Equation 5 to the left side, Equation 5 may be converted into Equation 6:

$$\left(H^{Cam1}_{Obj1}\right)^{-1} \cdot H^{Cam1}_{Cam2} \cdot H^{Cam2}_{Obj2} - \left(H'^{Cam1}_{Obj1}\right)^{-1} \cdot H^{Cam1}_{Cam2} \cdot H'^{Cam2}_{Obj2} = 0 \quad \text{(Equation 6)}$$

In Equation 6, each of $H^{Cam1}_{Obj1}$, $H^{Cam2}_{Obj2}$, $H'^{Cam1}_{Obj1}$, and $H'^{Cam2}_{Obj2}$ may be represented by an individual RT matrix with 16 (4*4) known parameters. Further, based on the mathematical definition of matrix multiplication, the above Equation 6 may be expanded into a group of equations including the plurality of unknown parameters. Specifically, each of the group of equations may be associated with some of the 12 unknown parameters $r_{11}$, $r_{12}$, $r_{13}$, $r_{21}$, $r_{22}$, $r_{23}$, $r_{31}$, $r_{32}$, $r_{33}$, $t_{1}$, $t_{2}$, and $t_{3}$. As the RT matrix is in the form of a 4*4 matrix which includes 16 parameters, parameters at the same position on both sides of the above Equation 6 should have the same value. Therefore, 16 equations associated with the 12 unknown parameters may be obtained. Writing $A = (H^{Cam1}_{Obj1})^{-1}$, $B = H^{Cam2}_{Obj2}$, $A' = (H'^{Cam1}_{Obj1})^{-1}$, $B' = H'^{Cam2}_{Obj2}$, and $X = H^{Cam1}_{Cam2}$ with entries $x_{kl}$, the 16 equations take the form

$$\sum_{k=1}^{4}\sum_{l=1}^{4}\left(a_{ik}\,b_{lj} - a'_{ik}\,b'_{lj}\right)x_{kl} = 0, \quad i, j = 1, \ldots, 4 \quad \text{(Equation Set 1)}$$

where the bottom row of $X$ is the known constant row $[\,0\;\;0\;\;0\;\;1\,]$.
Next, the plurality of unknown parameters may be determined by solving the above Equation Set 1. Here, Equation Set 1 may be solved by standard mathematical operations, and details will be omitted hereinafter. With these embodiments, the relative camera position may be represented by an RT transformation matrix including twelve unknown parameters. Further, based on a mathematical relationship for matrix multiplication, the one equation may be converted into a group of equations associated with the twelve unknown parameters. Further, the twelve unknown parameters may be easily determined based on mathematical operations.
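As one hedged illustration (this Python sketch, its function name, and the choice of a linear least-squares solve are assumptions, not part of the original disclosure), Equation Set 1 may be assembled from the measured RT matrices and solved with numpy; the sketch also accepts position pairs from several movements, which makes the solution better determined:

```python
# Build and solve Equation Set 1: A0·X·B0 - Ai·X·Bi = 0 for each movement i,
# where A = inv(H^Cam1_Obj1), B = H^Cam2_Obj2, and X = H^Cam1_Cam2 with its
# bottom row fixed to [0, 0, 0, 1]. Assumes numpy; names are illustrative.
import numpy as np

def solve_relative_camera_pose(pose_pairs):
    """pose_pairs: list of (H_cam1_obj1, H_cam2_obj2) 4x4 matrices, one pair
    per placement of the rigidly connected objects (two placements minimum).
    Returns a least-squares estimate of X = H^Cam1_Cam2."""
    mats = [(np.linalg.inv(H1), H2) for H1, H2 in pose_pairs]
    A0, B0 = mats[0]
    rows, rhs = [], []
    for Ai, Bi in mats[1:]:
        for i in range(4):
            for j in range(4):
                # Entry (i, j) of A0·X·B0 - Ai·X·Bi = 0: one linear equation
                # in the 12 unknown entries of the top three rows of X.
                row, const = np.zeros(12), 0.0
                for k in range(4):
                    for l in range(4):
                        c = A0[i, k] * B0[l, j] - Ai[i, k] * Bi[l, j]
                        if k < 3:          # unknown entry x_kl
                            row[4 * k + l] += c
                        elif l == 3:       # known bottom-row entry x_44 = 1
                            const += c
                rows.append(row)
                rhs.append(-const)
    x, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return np.vstack([x.reshape(3, 4), [0.0, 0.0, 0.0, 1.0]])
```

In practice, the rotation block of the recovered matrix may be re-orthonormalized (for example, projected onto the nearest rotation matrix via an SVD) before being used for calibration.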
With the above method 300, an engineer only needs to place the two objects in front of the two cameras and collect two images. Further, the engineer may move the two objects and then collect two more images. Based on the four images captured before and after the movement, the relative camera position may be determined effectively.
Although the first and second objects 310 and 320 are moved only once in the above embodiments, in other embodiments of the present disclosure, the two objects may be moved several times, and then multiple relative camera positions may be obtained. Further, the multiple relative camera positions may be used to calculate the actual relative camera position. For example, an average may be determined over the multiple relative camera positions, and thus the relative camera position may be determined in a more reliable manner.
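As a hedged sketch of this averaging step (assuming scipy is available; averaging the rotation part on the rotation group rather than element-wise is one reasonable design choice, not the only one):

```python
# Average several estimated relative camera positions (4x4 RT matrices)
# into a single, more reliable estimate: rotations via a chordal mean on
# the rotation group, translations via the arithmetic mean.
import numpy as np
from scipy.spatial.transform import Rotation

def average_poses(H_list):
    R_mean = Rotation.from_matrix([H[:3, :3] for H in H_list]).mean()
    H_avg = np.eye(4)
    H_avg[:3, :3] = R_mean.as_matrix()
    H_avg[:3, 3] = np.mean([H[:3, 3] for H in H_list], axis=0)
    return H_avg
```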
In some embodiments, the first and second cameras 110 and 120 may be calibrated based on the relative camera position. As the small calibrating objects are easily manufactured to a higher accuracy, the high manufacturing accuracy may lead to an accurate relative camera position. Accordingly, the first and second cameras may be calibrated on the basis of the accurate relative camera position.
Although the above paragraphs describe the method for calibrating a camera system including two cameras, in some embodiments, the camera system may include more cameras. Hereinafter, reference will be made to an example in which the camera system further comprises a third camera 710, and a third object 720 is provided for calibrating the third camera 710.
In some embodiments, the above method 300 may be applied to any two cameras among the multiple cameras. For example, the method 300 may be applied to the first and third cameras 110 and 710. At this point, a fifth position may be obtained for the third object 720 from the third camera 710. After a movement of the third object 720 together with the first object 310, a sixth position for the third object 720 may be obtained from the third camera. Here, a relative object position between the third object 720 and any of the first and second objects 310 and 320 remains unchanged during the movement. Further, a relative camera position between the first and third cameras 110 and 710 may be determined based on the first, fifth, third, and sixth positions.
In some embodiments, the above method 300 may be applied to all of the multiple cameras. As illustrated above, the first, second, and third objects 310, 320, and 720 may be connected in a fixed manner and moved together, such that a relative camera position between any two of the first, second, and third cameras 110, 120, and 710 may be determined in a similar way.
The above embodiments may be easily extended for managing multiple cameras. Specifically, the number of the calibrating objects may be determined based on the number of the to-be-calibrated cameras. By connecting the third object to the first object or the second object and moving these objects together, all three cameras may be calibrated in a simple and effective way, as shown in the sketch below. By these means, as more cameras are added into the camera system, more objects may be used for calibrating the camera system.
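As a brief sketch of this extension (reusing the hypothetical solve_relative_camera_pose function from the earlier sketch; pairing every camera against the first camera is an assumption made for illustration):

```python
# Pairwise calibration for N cameras: camera k observes its own calibrating
# object, all objects are rigidly connected and move together, and each
# additional camera is calibrated against the first camera.
def calibrate_all(observations):
    """observations: {k: [(H_cam1_obj1, H_camk_objk), ...]} for each
    additional camera k, gathered over the shared movements.
    Returns {k: H^Cam1_Camk}."""
    return {k: solve_relative_camera_pose(pairs)
            for k, pairs in observations.items()}
```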
In some embodiments, the camera system may be deployed in a robot system for monitoring an operation of the robot system. The robot system may include multiple robot arms that move at a high speed. In order to increase the accuracy of movements of these robot arms, more cameras may be deployed in the robot system. For example, a new camera may be deployed at a position that is far from other cameras. At this point, a new object may be connected with existing objects and then all the cameras may be calibrated.
In some embodiments, the method 300 may be implemented in a controller of the robot system. Alternatively, and/or in addition, the method 300 may be implemented in any computing device. As long as the first, second, third, and fourth positions 510 to 540 are inputted into the computing device, the relative camera position may be outputted for calibrating the camera system.
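Because the computing device only needs the measured positions as inputs, the whole computation is easy to check in isolation. The following hedged, self-contained sanity check (reusing the solve_relative_camera_pose sketch above, with synthetic poses standing in for camera measurements) verifies that a known relative camera position is recovered:

```python
# Synthesize a ground-truth H^Cam1_Cam2 and a fixed H^Obj1_Obj2, simulate
# the measured poses at three placements via Equation 1, and confirm that
# the solver recovers the ground truth. Assumes numpy and scipy.
import numpy as np
from scipy.spatial.transform import Rotation

def random_pose(rng):
    H = np.eye(4)
    H[:3, :3] = Rotation.from_rotvec(rng.uniform(-np.pi, np.pi, 3)).as_matrix()
    H[:3, 3] = rng.uniform(-1.0, 1.0, 3)
    return H

rng = np.random.default_rng(0)
X_true = random_pose(rng)   # H^Cam1_Cam2, the value to recover
H_obj = random_pose(rng)    # fixed H^Obj1_Obj2 (the rigid connection)

pose_pairs = []
for _ in range(3):          # three placements = two movements
    H_c1_o1 = random_pose(rng)
    # Rearranging Equation 1: H^Cam2_Obj2 = X^-1 · H^Cam1_Obj1 · H^Obj1_Obj2
    H_c2_o2 = np.linalg.inv(X_true) @ H_c1_o1 @ H_obj
    pose_pairs.append((H_c1_o1, H_c2_o2))

X_est = solve_relative_camera_pose(pose_pairs)
assert np.allclose(X_est, X_true, atol=1e-6)
```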
Having described the detailed steps of the method 300 in the preceding paragraphs, in some embodiments of the present disclosure, the method 300 may be implemented by an apparatus 800 for managing a camera system. The apparatus 800 may comprise a first obtaining unit 810, a second obtaining unit 820, and a determining unit 830, which are configured to perform the obtaining and determining steps of the method 300, respectively.
In some embodiments, the determining unit 830 comprises: a generating unit, being configured to generate an equation associated with the relative camera position and the first, second, third, and fourth positions based on a geometry relationship between the first and second objects and the first and second cameras; and a solving unit, being configured to determine the relative camera position by solving the equation.
In some embodiments, the solving unit comprises: a representing unit, being configured to represent the relative camera position by a transformation matrix including a plurality of unknown parameters; an equation generating unit, being configured to generate a group of equations including the plurality of unknown parameters based on the equation; and a parameter determining unit, being configured to determine the plurality of unknown parameters by solving the group of equations.
In some embodiments, the first and second objects are connected with a fixed connection such that the relative object position remains unchanged during the movement.
In some embodiments, the first object is placed within a first field of view of the first camera and the second object is placed within a second field of view of the second camera.
In some embodiments, the first obtaining unit 810 comprises: an image obtaining unit, being configured to obtain a first image for the first object from the first camera; and a position determining unit, being configured to determine the first position from the first image, the first position representing a relative position between the first object and the first camera.
In some embodiments, the camera system further comprises a third camera, the first obtaining unit 810 being further configured to obtain a fifth position for a third object from the third camera, the third object being used for calibrating the camera system; the second obtaining unit 820 being further configured to obtain, after a movement of the third object together with the first and second objects, a sixth position for the third object from the third camera, a relative object position between the third object and any of the first and second objects remaining unchanged during the movement; and the determining unit 830 being further configured to determine a relative camera position between the first and third cameras based on the first, fifth, third, and sixth positions.
In some embodiments, the apparatus 800 further comprises: a calibrating unit, being configured to calibrate the camera system based on the relative camera position.
In some embodiments, the camera system is deployed in a robot system and the apparatus 800 further comprises: a monitoring unit, being configured to monitor an operation of the robot system based on the calibrated camera system.
In some embodiments of the present disclosure, a system 900 for managing a camera system is provided. The system 900 comprises a computer processor coupled to a computer-readable memory unit, and the memory unit comprises instructions that, when executed by the computer processor, implement the method for managing a camera system as described in the preceding paragraphs.
In some embodiments of the present disclosure, a computer readable medium for managing a camera system is provided. The computer readable medium has instructions stored thereon, and the instructions, when executed on at least one processor, may cause at least one processor to perform the method for managing a camera system as described in the preceding paragraphs, and details will be omitted hereinafter.
Generally, various embodiments of the present disclosure may be implemented in hardware or special purpose circuits, software, logic or any combination thereof. Some aspects may be implemented in hardware, while other aspects may be implemented in firmware or software which may be executed by a controller, microprocessor or other computing device. While various aspects of embodiments of the present disclosure are illustrated and described as block diagrams, flowcharts, or using some other pictorial representation, it will be appreciated that the blocks, apparatus, systems, techniques or methods described herein may be implemented in, as non-limiting examples, hardware, software, firmware, special purpose circuits or logic, general purpose hardware or controller or other computing devices, or some combination thereof.
The present disclosure also provides at least one computer program product tangibly stored on a non-transitory computer readable storage medium. The computer program product includes computer-executable instructions, such as those included in program modules, being executed in a device on a target real or virtual processor, to carry out the process or method as described above.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowcharts and/or block diagrams to be implemented. The program code may execute entirely on a machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
The above program code may be embodied on a machine readable medium, which may be any tangible medium that may contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine readable medium may be a machine readable signal medium or a machine readable storage medium. A machine readable medium may include but not limited to an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of the machine readable storage medium would include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are contained in the above discussions, these should not be construed as limitations on the scope of the present disclosure, but rather as descriptions of features that may be specific to particular embodiments. Certain features that are described in the context of separate embodiments may also be implemented in combination in a single embodiment. On the other hand, various features that are described in the context of a single embodiment may also be implemented in multiple embodiments separately or in any suitable sub-combination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.