The present disclosure relates to systems and methods for measuring an orientation of an object, and in particular to a system and method for photogrammetrically measuring an orientation of an object relative to another object using a camera disposed on each object.
Photogrammetry utilizes measurements extracted from images of optical markers (“targets”) acquired from one or more sensors (e.g. cameras) to produce three-dimensional information about the relationship between the targets and the sensor(s). One such application of this technique is to measure the orientation (position and rotation) of one rigid object relative to another rigid object, where one such object might be the ground.
Standard photogrammetry methods accomplish this by use of markers on the object of interest and the use of two sensors mounted on a nearby rigid object. The position of the object of interest relative to the first sensor and the second sensor is photogrammetrically determined using measurements from each respective sensor. This creates a photogrammetry bundle comprising a system of non-linear equations that can be solved (for example, by least squares best-fit techniques) to compute the orientation of the object of interest relative to the nearby rigid object that the sensors are mounted on.
In some situations, a high level of accuracy in such measurements is desired, with rotational accuracy of particular importance. This requires the use of more cameras, cameras with lower measurement uncertainties, or both. Such solutions are costly. What is needed is a system and method for economically meeting measurement accuracy requirements, particularly with respect to rotational motion between two objects.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
To address the requirements described above, this document discloses a system and method for photogrammetrically determining a six degree of freedom spatial relationship between a first object and a second object. In one embodiment, the method comprises photogrammetrically determining a first orientation of the first object relative to the second object, photogrammetrically determining a second orientation of the second object relative to the first object, and determining the six degree of freedom spatial relationship between the first object and the second object from the photogrammetrically determined first orientation of the first object relative to the second object and the photogrammetrically determined second orientation of the second object relative to the first object.
Another embodiment is evidenced by a system for photogrammetrically determining a six degree of freedom spatial relationship between a first object and a second object. In this embodiment, the system comprises a first camera, mounted on the first object, for photogrammetrically determining a first orientation of the first object relative to the second object; a second camera, mounted on the second object, for photogrammetrically determining a second orientation of the second object relative to the first object; and a photogrammetry bundle adjustment module, communicatively coupled to the first camera and the second camera, for determining the six degree of freedom spatial relationship between the first object and the second object from the photogrammetrically determined first orientation of the first object relative to the second object and the photogrammetrically determined second orientation of the second object relative to the first object. In one embodiment, the photogrammetry bundle adjustment module is a processor and a communicatively coupled memory storing processor instructions for performing the foregoing photogrammetry operations.
Still another embodiment is evidenced by an apparatus for photogrammetrically determining a six degree of freedom spatial relationship between a first object and a second object, comprising: means for photogrammetrically determining a first orientation of the first object relative to the second object; means for photogrammetrically determining a second orientation of the second object relative to the first object; and means for determining the six degree of freedom spatial relationship between the first object and the second object from the photogrammetrically determined first orientation of the first object relative to the second object and the photogrammetrically determined second orientation of the second object relative to the first object.
The features, functions, and advantages that have been discussed can be achieved independently in various embodiments of the present invention or may be combined in yet other embodiments, further details of which can be seen with reference to the following description and drawings.
Referring now to the drawings in which like reference numbers represent corresponding parts throughout:
In the following description, reference is made to the accompanying drawings which form a part hereof, and in which are shown, by way of illustration, several embodiments. It is understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present disclosure.
This disclosure presents a system and method by which the orientation of one object (Object A) relative to another object (Object B) is determined using at least one sensor mounted on each object (one on Object A to take measurements of Object B and one on Object B to take measurements of Object A). Using this system and method, one or more sensors mounted on Object A (which may be stationary or in motion) view targets mounted on Object B (which also may be stationary or in motion), and one or more sensors mounted on Object B view targets mounted on Object A. Data from all sensors on both objects are combined to generate measurements that are as much as five times more accurate than standard methods which utilize the same quantity and type of sensors, thus permitting either greater measurement accuracy, the use of lower accuracy sensors, or both.
The first camera 104-1 and second camera 104-2 are used to sense the location of a number of targets 106A-106N (alternatively collectively known hereinafter as target(s) 106) mounted at target coordinates (T) on an exterior surface of rigid object B 102B. These locations are sensed in terms of two dimensional (2D) measurements made by the first camera 104-1 and the second camera 104-2 (M1 and M2, respectively). These measurements, as well as the orientation of the first camera 104-1 relative to rigid object A 102A (C1A) and of the second camera 104-2 relative to rigid object A 102A (C2A) (both of which are typically measured or determined in advance) and the target 106 locations relative to rigid object B 102B (TB), are used to perform a photogrammetry computation.
Similarly, the second camera 104-2 (CAM 2) generates two dimensional measurements corresponding to the sensed location (M2) of the targets 106 on the exterior surface of the rigid object B 102B. The orientation of the first camera 104-1 relative to rigid object A (C1A), the orientation of the second camera 104-2 relative to rigid object A (C2A), and the target 106 locations relative to rigid object B (TB) are provided to a conventional photogrammetry bundle adjustment module (PBAM) 202. The PBAM 202 accepts a set of two dimensional images depicting a number of target locations on an object from different viewpoints and simultaneously refines the coordinates defining the target locations in three dimensions according to optimality criteria. This amounts to an optimization problem on the three dimensional relationship between rigid object A and rigid object B, as well as on viewing parameters (i.e., camera pose, and optionally intrinsic calibration and radial distortion), to obtain the orientation of one rigid object with respect to the other rigid object that is optimal under certain assumptions regarding the noise and image errors pertaining to the observed image features. If the image error is normally distributed about a zero mean, the bundle adjustment is an application of a maximum likelihood estimator (MLE) to a system of non-linear equations. In the application and parlance described above, the PBAM 202 solves the system of non-linear equations below:
C1A*M1≅BA*TB Equation (1)
C2A*M2≅BA*TB Equation (2)
The result of the solution to a system of simultaneous non-linear equations of Equation (1) and Equation (2) is the orientation of object B 102B relative to object A 102A (BA).
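For illustration only, the computation described above can be sketched in a few lines of Python. The listing below is a minimal, self-contained example of solving Equations (1) and (2) for BA by a least-squares best fit. The pinhole camera model, the rotation-vector pose parametrization, the SciPy solver, and all numeric values are assumptions made for this sketch and are not mandated by the disclosure; camera poses are expressed here as object-to-camera transforms (the inverses of the C1A and C2A orientations defined above).

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def transform(pose, pts):
    # pose = 3 rotation-vector parameters followed by 3 translation parameters
    return pts @ Rotation.from_rotvec(pose[:3]).as_matrix().T + pose[3:]

def project(pts_cam, f=1000.0):
    # ideal pinhole projection onto the image plane
    return f * pts_cam[:, :2] / pts_cam[:, 2:3]

# Synthetic target coordinates TB on object B, and two camera poses on
# object A (expressed as object-A-to-camera transforms)
rng = np.random.default_rng(0)
TB = rng.uniform(-0.5, 0.5, (12, 3))
cam1 = np.array([0.0, 0.0, 0.0, -0.3, 0.0, 2.0])
cam2 = np.array([0.0, 0.2, 0.0, 0.3, 0.0, 2.0])
BA_true = np.array([0.02, -0.05, 0.01, 0.1, 0.2, 5.0])

# Simulated 2D measurements M1 and M2 with small Gaussian image noise
M1 = project(transform(cam1, transform(BA_true, TB))) + rng.normal(0, 0.1, (12, 2))
M2 = project(transform(cam2, transform(BA_true, TB))) + rng.normal(0, 0.1, (12, 2))

def residuals(ba):
    pts_A = transform(ba, TB)  # BA*TB: target locations in the object A frame
    return np.concatenate([
        (project(transform(cam1, pts_A)) - M1).ravel(),
        (project(transform(cam2, pts_A)) - M2).ravel(),
    ])

# Least-squares best fit of the six-DOF pose BA per Equations (1) and (2)
BA = least_squares(residuals, np.array([0.0, 0.0, 0.0, 0.0, 0.0, 4.0])).x
print("estimated BA (rotation vector, translation):", BA)
```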
As described above, while this technique provides a satisfactory result, increased accuracy in the rotational aspects of the orientation requires either the use of more cameras 104 or cameras 104 with lower measurement uncertainties, for example, cameras with higher resolution sensors or more robust signal processing. Either solution adds to the cost of obtaining the six degree-of-freedom (DOF) orientation of rigid object B 102B relative to rigid object A 102A.
The systems and methods described below use one or more cameras mounted on each of two objects (where one of the objects might be the ground). The accuracy of measurements of the relative rotation between the objects is much better using this technique than if the same number of cameras were located on just one of the objects. Thus, a desired balance of improved rotational measurement accuracy and lower cost is achieved.
In this system, the first camera 104-1 (CAM 1) is mounted on rigid object A 102A and generates two dimensional measurements (M1) of the targets 106 mounted on the exterior surface of rigid object B 102B. Similarly, the second camera 104-2 (CAM 2) is mounted on rigid object B 102B and generates two dimensional measurements (M2) of a plurality of targets 108A-108N (hereinafter target(s) 108) mounted on the exterior surface of rigid object A 102A. These measurements yield Equation (3) and Equation (4):
C1A*M1≅BA*TB Equation (3)
C2B*M2≅AB*TA Equation (4)
While separate PBAMs 502A, 502B are illustrated, these operations may be performed by the same PBAM (hereinafter referred to as PBAM 502).
Because AB is the inverse of BA, premultiplying both sides of Equation (3) by AB yields:
AB*C1A*M1≅TB Equation (5)
thus resulting in the following system of non-linear equations:
C2B*M2≅AB*TA Equation (4)
AB*C1A*M1≅TB Equation (5)
wherein TA comprises the locations of the plurality of first object targets mounted on the exterior surface of the first object relative to the first object, and TB comprises the locations of the plurality of second object targets mounted on the exterior surface of the second object relative to the second object.
This system of non-linear equations is then solved, for example, using a least-squares best-fit, to compute AB, the orientation of rigid object A 102A relative to rigid object B 102B.
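The bidirectional solve can be sketched in the same illustrative Python style as the earlier listing. The sketch below forms one residual from Equation (4) and one from Equation (5) (the latter in its equivalent form C1A*M1≅BA*TB of Equation (3)) and fits the single unknown pose AB; the helper names, camera placements, and synthetic data are assumptions for the example only.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def transform(pose, pts):
    return pts @ Rotation.from_rotvec(pose[:3]).as_matrix().T + pose[3:]

def invert(pose):
    # inverse of a 6-DOF pose: p = R^-1 p' - R^-1 t
    return np.concatenate([-pose[:3],
                           -Rotation.from_rotvec(-pose[:3]).apply(pose[3:])])

def project(pts_cam, f=1000.0):
    return f * pts_cam[:, :2] / pts_cam[:, 2:3]

rng = np.random.default_rng(1)
TA = rng.uniform(-0.5, 0.5, (12, 3))                  # targets 108 on object A
TB = rng.uniform(-0.5, 0.5, (12, 3))                  # targets 106 on object B
cam1_A = np.array([0.0, np.pi, 0.0, 0.0, 0.0, 0.0])   # CAM 1 on A, facing B
cam2_B = np.array([0.0, 0.0, 0.0, 0.0, 0.0, 0.0])     # CAM 2 on B, facing A
AB_true = np.array([0.01, -0.03, 0.02, 0.2, -0.1, 5.0])

# Simulated 2D measurements from each object's camera, with image noise
M1 = project(transform(cam1_A, transform(invert(AB_true), TB))) + rng.normal(0, 0.1, (12, 2))
M2 = project(transform(cam2_B, transform(AB_true, TA))) + rng.normal(0, 0.1, (12, 2))

def residuals(ab):
    # Equation (5): AB*C1A*M1 ≅ TB, formed equivalently as C1A*M1 ≅ BA*TB
    r1 = project(transform(cam1_A, transform(invert(ab), TB))) - M1
    # Equation (4): C2B*M2 ≅ AB*TA
    r2 = project(transform(cam2_B, transform(ab, TA))) - M2
    return np.concatenate([r1.ravel(), r2.ravel()])

AB = least_squares(residuals, np.array([0.0, 0.0, 0.0, 0.0, 0.0, 4.0])).x
print("estimated AB:", AB)   # BA is then invert(AB), if desired
```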
Hence, in this embodiment, the six degree of freedom spatial relationship between the first rigid object A 102A and the second rigid object B 102B is computed from the computed orientation of the first rigid object A 102A relative to the second rigid object B 102B (AB), the inverse of that computed orientation (AB−1), the sensed location (M2) of each of the plurality of first object targets 108A-108N mounted on the exterior surface of the first rigid object A 102A facing the second camera CAM 2 104-2, the orientation of the second camera CAM 2 104-2 relative to the second rigid object B 102B (C2B), the sensed location (M1) of each of the plurality of second object targets 106A-106N on the exterior surface of the second rigid object B 102B facing the first camera (CAM 1) 104-1, and the orientation of the first camera (CAM 1) 104-1 relative to the first rigid object A 102A (C1A).
AB can be inverted to obtain the orientation of rigid object B 102B relative to rigid object A 102A (BA), if desired. Alternatively, Equation (4) may be expressed as:
BA*C2B*M2≅TA Equation (6)
resulting in the following system of non-linear equations:
BA*C2B*M2≅TA Equation (6)
C1A*M1≅BA*TB Equation (3)
that are solved with a least-squares best-fit to compute BA.
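If poses are represented as 4x4 homogeneous transforms (one common representation; the disclosure does not fix a particular one), the inversion used above to move between AB and BA reduces to a closed form, as in this hypothetical helper:

```python
import numpy as np

def invert_pose(T):
    """Closed-form inverse of a 4x4 rigid transform: [R t]^-1 = [R.T  -R.T@t]."""
    Tinv = np.eye(4)
    Tinv[:3, :3] = T[:3, :3].T
    Tinv[:3, 3] = -T[:3, :3].T @ T[:3, 3]
    return Tinv

# e.g., BA = invert_pose(AB) recovers object B's orientation relative to
# object A from the fitted AB
```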
The foregoing principles can be used with additional sensors (e.g. cameras) mounted on either the same or other rigid objects. This can provide additional measurement accuracy, or permit the user to obtain equivalent measurement accuracies with lower resolution cameras. Further, these principles can be extended to situations wherein the orientations of multiple objects are determined.
Noting the following definitions:
C1A=orientation of CAM 1 704-1 relative to rigid object A 102A;
C2A=orientation of CAM 2 704-2 relative to rigid object A 102A;
C3B=orientation of CAM 3 704-3 relative to rigid object B 102B;
C4B=orientation of CAM 4 704-4 relative to rigid object B 102B;
C5C=orientation of CAM 5 704-5 relative to rigid object C 102C;
C6C=orientation of CAM 6 704-6 relative to rigid object C 102C;
M1B=CAM 1 704-1 measurement of targets 106A-106N on object B;
M2C=CAM 2 704-2 measurement of targets 110A-110N on object C;
M3A=CAM 3 704-3 measurement of targets 108A-108N on object A;
M4C=CAM 4 704-4 measurement of targets 110A-110N on object C;
M5B=CAM 5 704-5 measurement of targets 106A-106N on object B;
M6A=CAM 6 704-6 measurement of targets 108A-108N on object A;
TA=object A target locations 108A-108N relative to object A 102A;
TB=object B target locations 106A-106N relative to object B 102B;
TC=object C target locations 110A-110N relative to object C 102C;
BA=AB−1=object B 102B orientation relative to object A 102A;
AB=BA−1=object A 102A orientation relative to object B 102B;
CA=AC−1=object C 102C orientation relative to object A 102A;
AC=CA−1=object A 102A orientation relative to object C 102C;
CB=BC−1=object C 102C orientation relative to object B 102B; and
BC=CB−1=object B 102B orientation relative to object C 102C.
Note that the orientations CB and BC can be expressed as compositions of the other orientations:
CB=(AB*CA) Equation (7)
BC=(AC*BA) Equation (8)
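Under the same assumed 4x4 homogeneous-transform representation used in the earlier sketch, Equations (7) and (8) are simple matrix products, which the following illustrative snippet verifies numerically for randomly generated poses:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def pose(rotvec, t):
    # build a 4x4 rigid transform from a rotation vector and a translation
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(rotvec).as_matrix()
    T[:3, 3] = t
    return T

rng = np.random.default_rng(2)
AB = pose(rng.normal(0, 0.1, 3), rng.normal(0, 1, 3))  # maps A coordinates to B
CA = pose(rng.normal(0, 0.1, 3), rng.normal(0, 1, 3))  # maps C coordinates to A

CB = AB @ CA                                  # Equation (7): CB = AB*CA
BC = np.linalg.inv(CA) @ np.linalg.inv(AB)    # Equation (8): BC = AC*BA
assert np.allclose(BC, np.linalg.inv(CB))     # consistent with BC = CB^-1
```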
Combining all equations from all cameras results in Equations (9)-(14):
C1A*M1B≅BA*TB Equation (9)
C2A*M2C≅CA*TC Equation (10)
C3B*M3A≅AB*TA Equation (11)
C4B*M4C≅CB*TC Equation (12)
C5C*M5B≅BC*TB Equation (13)
C6C*M6A≅AC*TA Equation (14)
Applying the matrix substitutions of Equations (7) and (8) and the inverse relationships above results in a set of six simultaneous non-linear equations that can be solved for BA and CA:
C1A*M1B≅BA*TB Equation (9)
C2A*M2C≅CA*TC Equation (10)
BA*C3B*M3A≅TA Equation (15)
BA*C4B*M4C≅CA*TC Equation (16)
CA*C5C*M5B≅BA*TB Equation (17)
CA*C6C*M6A≅TA Equation (18)
Accordingly, a six degree-of-freedom spatial relationship can be determined between any of the rigid objects 102 with respect to any of the other rigid objects 102 using the camera measurements.
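As an illustrative sketch of the foregoing, the following Python listing jointly estimates the two unknown poses BA and CA from all six camera measurements by stacking the residuals of Equations (9) through (14), with CB and BC expanded per Equations (7) and (8), which is precisely the substitution yielding Equations (15) through (18). The camera layout, noise level, and helper names are invented for the example:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def mat(p):
    """6-DOF pose (rotation vector + translation) as a 4x4 rigid transform."""
    T = np.eye(4)
    T[:3, :3] = Rotation.from_rotvec(p[:3]).as_matrix()
    T[:3, 3] = p[3:]
    return T

def apply(T, pts):
    return pts @ T[:3, :3].T + T[:3, 3]

def project(pts, f=1000.0):
    return f * pts[:, :2] / pts[:, 2:3]

rng = np.random.default_rng(3)
TA, TB, TC = (rng.uniform(-0.5, 0.5, (12, 3)) for _ in range(3))

# Assumed layout: object B at z = 6 and object C at z = 12 along object A's
# z axis; "flip" turns a camera around to look in the -z direction.
flip = mat(np.array([0.0, np.pi, 0.0, 0.0, 0.0, 0.0]))
c1, c2, c4 = np.eye(4), np.eye(4), np.eye(4)   # CAMs 1, 2 (on A) and 4 (on B)
c3, c5, c6 = flip, flip, flip                  # CAMs 3 (on B), 5 and 6 (on C)

def predict(x):
    """Predicted 2D measurements for all six cameras per Equations (9)-(14),
    with CB and BC expanded so that BA and CA are the only unknowns."""
    BA, CA = mat(x[:6]), mat(x[6:])
    AB, AC = np.linalg.inv(BA), np.linalg.inv(CA)
    views = [(c1, BA, TB),        # Eq. (9):  CAM 1 on A views targets on B
             (c2, CA, TC),        # Eq. (10): CAM 2 on A views targets on C
             (c3, AB, TA),        # Eq. (11): CAM 3 on B views targets on A
             (c4, AB @ CA, TC),   # Eq. (12): CB = AB*CA
             (c5, AC @ BA, TB),   # Eq. (13): BC = AC*BA
             (c6, AC, TA)]        # Eq. (14): CAM 6 on C views targets on A
    return [project(apply(c, apply(P, T))) for c, P, T in views]

x_true = np.array([0.02, -0.04, 0.01, 0.0, 0.1, 6.0,     # true BA
                   -0.03, 0.02, 0.0, 0.1, 0.0, 12.0])    # true CA
meas = [m + rng.normal(0, 0.1, m.shape) for m in predict(x_true)]

def residuals(x):
    return np.concatenate([(p - m).ravel() for p, m in zip(predict(x), meas)])

x0 = np.array([0, 0, 0, 0, 0, 5.0, 0, 0, 0, 0, 0, 10.0])
sol = least_squares(residuals, x0)
print("BA:", sol.x[:6], "\nCA:", sol.x[6:])
```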
Generally, the computer 802 operates under control of an operating system 808 stored in the memory 806, and interfaces with the user to accept inputs and commands and to present results through a graphical user interface (GUI) module 818A. Although depicted as a separate module, the instructions performing the GUI functions can be resident or distributed in the operating system 808, the computer program 810, or implemented with special purpose memory and processors. The computer 802 also implements a compiler 812 which allows an application program 810 written in a programming language such as COBOL, C++, FORTRAN, or other language to be translated into processor 804 readable code. After compilation, the application 810 accesses and manipulates data stored in the memory 806 of the computer 802 using the relationships and logic that were generated using the compiler 812. The computer 802 also optionally comprises an external communication device such as a modem, satellite link, Ethernet card, or other device for communicating with other computers.
In one embodiment, instructions implementing the operating system 808, the computer program 810, and the compiler 812 are tangibly embodied in a computer-readable medium, e.g., data storage device 820, which could include one or more fixed or removable data storage devices, such as a zip drive, floppy disc drive 824, hard drive, CD-ROM drive, tape drive, etc. Further, the operating system 808 and the computer program 810 comprise instructions which, when read and executed by the computer 802, cause the computer 802 to perform the operations herein described. The computer program 810 and/or operating instructions may also be tangibly embodied in memory 806 and/or data communications devices 830, thereby making a computer program product or article of manufacture. As such, the terms "article of manufacture," "program storage device" and "computer program product" as used herein are intended to encompass a computer program accessible from any computer readable device or media.
Those skilled in the art will recognize many modifications may be made to this configuration without departing from the scope of the present disclosure. For example, those skilled in the art will recognize that any combination of the above components, or any number of different components, peripherals, and other devices, may be used.
This concludes the description of the preferred embodiments of the present disclosure.
The foregoing description of the preferred embodiment has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise form disclosed. Many modifications and variations are possible in light of the above teaching. It is intended that the scope of rights be limited not by this detailed description, but rather by the claims appended hereto.