The present disclosure relates to metrology devices, such as, for example, a laser tracker, and more particularly to a laser tracker that automatically identifies each of a plurality of retroreflector targets placed on an object using one or more locator cameras associated with (e.g., as part of) the laser tracker.
There is a class of instruments known as a laser tracker that measures the coordinates of a point by sending a laser beam to a retroreflector target in contact with the point. The instrument determines the coordinates of the point by measuring the distance and the two angles to the target. The distance is measured with a distance-measuring device such as an absolute distance meter or an interferometer. The angles are measured with an angle-measuring device such as an angular encoder. A gimbaled beam-steering mechanism within the instrument directs the laser beam to the point of interest.
The laser tracker is a particular type of coordinate-measuring device that tracks the retroreflector target with one or more laser beams it emits. There is another category of instruments known as total stations or tachymeters that may measure a retroreflector or a point on a diffusely scattering surface. Laser trackers, which typically have accuracies on the order of a thousandth of an inch, and as good as one or two micrometers under certain circumstances, are usually much more accurate than total stations. The broad definition of laser tracker, which includes total stations, is used throughout this application.
Ordinarily the laser tracker sends a laser beam to a retroreflector target that is typically located on the surface of an object to be measured. A common type of retroreflector target is the spherically mounted retroreflector (SMR), which includes a cube-corner retroreflector embedded within a metal sphere. The cube-corner retroreflector includes three mutually perpendicular mirrors. The vertex, which is the common point of intersection of the three mirrors, is located near the center of the sphere. Because of this placement of the cube corner within the sphere, the perpendicular distance from the vertex to any surface of the object on which the SMR rests remains nearly constant, even as the SMR is rotated. Consequently, the laser tracker can measure the 3D coordinates of a surface by following the position of an SMR as it is moved over the surface. Stated another way, the laser tracker needs to measure only three degrees of freedom (one radial distance and two angles) to fully characterize the 3D coordinates of a surface.
Some laser trackers have the ability to measure six degrees of freedom (DOF), which may include three translations, such as x, y, and z, and three rotations, such as pitch, roll, and yaw. An exemplary six-DOF laser tracker system is described in U.S. Pat. No. 7,800,758 ('758) to Bridges, et al., incorporated by reference herein. The '758 patent discloses a probe that holds a cube corner retroreflector, onto which marks have been placed. A retroreflector onto which such marks have been placed is called a six-DOF retroreflector. The cube corner retroreflector is illuminated by a laser beam from the laser tracker, and the marks on the cube corner retroreflector are captured by a camera within the laser tracker. The three orientational degrees of freedom, for example, the pitch, roll, and yaw angles, are calculated based on the image obtained by the camera. The laser tracker measures a distance and two angles to the vertex of the cube-corner retroreflector. When the distance and two angles, which give three translational degrees of freedom of the vertex, are combined with the three orientational degrees of freedom obtained from the camera image, the position of a probe tip, arranged at a prescribed position relative to the vertex of the cube corner retroreflector, can be found. Such a probe tip may be used, for example, to measure the coordinates of a “hidden” feature that is out of the line of sight of the laser beam from the laser tracker.
One common application of a laser tracker is to measure a relatively large object to see how its actual dimensions compare to the design dimensions (e.g., as given by CAD data). There may be several of these objects utilized in a particular application and the objects are typically expected to be identical in geometry. Any distortion in the geometry of the object either initially or developed over time can influence other operations in the overall system that the object is a part of. For example, if the object is bent or twisted in any way it can lead to manufacturing defects and poor product quality.
As is known in the art, at least three points are typically required to establish the relationship between the laser tracker and the object for measurement purposes. The ability of the operator to manually measure these initial points with sufficient accuracy is an area for consideration.
Thus, there is a need for an operator of a laser tracker or similar measurement device to be freed from manually measuring the target points (e.g., SMRs). Instead, it would be desirable for the operator of the laser tracker to utilize the camera system in the laser tracker to automatically measure all of the target points required for any particular application, thereby significantly reducing the possibility of operator error in the measurement process and eliminating the need for specialized skills and/or training.
More generally, there is a need for a method and a system in which the laser tracker automatically carries out many of the functions that would previously have to be carried out manually. It would be desirable to quickly obtain consistent measurements with the laser tracker, even if the measurements are carried out by an unskilled operator. Typical measurements include tool inspection measurements; for example, the carriage in a body-in-white assembly line is an example of a tool to be inspected or monitored. Other examples of tools include a sheet metal stamping jig, and an assembly tool for assembling a portion of an aircraft structure. Generally, for almost every part made in an automotive or aerospace application, there is a corresponding tool. Thus, it would be desirable to improve the process of measuring such tools with a laser tracker. In addition, it would be desirable to apply the measurement process to finished parts as well.
A method for measuring with a system includes the steps of: providing the system including a collection of retroreflector targets and a laser tracker, the collection of retroreflector targets including at least three non-collinear retroreflector targets, the at least three non-collinear retroreflector targets including a first target, a second target, and a third target, the laser tracker in a first frame of reference fixed with respect to tracker surroundings, the laser tracker having a structure, a first light source, an absolute distance meter, a first angular transducer, a second angular transducer, a tracking system, a first camera, a second light source, and a processor, the structure rotatable about a first axis and a second axis, the first light source producing a first light beam that cooperates with the absolute distance meter, the first angular transducer measuring a first angle of rotation about the first axis, the second angular transducer measuring a second angle of rotation about the second axis, the tracking system configured to move the first light beam to a center of any retroreflector target from among the collection of retroreflector targets, the first camera including a first lens system and a first photosensitive array, the second light source providing a second light beam, and the processor configured to operate the laser tracker; storing a list of nominal coordinates for the first target, the second target, the third target, and at least one additional point, the nominal coordinates being three-dimensional coordinates in a second frame of reference; capturing on the first photosensitive array a portion of the light emitted by the second light beam and reflected off the first target, the second target, and the third target; obtaining spot positions on the photosensitive array from the portion of light reflected off each of the first target, second target, and the third target; determining a correspondence between a first spot position, a 
second spot position, and a third spot position on the first photosensitive array and the nominal coordinates of the first target, the second target, and the third target, respectively; directing the first beam of light to the first target based at least in part on the nominal coordinates of the first target and the first spot position; measuring three-dimensional coordinates of the first target using the absolute distance meter, the first angular transducer, and the second angular transducer; directing the first beam of light to the second target based at least in part on the nominal coordinates of the second target and the second spot position; measuring three-dimensional coordinates of the second target using the absolute distance meter, the first angular transducer, and the second angular transducer; directing the first beam of light to the third target based at least in part on the nominal coordinates of the third target and the third spot position; measuring three-dimensional coordinates of the third target using the absolute distance meter, the first angular transducer, and the second angular transducer; determining three-dimensional coordinates of the at least one additional point in the first frame of reference based at least in part on the measured three-dimensional coordinates of the first target, the second target, the third target, and the nominal coordinates of the at least one additional point; and storing the determined three-dimensional coordinates of the at least one additional point.
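The correspondence-determining step recited above may be illustrated, by way of a non-limiting sketch, in code. The sketch below assumes the nominal coordinates have already been projected into expected spot positions on the photosensitive array (the projection itself is omitted), and matches observed spots to expected positions by exhaustively testing assignments; all function and variable names are hypothetical and form no part of the claimed method.

```python
from itertools import permutations

def match_spots_to_targets(spots, expected):
    """Brute-force correspondence for a small number of targets.

    Tries every assignment of observed spot positions (pixel x, y) to
    expected positions and keeps the one with the smallest total
    squared distance. Returns a tuple: result[i] is the index of the
    expected position matched to spot i.
    """
    best, best_cost = None, float("inf")
    for perm in permutations(range(len(expected))):
        cost = sum((spots[i][0] - expected[j][0]) ** 2 +
                   (spots[i][1] - expected[j][1]) ** 2
                   for i, j in enumerate(perm))
        if cost < best_cost:
            best, best_cost = perm, cost
    return best
```

For the three non-collinear targets of the method, the brute-force search over 3! = 6 assignments is trivially fast; a larger target collection would call for a proper assignment algorithm.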
Referring now to the drawings, exemplary embodiments are shown which should not be construed to be limiting regarding the entire scope of the disclosure, and wherein the elements are numbered alike in several FIGURES:
An exemplary laser tracker 10 is illustrated in
The laser tracker 10 is a device that has a device frame of reference 30. The device frame of reference may have as its origin the gimbal point 22. The frame of reference may be fixed with respect to the azimuth base 16, which is typically stationary with respect to the surroundings. The device frame of reference may be represented by a variety of coordinate systems. One type of coordinate system is a Cartesian coordinate system having three perpendicular axes x, y, and z. Another type of coordinate system is a spherical coordinate system. A point 74 within the device frame of reference 30 may be represented in a spherical coordinate system by one radial distance 73 (r), a first (zenith) angle 72 (θ), and a second (azimuth) angle 71 (φ). The angle θ is obtained by using the projection of the point 74 onto the z axis. The angle φ is obtained by using the projection of the point 74 onto the x-y plane. The laser tracker 10 inherently makes measurements in a spherical coordinate system, but a point measured in spherical coordinates may be easily converted to Cartesian coordinates.
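The spherical-to-Cartesian conversion mentioned above is standard trigonometry; a minimal sketch follows (function name hypothetical, angles in radians, with θ the zenith angle from the z axis and φ the azimuth angle in the x-y plane, as described above).

```python
import math

def spherical_to_cartesian(r, zenith, azimuth):
    # Convert a tracker measurement (r, theta, phi) to Cartesian (x, y, z).
    x = r * math.sin(zenith) * math.cos(azimuth)
    y = r * math.sin(zenith) * math.sin(azimuth)
    z = r * math.cos(zenith)
    return x, y, z
```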
The target 26 may be in contact with an object under test 61. The object under test 61 has an object frame of reference 40. The object frame of reference may be represented, for example, using Cartesian coordinates x, y, and z. The x, y, and z axes of the object frame of reference 40 move with the object 61 and are not necessarily parallel to the corresponding device axes x, y, and z of the device frame of reference 30. The target 26 may be placed in contact with the surface of the object 61 at a point 63. To find the three-dimensional (3D) coordinates of the point 63, the tracker first determines the center of the target 26 using the distance and two angles it has measured. The tracker may also account for a vector offset of the retroreflector reference point (e.g., the cube-corner vertex) with respect to the center of the spherical contact surface of the target 26. To move from the center of the target to the surface of the workpiece, the position of the center point is offset by an amount equal to the radius of the spherical target surface. In an embodiment, the direction of the offset is found by measuring several points near to the contact point 63 to determine the surface normal at the point 63.
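The radius offset just described may be sketched as follows. The sketch assumes the surface normal at the contact point has already been estimated and points from the surface toward the SMR center; the function name is hypothetical.

```python
def smr_center_to_surface_point(center, normal, radius):
    # Offset the measured SMR center along the unit surface normal,
    # toward the surface, by the sphere radius, yielding the surface point.
    nx, ny, nz = normal
    norm = (nx * nx + ny * ny + nz * nz) ** 0.5
    ux, uy, uz = nx / norm, ny / norm, nz / norm
    cx, cy, cz = center
    return (cx - radius * ux, cy - radius * uy, cz - radius * uz)
```

The sign convention (subtracting along the normal) follows from the stated assumption about the normal's direction; a real implementation would verify the convention against the measured points used to estimate the normal.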
Laser beam 46 may include one or more laser wavelengths. For the sake of clarity and simplicity, a steering mechanism of the sort shown in
In exemplary laser tracker 10, cameras 52 and light sources 54 are located on payload 15. Light sources 54 illuminate one or more retroreflector targets 26. In an embodiment, light sources 54 are LEDs electrically driven to repetitively emit pulsed light. Each camera 52 includes a photosensitive array and a lens placed in front of the photosensitive array. The photosensitive array may be a CMOS or CCD array, for example. In an embodiment, the lens has a relatively wide field of view, for example, 30 or 40 degrees. The purpose of the lens is to form an image on the photosensitive array of objects within the field of view of the lens. Usually at least one light source 54 is placed near camera 52 so that light from light source 54 is reflected off each retroreflector target 26 onto camera 52. (To illuminate a retroreflector target in a way that can be seen on the camera 52, the light source 54 must be near the camera; otherwise the reflected light will be reflected at too large an angle and will miss the camera.) In this way, retroreflector images are readily distinguished from the background on the photosensitive array as their image spots are brighter than background objects and are pulsed. In an embodiment, there are two cameras 52 and two light sources 54 placed about the line of laser beam 46. By using two cameras in this way, the principle of triangulation can be used to find the three-dimensional coordinates of any SMR within the field of view of the camera. In addition, the three-dimensional coordinates of an SMR can be monitored as the SMR is moved from point to point. A use of two cameras for this purpose is described in U.S. Published Patent Application No. 2010/0128259 to Bridges, et al., the contents of which are herein incorporated by reference.
Auxiliary unit 50 may be a part of laser tracker 10. The purpose of auxiliary unit 50 is to supply electrical power to the laser tracker body and in some cases to also supply computing and clocking capability to the system. It is possible to eliminate auxiliary unit 50 altogether by moving the functionality of auxiliary unit 50 into the tracker body. In most cases, auxiliary unit 50 is attached to general purpose computer 60. Application software loaded onto general purpose computer 60 may provide application capabilities such as reverse engineering. It is also possible to eliminate general purpose computer 60 by building its computing capability directly into laser tracker 10. In this case, a user interface, possibly providing keyboard and mouse functionality, may be built into laser tracker 10. The connection between auxiliary unit 50 and computer 60 may be wireless or through a cable of electrical wires. Computer 60 may be connected to a network, and auxiliary unit 50 may also be connected to a network. Plural instruments, for example, multiple measurement instruments or actuators, may be connected together, either through computer 60 or auxiliary unit 50. In an embodiment, auxiliary unit 50 is omitted and connections are made directly between laser tracker 10 and computer 60.
In alternative embodiments of the present invention, the laser tracker 10 may utilize both wide field of view (FOV) and narrow FOV cameras 52 together on the laser tracker 10. Various exemplary methods of using such cameras together are described hereinbelow.
In a first embodiment, one of the cameras 52 in
In another embodiment illustrated in
In still another embodiment, the two wide FOV cameras 52 in
The method for finding the location of a retroreflector target using images on the two cameras 52 mounted on the front of the laser tracker 10 of
Five frames of reference are associated with the laser tracker 10: a payload frame of reference that rotates with the payload 15; an azimuth frame of reference that rotates with the zenith carriage 14; a tracker-world frame of reference that is fixed with respect to the azimuth base 16; and two camera frames of reference. The azimuth base 16 is stationary with respect to its surroundings. The cameras 52 include a lens system (not shown) and a photosensitive array (not shown). Representative illustrations of a camera containing a lens system and a photosensitive array are given in
In an embodiment, the payload frame of reference has an origin at the gimbal point 22, which lies at a point along the azimuth axis; a y axis parallel to the zenith direction; an x axis perpendicular to the y axis and approximately parallel to the laser beam; and a z axis perpendicular to the x and y axes. The cameras 52 are fixed with respect to the payload frame of reference.
In an embodiment, the azimuth frame of reference has an origin at the gimbal point 22; a z axis along the azimuth direction; a y axis parallel to the zenith axis and perpendicular to the z axis; and an x axis perpendicular to the y and z axes.
In an embodiment, the tracker-world frame of reference has an origin at the gimbal point 22; a z axis along the azimuth axis; a y axis perpendicular to the z axis and parallel to the zenith axis when the angle of the azimuth axis is set to zero degrees; and an x axis perpendicular to the y and z axes.
In an embodiment, a camera frame of reference has an x axis that is the optical axis of the lens system within the camera. The y axis and z axis are perpendicular to the x axis and to each other and are aligned with the rows and columns, respectively, of the pixels of the photosensitive array within the camera 52.
In the laser tracker 10, a zenith angle and an azimuth angle, which are angles of rotation about the zenith and azimuth axes, respectively, are measured by the zenith encoder and azimuth encoder, respectively. With knowledge of the zenith and azimuth angles and the equations of the camera optical axes in the payload frame of reference, it is possible to transform any one of the five frames of reference (payload frame of reference, azimuth frame of reference, tracker-world frame of reference, and the two camera frames of reference) into any of the other frames of reference. This is usually done using a transformation matrix, which is a 4×4 matrix including a 3×3 rotation matrix and a translation component. The use of transformation matrices is well known to those of ordinary skill in the art.
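The 4×4 homogeneous transformation described above may be sketched in a few lines of pure Python (function names hypothetical; a practical implementation would typically use a linear-algebra library).

```python
def make_transform(R, t):
    # Build a 4x4 homogeneous transformation matrix from a 3x3
    # rotation matrix R and a translation vector t.
    return [[R[0][0], R[0][1], R[0][2], t[0]],
            [R[1][0], R[1][1], R[1][2], t[1]],
            [R[2][0], R[2][1], R[2][2], t[2]],
            [0.0, 0.0, 0.0, 1.0]]

def apply_transform(T, p):
    # Apply T to the point p = (x, y, z) using homogeneous coordinates.
    x, y, z = p
    h = (x, y, z, 1.0)
    return tuple(sum(T[i][j] * h[j] for j in range(4)) for i in range(3))
```

Transforms between frames of reference are then composed by ordinary matrix multiplication of such 4×4 matrices.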
The cameras 52 and lights 54 are used to find the location of one or more retroreflector targets 26 in the payload frame of reference or any other frame of reference. Such targets can be automatically acquired, if desired, by the laser tracker 10.
The method for finding a retroreflective target in the payload frame of reference will now be described. A first step in the method is to turn on the lights 54 to illuminate the retroreflectors 26 and form an image on the cameras 52. In some cases, the illumination may be turned off briefly and the difference taken between the illuminated and non-illuminated scenes. In this way, background features can be removed, causing the retroreflector target to be revealed more clearly. A second step is to use a processor (e.g., a processor of auxiliary unit 50) to calculate a center point for each retroreflector spot on the photosensitive array of the camera 52. The center point may, for example, be calculated as a centroid. A third step is to establish a direction in the camera frame of reference for each of the center points. With the simplest approximation, the direction is found by drawing a line between the center point and the perspective center of the camera 52. A more sophisticated analysis may consider aberrations of the lens system in determining the direction. A fourth step is to convert into the payload frame of reference the coordinates of the perspective center and the directions for each of the center points. A fifth step is to find a best estimate for the position of the retroreflector target 26 in the payload frame of reference by solving simultaneous equations, as explained hereinbelow.
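The centroid calculation of the second step might look like the following sketch (function name hypothetical; a real implementation would first subtract the background image and threshold out residual noise, as described above).

```python
def spot_centroid(intensities):
    # Intensity-weighted centroid (row, col) of a 2-D grid of pixel
    # values containing one bright retroreflector spot.
    total = rsum = csum = 0.0
    for r, row in enumerate(intensities):
        for c, v in enumerate(row):
            total += v
            rsum += r * v
            csum += c * v
    return rsum / total, csum / total
```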
For each retroreflector target 26, a center point will be formed on the photosensitive arrays of each of the two cameras 52 and from these center points a line indicating the direction from each of the cameras to the retroreflector target 26 will be constructed. In the ideal case, the two lines intersect in a point, but in general, the two lines will be skew lines that do not exactly intersect. The best estimate of the intersection position for two skew lines is found by determining a line segment of closest approach. The line segment of closest approach is perpendicular to each of the two skew lines and is shorter than any other line segment perpendicular to the two skew lines. Ordinarily, the best estimate of the position of the retroreflector target 26 is the midpoint of the line segment of closest approach.
Let the first line be defined by a point P and a direction vector U, and the second line by a point R and a direction vector V. One way to find the endpoints of the line segment of closest approach, which lies along a vector C, is by using the equation
P + uU + cC = R + vV,   (1)
which contains the scalar quantities u, c, and v and the vector quantities P, U, C, R, and V.
In addition, the vector C is subject to the constraint
C = U × V,   (2)
where × is the cross-product operator. Vector equation (1) can be written as a first equation in x, a second equation in y, and a third equation in z. The vector C can be written in terms of x, y, and z components of U and V using well known cross product formulas. The x, y, and z components of C are substituted into the first, second, and third equations. The result is three equations in x, y, and z, where all of the vector quantities are known and only the three scalar quantities u, v, and c remain to be found. Since there are three equations and three unknowns, the three values can be found.
The three dimensional endpoint coordinates Q1 and Q2 of the line segment of closest approach, which joins the two lines along directions U and V, are given by
Q1 = P + uU,   (3)
Q2 = R + vV.   (4)
The best estimate Q of the intersection point of the two lines is given by
Q = (Q1 + Q2)/2.   (5)
If desired, other mathematical methods can be used to find a best estimate Q of the intersection point. For example, an optimization procedure may be used to find the values of Q1 and Q2.
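Equations (1)-(5) can be collected into a short sketch. The 3×3 linear system is solved here by Cramer's rule expressed through scalar triple products; function names are hypothetical.

```python
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def det3(a, b, c):
    # Determinant of the 3x3 matrix with columns a, b, c
    # (the scalar triple product a . (b x c)).
    return dot(a, cross(b, c))

def closest_approach_midpoint(P, U, R, V):
    # Solve equation (1), P + uU + cC = R + vV, with C = U x V per
    # equation (2). Rearranged: u*U + c*C + w*V = R - P, where w = -v.
    C = cross(U, V)
    d = tuple(R[i] - P[i] for i in range(3))
    D = det3(U, C, V)
    u = det3(d, C, V) / D
    w = det3(U, C, d) / D
    v = -w
    Q1 = tuple(P[i] + u * U[i] for i in range(3))        # equation (3)
    Q2 = tuple(R[i] + v * V[i] for i in range(3))        # equation (4)
    return tuple((Q1[i] + Q2[i]) / 2 for i in range(3))  # equation (5)
```

For example, the line through the origin along x and the line through (0, 1, 1) along z have closest points (0, 0, 0) and (0, 1, 0), giving the midpoint (0, 0.5, 0).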
The methods above are described with respect to a laser tracker 10 having a beam of light 46 launched from a payload 15 rotating about a zenith axis 18. However, other types of mechanical steering mechanisms are possible. For example, the payload 15 may be replaced by a steering mirror. With this approach, a beam of light is directed upward from the azimuth base 16. The beam of light strikes the steering mirror and reflects out of the tracker enclosure. A motor attached to the zenith mechanical axis rotates the steering mirror to point the beam in the desired direction. In this embodiment, the payload frame of reference is replaced by the mirror frame of reference, but otherwise the analysis is the same.
It would also be possible to use one or more cameras not attached to the payload of the laser tracker. Such cameras might be attached to the zenith carriage 14 or they might be mounted separately from the laser tracker altogether. The method for finding the relationship between the frames of reference of the cameras and the laser tracker would be found in a manner similar to that described above: some number of points would be measured by the cameras and by the laser tracker and the measurement results would be used to establish appropriate transformation matrices.
Referring again to
Q = 0.2Q1 + 0.8Q2   (6)
Another mathematical method that may be used is a least squares optimization procedure to find the best estimate for the retroreflector target 26, but to weight the readings of the narrow FOV camera 58 more heavily than the readings of the wide FOV camera 52.
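The simple weighting may be sketched as a convex combination of the two endpoints; equation (6) corresponds to a weight of 0.8 on the narrow-FOV camera's point Q2. The function name is hypothetical, and the full least-squares variant is omitted.

```python
def weighted_estimate(Q1, Q2, w=0.8):
    # Q = (1 - w)*Q1 + w*Q2; with w = 0.8 this reproduces equation (6),
    # weighting the narrow-FOV camera's point Q2 more heavily than Q1.
    return tuple((1.0 - w) * a + w * b for a, b in zip(Q1, Q2))
```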
Referring to
Electrical wires 441 provide power from a power source (e.g., auxiliary unit 50) within the laser tracker 10 to the emitters 401 and the photosensitive array 404. Electrical wires 441 also transmit the pixel data from photosensitive array 404 to general purpose computer 60, for example, for analysis. The computer 60 analyzes the pattern of light on photosensitive array 404 to determine the location of a central point 452 on photosensitive array 404. The computer 60 also performs this analysis of the pattern formed by the other bundles of light returned by the retroreflectors. In other words, the reflected light bundles are focused by lens 402 into patterns on photosensitive array 404. The computer 60 analyzes these patterns to determine the central point of each pattern. From the location of the central points, the approximate angular direction to each of the retroreflectors can be determined.
Suppose that the objective is to acquire a particular retroreflector among the multiple retroreflectors and to measure its position with the laser tracker. The following procedure may be carried out. Motors are activated to turn the payload until the laser beam points in the approximate direction of the particular retroreflector. If the estimate of the target position is good enough, the light beam directly locks onto the target and begins tracking the target. If the estimate of the target position is not good enough, one possibility is to initiate a search in which the direction of the laser beam is changed in a systematic fashion. For example, the laser beam might be steered along a spiral pattern. When the laser beam intersects the target, a position detector within the laser tracker senses the reflected light. The signals from the position detector provide enough information to enable the motors to point the payload directly at the center of the particular retroreflector. Another possibility is for the operator to directly grab the mechanical mechanism of the laser tracker, for example, the payload, and to manually direct the light beam toward the retroreflector of interest. In one embodiment, if the operator directs the light beam close enough to the center of the retroreflector, an LED begins to flash on the front of the tracker. If the light beam is closer still, the light beam will lock onto the retroreflector target. If the light beam is not quite close enough to the center of the retroreflector to lock onto the target, a quick search procedure may be carried out to locate the retroreflector target.
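One possible spiral pattern of the kind mentioned above is an Archimedean spiral of beam-direction offsets about the nominal target direction. The following is only an illustrative sketch, not the tracker's actual search routine; all names and parameters are hypothetical.

```python
import math

def spiral_search_offsets(radius_per_turn, angular_step, max_radius):
    # Angular offsets (in radians) to add to the nominal beam direction,
    # spiraling outward from the nominal direction until max_radius.
    offsets = []
    angle = 0.0
    while True:
        radius = radius_per_turn * angle / (2.0 * math.pi)
        if radius > max_radius:
            break
        offsets.append((radius * math.cos(angle), radius * math.sin(angle)))
        angle += angular_step
    return offsets
```

In practice the search terminates as soon as the position detector senses reflected light, at which point the tracking system takes over.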
In the case that two or more locator cameras 52 are on the laser tracker 10, it is usually possible to directly establish, by means of the stereo camera calculation described hereinabove, a one-to-one correspondence between the retroreflector targets 26 and the target centers appearing on the photosensitive arrays of the cameras 52. Similarly, if a single camera 52 is located within the laser tracker in such a way that the light reflected by the targets 26 travels to the camera along an optical axis of the laser tracker, then the parallax between the camera and the laser beam is eliminated, and it is usually possible to establish a one-to-one correspondence between the retroreflector targets and the target centers appearing on the photosensitive array of the camera. If a single camera is used, alternative methods may be used to establish a one-to-one correspondence between the target centers and the retroreflector targets. One method involves turning the azimuth axis to different angles and observing the change in position on the photosensitive array of the single camera 52. As the azimuth angle is changed, the positions of the centers on the photosensitive array will change by an amount that depends on the distance from the laser tracker 10 to the retroreflector 26. For a given change in azimuth angle, as the distance to the retroreflector increases, the change in position between the two observed centers on the photosensitive array decreases. A similar procedure can be carried out by changing the zenith angle, rather than the azimuth angle, of the laser tracker. A more detailed description of this procedure is given in reference to FIG. 18 of U.S. Published Patent Application No. 2011/0260033 ('033), incorporated by reference herein.
In some cases, the one or more cameras on the laser tracker are accurate enough to direct the light beam from the laser tracker close enough to the center of a retroreflector target so that the light beam reflected back into the laser tracker is picked up by the position detector, thereby causing the laser beam to begin tracking the target. In such cases, the software that controls the laser tracker may automatically direct the light beam from the tracker to each of the targets so that the relatively high accuracies of the laser tracker distance meter and angular encoders are transferred to the three dimensional coordinate values. In other cases, the one or more cameras on the laser tracker may not be accurate enough to immediately direct the light beam from the laser tracker close enough to the center of the retroreflector target to enable the position detector to immediately detect the light beam and begin tracking. In this case, the light beam from the laser tracker may be aimed at a target and the beam directed in a search pattern to locate the target, as explained hereinabove. By repeating this procedure for each of the targets within the measurement volume, relatively accurate three dimensional coordinates can be obtained for each of the target points. Relatively accurate three dimensional coordinates for the target points are important because they enable the software that controls the laser tracker to efficiently carry out automatic measurements of the target points without performing interim target searches.
As suggested by the discussion hereinabove, some aspects of this invention require the obtaining of a one-to-one correspondence between target points viewed by the one or more cameras on the laser tracker and a list of three dimensional coordinates of retroreflector target points. Some methods for obtaining a list of three dimensional coordinates of the target points are described hereinbelow. The list may, in some cases, have nominal three dimensional coordinates that differ by a relatively large amount from the actual three dimensional coordinates. In other cases, the list may have relatively accurate three dimensional coordinates.
In an embodiment, a list of three dimensional coordinates of the target points on an object under test is obtained from a CAD model describing the positions of the targets on the object.
In another embodiment, the one-to-one correspondence between the target points and the list of three dimensional coordinates is obtained by performing three dimensional measurements on each of the points observed by the camera. Such three dimensional measurements may have been carried out prior to the current measurement session.
In some cases, the images of the target points on the one or more cameras may be too closely spaced to immediately determine the one-to-one correspondence between the target points and the spots on the camera images. In this case, the points may be measured with the laser tracker using the methods described hereinabove. For example, the laser tracker may direct the light beam toward the target. The laser tracker may then measure the target position directly or with the assistance of a search procedure, if necessary.
An important aspect of this invention is the establishing of a relationship between the frame of reference of the laser tracker and the frame of reference of the object under test. Another way of expressing the same idea is to say that it is important to have a method for transforming the laser tracker frame of reference into the object-under-test frame of reference or vice versa.
Three methods are taught herein for establishing this relationship. In a first method, at least three retroreflector target points are measured by the laser tracker. In a second method, at least two target points are measured by the laser tracker, and in addition at least two angles of tilt are measured by inclinometers disposed on each of the laser tracker and the object under test. In a third method, a single six degree-of-freedom (DOF) target is measured by a laser tracker having six DOF measurement capability. By combining the information obtained from any of these three methods, it is possible to put the laser tracker within the frame of reference of the object under test. Equivalently, it is possible to put the object under test within the laser tracker frame of reference.
A brief explanation will now be given of the method for putting the object under test into the tracker frame of reference based on the measured information described in the preceding paragraph. For the case in which the laser tracker measures three retroreflector target points, a local coordinate system for the object under test may be established by allowing one of the three measured points to be an origin point in the local frame of reference of the object under test, a second of the measured points to establish the x axis, and the third of the measured points to establish a component in the y direction. The y axis is taken to pass through the origin and to be perpendicular to the x axis. The z axis is taken to pass through the origin, to be perpendicular to the x axis and the y axis, and to have a direction according to the right-hand rule, which is known to those of ordinary skill in the art. The object under test may have its own reference coordinate system established by a CAD drawing. For example, the CAD drawing may have datums that establish an origin, x axis, y axis, and z axis. To put the CAD drawing within the frame of reference of the laser tracker, or equivalently to put the laser tracker within the frame of reference of the CAD drawing, usually three transformation matrices are obtained. Transformation matrices are usually 4×4 matrices that include a 3×3 rotation matrix and a translation component that accounts for translations of one frame of reference relative to another. In the situation described hereinabove, the three transformation matrices are multiplied together in a particular order to obtain an overall transformation matrix that transforms measured values or CAD values into the desired frame of reference. The use of transformation matrices is well known to one of ordinary skill in the art and will not be described further here.
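The three-point construction just described can be sketched in code. The following is a minimal illustration using NumPy; the function name and the sample coordinates are invented for this example and are not part of any tracker software:

```python
import numpy as np

def frame_from_three_points(p0, p1, p2):
    """Build a 4x4 homogeneous transformation whose origin is p0, whose
    x axis points from p0 toward p1, and whose y axis lies in the plane
    of the three points (hypothetical helper for illustration)."""
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in (p0, p1, p2))
    x = p1 - p0
    x /= np.linalg.norm(x)              # x axis through first two points
    y = p2 - p0
    y -= x * np.dot(x, y)               # keep only the part perpendicular to x
    y /= np.linalg.norm(y)
    z = np.cross(x, y)                  # z axis by the right-hand rule
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2] = x, y, z   # 3x3 rotation part
    T[:3, 3] = p0                            # translation part
    return T

# Three measured points (meters) expressed in the tracker frame of reference
T = frame_from_three_points([1, 2, 0], [3, 2, 0], [1, 5, 0])
```

Multiplying such matrices in the appropriate order, as described above, chains the transformations among the tracker, object, and CAD frames of reference.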
For the case in which the laser tracker measures the three dimensional coordinates of at least two retroreflector target points in addition to angles of tilt of the laser tracker and the object under test, a local coordinate system for the object under test may be established by allowing a first retroreflector target point to be the local origin of the object under test and the direction from the first target point to the second target point to establish a local x axis for the object under test. If an inclinometer located on each of the laser tracker and the object under test measures two perpendicular tilt angles relative to the gravity vector, then it is possible to rotate the object under test to align the two gravity vectors, again using rotation methods that are well known to one of ordinary skill in the art. The ambiguity in the rotation angle about the gravity vector can be removed since there is only one possible rotation about the gravity vector that provides the proper correspondence between the local x axis of the object under test and the x axis as defined by the CAD model. This method will work as long as the three dimensional coordinates of the two retroreflector target points, as measured by the laser tracker, do not form a line that coincides with the gravity vector.
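The gravity-vector alignment step can be sketched with Rodrigues' rotation formula. The tilt values below are invented for illustration, and the remaining rotation about gravity (resolved by the two target points) is only indicated in a comment:

```python
import numpy as np

def rotation_aligning(a, b):
    """Return the 3x3 rotation taking unit vector a onto unit vector b
    (Rodrigues' formula). Illustrative helper, not tracker firmware."""
    a = np.asarray(a, float) / np.linalg.norm(a)
    b = np.asarray(b, float) / np.linalg.norm(b)
    v = np.cross(a, b)                 # rotation axis (unnormalized)
    c = np.dot(a, b)                   # cosine of the rotation angle
    if np.isclose(c, 1.0):             # vectors already aligned
        return np.eye(3)
    K = np.array([[0, -v[2], v[1]],
                  [v[2], 0, -v[0]],
                  [-v[1], v[0], 0]])   # cross-product matrix of v
    return np.eye(3) + K + K @ K / (1.0 + c)

# Step 1: align the gravity vector sensed on the object with the tracker's.
g_obj = [0.1, 0.0, -0.995]   # from the object's two tilt angles (assumed values)
g_trk = [0.0, 0.0, -1.0]     # tracker gravity reference
R_g = rotation_aligning(g_obj, g_trk)

# Step 2 (not shown): a final rotation about g_trk is chosen so that the
# direction from the first to the second target point matches the CAD x axis.
```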
Another way of viewing the transformation between frames of reference is to consider the number of degrees of freedom provided by the measured values. For example, when the laser tracker measures a first retroreflector target point, it is said to have constrained the possible movement of the object under test by three degrees of freedom because a first, second, and third degree of freedom corresponding to x, y, and z coordinates have been established for a point on the object under test. Physically, this constraint fixes the location of the measured point in space but allows the object under test to rotate in any orientation about this point. When the laser tracker measures the second retroreflector target point, it is said to have constrained the possible movement of the object under test by an additional two degrees of freedom because the object under test no longer has the ability to rotate in any of three orientational angles but instead is constrained to rotate about the line connecting the first and second retroreflector target points. Hence the three degrees of orientational freedom have been reduced to one orientational degree of freedom. The first measured point constrained three translational degrees of freedom, and the second measured point constrained two orientational degrees of freedom for a total constraint of five degrees of freedom. Since there is one unconstrained degree of freedom in this case, the total of the constrained and unconstrained degrees of freedom is six.
For the case in which inclinometers on the laser tracker and the object under test each measure two angles of tilt relative to the gravity vector and the laser tracker measures the three dimensional coordinates of just one target point, there is not enough information to fully constrain the object under test. The two inclinometers constrain two angles but provide no information on rotation of the object under test about the gravity vector. In other words, the two inclinometers constrain two degrees of freedom. The three dimensional coordinates of the single target point measured by the laser tracker provide constraint of three degrees of freedom, for a total constraint of five degrees of freedom. Since six degrees of freedom are needed for complete constraint, the measured values do not provide complete constraint, and the object is free to rotate about the gravity vector.
For the case in which inclinometers on the laser tracker and the object under test each measure two angles of tilt relative to the gravity vector and the laser tracker measures the three dimensional coordinates of two target points, there is enough information to fully constrain the object under test as long as the two target points do not establish a line along the direction of the gravity vector. By performing this measurement, the object under test is said to be constrained in six degrees of freedom as long as the two target points do not lie along the direction of the gravity vector.
For the case in which the line connecting the two target points lies along the gravity vector, the object under test is said to be constrained by five degrees of freedom since there is not enough information to determine the orientation of the object under test about the gravity vector. Note that the number of degrees of freedom cannot be determined by simply adding the number of degrees of freedom that would be obtained by individual measurements. For example, measurement of a single point constrains three degrees of freedom, but the measurement of two points constrains five degrees of freedom, not six degrees of freedom. Also notice that the two angular degrees of freedom provided by the inclinometers on the laser tracker and the object under test do not add to the five degrees of freedom obtained by the laser tracker measurement of the two retroreflector target points to obtain six or seven degrees of freedom. This is because the two degrees of freedom provided by the inclinometers do not correspond to a basis set that is independent of the basis set of the two target points measured by the laser tracker. In other words, complete constraint of a single rigid body requires constraint of three translational degrees of freedom (e.g., x, y, z) and three orientational degrees of freedom (e.g., pitch, roll, and yaw angles). In the case considered above, there is no constraint for rotation about the gravity vector (often called the yaw angle). In this application, the term degrees of freedom should be understood to mean independent degrees of freedom.
It should be understood that the cameras on the laser tracker may view targets in a region beyond the instantaneous field of view (FOV) of the cameras by rotating the azimuth and zenith axes of the laser tracker. For example, the FOV of one of the cameras on the laser tracker may be 30 degrees in the azimuth direction. However, the azimuth axis of the tracker may be rotated by 360 degrees, thus increasing the effective FOV of the camera to 360 degrees.
Embodiments of the present invention allow an operator with limited training on the measurement system (e.g., laser tracker, target tooling, SMRs or other laser tracker targets, computer system, measurement system software and optionally a remote control or handheld device connected to the measurement software) optionally to follow a series of prompts and instructions via a computer (e.g., the general purpose computer 60) to set up the laser tracker, optionally to place the SMRs in the required tooling on the part to be measured, and optionally to define the area of interest to be measured. The measurement system may then automatically measure the target points and produce the results.
An embodiment of the present invention that helps enable simpler and faster measurements for the operator is a method in which the laser tracker points the light beam at a desired measurement location on the object to prompt the operator to place an SMR at the desired location. For example, the operator might be prompted to place a retroreflector target in a magnetic nest on an object under test. As another example, a retroreflector may be located in the wrong position, and the operator may be prompted by the light beam from the laser tracker to move the misplaced target to the correct location. The laser tracker might do this, for example, by moving the light beam sequentially from a first position containing the misplaced target to a second position where the target is to be placed.
The guidance given by the light beam from the laser tracker may also have an advantage during a setup phase in which the operator places SMRs in specified locations while the laser tracker measures the three dimensional coordinates of the target positions. This advantage is seen when the nominal dimensions given on a CAD model do not correspond to the actual dimensions of an object under test. If the accurate three dimensional locations of the target points are determined during setup, measurement time and errors occurring later in the process may be decreased.
The directing of the actions of the operator by pointing of the light beam can help eliminate errors. For example, a test plan within software for a laser tracker test may indicate that the operator is to measure points in a particular order. The results of such measurements may be saved and used to obtain a desired relationship—for example, a relationship between two frames of reference, a distance between two lines, or the angle between two planes. If the operator has measured the initial points in the wrong order or has measured the wrong points, then the software may fail to solve for the desired values or may obtain the wrong answers.
In the cases described thus far, the operator is directed to place a retroreflector target in a fixed location, which might be a magnetic nest or a tooling hole, for example. However, there is another important case in which the operator measures a surface profile. Such a surface profile might be measured to determine the flatness of a surface or the diameter of a sphere, for example, or two surfaces might be measured to determine the angle between the surfaces. As another example, an operator might measure a portion of a tool being built for use in assembling automobiles or airplanes. The laser tracker might be used to measure the surface profile to see whether the profile is within the design tolerances. If not, the operator might be directed to modify the tool in an appropriate manner—perhaps by abrading material from a region, for example. In all of these cases in which the SMR is used to measure the profile of a surface, the software that controls the laser tracker may greatly simplify and speed the procedure for the operator by indicating the region that is to be scanned. It may do this by causing the laser tracker to direct the light beam to delimit the areas the operator is to scan. Alternatively, it might trace the actual path the operator is to follow during the scan.
The laser tracker may also be used to assist in the assembly of complex structures. For example, it may be necessary to affix a number of components to the cockpit of an aircraft. In many cases, a cost effective way to do this is to point a light beam to instruct the assembler to drill holes or perform other operations at the appropriate locations. After the components have been attached, the operator may be instructed to scan the profile of the installed items to confirm that the installation has been made properly. To facilitate this measurement, one or more cameras may be used to identify retroreflectors on the object under assembly. These retroreflectors would be used to move the laser tracker into the frame of reference of the object under assembly, which would then enable the laser tracker to direct the activities of the assembler using the light beam from the tracker.
The one or more cameras on the laser tracker have the ability to measure all of the retroreflector targets within a large effective FOV by rotating the azimuth and zenith axes, as explained hereinabove. If the only targets accessible to the laser tracker are those on the object under test, the laser tracker can automatically determine, by viewing the retroreflector targets, the region of space to be measured. On the other hand, if targets are present on several objects, not all of which are of interest to the current measurement, it may in some cases be necessary for the operator to indicate the region that is to be measured. In an embodiment, the operator may indicate the region to be measured by using a retroreflector to delimit the region of interest. The operator may do this, for example, by making four consecutive movements of the retroreflector target to indicate the upper, lower, left, and right extent of the region. In another embodiment, the operator may manually move the payload (or equivalent structure) of the laser tracker to point the light beam at the upper, lower, left, and right edges of the region of interest. The operator may be instructed to carry out these movements by software that controls the laser tracker, or the operator may use gestures to give this information without being directed to do so by the computer program. Such gestures may include, for example, movement of a retroreflector target in predetermined patterns within specified time intervals. As another example, the operator may indicate the desire to delimit a region by grabbing the payload and moving the light beam from the laser tracker directly downward. The operator could follow this initial movement by moving the payload to delimit the upper, lower, left, and right edges of the desired measurement region.
In other cases, it may be possible for the software that controls the laser tracker to perform a target matching procedure in which the software identifies a collection of retroreflector targets in correspondence to a CAD model or to a list of three dimensional coordinates of targets.
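A minimal sketch of such a target-matching procedure is shown below, using greedy nearest-neighbor assignment against a list of nominal coordinates. A production system might instead use an optimal-assignment or point-set registration algorithm; all names, coordinates, and the distance threshold here are illustrative assumptions:

```python
import numpy as np

def match_targets(observed, nominal, max_dist=0.05):
    """Greedily pair each observed target coordinate with the nearest
    unused nominal (CAD) coordinate within max_dist meters."""
    observed = np.asarray(observed, float)
    nominal = np.asarray(nominal, float)
    pairs = {}
    taken = set()
    for i, obs in enumerate(observed):
        d = np.linalg.norm(nominal - obs, axis=1)   # distance to each nominal
        for j in np.argsort(d):                     # nearest first
            if int(j) not in taken and d[j] <= max_dist:
                pairs[i] = int(j)
                taken.add(int(j))
                break
    return pairs

# Observed points offset a few millimeters from their nominal positions
nominal = [[0, 0, 0], [1, 0, 0], [0, 1, 0]]
observed = [[0.002, 0, 0], [0.001, 1.001, 0], [0.999, 0.002, 0]]
print(match_targets(observed, nominal))   # {0: 0, 1: 2, 2: 1}
```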
In the discussions above, the benefits of having the laser tracker use the light beam to assist the operator in performing measurements have been emphasized. Now the benefits of completely automating a measurement are considered. One potential benefit is that, because of the speed with which a fully automated measurement can be performed, additional targets may be added to the object under test without increasing the test time. By providing more points in each set of data, the software may more quickly determine the desired geometrical characteristics of the object under test with fewer potential errors. Also, by measuring sets of points without having the user manually move the SMR, the chance of having the object shift during the measurement session is reduced. This in turn reduces the chance of measurement errors.
A potential advantage of a fully automated measurement is that the order in which measurements are made can be optimized. In the case in which there are a large number of target positions on an object under test, an operator may measure the target positions according to the relative proximity of the points, since this is the fastest procedure for manual measurement. In a fully automated procedure, on the other hand, measurements may be performed in an order that produces the most accurate and robust measurement results. For example, two points on a datum line may be on opposite sides of a large object under test. An automated test procedure can measure these widely separated datum points one after the other, thereby avoiding drift and obtaining the most accurate measurement results.
Another potential advantage of automatic measurement is the possibility of automatic refit. It is often desirable to periodically measure the characteristics of tools used in the manufacture of products. Such periodic measurements help ensure that the tool has not gotten bent, that the targets have not moved, and so forth. If an object under test has gotten bumped, then during a periodic measurement this will be noticed by the software that controls the laser tracker. The software may in response invoke an automatic refit procedure in which the new location of the bumped target is re-established. The automatic refit procedure may also reduce the requirement for rigid mounting to hold the object on the tool. Reduced rigidity requirements result in reduced costs for building and operating a very accurate, repeatable tool.
Another example of automatic refit is the case in which the object under test is on an assembly line. Such an object will probably not be in exactly the same location after it has completed a circuit and has returned to the laser tracker for an inspection. The laser tracker can measure the reference points to re-establish the relationship between the frame of reference of the laser tracker and the frame of reference of the object under test.
One capability made possible by the automated measurements described above is the setting of a desired accuracy value, possibly by the user, to drive specific operations and set thresholds for alerts and alarms. For example, the value set for desired accuracy can drive: (1) the frequency and tolerance on stability checks; (2) self-compensation versus full pointing comp requirement; (3) the frequency and tolerance of self-compensation; (4) the threshold for number of measurement samples per point measured; (5) ambient temperature change limits before compensation checks; (6) tolerance for acceptable results of alignments and position moves; and (7) the frequency and tolerance of drift checks.
Alternatively, each of these values can be set individually. A matrix of values can be set based on different applications and operating conditions and these could be saved and recalled as measurement profiles.
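As a sketch, such measurement profiles might be stored as a simple lookup table keyed by the desired accuracy level; every name and numeric value below is invented for illustration and does not reflect actual tracker settings:

```python
# Hypothetical measurement profiles: one desired-accuracy setting drives the
# individual thresholds enumerated above (samples per point, drift-check
# frequency, temperature limits, two-face tolerance, and so on).
PROFILES = {
    "high":   {"samples_per_point": 64, "drift_check_s": 300,
               "temp_change_limit_c": 0.5, "two_face_tol_mm": 0.01},
    "medium": {"samples_per_point": 16, "drift_check_s": 900,
               "temp_change_limit_c": 1.0, "two_face_tol_mm": 0.05},
    "fast":   {"samples_per_point": 4,  "drift_check_s": 3600,
               "temp_change_limit_c": 2.0, "two_face_tol_mm": 0.10},
}

def settings_for(accuracy):
    """Recall a saved profile by name, as described above."""
    return PROFILES[accuracy]
```

Individual values could still be overridden one at a time, and edited matrices of values saved back as new named profiles.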
An example is now considered in which a setup procedure includes cooperation by the operator rather than complete automation. For the desired measurement locations, the laser tracker 10 aims at a desired position on the object at which an SMR is to be placed. In a first embodiment, the operator holds the SMR in hand while placing the SMR in the light beam, thereby enabling the beam to lock onto the SMR. After the SMR is put onto the object (e.g., on a magnetic nest), the laser tracker measures the three dimensional coordinates and moves the light beam to the next target position. In a second embodiment, the operator places the retroreflector directly on the object, which might be on a magnetic nest, for example. If the light beam does not immediately lock onto the retroreflector, the operator gives a signal by, for example, passing a hand in front of the retroreflector target, thereby causing the target to flash in the view of the camera. The tracker searches for the SMR and quickly measures the SMR location. The tracker then advances to the next nominal point to guide the operator as to where to place the target on the object. A third embodiment is like the second embodiment except that the laser tracker does not perform a search if the target is not immediately found. Instead, when the operator passes a hand in front of the retroreflector target, the laser tracker directs the light beam to the next target location. During an initial setup, it may be acceptable to make all of the measurements relatively quickly by limiting each measurement time to about 0.1 second, for example.
A large number of retroreflector targets, which may exceed 100, may be used to measure points on a tool. In some instances, the operator may wish to put on only a portion of the retroreflectors at a time (e.g., 25 retroreflectors at a time) to save money on the purchase of retroreflectors. A measurement cycle is defined as the cycle over which the available retroreflector targets (e.g., 25 targets) are located on the tool and measurements are performed by the laser tracker.
If the SMRs are not already affixed to the object under test, the operator places the SMRs on the object either manually or by using the guidance provided by the laser tracker 10. The tracker may then conduct a stability and reference system check. The stability check may be conducted by measuring one or more points. In an embodiment, the tracker measures two or more points at the extreme boundaries of the measurement volume along with one point nearest the center of the volume. The laser tracker 10 automatically takes a series of points with shorter and progressively longer durations (more samples) to determine the optimal number of samples to achieve the desired accuracy (the operator sets this number in the system). The system setting for samples per point is set to this value. The system will have the option to recheck the stability after a certain period of time, after a number of points have been measured, or at the beginning and/or end of each cycle. After the reference points (a minimum of three) have been measured the first time, they can be re-measured at the end of the cycle to check for movement, or at the beginning of each cycle to re-orient the laser tracker to the part to correct for any movement of the object that may have been introduced by the operator while moving the SMRs. A simpler and faster method for checking possible movement is to place a single point on the object and a second point somewhere else, e.g., on the floor. These positions will have SMRs in them all the time, and the measurement system can periodically check them throughout the measurement session. One way is to check at specific intervals during the measurement session in addition to the beginning and end of each cycle. A minimal implementation would be to measure these drift points at the beginning and end of each measurement session.
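The drift-point check described above can be sketched as follows; the function name and the 0.5 mm tolerance are illustrative assumptions:

```python
import numpy as np

def drift_exceeded(ref_before, ref_after, tol=0.0005):
    """Compare reference-point coordinates (meters) measured at the start
    and end of a cycle; True if any point moved by more than tol."""
    d = np.linalg.norm(np.asarray(ref_after) - np.asarray(ref_before), axis=1)
    return bool(np.any(d > tol))

before = [[0, 0, 0], [5, 0, 1]]          # drift points: on the object and on the floor
after  = [[0, 0, 0.0001], [5, 0, 1]]     # 0.1 mm apparent motion
print(drift_exceeded(before, after))     # False: within the 0.5 mm tolerance
```

If the check returned True, the system could re-measure the reference points and re-orient the tracker to the part, as described above.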
The measurement system automatically measures all of the required points per the system settings. The laser tracker's user lights can flash a pattern of LEDs after each point is measured to alert the operator as to a passing or failing point. If a point fails, the operator can pause the automatic measurements by waving a hand in front of any target in the tracker's FOV. The camera system will register the break in the flashing of a single target produced in response to the flashing light source 54 and pause the measurements. The operator can hold a hand in front of the out-of-tolerance point so that it can be adjusted. The laser tracker then aims at the desired point. A digital readout guides the operator to adjust that portion of the tool that is out of tolerance. When the adjustment is complete, the operator may give another gesture (e.g., moving a hand in front of the SMR) to command the laser tracker to re-measure the point and continue measuring the rest of the points. By either flagging the SMR or physically moving the azimuth or zenith axis of the system, it is possible to execute the full measurement process so that the operator does not need to use a remote control, mouse, or keyboard.
Referring to
Due to the size of the object 500 being measured and the accuracies that may be desired, the ambient temperature of the object 500 could become a source of measurement error if not addressed. For example, metal structures may expand as they get warmer. Also, the nominal values of the object (e.g., the CAD file) are often set at the temperature of a controlled room, typically 20 degrees C (68 degrees F). If the object being measured is warmer than this, it will be physically larger. It is common practice to adjust for this difference by applying a scale factor to the measurement job and adjusting the measurement data back to the design temperature, either by knowing the part material and temperature or by measuring reference points and applying a scale factor when transforming the job.
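The temperature correction described above follows the linear expansion relation L = L0(1 + alpha * (T - T0)), where alpha is the coefficient of thermal expansion. A minimal sketch, assuming a steel part and an invented function name:

```python
def thermal_scale_factor(part_temp_c, cte_per_c, design_temp_c=20.0):
    """Scale factor that maps measurements of a warm part back to its
    design temperature. Sensor handling and material lookup are omitted."""
    # Length at temperature T: L = L0 * (1 + cte * (T - T0)), so dividing
    # measured lengths by that factor recovers the design-temperature size.
    return 1.0 / (1.0 + cte_per_c * (part_temp_c - design_temp_c))

# Steel (CTE roughly 11.7e-6 per degree C) measured at 25 C, designed at 20 C
s = thermal_scale_factor(25.0, 11.7e-6)
```

For a 10 m steel part, the 5 degree C difference in this example corresponds to roughly half a millimeter of expansion, which is large compared to tracker accuracy.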
In an automated measurement session, where reference points 26 are being used and measured, the operator may indicate by a setting in the software to always apply a scale factor when transforming the job. The problem with this practice is that if the geometry of the object is changed, bent, etc., the automatic scale method will reduce this error by changing the scale of the job. A second method may be to use a material sensor placed on the object and have the operator enter the coefficient of expansion or material type and the system can determine scale based on these inputs. However, a preferred method by which an automated system may operate is to compare the two methods and alert the operator if any variation exceeds the desired system accuracy. The operator would place one or more material sensors on the object. The system can check the ambient temperature via an internal tracker sensor or an external sensor. If the difference is great enough to cause the part to expand or contract during the measurement session, the system would alert the operator to allow the object to soak in the environment and delay the measurement until the object temperature stabilizes. The measurement job may include the material type and/or the coefficient of expansion of the material. The system measures the reference points on the object and compares their values to nominal or desired values.
During the transformation process, the system calculates the scale factor based on the transformation of measured to nominal values and also calculates scale based on the material type and material temperature sensor. If there is an unacceptable difference between the two scale calculations, the operator will be alerted and the measurement session is halted. This difference may indicate that one of the following conditions has occurred and the system may not be able to measure the job as expected: (1) the material temperature sensor(s) and/or the air temperature sensor(s) may be defective and producing the wrong values; or (2), the more common cause, the object may be geometrically deformed to the point where the reference points 26 are no longer at the design location on the object. This is relatively difficult to detect in current measurement sessions, as automatic scaling tends to hide these errors and introduce uncertainties or errors across the entire job. If the reference points have errors, traditionally the entire job will be shifted slightly off nominal, and this can cause some points to fail incorrectly. If the highest accuracy is desired, additional checks can be performed by the system during the measurement session to minimize the effects of expansion or contraction of the part.
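The comparison of the two scale estimates might be sketched as follows; the parts-per-million threshold and the action names are invented for illustration:

```python
def scale_check(fit_scale, material_scale, accuracy_ppm=10.0):
    """Compare the scale from the measured-to-nominal fit with the scale
    predicted from material temperature; halt the session if they disagree
    by more than the desired accuracy (in parts per million)."""
    diff_ppm = abs(fit_scale - material_scale) * 1e6
    return "halt" if diff_ppm > accuracy_ppm else "ok"

print(scale_check(1.000058, 1.000059))   # "ok": the estimates agree to ~1 ppm
print(scale_check(1.000058, 1.000100))   # "halt": ~42 ppm apart, alert operator
```

A "halt" result would then trigger the operator alert described above, prompting a check of the temperature sensors or of the reference-point geometry.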
The most automatic approach may be to use the stereo cameras 52 on the laser tracker 10 to determine depth and position to estimate the location of the SMRs 26. In an embodiment, the operator places the SMRs 26 on the object 500 and manually aims them in the direction of the laser tracker 10. The operator indicates the desired measurement volume by manually moving the azimuth and zenith axes of the tracker 10 to the extreme points of the measurement volume as prompted by the software. The software prompts the operator to move the tracker head and laser beam 46 to the furthest right point; the operator then moves the tracker head. When the movement is stable for a specified amount of time (e.g., two seconds), the system records the location. The software then prompts the user to move the tracker head to the furthest left, top, and bottom of the intended volume.
If the indicated volume exceeds the range of the widest field of view of the camera system 52 on the tracker 10, the tracker then executes a programmed sweep/survey of the entire volume looking for SMRs 26 or targets. This programmed sweep/survey will be repeated as required throughout the measurement session to monitor the status of the SMRs or look for user input to the system. Using the stereo cameras 52, the laser tracker 10 estimates the XYZ location of every point within the measurement volume. The measurement system software calculates a first approximation of the transformation between the tracker and the set of points. The tracker then aims at the desired points. If any points from the measurement set are not visible, the tracker 10 flashes an error with the LEDs on the front of the tracker (not shown) and then aims at the location where the target 26 is missing, mis-aimed, or obscured by another object. The tracker can move in a fixed pattern to make the location more visible for the operator. Once the point is corrected, the operator can flag the SMR 26 and the system will know to measure the location and proceed with the process. Again automatically, the tracker 10 aims at and uses the camera system 52 or the traditional search system to lock onto and measure each target 26.
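One textbook way a pair of stereo cameras can estimate an XYZ location is midpoint triangulation of the two lines of sight. The sketch below is illustrative only and is not the tracker's actual algorithm; the camera positions and directions are invented:

```python
import numpy as np

def triangulate(origin_a, dir_a, origin_b, dir_b):
    """Return the point halfway between two sight lines at their closest
    approach (midpoint triangulation)."""
    oa, da = np.asarray(origin_a, float), np.asarray(dir_a, float)
    ob, db = np.asarray(origin_b, float), np.asarray(dir_b, float)
    da, db = da / np.linalg.norm(da), db / np.linalg.norm(db)
    # Solve for line parameters t, s minimizing |(oa + t*da) - (ob + s*db)|
    A = np.array([[da @ da, -da @ db],
                  [da @ db, -db @ db]])
    b = np.array([(ob - oa) @ da, (ob - oa) @ db])
    t, s = np.linalg.solve(A, b)
    return (oa + t * da + ob + s * db) / 2.0

# Two cameras 0.2 m apart, both sighting a target at (0, 0, 2)
p = triangulate([-0.1, 0, 0], [0.1, 0, 2], [0.1, 0, 0], [-0.1, 0, 2])
```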
As stated in the previous paragraph, when pointing a light beam toward a target location, it can sometimes be a good idea to move the light beam in a pattern rather than pointing it at a fixed angle. Consider, for example, the case in which a magnetic nest is located off the top of an object under test. In this instance, a light beam pointed directly toward the target location (i.e., the center of the retroreflector target when placed in the magnetic nest) may be invisible to the operator since it may pass by the object without striking anything in its path. By moving the light beam in a pattern, the desired position of the SMR can be made visible.
If the laser tracker 10 is equipped with one or more wide field of view (WFOV) cameras and one or more narrow field of view (NFOV) cameras, the system can locate the general position of the SMR 26 with the WFOV camera and aim at the point 26. If the laser beam 46 does not hit the center of the target 26 close enough to enable the tracking system to lock onto the target, one or more of the following processes can be executed. (1) The tracker can reevaluate the position from the WFOV camera(s) and aim again at the stationary point. (2) The tracker can switch to the NFOV camera(s) and recalculate the optical center of the target and aim at this calculated center and attempt to acquire the target with the tracking system. (3) If the tracker is equipped with an optical zoom function on the NFOV camera(s) and the NFOV camera(s) cannot see the target after changing from the WFOV cameras (the WFOV position calculation caused the tracker to aim at a position that has enough error that the NFOV camera(s) cannot see the SMR), the NFOV camera can zoom out to the point where the target is visible and then calculate the optical center and properly aim the tracker.
Any of these processes can be repeated until the target is re-acquired. The advantage is that using the combination of the WFOV and NFOV cameras (referred to herein as the camera system) can be faster than the traditional aim-and-search method using the laser beam and position sensor.
In other embodiments of the present invention, another measurement procedure may be to compare, under the direction of software, measurement results to allowable tolerances. The laser tracker 10 may compare the nominal (CAD model) dimensions between target points on a tool and the dimensions as measured by the laser tracker. If the error between a nominal dimension and a measured dimension exceeds a tolerance value, the tracker may take action. This action may be as simple as re-measuring the points or measuring the points for a longer time. The tracker may also perform a two-face measurement to ensure that the problem is not with tracker accuracy. In the alternative, the action taken by the tracker may be to send the operator an error message, sound a beep, flash a light, or even shut down the production line until the operator checks the stability, makes an adjustment, or replaces a defective target, for example.
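A minimal sketch of such a tolerance comparison follows. The point dictionaries, the single scalar tolerance, and the function name are illustrative assumptions, not the tracker's actual software interface.

```python
import math

def check_tool_tolerances(nominal, measured, tol):
    """Compare nominal (CAD) inter-target distances with the distances
    measured by the tracker, returning the target pairs whose distance
    error exceeds the tolerance `tol`."""
    failures = []
    names = sorted(nominal)
    for i, p in enumerate(names):
        for q in names[i + 1:]:
            err = abs(math.dist(nominal[p], nominal[q])
                      - math.dist(measured[p], measured[q]))
            if err > tol:
                failures.append((p, q, err))
    return failures
```

A nonempty return value would trigger the actions described above, such as re-measuring the offending points for a longer time or alerting the operator.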
Embodiments of the two-face test are described in U.S. Pat. No. 7,327,446 ('446) to Cramer et al., which is incorporated by reference in its entirety. The tracker 10 makes a two-face measurement of one or more target points 26. If the two-face error obtained exceeds the specified value (for example, as given in the manufacturer's data sheet), a further step might be for the tracker to carry out a compensation procedure to improve tracker performance. There are two types of compensation procedures that are most commonly carried out (although other procedures are possible). These two procedures are the self-compensation procedure described in the '446 patent and the pointing compensation procedure. The pointing compensation procedure includes making a number of two-face measurements by pointing at targets that may be mounted on the floor, on pedestals, or on an object. After collecting the data from the pointing compensation, the tracker automatically corrects its internal parameters, thereby improving its measurement accuracy.
Another measurement procedure might be to check for stability of measurements over time. For example, the tracker may measure a target point on a floor and another target point on the tool. If the relative positions of these two target points change over the course of a measurement, the tracker may send the operator a warning. Similarly, the tracker may measure the distance between three points on a tool, and then come back at the end of the measurement and measure these three points again. If the relative positions of these points change, the validity of the entire measurement is called into question, and additional measurements may be required.
Although the discussion has mostly treated the case in which the one or more cameras are located on the payload of the laser tracker, it will be understood by one of ordinary skill in the art that such cameras may be located internally to the laser tracker (e.g., coaxial with the optical axis of the laser tracker), located on the azimuth carriage 14 of the laser tracker 10, or entirely off the laser tracker.
Many types of peripheral devices are possible, but here three such devices are shown: a temperature sensor 1582, a six-DOF probe 1584, and a personal digital assistant 1586, which might be a smart phone, for example. The laser tracker may communicate with peripheral devices by a variety of means, including wireless communication over the antenna 1572, by means of a vision system such as a camera, and by means of distance and angular readings of the laser tracker to a cooperative target such as the six-DOF probe 1584.
In an embodiment, a separate communications bus goes from the master processor 1520 to each of the electronics units 1530, 1540, 1550, 1560, 1565, and 1570. Each communications line may have, for example, three serial lines that include the data line, clock line, and frame line. The frame line indicates whether or not the electronics unit should pay attention to the clock line. If it indicates that attention should be given, the electronics unit reads the current value of the data line at each clock signal. The clock signal may correspond, for example, to a rising edge of a clock pulse. In an embodiment, information is transmitted over the data line in the form of a packet. In an embodiment, each packet includes an address, a numeric value, a data message, and a checksum. The address indicates where, within the electronics unit, the data message is to be directed. The location may, for example, correspond to a processor subroutine within the electronics unit. The numeric value indicates the length of the data message. The data message contains data or instructions for the electronics unit to carry out. The checksum is a numeric value used to minimize the chance that errors transmitted over the communications line go undetected.
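The packet layout just described can be sketched as follows. The field widths, byte order, and summation checksum are illustrative assumptions; the disclosure specifies only that each packet carries an address, a length (the numeric value), a data message, and a checksum.

```python
import struct

def encode_packet(address: int, message: bytes) -> bytes:
    """Pack address, length (the 'numeric value'), data message, and a
    16-bit summation checksum.  Field widths are assumptions."""
    body = struct.pack(">HH", address, len(message)) + message
    checksum = sum(body) & 0xFFFF
    return body + struct.pack(">H", checksum)

def decode_packet(packet: bytes):
    """Unpack a packet, verifying the checksum before returning the
    address and data message."""
    address, length = struct.unpack(">HH", packet[:4])
    message = packet[4:4 + length]
    (checksum,) = struct.unpack(">H", packet[4 + length:6 + length])
    if checksum != sum(packet[:4 + length]) & 0xFFFF:
        raise ValueError("checksum mismatch")
    return address, message
```

A receiver that detects a checksum mismatch would discard the packet rather than act on a corrupted instruction.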
In an embodiment, the master processor 1520 sends packets of information over bus 1610 to payload functions electronics 1530, over bus 1611 to azimuth encoder electronics 1540, over bus 1612 to zenith encoder electronics 1550, over bus 1613 to display and UI electronics 1560, over bus 1614 to removable storage hardware 1565, and over bus 1616 to RFID and wireless electronics 1570.
In an embodiment, master processor 1520 also sends a synch (synchronization) pulse over the synch bus 1630 to each of the electronics units at the same time. The synch pulse provides a way of synchronizing values collected by the measurement functions of the laser tracker. For example, the azimuth encoder electronics 1540 and the zenith electronics 1550 latch their encoder values as soon as the synch pulse is received. Similarly, the payload functions electronics 1530 latch the data collected by the electronics contained within the payload. The six-DOF, ADM, and position detector all latch data when the synch pulse is given. In most cases, the camera and inclinometer collect data at a slower rate than the synch pulse rate but may latch data at multiples of the synch pulse period.
The laser tracker electronics processing system 1510 may communicate with an external computer 1590, or it may provide computation, display, and UI functions within the laser tracker. The laser tracker communicates with computer 1590 over communications link 1606, which might be, for example, an Ethernet line or a wireless connection. The laser tracker may also communicate with other elements 1600, represented by the cloud, over communications link 1602, which might include one or more electrical cables, such as Ethernet cables, and one or more wireless connections. An example of an element 1600 is another three-dimensional test instrument—for example, an articulated arm CMM, which may be relocated by the laser tracker. A communication link 1604 between the computer 1590 and the elements 1600 may be wired (e.g., Ethernet) or wireless. An operator sitting at a remote computer 1590 may make a connection to the Internet, represented by the cloud 1600, over an Ethernet or wireless line, which in turn connects to the master processor 1520 over an Ethernet or wireless line. In this way, a user may control the action of a remote laser tracker.
Step 710 is to store a list of nominal coordinates for the first target, the second target, the third target, and at least one additional point. The nominal coordinates are three-dimensional coordinates given in a second frame of reference. The second frame of reference is associated with an object under test or with a structure to which the object under test is attached. An example of a second frame of reference 40 is shown in
Step 715 is to capture on the first photosensitive array a portion of the light emitted by the second light source and reflected off the first target, the second target, and the third target.
Step 720 is to obtain spot positions on the first photosensitive array from the portion of light reflected off the first target, the second target, and the third target. The spot positions may, for example, be centroids of the spots for the first target, the second target, and the third target.
Step 725 is to determine a correspondence between a first spot position, a second spot position, and a third spot position on the first photosensitive array and the nominal coordinates of the first target, the second target, and the third target, respectively. Such a correspondence may be obtained in a variety of ways, for example, according to the methods described in the claims herein below. One such method includes observing the possible correspondences within an allowable range of orientations of the second frame of reference with respect to the first frame of reference. Another method involves using a triangulation method with two (stereo) cameras located on the laser tracker. Another method involves using a single tracker camera but rotating the tracker to two different orientations. With this method, the two images obtained on the camera photosensitive array are used to determine the correspondence. Measurements can also be made with the first camera in frontsight and backsight modes and the images obtained on the first photosensitive array used to determine the correspondence. The relative positions of the first and second frames of reference may be changed, and the resulting pattern of spots on the first photosensitive array used to determine the correspondence. For example, the second frame of reference may be associated with a moving carriage, as shown in
Step 730 is to direct the first beam of light to the first target based at least in part on the nominal coordinates of the first target and the first spot position and to measure the three-dimensional coordinates of the first target using the absolute distance meter, the first angular transducer, and the second angular transducer. As explained in step 725, some of the methods of obtaining correspondences yield three-dimensional coordinates with respect to the first frame of reference, so that directing the laser beam to the first, second, and third targets is straightforward. In the case in which the directions are based on the constraints between the first and second frames of reference, the optimal directions to the first, second, and third targets may not be known to high precision; however, a direction can be obtained by assuming a distance to the target. The resulting direction is generally close enough to the optimal direction to enable the target to be captured, for example, by means of the methods described in the claims herein below. The measured three-dimensional coordinates are in the first frame of reference, which is the frame of reference of the laser tracker.
Step 735 is the same as step 730, only applied to the second target rather than the first target. Step 740 is the same as step 735, only applied to the third target rather than the first target.
Step 745 is to determine three-dimensional coordinates of the at least one additional point in the first frame of reference based at least in part on the measured three-dimensional coordinates of the first target, the second target, and the third target, and on the nominal coordinates of the at least one additional point. This is a mathematical step, which may be carried out, for example, by obtaining a transformation matrix that enables transformation of any nominal coordinate (a three-dimensional coordinate within the second frame of reference) into a three-dimensional coordinate within the first (tracker) frame of reference. The three-dimensional coordinates obtained in steps 730, 735, and 740 are sufficient to determine the transformation matrix, using methods that are well known to those skilled in the art.
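One well-known way to obtain such a transformation from three non-collinear point pairs is to build an orthonormal frame from the three points in each coordinate system and compose the two frames. The pure-Python sketch below is illustrative only; it returns a rotation matrix R and translation vector t such that a nominal point x maps to R x + t in the tracker frame.

```python
import math

def sub(a, b):
    return [a[i] - b[i] for i in range(3)]

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def norm(a):
    m = math.sqrt(sum(x * x for x in a))
    return [x / m for x in a]

def frame(p1, p2, p3):
    """Orthonormal basis (as rows) built from three non-collinear points."""
    e1 = norm(sub(p2, p1))
    e3 = norm(cross(e1, sub(p3, p1)))
    e2 = cross(e3, e1)
    return [e1, e2, e3]

def rigid_transform(nominal, measured):
    """Return (R, t) mapping the nominal (second) frame into the
    measured (tracker) frame, from three corresponding points each."""
    E = frame(*nominal)
    F = frame(*measured)
    # R = F^T E takes the nominal basis onto the measured basis
    R = [[sum(F[k][r] * E[k][c] for k in range(3)) for c in range(3)]
         for r in range(3)]
    Rp = [sum(R[r][c] * nominal[0][c] for c in range(3)) for r in range(3)]
    t = sub(measured[0], Rp)
    return R, t

def apply(R, t, x):
    """Transform a nominal point x into the tracker frame: R x + t."""
    return [sum(R[r][c] * x[c] for c in range(3)) + t[r] for r in range(3)]
```

With more than three measured targets, a least-squares method such as the Kabsch algorithm would typically be preferred over this exact three-point construction.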
Step 750 is to store the three-dimensional coordinates of the at least one additional point. The coordinates may be stored in electronic readable media, in a computer memory, or in a microprocessor, for example. Step 755 is the end of the method having the steps 700.
Step 810 is to place a selected retroreflector target to intercept the first light beam. One way to do this is for the operator to move a handheld selected retroreflector target into the first beam of light. A second way is to place the selected retroreflector target in a nest, for example, a magnetic nest, mounted on an object under test. If the three-dimensional coordinates of the at least one additional point are known accurately enough, the first light beam will be directed well enough that at least a portion of the beam will be captured by the clear aperture of the selected retroreflector target.
Step 815 is to direct the first light beam to the center of the selected retroreflector target. This step is performed by the tracking system of the laser tracker. Step 820 is to measure three-dimensional coordinates of the selected retroreflector target using the absolute distance meter, the first angular transducer, and the second angular transducer. The method 800 ends with step 825.
The step 910 is to move the first light beam in a first pattern in space, the first pattern proximate to the at least one additional point. Such a first pattern is usually referred to as a search pattern. As an example, the light beam may begin at an initial position and then move in a spiral pattern outward.
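A search pattern of this kind can be generated as a sequence of small angular offsets applied to the beam-steering mechanism. The step size, number of turns, and points per turn in the sketch below are illustrative assumptions.

```python
import math

def spiral_search(step_rad=1e-4, turns=10, points_per_turn=36):
    """Yield (azimuth, zenith) angular offsets, in radians, tracing an
    Archimedean spiral outward from the initial aim point."""
    for n in range(turns * points_per_turn):
        theta = 2 * math.pi * n / points_per_turn
        r = step_rad * n / points_per_turn  # radius grows by step_rad per turn
        yield (r * math.cos(theta), r * math.sin(theta))
```

In practice, the tracker would stop stepping through these offsets as soon as reflected light is detected on the position detector.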
Step 915 is to detect the light beam with the tracking system of the laser tracker. This may occur when a portion of the first beam of light reflected off a retroreflector strikes a position detector. This detecting of light by the position detector indicates that the first light beam has intercepted the clear aperture of the retroreflector target.
Step 920 is to direct the first light beam to a center of the selected retroreflector target. As explained above, the center in this context refers to a position relative to the retroreflector target about which light beams reflect symmetrically. The term center in this context does not necessarily refer to a physical center of the retroreflector target.
Step 925 is to measure three-dimensional coordinates of the selected retroreflector target using the absolute distance meter, the first angular transducer, and the second angular transducer. The method 900 ends with step 930.
Step 1010 is to capture on the third photosensitive array a portion of the light emitted by the fourth light source and reflected off the first target, the second target, and the third target. Images of the first target, the second target, and the third target were already obtained with the first camera. Now additional measurements are collected using the third camera. In some cases, the information obtained from the first camera may be used to steer the laser tracker to a position enabling the third camera to view the first, second, or third target.
Step 1015 is to obtain spot positions on the third photosensitive array from the portion of light reflected off the first target, the second target, and the third target. Such spot positions may be obtained, for example, as centroids of the spots.
Step 1020 is to determine a correspondence between a first spot position, a second spot position, and a third spot position on the third photosensitive array and the nominal coordinates of the first target, the second target, and the third target, respectively. The methods for determining a correspondence are the same as those discussed herein above, but relatively more accurate information is provided by the third camera, which has a narrower field of view than the first camera. The method 1000 ends with step 1025.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that may contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, C# or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). Aspects of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, may be implemented by computer program instructions.
These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer program instructions may also be stored in a computer readable medium that may direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
Any flowcharts and block diagrams in the FIGURES illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the FIGURES. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, may be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
While preferred embodiments have been shown and described, various modifications and substitutions may be made thereto without departing from the spirit and scope of the invention. Accordingly, it is to be understood that the present invention has been described by way of illustrations and not limitation.
The presently disclosed embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims, rather than the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
The present application is a divisional application of U.S. patent application Ser. No. 14/199,211 filed on Mar. 6, 2014. Application Ser. No. 14/199,211 is a divisional application of U.S. patent application Ser. No. 13/860,010 filed on Apr. 10, 2013, which is a divisional application of U.S. application Ser. No. 13/418,899 filed on Mar. 13, 2012, which is a nonprovisional application of U.S. Provisional Patent Application No. 61/452,314 filed Mar. 14, 2011, the entire contents of which are hereby incorporated by reference. U.S. patent application Ser. No. 13/418,899 is also a continuation-in-part application of U.S. patent application Ser. No. 13/340,730 filed Dec. 30, 2011, which is a continuation-in-part application of U.S. patent application Ser. No. 13/090,889 filed Apr. 20, 2011, which is a nonprovisional application of U.S. Provisional Patent Application No. 61/326,294 filed Apr. 21, 2010, the entire contents of all of which are hereby incorporated by reference. U.S. patent application Ser. No. 13/418,899 is further a nonprovisional application of U.S. Provisional Patent Application No. 61/475,703 filed Apr. 15, 2011 and U.S. Provisional Patent Application No. 61/592,049 filed Jan. 30, 2012, the entire contents of both of which are hereby incorporated by reference. U.S. patent application Ser. No. 13/418,899 is also a continuation-in-part application of U.S. patent application Ser. No. 13/407,983 filed Feb. 29, 2012, which is a nonprovisional application of U.S. Provisional Patent Application No. 61/448,823 filed Mar. 3, 2011, the entire contents of both of which are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
2612994 | Woodland | Oct 1952 | A |
2682804 | Clifford et al. | Jul 1954 | A |
2784641 | Keuffel et al. | Mar 1957 | A |
3339457 | Pun | Sep 1967 | A |
3365717 | Holscher | Jan 1968 | A |
3464770 | Schmidt | Sep 1969 | A |
3497695 | Smith et al. | Feb 1970 | A |
3508828 | Froome et al. | Apr 1970 | A |
3619058 | Hewlett et al. | Nov 1971 | A |
3627429 | Jaenicke et al. | Dec 1971 | A |
3658426 | Vyce | Apr 1972 | A |
3728025 | Madigan et al. | Apr 1973 | A |
3740141 | DeWitt, Jr. | Jun 1973 | A |
3779645 | Nakazawa et al. | Dec 1973 | A |
3813165 | Hines et al. | May 1974 | A |
3832056 | Shipp et al. | Aug 1974 | A |
3900260 | Wendt | Aug 1975 | A |
3914052 | Wiklund | Oct 1975 | A |
4178515 | Tarasevich | Dec 1979 | A |
4297030 | Chaborski | Oct 1981 | A |
4403857 | Holscher | Sep 1983 | A |
4413907 | Lane | Nov 1983 | A |
4453825 | Buck et al. | Jun 1984 | A |
4498764 | Bolkow et al. | Feb 1985 | A |
4521107 | Chaborski et al. | Jun 1985 | A |
4531833 | Ohtomo | Jul 1985 | A |
4537475 | Summers et al. | Aug 1985 | A |
4560270 | Wiklund et al. | Dec 1985 | A |
4632547 | Kaplan et al. | Dec 1986 | A |
4652130 | Tank | Mar 1987 | A |
4689489 | Cole | Aug 1987 | A |
4692023 | Ohtomo et al. | Sep 1987 | A |
4699508 | Bolkow et al. | Oct 1987 | A |
4707129 | Hashimoto et al. | Nov 1987 | A |
4714339 | Lau et al. | Dec 1987 | A |
4731812 | Akerberg | Mar 1988 | A |
4731879 | Sepp et al. | Mar 1988 | A |
4767257 | Kato | Aug 1988 | A |
4777660 | Gould et al. | Oct 1988 | A |
4790651 | Brown et al. | Dec 1988 | A |
4839507 | May | Jun 1989 | A |
4983021 | Fergason | Jan 1991 | A |
5002388 | Ohishi et al. | Mar 1991 | A |
5051934 | Wiklund | Sep 1991 | A |
5069524 | Watanabe et al. | Dec 1991 | A |
5082364 | Russell | Jan 1992 | A |
5090131 | Deer | Feb 1992 | A |
5121242 | Kennedy | Jun 1992 | A |
5137354 | deVos et al. | Aug 1992 | A |
5138154 | Hotelling | Aug 1992 | A |
5162862 | Bartram et al. | Nov 1992 | A |
5198868 | Saito et al. | Mar 1993 | A |
5237384 | Fukunaga et al. | Aug 1993 | A |
5263103 | Kosinski | Nov 1993 | A |
5267014 | Prenninger | Nov 1993 | A |
5301005 | deVos et al. | Apr 1994 | A |
5313409 | Wiklund et al. | May 1994 | A |
5319434 | Croteau et al. | Jun 1994 | A |
5347306 | Nitta | Sep 1994 | A |
5392521 | Allen | Feb 1995 | A |
5400130 | Tsujimoto et al. | Mar 1995 | A |
5402193 | Choate | Mar 1995 | A |
5416321 | Sebastian et al. | May 1995 | A |
5440112 | Sakimura et al. | Aug 1995 | A |
5440326 | Quinn | Aug 1995 | A |
5448505 | Novak | Sep 1995 | A |
5455670 | Payne et al. | Oct 1995 | A |
5500737 | Donaldson et al. | Mar 1996 | A |
5532816 | Spann et al. | Jul 1996 | A |
5534992 | Takeshima et al. | Jul 1996 | A |
5594169 | Field et al. | Jan 1997 | A |
D378751 | Smith | Apr 1997 | S |
5671160 | Julian | Sep 1997 | A |
5698784 | Hotelling et al. | Dec 1997 | A |
5724264 | Rosenberg et al. | Mar 1998 | A |
5737068 | Kaneko et al. | Apr 1998 | A |
5742379 | Reifer | Apr 1998 | A |
5754284 | Leblanc et al. | May 1998 | A |
RE35816 | Schulz | Jun 1998 | E |
5764360 | Meier | Jun 1998 | A |
5767952 | Ohtomo et al. | Jun 1998 | A |
5771623 | Pernstich et al. | Jun 1998 | A |
5825350 | Case, Jr. et al. | Oct 1998 | A |
5828057 | Hertzman et al. | Oct 1998 | A |
5861956 | Bridges et al. | Jan 1999 | A |
5880822 | Kubo | Mar 1999 | A |
5886775 | Houser et al. | Mar 1999 | A |
5886777 | Hirunuma | Mar 1999 | A |
5892575 | Marino | Apr 1999 | A |
5893214 | Meier et al. | Apr 1999 | A |
5898421 | Quinn | Apr 1999 | A |
5926388 | Kimbrough et al. | Jul 1999 | A |
5957559 | Rueb et al. | Sep 1999 | A |
5973788 | Pettersen et al. | Oct 1999 | A |
5991011 | Damm | Nov 1999 | A |
6017125 | Vann | Jan 2000 | A |
6023326 | Katayama et al. | Feb 2000 | A |
6034722 | Viney et al. | Mar 2000 | A |
6036319 | Rueb et al. | Mar 2000 | A |
6052190 | Sekowski et al. | Apr 2000 | A |
D427087 | Kaneko et al. | Jun 2000 | S |
6085155 | Hayase et al. | Jul 2000 | A |
6097491 | Hartrumpf | Aug 2000 | A |
6100540 | Ducharme et al. | Aug 2000 | A |
6111563 | Hines | Aug 2000 | A |
6122058 | Van Der Werf et al. | Sep 2000 | A |
6133998 | Monz et al. | Oct 2000 | A |
6166809 | Pettersen et al. | Dec 2000 | A |
6171018 | Ohtomo et al. | Jan 2001 | B1 |
6193371 | Snook | Feb 2001 | B1 |
6222465 | Kumar et al. | Apr 2001 | B1 |
6262801 | Shibuya et al. | Jul 2001 | B1 |
6295174 | Ishinabe et al. | Sep 2001 | B1 |
6317954 | Cunningham et al. | Nov 2001 | B1 |
6324024 | Shirai et al. | Nov 2001 | B1 |
6330379 | Hendriksen | Dec 2001 | B1 |
6344846 | Hines | Feb 2002 | B1 |
6347290 | Bartlett | Feb 2002 | B1 |
6351483 | Chen | Feb 2002 | B1 |
6353764 | Imagawa et al. | Mar 2002 | B1 |
6369794 | Sakurai et al. | Apr 2002 | B1 |
6369880 | Steinlechner | Apr 2002 | B1 |
6433866 | Nichols | Aug 2002 | B1 |
6437859 | Ohtomo et al. | Aug 2002 | B1 |
6445446 | Kumagai et al. | Sep 2002 | B1 |
6462810 | Muraoka et al. | Oct 2002 | B1 |
6463393 | Giger | Oct 2002 | B1 |
6490027 | Rajchel et al. | Dec 2002 | B1 |
6532060 | Kindaichi et al. | Mar 2003 | B1 |
6559931 | Kawamura et al. | May 2003 | B2 |
6563569 | Osawa et al. | May 2003 | B2 |
6567101 | Thomas | May 2003 | B1 |
6573883 | Bartlett | Jun 2003 | B1 |
6573981 | Kumagai et al. | Jun 2003 | B2 |
6583862 | Perger | Jun 2003 | B1 |
6587244 | Ishinabe et al. | Jul 2003 | B1 |
6624916 | Green et al. | Sep 2003 | B1 |
6633367 | Gogolla | Oct 2003 | B2 |
6646732 | Ohtomo et al. | Nov 2003 | B2 |
6667798 | Markendorf et al. | Dec 2003 | B1 |
6668466 | Bieg et al. | Dec 2003 | B1 |
6678059 | Cho et al. | Jan 2004 | B2 |
6681031 | Cohen et al. | Jan 2004 | B2 |
6727984 | Becht | Apr 2004 | B2 |
6727985 | Giger | Apr 2004 | B2 |
6754370 | Hall-Holt et al. | Jun 2004 | B1 |
6765653 | Shirai et al. | Jul 2004 | B2 |
6802133 | Jordil et al. | Oct 2004 | B2 |
6847436 | Bridges | Jan 2005 | B2 |
6859744 | Giger | Feb 2005 | B2 |
6864966 | Giger | Mar 2005 | B2 |
6935036 | Raab | Aug 2005 | B2 |
6957493 | Kumagai et al. | Oct 2005 | B2 |
6964113 | Bridges et al. | Nov 2005 | B2 |
6965843 | Raab et al. | Nov 2005 | B2 |
6980881 | Greenwood et al. | Dec 2005 | B2 |
6996912 | Raab | Feb 2006 | B2 |
6996914 | Istre et al. | Feb 2006 | B1 |
7022971 | Ura et al. | Apr 2006 | B2 |
7023531 | Gogolla et al. | Apr 2006 | B2 |
7055253 | Kaneko | Jun 2006 | B2 |
7072032 | Kumagai et al. | Jul 2006 | B2 |
7086169 | Bayham et al. | Aug 2006 | B1 |
7095490 | Ohtomo et al. | Aug 2006 | B2 |
7099000 | Connolly | Aug 2006 | B2 |
7129927 | Mattsson | Oct 2006 | B2 |
7130035 | Ohtomo et al. | Oct 2006 | B2 |
7168174 | Piekutowski | Jan 2007 | B2 |
7177014 | Mori et al. | Feb 2007 | B2 |
7193695 | Sugiura | Mar 2007 | B2 |
7196776 | Ohtomo et al. | Mar 2007 | B2 |
7222021 | Ootomo et al. | May 2007 | B2 |
7224444 | Stierle et al. | May 2007 | B2 |
7230689 | Lau | Jun 2007 | B2 |
7233316 | Smith et al. | Jun 2007 | B2 |
7246030 | Raab et al. | Jul 2007 | B2 |
7248374 | Bridges | Jul 2007 | B2 |
7253891 | Toker et al. | Aug 2007 | B2 |
7256899 | Faul et al. | Aug 2007 | B1 |
7262863 | Schmidt et al. | Aug 2007 | B2 |
7274802 | Kumagai et al. | Sep 2007 | B2 |
7285793 | Husted | Oct 2007 | B2 |
7286246 | Yoshida | Oct 2007 | B2 |
7304729 | Yasutomi et al. | Dec 2007 | B2 |
7307710 | Gatsios et al. | Dec 2007 | B2 |
7312862 | Zumbrunn et al. | Dec 2007 | B2 |
7321420 | Yasutomi et al. | Jan 2008 | B2 |
7325326 | Istre et al. | Feb 2008 | B1 |
7327446 | Cramer et al. | Feb 2008 | B2 |
7336346 | Aoki et al. | Feb 2008 | B2 |
7339655 | Nakamura et al. | Mar 2008 | B2 |
7345748 | Sugiura et al. | Mar 2008 | B2 |
7352446 | Bridges et al. | Apr 2008 | B2 |
7372558 | Kaufman et al. | May 2008 | B2 |
7388654 | Raab et al. | Jun 2008 | B2 |
7388658 | Glimm | Jun 2008 | B2 |
7401783 | Pryor | Jul 2008 | B2 |
7423742 | Gatsios et al. | Sep 2008 | B2 |
7446863 | Nishita et al. | Nov 2008 | B2 |
7453554 | Yang et al. | Nov 2008 | B2 |
7466401 | Cramer et al. | Dec 2008 | B2 |
7471377 | Liu et al. | Dec 2008 | B2 |
7474388 | Ohtomo et al. | Jan 2009 | B2 |
7480037 | Palmateer et al. | Jan 2009 | B2 |
7492444 | Osada | Feb 2009 | B2 |
7503123 | Matsuo et al. | Mar 2009 | B2 |
7511824 | Sebastian et al. | Mar 2009 | B2 |
7518709 | Oishi et al. | Apr 2009 | B2 |
7535555 | Nishizawa et al. | May 2009 | B2 |
7541965 | Ouchi et al. | Jun 2009 | B2 |
7552539 | Piekutowski | Jun 2009 | B2 |
7555766 | Kondo et al. | Jun 2009 | B2 |
7562459 | Fourquin et al. | Jul 2009 | B2 |
7564538 | Sakimura et al. | Jul 2009 | B2 |
7565216 | Soucy | Jul 2009 | B2 |
7583375 | Cramer et al. | Sep 2009 | B2 |
7586586 | Constantikes | Sep 2009 | B2 |
7613501 | Scherch | Nov 2009 | B2 |
7614019 | Rimas Ribikauskas et al. | Nov 2009 | B2 |
D605959 | Apotheloz | Dec 2009 | S |
7634374 | Chouinard et al. | Dec 2009 | B2 |
7634381 | Westermark et al. | Dec 2009 | B2 |
7692628 | Smith et al. | Apr 2010 | B2 |
7701559 | Bridges et al. | Apr 2010 | B2 |
7701566 | Kumagai et al. | Apr 2010 | B2 |
7705830 | Westerman et al. | Apr 2010 | B2 |
7710396 | Smith et al. | May 2010 | B2 |
7724380 | Horita et al. | May 2010 | B2 |
7728963 | Kirschner | Jun 2010 | B2 |
7738083 | Luo et al. | Jun 2010 | B2 |
7751654 | Lipson et al. | Jul 2010 | B2 |
7761814 | Rimas-Ribikauskas et al. | Jul 2010 | B2 |
7765084 | Westermark et al. | Jul 2010 | B2 |
7782298 | Smith et al. | Aug 2010 | B2 |
7800758 | Bridges et al. | Sep 2010 | B1 |
7804051 | Hingerling et al. | Sep 2010 | B2 |
7804602 | Raab | Sep 2010 | B2 |
7812736 | Collingwood et al. | Oct 2010 | B2 |
7812969 | Morimoto et al. | Oct 2010 | B2 |
D629314 | Ogasawara | Dec 2010 | S |
7876457 | Rueb | Jan 2011 | B2 |
7894079 | Altendorf et al. | Feb 2011 | B1 |
7903237 | Li | Mar 2011 | B1 |
7929150 | Schweiger | Apr 2011 | B1 |
7976387 | Venkatesh et al. | Jul 2011 | B2 |
7983872 | Makino et al. | Jul 2011 | B2 |
7990523 | Schlierbach et al. | Aug 2011 | B2 |
7990550 | Aebischer et al. | Aug 2011 | B2 |
8087315 | Goossen et al. | Jan 2012 | B2 |
8094121 | Obermeyer et al. | Jan 2012 | B2 |
8151477 | Tait | Apr 2012 | B2 |
8190030 | Leclair et al. | May 2012 | B2 |
8217893 | Quinn et al. | Jul 2012 | B2 |
8237934 | Cooke et al. | Aug 2012 | B1 |
8279430 | Dold et al. | Oct 2012 | B2 |
8314939 | Kato et al. | Nov 2012 | B2 |
8320708 | Kurzweil et al. | Nov 2012 | B2 |
8360240 | Kallabis | Jan 2013 | B2 |
8379224 | Piasse et al. | Feb 2013 | B1 |
8387961 | Im | Mar 2013 | B2 |
8405604 | Pryor et al. | Mar 2013 | B2 |
8422034 | Steffensen et al. | Apr 2013 | B2 |
8437011 | Steffensen et al. | May 2013 | B2 |
8438747 | Ferrari et al. | May 2013 | B2 |
8467072 | Cramer et al. | Jun 2013 | B2 |
8483512 | Moeller | Jul 2013 | B2 |
8509949 | Bordyn et al. | Aug 2013 | B2 |
8537375 | Steffensen et al. | Sep 2013 | B2 |
8553212 | Jaeger et al. | Oct 2013 | B2 |
8654354 | Steffensen et al. | Feb 2014 | B2 |
8659749 | Bridges | Feb 2014 | B2 |
8670114 | Bridges et al. | Mar 2014 | B2 |
8681317 | Moser et al. | Mar 2014 | B2 |
8699756 | Jensen et al. | Apr 2014 | B2 |
8717545 | Sebastian et al. | May 2014 | B2 |
8772719 | Böckem et al. | Jul 2014 | B2 |
8931183 | Jonas | Jan 2015 | B2 |
20010045534 | Kimura | Nov 2001 | A1 |
20020093646 | Muraoka | Jul 2002 | A1 |
20020148133 | Bridges et al. | Oct 2002 | A1 |
20020179866 | Hoeller et al. | Dec 2002 | A1 |
20030014212 | Ralston et al. | Jan 2003 | A1 |
20030020895 | Bridges | Jan 2003 | A1 |
20030033041 | Richey | Feb 2003 | A1 |
20030048459 | Gooch | Mar 2003 | A1 |
20030066202 | Eaton | Apr 2003 | A1 |
20030090682 | Gooch et al. | May 2003 | A1 |
20030112449 | Tu et al. | Jun 2003 | A1 |
20030133092 | Rogers | Jul 2003 | A1 |
20030179362 | Osawa et al. | Sep 2003 | A1 |
20030206285 | Lau | Nov 2003 | A1 |
20040041996 | Abe | Mar 2004 | A1 |
20040075823 | Lewis et al. | Apr 2004 | A1 |
20040170363 | Angela | Sep 2004 | A1 |
20040189944 | Kaufman et al. | Sep 2004 | A1 |
20040223139 | Vogel | Nov 2004 | A1 |
20050147477 | Clark | Jul 2005 | A1 |
20050179890 | Cramer et al. | Aug 2005 | A1 |
20050185182 | Raab et al. | Aug 2005 | A1 |
20050197145 | Chae et al. | Sep 2005 | A1 |
20050254043 | Chiba | Nov 2005 | A1 |
20050284937 | Xi et al. | Dec 2005 | A1 |
20060009929 | Boyette et al. | Jan 2006 | A1 |
20060053647 | Raab et al. | Mar 2006 | A1 |
20060055662 | Rimas-Ribikauskas et al. | Mar 2006 | A1 |
20060055685 | Rimas-Ribikauskas et al. | Mar 2006 | A1 |
20060066836 | Bridges et al. | Mar 2006 | A1 |
20060103853 | Palmateer | May 2006 | A1 |
20060132803 | Clair et al. | Jun 2006 | A1 |
20060140473 | Brooksby et al. | Jun 2006 | A1 |
20060141435 | Chiang | Jun 2006 | A1 |
20060145703 | Steinbichler et al. | Jul 2006 | A1 |
20060146009 | Syrbe et al. | Jul 2006 | A1 |
20060161379 | Ellenby et al. | Jul 2006 | A1 |
20060164384 | Smith et al. | Jul 2006 | A1 |
20060164385 | Smith et al. | Jul 2006 | A1 |
20060164386 | Smith et al. | Jul 2006 | A1 |
20060222237 | Du et al. | Oct 2006 | A1 |
20060222314 | Zumbrunn et al. | Oct 2006 | A1 |
20060235611 | Deaton et al. | Oct 2006 | A1 |
20060262001 | Ouchi et al. | Nov 2006 | A1 |
20060279246 | Hashimoto et al. | Dec 2006 | A1 |
20070016386 | Husted | Jan 2007 | A1 |
20070019212 | Gatsios et al. | Jan 2007 | A1 |
20070024842 | Nishizawa et al. | Feb 2007 | A1 |
20070090309 | Hu et al. | Apr 2007 | A1 |
20070121095 | Lewis | May 2007 | A1 |
20070127013 | Hertzman et al. | Jun 2007 | A1 |
20070130785 | Bublitz et al. | Jun 2007 | A1 |
20070236452 | Venkatesh et al. | Oct 2007 | A1 |
20070247615 | Bridges et al. | Oct 2007 | A1 |
20070285672 | Mukai et al. | Dec 2007 | A1 |
20080002866 | Fujiwara | Jan 2008 | A1 |
20080024795 | Yamamoto et al. | Jan 2008 | A1 |
20080043409 | Kallabis | Feb 2008 | A1 |
20080107305 | Vanderkooy et al. | May 2008 | A1 |
20080122786 | Pryor et al. | May 2008 | A1 |
20080203299 | Kozuma et al. | Aug 2008 | A1 |
20080229592 | Hinderling et al. | Sep 2008 | A1 |
20080239281 | Bridges | Oct 2008 | A1 |
20080246974 | Wilson et al. | Oct 2008 | A1 |
20080250659 | Bellerose et al. | Oct 2008 | A1 |
20080297808 | Riza et al. | Dec 2008 | A1 |
20080302200 | Tobey | Dec 2008 | A1 |
20080309949 | Rueb | Dec 2008 | A1 |
20080316497 | Taketomi et al. | Dec 2008 | A1 |
20080316503 | Smarsh et al. | Dec 2008 | A1 |
20090009747 | Wolf et al. | Jan 2009 | A1 |
20090033621 | Quinn et al. | Feb 2009 | A1 |
20090046271 | Constantikes | Feb 2009 | A1 |
20090066932 | Bridges et al. | Mar 2009 | A1 |
20090109426 | Cramer et al. | Apr 2009 | A1 |
20090153817 | Kawakubo | Jun 2009 | A1 |
20090157226 | De Smet | Jun 2009 | A1 |
20090171618 | Kumagai et al. | Jul 2009 | A1 |
20090187373 | Atwell et al. | Jul 2009 | A1 |
20090190125 | Foster et al. | Jul 2009 | A1 |
20090205088 | Crampton et al. | Aug 2009 | A1 |
20090213073 | Obermeyer et al. | Aug 2009 | A1 |
20090239581 | Lee | Sep 2009 | A1 |
20090240372 | Bordyn et al. | Sep 2009 | A1 |
20090240461 | Makino et al. | Sep 2009 | A1 |
20090240462 | Lee | Sep 2009 | A1 |
20100008543 | Yamada et al. | Jan 2010 | A1 |
20100025746 | Chapman et al. | Feb 2010 | A1 |
20100058252 | Ko | Mar 2010 | A1 |
20100091112 | Veeser et al. | Apr 2010 | A1 |
20100103431 | Demopoulos | Apr 2010 | A1 |
20100128259 | Bridges et al. | May 2010 | A1 |
20100142798 | Weston et al. | Jun 2010 | A1 |
20100149518 | Nordenfelt et al. | Jun 2010 | A1 |
20100149525 | Lau | Jun 2010 | A1 |
20100158361 | Grafinger et al. | Jun 2010 | A1 |
20100176270 | Lau et al. | Jul 2010 | A1 |
20100207938 | Yau et al. | Aug 2010 | A1 |
20100225746 | Shpunt et al. | Sep 2010 | A1 |
20100234094 | Gagner et al. | Sep 2010 | A1 |
20100235786 | Maizels et al. | Sep 2010 | A1 |
20100245851 | Teodorescu | Sep 2010 | A1 |
20100250175 | Briggs et al. | Sep 2010 | A1 |
20100265316 | Sali et al. | Oct 2010 | A1 |
20100277747 | Rueb et al. | Nov 2010 | A1 |
20100284082 | Shpunt et al. | Nov 2010 | A1 |
20110007154 | Vogel et al. | Jan 2011 | A1 |
20110023578 | Grasser | Feb 2011 | A1 |
20110025827 | Shpunt et al. | Feb 2011 | A1 |
20110032509 | Bridges | Feb 2011 | A1 |
20110035952 | Roithmeier | Feb 2011 | A1 |
20110043620 | Svanholm et al. | Feb 2011 | A1 |
20110043808 | Isozaki et al. | Feb 2011 | A1 |
20110052006 | Gurman et al. | Mar 2011 | A1 |
20110069322 | Hoffer, Jr. | Mar 2011 | A1 |
20110107611 | Desforges et al. | May 2011 | A1 |
20110107612 | Ferrari et al. | May 2011 | A1 |
20110107613 | Tait | May 2011 | A1 |
20110107614 | Champ | May 2011 | A1 |
20110109502 | Sullivan | May 2011 | A1 |
20110112786 | Desforges et al. | May 2011 | A1 |
20110123097 | Van Coppenolle et al. | May 2011 | A1 |
20110166824 | Haisty et al. | Jul 2011 | A1 |
20110169924 | Haisty et al. | Jul 2011 | A1 |
20110173827 | Bailey et al. | Jul 2011 | A1 |
20110175745 | Atwell et al. | Jul 2011 | A1 |
20110179281 | Chevallier-Mames et al. | Jul 2011 | A1 |
20110181872 | Dold et al. | Jul 2011 | A1 |
20110260033 | Steffensen et al. | Oct 2011 | A1 |
20110301902 | Panagas et al. | Dec 2011 | A1 |
20120050255 | Thomas et al. | Mar 2012 | A1 |
20120062706 | Keshavmurthy et al. | Mar 2012 | A1 |
20120099117 | Hanchett et al. | Apr 2012 | A1 |
20120105821 | Moser et al. | May 2012 | A1 |
20120120391 | Dold et al. | May 2012 | A1 |
20120120415 | Steffensen et al. | May 2012 | A1 |
20120154577 | Yoshikawa et al. | Jun 2012 | A1 |
20120188559 | Becker et al. | Jul 2012 | A1 |
20120206808 | Brown et al. | Aug 2012 | A1 |
20120218563 | Spruck et al. | Aug 2012 | A1 |
20120262550 | Bridges | Oct 2012 | A1 |
20120262573 | Bridges et al. | Oct 2012 | A1 |
20120262728 | Bridges et al. | Oct 2012 | A1 |
20120265479 | Bridges et al. | Oct 2012 | A1 |
20120317826 | Jonas | Dec 2012 | A1 |
20130096873 | Rosengaus et al. | Apr 2013 | A1 |
20130100282 | Siercks et al. | Apr 2013 | A1 |
20130155386 | Bridges et al. | Jun 2013 | A1 |
20130162469 | Zogg et al. | Jun 2013 | A1 |
20140002806 | Buchel et al. | Jan 2014 | A1 |
20140028805 | Tohme et al. | Jan 2014 | A1 |
Number | Date | Country |
---|---|---|
2811444 | Mar 2012 | CA |
1290850 | Apr 2001 | CN |
1362692 | Aug 2002 | CN |
101203730 | Jun 2008 | CN |
101297176 | Oct 2008 | CN |
3530922 | Aug 1984 | DE |
3827458 | Feb 1990 | DE |
202004004945 | Oct 2004 | DE |
102004024171 | Sep 2005 | DE |
202006020299 | May 2008 | DE |
0797076 | Sep 1997 | EP |
0919831 | Jun 1999 | EP |
0957336 | Nov 1999 | EP |
1067363 | Jan 2001 | EP |
1519141 | Mar 2005 | EP |
1607767 | Dec 2005 | EP |
2136178 | Dec 2009 | EP |
2177868 | Apr 2010 | EP |
2259013 | Dec 2010 | EP |
S5848881 | Mar 1983 | JP |
6097288 | Jul 1990 | JP |
H0331715 | Feb 1991 | JP |
H0465631 | Mar 1992 | JP |
H05257005 | Oct 1993 | JP |
H05302976 | Nov 1993 | JP |
H06214186 | Aug 1994 | JP |
H06265355 | Sep 1994 | JP |
H074967 | Jan 1995 | JP |
H08145679 | Jun 1996 | JP |
H0914965 | Jan 1997 | JP |
H09113223 | May 1997 | JP |
H102722 | Jan 1998 | JP |
H10317874 | Dec 1998 | JP |
H11337642 | Dec 1999 | JP |
2000275042 | Oct 2000 | JP |
2000346645 | Dec 2000 | JP |
2001165662 | Jun 2001 | JP |
2001272468 | Oct 2001 | JP |
2001284317 | Oct 2001 | JP |
2001353112 | Dec 2001 | JP |
2002098762 | Apr 2002 | JP |
2002209361 | Jul 2002 | JP |
2004508954 | Mar 2004 | JP |
2004108939 | Apr 2004 | JP |
2005010585 | Jan 2005 | JP |
2005265700 | Sep 2005 | JP |
2006084460 | Mar 2006 | JP |
2006276012 | Oct 2006 | JP |
2007165331 | Jun 2007 | JP |
2007256872 | Oct 2007 | JP |
2009014639 | Jan 2009 | JP |
2011158371 | Aug 2011 | JP |
5302976 | Oct 2013 | JP |
9534849 | Dec 1995 | WO |
0177613 | Oct 2001 | WO |
0223121 | Mar 2002 | WO |
0237466 | May 2002 | WO |
02084327 | Oct 2002 | WO |
03062744 | Jul 2003 | WO |
03073121 | Sep 2003 | WO |
2004063668 | Jul 2004 | WO |
2005026772 | Mar 2005 | WO |
2006039682 | Apr 2006 | WO |
2006055770 | May 2006 | WO |
2007124010 | Nov 2007 | WO |
2011057130 | May 2011 | WO |
Entry |
---|
“DLP-Based Structured Light 3D Imaging Technologies and Applications” by J. Geng; Proceedings of SPIE, vol. 7932. Published Feb. 11, 2011, 15 pages. |
Parker, et al “Instrument for Setting Radio Telescope Surfaces” (4 pp) ASPE Proceedings, vol. 15, Oct. 22, 2000. |
Cao, et al.; “VisionWand: Interaction Techniques for Large Displays using a Passive Wand Tracked in 3D”; Proceedings of the 16th Annual ACM Symposium on User Interface Software and Technology, UIST; vol. 5, issue 2; pp. 173-182; Jan. 2003. |
DE Office Action dated Dec. 4, 2014 corresponding to German App. No. 11 2012 001 254.2. |
FARO Technical Institute, Basic Measurement Training Workbook, Version 1.0, FARO Laser Tracker, Jan. 2008, Student's Book, FARO CAM2 Measure. |
GB OA dated Nov. 20, 2014 corresponding to GB App. No. GB1407388.6. |
GB Office Action dated Nov. 20, 2014 corresponding to GB App. No. 1407389.4. |
Hanwei Xiong et al: “The Development of Optical Fringe Measurement System integrated with a CMM for Products Inspection.” Proceedings of SPIE, vol. 7855, Nov. 3, 2010, pp. 78551W-7855W-8, XP055118356. ISSN: 0277-786X. |
Integrated Optical Amplitude Modulator; [on-line technical data sheet]; [Retrieved Oct. 14, 2010]; Jenoptik; Retrieved from http://www.jenoptik.com/cms/products.nsf/0/A6DF20B50AEE7819C12576FE0074E8E6/$File/amplitudemodulators_en.pdf?Open. |
Turk, et al., “Perceptual Interfaces”, UCSB Technical Report 2003-33, pp. 1-43 [Retrieved Aug. 11, 2011, http://www.cs.ucsb.edu/research/tech_reports/reports/2003-33.pdf] (2003). |
Tracker3; Ultra-Portable Laser Tracking System; 4 pages; 2010 Automated Precision Inc.; www.apisensor.com. |
Kollorz, et al., “Gesture recognition with a time-of-flight camera”, International Journal of Intelligent Systems Technologies and Applications, vol. 5, No. 3/4, pp. 334-343, [Retrieved Aug. 11, 2011; http://www5.informatik.uni-erlangen.de/Forschung/Publikat. |
LaserTRACER - measuring sub-micron in space; http://www.etalon-ag.com/index.php/en/products/lasertracer; 4 pages; Jun. 28, 2011; ETALON AG. |
Leica Absolute Tracker AT401-ASME B89.4.19/2006 Specifications; Hexagon Metrology; Leica Geosystems Metrology Products, Switzerland; 2 pages; www.leica-geosystems.com/metrology. |
Leica Geosystems AG ED—“Leica Laser Tracker System”, Internet Citation, Jun. 28, 2012, XP002678836, Retrieved from the Internet: URL:http://www.a-solution.com.au/pages/downloads/LTD500_Brochure_EN.pdf. |
Leica Geosystems: “TPS1100 Professional Series”, 1999, Retrieved from the Internet: URL:http://www.estig.ipbeja.pt/~legvm/top_civil/TPS1100%20-%20A%20New%20Generation%20of%20Total%20Stations.pdf, [Retrieved on Jul. 2012]. |
Lightvision—High Speed Variable Optical Attenuators (VOA); [on-line]; A publication of Lightwaves 2020, Feb. 1, 2008; Retrieved from http://www.lightwaves2020.com/home/. |
Maekynen, A. J. et al., Tracking Laser Radar for 3-D Shape Measurements of Large Industrial Objects Based on Time-of-Flight Laser Rangefinding and Position-Sensitive Detection Techniques, IEEE Transactions on Instrumentation and Measurement, vol. 43, 1994. |
Making the Big Step from Electronics to Photonics by Modulating a Beam of Light with Electricity; May 18, 2005; [on-line]; [Retrieved May 7, 2009]; Cornell University News Service; Retrieved from http://www.news.cornell.edu/stories/May05/LipsonElectroOptica. |
Matsumaru, K., “Mobile Robot with Preliminary-Announcement and Display Function of Forthcoming Motion Using Projection Equipment,” Robot and Human Interactive Communication, 2006. RO-MAN06. The 15th IEEE International Symposium, pp. 443-450, Sep. 6-8. |
MEMS Variable Optical Attenuators Single/Multi-Channel; [on-line]; Jan. 17, 2005; Retrieved from www.ozoptics.com. |
Nanona High Speed & Low Loss Optical Switch; [on-line technical data sheet]; [Retrieved Oct. 14, 2010]; Retrieved from http://www.bostonati.com/products/PI-FOS.pdf. |
New River Kinematics, SA ARM—“The Ultimate Measurement Software for Arms, Software Release!”, SA Sep. 30, 2010, [On-line], http://www.kinematics.com/news/software-release-sa20100930.html (1 of 14), [Retrieved Apr. 13, 2011 11:40:47 AM]. |
Office Action for Japanese Patent Application No. PCT/US2012028984; Date of Mailing Feb. 17, 2012. |
Optical Circulators Improve Bidirectional Fiber Systems; by Jay S. Van Delden; [online]; [Retrieved May 18, 2009]; Laser Focus World; Retrieved from http://www.laserfocusworld.com/display—article/28411/12/nonc/nonc/News/Optical-circulators-improve-bidirecti. |
Ou-Yang, Mang, et al., “High-Dynamic-Range Laser Range Finders Based on a Novel Multimodulated Frequency Method”, Optical Engineering, vol. 45, No. 12, Jan. 1, 2006, p. 123603, XP55031001, ISSN: 0091-3286, DOI: 10.1117/1.2402517. |
PCMM System Specifications Leica Absolute Tracker and Leica T-Products; Hexagon Metrology; Leica Geosystems Metrology Products, Switzerland; 8 pages; www.leica-geosystems.com/metrology, Feb. 21, 2012. |
Poujouly, Stephane, et al., “A Twofold Modulation Frequency Laser Range Finder; A Twofold Modulation Frequency Laser Range Finder”, Journal of Optics. A, Pure and Applied Optics, Institute of Physics Publishing, Bristol, GB, vol. 4, No. 6, Nov. 4, 2002. |
Poujouly, Stephane, et al., Digital Laser Range Finder: Phase-Shift Estimation by Undersampling Technique; IEEE, Copyright 1999. |
Rahman, et al., “Spatial-Geometric Approach to Physical Mobile Interaction Based on Accelerometer and IR Sensory Data Fusion”, ACM Transactions on Multimedia Computing, Communications and Applications, vol. 6, No. 4, Article 28, Publication date: Nov. 2010. |
RS Series Remote Controlled Optical Switch; [on-line technical data sheet]; Sercalo Microtechnology, Ltd. [Retrieved Oct. 14, 2010]; Retrieved from http://www.sercalo.com/document/PDFs/DataSheets/RS%20datasheet.pdf. |
Stone, et al. “Automated Part Tracking on the Construction Job Site” 8 pp; XP 55055816A; National Institute of Standards and Technology. |
Super-Nyquist Operation of the AD9912 Yields a High RF Output Signal; Analog Devices, Inc., AN-939 Application Note; www.analog.com; Copyright 2007. |
Office Action for Chinese Patent Application No. 201280013306.X dated Jul. 15, 2015; 1-7 pages. |
Search Report for Chinese Patent Application No. 201280013306.X dated Jul. 3, 2015; 1-2 pages. |
Number | Date | Country | |
---|---|---|---|
20150092199 A1 | Apr 2015 | US |
Number | Date | Country | |
---|---|---|---|
61452314 | Mar 2011 | US | |
61326294 | Apr 2010 | US | |
61475703 | Apr 2011 | US | |
61592049 | Jan 2012 | US | |
61448823 | Mar 2011 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 14199211 | Mar 2014 | US |
Child | 14565624 | US | |
Parent | 13860010 | Apr 2013 | US |
Child | 14199211 | US | |
Parent | 13418899 | Mar 2012 | US |
Child | 13860010 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13340730 | Dec 2011 | US |
Child | 13418899 | US | |
Parent | 13090889 | Apr 2011 | US |
Child | 13340730 | US | |
Parent | 13407983 | Feb 2012 | US |
Child | 13418899 | US |