System and method of acquiring three-dimensional coordinates using multiple coordinate measurement devices

Abstract
A method is provided of determining three-dimensional coordinates of an object surface with a laser tracker and a structured light scanner. The method includes providing the scanner having a body, a pair of cameras, a projector, a retroreflector, and a processor. The projector and cameras are positioned in a non-collinear arrangement. The projector is configured to project a first pattern onto the surface. The method also includes providing the tracker, which emits a beam of light onto the retroreflector and receives a reflected beam of light. The first location and the first orientation of the scanner are measured with the tracker. The first surface pattern is projected onto the surface, and a pair of images of the surface pattern is acquired with the cameras. The processor determines the 3D coordinates of a first plurality of points in the tracker frame of reference based in part on epipolar constraints among the cameras and the projector.
Description
BACKGROUND OF THE INVENTION

The subject matter disclosed herein relates to a system and method of acquiring three-dimensional coordinates of points on a surface of an object and in particular to a system and method of operating a laser tracker in conjunction with a scanner device to track the position and orientation of the scanner device during operation.


The acquisition of three-dimensional coordinates of an object or an environment is known. Various techniques may be used, such as time-of-flight (TOF) or triangulation methods for example. A TOF system such as a laser tracker, for example, directs a beam of light such as a laser beam toward a retroreflector target positioned over a spot to be measured. An absolute distance meter (ADM) is used to determine the distance from the distance meter to the retroreflector based on the length of time it takes the light to travel to the spot and return. By moving the retroreflector target over the surface of the object, the coordinates of the object surface may be ascertained. Another example of a TOF system is a laser scanner that measures a distance to a spot on a diffuse surface with an ADM that measures the time for the light to travel to the spot and return. TOF systems have advantages in being accurate, but in some cases may be slower than systems that project a plurality of light spots onto the surface at each instant in time.
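
By way of a non-limiting illustration, the time-of-flight relationship described above may be expressed in a short Python sketch. The names and values below are hypothetical and assume an idealized round-trip time measurement rather than the internal processing of any particular ADM.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second, in vacuum

def tof_distance(round_trip_time_s, refractive_index=1.000268):
    """Return the one-way distance for a measured round-trip time.

    The light travels to the target and back, so the one-way distance is half
    the total path length; the refractive index of air slightly reduces the
    effective speed of light.
    """
    return 0.5 * (SPEED_OF_LIGHT / refractive_index) * round_trip_time_s

# Example: a round trip of about 66.7 nanoseconds corresponds to roughly 10 m.
print(tof_distance(66.7e-9))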


In contrast, a triangulation system such as a scanner projects either a line of light (e.g. from a laser line probe) or a pattern of light (e.g. from a structured light projector) onto the surface. In this system, a camera is coupled to a projector in a fixed mechanical relationship. The line or pattern of light emitted from the projector is reflected off of the surface and detected by the camera. Since the camera and projector are arranged in a fixed relationship, the distance to the object may be determined from captured images using trigonometric principles. Triangulation systems provide advantages in quickly acquiring coordinate data over large areas.
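
As a simple illustration of the triangulation principle, and not of the specific devices described herein, the sketch below computes the range to a surface point from an assumed baseline and two assumed viewing angles using the law of sines.

import math

def triangulate_range(baseline_m, projector_angle_rad, camera_angle_rad):
    """Return the distance from the camera to the surface point.

    The projector and camera are separated by a known baseline and each views
    the same surface point at a known angle measured from the baseline.
    """
    apex_angle = math.pi - projector_angle_rad - camera_angle_rad  # angle at the surface point
    return baseline_m * math.sin(projector_angle_rad) / math.sin(apex_angle)

# Example: 0.2 m baseline, both rays at 80 degrees from the baseline.
print(triangulate_range(0.2, math.radians(80), math.radians(80)))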


In some systems, during the scanning process, the scanner acquires, at different times, a series of images of the patterns of light formed on the object surface. These multiple images are then registered relative to each other so that the position and orientation of each image relative to the other images is known. Where the scanner is handheld, various techniques have been used to register the images. One common technique uses features in the images to match overlapping areas of adjacent image frames. This technique works well when the object being measured has many features relative to the field of view of the scanner. However, if the object contains a relatively large flat or curved surface, the images may not properly register relative to each other.


Accordingly, while existing coordinate measurement devices are suitable for their intended purposes, the need for improvement remains, particularly in improving the registration of images acquired by a scanner device.


BRIEF DESCRIPTION OF THE INVENTION

According to one aspect of the invention, a method of determining three-dimensional (3D) coordinates of an object surface with a six degree-of-freedom (DOF) laser tracker and a portable structured light scanner is provided. The method comprises providing the scanner having a body, a first camera, a second camera, a first projector, and a processor. The first camera, second camera, and the first projector are coupled to the body, the first camera having a first camera perspective center at a first camera position, the second camera having a second camera perspective center at a second camera position, and the first projector having a first projector perspective center at a first projector position, respectively, in a scanner frame of reference. The first projector position being non-collinear with respect to the first camera position and the second camera position. The first projector configured to produce a first projector pattern of light within the projector and to project the first projector pattern onto the surface as a first surface pattern. The first projector pattern of light being a pattern of light having uniformly spaced elements in each of two dimensions of two-dimensional space. The scanner further having a first retroreflector coupled to the body. The tracker is provided having a tracker frame of reference. The scanner having a first pose in the tracker frame of reference, the first pose including a first location and a first orientation, each of the first location and the first orientation being defined by three degrees of freedom. An emitted beam of light from the tracker is locked onto the first retroreflector. The tracker receives a reflected portion of the emitted beam of light. The tracker measures the first location, the location based at least in part on a first distance, a first angle, and a second angle. The first distance being a distance from the tracker to the retroreflector, the first distance measured with a distance meter, a first angle measured with a first angle measuring device, and a second angle measured with a second angle measuring device. The tracker measures the first orientation. The first surface pattern is projected onto the surface. The first surface pattern is imaged with the first camera to obtain a first image. The first surface pattern is imaged with the second camera to obtain a second image. The processor determines the 3D coordinates of a first plurality of points in the tracker frame of reference based at least in part on the first location, the first orientation, the first projector pattern, the first image, the second image, the first camera position, the second camera position, and the first projector position. The determining of the 3D coordinates being based at least in part on the use of epipolar constraints among the first camera, the second camera, and the first projector. The 3D coordinates are stored.


These and other advantages and features will become more apparent from the following description taken in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

The subject matter, which is regarded as the invention, is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features, and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:



FIG. 1 is a perspective view of a system for measuring an object in accordance with an embodiment of the invention;



FIG. 2A and FIG. 2B are schematic illustrations of the production of a pattern by means of a diffractive optical element used in the scanner of FIG. 1;



FIG. 3 illustrates a first pattern and a second pattern for use with the scanner of FIG. 1;



FIG. 4 is a schematic illustration of a projector plane, an image plane and epipolar lines;



FIG. 5 is a schematic illustration of an averaging process;



FIG. 6 is a schematic illustration of a ring closure process;



FIG. 7 is a flow diagram illustrating the operation of the system of FIG. 1; and



FIG. 8 is a perspective view of a system for measuring an object in accordance with another embodiment of the invention.





The detailed description explains embodiments of the invention, together with advantages and features, by way of example with reference to the drawings.


DETAILED DESCRIPTION OF THE INVENTION

Embodiments of the present invention provide advantages in registration of images acquired by a scanner device. Embodiments of the invention provide further advantages in the tracking of a handheld scanner device with a coordinate measurement device such as a laser tracker.


Referring to the FIG. 1, a system 20 is shown for measuring the three-dimensional coordinates of an object 22. The system includes a first coordinate measurement device, such as a six degree-of-freedom (six-DOF) laser tracker 24, that cooperates with a six-DOF retroreflector. In one embodiment the six-DOF retroreflector may be a six-DOF spherically mounted retroreflector (SMR) 48, attached to a second coordinate measurement device, such as scanner 26.


The laser tracker 24 includes a light source that emits light, for example, a laser, and a distance meter. The light source and distance meter are configured to emit and receive light 28 via an aperture 30. The distance meter may be an absolute distance meter assembly which allows the laser tracker 24 to optically measure the distance between the laser tracker 24 and a six-DOF retroreflector.


In other embodiments, the six-DOF laser tracker 24 may operate with a different type of six-DOF target affixed to the scanner 26. In an embodiment, the six-DOF target includes a cube-corner retroreflector and a collection of light points that are imaged by a camera attached to the tracker. In a further embodiment, the six-DOF tracker works with a type of six-DOF target that includes a glass cube-corner retroreflector prism that has its vertex beveled off to permit light to pass through to a position detector for measurement of pitch and yaw angles of the six-DOF target. This six-DOF target may include a mechanical pendulum that permits low friction rotation with measurement of rotation by an angular encoder attached to the pendulum. Other types of six-DOF targets and six-DOF laser trackers are possible.


The six-DOF laser tracker 24 may include motors, angular encoders and a position detector that allow the laser tracker 24 to track the position of a retroreflector as it is moved through space. Provided within the tracker is a controller 32 having a processor configured to determine the three-dimensional coordinates of the retroreflector based at least in part on the distance to the retroreflector and on signals from the angular encoders. In addition, the six-DOF laser tracker includes additional methods for determining the three orientational degrees of freedom (e.g., pitch, roll, and yaw). The methods may include steps of imaging, with a camera coupled to the tracker, points of light adjacent to the retroreflector 48. This camera may have a controlled magnification. The methods may also include a wired or wireless communication to obtain data from a position detector and/or an angular encoder attached to a mechanical pendulum. It should be appreciated that these methods are exemplary and other configurations are possible for a six-DOF laser tracker. The controller 32 may further have additional circuitry, including but not limited to communications circuits which allow the laser tracker 24 to communicate with the scanner 26 or a computer 33 via a wired or wireless communications medium 35.
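
By way of example only, the conversion of one measured distance and two measured encoder angles into Cartesian coordinates may be sketched as follows. The spherical angle convention used here is an assumption for illustration; an actual tracker applies its own convention and kinematic compensation parameters.

import math

def tracker_point_from_readings(distance_m, azimuth_rad, zenith_rad):
    """Convert one distance and two encoder angles into Cartesian coordinates.

    Assumes the zenith angle is measured from the tracker's vertical axis and
    the azimuth about that axis; real trackers may differ and also apply
    compensation terms for mechanical imperfections.
    """
    x = distance_m * math.sin(zenith_rad) * math.cos(azimuth_rad)
    y = distance_m * math.sin(zenith_rad) * math.sin(azimuth_rad)
    z = distance_m * math.cos(zenith_rad)
    return (x, y, z)

print(tracker_point_from_readings(5.0, math.radians(30), math.radians(75)))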


The scanner 26 is a portable device that allows an operator to optically scan and measure an object or the environment. The scanner 26 has a base part 104, a grip part 106, which protrudes from the base part 104, and a head end 108. An operator of the scanner 26 may hold the scanner 26 at the grip part 106, which is configured to allow the operator to carry the scanner 26 through the environment and to align the scanner 26 to objects 22 in the environment.


In the exemplary embodiment, the scanner 26 is a structured light type of coordinate measurement device. As will be discussed in more detail below, the scanner 26 emits structured light 123 with a first projector 120 to form a structured light pattern 34 on surfaces of object 22. The light pattern 34 is reflected from the surface of object 22 as reflected light 40 and is received by the cameras 111, 112. A lens 117 (FIG. 2B) within each camera 111, 112 images a portion of reflected light 40 onto a corresponding photosensitive array 119 (FIG. 2B) within each camera 111, 112. It should be appreciated that variations in the surface of the object 22 create distortions in the structured pattern when the image of the pattern is captured by the cameras 111, 112. Since the pattern is formed by structured light, it is possible in some instances for a controller 118 or a remote computing device 33 to determine a one-to-one correspondence between the pixels in the projected pattern and the pixels in the patterns imaged by the cameras 111, 112. The scanner 26 may be a device such as that described in commonly owned U.S. patent application Ser. No. 13/767,167 filed on Feb. 14, 2013, which is incorporated herein by reference.


In the exemplary embodiment, the projector 120 has a diffractive optical element 124. The projector 120 has a projector perspective center 125 and a projector optical axis 127. The ray of light from the light source 121 travels from the light source through the diffractive optical element 124 and through the perspective center 125 to the object 22. Similarly, each camera lens 117 includes a lens perspective center 129 and a lens optical axis. In the embodiment of FIG. 2B, the lens optical axis is collinear with the illustrated ray of light 40. The ray of light 40 reflects off of the object 22, travels through the lens perspective center 129, and intercepts the photosensitive array 119. As will be discussed in more detail herein, the determination of 3D coordinates of points on the object 22 will be based at least in part on the use of epipolar constraints among the cameras 111, 112 and the projector 120.


As will be discussed in more detail below, in the exemplary embodiment, a six degree of freedom (6DOF) retroreflector 48 is coupled to the head end 108 along a top surface. The retroreflector 48 may be similar to the one described in commonly owned U.S. patent application Ser. No. 13/370,339 filed on Feb. 10, 2012 or U.S. patent application Ser. No. 13/407,983 filed on Feb. 29, 2012, both of which are incorporated by reference herein in their entirety. In one embodiment, the retroreflector 48 in the form of a six-DOF SMR is coupled to a nest 50. The nest 50 may be a magnetic nest, or may include a clamping arrangement that holds the retroreflector 48 in place during operation. In still other embodiments, one or more of the retroreflectors 48 are integrated into the scanner 26. In other embodiments, the scanner 26 may include, in addition to a three DOF retroreflector, three or more points of light mounted on the scanner and viewed by a camera on the six-DOF tracker 24, the combination of retroreflector and lights being sufficient to provide the six degrees of freedom of the scanner 26 within a frame of reference of the tracker 24. In another embodiment, the scanner 26 includes a glass cube-corner retroreflector having the vertex beveled so as to permit light to pass through the retroreflector to a position detector. The position of the transmitted light on the position detector may be used to determine the pitch and yaw angles of the scanner 26. This may be used in combination with a low-friction mechanical pendulum coupled to an angular encoder to determine the roll angle of the scanner 26 within the tracker frame of reference. It should be appreciated that the above-described six-DOF targets and trackers are exemplary and not limiting. In other embodiments, other types of six-DOF targets and six-DOF trackers may be used in combination with the scanner 26.


The scanner 26 includes a first camera 111 and a second camera 112 arranged a predetermined distance apart in the head end 108. The first camera 111 and the second camera 112 may be aligned in such a way as to cause the fields of view (FOV) to overlap, thereby providing stereoscopic images of surfaces of object 22. There may be a desired overlap of the camera FOVs that matches, at least approximately, the area of the projected light pattern 34 for a typical distance between the scanner 26 and the object 22. In some embodiments, a typical distance from the scanner 26 to the object 22 may be on the order of several decimeters or a few meters. In an embodiment, the mutual alignment of cameras 111 and 112 is not fixed but can be adjusted by the operator, for example by pivoting the cameras 111, 112 in opposite senses about axes of rotation that are parallel to the grip 106. Such an adjustment may be followed by a compensation procedure, which may include use of a dot plate, to determine the angles of rotation of the cameras 111, 112.


In the exemplary embodiment, the first camera 111 and the second camera 112 are monochrome, i.e. sensitive to a narrow wavelength range, for example by being provided with filters that pass the desired narrow wavelength range and block other wavelength ranges. The narrow wavelength range passed to the photosensitive arrays 119 within the cameras 111, 112 may be within the infrared range. In order to obtain information on the color of the object 22, a color camera 113 may be arranged in the head end 108. In one embodiment, the color camera 113 may be symmetrically aligned to the first camera 111 and to the second camera 112, and arranged centrally therebetween. The color camera 113 is sensitive in the visible light wavelength range.


The scanner 26 may include a display and control unit 115, such as a touch screen for example. The display and control unit 115 may be arranged at the head end 108, on a side opposite the cameras 111, 112. In one embodiment, the display and control unit 115 may be configured to be detachable. The cameras 111, 112 and, if available, camera 113, as well as the display and control unit 115 are connected to a controller 118, which may also be arranged in the head end 108. The controller 118 can pre-process the data of the cameras 111, 112, 113, to produce the 3D-scans and provide suitable views onto the display and control unit 115. In some embodiments the scanner may not have a display and control unit 115, but rather is operated by means of a remote control, such as portable computer 33 for example, which is in continuous connection (cabled or wireless) with the control and evaluation unit 118, such as through medium 35 for example.


It should be appreciated that unless the controller 118 transfers the 3D-scans or the data of the cameras 111, 112, 113, by means of wireless medium 35, the scanner 26 may be provided with a data connection, such as on the base part 104 for example. The data connection can be, for example, a standardized interface for LAN, USB or the like. If appropriate, the data connection can be configured also for introducing a portable storage medium (SD-card, USB-stick etc.). For power supply, a battery may be provided in the base part 104. For charging the battery, a power supply outlet may be provided, preferably on the base part 104. In another embodiment, the battery may be replaced by the user when depleted.


In an embodiment, a first projector 120 is provided in the base part 104. The first projector 120 is aligned in correspondence with the two cameras 111, 112. The relative distance and the relative alignment are pre-set or may be set by the user. The first projector 120 projects the structured light pattern 34 onto the object 22 being scanned.


As used herein, the term “structured light” refers to a two-dimensional pattern of light projected onto a continuous area of an object that conveys information which may be used to determine coordinates of points on the object. A structured light pattern will contain at least three non-collinear pattern elements. Each of the three non-collinear pattern elements conveys information which may be used to determine the point coordinates.


In general, there are two types of structured light patterns, a coded light pattern and an uncoded light pattern. In a coded light pattern, the elements are arranged in identifiable groupings, such as collections of lines or pattern regions. In contrast, an uncoded structured light pattern may include a pattern in which the elements are identical and uniformly spaced, such as a pattern of dots or other geometric shapes.


In the exemplary embodiment, the pattern 34 is an uncoded pattern, for example, a periodic pattern. A similarity in the appearance of the periodic pattern elements is resolved by the use of the two cameras 111, 112 together with a projector 120 located at a position not collinear with the two cameras. With this arrangement, epipolar constraints and related mathematical methods may be used to establish the correspondence between periodic elements projected by the projector 120 and the periodic elements observed by the cameras 111, 112. The uncoded pattern 34 may be a point pattern, comprising a regular arrangement of points in a grid. For example, in one embodiment, the uncoded pattern consists of a 100×100 array of points that are projected over an angle of approximately 50° at a distance of between 0.5 m and 5 m. The pattern 34 can also be a line pattern or a combined pattern of points and lines, each of which is formed by light points. Lenses 117 in the two cameras 111 and 112 form images of the pattern 34 in their respective image planes B111 and B112 (FIG. 4) located on photosensitive arrays 119 (for example, a CMOS or CCD type sensor) to record the pattern 34.
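
For illustration, the following sketch generates unit direction vectors for an uncoded grid of projected dots of the kind described above. The 100×100 size and the approximately 50° field are taken from the example above; the uniform angular spacing is an assumption made for simplicity.

import numpy as np

def dot_pattern_directions(n=100, full_angle_deg=50.0):
    """Return unit direction vectors for an n x n grid of projected dots.

    The dots are uniformly spaced in angle across a square field of roughly
    `full_angle_deg` degrees, mimicking an uncoded periodic point pattern.
    """
    half = np.radians(full_angle_deg) / 2.0
    angles = np.linspace(-half, half, n)
    ax, ay = np.meshgrid(angles, angles)                 # angular offsets in two axes
    dirs = np.stack([np.tan(ax), np.tan(ay), np.ones_like(ax)], axis=-1)
    return dirs / np.linalg.norm(dirs, axis=-1, keepdims=True)

dirs = dot_pattern_directions()
print(dirs.shape)  # (100, 100, 3)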


The resolution in the 3D coordinates obtained for the object 22 may depend on the distance from the scanner 26 to the object 22. For example, to resolve fine structures of the object 22, a relatively high point density may be used, while a relatively low point density may be sufficient to resolve coarse structures. It therefore may be advantageous to produce, in addition to pattern 34, at least one other pattern 34A (FIG. 3). Suitable patterns 34, 34A may be dynamically selected to measure the coarse and fine structures of the object 22.


In one embodiment a second projector 122 may be provided. The second projector 122 may be aligned to produce the second pattern 34A. In another embodiment, the first projector 120 may produce, in addition to pattern 34, the second pattern 34A, where the patterns 34, 34A are offset to each other with respect to time or in another wavelength range. The second pattern 34A may be a pattern different from pattern 34, obtained for example by changing the distance between the points (grid pitch).


In an embodiment, the second pattern 34A overlays the first pattern 34, for example, with a different intensity. A combined pattern may include a first set of light points 34 spaced farther apart but with higher intensity and a second set of light points 34A spaced closer together but with lower intensity. With the combined pattern having spots of differing intensities, it is in some cases possible to overcome issues with different levels of reflected light by properly selecting the exposure times or projected optical power levels.


It is also contemplated that more than two patterns 34, 34A may be used. For example, a defined sequence of differing patterns may be projected over time.


In one embodiment, the monochromatic first pattern 34 (and second pattern 34A) is produced by means of a diffractive optical element 124 (FIGS. 1-2), which divides a light beam, produced by a laser in the (infrared) wavelength range of the two cameras 111, 112, into the first pattern 34 without losing a significant amount of optical power. In the exemplary embodiment, the diffractive optical element 124 is arranged in front of the light source 121 within a projector 120 (FIG. 2B). In this instance, the lateral resolution is determined by the size of the projected points. It is possible to record the images of the color camera 113 and the images of the infrared spots on the cameras 111, 112 without interference between the patterns received by the cameras. The first pattern 34 (and the second pattern 34A) could in other embodiments be produced in the ultraviolet range.


Two patterns 34, 34A may be produced with two diffractive optical elements, which are used at different times or illuminated with different wavelengths. With a time-variable diffractive optical element, it is possible to change between the patterns 34, 34A quickly (i.e. with approximately each frame) or slowly (for example, manually controlled), or the first pattern 34 may be adapted dynamically to the changing situation (with regard to the density of the light points on the object surface and the reach of the projected first pattern 34). A gradual transition between the patterns 34, 34A (fade-over) is conceivable as well. As an alternative to diffractive optical elements, arrays of microlenses or of single lasers can be used. Optionally, classical imaging by means of a mask, in particular a transparency, is also possible.


In one embodiment, to improve energy efficiency, the first projector 120 may be configured to produce the first pattern 34 on the objects 22 only when the cameras 111, 112 (and, if available, camera 113) record images of the objects 22 which are provided with the first pattern 34. For this purpose, the two cameras 111, 112 and the projector 120 (and, if available, the second projector 122) are synchronized, such as coordinated internally with each other for example, with regard to both time and the first pattern 34 (and optionally the second pattern 34A). Each recording process starts with the first projector 120 producing the first pattern 34, similar to a flash in photography, and the cameras 111, 112 (and, if available, camera 113) then follow with their records, more precisely their pairs of records (frames), such as one image from each of the two cameras 111, 112. The recording process can comprise a single frame (shot), or a sequence of a plurality of frames (video). A trigger switch 126, by means of which such a shot or such a video can be triggered, is provided on the grip part 106. After processing of the data, each frame then constitutes a 3D-scan, i.e. a point cloud in three-dimensional space, in relative coordinates of the scanner 26. In another embodiment, the recording process may be triggered by means of a remote control of the scanner 26. As will be discussed in more detail below, the plurality of frames may be registered relative to each other in space using coordinate data acquired by the tracker 24.


The first projector 120 and the second projector 122 may be arranged in a non-collinear position relative to the one camera 111 or two cameras 111, 112. In one embodiment, the projectors 120, 122 and the one camera 111 or two cameras 111, 112 are positioned in a triangular arrangement. This arrangement of the two cameras 111, 112, as well as of the first projector 120 (and optionally of the second projector 122), makes use of mathematical methods of optics, known in the art as epipolar geometry, according to which one point in the image plane B112 of the second camera 112 can be observed on a (known) epipolar line in the image plane B111 of the first camera 111, and vice versa, and/or a point which is produced by the first projector 120 from a projector plane P121 can be observed on one epipolar line each in the image planes B111, B112 of the two cameras 111, 112.
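
The epipolar relationship described above may be illustrated with the short sketch below, which tests whether a candidate point in one image lies near the epipolar line generated by a point in the other image. The fundamental matrix F_12 shown is only a placeholder; in practice it follows from the calibrated relative pose of the two cameras (or of a camera and the projector).

import numpy as np

def epipolar_line(F, x1):
    """Return the epipolar line l2 = F @ x1 in the second image (homogeneous coordinates)."""
    return F @ x1

def point_line_distance(line, x2):
    """Perpendicular distance (in pixels) from homogeneous point x2 to a homogeneous line."""
    a, b, c = line
    return abs(a * x2[0] + b * x2[1] + c) / np.hypot(a, b)

# A candidate correspondence is accepted only if x2 lies near the epipolar line of x1.
F_12 = np.array([[0.0, -1e-4, 0.02],
                 [1e-4, 0.0, -0.03],
                 [-0.02, 0.03, 1.0]])     # placeholder fundamental matrix
x1 = np.array([320.0, 240.0, 1.0])
x2 = np.array([315.0, 260.0, 1.0])
print(point_line_distance(epipolar_line(F_12, x1), x2))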


In the exemplary embodiment, at least three units (projector 120 and the two cameras 111, 112) are involved, i.e. proceeding from each of the units, two stereo geometries each (with a plurality of epipolar lines each) can be defined with the two other units. As a result of this arrangement, unambiguous triangle relations of points and epipolar lines arise, from which the correspondence of projections of the first pattern 34 (and optionally second pattern 34A) in the two image planes B111, B112 can be determined. Due to the additional stereo geometry (compared to a pair of cameras), considerably more of the points of the pattern, which otherwise could not be distinguished, may be identified on a given epipolar line. It should be appreciated that this allows for the identification of the points in an uncoded structured light pattern. The density of features may thus simultaneously be high, and the size of the feature can be kept very small. This provides advantages over other structured light devices that use encoded patterns (having features consisting, for example, of a plurality of points), where the size of the feature has a lower limit, limiting the lateral resolution. If the correspondence has been determined, the three-dimensional coordinates of the points on the surface of the object 22 are determined for the 3D-scan by means of triangulation.
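
Once a correspondence has been established, the triangulation step may be illustrated with a standard linear (direct linear transformation) sketch such as the one below. The projection matrices and pixel coordinates shown are hypothetical values chosen only to demonstrate the computation.

import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Triangulate one 3D point from two 3x4 projection matrices and pixel observations.

    Each observation contributes two rows to a homogeneous system A X = 0, which is
    solved by the singular vector associated with the smallest singular value.
    """
    u1, v1 = x1
    u2, v2 = x2
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical calibration: identical intrinsics, 0.2 m baseline along x.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])
print(triangulate_dlt(P1, P2, (320.0, 240.0), (240.0, 240.0)))   # approximately [0, 0, 2]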


Additional three-dimensional data may be gained by means of photogrammetry from several frames with different camera positions, for example from the color camera 113 or from the part of the signal of the cameras 111, 112 which comes from the ambient light. It can also be advantageous if the scanner 26 or another unit (not shown) illuminates the object 22, and optionally the background, with white light or infrared light for example. This allows for not only the parts of the object 22 (also illuminated by the pattern 34) and background to be visible, but also areas in between. Such illumination may be desirable if the data of the color camera 113 is used for making the 3D-scans (and not only for the coloration thereof), and for calibrating the cameras 111, 112, if filters are used to allow the capture of only a limited spectral range.


The scanning process also shows an aspect of time. Whereas, with stationary devices, a whole sequence of patterns may be projected and images be recorded in order to determine one single 3D-scan, one 3D-scan is produced with each shot of the scanner 26 in the embodiments. In one embodiment, if a second projector 122 or a further diffractive optical element 124 or at least a second pattern 34A in addition to first pattern 34 is provided, it is possible to also record, with one shot, images with different patterns 34, 34A consecutively, so that the 3D-scan will provide a higher resolution.


In order to capture the complete scene, the 3D-scans which are produced with the shot need to be registered, meaning that the three-dimensional point clouds of each frame are inserted into a common coordinate system. Registration may be possible, for example, by videogrammetry, such as by using “structure from motion” (SFM) or “simultaneous localization and mapping” (SLAM) techniques for example. The features, such as edges and corners for example, of the objects 22 may be used for common points of reference, or a stationary pattern 37 may be produced. The natural texture and features of objects may be captured by the color camera 113 and may also provide common points of reference.


In one embodiment, a separate projector 130 shown in FIG. 1 projects a stationary pattern 37 onto the objects 22 to be scanned, which might be a pattern similar to the first pattern 34 or the second pattern 34A, or may be a different pattern, for example a more complex and easily recognizable pattern. While the first pattern 34 and optionally the second pattern 34A move with the scanner 26, the pattern 37 remains stationary. Thus, if the scanner 26 is moved and images are acquired by the scanner 26 from different positions, the images may be registered to a common coordinate system based on the stationary pattern 37. The stationary pattern 37 is then visible in a plurality of images (frames) of the cameras 111, 112, so that the 3D-scans determined therefrom can be registered by means of the stationary pattern 37. In an embodiment, the stationary pattern 37 differs from patterns 34, 34A with regard to geometry, time, or spectrum (or a combination thereof). If it differs with regard to time, the stationary pattern 37 is produced at least in intervals of time in which the first pattern 34 and optionally the second pattern 34A are not produced (alternating or overlapping). If the stationary pattern 37 differs with regard to spectrum, the stationary pattern 37 is within a different wavelength range than the first pattern 34 and optionally the second pattern 34A, so that the cameras 111, 112 are configured to be sensitive to this wavelength, such as through the use of corresponding filters. The separate projector 130 may be synchronized with the scanner 26, such that the time and kind of the projected stationary pattern 37 are known to the scanner 26.


Another method for registration is provided by measuring the six degrees of freedom (DOF) of the scanner 26 with a six-DOF laser tracker 24 and a six-DOF target or targets. There are many types of six-DOF laser trackers and six-DOF targets that may be used, and any of these will suit the purpose described below. In an embodiment, the six-DOF target 48 is a six-DOF retroreflector, which may be a glass cube-corner retroreflector having edges that are darkened so that a camera internal to the tracker 24 can image the lines. The images of these lines may be analyzed to determine the orientation of the retroreflector. The tracker measures the three DOF associated with the x, y, z position of the retroreflector so that, combined with the orientation of the retroreflector, six degrees of freedom are obtained. In the embodiment depicted in FIG. 1, a six-DOF glass retroreflector target is incorporated within a metal sphere to obtain a six-DOF spherically mounted retroreflector (SMR) 48. In some embodiments, the six-DOF SMR is placed in a magnetic nest 50 and rotated into a desired position. In some embodiments, the six-DOF SMR 48 is rotated into position and then fixed in place with a clamp (not shown). A potential advantage of using a six-DOF SMR 48 placed in a magnetic nest is that the SMR may be conveniently rotated into the line of sight of the laser tracker as the scanner 26 is moved around the object 22 to scan the object from all directions. The six-DOF retroreflector 48 is tracked by the six-DOF laser tracker to determine the six degrees of freedom of the scanner 26. This information may be used to convert the three-dimensional coordinates of the object surface 22, measured by the scanner within its local frame of reference, into a global frame of reference. This is accomplished using methods from matrix mathematics that are well understood in the art and will not be described further herein. As the scanner is moved, its position may be continually recalculated. To register the images acquired by the scanner 26 with the position and orientation data acquired by the laser tracker 24, the operation of the scanner 26 may be synchronized with the tracker measurements. In one embodiment, the scanner 26 is coupled for communication to the laser tracker 24 via a wired or wireless communications medium 35. In one embodiment, the communication between the scanner 26 and the laser tracker 24 is optical. The controller 118 of scanner 26 may transmit a trigger signal to the laser tracker 24 via medium 35. The trigger signal may be initiated by the operation (e.g. closing of the shutter) of the cameras 111, 112 for example. It should be appreciated that the direction of the trigger signal may also be reversed and transmitted to the scanner 26 from the laser tracker 24 (or another third device) in response to the scanner 26 being positioned in a desired location and orientation.
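
As a non-limiting illustration of the frame-of-reference conversion described above, the sketch below builds a homogeneous transform from a measured six-DOF pose and applies it to points expressed in the scanner frame of reference. The roll/pitch/yaw convention used is an assumption made for illustration only; an actual six-DOF tracker reports orientation in its own documented convention.

import numpy as np

def pose_to_matrix(position, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from a position and roll/pitch/yaw angles.

    Assumes an intrinsic z-y-x rotation order (yaw, then pitch, then roll).
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = position
    return T

def scanner_points_to_tracker_frame(points_scanner, scanner_pose_in_tracker):
    """Transform an (N, 3) point cloud from the scanner frame into the tracker frame."""
    homo = np.hstack([points_scanner, np.ones((points_scanner.shape[0], 1))])
    return (scanner_pose_in_tracker @ homo.T).T[:, :3]

pose = pose_to_matrix([2.0, 0.5, 1.0], roll=0.0, pitch=0.1, yaw=1.2)
local_points = np.array([[0.0, 0.0, 1.5], [0.1, 0.0, 1.5]])
print(scanner_points_to_tracker_frame(local_points, pose))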


It should still further be appreciated that the synchronization may be realized by other methods than the transmitting of a signal. In one embodiment, the laser tracker 24 and the scanner 26 have a common clock. In this embodiment, the acquisition time for the image and coordinate data are recorded. The time data may then be used to determine a correspondence between the image and coordinate data to register the images. In one embodiment, the clocks of the laser tracker 24 and scanner 26 are synchronized using the Institute of Electrical and Electronics Engineers (IEEE) standard 1588. In still another embodiment, the operation of the laser tracker 24 and the scanner 26 may be synchronized using polling.
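
When a common clock is used, the correspondence between image frames and tracker poses may be found from their timestamps, as illustrated by the sketch below. The 200 Hz pose rate and the nearest-neighbor selection are assumptions; interpolation between the two bracketing poses could be used instead.

import numpy as np

def nearest_pose_indices(frame_times_s, pose_times_s):
    """For each image frame timestamp, return the index of the closest tracker pose sample.

    Assumes both clocks have already been aligned (for example via IEEE 1588) so the
    timestamps share a common time base.
    """
    idx = np.searchsorted(pose_times_s, frame_times_s)
    idx = np.clip(idx, 1, len(pose_times_s) - 1)
    left, right = pose_times_s[idx - 1], pose_times_s[idx]
    take_left = (frame_times_s - left) < (right - frame_times_s)
    return np.where(take_left, idx - 1, idx)

frame_times = np.array([0.013, 0.046, 0.081])      # camera shutter times in seconds
pose_times = np.arange(0.0, 0.1, 0.005)            # tracker sampled at 200 Hz
print(nearest_pose_indices(frame_times, pose_times))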


In one embodiment, the registration of the images obtained by the scanner 26 is performed using a combination of the position and orientation data from the laser tracker 24 and videogrammetry via the pattern 37 projected by the projector 130. This embodiment may provide advantages in verifying that results are self-consistent and that the fixtures mounting the object 22 are stable.


In still other embodiments, movement of the scanner 26 may be automated, such as by mounting the scanner 26 to a manually movable trolley (or on another cart), or on an autonomously moving robot for example. The scanner 26, which is no longer carried by the user, scans its environment in a more defined manner, producing a video rather than a sequence of discrete images.


The scanner 26 may be configured to produce a video with a high rate of image frames, such as seventy frames per second for example. Since the scanner 26 only moves a short distance between any two adjacent frames, the video will contain redundant information; in other words, two frames which are adjacent with regard to time will differ only very slightly spatially. In order to reduce the amount of data to be saved and/or to be transferred, a suitable averaging procedure such as that shown in FIG. 5 may be used in post-processing. In a first averaging step, the frames F are divided into groups [F]i, with a plurality of frames per group [F]i around one key frame Fi each.


Within a group [F]i of substantially overlapping frames F, single measuring points may be efficiently stored in a common two-dimensional data structure (grid structure) related to the surface data and similar to a two-dimensional image, for example. The smaller storage capacity used by the data structure permits the scanner 26 to initially save all captured measured values as a vector in the two-dimensional data structure, i.e. a gray-tone value/color and a distance from the scanner 26 for each of the pixels of the frames F of the group [F]i.


In a second averaging step, an averaging takes place within each group [F]i in order to remove erroneous measurements. For such averaging (with regard to gray tones/colors and/or distances), a defined part of the vector within the central range of the sorted measured values is taken. The central range can be distinguished by means of threshold values. Such averaging corresponds to a replacement of the group [F]i by a key frame Fi with averaged measured values, wherein the key frames Fi still overlap. Each measuring point which is gained is then carried on as a point (corresponding to a three-dimensional vector) of the three-dimensional overall point cloud.


In an optional third step, the measuring points gained by averaging can be brought together with data from another group [F]i, for example by Cartesian averaging.
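
The averaging of a group [F]i may be illustrated by the following sketch, which sorts the measured values of each pixel across the frames of the group and averages only a central portion of the sorted values. The group size, the kept fraction, and the simulated data are assumptions made only for illustration.

import numpy as np

def average_group(depth_frames, keep_fraction=0.5):
    """Collapse a group of substantially overlapping depth frames into one key frame.

    depth_frames has shape (num_frames, height, width). For every pixel the measured
    values are sorted and approximately the central `keep_fraction` of them is averaged,
    which discards outliers at both ends of the sorted values.
    """
    sorted_vals = np.sort(depth_frames, axis=0)
    n = depth_frames.shape[0]
    lo = int(n * (1.0 - keep_fraction) / 2.0)
    hi = max(lo + 1, n - lo)
    return sorted_vals[lo:hi].mean(axis=0)

group = np.random.default_rng(0).normal(2.0, 0.01, size=(15, 4, 4))
group[0] += 0.5   # simulate one grossly erroneous frame
print(average_group(group).round(3))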


When an object 22 is circumnavigated by the scanner 26, a ring closure may occur, meaning that the scanner 26 is moved about the object 22 until the video (or the sequence of shots) shows the same or a similar view to the first image. The ring closures could be recognized immediately if it were possible to look at all available data at any time during the production of the overall point cloud. However, the amount of data and the computing time to perform the operations do not typically allow for such immediate recognition of the ring closure. In one embodiment a method is provided which allows for the rapid determination of a ring closure. In this embodiment, if all measurements are error free, the ring closure may quickly result from the registration of the 3D-scan in the common coordinate system. However, in a typical scanning operation an error may occur resulting in an offset of two similar frames F. An embodiment for automatically recognizing the ring closure, shown in FIG. 6, corrects for such errors.


A frustum, or more precisely a viewing frustum, is usually a truncated-pyramid-shaped area of space, which extends from the image plane, in correspondence with the viewing direction, into the infinite. In the present invention, a frustum V is formed for each frame in a first step, such frustum comprising (at least approximately) 80% of the captured points from the three-dimensional point cloud, i.e. a finite part of said area of space of the assigned 3D scan, which is determined from the frame F. The latest frustum Vn is assigned to the latest frame Fn. In a second step, the latest frustum Vn is then compared to the past frusta V by forming the intersection. The frustum Vj out of the previous frusta with which there is the largest intersection is selected for carrying out an analysis.


In a third step, within the latest frustum Vn and the selected frustum Vj each, features are evaluated, such as edges and corners for example, in a known manner. In a fourth step, the detected features are compared to each other, for example with regard to their embedded geometry, and the coinciding features are identified. Depending on the degree of coincidence, it is determined in a fifth step whether there is a ring closure or not.
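
A simplified version of the frustum comparison in the second step described above may be sketched as below, where each frustum is approximated by the axis-aligned box containing roughly 80% of the frame's points, and the past frame with the largest box intersection is selected. The box approximation is an assumption made for brevity; a true viewing frustum is a truncated pyramid.

import numpy as np

def frustum_box(points, keep=0.8):
    """Approximate a frame's frustum by the axis-aligned box holding the central ~80% of its points."""
    lo = np.percentile(points, (1 - keep) / 2 * 100, axis=0)
    hi = np.percentile(points, (1 + keep) / 2 * 100, axis=0)
    return np.vstack([lo, hi])

def overlap_volume(box_a, box_b):
    """Volume of the intersection of two axis-aligned boxes (zero if disjoint)."""
    lo = np.maximum(box_a[0], box_b[0])
    hi = np.minimum(box_a[1], box_b[1])
    return float(np.prod(np.clip(hi - lo, 0.0, None)))

def best_ring_closure_candidate(latest_points, past_point_clouds):
    """Return the index of the past frame whose approximate frustum overlaps the latest one most."""
    latest_box = frustum_box(latest_points)
    overlaps = [overlap_volume(latest_box, frustum_box(p)) for p in past_point_clouds]
    return int(np.argmax(overlaps))

rng = np.random.default_rng(1)
past = [rng.uniform(i, i + 1.0, size=(500, 3)) for i in range(5)]
latest = rng.uniform(0.2, 1.2, size=(500, 3))
print(best_ring_closure_candidate(latest, past))   # expected: 0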


In this method, common features are generated from the identified, coinciding features. Using a “bundle adjustment” technique, the error of measurement may be corrected in a sixth step; for example, the 3D scans are corrected up to a defined depth of penetration into space, or the three-dimensional point cloud is displaced in some places and to a certain degree, so that the offset is eliminated in the frames, 3D scans and frusta which are substantially identical. If correction is not completely possible after the sixth step (with the “bundle adjustment”), a certain deviation of data, and consequently a certain error of measurement which is not corrected, still remains. This deviation (i.e. the error which cannot be corrected) may be used as a measure for the quality of the measurements and of the data as a whole.


The movement of the scanner 26 and the image frames may also be processed by a method of image tracking, in other words the scanner 26 tracks the relative movement of its environment using the images acquired by a camera, such as camera 113 for example. If image tracking gets lost, such as when the scanner 26 is moved too fast for example, there is a simple possibility of resuming image tracking. For this purpose, the latest video image, as it is provided by the camera, and the last video still image from tracking provided by it, are represented side by side (or one above the other) on the display and control unit 115 for the operator. The operator may then move the scanner 26 until the two images coincide.


It should be appreciated that the method of registering the images via the ring closure and image tracking methods may also be combined with the tracking of the scanner 26 by the six-DOF laser tracker 24. In an embodiment, the laser tracker 24 may determine the position and orientation of the scanner 26 until the scanner 26 is at a position or angle where the light beam 28 is not reflected back to the aperture 30. Once this occurs, the registration of images switches from using the laser tracker 24 coordinate data to a secondary method, such as image tracking for example. Using the combined methods of registration provides advantages in avoiding having to move the laser tracker 24 to scan the opposite side of the object 22.


Referring now to FIG. 7, a method 200 is shown for scanning an object 22 using a laser tracker 24 and a scanner 26. The method 200 begins with compensating the scanner 26 to the tracker 24 in block 202 to enable the tracker to accurately determine the six DOF of the scanner in the tracker frame of reference. In block 202, a compensation procedure is carried out in which compensation parameters are determined for the particular situation or application. For the case of a six-DOF tracker measuring a six-DOF retroreflector having lines at the intersections of the reflecting planes of a cube corner, the parameters would include a position and orientation of the retroreflector in relation to a frame of reference of the scanner. The method 200 then proceeds to block 204 where the scanner 26 acquires a first image F1 of the object 22. The acquisition of the image may be in response to the operator actuating a switch 126, for example. The method 200 then acquires the coordinates C1 of the scanner 26 with the laser tracker 24 in block 206. It should be appreciated that each coordinate data set includes not only position data (X, Y, Z or a translational set of coordinates), but also orientation data (an orientational set of coordinates) of the scanner 26. The position and orientation data of the scanner 26 define a pose of the scanner 26 in the tracker frame of reference. There will be a pose (position and orientation) of the scanner 26 for each image FN acquired. The method 200 may trigger the acquisition of the scanner coordinates by the laser tracker via a signal transmitted over medium 35, for example, or data may be automatically acquired at predetermined intervals, for example, by polling.


Next, the query block 208 determines if the scanning of the object 22 is completed. In an embodiment, if the query block 208 returns a negative value, the method 200 proceeds to block 210 where the indexing variables are incremented and the method 200 loops back to block 204 where the image frames (F2, F3 . . . FN) and block 206 where the coordinates (C2, C3 . . . CY) are acquired. This continues until the scanning of the object 22 desired by the operator is completed.


Once the scan is completed, the query block 208 returns a positive value and proceeds to block 212 where the image frames FN and the coordinates CY are registered to each other. Finally, in block 214 the coordinates of the points on the object 22 are determined in the laser tracker 24 frame of reference.
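
The flow of method 200 may be summarized in the short sketch below. The StubScanner and StubTracker classes are hypothetical stand-ins used only to make the loop self-contained; the actual devices expose their own interfaces and data.

class StubScanner:
    def __init__(self, n): self.n, self.i = n, 0
    def operator_done(self): return self.i >= self.n
    def acquire_frame(self):
        self.i += 1
        return f"frame-{self.i}"                           # image pair F_i (block 204)

class StubTracker:
    def measure_pose(self): return ("x,y,z", "roll,pitch,yaw")   # six-DOF pose C_i (block 206)

def scan_object(scanner, tracker):
    frames, poses = [], []
    while not scanner.operator_done():                     # query block 208
        frames.append(scanner.acquire_frame())
        poses.append(tracker.measure_pose())
    return list(zip(frames, poses))                        # input to registration (blocks 212, 214)

print(scan_object(StubScanner(3), StubTracker()))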


Referring now to FIG. 8, another embodiment of the system 20 is shown having a laser tracker 24 that is used in combination with a scanner 26. In this embodiment, the scanner 26 includes a first 6DOF retroreflector 48 coupled to the head end 108. The retroreflector 48 is configured to receive and reflect an incoming beam of light 28 that is approaching from the side of the scanner 26 where the user interface is located, in other words the rear of the scanner 26. It should be appreciated that as the scanner 26 is moved, as indicated by the arrow 220, the angle of the incoming beam of light 28 will increase until the retroreflector 48 cannot reflect the light back towards the aperture 30. Typically, this transition point occurs at an angle of 90 to 95 degrees from normal. It should further be appreciated that once the retroreflector 48 is not reflecting light back to the laser tracker 24, the laser tracker 24 will no longer be able to determine the location of the scanner 26.


In this embodiment, the scanner 26 also includes a second 6DOF retroreflector 222. The second retroreflector 222 is mounted to the scanner 26 with its reflective elements configured to reflect light that is incoming from a direction substantially opposite from that of the first retroreflector 48. In other words, the second retroreflector 222 is configured to reflect light that is traveling towards the front of the scanner 26. In one embodiment, the second retroreflector is mounted to the base 104. It should be appreciated that while the retroreflectors 48, 222 are illustrated in the embodiments as being arranged on the head end 108 or the base 104, this is for exemplary purposes and the claimed invention should not be so limited. In other embodiments, the retroreflectors 48, 222 may be mounted on other areas of the scanner 26 or on the same area of the scanner 26. In some embodiments, the scanner 26 may have additional retroreflectors, such as four retroreflectors for example, that are each oriented to reflect light arriving from different directions. Such six-DOF retroreflectors 48, 222 may be rotated toward the six-DOF laser tracker if the SMRs are held in magnetic nests. However, such rotation may not be possible if the SMRs are clamped in place. The six-DOF retroreflectors may also be glass cube corners directly embedded into the scanner and held stationary. Of course, as discussed hereinabove, other types of six-DOF targets may be used with laser trackers, some of which involve illuminated markers in addition to a retroreflector, and some of which require active (electrically powered) retroreflective targets.


It should be appreciated that since the retroreflectors 48, 222 are arranged in a fixed geometric relationship, the laser tracker 24 will be able to determine the relative positions of the retroreflectors 48, 222 to each other at any point in time. Therefore, as the scanner 26 approaches a point in space where the laser tracker 24 will switch from one retroreflector to the other retroreflector, the laser tracker 24 will be able to automatically reposition and redirect the beam of light 28 onto the desired retroreflector.
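
The selection between the retroreflectors 48, 222 may be illustrated with the sketch below, which picks the retroreflector whose acceptance axis best faces the incoming beam. The axis directions and the 90 degree cutoff are assumptions based on the transition angle discussed above.

import numpy as np

def select_retroreflector(beam_direction, reflector_axes, max_angle_deg=90.0):
    """Return the index of the retroreflector whose acceptance axis best faces the incoming beam.

    `beam_direction` points from the tracker toward the scanner; each axis points
    outward along a retroreflector's acceptance direction in the scanner frame.
    """
    beam = -np.asarray(beam_direction, dtype=float)    # direction back toward the tracker
    beam /= np.linalg.norm(beam)
    best, best_angle = None, None
    for i, axis in enumerate(reflector_axes):
        axis = np.asarray(axis, dtype=float) / np.linalg.norm(axis)
        angle = np.degrees(np.arccos(np.clip(np.dot(beam, axis), -1.0, 1.0)))
        if angle <= max_angle_deg and (best_angle is None or angle < best_angle):
            best, best_angle = i, angle
    return best

# Rear-facing (48) and front-facing (222) acceptance axes in the scanner frame.
axes = [[0.0, -1.0, 0.0], [0.0, 1.0, 0.0]]
print(select_retroreflector([0.0, 1.0, 0.0], axes))   # beam arrives from the rear -> index 0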


While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.

Claims
  • 1. A method of determining three-dimensional (3D) coordinates of an object surface with a six degree-of-freedom (DOF) laser tracker and a portable structured light scanner, the method comprising: providing the scanner having a body, a first camera, a second camera, a first projector, and a processor, the first camera, the second camera, and the first projector coupled to the body, the first camera having a first camera perspective center at a first camera position, the second camera having a second camera perspective center at a second camera position, and the first projector having a first projector perspective center at a first projector position, respectively, in a scanner frame of reference, the first projector position being non-collinear with respect to the first camera position and the second camera position, the first projector configured to produce a first projector pattern of light within the projector and to project the first projector pattern onto the surface as a first surface pattern, the first projector pattern of light being a pattern of light having uniformly spaced elements in each of two dimensions of two-dimensional space, the scanner further having a first retroreflector coupled to the body; providing the tracker having a tracker frame of reference, the scanner having a first pose in the tracker frame of reference, the first pose including a first location and a first orientation, each of the first location and the first orientation being defined by three degrees of freedom; locking an emitted beam of light from the tracker onto the first retroreflector; receiving by the tracker a reflected portion of the emitted beam of light; measuring with the tracker the first location, the location based at least in part on a first distance, a first angle, and a second angle, the first distance being a distance from the tracker to the retroreflector, the first distance measured with a distance meter, a first angle measured with a first angle measuring device, and a second angle measured with a second angle measuring device; measuring with the tracker the first orientation; projecting onto the surface the first surface pattern; imaging the first surface pattern with the first camera to obtain a first image; imaging the first surface pattern with the second camera to obtain a second image; determining with the processor the 3D coordinates of a first plurality of points in the tracker frame of reference based at least in part on the first location, the first orientation, the first projector pattern, the first image, the second image, the first camera position, the second camera position, and the first projector position, the determining based at least in part on the use of epipolar constraints among the first camera, the second camera, and the first projector; and storing the 3D coordinates.
  • 2. The method of claim 1 further comprising: moving the scanner to a second pose that includes a second location and a second orientation; locking the emitted beam of light from the tracker onto the first retroreflector; receiving by the tracker the reflected portion of the emitted beam of light; measuring with the tracker the second location; measuring with the tracker the second orientation; projecting onto the surface a second surface pattern; imaging the second surface pattern with the first camera to obtain a third image; imaging the second surface pattern with the second camera to obtain a fourth image; and determining with the processor the 3D coordinates of a second plurality of points in the tracker frame of reference based at least in part on the second location, the second orientation, the third image, the fourth image, and the first projector position.
  • 3. The method of claim 1 further comprising synchronizing acquiring of the first image, the acquiring of the second image, the acquiring of the first location, and the acquiring of the first orientation, the synchronizing based at least in part on a trigger signal shared by the scanner and the tracker.
  • 4. The method of claim 1 wherein, in the step of providing the scanner, the scanner further has a second retroreflector coupled to the body, the second retroreflector being oriented in a different direction than the first retroreflector.
  • 5. The method of claim 1 wherein, in the step of providing the scanner, the first retroreflector is configured to be rotated with respect to the body.
  • 6. The method of claim 5 wherein, in the step of providing the scanner, the first retroreflector is further configured to be rotated within a magnetic nest.
  • 7. The method of claim 5 wherein, in the step of providing the scanner, the first retroreflector is a spherically mounted retroreflector.
  • 8. The method of claim 1 wherein, in the step of projecting the first surface pattern onto the surface, the first surface pattern is a first pattern of dots.
CROSS REFERENCE TO RELATED APPLICATIONS

The Present Application is a continuation-in-part application of U.S. patent application Ser. No. 13/443,946 filed on Apr. 11, 2012, which is a nonprovisional application of U.S. Patent Application Ser. No. 61/592,049 filed on Jan. 30, 2012 and U.S. patent application Ser. No. 61/475,703 filed on Apr. 15, 2011, all of which are incorporated herein by reference in their entirety.

6630993 Hedges et al. Oct 2003 B1
6633367 Gogolla Oct 2003 B2
6646732 Ohtomo et al. Nov 2003 B2
6650222 Darr Nov 2003 B2
6667798 Markendorf et al. Dec 2003 B1
6668466 Bieg et al. Dec 2003 B1
6678059 Cho et al. Jan 2004 B2
6681031 Cohen et al. Jan 2004 B2
6727984 Becht Apr 2004 B2
6727985 Giger Apr 2004 B2
6754370 Hall-Holt et al. Jun 2004 B1
6765653 Shirai et al. Jul 2004 B2
6802133 Jordil et al. Oct 2004 B2
6847436 Bridges Jan 2005 B2
6859744 Giger Feb 2005 B2
6864966 Giger Mar 2005 B2
6935036 Raab Aug 2005 B2
6957493 Kumagai et al. Oct 2005 B2
6964113 Bridges et al. Nov 2005 B2
6965843 Raab et al. Nov 2005 B2
6980881 Greenwood et al. Dec 2005 B2
6996912 Raab Feb 2006 B2
6996914 Istre et al. Feb 2006 B1
7022971 Ura et al. Apr 2006 B2
7023531 Gogolla et al. Apr 2006 B2
7055253 Kaneko Jun 2006 B2
7072032 Kumagai et al. Jul 2006 B2
7086169 Bayham et al. Aug 2006 B1
7095490 Ohtomo et al. Aug 2006 B2
7099000 Connolly Aug 2006 B2
7129927 Mattsson Oct 2006 B2
7130035 Ohtomo et al. Oct 2006 B2
7168174 Piekutowski Jan 2007 B2
7177014 Mori et al. Feb 2007 B2
7193695 Sugiura Mar 2007 B2
7196776 Ohtomo et al. Mar 2007 B2
7222021 Ootomo et al. May 2007 B2
7224444 Stierle et al. May 2007 B2
7230689 Lau Jun 2007 B2
7233316 Smith et al. Jun 2007 B2
7246030 Raab et al. Jul 2007 B2
7248374 Bridges Jul 2007 B2
7253891 Toker et al. Aug 2007 B2
7256899 Faul et al. Aug 2007 B1
7262863 Schmidt et al. Aug 2007 B2
7274802 Kumagai et al. Sep 2007 B2
7285793 Husted Oct 2007 B2
7286246 Yoshida Oct 2007 B2
7304729 Yasutomi et al. Dec 2007 B2
7307710 Gatsios et al. Dec 2007 B2
7312862 Zumbrunn et al. Dec 2007 B2
7321420 Yasutomi et al. Jan 2008 B2
7325326 Istre et al. Feb 2008 B1
7327446 Cramer et al. Feb 2008 B2
7336346 Aoki et al. Feb 2008 B2
7336375 Faul et al. Feb 2008 B1
7339655 Nakamura et al. Mar 2008 B2
7345748 Sugiura et al. Mar 2008 B2
7352446 Bridges et al. Apr 2008 B2
7372558 Kaufman et al. May 2008 B2
7388654 Raab et al. Jun 2008 B2
7388658 Glimm Jun 2008 B2
7401783 Pryor Jul 2008 B2
7429112 Metcalfe Sep 2008 B2
7446863 Nishita et al. Nov 2008 B2
7453554 Yang et al. Nov 2008 B2
7466401 Cramer et al. Dec 2008 B2
7471377 Liu et al. Dec 2008 B2
7474388 Ohtomo et al. Jan 2009 B2
7480037 Palmateer et al. Jan 2009 B2
7492444 Osada Feb 2009 B2
7503123 Matsuo et al. Mar 2009 B2
7511824 Sebastian et al. Mar 2009 B2
7518709 Oishi et al. Apr 2009 B2
7535555 Nishizawa et al. May 2009 B2
7541965 Ouchi et al. Jun 2009 B2
7552539 Piekutowski Jun 2009 B2
7555766 Kondo et al. Jun 2009 B2
7562459 Fourquin et al. Jul 2009 B2
7564538 Sakimura et al. Jul 2009 B2
7565216 Soucy Jul 2009 B2
7583375 Cramer et al. Sep 2009 B2
7586586 Constantikes Sep 2009 B2
7613501 Scherch Nov 2009 B2
7614019 Rimas Ribikauskas et al. Nov 2009 B2
D605959 Apotheloz Dec 2009 S
7634374 Chouinard et al. Dec 2009 B2
7634381 Westermark et al. Dec 2009 B2
7692628 Smith et al. Apr 2010 B2
7701559 Bridges et al. Apr 2010 B2
7701566 Kumagai et al. Apr 2010 B2
7705830 Westerman et al. Apr 2010 B2
7710396 Smith et al. May 2010 B2
7724380 Horita et al. May 2010 B2
7728963 Kirschner Jun 2010 B2
7738083 Luo et al. Jun 2010 B2
7751654 Lipson et al. Jul 2010 B2
7761814 Rimas-Ribikauskas et al. Jul 2010 B2
7765084 Westermark et al. Jul 2010 B2
7782298 Smith et al. Aug 2010 B2
7800758 Bridges et al. Sep 2010 B1
7804051 Hingerling et al. Sep 2010 B2
7804602 Raab Sep 2010 B2
7812736 Collingwood et al. Oct 2010 B2
7812969 Morimoto et al. Oct 2010 B2
7876457 Rueb Jan 2011 B2
7894079 Altendorf et al. Feb 2011 B1
7903237 Li Mar 2011 B1
7929150 Schweiger Apr 2011 B1
7954250 Crampton Jun 2011 B2
7976387 Venkatesh et al. Jul 2011 B2
7983872 Makino et al. Jul 2011 B2
7990523 Schlierbach et al. Aug 2011 B2
7990550 Aebischer et al. Aug 2011 B2
8087315 Goossen et al. Jan 2012 B2
8094121 Obermeyer et al. Jan 2012 B2
8094212 Jelinek Jan 2012 B2
8125629 Dold et al. Feb 2012 B2
8151477 Tait Apr 2012 B2
8190030 Leclair et al. May 2012 B2
8217893 Quinn et al. Jul 2012 B2
8237934 Cooke et al. Aug 2012 B1
8244023 Yamada Aug 2012 B2
8279430 Dold et al. Oct 2012 B2
8314939 Kato Nov 2012 B2
8320708 Kurzweil et al. Nov 2012 B2
8360240 Kallabis Jan 2013 B2
8379224 Piasse et al. Feb 2013 B1
8387961 Im Mar 2013 B2
8405604 Pryor et al. Mar 2013 B2
8422034 Steffensen et al. Apr 2013 B2
8437011 Steffensen et al. May 2013 B2
8438747 Ferrari May 2013 B2
8467071 Steffey et al. Jun 2013 B2
8467072 Cramer et al. Jun 2013 B2
8483512 Moeller Jul 2013 B2
8509949 Bordyn et al. Aug 2013 B2
8525983 Bridges et al. Sep 2013 B2
8537371 Steffensen et al. Sep 2013 B2
8537375 Steffensen et al. Sep 2013 B2
8553212 Jaeger et al. Oct 2013 B2
8593648 Cramer et al. Nov 2013 B2
8619265 Steffey et al. Dec 2013 B2
8630314 York Jan 2014 B2
8638984 Roithmeier Jan 2014 B2
8654354 Steffensen et al. Feb 2014 B2
8659749 Bridges Feb 2014 B2
8670114 Bridges et al. Mar 2014 B2
8681317 Moser et al. Mar 2014 B2
8699756 Jensen Apr 2014 B2
8717545 Sebastian et al. May 2014 B2
8740396 Brown et al. Jun 2014 B2
8772719 Böckem et al. Jul 2014 B2
8773667 Edmonds et al. Jul 2014 B2
8848203 Bridges et al. Sep 2014 B2
8874406 Rotvold et al. Oct 2014 B2
8902408 Bridges Dec 2014 B2
8931183 Jonas Jan 2015 B2
9151830 Bridges Oct 2015 B2
9207309 Bridges Dec 2015 B2
9377885 Bridges et al. Jun 2016 B2
20010045534 Kimura Nov 2001 A1
20020033940 Hedges et al. Mar 2002 A1
20020093646 Muraoka Jul 2002 A1
20020148133 Bridges et al. Oct 2002 A1
20020179866 Hoeller et al. Dec 2002 A1
20030014212 Ralston et al. Jan 2003 A1
20030033041 Richey Feb 2003 A1
20030035195 Blech et al. Feb 2003 A1
20030048459 Gooch Mar 2003 A1
20030090682 Gooch et al. May 2003 A1
20030112449 Tu et al. Jun 2003 A1
20030125901 Steffey et al. Jul 2003 A1
20030133092 Rogers Jul 2003 A1
20030179362 Osawa et al. Sep 2003 A1
20030206285 Lau Nov 2003 A1
20030227616 Bridges Dec 2003 A1
20040035277 Hubbs Feb 2004 A1
20040041996 Abe Mar 2004 A1
20040075823 Lewis et al. Apr 2004 A1
20040100705 Hubbs May 2004 A1
20040170363 Angela Sep 2004 A1
20040189944 Kaufman et al. Sep 2004 A1
20040218104 Smith et al. Nov 2004 A1
20040223139 Vogel Nov 2004 A1
20050058179 Phipps Mar 2005 A1
20050147477 Clark Jul 2005 A1
20050179890 Cramer et al. Aug 2005 A1
20050185182 Raab et al. Aug 2005 A1
20050197145 Chae et al. Sep 2005 A1
20050254043 Chiba Nov 2005 A1
20050284937 Xi et al. Dec 2005 A1
20060009929 Boyette et al. Jan 2006 A1
20060053647 Raab et al. Mar 2006 A1
20060055662 Rimas-Ribikauskas et al. Mar 2006 A1
20060055685 Rimas-Ribikauskas et al. Mar 2006 A1
20060066836 Bridges et al. Mar 2006 A1
20060103853 Palmateer May 2006 A1
20060132803 Clair et al. Jun 2006 A1
20060140473 Brooksby et al. Jun 2006 A1
20060141435 Chiang Jun 2006 A1
20060145703 Steinbichler et al. Jul 2006 A1
20060146009 Syrbe et al. Jul 2006 A1
20060161379 Ellenby et al. Jul 2006 A1
20060164384 Smith et al. Jul 2006 A1
20060164385 Smith et al. Jul 2006 A1
20060164386 Smith et al. Jul 2006 A1
20060222237 Du et al. Oct 2006 A1
20060222314 Zumbrunn et al. Oct 2006 A1
20060235611 Deaton et al. Oct 2006 A1
20060262001 Ouchi et al. Nov 2006 A1
20060279246 Hashimoto et al. Dec 2006 A1
20070016386 Husted Jan 2007 A1
20070019212 Gatsios et al. Jan 2007 A1
20070024842 Nishizawa et al. Feb 2007 A1
20070090309 Hu et al. Apr 2007 A1
20070121095 Lewis May 2007 A1
20070127013 Hertzman et al. Jun 2007 A1
20070130785 Bublitz et al. Jun 2007 A1
20070236452 Venkatesh et al. Oct 2007 A1
20070247615 Bridges et al. Oct 2007 A1
20070285672 Mukai et al. Dec 2007 A1
20080002866 Fujiwara Jan 2008 A1
20080024795 Yamamoto et al. Jan 2008 A1
20080043409 Kallabis Feb 2008 A1
20080107305 Vanderkooy et al. May 2008 A1
20080122786 Pryor et al. May 2008 A1
20080203299 Kozuma et al. Aug 2008 A1
20080229592 Hinderling et al. Sep 2008 A1
20080239281 Bridges Oct 2008 A1
20080246974 Wilson et al. Oct 2008 A1
20080250659 Bellerose et al. Oct 2008 A1
20080279446 Hassebrook et al. Nov 2008 A1
20080297808 Riza et al. Dec 2008 A1
20080302200 Tobey Dec 2008 A1
20080309949 Rueb Dec 2008 A1
20080316497 Taketomi et al. Dec 2008 A1
20080316503 Smarsh et al. Dec 2008 A1
20090000136 Crampton Jan 2009 A1
20090009747 Wolf et al. Jan 2009 A1
20090033621 Quinn et al. Feb 2009 A1
20090046271 Constantikes Feb 2009 A1
20090066932 Bridges et al. Mar 2009 A1
20090109426 Cramer et al. Apr 2009 A1
20090153817 Kawakubo Jun 2009 A1
20090157226 De Smet Jun 2009 A1
20090171618 Kumagai et al. Jul 2009 A1
20090187373 Atwell et al. Jul 2009 A1
20090190125 Foster et al. Jul 2009 A1
20090205088 Crampton et al. Aug 2009 A1
20090213073 Obermeyer et al. Aug 2009 A1
20090239581 Lee Sep 2009 A1
20090240372 Bordyn et al. Sep 2009 A1
20090240461 Makino et al. Sep 2009 A1
20090240462 Lee Sep 2009 A1
20090244277 Nagashima et al. Oct 2009 A1
20090260240 Bernhard Oct 2009 A1
20100008543 Yamada et al. Jan 2010 A1
20100025746 Chapman et al. Feb 2010 A1
20100058252 Ko Mar 2010 A1
20100091112 Veeser et al. Apr 2010 A1
20100103431 Demopoulos Apr 2010 A1
20100128259 Bridges et al. May 2010 A1
20100142798 Weston et al. Jun 2010 A1
20100149518 Nordenfelt et al. Jun 2010 A1
20100149525 Lau Jun 2010 A1
20100158361 Grafinger et al. Jun 2010 A1
20100176270 Lau et al. Jul 2010 A1
20100207938 Yau et al. Aug 2010 A1
20100225746 Shpunt et al. Sep 2010 A1
20100234094 Gagner et al. Sep 2010 A1
20100235786 Maizels et al. Sep 2010 A1
20100245851 Teodorescu Sep 2010 A1
20100250175 Briggs et al. Sep 2010 A1
20100250188 Brown Sep 2010 A1
20100251148 Brown Sep 2010 A1
20100265316 Sali et al. Oct 2010 A1
20100277747 Rueb et al. Nov 2010 A1
20100284082 Shpunt et al. Nov 2010 A1
20100299103 Yoshikawa Nov 2010 A1
20110001958 Bridges et al. Jan 2011 A1
20110003507 Van Swearingen et al. Jan 2011 A1
20110007154 Vogel et al. Jan 2011 A1
20110013281 Mimura et al. Jan 2011 A1
20110023578 Grasser Feb 2011 A1
20110025827 Shpunt et al. Feb 2011 A1
20110032509 Bridges et al. Feb 2011 A1
20110035952 Roithmeier Feb 2011 A1
20110043620 Svanholm et al. Feb 2011 A1
20110043808 Isozaki et al. Feb 2011 A1
20110052006 Gurman et al. Mar 2011 A1
20110069322 Hoffer, Jr. Mar 2011 A1
20110107611 Desforges et al. May 2011 A1
20110107612 Ferrari et al. May 2011 A1
20110107613 Tait May 2011 A1
20110107614 Champ May 2011 A1
20110109502 Sullivan May 2011 A1
20110112786 Desforges et al. May 2011 A1
20110123097 Van Coppenolle et al. May 2011 A1
20110128625 Larsen et al. Jun 2011 A1
20110166824 Haisty et al. Jul 2011 A1
20110169924 Haisty et al. Jul 2011 A1
20110170534 York Jul 2011 A1
20110173827 Bailey et al. Jul 2011 A1
20110175745 Atwell et al. Jul 2011 A1
20110176145 Edmonds et al. Jul 2011 A1
20110179281 Chevallier-Mames et al. Jul 2011 A1
20110181872 Dold et al. Jul 2011 A1
20110260033 Steffensen et al. Oct 2011 A1
20110301902 Panagas et al. Dec 2011 A1
20120050255 Thomas et al. Mar 2012 A1
20120062706 Keshavmurthy et al. Mar 2012 A1
20120065928 Rotvold et al. Mar 2012 A1
20120099117 Hanchett et al. Apr 2012 A1
20120105821 Moser et al. May 2012 A1
20120120391 Dold et al. May 2012 A1
20120120415 Steffensen et al. May 2012 A1
20120124850 Ortleb et al. May 2012 A1
20120154577 Yoshikawa et al. Jun 2012 A1
20120188559 Becker et al. Jul 2012 A1
20120206716 Cramer et al. Aug 2012 A1
20120206808 Brown et al. Aug 2012 A1
20120218563 Spruck et al. Aug 2012 A1
20120236320 Steffey et al. Sep 2012 A1
20120242795 Kane et al. Sep 2012 A1
20120262550 Bridges Oct 2012 A1
20120262573 Bridges et al. Oct 2012 A1
20120262728 Bridges et al. Oct 2012 A1
20120265479 Bridges et al. Oct 2012 A1
20120317826 Jonas Dec 2012 A1
20130037694 Steffensen et al. Feb 2013 A1
20130096873 Rosengaus et al. Apr 2013 A1
20130100282 Siercks et al. Apr 2013 A1
20130128284 Steffey et al. May 2013 A1
20130155386 Bridges et al. Jun 2013 A1
20130162469 Zogg et al. Jun 2013 A1
20130197852 Grau et al. Aug 2013 A1
20130201470 Cramer et al. Aug 2013 A1
20130293684 Becker et al. Nov 2013 A1
20140002806 Buchel et al. Jan 2014 A1
20140028805 Tohme et al. Jan 2014 A1
20140267629 Tohme et al. Sep 2014 A1
20140320643 Markendorf Oct 2014 A1
20140327920 Bridges et al. Nov 2014 A1
20150049329 Bridges et al. Feb 2015 A1
20150331159 Bridges et al. Nov 2015 A1
20150365653 Tohme et al. Dec 2015 A1
20150373321 Bridges Dec 2015 A1
Foreign Referenced Citations (158)
Number Date Country
2811444 Mar 2012 CA
589856 Jul 1977 CH
1263807 Aug 2000 CN
1290850 Apr 2001 CN
1362692 Aug 2002 CN
1474159 Feb 2004 CN
1531659 Sep 2004 CN
1608212 Apr 2005 CN
1926400 Mar 2007 CN
101031817 Sep 2007 CN
101203730 Jun 2008 CN
101297176 Oct 2008 CN
101371160 Feb 2009 CN
101427155 May 2009 CN
101750012 Jun 2010 CN
101776982 Jul 2010 CN
201548192 Aug 2010 CN
7704949 Jun 1977 DE
3530922 Apr 1986 DE
3827458 Feb 1990 DE
10022054 Nov 2001 DE
10160090 Jul 2002 DE
202004004945 Oct 2004 DE
102004024171 Sep 2005 DE
102005019058 Dec 2005 DE
102006013185 Sep 2007 DE
202006020299 May 2008 DE
60319016 Apr 2009 DE
102007058692 Jun 2009 DE
102009040837 Mar 2011 DE
0166106 Jan 1986 EP
0598523 May 1994 EP
0797076 Sep 1997 EP
0919831 Jun 1999 EP
0957336 Nov 1999 EP
1067363 Jan 2001 EP
1519141 Mar 2005 EP
1607767 Dec 2005 EP
1710602 Oct 2006 EP
2136178 Dec 2009 EP
2177868 Apr 2010 EP
2219011 Aug 2010 EP
2259010 Dec 2010 EP
2259013 Dec 2010 EP
2322901 May 2011 EP
2446300 May 2012 EP
1543636 Apr 1979 GB
2503179 Dec 2013 GB
2503390 Dec 2013 GB
2516528 Jan 2015 GB
2518544 Mar 2015 GB
2518769 Apr 2015 GB
2518998 Apr 2015 GB
S57147800 Sep 1982 JP
S6097288 May 1985 JP
2184788 Jul 1990 JP
H0331715 Feb 1991 JP
H0371116 Mar 1991 JP
H0465631 Mar 1992 JP
H05257005 Oct 1993 JP
H05302976 Nov 1993 JP
H6097288 Apr 1994 JP
H06229715 Aug 1994 JP
H0665818 Sep 1994 JP
H06265355 Sep 1994 JP
H074967 Jan 1995 JP
H08145679 Jun 1996 JP
H0914965 Jan 1997 JP
H102722 Jan 1998 JP
H10107357 Apr 1998 JP
H10317874 Dec 1998 JP
11502629 Mar 1999 JP
H11304465 Nov 1999 JP
H11513495 Nov 1999 JP
H11337642 Dec 1999 JP
2000503476 Mar 2000 JP
2000275042 Oct 2000 JP
2000346645 Dec 2000 JP
2001013247 Jan 2001 JP
2001165662 Jun 2001 JP
2001513204 Aug 2001 JP
2001272468 Oct 2001 JP
2001284317 Oct 2001 JP
2001353112 Dec 2001 JP
2002089184 Mar 2002 JP
2002098762 Apr 2002 JP
2002139310 May 2002 JP
2002209361 Jul 2002 JP
2003506691 Feb 2003 JP
2004508954 Mar 2004 JP
2004108939 Apr 2004 JP
2004527751 Sep 2004 JP
3109969 Jun 2005 JP
2005265700 Sep 2005 JP
2006003127 Jan 2006 JP
2006058091 Mar 2006 JP
2006084460 Mar 2006 JP
2006220514 Aug 2006 JP
2006276012 Oct 2006 JP
2006526844 Nov 2006 JP
2007504459 Mar 2007 JP
2007165331 Jun 2007 JP
2007523357 Aug 2007 JP
2007256872 Oct 2007 JP
2008027308 Feb 2008 JP
2008514967 May 2008 JP
2008544215 Dec 2008 JP
2009014639 Jan 2009 JP
2009134761 Jun 2009 JP
2009229350 Oct 2009 JP
2010169633 Aug 2010 JP
2011158371 Aug 2011 JP
2011526706 Oct 2011 JP
2013525787 Oct 2011 JP
H04504468 Oct 2011 JP
2012509464 Apr 2012 JP
2012530909 Dec 2012 JP
5302976 Oct 2013 JP
381361 Feb 2000 TW
9012284 Oct 1990 WO
9534849 Dec 1995 WO
0109642 Feb 2001 WO
0177613 Oct 2001 WO
0223121 Mar 2002 WO
0237466 May 2002 WO
02084327 Oct 2002 WO
03062744 Jul 2003 WO
03073121 Sep 2003 WO
2004063668 Jul 2004 WO
2005026772 Mar 2005 WO
2006039682 Apr 2006 WO
2006052259 May 2006 WO
2006055770 May 2006 WO
2007079601 Jul 2007 WO
2007084209 Jul 2007 WO
2007123604 Nov 2007 WO
2007124010 Nov 2007 WO
2008052348 May 2008 WO
2008119073 Oct 2008 WO
WO2008121919 Oct 2008 WO
2010057169 May 2010 WO
2010100043 Sep 2010 WO
2010107434 Sep 2010 WO
2010141120 Dec 2010 WO
2010148525 Dec 2010 WO
2011035290 Mar 2011 WO
2011057130 May 2011 WO
2011107729 Sep 2011 WO
2011112277 Sep 2011 WO
2012142074 Oct 2012 WO
2010148526 Dec 2012 WO
2014143644 Sep 2014 WO
2014149701 Sep 2014 WO
2014149704 Sep 2014 WO
2014149705 Sep 2014 WO
2014149706 Sep 2014 WO
2014149702 Sep 2015 WO
Non-Patent Literature Citations (59)
Entry
“A New Generation of Total Stations from Leica Geosystems.” K. Zeiske. Leica Geosystems AG, May 1999, 8 pages.
“DLP-Based Structured Light 3D Imaging Technologies and Applications” by J. Geng; Proceedings of SPIE, vol. 7932. Published Feb. 11, 2011, 15 pages.
“Fiber Optic Rotary Joints Product Guide”; Moog Inc; MS1071, rev. 2; p. 1-4; 2010; Retrieved on Nov. 13, 2013 from http://www.moog.com/literature/ICD/Moog-Fiber-Optic-Rotary-Joint—Catalog-en.pdf;.
“Technical Brief: Fiber Optic Rotary Joint”; Document No. 303; Moog Inc; p. 1-6; 2008; Retrieved on Nov. 13, 2013 from http://www.moog.com/literature/MCG/FORJtechbrief.pdf.
2x2 High Speed Lithium Niobate Interferometric Switch; [on-line]; JDS Uniphase Corporation; 2007; Retrieved from www.jdsu.com.
AO Modulator—M040-8J-FxS; [online—technical data sheet]; Gooch & Housego; Nov. 2006; Retrieved from http://www.goochandhousego.com/.
Automated Precision, Inc., Product Specifications, Radian, Featuring INNOVO Technology, info@apisensor.com, Copyright 2011, 2 pages.
Cao, et al.; “VisionWand: Interaction Techniques for Large Displays using a Passive Wand Tracked in 3D”; Proceedings of the 16th Annual ACM Symposium on User Interface Software and Technology, UIST; vol. 5, issue 2; pp. 173-182; Jan. 2003.
Chen, Junewen, “Novel Laser Range Finding Algorithms”, Proceedings of SPIE, vol. 6100, Jan. 1, 2006, pp. 61001Q-61001Q-8, XP55031002, ISSN: 0277-786X, DOI: 10.1117/12.645131.
Parker, et al “Instrument for Setting Radio Telescope Surfaces” (4 pp) XP 55055817A.
Rahman, et al., “Spatial-Geometric Approach to Physical Mobile Interaction Based on Accelerometer and IR Sensory Data Fusion”, ACM Transactions on Multimedia Computing, Communications and Applications, vol. 6, No. 4, Article 28, Publication date: Novembe.
Sladek, J., et al: “The Hybrid Contact-Optical Coordinate Measuring System.” Measurement, vol. 44, No. 3, Mar. 1, 2011, pp. 503-510.
Stone, et al. “Automated Part Tracking on the Construction Job Site” 8 pp; XP 55055816A; National Institute of Standards and Technology.
Turk, et al., “Perceptual Interfaces”, UCSB Technical Report 2003-33, pp. 1-43 [Retrieved Aug. 11, 2011, http://www.cs.ucsb.edu/research/tech—reports/reports/2003-33.pdf] (2003).
Computer Giants Embrace On-Chip Optics; Mar. 27, 2008; [on-line]; Optics.org; [Retrieved on Apr. 2, 2008]; Retrieved from http://optics.org/cws/article/research/33521.
Cuypers, et al “Optical Measurement Techniques for Mobile and Large-Scale Dimensional Metrology” (2009) ; Optics and Lasers in Engineering pp. 292-300; vol. 47; Elsevier Ltd. XP 25914917A.
English Abstract of CN1362692; Applicant: Univ Tianjin; Published Date: Aug. 7, 2002; 1 pg.
English Abstract of JP2005010585; Applicant: TDK Corp; Published Date: Jan. 13, 2005; 1 pg.
English Abstract of JPH06214186; Applicant: Eastman Kodak Co Ltd; Published Date: Aug. 5, 1994; 1 pg.
English Abstract of JPH09113223; Applicant: Fuji Xerox Co Ltd; Published Date: May 2, 1997; 1 pg.
EOSpace—High-Speed Switches; [on-line technical brochure]; [Retrieved May 18, 2009]; Retrieved from http://www.cospace.com/Switches.htm.
FARO Laser Tracker ION; 2 pages; revised Apr. 23, 2010; FARO Technologies, Inc., www.lasertracker.faro.com.
FARO Technical Institute, Basic Measurement Training Workbook, Version 1.0, FARO Laser Tracker, Jan. 2008, Student's Book, FARO CAM2 Measure.
Hanwei Xiong et al: “The Development of Optical Fringe Measurement System integrated with a CMM for Products Inspection.” Proceedings of SPIE, vol. 7855, Nov. 3, 2010, pp. 78551W-7855W-8, XP055118356. ISSN: 0277-786X.
Hecht, Photonic Frontiers: Gesture Recognition: Lasers Bring Gesture Recognition to the Home, Laser Focus World, pp. 1-5, Retrieved on Mar. 3, 2011: http://www.optoiq.com/optoiq-2/en-us/index/photonics-technologies-applications/Ifw-display/Ifw-display/Ifw-arti.
Hui, Elliot E., et al, “Single-Step Assembly of Complex 3-D Microstructures”, Jan. 23, 2000, IEEE; pp. 602-607.
Integrated Optical Amplitude Modulator; [on-line technical data sheet]; [Retrieved Oct. 14, 2010]; Jenoptik; Retrieved from http://www.jenoptik.com/cms/products.nsf/0/A6DF20B50AEE7819C12576FE0074E8E6/$File/amplitudemodulators—en.pdf?Open.
Tracker3; Ultra-Portable Laser Tracking System; 4 pages; 2010 Automated Precision Inc.; www.apisensor.com.
Katowski “Optical 3-D Measurement Techniques - Applications in inspection, quality control and robotics” Vienna, Austria, Sep. 18-20, 1989.
Kester, Walt, Practical Analog Design Techniques, Analog Devices, Section 5, Undersampling Applications, Copyright 1995, pp. 5-1 to 5-34.
Kollorz et al., “Gesture recognition with a time-of-flight camera”, Int. J. of Intelligent Sys. Tech. and Applications, vol. 5, No. 3/4, pp. 334-343, Retrieved Aug. 11, 2011; http://www5.informatik.uni-erlangen.de/Forschung/Publikationen/2008/Kollorz08-GRW.pdf, 2008.
LaserTRACER-measuring sub-micron in space; http://www.etalon-ag.com/index.php/en/products/lasertracer; 4 pages; Jun. 28, 2011; ETALON AG.
Leica Absolute Tracker AT401-ASME B89.4.19-2006 Specifications; Hexagon Metrology; Leica Geosystems Metrology Products, Switzerland; 2 pages; www.leica-geosystems.com/metrology.
Leica Geosystems AG ED—“Leica Laser Tracker System”, Internet Citation, Jun. 28, 2012, XP002678836, Retrieved from the Internet: URL:http://www.a-solution.com.au/pages/downloads/LTD500—Brochure—EN.pdf.
Leica Geosystems Metrology, “Leica Absolute Tracker AT401, White Paper,” Hexagon AB; 2010.
Leica Geosystems: “TPS1100 Professional Series”, 1999, Retrieved from the Internet: URL:http://www.estig.ibeja.pt/-legvm/top—civil/TPS1100%20-%20A%20New%20Generation%20of%20Total%20Stations.pdf, [Retrieved on Jul. 9, 2012 ] the whole document.
Li, et al., “Real Time Hand Gesture Recognition using a Range Camera”, Australasian Conference on Robotics and Automation (ACRA), [Retrieved Aug. 10, 2011, http://www.araa.asn.au/acra/acra2009/papers/pap128s1.pdf] pp. 1-7 (2009).
Lightvision—High Speed Variable Optical Attenuators (VOA); [on-line]; A publication of Lightwaves 2020, Feb. 1, 2008; Retrieved from http://www.lightwaves2020.com/home/.
Maekynen, A. J. et al., Tracking Laser Radar for 3-D Shape Measurements of Large Industrial Objects Based on Time-of-Flight Laser Rangefinding and Position-Sensitive Detection Techniques, IEEE Transactions on Instrumentation and Measurement, vol. 43, No.
Making the Big Step from Electronics to Photonics by Modulating a Beam of Light with Electricity; May 18, 2005; [on-line]; [Retrieved May 7, 2009]; Cornell University News Service; Retrieved from http://www.news.cornell.edu/stories/May05/LipsonElectroOptica.
Matsumaru, K, “Mobile Robot with Preliminary-Announcement and Display Function of Forthcoming Motion Using Projection Equipment,” Robot and Human Interactive Communication, 2006. RO-MAN06. The 15th IEEE International Symposium, pp. 443-450, Sep. 6-8.
MEMS Variable Optical Attenuators Single/Multi-Channel; [on-line]; Jan. 17, 2005; Retrieved from www.ozoptics.com.
Nanona High Speed & Low Loss Optical Switch; [on-line technical data sheet]; [Retrieved Oct. 14, 2010]; Retrieved from http://www.bostonati.com/products/PI-FOS.pdf.
New River Kinematics, SA ARM—“The Ultimate Measurement Software for Arms, Software Release!”, SA Sep. 30, 2010 [On-line], http://www.kinematics.com/news/software-release-sa20100930.html (1 of 14), [Retrieved Apr. 13, 2011 11:40:47 AM].
Optical Circulator (3-Ports & 4-Ports); [on-line technical data sheet]; Alliance Fiber Optic Products, Inc. REV.D Jan. 15, 2004; Retrieved from www.afop.com.
Optical Circulators Improve Bidirectional Fiber Systems; By Jay S. Van Delden; [online]; [Retrieved May 18, 2009]; Laser Focus World; Retrieved from http://www.laserfocusworld.com/display—article/28411/12/nonc/nonc/News/Optical-circulators-improve-bidirecti.
Ou-Yang, Mang, et al., “High-Dynamic-Range Laser Range Finders Based on a Novel Multimodulated Frequency Method”, Optical Engineering, vol. 45, No. 12, Jan. 1, 2006, p. 123603, XP55031001, ISSN: 0091-3286, DOI: 10.1117/1.2402517.
PCMM System Specifications Leica Absolute Tracker and Leica T-Products; Hexagon Metrology; Leica Geosystems Metrology Products, Switzerland; 8 pages; www.leica-geosystems.com/metrology.
Poujouly, Stephane, et al., “A Twofold Modulation Frequency Laser Range Finder; A Twofold Modulation Frequency Laser Range Finder”, Journal of Optics. A, Pure and Applied Optics, Institute of Physics Publishing, Bristol, GB, vol. 4, No. 6, Nov. 1, 2.
Poujouly, Stephane, et al., Digital Laser Range Finder: Phase-Shift Estimation by Undersampling Technique; IEEE, Copyright 1999.
RS Series Remote Controlled Optical Switch; [on-line technical data sheet]; Sercalo Microtechnology, Ltd. [Retrieved Oct. 14, 2010]; Retrieved from http://www.sercalo.com/document/PDFs/DataSheets/RS%20datasheet.pdf.
Super-Nyquist Operation of the AD9912 Yields a High RF Output Signal; Analog Devices, Inc., AN-939 Application Note; www.analog.com; Copyright 2007.
Burge, James H., et al, Use of a commercial laser tracker for optical alignment, Proc. of SPIE vol. 6676, Sep. 21, 2007, pp. 66760E-1-66760E-12.
Chen, Jihua, et al, Research on the Principle of 5/6-DOF Laser Tracking Metrology, Journal of Astronautic Metrology and Measurement vol. 27, No. 3, May 31, 2007, pp. 58-62.
Newport Company “Fiber Optic Scribes” https://web.archive.org/web/20120903063012/http://www.newport.com/Fiber-Optic-Scribes/835171/1033/info.aspx; 2012, 2 pages.
Newport Corporation “Projects in Fiber Optics: Applications Handbook”, 1986; 3 pages.
Takeuchi et al., “Ultraprecision 3D Micromachining of Glass”; Annals of the CIRP; Jan. 4, 1996; vol. 45; 401-404 pages.
Thorlabs “Ruby Dualscribe Fiber Optic Scribe” a Mechanical Drawing, 2014, 1 page.
TOPCON: “IS-3 Imaging Station”, www.topconposition.com; 2011; 1-4 pages.
Related Publications (1)
Number Date Country
20140028805 A1 Jan 2014 US
Provisional Applications (2)
Number Date Country
61475703 Apr 2011 US
61592049 Jan 2012 US
Continuation in Parts (1)
Number Date Country
Parent 13443946 Apr 2012 US
Child 14044311 US