Three-dimensional coordinate scanner and method of operation

Information

  • Patent Grant
  • Patent Number: 10,267,619
  • Date Filed: Thursday, August 25, 2016
  • Date Issued: Tuesday, April 23, 2019
Abstract
A noncontact optical three-dimensional measuring device that includes a first projector, a first camera, and a second camera; a processor electrically coupled to the first projector, the first camera, and the second camera; and computer readable media storing instructions which, when executed by the processor, cause a first signal to be collected at a first time and a second signal to be collected at a second time different than the first time, determine three-dimensional coordinates of a first point on the surface based at least in part on the first signal and the first distance, and determine three-dimensional coordinates of a second point on the surface based at least in part on the second signal.
Description
BACKGROUND OF THE INVENTION

The subject matter disclosed herein relates to a three-dimensional coordinate scanner and in particular to a triangulation-type scanner having multiple modalities of data acquisition.


The acquisition of three-dimensional coordinates of an object or an environment is known. Various techniques may be used, such as time-of-flight or triangulation methods for example. A time-of-flight system such as a laser tracker, total station, or time-of-flight scanner may direct a beam of light such as a laser beam toward a retroreflector target or a spot on the surface of the object. An absolute distance meter is used to determine the distance to the target or spot based on the length of time it takes the light to travel to the target or spot and return. By moving the laser beam or the target over the surface of the object, the coordinates of the object may be ascertained. Time-of-flight systems have advantages in having relatively high accuracy, but in some cases may be slower than some other systems since time-of-flight systems must usually measure each point on the surface individually.


In contrast, a scanner that uses triangulation to measure three-dimensional coordinates projects either a pattern of light in a line (e.g. a laser line from a laser line probe) or a pattern of light covering an area (e.g. structured light) onto the surface. A camera is coupled to the projector in a fixed relationship, by attaching the camera and the projector to a common frame for example. The light emitted from the projector is reflected off of the surface and detected by the camera. Since the camera and projector are arranged in a fixed relationship, the distance to the object may be determined using trigonometric principles. Compared to coordinate measurement devices that use tactile probes, triangulation systems provide advantages in quickly acquiring coordinate data over a large area. As used herein, the resulting collection of three-dimensional coordinate values provided by the triangulation system is referred to as point cloud data or simply a point cloud.
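
By way of illustration only, the following sketch makes the trigonometric principle concrete: given the baseline length between projector and camera and the two ray angles measured from that baseline, the depth of a surface point follows from the tangent relationships. The function name and the numerical values are illustrative assumptions, not parameters from the patent.

```python
import math

def triangulate_depth(baseline_m, proj_angle_rad, cam_angle_rad):
    """Depth of a surface point from a projector/camera pair.

    The projector and camera lie on a common baseline of length
    baseline_m; each angle is measured from the baseline to the ray
    through the surface point.
    """
    ta = math.tan(proj_angle_rad)
    tb = math.tan(cam_angle_rad)
    return baseline_m * ta * tb / (ta + tb)

# Illustrative values only: 100 mm baseline, rays at 75 and 80 degrees.
z = triangulate_depth(0.100, math.radians(75), math.radians(80))
print(f"depth = {z * 1000:.1f} mm")  # ~225 mm
```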


A number of issues may interfere with the acquisition of high accuracy point cloud data when using a laser scanner. These include, but are not limited to: variations in the level of light received over the camera image plane as a result of variations in the reflectance of the object surface or variations in the angle of incidence of the surface relative to the projected light; low resolution near edges, such as the edges of holes; and multipath interference for example. In some cases, the operator may be unaware of or unable to eliminate a problem. In these cases, missing or faulty point cloud data is the result.


Accordingly, while existing scanners are suitable for their intended purpose, the need for improvement remains, particularly in providing a scanner that can adapt to undesirable conditions and provide improved data point acquisition.


BRIEF DESCRIPTION OF THE INVENTION

According to one aspect of the invention, a noncontact optical three-dimensional measuring device is provided. A noncontact optical three-dimensional measuring device comprising: a movable assembly that includes a first projector, a first camera, and a second camera, wherein the first projector, the first camera, and the second camera are fixed in relation to one another, there being a first distance between the first projector and the first camera and a second distance between the first projector and the second camera, the first projector having a first light source, the first projector configured to emit onto a surface of an object a first light having a first pattern, the first camera having a first lens and a first photosensitive array, the first camera configured to receive a first portion of the first light reflected off the surface and to produce a first signal in response, the first camera having a first field of view, the first field of view being a first angular viewing region of the first camera, the second camera having a second lens and a second photosensitive array, the second camera configured to receive a second portion of the first light reflected off the surface and to produce a second signal in response, the second camera having a second field of view, the second field of view being a second angular viewing region of the second camera, the second field of view being different than the first field of view; a position sensing device operably coupled to the movable assembly, the position sensing device configured to determine the position of the movable assembly relative to the object; and a processor, electrically coupled to the first projector, the first camera, the second camera, and the position sensing device, that executes computer executable program code that, when executed by the processor, performs operations that include causing the first signal to be collected at a first time and a first position, and the second signal to be collected at a second time and a second position, the second time being different than the first time, determining three-dimensional coordinates of a first point on the surface based at least in part on the first signal and the first distance, and determining three-dimensional coordinates of a second point on the surface based at least in part on the second signal.


According to one aspect of the invention, a method of determining three-dimensional coordinates on a surface of an object is provided. A method of determining three-dimensional coordinates on a surface of an object, the method comprising: providing a movable assembly that includes a first projector, a first camera, and a second camera, wherein the first projector, the first camera, and the second camera are fixed in relation to one another, there being a first distance between the first projector and the first camera and a second distance between the first projector and the second camera, the first projector having a first light source, the first projector configured to emit onto the surface a first light having a first pattern, the first camera having a first lens and a first photosensitive array, the first camera configured to receive a first portion of the first light reflected off the surface, the first camera having a first field of view, the first field of view being a first angular viewing region of the first camera, the second camera having a second lens and a second photosensitive array, the second camera configured to receive a second portion of the first light reflected off the surface, the second camera having a second field of view, the second field of view being a second angular viewing region of the second camera, the second field of view being different than the first field of view; providing a processor electrically coupled to the first projector, the first camera and the second camera; emitting from the first projector onto the surface, in a first instance, the first light having the first pattern; acquiring in the first instance a first image of the surface with the first camera and sending a first signal to the processor in response; determining a first position of the assembly relative to the object; determining a first set of three-dimensional coordinates of first points on the surface, the first set based at least in part on the first pattern, the first signal, the first distance and the first position; carrying out a diagnostic procedure that determines a quality factor for the first set; moving the assembly from the first position to a second position based at least in part on results of the diagnostic procedure; emitting from the first projector onto the surface, in a second instance, the first light having the first pattern; acquiring in the second instance a second image of the surface with the second camera and sending a second signal to the processor in response; and determining a second set of three-dimensional coordinates of second points on the surface, the second set based at least in part on the second signal, and the second position.


These and other advantages and features will become more apparent from the following description taken in conjunction with the drawings.





BRIEF DESCRIPTION OF THE DRAWINGS

The subject matter, which is regarded as the invention, is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:



FIG. 1 is a top schematic view of a scanner in accordance with an embodiment of the invention;



FIG. 2 is a flow chart showing a method of operating the scanner of FIG. 1;



FIG. 3 is a top schematic view of a scanner in accordance with another embodiment of the invention;



FIG. 4 is a flow chart showing a method of operating the scanner of FIG. 3;



FIG. 5A is a schematic view of elements within a laser scanner according to an embodiment;



FIG. 5B is a flow chart showing a method of operating a scanner according to an embodiment;



FIG. 6 is a top schematic view of a scanner in accordance with another embodiment of the invention;



FIG. 7 is a flow chart showing a method of operating the scanner according to an embodiment;



FIGS. 8A and 8B are perspective views of a scanner used in conjunction with a remote probe device in accordance with an embodiment of the invention;



FIG. 9 is a flow chart showing a method of operating the scanner of FIGS. 8A and 8B;



FIG. 10 is a top schematic view of a scanner according to an embodiment;



FIG. 11 is a flow chart showing a method of operating the scanner of FIG. 10; and



FIG. 12 is a flow chart showing a diagnostic method according to an embodiment.





The detailed description explains embodiments of the invention, together with advantages and features, by way of example with reference to the drawings.


DETAILED DESCRIPTION OF THE INVENTION

Embodiments of the present invention provide advantages in increasing the reliability and accuracy of three-dimensional coordinates of a data point cloud acquired by a scanner. Embodiments of the invention provide advantages in detecting anomalies in acquired data and automatically adjusting the operation of the scanner to acquire the desired results. Embodiments of the invention provide advantages in detecting anomalies in the acquired data and providing an indication to the operator of areas where additional data acquisition is needed. Still further embodiments of the invention provide advantages in detecting anomalies in the acquired data and providing an indication to the operator of areas where additional data may be acquired with a remote probe.


Scanner devices acquire three-dimensional coordinate data of objects. In one embodiment, a scanner 20 shown in FIG. 1 has a housing 22 that includes a first camera 24, a second camera 26 and a projector 28. The projector 28 emits light 30 onto a surface 32 of an object 34. In the exemplary embodiment, the projector 28 uses a visible light source that illuminates a pattern generator. The visible light source may be a laser, a superluminescent diode, an incandescent light, a Xenon lamp, a light emitting diode (LED), or other light emitting device for example. In one embodiment, the pattern generator is a chrome-on-glass slide having a structured light pattern etched thereon. The slide may have a single pattern or multiple patterns that move in and out of position as needed. The slide may be manually or automatically installed in the operating position. In other embodiments, the source pattern may be light reflected off or transmitted by a digital micro-mirror device (DMD) such as a digital light projector (DLP) manufactured by Texas Instruments Corporation, a liquid crystal device (LCD), a liquid crystal on silicon (LCOS) device, or a similar device used in transmission mode rather than reflection mode. The projector 28 may further include a lens system 36 that alters the outgoing light to cover the desired area.


In this embodiment, the projector 28 is configurable to emit a structured light over an area 37. As used herein, the term “structured light” refers to a two-dimensional pattern of light projected onto an area of an object that conveys information which may be used to determine coordinates of points on the object. In one embodiment, a structured light pattern will contain at least three non-collinear pattern elements disposed within the area. Each of the three non-collinear pattern elements conveys information which may be used to determine the point coordinates. In another embodiment, a projector is provided that is configurable to project both an area pattern as well as a line pattern. In one embodiment, the projector uses a digital micromirror device (DMD), which is configured to switch back and forth between the two patterns. In one embodiment, the DMD projector may also sweep a line or sweep a point in a raster pattern.
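
As a rough illustration of a projector that can switch frame by frame between an area pattern and a line pattern, as a DMD-based projector could, the sketch below builds both as binary frame masks. The resolutions, stripe period, and function names are assumptions for illustration only.

```python
import numpy as np

def area_pattern(rows=768, cols=1024, period=16):
    """Binary stripe pattern covering the full projected area."""
    stripes = ((np.arange(cols) // (period // 2)) % 2).astype(np.uint8)
    return np.tile(stripes, (rows, 1))

def line_pattern(rows=768, cols=1024, col=512, width=2):
    """A single line; stepping col across the frame sweeps the line."""
    frame = np.zeros((rows, cols), dtype=np.uint8)
    frame[:, col:col + width] = 1
    return frame

# Frame-by-frame switching between the two modes, as a DMD permits.
frames = [area_pattern()] + [line_pattern(col=c) for c in range(0, 1024, 8)]
```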


In general, there are two types of structured light patterns, a coded light pattern and an uncoded light pattern. As used herein, a coded light pattern is one in which the three-dimensional coordinates of an illuminated surface of the object are found by acquiring a single image. With a coded light pattern, it is possible to obtain and register point cloud data while the projecting device is moving relative to the object. One type of coded light pattern contains a set of elements (e.g. geometric shapes) arranged in lines where at least three of the elements are non-collinear. Such pattern elements are recognizable because of their arrangement.


In contrast, an uncoded structured light pattern as used herein is a pattern that does not allow measurement through a single pattern. A series of uncoded light patterns may be projected and imaged sequentially. For this case, it is usually necessary to hold the projector fixed relative to the object.


It should be appreciated that the scanner 20 may use either coded or uncoded structured light patterns. The structured light pattern may include the patterns disclosed in the journal article “DLP-Based Structured Light 3D Imaging Technologies and Applications” by Jason Geng published in the Proceedings of SPIE, Vol. 7932, which is incorporated herein by reference. In addition, in some embodiments described herein below, the projector 28 transmits a pattern formed from a swept line of light or a swept point of light. Swept lines and points of light provide advantages over areas of light in identifying some types of anomalies such as multipath interference. Sweeping the line automatically while the scanner is held stationary also has advantages in providing a more uniform sampling of surface points.


The first camera 24 includes a photosensitive sensor 44 which generates a digital image/representation of the area 48 within the sensor's field of view. The sensor may be a charge-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor having an array of pixels, for example. The first camera 24 may further include other components, such as but not limited to lens 46 and other optical devices for example. The lens 46 has an associated first focal length. The sensor 44 and lens 46 cooperate to define a first field of view “X”. In the exemplary embodiment, the first field of view “X” is 16 degrees (0.28 inch per inch).


Similarly, the second camera 26 includes a photosensitive sensor 38 which generates a digital image/representation of the area 40 within the sensor's field of view. The sensor may be a charge-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor having an array of pixels, for example. The second camera 26 may further include other components, such as but not limited to lens 42 and other optical devices for example. The lens 42 has an associated second focal length, the second focal length being different than the first focal length. The sensor 38 and lens 42 cooperate to define a second field of view “Y”. In the exemplary embodiment, the second field of view “Y” is 50 degrees (0.85 inch per inch). The second field of view Y is larger than the first field of view X. Similarly, the area 40 is larger than the area 48. It should be appreciated that a larger field of view allows a given region of the object surface 32 to be measured faster; however, if the photosensitive arrays 44 and 38 have the same number of pixels, a smaller field of view will provide higher resolution.
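
The field-of-view tradeoff can be made concrete with a little arithmetic: at a given standoff distance, the surface footprint of one pixel grows with the tangent of half the field of view. The standoff distance and pixel count below are assumed values for illustration, not the patent's actual parameters.

```python
import math

def pixel_footprint_mm(fov_deg, standoff_m, pixels_per_row):
    """Approximate lateral size of one pixel's footprint on the object."""
    width_m = 2.0 * standoff_m * math.tan(math.radians(fov_deg) / 2.0)
    return width_m / pixels_per_row * 1000.0

for fov in (16, 50):  # narrow field of view "X" vs. wide field of view "Y"
    print(fov, round(pixel_footprint_mm(fov, 0.5, 1000), 2), "mm per pixel")
# roughly 0.14 mm per pixel at 16 degrees vs. 0.47 mm at 50 degrees
```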


In the exemplary embodiment, the projector 28 and the first camera 24 are arranged in a fixed relationship at an angle such that the sensor 44 may receive light reflected from the surface of the object 34. Similarly, the projector 28 and the second camera 26 are arranged in a fixed relationship at an angle such that the sensor 38 may receive light reflected from the surface 32 of object 34. Since the projector 28, first camera 24 and second camera 26 have fixed geometric relationships, the distance and the coordinates of points on the surface may be determined by their trigonometric relationships. Although the fields-of-view (FOVs) of the cameras 24 and 26 are shown not to overlap in FIG. 1, the FOVs may partially overlap or totally overlap.


The projector 28 and cameras 24, 26 are electrically coupled to a controller 50 disposed within the housing 22. The controller 50 may include one or more microprocessors, digital signal processors, memory and signal conditioning circuits. The scanner 20 may further include actuators (not shown) which may be manually activated by the operator to initiate operation and data capture by the scanner 20. In one embodiment, the image processing to determine the X, Y, Z coordinate data of the point cloud representing the surface 32 of object 34 is performed by the controller 50. The coordinate data may be stored locally such as in a volatile or nonvolatile memory 54 for example. The memory may be removable, such as a flash drive or a memory card for example. In other embodiments, the scanner 20 has a communications circuit 52 that allows the scanner 20 to transmit the coordinate data to a remote processing system 56. The communications medium 58 between the scanner 20 and the remote processing system 56 may be wired (e.g. Ethernet) or wireless (e.g. Bluetooth, IEEE 802.11). In one embodiment, the coordinate data is determined by the remote processing system 56 based on acquired images transmitted by the scanner 20 over the communications medium 58.


A relative motion is possible between the object surface 32 and the scanner 20, as indicated by the bidirectional arrow 47. There are several ways in which such relative motion may be provided. In an embodiment, the scanner is a handheld scanner and the object 34 is fixed. Relative motion is provided by moving the scanner over the object surface. In another embodiment, the scanner is attached to a robotic end effector. Relative motion is provided by the robot as it moves the scanner over the object surface. In another embodiment, either the scanner 20 or the object 34 is attached to a moving mechanical mechanism, for example, a gantry coordinate measurement machine or an articulated arm CMM. Relative motion is provided by the moving mechanical mechanism as it moves the scanner 20 over the object surface. In some embodiments, motion is provided by the action of an operator and in other embodiments, motion is provided by a mechanism that is under computer control.


Referring now to FIG. 2, the operation of the scanner 20 according to a method 1260 is described. As shown in block 1262, the projector 28 first emits a structured light pattern onto the area 37 of surface 32 of the object 34. The light 30 from projector 28 is reflected from the surface 32 as reflected light 62 received by the second camera 26. The three-dimensional profile of the surface 32 affects the image of the pattern captured by the photosensitive array 38 within the second camera 26. Using information collected from one or more images of the pattern or patterns, the controller 50 or the remote processing system 56 determines a one-to-one correspondence between the pixels of the photosensitive array 38 and the pattern of light emitted by the projector 28. Using this one-to-one correspondence, triangulation principles are used to determine the three-dimensional coordinates of points on the surface 32. This acquisition of three-dimensional coordinate data (point cloud data) is shown in block 1264. By moving the scanner 20 over the surface 32, a point cloud may be created of the entire object 34.
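
A minimal sketch of this step, assuming the correspondence determination has already paired each camera pixel with the pattern element it observed; the two angle functions are hypothetical stand-ins for the calibration that maps a pixel or pattern index to a ray angle from the baseline.

```python
import math

def triangulate(baseline_m, ang_proj_rad, ang_cam_rad):
    """(x, z) of a surface point; both angles are measured from the
    baseline toward the point, at the projector and camera respectively."""
    ta, tb = math.tan(ang_proj_rad), math.tan(ang_cam_rad)
    z = baseline_m * ta * tb / (ta + tb)
    return (z / ta, z)

def build_cloud(matches, baseline_m, angle_of_element, angle_of_pixel):
    """Turn pixel-to-pattern-element correspondences into surface points.

    matches pairs each camera pixel with the pattern element it observed;
    angle_of_element and angle_of_pixel stand in for the calibration that
    maps an index to a ray angle from the baseline.
    """
    return [triangulate(baseline_m, angle_of_element(e), angle_of_pixel(p))
            for p, e in matches]
```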


During the scanning process, the controller 50 or remote processing system 56 may detect an undesirable condition or problem in the point cloud data, as shown in block 1266. Methods for detecting such a problem are discussed hereinbelow with regard to FIG. 12. The detected problem may be an error in or absence of point cloud data in a particular area for example. This error in or absence of data may be caused by too little or too much light reflected from that area. Too little or too much reflected light may result from a difference in reflectance over the object surface, for example, as a result of high or variable angles of incidence of the light 30 on the object surface 32 or as a result of low reflectance (black or transparent) materials or shiny surfaces. Certain points on the object may be angled in such a way as to produce a very bright specular reflectance known as a glint.


Another possible reason for an error in or absence of point cloud data is a lack of resolution in regions having fine features, sharp edges, or rapid changes in depth. Such lack of resolution may be the result of a hole, for example.


Another possible reason for an error in or an absence of point cloud data is multipath interference. Ordinarily a ray of light from the projector 28 strikes a point on the surface 32 and is scattered over a range of angles. The scattered light is imaged by the lens 42 of camera 26 onto a small spot on the photosensitive array 38. Similarly, the scattered light may be imaged by the lens 46 of camera 24 onto a small spot on the photosensitive array 44. Multipath interference occurs when the light reaching the point on the surface 32 does not come only from the ray of light from the projector 28 but also from secondary light that is reflected off another portion of the surface 32. Such secondary light may compromise the pattern of light received by the photosensitive array 38, 44, thereby preventing accurate determination of three-dimensional coordinates of the point. Methods for identifying the presence of multipath interference are described in the present application with regard to FIG. 12.


If the controller determines in block 1266 that the point cloud is acceptable, the procedure is finished. Otherwise, a determination is made in block 1268 of whether the scanner is used in a manual or automated mode. If the mode is manual, the operator is directed in block 1270 to move the scanner into the desired position.


There are many ways that the movement desired by the operator may be indicated. In an embodiment, indicator lights on the scanner body indicate the desired direction of movement. In another embodiment, a light is projected onto the surface indicating the direction over which the operator is to move. In addition, a color of the projected light may indicate whether the scanner is too close or too far from the object. In another embodiment, an indication is made on a display of the region to which the operator is to project the light. Such a display may be a graphical representation of point cloud data, a CAD model, or a combination of the two. The display may be presented on a computer monitor or on a display built into the scanning device.


In any of these embodiments, a method of determining the approximate position of the scanner is desired. In one case, the scanner may be attached to an articulated arm CMM that uses angular encoders in its joints to determine the position and orientation of the scanner attached to its end. In another case, the scanner includes inertial sensors placed within the device. Inertial sensors may include gyroscopes, accelerometers, and magnetometers, for example. Another method of determining the approximate position of the scanner is to illuminate photogrammetric dots placed on or around the object as marker points. In this way, the wide FOV camera in the scanner can determine the approximate position of the scanner in relation to the object.


In another embodiment, a CAD model on a computer screen indicates the regions where additional measurements are desired, and the operator moves the scanner accordingly by matching the features on the object to the features shown on the screen. By updating the CAD model on the screen as a scan is taken, the operator may be given rapid feedback as to whether the desired regions of the part have been measured.


After the operator has moved the scanner into position, a measurement is made in block 1272 with the small FOV camera 24. By viewing a relatively smaller region in block 1272, the resolution of the resulting three-dimensional coordinates is improved and better capability is provided to characterize features such as holes and edges.


Because the narrow FOV camera views a relatively smaller region than the wide FOV camera, the projector 28 may illuminate a relatively smaller region. This has advantages in eliminating multipath interference since there are relatively fewer illuminated points on the object that can reflect light back onto the object. Having a smaller illuminated region may also make it easier to control exposure to obtain the optimum amount of light for a given reflectance and angle of incidence of the object under test. In block 1274, if all points have been collected, the procedure ends at block 1276; otherwise it continues.


In an embodiment where the mode from block 1268 is automated, then in block 1278 the automated mechanism moves the scanner into the desired position. In some embodiments, the automated mechanism will have sensors to provide information about the relative position of the scanner and object under test. For an embodiment in which the automated mechanism is a robot, angular transducers within the robot joints provide information about the position and orientation of the robot end effector used to hold the scanner. For an embodiment in which the object is moved by another type of automated mechanism, linear encoders or a variety of other sensors may provide information on the relative position of the object and the scanner.


After the automated mechanism has moved the scanner or object into position, then in block 1280 three-dimensional measurements are made with the small FOV camera. Such measurements are repeated by means of block 1282 until all measurements are completed and the procedure finishes at block 1284.


In one embodiment, the projector 28 changes the structured light pattern when the scanner switches from acquiring data with the second camera 26 to the first camera 24. In another embodiment, the same structured light pattern is used with both cameras 24, 26. In still another embodiment, the projector 28 emits a pattern formed by a swept line or point when the data is acquired by the first camera 24. After acquiring data with the first camera 24, the process continues scanning using the second camera 26. This process continues until the operator has scanned the desired area of the part.


It should be appreciated that while the process of FIG. 2 is shown as a linear or sequential process, in other embodiments one or more of the steps shown may be executed in parallel. The method shown in FIG. 2 involves measuring the entire object first and then carrying out further detailed measurements according to an assessment of the acquired point cloud data. An alternative using the scanner 20 is to begin by measuring detailed or critical regions using the camera 24 having the small FOV.


It should also be appreciated that it is common practice in existing scanning systems to provide a way of changing the camera lens or projector lens as a way of changing the FOV of the camera or of the projector in the scanning system. However, such changes are time consuming and typically require an additional compensation step in which an artifact such as a dot plate is placed in front of the camera or projector to determine the aberration correction parameters for the camera or projector system. Hence a scanning system that provides two cameras having different FOVs, such as the cameras 24, 26 of FIG. 1, provides a significant advantage in measurement speed and in enablement of the scanner for a fully automated mode.


Another embodiment is shown in FIG. 3 of a scanner 20 having a housing 22 that includes a first coordinate acquisition system 76 and a second coordinate acquisition system 78. The first coordinate acquisition system 76 includes a first projector 80 and a first camera 82. Similar to the embodiment of FIG. 1, the projector 80 emits light 84 onto a surface 32 of an object 34. In the exemplary embodiment, the projector 80 uses a visible light source that illuminates a pattern generator. The visible light source may be a laser, a superluminescent diode, an incandescent light, a light emitting diode (LED), or other light emitting device. In one embodiment, the pattern generator is a chrome-on-glass slide having a structured light pattern etched thereon. The slide may have a single pattern or multiple patterns that move in and out of position as needed. The slide may be manually or automatically installed in the operating position. In other embodiments, the source pattern may be light reflected off or transmitted by a digital micro-mirror device (DMD) such as a digital light projector (DLP) manufactured by Texas Instruments Corporation, a liquid crystal device (LCD), a liquid crystal on silicon (LCOS) device, or a similar device used in transmission mode rather than reflection mode. The projector 80 may further include a lens system 86 that alters the outgoing light to have the desired focal characteristics.


The first camera 82 includes a photosensitive array sensor 88 which generates a digital image/representation of the area 90 within the sensor's field of view. The sensor may be a charge-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor having an array of pixels, for example. The first camera 82 may further include other components, such as but not limited to lens 92 and other optical devices for example. The first projector 80 and first camera 82 are arranged at an angle in a fixed relationship such that the first camera 82 may detect light 85 from the first projector 80 reflected off of the surface 32 of object 34. It should be appreciated that since the first camera 82 and first projector 80 are arranged in a fixed relationship, the trigonometric principles discussed above may be used to determine coordinates of points on the surface 32 within the area 90. Although for clarity FIG. 3 is depicted as having the first camera 82 near to the first projector 80, it should be appreciated that the camera could instead be placed near the other side of the housing 22. By spacing the first camera 82 and first projector 80 farther apart, accuracy of 3D measurement is expected to improve.


The second coordinate acquisition system 78 includes a second projector 94 and a second camera 96. The projector 94 has a light source that may comprise a laser, a light emitting diode (LED), a superluminescent diode (SLED), a Xenon bulb, or some other suitable type of light source. In an embodiment, a lens 98 is used to focus the light received from the laser light source into a line of light 100 and may comprise one or more cylindrical lenses, or lenses of a variety of other shapes. The lens is also referred to herein as a “lens system” because it may include one or more individual lenses or a collection of lenses. The line of light is substantially straight, i.e., the maximum deviation from a line will be less than about 1% of its length. One type of lens that may be utilized by an embodiment is a rod lens. Rod lenses are typically in the shape of a full cylinder made of glass or plastic polished on the circumference and ground on both ends. Such lenses convert collimated light passing through the diameter of the rod into a line. Another type of lens that may be used is a cylindrical lens. A cylindrical lens is a lens that has the shape of a partial cylinder. For example, one surface of a cylindrical lens may be flat, while the opposing surface is cylindrical in form.


In another embodiment, the projector 94 generates a two-dimensional pattern of light that covers an area of the surface 32. The resulting coordinate acquisition system 78 is then referred to as a structured light scanner.


The second camera 96 includes a sensor 102 such as a charge-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor for example. The second camera 96 may further include other components, such as but not limited to lens 104 and other optical devices for example. The second projector 94 and second camera 96 are arranged at an angle such that the second camera 96 may detect light 106 from the second projector 94 reflected off of the object 34. It should be appreciated that since the second projector 94 and the second camera 96 are arranged in a fixed relationship, the trigonometric principles discussed above may be used to determine coordinates of points on the surface 32 on the line formed by light 100. It should also be appreciated that the camera 96 and the projector 94 may be located on opposite sides of the housing 22 to increase 3D measurement accuracy.


In another embodiment, the second coordinate acquisition system is configured to project a variety of patterns, which may include not only a fixed line of light but also a swept line of light, a swept point of light, a coded pattern of light (covering an area), or a sequential pattern of light (covering an area). Each type of projection pattern has different advantages such as speed, accuracy, and immunity to multipath interference. By evaluating the performance requirements for each particular measurement and/or by reviewing the characteristics of the returned data or of the anticipated object shape (from CAD models or from a 3D reconstruction based on collected scan data), it is possible to select the type of projected pattern that optimizes performance.
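
Selection logic of this kind might be sketched as a small rule table. The rules and pattern names below are illustrative assumptions chosen to reflect the tradeoffs just described, not the patent's actual decision procedure.

```python
def select_pattern(moving, multipath_suspected, fine_features):
    """Illustrative rule-of-thumb pattern selection (assumed rules)."""
    if multipath_suspected and fine_features:
        return "swept point"       # slowest, most immune to reflections
    if multipath_suspected:
        return "swept line"        # line direction can avoid reflections
    if moving:
        return "coded area"        # single shot, tolerates relative motion
    return "sequential area"       # multiple shots, highest accuracy
```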


In another embodiment, the distance from the second coordinate acquisition system 78 to the object surface 32 is different than the distance from the first coordinate acquisition system 76 to the object surface 32. For example, the camera 96 may be positioned closer to the object 34 than the camera 82. In this way, the resolution and accuracy of the second coordinate acquisition system 78 can be improved relative to that of the first coordinate acquisition system 76. In many cases, it is helpful to quickly scan a relatively large and smooth object with a lower resolution system 76 and then scan details including edges and holes with a higher resolution system 78.


A scanner 20 may be used in a manual mode or in an automated mode. In a manual mode, an operator is prompted to move the scanner nearer or farther from the object surface according to the acquisition system that is being used. Furthermore, the scanner 20 may project a beam or pattern of light indicating to the operator the direction in which the scanner is to be moved. Alternatively, indicator lights on the device may indicate the direction in which the scanner should be moved. In an automated mode, the scanner 20 or object 34 may be automatically moved relative to one another according to the measurement requirements.


Similar to the embodiment of FIG. 1, the first coordinate acquisition system 76 and the second coordinate acquisition system 78 are electrically coupled to a controller 50 disposed within the housing 22. The controller 50 may include one or more microprocessors, digital signal processors, memory and signal conditioning circuits. The scanner 20 may further include actuators (not shown) which may be manually activated by the operator to initiate operation and data capture by the scanner 20. In one embodiment, the image processing to determine the X, Y, Z coordinate data of the point cloud representing the surface 32 of object 34 is performed by the controller 50. The coordinate data may be stored locally such as in a volatile or nonvolatile memory 54 for example. The memory may be removable, such as a flash drive or a memory card for example. In other embodiments, the scanner 20 has a communications circuit 52 that allows the scanner 20 to transmit the coordinate data to a remote processing system 56. The communications medium 58 between the scanner 20 and the remote processing system 56 may be wired (e.g. Ethernet) or wireless (e.g. Bluetooth, IEEE 802.11). In one embodiment, the coordinate data is determined by the remote processing system 56 and the scanner 20 transmits acquired images on the communications medium 58.


Referring now to FIG. 4, the method 1400 of operating the scanner 20 of FIG. 3 will be described. In block 1402, the first projector 80 of the first coordinate acquisition system 76 of scanner 20 emits a structured light pattern onto the area 90 of surface 32 of the object 34. The light 84 from projector 80 is reflected from the surface 32 and the reflected light 85 is received by the first camera 82. As discussed above, the variations in the surface profile of the surface 32 create distortions in the imaged pattern of light received by the first photosensitive array 88. Since the pattern is formed by structured light, a line of light, or a point of light, it is possible in some instances for the controller 50 or the remote processing system 56 to determine a one-to-one correspondence between points on the surface 32 and the pixels in the photosensitive array 88. This enables the triangulation principles discussed above to be used in block 1404 to obtain point cloud data, which is to say to determine X, Y, Z coordinates of points on the surface 32. By moving the scanner 20 relative to the surface 32, a point cloud may be created of the entire object 34.


In block 1406, the controller 50 or remote processing system 56 determines whether the point cloud data possesses the desired data quality attributes or has a potential problem. The types of problems that may occur were discussed hereinabove in reference to FIG. 2 and this discussion is not repeated here. If the controller determines that the point cloud has the desired data quality attributes in block 1406, the procedure is finished. Otherwise, a determination is made in block 1408 of whether the scanner is used in a manual or automated mode. If the mode is manual, the operator is directed in block 1410 to move the scanner to the desired position.


There are several ways of indicating the desired movement by the operator as described hereinabove with reference to FIG. 2. This discussion is not repeated here.


To direct the operator in obtaining the desired movement, a method of determining the approximate position of the scanner is needed. As explained with reference to FIG. 2, methods may include attachment of the scanner 20 to an articulated arm CMM, use of inertial sensors within the scanner 20, illumination of photogrammetric dots, or matching of features to a displayed image.


After the operator has moved the scanner into position, a measurement is made with the second coordinate acquisition system 78 in block 1412. By using the second coordinate acquisition system, resolution and accuracy may be improved or problems may be eliminated. In block 1414, if all points have been collected, the procedure ends at block 1416; otherwise it continues.


If the mode of operation from block 1408 is automated, then in block 1418 the automated mechanism moves the scanner into the desired position. In most cases, an automated mechanism will have sensors to provide information about the relative position of the scanner and object under test. For the case in which the automated mechanism is a robot, angular transducers within the robot joints provide information about the position and orientation of the robot end effector used to hold the scanner. For other types of automated mechanisms, linear encoders or a variety of other sensors may provide information on the relative position of the object and the scanner.


After the automated mechanism has moved the scanner or object into position, then in block 1420 three-dimensional measurements are made with the second coordinate acquisition system 78. Such measurements are repeated by means of block 1422 until all measurements are completed. The procedure finishes at block 1424.


It should be appreciated that while the process of FIG. 4 is shown as a linear or sequential process, in other embodiments one or more of the steps shown may be executed in parallel. The method shown in FIG. 4 involves measuring the entire object first and then carrying out further detailed measurements according to an assessment of the acquired point cloud data. An alternative using scanner 20 is to begin by measuring detailed or critical regions using the second coordinate acquisition system 78.


It should also be appreciated that it is common practice in existing scanning systems to provide a way of changing the camera lens or projector lens as a way of changing the FOV of the camera or of the projector in the scanning system. However, such changes are time consuming and typically require an additional compensation step in which an artifact such as a dot plate is placed in front of the camera or projector to determine the aberration correction parameters for the camera or projector system. Hence a system that provides two different coordinate acquisition systems such as the scanning system 20 of FIG. 3 provides a significant advantage in measurement speed and in enablement of the scanner for a fully automated mode.


An error may occur in making scanner measurements as a result of multipath interference. The origin of multipath interference is now discussed, and a first method for eliminating or reducing multipath interference is described.


Multipath interference occurs when some of the light that strikes the object surface is first scattered off another surface of the object before returning to the camera. For the point on the object that receives this scattered light, the light sent to the photosensitive array then corresponds not only to the light directly projected from the projector but also to light projected from a different point on the projector and scattered off the object. The result of multipath interference, especially for the case of scanners that project two-dimensional (structured) light, may be to cause the distance calculated from the projector to the object surface at that point to be inaccurate.


An instance of multipath interference is illustrated with reference to FIG. 5A. In this embodiment, a scanner 4570 projects a line of light 4525 onto the surface 4510A of an object. The line of light 4525 is perpendicular to the plane of the paper. In an embodiment, the rows of a photosensitive array are parallel to the plane of the paper and the columns are perpendicular to the plane of the paper. Each row represents one point on the projected line 4525 in the direction perpendicular to the plane of the paper. The distance from the projector to the object for that point on the line is found by first calculating the centroid for each row. For the surface point 4526, the centroid on the photosensitive array 4541 is represented by the point 4546. The position 4546 of the centroid on the photosensitive array can be used to calculate the distance from the camera perspective center 4544 to the object point 4526. This calculation is based on trigonometric relationships according to the principles of triangulation. To perform these calculations, the baseline distance D from the camera perspective center 4544 to the projector perspective center 4523 is required. In addition, knowledge of the relative orientation of the projector system 4520 to the camera system 4540 is required.


To understand the error caused by multipath interference, consider the point 4527. Light reflected or scattered from this point is imaged by the lens 4542 onto the point 4548 on the photosensitive array 4541. However, in addition to the light received directly from the projector and scattered off the point 4527, additional light is reflected off the point 4526 onto the point 4527 before being imaged onto the photosensitive array. The light will most likely be scattered to an unexpected position and cause two centroids to be formed in a given row. Consequently, observation of two centroids on a given row is a good indicator of the presence of multipath interference.
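
This detection rule lends itself to a short sketch: compute the centroid of each contiguous illuminated run in a row of the photosensitive array, and flag rows with more than one run. The threshold value and helper names are assumptions for illustration.

```python
import numpy as np

def row_centroids(row, threshold=0.2):
    """Centroids of contiguous illuminated runs in one row of the array.

    A clean line-scan row has exactly one bright run; two or more
    separated runs suggest multipath interference on that row.
    """
    bright_idx = np.flatnonzero(row > threshold * row.max())
    runs = np.split(bright_idx, np.where(np.diff(bright_idx) > 1)[0] + 1)
    return [float(np.average(r, weights=row[r])) for r in runs if r.size]

def flag_multipath_rows(image):
    """Indices of rows whose light distribution indicates multipath."""
    return [i for i, row in enumerate(image) if len(row_centroids(row)) > 1]
```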


For the case of structured light projected onto an area of the object surface, a secondary reflection from a point such as 4527 is not usually as obvious as for light projected onto a line and hence is more likely to create an error in the measured 3D surface coordinates.


By using a projector having an adjustable pattern of illumination on a display element 4521, it is possible to vary the pattern of illumination. The display element 4521 might be a digital micromirror device (DMD) such as a digital light projector (DLP). Such devices contain multiple small mirrors that are individually adjustable by means of an electrical signal to rapidly change a pattern of illumination. Other devices that can produce an electrically adjustable display pattern include an LCD (liquid crystal display) and an LCOS (liquid crystal on silicon) display.


A way of checking for multipath interference in a system that projects structured light over an area is to change the display to project a line of light. The presence of multiple centroids in a row will indicate that multipath interference is present. By sweeping the line of light, an area can be covered without requiring that the probe be moved by an operator.


The line of light can be set to any desired angle by an electrically adjustable display. By changing the direction of the projected line of light, multipath interference can, in many cases, be eliminated.


For surfaces having many folds and steep angles so that reflections are hard to avoid, the electrically adjustable display can be used to sweep a point of light. In some cases, a secondary reflection may be produced from a single point of light, but it is usually relatively easy to determine which of the reflected spots of light is valid.


An electrically adjustable display can also be used to quickly switch between a coded and an uncoded pattern. In most cases, a coded pattern is used to make a 3D measurement based on a single frame of camera information. On the other hand, multiple patterns (sequential or uncoded patterns) may be used to obtain greater accuracy in the measured 3D coordinate values.


In the past, electrically adjustable displays have been used to project each of a series of patterns within a sequential pattern—for example, a series of gray scale line patterns followed by a sequence of sinusoidal patterns, each having a different phase.


The present inventive method provides advantages over earlier methods by selecting those projection methods that identify or eliminate problems such as multipath interference, and by indicating whether a single-shot pattern (for example, a coded pattern) or a multiple-shot pattern is preferred to obtain the required accuracy as quickly as possible.


For the case of a line scanner, there is often a way to determine the presence of multipath interference. When multipath interference is not present, the light reflected by a point on the object surface is imaged in a single row onto a region of contiguous pixels. If two or more regions of a row receive a significant amount of light, multipath interference is indicated. An example of such a multipath interference condition and the resulting extra region of illumination on the photosensitive array are shown in FIG. 5A. The surface 4510A now has a greater curvature near the point of intersection 4526. The surface normal at the point of intersection is the line 4528, and the angle of incidence is 4531. The direction of the reflected line of light 4529 is found from the angle of reflection 4532, which is equal to the angle of incidence. As stated hereinabove, the line of light 4529 actually represents an overall direction for light that scatters over a range of angles. The center of the scattered light strikes the surface 4510A at the point 4527, which is imaged by the lens 4542 at the point 4548 on the photosensitive array. The unexpectedly high amount of light received in the vicinity of the point 4548 indicates that multipath interference is probably present. For a line scanner, the main concern with multipath interference is not the case shown in FIG. 5A, where the two spots 4546 and 4548 are separated by a considerable distance and can be analyzed separately, but rather the case in which the two spots overlap or smear together. In this case, it may not be possible to determine the centroid corresponding to the desired point, which in FIG. 5A corresponds to the point 4546. The problem is made worse for the case of a scanner that projects light over a two-dimensional area, as can be understood by again referring to FIG. 5A. If all of the light imaged onto the photosensitive array 4541 were used to determine three-dimensional coordinates, then it is clear that the light at the point 4527 would correspond to the desired pattern of light directly projected from the projector as well as the unwanted light reflected off the object surface onto the point 4527. As a result, in this case, the wrong three-dimensional coordinates would likely be calculated for the point 4527 for light projected over an area.


For a projected line of light, in many cases, it is possible to eliminate multipath interference by changing the direction of the line. One possibility is to make a line scanner using a projector having inherent two-dimensional capability, thereby enabling the line to be swept or to be automatically rotated to different directions. An example of such a projector is one that makes use of a digital micromirror device (DMD), as discussed hereinabove. For example, if multipath interference were suspected in a particular scan obtained with structured light, a measurement system could be automatically configured to switch to a measurement method using a swept line of light.


Another method to reduce, minimize or eliminate multipath interference is to sweep a point of light, rather than a line of light or an area of light, over those regions for which multipath interference has been indicated. By illuminating a single point of light, any light scattered from a secondary reflection can usually be readily identified.


The determination of the desired pattern projected by an electrically adjustable display benefits from a diagnostic analysis, as described with respect to FIG. 12 below.


Besides its use in diagnosing and correcting multipath interference, changing the pattern of projected light also provides advantages in obtaining a required accuracy and resolution in a minimum amount of time. In an embodiment, a measurement is first performed by projecting a coded pattern of light onto an object in a single shot. The three-dimensional coordinates of the surface are determined using the collected data, and the results are analyzed to determine whether some regions have holes, edges, or features that require more detailed analysis. Such detailed analysis might be performed, for example, by using the narrow FOV camera 24 in FIG. 1 or the high resolution scanner system 78 in FIG. 3.


The coordinates are also analyzed to determine the approximate distance to the target, thereby providing a starting distance for a more accurate measurement method such as a method that sequentially projects sinusoidal phase-shifted patterns of light onto a surface, as discussed hereinabove. Obtaining a starting distance for each point on the surface using the coded light pattern eliminates the need to obtain this information by varying the pitch in multiple sinusoidal phase-shifted scans, thereby saving considerable time.
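
One common formulation of the phase-shift step, assumed here since the patent does not spell out the arithmetic, uses four sinusoidal patterns shifted by 90 degrees; the coarse distance from the coded pattern then selects the 2π branch, which is the time saving described above.

```python
import numpy as np

def wrapped_phase(i1, i2, i3, i4):
    """Per-pixel phase from four sinusoidal patterns shifted by 90 degrees.

    With I_k = A + B*cos(phase + k*pi/2), the differences (i4 - i2) and
    (i1 - i3) recover 2B*sin(phase) and 2B*cos(phase); the result is
    wrapped to (-pi, pi].
    """
    return np.arctan2(i4 - i2, i1 - i3)

def unwrap_with_prior(phase, prior_phase):
    """Choose the 2*pi branch nearest a prior estimate, such as one from
    a coarse coded-pattern measurement, avoiding extra multi-pitch scans."""
    k = np.round((prior_phase - phase) / (2.0 * np.pi))
    return phase + 2.0 * np.pi * k
```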


Referring now to FIG. 5B, an embodiment is shown for overcoming anomalies or improving accuracy in coordinate data acquired by scanner 20. The process 211 starts in block 212 by scanning an object, such as object 34, with a scanner 20. The scanner 20 may be a scanner such as those described in the embodiments of FIGS. 1, 3, 5, and 7, for example, having at least one projector and a camera. In this embodiment, the scanner 20 projects a first light pattern onto the object in block 212. In one embodiment, this first light pattern is a coded structured light pattern. The process 211 acquires and determines the three-dimensional coordinate data in block 214. The coordinate data is analyzed in query block 216 to determine if there are any anomalies, such as the aforementioned multipath interference, low resolution around an element, or an absence of data due to surface angles or surface reflectance changes. When an anomaly is detected, the process 211 proceeds to block 218 where the light pattern emitted by the projector is changed to a second light pattern. In an embodiment, the second light pattern is a swept line of light.


After projecting the second light pattern, the process 211 proceeds to block 220 where the three-dimensional coordinate data is acquired and determined for the area where the anomaly was detected. The process 211 loops back to query block 216 where it is determined if the anomaly has been resolved. If the query block 216 still detects an anomaly or a lack of accuracy or resolution, the process loops back to block 218 and switches to a third light pattern. In an embodiment, the third light pattern is a sequential sinusoidal phase-shift pattern. In another embodiment, the third light pattern is a swept point of light. This iterative procedure continues until the anomaly has been resolved. Once coordinate data from the area of the anomaly has been determined, the process 211 proceeds to block 222 where the emitted pattern is switched back to the first structured light pattern and the scanning process is continued. The process 211 continues until the operator has scanned the desired area of the object. In the event that the scanning information obtained using the method of FIG. 5B is not satisfactory, a method of measuring with a tactile probe, as discussed herein, may be used.
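
Process 211 is essentially an escalation loop, and one way it might be expressed is sketched below. The scanner methods and the pattern names are hypothetical stand-ins for the controller's actual interfaces, not part of the patent disclosure.

```python
PATTERNS = ["coded area", "swept line", "phase-shift sequence", "swept point"]

def measure_region(scanner, region):
    """Escalate through projection patterns until diagnostics pass."""
    for pattern in PATTERNS:
        cloud = scanner.scan(region, pattern)       # blocks 212/218/220
        if not scanner.find_anomalies(cloud):       # query block 216
            return cloud                            # resume normal scanning
    return cloud  # still anomalous; a tactile probe may be used instead
```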


Referring now to FIG. 6, another embodiment of a scanner 20 is shown mounted to a movable apparatus 120. The scanner 20 has at least one projector 122 and at least one camera 124 that are arranged in a fixed geometric relationship such that trigonometric principles may be used to determine the three-dimensional coordinates of points on the surface 32. The scanner 20 may be the same scanner as described in reference to FIG. 1 or FIG. 3 for example. In one embodiment, the scanner is the same as the scanner of FIG. 10 having a tactile probe. However the scanner used in the embodiment of FIG. 6 may be any scanner such as a structured light or line scanner, for example, a scanner disclosed in commonly owned U.S. Pat. No. 7,246,030 entitled “Portable Coordinate Measurement Machine with Integrated Line Laser Scanner” filed on 18 Jan. 2006 which is incorporated by reference herein. In another embodiment, the scanner used in the embodiment of FIG. 6 is a structured light scanner that projects light over an area on an object.


In the exemplary embodiment, the moveable apparatus 120 is a robotic apparatus that provides automated movements by means of arm segments 126, 128 that are connected by pivot and swivel joints 130 to allow the arm segments 126, 128 to be moved, resulting in the scanner 20 moving from a first position to a second position (as indicated in dashed lines in FIG. 6). The moveable apparatus 120 may include actuators, such as motors (not shown), for example, that are coupled to the arm segments 126, 128 to move the arm segments 126, 128 from the first position to the second position. It should be appreciated that the movable apparatus 120 having articulated arms is described for exemplary purposes and the claimed invention should not be so limited. In other embodiments, the scanner 20 may be mounted to a movable apparatus that moves the scanner 20 via rails, wheels, tracks, belts, cables, or a combination of the foregoing, for example. In other embodiments, the robot has a different number of arm segments.


In one embodiment, the movable apparatus is an articulated arm coordinate measurement machine (AACMM) such as that described in commonly owned U.S. patent application Ser. No. 13/491,176 filed on Jan. 20, 2010. In this embodiment, the movement of the scanner 20 from the first position to the second position may involve the operator manually moving the arm segments 126, 128.


For an embodiment having an automated apparatus, the moveable apparatus 120 further includes a controller 132 that is configured to energize the actuators to move the arm segments 126, 128. In one embodiment, the controller 132 communicates with a controller 134. As will be discussed in more detail below, this arrangement allows the controller 132 to move the scanner 20 in response to an anomaly in the acquired data. It should be appreciated that the controllers 132, 134 may be incorporated into a single processing unit or the functionality may be distributed among several processing units.


By carrying out the analysis described with reference to FIG. 12, it is possible to position and orient the scanner 20 to obtain the desired measurement results. In some embodiments, measurement of a feature may benefit from a particular scanner orientation. For example, measurement of the diameter of a hole may be improved by orienting the scanner camera 124 to be approximately perpendicular to the hole. In other embodiments, a scanner may be positioned so as to reduce or minimize the possibility of multipath interference. Such an analysis may be based on a CAD model available as a part of the diagnostic procedure, or it may be based on data collected by the scanner in an initial position prior to a secondary movement of the scanner 20 by the apparatus 120.


Referring now to FIG. 7, the operation of the scanner 20 and movable apparatus 120 will be described. The process starts in block 136 with scanning the object 34 with the scanner 20 in the first position. The scanner 20 acquires and determines coordinate data for points on the surface 32 of the object 34 in block 138. It should be appreciated that the movable apparatus 120 may move the scanner 20 to acquire data on surface points in a desired area. In query block 140, it is determined whether there is an anomaly in the coordinate data at point 142, such as multipath interference for example, or whether there is a need to change direction to obtain improved resolution or measurement accuracy. It should be appreciated that the point 142 of FIG. 6 may represent a single point, a line of points or an area on the surface 32. If an anomaly or need for improved accuracy is detected, the process continues to block 144 where the movable apparatus 120 moves the position of the scanner 20, such as from the first position to the second position, and rescans the area of interest in block 146 to acquire three-dimensional coordinate data. The process loops back to query block 140 where it is determined whether there is still an anomaly in the coordinate data or whether improved measurement accuracy is desired. In these cases, the scanner 20 is moved again and the process continues until the measurement results achieve a desired level. Once the coordinate data is obtained, the process proceeds from query block 140 to block 148 where the scanning process continues until the desired area has been scanned.
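The reposition-and-rescan loop of FIG. 7 may be sketched as follows; move_to(), scan(), and has_anomaly() are hypothetical placeholders for the controller 132/134 and scanner interfaces, not names from the patent.

```python
# Sketch of the FIG. 7 loop; move_to(), scan(), and has_anomaly() are
# hypothetical placeholders for the robot-controller and scanner interfaces.
def scan_with_repositioning(candidate_poses, move_to, scan, has_anomaly):
    for pose in candidate_poses:     # block 144: reposition the scanner
        move_to(pose)
        points = scan()              # blocks 138/146: acquire coordinate data
        if not has_anomaly(points):  # query block 140
            return points            # block 148: continue the normal scan
    return None                      # no pose yielded acceptable data
```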


In embodiments where the scanner 20 includes a tactile probe (FIG. 10), the movement of the scanner from the first position to the second position may be arranged to contact the areas of interest with the tactile probe. Since the position of the scanner, and thus the tactile probe, may be determined from the position and orientation of the arm segments 126, 128, the three-dimensional coordinates of the point on the surface 32 may be determined.
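Determining the probe position from the arm-segment positions is a forward-kinematics computation. The following is a deliberately simplified planar two-segment sketch, with invented joint angles, lengths, and tip offset, intended only to show how the tip location follows from the joint readings.

```python
import math

# Toy planar forward kinematics for a two-segment arm: the probe-tip
# position follows from the joint angles and calibrated segment lengths.
def probe_tip_xy(theta1, theta2, len1, len2, tip_offset):
    elbow_x = len1 * math.cos(theta1)
    elbow_y = len1 * math.sin(theta1)
    heading = theta1 + theta2                  # orientation of the second segment
    wrist_x = elbow_x + len2 * math.cos(heading)
    wrist_y = elbow_y + len2 * math.sin(heading)
    # the tip sits a fixed, calibrated offset beyond the last segment
    return (wrist_x + tip_offset * math.cos(heading),
            wrist_y + tip_offset * math.sin(heading))

print(probe_tip_xy(math.radians(30), math.radians(45), 400.0, 300.0, 50.0))
```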


In some embodiments, measurement results obtained by the scanner 20 of FIGS. 8A, 8B may be corrupted by multipath interference. In other cases, measurement results may not provide the desired resolution or accuracy to properly measure some characteristics of the surface 32, especially edges, holes, or intricate features. In these cases, it may be desirable to have an operator use a remote probe 152 to interrogate points or areas on the surface 32. In one embodiment shown in FIGS. 8A, 8B, the scanner 20 includes a projector 156 and cameras 154, 155 arranged at an angle relative to the projector 156 such that light emitted by the projector 156 is reflected off of the surface 32 and received by one or both of the cameras 154, 155. The projector 156 and cameras 154, 155 are arranged in a fixed geometric relationship so that trigonometric principles may be used to determine the three-dimensional coordinates of points on the surface 32.


In one embodiment, the projector 156 is configured to emit a visible light 157 onto an area of interest 159 on the surface 32 of object 34 as shown in FIG. 8A. The three-dimensional coordinates of the illuminated area of interest 159 may be confirmed using the image of the illuminated region 159 on one or both of the cameras 154, 155.


The scanner 20 is configured to cooperate with the remote probe 152 so that an operator may bring a probe tip 166 into contact with the object surface 32 at the illuminated region of interest 159. In an embodiment, the remote probe 152 includes at least three non-collinear points of light 168. The points of light 168 may be spots of light produced, for example, by light emitting diodes (LEDs), or reflective dots illuminated by an infrared or visible light source from the projector 156 or from another light source not depicted in FIG. 8B. The infrared or visible light source in this case may be attached to the scanner 20 or may be placed external to the scanner 20. By determining the three-dimensional coordinates of the spots of light 168 with the scanner and by using information on the geometry of the probe 152, the position of the probe tip 166 may be determined, thereby enabling the coordinates of the object surface 32 to be determined. A tactile probe used in this way eliminates potential problems from multipath interference and also enables relatively accurate measurement of holes, edges, and detailed features. In an embodiment, the probe 152 is a tactile probe, which may be activated by pressing of an actuator button (not shown) on the probe, or the probe 152 may be a touch-trigger probe activated by contact with the surface 32. In response to a signal produced by the actuator button or the touch-trigger probe, a communications circuit (not shown) transmits a signal to the scanner 20. In an embodiment, the points of light 168 are replaced with geometrical patterns of light, which may include lines or curves.
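One way to recover the probe-tip position from the three measured points of light is a rigid-body best fit between the markers' calibrated positions in the probe frame and their triangulated positions in the scanner frame. The sketch below uses the Kabsch algorithm; all coordinates are invented for illustration, and the patent does not prescribe this particular method.

```python
import numpy as np

def fit_rigid(probe_pts, measured_pts):
    """Best-fit rotation R and translation t with measured ~ R @ probe + t
    (Kabsch algorithm via SVD)."""
    pc, mc = probe_pts.mean(axis=0), measured_pts.mean(axis=0)
    H = (probe_pts - pc).T @ (measured_pts - mc)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflections
    R = Vt.T @ D @ U.T
    return R, mc - R @ pc

# Calibrated marker and tip coordinates in the probe's own frame (invented):
markers_probe = np.array([[0.0, 0.0, 0.0], [60.0, 0.0, 0.0], [0.0, 40.0, 0.0]])
tip_probe = np.array([30.0, 20.0, -120.0])
# Marker coordinates as triangulated by the scanner cameras (invented):
markers_meas = markers_probe + np.array([105.0, 52.0, 498.0])

R, t = fit_rigid(markers_probe, markers_meas)
tip_world = R @ tip_probe + t  # 3D coordinate of the contact point
print(tip_world)
```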


Referring now to FIG. 9, a process is shown for acquiring coordinate data for points on the surface 32 of object 34 using the stationary scanner 20 of FIGS. 8A, 8B with a remote probe 152. The process starts in block 170 with the surface 32 of the object 34 being scanned. The process acquires and determines the three-dimensional coordinate data of the surface 32 in block 172. The process then determines in query block 174 whether there is an anomaly in the coordinate data of area 159 or whether there is a problem in accuracy or resolution of the area 159. An anomaly could be invalid data that is discarded due to multipath interference, for example. An anomaly could also be missing data due to surface reflectance or a lack of resolution around a feature such as an opening or hole, for example. Details of a diagnostic procedure for detecting (identifying) multipath interference and related problems are given with reference to FIG. 12.


Once the area 159 has been identified, the scanner 20 indicates to the operator in block 176 that the coordinate data of area 159 may be acquired via the remote probe 152. This area 159 may be indicated by emitting a visible light 157 to illuminate the area 159. In one embodiment, the light 157 is emitted by the projector 156. The color of light 157 may be changed to inform the operator of the type of anomaly or problem. For example, where multipath interference occurs, the light 157 may be colored red, while low resolution may be indicated by green light. The area may further be indicated on a display having a graphical representation (e.g. a CAD model) of the object.


The process then proceeds to block 178 to acquire an image of the remote probe 152 when the probe tip 166 touches the surface 32. The points of light 168, which may be LEDs or reflective targets, for example, are received by one or both of the cameras 154, 155. Using well-known best-fit techniques, the scanner 20 determines in block 180 the three-dimensional coordinates of the probe center, from which the three-dimensional coordinates of the object surface 32 are determined. Once the points in the area 159 where the anomaly was detected have been acquired, the process may proceed to continue the scan of the object 34 in block 182 until the desired areas have been scanned.


Referring now to FIG. 10, another embodiment of the scanner 20 is shown that may be handheld by the operator during operation. In this embodiment, the housing 22 may include a handle 186 that allows the operator to hold the scanner 20 during operation. The housing 22 includes a projector 188 and a camera 190 arranged at an angle relative to each other such that the light 192 emitted by the projector is reflected off of the surface 32 and received by the camera 190. The scanner 20 of FIG. 10 operates in a manner substantially similar to the embodiments of FIG. 1 and FIG. 3 and acquires three-dimensional coordinate data of points on the surface 32 using trigonometric principles.


The scanner 20 further includes an integral probe member 184. The probe member 184 includes a sensor 194 on one end. The sensor 194 is a tactile probe that may respond to pressing of an actuator button (not shown) by an operator, or it may be a touch-trigger probe that responds to contact with the surface 32, for example. As will be discussed in more detail below, the probe member 184 allows the operator to acquire coordinates of points on the surface 32 by contacting the sensor 194 to the surface 32.


The projector 188, camera 190 and actuator circuit for the sensor 194 are electrically coupled to a controller 50 disposed within the housing 22. The controller 50 may include one or more microprocessors, digital signal processors, memory and signal conditioning circuits. The scanner 20 may further include actuators (not shown), such as on the handle 186 for example, which may be manually activated by the operator to initiate operation and data capture by the scanner 20. In one embodiment, the image processing to determine the X, Y, Z coordinate data of the point cloud representing the surface 32 of object 34 is performed by the controller 50. The coordinate data may be stored locally such as in a volatile or nonvolatile memory 54 for example. The memory may be removable, such as a flash drive or a memory card for example. In other embodiments, the scanner 20 has a communications circuit 52 that allows the scanner 20 to transmit the coordinate data to a remote processing system 56. The communications medium 58 between the scanner 20 and the remote processing system 56 may be wired (e.g. Ethernet) or wireless (e.g. Bluetooth, IEEE 802.11). In one embodiment, the coordinate data is determined by the remote processing system 56 and the scanner 20 transmits acquired images on the communications medium 58.


Referring now to FIG. 11, the operation of the scanner 20 of FIG. 10 will be described. The process begins in block 196 with the operator scanning the surface 32 of the object 34 by manually moving the scanner 20. The three-dimensional coordinates are determined and acquired in block 198. In query block 200, it is determined if an anomaly is present in the coordinate data or if improved accuracy is needed. As discussed above, anomalies may occur for a number of reasons such as multipath interference, surface reflectance changes or a low resolution of a feature. If an anomaly is present, the process proceeds to block 202 where the area 204 is indicated to the operator. The area 204 may be indicated by projecting a visible light 192 with the projector 188 onto the surface 32. In one embodiment, the light 192 is colored to notify the operator of the type of anomaly detected.


The operator then proceeds to move the scanner from a first position to a second position (indicated by the dashed lines) in block 206. In the second position, the sensor 194 contacts the surface 32. The position and orientation (in six degrees of freedom) of the scanner 20 in the second position may be determined using well-known best-fit methods based on images acquired by the camera 190. Since the dimensions and arrangement of the sensor 194 are known in relation to the mechanical structure of the scanner 20, the three-dimensional coordinate data of the points in area 204 may be determined in block 208. The process then proceeds to block 210 where scanning of the object continues. The scanning process continues until the desired area has been scanned.
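One widely used best-fit method for recovering a six-degree-of-freedom pose from a single camera image of known reference points is perspective-n-point estimation. The patent does not name a specific algorithm, so the OpenCV sketch below, with invented point coordinates and a placeholder camera matrix, is only one possible realization.

```python
import numpy as np
import cv2

# Invented 3D reference points with known coordinates and their detected
# 2D image locations; K is a placeholder pinhole camera matrix.
object_pts = np.array([[0.0, 0.0, 0.0], [100.0, 0.0, 0.0],
                       [0.0, 80.0, 0.0], [100.0, 80.0, 20.0]])
image_pts = np.array([[322.0, 241.0], [418.0, 239.0],
                      [320.0, 331.0], [417.0, 330.0]])
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
# rvec (rotation) and tvec (translation) together give the six-degree-of-
# freedom pose of the reference points relative to the camera.
```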


A general approach may be used to evaluate not only multipath interference but also quality in general, including resolution and the effects of material type, surface quality, and geometry. Referring also to FIG. 12, in an embodiment, a method 4600 may be carried out automatically under computer control. A step 4602 is to determine whether information on the three-dimensional coordinates of an object under test is available. A first type of three-dimensional information is CAD data. CAD data usually indicates the nominal dimensions of an object under test. A second type of three-dimensional information is measured three-dimensional data, for example, data previously measured with a scanner or other device. In some cases, the step 4602 may include a further step of aligning the frame of reference of the coordinate measurement device, for example, a laser tracker or six-DOF scanner accessory, with the frame of reference of the object. In an embodiment, this is done by measuring at least three points on the surface of the object with the laser tracker.
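Aligning the device's frame of reference with the object's frame from three measured points can be done with a simple 3-2-1 construction. The sketch below is an illustration under assumed point roles (origin, x axis, plane point); the function name is invented and the patent does not prescribe this particular construction.

```python
import numpy as np

def frame_from_three_points(p1, p2, p3):
    """Orthonormal frame from three measured points: p1 is the origin,
    p1->p2 sets the x axis, and p3 fixes the x-y plane (3-2-1 alignment)."""
    x = p2 - p1
    x = x / np.linalg.norm(x)
    z = np.cross(x, p3 - p1)
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)
    R = np.column_stack([x, y, z])  # columns: frame axes in world coordinates
    return R, p1                    # object coords: R.T @ (world_point - p1)
```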


If the answer to the question posed in step 4602 is that the three-dimensional information is available, then, in a step 4604, the computer or processor is used to calculate the susceptibility of the object measurement to multipath interference. In an embodiment, this is done by projecting each ray of light emitted by the scanner projector and calculating the angle of reflection for each case. The software identifies each region of the object surface that is susceptible to error as a result of multipath interference. The step 4604 may also carry out an analysis of the susceptibility to multipath error for a variety of positions of the six-DOF probe relative to the object under test. In some cases, multipath interference may be avoided or minimized by selecting a suitable position and orientation of the six-DOF probe relative to the object under test, as described hereinabove. If the answer to the question posed in step 4602 is that three-dimensional information is not available, then a step 4606 is to measure the three-dimensional coordinates of the object surface using any desired or preferred measurement method. Following the calculation of multipath interference, a step 4608 may be carried out to evaluate other aspects of expected scan quality. One such quality factor is whether the resolution of the scan is sufficient for the features of the object under test. For example, if the resolution of a device is 3 mm and there are sub-millimeter features for which valid scan data is desired, then these problem regions of the object should be noted for later corrective action. Another quality factor, related partly to resolution, is the ability to measure edges of the object and edges of holes. Knowledge of scanner performance will enable a determination of whether the scanner resolution is good enough for given edges. Another quality factor is the amount of light expected to be returned from a given feature. Little if any light may be expected to be returned to the scanner from inside a small hole, for example, or from a glancing angle. Also, little light may be expected from certain kinds and colors of materials. Certain types of materials may have a large depth of penetration for the light from the scanner, and in this case good measurement results would not be expected. In some cases, an automatic program may ask the user for supplementary information. For example, if a computer program is carrying out steps 4604 and 4608 based on CAD data, it may not know the type of material being used or the surface characteristics of the object under test. In these cases, the step 4608 may include a further step of obtaining material characteristics for the object under test.
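The ray-projection analysis of step 4604 amounts to reflecting each projector ray about the local surface normal and testing whether the secondary ray strikes the object inside the camera's field of view. A minimal sketch follows; hits_object() and in_camera_fov() are hypothetical geometry callbacks (for example, a ray caster over the CAD mesh) and are not part of any published API.

```python
import numpy as np

def reflect(d, n):
    """Specular reflection of unit ray direction d about unit normal n."""
    return d - 2.0 * np.dot(d, n) * n

def multipath_suspects(points, normals, projector_origin,
                       hits_object, in_camera_fov):
    """Flag surface points whose reflected secondary ray re-strikes the
    object within the camera field of view (the test of step 4604)."""
    flagged = []
    for p, n in zip(points, normals):
        d = p - projector_origin
        d = d / np.linalg.norm(d)        # projector ray toward the point
        r = reflect(d, n)                # secondary (specularly reflected) ray
        hit = hits_object(p, r)          # None, or the re-strike location
        if hit is not None and in_camera_fov(hit):
            flagged.append(p)            # susceptible to multipath error
    return flagged
```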


Following the analysis of steps 4604 and 4608, the step 4610 is to decide whether further diagnostic procedures should be carried out. A first example of a possible diagnostic procedure is the step 4612 of projecting a stripe at a preferred angle to note whether multipath interference is observed. The general indications of multipath interference for a projected line stripe were discussed hereinabove with reference to FIG. 5. Another example of a diagnostic step is step 4614, which is to project a collection of lines aligned in the direction of epipolar lines on the source pattern of light, for example, the source pattern of light 30 from projector 36 in FIG. 1. If the lines of light in the source pattern are aligned to the epipolar lines, then these lines will also appear as straight lines in the image plane on the photosensitive array. The use of epipolar lines is discussed in more detail in commonly owned U.S. patent application Ser. No. 13/443,946 filed on Apr. 11, 2012, the contents of which are incorporated by reference herein. If these patterns on the photosensitive array are not straight lines, or if the lines are blurred or noisy, then a problem is indicated, possibly as a result of multipath interference.
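Judging whether an imaged line "is straight" can be quantified as the residual of a best-fit line through the detected stripe pixels. The sketch below is one plausible implementation of such a straightness check, with the pass/fail threshold left to the caller.

```python
import numpy as np

def stripe_straightness(pixels):
    """RMS deviation (in pixels) of imaged stripe points from their
    best-fit line; a large value flags a bent, blurred, or noisy stripe."""
    pts = np.asarray(pixels, dtype=float)        # N x 2 pixel coordinates
    centered = pts - pts.mean(axis=0)
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    direction = Vt[0]                            # principal (line) direction
    residual = centered - np.outer(centered @ direction, direction)
    return float(np.sqrt((residual ** 2).sum(axis=1).mean()))

# e.g. flag possible multipath when the residual exceeds a few pixels
```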


The step 4616 is to select a combination of preferred actions based on the analyses and diagnostic procedures performed. If speed in a measurement is particularly important, a step 4618 of measuring using a 2D (structured) pattern of coded light may be preferred. If greater accuracy is more important, then a step 4620 of measuring using sequential 2D (structured) patterns, for example, a sequence of sinusoidal patterns of varying phase and pitch, may be preferred. If the method 4618 or 4620 is selected, then it may be desirable to also select a step 4628, which is to reposition the scanner, in other words, to adjust the position and orientation of the scanner to the position that minimizes multipath interference and specular reflections (glints) as provided by the analysis of step 4604. Such indications may be provided to a user by illuminating problem regions with light from the scanner projector or by displaying such regions on a monitor display. Alternatively, the next steps in the measurement procedure may be automatically selected by a computer or processor. If the preferred scanner position does not eliminate multipath interference and glints, several options are available. In some cases, the measurement can be repeated with the scanner repositioned and the valid measurement results combined. In other cases, alternative measurement steps may be added to the procedure or performed instead of using structured light. As discussed previously, a step 4622 of scanning a stripe of light provides a convenient way of obtaining information over an area with a reduced chance of having a problem from multipath interference. A step 4624 of sweeping a small spot of light over a region of interest further reduces the chance of problems from multipath interference. A step of measuring a region of an object surface with a tactile probe eliminates the possibility of multipath interference. A tactile probe provides a known resolution based on the size of the probe tip, and it eliminates issues with low reflectance of light or large optical penetration depth, which might be found in some objects under test.
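The selection logic of step 4616 can be expressed compactly. The priorities, category names, and thresholds in the sketch below are assumptions chosen for illustration, since the patent leaves the exact decision rules open.

```python
# Illustrative decision rules for step 4616; the flags, categories, and
# ordering are assumptions, not prescribed by the patent.
def select_actions(speed_critical, accuracy_critical, multipath_risk):
    actions = []
    if speed_critical:
        actions.append("coded_2d_pattern")         # step 4618
    elif accuracy_critical:
        actions.append("sequential_phase_shift")   # step 4620
    if multipath_risk == "moderate":
        actions.append("reposition_scanner")       # step 4628
    elif multipath_risk == "high":
        actions += ["swept_stripe", "swept_spot"]  # steps 4622, 4624
    elif multipath_risk == "unresolvable":
        actions.append("tactile_probe")            # eliminates multipath
    return actions

print(select_actions(False, True, "high"))
```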


In most cases, the quality of the data collected in a combination of the steps 4618-4628 may be evaluated in a step 4630 based on the data obtained from the measurements, combined with the results of the analyses carried out previously. If the quality is found to be acceptable in a step 4632, the measurement is completed at a step 4634. Otherwise, the analysis resumes at the step 4604. In some cases, the 3D information may not have been as accurate as desired. In this case, repeating some of the earlier steps may be helpful.


While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.

Claims
  • 1. A method of determining three-dimensional (3D) coordinates of points on a surface of an object, the method comprising: providing a 3D coordinate measurement device that includes a projector and a camera, the projector having a light source operable to emit projected light, the projected light having at least one pattern, the camera having a lens and a photosensitive array, the camera having a camera field of view; providing a moveable apparatus operable to cause relative movement between the object and the device; providing a position sensing mechanism operably coupled to the moveable apparatus, the position sensing mechanism operable to determine a position of the moveable apparatus; providing a processor electrically coupled to the projector, the camera, the moveable apparatus, and the position sensing mechanism; emitting from the projector onto the surface a first projected light having a first pattern; acquiring with the camera a first image of the first projected light and producing a first electrical signal in response; determining with the processor a first set of 3D coordinates of first points on the surface, the first set based at least in part on the first pattern and the first electrical signal; determining with the processor a simulation in which the processor projects first rays from the projector in the first position to the first points and calculates for each first ray an angle of reflection of a secondary ray from each of the first points and determining that at least one of the secondary rays intersects the object within the field of view of the camera in the first position; moving the device with the moveable apparatus from the first position to a second position based at least in part on the determining that at least one of the secondary rays intersects the object within the field of view of the camera in the first position; with the projector in the second position, emitting from the projector onto the surface a second projected light having a second pattern; acquiring with the camera a second image of the second projected light and producing a second electrical signal in response; determining with the processor a second set of 3D coordinates of a second point on the surface, the second set based at least in part on the second pattern and the second electrical signal; and storing the second set of 3D coordinates.
  • 2. The method of claim 1, wherein the first pattern and the second pattern are the same.
  • 3. The method of claim 1, wherein the first pattern is different than the second pattern.
  • 4. The method of claim 1, further including steps of: combining the first set of 3D coordinates and the second set of 3D coordinates into a third set of 3D coordinates; eliminating a portion of the third set of 3D coordinates to obtain a fourth set of 3D coordinates based at least in part on the determining that the angle of reflection of at least one of the secondary rays intersects the object within the field of view of the camera in the first position; and storing the fourth set of 3D coordinates.
  • 5. The method of claim 1, wherein: the determining with the processor a simulation in which the processor projects first rays further includes steps of emitting from the projector onto the surface a first collection of epipolar lines; reflecting into the camera a portion of the first collection of epipolar lines as a first diagnostic reflected light; forming with the lens a first diagnostic image of the first diagnostic reflected light on the photosensitive array and producing a first diagnostic electrical signal in response; determining with the processor whether the first diagnostic image includes a collection of straight lines; and determining with the processor that the first diagnostic image includes lines that are not straight lines; and in the selecting by the processor the second pattern, the second pattern is further based on the determining that the first diagnostic image includes lines that are not straight lines.
  • 6. The method of claim 1, wherein in the selecting with the processor the second pattern, the second pattern is a single line stripe or a single spot, the second pattern being based at least in part on the determining that the angle of reflection of at least one of the secondary rays intersects the object within the field of view of the camera in the first position.
  • 7. A method of object measurement to determine three-dimensional (3D) coordinates of points on a surface of the object, the method comprising: providing a 3D coordinate measurement device that includes a projector and a camera, the camera having a lens and a photosensitive array, the camera having a camera field of view; providing a moveable apparatus operable to cause relative movement between the object and the device; providing a position sensing mechanism operably coupled to the moveable apparatus, the position sensing mechanism configured to determine a position of the moveable apparatus; providing a processor electrically coupled to the projector, the camera, the moveable apparatus, and the position sensing mechanism; providing a CAD model of the object, the CAD model operable to provide computer-readable information for determining a 3D representation of the surface of the object; providing computer readable media having computer readable instructions which when executed by the processor calculates 3D coordinates of points on the surface based at least in part on the CAD model; determining with the processor a first set of 3D coordinates of first points on the surface, the first set based at least in part on the computer-readable information in the CAD model; determining with the processor a simulation in which the processor projects first rays from the projector to the first points and calculates for each first ray an angle of reflection of a secondary ray from each of the first points and determining that at least one of the secondary rays intersects the object within the field of view of the camera in the first position; moving the movable apparatus to a first location based at least in part on the determining that the angle of reflection of at least one of the secondary rays intersects the object within the field of view of the camera in the first position; selecting with the processor a first pattern based at least in part on the determining that at least one of the secondary rays intersects the object within the field of view of the camera in the first position; with the projector in the first location, emitting from the projector onto the surface a first projected light having the first pattern; acquiring with the camera a first image of the first projected light and producing a first electrical signal in response; determining with the processor a second set of 3D coordinates of at least one second point on the surface, the second set based at least in part on the first pattern and the first electrical signal; and storing the second set of 3D coordinates.
  • 8. A system of determining three-dimensional (3D) coordinates of points on a surface of an object, the system comprising: a 3D coordinate measurement device having a projector and a camera, the projector having a light source operable to emit projected light, the projected light having at least one pattern, the camera having a lens and a photosensitive array, the camera having a camera field of view; a moveable apparatus operable to cause relative movement between the object and the device; a position sensing mechanism operably coupled to the moveable apparatus, the position sensing mechanism operable to determine a position of the moveable apparatus; one or more processors responsive to executable computer instructions, the one or more processors electrically coupled to the projector, the camera, the moveable apparatus, and the position sensing mechanism, the executable computer instructions comprising: emitting from the projector onto the surface a first projected light having a first pattern; acquiring with the camera a first image of the first projected light and producing a first electrical signal in response; determining a first set of 3D coordinates of first points on the surface, the first set based at least in part on the first pattern and the first electrical signal; determining a simulation in which first rays are projected from the projector in the first position to the first points; calculating for each first ray an angle of reflection of a secondary ray from each of the first points; determining that at least one of the secondary rays intersects the object within the field of view of the camera in the first position; moving the device with the moveable apparatus from the first position to a second position based at least in part on the determining that at least one of the secondary rays intersects the object within the field of view of the camera in the first position; emitting from the projector onto the surface, with the projector in the second position, a second projected light having a second pattern; acquiring with the camera a second image of the second projected light and producing a second electrical signal in response; determining a second set of 3D coordinates of a second point on the surface, the second set based at least in part on the second pattern and the second electrical signal; and storing the second set of 3D coordinates.
  • 9. The system of claim 8, wherein the first pattern and the second pattern are the same.
  • 10. The system of claim 8, wherein the first pattern is different than the second pattern.
  • 11. The system of claim 8, wherein the executable computer instructions further comprise: combining the first set of 3D coordinates and the second set of 3D coordinates into a third set of 3D coordinates; eliminating a portion of the third set of 3D coordinates to obtain a fourth set of 3D coordinates based at least in part on the determining that the angle of reflection of at least one of the secondary rays intersects the object within the field of view of the camera in the first position; and storing the fourth set of 3D coordinates.
  • 12. The system of claim 8, wherein: the determining of a simulation in which the first rays are projected further includes steps of emitting from the projector onto the surface a first collection of epipolar lines; reflecting into the camera a portion of the first collection of epipolar lines as a first diagnostic reflected light; forming with the lens a first diagnostic image of the first diagnostic reflected light on the photosensitive array and producing a first diagnostic electrical signal in response; determining with the processor whether the first diagnostic image includes a collection of straight lines; and determining with the processor that the first diagnostic image includes lines that are not straight lines; and in the selecting the second pattern, the second pattern is further based on the determining that the first diagnostic image includes lines that are not straight lines.
  • 13. The system of claim 8, wherein, in the step of selecting with the processor the second pattern, the second pattern is a single line stripe or a single spot, the second pattern being based at least in part on the determining that the angle of reflection of at least one of the secondary rays intersects the object within the field of view of the camera in the first position.
  • 14. A system of object measurement to determine three-dimensional (3D) coordinates of points on a surface of the object, the system comprising: a 3D coordinate measurement device having a projector and a camera, the camera having a lens and a photosensitive array, the camera having a camera field of view; a moveable apparatus operable to cause relative movement between the object and the device; a position sensing mechanism operably coupled to the moveable apparatus, the position sensing mechanism operable to determine a position of the moveable apparatus; one or more processors responsive to executable computer instructions, the one or more processors being electrically coupled to the projector, the camera, the moveable apparatus, and the position sensing mechanism, the executable computer instructions comprising: receiving a CAD model of the object, the CAD model configured to provide computer-readable information for determining a 3D representation of the surface of the object; determining 3D coordinates of points on the surface based at least in part on the CAD model; determining a first set of 3D coordinates of first points on the surface, the first set based at least in part on the computer-readable information in the CAD model; determining a simulation in which first rays are projected from the projector to the first points; calculating for each first ray an angle of reflection of a secondary ray from each of the first points; determining that at least one of the secondary rays intersects the object within the field of view of the camera in the first position; moving the movable apparatus to a first location based at least in part on the determining that the angle of reflection of at least one of the secondary rays intersects the object within the field of view of the camera in the first position; selecting a first pattern based at least in part on the determining that at least one of the secondary rays intersects the object within the field of view of the camera in the first position; emitting from the projector onto the surface, with the projector in the first location, a first projected light having the first pattern; acquiring with the camera a first image of the first projected light and producing a first electrical signal in response; determining with the processor a second set of 3D coordinates of at least one second point on the surface, the second set based at least in part on the first pattern and the first electrical signal; and storing the second set of 3D coordinates.
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a Continuation application of U.S. application Ser. No. 14/139,143 filed on Dec. 23, 2013, which is a non-provisional patent application claiming the benefit of U.S. Provisional Patent Application No. 61/791,797 filed on Mar. 15, 2013. U.S. application Ser. No. 14/139,143 is also a Continuation In Part application of U.S. patent application Ser. No. 13/443,946, filed on Apr. 11, 2012. U.S. patent application Ser. No. 13/443,946 is a nonprovisional application of U.S. Provisional Application No. 61/592,049 filed on Jan. 30, 2012 and U.S. Provisional Application No. 61/475,703 filed on Apr. 15, 2011. The entire contents of all of the above applications are incorporated herein by reference.

US Referenced Citations (520)
Number Name Date Kind
2612994 Woodland Oct 1952 A
2682804 Clifford et al. Jul 1954 A
2784641 Keuffel et al. Mar 1957 A
3339457 Pun Sep 1967 A
3365717 Holscher Jan 1968 A
3464770 Schmidt Sep 1969 A
3497695 Smith et al. Feb 1970 A
3508828 Froome et al. Apr 1970 A
3619058 Hewlett et al. Nov 1971 A
3627429 Jaenicke et al. Dec 1971 A
3658426 Vyce Apr 1972 A
3728025 Madigan et al. Apr 1973 A
3740141 DeWitt, Jr. Jun 1973 A
3779645 Nakazawa et al. Dec 1973 A
3813165 Hines et al. May 1974 A
3832056 Shipp et al. Aug 1974 A
3900260 Wendt Aug 1975 A
3914052 Wiklund Oct 1975 A
4113381 Epstein Sep 1978 A
4178515 Tarasevich Dec 1979 A
4297030 Chaborski Oct 1981 A
4403857 Holscher Sep 1983 A
4413907 Lane Nov 1983 A
4453825 Buck et al. Jun 1984 A
4498764 Bolkow et al. Feb 1985 A
4521107 Chaborski et al. Jun 1985 A
4531833 Ohtomo Jul 1985 A
4537475 Summers et al. Aug 1985 A
4560270 Wiklund et al. Dec 1985 A
4632547 Kaplan et al. Dec 1986 A
4652130 Tank Mar 1987 A
4689489 Cole Aug 1987 A
4692023 Ohtomo et al. Sep 1987 A
4699598 Bolkow et al. Oct 1987 A
4707129 Hashimoto et al. Nov 1987 A
4714339 Lau et al. Dec 1987 A
4731812 Akerberg Mar 1988 A
4731879 Sepp et al. Mar 1988 A
4767257 Kato Aug 1988 A
4777660 Gould et al. Oct 1988 A
4790651 Brown et al. Dec 1988 A
4839507 May Jun 1989 A
4983021 Fergason Jan 1991 A
5002388 Ohishi et al. Mar 1991 A
5051934 Wiklund Sep 1991 A
5069524 Watanabe et al. Dec 1991 A
5082364 Russell Jan 1992 A
5090131 Deer Feb 1992 A
5121242 Kennedy Jun 1992 A
5137354 Devos et al. Aug 1992 A
5138154 Hotelling Aug 1992 A
5162862 Bartram et al. Nov 1992 A
5175601 Fitts Dec 1992 A
5198868 Saito et al. Mar 1993 A
5237384 Fukunaga et al. Aug 1993 A
5263103 Kosinski Nov 1993 A
5267014 Prenninger Nov 1993 A
5301005 Devos et al. Apr 1994 A
5313409 Wiklund et al. May 1994 A
5319434 Croteau et al. Jun 1994 A
5347306 Nitta Sep 1994 A
5392521 Allen Feb 1995 A
5400130 Tsujimoto et al. Mar 1995 A
5402193 Choate Mar 1995 A
5402582 Raab Apr 1995 A
5416321 Sebastian et al. May 1995 A
5440112 Sakimura et al. Aug 1995 A
5440326 Quinn Aug 1995 A
5448505 Novak Sep 1995 A
5455670 Payne et al. Oct 1995 A
5500737 Donaldson et al. Mar 1996 A
5532816 Spann et al. Jul 1996 A
5534992 Takeshima et al. Jul 1996 A
5594169 Field et al. Jan 1997 A
5611147 Raab Mar 1997 A
D378751 Smith Apr 1997 S
5671160 Julian Sep 1997 A
5698784 Hotelling et al. Dec 1997 A
5724264 Rosenberg et al. Mar 1998 A
5737068 Kaneko et al. Apr 1998 A
5742379 Reifer Apr 1998 A
5754284 Leblanc et al. May 1998 A
5764360 Meier Jun 1998 A
5767952 Ohtomo et al. Jun 1998 A
5771623 Pernstich et al. Jun 1998 A
5817243 Shaffer Oct 1998 A
5825350 Case, Jr. et al. Oct 1998 A
5828057 Hertzman et al. Oct 1998 A
5861956 Bridges et al. Jan 1999 A
5880822 Kubo Mar 1999 A
5886775 Houser et al. Mar 1999 A
5886777 Hirunuma Mar 1999 A
5892575 Marino Apr 1999 A
5893214 Meier et al. Apr 1999 A
5898421 Quinn Apr 1999 A
5926388 Kimbrough et al. Jul 1999 A
5930030 Scifres Jul 1999 A
5957559 Rueb et al. Sep 1999 A
5973788 Pettersen et al. Oct 1999 A
5991011 Damm Nov 1999 A
6017125 Vann Jan 2000 A
6023326 Katayama et al. Feb 2000 A
6034722 Viney et al. Mar 2000 A
6036319 Rueb et al. Mar 2000 A
6052190 Sekowski et al. Apr 2000 A
D427087 Kaneko et al. Jun 2000 S
6085155 Hayase et al. Jul 2000 A
6097491 Hartrumpf Aug 2000 A
6097897 Ide Aug 2000 A
6100540 Ducharme et al. Aug 2000 A
6111563 Hines Aug 2000 A
6122058 Van Der Werf et al. Sep 2000 A
6133998 Monz et al. Oct 2000 A
6166809 Pettersen et al. Dec 2000 A
6171018 Ohtomo et al. Jan 2001 B1
6193371 Snook Feb 2001 B1
6222465 Kumar et al. Apr 2001 B1
6262801 Shibuya et al. Jul 2001 B1
6295174 Ishinabe et al. Sep 2001 B1
6317954 Cunningham et al. Nov 2001 B1
6324024 Shirai et al. Nov 2001 B1
6330379 Hendriksen Dec 2001 B1
6344846 Hines Feb 2002 B1
6347290 Bartlett Feb 2002 B1
6351483 Chen Feb 2002 B1
6353764 Imagawa et al. Mar 2002 B1
6369794 Sakurai et al. Apr 2002 B1
6369880 Steinlechner Apr 2002 B1
6433866 Nichols Aug 2002 B1
6437859 Ohtomo et al. Aug 2002 B1
6445446 Kumagai et al. Sep 2002 B1
6462810 Muraoka et al. Oct 2002 B1
6463393 Giger Oct 2002 B1
6490027 Rajchel et al. Dec 2002 B1
6501543 Hedges et al. Dec 2002 B2
6532060 Kindaichi et al. Mar 2003 B1
6559931 Kawamura et al. May 2003 B2
6563569 Osawa et al. May 2003 B2
6567101 Thomas May 2003 B1
6573883 Bartlett Jun 2003 B1
6573981 Kumagai et al. Jun 2003 B2
6583862 Perger Jun 2003 B1
6587244 Ishinabe et al. Jul 2003 B1
6611617 Crampton Aug 2003 B1
6624916 Green et al. Sep 2003 B1
6630993 Hedges et al. Oct 2003 B1
6633367 Gogolla Oct 2003 B2
6646732 Ohtomo et al. Nov 2003 B2
6650222 Darr Nov 2003 B2
6667798 Markendorf et al. Dec 2003 B1
6668466 Bieg et al. Dec 2003 B1
6678059 Cho et al. Jan 2004 B2
6681031 Cohen et al. Jan 2004 B2
6727984 Becht Apr 2004 B2
6727985 Giger Apr 2004 B2
6754370 Hall-Holt et al. Jun 2004 B1
6765653 Shirai et al. Jul 2004 B2
6802133 Jordil et al. Oct 2004 B2
6847436 Bridges Jan 2005 B2
6859744 Giger Feb 2005 B2
6864966 Giger Mar 2005 B2
6935036 Raab Aug 2005 B2
6957493 Kumagai et al. Oct 2005 B2
6964113 Bridges et al. Nov 2005 B2
6965843 Raab et al. Nov 2005 B2
6980881 Greenwood et al. Dec 2005 B2
6996912 Raab Feb 2006 B2
6996914 Istre et al. Feb 2006 B1
7022971 Ura et al. Apr 2006 B2
7023531 Gogolla et al. Apr 2006 B2
7055253 Kaneko Jun 2006 B2
7072032 Kumagai et al. Jul 2006 B2
7086169 Bayham et al. Aug 2006 B1
7095490 Ohtomo et al. Aug 2006 B2
7099000 Connolly Aug 2006 B2
7129927 Mattsson Oct 2006 B2
7130035 Ohtomo et al. Oct 2006 B2
7168174 Piekutowski Jan 2007 B2
7177014 Mori et al. Feb 2007 B2
7193695 Sugiura Mar 2007 B2
7196776 Ohtomo et al. Mar 2007 B2
7222021 Ootomo et al. May 2007 B2
7224444 Stierle et al. May 2007 B2
7230689 Lau Jun 2007 B2
7233316 Smith et al. Jun 2007 B2
7246030 Raab et al. Jul 2007 B2
7248374 Bridges Jul 2007 B2
7253891 Toker et al. Aug 2007 B2
7256899 Faul et al. Aug 2007 B1
7262863 Schmidt et al. Aug 2007 B2
7274802 Kumagai et al. Sep 2007 B2
7285793 Husted Oct 2007 B2
7286246 Yoshida Oct 2007 B2
7304729 Yasutomi et al. Dec 2007 B2
7307710 Gatsios et al. Dec 2007 B2
7312862 Zumbrunn et al. Dec 2007 B2
7321420 Yasutomi et al. Jan 2008 B2
7325326 Istre et al. Feb 2008 B1
7327446 Cramer et al. Feb 2008 B2
7336346 Aoki et al. Feb 2008 B2
7336375 Faul et al. Feb 2008 B1
7339655 Nakamura et al. Mar 2008 B2
7345748 Sugiura et al. Mar 2008 B2
7352446 Bridges et al. Apr 2008 B2
7353954 Malek et al. Apr 2008 B1
7372558 Kaufman et al. May 2008 B2
7388654 Raab et al. Jun 2008 B2
7388658 Glimm Jun 2008 B2
7401783 Pryor Jul 2008 B2
7423742 Gatsios et al. Sep 2008 B2
7429112 Metcalfe Sep 2008 B2
7446863 Nishita et al. Nov 2008 B2
7453554 Yang et al. Nov 2008 B2
7466401 Cramer et al. Dec 2008 B2
7471377 Liu et al. Dec 2008 B2
7474388 Ohtomo et al. Jan 2009 B2
7480037 Palmateer et al. Jan 2009 B2
7492444 Osada Feb 2009 B2
7503123 Matsu et al. Mar 2009 B2
7511824 Sebastian et al. Mar 2009 B2
7518709 Oishi et al. Apr 2009 B2
7535555 Nishizawa et al. May 2009 B2
7541965 Ouchi et al. Jun 2009 B2
7552539 Piekutowski Jun 2009 B2
7555766 Kondo et al. Jun 2009 B2
7562459 Fourquin et al. Jul 2009 B2
7564538 Sakimura et al. Jul 2009 B2
7565216 Soucy Jul 2009 B2
7583375 Cramer et al. Sep 2009 B2
7586586 Constantikes Sep 2009 B2
7613501 Scherch Nov 2009 B2
7614019 Rimas Ribikauskas et al. Nov 2009 B2
D605959 Apotheloz Dec 2009 S
7634374 Chouinard et al. Dec 2009 B2
7634381 Westermark et al. Dec 2009 B2
7692628 Smith et al. Apr 2010 B2
7701559 Bridges et al. Apr 2010 B2
7701566 Kumagai et al. Apr 2010 B2
7705830 Westerman et al. Apr 2010 B2
7710396 Smith et al. May 2010 B2
7724380 Horita et al. May 2010 B2
7728963 Kirschner Jun 2010 B2
7738083 Luo et al. Jun 2010 B2
7751654 Lipson et al. Jul 2010 B2
7761814 Rimas-Ribikauskas et al. Jul 2010 B2
7785084 Westermark et al. Jul 2010 B1
7782298 Smith et al. Aug 2010 B2
7800758 Bridges et al. Sep 2010 B1
7804051 Hingerling et al. Sep 2010 B2
7804602 Raab Sep 2010 B2
7812736 Collingwood et al. Oct 2010 B2
7812969 Morimoto et al. Oct 2010 B2
D629314 Ogasawara Dec 2010 S
7876457 Rueb Jan 2011 B2
7894079 Altendorf et al. Feb 2011 B1
7903237 Li Mar 2011 B1
7929150 Schweiger Apr 2011 B1
7954250 Crampton Jun 2011 B2
7976387 Venkatesh et al. Jul 2011 B2
7983872 Makino et al. Jul 2011 B2
7990523 Schlierbach et al. Aug 2011 B2
7990550 Aebischer et al. Aug 2011 B2
8087315 Goossen et al. Jan 2012 B2
8094121 Obermeyer et al. Jan 2012 B2
8094212 Jelinek Jan 2012 B2
8125629 Dold et al. Feb 2012 B2
8151477 Tait Apr 2012 B2
8190030 Leclair et al. May 2012 B2
8217893 Quinn et al. Jul 2012 B2
8237934 Cooke et al. Aug 2012 B1
8244023 Yamada Aug 2012 B2
8279430 Dold et al. Oct 2012 B2
8314939 Kato Nov 2012 B2
8320708 Kurzweil et al. Nov 2012 B2
8360240 Kallabis Jan 2013 B2
8379224 Piasse et al. Feb 2013 B1
8384760 Tan Feb 2013 B2
8387961 Im Mar 2013 B2
8405604 Pryor et al. Mar 2013 B2
8422034 Steffensen et al. Apr 2013 B2
8437011 Steffensen et al. May 2013 B2
8438747 Ferrari May 2013 B2
8467071 Steffey et al. Jun 2013 B2
8467072 Cramer et al. Jun 2013 B2
8483512 Moeller Jul 2013 B2
8485668 Zhang Jul 2013 B2
8509949 Bordyn et al. Aug 2013 B2
8525983 Bridges et al. Sep 2013 B2
8537371 Steffensen et al. Sep 2013 B2
8537375 Steffensen et al. Sep 2013 B2
8553212 Jaeger et al. Oct 2013 B2
8593648 Cramer et al. Nov 2013 B2
8619265 Steffey et al. Dec 2013 B2
8630314 York Jan 2014 B2
8638984 Roithmeier Jan 2014 B2
8654354 Steffensen et al. Feb 2014 B2
8659749 Bridges Feb 2014 B2
8670114 Bridges et al. Mar 2014 B2
8681317 Moser et al. Mar 2014 B2
8699756 Jensen Apr 2014 B2
8717545 Sebastian et al. May 2014 B2
8740396 Brown et al. Jun 2014 B2
8772719 Böckem Jul 2014 B2
8773514 Gharib Jul 2014 B2
8773667 Edmonds et al. Jul 2014 B2
8848203 Bridges et al. Sep 2014 B2
8874406 Rotvold et al. Oct 2014 B2
8902408 Bridges Dec 2014 B2
8931183 Jonas Jan 2015 B2
9151830 Bridges Oct 2015 B2
9207309 Bridges Dec 2015 B2
9234742 Bridges et al. Jan 2016 B2
9377885 Bridges et al. Jun 2016 B2
9448059 Bridges et al. Sep 2016 B2
9482514 Bridges Nov 2016 B2
9482529 Becker et al. Nov 2016 B2
9664508 Mcafee et al. May 2017 B2
20010045534 Kimura Nov 2001 A1
20020033940 Hedges et al. Mar 2002 A1
20020093646 Muraoka Jul 2002 A1
20020148133 Bridges et al. Oct 2002 A1
20020179866 Hoeller et al. Dec 2002 A1
20030014212 Ralston et al. Jan 2003 A1
20030020895 Bridges Jan 2003 A1
20030033041 Richey Feb 2003 A1
20030035195 Blech et al. Feb 2003 A1
20030048459 Gooch Mar 2003 A1
20030066202 Eaton Apr 2003 A1
20030090682 Gooch et al. May 2003 A1
20030112449 Tu et al. Jun 2003 A1
20030125901 Steffey et al. Jul 2003 A1
20030133092 Rogers Jul 2003 A1
20030179362 Osawa et al. Sep 2003 A1
20030206285 Lau Nov 2003 A1
20030227616 Bridges Dec 2003 A1
20040035277 Hubbs Feb 2004 A1
20040041996 Abe Mar 2004 A1
20040075823 Lewis et al. Apr 2004 A1
20040100705 Hubbs May 2004 A1
20040170363 Angela Sep 2004 A1
20040189944 Kaufman et al. Sep 2004 A1
20040218104 Smith et al. Nov 2004 A1
20040223139 Vogel Nov 2004 A1
20050058179 Phipps Mar 2005 A1
20050147477 Clark Jul 2005 A1
20050179890 Cramer et al. Aug 2005 A1
20050185182 Raab et al. Aug 2005 A1
20050197145 Chae et al. Sep 2005 A1
20050254043 Chiba Nov 2005 A1
20050284937 Xi et al. Dec 2005 A1
20060009929 Boyette et al. Jan 2006 A1
20060017720 Li Jan 2006 A1
20060053647 Raab et al. Mar 2006 A1
20060055662 Rimas-Ribikauskas et al. Mar 2006 A1
20060055685 Rimas-Ribikauskas et al. Mar 2006 A1
20060066836 Bridges et al. Mar 2006 A1
20060103853 Palmateer May 2006 A1
20060132803 Clair et al. Jun 2006 A1
20060140473 Brooksby et al. Jun 2006 A1
20060141435 Chiang Jun 2006 A1
20060145703 Steinbichler et al. Jul 2006 A1
20060146009 Syrbe et al. Jul 2006 A1
20060161379 Ellenby et al. Jul 2006 A1
20060164384 Smith et al. Jul 2006 A1
20060164385 Smith et al. Jul 2006 A1
20060164386 Smith et al. Jul 2006 A1
20060222237 Du et al. Oct 2006 A1
20060222314 Zumbrunn et al. Oct 2006 A1
20060235611 Deaton Oct 2006 A1
20060262001 Ouchi et al. Nov 2006 A1
20060279246 Hashimoto et al. Dec 2006 A1
20070016386 Hosted Jan 2007 A1
20070019212 Gatsios et al. Jan 2007 A1
20070024842 Nishizawa et al. Feb 2007 A1
20070090309 Hu et al. Apr 2007 A1
20070121095 Lewis May 2007 A1
20070127013 Hertzman et al. Jun 2007 A1
20070130785 Bublitz et al. Jun 2007 A1
20070236452 Venkatesh et al. Oct 2007 A1
20070247615 Bridges et al. Oct 2007 A1
20070285672 Mukai et al. Dec 2007 A1
20080002866 Fujiwara Jan 2008 A1
20080024795 Yamamoto et al. Jan 2008 A1
20080043409 Kallabis Feb 2008 A1
20080107305 Vanderkooy et al. May 2008 A1
20080118143 Gordon et al. May 2008 A1
20080122786 Pryor et al. May 2008 A1
20080201101 Hebert et al. Aug 2008 A1
20080203299 Kozuma et al. Aug 2008 A1
20080229592 Hinderling et al. Sep 2008 A1
20080239281 Bridges Oct 2008 A1
20080246974 Wilson et al. Oct 2008 A1
20080250659 Bellerose et al. Oct 2008 A1
20080297808 Riza et al. Dec 2008 A1
20080302200 Tobey Dec 2008 A1
20080309949 Rueb Dec 2008 A1
20080316497 Taketomi et al. Dec 2008 A1
20080316503 Smarsh et al. Dec 2008 A1
20090000136 Crampton Jan 2009 A1
20090009747 Wolf et al. Jan 2009 A1
20090033621 Quinn et al. Feb 2009 A1
20090046271 Constantikes Feb 2009 A1
20090066932 Bridges et al. Mar 2009 A1
20090078620 Malek et al. Mar 2009 A1
20090109426 Cramer et al. Jun 2009 A1
20090153817 Kawakubo Jun 2009 A1
20090157226 De Smet Jun 2009 A1
20090171618 Kumagai et al. Jul 2009 A1
20090187373 Atwell et al. Jul 2009 A1
20090190125 Foster et al. Jul 2009 A1
20090205088 Crampton et al. Aug 2009 A1
20090213073 Obermeyer et al. Aug 2009 A1
20090239581 Lee Sep 2009 A1
20090240372 Bordyn et al. Sep 2009 A1
20090240461 Makino et al. Sep 2009 A1
20090240462 Lee Sep 2009 A1
20090244277 Nagashima et al. Oct 2009 A1
20090260240 Bernhard Oct 2009 A1
20090284757 Mayer et al. Nov 2009 A1
20100008543 Yamada et al. Jan 2010 A1
20100025746 Chapman et al. Feb 2010 A1
20100046005 Kalkowski Feb 2010 A1
20100058252 Ko Mar 2010 A1
20100074532 Gordon et al. Mar 2010 A1
20100091112 Veeser et al. Apr 2010 A1
20100103431 Demopoulos Apr 2010 A1
20100128259 Bridges et al. May 2010 A1
20100142798 Weston et al. Jun 2010 A1
20100149518 Nordenfelt et al. Jun 2010 A1
20100149525 Lau Jun 2010 A1
20100158361 Grafinger et al. Jun 2010 A1
20100176270 Lau et al. Jul 2010 A1
20100207938 Yau et al. Aug 2010 A1
20100225746 Shpunt et al. Sep 2010 A1
20100234094 Gagner et al. Sep 2010 A1
20100235786 Maizels et al. Sep 2010 A1
20100245851 Teodorescu Sep 2010 A1
20100250175 Briggs et al. Sep 2010 A1
20100250188 Brown Sep 2010 A1
20100251148 Brown Sep 2010 A1
20100265316 Sali et al. Oct 2010 A1
20100277747 Rueb et al. Nov 2010 A1
20100284082 Shpunt et al. Nov 2010 A1
20110001958 Bridges et al. Jan 2011 A1
20110003507 Van Swearingen et al. Jan 2011 A1
20110007154 Vogel et al. Jan 2011 A1
20110013281 Mimura et al. Jan 2011 A1
20110023578 Grasser Feb 2011 A1
20110025827 Shpunt et al. Feb 2011 A1
20110032507 Braunecker et al. Feb 2011 A1
20110032509 Bridges et al. Feb 2011 A1
20110035952 Roithmeier Feb 2011 A1
20110043620 Svanholm et al. Feb 2011 A1
20110043808 Isozaki et al. Feb 2011 A1
20110052006 Gurman et al. Mar 2011 A1
20110069322 Hoffer, Jr. Mar 2011 A1
20110107611 Desforges et al. May 2011 A1
20110107612 Ferrari et al. May 2011 A1
20110107613 Tait May 2011 A1
20110107614 Champ May 2011 A1
20110109502 Sullivan May 2011 A1
20110112786 Desforges et al. May 2011 A1
20110123097 Van Coppenolle et al. May 2011 A1
20110128625 Larsen et al. Jun 2011 A1
20110166824 Haisty et al. Jul 2011 A1
20110169924 Haisty et al. Jul 2011 A1
20110170534 York Jul 2011 A1
20110173827 Bailey et al. Jul 2011 A1
20110175745 Atwell et al. Jul 2011 A1
20110176145 Edmonds et al. Jul 2011 A1
20110179281 Chevallier-Mames et al. Jul 2011 A1
20110181872 Dold et al. Jul 2011 A1
20110260033 Steffensen et al. Oct 2011 A1
20110282622 Canter Nov 2011 A1
20110288684 Farlow et al. Nov 2011 A1
20110301902 Panagas et al. Dec 2011 A1
20110316978 Dillon et al. Dec 2011 A1
20120050255 Thomas et al. Mar 2012 A1
20120062706 Keshavmurthy et al. Mar 2012 A1
20120065928 Rotvold et al. Mar 2012 A1
20120124850 Ortleb et al. Mar 2012 A1
20120099117 Hanchett et al. Apr 2012 A1
20120105821 Moser et al. May 2012 A1
20120120391 Dold et al. May 2012 A1
20120120415 Steffensen et al. May 2012 A1
20120154577 Yoshikawa Jun 2012 A1
20120188559 Becker et al. Jul 2012 A1
20120194644 Newcombe et al. Aug 2012 A1
20120206716 Cramer et al. Aug 2012 A1
20120206808 Brown et al. Aug 2012 A1
20120218563 Spruck et al. Aug 2012 A1
20120236320 Steffey et al. Sep 2012 A1
20120262550 Bridges Oct 2012 A1
20120262573 Bridges et al. Oct 2012 A1
20120262728 Bridges et al. Oct 2012 A1
20120265479 Bridges et al. Oct 2012 A1
20120317826 Jonas Dec 2012 A1
20130037694 Steffensen et al. Feb 2013 A1
20130096873 Rosengaus et al. Apr 2013 A1
20130100282 Siercks et al. Apr 2013 A1
20130128284 Steffey et al. May 2013 A1
20130155386 Bridges et al. Jun 2013 A1
20130162469 Zogg et al. Jun 2013 A1
20130197852 Grau et al. Aug 2013 A1
20130201470 Cramer et al. Aug 2013 A1
20130293684 Becker et al. Nov 2013 A1
20140002806 Buchel et al. Jan 2014 A1
20140028805 Tohme et al. Jan 2014 A1
20140320643 Markendorf Oct 2014 A1
20140327920 Bridges et al. Nov 2014 A1
20150049329 Bridges et al. Feb 2015 A1
20150331159 Bridges et al. Nov 2015 A1
20150365653 Yazid Dec 2015 A1
20150373321 Bridges Dec 2015 A1
20160178348 Nagalla et al. Jun 2016 A1
20160370171 Bridges Dec 2016 A1
20160377410 Becker et al. Dec 2016 A1
20170176169 Nagalla et al. Jun 2017 A1
20180120089 Nagalla May 2018 A1
Foreign Referenced Citations (196)
Number Date Country
501507 Sep 2006 AT
506110 Jun 2009 AT
2811444 Mar 2012 CA
589856 Jul 1977 CH
1263807 Aug 2000 CN
1290850 Apr 2001 CN
1362692 Aug 2002 CN
1474159 Feb 2004 CN
1531659 Sep 2004 CN
1608212 Apr 2005 CN
1846148 Oct 2006 CN
1926400 Mar 2007 CN
101031817 Sep 2007 CN
101203730 Jun 2008 CN
101297176 Oct 2008 CN
101371160 Feb 2009 CN
101427155 May 2009 CN
101556137 Oct 2009 CN
101750012 Jun 2010 CN
101776982 Jul 2010 CN
101806574 Aug 2010 CN
201548192 Aug 2010 CN
7704949 Jun 1977 DE
3530922 Apr 1986 DE
3827458 Feb 1990 DE
10022054 Nov 2001 DE
10160090 Jul 2002 DE
202004004945 Oct 2004 DE
102004024171 Sep 2005 DE
102005019058 Dec 2005 DE
102004052199 Apr 2006 DE
102006013185 Sep 2007 DE
102006049695 Apr 2008 DE
202006020299 May 2008 DE
60319016 Apr 2009 DE
202008013217 May 2009 DE
102007058692 Jun 2009 DE
102009035336 Nov 2010 DE
102009040837 Mar 2011 DE
112009001652 Jan 2012 DE
0166106 Jan 1986 EP
598523 May 1994 EP
0598523 May 1994 EP
0797073 Sep 1997 EP
0919831 Jun 1999 EP
0957336 Nov 1999 EP
1067363 Jan 2001 EP
1211481 Jun 2002 EP
1519141 Mar 2005 EP
1607767 Dec 2005 EP
1659417 May 2006 EP
1681533 Jul 2006 EP
1710602 Oct 2006 EP
1710602 Oct 2006 EP
2071283 Jun 2009 EP
2177868 Oct 2009 EP
2136178 Dec 2009 EP
2219011 Aug 2010 EP
2259010 Dec 2010 EP
2259013 Dec 2010 EP
2275775 Jan 2011 EP
2322901 May 2011 EP
2400379 Dec 2011 EP
2446300 May 2012 EP
1543636 Apr 1979 GB
2503179 Dec 2013 GB
2503390 Dec 2013 GB
2516528 Jan 2015 GB
2518544 Mar 2015 GB
2518769 Apr 2015 GB
2518998 Apr 2015 GB
57147800 Sep 1982 JP
5804881 Mar 1983 JP
S5848881 Mar 1983 JP
S6097288 May 1985 JP
2184788 Jul 1990 JP
H0331715 Feb 1991 JP
H0371116 Mar 1991 JP
H0465631 Mar 1992 JP
H05257005 Oct 1993 JP
H05302976 Nov 1993 JP
6097288 Apr 1994 JP
H06214186 Aug 1994 JP
H06229715 Aug 1994 JP
H0665818 Sep 1994 JP
H06241779 Sep 1994 JP
H06265355 Sep 1994 JP
H074967 Jan 1995 JP
H07190772 Jul 1995 JP
H08145679 Jun 1996 JP
H0914965 Jan 1997 JP
H09113223 May 1997 JP
H102722 Jan 1998 JP
H10107357 Apr 1998 JP
H10317874 Dec 1998 JP
11502629 Mar 1999 JP
H11304465 Nov 1999 JP
H11513495 Nov 1999 JP
11337642 Dec 1999 JP
2000503476 Mar 2000 JP
2000275042 Oct 2000 JP
2000346645 Dec 2000 JP
2001013247 Jan 2001 JP
2001033250 Feb 2001 JP
2001165662 Jun 2001 JP
2001513204 Aug 2001 JP
2001272468 Oct 2001 JP
2001284317 Oct 2001 JP
2001353112 Dec 2001 JP
2002089184 Mar 2002 JP
2002098762 Apr 2002 JP
2002139310 May 2002 JP
2002209361 Jul 2002 JP
2003506691 Feb 2003 JP
2004508954 Mar 2004 JP
2004527751 Sep 2004 JP
2005010585 Jan 2005 JP
3109969 Jun 2005 JP
2005265700 Sep 2005 JP
2006003127 Jan 2006 JP
2006058091 Mar 2006 JP
2006084460 Mar 2006 JP
2006220514 Aug 2006 JP
2006276012 Oct 2006 JP
2006526844 Nov 2006 JP
2007504459 Mar 2007 JP
2007165331 Jun 2007 JP
2007523357 Aug 2007 JP
2007256872 Oct 2007 JP
2008027308 Feb 2008 JP
2004108939 Apr 2008 JP
2008514967 May 2008 JP
2008536146 Sep 2008 JP
2008544215 Dec 2008 JP
2009014639 Jan 2009 JP
2009134761 Jun 2009 JP
2009523236 Jun 2009 JP
2009229350 Oct 2009 JP
2010169633 Aug 2010 JP
2011158371 Aug 2011 JP
2011526706 Oct 2011 JP
2013525787 Oct 2011 JP
H04504468 Oct 2011 JP
2012063352 Mar 2012 JP
2012509464 Apr 2012 JP
2012215496 Nov 2012 JP
2012225869 Nov 2012 JP
2012230097 Nov 2012 JP
2012530909 Dec 2012 JP
5302976 Oct 2013 JP
1020090078620 Jul 2009 KR
381361 Feb 2000 TW
9012284 Oct 1990 WO
9534849 Dec 1995 WO
0109642 Feb 2001 WO
0177613 Oct 2001 WO
0223121 Mar 2002 WO
0237466 May 2002 WO
02084327 Oct 2002 WO
03062744 Jul 2003 WO
2003062744 Jul 2003 WO
03073121 Sep 2003 WO
2004063668 Jul 2004 WO
2005026772 Mar 2005 WO
2006039682 Apr 2006 WO
2006052259 May 2006 WO
2006055770 May 2006 WO
2006133799 Dec 2006 WO
2007079601 Jul 2007 WO
2007123604 Nov 2007 WO
2007124010 Nov 2007 WO
2008052348 May 2008 WO
2008119073 Oct 2008 WO
2008121919 Oct 2008 WO
2009106141 Sep 2009 WO
2010057169 May 2010 WO
2010100043 Sep 2010 WO
2010107434 Sep 2010 WO
2010141120 Dec 2010 WO
2010148525 Dec 2010 WO
2010148526 Dec 2010 WO
2011057130 May 2011 WO
2011107729 Sep 2011 WO
2011112277 Sep 2011 WO
2011133731 Oct 2011 WO
2011134083 Nov 2011 WO
2011160962 Dec 2011 WO
2012142074 Oct 2012 WO
2010148526 Dec 2012 WO
2012057283 May 2014 WO
2014143644 Sep 2014 WO
2014149701 Sep 2014 WO
2014149704 Sep 2014 WO
2014149705 Sep 2014 WO
2014149706 Sep 2014 WO
2014149702 Sep 2014 WO
Non-Patent Literature Citations (97)
Entry
https://en.wikipedia.org/wiki/Multipath_interference (accessed Nov. 29, 2017); 2 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2014/020480 dated Sep. 15, 2015; Mailed Sep. 24, 2015; 8 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2014/020485 dated Sep. 15, 2015; Mailed Sep. 24, 2015; 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2014/020487 dated Sep. 15, 2015; Mailed Sep. 24, 2015; 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2014/020488 dated Sep. 15, 2015; Mailed Sep. 24, 2015; 7 pages.
International Preliminary Report on Patentability for International Application No. PCT/US2014/027035 dated Sep. 15, 2015; Mailed Sep. 24, 2015; 7 pages.
Office Action for Japanese Patent Application No. 2016-500622 dated Mar. 15, 2016; 3 pages.
International Preliminary Report on Patentability of the International Searching Authority for PCT No. PCT/US2012/024629; dated Aug. 21, 2013.
International Search Report of the International Searching Authority for PCT No. PCT/US2012/024629; dated May 15, 2012.
Kester, Walt, Practical Analog Design Techniques, Analog Devices, Section 5, Undersampling Applications, Copyright 1995, pp. 5-1 to 5-34.
Non Final Office Action for U.S. Appl. No. 13/832,658, dated Oct. 2, 2014, 92 pages.
Written Opinion of the International Searching Authority for PCT No. PCT/US2012/024629; dated May 15, 2012.
Written Opinion of the International Searching Authority for PCT/US2012/032715; dated Jul. 5, 2012.
Notice of Allowance for U.S. Appl. No. 14/199,211 dated Aug. 21, 2014, pp. 1-42.
Stone, et al. “Automated Part Tracking on the Construction Job Site” 8 pp; XP 55055816A; National Institute of Standards and Technology.
Parker, et al., "Instrument for Setting Radio Telescope Surfaces" (4 pp) XP 55055817A.
International Search Report dated Mar. 19, 2013 for International Application Serial No. PCT/US2013/022515; International filing date Jan. 22, 2013. All references cited incorporated herein.
Cuypers, et al “Optical Measurement Techniques for Mobile and Large-Scale Dimensional Metrology” (2009); Optics and Lasers in Engineering pp. 292-300; vol. 47; Elsevier Ltd. XP 25914917A.
Written Opinion of the International Searching Authority dated Mar. 19, 2013 for International Application Serial No. PCT/US2013/022515; International filing date Jan. 22, 2013. All references cited incorporated herein.
“DLP-Based Structured Light 3D Imaging Technologies and Applications” by J. Geng; Proceedings of SPIE, vol. 7932. Published Feb. 11, 2011, 15 pages.
Matsumaru, K., "Mobile Robot with Preliminary-Announcement and Display Function of Forthcoming Motion Using Projection Equipment," Robot and Human Interactive Communication, 2006. RO-MAN06. The 15th IEEE International Symposium, pp. 443-450, Sep. 6-8.
Brenneke et al: "Using 3D laser range data for SLAM in outdoor environments." Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems. Las Vegas, NV, Oct. 27-31, 2003; IEEE US, vol. 1, Oct. 27, 2003, pp. 188-193.
Dipl. Ing. Karl Zeiske; "Vermessen Leicht Gemacht" ("Surveying Made Easy"); Leica Geosystems AG, Heerbrugg, Switzerland, 2000; pp. 1-39; www.leica-geosystems.com; English translation attached.
Gebre, et al. “Remotely Operated and Autonomous Mapping System (ROAMS).” Technologies for Practical Robot Applications, 2009. Tepra 2009. IEEE International Conference on IEEE, Piscataway, NJ, USA. Nov. 9, 2009, pp. 173-178.
Granstrom, Karl, et al: "Learning to Close the Loop from 3-D Point Clouds." 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Piscataway, NJ, Oct. 18, 2010, pp. 1-7.
Hebert, P., "A self-referenced hand-held range sensor", 3-D Digital Imaging and Modeling, 2001, Proceedings, Third Annual International Conference on May 28-Jun. 1, 2001, Piscataway, NJ, USA, IEEE, May 28, 2001, pp. 5-12.
Henry, P., et al: "RGB-D Mapping: Using Kinect-style Depth Cameras for Dense 3-D Modeling of Indoor Environments." The International Journal of Robotics Research, vol. 31, No. 5, Feb. 10, 2012, pp. 647-663.
Lee, Wonwoo, et al.: "Panoramic Mesh Model Generation From Multiple Range Data for Indoor Scene Reconstruction." Advances in Multimedia Information Processing, PCM Lecture Notes in Computer Science, Jan. 1, 2005, Berlin, DE, pp. 1004-1014.
May, S., et al; "Robust 3-D Mapping with time-of-flight cameras." Intelligent Robots and Systems, IROS 2009. IEEE/RSJ International Conference. Piscataway, NJ, Oct. 10, 2009, pp. 1-6.
Surmann et al. "An autonomous mobile robot with a 3D laser range finder for 3D exploration and digitalization of indoor environments." Robotics and Autonomous Systems, vol. 45, No. 3-4, Dec. 31, 2003, pp. 181-198. Amsterdam, Netherlands.
Weise, Thibaut, et al.: "Online Loop Closure for real-time interactive 3-D scanning." Computer Vision and Image Understanding, vol. 115, No. 5, May 1, 2011, pp. 635-648.
Newport Company “Fiber Optic Scribes” https://web.archive.org/web/20120903063012/http://www.newport.com/Fiber-Optic-Scribes/835171/1033/info.aspx; 2012, 2 pages.
Newport Corporation “Projects in Fiber Optics: Applications Handbook”, 1986; 3 pages.
Non Final Office Action for U.S. Appl. No. 14/139,143 dated Jan. 6, 2016; 142 pages.
Office Action for Application No. 112014001459.1 dated Feb. 11, 2016; 6 pages.
Office Action for Application No. GB1513550.2 dated Sep. 30, 2015; 4 pages.
Takeuchi et al., "Ultraprecision 3D Micromachining of Glass"; Annals of the CIRP; Jan. 4, 1996; vol. 45; pp. 401-404.
Thorlabs "Ruby Dualscribe Fiber Optic Scribe" Mechanical Drawing, 2014, 1 page.
“Fiber Optic Rotary Joints Product Guide”; Moog Inc; MS1071, rev. 2; p. 1-4; 2010; Retrieved on Nov. 13, 2013 from http://www.moog.com/literature/ICD/Moog-Fiber-Optic-Rotary-Joint_Catalog-en.pdf.
“Technical Brief: Fiber Optic Rotary Joint”; Document No. 303; Moog Inc; p. 1-6; 2008; Retrieved on Nov. 13, 2013 from http://www.moog.com/literature/MCG/FORJtechbrief.pdf.
Katowski "Optical 3-D Measurement Techniques—Applications in inspection, quality control and robotics" Vienna, Austria, Sep. 18-20, 1989.
Rahman, et al., “Spatial-Geometric Approach to Physical Mobile Interaction Based on Accelerometer and IR Sensory Data Fusion”, ACM Transactions on Multimedia Computing, Communications and Applications, vol. 6, No. 4, Article 28, Publication date: Nov. 2010.
International Search Report of the International Application No. PCT/US2013/049562 dated Jun. 20, 2014.
International Search Report of the International Application No. PCT/US2014/020481 dated Jun. 20, 2014.
International Search Report of the International Application No. PCT/US2014/020487 dated Jun. 20, 2014.
International Search Report of the International Application No. PCT/US2014/020488 dated Jun. 23, 2014.
International Search Report of the International Application No. PCT/US2014/027035 dated Jun. 20, 2014.
Hanwei Xiong et al: "The Development of Optical Fringe Measurement System Integrated with a CMM for Products Inspection." Proceedings of SPIE, vol. 7855, Nov. 3, 2010, pp. 78551W-1-78551W-8, XP055118356. ISSN: 0277-786X.
Sladek, J., et al: "The Hybrid Contact-Optical Coordinate Measuring System." Measurement, vol. 44, No. 3, Mar. 1, 2011, pp. 503-510, XP055047404, ISSN: 0263-2241.
Written Opinion of the International Searching Authority for International Application No. PCT/US2014/020480 dated Jun. 20, 2014.
Written Opinion of the International Searching Authority for International Application No. PCT/US2014/020481 dated Jun. 20, 2014.
Written Opinion of the International Searching Authority for International Application No. PCT/US2014/020485 dated Jun. 20, 2014.
Written Opinion of the International Searching Authority for International Application No. PCT/US2014/020487 dated Jun. 20, 2014.
Written Opinion of the International Searching Authority for International Application No. PCT/US2014/020488 dated Jun. 23, 2014.
Written Opinion of the International Searching Authority for International Application No. PCT/US2014/027035 dated Jun. 20, 2014.
2×2 High Speed Lithium Niobate Interferometric Switch; [on-line]; JDS Uniphase Corporation; 2007; Retrieved from www.jdsu.com.
AO Modulator—M040-8J-FxS; [online—technical data sheet]; Gooch & Housego; Nov. 2006; Retrieved from http://www.goochandhousego.com/.
Automated Precision, Inc., Product Specifications, Radian, Featuring INNOVO Technology, info@apisensor.com, Copyright 2011.
Cao, et al., "VisionWand: Interaction Techniques for Large Displays using a Passive Wand Tracked in 3D", Proceedings of the 16th Annual ACM Symposium on User Interface Software and Technology, UIST, vol. 5, Issue 2, pp. 173-182, (Jan. 2003).
Chen, Junewen, “Novel Laser Range Finding Algorithms”, Proceedings of SPIE, vol. 6100, Jan. 1, 2006, pp. 61001Q-61001Q-8, XP55031002, ISSN: 0277-786X, DOI: 10.1117/12.645131, the whole document.
Computer Giants Embrace On-Chip Optics; Mar. 27, 2008, [on-line]; Optics.org; [Retrieved on Apr. 2, 2008]; Retrieved from http://optics.org/cws/article/research/33521.
EOSpace—High-Speed Switches; [on-line technical brochure]; [Retrieved May 18, 2009]; Retrieved from http://www.cospace.com/Switches.htm.
FARO Laser Tracker ION; 2 pages; revised Apr. 23, 2010; FARO Technologies, Inc., www.lasertracker.faro.com.
FARO Technical Institute, Basic Measurement Training Workbook, Version 1.0, FARO Laser Tracker, Jan. 2008, Student's Book, FARO CAM2 Measure.
Hecht, Jeff, Photonic Frontiers: Gesture Recognition: Lasers Bring Gesture Recognition to the Home, Laser Focus World, pp. 1-5, [Retrieved On-Line Mar. 3, 2011], http://www.optolq.com/optoiq-2/en-us/index/photonics-technologiesapplications/lfw-display/lfw.
Hui E et al: “Single-Step Assembly of Complex 3-D microstructures.” Jan. 23, 2000, pp. 602-607, XP010377196.
Integrated Optical Amplitude Modulator; [on-line technical data sheet]; [Retrieved Oct. 14, 2010]; Jenoptik; Retrieved from http://www.jenoptik.com/cms/products.nsf/0/A6DF20B50AEE7819C12576FE0074E8E6/$File/amplitudemodulators_en.pdf?Open.
Turk, et al., "Perceptual Interfaces", UCSB Technical Report 2003-33, pp. 1-43 [Retrieved Aug. 11, 2011, http://www.cs.ucsb.edu/research/tech_reports/reports/2003-33.pdf] (2003).
Kollorz, et al., "Gesture recognition with a time-of-flight camera", International Journal of Intelligent Systems Technologies and Applications, vol. 5, No. 3/4, pp. 334-343, [Retrieved Aug. 11, 2011; http://www5.informatik.unierlangen.
LaserTRACER - measuring sub-micron in space; http://www.etalon-ag.com/index.php/en/products/lasertracer; 4 pages; Jun. 28, 2011; ETALON AG.
Leica Absolute Tracker AT401-ASME B89.4.19-2006 Specifications; Hexagon Metrology; Leica Geosystems Metrology Products, Switzerland; 2 pages; www.leica-geosystems.com/metrology.
Leica Geosystems AG ED—“Leica Laser Tracker System”, Internet Citation, Jun. 28, 2012 (Jun. 28, 2012), XP002678836, Retrieved from the Internet: URL:http://www.a-solution.com.au/pages/downloads/LTD500_Brochure_EN.pdf.
Leica Geosystems Metrology, “Leica Absolute Tracker AT401, White Paper,” Hexagon AB; 2010.
Leica Geosystems: “TPS1100 Professional Series”, 1999, Retrieved from the Internet: URL:http://www.estig.ipbeja.pt/˜legvm/top_civil/TPS1100%20-%20A%20New%20Generation%20of%20Total%20Stations.pdf, [Retrieved on Jul. 2012] the whole document.
Leica Laser Tracker System, Leica Geosystems AG, Jan. 1, 1999, XP002678836, Retrieved from the Internet: URL:http://wwm.a-solution.com.au/pages/downloads/LTD500_Brochure_EN.Pdf [retrieved on 2012] the whole document.
Li, et al., "Real Time Hand Gesture Recognition using a Range Camera", Australasian Conference on Robotics and Automation (ACRA), [Retrieved Aug. 10, 2011, http://www.araa.asn.au/acra/acra2009/papers/pap128s1.pdf] pp. 1-7 (2009).
Lightvision—High Speed Variable Optical Attenuators (VOA); [on-line]; A publication of Lightwaves 2020, Feb. 1, 2008; Retrieved from http://www.lightwaves2020.com/home/.
Maekynen, A. J. et al., Tracking Laser Radar for 3-D Shape Measurements of Large Industrial Objects Based on Time-of-Flight Laser Rangefinding and Position-Sensitive Detection Techniques, IEEE Transactions on Instrumentation and Measurement, vol. 43, No. 1, pp. 40-49, (Feb. 1994).
Making the Big Step from Electronics to Photonics by Modulating a Beam of Light with Electricity; May 18, 2005; [on-line]; [Retrieved May 7, 2009]; Cornell University News Service; Retrieved from http://www.news.cornell.edu/stories/May05/LipsonElectroOptica.
MEMS Variable Optical Attenuators Single/Multi-Channel; [on-line]; Jan. 17, 2005; Retrieved from www.ozoptics.com.
Nanona High Speed & Low Loss Optical Switch; [on-line technical data sheet]; [Retrieved Oct. 14, 2010]; Retrieved from http://www.bostonati.com/products/PI-FOS.pdf.
New River Kinematics, SA Arm—The Ultimate Measurement Software for Arms, Software Release! SA Sep. 30, 2010, [On-line]; http://www.kinematics.com/news/software-release-sa20100930.html (1 of 14), [Retrieved Apr. 13, 2011 11:40:47 AM].
Super-Nyquist Operation of the AD9912 Yields a High RF Output Signal; Analog Devices, Inc., AN-939 Application Note; www.analog.com; Copyright 2007.
Tracker3; Ultra-Portable Laser Tracking System; 4 pages; 2010 Automated Precision Inc.; www.apisensor.com.
Optical Circulator (3-Ports & 4-Ports); [on-line technical data sheet]; Alliance Fiber Optic Products, Inc. REV.D Jan. 15, 2004; Retrieved from www.afop.com.
Optical Circulators Improve Bidirectional Fiber Systems; By Jay S. Van Delden; [online]; [Retrieved May 18, 2009]; Laser Focus World; Retrieved from http://www.laserfocusworld.com/display_article/28411/12/nonc/nonc/News/Opticalcirculators-improve-bidirecti.
Ou-Yang, Mang, et al., “High-Dynamic-Range Laser Range Finders Based on a Novel Multimodulated Frequency Method”, Optical Engineering, vol. 45, No. 12, Jan. 1, 2006, p. 123603, XP55031001, ISSN: 0091-3286, DOI: 10.1117/1.2402517, the whole document.
PCMM System Specifications Leica Absolute Tracker and Leica T-Products; Hexagon Metrology; Leica Geosystems Metrology Products, Switzerland; 8 pages; www.leica-geosystems.com/metrology.
Poujouly, Stephane, et al., "A Twofold Modulation Frequency Laser Range Finder", Journal of Optics A: Pure and Applied Optics, Institute of Physics Publishing, Bristol, GB, vol. 4, No. 6, Nov. 1, 2002.
Poujouly, Stephane, et al., Digital Laser Range Finder: Phase-Shift Estimation by Undersampling Technique; IEEE, Copyright 1999.
RS Series Remote Controlled Optical Switch; [on-line technical data sheet]; Sercalo Microtechnology, Ltd. [Retrieved Oct. 14, 2010]; Retrieved from http://www.sercalo.com/document/PDFs/DataSheels/RS%20datasheet.pdf.
Burge, James H., et al, Use of a commercial laser tracker for optical alignment, Proc. of SPIE, vol. 6676, Sep. 21, 2007, pp. 66760E-1-66760E-12.
Chen, Jihua, et al, Research on the Principle of 5/6-DOF Laser Tracking Metrology, Journal of Astronautic Metrology and Measurement vol. 27, No. 3, May 31, 2007, pp. 58-62.
Non Final Office Action for U.S. Appl. No. 13/932,267 dated Aug. 13, 2015; 105 pages.
Tohme, et al., “Diagnosing Multipath Interference and Eliminating Multipath Interference in 3D Scanners Using Automated Repositioning” U.S. Appl. No. 14/139,143, filed Dec. 23, 2013.
Japanese Office Action for Application No. 2016-500623 dated Oct. 3, 2017; 4 pages.
Japanese Office Action for Application No. 2016-500625 dated Oct. 3, 2017; 3 pages.
Related Publications (1)
Number Date Country
20160364874 A1 Dec 2016 US
Provisional Applications (3)
Number Date Country
61791797 Mar 2013 US
61592049 Jan 2012 US
61475703 Apr 2011 US
Continuations (1)
Number Date Country
Parent 14139143 Dec 2013 US
Child 15246619 US
Continuation in Parts (1)
Number Date Country
Parent 13443946 Apr 2012 US
Child 14139143 US