Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations

Information

  • Patent Grant
  • Patent Number
    10,060,722
  • Date Filed
    Wednesday, October 26, 2016
  • Date Issued
    Tuesday, August 28, 2018
Abstract
A portable articulated arm coordinate measuring machine includes a noncontact 3D measuring device that has a projector configured to emit a first pattern of light onto an object, a scanner camera arranged to receive the first pattern of light reflected from the surface of the object, an edge-detecting camera arranged to receive light reflected from an edge feature of the object, and a processor configured to determine first 3D coordinates of an edge point of the edge feature based on electrical signals received from the scanner camera and the edge-detecting camera.
Description
BACKGROUND

The present disclosure relates to a coordinate measuring machine, and more particularly to a portable articulated arm coordinate measuring machine having a connector on a probe end of the coordinate measuring machine that allows accessory devices which use structured light for non-contact three dimensional measurement to be removably connected to the coordinate measuring machine.


Portable articulated arm coordinate measuring machines (AACMMs) have found widespread use in the manufacturing or production of parts where there is a need to rapidly and accurately verify the dimensions of the part during various stages of the manufacturing or production (e.g., machining) of the part. Portable AACMMs represent a vast improvement over known stationary or fixed, cost-intensive and relatively difficult to use measurement installations, particularly in the amount of time it takes to perform dimensional measurements of relatively complex parts. Typically, a user of a portable AACMM simply guides a probe along the surface of the part or object to be measured. The measurement data are then recorded and provided to the user. In some cases, the data are provided to the user in visual form, for example, three-dimensional (3-D) form on a computer screen. In other cases, the data are provided to the user in numeric form, for example when measuring the diameter of a hole, the text “Diameter=1.0034” is displayed on a computer screen.


An example of a prior art portable articulated arm CMM is disclosed in commonly assigned U.S. Pat. No. 5,402,582 ('582), which is incorporated herein by reference in its entirety. The '582 patent discloses a 3-D measuring system comprised of a manually-operated articulated arm CMM having a support base on one end and a measurement probe at the other end. Commonly assigned U.S. Pat. No. 5,611,147 ('147), which is incorporated herein by reference in its entirety, discloses a similar articulated arm CMM. In the '147 patent, the articulated arm CMM includes a number of features including an additional rotational axis at the probe end, thereby providing for an arm with either a two-two-two or a two-two-three axis configuration (the latter case being a seven axis arm).


Three-dimensional surfaces may be measured using non-contact techniques as well. One type of non-contact device, sometimes referred to as a laser line probe or laser line scanner, emits laser light either at a spot or along a line. An imaging device, such as a charge-coupled device (CCD) for example, is positioned adjacent the laser. The laser is arranged to emit a line of light which is reflected off the surface. The surface of the object being measured causes a diffuse reflection which is captured by the imaging device. The image of the reflected line on the sensor will change as the distance between the sensor and the surface changes. By knowing the relationship between the imaging sensor and the laser and the position of the laser image on the sensor, triangulation methods may be used to measure three-dimensional coordinates of points on the surface. One issue that arises with laser line probes is that the density of measured points may vary with the speed at which the laser line probe is moved across the surface of the object: the faster the laser line probe is moved, the greater the distance between points and the lower the point density. With a structured light scanner, the point spacing is typically uniform in each of the two dimensions, thereby generally providing uniform measurement of workpiece surface points. A further issue that arises in obtaining 3D representations from scan data is that there is often a fuzzy region around edges or holes.


While existing CMMs are suitable for their intended purposes, what is needed is a portable AACMM that has certain features of embodiments of the present invention.


SUMMARY OF THE INVENTION

In accordance with one embodiment of the invention, a portable articulated arm coordinate measuring machine (AACMM) is provided for measuring three-dimensional (3D) coordinates of an object in space; the AACMM includes a processor, a noncontact 3D measuring device and an edge-detecting camera. The noncontact 3D measuring device is operably coupled to the processor and has a projector and a scanner camera, the projector configured to emit a first pattern of light onto the object, the scanner camera arranged to receive the first pattern of light reflected from the object and to send a first electrical signal to the processor in response. The edge-detecting camera is operably coupled to the processor, the edge-detecting camera being either the scanner camera or a second camera different than the scanner camera, the edge-detecting camera positioned to receive during operation a second light reflected from an edge feature of the object and to send a second electrical signal to the processor in response. The processor is responsive to non-transitory computer readable instructions, the computer readable instructions comprising: determining first 3D coordinates of first points on a surface of the object based at least in part on the first pattern of light from the projector and the first electrical signal; determining a first ray from the edge-detecting camera to the object, the first ray based at least in part on the second electrical signal; and determining second 3D coordinates of an edge point of the edge feature, the second 3D coordinates based at least in part on an intersection of the first ray with the first 3D coordinates of the surface.


In accordance with another embodiment of the invention, a method is provided for measuring an edge point with a portable articulated arm coordinate measuring machine (AACMM). The method includes providing the AACMM, the AACMM including a processor, a noncontact 3D measuring device operably coupled to the processor, the noncontact 3D measuring device having a projector and a scanner camera, the AACMM further including an edge-detecting camera operably coupled to the processor, the edge-detecting camera being either the scanner camera or a second camera different than the scanner camera. A first pattern of light is projected by the projector onto the object. The scanner camera receives the first pattern of light reflected from the object and sends a first electrical signal to the processor in response. The edge-detecting camera receives a second light reflected from an edge feature of the object and sends a second electrical signal to the processor in response, the edge feature having an edge point, the edge point being a point on the edge feature. First 3D coordinates of first points on a surface of the object are determined by the processor, the first 3D coordinates based at least in part on the first pattern of light from the projector and the first electrical signal. A first ray is determined from the edge-detecting camera to the object, the first ray based at least in part on the second electrical signal. Second 3D coordinates of the edge point are determined with the processor based at least in part on an intersection of the first ray with the first 3D coordinates of the surface. The second 3D coordinates of the edge point are stored.
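
For illustration only (no code appears in the patent itself), the edge-point computation summarized above may be sketched as follows. The pinhole-camera model, the local plane fit standing in for the first 3D coordinates of the surface, and all function names are assumptions made for this sketch.

```python
import numpy as np

def edge_ray(pixel_uv, K, R, t):
    """First ray: from the edge-detecting camera's perspective center through
    an edge pixel, expressed in the arm (world) frame.
    K: 3x3 intrinsics; R, t: camera-to-world rotation and translation (assumed known)."""
    d_cam = np.linalg.solve(K, np.array([pixel_uv[0], pixel_uv[1], 1.0]))
    d_world = R @ d_cam
    return t, d_world / np.linalg.norm(d_world)

def local_plane(points):
    """Least-squares plane through scanned surface points near the edge."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]          # normal = direction of least variance

def edge_point(origin, direction, nearby_surface_points):
    """Second 3D coordinates: intersection of the first ray with the surface,
    here approximated by a plane fit to the first 3D coordinates."""
    centroid, normal = local_plane(nearby_surface_points)
    s = np.dot(normal, centroid - origin) / np.dot(normal, direction)
    return origin + s * direction
```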





BRIEF DESCRIPTION OF THE DRAWINGS

Referring now to the drawings, exemplary embodiments are shown which should not be construed to be limiting regarding the entire scope of the disclosure, and wherein the elements are numbered alike in several FIGURES:



FIGS. 1A and 1B are perspective views of a portable articulated arm coordinate measuring machine (AACMM) having embodiments of various aspects of the present invention therewithin;



FIGS. 2A-2D taken together, are a block diagram of electronics utilized as part of the AACMM of FIG. 1A in accordance with an embodiment;



FIGS. 3A and 3B taken together, are a block diagram describing detailed features of the electronic data processing system of FIG. 2A in accordance with an embodiment;



FIG. 4 is an isometric view of the probe end of the AACMM of FIG. 1A;



FIG. 5 is a side view of the probe end of FIG. 4 with the handle being coupled thereto;



FIG. 6 is a side view of the probe end of FIG. 4 with the handle attached;



FIG. 7 is an enlarged partial side view of the interface portion of the probe end of FIG. 6;



FIG. 8 is another enlarged partial side view of the interface portion of the probe end of FIG. 5;



FIG. 9 is an isometric view partially in section of the handle of FIG. 4;



FIG. 10 is an isometric view of the probe end of the AACMM of FIG. 1A with a structured light device having a single camera attached;



FIG. 11 is an isometric view partially in section of the device of FIG. 10;



FIG. 12 is an isometric view of the probe end of the AACMM of FIG. 1A with another structured light device having dual cameras attached;



FIG. 13A and FIG. 13B are schematic views illustrating the operation of the device of FIG. 10 when attached to the probe end of the AACMM of FIG. 1A;



FIGS. 14A-17C are sequential projections having an uncoded binary pattern that may be emitted by the structured light device of FIG. 10 or FIG. 12, in accordance with an embodiment of the present invention;



FIGS. 18-19 are spatially varying color coded patterns that may be emitted by the structured light device of FIG. 10 or FIG. 12, in accordance with an embodiment of the invention;



FIGS. 20-23 are strip index coded patterns that may be emitted by the structured light device of FIG. 10 or FIG. 12, in accordance with an embodiment of the invention;



FIGS. 24-31 are two-dimensional grid patterns that may be emitted by the structured light device of FIG. 10 or FIG. 12, in accordance with an embodiment of the invention;



FIG. 32 is a schematic illustration of a photometric technique for acquiring patterns of structured light under a plurality of lighting conditions;



FIG. 33 is an illustration of a structured light scanner device independently operable from an AACMM in accordance with another embodiment of the invention;



FIG. 34 is an isometric drawing of a probe end having a triangulation scanner and camera used together to produce sharp 3D representations;



FIG. 35 is a schematic illustration of rays projected through a camera perspective center to provide sharp edges for 3D representations;



FIG. 36 is an illustration showing a hole whose edges are surrounded by a "fuzzy" region; and



FIG. 37 is a flow chart of the steps used in the method for determining 3D coordinates of an edge point located on an edge feature.





DETAILED DESCRIPTION

Portable articulated arm coordinate measuring machines (“AACMM”) are used in a variety of applications to obtain measurements of objects. Embodiments of the present invention provide advantages in allowing an operator to easily and quickly couple accessory devices to a probe end of the AACMM that use structured light to provide for the non-contact measuring of a three-dimensional object. Embodiments of the present invention provide further advantages in providing for communicating data representing a point cloud measured by the structured light device within the AACMM. Embodiments of the present invention provide advantages in greater uniformity in the distribution of measured points that may provide enhanced accuracy. Embodiments of the present invention provide still further advantages in providing power and data communications to a removable accessory without having external connections or wiring. Embodiments of the present invention provide still further advantages in sharpening edges of features in 3D representations.


As used herein, the term "structured light" refers to a two-dimensional pattern of light projected onto a continuous and enclosed area of an object that conveys information which may be used to determine coordinates of points on the object. A structured light pattern will contain at least three non-collinear pattern elements disposed within the continuous and enclosed area. Each of the three non-collinear pattern elements conveys information which may be used to determine the point coordinates.


In general, there are two types of structured light, a coded light pattern and an uncoded light pattern. As used herein a coded light pattern is one in which the three dimensional coordinates of an illuminated surface of the object may be ascertained by the acquisition of a single image. In some cases, the projecting device may be moving relative to the object. In other words, for a coded light pattern there will be no significant temporal relationship between the projected pattern and the acquired image. Typically, a coded light pattern will contain a set of elements (e.g. geometric shapes) arranged so that at least three of the elements are non-collinear. In some cases, the set of elements may be arranged into collections of lines. Having at least three of the elements be non-collinear ensures that the pattern is not a simple line pattern as would be projected, for example, by a laser line scanner. As a result, the pattern elements are recognizable because of the arrangement of the elements.


In contrast, an uncoded structured light pattern as used herein is a pattern that does not allow measurement through a single pattern when the projector is moving relative to the object. An example of an uncoded light pattern is one which requires a series of sequential patterns and thus the acquisition of a series of sequential images. Due to the temporal nature of the projection pattern and acquisition of the image, there should be no relative movement between the projector and the object.


It should be appreciated that structured light is different from light projected by a laser line probe or laser line scanner type device that generates a line of light. To the extent that laser line probes used with articulated arms today have irregularities or other aspects that may be regarded as features within the generated lines, these features are disposed in a collinear arrangement. Consequently such features within a single generated line are not considered to make the projected light into structured light.



FIGS. 1A and 1B illustrate, in perspective, an AACMM 100 according to various embodiments of the present invention, an articulated arm being one type of coordinate measuring machine. As shown in FIGS. 1A and 1B, the exemplary AACMM 100 may comprise a six or seven axis articulated measurement device having a probe end 401 that includes a measurement probe housing 102 coupled to an arm portion 104 of the AACMM 100 at one end. The arm portion 104 comprises a first arm segment 106 coupled to a second arm segment 108 by a first grouping of bearing cartridges 110 (e.g., two bearing cartridges). A second grouping of bearing cartridges 112 (e.g., two bearing cartridges) couples the second arm segment 108 to the measurement probe housing 102. A third grouping of bearing cartridges 114 (e.g., three bearing cartridges) couples the first arm segment 106 to a base 116 located at the other end of the arm portion 104 of the AACMM 100. Each grouping of bearing cartridges 110, 112, 114 provides for multiple axes of articulated movement. Also, the probe end 401 may include a measurement probe housing 102 that comprises the shaft of the seventh axis portion of the AACMM 100 (e.g., a cartridge containing an encoder system that determines movement of the measurement device, for example a probe 118, in the seventh axis of the AACMM 100). In this embodiment, the probe end 401 may rotate about an axis extending through the center of measurement probe housing 102. In use of the AACMM 100, the base 116 is typically affixed to a work surface.


Each bearing cartridge within each bearing cartridge grouping 110, 112, 114 typically contains an encoder system (e.g., an optical angular encoder system). The encoder system (i.e., transducer) provides an indication of the position of the respective arm segments 106, 108 and corresponding bearing cartridge groupings 110, 112, 114 that all together provide an indication of the position of the probe 118 with respect to the base 116 (and, thus, the position of the object being measured by the AACMM 100 in a certain frame of reference—for example a local or global frame of reference). The arm segments 106, 108 may be made from a suitably rigid material such as, but not limited to, a carbon composite material. A portable AACMM 100 with six or seven axes of articulated movement (i.e., degrees of freedom) provides advantages in allowing the operator to position the probe 118 in a desired location within a 360° area about the base 116 while providing an arm portion 104 that may be easily handled by the operator. However, it should be appreciated that the illustration of an arm portion 104 having two arm segments 106, 108 is for exemplary purposes, and the claimed invention should not be so limited. An AACMM 100 may have any number of arm segments coupled together by bearing cartridges (and, thus, more or fewer than six or seven axes of articulated movement or degrees of freedom).
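
To make concrete how the encoder readings yield the probe position, the following sketch chains one homogeneous transform per joint. The patent does not specify a kinematic model; the Denavit-Hartenberg parameterization and all names below are assumptions for illustration.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform for one joint (standard Denavit-Hartenberg form).
    theta comes from the joint's angular encoder; d, a, alpha are fixed link geometry."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def probe_position(encoder_angles, link_params):
    """Multiply the per-joint transforms from base to probe; the translation
    column of the result is the probe position in the base frame of reference."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(encoder_angles, link_params):
        T = T @ dh_transform(theta, d, a, alpha)
    return T[:3, 3]
```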


The probe 118 is detachably mounted to the measurement probe housing 102, which is connected to bearing cartridge grouping 112. A handle 126 is removable with respect to the measurement probe housing 102 by way of, for example, a quick-connect interface. As will be discussed in more detail below, the handle 126 may be replaced with another device configured to emit a structured light to provide non-contact measurement of three-dimensional objects, thereby providing advantages in allowing the operator to make both contact and non-contact measurements with the same AACMM 100. In exemplary embodiments, the probe housing 102 houses a removable probe 118, which is a contacting measurement device and may have different tips 118 that physically contact the object to be measured, including, but not limited to: ball, touch-sensitive, curved and extension type probes. In other embodiments, the measurement is performed, for example, by a non-contacting device such as a coded structured light scanner device. In an embodiment, the handle 126 is replaced with the coded structured light scanner device using the quick-connect interface. Other types of measurement devices may replace the removable handle 126 to provide additional functionality. Examples of such measurement devices include, but are not limited to, one or more illumination lights, a temperature sensor, a thermal scanner, a bar code scanner, a projector, a paint sprayer, a camera, or the like, for example.


As shown in FIGS. 1A and 1B, the AACMM 100 includes the removable handle 126 that provides advantages in allowing accessories or functionality to be changed without removing the measurement probe housing 102 from the bearing cartridge grouping 112. As discussed in more detail below with respect to FIG. 2D, the removable handle 126 may also include an electrical connector that allows electrical power and data to be exchanged with the handle 126 and the corresponding electronics located in the probe end 401.


In various embodiments, each grouping of bearing cartridges 110, 112, 114 allows the arm portion 104 of the AACMM 100 to move about multiple axes of rotation. As mentioned, each bearing cartridge grouping 110, 112, 114 includes corresponding encoder systems, such as optical angular encoders for example, that are each arranged coaxially with the corresponding axis of rotation of, e.g., the arm segments 106, 108. The optical encoder system detects rotational (swivel) or transverse (hinge) movement of, e.g., each one of the arm segments 106, 108 about the corresponding axis and transmits a signal to an electronic data processing system within the AACMM 100 as described in more detail herein below. Each individual raw encoder count is sent separately to the electronic data processing system as a signal where it is further processed into measurement data. No position calculator separate from the AACMM 100 itself (e.g., a serial box of the type disclosed in commonly assigned U.S. Pat. No. 5,402,582 ('582)) is required.


The base 116 may include an attachment device or mounting device 120. The mounting device 120 allows the AACMM 100 to be removably mounted to a desired location, such as an inspection table, a machining center, a wall or the floor for example. In one embodiment, the base 116 includes a handle portion 122 that provides a convenient location for the operator to hold the base 116 as the AACMM 100 is being moved. In one embodiment, the base 116 further includes a movable cover portion 124 that folds down to reveal a user interface, such as a display screen.


In accordance with an embodiment, the base 116 of the portable AACMM 100 contains or houses an electronic circuit having an electronic data processing system that includes two primary components: a base processing system that processes the data from the various encoder systems within the AACMM 100 as well as data representing other arm parameters to support three-dimensional (3-D) positional calculations; and a user interface processing system that includes an on-board operating system, a touch screen display, and resident application software that allows for relatively complete metrology functions to be implemented within the AACMM 100 without the need for connection to an external computer.


The electronic data processing system in the base 116 may communicate with the encoder systems, sensors, and other peripheral hardware located away from the base 116 (e.g., a structured light device that can be mounted to the removable handle 126 on the AACMM 100). The electronics that support these peripheral hardware devices or features may be located in each of the bearing cartridge groupings 110, 112, 114 located within the portable AACMM 100.



FIG. 2A is a block diagram of electronics utilized in an AACMM 100 in accordance with an embodiment. The embodiment shown in FIG. 2A includes an electronic data processing system 210 including a base processor board 204 for implementing the base processing system, a user interface board 202, a base power board 206 for providing power, a Bluetooth module 232, and a base tilt board 208. The user interface board 202 includes a computer processor for executing application software to perform user interface, display, and other functions described herein.


As shown in FIG. 2A and FIG. 2B, the electronic data processing system 210 is in communication with the aforementioned plurality of encoder systems via one or more arm buses 218. In the embodiment depicted in FIG. 2B and FIG. 2C, each encoder system generates encoder data and includes: an encoder arm bus interface 214, an encoder digital signal processor (DSP) 216, an encoder read head interface 234, and a temperature sensor 212. Other devices, such as strain sensors, may be attached to the arm bus 218.


Also shown in FIG. 2D are probe end electronics 230 that are in communication with the arm bus 218. The probe end electronics 230 include a probe end DSP 228, a temperature sensor 212, a handle/device interface bus 240 that connects with the handle 126 or the coded structured light scanner device 242 via the quick-connect interface in an embodiment, and a probe interface 226. The quick-connect interface allows access by the handle 126 to the data bus, control lines, and power bus used by the coded structured light scanner device 242 and other accessories. In an embodiment, the probe end electronics 230 are located in the measurement probe housing 102 on the AACMM 100. In an embodiment, the handle 126 may be removed from the quick-connect interface and measurement may be performed by the structured light device 242 communicating with the probe end electronics 230 of the AACMM 100 via the interface bus 240. In an embodiment, the electronic data processing system 210 is located in the base 116 of the AACMM 100, the probe end electronics 230 are located in the measurement probe housing 102 of the AACMM 100, and the encoder systems are located in the bearing cartridge groupings 110, 112, 114. The probe interface 226 may connect with the probe end DSP 228 by any suitable communications protocol, including commercially-available products from Maxim Integrated Products, Inc. that embody the 1-Wire® communications protocol 236.



FIG. 3A is a block diagram describing detailed features of the electronic data processing system 210 of the AACMM 100 in accordance with an embodiment. In an embodiment, the electronic data processing system 210 is located in the base 116 of the AACMM 100 and includes the base processor board 204, the user interface board 202, a base power board 206, a Bluetooth module 232, and a base tilt module 208.


In an embodiment shown in FIG. 3A, the base processor board 204 includes the various functional blocks illustrated therein. For example, a base processor function 302 is utilized to support the collection of measurement data from the AACMM 100 and receives raw arm data (e.g., encoder system data) via the arm bus 218 and a bus control module function 308. The memory function 304 stores programs and static arm configuration data. The base processor board 204 also includes an external hardware option port function 310 for communicating with any external hardware devices or accessories such as a coded structured light scanner device 242. A real time clock (RTC) and log 306, a battery pack interface (IF) 316, and a diagnostic port 318 are also included in the functionality in an embodiment of the base processor board 204 depicted in FIG. 3A.


The base processor board 204 also manages all the wired and wireless data communication with external (host computer) and internal (display processor 202) devices. The base processor board 204 has the capability of communicating with an Ethernet network via an Ethernet function 320 (e.g., using a clock synchronization standard such as Institute of Electrical and Electronics Engineers (IEEE) 1588), with a wireless local area network (WLAN) via a LAN function 322, and with Bluetooth module 232 via a parallel to serial communications (PSC) function 314. The base processor board 204 also includes a connection to a universal serial bus (USB) device 312.


The base processor board 204 transmits and collects raw measurement data (e.g., encoder system counts, temperature readings) for processing into measurement data without the need for any preprocessing, such as disclosed in the serial box of the aforementioned '582 patent. The base processor 204 sends the processed data to the display processor 328 on the user interface board 202 via an RS485 interface (IF) 326. In an embodiment, the base processor 204 also sends the raw measurement data to an external computer.


Turning now to the user interface board 202 in FIG. 3B, the angle and positional data received by the base processor is utilized by applications executing on the display processor 328 to provide an autonomous metrology system within the AACMM 100. Applications may be executed on the display processor 328 to support functions such as, but not limited to: measurement of features, guidance and training graphics, remote diagnostics, temperature corrections, control of various operational features, connection to various networks, and display of measured objects. Along with the display processor 328 and a liquid crystal display (LCD) 338 (e.g., a touch screen LCD) user interface, the user interface board 202 includes several interface options including a secure digital (SD) card interface 330, a memory 332, a USB Host interface 334, a diagnostic port 336, a camera port 340, an audio/video interface 342, a dial-up/cell modem 344 and a global positioning system (GPS) port 346.


The electronic data processing system 210 shown in FIG. 3A also includes a base power board 206 with an environmental recorder 362 for recording environmental data. The base power board 206 also provides power to the electronic data processing system 210 using an AC/DC converter 358 and a battery charger control 360. The base power board 206 communicates with the base processor board 204 using inter-integrated circuit (I2C) serial single ended bus 354 as well as via a DMA serial peripheral interface (DSPI) 357. The base power board 206 is connected to a tilt sensor and radio frequency identification (RFID) module 208 via an input/output (I/O) expansion function 364 implemented in the base power board 206.


Though shown as separate components, in other embodiments all or a subset of the components may be physically located in different locations and/or functions combined in different manners than that shown in FIG. 3A and FIG. 3B. For example, in one embodiment, the base processor board 204 and the user interface board 202 are combined into one physical board.


Referring now to FIGS. 4-9, an exemplary embodiment of a probe end 401 is illustrated having a measurement probe housing 102 with a quick-connect mechanical and electrical interface that allows a removable and interchangeable device 400 to be coupled with the AACMM 100. In the exemplary embodiment, the device 400 includes an enclosure 402 that includes a handle portion 404 that is sized and shaped to be held in an operator's hand, such as in a pistol grip for example. The enclosure 402 is a thin wall structure having a cavity 406 (FIG. 9). The cavity 406 is sized and configured to receive a controller 408. The controller 408 may be a digital circuit, having a microprocessor for example, or an analog circuit. In one embodiment, the controller 408 is in asynchronous bidirectional communication with the electronic data processing system 210 (FIGS. 2 and 3). The communication connection between the controller 408 and the electronic data processing system 210 may be wired (e.g. via controller 420) or may be a direct or indirect wireless connection (e.g. Bluetooth or IEEE 802.11) or a combination of wired and wireless connections. In the exemplary embodiment, the enclosure 402 is formed in two halves 410, 412, such as from an injection molded plastic material for example. The halves 410, 412 may be secured together by fasteners, such as screws 414 for example. In other embodiments, the enclosure halves 410, 412 may be secured together by adhesives or ultrasonic welding for example.


The handle portion 404 also includes buttons or actuators 416, 418 that may be manually activated by the operator. The actuators 416, 418 are coupled to the controller 408 that transmits a signal to a controller 420 within the probe housing 102. In the exemplary embodiments, the actuators 416, 418 perform the functions of actuators 422, 424 located on the probe housing 102 opposite the device 400. It should be appreciated that the device 400 may have additional switches, buttons or other actuators that may also be used to control the device 400, the AACMM 100 or vice versa. Also, the device 400 may include indicators, such as light emitting diodes (LEDs), sound generators, meters, displays or gauges for example. In one embodiment, the device 400 may include a digital voice recorder that allows for synchronization of verbal comments with a measured point. In yet another embodiment, the device 400 includes a microphone that allows the operator to transmit voice activated commands to the electronic data processing system 210.


In one embodiment, the handle portion 404 may be configured to be used with either hand of the operator or for a particular hand (e.g. left handed or right handed). The handle portion 404 may also be configured to facilitate operators with disabilities (e.g. operators with missing fingers or operators with prosthetic arms). Further, the handle portion 404 may be removed and the probe housing 102 used by itself when clearance space is limited. As discussed above, the probe end 401 may also comprise the shaft of the seventh axis of AACMM 100. In this embodiment the device 400 may be arranged to rotate about the AACMM seventh axis.


The probe end 401 includes a mechanical and electrical interface 426 having a first connector 429 (FIG. 8) on the device 400 that cooperates with a second connector 428 on the probe housing 102. The connectors 428, 429 may include electrical and mechanical features that allow for coupling of the device 400 to the probe housing 102. In one embodiment, the interface 426 includes a first surface 430 having a mechanical coupler 432 and an electrical connector 434 thereon. The enclosure 402 also includes a second surface 436 positioned adjacent to and offset from the first surface 430. In the exemplary embodiment, the second surface 436 is a planar surface offset a distance of approximately 0.5 inches from the first surface 430. This offset provides a clearance for the operator's fingers when tightening or loosening a fastener such as collar 438. The interface 426 provides for a relatively quick and secure electronic connection between the device 400 and the probe housing 102 without the need to align connector pins, and without the need for separate cables or connectors.


The electrical connector 434 extends from the first surface 430 and includes one or more connector pins 440 that are electrically coupled in asynchronous bidirectional communication with the electronic data processing system 210 (FIGS. 2 and 3), such as via one or more arm buses 218 for example. The bidirectional communication connection may be wired (e.g. via arm bus 218), wireless (e.g. Bluetooth or IEEE 802.11), or a combination of wired and wireless connections. In one embodiment, the electrical connector 434 is electrically coupled to the controller 420. The controller 420 may be in asynchronous bidirectional communication with the electronic data processing system 210 such as via one or more arm buses 218 for example. The electrical connector 434 is positioned to provide a relatively quick and secure electronic connection with electrical connector 442 on probe housing 102. The electrical connectors 434, 442 connect with each other when the device 400 is attached to the probe housing 102. The electrical connectors 434, 442 may each comprise a metal encased connector housing that provides shielding from electromagnetic interference as well as protecting the connector pins and assisting with pin alignment during the process of attaching the device 400 to the probe housing 102.


The mechanical coupler 432 provides relatively rigid mechanical coupling between the device 400 and the probe housing 102 to support relatively precise applications in which the location of the device 400 on the end of the arm portion 104 of the AACMM 100 preferably does not shift or move. Any such movement may typically cause an undesirable degradation in the accuracy of the measurement result. These desired results are achieved using various structural features of the mechanical attachment configuration portion of the quick connect mechanical and electronic interface of an embodiment of the present invention.


In one embodiment, the mechanical coupler 432 includes a first projection 444 positioned on one end 448 (the leading edge or “front” of the device 400). The first projection 444 may include a keyed, notched or ramped interface that forms a lip 446 that extends from the first projection 444. The lip 446 is sized to be received in a slot 450 defined by a projection 452 extending from the probe housing 102 (FIG. 8). It should be appreciated that the first projection 444 and the slot 450 along with the collar 438 form a coupler arrangement such that when the lip 446 is positioned within the slot 450, the slot 450 may be used to restrict both the longitudinal and lateral movement of the device 400 when attached to the probe housing 102. As will be discussed in more detail below, the rotation of the collar 438 may be used to secure the lip 446 within the slot 450.


Opposite the first projection 444, the mechanical coupler 432 may include a second projection 454. The second projection 454 may have a keyed, notched-lip or ramped interface surface 456 (FIG. 5). The second projection 454 is positioned to engage a fastener associated with the probe housing 102, such as collar 438 for example. As will be discussed in more detail below, the mechanical coupler 432 includes a raised surface projecting from surface 430 that is adjacent to or disposed about the electrical connector 434, which provides a pivot point for the interface 426 (FIGS. 7 and 8). This serves as the third of three points of mechanical contact between the device 400 and the probe housing 102 when the device 400 is attached thereto.


The probe housing 102 includes a collar 438 arranged co-axially on one end. The collar 438 includes a threaded portion that is movable between a first position (FIG. 5) and a second position (FIG. 7). By rotating the collar 438, the collar 438 may be used to secure or remove the device 400 without the need for external tools. Rotation of the collar 438 moves the collar 438 along a relatively coarse, square-threaded cylinder 474. The use of such relatively large size, square-thread and contoured surfaces allows for significant clamping force with minimal rotational torque. The coarse pitch of the threads of the cylinder 474 further allows the collar 438 to be tightened or loosened with minimal rotation.


To couple the device 400 to the probe housing 102, the lip 446 is inserted into the slot 450 and the device is pivoted to rotate the second projection 454 toward surface 458 as indicated by arrow 464 (FIG. 5). The collar 438 is rotated causing the collar 438 to move or translate in the direction indicated by arrow 462 into engagement with surface 456. The movement of the collar 438 against the angled surface 456 drives the mechanical coupler 432 against the raised surface 460. This assists in overcoming potential issues with distortion of the interface or foreign objects on the surface of the interface that could interfere with the rigid seating of the device 400 to the probe housing 102. The application of force by the collar 438 on the second projection 454 causes the mechanical coupler 432 to move forward pressing the lip 446 into a seat on the probe housing 102. As the collar 438 continues to be tightened, the second projection 454 is pressed upward toward the probe housing 102 applying pressure on a pivot point. This provides a see-saw type arrangement, applying pressure to the second projection 454, the lip 446 and the center pivot point to reduce or eliminate shifting or rocking of the device 400. The pivot point presses directly against the bottom of the probe housing 102 while the lip 446 applies a downward force on the end of the probe housing 102. FIG. 5 includes arrows 462, 464 to show the direction of movement of the device 400 and the collar 438. FIG. 7 includes arrows 466, 468, 470 to show the direction of applied pressure within the interface 426 when the collar 438 is tightened. It should be appreciated that the offset distance of the surface 436 of device 400 provides a gap 472 between the collar 438 and the surface 436 (FIG. 6). The gap 472 allows the operator to obtain a firmer grip on the collar 438 while reducing the risk of pinching fingers as the collar 438 is rotated. In one embodiment, the probe housing 102 is of sufficient stiffness to reduce or prevent the distortion when the collar 438 is tightened.


Embodiments of the interface 426 allow for the proper alignment of the mechanical coupler 432 and electrical connector 434 and also protect the electronics interface from applied stresses that may otherwise arise due to the clamping action of the collar 438, the lip 446 and the surface 456. This provides advantages in reducing or eliminating stress damage to the electrical connectors 434, 442, which are mounted on the circuit board 476 and may have soldered terminals. Also, embodiments provide advantages over known approaches in that no tools are required for a user to connect or disconnect the device 400 from the probe housing 102. This allows the operator to manually connect and disconnect the device 400 from the probe housing 102 with relative ease.


Due to the relatively large number of shielded electrical connections possible with the interface 426, a relatively large number of functions may be shared between the AACMM 100 and the device 400. For example, switches, buttons or other actuators located on the AACMM 100 may be used to control the device 400 or vice versa. Further, commands and data may be transmitted from electronic data processing system 210 to the device 400. In one embodiment, the device 400 is a video camera that transmits data of a recorded image to be stored in memory on the base processor 204 or displayed on the display 328. In another embodiment the device 400 is an image projector that receives data from the electronic data processing system 210. In addition, temperature sensors located in either the AACMM 100 or the device 400 may be shared by the other. It should be appreciated that embodiments of the present invention provide advantages in providing a flexible interface that allows a wide variety of accessory devices 400 to be quickly, easily and reliably coupled to the AACMM 100. Further, the capability of sharing functions between the AACMM 100 and the device 400 may allow a reduction in size, power consumption and complexity of the AACMM 100 by eliminating duplication.


In one embodiment, the controller 408 may alter the operation or functionality of the probe end 401 of the AACMM 100. For example, the controller 408 may alter indicator lights on the probe housing 102 to either emit a different color light, a different intensity of light, or turn on/off at different times when the device 400 is attached versus when the probe housing 102 is used by itself. In one embodiment, the device 400 includes a range finding sensor (not shown) that measures the distance to an object. In this embodiment, the controller 408 may change indicator lights on the probe housing 102 in order to provide an indication to the operator how far away the object is from the probe tip 118. In another embodiment, the controller 408 may change the color of the indicator lights based on the quality of the image acquired by the coded structured light scanner device. This provides advantages in simplifying the requirements of controller 420 and allows for upgraded or increased functionality through the addition of accessory devices.


Referring to FIGS. 10-13, embodiments of the present invention provide advantages to projector, camera, signal processing, control and indicator interfaces for a non-contact three-dimensional measurement device 500. The device 500 includes a pair of optical devices, such as a light projector 508 and a camera 510, for example, that project a structured light pattern and receive a two-dimensional pattern that was reflected from an object 501. The device 500 uses triangulation methods based on the known emitted pattern and the acquired image to determine a point cloud representing the X, Y, Z coordinate data for the object 501 for each pixel of the received image. In an embodiment, the structured light pattern is coded so that a single image is sufficient to determine the three-dimensional coordinates of object points. Such a coded structured light pattern may also be said to measure three-dimensional coordinates in a single shot.


In the exemplary embodiment, the projector 508 uses a visible light source that illuminates a pattern generator. The visible light source may be a laser, a superluminescent diode, an incandescent light, a light emitting diode (LED), or other light emitting device. In the exemplary embodiment, the pattern generator is a chrome-on-glass slide having a structured light pattern etched thereon. The slide may have a single pattern or multiple patterns that move in and out of position as needed. The slide may be manually or automatically installed in the operating position. In other embodiments, the source pattern may be light reflected off or transmitted by a digital micro-mirror device (DMD) such as a digital light projector (DLP) manufactured by Texas Instruments Corporation, a liquid crystal device (LCD), a liquid crystal on silicon (LCOS) device, or a similar device used in transmission mode rather than reflection mode. The projector 508 may further include a lens system 515 that alters the outgoing light to have the desired focal characteristics.


The device 500 further includes an enclosure 502 with a handle portion 504. In one embodiment, the device 500 may further include an interface 426 on one end that mechanically and electrically couples the device 500 to the probe housing 102 as described herein above. In other embodiments, the device 500 may be integrated into the probe housing 102. The interface 426 provides advantages in allowing the device 500 to be coupled and removed from the AACMM 100 quickly and easily without requiring additional tools.


The camera 510 includes a photosensitive sensor which generates a digital image/representation of the area within the sensor's field of view. The sensor may be a charge-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor, for example, having an array of pixels. The camera 510 may further include other components, such as but not limited to lens 503 and other optical devices for example. In the exemplary embodiment, the projector 508 and the camera 510 are arranged at an angle such that the sensor may receive light reflected from the surface of the object 501. In one embodiment, the projector 508 and camera 510 are positioned such that the device 500 may be operated with the probe tip 118 in place. Further, it should be appreciated that the device 500 is substantially fixed relative to the probe tip 118, so that forces on the handle portion 504 do not influence the alignment of the device 500 relative to the probe tip 118. In one embodiment, the device 500 may have an additional actuator (not shown) that allows the operator to switch between acquiring data from the device 500 and the probe tip 118.


The projector 508 and camera 510 are electrically coupled to a controller 512 disposed within the enclosure 502. The controller 512 may include one or more microprocessors, digital signal processors, memory and signal conditioning circuits. Due to the digital signal processing and large data volume generated by the device 500, the controller 512 may be arranged within the handle portion 504. The controller 512 is electrically coupled to the arm buses 218 via electrical connector 434. The device 500 may further include actuators 514, 516 which may be manually activated by the operator to initiate operation and data capture by the device 500. In one embodiment, the image processing to determine the X, Y, Z coordinate data of the point cloud representing object 501 is performed by the controller 512 and the coordinate data is transmitted to the electronic data processing system 210 via bus 240. In another embodiment images are transmitted to the electronic data processing system 210 and the calculation of the coordinates is performed by the electronic data processing system 210.


In one embodiment, the controller 512 is configured to communicate with the electronic data processing system 210 to receive structured light pattern images from the electronic data processing system 210. In still another embodiment, the pattern emitted onto the object may be changed by the electronic data processing system 210 either automatically or in response to an input from the operator. This may provide advantages in obtaining higher accuracy measurements with less processing time by allowing the use of patterns that are simpler to decode when conditions warrant, and more complex patterns where needed to achieve the desired level of accuracy or resolution.


In other embodiments of the present invention, the device 520 (FIG. 12) includes a pair of cameras 510. The cameras 510 are arranged at an angle relative to the projector 508 to receive reflected light from the object 501. The use of multiple cameras 510 may provide advantages in some applications by providing redundant images to increase the accuracy of the measurement. In still other embodiments, the redundant images may allow for sequential patterns to be quickly acquired by the device 500 by increasing the acquisition speed of images by alternately operating the cameras 510.


Referring now to FIG. 13A and FIG. 13B, the operation of the structured light device 500 will be described. The device 500 first emits a structured light pattern 522 with projector 508 onto surface 524 of the object 501. The structured light pattern 522 may include the patterns disclosed in the journal article "DLP-Based Structured Light 3D Imaging Technologies and Applications" by Jason Geng, published in the Proceedings of SPIE, Vol. 7932, which is incorporated herein by reference. The structured light pattern 522 may further include, but is not limited to, one of the patterns shown in FIGS. 14-32. The light 509 from projector 508 is reflected from the surface 524 and the reflected light 511 is received by the camera 510. It should be appreciated that variations in the surface 524, such as protrusion 526 for example, create distortions in the structured pattern when the image of the pattern is captured by the camera 510. Since the pattern is formed by structured light, it is possible in some instances for the controller 512 or the electronic data processing system 210 to determine a one-to-one correspondence between the pixels in the emitted pattern, such as pixel 513 for example, and the pixels in the imaged pattern, such as pixel 515 for example. This enables triangulation principles to be used to determine the coordinates of each pixel in the imaged pattern. The collection of three-dimensional coordinates of the surface 524 is sometimes referred to as a point cloud. By moving the device 500 over the surface 524, a point cloud may be created of the entire object 501. It should be appreciated that in some embodiments the coupling of the device 500 to the probe end provides advantages in that the position and orientation of the device 500 is known by the electronic data processing system 210, so that the location of the object 501 relative to the AACMM 100 may also be ascertained.


To determine the coordinates of the pixel, the angle of each projected ray of light 509 intersecting the object 501 in a point 527 is known to correspond to a projection angle phi (Φ), so that Φ information is encoded into the emitted pattern. In an embodiment, the system is configured to enable the Φ value corresponding to each pixel in the imaged pattern to be ascertained. Further, an angle omega (Ω) for each pixel in the camera is known, as is the baseline distance "D" between the projector 508 and the camera. Therefore, the distance "Z" from the camera 510 to the location that the pixel has imaged may be found using the equation:










Z/D = sin(Φ)/sin(Ω + Φ)     (1)

Thus three-dimensional coordinates may be calculated for each pixel in the acquired image.
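
As a worked numeric example of Eq. (1) (the values and function name are illustrative, not from the patent):

```python
import numpy as np

def distance_z(phi, omega, baseline_d):
    """Eq. (1): Z / D = sin(phi) / sin(omega + phi), solved for Z.
    phi: projection angle decoded from the pattern; omega: camera pixel angle."""
    return baseline_d * np.sin(phi) / np.sin(omega + phi)

# e.g. phi = 60 deg, omega = 40 deg, baseline D = 150 mm:
# Z = 150 * sin(60 deg) / sin(100 deg), approximately 131.9 mm
z = distance_z(np.radians(60.0), np.radians(40.0), 150.0)
```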


In general, there are two categories of structured light, namely coded and uncoded structured light. A common form of uncoded structured light, such as that shown in FIGS. 14-17 and 28-30, relies on a striped pattern varying in a periodic manner along one dimension. These types of patterns are usually applied in a sequence to provide an approximate distance to the object. Some uncoded pattern embodiments, such as the sinusoidal patterns for example, may provide relatively highly accurate measurements. However, for these types of patterns to be effective, it is usually necessary for the scanner device and the object to be held stationary relative to each other. Where the scanner device or the object is in motion relative to the other, a coded pattern, such as those shown in FIGS. 18-27, may be preferable. A coded pattern allows the image to be analyzed using a single acquired image. Some coded patterns may be placed in a particular orientation on the projector pattern (for example, perpendicular to epipolar lines on the projector plane), thereby simplifying analysis of the three-dimensional surface coordinates based on a single image.


Epipolar lines are mathematical lines formed by the intersection of epipolar planes and the source plane 517 or the image plane 521 (the plane of the camera sensor) in FIG. 13B. An epipolar plane may be any plane that passes through the projector perspective center 519 and the camera perspective center. The epipolar lines on the source plane 517 and the image plane 521 may be parallel in some cases, but in general are not parallel. An aspect of epipolar lines is that a given epipolar line on the projector plane 517 has a corresponding epipolar line on the image plane 521. Therefore, any particular pattern known on an epipolar line in the projector plane 517 may be immediately observed and evaluated in the image plane 521. For example, if a coded pattern is placed along an epipolar line in the projector plane 517, the spacing between the coded elements in the image plane 521 may be determined using the values read out of the pixels of the camera sensor 510. This information may be used to determine the three-dimensional coordinates of a point 527 on the object 501. It is further possible to tilt coded patterns at a known angle with respect to an epipolar line and efficiently extract object surface coordinates. Examples of coded patterns are shown in FIGS. 20-29.
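
The projector-to-camera epipolar correspondence described above is commonly expressed with a fundamental matrix. The sketch below is one standard formulation, offered as an assumption about how such a correspondence could be computed; it is not stated in the patent.

```python
import numpy as np

def skew(t):
    """Cross-product (skew-symmetric) matrix [t]x."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def fundamental_matrix(K_proj, K_cam, R, t):
    """F maps a homogeneous projector pixel p to its camera epipolar line l = F @ p.
    R, t: pose of the camera relative to the projector; K_*: 3x3 intrinsics."""
    E = skew(t) @ R                                  # essential matrix
    return np.linalg.inv(K_cam).T @ E @ np.linalg.inv(K_proj)

# A coded element placed at projector pixel (u, v) must appear in the image
# along the line l, i.e. at points (x, y) with l[0]*x + l[1]*y + l[2] = 0.
```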


In embodiments having a periodic pattern, such as a sinusoidally repeating pattern, the sinusoidal period represents a plurality of pattern elements. Since there is a multiplicity of periodic patterns in two-dimensions, the pattern elements are non-collinear. In some cases, a striped pattern having stripes of varying width may represent a coded pattern.


Referring now to FIGS. 14-17, embodiments of uncoded structured light patterns are shown. Some of the patterns use a simple on-off (or 1, 0) type pattern and are referred to as binary patterns. In some cases, the binary pattern is one known to have a particular sequence referred to as a gray code sequence. The term gray code as used in the field of three-dimensional metrology based on structured light is somewhat different than the term as used in the field of electrical engineering, where the term Gray code commonly means the sequential changing of a single bit at a time. The present application follows the use of the term gray code as is customary for the field of three-dimensional metrology, where the gray code typically represents a sequence of binary black and white values. FIG. 14A shows an example of a binary pattern that includes a plurality of sequential images 530, 532, 534, each having a different striped pattern thereon. Usually, the stripes alternate between bright (illuminated) and dark (non-illuminated) striped regions. Sometimes, the terms white and black are used to mean illuminated and non-illuminated, respectively. When the images 530, 532, 534 are projected sequentially onto the surface 524, the result is the composite image 536 shown in FIG. 14B. It should be noted that the bottom two patterns 535, 537 of FIG. 14B are not illustrated in FIG. 14A for clarity. For each point on the object 501 (represented by a camera pixel in the image), the composite pattern 536 has a unique binary value obtained through the sequential projection of patterns 530, 532, 534, 535, 537, which corresponds to a relatively small range of possible projection angles Φ. By using these projection angles, together with the known pixel angle Ω for a given pixel and the known baseline distance D, Eq. (1) may be used to find the distance Z from the camera to the object point. A two-dimensional angle is known for each camera pixel. The two-dimensional angle corresponds generally to the one-dimensional angle Ω, which is used in the calculation of the distance Z according to Eq. (1). However, a line drawn from each camera pixel through the camera perspective center and intersecting the object in a point defines a two-dimensional angle in space. When combined with the calculated value Z, the two pixel angles provide three-dimensional coordinates corresponding to a point on the object surface.
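
A compact sketch of how such a binary gray code sequence can be generated and decoded, assuming the common convention that projected pattern k carries bit k (most significant first) of the gray-coded stripe index; the implementation and names are illustrative, not taken from the patent:

```python
import numpy as np

def gray_code_stripe_patterns(n_bits, width):
    """One 1-D bright/dark row per bit; tile each row vertically to project it."""
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)                        # binary index -> gray code
    return [((gray >> (n_bits - 1 - k)) & 1).astype(np.uint8)
            for k in range(n_bits)]                  # most significant bit first

def decode_stripe_index(bits_msb_first):
    """Bright/dark (1/0) values observed at one camera pixel across the
    sequence -> projector stripe index, which bounds the projection angle."""
    g = 0
    for b in bits_msb_first:
        g = (g << 1) | int(b)
    mask = g >> 1
    while mask:                                      # gray code -> binary
        g ^= mask
        mask >>= 1
    return g
```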


Similarly, rather than a binary pattern, a sequential series of gray patterns having stripes with varying gray-scale values may be used. When used in this context, the term gray-scale usually refers to an amount of irradiance at a point on the object, from white (maximum light), to various levels of gray (less light), to black (minimum light). This same nomenclature is used even if the light being projected has a color such as red, in which case the gray-scale values correspond to levels of red illumination. In an embodiment, the pattern (FIG. 15) has a plurality of images 538, 540, 542 with stripes having varying light power levels, such as black, gray, and white for example, used to produce an emitted pattern on the object 501. The gray-scale values may be used to determine the possible projection angles Φ to within a relatively small range of possible values. As discussed hereinabove, Eq. (1) may then be used to determine the distance Z.


In another embodiment, the distance Z to an object point may be found by measuring a phase shift observed in a plurality of images. For example, in an embodiment shown in FIG. 16, the gray-scale intensities 546, 548, 550 of a projector pattern 552 vary in a sinusoidal manner, but with the phase shifted between projected patterns. For example, in the first projector pattern, the sinusoid gray-scale intensity 546 (representing optical power per unit area) may have a phase of zero degrees at a particular point. In the second projector pattern, the sinusoid intensity 548 has a phase of 120 degrees at the same point. In the third projector pattern, the sinusoid intensity 550 may have a phase of 240 degrees at the same point. This is equivalent to saying that the sinusoidal pattern is shifted to the left (or right) by one-third of a period in each step. A phase shift method is used to determine a phase of the projected light at each camera pixel, which eliminates the need to consider information from adjacent pixels as in the coded-pattern single-shot case. Many methods may be used to determine the phase of a camera pixel. One method involves performing a multiply and accumulate procedure and then taking an arctangent of a quotient. This method is well known to those of ordinary skill in the art and is not discussed further. In addition, with the phase shift method, the background light cancels out in the calculation of phase. For these reasons, the value Z calculated for a given pixel is usually more accurate than the value Z calculated using a coded-pattern single-shot method. However, with a single collection of sinusoidal patterns such as those shown in FIG. 16, all of the calculated phases vary from 0 to 360 degrees. For a particular structured-light triangulation system, these calculated phases may be adequate if the "thickness" of the object under test does not vary by too much, because the angle for each projected stripe is known ahead of time. However, if the object is too thick, an ambiguity may arise in the phase calculated for a particular pixel, since the light received by that pixel may have come from a first projected ray striking the object at a first position or from a second projected ray striking the object at a second position. In other words, if there is a possibility that the phase may vary by more than 2π radians for any pixel in the camera array, then the phases may not be properly decoded and the desired one-to-one correspondence is not achieved.
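As an illustration of the multiply-and-accumulate step mentioned above, the following sketch computes a per-pixel wrapped phase from N equally spaced phase-shifted images. It is a generic textbook implementation, assuming shifts of 2π/N, and is not the specification's particular algorithm.

```python
import numpy as np

def phase_from_shifts(images):
    """Per-pixel wrapped phase from N phase-shifted camera images.

    images: array of shape (N, H, W), where frame k was captured with
    the projected sinusoid shifted by 2*pi*k/N radians. Multiply and
    accumulate against the sines and cosines of the shifts, then take
    the arctangent of the quotient; a constant background term cancels
    in both sums.
    """
    n = images.shape[0]
    deltas = 2.0 * np.pi * np.arange(n) / n
    s = np.tensordot(np.sin(deltas), images, axes=1)  # sum_k I_k sin(delta_k)
    c = np.tensordot(np.cos(deltas), images, axes=1)  # sum_k I_k cos(delta_k)
    return np.arctan2(-s, c)                          # wrapped phase in (-pi, pi]
```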



FIG. 17A shows a sequence 1-4 of projected gray-code intensities 554 according to a method by which the ambiguity may be eliminated in the distance Z based on a calculated phase. A collection of gray code patterns is projected sequentially onto the object. In the example shown, there are four sequential patterns, indicated by 1, 2, 3, 4 to the left side of 554 in FIG. 17A. The sequential pattern 1 has dark (black) on the left half of the pattern (elements 0-15) and bright (white) on the right half of the pattern (elements 16-31). The sequential pattern 2 has a dark band toward the center (elements 8-23) and bright bands toward the edges (elements 0-7, 24-31). The sequential pattern 3 has two separated bright bands near the center (elements 4-11, 20-27) and three dark bands (elements 0-3, 12-19, 28-31). The sequential pattern 4 has four separated dark bands (elements 2-5, 10-13, 18-21, 26-29) and five separated bright bands (elements 0-1, 6-9, 14-17, 22-25, 30-31). For any given pixel in the camera, this sequence of patterns enables the "object thickness region" of the object to be narrowed by a factor of 16 compared to an initial object thickness region corresponding to all of the elements 0 to 31.
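The four-pattern sequence above follows a reflected-binary Gray code, so each camera pixel's sequence of bright/dark readings can be decoded into a band index with the standard Gray-to-binary conversion. The sketch below is illustrative only; the per-pattern bright/dark polarity is part of the code definition and is assumed here rather than stated in the specification.

```python
def gray_to_band(bits):
    """Decode a Gray-code reading into a band index.

    bits: 0/1 values, one per sequential pattern, most significant
    first (assuming the polarity convention of each pattern is known
    and normalized so that 1 means the coded state). Four patterns over
    32 elements yield 16 distinguishable bands of two elements each,
    matching the factor-of-16 narrowing described above.
    """
    band, b = 0, 0
    for bit in bits:
        b ^= bit                  # Gray-to-binary: XOR with the previous bit
        band = (band << 1) | b
    return band
```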


In another method 556 illustrated in FIG. 17C, a phase shift method similar to the method of FIG. 16 is performed. In the embodiment shown in FIG. 17C, a pattern 556A having four sinusoidal periods is projected onto an object. For reasons discussed hereinabove, there may be an ambiguity in the distance Z to an object measured using the pattern of FIG. 17C. One way to reduce or eliminate the ambiguity is to project one or more additional sinusoidal patterns 556B, 556C, each pattern having a different fringe period (pitch). So, for example, in FIG. 17B, a second sinusoidal pattern 555 having three fringe periods rather than four fringe periods is projected onto an object. In an embodiment, the difference in the phases of the two patterns 555, 556 may be used to help eliminate an ambiguity in the distance Z to the target.
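One common way to exploit the phase difference of two fringe periods is the two-wavelength (heterodyne) approach: with four and three fringes across the field, the difference phase completes exactly one cycle over the whole field and is therefore unambiguous. The sketch below illustrates that idea under those assumptions; it is not the specification's prescribed procedure, and practical implementations must handle phase noise near the wrap boundaries.

```python
import numpy as np

def unwrap_two_frequency(phi_a, phi_b, n_a=4, n_b=3):
    """Resolve the fringe-order ambiguity using two fringe periods.

    phi_a, phi_b: wrapped phases (radians) of patterns with n_a and n_b
    fringe periods across the field (4 and 3, as in FIGS. 17B-17C).
    Because n_a - n_b = 1, the difference phase spans one period over
    the field and pins down the fringe order of phi_a.
    """
    beat = np.mod(phi_a - phi_b, 2.0 * np.pi)      # one cycle across the field
    position = beat / (2.0 * np.pi)                # fractional field position, 0..1
    order = np.round((position * n_a * 2.0 * np.pi - phi_a) / (2.0 * np.pi))
    return phi_a + 2.0 * np.pi * order             # unwrapped phase of pattern A
```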


Another method for eliminating ambiguity is to use a different type of method, such as the gray code method of FIG. 17A for example, to eliminate the ambiguity in the distances Z calculated using the sinusoidal phase shift method.


In applications where the object and device 500 are in relative motion, it may be desirable to use a single pattern that allows the camera 510 to capture an image that provides sufficient information to measure the three dimensional characteristics of the object 501 without having to project sequential images. Referring now to FIG. 18 and FIG. 19, patterns 558, 566 have a distribution of colors that may in some cases enable measurement of the object to be based on a single (coded) image. In the embodiment of FIG. 18, the pattern 558 uses lines having a continuously spatially varying wavelength of light to create a pattern where the color changes continuously from blue to green to yellow to red to fuchsia for example. Thus for each particular spectral wavelength, a one-to-one correspondence may be made between the emitted image and the imaged pattern. With the correspondence established, the three-dimensional coordinates of the object 501 may be determined from a single imaged pattern. In one embodiment, the stripes of the pattern 558 are oriented perpendicular to the epipolar lines on the projector plane. Since the epipolar lines on the projector plane map into epipolar lines on the camera image plane, it is possible to obtain an association between projector points and camera points by moving along the direction of epipolar lines in the camera image plane and noting the color of the line in each case. It should be appreciated that each pixel in the camera image plane corresponds to a two-dimensional angle. The color enables determination of the one-to-one correspondence between particular projection angles and particular camera angles. This correspondence information, combined with the distance between the camera and the projector (the baseline distance D) and the angles of the camera and projector relative to the baseline, is sufficient to enable determination of the distance Z from the camera to the object.


Another embodiment using color patterns is shown in FIG. 19. In this embodiment, a plurality of colored patterns having varying intensities 560, 562, 564 are combined to create a color pattern 566. In one embodiment, the plurality of colored pattern intensities 560, 562, 564 are primary colors, such that pattern 560 varies the intensity of the color red, pattern 562 varies the intensity of the color green, and pattern 564 varies the intensity of the color blue. Since the ratios of colors are known, the resulting emitted image has a known relationship that may be decoded in the imaged pattern. As with the embodiment of FIG. 18, once the correspondence is established, the three-dimensional coordinates of the object 501 may be determined. Unlike the pattern of FIG. 18, in which a single cycle of unique colors is projected, the pattern of FIG. 19 projects three complete cycles of nearly identical colors. With the pattern of FIG. 18, there is little possibility of ambiguity in the measured distance Z (at least for the case in which the projected lines are perpendicular to epipolar lines), since each camera pixel recognizes a particular color that corresponds uniquely to a particular projection direction. Since the camera angle and projection angles are known, triangulation may be used to determine the three-dimensional object coordinates at each pixel position using only a single camera image. Hence the method of FIG. 18 may be considered to be a coded, single-shot method. In contrast, in FIG. 19, there is a chance of ambiguity in the distance Z to an object point. For example, if the camera sees the color purple, the projector may have projected any of three different angles. Based on the triangulation geometry, three different distances Z are possible. If the thickness of the object is known ahead of time to be within a relatively small range of values, then it may be possible to eliminate two of the values, thereby obtaining three-dimensional coordinates in a single shot. In the general case, however, it would be necessary to use additional projected patterns to eliminate the ambiguity. For example, the spatial period of the colored pattern may be changed, and the new pattern then used to illuminate the object a second time. In this instance, this method of projected structured light is considered to be a sequential method rather than a coded, single-shot method.


Referring now to FIGS. 20-23, coded structured light patterns for a single image acquisition are shown based on a stripe indexing technique. In the embodiments of FIG. 20 and FIG. 21, patterns having color stripes 568, 570 are emitted by the projector 508. This technique utilizes a characteristic of image sensors wherein the sensor has three independent color channels, such as red, green, blue or cyan, yellow, magenta for example. The combinations of the values generated by these sensor channels may produce a large number of colored patterns. As with the embodiment of FIG. 19, the ratio of the color distribution is known, therefore the relationship between the emitted pattern and the imaged pattern may be determined and the three-dimensional coordinates calculated. Still other types of colored patterns may be used, such as a pattern based on the De Bruijn sequence. The stripe indexing techniques and the De Bruijn sequence are well known to those of ordinary skill in the art and so are not discussed further.
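The De Bruijn sequence mentioned above underlies many stripe-indexing codes because every window of n consecutive symbols occurs exactly once. As a hedged illustration of how such a stripe color sequence could be generated, the following is the standard Lyndon-word construction, not the specification's own procedure.

```python
def de_bruijn(k, n):
    """B(k, n): a cyclic sequence over k symbols in which every
    length-n window appears exactly once. Useful for stripe indexing,
    since any n consecutive stripe colors identify a unique position.
    """
    a = [0] * (k * n)
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])   # emit the next Lyndon word
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq

# Example: de_bruijn(3, 3) yields a 27-symbol sequence over three
# colors in which any three adjacent stripes locate themselves uniquely.
```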


In the embodiments of FIG. 22 and FIG. 23, a non-color stripe indexing technique is used. In the embodiment of FIG. 22, the pattern 572 provides groups of stripes having multiple intensity (gray-scale) levels and different widths. As a result, a particular group of stripes within the overall image has a unique gray-scale pattern. Due to the uniqueness of the groups, a one-to-one correspondence may be determined between the emitted pattern and the imaged pattern to calculate the coordinates of the object 501. In the embodiment of FIG. 23, the pattern 574 provides a series of stripes having a segmented pattern. Since each line has a unique segment design, the correspondence may be determined between the emitted pattern and the imaged pattern to calculate the coordinates of the object 501. In FIGS. 20-23, additional advantages may be gained by orienting the projected lines 572, 574 perpendicular to epipolar lines in the camera plane, since this simplifies the determination of the one-to-one correspondence between camera and projector patterns.


Referring now to FIGS. 24-27, coded structured light patterns are shown that use a two-dimensional spatial grid pattern technique. These types of patterns are arranged such that a sub-window, such as window 576 on pattern 578 for example, is unique relative to other sub-windows within the pattern. In the embodiment of FIG. 24, a pseudo random binary array pattern 578 is used. The pattern 578 uses a grid with elements, such as circles 579 for example, that form the coded pattern. It should be appreciated that elements having other geometric shapes may also be used, such as but not limited to squares, rectangles, and triangles for example. In the embodiment of FIG. 25, a pattern 580 is shown of a multi-valued pseudo random array wherein each of the numerical values has an assigned shape 582. These shapes 582 form a unique sub-window 584 that allows for correspondence between the emitted pattern and the imaged pattern to calculate the coordinates of the object 501. In the embodiment of FIG. 26, the grid 586 is color coded with stripes perpendicular to the epipolar lines on the projector plane. The pattern of FIG. 26 will not necessarily provide a pattern that can be decoded in a single shot, but the color information may help to simplify the analysis. In the embodiment of FIG. 27, an array 588 of colored shapes, such as squares or circles for example, is used to form the pattern.


Referring now to FIGS. 28A-28B, an exemplary sinusoidal pattern 720 is shown. In an embodiment, the lines 734 are perpendicular to epipolar lines on the projector plane. The sinusoidal pattern 720 is made up of thirty lines 722, which are repeated once to give a total of sixty lines 722. Each line 722 has a sinusoidal feature 723 that is approximately 180 degrees out of phase with the line above and the line below. This allows the lines 722 to be placed as close together as possible and also provides a greater depth of field, because the lines can blur on the projected surface or in the acquired image and still be recognized. Each single line 722 can be uniquely decoded using just the phase of that line, provided the imaged line length is at least one wavelength of the sinusoid.


Since the pattern 720 is repeated, it would generally cause ambiguities in the line identification. However, this problem is resolved in this system through the geometry of the camera's field of view and depth of field. For a single view of the camera, i.e., a row of pixels, no two lines with the same phase can be imaged within the depth of field in which the lines can be optically resolved. For example, the first row of pixels on the camera can only receive reflected light from lines 1-30 of the pattern, whereas further down the camera sensor, another row will only receive reflected light from lines 2-31 of the pattern, and so on. FIG. 28B shows an enlarged portion of the pattern 720 with three lines, where the phase difference between consecutive lines 722 is approximately 180 degrees. It also shows how the phase of each single line is sufficient to uniquely decode the lines.


Referring now to FIGS. 29A-29B, another pattern 730 is shown having square pattern elements. In an embodiment, the lines 732 are perpendicular to epipolar lines on the projector plane. The square pattern 730 contains twenty-seven lines 732 before the pattern 730 repeats, with a total of 59 lines. The code elements 734 of pattern 730 are distinguished by the phase of the square wave from left to right in FIG. 29B. The pattern 730 is encoded such that a group of sequential lines 732 is distinguished by the relative phases of its members. Within the image, sequential lines are found by scanning vertically for the lines. In an embodiment, scanning vertically means scanning along epipolar lines in the camera image plane. Sequential lines within a camera vertical pixel column are paired together and their relative phases are determined. Four sequential paired lines are required to decode the group of lines and locate them within the pattern 730. There is also an ambiguity in this pattern 730 due to the repeat, but it is solved in the same manner as discussed above with respect to the sinusoidal pattern 720. FIG. 29B shows an enlarged view of four lines 732 of the square pattern. This embodiment shows that the phase of a single line 732 alone is not able to uniquely decode a line, because the first and third lines have the same absolute phase.


This approach of coding the relative phases rather than the absolute phases provides an advantage in that there is a higher tolerance for the positions of the phases. Minor errors in the construction of the projector may cause the phases of the lines to shift throughout the depth of field of the camera, and errors due to the projector and camera lenses make an absolute phase much more difficult to determine. These effects can be overcome in the absolute-phase method only by increasing the period until it is sufficiently large to overcome the error in determining the phase.


It should be appreciated that for the case of a two-dimensional pattern that projects a coded pattern of light, the three non-collinear pattern elements are recognizable because of their codes, and since they are projected in two dimensions, the at least three pattern elements are non-collinear. For the case of the periodic pattern, such as the sinusoidally repeating pattern, each sinusoidal period represents a plurality of pattern elements. Since there is a multiplicity of periodic patterns in two dimensions, the pattern elements are non-collinear. In contrast, for the case of the laser line scanner that emits a line of light, all of the pattern elements lie on a straight line. Although the line has width and the tail of the line cross section may have less optical power than the peak of the signal, these aspects of the line are not evaluated separately in finding surface coordinates of an object and therefore do not represent separate pattern elements. Although the line may contain multiple pattern elements, these pattern elements are collinear.


Further, the various pattern techniques may be combined as shown in FIGS. 30-31 to form either a binary (FIG. 30) checkerboard uncoded pattern 590 or a colored (FIG. 31) checkerboard uncoded pattern 592. In still another embodiment shown in FIG. 32, a photometric stereo technique may be used where a plurality of images 594 are taken on the object 501 where the light source 596 is moved to a plurality of locations.


Referring now to FIG. 33, another embodiment is shown of a system 700 for acquiring three-dimensional coordinates of an object 702. In this embodiment, the device 704 is independently operable when detached from the AACMM 100. The device 704 includes a controller 706 and an optional display 708. The display 708 may be integrated in the housing of the device 704 or may be a separate component that is coupled to the device 704 when it is used independently from the AACMM 100. In embodiments where the display 708 is separable from the device 704, the display 708 may include a controller (not shown) that provides additional functionality to facilitate independent operation of the device 704. In one embodiment, the controller 706 is disposed within the separable display.


The controller 706 includes a communications circuit configured to wirelessly transmit data, such as images or coordinate data, via a communications link 712 to the AACMM 100, to a separate computing device 710, or to a combination of both. The computing device 710 may be, but is not limited to, a computer, a laptop, a tablet computer, a personal digital assistant (PDA), or a cell phone, for example. The display 708 may allow the operator to see the acquired images or the point cloud of acquired coordinates of the object 702. In one embodiment, the controller 706 decodes the patterns in the acquired image to determine the three-dimensional coordinates of the object. In another embodiment, the images are acquired by the device 704 and transmitted to the AACMM 100, the computing device 710, or a combination of both.


The device 704 may further include a location device assembly 714. The location device assembly 714 may include one or more inertial navigation sensors, such as a Global Positioning System (GPS) sensor, a gyroscopic sensor, or an accelerometer sensor. Such sensors may be electrically coupled to the controller 706. Gyroscopic and accelerometer sensors may be single-axis or multiple-axis devices. The location device assembly 714 is configured to allow the controller 706 to measure or maintain the orientation of the device 704 when detached from the AACMM 100. A gyroscope within the location device assembly 714 may be a MEMS gyroscopic device, a solid-state ring-laser device, a fiber optic gyroscope, or another type.


When the device 704 is removed from the articulated arm CMM 100, a method is used to combine images obtained from multiple scans. In an embodiment, the images are each obtained by using coded patterns so that only a single image is needed to obtain three-dimensional coordinates associated with a particular position and orientation of the device 704. One way to combine multiple images captured by the device 704 is to provide at least some overlap between adjacent images so that point cloud features may be matched. This matching function may be assisted by the inertial navigation devices described above.
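Once corresponding point cloud features in the overlap region have been matched, one common way to register the two scans is the closed-form SVD (Kabsch) solution for the best-fit rigid transform. The following is a minimal sketch of that general technique under the assumption of already-matched feature pairs; it is illustrative and not the device's own registration code.

```python
import numpy as np

def rigid_registration(p, q):
    """Best-fit rotation R and translation t mapping points p onto q.

    p, q: (N, 3) arrays of matched feature points from two overlapping
    scans. Classic Kabsch/SVD least-squares solution.
    """
    cp, cq = p.mean(axis=0), q.mean(axis=0)
    h = (p - cp).T @ (q - cq)                  # 3x3 cross-covariance
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))     # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = cq - r @ cp
    return r, t                                # r @ p_i + t approximates q_i
```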


Another method that can be used to assist in accurate registration of images collected by the device 704 is the use of reference markers. In an embodiment, the reference markers are small markers having an adhesive or sticky backing, for example, circular markers that are placed on an object or objects being measured. Even a relatively small number of such markers can be useful in registering multiple images, especially if the object being measured has a relatively small number of features to use for registration. In an embodiment, the reference markers may be projected as spots of light onto the object or objects under inspection. For example, a small portable projector capable of emitting a plurality of small dots may be placed in front of the object or objects to be measured. An advantage of projected dots over sticky dots is that the dots do not have to be attached and later removed.


In one embodiment, the device projects the structured light over a contiguous and enclosed area 716 and can acquire an image over the area 716 at a range of 100 mm to 300 mm with an accuracy of 35 microns. In an embodiment, the perpendicular area 716 of projection is approximately 150 to 200 mm2. The camera or cameras 510 may be digital cameras having a 1.2 to 5.0 megapixel CMOS or CCD sensor.


Referring to FIG. 28 and FIG. 29, the process of decoding a coded pattern will be described. The first step in decoding an image of the pattern is to extract the centers of gravity (cog) 724 (FIG. 28C) of the projected pattern 720 features in the Y direction. This is carried out by calculating a moving average of the pixel grayscale values while moving downwards in the Y direction, processing a single column at a time. When a pixel value in an image rises above the moving average value, a starting point for a feature is found. After a starting point is found, the width of the feature continues to increase until a pixel value falls below the moving average value. A weighted average is then calculated using the pixel values and their Y positions between the start and end points to give the cog 724 of the pattern feature 723 in the image. The distances between the start and end points are also recorded for later use.
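The following sketch illustrates this moving-average thresholding and weighted-average step for a single image column. The window length is an assumed tuning parameter, and features still open at the end of a column are ignored here; these are implementation details the description above does not specify.

```python
import numpy as np

def extract_cogs(column, window=15):
    """Centers of gravity of bright features along one image column.

    column: 1-D array of grayscale values in the Y direction.
    Returns a list of (cog_y, width) pairs, mirroring the start/end
    bookkeeping described above.
    """
    kernel = np.ones(window) / window
    avg = np.convolve(column, kernel, mode="same")   # moving average
    above = column > avg
    cogs, start = [], None
    for y, flag in enumerate(above):
        if flag and start is None:
            start = y                                # feature begins
        elif not flag and start is not None:
            vals = column[start:y].astype(float)
            pos = np.arange(start, y)
            cogs.append((float((vals * pos).sum() / vals.sum()), y - start))
            start = None                             # feature ends
    return cogs
```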


The resulting cogs 724 are used next to find the pattern lines 722. This is done by moving in a left to right direction (when viewed from the direction shown in the FIGS.), starting with the first column of the image. For each cog 724 in this column, the neighboring column to the immediate right is searched for a cog 724 that is within a particular distance. If two matching cogs 724 are found, then a potential line has been determined. As the process moves across the image, more new lines are determined and previously determined lines are extended in length as additional cogs 724 are detected within the tolerance. Once the entire image has been processed, a filter is applied to the extracted lines to ensure that only lines of a desired length, which is the wavelength of the pattern, are used in the remaining steps. FIG. 28C also shows the detected lines, all of which are longer than a single wavelength of the pattern. In one embodiment, there is no delta, or only a small delta, between the cogs of neighboring columns.


The next step in the decoding process is to extract the projected pattern features along the lines in the X direction, in the form of block centers. Each pattern contains both wide blocks and narrow blocks. In the sinusoidal pattern 720, these correspond to the peaks and valleys of the wave; in the square pattern 730, they correspond to the wide squares and the narrow squares. This process proceeds in a similar fashion to extracting the features in the Y direction; however, the moving average is calculated using the widths found in the first stage, and the direction of movement is along the line. As described above, features are extracted in the areas where the widths are above the moving average value, but in this process features are also extracted in the areas where the widths are below the moving average. The widths and X positions are used to calculate a weighted average to find the center of the block 726 in the X direction. The Y positions of the cogs 724 between moving average crossings are also used to calculate a center for the block 726 in the Y direction; this is carried out by taking the average of the Y coordinates of the cogs. The start and end points of each line are also modified based on the features extracted in this step, to ensure that both points lie where the crossings of the moving average occur. In one embodiment, only complete blocks are used in later processing steps.


The lines and blocks are then processed further to ensure that the distances between the block centers 726 on each line are within a predetermined tolerance. This is accomplished by taking the delta between the X center positions of two neighboring blocks on a line and checking that the delta is below the tolerance. If the delta is above the tolerance, then the line is broken up into smaller lines. If the break is required between the last two blocks on a line, then the last block is removed and no additional line is created. If the break is required between the first and second or the second and third blocks on a line, then the blocks to the left of the break are also discarded and no additional line is created. For situations where the break occurs in any other place along the line, the line is broken into two: a new line is created and the appropriate blocks are transferred to it. After this stage of processing, the two patterns require different steps to finish decoding.


The sinusoidal pattern 720 may now be decoded with one additional processing step using the block centers on the lines. For each block on a line 722, the modulus of the block X center with respect to the wavelength of the pattern 720 is calculated, and the average of these values gives the phase of the line 722. The phase of the line 722 may then be used to decode the line in the pattern 720, which in turn allows for the determination of an X, Y, Z coordinate position for all cogs 724 on that line 722.
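A minimal sketch of this phase computation is shown below, assuming the block X centers and the pattern wavelength are expressed in the same pixel units. Averaging the plain residues is adequate when they cluster away from the wrap point; a circular mean would otherwise be needed, a detail the description does not address.

```python
import numpy as np

def line_phase(block_x_centers, wavelength):
    """Phase of a sinusoidal-pattern line from its block centers.

    The modulus of each block X center with respect to the pattern
    wavelength is averaged to give the line's phase, which then indexes
    the line within the pattern.
    """
    residues = np.mod(np.asarray(block_x_centers, dtype=float), wavelength)
    return residues.mean()
```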


Before the square pattern 730 can be decoded, the lines 732 must first be connected vertically. This allows a group of lines to be identified, rather than just a single line as with the sinusoidal pattern. Connections 736 are found between lines 732 by using the blocks 734 and the cogs contained in each block calculated in the first stage of processing. The first cog in each block on a line 732 is tested to see if there is another cog directly below it in the same column. If there is no cog below, then there is no connection with another line at this point, and processing continues. If there is a cog below, then the Y distance between the two cogs is determined and compared to a desired maximum spacing between lines. If the distance is less than this value, the two lines are considered connected at that point; the connection 736 is stored, and processing continues on to the next block. In one embodiment, a line connection 736 is unique, such that no two lines will have more than one connection 736 between them.


The next step of processing for the square pattern 730 is the phase calculation between connected lines. Each pair of lines 732 is first processed to determine the length of overlap between them. In one embodiment, there is at least one wavelength of overlap between the pair of lines to allow the calculation of the relative phase. If the lines have the desired overlap, then the cog at the center of the area of overlap is found. The blocks 738 that contain the center cog and the cog directly below are determined, and the relative phase between the block X centers is calculated for that line connection. This process is repeated for all connections between lines. In one embodiment, the process is repeated in only the downwards direction in the Y axis, because the code is based on connections below lines rather than on connections above, or both. FIG. 29C shows the blocks 738 that could be used for calculating the relative phase for this set of lines. The relative phases in the embodiment of FIG. 29C are 3, 1, and 2, and these phases would be used in the final stage to decode the top line.


The next step in decoding the square pattern 730 is performing a look-up using the relative phases calculated in the previous step. Each line 732 is processed by tracking down the line connections 736 until a connection depth of four is reached. This depth is used because four is the number of phases needed to decode the line. At each level of the connection, a hash is determined using the relative phase between the lines 732. When the required connection depth is reached, the hash is used to look up the line code. If the hash returns a valid code, then this result is recorded and stored in a voting system. Every line 732 is processed in this way, and all connections that are of the desired depth and form a valid phase combination are used to generate a vote. The final step is then to determine which code received the most votes on each line 732 and to assign that code to the line 732. If no unique code received the most votes, then the line is not assigned a code. Once a code has been assigned, the line 732 is identified, and the X, Y, Z coordinate position for all cogs on that line 732 may be found.
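The sketch below illustrates this depth-limited walk with voting. The line object, its connection list, and the code table are hypothetical stand-ins for the pattern's lookup data, introduced only for illustration; the actual hashing scheme is not detailed in the description.

```python
from collections import Counter

def decode_line(line, code_table, depth=4):
    """Assign a code to a line by looking up relative-phase chains.

    line: object with a .connections list of (next_line, relative_phase)
    pairs (hypothetical structure). code_table: dict mapping a tuple of
    relative phases to a line code (hypothetical lookup data). Walks
    every downward chain of the given depth, votes for each valid
    lookup, and returns the winning code only if it is unique.
    """
    votes = Counter()

    def walk(node, phases):
        if len(phases) == depth:
            code = code_table.get(tuple(phases))
            if code is not None:
                votes[code] += 1           # each valid phase combination votes
            return
        for nxt, rel_phase in node.connections:
            walk(nxt, phases + [rel_phase])

    walk(line, [])
    if not votes:
        return None
    (best, n), *rest = votes.most_common(2)
    if rest and rest[0][1] == n:
        return None                        # tie: no unique winner
    return best
```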


Although the descriptions given above distinguish between line scanners and area (structured light) scanners based on whether three or more pattern elements are collinear, the intent of this criterion is to distinguish patterns projected as areas from patterns projected as lines. Consequently, patterns projected in a linear fashion, having information only along a single path, are still line patterns even though the one-dimensional pattern may be curved.


A difficulty sometimes encountered in making measurements with a triangulation scanner attached to the end of an articulated arm CMM is that edges are not very sharp. In other words, the edge may have a radius or a chamfer. Such edges may be edges of parts, holes in parts, or sharp aspects of other features. Problems with fuzzy or inaccurately located edges may be seen with line scanners or area scanners. Although the edges viewed in the two-dimensional (2D) image of a triangulation scanner may be sharp, the exact distance to the edge may be less certain. Near an edge, a single pixel may have a distance that is not clearly defined. For one portion of the light reflected into the pixel, the light may come from a flat surface. For another portion of the pixel, the distance may be that of the side or bottom of a hole, or it may be a faraway distance in the case of an edge of a part. In most cases, because of lens defocus, lens aberrations, and limited modulation transfer function (MTF), a plurality of pixels (rather than a single pixel) may correspond to a feature such as the edge of a hole. In this case, when the point in question is near an edge, the apparent distance for the pixel may not be determined as a single distance to a point on the object. Sometimes the term "mixed pixel" is used to refer to the case in which the distance ascribed to a single pixel in the final image is determined by a plurality of distances on the object. In such a case, the distance as determined by the triangulation scanner for the pixel in question may be a simple average of the distances over the extent of the pixel. In other cases, the distance as determined by the triangulation scanner may be a much different value, as for example when an "ambiguity range" is exceeded during a phase shift method of triangulation. In this case, the distance may be in error by an amount that is difficult to predict.


In accordance with one embodiment, a solution to this issue uses the sharp edges that appear in one or more 2D images of the feature being measured. In many cases, such edge features can be clearly identified in 2D images, for example based on textural shadings. These sharp edges may be determined in coordination with those surface coordinates that are determined accurately using the triangulation methods. By intersecting the projected rays that pass through the perspective center of the lens in the triangulation scanner with the 3D coordinates of the portion of the surface determined to relatively high accuracy by triangulation methods, the 3D coordinates of the edge features may be accurately determined.


It should be further appreciated that edges seen in an image are never perfectly sharp, so an imperfect edge discontinuity (for example, a fillet) must be relatively wide to be seen clearly by a camera. A position of an imperfect edge may still be calculated using methods discussed herein (for example, taking a centroid) to obtain an edge value at subpixel resolution. In other words, even though a camera will respond on a subpixel level to the width of an edge, the methods given here remain valid, as there is generally less uncertainty in the position of an edge determined from a 2D image than from a 3D image, which has a relatively higher amount of data noise when compared with 2D images. In some cases, the surfaces meet to form a substantially 90 degree angle. In other cases, the surfaces may meet with an intermediary surface that is angled less than 90 degrees (e.g., 45 degrees), such as a chamfer or a bevel for example. In other cases, there may be a curved intermediary surface, such as a fillet for example. In still other cases, the edge may be "broken," such as where the intersection of the surfaces has been worked with a file or rasp for example. The methods disclosed herein remain valid for edges having these characteristics. In some embodiments, empirical data may be collected to understand how the edge contrast changes in the captured image under prescribed lighting conditions.


With reference made to FIGS. 34-36, an example of the procedure described above is explained in more detail for an embodiment having an object with a hole. The camera 510 of the triangulation scanner 3400 captures the image of light projected by the projector 508 onto the surface of an object and reflected off the object surface. The reflected rays of light pass through the perspective center 3414 of the camera lens 3412 onto a photosensitive array 3416 within the camera. The photosensitive array sends an electrical signal to an electrical circuit board 3420 that includes a processor for processing digital image data. Using the methods of triangulation described hereinabove, the processor determines the 3D coordinates of each point on the object surface. It should be appreciated that the projected light may cover an area in a single projected image, or it may cover a more limited region such as a stripe or a dot. The comments made herein apply to each of these cases.


The method of combining the 2D image captured by a camera, which may in some embodiments be the camera 510 but in other cases a separate camera 3410, with the triangulated surface is to project the rays of light 3440, 3442 corresponding to the edges of the hole 3432A, 3432B, as captured by the photosensitive array 3416, from the corresponding points on the photosensitive array 3416 so that these rays intersect the edges of the surface 3430A, 3430B. This intersection determines the 3D edge coordinates.


This method may be more clearly understood by considering the example of an object 3600 having a flat region 3610 into which a hole 3620 is drilled. A region extends from the edge of the hole 3620 to a peripheral boundary 3622 in which there is a relatively high level of uncertainty because of the mixed pixel effects discussed above. An assumption is made, based on a priori knowledge of the part being investigated, that the edge (in this case, of a hole) is sharp and the surface is generally flat. Therefore, by projecting the 2D image of the hole through the lens perspective center onto the flat region having coordinates determined using triangulation, the 3D coordinates of the sharp edges of the hole may be determined to relatively high accuracy. In a similar manner, the 3D coordinates of any sort of sharp edge may be determined.
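The geometric core of this projection step is a ray-plane intersection: the ray from an edge pixel through the camera perspective center is intersected with the plane fit to the reliably triangulated points. The sketch below illustrates that computation under the flat-surface assumption stated above; the function and parameter names are illustrative only.

```python
import numpy as np

def edge_point_3d(pixel_ray, ray_origin, plane_point, plane_normal):
    """3D edge coordinates from a camera edge ray and the scanned surface.

    pixel_ray: unit direction of the ray from the edge pixel through the
    camera perspective center (ray_origin). plane_point / plane_normal:
    a plane fit to the accurately triangulated 3D points near the hole.
    """
    denom = pixel_ray @ plane_normal
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the surface plane")
    s = ((plane_point - ray_origin) @ plane_normal) / denom
    return ray_origin + s * pixel_ray

# The plane itself can come from a least-squares fit to the reliable 3D
# points, e.g., plane_point = pts.mean(axis=0) and plane_normal = the
# singular vector of (pts - plane_point) with the smallest singular value.
```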


In an embodiment, an uncertainty distance 3424 characteristic of the triangulation system is provided. In some cases, the uncertainty distance is based at least in part on the amount of noise observed in a region or on a measure of the "smoothness" of edges. In regions of high noise or low smoothness, the uncertainty distance may be increased. Other factors, such as the light level, which might be a level of ambient light or a level of illumination provided by the device 401, may also be considered in determining an appropriate uncertainty distance 3424.


A method 3700 for determining 3D coordinates of an edge point located on an edge feature, using a combination of a projector, a scanner camera, and an edge-detecting camera, is now described with reference to FIG. 37. In a step 3705, an AACMM is provided that includes a projector, a scanner camera, an edge-detecting camera, and a processor. The articulated arm CMM further includes mechanical elements such as an arm portion that is rotationally coupled to a base. Each arm segment includes at least one position transducer, which in most cases is an angular encoder. The position transducer produces a position signal, which is usually an angular reading. One end of the arm portion is attached to the base, and the other end is attached to a probe end. The projector, scanner camera, and edge-detecting camera are coupled to the probe end. The edge-detecting camera may be the same camera as the scanner camera or a camera different than the scanner camera. The projector has a projector perspective center through which rays from a first pattern of light pass in traveling to an object. The first pattern of light may be structured light of the type described hereinabove, the first pattern possibly being a coded or sequential pattern. Alternatively, the first pattern may be projected as a line of light or as a spot of light. The rays of light may arise from a pattern of light reflected from a MEMS array or generated by an individual light source that sends the light through suitable optical elements.


In a step 3710, an electronic circuit within the AACMM receives a position signal from the position transducers in the arm segments and sends a first electrical signal to the processor. In a step 3715, the projector emits a first pattern of light onto the object. In a step 3720, the scanner camera receives the first pattern of light reflected from the object. In response to receiving the reflected light, the scanner camera sends a second electrical signal to the processor.


In a step 3725, the edge-detecting camera receives a second light reflected from the object and sends a third electrical signal to the processor in response. A portion of the second light is reflected from an edge feature of the object, where the edge point is a point on the edge feature. The second light may come from a variety of sources. It may be ambient light coming from background light sources in the environment. The second light may be intentionally emitted by a light source element coupled to the probe end. The light source may provide a uniform illumination over the surface. The second light may be sent to the object at a different time than the first light pattern.


In a step 3730, the processor determines first 3D coordinates of first points on a surface of the object. These first 3D points are based at least in part on the first pattern of light from the projector and the second electrical signal, which arises from the image captured by the scanner camera. Using triangulation methods, the 3D coordinates of the first points on the surface are determined in the local frame of reference of the projector and scanner camera. By further including the first electrical signals, the position of the object surface in an AACMM frame of reference may be determined.


In a step 3735, the processor further determines a first ray, the first ray going from the edge-detecting camera to the object. The first ray is the ray that passes from the edge point through the perspective center of the edge-detecting camera. The processor determines the first ray based at least in part on the third electrical signal, which captures the edge in the image formed on a photosensitive array within the edge-detecting camera. In addition, the first ray is based on the first electrical signal, which is needed to determine the first ray within the AACMM frame of reference. The first ray may be represented as a vector within the AACMM frame of reference.


In a step 3740, the processor further determines 3D coordinates of the edge point based at least in part on an intersection of the first ray with the first 3D coordinates of the first surface. This may be done by determining a characteristic distance over which 3D data is considered less accurate than desired. The characteristic distance may be based on a rule associated with a given system, or it may be based on image quality, for example, jagged edges or noise in 3D points near the edge. The general approach is to mathematically project a smooth surface (characterized by 3D points) along a continuing path across the characteristic distance until the smooth surface intersects the first ray. In most cases, a large number of first rays are determined for edge points along an edge feature and projected to intersect the projection of the smooth surface, thereby enabling more accurate determination of 3D points on and near the edge feature. In a step 3745, the 3D coordinates of the edge point are stored.


While the invention has been described with reference to example embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from the essential scope thereof. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed as the best mode contemplated for carrying out this invention, but that the invention will include all embodiments falling within the scope of the appended claims. Moreover, the use of the terms first, second, etc. do not denote any order or importance, but rather the terms first, second, etc. are used to distinguish one element from another. Furthermore, the use of the terms a, an, etc. do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item.

Claims
  • 1. A portable articulated arm coordinate measuring machine (AACMM) for measuring three-dimensional (3D) coordinates of an object in space, comprising: a processor; a noncontact 3D measuring device operably coupled to the processor, the noncontact 3D measuring device having a projector and a scanner camera, the projector configured to emit a first pattern of light onto the object, the scanner camera arranged to receive the first pattern of light reflected from the object and to send a first electrical signal to the processor in response; an edge-detecting camera operably coupled to the processor, the edge-detecting camera being one of the scanner camera or a second camera different than the scanner camera, the edge-detecting camera positioned to receive during operation a second light reflected from an edge feature of the object and to send a second electrical signal to the processor in response; and the processor being responsive to non-transitory computer readable instructions, the computer readable instructions comprising: determining a first 3D coordinates of first points on a surface of the object based at least in part on the first pattern of light from the projector and the first electrical signal; determining a first ray from the edge-detecting camera to the object, the first ray based at least in part on the second electrical signal; and determining a second 3D coordinates of an edge point of the edge feature, the second 3D coordinates based at least in part on an intersection of the first ray with the first 3D coordinates of the surface.
  • 2. The AACMM of claim 1 wherein the first pattern of light is a line of light.
  • 3. The AACMM of claim 1 wherein the first pattern of light is a coded structured light pattern.
  • 4. The AACMM of claim 1 wherein the second light reflected from the object is in response to ambient light that falls on the object.
  • 5. The AACMM of claim 1 wherein the second light reflected from the object is in response to light provided by a light source, the light provided by the light source being substantially uniform.
  • 6. The AACMM of claim 1 wherein the projector has a projector perspective center and the scanner camera has a scanner camera perspective center, rays of light from the projector passing through the projector perspective center, rays from a second pattern of light reflected off the object passing through the scanner camera perspective center, a baseline distance being a distance from the projector perspective center to the scanner camera perspective center, wherein the first 3D coordinates are further based on the baseline distance.
  • 7. A method for measuring an edge point with a portable articulated arm coordinate measuring machine (AACMM), the method comprising: providing the AACMM, the AACMM including a processor and a noncontact 3D measuring device operably coupled to the processor, the noncontact 3D measuring device having a projector and a scanner camera, the AACMM further including an edge-detecting camera operably coupled to the processor, the edge-detecting camera being one of the scanner camera or a second camera different than the scanner camera; emitting from the projector a first pattern of light onto the object; receiving with the scanner camera the first pattern of light reflected from the object and sending a first electrical signal to the processor in response; receiving with the edge-detecting camera a second light reflected from an edge feature of the object and sending a second electrical signal to the processor in response, the edge feature having an edge point, the edge point being a point on the edge feature; determining with the processor first 3D coordinates of first points on a surface of the object, the first 3D coordinates based at least in part on the first pattern of light from the projector and the first electrical signal; further determining with the processor a first ray from the edge-detecting camera to the object, the first ray based at least in part on the second electrical signal; further determining with the processor second 3D coordinates of the edge point based at least in part on an intersection of the first ray with the first 3D coordinates of the surface; and storing the second 3D coordinates of the edge point.
CROSS-REFERENCE TO RELATED APPLICATIONS

The present application is a continuation application of U.S. patent application Ser. No. 14/485,876 filed on Sep. 15, 2014, now U.S. Pat. No. 9,607,239. The 14/485,876 application, now U.S. Pat. No. 9,607,239, is a continuation-in-part of U.S. patent application Ser. No. 13/491,176 filed Jun. 7, 2012, now U.S. Pat. No. 8,832,954, which is a continuation-in-part of U.S. patent application Ser. No. 13/006,507 filed Jan. 14, 2011, now U.S. Pat. No. 8,533,967, and claims the benefit of provisional application number 61/296,555 filed Jan. 20, 2010, provisional application number 61/355,279 filed Jun. 16, 2010, and provisional application number 61/351,347 filed on Jun. 4, 2010, the contents of which are hereby incorporated by reference in their entirety.

US Referenced Citations (792)
Number Name Date Kind
1535312 Hosking Apr 1925 A
1538758 Taylor May 1925 A
1918813 Kinzy Jul 1933 A
2316573 Egy Apr 1943 A
2333243 Glab Nov 1943 A
2702683 Green et al. Feb 1955 A
2748926 Leahy Jun 1956 A
2983367 Paramater et al. Jun 1958 A
2924495 Haines Sep 1958 A
2966257 Littlejohn Dec 1960 A
3066790 Armbruster Dec 1962 A
3447852 Barlow Jun 1969 A
3458167 Cooley, Jr. Jul 1969 A
3830567 Riegl Aug 1974 A
3899145 Stephenson Aug 1975 A
3945729 Rosen Mar 1976 A
4138045 Baker Feb 1979 A
4178515 Tarasevich Dec 1979 A
4340008 Mendelson Jul 1982 A
4379461 Nilsson et al. Apr 1983 A
4424899 Rosenberg Jan 1984 A
4430796 Nakagawa Feb 1984 A
4457625 Greenleaf et al. Jul 1984 A
4506448 Topping et al. Mar 1985 A
4537233 Vroonland et al. Aug 1985 A
4561776 Pryor Dec 1985 A
4606696 Slocum Aug 1986 A
4659280 Akeel Apr 1987 A
4663852 Guarini May 1987 A
4664588 Newell et al. May 1987 A
4667231 Pryor May 1987 A
4676002 Slocum Jun 1987 A
4714339 Lau et al. Dec 1987 A
4733961 Mooney Mar 1988 A
4736218 Kutman Apr 1988 A
4751950 Bock Jun 1988 A
4767257 Kato Aug 1988 A
4790651 Brown et al. Dec 1988 A
4816822 Vache et al. Mar 1989 A
4870274 Hebert et al. Sep 1989 A
4882806 Davis Nov 1989 A
4891509 Jones et al. Jan 1990 A
4954952 Ubhayakar Sep 1990 A
4982841 Goedecke Jan 1991 A
4984881 Osada et al. Jan 1991 A
4996909 Vache et al. Mar 1991 A
4999491 Semler et al. Mar 1991 A
5021641 Swartz et al. Jun 1991 A
5025966 Potter Jun 1991 A
5027951 Johnson Jul 1991 A
5068971 Simon Dec 1991 A
5069524 Watanabe et al. Dec 1991 A
5155684 Burke et al. Oct 1992 A
5168532 Seppi et al. Dec 1992 A
5189797 Granger Mar 1993 A
5205111 Johnson Apr 1993 A
5211476 Coudroy May 1993 A
5212738 Chande et al. May 1993 A
5213240 Dietz et al. May 1993 A
5216479 Dotan et al. Jun 1993 A
5218427 Koch Jun 1993 A
5219423 Kamaya Jun 1993 A
5239855 Schleifer et al. Aug 1993 A
5251156 Heier et al. Oct 1993 A
5289264 Steinbichler Feb 1994 A
5289265 Inoue et al. Feb 1994 A
5289855 Baker et al. Mar 1994 A
5313261 Leatham et al. May 1994 A
5319445 Fitts Jun 1994 A
5329347 Wallace et al. Jul 1994 A
5329467 Nagamune et al. Jul 1994 A
5332315 Baker et al. Jul 1994 A
5337149 Kozah et al. Aug 1994 A
5371347 Plesko Dec 1994 A
5372250 Johnson Dec 1994 A
5373346 Hocker Dec 1994 A
5402365 Kozikaro et al. Mar 1995 A
5402582 Raab Apr 1995 A
5412880 Raab May 1995 A
5416505 Eguchi et al. May 1995 A
5430384 Hocker Jul 1995 A
5446846 Lennartsson Aug 1995 A
5455670 Payne et al. Oct 1995 A
5455993 Link et al. Oct 1995 A
5510977 Raab Apr 1996 A
5517297 Stenton May 1996 A
5528354 Uwira Jun 1996 A
5528505 Granger et al. Jun 1996 A
5535524 Carrier et al. Jul 1996 A
5563655 Lathrop Oct 1996 A
5577130 Wu Nov 1996 A
5611147 Raab Mar 1997 A
5615489 Breyer et al. Apr 1997 A
5623416 Hocker, III Apr 1997 A
5629756 Kitajima May 1997 A
5668631 Norita et al. Sep 1997 A
5675326 Juds et al. Oct 1997 A
5677760 Mikami et al. Oct 1997 A
5682508 Hocker, III Oct 1997 A
5716036 Isobe et al. Feb 1998 A
5724264 Rosenberg et al. Mar 1998 A
5734417 Yamamoto et al. Mar 1998 A
5745225 Watanabe et al. Apr 1998 A
5752112 Paddock et al. May 1998 A
5754449 Hoshal et al. May 1998 A
5768792 Raab Jun 1998 A
5793993 Broedner et al. Aug 1998 A
5804805 Koenck Sep 1998 A
5805289 Corby, Jr. et al. Sep 1998 A
5825666 Freifeld Oct 1998 A
5829148 Eaton Nov 1998 A
5831719 Berg et al. Nov 1998 A
5832416 Anderson Nov 1998 A
5844591 Takamatsu et al. Dec 1998 A
5856874 Tachibana et al. Jan 1999 A
5887122 Terawaki et al. Mar 1999 A
5894123 Ohtomo et al. Apr 1999 A
5898484 Harris Apr 1999 A
5898490 Ohtomo et al. Apr 1999 A
5909939 Fugmann Jun 1999 A
5926782 Raab Jul 1999 A
5933267 Ishizuka Aug 1999 A
5936721 Ohtomo et al. Aug 1999 A
5940170 Berg et al. Aug 1999 A
5940181 Tsubono et al. Aug 1999 A
5949530 Wetteborn Sep 1999 A
5956661 Lefebvre et al. Sep 1999 A
5956857 Raab Sep 1999 A
5969321 Danielson et al. Oct 1999 A
5973788 Pettersen et al. Oct 1999 A
5978748 Raab Nov 1999 A
5983936 Schw1eterman et al. Nov 1999 A
5988862 Kacyra et al. Nov 1999 A
5991011 Damm Nov 1999 A
5996790 Yamada et al. Dec 1999 A
5997779 Potter Dec 1999 A
6040898 Mrosik et al. Mar 2000 A
D423534 Raab et al. Apr 2000 S
6050615 Weinhold Apr 2000 A
6057915 Squire et al. May 2000 A
6060889 Hocker May 2000 A
6067116 Yamano et al. May 2000 A
6069700 Rudnick et al. May 2000 A
6077306 Metzger et al. Jun 2000 A
6112423 Sheehan Sep 2000 A
6115511 Sakai et al. Sep 2000 A
6125337 Rosenberg et al. Sep 2000 A
6131299 Raab et al. Oct 2000 A
6134507 Markey, Jr. et al. Oct 2000 A
6138915 Danielson et al. Oct 2000 A
6149112 Thieltges Nov 2000 A
6151789 Raab et al. Nov 2000 A
6163294 Talbot Dec 2000 A
6166504 Iida et al. Dec 2000 A
6166809 Pettersen et al. Dec 2000 A
6166811 Long et al. Dec 2000 A
6204651 Marcus et al. Mar 2001 B1
6204961 Anderson et al. Mar 2001 B1
6219928 Raab et al. Apr 2001 B1
D441632 Raab et al. May 2001 S
6240651 Schroeder et al. Jun 2001 B1
6253458 Raab et al. Jul 2001 B1
6282195 Miller et al. Aug 2001 B1
6285390 Blake Sep 2001 B1
6298569 Raab et al. Oct 2001 B1
6339410 Milner et al. Jan 2002 B1
6349249 Cunningham Feb 2002 B1
6366831 Raab Apr 2002 B1
6408252 De Smet Jun 2002 B1
6418774 Brogaardh et al. Jul 2002 B1
6438507 Imai Aug 2002 B1
6438856 Kaczynski Aug 2002 B1
6442419 Chu et al. Aug 2002 B1
6445446 Kumagai et al. Sep 2002 B1
6460004 Greer et al. Oct 2002 B2
6470584 Stoodley Oct 2002 B1
6477784 Schroeder et al. Nov 2002 B2
6480270 Studnicka et al. Nov 2002 B1
6483106 Ohtomo et al. Nov 2002 B1
6497394 Dunchock Dec 2002 B1
6504602 Hinderling Jan 2003 B1
6512575 Marchi Jan 2003 B1
6519860 Bieg et al. Feb 2003 B1
D472824 Raab et al. Apr 2003 S
6542249 Kofman et al. Apr 2003 B1
6547397 Kaufman et al. Apr 2003 B1
6598306 Eaton Jul 2003 B2
6611346 Granger Aug 2003 B2
6611617 Crampton Aug 2003 B1
D479544 Raab et al. Sep 2003 S
6612044 Raab et al. Sep 2003 B2
6621065 Fukumoto et al. Sep 2003 B1
6626339 Gates et al. Sep 2003 B2
6633051 Holloway et al. Oct 2003 B1
6649208 Rodgers Nov 2003 B2
6650402 Sullivan et al. Nov 2003 B2
6668466 Bieg et al. Dec 2003 B1
6675122 Markendorf et al. Jan 2004 B1
6681495 Masayuki et al. Jan 2004 B2
6710859 Shirai et al. Mar 2004 B2
D490831 Raab et al. Jun 2004 S
D491210 Raab et al. Jun 2004 S
6750873 Bernardini et al. Jun 2004 B1
6753876 Brooksby et al. Jun 2004 B2
6759649 Hipp Jul 2004 B2
6759979 Vashisth et al. Jul 2004 B2
6764185 Beardsley et al. Jul 2004 B1
6789327 Roth et al. Sep 2004 B2
6820346 Raab et al. Nov 2004 B2
6822749 Christoph Nov 2004 B1
6825923 Hamar et al. Nov 2004 B2
6826664 Hocker, III et al. Nov 2004 B2
6847436 Bridges Jan 2005 B2
6856381 Christoph Feb 2005 B2
6858836 Hartrumpf Feb 2005 B1
6859269 Ohtomo et al. Feb 2005 B2
6862097 Yanagisawa et al. Mar 2005 B2
6868359 Raab Mar 2005 B2
6879933 Steffey et al. Apr 2005 B2
6889903 Koenck May 2005 B1
6892465 Raab et al. May 2005 B2
6894767 Ishinabe et al. May 2005 B2
6895347 Dorny et al. May 2005 B2
6901673 Cobb et al. Jun 2005 B1
6904691 Raab et al. Jun 2005 B2
6914678 Ulrichsen et al. Jul 2005 B1
6917415 Gogolla et al. Jul 2005 B2
6920697 Raab et al. Jul 2005 B2
6922234 Hoffman et al. Jul 2005 B2
6925722 Raab et al. Aug 2005 B2
6931745 Granger Aug 2005 B2
6935036 Raab et al. Aug 2005 B2
6935748 Kaufman et al. Aug 2005 B2
6948255 Russell Sep 2005 B2
6957496 Raab et al. Oct 2005 B2
6965843 Raab et al. Nov 2005 B2
6973734 Raab et al. Dec 2005 B2
6988322 Raab et al. Jan 2006 B2
6989890 Riegl et al. Jan 2006 B2
7003892 Eaton et al. Feb 2006 B2
7006084 Buss et al. Feb 2006 B1
7024032 Kidd et al. Apr 2006 B2
7029126 Tang Apr 2006 B2
7032321 Raab et al. Apr 2006 B2
7040136 Forss et al. May 2006 B2
7051447 Kikuchi et al. May 2006 B2
7069124 Whittaker et al. Jun 2006 B1
7076420 Snyder et al. Jul 2006 B1
7106421 Matsuura et al. Sep 2006 B2
7117107 Dorny et al. Oct 2006 B2
7120092 del Prado Pavon et al. Oct 2006 B2
7127822 Kumagai et al. Oct 2006 B2
7136153 Mori et al. Nov 2006 B2
7140213 Feucht et al. Nov 2006 B2
7142289 Ando et al. Nov 2006 B2
7145926 Vitruk et al. Dec 2006 B2
7152456 Eaton Dec 2006 B2
7174651 Raab et al. Feb 2007 B2
7180072 Persi et al. Feb 2007 B2
7184047 Crampton Feb 2007 B1
7190465 Froehlich et al. Mar 2007 B2
7191541 Weekers et al. Mar 2007 B1
7193690 Ossig et al. Mar 2007 B2
7196509 Teng Mar 2007 B2
7199872 Van Cranenbroeck Apr 2007 B2
7200246 Cofer et al. Apr 2007 B2
7202941 Munro Apr 2007 B2
7230689 Lau Jun 2007 B2
7242590 Yeap et al. Jul 2007 B1
7246030 Raab et al. Jul 2007 B2
7249421 MacManus et al. Jul 2007 B2
7254262 Nehse et al. Aug 2007 B2
7256899 Faul et al. Aug 2007 B1
7269910 Raab et al. Sep 2007 B2
D551943 Hodjat et al. Oct 2007 S
7285793 Husted Oct 2007 B2
7296364 Seitz et al. Nov 2007 B2
7296955 Dreier Nov 2007 B2
7296979 Raab et al. Nov 2007 B2
7306339 Kaufman et al. Dec 2007 B2
7307701 Hoffman, II Dec 2007 B2
7312862 Zumbrunn et al. Dec 2007 B2
7313264 Crampton Dec 2007 B2
D559657 Wohlford et al. Jan 2008 S
7319512 Ohtomo et al. Jan 2008 B2
7330242 Reichert et al. Feb 2008 B2
7337344 Barman et al. Feb 2008 B2
7342650 Kern et al. Mar 2008 B2
7348822 Baer Mar 2008 B2
7352446 Bridges et al. Apr 2008 B2
7360648 Blaschke Apr 2008 B1
7372558 Kaufman et al. May 2008 B2
7372581 Raab et al. May 2008 B2
7383638 Granger Jun 2008 B2
7388654 Raab et al. Jun 2008 B2
7389870 Slappay Jun 2008 B2
7395606 Crampton Jul 2008 B2
7400384 Evans et al. Jul 2008 B1
7403268 England et al. Jul 2008 B2
7403269 Yamashita et al. Jul 2008 B2
7430068 Becker et al. Sep 2008 B2
7430070 Soreide et al. Sep 2008 B2
7441341 Eaton Oct 2008 B2
7443555 Blug et al. Oct 2008 B2
7447931 Rischar et al. Nov 2008 B1
7449876 Pleasant et al. Nov 2008 B2
7454265 Marsh Nov 2008 B2
7463368 Morden et al. Dec 2008 B2
7477359 England et al. Jan 2009 B2
7477360 England et al. Jan 2009 B2
7480037 Palmateer et al. Jan 2009 B2
7508496 Mettenleiter et al. Mar 2009 B2
7508971 Vaccaro et al. Mar 2009 B2
7515256 Ohtomo et al. Apr 2009 B2
7525276 Eaton Apr 2009 B2
7527205 Zhu et al. May 2009 B2
7528768 Wakayama et al. May 2009 B2
7541830 Fahrbach et al. Jun 2009 B2
7545517 Rueb et al. Jun 2009 B2
7546689 Ferrari et al. Jun 2009 B2
7551771 England, III Jun 2009 B2
7552644 Haase et al. Jun 2009 B2
7557824 Holliman Jul 2009 B2
7561598 Stratton et al. Jul 2009 B2
7564250 Hocker Jul 2009 B2
7568293 Ferrari Aug 2009 B2
7578069 Eaton Aug 2009 B2
D599226 Gerent et al. Sep 2009 S
7589595 Cutler Sep 2009 B2
7589825 Orchard et al. Sep 2009 B2
7591077 Pettersson Sep 2009 B2
7591078 Crampton Sep 2009 B2
7599106 Matsumoto et al. Oct 2009 B2
7600061 Honda Oct 2009 B2
7602873 Eidson Oct 2009 B2
7604207 Hasloecher et al. Oct 2009 B2
7610175 Eidson Oct 2009 B2
7614157 Granger Nov 2009 B2
7624510 Ferrari Dec 2009 B2
7625335 Deichmann et al. Dec 2009 B2
7626690 Kumagai et al. Dec 2009 B2
D607350 Cooduvalli et al. Jan 2010 S
7656751 Rischar et al. Feb 2010 B2
7659995 Knighton et al. Feb 2010 B2
D610926 Gerent et al. Mar 2010 S
7693325 Pulla et al. Apr 2010 B2
7697748 Dimsdale et al. Apr 2010 B2
7701592 Saint Clair et al. Apr 2010 B2
7712224 Hicks May 2010 B2
7721396 Fleischman May 2010 B2
7728833 Verma et al. Jun 2010 B2
7728963 Kirschner Jun 2010 B2
7733544 Becker et al. Jun 2010 B2
7735234 Briggs et al. Jun 2010 B2
7742634 Fujieda et al. Jun 2010 B2
7743524 Eaton et al. Jun 2010 B2
7752003 MacManus Jul 2010 B2
7756615 Barfoot et al. Jul 2010 B2
7765707 Tomelleri Aug 2010 B2
7769559 Reichert Aug 2010 B2
7774949 Ferrari Aug 2010 B2
7777761 England et al. Aug 2010 B2
7779548 Ferrari Aug 2010 B2
7779553 Jordil et al. Aug 2010 B2
7784194 Raab et al. Aug 2010 B2
7787670 Urushiya Aug 2010 B2
7793425 Bailey Sep 2010 B2
7798453 Maningo et al. Sep 2010 B2
7800758 Bridges et al. Sep 2010 B1
7804602 Raab Sep 2010 B2
7805851 Pettersson Oct 2010 B2
7805854 Eaton Oct 2010 B2
7809518 Zhu et al. Oct 2010 B2
7834985 Morcom Nov 2010 B2
7847922 Gittinger et al. Dec 2010 B2
RE42055 Raab Jan 2011 E
7869005 Ossig et al. Jan 2011 B2
RE42082 Raab et al. Feb 2011 E
7881896 Atwell et al. Feb 2011 B2
7889324 Yamamoto Feb 2011 B2
7891248 Hough et al. Feb 2011 B2
7900714 Milbourne et al. Mar 2011 B2
7903245 Miousset et al. Mar 2011 B2
7903261 Saint Clair et al. Mar 2011 B2
7908757 Ferrari Mar 2011 B2
7933055 Jensen et al. Apr 2011 B2
7935928 Serger et al. May 2011 B2
7965747 Kumano Jun 2011 B2
7982866 Vogel Jul 2011 B2
D643319 Ferrari et al. Aug 2011 S
7990397 Bukowski et al. Aug 2011 B2
7994465 Bamji et al. Aug 2011 B1
7995834 Knighton et al. Aug 2011 B1
8001697 Danielson et al. Aug 2011 B2
8020657 Allard et al. Sep 2011 B2
8022812 Beniyama et al. Sep 2011 B2
8028432 Bailey et al. Oct 2011 B2
8036775 Matsumoto et al. Oct 2011 B2
8045762 Otani et al. Oct 2011 B2
8051710 Van Dam et al. Nov 2011 B2
8052857 Townsend Nov 2011 B2
8064046 Ossig et al. Nov 2011 B2
8085861 Caputo Nov 2011 B2
8082673 Desforges et al. Dec 2011 B2
8099877 Champ Jan 2012 B2
8117668 Crampton et al. Feb 2012 B2
8123350 Cannell et al. Feb 2012 B2
8152071 Doherty et al. Apr 2012 B2
D659035 Ferrari et al. May 2012 S
8171650 York et al. May 2012 B2
8179936 Bueche et al. May 2012 B2
D662427 Bailey et al. Jun 2012 S
8218131 Otani et al. Jul 2012 B2
8224032 Fuchs et al. Jul 2012 B2
8260483 Barfoot et al. Sep 2012 B2
8269984 Hinderling et al. Sep 2012 B2
8276286 Bailey et al. Oct 2012 B2
8284407 Briggs et al. Oct 2012 B2
8310653 Ogawa et al. Nov 2012 B2
8321612 Hartwich et al. Nov 2012 B2
8346392 Walser et al. Jan 2013 B2
8346480 Trepagnier et al. Jan 2013 B2
8352212 Fetter et al. Jan 2013 B2
8353059 Crampton et al. Jan 2013 B2
D676341 Bailey et al. Feb 2013 S
8379191 Braunecker et al. Feb 2013 B2
8381704 Debelak et al. Feb 2013 B2
8384914 Becker et al. Feb 2013 B2
D678085 Bailey et al. Mar 2013 S
8391565 Purcell et al. Mar 2013 B2
8402669 Ferrari et al. Mar 2013 B2
8422035 Hinderling et al. Apr 2013 B2
8497901 Pettersson Jul 2013 B2
8533967 Bailey et al. Sep 2013 B2
8537374 Briggs et al. Sep 2013 B2
8619265 Steffey et al. Dec 2013 B2
8645022 Yoshimura et al. Feb 2014 B2
8659748 Dakin et al. Feb 2014 B2
8659752 Cramer et al. Feb 2014 B2
8661700 Briggs et al. Mar 2014 B2
8677643 Bridges et al. Mar 2014 B2
8683709 York Apr 2014 B2
8699007 Becker et al. Apr 2014 B2
8705012 Greiner et al. Apr 2014 B2
8705016 Schumann et al. Apr 2014 B2
8718837 Wang et al. May 2014 B2
8784425 Ritchey et al. Jul 2014 B2
8797552 Suzuki et al. Aug 2014 B2
8830485 Woloschyn Sep 2014 B2
8832954 Atwell et al. Sep 2014 B2
9163922 Bridges et al. Oct 2015 B2
9228816 Grau Jan 2016 B2
9607239 Bridges Mar 2017 B2
9628775 Bridges et al. Apr 2017 B2
20010004269 Shibata et al. Jun 2001 A1
20020032541 Raab et al. Mar 2002 A1
20020059042 Kacyra et al. May 2002 A1
20020087233 Raab Jul 2002 A1
20020128790 Woodmansee Sep 2002 A1
20020143506 D'Aligny et al. Oct 2002 A1
20020149694 Seo Oct 2002 A1
20020170192 Steffey et al. Nov 2002 A1
20020176097 Rodgers Nov 2002 A1
20030002055 Kilthau et al. Jan 2003 A1
20030033104 Gooche Feb 2003 A1
20030043386 Froehlich et al. Mar 2003 A1
20030053037 Blaesing-Bangert et al. Mar 2003 A1
20030066954 Hipp Apr 2003 A1
20030090646 Riegl et al. May 2003 A1
20030125901 Steffey et al. Jul 2003 A1
20030137449 Vashisth et al. Jul 2003 A1
20030142631 Silvester Jul 2003 A1
20030167647 Raab et al. Sep 2003 A1
20030172536 Raab et al. Sep 2003 A1
20030172537 Raab et al. Sep 2003 A1
20030179361 Ohtomo et al. Sep 2003 A1
20030208919 Raab et al. Nov 2003 A1
20030221326 Raab et al. Dec 2003 A1
20040004727 Yanagisawa et al. Jan 2004 A1
20040022416 Lemelson et al. Feb 2004 A1
20040027554 Ishinabe et al. Feb 2004 A1
20040040166 Raab et al. Mar 2004 A1
20040103547 Raab et al. Jun 2004 A1
20040111908 Raab et al. Jun 2004 A1
20040119020 Bodkin Jun 2004 A1
20040135990 Ohtomo et al. Jul 2004 A1
20040139265 Hocker, III et al. Jul 2004 A1
20040158355 Holmqvist et al. Aug 2004 A1
20040162700 Rosenberg et al. Aug 2004 A1
20040179570 Vitruk et al. Sep 2004 A1
20040221790 Sinclair et al. Nov 2004 A1
20040246462 Kaneko et al. Dec 2004 A1
20040246589 Kim et al. Dec 2004 A1
20040259533 Nixon et al. Dec 2004 A1
20050016008 Raab et al. Jan 2005 A1
20050024625 Mori et al. Feb 2005 A1
20050028393 Raab et al. Feb 2005 A1
20050046823 Ando et al. Mar 2005 A1
20050058332 Kaufman et al. Mar 2005 A1
20050082262 Rueb et al. Apr 2005 A1
20050085940 Griggs et al. Apr 2005 A1
20050115092 Griggs et al. Apr 2005 A1
20050111514 Matsumoto et al. May 2005 A1
20050141052 Becker et al. Jun 2005 A1
20050144799 Raab et al. Jul 2005 A1
20050150123 Eaton Jul 2005 A1
20050151963 Pulla et al. Jul 2005 A1
20050166413 Crampton Aug 2005 A1
20050172503 Kumagai et al. Aug 2005 A1
20050188557 Raab et al. Sep 2005 A1
20050190384 Persi et al. Sep 2005 A1
20050214716 Weber et al. Sep 2005 A1
20050259271 Christoph Nov 2005 A1
20050276466 Vaccaro et al. Dec 2005 A1
20050283989 Pettersson Dec 2005 A1
20060016086 Raab et al. Jan 2006 A1
20060017720 Li Jan 2006 A1
20060026851 Raab et al. Feb 2006 A1
20060028203 Kawashima et al. Feb 2006 A1
20060053647 Raab et al. Mar 2006 A1
20060056459 Stratton et al. Mar 2006 A1
20060056559 Pleasant et al. Mar 2006 A1
20060059270 Pleasant et al. Mar 2006 A1
20060061566 Verma et al. Mar 2006 A1
20060066836 Bridges et al. Mar 2006 A1
20060088044 Hammerl et al. Apr 2006 A1
20060096108 Raab et al. May 2006 A1
20060103853 Palmateer May 2006 A1
20060109536 Mettenleiter et al. May 2006 A1
20060123649 Muller Jun 2006 A1
20060129349 Raab et al. Jun 2006 A1
20060132803 Clair et al. Jun 2006 A1
20060145703 Steinbichler et al. Jul 2006 A1
20060169050 Kobayashi et al. Aug 2006 A1
20060169608 Carnevali Aug 2006 A1
20060170870 Kaufman et al. Aug 2006 A1
20060182314 England et al. Aug 2006 A1
20060186301 Dozier et al. Aug 2006 A1
20060193521 England, III et al. Aug 2006 A1
20060241791 Pokorny et al. Oct 2006 A1
20060244746 England et al. Nov 2006 A1
20060245717 Ossig et al. Nov 2006 A1
20060279246 Hashimoto et al. Dec 2006 A1
20060282574 Zotov et al. Dec 2006 A1
20060287769 Yanagita et al. Dec 2006 A1
20060291970 Granger Dec 2006 A1
20070019212 Gatsios et al. Jan 2007 A1
20070030841 Lee et al. Feb 2007 A1
20070043526 De Jonge et al. Feb 2007 A1
20070050774 Eidson et al. Mar 2007 A1
20070055806 Stratton et al. Mar 2007 A1
20070058154 Reichert et al. Mar 2007 A1
20070058162 Granger Mar 2007 A1
20070064976 England, III Mar 2007 A1
20070097382 Granger May 2007 A1
20070100498 Matsumoto et al. May 2007 A1
20070105238 Mandl et al. May 2007 A1
20070118269 Gibson et al. May 2007 A1
20070122250 Mullner May 2007 A1
20070142970 Burbank et al. Jun 2007 A1
20070147265 Eidson et al. Jun 2007 A1
20070147435 Hamilton et al. Jun 2007 A1
20070147562 Eidson Jun 2007 A1
20070150111 Wu et al. Jun 2007 A1
20070151390 Blumenkranz et al. Jul 2007 A1
20070153297 Lau Jul 2007 A1
20070163134 Eaton Jul 2007 A1
20070163136 Eaton et al. Jul 2007 A1
20070171394 Steiner et al. Jul 2007 A1
20070176648 Baer Aug 2007 A1
20070177016 Wu Aug 2007 A1
20070181685 Zhu et al. Aug 2007 A1
20070183459 Eidson Aug 2007 A1
20070185682 Eidson Aug 2007 A1
20070217169 Yeap et al. Sep 2007 A1
20070217170 Yeap et al. Sep 2007 A1
20070221522 Yamada et al. Sep 2007 A1
20070223477 Eidson Sep 2007 A1
20070229801 Tearney et al. Oct 2007 A1
20070229929 Soreide et al. Oct 2007 A1
20070247615 Bridges et al. Oct 2007 A1
20070248122 Hamilton Oct 2007 A1
20070256311 Ferrari Nov 2007 A1
20070257660 Pleasant et al. Nov 2007 A1
20070258378 Hamilton Nov 2007 A1
20070282564 Sprague et al. Dec 2007 A1
20070294045 Atwell et al. Dec 2007 A1
20080046221 Stathis Feb 2008 A1
20080052808 Leick et al. Mar 2008 A1
20080052936 Briggs et al. Mar 2008 A1
20080066583 Lott et al. Mar 2008 A1
20080068103 Cutler Mar 2008 A1
20080075325 Otani et al. Mar 2008 A1
20080075326 Otani et al. Mar 2008 A1
20080080562 Burch et al. Apr 2008 A1
20080096108 Sumiyama et al. Apr 2008 A1
20080098272 Fairbanks et al. Apr 2008 A1
20080148585 Raab et al. Jun 2008 A1
20080154538 Stathis Jun 2008 A1
20080183065 Goldbach Jul 2008 A1
20080196260 Pettersson Aug 2008 A1
20080204699 Benz et al. Aug 2008 A1
20080216552 Ibach et al. Sep 2008 A1
20080218728 Kirschner Sep 2008 A1
20080228331 McNerney et al. Sep 2008 A1
20080232269 Tatman et al. Sep 2008 A1
20080235969 Jordil et al. Oct 2008 A1
20080235970 Crampton Oct 2008 A1
20080240321 Narus et al. Oct 2008 A1
20080245452 Law et al. Oct 2008 A1
20080246943 Kaufman et al. Oct 2008 A1
20080252671 Cannell et al. Oct 2008 A1
20080256814 Pettersson Oct 2008 A1
20080257023 Jordil et al. Oct 2008 A1
20080263411 Baney et al. Oct 2008 A1
20080271332 Jordil et al. Nov 2008 A1
20080273758 Fuchs et al. Nov 2008 A1
20080282564 Pettersson Nov 2008 A1
20080179206 Feinstein et al. Dec 2008 A1
20080295349 Uhl et al. Dec 2008 A1
20080298254 Eidson Dec 2008 A1
20080302200 Tobey Dec 2008 A1
20080309460 Jefferson et al. Dec 2008 A1
20080309546 Wakayama et al. Dec 2008 A1
20090000136 Crampton Jan 2009 A1
20090010740 Ferrari et al. Jan 2009 A1
20090013548 Ferrari Jan 2009 A1
20090016475 Rischar et al. Jan 2009 A1
20090021351 Beniyama et al. Jan 2009 A1
20090031575 Tomelleri Feb 2009 A1
20090046140 Lashmet et al. Feb 2009 A1
20090046752 Bueche et al. Feb 2009 A1
20090046895 Pettersson et al. Feb 2009 A1
20090049704 Styles et al. Feb 2009 A1
20090051938 Miousset et al. Feb 2009 A1
20090083985 Ferrari Apr 2009 A1
20090089004 Vook et al. Apr 2009 A1
20090089078 Bursey Apr 2009 A1
20090089233 Gach et al. Apr 2009 A1
20090089623 Neering et al. Apr 2009 A1
20090095047 Patel et al. Apr 2009 A1
20090100949 Shirai et al. Apr 2009 A1
20090109797 Eidson Apr 2009 A1
20090113183 Barford et al. Apr 2009 A1
20090113229 Cataldo et al. Apr 2009 A1
20090122805 Epps et al. May 2009 A1
20090125196 Velazquez et al. May 2009 A1
20090133276 Bailey et al. May 2009 A1
20090133494 Van Dam et al. May 2009 A1
20090139105 Granger Jun 2009 A1
20090157419 Bursey Jun 2009 A1
20090161091 Yamamoto Jun 2009 A1
20090165317 Little Jul 2009 A1
20090177435 Heininen Jul 2009 A1
20090177438 Raab Jul 2009 A1
20090185741 Nahari et al. Jul 2009 A1
20090187373 Atwell Jul 2009 A1
20090241360 Tait et al. Oct 2009 A1
20090249634 Pettersson Oct 2009 A1
20090265946 Jordil et al. Oct 2009 A1
20090273771 Gittinger et al. Nov 2009 A1
20090299689 Stubben et al. Dec 2009 A1
20090322859 Shelton et al. Dec 2009 A1
20090323121 Valkenburg et al. Dec 2009 A1
20090323742 Kumano Dec 2009 A1
20100030421 Yoshimura et al. Feb 2010 A1
20100040742 Dijkhuis et al. Feb 2010 A1
20100049891 Hartwich et al. Feb 2010 A1
20100057392 York Mar 2010 A1
20100078866 Pettersson Apr 2010 A1
20100095542 Ferrari Apr 2010 A1
20100122920 Butter et al. May 2010 A1
20100123892 Miller et al. May 2010 A1
20100128259 Bridges et al. May 2010 A1
20100134596 Becker Jun 2010 A1
20100135534 Weston et al. Jun 2010 A1
20100148013 Bhotika et al. Jun 2010 A1
20100188504 Dimsdale et al. Jul 2010 A1
20100195086 Ossig et al. Aug 2010 A1
20100195087 Ossig et al. Aug 2010 A1
20100207938 Yau et al. Aug 2010 A1
20100208062 Pettersson Aug 2010 A1
20100208318 Jensen et al. Aug 2010 A1
20100245851 Teodorescu Sep 2010 A1
20100277747 Rueb et al. Nov 2010 A1
20100281705 Verdi et al. Nov 2010 A1
20100286941 Merlot Nov 2010 A1
20100312524 Siercks et al. Dec 2010 A1
20100318319 Maierhofer Dec 2010 A1
20100321152 Argudyaev et al. Dec 2010 A1
20100325907 Tait Dec 2010 A1
20100328682 Kotake et al. Dec 2010 A1
20110000095 Carlson Jan 2011 A1
20110001958 Bridges et al. Jan 2011 A1
20110007305 Bridges et al. Jan 2011 A1
20110007326 Daxauer et al. Jan 2011 A1
20110013199 Siercks et al. Jan 2011 A1
20110019155 Daniel et al. Jan 2011 A1
20110023578 Grasser Feb 2011 A1
20110025905 Tanaka Feb 2011 A1
20110043515 Stathis Feb 2011 A1
20110066781 Debelak et al. Mar 2011 A1
20110070534 Hayashi et al. Mar 2011 A1
20110094908 Trieu et al. Apr 2011 A1
20110107611 Desforges et al. May 2011 A1
20110107612 Ferrari et al. May 2011 A1
20110107613 Tait May 2011 A1
20110107614 Champ May 2011 A1
20110111849 Sprague et al. May 2011 A1
20110112786 Desforges et al. May 2011 A1
20110119025 Fetter et al. May 2011 A1
20110123097 Van Coppenolle et al. May 2011 A1
20110164114 Kobayashi et al. Jul 2011 A1
20110166824 Haisty et al. Jul 2011 A1
20110169924 Haisty et al. Jul 2011 A1
20110170534 York Jul 2011 A1
20110173823 Bailey et al. Jul 2011 A1
20110173827 Bailey et al. Jul 2011 A1
20110173828 York Jul 2011 A1
20110178754 Atwell et al. Jul 2011 A1
20110178755 York Jul 2011 A1
20110178758 Atwell et al. Jul 2011 A1
20110178762 York Jul 2011 A1
20110178764 York Jul 2011 A1
20110178765 Atwell et al. Jul 2011 A1
20110192043 Ferrari et al. Aug 2011 A1
20110273568 Lagassey et al. Nov 2011 A1
20110282622 Canter et al. Nov 2011 A1
20110288684 Farlow et al. Nov 2011 A1
20120019806 Becker et al. Jan 2012 A1
20120035788 Trepagnier et al. Feb 2012 A1
20120035798 Barfoot et al. Feb 2012 A1
20120044476 Earhart et al. Feb 2012 A1
20120046820 Allard et al. Feb 2012 A1
20120069325 Schumann et al. Mar 2012 A1
20120069352 Ossig et al. Mar 2012 A1
20120070077 Ossig et al. Mar 2012 A1
20120099100 Cramer et al. Apr 2012 A1
20120113913 Tiirola et al. May 2012 A1
20120140244 Gittinger et al. Jun 2012 A1
20120154786 Gosch et al. Jun 2012 A1
20120155744 Kennedy et al. Jun 2012 A1
20120169876 Reichert et al. Jul 2012 A1
20120181194 McEwan et al. Jul 2012 A1
20120197439 Wang et al. Aug 2012 A1
20120210678 Alcouloumre et al. Aug 2012 A1
20120217357 Franke Aug 2012 A1
20120224052 Bae Sep 2012 A1
20120229788 Schumann et al. Sep 2012 A1
20120236320 Steffey et al. Sep 2012 A1
20120257017 Pettersson et al. Oct 2012 A1
20120260512 Kretschmer et al. Oct 2012 A1
20120260611 Jones Oct 2012 A1
20120262700 Schumann et al. Oct 2012 A1
20120287265 Schumann et al. Nov 2012 A1
20130010307 Greiner et al. Jan 2013 A1
20130025143 Bailey et al. Jan 2013 A1
20130025144 Briggs et al. Jan 2013 A1
20130027515 Vinther et al. Jan 2013 A1
20130062243 Chang et al. Mar 2013 A1
20130070250 Ditte et al. Mar 2013 A1
20130094024 Ruhland et al. Apr 2013 A1
20130097882 Bridges et al. Apr 2013 A1
20130125408 Atwell et al. May 2013 A1
20130162472 Najim et al. Jun 2013 A1
20130176453 Mate et al. Jul 2013 A1
20130201487 Ossig et al. Aug 2013 A1
20130205606 Briggs et al. Aug 2013 A1
20130212889 Bridges et al. Aug 2013 A9
20130222816 Briggs et al. Aug 2013 A1
20130239424 Tait Sep 2013 A1
20130293684 Becker et al. Nov 2013 A1
20130300740 Snyder et al. Nov 2013 A1
20140002608 Atwell et al. Jan 2014 A1
20140009582 Suzuki Jan 2014 A1
20140012409 McMurtry et al. Jan 2014 A1
20140028805 Tohme Jan 2014 A1
20140049784 Woloschyn et al. Feb 2014 A1
20140063489 Steffey et al. Mar 2014 A1
20140152769 Atwell et al. Jun 2014 A1
20140202016 Bridges et al. Jul 2014 A1
20140226190 Bridges et al. Aug 2014 A1
20140240690 Newman et al. Aug 2014 A1
20140259715 Engel Sep 2014 A1
20140268108 Grau Sep 2014 A1
20140293023 Sherman et al. Oct 2014 A1
20140362424 Bridges et al. Dec 2014 A1
20150002659 Atwell et al. Jan 2015 A1
20150130906 Bridges May 2015 A1
20150185000 Wilson et al. Jul 2015 A1
20150229907 Bridges Aug 2015 A1
Foreign Referenced Citations (321)
Number Date Country
508635 Mar 2011 AT
2005200937 Sep 2006 AU
2236119 Sep 1996 CN
1307241 Aug 2001 CN
2508896 Sep 2002 CN
2665668 Dec 2004 CN
1630804 Jun 2005 CN
1630805 Jun 2005 CN
1688867 Oct 2005 CN
1735789 Feb 2006 CN
1812868 Aug 2006 CN
1818537 Aug 2006 CN
1838102 Sep 2006 CN
1839293 Sep 2006 CN
1853084 Oct 2006 CN
1926400 Mar 2007 CN
101024286 Aug 2007 CN
101156043 Apr 2008 CN
101163939 Apr 2008 CN
101371099 Feb 2009 CN
101416024 Apr 2009 CN
101484828 Jul 2009 CN
201266071 Jul 2009 CN
101506684 Aug 2009 CN
101511529 Aug 2009 CN
101542227 Sep 2009 CN
101556137 Oct 2009 CN
101806574 Aug 2010 CN
101932952 Dec 2010 CN
2216765 Apr 1972 DE
3227980 May 1983 DE
3245060 Jul 1983 DE
3340317 Aug 1984 DE
4027990 Feb 1992 DE
4222642 Jan 1994 DE
4340756 Jun 1994 DE
4303804 Aug 1994 DE
4445464 Jul 1995 DE
4410775 Oct 1995 DE
4412044 Oct 1995 DE
29622033 Feb 1997 DE
19543763 May 1997 DE
19601875 Jul 1997 DE
19607345 Aug 1997 DE
19720049 Nov 1998 DE
19811550 Sep 1999 DE
19820307 Nov 1999 DE
19850118 May 2000 DE
19928958 Nov 2000 DE
10026357 Jan 2002 DE
20208077 May 2002 DE
10137241 Sep 2002 DE
10155488 May 2003 DE
10219054 Nov 2003 DE
10232028 Feb 2004 DE
10336458 Feb 2004 DE
10244643 Apr 2004 DE
20320216 Apr 2004 DE
10304188 Aug 2004 DE
10326848 Jan 2005 DE
202005000983 Apr 2005 DE
10361870 Jul 2005 DE
102004015668 Sep 2005 DE
102004015111 Oct 2005 DE
102004028090 Dec 2005 DE
10114126 Aug 2006 DE
202006005643 Aug 2006 DE
102004010083 Nov 2006 DE
102005043931 Mar 2007 DE
102005056265 May 2007 DE
102006053611 May 2007 DE
102005060967 Jun 2007 DE
102006023902 Nov 2007 DE
102006024534 Nov 2007 DE
102006035292 Jan 2008 DE
202006020299 May 2008 DE
102007037162 Feb 2009 DE
102008014274 Aug 2009 DE
102008039838 Mar 2010 DE
102005036929 Jun 2010 DE
102008062763 Jul 2010 DE
102009001894 Sep 2010 DE
102009035336 Nov 2010 DE
102009055988 Mar 2011 DE
102010032726 Nov 2011 DE
102010032725 Jan 2012 DE
202011051975 Feb 2013 DE
102012107544 May 2013 DE
102012104745 Dec 2013 DE
102012109481 Apr 2014 DE
0546784 Jun 1993 EP
0667549 Aug 1995 EP
0727642 Aug 1996 EP
0730210 Sep 1996 EP
0614517 Mar 1997 EP
0838696 Apr 1998 EP
0949524 Oct 1999 EP
1033556 Sep 2000 EP
1160539 Dec 2001 EP
1189124 Mar 2002 EP
0767357 May 2002 EP
1310764 May 2003 EP
1342989 Sep 2003 EP
1347267 Sep 2003 EP
1361414 Nov 2003 EP
1452279 Sep 2004 EP
1468791 Oct 2004 EP
1056987 Apr 2005 EP
1528410 May 2005 EP
1669713 Jun 2006 EP
1734425 Dec 2006 EP
1429109 Apr 2007 EP
1764579 Dec 2007 EP
1878543 Jan 2008 EP
1967930 Sep 2008 EP
2003419 Dec 2008 EP
2023077 Feb 2009 EP
2042905 Apr 2009 EP
2060530 May 2009 EP
2068067 Jun 2009 EP
2068114 Jun 2009 EP
2108917 Oct 2009 EP
2177868 Apr 2010 EP
2259013 Dec 2010 EP
2372302 Oct 2011 EP
2400261 Dec 2011 EP
2344303 May 2012 EP
2603228 Mar 1988 FR
2935043 Feb 2010 FR
894320 Apr 1962 GB
1112941 May 1968 GB
2222695 Mar 1990 GB
2255648 Nov 1992 GB
2336493 Oct 1999 GB
2341203 Mar 2000 GB
2388661 Nov 2003 GB
2420241 May 2006 GB
2447258 Sep 2008 GB
2452033 Feb 2009 GB
2510510 Aug 2014 GB
575584 Jan 1982 JP
58171291 Jan 1983 JP
5827264 Feb 1983 JP
S58171291 Oct 1983 JP
59133890 Aug 1984 JP
61062885 Mar 1986 JP
S61157095 Jul 1986 JP
63135814 Jun 1988 JP
0357911 Mar 1991 JP
04115108 Apr 1992 JP
04225188 Aug 1992 JP
04267214 Sep 1992 JP
0572477 Mar 1993 JP
06313710 Nov 1994 JP
1994313710 Nov 1994 JP
06331733 Dec 1994 JP
06341838 Dec 1994 JP
074950 Jan 1995 JP
07128051 May 1995 JP
7210586 Aug 1995 JP
07229963 Aug 1995 JP
0815413 Jan 1996 JP
0821714 Jan 1996 JP
08129145 May 1996 JP
08136849 May 1996 JP
08262140 Oct 1996 JP
0921868 Jan 1997 JP
H101111130 Apr 1998 JP
10213661 Aug 1998 JP
1123993 Jan 1999 JP
2001056275 Aug 1999 JP
2000121724 Apr 2000 JP
2000249546 Sep 2000 JP
2000339468 Dec 2000 JP
2001013001 Jan 2001 JP
2001021303 Jan 2001 JP
2001066158 Mar 2001 JP
2001066211 Mar 2001 JP
2001337278 Dec 2001 JP
3274290 Apr 2002 JP
2003050128 Feb 2003 JP
2003156330 May 2003 JP
2003156562 May 2003 JP
2003194526 Jul 2003 JP
2003202215 Jul 2003 JP
2003216255 Jul 2003 JP
2003308205 Oct 2003 JP
2004109106 Apr 2004 JP
2004163346 Jun 2004 JP
2004245832 Sep 2004 JP
2004257927 Sep 2004 JP
2004333398 Nov 2004 JP
2004348575 Dec 2004 JP
2005030937 Feb 2005 JP
2005055226 Mar 2005 JP
2005069700 Mar 2005 JP
2005174887 Jun 2005 JP
2005517908 Jun 2005 JP
2005517914 Jun 2005 JP
2005215917 Aug 2005 JP
2005221336 Aug 2005 JP
2005257510 Sep 2005 JP
2005293291 Oct 2005 JP
2006038683 Feb 2006 JP
2006102176 Apr 2006 JP
2006203404 Aug 2006 JP
2006226948 Aug 2006 JP
2006519369 Aug 2006 JP
2006241833 Sep 2006 JP
2006266821 Oct 2006 JP
2006301991 Nov 2006 JP
2007101836 Apr 2007 JP
2007514943 Jun 2007 JP
2007178943 Jul 2007 JP
2007228315 Sep 2007 JP
2008076303 Apr 2008 JP
2008082707 Apr 2008 JP
2008096123 Apr 2008 JP
2008107286 May 2008 JP
2008514967 May 2008 JP
2008224516 Sep 2008 JP
2008304220 Dec 2008 JP
2009063339 Mar 2009 JP
2009524057 Jun 2009 JP
2009531674 Sep 2009 JP
2009534969 Sep 2009 JP
2009229255 Oct 2009 JP
2009541758 Nov 2009 JP
2010060304 Mar 2010 JP
2010112875 May 2010 JP
2010122209 Jun 2010 JP
2010169405 Aug 2010 JP
2010207990 Sep 2010 JP
2011141174 Jul 2011 JP
2013516928 May 2013 JP
2013517508 May 2013 JP
2013117417 Jun 2013 JP
2013543970 Dec 2013 JP
8801924 Mar 1988 WO
8905512 Jun 1989 WO
9208568 May 1992 WO
9711399 Mar 1997 WO
9808050 Feb 1998 WO
9910706 Mar 1999 WO
0014474 Mar 2000 WO
0020880 Apr 2000 WO
0026612 May 2000 WO
0033149 Jun 2000 WO
0034733 Jun 2000 WO
0063645 Oct 2000 WO
0063681 Oct 2000 WO
0177613 Oct 2001 WO
02084327 Oct 2002 WO
02088855 Nov 2002 WO
02101323 Dec 2002 WO
2004096502 Nov 2004 WO
2005008271 Jan 2005 WO
2005059473 Jun 2005 WO
2005072917 Aug 2005 WO
2005075875 Aug 2005 WO
2005100908 Oct 2005 WO
2006000552 Jan 2006 WO
2006014445 Feb 2006 WO
2006051264 May 2006 WO
2006053837 May 2006 WO
2007002319 Jan 2007 WO
200712198 Feb 2007 WO
2007028941 Mar 2007 WO
2007051972 May 2007 WO
2007087198 Aug 2007 WO
2007118478 Oct 2007 WO
2007125081 Nov 2007 WO
2007144906 Dec 2007 WO
2008019856 Feb 2008 WO
2008027588 Mar 2008 WO
2008047171 Apr 2008 WO
2008048424 Apr 2008 WO
2008052348 May 2008 WO
2008064276 May 2008 WO
2008066896 Jun 2008 WO
2008068791 Jun 2008 WO
2008075170 Jun 2008 WO
2008121073 Oct 2008 WO
2008157061 Dec 2008 WO
2009001165 Dec 2008 WO
2009016185 Feb 2009 WO
2009053085 Apr 2009 WO
2009083452 Jul 2009 WO
2009095384 Aug 2009 WO
2009123278 Oct 2009 WO
2009127526 Oct 2009 WO
2009130169 Oct 2009 WO
2009149740 Dec 2009 WO
2010040742 Apr 2010 WO
2010092131 Aug 2010 WO
2010108089 Sep 2010 WO
2010108644 Sep 2010 WO
2010148525 Dec 2010 WO
2011000435 Jan 2011 WO
2011000955 Jan 2011 WO
2011021103 Feb 2011 WO
2011029140 Mar 2011 WO
2011057130 May 2011 WO
2011060899 May 2011 WO
2011002908 Jun 2011 WO
2011090829 Jul 2011 WO
2011090892 Jul 2011 WO
2011090895 Jul 2011 WO
2011090903 Sep 2011 WO
2012037157 Mar 2012 WO
2012038446 Mar 2012 WO
2012061122 May 2012 WO
2012013525 Aug 2012 WO
2012103525 Aug 2012 WO
2012112683 Aug 2012 WO
2012125671 Sep 2012 WO
2013112455 Aug 2013 WO
2013184340 Dec 2013 WO
2013188026 Dec 2013 WO
2013190031 Dec 2013 WO
2014128498 Aug 2014 WO
Non-Patent Literature Citations (126)
Entry
Office Action for Chinese Application No. 201180004746.4 dated Jul. 21, 2015; pp. 1-4.
Office Action for Chinese Patent Application for Invention No. 201380029985.4 dated Aug. 7, 2015; 2 pages.
Office Action for Japanese Patent Application No. 2014-561197 dated Sep. 1, 2015; 3 pages.
Office Action for Japanese Patent Application No. 2015-049378 dated Aug. 11, 2015; 9 pages.
FARO (ScanArm); Metrologic Group, 6 Chemin du Vieux Chêne, Inovallée, 38240 Meylan, France; Nov. 2007; 22 pgs.
FARO Gage Basic Training Workbook, Students Book, Version 1.5; FARO Technologies Inc.; 125 Technology Park, Lake Mary, FL 32746, USA, Jan. 2006; 76 pgs.
FAROARM USB User Guide; FARO Technologies Inc.; 125 Technology Park, Lake Mary, FL 32746, USA; Nov. 2003; 84 pgs.
International Search Report and Written Opinion for Application No. PCT/US2015/049078 dated Nov. 23, 2015; 12 pgs.
German Office Action for Application No. 10 2015 205 110.2 dated Feb. 25, 2016; 5 pgs.
Davison, A. et al., “MonoSLAM: Real-Time Single Camera SLAM”, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, No. 6, Jun. 1, 2007, pp. 1052-1067, XP011179664.
Gebre, Biruk A., et al., “Remotely Operated and Autonomous Mapping System (ROAMS)”, Technologies for Practical Robot Applications, TEPRA 2009, IEEE International Conference on Nov. 9, 2009, pp. 173-178, XP031570394.
Harrison A. et al., “High Quality 3D Laser Ranging Under General Vehicle Motion”, 2008 IEEE International Conference on Robotics and Automation, May 19-23, 2008, pp. 7-12, XP031340123.
May, S. et al, “Robust 3D-Mapping with Time-of-Flight Cameras”, Intelligent Robots and Systems, IROS 2009, IEEE/RSJ International Conference on Oct. 10, 2009, pp. 1673-1678, XP031581042.
German Examination Report, dated Jul. 22, 2015, Application No. 112013002824.7; 6 pages.
German Examination Report, dated Jul. 23, 2015, Application No. 112013003076.4; 7 pages.
Ohno, K. et al., “Real-Time Robot Trajectory Estimation and 3D Map Construction Using 3D Camera”, Intelligent Robots and Systems, 2006 IEEE/RSJ International Conference on Oct. 1, 2006, pp. 5279-5285, XP031006974.
Surmann, H. et al., “An Autonomous Mobile Robot with a 3D Laser Range Finder for 3D Exploration and Digitalization of Indoor Environments”, Robotics and Autonomous Systems, Elsevier Science Publishers, vol. 45, No. 3-4, Dec. 31, 2003, pp. 181-198.
Yan, R., et al, “3D Point Cloud Map Construction Based on Line Segments with Two Mutually Perpendicular Laser Sensors”, 2013 13th International Conference on Control, Automation and Systems (ICCAS 2013), IEEE, Oct. 20, 2013, pp. 1114-1116.
Ye, C. et al., “Characterization of a 2-D Laser Scanner for Mobile Robot Obstacle Negotiation” Proceedings / 2002 IEEE International Conference on Robotics and Automation, May 11-15, 2002, Washington, D.C. May 1, 2002, pp. 2512-2518, XP009169742.
Notice of Allowance for U.S. Appl. No. 14/548,528 dated Feb. 20, 2015, 51 pages.
GB Office Action for Application #GB1418273.7 dated Oct. 24, 2014, 8 pages.
GB Office Action for Application #GB1422105.5 dated Jan. 26, 2015, 1 page.
iQsun Laserscanner Brochure, 2 Pages, Apr. 2005.
GB Office Action dated Oct. 6, 2014 corresponding to GB App. No. 1214426.7.
R.W. Boyd, “Radiometry and the Detection of Optical Radiation” (pp. 20-23), 1983, John Wiley & Sons, Inc.
“Scanner Basis Configuration for Riegl VQ-250”, Riegl Company Webpage, Feb. 16, 2011 (Feb. 16, 2011), XP002693900, Retrieved from the internet: URL:http://www.riegl.com/uploads/tx_pxpriegldownloads/30_SystemConfiguration_VQ-250_02-11_16-02.2011.pdf [retr.
Akca, Devrim, Full Automatic Registration of Laser Scanner Point Clouds, Optical 3D Measurement Techniques, vol. VI, 2003, XP002590305, ETH, Swiss Federal Institute of Technology, Zurich, Institute of Geodesy and Photogrammetry, DOI:10.3929/ethz-a-004656.
Bornaz, L., et al., Multiple Scan Registration in Lidar Close-Range Applications, The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. XXXIV, Part 5/W12, Jul. 2003 (Jul. 2003) pp. 72-77, XP002590305.
Brenneke, C., et al., “Using 3D Laser Range Data for Slam in Outdoor Environments”, Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems. (IROS 2003); Las Vegas, NV, Oct. 27-31, 2003; [IEEE/RSJ International Confer.
Cho, et al., Implementation of a Precision Time Protocol over Low Rate Wireless Personal Area Networks, IEEE, 2008.
Jasperneite, et al., Enhancements to the Time Synchronization Standard IEEE-1588 for a System of Cascaded Bridges, IEEE, 2004.
Jgeng “DLP-Based Structured Light 3D Imaging Technologies and Applications” (15 pages) Emerging Digital Micromirror Device Based Systems and Application III; edited by Michael R. Douglass, Patrick I. Oden, Proc. of SPIE, vol. 7932, 79320B; (Feb. 9, 20.
Langford, et al., “Practical Skills in Forensic Science”, Pearson Education Limited, Essex, England, First Published 2005, Forensic Chemistry.
M.G. Lee; “Compact 3D LIDAR based on optically coupled horizontal and vertical scanning mechanism for the autonomous navigation of robots” (13 pages) vol. 8037; downloaded from http://proceedings.spiedigitallibrary.org/ on Jul. 2, 2013.
MOOG Components Group; “Fiber Optic Rotary Joints; Product Guide” (4 pages) Dec. 2010; MOOG, Inc. 2010.
Romer “Romer Absolute Arm Maximum Performance Portable Measurement” (Printed 2010); Hexagon Metrology, Inc., http://us:Romer.com; Hesagon Metrology, Inc. 2010.
Romer “Romer Absolute Arm Product Brochure” (2010); Hexagon Metrology; www.hexagonmetrology.com; Hexagon AB 2010.
Sauter, et al., Towards New Hybrid Networks for Industrial Automation, IEEE, 2009.
Se, et al., “Instant Scene Modeler for Crime Scene Reconstruction”, MDA, Space Missions, Ontario, Canada, Copyright 2005, IEEE.
Umeda, K., et al., Registration of Range and Color Images Using Gradient Constraints and Ran Intensity Images, Proceedings of the 17th International Conference onPatern Recognition (ICPR'04), Copyright 2010 IEEE. [Retrieved online Jan. 28, 2010—IEEE.
Cooklev, et al., An Implementation of IEEE 1588 Over IEEE 802.11b for Synchronization of Wireless Local Area Network Nodes, IEEE Transactions on Instrumentation and Measurement, vol. 56, No. 5, Oct. 2007.
Elstrom, M.D., Stereo-Based Registration of LADAR and Color Imagery, Part of SPIE Conference on Intelligent Robots and Computer Vision XVII: Algorithms, Techniques, and Active Vision, Boston, MA, Nov. 1998, SPIE vol. 3522, 0277-786X/98; [Retrieved on.
EO Edmund Optics “Silicon Detectors” (5 pages) 2013 Edmund Optics, Inc. http://www.edmundoptics.com/electro-optics/detector-components/silicon-detectors/1305 [Oct. 15, 2013 10:14:53 AM].
Godin, G., et al., A Method for the Registration of Attributed Range Images, Copyright 2001, [Retrieved on Jan. 18, 2010 at 03:29 from IEEE Xplore].
Haag, et al., “Technical Overview and Application of 3D Laser Scanning for Shooting Reconstruction and Crime Scene Investigations”, Presented at the American Academy of Forensic Sciences Scientific Meeting, Washington, D.C., Feb. 21, 2008.
Hart, A., “Kinematic Coupling Interchangeability”, Precision Engineering, vol. 28, No. 1, Jan. 1, 2004, pp. 1-15, XP55005507, ISSN: 0141-6359, DOI: 10.1016/S0141-6359(03)00071-0.
Horn, B.K.P., Closed-Form Solution of Absolute Orientation Using Unit Quaternions, J. Opt. Soc. Am. A, vol. 4, No. 4, Apr. 1987, pp. 629-642, ISSN 0740-3232.
Howard, et al., “Virtual Environments for Scene of Crime Reconstruction and Analysis”, Advanced Interfaces Group, Department of Computer Science, University of Manchester, Manchester, UK, Feb. 28, 2000.
Ingensand, H., Dr., “Introduction to Geodetic Metrology”, “Einführung in die Geodätische Messtechnik”, Federal Institute of Technology Zurich, 2004, with English translation.
J. Geng, “Structured-Light 3D Surface Imaging: A Tutorial” (pub. Mar. 31, 2011) Advances in Optics and Photonics 3, pp. 128-160; IEEE Intelligent Transportation System Society; 2011 Optical Society of America.
Jasiobedzki, Piotr, “Laser Eye—A New 3D Sensor for Active Vision”, SPIE—Sensor Fusion VI, vol. 2059, Sep. 7, 1993 (Sep. 7, 1993), pp. 316-321, XP00262856, Boston, U.S.A., Retrieved from the Internet: URL:http://scitation.aip.org/getpdf/servlet/Ge.
Williams, J.A., et al., Evaluation of a Novel Multiple Point Set Registration Algorithm, Copyright 2000, [Retrieved on Jan. 18, 2010 at 04:10 from IEEE Xplore].
Willoughby, P., “Elastically Averaged Precision Alignment”, In: “Doctoral Thesis”, Jun. 1, 2005, Massachusetts Institute of Technology, XP55005620, abstract 1.1 Motivation, Chapter 3, Chapter 6.
Y.K. Cho, et al., “Light-weight 3D LADAR System for Construction Robotic Operations” (pp. 237-244); 26th International Symposium on Automation and Robotics in Construction (ISARC 2009).
It is Alive in the Lab, Autodesk University, Fun with the Immersion MicroScribe Laser Scanner, [online], [retrieved Nov. 29, 2011], http://labs.blogs.com/its_alive_in_the_lab/2007/11/fun-with-the-im.html; 3 pages.
14th International Forensic Science Symposium, Interpol—Lyon, France, Oct. 19-22, 2004, Review Papers, Edited by Dr. Niamh Nic Daeid, Forensic Science Unit, University of Strathclyde, Glasgow, UK; 585 pages.
ABB Flexible Automation AB: “Product Manual IRB 6400R M99, On-line Manual”; Sep. 13, 2006; XP00002657684; Retrieved from the Internet: URL: http://pergatory.mit.edu/kinematiccouplings/case_studies/ABB_Robotics/general/6400R%20Product%20Manual.pdf (re.
Anonymous: So wird's gemacht: Mit T-DSL und Windows XP Home Edition gemeinsam ins Internet (Teil 3) [How it's done: getting onto the Internet together with T-DSL and Windows XP Home Edition (Part 3)], Internet Citation, Jul. 2003 (Jul. 2003), XP002364586, Retrieved from Internet: URL:http://support.microsoft.com/kb/814538/DE/ [retrieved on Jan. 26, 2006].
Bouvet, D., et al., “Precise 3-D Localization by Automatic Laser Theodolite and Odometer for Civil-Engineering Machines”, Proceedings of the 2001 IEEE International Conference on Robotics and Automation. ICRA 2001. Seoul, Korea, May 21-26, 2001; IEEE, US.
Chinese Office Action to Application No. 201180004746.4, dated Apr. 21, 2015, 3 pages.
Chinese Office Action to Application No. 201380005188.2, dated Mar. 3, 2015, 6 pages.
CN Office Action re Application No. 201280018596.7 dated Jul. 7, 2014.
Decision Revoking the European Patent dated Aug. 14, 2013, filed in Opposition re Application No. 07 785 873.6/Patent No. 2 062 069, Proprietor: Faro Technologies, Inc., filed by Leica Geosystem AG on Feb. 5, 2013, 12 pages.
Dylan, Craig R., High Precision Makes the Massive Bay Bridge Project Work. Suspended in MidAir—Cover Story—Point of Beginning, Jan. 1, 2010, [online] http://www.pobonline.com/Articles/Cover_Story/BNP_Guid_9-5-2006_A_10000000000 . . . [Retrieved Jan. 25, 2.
Electro-Optical Information Systems, “The Handy Handheld Digitizer” [online], [retrieved on Nov. 29, 2011], http://vidibotics.com/htm/handy.htm, 2 pages.
Examination Report for GB1504825.9 dated May 28, 2015; Received Jun. 30, 2015; 6 pages.
Examination Report to Application No. GB1412309.5; Aug. 8, 2014; 3 pages.
Examination Report under Section 18(3); Report dated Oct. 31, 2012; Application No. GB1210309.5.
Examination Report for Application No. 112011100309.9 dated Sep. 23, 2014, 10 pages.
Examination Report for Patent Application No. GB1214426.7, dated Jan. 15, 2014, 5 pages.
FARO Laser Scanner LS, Recording Reality's Digital Fingerprint, The Measure of Success, Rev. Aug. 22, 2005, 16 pages.
FARO Laserscanner LS, Presentation Forensic Package, Police School of Hessen, Wiesbaden, Germany, Dec. 14, 2005; FARO Technologies, Copyright 2008, 17 pages.
FARO Product Catalog; Faro Arm; 68 pages; Faro Technologies Inc. 2009; printed Aug. 3, 2009.
Foreign Office Action for Japanese Patent Application No. 2015-516035 filed May 20, 2013, based on International Application No. PCT/US2013/041826, dated May 12, 2015, dated Jun. 3, 2015; pp. 1-3.
Franklin, Paul F., What IEEE 1588 Means for Your Next T&M System Design, Keithley Instruments, Inc., [on-line] Oct. 19, 2010, http://www.eetimes.com/General/DisplayPrintViewContent?contentItemId=4209746, [Retrieved Oct. 21, 2010], 6 pages.
GB Office Action dated Jan. 15, 2014 for SJB/PX210785GB; UK Patent Application No. 1214426.7., 4 pages.
International Search Report of the International Searching Authority for PCT/EP2004/014605; dated Apr. 15, 2005.
German Patent Application No. 11 2011 100 291.2 dated Dec. 20, 2012.
German Office Action for DE Application No. 10 2012 107 544.1; dated Jan. 2, 2013.
Ghost 3D Systems, Authorized MicroScribe Solutions, FAQs—MicroScribe 3D Laser, MicroScan Tools, & related info, [online], [retrieved Nov. 29, 2011], http://microscribe.ghost3d.com/gt_microscan-3d_faqs.htm, 4 pages.
GoMeasure3D—Your source for all things measurement, Baces 3D 100 Series Portable CMM from GoMeasure3D, [online], [retrieved Nov. 29, 2011], http://www.gomeasure3d.com/baces100.html, 3 pages.
Huebner, S.F., “Sniper Shooting Technique”, “Scharfschützen Schießtechnik”, Copyright by C.A. Civil Arms Verlag GmbH, Lichtenwald 1989, Alle Rechte vorbehalten, pp. 11-17.
HYDROpro Navigation, Hydrographic Survey Software, Trimble, www.trimble.com, Copyright 1997-2003.
Information on Electro-Optical Information Systems; EOIS 3D Mini-Moire C.M.M. Sensor for Non-Contact Measuring & Surface Mapping; Direct Dimensions, Jun. 1995.
International Search Report for International Application No. PCT/US2013/040321; dated Jul. 15, 2013; 4 pages.
International Search Report for International Application No. PCT/US2013/040309; dated Jul. 3, 2013; dated Jul. 15, 2013; 4 pages.
International Preliminary Report on Patentability and Written Opinion for Application No. PCT/EP2004/014605; dated Aug. 29, 2006.
International Search Report for International Application No. PCT/US2011/021253; dated Sep. 26, 2011; 7 pages.
International Preliminary Report on Patentability and Written Opinion for PCT/EP2009/050887; dated Sep. 7, 2010.
International Preliminary Report on Patentability for International Application No. PCT/US2011/021253; dated May 9, 2012.
International Search Report for International Application No. PCT/US2013/041826 filed May 20, 2013; dated Jul. 29, 2013; 5 pages.
International Preliminary Report on Patentability to PCT/US2013/040309, dated Dec. 16, 2014, 8 pages.
International Preliminary Report on Patentability to PCT/US2013/040321, dated Dec. 16, 2014, 7 pages.
International Preliminary Report on Patentability to PCT/US2013/041826, dated Dec. 9, 2014, 7 pages.
Japanese Office Action for Application No. 2015-049378, dated May 8, 2015, 3 pages.
Jgeng “DLP-Based Structured Light 3D Imaging Technologies and Applications” (15 pages) Emerging Digital Micromirror Device Based Systems and Application III; edited by Michael R. Douglass, Patrick I. Oden, Proc. of SPIE, vol. 7932, 79320B; (2011) SPIE.
Kreon Laser Scanners, Getting the Best in Cutting Edge 3D Digitizing Technology, B3-D MCAD Consulting/Sales [online], [retrieved Nov. 29, 2011], http://www.b3-d.com/Kreon.html.
Laser Reverse Engineering with Microscribe, [online], [retrieved Nov. 29, 2011], http://www.youtube.com/watch?v=8VRz_2aEJ4E&feature=PlayList&p=F63ABF74F30DC81B&playnext=1&playnext_from=PL&index=1.
Leica Geosystems TruStory Forensic Analysis by Albuquerque Police Department, 2006.
Leica Geosystems, FBI Crime Scene Case Study, Tony Grissim, Feb. 2006; 11 pages.
Leica Geosystems: “Leics Rugby 55 Designed for Interior Built for Construction”, Jan. 1, 2009, XP002660558, Retrieved from the Internet: URL:http://www.leica-geosystems.com/downloads123/zz/lasers/Rugby%2055/brochures/Leica_Rugby_55_brochure_en.pdf [re.
Leica TPS800 Performance Series—Equipment List, 2004, pp. 1-4.
Written Opinion of the International Searching Authority for International Application No. PCT/US2011/021253 dated Sep. 26, 2011.
Japanese Office Action and English Language summary for JP2012-550044 filed Jul. 20, 2012; based on International Application No. PCT/US2011/021252 filed Jan. 14, 2011.
Leica Geosystems: “Leica Rugby 55 Designed for Interior Built for Construction”, Jan. 1, 2009, XP002660558, Retrieved from the Internet: URL:http://www.leica-geosystems.com/downloads123/zz/lasers/Rugby%2055/brochures/Leica_Rugby_55_brochure_en.pdf [re.
Merriam-Webster (m-w.com), “Interface”. 2012. http://www.merriam-webster.com/dictionary/interface.
Merriam-Webster (m-w.com), “Parts”. 2012, pp. 1-6. http://www.merriam-webster.com/dictionary/parts.
Merriam-Webster (m-w.com), “Traverse”. 2012. http://www.merriam-webster.com/dictionary/traverse.
MicroScan 3D User Guide, RSI GmbH, 3D Systems & Software, Oberursel, Germany, email: info@rsi-gmbh.de, Copyright RSI Roland Seifert Imaging GmbH 2008.
Moog Components Group “Technical Brief; Fiber Optic Rotary Joints” Document No. 303 (6 pages) Mar. 2008; MOOG, Inc. 2008 Canada; Focal Technologies.
Notification of Transmittal of the International Search Report for International Application No. PCT/US2015/060087 dated Feb. 9, 2016; dated Feb. 17, 2016; 6 pages.
Provision of the minutes in accordance with Rule 124(4) EPC dated Aug. 14, 2013, filed in Opposition re Application No. 07 785 873.6/Patent No. 2 062 069, Proprietor: Faro Technologies, Inc., filed by Leica Geosystem AG on Feb. 5, 2013, pp. 1-10.
Romer “Romer Measuring Arms Portable CMMs for R&D and shop floor” (Mar. 2009) Hexagon Metrology (16 pages).
Romer Measuring Arms; Portable CMMs for the shop floor; 20 pages; Hexagon Metrology, Inc. (2009) http://us.romer.com.
Spada, et al., IEEE 1588 Lowers Integration Costs in Continuous Flow Automated Production Lines, XP-002498255, ARC Insights, Insight # 2003-33MD&H, Aug. 20, 2003.
The Scene, Journal of the Association for Crime Scene Reconstruction, Apr.-Jun. 2006, vol. 12, Issue 2; 31 pages.
Trimble—Trimble SPS630, SPS730 and SPS930 Universal Total Stations, [on-line] http://www.trimble.com/sps630_730_930.shtml (1 of 4), [Retrieved Jan. 26, 2010 8:50:29 AM].
Written Opinion of the International Searching Authority for International Application No. PCT/US2015/060087 dated Feb. 9, 2016; dated Feb. 17, 2016; 5 pages.
Written Opinion for International Application No. PCT/US2013/041826 filed May 20, 2013; dated Jul. 29, 2013; 7 pages.
Written Opinion for International Application No. PCT/US2013/040309; dated Jul. 15, 2013; 7 pages.
Written Opinion for International Application No. PCT/US2013/040321 filed May 9, 2013; dated Jul. 15, 2013; 7 pages.
Ben-Tzvi, P., et al., “Extraction of 3D images using pitch-actuated 2D laser range finder for robotic vision”, Robotic and Sensors Environments (ROSE), 2010 IEEE International Workshop on, IEEE, Piscataway, NJ, USA, Oct. 15, 2010 (Oct. 15, 2010), pp. 1-6, XP031840390.
Great Britain Examination Report for Application No. GB1504825.9 dated May 28, 2015; 6 pgs.
Japanese Office Action for Application No. 2015-516023 dated Mar. 28, 2017; 4 pgs.
German Examination Report for Application No. 11 2013 003 076.4 dated May 5, 2017; 6 pgs.
Trujillo-Pino, A., et al., “Accurate subpixel edge location based on partial area effect”, Elsevier, Image and Vision Computing 31 (2013) pp. 72-90.
German Office Action for Application No. 11 2015 004 196.6 dated Jul. 31, 2017; 6 pgs.
Extended European Search Report for Application No. 18164913.8 dated Jun. 8, 2018; 7 pgs.
Related Publications (1)
Number Date Country
20170102224 A1 Apr 2017 US
Provisional Applications (3)
Number Date Country
61296555 Jan 2010 US
61355279 Jun 2010 US
61351347 Jun 2010 US
Continuations (1)
Number Date Country
Parent 14485876 Sep 2014 US
Child 15334961 US
Continuation in Parts (2)
Number Date Country
Parent 13491176 Jun 2012 US
Child 14485876 US
Parent 13006507 Jan 2011 US
Child 13491176 US