Method for optically scanning and measuring an environment

Information

  • Patent Grant
  • Patent Number
    9,684,078
  • Date Filed
    Thursday, April 28, 2016
  • Date Issued
    Tuesday, June 20, 2017
Abstract
A method, system and computer program product are provided for displaying three-dimensional measurement points on a two-dimensional plane of a display screen having a plurality of pixels. The method includes projecting the measurement points onto the plane. Each of the measurement points is assigned to one of the pixels. A depth value is assigned to each of the pixels. A first pixel is selected having a first measurement point and a first depth value. A first side is searched for a second pixel having a second measurement point and a second depth value. A second side is searched for a third pixel having a third measurement point and a third depth value. It is determined whether the second and third measurement points are on a same plane. The first depth value of the first pixel is changed when the second and third measurement points are on the same plane.
Description
BACKGROUND OF THE INVENTION

The invention relates to a method for optically scanning and measuring an environment.


Through use of a known method of this kind, a three-dimensional scan is recorded and then displayed two-dimensionally. If the density and extent of the measurement points are smaller than those of the pixels of the display, a better visual impression is obtained when a gap-filling takes place between the measurement points, i.e., when surfaces are generated from the individual measurement points. All measurement points can thus be projected onto one plane and assigned to individual pixels. The intermediate pixels of the plane are then filled, for example, by interpolation.


SUMMARY OF THE INVENTION

According to an embodiment of the present invention, a method, system and computer program product are provided for displaying a plurality of measurement points in three-dimensional space on a two-dimensional plane of a display screen. The method includes projecting the plurality of measurement points onto the two-dimensional plane, the display screen having a plurality of pixels. Each of the measurement points of the plurality of measurement points is assigned to one of the pixels in the plurality of pixels. A depth value is assigned to each of the plurality of pixels that are assigned one of the measurement points of the plurality of measurement points. A first pixel is selected, the first pixel having a first measurement point of the plurality of measurement points assigned to the first pixel, the first pixel having a first depth value assigned to the first pixel. A first side of the first pixel is searched for a second pixel having a second measurement point of the plurality of measurement points assigned to the second pixel, the second pixel having a second depth value assigned to the second pixel. A second side of the first pixel is searched for a third pixel having a third measurement point of the plurality of measurement points assigned to the third pixel, the second side being opposite the first side, the third pixel having a third depth value assigned to the third pixel. It is determined whether the second measurement point and the third measurement point are on a same object plane based at least in part on the second depth value and the third depth value. The first depth value assigned to the first pixel is changed based on determining that the second measurement point and the third measurement point are on the same object plane.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the present invention are explained in more detail below on the basis of exemplary embodiments illustrated in the drawings, in which



FIG. 1 is a schematic illustration of the assignment and filling of the pixels with a view onto the plane, wherein the adjacent pixels are on the same surface;



FIG. 2 is a schematic illustration of the assignment and filling of the pixels, according to FIG. 1, with a view parallel to the plane;



FIG. 3 is a schematic illustration of the assignment and filling of the pixels with a view onto the plane, wherein the adjacent pixels are located on different surfaces;



FIG. 4 is a schematic illustration of the assignment and filling of the pixels, according to FIG. 3, with a view parallel to the plane;



FIG. 5 is a schematic illustration of a laser scanner in the environment including the display device, and



FIG. 6 is a partial sectional illustration of the laser scanner.





DETAILED DESCRIPTION OF THE INVENTION

Referring to the Figures, a laser scanner 10 is provided as a device for optically scanning and measuring the environment of the laser scanner 10. The laser scanner 10 has a measuring head 12 and a base 14. The measuring head 12 is mounted on the base 14 as a unit that can be rotated about a vertical axis. The measuring head 12 has a rotary mirror 16 that can be rotated about a horizontal axis. The point of intersection between the two axes of rotation is designated as the center C10 of the laser scanner 10.


The measuring head 12 also has a light emitter 17 for emitting an emission light beam 18. The emission light beam 18 may be a laser beam with a wavelength in the range of approximately 300 to 1600 nm, for example 790 nm, 905 nm or less than 400 nm, but other electromagnetic waves having, for example, a greater wavelength can be used. The emission light beam 18 is amplitude-modulated with, for example, a sinusoidal or rectangular-waveform modulation signal. The emission light beam 18 is passed from the light emitter 17 onto the rotary mirror 16, where it is deflected and then emitted into the environment. A reception light beam 20, which is reflected by or otherwise scattered from an object O, is captured again by the rotary mirror 16, deflected and passed onto a light receiver 21. The direction of the emission light beam 18 and of the reception light beam 20 results from the angular positions of the rotary mirror 16 and the measuring head 12, which depend on the positions of their respective rotary drives, which are, in turn, detected by a respective encoder.


A control and evaluation device 22 has a data link connection to the light emitter 17 and to the light receiver 21 in the measuring head 12, parts thereof also being arranged outside the measuring head 12, for example as a computer connected to the base 14. The control and evaluation device 22 determines, for a multiplicity of measurement points X, the distance d between the laser scanner 10 and the illuminated point on the object O from the propagation times of the emission light beam 18 and the reception light beam 20. For this purpose, the phase shift between the two light beams 18 and 20 can be determined and evaluated.
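The phase-shift evaluation can be illustrated with a short sketch. The following Python snippet is only a schematic illustration under assumed values (the modulation frequency f_mod and the measured phase shift are examples), not the signal processing actually implemented in the control and evaluation device 22.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_phase_shift(phase_shift_rad: float, f_mod: float) -> float:
    """Distance d from the phase shift between the emitted and the received
    amplitude-modulated beam; unambiguous only within c / (2 * f_mod)."""
    return C * phase_shift_rad / (4.0 * math.pi * f_mod)

# Example: a 90 degree phase shift at 10 MHz modulation corresponds to d of about 3.75 m.
print(distance_from_phase_shift(math.pi / 2.0, 10e6))
```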


Through the relatively rapid rotation of the mirror 16, scanning takes place along a circular line. Through the relatively slow rotation of the measuring head 12 relative to the base 14, the entire space is gradually scanned with these circular lines. The totality of the measurement points X of such a measurement shall be designated as a scan. For such a scan, the center C10 of the laser scanner 10 defines the origin of the local stationary reference system. The base 14 is stationary in this local stationary reference system.
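As an illustration of how one measurement (a distance d plus the two encoder angles) yields a measurement point X in the local stationary reference system, the following sketch converts the spherical measurement into Cartesian coordinates. It is a simplified model under assumed conventions (theta for the rotation of the measuring head 12, phi for the elevation set by the rotary mirror 16, origin at the center C10); the actual scanner geometry and calibration are not reproduced here.

```python
import math

def measurement_to_point(d: float, theta: float, phi: float) -> tuple[float, float, float]:
    """Cartesian coordinates of a measurement point X relative to the center C10,
    assuming theta is the rotation of the measuring head 12 about the vertical
    axis and phi the elevation set by the rotary mirror 16."""
    x = d * math.cos(phi) * math.cos(theta)
    y = d * math.cos(phi) * math.sin(theta)
    z = d * math.sin(phi)
    return (x, y, z)

# One circular line: the mirror sweeps phi quickly while theta changes slowly.
line = [measurement_to_point(5.0, math.radians(10.0), math.radians(a)) for a in range(-80, 81, 10)]
```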


In addition to the distance d to the center C10 of the laser scanner 10, each measurement point X comprises a brightness value which may also be determined by the control and evaluation device 22. The brightness is a gray-tone value which is determined, for example, by integration of the bandpass-filtered and amplified signal of the light receiver 21 over a measuring period which is assigned to the measurement point X. Through use of a color camera, it is optionally possible to capture images, so that colors (R, G, B) can be assigned as a value to the measurement points X, either in addition to the brightness or incorporating the brightness.
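The gray-tone determination described above can be sketched as a simple integration of the receiver signal over the measuring period. This is only an illustrative sketch with an assumed sample array and an assumed normalization to 0..255, not the signal chain of the light receiver 21.

```python
def gray_value(samples: list[float], full_scale: float = 1.0) -> int:
    """Gray-tone value of one measurement point X: integrate (here: average) the
    bandpass-filtered, amplified receiver samples acquired during the measuring
    period assigned to X and map the result to the range 0..255."""
    level = sum(abs(s) for s in samples) / max(len(samples), 1)
    return min(255, int(255.0 * level / full_scale))

# Example: a weak return signal yields a dark gray tone.
print(gray_value([0.02, 0.03, 0.025, 0.018]))
```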


A display device 30 is connected to the control and evaluation device 22. The display device 30 can be integrated into the laser scanner 10, for example into the measuring head 12 or into the base 14, or it can be an external unit, for example part of a computer which is connected to the base 14. The display device 30 has a graphics card 32 and a screen 34, which can be arranged separately from one another or as a structural unit. The control and evaluation device 22 provides 3D data of the scan.


Referring also to FIGS. 1-4 as well as FIGS. 5 and 6, the graphics card 32 converts the 3D data into 2D data (e.g., rendering data), which are displayed on the screen 34. The 3D data are the measurement points X, wherein several scans from different positions of the laser scanner 10 can be assembled into one scene. For representing the 2D data, there are pixels P, i.e., adjacent, small polygonal surfaces (e.g., squares or hexagons), which are arranged in a two-dimensional plane E which corresponds to the screen 34. The starting point is the projection of the measurement points X onto the plane E with respect to a viewer (e.g., eye, camera) at a certain viewpoint V. The projection is either perspective (i.e., the viewpoint V is at a finite distance) or orthographic (i.e., the viewpoint V is at infinity). The projected measurement points X are assigned to single pixels P. A Z-buffer, i.e., a two-dimensional auxiliary field for the pixels P, serves for representing the 2D data. In this Z-buffer, a field element (e.g., depth z) is assigned to each pixel P. The depth z of each projected measurement point X corresponds to the distance of the measurement point X from the plane E with respect to the viewpoint V. The field of the pixels P and the Z-buffer may be treated in the same way as images.
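As a concrete illustration of the projection onto the plane E and of the per-pixel Z-buffer, the following sketch maps a measurement point to a pixel and a depth. It is a minimal sketch assuming a simple pinhole-style perspective projection with an assumed resolution and focal length and a viewpoint V at the origin looking along +z; it is not the rendering pipeline of the graphics card 32.

```python
WIDTH, HEIGHT = 640, 480   # pixels P of the plane E (screen 34), assumed resolution
FOCAL = 500.0              # assumed focal length of the perspective projection, in pixels

def project_point(x: float, y: float, z: float, perspective: bool = True):
    """Project a measurement point X, given in viewer coordinates (viewpoint V at
    the origin, plane E perpendicular to +z), onto the plane E.  Returns the pixel
    (u, v) the point is assigned to and its depth z, or None if the point lies
    behind the viewer or outside the screen."""
    if perspective:            # viewpoint V at a finite distance
        if z <= 0.0:
            return None
        u = int(WIDTH / 2 + FOCAL * x / z)
        v = int(HEIGHT / 2 - FOCAL * y / z)
    else:                      # orthographic: viewpoint V at infinity
        u = int(WIDTH / 2 + x)
        v = int(HEIGHT / 2 - y)
    if 0 <= u < WIDTH and 0 <= v < HEIGHT:
        return (u, v, z)
    return None
```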


The viewpoint V may in principle be arbitrary and is usually changed several times when viewing the scan and/or the scene.


Since the measurement points X are punctiform, with gaps in between, and since the pixels P, in the case of nearby objects O, usually have a higher density in the plane E than the projections of the measurement points X, a gap-filling is carried out to fill as many pixels P as possible for an improved representation. The graphics card 32 carries this out in parallel using the 3D data and the indication of the viewpoint V and of the plane E.


Initially only those pixels P are filled to which the projection of a measurement point X is assigned, i.e., which exactly cover one measurement point X. These pixels P are filled with the value of the assigned measurement point X, i.e., brightness and, where applicable, color. All other pixels P, which do not exactly correspond with a projection of a measurement point X, i.e., which are "in between," are empty at first, for example they are set to zero. Each of the depths z, i.e., the field elements of the Z-buffer, which are assigned to the initially filled pixels P, is set to that depth z0, z1, z2 which corresponds to the distance of the assigned measurement point X from the plane E. All other field elements of the Z-buffer (e.g., depths z) are set to an extreme value, for example to infinite. If, when the projection of the measurement points X is made, it turns out that two measurement points X are available for one pixel P, the measurement point having the smaller depth z is selected and the other one is rejected, so that covered surfaces and covered edges are not visible.
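The initial filling and the nearest-point-wins rule can be sketched as follows. This is a minimal sketch that reuses the hypothetical project_point, WIDTH and HEIGHT from the previous snippet; the value of a point is reduced here to a single brightness, and "empty" pixels carry the value 0 and an infinite depth, as described above.

```python
INF = float("inf")

def splat(points):
    """points: iterable of (x, y, z, brightness) in viewer coordinates.
    Returns the value buffer (one entry per pixel P) and the Z-buffer.
    Pixels not covering a projected measurement point stay 0 / infinite."""
    values = [[0.0] * WIDTH for _ in range(HEIGHT)]
    zbuf = [[INF] * WIDTH for _ in range(HEIGHT)]
    for x, y, z, brightness in points:
        hit = project_point(x, y, z)
        if hit is None:
            continue
        u, v, depth = hit
        # If two measurement points fall onto the same pixel, keep the one with
        # the smaller depth so that covered surfaces and edges stay hidden.
        if depth < zbuf[v][u]:
            zbuf[v][u] = depth
            values[v][u] = brightness
    return values, zbuf
```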


According to embodiments of the present invention, gap-filling takes place in dependence on the depth z0, z1, z2, i.e., on the distance to the plane E. The graphics card 32 selects all pixels P in parallel with respect to time. By way of example, one selected pixel P0 is now considered. The assigned depth z, i.e., the field element of the Z-buffer, contains the depth z0. For each selected pixel P0 the adjacent pixels P1, P2 are searched for consecutively, i.e., to the left and to the right and above and below. If the adjacent pixel P1 is not yet filled or if its depth z is greater than the depth z0 of the selected pixel P0, it is skipped and the second next pixel P is taken as adjacent pixel P1, if necessary iteratively. If an adjacent pixel P1, the depth z1 of which is smaller than the depth z0 of the selected pixel P0, is found in one of the directions, a change to the next direction takes place, and the adjacent pixel P2 (e.g., the depth z2 of which is smaller than the depth z0 of the selected pixel P0) is searched for. It is possible to define a maximum number of skipped pixels, i.e., if the adjacent pixel P1 or P2 is not found after skipping the maximum number of pixels, the search for P1 or P2 is aborted.
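The directional search with skipping can be sketched as follows, again reusing the buffers and constants from the previous snippets. The maximum number of skipped pixels (MAX_SKIP) is an assumed, illustrative parameter; it is not a value taken from the description.

```python
MAX_SKIP = 3  # assumed maximum number of skipped pixels per direction

def find_neighbor(zbuf, u0, v0, du, dv):
    """Walk from the selected pixel P0 at (u0, v0) in direction (du, dv) and
    return (u, v, depth) of the first pixel whose depth is smaller than z0,
    skipping empty or deeper pixels.  Returns None if the screen border is
    reached or more than MAX_SKIP pixels would have to be skipped."""
    z0 = zbuf[v0][u0]
    u, v = u0, v0
    for _ in range(MAX_SKIP + 1):
        u, v = u + du, v + dv
        if not (0 <= u < WIDTH and 0 <= v < HEIGHT):
            return None
        if zbuf[v][u] < z0:   # filled and closer to the plane E than P0
            return (u, v, zbuf[v][u])
    return None
```

For each selected pixel P0, the two neighbors P1 and P2 in opposite directions would then be obtained, for example with find_neighbor(zbuf, u0, v0, -1, 0) and find_neighbor(zbuf, u0, v0, +1, 0), and correspondingly for the vertical direction.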


If the adjacent pixels P1 and P2 to the selected pixel P0 have been found in opposite directions, with the depths z1 and z2 of the adjacent pixels P1 and P2 being smaller than the depth z0, it is checked whether P1 and P2 are on the same plane, i.e., whether the absolute value of the difference between z2 and z1 is below a depth threshold value zcrit, i.e.,

|z2−z1|<zcrit

applies. In such a case, the selected pixel P0 is filled with the value which is interpolated between P1 and P2, i.e., brightness and, if applicable, color. The assigned field element of the Z-buffer is likewise set to the interpolated depth between z1 and z2. The interpolation depends on the distances of the selected pixel P0 from P1 and P2 in the plane E.


If the difference between the depths is too great, i.e., the condition

|z2−z1|>zcrit

applies, it is assumed that P1 and P2 are located on different planes. The selected pixel P0 is then filled with the value, i.e., brightness and, if applicable, color, of, for example, the more remote of the pixels P1 and P2, and the assigned field element of the Z-buffer is filled with the greater depth z1 or z2. Alternatively, the value and the depth of the pixel P1 or P2 having the smaller depth z1 or z2 is transferred. In the case of more than two adjacent pixels P1, P2, the average value of the majority, i.e., of the adjacent pixels P1, P2 which are located on the same surface, can be transferred.
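The two cases, interpolation within one surface and edge preservation between different surfaces, can be sketched for one direction pair as follows. The sketch reuses the hypothetical find_neighbor and the values/zbuf buffers from the snippets above; Z_CRIT is an assumed threshold, only the horizontal direction pair is shown, and the edge case uses the "more remote pixel" variant described in the text.

```python
Z_CRIT = 0.05  # assumed threshold zcrit for the same-plane test

def fill_pixel(values, zbuf, u0, v0):
    """Gap-filling for one selected pixel P0, using its left/right neighbors P1, P2."""
    left = find_neighbor(zbuf, u0, v0, -1, 0)
    right = find_neighbor(zbuf, u0, v0, +1, 0)
    if left is None or right is None:
        return  # no suitable P1/P2 found: leave P0 unchanged (it may form an edge)
    (u1, v1, z1), (u2, v2, z2) = left, right
    if abs(z2 - z1) < Z_CRIT:
        # P1 and P2 lie on the same object plane: interpolate value and depth
        # according to the position of P0 between P1 and P2 in the plane E.
        t = (u0 - u1) / (u2 - u1)
        values[v0][u0] = (1.0 - t) * values[v1][u1] + t * values[v2][u2]
        zbuf[v0][u0] = (1.0 - t) * z1 + t * z2
    else:
        # Different planes: transfer value and depth of the more remote pixel,
        # so that the closer one of P1 and P2 forms a sharp edge.
        u_src, v_src, z_src = (u1, v1, z1) if z1 > z2 else (u2, v2, z2)
        values[v0][u0] = values[v_src][u_src]
        zbuf[v0][u0] = z_src
```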


Selected pixels P0 which are already filled with values of the measurement points are overwritten by the interpolation of the values of the adjacent pixels P1 and P2. Alternatively, a selected pixel P0 which is already filled remains unchanged.


If pixels P have been skipped when finding the pixels P1 and P2, because they were not filled or because their depth z was too great, their adjacent pixels P1, P2 are the same as those of the selected pixel P0. Within the framework of the selection taking place in parallel, the skipped pixels P and the assigned field elements of the Z-buffer are therefore likewise filled, either with a value and/or depth which is interpolated between the pixels P1 and P2 and/or the depths z1 and z2 (depending on the distance of the selected pixel P0 from P1 and P2 in the plane E), or with the value and/or the depth z1 or z2 of the more remote one of the pixels P1 and P2 (or the average value of the majority).


Due to the selection taking place in parallel, filling with the value and/or the depth z1 or z2 of the more remote of the pixels P1 and P2, on account of a depth difference which is too great, leads to the closer pixel P1 or P2 forming an edge. An edge is also generated if no adjacent pixel P1 or P2 is found whose depth z1 or z2 is smaller than the depth z0 of the selected pixel P0, because the selected pixel P0 is at the side of the screen 34; such selected pixels P0 at the edge are then not filled.


Gap-filling may be carried out once again to fill further pixels, i.e., to improve the representation further.


Gap-filling may take place in the control and evaluation device 22 or by software running on an external computer. Due to the time savings of the parallel selection, the hardware-based gap-filling on the graphics card 32 may be used together with the programming interface of the latter.

Claims
  • 1. A method comprising: projecting with a processor a plurality of measurement points in three-dimensional space onto a two-dimensional plane of a display screen, the display screen having a plurality of pixels; assigning with the processor each of the measurement points of the plurality of measurement points to one of the pixels in the plurality of pixels; assigning with the processor a depth value to each of the plurality of pixels that are assigned one of the measurement points of the plurality of measurement points; selecting with the processor a first pixel, the first pixel having a first measurement point of the plurality of measurement points assigned to the first pixel, the first pixel having a first depth value assigned to the first pixel; searching with the processor to a first side of the first pixel for a second pixel having a second measurement point of the plurality of measurement points assigned to the second pixel, the second pixel having a second depth value assigned to the second pixel; searching with the processor to a second side of the first pixel for a third pixel having a third measurement point of the plurality of measurement points assigned to the third pixel, the second side being opposite the first side, the third pixel having a third depth value assigned to the third pixel; determining with the processor that the second measurement point and the third measurement point are on a same object plane based at least in part on the second depth value and the third depth value; and changing with the processor the first depth value assigned to the first pixel based on determining the second measurement point and the third measurement point are on the same object plane.
  • 2. The method of claim 1, wherein the changing of the first depth value assigned to the first pixel is changed based on an interpolation between the second depth value and the third depth value based on a distance from the first measurement point to the second measurement point and third measurement point.
  • 3. The method of claim 1, wherein the determination that the second measurement point and the third measurement point are on the same object plane is determined based on a difference between the second depth value and the third depth value being less than a predetermined value.
  • 4. The method of claim 3, further comprising: determining with the processor the second measurement point and the third measurement point are not on the same object plane based on the difference between the second depth value and the third depth value being greater than or equal to the predetermined value; and changing with the processor the first depth value to be equal to the second depth value based on determining the second measurement point and the third measurement point are not on the same object plane, the second depth value being greater than the third depth value.
  • 5. The method of claim 4, further comprising: determining with the processor a fourth pixel between the first pixel and the second pixel, the fourth pixel not having one of the measurement points of the plurality of measurement points assigned to the fourth pixel; changing with the processor a fourth depth value of the fourth pixel to the second depth value based on the second measurement point and the third measurement point are not on the same object plane; and changing with the processor the fourth depth value based on an interpolation between the second depth value and the third depth value based on a distance from the fourth measurement point to the second measurement point and third measurement point based on the second measurement point and the third measurement point being on the same object plane.
  • 6. The method of claim 1, further comprising assigning a brightness value and a color to the first pixel, the second pixel and the third pixel.
  • 7. The method of claim 1, wherein the projection of the plurality of measurement points onto the two-dimensional plane is a perspective projection.
  • 8. The method of claim 1, wherein the projection of the plurality of measurement points onto the two-dimensional plane is an orthogonal projection.
  • 9. The method of claim 1, wherein the selecting of the first pixel selects all of the pixels in parallel.
  • 10. The method of claim 1, further comprising measuring a first portion of the plurality of measurement points in a first scan at a first position and measuring a second portion of the plurality of measurement points in a second scan at a second position, the first position and second position being in different locations.
  • 11. A system comprising: a memory having computer readable instructions; and one or more processors for executing the computer readable instructions, the computer readable instructions comprising: projecting with a processor a plurality of measurement points in three-dimensional space onto a two-dimensional plane of a display screen, the display screen having a plurality of pixels; assigning with the processor each of the measurement points of the plurality of measurement points to one of the pixels in the plurality of pixels; assigning with the processor a depth value to each of the plurality of pixels that are assigned one of the measurement points of the plurality of measurement points; selecting with the processor a first pixel, the first pixel having a first measurement point of the plurality of measurement points assigned to the first pixel, the first pixel having a first depth value assigned to the first pixel; searching with the processor to a first side of the first pixel for a second pixel having a second measurement point of the plurality of measurement points assigned to the second pixel, the second pixel having a second depth value assigned to the second pixel; searching with the processor to a second side of the first pixel for a third pixel having a third measurement point of the plurality of measurement points assigned to the third pixel, the second side being opposite the first side, the third pixel having a third depth value assigned to the third pixel; determining with the processor that the second measurement point and the third measurement point are on a same object plane based at least in part on the second depth value and the third depth value; and changing with the processor the first depth value assigned to the first pixel based on determining the second measurement point and the third measurement point are on the same object plane.
  • 12. The system of claim 11, wherein the changing of the first depth value assigned to the first pixel is changed based on an interpolation between the second depth value and the third depth value based on a distance from the first measurement point to the second measurement point and third measurement point.
  • 13. The system of claim 11, wherein the determination that the second measurement point and the third measurement point are on the same object plane is determined based on a difference between the second depth value and the third depth value being less than a predetermined value.
  • 14. The system of claim 13, wherein the computer readable instructions further comprise: determining with the processor the second measurement point and the third measurement point are not on the same object plane based on the difference between the second depth value and the third depth value being greater than or equal to the predetermined value; and changing with the processor the first depth value to be equal to the second depth value based on determining the second measurement point and the third measurement point are not on the same object plane, the second depth value being greater than the third depth value.
  • 15. The system of claim 14, wherein the computer readable instructions further comprise: determining with the processor a fourth pixel between the first pixel and the second pixel, the fourth pixel not having one of the measurement points of the plurality of measurement points assigned to the fourth pixel; changing with the processor a fourth depth value of the fourth pixel to the second depth value based on the second measurement point and the third measurement point are not on the same object plane; and changing with the processor the fourth depth value based on an interpolation between the second depth value and the third depth value based on a distance from the fourth measurement point to the second measurement point and third measurement point based on the second measurement point and the third measurement point being on the same object plane.
  • 16. The system of claim 11, further comprising a device for measuring an environment, the device being configured to measure the plurality of measurement points.
  • 17. The system of claim 16, wherein the device is configured to measure the plurality of measurement points optically.
  • 18. A computer program product for displaying a plurality of measurement points in three-dimensional space on a two-dimensional plane of a display screen, the computer program product comprising a non-transitory computer readable storage medium having program instructions embodied therewith, the program instructions executable by a processor to cause the processor to perform: projecting with the processor the plurality of measurement points onto the two-dimensional plane of the display screen, the display screen having a plurality of pixels; assigning with the processor each of the measurement points of the plurality of measurement points to one of the pixels in the plurality of pixels; assigning with the processor a depth value to each of the plurality of pixels that are assigned one of the measurement points of the plurality of measurement points; selecting with the processor a first pixel, the first pixel having a first measurement point of the plurality of measurement points assigned to the first pixel, the first pixel having a first depth value assigned to the first pixel; searching with the processor to a first side of the first pixel for a second pixel having a second measurement point of the plurality of measurement points assigned to the second pixel, the second pixel having a second depth value assigned to the second pixel; searching with the processor to a second side of the first pixel for a third pixel having a third measurement point of the plurality of measurement points assigned to the third pixel, the second side being opposite the first side, the third pixel having a third depth value assigned to the third pixel; determining with the processor that the second measurement point and the third measurement point are on a same object plane based at least in part on the second depth value and the third depth value; and changing with the processor the first depth value assigned to the first pixel based on determining the second measurement point and the third measurement point are on the same object plane.
  • 19. The computer program product of claim 18, wherein: the changing of the first depth value assigned to the first pixel is changed based on an interpolation between the second depth value and the third depth value based on a distance from the first measurement point to the second measurement point and third measurement point; and the determination that the second measurement point and the third measurement point are on the same object plane is determined based on a difference between the second depth value and the third depth value being less than a predetermined value.
  • 20. The computer program product of claim 19 further comprising: determining with the processor the second measurement point and the third measurement point are not on the same object plane based on the difference between the second depth value and the third depth value being greater than or equal to the predetermined value; changing with the processor the first depth value to be equal to the second depth value based on determining the second measurement point and the third measurement point are not on the same object plane, the second depth value being greater than the third depth value; determining with the processor a fourth pixel between the first pixel and the second pixel, the fourth pixel not having one of the measurement points of the plurality of measurement points assigned to the fourth pixel; changing with the processor a fourth depth value of the fourth pixel to the second depth value based on the second measurement point and the third measurement point are not on the same object plane; and changing with the processor the fourth depth value based on an interpolation between the second depth value and the third depth value based on a distance from the fourth measurement point to the second measurement point and third measurement point based on the second measurement point and the third measurement point being on the same object plane.
Priority Claims (1)
Number Date Country Kind
10 2010 020 952 May 2010 DE national
CROSS REFERENCE TO RELATED APPLICATIONS

The present application is a continuation application of U.S. application Ser. No. 13/697,031 filed on Apr. 1, 2011, which is a National Stage Application of PCT Application No. PCT/EP2011/001662, filed on Apr. 1, 2011, which claims the benefit of U.S. Provisional Patent Application No. 61/362,810, filed on Jul. 9, 2010, and of pending German Patent Application No. DE 10 2010 020 952.2, filed on May 10, 2010, and which are hereby incorporated by reference.

US Referenced Citations (740)
Number Name Date Kind
1535312 Hosking Apr 1925 A
1538758 Taylor May 1925 A
1918813 Kinzy Jul 1933 A
2316573 Egy Apr 1943 A
2333243 Glab Nov 1943 A
2702683 Green et al. Feb 1955 A
2748926 Leahy Jun 1956 A
2983367 Paramater et al. Jun 1958 A
2924495 Haines Sep 1958 A
2966257 Littlejohn Dec 1960 A
3066790 Armbruster Dec 1962 A
3458167 Cooley, Jr. Jul 1969 A
3830567 Riegl Aug 1974 A
3899145 Stephenson Aug 1975 A
3945729 Rosen Mar 1976 A
4138045 Baker Feb 1979 A
4178515 Tarasevich Dec 1979 A
4340008 Mendelson Jul 1982 A
4379461 Nilsson et al. Apr 1983 A
4424899 Rosenberg Jan 1984 A
4430796 Nakagawa Feb 1984 A
4457625 Greenleaf et al. Jul 1984 A
4506448 Topping et al. Mar 1985 A
4537233 Vroonland et al. Aug 1985 A
4561776 Pryor Dec 1985 A
4606696 Slocum Aug 1986 A
4659280 Akeel Apr 1987 A
4663852 Guarini May 1987 A
4664588 Newell et al. May 1987 A
4667231 Pryor May 1987 A
4676002 Slocum Jun 1987 A
4714339 Lau et al. Dec 1987 A
4733961 Mooney Mar 1988 A
4736218 Kutman Apr 1988 A
4751950 Bock Jun 1988 A
4767257 Kato Aug 1988 A
4790651 Brown et al. Dec 1988 A
4816822 Vache et al. Mar 1989 A
4870274 Hebert et al. Sep 1989 A
4882806 Davis Nov 1989 A
4891509 Jones et al. Jan 1990 A
4954952 Ubhayakar et al. Sep 1990 A
4982841 Goedecke Jan 1991 A
4984881 Osada et al. Jan 1991 A
4996909 Vache et al. Mar 1991 A
4999491 Semler et al. Mar 1991 A
5021641 Swartz et al. Jun 1991 A
5025966 Potter Jun 1991 A
5027951 Johnson Jul 1991 A
5068971 Simon Dec 1991 A
5069524 Watanabe et al. Dec 1991 A
5155684 Burke et al. Oct 1992 A
5168532 Seppi et al. Dec 1992 A
5189797 Granger Mar 1993 A
5205111 Johnson Apr 1993 A
5211476 Coudroy May 1993 A
5213240 Dietz et al. May 1993 A
5216479 Dotan et al. Jun 1993 A
5218427 Koch Jun 1993 A
5219423 Kamaya Jun 1993 A
5239855 Schleifer et al. Aug 1993 A
5289264 Steinbichler Feb 1994 A
5289265 Inoue et al. Feb 1994 A
5289855 Baker et al. Mar 1994 A
5313261 Leatham et al. May 1994 A
5319445 Fitts Jun 1994 A
5329347 Wallace et al. Jul 1994 A
5329467 Nagamune et al. Jul 1994 A
5332315 Baker et al. Jul 1994 A
5337149 Kozah Aug 1994 A
5371347 Plesko Dec 1994 A
5372250 Johnson Dec 1994 A
5373346 Hocker Dec 1994 A
5402365 Kozikaro et al. Mar 1995 A
5402582 Raab Apr 1995 A
5412880 Raab May 1995 A
5416505 Eguchi et al. May 1995 A
5430384 Hocker Jul 1995 A
5446846 Lennartsson Aug 1995 A
5455670 Payne et al. Oct 1995 A
5455993 Link et al. Oct 1995 A
5510977 Raab Apr 1996 A
5517297 Stenton May 1996 A
5528505 Granger et al. Jun 1996 A
5535524 Carrier et al. Jul 1996 A
5563655 Lathrop Oct 1996 A
5611147 Raab Mar 1997 A
5615489 Breyer et al. Apr 1997 A
5623416 Hocker, III Apr 1997 A
5629756 Kitajima May 1997 A
5668631 Norita et al. Sep 1997 A
5675326 Juds et al. Oct 1997 A
5677760 Mikami et al. Oct 1997 A
5682508 Hocker, III Oct 1997 A
5716036 Isobe et al. Feb 1998 A
5724264 Rosenberg et al. Mar 1998 A
5734417 Yamamoto et al. Mar 1998 A
5745225 Watanabe et al. Apr 1998 A
5752112 Paddock et al. May 1998 A
5754449 Hoshal et al. May 1998 A
5768792 Raab Jun 1998 A
5793993 Broedner et al. Aug 1998 A
5804805 Koenck et al. Sep 1998 A
5825666 Freifeld Oct 1998 A
5829148 Eaton Nov 1998 A
5831719 Berg et al. Nov 1998 A
5832416 Anderson Nov 1998 A
5844591 Takamatsu et al. Dec 1998 A
5856874 Tachibana et al. Jan 1999 A
5887122 Terawaki et al. Mar 1999 A
5894123 Ohtomo et al. Apr 1999 A
5898490 Ohtomo et al. Apr 1999 A
5909939 Fugmann Jun 1999 A
5926782 Raab Jul 1999 A
5933267 Ishizuka Aug 1999 A
5936721 Ohtomo et al. Aug 1999 A
5940170 Berg et al. Aug 1999 A
5940181 Tsubono et al. Aug 1999 A
5956661 Lefebvre Sep 1999 A
5956857 Raab Sep 1999 A
5969321 Danielson et al. Oct 1999 A
5973788 Pettersen et al. Oct 1999 A
5978748 Raab Nov 1999 A
5983936 Schwieterman et al. Nov 1999 A
5988862 Kacyra et al. Nov 1999 A
5991011 Damm Nov 1999 A
5996790 Yamada et al. Dec 1999 A
5997779 Potter Dec 1999 A
6040898 Mrosik et al. Mar 2000 A
D423534 Raab et al. Apr 2000 S
6050615 Weinhold Apr 2000 A
6060889 Hocker May 2000 A
6067116 Yamano et al. May 2000 A
6069700 Rudnick et al. May 2000 A
6077306 Metzger et al. Jun 2000 A
6112423 Sheehan Sep 2000 A
6125337 Rosenberg et al. Sep 2000 A
6131299 Raab et al. Oct 2000 A
6134507 Markey, Jr. et al. Oct 2000 A
6138915 Danielson et al. Oct 2000 A
6149112 Thieltges Nov 2000 A
6151789 Raab et al. Nov 2000 A
6163294 Talbot Dec 2000 A
6166504 Iida et al. Dec 2000 A
6166809 Pettersen et al. Dec 2000 A
6166811 Long et al. Dec 2000 A
6204651 Marcus et al. Mar 2001 B1
6204961 Anderson et al. Mar 2001 B1
6219928 Raab et al. Apr 2001 B1
D441632 Raab et al. May 2001 S
6240651 Schroeder et al. Jun 2001 B1
6246468 Dimsdale Jun 2001 B1
6253458 Raab et al. Jul 2001 B1
6282195 Miller et al. Aug 2001 B1
6298569 Raab et al. Oct 2001 B1
6339410 Milner et al. Jan 2002 B1
6349249 Cunningham Feb 2002 B1
6366831 Raab Apr 2002 B1
6408252 De Smet Jun 2002 B1
6418774 Brogaardh et al. Jul 2002 B1
6438856 Kaczynski Aug 2002 B1
6442419 Chu et al. Aug 2002 B1
6445446 Kumagai et al. Sep 2002 B1
6460004 Greer et al. Oct 2002 B2
6470584 Stoodley Oct 2002 B1
6477784 Schroeder et al. Nov 2002 B2
6480270 Studnicka et al. Nov 2002 B1
6483106 Ohtomo et al. Nov 2002 B1
6497394 Dunchock Dec 2002 B1
6504602 Hinderling Jan 2003 B1
6512575 Marchi Jan 2003 B1
6519860 Bieg et al. Feb 2003 B1
D472824 Raab et al. Apr 2003 S
6542249 Kofman Apr 2003 B1
6547397 Kaufman et al. Apr 2003 B1
6598306 Eaton Jul 2003 B2
6611346 Granger Aug 2003 B2
6611617 Crampton Aug 2003 B1
D479544 Raab et al. Sep 2003 S
6612044 Raab et al. Sep 2003 B2
6621065 Fukumoto et al. Sep 2003 B1
6626339 Gates et al. Sep 2003 B2
6633051 Holloway et al. Oct 2003 B1
6649208 Rodgers Nov 2003 B2
6650402 Sullivan et al. Nov 2003 B2
6668466 Bieg et al. Dec 2003 B1
6675122 Markendorf et al. Jan 2004 B1
6681495 Masayuki et al. Jan 2004 B2
6710859 Shirai et al. Mar 2004 B2
D490831 Raab et al. Jun 2004 S
D491210 Raab et al. Jun 2004 S
6750873 Bernardini et al. Jun 2004 B1
6764185 Beardsley et al. Jul 2004 B1
6789327 Roth et al. Sep 2004 B2
6820346 Raab et al. Nov 2004 B2
6822749 Christoph Nov 2004 B1
6825923 Hamar et al. Nov 2004 B2
6826664 Hocker, III et al. Nov 2004 B2
6856381 Christoph Feb 2005 B2
6858836 Hartrumpf Feb 2005 B1
6868359 Raab Mar 2005 B2
6879933 Steffey et al. Apr 2005 B2
6889903 Koenck May 2005 B1
6892465 Raab et al. May 2005 B2
6894767 Ishinabe et al. May 2005 B2
6895347 Dorny et al. May 2005 B2
6901673 Cobb et al. Jun 2005 B1
6904691 Raab et al. Jun 2005 B2
6914678 Ulrichsen et al. Jul 2005 B1
6917415 Gogolla et al. Jul 2005 B2
6920697 Raab et al. Jul 2005 B2
6922234 Hoffman Jul 2005 B2
6925722 Raab et al. Aug 2005 B2
6931745 Granger Aug 2005 B2
6935036 Raab et al. Aug 2005 B2
6935748 Kaufman et al. Aug 2005 B2
6948255 Russell Sep 2005 B2
6957496 Raab et al. Oct 2005 B2
6965843 Raab et al. Nov 2005 B2
6973734 Raab et al. Dec 2005 B2
6988322 Raab et al. Jan 2006 B2
6989890 Riegl et al. Jan 2006 B2
7003892 Eaton et al. Feb 2006 B2
7006084 Buss et al. Feb 2006 B1
7024032 Kidd et al. Apr 2006 B2
7029126 Tang Apr 2006 B2
7032321 Raab et al. Apr 2006 B2
7040136 Forss et al. May 2006 B2
7051447 Kikuchi et al. May 2006 B2
7069124 Whittaker et al. Jun 2006 B1
7069875 Warecki Jul 2006 B2
7076420 Snyder et al. Jul 2006 B1
7106421 Matsuura et al. Sep 2006 B2
7117107 Dorny et al. Oct 2006 B2
7120092 del Prado Pavon et al. Oct 2006 B2
7127822 Kumagai et al. Oct 2006 B2
7140213 Feucht et al. Nov 2006 B2
7152456 Eaton Dec 2006 B2
7174651 Raab et al. Feb 2007 B2
7180072 Persi et al. Feb 2007 B2
7184047 Crampton Feb 2007 B1
7190465 Froehlich et al. Mar 2007 B2
7191541 Weekers et al. Mar 2007 B1
7193690 Ossig et al. Mar 2007 B2
7196509 Teng Mar 2007 B2
7199872 Van Cranenbroeck Apr 2007 B2
7200246 Cofer et al. Apr 2007 B2
7202941 Munro Apr 2007 B2
7230689 Lau Jun 2007 B2
7242590 Yeap et al. Jul 2007 B1
7246030 Raab et al. Jul 2007 B2
7249421 MacManus et al. Jul 2007 B2
7256899 Faul et al. Aug 2007 B1
7269910 Raab et al. Sep 2007 B2
D551943 Hodjat et al. Oct 2007 S
7285793 Husted Oct 2007 B2
7296364 Seitz et al. Nov 2007 B2
7296955 Dreier Nov 2007 B2
7296979 Raab et al. Nov 2007 B2
7306339 Kaufman et al. Dec 2007 B2
7307701 Hoffman, II Dec 2007 B2
7312862 Zumbrunn et al. Dec 2007 B2
7313264 Crampton Dec 2007 B2
D559657 Wohlford et al. Jan 2008 S
7319512 Ohtomo et al. Jan 2008 B2
7337344 Barman et al. Feb 2008 B2
7342650 Kern et al. Mar 2008 B2
7348822 Baer Mar 2008 B2
7352446 Bridges et al. Apr 2008 B2
7360648 Blaschke Apr 2008 B1
7372558 Kaufman et al. May 2008 B2
7372581 Raab et al. May 2008 B2
7383638 Granger Jun 2008 B2
7388654 Raab et al. Jun 2008 B2
7389870 Slappay Jun 2008 B2
7395606 Crampton Jul 2008 B2
7400384 Evans et al. Jul 2008 B1
7403268 England Jul 2008 B2
7403269 Yamashita et al. Jul 2008 B2
7430068 Becker et al. Sep 2008 B2
7441341 Eaton Oct 2008 B2
7443555 Blug et al. Oct 2008 B2
7447931 Rischar et al. Nov 2008 B1
7449876 Pleasant et al. Nov 2008 B2
7454265 Marsh Nov 2008 B2
7463368 Morden et al. Dec 2008 B2
7477359 England Jan 2009 B2
7477360 England Jan 2009 B2
7480037 Palmateer et al. Jan 2009 B2
7508971 Vaccaro et al. Mar 2009 B2
7515256 Ohtomo et al. Apr 2009 B2
7525276 Eaton Apr 2009 B2
7541830 Fahrbach et al. Jun 2009 B2
7545517 Rueb et al. Jun 2009 B2
7546689 Ferrari et al. Jun 2009 B2
7551771 England, III Jun 2009 B2
7552644 Haase et al. Jun 2009 B2
7557824 Holliman Jul 2009 B2
7561598 Stratton et al. Jul 2009 B2
7564250 Hocker Jul 2009 B2
7568293 Ferrari Aug 2009 B2
7578069 Eaton Aug 2009 B2
D599226 Gerent et al. Sep 2009 S
7589595 Cutler Sep 2009 B2
7591077 Pettersson Sep 2009 B2
7591078 Crampton Sep 2009 B2
7600061 Honda Oct 2009 B2
7602873 Eidson Oct 2009 B2
7604207 Hasloecher et al. Oct 2009 B2
7610175 Eidson Oct 2009 B2
7614157 Granger Nov 2009 B2
7624510 Ferrari Dec 2009 B2
7625335 Deichmann et al. Dec 2009 B2
7626690 Kumagai et al. Dec 2009 B2
D607350 Cooduvalli et al. Jan 2010 S
7656751 Rischar et al. Feb 2010 B2
7659995 Knighton et al. Feb 2010 B2
D610926 Gerent et al. Mar 2010 S
7693325 Pulla et al. Apr 2010 B2
7697748 Dimsdale et al. Apr 2010 B2
7701592 Saint Clair et al. Apr 2010 B2
7712224 Hicks May 2010 B2
7721396 Fleischman May 2010 B2
7728963 Kirschner Jun 2010 B2
7733544 Becker et al. Jun 2010 B2
7735234 Briggs et al. Jun 2010 B2
7743524 Eaton et al. Jun 2010 B2
7752003 MacManus Jul 2010 B2
7765707 Tomelleri Aug 2010 B2
7769559 Reichert Aug 2010 B2
7774949 Ferrari Aug 2010 B2
7777761 England Aug 2010 B2
7779548 Ferrari Aug 2010 B2
7779553 Jordil et al. Aug 2010 B2
7784194 Raab et al. Aug 2010 B2
7787670 Urushiya Aug 2010 B2
7793425 Bailey Sep 2010 B2
7798453 Maningo et al. Sep 2010 B2
7800758 Bridges et al. Sep 2010 B1
7804602 Raab Sep 2010 B2
7805851 Pettersson Oct 2010 B2
7805854 Eaton Oct 2010 B2
7809518 Zhu et al. Oct 2010 B2
7834985 Morcom Nov 2010 B2
7847922 Gittinger Dec 2010 B2
RE42055 Raab Jan 2011 E
7869005 Ossig et al. Jan 2011 B2
RE42082 Raab et al. Feb 2011 E
7881896 Atwell et al. Feb 2011 B2
7891248 Hough et al. Feb 2011 B2
7900714 Milbourne et al. Mar 2011 B2
7903261 Saint Clair et al. Mar 2011 B2
7908757 Ferrari Mar 2011 B2
7933055 Jensen et al. Apr 2011 B2
7935928 Serger et al. May 2011 B2
7982866 Vogel Jul 2011 B2
D643319 Ferrari et al. Aug 2011 S
7994465 Bamji et al. Aug 2011 B1
7995834 Knighton et al. Aug 2011 B1
8001697 Danielson et al. Aug 2011 B2
8020657 Allard et al. Sep 2011 B2
8028432 Bailey et al. Oct 2011 B2
8052857 Townsend Nov 2011 B2
8065861 Caputo Nov 2011 B2
8082673 Desforges et al. Dec 2011 B2
8099877 Champ Jan 2012 B2
8117668 Crampton et al. Feb 2012 B2
8123350 Cannell et al. Feb 2012 B2
8152071 Doherty et al. Apr 2012 B2
D659035 Ferrari et al. May 2012 S
8171650 York et al. May 2012 B2
D662427 Bailey et al. Jun 2012 S
8218131 Otani et al. Jul 2012 B2
8269984 Hinderling et al. Sep 2012 B2
8276286 Bailey et al. Oct 2012 B2
8284407 Briggs et al. Oct 2012 B2
8310653 Ogawa et al. Nov 2012 B2
8346392 Walser et al. Jan 2013 B2
8352212 Fetter et al. Jan 2013 B2
8353059 Crampton et al. Jan 2013 B2
D676341 Bailey et al. Feb 2013 S
8384914 Becker Feb 2013 B2
D678085 Bailey et al. Mar 2013 S
8391565 Purcell et al. Mar 2013 B2
8402669 Ferrari et al. Mar 2013 B2
8497901 Pettersson Jul 2013 B2
8533967 Bailey et al. Sep 2013 B2
8537374 Briggs et al. Sep 2013 B2
8619265 Steffey et al. Dec 2013 B2
8659748 Dakin et al. Feb 2014 B2
8659752 Cramer et al. Feb 2014 B2
8661700 Briggs et al. Mar 2014 B2
8677643 Bridges et al. Mar 2014 B2
8683709 York Apr 2014 B2
8699007 Becker et al. Apr 2014 B2
8705012 Greiner et al. Apr 2014 B2
8705016 Schumann et al. Apr 2014 B2
8718837 Wang et al. May 2014 B2
8784425 Ritchey et al. Jul 2014 B2
8797552 Suzuki et al. Aug 2014 B2
8811767 Veeraraghavan et al. Aug 2014 B2
8830485 Woloschyn Sep 2014 B2
9113023 Bridges et al. Aug 2015 B2
9279662 Steffey et al. Mar 2016 B2
9372265 Zweigle et al. Jun 2016 B2
9383587 Balogh Jul 2016 B2
20010004269 Shibata et al. Jun 2001 A1
20020015934 Rubbert et al. Feb 2002 A1
20020032541 Raab et al. Mar 2002 A1
20020059042 Kacyra et al. May 2002 A1
20020087233 Raab Jul 2002 A1
20020128790 Woodmansee Sep 2002 A1
20020143506 D'Aligny et al. Oct 2002 A1
20020149694 Seo Oct 2002 A1
20020170192 Steffey et al. Nov 2002 A1
20020176097 Rodgers Nov 2002 A1
20030002055 Kilthau et al. Jan 2003 A1
20030033104 Gooche Feb 2003 A1
20030043386 Froehlich et al. Mar 2003 A1
20030053037 Blaesing-Bangert et al. Mar 2003 A1
20030066954 Hipp Apr 2003 A1
20030090646 Riegl et al. May 2003 A1
20030125901 Steffey et al. Jul 2003 A1
20030137449 Vashisth et al. Jul 2003 A1
20030142631 Silvester Jul 2003 A1
20030167647 Raab et al. Sep 2003 A1
20030172536 Raab et al. Sep 2003 A1
20030172537 Raab et al. Sep 2003 A1
20030179361 Ohtomo et al. Sep 2003 A1
20030208919 Raab et al. Nov 2003 A1
20030221326 Raab et al. Dec 2003 A1
20040004727 Yanagisawa et al. Jan 2004 A1
20040022416 Lemelson et al. Feb 2004 A1
20040027554 Ishinabe et al. Feb 2004 A1
20040040166 Raab et al. Mar 2004 A1
20040103547 Raab et al. Jun 2004 A1
20040111908 Raab et al. Jun 2004 A1
20040119020 Bodkin Jun 2004 A1
20040135990 Ohtomo et al. Jul 2004 A1
20040139265 Hocker, III et al. Jul 2004 A1
20040158355 Holmqvist et al. Aug 2004 A1
20040162700 Rosenberg et al. Aug 2004 A1
20040179570 Vitruk et al. Sep 2004 A1
20040221790 Sinclair et al. Nov 2004 A1
20040246462 Kaneko et al. Dec 2004 A1
20040246589 Kim et al. Dec 2004 A1
20040259533 Nixon et al. Dec 2004 A1
20050016008 Raab et al. Jan 2005 A1
20050024625 Mori et al. Feb 2005 A1
20050028393 Raab et al. Feb 2005 A1
20050046823 Ando et al. Mar 2005 A1
20050058332 Kaufman et al. Mar 2005 A1
20050082262 Rueb et al. Apr 2005 A1
20050085940 Griggs et al. Apr 2005 A1
20050111514 Matsumoto et al. May 2005 A1
20050141052 Becker Jun 2005 A1
20050144799 Raab et al. Jul 2005 A1
20050150123 Eaton Jul 2005 A1
20050151963 Pulla et al. Jul 2005 A1
20050166413 Crampton Aug 2005 A1
20050172503 Kumagai et al. Aug 2005 A1
20050188557 Raab et al. Sep 2005 A1
20050190384 Persi et al. Sep 2005 A1
20050259271 Christoph Nov 2005 A1
20050276466 Vaccaro et al. Dec 2005 A1
20050283989 Pettersson Dec 2005 A1
20060016086 Raab et al. Jan 2006 A1
20060017720 Li Jan 2006 A1
20060026851 Raab et al. Feb 2006 A1
20060028203 Kawashima et al. Feb 2006 A1
20060053647 Raab et al. Mar 2006 A1
20060056459 Stratton et al. Mar 2006 A1
20060056559 Pleasant et al. Mar 2006 A1
20060059270 Pleasant et al. Mar 2006 A1
20060061566 Verma et al. Mar 2006 A1
20060066836 Bridges et al. Mar 2006 A1
20060088044 Hammerl Apr 2006 A1
20060096108 Raab et al. May 2006 A1
20060103853 Palmateer May 2006 A1
20060109536 Mettenleiter et al. May 2006 A1
20060123649 Muller Jun 2006 A1
20060129349 Raab et al. Jun 2006 A1
20060132803 Clair et al. Jun 2006 A1
20060145703 Steinbichler et al. Jul 2006 A1
20060169050 Kobayashi et al. Aug 2006 A1
20060169608 Carnevali et al. Aug 2006 A1
20060170870 Kaufman et al. Aug 2006 A1
20060182314 England et al. Aug 2006 A1
20060186301 Dozier et al. Aug 2006 A1
20060193521 England, III et al. Aug 2006 A1
20060241791 Pokorny et al. Oct 2006 A1
20060244746 England et al. Nov 2006 A1
20060245717 Ossig et al. Nov 2006 A1
20060279246 Hashimoto et al. Dec 2006 A1
20060282574 Zotov et al. Dec 2006 A1
20060287769 Yanagita et al. Dec 2006 A1
20060291970 Granger Dec 2006 A1
20070030841 Lee et al. Feb 2007 A1
20070043526 De Jonge et al. Feb 2007 A1
20070050774 Eldson et al. Mar 2007 A1
20070055806 Stratton et al. Mar 2007 A1
20070058154 Reichert et al. Mar 2007 A1
20070058162 Granger Mar 2007 A1
20070064976 England, III Mar 2007 A1
20070097381 Tobiason et al. May 2007 A1
20070097382 Granger May 2007 A1
20070100498 Matsumoto et al. May 2007 A1
20070105238 Mandl et al. May 2007 A1
20070118269 Gibson et al. May 2007 A1
20070122250 Mullner May 2007 A1
20070142970 Burbank et al. Jun 2007 A1
20070147265 Eidson et al. Jun 2007 A1
20070147435 Hamilton et al. Jun 2007 A1
20070147562 Eidson Jun 2007 A1
20070150111 Wu et al. Jun 2007 A1
20070151390 Blumenkranz et al. Jul 2007 A1
20070153297 Lau Jul 2007 A1
20070163134 Eaton Jul 2007 A1
20070163136 Eaton et al. Jul 2007 A1
20070171220 Kriveshko Jul 2007 A1
20070171394 Steiner et al. Jul 2007 A1
20070172112 Paley et al. Jul 2007 A1
20070176648 Baer Aug 2007 A1
20070177016 Wu Aug 2007 A1
20070181685 Zhu et al. Aug 2007 A1
20070183459 Eidson Aug 2007 A1
20070185682 Eidson Aug 2007 A1
20070217169 Yeap et al. Sep 2007 A1
20070217170 Yeap et al. Sep 2007 A1
20070221522 Yamada et al. Sep 2007 A1
20070223477 Eidson Sep 2007 A1
20070229801 Tearney et al. Oct 2007 A1
20070229929 Soreide et al. Oct 2007 A1
20070247615 Bridges et al. Oct 2007 A1
20070248122 Hamilton Oct 2007 A1
20070256311 Ferrari Nov 2007 A1
20070257660 Pleasant et al. Nov 2007 A1
20070258378 Hamilton Nov 2007 A1
20070282564 Sprague et al. Dec 2007 A1
20070294045 Atwell et al. Dec 2007 A1
20080046221 Stathis Feb 2008 A1
20080052808 Leick et al. Mar 2008 A1
20080052936 Briggs et al. Mar 2008 A1
20080066583 Lott et al. Mar 2008 A1
20080068103 Cutler Mar 2008 A1
20080075325 Otani et al. Mar 2008 A1
20080075326 Otani et al. Mar 2008 A1
20080080562 Burch et al. Apr 2008 A1
20080096108 Sumiyama et al. Apr 2008 A1
20080098272 Fairbanks et al. Apr 2008 A1
20080148585 Raab et al. Jun 2008 A1
20080154538 Stathis Jun 2008 A1
20080179206 Feinstein et al. Jul 2008 A1
20080183065 Goldbach Jul 2008 A1
20080196260 Pettersson Aug 2008 A1
20080204699 Benz et al. Aug 2008 A1
20080216552 Ibach et al. Sep 2008 A1
20080218728 Kirschner Sep 2008 A1
20080228331 McNerney et al. Sep 2008 A1
20080232269 Tatman et al. Sep 2008 A1
20080235969 Jordil et al. Oct 2008 A1
20080235970 Crampton Oct 2008 A1
20080240321 Narus et al. Oct 2008 A1
20080245452 Law et al. Oct 2008 A1
20080246943 Kaufman et al. Oct 2008 A1
20080252671 Cannell et al. Oct 2008 A1
20080256814 Pettersson Oct 2008 A1
20080257023 Jordil et al. Oct 2008 A1
20080263411 Baney et al. Oct 2008 A1
20080271332 Jordil et al. Nov 2008 A1
20080273758 Fuchs et al. Nov 2008 A1
20080282564 Pettersson Nov 2008 A1
20080295349 Uhl et al. Dec 2008 A1
20080298254 Eidson Dec 2008 A1
20080302200 Tobey Dec 2008 A1
20080309460 Jefferson et al. Dec 2008 A1
20080309546 Wakayama et al. Dec 2008 A1
20090000136 Crampton Jan 2009 A1
20090010740 Ferrari et al. Jan 2009 A1
20090013548 Ferrari Jan 2009 A1
20090016475 Rischar et al. Jan 2009 A1
20090021351 Beniyama et al. Jan 2009 A1
20090031575 Tomelleri Feb 2009 A1
20090046140 Lashmet et al. Feb 2009 A1
20090046752 Bueche et al. Feb 2009 A1
20090046895 Pettersson et al. Feb 2009 A1
20090049704 Styles et al. Feb 2009 A1
20090051938 Miousset et al. Feb 2009 A1
20090083985 Ferrari Apr 2009 A1
20090089004 Vook et al. Apr 2009 A1
20090089078 Bursey Apr 2009 A1
20090089233 Gach et al. Apr 2009 A1
20090089623 Neering et al. Apr 2009 A1
20090095047 Patel et al. Apr 2009 A1
20090100949 Shirai et al. Apr 2009 A1
20090109797 Eidson Apr 2009 A1
20090113183 Barford et al. Apr 2009 A1
20090113229 Cataldo et al. Apr 2009 A1
20090122805 Epps et al. May 2009 A1
20090125196 Velazquez et al. May 2009 A1
20090133276 Bailey et al. May 2009 A1
20090133494 Van Dam et al. May 2009 A1
20090139105 Granger Jun 2009 A1
20090157419 Bursey Jun 2009 A1
20090161091 Yamamoto Jun 2009 A1
20090165317 Little Jul 2009 A1
20090177435 Heininen Jul 2009 A1
20090177438 Raab Jul 2009 A1
20090185741 Nahari et al. Jul 2009 A1
20090187373 Atwell Jul 2009 A1
20090241360 Tait et al. Oct 2009 A1
20090249634 Pettersson Oct 2009 A1
20090265946 Jordil et al. Oct 2009 A1
20090273771 Gittinger et al. Nov 2009 A1
20090299689 Stubben et al. Dec 2009 A1
20090322859 Shelton et al. Dec 2009 A1
20090323742 Kumano Dec 2009 A1
20100030421 Yoshimura et al. Feb 2010 A1
20100040742 Dijkhuis et al. Feb 2010 A1
20100049891 Hartwich et al. Feb 2010 A1
20100057392 York Mar 2010 A1
20100078866 Pettersson Apr 2010 A1
20100095542 Ferrari Apr 2010 A1
20100122920 Butter et al. May 2010 A1
20100123892 Miller et al. May 2010 A1
20100134596 Becker Jun 2010 A1
20100134598 St-Pierre et al. Jun 2010 A1
20100134599 Billert et al. Jun 2010 A1
20100135534 Weston et al. Jun 2010 A1
20100148013 Bhotika et al. Jun 2010 A1
20100188504 Dimsdale et al. Jul 2010 A1
20100195086 Ossig Aug 2010 A1
20100207938 Yau et al. Aug 2010 A1
20100208062 Pettersson Aug 2010 A1
20100208318 Jensen et al. Aug 2010 A1
20100245851 Teodorescu Sep 2010 A1
20100277472 Kaltenbach et al. Nov 2010 A1
20100277747 Rueb et al. Nov 2010 A1
20100281705 Verdi et al. Nov 2010 A1
20100286941 Merlot Nov 2010 A1
20100312524 Siercks et al. Dec 2010 A1
20100318319 Maierhofer Dec 2010 A1
20100321152 Argudyaev et al. Dec 2010 A1
20100325907 Tait Dec 2010 A1
20110000095 Carlson Jan 2011 A1
20110001958 Bridges et al. Jan 2011 A1
20110007305 Bridges et al. Jan 2011 A1
20110007326 Daxauer et al. Jan 2011 A1
20110013199 Siercks et al. Jan 2011 A1
20110019155 Daniel et al. Jan 2011 A1
20110023578 Grasser Feb 2011 A1
20110025905 Tanaka Feb 2011 A1
20110043515 Stathis Feb 2011 A1
20110066781 Debelak et al. Mar 2011 A1
20110094908 Trieu et al. Apr 2011 A1
20110107611 Desforges et al. May 2011 A1
20110107612 Ferrari et al. May 2011 A1
20110107613 Tait May 2011 A1
20110107614 Champ May 2011 A1
20110111849 Sprague et al. May 2011 A1
20110112786 Desforges et al. May 2011 A1
20110119025 Fetter et al. May 2011 A1
20110123097 Van Coppenolle May 2011 A1
20110164114 Kobayashi et al. Jul 2011 A1
20110166824 Haisty et al. Jul 2011 A1
20110169924 Haisty et al. Jul 2011 A1
20110173823 Bailey et al. Jul 2011 A1
20110173827 Bailey et al. Jul 2011 A1
20110173828 York Jul 2011 A1
20110178755 York Jul 2011 A1
20110178758 Atwell et al. Jul 2011 A1
20110178762 York Jul 2011 A1
20110178764 York Jul 2011 A1
20110178765 Atwell et al. Jul 2011 A1
20110188739 Lee et al. Aug 2011 A1
20110192043 Ferrari et al. Aug 2011 A1
20110273568 Lagassey et al. Nov 2011 A1
20110282622 Canter et al. Nov 2011 A1
20110288684 Farlow et al. Nov 2011 A1
20120035788 Trepagnier et al. Feb 2012 A1
20120035798 Barfoot et al. Feb 2012 A1
20120044476 Earhart et al. Feb 2012 A1
20120046820 Allard et al. Feb 2012 A1
20120069325 Schumann et al. Mar 2012 A1
20120069352 Ossig et al. Mar 2012 A1
20120070077 Ossig et al. Mar 2012 A1
20120113913 Tiirola et al. May 2012 A1
20120133953 Ossig May 2012 A1
20120140244 Gittinger et al. Jun 2012 A1
20120154786 Gosch et al. Jun 2012 A1
20120155744 Kennedy et al. Jun 2012 A1
20120169876 Reichert et al. Jul 2012 A1
20120181194 Mcewan et al. Jul 2012 A1
20120210678 Alcouloumre et al. Aug 2012 A1
20120217357 Franke Aug 2012 A1
20120229788 Schumann et al. Sep 2012 A1
20120260512 Kretschmer et al. Oct 2012 A1
20120260611 Jones Oct 2012 A1
20120262700 Schumann et al. Oct 2012 A1
20130010307 Greiner et al. Jan 2013 A1
20130025143 Bailey et al. Jan 2013 A1
20130025144 Briggs et al. Jan 2013 A1
20130027515 Vinther et al. Jan 2013 A1
20130062243 Chang et al. Mar 2013 A1
20130070250 Ditte et al. Mar 2013 A1
20130094024 Ruhland et al. Apr 2013 A1
20130097882 Bridges et al. Apr 2013 A1
20130125408 Atwell et al. May 2013 A1
20130162472 Najim et al. Jun 2013 A1
20130201487 Ossig et al. Aug 2013 A1
20130205606 Briggs et al. Aug 2013 A1
20130212889 Bridges et al. Aug 2013 A9
20130218024 Boctor et al. Aug 2013 A1
20130222816 Briggs et al. Aug 2013 A1
20130300740 Snyder et al. Nov 2013 A1
20140002608 Atwell et al. Jan 2014 A1
20140015963 Klaas Jan 2014 A1
20140028805 Tohme Jan 2014 A1
20140049784 Woloschyn Feb 2014 A1
20140063489 Steffey et al. Mar 2014 A1
20140120493 Levin May 2014 A1
20140226190 Bridges et al. Aug 2014 A1
20140240690 Newman et al. Aug 2014 A1
20140267623 Bridges et al. Sep 2014 A1
20140268108 Grau Sep 2014 A1
20140362424 Bridges et al. Dec 2014 A1
20150015701 Yu Jan 2015 A1
20150109419 Vollrath et al. Apr 2015 A1
20150160347 Zweigle et al. Jun 2015 A1
20150229907 Bridges Aug 2015 A1
20150241204 Steffey et al. Aug 2015 A1
20150369917 Bridges et al. Dec 2015 A1
20150373321 Bridges Dec 2015 A1
20150378023 Royo Royo et al. Dec 2015 A1
20160033643 Zweigle et al. Feb 2016 A1
20160047914 Zweigle et al. Feb 2016 A1
20160069670 Ruhland et al. Mar 2016 A1
20160073085 Hillebrand et al. Mar 2016 A1
20160073091 Hillebrand et al. Mar 2016 A1
20160073104 Hillebrand et al. Mar 2016 A1
Foreign Referenced Citations (302)
Number Date Country
506110 Jun 2009 AT
508635 Mar 2011 AT
2005200937 Sep 2006 AU
2236119 Sep 1996 CN
1133969 Oct 1996 CN
2508896 Sep 2002 CN
2665668 Dec 2004 CN
1630804 Jun 2005 CN
1630805 Jun 2005 CN
1735789 Feb 2006 CN
1812868 Aug 2006 CN
1818537 Aug 2006 CN
1838102 Sep 2006 CN
1839293 Sep 2006 CN
1853084 Oct 2006 CN
101024286 Aug 2007 CN
101156043 Apr 2008 CN
101163939 Apr 2008 CN
101371099 Feb 2009 CN
201266071 Jul 2009 CN
101511529 Aug 2009 CN
2216765 Apr 1972 DE
2950138 Jun 1981 DE
3227980 May 1983 DE
3245060 Jul 1983 DE
3340317 Aug 1984 DE
4027990 Feb 1992 DE
4222642 Jan 1994 DE
4340756 Jun 1994 DE
4303804 Aug 1994 DE
4445464 Jul 1995 DE
4410775 Oct 1995 DE
4412044 Oct 1995 DE
29622033 Feb 1997 DE
19543763 May 1997 DE
19601875 Jul 1997 DE
19607345 Aug 1997 DE
19720049 Nov 1998 DE
19811550 Sep 1999 DE
19820307 Nov 1999 DE
19850118 May 2000 DE
19928958 Nov 2000 DE
10026357 Jan 2002 DE
20208077 May 2002 DE
10137241 Sep 2002 DE
10149750 Sep 2002 DE
10155488 May 2003 DE
10155488 May 2003 DE
10219054 Nov 2003 DE
10232028 Feb 2004 DE
10336458 Feb 2004 DE
10244643 Apr 2004 DE
20320216 Apr 2004 DE
10304188 Aug 2004 DE
10313223 Oct 2004 DE
10326848 Jan 2005 DE
202005000983 Mar 2005 DE
10361870 Jul 2005 DE
102004015668 Sep 2005 DE
102004015111 Oct 2005 DE
102004028090 Dec 2005 DE
10114126 Aug 2006 DE
202006005643 Aug 2006 DE
102004010083 Nov 2006 DE
102005043931 Mar 2007 DE
102005056265 May 2007 DE
102006053611 May 2007 DE
102005060967 Jun 2007 DE
102006023902 Nov 2007 DE
102006024534 Nov 2007 DE
102006035292 Jan 2008 DE
202006020299 May 2008 DE
102007037162 Feb 2009 DE
102008014274 Aug 2009 DE
102009055988 Nov 2009 DE
102008039838 Mar 2010 DE
102005036929 Jun 2010 DE
102008062763 Jul 2010 DE
102009001894 Sep 2010 DE
102009035336 Nov 2010 DE
202010005042 Aug 2011 DE
102010032723 Nov 2011 DE
102010032726 Nov 2011 DE
102010033561 Dec 2011 DE
102010032725 Jan 2012 DE
202011051975 Feb 2013 DE
102012107544 May 2013 DE
102012112322 Jun 2014 DE
0546784 Jun 1993 EP
0667549 Aug 1995 EP
0727642 Aug 1996 EP
0730210 Sep 1996 EP
0614517 Mar 1997 EP
0838696 Apr 1998 EP
0949524 Oct 1999 EP
1056987 Jun 2000 EP
1160539 Dec 2001 EP
1189124 Mar 2002 EP
0767357 May 2002 EP
1310764 May 2003 EP
1342989 Sep 2003 EP
1347267 Sep 2003 EP
1361414 Nov 2003 EP
1452279 Sep 2004 EP
1468791 Oct 2004 EP
1528410 May 2005 EP
1669713 Jun 2006 EP
1734425 Dec 2006 EP
1429109 Apr 2007 EP
1764579 Dec 2007 EP
1878543 Jan 2008 EP
1882895 Jan 2008 EP
1967930 Sep 2008 EP
2003419 Dec 2008 EP
2023077 Feb 2009 EP
2042905 Apr 2009 EP
2060530 May 2009 EP
2068067 Jun 2009 EP
2068114 Jun 2009 EP
2108917 Oct 2009 EP
2177868 Apr 2010 EP
2259013 Dec 2010 EP
2400261 Dec 2011 EP
2428764 Mar 2012 EP
2693300 Feb 2014 EP
2728306 May 2014 EP
2603228 Mar 1988 FR
2935043 Feb 2010 FR
894320 Apr 1962 GB
1112941 May 1968 GB
2222695 Mar 1990 GB
2255648 Nov 1992 GB
2336493 Oct 1999 GB
2341203 Mar 2000 GB
2388661 Nov 2003 GB
2420241 May 2006 GB
2447258 Sep 2008 GB
2452033 Feb 2009 GB
5581525 Jun 1955 JP
575584 Jan 1982 JP
58171291 Jan 1983 JP
5827264 Feb 1983 JP
S58171291 Oct 1983 JP
59133890 Aug 1984 JP
61062885 Mar 1986 JP
S61157095 Jul 1986 JP
63135814 Jun 1988 JP
H0357911 Mar 1991 JP
H04115108 Apr 1992 JP
04225188 Aug 1992 JP
H04267214 Sep 1992 JP
H0572477 Mar 1993 JP
06313710 Nov 1994 JP
1994313710 Nov 1994 JP
06331733 Dec 1994 JP
06341838 Dec 1994 JP
074950 Jan 1995 JP
07128051 May 1995 JP
7210586 Aug 1995 JP
H07229963 Aug 1995 JP
0821714 Jan 1996 JP
H0815413 Jan 1996 JP
H08129145 May 1996 JP
H08136849 May 1996 JP
H08262140 Oct 1996 JP
09021868 Jan 1997 JP
1123993 Jan 1999 JP
2001056275 Aug 1999 JP
2000121724 Apr 2000 JP
2000249546 Sep 2000 JP
2000339468 Dec 2000 JP
2001013001 Jan 2001 JP
2001021303 Jan 2001 JP
11231047 Feb 2001 JP
2011066211 Mar 2001 JP
2001337278 Dec 2001 JP
2003050128 Feb 2003 JP
2003156330 May 2003 JP
2003156562 May 2003 JP
2003194526 Jul 2003 JP
2003202215 Jul 2003 JP
2003216255 Jul 2003 JP
2003308205 Oct 2003 JP
2004109106 Apr 2004 JP
2004245832 Sep 2004 JP
2004257927 Sep 2004 JP
2004348575 Dec 2004 JP
2005030937 Feb 2005 JP
2005055226 Mar 2005 JP
2005069700 Mar 2005 JP
2005174887 Jun 2005 JP
2005517908 Jun 2005 JP
2005215917 Aug 2005 JP
2005257510 Sep 2005 JP
2006038683 Feb 2006 JP
2006102176 Apr 2006 JP
2006203404 Aug 2006 JP
2006226948 Aug 2006 JP
2006241833 Sep 2006 JP
2006266821 Oct 2006 JP
2006301991 Nov 2006 JP
2007514943 Jun 2007 JP
2007178943 Jul 2007 JP
2008076303 Apr 2008 JP
2008082707 Apr 2008 JP
2008096123 Apr 2008 JP
2008107286 May 2008 JP
2008304220 Dec 2008 JP
2009063339 Mar 2009 JP
2009524057 Jun 2009 JP
2009541758 Nov 2009 JP
2010169405 Aug 2010 JP
2013516928 May 2013 JP
2013517508 May 2013 JP
2013117417 Jun 2013 JP
2013543970 Dec 2013 JP
8801924 Mar 1988 WO
8905512 Jun 1989 WO
9208568 May 1992 WO
9711399 Mar 1997 WO
9808050 Feb 1998 WO
9910706 Mar 1999 WO
0014474 Mar 2000 WO
0020880 Apr 2000 WO
0026612 May 2000 WO
0033149 Jun 2000 WO
0034733 Jun 2000 WO
0063645 Oct 2000 WO
0063681 Oct 2000 WO
0177613 Oct 2001 WO
02084327 Oct 2002 WO
02088855 Nov 2002 WO
02101323 Dec 2002 WO
2004096502 Nov 2004 WO
2005008271 Jan 2005 WO
2005059473 Jun 2005 WO
2005072917 Aug 2005 WO
2005075875 Aug 2005 WO
2005100908 Oct 2005 WO
2005103863 Nov 2005 WO
2006000552 Jan 2006 WO
2006014445 Feb 2006 WO
2006051264 May 2006 WO
2006053837 May 2006 WO
2007002319 Jan 2007 WO
2007012198 Feb 2007 WO
2007028941 Mar 2007 WO
2007051972 May 2007 WO
2007087198 Aug 2007 WO
2007118478 Oct 2007 WO
2007125081 Nov 2007 WO
2007144906 Dec 2007 WO
2008019856 Feb 2008 WO
2008027588 Mar 2008 WO
2008047171 Apr 2008 WO
2008048424 Apr 2008 WO
2008052348 May 2008 WO
2008064276 May 2008 WO
2008066896 Jun 2008 WO
2008068791 Jun 2008 WO
2008075170 Jun 2008 WO
2008121073 Oct 2008 WO
2008157061 Dec 2008 WO
2009001165 Dec 2008 WO
2009003225 Jan 2009 WO
2009016185 Feb 2009 WO
2009053085 Apr 2009 WO
2009083452 Jul 2009 WO
2009095384 Aug 2009 WO
2009123278 Oct 2009 WO
2009127526 Oct 2009 WO
2009130169 Oct 2009 WO
2009149740 Dec 2009 WO
2010015086 Feb 2010 WO
2010040742 Apr 2010 WO
2010092131 Aug 2010 WO
2010108089 Sep 2010 WO
2010108644 Sep 2010 WO
2010148525 Dec 2010 WO
2011000435 Jan 2011 WO
2011000955 Jan 2011 WO
2011021103 Feb 2011 WO
2011029140 Mar 2011 WO
2011057130 May 2011 WO
2011060899 May 2011 WO
2011090829 Jul 2011 WO
2011090895 Jul 2011 WO
2012013525 Feb 2012 WO
2012037157 Mar 2012 WO
2012038446 Mar 2012 WO
2012061122 May 2012 WO
2012013525 Aug 2012 WO
2012112683 Aug 2012 WO
2012125671 Sep 2012 WO
2012168322 Dec 2012 WO
2013112455 Aug 2013 WO
2013184340 Dec 2013 WO
2013186160 Dec 2013 WO
2013188026 Dec 2013 WO
2013190031 Dec 2013 WO
2014128498 Aug 2014 WO
Non-Patent Literature Citations (154)
Entry
First Chinese Office Action for Application No. 201080003467.1; Office Action Issue Date Feb. 5, 2013; (translated).
Elstrom, M.D., Stereo-Based Registration of LADAR and Color Imagery, Part of SPIE Conference on Intelligent Robots and Computer Vision XVII: Algorithms, Techniques, and Active Vision, Boston, MA, Nov. 1998, SPIE vol. 3522, 0277-786X/98; [Retrieved on-line], Downloaded From: http://proceedings.spiedigitallibrary.org/ on Jan. 26, 2013.
International Search Report for International Application No. PCT/EP2006/003010 mailed Nov. 12, 2006.
Chinese Notification of First Office Action for Chinese Application No. 201080003463.3; Issued Oct. 30, 2012 (translated).
Japanese Office Action for JP Application No. 2012-521117, issued Mar. 25, 2014, including cited references.
Japanese Office Action for JP Application No. 2012-525222, issued Apr. 2, 2014, including cited references.
First Chinese Office Action for Chinese Patent Application No. 2013082200801190; Dated Aug. 27, 2013.
Japanese Office Action for Japanese Patent Application No. 2012501176; Dated Aug. 27, 2013.
Japanese Office Action for Japanese Patent Application No. 2012-534588; Date of Mailing Sep. 17, 2013.
Merriam-Webster (m-w.com), “Interface”. 2012. http://www.merriam-webster.com/dictionary/interface.
Merriam-Webster (m-w.com), “Traverse”. 2012. http://www.merriam-webster.com/dictionary/traverse.
Merriam-Webster (m-w.com), “Parts”. 2012. http://www.merriam-webster.com/dictionary/parts.
International Search Report of the International Searching Authority for Application No. PCT/US2012/075178; Date of Mailing Apr. 9, 2013.
"Scanner Basis Configuration for Riegl VQ-250", Riegl Company Webpage, Feb. 16, 2011 (Feb. 16, 2011), XP002693900, Retrieved from the internet: URL:http://www.riegl.com/uploads/tx_pxpriegldownloads/30_SystemConfiguration_VQ-250_02-11_16-02-2011.pdf [retrieved on Mar. 15, 2013] the whole document.
Written Opinion of the International Searching Authority for Application No. PCT/US2012/075178; Date of Mailing Apr. 9, 2013.
Chinese Office Action for Chinese Application Serial No. 201080047516-1; Date of Issue Apr. 1, 2013.
German Office Action for DE Application No. 10 2012 107 544.1; Issued Jan. 2, 2013.
International Search Report of the International Searching Authority for Application No. PCT/EP2011/003262; Date of Mailing Sep. 30, 2011.
Yk Cho, et al. “Light-weight 3D LADAR System for Construction Robotic Operations” (pp. 237-244); 26th International Symposium on Automation and Robotics in Construction (ISARC 2009), Jun. 24, 2009. 8 Pages.
ABB Flexible Automation AB: "Product Manual IRB 6400R M99, On-line Manual"; Sep. 13, 2006; XP00002657684; [Retrieved on Sep. 28, 2011 (Sep. 28, 2011)]. Retrieved from the Internet: (See URL Below) 481 Pages.
Anonymous: So wird's gemacht: Mit T-DSL und Windows XP Home Edition gemeinsam ins Internet (Teil 3) Internet Citation, Jul. 2003 (Jul. 2003), XP002364586, Retrieved from Internet: URL:http://support.microsoft.com/kb/814538/DE/ [retrieved on Jan. 26, 2006].
Cho, et al., Implementation of a Precision Time Protocol over Low Rate Wireless Personal Area Networks, IEEE, 2008, 8 Pages.
Cooklev, et al., An Implementation of IEEE 1588 Over IEEE 802.11b for Synchronization of Wireless Local Area Network Nodes, IEEE Transactions on Instrumentation and Measurement, vol. 56, No. 5, Oct. 2007.
Dylan, Craig R., High Precision Makes the Massive Bay Bridge Project Work. Suspended in MidAir—Cover Story—Point of Beginning, Jan. 1, 2010, [online] http://www.pobonline.com/Articles/Cover_Story_BNP_GUID_9-5-2006_A_10000000000 . . .[Retrieved May 5, 2015] 6 Pages.
Electro-Optical Information Systems, “The Handy Handheld Digitizer” [online], [retrieved on Nov. 29, 2011], http://vidibotics.com/htm/handy.htm. 2 Pages.
EO Edmund Optics “Silicon Detectors” (5 pages) 2013 Edmund Optics, Inc. http://www.edmundoptics.com/electro-optics/detector-components/silicon-detectors/1305[Oct. 15, 2013 10:14:53 Am].
FARO Product Catalog; Faro Arm; 68 pages; Faro Technologies Inc. 2009; printed Aug. 3, 2009.
Franklin, Paul F., What IEEE 1588 Means for Your Next T&M System Design, Keithley Instruments, Inc., [on-line] Oct. 19, 2010, http://www.eetimes.com/General/DisplayPrintViewContent?contentItemId=4209746, [Retrieved Oct. 21, 2010] 6 Pages.
Gebre, et al. “Remotely Operated and Autonomous Mapping System (ROAMS).” Technologies for Practical Robot Applications, 2009. Tepra 2009. IEEE International Conference on IEEE, Piscataway, NJ, USA. Nov. 9, 2009, pp. 173-178.
Ghost 3D Systems, Authorized MicroScribe Solutions, FAQs—MicroScribe 3D Laser, MicroScan Tools, & related info, [online], [retrieved Nov. 29, 2011], http://www.gomeasure3d.com/baces100.html. 3 Pages.
GoMeasure3D—Your source for all things measurement, Baces 3D 100 Series Portable CMM from GoMeasure3D, [online], [retrieved Nov. 29, 2011], http://www.gomeasure3d.com/baces100.html. 3 Pages.
Hart, A., “Kinematic Coupling Interchangeability”, Precision Engineering, vol. 28, No. 1, Jan. 1, 2004, pp. 1-15, XP55005507, ISSN: 0141-6359, DOI: 10.1016/S0141-6359(03)00071-0.
HYDROpro Navigation, Hydrographic Survey Software, Trimble, www.trimble.com, Copyright 1997-2003. 2 Pages.
Information on Electro-Optical Information Systems; EOIS 3D Mini-Moire C.M.M. Sensor for Non-Contact Measuring & Surface Mapping; Direct Dimensions, Jun. 1995. 1 Page.
It is Alive in the Lab, Autodesk University, Fun with the Immersion MicroScribe Laser Scanner, [online], [retrieved Nov. 29, 2011], http://labs.blogs.com/its_alive_in_the_lab/2007/11/fun-with-the-im.html. 3 Pages.
J.Geng “Structured-Light 3D Surface Imaging: A Tutorial,” Advances in Optics and Photonics 3; Mar. 31, 2011, pp. 128-160; IEEE Intelligent Transportation System Society; 2011 Optical Society of America.
Jasperneite, et al., Enhancements to the Time Synchronization Standard IEEE-1588 for a System of Cascaded Bridges, IEEE, 2004. 6 Pages.
Jgeng “DLP-Based Structured Light 3D Imaging Technologies and Applications” (15 pages) Emerging Digital Micromirror Device Based Systems and Application III; edited by Michael R. Douglass, Patrick I. Oden, Proc. of SPIE, vol. 7932, 79320B; (Feb. 9, 20.
Kreon Laser Scanners, Getting the Best in Cutting Edge 3D Digitizing Technology, B3-D MCAD Consulting/Sales [online], [retrieved Nov. 29, 2011], http://www.b3-d.com/Kreon.html. 2 Pages.
Laser Reverse Engineering with Microscribe, [online], [retrieved Nov. 29, 2011], http://www.youtube.com/watch?v=8VRz_2aEJ4E&feature=PlayList&p=F63ABF74F30DC81B&playnext=1&playnext_from=PL&index=1. 2 Pages.
Leica TPS800 Performance Series—Equipment List, 2004. 4 Pages.
MG Lee; “Compact 3D LIDAR based on optically coupled horizontal and vertical Scanning mechanism for the autonomous navigation of robots” (13 pages) vol. 8037; downloaded from http://proceedings.spiedigitallibrary.org/ on Jul. 2, 2013.
MicroScan 3D User Guide, RSI GmbH, 3D Systems & Software, Oberursel, Germany, email: info@rsi-gmbh.de, Copyright RSI Roland Seifert Imaging GmbH 2008. 58 Pages.
Moog Components Group “Technical Brief; Fiber Optic Rotary Joints” Document No. 303 (6 pages) Mar. 2008; MOOG, Inc. 2008 Canada; Focal Technologies.
Moog Components Group; “Fiber Optic Rotary Joints; Product Guide” (4 pages) Dec. 2010; MOOG, Inc. 2010.
P Ben-Tzvi, et al “Extraction of 3D Images Using Pitch-Actuated 2D Laser Range Finder for Robotic Vision” (6 pages) BNSDOCID <XP 31840390A—1—>, Oct. 15, 2010.
Patrick Willoughby; “Elastically Averaged Precision Alignment”; In: “Doctoral Thesis” ; Jun. 1, 2005; Massachusetts Institute of Technology; XP55005620; Abstract 1.1 Motivation; Chapter 3, Chapter 6, 158 Pages.
Romer “Romer Absolute Arm Maximum Performance Portable Measurement” (Printed Oct. 2010); Hexagon Metrology, Inc. http://us:Romer.com; Hexagon Metrology, Inc 2010. 9 Pages.
Romer Romer Absolute Arm Product Brochure: (2010); Hexagon Metrology; www.hexagonmetrology.com; Hexagon AB, 2010. 24 Pages.
Romer “Romer Measuring Arms Portable CMMs for R&D and shop floor” (Mar. 2009) Hexagon Metrology (16 pages).
RW Boyd "Radiometry and the Detection of Optical Radiation" (pp. 20-23) 1983 John Wiley & Sons, Inc.
Sauter, et al., Towards New Hybrid Networks for Industrial Automation, IEEE, 2009. 8 Pages.
Spada, et al., IEEE 1588 Lowers Integration Costs in Continuous Flow Automated Production Lines, XP-002498255, ARC Insights, Insight # 2003-33MD&H, Aug. 20, 2003. 4 Pages.
Surmann et al. "An autonomous mobile robot with a 3D laser range finder for 3D exploration and digitalization of indoor environments." Robotics and Autonomous Systems vol. 45 No. 3-4, Dec. 31, 2003, pp. 181-198. Amsterdam, Netherlands.
Trimble—Trimble SPS630, SPS730 and SPS930 Universal Total Stations, [on-line] http://www.trimble.com/sps630_730_930.shtml (1 of 4), [Retrieved Jan. 26, 2010 8:50:29AM]. 4 Pages.
Willoughby, P., "Elastically Averaged Precision Alignment", In: "Doctoral Thesis", Jun. 1, 2005, Massachusetts Institute of Technology, XP55005620, abstract 1.1 Motivation, Chapter 3, Chapter 6. 158 Pages.
AKCA, Devrim, Full Automatic Registration of Laser Scanner Point Clouds, Institute of Geodesy and Photogrammetry, Swiss Federal Institute of Technology, Zuerich, Switzerland; Published Dec. 2003.
First Office Action and Search Report with English Translation for Chinese Patent Application No. 201080003456.3; Issue Date Jan. 17, 2013.
International Preliminary Report on Patentability for International Application Serial No. PCT/EP2011/003261. International filed Jul. 1, 2011. Date of Issuance Jan. 29, 2013.
International Preliminary Report on Patentability for International Application Serial No. PCT/EP2011/003262. International filed Jul. 1, 2011. Date of Issuance Jan. 29, 2013.
International Preliminary Report on Patentability for International Application Serial No. PCT/EP2011/003263. International filed Jul. 1, 2011. Date of Issuance Jan. 29, 2013.
International Preliminary Report on Patentability for International Application Serial No. PCT/EP2011/003264. International filed Jul. 1, 2011. Date of Issuance Jan. 29, 2013.
Second Office Action with English Translation for Chinese Patent Application No. 201080003466.7; Issue Date Jul. 19, 2013.
German Office Action for DE Application Serial No. 102012109481.0; dated Aug. 1, 2013.
Japanese Office Action for JP Application Serial No. 2013-520990; Date of Mailing Jul. 2, 2013.
GB Examination Report dated Jun. 19, 2013 for GB Application No. GB1202398.2.
GB Examination Report dated Aug. 15, 2013 for GB Application No. GB 1303382.4.
GB Examination Report dated Aug. 7, 2013 for GB Application No. GB1303390.7.
Japanese Office Action for JP Application No. 2012-534589; issued Jul. 30, 2013.
Japanese Office Action for Application Serial No. 2013-520987; Date of Mailing Jul. 2, 2013.
Japanese Office Action for Application Serial No. 2013-520989; Date of Mailing Jul. 2, 2013.
Japanese Office Action for Application Serial No. 2012-534590; Date of Mailing Jul. 30, 2013.
Japanese Office Action for JP Application Serial No. 2012-501175; Date of Mailing Jul. 16, 2013.
Leica Geosystems, FBI Crime Scene Case Study, Cited in Opposition of EP Application No. 07785873.6 in Oral Proceedings held on Jun. 27, 2013, Munchen, Germany; D13, p. 5 of Summons, Tony Grissim, Feb. 2006.
GB Examination Report dated Mar. 27, 2013 for GB Application No. GB1303390.7.
GB Examination Report dated Mar. 27, 2013 for GB Application No. GB1303382.4.
Second German Office Action for DE Application Serial No. 10 2009 015 922.3; Dated Dec. 2, 2013.
German Office Action for DE Application No. 102013102.554.4; Dated Jan. 9, 2014.
GB Exam and Search Report for Application No. GB1314371.4; Dated Nov. 22, 2013.
Horn, B.K.P., Closed-Form Solution of Absolute Orientation Using Unit Quaternions, J. Opt. Soc. Am. A., vol. 4., No. 4, Apr. 1987, pp. 629-642, ISSN 0740-3232.
Second JP Office Action for JP Patent Application Serial No. 2012-534590; Date of Mailing Nov. 12, 2013.
Japanese Office Action for JP Patent Application Serial No. 2012-501174; Dated Oct. 29, 2013.
WO 00/26612 is the published equivalent of DE 19850118. Published May 11, 2000.
AKCA, Devrim, Full Automatic Registration of Laser Scanner Point Clouds, Optical 3D Measurement Techniques, vol. VI, 2003, XP002590305, ETH, Swiss Federal Institute of Technology, Zurich, Institute of Geodesy and Photogrammetry, DOI:10.3929/ethz-a-004656666.
Bornaz, L., et al., Multiple Scan Registration in Lidar Close-Range Applications, The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. XXXIV, Part 5/W12, Jul. 2003 (Jul. 2003), pp. 72-77, XP002590306.
Brenneke, C., et al., “Using 3D Laser Range Data for Slam in Outdoor Environments”, Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems. (IROS 2003); Las Vegas, NV, Oct. 27-31, 2003. [IEEE/RSJ International Conference on Intelligent Robots and Systems], New York, NY: IEEE, US vol. 1, Oct. 27, 2003; pp. 188-193, XP010672337, DOI:10.1109/IROS.2003.1250626;ISBN 978-0-7803-7860-5, p. 189; Figure 1.
Chinese Office Action Dated Jun. 2, 2010 with English Translation of the Text for Application No. 2006800540959.
Chinese Publication No. CN 1445509, published Oct. 1, 2003—English Abstract Not Available; EP Equivalent 1347267.
Elstrom, M.D., et al., Stereo-Based Registration of LADAR and Color Imagery, Intelligent Robots and Computer Vision XVII: Algorithms, Techniques, and Active Vision, Boston, MA, USA, vol. 3522, Nov. 2, 1998 (Nov. 2, 1998), Nov. 3, 1998 (Nov. 3, 1998) pp. 343-354, XP 002587995, Proceedings of the SPIE.
Godin, G., et al., A Method for the Registration of Attributed Range Images, Copyright 2001, [Retrieved on Jan. 18, 2010 at 03:29 from IEEE Xplore].
International Preliminary Report on Patentability and Written Opinion for International Application No. PCT/EP2007/005789; Date of Mailing Oct. 30, 2007.
International Preliminary Report on Patentability and Written Opinion for PCT/IB2010/002216; Date of Issuance Jan. 24, 2012.
International Preliminary Report on Patentability and Written Opinion for PCT/EP2009/050887; Date of Issue Sep. 7, 2010.
International Preliminary Report on Patentability and Written Opinion for PCT/IB2010/002226; Date of Issuance Jan. 24, 2012.
International Preliminary Report on Patentability and Written Opinion for PCT/EP2004/014605; Date of Issue Aug. 29, 2006.
iQsun Laserscanner Brochure, 2 Pages, Apr. 2005.
International Preliminary Report on Patentability and Written Opinion for PCT/EP2009/050888; Date of Issuance Sep. 7, 2010.
International Search Report and Written Opinion of the International Searching Authority for PCT/EP2009/009174; Date of Mailing May 25, 2010.
International Search Report of the International Searching Authority for PCT/EP2007/005789; Date of Mailing Oct. 30, 2007.
International Search Report and Written Opinion for International Patent Application PCT/IB2010/002226; mailing date Dec. 13, 2010.
International Search Report of the International Searching Authority for PCT/EP2004/014605; Date of Mailing Apr. 15, 2005.
International Search Report of the International Searching Authority for PCT/EP2010/001779; Date of Mailing Jul. 20, 2010.
International Search Report for International Patent Application PCT/EP2010/001780; mailing date Jul. 23, 2010.
International Search Report of the International Searching Authority for PCT/EP2010/001781; Date of Mailing Jul. 22, 2010.
International Search Report of the International Searching Authority for PCT/IB2010/002258; Date of Mailing Jan. 28, 2011.
International Search Report of the International Searching Authority for PCT/IB2010/002216; Date of Mailing Feb. 3, 2011.
International Search Report of the International Searching Authority for PCT/EP2009/050888; Date of Mailing Sep. 15, 2009.
International Search Report of the International Searching Authority for PCT/EP2010/006867; Date of Mailing Mar. 18, 2011.
International Search Report of the International Searching Authority for PCT/EP2010/006866; Date of Mailing Mar. 14, 2011.
International Search Report of the International Searching Authority for PCT/EP2010/006868; Date of Mailing Mar. 14, 2011.
International Search Report of the International Searching Authority for PCT/EP2009/050887; Date of Mailing May 14, 2009.
Jasiobedzki, Piotr, "Laser Eye—A New 3D Sensor for Active Vision", SPIE—Sensor Fusion VI, vol. 2059, Sep. 7, 1993 (Sep. 7, 1993), pp. 316-321, XP00262856, Boston, U.S.A., Retrieved from the Internet: URL:http://scitation.aip.org/getpdf/servlet/GetPDFServlet?filetype=pdf&id=PSISDG002059000001000316000001&idtype=cvips&doi=10.1117/12.150236&prog=normal>[retrieved on Mar. 8, 2011] the whole document.
Umeda, K., et al., Registration of Range and Color Images Using Gradient Constraints and Range Intensity Images, Proceedings of the 17th International Conference on Pattern Recognition (ICPR'04), Copyright 2010 IEEE. [Retrieved online Jan. 28, 2010—IEEE Xplore].
Williams, J.A., et al., Evaluation of a Novel Multiple Point Set Registration Algorithm, Copyright 2000, [Retrieved on Jan. 18, 2010 at 04:10 from IEEE Xplore].
Davison, A., et al., "MonoSLAM: Real-Time Single Camera SLAM", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, No. 6, Jun. 1, 2007, pp. 1052-1067, XP011179664.
Gebre, Biruk A., et al., "Remotely Operated and Autonomous Mapping System (ROAMS)", Technologies for Practical Robot Applications, TEPRA 2009, IEEE International Conference on Nov. 9, 2009, pp. 173-178, XP031570394.
Harrison A. et al., “High Quality 3D Laser Ranging Under General Vehicle Motion”, 2008 IEEE International Conference on Robotics and Automation, May 19-23, 2008, pp. 7-12, XP031340123.
May, S. et al., "Robust 3D-Mapping with Time-of-Flight Cameras", Intelligent Robots and Systems, IROS 2009, IEEE/RSJ International Conference on Oct. 10, 2009, pp. 1673-1678, XP031581042.
Ohno, K. et al., “Real-Time Robot Trajectory Estimation and 3D Map Construction Using 3D Camera”, Intelligent Robots and Systems, 2006 IEEE/RSJ International Conference on Oct. 1, 2006, pp. 5279-5285, XP031006974.
Surmann, H. et al., "An Autonomous Mobile Robot with a 3D Laser Range Finder for 3D Exploration and Digitalization of Indoor Environments", Robotics and Autonomous Systems, Elsevier Science Publishers, vol. 45, No. 3-4, Dec. 31, 2003, pp. 181-198.
Yan, R., et al., "3D Point Cloud Map Construction Based on Line Segments with Two Mutually Perpendicular Laser Sensors", 2013 13th International Conference on Control, Automation and Systems (ICCAS 2013), IEEE, Oct. 20, 2013, pp. 1114-1116.
Ye, C. et al., “Characterization of a 2-D Laser Scanner for Mobile Robot Obstacle Negotiation” Proceedings/2002 IEEE International Conference on Robotics and Automation, May 11-15, 2002, Washington, D.C., May 1, 2002, pp. 2512-2518, XP009169742.
14th International Forensic Science Symposium, Interpol—Lyon, France, Oct. 19-22, 2004, Review Papers, Edited by Dr. Niamh Nic Daeid, Forensic Science Unit, University of Strathclyde, Glasgow, UK.
Bouvet, D., et al., “Precise 3-D Localization by Automatic Laser Theodolite and Odometer for Civil-Engineering Machines”, Proceedings of the 2001 IEEE International Conference on Robotics and Automation. ICRA 2001. Seoul, Korea, May 21-26, 2001; IEEE, US., vol. 2, May 21, 2001, pp. 2045-2050, XP010550445, DOI: 10.1109/Robot.2001.932908 ISBN: 978-0/7803-6576-6, the whole document.
Ingensand, H., Dr., “Introduction to Geodetic Metrology”, “Einfuhrung in die Geodatische Messtechnik”, Federal Institute of Technology Zurich, Edition 2004, p. 16.
FARO Laserscanner LS, Presentation Forensic Package, Policeschool of Hessen, Wiesbaden, Germany, Dec. 14, 2005; FARO Technologies, Copyright 2008.
FARO Laser Scanner LS, Recording Reality's Digital Fingerprint, The Measure of Success, Copyright 2005.
Leica Geosystems, FBI Crime Scene Case Study.
Haag, et al., “Technical Overview and Application of 3D Laser Scanning for Shooting Reconstruction and Crime Scene Investigations”, Presented at the American Academy of Forensic Sciences Scientific Meeting, Washington, D.C., Feb. 21, 2008.
Howard, et al., “Virtual Environments for Scene of Crime Reconstruction and Analysis”, Advanced Interfaces Group, Department of Computer Science, University of Manchester, Manchester, UK, Feb. 28, 2000.
International Preliminary Report on Patentability and Written Opinion for PCT/EP2010/001779; Date of Issuance Sep. 27, 2011.
International Preliminary Report on Patentability and Written Opinion for PCT/IB2010/002258; Date of Issuance Feb. 21, 2012.
International Preliminary Report on Patentability and Written Opinion for PCT/EP2010/006866; Date of Issuance May 22, 2012.
International Preliminary Report on Patentability and Written Opinion for PCT/EP2010/006867; Date of Issuance May 22, 2012.
International Preliminary Report on Patentability and Written Opinion for PCT/EP2010/001780; Date of Issuance Sep. 27, 2011.
International Preliminary Report on Patentability and Written Opinion for PCT/EP2010/001781; Date of Issuance Sep. 27, 2011.
International Preliminary Report on Patentability and Written Opinion for PCT/EP2010/006868; Date of Issuance May 22, 2012.
International Preliminary Report on Patentability and Written Opinion for PCT/EP2009/009174; Date of Issuance Aug. 16, 2011.
International Search Report of the International Searching Authority for Application No. PCT/EP2011/001662; Date of Mailing May 26, 2011.
International Search Report of the International Searching Authority for Application No. PCT/EP2011/003261; Date of Mailing Oct. 17, 2011.
International Search Report of the International Searching Authority for Application No. PCT/EP2011/003263; Date of Mailing Oct. 19, 2011.
International Search Report of the International Searching Authority for Application No. PCT/EP2011/003264; Date of Mailing Oct. 19, 2011.
Leica Geosystems TruStory Forensic Analysis by Albuquerque Police Department, 2006.
Leica Geosystems: "Leica Rugby 55 Designed for Interior Built for Construction", Jan. 1, 2009, XP002660558, Retrieved from the Internet: URL:http://www.leica-geosystems.com/downloads123/zz/lasers/Rugby%2055/brochures/Leica_Rugby_55_brochure_en.pdf [retrieved on Oct. 5, 2011] the whole document.
Langford, et al., “Practical Skills in Forensic Science”, Pearson Education Limited, Essex, England, First Published 2005, Forensic Chemistry.
Huebner, S.F., "Sniper Shooting Technique", "Scharfschutzen Schiesstechnik", Copyright by C.A. Civil Arms Verlag GmbH, Lichtenwald 1989, Alle Rechte vorbehalten, pp. 11-17.
Se, et al., “Instant Scene Modeler for Crime Scene Reconstruction”, MDA, Space Missions, Ontario, Canada, Copyright 2005, IEEE.
The Scene, Journal of The Association for Crime Scene Reconstruction, Apr.-Jun. 2006, vol. 12, Issue 2.
Written Opinion of the International Searching Authority for Application No. PCT/EP2006/003010; Date of Mailing Dec. 11, 2006.
Written Opinion of the International Searching Authority for Application No. PCT/EP2011/001662; Date of Mailing May 26, 2011.
GB Examination Report for Application No. GB1220971.4 dated May 20, 2014.
Creaform Metrology Solutions, “Handy Scan 3D—The Truly Portable Metrology-Grade 3D Scanners” brochure, 7 pages.
Creaform, “Creaform Releases Completely Re-Engineered Handyscan 3D Portable Scanners”, May 5, 2014, 1 page.
Mandy, Yousef B., et al; “Projector Calibration Using Passive Stereo and Triangulation”; International Journal of Future Computer and Communication; vol. 2; No. 5; 385-390; Oct. 2013; 6 pgs.
Related Publications (1)
Number Date Country
20160238710 A1 Aug 2016 US
Provisional Applications (1)
Number Date Country
61362810 Jul 2010 US
Continuations (1)
Number Date Country
Parent 13697031 US
Child 15140909 US