The present disclosure relates to a coordinate measuring machine, and more particularly to a portable articulated arm coordinate measuring machine having a connector on a probe end of the coordinate measuring machine that allows accessory devices which use structured light for non-contact three dimensional measurement to be removably connected to the coordinate measuring machine.
Portable articulated arm coordinate measuring machines (AACMMs) have found widespread use in the manufacturing or production of parts where there is a need to rapidly and accurately verify the dimensions of the part during various stages of the manufacturing or production (e.g., machining) of the part. Portable AACMMs represent a vast improvement over known stationary or fixed, cost-intensive and relatively difficult to use measurement installations, particularly in the amount of time it takes to perform dimensional measurements of relatively complex parts. Typically, a user of a portable AACMM simply guides a probe along the surface of the part or object to be measured. The measurement data are then recorded and provided to the user. In some cases, the data are provided to the user in visual form, for example, three-dimensional (3-D) form on a computer screen. In other cases, the data are provided to the user in numeric form, for example when measuring the diameter of a hole, the text “Diameter=1.0034” is displayed on a computer screen.
An example of a prior art portable articulated arm CMM is disclosed in commonly assigned U.S. Pat. No. 5,402,582 ('582), which is incorporated herein by reference in its entirety. The '582 patent discloses a 3-D measuring system comprised of a manually-operated articulated arm CMM having a support base on one end and a measurement probe at the other end. Commonly assigned U.S. Pat. No. 5,611,147 ('147), which is incorporated herein by reference in its entirety, discloses a similar articulated arm CMM. In the '147 patent, the articulated arm CMM includes a number of features including an additional rotational axis at the probe end, thereby providing for an arm with either a two-two-two or a two-two-three axis configuration (the latter case being a seven axis arm).
Three-dimensional surfaces may be measured using non-contact techniques as well. One type of non-contact device, sometimes referred to as a laser line probe (LLP) or laser line scanner, emits laser light either as a spot or along a line. An imaging device, such as a charge-coupled device (CCD) for example, is positioned adjacent to the laser. The laser is arranged to emit a line of light which is reflected off of the surface. The surface of the object being measured causes a diffuse reflection which is captured by the imaging device. The image of the reflected line on the sensor changes as the distance between the sensor and the surface changes. By knowing the relationship between the imaging sensor and the laser, and the position of the laser image on the sensor, triangulation methods may be used to measure three-dimensional coordinates of points on the surface.
It is sometimes desirable to measure 3D coordinates of smoothly continuous edge points using one or more cameras integrated into an AACMM without using additional probes. There is a need to obtain such 3D coordinates with relatively high accuracy and high density and without errors that may be introduced by 3D noncontact measuring devices such as an LLP or structured light scanner.
While existing CMMs are suitable for their intended purposes, what is needed is a portable AACMM that has certain features of embodiments of the present invention.
In accordance with an embodiment of the invention, a method of determining three-dimensional (3D) coordinates of an edge point of an object is provided. The method comprises: providing a measurement device having at least one positioning device, a first camera, and an electronic circuit, the electronic circuit receiving a position signal from the positioning device providing data corresponding to a pose of the first camera, the electronic circuit having a processor and memory; in a first instance: capturing with the first camera in a first pose a first image of the object; obtaining from the electronic circuit first data corresponding to the first pose; in a second instance: capturing with the first camera in a second pose a second image of the object; obtaining from the electronic circuit second data corresponding to the second pose; in a third instance: capturing with the first camera in a third pose a third image of the object; obtaining from the electronic circuit third data corresponding to the third pose; determining with the processor the 3D coordinates of a first edge point, the 3D coordinates of the first edge point determined based at least in part on the first data, the second data, the third data, the first image, the second image, and the third image; and storing in memory the determined 3D coordinates of the first edge point.
In accordance with a further embodiment of the invention, a method of determining three-dimensional (3D) coordinates of an edge point of an object is provided. The method comprises: in a first instance: capturing with a first camera in a first pose a first image of the object, a positioning device providing data corresponding to a pose of the first camera and a second camera; determining the first pose with a processor based at least in part on a first data from the positioning device; capturing with the second camera in a second pose a second image of the object; determining the second pose with the processor based at least in part on a second data from the positioning device; in a second instance: capturing with the first camera in a third pose a third image of the object; determining the third pose with the processor based at least in part on a third data from the positioning device; determining with the processor the 3D coordinates of a first edge point, the first edge point being within an interval of edge points, the 3D coordinates of the first edge point determined based at least in part on the first pose, the second pose, the third pose, the first image, the second image, and the third image; and storing in a memory the 3D coordinates of the first edge point, the memory being operably coupled to the processor.
In accordance with a further embodiment of the invention, a measurement device for determining three-dimensional (3D) coordinates of an edge point of an object is provided. The measurement device includes at least one positioning device. A first camera is operably coupled to the at least one positioning device. An electronic circuit includes a processor and memory, the electronic circuit being operably coupled to receive data from the positioning device corresponding to a pose of the first camera. The processor is responsive to nontransitory executable computer instructions to: in a first instance: cause the first camera to capture in a first pose a first image of the object; determine the first pose in response to a first data from the at least one positioning device; in a second instance: cause the first camera to capture in a second pose a second image of the object; determine the second pose in response to a second data from the at least one positioning device; in a third instance: cause the first camera to capture in a third pose a third image of the object; determine the third pose in response to a third data from the at least one positioning device; and determine the 3D coordinates of a first edge point, the 3D coordinates of the first edge point determined based at least in part on the first pose, the second pose, the third pose, the first image, the second image, and the third image.
Referring now to the drawings, exemplary embodiments are shown which should not be construed to be limiting regarding the entire scope of the disclosure, and wherein the elements are numbered alike in the several FIGURES.
Portable articulated arm coordinate measuring machines (“AACMM”) are used in a variety of applications to obtain measurements of objects. Embodiments of the present invention provide advantages in allowing an operator to easily and quickly couple accessory devices to a probe end of the AACMM that use structured light to provide for the non-contact measuring of a three-dimensional object. Embodiments of the present invention provide further advantages in providing for communicating data representing a point cloud measured by the structured light device within the AACMM. Embodiments of the present invention provide advantages in greater uniformity in the distribution of measured points that may provide enhanced accuracy. Embodiments of the present invention provide still further advantages in providing power and data communications to a removable accessory without having external connections or wiring. Embodiments of the present invention provide still further advantages in sharpening edges of features in 3D representations.
As used herein, the term “structured light” refers to a two-dimensional pattern of light projected onto a contiguous and enclosed area of an object that conveys information which may be used to determine coordinates of points on the object. A structured light pattern will contain at least three non-collinear pattern elements disposed within the contiguous and enclosed area. Each of the three non-collinear pattern elements conveys information that may be used to determine the point coordinates.
In general, there are two types of structured light, a coded light pattern and an uncoded light pattern. As used herein a coded light pattern is one in which the three dimensional coordinates of an illuminated surface of the object may be ascertained by the acquisition of a single image. In some cases, the projecting device may be moving relative to the object. In other words, for a coded light pattern there will be no significant temporal relationship between the projected pattern and the acquired image. Typically, a coded light pattern will contain a set of elements (e.g. geometric shapes) arranged so that at least three of the elements are non-collinear. In some cases, the set of elements may be arranged into collections of lines. Having at least three of the elements be non-collinear ensures that the pattern is not a simple line pattern as would be projected, for example, by a laser line scanner. As a result, the pattern elements are recognizable because of the arrangement of the elements.
In contrast, an uncoded structured light pattern as used herein is a pattern that does not allow measurement through a single pattern when the projector is moving relative to the object. An example of an uncoded light pattern is one that requires a series of sequential patterns and thus the acquisition of a series of sequential images. Due to the temporal nature of the projection pattern and acquisition of the image, there should be no relative movement between the projector and the object.
It should be appreciated that structured light is different from light projected by a laser line probe or laser line scanner type device that generates a line of light. To the extent that laser line probes used with articulated arms today have irregularities or other aspects that may be regarded as features within the generated lines, these features are disposed in a collinear arrangement. Consequently, such features within a single generated line are not considered to make the projected light into structured light.
Each bearing cartridge within each bearing cartridge grouping 110, 112, 114 typically contains an encoder system (e.g., an optical angular encoder system). The encoder system (i.e., transducer) provides an indication of the position of the respective arm segments 106, 108 and corresponding bearing cartridge groupings 110, 112, 114 that all together provide an indication of the position of the probe 118 with respect to the base 116 (and, thus, the position of the object being measured by the AACMM 100 in a certain frame of reference—for example a local or global frame of reference). The arm segments 106, 108 may be made from a suitably rigid material such as but not limited to a carbon composite material for example. A portable AACMM 100 with six or seven axes of articulated movement (i.e., degrees of freedom) provides advantages in allowing the operator to position the probe 118 in a desired location within a 360° area about the base 116 while providing an arm portion 104 that may be easily handled by the operator. However, it should be appreciated that the illustration of an arm portion 104 having two arm segments 106, 108 is for exemplary purposes, and the claimed invention should not be so limited. An AACMM 100 may have any number of arm segments coupled together by bearing cartridges (and, thus, more or less than six or seven axes of articulated movement or degrees of freedom).
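By way of illustration only, the chaining of encoder readings into a probe position may be sketched as follows; the joint axes, segment offsets, and function names below are illustrative assumptions and are not values taken from the AACMM 100:

```python
import numpy as np

def rotation_about_axis(axis, theta):
    """Rodrigues' formula: rotation matrix for angle theta about 'axis'."""
    a = np.asarray(axis, dtype=float)
    a /= np.linalg.norm(a)
    K = np.array([[0, -a[2], a[1]],
                  [a[2], 0, -a[0]],
                  [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def probe_position(encoder_angles, joint_axes, segment_offsets):
    """Compose one encoder rotation per bearing cartridge with the fixed
    offset of each arm segment to express the probe tip in the base frame."""
    R = np.eye(3)
    p = np.zeros(3)
    for theta, axis, offset in zip(encoder_angles, joint_axes, segment_offsets):
        R = R @ rotation_about_axis(axis, theta)     # accumulate joint rotations
        p = p + R @ np.asarray(offset, dtype=float)  # step along the segment
    return p

# Illustrative two-joint example: a swivel and a hinge with 300 mm segments
print(probe_position([np.radians(30), np.radians(45)],
                     [(0, 0, 1), (1, 0, 0)],
                     [(300, 0, 0), (300, 0, 0)]))
```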
The probe 118 is detachably mounted to the measurement probe housing 102, which is connected to bearing cartridge grouping 112. A handle 126 is removable with respect to the measurement probe housing 102 by way of, for example, a quick-connect interface. As will be discussed in more detail below, the handle 126 may be replaced with another device configured to emit a structured light to provide non-contact measurement of three-dimensional objects, thereby providing advantages in allowing the operator to make both contact and non-contact measurements with the same AACMM 100. In exemplary embodiments, the probe housing 102 houses a removable probe 118, which is a contacting measurement device and may have different tips 118 that physically contact the object to be measured, including, but not limited to: ball, touch-sensitive, curved and extension type probes. In other embodiments, the measurement is performed, for example, by a non-contacting device such as a coded structured light scanner device. In an embodiment, the handle 126 is replaced with the coded structured light scanner device using the quick-connect interface. Other types of measurement devices may replace the removable handle 126 to provide additional functionality. Examples of such measurement devices include, but are not limited to, one or more illumination lights, a temperature sensor, a thermal scanner, a bar code scanner, a projector, a paint sprayer, a camera, or the like, for example.
In various embodiments, each grouping of bearing cartridges 110, 112, 114 allows the arm portion 104 of the AACMM 100 to move about multiple axes of rotation. As mentioned, each bearing cartridge grouping 110, 112, 114 includes corresponding encoder systems, such as optical angular encoders for example, that are each arranged coaxially with the corresponding axis of rotation of, e.g., the arm segments 106, 108. The optical encoder system detects rotational (swivel) or transverse (hinge) movement of, e.g., each one of the arm segments 106, 108 about the corresponding axis and transmits a signal to an electronic data processing system within the AACMM 100 as described in more detail herein below. Each individual raw encoder count is sent separately to the electronic data processing system as a signal where it is further processed into measurement data. No position calculator separate from the AACMM 100 itself (e.g., a serial box such as that disclosed in the aforementioned '582 patent) is required.
The base 116 may include an attachment device or mounting device 120. The mounting device 120 allows the AACMM 100 to be removably mounted to a desired location, such as an inspection table, a machining center, a wall or the floor for example. In one embodiment, the base 116 includes a handle portion 122 that provides a convenient location for the operator to hold the base 116 as the AACMM 100 is being moved. In one embodiment, the base 116 further includes a movable cover portion 124 that folds down to reveal a user interface, such as a display screen.
In accordance with an embodiment, the base 116 of the portable AACMM 100 contains or houses an electronic circuit having an electronic data processing system that includes two primary components: a base processing system that processes the data from the various encoder systems within the AACMM 100 as well as data representing other arm parameters to support three-dimensional (3-D) positional calculations; and a user interface processing system that includes an on-board operating system, a touch screen display, and resident application software that allows for relatively complete metrology functions to be implemented within the AACMM 100 without the need for connection to an external computer.
The electronic data processing system in the base 116 may communicate with the encoder systems, sensors, and other peripheral hardware located away from the base 116 (e.g., a structured light device that can be mounted to the removable handle 126 on the AACMM 100). The electronics that support these peripheral hardware devices or features may be located in each of the bearing cartridge groupings 110, 112, 114 located within the portable AACMM 100.
The base processor board 204 also manages all the wired and wireless data communication with external (host computer) and internal (display processor 202) devices. The base processor board 204 has the capability of communicating with an Ethernet network via an Ethernet function 320 (e.g., using a clock synchronization standard such as Institute of Electrical and Electronics Engineers (IEEE) 1588), with a wireless local area network (WLAN) via a LAN function 322, and with Bluetooth module 232 via a parallel to serial communications (PSC) function 314. The base processor board 204 also includes a connection to a universal serial bus (USB) device 312.
The base processor board 204 transmits and collects raw measurement data (e.g., encoder system counts, temperature readings) and processes it into measurement data without the need for any preprocessing by a separate device, such as the serial box disclosed in the aforementioned '582 patent. The base processor 204 sends the processed data to the display processor 328 on the user interface board 202 via an RS485 interface (IF) 326. In an embodiment, the base processor 204 also sends the raw measurement data to an external computer.
Though shown as separate components, in other embodiments all or a subset of the components may be physically located in different locations and/or their functions may be combined in different manners.
The handle portion 404 also includes buttons or actuators 416, 418 that may be manually activated by the operator. The actuators 416, 418 are coupled to the controller 408 that transmits a signal to a controller 420 within the probe housing 102. In the exemplary embodiments, the actuators 416, 418 perform the functions of actuators 422, 424 located on the probe housing 102 opposite the device 400. It should be appreciated that the device 400 may have additional switches, buttons or other actuators that may also be used to control the device 400, the AACMM 100 or vice versa. Also, the device 400 may include indicators, such as light emitting diodes (LEDs), sound generators, meters, displays or gauges for example. In one embodiment, the device 400 may include a digital voice recorder that allows for synchronization of verbal comments with a measured point. In yet another embodiment, the device 400 includes a microphone that allows the operator to transmit voice activated commands to the electronic data processing system 210.
In one embodiment, the handle portion 404 may be configured to be used with either operator hand or for a particular hand (e.g. left handed or right handed). The handle portion 404 may also be configured to facilitate operators with disabilities (e.g. operators with missing fingers or operators with prosthetic arms). Further, the handle portion 404 may be removed and the probe housing 102 used by itself when clearance space is limited. As discussed above, the probe end 401 may also comprise the shaft of the seventh axis of AACMM 100. In this embodiment the device 400 may be arranged to rotate about the AACMM seventh axis.
The probe end 401 includes a mechanical and electrical interface 426 having a first connector 429.
The electrical connector 434 extends from the first surface 430 and includes one or more connector pins 440 that are electrically coupled in asynchronous bidirectional communication with the electronic data processing system 210.
The mechanical coupler 432 provides relatively rigid mechanical coupling between the device 400 and the probe housing 102 to support relatively precise applications in which the location of the device 400 on the end of the arm portion 104 of the AACMM 100 preferably does not shift or move. Any such movement may typically cause an undesirable degradation in the accuracy of the measurement result. These desired results are achieved using various structural features of the mechanical attachment configuration portion of the quick connect mechanical and electronic interface of an embodiment of the present invention.
In one embodiment, the mechanical coupler 432 includes a first projection 444 positioned on one end 448 (the leading edge or “front” of the device 400). The first projection 444 may include a keyed, notched or ramped interface that forms a lip 446 that extends from the first projection 444. The lip 446 is sized to be received in a slot 450 defined by a projection 452 extending from the probe housing 102.
Opposite the first projection 444, the mechanical coupler 432 may include a second projection 454. The second projection 454 may have a keyed, notched-lip or ramped interface surface 456.
The probe housing 102 includes a collar 438 arranged co-axially on one end. The collar 438 includes a threaded portion that is movable between a first position and a second position.
To couple the device 400 to the probe housing 102, the lip 446 is inserted into the slot 450 and the device is pivoted to rotate the second projection 454 toward surface 458 as indicated by arrow 464.
Embodiments of the interface 426 allow for the proper alignment of the mechanical coupler 432 and electrical connector 434 and also protect the electronics interface from applied stresses that may otherwise arise due to the clamping action of the collar 438, the lip 446 and the surface 456. This provides advantages in reducing or eliminating stress damage to the electrical connectors 434, 442, which are mounted on circuit board 476 and may have soldered terminals. Also, embodiments provide advantages over known approaches in that no tools are required for a user to connect or disconnect the device 400 from the probe housing 102. This allows the operator to manually connect and disconnect the device 400 from the probe housing 102 with relative ease.
Due to the relatively large number of shielded electrical connections possible with the interface 426, a relatively large number of functions may be shared between the AACMM 100 and the device 400. For example, switches, buttons or other actuators located on the AACMM 100 may be used to control the device 400 or vice versa. Further, commands and data may be transmitted from electronic data processing system 210 to the device 400. In one embodiment, the device 400 is a video camera that transmits data of a recorded image to be stored in memory on the base processor 204 or displayed on the display 328. In another embodiment the device 400 is an image projector that receives data from the electronic data processing system 210. In addition, temperature sensors located in either the AACMM 100 or the device 400 may be shared by the other. It should be appreciated that embodiments of the present invention provide advantages in providing a flexible interface that allows a wide variety of accessory devices 400 to be quickly, easily and reliably coupled to the AACMM 100. Further, the capability of sharing functions between the AACMM 100 and the device 400 may allow a reduction in size, power consumption and complexity of the AACMM 100 by eliminating duplication.
In one embodiment, the controller 408 may alter the operation or functionality of the probe end 401 of the AACMM 100. For example, the controller 408 may alter indicator lights on the probe housing 102 to either emit a different color light, a different intensity of light, or turn on/off at different times when the device 400 is attached versus when the probe housing 102 is used by itself. In one embodiment, the device 400 includes a range finding sensor (not shown) that measures the distance to an object. In this embodiment, the controller 408 may change indicator lights on the probe housing 102 in order to provide an indication to the operator of how far away the object is from the probe tip 118. In another embodiment, the controller 408 may change the color of the indicator lights based on the quality of the image acquired by the coded structured light scanner device. This provides advantages in simplifying the requirements of controller 420 and allows for upgraded or increased functionality through the addition of accessory devices.
In the exemplary embodiment, the projector 508 uses a visible light source that illuminates a pattern generator. The visible light source may be a laser, a superluminescent diode, an incandescent light, a light emitting diode (LED), or other light emitting device. In the exemplary embodiment, the pattern generator is a chrome-on-glass slide having a structured light pattern etched thereon. The slide may have a single pattern or multiple patterns that move in and out of position as needed. The slide may be manually or automatically installed in the operating position. In other embodiments, the source pattern may be light reflected off or transmitted by a digital micro-mirror device (DMD) such as a digital light projector (DLP) manufactured by Texas Instruments Corporation, a liquid crystal device (LCD), a liquid crystal on silicon (LCOS) device, or a similar device used in transmission mode rather than reflection mode. The projector 508 may further include a lens system 515 that alters the outgoing light to have the desired focal characteristics.
The device 500 further includes an enclosure 502 with a handle portion 504. In one embodiment, the device 500 may further include an interface 426 on one end that mechanically and electrically couples the device 500 to the probe housing 102 as described herein above. In other embodiments, the device 500 may be integrated into the probe housing 102. The interface 426 provides advantages in allowing the device 500 to be coupled to and removed from the AACMM 100 quickly and easily without requiring additional tools.
The camera 510 includes a photosensitive sensor which generates a digital image/representation of the area within the sensor's field of view. The sensor may be a charge-coupled device (CCD) type sensor or a complementary metal-oxide-semiconductor (CMOS) type sensor having an array of pixels, for example. The camera 510 may further include other components, such as but not limited to lens 503 and other optical devices for example. In the exemplary embodiment, the projector 508 and the camera 510 are arranged at an angle such that the sensor may receive light reflected from the surface of the object 501. In one embodiment, the projector 508 and camera 510 are positioned such that the device 500 may be operated with the probe tip 118 in place. Further, it should be appreciated that the device 500 is substantially fixed relative to the probe tip 118, so that forces on the handle portion 504 may not influence the alignment of the device 500 relative to the probe tip 118. In one embodiment, the device 500 may have an additional actuator (not shown) that allows the operator to switch between acquiring data from the device 500 and the probe tip 118.
The projector 508 and camera 510 are electrically coupled to a controller 512 disposed within the enclosure 502. The controller 512 may include one or more microprocessors, digital signal processors, memory and signal conditioning circuits. Due to the digital signal processing and large data volume generated by the device 500, the controller 512 may be arranged within the handle portion 504. The controller 512 is electrically coupled to the arm buses 218 via electrical connector 434. The device 500 may further include actuators 514, 516 which may be manually activated by the operator to initiate operation and data capture by the device 500. In one embodiment, the image processing to determine the X, Y, Z coordinate data of the point cloud representing object 501 is performed by the controller 512 and the coordinate data is transmitted to the electronic data processing system 210 via bus 240. In another embodiment images are transmitted to the electronic data processing system 210 and the calculation of the coordinates is performed by the electronic data processing system 210.
In one embodiment, the controller 512 is configured to communicate with the electronic data processing system 210 to receive structured light pattern images from the electronic data processing system 210. In still another embodiment, the pattern emitted onto the object may be changed by the electronic data processing system 210, either automatically or in response to an input from the operator. This may provide advantages in obtaining higher accuracy measurements with less processing time by allowing the use of patterns that are simpler to decode when conditions warrant, and the use of more complex patterns where needed to achieve the desired level of accuracy or resolution.
To determine the coordinates measured by a pixel, the angle of each projected ray of light 509 intersecting the object 522 in a point 527 is known to correspond to a projection angle phi (Φ), so that Φ information is encoded into the emitted pattern. In an embodiment, the system is configured to enable the Φ value corresponding to each pixel in the imaged pattern to be ascertained. Further, an angle omega (Ω) for each pixel in the camera is known, as is the baseline distance “D” between the projector 508 and the camera. Therefore, the distance “Z” from the camera 510 to the location that the pixel has imaged may be determined by the law of sines, using the equation Z/D = sin(Φ)/sin(Ω + Φ).
Thus three-dimensional coordinates may be calculated for each pixel in the acquired image.
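By way of a minimal sketch of this relation (assuming, as above, that Φ and Ω are measured from the baseline; the function name is illustrative):

```python
import numpy as np

def triangulate_z(phi, omega, baseline_d):
    """Law of sines applied to the projector-point-camera triangle:
    Z / D = sin(phi) / sin(omega + phi)."""
    return baseline_d * np.sin(phi) / np.sin(omega + phi)

# Example: 45-degree projection angle, 30-degree camera angle, 100 mm baseline
print(f"Z = {triangulate_z(np.radians(45), np.radians(30), 100.0):.1f} mm")  # ~73.2 mm
```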
In general, there are two categories of structured light, namely coded and uncoded structured light. A common form of uncoded structured light is a periodic pattern, such as the sinusoidally repeating pattern discussed below.
Epipolar lines are mathematical lines formed by the intersection of epipolar planes and the source plane 517 or the image plane 521 (the plane of the camera sensor).
In embodiments having a periodic pattern, such as a sinusoidally repeating pattern, the sinusoidal period represents a plurality of pattern elements. Since there is a multiplicity of periodic patterns in two-dimensions, the pattern elements are non-collinear. In some cases, a striped pattern having stripes of varying width may represent a coded pattern.
Similarly, rather than a binary pattern, a sequential series of gray patterns having stripes with varying gray-scale values may be used. When used in this context, the term gray-scale usually refers to an amount of irradiance at a point on the object from white (maximum light), to various levels of gray (less light), to black (minimum light). This same nomenclature is used even if the light being projected has a color such as red, and the gray-scale values correspond to levels of red illumination.
In another embodiment, the distance Z to an object point may be found by measuring a phase shift observed in a plurality of images, each image captured with the projected sinusoidal pattern shifted by a known fraction of its period.
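As one illustration (the textbook four-step calculation, not necessarily the exact computation of the embodiment), shifting the sinusoid by a quarter period between four captures lets the wrapped phase at each pixel fall out of an arctangent:

```python
import numpy as np

def wrapped_phase(i0, i1, i2, i3):
    """Wrapped phase from four images of a shifted sinusoid,
    I_k = A + B * cos(phase + k * pi / 2). The result lies in (-pi, pi],
    so positions repeat every period of the pattern."""
    return np.arctan2(i3 - i1, i0 - i2)

# Synthetic check: a pixel whose true phase is 1.0 radian
imgs = [5 + 2 * np.cos(1.0 + k * np.pi / 2) for k in range(4)]
print(wrapped_phase(*imgs))  # ~1.0
```

Because the result repeats every period of the pattern, a measurement made this way is ambiguous by an integer number of periods.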
Another method for eliminating ambiguity is to combine the phase measurement with a different type of method, such as the gray code method, which coarsely identifies the period in which each point lies.
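A sketch of how the coarse and fine measurements might be combined; the decoded stripe index and the mapping below are illustrative assumptions:

```python
import numpy as np

def absolute_position(phase, stripe_index, period):
    """Remove the one-period ambiguity of a phase-shift measurement by
    combining the wrapped phase (in (-pi, pi]) with the absolute stripe
    index decoded from a gray-code sequence. The result has the units
    of 'period'."""
    fraction = (phase + np.pi) / (2.0 * np.pi)   # position within one period
    return (stripe_index + fraction) * period
```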
In applications where the object and device 500 are in relative motion, it may be desirable to use a single pattern that allows the camera 510 to capture an image that provides sufficient information to measure the three dimensional characteristics of the object 501 without having to project sequential images.
Another embodiment uses color patterns.
Since the pattern 720 is repeated, it would generally cause ambiguities in the line identification. However, this problem is resolved in this system through the geometry of the camera's field of view and depth of field. For a single view of the camera, i.e., a row of pixels, within the depth of field in which the lines can be optically resolved, no two lines with the same phase can be imaged. For example, the first row of pixels on the camera can only receive reflected light from lines 1-30 of the pattern, whereas further down the camera sensor, another row will only receive reflected light from lines 2-31 of the pattern, and so on.
This approach of coding the relative phases rather than the absolute phases provides advantages in that there is a higher tolerance for the positions of the phases. Minor errors in the construction of the projector may cause the phases of the lines to shift throughout the depth of field of the camera; these shifts, together with errors due to the projector and camera lenses, make an absolute phase much more difficult to determine. Such errors can be overcome in the absolute phase method by increasing the period such that it is sufficiently large to overcome the error in determining the phase.
It should be appreciated that for the case of a two-dimensional pattern that projects a coded pattern of light, the three non-collinear pattern elements are recognizable because of their codes, and since they are projected in two dimensions, the at least three pattern elements are non-collinear. For the case of the periodic pattern, such as the sinusoidally repeating pattern, each sinusoidal period represents a plurality of pattern elements. Since there is a multiplicity of periodic patterns in two dimensions, the pattern elements are non-collinear. In contrast, for the case of the laser line scanner that emits a line of light, all of the pattern elements lie on a straight line. Although the line has width and the tail of the line cross section may have less optical power than the peak of the signal, these aspects of the line are not evaluated separately in finding surface coordinates of an object and therefore do not represent separate pattern elements. Although the line may contain multiple pattern elements, these pattern elements are collinear.
Further, the various pattern techniques may be combined.
The controller 706 includes a communications circuit configured to wirelessly transmit data, such as images or coordinate data, via a communications link 712 to the AACMM 100, to a separate computing device 710, or to a combination of both. The computing device 710 may be, but is not limited to, a computer, a laptop, a tablet computer, a personal digital assistant (PDA), or a cell phone, for example. The display 708 may allow the operator to see the acquired images or the point cloud of acquired coordinates of the object 702. In one embodiment, the controller 706 decodes the patterns in the acquired image to determine the three-dimensional coordinates of the object. In another embodiment, the images are acquired by the device 704 and transmitted to either the AACMM 100, the computing device 710 or a combination of both.
The device 704 may further include a location device assembly 714. The location device assembly may include one or more inertial navigation sensors, such as a Global Positioning System (GPS) sensor, a gyroscopic sensor, or an accelerometer sensor. Such sensors may be electrically coupled to the controller 706. Gyroscopic and accelerometer sensors may be single-axis or multiple-axis devices. The location device assembly 714 is configured to allow the controller 706 to measure or maintain the orientation of the device 704 when detached from the AACMM 100. A gyroscope within the location device assembly 714 may be a MEMS gyroscopic device, a solid-state ring-laser device, a fiber optic gyroscope, or another type.
When the device 704 is removed from the articulated arm CMM 100, a method is used to combine images obtained from multiple scans. In an embodiment the images are each obtained by using coded patterns so that only a single image is needed to obtain three-dimensional coordinates associated with a particular position and orientation of the device 704. One way to combine multiple images captured by the device 704 is to provide at least some overlap between adjacent images so that point cloud features may be matched. This matching function may be assisted by the inertial navigation devices described above.
Another method that can be used to assist in accurate registration of images collected by the device 704 is the use of reference markers. In an embodiment, the reference markers are small markers having an adhesive or sticky backing, for example, circular markers that are placed on an object or objects being measured. Even a relatively small number of such markers can be useful in registering multiple images, especially if the object being measured has a relatively small number of features to use for registration. In an embodiment, the reference markers may be projected as spots of light onto the object or objects under inspection. For example, a small portable projector capable of emitting a plurality of small dots may be placed in front of the object or objects to be measured. An advantage of projected dots over sticky dots is that the dots do not have to be attached and later removed.
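Once three or more markers have been matched between two scans, a rigid transform aligning them may be found; the following sketch uses the standard SVD-based (Kabsch) method, which is one conventional choice rather than a method prescribed by the embodiments:

```python
import numpy as np

def register_markers(points_a, points_b):
    """Best-fit rotation R and translation t mapping matched marker
    positions in scan A onto their positions in scan B (Kabsch method)."""
    a, b = np.asarray(points_a, float), np.asarray(points_b, float)
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    H = (a - ca).T @ (b - cb)                # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cb - R @ ca
```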
In one embodiment, the device projects the structured light over a contiguous and enclosed area 716 and can acquire an image over the area 716 at a range of 100 mm to 300 mm with an accuracy of 35 microns. In an embodiment, the perpendicular area 716 of projection is approximately 150 to 200 mm². The camera or cameras 510 may be digital cameras having a 1.2 to 5.0 megapixel CMOS or CCD sensor.
The resulting cogs 724 are next used to find the pattern lines 722. This is done by moving in a left-to-right direction (when viewed from the direction shown in the FIGS.) starting with the first column of the image. For each cog 724 in this column, the neighboring column to the immediate right is searched for a cog 724 that is within a particular distance. If two matching cogs 724 are found, then a potential line has been determined. As the process moves across the image, more new lines are determined and other previously determined lines are extended in length as additional cogs 724 are detected within the tolerance. Once the entire image has been processed, a filter is applied to the extracted lines to ensure that only lines of a desired length, which is the wavelength of the pattern, are used in the remaining steps.
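The left-to-right linking described above may be sketched as follows; the data layout (one list of cog y-positions per image column) and the names are assumptions made for illustration:

```python
def link_cogs_into_lines(cogs_by_column, max_dy):
    """Greedily link cogs into candidate pattern lines, extending a line
    whenever the next column holds a cog within 'max_dy' of its last cog."""
    lines = []                                   # each line: list of (col, y)
    for col, cogs in enumerate(cogs_by_column):
        unused = list(cogs)
        for line in lines:
            last_col, last_y = line[-1]
            if last_col != col - 1:
                continue                         # line ended earlier; skip it
            match = next((y for y in unused if abs(y - last_y) <= max_dy), None)
            if match is not None:
                line.append((col, match))
                unused.remove(match)
        lines.extend([[(col, y)] for y in unused])  # leftover cogs start lines
    return lines
```

A length filter keeping only lines of at least the pattern wavelength, as described above, would then be applied to the returned candidates.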
The next step in the decoding process is to extract the projected pattern features along the lines in the X direction in the form of block centers. Each pattern contains both wide blocks and narrow blocks. In the sinusoidal pattern 720 this refers to the peaks and valleys of the wave, and in the square pattern 730 this refers to the wide squares and the narrow squares. This process proceeds in a similar fashion to extracting the features in the Y direction; however, the moving average is calculated using the widths found in the first stage, and the direction of movement is along the line. As described above, the features are extracted in the areas where widths are above the moving average value, but in this process features are also extracted in the areas where the widths are below the moving average. The widths and X positions are used to calculate a weighted average to find the center of the block 726 in the X direction. The Y positions of the cogs 724 between moving average crossings are also used to calculate a center for the block 726 in the Y direction. This is carried out by taking the average of the Y coordinates of the cogs. The start and end points of each line are also modified based on the features extracted in this step to ensure that both points are where the crossing of the moving average occurs. In one embodiment, only complete blocks are used in later processing steps.
The lines and blocks are then processed further to ensure that the distance between the block centers 726 on each line is within a predetermined tolerance. This is accomplished by taking the delta between the X center positions of two neighboring blocks on a line and checking that the delta is below the tolerance. If the delta is above the tolerance, then the line is broken up into smaller lines. If the break is required between the last two blocks on a line, then the last block is removed and no additional line is created. If the break is required between the first and second or second and third blocks on a line, then the blocks to the left of the break are also discarded and no additional line is created. For situations where the break occurs in any other place along the line, the line is broken into two, a new line is created, and the appropriate blocks are transferred to it. After this stage of processing, the two patterns require different steps to finish decoding.
The sinusoidal pattern 720 may now be decoded with one additional step of processing using the block centers on the lines. The modulus of each block X center with respect to the wavelength of the pattern 720 is calculated for a line 722, and the average of these values gives the phase of the line 722. The phase of the line 722 may then be used to identify the line in the pattern 720, which in turn allows for the determination of an X, Y, Z coordinate position for all cogs 724 on that line 722.
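In sketch form, with one assumption added for robustness: the simple average described above is replaced by a circular mean so that block centers straddling the wrap-around (e.g., 0.1 and 19.9 for a wavelength of 20 pixels) still average correctly:

```python
import numpy as np

def line_phase(block_x_centers, wavelength):
    """Phase of a line from its block-center positions modulo the pattern
    wavelength, combined as a circular mean; the result is in pixels."""
    angles = 2 * np.pi * (np.asarray(block_x_centers, float) % wavelength) / wavelength
    mean = np.arctan2(np.sin(angles).mean(), np.cos(angles).mean())
    return (mean * wavelength / (2 * np.pi)) % wavelength
```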
Before the square pattern 730 can be decoded, the lines 732 must first be connected vertically. This allows a group of lines to be identified, rather than just a single line as with the sinusoidal pattern. Connections 736 are found between lines 732 by using the blocks 734 and the cogs contained in the block calculated in the first stage of processing. The first cog in each block on a line 732 is tested to see if there is another cog directly below it in the same column. If there is no cog below, then there is no connection with another line at this point, and processing continues. If there is a cog below, then the Y distance between the two cogs is determined and compared to a desired maximum spacing between lines. If the distance is less than this value, the two lines are considered connected at that point, the connection 736 is stored, and processing continues onto the next block. In one embodiment, a line connection 736 is unique such that no two lines will have more than one connection 736 between them.
The next step of processing for the square pattern 730 is phase calculation between connected lines. Each pair of lines 732 is first processed to determine the length of overlap between them. In one embodiment, there is at least one wavelength of overlap between the pair of lines to allow the calculation of the relative phase. If the lines have the desired overlap, then the cog at the center of the area of overlap is found. The blocks 738 that contain the center cog and the cog directly below are determined, and the relative phase between the block X centers is calculated for that line connection. This process is repeated for all connections between lines. In one embodiment, the process is repeated in only the downwards direction in the Y axis, because the code is based on connections below lines, not above lines or both.
The next step in decoding the square pattern 730 is performing a look up using the relative phases calculated in the previous step. Each line 732 is processed by tracking down the line connections 736 until a connection depth of four is reached. This depth is used because this is the number of phases needed to decode the line. At each level of the connection a hash is determined using the relative phase between the lines 732. When the required connection depth is reached, the hash is used to look up the line code. If the hash returns a valid code, then this is recorded and stored in a voting system. Every line 732 is processed in this way, and all connections that are of the desired depth are used to generate a vote if they are a valid phase combination. The final step is then to find which code received the most votes on each line 732 and to assign this code to the line 732. If there is not a unique code that received the most votes, then the line is not assigned a code. The lines 732 are identified once a code has been assigned, and the X, Y, Z coordinate position for all cogs on that line 732 may then be found.
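The voting step may be pictured with the following sketch, in which each depth-four chain of connections has already been reduced to a hash and 'codebook' maps valid phase-combination hashes to line codes; both data structures are assumptions for illustration:

```python
from collections import Counter

def assign_line_code(chain_hashes, codebook):
    """Tally one vote per valid depth-four connection chain and assign a
    code only when a single candidate receives the most votes."""
    votes = Counter(codebook[h] for h in chain_hashes if h in codebook)
    ranked = votes.most_common(2)
    if not ranked:
        return None                   # no valid phase combination was found
    if len(ranked) == 2 and ranked[0][1] == ranked[1][1]:
        return None                   # tie: leave the line unidentified
    return ranked[0][0]
```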
Although the descriptions given above distinguish between line scanners and area (structured light) scanners based on whether three or more pattern elements are collinear, it should be noted that the intent of this criterion is to distinguish patterns projected as areas from patterns projected as lines. Consequently, patterns projected in a linear fashion having information only along a single path are still line patterns even though the one-dimensional pattern may be curved.
A difficulty sometimes encountered in making measurements with a triangulation scanner attached to the end of an articulated arm CMM is that edges are not very sharp. In other words, the edge may have a radius or a chamfer. Such edges may be edges of parts, holes in parts, or sharp aspects of other features. Problems with fuzzy or inaccurately located edges may be seen with line scanners or area scanners. Although the edges viewed in the two-dimensional (2D) image of a triangulation scanner may be sharp, the exact distance to the edge may be less certain. Near an edge, a single pixel may have a distance that is not clearly defined. For one portion of the light reflected into the pixel, the light may come from a flat surface. For another portion of the pixel, the distance may be that of neighboring points on the side or bottom of a hole, or it may be a faraway distance in the case of an edge of a part. In most cases, because of lens defocus, lens aberrations, and limited modulation transfer function (MTF), a plurality of pixels (rather than a single pixel) may correspond to a feature such as the edge of a hole. In this case, when the point in question is near an edge, the apparent distance for the pixel may not correspond to a single distance to a point on the object. Sometimes the term “mixed pixel” is used to refer to the case in which the distance ascribed to a single pixel on the final image is determined by a plurality of distances on the object. In such a case, the distance as determined by the triangulation scanner for the pixel in question may be a simple average of the distances over the extent of the pixel. In other cases, the distance as determined by the triangulation scanner may be a much different value, as for example when an “ambiguity range” is exceeded during a phase shift method of triangulation. In this case, the distance may be in error by an amount that is difficult to predict.
In accordance with one embodiment, a solution to this issue uses the sharp edges that appear in one or more 2D images of the feature being measured. In many cases, such edge features can be clearly identified in 2D images—for example, based on textural shadings. These sharp edges may be determined in coordination with those surface coordinates that are determined accurately using the triangulation methods. By intersecting the projected rays that pass through the perspective center of the lens in the triangulation scanner with the 3D coordinates of the portion of the surface determined to relatively high accuracy by triangulation methods, the 3D coordinates of the edge features may be accurately determined.
It should be further appreciated that edges seen in an image are never perfectly sharp, and so an imperfect edge discontinuity (for example, a fillet) will have to be relatively wide to be seen clearly by a camera. A position of an imperfect edge may still be calculated using methods discussed herein (for example, taking a centroid) to obtain an edge value to a subpixel resolution. In other words, even though a camera will respond on a subpixel level to the width of an edge, the methods given here are still valid, as there is generally less uncertainty in the position of an edge taken from a 2D image than from a 3D image, which has a relatively higher amount of data noise when compared with 2D images. In some cases, the surfaces meet to form a substantially 90-degree angle. In other cases, the surfaces may meet with an intermediary surface that is angled less than 90 degrees (e.g. 45 degrees), such as a chamfer or a bevel for example. In other cases, there may be a curved intermediary surface, such as a fillet for example. In still other cases, the edge may be “broken,” such as where the intersection of the surfaces is worked with a file or rasp for example. The methods disclosed herein remain valid for edges having these characteristics. In some embodiments, empirical data may be collected to understand how the edge contrast changes in the captured image under prescribed lighting conditions.
The method of combining the 2D image captured by a camera, which may in some embodiments be the camera 510 but in other cases a separate camera 3410, is to project the rays of light 3440, 3442 corresponding to the edges of the hole 3432A, 3432B captured by the photosensitive array 3416 from the corresponding points on the photosensitive array 3416 so that these rays intersect the edges of the surface 3430A, 3430B. This intersection determines the 3D edge coordinates.
This method may be more clearly understood by considering the example of an object 3600 having a flat region 3610 into which a hole 3620 is drilled. A region extends from the edge of the hole 3620 to a peripheral boundary 3622 in which there is a relatively high level of uncertainty because of the mixed pixel effects discussed above. An assumption is made, based on a priori knowledge of the part being investigated, that the edge (in this case, of a hole) is sharp and the surface is generally flat. Therefore, by projecting the 2D image of the hole through the lens perspective center onto the flat region having coordinates determined using triangulation, the 3D coordinates of the sharp edges of the hole may be determined to relatively high accuracy. In a similar manner, the 3D coordinates of any sort of sharp edge may be determined.
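The projection step may be sketched as follows: with the camera perspective center taken as the origin, each edge pixel defines a ray, and intersecting that ray with the plane fitted to the reliably triangulated points surrounding the hole yields a 3D edge coordinate (a circle fit to the returned points would then give the hole center and diameter); the names are illustrative:

```python
import numpy as np

def project_edge_pixels_to_plane(edge_rays, plane_point, plane_normal):
    """Intersect rays through the lens perspective center (origin of the
    camera frame, one unit ray per edge pixel) with the fitted plane
    n . (x - p0) = 0; assumes no ray is parallel to the plane."""
    n = np.asarray(plane_normal, float)
    p0 = np.asarray(plane_point, float)
    points = []
    for d in edge_rays:
        d = np.asarray(d, float)
        t = np.dot(p0, n) / np.dot(d, n)   # parameter along the ray x = t * d
        points.append(t * d)
    return np.array(points)
```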
In an embodiment, an uncertainty distance 3424 characteristic of the triangulation system is provided. In some cases, the uncertainty distance is based at least in part on the amount of noise observed in a region or a measure of the “smoothness” of edges. In regions of high noise or low smoothness, uncertainty distance may be increased. Other factors such as light level, which might be a level of ambient light or a level of illumination provided by the device 401, may also be considered in determining an appropriate uncertainty distance 3424.
A method 3700 is now described for determining 3D coordinates of an edge point located on an edge feature using a noncontact 3D measuring device that includes a combination of a projector, a scanner camera, and an edge-detecting camera.
In a step 3710, an electronic circuit within the AACMM receives a position signal from the position transducers in the arm segments and sends a first electrical signal to the processor. The position signal may be generated in response to the operator moving the noncontact 3D measurement device from a first position to a second position adjacent the object to be measured. As used herein, the second position is located such that the object is within both the field of view and the focal range of the noncontact 3D measurement device. In a step 3715, the operator activates the noncontact 3D measuring device, such as by depressing an actuator for example, and the projector emits a first pattern of light onto the object. In a step 3720, the scanner camera receives the first pattern of light reflected from the object. In response to receiving the reflected light, the scanner camera sends a second electrical signal to the processor.
In a step 3725, the edge-detecting camera receives a second light reflected from the object and sends a third electrical signal to the processor in response. A portion of the second light is reflected from an edge feature of the object, where the edge point is a point on the edge feature. The second light may come from a variety of sources. It may be ambient light coming from background light sources in the environment. The second light may be intentionally emitted by a light source element coupled to the probe end. The light source may provide a uniform illumination over the surface. The second light may be sent to the object at a different time than the first light pattern.
In a step 3730, the processor determines first 3D coordinates of first points on a surface of the object. These first 3D points are based at least in part on the first pattern of light from the projector and the second electrical signal, which arises from the image captured by the scanner camera. Using triangulation methods, the 3D coordinates of the first points on the surface are determined in the local frame of reference of the projector and scanner camera. By further including the first electrical signals, the position of the object surface in an AACMM frame of reference may be determined.
In a step 3735, the processor further determines a first ray, the first ray extending from the edge point on the object to the edge-detecting camera. The first ray is the ray that passes from the edge point through the perspective center of the edge-detecting camera. The processor determines the first ray based at least in part on the third electrical signal, which captures the edge in the image of a photosensitive array within the edge-detecting camera. In addition, the first ray is based on the first electrical signal, which is needed to determine the first ray within the AACMM frame of reference. The first ray may be represented as a vector within the AACMM frame of reference.
In a step 3740, the processor further determines 3D coordinates of the edge point based at least in part on an intersection of the first ray with the first 3D coordinates of the first surface. This may be done by determining a characteristic distance over which 3D data is considered of less accuracy than desired. The characteristic distance may be based on a rule associated with a given system, or it may be based on image quality (for example, jagged edges or noise in 3D points near the edge). The general approach is to mathematically project a smooth surface (characterized by 3D points) along a continuing path across the characteristic distance until the smooth surface intersects the first ray. In most cases, a large number of first rays, each corresponding to an edge point on an edge feature, are projected to intersect a projection of a smooth surface, thereby enabling more accurate determination of 3D points on and near the edge feature. In a step 3745, the 3D coordinates of the edge point are stored.
It should be appreciated that the coupling of a noncontact 3D measuring device to an AACMM that is manually moved by an operator may have advantages over other systems, such as those that use robotic systems. In general, an AACMM will be able to determine the location of the noncontact 3D measuring device in space (relative to the local AACMM coordinate system) much more accurately than a robotic system. Further, an operator may move the AACMM articulated arm segments in an ad hoc manner to place the noncontact 3D measuring device in a position to measure a desired surface on any object within the operating area of the AACMM. A robotic system, on the other hand, would require complex programming to define the movement of the robot, which increases the time and cost to perform an equivalent scan of the object.
It is often the case that 3D coordinates of edge features of an object are of more importance than 3D coordinates of smooth regions of the object, for example, the smooth regions between edge features. A method is now described for determining 3D coordinates of points on edge features using a 2D camera coupled to a probe end 401 of an AACMM 100. Optionally, a scanner such as an LLP or a structured light scanner may further be used to obtain 3D coordinates of other (smooth) regions of an object, as described herein below.
In an embodiment according to the inventive method now described, a single camera is used to determine 3D coordinates of edges. The camera may be, for example, the camera of the camera assembly 3810 or of the camera assembly 3840 described herein.
In an embodiment, the camera assembly, which might for example be the camera assembly 3810 or the camera assembly 3840, is moved to two different poses. The term "pose" as used herein refers to the six degrees of freedom of the camera assembly. The six degrees of freedom include three translational degrees of freedom, for example (x, y, z), and three orientational degrees of freedom, for example (pitch angle, yaw angle, roll angle). The pose of the camera assembly specifies the position and orientation of the camera assembly in space.
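For illustration only, a pose may be represented in software as a homogeneous transform built from the six degrees of freedom. The Python sketch below assumes one common Euler-angle convention; the disclosed system is not limited to any particular convention, and the function name is hypothetical:

```python
import numpy as np

def pose_matrix(x, y, z, pitch, yaw, roll):
    """Build a 4x4 rigid transform from six degrees of freedom.

    Angles are in radians; the rotation order below is one common
    convention and is an assumption of this sketch.
    """
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cr, sr = np.cos(roll), np.sin(roll)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])   # pitch
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])   # yaw
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])   # roll
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = (x, y, z)
    return T
```

In an AACMM-mounted embodiment, such a transform for each pose would follow directly from the arm's angular encoder readings rather than from image data.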
The camera in the first pose 3900 has a perspective center O1, which is the point from which rays of light from the feature 3940 appear to pass before reaching the camera photosensitive array when the camera is in the first pose. An epipolar plane 3902 is established by projecting the plane of the camera photosensitive array symmetrically about the perspective center O1.
The camera in the second pose 3920 has a perspective center O2, which is the point from which rays of light from the feature 3940 appear to pass before reaching the camera photosensitive array when the camera is in the second pose. An epipolar plane 3922 is established by projecting the plane of the camera photosensitive array symmetrically about the perspective center O2. In the method that follows, the epipolar plane 3922 is mathematically equivalent to the plane of the photosensitive array.
A line drawn between the perspective center O1 and the perspective center O2 is referred to as the baseline 3930 between the camera in the first pose 3900 and the camera in the second pose 3920. The length of the baseline 3930 is B. The baseline intersects the epipolar plane 3902 at the epipole E1, and it intersects the epipolar plane 3922 at the epipole E2. Consider a point VD on the edge 3942. A ray from this point through the perspective center O1 intersects the epipolar plane 3902 at the point UD. A ray drawn from the point VD through the perspective center O2 intersects the epipolar plane 3922 at the point WD. A line that resides on an epipolar plane and that also passes through the epipole of that plane is referred to as an epipolar line. The epipolar line 3904 includes the point UD, and the epipolar line 3924 includes the point WD. Because the points O1, O2, E1, E2, WD, UD, and VD all lie in a common plane, as do the epipolar lines 3904 and 3924, it follows that, if one epipolar line is known, there is sufficient information to determine the location of the other epipolar line. So if the epipolar line 3904 is known, the epipolar line 3924 may be drawn.
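The mapping from a point in one image to the epipolar line in the other is conventionally expressed through the fundamental matrix. The following Python sketch assumes the relative pose (R, t) between the two camera poses is known, for example from the AACMM encoder readings, along with camera intrinsic matrices K1 and K2; the names are illustrative:

```python
import numpy as np

def skew(t):
    """3x3 cross-product (skew-symmetric) matrix of a 3-vector."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def epipolar_line(K1, K2, R, t, u1):
    """Map pixel u1 = (col, row) in the first image to its epipolar
    line a*x + b*y + c = 0 in the second image."""
    E = skew(t) @ R                                   # essential matrix
    F = np.linalg.inv(K2).T @ E @ np.linalg.inv(K1)   # fundamental matrix
    x1 = np.array([u1[0], u1[1], 1.0])                # homogeneous pixel
    return F @ x1                                     # line coefficients
```

Under this convention, the epipolar line 3924 containing WD is obtained by applying epipolar_line to the pixel coordinates of UD.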
A processor determines the point UD at which an edge point VD appears in the image captured by the photosensitive array of the camera in the first pose 3900. From the epipolar geometry, the processor further determines that the image of the same point VD must lie on the epipolar line 3924. However, many points on the epipolar lines satisfy this condition. For example, the points WA, WB, WC, WD correspond to the matching points VA, VB, VC, VD. Hence, there is insufficient information for the processor to determine a one-to-one correspondence between the points WD and UD based on epipolar analysis alone. Thus, if the edge 3942 is smoothly continuous over a portion of the edge that includes the point VD, then a one-to-one correspondence between the points WD and UD cannot be obtained from the epipolar analysis alone.
In an embodiment, a camera assembly is attached to an AACMM, such as AACMM 100 for example, which determines the pose of the camera assembly. The camera in the first pose 3900 has a perspective center O1 and an epipolar plane 3902. The photosensitive array of the camera captures an image 3906 over an area corresponding to the region 3905 of the epipolar plane 3902. The image 3906 of the object 3940 appears on the photosensitive array of the camera and correspondingly on the epipolar plane 3902. The camera in the second pose 3920 has a perspective center O2 and an epipolar plane 3922. The photosensitive array of the camera captures an image 3926 over an area corresponding to the region 3925 of the epipolar plane 3922. The image 3926 appears on the photosensitive array of the camera and correspondingly on the epipolar plane 3922. A point VD on the edge 3942 projects an image point UD on the epipolar plane 3902 and an image point WD on the epipolar plane 3922. Epipoles E1 and E2 are obtained as described herein above with respect to
The image 3906 is bounded by lines 3907 that pass from the perspective center O1 to tangential points on the object 3940. Likewise, the image 3926 is bounded by lines 3927 that pass from the perspective center O2 to tangential points on the object 3940. In general, the edges captured in the image 3906 and the edges captured in the image 3926 may not fully match. In other words, each view may capture some edge points not visible in the other view.
The image 3906 is obtained from the projection of light rays from the object 3940, but the same image could instead have been produced by light rays from other possible objects 4000. Likewise, the image 3926 is obtained from the projection of light rays from the object 3940 but could instead have been produced by light rays from other possible objects 4020. Each of the objects 4000, 4020 may be shifted in position (x, y, z) or in orientation (pitch angle, yaw angle, roll angle) and still produce the same images 3906, 3926, respectively.
An optimization procedure may be performed to adjust the six degrees of freedom of each of the possible objects 4000 and 4020 to place the objects at the same position and orientation, which is the position and orientation of the actual object 3940. The accuracy of the procedure is enhanced if certain of the features are known to be coplanar. For example, the 3D coordinates of the points on the edges 3942 may be accurately determined using this method because the edge points 3942 lie on a single plane. If it is further known that some of the points, for example the points 3944, lie on a separate plane, the optimization procedure may likewise be used to determine the 3D coordinates of those edge points 3944.
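A minimal sketch of such an optimization, assuming candidate edge points of a possible object and one observed camera ray per point, adjusts the six pose parameters by nonlinear least squares. The function below is hypothetical and uses SciPy for the solver and for rotation handling:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def refine_object_pose(params0, model_pts, ray_origins, ray_dirs):
    """Adjust six pose parameters (x, y, z, pitch, yaw, roll) so the
    transformed model edge points fall on their observed camera rays.

    model_pts: (N, 3) candidate edge points of a possible object.
    ray_origins, ray_dirs: (N, 3) observed rays (unit directions),
    one ray per model point, in the AACMM frame of reference.
    """
    def residuals(p):
        R = Rotation.from_euler("xyz", p[3:]).as_matrix()
        pts = model_pts @ R.T + p[:3]          # posed model points
        w = pts - ray_origins
        along = np.sum(w * ray_dirs, axis=1, keepdims=True) * ray_dirs
        # Perpendicular distance of each posed point from its ray.
        return np.linalg.norm(w - along, axis=1)
    return least_squares(residuals, params0).x  # refined six parameters
```

Applying the same refinement to both possible objects 4000 and 4020 drives them toward a common pose, that of the actual object 3940.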
For the general case in which edge features do not necessarily lie on a plane, the 3D coordinates of the edge features may be determined by further adding the camera assembly at a third pose 4100 as shown in
In the methods described herein to determine 3D coordinates of edge points using 2D cameras, a preliminary step is to identify those parts of the images that are edges. There are several methods that may be used to determine which image portions are edges and then to locate the edges on the captured images 3906, 3926, 4106. In an embodiment, a method of edge detection based on a partial area effect is used. This method is described in "Accurate subpixel edge location based on partial area effect" in Image and Vision Computing 31 (2013) 72-90 by Trujillo-Pino, et al., hereafter referred to as Trujillo-Pino [2013], the contents of which are incorporated herein by reference. In other embodiments, other techniques such as moment-based techniques, least-squared-error-based techniques, or interpolation techniques may be used. The effect of noise inherent in 2D images may be reduced by reconstructing smooth lines where appropriate. Examples of constructions that assume the presence of straight lines, circular curves, and polynomial curves are described in Trujillo-Pino [2013]. Such smoothing methods are examples of the noise-reducing filtering techniques used in embodiments described herein.
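As a simple illustration of subpixel edge location (a far simpler interpolation-based stand-in for the partial-area-effect method of Trujillo-Pino [2013], not a reproduction of it), the Python sketch below fits a parabola to the gradient-magnitude peak of a one-dimensional intensity profile taken across an edge:

```python
import numpy as np

def subpixel_edge_1d(intensity):
    """Locate an edge along a 1-D intensity profile to subpixel
    precision by parabolic interpolation of the gradient peak."""
    g = np.abs(np.gradient(intensity.astype(float)))
    i = int(np.argmax(g[1:-1])) + 1        # strongest interior gradient
    denom = g[i - 1] - 2.0 * g[i] + g[i + 1]
    # Vertex of the parabola through the three samples around the peak.
    offset = 0.5 * (g[i - 1] - g[i + 1]) / denom if denom != 0 else 0.0
    return i + offset                      # subpixel edge position
```

Running such a profile perpendicular to a detected edge at many positions yields the subpixel edge locations used in the correspondence steps that follow.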
When a third 2D image 4106 is captured with the AACMM assembly in the third pose 4100, it becomes possible to determine 3D coordinates of smoothly continuous edge points such as the point 4142 on the object 4140. A method of determining smoothly continuous edge points on a point-by-point basis is now described.
The epipoles of the three epipolar planes are the points of intersection of the epipolar planes with the lines connecting the perspective centers of the cameras in the three poses. The epipolar plane 3902 includes two epipoles. The epipole E12 is the point at which the epipolar plane 3902 is intersected by the line 3930 that connects the perspective centers O1 and O2. Likewise, the epipole E13 is the point at which the epipolar plane 3902 is intersected by the line 4251 that connects the perspective centers O1 and O3. The epipoles E21, E23 are the points at which the epipolar plane 3922 is intersected by the lines 3930, 4250, respectively. The epipoles E31, E32 are the points at which the epipolar plane 4102 is intersected by the lines 4251, 4250, respectively.
The epipolar lines are lines that pass through the epipoles and through a point of interest on the epipolar plane. Because the points E12, E21, and P lie on a common plane, the epipolar line 4220 drawn from E12 to the point P can be used to draw the corresponding epipolar line 4222 on the epipolar plane 3922. Any one epipolar line can be used to generate a corresponding epipolar line on an adjacent epipolar plane.
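This generation of corresponding epipolar lines is what resolves the two-view ambiguity described above: the epipolar line induced on the second epipolar plane by a point in the first image, and the epipolar line induced by a candidate match in the third image, intersect at a single point that must coincide with the imaged edge. A hypothetical sketch in homogeneous coordinates:

```python
import numpy as np

def intersect_epipolar_lines(line_from_first, line_from_third):
    """Intersect two epipolar lines on the second image plane.

    Lines are homogeneous triples (a, b, c) with a*x + b*y + c = 0.
    The cross product of two homogeneous lines is their intersection
    point; its third component is zero only for parallel lines.
    """
    p = np.cross(line_from_first, line_from_third)
    return p[:2] / p[2]        # inhomogeneous pixel coordinates
```

A candidate correspondence is accepted when this intersection falls on the imaged edge in the second image, to within the expected image noise.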
For embodiments in which the edge points are noisy or perhaps not clearly visible, there may be some discrepancy in the intersection of the epipolar lines, as determined from the geometry of the adjacent planes and as determined from the intersection of the epipolar lines with an edge point. If such a discrepancy is observed with respect to the edge point V, several actions may be taken in various embodiments. In one embodiment, the point V is dropped or removed from the collection of calculated 3D coordinates for the edge 4142. In another embodiment, edge filtering techniques, such as those described in Trujillo-Pino [2013], are applied to reduce noise, as explained herein above. In another embodiment, a 2D image is further obtained with the camera assembly on the AACMM in a fourth pose. In this case, if the 3D coordinates agree for three of the four poses, the outlier points may be dropped or removed from consideration. Additional poses beyond four poses may further be used to increase the accuracy of the determined 3D coordinates of edge points.
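A minimal sketch of this consistency test, assuming several candidate 3D estimates have been computed for one edge point from different pose combinations, keeps the point only when enough estimates agree within a tolerance; the threshold of three agreeing estimates mirrors the three-of-four-poses criterion above, and the function is hypothetical:

```python
import numpy as np

def filter_edge_point(estimates, tol, min_agree=3):
    """Accept an edge point only if enough 3D estimates agree.

    estimates: (M, 3) candidate 3D coordinates for one edge point,
    one per pose combination. Returns the mean of the consistent
    estimates, or None if the point should be dropped.
    """
    center = np.median(estimates, axis=0)     # robust central estimate
    errs = np.linalg.norm(estimates - center, axis=1)
    good = estimates[errs <= tol]
    return good.mean(axis=0) if len(good) >= min_agree else None
```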
In an embodiment, 2D images of an object may be obtained from a relatively large number of directions around the object, with edges having been captured or acquired by the AACMM camera assembly with relatively high quality images in at least three poses from each direction. In this case, the 3D coordinates of edge features can be reconstructed. In other embodiments, images are captured or acquired only for those edge features of interest.
In some embodiments an AACMM may include more than one camera assembly. An example is shown in
While the invention has been described with reference to example embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from the essential scope thereof. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed as the best mode contemplated for carrying out this invention, but that the invention will include all embodiments falling within the scope of the appended claims. Moreover, the use of the terms first, second, etc. do not denote any order or importance, but rather the terms first, second, etc. are used to distinguish one element from another. Furthermore, the use of the terms a, an, etc. do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced item.
The present application is a continuation of U.S. patent application Ser. No. 15/481,673 filed on Apr. 7, 2017, now U.S. Pat. No. 9,879,976, which is a continuation-in-part of U.S. patent application Ser. No. 14/679,580 filed on Apr. 6, 2015, now U.S. Pat. No. 9,628,775, which is a continuation-in-part of U.S. patent application Ser. No. 14/485,876 filed on Sep. 15, 2014, now U.S. Pat. No. 9,607,239, which is a continuation-in-part of U.S. patent application Ser. No. 13/491,176 filed Jun. 7, 2012, now U.S. Pat. No. 8,832,954, which is a continuation-in-part of U.S. patent application Ser. No. 13/006,507 filed Jan. 14, 2011, now U.S. Pat. No. 8,533,967, and claims the benefit of provisional application No. 61/296,555 filed Jan. 20, 2010, provisional application No. 61/355,279 filed Jun. 16, 2010, and provisional application No. 61/351,347 filed on Jun. 4, 2010. The contents of all of the above referenced patent applications and patents are hereby incorporated by reference herein.
Number | Name | Date | Kind |
---|---|---|---|
1535312 | Hosking | Apr 1925 | A |
1538758 | Taylor | May 1925 | A |
1918813 | Kinzy | Jul 1933 | A |
2316573 | Egy | Apr 1943 | A |
2333243 | Glab | Nov 1943 | A |
2702683 | Green et al. | Feb 1955 | A |
2748926 | Leahy | Jun 1956 | A |
2983367 | Paramater et al. | Jun 1958 | A |
2924495 | Haines | Sep 1958 | A |
2966257 | Littlejohn | Dec 1960 | A |
3066790 | Armbruster | Dec 1962 | A |
3447852 | Barlow | Jun 1969 | A |
3458167 | Cooley, Jr. | Jul 1969 | A |
3830567 | Riegl | Aug 1974 | A |
3899145 | Stephenson | Aug 1975 | A |
3945729 | Rosen | Mar 1976 | A |
4138045 | Baker | Feb 1979 | A |
4178515 | Tarasevich | Dec 1979 | A |
4340008 | Mendelson | Jul 1982 | A |
4379461 | Nilsson et al. | Apr 1983 | A |
4424899 | Rosenberg | Jan 1984 | A |
4430796 | Nakagawa | Feb 1984 | A |
4457625 | Greenleaf et al. | Jul 1984 | A |
4506448 | Topping et al. | Mar 1985 | A |
4537233 | Vroonland et al. | Aug 1985 | A |
4561776 | Pryor | Dec 1985 | A |
4606696 | Slocum | Aug 1986 | A |
4659280 | Akeel | Apr 1987 | A |
4663852 | Guarini | May 1987 | A |
4664588 | Newell et al. | May 1987 | A |
4667231 | Pryor | May 1987 | A |
4676002 | Slocum | Jun 1987 | A |
4714339 | Lau et al. | Dec 1987 | A |
4733961 | Mooney | Mar 1988 | A |
4736218 | Kutman | Apr 1988 | A |
4751950 | Bock | Jun 1988 | A |
4767257 | Kato | Aug 1988 | A |
4790651 | Brown et al. | Dec 1988 | A |
4816822 | Vache et al. | Mar 1989 | A |
4870274 | Hebert et al. | Sep 1989 | A |
4882806 | Davis | Nov 1989 | A |
4891509 | Jones et al. | Jan 1990 | A |
4954952 | Ubhayakar et al. | Sep 1990 | A |
4982841 | Goedecke | Jan 1991 | A |
4984881 | Osada et al. | Jan 1991 | A |
4996909 | Vache et al. | Mar 1991 | A |
4999491 | Semler et al. | Mar 1991 | A |
5021641 | Swartz et al. | Jun 1991 | A |
5025966 | Potter | Jun 1991 | A |
5027951 | Johnson | Jul 1991 | A |
5068971 | Simon | Dec 1991 | A |
5069524 | Watanabe et al. | Dec 1991 | A |
5155684 | Burke et al. | Oct 1992 | A |
5168532 | Seppi et al. | Dec 1992 | A |
5189797 | Granger | Mar 1993 | A |
5205111 | Johnson | Apr 1993 | A |
5211476 | Coudroy | May 1993 | A |
5212738 | Chande et al. | May 1993 | A |
5213240 | Dietz et al. | May 1993 | A |
5216479 | Dotan et al. | Jun 1993 | A |
5218427 | Koch | Jun 1993 | A |
5219423 | Kamaya | Jun 1993 | A |
5239855 | Schleifer et al. | Aug 1993 | A |
5251156 | Heier et al. | Oct 1993 | A |
5289264 | Steinbichler | Feb 1994 | A |
5289265 | Inoue et al. | Feb 1994 | A |
5289855 | Baker et al. | Mar 1994 | A |
5313261 | Leatham et al. | May 1994 | A |
5319445 | Fitts | Jun 1994 | A |
5329347 | Wallace et al. | Jul 1994 | A |
5329467 | Nagamune et al. | Jul 1994 | A |
5332315 | Baker et al. | Jul 1994 | A |
5337149 | Kozah et al. | Aug 1994 | A |
5371347 | Plesko | Dec 1994 | A |
5372250 | Johnson | Dec 1994 | A |
5373346 | Hocker | Dec 1994 | A |
5402365 | Kozikaro et al. | Mar 1995 | A |
5402582 | Raab | Apr 1995 | A |
5412880 | Raab | May 1995 | A |
5416505 | Eguchi et al. | May 1995 | A |
5430384 | Hocker | Jul 1995 | A |
5446846 | Lennartsson | Aug 1995 | A |
5455670 | Payne et al. | Oct 1995 | A |
5455993 | Link et al. | Oct 1995 | A |
5510977 | Raab | Apr 1996 | A |
5517297 | Stenton | May 1996 | A |
5528354 | Uwira | Jun 1996 | A |
5528505 | Granger et al. | Jun 1996 | A |
5535524 | Carrier et al. | Jul 1996 | A |
5563655 | Lathrop | Oct 1996 | A |
5577130 | Wu | Nov 1996 | A |
5611147 | Raab | Mar 1997 | A |
5615489 | Breyer et al. | Apr 1997 | A |
5623416 | Hocker, III | Apr 1997 | A |
5629756 | Kitajima | May 1997 | A |
5668631 | Norita et al. | Sep 1997 | A |
5675326 | Juds et al. | Oct 1997 | A |
5677760 | Mikami et al. | Oct 1997 | A |
5682508 | Hocker, III | Oct 1997 | A |
5716036 | Isobe et al. | Feb 1998 | A |
5724264 | Rosenberg et al. | Mar 1998 | A |
5734417 | Yamamoto et al. | Mar 1998 | A |
5745225 | Watanabe et al. | Apr 1998 | A |
5752112 | Paddock et al. | May 1998 | A |
5754449 | Hoshal et al. | May 1998 | A |
5768792 | Raab | Jun 1998 | A |
5793993 | Broedner et al. | Aug 1998 | A |
5804805 | Koenck et al. | Sep 1998 | A |
5805289 | Corby, Jr. et al. | Sep 1998 | A |
5825666 | Freifeld | Oct 1998 | A |
5829148 | Eaton | Nov 1998 | A |
5831719 | Berg et al. | Nov 1998 | A |
5832416 | Anderson | Nov 1998 | A |
5844591 | Takamatsu et al. | Dec 1998 | A |
5856874 | Tachibana et al. | Jan 1999 | A |
5887122 | Terawaki et al. | Mar 1999 | A |
5894123 | Ohtomo et al. | Apr 1999 | A |
5898484 | Harris | Apr 1999 | A |
5898490 | Ohtomo et al. | Apr 1999 | A |
5909939 | Fugmann | Jun 1999 | A |
5926782 | Raab | Jul 1999 | A |
5933267 | Ishizuka | Aug 1999 | A |
5936721 | Ohtomo et al. | Aug 1999 | A |
5940170 | Berg et al. | Aug 1999 | A |
5940181 | Tsubono et al. | Aug 1999 | A |
5949530 | Wetteborn | Sep 1999 | A |
5956661 | Lefebvre et al. | Sep 1999 | A |
5956857 | Raab | Sep 1999 | A |
5969321 | Danielson et al. | Oct 1999 | A |
5973788 | Pettersen et al. | Oct 1999 | A |
5978748 | Raab | Nov 1999 | A |
5983936 | Schwieterman et al. | Nov 1999 | A |
5988862 | Kacyra et al. | Nov 1999 | A |
5991011 | Damm | Nov 1999 | A |
5996790 | Yamada et al. | Dec 1999 | A |
5997779 | Potter | Dec 1999 | A |
6040898 | Mrosik et al. | Mar 2000 | A |
D423534 | Raab et al. | Apr 2000 | S |
6050615 | Weinhold | Apr 2000 | A |
6057915 | Squire et al. | May 2000 | A |
6060889 | Hocker | May 2000 | A |
6067116 | Yamano et al. | May 2000 | A |
6069700 | Rudnick et al. | May 2000 | A |
6077306 | Metzger et al. | Jun 2000 | A |
6112423 | Sheehan | Sep 2000 | A |
6115511 | Sakai et al. | Sep 2000 | A |
6125337 | Rosenberg et al. | Sep 2000 | A |
6131299 | Raab et al. | Oct 2000 | A |
6134507 | Markey, Jr. et al. | Oct 2000 | A |
6138915 | Danielson et al. | Oct 2000 | A |
6149112 | Thieltges | Nov 2000 | A |
6151789 | Raab et al. | Nov 2000 | A |
6163294 | Talbot | Dec 2000 | A |
6166504 | Iida et al. | Dec 2000 | A |
6166809 | Pettersen et al. | Dec 2000 | A |
6166811 | Long et al. | Dec 2000 | A |
6204651 | Marcus et al. | Mar 2001 | B1 |
6204961 | Anderson et al. | Mar 2001 | B1 |
6219928 | Raab et al. | Apr 2001 | B1 |
D441632 | Raab et al. | May 2001 | S |
6240651 | Schroeder et al. | Jun 2001 | B1 |
6253458 | Raab et al. | Jul 2001 | B1 |
6282195 | Miller et al. | Aug 2001 | B1 |
6285390 | Blake | Sep 2001 | B1 |
6298569 | Raab et al. | Oct 2001 | B1 |
6339410 | Milner et al. | Jan 2002 | B1 |
6349249 | Cunningham | Feb 2002 | B1 |
6366831 | Raab | Apr 2002 | B1 |
6408252 | De Smet | Jun 2002 | B1 |
6418774 | Brogaardh et al. | Jul 2002 | B1 |
6438507 | Imai | Aug 2002 | B1 |
6438856 | Kaczynski | Aug 2002 | B1 |
6442419 | Chu et al. | Aug 2002 | B1 |
6445446 | Kumagai et al. | Sep 2002 | B1 |
6460004 | Greer et al. | Oct 2002 | B2 |
6470584 | Stoodley | Oct 2002 | B1 |
6477784 | Schroeder et al. | Nov 2002 | B2 |
6480270 | Studnicka et al. | Nov 2002 | B1 |
6483106 | Ohtomo et al. | Nov 2002 | B1 |
6497394 | Dunchock | Dec 2002 | B1 |
6504602 | Hinderling | Jan 2003 | B1 |
6512575 | Marchi | Jan 2003 | B1 |
6519860 | Bieg et al. | Feb 2003 | B1 |
D472824 | Raab et al. | Apr 2003 | S |
6542249 | Kofman et al. | Apr 2003 | B1 |
6547397 | Kaufman et al. | Apr 2003 | B1 |
6598306 | Eaton | Jul 2003 | B2 |
6611346 | Granger | Aug 2003 | B2 |
6611617 | Crampton | Aug 2003 | B1 |
D479544 | Raab et al. | Sep 2003 | S |
6612044 | Raab et al. | Sep 2003 | B2 |
6621065 | Fukumoto et al. | Sep 2003 | B1 |
6626339 | Gates et al. | Sep 2003 | B2 |
6633051 | Holloway et al. | Oct 2003 | B1 |
6649208 | Rodgers | Nov 2003 | B2 |
6650402 | Sullivan et al. | Nov 2003 | B2 |
6668466 | Bieg et al. | Dec 2003 | B1 |
6675122 | Markendorf et al. | Jan 2004 | B1 |
6681495 | Masayuki et al. | Jan 2004 | B2 |
6710859 | Shirai et al. | Mar 2004 | B2 |
D490831 | Raab et al. | Jun 2004 | S |
D491210 | Raab et al. | Jun 2004 | S |
6750873 | Bernardini et al. | Jun 2004 | B1 |
6753876 | Brooksby et al. | Jun 2004 | B2 |
6759649 | Hipp | Jul 2004 | B2 |
6759979 | Vashisth et al. | Jul 2004 | B2 |
6764185 | Beardsley et al. | Jul 2004 | B1 |
6789327 | Roth et al. | Sep 2004 | B2 |
6820346 | Raab et al. | Nov 2004 | B2 |
6822749 | Christoph | Nov 2004 | B1 |
6825923 | Hamar et al. | Nov 2004 | B2 |
6826664 | Hocker, III et al. | Nov 2004 | B2 |
6847436 | Bridges | Jan 2005 | B2 |
6856381 | Christoph | Feb 2005 | B2 |
6858836 | Hartrumpf | Feb 2005 | B1 |
6859269 | Ohtomo et al. | Feb 2005 | B2 |
6862097 | Yanagisawa et al. | Mar 2005 | B2 |
6868359 | Raab | Mar 2005 | B2 |
6879933 | Steffey et al. | Apr 2005 | B2 |
6889903 | Koenck | May 2005 | B1 |
6892465 | Raab et al. | May 2005 | B2 |
6894767 | Ishinabe et al. | May 2005 | B2 |
6895347 | Dorny et al. | May 2005 | B2 |
6901673 | Cobb et al. | Jun 2005 | B1 |
6904691 | Raab et al. | Jun 2005 | B2 |
6914678 | Ulrichsen et al. | Jul 2005 | B1 |
6917415 | Gogolla et al. | Jul 2005 | B2 |
6920697 | Raab et al. | Jul 2005 | B2 |
6922234 | Hoffman et al. | Jul 2005 | B2 |
6925722 | Raab et al. | Aug 2005 | B2 |
6931745 | Granger | Aug 2005 | B2 |
6935036 | Raab et al. | Aug 2005 | B2 |
6935748 | Kaufman et al. | Aug 2005 | B2 |
6948255 | Russell | Sep 2005 | B2 |
6957496 | Raab et al. | Oct 2005 | B2 |
6965843 | Raab et al. | Nov 2005 | B2 |
6973734 | Raab et al. | Dec 2005 | B2 |
6988322 | Raab et al. | Jan 2006 | B2 |
6989890 | Riegl et al. | Jan 2006 | B2 |
7003892 | Eaton et al. | Feb 2006 | B2 |
7006084 | Buss et al. | Feb 2006 | B1 |
7024032 | Kidd et al. | Apr 2006 | B2 |
7029126 | Tang | Apr 2006 | B2 |
7032321 | Raab et al. | Apr 2006 | B2 |
7040136 | Forss et al. | May 2006 | B2 |
7051447 | Kikuchi et al. | May 2006 | B2 |
7069124 | Whittaker et al. | Jun 2006 | B1 |
7076420 | Snyder et al. | Jul 2006 | B1 |
7106421 | Matsuura et al. | Sep 2006 | B2 |
7117107 | Dorny et al. | Oct 2006 | B2 |
7120092 | Del Prado Pavon et al. | Oct 2006 | B2 |
7127822 | Kumagai et al. | Oct 2006 | B2 |
7136153 | Mori et al. | Nov 2006 | B2 |
7140213 | Feucht et al. | Nov 2006 | B2 |
7142289 | Ando et al. | Nov 2006 | B2 |
7145926 | Vitruk et al. | Dec 2006 | B2 |
7152456 | Eaton | Dec 2006 | B2 |
7174651 | Raab et al. | Feb 2007 | B2 |
7180072 | Persi et al. | Feb 2007 | B2 |
7184047 | Crampton | Feb 2007 | B1 |
7190465 | Froehlich et al. | Mar 2007 | B2 |
7191541 | Weekers et al. | Mar 2007 | B1 |
7193690 | Ossig et al. | Mar 2007 | B2 |
7196509 | Teng | Mar 2007 | B2 |
7199872 | Van Cranenbroeck | Apr 2007 | B2 |
7200246 | Cofer et al. | Apr 2007 | B2 |
7202941 | Munro | Apr 2007 | B2 |
7230689 | Lau | Jun 2007 | B2 |
7242590 | Yeap et al. | Jul 2007 | B1 |
7246030 | Raab et al. | Jul 2007 | B2 |
7249421 | MacManus et al. | Jul 2007 | B2 |
7254262 | Nehse et al. | Aug 2007 | B2 |
7256899 | Faul et al. | Aug 2007 | B1 |
7269910 | Raab et al. | Sep 2007 | B2 |
D551943 | Hodjat et al. | Oct 2007 | S |
7285793 | Husted | Oct 2007 | B2 |
7296364 | Seitz et al. | Nov 2007 | B2 |
7296955 | Dreier | Nov 2007 | B2 |
7296979 | Raab et al. | Nov 2007 | B2 |
7306339 | Kaufman et al. | Dec 2007 | B2 |
7307701 | Hoffman, II | Dec 2007 | B2 |
7312862 | Zumbrunn et al. | Dec 2007 | B2 |
7313264 | Crampton | Dec 2007 | B2 |
D559657 | Wohlford et al. | Jan 2008 | S |
7319512 | Ohtomo et al. | Jan 2008 | B2 |
7330242 | Reichert et al. | Feb 2008 | B2 |
7337344 | Barman et al. | Feb 2008 | B2 |
7342650 | Kern et al. | Mar 2008 | B2 |
7348822 | Baer | Mar 2008 | B2 |
7352446 | Bridges et al. | Apr 2008 | B2 |
7360648 | Blaschke | Apr 2008 | B1 |
7372558 | Kaufman et al. | May 2008 | B2 |
7372581 | Raab et al. | May 2008 | B2 |
7383638 | Granger | Jun 2008 | B2 |
7388654 | Raab et al. | Jun 2008 | B2 |
7389870 | Slappay | Jun 2008 | B2 |
7395606 | Crampton | Jul 2008 | B2 |
7400384 | Evans et al. | Jul 2008 | B1 |
7403268 | England et al. | Jul 2008 | B2 |
7403269 | Yamashita et al. | Jul 2008 | B2 |
7430068 | Becker et al. | Sep 2008 | B2 |
7430070 | Soreide et al. | Sep 2008 | B2 |
7441341 | Eaton | Oct 2008 | B2 |
7443555 | Blug et al. | Oct 2008 | B2 |
7447931 | Rischar et al. | Nov 2008 | B1 |
7449876 | Pleasant et al. | Nov 2008 | B2 |
7454265 | Marsh | Nov 2008 | B2 |
7463368 | Morden et al. | Dec 2008 | B2 |
7477359 | England et al. | Jan 2009 | B2 |
7477360 | England et al. | Jan 2009 | B2 |
7480037 | Palmateer et al. | Jan 2009 | B2 |
2452033 | Born | Feb 2009 | A1 |
7508496 | Mettenleiter et al. | Mar 2009 | B2 |
7508971 | Vaccaro et al. | Mar 2009 | B2 |
7515256 | Ohtomo et al. | Apr 2009 | B2 |
7525276 | Eaton | Apr 2009 | B2 |
7527205 | Zhu et al. | May 2009 | B2 |
7528768 | Wakayama et al. | May 2009 | B2 |
7541830 | Fahrbach et al. | Jun 2009 | B2 |
7545517 | Rueb et al. | Jun 2009 | B2 |
7546689 | Ferrari et al. | Jun 2009 | B2 |
7551771 | England, III | Jun 2009 | B2 |
7552644 | Haase et al. | Jun 2009 | B2 |
7557824 | Holliman | Jul 2009 | B2 |
7561598 | Stratton et al. | Jul 2009 | B2 |
7564250 | Hocker | Jul 2009 | B2 |
7568293 | Ferrari | Aug 2009 | B2 |
7578069 | Eaton | Aug 2009 | B2 |
D599226 | Gerent et al. | Sep 2009 | S |
7589595 | Cutler | Sep 2009 | B2 |
7589825 | Orchard et al. | Sep 2009 | B2 |
7591077 | Pettersson | Sep 2009 | B2 |
7591078 | Crampton | Sep 2009 | B2 |
7599106 | Matsumoto et al. | Oct 2009 | B2 |
7600061 | Honda | Oct 2009 | B2 |
7602873 | Eidson | Oct 2009 | B2 |
7604207 | Hasloecher et al. | Oct 2009 | B2 |
7610175 | Eidson | Oct 2009 | B2 |
7614157 | Granger | Nov 2009 | B2 |
7624510 | Ferrari | Dec 2009 | B2 |
7625335 | Deichmann et al. | Dec 2009 | B2 |
7626690 | Kumagai et al. | Dec 2009 | B2 |
D607350 | Cooduvalli et al. | Jan 2010 | S |
7656751 | Rischar et al. | Feb 2010 | B2 |
7659995 | Knighton et al. | Feb 2010 | B2 |
D610926 | Gerent et al. | Mar 2010 | S |
7693325 | Pulla et al. | Apr 2010 | B2 |
7697748 | Dimsdale et al. | Apr 2010 | B2 |
7701592 | Saint Clair et al. | Apr 2010 | B2 |
7712224 | Hicks | May 2010 | B2 |
7721396 | Fleischman | May 2010 | B2 |
7728833 | Verma et al. | Jun 2010 | B2 |
7728963 | Kirschner | Jun 2010 | B2 |
7733544 | Becker et al. | Jun 2010 | B2 |
7735234 | Briggs et al. | Jun 2010 | B2 |
7742634 | Fujieda et al. | Jun 2010 | B2 |
7743524 | Eaton et al. | Jun 2010 | B2 |
7752003 | MacManus | Jul 2010 | B2 |
7756615 | Barfoot et al. | Jul 2010 | B2 |
7765707 | Tomelleri | Aug 2010 | B2 |
7769559 | Reichert | Aug 2010 | B2 |
7774949 | Ferrari | Aug 2010 | B2 |
7777761 | England et al. | Aug 2010 | B2 |
7779548 | Ferrari | Aug 2010 | B2 |
7779553 | Jordil et al. | Aug 2010 | B2 |
7784194 | Raab et al. | Aug 2010 | B2 |
7787670 | Urushiya | Aug 2010 | B2 |
7793425 | Bailey | Sep 2010 | B2 |
7798453 | Maningo et al. | Sep 2010 | B2 |
7800758 | Bridges et al. | Sep 2010 | B1 |
7804602 | Raab | Sep 2010 | B2 |
7805851 | Pettersson | Oct 2010 | B2 |
7805854 | Eaton | Oct 2010 | B2 |
7809518 | Zhu et al. | Oct 2010 | B2 |
7834985 | Morcom | Nov 2010 | B2 |
7847922 | Gittinger et al. | Dec 2010 | B2 |
RE42055 | Raab | Jan 2011 | E |
7869005 | Ossig et al. | Jan 2011 | B2 |
7881896 | Atwell et al. | Feb 2011 | B2 |
7889324 | Yamamoto | Feb 2011 | B2 |
7891248 | Hough et al. | Feb 2011 | B2 |
7900714 | Milbourne et al. | Mar 2011 | B2 |
7903245 | Miousset et al. | Mar 2011 | B2 |
7903261 | Saint Clair et al. | Mar 2011 | B2 |
7908757 | Ferrari | Mar 2011 | B2 |
7933055 | Jensen et al. | Apr 2011 | B2 |
7935928 | Serger et al. | May 2011 | B2 |
7965747 | Kumano | Jun 2011 | B2 |
7982866 | Vogel | Jul 2011 | B2 |
D643319 | Ferrari et al. | Aug 2011 | S |
7990397 | Bukowski et al. | Aug 2011 | B2 |
7994465 | Bamji et al. | Aug 2011 | B1 |
7995834 | Knighton et al. | Aug 2011 | B1 |
8001697 | Danielson et al. | Aug 2011 | B2 |
8020657 | Allard et al. | Sep 2011 | B2 |
8022812 | Beniyama et al. | Sep 2011 | B2 |
8028432 | Bailey et al. | Oct 2011 | B2 |
8036775 | Matsumoto et al. | Oct 2011 | B2 |
8045762 | Otani et al. | Oct 2011 | B2 |
8051710 | Van Dam et al. | Nov 2011 | B2 |
8052857 | Townsend | Nov 2011 | B2 |
8064046 | Ossig et al. | Nov 2011 | B2 |
8065861 | Caputo | Nov 2011 | B2 |
8082673 | Desforges et al. | Dec 2011 | B2 |
8099877 | Champ | Jan 2012 | B2 |
8117668 | Crampton et al. | Feb 2012 | B2 |
8123350 | Cannell et al. | Feb 2012 | B2 |
8152071 | Doherty et al. | Apr 2012 | B2 |
D659035 | Ferrari et al. | May 2012 | S |
8171650 | York et al. | May 2012 | B2 |
8179936 | Bueche et al. | May 2012 | B2 |
D662427 | Bailey et al. | Jun 2012 | S |
8218131 | Otani et al. | Jul 2012 | B2 |
8224032 | Fuchs et al. | Jul 2012 | B2 |
8260483 | Barfoot et al. | Sep 2012 | B2 |
8269984 | Hinderling et al. | Sep 2012 | B2 |
8276286 | Bailey et al. | Oct 2012 | B2 |
8284407 | Briggs et al. | Oct 2012 | B2 |
8310653 | Ogawa et al. | Nov 2012 | B2 |
8321612 | Hartwich et al. | Nov 2012 | B2 |
8346392 | Walser et al. | Jan 2013 | B2 |
8346480 | Trepagnier et al. | Jan 2013 | B2 |
8352212 | Fetter et al. | Jan 2013 | B2 |
8353059 | Crampton et al. | Jan 2013 | B2 |
D676341 | Bailey et al. | Feb 2013 | S |
8379191 | Braunecker et al. | Feb 2013 | B2 |
8381704 | Debelak et al. | Feb 2013 | B2 |
8384914 | Becker et al. | Feb 2013 | B2 |
D678085 | Bailey et al. | Mar 2013 | S |
8391565 | Purcell et al. | Mar 2013 | B2 |
8402669 | Ferrari et al. | Mar 2013 | B2 |
8422035 | Hinderling et al. | Apr 2013 | B2 |
8497901 | Pettersson | Jul 2013 | B2 |
8533967 | Bailey et al. | Sep 2013 | B2 |
8537374 | Briggs et al. | Sep 2013 | B2 |
8619265 | Steffey et al. | Dec 2013 | B2 |
8645022 | Yoshimura et al. | Feb 2014 | B2 |
8659748 | Dakin et al. | Feb 2014 | B2 |
8659752 | Cramer et al. | Feb 2014 | B2 |
8661700 | Briggs et al. | Mar 2014 | B2 |
8677643 | Bridges et al. | Mar 2014 | B2 |
8683709 | York | Apr 2014 | B2 |
8699007 | Becker et al. | Apr 2014 | B2 |
8705012 | Greiner et al. | Apr 2014 | B2 |
8705016 | Schumann et al. | Apr 2014 | B2 |
8718837 | Wang et al. | May 2014 | B2 |
8784425 | Ritchey et al. | Jul 2014 | B2 |
8797552 | Suzuki et al. | Aug 2014 | B2 |
8830485 | Woloschyn | Sep 2014 | B2 |
8832954 | Atwell et al. | Sep 2014 | B2 |
9163922 | Bridges et al. | Oct 2015 | B2 |
9228816 | Grau | Jan 2016 | B2 |
9607239 | Bridges et al. | Mar 2017 | B2 |
9628775 | Bridges et al. | Apr 2017 | B2 |
9879976 | Bridges | Jan 2018 | B2 |
20010004269 | Shibata et al. | Jun 2001 | A1 |
20020032541 | Raab et al. | Mar 2002 | A1 |
20020059042 | Kacyra et al. | May 2002 | A1 |
20020087233 | Raab | Jul 2002 | A1 |
20020128790 | Woodmansee | Sep 2002 | A1 |
20020143506 | D'Aligny et al. | Oct 2002 | A1 |
20020149694 | Seo | Oct 2002 | A1 |
20020170192 | Steffey et al. | Nov 2002 | A1 |
20020176097 | Rodgers | Nov 2002 | A1 |
20030002055 | Kilthau et al. | Jan 2003 | A1 |
20030033104 | Gooche | Feb 2003 | A1 |
20030043386 | Froehlich et al. | Mar 2003 | A1 |
20030053037 | Blaesing-Bangert et al. | Mar 2003 | A1 |
20030066954 | Hipp | Apr 2003 | A1 |
20030090646 | Riegl et al. | May 2003 | A1 |
20030125901 | Steffey et al. | Jul 2003 | A1 |
20030137449 | Vashisth et al. | Jul 2003 | A1 |
20030142631 | Silvester | Jul 2003 | A1 |
20030167647 | Raab et al. | Sep 2003 | A1 |
20030172536 | Raab et al. | Sep 2003 | A1 |
20030172537 | Raab et al. | Sep 2003 | A1 |
20030179361 | Ohtomo et al. | Sep 2003 | A1 |
20030208919 | Raab et al. | Nov 2003 | A1 |
20030221326 | Raab et al. | Dec 2003 | A1 |
20040004727 | Yanagisawa et al. | Jan 2004 | A1 |
20040022416 | Lemelson et al. | Feb 2004 | A1 |
20040027554 | Ishinabe et al. | Feb 2004 | A1 |
20040040166 | Raab et al. | Mar 2004 | A1 |
20040103547 | Raab et al. | Jun 2004 | A1 |
20040111908 | Raab et al. | Jun 2004 | A1 |
20040119020 | Bodkin | Jun 2004 | A1 |
20040135990 | Ohtomo et al. | Jul 2004 | A1 |
20040139265 | Hocker, III et al. | Jul 2004 | A1 |
20040158355 | Holmqvist et al. | Aug 2004 | A1 |
20040162700 | Rosenberg et al. | Aug 2004 | A1 |
20040179570 | Vitruk et al. | Sep 2004 | A1 |
20040221790 | Sinclair et al. | Nov 2004 | A1 |
20040246462 | Kaneko et al. | Dec 2004 | A1 |
20040246589 | Kim et al. | Dec 2004 | A1 |
20040259533 | Nixon et al. | Dec 2004 | A1 |
20050016008 | Raab et al. | Jan 2005 | A1 |
20050024625 | Mori et al. | Feb 2005 | A1 |
20050028393 | Raab et al. | Feb 2005 | A1 |
20050046823 | Ando et al. | Mar 2005 | A1 |
20050058332 | Kaufman et al. | Mar 2005 | A1 |
20050082262 | Rueb et al. | Apr 2005 | A1 |
20050085940 | Griggs et al. | Apr 2005 | A1 |
20050115092 | Griggs et al. | Apr 2005 | A1 |
20050111514 | Matsumoto et al. | May 2005 | A1 |
20050141052 | Becker et al. | Jun 2005 | A1 |
20050144799 | Raab et al. | Jul 2005 | A1 |
20050150123 | Eaton | Jul 2005 | A1 |
20050151963 | Pulla et al. | Jul 2005 | A1 |
20050166413 | Crampton | Aug 2005 | A1 |
20050172503 | Kumagai et al. | Aug 2005 | A1 |
20050188557 | Raab et al. | Sep 2005 | A1 |
20050190384 | Persi et al. | Sep 2005 | A1 |
20050214716 | Weber et al. | Sep 2005 | A1 |
20050259271 | Christoph | Nov 2005 | A1 |
20050276466 | Vaccaro et al. | Dec 2005 | A1 |
20050283989 | Pettersson | Dec 2005 | A1 |
20060016086 | Raab et al. | Jan 2006 | A1 |
20060017720 | Li | Jan 2006 | A1 |
20060026851 | Raab et al. | Feb 2006 | A1 |
20060028203 | Kawashima et al. | Feb 2006 | A1 |
20060053647 | Raab et al. | Mar 2006 | A1 |
20060056459 | Stratton et al. | Mar 2006 | A1 |
20060056559 | Pleasant et al. | Mar 2006 | A1 |
20060059270 | Pleasant et al. | Mar 2006 | A1 |
20060061566 | Verma et al. | Mar 2006 | A1 |
20060066836 | Bridges et al. | Mar 2006 | A1 |
20060088044 | Hammerl et al. | Apr 2006 | A1 |
20060096108 | Raab et al. | May 2006 | A1 |
20060103853 | Palmateer | May 2006 | A1 |
20060109536 | Mettenleiter et al. | May 2006 | A1 |
20060123649 | Muller | Jun 2006 | A1 |
20060129349 | Raab et al. | Jun 2006 | A1 |
20060132803 | Clair et al. | Jun 2006 | A1 |
20060145703 | Steinbichler et al. | Jul 2006 | A1 |
20060169050 | Kobayashi et al. | Aug 2006 | A1 |
20060169608 | Carnevali et al. | Aug 2006 | A1 |
20060170870 | Kaufman et al. | Aug 2006 | A1 |
20060182314 | England et al. | Aug 2006 | A1 |
20060186301 | Dozier et al. | Aug 2006 | A1 |
20060193521 | England, III et al. | Aug 2006 | A1 |
20060241791 | Pokorny et al. | Oct 2006 | A1 |
20060244746 | England et al. | Nov 2006 | A1 |
20060245717 | Ossig et al. | Nov 2006 | A1 |
20060279246 | Hashimoto et al. | Dec 2006 | A1 |
20060282574 | Zotov et al. | Dec 2006 | A1 |
20060287769 | Yanagita et al. | Dec 2006 | A1 |
20060291970 | Granger | Dec 2006 | A1 |
20070019212 | Gatsios et al. | Jan 2007 | A1 |
20070030841 | Lee et al. | Feb 2007 | A1 |
20070043526 | De Jonge et al. | Feb 2007 | A1 |
20070050774 | Eldson et al. | Mar 2007 | A1 |
20070055806 | Stratton et al. | Mar 2007 | A1 |
20070058154 | Reichert et al. | Mar 2007 | A1 |
20070058162 | Granger | Mar 2007 | A1 |
20070064976 | England, III | Mar 2007 | A1 |
20070097382 | Granger | May 2007 | A1 |
20070100498 | Matsumoto et al. | May 2007 | A1 |
20070105238 | Mandl et al. | May 2007 | A1 |
20070118269 | Gibson et al. | May 2007 | A1 |
20070122250 | Mullner | May 2007 | A1 |
20070142970 | Burbank et al. | Jun 2007 | A1 |
20070147265 | Eidson et al. | Jun 2007 | A1 |
20070147435 | Hamilton et al. | Jun 2007 | A1 |
20070147562 | Eidson | Jun 2007 | A1 |
20070150111 | Wu et al. | Jun 2007 | A1 |
20070151390 | Blumenkranz et al. | Jul 2007 | A1 |
20070153297 | Lau | Jul 2007 | A1 |
20070163134 | Eaton | Jul 2007 | A1 |
20070163136 | Eaton et al. | Jul 2007 | A1 |
20070171394 | Steiner et al. | Jul 2007 | A1 |
20070176648 | Baer | Aug 2007 | A1 |
20070177016 | Wu | Aug 2007 | A1 |
20070181685 | Zhu et al. | Aug 2007 | A1 |
20070183459 | Eidson | Aug 2007 | A1 |
20070185682 | Eidson | Aug 2007 | A1 |
20070217169 | Yeap et al. | Sep 2007 | A1 |
20070217170 | Yeap et al. | Sep 2007 | A1 |
20070221522 | Yamada et al. | Sep 2007 | A1 |
20070223477 | Eidson | Sep 2007 | A1 |
20070229801 | Tearney et al. | Oct 2007 | A1 |
20070229929 | Soreide et al. | Oct 2007 | A1 |
20070247615 | Bridges et al. | Oct 2007 | A1 |
20070248122 | Hamilton | Oct 2007 | A1 |
20070256311 | Ferrari | Nov 2007 | A1 |
20070257660 | Pleasant et al. | Nov 2007 | A1 |
20070258378 | Hamilton | Nov 2007 | A1 |
20070282564 | Sprague et al. | Dec 2007 | A1 |
20070294045 | Atwell et al. | Dec 2007 | A1 |
20080046221 | Stathis | Feb 2008 | A1 |
20080052808 | Leick et al. | Mar 2008 | A1 |
20080052936 | Briggs et al. | Mar 2008 | A1 |
20080066583 | Lott et al. | Mar 2008 | A1 |
20080068103 | Cutler | Mar 2008 | A1 |
20080075325 | Otani et al. | Mar 2008 | A1 |
20080075326 | Otani et al. | Mar 2008 | A1 |
20080080562 | Burch et al. | Apr 2008 | A1 |
20080096108 | Sumiyama et al. | Apr 2008 | A1 |
20080098272 | Fairbanks et al. | Apr 2008 | A1 |
20080148585 | Raab et al. | Jun 2008 | A1 |
20080154538 | Stathis | Jun 2008 | A1 |
20080183065 | Goldbach | Jul 2008 | A1 |
20080196260 | Pettersson | Aug 2008 | A1 |
20080204699 | Benz et al. | Aug 2008 | A1 |
20080216552 | Ibach et al. | Sep 2008 | A1 |
20080218728 | Kirschner | Sep 2008 | A1 |
20080228331 | McNerney et al. | Sep 2008 | A1 |
20080232269 | Tatman et al. | Sep 2008 | A1 |
20080235969 | Jordil et al. | Oct 2008 | A1 |
20080235970 | Crampton | Oct 2008 | A1 |
20080240321 | Narus et al. | Oct 2008 | A1 |
20080245452 | Law et al. | Oct 2008 | A1 |
20080246943 | Kaufman et al. | Oct 2008 | A1 |
20080252671 | Cannell et al. | Oct 2008 | A1 |
20080256814 | Pettersson | Oct 2008 | A1 |
20080257023 | Jordil et al. | Oct 2008 | A1 |
20080263411 | Baney et al. | Oct 2008 | A1 |
20080271332 | Jordil et al. | Nov 2008 | A1 |
20080273758 | Fuchs et al. | Nov 2008 | A1 |
20080282564 | Pettersson | Nov 2008 | A1 |
20080179206 | Feinstein et al. | Dec 2008 | A1 |
20080295349 | Uhl et al. | Dec 2008 | A1 |
20080298254 | Eidson | Dec 2008 | A1 |
20080302200 | Tobey | Dec 2008 | A1 |
20080309460 | Jefferson et al. | Dec 2008 | A1 |
20080309546 | Wakayama et al. | Dec 2008 | A1 |
20090000136 | Crampton | Jan 2009 | A1 |
20090010740 | Ferrari et al. | Jan 2009 | A1 |
20090013548 | Ferrari | Jan 2009 | A1 |
20090016475 | Rischar et al. | Jan 2009 | A1 |
20090021351 | Beniyama et al. | Jan 2009 | A1 |
20090031575 | Tomelleri | Feb 2009 | A1 |
20090046140 | Lashmet et al. | Feb 2009 | A1 |
20090046752 | Bueche et al. | Feb 2009 | A1 |
20090046895 | Pettersson et al. | Feb 2009 | A1 |
20090049704 | Styles et al. | Feb 2009 | A1 |
20090051938 | Miousset et al. | Feb 2009 | A1 |
20090083985 | Ferrari | Apr 2009 | A1 |
20090089004 | Vook et al. | Apr 2009 | A1 |
20090089078 | Bursey | Apr 2009 | A1 |
20090089233 | Gach et al. | Apr 2009 | A1 |
20090089623 | Neering et al. | Apr 2009 | A1 |
20090095047 | Patel et al. | Apr 2009 | A1 |
20090100949 | Shirai et al. | Apr 2009 | A1 |
20090109797 | Eidson | Apr 2009 | A1 |
20090113183 | Barford et al. | Apr 2009 | A1 |
20090113229 | Cataldo et al. | Apr 2009 | A1 |
20090122805 | Epps et al. | May 2009 | A1 |
20090125196 | Velazquez et al. | May 2009 | A1 |
20090133276 | Bailey et al. | May 2009 | A1 |
20090133494 | Van Dam et al. | May 2009 | A1 |
20090139105 | Granger | Jun 2009 | A1 |
20090157419 | Bursey | Jun 2009 | A1 |
20090161091 | Yamamoto | Jun 2009 | A1 |
20090165317 | Little | Jul 2009 | A1 |
20090177435 | Heininen | Jul 2009 | A1 |
20090177438 | Raab | Jul 2009 | A1 |
20090185741 | Nahari et al. | Jul 2009 | A1 |
20090187373 | Atwell | Jul 2009 | A1 |
20090241360 | Tait et al. | Oct 2009 | A1 |
20090249634 | Pettersson | Oct 2009 | A1 |
20090265946 | Jordil et al. | Oct 2009 | A1 |
20090273771 | Gittinger | Nov 2009 | A1 |
20090299689 | Stubben et al. | Dec 2009 | A1 |
20090322859 | Shelton et al. | Dec 2009 | A1 |
20090323121 | Valkenburg et al. | Dec 2009 | A1 |
20090323742 | Kumano | Dec 2009 | A1 |
20100030421 | Yoshimura et al. | Feb 2010 | A1 |
20100040742 | Dijkhuis et al. | Feb 2010 | A1 |
20100049891 | Hartwich et al. | Feb 2010 | A1 |
20100057392 | York | Mar 2010 | A1 |
20100078866 | Pettersson | Apr 2010 | A1 |
20100095542 | Ferrari | Apr 2010 | A1 |
20100122920 | Butter et al. | May 2010 | A1 |
20100123892 | Miller et al. | May 2010 | A1 |
20100134596 | Becker | Jun 2010 | A1 |
20100135534 | Weston et al. | Jun 2010 | A1 |
20100148013 | Bhotika et al. | Jun 2010 | A1 |
20100195086 | Ossig et al. | Aug 2010 | A1 |
20100195087 | Ossig et al. | Aug 2010 | A1 |
20100207938 | Yau et al. | Aug 2010 | A1 |
20100208062 | Pettersson | Aug 2010 | A1 |
20100208318 | Jensen et al. | Aug 2010 | A1 |
20100245851 | Teodorescu | Sep 2010 | A1 |
20100277747 | Rueb et al. | Nov 2010 | A1 |
20100281705 | Verdi et al. | Nov 2010 | A1 |
20100286941 | Merlot | Nov 2010 | A1 |
20100312524 | Siercks et al. | Dec 2010 | A1 |
20100318319 | Maierhofer | Dec 2010 | A1 |
20100321152 | Argudyaev et al. | Dec 2010 | A1 |
20100325907 | Tait | Dec 2010 | A1 |
20100328682 | Kotake et al. | Dec 2010 | A1 |
20110000095 | Carlson | Jan 2011 | A1 |
20110001958 | Bridges et al. | Jan 2011 | A1 |
20110007305 | Bridges et al. | Jan 2011 | A1 |
20110007326 | Daxauer et al. | Jan 2011 | A1 |
20110013199 | Siercks et al. | Jan 2011 | A1 |
20110019155 | Daniel et al. | Jan 2011 | A1 |
20110023578 | Grasser | Feb 2011 | A1 |
20110025905 | Tanaka | Feb 2011 | A1 |
20110043515 | Stathis | Feb 2011 | A1 |
20110066781 | Debelak et al. | Mar 2011 | A1 |
20110070534 | Hayashi et al. | Mar 2011 | A1 |
20110094908 | Trieu et al. | Apr 2011 | A1 |
20110107611 | Desforges et al. | May 2011 | A1 |
20110107612 | Ferrari et al. | May 2011 | A1 |
20110107613 | Tait | May 2011 | A1 |
20110107614 | Champ | May 2011 | A1 |
20110111849 | Sprague et al. | May 2011 | A1 |
20110112786 | Desforges et al. | May 2011 | A1 |
20110119025 | Fetter et al. | May 2011 | A1 |
20110123097 | Van Coppenolle et al. | May 2011 | A1 |
20110164114 | Kobayashi et al. | Jul 2011 | A1 |
20110166824 | Haisty et al. | Jul 2011 | A1 |
20110169924 | Haisty et al. | Jul 2011 | A1 |
20110170534 | York | Jul 2011 | A1 |
20110173823 | Bailey et al. | Jul 2011 | A1 |
20110173827 | Bailey et al. | Jul 2011 | A1 |
20110173828 | York | Jul 2011 | A1 |
20110178754 | Atwell et al. | Jul 2011 | A1 |
20110178755 | York | Jul 2011 | A1 |
20110178758 | Atwell et al. | Jul 2011 | A1 |
20110178762 | York | Jul 2011 | A1 |
20110178764 | York | Jul 2011 | A1 |
20110178765 | Atwell et al. | Jul 2011 | A1 |
20110192043 | Ferrari et al. | Aug 2011 | A1 |
20110273568 | Lagassey et al. | Nov 2011 | A1 |
20110282622 | Canter et al. | Nov 2011 | A1 |
20110288684 | Farlow et al. | Nov 2011 | A1 |
20120019806 | Becker et al. | Jan 2012 | A1 |
20120035788 | Trepagnier et al. | Feb 2012 | A1 |
20120044476 | Earhart et al. | Feb 2012 | A1 |
20120046820 | Allard et al. | Feb 2012 | A1 |
20120069325 | Schumann et al. | Mar 2012 | A1 |
20120069352 | Ossig et al. | Mar 2012 | A1 |
20120070077 | Ossig et al. | Mar 2012 | A1 |
20120099100 | Cramer et al. | Apr 2012 | A1 |
20120113913 | Tiirola et al. | May 2012 | A1 |
20120140244 | Gittinger et al. | Jun 2012 | A1 |
20120154786 | Gosch et al. | Jun 2012 | A1 |
20120155744 | Kennedy et al. | Jun 2012 | A1 |
20120169876 | Reichert et al. | Jul 2012 | A1 |
20120181194 | McEwan et al. | Jul 2012 | A1 |
20120197439 | Wang et al. | Aug 2012 | A1 |
20120210678 | Alcouloumre et al. | Aug 2012 | A1 |
20120217357 | Franke | Aug 2012 | A1 |
20120224052 | Bae | Sep 2012 | A1 |
20120229788 | Schumann et al. | Sep 2012 | A1 |
20120236320 | Steffey et al. | Sep 2012 | A1 |
20120257017 | Pettersson et al. | Oct 2012 | A1 |
20120260512 | Kretschmer et al. | Oct 2012 | A1 |
20120260611 | Jones et al. | Oct 2012 | A1 |
20120262700 | Schumann et al. | Oct 2012 | A1 |
20120287265 | Schumann et al. | Nov 2012 | A1 |
20130010307 | Greiner et al. | Jan 2013 | A1 |
20130025143 | Bailey et al. | Jan 2013 | A1 |
20130025144 | Briggs et al. | Jan 2013 | A1 |
20130027515 | Vinther et al. | Jan 2013 | A1 |
20130062243 | Chang et al. | Mar 2013 | A1 |
20130070250 | Ditte et al. | Mar 2013 | A1 |
20130094024 | Ruhland et al. | Apr 2013 | A1 |
20130097882 | Bridges et al. | Apr 2013 | A1 |
20130125408 | Atwell et al. | May 2013 | A1 |
20130162472 | Najim et al. | Jun 2013 | A1 |
20130176453 | Mate et al. | Jul 2013 | A1 |
20130201487 | Ossig et al. | Aug 2013 | A1 |
20130205606 | Briggs et al. | Aug 2013 | A1 |
20130212889 | Bridges et al. | Aug 2013 | A9 |
20130222816 | Briggs et al. | Aug 2013 | A1 |
20130239424 | Tait | Sep 2013 | A1 |
20130293684 | Becker et al. | Nov 2013 | A1 |
20130300740 | Snyder et al. | Nov 2013 | A1 |
20140002608 | Atwell et al. | Jan 2014 | A1 |
20140009582 | Suzuki | Jan 2014 | A1 |
20140012409 | McMurtry et al. | Jan 2014 | A1 |
20140028805 | Tohme | Jan 2014 | A1 |
20140049784 | Woloschyn et al. | Feb 2014 | A1 |
20140063489 | Steffey et al. | Mar 2014 | A1 |
20140152769 | Atwell et al. | Jun 2014 | A1 |
20140202016 | Bridges et al. | Jul 2014 | A1 |
20140226190 | Bridges et al. | Aug 2014 | A1 |
20140240690 | Newman et al. | Aug 2014 | A1 |
20140259715 | Engel | Sep 2014 | A1 |
20140268108 | Grau | Sep 2014 | A1 |
20140293023 | Sherman et al. | Oct 2014 | A1 |
20140362424 | Bridges et al. | Dec 2014 | A1 |
20150002659 | Atwell et al. | Jan 2015 | A1 |
20150130906 | Bridges | May 2015 | A1 |
20150185000 | Wilson et al. | Jul 2015 | A1 |
20150229907 | Bridges | Aug 2015 | A1 |
20170102224 | Bridges et al. | Apr 2017 | A1 |
20180095549 | Watanabe | Apr 2018 | A1 |
Number | Date | Country |
---|---|---|
508635 | Mar 2011 | AT |
2005200937 | Sep 2006 | AU |
2236119 | Sep 1996 | CN |
1307241 | Aug 2001 | CN |
2508896 | Sep 2002 | CN |
2665668 | Dec 2004 | CN |
1630804 | Jun 2005 | CN |
1630805 | Jun 2005 | CN |
1688867 | Oct 2005 | CN |
1735789 | Feb 2006 | CN |
1812868 | Aug 2006 | CN |
1818537 | Aug 2006 | CN |
1838102 | Sep 2006 | CN |
1839293 | Sep 2006 | CN |
1853084 | Oct 2006 | CN |
1926400 | Mar 2007 | CN |
101024286 | Aug 2007 | CN |
101156043 | Apr 2008 | CN |
101163939 | Apr 2008 | CN |
101371099 | Feb 2009 | CN |
101416024 | Apr 2009 | CN |
101484828 | Jul 2009 | CN |
201266071 | Jul 2009 | CN |
101506684 | Aug 2009 | CN |
101511529 | Aug 2009 | CN |
101542227 | Sep 2009 | CN |
101556137 | Oct 2009 | CN |
101806574 | Aug 2010 | CN |
101932952 | Dec 2010 | CN |
2216765 | Apr 1972 | DE |
3227980 | May 1983 | DE |
3245060 | Jul 1983 | DE |
3340317 | Aug 1984 | DE |
4410775 | Oct 1995 | DE |
4412044 | Oct 1995 | DE |
29622033 | Feb 1997 | DE |
19607345 | Aug 1997 | DE |
19720049 | Nov 1998 | DE |
19850118 | May 2000 | DE |
19928958 | Nov 2000 | DE |
10026357 | Jan 2002 | DE |
20208077 | May 2002 | DE |
10137241 | Sep 2002 | DE |
10155488 | May 2003 | DE |
10219054 | Nov 2003 | DE |
10232028 | Feb 2004 | DE |
10336458 | Feb 2004 | DE |
10244643 | Apr 2004 | DE |
20320216 | Apr 2004 | DE |
10304188 | Aug 2004 | DE |
202005000983 | Mar 2005 | DE |
202005000983 | Apr 2005 | DE |
102004028090 | Dec 2005 | DE |
102004010083 | Nov 2006 | DE |
102005056265 | May 2007 | DE |
102006053611 | May 2007 | DE |
102005060967 | Jun 2007 | DE |
102006024534 | Nov 2007 | DE |
102006035292 | Jan 2008 | DE |
202006020299 | May 2008 | DE |
102008014274 | Aug 2009 | DE |
102008039838 | Mar 2010 | DE |
102005036929 | Jun 2010 | DE |
102009001894 | Sep 2010 | DE |
102009055988 | Mar 2011 | DE |
102010032726 | Nov 2011 | DE |
102012104745 | Dec 2013 | DE |
102012109481 | Apr 2014 | DE |
0546784 | Jun 1993 | EP |
0667549 | Aug 1995 | EP |
0730210 | Sep 1996 | EP |
0614517 | Mar 1997 | EP |
0838696 | Apr 1998 | EP |
0949524 | Oct 1999 | EP |
1033556 | Sep 2000 | EP |
1189124 | Mar 2002 | EP |
1342989 | Sep 2003 | EP |
1468791 | Oct 2004 | EP |
1056987 | Apr 2005 | EP |
1734425 | Dec 2006 | EP |
1967930 | Sep 2008 | EP |
2023077 | Feb 2009 | EP |
2060530 | May 2009 | EP |
2068067 | Jun 2009 | EP |
2068114 | Jun 2009 | EP |
2177868 | Apr 2010 | EP |
2259013 | Dec 2010 | EP |
2372302 | Oct 2011 | EP |
2400261 | Dec 2011 | EP |
2344303 | May 2012 | EP |
2603228 | Mar 1988 | FR |
2935043 | Feb 2010 | FR |
894320 | Apr 1962 | GB |
1112941 | May 1968 | GB |
2222695 | Mar 1990 | GB |
2255648 | Nov 1992 | GB |
2336493 | Oct 1999 | GB |
2341203 | Mar 2000 | GB |
2388661 | Nov 2003 | GB |
2420241 | May 2006 | GB |
2447258 | Sep 2008 | GB |
2452033 | Feb 2009 | GB |
2510510 | Aug 2014 | GB |
5581525 | Jun 1955 | JP |
575584 | Jan 1982 | JP |
58171291 | Jan 1983 | JP |
5827264 | Feb 1983 | JP |
S58171291 | Oct 1983 | JP |
59133890 | Aug 1984 | JP |
61062885 | Mar 1986 | JP |
S61157095 | Jul 1986 | JP |
63135814 | Jun 1988 | JP |
0357911 | Mar 1991 | JP |
04115108 | Apr 1992 | JP |
04225188 | Aug 1992 | JP |
04267214 | Sep 1992 | JP |
0572477 | Mar 1993 | JP |
06313710 | Nov 1994 | JP |
1994313710 | Nov 1994 | JP |
06331733 | Dec 1994 | JP |
06341838 | Dec 1994 | JP |
074950 | Jan 1995 | JP |
07128051 | May 1995 | JP |
7210586 | Aug 1995 | JP |
07229963 | Aug 1995 | JP |
0815413 | Jan 1996 | JP |
0821714 | Jan 1996 | JP |
08129145 | May 1996 | JP |
H08136849 | May 1996 | JP |
08262140 | Oct 1996 | JP |
0921868 | Jan 1997 | JP |
H101111130 | Apr 1998 | JP |
10213661 | Aug 1998 | JP |
1123993 | Jan 1999 | JP |
2001056275 | Aug 1999 | JP |
2000121724 | Apr 2000 | JP |
2000249546 | Sep 2000 | JP |
2000339468 | Dec 2000 | JP |
2001013001 | Jan 2001 | JP |
2001021303 | Jan 2001 | JP |
2001066158 | Mar 2001 | JP |
2001066211 | Mar 2001 | JP |
2001337278 | Dec 2001 | JP |
3274290 | Apr 2002 | JP |
2003050128 | Feb 2003 | JP |
2003156330 | May 2003 | JP |
2003156330 | May 2003 | JP |
2003156562 | May 2003 | JP |
2003194526 | Jul 2003 | JP |
2003202215 | Jul 2003 | JP |
2003216255 | Jul 2003 | JP |
2003308205 | Oct 2003 | JP |
2004109106 | Apr 2004 | JP |
2004163346 | Jun 2004 | JP |
2004245832 | Sep 2004 | JP |
2004257927 | Sep 2004 | JP |
2004333398 | Nov 2004 | JP |
2004348575 | Dec 2004 | JP |
2005030937 | Feb 2005 | JP |
2005055226 | Mar 2005 | JP |
2005069700 | Mar 2005 | JP |
2005174887 | Jun 2005 | JP |
2005517908 | Jun 2005 | JP |
2005517914 | Jun 2005 | JP |
2005215917 | Aug 2005 | JP |
2005221336 | Aug 2005 | JP |
2005257510 | Sep 2005 | JP |
2005293291 | Oct 2005 | JP |
2006038683 | Feb 2006 | JP |
2006102176 | Apr 2006 | JP |
2006203404 | Aug 2006 | JP |
2006226948 | Aug 2006 | JP |
2006519369 | Aug 2006 | JP |
2006241833 | Sep 2006 | JP |
2006266821 | Oct 2006 | JP |
2006301991 | Nov 2006 | JP |
2007101836 | Apr 2007 | JP |
2007514943 | Jun 2007 | JP |
2007178943 | Jul 2007 | JP |
2007228315 | Sep 2007 | JP |
2008076303 | Apr 2008 | JP |
2008082707 | Apr 2008 | JP |
2008096123 | Apr 2008 | JP |
2008107286 | May 2008 | JP |
2008514967 | May 2008 | JP |
2008224516 | Sep 2008 | JP |
2008304220 | Dec 2008 | JP |
2009063339 | Mar 2009 | JP |
2009524057 | Jun 2009 | JP |
2009531674 | Sep 2009 | JP |
2009534969 | Sep 2009 | JP |
2009229255 | Oct 2009 | JP |
2009541758 | Nov 2009 | JP |
2010060304 | Mar 2010 | JP |
2010112875 | May 2010 | JP |
2010122209 | Jun 2010 | JP |
2010169405 | Aug 2010 | JP |
2010207990 | Sep 2010 | JP |
2011141174 | Jul 2011 | JP |
2013516928 | May 2013 | JP |
2013517508 | May 2013 | JP |
2013117417 | Jun 2013 | JP |
2013543970 | Dec 2013 | JP |
8801924 | Mar 1988 | WO |
8905512 | Jun 1989 | WO |
9208568 | May 1992 | WO |
9711399 | Mar 1997 | WO |
9808050 | Feb 1998 | WO |
9910706 | Mar 1999 | WO |
0014474 | Mar 2000 | WO |
0020880 | Apr 2000 | WO |
0026612 | May 2000 | WO |
0033149 | Jun 2000 | WO |
0034733 | Jun 2000 | WO |
0063645 | Oct 2000 | WO |
0063681 | Oct 2000 | WO |
0177613 | Oct 2001 | WO |
02084327 | Oct 2002 | WO |
02101323 | Dec 2002 | WO |
2004096502 | Nov 2004 | WO |
2005008271 | Jan 2005 | WO |
2005059473 | Jun 2005 | WO |
2005072917 | Aug 2005 | WO |
2005075875 | Aug 2005 | WO |
2005100908 | Oct 2005 | WO |
2006000552 | Jan 2006 | WO |
2006014445 | Feb 2006 | WO |
2006051264 | May 2006 | WO |
2006053837 | May 2006 | WO |
2007002319 | Jan 2007 | WO |
200712198 | Feb 2007 | WO |
2007028941 | Mar 2007 | WO |
2007051972 | May 2007 | WO |
2007087198 | Aug 2007 | WO |
2007118478 | Oct 2007 | WO |
2007125081 | Nov 2007 | WO |
2007144906 | Dec 2007 | WO |
2008019856 | Feb 2008 | WO |
2008027588 | Mar 2008 | WO |
2008047171 | Apr 2008 | WO |
2008048424 | Apr 2008 | WO |
2008052348 | May 2008 | WO |
2008064276 | May 2008 | WO |
2008066896 | Jun 2008 | WO |
2008068791 | Jun 2008 | WO |
2008075170 | Jun 2008 | WO |
2008121073 | Oct 2008 | WO |
2008157061 | Dec 2008 | WO |
2009001165 | Dec 2008 | WO |
2009053085 | Apr 2009 | WO |
2009083452 | Jul 2009 | WO |
2009095384 | Aug 2009 | WO |
2009123278 | Oct 2009 | WO |
2009127526 | Oct 2009 | WO |
2009130169 | Oct 2009 | WO |
2009149740 | Dec 2009 | WO |
2010040742 | Apr 2010 | WO |
2010092131 | Aug 2010 | WO |
2010108089 | Sep 2010 | WO |
2010108644 | Sep 2010 | WO |
2010148525 | Dec 2010 | WO |
2011000435 | Jan 2011 | WO |
2011000955 | Jan 2011 | WO |
2011021103 | Feb 2011 | WO |
2011029140 | Mar 2011 | WO |
2011057130 | May 2011 | WO |
2011060899 | May 2011 | WO |
2011002908 | Jun 2011 | WO |
2011090829 | Jul 2011 | WO |
2011090892 | Jul 2011 | WO |
2011090895 | Jul 2011 | WO |
2011090903 | Sep 2011 | WO |
2012037157 | Mar 2012 | WO |
2012038446 | Mar 2012 | WO |
2012061122 | May 2012 | WO |
2012103525 | Aug 2012 | WO |
2012112683 | Aug 2012 | WO |
2012125671 | Sep 2012 | WO |
2013112455 | Aug 2013 | WO |
2013184340 | Dec 2013 | WO |
2013190031 | Dec 2013 | WO |
2014128498 | Aug 2014 | WO |
Entry |
---|
GB Office Action dated Jan. 15, 2014 for SJB/PX210785GB; UK Patent Application No. 1214426.7., 4 pages. |
Extended European Search Report for Application No. 18164913.8 dated Jun. 8, 2018; 7 pgs. |
“Scanner Basis Configuration for Riegl VQ-250”, Riegl Company Webpage, Feb. 16, 2011 (Feb. 16, 2011) [retrieved on Apr. 19, 2013 (Apr. 19, 2013)]. Retrieved from the internet; 3 pages. |
14th International Forensic Science Symposium, Interpol—Lyon, France, Oct. 19-22, 2004, Review Papers, Edited by Dr. Niamh Nic Daeid, Forensic Science Unit, University of Strathclyde, Glasgow, UK; 585 pages. |
A. Hart; “Kinematic Coupling Interchangeability” Precision Engineering; vol. 28, No. 1; Jan. 1, 2004 (Jan. 1, 2004) pp. 1-15. |
ABB Flexible Automation AB: “Product Manual IRB 6400R M99, On-line Manual”; Sep. 13, 2006; XP00002657684. |
Akca, Devrim, “Full Automated Registration of Laser Scanner Point Clouds”, Institute of Geodesy and Photogrammetry, Swiss Federal Institute of Technology, Zuerich, Switzerland; Published Dec. 2003, 8 pages. |
Anonymous : So wird's gemacht: Mit T-DSL und Windows XP Home Edition gemeinsam ins Internet (Teil 3) Internet Citation, Jul. 2003 (Jul. 2003), XP002364586, Retrieved from Internet: URL:http://support.microsfot.com/kb/814538/DE/ [retrieved on Jan. 26, 2006]. |
Bornaz, L., et al., “Multiple Scan Registration in Lidar Close-Range Applications,” The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. XXXIV, Part 5/W12, Jul. 2003 (Jul. 2003), pp. 72-77, XP002590306. |
Bouvet, D., et al., “Precise 3-D Localization by Automatic Laser Theodolite and Odometer for Civil-Engineering Machines”, Proceedings of the 2001 IEEE International Conference on Robotics and Automation. ICRA 2001. Seoul, Korea, May 21-26, 2001; IEEE, US. |
Brenneke et al: “Using 3D laser range data for slam in outdoor environments.” Proceedings of the 2003 IEEE/RSJ International Conference on Intelligent Robots and Systems. Las Vegas, NV Oct. 27-31, 2003; IEEE US, vol. 1, Oct. 27, 2003, pp. 188-193. |
Cho, et al., Implementation of a Precision Time Protocol over Low Rate Wireless Personal Area Networks, IEEE, 2008. 8 pages. |
Cooklev, et al., An Implementation of IEEE 1588 Over IEEE 802.11b for Synchronization of Wireless Local Area Network Nodes, IEEE Transactions on Instrumentation and Measurement, vol. 56, No. 5, Oct. 2007, 8 pages. |
Dylan, Craig R., High Precision Makes the Massive Bay Bridge Project Work. Suspended in MidAir—Cover Story—Point of Beginning, Jan. 1, 2010, [online] http://www.pobonline.com/Articles/Cover_Story/BNP_GUID_9-5-2006_A_10000000000 . . . . |
Electro-Optical Information Systems, “The Handy Handheld Digitizer” [online], [retrieved on Nov. 29, 2011], http://vidibotics.com/htm/handy.htm, 2 pages. |
Elstrom, M.D., et al., Stereo-Based Registration of LADAR and Color Imagery, Intelligent Robots and Computer Vision XVII: Algorithms, Techniques, and Active Vision, Boston, MA, USA, vol. 3522, Nov. 2, 1998 (Nov. 2, 1998), Nov. 3, 1998 (Nov. 3, 1998) p. 343-354. |
EO Edmund Optics “Silicon Detectors” (5 pages) 2013 Edmund Optics, Inc. http://www.edmundoptics.com/electro-optics/detector-components/silicon-detectors/1305[Oct. 15, 2013 10:14:53 AM]. |
Examination Report under Section 18(3); Report dated Oct. 31, 2012; Application No. GB1210309.5. |
Examination Report under Section 18(3); Report dated Oct. 31, 2012; Application No. GB1210309.9. |
FARO Laser Scanner LS, Recording Reality's Digital Fingerprint, The Measure of Success, Rev. Aug. 22, 2005, 16 pages. |
FARO Laserscanner LS, Presentation Forensic Package, Police School of Hessen, Wiesbaden, Germany, Dec. 14, 2005; FARO Technologies, Copyright 2008, 17 pages. |
FARO Product Catalog; Faro Arm; 68 pages; Faro Technologies Inc. 2009; printed Aug. 3, 2009. |
Franklin, Paul F., What IEEE 1588 Means for Your Next T&M System Design, Keithley Instruments, Inc., [on-line] Oct. 19, 2010, http://www.eetimes.com/General/DisplayPrintViewContent?contentItemId=4209746, [Retrieved Oct. 21, 2010], 6 pages. |
Ghost 3D Systems, Authorized MicroScribe Solutions, FAQs—MicroScribe 3D Laser, MicroScan Tools, & related info, [online], [retrieved Nov. 29, 2011], http://microscribe.ghost3d.com/gt_microscan-3d_faqs.htm; 4 pages. |
Godin, G., et al., A Method for the Registration of Attributed Range Images, Copyright 2001, [Retrieved on Jan. 18, 2010 at 03:29 from IEEE Xplore], pp. 178-186. |
GoMeasure3D—Your source for all things measurement, Baces 3D 100 Series Portable CMM from GoMeasure3D, [online], [retrieved Nov. 29, 2011], http://www.gomeasure3d.com/baces100.html, 3 pages. |
Haag, et al., “Technical Overview and Application of 3D Laser Scanning for Shooting Reconstruction and Crime Scene Investigations”, Presented at the American Academy of Forensic Sciences Scientific Meeting, Washington, D.C., Feb. 21, 2008; 71 pages. |
Hart, A., “Kinematic Coupling Interchangeability”, Precision Engineering, vol. 28, No. 1, Jan. 1, 2004, pp. 1-15, XP55005507, ISSN: 0141-6359, DOI: 10.1016/S0141-6359(03)00071-0. |
Horn, B.K.P., “Closed-Form Solution of Absolute Orientation Using Unit Quaternions” J. Opt. Soc. Am. A., vol. 4., No. 4, Apr. 1987, pp. 629-642, ISSN 0740-3232. |
Howard, et al., “Virtual Environments for Scene of Crime Reconstruction and Analysis”, Advanced Interfaces Group, Department of Computer Science, University of Manchester, Manchester, UK, Feb. 28, 2000. |
Huebner, S.F., "Sniper Shooting Technique" ["Scharfschützen-Schießtechnik"], Copyright by C.A. Civil Arms Verlag GmbH, Lichtenwald 1989, all rights reserved, pp. 11-17. |
HYDROpro Navigation, Hydrographic Survey Software, Trimble, www.trimble.com, Copyright 1997-2003. |
Information on Electro-Optical Information Systems; EOIS 3D Mini-Moire C.M.M. Sensor for Non-Contact Measuring & Surface Mapping; Direct Dimensions, Jun. 1995. |
Ingensand, H., Dr., "Introduction to Geodetic Metrology" ["Einführung in die Geodätische Messtechnik"], Federal Institute of Technology Zurich, 2004, with English translation, 6 pages. |
International Search Report for International Application No. PCT/US2013/041826 filed May 20, 2013; dated Jul. 29, 2013; 5 pages. |
iQsun Laserscanner Brochure, 2 pages, Apr. 2005. |
Jasperneite, J., et al.: "Enhancements to the time synchronization standard IEEE-1588 for a system of cascaded bridges", Factory Communication Systems, 2004, Proceedings, 2004 IEEE International Workshop on Vienna, Austria, Sep. 22-24, 2004, Piscataway, NJ, USA, IEEE, Sep. 22, 2004 (Sep. 22, 2004), pp. 239-244, XP010756406, DOI: 10.1109/WFCS.2004.1377716, ISBN: 978-0-7803-8734-8. |
Leica Geosystems TruStory Forensic Analysis by Albuquerque Police Department, 2006. |
Leica Geosystems, FBI Crime Scene Case Study, Tony Grissim, Feb. 2006; 11 pages. |
Leica Rugby 55 Designed for Interior Built for Construction Brochure, Leica Geosystems, Heerbrugg, Switzerland, www.leica-geosystems.com. |
Patrick Willoughby; "Elastically Averaged Precision Alignment"; In: "Doctoral Thesis"; Jun. 1, 2005; Massachusetts Institute of Technology; XP55005620; Abstract 1.1 Motivation; Chapter 3, Chapter 6. |
Romer “Romer Measuring Arms Portable CMMs for R&D and shop floor” (Mar. 2009) Hexagon Metrology (16 pages). |
Spada, et al., IEEE 1588 Lowers Integration Costs in Continuous Flow Automated Production Lines, XP-002498255, ARC Insights, Insight # 2003-33MD&H, Aug. 20, 2003. |
Trimble—Trimble SPS630, SPS730 and SPS930 Universal Total Stations, [on-line] http://www.trimble.com/sps630_730_930.shtml (1 of 4), [Retrieved Jan. 26, 2010 8:50:29 AM]. |
Written Opinion for International Application No. PCT/US2013/040309; dated Jul. 15, 2013; 7 pages. |
Written Opinion of the International Searching Authority for International Application No. PCT/US2011/021253 dated Mar. 22, 2012. |
Written Opinion of the International Searching Authority for International Application No. PCT/US2011/021253 dated Sep. 26, 2011. |
Chinese Office Action for Application No. 201180004746.4 dated Apr. 21, 2015; 5 pages. |
Decision Revoking the European Patent (Art. 101(3)(b) EPC) dated Aug. 14, 2013, filed in Opposition re Application No. 07 785 873.6/Patent No. 2 062 069, Proprietor: Faro Technologies, Inc., filed by Leica Geosystem AG on Feb. 5, 2013, 12 pages. |
Final Office Action dated Feb. 18, 2015. |
Great Britain Examination Report to Application No. GB1422105.5 dated Jan. 26, 2015, 4 pages. |
Great Britain Examination Report to GB1418273.7 dated Oct. 24, 2014, 8 pages. |
International Preliminary Report on Patentability for International Application No. PCT/US2011/021253; Date of Completion May 9, 2012. |
International Preliminary Report on Patentability to PCT/US2013/040309, dated Dec. 16, 2014, 8 pages. |
International Preliminary Report on Patentability to PCT/US2013/040321, dated Dec. 16, 2014, 7 pages. |
International Preliminary Report on Patentability to PCT/US2013/041826, dated Dec. 9, 2014, 7 pages. |
International Preliminary Report on Patentability, dated Nov. 12, 2014. |
It is Alive in the Lab, Autodesk University, Fun with the Immersion MicroScribe Laser Scanner, [online], [retrieved Nov. 29, 2011], http://labs.blogs.com/its_alive_in_the_lab/2007/11/fun-with-the-im.html; 3 pages. |
Japanese Office Action for Application No. 2012-550052 dated Jul. 15, 2014; 4 pages. |
Jasiobedzki, Piotr, “Laser Eye—A New 3D Sensor for Active Vision”, SPIE—Sensor Fusion VI, vol. 2059, Sep. 7, 1993 (Sep. 7, 1993), pp. 316-321, XP00262856, Boston, U.S.A., Retrieved from the Internet: URL:http://scitation.aip.org/getpdf/servlet/Ge. |
Geng, J., "DLP-Based Structured Light 3D Imaging Technologies and Applications" (15 pages), Emerging Digital Micromirror Device Based Systems and Applications III; edited by Michael R. Douglass, Patrick I. Oden, Proc. of SPIE, vol. 7932, 79320B (Feb. 9, 2011). |
JP Office Action re Application No. 2012-534590 dated Sep. 2, 2014, 4 pages. |
Kreon Laser Scanners, Getting the Best in Cutting Edge 3D Digitizing Technology, B3-D MCAD Consulting/Sales [online], [retrieved Nov. 29, 2011], http://www.b3-d.com/Kreon.html. |
Langford, et al., “Practical Skills in Forensic Science”, Pearson Education Limited, Essex, England, First Published 2005, Forensic Chemistry. |
Laser Reverse Engineering with Microscribe, [online], [retrieved Nov. 29, 2011], http://www.youtube.com/watch?v=8VRz_2aEJ4E&feature=PlayList&p=F63ABF74F30DC81B&playnext=1&playnext_from=PL&index=1. |
Leica Geosystems: “Leica Rugby 55 Designed for Interior Built for Construction”, Jan. 1, 2009, XP002660558. |
Leica TPS800 Performance Series—Equipment List, 2004, pp. 1-4. |
Merriam-Webster (m-w.com), “Interface”. 2012. http://www.merriam-webster.com/dictionary/interface. |
Merriam-Webster (m-w.com), “Parts”. 2012, pp. 1-6. http://www.merriam-webster.com/dictionary/parts. |
Merriam-Webster (m-w.com), “Traverse”. 2012. http://www.merriam-webster.com/dictionary/traverse. |
MicroScan 3D User Guide, RSI GmbH, 3D Systems & Software, Oberursel, Germany, email: info@rsi-gmbh.de, Copyright RSI Roland Seifert Imaging GmbH 2008. |
Moog Components Group “Technical Brief; Fiber Optic Rotary Joints” Document No. 303 (6 pages) Mar. 2008; MOOG, Inc. 2008 Canada; Focal Technologies. |
MOOG Components Group; “Fiber Optic Rotary Joints; Product Guide” (4 pages) Dec. 2010; MOOG, Inc. 2010. |
Non-Final Office Action dated Dec. 29, 2014. |
Non-Final Office Action, dated Mar. 6, 2015. |
Notice of Allowance, dated Dec. 1, 2014. |
Notice of Allowance, dated Feb. 20, 2015. |
PCT/US2011/020625 International Search Report dated Feb. 25, 2011. |
Provision of the minutes in accordance with Rule 124(4) EPC dated Aug. 14, 2013, filed in Opposition re Application No. 07 785 873.6/Patent No. 2 062 069, Proprietor: Faro Technologies, Inc., filed by Leica Geosystem AG on Feb. 5, 2013, pp. 1-10. |
Romer "Romer Absolute Arm Maximum Performance Portable Measurement" (Printed Oct. 2010); Hexagon Metrology, Inc.; http://us.ROMER.com; Hexagon Metrology, Inc. 2010. |
Romer “Romer Absolute Arm Product Brochure” (2010); Hexagon Metrology; www.hexagonmetrology.com; Hexagon AB 2010. |
Romer Measuring Arms; Portable CMMs for the shop floor; 20 pages; Hexagon Metrology, Inc. (2009); http://us.ROMER.com. |
Boyd, R.W., "Radiometry and the Detection of Optical Radiation" (pp. 20-23), 1983, John Wiley & Sons, Inc. |
Sauter, et al., Towards New Hybrid Networks for Industrial Automation, IEEE, 2009. |
Se, et al., “Instant Scene Modeler for Crime Scene Reconstruction”, MDA, Space Missions, Ontario, Canada, Copyright 2005, IEEE; 8 pages. |
Surmann et al. "An autonomous mobile robot with a 3D laser range finder for 3D exploration and digitalization of indoor environments." Robotics and Autonomous Systems vol. 45, No. 3-4, Dec. 31, 2003, pp. 181-198, Amsterdam, Netherlands; 18 pgs. |
The Scene, Journal of the Association for Crime Scene Reconstruction, Apr.-Jun. 2006, vol. 12, Issue 2; 31 pages. |
Umeda, K., et al., Registration of Range and Color Images Using Gradient Constraints and Range Intensity Images, Proceedings of the 17th International Conference on Pattern Recognition (ICPR'04), Copyright 2010 IEEE. |
Williams, J.A., et al., Evaluation of a Novel Multiple Point Set Registration Algorithm, Copyright 2000, [Retrieved on Jan. 18, 2010 at 04:10 from IEEE Xplore], pp. 1006-1010. |
Written Opinion for International Application No. PCT/US2013/041826 filed May 20, 2013; dated Jul. 29, 2013; 7 pages. |
Written Opinion for International Application No. PCT/US2013/040321 filed May 9, 2013; dated Jul. 15, 2013; 7 pages. |
Written Opinion of the International Searching Authority for International Application No. PCT/US2013/049562 dated Nov. 28, 2013, 10 pages. |
Written Opinion of the International Searching Authority for International Application No. PCT/US2011/021253; dated Sep. 26, 2011; 11 pages. |
Ben-Tzvi, P., et al., "Extraction of 3D images using pitch-actuated 2D laser range finder for robotic vision", Robotic and Sensors Environments (ROSE), 2010 IEEE International Workshop on, IEEE, Piscataway, NJ, USA, Oct. 15, 2010 (Oct. 15, 2010), pp. 1-6, XP031840390. |
Chinese Office Action for Application No. 201180004746.4 dated Sep. 30, 2014; 9 pages. |
Chinese Office Action for Application No. 201380029985.4 dated Aug. 7, 2015; 7 pages. |
Chinese Office Action for Application No. 201380030405.3 dated Sep. 20, 2016; 12 pages. |
Chinese Office Action to Application No. 201180004746.4, dated Apr. 21, 2015, 3 pages. |
Chinese Office Action with Search Report for Application No. 201380005188.2 dated Mar. 3, 2015; 3 pages. |
Cho, Yong K., et al. "Light-weight 3D ladar system for construction robotic operations" 26th International Symposium on Automation and Robotics in Construction (ISARC 2009), 2009, XP55068755, Retrieved from Internet: URL:http://www.iaarc.org/publications/fulltext/Light-weight_3D_ladar_system_for_construction_robotic_operations.pdf [retrieved on Jun. 28, 2013]. |
Examination Report for GB1504825.9 dated May 28, 2015; 6 pages. |
Geng, J. “Structured-Light 3D Surface Imaging: A Tutorial,” Advances in Optics and Photonics 3; Mar. 31, 2011, pp. 128-160; IEEE Intelligent Transportation System Society; 2011 Optical Society of America. |
Geng, J., et al., "DLP-based structured light 3D imaging technologies and applications", Emerging Digital Micromirror Device Based Systems and Applications III, SPIE, vol. 7932, No. 1, Feb. 10, 2011 (Feb. 10, 2011), 15 pgs. |
German Examination Report for Application No. 11 2013 003 076.4 dated Jul. 23, 2015; 7 pages. |
German Examination Report for Application No. 10 201502050110.2 dated Feb. 25, 2016; 5 pages. |
German Examination Report for Application No. 11 2013 002 824.7 dated Jul. 21, 2015; 6 pages. |
German Examination Report for Application No. 112011100309.9 dated Sep. 23, 2014; 10 pages. |
Great Britain Examination Report for Application No. GB1214426.7 dated Oct. 6, 2014; 5 pages. |
Great Britain Examination Report to Application No. GB1412309.5; dated Aug. 8, 2014; 3 pages. |
Great Britain Examination Report for Application No. GB1214426.7 dated Jan. 15, 2014; 5 pages. |
International Search Report and Written Opinion for International Application No. PCT/US2011/021253 dated Sep. 26, 2011; 18 pages. |
Japanese Office Action for Application No. 2015-516035 dated May 12, 2015; 3 pages. |
Japanese Office Action for Application No. 2012-550045 dated Feb. 18, 2014; 2 pages. |
Japanese Office Action for Application No. 2014-102495 dated Feb. 10, 2015; 2 pgs. |
Japanese Office Action for Application No. 2014-561197 dated Sep. 1, 2015, 3 pgs. |
Japanese Office Action for Application No. 2015-516023 dated Mar. 28, 2017; 4 pgs. |
Japanese Office Action for Application No. 2015-049378 dated Apr. 7, 2015, 3 pages. |
Lee, Min-Gu, et al., “Compact 3D lidar based on optically coupled horizontal and vertical scanning mechanism for the autonomous navigation of robots,” Proceedings of SPIE, vol. 8037, May 10, 2011 (May 10, 2011), p. 80371H, XP055069052. |
Great Britain Office action for Application No. GB1500230.6 dated Nov. 21, 2017; 5 pgs. |
Chinese Office Action for Application No. 201380029985.4 dated Mar. 11, 2016; 8 pages. |
PCT International Search Report and Written Opinion for International Application No. PCT/US2013/040309 dated Jul. 15, 2013; 11 pages. |
PCT International Search Report and Written Opinion for International Application No. PCT/US2013/040321 dated Jul. 15, 2013; 11 pages. |
PCT International Search Report and Written Opinion for International Application No. PCT/US2013/041826 dated Jul. 29, 2013; 12 pages. |
PCT International Search Report and Written Opinion for International Application No. PCT/US2015/049078 dated Nov. 23, 2015; 12 pages. |
PCT International Search Report and Written Opinion for International Application No. PCT/US2015/060087 dated Feb. 17, 2016; 11 pages. |
Trujillo-Pino, A., et al., "Accurate subpixel edge location based on partial area effect", Elsevier, Image and Vision Computing 31 (2013), pp. 72-90. |
Related Publications

Number | Date | Country |
---|---|---|
20180172428 A1 | Jun 2018 | US |
Provisional Applications

Number | Date | Country |
---|---|---|
61355279 | Jun 2010 | US |
61351347 | Jun 2010 | US |
61296555 | Jan 2010 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 15481673 | Apr 2017 | US |
Child | 15876476 | | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 14679580 | Apr 2015 | US |
Child | 15481673 | | US |
Parent | 14485876 | Sep 2014 | US |
Child | 14679580 | | US |
Parent | 13491176 | Jun 2012 | US |
Child | 14485876 | | US |
Parent | 13006507 | Jan 2011 | US |
Child | 13491176 | | US |