The following background art may be regarded as useful for the understanding of the invention: U.S. Patent Publication No. 2012/120391 A1; CA Patent No. 2,811,444 A1; and, U.S. Patent Publication No. 2012/120415 A1.
The present invention relates to a coordinate measuring device. One set of coordinate measurement devices belongs to a class of instruments that measure the three-dimensional (3D) coordinates of a point by sending a laser beam to the point, where it is intercepted by a retroreflector target. The instrument finds the coordinates of the point by measuring the distance and the two angles to the target. The distance is measured with a distance-measuring device such as an absolute distance meter (ADM) or an interferometer. The angles are measured with an angular transducer such as an angular encoder. A gimbaled beam-steering mechanism within the instrument directs the laser beam to the point of interest. An example of such a device is a laser tracker.
A coordinate measuring device closely related to the laser tracker is the total station. The total station, which is most often used in surveying applications, may be used to measure the coordinates of diffusely scattering or retroreflective targets. Hereinafter, the term laser tracker is used in a broad sense to include total stations.
Ordinarily a laser tracker sends a laser beam to a retroreflector target. A common type of retroreflector target is the spherically mounted retroreflector (SMR), which comprises a cube-corner retroreflector embedded within a metal sphere. The cube-corner retroreflector comprises three mutually perpendicular mirrors. The apex of the cube corner, which is the common point of intersection of the three mirrors, is located at the center of the sphere. It is common practice to place the spherical surface of the SMR in contact with an object under test and then move the SMR over the surface being measured. Because of this placement of the cube corner within the sphere, the perpendicular distance from the apex of the cube corner to the surface of the object under test remains constant despite rotation of the SMR. Consequently, the 3D coordinates of a surface can be found by having a tracker follow the 3D coordinates of an SMR moved over the surface.
A gimbal mechanism within the laser tracker may be used to direct a laser beam from the tracker to the SMR. Part of the light retroreflected by the SMR enters the laser tracker and passes onto a position detector. The position of the light that hits the position detector is used by a tracker control system to adjust the rotation angles of the mechanical azimuth and zenith axes of the laser tracker to keep the laser beam centered on the SMR. In this way, the tracker is able to follow (track) the SMR.
Angular transducers such as angular encoders attached to the mechanical azimuth and zenith axes of the tracker may be used to determine the azimuth and zenith angles of the laser beam (with respect to the tracker frame of reference). The one distance measurement and two angle measurements obtained by the laser tracker are sufficient to completely specify the three-dimensional location of the SMR or other retroreflector target.
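By way of illustration only, the conversion from one distance and two angles to a three-dimensional location may be sketched as follows. The axis conventions below (zenith angle measured from the vertical axis, azimuth measured about it) are assumptions for the example; a particular tracker frame of reference may differ.

```python
import math

def spherical_to_cartesian(distance, azimuth_rad, zenith_rad):
    """Convert a tracker measurement (distance, azimuth, zenith) to x, y, z.

    Assumes the zenith angle is measured down from the vertical (z) axis and
    the azimuth angle is measured about that axis; a real tracker's frame of
    reference and sign conventions may differ.
    """
    x = distance * math.sin(zenith_rad) * math.cos(azimuth_rad)
    y = distance * math.sin(zenith_rad) * math.sin(azimuth_rad)
    z = distance * math.cos(zenith_rad)
    return x, y, z

# Example: a target 5 m away at 30 degrees azimuth, 80 degrees zenith
print(spherical_to_cartesian(5.0, math.radians(30.0), math.radians(80.0)))
```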
As mentioned previously, two types of distance meters may be found in laser trackers: interferometers and absolute distance meters (ADMs). In the laser tracker, an interferometer (if present) may determine the distance from a starting point to a finishing point by counting the number of increments of known length (usually the half-wavelength of the laser light) that pass as a retroreflector target is moved between the two points. If the beam is broken during the measurement, the number of counts cannot be accurately known, causing the distance information to be lost. By comparison, the ADM in a laser tracker determines the absolute distance to a retroreflector target without regard to beam breaks, which also allows switching between targets. Because of this, the ADM is said to be capable of “point-and-shoot” measurement. Initially, absolute distance meters were only able to measure stationary targets and for this reason were always used together with an interferometer. However, some modern absolute distance meters can make rapid measurements, thereby eliminating the need for an interferometer.
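The incremental principle of the interferometer may be illustrated with a short calculation; the wavelength and count values below are illustrative only and are not drawn from any particular instrument.

```python
# Interferometric distance from counted increments of known length.
# The increment is usually the half-wavelength of the laser light.
wavelength_m = 633e-9      # illustrative red laser wavelength
counts = 3_160_000         # increments counted while the retroreflector moved (illustrative)
distance_moved_m = counts * wavelength_m / 2.0
print(f"distance moved: {distance_moved_m:.6f} m")
# If the beam is broken, the count becomes unknown and the distance information is lost.
```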
In its tracking mode, the laser tracker will automatically follow movements of the SMR when the SMR is in the capture range of the tracker. If the laser beam is broken, tracking will stop. The beam may be broken by any of several means: (1) an obstruction between the instrument and SMR; (2) rapid movements of the SMR that are too fast for the instrument to follow; or (3) the direction of the SMR being turned beyond the acceptance angle of the SMR. Following a beam break, in some modes of operation the beam by default remains fixed at the point of the beam break or at the last commanded position. It may be necessary for an operator to visually search for the tracking beam and place the SMR in the beam in order to lock the instrument onto the SMR and continue tracking. In another mode of operation, the beam may be automatically directed back to the SMR through the use of a camera system, as discussed hereinbelow.
Some laser trackers include one or more cameras. A camera axis may be coaxial with the measurement beam or offset from the measurement beam by a fixed distance or angle. A camera may be used to provide a wide field of view to locate retroreflectors. A modulated light source placed near the camera optical axis may illuminate retroreflectors, thereby making them easier to identify. In this case, the retroreflectors flash in phase with the illumination, whereas background objects do not. One application for such a camera is to detect multiple retroreflectors in the field of view and measure each in an automated sequence.
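A minimal sketch of one way the in-phase flashing could be exploited to distinguish retroreflector image spots from the background is given below; the frame-differencing scheme, threshold, and array sizes are assumptions for illustration and do not describe any particular tracker.

```python
import numpy as np

def find_flashing_spots(frame_led_on, frame_led_off, threshold):
    """Return pixel coordinates that brighten in phase with the modulated source.

    frame_led_on / frame_led_off: 2D arrays captured with the illumination on
    and off; retroreflector images appear only in the 'on' frame, while
    background objects appear nearly equally in both.
    """
    difference = frame_led_on.astype(np.int32) - frame_led_off.astype(np.int32)
    return np.argwhere(difference > threshold)

# Usage with synthetic 8-bit images (illustrative values)
off = np.full((480, 640), 40, dtype=np.uint8)
on = off.copy()
on[200, 300] = 250   # a retroreflector image spot
print(find_flashing_spots(on, off, threshold=100))
```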
Some laser trackers have the ability to measure with six degrees of freedom (DOF), which may include three coordinates, such as x, y, and z, and three rotations, such as pitch, roll, and yaw. Several systems based on laser trackers are available or have been proposed for measuring six degrees of freedom.
As explained hereinabove, tracking of a retroreflector target stops when the beam is broken. In some cases, such a beam break is intentionally created by the operator—for example, to use the beam to provide a marker for the alignment of target stands or instruments. In other cases, a beam break is unintentional or unavoidable—for example, when an operator rotates the retroreflector target too much or passes the retroreflector behind an object in moving from one point to another. In cases where the beam break is unwanted, it is desirable to provide a way to conveniently steer the beam back onto the retroreflector target.
One method known in the art for conveniently steering a beam back onto a retroreflector target is to illuminate the retroreflector target with a cone of light, to view the illuminated retroreflector target with a locator camera placed in close proximity to the light source producing the cone of light, to evaluate the position of the retroreflector image on a photosensitive array contained within the locator camera, and to activate motors of the laser tracker to drive the beam of light from the tracker toward the retroreflector target. This action may be repeated if necessary to lock the light beam from the tracker onto the retroreflector target. The locking of the light beam on the retroreflector target may be recognized by the position detector receiving a relatively large amount of retroreflected light.
In one implementation of this method for steering a beam onto a retroreflector target, the locator camera system automatically finds and locks onto a nearby retroreflector target whenever the tracker loses the beam. However, this method is limited in some respects. In some cases, numerous retroreflector targets may be located within a measurement volume, and the operator may want to direct the tracker beam to a different target than the one automatically selected by the tracker following a beam break. In other cases, the operator may want the tracker beam to remain fixed in direction so that a stand or instrument may be aligned to it.
One way around this difficulty that is known in the art is to use gestures to control the behavior of a laser tracker. In one implementation using gestures, a retroreflector target is followed using one or more locator cameras and associated proximate light sources. In this implementation, the cameras may detect a particular gesture by evaluating the motion of an illuminated retroreflector target or evaluating a pattern in the power of light from the retroreflector targets. A potential disadvantage in the use of gestures is that the operator must remember the correspondence between tracker commands and gestures.
What is needed is a flexible and convenient method for acquiring retroreflector targets. In some cases, it is desirable to recapture a retroreflector target following a beam break. In other cases, it is desirable to direct a tracker beam, either broken or unbroken, to a different retroreflector target.
A method for locking onto and tracking a selected retroreflector target with a laser tracker, the locking onto and tracking carried out under direction of an operator, the method including steps of: providing at least one retroreflector target; providing the laser tracker, the laser tracker having a structure, a first light source, a distance meter, a first angular transducer, a second angular transducer, a position detector, a camera, a second light source, a first motor, a second motor, and a processor, the structure rotatable about a first axis and a second axis; the first light source configured to produce a first light beam that cooperates with the distance meter, the first angular transducer configured to measure a first angle of rotation about the first axis, the second angular transducer configured to measure a second angle of rotation about the second axis, the position detector configured to receive a reflected beam, the reflected beam being the first light beam reflected by a retroreflector target, the camera including a lens system and a photosensitive array, the second light source configured to provide a cone of light, the first light beam and the cone of light being fixed in relation to the structure, the second light source configured to cooperate with the camera, the camera having a field of view, the processor configured to operate the laser tracker. The method also includes the steps of: providing a transceiver coupled to the laser tracker or coupled to a computer in communication with the laser tracker, the transceiver including a receiver and optionally a transmitter; providing a handheld appliance configured to wirelessly communicate with the transceiver; positioning the at least one retroreflector target within the field of view of the camera; actuating by the operator the handheld appliance and sending in response to the actuation a wireless message to the transceiver; determining a retroreflector target criterion; responding to the wireless message by repetitively carrying out steps in a loop including the following steps (a)-(e) and exiting the loop when an exit condition is met: (a) reflecting part of the cone of light by the at least one retroreflector target and capturing an array image on the photosensitive array; (b) determining which retroreflector target meets the retroreflector target criterion, the determining based at least in part on the array image, the retroreflector target that meets the retroreflector target criterion referred to as the selected retroreflector target; (c) measuring a signal level with the position detector and determining, based on the signal level, whether the position detector is receiving the reflected beam; (d) establishing whether the exit condition is met, the exit condition being met if and only if the position detector receives the reflected beam and the reflected beam comes from the selected retroreflector target; (e) activating the first motor and the second motor to steer the first light beam toward the selected retroreflector target; activating the first motor and the second motor to steer the reflected beam so as to keep the reflected beam on the position detector; and measuring to the selected retroreflector target a distance with the distance meter, a third angle with the first angular transducer, and a fourth angle with the second angular transducer.
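The repetitive steps (a)-(e) and the exit condition lend themselves to a simple control loop, sketched schematically below. The sketch is non-limiting, and the names capture_array_image, select_target, position_detector_signal, beam_is_on, steer_motors_toward, track, and measure are hypothetical placeholders rather than part of any tracker interface.

```python
def lock_onto_selected_target(tracker, criterion, signal_threshold):
    """Repeat steps (a)-(e) until the position detector receives the reflected
    beam and that beam comes from the target meeting the criterion."""
    while True:
        image = tracker.capture_array_image()                 # (a) image the illuminated targets
        selected = tracker.select_target(image, criterion)    # (b) apply the target criterion
        level = tracker.position_detector_signal()            # (c) measure the signal level
        locked = level > signal_threshold                     #     reflected beam on detector?
        if locked and tracker.beam_is_on(selected):           # (d) exit condition met
            break
        tracker.steer_motors_toward(selected)                 # (e) drive first/second motors
    tracker.track(selected)            # keep the reflected beam on the position detector
    return tracker.measure(selected)   # distance and two angles for the selected target
```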
Referring now to the drawings, wherein like elements are numbered alike in the several FIGURES:
A prior art laser tracker 10 is illustrated in
Laser beam 46 may comprise one or more laser wavelengths. For the sake of clarity and simplicity, a steering mechanism of the sort shown in
In exemplary laser tracker 10, cameras 52 and light sources 54 are located on payload 15. Light sources 54 illuminate one or more retroreflector targets 26. Light sources 54 may be LEDs electrically driven to repetitively emit pulsed light. Each camera 52 includes a photosensitive array and a lens placed in front of the photosensitive array. The photosensitive array may be a CMOS or CCD array. The lens may have a relatively wide field of view, say thirty or forty degrees. The purpose of the lens is to form an image on the photosensitive array of objects within the field of view of the lens. Each light source 54 is placed near camera 52 so that light from light source 54 is reflected off each retroreflector target 26 onto camera 52. In this way, retroreflector images are readily distinguished from the background on the photosensitive array as their image spots are brighter than background objects and are pulsed. There may be two cameras 52 and two light sources 54 placed about the line of laser beam 46. By using two cameras in this way, the principle of triangulation can be used to find the three-dimensional coordinates of any SMR within the field of view of the camera. In addition, the three-dimensional coordinates of the SMR can be monitored as the SMR is moved from point to point.
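A minimal sketch of the triangulation principle mentioned above is given below: the three-dimensional point closest, in a least-squares sense, to the two camera rays is computed. The camera positions, ray directions, and solver are illustrative assumptions and do not describe the tracker's internal algorithm.

```python
import numpy as np

def triangulate(origin1, dir1, origin2, dir2):
    """Return the 3D point closest (least squares) to two rays.

    Each ray is given by an origin (camera position) and a direction obtained
    from the image spot of the retroreflector on that camera.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for origin, direction in ((origin1, dir1), (origin2, dir2)):
        d = direction / np.linalg.norm(direction)
        P = np.eye(3) - np.outer(d, d)   # projector onto the plane normal to the ray
        A += P
        b += P @ origin
    return np.linalg.solve(A, b)

# Two cameras placed about the beam line (illustrative geometry)
p = triangulate(np.array([-0.1, 0.0, 0.0]), np.array([0.05, 0.0, 1.0]),
                np.array([0.1, 0.0, 0.0]), np.array([-0.05, 0.0, 1.0]))
print(p)   # approximately [0, 0, 2]
```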
Other arrangements of one or more cameras and light sources are possible. For example, a light source and camera can be coaxial or nearly coaxial with the laser beams emitted by the tracker. In this case, it may be necessary to use optical filtering or similar methods to avoid saturating the photosensitive array of the camera with the laser beam from the tracker. Another possible arrangement is to use a single camera located on the payload or base of the tracker.
As shown in
The laser tracker 10 may be rotated on its side, rotated upside down, or placed in an arbitrary orientation. In these situations, the terms azimuth axis and zenith axis have the same direction relative to the laser tracker as the directions shown in
In another embodiment, the payload 15 is replaced by a mirror that rotates about the azimuth axis 20 and the zenith axis 18. A laser beam is directed upward and strikes the mirror, from which it launches toward a retroreflector 26.
Many types of peripheral devices are possible, but here three such devices are shown: a temperature sensor 1582, a six-DOF probe 1584, and a personal digital assistant 1586, which might be a smart phone or a remote control, for example. The laser tracker may communicate with peripheral devices by a variety of means, including wireless communication over the antenna 1572, by means of a vision system such as a camera, and by means of distance and angular readings of the laser tracker to a cooperative target such as the six-DOF probe 1584.
In an embodiment, a separate communications bus goes from the master processor 1520 to each of the electronics units 1530, 1540, 1550, 1560, 1565, and 1570. Each communications line may have, for example, three serial lines that include the data line, clock line, and frame line. The frame line indicates whether or not the electronics unit should pay attention to the clock line. If it indicates that attention should be given, the electronics unit reads the current value of the data line at each clock signal. The clock signal may correspond, for example, to a rising edge of a clock pulse. In an embodiment, information is transmitted over the data line in the form of a packet. In an embodiment, each packet includes an address, a numeric value, a data message, and a checksum. The address indicates where, within the electronics unit, the data message is to be directed. The location may, for example, correspond to a processor subroutine within the electronics unit. The numeric value indicates the length of the data message. The data message contains data or instructions for the electronics unit to carry out. The checksum is a numeric value that is used to minimize the chance that errors are transmitted over the communications line.
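By way of illustration, such a packet might be laid out as sketched below. The field widths (2-byte address, 2-byte length) and the particular checksum (a simple modulo-256 sum) are assumptions chosen for the example, since the description does not fix them.

```python
import struct

def build_packet(address, message):
    """Pack address, length (numeric value), data message, and checksum into bytes.

    The 2-byte big-endian address and length fields and the modulo-256 sum
    used as the checksum are illustrative choices only.
    """
    header = struct.pack(">HH", address, len(message))
    checksum = sum(header + message) % 256
    return header + message + bytes([checksum])

def verify_packet(packet):
    """Return (address, message) if the checksum matches, else raise."""
    address, length = struct.unpack(">HH", packet[:4])
    message, checksum = packet[4:4 + length], packet[4 + length]
    if sum(packet[:4 + length]) % 256 != checksum:
        raise ValueError("checksum mismatch: error on the communications line")
    return address, message

pkt = build_packet(address=0x0102, message=b"latch encoder values")
print(verify_packet(pkt))
```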
In an embodiment, the master processor 1520 sends packets of information over bus 1610 to payload functions electronics 1530, over bus 1611 to azimuth encoder electronics 1540, over bus 1612 to zenith encoder electronics 1550, over bus 1613 to display and UI electronics 1560, over bus 1614 to removable storage hardware 1565, and over bus 1616 to RFID and wireless electronics 1570.
In an embodiment, master processor 1520 also sends a synch (synchronization) pulse over the synch bus 1630 to each of the electronics units at the same time. The synch pulse provides a way of synchronizing values collected by the measurement functions of the laser tracker. For example, the azimuth encoder electronics 1540 and the zenith encoder electronics 1550 latch their encoder values as soon as the synch pulse is received. Similarly, the payload functions electronics 1530 latch the data collected by the electronics contained within the payload. The six-DOF, ADM, and position detector all latch data when the synch pulse is given. In most cases, the camera and inclinometer collect data at a slower rate than the synch pulse rate but may latch data at multiples of the synch pulse period.
The laser tracker electronics processing system 1510 may communicate with an external computer 1590, or it may provide computation, display, and UI functions within the laser tracker. The laser tracker communicates with computer 1590 over communications link 1606, which might be, for example, an Ethernet line or a wireless connection. The laser tracker may also communicate with other elements 1600, represented by the cloud, over communications link 1602, which might include one or more electrical cables, such as Ethernet cables, and one or more wireless connections. An example of an element 1600 is another three-dimensional test instrument—for example, an articulated arm CMM, which may be relocated by the laser tracker. A communication link 1604 between the computer 1590 and the elements 1600 may be wired (e.g., Ethernet) or wireless. An operator sitting at a remote computer 1590 may make a connection to the Internet, represented by the cloud 1600, over an Ethernet or wireless line, which in turn connects to the master processor 1520 over an Ethernet or wireless line. In this way, a user may control the action of a remote laser tracker.
A method and apparatus in an embodiment of the present invention is shown in
The reception of the wireless signal by the one or more transceivers 65A, 65B, and 65C causes the software to obtain a retroreflector target criterion (step 710 in
A further response to the wireless signal 420 is illustrated in
In a step 732, software determines which retroreflector meets the retroreflector criterion. For example, if the retroreflector criterion were selected by the operator to be the retroreflector target nearest the tracker beam 46, the software would evaluate the position of the images on the photosensitive array(s) of the camera(s) to determine whether the retroreflector criterion was being met. The decision of step 735 is made based on an evaluation of two conditions. First, the software notes whether the position detector is receiving a retroreflected tracker beam 46. Such a beam will be received by the position detector if the beam strikes relatively near the center of a retroreflector target. The determination of whether the position detector has received the retroreflected light is made on the basis of a signal level provided by the position detector. For example, one type of position detector is a lateral-type position sensitive detector having four electrodes. By adding the voltage levels at each of the four electrodes, the total optical power present on the position detector may be determined. If the optical power exceeds a pre-established level, presence of a retroreflected portion of the beam 46 is indicated. In other words, in this case, the laser tracker has locked onto the target. Second, the software notes whether the image obtained on the photosensitive array(s) of the camera(s) 52 corresponds to the position of the retroreflector that meets the retroreflector target criterion. If this is the case and if a retroreflected portion of the tracker beam 46 is being received by the position detector, the procedure continues to track on the retroreflector target, as indicated in step 745. Otherwise, in step 740 the azimuth (AZ) and zenith (ZE) motors are activated to drive the tracker beam toward the selected retroreflector target. The steps 720-740 are then repeated until an exit condition of step 735 is satisfied.
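The signal-level test described above for a four-electrode lateral-type position sensitive detector may be sketched as follows; the threshold value and the normalized spot-position formula are illustrative assumptions rather than calibrated expressions.

```python
def position_detector_state(v_left, v_right, v_top, v_bottom, power_threshold):
    """Evaluate a four-electrode lateral-type position sensitive detector.

    The sum of the four electrode voltages indicates the total optical power
    on the detector; exceeding a pre-established level indicates that a
    retroreflected portion of the tracker beam is present (beam locked on).
    """
    total = v_left + v_right + v_top + v_bottom
    locked = total > power_threshold
    # Normalized spot position, usable for keeping the beam centered
    # (illustrative formula; a real detector requires calibration).
    x = (v_right - v_left) / total if total else 0.0
    y = (v_top - v_bottom) / total if total else 0.0
    return locked, x, y

print(position_detector_state(0.9, 1.1, 1.0, 1.0, power_threshold=2.5))
```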
At the step 745, tracking of the beam on the retroreflector target is initiated by activating the AZ and ZE motors to keep the beam approximately centered on the position detector. The retroreflector may be tracked by the operator to a location of interest, at which point the tracker may be used to measure a distance and two angles to determine three-dimensional coordinates of an object under test (step 750 in
The use of the selection criterion of step 710 is illustrated for the situation shown in
In another embodiment, a mode of selection is chosen ahead of time as a setting value or default value. For example, the operator may want to always have the tracker beam lock onto the retroreflector target closest to the IMU or the operator may always want to select a retroreflector from an image. Such a setting may be chosen by the operator on a computer program used with the laser tracker, for example.
One choice the user may make is whether the method discussed herein applies only to the case in which the beam is broken (not tracking on the retroreflector) or whether it also should apply to the case in which a beam is tracking on a target and the operator wishes to direct the beam to another target.
As an example, we consider the situation of
In
The IMUs found in smart phones and other handheld devices provide information not only about the position of the smart phone but also about the direction in which the smart phone is pointing. This capability can be used to provide a useful measurement technique illustrated in
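A minimal sketch of how the reported pointing direction could be used to pick the intended target is given below: the retroreflector whose direction from the handheld device makes the smallest angle with the device's pointing vector is selected. The vector algebra is generic, and the coordinates are illustrative values not tied to any particular device API.

```python
import math

def select_by_pointing(device_position, pointing_vector, target_positions):
    """Return the index of the target most nearly along the pointing vector."""
    def angle_to(target):
        to_target = [t - d for t, d in zip(target, device_position)]
        dot = sum(a * b for a, b in zip(pointing_vector, to_target))
        norm = math.hypot(*pointing_vector) * math.hypot(*to_target)
        return math.acos(max(-1.0, min(1.0, dot / norm)))
    return min(range(len(target_positions)), key=lambda i: angle_to(target_positions[i]))

# Illustrative target coordinates in the tracker frame of reference
targets = [(2.0, 0.5, 1.0), (1.0, -3.0, 1.2), (4.0, 4.0, 0.8)]
print(select_by_pointing((0.0, 0.0, 1.5), (1.0, 1.0, -0.1), targets))   # selects index 2
```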
Another possibility is to have the cameras 52 send an image showing the relative positions of the retroreflector targets from the perspective of the laser tracker. The operator may then select the retroreflector target of interest.
The handheld appliances described hereinabove may be any of several different types. They might be remote controls, mobile phones (including smart phones), electronic pads, or keypads. Although wireless communication is advantageous in most cases, it is also possible to use the method described herein with a wired method—in other words, with the handheld appliance communicating with the laser tracker or associated computer through a wired connection.
While preferred embodiments have been shown and described, various modifications and substitutions may be made thereto without departing from the spirit and scope of the invention.
The presently disclosed embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims, rather than the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
This application is a Non-Provisional of U.S. Provisional Application Ser. No. 61/702,864, filed Sep. 19, 2012. This application is also a Continuation-In-Part of U.S. patent application Ser. No. 13/340,730, filed Dec. 30, 2011; U.S. patent application Ser. No. 13/340,730 is a Continuation-In-Part of U.S. patent application Ser. No. 13/090,889, filed on Apr. 20, 2011, which claims priority to U.S. Provisional Patent Application No. 61/326,294, filed Apr. 21, 2010, the entire contents of each of which are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
4413907 | Lane | Nov 1983 | A |
4560270 | Wiklund et al. | Dec 1985 | A |
4714339 | Lau et al. | Dec 1987 | A |
4731879 | Sepp et al. | Mar 1988 | A |
4777660 | Gould et al. | Oct 1988 | A |
4790651 | Brown et al. | Dec 1988 | A |
4983021 | Fergason | Jan 1991 | A |
5051934 | Wiklund | Sep 1991 | A |
5121242 | Kennedy | Jun 1992 | A |
5137354 | deVos et al. | Aug 1992 | A |
5138154 | Hotelling | Aug 1992 | A |
5267014 | Prenninger | Nov 1993 | A |
5313409 | Wiklund et al. | May 1994 | A |
5347306 | Nitta | Sep 1994 | A |
5440326 | Quinn | Aug 1995 | A |
5532816 | Spann et al. | Jul 1996 | A |
5594169 | Field et al. | Jan 1997 | A |
D378751 | Smith | Apr 1997 | S |
5698784 | Hotelling et al. | Dec 1997 | A |
5724264 | Rosenberg et al. | Mar 1998 | A |
5767952 | Ohtomo et al. | Jun 1998 | A |
5825350 | Case, Jr. et al. | Oct 1998 | A |
5828057 | Hertzman et al. | Oct 1998 | A |
5898421 | Quinn | Apr 1999 | A |
5957559 | Rueb et al. | Sep 1999 | A |
5973788 | Pettersen et al. | Oct 1999 | A |
6023326 | Katayama et al. | Feb 2000 | A |
6034722 | Viney et al. | Mar 2000 | A |
6036319 | Rueb et al. | Mar 2000 | A |
6085155 | Hayase et al. | Jul 2000 | A |
6111563 | Hines | Aug 2000 | A |
6133998 | Monz et al. | Oct 2000 | A |
6166809 | Pettersen et al. | Dec 2000 | A |
6171018 | Ohtomo et al. | Jan 2001 | B1 |
6222465 | Kumar et al. | Apr 2001 | B1 |
6262801 | Shibuya et al. | Jul 2001 | B1 |
6295174 | Ishinabe et al. | Sep 2001 | B1 |
6344846 | Hines | Feb 2002 | B1 |
6347290 | Bartlett | Feb 2002 | B1 |
6353764 | Imagawa et al. | Mar 2002 | B1 |
6369794 | Sakurai et al. | Apr 2002 | B1 |
6433866 | Nichols | Aug 2002 | B1 |
6445446 | Kumagai et al. | Sep 2002 | B1 |
6462810 | Muraoka et al. | Oct 2002 | B1 |
6559931 | Kawamura et al. | May 2003 | B2 |
6567101 | Thomas | May 2003 | B1 |
6573883 | Bartlett | Jun 2003 | B1 |
6573981 | Kumagai et al. | Jun 2003 | B2 |
6587244 | Ishinabe et al. | Jul 2003 | B1 |
6624916 | Green et al. | Sep 2003 | B1 |
6646732 | Ohtomo et al. | Nov 2003 | B2 |
6667798 | Markendorf et al. | Dec 2003 | B1 |
6668466 | Bieg et al. | Dec 2003 | B1 |
6681031 | Cohen et al. | Jan 2004 | B2 |
6802133 | Jordil et al. | Oct 2004 | B2 |
6847436 | Bridges | Jan 2005 | B2 |
6935036 | Raab et al. | Aug 2005 | B2 |
6957493 | Kumagai et al. | Oct 2005 | B2 |
6964113 | Bridges et al. | Nov 2005 | B2 |
6965843 | Raab et al. | Nov 2005 | B2 |
6980881 | Greenwood et al. | Dec 2005 | B2 |
6996912 | Raab et al. | Feb 2006 | B2 |
7022971 | Ura et al. | Apr 2006 | B2 |
7055253 | Kaneko | Jun 2006 | B2 |
7072032 | Kumagai et al. | Jul 2006 | B2 |
7129927 | Mattsson | Oct 2006 | B2 |
7130035 | Ohtomo et al. | Oct 2006 | B2 |
7168174 | Piekutowski | Jan 2007 | B2 |
7193695 | Sugiura | Mar 2007 | B2 |
7222021 | Ootomo et al. | May 2007 | B2 |
7230689 | Lau | Jun 2007 | B2 |
7233316 | Smith et al. | Jun 2007 | B2 |
7248374 | Bridges | Jul 2007 | B2 |
7274802 | Kumagai et al. | Sep 2007 | B2 |
7285793 | Husted | Oct 2007 | B2 |
7304729 | Yasutomi et al. | Dec 2007 | B2 |
7307710 | Gatsios et al. | Dec 2007 | B2 |
7312862 | Zumbrunn et al. | Dec 2007 | B2 |
7321420 | Yasutomi et al. | Jan 2008 | B2 |
7327446 | Cramer et al. | Feb 2008 | B2 |
7345748 | Sugiura et al. | Mar 2008 | B2 |
7352446 | Bridges et al. | Apr 2008 | B2 |
7388654 | Raab et al. | Jun 2008 | B2 |
7388658 | Glimm | Jun 2008 | B2 |
7401783 | Pryor | Jul 2008 | B2 |
7423742 | Gatsios et al. | Sep 2008 | B2 |
7446863 | Nishita et al. | Nov 2008 | B2 |
7466401 | Cramer et al. | Dec 2008 | B2 |
7474388 | Ohtomo et al. | Jan 2009 | B2 |
7503123 | Matsuo et al. | Mar 2009 | B2 |
7541965 | Ouchi et al. | Jun 2009 | B2 |
7552539 | Piekutowski | Jun 2009 | B2 |
7555766 | Kondo et al. | Jun 2009 | B2 |
7562459 | Fourquin et al. | Jul 2009 | B2 |
7564538 | Sakimura et al. | Jul 2009 | B2 |
7583375 | Cramer et al. | Sep 2009 | B2 |
7634381 | Westermark et al. | Dec 2009 | B2 |
7705830 | Westerman et al. | Apr 2010 | B2 |
7728963 | Kirschner | Jun 2010 | B2 |
7765084 | Westermark et al. | Jul 2010 | B2 |
7800758 | Bridges et al. | Sep 2010 | B1 |
7804602 | Raab | Sep 2010 | B2 |
7903237 | Li | Mar 2011 | B1 |
8237934 | Cooke et al. | Aug 2012 | B1 |
8320708 | Kurzweil et al. | Nov 2012 | B2 |
8379224 | Piasse et al. | Feb 2013 | B1 |
8472029 | Bridges et al. | Jun 2013 | B2 |
20020148133 | Bridges et al. | Oct 2002 | A1 |
20030014212 | Ralston et al. | Jan 2003 | A1 |
20030206285 | Lau | Nov 2003 | A1 |
20050185182 | Raab et al. | Aug 2005 | A1 |
20050197145 | Chae et al. | Sep 2005 | A1 |
20050254043 | Chiba | Nov 2005 | A1 |
20060009929 | Boyette et al. | Jan 2006 | A1 |
20060055662 | Rimas-Ribikauskas et al. | Mar 2006 | A1 |
20060055685 | Rimas-Ribikauskas et al. | Mar 2006 | A1 |
20060146009 | Syrbe et al. | Jul 2006 | A1 |
20060161379 | Ellenby et al. | Jul 2006 | A1 |
20060164384 | Smith et al. | Jul 2006 | A1 |
20060164385 | Smith et al. | Jul 2006 | A1 |
20060164386 | Smith et al. | Jul 2006 | A1 |
20060262001 | Ouchi et al. | Nov 2006 | A1 |
20070016386 | Husted | Jan 2007 | A1 |
20070019212 | Gatsios et al. | Jan 2007 | A1 |
20070236452 | Venkatesh et al. | Oct 2007 | A1 |
20080122786 | Pryor et al. | May 2008 | A1 |
20080229592 | Hinderling et al. | Sep 2008 | A1 |
20080309949 | Rueb | Dec 2008 | A1 |
20090033621 | Quinn et al. | Feb 2009 | A1 |
20090171618 | Kumagai et al. | Jul 2009 | A1 |
20090239581 | Lee | Sep 2009 | A1 |
20090240372 | Bordyn et al. | Sep 2009 | A1 |
20090240461 | Makino et al. | Sep 2009 | A1 |
20090240462 | Lee | Sep 2009 | A1 |
20100091112 | Veeser et al. | Apr 2010 | A1 |
20100128259 | Bridges et al. | May 2010 | A1 |
20100149518 | Nordenfelt et al. | Jun 2010 | A1 |
20100234094 | Gagner et al. | Sep 2010 | A1 |
20100235786 | Maizels et al. | Sep 2010 | A1 |
20100265316 | Sali et al. | Oct 2010 | A1 |
20100284082 | Shpunt et al. | Nov 2010 | A1 |
20110007154 | Vogel et al. | Jan 2011 | A1 |
20110023578 | Grasser | Feb 2011 | A1 |
20110025827 | Shpunt et al. | Feb 2011 | A1 |
20110035952 | Roithmeier | Feb 2011 | A1 |
20110043620 | Svanholm et al. | Feb 2011 | A1 |
20110052006 | Gurman et al. | Mar 2011 | A1 |
20110069322 | Hoffer, Jr. | Mar 2011 | A1 |
20110107611 | Desforges et al. | May 2011 | A1 |
20110107612 | Ferrari et al. | May 2011 | A1 |
20110107613 | Tait | May 2011 | A1 |
20110107614 | Champ | May 2011 | A1 |
20110112786 | Desforges et al. | May 2011 | A1 |
20110181872 | Dold et al. | Jul 2011 | A1 |
20110260033 | Steffensen et al. | Oct 2011 | A1 |
20120050255 | Thomas et al. | Mar 2012 | A1 |
20120120391 | Dold et al. | May 2012 | A1 |
20120120415 | Steffensen et al. | May 2012 | A1 |
20120327390 | Bridges et al. | Dec 2012 | A1 |
Number | Date | Country |
---|---|---|
2811444 | Mar 2012 | CA |
0797076 | Sep 1997 | EP |
0919831 | Jun 1999 | EP |
0957336 | Nov 1999 | EP |
2004108939 | Apr 2008 | JP |
9534849 | Dec 1995 | WO |
0223121 | Mar 2002 | WO |
0237466 | May 2002 | WO |
03062744 | Jul 2003 | WO |
03073121 | Sep 2003 | WO |
2007079601 | Jul 2007 | WO |
2010100043 | Sep 2010 | WO |
2010148526 | Dec 2010 | WO |
2011057130 | May 2011 | WO |
Entry |
---|
Automated Precision, Inc., Product Specifications, Radian, Featuring INNOVO Technology, info@apisensor.com, Copyright 2011. |
FARO Technical Institute, Basic Measurement Training Workbook, Version 1.0, FARO Laser Tracker, Jan. 2008, Students Book, FAO CAM2 Measure. |
Kollorz, et al., “Gesture recognition with a time-of-flight camera”, International Journal of Intelligent Systems Technologies and Applications, vol. 5, No. 3/4, pp. 334-343, [Retreived Aug. 11, 2011; http://www5.informatik.uni-erlangen.de/Forschung/Publikationen/2008/Kollorz08-GRW.pdf] (2008). |
International Search Report of the International Application No. PCT/US2011/033360 mailed Feb. 29, 2012. |
International Search Report of the International Searching Authority for Application No. PCT/US2012/070283; Date of Mailing Mar. 27, 2013. |
Hecht, Jeff, Photonic Frontiers: Gesture Recognition: Lasers Bring Gesture Recognition to the Home, Laser Focus World, pp. 1-5, [Retrieved On-Line Mar. 3, 2011], http://www.optoiq.com/optoiq-2/en-us/index/photonics-technologies-applications/lfw-display/lfw-articles-toolstemplate.articles.optoiq2.photonics-technologies.technology-products.imaging-—detectors.2011.01.lasers-bringgesture-recognition-to-the-home.html. |
New River Kinematics, SA Arm—The Ultimate Measurement Software for Arms, Software Release! SA Sep. 30, 2010, [On-line], http://www.kinematics.com/news/software-release-sa20100930.html (1 of 14), [Retrieved Apr. 13, 2011 11:40:47 AM]. |
Turk, et al., “Perceptual Interfaces”, UCSB Technical Report 2003-33, pp. 1-43 [Retreived Aug. 11, 2011, http://www.cs.ucsb.edu/research/tech—reports/reports/2003-33.pdf] (2003). |
Li, et al., “Real Time Hand Gesture Recognition using a Range Camera”, Australasian Conference on Robotics and Automation (ACRA), [Retreived Aug. 10, 2011, http://www.araa.asn.au/acra/acra2009/papers/pap128s1.pdf] pp. 1-7 (2009). |
Cao, et al.“VisionWand: Interaction Techniques for Large Displays using a Passive Wand Tracked in 3D”, Proceedings of the 16th Annual ACM Symposium on User Interface Software and Technology, UIST, vol. 5, issue 2, pp. 173-182, (Jan. 2003). |
Written Opinion of the International Searching Authority for International Application No. PCT/US2011/033360 mailed Feb. 29, 2011. |
Written Opinion of the International Searching Authority for Application No. PCT/US2012/070283; Date of Mailing Mar. 27, 2013. |
International Search Report of the International Searching Authority for Application No. PCT/US2013/059592; Date of Mailing Dec. 10, 2013. |
Rahman, et al., “Spatial-Geometric Approach to Physical Mobile Interaction Based on Accelerometer and IR Sensory Data Fusion”, ACM Transactions on Multimedia Computing, Communications and Applications, vol. 6, No. 4, Article 28, Publication date: Nov. 2010. |
Written Opinion of the International Searching Authority for Application No. PCT/US2013/059592; Date of Mailing Dec. 10, 2013. |
International Search Report of the International Searching Authority for Application No. PCT/US2012/028984; Date of Mailing Jul. 19, 2012. |
International Search Report of the International Searching Authority for Application No. PCT/US2012/027083; Date of Mailing Jun. 29, 2012. |
Leica Geosystems Metrology, “Leica Absolute Tracker AT401, White Paper,” Hexagon AB; 2010. |
Leica Geosystems AG ED—“Leica Laser Tracker System”, Internet Citation, Jun. 28, 2012, XP002678836, Retrieved from the Internet: URL:http://www.a-solution.com.au/pages/downloads/LTD500—Brochure—EN.pdf. |
Maekynen, A. J. et al., Tracking Laser Radar for 3-D Shape Measurements of Large Industrial Objects Based on Time-of-Flight Laser Rangefinding and Position-Sensitive Detection Techniques, IEEE Transactions on Instrumentation and Measurement, vol. 43, No. 1, Feb. 1, 1994, pp. 40-48, XP000460026, ISSN: 0018-9456, DOI 10.1109/19.286353, the whole document. |
Leica Geosystems: “TPS1100 Professional Series”, 1999, Retrieved from the Internet: URL:http://www.estig.ipbeja.pt/˜legvm/top—civil/TPS1100%20-%20A%20New%20Generation%20of%20Total%20Stations.pdf, [Retrieved on Jul. 2012] the whole document. |
Written Opinion of the International Searching Authority for Application No. PCT/US2012/028984; Date of Mailing Jul. 19, 2012. |
Written Opinion of the International Searching Authority for International Application PCT/US2012/027083; Date of Mailing Jun. 29, 2012. |
Number | Date | Country | |
---|---|---|---|
20130229512 A1 | Sep 2013 | US |
Number | Date | Country | |
---|---|---|---|
61702864 | Sep 2012 | US | |
61326294 | Apr 2010 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13340730 | Dec 2011 | US |
Child | 13851221 | US | |
Parent | 13090889 | Apr 2011 | US |
Child | 13340730 | US |