The present invention relates in general to methods for measuring targets and in particular to methods for measuring the center of a spherical target containing a retroreflector.
There is a class of instruments that measures the coordinates of a point by sending a laser beam to a retroreflector target in contact with the point. The instrument determines the coordinates of the point by measuring the distance and the two angles to the target. The distance is measured with a distance-measuring device such as an absolute distance meter or an interferometer. The angles are measured with an angle-measuring device such as an angular encoder. A gimbaled beam-steering mechanism within the instrument directs the laser beam to the point of interest.
The laser tracker is a particular type of coordinate-measuring device that tracks the retroreflector target with one or more laser beams it emits. There is another category of instruments, known as total stations or tachymeters, that may measure a retroreflector or a point on a diffusely scattering surface. Laser trackers, which typically have accuracies on the order of a thousandth of an inch and as good as one or two micrometers under certain circumstances, are usually much more accurate than total stations. The broad definition of laser tracker, which includes total stations, is used throughout this application.
Ordinarily the laser tracker sends a laser beam to a retroreflector target. A common type of retroreflector target is the spherically mounted retroreflector (SMR), which includes a cube-corner retroreflector embedded within a metal sphere. The cube-corner retroreflector includes three mutually perpendicular mirrors. The vertex, which is the common point of intersection of the three mirrors, is located near the center of the sphere. Because of this placement of the cube corner within the sphere, the perpendicular distance from the vertex to any surface on which the SMR rests remains nearly constant, even as the SMR is rotated. Consequently, the laser tracker can measure the 3D coordinates of a surface by following the position of an SMR as it is moved over the surface. Stating this another way, the laser tracker needs to measure only three degrees of freedom (one radial distance and two angles) to fully characterize the 3D coordinates of a surface.
Some laser trackers have the ability to measure six degrees of freedom (DOF), which may include three translations, such as x, y, and z, and three rotations, such as pitch, roll, and yaw. An exemplary six-DOF laser tracker system is described in U.S. Pat. No. 7,800,758 ('758) to Bridges, et al., incorporated by reference herein. The '758 patent discloses a probe that holds a cube corner retroreflector, onto which marks have been placed. The cube corner retroreflector is illuminated by a laser beam from the laser tracker, and the marks on the cube corner retroreflector are captured by an orientation camera within the laser tracker. The three orientational degrees of freedom, for example, the pitch, roll and yaw angles, are calculated based on the image obtained by the orientation camera. The laser tracker measures a distance and two angles to the vertex of the cube-corner retroreflector. When the distance and two angles, which give three translational degrees of freedom of the vertex, are combined with the three orientational degrees of freedom obtained from the orientation camera image, the position of a probe tip, arranged at a prescribed position relative to the vertex of the cube corner retroreflector, can be found. Such a probe tip may be used, for example, to measure the coordinates of a “hidden” feature that is out of the line of sight of the laser beam from the laser tracker.
As explained hereinabove, the vertex of a cube corner retroreflector within an SMR is ideally placed at the exact center of the sphere into which the cube corner is embedded. In practice, the position of the vertex is off the center of the sphere by up to a few thousandths of an inch. In many cases, the difference in the positions of the vertex and the sphere center is known to high accuracy, but this data is not used to correct the tracker readings because the orientation of the SMR is not known. In the accurate measurements made with laser trackers, this error in the centering of the cube corner retroreflector in the sphere is sometimes larger than the errors from the distance and angle meters within the laser tracker. Consequently, there is a need for a method to correct this centering error.
Most of the SMRs in use today contain open-air cube corner retroreflectors. There are some SMRs that use glass cube corner retroreflectors, but these have limited accuracy. Because of the bending of the light entering such glass cube corners, the light appears to travel in a direction that is not the true direction within the cube corner. The error this produces can be minimized by moving the vertex of the cube corner behind the center of the sphere. An example of the calculations involved in minimizing this error is given in U.S. Pat. No. 7,388,654 to Raab, et al., the contents of which are incorporated by reference. However, there is no one distance of movement that eliminates the tracker errors in using such a retroreflector over the full range of angles of incidence over which light can enter the cube corner. As a result, SMRs made with glass cube corners tend to be made very small, as this reduces error, and they tend to be used in applications where the highest accuracy is not required. However, SMRs made with glass cube corners have a significant advantage compared to SMRs made with open-air cube corners: they have a wider acceptance angle. In other words, light may enter a glass cube corner at a larger angle of incidence than an open-air cube corner without being clipped. Consequently, there is a need for a method of measuring a relatively large SMR containing a glass cube corner with high accuracy. The need is essentially one of finding the center of the SMR spherical surface, regardless of the position of the glass cube corner, and in this respect it is similar to the need described above for SMRs containing open-air cube corners.
More generally, there is a need for a method of finding the center of a target having a spherical surface and containing a retroreflector, regardless of the type of retroreflector. For example, a different type of retroreflector put into spherical surfaces is the cateye retroreflector. Another example is the photogrammetric dot—a small circle of reflective material—which is sometimes centered in a sphere. There are errors in the centering of cateye retroreflectors and photogrammetric dots in spheres, just as in centering cube corner retroreflectors in spheres. Hence there is a general need for a method of finding the center of a target having a spherical surface and containing a retroreflector.
According to an embodiment of the present invention, a method of obtaining the characteristics of a target by a device is provided. The method comprises the steps of providing the target, wherein the target has a target frame of reference and includes a first retroreflector and a body, the body containing an opening, the opening sized to hold the first retroreflector, the opening open to the exterior of the body, the first retroreflector at least partially disposed in the opening, the first retroreflector having a first retroreflector reference point in the target frame of reference. A spherical contact element is provided having a region of spherical curvature rigidly fixed with respect to the body, the spherical contact element having a sphere center and a sphere radius. A device is provided, wherein the device has a device frame of reference and a first light source, the device being configured to measure a distance and two angles from the device to the first retroreflector reference point. An identifier element is provided located on the body, the identifier element configured to store first information, the identifier element being one of a bar code pattern and a radio-frequency identification tag, the first information including one of the sphere radius and a serial number, the serial number accessed by a processor to obtain the sphere radius. A workpiece is provided having a workpiece surface. The spherical contact element is placed in contact with the workpiece surface. The first retroreflector is illuminated with light from the first light source to provide a first reflected light that is returned to the device. A first distance and a first set of two angles are measured from the device to the first retroreflector reference point based at least in part on the first reflected light, the first distance based at least in part on the speed of light over the path traveled by the light from the device to the first retroreflector reference point.
The first information is read with a first reader attached to the device, the first reader being one of a bar code scanner, a radio-frequency identification reader, and a camera. A three-dimensional coordinate of a point is calculated on the workpiece surface based at least in part on the first distance, the first set of two angles, and the first information.
Embodiments will now be described, by way of example only, with reference to the accompanying drawings which are meant to be exemplary, not limiting, and wherein like elements are numbered alike in several figures, in which:
An exemplary laser tracker 10 is illustrated in
The laser tracker 10 is a device that has a device frame of reference 30. The device frame of reference may have as its origin the gimbal point 22. The frame of reference may be fixed with respect to the azimuth base 16, which is typically stationary with respect to the surroundings. The device frame of reference may be represented by a variety of coordinate systems. One type of coordinate system is a Cartesian coordinate system having three perpendicular axes x, y, and z. Another type of coordinate system is a spherical coordinate system. A point 74 within the device frame of reference 30 may be represented in a spherical coordinate system by one radial distance 73 (r), a first (zenith) angle 72 (θ), and a second (azimuth) angle 71 (φ). The angle θ is obtained by using the projection of the point 74 onto the z axis. The angle φ is obtained by using the projection of the point 74 onto the x-y plane. The laser tracker 10 inherently makes measurements in a spherical coordinate system, but a point measured in spherical coordinates may be easily converted to Cartesian coordinates.
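The conversion from the tracker's native spherical coordinates to Cartesian coordinates can be sketched as follows; the function name is illustrative, and the convention (zenith angle θ measured from the z axis, azimuth angle φ measured in the x-y plane) follows the description above:

```python
import math

def spherical_to_cartesian(r, theta, phi):
    """Convert tracker spherical coordinates (r, theta, phi) to Cartesian.

    r     -- radial distance to the measured point
    theta -- zenith angle, measured from the z axis
    phi   -- azimuth angle, measured from the x axis in the x-y plane
    """
    x = r * math.sin(theta) * math.cos(phi)
    y = r * math.sin(theta) * math.sin(phi)
    z = r * math.cos(theta)
    return (x, y, z)
```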
The target 26 has a target frame of reference 40. The target frame of reference may be represented, for example, using Cartesian coordinates x, y, and z. The x, y, and z axes of the target frame of reference 40 move with the target 26 and are not necessarily parallel to the corresponding device axes x, y, and z of the device frame of reference 30. The target 26 may be placed in contact with the workpiece surface 61 at a point 63. To find the three-dimensional (3D) coordinates of the point 63, the tracker first determines the center of the target 26 using the distance and two angles it has measured. It may also be used to account for a vector offset of the retroreflector reference point (e.g., cube-corner vertex) with respect to the center of the spherical contact surface of the target 26 using methods described herein below. To move from the center of the target to the surface of the workpiece the position of the center point is offset by an amount equal to the radius of the spherical target surface. In an embodiment, the direction of the offset is found by measuring several points near to the contact point 63 to determine the surface normal at the point 63.
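The offset step described above—moving from the measured sphere center to the workpiece surface along the surface normal—can be sketched as follows. The plane fit to nearby measured centers and the function name are illustrative assumptions; a practical implementation would also handle surface curvature and measurement noise:

```python
import numpy as np

def contact_point(center, nearby_centers, sphere_radius, approach_dir):
    """Offset a measured sphere center to the workpiece surface.

    center         -- 3D coordinates of the target's sphere center at contact
    nearby_centers -- three or more sphere-center positions measured with the
                      target placed near the contact point (used to estimate
                      the surface normal)
    sphere_radius  -- radius of the target's spherical contact surface
    approach_dir   -- rough direction pointing toward the surface, used only
                      to choose the sign of the fitted normal
    """
    pts = np.asarray(nearby_centers, dtype=float)
    centroid = pts.mean(axis=0)
    # Fit a plane to the nearby centers; its normal approximates the surface
    # normal, since the centers lie on a surface offset by the sphere radius.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    # Orient the normal so it points from the surface toward the target.
    if np.dot(normal, approach_dir) > 0:
        normal = -normal
    # Move from the sphere center to the surface along the inward normal.
    return np.asarray(center, dtype=float) - sphere_radius * normal
```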
Laser beam 46 may include one or more laser wavelengths. For the sake of clarity and simplicity, a steering mechanism of the sort shown in
In exemplary laser tracker 10, locator cameras 52 and light sources 54 are located on payload 15. Light sources 54 illuminate one or more retroreflector targets 26. In an embodiment, light sources 54 are LEDs electrically driven to repetitively emit pulsed light. Each locator camera 52 includes a photosensitive array and a lens placed in front of the photosensitive array. The photosensitive array may be a CMOS or CCD array, for example. In an embodiment, the lens has a relatively wide field of view, for example, 30 or 40 degrees. The purpose of the lens is to form an image on the photosensitive array of objects within the field of view of the lens. Usually at least one light source 54 is placed near locator camera 52 so that light from light source 54 is reflected off each retroreflector target 26 onto locator camera 52. In this way, retroreflector images are readily distinguished from the background on the photosensitive array, as their image spots are brighter than background objects and are pulsed. In an embodiment, there are two locator cameras 52 and two light sources 54 placed about the line of laser beam 46. By using two locator cameras in this way, the principle of triangulation can be used to find the three-dimensional coordinates of any SMR within the field of view of the locator camera. In addition, the three-dimensional coordinates of an SMR can be monitored as the SMR is moved from point to point. A use of two locator cameras for this purpose is described in U.S. Published Patent Application No. 2010/0128259 to Bridges, et al., the contents of which are herein incorporated by reference.
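The triangulation principle mentioned above can be sketched with a minimal midpoint method. Each camera contributes a ray toward the bright retroreflector spot on its photosensitive array; the ray origins and directions (obtained from camera calibration) are assumed given, and the function name is illustrative:

```python
import numpy as np

def triangulate(origin1, dir1, origin2, dir2):
    """Locate a retroreflector from two locator-camera rays.

    The two rays generally do not intersect exactly, so return the midpoint
    of the shortest segment connecting them (closest-point triangulation).
    """
    d1 = np.asarray(dir1, dtype=float)
    d2 = np.asarray(dir2, dtype=float)
    d1 /= np.linalg.norm(d1)
    d2 /= np.linalg.norm(d2)
    o1 = np.asarray(origin1, dtype=float)
    o2 = np.asarray(origin2, dtype=float)
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b          # zero only when the rays are parallel
    t1 = (b * e - c * d) / denom   # parameter of closest point on ray 1
    t2 = (a * e - b * d) / denom   # parameter of closest point on ray 2
    return (o1 + t1 * d1 + o2 + t2 * d2) / 2
```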
As shown in
In
To determine the three-dimensional coordinates of the probe tip 62, the tracker measures six degrees of freedom of the probe tip. It measures the three translational degrees of freedom of the retroreflector reference point. For the cube corner retroreflector described hereinabove, the retroreflector reference point is the vertex of the cube corner. For a cateye retroreflector made of a single sphere (for example, using glass with refractive index equals two), the reference point is the center of the cateye sphere. For a cateye retroreflector made of two hemispherical elements, the reference point is centered on the two hemispherical elements on the plane that separates them. For a reflective photogrammetry target in the shape of a flat circle, the reference point is the center of the circle.
The three translational degrees of freedom may be described in a Cartesian frame of reference with x, y, and z coordinates. Alternatively, the three translational degrees of freedom may be described in a spherical frame of reference with radial distance r, azimuth angle phi, and zenith angle theta. The laser tracker 10 measures a distance r using either an interferometer or an absolute distance meter (ADM). It measures an azimuth angle phi and a zenith angle theta using angular encoders. Hence the laser tracker measures in a spherical coordinate system, although the coordinate values for any measured point may be converted into coordinates in any other desired coordinate system.
As stated hereinabove, some targets, such as six-DOF probe 60, require a tracker configured for six-DOF measurements. In addition to measuring the three translational degrees of freedom, the tracker must also be able to measure three orientational degrees of freedom. Together, the three translational and three orientational degrees of freedom produce six independent degrees of freedom that fully specify (fully constrain) the position of every point within a rigid body—for example, the rigid body six-DOF probe 60.
The three degrees of orientational freedom may be described in a variety of ways. Methods for describing three degrees of orientational freedom include Euler angles and Tait-Bryan angles, the latter of which include the well known pitch-yaw-roll and heading-elevation-bank descriptions. In the present application, the term three degrees of orientational freedom should be understood to mean three independent degrees of freedom. For example, a rotation about an x axis, a y axis, and a third axis in the x-y plane would represent only two degrees of freedom as the three axes are not independent. In other words, the three axes do not provide a method of specifying the rotation about the z axis and hence do not represent three independent degrees of freedom.
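The dependence described above can be checked numerically. To first order, small rotations compose by adding their scaled axis vectors, so the number of independent orientational degrees of freedom equals the rank of the set of axis vectors. A minimal sketch (variable names are illustrative):

```python
import numpy as np

# To first order, small rotations about unit axes combine by adding the axis
# vectors scaled by the rotation angles.  Three axes therefore provide three
# independent orientational degrees of freedom only if the axis vectors span
# three-dimensional space.

independent_axes = np.array([[1.0, 0.0, 0.0],   # x axis
                             [0.0, 1.0, 0.0],   # y axis
                             [0.0, 0.0, 1.0]])  # z axis

dependent_axes = np.array([[1.0, 0.0, 0.0],     # x axis
                           [0.0, 1.0, 0.0],     # y axis
                           [0.7, 0.7, 0.0]])    # third axis in the x-y plane

rank_good = np.linalg.matrix_rank(independent_axes)  # 3: spans all rotations
rank_bad = np.linalg.matrix_rank(dependent_axes)     # 2: no way to specify a
                                                     # rotation about z
```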
It is possible to have several interconnected objects that move independently. In such a situation, more than three degrees of freedom may be required to fully specify the motion of the collection of objects. Generally, a six-DOF probe such as probe 60 moves as a unit so that three degrees of orientational freedom are sufficient to fully describe the orientation of every point on the probe structure.
It should be similarly understood that three degrees of translational freedom means three independent degrees of translational freedom. Another way of saying this is that the three directions corresponding to the three degrees of translational freedom form a basis set in three-dimensional space. In other words, each of the three directions corresponding to a degree of translational freedom has a component orthogonal to each of the other two directions.
Light from source element 410 passes through beam splitter 420. Light from source element 405 reflects off mirror 415 and beam splitter 420. If source elements 405, 410 contain light of different wavelengths, beam splitter 420 may advantageously be a dichroic beam splitter that transmits the wavelength of light emitted by source element 410 and reflects the wavelength of light emitted by source element 405.
Most of the light from beam splitter 420 passes through beam splitters 425 and 430. A small amount of light is reflected off each of these beam splitters and is lost. The light passes through beam expander 435. The beam expander 435 expands the size of the beam on the way out of the tracker. The laser light 440 leaving the tracker 10 travels to a retroreflector target 26 or a retroreflector probe 60. A portion of this laser light returns to the tracker. The beam expander reduces the size of the beam on the way back into the tracker. In an embodiment, some of the light reflects off beam splitter 430 and travels to orientation camera 445. The orientation camera is used to obtain the three degrees of orientational freedom of the six-DOF probe 60 or any other six-DOF device. The orientation camera 445 contains a lens system 446 and a photosensitive array 447. It may use a motor to adjust the size of the image. The principle of operation of the orientation camera is explained in the '758 patent.
Part of the light travels to beam splitter 425. Most of the light passes on to elements 405, 410, but a small amount is split off and strikes position detector 450. In some cases, the light may pass through a lens after reflecting off beam splitter 425 but before striking position detector 450. The position detector 450 may be of several types—for example, a position sensitive detector or a photosensitive array. A position sensitive detector might be a lateral effect detector or a quadrant detector, for example. A photosensitive array might be a CMOS or CCD array, for example. Position detectors are responsive to the position of the returning light beam. The motors attached to the azimuth mechanical axes and the zenith mechanical axes are adjusted by a control system within the tracker 10 to keep the returning light beam centered, as nearly as possible, on the position detector 450.
As explained in the '758 patent, the orientation camera 445 provides the tracker 10 with one method of measuring the six degrees of freedom of a target. Other methods are possible, and the methods described herein for measuring the center of a spherical target are also applicable to these other methods.
The traditional SMR 26 includes a body having a spherical exterior portion and a retroreflector. The body contains a cavity sized to hold a cube corner retroreflector, which is attached to the cavity. The spherical exterior portion has a spherical center.
A cube corner retroreflector includes three planar reflectors that are mutually perpendicular. The three planar reflectors intersect at a common vertex, which in the ideal case is a point. Each of the planar reflectors has two intersection junctions, each shared with an adjacent planar reflector, for a total of three intersection junctions within the cube corner retroreflector. The cube corner retroreflector has an interior portion that is a region of space surrounded on three sides by the planar reflectors.
Cube corner retroreflectors may be open-air cube corners or glass cube corners. Open-air cube corner retroreflectors have an interior portion of air, while glass cube corner retroreflectors have an interior portion of glass. The glass cube corner retroreflector is a type of glass prism. One surface of the glass prism, called the top surface, is distal to the vertex.
The SMR is designed for use with a laser tracker for measuring three degrees of freedom. A more powerful version of the SMR is the spherical six-DOF target. Some examples of six-DOF targets are shown in
One type of six-DOF target uses cube corners containing marks or non-reflecting portions, as explained in the '758 patent. Each intersection junction of the cube corner retroreflector may have a non-reflecting portion. The non-reflecting portion does not necessarily suppress all of the light that strikes it; rather, the non-reflecting portions are configured to greatly reduce the return of light to the tracker. The reduced return of light may be achieved, for example, by making the non-reflecting portion from (a) an absorbing material such as an absorbing coloration or an absorbing tape, (b) a scattering surface texture or material, (c) a curved reflective surface that results in a diverging pattern of light, or (d) a planar surface that reflects the light away from the laser tracker. Other methods of making a non-reflecting portion that achieves a reduced return of light may be utilized in light of the teachings herein, as should be apparent to one of ordinary skill in the art.
There are at least two common methods for making open-air cube corner retroreflectors: replication and assembly of glass panels.
Usually, the intersection junctions of the master element 510 are not perfectly sharp. One reason for this lack of sharpness is the difficulty of machining such sharp intersection junctions. Another reason is that the intersection junctions tend to chip during repeated replications if the junctions are too sharp. Instead, the intersection junctions are usually rounded with a small fillet or angled with a small bevel. Usually, for cube corners that are placed in spherically mounted retroreflectors used to measure three degrees of freedom, these features are made as small as practical. For example, a fillet applied to the intersection junctions of master element 510 might have a radius of curvature of 0.003 inch. This radius of curvature is transferred to the intersection junctions of slug 520. The fillet or bevel applied to the cube corner retroreflector is a non-reflecting portion according to the explanation given hereinabove. In other words, very little light will return to the laser tracker after striking a fillet or bevel applied to the intersection junctions of the cube corner retroreflector.
If the cube corner retroreflector is to be used in conjunction with a system to measure six degrees of freedom similar to that described in the '758 patent, then it may be desirable to broaden the non-reflecting portions observed by the orientation camera within the laser tracker. If a six-DOF target is only a few meters away from the tracker, then the narrow non-reflecting portions commonly present in high quality SMRs may be wide enough to be easily seen by the orientation camera. However, if the six-DOF target is located farther away—for example, up to 30 meters from the laser tracker—then the non-reflecting portions will need to be widened to make them visible on the orientation camera. For example, the non-reflecting portions might need to be about 0.5 mm wide to be clearly seen by the orientation camera.
In
The second common method of making open-air cube corner retroreflectors is to join mirror panels into a cube-corner assembly. Three mirror panels are joined together to be mutually perpendicular. There are slight gaps at the intersection regions between glass panels. Light that strikes the gaps is not reflected back to the laser tracker and so represents non-reflecting portions. If thicker lines are desired, these may be obtained, for example, by (a) increasing the width of the gap, (b) coloring (darkening) the mirrors over the desired portions, or (c) attaching low reflection material (e.g., black adhesive tape) at the intersection junctions.
Referring now to
FIG. 12 of the '758 patent illustrates an image pattern 100 in the prior art appearing on an orientation camera within a laser tracker. The three lines shown in this figure were obtained by illuminating a cube corner retroreflector onto which non-reflecting portions were placed on each of the three intersection junctions of the three planar surfaces of the cube corner retroreflector. The vertex of the cube corner retroreflector corresponds to the intersection point of the lines. Each of the lines in this figure extends on both sides of the intersection (vertex) point because each non-reflecting portion blocks laser light on the way into and on the way out of the cube corner.
A potential problem with non-reflecting portions placed on a cube corner retroreflector to produce the pattern of FIG. 12 of the '758 patent is that a large amount of light may be lost near the center of the retroreflector where the optical power is the highest. In some cases, the result of the reduced optical power returning to the laser tracker is a decrease in tracking performance and a decrease in the accuracy of distance and angle measurements by the laser tracker. To get around this problem, the non-reflecting portions may be modified to produce an illumination pattern 700 like that shown in
A cube corner retroreflector having non-reflecting portions may be embedded in a sphere, as shown in
The coarse orientation of the retroreflector is determined when there is an ambiguity in the orientation of the retroreflector because of the symmetry of the retroreflector. For example, in a cube corner retroreflector, the three intersecting lines 710, 720, 730 formed by the three reflecting surfaces, as shown in
Another optional element of interface component 820 is identifier element 824. The identifier element 824 may take the form of a bar-code pattern or radio-frequency identification (RFID) tag, for example. In an embodiment, the bar code has a one-dimensional pattern that includes a series of parallel lines of differing thicknesses and spacings. In another embodiment, the bar code has a two-dimensional pattern that includes a series of dots or squares spaced in a two-dimensional pattern. The tracker may read the contents of the bar code using a locator camera placed, for example, on the front of the tracker or with an orientation camera. The tracker may read the identity of the RFID tag by illuminating the RFID tag with radio frequency (RF) energy. The identifier element 824 may contain a serial number that identifies the particular target 810. Alternatively, the identifier element may contain one or more parameters that characterize the target 810. A serial number could include letters or other symbols as well as numbers and is intended to identify a particular target. Parameters could provide any information about any aspect of the device—for example, geometrical characteristics or thermal characteristics. A serial number may be used by a processor to gain access to parameters associated with a particular target.
Another optional element of interface component 820 is antenna 830. Antenna 830 may be used to send and/or to receive wireless data in the form of radio frequency signals. Such an antenna may be attached to a small circuit board that is powered by a small battery 828 that fits inside interface component 820. The small circuit board may be made of rigid-flex material which permits a very compact circuit to be enclosed within the interface component.
The identification function performed by identifier element 824 may instead be carried out by saving the identification information, which might be a serial number or one or more parameters, in the electrical components of the circuit board within the interface component 820 and then transmitting the identification information to the laser tracker with an RF signal from the antenna 830 or with an optical signal from the target light 822. RF and optical signals are two examples of electromagnetic signals that might be used to transmit identification information. As used here, optical signals may include ultraviolet, visible, infrared, and far infrared wavelengths.
The interface component 820 may also be provided with one or more optional actuator buttons 826. The actuator buttons 826 may be used to start and stop measurements or to initiate a variety of other actions. These buttons may be used in combination with indicator lights on the laser tracker to ensure that the tracker has received the intended commands.
The interface component 820 may also contain a temperature sensor 832 mounted within the target—for example, on the spherical body 802 or cube corner retroreflector 804. As the spherical body 802 and cube corner retroreflector 804 are heated or cooled, the position of the vertex 808 may shift since in general the spherical body 802 and cube corner retroreflector 804 may be made of different materials having different coefficients of thermal expansion (CTEs). By tracking the temperature of the target 810, compensation may be performed to shift the position of the vertex 808 by an appropriate amount.
The interface component may include an air temperature sensor assembly 834 comprising a temperature sensor 836, protector 838, and insulation (not shown). The temperature sensor may be a thermistor, RTD, thermocouple, or any other device capable of measuring temperature. It may be placed in a protector structure, which might be a hollow cylinder, for example. The purpose of the protector is to keep the temperature sensor from being damaged and to keep heat sources away from the temperature sensor. The protector is open at the end and may contain perforations to increase exposure of the temperature sensor to air. Insulation is provided between the body 802 and the air temperature sensor 836. The insulation keeps the sensor from being exposed to the metal of the target, which may be at a different temperature than the surrounding air.
A possible use of air temperature assembly 834 is to measure the temperature of the air as a function of position within the measurement volume. This may be important because measured distances and measured angles depend on the temperature of the air. For example, to get the distance from the tracker to a target, one of the tracker processors calculates the distance by dividing the uncorrected measured distance (as measured by the tracker interferometer or ADM) by the average index of refraction of the air over the path from the tracker to the target. The index of refraction of the air is found using the Edlén equation or the Ciddor equation based on the temperature, pressure, and humidity of the air measured by sensors attached to the tracker. The resulting corrected distance value is the distance the tracker interferometer or ADM would have measured if the air were replaced by vacuum. Unlike pressure and humidity, which tend to be relatively constant over a measurement volume, the temperature often varies significantly over the measurement volume. Ordinarily, the temperature of the air in a measurement volume is estimated by placing a temperature sensor at a particular location and assuming that the air temperature has the same value at all positions within the measurement volume. A more accurate way to correct for temperature over the measurement volume would be to move the target 810 from a starting position at the tracker to an ending position near the measurement region, keeping track of the temperature at all distances and calculating the average temperature over those distances. The time constant of the air temperature sensor 836 should be short enough to enable the sensor to respond to changes in temperature as the operator carries the target 810 to different positions within the measurement volume.
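The correction step can be sketched as follows. The refractive-index model below is a simplified placeholder with roughly the right order of magnitude, not the actual Edlén or Ciddor equation, and the function name is illustrative; the humidity dependence is omitted for brevity:

```python
def corrected_distance(raw_distance, temps_c, pressure_hpa=1013.25):
    """Correct an interferometer/ADM distance for the air's refractive index.

    raw_distance -- uncorrected distance reported by the distance meter
    temps_c      -- air temperatures (deg C) sampled as the target is carried
                    along the path; their average stands in for the
                    path-averaged temperature
    pressure_hpa -- air pressure in hectopascals

    Placeholder model: (n - 1) is about 2.7e-4 at 15 deg C and standard
    pressure, scaling with air density.  A real tracker would use the Edlen
    or Ciddor equation with temperature, pressure, and humidity.
    """
    t_avg = sum(temps_c) / len(temps_c)
    n = 1.0 + 2.7e-4 * (pressure_hpa / 1013.25) * (288.15 / (t_avg + 273.15))
    # The corrected value is the distance that would have been measured in vacuum.
    return raw_distance / n
```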
An alternative is to take the average of a first air temperature read by an air temperature sensor near the tracker and a second air temperature read by the air temperature sensor 832 on the target.
A three dimensional error vector may be drawn from the sphere center 1222 to the vertex 1212. The three errors dx, dy, and dz are called the vector length components. It is also possible to decompose the error vector into other directions. For example, dx may be selected to lie along the axis of symmetry, with the components dy and dz taken perpendicular to dx.
The error vector has three length components dx, dy, dz in
The target 1200 is given a target frame of reference. For example, such a frame of reference is given in
Measurements are performed on targets 1200 to obtain the vector length components. For the alignment shown in
The central ray 1372 from a beam of light emitted by the laser tracker 10 or other device intersects the cube corner at point 1374, bends and travels to the vertex 1334. The position detector 450 in
Because the light incident on the glass cube corner bends as it enters the glass, the position of the vertex will be improperly measured unless calculations take the bending of the light into account. It is possible to account for the bending of the light if the three orientational degrees of freedom of the glass cube corner are known. Since the target and the device both have a frame of reference, the three degrees of orientational freedom can be given for the target frame of reference as viewed within the device frame of reference.
In the '758 patent, equations were given in columns 14 and 15 for relating the pitch, yaw, and roll angles of an open-air cube corner to the slopes of the three lines observed on the photosensitive array of the orientation camera. The equations were compared to three dimensional rotations performed in CAD software to demonstrate their validity. In the '758 patent, the term slope is used in the usual way to indicate a change in ordinate (y) value divided by a corresponding change in the abscissa (x) value. It should be understood that other definitions for slope may be used (for example, the change in abscissa divided by the change in ordinate) if the relevant equations are modified correspondingly.
There are many computational methods that can be used to relate the orientational angles of the glass cube corner to the slopes of the lines that appear on the photosensitive array of the orientation camera. One method is described here. Three orientational angles are assumed. These may be described in terms of pitch, roll, yaw, or other orientational parameters. Each of the three lines corresponding to the intersection junctions is projected until it intersects a plane that includes the upper surface. These three points can be found using the altitude and geometry of the cube corner. The three dimensional coordinates of these three points and of the vertex are transformed by a three dimensional rotation matrix, based on the three orientational degrees of freedom, to get new three dimensional coordinates for each. The transformed intersection points are called the junction points at the top surface. A normal vector to the upper surface may be constructed from a first point at the center of the upper surface and a second point displaced from the first by a unit distance perpendicular to the upper surface. These two points are transformed by the three dimensional rotation matrix to get a new normal vector. The light projected from the device in the device frame of reference makes an angle with respect to the new normal vector. The direction of the refracted light is found using vector mathematics. Since the light must intersect the vertex of the glass cube corner, the point of intersection of the central ray of light with the top surface can be found. This point of intersection with the top surface may be referred to as the apparent vertex. The lines that will be displayed on the orientation camera can now be obtained by connecting the apparent vertex with each of the three junction points at the top surface and then projecting these lines to be perpendicular to the direction of propagation of light back to the device.
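The refraction step in the method above can be carried out with the vector form of Snell's law. The sketch below, in Python with NumPy, is an illustration under assumed conventions (directions given as unit vectors, the surface normal pointing toward the incident side); the function name and conventions are chosen for this example and are not taken from the application.

```python
import numpy as np

def refract(d, n_vec, n1, n2):
    """Refract direction d at a surface with normal n_vec, passing from a
    medium of index n1 into a medium of index n2 (vector form of Snell's
    law). Returns None on total internal reflection."""
    d = d / np.linalg.norm(d)            # unit incident direction
    n_vec = n_vec / np.linalg.norm(n_vec)  # unit normal, toward incident side
    eta = n1 / n2
    cos_i = -np.dot(n_vec, d)            # cosine of the angle of incidence
    sin2_t = eta ** 2 * (1.0 - cos_i ** 2)
    if sin2_t > 1.0:
        return None                      # total internal reflection
    cos_t = np.sqrt(1.0 - sin2_t)
    # Refracted direction: tangential component scaled by eta plus a
    # normal component chosen to make the result a unit vector.
    return eta * d + (eta * cos_i - cos_t) * n_vec
```

Applying this function to the central ray and the transformed normal vector yields the direction of the refracted light inside the glass, from which the apparent vertex on the top surface follows.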
The lines may be drawn on both sides of the apparent vertex since the non-reflecting portions block light both on the way into and on the way out of the retroreflector. The slopes of the lines from the calculations are compared to the slopes of the lines observed on the orientation camera. An optimization procedure iteratively adjusts the roll, pitch, and yaw parameters to minimize the sum of squared errors between the calculated and observed slopes until the best values are obtained. A coarse measurement, described in more detail below, is performed to get a good starting value for the calculations.
After the three orientational degrees of freedom have been found, the results are used to correct the distance and two angles read by the laser tracker. In the absence of other knowledge, the device will believe that the vertex 1334 lies at position 1380 of
Once the three orientational degrees of freedom are known, the distance T can be found. If the normal vector 1348 is parallel to the axis of symmetry, then the distance T is found by dividing the altitude by the cosine of the angle of refraction. If the normal vector 1348 is not parallel to the axis of symmetry, then the distance T is found using vector mathematics, the use of which is well known to one of ordinary skill in the art. To find the coordinates of the vertex 1334 in the device frame of reference, two steps are taken. First, the distance measured by the device is reduced by T(n−1), which moves the point 1380 to a new point 1382 on the line 1375. Next, the intersection point 1374 is found by moving back along the line 1375 an additional distance T. The line 1375 is rotated about the calculated intersection point 1374 by an angle aI−aR, which causes the point 1382 to move to the vertex 1334.
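The two distance adjustments and the rotation described above can be illustrated numerically. All values below (glass index, altitude, angle of incidence, measured distance) are hypothetical, and the two dimensional geometry is a simplification of the full three dimensional vector mathematics.

```python
import numpy as np

# Hypothetical values for illustration; none are taken from the application.
n = 1.52                                  # refractive index of the glass
h = 10.0                                  # altitude of the cube corner, mm
a_i = np.radians(20.0)                    # angle of incidence
a_r = np.arcsin(np.sin(a_i) / n)          # angle of refraction (Snell's law)
T = h / np.cos(a_r)                       # distance T along the refracted ray

d_meas = 2000.0                           # distance reported by the device, mm
beam = np.array([np.sin(a_i), -np.cos(a_i)])  # unit vector along line 1375

# Step 1: reduce the measured distance by T*(n - 1), moving point 1380
# to a new point 1382 on the line of the beam.
d_1382 = d_meas - T * (n - 1.0)

# Step 2: move back along the beam an additional distance T to locate
# the intersection point 1374 of the central ray with the top surface.
d_1374 = d_1382 - T

p_1374 = d_1374 * beam
p_1382 = d_1382 * beam

# Step 3: rotate the segment from 1374 to 1382 about point 1374 by the
# angle (a_i - a_r); the rotated endpoint is the vertex 1334.
phi = a_i - a_r
R = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])
vertex = p_1374 + R @ (p_1382 - p_1374)
```

The rotation preserves the segment length T, so the vertex ends up a distance T from the intersection point along the refracted direction, as required.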
To find the sphere center 1314, the target frame of reference of the target 1300 is rotated to make its axes parallel to those of the device frame of reference. The vertex position is then adjusted by the depth along the direction of the axis of symmetry to get the three dimensional coordinates of the sphere center.
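The two steps above, rotating the target frame parallel to the device frame and offsetting the vertex along the axis of symmetry, might be sketched as follows. The roll-pitch-yaw convention and the choice of the z axis as the axis of symmetry in the target frame are assumptions made for this illustration.

```python
import numpy as np

def rotation_matrix(roll, pitch, yaw):
    """Three dimensional rotation built from roll, pitch, and yaw angles
    (one common convention; the application does not fix a specific one)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def sphere_center(vertex, roll, pitch, yaw, depth):
    """Express the target's axis of symmetry in the device frame using the
    three orientational degrees of freedom, then offset the vertex by the
    depth along that axis to obtain the sphere center."""
    axis_target = np.array([0.0, 0.0, 1.0])  # axis of symmetry, target frame
    axis_device = rotation_matrix(roll, pitch, yaw) @ axis_target
    return vertex + depth * axis_device
```

With zero orientation angles, the sphere center lies a distance equal to the depth directly along the nominal axis from the vertex; nonzero angles tilt that offset accordingly.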
As in the case of the open-air cube corner illustrated in
In step 1640, three orientational degrees of freedom are found. The method for doing this for open-air cube corner retroreflectors was explained in detail in the '758 patent. The method for doing this for a glass cube corner is explained in the present application. In step 1650, the device measures a first distance and a first set of two angles. For example, a laser tracker might measure an ADM or interferometer distance, a zenith angle, and an azimuth angle, each taken from a point on the tracker to a retroreflector reference point.
In step 1660, three dimensional coordinates are calculated for a sphere center of the target. One way to do this, as described above, is to rotate the target frame of reference to make the axes parallel to the corresponding axes of the device frame of reference. The retroreflector reference point of the retroreflector is then shifted by a specified amount in one or more directions. The values obtained for the three dimensional coordinates are the desired result. In step 1670, this result is saved—for example in a processor or processor memory.
Although the methods described hereinabove have mostly dealt with the case of cube corner retroreflectors, they are applicable to any type of retroreflector for which three orientational degrees of freedom can be obtained. A method of putting marks on or in the vicinity of the retroreflector target can be used with retroreflectors such as cateye retroreflectors and photogrammetric targets. A cateye retroreflector contains a top transmissive spherical surface and a bottom reflective spherical surface. The top and bottom spherical surfaces may have the same or different radii. One way to make a six-DOF target from a cateye is to place marks on the top spherical surface and an intermediate layer—for example a planar layer that separates an upper and a lower hemisphere of the cateye. Such marks would be decoded in a manner similar to that used with cube corner retroreflectors—by working out suitable mathematical formulas to relate the observed patterns to the three orientational degrees of freedom. In a similar manner, marks may be put on a photogrammetric dot held within a sphere. Such marks may be made on two or more levels to improve the sensitivity of the system for detecting tilt.
Some types of retroreflectors have symmetries that need to be resolved before a calculation can be completed to determine the three orientational degrees of freedom. For example, a cube corner retroreflector having intersection junctions marked with identical non-reflecting portions appears to have six-fold symmetry when viewed along the axis of symmetry. A three-fold symmetry in the intersection junctions can be seen in the front views of
The association between the physical marks and the pattern reflected by the retroreflector is found from the coarse orientation of the target. The association may be determined by a variety of methods, five of which are described here. A first method uses a reference mark, or feature, such as the mark 801 or 901 shown
A second method for establishing a coarse orientation uses a small reference retroreflector in the vicinity of the larger retroreflector. Such a reference retroreflector is shown as element 1708 in FIG. 17 of the '758 patent. A reference retroreflector may be located in a variety of positions near a larger retroreflector. In the present application, for example, a reference retroreflector may be located in position 822 of
One mode for using a reference retroreflector to establish a coarse orientation is to simultaneously illuminate both the reference retroreflector and the larger retroreflector. This may be done, for example, by flashing the light sources 54 near orientation cameras 52. If the laser tracker 10 is relatively close to the target, the locator camera will be able to distinguish the relative positions of the two light sources, thereby establishing the coarse orientation.
A second mode for using a reference retroreflector to establish a coarse orientation is to begin by moving the laser beam from the tracker to a first steering angle (comprising a first azimuth angle and a first zenith angle) to center the laser beam on the larger retroreflector. The tracker then moves the beam away from the center of the retroreflector by the distance from the larger target to the smaller target. The laser tracker does this by moving the laser beam by an angle in radians equal to a distance, known in advance from the target dimensions, divided by the distance from the tracker to the retroreflector target. After the laser beam has been moved away from the center, it is rotated in a circular pattern about the initial center point until the smaller retroreflector is intercepted. The laser beam is then centered on the smaller target to obtain a second steering angle (comprising a second azimuth angle and a second zenith angle). The coarse orientation is determined from the relative values of the first and second steering angles. This second mode for using a reference retroreflector is useful when the distance to the target is too large to be accurately measured with the locator cameras 52.
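The circular search in this second mode can be sketched as a generator of steering angles. The conversion of the known target offset to a beam-steering radius follows the small-angle description above; the function name and parameters are illustrative, not part of any tracker API.

```python
import math

def search_angles(first_az, first_zen, offset_mm, range_mm, n_steps=36):
    """Yield (azimuth, zenith) steering angles tracing a circle about the
    first steering angle. The angular radius, in radians, is the known
    offset between the two retroreflectors divided by the distance from
    the tracker to the target (small-angle approximation)."""
    radius = offset_mm / range_mm
    for k in range(n_steps):
        theta = 2.0 * math.pi * k / n_steps
        yield (first_az + radius * math.cos(theta),
               first_zen + radius * math.sin(theta))
```

Stepping through these angles until the smaller retroreflector returns the beam yields the second steering angle, from which the coarse orientation follows.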
A third method for establishing a coarse orientation is to use a target light, which might be an LED, located in position 822 or 922. Such a target light is illuminated to enable a locator camera, such as the locator camera 52, to view the position of the target light relative to the retroreflector. In this way, the location of a feature of the retroreflector, such as an intersection junction, for example, may be tagged by the target light.
A fourth method for establishing a coarse orientation is to use a region of reflective material, which may be located at positions 822 or 922, for example. In general, most reflective materials do not reflect light in as narrow a beam width as a retroreflector such as a cube corner or high quality cateye. Because of this disparity in reflectance, the reflective material may require a much larger exposure than would a cube corner or cateye. Such a long exposure may result in blooming of the image on the photosensitive array of the locator camera. To get around this problem, at least one light may be placed relatively near the lens system of the locator camera 52 and at least one light may be placed relatively far from the lens system. As shown in FIGS. 17A-C of the '758 patent, a light must be placed relatively close to the locator camera for the light returned by a retroreflector to be captured by the locator camera. On the other hand, because reflective materials, even those that are intended to be highly reflective or “retroreflective”, reflect light over a relatively large angle, a light that is relatively far from a locator camera will succeed in reflecting light off the reflective material and into the locator camera. Such an arrangement is shown in
In general, it is only necessary to determine the coarse orientation when the laser tracker has stopped measuring a target for a period of time. During continuous measurement, the position of the retroreflector is known to relatively high accuracy based on the previous measurement, and so a coarse measurement is not needed.
Many of the elements described in
A material temperature sensor 2080 attached to the circuit board may be connected to the retroreflector 2020 or housing 2010 to measure the temperature and use this information to correct the measured position of 2060 to account for thermal expansion or thermal changes in the index of refraction. An air temperature sensor assembly 2070 may be used to measure the temperature of the air as a function of location within the measurement volume. The air temperature assembly 2070 includes an air temperature sensor 2072, a protector 2074, and an insulator 2076. The temperature sensor may be a thermistor, RTD, thermocouple, or any other device capable of measuring temperature. It may be placed in a protector structure, which might be a hollow cylinder, for example. The purpose of the protector is to keep the temperature sensor from being damaged and to keep heat sources away from the temperature sensor. The protector is open at the end and may contain perforations to increase exposure of the temperature sensor to air. Insulation 2076 is provided between the housing 2010 and the air temperature sensor 2072. The insulation keeps the sensor from being exposed to the metal of the target, which may be at a different temperature than the surrounding air. The uses of the air temperature sensor are similar to those described hereinabove with regard to
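The thermal-expansion correction mentioned above amounts to scaling a nominal target dimension by the material's coefficient of thermal expansion. A minimal sketch, assuming a linear expansion model and an invented function name:

```python
def thermal_correction(nominal_mm, alpha_per_degc, temp_c, ref_temp_c=20.0):
    """Scale a nominal target dimension (for example, the vertex-to-sphere-
    center depth) for linear thermal expansion away from the reference
    temperature at which the dimension was calibrated."""
    return nominal_mm * (1.0 + alpha_per_degc * (temp_c - ref_temp_c))

# Example: a 19.05 mm dimension of a steel target measured at 30 degrees C,
# using a typical coefficient of about 11.5e-6 per degree C for steel.
corrected = thermal_correction(19.05, 11.5e-6, 30.0)
```

A processor receiving the material temperature could apply such a correction before reporting the position of the retroreflector reference point.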
An electrical memory component on the electrical circuit board may be used as an identifier. Information stored on the memory component may be sent to a transmitter on the electrical circuit board, which transmits the information over the antenna 2090 or over a light source. Such information might include a serial number to identify the target or at least one parameter. The parameter may indicate, for example, geometrical, optical, or thermal properties of the target. Alternatively, identification information may be stored in a bar-code pattern or an RFID tag. The bar-code pattern may be read by a locator camera disposed on the tracker, and the RFID tag may be read by an RF reader on the laser tracker or other device.
While the description above refers to particular embodiments of the present invention, it will be understood that many modifications may be made without departing from the spirit thereof. The accompanying claims are intended to cover such modifications as would fall within the true scope and spirit of the present invention.
The presently disclosed embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims, rather than the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
The present application is a divisional application of U.S. patent application Ser. No. 13/407,983 filed Feb. 29, 2012, which claims the benefit of provisional application No. 61/448,823 filed Mar. 3, 2011, provisional application No. 61/475,703 filed Apr. 15, 2011, and provisional application No. 61/592,049 filed Jan. 30, 2012, the entire contents of all of which are hereby incorporated by reference. U.S. patent application Ser. No. 13/407,983 is also a continuation-in-part of U.S. patent application Ser. No. 13/370,339 filed Feb. 10, 2012, which claims priority to provisional application No. 61/442,452 filed Feb. 14, 2011, the entire contents of which are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
4413907 | Lane | Nov 1983 | A |
4560270 | Wiklund et al. | Dec 1985 | A |
4714339 | Lau et al. | Dec 1987 | A |
4731879 | Sepp et al. | Mar 1988 | A |
4777660 | Gould et al. | Oct 1988 | A |
4790651 | Brown et al. | Dec 1988 | A |
4983021 | Fergason | Jan 1991 | A |
5051934 | Wiklund | Sep 1991 | A |
5121242 | Kennedy | Jun 1992 | A |
5138154 | Hotelling | Aug 1992 | A |
5267014 | Prenninger | Nov 1993 | A |
5313409 | Wiklund et al. | May 1994 | A |
5347306 | Nitta | Sep 1994 | A |
5440326 | Quinn | Aug 1995 | A |
5594169 | Field et al. | Jan 1997 | A |
D378751 | Smith | Apr 1997 | S |
5698784 | Hotelling et al. | Dec 1997 | A |
5724264 | Rosenberg et al. | Mar 1998 | A |
5767952 | Ohtomo et al. | Jun 1998 | A |
5825350 | Case, Jr. et al. | Oct 1998 | A |
5828057 | Hertzman et al. | Oct 1998 | A |
5898421 | Quinn | Apr 1999 | A |
5957559 | Rueb et al. | Sep 1999 | A |
5973788 | Pettersen et al. | Oct 1999 | A |
6017125 | Vann | Jan 2000 | A |
6023326 | Katayama et al. | Feb 2000 | A |
6034722 | Viney et al. | Mar 2000 | A |
6036319 | Rueb et al. | Mar 2000 | A |
6085155 | Hayase et al. | Jul 2000 | A |
6111563 | Hines | Aug 2000 | A |
6133998 | Monz et al. | Oct 2000 | A |
6166809 | Pettersen et al. | Dec 2000 | A |
6171018 | Ohtomo et al. | Jan 2001 | B1 |
6222465 | Kumar et al. | Apr 2001 | B1 |
6262801 | Shibuya et al. | Jul 2001 | B1 |
6295174 | Ishinabe et al. | Sep 2001 | B1 |
6344846 | Hines | Feb 2002 | B1 |
6347290 | Bartlett | Feb 2002 | B1 |
6353764 | Imagawa et al. | Mar 2002 | B1 |
6369794 | Sakurai et al. | Apr 2002 | B1 |
6433866 | Nichols | Aug 2002 | B1 |
6445446 | Kumagai et al. | Sep 2002 | B1 |
6462810 | Muraoka et al. | Oct 2002 | B1 |
6559931 | Kawamura et al. | May 2003 | B2 |
6567101 | Thomas | May 2003 | B1 |
6573883 | Bartlett | Jun 2003 | B1 |
6573981 | Kumagai et al. | Jun 2003 | B2 |
6587244 | Ishinabe et al. | Jul 2003 | B1 |
6624916 | Green et al. | Sep 2003 | B1 |
6667798 | Markendorf et al. | Dec 2003 | B1 |
6668466 | Bieg et al. | Dec 2003 | B1 |
6681031 | Cohen et al. | Jan 2004 | B2 |
6802133 | Jordil et al. | Oct 2004 | B2 |
6847436 | Bridges | Jan 2005 | B2 |
6935036 | Raab et al. | Aug 2005 | B2 |
6957493 | Kumagai et al. | Oct 2005 | B2 |
6964113 | Bridges et al. | Nov 2005 | B2 |
6965843 | Raab et al. | Nov 2005 | B2 |
6980881 | Greenwood et al. | Dec 2005 | B2 |
6996912 | Raab et al. | Feb 2006 | B2 |
7022971 | Ura et al. | Apr 2006 | B2 |
7055253 | Kaneko | Jun 2006 | B2 |
7072032 | Kumagai et al. | Jul 2006 | B2 |
7129927 | Mattsson | Oct 2006 | B2 |
7130035 | Ohtomo et al. | Oct 2006 | B2 |
7168174 | Piekutowski | Jan 2007 | B2 |
7193695 | Sugiura | Mar 2007 | B2 |
7222021 | Ootomo et al. | May 2007 | B2 |
7230689 | Lau | Jun 2007 | B2 |
7233316 | Smith et al. | Jun 2007 | B2 |
7248374 | Bridges | Jul 2007 | B2 |
7274802 | Kumagai et al. | Sep 2007 | B2 |
7285793 | Husted | Oct 2007 | B2 |
7304729 | Yasutomi et al. | Dec 2007 | B2 |
7307710 | Gatsios et al. | Dec 2007 | B2 |
7312862 | Zumbrunn et al. | Dec 2007 | B2 |
7321420 | Yasutomi et al. | Jan 2008 | B2 |
7327446 | Cramer et al. | Feb 2008 | B2 |
7345748 | Sugiura et al. | Mar 2008 | B2 |
7352446 | Bridges et al. | Apr 2008 | B2 |
7388654 | Raab et al. | Jun 2008 | B2 |
7388658 | Glimm | Jun 2008 | B2 |
7401783 | Pryor | Jul 2008 | B2 |
7423742 | Gatsios et al. | Sep 2008 | B2 |
7446863 | Nishita et al. | Nov 2008 | B2 |
7466401 | Cramer et al. | Dec 2008 | B2 |
7474388 | Ohtomo et al. | Jan 2009 | B2 |
7503123 | Matsuo et al. | Mar 2009 | B2 |
7541965 | Ouchi et al. | Jun 2009 | B2 |
7552539 | Piekutowski | Jun 2009 | B2 |
7555766 | Kondo et al. | Jun 2009 | B2 |
7562459 | Fourquin et al. | Jul 2009 | B2 |
7564538 | Sakimura et al. | Jul 2009 | B2 |
7583375 | Cramer et al. | Sep 2009 | B2 |
7634381 | Westermark et al. | Dec 2009 | B2 |
7705830 | Westerman et al. | Apr 2010 | B2 |
7728963 | Kirschner | Jun 2010 | B2 |
7765084 | Westermark et al. | Jul 2010 | B2 |
7800758 | Bridges et al. | Sep 2010 | B1 |
7804602 | Raab | Sep 2010 | B2 |
7903237 | Li | Mar 2011 | B1 |
8237934 | Cooke et al. | Aug 2012 | B1 |
8320708 | Kurzweil et al. | Nov 2012 | B2 |
8379224 | Piasse et al. | Feb 2013 | B1 |
20020148133 | Bridges et al. | Oct 2002 | A1 |
20030014212 | Ralston et al. | Jan 2003 | A1 |
20030206285 | Lau | Nov 2003 | A1 |
20040223139 | Vogel | Nov 2004 | A1 |
20050185182 | Raab et al. | Aug 2005 | A1 |
20050197145 | Chae et al. | Sep 2005 | A1 |
20050254043 | Chiba | Nov 2005 | A1 |
20060009929 | Boyette et al. | Jan 2006 | A1 |
20060055662 | Rimas-Ribikauskas et al. | Mar 2006 | A1 |
20060055685 | Rimas-Ribikauskas et al. | Mar 2006 | A1 |
20060146009 | Syrbe et al. | Jul 2006 | A1 |
20060161379 | Ellenby et al. | Jul 2006 | A1 |
20060164384 | Smith et al. | Jul 2006 | A1 |
20060164385 | Smith et al. | Jul 2006 | A1 |
20060164386 | Smith et al. | Jul 2006 | A1 |
20060262001 | Ouchi et al. | Nov 2006 | A1 |
20070016386 | Husted | Jan 2007 | A1 |
20070019212 | Gatsios et al. | Jan 2007 | A1 |
20070236452 | Venkatesh et al. | Oct 2007 | A1 |
20080122786 | Pryor et al. | May 2008 | A1 |
20080229592 | Hinderling et al. | Sep 2008 | A1 |
20080309949 | Rueb | Dec 2008 | A1 |
20090033621 | Quinn et al. | Feb 2009 | A1 |
20090171618 | Kumagai et al. | Jul 2009 | A1 |
20090239581 | Lee | Sep 2009 | A1 |
20090240372 | Bordyn et al. | Sep 2009 | A1 |
20090240461 | Makino et al. | Sep 2009 | A1 |
20090240462 | Lee | Sep 2009 | A1 |
20100091112 | Veeser et al. | Apr 2010 | A1 |
20100128259 | Bridges et al. | May 2010 | A1 |
20100149518 | Nordenfelt et al. | Jun 2010 | A1 |
20100234094 | Gagner et al. | Sep 2010 | A1 |
20100235786 | Maizels et al. | Sep 2010 | A1 |
20100265316 | Sali et al. | Oct 2010 | A1 |
20100284082 | Shpunt et al. | Nov 2010 | A1 |
20110007154 | Vogel et al. | Jan 2011 | A1 |
20110023578 | Grasser | Feb 2011 | A1 |
20110025827 | Shpunt et al. | Feb 2011 | A1 |
20110035952 | Roithmeier | Feb 2011 | A1 |
20110043620 | Svanholm et al. | Feb 2011 | A1 |
20110052006 | Gurman et al. | Mar 2011 | A1 |
20110069322 | Hoffer, Jr. | Mar 2011 | A1 |
20110107611 | Desforges et al. | May 2011 | A1 |
20110107612 | Ferrari et al. | May 2011 | A1 |
20110107613 | Tait | May 2011 | A1 |
20110107614 | Champ | May 2011 | A1 |
20110112786 | Desforges et al. | May 2011 | A1 |
20110123097 | Van Coppenolle | May 2011 | A1 |
20110181872 | Dold et al. | Jul 2011 | A1 |
20110260033 | Steffensen et al. | Oct 2011 | A1 |
20120050255 | Thomas et al. | Mar 2012 | A1 |
20120099117 | Hanchett et al. | Apr 2012 | A1 |
Number | Date | Country |
---|---|---|
0797076 | Sep 1997 | EP |
0919831 | Jun 1999 | EP |
0957336 | Nov 1999 | EP |
2004108939 | Apr 2008 | JP |
9534849 | Dec 1995 | WO |
0223121 | Mar 2002 | WO |
0237466 | May 2002 | WO |
03062744 | Jul 2003 | WO |
03073121 | Sep 2003 | WO |
2007079601 | Jul 2007 | WO |
2010100043 | Sep 2010 | WO |
2010148526 | Dec 2010 | WO |
2011057130 | May 2011 | WO |
Entry |
---|
Automated Precision, Inc., Product Specifications, Radian, Featuring INNOVO Technology, info@apisensor.com, Copyright 2011. |
FARO Technical Institute, Basic Measurement Training Workbook, Version 1.0, FARO Laser Tracker, Jan. 2008, Students Book, FARO CAM2 Measure. |
Kollorz, et al., “Gesture recognition with a time-of-flight camera”, International Journal of Intelligent Systems Technologies and Applications, vol. 5, No. 3/4, pp. 334-343, [Retrieved Aug. 11, 2011; http://www5.informatik.uni-erlangen.de/Forschung/Publikationen/2008/Kollorz08-GRW.pdf] (2008). |
International Search Report of the International Searching Authority for International Application No. PCT/US2012/028984; Mailed Jul. 19, 2012. |
International Search Report of the International Searching Authority for International Application No. PCT/US2012/070283; Mailed Mar. 27, 2013. |
International Search Report of the International Searching Authority for International Application No. PCT/US2011/033360; Mailed Feb. 29, 2012. |
Hecht, Jeff, Photonic Frontiers: Gesture Recognition: Lasers Bring Gesture Recognition to the Home, Laser Focus World, pp. 1-5, [Retrieved On-Line Mar. 3, 2011], http://www.optoiq.com/optoiq-2/en-us/index/photonics-technologies-applications/lfw-display/lfw-articles-toolstemplate.articles.optoiq2.photonics-technologies.technology-products.imaging-—detectors.2011.01.lasers-bringgesture-recognition-to-the-home.html. |
Leica Geosystems Metrology, “Leica Absolute Tracker AT401, White Paper,” Hexagon AB; 2010. |
Leica Geosystems AG ED—“Leica Laser Tracker System”, Internet Citation, Jun. 28, 2012, XP002678836, Retrieved from the Internet: URL:http://www.a-solution.com.au/pages/downloads/LTD500—Brochure—EN.pdf. |
Maekynen, A. J. et al., Tracking Laser Radar for 3-D Shape Measurements of Large Industrial Objects Based on Time-of-Flight Laser Rangefinding and Position-Sensitive Detection Techniques, IEEE Transactions on Instrumentation and Measurement, vol. 43, No. 1, Feb. 1, 1994, pp. 40-48, XP000460026, ISSN: 0018-9456, DOI 10.1109/19.286353, the whole document. |
New River Kinematics, SA Arm—The Ultimate Measurement Software for Arms, Software Release! SA Sep. 30, 2010, [On-line], http://www.kinematics.com/news/software-release-sa20100930.html (1 of 14), [Retrieved Apr. 13, 2011 11:40:47 AM]. |
Turk, et al., “Perceptual Interfaces”, UCSB Technical Report 2003-33, pp. 1-43 [Retrieved Aug. 11, 2011, http://www.cs.ucsb.edu/research/tech—reports/reports/2003-33.pdf] (2003). |
Li, et al., “Real Time Hand Gesture Recognition using a Range Camera”, Australasian Conference on Robotics and Automation (ACRA), [Retrieved Aug. 10, 2011, http://www.araa.asn.au/acra/acra2009/papers/pap128s1.pdf] pp. 1-7 (2009). |
Leica Geosystems: “TPS1100 Professional Series”, 1999, Retrieved from the Internet: URL:http://www.estig.ipbeja.pt/˜legvm/top—civil/TPS1100%20-%20A%20New%20Generation%20of%20Total%20Stations.pdf, [Retrieved on Jul. 2012] the whole document. |
Cao, et al.“VisionWand: Interaction Techniques for Large Displays using a Passive Wand Tracked in 3D”, Proceedings of the 16th Annual ACM Symposium on User Interface Software and Technology, UIST, vol. 5, Issue 2, pp. 173-182, (Jan. 2003). |
Written Opinion of the International Searching Authority for International Application No. PCT/US2012/028984; Mailed Jul. 19, 2012. |
Written Opinion of the International Searching Authority for International Application No. PCT/US2012/070283; Mailed Mar. 27, 2013. |
Written Opinion of the International Searching Authority for International Application No. PCT/US2011/033360; Mailed Feb. 29, 2011. |
International Search Report mailed Jun. 29, 2012 for International Application Serial No. PCT/US2012/027083; International filing date Feb. 29, 2012. |
Written Opinion of the International Searching Authority mailed Jun. 29, 2012 for International Application Serial No. PCT/US2012/027083; International filing date Feb. 29, 2012. |
International Preliminary Report on Patentability date of Issuance Sep. 3, 2013 for International Application Serial No. PCT/US2012/027083; International filing date Feb. 29, 2012. |
Number | Date | Country | |
---|---|---|---|
20130201470 A1 | Aug 2013 | US |
Number | Date | Country | |
---|---|---|---|
61448823 | Mar 2011 | US | |
61442452 | Feb 2011 | US | |
61475703 | Apr 2011 | US | |
61592049 | Jan 2012 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13407983 | Feb 2012 | US |
Child | 13826883 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13370339 | Feb 2012 | US |
Child | 13407983 | US |