The subject matter disclosed herein relates to a coordinate measurement device or scanner, and in particular to a coordinate measurement device or scanner having an event-based camera.
Optical scanning systems are used for measuring coordinates of points on a surface of an object. These devices use trigonometry and epipolar geometry to determine a distance to the object using a pattern of light emitted by a projector. The device includes a sensor, such as a photosensitive array, that receives light reflected from the surface of the object. By knowing or determining the pose of the device and the distance to the object, three-dimensional coordinates of the object may be determined. Examples of these types of devices include laser line probes, laser line scanners, flying dot scanners, area scanners, triangulation scanners, and structured light scanners.
One limiting factor on these devices is the speed at which the images acquired by the sensor can be processed. In the case of a photosensitive array, the entire array needs to be read and evaluated before the distance may be determined. It should be appreciated that with modern photosensitive arrays, the number of pixels may be quite large. Further, the values of the pixels (e.g. the value of the accumulated charge) are read or retrieved sequentially on a row by row basis. As a result, before the next image is acquired, the previous frame/image needs to be acquired, retrieved from the sensor, and analyzed to determine the distance to the object.
Accordingly, while existing coordinate scanners are suitable for their intended purposes, the need for improvement remains, particularly in providing a coordinate scanner having the features described herein.
According to one aspect of the disclosure, a three-dimensional coordinate scanner is provided. The scanner includes a projector configured to emit a pattern of light; a sensor arranged in a fixed predetermined relationship to the projector, the sensor having a photosensitive array comprised of a plurality of event-based pixels, each of the event-based pixels being configured to transmit a signal in response to a change in irradiance exceeding a threshold. One or more processors are electrically coupled to the projector and the sensor, the one or more processors being configured to modulate the pattern of light and determine a three-dimensional coordinate of a surface based at least in part on the pattern of light and the signal.
In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the one or more processors being configured to modulate the intensity of the pattern of light.
In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the one or more processors being configured to modulate the locations of the pattern of light.
In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the signal having a pixel ID and a timestamp. In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the timestamp being the time that a row of the photosensitive array that the pixel is located in was read. In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the row having a first pixel and a second pixel, each of which has a charge that exceeds the threshold, the timestamp for the first pixel and the second pixel being the same. In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the timestamp being the time that the accumulation of charge in the pixel exceeded the threshold.
In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the light pattern being a doublet pattern, the projector being configured to control the light intensity of the doublet pattern with respect to time.
In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the projector being configured to project a first plane of light that forms a first line of light on the surface of the object. In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the projector being further configured to project a second plane of light that forms a second line of light on the surface of the object. In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the one or more processors being configured to cause the projector to modulate the intensity of the first plane of light and the second plane of light.
In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the average intensity of the modulation being greater than or equal to 20% of a maximum intensity of the projector.
In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the modulating of the intensity turning off at least one of the first plane of light or the second plane of light on a periodic or aperiodic basis.
In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the first line of light and the second line of light overlapping.
In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the pattern of light being a structured light pattern.
In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include a tracking camera having a second photosensitive array that includes a plurality of second event-based pixels, each of the second event-based pixels being configured to transmit a second signal in response to an accumulation of charge exceeding a second threshold, wherein the one or more processors is further configured to identify a feature on the object in response to the second signal.
In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the feature being a reflective target affixed to the surface.
In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include a light source operably coupled to the tracking camera, the light source being configured to emit light at a predetermined frequency. In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the tracking camera having at least one optical element that allows light of the predetermined frequency to pass through and substantially blocks light of other frequencies. In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the predetermined frequency being in the infrared light spectrum.
In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the one or more processors being further configured to determine three-dimensional coordinates based at least in part on an amount of time between a first event and a second event, wherein there are no events between the first event and the second event.
In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include the amount of time being based at least in part on a first time stamp of the first event and a second time stamp of the second event. In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include an encoder operably coupled to the one or more processors, wherein the encoder measures one of a speed of the object being scanned or the speed of the scanner relative to the object.
In addition to one or more of the features described herein, or as an alternative, further embodiments of the scanner may include a registration of three-dimensional coordinates generated at different timestamps, the registration being based at least in part on a relative position between the scanner and an object being measured.
According to another aspect of the disclosure, a three-dimensional coordinate scanner is provided. The scanner includes a projector configured to emit a pattern of light. A sensor is arranged in a fixed predetermined relationship to the projector, the sensor having a photosensitive array arranged to receive light reflected from a surface of an object. A light source is operably coupled to the projector and the sensor. At least one tracking camera is operably coupled to the light source and arranged to receive light emitted by the light source and reflected off of the surface of the object, the at least one tracking camera having a photosensitive array comprised of a plurality of event-based pixels, each of the event-based pixels being configured to transmit a signal in response to an accumulation of charge exceeding a threshold. One or more processors are electrically coupled to the projector and the sensor, the one or more processors being configured to determine a three-dimensional coordinate of a surface based at least in part on the pattern of light and to determine a pose of the scanner based at least in part on the signal.
These and other advantages and features will become more apparent from the following description taken in conjunction with the drawings.
The subject matter, which is regarded as the disclosure, is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features, and advantages of the disclosure are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
The detailed description explains embodiments of the disclosure, together with advantages and features, by way of example with reference to the drawings.
Embodiments of the present disclosure provide advantages in increasing the speed of acquiring data with a coordinate measurement device.
Referring to
Since the projector 102 and the optical sensor 104 are coupled in a fixed geometric position to each other, three-dimensional (3D) coordinates of points on the surface where the line of light 108 strikes may be determined using trigonometric principles.
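The trigonometric relationship can be sketched with the law of sines: the fixed baseline between projector and sensor, together with the two angles measured from that baseline, fully determines the triangle formed by the projector, the sensor, and the illuminated point. The following is a minimal illustration; the function name and angle convention are assumptions for the sketch, not taken from the disclosure.

```python
import math

def camera_range(baseline_m, cam_angle_rad, proj_angle_rad):
    """Distance from the sensor to the illuminated surface point.

    The angles are the interior angles at the sensor and the projector,
    measured from the baseline joining them (an illustrative convention).
    """
    # The interior angle at the surface point closes the triangle.
    apex = math.pi - cam_angle_rad - proj_angle_rad
    # Law of sines: the side opposite the projector angle is the
    # sensor-to-point range; the side opposite the apex is the baseline.
    return baseline_m * math.sin(proj_angle_rad) / math.sin(apex)
```

As a quick sanity check, an equilateral configuration (both angles 60 degrees) gives a range equal to the baseline.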
It should be appreciated that while embodiments herein may refer to a scanner 100 projecting a line of light, sometimes referred to as a laser line probe or a line scanner, this is for example purposes and the claims should not be so limited. In other embodiments, other types of optical scanners may be used, including but not limited to: a triangulation scanner, a flying-spot scanner, a structured light scanner, a photogrammetry device, a laser tracker device, and a theodolite for example.
Referring now to
The position of the line 208 on the array 200, in combination with the baseline distance between the projector 102 and the sensor 104, may be used to determine the depth or distance from the scanner 100 to the points on the surface where line 108 strikes as is described in commonly owned U.S. patent application Ser. No. 17/073,923 (Atty. Docket FA00989US4), the contents of which are incorporated by reference herein. For example, when the line of light 208 crosses a pixel (e.g. pixel 210) closer to the left side (when viewed from the position of
In operation, when the optical sensor 104 opens the shutter, light reflected from the surface 110 is received by pixels, which accumulate charge. In a traditional photosensitive array, when the shutter closes, each pixel in the photosensitive array is read to determine a value of the accumulated charge. The value is proportional to the number of photons per unit time received by the pixel, which is proportional to the irradiance (power per unit area) received by that pixel. It should be appreciated that reading each of the pixels is relatively time consuming and is one of the limiting factors in the speed of acquisition of 3D coordinates for points on the surface 110. Further, the reading of each pixel generates a large amount of data. For example, a 1 megapixel to 10 megapixel image sensor may typically be used. It should be appreciated that the resulting 1,000,000 to 10,000,000 values may be relatively time consuming to read, transfer, store, and computationally process.
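The data-volume argument above can be made concrete with a back-of-the-envelope comparison. The bit depth and event record size below are illustrative assumptions, not the actual formats of any particular sensor.

```python
def readout_sizes(width, height, n_events, bits_per_pixel=12, bytes_per_event=8):
    """Bytes produced by one full-frame readout versus one event-based
    readout (illustrative formats: 12-bit pixels, 8-byte event records)."""
    frame_bytes = width * height * bits_per_pixel // 8
    event_bytes = n_events * bytes_per_event
    return frame_bytes, event_bytes
```

Under these assumptions, a 1-megapixel frame yields 1.5 MB per readout, while a few thousand events along a projected line occupy only tens of kilobytes.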
It should be appreciated that the array 200 is exemplary and that the pixels are illustrated as being large relative to the line 208. This is for purposes of clarity and typically the width of the line 208 is larger than the width of the pixels.
In the illustrated embodiment, the array 200 includes event-based pixels 202. As used herein, an event-based pixel transmits a signal when a change or difference in the irradiance exceeds a threshold. Further, in an event-based image sensor, the array 200 is read row by row; accordingly, if a row, such as row 214 for example, has no pixels where the line 208 crosses, there would be no data read. In a row, such as row 216 for example, only a single pixel (e.g. 212) would read out data. It should be appreciated that each reading of the array 200 results in less data than in a traditional photosensitive array.
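The thresholded-change behavior of an event-based pixel can be sketched as follows. The 20% default threshold and the rule of resetting the reference level after each event are modeling assumptions for illustration, not the disclosure's specification of any particular sensor.

```python
def pixel_events(irradiance_series, threshold=0.2):
    """Emit (sample_index, polarity) events whenever the relative change
    in irradiance since the last event meets the threshold (sketch)."""
    events = []
    ref = irradiance_series[0]  # irradiance at the last event (or start)
    for i, val in enumerate(irradiance_series[1:], start=1):
        if ref > 0 and abs(val - ref) / ref >= threshold:
            events.append((i, +1 if val > ref else -1))
            ref = val  # reset the reference after the pixel fires
    return events
```

A slowly drifting irradiance that never moves 20% from the reference produces no events at all, which is the source of the data reduction.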
Referring to
Referring to
Referring now to
One characteristic of the optical sensor 304 having an event-based photosensitive array (e.g. array 200) is that when the line 308 is projected onto a substantially flat surface (e.g. conveyor 305), where there is little change in irradiance of the reflected light (e.g. less than or equal to 20% differential), the optical sensor 304 will not generate an event. It should be appreciated that this provides a technical benefit in reducing a computational load on the one or more processors of the scanner 300. Events are generated when there is a change in height relative to the conveyor belt surface 305, such as when the object 310 enters the field of view 307 and the line 308 is projected onto a surface of the object 310. As discussed in more detail herein, a row of the photosensitive array is read when one of the pixels generates an event. Each event has an associated time stamp and pixel location. In some embodiments, the event may also include a polarity (e.g. irradiance is increasing or decreasing). It should be appreciated that it is contemplated that other types of event-based sensors may be used that scan the entire sensor for events simultaneously. It is further contemplated that still other types of event-based sensors may be used wherein the pixels transmit a signal when the event occurs, rather than being read by the processor.
When scanning the object 310, the optical sensor 304 will only generate data when there is a change in irradiance at at least one pixel on the object 310 that is greater than the threshold (e.g. a change in brightness that is greater than or equal to 20%). As discussed in more detail herein, this change in irradiance may occur due to the line of reflected light moving relative to the pixel. For example, if the object 310 is a flat plate, the optical sensor 304 will generate data at the leading edge of the object 310 when the conveyor belt 305 moves the leading edge into the field of view 307 and intersects the line 308. This change in height between the conveyor belt 305 and the plate will result in the line on the photosensitive array moving (relative to the pixels) from the left to the right (when the array is viewed from the viewpoint of
It should be appreciated that in some embodiments, it may be desirable to at least periodically measure the 3D coordinates of a flat surface, such as to verify that the scanner is operating correctly for example. In an embodiment, the scanner may modulate the brightness of the plane of light 306 to cause a change in the brightness of the line 308. When the modulation of the emitted light is greater than the threshold for triggering an event (e.g. greater than or equal to a 20% change), this will cause each of the pixels that the line of light crosses to generate an event, even though the line did not move relative to the pixels. In an embodiment, the modulation of the line 308 is accomplished by projecting a second line that periodically overlaps with the line 308 to change the brightness. In still another embodiment, the flat surface is measured by periodically shifting or moving the line 308 spatially. The movement of the line 308 on the surface will cause the line to move on the photosensitive array, resulting in the generation of an event by the pixels.
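The intensity-modulation idea can be illustrated numerically: even with the line stationary on the array, a modulation deep enough to exceed the event threshold keeps the pixels firing, while a shallower modulation produces no events. The sinusoidal waveform and parameter names below are illustrative assumptions, not the disclosure's modulation scheme.

```python
import math

def modulated_line_events(base, depth, period_s, dt_s, n_steps, threshold=0.2):
    """Events fired by a stationary pixel under a sinusoidally modulated
    line intensity (sketch; depth is the fractional modulation amplitude)."""
    ref = base  # irradiance at the last event
    events = []
    for k in range(n_steps):
        t = k * dt_s
        val = base * (1.0 + depth * math.sin(2.0 * math.pi * t / period_s))
        if abs(val - ref) / ref >= threshold:
            events.append((t, +1 if val > ref else -1))
            ref = val
    return events
```

With a 50% modulation depth the stationary line fires events every cycle; with a 5% depth (below the assumed 20% threshold) the flat surface remains invisible to the sensor.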
Referring now to
By tracking each event, both increasing and decreasing, as the line of light moves towards and away from the pixel, the maximum irradiance point may be determined. It should be appreciated that in some embodiments, the maximum irradiance point may be extrapolated since the time of the shutter opening may not correspond with the center of the line of light passing over the center of the pixel. This determined maximum irradiance level at each time step may be plotted based on pixel position for a given row as shown in
As discussed herein, the event-based pixels only generate a signal when the change in irradiance is equal to or exceeds a threshold (e.g. 20%). Thus each of the pixels in Row 4, using the example of
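One common way to recover the line center at sub-pixel resolution from the per-pixel maximum irradiance values described above is parabolic interpolation through the peak pixel and its two neighbors. This is a generic estimator offered as an illustration; the disclosure does not specify that this particular method is used.

```python
def subpixel_peak(col, i_left, i_center, i_right):
    """Sub-pixel column of the irradiance maximum, given the peak pixel's
    column `col` and the irradiance at that pixel and its two neighbors.
    Fits a parabola through the three samples and returns its vertex."""
    denom = i_left - 2.0 * i_center + i_right
    if denom == 0.0:
        return float(col)  # flat-topped profile; fall back to pixel center
    return col + 0.5 * (i_left - i_right) / denom
```

A symmetric profile returns the pixel center exactly; a profile skewed toward the right neighbor shifts the estimate a fraction of a pixel in that direction.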
In some embodiments, the scanner may include a second optical sensor or camera, sometimes referred to as a tracking or a texture camera. Referring now to
The first optical sensor 504 includes a field of view 507 that is larger than (e.g. encompasses) the line of light 508. The light reflected from the surface by the line of light 508 is received by the optical sensor 504. In this embodiment, the optical sensor 504 may have an event-based photosensitive array or traditional photosensitive array.
The second optical sensor 505 includes a photosensitive array having event-based pixels. In an embodiment, the second optical sensor 505 may operate in the same manner as photosensitive array 200, 400. In other words, the pixels only generate a signal when the change in irradiance exceeds a threshold. In an embodiment, the second optical sensor 505 may include optical elements, such as filters for example, that allow light from a predetermined wavelength to pass onto the photosensitive array. In an embodiment, the optical elements allow infrared light to pass therethrough. In an embodiment, the scanner 500 may include a light source configured to emit light of the predetermined wavelength (e.g. infrared).
The second optical sensor 505 has a field of view 511. In an embodiment, the field of view may be larger than the field of view 507. In this embodiment, a plurality of reflective targets or markers 513 are coupled to the surface 510. In an embodiment, the targets 513 are configured to reflect light of a predetermined wavelength, such as an infrared wavelength for example. In an embodiment, the reflective wavelength of the targets 513 cooperates with the optical elements/filters of the second optical sensor 505. The second optical sensor 505 may operate synchronously or asynchronously with the first optical sensor 504.
The second optical sensor 505 acquires images in the field of view that includes one or more targets 513. Since the targets 513 are in a fixed position relative to each other, the images of the targets 513 acquired by the second optical sensor 505 may be used for tracking the movement of the scanner 500 relative to the surface 510. As a result, the three-dimensional coordinates acquired by the scanner 500 may be registered into a common coordinate frame of reference. Since the second optical sensor 505 includes an event-based image sensor, the images acquired by the second optical sensor 505 may be acquired and computationally processed at a higher rate than provided in prior art systems. As a result, the tracking of the scanner 500 may be performed at a higher rate, which provides the technical solution of improving the accuracy of the tracking of the scanner 500, since the scanner 500 will move a smaller distance between adjacent image frames.
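The target-based tracking can be sketched in its simplest form: estimating the apparent translation of the marker constellation between two frames from matched target centroids. A full pose estimate would also solve for rotation; all names below are illustrative, not from the disclosure.

```python
def apparent_translation(targets_prev, targets_curr):
    """Average image-plane shift of matched targets between two frames
    (translation-only sketch; inputs are matched (x, y) centroids)."""
    n = len(targets_prev)
    dx = sum(c[0] - p[0] for p, c in zip(targets_prev, targets_curr)) / n
    dy = sum(c[1] - p[1] for p, c in zip(targets_prev, targets_curr)) / n
    return dx, dy
```

Averaging over several targets reduces the influence of noise in any single centroid, which is one reason a plurality of markers 513 is useful for registration.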
The term “about” is intended to include the degree of error associated with measurement of the particular quantity based upon the equipment available at the time of filing the application. For example, “about” may include a range of ±8% or 5%, or 2% of a given value.
Additionally, the term “exemplary” is used herein to mean “serving as an example, instance or illustration.” Any embodiment or design described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments or designs. The terms “at least one” and “one or more” are understood to include any integer number greater than or equal to one, i.e. one, two, three, four, etc. The terms “a plurality” are understood to include any integer number greater than or equal to two, i.e. two, three, four, five, etc. The term “connection” can include an indirect “connection” and a direct “connection.” It should also be noted that the terms “first”, “second”, “third”, “upper”, “lower”, and the like may be used herein to modify various elements. These modifiers do not imply a spatial, sequential, or hierarchical order to the modified elements unless specifically stated.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, element components, and/or groups thereof.
While the disclosure is provided in detail in connection with only a limited number of embodiments, it should be readily understood that the disclosure is not limited to such disclosed embodiments. Rather, the disclosure can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the disclosure. Additionally, while various embodiments of the disclosure have been described, it is to be understood that the exemplary embodiment(s) may include only some of the described exemplary aspects. Accordingly, the disclosure is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.
This application claims the benefit of U.S. Provisional Application Ser. No. 63/129,216, filed Dec. 22, 2020, the entire disclosure of which is incorporated herein by reference.