The subject matter disclosed herein relates to a three-dimensional (3D) measurement device, and in particular to a 3D measurement device operable to selectively acquire high quality color images.
A 3D imager is a portable device that includes a projector that projects light patterns on the surface of an object to be scanned. Typically, the projector emits a coded or uncoded pattern. One or more cameras, having predetermined positions and alignments relative to the projector, record images of the light pattern on the surface of the object. The three-dimensional coordinates of elements in the light pattern can be determined by trigonometric methods, such as by using epipolar geometry. Other types of noncontact devices may also be used to measure 3D coordinates, such as those that use time-of-flight techniques (e.g. laser trackers, laser scanners or time-of-flight cameras) to measure the amount of time it takes for light to travel to the surface and return to the device.
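For illustration only, the trigonometric principle can be sketched as follows. This is a minimal example of active triangulation over a known baseline, not the firmware of any particular device; the baseline length and ray angles are values assumed for the example.

```python
import math

def triangulate_depth(baseline_m: float,
                      projector_angle_rad: float,
                      camera_angle_rad: float) -> float:
    """Perpendicular distance from the baseline to the illuminated surface point.

    The projector, the camera, and the surface point form a triangle whose
    base is the fixed baseline; both angles are measured from the baseline.
    """
    # The angle at the surface point follows from the triangle angle sum.
    point_angle = math.pi - projector_angle_rad - camera_angle_rad
    # Law of sines gives the camera-to-point range...
    camera_range = baseline_m * math.sin(projector_angle_rad) / math.sin(point_angle)
    # ...and the depth is its component perpendicular to the baseline.
    return camera_range * math.sin(camera_angle_rad)

# Example: a 0.2 m baseline with both rays at 75 degrees from the baseline.
print(triangulate_depth(0.2, math.radians(75), math.radians(75)))  # ~0.373 m
```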
These 3D imagers often have an additional color camera that is used for tracking the position of the 3D imager, colorizing (texturing) the 3D point cloud, or providing visual feedback to the operator during scanning. It should be appreciated that, to reduce processing and storage loading, the color camera acquires images at a relatively low resolution, such as 1.3 megapixels for example. The low-resolution images allow the 3D imager to perform the intended function (e.g. tracking, colorizing or visualization feedback) without unnecessarily using large amounts of memory or slowing down the processor of the 3D imager. Further, the color camera typically uses a global-shutter camera sensor, for which the smallest achievable pixel size is larger than for a rolling-shutter camera.
In some applications, it may be desirable to acquire high resolution or high quality images of the scene or portions of the scanned area. For example, in a crime scene investigation, the investigator may use a high resolution DSLR camera to photograph areas they believe need to be documented further, such as evidence for example. As a result, the high quality photographs are acquired and stored separately from the 3D point cloud generated by the 3D imager.
Accordingly, while existing 3D imagers are suitable for their intended purpose, the need for improvement remains, particularly in providing a system for acquiring high quality images during a scanning process.
According to one aspect of the disclosure, a three-dimensional (3D) measurement system is provided. The measurement system includes a noncontact measurement device operable to measure a distance from the noncontact measurement device to a surface. The noncontact measurement device includes a projector that emits a light pattern. A measurement camera is operably coupled to the noncontact measurement device. A first color camera is provided having a first quality parameter. A second color camera is operably coupled to the noncontact measurement device, the second color camera having a second quality parameter, the second quality parameter being larger than the first quality parameter. One or more processors are operably coupled to the noncontact measurement device, the one or more processors operable to execute computer instructions that, when executed on the one or more processors, determine 3D coordinates of at least one point in a field of view and selectively acquire an image with the second color camera.
According to another aspect of the disclosure, a method is provided. The method includes emitting a pattern of light with a projector of a noncontact measurement device. Point data is acquired about a plurality of points on a surface with a measurement camera of the noncontact measurement device. A first color image of the surface is acquired with a first color camera of the noncontact measurement device, the first color camera having a first quality parameter. 3D coordinates of the plurality of points are determined based at least in part on the point data and the baseline distance between the projector and the measurement camera. A second color image of the surface is selectively acquired with a second color camera in response to an input from an operator, the second color camera being operably coupled to the noncontact measurement device and having a second quality parameter, the second quality parameter being different than the first quality parameter.
According to yet another aspect of the disclosure, a computer program product for determining three-dimensional coordinates using a noncontact measurement device is provided. The computer program product comprises a non-transitory computer readable storage medium having program instructions embodied therewith, the program instructions executable by a processor to cause the processor to perform: emitting a pattern of light with a projector of the noncontact measurement device; acquiring point data about a plurality of points on a surface with a measurement camera of the noncontact measurement device; acquiring a first color image of the surface with a first color camera of the noncontact measurement device, the first color camera having a first quality parameter; determining 3D coordinates of the plurality of points based at least in part on the point data and the baseline distance between the projector and the measurement camera; and selectively acquiring a second color image of the surface with a second color camera in response to an input from an operator, the second color camera being operably coupled to the noncontact measurement device and having a second quality parameter, the second quality parameter being different than the first quality parameter.
According to yet another aspect of the disclosure, a three-dimensional (3D) measurement system is provided. The measurement system includes a noncontact measurement device operable to measure a distance from the noncontact measurement device to a surface. The noncontact measurement device includes a projector that emits a light pattern. A measurement camera is operably coupled to the noncontact measurement device. A color camera is provided. One or more processors are operably coupled to the noncontact measurement device, the one or more processors operable to execute computer instructions that, when executed on the one or more processors, perform: selectively acquiring an image with the color camera; determining 3D coordinates of at least one point in a field of view and determining a position registration through optical tracking based at least in part on the image, wherein the at least one point includes a plurality of points, the plurality of points including a first plurality of points and a second plurality of points; determining an area of interest from the image and registering the first plurality of points with the area of interest; and deleting the second plurality of points.
According to yet another aspect of the disclosure, a three-dimensional (3D) measurement system is provided. The measurement system includes a noncontact measurement device operable to measure a distance from the noncontact measurement device to a surface. The noncontact measurement device includes a projector that emits a light pattern; a measurement camera operably coupled to the noncontact measurement device; a color camera operable to selectively operate at a first quality parameter and a second quality parameter; a user interface; and one or more processors operably coupled to the noncontact measurement device, the one or more processors operable to execute computer instructions that, when executed on the one or more processors, perform: selectively acquiring a first image with the color camera operating at the first quality parameter; selectively acquiring a second image with the color camera operating at the second quality parameter; and determining 3D coordinates of at least one point in a field of view and determining a position registration through optical tracking based at least in part on the first image.
These and other advantages and features will become more apparent from the following description taken in conjunction with the drawings.
The subject matter, which is regarded as the disclosure, is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features and advantages of the disclosure are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
The detailed description explains embodiments of the disclosure, together with advantages and features, by way of example with reference to the drawings.
Embodiments of the invention provide for a three-dimensional (3D) measurement device that acquires high quality images during a scanning process. Embodiments disclosed herein further provide for integrating high quality images into a point cloud acquired by the 3D imager. Further embodiments disclosed herein provide for selectively colorizing (texturing) the point cloud using high quality images. Still further embodiments disclosed herein provide for the automatic removal of points in the point cloud that are outside of an area of interest based on high quality images acquired by the 3D imager. Still further embodiments provide feedback to a user on the density of the point cloud based on high quality images acquired by the 3D imager.
Referring now to
In this embodiment, the color camera 40 is configured to be selectively changed between a first quality parameter that acquires a low quality image, such as an image having a resolution equal to or less than about 1.3-5 megapixels, and a second quality parameter that acquires a high quality image, such as an image having a resolution equal to or greater than about 10 megapixels. It should be appreciated that while embodiments herein describe the quality parameter based on image or sensor resolution, this is for exemplary purposes and the claims should not be so limited. In other embodiments the quality parameter may be based on other attributes of the color camera 40.
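As a rough sketch of how such a selectable quality parameter might be represented in software (the mode names and pixel counts below are assumptions for illustration, not values taken from any particular camera):

```python
from dataclasses import dataclass
from enum import Enum

class QualityMode(Enum):
    LOW = "low"    # streaming mode, e.g. ~1.3 megapixels
    HIGH = "high"  # documentation mode, e.g. ~12 megapixels

@dataclass(frozen=True)
class ColorCameraConfig:
    mode: QualityMode
    width_px: int
    height_px: int

# Hypothetical mode table: 1280x1024 is ~1.3 MP, 4000x3000 is ~12 MP.
MODES = {
    QualityMode.LOW: ColorCameraConfig(QualityMode.LOW, 1280, 1024),
    QualityMode.HIGH: ColorCameraConfig(QualityMode.HIGH, 4000, 3000),
}

def select_mode(documenting: bool) -> ColorCameraConfig:
    """Stream at low resolution by default; switch to the high-resolution
    mode only when the operator requests a documentation image."""
    return MODES[QualityMode.HIGH if documenting else QualityMode.LOW]
```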
As discussed in more detail herein, in an embodiment the projector 24 projects a pattern of light onto a surface in the environment. As used herein, the term “projector” is defined to generally refer to a device for producing a pattern. The generation of the pattern can take place by means of deflecting methods, such as generation by means of diffractive optical elements or micro-lenses (or single lasers), or by shading methods, for example production by means of shutters, transparencies (as they would be used in a transparency projector) and other masks. The deflecting methods have the advantage that less light is lost, and consequently a higher intensity is available.
The cameras 26, 28 acquire images of the pattern and are in some instances able to determine the 3D coordinates of points on the surface using trigonometric principles, e.g. epipolar geometry. In an embodiment, the cameras 26, 28 are sensitive to monochromatic light, such as light in the infrared (IR) spectrum.
It should be appreciated that while the illustrated embodiments show and describe the device that determines 3D coordinates as being an image scanner, this is for exemplary purposes and the claimed invention should not be so limited. In other embodiments, devices that use other noncontact means for measuring 3D coordinates may also be used, such as a laser scanner device that uses time-of-flight to determine the distance to the surface.
A controller 48 is coupled for communication to the projector 24, cameras 26, 28, 40 and in an embodiment the high quality camera 47. The connection may be a wired connection 50 or a wireless connection. The controller 48 is a suitable electronic device capable of accepting data and instructions, executing the instructions to process the data, and presenting the results. Controller 48 may accept instructions through user interface 52, or through other means such as but not limited to electronic data card, voice activation means, manually-operable selection and control means, radiated wavelength and electronic or electrical transfer.
Controller 48 uses signals that act as input to various processes for controlling the system 20. The digital signals represent system 20 data including, but not limited to, images acquired by cameras 26, 28, 40, temperature, ambient light levels, operator inputs via user interface 52 and the like.
Controller 48 is operably coupled with one or more components of system 20 by data transmission media 50. Data transmission media 50 includes, but is not limited to, twisted pair wiring, coaxial cable, and fiber optic cable. Data transmission media 50 also includes, but is not limited to, wireless, radio and infrared signal transmission systems. Controller 48 is configured to provide operating signals to these components and to receive data from these components via data transmission media 50.
In general, controller 48 accepts data from cameras 26, 28, 40, projector 24 and a light source, and is given certain instructions for the purpose of determining the 3D coordinates of points on surfaces being scanned. The controller 48 may compare the operational parameters to predetermined variances and, if the predetermined variance is exceeded, generate a signal that may be used to indicate an alarm to an operator or to a remote computer via a network. Additionally, the signal may initiate other control methods that adapt the operation of the system 20, such as changing the operational state of cameras 26, 28, 40, projector 24 or light source 42 to compensate for the out-of-variance operating parameter. Still other control methods may display, highlight in the display or otherwise notify the operator when a low point density is detected.
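A minimal sketch of such a variance check, with a hypothetical parameter and limits chosen purely for illustration:

```python
def within_variance(value: float, nominal: float, allowed_variance: float) -> bool:
    """Return True when an operating parameter is within its predetermined variance."""
    return abs(value - nominal) <= allowed_variance

# e.g. signal an alarm if a monitored temperature drifts more than 5 degrees
if not within_variance(value=31.2, nominal=25.0, allowed_variance=5.0):
    print("out-of-variance parameter: notify operator or adapt camera/projector state")
```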
The data received from cameras 26, 28, 40 may be displayed on a user interface 52. The user interface 52 may be an LED (light-emitting diode) display, an LCD (liquid-crystal display), a CRT (cathode ray tube) display, a touch-screen display or the like. A keypad may also be coupled to the user interface for providing data input to controller 48. In an embodiment, the controller 48 displays in the user interface 52 a point cloud to visually represent the acquired 3D coordinates.
In addition to being coupled to one or more components within system 20, controller 48 may also be coupled to external computer networks such as a local area network (LAN) and the Internet. A LAN interconnects one or more remote computers, which are configured to communicate with controller 48 using a well-known computer communications protocol such as TCP/IP (Transmission Control Protocol/Internet Protocol), RS-232, ModBus, and the like. Additional systems 20 may also be connected to LAN with the controllers 48 in each of these systems 20 being configured to send and receive data to and from remote computers and other systems 20. The LAN is connected to the Internet. This connection allows controller 48 to communicate with one or more remote computers connected to the Internet.
Controller 48 includes a processor 54 coupled to a random access memory (RAM) device 56, a non-volatile memory (NVM) device 58, a read-only memory (ROM) device 60, one or more input/output (I/O) controllers, and a LAN interface device 62 via a data communications bus.
LAN interface device 62 provides for communication between controller 48 and a network in a data communications protocol supported by the network. ROM device 60 stores an application code, e.g., main functionality firmware, including initializing parameters, and boot code, for processor 54. Application code also includes program instructions as shown in
NVM device 58 is any form of non-volatile memory such as an EPROM (Erasable Programmable Read Only Memory) chip, a disk drive, or the like. Stored in NVM device 58 are various operational parameters for the application code. The various operational parameters can be input to NVM device 58 either locally, using a user interface 52 or remote computer, or remotely via the Internet using a remote computer. It will be recognized that application code can be stored in NVM device 58 rather than ROM device 60.
Controller 48 includes operation control methods embodied in application code such as that shown in
In an embodiment, the controller 48 further includes an energy source, such as battery 64. The battery 64 may be an electrochemical device that provides electrical power for the controller 48. In an embodiment, the battery 64 may also provide electrical power to the cameras 26, 28, 40, the projector 24 and the high quality camera 47. In some embodiments, the battery 64 may be separate from the controller (e.g. a battery pack). In an embodiment, a second battery (not shown) may be disposed in the housing 36 to provide electrical power to the cameras 26, 28, 40 and projector 24. In still further embodiments, the light source 42 may have a separate energy source (e.g. a battery pack).
It should be appreciated that while the controller 48 is illustrated as being separate from the housing 36, this is for exemplary purposes and the claimed invention should not be so limited. In other embodiments, the controller 48 is integrated into the housing 36.
Referring now to
Referring now to
As will be discussed in more detail herein, the second color camera 47, or the color camera 40 in high quality mode of operation, may be used in combination with the measured 3D coordinate data to improve or enhance the point cloud data. As used herein, a point cloud is a collection or a set of data points in a coordinate system. In a three-dimensional coordinate system, these points are usually defined by X, Y, and Z coordinates, and represent the external surface of an object that is scanned with the system 20.
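A minimal representation of such a point cloud (a sketch, not the system's actual data structure) might pair an N x 3 array of coordinates with optional per-point color filled in during colorization:

```python
from dataclasses import dataclass
from typing import Optional

import numpy as np

@dataclass
class PointCloud:
    """N measured surface points in a common coordinate system."""
    xyz: np.ndarray                   # shape (N, 3): X, Y, Z coordinates
    rgb: Optional[np.ndarray] = None  # shape (N, 3): per-point color, if merged

    def __len__(self) -> int:
        return self.xyz.shape[0]

cloud = PointCloud(xyz=np.zeros((1000, 3)))
print(len(cloud))  # 1000
```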
Referring now to
In
Consider the embodiment of
To check the consistency of the image point P1, intersect the plane P3-E31-E13 with the reference plane 108 to obtain the epipolar line 114. Intersect the plane P2-E21-E12 to obtain the epipolar line 116. If the image point P1 has been determined consistently, the observed image point P1 will lie on the intersection of the determined epipolar line 114 and line 116.
To check the consistency of the image point P2, intersect the plane P3-E32-E23 with the reference plane 110 to obtain the epipolar line 105. Intersect the plane P1-E12-E21 to obtain the epipolar line 107. If the image point P2 has been determined consistently, the observed image point P2 will lie on the intersection of the determined epipolar lines 107 and 105.
To check the consistency of the projection point P3, intersect the plane P2-E23-E32 with the reference plane 110 to obtain the epipolar line 118. Intersect the plane P1-E13-E31 to obtain the epipolar line 120. If the projection point P3 has been determined consistently, the projection point P3 will lie on the intersection of the determined epipolar line 118 and line 120.
The redundancy of information provided by using a 3D imager 100 having a triangular arrangement of projector and cameras may be used to reduce measurement time, to identify errors, and to automatically update compensation/calibration parameters. It should be appreciated that based on the epipolar geometry relationships described herein, the distance from the image scanner 22 to points on the surface being scanned may be determined. By moving the image scanner 22, the determination of the pose/orientation of the image scanner, and a registration process the three dimensional coordinates of locations (point data) on a surface may be determined and the point cloud generated.
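For illustration, one common way to express these consistency checks in software uses the fundamental matrices between view pairs, obtained from calibration: a point in one view induces an epipolar line in the other, and a consistent correspondence must lie on (or very near) that line. The sketch below assumes calibrated fundamental matrices and pixel coordinates; it is not the device's actual algorithm.

```python
import numpy as np

def epipolar_residual(F: np.ndarray, p1, p2) -> float:
    """Distance (pixels) of p2 from the epipolar line that p1 induces in view 2.

    F is the 3x3 fundamental matrix mapping view-1 points to view-2 epipolar
    lines; p1 and p2 are (x, y) pixel coordinates. A consistent correspondence
    yields a residual near zero; a large residual flags a mismatch.
    """
    x1 = np.array([p1[0], p1[1], 1.0])
    x2 = np.array([p2[0], p2[1], 1.0])
    line = F @ x1  # epipolar line a*x + b*y + c = 0 in view 2
    return abs(x2 @ line) / np.hypot(line[0], line[1])

def consistent(F12, F13, F23, p1, p2, p3, tol_px: float = 1.0) -> bool:
    """Pairwise consistency across three views (two cameras plus the projector,
    whose pattern point acts like an image point)."""
    return (epipolar_residual(F12, p1, p2) < tol_px
            and epipolar_residual(F13, p1, p3) < tol_px
            and epipolar_residual(F23, p2, p3) < tol_px)
```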
It should be appreciated that since the cameras 26, 28 are sensitive to monochromatic light (e.g. in the infrared spectrum), the measured 3D coordinate points do not include color or texture information. In an embodiment, as the 3D coordinates are measured and determined, the image scanner 22 also acquires color images of the scene being scanned with the color camera 40. In one embodiment, the color data from the images acquired by color camera 40 is merged with the measured 3D coordinate data. In order to obtain a 3D point cloud of the scanned object, each image shot/frame is registered; in other words, the three-dimensional coordinates obtained in each image frame are inserted into a common coordinate system. Registration is possible, for example, by videogrammetry, i.e., for example, “structure from motion” (SFM) or “simultaneous localization and mapping” (SLAM). The natural texture of the scanned objects or environment can also be used for common points of reference, or a separate stationary pattern can be produced. The natural texture can be captured by the color camera 40 in addition to obtaining the color information. This allows the color information to be associated with each of the measured 3D coordinates. In one embodiment, the poses of the first color camera 40 and the second color camera 47 are known in relation to the poses of the projector 24 and the cameras 26, 28, and since the characteristics/parameters of the color cameras are known (e.g. focal length), the two-dimensional (2D) color data may be merged with the measured 3D coordinate data/points by projecting the rays that enter the 2D color cameras 40, 47 onto the measured 3D coordinate data/points.
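A minimal sketch of that ray-projection step, assuming an ideal pinhole camera with known intrinsics K and known world-to-camera pose (R, t); lens distortion and occlusion handling are omitted:

```python
import numpy as np

def colorize(points_xyz: np.ndarray, image: np.ndarray,
             K: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Assign each 3D point the color of the pixel it projects onto.

    points_xyz: (N, 3) points in the point cloud's coordinate system.
    image: (H, W, 3) color image; K: 3x3 intrinsics; R, t: world-to-camera.
    Points behind the camera or outside the image are left at -1.
    """
    cam = points_xyz @ R.T + t               # world frame -> camera frame
    in_front = cam[:, 2] > 1e-9
    uvw = cam[in_front] @ K.T                # pinhole projection
    uv = uvw[:, :2] / uvw[:, 2:3]            # perspective divide
    u = uv[:, 0].round().astype(int)
    v = uv[:, 1].round().astype(int)
    h, w = image.shape[:2]
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    colors = np.full((len(points_xyz), 3), -1, dtype=int)
    idx = np.flatnonzero(in_front)[inside]
    colors[idx] = image[v[inside], u[inside]]
    return colors
```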
It should be appreciated that since the color camera 40 acquires images at a rapid rate during the scanning operation, the quality level of the color image (e.g. resolution) may be lower than desired for the purpose of documenting the scene. For example, in an embodiment the system 20 is used to scan a crime scene to document the position of different objects or evidence in the area where the crime was committed. In addition to the positions, sizes and coordinates of the objects, the investigator is also likely to desire high quality (e.g. high resolution) images to document small details for later analysis. It should be appreciated that the color camera is operated at a lower quality level in order to reduce the electronic memory, storage and processing resources (e.g. CPU) used for the color images. Further, in some embodiments, the color camera 40 has a global-shutter type of sensor (i.e. all pixels are exposed at the same time) and the color camera 47 has a rolling-shutter type of sensor (i.e. there is a time shift between the exposures of different pixel rows). Compared to a rolling-shutter type of sensor, sensors with a global shutter are more limited regarding the minimum achievable pixel size. However, the global-shutter sensor is desired for improved position registration through optical tracking. Thus, in some embodiments, the use of a second color camera with a rolling-shutter type sensor provides the advantage of allowing higher resolution.
Referring now to
When the image is acquired, the position and pose information of the image scanner 22 is stored and associated with the image. This allows the image 134 (
Referring now to
From these high quality images, the area or object of interest, such as object 130 for example, may be identified. In one embodiment, the object 130 is manually identified by the user on the user interface, such as by tracing the outline of the object 130 on the user interface 52. In other embodiments, the object 130 may be automatically identified by the controller 48 using edge matching processes (e.g. the Canny edge detection method), greyscale matching methods, gradient matching methods, blob detection, or object template matching for example. In other embodiments, the object 130 may be identified using feature-based methods (e.g. surface patches, corners, linear edges). In still other embodiments, the object 130 may be compared to electronic object models, such as computer-aided-design (CAD) models for example. Once the object 130 is detected, the measured 3D coordinates of points that are not on the surfaces of object 130, such as points on the surfaces 140, 142, 144 for example, are removed from the point cloud data.
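As one hedged example of the automatic route, the sketch below finds the largest Canny-edge contour in the high quality image and keeps only the points whose projections into the image (integer pixel coordinates, computed as in the colorization sketch above) fall inside it; a production object detector would be considerably more robust:

```python
import cv2
import numpy as np

def object_mask(image_bgr: np.ndarray) -> np.ndarray:
    """Binary mask of the dominant object via Canny edges + largest contour."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)  # thresholds are illustrative
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    mask = np.zeros(gray.shape, np.uint8)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        cv2.drawContours(mask, [largest], -1, 255, thickness=cv2.FILLED)
    return mask

def keep_points_on_object(points_xyz: np.ndarray, pixel_uv: np.ndarray,
                          mask: np.ndarray) -> np.ndarray:
    """Remove measured points whose image projection lies off the object.

    pixel_uv: (N, 2) integer pixel coordinates within the mask's bounds.
    """
    u, v = pixel_uv[:, 0], pixel_uv[:, 1]
    return points_xyz[mask[v, u] > 0]
```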
In another embodiment, an area of interest is identified based on identifying common surfaces that are located in each of the images from the high quality color camera 47 (e.g. a Boolean intersection operation), such as images 136, 138 for example. In still another embodiment, an area of interest is identified from the accumulation of surfaces that are located within the images from the high quality color camera 47 (e.g. a Boolean union operation).
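In terms of point identifiers, the two combinations reduce to simple set operations; a sketch with toy point IDs follows:

```python
def area_of_interest(per_image_point_ids, combine: str = "intersection"):
    """Combine the point IDs observed in each high quality image.

    'intersection' keeps surfaces common to every image; 'union' keeps
    surfaces appearing in any image (the accumulation case).
    """
    sets = [set(ids) for ids in per_image_point_ids]
    return set.intersection(*sets) if combine == "intersection" else set.union(*sets)

# Two images observing overlapping parts of the scene:
print(area_of_interest([[1, 2, 3, 4], [3, 4, 5]], "intersection"))  # {3, 4}
print(area_of_interest([[1, 2, 3, 4], [3, 4, 5]], "union"))         # {1, 2, 3, 4, 5}
```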
Referring now to
After storing the position and pose, or when the query block 154 returns a negative, the method 150 proceeds to block 160 where the measured 3D coordinate points are determined based at least in part on the light pattern projected by the projector 24 and the images of the light pattern on surfaces acquired by cameras 26, 28. The method 150 then proceeds to block 162 where the high quality image acquired in block 156 is registered to the measured 3D coordinate points based on the position and pose of the image scanner 22 determined in block 158. The method 150 then proceeds to query block 164 where it is determined whether the user wishes to continue scanning. When the query block 164 returns a positive, the method 150 loops back to block 152. When the query block 164 returns a negative, the method 150 proceeds to stop block 166.
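The control flow of method 150 can be summarized in pseudocode; the scanner and operator interfaces below are hypothetical names invented for the sketch, not an API from the source:

```python
def method_150(scanner, operator):
    """Sketch of blocks 152-166 under an assumed scanner/operator interface."""
    while True:
        frames = scanner.acquire_pattern_frames()          # block 152
        if operator.requests_high_quality_image():         # query block 154
            image = scanner.acquire_high_quality_image()   # block 156
            pose = scanner.current_position_and_pose()     # block 158
            scanner.store(image, pose)
        points = scanner.determine_3d_coordinates(frames)  # block 160
        scanner.register_high_quality_images(points)       # block 162
        if not operator.continue_scanning():               # query block 164
            return                                         # stop block 166
```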
It should be appreciated that while
Referring now to
After the registration of the high quality image in block 162, the method 170 proceeds to block 172 where the colors from the high quality image are registered to the measured 3D coordinates. As discussed herein, the merging of the color data with the measured 3D coordinate points may be performed by projecting the rays that enter the 2D color cameras 40, 47 onto the measured 3D coordinate data/points.
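Continuing the colorize sketch above, a per-point merge that prefers the high quality colors wherever the high quality image covered the point might look like this (again a sketch, relying on the -1 marker used above for uncovered points):

```python
import numpy as np

def merge_colors(low_res_colors: np.ndarray, high_res_colors: np.ndarray) -> np.ndarray:
    """Keep the low-resolution color everywhere, overriding it with the
    high quality color for every point the high quality image covered."""
    covered = (high_res_colors >= 0).all(axis=1)
    merged = low_res_colors.copy()
    merged[covered] = high_res_colors[covered]
    return merged
```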
Referring now to
After the user has completed their scan and acquired at least one, and in some embodiments a plurality, of high quality images of the area or object of interest, the query block 164 returns a negative and the method 180 proceeds to block 182 where the area or object of interest (e.g. object 130 of
Referring now to
With the measured 3D coordinate points determined, the method 190 proceeds to query block 198 where it is determined whether the scan includes the surfaces of interest, such as surfaces 186, 188 for example. When query block 198 returns a positive, the method 190 proceeds to block 200 where the point density of the measured 3D coordinates on the surface of interest is determined. This point density is then compared to a threshold in query block 202. In an embodiment, the threshold represents a level, such as a minimum level for example, of density of the measured 3D coordinate points. In one embodiment, the threshold may be user-defined. It should be appreciated that in some embodiments, the point density may be compared to several thresholds (e.g. low, medium, high point density thresholds). When the query block 202 returns a positive, meaning that the point density is below a threshold, the method 190 then proceeds to block 204 where the user interface 52 is changed to provide feedback to the operator on the point density level(s) of the surfaces of interest.
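A sketch of the density check, with placeholder thresholds (points per square meter) that are not taken from the source:

```python
def point_density(num_points: int, surface_area_m2: float) -> float:
    """Measured points per square meter on the surface of interest."""
    return num_points / surface_area_m2

def density_level(density: float,
                  thresholds=(500.0, 2000.0, 5000.0)) -> str:
    """Classify a density against ascending user-defined thresholds."""
    for name, limit in zip(("low", "medium", "high"), thresholds):
        if density < limit:
            return name
    return "very high"

print(density_level(point_density(1200, 1.0)))  # 'medium' with these thresholds
```

The classification result can then drive the user-interface feedback of block 204, e.g. color-coding the surface of interest by density level.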
Referring to
It should be appreciated that in some embodiments, there may be multiple thresholds. In the embodiment illustrated in
Once the feedback indicator of point density is displayed on user interface 52, or when the query block 202 returns a negative, the method 190 proceeds to query block 212. In query block 212 it is determined whether the operator desires to continue with additional scanning, such as to scan additional areas or to increase the point density of a surface of interest for example. When query block 212 returns a positive, the method 190 loops back to block 194 and the process continues. When query block 212 returns a negative, the method 190 proceeds to stop block 214.
Technical effects and benefits of some embodiments include the noncontact measurement of three-dimensional coordinates of points on a surface and the acquisition of color images having a high quality parameter/level. The high quality image may be integrated into the point cloud to allow the user to view the measurement data with an additional (e.g. higher) level of detail. The high quality image may be used to improve the merging of color information into portions of the point cloud. The high quality image may further be used to identify the object or area of interest to allow points outside of the object/area of interest to be deleted from the point cloud. The deletion of points from the point cloud may improve the performance of the controller and improve the visualization of the point cloud for the user. The high quality image may still further be used to provide feedback to the operator during the scanning operation to determine whether a desired level of point density has been achieved.
The term “about” is intended to include the degree of error associated with measurement of the particular quantity based upon the equipment available at the time of filing the application. For example, “about” can include a range of ±8% or 5%, or 2% of a given value.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, element components, and/or groups thereof.
While the disclosure is provided in detail in connection with only a limited number of embodiments, it should be readily understood that the disclosure is not limited to such disclosed embodiments. Rather, the disclosure can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the disclosure. Additionally, while various embodiments of the disclosure have been described, it is to be understood that the exemplary embodiment(s) may include only some of the described exemplary aspects. Accordingly, the disclosure is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.
This application claims the benefit of U.S. Provisional Application Ser. No. 62/521,620 filed Jun. 19, 2017, the entire disclosure of which is incorporated herein by reference.