The present invention relates generally to a system and method of measuring an environment, and in particular, to a system and method that rapidly measures planes in an environment and generates a sparse point cloud.
Noncontact measurement systems, such as a 3D imager for example, use a triangulation method to measure the 3D coordinates of points on an object. The 3D imager usually includes a projector that projects onto a surface of the object either a pattern of light in a line or a pattern of light covering an area. A camera is coupled to the projector in a fixed relationship, for example, by attaching a camera and the projector to a common frame. The light emitted from the projector is reflected off the object surface and detected by the camera. Since the camera and projector are arranged in a fixed relationship, the distance to the object may be determined using trigonometric principles. Compared to coordinate measurement devices that use tactile probes, triangulation systems provide advantages in quickly acquiring coordinate data over a large area. As used herein, the resulting collection of 3D coordinate values or data points of the object being measured by the triangulation system is referred to as point cloud data or simply a point cloud.
With traditional 3D imagers, the number of data points acquired on the object would be large, such as X points per square meter for example. This high point density allowed the 3D imager to acquire very accurate representations of the object or environment being scanned. As a result, these traditional 3D imagers tended to be very expensive and in some cases required specialized training to use. Further, due to the number of points acquired, the process could be slow and computationally intensive.
It should be appreciated that traditional 3D imagers may not be suitable or desirable for some applications due to cost and speed considerations. These applications include measurements made by architects, building contractors, or carpenters, for example. In these applications, an existing space within a building may be in the process of being renovated. Typically, these measurements are made manually (e.g., with a tape measure) and written on paper with a sketch of the area. As a result, measurements may contain errors, be missing, or otherwise be incomplete, requiring multiple visits to the job site.
Accordingly, while existing triangulation-based 3D imager devices are suitable for their intended purpose, the need for improvement remains, particularly in providing a low cost noncontact measurement device that may quickly measure an environment.
According to an embodiment of the present invention, a system for noncontact measurement of an environment is provided. The system includes a measurement system having a baseplate and a light projector, the light projector being mounted to the baseplate. A tablet computer is coupled to the measurement system, the tablet computer having a processor, memory, a user interface and a tablet camera, the processor being operably coupled to the light projector, the processor being responsive to nontransitory executable computer instructions which, when executed on the processor, perform a method comprising: causing the light projector to emit a light pattern onto a plurality of surfaces in the environment and causing the tablet camera to acquire an image of the light pattern at a first instance on the plurality of surfaces; causing the tablet camera to acquire a color image of the plurality of surfaces; determining three-dimensional coordinates of points on the plurality of surfaces based at least in part on the image of the light pattern acquired at the first instance, and determining at least one plane in the environment from the three-dimensional coordinates; determining a plurality of contrast lines in the color image; matching at least two lines of the plurality of contrast lines to the at least one plane; and aligning the color image and the three-dimensional coordinates of points into a common coordinate frame of reference.
According to an embodiment of the present invention, a method for noncontact measurement of an environment is provided. The method comprises: projecting a light pattern with a light projector onto a plurality of surfaces in the environment; acquiring at a first instance, with at least one camera, an image of the light pattern on the plurality of surfaces, the light projector and the at least one camera being coupled to a baseplate in a predetermined geometrical relationship; acquiring a color image of the plurality of surfaces with a camera of a tablet computer; determining three-dimensional coordinates of points on the plurality of surfaces based at least in part on the image of the light pattern acquired at the first instance and the predetermined geometrical relationship; determining at least one plane in the environment from the three-dimensional coordinates; determining a plurality of lines in the color image; matching at least two lines of the plurality of lines to the at least one plane; and aligning the color image and the three-dimensional coordinates of points into a common coordinate frame of reference.
According to an embodiment of the present invention, a system for noncontact measurement of an environment is provided. The system includes a case having a first side and a second side. A measurement system is coupled to the first side, the measurement system having a baseplate, a light projector and at least one camera, the light projector and a first camera being mounted to the baseplate in a predetermined geometric relationship, the at least one camera having a first resolution. A tablet computer is coupled to the second side, the tablet computer having a processor, memory, a user interface and a second camera, the user interface being visible to an operator from the second side, the second camera having a second resolution, the second resolution being higher than the first resolution, the processor being operably coupled to the light projector and the at least one camera, the processor being responsive to nontransitory executable computer instructions which, when executed on the processor, perform a method comprising: causing the light projector to emit a light pattern onto a plurality of surfaces in the environment; causing the first camera to acquire a first image of the light pattern on the plurality of surfaces; causing the second camera to acquire a second image of the light pattern; causing the second camera to acquire a color image of the plurality of surfaces; determining three-dimensional coordinates of points on the plurality of surfaces based at least in part on the first image and the second image of the light pattern and the predetermined geometric relationship, and determining at least one plane in the environment from the three-dimensional coordinates; determining a plurality of lines in the color image; matching at least two lines of the plurality of lines to the at least one plane; and aligning the color image and the three-dimensional coordinates of points into a common coordinate frame of reference.
These and other advantages and features will become more apparent from the following description taken in conjunction with the drawings.
The subject matter, which is regarded as the invention, is particularly pointed out and distinctly claimed in the claims at the conclusion of the specification. The foregoing and other features and advantages of the invention are apparent from the following detailed description taken in conjunction with the accompanying drawings in which:
The detailed description explains embodiments of the invention, together with advantages and features, by way of example with reference to the drawings.
Embodiments of the present invention provide advantages in rapid three-dimensional measurements of the environment with a portable handheld device. Embodiments of the present invention provide further advantages in providing an imaging device that cooperates with a mobile computing device, such as a tablet computer, for rapidly acquiring three-dimensional measurements of an environment.
Referring now to
The 3D imager 20 includes an optical system 22 having a first optical device 24, a second optical device 26 and a third optical device 28. In the exemplary embodiment, the optical devices 24, 26 are camera assemblies each having a lens and a photosensitive array. The third optical device 28 is a projector having a light source and a means for projecting a light pattern. The devices 24, 26, 28 are mounted to a baseplate 30. The baseplate 30 is made from a material with a uniform and low coefficient of thermal expansion so that the geometric arrangement of the devices 24, 26, 28 and the baseline distances therebetween are known. In an embodiment, the baseplate 30 is made from carbon fiber. In an embodiment, the devices 24, 26, 28 are mounted to an electronic circuit 29 that is mounted on the baseplate 30.
The baseplate 30 is coupled to a case 32. In the exemplary embodiment, the case 32 is made from a material, such as an elastomeric material for example, that protects the optical system and a mobile computing device 34 during operation and transport.
In the exemplary embodiment, the mobile computing device 34 is a tablet style computer that is removably coupled to the case 32. In an embodiment, the optical system 22 is disposed on a first side of the case 32 and the mobile computing device 34 is removably coupled to a second side of the case 32 opposite the first side.
The 3D imager 20 operation is controlled by the mobile computing device 34. Mobile computing device 34 is a suitable electronic device capable of accepting data and instructions, executing the instructions to process the data, and presenting the results. Mobile computing device 34 may accept instructions through a user interface, or through other means such as but not limited to electronic data card, voice activation means, manually-operable selection and control means, radiated wavelength and electronic or electrical transfer. Therefore, mobile computing device 34 can be a microprocessor, microcomputer, a minicomputer, an optical computer, a board computer, a complex instruction set computer, an ASIC (application specific integrated circuit), a reduced instruction set computer, an analog computer, a digital computer, a molecular computer, a quantum computer, a cellular computer, a superconducting computer, a supercomputer, a solid-state computer, a single-board computer, a buffered computer, a computer network, a desktop computer, a laptop computer, a scientific computer, a scientific calculator, or a hybrid of any of the foregoing.
Mobile computing device 34 is capable of receiving signals from the optical system 22 (such as via electronic circuit 29) that represent images of the surfaces in the environment. Mobile computing device 34 uses the images as input to various processes for determining three-dimensional (3D) coordinates of the environment.
Mobile computing device 34 is operably coupled to communicate with the optical system 22 via a data transmission media. The data transmission media includes, but is not limited to, twisted pair wiring, coaxial cable, and fiber optic cable. The data transmission media also includes, but is not limited to, wireless, radio and infrared signal transmission systems. Mobile computing device 34 is configured to provide operating signals to components in the optical system 22 and to receive data from these components via the data transmission media.
In general, mobile computing device 34 accepts data from optical system 22 and is given certain instructions for the purpose of determining 3D coordinates. Mobile computing device 34 provides operating signals to optical system 22, such as to project a light pattern and acquire an image. Additionally, the signals may initiate other control methods that adapt the operation of the 3D Imager 20 to compensate or calibrate for operating parameters that are out of variance. For example, in an embodiment the calibration of the devices 24, 26, 28 is performed at each time instant where a light pattern is projected and an image is acquired.
The data received from optical system 22, the 3D coordinates, or an image from a camera integrated into the mobile computing device 34 may be displayed on a user interface coupled to controller 38. The user interface may be an LCD (liquid crystal display) touch screen, or the like. A keypad may also be coupled to the user interface for providing data input to mobile computing device 34.
In addition to being coupled to one or more components within 3D Imager 20, mobile computing device 34 may also be coupled to external computer networks such as a local area network (LAN) and the Internet. The LAN interconnects one or more remote computers, which are configured to communicate with mobile computing device 34 using a well-known computer communications protocol such as TCP/IP (Transmission Control Protocol/Internet Protocol), RS-232, ModBus, and the like. Additional 3D Imagers 20 may also be connected to the LAN, with the mobile computing devices 34 in each of these 3D Imagers 20 being configured to send and receive data to and from remote computers and other 3D Imagers 20. In an embodiment, the LAN is connected to the Internet. This connection allows mobile computing device 34 to communicate with one or more remote computers connected to the Internet.
Referring now to
I/O controllers 44 are coupled to components within the mobile computing device 34, such as the user interface 48, an inertial measurement unit 52 (IMU), and a digital camera 50, for providing digital data between these devices and the processor 36. I/O controllers 44 may also be coupled to analog-to-digital (A/D) converters, which receive analog data signals. In an embodiment, the mobile computing device 34 may include a port 54 that allows the mobile computing device to transfer and receive signals from external devices.
The IMU 52 is a position/orientation sensor that may include accelerometers (inclinometers), gyroscopes, a magnetometer or compass, and altimeters. In the exemplary embodiment, the IMU 52 includes multiple accelerometers and gyroscopes. The compass indicates a heading based on changes in magnetic field direction relative to the earth's magnetic north. The IMU 52 may further have an altimeter that indicates altitude (height). An example of a widely used altimeter is a pressure sensor. By combining readings from a combination of position/orientation sensors with a fusion algorithm that may include a Kalman filter, relatively accurate position and orientation measurements can be obtained using relatively low-cost sensor devices. In the exemplary embodiment, the IMU 52 determines the pose or orientation of the 3D Imager 20 about three axes to allow a determination of yaw, roll, and pitch parameters.
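As a non-limiting illustration of the sensor fusion described above, the sketch below blends gyroscope and accelerometer readings into roll and pitch estimates using a simple complementary filter. It is a stand-in for, and is much simpler than, a Kalman-filter-based fusion; the function name, axis conventions, and blend coefficient are assumptions made for the example only.

```python
import math

def complementary_filter(roll, pitch, gyro, accel, dt, alpha=0.98):
    """Fuse gyroscope rates (rad/s) and accelerometer readings (m/s^2)
    into updated roll/pitch estimates (rad)."""
    # Propagate the previous estimate by integrating the gyroscope rates.
    roll_gyro = roll + gyro[0] * dt
    pitch_gyro = pitch + gyro[1] * dt

    # Derive absolute roll/pitch from the gravity vector seen by the accelerometer.
    ax, ay, az = accel
    roll_acc = math.atan2(ay, az)
    pitch_acc = math.atan2(-ax, math.sqrt(ay * ay + az * az))

    # Blend: trust the gyroscope over short intervals, the accelerometer long-term.
    roll = alpha * roll_gyro + (1.0 - alpha) * roll_acc
    pitch = alpha * pitch_gyro + (1.0 - alpha) * pitch_acc
    return roll, pitch
```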
Communications interface device 46 provides for communication between mobile computing device 34 and external devices or networks (e.g. a LAN) using a data communications protocol supported by the device or network. ROM device 42 stores application code, e.g., main functionality firmware, including initialization parameters and boot code, for processor 36. Application code also includes program instructions as shown in
NVM device 40 is any form of non-volatile memory such as an EPROM (Erasable Programmable Read Only Memory) chip, a disk drive, or the like. Stored in NVM device 40 are various operational parameters for the application code. The various operational parameters can be input to NVM device 40 either locally, using the user interface 48 or a remote computer, or remotely via the Internet using a remote computer. It will be recognized that application code can be stored in NVM device 40 rather than ROM device 42.
Mobile computing device 34 includes operation control methods embodied in application code shown in
Referring now to
The ray of light 76 intersects the surface 62 in a point 80, from which light is reflected (scattered) off the surface and sent through the camera lens 82 to create a clear image of the pattern on the surface 62 on the surface of a photosensitive array 84. The light from the point 80 passes in a ray 86 through the camera perspective center 88 to form an image spot at the corrected point 90. The image spot is corrected in position to account for aberrations in the camera lens. A correspondence is obtained between the point 90 on the photosensitive array 84 and the point 78 on the illuminated projector pattern generator 70. As explained herein below, the correspondence may be obtained by using a coded or an uncoded (sequentially projected) pattern. Once the correspondence is known, the angles a and b in
As used herein, the term “pose” refers to a combination of a position and an orientation. In an embodiment, the position and the orientation are desired for the camera and the projector in a frame of reference of the optical system 22. Since a position is characterized by three translational degrees of freedom (such as x, y, z) and an orientation is composed of three orientational degrees of freedom (such as roll, pitch, and yaw angles), the term pose defines a total of six degrees of freedom. In a triangulation calculation, a relative pose of the camera and the projector is desired within the frame of reference of the optical system 22. As used herein, the term “relative pose” is used because the perspective center of the camera or the projector can be located at an (arbitrary) origin of the optical system 22; one direction (say the x axis) can be selected along the baseline; and one direction can be selected perpendicular to the baseline and perpendicular to an optical axis. In most cases, a relative pose described by six degrees of freedom is sufficient to perform the triangulation calculation. For example, the origin of the optical system 22 can be placed at the perspective center of the camera. The baseline (between the camera perspective center and the projector perspective center) may be selected to coincide with the x axis. The y axis may be selected perpendicular to the baseline and the optical axis of the camera. Two additional angles of rotation are used to fully define the orientation of the camera system. Three additional angles of rotation are used to fully define the orientation of the projector. In this embodiment, six degrees-of-freedom define the state of the optical system 22: one baseline, two camera angles, and three projector angles. In other embodiments, other coordinate representations are possible.
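As a non-limiting illustration of the triangulation principle referenced above, the sketch below applies the law of sines to the triangle formed by the two perspective centers and the object point, given the baseline length and the angles that the camera ray and projector ray make with the baseline. The function name and the example values are hypothetical.

```python
import math

def triangulate_from_angles(baseline, angle_a, angle_b):
    """Return the range from the camera perspective center to the object point
    and the perpendicular distance of the point from the baseline, given the
    baseline length and the angles a, b (radians) of the two rays to the baseline."""
    # The third angle of the triangle is pi - a - b; apply the law of sines.
    range_from_camera = baseline * math.sin(angle_b) / math.sin(angle_a + angle_b)
    perpendicular_distance = range_from_camera * math.sin(angle_a)
    return range_from_camera, perpendicular_distance

# Example: a 100 mm baseline with rays at 70 and 80 degrees to the baseline.
print(triangulate_from_angles(100.0, math.radians(70), math.radians(80)))
```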
Referring now to
The inclusion of two cameras 104 and 106 in the system 100 provides advantages over the device of
A triangular arrangement of the cameras 104, 106 with the projector 102 provides additional information beyond that available for two cameras and a projector arranged in a straight line. The additional information may be understood in reference to
In
Consider the embodiment of
To check the consistency of the image point P1, intersect the plane P3-E31-E13 with the reference plane 178 to obtain the epipolar line 184. Intersect the plane P2-E21-E12 with the reference plane 178 to obtain the epipolar line 186. If the image point P1 has been determined consistently, the observed image point P1 will lie on the intersection of the determined epipolar lines 184, 186.
To check the consistency of the image point P2, intersect the plane P3-E32-E23 with the reference plane 180 to obtain the epipolar line 188. Intersect the plane P1-E12-E21 with the reference plane 180 to obtain the epipolar line 190. If the image point P2 has been determined consistently, the observed image point P2 will lie on the intersection of the determined epipolar lines 188, 190.
To check the consistency of the projection point P3, intersect the plane P2-E23-E32 with the reference plane 182 to obtain the epipolar line 194. Intersect the plane P1-E13-E31 with the reference plane 182 to obtain the epipolar line 196. If the projection point P3 has been determined consistently, the projection point P3 will lie on the intersection of the determined epipolar lines 194, 196.
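As a non-limiting illustration, a two-view form of this kind of consistency check can be expressed with a fundamental matrix F relating a pair of the three devices: the epipolar line induced by a point observed in one view is computed, and the distance of the corresponding point observed in the other view from that line is tested. The sketch below is written under that two-view assumption and is not the specific three-view procedure described above; the tolerance and names are illustrative.

```python
import numpy as np

def epipolar_consistency(F, x1, x2, tol_pixels=1.0):
    """Check whether point x2 in view 2 lies on the epipolar line induced by
    point x1 in view 1 through the fundamental matrix F. Points are (u, v) pixels."""
    p1 = np.array([x1[0], x1[1], 1.0])
    p2 = np.array([x2[0], x2[1], 1.0])
    line = F @ p1                          # epipolar line a*u + b*v + c = 0 in view 2
    distance = abs(p2 @ line) / np.hypot(line[0], line[1])
    return distance <= tol_pixels, distance
```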
The redundancy of information provided by using a 3D imager 20 having a triangular arrangement of projector and cameras may be used to reduce measurement time, to identify errors, and to automatically update compensation/calibration parameters.
Referring now to
In an embodiment, images of a “sparse” light pattern are acquired. As used herein, a “sparse” pattern is a pattern of elements that is acquired in a single instant. In an embodiment, the light pattern projected by the projector 102 has a density of about 1000 points per square meter. Thus, the resulting point cloud of the surfaces 206, 208, 210, 212, 214 will have a density no greater than that of the light pattern. This is different from prior art imaging systems that acquire multiple images, typically at about 10 images per second or higher. As a result, the point cloud generated from the sparse light pattern contains about 1000 times fewer points than the point clouds generated by prior art imagers. It should be appreciated that this reduces the processing power and time needed to prepare a point cloud.
In an embodiment, the projector 102 projects the pattern of light at a wavelength that is not visible to the human eye, such as in the infrared spectrum at about 700 nm. In this embodiment, the cameras 104, 106 are configured to be sensitive to light having the wavelength of the light pattern.
The method 200 then proceeds to block 218 where a two-dimensional color image (i.e., an image lacking depth information) of the environment 204 is acquired using the camera 50 of the mobile computing device 34. In an embodiment, the camera 50 is a color camera (e.g. having red, green and blue pixels). In an embodiment, the camera 50 has a resolution (number of pixels) that is greater than that of the cameras 104, 106. In an embodiment, the camera 50 has a resolution of about 8 megapixels. It should be appreciated that acquisition of the color image in block 218 may occur simultaneously with the acquisition of the first image and second image in block 216.
The method 200 then proceeds to block 220 where the 3D coordinates of the elements of the light pattern are determined based at least in part on the first image and the second image acquired by cameras 104, 106. In an embodiment, the 3D coordinates are determined in the manner described with respect to
The method 200 then proceeds to block 222 where planes defined by surfaces 206, 208, 210, 212, 214 are determined from the point cloud. It should be appreciated that in other embodiments, the surfaces may be fit to other geometric primitives, such as but not limited to cylinders or spheres. In an embodiment, the planes are determined by iteratively selecting three points within the point cloud to define a plane. Normals estimated at the remaining points in the point cloud are then compared to the normal of the plane. When a remaining point either lies on the plane or its normal is within a predetermined tolerance of the plane normal, the point is defined as being on the plane. Once all of the points in the point cloud have been compared, the points on the plane are removed from the data set and another set of three points is selected to define a plane. In an embodiment, this process is performed iteratively until all of the points (or substantially all of the points) are defined as being on a plane. In one embodiment, the method of defining the planes is based on a random sample consensus (RANSAC) method. In an embodiment, when the planes are identified, they are classified as being vertical (e.g. walls) or horizontal (e.g. floors or ceilings).
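By way of a non-limiting illustration, the sketch below shows one way a RANSAC-style plane extraction such as block 222 could be realized. Inliers are selected here by their distance to the candidate plane; the normal-comparison test described above could be added as a further filter. The function names, thresholds, and the use of NumPy are assumptions made for the example.

```python
import numpy as np

def fit_plane_ransac(points, dist_tol=0.01, iterations=200, rng=None):
    """Fit one plane to an (N, 3) array of 3D points by repeatedly sampling three
    points. Returns (normal, d, inlier_mask) for the plane n . x + d = 0 with the
    most inliers."""
    rng = np.random.default_rng() if rng is None else rng
    best_mask, best_plane = None, None
    for _ in range(iterations):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                      # degenerate (nearly collinear) sample
            continue
        normal = normal / norm
        d = -normal @ sample[0]
        mask = np.abs(points @ normal + d) < dist_tol   # distance of every point to the plane
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_plane = mask, (normal, d)
    return best_plane[0], best_plane[1], best_mask

def segment_planes(points, min_inliers=50):
    """Iteratively extract planes, removing each plane's inliers from the data set."""
    planes, remaining = [], np.asarray(points, dtype=float)
    while len(remaining) >= min_inliers:
        normal, d, mask = fit_plane_ransac(remaining)
        if mask.sum() < min_inliers:
            break
        planes.append((normal, d, remaining[mask]))
        remaining = remaining[~mask]
    return planes
```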
The method 200 then proceeds to block 224 where lines are identified in the color image acquired by camera 50. It should be appreciated that lines in a two-dimensional image may, at least in some instances, represent edges that define the boundary of a plane. For example, the corner 226 is a line in the 2D image representing the boundary between the surface 206 and the surface 214. Similarly, the intersection of the floor 208 and the surface 206 is a line 228 in the two-dimensional image. These lines may be identified using image processing methods, such as Canny edge detection, Marr-Hildreth edge detection, Gaussian filtering, the discrete Laplace operator, or the Hough transform, for example.
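As a non-limiting illustration, a line-detection step such as block 224 might be implemented with OpenCV using Canny edge detection followed by a probabilistic Hough transform. The parameter values shown are illustrative assumptions, not tuned values from the described embodiments.

```python
import cv2
import numpy as np

def detect_lines(color_image):
    """Find candidate boundary line segments in a 2D color image.
    Returns an array of segments as (x1, y1, x2, y2) rows."""
    gray = cv2.cvtColor(color_image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                       # Canny edge detection
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=80, minLineLength=40, maxLineGap=10)
    return np.empty((0, 4), int) if segments is None else segments.reshape(-1, 4)
```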
The method 200 then proceeds to block 230 where the lines identified in block 224 are matched with the planes identified in block 222. In an embodiment, the matching of the lines with the edges of the planes is performed using the method described in “Brute Force Matching Between Camera Shots and Synthetic Images from Point Clouds” by R. Boerner et al., published in The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume XLI-B5, 2016. In another embodiment, the matching may be performed using the method described in “Registration of Images to Unorganized 3D Point Clouds Using Contour Cues” by Alba Pujol-Miro et al., published in the 25th European Signal Processing Conference (EUSIPCO), 2017.
With the lines matched to the point cloud, the method 200 then proceeds to block 232 where the image of the environment 204 is displayed to the operator, such as on user interface 48 for example. Since the displayed two-dimensional image is aligned and associated with the 3D point cloud, the operator may select points on the displayed two-dimensional image and be provided with the distance between the points. In an embodiment, when the operator selects two points, the method 200 returns a distance by determining a correspondence between the selected points on the two-dimensional image and the point cloud. This may include identifying locations on the planes (determined in block 222) and then determining a distance therebetween. It should be appreciated that the method 200 may also return other measurements, such as the area of a plane or the volume of a space, for example.
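As a non-limiting illustration of how a selection on the aligned two-dimensional image might be converted into a measurement, the sketch below back-projects a picked pixel through assumed camera intrinsics K into a viewing ray, intersects that ray with a fitted plane expressed in the camera frame, and returns the distance between two such 3D points. The helper names, the intrinsics, and the plane parameterization are assumptions for the example.

```python
import numpy as np

def pixel_to_point_on_plane(pixel, K, plane_normal, plane_d):
    """Back-project pixel (u, v) through intrinsics K and intersect the viewing
    ray with the plane n . x + d = 0, both expressed in the camera frame."""
    u, v = pixel
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])   # ray direction from the camera center
    t = -plane_d / (plane_normal @ ray)              # ray parameter at the plane
    return t * ray                                   # 3D point in the camera frame

def measure_distance(pixel_a, pixel_b, K, plane_a, plane_b):
    """Distance between two picked pixels, each associated with a fitted plane (n, d)."""
    p1 = pixel_to_point_on_plane(pixel_a, K, *plane_a)
    p2 = pixel_to_point_on_plane(pixel_b, K, *plane_b)
    return float(np.linalg.norm(p2 - p1))
```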
It should be appreciated that the 3D imager 20 and the method 200 provide for the rapid acquisition of three-dimensional information of an environment based on a sparse point cloud and provide a means for the operator to obtain measurement information (distance, area, volume, etc.) for arbitrary or desired locations within the environment.
Referring now to
In an embodiment where the 3D imager 250 of
After a correspondence is determined among projected and imaged elements, a triangulation calculation is performed to determine 3D coordinates of the projected element on an object. In an embodiment, the elements are uncoded spots projected in an uncoded pattern. In an embodiment, a triangulation calculation is performed based on selection of a spot for which correspondence has been obtained on each of two cameras. In this embodiment, the relative position and orientation of the two cameras are used. For example, the baseline distance between the perspective centers is used to perform a triangulation calculation based on the first image of the first camera 254 and on the second image of the second camera 256. Likewise, the first baseline is used to perform a triangulation calculation based on the projected pattern of the projector 252 and on the second image of the second camera 256. Similarly, the second baseline is used to perform a triangulation calculation based on the projected pattern of the projector 252 and on the first image of the first camera 254. In an embodiment of the present invention, the correspondence is determined based at least on an uncoded pattern of uncoded elements projected by the projector, a first image of the uncoded pattern captured by the first camera, and a second image of the uncoded pattern captured by the second camera. In an embodiment, the correspondence is further based at least in part on a position of the projector, the first camera, and the second camera. In a further embodiment, the correspondence is further based at least in part on an orientation of the projector, the first camera, and the second camera.
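As a non-limiting illustration, one common two-camera triangulation finds the point of closest approach of the two rays that pass from the corresponding image spots through the respective perspective centers (for example, those of cameras 254 and 256). The sketch below assumes the perspective centers and unit ray directions are already known in a common frame and that the rays are not parallel.

```python
import numpy as np

def triangulate_rays(c1, d1, c2, d2):
    """Return the midpoint of closest approach of two rays (center c, unit
    direction d) and the miss distance between them."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    b = c2 - c1
    # Solve for ray parameters (s, t) minimizing |(c1 + s*d1) - (c2 + t*d2)|.
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    s, t = np.linalg.solve(A, np.array([d1 @ b, d2 @ b]))
    p1, p2 = c1 + s * d1, c2 + t * d2
    return (p1 + p2) / 2.0, float(np.linalg.norm(p1 - p2))
```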
The term “uncoded element” or “uncoded spot” as used herein refers to a projected or imaged element that includes no internal structure that enables it to be distinguished from other uncoded elements that are projected or imaged. The term “uncoded pattern” as used herein refers to a pattern in which information is not encoded in the relative positions of projected or imaged elements. For example, one method for encoding information into a projected pattern is to project a quasi-random pattern of “dots” in which the relative position of the dots is known ahead of time and can be used to determine correspondence of elements in two images or in a projection and an image. Such a quasi-random pattern contains information that may be used to establish correspondence among points and hence is not an example of an uncoded pattern. An example of an uncoded pattern is a rectilinear pattern of projected pattern elements.
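As a small illustration only, an uncoded rectilinear pattern of the kind described above might be represented as a regular grid of identical spots on the projector reference plane. The grid size and pitch below are arbitrary assumptions; every spot is the same and its position carries no encoded information.

```python
import numpy as np

def rectilinear_spot_pattern(rows=25, cols=40, pitch=1.0):
    """Generate an uncoded rectilinear grid of projector spots as (rows*cols, 2)
    coordinates in the projector reference plane."""
    ys, xs = np.mgrid[0:rows, 0:cols]
    return np.column_stack([xs.ravel() * pitch, ys.ravel() * pitch])
```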
In an embodiment, uncoded spots are projected in an uncoded pattern as illustrated in the 3D imager 250 of
In an embodiment, the illuminated object spot 270 produces a first image spot 278 on the first image plane 280 of the first camera 254. The direction from the first image spot to the illuminated object spot 270 may be found by drawing a straight line 282 from the first image spot 278 through the first camera perspective center 284. The location of the first camera perspective center 284 is determined by the characteristics of the first camera optical system.
In an embodiment, the illuminated object spot 270 produces a second image spot 286 on the second image plane 288 of the second camera 256. The direction from the second image spot 286 to the illuminated object spot 270 may be found by drawing a straight line 290 from the second image spot 286 through the second camera perspective center 292. The location of the second camera perspective center 292 is determined by the characteristics of the second camera optical system.
The method 300 then proceeds to block 304 that includes capturing with a first camera the illuminated object spots as first-image spots in a first image. In the embodiment of
In an embodiment, the method 300 then proceeds to block 308 to determine the 3D coordinates of at least some of the spots 266. In an embodiment, a first aspect includes determining with a processor 3D coordinates of a first collection of points on the object based at least in part on the first uncoded pattern of uncoded spots, the first image, the second image, the relative positions of the projector, the first camera, and the second camera, and a selected plurality of intersection sets. In the embodiment of
In an embodiment, a second aspect of the determining of 3D coordinates in block 308 includes selecting with the processor a plurality of intersection sets, each intersection set including a first spot, a second spot, and a third spot, the first spot being one of the uncoded spots in the projector reference plane, the second spot being one of the first-image spots, the third spot being one of the second-image spots. The selecting of each intersection set is based at least in part on the nearness of intersection of a first line, a second line, and a third line, the first line being a line drawn from the first spot through the projector perspective center, the second line being a line drawn from the second spot through the first-camera perspective center, the third line being a line drawn from the third spot through the second-camera perspective center. In the embodiment of
The processor 260 may determine the nearness of intersection of the first line, the second line, and the third line based on any of a variety of criteria. For example, in an embodiment, the criterion for the nearness of intersection is based on a distance between a first 3D point and a second 3D point. In an embodiment, the first 3D point is found by performing a triangulation calculation using the first image point 278 and the second image point 286, with the baseline distance used in the triangulation calculation being the distance between the perspective centers 284, 292. In the embodiment, the second 3D point is found by performing a triangulation calculation using the first image point 278 and the projector point 272, with the baseline distance used in the triangulation calculation being the distance between the perspective centers 284, 276. If the three lines 274, 282, 290 nearly intersect at the object point 270, then the calculation of the distance between the first 3D point and the second 3D point will result in a relatively small distance. On the other hand, a relatively large distance between the first 3D point and the second 3D point would indicate that the points 272, 278, 286 did not all correspond to the object point 270.
As another example, in an embodiment, the criterion for the nearness of the intersection is based on a maximum of closest-approach distances between each of the three pairs of lines. This situation is illustrated in
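As a non-limiting illustration of this criterion, the sketch below computes the closest-approach distance for each of the three pairs of lines (for example, the projector ray and the two camera rays of a candidate intersection set) and returns the maximum. Representing each line as a point plus a unit direction is an assumption made for the example.

```python
import numpy as np

def line_pair_distance(p1, d1, p2, d2):
    """Minimum distance between two 3D lines, each given by a point p and a unit direction d."""
    n = np.cross(d1, d2)
    if np.linalg.norm(n) < 1e-12:                    # parallel lines
        return float(np.linalg.norm(np.cross(p2 - p1, d1)))
    return float(abs((p2 - p1) @ n) / np.linalg.norm(n))

def nearness_of_intersection(lines):
    """lines: three (point, direction) tuples. Smaller values indicate that the
    three lines more nearly intersect at a common point."""
    (pa, da), (pb, db), (pc, dc) = lines
    return max(line_pair_distance(pa, da, pb, db),
               line_pair_distance(pa, da, pc, dc),
               line_pair_distance(pb, db, pc, dc))
```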
The processor 260 may use many other criteria to establish the nearness of intersection. For example, for the case in which the three lines were coplanar, a circle inscribed in a triangle formed from the intersecting lines would be expected to have a relatively small radius if the three points 272, 278, 286 corresponded to the object point 270. For the case in which the three lines were not coplanar, a sphere having tangent points contacting the three lines would be expected to have a relatively small radius.
It should be noted that the selecting of intersection sets based at least in part on a nearness of intersection of the first line, the second line, and the third line is not used in other projector-camera methods based on triangulation. For example, for the case in which the projected points are coded points, which is to say, recognizable as corresponding when compared on projection and image planes, there is no need to determine a nearness of intersection of the projected and imaged elements. Likewise, when a sequential method is used, such as the sequential projection of phase-shifted sinusoidal patterns, there is no need to determine the nearness of intersection as the correspondence among projected and imaged points is determined based on a pixel-by-pixel comparison of phase determined based on sequential readings of optical power projected by the projector and received by the camera(s). The method further includes storing the 3D coordinates of the first collection of points.
Once the 3D coordinates are determined in block 308, the method 300 proceeds to block 310 where the 3D coordinates are stored in memory, such as memory 40 (
It should be appreciated that the embodiments of
Technical effects and benefits of some embodiments include providing a method and a system that allow for measurements of an environment to be quickly performed by measuring the three-dimensional coordinates of points on surfaces within the environment by acquiring images at a single instance in time.
The term “about” is intended to include the degree of error associated with measurement of the particular quantity based upon the equipment available at the time of filing the application. For example, “about” can include a range of ±8% or 5%, or 2% of a given value.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, element components, and/or groups thereof.
The corresponding structures, materials, acts, and equivalents of all means or step plus function elements in the claims below are intended to include any structure, material, or act for performing the function in combination with other claimed elements as specifically claimed. The description of the present invention has been presented for purposes of illustration and description, but is not intended to be exhaustive or limited to the invention in the form disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the invention. The embodiments were chosen and described in order to best explain the principles of the invention and the practical application, and to enable others of ordinary skill in the art to understand the invention for various embodiments with various modifications as are suited to the particular use contemplated.
The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
While the invention has been described in detail in connection with only a limited number of embodiments, it should be readily understood that the invention is not limited to such disclosed embodiments. Rather, the invention can be modified to incorporate any number of variations, alterations, substitutions or equivalent arrangements not heretofore described, but which are commensurate with the spirit and scope of the invention. Additionally, while various embodiments of the invention have been described, it is to be understood that aspects of the invention may include only some of the described embodiments. Accordingly, the invention is not to be seen as limited by the foregoing description, but is only limited by the scope of the appended claims.
This application claims the benefit of U.S. Provisional Application Ser. No. 62/644,127 filed Mar. 16, 2018, the contents of which is incorporated herein by reference in its entirety.