Projection mapping, which may also be known as video mapping or spatial augmented reality, is a projection technology used to turn physical objects (often irregularly shaped objects) into a display surface for image and video projection. The objects may be complex industrial landscapes, such as buildings, small indoor objects, or theatrical stages. Using specialized software, a two- or three-dimensional object is spatially mapped in a virtual program that mimics the real environment onto which it is to be projected. The software may interact with a projector to fit any desired image onto a surface of that object.
A position enabled projector (PEP) is a tool that projects an image, such as a blueprint, onto a work surface at its true position with true scale. Correctly projecting the position of the image onto a surface may involve determining the position or orientation of the projector itself and/or knowing the geometry of the surface onto which the image is being projected. If the surface is smooth, flat, and even, then the whole image being projected may undergo a single transformation such that every point of the image is projected at its true position. If the surface is uneven, the projector may need to know the exact geometry of the surface, and the image being projected may need to undergo a different type of transformation so that the points of the image are projected onto the uneven surface correctly at their true positions.
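As a minimal sketch of the flat-surface case described above, the single transformation can be illustrated as one 2D affine map applied uniformly to every image point. The matrix values and function names below are illustrative assumptions, not the described system's actual implementation:

```python
# Minimal sketch: for a flat, even surface, one affine transformation
# maps every image point to its projected position. The coefficients
# below (scale by 2, translate by (5, -3)) are assumed example values.

def apply_affine(point, a, b, tx, c, d, ty):
    """Apply the affine map (x, y) -> (a*x + b*y + tx, c*x + d*y + ty)."""
    x, y = point
    return (a * x + b * y + tx, c * x + d * y + ty)

# The same single transformation is applied to every point of the image.
transform = dict(a=2.0, b=0.0, tx=5.0, c=0.0, d=2.0, ty=-3.0)
image_points = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
projected = [apply_affine(p, **transform) for p in image_points]
# -> [(5.0, -3.0), (7.0, -3.0), (7.0, -1.0)]
```

An uneven surface, by contrast, cannot be handled by one such map; as described below, a different transformation is needed per region of the surface.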
The projection of image points on a surface at their true positions along with true scale is particularly important if a specific task to be performed requires precision and accuracy. For instance, a blueprint may be projected onto a work surface so as to allow a construction worker to drill holes at various specified positions on the surface based on the information provided by the blueprint. If the work surface is uneven, however, the virtual positions of the drill holes may be incorrectly and inaccurately projected when the geometry of the uneven surface is not taken into account. In that regard, there is a need for projecting an image onto an even or uneven surface such that all points of the image are projected and appear at their true position on the surface with true scale. There is also a need for an image being projected to be automatically updated when the projector is moved from its original position to a new position.
In accordance with one or more aspects of the present invention, the invention is directed to a system and method for surface profiling via a projector system, such as a position enabled projector. By way of example, a three-dimensional representation of a physical object, such as an uneven surface of the object, may be generated and profiled. The three-dimensional representation may be a 3D point cloud, a surface mesh, or any other suitable type of representation. A two-dimensional image to be projected onto the surface may undergo an image transformation based on the generated 3D representation of the surface. The transformed image is then projected onto the surface, such that the image points are projected at their true positions with true scale. Moreover, the projected image may be automatically updated when the projector is moved to a new position.
Other objects, advantages and novel features of the present invention will become apparent from the following detailed description of one or more preferred embodiments when considered in conjunction with the accompanying drawings.
The present invention is directed to correctly and accurately projecting, using a projector system, a two-dimensional image (e.g., a construction-related blueprint) onto an uneven work surface, such as corrugated steel sheets, so that all points of the image appear on the surface at their true positions with true scale. Moreover, the present invention is directed to updating the projected image when the projector system is moved to a new position.
In one aspect of the present invention, a three-dimensional (3D) profile of the uneven surface may be generated. By way of example, generation of the 3D profile may be implemented by a projector system using one or more of the following components and/or approaches: (1) a laser scanner, (2) a time-of-flight (TOF) camera, (3) at least one stereoscopy camera based approach, and/or (4) one or more structured light approaches. The 3D profile that is output from the projector system may be a point cloud, a surface mesh, a surface profile, or any other suitable type of three-dimensional representation of the surface. In a further example, the use of one or more range meters may improve the accuracy and robustness of the point cloud.
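The stereoscopy-based approach (3) can be sketched with the standard triangulation relation Z = f · B / d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity between matched pixels. The numeric values below are assumed for illustration:

```python
# Illustrative sketch of stereo depth recovery: with two calibrated
# cameras, the depth of a matched point follows from its disparity.
# Focal length, baseline, and disparity here are assumed values.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulate depth (meters) as Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A surface point matched with 40 px disparity, f = 800 px, B = 0.1 m:
z = depth_from_disparity(800.0, 0.1, 40.0)  # -> 2.0 meters
```

Repeating this for every matched pixel pair yields the 3D point cloud of the surface.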
In another aspect of the present invention, the 3D point cloud of the uneven surface generated by the projector system may be converted into a virtual surface mesh, such as a polygon mesh. The mesh may be generated by geometrically processing the point cloud such that the surface is virtually reconstructed.
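One simple way the point-cloud-to-mesh conversion can be sketched, assuming the scanner returns points on a regular grid (as a line scanner or structured-light setup often does), is to split each grid cell into two triangles. The function name and grid assumption are illustrative, not the actual meshing method of the described system:

```python
# Minimal sketch: convert a row-major grid of 3D points into a
# triangle mesh by splitting each grid cell into two triangles.
# Assumes the point cloud is organized as a rows x cols grid.

def grid_to_triangle_mesh(points, rows, cols):
    """Return triangle index triples referencing the point list."""
    assert len(points) == rows * cols
    triangles = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c  # top-left corner of the current cell
            triangles.append((i, i + 1, i + cols))             # upper triangle
            triangles.append((i + 1, i + cols + 1, i + cols))  # lower triangle
    return triangles

# A 2 x 2 patch of scanned points (x, y, z) yields two triangles.
points = [(0, 0, 0.0), (1, 0, 0.2), (0, 1, 0.0), (1, 1, 0.3)]
mesh = grid_to_triangle_mesh(points, rows=2, cols=2)
```

Unorganized point clouds would instead require a general surface-reconstruction step (e.g., Delaunay-based triangulation), but the output is likewise a polygon mesh indexed into the point list.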
In yet another aspect of the present invention, a two-dimensional (2D) image, such as a blueprint associated with a construction task, may be transformed (e.g., using linear affine transformation) based on the generated polygon mesh of the uneven surface. Optionally, the transformation may also be based on the position and the orientation of the projector system, which is further described in U.S. application Ser. No. 15/638,815, filed on Jun. 30, 2017, the content of which is incorporated herein by reference in its entirety. In at least that regard, the points and/or lines of the blueprint appear at their true positions, despite the uneven characteristics of the projection surface. As such, the construction worker relying on information in the projected blueprint to carry out the construction task may trust that the points, lines, and other graphical representations are where they actually have to be located.
One of the numerous advantages of the present invention is that the true position of every image point on the uneven surface is accounted for, thus ensuring that tasks such as the afore-mentioned construction task can be performed accurately and correctly. The invention relates to preserving accuracy (e.g., true position, true scale) of the various aspects of a projected image, and not merely how the projected image may look to an observer. This may be achieved, for example, by calibrating all system components (both individually and to each other) based on their known design, and/or by referencing all data (e.g., point cloud, range, mesh, original and transformed images) to a common coordinate system.
The invention described herein may be implemented on and executed by one or more computing devices. For instance, the projector system may have computing capabilities, by way of example, one or more processors, central processing units (CPUs), etc. As will be further described below, the computing associated with surface profiling and projecting a transformed image according to aspect(s) of the present invention may be executed by computing hardware in the projector system itself. Alternatively, the processing may be performed by a separate portable computing device, such as a laptop, tablet computer, or any other suitable type of mobile computing device.
The instructions 116 may be one or more sets of computer-executable instructions (e.g., software) that can be implemented by the processor 112. Data 115 may include various types of information (which can be retrieved, manipulated and/or stored by the processor 112), such as information captured from surface profiling equipment to generate a 3D profile, mesh data, one or more images to be projected, one or more transformed images, etc.
Interface 137 may be any component that allows interfacing with an operator or user. For example, interface 137 may be a device, port, or a connection that allows a user to communicate with the projector system 110, including but not limited to a touch-sensitive screen, microphone, camera, and may also include one or more input/output ports, such as a universal serial bus (USB) drive, various card readers, etc. The interface 137 may also include hardware and/or equipment for surface profiling, such as one or more sensors (e.g., image sensors, light sensors), one or more cameras, one or more projectors, one or more range meters, etc.
The projector system 110 may be configured to communicate with other computing devices via network 130. For example, the projector system 110 may communicate with other projector systems, and/or mobile computing devices (e.g., laptops, tablet computers, smartphones). The network 130 may be any type of network, such as LAN, WAN, Wi-Fi, Bluetooth, etc.
Although processing related to at least generating the 3D profile, surface profiling, and/or image transformation is carried out by the one or more processors 112 of the projector system 110, it may be understood that the processing may be performed by external computing devices and/or hardware, such as a mobile computing device, that may be communicating with the projector system 110 via the network 130.
Before image transformation, however, the projector system 110 may take the image 206 to be projected and input it into block 210. Optionally, information on the position and/or the orientation of the projector system 110 may also be input into block 210 for further accuracy. The image 206 may be a two-dimensional image. At block 210, the image 206 to be projected may be mapped onto the polygon mesh and a new two-dimensional image (e.g., a transformed image) is generated for projection at block 214.
According to aspects of the present invention, a 3D point cloud of an object may be generated using different techniques. For example, a technique based on structured light where a projector is used to project one or multiple light patterns onto the surface may be implemented. These light patterns may be captured by one or more cameras, such as the cameras 304 and 404 of
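A common way such structured-light patterns are decoded, assuming binary Gray-code stripe patterns are used (an assumption; the described system does not specify the pattern type), is that each camera pixel observes a bright/dark sequence across the projected patterns, and the resulting bit string identifies which projector column illuminated it:

```python
# Sketch of a structured-light decode step: each camera pixel's
# bright(1)/dark(0) observations across the pattern sequence form a
# Gray-coded integer identifying the illuminating projector column.
# Gray codes limit errors at stripe boundaries to one column.

def gray_to_binary(g):
    """Convert a Gray-coded integer to its binary value."""
    b = g
    while g:
        g >>= 1
        b ^= g
    return b

def decode_pixel(bits):
    """bits: observed bright/dark per pattern, most significant first."""
    gray = 0
    for bit in bits:
        gray = (gray << 1) | bit
    return gray_to_binary(gray)

# A pixel seeing bright, dark, bright, bright across four patterns
# maps to projector column 13.
column = decode_pixel([1, 0, 1, 1])
```

With the projector-camera geometry calibrated, each decoded column, together with the pixel's row, triangulates one 3D point of the surface.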
After generating the 3D point cloud, the 3D point cloud may be converted into a polygon mesh, as described above.
More specifically, during image transformation, a triangle may be located in a plan image 804 (e.g., ortho) of the image 802 and a corresponding triangle may be located in the 3D polygon mesh, as illustrated in
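The per-triangle correspondence described above can be sketched with barycentric coordinates: a point inside a triangle of the plan (ortho) image is expressed as weights of that triangle's vertices, and the same weights reapplied in the corresponding mesh triangle give its position on the uneven surface. The triangle vertices below are assumed example values:

```python
# Minimal sketch of mapping a point between corresponding triangles
# via barycentric coordinates (illustrative; vertex values assumed).

def barycentric(p, a, b, c):
    """Barycentric coordinates (u, v, w) of 2D point p in triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    u = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    v = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return u, v, 1.0 - u - v

def map_point(p, src_tri, dst_tri):
    """Carry p from src_tri into dst_tri by reusing its barycentric weights."""
    u, v, w = barycentric(p, *src_tri)
    (ax, ay), (bx, by), (cx, cy) = dst_tri
    return (u * ax + v * bx + w * cx, u * ay + v * by + w * cy)

src = ((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))  # triangle in the plan image
dst = ((0.0, 0.0), (2.0, 0.0), (0.0, 2.0))  # corresponding mesh triangle
mapped = map_point((0.25, 0.25), src, dst)  # -> (0.5, 0.5)
```

Applying this mapping to every image point within every triangle pair produces the transformed 2D image for projection.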
In image projection 910, for example, if a blueprint image were to be projected onto the corrugated steel sheet without proper image transformation, the point (which indicates where the construction worker needs to drill) would be projected slightly below where the construction worker actually needs to drill. In image projection 940, however, because the blueprint image undergoes proper image transformation (such as the image transformation described above) to account for the uneven surface of the corrugated steel sheet, the point that indicates where the construction worker needs to drill is projected at its true position. In other words, as shown in
In embodiments according to aspects of the present invention, when the projector is moved from one position to a new position, the image transformation may be automatically updated based on newly acquired information on the geometric characteristics of the object at the new position. Then, the projector system may automatically update the projection of the updated transformed image from the new position. In at least that regard, when the construction worker intentionally or accidentally moves the projector system, or if the projector system is moved for other reasons, the projected image is constantly and/or automatically updated so that tasks associated with the image being projected may be performed with little to no interruption.
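The automatic-update behavior can be sketched as follows. The pose representation, tolerance, and function names are assumptions for illustration, not the described system's actual implementation:

```python
# Hedged sketch: when the projector's pose changes beyond a small
# tolerance, re-acquire the surface geometry and regenerate the
# transformed image; otherwise keep the current projection.

def pose_changed(old_pose, new_pose, tol=1e-3):
    """True if any pose component moved by more than tol."""
    return any(abs(a - b) > tol for a, b in zip(old_pose, new_pose))

def update_projection(old_pose, new_pose, reacquire, retransform):
    """Re-run profiling and transformation only when the pose changed."""
    if pose_changed(old_pose, new_pose):
        surface = reacquire()
        return retransform(surface)
    return None  # pose unchanged: keep the current projection

# The projector slid 5 cm along x, so the projection is regenerated.
result = update_projection(
    (0.0, 0.0, 0.0), (0.05, 0.0, 0.0),
    reacquire=lambda: "mesh", retransform=lambda s: f"image for {s}")
```

In practice the new pose could come from the position/orientation determination referenced earlier (U.S. application Ser. No. 15/638,815), with re-profiling triggered in a background loop.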
In step 1010, the projector system 110, for instance, may capture the overall geometric characteristics of an object, including the surface of the object (whether the surface is even or uneven). As described above, the object may be a corrugated steel sheet and the geometric characteristics may be captured using a laser scanner, a range finding camera (e.g., a time-of-flight camera), stereoscopy (e.g., using two cameras) based approach, structured light approach, etc.
In step 1020, a 3D point cloud of the object may be generated using the obtained overall geometric characteristics in step 1010. Thereafter, in step 1030, the 3D point cloud may be used to generate a 3D polygon mesh of the object, which may be used to transform a 2D image to another 2D image. The 2D image being transformed may be a blueprint for performing a construction-related work task, such as drilling holes.
If the surface is uneven, the 3D polygon mesh generated in step 1030 will be used to transform the 2D image into a new 2D image in step 1040 so that, when the new 2D image is projected onto the surface of the object in step 1050, the image characteristics and corresponding information (e.g., the exact drilling positions) will be projected on the surface at their correct and accurate locations with true scale.
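Steps 1010 through 1050 can be summarized as a pipeline. Each stage below is a stub standing in for the techniques described earlier; the names and return values are illustrative placeholders, not the actual implementation:

```python
# Illustrative pipeline of steps 1010-1050; each function is a stub.

def capture_geometry():            # step 1010: scan the object surface
    return "raw scan of corrugated sheet"

def build_point_cloud(scan):       # step 1020: generate the 3D point cloud
    return f"point cloud from {scan}"

def build_mesh(cloud):             # step 1030: generate the 3D polygon mesh
    return f"polygon mesh from {cloud}"

def transform_image(image, mesh):  # step 1040: warp the 2D blueprint
    return f"{image} warped by {mesh}"

def project(image):                # step 1050: project the transformed image
    return f"projected: {image}"

output = project(transform_image(
    "blueprint", build_mesh(build_point_cloud(capture_geometry()))))
```

The earlier sketches (stereo triangulation, grid meshing, barycentric mapping) are candidate implementations for steps 1020, 1030, and 1040, respectively.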
Numerous advantages of the present invention include, but are not limited to, accounting for the accurate and correct projection of every point, line, characteristic, etc., of an image on an uneven surface, especially if the image that is being projected is related to a task that requires accuracy and precision. In that regard, how the projected image looks to an observer is not the main concern of the present invention, but rather whether each point in an image is projected at its true position.
The foregoing invention has been set forth merely to illustrate the invention and is not intended to be limiting. Since modifications of the disclosed embodiments incorporating the spirit and substance of the invention may occur to persons skilled in the art, the invention should be construed to include everything within the scope of the appended claims and equivalents thereof. Although the present disclosure uses terminology and acronyms that may not be familiar to the layperson, those skilled in the art will be familiar with the terminology and acronyms used herein.
| Filing Document | Filing Date | Country | Kind |
|---|---|---|---|
| PCT/EP2018/067592 | 6/29/2018 | WO | 00 |
| Number | Date | Country | |
|---|---|---|---|
| Parent | 15639308 | Jun 2017 | US |
| Child | 16625066 | US |