The subject matter herein generally relates to image processing, and more particularly to a point cloud processing method and a computing device using the same.
Three-dimensional (3-D) point cloud data acquired from a scanning device may include miscellaneous noise points due to various factors, for example, the quality of the scanning device, illumination, the environment, and the product being scanned. The miscellaneous noise points generally result in blurred product images, thereby reducing the accuracy of product tests based on those images. Therefore, there is a need for a point cloud processing method capable of reducing miscellaneous noise points.
Implementations of the present technology will now be described, by way of example only, with reference to the attached figures.
It will be appreciated that for simplicity and clarity of illustration, where appropriate, reference numerals have been repeated among the different figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein can be practiced without these specific details. In other instances, methods, procedures, and components have not been described in detail so as not to obscure the relevant features being described. Also, the description is not to be considered as limiting the scope of the embodiments described herein. The drawings are not necessarily to scale, and the proportions of certain parts may be exaggerated to better illustrate details and features of the present disclosure.
A definition that applies throughout this disclosure will now be presented.
The term “comprising,” when utilized, means “including, but not necessarily limited to”; it specifically indicates open-ended inclusion or membership in the so-described combination, group, series and the like.
The computing device 1 can include, but is not limited to, a storage device 11, a processor 12, and a display device 13. The storage device 11 can be configured to store data related to the operation of the computing device 1. The processor 12 can be configured to control the operation of the computing device 1.
The storage device 11 can be an internal storage unit of the computing device 1, for example, a hard disk or memory, or a pluggable memory, for example, a Smart Media Card, a Secure Digital Card, or a Flash Card. In at least one embodiment, the storage device 11 can include two or more storage devices, such that one storage device is an internal storage unit and another is a pluggable memory. The processor 12 can be a central processing unit (CPU), a microprocessor, or another data processor chip that performs the functions of the computing device 1. The display device 13 can be a liquid crystal display or another currently available display.
Referring to the accompanying figures, the computing device 1 can run a point cloud processing system, which can include a depicting module 101, a coordinate transformation module 102, a brush module 103, a determining module 104, and a painting module 105.
The depicting module 101 can be configured to depict a 3-D image based on a point cloud data set. The point cloud data set can define the coordinates of a plurality of points in a world coordinate system.
The coordinate transformation module 102 can be configured to convert the 3-D image to a two-dimensional (2-D) image by coordinate conversion. Any currently available coordinate conversion method for converting a 3-D image to a 2-D image can be used.
The brush module 103 can be configured to drag a brush in the 2-D image to form a coverage area having an area boundary, as illustrated in the accompanying figures.
The determining module 104 can be configured to determine whether a point of the 2-D image is within the coverage area by comparing the coordinates of the point with the coordinates of the area boundary of the coverage area.
The painting module 105 can be configured to paint the point within the coverage area a specific color, for example, red.
Referring to the accompanying flowchart, an example point cloud processing method is described below. The example method begins at block 202.
At block 202, the computing device depicts a 3-D image in a model space system based on a point cloud data set. In detail, the computing device opens the point cloud data set in a model space system equipped on the computing device, for example, a computer-aided design (CAD) system. The point cloud data set defines a plurality of points, each point having 3-D coordinates in a 3-D coordinate system, for example, a world coordinate system. Then, the computing device depicts each point in the model space system based on the 3-D coordinates of that point, thus generating the 3-D image. The 3-D image consists of the plurality of points. A common instance of a 3-D coordinate system is the Cartesian coordinate system, in which three mutually perpendicular X, Y, and Z axes, meeting at an origin point (0, 0, 0), are used to parameterize three-dimensional space.
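As an illustrative sketch only (the file name scan.xyz and the plotting library are assumptions, not part of this disclosure), block 202 can be approximated in Python:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical loader: assumes an ASCII point cloud file with one "x y z" triple per line.
points = np.loadtxt("scan.xyz")                   # shape (N, 3), world coordinates

# Depict each point in a 3-D model space, analogous to block 202.
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(points[:, 0], points[:, 1], points[:, 2], s=1)
ax.set_xlabel("X"); ax.set_ylabel("Y"); ax.set_zlabel("Z")
plt.show()
```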
At block 204, the computing device converts the 3-D image to a 2-D image by coordinate conversion. An exemplary embodiment of the coordinate conversion can include: conversion from the world coordinate system to a camera coordinate system, conversion from the camera coordinate system to a projection plane coordinate system, and conversion from the projection plane coordinate system to an image plane coordinate system. Once the image data in the image coordinate system has been calculated, the 2-D image can be correctly depicted.
Firstly, an exemplary conversion from the world coordinate system to the camera coordinate system is illustrated herein. Both the world coordinate system and the camera coordinate system are 3-D coordinate systems. The camera coordinate system can be treated as the result of a translation and a rotation of the world coordinate system, so the conversion from the world coordinate system to the camera coordinate system can be implemented based on expression 1.1:

$$\begin{bmatrix} X_C \\ Y_C \\ Z_C \\ 1 \end{bmatrix} = M_1 \begin{bmatrix} X_W \\ Y_W \\ Z_W \\ 1 \end{bmatrix},\qquad M_1 = \begin{bmatrix} R & T \\ 0^{T} & 1 \end{bmatrix} \tag{1.1}$$

wherein $[X_W, Y_W, Z_W, 1]^T$ represents the homogeneous coordinates of a point P in the world coordinate system; $[X_C, Y_C, Z_C, 1]^T$ represents the coordinates of the point P in the camera coordinate system; $R$ is a $3 \times 3$ orthogonal rotation matrix; $\theta$, $\psi$, $\phi$ are the Euler angles of rotation, respectively representing the angles of yaw, pitch, and roll; $T = [T_x, T_y, T_z]^T$, where $T_x$, $T_y$, $T_z$ respectively represent the displacements along the X, Y, and Z axes; and $M_1$ is a $4 \times 4$ matrix.
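By way of illustration only, expression 1.1 can be sketched in Python; the Z-Y-X Euler composition order and all numeric values below are assumptions, not limitations of this disclosure:

```python
import numpy as np

def rigid_transform(theta, psi, phi, tx, ty, tz):
    """Build the 4x4 matrix M1 of expression 1.1 from Euler angles
    (yaw theta, pitch psi, roll phi) and a translation (tx, ty, tz).
    The Z-Y-X composition order used here is one common convention;
    the disclosure does not fix a particular order."""
    cz, sz = np.cos(theta), np.sin(theta)
    cy, sy = np.cos(psi), np.sin(psi)
    cx, sx = np.cos(phi), np.sin(phi)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])   # yaw about Z
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])   # pitch about Y
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])   # roll about X
    M1 = np.eye(4)
    M1[:3, :3] = Rz @ Ry @ Rx                               # R: 3x3 rotation
    M1[:3, 3] = (tx, ty, tz)                                # T: translation
    return M1

# Converting one world point to camera coordinates:
Pw = np.array([1.0, 2.0, 3.0, 1.0])                      # [Xw, Yw, Zw, 1]^T
Pc = rigid_transform(0.1, 0.0, 0.0, 0.0, 0.0, 5.0) @ Pw  # [Xc, Yc, Zc, 1]^T
```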
Then, an exemplary conversion from the camera coordinate system to the projection coordinate system is illustrated herein. The projection coordinate system is a 2-D coordinate system and is a projection of the camera coordinate system. The conversion from the camera coordinate system to the projection coordinate system can be implemented based on expression 1.2:

$$x = f\,\frac{X_C}{Z_C},\qquad y = f\,\frac{Y_C}{Z_C} \tag{1.2}$$

wherein $(x, y)$ represents the coordinates of a point P in the projection coordinate system; $(X_C, Y_C, Z_C)$ represents the coordinates of the point P in the camera coordinate system; and $f$ represents the displacement of the projection plane along the Z axis of the camera coordinate system.
Then, an exemplary conversion from the projection coordinate system to the image coordinate system is illustrated herein. The image coordinate system is a 2-D coordinate system and can be treated as the result of a scaling and a translation of the projection coordinate system. The conversion from the projection coordinate system to the image coordinate system can be implemented based on expression 1.3:

$$\mu = \mu_x\, x + \mu_0,\qquad v = \mu_y\, y + v_0 \tag{1.3}$$

wherein $(\mu, v)$ represents the coordinates of a point P in the image coordinate system; $(x, y)$ represents the coordinates of the point P in the projection coordinate system; $(\mu_0, v_0)$ represents the coordinates of the origin of the projection coordinate system in the image coordinate system; and $\mu_x$, $\mu_y$ represent the scale factors from the projection plane to the image coordinate system, determined by the boundary of the 2-D image formed in the projection plane.
The conversion from the world coordinate system to the image coordinate system can be derived as the following expression 1.4 by combining the above expressions 1.1 through 1.3:

$$Z_C \begin{bmatrix} \mu \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} \mu_x f & 0 & \mu_0 & 0 \\ 0 & \mu_y f & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} M_1 \begin{bmatrix} X_W \\ Y_W \\ Z_W \\ 1 \end{bmatrix} \tag{1.4}$$
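As a minimal sketch (not the only possible implementation of this disclosure), expressions 1.1 through 1.3 can be chained in Python; the focal length f, the scale factors mu_x, mu_y, and the principal point (mu0, v0) below are placeholder values:

```python
import numpy as np

def world_to_image(Pw_h, M1, f, mu_x, mu_y, mu0, v0):
    """Map a homogeneous world point [Xw, Yw, Zw, 1] to image pixel
    coordinates (mu, v) by applying expressions 1.1-1.3 in sequence,
    which is equivalent to the combined expression 1.4."""
    Xc, Yc, Zc, _ = M1 @ Pw_h                 # 1.1: world -> camera
    x, y = f * Xc / Zc, f * Yc / Zc           # 1.2: camera -> projection plane
    return mu_x * x + mu0, mu_y * y + v0      # 1.3: projection -> image pixels

# Illustrative pose (translation-only) and intrinsics:
M1 = np.eye(4)
M1[:3, 3] = (0.0, 0.0, 5.0)
u, v = world_to_image(np.array([1.0, 2.0, 3.0, 1.0]), M1,
                      f=1.0, mu_x=800.0, mu_y=800.0, mu0=320.0, v0=240.0)
```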
At block 206, the computing device drags a brush to form a coverage area in the 2-D image in response to a user operation. Referring to the accompanying figures, the brush sweeps out a coverage area Q bounded by an area boundary W.
At block 208, the computing device obtains the coordinates of each pixel point in the coverage area Q and determines the coordinates of the area boundary W of the coverage area. In at least one exemplary embodiment, the obtained coordinates can be stored in the storage device 11.
At block 210, the computing device compares the coordinates of each pixel point of the 2-D image with the coordinates of the area boundary of the coverage area.
At block 212, the computing device determines whether a random point A of the 2-D image is within the coverage area. For example, suppose a random point A of the 2-D image has coordinates (Xa, Ya), and the coordinates of the area boundary W of the coverage area have maximum and minimum values along the X and Y axes: Xmax, Xmin, Ymax, Ymin. If the coordinates (Xa, Ya) satisfy Xmin ≤ Xa ≤ Xmax and Ymin ≤ Ya ≤ Ymax, the random point A is determined to be within the coverage area Q; otherwise, the random point A is determined to be outside the coverage area Q. If the random point A is determined to be within the coverage area Q, the process goes to block 214; otherwise, the process goes to block 216.
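A sketch of the containment test of blocks 208 through 212, assuming the brush's coverage pixels have been collected into an M×2 array (the name coverage and the sample values are illustrative):

```python
import numpy as np

coverage = np.array([[10, 12], [11, 12], [12, 13]])  # (x, y) pixels swept by the brush

# Block 208: the area boundary W reduces to the coverage extremes.
x_min, y_min = coverage.min(axis=0)
x_max, y_max = coverage.max(axis=0)

def within_coverage(xa, ya):
    """Blocks 210-212: bounding-box test for a random point A = (xa, ya)."""
    return x_min <= xa <= x_max and y_min <= ya <= y_max

print(within_coverage(11, 12))  # True  -> proceed to block 214
print(within_coverage(50, 50))  # False -> proceed to block 216
```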
At block 214, the computing device paints the pixel point within the coverage area a specific color, for example, red.
At block 216, the computing device leaves the current color of the pixel point outside the coverage area unchanged.
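Blocks 214 and 216 together can be sketched as a slice assignment over an RGB buffer; the buffer size and the boundary extremes below are illustrative values only:

```python
import numpy as np

image = np.zeros((480, 640, 3), dtype=np.uint8)   # 2-D image buffer (rows, cols, RGB)
x_min, x_max, y_min, y_max = 100, 200, 50, 150    # extremes of area boundary W

# Block 214: paint every pixel inside the coverage bounding box red.
image[y_min:y_max + 1, x_min:x_max + 1] = (255, 0, 0)
# Block 216: pixels outside the box are simply left with their current color.
```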
The embodiments shown and described above are only examples. Even though numerous characteristics and advantages of the present technology have been set forth in the foregoing description, together with details of the structure and function of the present disclosure, the disclosure is illustrative only, and changes may be made in the detail, including in matters of shape, size and arrangement of the parts within the principles of the present disclosure up to, and including, the full extent established by the broad general meaning of the terms used in the claims.