The present disclosure relates to computer-assisted processing technologies, and more particularly, to computer-assisted processing technologies for generating surface feature data.
Three-dimensional maps include features derived from vector datasets resulting from light detection and ranging (LiDAR) scans of an environment. The LiDAR scan results in a point cloud of LiDAR points within the environment. Features, such as roads, buildings, signs, and the like, may be represented by polygons that consist of a collection of ordered, interconnected vertices. Aggregation of LiDAR data to generate accurate roadway features can be difficult and time-consuming, as the width or height of the roadway may vary. Moreover, there may be missing or incomplete data for the roadway. Therefore, a need exists for processing data to fill incomplete features of the roadway.
In an aspect of the present disclosure, a method for generating surface feature data includes receiving a data set, extracting input layers from the data set, dilating and unionizing the input layers, eroding and triangulating the input layers to generate a plurality of planar surfaces, assigning heights to each vertex of the plurality of planar surfaces, detecting incomplete features in the plurality of planar surfaces, and filling the incomplete features in the plurality of planar surfaces.
In another aspect of the present disclosure, a system for generating surface feature data includes a computing device comprising a memory component, wherein the memory component stores logic that, when executed by the computing device, causes the system to perform at least the following: receive a data set, extract input layers from the data set, dilate and unionize the input layers, erode and triangulate the input layers to generate a plurality of planar surfaces, assign heights to each vertex of the plurality of planar surfaces, detect incomplete features in the plurality of planar surfaces, and fill in the incomplete features in the plurality of planar surfaces.
These and additional features provided by the embodiments described herein will be more fully understood in view of the following detailed description, in conjunction with the drawings.
The embodiments set forth in the drawings are illustrative and exemplary in nature and not intended to limit the subject matter defined by the claims. The following detailed description of the illustrative embodiments can be understood when read in conjunction with the following drawings, where like structure is indicated with like reference numerals and in which:
The embodiments described herein are directed to methods and systems for generating surface feature data, specifically, roadway surface feature data. The systems, methods, and computer-implemented programs may utilize various algorithms and artificial intelligence for generating surface feature data.
Roadway maps may be generated by a user based on input data. However, it may be difficult and/or time-consuming to generate specific surface feature data, such as the assignment of heights to various points/planes within the roadway. Moreover, it may be difficult or impossible for the user to generate accurate surface features if there is missing data. Thus, user-generated surface features may suffer from various inaccuracies, inefficiencies, and discontinuities.
The system for generating surface feature data includes eroding and triangulating input layers from a data set to generate a plurality of planar surfaces. Based on the LiDAR data, heights are assigned to each vertex of the plurality of planar surfaces. This may generate surface feature data with accurate three-dimensional points. Moreover, incomplete features in the plurality of planar surfaces may be filled using the systems and methods described herein.
As used herein, the singular forms “a,” “an” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a” component includes aspects having two or more such components unless the context clearly indicates otherwise.
Referring to
The computing device 102 may include a graphical user interface (GUI) 102a, a processor 102b, and an input device 102c, each of which may be communicatively coupled together and/or to the network 10. The server 103 may be configured to include similar components as the computing device 102. As described in more detail herein, the computing device 102 may be configured to receive a data set (e.g., LiDAR data) from the server 103 and/or the vehicle 104 and generate surface feature data 101 (as depicted in
It should be understood that the computing device 102, the server 103, and the electronic control unit 104a of the vehicle 104 may be a personal computer, a microcontroller, or the like. Additionally, while each of the computing devices illustrated in
The vehicle 104 includes an electronic control unit 104a, a communications unit 104b, and a sensor 104c. The electronic control unit 104a may be any device or combination of components comprising a processor and non-transitory computer readable memory. The processor may be any device capable of executing the machine-readable instruction set stored in the non-transitory computer readable memory. Accordingly, the processor may be an electric controller, an integrated circuit, a microchip, a computer, or any other computing device. The processor is communicatively coupled to the other components of the vehicle 104 by a communication bus. Accordingly, the communication bus may communicatively couple any number of processors with one another, and allow the components coupled to the communication bus to operate in a distributed computing environment. Specifically, each of the components may operate as a node that may send and/or receive data. It is further noted that the processor may comprise a single processor, multiple processors, or a system of processors.
The communications unit 104b of the vehicle 104 may include network interfaces for one or more of a plurality of different networks, protocols, or the like. For instance, the communications unit 104b may include one or more antennas (e.g., many in/many out (MIMO) antennas, etc.) that may allow for communication via Wi-Fi networks, IrDA, Bluetooth, Wireless USB, Z-Wave, ZigBee, near field communication (NFC), LTE, WiMAX, UMTS, CDMA, C-V2X, GSM, or the like. The network interfaces may include Wi-Fi, xth generation cellular technology (e.g., 2G, 3G, 4G, 5G, etc.), WCDMA, LTE Advanced, or the like.
The electronic control unit 104a is configured to be communicatively coupled to one or more sensors 104c for detecting roadway data. The one or more sensors 104c may be any sensor capable of generating a data set that would assist in generating roadway surface features, such as but not limited to LiDAR sensors, radar sensors, ultrasonic sensors, or any other suitable sensor for generating roadway surface data points. The at least one sensor 104c may emit an array of infrared beams, and detect the return of the beams and their times of flight to generate a point cloud (e.g., data points 300) of LiDAR points in three-dimensional space (as depicted in
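By way of a non-limiting illustration, the sketch below shows how a single beam return could be converted into a three-dimensional point from its round-trip time of flight and firing angles; the angle parameters and the helper name are assumptions for illustration and are not recited in the disclosure.

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second


def beam_return_to_point(time_of_flight_s, azimuth_rad, elevation_rad):
    """Convert a round-trip time of flight and firing angles into an
    (x, y, z) point in the sensor's Cartesian frame."""
    # Round trip, so the range to the reflecting surface is half the traveled distance.
    r = SPEED_OF_LIGHT * time_of_flight_s / 2.0
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return (x, y, z)
```

Repeating such a conversion for every emitted beam yields a point cloud of data points 300 of the kind described above.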
A data storage component 268 and/or the memory component 260 of the computing device 102 or a memory of the server 103 may store the data from the sensors 104c of the vehicle 104. In some embodiments, the computing device 102 and the server 103 may be communicatively connected to a plurality of vehicles 104, such that the memories of the computing device 102 and the server 103 include data from each of the plurality of vehicles 104. The computing device 102 and the server 103 may aggregate the data from the plurality of vehicles 104, such that the aggregated data may be used to generate more accurate surface feature data 101, as described further herein.
Referring now to
As depicted in
Additionally, the memory component 260 may be configured to store logic, such as but not limited to operating logic 261, LiDAR processing logic 262 for receiving and processing LiDAR data from the sensor 104c of the vehicle 104, and display logic 264 for displaying surface data on the GUI 102a, as described herein (each of which may be embodied as computer readable program code, firmware, or hardware, as an example). It should be understood that the data storage component 268 may reside local to and/or remote from the computing device 102, and may be configured to store one or more pieces of data for access by the computing device 102 and/or other components.
A local interface 270 is also included in
The processor 102b may include any processing component configured to receive and execute computer readable code instructions (such as from the data storage component 268 and/or memory component 260). The input device 102c may include one or more of a graphics display device, keyboard, mouse, printer, camera, microphone, speaker, touch-screen, and/or other device for receiving, sending, and/or presenting data. The network interface hardware 267 may include any wired or wireless networking hardware, such as a modem, LAN port, wireless fidelity (Wi-Fi) card, WiMax card, mobile communications hardware, and/or other hardware for communicating with other networks and/or devices. The network interface hardware 267 may communicate via the Internet to receive the LiDAR data 269 provided from one or more sources (e.g., one or more sensors 104c of the vehicles 104) as well as communicate with a display device, such as the GUI 102a of the computing device 102, to display surface feature data 101.
Included in the memory component 260 may be the operating logic 261, LiDAR processing logic 262, and display logic 264. The operating logic 261 may include an operating system and/or other software for managing components of the computing device 102. Similarly, the LiDAR processing logic 262 may reside in the memory component 260 and may be configured to receive and process LiDAR data, such as to generate surface feature data 101. The display logic 264 includes logic to generate a display on the GUI 102a. The logic or algorithm(s) may be written in any programming language of any generation (e.g., 1GL, 2GL, 3GL, 4GL, or 5GL) such as, for example, machine language that may be directly executed by the processor 102b, or assembly language, object-oriented programming (OOP), scripting languages, microcode, etc., that may be compiled or assembled into machine readable instructions (e.g., logic) and stored in the memory component 260. It is noted that the aforementioned logics are referred to herein generally as “the logic,” which may refer to each of the logics recited above.
In embodiments, the logic set may be written in a hardware description language (HDL), such as logic implemented via either a field-programmable gate array (FPGA) configuration or an application-specific integrated circuit (ASIC), or their equivalents. Accordingly, the functionality described herein may be implemented in any conventional computer programming language, as pre-programmed hardware elements, or as a combination of hardware and software components. For example, the memory component 260 may be a machine-readable memory (which may also be referred to as a non-transitory processor-readable memory or medium) that stores instructions that, when executed by the processor 102b, causes the processor 102b to perform a method of generating surface feature data 101 as described herein.
The components illustrated in
The logic stored on the memory component 260, when executed by the processor 102b, may cause the system 100 to receive a data set 300 (such as data points 300 or a plurality of data points from the LiDAR sensor 104c of the vehicle 104 described hereinabove). The logic may also cause the system 100 to extract input layers 306 from the data set 300 (depicted in
Referring now to
The data points 300 may include three-dimensional data points. For example, the data points 300 may be assigned Cartesian coordinates from the Cartesian coordinate system 308 depicted in
The data points 300 may include intensity values. The intensity values of the LiDAR data 300 may correspond to a color of the roadway surface 304. For example, as discussed hereinbelow, paint on the roadway surface 304 may be modeled as surface color features 322 through intensity values of the LiDAR data 300. The data points 300 may also include timestamp values. The timestamp values may indicate the time at which the data points 300 were acquired.
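A minimal sketch of one possible per-point record is shown below; the field names are hypothetical and merely collect the coordinate, intensity, and timestamp values described above.

```python
from dataclasses import dataclass


@dataclass
class LidarPoint:
    x: float          # Cartesian coordinates (e.g., meters) in coordinate system 308
    y: float
    z: float          # height value used later when assigning vertex heights
    intensity: float  # return intensity; painted surfaces tend to return higher values
    timestamp: float  # time at which the point was acquired
```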
In embodiments, the input layers 306 may be mapped/generated by a user, such as by using the computing device 102 as described hereinabove. Once the data points 300 are received from the vehicle 104 through the network 10, the user may generate the input layers 306 by observing the data points 300. The input layers 306 may represent a roadway surface 304. As such, for example, as depicted in
In embodiments, the input layers 306 and, thus, the roadway surface 304, may also be auto-generated by the computing device 102. For example, as depicted in
As depicted in
The input layers 306 may be dilated and unionized by the system 100, such as to generate the unionized input layer 400, as depicted in
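A minimal sketch of the dilate-and-unionize step, assuming the input layers 306 are available as two-dimensional polygons, is given below; the shapely library and the 0.5-meter dilation distance are assumptions used only for illustration.

```python
from shapely.geometry import Polygon
from shapely.ops import unary_union


def dilate_and_unionize(input_layers, dilation_m=0.5):
    """Dilate each layer polygon slightly and merge the overlapping results
    into a single unionized layer, closing small gaps between layers."""
    dilated = [layer.buffer(dilation_m) for layer in input_layers]
    return unary_union(dilated)


# Example: two adjacent lane polygons that do not quite touch are merged
# into one unionized input layer.
lane_a = Polygon([(0, 0), (10, 0), (10, 3), (0, 3)])
lane_b = Polygon([(0, 3.2), (10, 3.2), (10, 6), (0, 6)])
unionized_layer = dilate_and_unionize([lane_a, lane_b])
```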
The logic may further cause the system 100 to erode and triangulate the input layers 306 (e.g., the unionized input layer 400) to generate a plurality of planar surfaces 500, as depicted in
The input layers 306 may be triangulated and, thus, be divided into a plurality of 2-dimensional triangular planar surfaces 500 that graphically represent the roadway surface 304. The input layers 306 may be divided into triangular planes (as depicted in
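Continuing the same illustrative sketch, the eroded layer may then be split into triangular planar surfaces; the erosion distance mirrors the dilation distance so the layer roughly regains its original footprint, and the Delaunay-based triangulation shown here is only one possible way to divide the layer.

```python
from shapely.geometry import Polygon
from shapely.ops import triangulate


def erode_and_triangulate(unionized_layer, erosion_m=0.5):
    """Erode the unionized layer back toward its original extent and divide
    it into 2-dimensional triangular planar surfaces."""
    eroded = unionized_layer.buffer(-erosion_m)
    # triangulate() returns a Delaunay triangulation of the geometry's vertices;
    # triangles falling outside the eroded footprint are discarded.
    triangles = triangulate(eroded)
    return [tri for tri in triangles if tri.within(eroded.buffer(1e-6))]


# Example: a simple rectangular roadway patch divided into planar surfaces.
roadway = Polygon([(0, 0), (20, 0), (20, 7), (0, 7)]).buffer(0.5)
planar_surfaces = erode_and_triangulate(roadway)
```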
In embodiments, the planar surfaces 500 may be evenly distributed. In other embodiments, there may be a higher number of the planar surfaces 500 at a curve 501 or edge 503 of the roadway surface 304 in order to capture characteristics (such as heights at the curvature, as explained further below) of the curve 501 or edge 503 more accurately.
As the planar surfaces 500 are 2-dimensional, the memory component 260 may further include logic that causes the system 100 to assign heights to each vertex of the planar surfaces 500, as depicted in
In embodiments, heights may be assigned to each vertex of the planar surfaces 500 through the Cartesian coordinates of three-dimensional points in the data set. As noted hereinabove, the data points 300 may correspond to Cartesian coordinates from the Cartesian coordinate system 308. To assign heights to each of the vertices of the planar surfaces 500, the system 100 may include logic that overlays the data points 300 onto the planar surfaces 500, as depicted in
For example, a first planar surface 504 may include three vertices, 504A, 504B, and 504C. Each of the vertices may be assigned different height values in the z-direction of the Cartesian coordinate system 308 because the heights of the closest data point 300 for each vertex may be different. In some embodiments, one or more vertices of the planar surfaces 500 may share the same height, such as when the vertices share a closest data point 300. Such sharing of the closest data point 300 may occur when there is a high resolution of planar surfaces 500 and, thus, the vertices of the planar surfaces 500 are close to one another.
In embodiments, heights may be assigned to each vertex based on an average z-value of the data points 300 surrounding the vertex. A threshold number of data points may be used to determine the height assigned to the vertex. In embodiments, the threshold number of data points may be 25, 50, 100, 150, 200, 300, 400 or 500 data points 300, such that the closest data points 300 to the vertex under the threshold number of data points may be used to calculate the height assigned to the vertex. In other embodiments, data points 300 within a threshold distance of the vertices may be averaged to assign heights to the vertices of the planar surfaces 500. In embodiments, the threshold distance from the vertices used to assign heights to the vertices may be less than or equal to 50 millimeters, less than or equal to 60 millimeters, less than or equal to 75 millimeters, less than or equal to 100 millimeters, less than or equal to 150 millimeters, less than or equal to 200 millimeters, or less than or equal to 500 millimeters. As such, the height of each vertex may correspond to an average height of each data point 300 within the threshold distance from the vertex.
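A minimal sketch of the threshold-distance averaging described above is shown below; the k-d tree lookup, the 75-millimeter threshold, and the nearest-point fallback are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree


def assign_vertex_heights(vertices_xy, points_xyz, threshold_m=0.075):
    """Assign each 2-D vertex the mean z-value of the LiDAR points whose x-y
    position lies within threshold_m of that vertex."""
    tree = cKDTree(points_xyz[:, :2])
    heights = np.empty(len(vertices_xy))
    for i, vertex in enumerate(vertices_xy):
        neighbor_idx = tree.query_ball_point(vertex, r=threshold_m)
        if neighbor_idx:
            heights[i] = points_xyz[neighbor_idx, 2].mean()
        else:
            # No data point within the threshold; fall back to the closest point.
            _, nearest = tree.query(vertex)
            heights[i] = points_xyz[nearest, 2]
    return heights
```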
Each data point 300 used to calculate the height assigned to the vertex of the planar surface 500 may carry equal weight in determining the height assigned to the vertex of each planar surface 500. As such, the height assigned to each vertex may be the average of the heights (i.e., z-values) of each data point 300 within the threshold distance from each vertex of the planar surfaces 500. In other embodiments, weights may be assigned to each data point 300 within the threshold distance of each vertex. Weights assigned to each data point 300 may correspond to how close the data point is to the vertex of the planar surface 500. As such, a data point 300 may be assigned a high weight when calculating the height assigned to the vertex of each planar surface 500 when the data point is at or near the vertex. In contrast, data points 300 that are farther from the vertex of the planar surface 500, such as those data points 300 that are near the threshold distance from the vertex, may be assigned a lower weight.
In embodiments, if a data point 300 is directly above or below the vertex of the planar surface 500, such that the data point 300 has the same or nearly the same x-y coordinates as the vertex, the data point 300 may carry 80%, 85%, 90%, 95%, 97%, 98%, 99%, or 100% of the weight assigned to the height of the vertex, since it is likely that the data point 300 sharing the same x-y coordinates as the vertex is a point on the roadway surface 304.
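The weighted variant may be sketched as follows; the inverse-distance weighting function is an assumption chosen so that a data point with essentially the same x-y coordinates as the vertex dominates the estimate, consistent with the behavior described above.

```python
import numpy as np
from scipy.spatial import cKDTree


def weighted_vertex_height(vertex_xy, points_xyz, threshold_m=0.075, eps=1e-6):
    """Distance-weighted height for one vertex; returns None if no data point
    lies within the threshold distance."""
    tree = cKDTree(points_xyz[:, :2])
    idx = tree.query_ball_point(vertex_xy, r=threshold_m)
    if not idx:
        return None
    neighbors = points_xyz[idx]
    d = np.linalg.norm(neighbors[:, :2] - np.asarray(vertex_xy), axis=1)
    # Inverse-distance weights: a point with d near zero (same x-y position
    # as the vertex) receives nearly all of the weight.
    w = 1.0 / (d + eps)
    return float(np.sum(w * neighbors[:, 2]) / np.sum(w))
```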
The logic may further cause the system 100 to detect incomplete features 502 in the plurality of planar surfaces 500, as depicted in
The memory component 260 may further include logic that causes the system 100 to fill in the incomplete features 502 in the planar surfaces 500, as depicted in
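The disclosure does not tie the filling step to a particular technique; one plausible sketch, assuming incomplete features manifest as vertices without assigned heights, interpolates the missing heights from surrounding vertices.

```python
import numpy as np
from scipy.interpolate import griddata


def fill_missing_heights(vertices_xy, heights):
    """Fill NaN heights by linear interpolation from vertices with known heights."""
    vertices_xy = np.asarray(vertices_xy, dtype=float)
    heights = np.asarray(heights, dtype=float)
    known = ~np.isnan(heights)
    filled = heights.copy()
    filled[~known] = griddata(
        points=vertices_xy[known],
        values=heights[known],
        xi=vertices_xy[~known],
        method="linear",
    )
    # Vertices outside the convex hull of the known vertices remain NaN and
    # could be filled with a subsequent nearest-neighbor pass.
    return filled
```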
Referring now to
Referring to
The method 900 may further include detecting the surface color features 322 through the intensity values of the data points 300. The method 900 may also include generating the two-dimensional model of the surface feature data 101 through the plurality of planar surfaces 500 and generating the three-dimensional model of the surface feature data 101 through assigning heights to each vertex of the planar surfaces 500.
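As one non-limiting illustration, painted markings could be separated from bare pavement by applying a simple intensity threshold to the data points 300; the fixed threshold value below is an assumption used only for illustration.

```python
import numpy as np


def detect_surface_color_features(points, intensity_threshold=0.6):
    """Return the points whose normalized intensity suggests painted markings.

    `points` is an (N, 4) array of x, y, z, intensity rows, with intensity
    normalized to the range [0, 1]."""
    points = np.asarray(points)
    return points[points[:, 3] >= intensity_threshold]
```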
It is noted that the terms “substantially” and “about” may be utilized herein to represent the inherent degree of uncertainty that may be attributed to any quantitative comparison, value, measurement, or other representation. These terms are also utilized herein to represent the degree by which a quantitative representation may vary from a stated reference without resulting in a change in the basic function of the subject matter at issue.
While particular embodiments have been illustrated and described herein, it should be understood that various other changes and modifications may be made without departing from the spirit and scope of the claimed subject matter. Moreover, although various aspects of the claimed subject matter have been described herein, such aspects need not be utilized in combination. It is therefore intended that the appended claims cover all such changes and modifications that are within the scope of the claimed subject matter.