CN112381029A - Airborne LiDAR data building extraction method based on Euclidean distance - Google Patents

Airborne LiDAR data building extraction method based on Euclidean distance

Info

Publication number
CN112381029A
CN112381029A (application CN202011328391.1A; granted as CN112381029B)
Authority
CN
China
Prior art keywords
points
value
data
point
steps
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011328391.1A
Other languages
Chinese (zh)
Other versions
CN112381029B (en)
Inventor
刘茂华
陈晗琳
王岩
由迎春
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenyang Jianzhu University
Original Assignee
Shenyang Jianzhu University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenyang Jianzhu University filed Critical Shenyang Jianzhu University
Priority to CN202011328391.1A priority Critical patent/CN112381029B/en
Publication of CN112381029A publication Critical patent/CN112381029A/en
Application granted granted Critical
Publication of CN112381029B publication Critical patent/CN112381029B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: Physics
    • G06: Computing; calculating or counting
    • G06V: Image or video recognition or understanding
    • G06V20/00: Scenes; scene-specific elements
    • G06V20/10: Terrestrial scenes
    • G06V20/176: Urban or other man-made structures
    • G06F: Electric digital data processing
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/22: Matching criteria, e.g. proximity measures
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/20: Image preprocessing
    • G06V10/30: Noise filtering
    • G06V10/40: Extraction of image or video features
    • G06V10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; connectivity analysis, e.g. of connected components

Abstract

A Euclidean distance based method for extracting buildings from airborne LiDAR data, belonging to the technical field of airborne LiDAR point cloud classification. The method comprises the following steps: removing noise points and applying an elevation constraint to filter out ground points and low points; extracting scan lines, sorting the points of each scan line by coordinate, computing in sequence the Euclidean distances between the sorted points and the variances of adjacent distances, and marking each variance as a feature value; determining an initial L value from the change in data volume; automatically reducing the L value, recomputing the feature values of the remaining points, comparing them with the new L value, and iterating; stopping the iteration according to the trend in data volume and outputting the result. Based on UAV LiDAR point cloud data collected over an urban sample area, the invention converts the object of study from discrete points into scan lines for calculation, making full use of the spatial relationships among the points within a scan line. The method is fully automatic, simple and efficient, requires no parameters in the computation, and introduces no human error.

Description

Airborne LiDAR data building extraction method based on Euclidean distance
Technical Field
The invention belongs to the technical field of airborne LiDAR point cloud data classification, and in particular relates to a Euclidean distance based method for extracting buildings from airborne LiDAR data.
Background
Cities are where human life is settled and the centre of gravity of social and economic activity, and urbanisation has profoundly changed the human living environment and way of life. As urbanisation advances rapidly, reasonably managing and planning fast-expanding cities has become an urgent problem for every developing city. Buildings are the part of a city that changes most readily, and the demand for detecting building change in urban areas keeps growing, mainly in two respects: 1. in how change information is acquired, people want to obtain it more quickly; 2. in what change information is acquired, people want richer and more accurate attribute and spatial information about the changed areas. For a long time most researchers have detected buildings from remote-sensing imagery using image-processing methods. However, remote-sensing images lack direct three-dimensional information and suffer from problems such as different objects sharing the same spectrum, the same object showing different spectra, shadows and projection, which poses great challenges. LiDAR (Light Detection and Ranging) is an advanced active remote-sensing technology that can rapidly acquire high-precision height and three-dimensional structural information of surface targets, with strong anti-interference capability and good low-altitude detection performance. With the rapid development of unmanned aerial vehicle technology, UAV LiDAR provides powerful technical support for the rapid and accurate fine classification of urban ground objects.
In recent years, methods for extracting buildings from airborne LiDAR data have developed considerably and can be roughly divided into two categories: direct extraction and indirect extraction. Direct extraction separates buildings from the ground, trees and other objects directly, by means of point cloud segmentation and classification. It is difficult for such methods to establish an accurate correspondence between the segmentation results and building roofs, so omissions and misclassifications easily occur and the detected building areas are incomplete. Common direct methods fall roughly into four groups: edge-based segmentation, region-based segmentation, feature clustering and model fitting. Indirect extraction first filters out the ground point cloud and then identifies the point sets belonging to building roofs among the non-ground points; it is suitable when no other auxiliary data are available. Indirect extraction is strongly related to the building extraction step of the direct methods, differing only in that the ground point cloud no longer participates. According to the data structure used, indirect building extraction methods divide into five categories: based on the raw point cloud, on regular grids, on triangulated irregular networks (TIN), on cross-sections, and on voxels.
At present the TIN-based methods are the most widely used; they preserve the accuracy and information content of the data while capturing the proximity relations of the laser footprints, but suffer from a complex data structure, large storage requirements and difficult spatial analysis. Regular-grid methods are mature and can draw on many existing image-processing techniques, with efficient spatial analysis and computation, but they are sensitive to the choice of grid scale and to data interpolation, which causes loss of data precision. In addition, these approaches require parameters to be set before computation, and the parameter values strongly affect the extraction result. For building extraction over large urban areas, the above algorithms cannot guarantee efficiency and accuracy at the same time. Moreover, tall vegetation or large areas of regularly planted vegetation in urban areas severely disturb building extraction: most algorithms either cannot reliably keep plants out of the extracted buildings, or remove part of the buildings together with the plants, leaving the extracted buildings incomplete.
Disclosure of Invention
To address these technical problems, a Euclidean distance based airborne LiDAR data building extraction method is provided. The object of study is converted from traditional discrete points into regular scan lines, exploiting the relations between the point clouds within a line. Using the regular distribution of building point clouds versus the random scatter of vegetation point clouds, and by computing the fluctuation of the distances between points, building extraction from UAV LiDAR data is realised on the basis of the Euclidean distance.
The purpose of the invention is realized by the following technical scheme:
the invention relates to an airborne LiDAR data building extraction method based on Euclidean distance, which comprises the following steps:
step S1, loading airborne LiDAR data;
step S2, removing noise points of the airborne LiDAR data, and performing elevation constraint to filter out ground points and low points;
step S3, extracting scanning lines, sequencing all points in each scanning line according to coordinates, sequentially calculating Euclidean distances between the sequenced points and variances of the distances of the points, and marking the variances as characteristic values;
step S4, determining an initial L value according to the change of the data volume;
step S5, automatically reducing the L value, calculating the characteristic value of the residual point, comparing the new L value with the characteristic value and iterating, stopping iterating according to the data volume change trend, and outputting the result;
and step S6, experimentally testing the effectiveness of the algorithm on data from different urban areas.
Further, the step S2 includes the following steps:
(21) for each point, searching for the same number of neighbourhood points and calculating the average distance D_mean from the point to its neighbours, together with the median m and standard deviation σ of these averages; then calculating the maximum allowed distance D_max:
D_max = m + K*σ
where K is a multiple of the standard deviation, set to 5; if D_mean is greater than D_max, the point is a noise point and is removed;
(22) filtering out, with the ground elevation as reference, all points within a height difference of 1 metre, so that ground points and low points are excluded from the subsequent calculation.
Further, the step S3 includes the following steps:
(31) reading the 'TIME' attribute recorded by the aircraft's built-in GPS and clustering points with the same time value into the same scan line;
(32) re-ordering the points within each scan line by their X coordinates;
(33) calculating, in order, the Euclidean distance between each pair of adjacent points, then calculating, in order, the variance of each pair of adjacent distances; each variance value corresponds to three points.
Further, the step S4 includes the following steps:
(41) setting L to 1.0 and comparing it with the feature values: the three points whose variance value is smaller than L are retained; when a feature value exceeds L, the comparison between the next group's variance value and L is examined, and if it is also over the limit, the last point of the first group is marked as a danger point, otherwise the group's three points are retained;
(42) reducing L in steps of 0.1 and, following rule (41), computing the screened points of the original data for each L value; recording the total amount of data retained under each L value and plotting the curve of this total;
(43) obtaining the slope between adjacent points on the curve from its trend, and taking as the initial L value for the iterative calculation the L value at the point where the slope of the total data amount changes least.
Further, the step S5 includes the following steps:
(51) decreasing the value of L in steps of 0.0001;
(52) recalculating variance values of all points except the dangerous points;
(53) applying the decision rule of step S4 and recording the total number of safe points obtained in each pass;
(54) repeating the above steps iteratively, stopping the iteration when the total number of retained data points changes most smoothly, and outputting the result.
The invention has the beneficial effects that:
1. according to the invention, based on the unmanned aerial vehicle laser radar point cloud data acquired in the urban sample area, the research object is converted from discrete points into scanning lines for calculation, and the spatial relation among all the points in the same scanning line can be fully utilized.
2. The invention provides a method for extracting buildings from the Euclidean distances between points, which is simple and efficient and requires no user-set parameters. On urban data from two different regions it achieves an overall accuracy of 97.1%, an error rate of 2.2% and a leakage rate of 3.7%; experiments verify that, on this data set, setting the initial L value to 0.1 and the iteration end value to 0.04 satisfies both high efficiency and high accuracy.
3. The invention creatively converts the research object from the traditional discrete point into the scanning line, can fully utilize the spatial relationship between adjacent points and provides a new idea for other building extraction methods.
Drawings
FIG. 1 is a schematic view of a Z-shaped scan line.
FIG. 2 is a flow chart of the method of the present invention.
FIG. 3(a) is the raw UAV LiDAR data.
FIG. 3(b) shows the effect after filtering.
FIG. 4 is the curve of total data amount used to select the initial L value.
FIG. 5 is the curve of retained data points during the iterative reduction of the L value.
FIG. 6 is the extraction result of the method (SLED) of the present invention.
FIG. 7 is the extraction result of the TIN-based method.
FIG. 8 is the extraction result of the K-means method.
FIG. 9 shows the error rates of the three algorithms.
FIG. 10 shows the leakage rates of the three algorithms.
Detailed Description
In order to explain the technical features of the present invention clearly, the invention is described in detail below with reference to the accompanying drawings. It should be noted that the components illustrated in the figures are not necessarily drawn to scale. Descriptions of well-known components, processing techniques and procedures are omitted so as not to unnecessarily obscure the invention.
Example (b): aiming at the requirement of building extraction based on airborne LiDAR data, the invention provides an airborne LiDAR data building extraction method based on Euclidean distance, which solves the problem of rapid and accurate extraction of large-scale buildings based on airborne LiDAR data at present by extracting scanning lines and utilizing the spatial relationship between points in one scanning line.
As shown in the flowchart of fig. 2, the present invention provides a method for extracting an airborne LiDAR data building based on euclidean distance for the need of building extraction using airborne LiDAR data, comprising the steps of:
step S1, loading airborne LiDAR data;
step S2, removing noise points and carrying out elevation constraint to filter out ground points and low points;
step S3, extracting Z-shaped scanning lines, as shown in FIG. 1, sorting all points in each scanning line according to coordinates, and sequentially calculating Euclidean distances between the ordered points and variances of the distances of the points, and marking the variances as characteristic values;
step S4, determining an initial L value according to the change of the data volume;
step S5, automatically reducing the L value, calculating the characteristic value of the residual point, comparing the new L value with the characteristic value and iterating, stopping iterating according to the data volume change trend, and outputting the result;
and step S6, carrying out experimental test on effectiveness of different urban areas by using the algorithm.
The step S2 specifically includes the following steps:
(21) owing to the complexity of the urban environment, gross errors arise while acquiring UAV LiDAR data, so the raw point cloud must first be denoised. For each point, the same number of neighbourhood points (set to 10) is searched for, and the average distance D_mean from the point to its neighbours is calculated, together with the median m and standard deviation σ of these averages. The maximum allowed distance D_max is then calculated:
D_max = m + K*σ
where K is a multiple of the standard deviation, set to 5; if D_mean is greater than D_max, the point is a noise point and is removed;
(22) since ground points in urban areas resemble buildings in shape, the Euclidean distances between adjacent ground points vary little, so it is essential to remove all ground points before calculation. The method filters out, with the ground elevation as reference, all points within a height difference of 1 metre, ensuring that ground points and low points do not enter the calculation; because most buildings are no lower than 1 metre, no complex filtering algorithm is needed.
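As an illustration only, the denoising rule of (21) and the elevation constraint of (22) can be sketched in Python as follows; the function names, the brute-force NumPy neighbour search and the ground elevation passed as `ground_z` are assumptions of this sketch, not part of the invention:

```python
import numpy as np

def remove_noise(points, k=10, k_sigma=5.0):
    """Statistical outlier removal: for each point take the mean distance
    D_mean to its k nearest neighbours; remove points with
    D_mean > D_max = m + K*sigma, where m and sigma are the median and
    standard deviation of all D_mean values."""
    n = len(points)
    d_mean = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(points - points[i], axis=1)
        d_mean[i] = np.sort(d)[1:k + 1].mean()  # skip the zero self-distance
    m, sigma = np.median(d_mean), d_mean.std()
    return points[d_mean <= m + k_sigma * sigma]

def height_filter(points, ground_z=0.0, min_dz=1.0):
    """Keep only points at least min_dz above the ground elevation."""
    return points[points[:, 2] - ground_z >= min_dz]
```

A KD-tree neighbour search would replace the O(n²) loop in practice; the brute-force version is kept here only for brevity.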
The step S3 specifically includes the following steps:
(31) reading the 'TIME' attribute of a built-in GPS of the aircraft, and clustering the aircraft into the same scanning line with the same numerical value;
(32) and reordering the points in the same scanning line by X coordinate.
(33) calculating, in order from the start to the end of a scan line, the Euclidean distance between each pair of adjacent points, each distance value corresponding to two points; then calculating, again in order, the variance of each pair of adjacent distance segments as the feature value, each variance value corresponding to three points.
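Steps (31)-(33) can be sketched as below; the dictionary-per-scan-line layout and the function name are assumptions of this illustration:

```python
import numpy as np

def scanline_features(points, times):
    """Group points into scan lines by GPS 'TIME', sort each line by X,
    compute Euclidean distances between consecutive points, then the
    variance of each pair of adjacent distances; every variance value
    spans three consecutive points."""
    features = {}
    for t in np.unique(times):
        line = points[times == t]
        line = line[np.argsort(line[:, 0])]                # order along X
        d = np.linalg.norm(np.diff(line, axis=0), axis=1)  # consecutive distances
        features[t] = np.array([np.var(d[i:i + 2]) for i in range(len(d) - 1)])
    return features
```

On a flat roof the consecutive distances are nearly equal, so the variances stay near zero; over vegetation the distances scatter and the variances grow, which is the feature the threshold L acts on.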
The step S4 specifically includes the following steps:
(41) setting L to 1.0 and comparing it with the feature values: the three points whose variance value is smaller than L are retained; when a feature value exceeds L, the comparison between the next group's variance value and L is examined, and if it is also over the limit, the last point of the first group is marked as a danger point, otherwise the group's three points are retained;
(42) reducing L in steps of 0.1 and repeating the above, computing the screened points of the original data for each L value according to rule (41); recording the total amount of data retained after each L value and plotting the curve of this total;
(43) computing the slope between adjacent points of the curve from its trend, as shown in FIG. 4, and taking as the initial L value for the iterative calculation the L value at the point where the slope of the total data amount changes least.
The step S5 specifically includes the following steps:
(51) decreasing the value of L in steps of 0.0001;
(52) recalculating variance values of all points except the dangerous points;
(53) judging according to the judgment rule of the step S4, and recording the total amount of the safety point data calculated each time;
(54) and repeating the steps for iteration, stopping iteration when the total amount of the data points changes most smoothly, and outputting a result.
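The iterative refinement of (51)-(54) can be sketched likewise; the `patience` criterion used here to detect the "smoothest change" in the point total is an assumption of this illustration:

```python
import numpy as np

def iterate_L(var_values, L0, step=1e-4, L_end=0.04, patience=100):
    """From the initial L, lower the threshold in 1e-4 steps, recounting
    the retained ('safe') points each time; stop once the count has
    stayed unchanged for `patience` consecutive steps, or when L reaches
    the early cut-off L_end found sufficient in the experiments."""
    L = L0
    prev = int(np.sum(var_values < L))
    stable = 0
    while L - step >= L_end:
        L -= step
        cur = int(np.sum(var_values < L))
        stable = stable + 1 if cur == prev else 0
        prev = cur
        if stable >= patience:
            break
    return L, prev
```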
In order to verify the performance of the building detection method, actually measured unmanned aerial vehicle LiDAR data provided by a certain unit is used as experimental data, and the feasibility of extracting the building by taking the Euclidean distance as a characteristic is verified through experiments.
The invention selects part of Shenyang (41.749477° N, 123.522408° E) and part of Panjin (41.411438° N, 122.438642° E) as study areas; both are typical high-density building-coverage areas of cities, with a high greening rate and complex vegetation forms. The experimental data were acquired on 24 August 2019 with a MiniVUX-1UAV laser scanner from Riegl carried by an unmanned aerial vehicle, with a flight-line spacing of 40 m, a flight height of 50 m, a flight speed of 5 m/s, a LiDAR scanning angle of 90-270° and a scan-line speed of 100 m/s. Building sample data obtained with the UAV LiDAR system are shown in fig. 1.
Taking test area 1 as an example, non-ground points were obtained through the denoising and elevation-constraint steps; the raw UAV LiDAR data and the effect after processing are shown in figs. 3(a) and 3(b). The initial L value is determined from the curve of total data amount: the total number of data points in the area is 2,579,932 and, as shown in fig. 4, calculating the data with different initial L values yields different data totals; the curve gradually flattens and finally drops sharply, and the position of minimum curvature is taken as the initial L value.
After the initial calculation, the iteration gradually reduces the L value so that the data can be calculated more precisely. As shown in fig. 5, the step size for the iterative calculation of L is set to 0.0001. Fig. 5 shows that the number of data points changes drastically when L is greater than 0.091 or less than 0.009, and slowly in between (because the point cloud changes little between 0.091 and 0.009 and the data there are dense, the slowly changing, densely packed point counts between 0.09 and 0.01 are omitted from the plot for a more intuitive view). The differences in point count between adjacent steps are averaged; the change in data amount is found to be essentially constant once L has iterated down to 0.05 and, to reduce computation time, the calculation is stopped at L = 0.04.
To demonstrate the superiority of the experimental method, the experiment was compared with the following two methods.
Method 1: the triangulated irregular network (TIN) based method proposed by Homanyun et al. A TIN is first built from the raw point cloud, and edge points of protrusions are extracted using the normal vectors, side lengths and elevation features of the triangles containing them. The extracted edge points are then used as seed points for region growing along the TIN connectivity to extract the protrusion point sets; finally, non-building point sets containing few points are deleted, leaving the building point set.
Method 2: K-means clustering. The K-means algorithm is an iterative clustering analysis method: the data are divided into K groups in advance, K objects are randomly chosen as initial cluster centres, the distance from every object to each cluster centre is computed, and each object is assigned to its nearest centre; a centre together with the objects assigned to it forms a cluster. Using elevation and echo intensity as classification features, the data are divided into 5 classes, from which the building point cloud set is finally obtained.
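For reference, the clustering step of Method 2 can be sketched with a minimal Lloyd's algorithm; the deterministic initialisation and the toy two-dimensional features are assumptions of this sketch (the comparison experiment used elevation and echo intensity as features and K = 5):

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Minimal Lloyd's algorithm: assign each sample to its nearest
    centre, then move each centre to the mean of its members."""
    centres = X[:k].astype(float).copy()   # simple deterministic init
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(axis=0)
    return labels
```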
FIG. 6, FIG. 7 and FIG. 8 show the results of the SLED, TIN and K-means algorithms respectively. Red frames 1 and 2 mark the same locations in each result and highlight the differences in detail: as the red circle in fig. 7 shows, the TIN algorithm cannot avoid the influence of dense, tall vegetation, and the K-means algorithm fails to detect part of the buildings, classing them as non-buildings.
The extraction results are shown in table 1.
TABLE 1. Detected area and overlap area of each algorithm in test area 1
Fig. 9 and fig. 10 are bar charts of the error rate and leakage rate of the three algorithms, from which the differences can be seen at a glance. The overall accuracy of the present method reaches 97.1%, with the error rate and leakage rate reduced to 2.2% and 3.7% respectively. The TIN method reaches an overall accuracy of 89.6%, with an error rate of 4.9% and a leakage rate of 16.0%. The K-means algorithm has an overall accuracy of 83.4%, with an error rate of 7.3% and a leakage rate of 25.9%.
The following conclusions were drawn from all the above experiments:
(1) the present invention (SLED) achieved an overall accuracy of 97.1% and error and leakage rates of 2.2% and 3.7% on a municipal data set containing a large number of tall and dense plants.
(2) In the data set, when the initial L value is selected to be 0.1, the iteration step length is set to be 0.0001, and the termination L value is selected to be 0.04, the building extraction accuracy is highest, and the calculation interval is short.
(3) By converting the research object from the traditional discrete points into the scanning line, the spatial relationship between the adjacent points can be fully utilized, and a new idea is provided for the full-automatic extraction of the building.
It should be understood that the detailed description above merely illustrates the invention and does not limit it to the technical solutions described in the embodiments; those skilled in the art will understand that the invention may be modified or equivalently substituted while achieving the same technical effect, and as long as the use requirements are met, such modifications fall within the scope of protection of the invention.

Claims (5)

1. A Euclidean distance based airborne LiDAR data building extraction method, characterized in that it comprises the following steps:
step S1, loading airborne LiDAR data;
step S2, removing noise points of the airborne LiDAR data, and performing elevation constraint to filter out ground points and low points;
step S3, extracting scanning lines, sequencing all points in each scanning line according to coordinates, sequentially calculating Euclidean distances between the sequenced points and variances of the distances of the points, and marking the variances as characteristic values;
step S4, determining an initial L value according to the change of the data volume;
step S5, automatically reducing the L value, calculating the characteristic value of the residual point, comparing the new L value with the characteristic value and iterating, stopping iterating according to the data volume change trend, and outputting the result;
and step S6, experimentally testing the effectiveness of the algorithm on data from different urban areas.
2. The Euclidean distance-based airborne LiDAR data building extraction method of claim 1, wherein: the step S2 includes the steps of:
(21) for each point, searching for the same number of neighbourhood points and calculating the average distance D_mean from the point to its neighbours, together with the median m and standard deviation σ of these averages; then calculating the maximum allowed distance D_max:
D_max = m + K*σ
where K is a multiple of the standard deviation, set to 5; if D_mean is greater than D_max, the point is a noise point and is removed;
(22) filtering out, with the ground elevation as reference, all points within a height difference of 1 metre, so that ground points and low points are excluded from the subsequent calculation.
3. The Euclidean distance-based airborne LiDAR data building extraction method of claim 1, wherein: the step S3 includes the steps of:
(31) reading the 'TIME' attribute recorded by the aircraft's built-in GPS and clustering points with the same time value into the same scan line;
(32) re-ordering the points within each scan line by their X coordinates;
(33) calculating, in order, the Euclidean distance between each pair of adjacent points, then calculating, in order, the variance of each pair of adjacent distances; each variance value corresponds to three points.
4. The Euclidean distance-based airborne LiDAR data building extraction method of claim 1, wherein: the step S4 includes the steps of:
(41) setting the threshold value L to 1.0 and comparing it with the characteristic (variance) values: the three points of any group whose variance value is smaller than L are retained; when a group's variance value exceeds L, the comparison result of the next group is examined, and if that group also exceeds the limit, the last point of the first group is marked as a dangerous point, otherwise the three points of the first group are retained;
(42) decreasing the L value in steps of 0.1, screening the original data with each L value according to the rule in (41), recording the total amount of data retained under each L value, and plotting the curve of the total data amount;
(43) obtaining the slope between adjacent points from the trend of this curve, and taking as the initial L value for the iterative calculation the L value of the point at which the slope of the total data amount changes least.
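The screening rule of step (41) can be sketched as follows. This is one reading of the claim, under the assumption that each variance value represents one group of three consecutive points and that a point is marked dangerous only when its own group and the following group both exceed L; the function name and return convention are hypothetical.

```python
def screen_points(variances, L):
    """Split group indices into safe groups and dangerous points under
    threshold L, following step (41): an exceedance is only confirmed
    (dangerous) when the next group also exceeds L."""
    safe, dangerous = [], []
    for i, v in enumerate(variances):
        if v < L:
            safe.append(i)                       # keep the group's three points
        elif i + 1 < len(variances) and variances[i + 1] >= L:
            dangerous.append(i)                  # both groups over limit
        else:
            safe.append(i)                       # isolated exceedance: keep
    return safe, dangerous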
5. The Euclidean distance-based airborne LiDAR data building extraction method of claim 1, wherein: the step S5 includes the steps of:
(51) decreasing the value of L in steps of 0.0001;
(52) recalculating variance values of all points except the dangerous points;
(53) judging according to the judgment rule of step S4, and recording the total amount of safe-point data calculated in each pass;
(54) repeating the above steps iteratively, stopping the iteration when the total amount of data points changes most smoothly, and outputting the result.
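The iteration of steps (51)-(54) can be sketched as follows. Two assumptions are made explicit here: a simple below-threshold count stands in for the full screening rule of claim 4, and "changes most smoothly" is interpreted as the smallest change in retained-point count between successive L values; the function name and step defaults mirror the claim but the stopping criterion is this sketch's reading, not a definitive implementation.

```python
import numpy as np

def iterate_L(variances, L0, step=0.0001, n_steps=500):
    """Lower L from L0 in fixed steps, count the safe points at each L,
    and stop where the count changes least between successive steps."""
    variances = np.asarray(variances)
    Ls = L0 - step * np.arange(1, n_steps + 1)
    Ls = Ls[Ls > 0]                              # L must stay positive
    counts = np.array([(variances < L).sum() for L in Ls])  # safe points per L
    deltas = np.abs(np.diff(counts))             # change between iterations
    stop = int(np.argmin(deltas))                # flattest change -> stop here
    return Ls[stop], counts[stop]
```

On a small example the count curve is flat over the first few steps, so the iteration stops at the first L value where the retained total does not change.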
CN202011328391.1A 2020-11-24 2020-11-24 Method for extracting airborne LiDAR data building based on Euclidean distance Active CN112381029B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011328391.1A CN112381029B (en) 2020-11-24 2020-11-24 Method for extracting airborne LiDAR data building based on Euclidean distance

Publications (2)

Publication Number Publication Date
CN112381029A true CN112381029A (en) 2021-02-19
CN112381029B CN112381029B (en) 2023-11-14

Family

ID=74588925

Country Status (1)

Country Link
CN (1) CN112381029B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116579949A (en) * 2023-05-31 2023-08-11 浙江省测绘科学技术研究院 Airborne point cloud ground point filtering method suitable for urban multi-noise environment

Citations (6)

Publication number Priority date Publication date Assignee Title
CN102103202A (en) * 2010-12-01 2011-06-22 武汉大学 Semi-supervised classification method for airborne laser radar data fusing images
CN105488770A (en) * 2015-12-11 2016-04-13 中国测绘科学研究院 Object-oriented airborne laser radar point cloud filtering method
CN106199557A (en) * 2016-06-24 2016-12-07 南京林业大学 A kind of airborne laser radar data vegetation extracting method
CN106970375A (en) * 2017-02-28 2017-07-21 河海大学 A kind of method that building information is automatically extracted in airborne laser radar point cloud
US20170262732A1 (en) * 2014-08-01 2017-09-14 Shenzhen Cimc-Tianda Airport Support Ltd. System and method for aircraft docking guidance and aircraft type identification
CN110992341A (en) * 2019-12-04 2020-04-10 沈阳建筑大学 Segmentation-based airborne LiDAR point cloud building extraction method

Non-Patent Citations (2)

Title
MAOHUA LIU et al.: "Method for extraction of airborne LiDAR point cloud buildings based on segmentation", PLOS ONE, pages 1-11 *
LIU Maohua et al.: "Study on urban land-cover classification by fusion of airborne LiDAR and GF-2 imagery", China Sciencepaper, vol. 13, no. 9, pages 990-994 *

Also Published As

Publication number Publication date
CN112381029B (en) 2023-11-14

Similar Documents

Publication Publication Date Title
CN110570428B (en) Method and system for dividing building roof sheet from large-scale image dense matching point cloud
CN110781827B (en) Road edge detection system and method based on laser radar and fan-shaped space division
CN112070769B (en) Layered point cloud segmentation method based on DBSCAN
CN106529469B (en) Unmanned aerial vehicle-mounted LiDAR point cloud filtering method based on self-adaptive gradient
Aschoff et al. Describing forest stands using terrestrial laser-scanning
CN106127857B (en) The on-board LiDAR data modeling method of integrated data driving and model-driven
CN108010092A (en) A kind of city high density area Solar use potential evaluation method based on low altitude photogrammetry
US20130096886A1 (en) System and Method for Extracting Features from Data Having Spatial Coordinates
CN106680798B (en) A kind of identification of airborne LIDAR air strips overlay region redundancy and removing method
CN108109139B (en) Airborne LIDAR three-dimensional building detection method based on gray voxel model
CN110794413B (en) Method and system for detecting power line of point cloud data of laser radar segmented by linear voxels
CN105488770A (en) Object-oriented airborne laser radar point cloud filtering method
CN108074232B (en) Voxel segmentation-based airborne LIDAR building detection method
CN114764871B (en) Urban building attribute extraction method based on airborne laser point cloud
CN109961470B (en) Living standing tree She Shuxing accurate estimation method based on laser point cloud
CN112099046B (en) Airborne LIDAR three-dimensional plane detection method based on multi-value voxel model
CN104597449B (en) A kind of airborne many scanning weather radar target vertical contour reconstruction methods
CN111950589B (en) Point cloud region growing optimization segmentation method combined with K-means clustering
CN114119902A (en) Building extraction method based on unmanned aerial vehicle inclined three-dimensional model
CN113177897A (en) Rapid lossless filtering method for disordered 3D point cloud
CN112381029B (en) Method for extracting airborne LiDAR data building based on Euclidean distance
Shen et al. Object-based classification of airborne light detection and ranging point clouds in human settlements
CN111060922B (en) Tree point cloud extraction method based on airborne laser radar point cloud spatial distribution characteristics
CN117765006A (en) Multi-level dense crown segmentation method based on unmanned aerial vehicle image and laser point cloud
CN112862720A (en) Denoising method and system for diffuse reflection noise of glass plate in airborne LiDAR point cloud

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant