CN112381029B - Method for extracting airborne LiDAR data building based on Euclidean distance - Google Patents


Info

Publication number
CN112381029B
CN112381029B CN202011328391.1A
Authority
CN
China
Prior art keywords
points
value
data
point
steps
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011328391.1A
Other languages
Chinese (zh)
Other versions
CN112381029A (en
Inventor
刘茂华
陈晗琳
王岩
由迎春
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenyang Jianzhu University
Original Assignee
Shenyang Jianzhu University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenyang Jianzhu University filed Critical Shenyang Jianzhu University
Priority to CN202011328391.1A priority Critical patent/CN112381029B/en
Publication of CN112381029A publication Critical patent/CN112381029A/en
Application granted granted Critical
Publication of CN112381029B publication Critical patent/CN112381029B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/176 Urban or other man-made structures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/30 Noise filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

An airborne LiDAR data building extraction method based on Euclidean distance belongs to the technical field of airborne LiDAR point cloud data classification. The method comprises the following steps: removing noise points and applying an elevation threshold to filter out ground points and low points; extracting scan lines, sorting all points in each scan line by coordinates, then sequentially computing the Euclidean distances between the ordered points and the variance of each point's distances, the variance being recorded as the characteristic value; determining an initial L value from the change in data amount; automatically decreasing the L value, recomputing the characteristic values of the remaining points, comparing the new L value with the characteristic values, and iterating; stopping the iteration according to the trend of the data amount and outputting the result. Based on UAV laser radar point cloud data acquired over urban sample areas, the invention converts the research object from discrete points into scan lines for calculation, so that the spatial relationships among all points within the same scan line can be fully exploited. The method is fully automatic, simple and efficient, requires no parameters in the calculation, and is free of human error.

Description

Method for extracting airborne LiDAR data building based on Euclidean distance
Technical Field
The invention belongs to the technical field of airborne laser LiDAR point cloud data classification, and particularly relates to an airborne LiDAR data building extraction method based on Euclidean distance.
Background
Cities are the habitat of human life and the center of gravity of socioeconomic activity, and urbanization has profoundly changed the living environment and lifestyle of humans. With the rapid advance of urbanization, how to reasonably manage and plan rapidly expanding cities has become a problem every developing city must solve. Buildings are the most changeable part of a city, and the demands on detecting building changes in urban areas keep rising, mainly in two respects: 1. in the manner of acquiring change information, it is desirable to acquire it more quickly; 2. in the content of the change information, it is desirable to obtain richer and more accurate change attribute information and spatial change information for the changed regions. For a long time, scholars have mostly detected buildings from remote sensing image data using image processing methods. Because remote sensing images lack direct three-dimensional information, problems such as different objects with the same spectrum, the same object with different spectra, shadows and projections pose great challenges. Laser radar (Light Detection and Ranging, LiDAR) is an advanced active remote sensing technology that can rapidly acquire high-precision height information and three-dimensional structure information of surface targets, and has strong anti-interference capability and good low-altitude detection performance. With the rapid development of unmanned aerial vehicle technology, UAV laser radar provides powerful technical support for rapid and accurate fine vegetation classification.
In recent years, methods for extracting buildings from airborne LiDAR data have developed considerably, and they can be roughly divided into two categories: direct extraction and indirect extraction. Direct extraction of the building point cloud separates buildings from the ground, trees and other objects directly by means of point cloud segmentation and classification. It is difficult for such methods to establish an accurate correspondence between the segmentation and classification results and the building roofs, so missed extraction and misclassification occur easily and the detected building regions are incomplete. Common direct extraction methods fall roughly into four groups: edge segmentation, region segmentation, feature clustering and model fitting. Indirect extraction of the building point cloud generally filters out the ground point cloud first and then identifies the point sets belonging to building roofs among the non-ground points, which better suits situations lacking other auxiliary data. Indirect extraction is strongly related to the building extraction techniques of the direct methods; the only difference is that the ground point cloud does not participate. According to the data structure used, indirect building extraction methods are divided into five types: based on the original point cloud, on regular grids, on triangulated irregular networks (TIN), on profiles, and on voxels.
At present, TIN-based extraction is the most widely applied; it preserves the accuracy and information content of the data while capturing the proximity relations of the laser footprints, but it suffers from a complex data structure, large storage requirements and difficult spatial analysis. Extraction based on regular grids is also mature and allows many existing image processing techniques to be introduced, giving efficient spatial analysis and computation, but it is easily affected by grid scale selection and data interpolation, which cause losses in data accuracy. In addition, these approaches all require parameters to be set before calculation, and the parameter values greatly influence the extraction result. None of the above algorithms can guarantee both efficiency and accuracy for building extraction over large urban areas. On the other hand, because tall or large, regularly shaped plants are always present in urban areas, building extraction results suffer equally: most algorithms either cannot avoid the plants accurately when extracting buildings, or remove the plants together with part of the buildings, leaving the extracted buildings incomplete.
Disclosure of Invention
Aiming at the above technical problems, a method for extracting buildings from airborne LiDAR data based on Euclidean distance is provided. The research object is converted from traditional discrete points into regular scanning lines, the relationships between the point clouds within a line are exploited, and, using the characteristic that building point clouds are distributed regularly while plant point clouds are distributed irregularly, the fluctuation of the distance data between points is calculated, realizing Euclidean-distance-based building extraction from UAV LiDAR data.
The aim of the invention is realized by the following technical scheme:
the invention discloses an airborne LiDAR data building extraction method based on Euclidean distance, which comprises the following steps:
step S1, loading airborne LiDAR data;
step S2, removing noise points from the airborne LiDAR data, and applying an elevation threshold to filter out ground points and low points;
s3, extracting scanning lines, sequencing all points in each scanning line according to coordinates, and sequentially calculating Euclidean distances among ordered points and variances of the distances of the points, wherein the variances are marked as characteristic values;
s4, determining an initial L value according to the data quantity change;
s5, automatically reducing the L value, calculating the characteristic value of the residual point, comparing the new L value with the characteristic value, iterating, stopping iterating according to the data quantity change trend, and outputting a result;
and step S6, experimentally testing the effectiveness of steps S1-S5 on different urban areas.
Further, the step S2 includes the steps of:
(21) Searching for the same number of neighborhood points for each point, calculating the mean value D_mean of the distances from the point to its neighborhood points, and calculating the maximum distance D_max from the median m and the standard deviation σ:
D_max = m + K × σ
where K is a standard-deviation multiple, set to 5; if D_mean is greater than D_max, the point is considered a noise point and is removed;
(22) Filtering out, with the ground elevation as the reference, the point cloud within a height difference of 1 meter, ensuring that ground points and low points are not present in the calculated data.
Further, the step S3 includes the steps of:
(31) Reading the TIME attribute recorded by the aircraft's built-in GPS, and clustering points with the same value into the same scanning line;
(32) Re-ordering the points in the same scanning line by X coordinates;
(33) Sequentially calculating the Euclidean distance between each pair of adjacent points, and then sequentially calculating the variance of every two adjacent distances, each variance value corresponding to three points.
Further, the step S4 includes the steps of:
(41) Setting the L value to 1.0 and comparing it with the characteristic values, retaining the three points whose variance value is smaller than the L value; when a characteristic value is greater than the L value, observing the result of comparing the next variance value with the L value, and if that result also exceeds the limit, marking the last point of the first group as a dangerous point, otherwise retaining the three points;
(42) Reducing the L value in steps of 0.1, screening the original data with each different L value according to the rule of step (41), recording the total amount of data obtained with each L value, and plotting the curve of the total data amount;
(43) Obtaining the slope between each pair of adjacent points from the trend of the curve, and taking the L value at the point where the change of slope of the total data amount is minimum as the initial L value for the iterative calculation.
Further, the step S5 includes the steps of:
(51) Decreasing the L value in steps of 0.0001;
(52) Recalculating variance values of all points except dangerous points;
(53) Judging according to the judging rule in the step S4, and recording the total amount of the safety point data after each calculation;
(54) Repeating the steps for iteration, stopping iteration when the total quantity of the data points changes most gently, and outputting a result.
The beneficial effects of the invention are as follows:
1. according to the invention, based on unmanned plane laser radar point cloud data acquired in the urban sample area, a research object is converted into a scanning line from discrete points for calculation, and the spatial relation existing between all points in the same scanning line can be fully utilized.
2. The invention provides a method for extracting buildings by calculating the Euclidean distances between points, which is simple and efficient and requires no parameter setting. On urban data of two different areas it achieved an overall accuracy of 97.1%, an error rate of 2.2% and an omission rate of 3.7%; experiments on the data set verified that setting the initial L value to 0.1 and the iteration end value to 0.04 satisfies both high efficiency and high accuracy.
3. The invention creatively converts the research object from the traditional discrete point to the scanning line, can fully utilize the spatial relationship between adjacent points, and provides a new thought for other building extraction methods.
Drawings
FIG. 1 is a schematic view of a "Z" shaped scan line.
Fig. 2 is a flow chart of the method of the present invention.
Fig. 3 (a) is a view of the original unmanned aerial vehicle LiDAR data.
Fig. 3 (b) is a diagram of the effect after filtering.
Fig. 4 is the curve of total data amount under different initial L values.
Fig. 5 is the curve of data point number during the iterative reduction of the L value.
Fig. 6 is a diagram of the extraction result of the SLED method of the present invention.
Fig. 7 is a diagram of the extraction result of the TIN method.
Fig. 8 is a diagram of the extraction result of the K-means method.
Fig. 9 shows the error rates of three algorithms.
FIG. 10 shows the leak rates of the three algorithms.
Detailed Description
In order to clearly illustrate the technical features of the present solution, the present invention will be described in detail below with reference to the following detailed description and the accompanying drawings. It should be noted that the components illustrated in the figures are not necessarily drawn to scale. Descriptions of well-known components and processing techniques and processes are omitted so as to not unnecessarily obscure the present invention.
Examples: aiming at the requirements of building extraction based on airborne laser radar data, the invention provides an airborne LiDAR data building extraction method based on Euclidean distance, which solves the problem of rapid and accurate extraction of a large-scale building based on airborne LiDAR data at present by extracting scanning lines and utilizing the spatial relationship among points in one scanning line.
As shown in the flowchart of fig. 2, the invention provides an airborne LiDAR data building extraction method based on euclidean distance, aiming at the requirement of building extraction by using airborne LiDAR data, comprising the following steps:
step S1, loading airborne LiDAR data;
step S2, removing noise points and applying an elevation threshold to filter out ground points and low points;
step S3, extracting Z-shaped scanning lines, as shown in FIG. 1, sequencing all points in each scanning line according to coordinates, and sequentially calculating variances of Euclidean distances among ordered points and distances of each point, wherein the variances are marked as characteristic values;
s4, determining an initial L value according to the data quantity change;
s5, automatically reducing the L value, calculating the characteristic value of the residual point, comparing the new L value with the characteristic value, iterating, stopping iterating according to the data quantity change trend, and outputting a result;
and step S6, experimentally testing the effectiveness of the algorithm on different urban areas.
The step S2 specifically includes the following steps:
(21) Owing to the complexity of the urban environment, gross errors arise in acquiring UAV LiDAR data, so the original point cloud data must be denoised first. For each point, the same number of neighborhood points (set to 10) is searched, and the mean value D_mean of the distances from the point to its neighborhood points is calculated, together with the median m and the standard deviation σ. The maximum distance D_max is then calculated:
D_max = m + K × σ
where K is a standard-deviation multiple, set to 5; if D_mean is greater than D_max, the point is considered a noise point and is removed;
(22) Since ground points in urban areas are morphologically similar to buildings, the Euclidean distances between their adjacent points likewise change little, so the ground points must be eliminated before the calculation. Taking the ground elevation as the reference, the point cloud within a height difference of 1 meter is filtered out, ensuring that ground points and low points are excluded from the calculated data; because most buildings are no lower than 1 meter, no complex filtering algorithm is needed.
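A minimal sketch of step S2 in Python with NumPy (the function name, the brute-force neighbor search, and the `ground_z` input are illustrative assumptions, not from the patent, which would use the measured ground height):

```python
import numpy as np

def denoise_and_remove_ground(points, ground_z, k=10, K=5.0, min_height=1.0):
    """Step S2 sketch: statistical noise removal (D_max = m + K*sigma)
    followed by the 1 m elevation threshold above the ground height.
    points: (N, 3) array of x, y, z; ground_z: reference ground elevation."""
    # Brute-force pairwise distances (fine for small N; a KD-tree would scale better).
    diff = points[:, None, :] - points[None, :, :]
    dists = np.sqrt((diff ** 2).sum(axis=-1))
    dists.sort(axis=1)
    d_mean = dists[:, 1:k + 1].mean(axis=1)   # mean distance to the k nearest neighbors
    m, sigma = np.median(d_mean), d_mean.std()
    d_max = m + K * sigma                     # K is the standard-deviation multiple (5)
    clean = points[d_mean <= d_max]           # D_mean > D_max marks a noise point
    # Elevation threshold: drop ground and low points within 1 m of the ground.
    return clean[clean[:, 2] > ground_z + min_height]
```

In this sketch the median m and standard deviation σ are taken over the per-point mean neighbor distances, which matches the D_max formula above; the exact statistic the patent intends is not fully specified.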
The step S3 specifically comprises the following steps:
(31) Reading the TIME attribute of the built-in GPS of the aircraft, and clustering the same values into the same scanning line;
(32) The points within the same scan line are reordered in X coordinates.
(33) The Euclidean distances between each pair of adjacent points in a scanning line are calculated in order from beginning to end, each distance value corresponding to two points; the variance of every two adjacent distances is then calculated from beginning to end as the characteristic value, each variance value corresponding to three points.
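The scan-line feature computation of step S3 can be sketched as follows (assuming NumPy; the function name and the dictionary return format are illustrative):

```python
import numpy as np

def scanline_variances(xyz, gps_time):
    """Step S3 sketch: cluster points with identical GPS TIME values into
    scan lines, sort each line by X, compute consecutive Euclidean
    distances, then the variance of every two adjacent distances
    (one characteristic value per three points)."""
    features = {}
    for t in np.unique(gps_time):
        line = xyz[gps_time == t]
        line = line[np.argsort(line[:, 0])]                # order along the scan direction
        d = np.linalg.norm(np.diff(line, axis=0), axis=1)  # distance between adjacent points
        pairs = np.stack([d[:-1], d[1:]], axis=1)          # every two adjacent distances
        features[float(t)] = pairs.var(axis=1)             # one variance per 3 points
    return features
```

Evenly spaced points (as on a planar roof) give variances near zero, while the ragged spacing typical of vegetation gives large variances, which is the contrast the method thresholds with L.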
The step S4 specifically includes the following steps:
(41) The L value is set to 1.0 and compared with the characteristic values: the three points whose variance value is smaller than the L value are retained; when a characteristic value is greater than L, the result of comparing the next variance value with L is examined, and if it also exceeds the limit, the last point of the first group is marked as a dangerous point, otherwise the three points of the group are retained;
(42) The L value is reduced in steps of 0.1, the above procedure is repeated, the original data are screened with each L value according to the rule of step (41), the total amount of data obtained with each L value is recorded, and the curve of the total data amount is plotted;
(43) The slope between each pair of adjacent points on the curve, shown in Fig. 4, is obtained from its trend, and the L value at the point where the change of slope of the total data amount is minimum is taken as the initial L value for the iterative calculation.
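Steps (41)-(43) can be sketched as a coarse scan over candidate L values (assuming NumPy; the simplified screening, which counts groups whose variance is below L without the dangerous-point rule, and the flatness criterion are illustrative stand-ins for the full procedure):

```python
import numpy as np

def initial_L(variances, coarse_step=0.1):
    """Step S4 sketch: lower L from 1.0 in 0.1 steps, record the total
    amount of data each threshold retains, and pick the L where the
    slope of that curve changes least (its flattest stretch)."""
    Ls = np.round(np.arange(1.0, 0.0, -coarse_step), 10)
    totals = np.array([3 * np.count_nonzero(variances < L) for L in Ls])
    slopes = np.diff(totals)                       # slope between adjacent curve points
    flattest = np.argmin(np.abs(np.diff(slopes)))  # where the slope changes least
    return float(Ls[flattest + 1])
```

The factor 3 reflects that each variance value stands for a group of three points, so the retained-data total is three times the number of surviving groups.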
The step S5 specifically includes the following steps:
(51) Decreasing the L value in steps of 0.0001;
(52) Recalculating variance values of all points except dangerous points;
(53) Judging according to the judging rule in the step S4, and recording the total amount of the safety point data after each calculation;
(54) Repeating the steps for iteration, stopping iteration when the total quantity of the data points changes most gently, and outputting a result.
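Step S5 can then be sketched as a fine iteration from the coarse initial value (assuming NumPy; the simplified survivor count and the `flat_tol` stopping tolerance are illustrative stand-ins for the patent's "most gentle change" criterion, which in practice might be judged over a window of steps):

```python
import numpy as np

def refine_L(variances, L0, step=0.0001, flat_tol=1):
    """Step S5 sketch: shrink L in small steps, recount the surviving
    (safe) points each time, and stop once the count changes by no more
    than flat_tol between iterations, i.e. the curve has gone flat."""
    L = L0
    prev = cur = np.count_nonzero(variances < L)
    while L - step > 0.0:
        L -= step
        cur = np.count_nonzero(variances < L)
        if prev - cur <= flat_tol:    # data amount basically unchanged
            break
        prev = cur
    return L, int(cur)
```

With a distribution that has a dense band of variances above a flat gap, the iteration walks down through the band and halts where the survivor count stabilizes, mirroring the described stopping rule.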
In order to verify the performance of the building detection method, actually measured LiDAR data provided by a certain unit is used as experimental data, and the Euclidean distance is verified through experiments to be used as a feature to extract the feasibility of the building.
The invention selects part of Shenyang City (41.749477°N, 123.522408°E) and part of Panjin City (41.411438°N, 122.438642°E) as study areas; these are typical high-density building-coverage areas in cities, with a high greening rate and complex vegetation forms. The experimental data were acquired on 24 August 2019 by a Riegl miniVUX-1UAV laser scanner carried on a DJI unmanned aerial vehicle; the flight-line spacing was 40 m, the flying height 50 m, the flying speed 5 m/s, the laser scanning angle 90-270°, and the scanning linear speed 100 m/s. The building sample data obtained with the UAV laser radar system are shown in Fig. 1.
Taking test area 1 as an example, the above denoising rule and elevation constraint are applied to obtain the non-ground points; the original UAV LiDAR data and the experimental effect are shown in Fig. 3(a) and Fig. 3(b). The initial L value is determined through the curve of total data amount; the total number of data points in the region is 2,579,932. As shown in Fig. 4, calculating with different initial L values yields different data totals; the curve as a whole levels off gradually and finally drops sharply, and the position of minimum curvature is taken as the initial L value.
After the initial calculation is completed, the L value is gradually reduced by iteration so that the data are calculated more precisely. As shown in Fig. 5, the step size for iteratively calculating the L value is set to 0.001; it can be seen from Fig. 5 that the number of data points varies drastically when the L value is greater than 0.091 or less than 0.009, while between 0.091 and 0.009 the data amount changes slowly (because the point cloud changes little in this range and the data density is large, the small, dense changes between 0.09 and 0.01 are omitted from the figure so the number of point clouds can be perceived intuitively). Averaging the differences between the numbers of data points at adjacent steps shows that the change in data amount is basically constant once the L value iterates below 0.05; to reduce computation time, the calculation in this example is stopped when the L value reaches 0.04.
To demonstrate the superiority of this experimental method, this experiment was compared with the following two methods.
Method 1: the triangulated irregular network (TIN) based method proposed by He Manyun et al. First an irregular triangular network is built from the original point cloud data, and protruding edge points are extracted using the normal vectors, side lengths and elevation features of the triangles in which they lie. The extracted edge points are then used as seed points for region growing along the connectivity of the triangular network to extract the protruding point set; finally, non-building point sets containing few points are deleted to obtain the building point set.
Method 2: the K-means clustering method. The K-means clustering algorithm is an iteratively solved cluster analysis algorithm: the data are pre-divided into K groups, K objects are randomly selected as initial cluster centers, the distance between every object and each seed cluster center is calculated, and each object is assigned to its nearest cluster center. A cluster center together with the objects assigned to it represents one cluster. Here elevation and echo intensity serve as the classification criteria, the data are divided into 5 classes, and the building point cloud set is finally obtained.
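For reference, comparison method 2 can be sketched with a minimal K-means on (elevation, echo intensity) features (assuming NumPy; the deterministic spread-out initialization replaces the random seeding described above, and all names are illustrative):

```python
import numpy as np

def kmeans_labels(features, k=5, iters=30):
    """Comparison-method sketch: cluster points into k groups on their
    (elevation, echo intensity) features; the building cluster would
    then be selected from the k groups afterwards."""
    # Deterministic init: k points spread along the first feature axis.
    order = np.argsort(features[:, 0])
    idx = np.linspace(0, len(features) - 1, k).astype(int)
    centers = features[order[idx]].astype(float).copy()
    labels = np.zeros(len(features), dtype=int)
    for _ in range(iters):
        d = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)        # assign each point to its nearest center
        for j in range(k):
            if np.any(labels == j):      # skip empty clusters
                centers[j] = features[labels == j].mean(axis=0)
    return labels
```

This makes concrete why the method can miss buildings: any roof whose elevation and intensity resemble another class is absorbed into that cluster, as observed in the comparison below.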
The results of the SLED, TIN and K-means algorithms are shown in Figs. 6, 7 and 8, respectively. The same regions in red frame 1 and red frame 2 show the differences in detail among the three: as can be seen from the red circle in Fig. 7, the TIN algorithm cannot avoid the influence of dense, tall vegetation, and the K-means algorithm treats part of the building as non-building and fails to detect it.
The extraction results are shown in Table 1.
Table 1 each algorithm of test area 1 detects area and overlap area
Figs. 9 and 10 are stacked bar charts of the error rates and omission rates of the three algorithms, respectively, from which the differences among them can be seen intuitively. The comprehensive accuracy of the proposed method reaches 97.1%, with an error rate and omission rate as low as 2.2% and 3.7%, respectively. The comprehensive accuracy of the TIN method reaches 89.6%, with an error rate and omission rate of 4.9% and 16.0%, respectively. The comprehensive accuracy of the K-means algorithm is 83.4%, with an error rate and omission rate of 7.3% and 25.9%, respectively.
The following conclusions were drawn by all the above experiments:
(1) The present invention (SLED) achieves an overall accuracy of 97.1% and error and omission rates of 2.2% and 3.7% on urban data sets containing large numbers of large and dense plants.
(2) In the data set, when the initial L value is selected to be 0.1, the iteration step is set to be 0.0001, and the final L value is selected to be 0.04, the building extraction accuracy is the highest, and the calculation time is short.
(3) By converting the research object from the traditional discrete point to the scanning line, the spatial relationship between adjacent points can be fully utilized, and a new idea is provided for the full-automatic extraction of the building.
It should be understood that the foregoing detailed description is provided for illustration only and that the invention is not limited to the technical solutions described in the embodiments. Those skilled in the art should understand that modifications or equivalent substitutions achieving the same technical effect remain within the protection scope of the invention as long as the usage requirements are met.

Claims (3)

1. An airborne LiDAR data building extraction method based on Euclidean distance is characterized in that: the method comprises the following steps:
step S1, loading airborne LiDAR data;
step S2, removing noise points from the airborne LiDAR data, and applying an elevation threshold to filter out ground points and low points;
the step S2 includes the steps of:
(21) searching for the same number of neighborhood points for each point, calculating the mean value D_mean of the distances from the point to its neighborhood points, and calculating the maximum distance D_max from the median m and the standard deviation σ:
D_max = m + K × σ
where K is a standard-deviation multiple, set to 5; if D_mean is greater than D_max, the point is considered a noise point and is removed;
(22) filtering out, with the ground elevation as the reference, the point cloud within a height difference of 1 meter, ensuring that ground points and low points are not present in the calculated data;
s3, extracting scanning lines, sequencing all points in each scanning line according to coordinates, and sequentially calculating Euclidean distances among ordered points and variances of the distances of the points, wherein the variances are marked as characteristic values;
step S4, determining an initial L value according to the change of the data quantity, comprising the following steps:
(41) setting the L value to 1.0 and comparing it with the characteristic values, retaining the three points whose variance value is smaller than the L value; when a characteristic value is greater than the L value, observing the result of comparing the next variance value with the L value, and if that result also exceeds the limit, marking the last point of the first group as a dangerous point, otherwise retaining the three points;
(42) reducing the L value in steps of 0.1, screening the original data with each different L value according to the rule of step (41), recording the total amount of data obtained with each L value, and plotting the curve of the total data amount;
(43) obtaining the slope between each pair of adjacent points from the trend of the curve, and taking the L value at the point where the change of slope of the total data amount is minimum as the initial L value for the iterative calculation;
s5, automatically reducing the L value, calculating the characteristic value of the residual point, comparing the new L value with the characteristic value, iterating, stopping iterating according to the data quantity change trend, and outputting a result;
and step S6, experimentally testing the effectiveness of steps S1-S5 on different urban areas.
2. The method for extracting the airborne LiDAR data building based on Euclidean distance according to claim 1, wherein the method comprises the following steps: the step S3 includes the steps of:
(31) reading the TIME attribute recorded by the aircraft's built-in GPS, and clustering points with the same value into the same scanning line;
(32) Re-ordering the points in the same scanning line by X coordinates;
(33) sequentially calculating the Euclidean distance between each pair of adjacent points, and then sequentially calculating the variance of every two adjacent distances, each variance value corresponding to three points.
3. The method for extracting the airborne LiDAR data building based on Euclidean distance according to claim 1, wherein the method comprises the following steps: the step S5 includes the steps of:
(51) Decreasing the L value in steps of 0.0001;
(52) Recalculating the variance values of all points except the dangerous points;
(53) Screening according to the judgment rule of step S4 and recording the total amount of safe-point data after each calculation;
(54) Repeating the above steps iteratively, stopping the iteration when the total number of data points changes most gently, and outputting the result.
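The iteration of claim 3 can be sketched as below. This is a simplified interpretation: the stopping rule "changes most gently" is read as the retained total no longer changing between passes, with `flat_tol` a hypothetical tolerance not named in the claim, and the variance recomputation of step (52) is reduced to re-screening the stored values.

```python
def iterate_screening(var_values, L0, step=1e-4, flat_tol=0):
    """Claim 3 sketch: starting from the initial L, shrink it by
    0.0001 per pass, re-screen the points not yet flagged as
    dangerous, and stop once the safe-point total stops changing."""
    L = L0
    dangerous = set()
    prev_total = None
    total = 0
    while L > 0:
        kept = [v for i, v in enumerate(var_values) if i not in dangerous]
        total = sum(3 for v in kept if v < L)  # three points per retained group
        # Flag groups whose variance now exceeds the shrunken threshold
        dangerous |= {i for i, v in enumerate(var_values) if v >= L}
        if prev_total is not None and abs(total - prev_total) <= flat_tol:
            break  # totals curve has flattened: "changes most gently"
        prev_total = total
        L -= step
    return L, total
```

Because lowering L can only discard points, the total is monotone non-increasing, so the first flat step is a natural stopping point under this reading.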
CN202011328391.1A 2020-11-24 2020-11-24 Method for extracting airborne LiDAR data building based on Euclidean distance Active CN112381029B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011328391.1A CN112381029B (en) 2020-11-24 2020-11-24 Method for extracting airborne LiDAR data building based on Euclidean distance


Publications (2)

Publication Number Publication Date
CN112381029A CN112381029A (en) 2021-02-19
CN112381029B true CN112381029B (en) 2023-11-14

Family

ID=74588925

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011328391.1A Active CN112381029B (en) 2020-11-24 2020-11-24 Method for extracting airborne LiDAR data building based on Euclidean distance

Country Status (1)

Country Link
CN (1) CN112381029B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116579949A (en) * 2023-05-31 2023-08-11 浙江省测绘科学技术研究院 Airborne point cloud ground point filtering method suitable for urban multi-noise environment

Citations (5)

Publication number Priority date Publication date Assignee Title
CN102103202A (en) * 2010-12-01 2011-06-22 武汉大学 Semi-supervised classification method for airborne laser radar data fusing images
CN105488770A (en) * 2015-12-11 2016-04-13 中国测绘科学研究院 Object-oriented airborne laser radar point cloud filtering method
CN106199557A (en) * 2016-06-24 2016-12-07 南京林业大学 A kind of airborne laser radar data vegetation extracting method
CN106970375A (en) * 2017-02-28 2017-07-21 河海大学 A kind of method that building information is automatically extracted in airborne laser radar point cloud
CN110992341A (en) * 2019-12-04 2020-04-10 沈阳建筑大学 Segmentation-based airborne LiDAR point cloud building extraction method

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CN105302151B (en) * 2014-08-01 2018-07-13 深圳中集天达空港设备有限公司 A kind of system and method for aircraft docking guiding and plane type recognition


Non-Patent Citations (2)

Title
Method for extraction of airborne LiDAR point cloud buildings based on segmentation; Maohua Liu et al.; PLOS ONE; pp. 1-11 *
Research on urban land-cover classification fusing airborne LiDAR with Gaofen-2 imagery; Liu Maohua et al.; China Sciencepaper; Vol. 13, No. 9; pp. 990-994 *

Also Published As

Publication number Publication date
CN112381029A (en) 2021-02-19

Similar Documents

Publication Publication Date Title
CN110781827B (en) Road edge detection system and method based on laser radar and fan-shaped space division
CN106529469B (en) Unmanned aerial vehicle-mounted LiDAR point cloud filtering method based on self-adaptive gradient
US9064151B2 (en) Device and method for detecting plantation rows
CN106680798B (en) A kind of identification of airborne LIDAR air strips overlay region redundancy and removing method
CN105488770A (en) Object-oriented airborne laser radar point cloud filtering method
CN114463403A (en) Tree carbon sink amount calculation method based on point cloud data and image recognition technology
CN109961470B (en) Living standing tree She Shuxing accurate estimation method based on laser point cloud
CN110988909B (en) TLS-based vegetation coverage measuring method for sand vegetation in severe cold fragile area
CN114764871B (en) Urban building attribute extraction method based on airborne laser point cloud
CN111598780B (en) Terrain adaptive interpolation filtering method suitable for airborne LiDAR point cloud
CN110765962A (en) Plant identification and classification method based on three-dimensional point cloud contour dimension values
CN110794413A (en) Method and system for detecting power line of point cloud data of laser radar segmented by linear voxels
CN104597449B (en) A kind of airborne many scanning weather radar target vertical contour reconstruction methods
CN111091079A (en) TLS-based method for measuring dominant single plant structural parameters of vegetation in alpine and fragile regions
CN111950589B (en) Point cloud region growing optimization segmentation method combined with K-means clustering
CN113177897A (en) Rapid lossless filtering method for disordered 3D point cloud
CN116523898A (en) Tobacco phenotype character extraction method based on three-dimensional point cloud
CN112381029B (en) Method for extracting airborne LiDAR data building based on Euclidean distance
CN114898118A (en) Automatic statistical method and system for power transmission line house removal amount based on multi-source point cloud
CN116579949A (en) Airborne point cloud ground point filtering method suitable for urban multi-noise environment
CN111932574B (en) Building vertical point cloud extraction system and method based on multi-level semantic features
CN113487636B (en) Laser radar-based automatic extraction method for plant height and row spacing of wide-ridge crops
CN115631136A (en) 3D point cloud image-based method for rapidly measuring phenotypic parameters of schima superba seedlings
CN115410036A (en) Automatic classification method for key element laser point clouds of high-voltage overhead transmission line
CN111913185B (en) TLS (TLS (visual inspection) measuring method for low shrub pattern investigation of severe cold fragile region

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant