CN110111374A - Laser point cloud matching method based on grouped stepped threshold judgment - Google Patents

Laser point cloud matching method based on grouped stepped threshold judgment

Info

Publication number
CN110111374A
Authority
CN
China
Prior art keywords
data
cloud
point cloud
subsets
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910355885.XA
Other languages
Chinese (zh)
Other versions
CN110111374B (en)
Inventor
章弘凯
范光宇
周圣杰
陈年生
徐圣佳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Dianji University
Original Assignee
Shanghai Dianji University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Dianji University
Priority to CN201910355885.XA
Publication of CN110111374A
Application granted
Publication of CN110111374B
Legal status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 - Systems determining position data of a target
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/30 - Determination of transform parameters for the alignment of images, i.e. image registration
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/60 - Analysis of geometric attributes
    • G06T7/66 - Analysis of geometric attributes of image moments or centre of gravity
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10028 - Range image; Depth image; 3D point clouds
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10032 - Satellite or aerial image; Remote sensing
    • G06T2207/10044 - Radar image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides a laser point cloud matching method based on grouped stepped threshold judgment, comprising the steps of: S1: at the current time, obtaining two consecutive frames of lidar point cloud data, M and N; S2: dividing M and N each into a fixed number of first point cloud data subgroups and second point cloud data subgroups; S3: performing iterative closest point matching on every point cloud; S4: judging whether the matching rate of a first point cloud data subgroup and the corresponding second point cloud data subgroup is greater than a first preset threshold; if it is, the subgroups are matched successfully and the subsequent steps continue; otherwise, the matching fails; S5: judging whether the matched-group rate of the successfully matched groups of M and N is greater than a second preset threshold; if it is, M and N are matched successfully and this step ends; otherwise, the subsequent step continues; S6: obtaining the two frames of lidar point cloud data of the next time as the new M and N, and returning to step S2. The laser point cloud matching method based on grouped stepped threshold judgment of the present invention can reduce the computational load of the algorithm without reducing positioning accuracy.

Description

Laser point cloud matching method based on grouped stepped threshold judgment
Technical field
The present invention relates to the field of robot navigation, and more particularly to a laser point cloud matching method based on grouped stepped threshold judgment.
Background art
At present, robot localization technology is widely used in fields such as park inspection and warehouse handling, where autonomous robot localization and navigation can effectively replace people in completing part of the work. Robot localization and navigation technology is therefore a current research hotspot.
During robot navigation, the surrounding environment is scanned by a lidar so that the robot can localize itself. A key difficulty of robot localization is recognizing and successfully matching surrounding obstacles. For example, during lidar scanning, the same obstacle may be scanned at different times and from different positions, and the two or more resulting point clouds must be matched so that the obstacle in the environment can be identified. The prior art commonly uses the iterative closest point (ICP) algorithm, a point-set-to-point-set matching method, to match scans of obstacles in the surrounding environment. The iterative closest point method applies a rotation transformation to two point clouds scanned from the same obstacle so that the two point clouds overlap as much as possible, thereby completing the matching.
In the iterative closest point matching process, every point of one point cloud must be matched to its nearest neighbor, which is computationally expensive and may fall into a local optimum. In the traditional ICP algorithm, when searching for corresponding points, the point with the smallest Euclidean distance is assumed to be the corresponding point; this assumption may produce a certain number of wrong correspondences. Because of the large amount of computation required by iterative closest point matching, obstacle matching by the robot has low real-time performance and poor localization results, and the obstacle-avoidance function cannot be performed well.
Summary of the invention
In view of the deficiencies of the prior art, the present invention provides a laser point cloud matching method based on grouped stepped threshold judgment, which can reduce the computational load of the algorithm without reducing positioning accuracy.
To achieve the above object, the present invention provides a laser point cloud matching method based on grouped stepped threshold judgment, comprising the steps of:
S1: at the current time, obtaining the first point cloud data M of the previous frame and the second point cloud data N of the following frame of a lidar;
S2: dividing the current first point cloud data M and second point cloud data N each into a fixed number of first point cloud data subgroups and second point cloud data subgroups, each first point cloud data subgroup and each second point cloud data subgroup comprising a plurality of point clouds;
S3: performing iterative closest point matching on each point cloud of the current first point cloud data M and second point cloud data N;
S4: judging whether the matching rate of a first point cloud data subgroup and the corresponding second point cloud data subgroup is greater than a first preset threshold; if it is, the first point cloud data subgroup and the second point cloud data subgroup are matched successfully and the subsequent steps continue; otherwise, the matching of the first point cloud data subgroup and the second point cloud data subgroup fails, and the current first point cloud data subgroup and second point cloud data subgroup are discarded;
S5: judging whether the matched-group rate of the successfully matched groups of the first point cloud data M and the second point cloud data N is greater than a second preset threshold; if it is, the matching succeeds and this step ends; otherwise, the subsequent step continues;
S6: obtaining the two frames of point cloud data of the lidar at the next time as the new first point cloud data M and second point cloud data N, and returning to step S2.
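As an illustration only, the following Python sketch shows the control flow of steps S1-S6; it is not the patented implementation. The per-group matcher is passed in as a callable (a possible form of such a helper is sketched in the detailed description below), and all threshold values are placeholders rather than values disclosed here.

```python
import numpy as np

def split_into_groups(frame: np.ndarray, k: int) -> list:
    """Step S2: split one frame of points, shape (n, dim), into k subgroups in scan order."""
    return np.array_split(frame, k)

def match_frames(M: np.ndarray, N: np.ndarray, icp_match_group, k: int,
                 first_threshold: float, second_threshold: float,
                 distance_threshold: float) -> bool:
    """Steps S2-S5: grouped, stepped-threshold matching of two consecutive lidar frames.

    icp_match_group(Mi, Ni) is expected to align one pair of subgroups (step S3)
    and return the per-point distances after alignment.
    """
    groups_m = split_into_groups(M, k)
    groups_n = split_into_groups(N, k)

    matched_groups = 0
    for Mi, Ni in zip(groups_m, groups_n):
        distances = icp_match_group(Mi, Ni)                      # step S3

        # Step S4: matching rate of this pair of subgroups vs. the first threshold.
        matching_rate = np.count_nonzero(distances < distance_threshold) / len(distances)
        if matching_rate > first_threshold:
            matched_groups += 1      # this pair of subgroups matched successfully
        # otherwise the pair of subgroups is discarded

    # Step S5: matched-group rate vs. the second threshold.
    return (matched_groups / k) > second_threshold

# Steps S1 and S6: in use, consecutive lidar frames would be fed in at each time step, e.g.
#   matched = match_frames(prev_frame, next_frame, icp_match_group, k=10,
#                          first_threshold=0.8, second_threshold=0.8,
#                          distance_threshold=0.005)
```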
Preferably, step S3 further comprises the steps of:
S31: calculating, using formula (1), the centroid of each point cloud of the current first point cloud data M and second point cloud data N:
wherein μm denotes the centroid of the j-th point cloud of the i-th group of the first point cloud data M; μn denotes the centroid of the j-th point cloud of the i-th group of the second point cloud data N; D denotes the actual number of points in each point set contained in the first point cloud data subgroups and second point cloud data subgroups; mij denotes the j-th point cloud of the i-th group of the first point cloud data M; nij denotes the j-th point cloud of the i-th group of the second point cloud data N; i and j are natural numbers greater than zero;
S32: removing the centroid from each point cloud to obtain the updated first point cloud data M′ and the updated second point cloud data N′;
S33: calculating, using formula (2) with the updated first point cloud data M′ and the updated second point cloud data N′, a first transformation matrix U and a second transformation matrix V:
wherein W denotes the matrix whose singular value decomposition is solved; m′ij denotes the point set of the j-th point cloud of the i-th group of the updated first point cloud data M′; n′ij denotes the point set of the j-th point cloud of the i-th group of the updated second point cloud data N′; T denotes transposition; σ1, σ2 and σ3 denote the singular values of the decomposed matrix W;
S34: when rank(W) = 3, obtaining the unique solution of the first transformation matrix U and the second transformation matrix V;
S35: calculating, using formula (3), a rotation transformation matrix R and a translation matrix T′;
S36: calculating N″ij using the transformation matrix R and the translation matrix T′, N″ij denoting the theoretical value of the j-th point cloud of the i-th group of the updated second point cloud data N′.
Preferably, step S4 further comprises the steps of:
S41: calculating the distance between the j-th point cloud M′ij of the i-th group of the updated first point cloud data M′ and N″ij;
S42: judging, using formula (4), whether the point clouds at corresponding positions of the current first point cloud data subgroup and the second point cloud data N are qualified matches;
wherein p(i) denotes the matching coefficient; d(i) denotes the distance between M′ij and N″ij; E denotes a preset distance threshold; a value of p(i) equal to 0 indicates that the matching fails; a value of p(i) equal to 1 indicates that the two point clouds at the current corresponding positions are a qualified match, and the number of qualified point cloud matches is recorded;
S43: calculating the matching rate of the current first point cloud data subgroup and the second point cloud data N, the matching rate being equal to the number of qualified point cloud matches in the current first point cloud data subgroup divided by the total number of point clouds in the current first point cloud data subgroup;
S44: judging whether the matching rate is greater than the first preset threshold; if it is, outputting the current first point cloud data M and second point cloud data N and continuing with the subsequent steps; otherwise, discarding the current first point cloud data M and second point cloud data N.
Preferably, the matched-group rate is equal to the number of groups in which a first point cloud data subgroup and a second point cloud data subgroup are matched successfully, divided by the fixed number of groups.
Preferably, after a first point cloud data subgroup and a second point cloud data subgroup are matched successfully, the method further comprises the step of outputting the current first point cloud data subgroup and second point cloud data subgroup, and recording the number of groups in which the first point cloud data subgroups and the second point cloud data subgroups are matched successfully.
By adopting the above technical solution, the present invention has the following beneficial effects:
The present invention divides the first point cloud data M and the second point cloud data N, in the chronological order of scanning from beginning to end, into k equal groups each; nearest-neighbor matching between M and N is then carried out group by corresponding group, point cloud by point cloud within each group. Because the lidar point clouds are matched in groups, the computational load of the algorithm can be reduced without reducing positioning accuracy, and when the method is applied to robot path planning, the obstacle-avoidance function of the robot can be effectively improved.
Brief description of the drawings
Fig. 1 is a flowchart of the laser point cloud matching method based on grouped stepped threshold judgment according to an embodiment of the present invention.
Specific embodiment
A preferred embodiment of the present invention is described in detail below with reference to Fig. 1, so that the functions and features of the invention can be better understood.
Referring to Fig. 1, a laser point cloud matching method based on grouped stepped threshold judgment according to an embodiment of the present invention comprises the steps of:
S1: at the current time, obtaining the first point cloud data M of the previous frame and the second point cloud data N of the following frame of a lidar.
S2: dividing the current first point cloud data M and second point cloud data N each into a fixed number of first point cloud data subgroups and second point cloud data subgroups; each first point cloud data subgroup and each second point cloud data subgroup comprises a plurality of point clouds.
S3: performing iterative closest point matching on every point cloud of the current first point cloud data M and second point cloud data N.
Step S3 further comprises the steps of:
S31: calculating, using formula (1), the centroid of each point cloud of the current first point cloud data M and second point cloud data N:
wherein μm denotes the centroid of the j-th point cloud of the i-th group of the first point cloud data M; μn denotes the centroid of the j-th point cloud of the i-th group of the second point cloud data N; D denotes the actual number of points in each point set contained in the first point cloud data subgroups and second point cloud data subgroups; mij denotes the j-th point cloud of the i-th group of the first point cloud data M; nij denotes the j-th point cloud of the i-th group of the second point cloud data N; i and j are natural numbers greater than zero.
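Under the definitions above, formula (1) presumably takes the standard per-group centroid form; the following is a reconstruction, since only the variables of the formula are described here:

$$\mu_m = \frac{1}{D}\sum_{j=1}^{D} m_{ij}, \qquad \mu_n = \frac{1}{D}\sum_{j=1}^{D} n_{ij}$$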
S32: removing the centroid from each point cloud to obtain the updated first point cloud data M′ and the updated second point cloud data N′.
S33: calculating, using formula (2) with the updated first point cloud data M′ and the updated second point cloud data N′, a first transformation matrix U and a second transformation matrix V:
wherein W denotes the matrix whose singular value decomposition is solved; m′ij denotes the point set of the j-th point cloud of the i-th group of the updated first point cloud data M′; n′ij denotes the point set of the j-th point cloud of the i-th group of the updated second point cloud data N′; T denotes transposition; σ1, σ2 and σ3 denote the singular values of the decomposed matrix W.
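Consistent with the variables named in step S33, formula (2) presumably follows the standard ICP/SVD construction; again this is offered only as a reconstruction:

$$W = \sum_{j=1}^{D} m'_{ij}\,\left(n'_{ij}\right)^{T} = U\,\operatorname{diag}(\sigma_1,\sigma_2,\sigma_3)\,V^{T}$$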
S34: when rank(W) = 3, obtaining the unique solution of the first transformation matrix U and the second transformation matrix V.
S35: calculating, using formula (3), a rotation transformation matrix R and a translation matrix T′.
S36: calculating N″ij using the transformation matrix R and the translation matrix T′, N″ij denoting the theoretical value of the j-th point cloud of the i-th group of the updated second point cloud data N′.
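One common convention consistent with steps S35-S36 and with the comparison in step S41, offered only as an assumed reconstruction of formula (3), is:

$$R = U\,V^{T}, \qquad T' = \mu_m - R\,\mu_n, \qquad N''_{ij} = R\,n_{ij} + T'$$

so that N″ij predicts where the j-th point cloud of the i-th group of N should lie once aligned with M.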
S4: judging whether the matching rate of a first point cloud data subgroup and the corresponding second point cloud data subgroup is greater than a first preset threshold; if it is, the first point cloud data subgroup and the second point cloud data subgroup are matched successfully and the subsequent steps continue; otherwise, the matching of the first point cloud data subgroup and the second point cloud data subgroup fails, and the current first point cloud data subgroup and second point cloud data subgroup are discarded.
Step S4 further comprises the steps of:
S41: calculating the distance between the j-th point cloud M′ij of the i-th group of the updated first point cloud data M′ and N″ij.
S42: judging, using formula (4), whether the point clouds at corresponding positions of the current first point cloud data subgroup and the second point cloud data N are qualified matches;
wherein p(i) denotes the matching coefficient; d(i) denotes the distance between M′ij and N″ij; E denotes a preset distance threshold; a value of p(i) equal to 0 indicates that the matching fails; a value of p(i) equal to 1 indicates that the two point clouds at the current corresponding positions are a qualified match, and the number of qualified point cloud matches is recorded.
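From the description of p(i), formula (4) is presumably the indicator

$$p(i) = \begin{cases} 1, & d(i) < E \\ 0, & d(i) \ge E \end{cases}$$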
S43: calculating the matching rate of the current first point cloud data subgroup and the second point cloud data N, the matching rate being equal to the number of qualified point cloud matches in the current first point cloud data subgroup divided by the total number of point clouds in the current first point cloud data subgroup.
S44: judging whether the matching rate is greater than the first preset threshold; if it is, outputting the current first point cloud data M and second point cloud data N and continuing with the subsequent steps; otherwise, discarding the current first point cloud data M and second point cloud data N.
Preferably, the matched-group rate is equal to the number of groups in which a first point cloud data subgroup and a second point cloud data subgroup are matched successfully, divided by the fixed number of groups.
Preferably, after a first point cloud data subgroup and a second point cloud data subgroup are matched successfully, the current first point cloud data subgroup and second point cloud data subgroup are output, and the number of groups in which the first point cloud data subgroups and the second point cloud data subgroups are matched successfully is recorded.
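Restating the two rates in symbols (k being the fixed number of groups and D the number of point clouds in each subgroup, as defined above; no additional disclosure is intended):

$$\text{matching rate of subgroup } i \;=\; \frac{\#\{\,j : p = 1 \text{ for the pair } (M'_{ij},\, N''_{ij})\,\}}{D}, \qquad \text{matched-group rate} \;=\; \frac{\#\{\,\text{successfully matched subgroup pairs}\,\}}{k}$$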
S5: judging whether the matched-group rate of the successfully matched groups of the first point cloud data M and the second point cloud data N is greater than a second preset threshold; if it is, the matching succeeds and this step ends; otherwise, the subsequent step continues.
S6: obtaining the two frames of point cloud data of the lidar at the next time as the new first point cloud data M and second point cloud data N, and returning to step S2.
The present invention provides a laser point cloud matching method based on grouped stepped threshold judgment. A lidar is used to acquire real-time point cloud information of the surrounding environment, and the real-time point clouds can be matched against a known map to obtain the current position of the robot in the map. During point cloud matching, the present invention matches the point clouds in real time using the grouped stepped threshold judgment method: the two consecutive frames of lidar point cloud data are input, the point cloud data are divided into groups, and the groups are matched group by group. If the matching rate of a group of point clouds meets the requirement, that group is judged to be matched successfully. If the matched-group rate of the point clouds meets the requirement, the two point sets are judged to be matched successfully.
For example:
I. The two consecutive frames of lidar point cloud data are input and divided into groups.
(1.1) The two consecutive lidar scans are taken as point cloud data, forming point set M and point set N respectively.
(1.2) Point set M and point set N are each divided into k groups, denoted M1, M2, M3, ..., Mk and N1, N2, N3, ..., Nk; the number of points in each point set is D.
II. ICP matching is performed on every group of point clouds.
(2.1) ICP matching is performed between the point clouds Mij and Nij (Mij denotes the j-th point cloud of the i-th group of point set M). The centroids of the two point clouds Mij and Nij are calculated using formula (1).
The corresponding centroid is removed from each of the two point sets to obtain the new point sets M′ij and N′ij.
(2.2) The transformation matrices are obtained by singular value decomposition (SVD); U and V are obtained as in formula (2).
If rank(W) = 3 (the rank of the matrix), the solution is unique, and the rotation transformation matrix R and the translation matrix T′ are then obtained using formula (3).
(2.3) N″ij is obtained using R and T′, and the distance d between M′ij and N″ij is compared.
Here p(i) is used to judge whether a point match is qualified, and E is a distance threshold obtained through experiments; in the tests it was 5 mm.
(2.4) If a point match does not reach the distance threshold, the rotated and translated point cloud is re-matched, and the process of (2.1) to (2.3) is repeated iteratively. If the distance between the point clouds is less than the threshold E, the two point clouds are judged to be matched successfully.
III. Threshold judgment is applied to the matching rate of each group of point clouds and to the number of matched groups, to check whether the requirements are met.
(3.1) If the matching rate of Mi and Ni reaches the threshold ζ (the matching rate being the number of successfully matched point clouds of the group divided by the total number of point clouds of the group), the iteration stops, which shows that these two corresponding groups of point clouds are matched successfully.
If the matched-group rate of M and N (the number of successfully matched groups divided by the total number of groups) reaches the threshold β, the iteration stops, which shows that point sets M and N are matched successfully.
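As a concrete illustration of steps (2.1) to (3.1), the following NumPy sketch performs the per-subgroup ICP alignment and could serve as the icp_match_group helper assumed in the earlier sketch. It is a minimal reading of the description rather than the patented implementation: point correspondence by index within a subgroup (as in step S41), alignment of N onto M, the reflection guard on the rotation, and the stopping rule are assumptions, and E = 0.005 (the 5 mm test value, assuming coordinates in metres), zeta and max_iter are placeholder values.

```python
import numpy as np

def icp_match_group(Mi: np.ndarray, Ni: np.ndarray,
                    E: float = 0.005, zeta: float = 0.9,
                    max_iter: int = 30) -> np.ndarray:
    """Align one subgroup Ni onto its counterpart Mi (both of shape (D, dim)) by
    iterating steps (2.1)-(2.4), and return the per-point distances d(j) so the
    caller can apply the threshold checks of step III."""
    Mi = np.asarray(Mi, dtype=float)
    Ni_cur = np.asarray(Ni, dtype=float).copy()
    distances = np.full(len(Mi), np.inf)

    for _ in range(max_iter):
        # (2.1) centroids and centred point sets, cf. formula (1).
        mu_m, mu_n = Mi.mean(axis=0), Ni_cur.mean(axis=0)
        Mc, Nc = Mi - mu_m, Ni_cur - mu_n

        # (2.2) W = sum_j m'_j n'_j^T and its singular value decomposition, cf. formula (2).
        W = Mc.T @ Nc
        U, _, Vt = np.linalg.svd(W)

        # Rotation aligning N onto M (Kabsch form); the sign fix guards against
        # an improper rotation (reflection) when det(U Vt) < 0.
        sign = 1.0 if np.linalg.det(U @ Vt) >= 0 else -1.0
        D_fix = np.diag([1.0] * (W.shape[0] - 1) + [sign])
        R = U @ D_fix @ Vt
        T_prime = mu_m - R @ mu_n        # cf. formula (3), assumed convention

        # (2.3) transform N and measure the per-point distances d(j).
        Ni_cur = Ni_cur @ R.T + T_prime
        distances = np.linalg.norm(Mi - Ni_cur, axis=1)

        # (2.4)/(3.1) stop iterating once the matching rate reaches the threshold zeta.
        if np.count_nonzero(distances < E) / len(distances) >= zeta:
            break

    return distances
```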
The present invention has been described in detail above with reference to the accompanying drawings, and those skilled in the art can make many variations of the present invention based on the above description. Therefore, certain details of the embodiments shall not be construed as limiting the present invention, and the scope defined by the appended claims shall be regarded as the protection scope of the present invention.

Claims (5)

1. A laser point cloud matching method based on grouped stepped threshold judgment, comprising the steps of:
S1: at the current time, obtaining the first point cloud data M of the previous frame and the second point cloud data N of the following frame of a lidar;
S2: dividing the current first point cloud data M and second point cloud data N each into a fixed number of first point cloud data subgroups and second point cloud data subgroups, each first point cloud data subgroup and each second point cloud data subgroup comprising a plurality of point clouds;
S3: performing iterative closest point matching on each point cloud of the current first point cloud data M and second point cloud data N;
S4: judging whether the matching rate of a first point cloud data subgroup and the corresponding second point cloud data subgroup is greater than a first preset threshold; if it is, the first point cloud data subgroup and the second point cloud data subgroup are matched successfully and the subsequent steps continue; otherwise, the matching of the first point cloud data subgroup and the second point cloud data subgroup fails, and the current first point cloud data subgroup and second point cloud data subgroup are discarded;
S5: judging whether the matched-group rate of the successfully matched groups of the first point cloud data M and the second point cloud data N is greater than a second preset threshold; if it is, the matching succeeds and this step ends; otherwise, the subsequent step continues;
S6: obtaining the two frames of point cloud data of the lidar at the next time as the new first point cloud data M and second point cloud data N, and returning to step S2.
2. The laser point cloud matching method based on grouped stepped threshold judgment according to claim 1, characterized in that step S3 further comprises the steps of:
S31: calculating, using formula (1), the centroid of each point cloud of the current first point cloud data M and second point cloud data N:
wherein μm denotes the centroid of the j-th point cloud of the i-th group of the first point cloud data M; μn denotes the centroid of the j-th point cloud of the i-th group of the second point cloud data N; D denotes the actual number of points in each point set contained in the first point cloud data subgroups and second point cloud data subgroups; mij denotes the j-th point cloud of the i-th group of the first point cloud data M; nij denotes the j-th point cloud of the i-th group of the second point cloud data N; i and j are natural numbers greater than zero;
S32: removing the centroid from each point cloud to obtain the updated first point cloud data M′ and the updated second point cloud data N′;
S33: calculating, using formula (2) with the updated first point cloud data M′ and the updated second point cloud data N′, a first transformation matrix U and a second transformation matrix V:
wherein W denotes the matrix whose singular value decomposition is solved; m′ij denotes the point set of the j-th point cloud of the i-th group of the updated first point cloud data M′; n′ij denotes the point set of the j-th point cloud of the i-th group of the updated second point cloud data N′; T denotes transposition; σ1, σ2 and σ3 denote the singular values of the decomposed matrix W;
S34: when rank(W) = 3, obtaining the unique solution of the first transformation matrix U and the second transformation matrix V;
S35: calculating, using formula (3), a rotation transformation matrix R and a translation matrix T′;
S36: calculating N″ij using the transformation matrix R and the translation matrix T′, N″ij denoting the theoretical value of the j-th point cloud of the i-th group of the updated second point cloud data N′.
3. The laser point cloud matching method based on grouped stepped threshold judgment according to claim 2, characterized in that step S4 further comprises the steps of:
S41: calculating the distance between the j-th point cloud M′ij of the i-th group of the updated first point cloud data M′ and N″ij;
S42: judging, using formula (4), whether the point clouds at corresponding positions of the current first point cloud data subgroup and the second point cloud data N are qualified matches;
wherein p(i) denotes the matching coefficient; d(i) denotes the distance between M′ij and N″ij; E denotes a preset distance threshold; a value of p(i) equal to 0 indicates that the matching fails; a value of p(i) equal to 1 indicates that the two point clouds at the current corresponding positions are a qualified match, and the number of qualified point cloud matches is recorded;
S43: calculating the matching rate of the current first point cloud data subgroup and the second point cloud data N, the matching rate being equal to the number of qualified point cloud matches in the current first point cloud data subgroup divided by the total number of point clouds in the current first point cloud data subgroup;
S44: judging whether the matching rate is greater than the first preset threshold; if it is, outputting the current first point cloud data M and second point cloud data N and continuing with the subsequent steps; otherwise, discarding the current first point cloud data M and second point cloud data N.
4. The laser point cloud matching method based on grouped stepped threshold judgment according to claim 3, characterized in that the matched-group rate is equal to the number of groups in which a first point cloud data subgroup and a second point cloud data subgroup are matched successfully, divided by the fixed number of groups.
5. The laser point cloud matching method based on grouped stepped threshold judgment according to claim 4, characterized in that, after a first point cloud data subgroup and a second point cloud data subgroup are matched successfully, the method further comprises the step of outputting the current first point cloud data subgroup and second point cloud data subgroup, and recording the number of groups in which the first point cloud data subgroups and the second point cloud data subgroups are matched successfully.
CN201910355885.XA 2019-04-29 2019-04-29 Laser point cloud matching method based on grouped stepped threshold judgment Active CN110111374B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910355885.XA CN110111374B (en) 2019-04-29 2019-04-29 Laser point cloud matching method based on grouped stepped threshold judgment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910355885.XA CN110111374B (en) 2019-04-29 2019-04-29 Laser point cloud matching method based on grouped stepped threshold judgment

Publications (2)

Publication Number Publication Date
CN110111374A true CN110111374A (en) 2019-08-09
CN110111374B CN110111374B (en) 2020-11-17

Family

ID=67487504

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910355885.XA Active CN110111374B (en) 2019-04-29 2019-04-29 Laser point cloud matching method based on grouped stepped threshold judgment

Country Status (1)

Country Link
CN (1) CN110111374B (en)

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102194126A (en) * 2010-03-09 2011-09-21 索尼公司 Information processing apparatus, information processing method, and program
CN106562757A (en) * 2012-08-14 2017-04-19 直观外科手术操作公司 System and method for registration of multiple vision systems
US20140169685A1 (en) * 2012-12-14 2014-06-19 National Central University Method of enhancing an image matching result using an image classification technique
CN104715469A (en) * 2013-12-13 2015-06-17 联想(北京)有限公司 Data processing method and electronic device
CN104932001A (en) * 2015-07-08 2015-09-23 四川德马克机器人科技有限公司 Real-time 3D nuclear radiation environment reconstruction monitoring system
CN105180890A (en) * 2015-07-28 2015-12-23 南京工业大学 Rock structural surface occurrence measuring method integrated with laser-point cloud and digital imaging
US20170094245A1 (en) * 2015-09-24 2017-03-30 Intel Corporation Drift correction for camera tracking
CN105678318A (en) * 2015-12-31 2016-06-15 百度在线网络技术(北京)有限公司 Traffic label matching method and apparatus
CN105701820A (en) * 2016-01-14 2016-06-22 上海大学 Point cloud registration method based on matching area
CN105913489A (en) * 2016-04-19 2016-08-31 东北大学 Indoor three-dimensional scene reconstruction method employing plane characteristics
CN106981081A (en) * 2017-03-06 2017-07-25 电子科技大学 A kind of degree of plainness for wall surface detection method based on extraction of depth information
CN108627175A (en) * 2017-03-20 2018-10-09 现代自动车株式会社 The system and method for vehicle location for identification
CN107491071A (en) * 2017-09-04 2017-12-19 中山大学 A kind of Intelligent multi-robot collaboration mapping system and its method
CN107861920A (en) * 2017-11-27 2018-03-30 西安电子科技大学 cloud data registration method
CN108152831A (en) * 2017-12-06 2018-06-12 中国农业大学 A kind of laser radar obstacle recognition method and system
CN108776474A (en) * 2018-05-24 2018-11-09 中山赛伯坦智能科技有限公司 Robot embedded computing terminal integrating high-precision navigation positioning and deep learning
CN108765487A (en) * 2018-06-04 2018-11-06 百度在线网络技术(北京)有限公司 Rebuild method, apparatus, equipment and the computer readable storage medium of three-dimensional scenic
CN108986149A (en) * 2018-07-16 2018-12-11 武汉惟景三维科技有限公司 A kind of point cloud Precision Registration based on adaptive threshold
CN109345620A (en) * 2018-08-13 2019-02-15 浙江大学 Merge the improvement ICP object under test point cloud method of quick point feature histogram
CN109459759A (en) * 2018-11-13 2019-03-12 中国科学院合肥物质科学研究院 City Terrain three-dimensional rebuilding method based on quadrotor drone laser radar system
CN109633688A (en) * 2018-12-14 2019-04-16 北京百度网讯科技有限公司 A kind of laser radar obstacle recognition method and device

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
DARIO CARREA et al.: "Building a LiDAR point cloud simulator: Testing algorithms for high resolution topographic change", Conference: European Geosciences Union General Assembly 2014 *
任秉银 et al.: "A method for object recognition and 3D pose estimation in an unstructured environment", Journal of Harbin Institute of Technology *
刘伟 et al.: "A 2D-to-3D video conversion method based on layer optimization and fusion", Journal of Computer-Aided Design & Computer Graphics *
雷鸣 et al.: "Research on laser-assisted obstacle detection for intelligent vehicles", Journal of Xi'an Technological University *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111929694A (en) * 2020-10-12 2020-11-13 炬星科技(深圳)有限公司 Point cloud matching method, point cloud matching equipment and storage medium
CN113204030A (en) * 2021-04-13 2021-08-03 珠海市一微半导体有限公司 Multipoint zone constraint repositioning method, chip and robot

Also Published As

Publication number Publication date
CN110111374B (en) 2020-11-17

Similar Documents

Publication Publication Date Title
Huang et al. Consistent sparsification for graph optimization
US20210086362A1 (en) Method and System for Selecting a Preferred Robotic Grasp of an Object-of-Interest Using Pairwise Ranking
Zhang et al. Hierarchical topic model based object association for semantic SLAM
JP5924862B2 (en) Information processing apparatus, information processing method, and program
Elfiky et al. Automation of dormant pruning in specialty crop production: An adaptive framework for automatic reconstruction and modeling of apple trees
CN106909877A (en) A kind of vision based on dotted line comprehensive characteristics builds figure and localization method simultaneously
CN107561549B (en) Method and device for relocating terminal position, terminal and storage medium
CN109029363A (en) A kind of target ranging method based on deep learning
CN109146972A (en) Vision navigation method based on rapid characteristic points extraction and gridding triangle restriction
CN108801268A (en) Localization method, device and the robot of target object
CN107209853A (en) Positioning and map constructing method
CN110334680A (en) Shipping depth gauge recognition methods based on climbing robot, system, device
CN110111374A Laser point cloud matching method based on grouped stepped threshold judgment
CN111222539B (en) Method for optimizing and expanding supervision classification samples based on multi-source multi-temporal remote sensing image
CN109389156A (en) A kind of training method, device and the image position method of framing model
CN107135541A (en) UWB indoor localization method based on OPTICS Density Clusterings and BP neural network
CN110390639A (en) Processing joining method, device, equipment and the storage medium of orthography
CN115410104A (en) Data processing system for acquiring image acquisition points of aircraft
CN115952691A (en) Optimized station distribution method and device of multi-station passive time difference cross joint positioning system
CN113052761B (en) Laser point cloud map fusion method, device and computer readable storage medium
CN111290053B (en) Thunderstorm path prediction method based on Kalman filtering
CN111368637A (en) Multi-mask convolution neural network-based object recognition method for transfer robot
CN110162812B (en) Target sample generation method based on infrared simulation
Roy et al. Active view planning for counting apples in orchards
CN111156991B (en) Space debris real-time astronomical positioning method based on automatic pointing error determination

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant