CN103093459A - Assisting image matching method by means of airborne lidar point cloud data - Google Patents

Assisting image matching method by means of airborne lidar point cloud data

Info

Publication number
CN103093459A
CN103093459A
Authority
CN
China
Prior art keywords
image
point
cloud data
matching
matched
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013100033778A
Other languages
Chinese (zh)
Other versions
CN103093459B (en
Inventor
王慧
李鹏程
张勇
王利勇
闸旋
李烁
刘忠滨
武海洋
胡志定
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PLA Information Engineering University
Original Assignee
PLA Information Engineering University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by PLA Information Engineering University filed Critical PLA Information Engineering University
Priority to CN201310003377.8A priority Critical patent/CN103093459B/en
Publication of CN103093459A publication Critical patent/CN103093459A/en
Application granted granted Critical
Publication of CN103093459B publication Critical patent/CN103093459B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Landscapes

  • Image Processing (AREA)

Abstract

The invention relates to a method of assisting image matching with airborne LiDAR point cloud data. The method comprises: obtaining a reference image, a search image, and the clipped airborne LiDAR point cloud data; extracting feature points from the reference image as the points to be matched; generating a regularized digital surface model (DSM) of the image overlap region from the LiDAR point cloud data; converting position and orientation system (POS) data into the exterior orientation elements of the reference image; computing the object-space coordinates of the ground point corresponding to each feature point from the DSM and the exterior orientation elements of the reference image; back-projecting these object-space coordinates onto the search image to obtain the initial position of each corresponding match point; and performing correlation coefficient matching to obtain the final matching result. By using the LiDAR point cloud as a strong object-space constraint, the method effectively avoids complicated search strategies, achieves the purpose of image matching, and offers a high matching success rate with short running time.

Description

Method of assisting image matching with airborne LiDAR point cloud data
Technical field
The invention belongs to the field of photogrammetry and remote sensing, relates to remote sensing image matching and airborne light detection and ranging (LiDAR) measurement, and in particular to a method of assisting image matching with airborne LiDAR point cloud data.
Background technology
Remote sensing image matching is a focus and a difficulty of photogrammetric research. It is widely used in relative orientation of stereo pairs, remote sensing image registration, DEM (digital elevation model) generation, remote sensing image mosaicking, and similar tasks. It is defined as automatically establishing correspondences between the elements of two or more digital images of the same scene acquired at different positions or at different times; the elements may be points in the digital images or other extracted features. Existing matching methods must all adopt tedious search strategies to locate the corresponding match point on the search image; commonly used strategies include the epipolar geometry constraint and the pyramid hierarchical matching strategy.
With the development of sensors, the cooperative use of multiple sensors offers new solutions for photogrammetry. Airborne LiDAR is a technology that tightly integrates laser ranging, kinematic differential GPS, and inertial attitude determination. Laser ranging measures the distance between the laser signal reference point and the laser footprint on the ground; kinematic differential GPS determines the spatial position of the laser signal reference point; and the inertial navigation system measures the attitude parameters of the scanner's principal optical axis. Through the synchronized, coordinated operation of these three components, the three-dimensional coordinates of surface targets are obtained directly. Compared with photogrammetry, it is an active measurement technique with strong timeliness, strong laser pulse penetration, high operating efficiency, and low production cost. Airborne LiDAR is an important complement to photogrammetry, and research on LiDAR-assisted image matching is urgently needed.
Summary of the invention
The purpose of the invention is to provide a method of assisting image matching with airborne LiDAR point cloud data, so as to solve the tedious search problem in existing image matching methods.
To achieve the above purpose, the solution of the invention is a method of assisting image matching with airborne LiDAR point cloud data, comprising the following steps:
(1) obtaining a reference image and a search image, the reference image and the search image forming a stereo image pair, and clipping the airborne LiDAR point cloud data of the overlap region of the image pair;
(2) extracting feature points on the reference image as the points to be matched;
(3) generating from the LiDAR point cloud data a regularized digital surface model (DSM) of the overlap region of the image pair;
(4) converting the carrier position and attitude information acquired by the POS system into the exterior orientation elements of the reference image and the search image;
(5) computing the initial positions on the search image of the points corresponding to the points to be matched on the reference image;
(6) performing correlation coefficient matching at the initial positions computed in step (5) to obtain the final matching result.
Further, step (2) extracts the feature points from the reference image with the Harris operator.
Further, step (3) first removes gross errors from the airborne LiDAR point cloud data and then resamples the point cloud onto a regular grid.
Further, the initial positions of the corresponding points on the search image in step (5) are computed as follows:
first, the ground coordinates corresponding to each point to be matched are computed from the DSM and the exterior orientation elements of the reference image; these ground coordinates are then back-projected onto the search image using the exterior orientation elements of the search image and the collinearity of the ground point and its image point on the search image, yielding the initial position of each corresponding point on the search image.
By using the laser point cloud as a strong object-space constraint, the method of the invention effectively avoids tedious search strategies while achieving the purpose of image matching; it is an efficient matching method with a high success rate and short running time.
Description of drawings
Fig. 1 is the basic matching principle of the method of the invention;
Fig. 2a is the reference image of the test-area data;
Fig. 2b is the clipped airborne LiDAR point cloud of the test-area data;
Fig. 2c is the search image of the test-area data;
Fig. 3 shows the extracted feature points;
Fig. 4a shows, for sample 1, the feature points extracted on the reference image with the Harris operator;
Fig. 4b shows, for sample 1, the initial match point positions on the search image back-projected from the corner points of the reference image;
Fig. 5a shows, for sample 2, the feature points extracted on the reference image with the Harris operator;
Fig. 5b shows, for sample 2, the initial match point positions on the search image back-projected from the corner points of the reference image;
Fig. 6a shows, for the correlation coefficient matching of sample 1, the Harris corners extracted in the sample 1 region;
Fig. 6b shows, for the correlation coefficient matching of sample 1, the corresponding match point positions;
Fig. 7a shows, for the correlation coefficient matching of sample 2, the Harris corners extracted in the sample 2 region;
Fig. 7b shows, for the correlation coefficient matching of sample 2, the corresponding match point positions;
Fig. 8a and Fig. 8b are the matching results of the parallax scheme on sample 1;
Fig. 9a and Fig. 9b are the matching results of the parallax scheme on sample 2.
Embodiment
The basic idea of the method of assisting image matching with airborne LiDAR point cloud data is shown in Fig. 1, where 1 is the reference image, 2 the search image, and 3 the digital surface model (DSM). First, feature points are extracted on the reference image as the points to be matched; the LiDAR point cloud is used to generate the DSM of the image overlap region, and the POS data are converted into the exterior orientation elements of the CCD images. The ground coordinates corresponding to each point to be matched are computed from the exterior orientation elements of the reference image according to the monoplotting principle; then, using the exterior orientation elements of the search image and the collinearity of the ground point and its image point on the search image, the initial position of each corresponding point on the search image is obtained; finally, correlation coefficient matching yields the final matching result.
The key steps of the method are: generating the DSM from the airborne LiDAR point cloud, feature point extraction, converting the POS data into exterior orientation elements, computing the initial positions of the match points on the search image, and correlation coefficient matching. The concrete steps are as follows:
1) generation of DSM
The airborne LiDAR measuring system acquires discrete, irregular, and densely distributed laser points, which are themselves a representation of the DSM. However, this discrete data structure is inconvenient for subsequent applications, and the point cloud usually contains gross errors. Gross errors should therefore be removed first, and the point cloud then resampled onto a regular grid. The grid assignment for a regular DSM is given by formula (1):
I = floor((X − Xmin)/c), J = floor((Y − Ymin)/c), c = m·sqrt(1/n)   (1)
where (I, J) are the grid row and column indices; (X, Y) are the planimetric coordinates of an original laser point; Xmin and Ymin are the minima of the abscissa and ordinate in the point cloud; c is the sampling interval; m is a constant expressing a multiple of the sampling interval; and n is the number of laser footprints per unit area.
When interpolating the grid, if a grid cell contains several discrete points, the cell elevation is taken as the minimum elevation of those points; if a cell contains no point, nearest-neighbour interpolation is used and the cell takes the elevation of the nearest discrete point.
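As a concrete illustration of the gridding rule above, the following NumPy sketch (hypothetical helper names, not the patent's code; it assumes an (N, 3) array of points) assigns each point to its grid cell, keeps the minimum elevation per occupied cell, and fills empty cells from the nearest occupied cell:

```python
import numpy as np

def grid_dsm(points, cell_size):
    """Rasterize an (N, 3) LiDAR point array into a regular min-Z DSM grid.

    Each point (X, Y, Z) falls into cell (floor((Y - Ymin)/c), floor((X - Xmin)/c));
    a cell containing several points keeps the minimum elevation, and empty
    cells are filled by nearest-neighbour interpolation, as described above.
    """
    xy_min = points[:, :2].min(axis=0)
    cols, rows = np.floor((points[:, :2] - xy_min) / cell_size).astype(int).T
    dsm = np.full((rows.max() + 1, cols.max() + 1), np.nan)
    # keep the lowest elevation per cell
    for r, c, z in zip(rows, cols, points[:, 2]):
        if np.isnan(dsm[r, c]) or z < dsm[r, c]:
            dsm[r, c] = z
    # nearest-neighbour fill for empty cells
    filled = np.argwhere(~np.isnan(dsm))
    for r, c in np.argwhere(np.isnan(dsm)):
        nearest = filled[np.argmin(((filled - (r, c)) ** 2).sum(axis=1))]
        dsm[r, c] = dsm[tuple(nearest)]
    return dsm, xy_min
```

A production version would vectorize the per-cell minimum, but the loop keeps the two interpolation rules of the text explicit.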
2) feature point extraction
Feature points are extracted on the reference image with the Harris operator and serve as the points to be matched. The principle of the Harris operator is that at a corner the image gray gradient is discontinuous, and in the neighbourhood of a corner the gradient takes two or more distinct values.
The Harris operator involves only the first derivatives of the image: for each pixel the horizontal derivative, the vertical derivative, and their product are computed, giving three new images. These three images are smoothed with a Gaussian filter, and the interest value of each pixel of the original image is computed from formulas (2) and (3):
M = G(s) ⊗ [ gx·gx   gx·gy ]
            [ gx·gy   gy·gy ]   (2)
R = det(M) − k·tr(M)²   (3)
where M is the gray covariance matrix of the image pixel; I(x, y) is the gray value of the pixel; gx and gy are the gradients of the original image in the x and y directions; G(s) is the Gaussian template; det(M) and tr(M) are the determinant and trace of the matrix; k is a constant, usually taken as 0.04 to 0.06; and R is the interest value of the corresponding pixel of the original image. A pixel whose interest value exceeds a given threshold is marked as a corner.
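A minimal NumPy sketch of this interest-value computation (the helper name is ours, not the patent's, and a fixed 3×3 kernel stands in for the Gaussian template G(s)):

```python
import numpy as np

def harris_response(img, k=0.04):
    """Interest value R = det(M) - k * tr(M)^2 for every pixel."""
    gx, gy = np.gradient(img.astype(float))        # first derivatives only
    prods = gx * gx, gy * gy, gx * gy              # the three product images
    kern = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 16.0  # 3x3 Gaussian

    def smooth(a):
        # Gaussian filtering of a product image (edge-padded convolution)
        p = np.pad(a, 1, mode="edge")
        return sum(kern[i, j] * p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3))

    sxx, syy, sxy = (smooth(a) for a in prods)
    return sxx * syy - sxy ** 2 - k * (sxx + syy) ** 2
```

Thresholding the returned array (with local non-maximum suppression) yields the corner set used as points to be matched.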
3) POS conversion elements of exterior orientation
The POS comprises a GPS receiver and an inertial measurement unit (IMU) and is also called a GPS/IMU integrated system. Kalman filtering of the GPS positioning results and the IMU data yields high-precision navigation results, including the WGS84 geocentric coordinates of the IMU centre and its heading, pitch, and roll angles. These positions and attitudes are then converted into the exterior orientation elements of the images.
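The angle composition at the heart of this conversion can be sketched as follows. This is a hypothetical simplification: a full POS-to-exterior-orientation conversion also involves datum/map-projection transforms, boresight misalignment, and lever-arm corrections, and rotation orders and sign conventions vary between systems; here a Z (heading), Y (pitch), X (roll) sequence is assumed.

```python
import numpy as np

def rotation_from_hpr(heading, pitch, roll):
    """Compose a rotation matrix from heading/pitch/roll angles (radians)."""
    ch, sh = np.cos(heading), np.sin(heading)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[ch, -sh, 0], [sh, ch, 0], [0, 0, 1]])   # heading about Z
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about Y
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about X
    return Rz @ Ry @ Rx
```

The resulting matrix supplies the direction cosines used in the collinearity equations of the next step.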
4) resolve the initial point position
Computing the initial point positions means computing, on the search image, the position of the point corresponding to each point to be matched on the reference image. First, the object-space coordinates of the ground point corresponding to each point to be matched are solved from the DSM and the exterior orientation elements of the reference image; these object-space coordinates are then back-projected onto the search image, giving the initial position of the match point.
The object-space coordinates are solved according to the monoplotting principle: the projection ray at exposure time is intersected with the terrain surface to determine the spatial position of the ground point. The relation between the ground point coordinates and the image coordinates of the corresponding point to be matched on the reference image is given by formula (4):
X = Xs + (Z − Zs)·(a1·x + a2·y − a3·f)/(c1·x + c2·y − c3·f)
Y = Ys + (Z − Zs)·(b1·x + b2·y − b3·f)/(c1·x + c2·y − c3·f)   (4)
where (Xs, Ys, Zs) are the coordinates of the exposure station; f is the focal length; and a_i, b_i, c_i (i = 1, 2, 3) are the direction cosines of the rotation matrix composed from the exterior orientation angles.
When solving the object-space coordinates of the ground point from the image coordinates of a point to be matched, two equations must resolve three unknowns, so the solution is iterative. An initial elevation Z is taken as the mean of the maximum and minimum elevations in the DSM and substituted into formula (4) to obtain the planimetric coordinates (X, Y); the corresponding Z is then interpolated from the DSM at this planimetric position, and new planimetric coordinates are computed from it. The differences ΔX and ΔY between two consecutive planimetric solutions are computed; if both absolute differences are below the tolerance, the solved (X, Y, Z) is taken as the ground coordinates of the target point; otherwise the steps are repeated until the planimetric differences fall below the tolerance and the iteration stops.
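The iteration described above can be sketched as follows. Assumed interfaces (ours, not the patent's): R is the 3×3 rotation matrix whose rows hold the direction cosines (a1, a2, a3), (b1, b2, b3), (c1, c2, c3), and dsm_z is a caller-supplied function interpolating the DSM elevation at a planimetric position.

```python
import numpy as np

def monoplot(x, y, f, Xs, Ys, Zs, R, dsm_z, z0, tol=1e-3, max_iter=50):
    """Intersect the image ray of point (x, y) with the DSM via formula (4).

    z0 seeds the elevation (e.g. the mean of the DSM's minimum and maximum);
    iteration stops when two consecutive planimetric solutions agree to tol.
    """
    a, b, c = R                                 # rows of the rotation matrix
    denom = c[0] * x + c[1] * y - c[2] * f
    Z, X_prev, Y_prev = z0, None, None
    for _ in range(max_iter):
        X = Xs + (Z - Zs) * (a[0] * x + a[1] * y - a[2] * f) / denom
        Y = Ys + (Z - Zs) * (b[0] * x + b[1] * y - b[2] * f) / denom
        if X_prev is not None and abs(X - X_prev) < tol and abs(Y - Y_prev) < tol:
            break
        X_prev, Y_prev = X, Y
        Z = dsm_z(X, Y)                         # interpolate terrain height
    return X, Y, Z
```

On flat terrain the iteration converges in two or three passes; rugged terrain may need more, which is why the loop is bounded by max_iter.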
Once the object-space coordinates of the ground point are obtained, the initial position of the match point is easily found from the collinearity of the ground point and its image point on the search image. In the ideal case the directly solved initial position would be exact and no further processing would be needed; in practice, errors in converting the POS data into exterior orientation elements, errors in generating the DSM from the LiDAR point cloud, and the projection displacement of raised objects slightly offset the initial position from the exact one, but it still provides a good initial value for the subsequent matching.
5) correlation coefficient matching method
Correlation coefficient matching is a gray-level matching method that uses the correlation coefficient as its similarity measure. The correlation coefficient is a normalized covariance function; geometrically, the smaller the angle between the target vector and the search vector, the more similar the two windows. This scheme uses correlation coefficient matching because its principle is simple, its processing speed is high, and the algorithm is mature; once the initial positions of the points to be matched have been computed on the search image, correlation coefficient matching yields the final matching result. It also serves to verify the feasibility and validity of the airborne-LiDAR-assisted matching scheme.
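A minimal sketch of this final step (hypothetical helper names; images are NumPy arrays, and only a small search radius around the LiDAR-guided initial position is needed):

```python
import numpy as np

def ncc(a, b):
    """Correlation coefficient (normalized cross-correlation) of two patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def refine_match(ref, search, p_ref, p0, win=5, radius=3):
    """Refine an initial position p0 on the search image by maximizing the
    correlation coefficient against the window around p_ref on the
    reference image."""
    r0, c0 = p_ref
    tmpl = ref[r0 - win:r0 + win + 1, c0 - win:c0 + win + 1]
    best, best_rc = -2.0, p0
    for dr in range(-radius, radius + 1):
        for dc in range(-radius, radius + 1):
            r, c = p0[0] + dr, p0[1] + dc
            patch = search[r - win:r + win + 1, c - win:c + win + 1]
            score = ncc(tmpl, patch)
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc, best
```

Because the LiDAR-derived initial position is already close to the conjugate point, the search radius can stay small, which is exactly what makes this scheme faster than a blind search.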
To verify whether the method achieves the expected effect, the following experiments were performed. The experimental data are one stereo image pair of a test area and the airborne LiDAR point cloud of the corresponding overlap region, as shown in Fig. 2a, 2b and 2c.
First, the results and analysis of each step of the method:
1) generation of DSM
Table 1 lists the regularized DSM generated from the airborne LiDAR point cloud, where (Xmin, Ymin) is the planimetric coordinate of the lower-left corner of the DSM, (Xmax, Ymax) is the planimetric coordinate of the upper-right corner, and Zmin and Zmax are the minimum and maximum elevations in the DSM. (Table 1 appears as an image in the original and is not reproduced here.)
2) feature point extraction
Fig. 3 shows the corners extracted on the reference image with the Harris operator; only the corners within the DSM extent are shown, because corners outside it are not points to be matched and cannot be transferred to the search image. Since the corners cover a large area and are densely distributed, which is inconvenient for overall analysis, the local regions inside the white boxes in Fig. 3 are enlarged as the samples of this experiment: region 4 is sample 1 and region 5 is sample 2. The experimental results and evaluations of all subsequent steps are based on these samples.
3) resolve the initial point position
Fig. 4a, Fig. 4b and Fig. 5a, Fig. 5b show the computed initial point positions for sample 1 and sample 2 respectively: Fig. 4a and Fig. 5a show the feature points extracted on the reference image with the Harris operator, and Fig. 4b and Fig. 5b show the initial match point positions on the search image back-projected from the corner points on the reference image, with the regular DSM generated from the airborne LiDAR point cloud acting as the link. It can be seen that the initial positions guided by the airborne LiDAR point cloud are close to the exact match point positions, providing good initial values for the subsequent correlation coefficient matching.
4) correlation coefficient matching method
Correlation coefficient matching is performed with the computed initial point position as the approximation of the conjugate position of each point to be matched. Fig. 6a, Fig. 6b and Fig. 7a, Fig. 7b show the correlation coefficient matching results of sample 1 and sample 2: Fig. 6a and Fig. 7a show the Harris corners extracted in the sample regions, and Fig. 6b and Fig. 7b the corresponding match point positions. Comparison shows that correlation coefficient matching moves the initial positions on the search image to more accurate match positions; that it can do so also shows that the airborne LiDAR point cloud effectively assisted the computation of the initial positions.
Comparative experiment:
To further verify the superiority of the airborne-LiDAR-assisted matching method, it is compared with a matching method based on parallax. The parallax method first extracts Harris corners on the reference image, then computes the initial position of each corner on the search image from the parallax, and finally performs correlation coefficient matching on that basis to obtain the final matching result. The experiment compares the matching success rate and the running time of the two methods.
1) Matching success rate
An objective comparison of the success rates of the two schemes requires identical matching conditions, i.e. the same search window size. With the same search window size, the time consumed by correlation coefficient matching is similar in the two methods, and the matching quality depends on how closely the initial position approximates the conjugate position of the point to be matched.
Table 2 reports the matching times of the two schemes; because the search window sizes are identical, the correlation matching times are very close, so the success rates can be compared objectively on this basis. The resulting success rates of this method are given in Table 3, and Fig. 8 and Fig. 9 show the matching results of the parallax scheme on samples 1 and 2. This method achieves a high success rate, whereas the success rate of the parallax method is very low (below 30%), showing that the guidance of the airborne LiDAR point cloud yields initial positions much closer to the conjugate positions than the parallax scheme does, and hence reliable matching results. (Tables 2 and 3 appear as images in the original and are not reproduced here.)
2) Running time
The parallax method can reach matching results comparable to those of this method by enlarging the search window, but its running time then increases sharply; Table 4 compares the success rates of the parallax method with the enlarged search window against this method.
With the success rates of the two methods now comparable, their matching efficiency is verified by comparing the matching times, given in Table 5. The matching time of this method is about one third of that of the parallax scheme, proving that this method is more efficient, thanks to the guidance of the airborne LiDAR point cloud. The comparative experiments on success rate and running time prove that image matching assisted by airborne LiDAR point cloud data is a reliable and efficient image matching method with practical value. (Tables 4 and 5 appear as images in the original and are not reproduced here.)
The specific embodiments described above further explain the purpose, technical solution, and beneficial effects of the invention. It should be understood that the above are only specific embodiments of the invention and do not limit it; any modification, equivalent replacement, or improvement made within the spirit and principles of the invention shall fall within its scope of protection.

Claims (4)

1. A method of assisting image matching with airborne LiDAR point cloud data, characterized in that the method comprises the following steps:
(1) obtaining a reference image and a search image, the reference image and the search image forming a stereo image pair, and clipping the airborne LiDAR point cloud data of the overlap region of the image pair;
(2) extracting feature points on the reference image as the points to be matched;
(3) generating from the LiDAR point cloud data a regularized digital surface model (DSM) of the overlap region of the image pair;
(4) converting the carrier position and attitude information acquired by the POS system into the exterior orientation elements of the reference image and the search image;
(5) computing the initial positions on the search image of the points corresponding to the points to be matched on the reference image;
(6) performing correlation coefficient matching at the initial positions computed in step (5) to obtain the final matching result.
2. The method according to claim 1, characterized in that step (2) extracts the feature points from the reference image with the Harris operator.
3. The method according to claim 1 or 2, characterized in that step (3) first removes gross errors from the airborne LiDAR point cloud data and then resamples the point cloud onto a regular grid.
4. The method according to claim 1, characterized in that the initial positions of the corresponding points on the search image in step (5) are computed as follows:
first, the ground coordinates corresponding to each point to be matched are computed from the DSM and the exterior orientation elements of the reference image; these ground coordinates are then back-projected onto the search image using the exterior orientation elements of the search image and the collinearity of the ground point and its image point on the search image, yielding the initial position of each corresponding point on the search image.
CN201310003377.8A 2013-01-06 2013-01-06 Method of image matching assisted by airborne LiDAR point cloud data Expired - Fee Related CN103093459B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310003377.8A CN103093459B (en) 2013-01-06 2013-01-06 Method of image matching assisted by airborne LiDAR point cloud data


Publications (2)

Publication Number Publication Date
CN103093459A true CN103093459A (en) 2013-05-08
CN103093459B CN103093459B (en) 2015-12-23

Family

ID=48205990

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310003377.8A Expired - Fee Related CN103093459B (en) 2013-01-06 2013-01-06 Method of image matching assisted by airborne LiDAR point cloud data

Country Status (1)

Country Link
CN (1) CN103093459B (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102506824A (en) * 2011-10-14 2012-06-20 航天恒星科技有限公司 Method for generating digital orthophoto map (DOM) by urban low altitude unmanned aerial vehicle


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
姚春静: "机载LiDAR点云数据与遥感影像配准的方法研究" [Research on methods of registering airborne LiDAR point cloud data with remote sensing imagery], 《中国博士学位论文全文数据库信息科技辑》 (China Doctoral Dissertations Full-text Database, Information Science and Technology) *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103426165A (en) * 2013-06-28 2013-12-04 吴立新 Precise registration method of ground laser-point clouds and unmanned aerial vehicle image reconstruction point clouds
CN105651209A (en) * 2016-02-05 2016-06-08 中测新图(北京)遥感技术有限责任公司 Emergency obtaining method and device for designated region area
CN105651209B (en) * 2016-02-05 2018-10-02 中测新图(北京)遥感技术有限责任公司 The emergent acquisition methods and device of specified region area
CN106780712A (en) * 2016-10-28 2017-05-31 武汉市工程科学技术研究院 Joint laser scanning and the three-dimensional point cloud generation method of Image Matching
CN107886531A (en) * 2017-12-15 2018-04-06 武汉智能鸟无人机有限公司 A kind of virtual controlling point acquisition methods matched based on laser ranging and object space
CN107886531B (en) * 2017-12-15 2024-04-16 武汉智能鸟无人机有限公司 Virtual control point acquisition method based on laser ranging and object space matching
CN110389586A (en) * 2018-04-19 2019-10-29 法拉第未来公司 The system and method detected for ground and free space
CN110940994A (en) * 2018-09-25 2020-03-31 北京京东尚科信息技术有限公司 Positioning initialization method and system thereof
CN109507677B (en) * 2018-11-05 2020-08-18 浙江工业大学 SLAM method combining GPS and radar odometer
CN109507677A (en) * 2018-11-05 2019-03-22 浙江工业大学 A kind of SLAM method of combination GPS and radar odometer
US10983215B2 (en) 2018-12-19 2021-04-20 Fca Us Llc Tracking objects in LIDAR point clouds with enhanced template matching
CN109727278B (en) * 2018-12-31 2020-12-18 中煤航测遥感集团有限公司 Automatic registration method for airborne LiDAR point cloud data and aerial image
CN109727278A (en) * 2018-12-31 2019-05-07 中煤航测遥感集团有限公司 Automatic registration method for airborne LiDAR point cloud data and aerial imagery
CN110646792A (en) * 2019-11-04 2020-01-03 中国人民解放军空军工程大学 Radar search window setting method based on an observation-post digital telescope
CN110646792B (en) * 2019-11-04 2022-04-12 中国人民解放军空军工程大学 Radar search window setting method based on an observation-post digital telescope
CN114689015A (en) * 2021-11-29 2022-07-01 成都理工大学 Method for improving elevation precision of optical satellite stereoscopic image DSM
CN114689015B (en) * 2021-11-29 2023-01-17 成都理工大学 Method for improving elevation precision of optical satellite stereoscopic image DSM
CN115329111A (en) * 2022-10-11 2022-11-11 齐鲁空天信息研究院 Image feature library construction method and system based on point cloud and image matching

Also Published As

Publication number Publication date
CN103093459B (en) 2015-12-23

Similar Documents

Publication Publication Date Title
CN103093459B (en) Method for assisting image matching using airborne LiDAR point cloud data
Yang et al. Automatic registration of UAV-borne sequent images and LiDAR data
US9098229B2 (en) Single image pose estimation of image capture devices
CN102693542B (en) Image characteristic matching method
CN108801274B (en) Landmark map generation method integrating binocular vision and differential satellite positioning
CN107560592B (en) Precise distance measurement method for photoelectric tracker linkage target
CN104268935A (en) Feature-based airborne laser point cloud and image data fusion system and method
CN111830953A (en) Vehicle self-positioning method, device and system
CN104536009A (en) Laser infrared composite ground building recognition and navigation method
CN102435188A (en) Monocular vision/inertia autonomous navigation method for indoor environment
CN102853835B (en) Scale invariant feature transform-based unmanned aerial vehicle scene matching positioning method
CN105352509A (en) Unmanned aerial vehicle motion target tracking and positioning method under geographic information space-time constraint
CN103115614A (en) Associated parallel matching method for multi-source multi-track long-strip satellite remote sensing images
CN106525054B (en) Autonomous orbit determination method for a single satellite using on-board push-broom remote sensing image information
Hallquist et al. Single view pose estimation of mobile devices in urban environments
Shi et al. Fusion of a panoramic camera and 2D laser scanner data for constrained bundle adjustment in GPS-denied environments
Parmehr et al. Automatic registration of optical imagery with 3d lidar data using local combined mutual information
Huang et al. Multi-view large-scale bundle adjustment method for high-resolution satellite images
Li et al. Geo-localization with transformer-based 2D-3D match network
CN113504543A (en) Unmanned aerial vehicle LiDAR system positioning and attitude determination system and method
Zhao et al. Alignment of continuous video onto 3D point clouds
Ma et al. RoLM: Radar on LiDAR map localization
Chenchen et al. A camera calibration method for obstacle distance measurement based on monocular vision
CN102620745A (en) Airborne inertial measurement unit (IMU) collimation axis error calibration method
CN107578429A (en) Dense stereo matching method for stereo imagery based on dynamic programming and a global cost accumulation path

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20151223

Termination date: 20170106