CN109727278A - Automatic registration method for airborne LiDAR point cloud data and aerial images - Google Patents

Automatic registration method for airborne LiDAR point cloud data and aerial images

Info

Publication number
CN109727278A
Authority
CN
China
Prior art keywords
point
image
coordinate
width
reliable
Prior art date
Legal status
Granted
Application number
CN201811651300.0A
Other languages
Chinese (zh)
Other versions
CN109727278B (en)
Inventor
梁菲
王慧芳
王铮尧
Current Assignee
China Coal Survey & Remote Sensing Group Co Ltd
Aerial Photogrammetry and Remote Sensing Co Ltd
Original Assignee
China Coal Survey & Remote Sensing Group Co Ltd
Priority date
Filing date
Publication date
Application filed by China Coal Survey & Remote Sensing Group Co Ltd
Priority to CN201811651300.0A (patent CN109727278B)
Publication of CN109727278A
Application granted
Publication of CN109727278B
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses an automatic registration method for airborne LiDAR point cloud data and aerial images, comprising the steps of: 1. acquiring the point cloud data and aerial images and field-surveying the control points; 2. image preprocessing; 3. image matching; 4. obtaining the photo coordinates of the control points; 5. initial bundle block adjustment and exterior orientation update; 6. feature point matching between the point cloud data and the aerial images; 7. calculating feature point photo coordinates on the images to be processed; 8. image matching; 9. set updating; 10. bundle block adjustment and exterior orientation update; 11. judging whether the automatic registration is finished. The present invention matches feature points of the aerial images against the point cloud data to achieve automatic registration of airborne LiDAR point cloud data and aerial images; it can effectively improve registration accuracy with a small amount of computation and, by combining the field-surveyed three-dimensional coordinates of multiple ground control points, can be applied effectively to the automatic registration of point cloud data and aerial images over different terrain.

Description

Automatic registration method for airborne LiDAR point cloud data and aerial images
Technical field
The invention belongs to the technical field of aerial photogrammetry, and in particular relates to an automatic registration method for airborne LiDAR point cloud data and aerial images.
Background technique
Airborne LiDAR (Light Detection and Ranging; laser radar measurement) technology can rapidly acquire high-precision digital surface models (DSM) and, after data processing, is widely applied in fields such as electric power, forestry and digital cities. The data it acquires are mainly discrete laser point clouds (also called LiDAR data, airborne LiDAR data, LiDAR point cloud data, or point cloud data), which carry high-precision spatial geometric information but lack spectral information, so thematic data about ground objects cannot be obtained from them. Conversely, conventional image data provide rich semantic information. Using LiDAR data and image data jointly can therefore effectively improve the accuracy of remote sensing products and expand the depth and scope of remote sensing applications, and this is an important research direction in the current remote sensing field.
The key step in using images and point cloud data jointly is to register the two with high accuracy, and the core of registration is image matching. Image matching identifies corresponding points between two-dimensional data sets by some image matching algorithm; the identified corresponding points are also called image tie points. Image matching is used in various fields, such as photogrammetric processing, digital image processing, medical image processing and remote sensing image processing, and different application fields place different requirements on it. Registration of point cloud data with images, however, is registration between two-dimensional and three-dimensional data, which differs from conventional two-dimensional matching. Because an image is two-dimensional while point cloud data are three-dimensional, the point cloud data must be interpolated during registration, which lowers the matching accuracy. To reach a high-accuracy registration result, the matched points must be highly accurate, with a precision within 1 pixel. In practice, matching algorithms are affected by image noise, image rotation, image deformation and so on; in aerial photography in particular, terrain relief, the central-projection imaging mode and the flight environment cause large rotations and deformations of the aerial stereo images, so automatic processing with existing registration methods easily yields low registration accuracy, and a high-precision registration model is needed. Existing 3D-3D registration models also suffer from problems such as laborious mathematical expression and computation and a lack of rigor.
Summary of the invention
In view of the above deficiencies in the prior art, the technical problem to be solved by the present invention is to provide an automatic registration method for airborne LiDAR point cloud data and aerial images whose steps are simple, whose design is reasonable, and which is convenient to implement and effective in use. It matches extracted feature points of the aerial images against the point cloud data to achieve automatic registration of airborne LiDAR point cloud data and aerial images, effectively improves registration accuracy with a small amount of computation, performs registration in combination with the field-surveyed three-dimensional coordinates of multiple ground control points, and can be applied effectively to the automatic registration of point cloud data and aerial images over different terrain.
To solve the above technical problem, the technical solution adopted by the present invention is an automatic registration method for airborne LiDAR point cloud data and aerial images, characterized in that the method comprises the following steps:
Step 1, point cloud data and aerial image acquisition and field survey of control points: acquire the point cloud data of the region to be measured using an airborne LiDAR measuring system, and send the acquired point cloud data to a data processing device; the point cloud data comprise multiple measured points of the region to be measured and the three-dimensional coordinates of each measured point.
Meanwhile, lay out multiple ground control points in the measured region and field-survey the three-dimensional coordinates of each ground control point to obtain its surveyed three-dimensional coordinates; then, using the laid-out ground control points, perform aerial photogrammetry of the measured region, capture several aerial images of the measured region, and synchronously send the acquired aerial images to the data processing device. Each aerial image is a digital image and is a two-dimensional image.
When performing aerial photogrammetry of the measured region, obtain the exterior orientation elements of each aerial image and synchronously send them to the data processing device; in this step, the exterior orientation elements of each aerial image are that image's initial exterior orientation elements.
Step 2, image preprocessing: use the data processing device to denoise and filter each of the several aerial images from step 1, obtaining the several preprocessed aerial images.
Step 3, image matching: use the data processing device, calling an image matching module, to perform image matching on the several preprocessed aerial images from step 2, obtaining all feature points that match among the several aerial images; the matched feature points are the image tie points among the several aerial images.
In this step, all feature points obtained on each aerial image form that image's feature point set; the feature point set of each aerial image contains the photo coordinates of all feature points on that image obtained by the matching in this step.
Step 4, obtaining control point photo coordinates: according to the surveyed three-dimensional coordinates of the multiple ground control points in step 1, use the data processing device, calling a photo coordinate calculation module, to calculate the photo coordinates of the multiple ground control points on each of the preprocessed aerial images from step 2, obtaining the photo coordinates of the multiple ground control points on each aerial image; then add the photo coordinates of the multiple ground control points on each aerial image to that image's feature point set from step 3, obtaining each aerial image's complete feature point set.
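As an illustrative sketch only (not part of the patent text), the photo coordinate calculation of step 4 is conventionally done with the collinearity equations; the angle convention (phi-omega-kappa here) and focal length `focal` are assumptions, since the patent only names a functional "photo coordinate calculation module".

```python
import numpy as np

def rotation_matrix(phi, omega, kappa):
    """Rotation matrix for the phi-omega-kappa convention (one common
    photogrammetric choice; the patent does not fix the convention)."""
    Rphi = np.array([[np.cos(phi), 0, -np.sin(phi)],
                     [0, 1, 0],
                     [np.sin(phi), 0, np.cos(phi)]])
    Rom = np.array([[1, 0, 0],
                    [0, np.cos(omega), -np.sin(omega)],
                    [0, np.sin(omega), np.cos(omega)]])
    Rka = np.array([[np.cos(kappa), -np.sin(kappa), 0],
                    [np.sin(kappa), np.cos(kappa), 0],
                    [0, 0, 1]])
    return Rphi @ Rom @ Rka

def ground_to_photo(ground_xyz, eo, focal):
    """Collinearity equations: project a ground point (X, Y, Z) to photo
    coordinates (x, y) given exterior orientation elements
    eo = (Xs, Ys, Zs, phi, omega, kappa) and the focal length."""
    Xs, Ys, Zs, phi, omega, kappa = eo
    a = rotation_matrix(phi, omega, kappa).T   # rows are a1..a3, b1..b3, c1..c3
    d = np.asarray(ground_xyz, float) - np.array([Xs, Ys, Zs])
    denom = a[2] @ d
    x = -focal * (a[0] @ d) / denom
    y = -focal * (a[1] @ d) / denom
    return x, y
```

For a nadir camera at height 1000 m with zero rotations and a 0.1 m focal length, a ground point 100 m east of the camera projects to x = 0.01 m, y = 0 on the image plane.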
Step 5, initial bundle block adjustment and exterior orientation update: according to the initial exterior orientation elements of each aerial image in step 1, the complete feature point set of each aerial image in step 4, and the surveyed three-dimensional coordinates of the multiple ground control points in step 1, use the data processing device, calling a bundle block adjustment module, to perform bundle block adjustment and obtain the adjusted exterior orientation elements of each aerial image; then call a data update module to update each aerial image's exterior orientation elements to the adjusted values obtained at this time.
The surveyed three-dimensional coordinates of the multiple ground control points in step 1 form the control point set.
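A minimal sketch of the adjustment idea, under stated assumptions: the patent's bundle block adjustment module is treated here as least-squares minimisation of collinearity reprojection residuals. For brevity this refines only one image's exterior orientation from control points (space resection); a full bundle block adjustment stacks such residuals over all images and also treats tie-point ground coordinates as unknowns. The use of `scipy.optimize.least_squares` is an illustrative choice, not the patent's method.

```python
import numpy as np
from scipy.optimize import least_squares

def project(eo, pts, f):
    # Collinearity projection of ground points pts (N,3) into photo
    # coordinates (N,2) for eo = (Xs, Ys, Zs, phi, omega, kappa).
    Xs, Ys, Zs, ph, om, ka = eo
    cp, sp = np.cos(ph), np.sin(ph)
    co, so = np.cos(om), np.sin(om)
    ck, sk = np.cos(ka), np.sin(ka)
    R = (np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])
         @ np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
         @ np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]]))
    u = (pts - np.array([Xs, Ys, Zs])) @ R   # each row: R.T @ (P - S)
    return -f * u[:, :2] / u[:, 2:3]

def refine_orientation(eo0, ground_pts, photo_xy, f):
    # Least-squares refinement of one image's exterior orientation
    # elements from control-point observations.
    resid = lambda eo: (project(eo, ground_pts, f) - photo_xy).ravel()
    return least_squares(resid, eo0).x
```

With synthetic observations generated from a known orientation, the refinement recovers that orientation from a perturbed starting value.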
Step 6, feature point matching between the point cloud data and the aerial images: use the data processing device to match the point cloud data from step 1 against any one of the preprocessed aerial images from step 2; that aerial image is the reference image, and the remaining preprocessed aerial images are the images to be processed.
Matching the point cloud data against the reference image proceeds as follows:
Step 601, building the triangulation network: according to the point cloud data from step 1, call a triangulation network building module to build a triangulation network; the constructed triangulation network is the point cloud triangulation network.
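As an illustrative sketch (not part of the patent), step 601 can be realised as a 2.5D Delaunay TIN over the plan (X, Y) coordinates of the cloud; `scipy.spatial.Delaunay` and the toy five-point cloud below are assumptions standing in for the patent's triangulation network building module and the real LiDAR data.

```python
import numpy as np
from scipy.spatial import Delaunay

# Toy point cloud (X, Y, Z); real data would be the LiDAR points of step 1.
cloud = np.array([[0.0, 0.0, 10.0], [1.0, 0.0, 11.0],
                  [0.0, 1.0, 12.0], [1.0, 1.0, 13.0],
                  [0.4, 0.6, 11.5]])
tin = Delaunay(cloud[:, :2])        # 2.5D TIN: triangulate in plan (X, Y)

# Locating the triangle containing a ground position, as needed later
# in steps A1, B1 and C1:
idx = tin.find_simplex([[0.5, 0.5]])[0]
triangle_xyz = cloud[tin.simplices[idx]]   # the 3 vertices with heights
```

`find_simplex` returns -1 for positions outside the convex hull, which would correspond to a point that cannot be judged against any triangle.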
Step 602, Harris corner extraction: call a Harris corner detection module to extract the feature points of the reference image and record the photo coordinates of each extracted feature point; the extracted feature points are Harris corners.
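A plain-NumPy sketch of the Harris response for illustration; a production Harris corner detection module would typically use an existing implementation such as OpenCV's `cornerHarris`. The 3x3 box window and k = 0.04 are common textbook choices, not values from the patent.

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris corner response R = det(M) - k*trace(M)^2 of the structure
    tensor M, summed over a 3x3 window (illustrative sketch)."""
    Iy, Ix = np.gradient(img.astype(float))   # central differences
    def box(a):                               # 3x3 box sum
        p = np.pad(a, 1)
        return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
                   for i in range(3) for j in range(3))
    Sxx, Syy, Sxy = box(Ix * Ix), box(Iy * Iy), box(Ix * Iy)
    return (Sxx * Syy - Sxy ** 2) - k * (Sxx + Syy) ** 2

# A bright square on a dark background: the strongest responses sit at
# its corners (edges give a negative response, corners a positive one).
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
R = harris_response(img)
ij = np.unravel_index(np.argmax(R), R.shape)
```

Thresholding R and taking local maxima would yield the corner list whose photo coordinates step 602 records.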
Step 603, determining the three-dimensional coordinates of image Harris corners and ground control points based on the triangulation network: according to the point cloud triangulation network from step 601, determine the three-dimensional coordinates of all Harris corners of the reference image from step 602 and of the multiple ground control points from step 4.
When determining the three-dimensional coordinates of all Harris corners of the reference image, the three-dimensional coordinate of each Harris corner is determined according to its photo coordinates on the reference image, yielding the three-dimensional coordinates of multiple reliable Harris corners.
When determining the three-dimensional coordinate of any one Harris corner on the reference image, that Harris corner is the currently processed point, and the process is as follows:
Step A1, validity judgment of the processed point: according to the exterior orientation elements of the reference image at this time and the photo coordinates of the currently processed point on the reference image, call a ground coordinate conversion module to convert and obtain the ground coordinates of the currently processed point; then, according to the converted ground coordinates, find the triangle of the point cloud triangulation network constructed in step 601 in which the currently processed point lies, and call a triangle judgment module to judge the found triangle: when all three side lengths of the found triangle are less than TL and the height differences between any two of its three vertices are all less than TH, the currently processed point is judged to be a valid processed point, and step A2 is entered; otherwise the currently processed point is discarded.
Here TL is a preset triangle side length judgment threshold and TH is a preset triangle vertex height difference judgment threshold; a valid processed point is a reliable Harris corner.
Step A2, three-dimensional coordinate determination: call an elevation coordinate computing module to interpolate the height value of the currently processed point within the triangle found in step A1, and combine it with the ground coordinates converted in step A1 to obtain the three-dimensional coordinate of the currently processed point.
Step A3: repeat steps A1 to A2 one or more times to determine the three-dimensional coordinates of all Harris corners of the reference image, obtaining the three-dimensional coordinates of the reference image's multiple reliable Harris corners.
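Steps A1 and A2 can be sketched as follows (illustrative only): the triangle acceptance test with thresholds TL and TH, and barycentric interpolation of the height within the accepted triangle. Plan (X, Y) side lengths are assumed here, since the patent does not state whether side lengths are measured in plan or in 3D.

```python
import numpy as np

def triangle_ok(tri_xyz, TL=3.0, TH=1.0):
    # Step A1 test: all three side lengths below TL and all pairwise
    # vertex height differences below TH (TL = 3 m, TH = 1 m are the
    # patent's preferred-embodiment values).
    v = np.asarray(tri_xyz, float)
    for i in range(3):
        a, b = v[i], v[(i + 1) % 3]
        if np.hypot(*(a[:2] - b[:2])) >= TL or abs(a[2] - b[2]) >= TH:
            return False
    return True

def interp_height(tri_xyz, xy):
    # Step A2: barycentric interpolation of the height at plan position xy.
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = np.asarray(tri_xyz, float)
    T = np.array([[x1 - x3, x2 - x3], [y1 - y3, y2 - y3]])
    l1, l2 = np.linalg.solve(T, np.asarray(xy, float) - [x3, y3])
    return l1 * z1 + l2 * z2 + (1 - l1 - l2) * z3
```

A small, nearly level triangle passes the test and yields an interpolated height on its plane; a triangle with a 2 m vertex height difference is rejected, which discards the currently processed point.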
When determining the three-dimensional coordinates of the multiple ground control points from step 4, they are determined according to the photo coordinates of the multiple ground control points on the reference image, yielding the three-dimensional coordinates of multiple reliable control points.
When determining the three-dimensional coordinate of any one ground control point, that ground control point is the currently processed point, and the process is as follows:
Step B1, validity judgment of the processed point: according to the exterior orientation elements of the reference image at this time and the photo coordinates of the currently processed point on the reference image, call the ground coordinate conversion module to convert and obtain the ground coordinates of the currently processed point; then, according to the converted ground coordinates, find the triangle of the point cloud triangulation network constructed in step 601 in which the currently processed point lies, and call the triangle judgment module to judge the found triangle: when all three side lengths of the found triangle are less than TL and the height differences between any two of its three vertices are all less than TH, the currently processed point is judged to be a valid processed point, and step B2 is entered; otherwise the currently processed point is discarded.
In this step, a valid processed point is a reliable control point.
Step B2, three-dimensional coordinate determination: call the elevation coordinate computing module to interpolate the height value of the currently processed point within the triangle found in step B1, and combine it with the ground coordinates converted in step B1 to obtain the three-dimensional coordinate of the currently processed point.
Step B3: repeat steps B1 to B2 one or more times to determine the three-dimensional coordinates of the multiple ground control points from step 4, obtaining the three-dimensional coordinates of multiple reliable control points.
Step 604, coordinate transformation matrix calculation: call a control point searching module to find the surveyed three-dimensional coordinates of the multiple reliable control points among the surveyed three-dimensional coordinates of the multiple ground control points from step 1; then call a coordinate transformation matrix calculation module to compute the coordinate transformation matrix between the three-dimensional coordinates of the multiple reliable control points obtained in step B3 and their surveyed three-dimensional coordinates found here.
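One common way to realise the step-604 transformation between the TIN-derived and field-surveyed control point coordinates is a least-squares rigid transform (Kabsch/Umeyama); the patent does not name the estimator, so this is an assumption for illustration.

```python
import numpy as np

def fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping point set src onto
    dst: Kabsch/Umeyama via SVD of the cross-covariance matrix."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution.
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = cd - R @ cs
    return R, t
```

Applied in step 605, `src @ R.T + t` carries the reliable Harris corner coordinates into the field-surveyed frame.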
Step 605, determining image Harris corner three-dimensional coordinates based on the coordinate transformation matrix: according to the coordinate transformation matrix computed in step 604, call a coordinate transformation module to transform the three-dimensional coordinates of the reference image's multiple reliable Harris corners from step A3, obtaining the transformed three-dimensional coordinates of the multiple reliable Harris corners.
Step 606, correcting image Harris corner three-dimensional coordinates based on the triangulation network: using the point cloud triangulation network constructed in step 601, correct the three-dimensional coordinates of the multiple reliable Harris corners obtained in step 605.
When correcting the three-dimensional coordinate of any one reliable Harris corner obtained in step 605, that reliable Harris corner is the current correction point, and the process is as follows:
Step C1, correction judgment: according to the three-dimensional coordinate of the current correction point obtained in step 605, find the triangle of the point cloud triangulation network constructed in step 601 in which the current correction point lies, and call the triangle judgment module to judge the found triangle: when all three side lengths of the found triangle are less than TL and the height differences between any two of its three vertices are all less than TH, the three-dimensional coordinate of the current correction point is judged to need correction, and step C2 is entered; otherwise no correction is needed, and the three-dimensional coordinate of the current correction point remains the one obtained in step 605.
Step C2, coordinate correction: call the elevation coordinate computing module to interpolate the height value of the current correction point within the triangle found in step C1, then replace the height value in the current correction point's three-dimensional coordinate obtained in step 605 with this interpolated height value, obtaining the corrected three-dimensional coordinate of the current correction point.
Step C3: repeat steps C1 to C2 one or more times to correct the three-dimensional coordinates of the multiple reliable Harris corners, obtaining the corrected three-dimensional coordinates of the multiple reliable Harris corners.
The corrected three-dimensional coordinates of the multiple reliable Harris corners in step C3 are the three-dimensional coordinates of the reference image's multiple reliable Harris corners.
Step 7, calculating feature point photo coordinates on the images to be processed: use the data processing device to calculate the feature point photo coordinates of each image to be processed from step 6, obtaining the photo coordinates of each image's multiple corresponding Harris corners.
The feature point photo coordinates of every image to be processed are calculated by the same method.
When calculating the feature point photo coordinates of any one image to be processed, according to the three-dimensional coordinates of the reference image's multiple reliable Harris corners obtained in step C3, combined with the exterior orientation elements of the image to be processed at this time, call the photo coordinate calculation module to calculate the photo coordinates of the multiple reliable Harris corners on the image to be processed, obtaining the photo coordinates of that image's multiple corresponding Harris corners.
Each corresponding Harris corner is the pixel of a reliable Harris corner on the image to be processed, and the pixel coordinate of each reliable Harris corner on the image to be processed is the coordinate of that pixel on the image.
Step 8, image matching: use the data processing device, calling a feature point searching module, to find the photo coordinates of the multiple reliable Harris corners from step A3 among the photo coordinates of the feature points extracted in step 602; the photo coordinates found are the photo coordinates of the reference image's multiple reliable Harris corners. Then, according to the photo coordinates of the reference image's multiple reliable Harris corners and the photo coordinates of each image to be processed's multiple corresponding Harris corners obtained in step 7, use the data processing device, calling the image matching module, to perform image matching and obtain all Harris corners that match between the reference image and each image to be processed.
Each obtained Harris corner that matches between the reference image and an image to be processed is a registration control point.
Step 9, set updating: use the data processing device to add the photo coordinates of all registration control points obtained in step 8 on each aerial image to that image's complete feature point set from step 4, obtaining each aerial image's updated complete feature point set; meanwhile, add the corrected three-dimensional coordinates of the multiple reliable Harris corners from step 6 to the control point set at this time, obtaining the updated control point set.
Step 10, bundle block adjustment and exterior orientation update: according to the exterior orientation elements of each aerial image at this time, each aerial image's complete feature point set from step 9, and the control point set at this time, use the data processing device, calling the bundle block adjustment module, to perform bundle block adjustment and obtain the adjusted exterior orientation elements of each aerial image; then call the data update module to update each aerial image's exterior orientation elements to the adjusted values obtained at this time, obtaining the updated exterior orientation elements of each aerial image and completing one automatic registration pass of the point cloud data against the aerial images.
Step 11, automatic registration termination judgment: use the data processing device, calling a numerical comparison module, to judge the corrections of the three angular elements in each aerial image's exterior orientation elements updated in step 10: when the corrections of the three angular elements in each updated aerial image's exterior orientation elements are all less than a preset tolerance, the automatic registration of the point cloud data against the several aerial images is judged complete, and the data processing device outputs the automatic registration result, namely the updated exterior orientation elements of each aerial image from step 10; otherwise, the data processing device performs the registration count judgment.
When performing the registration count judgment with the data processing device, judge whether the number of automatic registration passes completed so far has reached the preset maximum registration count: if it has, automatic registration is judged to have failed and the data processing device outputs the automatic registration result, namely registration failure; otherwise, return to step 6 and perform the next automatic registration pass of the point cloud data against the aerial images.
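The step-11 control flow can be sketched as follows (illustrative only; the tolerance value and exterior-orientation tuple layout `(Xs, Ys, Zs, phi, omega, kappa)` are assumptions):

```python
def registration_finished(eo_prev, eo_new, tol, n_done, n_max):
    # Step-11 logic: 'converged' when every image's three angular
    # exterior-orientation corrections are below tol; 'failed' when the
    # completed pass count has reached n_max; otherwise 'continue'
    # (return to step 6 for another pass).
    converged = all(abs(new[k] - old[k]) < tol
                    for old, new in zip(eo_prev, eo_new)
                    for k in (3, 4, 5))   # phi, omega, kappa slots
    if converged:
        return 'converged'
    return 'failed' if n_done >= n_max else 'continue'
```

Note that only the three angular elements are tested; the linear elements may still change between passes without affecting the termination decision.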
The above automatic registration method for airborne LiDAR point cloud data and aerial images is characterized in that TL = 3 m and TH = 1 m in step A1.
The above automatic registration method for airborne LiDAR point cloud data and aerial images is characterized in that, when performing the next automatic registration pass of the point cloud data against the aerial images in step 11, automatic registration is performed according to the method described in steps 6 to 10, after which step 11 is entered again for the termination judgment.
The above automatic registration method for airborne LiDAR point cloud data and aerial images is characterized in that, before performing image preprocessing in step 2, the automatic registration count N is set with the data processing device, at which point N = 0;
After completing one automatic registration pass of the point cloud data against the aerial images in step 10, the data processing device increments the automatic registration count N by 1.
The preset maximum registration count in step 11 is denoted Nmax, where Nmax is a positive integer and Nmax ≥ 3.
When the data processing device judges in step 11 whether the completed automatic registration count has reached the preset maximum registration count, N at this time is compared with Nmax: when N ≥ Nmax, the completed automatic registration count is judged to have reached the preset maximum registration count; otherwise, it is judged not to have reached the preset maximum registration count.
The above automatic registration method for airborne LiDAR point cloud data and aerial images is characterized in that, when performing bundle block adjustment in steps 5 and 10, the data processing device calls the bundle block adjustment module to perform POS-assisted bundle block adjustment.
The above automatic registration method for airborne LiDAR point cloud data and aerial images is characterized in that, when performing aerial photogrammetry of the measured region in step 1, the aerial photogrammetry is carried out with a POS-equipped aerial photogrammetry system;
The exterior orientation elements of each aerial image are those obtained by the POS system when performing aerial photogrammetry of the measured region in step 1.
In the above automatic registration method of airborne LiDAR point cloud data and aerial images, before the photo coordinates of the feature points of the images to be processed are calculated in step 7, the data processing equipment first performs a corner validity judgment on each of the multiple reliable Harris corners of the reference image obtained in step C3;
The corner validity judgment method is the same for all reliable Harris corners;
When a corner validity judgment is performed on any reliable Harris corner of the reference image, the photo coordinate calculation module is called to calculate the photo coordinates of that reliable Harris corner on the reference image, according to its three-dimensional coordinates obtained in step C3 combined with the current exterior orientation elements of the reference image; the calculated photo coordinates of the reliable Harris corner on the reference image are denoted (x*, y*). The data processing equipment then finds the pixel coordinates of that reliable Harris corner among the photo coordinates of the multiple reliable Harris corners of the reference image in step 8; the pixel coordinates found are denoted (x^, y^). Afterwards, the numerical calculation module is called and the photo coordinate deviation Δr of the reliable Harris corner is calculated according to the formula Δr = √(Δx² + Δy²), where Δx = x* − x^ and Δy = y* − y^. The difference comparison module is then called to judge whether Δr is less than Δt: when Δr < Δt, the reliable Harris corner is judged to be a valid corner; otherwise, it is judged to be an invalid corner;
Here, Δt is a preset photo coordinate deviation judgment threshold;
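The corner validity test above reduces to a small amount of arithmetic. Below is a minimal sketch, assuming an illustrative function name and tuple interface; it is not the patented module itself:

```python
import math

# Hedged sketch of the corner-validity test: a reliable Harris corner is
# kept only if the distance between its back-projected photo coordinates
# (x*, y*) and its matched photo coordinates (x^, y^) is below the
# threshold dt. All names here are illustrative.

def is_valid_corner(projected, matched, dt):
    """Return True when the photo-coordinate deviation dr < dt."""
    dx = projected[0] - matched[0]
    dy = projected[1] - matched[1]
    dr = math.hypot(dx, dy)  # dr = sqrt(dx^2 + dy^2)
    return dr < dt
```

In practice Δt would plausibly be on the order of one pixel, consistent with the tie point accuracy targeted later in the description.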
Before the photo coordinates of the feature points of any image to be processed are calculated in step 7, the data processing equipment removes all invalid corners from the multiple reliable Harris corners of the reference image obtained in step C3;
When the photo coordinates of the feature points of any image to be processed are calculated in step 7, the multiple reliable Harris corners of the reference image are all valid corners;
Each corresponding Harris corner is the pixel of a valid corner on the image to be processed;
The reliable Harris corners added to the control point set at that time in step 9 are the valid corners.
In the above automatic registration method of airborne LiDAR point cloud data and aerial images, when the image matching module is called by the data processing equipment to perform image matching in step 8, correlation coefficients are computed between the multiple reliable Harris corners of the reference image and the multiple corresponding Harris corners of each image to be processed obtained in step 7, and all registration control points that match between the reference image and each image to be processed are found from the correlation coefficient results;
When correlation coefficients are computed between the multiple reliable Harris corners of the reference image and the multiple corresponding Harris corners of any image to be processed, each reliable Harris corner of the reference image is correlated in turn with the multiple corresponding Harris corners of that image, and all Harris corners that match between the reference image and that image are found from the correlation coefficient results;
After the correlation coefficient computation between the multiple reliable Harris corners of the reference image and the multiple corresponding Harris corners of each image to be processed is completed, all Harris corners that match between the reference image and each image to be processed are obtained; the Harris corners found are the registration control points.
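As one hedged illustration of the correlation coefficient computation described above, the sketch below scores candidate matches with a normalized correlation coefficient over small image windows; the window extraction and the 0.8 acceptance threshold are assumptions, not values fixed by this method:

```python
import numpy as np

# Illustrative matching by normalized correlation coefficient: for a
# window around a reliable Harris corner on the reference image, find
# the best-correlated window among the corresponding corners on an
# image to be processed. Names and the threshold are assumptions.

def ncc(a, b):
    """Normalized correlation coefficient of two equal-sized windows."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def best_match(ref_win, candidate_wins, threshold=0.8):
    """Index of the candidate best correlated with ref_win, or None."""
    scores = [ncc(ref_win, w) for w in candidate_wins]
    k = int(np.argmax(scores))
    return k if scores[k] >= threshold else None
```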
In the above automatic registration method of airborne LiDAR point cloud data and aerial images, after the elevation coordinate calculation module is called in step A2 and the height value of the currently processed point is interpolated from the triangle found in step A1, the height value correction module must also be called to correct the interpolated height value, as follows:
Step A21, searching the nearest survey point in the TIN: according to the interpolated height value of the currently processed point and its ground coordinates obtained by conversion in step A1, the triangle containing the currently processed point is found in the point cloud TIN constructed in step 601, and the survey point nearest to the currently processed point is found within that triangle; the survey point found is one of the survey points of the point cloud data. The height value of the survey point found is then taken as the corrected height value of the currently processed point;
Step A22, height value correction termination judgment: the corrected height value of the currently processed point from step A21 is compared with its height value before the correction in step A21; when the absolute value of the difference between the two is less than ΔH, the height value correction is judged complete, and the corrected height value is taken as the reliable height value of the currently processed point; otherwise, proceed to step A23 for the next correction;
Here, ΔH is a preset height difference judgment threshold;
Step A23, searching the nearest survey point in the TIN: according to the corrected height value of the currently processed point and its ground coordinates obtained by conversion in step A1, the triangle containing the currently processed point is found in the point cloud TIN constructed in step 601, and the survey point nearest to the currently processed point is found within that triangle; the survey point found is one of the survey points of the point cloud data. The height value of the survey point found is then taken as the corrected height value of the currently processed point;
Step A24, height value correction termination judgment: the corrected height value of the currently processed point from step A23 is compared with its height value after the previous correction; when the absolute value of the difference between the two is less than 0.001, the height value correction is judged complete, and the corrected height value is taken as the reliable height value of the currently processed point; otherwise, return to step A23 for the next correction;
In step A2, obtaining the three-dimensional coordinates of the currently processed point in combination with the ground coordinates obtained by conversion in step A1 means obtaining the three-dimensional coordinates of the currently processed point from the ground coordinates obtained by conversion in step A1 together with the reliable height value of the currently processed point.
In the above automatic registration method of airborne LiDAR point cloud data and aerial images, after the elevation coordinate calculation module is called in step B2 and the height value of the currently processed point is interpolated from the triangle found in step B1, the height value correction module must also be called to correct the interpolated height value, as follows:
Step B21, searching the nearest survey point in the TIN: according to the interpolated height value of the currently processed point and its ground coordinates obtained by conversion in step B1, the triangle containing the currently processed point is found in the point cloud TIN constructed in step 601, and the survey point nearest to the currently processed point is found within that triangle; the survey point found is one of the survey points of the point cloud data. The height value of the survey point found is then taken as the corrected height value of the currently processed point;
Step B22, height value correction termination judgment: the corrected height value of the currently processed point from step B21 is compared with its height value before the correction in step B21; when the absolute value of the difference between the two is less than ΔH, the height value correction is judged complete, and the corrected height value is taken as the reliable height value of the currently processed point; otherwise, proceed to step B23 for the next correction;
Step B23, searching the nearest survey point in the TIN: according to the corrected height value of the currently processed point and its ground coordinates obtained by conversion in step B1, the triangle containing the currently processed point is found in the point cloud TIN constructed in step 601, and the survey point nearest to the currently processed point is found within that triangle; the survey point found is one of the survey points of the point cloud data. The height value of the survey point found is then taken as the corrected height value of the currently processed point;
Step B24, height value correction termination judgment: the corrected height value of the currently processed point from step B23 is compared with its height value after the previous correction; when the absolute value of the difference between the two is less than 0.001, the height value correction is judged complete, and the corrected height value is taken as the reliable height value of the currently processed point; otherwise, return to step B23 for the next correction;
In step B2, obtaining the three-dimensional coordinates of the currently processed point in combination with the ground coordinates obtained by conversion in step B1 means obtaining the three-dimensional coordinates of the currently processed point from the ground coordinates obtained by conversion in step B1 together with the reliable height value of the currently processed point.
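Steps A21-A24 and B21-B24 describe the same fixed-point iteration. A compact sketch, with the TIN lookup abstracted into a caller-supplied function and all names illustrative:

```python
# Hedged sketch of the iterative height correction: the interpolated
# height is repeatedly replaced by the height of the nearest survey
# point inside the containing triangle, until consecutive heights
# differ by less than dH. The triangle/nearest-point lookup is passed
# in as a function; this is not the patented module itself.

def correct_height(xy, z0, nearest_in_triangle, dH=0.001, max_iter=50):
    """Iteratively correct height z0 at planimetric position xy."""
    z = z0
    for _ in range(max_iter):
        z_new = nearest_in_triangle(xy, z)  # height of nearest survey point
        if abs(z_new - z) < dH:             # correction has converged
            return z_new
        z = z_new
    return z
```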
Compared with the prior art, the present invention has the following advantages:
1. The method steps are simple, implementation is convenient, and the input cost is low.
2. The degree of automation is high, and the registration process and the registration results are easy to control.
3. The method works well and has high practical value. The image tie point accuracy is high: the feature points extracted directly from the aerial images are fused and matched with the point cloud data, so the accuracy of the obtained tie points is guaranteed, their positional accuracy is within 1 pixel, and they are evenly distributed, meeting the tie point requirements for registering point cloud data with aerial images. Registration accuracy is thus effectively improved. Meanwhile, the registration uses the surveyed three-dimensional coordinates of multiple ground control points, so it is assisted by field-surveyed control points; the method therefore adapts to different terrain and has a wider range of application. In addition, no complicated registration model needs to be established when the point cloud data and the aerial images are automatically registered, and the amount of computation is small, so the method can effectively handle the registration of massive remote sensing data, further widening its range of application, with significant economic and social benefits.
In conclusion, the method of the present invention has simple steps, a reasonable design, convenient implementation, and good practical effect. The feature points extracted from the aerial images are matched with the point cloud data to realize the automatic registration of airborne LiDAR point cloud data and aerial images; registration accuracy is effectively improved while the amount of computation remains small, and the registration uses the field-surveyed three-dimensional coordinates of multiple ground control points, so the method is effectively applicable to the automatic registration of point cloud data and aerial images over different terrain. Without interpolating all of the point cloud data, the registration of the point cloud data with the images is robust and highly accurate.
The technical scheme of the present invention is described in further detail below with reference to the drawings and embodiments.
Detailed description of the invention
Fig. 1 is the flow block diagram of the method of the invention.
Specific embodiment
As shown in Fig. 1, the automatic registration method of airborne LiDAR point cloud data and aerial images comprises the following steps:
Step 1: point cloud data and aerial image acquisition and field survey of control points: the point cloud data of the region to be measured is acquired by the airborne LiDAR measuring system and sent to the data processing equipment; the point cloud data comprises multiple survey points of the region to be measured and the three-dimensional coordinates of each survey point;
Meanwhile, multiple ground control points are laid out in the measured region, and the three-dimensional coordinates of each ground control point are surveyed in the field, obtaining the surveyed three-dimensional coordinates of each ground control point. Using the laid-out ground control points, an aerial photogrammetric survey of the measured region is then carried out and several aerial images of the measured region are taken; the acquired aerial images are synchronously transmitted to the data processing equipment. Each aerial image is a digital, two-dimensional image;
When the aerial photogrammetric survey of the measured region is carried out, the exterior orientation elements of each aerial image are obtained and synchronously transmitted to the data processing equipment. In this step, the exterior orientation elements of each aerial image are the initial exterior orientation elements of that image;
Step 2: image preprocessing: the several aerial images from step 1 are denoised and filtered by the data processing equipment, obtaining several preprocessed aerial images;
Step 3: image matching: the image matching module is called by the data processing equipment to match the several preprocessed aerial images from step 2, obtaining all feature points that match between the several aerial images; the feature points obtained are image tie points between the several aerial images;
In this step, all feature points on each aerial image form the feature point set of that image; the feature point set of each aerial image includes the photo coordinates of all matched feature points on that image obtained in this step;
Step 4: control point photo coordinate acquisition: according to the surveyed three-dimensional coordinates of the multiple ground control points in step 1, the photo coordinate calculation module is called by the data processing equipment to calculate the photo coordinates of the multiple ground control points on each preprocessed aerial image from step 2, obtaining the photo coordinates of the multiple ground control points on each aerial image; the photo coordinates of the multiple ground control points on each aerial image are then added to the feature point set of that image from step 3, obtaining the complete feature point set of each aerial image;
Step 5: initial bundle block adjustment and exterior orientation element update: according to the initial exterior orientation elements of each aerial image from step 1, the complete feature point set of each aerial image from step 4, and the surveyed three-dimensional coordinates of the multiple ground control points in step 1, the bundle block adjustment module is called by the data processing equipment to carry out a bundle block adjustment, obtaining the adjusted exterior orientation elements of each aerial image; the data update module is then called, and the exterior orientation elements of each aerial image are updated to the adjusted exterior orientation elements of that image;
The surveyed three-dimensional coordinates of the multiple ground control points in step 1 form the control point set;
Step 6: point cloud data and aerial image feature point matching: any one of the preprocessed aerial images from step 2 is matched with the point cloud data from step 1 by the data processing equipment; that aerial image is the reference image, and the remaining preprocessed aerial images from step 2 are the images to be processed;
When the point cloud data and the reference image are matched, the process is as follows:
Step 601, TIN construction: the TIN construction module is called to construct a triangulated irregular network (TIN) from the point cloud data of step 1; the constructed TIN is the point cloud TIN;
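Step 601 can be sketched with an off-the-shelf Delaunay triangulation; scipy is one possible implementation choice, not the one mandated here. The same sketch shows the triangle location and barycentric height interpolation later used in steps A2/B2; the sample points are invented for illustration:

```python
import numpy as np
from scipy.spatial import Delaunay

# Build the point cloud TIN from the planimetric (X, Y) coordinates of
# the LiDAR survey points, locate the triangle containing a query
# position, and interpolate a height there (as in steps A2/B2).
points = np.array([[0.0, 0.0, 10.0],
                   [10.0, 0.0, 12.0],
                   [0.0, 10.0, 11.0],
                   [10.0, 10.0, 13.0],
                   [5.0, 5.0, 11.5]])     # (X, Y, Z) survey points on a plane
tin = Delaunay(points[:, :2])             # triangulate on X, Y only

query = np.array([2.0, 3.0])              # planimetric position of a point
tri = int(tin.find_simplex(query[None, :])[0])
vertices = points[tin.simplices[tri]]     # the containing triangle's vertices

# Barycentric interpolation of the height at the query position.
b = tin.transform[tri, :2] @ (query - tin.transform[tri, 2])
w = np.append(b, 1.0 - b.sum())           # barycentric weights
z = float(w @ vertices[:, 2])             # interpolated height
```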
Step 602, Harris corner extraction: the Harris corner detection module is called to extract the feature points of the aerial image, and the photo coordinates of each extracted feature point are recorded; the extracted feature points are Harris corners;
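A minimal, illustrative Harris response computation for step 602 follows. A production detector (e.g. OpenCV's cornerHarris) adds Gaussian weighting and non-maximum suppression; this sketch only shows the core R = det(M) − k·trace(M)² score over a box-averaged structure tensor, and the k value is the usual 0.04-0.06 heuristic, not fixed by the patent:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def harris_response(img, k=0.04, win=3):
    """Per-pixel Harris corner response of a grayscale 2-D array."""
    Iy, Ix = np.gradient(img.astype(float))  # row (y) and column (x) gradients
    Ixx = uniform_filter(Ix * Ix, win)       # box-averaged structure tensor
    Iyy = uniform_filter(Iy * Iy, win)
    Ixy = uniform_filter(Ix * Iy, win)
    det = Ixx * Iyy - Ixy * Ixy
    trace = Ixx + Iyy
    return det - k * trace * trace

# A synthetic image whose only corner is at (4, 4): the top-left corner
# of a bright square.
img = np.zeros((8, 8))
img[4:, 4:] = 1.0
r = harris_response(img)
```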
Step 603, TIN-based determination of the three-dimensional coordinates of the image Harris corners and ground control points: according to the point cloud TIN of step 601, the three-dimensional coordinates of all Harris corners of the aerial image from step 602 and of the multiple ground control points from step 4 are determined respectively;
When the three-dimensional coordinates of all Harris corners of the aerial image are determined, the three-dimensional coordinates of each Harris corner are determined respectively from its photo coordinates on the aerial image, obtaining the three-dimensional coordinates of multiple reliable Harris corners;
When the three-dimensional coordinates of any Harris corner of the aerial image are determined, that Harris corner is the currently processed point, and the process is as follows:
Step A1, process point validity judgment: according to the current exterior orientation elements of the aerial image and the photo coordinates of the currently processed point on it, the ground coordinate conversion module is called to convert the photo coordinates into the ground coordinates of the currently processed point; according to the converted ground coordinates, the triangle containing the currently processed point is then found in the point cloud TIN constructed in step 601, and the triangle judgment module is called to judge the triangle found: when the three edge lengths of the triangle are all less than TL and the height differences between any two of its three vertices are all less than TH, the currently processed point is judged to be a valid process point, and step A2 is entered; otherwise, the currently processed point is discarded;
Here, TL is a preset triangle edge length judgment threshold, and TH is a preset triangle vertex height difference judgment threshold; the valid process points are the reliable Harris corners;
Step A2, three-dimensional coordinate determination: the elevation coordinate calculation module is called and the height value of the currently processed point is interpolated from the triangle found in step A1; combined with the ground coordinates obtained by conversion in step A1, the three-dimensional coordinates of the currently processed point are obtained;
Step A3, steps A1 to A2 are repeated one or more times, the three-dimensional coordinates of all Harris corners of the aerial image are determined respectively, and the three-dimensional coordinates of the multiple reliable Harris corners of the aerial image are obtained;
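The triangle test shared by steps A1, B1, and C1 can be sketched as follows. Whether edge length is measured in two or three dimensions is an implementation detail the patent leaves open; this sketch uses the full 3-D edge length, and all names are illustrative:

```python
import math

# Hedged sketch of the containing-triangle test: the triangle is usable
# only when all three edges are shorter than TL and the pairwise height
# differences of its vertices are all below TH.

def triangle_is_valid(p1, p2, p3, TL, TH):
    """p1..p3 are (X, Y, Z) vertices of the containing TIN triangle."""
    pts = (p1, p2, p3)
    for i in range(3):
        a, b = pts[i], pts[(i + 1) % 3]
        if math.dist(a, b) >= TL:        # edge too long: sparse data here
            return False
        if abs(a[2] - b[2]) >= TH:       # vertex height gap too large
            return False
    return True
```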
When the three-dimensional coordinates of the multiple ground control points from step 4 are determined respectively, the three-dimensional coordinates of the multiple ground control points are determined respectively from their photo coordinates on the aerial image, obtaining the three-dimensional coordinates of multiple reliable control points;
When the three-dimensional coordinates of any one ground control point are determined, that ground control point is the currently processed point, and the process is as follows:
Step B1, process point validity judgment: according to the current exterior orientation elements of the aerial image and the photo coordinates of the currently processed point on it, the ground coordinate conversion module is called to convert the photo coordinates into the ground coordinates of the currently processed point; according to the converted ground coordinates, the triangle containing the currently processed point is then found in the point cloud TIN constructed in step 601, and the triangle judgment module is called to judge the triangle found: when the three edge lengths of the triangle are all less than TL and the height differences between any two of its three vertices are all less than TH, the currently processed point is judged to be a valid process point, and step B2 is entered; otherwise, the currently processed point is discarded;
In this step, the valid process points are the reliable control points;
Step B2, three-dimensional coordinate determination: the elevation coordinate calculation module is called and the height value of the currently processed point is interpolated from the triangle found in step B1; combined with the ground coordinates obtained by conversion in step B1, the three-dimensional coordinates of the currently processed point are obtained;
Step B3, steps B1 to B2 are repeated one or more times, the three-dimensional coordinates of the multiple ground control points from step 4 are determined respectively, and the three-dimensional coordinates of the multiple reliable control points are obtained;
Step 604, coordinate transformation matrix calculation: the control point searching module is called, and the surveyed three-dimensional coordinates of the multiple reliable control points are found among the surveyed three-dimensional coordinates of the multiple ground control points of step 1; the coordinate transformation matrix calculation module is then called, and the coordinate transformation matrix between the three-dimensional coordinates of the multiple reliable control points obtained in step B3 and their surveyed three-dimensional coordinates is calculated;
Step 605, transformation-matrix-based determination of the image Harris corner three-dimensional coordinates: according to the coordinate transformation matrix calculated in step 604, the coordinate transformation module is called to transform the three-dimensional coordinates of the multiple reliable Harris corners of the aerial image from step A3 respectively, obtaining the transformed three-dimensional coordinates of the multiple reliable Harris corners;
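Step 604 leaves the form of the coordinate transformation open. One reasonable, hedged reading is a least-squares rigid fit (rotation plus translation) between the TIN-derived and field-surveyed coordinates of the reliable control points, e.g. via the Kabsch algorithm; this sketch is not the patented module itself:

```python
import numpy as np

# Least-squares rigid transform between two corresponding 3-D point
# sets: dst ~ src @ R.T + t. Solved with the Kabsch (SVD) algorithm.

def rigid_transform(src, dst):
    """Return (R, t) minimizing ||dst - (src @ R.T + t)|| for (N, 3) arrays."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # avoid a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t
```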
Step 606, TIN-based correction of the image Harris corner three-dimensional coordinates: using the point cloud TIN constructed in step 601, the three-dimensional coordinates of the multiple reliable Harris corners obtained in step 605 are corrected respectively;
When the three-dimensional coordinates of any reliable Harris corner obtained in step 605 are corrected, that reliable Harris corner is the current correction point, and the process is as follows:
Step C1, correction judgment: according to the three-dimensional coordinates of the current correction point obtained in step 605, the triangle containing the current correction point is found in the point cloud TIN constructed in step 601, and the triangle judgment module is called to judge the triangle found: when the three edge lengths of the triangle are all less than TL and the height differences between any two of its three vertices are all less than TH, it is judged that the three-dimensional coordinates of the current correction point need to be corrected, and step C2 is entered; otherwise, no correction is needed, and the three-dimensional coordinates of the current correction point remain those obtained in step 605;
Step C2, coordinate correction: the elevation coordinate calculation module is called and the height value of the current correction point is interpolated from the triangle found in step C1; the height value in the three-dimensional coordinates of the current correction point obtained in step 605 is then replaced by the interpolated height value, obtaining the corrected three-dimensional coordinates of the current correction point;
Step C3, steps C1 to C2 are repeated one or more times, the three-dimensional coordinates of the multiple reliable Harris corners are corrected respectively, and the corrected three-dimensional coordinates of the multiple reliable Harris corners are obtained;
The corrected three-dimensional coordinates of the multiple reliable Harris corners in step C3 are the three-dimensional coordinates of the multiple reliable Harris corners of the reference image;
Step 7: calculation of feature point photo coordinates on the images to be processed: the feature point photo coordinates of each image to be processed from step 6 are calculated respectively by the data processing equipment, obtaining the photo coordinates of the multiple corresponding Harris corners of each image to be processed;
The feature point photo coordinate calculation method is the same for each image to be processed;
When the feature point photo coordinates of any image to be processed are calculated, the photo coordinate calculation module is called to calculate the photo coordinates of the multiple reliable Harris corners on that image respectively, according to the three-dimensional coordinates of the multiple reliable Harris corners of the reference image obtained in step C3 combined with the current exterior orientation elements of that image, obtaining the photo coordinates of the multiple corresponding Harris corners of that image;
Each corresponding Harris corner is the pixel of a reliable Harris corner on the image to be processed, and the pixel coordinates of each reliable Harris corner on the image to be processed are the coordinates of that pixel on the image;
Step 8: image matching: the feature point searching module is called by the data processing equipment, and the photo coordinates of the multiple reliable Harris corners of step A3 are found among the photo coordinates of the feature points extracted in step 602; the photo coordinates found are the photo coordinates of the multiple reliable Harris corners of the reference image. Then, according to the photo coordinates of the multiple reliable Harris corners of the reference image and the photo coordinates of the multiple corresponding Harris corners of each image to be processed obtained in step 7, the image matching module is called by the data processing equipment to carry out image matching, obtaining all Harris corners that match between the reference image and each image to be processed;
Each Harris corner that matches between the reference image and each image to be processed is a registration control point;
Step 9: set updates: using the data processing equipment by all registering control points obtained in step 8 Photo coordinate on the aeroplane photography image described in each width is added in step 4 the complete of aeroplane photography image described in the width In set of characteristic points, the complete characterization point set of aeroplane photography image described in updated each width is obtained;Meanwhile it will be in step 6 The three-dimensional coordinate of revised multiple reliable Harris angle points is added in the control point set at this time, is obtained more The control point set after new;
Step 10: bundle block adjustment and elements of exterior orientation update: according to aeroplane photography image described in each width at this time Elements of exterior orientation, the complete characterization point set of aeroplane photography image described in each width and the control point set at this time in step 9 It closes, using the data processing equipment and bundle block adjustment module is called to carry out bundle block adjustment, put down The elements of exterior orientation of aeroplane photography image described in each width after difference;Data update module is recalled, by aerial photography map described in each width The elements of exterior orientation of picture is updated to the elements of exterior orientation of the width aeroplane photography image after the adjustment obtained at this time, after being updated Each width described in aeroplane photography image elements of exterior orientation, complete the point cloud data and match automatically with the primary of aeroplane photography image Quasi- process;
Step 11: autoregistration terminates to judge: using the data processing equipment and numerical value comparison module is called, to step The corrected value of three angle elements is judged respectively in the elements of exterior orientation of aeroplane photography image described in each width after updating in rapid ten: The corrected value of three angle elements is respectively less than preset in the elements of exterior orientation of aeroplane photography image described in each width after update When limiting difference, it is judged as the autoregistration process for completing point cloud data and several aeroplane photography images, using the data processing Equipment exports autoregistration as a result, the autoregistration result is aeroplane photography image described in updated each width in step 10 Elements of exterior orientation;Otherwise, then using the data processing equipment carry out the judgement of autoregistration number;
When judging using data processing equipment progress autoregistration number, this is judged using the data processing equipment When completed autoregistration number whether have reached preset maximum registration number: when completed autoregistration at this time When number has reached preset maximum registration number, it is judged as that autoregistration fails, the data processing equipment is defeated at this time Automatic registration result out, the autoregistration result are autoregistration failure;Otherwise, return step six, to the point cloud data Autoregistration next time is carried out with aeroplane photography image.Wherein, the limit difference of judgement is carried out to the corrected value of each angle element It is configured according to specific needs, usage mode is flexible.Also, the limit difference is that the limit of conventional angle element corrected value is poor Value.
When Step 11 judges that the automatic registration of the point cloud data and the aerial photographic images is complete, the residual of each control point on the aerial photographic images and the discrepancy between each control point and the point cloud data must satisfy the tolerance requirement, meeting the automatic registration requirement of the point cloud data and the aerial photographic images.
The method of judging the correction values of the three angular elements in the updated exterior orientation elements of each aerial photographic image in Step 10 is a conventional judgment method. The tolerance value can be determined with reference to "Specifications for aerotriangulation of digital aerophotogrammetry" GB/T 23236-2009.
In the present embodiment, in Step A1 and Step B1, when the ground coordinate conversion module is invoked to convert the photo coordinate of the current processing point on an aerial photographic image into the ground coordinate of the current processing point, according to the current exterior orientation elements of that image, the ground coordinate (X, Y) of the current processing point on the image is calculated according to the formulas

X = XS + (Z - ZS)(a1·x + a2·y - a3·f) / (c1·x + c2·y - c3·f)
Y = YS + (Z - ZS)(b1·x + b2·y - b3·f) / (c1·x + c2·y - c3·f)

where (XS, YS, ZS) is the coordinate of the projection center in the current exterior orientation elements of the image; f is a parameter of the interior orientation elements of the image, namely the focal length of the aerial survey camera used for the aerophotogrammetry of the surveyed region in Step 1; (x, y) is the photo coordinate of the current processing point on the image in this step; Z is the mean terrain height of the surveyed region; and a1, a2, a3, b1, b2, b3, c1, c2, c3 are the elements of the rotation matrix of the image,

R = [a1 a2 a3; b1 b2 b3; c1 c2 c3],

determined by the three angular elements of the exterior orientation.
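The conversion above can be sketched in code; the phi-omega-kappa angle convention used to build the rotation matrix is an assumption for illustration (the embodiment does not fix one), and the numeric test values are made up.

```python
import math

# Sketch of the photo-to-ground conversion, assuming a phi-omega-kappa rotation.

def rotation_matrix(phi, omega, kappa):
    """Direction-cosine matrix R = [[a1,a2,a3],[b1,b2,b3],[c1,c2,c3]]."""
    sp, cp = math.sin(phi), math.cos(phi)
    so, co = math.sin(omega), math.cos(omega)
    sk, ck = math.sin(kappa), math.cos(kappa)
    return [
        [cp*ck - sp*so*sk, -cp*sk - sp*so*ck, -sp*co],
        [co*sk,             co*ck,            -so],
        [sp*ck + cp*so*sk, -sp*sk + cp*so*ck,  cp*co],
    ]

def photo_to_ground(x, y, f, XS, YS, ZS, R, Z):
    """Inverse collinearity: ground (X, Y) of a photo point at terrain height Z."""
    (a1, a2, a3), (b1, b2, b3), (c1, c2, c3) = R
    denom = c1*x + c2*y - c3*f
    X = XS + (Z - ZS) * (a1*x + a2*y - a3*f) / denom
    Y = YS + (Z - ZS) * (b1*x + b2*y - b3*f) / denom
    return X, Y

# With zero rotation angles the principal point (x=0, y=0) maps to (XS, YS).
R = rotation_matrix(0.0, 0.0, 0.0)
print(photo_to_ground(0.0, 0.0, 0.15, 500.0, 800.0, 1200.0, R, 200.0))
# -> (500.0, 800.0)
```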
At present, when registering an aerial image (i.e., an aerial photographic image) with point cloud data, existing methods all match the aerial image directly against the point cloud. However, because point cloud data is discrete while an aerial image is continuous, finding corresponding points or corresponding lines is difficult, and even when they are found their accuracy is low.
In addition, when existing methods register aerial images with point cloud data, the image tie points used are only the photo coordinates of the control points; during control point registration, however, errors in the exterior orientation elements, errors in the point cloud data, and errors of the registration model lower the accuracy of the photo tie points (also called image tie points) and often make their distribution uneven, reducing registration accuracy. Here, the control points are the ground control points laid out in advance in the surveyed region.
In the present invention, the registration control points found in Step 8 serve as the image tie points.
When the point cloud data and the aerial photographic images are automatically registered according to the methods of Step 6 to Step 10, the image tie points used are directly the feature points (i.e., Harris corners) extracted from the aerial photographic images, and the extracted feature points are fusion-matched with the point cloud data according to the methods of Step 6 to Step 8, so the accuracy of the obtained tie points is guaranteed. Thus, in the present invention the image tie points are the matched feature points of the aerial photographic images; their accuracy is high, their positional accuracy is within 1 pixel, and they are evenly distributed, satisfying the tie point requirement for registering point cloud data with aerial images.
In addition, when the extracted feature points are fusion-matched with the point cloud data in Step 6 to Step 8, the surveyed three-dimensional coordinates of the multiple ground control points from Step 1 are used at the same time, which further reduces errors and further improves registration accuracy.
Therefore, when the point cloud data and the aerial photographic images are automatically registered according to the methods of Step 6 to Step 10, these steps form a set of registration adjustment methods that run automatically, are robust, and achieve high registration accuracy, thoroughly changing the existing registration methods for aerial images (i.e., aerial photographic images) and point cloud data and effectively overcoming their various problems.
On the other hand, the registration model used in the existing registration methods of point cloud data and aerial images is only applicable to urban areas, and its control points are unevenly distributed, which reduces registration accuracy.
In contrast, when the present invention registers the point cloud data and the aerial photographic images according to the methods of Step 6 to Step 10, the surveyed three-dimensional coordinates of multiple ground control points assist the registration, so the method is adaptable to different terrains and has a wider scope of application.
In addition, when the present invention automatically registers the point cloud data and the aerial photographic images, no complex registration model needs to be established and the computational cost is small, so the method can be effectively applied to the registration of massive remote sensing data, further widening the scope of application.
By contrast, the registration models used by existing registration methods of point cloud data and aerial images are complicated and computationally expensive, and can only register small amounts of data.
In summary, the registration method used by the present invention achieves very high registration accuracy, good robustness, and wide applicability.
In the present embodiment, TL = 3 m and TH = 1 m in Step A1.
In actual use, the values of TL and TH are adjusted as needed.
In the present embodiment, when the next automatic registration pass is performed on the point cloud data and the aerial photographic images in Step 11, the registration is carried out according to the methods of Step 6 to Step 10, after which the method proceeds to Step 11 for the termination judgment.
For convenience of operation, before image preprocessing is carried out in Step 2, the automatic registration counter N is initialized with the data processing equipment; at this point, N = 0;
after one automatic registration pass of the point cloud data and the aerial photographic images is completed in Step 10, the data processing equipment increments the current counter N by 1;
in Step 11, the preset maximum registration number is denoted Nmax, where Nmax is a positive integer and Nmax ≥ 3;
when the data processing equipment judges in Step 11 whether the number of completed automatic registration passes has reached the preset maximum registration number, the current N is compared with Nmax: when N ≥ Nmax, the number of completed passes is judged to have reached the preset maximum registration number; otherwise, it is judged not to have reached it.
When bundle block adjustment is actually performed in Step 5 and Step 10, the data processing equipment carries out the adjustment according to the conventional bundle block adjustment method, and the adjustment yields the exterior orientation elements of each aerial photographic image (i.e., the adjusted exterior orientation elements of each aerial photographic image); practical operation is thus simple, the operational difficulty is effectively reduced, and implementation is easy.
In the present embodiment, in Step 5 and Step 10 the data processing equipment invokes the bundle block adjustment module to perform a POS-aided bundle block adjustment; the adjustment method used is the conventional POS-aided bundle block adjustment method.
In the present embodiment, the aerophotogrammetry of the surveyed region in Step 1 is carried out with an aerophotogrammetric POS system;
the exterior orientation elements of each aerial photographic image are those obtained by the POS system during the aerophotogrammetry of the surveyed region in Step 1.
The POS system is a conventional aerophotogrammetric POS system: an aerial photographic navigation system that directly measures the orientation elements of aerial photographs using a Global Positioning System (GPS) receiver and an Inertial Measurement Unit (IMU); POS is an abbreviation of the English word Position.
To further improve registration accuracy, before the photo coordinates of the feature points of the images to be processed are calculated in Step 7, the data processing equipment first performs a corner validity judgment on each of the multiple reliable Harris corners of the reference image obtained in Step C3;
the corner validity judgment method is the same for all reliable Harris corners;
when the corner validity judgment is performed on any reliable Harris corner of the reference image, the photo coordinate calculation module is invoked to calculate, from the three-dimensional coordinate of the reliable Harris corner obtained in Step C3 and the current exterior orientation elements of the reference image, the photo coordinate of the reliable Harris corner on the reference image, denoted (x*, y*); the data processing equipment then looks up the pixel coordinate of this reliable Harris corner among the photo coordinates of the multiple reliable Harris corners of the reference image in Step 8, denoted (x^, y^); afterwards the numerical calculation module is invoked to compute the photo position deviation Δr = sqrt(Δx² + Δy²) of the reliable Harris corner, where Δx = x* − x^ and Δy = y* − y^; the difference comparison module is then invoked to judge whether Δr is less than Δt: when Δr < Δt, the reliable Harris corner is judged to be a valid corner; otherwise, it is judged to be an invalid corner;
where Δt is a preset photo position deviation judgment threshold;
before the feature point photo coordinates of any image to be processed are calculated in Step 7, the data processing equipment rejects all invalid corners from the multiple reliable Harris corners of the reference image obtained in Step C3;
when the feature point photo coordinates of any image to be processed are calculated in Step 7, the multiple reliable Harris corners of the reference image are the valid corners;
each corresponding Harris corner is the pixel of a valid corner on the image to be processed;
the reliable Harris corners added to the current control point set in Step 9 are the valid corners.
Here Δt ≤ 5 pixels; Δt may be 1, 2, 3, 4, or 5 pixels. In the present embodiment, Δt is 3 pixels. In actual use, the value of Δt is adjusted as needed.
In the present embodiment, when the data processing equipment invokes the image matching module to perform image matching in Step 8, correlation coefficients are calculated between the multiple reliable Harris corners of the reference image and the multiple corresponding Harris corners of each image to be processed obtained in Step 7, and all registration control points matching between the reference image and each image to be processed are found from the correlation coefficient results;
when correlation coefficients are calculated between the multiple reliable Harris corners of the reference image and the multiple corresponding Harris corners of any image to be processed, each reliable Harris corner of the reference image is correlated with the corresponding Harris corners of that image to be processed, and all Harris corners matching between the reference image and that image are found from the correlation coefficient results;
after the correlation coefficient calculations between the reference image and every image to be processed are completed, all Harris corners matching between the reference image and each image to be processed are obtained; the matched Harris corners found are the registration control points.
Therefore, when the image matching module is invoked in Step 8, matching by the correlation coefficient method finds all matching Harris corners between the reference image and each image to be processed simply and quickly, and implementation is easy: only the Harris corner matching between the reference image and each image to be processed needs to be completed, without matching every pair of aerial photographic images, which guarantees matching accuracy while effectively reducing the amount of computation.
When the image matching module is invoked in Step 8, the data processing equipment also applies least squares correction to the remaining images to be processed with respect to the reference image, so that the registration control points matching between the reference image and each image to be processed are found from the correlation coefficient results, realizing the least squares correction of the aerial photographic images.
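The correlation coefficient used for matching above is the standard normalized correlation between two image windows; this sketch computes it over toy 1-D windows (the 9-element windows and values are illustrative, not from the embodiment).

```python
# Sketch of correlation-coefficient matching: normalized correlation of two
# equally sized windows; a high value indicates a match.

def correlation_coefficient(a, b):
    n = len(a)
    ma = sum(a) / n
    mb = sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / ((va * vb) ** 0.5)

ref_window = [10, 20, 30, 40, 50, 60, 70, 80, 90]
same       = [12, 22, 32, 42, 52, 62, 72, 82, 92]   # same texture, brighter
other      = [90, 10, 70, 20, 60, 30, 50, 40, 80]   # unrelated texture
print(round(correlation_coefficient(ref_window, same), 3))   # -> 1.0
print(correlation_coefficient(ref_window, other) > 0.9)      # -> False
```

Normalized correlation is invariant to the brightness offset between the two windows, which is why the shifted window still scores 1.0.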
In the present embodiment, when the coordinate transformation matrix computation module is invoked in Step 604 to compute, from the three-dimensional coordinates of the multiple reliable control points obtained in Step B3 and the surveyed three-dimensional coordinates of those reliable control points, the coordinate transformation matrix, the transformation matrix is computed using the Iterative Closest Point (ICP) algorithm.
In actual use, the coordinate transformation matrix may also be computed by other methods.
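ICP alternates nearest-neighbour pairing with a closed-form rigid alignment of the paired points. The sketch below shows only the rigid-alignment step, in 2-D and under a known pairing, which is a deliberate simplification of the full 3-D ICP the embodiment uses; all coordinates are made up.

```python
import math

# Closed-form 2-D rigid alignment (the inner step of ICP): recover the
# rotation angle and translation mapping src points onto dst points.

def rigid_align_2d(src, dst):
    n = len(src)
    csx = sum(p[0] for p in src) / n; csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n; cdy = sum(p[1] for p in dst) / n
    s = [(x - csx, y - csy) for x, y in src]   # centered source
    d = [(x - cdx, y - cdy) for x, y in dst]   # centered destination
    num = sum(sx * dy - sy * dx for (sx, sy), (dx, dy) in zip(s, d))
    den = sum(sx * dx + sy * dy for (sx, sy), (dx, dy) in zip(s, d))
    theta = math.atan2(num, den)               # best-fit rotation angle
    c, si = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - si * csy)
    ty = cdy - (si * csx + c * csy)
    return theta, (tx, ty)

# A triangle rotated by 90 degrees and translated by (5, 7); both recovered.
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)]
dst = [(5.0, 7.0), (5.0, 8.0), (3.0, 7.0)]
theta, t = rigid_align_2d(src, dst)
print(round(math.degrees(theta), 1), tuple(round(v, 6) for v in t))
# -> 90.0 (5.0, 7.0)
```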
The exterior orientation elements are the parameters that determine the spatial position and attitude of the photographic bundle at the moment of exposure, on the basis that the interior orientation elements (and hence the photographic bundle) have been restored. The exterior orientation elements of one photograph comprise six parameters: three linear elements, describing the spatial coordinates of the projection center, and three angular elements, describing the spatial attitude of the photograph.
The correction value of each angular element in the updated exterior orientation elements of each aerial photographic image in Step 10 is the absolute value of the difference between the value of that angular element in the exterior orientation elements of the image before the update in Step 10 and its value after the update in Step 10.
To further improve the accuracy of the three-dimensional coordinates of the ground control points, and thereby further improve the tie point accuracy and registration accuracy, in the present embodiment, after the elevation coordinate computation module is invoked in Step A2 to interpolate the height value of the current processing point from the triangle found in Step A1, the height value correction module is also invoked to correct the interpolated height value; the process is as follows:
Step A21, searching the nearest survey point in the triangulation network: according to the interpolated height value of the current processing point and the ground coordinate of the current processing point converted in Step A1, the triangle containing the current processing point is found in the point cloud triangulation network constructed in Step 601, and the survey point nearest to the current processing point within that triangle is found; the found survey point is one survey point of the surveyed point cloud data; the height value of the found survey point is then taken as the corrected height value of the current processing point;
Step A22, height value correction termination judgment: the height value of the current processing point after correction in Step A21 is compared with its height value before correction in Step A21; when the absolute value of the difference is less than ΔH, the height value correction is judged complete, and the corrected height value is taken as the reliable height value of the current processing point; otherwise, the method proceeds to Step A23 for the next correction;
where ΔH is a preset height difference judgment threshold;
Step A23, searching the nearest survey point in the triangulation network: according to the current corrected height value and the ground coordinate of the current processing point converted in Step A1, the triangle containing the current processing point is found in the point cloud triangulation network constructed in Step 601, and the survey point nearest to the current processing point within that triangle is found; the found survey point is one survey point of the surveyed point cloud data; its height value is taken as the corrected height value of the current processing point;
Step A24, height value correction termination judgment: the height value of the current processing point after correction in Step A23 is compared with its height value after the previous correction; when the absolute value of the difference is less than 0.001, the height value correction is judged complete, and the corrected height value is taken as the reliable height value of the current processing point; otherwise, the method returns to Step A23 for the next correction;
in Step A2, the ground coordinate of the current processing point converted in Step A1 is combined with the reliable height value of the current processing point to obtain the three-dimensional coordinate of the current processing point.
In the present embodiment, ΔH = 0.001.
In actual use, the value of ΔH is adjusted as needed.
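The correction loop of Steps A21 to A24 can be sketched as follows; for brevity the nearest survey point is searched over a flat point list rather than within a triangulation network, and the point cloud, interpolated height, and thresholds are illustrative.

```python
import math

# Simplified sketch of the height value correction loop (Steps A21-A24):
# repeatedly replace the height with that of the nearest survey point until
# the change falls below the threshold.

def nearest_point_height(ground_xy, points):
    """Height of the survey point nearest to ground_xy; points are (x, y, z)."""
    px, py = ground_xy
    nearest = min(points, key=lambda p: math.hypot(p[0] - px, p[1] - py))
    return nearest[2]

def correct_height(ground_xy, interpolated_z, points, delta_h=0.001, max_iter=50):
    z = interpolated_z
    for _ in range(max_iter):
        z_new = nearest_point_height(ground_xy, points)
        if abs(z_new - z) < delta_h:
            return z_new          # reliable height value
        z = z_new                 # keep correcting
    return z

cloud = [(0.0, 0.0, 10.0), (1.0, 0.0, 10.4), (0.0, 1.0, 10.2)]
print(correct_height((0.1, 0.1), 12.5, cloud))   # -> 10.0
```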
Correspondingly, to further improve the accuracy of the three-dimensional coordinates of the feature points, and thereby further improve the tie point accuracy and registration accuracy, in the present embodiment, after the elevation coordinate computation module is invoked in Step B2 to interpolate the height value of the current processing point from the triangle found in Step B1, the height value correction module is also invoked to correct the interpolated height value; the process is as follows:
Step B21, searching the nearest survey point in the triangulation network: according to the interpolated height value of the current processing point and the ground coordinate of the current processing point converted in Step B1, the triangle containing the current processing point is found in the point cloud triangulation network constructed in Step 601, and the survey point nearest to the current processing point within that triangle is found; the found survey point is one survey point of the surveyed point cloud data; the height value of the found survey point is then taken as the corrected height value of the current processing point;
Step B22, height value correction termination judgment: the height value of the current processing point after correction in Step B21 is compared with its height value before correction in Step B21; when the absolute value of the difference is less than ΔH, the height value correction is judged complete, and the corrected height value is taken as the reliable height value of the current processing point; otherwise, the method proceeds to Step B23 for the next correction;
Step B23, searching the nearest survey point in the triangulation network: according to the current corrected height value and the ground coordinate of the current processing point converted in Step B1, the triangle containing the current processing point is found in the point cloud triangulation network constructed in Step 601, and the survey point nearest to the current processing point within that triangle is found; the found survey point is one survey point of the surveyed point cloud data; its height value is taken as the corrected height value of the current processing point;
Step B24, height value correction termination judgment: the height value of the current processing point after correction in Step B23 is compared with its height value after the previous correction; when the absolute value of the difference is less than 0.001, the height value correction is judged complete, and the corrected height value is taken as the reliable height value of the current processing point; otherwise, the method returns to Step B23 for the next correction;
in Step B2, the ground coordinate of the current processing point converted in Step B1 is combined with the reliable height value of the current processing point to obtain the three-dimensional coordinate of the current processing point.
In the present embodiment, the point cloud data of the region to be surveyed is acquired in Step 1 with an airborne LiDAR measuring system, using a conventional point cloud data acquisition method. The airborne LiDAR measuring system communicates bidirectionally with the data processing equipment, and the POS system communicates bidirectionally with the data processing equipment.
The point cloud data in Step 1 is the point cloud obtained after gross error elimination of the point cloud acquired over the region to be surveyed by the airborne LiDAR measuring system; the gross error elimination method used is a conventional one.
The resolutions of the several aerial photographic images in Step 1 are consistent and their image sizes identical; the aerial photographic images and the point cloud data may be acquired in the same period or in different periods.
When aerophotogrammetry of the surveyed region is carried out in Step 1, the aerial survey camera parameters of the surveyed region (including the initial exterior orientation elements of each aerial photographic image), the ground sample distance (GSD), the designed flight height, and the datum height (i.e., the mean terrain height of the surveyed region) are also obtained synchronously.
In the present embodiment, the number of ground control points in Step 1 is not less than 5.
Cameras used in aerophotogrammetry are divided into metric cameras and non-metric cameras; for a non-metric camera, the images suffer radial and tangential lens distortion, which must be eliminated before the images are used, i.e., the aerial images (i.e., aerial photographic images) must undergo distortion correction.
Before image preprocessing is carried out in Step 2, the data processing equipment also performs distortion correction on each of the aerial photographic images; the distortion correction method used is conventional, and the correction is carried out according to the acquired aerial survey camera parameters.
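The distortion correction mentioned above is commonly expressed with the Brown-Conrady model; the sketch below applies it to a single normalized image point. The coefficient names (k1, k2, p1, p2) follow that model by assumption and are not taken from this embodiment.

```python
# Sketch of radial/tangential distortion correction (Brown-Conrady model)
# for one normalized image point; coefficients are illustrative.

def undistort_point(x, y, k1=0.0, k2=0.0, p1=0.0, p2=0.0):
    r2 = x * x + y * y
    radial = k1 * r2 + k2 * r2 * r2          # radial distortion term
    dx = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    dy = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x - dx, y - dy                    # corrected coordinates

# With all coefficients zero (a distortion-free metric camera) the point
# is unchanged.
print(undistort_point(0.1, -0.2))   # -> (0.1, -0.2)
```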
During image preprocessing in Step 2, the several aerial photographic images from Step 1 are denoised and filtered according to conventional denoising and filtering methods.
In the present embodiment, because the image noise may be large, Gaussian smoothing is applied to the aerial photographic images when they are denoised; the Gaussian smoothing is performed with a w × w window (w ≥ 3).
When the several aerial photographic images from Step 1 are filtered, Wallis filtering is applied to the Gaussian-smoothed aerial photographic images.
The Wallis filter is a special filter that enhances the contrast of the original image while suppressing noise; in particular, it greatly enhances image texture patterns at different scales, so that when point features are extracted from the image both their number and their accuracy, and hence the reliability and accuracy of the matching results, are improved.
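The Wallis transform remaps each pixel so that a local window attains a target mean and standard deviation; the sketch below applies it to a 1-D strip with a single global window for brevity, with illustrative target values and gains.

```python
# Compact sketch of the Wallis transform: stretch a low-contrast window to a
# target mean and standard deviation.

def wallis(values, target_mean=127.0, target_std=40.0, c=1.0, b=1.0):
    n = len(values)
    mean = sum(values) / n
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    gain = c * target_std / (c * std + (1 - c) * target_std)
    offset = b * target_mean + (1 - b) * mean
    return [(v - mean) * gain + offset for v in values]

strip = [100, 102, 98, 101, 99]          # low-contrast texture
out = wallis(strip)
print(round(sum(out) / len(out), 1))     # -> 127.0
```

With c = 1 and b = 1 the output exactly attains the target statistics; intermediate values blend the local and target statistics, which is how the filter avoids over-amplifying noise.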
When image matching is carried out in Step 3, the data processing equipment first invokes the image pyramid generation module to generate the image pyramid data of each aerial photographic image; the image pyramid data obtained for each aerial photographic image contains L pyramid image levels, where L is a positive integer and L ≥ 3.
When image matching is carried out in Step 3, the L pyramid image levels of the several aerial photographic images are matched level by level, and the image matching method used when matching them level by level is a conventional one.
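Pyramid generation can be sketched as repeated downsampling; the tiny 4 × 4 "image" and the 2 × 2 averaging kernel are illustrative (real implementations smooth before downsampling).

```python
# Minimal sketch of image pyramid generation: each level halves the previous
# one by 2x2 averaging.

def downsample(img):
    h, w = len(img), len(img[0])
    return [[(img[r][c] + img[r][c+1] + img[r+1][c] + img[r+1][c+1]) / 4.0
             for c in range(0, w, 2)]
            for r in range(0, h, 2)]

def build_pyramid(img, levels):
    """Return [level 0 (original), level 1, ..., level `levels`]."""
    pyramid = [img]
    for _ in range(levels):
        pyramid.append(downsample(pyramid[-1]))
    return pyramid

img = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 10, 11, 12],
       [13, 14, 15, 16]]
pyr = build_pyramid(img, 2)
print(len(pyr), pyr[1], pyr[2])
# -> 3 [[3.5, 5.5], [11.5, 13.5]] [[8.5]]
```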
In the present embodiment, image matching in Step 3 is carried out with the data processing equipment as follows:
Step 301, SIFT feature are extracted: using the L layer of data processing equipment aeroplane photography image described in each width Pyramid image carries out SIFT feature extraction respectively, obtains all of the L layer pyramid image of aeroplane photography image described in each width The SIFT feature of characteristic point and each characteristic point description;
Step 302, L layers of pyramid image Data Matching: L layers of pyramid shadow of several aeroplane photography images As carrying out characteristic matching;
When carrying out characteristic matching L layers of pyramid image of several aeroplane photography images, respectively described in every width The characteristic point building kd tree of aeroplane photography image simultaneously carries out bidirectional research with remaining each width aeroplane photography image, obtains described in several The characteristic point being mutually matched between L layers of pyramid image of aeroplane photography image;L layers of pyramid image are shadow to be matched Picture;
Wherein, kd tree is constructed with the characteristic point of image to be matched described in every width and is carried out with remaining each image to be matched double To search when, using this image to be matched initial elements of exterior orientation by the characteristic point of this image to be matched go to it is other to Match image on, if turn point within the scope of the photo of remaining image to be matched, between two images to be matched into Row bidirectional research obtains reliable match point.
Described in any two width when image progress bidirectional research to be matched, first image of image difference to be matched described in two width With the second image, process is as follows:
Step D, it matches for the first time, comprising the following steps:
Step D1, kd tree is constructed: using second image as benchmark image to be matched, and by second image The SIFT feature description son building kd tree of all characteristic points;
Step D2, characteristic matching: using kd tree constructed in step D1, and use k-nearest neighbor in step 2 to mentioning All characteristic points of first image taken SIFT feature description son carry out characteristic matching, find out in second image with All characteristic points of first Image Matching with the characteristic point of first Image Matching are to match in second image Point;
Step E, it matches for second, comprising the following steps:
Step E1, kd tree is constructed: using first image as benchmark image to be matched, and by first image The SIFT feature description son building kd tree of all characteristic points;
Step E2, characteristic matching: using kd tree constructed in step D1, and using k-nearest neighbor in step D2 Characteristic matching is carried out with all match points obtained, finds out all features in first image with second Image Matching Point with the characteristic point of second Image Matching is reliable matching point in first image;
The bidirectional search between each image to be matched and every other image to be matched is completed according to the method of steps D and E, yielding the reliable matching points between each image to be matched and the remaining images to be matched, and thus all reliable matching points of the level-L pyramid images;
Then, using the data processing equipment and the RANSAC algorithm, gross errors are removed from all reliable matching points of the level-L pyramid images obtained by matching, yielding all reliable matching points of the level-L pyramid images after rejection and guaranteeing the correctness of the final matching result;
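The bidirectional kd-tree search of steps D and E amounts to mutual nearest-neighbour matching of SIFT descriptors: a pair is kept only if each point is the other's nearest neighbour. The following is a minimal Python sketch of that idea, not the patented implementation; the random 128-dimensional vectors stand in for real SIFT descriptors, and `mutual_matches` is an illustrative name.

```python
import numpy as np
from scipy.spatial import cKDTree

def mutual_matches(desc_a, desc_b):
    """Keep pair (i, j) only if j is i's nearest neighbour in B AND
    i is j's nearest neighbour in A (bidirectional search)."""
    _, ab = cKDTree(desc_b).query(desc_a, k=1)  # forward search: A -> B
    _, ba = cKDTree(desc_a).query(desc_b, k=1)  # backward search: B -> A
    return [(i, j) for i, j in enumerate(ab) if ba[j] == i]

rng = np.random.default_rng(0)
desc_a = rng.normal(size=(50, 128)).astype(np.float32)  # stand-in "SIFT" descriptors
perm = rng.permutation(50)
# image B: the same descriptors, shuffled and slightly perturbed
desc_b = desc_a[perm] + 0.01 * rng.normal(size=desc_a.shape)
matches = mutual_matches(desc_a, desc_b)  # each pair maps a point to its copy
```

In practice the surviving pairs would then be passed to RANSAC for gross-error rejection as in step 302.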
Step 303, level L-1 pyramid image matching: transfer all reliable matching points of the level-L pyramid images after rejection in step 302 onto the level L-1 pyramid images of the aerial photography images and perform correlation coefficient calculation there; reject every reliable matching point whose correlation coefficient is not greater than 0.9, obtaining all reliable matching points of the level L-1 pyramid images after rejection;
Step 304, next-level pyramid image matching: transfer all reliable matching points of the previous pyramid level after rejection onto the current pyramid level of the aerial photography images and perform correlation coefficient calculation there; reject every reliable matching point whose correlation coefficient is not greater than 0.9, obtaining all reliable matching points of the current pyramid level after rejection;
Step 305, repeat step 304 one or more times until the image matching process of the original image level (i.e. the level-0 pyramid images) of the aerial photography images is completed, obtaining all reliable matching points of the original aerial photography images after rejection;
All reliable matching points of the original aerial photography images obtained after rejection are the image connection points mutually matched among the aerial photography images in step 3.
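Steps 303 to 305 keep a transferred match only when its correlation coefficient exceeds 0.9. A minimal sketch of the normalised cross-correlation coefficient such a check could use (the text does not specify the exact correlation measure, so this is an assumption):

```python
import numpy as np

def ncc(patch_a, patch_b):
    """Normalised cross-correlation coefficient of two image patches."""
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom)

rng = np.random.default_rng(1)
patch = rng.random((11, 11))
same = 0.95 * patch + 0.1           # linearly related patch -> coefficient near 1
other = rng.random((11, 11))        # unrelated patch -> coefficient near 0
keep = ncc(patch, same) > 0.9       # survives the 0.9 threshold of steps 303/304
drop = ncc(patch, other) > 0.9      # rejected
```

The 11 x 11 patch size is illustrative; real correlation windows would be centred on each transferred matching point.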
In the present embodiment, L is a positive integer and L < 10.
In actual use, the value of L can be adjusted as needed.
In the present embodiment, after the photo coordinates of the multiple ground control points on each aerial photography image are obtained in step 4, the photo coordinates of each ground control point on each aerial photography image are further adjusted using the data processing equipment according to the station description file of that ground control point, so that the photo coordinates of each ground control point on each aerial photography image are more accurate.
In the present embodiment, when Harris corner extraction is performed in step 602, the image is divided into rectangular blocks of m × m pixels (0 < m < 100), and Harris corners are extracted within each rectangular block.
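A block-wise Harris extraction as in this embodiment can be sketched as follows: partition the image into m × m blocks and keep the strongest positive Harris response in each. This is an illustrative NumPy/SciPy sketch, not the detection module the patent calls; the 3 × 3 box window and k = 0.04 are assumed values.

```python
import numpy as np
from scipy.ndimage import convolve

def harris_response(img, k=0.04):
    """Harris corner response R = det(M) - k * trace(M)^2, with the
    structure tensor M smoothed by a 3x3 box window (assumed values)."""
    iy, ix = np.gradient(img.astype(float))
    box = np.ones((3, 3)) / 9.0
    sxx = convolve(ix * ix, box)
    syy = convolve(iy * iy, box)
    sxy = convolve(ix * iy, box)
    return sxx * syy - sxy ** 2 - k * (sxx + syy) ** 2

def blockwise_corners(img, m=16):
    """Partition the image into m x m blocks (0 < m < 100) and keep the
    strongest positive Harris response in each block."""
    r = harris_response(img)
    corners = []
    for r0 in range(0, img.shape[0], m):
        for c0 in range(0, img.shape[1], m):
            block = r[r0:r0 + m, c0:c0 + m]
            dr, dc = np.unravel_index(np.argmax(block), block.shape)
            if block[dr, dc] > 0:   # positive response indicates a corner
                corners.append((r0 + dr, c0 + dc))
    return corners

img = np.zeros((64, 64))
img[20:, 20:] = 1.0   # one bright square -> a single corner near (20, 20)
pts = blockwise_corners(img, m=16)
```

Block-wise extraction keeps the corners evenly distributed over the image, which benefits the subsequent bundle adjustment.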
The above are only preferred embodiments of the present invention and are not intended to limit the invention in any way. Any simple modification, change or equivalent structural variation made to the above embodiments according to the technical essence of the present invention still falls within the protection scope of the technical solution of the present invention.

Claims (10)

1. An automatic registration method for airborne LiDAR point cloud data and aerial images, characterized in that the method comprises the following steps:
Step 1, point cloud data and aerial image acquisition and field survey of control points: acquire the point cloud data of the region to be measured with an airborne LiDAR measuring system and send the acquired point cloud data to a data processing equipment; the point cloud data contains multiple measuring points in the region to be measured and the three-dimensional coordinates of each measuring point;
Meanwhile, lay out multiple ground control points in the measured region and field-survey the three-dimensional coordinates of each ground control point to obtain the surveyed three-dimensional coordinates of each ground control point; then, using the laid-out ground control points, perform aerial photogrammetry of the measured region, take several aerial photography images of the measured region, and synchronously send the acquired aerial photography images to the data processing equipment; each aerial photography image is a digital image and a two-dimensional image;
When performing aerial photogrammetry of the measured region, obtain the exterior orientation elements of each aerial photography image and synchronously transmit them to the data processing equipment; in this step, the exterior orientation elements of each aerial photography image are the initial exterior orientation elements of that image;
Step 2, image preprocessing: denoise and filter each of the aerial photography images of step 1 using the data processing equipment, obtaining the preprocessed aerial photography images;
Step 3, image matching: using the data processing equipment and calling an image matching module, perform image matching on the aerial photography images preprocessed in step 2, obtaining all feature points mutually matched among the aerial photography images; the obtained feature points are the image connection points mutually matched among the aerial photography images;
In this step, all obtained feature points on each aerial photography image form the feature point set of that image; the feature point set of each aerial photography image includes the photo coordinates of all feature points obtained by matching on that image in this step;
Step 4, control point photo coordinate acquisition: according to the surveyed three-dimensional coordinates of the multiple ground control points of step 1, using the data processing equipment and calling a photo coordinate calculation module, calculate the photo coordinates of the multiple ground control points on each aerial photography image preprocessed in step 2, obtaining the photo coordinates of the multiple ground control points on each aerial photography image; then add the photo coordinates of the multiple ground control points on each aerial photography image to the feature point set of that image in step 3, obtaining the complete feature point set of each aerial photography image;
Step 5, initial bundle block adjustment and exterior orientation element update: according to the initial exterior orientation elements of each aerial photography image of step 1, the complete feature point set of each aerial photography image of step 4 and the surveyed three-dimensional coordinates of the multiple ground control points of step 1, perform bundle block adjustment using the data processing equipment and a bundle block adjustment module, obtaining the adjusted exterior orientation elements of each aerial photography image; then call a data update module to update the exterior orientation elements of each aerial photography image to the adjusted exterior orientation elements obtained at this time;
The surveyed three-dimensional coordinates of the multiple ground control points of step 1 form the control point set;
Step 6, feature point matching between the point cloud data and the aerial images: using the data processing equipment, match the point cloud data of step 1 against any one aerial photography image preprocessed in step 2, which serves as the reference image; the remaining preprocessed aerial photography images other than the reference image are the images to be processed;
The matching between the point cloud data and the reference image proceeds as follows:
Step 601, triangulation network construction: call a triangulation network construction module to construct a triangulation network from the point cloud data of step 1; the constructed triangulation network is the point cloud triangulation network;
Step 602, Harris corner extraction: call a Harris corner detection module to extract the feature points of the reference image and record the photo coordinates of each extracted feature point; the extracted feature points are Harris corners;
Step 603, triangulation-network-based determination of the three-dimensional coordinates of image Harris corners and ground control points: according to the point cloud triangulation network of step 601, determine the three-dimensional coordinates of all Harris corners of the reference image from step 602 and of the multiple ground control points of step 4, respectively;
When the three-dimensional coordinates of all Harris corners of the reference image are determined, the three-dimensional coordinate of each Harris corner is determined from its photo coordinates on the reference image, yielding the three-dimensional coordinates of multiple reliable Harris corners;
Wherein, when the three-dimensional coordinate of any Harris corner of the reference image is determined, that Harris corner is the currently processed point, and the process is as follows:
Step A1, validity judgement of the processed point: according to the current exterior orientation elements of the reference image and the photo coordinates of the currently processed point on the reference image, call a ground coordinate conversion module to convert and obtain the ground coordinates of the currently processed point; then, according to the converted ground coordinates, find the triangle containing the currently processed point in the point cloud triangulation network constructed in step 601, and call a triangle judgement module to judge the found triangle: when all three side lengths of the found triangle are smaller than TL and the height differences between any two of its three vertices are all smaller than TH, the currently processed point is judged to be a valid processed point and step A2 is entered; otherwise, the currently processed point is discarded;
Wherein TL is a preset triangle side length judgement threshold and TH is a preset triangle vertex height difference judgement threshold; a valid processed point is a reliable Harris corner;
Step A2, three-dimensional coordinate determination: call an elevation coordinate calculation module to interpolate the height value of the currently processed point from the triangle found in step A1; combined with the ground coordinates converted in step A1, this yields the three-dimensional coordinate of the currently processed point;
Step A3, repeat steps A1 to A2 one or more times so that the three-dimensional coordinates of all Harris corners of the reference image are determined, obtaining the three-dimensional coordinates of the multiple reliable Harris corners of the reference image;
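Steps A1 and A2 can be illustrated with a small sketch: a triangle passes the validity test when its side lengths are below TL and its pairwise vertex height differences below TH, and the height of a valid processed point is then interpolated inside the triangle. Assumptions here: planimetric side lengths, barycentric interpolation, and the TL = 3 m / TH = 1 m values of claim 2; none of these details are fixed by claim 1 itself.

```python
import numpy as np

TL, TH = 3.0, 1.0  # thresholds from claim 2 (metres)

def triangle_ok(p1, p2, p3):
    """Step A1 test: every side length below TL and every pairwise
    vertex height difference below TH (planimetric lengths assumed)."""
    pts = np.array([p1, p2, p3], dtype=float)
    for i in range(3):
        j = (i + 1) % 3
        if np.linalg.norm(pts[i, :2] - pts[j, :2]) >= TL:
            return False
        if abs(pts[i, 2] - pts[j, 2]) >= TH:
            return False
    return True

def interp_height(p1, p2, p3, xy):
    """Step A2: barycentric interpolation of the height at ground position xy."""
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = p1, p2, p3
    x, y = xy
    d = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (x - x3) + (x3 - x2) * (y - y3)) / d
    w2 = ((y3 - y1) * (x - x3) + (x1 - x3) * (y - y3)) / d
    return w1 * z1 + w2 * z2 + (1.0 - w1 - w2) * z3

tri = [(0.0, 0.0, 10.0), (2.0, 0.0, 10.4), (0.0, 2.0, 10.2)]
valid = triangle_ok(*tri)            # small, flat triangle -> usable
h = interp_height(*tri, (0.5, 0.5))  # interpolated height inside it
```

The same pair of operations underlies steps B1/B2 and C1/C2 below.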
When the three-dimensional coordinates of the multiple ground control points of step 4 are determined, they are determined from the photo coordinates of the multiple ground control points on the reference image in step 4, yielding the three-dimensional coordinates of multiple reliable control points;
Wherein, when the three-dimensional coordinate of any ground control point is determined, that ground control point is the currently processed point, and the process is as follows:
Step B1, validity judgement of the processed point: according to the current exterior orientation elements of the reference image and the photo coordinates of the currently processed point on the reference image, call the ground coordinate conversion module to convert and obtain the ground coordinates of the currently processed point; then, according to the converted ground coordinates, find the triangle containing the currently processed point in the point cloud triangulation network constructed in step 601, and call the triangle judgement module to judge the found triangle: when all three side lengths of the found triangle are smaller than TL and the height differences between any two of its three vertices are all smaller than TH, the currently processed point is judged to be a valid processed point and step B2 is entered; otherwise, the currently processed point is discarded;
In this step, a valid processed point is a reliable control point;
Step B2, three-dimensional coordinate determination: call the elevation coordinate calculation module to interpolate the height value of the currently processed point from the triangle found in step B1; combined with the ground coordinates converted in step B1, this yields the three-dimensional coordinate of the currently processed point;
Step B3, repeat steps B1 to B2 one or more times so that the three-dimensional coordinates of the multiple ground control points of step 4 are determined, obtaining the three-dimensional coordinates of the multiple reliable control points;
Step 604, coordinate transformation matrix calculation: call a control point search module to find, among the surveyed three-dimensional coordinates of the multiple ground control points of step 1, the surveyed three-dimensional coordinates of the multiple reliable control points; then call a coordinate transformation matrix calculation module to compute the coordinate transformation matrix from the three-dimensional coordinates of the multiple reliable control points obtained in step B3 to their surveyed three-dimensional coordinates;
Step 605, determination of image Harris corner three-dimensional coordinates based on the coordinate transformation matrix: according to the coordinate transformation matrix computed in step 604, call a coordinate transformation module to transform the three-dimensional coordinates of the multiple reliable Harris corners of the reference image obtained in step A3, obtaining the three-dimensional coordinates of the multiple reliable Harris corners after coordinate transformation;
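Step 604 computes a coordinate transformation from the TIN-derived coordinates of the reliable control points to their field-surveyed coordinates. The claim does not fix the transformation model; the sketch below assumes a 3-D rigid model (rotation + translation) estimated by the SVD-based Kabsch method, which is one standard choice for paired 3-D point sets.

```python
import numpy as np

def fit_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) with dst ~ R @ src + t,
    estimated from paired 3-D points via SVD (Kabsch method)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(0), dst.mean(0)
    h = (src - cs).T @ (dst - cd)               # cross-covariance
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))      # guard against a reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    return r, cd - r @ cs

rng = np.random.default_rng(2)
pts = rng.random((6, 3)) * 100.0                # TIN-derived control point coordinates
ang = 0.1
r_true = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                   [np.sin(ang),  np.cos(ang), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -3.0, 0.7])
surveyed = pts @ r_true.T + t_true              # field-surveyed coordinates
r, t = fit_rigid_transform(pts, surveyed)
```

A similarity transform (adding a scale factor) would be estimated the same way if the two coordinate systems differed in scale.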
Step 606, triangulation-network-based correction of image Harris corner three-dimensional coordinates: according to the point cloud triangulation network constructed in step 601, correct the three-dimensional coordinates of the multiple reliable Harris corners obtained in step 605, respectively;
Wherein, when the three-dimensional coordinate of any reliable Harris corner obtained in step 605 is corrected, that reliable Harris corner is the current correction point, and the process is as follows:
Step C1, correction judgement: according to the three-dimensional coordinate of the current correction point obtained in step 605, find the triangle containing the current correction point in the point cloud triangulation network constructed in step 601, and call the triangle judgement module to judge the found triangle: when all three side lengths of the found triangle are smaller than TL and the height differences between any two of its three vertices are all smaller than TH, it is judged that the three-dimensional coordinate of the current correction point needs to be corrected and step C2 is entered; otherwise, no correction is performed and the three-dimensional coordinate of the current correction point remains that obtained in step 605;
Step C2, coordinate correction: call the elevation coordinate calculation module to interpolate the height value of the current correction point from the triangle found in step C1, then replace the height value in the three-dimensional coordinate of the current correction point obtained in step 605 with the interpolated height value, obtaining the corrected three-dimensional coordinate of the current correction point;
Step C3, repeat steps C1 to C2 one or more times so that the three-dimensional coordinates of the multiple reliable Harris corners are corrected, obtaining the corrected three-dimensional coordinates of the multiple reliable Harris corners;
The corrected three-dimensional coordinates of the multiple reliable Harris corners of step C3 are the three-dimensional coordinates of the multiple reliable Harris corners of the reference image;
Step 7, photo coordinate calculation of feature points on the images to be processed: using the data processing equipment, calculate the feature point photo coordinates of each image to be processed of step 6, obtaining the photo coordinates of the multiple corresponding Harris corners of each image to be processed;
The feature point photo coordinate calculation method is the same for every image to be processed;
When the feature point photo coordinates of any image to be processed are calculated, according to the three-dimensional coordinates of the multiple reliable Harris corners of the reference image obtained in step C3 and the current exterior orientation elements of that image to be processed, call the photo coordinate calculation module to calculate the photo coordinates of the multiple reliable Harris corners on that image to be processed, obtaining the photo coordinates of the multiple corresponding Harris corners of that image;
Each corresponding Harris corner is the pixel of a reliable Harris corner on the image to be processed, and the pixel coordinate of each reliable Harris corner on the image to be processed is the coordinate of that reliable Harris corner's pixel on that image;
Step 8, image matching: using the data processing equipment and calling a feature point search module, find the photo coordinates of the multiple reliable Harris corners of step A3 among the photo coordinates of the feature points extracted in step 602; the photo coordinates found are the photo coordinates of the multiple reliable Harris corners of the reference image; then, according to the photo coordinates of the multiple reliable Harris corners of the reference image and the photo coordinates of the multiple corresponding Harris corners of each image to be processed obtained in step 7, perform image matching using the data processing equipment and the image matching module, obtaining all Harris corners matched between the reference image and each image to be processed;
Each Harris corner matched between the reference image and an image to be processed is a registration control point;
Step 9, set update: using the data processing equipment, add the photo coordinates of all registration control points obtained in step 8 on each aerial photography image to the complete feature point set of that image in step 4, obtaining the updated complete feature point set of each aerial photography image; meanwhile, add the corrected three-dimensional coordinates of the multiple reliable Harris corners of step 6 to the current control point set, obtaining the updated control point set;
Step 10, bundle block adjustment and exterior orientation element update: according to the current exterior orientation elements of each aerial photography image, the complete feature point set of each aerial photography image of step 9 and the current control point set, perform bundle block adjustment using the data processing equipment and the bundle block adjustment module, obtaining the adjusted exterior orientation elements of each aerial photography image; then call the data update module to update the exterior orientation elements of each aerial photography image to the adjusted values obtained at this time, obtaining the updated exterior orientation elements of each aerial photography image and completing one automatic registration pass between the point cloud data and the aerial photography images;
Step 11, automatic registration termination judgement: using the data processing equipment and calling a numerical comparison module, judge the corrections of the three angular elements in the updated exterior orientation elements of each aerial photography image of step 10: when the corrections of the three angular elements of every updated image are all smaller than a preset limit value, the automatic registration of the point cloud data and the aerial photography images is judged to be completed, and the data processing equipment outputs the automatic registration result, namely the updated exterior orientation elements of each aerial photography image of step 10; otherwise, the data processing equipment performs an automatic registration count judgement;
When the automatic registration count judgement is performed, the data processing equipment judges whether the number of completed automatic registration passes has reached a preset maximum registration count: when it has, automatic registration is judged to have failed and the data processing equipment outputs the automatic registration result, namely automatic registration failure; otherwise, return to step 6 and perform the next automatic registration pass of the point cloud data and the aerial photography images.
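The outer control flow of steps 6 to 11 (iterate matching and bundle adjustment until the angular corrections fall below a limit, or fail after a maximum number of passes) can be sketched as follows; `run_iteration`, the limit value and `n_max` are illustrative stand-ins, not names from the patent.

```python
def register(run_iteration, limit=1e-4, n_max=10):
    """Outer loop of the automatic registration (steps 6-11).

    run_iteration performs one matching + bundle-adjustment pass and
    returns, per image, the corrections to the three angular exterior
    orientation elements; registration succeeds once all corrections
    fall below limit, and fails after n_max passes.
    """
    for n in range(1, n_max + 1):
        corrections = run_iteration()
        if all(abs(c) < limit for img in corrections for c in img):
            return "converged", n
    return "failed", n_max

# toy iteration whose corrections shrink by 10x each pass (2 images, 3 angles)
state = {"c": 1.0}
def fake_iteration():
    state["c"] *= 0.1
    return [[state["c"]] * 3, [state["c"]] * 3]

status, n = register(fake_iteration, limit=1e-4, n_max=10)
```

The limit value corresponds to the preset limit of step 11 and n_max to the maximum registration count Nmax of claim 4.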
2. The automatic registration method for airborne LiDAR point cloud data and aerial images according to claim 1, characterized in that: in step A1, TL = 3 m and TH = 1 m.
3. The automatic registration method for airborne LiDAR point cloud data and aerial images according to claim 1 or 2, characterized in that: when the next automatic registration pass of the point cloud data and the aerial photography images is performed in step 11, automatic registration is performed on the point cloud data and the aerial photography images according to the method of steps 6 to 10, and step 11 is then entered again for the automatic registration termination judgement.
4. The automatic registration method for airborne LiDAR point cloud data and aerial images according to claim 1 or 2, characterized in that: before the image preprocessing of step 2, an automatic registration counter N is initialised with the data processing equipment; at this point, N = 0;
After one automatic registration pass of the point cloud data and the aerial photography images is completed in step 10, the data processing equipment increments the current automatic registration counter N by 1;
The preset maximum registration count of step 11 is denoted Nmax, where Nmax is a positive integer and Nmax ≥ 3;
When the data processing equipment judges in step 11 whether the number of completed automatic registration passes has reached the preset maximum registration count, the current N is compared with Nmax: when N ≥ Nmax, the number of completed automatic registration passes is judged to have reached the preset maximum registration count; otherwise, it is judged not to have reached it.
5. The automatic registration method for airborne LiDAR point cloud data and aerial images according to claim 1 or 2, characterized in that: when bundle block adjustment is performed in steps 5 and 10, POS-assisted bundle block adjustment is performed using the data processing equipment and the bundle block adjustment module.
6. The automatic registration method for airborne LiDAR point cloud data and aerial images according to claim 1 or 2, characterized in that: when aerial photogrammetry of the measured region is performed in step 1, it is performed with an aerial photogrammetric system equipped with a POS system;
The exterior orientation elements of each aerial photography image are those obtained by the POS system when the aerial photogrammetry of the measured region is performed in step 1.
7. The automatic registration method for airborne LiDAR point cloud data and aerial images according to claim 1 or 2, characterized in that: before the photo coordinates of the feature points on the images to be processed are calculated in step 7, a corner validity judgement is first performed with the data processing equipment on each of the multiple reliable Harris corners of the reference image obtained in step C3;
The corner validity judgement method is the same for all reliable Harris corners;
When the corner validity judgement is performed on any reliable Harris corner of the reference image, according to the three-dimensional coordinate of that reliable Harris corner obtained in step C3 and the current exterior orientation elements of the reference image, the photo coordinate calculation module is called to calculate the photo coordinates of that reliable Harris corner on the reference image, denoted (x*, y*); then the data processing equipment finds the pixel coordinates of that reliable Harris corner among the photo coordinates of the multiple reliable Harris corners of the reference image of step 8, denoted (x^, y^); afterwards, a numerical calculation module is called to compute the photo grid deviation Δr of that reliable Harris corner according to the formula Δr = √(Δx² + Δy²), where Δx = x* − x^ and Δy = y* − y^; a difference comparison module is then called to judge whether Δr is smaller than Δt: when Δr < Δt, the reliable Harris corner is judged to be a valid corner; otherwise, it is judged to be an invalid corner;
Wherein Δt is a preset photo grid deviation judgement threshold;
Before the photo coordinates of the feature points on any image to be processed are calculated in step 7, the data processing equipment rejects all invalid corners among the multiple reliable Harris corners of the reference image obtained in step C3;
When the photo coordinates of the feature points on any image to be processed are calculated in step 7, the multiple reliable Harris corners of the reference image are the valid corners;
Each corresponding Harris corner is the pixel of a valid corner on the image to be processed;
The reliable Harris corners added to the current control point set in step 9 are the valid corners.
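The corner validity test of claim 7 reduces to a deviation check Δr = √(Δx² + Δy²) < Δt between the back-projected and matched photo positions. A minimal sketch (Δt = 1.5 px is an assumed value, since the claim leaves the threshold unspecified):

```python
import math

def corner_is_valid(projected_xy, matched_xy, delta_t=1.5):
    """Keep a reliable Harris corner when the deviation between its
    back-projected photo position (x*, y*) and its matched photo
    position (x^, y^) is below delta_t (assumed to be 1.5 px)."""
    dx = projected_xy[0] - matched_xy[0]
    dy = projected_xy[1] - matched_xy[1]
    return math.hypot(dx, dy) < delta_t   # delta_r = sqrt(dx^2 + dy^2)

ok = corner_is_valid((100.2, 50.1), (100.8, 50.5))    # delta_r ~ 0.72 px
bad = corner_is_valid((100.0, 50.0), (103.0, 54.0))   # delta_r = 5 px
```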
8. The automatic registration method for airborne LiDAR point cloud data and aerial images according to claim 1 or 2, characterized in that: when image matching is performed in step 8 with the data processing equipment and the image matching module, correlation coefficients are calculated between the multiple reliable Harris corners of the reference image and the multiple corresponding Harris corners of each image to be processed obtained in step 7, and all registration control points matched between the reference image and each image to be processed are found from the correlation coefficient results;
When the correlation coefficients between the multiple reliable Harris corners of the reference image and the multiple corresponding Harris corners of any image to be processed are calculated, the correlation coefficient of each reliable Harris corner of the reference image with each corresponding Harris corner of that image is calculated, and all Harris corners matched between the reference image and that image are found from the results;
After the correlation coefficient calculation between the multiple reliable Harris corners of the reference image and the multiple corresponding Harris corners of each image to be processed is completed, all Harris corners matched between the reference image and each image to be processed are obtained; the matched Harris corners found are the registration control points.
9. The automatic registration method for airborne LiDAR point cloud data and aerial images according to claim 1 or 2, characterized in that: after the elevation coordinate calculation module is called in step A2 to interpolate the height value of the currently processed point from the triangle found in step A1, a height value correction module is further called to correct the interpolated height value, and the process is as follows:
Step A21, nearest measuring point search in the triangulation network: according to the interpolated height value of the currently processed point and the ground coordinates converted in step A1, find the triangle containing the currently processed point in the point cloud triangulation network constructed in step 601, and find the measuring point of that triangle nearest to the currently processed point; the found measuring point is a measuring point of the surveyed point cloud data; the height value of the found measuring point is then taken as the corrected height value of the currently processed point;
Step A22, height value correction termination judgement: compare the corrected height value of the currently processed point of step A21 with its height value before correction; when the absolute value of the difference between the two is smaller than ΔH, the height value correction is judged complete, and the corrected height value is taken as the reliable height value of the currently processed point; otherwise, step A23 is entered and the next correction is performed;
Wherein ΔH is a preset height difference judgement threshold;
Step A23, nearest measuring point search in the triangulation network: according to the current corrected height value and the ground coordinates converted in step A1, find the triangle containing the currently processed point in the point cloud triangulation network constructed in step 601, and find the measuring point of that triangle nearest to the currently processed point, which is a measuring point of the surveyed point cloud data; the height value of the found measuring point is then taken as the corrected height value of the currently processed point;
Step A24, height value correction termination judgement: compare the corrected height value of the currently processed point of step A23 with its height value after the previous correction; when the absolute value of the difference between the two is smaller than 0.001, the height value correction is judged complete, and the corrected height value is taken as the reliable height value of the currently processed point; otherwise, return to step A23 for the next correction;
When the ground coordinates converted in step A1 are combined in step A2 to obtain the three-dimensional coordinate of the currently processed point, the three-dimensional coordinate is formed from the ground coordinates converted in step A1 and the reliable height value of the currently processed point.
10. The automatic registration method of airborne LiDAR point cloud data and aerial images according to claim 1 or 2, characterized in that: after the elevation coordinate computing module is called in step B2 to obtain the elevation value of the currently processed point by interpolation on the triangle found in step B1, an elevation correction module must also be called to correct the interpolated elevation value, the process being as follows:
Step B21, searching the triangulation network for the nearest measuring point: according to the interpolated elevation value of the currently processed point and the ground coordinates of the currently processed point obtained by conversion in step B1, find the triangle containing the currently processed point in the point cloud triangulation network constructed in step 601, and within that triangle find the measuring point nearest to the currently processed point, the measuring point found being a measuring point in the surveyed point cloud data; then take the elevation value of the measuring point found as the corrected elevation value of the currently processed point;
Step B22, elevation correction termination judgment: compare the corrected elevation value of the currently processed point from step B21 with its elevation value before correction in step B21; when the absolute value of the difference between the two is less than ΔH, the elevation correction is judged complete, and the corrected elevation value at this time is taken as the reliable elevation value of the currently processed point; otherwise, proceed to step B23 for the next correction;
Step B23, searching the triangulation network for the nearest measuring point: according to the current corrected elevation value of the currently processed point and the ground coordinates of the currently processed point obtained by conversion in step B1, find the triangle containing the currently processed point in the point cloud triangulation network constructed in step 601, and within that triangle find the measuring point nearest to the currently processed point, the measuring point found being a measuring point in the surveyed point cloud data; then take the elevation value of the measuring point found as the corrected elevation value of the currently processed point;
Step B24, elevation correction termination judgment: compare the corrected elevation value of the currently processed point from step B23 with its elevation value after the previous correction; when the absolute value of the difference between the two is less than 0.001, the elevation correction is judged complete, and the corrected elevation value at this time is taken as the reliable elevation value of the currently processed point; otherwise, return to step B23 for the next correction;
In step B2, the three-dimensional coordinates of the currently processed point are obtained by combining the ground coordinates of the currently processed point obtained by conversion in step B1 with the reliable elevation value of the currently processed point.
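The iterative elevation correction of steps A21-A24 (and the identical B21-B24) can be sketched as follows. The `find_triangle` callback, the ΔH default of 0.5 m, and the iteration cap are illustrative assumptions; in the patent the repeated triangle lookup also uses the corrected elevation, whereas this sketch locates the triangle from the planimetric coordinates alone:

```python
import math

def interpolate_elevation(p, tri):
    """Barycentric interpolation of the elevation at planimetric point
    p = (x, y) inside triangle tri = three (x, y, z) measuring points."""
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = tri
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (p[0] - x3) + (x3 - x2) * (p[1] - y3)) / det
    w2 = ((y3 - y1) * (p[0] - x3) + (x1 - x3) * (p[1] - y3)) / det
    return w1 * z1 + w2 * z2 + (1.0 - w1 - w2) * z3

def nearest_vertex_elevation(p, tri):
    """Elevation of the measuring point in tri nearest (in plan) to p."""
    return min(tri, key=lambda v: math.hypot(v[0] - p[0], v[1] - p[1]))[2]

def reliable_elevation(p, find_triangle, delta_h=0.5, eps=1e-3, max_iter=50):
    """Steps A21-A24: interpolate an elevation, snap it to the nearest
    measuring point, and repeat until successive values agree within eps."""
    tri = find_triangle(p)                     # triangle in the step-601 TIN
    h = interpolate_elevation(p, tri)          # step A2 interpolated value
    h_new = nearest_vertex_elevation(p, tri)   # step A21 correction
    if abs(h_new - h) < delta_h:               # step A22: test against ΔH
        return h_new
    for _ in range(max_iter):                  # steps A23/A24
        h, h_new = h_new, nearest_vertex_elevation(p, find_triangle(p))
        if abs(h_new - h) < eps:               # patent's 0.001 threshold
            return h_new
    return h_new
```

Snapping to a measured LiDAR return rather than trusting the interpolated surface keeps the reliable elevation anchored to actual survey data, which matters at breaklines where the TIN facet cuts across a height discontinuity.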
CN201811651300.0A 2018-12-31 2018-12-31 Automatic registration method for airborne LiDAR point cloud data and aerial image Active CN109727278B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811651300.0A CN109727278B (en) 2018-12-31 2018-12-31 Automatic registration method for airborne LiDAR point cloud data and aerial image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811651300.0A CN109727278B (en) 2018-12-31 2018-12-31 Automatic registration method for airborne LiDAR point cloud data and aerial image

Publications (2)

Publication Number Publication Date
CN109727278A true CN109727278A (en) 2019-05-07
CN109727278B CN109727278B (en) 2020-12-18

Family

ID=66298534

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811651300.0A Active CN109727278B (en) 2018-12-31 2018-12-31 Automatic registration method for airborne LiDAR point cloud data and aerial image

Country Status (1)

Country Link
CN (1) CN109727278B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110264502A (en) * 2019-05-17 2019-09-20 华为技术有限公司 Point cloud registration method and device
CN111060910A (en) * 2019-12-11 2020-04-24 西安电子科技大学 InSAR carrier reverse positioning based on terrain-image matching
CN112002007A (en) * 2020-08-31 2020-11-27 胡翰 Model obtaining method and device based on air-ground image, equipment and storage medium
CN112381941A (en) * 2021-01-15 2021-02-19 武汉鸿宇飞规划设计技术有限公司 Aviation flight image coordinate correction method
CN112927370A (en) * 2021-02-25 2021-06-08 苍穹数码技术股份有限公司 Three-dimensional building model construction method and device, electronic equipment and storage medium
CN113593023A (en) * 2021-07-14 2021-11-02 中国科学院空天信息创新研究院 Three-dimensional drawing method, device, equipment and storage medium
CN117036622A (en) * 2023-10-08 2023-11-10 海纳云物联科技有限公司 Three-dimensional reconstruction method, device and equipment for fusing aerial image and ground scanning

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112767460B (en) * 2020-12-31 2022-06-14 武汉大学 Spatial fingerprint image registration element feature description and matching method

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070269102A1 (en) * 2006-05-20 2007-11-22 Zheng Wang Method and System of Generating 3D Images with Airborne Oblique/Vertical Imagery, GPS/IMU Data, and LIDAR Elevation Data
CN102411778A (en) * 2011-07-28 2012-04-11 武汉大学 Automatic registration method of airborne laser point cloud and aerial image
CN102930540A (en) * 2012-10-26 2013-02-13 中国地质大学(武汉) Method and system for detecting contour of urban building
CN103017739A (en) * 2012-11-20 2013-04-03 武汉大学 Manufacturing method of true digital ortho map (TDOM) based on light detection and ranging (LiDAR) point cloud and aerial image
CN103093459A (en) * 2013-01-06 2013-05-08 中国人民解放军信息工程大学 Assisting image matching method by means of airborne lidar point cloud data
CN103106339A (en) * 2013-01-21 2013-05-15 武汉大学 Synchronous aerial image assisting airborne laser point cloud error correction method
CN103744086A (en) * 2013-12-23 2014-04-23 北京建筑大学 High-precision registration method for ground laser radar and close-range photography measurement data
CN104123730A (en) * 2014-07-31 2014-10-29 武汉大学 Method and system for remote-sensing image and laser point cloud registration based on road features
CN104599272A (en) * 2015-01-22 2015-05-06 中国测绘科学研究院 Movable target sphere oriented onboard LiDAR point cloud and image united rectification method
US20160027178A1 (en) * 2014-07-23 2016-01-28 Sony Corporation Image registration system with non-rigid registration and method of operation thereof
US9466143B1 (en) * 2013-05-03 2016-10-11 Exelis, Inc. Geoaccurate three-dimensional reconstruction via image-based geometry
US20180347978A1 (en) * 2017-06-01 2018-12-06 Michael David SÁNCHEZ System and method of photogrammetry
CN109087339A (en) * 2018-06-13 2018-12-25 武汉朗视软件有限公司 A kind of laser scanning point and Image registration method

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070269102A1 (en) * 2006-05-20 2007-11-22 Zheng Wang Method and System of Generating 3D Images with Airborne Oblique/Vertical Imagery, GPS/IMU Data, and LIDAR Elevation Data
CN102411778A (en) * 2011-07-28 2012-04-11 武汉大学 Automatic registration method of airborne laser point cloud and aerial image
CN102930540A (en) * 2012-10-26 2013-02-13 中国地质大学(武汉) Method and system for detecting contour of urban building
CN103017739A (en) * 2012-11-20 2013-04-03 武汉大学 Manufacturing method of true digital ortho map (TDOM) based on light detection and ranging (LiDAR) point cloud and aerial image
CN103093459A (en) * 2013-01-06 2013-05-08 中国人民解放军信息工程大学 Assisting image matching method by means of airborne lidar point cloud data
CN103106339A (en) * 2013-01-21 2013-05-15 武汉大学 Synchronous aerial image assisting airborne laser point cloud error correction method
US9466143B1 (en) * 2013-05-03 2016-10-11 Exelis, Inc. Geoaccurate three-dimensional reconstruction via image-based geometry
CN103744086A (en) * 2013-12-23 2014-04-23 北京建筑大学 High-precision registration method for ground laser radar and close-range photography measurement data
US20160027178A1 (en) * 2014-07-23 2016-01-28 Sony Corporation Image registration system with non-rigid registration and method of operation thereof
CN104123730A (en) * 2014-07-31 2014-10-29 武汉大学 Method and system for remote-sensing image and laser point cloud registration based on road features
CN104599272A (en) * 2015-01-22 2015-05-06 中国测绘科学研究院 Movable target sphere oriented onboard LiDAR point cloud and image united rectification method
US20180347978A1 (en) * 2017-06-01 2018-12-06 Michael David SÁNCHEZ System and method of photogrammetry
CN109087339A (en) * 2018-06-13 2018-12-25 武汉朗视软件有限公司 A kind of laser scanning point and Image registration method

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Bisheng Yang et al., "Automatic registration of UAV-borne sequent images and LiDAR data", ISPRS Journal of Photogrammetry and Remote Sensing *
Zhang Yongjun et al., "Automatic registration of airborne LiDAR data and aerial images in urban areas", Journal of Remote Sensing *
Du Quanye, "Automatic high-precision registration of aerial images and LiDAR data without ground control", China Doctoral Dissertations Full-text Database, Information Science and Technology *
Jia Jiao, "Accuracy analysis of automatic registration of airborne LiDAR point clouds and aerial images", China Master's Theses Full-text Database, Basic Sciences *
Shao Jie et al., "Research on the fusion of 3D laser point clouds and CCD images", Chinese Journal of Lasers *
Gu Bin, "Registration of digital images and laser point clouds and its application in 3D building modeling", China Master's Theses Full-text Database, Information Science and Technology *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110264502B (en) * 2019-05-17 2021-05-18 华为技术有限公司 Point cloud registration method and device
CN110264502A (en) * 2019-05-17 2019-09-20 华为技术有限公司 Point cloud registration method and device
CN111060910B (en) * 2019-12-11 2023-08-29 西安电子科技大学 InSAR carrier reverse positioning based on topography-image matching
CN111060910A (en) * 2019-12-11 2020-04-24 西安电子科技大学 InSAR carrier reverse positioning based on terrain-image matching
CN112002007A (en) * 2020-08-31 2020-11-27 胡翰 Model obtaining method and device based on air-ground image, equipment and storage medium
CN112002007B (en) * 2020-08-31 2024-01-19 胡翰 Model acquisition method and device based on air-ground image, equipment and storage medium
CN112381941A (en) * 2021-01-15 2021-02-19 武汉鸿宇飞规划设计技术有限公司 Aviation flight image coordinate correction method
CN112381941B (en) * 2021-01-15 2021-03-26 武汉鸿宇飞规划设计技术有限公司 Aviation flight image coordinate correction method
CN112927370A (en) * 2021-02-25 2021-06-08 苍穹数码技术股份有限公司 Three-dimensional building model construction method and device, electronic equipment and storage medium
CN113593023A (en) * 2021-07-14 2021-11-02 中国科学院空天信息创新研究院 Three-dimensional drawing method, device, equipment and storage medium
CN113593023B (en) * 2021-07-14 2024-02-02 中国科学院空天信息创新研究院 Three-dimensional drawing method, device, equipment and storage medium
CN117036622A (en) * 2023-10-08 2023-11-10 海纳云物联科技有限公司 Three-dimensional reconstruction method, device and equipment for fusing aerial image and ground scanning
CN117036622B (en) * 2023-10-08 2024-02-23 海纳云物联科技有限公司 Three-dimensional reconstruction method, device and equipment for fusing aerial image and ground scanning

Also Published As

Publication number Publication date
CN109727278B (en) 2020-12-18

Similar Documents

Publication Publication Date Title
CN109727278A (en) A kind of autoegistration method of airborne lidar point cloud data and aviation image
CN113436260B (en) Mobile robot pose estimation method and system based on multi-sensor tight coupling
CN105160702B (en) The stereopsis dense Stereo Matching method and system aided in based on LiDAR point cloud
CN112927360A (en) Three-dimensional modeling method and system based on fusion of tilt model and laser point cloud data
CN109737913B (en) Laser tracking attitude angle measurement system and method
CN108648240A (en) Based on a non-overlapping visual field camera posture scaling method for cloud characteristics map registration
CN106780712B (en) Three-dimensional point cloud generation method combining laser scanning and image matching
CN110044374B (en) Image feature-based monocular vision mileage measurement method and odometer
CN109341668B (en) Multi-camera measuring method based on refraction projection model and light beam tracking method
CN107330927B (en) Airborne visible light image positioning method
CN112270698A (en) Non-rigid geometric registration method based on nearest curved surface
CN112305576A (en) Multi-sensor fusion SLAM algorithm and system thereof
CN109341720A (en) A kind of remote sensing camera geometric calibration method based on fixed star track
CN106500729B (en) A kind of smart phone self-test calibration method without controlling information
CN107564046A (en) It is a kind of based on a cloud and the secondary accurate extracting method of registering contour of building of UAV images
CN112669354A (en) Multi-camera motion state estimation method based on vehicle incomplete constraint
CN113298947A (en) Multi-source data fusion-based three-dimensional modeling method medium and system for transformer substation
CN105571518A (en) Three dimensional information vision measurement method based on refraction image deviation
CN107063187A (en) A kind of height of tree rapid extracting method of total powerstation and unmanned plane image association
CN112767461A (en) Automatic registration method for laser point cloud and sequence panoramic image
CN116563377A (en) Mars rock measurement method based on hemispherical projection model
CN113947638A (en) Image orthorectification method for fisheye camera
CN117036300A (en) Road surface crack identification method based on point cloud-RGB heterogeneous image multistage registration mapping
CN113340272B (en) Ground target real-time positioning method based on micro-group of unmanned aerial vehicle
CN107063191B (en) A kind of method of photogrammetric regional network entirety relative orientation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant