CN105160684A - Online automatic matching method for geometric correction of remote sensing image - Google Patents

Online automatic matching method for geometric correction of remote sensing image

Info

Publication number
CN105160684A
Authority
CN
China
Prior art keywords
image
matched
point
matching
image blocks
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510634043.XA
Other languages
Chinese (zh)
Other versions
CN105160684B (en)
Inventor
龙腾飞
焦伟利
何国金
王威
程博
张兆明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Remote Sensing and Digital Earth of CAS
Original Assignee
Institute of Remote Sensing and Digital Earth of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Remote Sensing and Digital Earth of CAS filed Critical Institute of Remote Sensing and Digital Earth of CAS
Priority to CN201510634043.XA priority Critical patent/CN105160684B/en
Publication of CN105160684A publication Critical patent/CN105160684A/en
Application granted granted Critical
Publication of CN105160684B publication Critical patent/CN105160684B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30181 Earth observation

Landscapes

  • Image Processing (AREA)

Abstract

An online automatic matching method for geometric correction of a remote sensing image relates to the technical field of remote sensing image matching. The method makes full use of the prior geometric information of the remote sensing image, so that uniformly distributed control points are obtained efficiently and reliably from free or low-cost online image map resources. The method comprises: dividing the image to be matched into a plurality of regions according to the number of control points to be acquired, and taking an image block of a certain size in each region as a processing unit; determining the approximate extent of the reference image block from the extent of the image unit to be matched and the initial imaging model of the remote sensing image, then downloading the reference image block over the network and resampling it so that its resolution is close to that of the image unit to be matched; and performing SIFT matching between the image unit to be matched and the reference image block, removing gross-error points, selecting the best match point from the remaining match points, performing least squares matching, and refining the point coordinates.

Description

An online automatic matching method for geometric correction of remote sensing images
Technical field
The present invention relates to a practical automatic matching method for remote sensing images, which uses online aerial and satellite image maps to perform online automatic matching of remote sensing images. It can be applied in fields such as remote sensing, photogrammetry, surveying and mapping, and image processing.
Background technology
Automatic image matching and geometric correction are key links in photogrammetry and remote sensing tasks; they form the basis of advanced applications such as image fusion, mosaicking, change detection, and map updating. Although image matching has been studied extensively over the past few decades, automatic matching of remote sensing images remains very challenging. A practical automatic matching method should perform well in efficiency, robustness, and accuracy, but because remote sensing images involve large data volumes, large scenes, variable acquisition conditions, and complex geometric deformation, existing methods find it difficult to satisfy all of these aspects at once. In addition, the preparation of reference images is another difficulty in automatic matching and geometric correction of remote sensing images; in particular, acquiring high-resolution reference images is often very costly.
In view of the limitations of existing automatic matching methods, an online automatic matching method for geometric correction of remote sensing images has important practical value.
Summary of the invention
The object of the present invention is to overcome the deficiencies of the prior art and to propose a fast, robust, and accurate online automatic matching method for geometric correction of remote sensing images. The method can process remote sensing images of arbitrary size and, by matching automatically against online image maps, obtain a number of well-distributed, high-precision control points in a short time that can be used directly for precise geometric correction of the remote sensing image. The main advantage of the method is that it makes full use of the prior geometric information of the remote sensing image and obtains evenly distributed control points efficiently and reliably from free or low-cost online image map resources.
To solve the above problems, the invention provides an online automatic matching method for geometric correction of remote sensing images, the method comprising the steps of:
S1. Evenly divide the remote sensing image to be matched into several regions according to the number of control points to be collected;
S2. If all image regions have been processed, the whole image matching process is finished; otherwise, start processing the next image region;
S3. Divide the current image region into several image units of 256 pixels × 256 pixels;
S4. If all image units in the current image region have been processed, mark this image region as processed and go to step S2; otherwise, start processing the next image unit;
S5. Compute the approximate extent of the reference image block in the online image map from the extent of the current image unit to be matched and the initial imaging model, then download the corresponding reference image block over the network and resample it to a resolution close to that of the image unit to be matched;
S6. Match the image unit to be matched against the reference image block using the SIFT operator and remove gross-error points; if more than 4 match points are obtained in this step, proceed to step S7, otherwise go to step S4;
S7. Select the best match point from the match points obtained in step S6, perform least squares matching on it, refine the position coordinates of the SIFT feature point, and add this match point to the matching result.
Wherein, step S5 further comprises:
S5.1 Compute the closest zoom level of the online image map according to the resolution of the image unit to be matched;
S5.2 Compute the width and height of the corresponding reference image block;
S5.3 Compute the longitude and latitude coordinates of the centre point of the corresponding reference image block;
S5.4 Send a static map service request and download the corresponding reference image block;
S5.5 Resample the downloaded reference image block to a resolution close to that of the image unit to be matched.
Wherein, step S6 further comprises:
S6.1 Perform SIFT matching between the image unit to be matched and the reference image block;
S6.2 Remove gross-error points by a scale constraint;
S6.3 Remove gross-error points by a rotation-angle constraint;
S6.4 Remove gross-error points by a similarity transformation constraint estimated with RANSAC;
S6.5 Remove gross-error points by an affine transformation constraint.
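Read as pseudocode, the control flow of steps S1 to S7 can be summarised by the following Python sketch. It is a structural illustration only: the square-grid region partition and the three helper functions (download_reference_block, sift_match_and_filter, lsm_refine) are assumed placeholders standing in for the operations detailed in the embodiment below, not part of the claimed method.

import numpy as np

UNIT = 256                      # image unit size in pixels (step S3)

def download_reference_block(unit_origin):
    """Stub for step S5: would return the resampled reference image block."""
    return np.zeros((UNIT, UNIT), dtype=np.float32)

def sift_match_and_filter(unit_img, ref_block):
    """Stub for step S6: would return the gross-error-free match points."""
    return []

def lsm_refine(match):
    """Stub for step S7: would refine one match by least squares matching."""
    return match

def collect_control_points(image, n_points):
    """Steps S1-S7: tile the image into regions and 256 x 256 units and match each unit."""
    rows, cols = image.shape[:2]
    n = max(1, int(round(np.sqrt(n_points))))          # S1: roughly n x n regions
    points = []
    for r_idx in np.array_split(np.arange(rows), n):   # S2: next unprocessed region
        for c_idx in np.array_split(np.arange(cols), n):
            region = image[r_idx[0]:r_idx[-1] + 1, c_idx[0]:c_idx[-1] + 1]
            for i in range(0, region.shape[0] - UNIT + 1, UNIT):    # S3/S4: next unit
                for j in range(0, region.shape[1] - UNIT + 1, UNIT):
                    unit = region[i:i + UNIT, j:j + UNIT]
                    ref = download_reference_block((r_idx[0] + i, c_idx[0] + j))  # S5
                    matches = sift_match_and_filter(unit, ref)                    # S6
                    if len(matches) > 4:                                          # S7
                        points.append(lsm_refine(matches[0]))  # best pair stands in here
    return points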
Brief description of the drawings
Fig. 1 is a flow chart of the online automatic matching method according to one embodiment of the present invention;
Fig. 2 is a schematic diagram of the image regions and image units in the method of the present invention;
Fig. 3 is a flow chart of obtaining a reference image block from the online image map in the online automatic matching method according to one embodiment of the present invention;
Fig. 4 is a flow chart of matching the image unit to be matched against the reference image block in the online automatic matching method according to one embodiment of the present invention.
Detailed description of the embodiments
The online automatic matching method proposed by the present invention is described below with reference to the accompanying drawings.
As shown in Fig. 1, the automatic matching method according to one embodiment of the present invention comprises the steps:
S1. Evenly divide the remote sensing image to be matched into several regions according to the number of control points to be collected, and mark each image region as unprocessed; a schematic diagram of the image regions is shown in Fig. 2;
S2. If all image regions have been processed, finish and terminate the whole image matching process; otherwise, start processing the next image region;
S3. Divide the current image region into several image units of 256 pixels × 256 pixels and mark each image unit as unprocessed; a schematic diagram of the image units is shown in Fig. 2;
S4. If all image units in the current image region have been processed, mark this image region as processed and go to step S2; otherwise, start processing the next image unit;
S5. Compute the approximate extent of the reference image block in the online image map from the extent of the current image unit to be matched and the initial imaging model (currently available online image maps include the Google satellite image map, the Bing Aerial image map, the MapQuest satellite image map, and the Mapbox satellite image map), then download the corresponding reference image block over the network and resample it to a resolution close to that of the image unit to be matched;
S6. Match the image unit to be matched against the reference image block using the SIFT operator and remove gross-error points; if more than 4 match points are obtained in this step, proceed to step S7, otherwise go to step S4;
S7. From the match points obtained in step S6, choose the pair with the largest local contrast, take the local geometric transformation model obtained from the SIFT matching as the initial value, perform least squares matching on this pair of match points, refine the position coordinates of the SIFT feature point, and finally add this pair of match points to the matching result. In least squares matching, the condition equation of a point is expressed by formula (1):

k_1 I_s(x_s, y_s) + k_2 - I_r(x_r, y_r) = 0 \qquad (1)

where x_s, y_s are pixel coordinates in the image unit to be matched and x_r, y_r are pixel coordinates in the reference image block, related by

x_r = a_0 + a_1 x_s + a_2 y_s, \qquad y_r = b_0 + b_1 x_s + b_2 y_s,

a_0, a_1, a_2, b_0, b_1, b_2 are the 6 geometric transformation parameters, k_1, k_2 are the 2 radiometric transformation parameters, and I_s(x_s, y_s) and I_r(x_r, y_r) are the grey values of the image unit to be matched and of the reference image block, respectively.
An error equation according to formula (1) is set up for each point in an 11 pixel × 11 pixel window around the match point, and the system is then optimised with the Levenberg-Marquardt algorithm to obtain the optimal geometric transformation parameters and hence the refined match point coordinates.
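The least squares matching of step S7 can be sketched as follows, assuming Python with NumPy and SciPy: equation (1) is minimised over the 11 pixel × 11 pixel window with SciPy's Levenberg-Marquardt solver. The function name lsm_refine, the bilinear resampling of the reference block, and the pure-translation initial value (the patent initialises with the local transformation obtained from the SIFT matching) are illustrative assumptions rather than the patent's implementation.

import numpy as np
from scipy.ndimage import map_coordinates
from scipy.optimize import least_squares

def lsm_refine(img_s, img_r, pt_s, pt_r, half=5):
    """Least squares matching over a (2*half+1) x (2*half+1) window (11 x 11 for half=5).

    img_s, img_r : 2-D float arrays (image unit to be matched / reference image block).
    pt_s, pt_r   : (x, y) pixel coordinates of the SIFT match in each image.
    Returns the refined reference-block coordinates of the window centre.
    """
    dx, dy = np.meshgrid(np.arange(-half, half + 1), np.arange(-half, half + 1))
    xs = dx.ravel() + pt_s[0]
    ys = dy.ravel() + pt_s[1]
    # grey values of the window in the image unit to be matched (kept fixed)
    Is = map_coordinates(img_s, [ys, xs], order=1)

    def residuals(p):
        a0, a1, a2, b0, b1, b2, k1, k2 = p
        xr = a0 + a1 * xs + a2 * ys                      # geometric part of formula (1)
        yr = b0 + b1 * xs + b2 * ys
        Ir = map_coordinates(img_r, [yr, xr], order=1)   # bilinear resampling
        return k1 * Is + k2 - Ir                         # condition equation (1) per pixel

    # initial value: pure translation given by the SIFT match, unit gain, zero offset
    p0 = np.array([pt_r[0] - pt_s[0], 1.0, 0.0,
                   pt_r[1] - pt_s[1], 0.0, 1.0,
                   1.0, 0.0])
    sol = least_squares(residuals, p0, method="lm")      # Levenberg-Marquardt
    a0, a1, a2, b0, b1, b2 = sol.x[:6]
    x_ref = a0 + a1 * pt_s[0] + a2 * pt_s[1]
    y_ref = b0 + b1 * pt_s[0] + b2 * pt_s[1]
    return x_ref, y_ref

The refined coordinates returned for the window centre are the corrected match point coordinates that are added to the matching result.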
After the image to be matched has been divided into image units, the approximate extent of the reference image block in the online image map must be computed from the extent of the current image unit to be matched and the initial imaging model. Specifically, as shown in Fig. 3, step S5 further comprises:
S5.1 Compute the closest zoom level of the online image map according to the resolution of the image unit to be matched. The zoom level is computed with formula (2) from the longitude and latitude of the location of the image unit to be matched and from its resolution; the longitude λ and latitude φ of that location are obtained from the initial imaging model of the image to be matched:

GSD = \frac{2\pi R_{earth}}{512 \times 2^{n-1}} \cos\varphi, \qquad n = \left[ \log_2 \frac{2\pi R_{earth} \cos\varphi}{512 \times GSD} + 1 \right] \qquad (2)

where R_{earth} is the Earth radius (approximately 6378137 metres), GSD is the resolution at the location of the image unit, n is the zoom level, and [·] denotes rounding to the nearest integer;
S5.2 Compute the width and height of the corresponding reference image block. For a given zoom level n, the conversion between online image map pixel coordinates x, y and longitude λ, latitude φ is given by formulas (3) and (4):

x = \frac{\lambda + 180}{360} \times 512 \times 2^{n-1}, \qquad y = \left( 0.5 - \frac{1}{4\pi} \ln \frac{1 + \sin\varphi}{1 - \sin\varphi} \right) \times 512 \times 2^{n-1} \qquad (3)

\lambda = \frac{360}{512 \times 2^{n-1}}\, x - 180, \qquad \varphi = \frac{180}{\pi} \arcsin \frac{\exp\!\left[ 4\pi \left( 0.5 - \frac{y}{512 \times 2^{n-1}} \right) \right] - 1}{\exp\!\left[ 4\pi \left( 0.5 - \frac{y}{512 \times 2^{n-1}} \right) \right] + 1} \qquad (4)

The longitude and latitude coordinates of the 4 corners of the image unit to be matched are converted with formula (3) into 4 map pixel coordinates, and the minimum bounding rectangle of these 4 points is computed; the width and height of this rectangle are the width and height of the reference image block;
S5.3 Compute the longitude and latitude coordinates λ_rc, φ_rc of the centre point of the corresponding reference image block: the map pixel coordinates of the centre are obtained from the minimum bounding rectangle computed in S5.2 and substituted into formula (4) to yield the corresponding longitude and latitude coordinates λ_rc, φ_rc;
S5.4 Send a static map service request and download the corresponding reference image block. The static map service request is sent as a uniform resource locator (URL); the static map request formats of commonly used online image maps are as follows:
(1) Google satellite image map:
https://maps.googleapis.com/maps/api/staticmap?maptype=satellite&zoom={zoomLevel}&center={lat},{lon}&size={width}x{height}&key={googleKey}
(2) Bing Aerial image map:
http://dev.virtualearth.net/REST/v1/Imagery/Map/Aerial/{lat},{lon}/{zoomLevel}?mapSize={width},{height}&key={BingMapsKey}
(3) MapQuest satellite image map:
http://www.mapquestapi.com/staticmap/v4/getmap?type=sat&zoom={zoomLevel}&center={lat},{lon}&size={width},{height}&key={mapquestKey}
(4) Mapbox satellite image map:
http://api.tiles.mapbox.com/v4/mapbox.satellite/{lon},{lat},{zoomLevel}/{width}x{height}.png?access_token={mapboxKey}
In each of the URLs above, the content inside "{ }" is filled in from the results of steps S5.1, S5.2 and S5.3. Specifically,
zoomLevel is the closest zoom level n computed in step S5.1,
width and height are the width and height of the reference image block computed in step S5.2,
lon and lat are the centre-point longitude λ_rc and latitude φ_rc of the reference image block computed in step S5.3,
googleKey, BingMapsKey, mapquestKey and mapboxKey are the API keys of the respective map services, which can be applied for on the corresponding websites;
S5.5 Resample the downloaded reference image block to a resolution close to that of the image unit to be matched (the computations of steps S5.1 to S5.4 are illustrated in the sketch that follows).
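The tile geometry of steps S5.1 to S5.4, i.e. formulas (2) to (4), the minimum bounding rectangle and the static map request, can be sketched in Python as shown below. The function names and the choice of the Bing Aerial template quoted in step S5.4 are assumptions made for illustration; the API key is the user's own.

import math

R_EARTH = 6378137.0          # approximate Earth radius in metres

def world_size(n):
    """Width of the whole map in pixels at zoom level n (512 * 2**(n-1) = 256 * 2**n)."""
    return 512 * 2 ** (n - 1)

def closest_zoom(gsd, lat_deg):
    """Formula (2): zoom level whose ground sample distance is closest to gsd."""
    return round(math.log2(2 * math.pi * R_EARTH * math.cos(math.radians(lat_deg))
                           / (512 * gsd)) + 1)

def lonlat_to_pixel(lon, lat, n):
    """Formula (3): longitude/latitude in degrees -> global map pixel coordinates."""
    w = world_size(n)
    s = math.sin(math.radians(lat))
    x = (lon + 180.0) / 360.0 * w
    y = (0.5 - math.log((1.0 + s) / (1.0 - s)) / (4.0 * math.pi)) * w
    return x, y

def pixel_to_lonlat(x, y, n):
    """Formula (4): global map pixel coordinates -> longitude/latitude in degrees."""
    w = world_size(n)
    lon = 360.0 * x / w - 180.0
    t = math.exp(4.0 * math.pi * (0.5 - y / w))
    lat = math.degrees(math.asin((t - 1.0) / (t + 1.0)))
    return lon, lat

def reference_block_geometry(corners_lonlat, n):
    """Steps S5.2/S5.3: width, height and centre of the reference block covering
    the four corners of the image unit to be matched."""
    pix = [lonlat_to_pixel(lon, lat, n) for lon, lat in corners_lonlat]
    xs, ys = zip(*pix)
    width = int(math.ceil(max(xs) - min(xs)))          # minimum bounding rectangle
    height = int(math.ceil(max(ys) - min(ys)))
    lon_c, lat_c = pixel_to_lonlat((max(xs) + min(xs)) / 2.0,
                                   (max(ys) + min(ys)) / 2.0, n)
    return width, height, lon_c, lat_c

def bing_static_map_url(lat_c, lon_c, n, width, height, key):
    """Step S5.4: static map request following the Bing Aerial template quoted above."""
    return ("http://dev.virtualearth.net/REST/v1/Imagery/Map/Aerial/"
            f"{lat_c},{lon_c}/{n}?mapSize={width},{height}&key={key}")

For example, closest_zoom(30.0, 45.0) evaluates to 12, the zoom level whose ground sample distance at latitude 45° (about 27 m) is closest to a 30 m image unit.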
After the corresponding reference image block has been downloaded, the image unit to be matched is matched against the reference image block using the SIFT operator and gross-error points are removed. Specifically, as shown in Fig. 4, step S6 further comprises:
S6.1 Extract SIFT feature points from the image unit to be matched and from the reference image block respectively and compute the corresponding SIFT descriptor vectors; use the Euclidean distance between SIFT descriptor vectors as the distance metric between two SIFT feature points. For each SIFT feature point in the image unit to be matched, test the two conditions below; a point that satisfies either or both of them is taken as a matching candidate point:
(1) the ratio of the smallest to the second-smallest distance from this point to the SIFT feature points in the reference image block is less than 0.75;
(2) the distance from this point to its nearest SIFT feature point in the reference image block is not greater than the distance from that nearest feature point to any other SIFT feature point in the image unit to be matched;
The corresponding match point of a candidate point is its nearest SIFT feature point in the reference image block;
S6.2 Check the SIFT scale ratio T_σ of each candidate point and its match point; if 0.8 ≤ T_σ ≤ 1.25, retain this candidate pair, otherwise reject it;
S6.3 Compute the SIFT principal-orientation difference of each candidate point and its match point, build a histogram of these differences with 10-degree bins, and take the angle at the histogram peak as the rotation angle θ_peak between the image unit to be matched and the reference image block; then check the SIFT principal-orientation difference Δθ of each candidate point and its match point, and if |Δθ - θ_peak| ≤ 15°, retain this candidate pair, otherwise reject it;
S6.4 Use the RANSAC algorithm to estimate the similarity transformation (5) between the remaining candidate match points and reject the gross-error points that do not conform to this transformation:

x_r = s\,(x_s \cos\theta + y_s \sin\theta) + t_x, \qquad y_r = s\,(-x_s \sin\theta + y_s \cos\theta) + t_y \qquad (5)

where x_s, y_s are pixel coordinates in the image unit to be matched, x_r, y_r are pixel coordinates in the reference image block, s and θ are the scale and rotation-angle parameters of the similarity transformation, and t_x, t_y are its translation parameters in the x and y directions;
S6.5 Compute an affine transformation model from the candidate points remaining after step S6.4:

x_r = a_0 + a_1 x_s + a_2 y_s, \qquad y_r = b_0 + b_1 x_s + b_2 y_s \qquad (6)

where x_s, y_s are pixel coordinates in the image unit to be matched, x_r, y_r are pixel coordinates in the reference image block, and a_0, a_1, a_2, b_0, b_1, b_2 are the 6 parameters of the affine transformation.
Check each pair of matching candidate points with the affine transformation model obtained: if the maximum residual is greater than 1 pixel, reject that pair of match points, recompute the affine transformation model, and repeat the residual test on the remaining candidate points; if the maximum residual is not greater than 1 pixel, output the remaining matching point pairs and end step S6 (the gross-error rejection of steps S6.2 to S6.5 is illustrated in the sketch that follows).
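A minimal sketch of the gross-error rejection cascade of steps S6.2 to S6.5, assuming NumPy and scikit-image, is given below. The candidate matches produced in step S6.1 (coordinates, SIFT scales and principal orientations) are taken as input; the RANSAC residual threshold, the handling of degenerate cases and the array layout are assumptions not specified in the patent.

import numpy as np
from skimage.measure import ransac
from skimage.transform import SimilarityTransform

def reject_gross_errors(pts_s, pts_r, scale_s, scale_r, ang_s, ang_r):
    """Return a boolean mask of the candidate matches surviving steps S6.2-S6.5.

    pts_s, pts_r     : (N, 2) arrays of candidate coordinates (x, y) in the image
                       unit to be matched and in the reference image block.
    scale_s, scale_r : SIFT scales; ang_s, ang_r : SIFT principal orientations (degrees).
    """
    keep = np.ones(len(pts_s), dtype=bool)

    # S6.2: scale-ratio constraint 0.8 <= sigma_s / sigma_r <= 1.25
    t = scale_s / scale_r
    keep &= (t >= 0.8) & (t <= 1.25)

    # S6.3: orientation-difference histogram with 10 degree bins; keep pairs within
    # 15 degrees of the peak rotation angle
    dtheta = (ang_s - ang_r) % 360.0
    hist, edges = np.histogram(dtheta[keep], bins=36, range=(0.0, 360.0))
    theta_peak = edges[np.argmax(hist)] + 5.0            # centre of the peak bin
    diff = np.abs((dtheta - theta_peak + 180.0) % 360.0 - 180.0)
    keep &= diff <= 15.0

    # S6.4: similarity transformation of formula (5) estimated by RANSAC
    # (the 3 pixel residual threshold is an assumed value)
    if keep.sum() >= 2:
        _, inliers = ransac((pts_s[keep], pts_r[keep]), SimilarityTransform,
                            min_samples=2, residual_threshold=3.0, max_trials=1000)
        if inliers is not None:
            idx = np.flatnonzero(keep)
            keep[idx[~inliers]] = False

    # S6.5: affine model of formula (6); iteratively drop the worst point while the
    # maximum residual exceeds 1 pixel
    while keep.sum() >= 4:
        idx = np.flatnonzero(keep)
        A = np.column_stack([np.ones(len(idx)), pts_s[idx, 0], pts_s[idx, 1]])
        coef_x = np.linalg.lstsq(A, pts_r[idx, 0], rcond=None)[0]
        coef_y = np.linalg.lstsq(A, pts_r[idx, 1], rcond=None)[0]
        res = np.hypot(A @ coef_x - pts_r[idx, 0], A @ coef_y - pts_r[idx, 1])
        if res.max() <= 1.0:
            break
        keep[idx[np.argmax(res)]] = False
    return keep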

Claims (4)

1. An online automatic matching method for geometric correction of remote sensing images, characterized by comprising the following steps:
Step S1. Evenly divide the remote sensing image to be matched into several regions according to the number of control points to be collected;
Step S2. If all image regions have been processed, the whole image matching process is finished; otherwise, start processing the next image region;
Step S3. Divide the current image region into several image units of 256 pixels × 256 pixels;
Step S4. If all image units in the current image region have been processed, mark this image region as processed and go to step S2; otherwise, start processing the next image unit;
Step S5. Compute the approximate extent of the reference image block in the online image map from the extent of the current image unit to be matched and the initial imaging model, then download the corresponding reference image block over the network and resample it to a resolution close to that of the image unit to be matched;
Step S6. Match the image unit to be matched against the reference image block using the SIFT operator and remove gross-error points; if more than 4 match points are obtained in this step, proceed to step S7, otherwise go to step S4;
Step S7. Select the best match point from the match points obtained in step S6, perform least squares matching on it, refine the position coordinates of the SIFT feature point, and add this match point to the matching result.
2. The online matching method of claim 1, characterized in that step S5 further comprises:
Step S5.1 Compute the closest zoom level of the online image map according to the resolution of the image unit to be matched;
Step S5.2 Compute the width and height of the corresponding reference image block;
Step S5.3 Compute the longitude and latitude coordinates of the centre point of the corresponding reference image block;
Step S5.4 Send a static map service request and download the corresponding reference image block;
Step S5.5 Resample the downloaded reference image block to a resolution close to that of the image unit to be matched.
3. The online matching method of claim 1, characterized in that step S6 further comprises:
S6.1 Perform SIFT matching between the image unit to be matched and the reference image block;
S6.2 Remove gross-error points by a scale constraint;
S6.3 Remove gross-error points by a rotation-angle constraint;
S6.4 Remove gross-error points by a similarity transformation constraint estimated with RANSAC;
S6.5 Remove gross-error points by an affine transformation constraint.
4. The online matching method of claim 3, characterized in that in step S6.1, SIFT feature points are extracted from the image unit to be matched and from the reference image block respectively and the corresponding SIFT descriptor vectors are computed; the Euclidean distance between SIFT descriptor vectors is used as the distance metric between two SIFT feature points; each SIFT feature point in the image unit to be matched is tested against the two conditions below, and a point that satisfies either or both of them is taken as a matching candidate point:
1. the ratio of the smallest to the second-smallest distance from this point to the SIFT feature points in the reference image block is less than 0.75;
2. the distance from this point to its nearest SIFT feature point in the reference image block is not greater than the distance from that nearest feature point to any other SIFT feature point in the image unit to be matched.
CN201510634043.XA 2015-09-30 2015-09-30 An online automatic matching method for geometric correction of remote sensing images Expired - Fee Related CN105160684B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510634043.XA CN105160684B (en) 2015-09-30 2015-09-30 An online automatic matching method for geometric correction of remote sensing images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510634043.XA CN105160684B (en) 2015-09-30 2015-09-30 An online automatic matching method for geometric correction of remote sensing images

Publications (2)

Publication Number Publication Date
CN105160684A true CN105160684A (en) 2015-12-16
CN105160684B CN105160684B (en) 2019-01-18

Family

ID=54801526

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510634043.XA Expired - Fee Related CN105160684B (en) 2015-09-30 2015-09-30 An online automatic matching method for geometric correction of remote sensing images

Country Status (1)

Country Link
CN (1) CN105160684B (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106408028A (en) * 2016-09-26 2017-02-15 珠海市测绘院 Urban-rural planning inspection and plotting data processing method
CN106595598A (en) * 2016-12-21 2017-04-26 上海航天控制技术研究所 Wide-field-of-view constant-diyuan optical remote sensing imaging method
CN108428220A (en) * 2018-03-05 2018-08-21 武汉大学 Satellite sequence remote sensing image sea island reef region automatic geometric correction method
CN108492711A (en) * 2018-04-08 2018-09-04 黑龙江工业学院 A kind of drawing electronic map method and device
CN109242894A (en) * 2018-08-06 2019-01-18 广州视源电子科技股份有限公司 Image alignment method and system based on mobile least square method
CN109508674A (en) * 2018-11-13 2019-03-22 佳木斯大学 Airborne lower view isomery image matching method based on region division
CN114004770A (en) * 2022-01-04 2022-02-01 成都国星宇航科技有限公司 Method and device for accurately correcting satellite space-time diagram and storage medium
CN116521927A (en) * 2023-06-30 2023-08-01 成都智遥云图信息技术有限公司 Remote sensing image matching method and system based on network map tiles

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080089558A1 (en) * 2004-12-16 2008-04-17 Centre National D'etudes Spatiales Method for Processing Images Using Automatic Georeferencing of Images Derived from a Pair of Images Captured in the Same Focal Plane
CN102663680A (en) * 2012-03-06 2012-09-12 中国科学院对地观测与数字地球科学中心 A geometric correction method of images based on surface characters
CN103218783A (en) * 2013-04-17 2013-07-24 国家测绘地理信息局卫星测绘应用中心 Fast geometric correction method for satellite remote sensing image and based on control point image database
CN103337052A (en) * 2013-04-17 2013-10-02 国家测绘地理信息局卫星测绘应用中心 Automatic geometric correction method for wide remote-sensing images

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
ARMIN GRUEN: "DEVELOPMENT AND STATUS OF IMAGE MATCHING IN PHOTOGRAMMETRY", 《THE PHOTOGRAMMETRIC RECORD》 *
JIAO WEILI等: "A NEW METHOD FOR GEOMETRIC QUALITY EVALUATION OF REMOTE SENSING IMAGE BASED ON INFORMATION ENTROPY", 《THE INTERNATIONAL ARCHIVES OF THE PHOTOGRAMMETRY, REMOTE SENSING AND SPATIAL INFORMATION SCIENCES》 *
KHALIL AL-JOBURI: "An Automated Method for Geo-Referencing Satellite Images Using Google Tile Scheme", 《WORLD APPLIED SCIENCES JOURNAL》 *
LONG TENGFEI等: "A generic framework for image rectification using multiple types of feature", 《ISPRS JOURNAL OF PHOTOGRAMMETRY AND REMOTE SENSING》 *
LONG TENGFEI等: "Automatic Line Segment Registration Using Gaussian Mixture Model and Expectation-Maximization Algorithm", 《IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING》 *
OZGE C. OZCANLI等: "Automatic Geo-location Correction of Satellite Imagery", 《2014 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS》 *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106408028A (en) * 2016-09-26 2017-02-15 珠海市测绘院 Urban-rural planning inspection and plotting data processing method
CN106408028B (en) * 2016-09-26 2020-04-14 珠海市测绘院 Urban and rural planning inspection surveying and mapping data processing method
CN106595598A (en) * 2016-12-21 2017-04-26 上海航天控制技术研究所 Wide-field-of-view constant-diyuan optical remote sensing imaging method
CN106595598B (en) * 2016-12-21 2019-03-19 上海航天控制技术研究所 A kind of first optical remote sensing imaging method in permanent ground of wide visual field
CN108428220A (en) * 2018-03-05 2018-08-21 武汉大学 Satellite sequence remote sensing image sea island reef region automatic geometric correction method
CN108492711A (en) * 2018-04-08 2018-09-04 黑龙江工业学院 A kind of drawing electronic map method and device
CN109242894B (en) * 2018-08-06 2021-04-09 广州视源电子科技股份有限公司 Image alignment method and system based on mobile least square method
CN109242894A (en) * 2018-08-06 2019-01-18 广州视源电子科技股份有限公司 Image alignment method and system based on mobile least square method
CN109508674A (en) * 2018-11-13 2019-03-22 佳木斯大学 Airborne lower view isomery image matching method based on region division
CN114004770A (en) * 2022-01-04 2022-02-01 成都国星宇航科技有限公司 Method and device for accurately correcting satellite space-time diagram and storage medium
CN114004770B (en) * 2022-01-04 2022-04-26 成都国星宇航科技有限公司 Method and device for accurately correcting satellite space-time diagram and storage medium
CN116521927A (en) * 2023-06-30 2023-08-01 成都智遥云图信息技术有限公司 Remote sensing image matching method and system based on network map tiles
CN116521927B (en) * 2023-06-30 2024-02-13 成都智遥云图信息技术有限公司 Remote sensing image matching method and system based on network map tiles

Also Published As

Publication number Publication date
CN105160684B (en) 2019-01-18

Similar Documents

Publication Publication Date Title
CN105160684A (en) Online automatic matching method for geometric correction of remote sensing image
CN107504981B (en) Satellite attitude error correction method and device based on laser height measurement data
CN105046251B (en) A kind of automatic ortho-rectification method based on environment No.1 satellite remote-sensing image
Ermolaev et al. Automated construction of the boundaries of basin geosystems for the Volga Federal District
CN102194225A (en) Automatic registering method for coarse-to-fine space-borne synthetic aperture radar image
CN105444778B (en) A kind of star sensor based on imaging geometry inverting is in-orbit to determine appearance error acquisition methods
CN102855628B (en) Automatic matching method for multisource multi-temporal high-resolution satellite remote sensing image
CN109100719B (en) Terrain map joint mapping method based on satellite-borne SAR (synthetic aperture radar) image and optical image
Báčová et al. A GIS method for volumetric assessments of erosion rills from digital surface models
CN106709944A (en) Satellite remote sensing image registration method
CN116597013A (en) Satellite image geometric calibration method based on different longitude and latitude areas
CN102147249B (en) Method for precisely correcting satellite-borne optical linear array image based on linear characteristic
CN105571598B (en) A kind of assay method of laser satellite altimeter footmark camera posture
Yang et al. Extracting road centrelines from high-resolution satellite images using active window line segment matching and improved SSDA
CN109579796B (en) Area network adjustment method for projected image
CN106228593B (en) A kind of image dense Stereo Matching method
CN102663680A (en) A geometric correction method of images based on surface characters
CN103177441A (en) Image geometric correction method based on straight line segments
CN104166977A (en) Image matching similarity measuring method and image matching method thereof
CN101446642B (en) Automatic matching method for remote sensing satellite data ground control point based on knowledge learning
Lu et al. Estimation of Transformation Parameters Between Centre‐Line Vector Road Maps and High Resolution Satellite Images
Sohn et al. Rational function model‐based image matching for digital elevation models
CN115326025A (en) Binocular image measuring and predicting method for sea waves
Yan et al. Polygon-based image registration: A new approach for geo-referencing historical maps
CN113124834A (en) Regional network adjustment method and system combining multi-source data and storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20190118

Termination date: 20190930