CN113516606B - Slope radar image and optical photo matching fusion method assisted by oblique photographing data - Google Patents

Slope radar image and optical photo matching fusion method assisted by oblique photographing data

Info

Publication number
CN113516606B
CN113516606B (application CN202110244381.8A)
Authority
CN
China
Prior art keywords
image
radar
point
matching
rough
Prior art date
Legal status
Active
Application number
CN202110244381.8A
Other languages
Chinese (zh)
Other versions
CN113516606A (en)
Inventor
郑翔天
何秀凤
李金旺
Current Assignee
Hohai University HHU
Original Assignee
Hohai University HHU
Priority date
Filing date
Publication date
Application filed by Hohai University HHU filed Critical Hohai University HHU
Priority to CN202110244381.8A priority Critical patent/CN113516606B/en
Publication of CN113516606A publication Critical patent/CN113516606A/en
Application granted granted Critical
Publication of CN113516606B publication Critical patent/CN113516606B/en

Classifications

    • G — PHYSICS; G06 — COMPUTING; CALCULATING OR COUNTING; G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 7/73 — Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/10024 — Color image
    • G06T 2207/10032 — Satellite or aerial image; Remote sensing
    • G06T 2207/10044 — Radar image
    • G06T 2207/20221 — Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method for matching and fusing slope radar images and optical photographs with the assistance of oblique photography data. First, geometric mapping is completed using the spatial relationship between the ground-based radar station-setting information and the oblique imagery. Then an instantaneous-view image is obtained by perspective projection transformation according to the shooting-point coordinates and orientation recorded in the oblique photography data; common points between the instantaneous-view image and the photograph are extracted, and affine transformation matching optimized by least squares yields the pixel mapping relation between the radar measurement image and the photograph. Finally, an image fusion algorithm is introduced to fuse the real-time deformation with the color space of the photograph, producing a photograph-deformation fusion image. The invention uses the three-dimensional information of the oblique imagery to assist the matching of the ground-based radar image and the optical photograph, avoiding the large matching errors that conventional methods incur by directly applying two-dimensional image transformations while ignoring the difference in imaging principles.

Description

Slope radar image and optical photo matching fusion method assisted by oblique photographing data
Technical Field
The invention relates to a method for matching and fusing slope radar images and optical photographs assisted by oblique photography data, and belongs to the technical field of open-pit mine slope deformation monitoring.
Background
Landslides are geological disasters second only to earthquakes, occurring most frequently and causing the most serious losses. After a landslide disaster, large numbers of emergency rescue personnel and engineering vehicles are on site, and losses would be incalculable if a secondary disaster occurred. Continuously monitoring the residual rock mass of the landslide, analyzing its deformation characteristics, and having experts judge and locate dangerous deformation positions for timely early warning is currently the widely accepted technical route for on-site emergency monitoring after landslide disasters. Ground-based interferometric synthetic aperture radar (GB-InSAR), a research hotspot in recent years, has proven to be a powerful tool for regional deformation monitoring and offers a short measurement interval (a revisit period on the order of minutes). Existing landslide monitoring cases show that its measured accuracy can reach the sub-millimeter level, so GB-InSAR holds great promise in landslide emergency monitoring and early warning. However, the radar images in a polar coordinate system, and its measurements must be mapped into three-dimensional space before the human eye can perceive and locate dangerous areas; the raw results are therefore unsuitable for direct use by safety monitoring personnel, and the measurement results often need to be matched with photographs taken by the monitoring personnel.
Existing matching methods have low matching accuracy: they match the ground-based radar image and the photograph through two-dimensional image transformations, but the imaging principles of the two differ greatly, so the images cannot be accurately brought into correspondence and the results are difficult to use as interpretation references. Slope sites, however, often have three-dimensional data from laser scanning (LiDAR) and unmanned aerial vehicle oblique photography. Such three-dimensional data can, on the one hand, be matched to the radar image through geometric mapping based on range-Doppler analysis, and on the other hand, be matched to the instantaneous-view image of the oblique photography and in turn to the photograph. Matching and fusion of the ground-based slope deformation monitoring radar image and the photograph can therefore be completed with oblique photography assistance.
Disclosure of Invention
To overcome the defects of the prior art, the invention provides a method for matching and fusing slope radar deformation images and optical photographs that uses oblique photography as intermediate data, matching it with the slope deformation monitoring radar image and with the photograph respectively.
The invention adopts the following technical scheme for solving the technical problems:
the slope radar image and optical photo matching fusion method assisted by the oblique photography data comprises the following specific steps:
step 1, continuously measuring the antenna center coordinates of each stop position during the step-and-stop motion of the ground-based radar to obtain a point set P_rail, and performing straight-line fitting on P_rail to obtain the antenna trace vector;
step 2, calculating the relative slant range and relative azimuth angle between each point of the three-dimensional oblique photography data point set P_map and the radar, forming a two-dimensional point set P_map3D;
step 3, calculating the Euclidean distance between each point in P_map3D and each pixel point in the radar image; the pixel points corresponding to the minimum Euclidean distances form the nearest pixel point set P_I, giving the one-to-one mapping table T_I between P_map3D and P_I;
step 4, reading the geographic coordinates P_shot of the shooting point from the optical photograph I_shot, setting the observation point of P_map at P_shot, and adjusting the viewing angle of the three-dimensional oblique photography data by projective transformation to obtain the instantaneous image I_temp; saving the mapping table T_temp between P_map and the points of I_temp;
step 5, visually interpreting the two-dimensional coordinates of salient-feature objects in I_temp and I_shot to obtain the rough-matching common point set P_co, wherein P_co consists of two sub-point sets: one is the coordinate sub-point set of salient-feature objects in I_temp, and the other is the coordinate sub-point set of the same salient-feature objects in I_shot;
step 6, inputting P_co into the image affine transformation equation f to obtain initial values of its transformation parameters, the transformation parameters comprising a rotation factor, translation factors and a scaling factor;
step 7, substituting the initial transformation parameter values from step 6 into f to form transformation equation f1, and substituting all pixel points of I_shot into f1 to obtain the rough-matching image I_rough, forming the rough-matching mapping table T_rough between the pixels of I_shot and I_rough and completing the rough matching;
step 8, extracting image features in I_temp and I_rough with a feature extraction method to obtain a number of identical feature points of I_temp and I_rough, forming the fine-matching common point pair set P_fine, whose elements are the pixel coordinates of the identical feature points in I_temp and I_rough;
step 9, using P_fine from step 8 and the initial transformation parameter values from step 6, estimating the optimal transformation parameters of the image affine transformation equation f by least-squares iteration, and substituting the optimal parameters into f to form transformation equation f2;
step 10, substituting I_rough into transformation equation f2 to obtain I_fine, forming the fine-matching mapping table T_fine between the pixels of I_rough and I_fine; the mapping relation T_final between I_shot and the radar image P_I is then obtained by resampling and interpolation;
step 11, looking up the T_final mapping table and fusing the deformation value of each pixel point of the radar image with the RGB color channels of I_shot by an image fusion method to obtain the fused image I_fusion, completing the matching fusion process.
Further, the antenna trace vector in step 1 is directed from the radar operation start point to the end point.
Further, in step 2 the relative slant range and relative azimuth angle between each point of P_map and the radar are calculated using a range-Doppler algorithm.
Further, the relative slant range from the i-th vertex A_i of P_map to the radar is

R_i = sqrt((x_i − x_s)^2 + (y_i − y_s)^2 + (z_i − z_s)^2), i = 1, 2, …, N

where N represents the number of vertices of P_map, (x_i, y_i, z_i) are the three-dimensional coordinates of A_i, and (x_s, y_s, z_s) are the three-dimensional coordinates of the radar synthetic aperture center O_s;
the relative azimuth angle from the i-th vertex A_i of P_map to the radar is

θ_i = arcsin((O_sA′_i · O_sP_2) / (|O_sA_i| · |O_sP_2|))

where A′_i is the foot of the perpendicular from A_i to the antenna trace vector, |·| represents the modulus of a vector, and P_2 is the radar travel termination point.
Further, the image affine transformation equation f in step 6 is:

[x_t, y_t]^T = ρ · R · [x, y]^T + [T_x, T_y]^T

where (x_t, y_t) are the target coordinates, (x, y) are the coordinates to be matched, R is the rotation factor, T_x and T_y are the translation factors, and ρ is the scaling factor.
Compared with the prior art, the technical scheme provided by the invention has the following technical effects:
(1) The three-dimensional information of the oblique imagery assists the matching of the ground-based radar image and the optical photograph, avoiding the large matching errors that conventional methods incur by directly applying a two-dimensional image transformation and ignoring the difference in imaging principles;
(2) By identifying salient ground objects in the scene, first performing rough matching between the oblique-imagery instantaneous image and the optical photograph, and then using features to identify a large number of common points for optimizing the affine transformation equation parameters, the difficulty of selecting initial values in conventional transformation methods is resolved;
(3) The method is easy to implement and fast; by optimizing the transformation equation parameters through least-squares iteration over a large number of feature-identified common points, pixel-level matching fusion accuracy can be achieved;
(4) The deformation measurement image of the slope deformation monitoring radar is matched in real time with photographs taken by geological disaster investigators, giving high usability and providing data support for landslide risk assessment and disaster early warning.
Drawings
FIG. 1 is a schematic diagram of slope deformation monitoring radar image and optical photograph matching fusion assisted by oblique photography data;
FIG. 2 is a flowchart of the slope deformation monitoring radar image and optical photograph matching fusion method assisted by oblique photography data according to an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings. It will be apparent that the described embodiments are only some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to fall within the scope of the invention.
The invention relates to a method for matching and fusing slope radar images and optical photographs assisted by oblique photography data. First, geometric mapping is completed using the spatial relationship among the ground-based radar station-setting coordinates, the pitch angle information, and the oblique imagery. Then an instantaneous-view image is obtained by perspective projection transformation according to the shooting-point coordinates and orientation in the oblique photography data; common points between the instantaneous-view image and the optical photograph are extracted, and affine transformation matching optimized by least squares yields the pixel mapping relation between the radar image and the optical photograph. Finally, an image fusion algorithm is introduced to fuse the slope deformation monitoring radar image with the color space of the optical photograph, obtaining a photograph-deformation fusion diagram.
Examples
This embodiment provides a slope deformation monitoring radar image and optical photograph matching fusion method assisted by oblique photography data, the flow of which is shown in FIG. 2.
The antenna center coordinates of each stop position during the step-and-stop motion of the ground-based radar are continuously measured with common auxiliary surveying equipment such as a total station, three-dimensional laser scanning, or a static global navigation satellite system (GPS/GNSS) receiver. The point set measured during the radar's first run, the point set measured during the n-th run, and all runs in between are combined to obtain the point set P_rail.
The point set P_rail is fitted with a conventional straight-line fitting method to obtain the radar antenna trace vector, whose direction points from the radar operation start point to the end point.
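As an illustration, the straight-line fit of P_rail can be sketched as a principal-axis fit of the measured stop positions; the function name and the NumPy-based approach are assumptions for this sketch, not part of the patent:

```python
import numpy as np

def fit_antenna_trace(p_rail):
    """Fit a 3D straight line to the antenna stop positions P_rail and
    return (centroid, unit direction), with the direction oriented from
    the radar operation start point toward the end point."""
    pts = np.asarray(p_rail, dtype=float)
    centroid = pts.mean(axis=0)
    # The principal axis of the centered points is the best-fit line.
    _, _, vt = np.linalg.svd(pts - centroid)
    direction = vt[0]
    if np.dot(pts[-1] - pts[0], direction) < 0:
        direction = -direction  # enforce start -> end orientation
    return centroid, direction
```

The SVD's first right-singular vector is the direction of maximum spread, which for nearly collinear stop positions is the rail direction.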
The antenna trace vector and the three-dimensional oblique photography data point set P_map are taken as input parameters of the geometric mapping algorithm, and the relative slant range R_3D and relative azimuth angle θ_3D between each point in P_map and the radar are calculated with a conventional range-Doppler algorithm.
Let the synthetic aperture center of the radar be O_s. Each vertex (x_i, y_i, z_i) of P_map is traversed and its Euclidean distance to the aperture center O_s(x_s, y_s, z_s) is computed:

R_i = sqrt((x_i − x_s)^2 + (y_i − y_s)^2 + (z_i − z_s)^2), i = 1, 2, …, N

where R_i is the slant range of the i-th point relative to the aperture center O_s, and N is the total number of model points of P_map.
Each vertex of P_map is then traversed to compute its azimuth relative to the aperture center O_s, taking the side to the left of the vertical center plane as the negative radar azimuth:

θ_i = arcsin((O_sA′_i · O_sP_2) / (|O_sA_i| · |O_sP_2|))

where θ_i is the azimuth angle of the i-th point relative to the aperture center, A′_i is the foot of the perpendicular from the i-th point to the track axis, and |·| represents the modulus of a vector. This forms the two-dimensional point set P_map3D: {(R_3D, θ_3D)_1, (R_3D, θ_3D)_2, …, (R_3D, θ_3D)_N}. The Euclidean distance between each point of P_map3D and each grid point of the radar image I(r, θ) is computed; the minimum-distance points yield the nearest pixel point set P_I: {(R_2D, θ_2D)_1, (R_2D, θ_2D)_2, …, (R_2D, θ_2D)_N} and the one-to-one mapping table T_I between P_map3D and P_I.
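A minimal sketch of this geometric mapping (slant range, signed azimuth from the along-track component, and brute-force nearest radar grid cell); all names, the arcsin form of the azimuth, and the grid layout are assumptions of this sketch:

```python
import numpy as np

def map_points_to_radar_grid(p_map, o_s, track_dir, radar_r, radar_theta):
    """Project each 3D oblique-photography vertex to (slant range,
    azimuth) relative to the radar aperture center, then find its
    nearest radar image grid cell (the mapping table in index form)."""
    pts = np.asarray(p_map, dtype=float)
    o_s = np.asarray(o_s, dtype=float)
    u = np.asarray(track_dir, dtype=float)
    u = u / np.linalg.norm(u)          # unit antenna trace vector
    los = pts - o_s                    # line of sight to each vertex
    r3d = np.linalg.norm(los, axis=1)  # relative slant range R_3D
    # Signed along-track component over slant range -> signed azimuth;
    # points on one side of the vertical center plane come out negative.
    theta3d = np.arcsin(np.clip(los @ u / r3d, -1.0, 1.0))
    # Brute-force nearest radar grid cell in (range, azimuth) space.
    grid = np.stack(np.meshgrid(radar_r, radar_theta, indexing="ij"), axis=-1)
    flat = grid.reshape(-1, 2)
    d2 = (flat[None, :, 0] - r3d[:, None]) ** 2 \
         + (flat[None, :, 1] - theta3d[:, None]) ** 2
    nearest = d2.argmin(axis=1)
    return r3d, theta3d, nearest
```

For large P_map a k-d tree would replace the brute-force search, but the exhaustive form keeps the correspondence with the text explicit.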
In this embodiment the geographic coordinates of the shooting location of the optical photograph I_shot are obtained by GPS/GNSS; the shooting-point coordinates P_shot are read, the observation point of the oblique photography data is set at P_shot, and the line-of-sight direction then determines the instantaneous two-dimensional image I_temp of the oblique photography data. In this example the line of sight is the vector from the observation point P_shot to the scene center P_R0. A projective transformation adjusts the oblique photography viewing angle so that the instantaneous image I_temp is preliminarily consistent in content with the photograph I_shot. For the projective transformation, the oblique photography data are first transformed from the world coordinate system O-XYZ to the eye coordinate system P_shot-X_eY_eZ_e centered on the observation point, with the negative Z_e axis pointing along the viewing direction; both the world and eye coordinate systems are right-handed three-dimensional rectangular coordinate systems. P_map is then projected; projection here refers to transforming the object description from the world coordinate system to the target coordinate system, equivalent to superimposing the target coordinate system onto the world coordinate system through translation and rotation operations, and perspective projection is used. The plane parallel to X_eO_eY_e at distance f from the viewpoint serves as the projection plane of the instantaneous image I_temp; a point in the eye coordinate system then maps to the grid point coordinates (X_m, Y_m) of the instantaneous image I_temp by

X_m = −f · X_e / Z_e,  Y_m = −f · Y_e / Z_e

where (X_s, Y_s, Z_s) are the coordinates of P_shot in the world coordinate system O-XYZ, θ and α are the azimuth and pitch angles of the I_temp projection plane relative to the plane XOY, f is the distance from the observation point P_shot to the I_temp projection plane, and (X_M, Y_M, Z_M) is an arbitrary target point whose eye coordinates (X_e, Y_e, Z_e) are obtained by translating by P_shot and rotating by θ and α. The mapping table T_temp between P_map and the grid points of I_temp is saved.
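The perspective projection to the instantaneous image can be sketched as follows; the yaw-then-pitch rotation parameterization of the azimuth and pitch angles is an assumed reading of the description above, not the patent's exact matrices:

```python
import numpy as np

def project_to_instant_image(points, p_shot, azimuth, pitch, f):
    """Perspective-project world points onto the instantaneous-view
    image plane at distance f from the viewpoint P_shot. The negative
    Z_e axis of the eye frame points along the viewing direction."""
    pts = np.asarray(points, dtype=float) - np.asarray(p_shot, dtype=float)
    ca, sa = np.cos(azimuth), np.sin(azimuth)
    cp, sp = np.cos(pitch), np.sin(pitch)
    rz = np.array([[ca, sa, 0.0], [-sa, ca, 0.0], [0.0, 0.0, 1.0]])  # yaw
    rx = np.array([[1.0, 0.0, 0.0], [0.0, cp, sp], [0.0, -sp, cp]])  # pitch
    eye = pts @ (rx @ rz).T             # world -> eye coordinates
    z = -eye[:, 2]                      # distance along the view direction
    return f * eye[:, :2] / z[:, None]  # grid coordinates (X_m, Y_m)
```

With zero azimuth and pitch the eye frame coincides with the translated world frame, so the projection reduces to the pinhole formula on the plane Z_e = −f.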
The instantaneous image I_temp and the photograph I_shot are visually interpreted to obtain the two-dimensional coordinates of salient ground-object targets in both images, yielding the rough-matching common point set P_co. P_co contains one sub-point set of coordinates of salient-feature objects in I_temp and another sub-point set of coordinates of the same ground-object targets in I_shot, with np pairs of common points in total.
The rough-matching common point set P_co is input to the image affine transformation equation f:

[x_t, y_t]^T = ρ · R · [x, y]^T + [T_x, T_y]^T

to obtain the initial parameter values ρ_0, R_0, T_x0, T_y0. In equation f, (x_t, y_t) are the target coordinates, (x, y) are the coordinates to be matched, R is the rotation factor, T_x and T_y are the translation factors, and ρ is the scaling factor.
The initial parameter values are substituted back into equation f to form transformation equation f1; all pixels of I_shot are substituted into f1 to obtain the rough-matching image grid I_rough, giving the rough-matching mapping table T_rough and completing the rough matching.
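The rough matching above — estimating initial values of the scaling, rotation and translation factors from the common points P_co, then warping pixel coordinates — can be sketched as a linear similarity-transform fit; the formulation and names are assumptions of this sketch:

```python
import numpy as np

def estimate_similarity(p_src, p_dst):
    """Linear least-squares fit of x' = rho*R(phi)*x + T from point
    pairs (the initial values rho_0, phi_0, Tx0, Ty0 of equation f).
    Unknowns: a = rho*cos(phi), b = rho*sin(phi), tx, ty."""
    src = np.asarray(p_src, dtype=float)
    dst = np.asarray(p_dst, dtype=float)
    n = len(src)
    A = np.zeros((2 * n, 4))
    A[0::2] = np.c_[src[:, 0], -src[:, 1], np.ones(n), np.zeros(n)]
    A[1::2] = np.c_[src[:, 1],  src[:, 0], np.zeros(n), np.ones(n)]
    L = dst.reshape(-1)                 # interleaved x', y' observations
    a, b, tx, ty = np.linalg.lstsq(A, L, rcond=None)[0]
    return np.hypot(a, b), np.arctan2(b, a), tx, ty

def apply_similarity(points, rho, phi, tx, ty):
    """Substitute the parameters back into f and warp coordinates."""
    c, s = np.cos(phi), np.sin(phi)
    R = np.array([[c, -s], [s, c]])
    return rho * np.asarray(points, dtype=float) @ R.T + np.array([tx, ty])
```

Three or more non-collinear common points determine the four unknowns; extra pairs are reconciled in the least-squares sense.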
Under the control of the rough-matching result, point, line and plane features in I_temp and I_rough are extracted with a feature extraction method, and a large number of identical feature points are obtained as the fine-matching common point set P_fine.
Using the fine-matching common point set P_fine and the initial transformation parameters ρ_0, R_0, T_x0, T_y0, the parameters of affine transformation equation f1 are optimized by least-squares iteration. In this embodiment the optimization is iterative; taking a planar right-handed rectangular coordinate system as an example, the affine transformation is followed by a bilinear interpolation transformation, and the error equation is

V = B · x̂ − L

where n is the number of common points in P_fine, x̂ = (B^T P B)^{-1} B^T P L is the least-squares solution of the model parameters, and P is an equal-weight matrix. Least-squares iteration minimizes the error of the equation above. Rounding error is controlled through iterative computation using the unit-weight standard error

σ_0 = sqrt(V^T P V / r)

where the degree of freedom r is the difference between the number of error equations and the number of unknowns actually needed to solve them; each control point is regarded as an independent observation, so the weight matrix P is taken as the identity matrix. The optimal estimates are substituted back into f to form the transformation equation f2.
I_rough is substituted into transformation equation f2 to obtain I_fine, and the mapping relation is recorded as T_fine. At this point the following are available: the one-to-one mapping table T_I between the slope deformation monitoring radar image P_I and P_map3D (with P_map3D corresponding one-to-one with P_map), the mapping table T_temp between P_map and the points of I_temp, the rough-matching mapping table T_rough between I_shot and I_rough, and the fine-matching mapping table T_fine between I_rough and I_fine. Through conventional resampling and interpolation, the mapping relation T_final between the optical photograph I_shot and the radar image P_I is obtained.
Looking up the T_final mapping table, the deformation values at the grid points of the slope deformation radar image are fused with the red, green and blue (RGB) color channels of the photograph I_shot by an image fusion method. In this embodiment a conventional fusion method is adopted: the red (R) channel of the optical photograph is replaced with the radar-measured deformation:

I_fusion = [I_R, I_VG, I_VB]

where the fused image has red, green and blue channels; the red channel uses the deformation values I_R of the radar image P_I, and the green and blue channels use the green channel I_VG and blue channel I_VB of the optical photograph I_shot. This yields the fused image I_fusion and completes the matching fusion process. FIG. 1 shows a schematic diagram of the matching and fusion of the slope deformation monitoring radar image and the optical photograph assisted by oblique photography data.
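The red-channel replacement fusion can be sketched as below; the min-max scaling of deformation values to the 0-255 range is an assumption not specified in the text:

```python
import numpy as np

def fuse_deformation(photo_rgb, deformation):
    """Replace the red channel of the photograph with radar deformation
    values (min-max scaled to 0-255), keeping green and blue unchanged."""
    img = np.asarray(photo_rgb, dtype=float).copy()
    d = np.asarray(deformation, dtype=float)
    span = d.max() - d.min()
    # Scale deformation to 8-bit range (assumed normalization).
    red = np.rint((d - d.min()) / span * 255.0) if span > 0 else np.zeros_like(d)
    img[..., 0] = red                   # red channel <- deformation
    return img.astype(np.uint8)         # green and blue channels kept
```

Mapping deformation into one channel keeps the photograph's texture readable while color-coding the measured motion.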

Claims (5)

1. The slope radar image and optical photo matching fusion method assisted by the oblique photographing data is characterized by comprising the following specific steps:
step 1, continuously measuring the antenna center coordinates of each stop position during the step-and-stop motion of the ground-based radar to obtain a point set P_rail, and performing straight-line fitting on P_rail to obtain the antenna trace vector;
step 2, calculating the relative slant range and relative azimuth angle between each point of the three-dimensional oblique photography data point set P_map and the radar, forming a two-dimensional point set P_map3D;
step 3, calculating the Euclidean distance between each point in P_map3D and each pixel point in the radar image; the pixel points corresponding to the minimum Euclidean distances form the nearest pixel point set P_I, giving the one-to-one mapping table T_I between P_map3D and P_I;
step 4, reading the geographic coordinates P_shot of the shooting point from the optical photograph I_shot, setting the observation point of P_map at P_shot, and adjusting the viewing angle of the three-dimensional oblique photography data by projective transformation to obtain the instantaneous image I_temp; saving the mapping table T_temp between P_map and the points of I_temp;
step 5, visually interpreting the two-dimensional coordinates of salient-feature objects in I_temp and I_shot to obtain the rough-matching common point set P_co, wherein P_co consists of two sub-point sets: one is the coordinate sub-point set of salient-feature objects in I_temp, and the other is the coordinate sub-point set of the same salient-feature objects in I_shot;
step 6, inputting P_co into the image affine transformation equation f to obtain initial values of its transformation parameters, the transformation parameters comprising a rotation factor, translation factors and a scaling factor;
step 7, substituting the initial transformation parameter values from step 6 into f to form transformation equation f1, and substituting all pixel points of I_shot into f1 to obtain the rough-matching image I_rough, forming the rough-matching mapping table T_rough between the pixels of I_shot and I_rough and completing the rough matching;
step 8, extracting image features in I_temp and I_rough with a feature extraction method to obtain a number of identical feature points of I_temp and I_rough, forming the fine-matching common point pair set P_fine, whose elements are the pixel coordinates of the identical feature points in I_temp and I_rough;
step 9, using P_fine from step 8 and the initial transformation parameter values from step 6, estimating the optimal transformation parameters of the image affine transformation equation f by least-squares iteration, and substituting the optimal parameters into f to form transformation equation f2;
step 10, substituting I_rough into transformation equation f2 to obtain I_fine, forming the fine-matching mapping table T_fine between the pixels of I_rough and I_fine; the mapping relation T_final between I_shot and the radar image P_I is then obtained by resampling and interpolation;
step 11, looking up the T_final mapping table and fusing the deformation value of each pixel point of the radar image with the RGB color channels of I_shot by an image fusion method to obtain the fused image I_fusion, completing the matching fusion process.
2. The oblique photography data assisted slope radar image and optical photograph matching fusion method as claimed in claim 1, wherein the antenna trace vector in step 1 is directed from the radar operation start point to the end point.
3. The oblique photography data assisted slope radar image and optical photograph matching fusion method as claimed in claim 1, wherein in step 2 the relative slant range and relative azimuth angle between each point of P_map and the radar are calculated using a range-Doppler algorithm.
4. The oblique photography data assisted slope radar image and optical photograph matching fusion method as claimed in claim 3, wherein the relative slant range from the i-th vertex A_i of P_map to the radar is

R_i = sqrt((x_i − x_s)^2 + (y_i − y_s)^2 + (z_i − z_s)^2), i = 1, 2, …, N

where N represents the number of vertices of P_map, (x_i, y_i, z_i) are the three-dimensional coordinates of A_i, and (x_s, y_s, z_s) are the three-dimensional coordinates of the radar synthetic aperture center O_s;
the relative azimuth angle from the i-th vertex A_i of P_map to the radar is

θ_i = arcsin((O_sA′_i · O_sP_2) / (|O_sA_i| · |O_sP_2|))

where A′_i is the foot of the perpendicular from A_i to the antenna trace vector, |·| represents the modulus of a vector, and P_2 is the radar travel termination point.
5. The oblique photography data assisted slope radar image and optical photograph matching fusion method as claimed in claim 1, wherein the image affine transformation equation f in step 6 is:

[x_t, y_t]^T = ρ · R · [x, y]^T + [T_x, T_y]^T

where (x_t, y_t) are the target coordinates, (x, y) are the coordinates to be matched, R is the rotation factor, T_x and T_y are the translation factors, and ρ is the scaling factor.
CN202110244381.8A 2021-03-05 2021-03-05 Slope radar image and optical photo matching fusion method assisted by oblique photographing data Active CN113516606B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110244381.8A CN113516606B (en) 2021-03-05 2021-03-05 Slope radar image and optical photo matching fusion method assisted by oblique photographing data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110244381.8A CN113516606B (en) 2021-03-05 2021-03-05 Slope radar image and optical photo matching fusion method assisted by oblique photographing data

Publications (2)

Publication Number Publication Date
CN113516606A 2021-10-19
CN113516606B 2023-12-12

Family

ID=78061163

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110244381.8A Active CN113516606B (en) 2021-03-05 2021-03-05 Slope radar image and optical photo matching fusion method assisted by oblique photographing data

Country Status (1)

Country Link
CN (1) CN113516606B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106127683A (en) * 2016-06-08 2016-11-16 中国电子科技集团公司第三十八研究所 A kind of real-time joining method of unmanned aerial vehicle SAR image
KR102028324B1 (en) * 2019-02-26 2019-11-04 엘아이지넥스원 주식회사 Synthetic Aperture Radar Image Enhancement Method and Calculating Coordinates Method
CN112001955A (en) * 2020-08-24 2020-11-27 深圳市建设综合勘察设计院有限公司 Point cloud registration method and system based on two-dimensional projection plane matching constraint
CN112102458A (en) * 2020-08-31 2020-12-18 湖南盛鼎科技发展有限责任公司 Single-lens three-dimensional image reconstruction method based on laser radar point cloud data assistance


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Geometric mapping three-dimensional matching method for ground-based SAR images and terrain data; 杨俊, 乞耀龙, 谭维贤, 王彦平, 洪文; Journal of University of Chinese Academy of Sciences; Vol. 32, No. 3; full text *
Laser SLAM geometric feature map extraction method based on the Hough transform; 李万莉, 王文佳; Mechatronics, No. 07; full text *
Change detection of coastal wetlands by fusing high-resolution radar and optical imagery; 吴瑞娟, 何秀凤; Science of Surveying and Mapping, No. 011; full text *

Also Published As

Publication number Publication date
CN113516606A (en) 2021-10-19

Similar Documents

Publication Publication Date Title
CN111473739B (en) Video monitoring-based surrounding rock deformation real-time monitoring method for tunnel collapse area
EP1242966B1 (en) Spherical rectification of image pairs
EP2111530B1 (en) Automatic stereo measurement of a point of interest in a scene
EP1788349B1 (en) Method for geocoding a perspective image
US11508030B2 (en) Post capture imagery processing and deployment systems
CN103226838A (en) Real-time spatial positioning method for mobile monitoring target in geographical scene
CN114923477A (en) Multi-dimensional space-ground collaborative map building system and method based on vision and laser SLAM technology
CN116883604A (en) Three-dimensional modeling technical method based on space, air and ground images
Chellappa et al. On the positioning of multisensor imagery for exploitation and target recognition
Maurice et al. A photogrammetric approach for map updating using UAV in Rwanda
Choi et al. Precise geometric registration of aerial imagery and LIDAR data
CN112785686A (en) Forest map construction method based on big data and readable storage medium
CN113516606B (en) Slope radar image and optical photo matching fusion method assisted by oblique photographing data
CN115328181A (en) Method for positioning key target space in unmanned aerial vehicle power transmission line inspection
Kusuno et al. A method localizing an omnidirectional image in pre-constructed 3D wireframe map
Aliakbarpour et al. Geometric exploration of virtual planes in a fusion-based 3D data registration framework
Bai et al. Application of unmanned aerial vehicle multi-vision image 3D modeling in geological disasters
Lee et al. Automatic building reconstruction with satellite images and digital maps
Fridhi et al. DATA ADJUSTMENT OF THE GEOGRAPHIC INFORMATION SYSTEM, GPS AND IMAGE TO CONSTRUCT A VIRTUAL REALITY.
Ye et al. Photogrammetric Accuracy and Modeling of Rolling Shutter Cameras
CN116222592B (en) High-precision map generation method and system based on multi-source data
Trusheim et al. Cooperative localisation using image sensors in a dynamic traffic scenario
Ye et al. RIGOROUS GEOMETRIC MODELLING OF 1960s ARGON SATELLITE IMAGES FOR ANTARCTIC ICE SHEET STEREO MAPPING.
Mirisola Exploiting attitude sensing in vision-based navigation, mappping and tracking including results from an airship
Avila Forero Documentation of cultural heritage using 3D laser scanning and close-range photogrammetry

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant