CN112750075A - Low-altitude remote sensing image splicing method and device - Google Patents

Low-altitude remote sensing image splicing method and device

Info

Publication number
CN112750075A
CN112750075A (application number CN201911052390.6A)
Authority
CN
China
Prior art keywords
image
remote sensing
low-altitude remote sensing
transformation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911052390.6A
Other languages
Chinese (zh)
Inventor
吴凡路
王栋
闫得杰
孟庆宇
王征
关海南
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Original Assignee
Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changchun Institute of Optics Fine Mechanics and Physics of CAS filed Critical Changchun Institute of Optics Fine Mechanics and Physics of CAS
Priority to CN201911052390.6A priority Critical patent/CN112750075A/en
Publication of CN112750075A publication Critical patent/CN112750075A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/37Determination of transform parameters for the alignment of images, i.e. image registration using transform domain methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a low-altitude remote sensing image splicing method, which comprises the following steps: acquiring an original image; carrying out nonlinear distortion correction on the original image to obtain a primary image; performing oblique imaging geometric correction on the primary image to obtain an orthoimage; performing an approximate projection-similarity transformation on the ortho images for image registration; and carrying out image fusion on the images subjected to image registration to obtain a panoramic image. The invention also discloses a low-altitude remote sensing image splicing device. The low-altitude remote sensing image splicing method and device provided by the invention have the advantage of high image splicing precision.

Description

Low-altitude remote sensing image splicing method and device
Technical Field
The invention relates to the field of remote sensing equipment, in particular to a low-altitude remote sensing image splicing method and device.
Background
With the development and progress of science and technology, the demand for multimedia and image information is steadily increasing. This is reflected not only in ever higher resolution requirements for conventional video and images, but also in extended applications such as image stitching and virtual-reality technology. The public demand for high-resolution video and images is increasingly urgent, yet it is difficult to acquire large-field-of-view, high-resolution images directly because of hardware limitations and cost. Although professional equipment such as wide-angle and ultra-wide-angle lenses can acquire large-field, high-resolution images, such imaging equipment is expensive and hard to deploy on a large scale. To meet the urgent public need for high-quality, large-field-of-view, high-resolution panoramic images, image stitching technology emerged and has become a popular research direction, with applications spreading into remote sensing image processing, medical image processing, virtual reality, national defense and many other fields.
Facing the urgent demand of many industries for high-resolution, high-precision remote sensing image data, the unmanned aerial vehicle (UAV) airborne imaging system has become a feasible solution. In particular, small and medium-sized multi-rotor UAVs can achieve high-resolution imaging in low-altitude flight and have the advantages of low cost, low risk and short development cycles; they are widely used in vegetation and forest research, precision agriculture, three-dimensional terrain reconstruction, field search and rescue, disaster investigation and other fields. However, small and medium-sized UAVs fly at low altitude and carry imaging systems with small fields of view. To meet the requirements of large field of view and high resolution, the multi-frame images acquired by the imaging system are stitched into a large-field-of-view, high-resolution panorama by image stitching technology. The image stitching technology at the present stage, however, cannot meet the high-precision requirement of low-altitude remote sensing image stitching.
Disclosure of Invention
The invention aims to overcome the defects in the prior art, and adopts the following technical scheme:
in one aspect, the invention provides a low-altitude remote sensing image splicing method. The low-altitude remote sensing image splicing method comprises the following steps:
acquiring an original image;
carrying out nonlinear distortion correction on the original image to obtain a primary image;
performing oblique imaging geometric correction on the primary image to obtain an orthoimage;
performing an approximate projection-similarity transformation on the ortho images for image registration;
and carrying out image fusion on the images subjected to image registration to obtain a panoramic image.
In some embodiments, the nonlinear distortion correction is a correction of nonlinear distortion by conventional laboratory calibration data.
In some embodiments, the oblique imaging geometric correction projects the primary image from the pixel coordinate system into a local geodetic coordinate system, and then resamples the projected image on a grid defined in the north-east-down (NED) geodetic coordinate system according to the required ground resolution to obtain the orthoimage.
In some embodiments, the approximate projection-similarity transformation comprises: dividing the image to be registered into a first region and a second region, where the first region is transformed to the reference image with a projective transformation function and the second region is transformed with a similarity transformation function.
In some embodiments, the approximate projection-similarity transformation further comprises: replacing the global projective transformation with an approximate projective transformation function to weaken the local misalignment caused by parallax in the overlap region.
In some embodiments, the first region is an image overlap region and the second region is an image non-overlap region.
In some embodiments, performing the approximate projection-similarity transformation on the orthoimage for image registration comprises the steps of:
respectively extracting characteristic points of the reference image and the image to be registered;
carrying out initial matching on the feature points;
carrying out accurate matching on the feature points subjected to the initial matching;
performing global homography matrix transformation according to the matching points for performing accurate matching;
and performing approximate projection-similarity transformation on the image to be registered subjected to the global homography matrix transformation.
In some embodiments, the initial matching of the feature points searches the nearest-neighbour and second-nearest-neighbour points with a randomized kd-tree algorithm.
In some embodiments, the accurate matching is achieved by rejecting mismatched point pairs with the random sample consensus (RANSAC) algorithm after the initial matching is completed.
On the other hand, the invention provides a low-altitude remote sensing image splicing device. The low latitude remote sensing image splicing apparatus includes:
the unmanned aerial vehicle airborne imaging system is used for acquiring an original image;
the nonlinear distortion correction module is used for carrying out nonlinear distortion correction on the original image to obtain a primary image;
the oblique imaging geometric correction module is used for performing oblique imaging geometric correction on the primary image to obtain an orthoimage;
an image registration module, configured to perform approximate projection-similarity transformation on the ortho-image to perform image registration;
and the image fusion module is used for carrying out image fusion on the images subjected to the image registration to obtain a panoramic image.
The invention has the following technical effects: the low-altitude remote sensing image stitching method and device first perform oblique imaging geometric correction on the images, so as to compensate for the rotation, scaling and other changes of ground objects caused by slight variations in the flying height and flight attitude of the unmanned aerial vehicle; this geometric correction ensures that the ground-object scale is consistent across the images to be stitched. Furthermore, by performing the approximate projection-similarity transformation on the orthoimages for image registration, the stitching method combining approximate projective transformation with similarity transformation weakens both the local misalignment in the overlap region and the deformation in the non-overlap region, finally yielding a high-precision stitched image.
Drawings
FIG. 1 is a frame diagram of a low-altitude remote sensing image stitching method according to an embodiment of the invention;
FIG. 2 is a schematic diagram of a tilt imaging geometry correction according to one embodiment of the present invention;
FIG. 3 is a flow diagram of coordinate transformation for tilted imaging geometry correction, according to one embodiment of the present invention;
FIG. 4 is a schematic flow chart of a low-altitude remote sensing image stitching method according to an embodiment of the invention;
fig. 5 is a schematic block diagram of a low-altitude remote sensing image stitching device according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not to be construed as limiting the invention.
To meet the urgent public need for high-quality, large-field-of-view, high-resolution panoramic images, many related solutions exist both in China and abroad. The commercial image stitching systems in wide use today are all products of foreign companies. For example, the Ladybug2 of Point Grey Research uses six cameras to acquire 360-degree panoramic images in the horizontal direction, and Google Street View also relies on image stitching technology to display and roam urban street panoramas. The main techniques adopted by these products are global homography registration and alignment, cylindrical or spherical projection, bundle adjustment, and multi-band fusion. With AutoStitch proposed by Brown as a milestone, a variety of stitching software and applications have emerged, such as the stitching modules in Microsoft ICE (Image Composite Editor) and Photoshop. However, these image stitching algorithms require the input images to satisfy the following assumption: the depth variation of the actual scene corresponding to the overlap region can be neglected. If this assumption is not met, obvious ghosting and misalignment arise from parallax; a global homography matrix cannot resolve the parallax problem and therefore cannot meet the high-precision requirement of low-altitude remote sensing image stitching.
For example, the classical image stitching algorithm registers and aligns the image sequence to be stitched with a global transformation model, and requires that the depth variation of the actual scene corresponding to the overlap region of the input images be negligible; otherwise obvious ghosting and misalignment arise from parallax. The most common global transformation is the projective transformation (image registration and alignment with a global homography matrix). Stitching results obtained by aligning images with a global projective transformation suffer from two problems: local misalignment in the image overlap region, which causes blurring and ghosting in the stitched image; and distortion in the non-overlap region, especially severe stretching at the edge of the field of view, which appears visually as shape distortion and non-uniform scaling.
As another example, the dual-homography transformation algorithm clusters the feature points, divides the actual scene into a near plane and a far plane to obtain two homography matrices, and stitches the images with a weighted homography whose weights are assigned according to the Euclidean distances from each point to the two cluster centres. The algorithm stitches well for scenes that clearly separate into a near plane and a far plane, but seamless stitching is difficult when it is extended to arbitrary scenes, and manual intervention and post-processing are needed.
As a further example, the as-projective-as-possible (APAP) transformation algorithm divides the image into a dense grid, registers and aligns each grid cell in the overlap region of the images to be stitched with a local homography matrix, and smoothly extrapolates into the non-overlap region with the Moving DLT method. Because the transformation this algorithm applies in the non-overlap region approximates the global projective transformation, the image still suffers from deformation, and in particular the stretching at the edge of the field of view is severe.
In addition, the inventors of the present application found that, besides lens distortion, error factors introduced by machining, assembly and adjustment also cause the images acquired by the imaging system to exhibit nonlinear distortion to varying degrees. Slight changes in the flight attitude of the unmanned aerial vehicle mean that the airborne imaging system actually works in an oblique imaging mode. Compared with an orthoimage, oblique imaging brings more complicated geometric deformation and scale changes, and directly stitching the acquired original images causes stitching errors (such as ghosting and misalignment). Moreover, changes in the flying height and flight attitude of the unmanned aerial vehicle also cause rotation, scaling and other changes of the ground objects. All of these factors affect the accuracy of image stitching.
To address these problems, the invention provides a low-altitude remote sensing image stitching method combining approximate projective transformation with similarity transformation: the approximate projective transformation handles the local alignment problem in the overlap region, and the similarity transformation handles the deformation that the global projective transformation causes in the non-overlap region. The method combines the advantages of approximate projective transformation and similarity transformation, and can achieve high-quality, high-precision stitching of the images acquired by the UAV airborne imaging system.
Referring to fig. 4, a low-altitude remote sensing image stitching method according to an embodiment of the invention is illustrated. The low-altitude remote sensing image splicing method comprises the following steps:
s1, acquiring an original image;
s2, carrying out nonlinear distortion correction on the original image to obtain a primary image;
s3, performing oblique imaging geometric correction on the primary image to obtain an orthoimage;
s4, performing approximate projection-similarity transformation on the orthoimage to perform image registration;
and S5, carrying out image fusion on the images subjected to image registration to obtain a panoramic image.
In some embodiments, the nonlinear distortion correction is a correction of nonlinear distortion by conventional laboratory calibration data.
In some embodiments, the oblique imaging geometric correction projects the primary image from the pixel coordinate system into a local geodetic coordinate system, and then resamples the projected image on a grid defined in the north-east-down (NED) geodetic coordinate system according to the required ground resolution to obtain the orthoimage.
In some embodiments, the approximate projection-similarity transformation comprises: dividing the image to be registered into a first region and a second region, where the first region is transformed to the reference image with a projective transformation function and the second region is transformed with a similarity transformation function.
In some embodiments, the approximate projection-similarity transformation further comprises: replacing the global projective transformation with an approximate projective transformation function to weaken the local misalignment caused by parallax in the overlap region.
In some embodiments, the first region is an image overlap region and the second region is an image non-overlap region.
In some embodiments, the step S4, performing an approximate projection-similarity transformation on the ortho image for image registration includes the steps of:
respectively extracting characteristic points of the reference image and the image to be registered;
carrying out initial matching on the feature points;
carrying out accurate matching on the feature points subjected to the initial matching;
performing global homography matrix transformation according to the matching points for performing accurate matching;
and performing approximate projection-similarity transformation on the image to be registered subjected to the global homography matrix transformation.
In some embodiments, the initial matching of the feature points searches the nearest-neighbour and second-nearest-neighbour points with a randomized kd-tree algorithm.
In some embodiments, the accurate matching is achieved by rejecting mismatched point pairs with the random sample consensus (RANSAC) algorithm after the initial matching is completed.
On the other hand, as shown in fig. 5, the invention provides a low-altitude remote sensing image stitching device 100. The low-altitude remote sensing image stitching device 100 includes:
the unmanned aerial vehicle airborne imaging system 10 is used for acquiring an original image;
the nonlinear distortion correction module 20 is configured to perform nonlinear distortion correction on the original image to obtain a primary image;
the oblique imaging geometric correction module 30 is used for performing oblique imaging geometric correction on the primary image to obtain an orthoimage;
an image registration module 40, configured to perform an approximate projection-similarity transformation on the ortho-image to perform image registration;
and an image fusion module 50, configured to perform image fusion on the image subjected to image registration to obtain a panoramic image.
The invention has the following technical effects: the low-altitude remote sensing image stitching method and device first perform oblique imaging geometric correction on the images, so as to compensate for the rotation, scaling and other changes of ground objects caused by slight variations in the flying height and flight attitude of the unmanned aerial vehicle; this geometric correction ensures that the ground-object scale is consistent across the images to be stitched. Furthermore, by performing the approximate projection-similarity transformation on the orthoimages for image registration, the stitching method combining approximate projective transformation with similarity transformation weakens both the local misalignment in the overlap region and the deformation in the non-overlap region, finally yielding a high-precision stitched image.
The following detailed description of the present invention will be made with reference to fig. 1 to 3.
Example 1:
the low-altitude remote sensing image stitching method provided by the embodiment combines the approximate projection transformation and the similarity transformation to perform image stitching, and the specific implementation mode is as follows.
(1) The original image acquired by the UAV airborne imaging system has nonlinear distortion, which can be corrected based on conventional laboratory calibration data.
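As a concrete illustration, this correction can be carried out with OpenCV's undistortion routine. The sketch below is a minimal example, assuming hypothetical laboratory calibration results (camera matrix and distortion coefficients) and an illustrative file name; it is not the patent's own implementation.

```python
import cv2
import numpy as np

# Hypothetical laboratory calibration results (focal lengths, principal point, radial/tangential terms).
K = np.array([[3500.0,    0.0, 2000.0],
              [   0.0, 3500.0, 1500.0],
              [   0.0,    0.0,    1.0]])
dist = np.array([-0.12, 0.05, 0.0008, -0.0006, 0.0])  # k1, k2, p1, p2, k3

raw = cv2.imread("frame_0001.jpg")                                        # original image (illustrative name)
new_K, _ = cv2.getOptimalNewCameraMatrix(K, dist, raw.shape[1::-1], alpha=0)
primary = cv2.undistort(raw, K, dist, None, new_K)                        # primary image after correction
```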
(2) Slight changes in the flight attitude of the unmanned aerial vehicle mean that the airborne imaging system actually works in an oblique imaging mode; in addition, changes in the flying height and flight attitude of the unmanned aerial vehicle cause rotation, scaling and other changes of the ground objects. Oblique-imaging keystone-distortion correction must therefore be performed on the images to ensure consistent ground-object scale in the images to be stitched. As shown in FIG. 2, the geometric correction of the oblique-imaging keystone distortion involves the north-east-down (NED) geodetic coordinate system F_wned, the sensor NED coordinate system F_isned, the sensor body coordinate system F_ibody, the camera auxiliary coordinate system O_c-X'_cY_cZ_c and the camera coordinate system O_c-X_cY_cZ_c; the transformation chain from F_wned to O_c-X_cY_cZ_c is shown in FIG. 3. The geometric correction projects the primary image from the pixel coordinate system O-UV into the local geodetic coordinate system (i.e. the NED coordinate system F_wned), and then resamples the projected image on a grid defined in F_wned according to the required ground resolution to obtain the orthoimage.
(3) Without ground control points, and with the deformation caused by the earth's curvature and terrain relief approximated as zero, establishing the correspondence between the image-point coordinates (u, v) in the pixel coordinate system O-UV and the object-point coordinates (x_wned, y_wned, z_wned) in the NED coordinate system F_wned requires the position and attitude of the camera coordinate system O_c-X_cY_cZ_c, acquired by the GNSS/INS integrated navigation system.
The current position delivered by the GNSS/INS integrated navigation system is the longitude, latitude and elevation (denoted L, B and H respectively) in the WGS84 (World Geodetic System 1984) coordinate system, and must be converted into coordinates in the geocentric rectangular (earth-centred, earth-fixed) coordinate system. The conversion relationship is

x_{ecef} = (N + H)\cos B \cos L, \quad y_{ecef} = (N + H)\cos B \sin L, \quad z_{ecef} = [N(1 - e^2) + H]\sin B    (1)

where e = \sqrt{a^2 - b^2}/a is the first eccentricity of the WGS84 reference ellipsoid, a = 6378137 m is the semi-major axis of the ellipsoid, b = 6356752.3142 m is the semi-minor axis of the ellipsoid, and N = a/\sqrt{1 - e^2 \sin^2 B} is the radius of curvature of the prime vertical at the current position.
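A minimal NumPy sketch of the conversion in equation (1), using the WGS84 constants quoted above; function and variable names are chosen only for illustration.

```python
import numpy as np

WGS84_A = 6378137.0                                    # semi-major axis a [m]
WGS84_B = 6356752.3142                                 # semi-minor axis b [m]
WGS84_E2 = (WGS84_A**2 - WGS84_B**2) / WGS84_A**2      # first eccentricity squared e^2

def geodetic_to_ecef(L_deg, B_deg, H):
    """Convert WGS84 longitude L, latitude B (degrees) and elevation H (m) to geocentric coordinates, eq. (1)."""
    L, B = np.radians(L_deg), np.radians(B_deg)
    N = WGS84_A / np.sqrt(1.0 - WGS84_E2 * np.sin(B)**2)   # prime-vertical radius of curvature
    x = (N + H) * np.cos(B) * np.cos(L)
    y = (N + H) * np.cos(B) * np.sin(L)
    z = (N * (1.0 - WGS84_E2) + H) * np.sin(B)
    return np.array([x, y, z])
```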
Suppose two points P_0 and P_1 lie within a small region and their WGS84 coordinates (L_0, B_0, H_0) and (L_1, B_1, H_1) are known. A NED geodetic coordinate system is established with P_0 as its origin; the rotation matrix from the geocentric rectangular coordinate system to this NED coordinate system is

R_{ecef2wned} = \begin{bmatrix} -\sin B\cos L & -\sin B\sin L & \cos B \\ -\sin L & \cos L & 0 \\ -\cos B\cos L & -\cos B\sin L & -\sin B \end{bmatrix}    (2)

where (L, B, H) are the longitude, latitude and elevation of the origin of the NED coordinate system, here (L_0, B_0, H_0).
The geocentric rectangular coordinates (x_{0,ecef}, y_{0,ecef}, z_{0,ecef}) and (x_{1,ecef}, y_{1,ecef}, z_{1,ecef}) of the two points are first obtained from equation (1); the coordinates of P_1 in the NED coordinate system with P_0 as origin are then

\begin{bmatrix} x_{1,wned} \\ y_{1,wned} \\ z_{1,wned} \end{bmatrix} = R_{ecef2wned} \begin{bmatrix} x_{1,ecef} - x_{0,ecef} \\ y_{1,ecef} - y_{0,ecef} \\ z_{1,ecef} - z_{0,ecef} \end{bmatrix}    (3)
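A sketch of equations (2) and (3), building on geodetic_to_ecef above; function names are illustrative.

```python
def ecef_to_ned_rotation(L_deg, B_deg):
    """Rotation matrix R_ecef2wned from the geocentric frame to a NED frame at longitude L, latitude B, eq. (2)."""
    L, B = np.radians(L_deg), np.radians(B_deg)
    sL, cL, sB, cB = np.sin(L), np.cos(L), np.sin(B), np.cos(B)
    return np.array([[-sB * cL, -sB * sL,  cB],
                     [     -sL,       cL, 0.0],
                     [-cB * cL, -cB * sL, -sB]])

def wgs84_to_local_ned(L_deg, B_deg, H, origin_LBH):
    """Coordinates of the point (L, B, H) in the NED frame whose origin is origin_LBH = (L0, B0, H0), eq. (3)."""
    p_ecef = geodetic_to_ecef(L_deg, B_deg, H)
    o_ecef = geodetic_to_ecef(*origin_LBH)
    R = ecef_to_ned_rotation(origin_LBH[0], origin_LBH[1])
    return R @ (p_ecef - o_ecef)
```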
in the image splicing process, a sensor body coordinate system F corresponding to a first group of longitude and latitude elevation values acquired by a GNSS/INS integrated navigation systembodyEstablishing a northeast coordinate system as a world coordinate system F in the whole processing flow by taking an origin as the origin of the coordinate systemwnedAnd synchronously obtaining the sensor body coordinate system F at the moment when each frame of image is acquiredbodyWGS84 coordinate of origin, sensor body coordinate system FbodyNorth east earth coordinate system F of relative sensorsnedCan approximate all transient sensor northeast coordinate system F in the working rangesnedAre all parallel, so that each image point, the optical center, is in the world coordinate system FwnedThe coordinates of the lower are all determinable.
For ideal pinhole imaging, the coordinates P(x_wned, y_wned, z_wned) of an object point P in the NED coordinate system F_wned and the pixel coordinates p'(u, v) satisfy

z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = M_{IN} M_{EX} \begin{bmatrix} x_{wned} \\ y_{wned} \\ z_{wned} \\ 1 \end{bmatrix}    (4)

where M_IN is the camera intrinsic parameter matrix, M_EX is the camera extrinsic parameter matrix, (x_c, y_c, z_c) are the coordinates of the point P in the camera coordinate system O_c-X_cY_cZ_c, and z_wned equals zero. The concrete form of M_IN is

M_{IN} = \begin{bmatrix} f/dx & 0 & u_0 \\ 0 & f/dy & v_0 \\ 0 & 0 & 1 \end{bmatrix}    (5)

where f is the effective focal length of the imaging system, dx and dy are the pixel sizes of the imaging system, and (u_0, v_0) are the pixel coordinates of the principal point of the image (ideally the geometric centre of the image). M_EX represents the transformation from the NED coordinate system F_wned to the camera coordinate system O_c-X_cY_cZ_c.
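A small sketch of the intrinsic matrix of equation (5); the focal length, pixel pitch and sensor size below are hypothetical values, not calibration data from the patent.

```python
def intrinsic_matrix(f, dx, dy, u0, v0):
    """Camera intrinsic parameter matrix M_IN of equation (5); f, dx, dy in the same length unit."""
    return np.array([[f / dx, 0.0,    u0],
                     [0.0,    f / dy, v0],
                     [0.0,    0.0,    1.0]])

# Hypothetical example: 35 mm lens, 4.5 um pixels, 4000 x 3000 sensor.
M_IN = intrinsic_matrix(35.0, 0.0045, 0.0045, 2000.0, 1500.0)
```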
First, according to the position information acquired by the GNSS/INS integrated navigation system at the moment the i-th image is captured, the translation T_iws between the NED coordinate system F_wned and the sensor NED coordinate system F_isned at that moment is calculated, i.e. the coordinates of the origin of F_wned in the sensor NED coordinate system F_isned. The origin of the NED coordinate system F_wned and the origin of the sensor NED coordinate system F_isned have geocentric rectangular coordinates (x_{0,ecef}, y_{0,ecef}, z_{0,ecef}) and (x_{i,ecef}, y_{i,ecef}, z_{i,ecef}) respectively, obtained from equation (1); T_iws then follows from

T_{iws} = -\begin{bmatrix} x_{i,wned} \\ y_{i,wned} \\ z_{i,wned} \end{bmatrix} = -R_{ecef2wned} \begin{bmatrix} x_{i,ecef} - x_{0,ecef} \\ y_{i,ecef} - y_{0,ecef} \\ z_{i,ecef} - z_{0,ecef} \end{bmatrix}    (6)

where (x_{i,wned}, y_{i,wned}, z_{i,wned}) are the coordinates of the origin of the sensor NED coordinate system F_isned in the NED coordinate system F_wned, and R_{ecef2wned}, the rotation matrix from the geocentric rectangular coordinate system F_ecef to the NED coordinate system F_wned, is obtained from equation (2).
Then, according to the three Euler angles acquired by the GNSS/INS integrated navigation system at the moment the i-th image is captured, the rotation matrix R_isb from the sensor NED coordinate system F_isned to the sensor body coordinate system F_ibody at that moment is calculated. Rotating the sensor NED coordinate system F_isned sequentially about its X_isned, Y_isned and Z_isned axes through the angles φ (roll angle), θ (pitch angle) and ψ (yaw angle) brings it into coincidence with the sensor body coordinate system F_ibody, so R_isb is the composition of the three elementary rotations R_{\varphi,X}, R_{\theta,Y} and R_{\psi,Z} about the X_isned, Y_isned and Z_isned axes (equations (7)-(8)), written with the shorthand c_* = cos(*), s_* = sin(*).
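As an illustration only, the sketch below builds a NED-to-body direction-cosine matrix from roll, pitch and yaw using the standard aerospace Z-Y-X sequence; the patent's own composition order and sign conventions are those of its equations (7)-(8), which may differ, so treat this as an assumption.

```python
def euler_to_dcm(roll, pitch, yaw):
    """NED-to-body direction-cosine matrix, assuming the aerospace Z-Y-X (yaw-pitch-roll) convention (radians)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    R_x = np.array([[1, 0, 0], [0, cr, sr], [0, -sr, cr]])     # elementary rotation about X (roll)
    R_y = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])     # elementary rotation about Y (pitch)
    R_z = np.array([[cy, sy, 0], [-sy, cy, 0], [0, 0, 1]])     # elementary rotation about Z (yaw)
    return R_x @ R_y @ R_z                                     # C_n^b = R_x(roll) R_y(pitch) R_z(yaw)
```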
The translation vector between the sensor body coordinate system F_body and the camera auxiliary coordinate system O_c-X'_cY_cZ_c is a constant (T_x, 0, 0)^T that can be obtained from the structural design parameters. The camera auxiliary coordinate system O_c-X'_cY_cZ_c and the camera coordinate system O_c-X_cY_cZ_c have identical Y and Z coordinates, while their X coordinates are opposite in sign.
Combining the above transformations gives the concrete form of M_EX (equation (9)): it is assembled from the rotation R_isb, the translation T_iws, the constant offset (T_x, 0, 0)^T and the axis reversal between the camera auxiliary and camera coordinate systems.
Equation (4) can then be written as

z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = M \begin{bmatrix} x_{wned} \\ y_{wned} \\ z_{wned} \\ 1 \end{bmatrix}, \quad M = M_{IN} M_{EX} = (m_{ij})_{3\times4}    (10)

where the elements m_ij of the projection matrix follow from the conversion process described above (equation (11)). Since z_wned = 0, the coordinates (x_wned, y_wned, 0) in the NED coordinate system F_wned of the object point corresponding to each image point (u, v) of the original image are obtained by solving

\begin{bmatrix} m_{11} - u m_{31} & m_{12} - u m_{32} \\ m_{21} - v m_{31} & m_{22} - v m_{32} \end{bmatrix} \begin{bmatrix} x_{wned} \\ y_{wned} \end{bmatrix} = \begin{bmatrix} u m_{34} - m_{14} \\ v m_{34} - m_{24} \end{bmatrix}    (12)
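A NumPy sketch of the back-projection of equation (12): given the 3x4 projection matrix M = M_IN·M_EX and a pixel (u, v), solve the 2x2 linear system for the ground-plane point with z_wned = 0. The function name is illustrative.

```python
def pixel_to_ground(M, u, v):
    """Back-project pixel (u, v) onto the ground plane z_wned = 0 through the 3x4 projection matrix M, eq. (12)."""
    A = np.array([[M[0, 0] - u * M[2, 0], M[0, 1] - u * M[2, 1]],
                  [M[1, 0] - v * M[2, 0], M[1, 1] - v * M[2, 1]]])
    b = np.array([u * M[2, 3] - M[0, 3],
                  v * M[2, 3] - M[1, 3]])
    x_wned, y_wned = np.linalg.solve(A, b)
    return x_wned, y_wned, 0.0
```

Mapping the image corners through this back-projection gives the ground footprint of each frame; a grid at the required ground resolution is then laid out in F_wned and resampled from the primary image to produce the orthoimage, as described in step (2).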
(4) For feature matching, initial matching uses the nearest-neighbour distance-ratio criterion. Let X_i be the feature vector of a feature point extracted from the reference image and Y_j the feature vector of a feature point extracted from the image to be registered. The Euclidean distance D_ij between the two feature vectors is

D_{ij} = \| X_i - Y_j \|_2    (13)

Let Y_a be the nearest neighbour of X_i and Y_b its second-nearest neighbour. If

D_{ia} / D_{ib} < Thr

then X_i and Y_a are regarded as corresponding (homonymous) points, where Thr is a threshold. A randomized kd-tree algorithm is used to search the nearest and second-nearest neighbours during initial matching. After the initial matching is completed, mismatched point pairs are rejected with the random sample consensus (RANSAC) algorithm.
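This matching step maps naturally onto OpenCV primitives: SIFT features, a FLANN kd-tree for the two nearest neighbours, the distance-ratio test, and RANSAC inside findHomography. The sketch below is a generic illustration of that pipeline, not the patent's exact feature type or parameter choices; the ratio threshold 0.7 and the reprojection threshold 3.0 are illustrative.

```python
import cv2
import numpy as np

def match_images(ref_gray, mov_gray, ratio_thr=0.7):
    """Ratio-test matching with a randomized kd-tree, followed by RANSAC outlier rejection."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(ref_gray, None)
    kp2, des2 = sift.detectAndCompute(mov_gray, None)

    flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=4), dict(checks=50))  # algorithm=1: kd-tree
    knn = flann.knnMatch(des2, des1, k=2)              # nearest and second-nearest neighbours

    good = [m for m, n in knn if m.distance < ratio_thr * n.distance]           # ratio test, eq. (13)
    src = np.float32([kp2[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)      # points in image to register
    dst = np.float32([kp1[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)      # points in reference image

    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)              # global homography + RANSAC
    keep = inlier_mask.ravel() == 1
    return H, src[keep], dst[keep]
```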
(5) The basic idea of image stitching is to transform the image sequence into a common coordinate system and then fuse the images. The transformation function can be written ω: (x, y) → (x', y'), where (x, y) and (x', y') are the coordinates of corresponding feature points in the image I to be stitched and in the reference image I' respectively; when the transformation is projective, the transformation function is denoted H(x, y) and H is the homography matrix. Let x = [x, y]^T and x' = [x', y']^T; the projective transformation then satisfies

\tilde{x}' \simeq H \tilde{x}, \quad H = \begin{bmatrix} h_1 & h_2 & h_3 \\ h_4 & h_5 & h_6 \\ h_7 & h_8 & 1 \end{bmatrix}    (14)

where \tilde{x} = [x, y, 1]^T and \tilde{x}' = [x', y', 1]^T are the homogeneous coordinates of x and x', and \simeq denotes equality up to a non-zero scale factor. Equation (14) can also be expressed in another form:

x' = \frac{h_1 x + h_2 y + h_3}{h_7 x + h_8 y + 1}, \quad y' = \frac{h_4 x + h_5 y + h_6}{h_7 x + h_8 y + 1}    (15)
Based on the obtained set of matched point pairs {x_i, x'_i} (i = 1, 2, ..., N), the homography matrix H can be estimated by the direct linear transformation (DLT). From equation (14),

0_{3\times1} = \tilde{x}'_i \times (H \tilde{x}_i) = A_i h    (16)

where h = [h_1, h_2, h_3, h_4, h_5, h_6, h_7, h_8, 1]^T and A_i is a 3×9 coefficient matrix. Only two of the three equations in (16) are linearly independent. Let d_i ∈ R^{2×9} be the matrix formed by the first two rows of the 3×9 matrix in equation (16) for the point pair {x_i, x'_i}; the d_i of the N point pairs are stacked into D ∈ R^{2N×9}, and the direct linear transformation estimates h as

\hat{h} = \arg\min_{\|h\|=1} \| D h \|^2    (17)

The estimate \hat{h} is the right singular vector of D associated with its smallest singular value, i.e. the eigenvector of D^T D with the smallest eigenvalue. The homography matrix H is obtained from \hat{h}; in essence, the direct linear transformation method is a least-squares problem.
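A NumPy sketch of the DLT estimate in equations (16)-(17): build the stacked matrix D from the matched point pairs and take the right singular vector associated with the smallest singular value. Function and variable names are illustrative.

```python
def estimate_homography_dlt(src_pts, dst_pts):
    """Direct linear transformation: estimate H mapping src_pts -> dst_pts (N x 2 arrays), eqs. (16)-(17)."""
    rows = []
    for (x, y), (xp, yp) in zip(src_pts, dst_pts):
        rows.append([0, 0, 0, -x, -y, -1, yp * x, yp * y, yp])
        rows.append([x, y, 1, 0, 0, 0, -xp * x, -xp * y, -xp])
    D = np.asarray(rows, dtype=float)                 # D in R^{2N x 9}
    _, _, Vt = np.linalg.svd(D)
    h = Vt[-1]                                        # right singular vector of the smallest singular value
    return (h / h[-1]).reshape(3, 3)                  # normalize so that the last entry equals 1
```

In practice the point coordinates are usually normalized (Hartley normalization) before building D for better numerical conditioning; that refinement is omitted here for brevity.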
To reduce the local misalignment caused by parallax in the image overlap region, the approximate projective transformation performs the projective transformation of the image I to be stitched with a location-dependent homography matrix H_*. The location-dependent homography H_* can be estimated from

h_* = \arg\min_{\|h\|=1} \sum_i \| w_i^* d_i h \|^2    (18)

where the weight w_i^* is determined by the distance from the feature point x_i to the current point x_*:

w_i^* = \exp\left( -\| x_* - x_i \|^2 / \sigma^2 \right)    (19)

with σ a scale parameter. Feature points x_i closer to the current point x_* receive larger weights, so the matrix H_* better captures the local information in the neighbourhood of x_*. In contrast to equation (14), which applies one global homography H to the whole image I to be stitched, the transformation matrix H_* obtained by traversing the whole image with equation (18) varies smoothly.

Like equation (17), equation (18) can also be written in matrix form:

h_* = \arg\min_{\|h\|=1} \| W_* D h \|^2    (20)

where the weight matrix W_* ∈ R^{2N×2N} is diagonal and can be obtained from

W_* = \mathrm{diag}(w_1^*, w_1^*, w_2^*, w_2^*, \ldots, w_N^*, w_N^*)    (21)

The estimate h_* is the right singular vector of W_* D associated with its smallest singular value.

Computing the matrix H_* for every pixel of the whole image would involve too much work, and the weights w_i^* of a given feature point x_i are nearly the same for neighbouring pixels, so the corresponding matrices H_* are also nearly the same. The image is therefore divided evenly into a grid, the matrix H_* is computed only at the centre pixel of each grid cell, and all pixels within the cell are projectively transformed with that matrix.
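A sketch of the grid-based moving DLT of equations (18)-(21): one location-dependent homography is estimated per grid-cell centre by weighting the rows of the stacked DLT matrix D. The grid size and σ below are illustrative parameters, not values specified by the patent.

```python
def moving_dlt_grid(src_pts, dst_pts, width, height, cells=(20, 20), sigma=60.0):
    """Estimate one location-dependent homography per grid cell, eqs. (18)-(21)."""
    # Stacked DLT matrix D (2N x 9), built exactly as in estimate_homography_dlt.
    rows = []
    for (x, y), (xp, yp) in zip(src_pts, dst_pts):
        rows.append([0, 0, 0, -x, -y, -1, yp * x, yp * y, yp])
        rows.append([x, y, 1, 0, 0, 0, -xp * x, -xp * y, -xp])
    D = np.asarray(rows, dtype=float)

    cx = (np.arange(cells[0]) + 0.5) * width / cells[0]       # grid-cell centre x coordinates
    cy = (np.arange(cells[1]) + 0.5) * height / cells[1]      # grid-cell centre y coordinates
    H_grid = np.empty((cells[1], cells[0], 3, 3))
    pts = np.asarray(src_pts, dtype=float)

    for r, yc in enumerate(cy):
        for c, xc in enumerate(cx):
            w = np.exp(-np.sum((pts - [xc, yc]) ** 2, axis=1) / sigma ** 2)   # eq. (19)
            W = np.repeat(w, 2)                                               # one weight per row pair, eq. (21)
            _, _, Vt = np.linalg.svd(W[:, None] * D)                          # weighted DLT, eq. (20)
            h = Vt[-1]
            H_grid[r, c] = (h / h[-1]).reshape(3, 3)
    return H_grid
```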
(6) The projective transformation obeys the relations (14) and (15); the denominator in equation (15) contains both coordinate quantities x and y. The xy coordinate system is rotated into a new coordinate system uv, and the pixel coordinates in the two systems satisfy

\begin{bmatrix} u \\ v \end{bmatrix} = \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}    (22)

where θ = atan2(-h_8, -h_7). Under this change of coordinates, equation (14) can be rewritten in terms of a homography \hat{h} expressed in the uv system (equation (23)); using \hat{h} to project (u, v) to (x', y'), the transformation function can be written as

x' = \frac{\hat{h}_1 u + \hat{h}_2 v + \hat{h}_3}{\hat{h}_7 u + 1}, \quad y' = \frac{\hat{h}_4 u + \hat{h}_5 v + \hat{h}_6}{\hat{h}_7 u + 1}    (24)

so that the denominator now contains only the single quantity u.
The homography matrix \hat{h} can be decomposed into the product of an affine matrix A and a homography matrix H' (equation (25)). The local transformation scale at a point (u, v) can be obtained from the Jacobian of the homography \hat{h} at that point (equation (26)); det(A) in that expression is a constant independent of the coordinates (u, v), so the local scale change introduced by the projective transformation depends only on u, and the larger the coordinate u, the more severe the stretching after the transformation.
A similarity transformation preserves the length ratio of line segments. When u is held constant at u = u_0, equations (23) and (24) give

x' = \frac{\hat{h}_2 v + (\hat{h}_1 u_0 + \hat{h}_3)}{\hat{h}_7 u_0 + 1}, \quad y' = \frac{\hat{h}_5 v + (\hat{h}_4 u_0 + \hat{h}_6)}{\hat{h}_7 u_0 + 1}    (27)

i.e. the transformation is linear in v. For a straight line parallel to the v-axis, the length ratio of line segments is therefore still maintained after the transformation function H(u, v), which is the behaviour of a similarity transformation. Based on this analysis, a projection-similarity transformation can be constructed: the image to be stitched is divided into two regions, one region being transformed to the reference image with the projective transformation function H(u, v) and the other region with a similarity transformation function S(u, v).
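A short sketch of the change of coordinates in equations (22)-(24): rotating the xy frame by θ = atan2(-h8, -h7) and composing the homography with the inverse rotation yields a matrix whose third row has the form [ĥ7, 0, 1], so the denominator depends on u only. This illustrates the coordinate change only, not the full projection-similarity construction; names are illustrative.

```python
def rotate_homography(H):
    """Express homography H in the rotated uv frame of eq. (22); the returned matrix has third row [h7_hat, 0, 1]."""
    h7, h8 = H[2, 0], H[2, 1]
    theta = np.arctan2(-h8, -h7)
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[  c,   s, 0.0],
                  [ -s,   c, 0.0],
                  [0.0, 0.0, 1.0]])          # [u, v, 1]^T = R [x, y, 1]^T
    H_hat = H @ np.linalg.inv(R)             # maps (u, v) directly to (x', y'), eq. (24)
    return H_hat / H_hat[2, 2], theta
```

In the full method of step (7), the overlap region is warped with this projective transformation (locally, with the moving-DLT homographies), while the non-overlap region is warped with the similarity transformation S(u, v).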
(7) The homography matrix \hat{H} between the two images I_1 and I_2 is obtained and the projection-similarity transformation function ω is constructed from it; the approximate projective transformation function ω_APAP then replaces the global projective transformation used in the overlap region to reduce the local misalignment caused by parallax there. In this way, the low-altitude remote sensing image stitching method combining approximate projective transformation with similarity transformation weakens both the local misalignment in the overlap region and the deformation in the non-overlap region.
Those skilled in the art will further appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, computer software, or a combination of the two. To clearly illustrate the interchangeability of hardware and software, the components and steps of the examples have been described above in general functional terms. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.
The above-described embodiments of the present invention should not be construed as limiting the scope of the present invention. Any other corresponding changes and modifications made according to the technical idea of the present invention should be included in the protection scope of the claims of the present invention.

Claims (10)

1. A low-altitude remote sensing image splicing method is characterized by comprising the following steps:
acquiring an original image;
carrying out nonlinear distortion correction on the original image to obtain a primary image;
performing oblique imaging geometric correction on the primary image to obtain an orthoimage;
performing an approximate projection-similarity transformation on the ortho images for image registration;
and carrying out image fusion on the images subjected to image registration to obtain a panoramic image.
2. The low-altitude remote sensing image stitching method according to claim 1, wherein the nonlinear distortion correction is performed by conventional laboratory calibration data.
3. The low-altitude remote sensing image stitching method according to claim 1, wherein the oblique imaging geometric correction projects the primary image from the pixel coordinate system into a local geodetic coordinate system, and then resamples the projected image on a grid defined in the north-east-down geodetic coordinate system according to the required ground resolution to obtain the orthoimage.
4. The low-altitude remote sensing image stitching method according to claim 1, wherein the approximate projection-similarity transformation comprises: dividing the image to be registered into a first region and a second region, wherein the first region is transformed to the reference image with a projective transformation function and the second region is transformed with a similarity transformation function.
5. The low-altitude remote sensing image stitching method according to claim 4, wherein the approximate projection-similarity transformation further comprises: replacing the global projective transformation with an approximate projective transformation function to weaken the local misalignment caused by parallax in the overlap region.
6. The low-altitude remote sensing image stitching method according to claim 4, wherein the first region is an image overlapping region, and the second region is an image non-overlapping region.
7. The low-altitude remote sensing image stitching method according to claim 4, wherein the approximate projection-similarity transformation of the ortho images for image registration comprises the steps of:
respectively extracting characteristic points of the reference image and the image to be registered;
carrying out initial matching on the feature points;
carrying out accurate matching on the feature points subjected to the initial matching;
performing global homography matrix transformation according to the matching points for performing accurate matching;
and performing approximation projection-similarity transformation on the image to be registered subjected to the global homography matrix transformation.
8. The low-altitude remote sensing image stitching method according to claim 7, wherein a randomized kd-tree algorithm is adopted to search the nearest-neighbour and second-nearest-neighbour points for the initial matching of the feature points.
9. The low-altitude remote sensing image stitching method according to claim 7, wherein the accurate matching is realized by rejecting mismatched point pairs with the random sample consensus (RANSAC) algorithm after the initial matching is completed.
10. A low-altitude remote sensing image splicing apparatus, characterized by comprising:
the unmanned aerial vehicle airborne imaging system is used for acquiring an original image;
the nonlinear distortion correction module is used for carrying out nonlinear distortion correction on the original image to obtain a primary image;
the oblique imaging geometric correction module is used for performing oblique imaging geometric correction on the primary image to obtain an orthoimage;
the image registration module is used for carrying out approximate projection-similarity transformation on the orthoimage so as to carry out image registration;
and the image fusion module is used for carrying out image fusion on the images subjected to the image registration to obtain a panoramic image.
CN201911052390.6A 2019-10-31 2019-10-31 Low-altitude remote sensing image splicing method and device Pending CN112750075A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911052390.6A CN112750075A (en) 2019-10-31 2019-10-31 Low-altitude remote sensing image splicing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911052390.6A CN112750075A (en) 2019-10-31 2019-10-31 Low-altitude remote sensing image splicing method and device

Publications (1)

Publication Number Publication Date
CN112750075A true CN112750075A (en) 2021-05-04

Family

ID=75641571

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911052390.6A Pending CN112750075A (en) 2019-10-31 2019-10-31 Low-altitude remote sensing image splicing method and device

Country Status (1)

Country Link
CN (1) CN112750075A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103218789A (en) * 2013-04-24 2013-07-24 中国科学院遥感与数字地球研究所 Automation ortho-rectification method based on geometric deformation step resolving
CN105844587A (en) * 2016-03-17 2016-08-10 河南理工大学 Low-altitude unmanned aerial vehicle-borne hyperspectral remote-sensing-image automatic splicing method
CN105931185A (en) * 2016-04-20 2016-09-07 中国矿业大学 Automatic splicing method of multiple view angle image
CN106447601A (en) * 2016-08-31 2017-02-22 中国科学院遥感与数字地球研究所 Unmanned aerial vehicle remote image mosaicing method based on projection-similarity transformation
CN109903352A (en) * 2018-12-24 2019-06-18 中国科学院遥感与数字地球研究所 A kind of seamless orthography production method in the big region of satellite remote-sensing image

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
CHE-HAN CHANG 等: "Shape-Preserving Half-Projective Warps for Image Stitching", 《2014 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION》 *
JULIO ZARAGOZA 等: "As-Projective-As-Possible Image Stitching with Moving DLT", 《2013 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION》 *
唐文彦: "Research and implementation of key technologies for image stitching", China Master's Theses Full-text Database, Information Science and Technology Series *
杨鹏: "Research on stitching of aerial images from a low-altitude fixed-wing UAV based on POS data", China Master's Theses Full-text Database, Basic Sciences Series *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114463186A (en) * 2022-04-12 2022-05-10 常州铭赛机器人科技股份有限公司 Tiled splicing method based on correction scanning image
CN114742707A (en) * 2022-04-18 2022-07-12 中科星睿科技(北京)有限公司 Multi-source remote sensing image splicing method and device, electronic equipment and readable medium
CN114742707B (en) * 2022-04-18 2022-09-27 中科星睿科技(北京)有限公司 Multi-source remote sensing image splicing method and device, electronic equipment and readable medium
CN117036666A (en) * 2023-06-14 2023-11-10 北京自动化控制设备研究所 Unmanned aerial vehicle low-altitude positioning method based on inter-frame image stitching
CN117036666B (en) * 2023-06-14 2024-05-07 北京自动化控制设备研究所 Unmanned aerial vehicle low-altitude positioning method based on inter-frame image stitching

Similar Documents

Publication Publication Date Title
CN110211043B (en) Registration method based on grid optimization for panoramic image stitching
CN110675450B (en) Method and system for generating orthoimage in real time based on SLAM technology
CN107808362A (en) A kind of image split-joint method combined based on unmanned plane POS information with image SURF features
CA2705809C (en) Method and apparatus of taking aerial surveys
CA2395257A1 (en) Any aspect passive volumetric image processing method
CN112750075A (en) Low-altitude remote sensing image splicing method and device
CN114936971A (en) Unmanned aerial vehicle remote sensing multispectral image splicing method and system for water area
CN113222820B (en) Pose information-assisted aerial remote sensing image stitching method
CN110555813B (en) Rapid geometric correction method and system for remote sensing image of unmanned aerial vehicle
CN109883433B (en) Vehicle positioning method in structured environment based on 360-degree panoramic view
WO2020198963A1 (en) Data processing method and apparatus related to photographing device, and image processing device
CN110986888A (en) Aerial photography integrated method
CN112862683A (en) Adjacent image splicing method based on elastic registration and grid optimization
Liu et al. A new approach to fast mosaic UAV images
Zhou et al. Automatic orthorectification and mosaicking of oblique images from a zoom lens aerial camera
CN113034347A (en) Oblique photographic image processing method, device, processing equipment and storage medium
Lee et al. Georegistration of airborne hyperspectral image data
JPH09153131A (en) Method and device for processing picture information and picture information integrating system
CN116228860A (en) Target geographic position prediction method, device, equipment and storage medium
CN116124094A (en) Multi-target co-location method based on unmanned aerial vehicle reconnaissance image and combined navigation information
CN113362265B (en) Low-cost rapid geographical splicing method for orthographic images of unmanned aerial vehicle
CN114663789A (en) Power transmission line unmanned aerial vehicle aerial image splicing method
CN114757834A (en) Panoramic image processing method and panoramic image processing device
CN116839595B (en) Method for creating unmanned aerial vehicle route
CN111784622B (en) Image splicing method based on monocular inclination of unmanned aerial vehicle and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210504