CN108021886B - Method for matching local significant feature points of repetitive texture image of unmanned aerial vehicle

Info

Publication number
CN108021886B
Authority
CN
China
Prior art keywords
image
matched
matching
points
reference image
Prior art date
2017-12-04
Legal status
Expired - Fee Related
Application number
CN201711263004.9A
Other languages
Chinese (zh)
Other versions
CN108021886A (en)
Inventor
陈敏
严少华
赵怡涛
朱庆
Current Assignee
Southwest Jiaotong University
Original Assignee
Southwest Jiaotong University
Priority date
2017-12-04
Filing date
2017-12-04
Publication date
2021-09-14
Application filed by Southwest Jiaotong University
Priority to CN201711263004.9A
Publication of CN108021886A
Application granted
Publication of CN108021886B


Classifications

    • G06V 20/13 — Scenes; Scene-specific elements; Terrestrial scenes; Satellite images
    • G06V 10/462 — Extraction of image or video features; Salient features, e.g. scale invariant feature transform [SIFT]
    • G06V 10/757 — Image or video pattern matching; Matching configurations of points or features

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Astronomy & Astrophysics (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a method for matching local salient feature points of repetitive-texture images captured by an unmanned aerial vehicle (UAV), comprising the following steps: extract local salient feature points from the reference image and from the image to be matched; select seed points on both images based on the feature response intensity values of the local salient feature points, and match them; compute a geometric transformation model between the reference image and the image to be matched from the homonymous (i.e., corresponding) seed points, and use the model to determine, on the image to be matched, a search area for the homonymous point of each local salient feature point of the reference image; finally, search each area for the homonymous point using the NNDR method. The method provides a local salient feature point detection operator suited to matching UAV images with repetitive texture; the feature points it produces have strong localization ability and distinctiveness, can be matched correctly during the matching process, and ensure high matching accuracy.

Description

Method for matching local significant feature points of repetitive texture image of unmanned aerial vehicle
Technical Field
The invention relates to the technical field of remote sensing image processing, and in particular to a method for matching local salient feature points of UAV images with repetitive texture.
Background
Traditional aerial and spaceborne remote sensing images, acquired from aircraft and satellite platforms, have long been widely used for mapping topographic maps at various scales. For small-area, large-scale mapping tasks, however, conventional aerial photography systems cannot meet the demand for rapid updating of large-scale remote sensing imagery, owing to their high cost, poor cost-effectiveness, and constraints such as strict aircraft ferrying conditions. With the gradual opening of national low-altitude airspace, building low-altitude unmanned remote sensing platforms and developing intelligent systems for rapidly acquiring, processing, and integrating observation data have become urgent needs of current major engineering projects and scientific decision-making. As an effective complement to satellite remote sensing and traditional aerial photogrammetry, the UAV remote sensing platform offers unique advantages: it is little affected by weather, and its operation is flexible and fast; it is not constrained by revisit cycles, can take off whenever a task requires and fly below the clouds, and delivers timely, targeted, high-resolution imagery; platform construction, maintenance, and operating costs are low; it can acquire large-scale, high-precision images; it requires no airspace application, since airspace below one kilometer is not controlled by the state; and it can acquire images with a high degree of overlap, improving the reliability of subsequent processing.
Although the UAV remote sensing platform has unique advantages in remote sensing data acquisition, the automation and intelligence of its data processing still fall short of the growing demands of the geographic information industry. Among the key technologies for automated processing of UAV remote sensing data, image matching is especially important.
Existing matching methods for repetitive-texture images fall mainly into two categories:
The first category introduces global context features into feature matching, constraining the matching process by enlarging the feature region and computing internal relationships among features. Such methods achieve good results on some images containing repetitive texture. However, they depend on the initial feature-pair matching result, and when an image contains large areas of repetitive texture, even the enlarged feature regions remain insufficiently distinctive for correct matching; in this case these methods fail.
The second category constructs distance and angle geometric constraints from the georeferencing information of GNSS/INS (Global Navigation Satellite System / Inertial Navigation System) remote sensing images and combines these constraints with local features for matching. Geometric constraints can prevent wrong matches to some extent and thus improve matching accuracy. However, because the feature point detection results in such methods contain many similar features, it is difficult to obtain correct matches in the repetitive-texture regions of the image while the geometric constraints eliminate wrong ones; the number of correct matches finally obtained is therefore very small and their distribution uneven, which hampers subsequent use of the matching results.
Disclosure of Invention
To address these shortcomings of the prior art, the invention provides a method for matching local salient feature points of UAV images with repetitive texture.
To achieve this, the invention adopts the following technical scheme:
A method for matching local salient feature points of UAV repetitive-texture images, comprising the following steps:
Step 1: extracting local salient feature points of the reference image and local salient feature points of the image to be matched, respectively; the extraction of local salient feature points comprises the following steps:
1-1: estimating an image overlapping area of the reference image and the image to be matched according to the flight parameters of the unmanned aerial vehicle;
1-2: dividing the image overlapping area into a plurality of sub-areas, and calculating a covariance matrix of each pixel in each sub-area by using a covariance matrix formula (1):
M = \sum_{(x_k, y_k) \in W} w(x_k, y_k) \begin{bmatrix} I_x^2(x_k, y_k) & I_x(x_k, y_k) I_y(x_k, y_k) \\ I_x(x_k, y_k) I_y(x_k, y_k) & I_y^2(x_k, y_k) \end{bmatrix}  (1)
where M is the covariance matrix; I(x, y) is the image intensity function at pixel (x, y); (x_k, y_k) is a point within the Gaussian-weighted window W of pixel (x, y), with w(x_k, y_k) the corresponding Gaussian weight; and I_x(x_k, y_k) and I_y(x_k, y_k) denote the partial derivatives in the x and y directions, respectively;
1-3: constructing a feature response intensity function (2) that considers pixel saliency and support-region saliency simultaneously, and using it to calculate the feature response intensity value of each pixel in each sub-region:
FR_i = [\det(M) - \kappa (\mathrm{trace}(M))^2]^{\alpha} \cdot \left( \min_{j \neq i} \| D_i - D_j \| \right)^{\beta}  (2)
where det(M) is the determinant of matrix M; trace(M) is the trace of matrix M; κ is an empirical constant; D_i is the feature descriptor of the pixel being traversed; D_{j, j≠i} are the feature descriptors of the other pixels; and α and β are two weight coefficients controlling the relative importance of pixel saliency and support-region saliency;
1-4: obtaining the local salient feature points based on the feature response intensity values;
Step 2: in each pair of corresponding sub-regions on the reference image and the image to be matched, selecting reference-image seed points and to-be-matched-image seed points based on the feature response intensity values of the local salient feature points of the two images, and matching them to obtain the homonymous seed points of each pair of corresponding sub-regions;
Step 3: using the seed point matching results, computing formula (3) as the geometric transformation model between the reference image and the image to be matched; based on the geometric transformation model, calculating the coordinates of the corresponding point, on the image to be matched, of each local salient feature point of the reference image, and determining a search area of size R × R centered on those coordinates:
x_2 = a_0 + a_1 x_1 + a_2 y_1 + a_3 x_1 y_1 + a_4 x_1^2 + a_5 y_1^2
y_2 = b_0 + b_1 x_1 + b_2 y_1 + b_3 x_1 y_1 + b_4 x_1^2 + b_5 y_1^2  (3)
where (x_1, y_1) and (x_2, y_2) are the image coordinates of a homonymous seed point on the reference image and on the image to be matched, respectively, and a_0–a_5 and b_0–b_5 are polynomial coefficients;
Step 4: searching each search area on the image to be matched for the homonymous point of each local salient feature point of the reference image using the NNDR (Nearest Neighbor Distance Ratio) method.
Preferably, obtaining the local salient feature points based on the feature response intensity values in steps 1-4 comprises:
performing non-maximum suppression within a 3 × 3 neighborhood according to the feature response intensity values of the pixels;
sorting the pixels that survive non-maximum suppression in descending order of feature response intensity value, and selecting the top s% as the local salient feature points.
Preferably, step two comprises the following steps:
2-1: in each pair of corresponding sub-regions on the reference image and the image to be matched, respectively selecting as reference-image seed points and to-be-matched-image seed points the local salient feature points whose feature response intensity values rank in the top t%;
2-2: matching the reference-image seed points and the to-be-matched-image seed points using a bidirectional NNDR matching strategy based on the similarity of their feature descriptors;
2-3: calculating the matching reliability of each matched pair of reference-image and to-be-matched-image seed points using the matching reliability metric of formula (4):
MR_{ij} = \exp\left( - \frac{ \| D_{p_i} - D_{q_j} \| }{ FR_{p_i} + FR_{q_j} } \right)  (4)
where MR_{ij} is the matching reliability of the homonymous seed points p_i and q_j; D_{p_i} and FR_{p_i} are the feature descriptor and feature response intensity value of point p_i on the reference image; D_{q_j} and FR_{q_j} are the feature descriptor and feature response intensity value of the matching point q_j on the image to be matched; and exp(), together with the negative sign before the ratio inside the function, normalizes the matching reliability value into the range (0, 1];
2-4: in each sub-region, keeping the matching result of the seed-point pair with the largest matching reliability metric value as the homonymous seed points.
Compared with the prior art, the technical solutions provided by the invention have the following technical effects or advantages:
In the feature point detection process, the method designs a feature response intensity function that reflects both the localization ability and the matching potential of pixels, and on this basis provides a local salient feature point detection operator suited to matching UAV images with repetitive texture. The feature points obtained by this operator are highly distinctive and can be matched correctly in the subsequent matching process, finally yielding a large number of uniformly distributed matches while maintaining high matching accuracy.
Furthermore, the method estimates the overlapping area of the reference image and the image to be matched from the UAV flight parameters, divides the overlapping area into sub-regions, and computes pixel feature response intensity values within each sub-region. This avoids comparing pixel feature descriptors over the whole image and effectively improves the time efficiency of feature point detection.
Furthermore, the method designs a matching reliability metric that considers both the similarity between matched points and the saliency of the feature points. The seed point matching results are evaluated with this metric, and only the most reliable seed points in each sub-region are kept as the final homonymous seed points. The image geometric transformation model computed on this basis expresses the actual geometric transformation between the reference image and the image to be matched more accurately. In the subsequent feature point matching, this model allows the search area of each homonymous point to be estimated more precisely: the search range is reduced while the homonymous point is still guaranteed to lie within it, which effectively improves the efficiency and reliability of homonymous point matching.
Detailed Description
To make the objects, technical solutions, and advantages of the present invention clearer, the technical solutions are further described below with reference to embodiments.
The embodiment of the invention provides a method for matching local salient feature points of UAV repetitive-texture images, comprising the following steps:
Step 1: extracting local salient feature points of the reference image and local salient feature points of the image to be matched, respectively; the extraction of local salient feature points comprises the following steps:
1-1: estimating an image overlapping area of the reference image and the image to be matched according to the flight parameters of the unmanned aerial vehicle;
1-2: dividing the image overlapping area into a plurality of sub-areas, and calculating a covariance matrix of each pixel in each sub-area by using a covariance matrix formula (1):
M = \sum_{(x_k, y_k) \in W} w(x_k, y_k) \begin{bmatrix} I_x^2(x_k, y_k) & I_x(x_k, y_k) I_y(x_k, y_k) \\ I_x(x_k, y_k) I_y(x_k, y_k) & I_y^2(x_k, y_k) \end{bmatrix}  (1)
where M is the covariance matrix; I(x, y) is the image intensity function at pixel (x, y); (x_k, y_k) is a point within the Gaussian-weighted window W of pixel (x, y), with w(x_k, y_k) the corresponding Gaussian weight; and I_x(x_k, y_k) and I_y(x_k, y_k) denote the partial derivatives in the x and y directions, respectively;
1-3: constructing a feature response intensity function (2) that considers pixel saliency and support-region saliency simultaneously, and using it to calculate the feature response intensity value of each pixel in each sub-region:
FR_i = [\det(M) - \kappa (\mathrm{trace}(M))^2]^{\alpha} \cdot \left( \min_{j \neq i} \| D_i - D_j \| \right)^{\beta}  (2)
where det(M) is the determinant of matrix M; trace(M) is the trace of matrix M; κ is an empirical constant; D_i is the feature descriptor of the pixel being traversed; D_{j, j≠i} are the feature descriptors of the other pixels; and α and β are two weight coefficients controlling the relative importance of pixel saliency and support-region saliency;
1-4: obtaining the local salient feature points based on the feature response intensity values.
In a specific implementation process, the local salient feature points of the reference image and of the image to be matched are extracted using the above method. The first factor of the feature response intensity function (2) reflects the local saliency of an image pixel and is referred to as the pixel saliency factor in the embodiments of the invention. The second factor reflects the saliency of the support region centered on the pixel and is referred to as the region saliency factor. In the region saliency factor, the minimum feature descriptor distance represents the difference between the support region of the pixel being traversed and the support regions of the other pixels: a larger value indicates a larger difference, i.e., stronger saliency of the pixel's support region. According to image feature matching theory, the stronger the saliency of a feature point's feature region, the higher the probability that it can be correctly identified during matching. The region saliency factor therefore reflects, to some extent, the matching potential of image pixels. The feature response intensity function (2) effectively combines the pixel saliency factor and the region saliency factor, and yields feature points that are both accurately localizable and accurately matchable.
When calculating the region saliency factor in function (2), to improve computational efficiency, the method compares the feature descriptor of the pixel being traversed only with the descriptors of the other pixels located in the same sub-region.
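As an illustration only (not part of the original filing), the following Python sketch shows one way the feature response intensity of formula (2) could be evaluated within a sub-region. The use of OpenCV's cornerHarris for the pixel saliency factor, the function and parameter names, and the defaults (blockSize=3, ksize=3, α = β = 1) are assumptions of this sketch.

```python
import cv2
import numpy as np

def feature_response_intensity(gray, candidates, descriptors,
                               kappa=0.04, alpha=1.0, beta=1.0):
    """Sketch of formula (2):
    FR_i = [det(M) - kappa*(trace(M))^2]^alpha * (min_j ||D_i - D_j||)^beta.

    gray:        float32 image of one sub-region
    candidates:  (N, 2) array of (row, col) pixel coordinates in this sub-region
    descriptors: (N, d) array, one feature descriptor per candidate pixel
    """
    # Pixel-saliency factor: cv2.cornerHarris evaluates det(M) - k*(trace(M))^2
    # from the windowed structure tensor of formula (1).
    harris = cv2.cornerHarris(np.float32(gray), blockSize=3, ksize=3, k=kappa)
    pixel_term = harris[candidates[:, 0], candidates[:, 1]]

    # Region-saliency factor: distance from each descriptor to the nearest
    # descriptor of another pixel in the same sub-region (min_j ||D_i - D_j||).
    dist = np.linalg.norm(descriptors[:, None, :] - descriptors[None, :, :], axis=2)
    np.fill_diagonal(dist, np.inf)
    region_term = dist.min(axis=1)

    # alpha/beta weight the two factors; negative Harris responses (edges)
    # keep their sign so they never win the later maximum selection.
    return np.sign(pixel_term) * np.abs(pixel_term) ** alpha * region_term ** beta
```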
When calculating the region saliency factor in function (2), because the scale and rotation changes between UAV sequence images are small, the embodiment of the invention computes feature descriptors with the Scale Invariant Feature Transform (SIFT) method using a unified feature region size and orientation, instead of determining the region size from the feature point scale value and the region orientation from the gradient orientation histogram as in the original SIFT algorithm.
The choice of feature description method for the region saliency factor is very flexible. In practical applications, the user can select an appropriate description method for the images being processed. The description method used here should, however, be consistent with the one used in the subsequent image matching process. This avoids describing the features again after feature point detection, improving the time efficiency of the algorithm; moreover, a description method consistent with the matching process measures the matching potential of pixels more accurately and improves the matching result.
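A minimal sketch of the unified-size, unified-orientation SIFT description mentioned above might look as follows; the 24-pixel window and the zero orientation are illustrative assumptions, not values given in the patent.

```python
import cv2

def fixed_window_sift(gray, points, window=24.0):
    """Compute SIFT descriptors at given points with a uniform support-region
    size and a fixed orientation, instead of SIFT's own scale/orientation
    estimation (reasonable when scale/rotation changes between images are small).

    gray:   8-bit single-channel image
    points: iterable of (row, col) pixel coordinates
    """
    sift = cv2.SIFT_create()
    # KeyPoint(x, y, size, angle): angle=0 fixes the orientation,
    # `window` fixes the extent of the support region.
    kps = [cv2.KeyPoint(float(c), float(r), window, 0.0) for (r, c) in points]
    kps, desc = sift.compute(gray, kps)
    return desc
```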
In a specific implementation process, preferably, obtaining the local salient feature points based on the feature response intensity values comprises:
performing non-maximum suppression within a 3 × 3 neighborhood according to the feature response intensity values of the pixels;
sorting the pixels that survive non-maximum suppression in descending order of feature response intensity value, and selecting the top s% as the local salient feature points.
In the embodiment of the invention, s% is set to 50%. Users can set a reasonable value according to the practical application: if high matching precision is required and a small number of homonymous points suffices, a smaller s% is appropriate; if a large number of homonymous points is required, s% should be set larger.
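For illustration, this selection step could be sketched as below, assuming SciPy's maximum_filter for the 3 × 3 non-maximum suppression; the function name and the s = 50 default merely mirror the embodiment's choice.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def select_salient_points(fr, s=50.0):
    """3x3 non-maximum suppression on the feature response map `fr`, then
    keep the strongest s% of surviving pixels as local salient feature points."""
    is_peak = fr == maximum_filter(fr, size=3)      # 3x3 neighbourhood maxima
    ys, xs = np.nonzero(is_peak)
    order = np.argsort(fr[ys, xs])[::-1]            # descending response
    keep = max(1, int(round(len(order) * s / 100.0)))
    sel = order[:keep]
    return np.column_stack([ys[sel], xs[sel]])      # (row, col) of kept points
```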
After step 1 is completed, step 2 is performed: in each pair of corresponding sub-regions on the reference image and the image to be matched, select reference-image seed points and to-be-matched-image seed points based on the feature response intensity values of the local salient feature points of the two images, and match them to obtain the homonymous seed points of each pair of corresponding sub-regions.
In a specific implementation process, the second step comprises the following specific steps:
2-1: in each pair of corresponding sub-regions of the reference image and the image to be matched, respectively select as reference-image seed points and to-be-matched-image seed points the local salient feature points whose feature response intensity values rank in the top t%.
In the embodiment of the invention, since the seed point matching results are used to compute the image geometric transformation model, only a small number of homonymous seed points is needed to solve the model coefficients, and the reliability of the seed point matches matters far more than their number; t% is therefore set to a small value, 5% in this embodiment. Users can set a reasonable value according to the practical application.
After the seed points are acquired, step 2-2 is performed: match the reference-image seed points and the to-be-matched-image seed points using a bidirectional NNDR matching strategy based on the similarity of their feature descriptors.
The specific steps of bidirectional NNDR matching are: first, for a seed point p_i in sub-region Q on the reference image, search for its matching point q_j in the corresponding sub-region Q′ on the image to be matched using the NNDR method (forward matching); then, using the NNDR method, find the matching point p_i′ of q_j in sub-region Q on the reference image (reverse matching); if the forward and reverse matching results are consistent, p_i and q_j are considered a pair of homonymous seed points.
In bidirectional NNDR matching, since the image sub-regions are obtained by dividing the image overlapping area, the sub-regions on the reference image and on the image to be matched are roughly aligned, so the search for homonymous seed points in the embodiment of the invention is performed only within corresponding sub-regions. This improves the efficiency of seed point matching, reduces interference from non-homonymous points, and improves matching accuracy.
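A sketch of the bidirectional NNDR strategy follows; the ratio threshold of 0.8 is an illustrative value (the patent does not state one), and at least two candidates per sub-region are assumed.

```python
import numpy as np

def nndr_match(desc_a, desc_b, ratio=0.8):
    """One direction of NNDR: accept the nearest neighbour in B of each
    descriptor in A only if d1/d2 <= ratio (d1, d2: two smallest distances)."""
    dist = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    nn = np.argsort(dist, axis=1)
    out = {}
    for i in range(dist.shape[0]):
        d1, d2 = dist[i, nn[i, 0]], dist[i, nn[i, 1]]
        if d2 > 0 and d1 / d2 <= ratio:
            out[i] = int(nn[i, 0])
    return out

def bidirectional_nndr(desc_q, desc_qp, ratio=0.8):
    """Keep a pair (i, j) as homonymous seed points only when the forward
    match (Q -> Q') and the reverse match (Q' -> Q) agree."""
    fwd = nndr_match(desc_q, desc_qp, ratio)
    rev = nndr_match(desc_qp, desc_q, ratio)
    return [(i, j) for i, j in fwd.items() if rev.get(j) == i]
```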
After step 2-2 is completed, step 2-3 is performed: compute the matching reliability of each matched pair of reference-image and to-be-matched-image seed points using the matching reliability metric of formula (4):
MR_{ij} = \exp\left( - \frac{ \| D_{p_i} - D_{q_j} \| }{ FR_{p_i} + FR_{q_j} } \right)  (4)
where MR_{ij} is the matching reliability of the homonymous seed points p_i and q_j; D_{p_i} and FR_{p_i} are the feature descriptor and feature response intensity value of point p_i on the reference image; D_{q_j} and FR_{q_j} are the feature descriptor and feature response intensity value of the matching point q_j on the image to be matched; and exp(), together with the negative sign before the ratio inside the function, normalizes the matching reliability value into the range (0, 1].
After step 2-3 is completed, step 2-4 is performed: in each sub-region, keep the matching result of the seed-point pair with the largest matching reliability metric value as the homonymous seed points.
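Using the reconstructed form of formula (4) given above (the exact combination of descriptor distance and response strengths is an assumption of this reconstruction), steps 2-3 and 2-4 could be sketched as:

```python
import numpy as np

def match_reliability(desc_p, fr_p, desc_q, fr_q):
    """Reconstructed formula (4): reliability increases with descriptor
    similarity and with the response strengths of both seed points;
    exp(-ratio) maps the value into (0, 1]."""
    return float(np.exp(-np.linalg.norm(desc_p - desc_q) / (fr_p + fr_q)))

def best_seed_pair(pairs):
    """pairs: list of (desc_p, fr_p, desc_q, fr_q) tuples for one sub-region.
    Keep only the most reliable pair as the final homonymous seed points."""
    return max(pairs, key=lambda p: match_reliability(p[0], p[1], p[2], p[3]))
```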
After the homonymous seed points are obtained, step 3 is performed: use the seed point matching results to compute formula (3) as the geometric transformation model between the reference image and the image to be matched; based on the model, calculate the coordinates of the corresponding point, on the image to be matched, of each local salient feature point of the reference image, and determine a search area of size R × R centered on those coordinates:
x_2 = a_0 + a_1 x_1 + a_2 y_1 + a_3 x_1 y_1 + a_4 x_1^2 + a_5 y_1^2
y_2 = b_0 + b_1 x_1 + b_2 y_1 + b_3 x_1 y_1 + b_4 x_1^2 + b_5 y_1^2  (3)
where (x_1, y_1) and (x_2, y_2) are the image coordinates of a homonymous seed point on the reference image and on the image to be matched, respectively, and a_0–a_5 and b_0–b_5 are polynomial coefficients.
In a specific implementation, the size of R may be determined according to the situation at hand. After the search areas are determined, step 4 is performed: search each search area on the image to be matched for the homonymous point of each local salient feature point of the reference image using the NNDR (Nearest Neighbor Distance Ratio) method.
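Assuming the six-coefficient second-order polynomial reconstructed as formula (3), fitting the geometric transformation model to the homonymous seed points and deriving the R × R search window could be sketched as:

```python
import numpy as np

def poly_basis(x, y):
    # Assumed term ordering of the second-order polynomial of formula (3).
    return np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])

def fit_geometric_model(seed_ref, seed_tgt):
    """Least-squares estimate of a0..a5 and b0..b5 from the homonymous seed
    points (needs at least 6 of them). seed_ref/seed_tgt: (N, 2) arrays."""
    A = poly_basis(seed_ref[:, 0], seed_ref[:, 1])
    a, *_ = np.linalg.lstsq(A, seed_tgt[:, 0], rcond=None)
    b, *_ = np.linalg.lstsq(A, seed_tgt[:, 1], rcond=None)
    return a, b

def search_window(a, b, pt, R=64):
    """Predict the corresponding point of `pt` on the image to be matched and
    return the R x R search window centred on it (R is application-dependent)."""
    x, y = pt
    u = poly_basis(np.array([x]), np.array([y]))
    cx, cy = float(u @ a), float(u @ b)
    half = R / 2.0
    return (cx - half, cy - half, cx + half, cy + half)
```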
According to the method provided by the embodiment of the invention, a feature response intensity function reflecting both the localization ability and the matching potential of pixels is designed in the feature point detection process, and on this basis a local salient feature point detection operator suited to matching UAV images with repetitive texture is provided. The feature points obtained by this operator are highly distinctive and can be matched correctly in the subsequent matching process, finally yielding a large number of uniformly distributed matches while maintaining high matching accuracy.
Further, the method estimates the overlapping area of the reference image and the image to be matched from the UAV flight parameters, divides the overlapping area into sub-regions, and computes pixel feature response intensity values within each sub-region, which avoids comparing pixel feature descriptors over the whole image and effectively improves the time efficiency of feature point detection.
Further, the method designs a matching reliability metric that considers both the similarity between matched points and the saliency of the feature points. The seed point matching results are evaluated with this metric, and only the most reliable seed points in each sub-region are kept as the final homonymous seed points. The image geometric transformation model computed on this basis expresses the actual geometric transformation between the reference image and the image to be matched more accurately. In the subsequent feature point matching, this model allows the search area of each homonymous point to be estimated more precisely: the search range is reduced while the homonymous point is still guaranteed to lie within it, which effectively improves the efficiency and reliability of homonymous point matching.
It should be understood by those skilled in the art that the foregoing is only a preferred embodiment of the invention and is not intended to limit the scope of the invention; all equivalent changes and modifications made within the form, construction, characteristics, and spirit of the invention shall be included in the scope of the claims.

Claims (3)

1. A method for matching local salient feature points of repetitive-texture images of an unmanned aerial vehicle, characterized by comprising the following steps:
Step 1: extracting local salient feature points of the reference image and local salient feature points of the image to be matched, respectively; the extraction of local salient feature points comprises the following steps:
1-1: estimating an image overlapping area of the reference image and the image to be matched according to the flight parameters of the unmanned aerial vehicle;
1-2: dividing the image overlapping area into a plurality of sub-areas, and calculating a covariance matrix of each pixel in each sub-area by using a covariance matrix formula (1):
M = \sum_{(x_k, y_k) \in W} w(x_k, y_k) \begin{bmatrix} I_x^2(x_k, y_k) & I_x(x_k, y_k) I_y(x_k, y_k) \\ I_x(x_k, y_k) I_y(x_k, y_k) & I_y^2(x_k, y_k) \end{bmatrix}  (1)
where M is the covariance matrix; I(x, y) is the image intensity function at pixel (x, y); (x_k, y_k) is a point within the Gaussian-weighted window W of pixel (x, y), with w(x_k, y_k) the corresponding Gaussian weight; and I_x(x_k, y_k) and I_y(x_k, y_k) denote the partial derivatives in the x and y directions, respectively;
1-3: constructing a feature response intensity function (2) that considers pixel saliency and support-region saliency simultaneously, and using it to calculate the feature response intensity value of each pixel in each sub-region:
FR_i = [\det(M) - \kappa (\mathrm{trace}(M))^2]^{\alpha} \cdot \left( \min_{j \neq i} \| D_i - D_j \| \right)^{\beta}  (2)
where det(M) is the determinant of matrix M; trace(M) is the trace of matrix M; κ is an empirical constant; D_i is the feature descriptor of the pixel being traversed; D_{j, j≠i} are the feature descriptors of the other pixels; and α and β are two weight coefficients controlling the relative importance of pixel saliency and support-region saliency;
1-4: obtaining the local salient feature points based on the feature response intensity values;
Step 2: in each pair of corresponding sub-regions on the reference image and the image to be matched, selecting reference-image seed points and to-be-matched-image seed points based on the feature response intensity values of the local salient feature points of the two images, and matching them to obtain the homonymous seed points of each pair of corresponding sub-regions;
Step 3: using the seed point matching results, computing formula (3) as the geometric transformation model between the reference image and the image to be matched; based on the geometric transformation model, calculating the coordinates of the corresponding point, on the image to be matched, of each local salient feature point of the reference image, and determining a search area of size R × R centered on those coordinates:
x_2 = a_0 + a_1 x_1 + a_2 y_1 + a_3 x_1 y_1 + a_4 x_1^2 + a_5 y_1^2
y_2 = b_0 + b_1 x_1 + b_2 y_1 + b_3 x_1 y_1 + b_4 x_1^2 + b_5 y_1^2  (3)
where (x_1, y_1) and (x_2, y_2) are the image coordinates of a homonymous seed point on the reference image and on the image to be matched, respectively, and a_0–a_5 and b_0–b_5 are polynomial coefficients;
Step 4: searching each search area on the image to be matched for the homonymous point of each local salient feature point of the reference image using the NNDR method.
2. The method for matching local salient feature points of repetitive texture images of unmanned aerial vehicles according to claim 1, wherein obtaining the local salient feature points based on the feature response intensity values in steps 1-4 comprises:
performing non-maximum suppression within a 3 × 3 neighborhood according to the feature response intensity values of the pixels;
sorting the pixels that survive non-maximum suppression in descending order of feature response intensity value, and selecting the top s% as the local salient feature points.
3. The method for matching local salient feature points of repetitive texture images of unmanned aerial vehicles according to claim 1, wherein step 2 comprises the following steps:
2-1: in each pair of corresponding sub-regions on the reference image and the image to be matched, respectively selecting as reference-image seed points and to-be-matched-image seed points the local salient feature points whose feature response intensity values rank in the top t%;
2-2: matching the reference-image seed points and the to-be-matched-image seed points using a bidirectional NNDR matching strategy based on the similarity of their feature descriptors;
2-3: calculating the matching reliability of each matched pair of reference-image and to-be-matched-image seed points using the matching reliability metric of formula (4):
MR_{ij} = \exp\left( - \frac{ \| D_{p_i} - D_{q_j} \| }{ FR_{p_i} + FR_{q_j} } \right)  (4)
where MR_{ij} is the matching reliability of the homonymous seed points p_i and q_j; D_{p_i} and FR_{p_i} are the feature descriptor and feature response intensity value of point p_i on the reference image; D_{q_j} and FR_{q_j} are the feature descriptor and feature response intensity value of the matching point q_j on the image to be matched; and exp(), together with the negative sign before the ratio inside the function, normalizes the matching reliability value into the range (0, 1];
2-4: in each sub-region, keeping the matching result of the seed-point pair with the largest matching reliability metric value as the homonymous seed points.
CN201711263004.9A 2017-12-04 2017-12-04 Method for matching local significant feature points of repetitive texture image of unmanned aerial vehicle Expired - Fee Related CN108021886B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711263004.9A CN108021886B (en) 2017-12-04 2017-12-04 Method for matching local significant feature points of repetitive texture image of unmanned aerial vehicle


Publications (2)

Publication Number Publication Date
CN108021886A CN108021886A (en) 2018-05-11
CN108021886B (en) 2021-09-14

Family

ID=62078401

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711263004.9A Expired - Fee Related CN108021886B (en) 2017-12-04 2017-12-04 Method for matching local significant feature points of repetitive texture image of unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN108021886B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109543561B (en) * 2018-10-31 2020-09-18 北京航空航天大学 Method and device for detecting salient region of aerial video
CN110599531B (en) * 2019-09-11 2022-04-29 北京迈格威科技有限公司 Repetitive texture feature description method and device and binocular stereo matching method and device
SG10201913798WA (en) * 2019-12-30 2021-07-29 Sensetime Int Pte Ltd Image processing method and apparatus, and electronic device
US11354883B2 (en) 2019-12-30 2022-06-07 Sensetime International Pte. Ltd. Image processing method and apparatus, and electronic device
CN111414968B (en) * 2020-03-26 2022-05-03 西南交通大学 Multi-mode remote sensing image matching method based on convolutional neural network characteristic diagram


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8903181B2 (en) * 2011-12-28 2014-12-02 Venkatesh Gangadharan Low cost unique identification generation using combination of patterns and unique code images

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102693542A (en) * 2012-05-18 2012-09-26 中国人民解放军信息工程大学 Image characteristic matching method
CN104268550A (en) * 2014-09-18 2015-01-07 鲁路平 Feature extraction method and device
CN104504723A (en) * 2015-01-14 2015-04-08 西安电子科技大学 Image registration method based on remarkable visual features
CN105184801A (en) * 2015-09-28 2015-12-23 武汉大学 Optical and SAR image high-precision registration method based on multilevel strategy
CN105160686A (en) * 2015-10-21 2015-12-16 武汉大学 Improved scale invariant feature transformation (SIFT) operator based low altitude multi-view remote-sensing image matching method
CN106023230A (en) * 2016-06-02 2016-10-12 辽宁工程技术大学 Dense matching method suitable for deformed images
CN106127209A (en) * 2016-06-17 2016-11-16 中南大学 A kind of objects' contour extracting method based on local edge feature integration
CN107025449A (en) * 2017-04-14 2017-08-08 西南交通大学 A kind of inclination image linear feature matching process of unchanged view angle regional area constraint
CN107274419A (en) * 2017-07-10 2017-10-20 北京工业大学 A kind of deep learning conspicuousness detection method based on global priori and local context

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Ian Buckley, "CS 6476 Project 2: Local Feature Matching", https://www.cc.gatech.edu/classes/AY2016/cs4476_fall/results/proj2/html/ibuckley3/, 2016-09-23, pp. 1-4. *
Chun-Rong Huang et al., "Shot Change Detection via Local Keypoint Matching", IEEE Transactions on Multimedia, vol. 10, no. 6, pp. 1097-1108, 2008-10-24. *
Ye Yuanxin et al., "Automatic Matching of Multi-Source Remote Sensing Images with Local Phase Feature Description" (in Chinese), Geomatics and Information Science of Wuhan University, vol. 42, no. 9, pp. 1278-1284, 2017-09-05. *
Hu Haiyan, "Research on Stereo Matching and DSM Generation from ZY-3 Three-Line-Array Imagery" (in Chinese), China Masters' Theses Full-text Database, Basic Sciences, no. 4, p. A008-33, 2017-04-15. *

Also Published As

Publication number Publication date
CN108021886A (en) 2018-05-11

Similar Documents

Publication Publication Date Title
CN108021886B (en) Method for matching local significant feature points of repetitive texture image of unmanned aerial vehicle
KR102273559B1 (en) Method, apparatus, and computer readable storage medium for updating electronic map
CN108921947B (en) Method, device, equipment, storage medium and acquisition entity for generating electronic map
CN111028277B (en) SAR and optical remote sensing image registration method based on pseudo-twin convolution neural network
CN108225348B (en) Map creation and moving entity positioning method and device
CN109324337B (en) Unmanned aerial vehicle route generation and positioning method and device and unmanned aerial vehicle
EP3644015A1 (en) Position estimation system and position estimation method
CN102426019B (en) Unmanned aerial vehicle scene matching auxiliary navigation method and system
CN110930495A (en) Multi-unmanned aerial vehicle cooperation-based ICP point cloud map fusion method, system, device and storage medium
CN111650598A (en) External parameter calibration method and device for vehicle-mounted laser scanning system
KR102314038B1 (en) Method for determining unusual area for optical navigation based on artificial neural network, apparatus for producing onboard map, and method for determining direction of lander
CN107240130B (en) Remote sensing image registration method, device and system
CN111256696B (en) Aircraft autonomous navigation method with multi-feature and multi-level scene matching
CN114111774B (en) Vehicle positioning method, system, equipment and computer readable storage medium
Dawood et al. Harris, SIFT and SURF features comparison for vehicle localization based on virtual 3D model and camera
CN115187798A (en) Multi-unmanned aerial vehicle high-precision matching positioning method
CN115861860B (en) Target tracking and positioning method and system for unmanned aerial vehicle
CN114755661A (en) Parameter calibration method and device for mobile laser scanning system
CN111143489B (en) Image-based positioning method and device, computer equipment and readable storage medium
CN113822996B (en) Pose estimation method and device for robot, electronic device and storage medium
CN114187418A (en) Loop detection method, point cloud map construction method, electronic device and storage medium
CN114792338A (en) Vision fusion positioning method based on prior three-dimensional laser radar point cloud map
CN115239899B (en) Pose map generation method, high-precision map generation method and device
CN115952248A (en) Pose processing method, device, equipment, medium and product of terminal equipment
CN114842224A (en) Monocular unmanned aerial vehicle absolute vision matching positioning scheme based on geographical base map

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20210914