CN110378940B - Aviation image feature point matching diffusion recursive calibration method - Google Patents

Aviation image feature point matching diffusion recursive calibration method

Info

Publication number
CN110378940B
CN110378940B (application CN201910521634.4A)
Authority
CN
China
Prior art keywords
density
image
matching
reference image
units
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910521634.4A
Other languages
Chinese (zh)
Other versions
CN110378940A (en)
Inventor
张志伟
周文宗
胡伍生
沙月进
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University
Priority to CN201910521634.4A
Publication of CN110378940A
Application granted
Publication of CN110378940B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/22: Matching criteria, e.g. proximity measures
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G06T7/11: Region-based segmentation
    • G06T7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33: Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a diffusion recursive calibration method for aerial image feature point matching, which comprises the following steps. S1: divide the reference image and the matching image into density units. S2: perform the following operation on both the reference image and the matching image: set a threshold n on the number of feature points in a density unit, mark density units containing at least n feature points as high-density units, and mark the other density units as low-density units. S3: perform the following operation on both the reference image and the matching image: extract the connected high-density units to obtain the high-density areas of the aerial image. S4: perform the following operation on both the reference image and the matching image: position-mark all the high-density areas. S5: match the high-density areas of the reference image and the matching image. The invention effectively improves anti-interference capability and efficiency.

Description

Aviation image feature point matching diffusion recursive calibration method
Technical Field
The invention relates to the field of aerial photogrammetry, and in particular to a diffusion recursive calibration method for aerial image feature point matching.
Background
Variations in illumination and rotations of the camera during flight make feature point matching in aerial triangulation images more difficult. To address this problem, the invention provides a region calibration algorithm based on the distribution density of feature points, which gives the feature point matching algorithm scale invariance and strengthens matching robustness. There are many ways to make feature point matching scale invariant; the two main approaches are exploiting feature orientation, as represented by the SIFT operator, and searching global image feature information, as represented by the Mean Shift algorithm. The largest source of interference in aerial images is the brightness change caused by lighting; SIFT-like algorithms cannot overcome brightness changes well, and although the Mean Shift algorithm has advantages, its effect is unstable.
Disclosure of Invention
Purpose of the invention: the invention aims to provide an aerial image feature point matching diffusion recursive calibration method with strong anti-interference capability.
Technical scheme: to achieve the above purpose, the invention adopts the following technical scheme:
The invention relates to a diffusion recursive calibration method for aerial image feature point matching, which comprises the following steps:
s1: dividing the reference image and the matching image into density units respectively;
s2: performing the following operation on both the reference image and the matching image: setting a threshold n on the number of feature points in a density unit, marking density units containing at least n feature points as high-density units, and marking the other density units as low-density units;
s3: performing the following operation on both the reference image and the matching image: extracting the connected high-density units to obtain the high-density areas of the aerial image;
s4: performing the following operation on both the reference image and the matching image: position-marking all the high-density areas;
s5: and matching the high-density areas of the reference image and the matching image.
Further, in step S1, the reference image is divided into density units according to equation (1):
(equation (1) appears only as an image, GDA0003947762610000011, in the original publication)
In equation (1), d is the side length of a density unit, t is the square root of the number of pixels contained in a density unit, w is the side length of the detection window of the feature point extraction operator, and l is the side length of the reference image.
Further, in step S2, the total number of high-density units does not exceed the total number of all density units.
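The thresholding of step S2 can be sketched as follows; the dictionary-of-counts input and the function name are illustrative assumptions for this sketch, not details given in the patent:

```python
def classify_density_units(counts, n):
    """Mark density units with at least n feature points as high-density
    and the rest as low-density (step S2).

    counts: dict mapping density unit key -> number of feature points
    n: threshold on the number of feature points
    Returns (high_units, low_units) as two sets of unit keys.
    """
    high = {unit for unit, c in counts.items() if c >= n}
    low = set(counts) - high
    return high, low
```

With n = 3, a unit holding 5 feature points is marked high-density while a unit holding 1 is marked low-density.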
Further, the step S3 specifically includes the following steps:
s31: establish x and y coordinate axes for the reference image and the matching image, and form keys from the coordinates on the x and y axes;
s32: query the value corresponding to the key in the set formed by all the high-density units, i.e., query whether an adjacent high-density unit exists; if so, continue to step S33, and if not, end;
s33: judge whether the adjacent high-density unit has already been diffused; if so, end, otherwise continue to step S34;
s34: return to step S32.
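Steps S31-S34 describe a diffusion over coordinate keys that amounts to a flood fill on the grid of high-density units. A minimal sketch, written iteratively rather than recursively and assuming 4-connectivity (the patent does not state the connectivity):

```python
def extract_high_density_regions(high_units):
    """Group connected high-density units into high-density areas
    (step S3) by diffusing from each unit to its neighbours.

    high_units: set of (x, y) keys of high-density units
    Returns a list of areas, each a set of unit keys.
    """
    remaining = set(high_units)   # units not yet diffused
    regions = []
    while remaining:
        seed = remaining.pop()
        region = {seed}
        stack = [seed]            # units still to diffuse from (the S32/S34 loop)
        while stack:
            x, y = stack.pop()
            for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if nb in remaining:        # adjacent and not yet diffused (S33)
                    remaining.remove(nb)
                    region.add(nb)
                    stack.append(nb)
        regions.append(region)
    return regions
```

An explicit stack replaces the recursion so that large regions cannot overflow the call stack; the set membership test plays the role of the key lookup in S32.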
Further, the specific process of step S4 is: the minimum y component over all high-density units in the high-density area is selected as the ordinate of the position mark point, and the minimum x component over all high-density units in the high-density area is selected as the abscissa of the position mark point.
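The position marking of step S4 reduces to taking component-wise minima over the units of an area; a one-function sketch (the tuple representation of unit keys is an assumption):

```python
def mark_region(region):
    """Position-mark a high-density area (step S4): the mark point is
    (min x component, min y component) over all units in the area.

    region: non-empty set of (x, y) unit keys
    """
    return (min(x for x, _ in region), min(y for _, y in region))
```

Note the mark point need not be a member of the area: for units {(2, 4), (3, 3)} the mark point is (2, 3).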
Further, the step S5 specifically includes the following steps:
s51: the following is performed for all high-density areas of the reference image: taking the position mark point of the high-density area of the reference image as the center, 40 × 40 pixel units are diffused to the periphery to form a high-density expanded area of the reference image;
s52: a high-density expanded area matrix X of the reference image is formed from all the high-density expanded areas of the reference image obtained in step S51;
s53: the following is performed for all high-density areas of the matching image: taking the position mark point of the high-density area of the matching image as the center, 40 × 40 pixel units are diffused to the periphery to form a high-density expanded area of the matching image;
s54: a high-density expanded area matrix Y of the matching image is formed from all the high-density expanded areas of the matching image obtained in step S53; the correlation coefficient ρ_{X,Y} of X and Y is calculated according to equation (2):
ρ_{X,Y} = cov(X, Y) / (σ_X σ_Y) = E[(X - μ_X)(Y - μ_Y)] / (σ_X σ_Y)    (2)
In equation (2), cov(X, Y) is the covariance of X and Y, σ_X is the standard deviation of X, σ_Y is the standard deviation of Y, μ_X is the mean of X, and μ_Y is the mean of Y;
s55: the element of X and the element of Y corresponding to the largest entry of the correlation coefficient matrix ρ_{X,Y} are taken as a pair of homonymous high-density areas, and r_x and r_y are calculated according to equation (3):
(equation (3) appears only as an image, GDA0003947762610000022, in the original publication)
In equation (3), r_x is the number of pixels of horizontal displacement required to align the matching image with the reference image, r_y is the number of pixels of vertical displacement required to align the matching image with the reference image, x_l is the abscissa of the element of X in the pair of homonymous high-density regions, x_r is the abscissa of the element of Y, y_l is the ordinate of the element of X, y_r is the ordinate of the element of Y, and d is the side length of the density units in the reference image and the matching image.
Advantageous effects: the aerial image feature point matching diffusion recursive calibration method disclosed by the invention first finds the high-density areas of the reference image and the matching image and then matches those high-density areas, which effectively improves the anti-interference capability; the division into density units greatly reduces the complexity of the method, replacing a traversal of every pixel in the whole image with a traversal of a limited number of matrices, which improves the efficiency of the method.
Drawings
FIG. 1 is a flow chart of a method in accordance with an embodiment of the present invention.
Detailed Description
The technical solution of the present invention will be further described with reference to the following embodiments.
The specific embodiment discloses an aviation image feature point matching diffusion recursive calibration method, as shown in fig. 1, comprising the following steps:
s1: dividing the reference image and the matching image into density units respectively;
s2: performing the following operation on both the reference image and the matching image: setting a threshold n on the number of feature points in a density unit, marking density units containing at least n feature points as high-density units, and marking the other density units as low-density units, wherein the total number of high-density units is not more than 20% of the sum of the numbers of high-density and low-density units;
s3: performing the following operation on both the reference image and the matching image: extracting the connected high-density units to obtain the high-density areas of the aerial image;
s4: performing the following operation on both the reference image and the matching image: position-marking all the high-density areas;
s5: and matching the high-density areas of the reference image and the matching image.
In step S1, the reference image is divided into density units according to equation (1):
(equation (1) appears only as an image, GDA0003947762610000031, in the original publication)
In equation (1), d is the side length of a density unit, t is the square root of the number of pixels contained in a density unit, w is the side length of the detection window of the feature point extraction operator, and l is the side length of the reference image.
The matching image is divided into density units in the same way as the reference image, which is not repeated here.
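As a concrete illustration of step S1, binning feature point coordinates into d × d density units can be sketched as below; the function name, the point-list input, and the dictionary-of-counts output are assumptions made for this sketch, not details given in the patent:

```python
from collections import defaultdict

def bin_feature_points(points, d):
    """Divide an image into square density units of side length d and
    count the feature points falling in each unit (step S1).

    points: iterable of (x, y) feature point coordinates in pixels
    d: side length of a density unit in pixels
    Returns a dict mapping (col, row) unit keys to feature point counts.
    """
    counts = defaultdict(int)
    for x, y in points:
        counts[(int(x // d), int(y // d))] += 1
    return dict(counts)
```

With d = 40, points at (5, 5) and (6, 7) fall into unit (0, 0), while a point at (45, 5) falls into unit (1, 0).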
In step S2, the total number of high-density units does not exceed the total number of all density units.
The step S3 specifically includes the following steps:
s31: establish x and y coordinate axes for the reference image and the matching image, and form keys from the coordinates on the x and y axes;
s32: query the value corresponding to the key in the set formed by all the high-density units, i.e., query whether an adjacent high-density unit exists; if so, continue to step S33, and if not, end;
s33: judge whether the adjacent high-density unit has already been diffused; if so, end, otherwise continue to step S34;
s34: return to step S32.
The specific process of step S4 is: the minimum y component over all high-density units in the high-density area is selected as the ordinate of the position mark point, and the minimum x component over all high-density units in the high-density area is selected as the abscissa of the position mark point.
The step S5 specifically includes the following steps:
s51: the following is performed for all high-density areas of the reference image: taking the position mark point of the high-density area of the reference image as the center, 40 × 40 pixel units are diffused to the periphery to form a high-density expanded area of the reference image;
s52: a high-density expanded area matrix X of the reference image is formed from all the high-density expanded areas of the reference image obtained in step S51;
s53: the following is performed for all high-density areas of the matching image: taking the position mark point of the high-density area of the matching image as the center, 40 × 40 pixel units are diffused to the periphery to form a high-density expanded area of the matching image;
s54: a high-density expanded area matrix Y of the matching image is formed from all the high-density expanded areas of the matching image obtained in step S53; the correlation coefficient ρ_{X,Y} of X and Y is calculated according to equation (2):
ρ_{X,Y} = cov(X, Y) / (σ_X σ_Y) = E[(X - μ_X)(Y - μ_Y)] / (σ_X σ_Y)    (2)
In equation (2), cov(X, Y) is the covariance of X and Y, σ_X is the standard deviation of X, σ_Y is the standard deviation of Y, μ_X is the mean of X, and μ_Y is the mean of Y;
s55: the element of X and the element of Y corresponding to the largest entry of the correlation coefficient matrix ρ_{X,Y} are taken as a pair of homonymous high-density areas, and r_x and r_y are calculated according to equation (3):
(equation (3) appears only as an image, GDA0003947762610000042, in the original publication)
In equation (3), r_x is the number of pixels of horizontal displacement required to align the matching image with the reference image, r_y is the number of pixels of vertical displacement required to align the matching image with the reference image, x_l is the abscissa of the element of X in the pair of homonymous high-density regions, x_r is the abscissa of the element of Y, y_l is the ordinate of the element of X, y_r is the ordinate of the element of Y, and d is the side length of the density units in the reference image and the matching image.

Claims (5)

1. An aviation image feature point matching diffusion recursive calibration method, characterized by comprising the following steps:
s1: dividing the reference image and the matching image into density units respectively;
s2: performing the following operation on both the reference image and the matching image: setting a threshold n on the number of feature points in a density unit, marking density units containing at least n feature points as high-density units, and marking the other density units as low-density units;
s3: performing the following operation on both the reference image and the matching image: extracting the connected high-density units to obtain the high-density areas of the aerial image;
s4: performing the following operation on both the reference image and the matching image: position-marking all the high-density areas;
s5: matching the high-density areas of the reference image and the matching image;
the step S5 specifically includes the following steps:
s51: the following is performed for all high-density areas of the reference image: taking the position mark point of the high-density area of the reference image as the center, 40 × 40 pixel units are diffused to the periphery to form a high-density expanded area of the reference image;
s52: a high-density expanded area matrix X of the reference image is formed from all the high-density expanded areas of the reference image obtained in step S51;
s53: the following is performed for all high-density areas of the matching image: taking the position mark point of the high-density area of the matching image as the center, 40 × 40 pixel units are diffused to the periphery to form a high-density expanded area of the matching image;
s54: a high-density expanded area matrix Y of the matching image is formed from all the high-density expanded areas of the matching image obtained in step S53; the correlation coefficient ρ_{X,Y} of X and Y is calculated according to equation (2):
ρ_{X,Y} = cov(X, Y) / (σ_X σ_Y) = E[(X - μ_X)(Y - μ_Y)] / (σ_X σ_Y)    (2)
In equation (2), cov(X, Y) is the covariance of X and Y, σ_X is the standard deviation of X, σ_Y is the standard deviation of Y, μ_X is the mean of X, and μ_Y is the mean of Y;
s55: the element of X and the element of Y corresponding to the largest entry of the correlation coefficient matrix ρ_{X,Y} are taken as a pair of homonymous high-density areas, and r_x and r_y are calculated according to equation (3):
(equation (3) appears only as an image, FDA0003947762600000012, in the original publication)
In equation (3), r_x is the number of pixels of horizontal displacement required to align the matching image with the reference image, r_y is the number of pixels of vertical displacement required to align the matching image with the reference image, x_l is the abscissa of the element of X in the pair of homonymous high-density regions, x_r is the abscissa of the element of Y, y_l is the ordinate of the element of X, y_r is the ordinate of the element of Y, and d is the side length of the density units in the reference image and the matching image.
2. The aerial image feature point matching diffusion recursive calibration method according to claim 1, characterized in that in step S1, the reference image is divided into density units according to equation (1):
(equation (1) appears only as an image, FDA0003947762600000021, in the original publication)
In equation (1), d is the side length of a density unit, t is the square root of the number of pixels contained in a density unit, w is the side length of the detection window of the feature point extraction operator, and l is the side length of the reference image.
3. The aerial image feature point matching diffusion recursive calibration method according to claim 1, characterized in that in step S2, the total number of high-density units does not exceed the total number of all density units.
4. The aerial image feature point matching diffusion recursive calibration method according to claim 1, characterized in that step S3 specifically includes the following steps:
s31: establishing x and y coordinate axes for the reference image and the matching image, and forming keys from the coordinates on the x and y axes;
s32: querying the value corresponding to the key in the set formed by all the high-density units, i.e., querying whether an adjacent high-density unit exists; if so, continuing to step S33, and if not, ending;
s33: judging whether the adjacent high-density unit has already been diffused; if so, ending, otherwise continuing to step S34;
s34: returning to step S32.
5. The aerial image feature point matching diffusion recursive calibration method according to claim 1, characterized in that: the specific process of step S4 is: the minimum y-value component of all the high-density units in the high-density area is selected as the ordinate of the position mark point, and the minimum x-value component of all the high-density units in the high-density area is selected as the abscissa of the position mark point.
CN201910521634.4A 2019-06-17 2019-06-17 Aviation image feature point matching diffusion recursive calibration method Active CN110378940B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910521634.4A CN110378940B (en) 2019-06-17 2019-06-17 Aviation image feature point matching diffusion recursive calibration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910521634.4A CN110378940B (en) 2019-06-17 2019-06-17 Aviation image feature point matching diffusion recursive calibration method

Publications (2)

Publication Number Publication Date
CN110378940A (en) 2019-10-25
CN110378940B (en) 2023-04-07

Family

ID=68248982

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910521634.4A Active CN110378940B (en) 2019-06-17 2019-06-17 Aviation image feature point matching diffusion recursive calibration method

Country Status (1)

Country Link
CN (1) CN110378940B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112514970B (en) * 2020-12-08 2021-11-05 烟台海裕食品有限公司 Self-adaptive fish scale removing platform and method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101122999A (en) * 2007-04-16 2008-02-13 北京联合大学 Method for automatically extracting stamper image from Chinese painting and calligraphy
US20120148164A1 (en) * 2010-12-08 2012-06-14 Electronics And Telecommunications Research Institute Image matching devices and image matching methods thereof
CN109816051A (en) * 2019-02-25 2019-05-28 北京石油化工学院 A kind of harmful influence cargo characteristic point matching method and system


Also Published As

Publication number Publication date
CN110378940A (en) 2019-10-25

Similar Documents

Publication Publication Date Title
CN107665351B (en) Airport detection method based on difficult sample mining
CN101957919B (en) Character recognition method based on image local feature retrieval
CN107688806B (en) Affine transformation-based free scene text detection method
CN109460764B (en) Satellite video ship monitoring method combining brightness characteristics and improved interframe difference method
WO2017016448A1 (en) Qr code feature detection method and system
CN105205488A (en) Harris angular point and stroke width based text region detection method
CN107992856B (en) High-resolution remote sensing building shadow detection method under urban scene
CN111310768B (en) Saliency target detection method based on robustness background prior and global information
CN107862319B (en) Heterogeneous high-light optical image matching error eliminating method based on neighborhood voting
CN111461036B (en) Real-time pedestrian detection method using background modeling to enhance data
CN111783773B (en) Correction method for angle-inclined telegraph pole signboard
CN110852207A (en) Blue roof building extraction method based on object-oriented image classification technology
CN102004914B (en) Method and device for detecting round seal position
CN110378940B (en) Aviation image feature point matching diffusion recursive calibration method
US20210286970A1 (en) Cloud detection method based on landsat 8 snow-containing image
CN111160107B (en) Dynamic region detection method based on feature matching
CN112529901A (en) Crack identification method in complex environment
CN108830283B (en) Image feature point matching method
CN105303566A (en) Target contour clipping-based SAR image target azimuth estimation method
CN112329641B (en) Form identification method, device, equipment and readable storage medium
CN112396582A (en) Mask RCNN-based equalizing ring skew detection method
CN112215319B (en) Two-dimensional code of color mark characteristic graph and identification method thereof
CN111695557B (en) Image processing method and device
US11132557B2 (en) Logo extraction device, and brightness adjusting method and device for logo
CN110348286B (en) Face fitting and matching method based on least square method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant