CN110378940A - Aviation image feature point matching diffusion recursive calibration method - Google Patents

Aviation image feature point matching diffusion recursive calibration method

Info

Publication number
CN110378940A
CN110378940A
Authority
CN
China
Prior art keywords
density
image
matching
benchmark image
benchmark
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910521634.4A
Other languages
Chinese (zh)
Other versions
CN110378940B (en)
Inventor
张志伟
周文宗
胡伍生
沙月进
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN201910521634.4A
Publication of CN110378940A
Application granted
Publication of CN110378940B
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an aviation image feature point matching diffusion recursive calibration method, comprising the following steps: S1: divide the benchmark image and the matching image into density units respectively; S2: perform the following operation on both the benchmark image and the matching image: set a threshold n for the number of feature points in a density unit, mark density units whose feature point count is ≥ n as high-density units, and mark the remaining density units as low-density units; S3: perform the following operation on both the benchmark image and the matching image: extract the connected high-density units to obtain the high-density regions of the aviation image; S4: perform the following operation on both the benchmark image and the matching image: position-mark all high-density regions; S5: match the high-density regions of the benchmark image and the matching image. The invention effectively improves anti-interference ability and efficiency.

Description

Aviation image feature point matching diffusion recursive calibration method
Technical field
The present invention relates to the field of aerial photogrammetry, and more particularly to an aviation image feature point matching diffusion recursive calibration method.
Background technique
When an aircraft captures imagery, variations in lighting and angular rotation make it more difficult to match the feature points of aerial triangulation images. To solve this problem, the present invention proposes a regional calibration algorithm based on feature point distribution density, which gives the feature point matching algorithm scale invariance and enhances matching robustness. There are several approaches to scale-invariant feature point matching, mainly of two kinds: exploring feature directions, represented by the SIFT operator, and searching for global image feature information, represented by the Mean Shift algorithm. In aviation images the greatest interference is the brightness change caused by lighting, which SIFT-like algorithms cannot overcome well, while the beneficial effect of Mean Shift algorithms is unstable.
Summary of the invention
Goal of the invention: the object of the present invention is to provide an aviation image feature point matching diffusion recursive calibration method with strong anti-interference capability.
Technical solution: to achieve this purpose, the present invention adopts the following technical scheme:
The aviation image feature point matching diffusion recursive calibration method of the present invention comprises the following steps:
S1: divide the benchmark image and the matching image into density units respectively;
S2: perform the following operation on both the benchmark image and the matching image: set a threshold n for the number of feature points in a density unit, mark density units whose feature point count is ≥ n as high-density units, and mark the remaining density units as low-density units;
S3: perform the following operation on both the benchmark image and the matching image: extract the connected high-density units to obtain the high-density regions of the aviation image;
S4: perform the following operation on both the benchmark image and the matching image: position-mark all high-density regions;
S5: match the high-density regions of the benchmark image and the matching image.
Further, in step S1, the benchmark image is divided into density units according to formula (1):
In formula (1), d is the side length of a density unit, t is the square root of the number of pixels contained in a density unit, w is the side length of the detection window of the feature point extraction operator, and l is the side length of the benchmark image.
Further, in step S2, the total number of high-density units does not exceed the total number of all density units.
Further, step S3 specifically comprises the following steps:
S31: establish an x, y coordinate system for the benchmark image and the matching image, and form keys from the coordinates in the x, y coordinate system;
S32: query the value corresponding to a key in the set composed of all high-density units, i.e. query whether there is an adjacent high-density unit; if there is, continue to step S33; if not, terminate;
S33: judge whether the adjacent high-density unit has already been diffused into; if it has, terminate; if not, continue to step S34;
S34: return to step S32.
Further, the specific process of step S4 is: select the minimum y-value component over all high-density units in a high-density region as the ordinate of the position mark point, and select the minimum x-value component over all high-density units in the high-density region as the abscissa of the position mark point.
Further, step S5 specifically comprises the following steps:
S51: perform the following operation on every high-density region of the benchmark image: centered on the position mark point of the high-density region of the benchmark image, diffuse outward to form a 40 × 40 pixel-unit high-density expansion region of the benchmark image;
S52: all high-density expansion regions of the benchmark image obtained in step S51 constitute the high-density expansion region matrix X of the benchmark image;
S53: perform the following operation on every high-density region of the matching image: centered on the position mark point of the high-density region of the matching image, diffuse outward to form a 40 × 40 pixel-unit high-density expansion region of the matching image;
S54: all high-density expansion regions of the matching image obtained in step S53 constitute the high-density expansion region matrix Y of the matching image; calculate the correlation coefficient ρX,Y of X and Y according to formula (2):
ρX,Y = E[(X − μX)(Y − μY)]/(σX·σY) = cov(X, Y)/(σX·σY)   (2)
In formula (2), cov(X, Y) is the covariance of X and Y, σX is the standard deviation of X, σY is the standard deviation of Y, μX is the mean of X, and μY is the mean of Y;
S55: take the element of X and the element of Y corresponding to the largest element among all elements of the correlation coefficient matrix ρX,Y as one group of same-name high-density regions, and calculate rx and ry according to formula (3):
In formula (3), rx is the number of pixels of horizontal displacement required for the matching image to be aligned with the benchmark image, ry is the number of pixels of vertical displacement required for the matching image to be aligned with the benchmark image, xl is the abscissa of the element of X in the group of same-name high-density regions, xr is the abscissa of the element of Y in the group of same-name high-density regions, yl is the ordinate of the element of X in the group of same-name high-density regions, yr is the ordinate of the element of Y in the group of same-name high-density regions, and d is the side length of the density units in the benchmark image and the matching image.
Beneficial effects: the invention discloses an aviation image feature point matching diffusion recursive calibration method that first finds the high-density regions of the benchmark image and the matching image and then matches those high-density regions, which effectively improves anti-interference ability; the division into density units greatly simplifies the method's complexity by converting a traversal of all pixels of the whole image into a traversal of a matrix of finite size, improving the method's efficiency.
Detailed description of the invention
Fig. 1 is a flow chart of the method in the specific embodiment of the invention.
Specific embodiment
The technical solution of the present invention is further described below with reference to a specific embodiment.
The present embodiment discloses an aviation image feature point matching diffusion recursive calibration method which, as shown in Fig. 1, comprises the following steps:
S1: divide the benchmark image and the matching image into density units respectively;
S2: perform the following operation on both the benchmark image and the matching image: set a threshold n for the number of feature points in a density unit, mark density units whose feature point count is ≥ n as high-density units, and mark the remaining density units as low-density units; wherein the number of high-density units does not exceed 20% of the total number of high-density and low-density units;
S3: perform the following operation on both the benchmark image and the matching image: extract the connected high-density units to obtain the high-density regions of the aviation image;
S4: perform the following operation on both the benchmark image and the matching image: position-mark all high-density regions;
S5: match the high-density regions of the benchmark image and the matching image.
In step S1, the benchmark image is divided into density units according to formula (1):
In formula (1), d is the side length of a density unit, t is the square root of the number of pixels contained in a density unit, w is the side length of the detection window of the feature point extraction operator, and l is the side length of the benchmark image.
The matching image is divided into density units in the same way as the benchmark image, which is not repeated here.
In step S2, the total number of high-density units does not exceed the total number of all density units.
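The patent publishes no code, but the grid division of step S1 and the thresholding of step S2 can be sketched in Python. The cell side length d, the threshold n, and the example feature points below are illustrative assumptions, not values from the patent:

```python
from collections import defaultdict

def mark_high_density_units(points, d, n):
    """S1/S2 sketch: bucket feature points into d x d density units,
    then mark the units holding at least n points as high-density."""
    counts = defaultdict(int)
    for x, y in points:
        # S1: each feature point falls into exactly one density unit
        counts[(int(x // d), int(y // d))] += 1
    # S2: feature point count >= n -> high-density; the rest are low-density
    return {cell for cell, c in counts.items() if c >= n}

# Illustrative usage: three points share unit (0, 0), one sits alone in (5, 5)
points = [(1.0, 1.0), (2.0, 3.0), (4.0, 4.0), (55.0, 55.0)]
high = mark_high_density_units(points, d=10, n=3)  # -> {(0, 0)}
```

Keying the units by their integer grid coordinates, rather than scanning every pixel, is what makes the later diffusion step a lookup over a small set.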
Step S3 specifically comprises the following steps:
S31: establish an x, y coordinate system for the benchmark image and the matching image, and form keys from the coordinates in the x, y coordinate system;
S32: query the value corresponding to a key in the set composed of all high-density units, i.e. query whether there is an adjacent high-density unit; if there is, continue to step S33; if not, terminate;
S33: judge whether the adjacent high-density unit has already been diffused into; if it has, terminate; if not, continue to step S34;
S34: return to step S32.
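The diffusion recursion of steps S31 to S34 amounts to a flood fill over the keyed set of high-density units. A minimal sketch, under the assumption of 4-connectivity (the patent does not state which adjacency it uses):

```python
def extract_regions(high_units):
    """S3 sketch: group connected high-density units into regions.
    high_units: set of (x, y) unit coordinates; 4-connectivity assumed."""
    seen, regions = set(), []
    for start in high_units:
        if start in seen:
            continue  # S33: this unit was already diffused into
        region, stack = set(), [start]
        while stack:
            cell = stack.pop()
            if cell in seen:
                continue
            seen.add(cell)
            region.add(cell)
            x, y = cell
            # S32: query the keyed set for adjacent high-density units
            for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if nb in high_units and nb not in seen:
                    stack.append(nb)  # S34: keep diffusing from the neighbour
        regions.append(region)
    return regions

# Illustrative usage: two connected components among four high-density units
regions = extract_regions({(0, 0), (0, 1), (1, 1), (5, 5)})  # -> 2 regions
```

An explicit stack replaces the recursion of S34, which avoids recursion-depth limits on large images while visiting exactly the same units.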
The specific process of step S4 is: select the minimum y-value component over all high-density units in a high-density region as the ordinate of the position mark point, and select the minimum x-value component over all high-density units in the high-density region as the abscissa of the position mark point.
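Step S4 reduces to two minima over a region's unit coordinates. A sketch, with `region` assumed to be a set of (x, y) unit coordinates as produced by step S3:

```python
def position_mark(region):
    """S4 sketch: the position mark point of a high-density region takes
    the minimum x value as abscissa and the minimum y value as ordinate."""
    return (min(x for x, _ in region), min(y for _, y in region))

# The mark point need not coincide with any unit of the region
mark = position_mark({(3, 2), (2, 3)})  # -> (2, 2)
```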
Step S5 specifically comprises the following steps:
S51: perform the following operation on every high-density region of the benchmark image: centered on the position mark point of the high-density region of the benchmark image, diffuse outward to form a 40 × 40 pixel-unit high-density expansion region of the benchmark image;
S52: all high-density expansion regions of the benchmark image obtained in step S51 constitute the high-density expansion region matrix X of the benchmark image;
S53: perform the following operation on every high-density region of the matching image: centered on the position mark point of the high-density region of the matching image, diffuse outward to form a 40 × 40 pixel-unit high-density expansion region of the matching image;
S54: all high-density expansion regions of the matching image obtained in step S53 constitute the high-density expansion region matrix Y of the matching image; calculate the correlation coefficient ρX,Y of X and Y according to formula (2):
ρX,Y = E[(X − μX)(Y − μY)]/(σX·σY) = cov(X, Y)/(σX·σY)   (2)
In formula (2), cov(X, Y) is the covariance of X and Y, σX is the standard deviation of X, σY is the standard deviation of Y, μX is the mean of X, and μY is the mean of Y;
S55: take the element of X and the element of Y corresponding to the largest element among all elements of the correlation coefficient matrix ρX,Y as one group of same-name high-density regions, and calculate rx and ry according to formula (3):
In formula (3), rx is the number of pixels of horizontal displacement required for the matching image to be aligned with the benchmark image, ry is the number of pixels of vertical displacement required for the matching image to be aligned with the benchmark image, xl is the abscissa of the element of X in the group of same-name high-density regions, xr is the abscissa of the element of Y in the group of same-name high-density regions, yl is the ordinate of the element of X in the group of same-name high-density regions, yr is the ordinate of the element of Y in the group of same-name high-density regions, and d is the side length of the density units in the benchmark image and the matching image.
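Step S5 can be sketched with NumPy. The Pearson form of formula (2) follows directly from the symbols the text defines; the displacement rule shown for formula (3) is a hedged reconstruction only, it assumes the mark point offsets are expressed in density-unit coordinates and are scaled to pixels by the unit side length d, which the original figure may specify differently:

```python
import numpy as np

def correlation(X, Y):
    """Formula (2) sketch: correlation coefficient of two 40 x 40
    high-density expansion regions, computed on the flattened windows."""
    x = np.asarray(X, dtype=float).ravel()
    y = np.asarray(Y, dtype=float).ravel()
    cov = ((x - x.mean()) * (y - y.mean())).mean()  # cov(X, Y)
    return cov / (x.std() * y.std())                # / (sigma_X * sigma_Y)

def displacement(mark_l, mark_r, d):
    """Assumed reading of formula (3): pixel displacement of the matching
    image from the offset of a same-name region pair's mark points."""
    return ((mark_l[0] - mark_r[0]) * d, (mark_l[1] - mark_r[1]) * d)

# A window correlates perfectly with an affine copy of itself
X = np.arange(1600.0).reshape(40, 40)
rho = correlation(X, 2.0 * X + 1.0)  # -> 1.0 up to float rounding
rx, ry = displacement((5, 7), (2, 3), d=10)  # -> (30, 40)
```

Because formula (2) normalizes by both standard deviations, a uniform brightness shift or gain change between the two windows leaves the coefficient unchanged, which is consistent with the anti-interference claim of the invention.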

Claims (6)

1. An aviation image feature point matching diffusion recursive calibration method, characterized by comprising the following steps:
S1: divide the benchmark image and the matching image into density units respectively;
S2: perform the following operation on both the benchmark image and the matching image: set a threshold n for the number of feature points in a density unit, mark density units whose feature point count is ≥ n as high-density units, and mark the remaining density units as low-density units;
S3: perform the following operation on both the benchmark image and the matching image: extract the connected high-density units to obtain the high-density regions of the aviation image;
S4: perform the following operation on both the benchmark image and the matching image: position-mark all high-density regions;
S5: match the high-density regions of the benchmark image and the matching image.
2. The aviation image feature point matching diffusion recursive calibration method according to claim 1, characterized in that in step S1 the benchmark image is divided into density units according to formula (1):
In formula (1), d is the side length of a density unit, t is the square root of the number of pixels contained in a density unit, w is the side length of the detection window of the feature point extraction operator, and l is the side length of the benchmark image.
3. The aviation image feature point matching diffusion recursive calibration method according to claim 1, characterized in that in step S2 the total number of high-density units does not exceed the total number of all density units.
4. The aviation image feature point matching diffusion recursive calibration method according to claim 1, characterized in that step S3 specifically comprises the following steps:
S31: establish an x, y coordinate system for the benchmark image and the matching image, and form keys from the coordinates in the x, y coordinate system;
S32: query the value corresponding to a key in the set composed of all high-density units, i.e. query whether there is an adjacent high-density unit; if there is, continue to step S33; if not, terminate;
S33: judge whether the adjacent high-density unit has already been diffused into; if it has, terminate; if not, continue to step S34;
S34: return to step S32.
5. The aviation image feature point matching diffusion recursive calibration method according to claim 1, characterized in that the specific process of step S4 is: select the minimum y-value component over all high-density units in a high-density region as the ordinate of the position mark point, and select the minimum x-value component over all high-density units in the high-density region as the abscissa of the position mark point.
6. The aviation image feature point matching diffusion recursive calibration method according to claim 5, characterized in that step S5 specifically comprises the following steps:
S51: perform the following operation on every high-density region of the benchmark image: centered on the position mark point of the high-density region of the benchmark image, diffuse outward to form a 40 × 40 pixel-unit high-density expansion region of the benchmark image;
S52: all high-density expansion regions of the benchmark image obtained in step S51 constitute the high-density expansion region matrix X of the benchmark image;
S53: perform the following operation on every high-density region of the matching image: centered on the position mark point of the high-density region of the matching image, diffuse outward to form a 40 × 40 pixel-unit high-density expansion region of the matching image;
S54: all high-density expansion regions of the matching image obtained in step S53 constitute the high-density expansion region matrix Y of the matching image; calculate the correlation coefficient ρX,Y of X and Y according to formula (2):
ρX,Y = E[(X − μX)(Y − μY)]/(σX·σY) = cov(X, Y)/(σX·σY)   (2)
In formula (2), cov(X, Y) is the covariance of X and Y, σX is the standard deviation of X, σY is the standard deviation of Y, μX is the mean of X, and μY is the mean of Y;
S55: take the element of X and the element of Y corresponding to the largest element among all elements of the correlation coefficient matrix ρX,Y as one group of same-name high-density regions, and calculate rx and ry according to formula (3):
In formula (3), rx is the number of pixels of horizontal displacement required for the matching image to be aligned with the benchmark image, ry is the number of pixels of vertical displacement required for the matching image to be aligned with the benchmark image, xl is the abscissa of the element of X in the group of same-name high-density regions, xr is the abscissa of the element of Y in the group of same-name high-density regions, yl is the ordinate of the element of X in the group of same-name high-density regions, yr is the ordinate of the element of Y in the group of same-name high-density regions, and d is the side length of the density units in the benchmark image and the matching image.
CN201910521634.4A 2019-06-17 2019-06-17 Aviation image feature point matching diffusion recursive calibration method Active CN110378940B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910521634.4A CN110378940B (en) 2019-06-17 2019-06-17 Aviation image feature point matching diffusion recursive calibration method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910521634.4A CN110378940B (en) 2019-06-17 2019-06-17 Aviation image feature point matching diffusion recursive calibration method

Publications (2)

Publication Number Publication Date
CN110378940A (en) 2019-10-25
CN110378940B (en) 2023-04-07

Family

ID=68248982

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910521634.4A Active CN110378940B (en) 2019-06-17 2019-06-17 Aviation image feature point matching diffusion recursive calibration method

Country Status (1)

Country Link
CN (1) CN110378940B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112514970A (en) * 2020-12-08 2021-03-19 泰州市朗嘉馨网络科技有限公司 Self-adaptive fish scale removing platform and method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101122999A (en) * 2007-04-16 2008-02-13 北京联合大学 Method for automatically extracting stamper image from Chinese painting and calligraphy
US20120148164A1 (en) * 2010-12-08 2012-06-14 Electronics And Telecommunications Research Institute Image matching devices and image matching methods thereof
CN109816051A (en) * 2019-02-25 2019-05-28 北京石油化工学院 A kind of harmful influence cargo characteristic point matching method and system



Also Published As

Publication number Publication date
CN110378940B (en) 2023-04-07

Similar Documents

Publication Publication Date Title
US8218853B2 (en) Change discrimination device, change discrimination method and change discrimination program
US7515153B2 (en) Map generation device, map delivery method, and map generation program
Recky et al. Windows detection using k-means in cie-lab color space
WO2016062159A1 (en) Image matching method and platform for testing of mobile phone applications
Neumann A multi-ring color fiducial system and an intensity-invariant detection method for scalable fiducial-tracking augmented reality
CN101770646B (en) Edge detection method based on Bayer RGB images
CN109883654A (en) A kind of chessboard trrellis diagram, generation method and localization method for OLED sub-pixel positioning
CN105913070B (en) A kind of multi thread conspicuousness extracting method based on light-field camera
CN101770583B (en) Template matching method based on global features of scene
CN105225251B (en) Over the horizon movement overseas target based on machine vision quickly identifies and positioner and method
CN107066989B (en) Method and system for identifying accumulated snow of geostationary satellite remote sensing sequence image
CN102385753A (en) Illumination-classification-based adaptive image segmentation method
CN104657980A (en) Improved multi-channel image partitioning algorithm based on Meanshift
CN106815587A (en) Image processing method and device
CN110852207A (en) Blue roof building extraction method based on object-oriented image classification technology
CN110378940A (en) Aerial image Feature Points Matching spreads recurrence calibration method
CN105469392A (en) High spectral image significance detection method based on regional spectrum gradient characteristic comparison
JPH05181411A (en) Map information collation and update system
Li et al. A novel framework for urban change detection using VHR satellite images
Shiting et al. Clustering-based shadow edge detection in a single color image
CN109635679A (en) A kind of real-time target sheet positioning and loop wire recognition methods
CN108830273A (en) Visibility measurement method based on Image Warping
CN105681677B (en) A kind of high-resolution optical remote sensing Satellite Camera optimal focal plane determines method
CN108629227A (en) The method and system on left and right vehicle wheel boundary are determined in the picture
CN106683128B (en) Sub-pixel registration method for airport runway image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant