CN109272541A - Image matching method and device - Google Patents

Image matching method and device

Info

Publication number
CN109272541A
CN109272541A (application CN201810970690.1A)
Authority
CN
China
Prior art keywords
low texture
texture density
image
density subgraph
subgraph
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201810970690.1A
Other languages
Chinese (zh)
Other versions
CN109272541B (en)
Inventor
罗胜 (Luo Sheng)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dragon Totem Technology Hefei Co ltd
Original Assignee
Institute of Laser and Optoelectronics Intelligent Manufacturing of Wenzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Laser and Optoelectronics Intelligent Manufacturing of Wenzhou University
Priority to CN201810970690.1A
Publication of CN109272541A
Application granted
Publication of CN109272541B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/30 - Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/40 - Analysis of texture
    • G06T7/41 - Analysis of texture based on statistical description of texture
    • G06T7/44 - Analysis of texture based on statistical description of texture using image operators, e.g. filters, edge density metrics or local histograms
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 - Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/757 - Matching configurations of points or features
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 - Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Probability & Statistics with Applications (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides an image matching method and device. The method includes: obtaining each low-texture-density subimage in images to be matched G1 and G2; obtaining a feature descriptor for each low-texture-density subimage; obtaining, for each of G1 and G2, the edges shared by pairwise-adjacent low-texture-density subimages and the attributes of those edges; obtaining an undirected graph of G1 from the feature descriptors of the low-texture-density subimages in G1 and the edges and edge attributes in G1, and an undirected graph of G2 likewise; constructing a similarity matrix of G1 and G2 from the two undirected graphs; and searching the similarity matrix of G1 and G2 for the path of maximum similarity, which determines the matching relationship between G1 and G2. The method has the beneficial effects of stable matching and high accuracy.

Description

Image matching method and device
Technical field
The present invention relates to the field of image matching, and in particular to an image matching method, system, and device.
Background art
With the development of science and technology, image matching technology is becoming ever more widely used in the modern field of information processing.
A common matching approach is point-based image matching, but point-based methods suffer from drawbacks such as poor matching stability, low accuracy, slow matching speed, weak semantics, and poor user-friendliness.
Summary of the invention
To solve the above problems, the present invention provides an image matching method, system, and device that overcome the above problems or at least partially solve them.
According to a first aspect of the embodiments of the present invention, an image matching method is provided, comprising:
obtaining each low-texture-density subimage in the images to be matched G1 and G2, where each low-texture-density subimage is a region whose texture density is lower than n times the average texture density of the G1 or G2 to which it belongs, with n >= 0;
obtaining the feature descriptor of each low-texture-density subimage; obtaining, for each of G1 and G2, the edges shared by pairwise-adjacent low-texture-density subimages and the attributes of those edges;
obtaining the undirected graph of G1 from the feature descriptors of the low-texture-density subimages in G1 and the edges and edge attributes in G1; obtaining the undirected graph of G2 from the feature descriptors of the low-texture-density subimages in G2 and the edges and edge attributes in G2; constructing the similarity matrix of G1 and G2 from the two undirected graphs;
searching the similarity matrix of G1 and G2 for the path of maximum similarity, and determining the matching relationship between G1 and G2 from the path of maximum similarity.
Further, after obtaining each low-texture-density subimage in the images to be matched G1 and G2, the method further includes:
normalizing each low-texture-density subimage, where the normalization includes image-orientation normalization, image-size normalization, and in-image gradient-maximum normalization.
Further, obtaining the feature descriptor of each low-texture-density subimage comprises:
dividing each normalized low-texture-density subimage into m*m grandchild images, where m > 1;
taking the 4m-4 border grandchild images of each low-texture-density subimage and accumulating the gradient magnitude of each pixel in each of those grandchild images by direction into u principal directions, with u > 0, to obtain the gradient histograms of the 4m-4 grandchild images of each low-texture-density subimage;
concatenating, in counterclockwise or clockwise order, the gradient histograms of the 4m-4 grandchild images of each low-texture-density subimage to obtain the feature descriptor of each low-texture-density subimage.
Further, obtaining, for each of G1 and G2, the edges shared by pairwise-adjacent low-texture-density subimages and the attributes of those edges comprises:
obtaining the pixels shared by each pair of adjacent low-texture-density subimages in G1 and G2, which serve as the edge between the two adjacent low-texture-density subimages;
obtaining the attributes of each edge, where the attributes of an edge include: the length of the line segment Q1Q2 formed by the image centers of the two adjacent low-texture-density subimages the edge corresponds to, the chain code of the edge between the two adjacent low-texture-density subimages, whether the chain code is closed, and the ratio of the chain-code length to the length of Q1Q2.
Further, obtaining the undirected graph of G1 from the feature descriptors of the low-texture-density subimages in G1 and the edges and edge attributes in G1, and obtaining the undirected graph of G2 from the feature descriptors of the low-texture-density subimages in G2 and the edges and edge attributes in G2, comprises:
abstracting each low-texture-density subimage in G1 and G2 into a vertex and the shared boundary between each pair of adjacent low-texture-density subimages into an edge; the vertices and edges constitute the undirected graphs of G1 and G2.
Further, constructing the similarity matrix of G1 and G2 from the undirected graphs of G1 and G2 comprises:
the similarities between vertices constitute the point pairs in the similarity matrix; the similarities between edge pairs constitute the edges between the point pairs in the similarity matrix.
The path of maximum similarity is searched in the similarity matrix of G1 and G2, and the matching relationship between G1 and G2 is determined from this path.
Based on a dynamic programming algorithm, the path of maximum similarity is searched in the similarity matrix of G1 and G2, and the matching relationship between G1 and G2 is determined from this path.
According to a further aspect of the present invention, an image matching system is provided, comprising:
a first obtaining module, configured to obtain each low-texture-density subimage in the images to be matched G1 and G2, where each low-texture-density subimage is a region whose texture density is lower than n times the average texture density of the G1 or G2 to which it belongs, with n >= 0;
a second obtaining module, configured to obtain the feature descriptor of each low-texture-density subimage, and to obtain, for each of G1 and G2, the edges shared by pairwise-adjacent low-texture-density subimages and the attributes of those edges;
a third obtaining module, configured to obtain the undirected graph of G1 from the feature descriptors of the low-texture-density subimages in G1 and the edges and edge attributes in G1, to obtain the undirected graph of G2 likewise, and to construct the similarity matrix of G1 and G2 from the two undirected graphs;
a computing module, configured to search the similarity matrix of G1 and G2 for the path of maximum similarity and to determine the matching relationship between G1 and G2 from the path of maximum similarity.
According to a further aspect of the invention, an image matching device is provided, comprising:
at least one processor; and
at least one memory communicatively connected to the processor, wherein:
the memory stores program instructions executable by the processor, and the processor calls the program instructions to perform any of the above methods.
According to a further aspect of the invention, a non-transitory computer-readable storage medium is provided, the non-transitory computer-readable storage medium storing computer instructions that cause a computer to perform any of the above methods.
The present invention provides an image matching method, system, and device that match images on the basis of region images. Compared with point-based matching, the invention has the beneficial effects of strong matching stability, high accuracy, fast matching speed, richer semantics, and better user-friendliness.
Brief description of the drawings
Fig. 1 is an overall flow diagram of an image matching method according to an embodiment of the present invention;
Fig. 2 is an overall framework diagram of an image matching system according to an embodiment of the present invention;
Fig. 3 is a block diagram of an image matching device according to an embodiment of the present invention.
Specific embodiments
Specific embodiments of the present invention are described in further detail below with reference to the accompanying drawings and examples. The following examples are intended to illustrate the present invention, not to limit its scope.
As shown in Fig. 1, the overall flow of an image matching method according to an embodiment of the present invention comprises:
S1: obtain each low-texture-density subimage in the images to be matched G1 and G2, where each low-texture-density subimage is a region whose texture density is lower than n times the average texture density of the G1 or G2 to which it belongs, with n >= 0;
S2: obtain the feature descriptor of each low-texture-density subimage; obtain, for each of G1 and G2, the edges shared by pairwise-adjacent low-texture-density subimages and the attributes of those edges;
S3: obtain the undirected graph of G1 from the feature descriptors of the low-texture-density subimages in G1 and the edges and edge attributes in G1; obtain the undirected graph of G2 likewise; construct the similarity matrix of G1 and G2 from the two undirected graphs;
S4: search the similarity matrix of G1 and G2 for the path of maximum similarity, and determine the matching relationship between G1 and G2 from the path of maximum similarity.
Specifically, the boundary lines between regions are found by the prior-art watershed algorithm to obtain each low-texture-density subimage, and each low-texture-density subimage contains at least n pixels.
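The selection rule above (keep regions whose texture density is below n times the image average, with a minimum pixel count) can be sketched as follows. This is a minimal illustration, not the patented implementation: segmentation is assumed to be done already (e.g. by a watershed), regions are given as pixel-coordinate lists, and "texture density" is taken to mean the fraction of a region's pixels whose gradient magnitude exceeds a threshold, a definition the patent does not spell out.

```python
def gradient_magnitude(img):
    """Forward-difference gradient magnitude of a 2-D grayscale list-of-lists."""
    h, w = len(img), len(img[0])
    g = [[0.0] * w for _ in range(h)]
    for y in range(h - 1):
        for x in range(w - 1):
            gx = img[y][x + 1] - img[y][x]
            gy = img[y + 1][x] - img[y][x]
            g[y][x] = (gx * gx + gy * gy) ** 0.5
    return g

def texture_density(grad, pixels, thresh=10.0):
    """Fraction of a region's pixels whose gradient magnitude exceeds thresh."""
    edges = sum(1 for (y, x) in pixels if grad[y][x] > thresh)
    return edges / len(pixels)

def low_texture_regions(img, regions, n=0.5, min_pixels=4):
    """Keep regions whose texture density is below n times the average density."""
    grad = gradient_magnitude(img)
    dens = [texture_density(grad, r) for r in regions]
    avg = sum(dens) / len(dens)
    return [r for r, d in zip(regions, dens)
            if d < n * avg and len(r) >= min_pixels]
```

Feeding the sketch a flat region and a striped region keeps only the flat one, mirroring the "low texture density" criterion.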
On the basis of any of the above specific embodiments of the invention, an image matching method is provided in which, after obtaining each low-texture-density subimage in the images to be matched G1 and G2, the method further includes: normalizing each low-texture-density subimage, where the normalization includes image-orientation normalization, image-size normalization, and in-image gradient-maximum normalization.
Taking image-orientation normalization in this embodiment as an example, the orientation of the low-texture-density subimage is obtained first: the image orientation points from the geometric center of the image to its gray-scale centroid, with counterclockwise taken as positive. The gray-scale centroid C of the image is first computed by the following existing method, where m denotes an image moment; the moment mpq is given by the formula below. The subscripts p and q are parameters giving the order of the moment and may be any integers. C is computed from m, and I(x, y) is the gray-scale value of the image at (x, y).
mpq = Σ_{(x,y)∈r} x^p y^q I(x, y)
The image orientation θ is then computed from these moments, and the region is rotated by θ to complete image-orientation normalization.
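The moment computation can be written out directly. The patent omits the explicit formula for θ, so the sketch below assumes the standard intensity-centroid convention, the angle from the geometric center to the gray-scale centroid, which matches the description above but is an assumption rather than the patent's stated formula.

```python
import math

def raw_moment(img, p, q):
    """m_pq = sum over pixels (x, y) of x^p * y^q * I(x, y)."""
    return sum(x ** p * y ** q * img[y][x]
               for y in range(len(img)) for x in range(len(img[0])))

def orientation(img):
    """Angle from the geometric center to the gray-scale centroid.
    Assumed form of the patent's theta; sign conventions depend on whether
    the y axis points up or down."""
    m00 = raw_moment(img, 0, 0)
    cx = raw_moment(img, 1, 0) / m00  # gray-scale centroid C = (cx, cy)
    cy = raw_moment(img, 0, 1) / m00
    gx = (len(img[0]) - 1) / 2.0      # geometric center of the subimage
    gy = (len(img) - 1) / 2.0
    return math.atan2(cy - gy, cx - gx)
```

With the mass concentrated to the right of the geometric center the angle is 0; directly below it (in array coordinates) the angle is π/2.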
On the basis of any of the above specific embodiments of the invention, an image matching method is provided in which obtaining the feature descriptor of each low-texture-density subimage comprises:
dividing each normalized low-texture-density subimage into m*m grandchild images, where m > 1;
taking the 4m-4 border grandchild images of each low-texture-density subimage and accumulating the gradient magnitude of each pixel in each of those grandchild images by direction into u principal directions, with u > 0, to obtain the gradient histograms of the 4m-4 grandchild images of each low-texture-density subimage;
concatenating, in counterclockwise or clockwise order, the gradient histograms of the 4m-4 grandchild images of each low-texture-density subimage to obtain the feature descriptor of each low-texture-density subimage.
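The border-cell descriptor above can be sketched as follows. The m*m split, the u direction bins, and the clockwise traversal of the 4m-4 border cells follow the text; representing each grandchild image as a list of (magnitude, angle) pixel gradients is an assumption made for illustration.

```python
import math

def cell_histogram(pixels, u):
    """Accumulate gradient magnitudes into u principal-direction bins."""
    hist = [0.0] * u
    for mag, ang in pixels:
        hist[int(ang % (2 * math.pi) / (2 * math.pi) * u) % u] += mag
    return hist

def border_cells_clockwise(m):
    """Indices (row, col) of the 4m-4 border cells of an m x m grid, clockwise."""
    top = [(0, c) for c in range(m)]
    right = [(r, m - 1) for r in range(1, m)]
    bottom = [(m - 1, c) for c in range(m - 2, -1, -1)]
    left = [(r, 0) for r in range(m - 2, 0, -1)]
    return top + right + bottom + left

def descriptor(cells, u=8):
    """Concatenate the u-bin gradient histograms of the border grandchild images.
    cells[r][c] is the list of (magnitude, angle) gradients of one cell."""
    desc = []
    for r, c in border_cells_clockwise(len(cells)):
        desc.extend(cell_histogram(cells[r][c], u))
    return desc
```

For an m x m grid the traversal visits exactly 4m-4 cells, so the descriptor has (4m-4)*u entries.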
On the basis of any of the above specific embodiments of the invention, an image matching method is provided in which obtaining, for each of G1 and G2, the edges shared by pairwise-adjacent low-texture-density subimages and the attributes of those edges comprises:
obtaining the pixels shared by each pair of adjacent low-texture-density subimages in G1 and G2, which serve as the edge between the two adjacent low-texture-density subimages;
obtaining the attributes of each edge, where the attributes of an edge include: the length of the line segment Q1Q2 formed by the image centers of the two adjacent low-texture-density subimages the edge corresponds to, the chain code of the edge between the two adjacent low-texture-density subimages, whether the chain code is closed, and the ratio of the chain-code length to the length of Q1Q2.
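A sketch of extracting a shared edge and its attributes. The 4-adjacency definition of "shared pixels", the 8-direction chain code, and the Euclidean centroid used as the image center are all assumptions; the patent does not fix these details.

```python
def shared_edge(region_a, region_b):
    """Pixels of region_a that are 4-adjacent to region_b (assumed definition)."""
    b = set(region_b)
    return [(y, x) for (y, x) in region_a
            if any(nb in b for nb in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)))]

# 8-direction chain-code symbols for steps between consecutive boundary pixels
DIRS = {(-1, 0): 0, (-1, 1): 1, (0, 1): 2, (1, 1): 3,
        (1, 0): 4, (1, -1): 5, (0, -1): 6, (-1, -1): 7}

def chain_code(path):
    """8-direction chain code of an ordered boundary path."""
    return [DIRS[(y2 - y1, x2 - x1)] for (y1, x1), (y2, x2) in zip(path, path[1:])]

def centroid(region):
    ys, xs = zip(*region)
    return (sum(ys) / len(ys), sum(xs) / len(xs))

def edge_attributes(region_a, region_b, path):
    """The patent's edge attributes: |Q1Q2|, chain code, closed flag, and the
    chain-length / |Q1Q2| ratio. path is the ordered shared boundary."""
    (y1, x1), (y2, x2) = centroid(region_a), centroid(region_b)
    q1q2 = ((y1 - y2) ** 2 + (x1 - x2) ** 2) ** 0.5
    code = chain_code(path)
    return {"Q1Q2": q1q2, "chain": code,
            "closed": path[0] == path[-1], "ratio": len(code) / q1q2}
```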
On the basis of any of the above specific embodiments of the invention, an image matching method is provided in which obtaining the undirected graph of G1 from the feature descriptors of the low-texture-density subimages in G1 and the edges and edge attributes in G1, and obtaining the undirected graph of G2 from the feature descriptors of the low-texture-density subimages in G2 and the edges and edge attributes in G2, comprises:
abstracting each low-texture-density subimage in G1 and G2 into a vertex, and abstracting the shared boundary between each pair of adjacent low-texture-density subimages into an edge; the vertices and edges constitute the undirected graphs of G1 and G2.
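The abstraction into an undirected graph can be sketched with plain dictionaries; the {(i, j): attrs} edge-map input is an assumed representation, not one prescribed by the patent.

```python
def build_graph(descriptors, edges):
    """Undirected graph of one image: vertex i carries its subimage descriptor;
    the adjacency map carries the edge attributes.
    descriptors: list of per-subimage feature descriptors;
    edges: {(i, j): attrs} for each pair of adjacent subimages."""
    adj = {i: {} for i in range(len(descriptors))}
    for (i, j), attrs in edges.items():
        adj[i][j] = attrs  # store both directions: the graph is undirected
        adj[j][i] = attrs
    return {"vertices": list(descriptors), "adj": adj}
```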
On the basis of any of the above specific embodiments of the invention, an image matching method is provided in which constructing the similarity matrix of G1 and G2 from the undirected graphs of G1 and G2 comprises:
the similarities between vertices constitute the point pairs in the similarity matrix, and the similarities between edge pairs constitute the edges between those point pairs.
In a specific embodiment of the invention, the similarity matrix is constructed as follows. The similarities between vertices constitute the point pairs in the similarity matrix; the similarity between region i in G1 and region i' in G2 is:
Sv(i, i') = |Ri - Ri'|
where Ri is the attribute of region i in G1, Ri' is the attribute of region i' in G2, and Sv(i, i') is the similarity between the two regions.
The similarities between edge pairs constitute the edges between point pairs in the similarity matrix. The similarity between two edges is the percentage of identical consecutive symbols in their chain codes:
Se(eij^1, ei'j'^2) = d(eij^1, ei'j'^2) / max(lij^1, li'j'^2)
where Se(eij^1, ei'j'^2) is the similarity between the two edges, eij^1 is the chain code of the edge between regions i and j in G1, ei'j'^2 is the chain code of the edge between regions i' and j' in G2, d(eij^1, ei'j'^2) is the number of identical digits in the two chain codes, and max(lij^1, li'j'^2) is the greater of the two chain-code lengths.
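The two similarity formulas above translate directly into code. Counting "identical digits" position-wise over the common prefix of the two chain codes is an assumption, since the patent does not define d(·,·) precisely.

```python
def vertex_similarity(r_i, r_ip):
    """S_v(i, i') = |R_i - R_i'|: absolute difference of region attributes
    (smaller means more alike)."""
    return abs(r_i - r_ip)

def edge_similarity(code1, code2):
    """S_e = d / max(l1, l2): position-wise identical chain-code digits
    divided by the longer chain length (assumed reading of d)."""
    d = sum(1 for a, b in zip(code1, code2) if a == b)
    return d / max(len(code1), len(code2))
```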
On the basis of any of the above specific embodiments of the invention, an image matching method is provided in which searching the similarity matrix of G1 and G2 for the path of maximum similarity and determining the matching relationship between G1 and G2 from that path comprises: finding a corresponding number of matching edge pairs between G1 and G2 so that the similarity in the similarity matrix of G1 and G2 is maximized.
In the specific embodiments of the invention, similar regions should also have similar edges. So-called graph matching is therefore, in reality, finding in the matrix enough edge pairs connecting point pairs that the similarity is maximized.
On the basis of any of the above specific embodiments of the invention, an image matching method is provided in which searching the similarity matrix of G1 and G2 for the path of maximum similarity and determining the matching relationship between G1 and G2 from that path further comprises: based on a dynamic programming algorithm, searching the similarity matrix of G1 and G2 for the path of maximum similarity, and determining the matching relationship between G1 and G2 from the path of maximum similarity.
The maximization search is simplified by a dynamic programming method:
S41: For the features of all regions of graph G1, compute the feature-space centroid Gf; compute the distance from each region's attributes to the centroid Gf and generate the histogram disHist. The histogram over all images is computed once to examine the discriminability of the features.
Gf = mean(Rf)
dR = |Rf - Gf|
Sort all regions of G1 in descending order of their distance to the centroid to generate queue Q1, and take the first k1/2 elements of Q1.
S42: For the features of all regions of graph G2, compute the feature-space centroid Gf'; compute the distance from each region's attributes to the centroid Gf' and generate the histogram disHist. The histogram over all images is computed once to examine the discriminability of the features.
Gf' = mean(Rf')
dR' = |Rf' - Gf'|
Sort all regions of G2 in descending order of their distance to the centroid to generate queue Q2, and take the first k1/2 elements of Q2.
S43: Among the k1 elements in total (the first k1/2 elements of Q1 plus the first k1/2 elements of Q2), find the point pair (ai, bi') of maximum similarity, whose similarity Sv(ai, bi') is:
Sv(ai, bi') = max(Sv(X, Y)), X = {Q1(1), Q1(2), ..., Q1(k1/2)}, Y = {Q2(1), Q2(2), ..., Q2(k1/2)}.
S44: Put the match (ai, bi') into Y. Put into queue Q2 all point pairs (aj, bj'), (ak, bk'), ... connected to it by an edge in the similarity matrix.
S45: For each element of Q2, compute the similarity Sv(aj, bj') of the current point pair, and discard point pairs whose similarity is below the threshold δS.
Compute the path similarity of each element of Q2:
Sp is the similarity of this path, the weighted accumulation of the similarities of the point pairs already placed in the matching queue Y before this element. Sv(ai, bi') is the similarity of the previous point pair, Sv(aj, bj') is the similarity of the current point pair, and Se is the similarity of the edge pair between the previous point pair and the current point pair.
Sort the elements of Q2 by similarity, select the k1 matches of maximum similarity and put them into Y, and also place into queue Q2 the point pairs connected to them by edge pairs.
S46: Repeat S45 until Q2 has no elements; select the path of maximum path similarity from all paths and put its elements into Y.
S47: Repeat S42, S43, S44, S45, and S46 until all elements of Q1 have been taken.
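A compressed sketch of the S41-S47 search: seed with the most similar region pair, then greedily expand along edge-connected pairs, accumulating a path score in which each step's vertex-pair score is weighted by the connecting edge-pair score. This collapses the queue bookkeeping of S44-S47 into one loop and treats similarities as bigger-is-better scores supplied by the caller; both are simplifying assumptions, not the patent's exact procedure.

```python
def match_graphs(sv, se, edges1, edges2, delta=0.0):
    """sv[(a, b)]: score of pairing region a of G1 with region b of G2;
    se[((a, b), (c, d))]: score of the edge pair connecting two point pairs;
    edges1/edges2: adjacency sets of G1/G2; delta: acceptance threshold.
    Returns (matched point pairs, accumulated path similarity S_p)."""
    seed = max(sv, key=sv.get)          # S43: best point pair among candidates
    matched, sp = [seed], sv[seed]
    prev = seed
    frontier = [(a2, b2) for (a2, b2) in sv
                if a2 in edges1.get(prev[0], ()) and b2 in edges2.get(prev[1], ())]
    while frontier:
        # S45: score each candidate, weighting S_v by the edge-pair S_e
        best, best_s = None, delta
        for cand in frontier:
            s = sv[cand] * se.get((prev, cand), 0.0)
            if s > best_s:
                best, best_s = cand, s
        if best is None:
            break
        matched.append(best)            # S44: commit the match
        sp += best_s
        prev = best
        frontier = [(a2, b2) for (a2, b2) in sv
                    if (a2, b2) not in matched
                    and a2 in edges1.get(prev[0], ())
                    and b2 in edges2.get(prev[1], ())]
    return matched, sp
```

On two three-region chain graphs with matching edges, the sketch walks the chain in order and sums the weighted scores.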
As shown in Fig. 2, the overall framework of an image matching system according to an embodiment of the present invention comprises:
a first obtaining module A1, configured to obtain each low-texture-density subimage in the images to be matched G1 and G2, where each low-texture-density subimage is a region whose texture density is lower than n times the average texture density of the G1 or G2 to which it belongs, with n >= 0;
a second obtaining module A2, configured to obtain the feature descriptor of each low-texture-density subimage, and to obtain, for each of G1 and G2, the edges shared by pairwise-adjacent low-texture-density subimages and the attributes of those edges;
a third obtaining module A3, configured to obtain the undirected graph of G1 from the feature descriptors of the low-texture-density subimages in G1 and the edges and edge attributes in G1, to obtain the undirected graph of G2 likewise, and to construct the similarity matrix of G1 and G2 from the two undirected graphs;
a computing module A4, configured to search the similarity matrix of G1 and G2 for the path of maximum similarity and to determine the matching relationship between G1 and G2 from the path of maximum similarity.
Specifically, the boundary lines between regions are found by the prior-art watershed algorithm to obtain each low-texture-density subimage, and each low-texture-density subimage region contains at least n pixels.
On the basis of any of the above specific embodiments of the invention, an image matching system is provided in which the first obtaining module A1 is further configured to: normalize each low-texture-density subimage, where the normalization includes image-orientation normalization, image-size normalization, and in-image gradient-maximum normalization.
Taking image-orientation normalization in this embodiment as an example, the orientation of the low-texture-density subimage is obtained first: the image orientation points from the geometric center of the image to its gray-scale centroid, with counterclockwise taken as positive. The gray-scale centroid C of the image is first computed by the following existing method, where m denotes an image moment; the moment mpq is given by the formula below. The subscripts p and q are parameters giving the order of the moment and may be any integers. C is computed from m, and I(x, y) is the gray-scale value of the image at (x, y).
mpq = Σ_{(x,y)∈r} x^p y^q I(x, y)
The image orientation θ is then computed from these moments, and the region is rotated by θ to complete image-orientation normalization.
On the basis of any of the above specific embodiments of the invention, an image matching system is provided in which the second obtaining module A2 is further configured to:
divide each normalized low-texture-density subimage into m*m grandchild images, where m > 1;
take the 4m-4 border grandchild images of each low-texture-density subimage and accumulate the gradient magnitude of each pixel in each of those grandchild images by direction into u principal directions, with u > 0, to obtain the gradient histograms of the 4m-4 grandchild images of each low-texture-density subimage;
concatenate, in counterclockwise or clockwise order, the gradient histograms of the 4m-4 grandchild images of each low-texture-density subimage to obtain the feature descriptor of each low-texture-density subimage.
On the basis of any of the above specific embodiments of the invention, an image matching system is provided in which the second obtaining module A2 is further configured to:
obtain the pixels shared by each pair of adjacent low-texture-density subimages in G1 and G2, which serve as the edge between the two adjacent low-texture-density subimages;
obtain the attributes of each edge, where the attributes of an edge include: the length of the line segment Q1Q2 formed by the image centers of the two adjacent low-texture-density subimages the edge corresponds to, the chain code of the edge between the two adjacent low-texture-density subimages, whether the chain code is closed, and the ratio of the chain-code length to the length of Q1Q2.
On the basis of any of the above specific embodiments of the invention, an image matching system is provided in which the third obtaining module A3 is further configured to:
abstract each low-texture-density subimage in G1 and G2 into a vertex and the shared boundary between each pair of adjacent low-texture-density subimages into an edge; the vertices and edges constitute the undirected graphs of G1 and G2.
On the basis of any of the above specific embodiments of the invention, an image matching system is provided in which the third obtaining module A3 is further configured to:
construct the similarity matrix so that the similarities between vertices constitute the point pairs in the similarity matrix and the similarities between edge pairs constitute the edges between those point pairs.
In a specific embodiment of the invention, the similarity between region i in G1 and region i' in G2 is:
Sv(i, i') = |Ri - Ri'|
where Ri is the attribute of region i in G1, Ri' is the attribute of region i' in G2, and Sv(i, i') is the similarity between the two regions.
The similarities between edge pairs constitute the edges between point pairs in the similarity matrix. The similarity between two edges is the percentage of identical consecutive symbols in their chain codes:
Se(eij^1, ei'j'^2) = d(eij^1, ei'j'^2) / max(lij^1, li'j'^2)
where Se(eij^1, ei'j'^2) is the similarity between the two edges, eij^1 is the chain code of the edge between regions i and j in G1, ei'j'^2 is the chain code of the edge between regions i' and j' in G2, d(eij^1, ei'j'^2) is the number of identical digits in the two chain codes, and max(lij^1, li'j'^2) is the greater of the two chain-code lengths.
On the basis of any of the above specific embodiments of the invention, an image matching system is provided in which the computing module A4 is further configured to:
find a corresponding number of matching edge pairs between G1 and G2 so that the similarity in the similarity matrix of G1 and G2 is maximized.
In the specific embodiments of the invention, similar regions should also have similar edges. So-called graph matching is therefore, in reality, finding in the matrix enough edge pairs connecting point pairs that the similarity is maximized.
On the basis of any of the above specific embodiments of the invention, an image matching system is provided in which the computing module A4 is further configured to:
based on a dynamic programming algorithm, search the similarity matrix of G1 and G2 for the path of maximum similarity, and determine the matching relationship between G1 and G2 from the path of maximum similarity.
Simplify maximized searching process based on dynamic programming method:
S41 seeks image feature space mass center G to the feature of all areas of figure G1f;Calculate the attribute in each region To mass center GfDistance.And histogram disHist is generated, the histogram of primary all images is counted, looks at the discrimination of feature.
Gf=mean (Rf)
dR=| Rf-Gf|
To all areas of figure G1, apart from descending sort, queue Q1 is generated by mass center.Take k1/2 member before queue Q1 Element.
S42 seeks image feature space mass center G to the feature of all areas of figure G2f';Calculate the attribute in each region To mass center Gf' distance.And histogram disHist is generated, the histogram of primary all images is counted, looks at the differentiation of feature Degree.
Gf=mean (Rf)
dR=| Rf-Gf|
To all areas of figure G2, apart from descending sort, queue Q2 is generated by mass center.Take k1/2 member before queue Q2 Element.
The total k1 element of k1/2 element before k1/2 element, Q2 before S43, Q1 finds the maximum point of similitude to (ai, bi’), similitude Sv(ai,bi’) value are as follows:
Sv(ai,bi’)=max (Sv(X, Y)) X={ Q1 (1) Q1 (2) ... Q1 (k1/2) }, Y={ Q2 (1)
Q2(2)…Q2(k1/2)}。
S44, by matching (ai,bi’) be put into Y.There is side to connected all-pair (a in similarity matrixj,bj’), (ak,bk’) ... it is put into queue Q2.
S45 calculates the similitude S of current point pair to each element of Q2v(aj,bj’).Give up and is less than threshold value δSPoint pair.
Calculate the road similitude of each element of Q2:
SpIt is the similitude in this path, the similitude for the point pair being added in matching queue Y is had determined before being this element Weighted accumulation.Sv(ai,bi’) be former point pair similitude, Sv(aj,bj’) be current point pair similitude,Before being A little to the similitude on side pair between current point pair.
It sorts to the element of Q2 by similitude, therefrom the maximum k1 matching of similitude is selected to be put into Y, and will have with it While to the point of connection to being also placed in queue Q2.
S46: Repeat S45 until Q2 has no elements; select the path with the greatest path similarity from all paths and put its elements into Y.
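The expansion loop of S44–S46 — seed a match, push edge-connected candidate pairs, drop those below δS, and accumulate a path score — can be sketched with a priority queue. The callables `neighbors`, `pair_sim` and `edge_sim` are assumed interfaces standing in for the similarity matrix; the exact accumulation rule is, as above, an assumption:

```python
import heapq

def grow_matching(seed, neighbors, pair_sim, edge_sim, delta_s, k1):
    """Hedged sketch of S44-S46: greedily grow a matching from a seed pair.

    seed:      starting point pair (ai, bi').
    neighbors: callable, yields the pairs edge-connected to a given pair.
    pair_sim:  callable, similarity Sv of a point pair.
    edge_sim:  callable, similarity of the edge pair linking two point pairs.
    delta_s:   threshold below which point pairs are discarded.
    k1:        number of best matches accepted per expansion round.
    """
    Y = [seed]
    seen = {seed}
    # max-heap on pair similarity (negated, since heapq is a min-heap)
    frontier = [(-pair_sim(p), p) for p in neighbors(seed)]
    heapq.heapify(frontier)
    path_score = pair_sim(seed)
    while frontier:
        picked = 0
        while frontier and picked < k1:
            neg_sim, p = heapq.heappop(frontier)
            if p in seen or -neg_sim < delta_s:
                continue                       # discard weak or repeated pairs
            seen.add(p)
            path_score += (-neg_sim) * edge_sim(Y[-1], p)
            Y.append(p)
            picked += 1
            for q in neighbors(p):             # enqueue edge-connected pairs
                if q not in seen:
                    heapq.heappush(frontier, (-pair_sim(q), q))
    return Y, path_score
```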
S47: Repeat S42, S43, S44, S45 and S46 until all elements in Q1 have been taken.
On the basis of any of the above specific embodiments of the invention, a device for image matching is provided, comprising: at least one processor; and at least one memory communicatively connected to the processor. Fig. 3 is a structural block diagram of the electronic device provided by an embodiment of the present invention, comprising a processor (processor) 310, a memory (memory) 320 and a bus 330, where the processor 310 and the memory 320 communicate with each other over the bus 330. The processor 310 can call logic instructions in the memory 320 to execute the following method: obtaining each low-texture-density sub-image in the images to be matched G1 and G2, wherein each low-texture-density sub-image is a region whose texture density is lower than n times the average texture density of the G1 or G2 to which it belongs, wherein n ≥ 0; obtaining the feature descriptor of each low-texture-density sub-image; obtaining, in G1 and G2 respectively, the edges shared by each pair of adjacent low-texture-density sub-images and the attributes of those edges; obtaining the undirected graph of G1 based on the feature descriptors of the low-texture-density sub-images in G1 and on the edges and edge attributes in G1; obtaining the undirected graph of G2 based on the feature descriptors of the low-texture-density sub-images in G2 and on the edges and edge attributes in G2; constructing the similarity matrix of G1 and G2 based on the undirected graphs of G1 and G2; searching the similarity matrix of G1 and G2 for the path of maximum similarity, and determining the matching relationship between G1 and G2 from the path of maximum similarity.
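The first step of the method recapped above — extracting low-texture-density sub-images — can be sketched as follows. This is a minimal sketch under assumptions: the text says only "texture density", so mean gradient magnitude per tile stands in for it here, and the tile size is arbitrary:

```python
import numpy as np

def low_texture_mask(img, n=0.5, tile=8):
    """Hedged sketch of low-texture-density region extraction: measure
    texture density as the mean gradient magnitude per tile, and mark
    tiles whose density is below n times the image's average density.
    Tile-based density and gradient magnitude as the texture measure
    are assumptions beyond what the text states."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    h, w = img.shape
    th, tw = h // tile, w // tile
    density = mag[: th * tile, : tw * tile].reshape(th, tile, tw, tile).mean(axis=(1, 3))
    return density < n * density.mean()   # True where texture is "low"
```

Connected runs of True tiles would then be grouped into the sub-images that become the vertices of the undirected graph.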
An embodiment of the present invention discloses a computer program product. The computer program product comprises a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to carry out the methods provided by each of the above method embodiments, for example: obtaining each low-texture-density sub-image in the images to be matched G1 and G2, wherein each low-texture-density sub-image is a region whose texture density is lower than n times the average texture density of the G1 or G2 to which it belongs, wherein n ≥ 0; obtaining the feature descriptor of each low-texture-density sub-image; obtaining, in G1 and G2 respectively, the edges shared by each pair of adjacent low-texture-density sub-images and the attributes of those edges; obtaining the undirected graph of G1 based on the feature descriptors of the low-texture-density sub-images in G1 and on the edges and edge attributes in G1; obtaining the undirected graph of G2 based on the feature descriptors of the low-texture-density sub-images in G2 and on the edges and edge attributes in G2; constructing the similarity matrix of G1 and G2 based on the undirected graphs of G1 and G2; searching the similarity matrix of G1 and G2 for the path of maximum similarity, and determining the matching relationship between G1 and G2 from the path of maximum similarity.
An embodiment of the present invention provides a non-transitory computer-readable storage medium storing computer instructions that cause a computer to execute the methods provided by each of the above method embodiments, for example: obtaining each low-texture-density sub-image in the images to be matched G1 and G2, wherein each low-texture-density sub-image is a region whose texture density is lower than n times the average texture density of the G1 or G2 to which it belongs, wherein n ≥ 0; obtaining the feature descriptor of each low-texture-density sub-image; obtaining, in G1 and G2 respectively, the edges shared by each pair of adjacent low-texture-density sub-images and the attributes of those edges; obtaining the undirected graph of G1 based on the feature descriptors of the low-texture-density sub-images in G1 and on the edges and edge attributes in G1; obtaining the undirected graph of G2 based on the feature descriptors of the low-texture-density sub-images in G2 and on the edges and edge attributes in G2; constructing the similarity matrix of G1 and G2 based on the undirected graphs of G1 and G2; searching the similarity matrix of G1 and G2 for the path of maximum similarity, and determining the matching relationship between G1 and G2 from the path of maximum similarity.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments can be completed by hardware under the control of program instructions. The aforementioned program can be stored in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments. The aforementioned storage medium includes various media that can store program code, such as ROM, RAM, magnetic disk or optical disc.
Through the above description of the embodiments, those skilled in the art can clearly understand that each embodiment can be realized by means of software plus a necessary general-purpose hardware platform, or of course by hardware. Based on this understanding, the above technical solution, or rather the part of it that contributes over the prior art, can be embodied in the form of a software product. The computer software product may be stored in a computer-readable storage medium, such as ROM/RAM, magnetic disk or optical disc, and includes a number of instructions for making a computer device (which may be a personal computer, a server, a network device, etc.) execute the methods of the embodiments, or of certain parts of the embodiments.
The invention has the following beneficial effects: a principal direction, giving rotation invariance; gradient normalization, giving invariance to illumination change; scaling, giving invariance to scale change; invariance to affine transformation; and conversion of graph matching into a dynamic-programming problem. It solves not only graph matching but also one-to-one, one-to-many and many-to-one matching. The image matching method of the invention can be applied in the following scenarios: matching between images, similarity measurement between images, closed-loop detection and SLAM.
Finally, the methods and apparatus described in detail in this specification are only preferred embodiments and are not intended to limit the protection scope of the embodiments of the invention. Any modification, equivalent replacement or improvement made within the spirit and principles of the embodiments of the present invention shall be included within the protection scope of the embodiments of the present invention.

Claims (8)

1. An image matching method, characterized by comprising:
obtaining each low-texture-density sub-image in the images to be matched G1 and G2, wherein each low-texture-density sub-image is a region whose texture density is lower than n times the average texture density of the G1 or G2 to which it belongs, wherein n ≥ 0;
obtaining the feature descriptor of each low-texture-density sub-image; obtaining, in G1 and G2 respectively, the edges shared by each pair of adjacent low-texture-density sub-images and the attributes of those edges;
obtaining the undirected graph of G1 based on the feature descriptors of the low-texture-density sub-images in G1 and on the edges and edge attributes in G1; obtaining the undirected graph of G2 based on the feature descriptors of the low-texture-density sub-images in G2 and on the edges and edge attributes in G2; constructing the similarity matrix of G1 and G2 based on the undirected graphs of G1 and G2;
searching the similarity matrix of G1 and G2 for the path of maximum similarity, and determining the matching relationship between G1 and G2 from the path of maximum similarity.
2. The method according to claim 1, characterized in that, after obtaining each low-texture-density sub-image in the images to be matched G1 and G2, the method further comprises:
normalizing each low-texture-density sub-image, wherein the normalization includes image-direction normalization, image-size normalization and in-image gradient-maximum normalization.
3. The method according to claim 2, characterized in that obtaining the feature descriptor of each low-texture-density sub-image comprises:
dividing each normalized low-texture-density sub-image into m*m grandchild images, wherein m > 1;
taking the 4m-4 grandchild images of each low-texture-density sub-image, and accumulating the gradient magnitude of each pixel in each of the 4m-4 grandchild images by direction into u principal directions, u > 0, to obtain the gradient histograms of the 4m-4 grandchild images in each low-texture-density sub-image;
splicing the gradient histograms of the 4m-4 grandchild images of each low-texture-density sub-image in counterclockwise or clockwise order, to obtain the feature descriptor of each low-texture-density sub-image.
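The descriptor of claim 3 can be sketched as follows; note that 4m-4 is exactly the number of cells on the border of an m*m grid. The clockwise traversal order, the histogram binning and the magnitude weighting are chosen here for concreteness and are assumptions beyond what the claim states:

```python
import numpy as np

def border_descriptor(img, m=4, u=8):
    """Sketch of claim 3: split a normalised sub-image into m*m cells, keep
    the 4m-4 border cells, build a u-bin gradient-orientation histogram
    (weighted by gradient magnitude) per cell, and concatenate them in a
    fixed clockwise border order."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), 2 * np.pi)
    h, w = img.shape
    ch, cw = h // m, w // m
    # border cells in clockwise order: top row, right column, bottom row, left column
    cells = [(0, j) for j in range(m)]
    cells += [(i, m - 1) for i in range(1, m)]
    cells += [(m - 1, j) for j in range(m - 2, -1, -1)]
    cells += [(i, 0) for i in range(m - 2, 0, -1)]
    assert len(cells) == 4 * m - 4
    desc = []
    for i, j in cells:
        sl = (slice(i * ch, (i + 1) * ch), slice(j * cw, (j + 1) * cw))
        hist, _ = np.histogram(ang[sl], bins=u, range=(0, 2 * np.pi),
                               weights=mag[sl])
        desc.append(hist)
    return np.concatenate(desc)  # length (4m-4)*u
```

Using only the border cells keeps the descriptor short while still capturing the boundary structure of the low-texture region.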
4. The method according to claim 1, characterized in that obtaining, in G1 and G2 respectively, the edges shared by each pair of adjacent low-texture-density sub-images and the attributes of those edges comprises:
obtaining the pixels shared by each pair of adjacent low-texture-density sub-images in G1 and G2, these being the edges of the pairs of adjacent low-texture-density sub-images in G1 and G2;
obtaining the attributes of each edge, wherein the attributes of an edge include: the length of the line segment Q1Q2 formed by the centers of the two adjacent low-texture-density sub-images corresponding to the edge, the code chain of the two adjacent low-texture-density sub-images corresponding to the edge, whether that code chain is closed, and the ratio of the length of the code chain to the length of Q1Q2.
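The edge attributes of claim 4 can be sketched as follows, assuming the shared boundary pixels arrive as an ordered list and using the 8-connected Freeman chain code as the "code chain" (an assumption; the claim does not name a specific chain code):

```python
import numpy as np

# 8-connected Freeman chain-code directions, keyed by (drow, dcol); an assumption.
DIRS = {(0, 1): 0, (-1, 1): 1, (-1, 0): 2, (-1, -1): 3,
        (0, -1): 4, (1, -1): 5, (1, 0): 6, (1, 1): 7}

def edge_attributes(shared_pixels, center_a, center_b):
    """Sketch of claim 4's edge attributes for two adjacent sub-images:
    the length of segment Q1Q2 between their centres, the chain code of
    the shared boundary, whether the chain closes, and the ratio of chain
    length to |Q1Q2|.  shared_pixels is assumed to be an ordered list of
    (row, col) boundary pixels."""
    q1q2 = float(np.hypot(center_a[0] - center_b[0], center_a[1] - center_b[1]))
    chain = [DIRS[(p2[0] - p1[0], p2[1] - p1[1])]
             for p1, p2 in zip(shared_pixels, shared_pixels[1:])]
    closed = shared_pixels[0] == shared_pixels[-1]
    return {"Q1Q2": q1q2, "chain": chain, "closed": closed,
            "ratio": len(chain) / q1q2 if q1q2 else float("inf")}
```

The length ratio is a simple shape cue: a shared boundary much longer than the straight segment Q1Q2 indicates a winding border between the two regions.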
5. The method according to claim 1, characterized in that obtaining the undirected graph of G1 based on the feature descriptors of the low-texture-density sub-images in G1 and on the edges and edge attributes in G1, and obtaining the undirected graph of G2 based on the feature descriptors of the low-texture-density sub-images in G2 and on the edges and edge attributes in G2, comprises:
abstracting each low-texture-density sub-image in G1 and G2 into a vertex, and abstracting the edges between the low-texture-density sub-images into edges; the vertices and edges constitute the undirected graphs of G1 and G2.
6. The method according to claim 5, characterized in that constructing the similarity matrix of G1 and G2 based on the undirected graphs of G1 and G2 comprises:
the similarities between vertices constitute the point pairs in the similarity matrix; the similarities between edge pairs constitute the edges between the point pairs in the similarity matrix.
7. The method according to claim 1, characterized in that searching the similarity matrix of G1 and G2 for the path of maximum similarity and determining the matching relationship between G1 and G2 from the path of maximum similarity further comprises:
based on a dynamic programming algorithm, searching the similarity matrix of G1 and G2 for the path of maximum similarity, and determining the matching relationship between G1 and G2 from the path of maximum similarity.
8. A device for image matching, characterized by comprising:
at least one processor; and
at least one memory communicatively connected to the processor, wherein:
the memory stores program instructions executable by the processor, and the processor calls the program instructions to execute the method according to any one of claims 1 to 7.
CN201810970690.1A 2018-08-27 2018-08-27 Image matching method and device Active CN109272541B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810970690.1A CN109272541B (en) 2018-08-27 2018-08-27 Image matching method and device


Publications (2)

Publication Number Publication Date
CN109272541A true CN109272541A (en) 2019-01-25
CN109272541B CN109272541B (en) 2023-10-24

Family

ID=65154594




Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102289681A (en) * 2011-08-05 2011-12-21 上海邮政科学研究院 Method for matching envelope images
CN105654421A (en) * 2015-12-21 2016-06-08 西安电子科技大学 Projection transform image matching method based on transform invariant low-rank texture
CN105976364A (en) * 2016-04-28 2016-09-28 北京理工大学 Simplified weighted-undirected graph-based statistical averaging model construction method
EP3333770A1 (en) * 2016-12-09 2018-06-13 Fujitsu Limited Matching graph entities in graph data


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Wan Hualin et al., "Image similarity measurement using bipartite graph matching", Journal of Computer-Aided Design & Computer Graphics *
Li Hao, "Research on graph similarity description and matching methods and their applications", China Masters' Theses Full-text Database, Basic Sciences *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113822916A (en) * 2021-08-17 2021-12-21 北京大学 Image matching method, device, equipment and readable storage medium
CN113822916B (en) * 2021-08-17 2023-09-15 北京大学 Image matching method, device, equipment and readable storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240126

Address after: 230000 floor 1, building 2, phase I, e-commerce Park, Jinggang Road, Shushan Economic Development Zone, Hefei City, Anhui Province

Patentee after: Dragon totem Technology (Hefei) Co.,Ltd.

Country or region after: China

Address before: 325000 building C1, marine science and Technology Pioneer Park, Longwan District, Wenzhou City, Zhejiang Province

Patentee before: INSTITUTE OF LASER AND OPTOELECTRONICS INTELLIGENT MANUFACTURING, WENZHOU University

Country or region before: China
