CN101197045A - Image stereo matching method and device thereof - Google Patents

Image stereo matching method and device thereof

Info

Publication number
CN101197045A
Authority
CN
China
Prior art keywords
image
point
feature
feature point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2007100508536A
Other languages
Chinese (zh)
Other versions
CN100550054C (en)
Inventor
陈雷霆
刘启和
张建中
蔡洪斌
房春兰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CNB2007100508536A priority Critical patent/CN100550054C/en
Publication of CN101197045A publication Critical patent/CN101197045A/en
Application granted granted Critical
Publication of CN100550054C publication Critical patent/CN100550054C/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to an image stereo matching method and a device thereof, characterized by the following steps: a figure recognizer extracts the feature points of a detected image (image one) and a candidate matching image (image two); a multi-dimensional vector recognizer describes the feature points of the feature point lists with multiple parameters, yielding a multi-dimensional vector description for each feature point; a data optimizer reduces the multi-dimensional vector of each feature point of image one's significant feature point list, then computes the similarity between each feature point of image one's reduced list and the feature points of image two's list to obtain the candidate matching point set of image one; finally, a classifier classifies the comparison results to judge whether image one and image two match. The invention achieves higher matching accuracy and efficiency for stereo image matching.

Description

Image stereo matching method and device thereof
Technical field
The present invention relates to an image stereo matching method and a device thereof.
Background technology
Stereo vision is the most important distance-perception technique among computer ranging methods. By simulating the way human vision processes a scene, it can flexibly measure the stereo information of a scene under a variety of conditions, a capability that other computer vision methods cannot replace. The most critical part of stereo vision is the corresponding-point matching problem between several visual images, i.e. stereo vision matching, or stereo matching for short. In particular, binocular stereo matching is the process of establishing correspondences between the matching points of two images, and it is the key to a binocular stereo vision system. In fact, any computer-vision-based 3D reconstruction system contains a matching algorithm at its core. Phase-based matching algorithms can avoid computing the fundamental matrix, but their efficiency is low. Region-based stereo matching algorithms have difficulty choosing the window size, their search range is large, and their efficiency is also low. Existing feature-based stereo matching algorithms are efficient, but their matching precision still needs improvement: existing feature point extraction methods are not robust, and some distinctive points of one image may have no corresponding feature point in the other image. Matching precision is therefore low and the resulting matched point pairs are sparse, so further processing is needed before 3D reconstruction; for example, some methods use interpolation over the stereo matched point pairs to alleviate the sparsity. Although interpolation can mitigate the sparsity of stereo matched points to some degree, the new matched point pairs it produces are built on the interpolated data, and errors in that data cause many mismatched pairs.
Summary of the invention
The object of the invention is to provide an image stereo matching method with high matching precision and efficiency.
A further object of the invention is to provide an image stereo matching device with high matching precision and efficiency.
The objects of the invention are achieved by implementing the following technical scheme:
An image stereo matching method, characterized by comprising the following steps:
Step 1: extract the feature points of the detected image (image one) and the candidate matching image (image two), forming the feature point lists of image one and image two respectively; sort the feature points of image one's list in descending order by feature point value, and form the top-ranked feature points into a significant feature point list;
Step 2: describe the feature points of the above lists with multiple parameters; compute the feature parameter descriptors of image one and image two to obtain a multi-dimensional vector for each feature point's parametric description, each dimension representing one feature of the point;
Step 3: reduce the multi-dimensional vector of each feature point of image one's significant feature point list as follows:
Step 31: compute the similarity between each feature point of image one's significant list and the feature points of image two;
Step 32: using a specified threshold, obtain candidate matching point set 1 in image two for each feature point of image one's significant list;
Step 33: remove an arbitrary feature from the multi-dimensional vector of a feature point of image one's significant list; repeat steps 31 and 32, computing the similarity between the reduced feature points and the feature point list of image two; with a specified threshold, obtain candidate matching point set 2 in image two for each feature point of image one's significant list;
Step 34: compare candidate matching point sets 1 and 2; if they differ greatly, the removed feature is significant and cannot be removed; if the difference is small, remove the feature;
Step 35: repeat steps 33 and 34 until no further feature of any point of image one's significant feature point list can be reduced;
Step 4: repeat steps 31 and 32, computing the similarity between the feature points of image one's list (with reduced multi-dimensional vectors) and the feature points of image two's list; with a specified threshold, obtain the candidate matching point set in image two for each feature point of image one's list;
Step 5: judge from the above comparison results whether the detected image one and the candidate matching image two match.
The multi-parameter description of the feature points in step 2 comprises at least gray-level information, multi-level gradient information and a curvature change function.
The reduction of the multi-dimensional vectors of the parametric descriptions of the feature points of image one's significant list in step 3 uses a rough set reduction method, yielding the reduced multi-dimensional vectors of the feature point parametric descriptions.
The feature points of image one and image two described in step 1 are extracted with the following steps:
(4-1) Compute the average gradient square matrix of each pixel in the image:

N(x, y) = \begin{pmatrix} (\partial I/\partial x)^2 & (\partial I/\partial x)(\partial I/\partial y) \\ (\partial I/\partial x)(\partial I/\partial y) & (\partial I/\partial y)^2 \end{pmatrix},

where I(x, y) is the gray value at position (x, y) in the image. When both eigenvalues of the average gradient square matrix at a point are large, the gray level varies strongly near that point, so the point is chosen as a feature point. The feature point response function is

R = det(N) - k (trace(N))^2,

where det(N) is the determinant of the matrix, trace(N) is the trace of the matrix N, and k is 0.04.
Sort the pixels of the image in descending order of their R values to form a sequence; choose the required number of feature points F and take the first F pixels of the sequence as feature points; F can be adjusted continually according to the matching results.
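Step (4-1) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the text does not name a summation window for N, but without one det(N) is identically zero per pixel, so a 3 × 3 window sum is assumed here; the helper names are illustrative.

```python
import numpy as np

def _window_sum(a, r=1):
    """Sum each pixel's (2r+1) x (2r+1) neighbourhood (edge-padded)."""
    p = np.pad(a, r, mode="edge")
    out = np.zeros_like(a)
    h, w = a.shape
    for di in range(2 * r + 1):
        for dj in range(2 * r + 1):
            out += p[di:di + h, dj:dj + w]
    return out

def harris_response(img, k=0.04):
    """Per-pixel response R = det(N) - k * trace(N)^2, with N built
    from the image gradients summed over an assumed 3x3 window."""
    img = img.astype(float)
    Iy, Ix = np.gradient(img)            # dI/dy, dI/dx
    Sxx = _window_sum(Ix * Ix)
    Syy = _window_sum(Iy * Iy)
    Sxy = _window_sum(Ix * Iy)
    det = Sxx * Syy - Sxy ** 2           # det(N)
    trace = Sxx + Syy                    # trace(N)
    return det - k * trace ** 2

def top_f_points(img, F):
    """Sort pixels by R in descending order and take the first F as
    feature points; F can be tuned from the matching results."""
    R = harris_response(img)
    order = np.argsort(R, axis=None)[::-1][:F]
    return [tuple(np.unravel_index(i, R.shape)) for i in order]
```

For a step image with a single corner, the response is positive near the corner and non-positive along straight edges, which is what the eigenvalue criterion describes.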
(4-2) For each pixel (i, j) in the image, compute the sums of squared differences with the neighboring pixel values along the horizontal, vertical, left-diagonal and right-diagonal directions, and take the minimum as the initial value of the point's feature:

H = (f(i,j) - f(i,j-1))^2 + (f(i,j) - f(i,j+1))^2
V = (f(i,j) - f(i-1,j))^2 + (f(i,j) - f(i+1,j))^2
L = (f(i,j) - f(i-1,j-1))^2 + (f(i,j) - f(i+1,j+1))^2
R = (f(i,j) - f(i-1,j+1))^2 + (f(i,j) - f(i+1,j-1))^2

Let F(i, j) = min{H, V, L, R}. Divide the image into non-overlapping regions W_{p,q} of size n × m, and compute the coordinates

(x, y) = \arg\max_{(i,j) \in W_{p,q}} F(i, j),

and define the feature image value at (x, y) as

F(x, y) = \max_{(i,j) \in W_{p,q}} F(i, j).

To further suppress noise, specify a threshold d and redefine the feature image value as

G(x, y) = \begin{cases} F(x, y), & F(x, y) > d \\ 0, & F(x, y) \le d \end{cases}

If G(x, y) > 0, then (x, y) is a corresponding feature point.
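Step (4-2) can be sketched as below — a minimal illustration of the minimum-directional-difference feature and the per-block maximum with noise threshold d; function names and the edge handling (border pixels left at 0) are assumptions, not part of the patent text.

```python
import numpy as np

def directional_feature(img):
    """F(i,j) = min of the squared-difference sums along the horizontal,
    vertical and two diagonal directions (border pixels kept at 0)."""
    f = img.astype(float)
    F = np.zeros_like(f)
    for i in range(1, f.shape[0] - 1):
        for j in range(1, f.shape[1] - 1):
            c = f[i, j]
            H = (c - f[i, j-1])**2 + (c - f[i, j+1])**2
            V = (c - f[i-1, j])**2 + (c - f[i+1, j])**2
            L = (c - f[i-1, j-1])**2 + (c - f[i+1, j+1])**2
            Rd = (c - f[i-1, j+1])**2 + (c - f[i+1, j-1])**2
            F[i, j] = min(H, V, L, Rd)
    return F

def block_feature_points(img, n, m, d):
    """Keep one candidate per non-overlapping n x m block (the block
    maximum of F), discarding values at or below the threshold d,
    i.e. keeping only points with G(x, y) > 0."""
    F = directional_feature(img)
    points = []
    for p in range(0, F.shape[0] - n + 1, n):
        for q in range(0, F.shape[1] - m + 1, m):
            block = F[p:p+n, q:q+m]
            i, j = np.unravel_index(np.argmax(block), block.shape)
            g = block[i, j]
            if g > d:
                points.append((p + i, q + j, g))
    return points
```

An isolated bright pixel, for instance, has a large difference in every direction and survives both the minimum and the threshold.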
(4-3) For image one or image two, obtain the R value of each feature point of feature point list T1 according to step (4-1) above, and the G value of the other feature point list T2 and each of its feature points by step (4-2). For feature points in T1 but not in T2, obtain their G values according to step (4-2); for feature points in T2 but not in T1, obtain their R values according to step (4-1), so that every feature point of T1 and T2 has both a G value and an R value.
Take all feature points of T1 and T2 as the feature points of image one or image two, forming the feature point lists T_S of image one and T_r of image two respectively.
Sort the feature points of image one in descending order of the sum G + R; the top-ranked points are the most significant. Take the first m significant feature points of image one to form its significant feature point list, denoted T_l.
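The ranking of step (4-3) can be sketched as below, assuming each feature point is stored as a (coordinate, G, R) triple; the function name is illustrative.

```python
def significant_points(points, m):
    """points: list of (coord, G, R) triples. Sort by G + R in
    descending order and keep the m most significant points
    (the list T_l in the text)."""
    ranked = sorted(points, key=lambda p: p[1] + p[2], reverse=True)
    return ranked[:m]
```

The parameter m trades off coverage against the cost of the later reduction and similarity steps.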
The feature points in the feature point lists T_S of image one and T_r of image two are parametrically described and computed with the following steps:
(5-1) Compute the pixel values of the feature point and its neighboring points.
Let the feature point be at position (i, j). Taking the 3 × 3 pixel region centered at (i, j), the vector of pixel values of the feature point and its neighbors is

WF(i,j) = (I(i,j), I(i,j-1), I(i,j+1), I(i-1,j-1), I(i-1,j), I(i-1,j+1), I(i+1,j-1), I(i+1,j), I(i+1,j+1)),

i.e. WF(i, j) collects the pixel values of the 3 × 3 region.
(5-2) Compute the first-order gradient information of the feature point.
Let the feature point be at (i, j). Its derivatives g_x(i,j) and g_y(i,j) in the x and y directions are computed as

g_x(i,j) = \frac{1}{2}((f(i,j) - f(i-1,j)) + (f(i+1,j) - f(i,j))),
g_y(i,j) = \frac{1}{2}((f(i,j) - f(i,j-1)) + (f(i,j+1) - f(i,j))).

Compute the gradient magnitude D(i,j) and direction θ(i,j) as

D(i,j) = \sqrt{g_x(i,j)^2 + g_y(i,j)^2},
θ(i,j) = g_y(i,j) / g_x(i,j).

This yields the gradient information vector of the feature point (i, j):

DF(i,j) = (g_x(i,j), g_y(i,j), D(i,j), θ(i,j)).

(5-3) From steps (5-1) and (5-2), the parametric description multi-dimensional vector of the feature point (i, j) is

DS(i,j) = (WF(i,j), DF(i,j)).

DS(i, j) is a 13-dimensional vector; for convenience of description it is denoted DS(i,j) = (w_1, w_2, …, w_13).
(5-4) For the feature points in the feature point lists of image one and image two, compute the multi-dimensional vectors DS(i,j) = (WF(i,j), DF(i,j)) of the parametric descriptions of all feature points according to steps (5-1), (5-2) and (5-3).
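Steps (5-1) to (5-3) can be sketched as the following 13-dimensional descriptor. This is a minimal illustration under assumptions: the averaged differences of the text reduce to central differences, and θ is returned as 0 when g_x = 0, a case the text leaves undefined.

```python
import numpy as np

def describe_point(img, i, j):
    """13-dimensional descriptor: the 9 pixel values of the 3x3
    neighbourhood (WF) plus first-order gradients, gradient magnitude
    and the direction ratio g_y/g_x (DF)."""
    f = img.astype(float)
    WF = [f[i, j],
          f[i, j-1], f[i, j+1],
          f[i-1, j-1], f[i-1, j], f[i-1, j+1],
          f[i+1, j-1], f[i+1, j], f[i+1, j+1]]
    gx = 0.5 * (f[i+1, j] - f[i-1, j])   # averaged differences = central difference
    gy = 0.5 * (f[i, j+1] - f[i, j-1])
    D = np.hypot(gx, gy)                 # gradient magnitude
    theta = gy / gx if gx != 0 else 0.0  # direction ratio (assumed 0 when gx = 0)
    return np.array(WF + [gx, gy, D, theta])
```

Each call yields one vector DS(i, j) = (w_1, …, w_13) as used by the similarity computation.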
The similarity between each feature point of image one's significant feature point list T_l and the feature points of image two's feature point list T_r in step 31 is computed as follows:
(6-1) Take the significant feature point list T_l of image one, i.e. T_l[1, 2, …, m]. Let the parametric description vector of a feature point t in T_l be (t_1, t_2, …, t_13), and that of a feature point s in the feature point list T_r of image two be (s_1, s_2, …, s_13). The similarity between t and s is computed as

S(t,s) = 1 - \frac{\sum_{i=1}^{13} t_i s_i}{\sqrt{\sum_{i=1}^{13} t_i^2} \sqrt{\sum_{i=1}^{13} s_i^2}}.

(6-2) Following step (6-1), compute the similarity between each feature point of T_l[1, 2, …, m] and all feature points of image two. Letting l be the length of image two's feature point list T_r, this yields the m × l similarity matrix

MS = (S(t,s))_{m×l}.
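Steps (6-1) and (6-2) can be sketched as below. The function names are illustrative; note that with the leading "1 −" the quantity behaves as a cosine distance (identical vectors score 0), which the subsequent thresholding must account for.

```python
import numpy as np

def similarity(t, s, dims=None):
    """S(t, s) = 1 - cosine(t, s), optionally restricted to the
    feature indices in `dims`; with all 13 dimensions this is the
    formula of step (6-1)."""
    t, s = np.asarray(t, float), np.asarray(s, float)
    if dims is not None:
        t, s = t[dims], s[dims]
    return 1.0 - np.dot(t, s) / (np.linalg.norm(t) * np.linalg.norm(s))

def similarity_matrix(Tl, Tr):
    """m x l matrix MS with MS[a, b] = S(Tl[a], Tr[b])."""
    return np.array([[similarity(t, s) for s in Tr] for t in Tl])
```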
The concrete steps of steps 32, 33, 34 and 35 are as follows:
(7-1) Given a specified threshold GH, obtain the candidate matching point set 1 of each feature point t of T_l[1, 2, …, m], denoted SP(t):

SP(t) = {s | S(t,s) > GH, s ∈ T_r}.

SP(t) denotes the feature points of image two whose similarity with t exceeds GH, i.e. the set of points that may match t. SP(t) is computed from the definition of the similarity matrix MS, and the family {SP(t)}_{t ∈ T_l[1,2,…,m]} is regarded as a rough neighborhood relational system.
(7-2) The description parameters of a significant feature point t of image one form a 13-dimensional vector; regard the 13 dimensions as 13 features, denoted {1, 2, …, 13}. Let F = {1, 2, …, 13}. Remove any one feature i from F, letting F = F - {i}, and recompute the similarity between t and the feature points of image two following step (6-1):

S(t,s) = 1 - \frac{\sum_{i \in F} t_i s_i}{\sqrt{\sum_{i \in F} t_i^2} \sqrt{\sum_{i \in F} s_i^2}}.

(7-3) With the similarities of step (7-2), compute once more, following step (7-1), the candidate matching point set 2 of each feature point t of T_l[1, 2, …, m], denoted {SP*(t)}_{t ∈ T_l[1,2,…,m]}.
For {SP*(t)}_{t ∈ T_l[1,2,…,m]} and {SP(t)}_{t ∈ T_l[1,2,…,m]}, define

W = \frac{1}{2m} \sum_{t \in T_l[1,2,…,m]} \left( \frac{|SP(t) ∩ SP*(t)|}{|SP(t)|} + \frac{|SP(t) ∩ SP*(t)|}{|SP*(t)|} \right),

where W describes the degree of difference between {SP*(t)} and {SP(t)}: the larger its value, the smaller the difference.
If W is less than a specified threshold, let F = F ∪ {i}, i.e. feature i cannot be removed; otherwise remove feature i.
(7-4) Repeat steps (7-2) and (7-3) until no feature in F can be removed, obtaining the reduced feature set F.
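Steps (7-1) to (7-4) can be sketched as the following greedy reduction loop. This is a minimal illustration under assumptions: pairs with an empty candidate set are skipped in W (the text leaves division by zero undefined), and the helper names are not from the patent.

```python
import numpy as np

def _sim(t, s, dims):
    """S(t, s) restricted to the feature indices in dims (step 7-2)."""
    t, s = np.asarray(t, float)[dims], np.asarray(s, float)[dims]
    return 1.0 - np.dot(t, s) / (np.linalg.norm(t) * np.linalg.norm(s))

def _candidate_sets(Tl, Tr, dims, GH):
    """SP(t) = {s | S(t, s) > GH} for every significant point t."""
    return [{b for b, s in enumerate(Tr) if _sim(t, s, dims) > GH} for t in Tl]

def _agreement(SP, SP_star):
    """W of step (7-3): mean overlap of the two candidate-set
    families; larger W means the families differ less."""
    tot = sum(len(A & B) / len(A) + len(A & B) / len(B)
              for A, B in zip(SP, SP_star) if A and B)
    return tot / (2 * len(SP))

def reduce_features(Tl, Tr, GH, W_min, n_dims=13):
    """Try removing each feature; keep the removal only if the
    candidate sets barely change (W >= W_min)."""
    dims = list(range(n_dims))
    SP = _candidate_sets(Tl, Tr, dims, GH)
    for i in range(n_dims):
        trial = [d for d in dims if d != i]
        if not trial:
            break
        SP_star = _candidate_sets(Tl, Tr, trial, GH)
        if _agreement(SP, SP_star) >= W_min:   # sets barely changed: drop i
            dims, SP = trial, SP_star
    return dims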
The candidate matching point set of step 4 is obtained as follows:
On the reduced feature set F, compute the similarity between all feature points of image one's feature point list T_S and all feature points of image two's feature point list T_r:

S(t,s) = 1 - \frac{\sum_{i \in F} t_i s_i}{\sqrt{\sum_{i \in F} t_i^2} \sqrt{\sum_{i \in F} s_i^2}}.

From these similarities and step (7-1), the candidate matching point set of each feature point of image one's significant feature point list after feature reduction is computed, denoted SP(t).
Step 5 judges whether the detected image one and the candidate matching image two match as follows:
If the cardinality of a candidate matching point set in image two equals 1, that point is the match of the corresponding feature point of image one; if the cardinality is greater than 1, the candidate point most similar to the feature point of image one is selected as the match; if the cardinality equals 0, image two contains no feature point matching that feature point of image one.
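The three-way decision of step 5 can be sketched as below, assuming a scoring function where a larger value means more similar (an assumption: if S of step (6-1) is used directly, the negated value should be passed, since there a smaller S means more similar).

```python
def match_decision(candidates, score=None):
    """Classify by the cardinality of the candidate set SP(t):
    1 -> unique match, >1 -> best-scoring candidate, 0 -> no match."""
    if len(candidates) == 1:
        return next(iter(candidates))
    if len(candidates) > 1:
        # several candidates: pick the one most similar to t
        return max(candidates, key=score)
    return None                # no matching feature point in image two
```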
The invention also provides an image stereo matching device, characterized by comprising:
a figure recognizer, used to identify the feature points of the detected image (image one) and the candidate matching image (image two); the identified feature points form the feature point lists T_S of image one and T_r of image two respectively; the feature points of image one's list are sorted in descending order by feature point value, and the top-ranked points form a significant feature point list T_l;
a multi-dimensional vector recognizer, used to describe the feature points of the above lists with gray-level information, multi-level gradient information and a curvature change function, computing the feature parameter descriptors of image one and image two to obtain the multi-dimensional vector feature of each feature point's parametric description, each dimension representing one feature of the point;
a data optimizer, used to reduce the multi-dimensional vector features of the parametric descriptions of the feature points of image one's significant list, specifically as follows:
compare the similarity between the multi-dimensional vector features of the parametric descriptions of the feature points of image one's significant list and those of the feature points of image two, obtaining candidate matching point set 1 in image two for each feature point of image one's significant list;
remove an arbitrary feature from the multi-dimensional vector feature of a feature point of image one's significant list; compare the similarity between the reduced multi-dimensional vector features of image one's feature points and those of image two's feature points, obtaining candidate matching point set 2 in image two for each feature point of the reduced significant list of image one;
compare the candidate matching point sets 1 and 2; if they differ greatly, the removed feature is significant and cannot be removed; if the difference is small, remove the feature;
repeat the above until no further feature of the multi-dimensional vector of any point of image one's significant list can be removed, obtaining the reduced feature set F;
on the reduced feature set F, compute the similarity between all feature points of image one's list T_S and all feature points of image two's list T_r, obtaining the candidate matching point set in image two for each feature point of image one's list;
a classifier, used to classify the above comparison results: if the cardinality of a candidate matching point set in image two equals 1, that point is the match of the corresponding feature point of image one; if the cardinality is greater than 1, the candidate point most similar to the feature point of image one is selected as the match; if the cardinality equals 0, image two contains no feature point matching that feature point of image one.
In the feature extraction phase, the invention integrates several feature extraction algorithms to obtain a large number of dense feature points from the images, avoiding the sparse-match problem caused by the few feature points a single extraction algorithm yields. For the parametric description of feature points, multiple kinds of parameter information are provided, such as gray-level information, multi-level gradient information and a curvature change function, describing the feature points accurately from many aspects and capturing more information about each point. Finally, in the feature point matching phase, the significant feature points of the detected image and the feature points of the matching image are chosen and rough-fuzzy techniques are used for the similarity computation; one parameter controls the precision of the similarity, so adjusting this parameter controls the precision of the matching. The multi-dimensional vector features of the detected image's feature points are reduced to obtain the reduced features and thus a more suitable parametric description; on the reduced features, the similarities between all feature points of the detected image and all feature points of the matching image are computed to obtain the candidate matching point sets, and the final judgment on these candidate sets yields higher matching precision and efficiency.
Description of drawings
Fig. 1 is the flow chart of the stereo image matching method of the invention;
Fig. 2 is the flow chart of obtaining candidate matching point set 1 in image two for each feature point of image one's significant feature point list;
Fig. 3 is the flow chart of reducing the multi-dimensional vector features of each feature point of image one's significant feature point list;
Fig. 4 is the schematic diagram of the stereo image matching device of the invention.
Embodiment
The invention is described in further detail below with reference to the accompanying drawings.
Fig. 1 is the flow chart of the image stereo matching method of the invention:
Step 1: extract the feature points of the detected image (image one) and the candidate matching image (image two), forming the feature point lists of image one and image two respectively; sort the feature points of image one's list in descending order by feature point value, and form the top-ranked feature points into a significant feature point list;
Step 2: describe the feature points of the above lists with multiple parameters; compute the feature parameter descriptors of image one and image two to obtain a multi-dimensional vector for each feature point's parametric description, each dimension representing one feature of the point;
Step 3: reduce the multi-dimensional vector of each feature point of image one's significant feature point list as follows:
Step 31: compute the similarity between each feature point of image one's significant list and the feature points of image two;
Step 32: using a specified threshold, obtain candidate matching point set 1 in image two for each feature point of image one's significant list;
Step 33: remove an arbitrary feature from the multi-dimensional vector of a feature point of image one's significant list; repeat steps 31 and 32, computing the similarity between the reduced feature points and the feature point list of image two; with a specified threshold, obtain candidate matching point set 2 in image two for each feature point of image one's significant list;
Step 34: compare candidate matching point sets 1 and 2; if they differ greatly, the removed feature is significant and cannot be removed; if the difference is small, remove the feature;
Step 35: repeat steps 33 and 34 until no further feature of any point of image one's significant feature point list can be reduced;
Step 4: repeat steps 31 and 32, computing the similarity between the feature points of image one's list (with reduced multi-dimensional vectors) and the feature points of image two's list; with a specified threshold, obtain the candidate matching point set in image two for each feature point of image one's list;
Step 5: judge from the above comparison results whether the detected image one and the candidate matching image two match.
Further, the reduction of the multi-dimensional vectors of the parametric descriptions of the feature points of image one's significant list in step 3 uses a rough set reduction method, yielding the reduced multi-dimensional vectors of the feature point parametric descriptions; its concrete steps are shown in Fig. 2 and Fig. 3:
Step 31: compute the similarity between each feature point of image one's significant list and the feature points of image two;
Step 32: judge whether the similarity obtained in step 31 exceeds a specified threshold; if not, discard the candidate matching point; otherwise proceed to step 33;
Step 33: obtain candidate matching point set 1 in image two for each feature point of image one's significant list; this candidate set denotes the feature points of image two whose similarity with the feature point of image one exceeds the specified threshold, i.e. the set of points that may match it;
Step 34: remove an arbitrary feature from the multi-dimensional vector feature of a feature point of image one's significant list;
Step 35: as in step 31, compute the similarity between the reduced feature point and the feature point list of image two;
Step 36: as in step 32, judge whether the similarity obtained in step 35 exceeds the specified threshold; if not, proceed to step 37 and discard the candidate matching point; otherwise proceed to step 38;
Step 38: as in step 33, obtain candidate matching point set 2 in image two for each feature point of image one's significant list;
Step 39: compare candidate matching point sets 1 and 2; if they differ greatly, the removed feature is significant: proceed to step 40 and do not remove the feature; if the difference is small, proceed to step 41 and remove the feature;
Step 42: repeat steps 34-41 above for the other multi-dimensional vector features of the feature points of image one's significant list; when all feature reductions are finished, proceed to step 43;
Step 43: obtain the features of image one's feature points with all multi-dimensional vector features reduced.
The feature points of image one and image two described in step 1 of Fig. 1 above are extracted with the following steps:
Step 11: compute the average gradient square matrix of each pixel in the image:

N(x, y) = \begin{pmatrix} (\partial I/\partial x)^2 & (\partial I/\partial x)(\partial I/\partial y) \\ (\partial I/\partial x)(\partial I/\partial y) & (\partial I/\partial y)^2 \end{pmatrix},

where I(x, y) is the gray value at position (x, y) in the image. When both eigenvalues of the average gradient square matrix at a point are large, the gray level varies strongly near that point, so the point is chosen as a feature point. The feature point response function is

R = det(N) - k (trace(N))^2,

where det(N) is the determinant of the matrix, trace(N) is the trace of the matrix N, and k is 0.04.
Sort the pixels of the image in descending order of their R values to form a sequence; choose the required number of feature points F and take the first F pixels of the sequence as feature points; F can be adjusted continually according to the matching results.
Step 12: for each pixel (i, j) in the image, compute the sums of squared differences with the neighboring pixel values along the horizontal, vertical, left-diagonal and right-diagonal directions, and take the minimum as the initial value of the point's feature:

H = (f(i,j) - f(i,j-1))^2 + (f(i,j) - f(i,j+1))^2
V = (f(i,j) - f(i-1,j))^2 + (f(i,j) - f(i+1,j))^2
L = (f(i,j) - f(i-1,j-1))^2 + (f(i,j) - f(i+1,j+1))^2
R = (f(i,j) - f(i-1,j+1))^2 + (f(i,j) - f(i+1,j-1))^2

Let F(i, j) = min{H, V, L, R}. Divide the image into non-overlapping regions W_{p,q} of size n × m, and compute the coordinates

(x, y) = \arg\max_{(i,j) \in W_{p,q}} F(i, j),

and define the feature image value at (x, y) as

F(x, y) = \max_{(i,j) \in W_{p,q}} F(i, j).

To further suppress noise, specify a threshold d and redefine the feature image value as

G(x, y) = \begin{cases} F(x, y), & F(x, y) > d \\ 0, & F(x, y) \le d \end{cases}

If G(x, y) > 0, then (x, y) is a corresponding feature point.
Step 13: for image one or image two, obtain the R value of each feature point of feature point list T1 according to step 11 above, and the G value of the other feature point list T2 and each of its feature points by step 12. For feature points in T1 but not in T2, obtain their G values according to step 12; for feature points in T2 but not in T1, obtain their R values according to step 11, so that every feature point of T1 and T2 has both a G value and an R value.
Take all feature points of T1 and T2 as the feature points of image one or image two, forming the feature point lists T_S of image one and T_r of image two respectively.
Sort the feature points of image one in descending order of the sum G + R; the top-ranked points are the most significant. Take the first m significant feature points of image one to form its significant feature point list, denoted T_l.
For the feature points in the feature point list T_S of image one and the feature point list T_r of image two, parameter description and computation are carried out by the following steps:
Step 14: compute the pixel values of the feature point and its neighbours.
Let the feature point be at position (i, j). Taking (i, j) as the centre of a 3 × 3 pixel region, the pixel-value vector of the feature point and its neighbours is directly obtained as:
WF(i,j) = (I(i,j), I(i−1,j), I(i+1,j), I(i,j−1), I(i,j+1), I(i−1,j−1), I(i−1,j+1), I(i+1,j−1), I(i+1,j+1)),
i.e. WF(i, j) collects the pixel values of the 3 × 3 region.
Step 15: compute the first-order gradient information of the feature point.
Let the feature point be at position (i, j). Its derivatives g_x(i, j) and g_y(i, j) in the x and y directions are computed as:
g_x(i,j) = \frac{1}{2}\left((f(i,j) − f(i−1,j)) + (f(i+1,j) − f(i,j))\right),
g_y(i,j) = \frac{1}{2}\left((f(i,j) − f(i,j−1)) + (f(i,j+1) − f(i,j))\right),
Compute the gradient magnitude D(i, j) and direction θ(i, j) as:
D(i,j) = \sqrt{g_x(i,j)^2 + g_y(i,j)^2},
θ(i,j) = \frac{g_y(i,j)}{g_x(i,j)},
This yields the gradient-information vector of the feature point (i, j):
DF(i,j) = (g_x(i,j), g_y(i,j), D(i,j), θ(i,j)),
Step 16: from steps 14 and 15, the parameter-description multi-dimensional vector of the feature point (i, j) is:
DS(i,j) = (WF(i,j), DF(i,j)),
DS(i, j) is a 13-dimensional vector; for convenience of description it is denoted
DS(i,j) = (w_1, w_2, …, w_13).
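Steps 14–16 can be sketched as follows; this assumes a float numpy image f and an interior point (i, j), and stores the 3 × 3 neighbourhood in row-major order rather than the centre-first order listed above:

```python
import numpy as np

def describe_point(f, i, j):
    """Sketch of the 13-dimensional descriptor DS(i, j) from steps 14-16.

    Assumes f is a float numpy image and (i, j) is at least one pixel
    away from the border.
    """
    # WF: the 9 pixel values of the 3x3 neighbourhood centred on (i, j),
    # in row-major order (the text lists the centre pixel first).
    WF = f[i - 1:i + 2, j - 1:j + 2].ravel()

    # DF: first-order central differences, gradient magnitude and direction.
    gx = 0.5 * (f[i + 1, j] - f[i - 1, j])
    gy = 0.5 * (f[i, j + 1] - f[i, j - 1])
    D = np.hypot(gx, gy)
    theta = gy / gx if gx != 0 else 0.0   # the text defines theta as gy/gx
    DF = np.array([gx, gy, D, theta])

    return np.concatenate([WF, DF])        # 13 components: 9 + 4
```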
Step 17: for the feature points in the feature point lists of image one and image two, compute the parameter-description multi-dimensional vector of every feature point according to steps 14–16:
DS(i,j) = (WF(i,j), DF(i,j)).
Step 18: take the salient feature point list T_l of image one, i.e. T_l[1, 2, …, m]. Let the parameter-description vector of a feature point t in T_l be (t_1, t_2, …, t_13), and that of a feature point s in the feature point list T_r of image two be (s_1, s_2, …, s_13). The similarity between t and s is computed as:
S(t,s) = 1 − \frac{\sum_{i=1}^{13} t_i s_i}{\sqrt{\sum_{i=1}^{13} t_i^2}\,\sqrt{\sum_{i=1}^{13} s_i^2}}
Step 19: by step 18, compute the similarity between each feature point in T_l[1, 2, …, m] and all feature points of image two. The length of the feature point list T_r of image two being l, this yields the m × l similarity matrix:
MS = (S(t,s))_{m×l}
Step 20: given a specified threshold GH, obtain candidate match set 1 of each feature point t in T_l[1, 2, …, m], i.e. SP(t):
SP(t) = {s | S(t,s) > GH, s ∈ T_r},
SP(t) denotes the feature points of image two whose similarity with t exceeds GH, i.e. the set of feature points that may match t. SP(t) is computed from the definition of the similarity matrix MS, and the family {SP(t)}_{t ∈ T_l[1,2,…,m]} is regarded as a rough neighbourhood relational system.
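A minimal sketch of steps 18–20, assuming the descriptors are numpy vectors. Note that with S = 1 − cos, a small score means high similarity, so the candidate test below keeps scores under the threshold GH, whereas the text phrases the test as "similarity greater than GH":

```python
import numpy as np

def similarity(t, s):
    """S(t, s) as reconstructed from step 18: one minus the cosine of the
    angle between the two descriptor vectors (0 for identical directions)."""
    return 1.0 - np.dot(t, s) / (np.linalg.norm(t) * np.linalg.norm(s))

def candidate_sets(Tl, Tr, GH):
    """Sketch of steps 19-20: build the m x l score matrix MS and collect,
    for each salient point t of image one, its candidate matches in image two."""
    MS = np.array([[similarity(t, s) for s in Tr] for t in Tl])
    cands = [{j for j in range(len(Tr)) if MS[i, j] < GH} for i in range(len(Tl))]
    return MS, cands
```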
Step 21: the description parameter of a salient feature point t of image one is a 13-dimensional vector. Regard its 13 dimensions as 13 features, denoted {1, 2, …, 13}, and let F = {1, 2, …, 13}. The 13 features contain redundancy with respect to the similarity computation, and this redundancy lowers both the accuracy and the efficiency of that computation. Therefore the reduction method of rough set theory is applied to reduce {1, 2, …, 13}, using a top-down reduction procedure whose concrete steps are as follows: remove an arbitrary feature i from F, let F = F − {i}, and recompute the similarity between t and the feature points of the feature point list of image two according to step 18:
S(t,s) = 1 − \frac{\sum_{i∈F} t_i s_i}{\sqrt{\sum_{i∈F} t_i^2}\,\sqrt{\sum_{i∈F} s_i^2}}
Step 22: with the similarities of t computed in step 21, recompute, according to step 20, candidate match set 2 of each feature point t in T_l[1, 2, …, m], denoted {SP*(t)}_{t ∈ T_l[1,2,…,m]}.
For {SP*(t)}_{t ∈ T_l[1,2,…,m]} and {SP(t)}_{t ∈ T_l[1,2,…,m]}, define:
W = \frac{1}{2m} \sum_{t ∈ T_l[1,2,…,m]} \left( \frac{|SP(t) ∩ SP^*(t)|}{|SP(t)|} + \frac{|SP(t) ∩ SP^*(t)|}{|SP^*(t)|} \right),
where W describes the degree of difference between {SP*(t)}_{t ∈ T_l[1,2,…,m]} and {SP(t)}_{t ∈ T_l[1,2,…,m]}: the larger its value, the smaller the difference.
If W is less than a specified threshold, i.e. the difference is large, let F = F ∪ {i}; that is, feature i cannot be removed. Otherwise, remove feature i.
Step 23: repeat steps 21 and 22 until no more features can be removed from F, obtaining the reduced feature set F.
Step 24: on the reduced feature set F, compute the similarity between all feature points of the feature point list T_S of image one and all feature points of the feature point list T_r of image two:
S(t,s) = 1 − \frac{\sum_{i∈F} t_i s_i}{\sqrt{\sum_{i∈F} t_i^2}\,\sqrt{\sum_{i∈F} s_i^2}}
From this similarity and the computation of step 20, the candidate match set in image two of each feature point of the feature-reduced feature point list of image one is obtained, denoted SP(t).
According to the results of the above comparison steps, whether the detected image one and the candidate matching image two match is judged as follows:
If the cardinality of a candidate match set in image two equals 1, its element is the match of the corresponding feature point of image one. If the cardinality of the candidate match set in image two is greater than 1, the candidate point most similar to the feature point of image one is selected as the match. If the cardinality of the candidate match set equals 0, there is no feature point in image two matching that feature point of image one.
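The decision rule can be sketched as a small classifier; candidates and scores are assumed to be dictionaries built from the candidate match sets and the similarity matrix:

```python
def classify(candidates, scores):
    """Sketch of the final decision rule. candidates maps each point of
    image one to its candidate set in image two; scores[t][s] is the
    score of pair (t, s) (lower = more similar under S = 1 - cos)."""
    result = {}
    for t, cand in candidates.items():
        if len(cand) == 0:
            result[t] = None                  # no match in image two
        elif len(cand) == 1:
            result[t] = next(iter(cand))      # unique candidate is the match
        else:
            # Several candidates: pick the most similar one.
            result[t] = min(cand, key=lambda s: scores[t][s])
    return result
```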
Fig. 4 is a schematic diagram of an image stereo matching device of the present invention, comprising:
A figure recognizer, used to identify the feature points of the detected image one and the candidate matching image two; the identified feature points form the feature point lists T_S and T_r of image one and image two respectively; the feature points of the feature point list of image one are sorted in descending order by feature point value, and the top-ranked feature points form a salient feature point list T_l.
A multi-dimensional vector recognizer, used to describe the feature points of the above feature point lists by gray-level information, multi-order gradient information and curvature-transform functions, to compute the feature parameter descriptors of image one and image two, and to obtain the multi-dimensional vector features of the parameter descriptions of the feature points, each vector dimension representing one feature of a feature point.
A data optimizer, used to reduce the multi-dimensional vector feature of the parameter description of each feature point of the salient feature point list of image one, specifically as follows:
compare the similarity between the multi-dimensional vector features of the parameter descriptions of the feature points of the salient feature point list of image one and those of the feature points of image two, obtaining candidate match set 1 of each feature point of the salient feature point list of image one in image two;
remove an arbitrary feature from the multi-dimensional vector feature of a feature point of the salient feature point list of image one; compare the similarity between the multi-dimensional vector features of the parameter descriptions of the reduced feature points of image one and those of the feature points of image two, obtaining candidate match set 2 of each feature point of the reduced salient feature point list of image one in image two;
compare candidate match sets 1 and 2: if they differ greatly, the removed feature is salient and cannot be removed; if the difference is small, remove the feature;
repeat the above until no further feature of the multi-dimensional vector features of the feature points of the salient feature point list of image one can be removed, obtaining the reduced feature set F;
on the reduced feature set F, compute the similarity between all feature points of the feature point list T_S of image one and all feature points of the feature point list T_r of image two, obtaining the candidate match set of each feature point of the feature point list of image one in image two.
A classifier, used to classify the above comparison results: if the cardinality of a candidate match set in image two equals 1, its element is the match of the corresponding feature point of image one; if the cardinality is greater than 1, the candidate point most similar to the feature point of image one is selected as the match; if the cardinality equals 0, there is no feature point in image two matching that feature point of image one.
Compared with the prior art, the present invention achieves higher matching accuracy and higher efficiency in stereo image matching.

Claims (10)

1. An image stereo matching method, characterized by comprising the steps of:
Step 1: extracting the feature points of a detected image one and a candidate matching image two, forming the feature point lists of image one and image two respectively, sorting the feature points of the feature point list of image one in descending order by feature point value, and forming the top-ranked feature points into a salient feature point list;
Step 2: describing the feature points of the above feature point lists with multiple parameters, computing the feature parameter descriptors of image one and image two, and obtaining the multi-dimensional vector of the parameter description of each feature point, each vector dimension representing one feature of a feature point;
Step 3: reducing the multi-dimensional vector of the parameter description of each feature point of the salient feature point list of image one according to the following steps:
Step 31: computing the similarity between each feature point of the salient feature point list of image one and the feature points of image two;
Step 32: obtaining, according to a specified threshold, candidate match set 1 of each feature point of the salient feature point list of image one in image two;
Step 33: removing an arbitrary feature from the multi-dimensional vector feature of a feature point of the salient feature point list of image one; repeating steps 31 and 32 to compute the similarity between the reduced feature points and the feature point list of image two; and, with a specified threshold, obtaining candidate match set 2 of each feature point of the salient feature point list of image one in image two;
Step 34: comparing candidate match set 1 with candidate match set 2; if they differ greatly, the removed feature is salient and cannot be removed; if the difference is small, removing the feature;
Step 35: repeating steps 33 and 34 until no further feature of any feature point of the salient feature point list of image one can be reduced;
Step 4: repeating steps 31 and 32 to compute the similarity between the feature points of the vector-reduced feature point list of image one and the feature points of the feature point list of image two, and, with a specified threshold, obtaining the candidate match set of each feature point of the feature point list of image one in image two;
Step 5: judging, according to the results of the above comparison steps, whether the detected image one and the candidate matching image two match.
2. The image stereo matching method according to claim 1, characterized in that the multiple-parameter description of the feature points of the feature point lists in step 2 comprises at least gray-level information, multi-order gradient information and curvature-transform functions.
3. The image stereo matching method according to claim 2, characterized in that the reduction of the multi-dimensional vector of the parameter description of each feature point of the salient feature point list of image one in step 3 uses the rough set reduction method to reduce the multi-dimensional vector of the feature point parameter description, obtaining the reduced multi-dimensional vector of the feature point parameter description.
4. The image stereo matching method according to claim 1, 2 or 3, characterized in that the feature points of image one and image two in step 1 are extracted by the following steps:
(4-1) computing the average gradient square matrix of each pixel in the image:
N(x, y) = \begin{pmatrix} \left(\frac{\partial I}{\partial x}\right)^2 & \frac{\partial I}{\partial x}\frac{\partial I}{\partial y} \\ \frac{\partial I}{\partial x}\frac{\partial I}{\partial y} & \left(\frac{\partial I}{\partial y}\right)^2 \end{pmatrix},
where I(x, y) is the gray-scale value at position (x, y) in the image; when the two eigenvalues of the average gradient square matrix at a point are both large, the gray level varies strongly near that point, so that point is chosen as a feature point, and the feature point response function is:
R = det(N) − k (trace(N))^2,
where det(N) is the determinant of the matrix, trace(N) is the trace of the matrix N, and k is 0.04;
the pixels in the image are sorted in descending order by their R values to form a sequence; a required number of feature points F is determined, and the first F pixels of the sequence are taken as feature points; F can be adjusted continually according to the matching results.
(4-2) for a pixel (i, j) in the image, computing the sums of squared differences to the neighbouring pixel values in the horizontal, vertical, left-diagonal and right-diagonal directions respectively, and taking the minimum as the initial feature value of the pixel:
H = (f(i,j) − f(i,j−1))^2 + (f(i,j) − f(i,j+1))^2,
V = (f(i,j) − f(i−1,j))^2 + (f(i,j) − f(i+1,j))^2,
L = (f(i,j) − f(i−1,j−1))^2 + (f(i,j) − f(i+1,j+1))^2,
R = (f(i,j) − f(i−1,j+1))^2 + (f(i,j) − f(i+1,j−1))^2;
Let F(i, j) = min{H, V, L, R}; divide the image into non-overlapping regions W_{p,q} of size n × m and compute the coordinates (x, y) by the following formula:
(x, y) = \arg\max_{(i,j) \in W_{p,q}} F(i, j),
and define the feature-image value at the coordinates (x, y) as:
F(x, y) = \max_{(i,j) \in W_{p,q}} F(i, j),
To further suppress noise, specify a threshold d and redefine the feature-image value as:
G(x, y) = \begin{cases} F(x, y), & F(x, y) > d \\ 0, & F(x, y) \le d \end{cases}
If G(x, y) is greater than 0, then (x, y) is a corresponding feature point.
(4-3) for image one or image two, obtaining the R value corresponding to each feature point in the feature point list T1 according to step (4-1) above, and obtaining another feature point list T2 and the G value corresponding to each of its feature points by step (4-2); for feature points in T1 but not in T2, obtaining their G values according to step (4-2); for feature points in T2 but not in T1, obtaining their R values according to step (4-1), so that all feature points in T1 and T2 have both G and R values;
taking all feature points in T1 and T2 together as the feature points of image one or image two, forming the feature point lists T_S and T_r of image one and image two respectively;
sorting the feature points of image one in descending order by the sum (G + R), the top-ranked feature points being the more salient, and taking the first m salient feature points of image one to form its salient feature point list, denoted T_l.
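The response in step (4-1) is the well-known Harris corner measure. A minimal numpy sketch follows; the 3 × 3 averaging window is an assumption, since the text speaks of an "average gradient square matrix" without fixing the window:

```python
import numpy as np

def corner_response(f, k=0.04):
    """Sketch of the response R = det(N) - k * trace(N)^2 from step (4-1),
    with N built from the image gradients (a Harris-style detector).
    A 3x3 box average of the gradient products stands in for the
    'average' in the average gradient square matrix."""
    gy, gx = np.gradient(f)            # numpy returns d/drow, then d/dcol

    def box3(a):
        # 3x3 box average with edge padding.
        p = np.pad(a, 1, mode="edge")
        return sum(p[r:r + a.shape[0], c:c + a.shape[1]]
                   for r in range(3) for c in range(3)) / 9.0

    Ixx, Iyy, Ixy = box3(gx * gx), box3(gy * gy), box3(gx * gy)
    det = Ixx * Iyy - Ixy * Ixy
    trace = Ixx + Iyy
    return det - k * trace ** 2
```

Sorting the pixels by this response in descending order and keeping the first F of them gives the feature points of step (4-1).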
5. The image stereo matching method according to claim 4, characterized in that the feature points in the feature point list T_S of image one and the feature point list T_r of image two are parameter-described and computed by the following steps:
(5-1) computing the pixel values of the feature point and its neighbours:
let the feature point be at position (i, j); taking (i, j) as the centre of a 3 × 3 pixel region, the pixel-value vector of the feature point and its neighbours is directly obtained as
WF(i,j) = (I(i,j), I(i−1,j), I(i+1,j), I(i,j−1), I(i,j+1), I(i−1,j−1), I(i−1,j+1), I(i+1,j−1), I(i+1,j+1)),
i.e. WF(i, j) collects the pixel values of the 3 × 3 region;
(5-2) computing the first-order gradient information of the feature point:
let the feature point be at position (i, j); its derivatives g_x(i, j) and g_y(i, j) in the x and y directions are computed as
g_x(i,j) = \frac{1}{2}\left((f(i,j) − f(i−1,j)) + (f(i+1,j) − f(i,j))\right),
g_y(i,j) = \frac{1}{2}\left((f(i,j) − f(i,j−1)) + (f(i,j+1) − f(i,j))\right),
the gradient magnitude D(i, j) and direction θ(i, j) are computed as
D(i,j) = \sqrt{g_x(i,j)^2 + g_y(i,j)^2},
θ(i,j) = \frac{g_y(i,j)}{g_x(i,j)},
which yields the gradient-information vector of the feature point (i, j):
DF(i,j) = (g_x(i,j), g_y(i,j), D(i,j), θ(i,j));
(5-3) from steps (5-1) and (5-2), the parameter-description multi-dimensional vector of the feature point (i, j) is
DS(i,j) = (WF(i,j), DF(i,j)),
DS(i, j) is a 13-dimensional vector, denoted for convenience as
DS(i,j) = (w_1, w_2, …, w_13);
(5-4) for the feature points in the feature point lists of image one and image two, computing the parameter-description multi-dimensional vector of every feature point according to steps (5-1), (5-2) and (5-3):
DS(i,j) = (WF(i,j), DF(i,j)).
6. The image stereo matching method according to claim 5, characterized in that the similarity between each feature point of the salient feature point list T_l of image one in step 31 and the feature points of the feature point list T_r of image two is computed by the following steps:
(6-1) take the salient feature point list T_l of image one, i.e. T_l[1, 2, …, m]; let the parameter-description vector of a feature point t in T_l be (t_1, t_2, …, t_13) and that of a feature point s in the feature point list T_r of image two be (s_1, s_2, …, s_13); the similarity between t and s is computed as
S(t,s) = 1 − \frac{\sum_{i=1}^{13} t_i s_i}{\sqrt{\sum_{i=1}^{13} t_i^2}\,\sqrt{\sum_{i=1}^{13} s_i^2}}
(6-2) by step (6-1), compute the similarity between each feature point in T_l[1, 2, …, m] and all feature points of image two; the length of the feature point list T_r of image two being l, this yields the m × l similarity matrix
MS = (S(t,s))_{m×l}
7. The image stereo matching method according to claim 5 or 6, characterized in that the concrete steps of steps 32, 33, 34 and 35 are as follows:
(7-1) given a specified threshold GH, obtain candidate match set 1 of each feature point t in T_l[1, 2, …, m] in image two, i.e. SP(t):
SP(t) = {s | S(t,s) > GH, s ∈ T_r},
where SP(t) denotes the feature points of image two whose similarity with t exceeds GH, i.e. the set of feature points that may match t; SP(t) is computed from the definition of the similarity matrix MS, and the family {SP(t)}_{t ∈ T_l[1,2,…,m]} is regarded as a rough neighbourhood relational system;
(7-2) the description parameter of a salient feature point t of image one is a 13-dimensional vector; regard its 13 dimensions as 13 features, denoted {1, 2, …, 13}, and let F = {1, 2, …, 13}; remove an arbitrary feature i from F, let F = F − {i}, and recompute the similarity between t and the feature points of the feature point list of image two according to step (6-1):
S(t,s) = 1 − \frac{\sum_{i∈F} t_i s_i}{\sqrt{\sum_{i∈F} t_i^2}\,\sqrt{\sum_{i∈F} s_i^2}}
(7-3) with the similarities of t computed in step (7-2), recompute, according to step (7-1), candidate match set 2 of each feature point t in T_l[1, 2, …, m], denoted {SP*(t)}_{t ∈ T_l[1,2,…,m]};
for {SP*(t)}_{t ∈ T_l[1,2,…,m]} and {SP(t)}_{t ∈ T_l[1,2,…,m]}, define
W = \frac{1}{2m} \sum_{t ∈ T_l[1,2,…,m]} \left( \frac{|SP(t) ∩ SP^*(t)|}{|SP(t)|} + \frac{|SP(t) ∩ SP^*(t)|}{|SP^*(t)|} \right),
where W describes the degree of difference between {SP*(t)}_{t ∈ T_l[1,2,…,m]} and {SP(t)}_{t ∈ T_l[1,2,…,m]}: the larger its value, the smaller the difference;
if W is less than a specified threshold, let F = F ∪ {i}, i.e. feature i cannot be removed; otherwise remove feature i;
(7-4) repeat steps (7-2) and (7-3) until no more features can be removed from F, obtaining the reduced feature set F.
8. The image stereo matching method according to claim 7, characterized in that the candidate match sets of step 4 are obtained as follows:
on the reduced feature set F, compute the similarity between all feature points of the feature point list T_S of image one and all feature points of the feature point list T_r of image two:
S(t,s) = 1 − \frac{\sum_{i∈F} t_i s_i}{\sqrt{\sum_{i∈F} t_i^2}\,\sqrt{\sum_{i∈F} s_i^2}}
from this similarity and the computation of step (7-1), the candidate match set in image two of each feature point of the feature-reduced feature point list of image one is obtained, denoted SP(t).
9. The image stereo matching method according to claim 8, characterized in that step 5 judges whether the detected image one and the candidate matching image two match as follows:
if the cardinality of a candidate match set in image two equals 1, its element is the match of the corresponding feature point of image one; if the cardinality of the candidate match set in image two is greater than 1, the candidate point most similar to the feature point of image one is selected as the match; if the cardinality of the candidate match set equals 0, there is no feature point in image two matching that feature point of image one.
10. An image stereo matching device, characterized by comprising:
a figure recognizer, used to identify the feature points of the detected image one and the candidate matching image two, the identified feature points forming the feature point lists T_S and T_r of image one and image two respectively, the feature points of the feature point list of image one being sorted in descending order by feature point value, and the top-ranked feature points forming a salient feature point list T_l;
a multi-dimensional vector recognizer, used to describe the feature points of the above feature point lists by gray-level information, multi-order gradient information and curvature-transform functions, to compute the feature parameter descriptors of image one and image two, and to obtain the multi-dimensional vector features of the parameter descriptions of the feature points, each vector dimension representing one feature of a feature point;
a data optimizer, used to reduce the multi-dimensional vector feature of the parameter description of each feature point of the salient feature point list of image one, specifically as follows:
compare the similarity between the multi-dimensional vector features of the parameter descriptions of the feature points of the salient feature point list of image one and those of the feature points of image two, obtaining candidate match set 1 of each feature point of the salient feature point list of image one in image two;
remove an arbitrary feature from the multi-dimensional vector feature of a feature point of the salient feature point list of image one; compare the similarity between the multi-dimensional vector features of the parameter descriptions of the reduced feature points of image one and those of the feature points of image two, obtaining candidate match set 2 of each feature point of the reduced salient feature point list of image one in image two;
compare candidate match sets 1 and 2: if they differ greatly, the removed feature is salient and cannot be removed; if the difference is small, remove the feature;
repeat the above until no further feature of the multi-dimensional vector features of the feature points of the salient feature point list of image one can be removed, obtaining the reduced feature set F;
on the reduced feature set F, compute the similarity between all feature points of the feature point list T_S of image one and all feature points of the feature point list T_r of image two, obtaining the candidate match set of each feature point of the feature point list of image one in image two;
a classifier, used to classify the above comparison results: if the cardinality of a candidate match set in image two equals 1, its element is the match of the corresponding feature point of image one; if the cardinality is greater than 1, the candidate point most similar to the feature point of image one is selected as the match; if the cardinality equals 0, there is no feature point in image two matching that feature point of image one.
CNB2007100508536A 2007-12-17 2007-12-17 A kind of image solid matching method and device thereof Expired - Fee Related CN100550054C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2007100508536A CN100550054C (en) 2007-12-17 2007-12-17 A kind of image solid matching method and device thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNB2007100508536A CN100550054C (en) 2007-12-17 2007-12-17 A kind of image solid matching method and device thereof

Publications (2)

Publication Number Publication Date
CN101197045A true CN101197045A (en) 2008-06-11
CN100550054C CN100550054C (en) 2009-10-14

Family

ID=39547426

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2007100508536A Expired - Fee Related CN100550054C (en) 2007-12-17 2007-12-17 A kind of image solid matching method and device thereof

Country Status (1)

Country Link
CN (1) CN100550054C (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102693533A (en) * 2012-03-12 2012-09-26 清华大学 Medical digital image mosaicing method

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102144231A (en) * 2008-06-16 2011-08-03 微软公司 Adaptive visual similarity for text-based image search results re-ranking
CN102084364A (en) * 2008-07-24 2011-06-01 西门子公司 Parallel navigation in a plurality of CAD models
US8756036B2 (en) 2008-07-24 2014-06-17 Siemens Aktiengesellschaft Parallel navigation in a plurality of CAD models
CN102224523B (en) * 2008-11-25 2014-04-23 Nec软件***科技有限公司 Stereo matching process system, stereo matching process method, and recording medium
CN102224523A (en) * 2008-11-25 2011-10-19 Nec软件***科技有限公司 Stereo matching process system, stereo matching process method, and recording medium
CN102905864A (en) * 2009-12-23 2013-01-30 Cfs比尔有限责任公司 Method for classifying the quality of food slices of a stick of food
CN102117412B (en) * 2009-12-31 2013-03-27 北大方正集团有限公司 Method and device for image recognition
CN102236893A (en) * 2010-04-30 2011-11-09 中国人民解放军装备指挥技术学院 Space-position-forecast-based corresponding image point matching method for lunar surface image
US9064171B2 (en) 2010-07-30 2015-06-23 Panasonic Intellectual Property Management Co., Ltd. Detection device and method for transition area in space
CN103052971A (en) * 2010-07-30 2013-04-17 松下电器产业株式会社 Detection device and method for transition area in space
CN104169946B (en) * 2011-07-14 2019-12-06 华为技术有限公司 Extensible queries for visual search
CN104169946A (en) * 2011-07-14 2014-11-26 华为技术有限公司 Scalable query for visual search
CN102609948A (en) * 2012-02-10 2012-07-25 浙江理工大学 Manipulation detection method for copy-paste distorted photo digital photos
CN105190226A (en) * 2013-03-12 2015-12-23 富士胶片株式会社 Image assessment device, capturing device, 3d measuring device, image assessment method, and program
CN105190226B (en) * 2013-03-12 2017-09-19 富士胶片株式会社 Image judgment device, camera device, three-dimentional measurement device and image determinant method
CN104346797A (en) * 2013-07-31 2015-02-11 北大方正集团有限公司 Key pixel point matching method and device, and image matching method and device
CN103714543A (en) * 2013-12-26 2014-04-09 南京理工大学 Simple tree dynamic programming binocular and stereo matching method based on invariant moment spatial information
CN104639933A (en) * 2015-01-07 2015-05-20 前海艾道隆科技(深圳)有限公司 Real-time acquisition method and real-time acquisition system for depth maps of three-dimensional views
CN106327188A (en) * 2016-08-15 2017-01-11 华为技术有限公司 Binding method and device for bank cards in payment application
US10937016B2 (en) 2016-08-15 2021-03-02 Huawei Technologies Co., Ltd. Method and apparatus for binding bank card in payment application
CN106484118B (en) * 2016-10-24 2020-01-14 福建北极光虚拟视觉展示科技有限公司 Augmented reality interaction method and system based on fixed trigger source
CN106484118A (en) * 2016-10-24 2017-03-08 福建北极光虚拟视觉展示科技有限公司 A kind of augmented reality exchange method based on fixing trigger source and system
CN107590502A (en) * 2017-09-18 2018-01-16 西安交通大学 A kind of whole audience dense point fast matching method
CN107590502B (en) * 2017-09-18 2020-05-22 西安交通大学 Full-field dense point fast matching method
CN108009595B (en) * 2017-12-25 2018-10-12 北京航空航天大学 Image recognition method based on feature reduction
CN108009595A (en) * 2017-12-25 2018-05-08 北京航空航天大学 Image recognition method based on feature reduction
CN109635822A (en) * 2018-12-07 2019-04-16 浙江科技学院 Stereoscopic image visual saliency extraction method based on deep learning coding and decoding network
CN109635822B (en) * 2018-12-07 2022-06-21 浙江科技学院 Stereoscopic image visual saliency extraction method based on deep learning coding and decoding network
CN109631829A (en) * 2018-12-17 2019-04-16 南京理工大学 Binocular distance measurement method with adaptive fast matching
CN116309757A (en) * 2023-05-24 2023-06-23 山东省青东智能科技有限公司 Binocular stereo matching method based on machine vision

Also Published As

Publication number Publication date
CN100550054C (en) 2009-10-14

Similar Documents

Publication Publication Date Title
CN100550054C (en) Image solid matching method and device thereof
CN102236794B (en) Recognition and pose determination of 3D objects in 3D scenes
CN109934862A (en) Binocular vision SLAM method combining point and line features
CN105956560A (en) Vehicle model identification method based on pooled multi-scale deep convolutional features
CN108010045A (en) ORB-based mismatch elimination method for visual image feature points
CN105608441B (en) Vehicle type recognition method and system
CN105956582A (en) Face identification system based on three-dimensional data
CN102024152B (en) Traffic sign recognition method based on sparse representation and dictionary learning
CN106682586A (en) Method for real-time lane line detection based on vision under complex lighting conditions
CN103593832A (en) Image mosaic method based on a second-order difference-of-Gaussian feature detection operator
CN105046694A (en) Quick point cloud registration method based on curved surface fitting coefficient features
CN109325510B (en) Image feature point matching method based on grid statistics
CN104299009A (en) Plate number character recognition method based on multi-feature fusion
CN110309010B (en) Partial discharge network training method and device for phase resolution of power equipment
CN108268865A (en) License plate recognition method and system in natural scenes based on cascaded convolutional networks
CN105825233A (en) Pedestrian detection method based on an online-learning random fern classifier
CN115272652A (en) Dense object image detection method based on multiple regression and adaptive focus loss
CN102147812A (en) Three-dimensional point cloud model-based landmark building image classifying method
CN103218825A (en) Fast detection method for scale-invariant spatio-temporal interest points
CN109729298A (en) Image processing method and image processing apparatus
CN102902976A (en) Image scene classification method based on target and space relationship characteristics
CN104573722A (en) Three-dimensional face race classifying device and method based on three-dimensional point cloud
CN1286064C (en) An image retrieval method based on marked interest points
CN103336964A (en) SIFT image matching method based on modulus-difference mirror invariance
CN107301431A (en) Weight-based non-maximum suppression method and device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20091014

Termination date: 20191217