CN103646396A - Matching cost algorithm of binocular stereo matching algorithm, and non-local stereo matching algorithm - Google Patents
- Publication number: CN103646396A (application CN201310634040.7A)
- Authority: CN (China)
- Legal status: Granted
Abstract
The invention discloses a matching cost computation method for a binocular stereo matching algorithm, and a non-local stereo matching algorithm based on a variable-weight minimum spanning tree (MST). The matching cost computation method comprises the following steps: (S1) a color space normalization step, in which RGB color space normalization is applied to the original image so that the normalized color components r, g, b of each pixel satisfy r = Cr/(Cr+Cg+Cb), g = Cg/(Cr+Cg+Cb), b = Cb/(Cr+Cg+Cb), where Cr, Cg, Cb are the R, G, B components of the corresponding pixel in the original image; and (S2) a matching cost computation step, in which the matching cost of the image is evaluated in the normalized color space. Compared with the prior art, the method of the invention is more robust.
Description
Technical field
The present invention relates to binocular stereoscopic video technology, and in particular to a matching cost computation method for a binocular stereo matching algorithm and a non-local stereo matching algorithm.
Background technology
Computer vision studies how cameras and computers can replace the human eye and cerebral nervous system to acquire information about a target and perform scene understanding, object recognition, tracking and measurement. Its research object is the two-dimensional projection of a scene containing three-dimensional information, and its goal is to extract that three-dimensional information from the two-dimensional images so as to fully recover the three-dimensional scene.
Stereo vision systems are generally divided into binocular, trinocular and multi-view systems. Trinocular and multi-view systems can be regarded as compositions of several binocular systems, and their basic principle is still that of binocular stereo vision; the binocular system can therefore be considered the minimal system in computer stereo vision. Within binocular stereo vision, the correspondence problem, i.e. stereo matching, is the most difficult and challenging step, and the accuracy and speed of matching have a great impact on the stereo vision system.
According to the constraints used, current stereo matching algorithms fall into two large classes: local and global. Local algorithms usually adopt a support-window strategy, assuming that the disparity within the support window is constant; since this assumption does not always hold, it can cause a "foreground fattening" effect. Global algorithms usually consider local color and structure information, build a global energy function over the image, and assign a disparity to each pixel through an optimization method such as belief propagation (BP), graph cuts (GC) or dynamic programming (DP). In general, current global algorithms are slower but yield more accurate disparities that reflect scene depth well; by comparison, some local algorithms run in real time but fall short in higher-precision applications.
Recently, stereo matching algorithms based on tree models have become widely used. A good example is the model disclosed in Yang Q., "A non-local cost aggregation method for stereo matching", Computer Vision and Pattern Recognition (CVPR), 2012 IEEE Conference on, IEEE, 2012: 1402-1409. In this model, the color difference between neighboring pixels serves as the weight of the edge connecting their nodes. In the resulting tree, every node influences the aggregated matching cost of every other node, so the disparity of each node is computed using information from all other nodes. Such algorithms differ from conventional local and global stereo matching algorithms and can produce high-quality disparity maps; they are called non-local stereo matching algorithms. However, when the image is turned into a four-connected neighborhood graph of edges, the edge weights are easily affected by image quality. For example, the two images of a typically captured binocular pair do not match exactly in color, and when the shooting environment has a strong influence, the computed weights are distorted.
Summary of the invention
The technical problem to be solved by this invention is, in view of the defects of the prior art described above, to provide a matching cost computation method for a binocular stereo matching algorithm that improves the accuracy and robustness of matching cost computation.
Another technical problem this invention addresses is to provide a non-local stereo matching algorithm based on a variable-weight MST that improves the accuracy of the computed disparities.
The present invention solves the aforementioned technical problems by the following technical solutions:
A matching cost computation method for a binocular stereo matching algorithm, characterized in that it comprises:
S1) a color space normalization step: RGB color space normalization is applied to the original image by the following method to obtain a normalized color space; the R, G, B color components r, g, b of each pixel of the normalized color space satisfy:
r = Cr / (Cr + Cg + Cb), g = Cg / (Cr + Cg + Cb), b = Cb / (Cr + Cg + Cb);
where Cr, Cg, Cb denote the R, G, B color components of the corresponding pixel of the original image;
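The S1 normalization step can be sketched as follows (a minimal per-pixel sketch; the function name and tuple interface are illustrative, not from the patent):

```python
def normalize_rgb(pixel):
    """Normalize one (R, G, B) pixel as in step S1: each component
    is divided by the channel sum, so the result depends on the
    chromaticity of the pixel rather than the illumination level."""
    cr, cg, cb = pixel
    s = cr + cg + cb
    if s == 0:  # guard for pure-black pixels (an assumed convention)
        return (0.0, 0.0, 0.0)
    return (cr / s, cg / s, cb / s)
```

For example, `normalize_rgb((100, 50, 50))` yields `(0.5, 0.25, 0.25)`; scaling the input by any illumination factor leaves the result unchanged, which is the property the patent relies on.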
S2) a matching cost computation step: the matching cost of the image is evaluated using the normalized color space, as follows:
for any point (i, j) in the image, the matching cost cost(i, j, d) under the disparity hypothesis d is evaluated from the normalized color components of the candidate matching pair.
Existing matching cost computations mainly consider the color difference of the candidate matching pair; the drawback of such methods is that they are easily affected by the original images. Owing to limitations of the capture conditions, the two images of a binocular pair generally exhibit a certain color deviation, which mainly manifests as offsets between some color values of the left and right images, and this affects the selection of matches. The above scheme of the present invention normalizes the color space and computes the matching cost from the normalized values, which effectively weakens the influence of capture conditions on the matching result and makes matching more accurate.
Another defect of prior-art matching cost computation is that it ignores local structure information, which can lead to mismatches. To overcome this defect, the present invention, building on the foregoing scheme, uses the relative gradient, which characterizes the structure around a pixel, as part of the matching cost. That is, the following steps are added to the foregoing scheme:
S3) a relative gradient difference computation step: the gradient of each pixel in the image is computed from the color information of the image; for each pixel in the image, a support window is established, the maximum gradient magnitude within the support window is found, and the relative gradient of the pixel is computed as the ratio of its gradient magnitude to that maximum;
then the relative gradient difference RG(i, j, d) of the matching pair is computed according to the following formula:
RG(i, j, d) = |rg_left(i, j) - rg_right(i+d, j)|; where rg_left(i, j) denotes the relative gradient of the point with coordinates (i, j) in the left image, and rg_right(i+d, j) denotes the relative gradient of the corresponding point (i+d, j) in the right image;
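The relative gradient difference of step S3 compares a left-image point with its disparity-shifted candidate in the right image; a minimal sketch (the `[j][i]` list indexing is an assumed representation, not fixed by the patent):

```python
def rg_difference(rg_left, rg_right, i, j, d):
    """Relative-gradient difference RG(i, j, d) of step S3: the
    absolute difference between the relative gradient at (i, j) in
    the left image and at the candidate match (i + d, j) in the
    right image. rg_left and rg_right are 2-D lists indexed [j][i]."""
    return abs(rg_left[j][i] - rg_right[j][i + d])
```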
S4) a matching cost Cost(i, j, d) computation step: for any point (i, j) in the image under the disparity hypothesis d, the matching cost is Cost(i, j, d) = λ·cost(i, j, d) + (1-λ)·RG(i, j, d), where λ ∈ [0, 1].
Preferably, λ < 0.5.
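The blended cost of step S4 can be sketched as below; the default λ = 0.4 is only an illustrative value respecting the patent's preference λ < 0.5, not a value the patent fixes:

```python
def combined_cost(color_cost, rg_diff, lam=0.4):
    """Step S4: blend the normalized-color matching cost with the
    relative-gradient difference. lam (the patent's lambda) lies in
    [0, 1]; with lam < 0.5 the gradient-structure term carries more
    weight than the raw color term."""
    assert 0.0 <= lam <= 1.0
    return lam * color_cost + (1.0 - lam) * rg_diff
```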
Compared with the prior art, the above technical scheme solves both the left-right color deviation problem and the mismatch problem, and is therefore more robust.
The non-local stereo matching algorithm based on a variable-weight MST of the present invention comprises a disparity map computation, characterized in that the disparity map computation comprises the following steps:
S1) a color space normalization step: RGB color space normalization is applied to the original image by the following method to obtain a normalized color space; the R, G, B color components r, g, b of each pixel of the normalized color space satisfy:
r = Cr / (Cr + Cg + Cb), g = Cg / (Cr + Cg + Cb), b = Cb / (Cr + Cg + Cb);
where Cr, Cg, Cb denote the R, G, B color components of the corresponding pixel of the original image;
S2) an initial matching cost computation step: the initial matching cost of the image is evaluated using the normalized color space, as follows:
for any point (i, j) in the image, the matching cost cost(i, j, d) under the corresponding disparity d is evaluated from the normalized color components;
S3) a relative gradient difference computation step: the gradient of each pixel in the image is computed from the color information of the image; for each pixel in the image, a support window is established, the maximum gradient magnitude within the support window is found, and the relative gradient of the pixel is computed;
then the relative gradient difference of the matching pair is computed as:
RG(i, j, d) = |rg_left(i, j) - rg_right(i+d, j)|;
S4) a matching cost Cost(i, j, d) computation step:
Cost(i, j, d) = λ·cost(i, j, d) + (1-λ)·RG(i, j, d), where λ ∈ [0, 1];
S5) a four-neighborhood graph construction step: each pixel in the image generates four edges with its surrounding four-neighborhood pixels; for the edge connecting pixels (s.i, s.j) and (r.i, r.j), the weight We is constructed by the corresponding formula;
S6) a minimum spanning tree construction step: from the four-neighborhood graph obtained in step S5), the minimum spanning tree of the image is constructed with a minimum spanning tree algorithm;
S7) a matching cost aggregation step: for any two pixels p and q there is exactly one path connecting them on the minimum spanning tree; let C_d(p) denote the matching cost of point p under the disparity hypothesis d, and let C_d^A(p) denote the aggregated matching cost of p; the aggregated matching cost is computed according to the corresponding formula;
S8) an initial disparity map acquisition step: from the aggregated matching costs obtained in step S7), an initial disparity map is obtained with the WTA algorithm;
S9) a four-neighborhood graph weight update step: the edge weights of the four-neighborhood graph are updated using the initial disparity map, as follows: for the edge connecting pixels (s.i, s.j) and (r.i, r.j), let w_pre be its weight in the original four-neighborhood graph, and let D(s) and D(r) denote the initial disparities of the two pixels; the updated weight is w_update = (1-μ)·w_pre + μ·(D(s)-D(r)); where μ ∈ [0, 1];
S10) a disparity map update step: on the basis of the updated four-neighborhood graph, the final disparity map is computed according to steps S6)-S8).
Preferably, μ > 0.5.
Preferably the method also comprises a disparity map post-processing step, which comprises:
applying a left-right consistency check to the disparity maps of the left and right images to identify the points in the image whose disparities are accurate; for the points that fail the left-right consistency check, after median filtering of the disparity map, assigning to each point with an inaccurate disparity the disparity of the nearest point whose disparity is correct.
Compared with the prior art, the above technical scheme has at least the following advantages. 1) In some real scenes, especially at object edges, sampling blurs the edges, and pixels of similar color may lie at different depths; if only color information is used, the weight of an edge between pixels that are close in color and position but lie at different depths is too large. The present invention constructs the edge weights from both the relative gradient and the color difference in the normalized space, which solves this problem well. 2) In real scenes the disparity satisfies a smoothness constraint. After obtaining the initial disparities, the present invention adjusts the weights of the four-neighborhood graph edges accordingly, combining the smoothness constraint with the weight adjustment; by updating the edge weights, constructing a new minimum spanning tree (MST) and re-aggregating the matching costs, a higher-quality disparity map is obtained.
Brief description of the drawings
Fig. 1 is the flow chart of the non-local stereo matching algorithm based on a variable-weight MST;
Fig. 2 is the flow chart of step 101 in Fig. 1;
Fig. 3 is the flow chart of step 102 in Fig. 1;
Fig. 4 is the flow chart of step 103 in Fig. 1;
Fig. 5 is the flow chart of step 104 in Fig. 1.
Embodiment
The invention is further described below with reference to the accompanying drawings and a preferred embodiment.
As shown in Fig. 1, a non-local stereo matching algorithm based on a variable-weight MST comprises the following steps:
Step 101: compute the matching cost;
Step 102: construct the four-neighborhood graph, construct the minimum spanning tree, and obtain the initial disparity map;
Step 103: update the four-neighborhood graph weights, construct a new minimum spanning tree, and update the disparity map;
Step 104: post-process the disparity map.
As shown in Fig. 2, step 101 improves on previous matching cost computations that only used the color difference of the matching pair. In the new method, the color difference of the matching pair is evaluated in the normalized color space, and relative gradient information is added, constructing a more accurate matching cost function. The main steps are as follows:
301: color space normalization.
The original RGB color space cannot distinguish matching pairs well when the images exhibit color deviation. To reduce the influence of color deviation on matching accuracy, a new method of measuring the color difference of a matching pair is proposed, as follows:
The R, G, B color components r, g, b of each pixel of the normalized color space satisfy:
r = Cr / (Cr + Cg + Cb), g = Cg / (Cr + Cg + Cb), b = Cb / (Cr + Cg + Cb);
where Cr, Cg, Cb denote the R, G, B color components of the corresponding pixel of the original image. The normalized RGB space makes the color values of a pixel independent of the lighting conditions.
The matching cost of the corresponding matching pair is then measured as shown in formula (2), where I_left^c and I_right^c denote the three-channel color intensities, i.e. the R, G, B components r, g, b of the pixels of the left and right normalized color spaces, and cost(i, j, d) denotes the matching cost of any point (i, j) under the corresponding disparity d.
302: computation of relative gradient information
The second part of the matching cost function is the relative gradient, which characterizes the structure around a pixel. It is computed as follows:
first, the gradient of each pixel in the image is computed from the color information of the image; then, for each pixel, a support window is established, the maximum gradient magnitude within the window is found, and the relative gradient of the point is obtained by formula (3):
rg(i, j) = gradient(i, j) / max(gradient(i, j))   (3)
where gradient(i, j) denotes the gradient magnitude of pixel (i, j), and the denominator max(gradient(i, j)) denotes the maximum gradient magnitude within the support window of pixel (i, j).
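Formula (3) can be sketched as below; the support-window radius is an assumed parameter, since the patent does not fix the window size:

```python
def relative_gradient(grad, i, j, radius=1):
    """Relative gradient of formula (3): the gradient magnitude at
    (i, j) divided by the maximum gradient magnitude inside a
    (2*radius+1)-square support window centred on the pixel,
    clipped at the image borders. grad is a 2-D list of gradient
    magnitudes indexed [j][i]."""
    h, w = len(grad), len(grad[0])
    window_max = max(
        grad[y][x]
        for y in range(max(0, j - radius), min(h, j + radius + 1))
        for x in range(max(0, i - radius), min(w, i + radius + 1))
    )
    if window_max == 0:  # flat region: define rg as 0 (assumed convention)
        return 0.0
    return grad[j][i] / window_max
```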
Then the relative gradient difference RG(i, j, d) of the matching pair is computed by formula (4):
RG(i, j, d) = |rg_left(i, j) - rg_right(i+d, j)|   (4)
where rg_left(i, j) denotes the relative gradient of the point with coordinates (i, j) in the left image, and rg_right(i+d, j) denotes the relative gradient of the corresponding point (i+d, j) in the right image.
303: construction and computation of the matching cost function
The matching cost function is constructed as shown in formula (5):
Cost(i, j, d) = λ·cost(i, j, d) + (1-λ)·RG(i, j, d)   (5)
with λ ∈ [0, 1], preferably λ < 0.5.
The matching cost is computed according to formula (5).
As shown in Fig. 3, step 102 comprises:
401: four-neighborhood graph construction
After the matching cost is computed, the weights of the four-neighborhood graph edges are constructed using a geodesic distance. The four-neighborhood graph is built by considering the four-neighborhood of each pixel: each point is connected to its four-neighborhood points, and each edge weight is assigned using the geodesic distance.
This edge-weight assignment is more suitable than constructing the weight directly from the color difference. In some real scenes, especially at object edges, sampling blurs the edges and pixels of similar color may lie at different depths; if only color information were used, the weight of an edge between pixels close in color and position but at different depths would be too large, whereas constructing the edge weights from the difference of the relative gradient and of the color in the normalized space solves this problem well. For the four-neighborhood graph in this algorithm, the weights are constructed as follows:
each pixel in the image generates four edges with its surrounding four-neighborhood pixels, and the weight of an edge represents the color difference of the neighboring pixels. For the edge connecting pixels (s.i, s.j) and (r.i, r.j), the weight We is constructed by the following formula:
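The four-neighborhood graph of step 401 can be sketched as below. The patent's exact weight formula We is not reproduced in this text, so the absolute difference of a per-pixel scalar feature (e.g. a blend of normalized color and relative gradient) is used as an illustrative edge weight; this is an assumption, not the patent's formula:

```python
def four_neighborhood_edges(feat):
    """Build the 4-connected neighbourhood graph of step S5/401.
    feat is a 2-D list of per-pixel scalar features indexed [y][x].
    Returns (weight, (y1, x1), (y2, x2)) tuples, one edge to the
    right and one downward per pixel, so each interior pixel ends
    up connected to all four of its neighbours."""
    h, w = len(feat), len(feat[0])
    edges = []
    for y in range(h):
        for x in range(w):
            if x + 1 < w:  # horizontal edge to the right neighbour
                edges.append((abs(feat[y][x] - feat[y][x + 1]),
                              (y, x), (y, x + 1)))
            if y + 1 < h:  # vertical edge to the lower neighbour
                edges.append((abs(feat[y][x] - feat[y + 1][x]),
                              (y, x), (y + 1, x)))
    return edges
```

For a 2x2 image this yields four edges, matching the undirected four-neighborhood structure the patent describes.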
402: minimum spanning tree construction
After the four-neighborhood graph is constructed by the above method, the minimum spanning tree (MST) of the image is built with a minimum spanning tree algorithm; Kruskal's algorithm is preferably used. The minimum spanning tree reflects the affinity between pixels in the image: from the weight construction process it can be seen that the most similar pixels are most likely in the same depth plane of the scene.
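Step 402 names Kruskal's algorithm; a standard union-find sketch over generic integer node ids (the node numbering scheme is an assumption, since the patent works directly on pixel coordinates):

```python
def kruskal_mst(n_nodes, edges):
    """Minimum spanning tree via Kruskal's algorithm (step 402).
    edges are (weight, u, v) tuples with integer node ids in
    [0, n_nodes); returns the list of edges kept in the MST."""
    parent = list(range(n_nodes))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    mst = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:  # the edge joins two components: keep it
            parent[ru] = rv
            mst.append((w, u, v))
            if len(mst) == n_nodes - 1:
                break
    return mst
```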
403: matching cost aggregation
For any two pixels p and q, there is exactly one path connecting them on the MST. Let C_d(p) denote the matching cost of point p under the disparity hypothesis d (the matching cost obtained from formula (5)), and let C_d^A(p) denote the aggregated matching cost of p. Formula (7) expresses the non-local matching cost aggregation:
C_d^A(p) = Σ_{q∈O} S(p, q)·C_d(q)   (7)
where O is the set of all pixels of the image, and S(p, q) is defined by formula (8):
S(p, q) = exp(-D(p, q)/δ)   (8)
δ is a constant whose suitable value can be determined by experimental comparison, for example by testing 0.05, 0.1 and 0.3 respectively; in the algorithm of this embodiment it is set to 0.1. D(p, q) denotes the distance between pixels p and q on the MST, i.e. the total length of the edges on the path connecting p and q.
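A brute-force sketch of the aggregation step, assuming the exponential similarity S(p, q) = exp(-D(p, q)/δ) of the cited Yang (CVPR 2012) method and a precomputed all-pairs tree-distance matrix (efficient tree-based recursions exist, but are omitted here for clarity):

```python
import math

def aggregate_costs(cost, dist, delta=0.1):
    """Non-local cost aggregation of formulas (7)/(8): the
    aggregated cost of pixel p is the sum over every pixel q of
    exp(-dist[p][q] / delta) * cost[q], where dist[p][q] is the
    path length between p and q on the MST. delta=0.1 matches the
    embodiment's preferred value."""
    n = len(cost)
    return [
        sum(math.exp(-dist[p][q] / delta) * cost[q] for q in range(n))
        for p in range(n)
    ]
```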
404: obtaining the initial disparity map
After the matching costs have been aggregated with the formulas above, the initial disparity map is obtained with the WTA algorithm, as follows: after aggregation, each pixel has a matching cost under every disparity hypothesis; the winner-take-all (WTA) algorithm selects the minimum among these matching costs, and the disparity corresponding to this minimum is the disparity of the point. In this way the disparity of every pixel in the image is obtained.
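The winner-take-all selection of step 404 for a single pixel can be sketched as:

```python
def wta_disparity(costs_per_disparity):
    """Winner-take-all (step 404): for one pixel, return the
    disparity index whose aggregated matching cost is smallest.
    costs_per_disparity is a list indexed by disparity hypothesis."""
    best_d = 0
    for d, c in enumerate(costs_per_disparity):
        if c < costs_per_disparity[best_d]:
            best_d = d
    return best_d
```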
As shown in Fig. 4, step 103 comprises:
501: updating the four-neighborhood graph
The edge weights of the four-neighborhood graph are updated using the initial disparity map, as follows: for the edge connecting pixels (s.i, s.j) and (r.i, r.j), let w_pre be its weight in the original four-neighborhood graph, and let D(s) and D(r) denote the initial disparities of the two pixels. The updated weight is:
w_update = (1-μ)·w_pre + μ·(D(s)-D(r))   (9)
where μ ∈ [0, 1], preferably μ > 0.5.
Within the same disparity level, the weight of the edge between two pixels should be smaller; the present invention therefore reassigns the weights taking into account both the original weight and the disparity difference of the two pixels.
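The variable-weight update of formula (9) can be sketched as below. The absolute value on the disparity difference is an assumption (edge weights should not be negative; the patent's text shows only the plain difference), and μ = 0.6 is an illustrative choice respecting the preference μ > 0.5:

```python
def update_edge_weight(w_pre, d_s, d_r, mu=0.6):
    """Variable-weight MST update of formula (9): blend the original
    four-neighborhood edge weight with the disparity difference of
    the edge's two endpoints, so edges crossing a disparity jump
    become heavier and edges inside a smooth region become lighter."""
    assert 0.0 <= mu <= 1.0
    return (1.0 - mu) * w_pre + mu * abs(d_s - d_r)
```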
502: updating the disparity map
On the basis of the updated four-neighborhood graph, a new minimum spanning tree is reconstructed according to the method of step 402.
503: matching cost aggregation
On the basis of the new minimum spanning tree, the matching costs are aggregated according to the method of step 403.
504: on the basis of the aggregated matching costs obtained in 503, the final disparity map is computed by the method of step 404.
104: disparity map post-processing
With the method above, the disparity maps of the left view and the right view are obtained separately. Once the horizontal disparity maps are available, effective post-processing using additional scene information yields a smoother and more accurate disparity map. As shown in Fig. 5, the post-processing in this algorithm comprises the following steps:
a left-right consistency check identifies the points in the image whose disparities are accurate; for the points that fail the left-right consistency check, after median filtering, each point with an inaccurate disparity is assigned the disparity of the nearest point whose disparity is correct.
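The left-right consistency check can be sketched as below, following this document's convention that the point corresponding to (i, j) in the left image is (i+d, j) in the right image; the tolerance `tol` is an assumed parameter, not fixed by the patent:

```python
def lr_consistency_mask(disp_left, disp_right, tol=1):
    """Left-right consistency check of the post-processing step:
    a pixel (i, j) of the left disparity map is marked reliable
    when the right disparity map, read at the corresponding point
    (i + d, j), agrees with d within tol. Maps are 2-D lists
    indexed [j][i]; out-of-range correspondences are unreliable."""
    h, w = len(disp_left), len(disp_left[0])
    mask = [[False] * w for _ in range(h)]
    for j in range(h):
        for i in range(w):
            d = disp_left[j][i]
            if 0 <= i + d < w and abs(disp_right[j][i + d] - d) <= tol:
                mask[j][i] = True
    return mask
```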
Thanks to the series of measures proposed in this algorithm, a high-quality disparity map can be obtained.
The above further describes the present invention with reference to specific preferred embodiments, but the specific implementation of the invention shall not be considered limited to these descriptions. For those skilled in the art, equivalent substitutions or obvious modifications with the same performance or use, made without departing from the concept of the invention, shall all be considered to fall within the scope of protection of the invention.
Claims (6)
1. A matching cost computation method for a binocular stereo matching algorithm, characterized in that it comprises:
S1) a color space normalization step: RGB color space normalization is applied to the original image by the following method to obtain a normalized color space; the R, G, B color components r, g, b of each pixel of the normalized color space satisfy:
r = Cr / (Cr + Cg + Cb), g = Cg / (Cr + Cg + Cb), b = Cb / (Cr + Cg + Cb);
where Cr, Cg, Cb denote the R, G, B color components of the corresponding pixel of the original image;
S2) a matching cost computation step: the matching cost of the image is evaluated using the normalized color space, as follows: for any point (i, j) in the image, the matching cost cost(i, j, d) under the disparity hypothesis d is evaluated from the normalized color components.
2. A matching cost computation method for a binocular stereo matching algorithm, characterized in that it comprises:
S1) a color space normalization step: RGB color space normalization is applied to the original image by the following method to obtain a normalized color space; the R, G, B color components r, g, b of each pixel of the normalized color space satisfy:
r = Cr / (Cr + Cg + Cb), g = Cg / (Cr + Cg + Cb), b = Cb / (Cr + Cg + Cb);
where Cr, Cg, Cb denote the R, G, B color components of the corresponding pixel of the original image;
S2) an initial matching cost computation step: the initial matching cost of the image is evaluated using the normalized color space, as follows: for any point (i, j) in the image, the initial matching cost cost(i, j, d) under the disparity hypothesis d is evaluated from the normalized color components;
S3) a relative gradient difference computation step: the gradient of each pixel in the image is computed from the color information of the image; for each pixel in the image, a support window is established, the maximum gradient magnitude within the support window is found, and the relative gradient of the pixel is computed;
then the relative gradient difference RG(i, j, d) of the matching pair is computed according to the following formula:
RG(i, j, d) = |rg_left(i, j) - rg_right(i+d, j)|, where rg_left(i, j) denotes the relative gradient of the point with coordinates (i, j) in the left image, and rg_right(i+d, j) denotes the relative gradient of the corresponding point (i+d, j) in the right image;
S4) a matching cost Cost(i, j, d) computation step: for any point (i, j) in the image under the disparity hypothesis d, the matching cost is Cost(i, j, d) = λ·cost(i, j, d) + (1-λ)·RG(i, j, d), where λ ∈ [0, 1].
3. The method according to claim 2, characterized in that λ < 0.5.
4. A non-local stereo matching algorithm based on a variable-weight MST, comprising a disparity map computation, characterized in that the disparity map computation comprises the following steps:
S1) a color space normalization step: RGB color space normalization is applied to the original image by the following method to obtain a normalized color space; the R, G, B color components r, g, b of each pixel of the normalized color space satisfy:
r = Cr / (Cr + Cg + Cb), g = Cg / (Cr + Cg + Cb), b = Cb / (Cr + Cg + Cb);
where Cr, Cg, Cb denote the R, G, B color components of the corresponding pixel of the original image;
S2) an initial matching cost computation step: the initial matching cost of the image is evaluated using the normalized color space, as follows: for any point (i, j) in the image, the matching cost cost(i, j, d) under the corresponding disparity d is evaluated from the normalized color components;
S3) a relative gradient difference computation step: the gradient of each pixel in the image is computed from the color information of the image; for each pixel in the image, a support window is established, the maximum gradient magnitude within the support window is found, and the relative gradient of the pixel is computed;
then the relative gradient difference RG(i, j, d) of the matching pair is computed according to the following formula:
RG(i, j, d) = |rg_left(i, j) - rg_right(i+d, j)|, where rg_left(i, j) denotes the relative gradient of the point with coordinates (i, j) in the left image, and rg_right(i+d, j) denotes the relative gradient of the corresponding point (i+d, j) in the right image;
S4) a matching cost Cost(i, j, d) computation step:
Cost(i, j, d) = λ·cost(i, j, d) + (1-λ)·RG(i, j, d), where λ ∈ [0, 1];
S5) a four-neighborhood graph construction step: each pixel in the image generates four edges with its surrounding four-neighborhood pixels; for the edge connecting pixels (s.i, s.j) and (r.i, r.j), the weight We is constructed by the corresponding formula;
S6) a minimum spanning tree construction step: from the four-neighborhood graph obtained in step S5), the minimum spanning tree of the image is constructed with a minimum spanning tree algorithm;
S7) a matching cost aggregation step: for any two pixels p and q there is exactly one path connecting them on the minimum spanning tree; let C_d(p) denote the matching cost of point p under the disparity hypothesis d, and let C_d^A(p) denote the aggregated matching cost of p; the aggregated matching cost is computed according to the corresponding formula;
S8) an initial disparity map acquisition step: from the aggregated matching costs obtained in step S7), the initial disparity map is obtained with the WTA algorithm;
S9) a four-neighborhood graph weight update step: the edge weights of the four-neighborhood graph are updated using the initial disparity map, as follows: for the edge connecting pixels (s.i, s.j) and (r.i, r.j), let w_pre be its weight in the original four-neighborhood graph, and let D(s) and D(r) denote the initial disparities of the two pixels; the updated weight is w_update = (1-μ)·w_pre + μ·(D(s)-D(r)); where μ ∈ [0, 1];
S10) a disparity map update step: on the basis of the updated four-neighborhood graph, the final disparity map is computed according to steps S6)-S8).
5. The method according to claim 4, characterized in that μ > 0.5.
6. The method according to claim 4, characterized in that it further comprises a disparity map post-processing step, which comprises:
applying a left-right consistency check to the disparity maps of the left and right images to identify the points in the image whose disparities are accurate; for the points that fail the left-right consistency check, after median filtering of the disparity map, assigning to each point with an inaccurate disparity the disparity of the nearest point whose disparity is correct.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310634040.7A CN103646396B (en) | 2013-11-29 | 2013-11-29 | Matching cost computation method of binocular stereo matching algorithm and non-local stereo matching algorithm
Publications (2)
Publication Number | Publication Date |
---|---|
CN103646396A true CN103646396A (en) | 2014-03-19 |
CN103646396B CN103646396B (en) | 2016-08-17 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070031037A1 (en) * | 2005-08-02 | 2007-02-08 | Microsoft Corporation | Stereo image segmentation |
US20120008857A1 (en) * | 2010-07-07 | 2012-01-12 | Electronics And Telecommunications Research Institute | Method of time-efficient stereo matching |
CN102385752A (en) * | 2011-11-01 | 2012-03-21 | 清华大学深圳研究生院 | Stereo matching method based on distance difference and color difference |
CN103310421A (en) * | 2013-06-27 | 2013-09-18 | 清华大学深圳研究生院 | Rapid stereo matching method and disparity map obtaining method both aiming at high-definition image pair |
- 2013-11-29 CN application CN201310634040.7A granted as patent CN103646396B (en), status: Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070031037A1 (en) * | 2005-08-02 | 2007-02-08 | Microsoft Corporation | Stereo image segmentation |
US20120008857A1 (en) * | 2010-07-07 | 2012-01-12 | Electronics And Telecommunications Research Institute | Method of time-efficient stereo matching |
CN102385752A (en) * | 2011-11-01 | 2012-03-21 | 清华大学深圳研究生院 | Stereo matching method based on distance difference and color difference |
CN103310421A (en) * | 2013-06-27 | 2013-09-18 | 清华大学深圳研究生院 | Rapid stereo matching method and disparity map obtaining method both aiming at high-definition image pair |
Non-Patent Citations (5)
Title |
---|
ANDREAS KOSCHAN: "Dense stereo correspondence using polychromatic block matching", 《COMPUTER ANALYSIS OF IMAGES AND PATTERNS. 5TH INTERNATIONAL CONFERENCE, CAIP′93 PROCEEDINGS》, 13 September 1993 (1993-09-13) * |
QINGXIONG YANG: "A non-local cost aggregation method for stereo matching", 《2012 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION(CVPR)》, 16 June 2012 (2012-06-16), pages 1402 - 1409, XP 032232226, DOI: doi:10.1109/CVPR.2012.6247827 * |
XING MEI ET AL.: "Segment-Tree Based Cost Aggregation for Stereo Matching", 《2013 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION》, 23 June 2013 (2013-06-23), pages 313 - 320, XP 032493084, DOI: doi:10.1109/CVPR.2013.47 * |
HE HUAJUN ET AL.: "Disparity estimation algorithm based on gradient and MRF model", 《Journal of Xidian University (Natural Science Edition)》, vol. 34, no. 3, 30 June 2007 (2007-06-30), pages 373 - 376 * |
DI HONGWEI ET AL.: "A fast binocular vision stereo matching algorithm", 《Acta Optica Sinica》, vol. 29, no. 8, 31 August 2009 (2009-08-31), pages 2180 - 2184 * |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103985128B (en) * | 2014-05-23 | 2017-03-15 | 南京理工大学 | Stereo matching method based on color correlation and adaptive support weight |
CN103985128A (en) * | 2014-05-23 | 2014-08-13 | 南京理工大学 | Three-dimensional matching method based on color intercorrelation and self-adaptive supporting weight |
CN104091339A (en) * | 2014-07-17 | 2014-10-08 | 清华大学深圳研究生院 | Rapid image three-dimensional matching method and device |
CN104091339B (en) * | 2014-07-17 | 2017-01-11 | 清华大学深圳研究生院 | Rapid image three-dimensional matching method and device |
CN105761270B (en) * | 2016-03-15 | 2018-11-27 | 杭州电子科技大学 | Tree filtering stereo matching method based on epipolar line distance transformation |
CN105761270A (en) * | 2016-03-15 | 2016-07-13 | 杭州电子科技大学 | Tree type filtering three-dimensional coupling method based on epipolar line linear distance transformation |
CN106504276A (en) * | 2016-10-25 | 2017-03-15 | 桂林电子科技大学 | The combinations matches cost algorithms of non local Stereo Matching Algorithm and parallax joint filling algorithm |
CN106504276B (en) * | 2016-10-25 | 2019-02-19 | 桂林电子科技大学 | Non-local stereo matching method |
CN107045713A (en) * | 2017-04-12 | 2017-08-15 | 湖南源信光电科技股份有限公司 | Enhancement method of low-illumination image based on census Stereo matchings |
CN107341823A (en) * | 2017-06-06 | 2017-11-10 | 东北大学 | A kind of minimum branch's solid matching method of feature based fusion |
CN107341823B (en) * | 2017-06-06 | 2019-08-09 | 东北大学 | Minimum-branch stereo matching method based on feature fusion |
CN107767393A (en) * | 2017-09-07 | 2018-03-06 | 南京信息工程大学 | A kind of scene flows method of estimation towards mobile hardware |
CN107767393B (en) * | 2017-09-07 | 2021-05-25 | 南京信息工程大学 | Scene flow estimation method for mobile hardware |
CN109522833A (en) * | 2018-11-06 | 2019-03-26 | 深圳市爱培科技术股份有限公司 | Binocular vision stereo matching method and system for road detection |
CN109493373B (en) * | 2018-11-07 | 2020-11-10 | 上海为森车载传感技术有限公司 | Stereo matching method based on binocular stereo vision |
CN109493373A (en) * | 2018-11-07 | 2019-03-19 | 上海为森车载传感技术有限公司 | Stereo matching method based on binocular stereo vision |
CN110009254A (en) * | 2019-04-16 | 2019-07-12 | 重庆大学 | Tidal current energy generating field collector system planing method based on variable weight minimum spanning tree |
CN110009254B (en) * | 2019-04-16 | 2020-12-15 | 重庆大学 | Tidal current energy power generation field collecting system planning method based on variable-weight minimum spanning tree |
CN110176037A (en) * | 2019-05-31 | 2019-08-27 | 东北大学 | Fast target distance estimation method for outdoor road assisted driving |
CN110675319A (en) * | 2019-09-12 | 2020-01-10 | 创新奇智(成都)科技有限公司 | Mobile phone panoramic image stitching method based on minimum spanning tree |
CN110675319B (en) * | 2019-09-12 | 2020-11-03 | 创新奇智(成都)科技有限公司 | Mobile phone photographing panoramic image splicing method based on minimum spanning tree |
CN110702015A (en) * | 2019-09-26 | 2020-01-17 | 中国南方电网有限责任公司超高压输电公司曲靖局 | Method and device for measuring icing thickness of power transmission line |
CN112306064A (en) * | 2020-11-04 | 2021-02-02 | 河北省机电一体化中试基地 | RGV control system and method for binocular vision identification |
CN112348871A (en) * | 2020-11-16 | 2021-02-09 | 长安大学 | Local stereo matching method |
CN112348871B (en) * | 2020-11-16 | 2023-02-10 | 长安大学 | Local stereo matching method |
Also Published As
Publication number | Publication date |
---|---|
CN103646396B (en) | 2016-08-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103646396A (en) | Matching cost algorithm of binocular stereo matching algorithm, and non-local stereo matching algorithm | |
CN107220997B (en) | Stereo matching method and system | |
CN101625768B (en) | Three-dimensional human face reconstruction method based on stereoscopic vision | |
CN103310421A (en) | Rapid stereo matching method and disparity map obtaining method both aiming at high-definition image pair | |
CN101271578B (en) | Depth sequence generation method for converting 2D video into stereo video | |
CN101877143B (en) | Three-dimensional scene reconstruction method of two-dimensional image group | |
CN104318569A (en) | Space salient region extraction method based on depth variation model | |
CN105528785A (en) | Binocular visual image stereo matching method | |
EP3308323B1 (en) | Method for reconstructing 3d scene as 3d model | |
CN104376552A (en) | Virtual-real registering algorithm of 3D model and two-dimensional image | |
CN101866497A (en) | Binocular stereo vision based intelligent three-dimensional human face reconstruction method and system | |
CN106447661A (en) | Rapid depth image generating method | |
CN108010075B (en) | Local stereo matching method based on multi-feature combination | |
CN104517317A (en) | Three-dimensional reconstruction method of vehicle-borne infrared images | |
CN104156957A (en) | Stable and high-efficiency high-resolution stereo matching method | |
CN106600632A (en) | Improved matching cost aggregation stereo matching algorithm | |
CN103065351A (en) | Binocular three-dimensional reconstruction method | |
CN103955945A (en) | Adaptive color image segmentation method based on binocular disparity and active contour | |
CN108629809B (en) | Accurate and efficient stereo matching method | |
CN103971366A (en) | Stereoscopic matching method based on double-weight aggregation | |
CN103268482B (en) | Low-complexity gesture extraction and gesture depth acquisition method | |
CN102447917A (en) | Three-dimensional image matching method and equipment thereof | |
CN106295657A (en) | Method for extracting human height features during video data structuring | |
CN109166171A (en) | Structure-from-motion three-dimensional reconstruction method based on global and incremental estimation | |
CN107274448B (en) | Variable weight cost aggregation stereo matching algorithm based on horizontal tree structure |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |