CN107248179A - Stereo matching establishment method for disparity computation - Google Patents


Info

Publication number
CN107248179A
CN107248179A
Authority
CN
China
Prior art keywords
disparity
sub-image
gradient
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710427806.2A
Other languages
Chinese (zh)
Inventor
周姝
魏能强
兰波
杨科
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
API ZC Chengdu Precision Instrument Co Ltd
Original Assignee
API ZC Chengdu Precision Instrument Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by API ZC Chengdu Precision Instrument Co Ltd filed Critical API ZC Chengdu Precision Instrument Co Ltd
Priority to CN201710427806.2A
Publication of CN107248179A
Legal status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a stereo matching establishment method for disparity computation: (a) calibrating the two cameras separately; (b) designating the images captured by the two cameras as the master image and the sub-image respectively, and establishing the fundamental-matrix relation of the two cameras: F = Ar^(−T) [t]× R Al^(−1); (c) establishing the epipolar constraint: m'^T F m = 0; (d) obtaining the neighbourhood images around a feature point in the master image and the sub-image, and computing the gray mean T̄ of the neighbourhood image in the master image, the gray value I of the sub-image, and the gray mean Ī of the neighbourhood image in the sub-image; (e) computing the gray similarity S; (f) computing the disparity gradient d_gr; (g) removing false match values according to the computed gray similarity and disparity gradient, and performing stereo matching. The invention addresses the prior-art problem that gray-level and disparity-gradient interference degrades stereo matching accuracy: by quantifying gray level and disparity gradient it eliminates false matches and improves accuracy.

Description

Stereo matching establishment method for disparity computation
Technical field
The present invention relates to the field of three-dimensional measurement, and in particular to a stereo matching establishment method for disparity computation.
Background technology
Parallax is the difference in the apparent direction of the same target observed from two points separated by a certain distance. The angle subtended at the target by the two viewpoints is called the parallactic angle, and the distance between the two viewpoints is called the baseline. Once the parallactic angle and the baseline length are known, the distance between the target and the observer can be calculated. In the field of three-dimensional measurement, parallax refers to the positional difference of the same scene point between the images obtained from two cameras. Stereo matching is the technique of establishing correspondences between selected features, relating the projections of the same spatial point in different images, and thereby obtaining a disparity image. Traditional stereo matching is carried out on the basis of camera calibration; owing to interference from gray level and disparity gradient, false matches occur easily and degrade stereo matching accuracy.
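For illustration, the distance computation described above reduces, in the common rectified-camera case, to the relation Z = f·B/d (depth Z, focal length f, baseline B, disparity d). The background does not state this formula explicitly, so the sketch below is an illustrative assumption rather than part of the disclosed method:

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Depth from disparity in a rectified stereo pair: Z = f * B / d.

    f_px: focal length in pixels; baseline_m: distance between the two
    camera centres in metres; disparity_px: horizontal offset of the
    same scene point between the two images, in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return f_px * baseline_m / disparity_px
```

For example, with a 700-pixel focal length, a 0.1 m baseline and a 7-pixel disparity, the point lies 10 m from the cameras; doubling the disparity halves the depth.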
Summary of the invention
The object of the present invention is to provide a stereo matching establishment method for disparity computation that solves the prior-art problem of gray-level and disparity-gradient interference with stereo matching accuracy, by quantifying gray level and disparity gradient so as to eliminate false matches and improve accuracy.
The present invention is achieved through the following technical solutions:
The stereo matching establishment method for disparity computation comprises the following steps:
(a) Calibrate the two cameras separately to obtain the extrinsic rotation matrix R, the translation vector t, and the two cameras' respective intrinsic matrices Al and Ar;
(b) Designate the images captured by the two cameras as the master image and the sub-image respectively, and establish the fundamental-matrix relation of the two cameras: F = Ar^(−T) [t]× R Al^(−1), where [t]× is the skew-symmetric matrix of t;
(c) Establish the epipolar constraint: m'^T F m = 0, where m and m' are a pair of matched feature points in the master image and the sub-image respectively;
(d) Obtain the neighbourhood images around the feature point in the master image and the sub-image, and compute the gray mean T̄ of the neighbourhood image in the master image, the gray value I of the sub-image, and the gray mean Ī of the neighbourhood image in the sub-image;
(e) Compute the gray similarity S:
S(x, y) = Σ_{y'=0}^{n−1} Σ_{x'=0}^{m−1} (T(x', y') − T̄)(I(x + x', y + y') − Ī)
where T is the gray value of the master image, (x, y) are the coordinates of point m, and (x', y') are the coordinates of point m';
(f) Compute the disparity gradient d_gr:
d_gr = |d_a − d_b| / |d_cs(a_m, b_m)|
where d_a is the coordinate difference (disparity) of one matched pair of feature points, d_b is the coordinate difference of another matched pair, and d_cs(a_m, b_m) is the vector joining the midpoints of the two match-point segments;
(g) Remove false match values according to the computed gray similarity and disparity gradient, substitute the remaining match points into the epipolar constraint, and perform stereo matching.
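Steps (a)-(c) can be sketched numerically as follows (a minimal NumPy illustration, not the patent's implementation; the function names are ours): the fundamental matrix is assembled as F = Ar^(−T) [t]× R Al^(−1), and a candidate pair (m, m') is checked via the residual |m'^T F m|, which is near zero for a correct match.

```python
import numpy as np

def fundamental_matrix(A_l, A_r, R, t):
    """F = Ar^(-T) [t]x R Al^(-1), where [t]x is the skew-symmetric
    cross-product matrix built from the translation vector t."""
    t_x = np.array([[0.0, -t[2], t[1]],
                    [t[2], 0.0, -t[0]],
                    [-t[1], t[0], 0.0]])
    return np.linalg.inv(A_r).T @ t_x @ R @ np.linalg.inv(A_l)

def epipolar_residual(F, m, m_prime):
    """|m'^T F m| with m, m' taken as homogeneous pixel coordinates;
    ideally zero when the pair satisfies the epipolar constraint."""
    m_h = np.array([m[0], m[1], 1.0])
    mp_h = np.array([m_prime[0], m_prime[1], 1.0])
    return float(abs(mp_h @ F @ m_h))
```

With identity intrinsics, R = I and a purely horizontal translation, F reduces to [t]×, and any two points on the same image row satisfy the constraint exactly.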
To address the prior-art problem of gray-level and disparity-gradient interference with stereo matching accuracy, the present invention proposes a stereo matching establishment method for disparity computation. First the two cameras are calibrated to obtain the extrinsic rotation matrix R, the translation vector t, and the cameras' respective intrinsic matrices Al and Ar; any existing calibration method may be used, so calibration is not described further here. The images captured by the two cameras are designated the master image and the sub-image respectively, and the fundamental-matrix relation of the two cameras is established: F = Ar^(−T) [t]× R Al^(−1). Feature points are then selected on the surface of the measured object, and the epipolar constraint m'^T F m = 0 is established. Image analysis yields the neighbourhood images around a feature point in the master image and the sub-image, from which the gray mean T̄ of the master-image neighbourhood, the gray value I of the sub-image, and the gray mean Ī of the sub-image neighbourhood are computed. Substituting the two-dimensional coordinates of the points m and m', the gray similarity S is calculated as
S(x, y) = Σ_{y'=0}^{n−1} Σ_{x'=0}^{m−1} (T(x', y') − T̄)(I(x + x', y + y') − Ī)
The disparity gradient is then calculated:
d_gr = |d_a − d_b| / |d_cs(a_m, b_m)|
Once the gray similarity and the disparity gradient have been computed, matches whose gray similarity is too low or whose disparity gradient is too large are excluded according to the required measurement accuracy; the remaining match points are substituted into the epipolar constraint, and stereo matching is performed. Compared with traditional stereo matching, the present invention overcomes the errors introduced by gray level and disparity gradient, eliminates false matches, and can improve measurement accuracy to a large degree.
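The gray-similarity computation can be sketched as below. The first function follows the patent's sum verbatim; because the 0.8 rejection threshold applied later presupposes a bounded score, a zero-mean normalized variant is also shown, which is our assumption rather than something the text spells out:

```python
import numpy as np

def gray_similarity(T_patch, I_img, x, y):
    """S(x, y) = sum over (x', y') of
    (T(x', y') - mean(T)) * (I(x + x', y + y') - mean(I patch)),
    with the master-image template T of size n rows by m columns."""
    n, m = T_patch.shape
    T = T_patch.astype(float)
    I_patch = I_img[y:y + n, x:x + m].astype(float)
    return float(np.sum((T - T.mean()) * (I_patch - I_patch.mean())))

def gray_similarity_normalized(T_patch, I_img, x, y):
    """Normalized variant bounded in [-1, 1], against which a 0.8
    threshold is meaningful (an assumed normalization)."""
    n, m = T_patch.shape
    Tc = T_patch.astype(float) - T_patch.mean()
    Ic = I_img[y:y + n, x:x + m].astype(float)
    Ic = Ic - Ic.mean()
    denom = np.sqrt(np.sum(Tc ** 2) * np.sum(Ic ** 2))
    return float(np.sum(Tc * Ic) / denom) if denom > 0 else 0.0
```

When the sub-image window is identical to the template, the normalized score is exactly 1; it falls toward 0 as the two neighbourhoods decorrelate.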
Preferably, match points whose gray similarity is less than 0.8 are removed as false match values.
Preferably, match points whose disparity gradient is greater than 0.2 are removed as false match values.
Preferably, the order of steps (e) and (f) is interchangeable.
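The disparity-gradient computation of step (f) and the threshold-based rejection of step (g), with the preferred 0.8 and 0.2 values, can be sketched as follows (the candidate-list API and point names are illustrative assumptions):

```python
import numpy as np

def disparity_gradient(a_l, a_r, b_l, b_r):
    """d_gr = |d_a - d_b| / |d_cs(a_m, b_m)| for two candidate pairs
    (a_l, a_r) and (b_l, b_r): d_a and d_b are each pair's coordinate
    difference (disparity vector), and d_cs is the vector joining the
    midpoints of the two match segments."""
    a_l, a_r = np.asarray(a_l, float), np.asarray(a_r, float)
    b_l, b_r = np.asarray(b_l, float), np.asarray(b_r, float)
    d_a, d_b = a_r - a_l, b_r - b_l
    d_cs = (b_l + b_r) / 2.0 - (a_l + a_r) / 2.0
    return float(np.linalg.norm(d_a - d_b) / np.linalg.norm(d_cs))

def filter_matches(candidates, s_min=0.8, g_max=0.2):
    """Keep only candidates meeting the preferred thresholds: gray
    similarity at least s_min and disparity gradient at most g_max.
    Each candidate is a (match, similarity, gradient) triple."""
    return [m for m, s, g in candidates if s >= s_min and g <= g_max]
```

Two pairs with identical disparities give d_gr = 0, while matches whose disparities diverge relative to their separation yield large gradients and are rejected.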
Compared with the prior art, the present invention has the following advantages and beneficial effects:
The stereo matching establishment method for disparity computation of the present invention overcomes the errors introduced by gray level and disparity gradient, eliminates false matches, and can largely improve measurement accuracy.
Brief description of the drawings
The accompanying drawing described here provides a further understanding of the embodiments of the invention and constitutes a part of the application; it does not limit the embodiments of the invention. In the drawing:
Fig. 1 is a schematic flowchart of a specific embodiment of the invention.
Embodiment
To make the object, technical solutions and advantages of the present invention clearer, the invention is described in further detail below with reference to an embodiment and the accompanying drawing. The exemplary embodiment and its explanation serve only to explain the invention and do not limit it.
Embodiment 1:
As shown in Fig. 1, the stereo matching establishment method for disparity computation comprises the following steps: (a) calibrate the two cameras separately to obtain the extrinsic rotation matrix R, the translation vector t, and the two cameras' respective intrinsic matrices Al and Ar; (b) designate the images captured by the two cameras as the master image and the sub-image respectively, and establish the fundamental-matrix relation of the two cameras: F = Ar^(−T) [t]× R Al^(−1), where T denotes the gray value of the master image in the formulas below; (c) establish the epipolar constraint m'^T F m = 0, where m and m' are a pair of matched feature points in the master image and the sub-image; (d) obtain the neighbourhood images around the feature point in the master image and the sub-image, and compute the gray mean T̄ of the neighbourhood image in the master image, the gray value I of the sub-image, and the gray mean Ī of the neighbourhood image in the sub-image; (e) compute the gray similarity S:
S(x, y) = Σ_{y'=0}^{n−1} Σ_{x'=0}^{m−1} (T(x', y') − T̄)(I(x + x', y + y') − Ī)
where (x, y) are the coordinates of point m and (x', y') are the coordinates of point m'; (f) compute the disparity gradient d_gr:
d_gr = |d_a − d_b| / |d_cs(a_m, b_m)|
where d_a is the coordinate difference of one matched pair of feature points, d_b is the coordinate difference of another matched pair, and d_cs(a_m, b_m) is the vector joining the midpoints of the two match-point segments; (g) remove false match values according to the computed gray similarity and disparity gradient, substitute the remaining match points into the epipolar constraint, and perform stereo matching. Here, match points whose gray similarity is less than 0.8 are removed as false match values, and match points whose disparity gradient is greater than 0.2 are removed as false match values.
The above embodiment further describes the object, technical solution and beneficial effects of the present invention in detail. It should be understood that the foregoing is only an embodiment of the present invention and is not intended to limit its scope of protection; any modification, equivalent substitution, improvement and the like made within the spirit and principles of the invention shall fall within the scope of protection of the present invention.

Claims (4)

1. A stereo matching establishment method for disparity computation, characterized by comprising the following steps:
(a) calibrating the two cameras separately to obtain the extrinsic rotation matrix R, the translation vector t, and the two cameras' respective intrinsic matrices Al and Ar;
(b) designating the images captured by the two cameras as the master image and the sub-image respectively, and establishing the fundamental-matrix relation of the two cameras: F = Ar^(−T) [t]× R Al^(−1), wherein T is the gray value of the master image;
(c) establishing the epipolar constraint: m'^T F m = 0, wherein m and m' are a pair of matched feature points in the master image and the sub-image;
(d) obtaining the neighbourhood images around the feature point in the master image and the sub-image, and computing the gray mean T̄ of the neighbourhood image in the master image, the gray value I of the sub-image, and the gray mean Ī of the neighbourhood image in the sub-image;
(e) computing the gray similarity S:
S(x, y) = Σ_{y'=0}^{n−1} Σ_{x'=0}^{m−1} (T(x', y') − T̄)(I(x + x', y + y') − Ī)
wherein (x, y) are the coordinates of point m and (x', y') are the coordinates of point m';
(f) computing the disparity gradient d_gr:
d_gr = |d_a − d_b| / |d_cs(a_m, b_m)|
wherein d_a is the coordinate difference of one matched pair of feature points, d_b is the coordinate difference of another matched pair, and d_cs(a_m, b_m) is the vector joining the midpoints of the two match-point segments;
(g) removing false match values according to the computed gray similarity and disparity gradient, substituting the remaining match points into the epipolar constraint, and performing stereo matching.
2. The stereo matching establishment method for disparity computation according to claim 1, characterized in that match points whose gray similarity is less than 0.8 are removed as false match values.
3. The stereo matching establishment method for disparity computation according to claim 1, characterized in that match points whose disparity gradient is greater than 0.2 are removed as false match values.
4. The stereo matching establishment method for disparity computation according to claim 1, characterized in that the order of steps (e) and (f) is interchangeable.
CN201710427806.2A 2017-06-08 2017-06-08 Stereo matching establishment method for disparity computation Pending CN107248179A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710427806.2A CN107248179A (en) 2017-06-08 2017-06-08 Stereo matching establishment method for disparity computation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710427806.2A CN107248179A (en) 2017-06-08 2017-06-08 Stereo matching establishment method for disparity computation

Publications (1)

Publication Number Publication Date
CN107248179A true CN107248179A (en) 2017-10-13

Family

ID=60017881

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710427806.2A Pending CN107248179A (en) 2017-06-08 2017-06-08 Stereo matching establishment method for disparity computation

Country Status (1)

Country Link
CN (1) CN107248179A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116797463A (en) * 2023-08-22 2023-09-22 佗道医疗科技有限公司 Feature point pair extraction method and image stitching method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101887589A (en) * 2010-06-13 2010-11-17 东南大学 Stereoscopic vision-based real low-texture image reconstruction method
CN103310421A (en) * 2013-06-27 2013-09-18 清华大学深圳研究生院 Rapid stereo matching method and disparity map obtaining method both aiming at high-definition image pair
CN104268880A (en) * 2014-09-29 2015-01-07 沈阳工业大学 Depth information obtaining method based on combination of features and region matching
CN104680510A (en) * 2013-12-18 2015-06-03 北京大学深圳研究生院 RADAR parallax image optimization method and stereo matching parallax image optimization method and system
CN106183995A (en) * 2016-07-26 2016-12-07 武汉大学 A kind of visual parking device method based on stereoscopic vision
CN106228605A (en) * 2016-07-29 2016-12-14 东南大学 A kind of Stereo matching three-dimensional rebuilding method based on dynamic programming
CN106600686A (en) * 2016-12-06 2017-04-26 西安电子科技大学 Three-dimensional point cloud reconstruction method based on multiple uncalibrated images
CN106780442A (en) * 2016-11-30 2017-05-31 成都通甲优博科技有限责任公司 A kind of solid matching method and system


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI Jian et al., "Improved multi-constraint image stereo matching based on epipolar geometry", Microcomputer & Its Applications (《微型机与应用》) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116797463A (en) * 2023-08-22 2023-09-22 佗道医疗科技有限公司 Feature point pair extraction method and image stitching method
CN116797463B (en) * 2023-08-22 2023-11-21 佗道医疗科技有限公司 Feature point pair extraction method and image stitching method

Similar Documents

Publication Publication Date Title
CN111260597B (en) Parallax image fusion method of multiband stereo camera
CN107977997B (en) Camera self-calibration method combined with laser radar three-dimensional point cloud data
CN102065313B (en) Uncalibrated multi-viewpoint image correction method for parallel camera array
CN109919911B (en) Mobile three-dimensional reconstruction method based on multi-view photometric stereo
CN113034568B (en) Machine vision depth estimation method, device and system
CN102665086B (en) Method for obtaining parallax by using region-based local stereo matching
CN111210481A (en) Depth estimation acceleration method of multiband stereo camera
CN108020175B (en) multi-grating projection binocular vision tongue surface three-dimensional integral imaging method
CN109840922B (en) Depth acquisition method and system based on binocular light field camera
CN103945207B (en) A kind of stereo-picture vertical parallax removing method based on View Synthesis
CN109712232B (en) Object surface contour three-dimensional imaging method based on light field
CN106340045B (en) Calibration optimization method in three-dimensional facial reconstruction based on binocular stereo vision
CN111091076B (en) Tunnel limit data measuring method based on stereoscopic vision
CN104778716B (en) Lorry compartment volume measuring method based on single image
CN104167001B (en) Large-visual-field camera calibration method based on orthogonal compensation
CN106203429A (en) Based on the shelter target detection method under binocular stereo vision complex background
CN102914295A (en) Computer vision cube calibration based three-dimensional measurement method
CN109285189A (en) A kind of straight path quick calculation method synchronous without binocular
CN107909543A (en) A kind of flake binocular vision Stereo matching space-location method
CN115880344A (en) Binocular stereo matching data set parallax truth value acquisition method
CN111382591A (en) Binocular camera ranging correction method and vehicle-mounted equipment
CN113643427A (en) Binocular ranging and three-dimensional reconstruction method
CN102842118A (en) New robust stereopair correcting method
CN107248179A Stereo matching establishment method for disparity computation
CN110487254B (en) Rapid underwater target size measuring method for ROV

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20171013