CN110473219A - Stereo matching method based on neighborhood correlation information - Google Patents

Stereo matching method based on neighborhood correlation information

Info

Publication number
CN110473219A
CN110473219A
Authority
CN
China
Prior art keywords
census
neighborhood
pixel
value
matching
Prior art date
Legal status
Pending
Application number
CN201910703250.4A
Other languages
Chinese (zh)
Inventor
Gao Jing (高静)
Zhang Peiwen (张培文)
Xu Jiangtao (徐江涛)
Shi Zaifeng (史再峰)
Current Assignee
Tianjin University
Original Assignee
Tianjin University
Priority date
Filing date
Publication date
Application filed by Tianjin University
Priority to CN201910703250.4A
Publication of CN110473219A
Legal status: Pending

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T7/00 Image analysis
                    • G06T7/10 Segmentation; Edge detection
                        • G06T7/13 Edge detection
                    • G06T7/50 Depth or shape recovery
                        • G06T7/55 Depth or shape recovery from multiple images
                            • G06T7/593 Depth or shape recovery from multiple images from stereo images
                    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
                        • G06T7/85 Stereo camera calibration
                • G06T2207/00 Indexing scheme for image analysis or image enhancement
                    • G06T2207/10 Image acquisition modality
                        • G06T2207/10004 Still image; Photographic image
                            • G06T2207/10012 Stereo images
                    • G06T2207/20 Special algorithmic details
                        • G06T2207/20212 Image combination
                            • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to the field of binocular stereo imaging. To improve the performance of the Census algorithm under repeated texture and discontinuous disparity, it obtains a matching cost algorithm with higher precision and stronger resistance to interference. To this end, the technical solution adopted by the present invention is a stereo matching method based on neighborhood correlation information: for two input images to be matched, taking the left view as the reference, an n × n pixel window is selected and its Census transform matching cost is computed; an n × n window is likewise taken in the right view and moved from left to right and from top to bottom, recording the matching cost at each position, and the point with the smallest matching cost is chosen as the match of the left-image point. Traversing every point in the left image in this way finds each point's corresponding match in the right image, completing the matching of the left and right views. The present invention is mainly applied to stereo imaging processing.

Description

Stereo matching method based on neighborhood correlation information
Technical field
The present invention relates to the field of binocular stereo imaging, and in particular to its core stereo matching algorithm. Combining a Census algorithm based on neighborhood correlation information with the AD algorithm improves the accuracy and robustness of the matching algorithm.
Background art
Stereo vision technology is widely used in fields such as virtual reality, three-dimensional measurement, stereo cameras, target recognition, and robot navigation. Stereo matching is the key technology of stereo vision; its algorithms are highly complex and must process large amounts of data, and no general algorithm yet exists that adapts to the requirements of most scenes.
Among the many matching algorithms, the AD algorithm is based on pixel gray values: it assumes that the same feature point has the same gray value in both images. Actual conditions often fail to satisfy this assumption, so the algorithm is sensitive to image noise and brightness. The matching algorithm based on the Census transform instead uses the ordering relation between a pixel and its neighborhood gray values as the similarity measure, so it resists interference better, but repeated or similar local structures in the image produce mismatches. An improved Census algorithm based on neighborhood correlation information is therefore proposed: it improves performance in disparity-discontinuous regions and, exploiting the complementary characteristics of the AD and Census transforms, fuses the two to improve the matching precision and interference resistance of the algorithm.
Summary of the invention
To overcome the deficiencies of the prior art, the present invention proposes an improved AD-Census stereo matching algorithm based on neighborhood correlation information. It improves the performance of the Census algorithm under repeated texture and discontinuous disparity, and defines a normalization formula to fuse it with the AD algorithm when generating the disparity space, thereby obtaining a matching cost algorithm with higher precision and stronger resistance to interference. The technical solution adopted by the present invention is a stereo matching method based on neighborhood correlation information: for two input images to be matched, taking the left view as the reference, an n × n pixel window is selected and its Census transform matching cost is computed; an n × n window is likewise taken in the right view and moved from left to right and from top to bottom, recording the matching cost at each position; the point with the smallest matching cost is chosen as the match of the left-image point. Traversing every point in the left image in this way finds each point's corresponding match in the right image, completing the matching of the left and right views.
The Census transform is improved as follows:
The Census transform is expressed as

$$\xi(p,q)=\begin{cases}0, & I(q)\le I(p)\\ 1, & I(q)>I(p)\end{cases}\qquad(1)$$

where p and q are two different points and I denotes the gray value of a point;
Image edge detection is first performed with the Sobel operator. For the pixels within one region, the image is scanned with the Census window and the gray mean Ī of the neighborhood, excluding the central point, is computed within the window. The mean gray value of the neighborhood pixels in region S_t is

$$\bar{I}=\frac{1}{num}\sum_{m}\sum_{n}I(x_p+m,\,y_p+n)\qquad(2)$$

where m, n ∈ [−(s_t−1)/2, (s_t−1)/2] with m ≠ 0, n ≠ 0, and num is the number of neighborhood pixels in the Census transform window; the gray difference between a neighborhood pixel q and the neighborhood gray mean Ī is then

$$\delta(q)=I(q)-\bar{I}\qquad(3)$$
If the texture around the central pixel p is uniform, the difference between p and the neighborhood gray mean Ī is very small, so δ is small; when the disparity around p is discontinuous, the match considers only pixels on the same side of the edge, and the gray difference between a neighborhood pixel p′ outside the region and Ī is large, so δ is also large. The improved Census transform is

$$\xi'(p,q)=\begin{cases}00, & \delta(q)<-T\\ 01, & -T\le\delta(q)<0\\ 10, & 0\le\delta(q)<T\\ 11, & \delta(q)\ge T\end{cases}\qquad(4)$$

where T is a set threshold.
The AD-Census matching cost is used; the detailed process is as follows:
The AD algorithm expresses the sum of absolute differences between the pixel to be matched and the corresponding pixels of its neighborhood; the matching cost is

$$C_{AD}(x,y,d)=\sum_{(r,c)\in\Omega}\left|I_l(r,c)-I_r(r,c-d)\right|\qquad(5)$$

where C_AD(x, y, d) is the AD measure at pixel (x, y), i.e. the sum of absolute gray-value differences; Ω is the neighborhood of pixel (x, y) in the left image; I_l(r, c) is the gray value of the left image at (r, c); and I_r(r, c−d) is the gray value of the candidate matching point at disparity d in the right image;
The AD-Census matching cost fuses the two similarity measures, AD and Census, into one disparity space. A normalization function is defined that maps both measures onto the interval [0, 1], after which the two are summed directly:

$$\rho(c,\lambda)=1-e^{-c/\lambda}$$

$$C(p,d)=\rho\left(C_{Census}(p,d),\lambda_{Census}\right)+\rho\left(C_{AD}(p,d),\lambda_{AD}\right)\qquad(6)$$

Compared with a matching cost computed with AD or the Census transform alone, the AD-Census algorithm, combining the advantages of both, obtains a better matching effect.
For the threshold T in formula (4): the appropriate threshold differs across texture conditions in an image, and a fixed threshold degrades the effect, so T should be determined adaptively according to the specific local texture.
The features and beneficial effects of the present invention are as follows: combining the AD algorithm with the Census algorithm based on neighborhood correlation information both overcomes the AD algorithm's sensitivity to brightness noise, which causes mismatches, and remedies the Census algorithm's weakness of producing mismatches at repeated or similar local structures in the image, improving the precision and robustness of the matching algorithm.
Brief description of the drawings:
Fig. 1: Census transform process in a smooth region.
Figure 1 shows that in a region of uniform texture, the pixels "58" and "73", both greater than the central pixel, map to the distinct codes "00" and "01" according to their different gray differences from the neighborhood mean, whereas the original transform does not distinguish them. When the central pixel is affected by noise, as in Fig. 2, the traditional algorithm treats the neighborhood uniformly, while the improved Census algorithm distinguishes each pixel: for example, "56" converts to "10" and "73" to "11", separating the differences among the neighborhood pixels.
Fig. 2: Census transform process when the central pixel is affected by noise.
Fig. 3: Algorithm flowchart.
Specific embodiment
The technical solution of the present invention is detailed as follows:
(1) Improved Census algorithm based on neighborhood correlation information
The Census transform is a non-parametric transform of image neighborhood information that can represent the local texture features of an image. The Census algorithm typically traverses the image with a rectangular window and compares the gray value of each neighborhood pixel with that of the central pixel: a gray value less than or equal to the central pixel's is recorded as 0, and a greater one as 1. The bits are finally concatenated into a bit string that characterizes the central pixel, and the Hamming distance between such strings serves as the similarity measure; this bit string preserves the local texture structure of the image. The Census transform can then be expressed as

$$\xi(p,q)=\begin{cases}0, & I(q)\le I(p)\\ 1, & I(q)>I(p)\end{cases}\qquad(1)$$

where p and q are two different points and I denotes the gray value of a point.
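As an illustration only (the patent specifies no implementation), here is a minimal NumPy sketch of the classic transform in Equation (1); the function name, window size, and wrap-around border handling are assumptions:

```python
import numpy as np

def census_transform(img, win=5):
    """Classic Census transform (Eq. 1): each neighborhood pixel is coded
    1 if brighter than the window's central pixel, else 0, and the bits
    are packed into one integer code per pixel."""
    img = img.astype(np.int32)
    r = win // 2
    codes = np.zeros(img.shape, dtype=np.uint64)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue  # the central pixel is not compared with itself
            q = np.roll(np.roll(img, -dy, axis=0), -dx, axis=1)  # I(q)
            codes = (codes << np.uint64(1)) | (q > img).astype(np.uint64)
    return codes  # borders wrap around here; mask them in real use
```

The matching cost between two such codes is their Hamming distance, so two windows with the same brightness ordering match even under global brightness changes.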
Although the Census transform above largely preserves the local texture features of the image and reduces the influence on match quality of the camera itself and of brightness changes during imaging, it loses robustness when the central pixel is seriously distorted by noise, because every pixel is compared against the central pixel. Likewise, in texture-free regions, regions of uniformly repeated texture, and regions where the disparity jumps at the hand-over between object and background, the algorithm cannot distinguish the specific situations, so the matching error rate inevitably rises.
To mitigate these problems, image edge detection is first performed with the Sobel operator. For the pixels within one region, the image is scanned with the Census window and the gray mean Ī of the neighborhood, excluding the central point, is computed within the window. The mean gray value of the neighborhood pixels in region S_t is

$$\bar{I}=\frac{1}{num}\sum_{m}\sum_{n}I(x_p+m,\,y_p+n)\qquad(2)$$

where m, n ∈ [−(s_t−1)/2, (s_t−1)/2] with m ≠ 0, n ≠ 0, and num is the number of neighborhood pixels in the Census transform window. The gray difference between a neighborhood pixel and the neighborhood gray mean Ī is then

$$\delta(q)=I(q)-\bar{I}\qquad(3)$$
The neighborhood gray mean Ī supplements the gray differences within the transform window and captures the texture correlation among the pixels in the window, compensating for the risk created by comparing every point only against the central point. If the texture around the central pixel p is uniform, the difference between p and Ī is very small, so δ is small; when the disparity around p is discontinuous, the match considers only pixels on the same side of the edge, and the gray difference between a neighborhood pixel p′ outside the region and Ī is large, so δ is also large. The difference δ between a neighborhood pixel and the neighborhood gray mean therefore also characterizes the image texture and local features. If the central pixel p is badly distorted by noise, the mean, which excludes the central point, lets the differences between neighborhood pixels and the neighborhood mean overcome this interference, improving the robustness of the algorithm. The improved Census transform is therefore

$$\xi'(p,q)=\begin{cases}00, & \delta(q)<-T\\ 01, & -T\le\delta(q)<0\\ 10, & 0\le\delta(q)<T\\ 11, & \delta(q)\ge T\end{cases}\qquad(4)$$

where T is a set threshold that can be chosen and adjusted according to the actual situation.
Fig. 1 and Fig. 2 show the specific process of the improved Census transform in a region of uniform texture and in a region of discontinuous disparity, respectively.
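A minimal sketch of Equations (2)-(4) under stated assumptions: the Sobel-based region segmentation is omitted (the mean is simply taken over the whole window minus the center), borders wrap around, and all names and defaults are illustrative:

```python
import numpy as np

def improved_census(img, win=5, T=10.0):
    """Improved Census transform: each neighborhood pixel is compared
    with the neighborhood gray mean (center excluded, Eq. 2) and its
    difference delta (Eq. 3) is quantized to a 2-bit code by T (Eq. 4)."""
    img = img.astype(np.float64)
    r = win // 2
    num = win * win - 1  # neighborhood pixels, center excluded

    # Eq. (2): window sum minus the central pixel, divided by num.
    mean = -img.copy()
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            mean += np.roll(np.roll(img, -dy, axis=0), -dx, axis=1)
    mean /= num

    codes = np.zeros(img.shape, dtype=np.uint64)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            q = np.roll(np.roll(img, -dy, axis=0), -dx, axis=1)
            delta = q - mean  # Eq. (3)
            two_bit = np.where(delta < -T, 0,          # '00'
                      np.where(delta < 0.0, 1,         # '01'
                      np.where(delta < T, 2, 3)))      # '10' / '11'
            codes = (codes << np.uint64(2)) | two_bit.astype(np.uint64)
    return codes
```

Because the mean excludes the central pixel, a noise-corrupted center no longer flips every bit of the code, which is exactly the robustness argument made above.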
(2) Fusion of the improved Census algorithm and the AD algorithm
The Census transform is gray-scale invariant: the coding correlates only weakly with the specific magnitudes of pixel gray values, since it characterizes only the ordering relation between pixels. This gives it good robustness to noise, but also the unavoidable weakness of frequent mismatches in repeated regions. The AD algorithm, by contrast, is based on color features and is extremely sensitive to gray values, so noise affects it strongly, but it is not misled by repetitive structures. Linearly fusing the two exploits their respective advantages and greatly improves the precision and robustness of the algorithm.
The AD algorithm expresses the sum of absolute differences between the pixel to be matched and the corresponding pixels of its neighborhood. Because it is simple to compute and easy to implement in hardware, it is widely adopted; the matching cost is

$$C_{AD}(x,y,d)=\sum_{(r,c)\in\Omega}\left|I_l(r,c)-I_r(r,c-d)\right|\qquad(5)$$

where C_AD(x, y, d) is the AD measure at pixel (x, y), i.e. the sum of absolute gray-value differences; Ω is the neighborhood of pixel (x, y) in the left image; I_l(r, c) is the gray value of the left image at (r, c); and I_r(r, c−d) is the gray value of the candidate matching point at disparity d in the right image.
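A direct transcription of Equation (5), assuming gray-scale images, in-bounds indices, and illustrative names:

```python
import numpy as np

def ad_cost(left, right, x, y, d, win=3):
    """AD matching cost (Eq. 5): sum of absolute gray-value differences
    over the window Omega centered at (x, y), with the right-image
    window shifted left by the candidate disparity d."""
    r = win // 2
    wl = left[y - r:y + r + 1, x - r:x + r + 1].astype(np.float64)
    wr = right[y - r:y + r + 1, x - d - r:x - d + r + 1].astype(np.float64)
    return float(np.abs(wl - wr).sum())  # C_AD(x, y, d)
```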
The AD-Census matching cost fuses the two similarity measures, AD and Census, into one disparity space. Because the AD measure and the Census transform use different evaluation criteria, the initial matching costs they produce are not commensurate. The following normalization function is therefore defined to map both measures onto the interval [0, 1], after which they are summed directly:

$$\rho(c,\lambda)=1-e^{-c/\lambda}$$

$$C(p,d)=\rho\left(C_{Census}(p,d),\lambda_{Census}\right)+\rho\left(C_{AD}(p,d),\lambda_{AD}\right)\qquad(6)$$

Compared with a matching cost computed with AD or the Census transform alone, the AD-Census algorithm, combining the advantages of both, obtains a better matching effect.
For the threshold T in formula (4): the appropriate threshold differs across texture conditions in an image, and a fixed threshold degrades the effect, so T should be determined adaptively according to the specific local texture.
In Equation (6), which fuses the AD algorithm with the improved Census algorithm, the two measures use different evaluation criteria, so the initial matching costs they produce are not commensurate. Normalizing both measure functions onto the interval [0, 1] requires defining the values of λ_Census and λ_AD; for example, λ_Census is taken as 20 and λ_AD as 5.
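A sketch of the fusion in Equation (6), assuming the exponential normalization ρ(c, λ) = 1 − e^(−c/λ) that is standard in AD-Census cost fusion (the patent text defines ρ only as a mapping onto [0, 1]); the λ defaults follow the example values above:

```python
import numpy as np

def rho(cost, lam):
    """Normalization rho(c, lambda) = 1 - exp(-c / lambda): maps a raw,
    unbounded cost into [0, 1) so the two measures become commensurate."""
    return 1.0 - np.exp(-np.asarray(cost, dtype=np.float64) / lam)

def ad_census_cost(c_census, c_ad, lam_census=20.0, lam_ad=5.0):
    """Eq. (6): normalize both measures and sum them directly."""
    return rho(c_census, lam_census) + rho(c_ad, lam_ad)
```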
Two images to be matched are input, for example the left and right views captured by a binocular camera. Taking the left view as the reference, a 3 × 3 pixel window is selected and its AD-Census matching cost is computed. A 3 × 3 window is likewise taken in the right view and moved from left to right and from top to bottom, recording the matching cost at each position; the point with the smallest matching cost is chosen as the match of the left-image point. Traversing every point in the left image in this way finds each point's corresponding match in the right image, completing the matching of the left and right views.
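Tying the pieces together, a winner-takes-all scanline search for one left-image pixel; it assumes the `improved_census`, `ad_cost`, and `ad_census_cost` sketches above and rectified input views:

```python
import numpy as np

def match_pixel(left, right, codes_l, codes_r, x, y, max_disp, win=3):
    """Find the disparity of left-image pixel (x, y) by moving the
    right-image window along the scanline and keeping the candidate
    with the smallest fused AD-Census cost (winner takes all)."""
    best_d, best_cost = 0, np.inf
    for d in range(min(max_disp, x - win // 2) + 1):
        # Census cost: Hamming distance between the packed codes.
        c_census = bin(int(codes_l[y, x]) ^ int(codes_r[y, x - d])).count("1")
        c_ad = ad_cost(left, right, x, y, d, win)
        cost = ad_census_cost(c_census, c_ad)
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d
```

Traversing every left-image pixel with `match_pixel` yields the disparity map and completes the left-right matching described above.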

Claims (4)

1. A stereo matching method based on neighborhood correlation information, characterized in that, for two input images to be matched, taking the left view as the reference, an n × n pixel window is selected and its Census transform matching cost is computed; an n × n window is likewise taken in the right view and moved from left to right and from top to bottom, recording the matching cost at each position; the point with the smallest matching cost is chosen as the match of the left-image point; traversing every point in the left image in this way finds each point's corresponding match in the right image, completing the matching of the left and right views.
2. The stereo matching method based on neighborhood correlation information according to claim 1, characterized in that the Census transform is improved as follows:
the Census transform is expressed as

$$\xi(p,q)=\begin{cases}0, & I(q)\le I(p)\\ 1, & I(q)>I(p)\end{cases}\qquad(1)$$

where p and q are two different points and I denotes the gray value of a point;
image edge detection is first performed with the Sobel operator; for the pixels within one region, the image is scanned with the Census window and the gray mean Ī of the neighborhood, excluding the central point, is computed within the window; the mean gray value of the neighborhood pixels in region S_t is

$$\bar{I}=\frac{1}{num}\sum_{m}\sum_{n}I(x_p+m,\,y_p+n)\qquad(2)$$

where m, n ∈ [−(s_t−1)/2, (s_t−1)/2] with m ≠ 0, n ≠ 0, and num is the number of neighborhood pixels in the Census transform window; the gray difference between a neighborhood pixel and the neighborhood gray mean Ī is then

$$\delta(q)=I(q)-\bar{I}\qquad(3)$$
if the texture around the central pixel p is uniform, the difference between p and the neighborhood gray mean Ī is very small, so δ is small; when the disparity around p is discontinuous, the match considers only pixels on the same side of the edge, and the gray difference between a neighborhood pixel p′ outside the region and Ī is large, so δ is also large; the improved Census transform is

$$\xi'(p,q)=\begin{cases}00, & \delta(q)<-T\\ 01, & -T\le\delta(q)<0\\ 10, & 0\le\delta(q)<T\\ 11, & \delta(q)\ge T\end{cases}\qquad(4)$$

where T is a set threshold.
3. The stereo matching method based on neighborhood correlation information according to claim 1 or 2, characterized in that the AD-Census matching cost is used: the AD algorithm expresses the sum of absolute differences between the pixel to be matched and the corresponding pixels of its neighborhood, with matching cost

$$C_{AD}(x,y,d)=\sum_{(r,c)\in\Omega}\left|I_l(r,c)-I_r(r,c-d)\right|\qquad(5)$$

where C_AD(x, y, d) is the AD measure at pixel (x, y), i.e. the sum of absolute gray-value differences; Ω is the neighborhood of pixel (x, y) in the left image; I_l(r, c) is the gray value of the left image at (r, c); and I_r(r, c−d) is the gray value of the candidate matching point at disparity d in the right image; the AD-Census matching cost fuses the two similarity measures, AD and Census, into one disparity space; a normalization function ρ(c, λ) = 1 − e^(−c/λ) maps both measures onto the interval [0, 1], after which they are summed directly:

$$C(p,d)=\rho\left(C_{Census}(p,d),\lambda_{Census}\right)+\rho\left(C_{AD}(p,d),\lambda_{AD}\right)\qquad(6)$$

compared with a matching cost computed with AD or the Census transform alone, the AD-Census algorithm combining the advantages of both obtains a better matching effect.
4. The stereo matching method based on neighborhood correlation information according to claim 2, characterized in that, for the threshold T in formula (4), the appropriate threshold differs across texture conditions in the image and a fixed threshold degrades the effect, so T is determined adaptively according to the specific local texture.
CN201910703250.4A 2019-07-31 2019-07-31 Stereo matching method based on neighborhood correlation information Pending CN110473219A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910703250.4A CN110473219A (en) 2019-07-31 2019-07-31 Stereo matching method based on neighborhood correlation information


Publications (1)

Publication Number Publication Date
CN110473219A 2019-11-19

Family

ID=68509355

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910703250.4A Pending CN110473219A (en) Stereo matching method based on neighborhood correlation information

Country Status (1)

Country Link
CN (1) CN110473219A (en)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103996202A * 2014-06-11 2014-08-20 Beihang University (北京航空航天大学) Stereo matching method based on hybrid matching cost and adaptive window
CN108898575A * 2018-05-15 2018-11-27 South China University of Technology (华南理工大学) A new adaptive-weight stereo matching method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Wang Yunfeng et al.: "Binocular stereo matching based on adaptive-weight AD-Census transform", 《工程科学与技术》 (Advanced Engineering Sciences) *
Wang Qunwei et al.: "An improved AD-Census stereo matching algorithm", 《有线电视技术》 (Cable TV Technology) *
Ma Li et al.: "Improved Census transform stereo matching algorithm using neighborhood correlation information", 《计算机工程与应用》 (Computer Engineering and Applications) *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111325778A (en) * 2020-01-22 2020-06-23 天津大学 Improved Census stereo matching algorithm based on window cross-correlation information
CN111325778B (en) * 2020-01-22 2022-04-08 天津大学 Improved Census stereo matching algorithm based on window cross-correlation information
CN113344988A (en) * 2020-03-03 2021-09-03 海信集团有限公司 Stereo matching method, terminal and storage medium
CN113344988B (en) * 2020-03-03 2023-03-31 海信集团有限公司 Stereo matching method, terminal and storage medium
CN111415305A (en) * 2020-03-10 2020-07-14 桂林电子科技大学 Method for recovering three-dimensional scene, computer-readable storage medium and unmanned aerial vehicle
CN112750154A (en) * 2020-12-31 2021-05-04 湖南大学 Stereo matching method based on binocular vision
CN112907714A (en) * 2021-03-05 2021-06-04 兰州大学 Mixed matching binocular vision system based on Census transformation and gray absolute difference
CN113344989A (en) * 2021-04-26 2021-09-03 贵州电网有限责任公司 Binocular stereo matching method for minimum spanning tree aerial images of NCC and Census
CN118154793A (en) * 2024-05-13 2024-06-07 四川省川建勘察设计院有限公司 Real-scene three-dimensional rapid modeling method based on remote sensing image

Similar Documents

Publication Publication Date Title
CN110473219A (en) Stereo matching method based on neighborhood correlation information
Di Stefano et al. A fast area-based stereo matching algorithm
Zhao et al. Alignment of continuous video onto 3D point clouds
Mittal et al. Scene modeling for wide area surveillance and image synthesis
Yang et al. Fusion of active and passive sensors for fast 3D capture
RU2382406C1 (en) Method of improving disparity map and device for realising said method
EP3549094A1 (en) Method and system for creating images
CN111160291B (en) Human eye detection method based on depth information and CNN
CN101765019B (en) Stereo matching algorithm for motion blur and illumination change image
CN107154017A (en) A kind of image split-joint method based on SIFT feature Point matching
Ataer-Cansizoglu et al. Pinpoint SLAM: A hybrid of 2D and 3D simultaneous localization and mapping for RGB-D sensors
JP2013012045A (en) Image processing method, image processing system, and computer program
Tang et al. A vertex-to-edge weighted closed-form method for dense RGB-D indoor SLAM
CN111444768A (en) Method for discovering tiny obstacles for reflective ground scene
Mayer Analysis of means to improve cooperative disparity estimation
Antunes et al. Piecewise-planar reconstruction using two views
Brosch et al. Segmentation-based depth propagation in videos
US20230419524A1 (en) Apparatus and method for processing a depth map
Madjidi et al. On robustness and localization accuracy of optical flow computation for underwater color images
JPH08329110A (en) Method for processing picture
Yao et al. 3D modeling and rendering from multiple wide-baseline images by match propagation
Xiong et al. Color rank and census transforms using perceptual color contrast
JPH10283474A (en) Depth information extracting device and depth information extracting method
Fouhey et al. Object recognition robust to imperfect depth data
Cai et al. A stereo matching algorithm based on color segments

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20191119