CN106651897B - Parallax correction method based on super-pixel segmentation - Google Patents


Info

Publication number
CN106651897B
CN106651897B (application CN201610888875.9A)
Authority
CN
China
Prior art keywords
super
pixel
superpixel
block
parallax
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610888875.9A
Other languages
Chinese (zh)
Other versions
CN106651897A (en
Inventor
李宏亮
孙文龙
王久圣
廖伟军
罗旺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Quick Eye Technology Co Ltd
Original Assignee
Chengdu Quick Eye Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Quick Eye Technology Co Ltd filed Critical Chengdu Quick Eye Technology Co Ltd
Priority to CN201610888875.9A priority Critical patent/CN106651897B/en
Publication of CN106651897A publication Critical patent/CN106651897A/en
Application granted granted Critical
Publication of CN106651897B publication Critical patent/CN106651897B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20021 Dividing image into blocks, subimages or windows
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30168 Image quality inspection

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a parallax correction method based on superpixel segmentation. For the left and right images captured by a corrected binocular camera, superpixel block segmentation is performed on each image according to pixel luminance values. To address mismatching in texture-less regions, superpixel blocks in flat regions are marked as unstable. The parallax of each unstable superpixel block is then corrected on the principle that the more similar two adjacent superpixel blocks are, the closer their parallaxes should be. Addressing the foreground-fattening problem of local stereo algorithms, the method corrects mismatched regions on the basis of superpixel segmentation rather than fixed windows, and generates disparity maps quickly and accurately.

Description

Parallax correction method based on super-pixel segmentation
Technical Field
The present invention relates to parallax correction methods, and more particularly to a parallax correction method based on super-pixel segmentation.
Background
At present, most stereo matching algorithms comprise four steps: (1) cost calculation, (2) cost aggregation, (3) parallax calculation and optimization, and (4) parallax refinement. In general, stereo matching algorithms fall into two categories: local and global. A local stereo matching algorithm uses the color or grayscale information within a window to determine the disparity of each point, whereas a global stereo matching algorithm relies on a smoothness assumption and uses an energy minimization technique to determine the disparities of all points simultaneously.
A global stereo matching algorithm minimizes an overall energy composed of a data term and a smoothness term of the image, which handles mismatching in low-texture regions well. Representative global methods include image-segmentation-based algorithms, dynamic programming, and belief propagation. However, these algorithms are computationally expensive and unsuitable for real-time systems.
Because each point depends only on local information, a local stereo matching algorithm determines the parallax by computing the total matching cost within a matching window and selecting the minimum cost with a winner-takes-all (WTA) strategy. A fixed small window preserves texture and edge information but yields a noisy disparity map; a fixed large window smooths local matching but produces a foreground-fattening effect in depth-discontinuous regions, blurring the disparity map and degrading edges. Representative local methods include the sum of absolute differences (SAD), adaptive-window algorithms, and adaptive-weight algorithms. Although local stereo matching is less accurate than global matching, its computational cost is small, making it suitable for real-time systems.
Conventional local parallax correction methods are as follows. The adaptive-weight algorithm uses locally adaptive support weights to estimate the probability that the center pixel and its neighbors belong to the same region; its complexity grows quadratically with the size of the matching window, so it is slow and cannot meet real-time requirements. Fixed-window stereo matching, while fast, performs poorly in low-texture regions and is prone to the foreground-fattening effect.
At present, local stereo matching algorithms offer better real-time performance than global ones. Among local algorithms, fixed-window matching has low complexity but suffers from foreground fattening and poor parallax in depth-discontinuous regions, while adaptive-window and adaptive-weight matching greatly reduce mismatches in depth-discontinuous regions but are too complex for applications with strict real-time requirements.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a parallax correction method based on super-pixel segmentation that addresses the foreground-fattening problem of local stereo algorithms: working on super-pixel segments rather than a fixed window, it corrects mismatched regions on the principle that the more similar two adjacent super-pixel blocks are, the closer their parallaxes should be.
The technical scheme adopted by the invention is as follows: a parallax correction method based on super-pixel segmentation, in which super-pixel block segmentation is performed separately on the left and right images captured by a corrected binocular camera according to pixel luminance values; to address mismatching in texture-less regions, super-pixel blocks in flat regions are marked as unstable; and the parallax of each unstable super-pixel block is corrected on the principle that the more similar two adjacent super-pixel blocks are, the closer their parallaxes should be.
SLIC (simple linear iterative clustering) super-pixel block segmentation is performed separately on the left and right images captured by the corrected binocular camera according to pixel luminance values. Each image is divided into a number of super-pixel blocks whose pixels are similar in both color and position. The correction here refers to rectifying the images output by the binocular camera using parameters obtained by calibrating it; these parameters are chiefly the lens parameters and the relative pose of the left and right cameras. Calibration is performed once and the parameters do not change, so the same parameters are used for all corrections.
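The segmentation step can be sketched with a simplified SLIC-style clustering over luminance and position. This is an illustrative toy version, not the patented implementation: real SLIC restricts each pixel's assignment to a window around nearby centres, while this brute-force sketch searches all centres; the function name and parameters are our own.

```python
import numpy as np

def slic_superpixels(gray, n_segments=64, compactness=10.0, n_iter=5):
    """Simplified SLIC-style clustering of a grayscale image by
    luminance and position. Illustrative sketch only."""
    h, w = gray.shape
    step = int(np.sqrt(h * w / n_segments))
    # Initialise cluster centres on a regular grid: (luminance, y, x).
    ys = np.arange(step // 2, h, step)
    xs = np.arange(step // 2, w, step)
    centres = np.array([[gray[y, x], y, x] for y in ys for x in xs],
                       dtype=float)

    yy, xx = np.mgrid[0:h, 0:w]
    labels = np.zeros((h, w), dtype=int)
    for _ in range(n_iter):
        # Assign every pixel to the nearest centre in (lum, y, x) space,
        # weighting spatial distance by compactness/step as in SLIC.
        best = np.full((h, w), np.inf)
        for k, (lum, cy, cx) in enumerate(centres):
            d = ((gray - lum) ** 2
                 + (compactness / step) ** 2 * ((yy - cy) ** 2 + (xx - cx) ** 2))
            mask = d < best
            best[mask] = d[mask]
            labels[mask] = k
        # Move each centre to the mean of its assigned pixels.
        for k in range(len(centres)):
            m = labels == k
            if m.any():
                centres[k] = [gray[m].mean(), yy[m].mean(), xx[m].mean()]
    return labels
```

In practice a library implementation (e.g. scikit-image's `segmentation.slic`) would be used; the sketch only shows the clustering idea behind S1.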
The method comprises the following specific steps:
S1, performing super-pixel block segmentation separately on the left and right images captured by the corrected binocular camera, according to pixel luminance values;
S2, computing the gradient of each pixel of the segmented input images, and using gradient together with luminance as the matching cost for parallax calculation;
S3, taking each super-pixel block region as the matching window, aggregating the matching costs of its pixel points into an aggregation cost; with the left image as reference, searching the right image for the best matching point under the winner-takes-all (WTA) strategy to obtain the left disparity map; with the right image as reference, searching the left image for the best matching point under the WTA strategy to obtain the right disparity map;
while aggregating the matching costs and searching for the best match, computing the matching confidence of each super-pixel block: if the confidence is below a set confidence threshold, the block is considered stable and its parallax calculation reliable; otherwise the block is considered unstable and is marked as such;
S4, performing a left-right consistency check (LRC) on the obtained left and right disparity maps based on the consistency constraint of binocular vision, i.e., marking as occlusion points those pixels whose disparities disagree at the same coordinate position in the left and right disparity maps, to obtain an initial disparity map;
S5, for the super-pixel block or unstable super-pixel block containing any occlusion point, finding all adjacent super-pixel blocks; if the luminance similarity between that block and an adjacent super-pixel block exceeds a set luminance similarity threshold, the two blocks are considered to belong to the same object and the adjacent block is accepted; otherwise it is rejected;
S6, correcting the parallax of the occlusion point to the median parallax of the accepted adjacent super-pixel blocks, i.e., those whose luminance similarity exceeds the set luminance similarity threshold.
In S2, the gradient comprises the gradient ∇x in the x direction and the gradient ∇y in the y direction; the gradient is the sum of the x-direction gradient and the y-direction gradient.
In S5, normalized histograms of the current super-pixel block and its adjacent super-pixel blocks are computed, and the histogram distance between the current block and an adjacent block is compared as the measure of their luminance similarity.
The confidence threshold is 0.9.
Compared with the prior art, the invention has the beneficial effects that: addressing the foreground-fattening problem of local stereo algorithms, the method corrects mismatched regions on the basis of superpixel segmentation, following the principle that the more similar two adjacent superpixel blocks are, the closer their parallaxes should be, and generates disparity maps quickly and accurately.
Drawings
FIG. 1 is a flowchart of a method according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Any feature disclosed in this specification (including any accompanying drawings) may be replaced by alternative features serving equivalent or similar purposes, unless expressly stated otherwise. That is, unless expressly stated otherwise, each feature is only an example of a generic series of equivalent or similar features.
Embodiment 1
As shown in fig. 1, the parallax correction method based on superpixel segmentation is divided into two stages: initial parallax calculation and parallax correction.
The initial parallax calculation includes the following four steps:
S1, performing super-pixel block segmentation separately on the left and right images captured by the corrected binocular camera, according to pixel luminance values;
S2, computing the gradient of each pixel of the segmented input images, and combining gradient and luminance into the matching cost for parallax calculation:

C_SAD(x, y, d) = Σ_{(i,j)∈N(x,y)} |I1(i, j) − I2(i + d, j)| ..................... (1)

C_GRAD(x, y, d) = Σ_{(i,j)∈N(x,y)} |∇xI1(i, j) − ∇xI2(i + d, j)| + Σ_{(i,j)∈N(x,y)} |∇yI1(i, j) − ∇yI2(i + d, j)| ..................... (2)

C(x, y, d) = (1 − ω)·C_SAD(x, y, d) + ω·C_GRAD(x, y, d) ..................... (3)

where N(x, y) is the superpixel block region to which the matching point (x, y) belongs; ∇x denotes the horizontal component of the image gradient and ∇y the vertical component; I1(i, j) is the luminance value at coordinate (i, j) in the left image and I2(i + d, j) is the luminance value at coordinate (i + d, j) in the right image; ω is a weight, a real number between 0 and 1; d is the parallax; C_SAD(x, y, d) is the luminance matching cost; C_GRAD(x, y, d) is the gradient matching cost; and C(x, y, d) is the aggregation cost at coordinate (x, y) for parallax d;
S3, taking each super-pixel block region as the matching window, aggregating the matching costs of its pixel points into an aggregation cost; with the left image as reference, searching the right image for the best matching point under the winner-takes-all (WTA) strategy to obtain the left disparity map; with the right image as reference, searching the left image for the best matching point under the WTA strategy to obtain the right disparity map;
while aggregating the matching costs and searching for the best match, computing the matching confidence of each super-pixel block: if the confidence is below a set confidence threshold, the block is considered stable and its parallax calculation reliable; otherwise the block is considered unstable and is marked as such. The confidence is computed as:
Confidence = SAD_min / SAD_min2 ..................... (4)

where Confidence denotes the matching confidence, SAD_min the smallest aggregation cost over all candidate parallaxes, and SAD_min2 the second-smallest aggregation cost.
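The WTA disparity pick and the confidence test of equation (4) might look like the sketch below (the function name is ours). A distinct minimum — a small ratio of smallest to second-smallest cost — signals a reliable match.

```python
import numpy as np

def wta_with_confidence(costs, threshold=0.9):
    """WTA disparity selection plus the confidence ratio of equation (4).

    `costs` is a 1-D array of aggregation costs indexed by candidate
    disparity d. The block is treated as stable when
    SAD_min / SAD_min2 < threshold. Illustrative sketch only."""
    order = np.argsort(costs)
    d_best = int(order[0])
    sad_min, sad_min2 = costs[order[0]], costs[order[1]]
    confidence = sad_min / sad_min2            # equation (4)
    stable = confidence < threshold
    return d_best, confidence, stable
```

With the threshold of 0.9 from the embodiments, two near-equal minima (ratio close to 1) mark the block unstable.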
S4, performing a left-right consistency check (LRC) on the obtained left and right disparity maps based on the consistency constraint of binocular vision, i.e., marking as occlusion points those pixels whose disparities disagree at the same coordinate position in the left and right disparity maps, to obtain an initial disparity map;
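A minimal sketch of the left-right consistency check, assuming the usual convention that a left-image pixel (x, y) with disparity d corresponds to (x − d, y) in the right image; the tolerance parameter is our addition.

```python
import numpy as np

def lrc_occlusion_mask(disp_left, disp_right, tol=1):
    """Left-right consistency check: a pixel (x, y) with left disparity d
    is consistent when the right map at (x - d, y) agrees within `tol`;
    otherwise it is marked as an occlusion point. Illustrative sketch."""
    h, w = disp_left.shape
    occluded = np.ones((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            d = disp_left[y, x]
            xr = x - d
            if 0 <= xr < w and abs(disp_right[y, xr] - d) <= tol:
                occluded[y, x] = False
    return occluded
```

The resulting mask is what S5 and S6 operate on when correcting the initial disparity map.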
the parallax correction stage comprises the following steps:
S5, for the super-pixel block or unstable super-pixel block containing any occlusion point, finding all adjacent super-pixel blocks; if the luminance similarity between that block and an adjacent super-pixel block exceeds a set luminance similarity threshold, the two blocks are considered to belong to the same object and the adjacent block is accepted; otherwise it is rejected;
S6, correcting the parallax of the occlusion point to the median parallax of the accepted adjacent super-pixel blocks, i.e., those whose luminance similarity exceeds the set luminance similarity threshold.
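For a single occluded block, steps S5 and S6 reduce to filtering the neighbours by luminance similarity and taking the median disparity of those accepted. A sketch, with an assumed similarity threshold of 0.8 — the patent specifies only the confidence threshold (0.9), not this value:

```python
import numpy as np

def correct_occluded_block(block_disp, neighbour_disps, neighbour_sims,
                           sim_thresh=0.8):
    """S5/S6 sketch: replace an occluded block's disparity with the
    median disparity of adjacent blocks whose luminance similarity
    exceeds the threshold. `neighbour_sims` holds one similarity value
    per neighbour. The 0.8 threshold is an assumed value."""
    accepted = [d for d, s in zip(neighbour_disps, neighbour_sims)
                if s > sim_thresh]
    if not accepted:
        return block_disp          # no same-object neighbour: keep as-is
    return float(np.median(accepted))
```

The median makes the correction robust to a single dissimilar or mismatched neighbour slipping past the similarity test.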
In this embodiment, SLIC (simple linear iterative clustering) super-pixel block segmentation is performed on the left and right images separately according to pixel luminance values.
Embodiment 2
On the basis of Embodiment 1, in S5 normalized histograms of the current super-pixel block and its adjacent super-pixel blocks are computed, and the histogram distance between the current block and an adjacent block is compared as the measure of their luminance similarity.
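The histogram comparison of this embodiment could be sketched as below. The patent only says a histogram distance is compared; histogram intersection (higher means more similar) is our assumed choice of measure, and the bin count is ours.

```python
import numpy as np

def histogram_similarity(block_a, block_b, bins=32):
    """Luminance similarity of two superpixel blocks via normalised
    histograms. Histogram intersection is the assumed measure here.
    Inputs are 1-D arrays of pixel luminances in [0, 255]."""
    ha, _ = np.histogram(block_a, bins=bins, range=(0, 256))
    hb, _ = np.histogram(block_b, bins=bins, range=(0, 256))
    ha = ha / ha.sum()                       # normalise to sum 1
    hb = hb / hb.sum()
    # Intersection of normalised histograms: 1.0 for identical
    # distributions, 0.0 for fully disjoint ones.
    return float(np.minimum(ha, hb).sum())
```

Any other histogram distance (e.g. chi-square or Bhattacharyya) would slot in the same way, with the acceptance test inverted for distances where smaller means more similar.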
Embodiment 3
On the basis of embodiment 1 or 2, the confidence threshold is 0.9.

Claims (4)

1. A parallax correction method based on super-pixel segmentation, characterized in that super-pixel block segmentation is performed separately on the left and right images captured by a corrected binocular camera according to pixel luminance values; to address mismatching in texture-less regions, super-pixel blocks in flat regions are marked as unstable; and the parallax of each unstable super-pixel block is corrected on the principle that the more similar two adjacent super-pixel blocks are, the closer their parallaxes should be;
the method comprises the following specific steps:
S1, performing super-pixel block segmentation separately on the left and right images captured by the corrected binocular camera, according to pixel luminance values;
S2, computing the gradient of each pixel point of the segmented input images, and using gradient together with luminance as the matching cost for parallax calculation;
S3, taking each super-pixel block region as the matching window, aggregating the matching costs of its pixel points into an aggregation cost; with the left image as reference, searching the right image for the best matching point under the winner-takes-all strategy to obtain the left disparity map; with the right image as reference, searching the left image for the best matching point under the winner-takes-all strategy to obtain the right disparity map;
while aggregating the matching costs and searching for the best match, computing the matching confidence of each super-pixel block: if the confidence is below a set confidence threshold, the block is considered stable and its parallax calculation reliable; otherwise the block is considered unstable and is marked as such;
S4, performing a left-right consistency check on the obtained left and right disparity maps based on the consistency constraint of binocular vision, marking occlusion points where the disparities disagree, to obtain an initial disparity map;
S5, for the super-pixel block or unstable super-pixel block containing any occlusion point, finding all adjacent super-pixel blocks; if the luminance similarity between that block and an adjacent super-pixel block exceeds a set luminance similarity threshold, the two blocks are considered to belong to the same object and the adjacent block is accepted; otherwise it is rejected;
S6, correcting the parallax of the occlusion point to the median parallax of the accepted adjacent super-pixel blocks, i.e., those adjacent to the block containing the occlusion point whose luminance similarity exceeds the set luminance similarity threshold.
2. The parallax correction method based on super-pixel segmentation as claimed in claim 1, wherein in S2 the gradient comprises the gradient in the x direction and the gradient in the y direction, the gradient being the sum of the x-direction gradient and the y-direction gradient.
3. The parallax correction method based on super-pixel segmentation as claimed in claim 1, wherein in S5 normalized histograms of the current super-pixel block and its adjacent super-pixel blocks are computed, and the histogram distance between the current block and an adjacent block is compared as the measure of the luminance similarity of the two blocks.
4. The method of claim 1, wherein the confidence threshold is 0.9.
CN201610888875.9A 2016-10-12 2016-10-12 Parallax correction method based on super-pixel segmentation Active CN106651897B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610888875.9A CN106651897B (en) 2016-10-12 2016-10-12 Parallax correction method based on super-pixel segmentation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610888875.9A CN106651897B (en) 2016-10-12 2016-10-12 Parallax correction method based on super-pixel segmentation

Publications (2)

Publication Number Publication Date
CN106651897A CN106651897A (en) 2017-05-10
CN106651897B true CN106651897B (en) 2019-12-31

Family

ID=58855241

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610888875.9A Active CN106651897B (en) 2016-10-12 2016-10-12 Parallax correction method based on super-pixel segmentation

Country Status (1)

Country Link
CN (1) CN106651897B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11798147B2 (en) 2018-06-30 2023-10-24 Huawei Technologies Co., Ltd. Image processing method and device

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107330932B (en) * 2017-06-16 2020-08-14 海信集团有限公司 Method and device for repairing noise in parallax map
CN107240083B (en) * 2017-06-29 2020-06-09 海信集团有限公司 Method and device for repairing noise in parallax map
CN107397658B (en) * 2017-07-26 2020-06-19 成都快眼科技有限公司 Multi-scale full-convolution network and visual blind guiding method and device
CN108230273B (en) * 2018-01-05 2020-04-07 西南交通大学 Three-dimensional image processing method of artificial compound eye camera based on geometric information
CN111508012B (en) 2019-01-31 2024-04-19 先临三维科技股份有限公司 Method and device for line stripe mismatching detection and three-dimensional reconstruction
CN109949320B (en) * 2019-03-20 2020-12-11 哈尔滨工业大学 Hyperspectral image superpixel segmentation method based on entropy and mutual information

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102567992A (en) * 2011-11-21 2012-07-11 刘瑜 Image matching method of occluded area
CN103049903A (en) * 2012-11-21 2013-04-17 清华大学深圳研究生院 Binocular stereoscopic matching method for stereoscopic vision system
CN104318576A (en) * 2014-11-05 2015-01-28 浙江工业大学 Super-pixel-level image global matching method
CN105847783A (en) * 2016-05-17 2016-08-10 武汉鸿瑞达信息技术有限公司 3D video display and interaction method based on stream media and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IN2013CH05313A (en) * 2013-11-18 2015-05-29 Nokia Corp

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102567992A (en) * 2011-11-21 2012-07-11 刘瑜 Image matching method of occluded area
CN103049903A (en) * 2012-11-21 2013-04-17 清华大学深圳研究生院 Binocular stereoscopic matching method for stereoscopic vision system
CN104318576A (en) * 2014-11-05 2015-01-28 浙江工业大学 Super-pixel-level image global matching method
CN105847783A (en) * 2016-05-17 2016-08-10 武汉鸿瑞达信息技术有限公司 3D video display and interaction method based on stream media and device

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Automated co-superpixel generation via graph matching; Yurui Xie; Signal, Image and Video Processing; 2014-05; Vol. 8, No. 4, pp. 753-763 *
Color Image Guided Boundary-inconsistent Region Refinement for Stereo Matching; F. Jin et al.; IEEE Transactions on Circuits and Systems for Video Technology; 2015-12-30; pp. 1155-1159 *
Research on a fast stereo matching algorithm based on image segmentation; 朱松; Wanfang Data Knowledge Service Platform; 2016-05-04; full text *
Dense 3D scene reconstruction based on semantic constraints and Graph Cuts; 王伟; Scientia Sinica; 2014-05-09; Vol. 44, No. 6, pp. 774-792 *
Application of digital image filters in stereo matching; 杨青青; Wanfang Data Knowledge Service Platform; 2014-07-15; full text *


Also Published As

Publication number Publication date
CN106651897A (en) 2017-05-10

Similar Documents

Publication Publication Date Title
CN106651897B (en) Parallax correction method based on super-pixel segmentation
CN112115953B (en) Optimized ORB algorithm based on RGB-D camera combined plane detection and random sampling coincidence algorithm
CN107578430B (en) Stereo matching method based on self-adaptive weight and local entropy
US8644596B1 (en) Conversion of monoscopic visual content using image-depth database
US7876954B2 (en) Method and device for generating a disparity map from stereo images and stereo matching method and device therefor
CN111833393A (en) Binocular stereo matching method based on edge information
CN106408596B (en) Sectional perspective matching process based on edge
CN110060283B (en) Multi-measure semi-global dense matching method
CN108460792B (en) Efficient focusing stereo matching method based on image segmentation
US20210319580A1 (en) Method, apparatus and electronic device for stereo matching
CN106952304B (en) A kind of depth image calculation method using video sequence interframe correlation
US9111350B1 (en) Conversion of monoscopic visual content to stereoscopic 3D
CN106530336B (en) Stereo matching method based on color information and graph cut theory
CN110322572A (en) A kind of underwater culvert tunnel inner wall three dimensional signal space method based on binocular vision
CN109859249B (en) Scene flow estimation method based on automatic layering in RGBD sequence
CN110942102B (en) Probability relaxation epipolar matching method and system
CN107220996A (en) A kind of unmanned plane linear array consistent based on three-legged structure and face battle array image matching method
CN115601406A (en) Local stereo matching method based on fusion cost calculation and weighted guide filtering
Ni et al. Second-order semi-global stereo matching algorithm based on slanted plane iterative optimization
CN103489183B (en) A kind of sectional perspective matching process split based on edge with seed point
CN107274448B (en) Variable weight cost aggregation stereo matching algorithm based on horizontal tree structure
CN107122782B (en) Balanced semi-dense stereo matching method
CN111179327B (en) Depth map calculation method
CN116402904A (en) Combined calibration method based on laser radar inter-camera and monocular camera
CN114331919B (en) Depth recovery method, electronic device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant