CN105374040A - Large mechanical workpiece stereo matching method based on vision measurement - Google Patents

Large mechanical workpiece stereo matching method based on vision measurement

Info

Publication number
CN105374040A
CN105374040A
Authority
CN
China
Prior art keywords
point
matching
image
edge feature
parallax value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510795743.7A
Other languages
Chinese (zh)
Inventor
乔玉晶
万立莉
范宇琪
谭世征
曹岩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin University of Science and Technology
Original Assignee
Harbin University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin University of Science and Technology filed Critical Harbin University of Science and Technology
Priority to CN201510795743.7A priority Critical patent/CN105374040A/en
Publication of CN105374040A publication Critical patent/CN105374040A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30164 Workpiece; Machine component

Landscapes

  • Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a stereo matching method for large mechanical workpieces based on vision measurement. The method comprises the following steps: first, an original image pair consisting of a left image and a right image is acquired with a camera; the edge features of the image pair are then matched, and the matching result is used to obtain high-quality edge feature matching point pairs; a disparity value range is determined from the high-quality edge feature matching point pairs; the high-quality edge feature matching point pairs and the disparity value range are then used to match the left image and the right image; finally, a disparity value array is computed from all matched corresponding point pairs and a disparity map is generated. Because the method obtains high-quality edge feature matching point pairs by matching the edge features of the image pair, points in non-edge regions of the image that are prone to mismatching can be filtered out, which improves the matching rate and therefore the matching precision; and because the disparity value range is determined from the high-quality edge feature matching point pairs, the range to be matched is narrowed and the computation of the matching algorithm is reduced, which improves the matching speed.

Description

Stereo matching method for large mechanical workpieces based on vision measurement
Technical field
The invention belongs to the technical field of machine vision, and specifically relates to a stereo matching method for large mechanical workpieces based on vision measurement.
Background art
Three-dimensional topography measurement is widely used in fields such as aerial surveying and mapping, medical imaging, and industrial inspection. Among the many 3D measurement methods, non-contact stereo vision measurement is widely applied for technical advantages such as not damaging the surface of the measured object. Binocular stereo vision, as an important component of non-contact stereo vision, plays a role that other methods cannot replace. In binocular stereo vision, the stereo matching method is closely tied to the precision and speed of 3D topography measurement.
Conventional stereo matching methods can be divided into region-based stereo matching algorithms, feature-point-based matching algorithms, and global matching algorithms based on image segmentation, among others, where:
Region-based stereo matching algorithms traverse the image to find the sub-image most similar to a sub-window image according to some similarity criterion. The drawbacks of this approach are a large amount of computation and low matching precision.
Global matching algorithms based on image segmentation mainly use global optimization theory to estimate disparity and obtain more accurate results through repeated iterations; however, this class of algorithms likewise does not solve the problem of heavy computation and is therefore time-consuming.
Feature-point-based matching algorithms are fast because the matching objects are sparse feature points, but matching precision depends largely on feature extraction, and only a sparse disparity map is obtained, so matching precision is not high.
It can be seen that, in conventional stereo matching methods, measurement precision and measurement speed are in conflict and cannot both be obtained.
Summary of the invention
In view of the above problems, the invention discloses a stereo matching method for large mechanical workpieces based on vision measurement. The method takes both the precision and the speed of the matching process into account, and facilitates fast and accurate measurement of three-dimensional topography.
The object of the present invention is achieved as follows:
The stereo matching method for large mechanical workpieces based on vision measurement comprises the following steps:
Step a: acquire an original image pair with a camera, the image pair comprising a left image and a right image;
Step b: match the edge features of the image pair;
Step c: use the matching result obtained in step b to obtain high-quality edge feature matching point pairs;
Step d: determine the disparity value range from the high-quality edge feature matching point pairs;
Step e: match the left image and the right image using the high-quality edge feature matching point pairs obtained in step c and the disparity value range obtained in step d;
Step f: compute the disparity value array from all matched corresponding point pairs and generate the disparity map.
In the above stereo matching method for large mechanical workpieces based on vision measurement,
matching the edge features of the image pair in step b specifically comprises:
performing Canny edge extraction on the left image, the result being point set A; performing Canny edge extraction on the right image, the result being point set B; using the AD (absolute difference) matching cost function to find, row by row in the right image, the corresponding point that matches each pixel in point set A, these corresponding points forming point set C, which corresponds to point set A;
obtaining the high-quality edge feature matching point pairs in step c specifically comprises:
taking the intersection of point set B and point set C as point set D; removing from point set A the pixel coordinate points that have no correspondence with point set D to obtain point set E; the mutually corresponding pixel coordinates of point set E and point set D are the high-quality edge feature matching point pairs;
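For illustration only, a minimal Python sketch of steps b and c, assuming 8-bit grayscale numpy images and OpenCV's Canny detector; the thresholds and function names here are illustrative choices, not specified by the invention:

```python
import numpy as np
import cv2  # assumed available for Canny edge extraction

def edge_points(gray):
    """Return the set of (row, col) coordinates of Canny edge pixels.
    Thresholds 50/150 are illustrative, not taken from the patent."""
    edges = cv2.Canny(gray, 50, 150)
    return set(zip(*np.nonzero(edges)))

def ad_match_rowwise(left, right, left_points):
    """For each left edge point, find the column in the same row of the right
    image whose gray value minimises the absolute difference (AD cost).
    The whole row is searched here as a simplification.
    Returns a dict mapping left point -> matched right point (set C)."""
    matches = {}
    for (r, c) in left_points:
        costs = np.abs(right[r, :].astype(int) - int(left[r, c]))
        matches[(r, c)] = (r, int(np.argmin(costs)))
    return matches

def high_quality_pairs(left, right):
    """Steps b and c: keep only left edge points whose AD match also lies on
    a right-image edge (intersection of sets B and C)."""
    A = edge_points(left)                     # left-image edge points
    B = edge_points(right)                    # right-image edge points
    C = ad_match_rowwise(left, right, A)      # AD matches of A in the right image
    D = {p for p in C.values() if p in B}     # intersection of C and B
    E = {a for a, c in C.items() if c in D}   # left points kept
    return {a: C[a] for a in E}               # E-to-D correspondence
```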
determining the disparity value range in step d specifically comprises:
subtracting the horizontal coordinates of the corresponding pixel coordinate points of point set E and point set D, that is, subtracting the pixel horizontal coordinates of the high-quality edge feature matching corresponding points; taking the maximum of the absolute values of the results as the maximum disparity value; and defining the disparity value range as 0 to 1.5 times the maximum disparity value;
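A minimal Python sketch of this step, assuming the high-quality pairs are given as a mapping from left (row, column) points to right (row, column) points:

```python
def disparity_range(pairs, factor=1.5):
    """Step d: the disparity range is [0, 1.5 * max |x_left - x_right|] over
    the high-quality edge feature matching point pairs (dict left -> right)."""
    d_max = max(abs(c_left - c_right)
                for (_, c_left), (_, c_right) in pairs.items())
    return 0, factor * d_max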
matching the left image and the right image in step e specifically comprises:
taking each pixel coordinate to be matched in the left image as a starting point, searching for the matching corresponding point in the right image successively from right to left, without exceeding the disparity value range, until a high-quality edge feature matching point is encountered.
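One reasonable reading of this search, as a Python sketch for a single image row: the patent specifies the right-to-left search and the stop condition at a high-quality edge point, but not the per-pixel similarity criterion, so the AD cost is assumed here.

```python
def match_row(left_row, right_row, edge_cols_right, d_max):
    """Step e sketch for one row: for each left pixel at column x, search the
    right image from column x towards the left by at most d_max columns,
    keeping the lowest AD cost, and stop early when a high-quality edge
    column (from `edge_cols_right`) is reached.
    Returns one disparity per left pixel (-1 if nothing was found)."""
    disparities = []
    for x, value in enumerate(left_row):
        best_d, best_cost = -1, None
        for d in range(0, int(d_max) + 1):
            xr = x - d
            if xr < 0:
                break
            cost = abs(int(value) - int(right_row[xr]))  # AD cost
            if best_cost is None or cost < best_cost:
                best_cost, best_d = cost, d
            if xr in edge_cols_right:   # stop at a high-quality edge point
                break
        disparities.append(best_d)
    return disparities
```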
Further, if a pixel in step e still finds no match after the matching range is exceeded, its matching corresponding point is assigned according to the positional proportion between the high-quality edge feature points on its two sides, and the left image and the right image are matched accordingly.
The above stereo matching method for large mechanical workpieces based on vision measurement further comprises a denoising step between step a and step b.
The above stereo matching method for large mechanical workpieces based on vision measurement further comprises, after step f, a step of post-processing the disparity map.
Further, the post-processing is median filtering.
Beneficial effects:
The stereo matching method for large mechanical workpieces based on vision measurement of the present invention can improve the precision and the speed of the matching process at the same time, which is embodied in the following:
First, because the invention matches the edge features of the image pair to obtain high-quality edge feature matching point pairs, points in non-edge regions of the image that are prone to mismatching can be filtered out, the matching rate is improved, and matching precision is therefore improved;
Second, because the invention determines the disparity value range from the high-quality edge feature matching point pairs, the range to be matched can be narrowed and the computation of the matching algorithm reduced, so matching speed is improved.
Brief description of the drawings
Fig. 1 is a flowchart of the stereo matching method for large mechanical workpieces based on vision measurement of the present invention.
Fig. 2 is the left image of specific embodiment three of the stereo matching method for large mechanical workpieces based on vision measurement of the present invention.
Fig. 3 is the right image of specific embodiment three of the stereo matching method for large mechanical workpieces based on vision measurement of the present invention.
Fig. 4 is the disparity map of specific embodiment three of the stereo matching method for large mechanical workpieces based on vision measurement of the present invention.
Detailed description of the embodiments
The specific embodiments of the present invention are described in further detail below with reference to the accompanying drawings.
Specific embodiment one
The flowchart of the stereo matching method for large mechanical workpieces based on vision measurement of the present embodiment is shown in Fig. 1. The method comprises the following steps:
Step a: acquire an original image pair with a camera, the image pair comprising a left image and a right image;
Step b: match the edge features of the image pair;
Step c: use the matching result obtained in step b to obtain high-quality edge feature matching point pairs;
Step d: determine the disparity value range from the high-quality edge feature matching point pairs;
Step e: match the left image and the right image using the high-quality edge feature matching point pairs obtained in step c and the disparity value range obtained in step d;
Step f: compute the disparity value array from all matched corresponding point pairs and generate the disparity map.
Specific embodiment two
The stereo matching method for large mechanical workpieces based on vision measurement of the present embodiment illustrates the method of the present invention with the following two matrices.
Step a: acquire an original image pair with a camera, the image pair comprising a left image and a right image;
wherein the left image is expressed as the matrix:
20 200 67 70 230
22 20 185 200 217
and the right image is expressed as the matrix:
200 67 69 230 110
185 68 200 216 111
Step b: match the edge features of the image pair;
Canny edge extraction is performed on the left image, giving point set A {(1,2), (1,5), (2,3), (2,5)}; Canny edge extraction is performed on the right image, giving point set B {(1,1), (1,4), (2,1), (2,4)}; the AD matching cost function is used to find, row by row in the right image, the corresponding point that matches each pixel in point set A, these corresponding points forming point set C {(1,1), (1,4), (2,1)}, which corresponds to point set A.
It should be noted that Canny edge extraction is prior art and its specific algorithm can be found in the relevant literature by those skilled in the art, so it is not described again in the present invention. The computation formula of the AD matching cost function is:
C_AD(p, d) = |I_left(p) - I_right(p - d)|
where I_left(p) is the gray value of pixel p in the left image, and I_right(p - d) is the gray value of the pixel in the right image at disparity d from point p.
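As a check, a short Python snippet evaluating this cost on the two example matrices of this embodiment (0-based indices are used in the code, while the embodiment uses 1-based coordinates):

```python
import numpy as np

# The two example image matrices from this embodiment.
left = np.array([[ 20, 200,  67,  70, 230],
                 [ 22,  20, 185, 200, 217]])
right = np.array([[200,  67,  69, 230, 110],
                  [185,  68, 200, 216, 111]])

def ad_cost(p, d):
    """C_AD(p, d) = |I_left(p) - I_right(p - d)| for a pixel p = (row, col)."""
    r, c = p
    return abs(int(left[r, c]) - int(right[r, c - d]))

# Left edge point (1,2) in the embodiment's 1-based notation is (0,1) here;
# its AD cost against right point (1,1), i.e. disparity d = 1, is 0.
print(ad_cost((0, 1), 1))  # 0
```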
Step c: use the matching result obtained in step b to obtain high-quality edge feature matching point pairs;
The intersection of point set B and point set C is taken as point set D {(1,1), (1,4), (2,1)}; the pixel coordinate points in point set A that have no correspondence with point set D are removed to obtain point set E {(1,2), (1,5), (2,3)}; the mutually corresponding pixel coordinates of point set E and point set D are the high-quality edge feature matching point pairs.
Step d: determine the disparity value range from the high-quality edge feature matching point pairs;
The horizontal coordinates of the corresponding pixel coordinate points of point set E and point set D are subtracted, that is, the pixel horizontal coordinates of the high-quality edge feature matching corresponding points are subtracted; the maximum of the absolute values of the results is taken as the maximum disparity value; and the disparity value range is defined as 0 to 1.5 times the maximum disparity value.
In this example, the maximum disparity value is 2, and the disparity value range is defined as [0, 3].
Step e: match the left image and the right image using the high-quality edge feature matching point pairs obtained in step c and the disparity value range obtained in step d;
Taking each pixel coordinate to be matched in the left image as a starting point, the matching corresponding point is searched in the right image successively from right to left, without exceeding the disparity value range, until a high-quality edge feature matching point is encountered.
Step f: compute the disparity value array from all matched corresponding point pairs and generate the disparity map.
Step e in specific embodiment two needs further explanation: if a pixel in step e still finds no match after the matching range is exceeded, its matching corresponding point is assigned according to the positional proportion between the high-quality edge feature points on its two sides, and the left image and the right image are matched accordingly.
Example of finding a matching pixel within the matching range: to find the corresponding point of pixel (1,2) of the left image, pixel coordinate (1,2) is located in the right image, and starting from pixel (1,2) of the right image the matching point is searched from right to left; the matching corresponding point is pixel (1,1) of the right image.
Example of not finding a matching pixel within the matching range: when pixel (1,4) of the left image searches for its matching point in the right image from right to left and exceeds the matching range without finding a match, its matching corresponding point is assigned according to the positional proportion between the high-quality edge feature points on its two sides; that is, pixel (1,3) of the right image is the matching corresponding point of pixel (1,4) of the left image.
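The proportional assignment can be read as linear interpolation between the bracketing high-quality edge pairs; the exact formula below is an assumption (the patent only states the proportional relation), but it reproduces this worked example:

```python
def proportional_match(x, left_edges, pairs):
    """Fallback for step e: if left column x found no match, place its right
    correspondence by the positional proportion between the nearest
    high-quality edge points on both sides of x in the same row.
    `left_edges` is a list of left edge columns (points assumed to exist on
    both sides of x), `pairs` maps a left edge column to its right column."""
    lo = max(c for c in left_edges if c < x)
    hi = min(c for c in left_edges if c > x)
    t = (x - lo) / (hi - lo)
    return round(pairs[lo] + t * (pairs[hi] - pairs[lo]))

# Row 1 of the embodiment (1-based columns): left edges 2 and 5 match right 1 and 4,
# so left column 4 is placed at right column 1 + (4 - 2) / (5 - 2) * (4 - 1) = 3.
print(proportional_match(4, [2, 5], {2: 1, 5: 4}))  # 3
```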
Specific embodiment three
The stereo matching method for large mechanical workpieces based on vision measurement of the present embodiment, taking two images as an example, comprises the following steps:
Step a: acquire an original image pair with a camera, the image pair comprising a left image and a right image; the left image is shown in Fig. 2 and the right image is shown in Fig. 3;
Step b: match the edge features of the image pair;
Step c: use the matching result obtained in step b to obtain high-quality edge feature matching point pairs;
Step d: determine the disparity value range from the high-quality edge feature matching point pairs;
Step e: match the left image and the right image using the high-quality edge feature matching point pairs obtained in step c and the disparity value range obtained in step d;
Step f: compute the disparity value array from all matched corresponding point pairs and generate the disparity map, as shown in Fig. 4.
Specific embodiment four
The stereo matching method for large mechanical workpieces based on vision measurement of the present embodiment, on the basis of specific embodiment two or specific embodiment three, is further limited in that a denoising step is also included between step a and step b.
This step can reduce the mismatching rate of step b.
Specific embodiment five
The stereo matching method for large mechanical workpieces based on vision measurement of the present embodiment, on the basis of specific embodiment two or specific embodiment three, is further limited in that, after step f, a step of post-processing the disparity map is also included.
In the process of 3D topography measurement, it is often necessary not only to perform stereo matching but also to splice different three-dimensional curved surfaces. Although the present invention relates only to a stereo matching method, applying the post-processing step to the disparity map enables the point cloud algorithm implemented after the present invention to achieve a better splicing effect.
The post-processing is median filtering.
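A minimal sketch of this post-processing step, assuming SciPy's median filter and an illustrative 3x3 window (the window size is not specified by the invention):

```python
import numpy as np
from scipy.ndimage import median_filter  # assumed available

def postprocess(disparity_map, size=3):
    """Median-filter the disparity map; a 3x3 window is an illustrative choice,
    the patent only states that the post-processing is median filtering."""
    return median_filter(np.asarray(disparity_map, dtype=np.float32), size=size)
```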

Claims (6)

1. A stereo matching method for large mechanical workpieces based on vision measurement, characterized by comprising the following steps:
Step a: acquire an original image pair with a camera, the image pair comprising a left image and a right image;
Step b: match the edge features of the image pair;
Step c: use the matching result obtained in step b to obtain high-quality edge feature matching point pairs;
Step d: determine the disparity value range from the high-quality edge feature matching point pairs;
Step e: match the left image and the right image using the high-quality edge feature matching point pairs obtained in step c and the disparity value range obtained in step d;
Step f: compute the disparity value array from all matched corresponding point pairs and generate the disparity map.
2. The stereo matching method for large mechanical workpieces based on vision measurement according to claim 1, characterized in that
matching the edge features of the image pair in step b specifically comprises:
performing Canny edge extraction on the left image, the result being point set A; performing Canny edge extraction on the right image, the result being point set B; using the AD matching cost function to find, row by row in the right image, the corresponding point that matches each pixel in point set A, these corresponding points forming point set C, which corresponds to point set A;
obtaining the high-quality edge feature matching point pairs in step c specifically comprises:
taking the intersection of point set B and point set C as point set D; removing from point set A the pixel coordinate points that have no correspondence with point set D to obtain point set E; the mutually corresponding pixel coordinates of point set E and point set D being the high-quality edge feature matching point pairs;
determining the disparity value range in step d specifically comprises:
subtracting the horizontal coordinates of the corresponding pixel coordinate points of point set E and point set D, that is, subtracting the pixel horizontal coordinates of the high-quality edge feature matching corresponding points; taking the maximum of the absolute values of the results as the maximum disparity value; and defining the disparity value range as 0 to 1.5 times the maximum disparity value;
matching the left image and the right image in step e specifically comprises:
taking each pixel coordinate to be matched in the left image as a starting point, searching for the matching corresponding point in the right image successively from right to left, without exceeding the disparity value range, until a high-quality edge feature matching point is encountered.
3. The stereo matching method for large mechanical workpieces based on vision measurement according to claim 2, characterized in that, if a pixel in step e still finds no match after the matching range is exceeded, its matching corresponding point is assigned according to the positional proportion between the high-quality edge feature points on its two sides, and the left image and the right image are matched accordingly.
4. The stereo matching method for large mechanical workpieces based on vision measurement according to claim 1, characterized in that a denoising step is further included between step a and step b.
5. The stereo matching method for large mechanical workpieces based on vision measurement according to claim 1, characterized in that a step of post-processing the disparity map is further included after step f.
6. The stereo matching method for large mechanical workpieces based on vision measurement according to claim 5, characterized in that the post-processing is median filtering.
CN201510795743.7A 2015-11-18 2015-11-18 Large mechanical workpiece stereo matching method based on vision measurement Pending CN105374040A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510795743.7A CN105374040A (en) 2015-11-18 2015-11-18 Large mechanical workpiece stereo matching method based on vision measurement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510795743.7A CN105374040A (en) 2015-11-18 2015-11-18 Large mechanical workpiece stereo matching method based on vision measurement

Publications (1)

Publication Number Publication Date
CN105374040A true CN105374040A (en) 2016-03-02

Family

ID=55376212

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510795743.7A Pending CN105374040A (en) 2015-11-18 2015-11-18 Large mechanical workpiece stereo matching method based on vision measurement

Country Status (1)

Country Link
CN (1) CN105374040A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100220932A1 (en) * 2007-06-20 2010-09-02 Dong-Qing Zhang System and method for stereo matching of images
CN102982334A (en) * 2012-11-05 2013-03-20 北京理工大学 Sparse parallax obtaining method based on target edge features and gray scale similarity
CN104392434A (en) * 2014-11-05 2015-03-04 浙江工业大学 Triangle constraint-based image matching diffusion method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ZHOU Dongxiang et al.: "Stereo matching algorithm based on fuzzy discrimination", Journal of Image and Graphics *
XIONG Jun: "Research on production methods of augmented reality based on stereo vision", China Masters' Theses Full-text Database, Information Science and Technology *
GONG Wenbiao et al.: "Stereo matching algorithm based on intra-color correlation and adaptive support weight", Chinese Journal of Lasers *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113516699A (en) * 2021-05-18 2021-10-19 哈尔滨理工大学 Stereo matching system based on super-pixel segmentation
CN114742876A (en) * 2022-06-13 2022-07-12 菏泽市土地储备中心 Land vision stereo measurement method

Similar Documents

Publication Publication Date Title
Kim et al. Multi-view image and tof sensor fusion for dense 3d reconstruction
Pantilie et al. SORT-SGM: Subpixel optimized real-time semiglobal matching for intelligent vehicles
CN103383776B (en) A kind of laddering Stereo Matching Algorithm based on two stage cultivation and Bayesian Estimation
CN110310331B (en) Pose estimation method based on combination of linear features and point cloud features
CN104408772A (en) Grid projection-based three-dimensional reconstructing method for free-form surface
Ma et al. A modified census transform based on the neighborhood information for stereo matching algorithm
CN103337064A (en) Method for removing mismatching point in image stereo matching
Liu et al. The applications and summary of three dimensional reconstruction based on stereo vision
CN105374040A (en) Large mechanical workpiece stereo matching method based on vision measurement
Shen et al. Depth map enhancement method based on joint bilateral filter
CN106802149B (en) Rapid sequence image matching navigation method based on high-dimensional combination characteristics
Borisagar et al. A novel segment-based stereo matching algorithm for disparity map generation
CN106447709A (en) Rapid high-precision binocular parallax matching method
CN106548482B (en) Dense matching method and system based on sparse matching and image edges
Xu et al. Hybrid plane fitting for depth estimation
Xiao et al. A segment-based stereo matching method with ground control points
Jorissen et al. Multi-view wide baseline depth estimation robust to sparse input sampling
CN103615987A (en) Method for measuring distance between central electrode and ground electrode of spark plug based on edge sub-pixel technology
Sheng et al. Depth enhancement based on hybrid geometric hole filling strategy
CN106056599A (en) Object depth data-based object recognition algorithm and device
San et al. Feature based disparity estimation using hill-climbing algorithm
Zhang et al. Passive 3D reconstruction based on binocular vision
Xue et al. Multi-view image denoising based on graphical model of surface patch
Zhang et al. Insights into local stereo matching: Evaluation of disparity refinement approaches
Vellanki et al. Enhanced stereo matching technique using image gradient for improved search time

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20160302