CN104616325B - A fast, high-accuracy light-stripe center extraction method for large surfaces - Google Patents

A fast, high-accuracy light-stripe center extraction method for large surfaces

Info

Publication number
CN104616325B
CN104616325B (granted from application CN201510034164.0A; published as CN104616325A)
Authority
CN
China
Prior art keywords
image
striation
light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510034164.0A
Other languages
Chinese (zh)
Other versions
CN104616325A (en)
Inventor
刘巍
高鹏
张洋
杨帆
李晓东
贾振元
高航
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian University of Technology filed Critical Dalian University of Technology
Priority to CN201510034164.0A priority Critical patent/CN104616325B/en
Publication of CN104616325A publication Critical patent/CN104616325A/en
Application granted granted Critical
Publication of CN104616325B publication Critical patent/CN104616325B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30168 Image quality inspection

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention, a fast, high-accuracy method for extracting line structured light on large composite-material component surfaces, belongs to the field of computer vision measurement and concerns an image quality evaluation method and a line-structured-light stripe center extraction method. For line structured light on large composite surfaces, the method establishes a stripe evaluation criterion based on Gaussian-similarity image quality evaluation and proposes a center-extraction decision criterion. From the illumination model formed by the laser structured light on the composite surface and the spatial-information transformation relation, it obtains the incidence angle of the structured-light source and the viewing angle of the camera, and uses these two angles to correct the geometric stripe center extracted from the image, yielding the actual stripe center. Because stripe images are screened with the Gaussian-similarity image quality evaluation, the method is accurate and fast, and meets the requirement of fast, high-accuracy extraction of structured-light stripe centers on large part surfaces.

Description

A fast, high-accuracy light-stripe center extraction method for large surfaces
Technical field
The invention belongs to the field of computer vision measurement and relates to an image quality evaluation method and a line-structured-light stripe center extraction method.
Background art
With the steadily growing size of large equipment in machine manufacturing, high-accuracy measurement of the shape and dimensions of large parts and components, such as large aircraft empennages and large antenna faces, is a precondition for and guarantee of high-quality assembly. Binocular vision measurement assisted by laser scanning is a common method for measuring large surfaces: a binocular camera pair captures the auxiliary laser stripe and reconstructs the stripe center, realizing high-accuracy measurement of the measured object's profile. However, large parts have large surfaces, the laser stripe sweeps across the measured surface over a large range, and the stripe is affected by multiple factors such as on-site illumination and the surface characteristics of the material, so traditional center extraction methods struggle to extract the structured-light stripe center with high accuracy in large-scale surface measurement. How to extract the structured-light stripe center on large part surfaces quickly and accurately is therefore of great significance for the vision measurement of large components.
Existing structured-light stripe center extraction methods mainly include the traditional geometric center method, the gray-level centroid method, and curve fitting, as well as the more direction-aware template method and Hessian-matrix methods. The traditional geometric center method extracts the stripe center from the detected stripe edges; it is fast but of relatively low accuracy. The gray-level centroid method takes the gray-level centroid of each row of the stripe image as the stripe center; its robustness and accuracy are limited. Curve fitting can reach higher accuracy, but it is computationally heavy and slow. On the basis of these traditional algorithms, researchers at home and abroad have further proposed directional geometric templates and Hessian-matrix methods. Hu Bin et al., in "Structured light stripe center detection method based on direction template" (Computer Engineering and Applications, 2002.11:58-60), proposed detecting the stripe center with variable-direction templates: four templates oriented at 0°, 90°, 45°, and 135° are designed and each applied to every column of the image. The method has some resistance to white noise and can repair broken stripe lines. Zhou Fuqiang et al., in patent CN200510123724.6, "A fast, high-accuracy structured-light stripe center extraction method", determine the normal direction of the stripe by solving the Hessian matrix and obtain the sub-pixel center position of the stripe with a Taylor expansion and recursion, reducing computation while preserving accuracy and robustness and, to some extent, achieving fast extraction of the stripe center.
Summary of the invention
The technical problem the invention solves is fast, high-accuracy extraction of the structured-light stripe on large part surfaces under complex measuring environments. The invention provides a Gaussian-similarity image quality evaluation criterion and a corresponding fast, high-accuracy stripe center extraction method based on that evaluation. During stripe center extraction the stripe image must be quality-evaluated, and the stripe centers of images scoring below a certain threshold are corrected, realizing fast, high-accuracy extraction of structured-light stripe centers on large part surfaces.
The technical solution adopted by the invention is a fast, high-accuracy method for extracting line structured light on large composite-material component surfaces, characterized in that, for line structured light on large composite surfaces, a stripe evaluation criterion is established based on Gaussian-similarity image quality evaluation and a center-extraction decision criterion is proposed; the illumination model formed by the laser structured light on the composite surface and the spatial-information transformation relation yield the incidence angle of the structured-light source and the viewing angle of the camera; these two angles are then used to correct the geometric stripe center extracted from the image, giving the actual stripe center. The specific steps are as follows:
Step 1: set the center-extraction decision criterion based on the Gaussian-structural-similarity image quality evaluation criterion
1) Image quality evaluation method based on Gaussian structural similarity
Exploiting the Gaussian profile of the structured light and the structural similarity of images, an image quality evaluation method based on Gaussian structural similarity is proposed. For two images x and y to be compared, where x is the standard image and y the image under evaluation, the structural similarity of the two images is measured in four aspects: luminance, contrast, structure, and Gaussian similarity.
Luminance comparison:
l(x, y) = (2μ_x·μ_y + C_1) / (μ_x^2 + μ_y^2 + C_1) (1)
where μ_x, μ_y are the mean intensities of the two images, μ_x = (1/n)Σ_{i=1..n} x_i, C_1 is a small constant set to avoid a zero denominator, and n is the number of pixels;
Contrast comparison:
c(x, y) = (2σ_x·σ_y + C_2) / (σ_x^2 + σ_y^2 + C_2) (2)
where σ_x, σ_y are the standard deviations of the two images, σ_x = ((1/(n-1))Σ_{i=1..n}(x_i - μ_x)^2)^(1/2), C_2 is a small constant set to avoid a zero denominator, and n is the number of pixels;
Structure comparison:
s(x, y) = (σ_xy + C_3) / (σ_x·σ_y + C_3) (3)
where σ_xy = (1/(n-1))Σ_{i=1..n}(x_i - μ_x)(y_i - μ_y) is the covariance of the two images, characterizing the degree of structural correspondence, C_3 is a small constant set to avoid a zero denominator, and n is the number of pixels;
Based on the Gaussian profile of the structured-light stripe, the Gaussian similarity is defined as:
g(y) = 1 - sqrt{ (1/T) Σ_{v=1..T} [ ((1/M) Σ_{u=1..M} (y_{u,v} - g_v(u))) / ((1/M) Σ_{u=1..M} y_{u,v}) ]^2 } (4)
where y_{u,v} is the gray value of the image under evaluation at pixel (u, v), T is the number of image rows, M the number of pixels per row, and g_v(u) is the Gaussian distribution curve fitted to row v with its Gaussian fitting coefficients. Combining this with the image evaluation method based on structural similarity, the Gaussian-similarity image quality evaluation model is established as follows:
GFSSIM(x, y) = [l(x, y)]^α · [c(x, y)]^β · [s(x, y)]^γ · [g(y)]^λ (5)
where α, β, γ, λ are the adaptive proportion coefficients of the luminance, contrast, structure, and Gaussian-similarity terms in the evaluation model;
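As a reading aid (not part of the patent), the four comparison terms and the combined model of Eq. (5) can be sketched in pure Python. The constants C1-C3, the shared Gaussian profile `gauss_profile` (a simplifying stand-in for the per-row fits g_v), and equal weights α = β = γ = λ = 1 (as in the embodiment) are illustrative assumptions:

```python
import math

C1, C2, C3 = 1e-4, 1e-4, 1e-4  # small constants guarding against zero denominators

def _mean(a):
    return sum(a) / len(a)

def _std(a):
    m = _mean(a)
    return math.sqrt(sum((v - m) ** 2 for v in a) / (len(a) - 1))

def luminance(x, y):
    # Eq. (1): l(x, y) = (2*mu_x*mu_y + C1) / (mu_x^2 + mu_y^2 + C1)
    mx, my = _mean(x), _mean(y)
    return (2 * mx * my + C1) / (mx ** 2 + my ** 2 + C1)

def contrast(x, y):
    # Eq. (2): c(x, y) = (2*sigma_x*sigma_y + C2) / (sigma_x^2 + sigma_y^2 + C2)
    sx, sy = _std(x), _std(y)
    return (2 * sx * sy + C2) / (sx ** 2 + sy ** 2 + C2)

def structure(x, y):
    # Eq. (3): s(x, y) = (sigma_xy + C3) / (sigma_x*sigma_y + C3)
    mx, my = _mean(x), _mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return (sxy + C3) / (_std(x) * _std(y) + C3)

def gauss_similarity(rows, gauss_profile):
    # Eq. (4): 1 minus the RMS relative residual between each stripe
    # cross-section and the (here: shared) fitted Gaussian profile.
    acc = 0.0
    for row in rows:
        m = len(row)
        resid = sum(p - g for p, g in zip(row, gauss_profile)) / m
        acc += (resid / (sum(row) / m)) ** 2
    return 1 - math.sqrt(acc / len(rows))

def gfssim(x, y, y_rows, gauss_profile, alpha=1, beta=1, gamma=1, lam=1):
    # Eq. (5): weighted product of the four comparison terms
    return (luminance(x, y) ** alpha * contrast(x, y) ** beta
            * structure(x, y) ** gamma
            * gauss_similarity(y_rows, gauss_profile) ** lam)
```

With x == y and cross-sections matching the profile exactly, all four terms and their product equal 1, the theoretical maximum; a threshold such as the embodiment's GFSSIM = 0.998 would then separate near-standard stripes from deviated ones.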
2) Setting the center-extraction decision criterion from the standard image
The standard image is defined as the image captured when the incident laser ray and the camera optical axis are collinear and perpendicular to the receiving plane. The stripe center of the standard image is the geometric center of the stripe and can be extracted directly with the geometric center method. A Gaussian-similarity image quality threshold GFSSIM is established for the standard image. When the image under evaluation is sufficiently similar to the standard image, i.e. its score exceeds GFSSIM, its stripe center can be taken to lie approximately at the geometric center of the stripe, and the geometric center method can be used to extract it. When the stripe deviates markedly from the standard image, i.e. the score falls below GFSSIM, the center of the stripe no longer lies at its geometric center and has a large offset, so the extracted center must be corrected.
Step 2: stripe center extraction method for the standard image
1) Extracting the two stripe edge lines with the Hough transform
In the image coordinate system, let the line equation be v = ku + b, where (u, v) are the pixel coordinates of the image, k is the slope of the line, and b its intercept. The parametric equation of the line is defined as:
ρ = u_i cosθ + v_i sinθ (6)
where ρ is the distance from the origin to the line and θ is the angle between the line and the u axis. Taking ρ-θ as independent and dependent variables, each pixel (u_i, v_i) in the image coordinate system substituted into formula (6) forms a curve; the common intersection of these curves is (ρ, θ), from which the parameters of the line in space are:
k = -cotθ (7)
b = ρ/sinθ (8)
The line equations of the two stripe edges then follow:
v = u(-cotθ_l) + ρ_l/sinθ_l (9)
v = u(-cotθ_r) + ρ_r/sinθ_r (10)
where (ρ_l, θ_l) are the parameters of the left edge line and (ρ_r, θ_r) those of the right edge line. Since the two edge lines are nearly parallel, the stripe width D can be computed with the distance formula for two parallel lines:
k̄ = -(cotθ_l + cotθ_r)/2 (11)
D = |ρ_l/sinθ_l - ρ_r/sinθ_r| / sqrt(1 + k̄^2) (12)
where k̄ is the mean slope used to compute the distance between the two parallel lines;
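The conversion from Hough parameters to slope-intercept form and the width computation can be sketched as follows (an illustrative sketch; the function names and the numeric example are mine, not the patent's). Writing cot as cos/sin avoids the tan blow-up near θ = 90°:

```python
import math

def line_from_hough(rho, theta):
    """Hough parameters (rho, theta) -> slope/intercept of v = k*u + b.

    Eq. (7): k = -cot(theta); Eq. (8): b = rho / sin(theta).
    """
    return -math.cos(theta) / math.sin(theta), rho / math.sin(theta)

def stripe_width(rho_l, theta_l, rho_r, theta_r):
    """Width of the stripe bounded by two near-parallel edge lines.

    Eq. (11): mean slope k_bar = -(cot(theta_l) + cot(theta_r)) / 2
    Eq. (12): D = |rho_l/sin(theta_l) - rho_r/sin(theta_r)| / sqrt(1 + k_bar^2)
    """
    k_bar = -(math.cos(theta_l) / math.sin(theta_l)
              + math.cos(theta_r) / math.sin(theta_r)) / 2
    b_l = rho_l / math.sin(theta_l)
    b_r = rho_r / math.sin(theta_r)
    return abs(b_l - b_r) / math.sqrt(1 + k_bar ** 2)
```

For two horizontal edges v = 1 and v = 3 (θ = 90°), for instance, the computed width is 2 pixels.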
2) Determining the stripe geometric center
The left and right images are evaluated with the Gaussian-similarity image evaluation criterion; if the structured-light extraction threshold is met, the stripe center is determined directly from the edge-line equations with the geometric center method. The stripe center-line equation is obtained by taking the angular bisector of the two edge-line equations.
The center-line slope k is obtained from the angular-bisector slope formula (13):
|k_l - k| / (1 + k_l·k) = |k - k_r| / (1 + k_r·k) (13)
where k_l = -cotθ_l is the slope of the left edge line and k_r = -cotθ_r that of the right edge line. The stripe center-line equation is then:
v - v_0 = k(u - u_0) (14)
where (u_0, v_0) is the intersection of the left and right edge lines, u_0 = (b_r - b_l)/(k_l - k_r), v_0 = k_l·u_0 + b_l, and b_r, b_l are the intercepts of the right and left lines.
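One way to realize formulas (13)-(14) in code (an illustrative sketch with my own function name): for the stripe's near-parallel edges, the bisector condition is solved in closed form by averaging the two edge-line angles.

```python
import math

def center_line(k_l, b_l, k_r, b_r):
    """Angular-bisector center line of the two stripe edge lines.

    The slope k solves Eq. (13), |k_l - k|/(1 + k_l*k) = |k - k_r|/(1 + k_r*k);
    for near-parallel edges it is the tangent of the mean of the two edge
    angles. The line passes through the edges' intersection (u0, v0),
    giving Eq. (14): v - v0 = k * (u - u0).
    """
    k = math.tan((math.atan(k_l) + math.atan(k_r)) / 2)
    u0 = (b_r - b_l) / (k_l - k_r)  # requires k_l != k_r (edges not exactly parallel)
    v0 = k_l * u0 + b_l
    return k, u0, v0
```

For edges v = u and v = 2, the bisector slope is tan(22.5°) ≈ 0.414 through their intersection (2, 2), and both sides of Eq. (13) evaluate to the same value.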
Step 3: corrected stripe extraction method
The left and right images are evaluated with the Gaussian-similarity image evaluation criterion; if the score falls below the extraction threshold, the stripe center extracted by the geometric center method must be corrected.
1) Multi-source factor coupling model
Based on the properties of the laser and of the composite material surface, the reflection characteristics of the laser on the composite surface are analyzed and a multi-source factor coupling model is established as model formula (15). In that model, I is the total structured-light intensity, k_a·I_pa is the ambient light term with ambient illumination coefficient k_a, k_d is the diffuse reflection coefficient, k_s the specular reflection coefficient, i the angle between the incident stripe and the surface normal, θ the angle between the viewing direction and the surface normal, d the distance from the laser to the receiving plane, and h the reflectivity factor;
2) Determining the structured-light stripe width
From the spatial relation between the real-image stripe center and the standard-image stripe center, the following relation holds:
D = d[tan(i + α) - tan(i - α)] sinθ (16)
where D is the actual stripe width, i the stripe incidence angle, θ the camera viewing angle, α half of the laser's horizontal fan angle, and d the distance from the laser to the receiving plane;
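A small numeric sketch of Eq. (16) (the function name and the numbers are illustrative; the 750 mm object distance echoes the embodiment's shooting conditions):

```python
import math

def stripe_width_model(d, i, theta, alpha):
    """Eq. (16): predicted stripe width on the receiving surface.

    d: laser-to-surface distance, i: stripe incidence angle,
    theta: camera viewing angle, alpha: half of the laser's
    horizontal fan angle (angles in radians).
    """
    return d * (math.tan(i + alpha) - math.tan(i - alpha)) * math.sin(theta)

# At normal incidence (i = 0) the model reduces to D = 2*d*tan(alpha)*sin(theta):
# with d = 750 mm, tan(alpha) = 0.001 and theta = 90 degrees, D = 1.5 mm.
```

Note that Eq. (16) alone leaves the two angles i and θ underdetermined for a measured D, which is consistent with the patent solving formulas (15) and (16) simultaneously to invert them.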
3) Solving for the correction angles
From formulas (15) and (16), once the standard laser stripe is known, the gray values and spatial width of the stripe depend only on the incidence angle of the laser on the composite surface and the viewing angle of the observing camera. When a laser stripe image is acquired, the actual stripe width D is computed via Hough edge extraction, and formulas (15) and (16) are solved simultaneously to invert the stripe incidence angle i and the camera viewing angle θ, which serve as the correction angles for the stripe center line;
4) Stripe center correction
The geometric center of the laser stripe is rapidly extracted with the geometric center method. With the stripe incidence angle i and camera viewing angle θ known, the stripe center line is corrected according to the variation law of the stripe center in Euclidean space, and the offset of the corrected actual stripe center relative to the ideal geometric center is given by formula (17).
The beneficial effect of the invention is that stripe images are evaluated with the Gaussian-similarity image quality evaluation method. The whole extraction process divides into three main steps: Gaussian image quality evaluation, standard-image stripe geometric-center extraction, and stripe center correction. Whether a stripe image needs center correction is decided by the image quality evaluation, and the accurate actual stripe center is obtained from the stripe center extraction algorithm and the correction method. The accuracy is high and the computation fast, meeting the requirement of fast, high-accuracy extraction of structured-light stripe centers on large part surfaces.
Brief description of the drawings
Fig. 1 is a schematic of the structured-light stripe center offset. In the figure: i, stripe incidence angle; θ, camera viewing angle; Δ, offset of the actual stripe center relative to the geometric stripe center; d, distance from the laser to the receiving plane; α, laser emission half-angle; 1, measured object surface; 2, laser; 3, camera imaging plane; 4, camera lens; 5, solid line marking the geometric stripe center position; 6, dashed line marking the actual stripe center position.
Fig. 2 is the flow chart of fast, high-accuracy stripe center extraction on large surfaces.
Embodiment
The embodiments of the invention are described in detail below with reference to the technical solution and the drawings. Fig. 1 is the schematic of the structured-light stripe center offset. The measured surface is a 3.4 × 0.6 m T800 composite panel; the stripe hits the panel at a certain angle, and the camera focus is adjusted to capture a sharp stripe image.
Embodiment 1: a stripe image is captured with two cameras, each fitted with a wide-angle lens. Camera: Vieworks VC-12MC-M/C 65, resolution 4096 × 3072, CMOS image sensor, frame rate up to 64.3 fps at full frame, weight 420 g. Wide-angle lens: EF 16-35mm f/2.8L II USM, with the following parameters: focal length f = 16-35 mm (APS-equivalent 25.5-52.5 mm), aperture F2.8, lens dimensions 82 × 106 mm. Shooting conditions: image size 4096 × 3072 pixels, focal length 17 mm, object distance 750 mm, field of view about 800 mm × 800 mm.
Fig. 2 is the flow chart of fast, high-accuracy stripe center extraction on large surfaces. Following this flow, the whole stripe center extraction divides into three main steps: Gaussian image quality evaluation, standard-image stripe geometric-center extraction, and stripe center correction.
Step 1: set the structured-light correction threshold based on the Gaussian-structural-similarity image quality evaluation criterion
1) Establishing the image quality evaluation model based on Gaussian structural similarity
In this example the Gaussian-structural-similarity image quality evaluation model is established according to formula (5), with α, β, γ, λ all set to 1.
2) Determining the correction threshold
Taking a typical large-aircraft composite component surface as an example, a stripe image is captured every 2° of laser offset, yielding a stripe image sequence. Processing the sequence with the image quality evaluation model of formula (5) gives the correction threshold GFSSIM = 0.998: once the laser is offset by more than 20°, the stripe center offset exceeds 0.15 mm and the stripe center should be corrected.
Step 2: stripe center extraction method for the standard image
1) Extracting the two stripe edge lines with the Hough transform
First the ρ-θ diagram is obtained from the parametric line equation (6), where ρ is the distance from the origin to the line and θ the angle between the line and the u axis. From the intersection of the ρ-θ curves, formulas (7) and (8) give the slope and intercept of the line in space, yielding the edge-line equations (9) and (10) of the two stripe edges.
2) Determining the stripe geometric center with the geometric center method
From the edge-line equations obtained in the previous step, the center-line slope k is computed with the angular-bisector formula (13); solving equations (9) and (10) simultaneously gives a point (u_0, v_0) on the center line, which yields the stripe center-line equation (14).
Step 3: corrected stripe extraction method
1) Computing the correction angles
Solving formulas (15) and (16) simultaneously inverts the stripe incidence angle i and the camera viewing angle θ. All parameters other than i and θ are known; the actual stripe width D is obtained from formulas (11) and (12) during stripe edge extraction.
2) Stripe center correction
The offset Δ of the actual stripe center relative to the ideal geometric center is obtained by correcting the center-line position with formula (17). Starting from the ideal stripe geometric center extracted by the geometric center method, the geometric center in the left camera image is shifted right by Δ and the geometric center in the right camera image is shifted left by Δ, giving the actual stripe center.
The invention evaluates stripe images with the Gaussian-similarity image quality evaluation method: for images meeting the stripe threshold, the stripe center is extracted directly with the Hough transform and the geometric center method; for images not meeting the threshold, the actual stripe center is obtained by correcting the extracted stripe geometric center. The method's beneficial effect is that it is accurate and fast, meeting the requirement of fast, high-accuracy extraction of structured-light stripe centers on large part surfaces.

Claims (1)

1. A fast, high-accuracy light-stripe center extraction method for large surfaces, characterized in that, for line structured light on large composite-material component surfaces, the method establishes a stripe evaluation criterion based on Gaussian-similarity image quality evaluation and proposes a center-extraction decision criterion; the illumination model formed by the laser structured light on the composite surface and the spatial-information transformation relation yield the incidence angle of the structured-light source and the viewing angle of the camera; these two angles are then used to correct the geometric stripe center extracted from the image, giving the actual stripe center; the specific steps are as follows:
Step 1: set the center-extraction decision criterion based on the Gaussian-structural-similarity image quality evaluation criterion
1) Image quality evaluation method based on Gaussian structural similarity
Exploiting the Gaussian profile of the structured light and the structural similarity of images, an image quality evaluation method based on Gaussian structural similarity is proposed; for two images x and y to be compared, where x is the standard image and y the image under evaluation, the structural similarity of the two images is measured in four aspects: luminance, contrast, structure, and Gaussian similarity;
Luminance comparison:
l(x, y) = (2μ_x·μ_y + C_1) / (μ_x^2 + μ_y^2 + C_1) (1)
where μ_x, μ_y are the mean intensities of the two images, μ_x = (1/n)Σ_{i=1..n} x_i, C_1 is a small constant set to avoid a zero denominator, and n is the number of pixels;
Contrast comparison:
c(x, y) = (2σ_x·σ_y + C_2) / (σ_x^2 + σ_y^2 + C_2) (2)
where σ_x, σ_y are the standard deviations of the two images, σ_x = ((1/(n-1))Σ_{i=1..n}(x_i - μ_x)^2)^(1/2), C_2 is a small constant set to avoid a zero denominator, and n is the number of pixels;
Structure comparison:
s(x, y) = (σ_xy + C_3) / (σ_x·σ_y + C_3) (3)
where σ_xy = (1/(n-1))Σ_{i=1..n}(x_i - μ_x)(y_i - μ_y) is the covariance of the two images, characterizing the degree of structural correspondence, C_3 is a small constant set to avoid a zero denominator, and n is the number of pixels;
Based on the Gaussian profile of the structured-light stripe, the Gaussian similarity is defined as:
g(y) = 1 - sqrt{ (1/T) Σ_{v=1..T} [ ((1/M) Σ_{u=1..M} (y_{u,v} - g_v(u))) / ((1/M) Σ_{u=1..M} y_{u,v}) ]^2 } (4)
where T is the number of image rows, M the number of pixels per row, y_{u,v} is the gray value of the image under evaluation at pixel (u, v), and g_v(u) is the Gaussian distribution curve fitted to row v with the Gaussian fitting coefficients of row v; combining this with the image evaluation method based on structural similarity, the Gaussian-similarity image quality evaluation model is established as follows:
GFSSIM(x, y) = [l(x, y)]^α · [c(x, y)]^β · [s(x, y)]^γ · [g(y)]^λ (5)
where α, β, γ, λ are the adaptive proportion coefficients of the luminance, contrast, structure, and Gaussian-similarity terms in the evaluation model;
2) Setting the center-extraction decision criterion from the standard image
The standard image is defined as the image captured when the incident laser ray and the camera optical axis are collinear and perpendicular to the receiving plane; the stripe center of the standard image is the geometric center of the stripe and is extracted directly with the geometric center method; a Gaussian-similarity image quality threshold GFSSIM is established for the standard image; when the image under evaluation is sufficiently similar to the standard image, i.e. its score exceeds GFSSIM, its stripe center is taken to lie approximately at the geometric center of the stripe and is extracted with the geometric center method; when the stripe deviates markedly from the standard image, i.e. the score falls below GFSSIM, the center of the stripe no longer lies at its geometric center and has a large offset, so the extracted center must be corrected;
Step 2: stripe center extraction method for the standard image
1) Extracting the two stripe edge lines with the Hough transform
In the image coordinate system, let the line equation be v = ku + b, where (u, v) are the pixel coordinates of the image, k is the slope of the line, and b its intercept; the parametric equation of the line is defined as:
ρ = u_i cosθ + v_i sinθ (6)
where ρ is the distance from the origin to the line and θ is the angle between the line and the u axis; taking ρ-θ as independent and dependent variables, each pixel (u_i, v_i) in the image coordinate system substituted into formula (6) forms a curve, the common intersection of the curves is (ρ, θ), and the parameters of the line in space follow:
k = -cotθ (7)
b = ρ/sinθ (8)
The line equations of the two stripe edges are then obtained:
v = u(-cotθ_l) + ρ_l/sinθ_l (9)
v = u(-cotθ_r) + ρ_r/sinθ_r (10)
Wherein, (ρll) be left hand edge linear equation corresponding parameter, (ρrr) be right hand edge linear equation corresponding parameter;By In striation edge line less parallel, therefore striation width D is calculated using two straight parallel wire spacing formula:
<mrow> <mover> <mi>k</mi> <mo>&amp;OverBar;</mo> </mover> <mo>=</mo> <mo>-</mo> <mrow> <mo>(</mo> <msub> <mi>cot&amp;theta;</mi> <mi>l</mi> </msub> <mo>+</mo> <msub> <mi>cot&amp;theta;</mi> <mi>r</mi> </msub> <mo>)</mo> </mrow> <mo>/</mo> <mn>2</mn> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>11</mn> <mo>)</mo> </mrow> </mrow>
<mrow> <mi>D</mi> <mo>=</mo> <mfrac> <mrow> <mo>|</mo> <msub> <mi>&amp;rho;</mi> <mi>l</mi> </msub> <mo>/</mo> <msub> <mi>sin&amp;theta;</mi> <mi>l</mi> </msub> <mo>-</mo> <msub> <mi>&amp;rho;</mi> <mi>r</mi> </msub> <mo>/</mo> <msub> <mi>sin&amp;theta;</mi> <mi>r</mi> </msub> <mo>|</mo> </mrow> <msqrt> <mrow> <mn>1</mn> <mo>+</mo> <msup> <mover> <mi>k</mi> <mo>&amp;OverBar;</mo> </mover> <mn>2</mn> </msup> </mrow> </msqrt> </mfrac> <mo>-</mo> <mo>-</mo> <mo>-</mo> <mrow> <mo>(</mo> <mn>12</mn> <mo>)</mo> </mrow> </mrow>
where k̄ is the mean slope of the two edge lines, used to calculate the spacing between the two parallel lines;
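Formulas (11)–(12) translate directly into code; a minimal sketch, with two synthetic parallel edge lines chosen for the example:

```python
import math

def stripe_width(rho_l, theta_l, rho_r, theta_r):
    """Width of a nearly parallel stripe from its two Hough edge lines,
    per formulas (11)-(12): mean slope k_bar, then parallel-line distance."""
    k_bar = -(1 / math.tan(theta_l) + 1 / math.tan(theta_r)) / 2    # formula (11)
    d = abs(rho_l / math.sin(theta_l) - rho_r / math.sin(theta_r))  # |b_l - b_r|
    return d / math.sqrt(1 + k_bar ** 2)                            # formula (12)

# Two parallel lines v = u + 1 and v = u + 3 (theta = 3*pi/4, rho = b*sin(theta))
# are separated by |3 - 1| / sqrt(2) = sqrt(2).
theta = 3 * math.pi / 4
w = stripe_width(1 * math.sin(theta), theta, 3 * math.sin(theta), theta)
```

Averaging the two slopes in k̄ makes the distance formula tolerant of the small angular difference between nearly parallel edges.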
2) Determining the geometric center of the stripe
The left and right images are evaluated with the Gaussian similarity image evaluation criterion. If the structured-light extraction threshold is met, the stripe center is determined directly from the edge line equations by the geometric center method: the stripe center line equation is obtained by solving the two edge line equations simultaneously for their angular bisector.
The slope k_center of the center line is obtained from the angular bisector slope formula (13):
|k_l − k_center| / (1 + k_l·k_center) = |k_center − k_r| / (1 + k_r·k_center) (13)
where k_l = −cotθ_l is the slope of the left-edge line and k_r = −cotθ_r is the slope of the right-edge line. The stripe center line equation is then:
v − v_0 = k_center(u − u_0) (14)
where (u_0, v_0) is the intersection point of the left and right edge lines: u_0 = (b_r − b_l)/(k_l − k_r) and v_0 = k_l·u_0 + b_l, with b_r and b_l the intercepts of the right and left lines, respectively.
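The geometric-center construction of formulas (13)–(14) can be sketched as follows. Rather than solving (13) algebraically, this sketch averages the direction angles of the two edge lines, which yields a bisector making equal angles with both edges and hence satisfies (13); the sample edge lines are illustrative assumptions.

```python
import math

def center_line(k_l, b_l, k_r, b_r):
    """Stripe center line through the edge-line intersection (u0, v0),
    with slope satisfying the angular-bisector condition of formula (13).
    Returns (k_center, u0, v0) for formula (14): v - v0 = k_center*(u - u0)."""
    u0 = (b_r - b_l) / (k_l - k_r)          # intersection of the two edges
    v0 = k_l * u0 + b_l
    # Bisector slope: average the direction angles of the two edge lines;
    # equal angles to both edges is exactly the condition of formula (13).
    k_center = math.tan((math.atan(k_l) + math.atan(k_r)) / 2)
    return k_center, u0, v0

# Edges v = 2u and v = 0.5u meet at the origin; their bisector is v = u.
k_c, u0, v0 = center_line(2.0, 0.0, 0.5, 0.0)
```

Substituting k_c back into both sides of formula (13) confirms the two angle tangents agree.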
Third step: corrected stripe extraction method
The left and right images are evaluated with the Gaussian similarity image evaluation criterion. If the score falls below the extraction threshold, the stripe center extracted by the geometric center method must be corrected.
1) Multi-source factor coupling model
Based on the properties of the laser and of the composite-material surface, the reflection characteristics of the laser on the composite surface are analyzed and a multi-source factor coupling model is established; its model formula (15) is as follows:
where I is the total structured-light intensity, k_a·I_pa is the ambient light intensity, k_a is the ambient illumination coefficient, k_d is the diffuse reflection coefficient, k_s is the specular reflection coefficient, the stripe incident angle is the angle between the incident stripe and the surface normal vector, ψ is the angle between the observation direction and the surface normal vector, d is the distance from the laser to the receiving surface, and h is the reflectivity coefficient;
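The patent's exact formula (15) is not reproduced in this text, but the terms it names (ambient, diffuse, and specular contributions coupled with the incident angle, viewing angle, and distance) follow the shape of a generic Phong-style reflection model. The sketch below is only that generic model, with an assumed inverse-square distance falloff and specular exponent n; it is not the patent's formula.

```python
import math

def phong_intensity(i_ambient, k_a, k_d, k_s, i_source,
                    phi_in, psi, d, h, n=8):
    """Generic Phong-style model: ambient term plus diffuse and specular
    terms that depend on the incident angle phi_in and viewing angle psi,
    attenuated with distance d (assumptions for illustration only)."""
    ambient = k_a * i_ambient
    diffuse = k_d * i_source * max(math.cos(phi_in), 0.0)
    specular = k_s * i_source * max(math.cos(psi), 0.0) ** n
    return ambient + h * (diffuse + specular) / (d * d)

# Intensity drops with distance and with a more grazing incident angle.
near = phong_intensity(1.0, 0.1, 0.6, 0.3, 10.0, 0.0, 0.0, 1.0, 1.0)
far = phong_intensity(1.0, 0.1, 0.6, 0.3, 10.0, 0.0, 0.0, 2.0, 1.0)
grazing = phong_intensity(1.0, 0.1, 0.6, 0.3, 10.0, math.pi / 3, 0.0, 1.0, 1.0)
```

The point of such a model is the one the patent exploits: for a known stripe, intensity and width become functions of only the incident and observation angles, so those angles can be recovered by inversion.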
2) Determining the structured-light stripe width
The following relational expression (16) is established from the spatial position relation between the stripe center in the real image and the stripe center in the standard image:
where D is the actual stripe width, the stripe incident angle is as in formula (15), ψ is the camera observation angle, φ is half the horizontal emission angle of the laser transmitter, and d is the distance from the laser to the receiving surface;
3) Solving for the correction angles
From formulas (15) and (16), when the information of the standard laser stripe is known, the gray value and spatial width of the stripe depend only on the incident angle of the laser on the composite surface and on the observation angle of the camera relative to the measured surface. When a laser stripe image is acquired, the actual stripe width D is calculated by Hough edge extraction, and formulas (15) and (16) are solved simultaneously to recover the stripe incident angle and the camera observation angle ψ, which serve as the correction angles for the stripe center line position;
4) Stripe center correction
The geometric center of the laser stripe is rapidly extracted by the geometric center method. With the stripe incident angle and the camera observation angle ψ known, the stripe center line position is corrected according to the variation rule of the stripe center in Euclidean space:
where Δ is the offset, obtained after correction, of the actual stripe center position relative to the ideal geometric center.
CN201510034164.0A 2015-01-21 2015-01-21 A kind of large surfaces Light stripes center extraction method of quick high accuracy Active CN104616325B (en)
