CN105335988A - Hierarchical processing based sub-pixel center extraction method - Google Patents


Info

Publication number
CN105335988A
CN105335988A · application CN201510618494.4A
Authority
CN
China
Prior art keywords
light stripe
low
image
row
tau
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510618494.4A
Other languages
Chinese (zh)
Other versions
CN105335988B (en)
Inventor
刘巍
张洋
高鹏
李晓东
杨帆
贾振元
高航
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian University of Technology filed Critical Dalian University of Technology
Priority to CN201510618494.4A priority Critical patent/CN105335988B/en
Publication of CN105335988A publication Critical patent/CN105335988A/en
Application granted granted Critical
Publication of CN105335988B publication Critical patent/CN105335988B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The invention belongs to the technical field of computer vision measurement and relates to a sub-pixel center extraction method based on hierarchical processing. The method achieves high-precision sub-pixel center extraction of feature light stripes by extracting the stripes' feature information from images at different resolutions. First, the original high-resolution image is compressed into a low-resolution image, and the normal vectors of the light stripe are extracted from the low-resolution image. Second, the normal vectors of the stripe in the high-resolution image are computed by interpolation from those of the low-resolution image. Finally, the stripe center is computed along each normal vector with the gray centroid method. Sub-pixel extraction of the stripe center is thereby realized, and the validity of the method is verified by reconstructing a standard light stripe. The method offers high measurement precision, fast computation, and short computing time, and satisfies the requirement of high-precision, fast extraction of light-stripe centers.

Description

Sub-pixel center extraction method based on hierarchical processing
Technical field
The invention belongs to the technical field of computer vision measurement and relates to a sub-pixel center extraction method based on hierarchical processing.
Background technology
The manufacturing and assembly precision of large parts is an important guarantee of the quality and performance of major equipment such as aircraft and automobiles, so high-precision three-dimensional measurement is essential in the manufacturing and assembly process. Laser-assisted vision measurement has the advantages of being non-contact, information-rich, and fast, and has been widely used in the measurement of large parts. A laser projects stripes onto the measured surface, a binocular camera collects the projected laser information in real time, the laser features are reconstructed based on binocular stereo vision theory, and the surface profile of the measured object is thereby obtained. Extraction of the light-stripe center is a key step in guaranteeing measurement accuracy, but pixel-level extraction cannot meet the accuracy requirement, while existing high-precision stripe-center extraction is time-consuming and cannot meet the requirement of fast measurement. How to achieve fast, high-precision sub-pixel extraction of light-stripe centers is therefore an important open problem.
In the paper "An unbiased detector of curvilinear structures" (C. Steger, IEEE Transactions on Pattern Analysis and Machine Intelligence, 1998, 20(2), 113-125), the Hessian matrix is used to obtain the normal direction of a stripe in the image, and the sub-pixel position of the stripe center is located by finding the extremum along that normal. The method has high precision and good robustness, but its computational cost is large, making fast stripe-center extraction difficult. The patent "A method for extracting the center of a structured-light stripe" by Wei Zhenzhong et al. (patent No. CN101504770) obtains the structured-light stripe center by iterating between the center points and the normal directions at those points, then smooths the obtained centers to extract the final stripe center. The method has high measurement accuracy and removes redundant computation steps, but its computing time is too long to meet the requirement of fast on-site measurement.
Summary of the invention
The technical problem to be solved by the present invention is the difficulty of fast, high-precision sub-pixel extraction of light-stripe centers; to this end, a sub-pixel center extraction method based on hierarchical processing is invented. The method adopts hierarchical processing: it computes the stripe-center normal vectors in a low-resolution image, extracts the sub-pixel stripe center with high accuracy in the high-resolution image, and computes the stripe center along each normal vector with the gray centroid method. Sub-pixel extraction of the stripe center is thereby realized, extraction efficiency is improved, and fast, high-precision stripe-center extraction is achieved.
The technical solution adopted by the present invention is a sub-pixel center extraction method based on hierarchical processing, characterized in that the method extracts the feature information of the light stripe from images at different resolutions to achieve high-precision sub-pixel center extraction of the feature stripe. First, the original high-resolution image is compressed into a low-resolution image, and the stripe normal vectors are extracted from the low-resolution image; then, in the high-resolution image, the stripe normal vectors are computed by interpolation; finally, the stripe center is computed along each normal vector with the gray centroid method, realizing sub-pixel extraction of the stripe center, and the validity of the extraction method is verified by reconstructing a standard stripe. The concrete steps are as follows:
Step 1: extraction of stripe normal vectors in the low-resolution image
1) Low-resolution image compression
For the feature stripe image collected by the binocular camera, with resolution m × n, define the compression factor τ as an integral multiple of 4, i.e. τ = 4s (s = 1, 2, 3, ...). The image is compressed by accumulating the gray values of each τ × τ block of pixels and averaging them into a single pixel:
$$I^c_{i,j} = \left( \sum_{\sigma=1}^{\tau} \sum_{\delta=1}^{\tau} I_{\tau(i-1)+\delta,\,\tau(j-1)+\sigma} \right) / \tau^2, \quad i = 1, 2, \ldots, \frac{m}{\tau},\ j = 1, 2, \ldots, \frac{n}{\tau} \qquad (1)$$
where $I^c_{i,j}$ is the gray value of the pixel in row i, column j of the compressed image, $I_{\tau(i-1)+\delta,\,\tau(j-1)+\sigma}$ is the gray value of the pixel in row τ(i-1)+δ, column τ(j-1)+σ of the high-resolution image, σ and δ are summation indices, and m and n are the numbers of rows and columns of the high-resolution image.
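The block averaging of Eq. (1) can be sketched in a few lines of NumPy; the function name `compress_image` and the τ = 4 example are illustrative, not from the patent.

```python
import numpy as np

def compress_image(img, tau=4):
    """Block-average compression per Eq. (1): each tau x tau block of the
    high-resolution image becomes one pixel of the low-resolution image."""
    m, n = img.shape
    # Crop so both dimensions are exact multiples of tau (an assumption;
    # the patent implicitly takes m and n divisible by tau).
    m_c, n_c = (m // tau) * tau, (n // tau) * tau
    blocks = img[:m_c, :n_c].reshape(m_c // tau, tau, n_c // tau, tau)
    return blocks.mean(axis=(1, 3))   # average over each tau x tau block

high = np.arange(64, dtype=float).reshape(8, 8)
low = compress_image(high, tau=4)    # 8x8 image -> 2x2 image
```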
2) Stripe-center extraction in the low-resolution image
The low-resolution image is filtered, and the Sobel edge detection operator is then used to extract the stripe boundary:
$$g(i,j) = \left\{ d_x^2(i,j) + d_y^2(i,j) \right\}^{1/2}$$
$$d_x = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix} \qquad d_y = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix} \qquad (2)$$
where g(i,j) is the Sobel edge response at image coordinate (i,j), and $d_x$ and $d_y$ are the two convolution masks.
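As an illustration, the Sobel response of Eq. (2) can be computed with plain NumPy; the function name and the step-edge test image are assumptions for this sketch, and the masks are applied by explicit correlation (for the magnitude, the sign flip of true convolution does not matter).

```python
import numpy as np

def sobel_magnitude(img):
    """Gradient magnitude g(i,j) of Eq. (2): sqrt(dx^2 + dy^2) with the
    two 3x3 Sobel masks; border pixels are left at 0."""
    dx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    dy = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=float)
    h, w = img.shape
    g = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            win = img[i - 1:i + 2, j - 1:j + 2]
            gx = (win * dx).sum()   # horizontal gradient
            gy = (win * dy).sum()   # vertical gradient
            g[i, j] = np.hypot(gx, gy)
    return g

# A vertical step edge: the response peaks on the columns beside the step.
step = np.zeros((5, 7))
step[:, 4:] = 1.0
g = sobel_magnitude(step)
```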
From the extracted boundary image, the stripe boundary is searched row by row in the low-resolution image, and the geometric-center method gives the stripe center coordinate of each row:
$$(v_i^c, u_i^c) = \left( v_i^c,\ \frac{u_{l,i}^c + u_{r,i}^c}{2} \right) \qquad (3)$$
where $u_{l,i}^c$ and $u_{r,i}^c$ are the left and right boundary pixel columns of row i of the image. When the resulting pixel coordinate is not an integer, it is rounded to an integer, giving the pixel coordinate of the geometric center.
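A minimal sketch of the per-row geometric-center search of Eq. (3), assuming the boundary image has already been thresholded into a binary mask; `edge_mask` and the helper name are illustrative.

```python
import numpy as np

def row_geometric_centers(edge_mask):
    """Per-row stripe center from Eq. (3): the midpoint of the leftmost and
    rightmost boundary columns, rounded to the nearest integer column when
    the midpoint is fractional (the patent rounds non-integer coordinates)."""
    centers = []
    for i, row in enumerate(edge_mask):
        cols = np.flatnonzero(row)
        if cols.size == 0:
            continue  # no stripe boundary found in this row
        u_l, u_r = cols[0], cols[-1]
        centers.append((i, int(round((u_l + u_r) / 2))))
    return centers

mask = np.zeros((3, 9), dtype=bool)
mask[1, 2] = mask[1, 6] = True       # boundaries at columns 2 and 6 in row 1
centers = row_geometric_centers(mask)
```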
3) Computation of the stripe-center normal vectors in the low-resolution image
The centers of the w rows before and after each stripe geometric center are chosen for curve fitting, with at least four points per fit; the chosen sequence of stripe geometric centers is
$$\{P_k \mid k = i-w,\, i-w+1, \ldots, i-1,\, i,\, i+1, \ldots, i+w-1,\, i+w;\ k \ge 1;\ w \ge 3\} \qquad (4)$$
where i is the row index in the low-resolution image. Because the local curve segment is short, a quadratic curve is used for fitting:
$$f_i(x) = a_i x^2 + b_i x + c_i \qquad (5)$$
where $a_i$, $b_i$, $c_i$ are the fitting coefficients of the center curve at row i. Differentiating the fitted curve gives the normal vector at the center of row i:
$$q_i^c = \frac{\partial f_i(v_i^c)}{\partial x} = 2 a_i v_i^c + b_i \qquad (6)$$
where $\partial f_i(v_i^c)/\partial x$ is the derivative of $f_i(x)$ with respect to x at $x = v_i^c$, which defines the normal direction at that point; the normal direction of every geometric center in the low-resolution image is computed from this formula.
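The quadratic fit of Eq. (5) and its derivative in Eq. (6) can be sketched with `numpy.polyfit`. Note that differentiating $a_i x^2 + b_i x + c_i$ gives $2 a_i x + b_i$ (the printed formula in this copy drops the factor 2); the function name and the synthetic parabola below are illustrative.

```python
import numpy as np

def center_normal_slope(rows, cols, i0):
    """Fit f_i(x) = a x^2 + b x + c through the stripe centers around row i0
    (Eq. (5)) and return the tangent slope f'_i(i0) = 2*a*i0 + b (Eq. (6));
    the normal direction of the stripe is perpendicular to this tangent."""
    a, b, c = np.polyfit(rows, cols, 2)   # coefficients, highest power first
    return 2 * a * i0 + b

rows = np.array([0, 1, 2, 3, 4, 5, 6], dtype=float)
cols = rows ** 2                  # synthetic stripe centers lying on x^2
slope = center_normal_slope(rows, cols, 3.0)   # exact slope of x^2 at 3 is 6
```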
Step 2: sub-pixel extraction of the stripe center in the high-resolution image
1) Coarse positioning of the stripe center
The high-resolution image is filtered, and the stripe boundary in the high-resolution image is extracted with the same boundary extraction algorithm as in the low-resolution image, i.e. the Sobel operator of formula (2). Using the geometric-center computation of step 1 within the stripe width, the geometric center of row i of the stripe is
$$(v_i^e, u_i^e) = \left( v_i^e,\ \frac{u_{l,i}^e + u_{r,i}^e}{2} \right) \qquad (7)$$
where $u_{l,i}^e$ and $u_{r,i}^e$ are the left and right boundary pixel columns of row i of the image. When the resulting pixel coordinate is not an integer, it is rounded to an integer, giving the pixel coordinate of the geometric center.
2) Computation of normal vectors in the high-resolution image
The stripe-center normal vectors solved in the low-resolution image are mapped back into the stripe of the high-resolution image. One row of pixels of the low-resolution image corresponds to τ rows of pixels of the high-resolution image, so the stripe normal vector of each row center in the low-resolution image is defined as the normal vector of the middle (τ/2-th) stripe row in the high-resolution image, i.e.
$$q_i^c = q_{\tau(i-1) + (\tau+1)/2} \qquad (8)$$
where $q_{\tau(i-1)+(\tau+1)/2}$ is the normal vector at row τ(i-1)+(τ+1)/2 of the high-resolution image. From two such neighboring stripe normal vectors, the normal vectors of the intermediate rows of the high-resolution image are computed by interpolation:
$$q_{\tau(i-1)+\tau/2+h} = \frac{q_i^c}{\left| q_i^c \right|} + \frac{q_{i+1}^c}{\left| q_{i+1}^c \right|} \cdot \frac{h - 1/2}{\tau/2}, \quad h = 1, 2, 3, \ldots, \tau \qquad (9)$$
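Eqs. (8)-(9) anchor each low-resolution normal at the middle high-resolution row and fill the rows in between from the two neighboring normals. The printed Eq. (9) is garbled in this copy, so the sketch below uses plain linear interpolation of the unit normals — one consistent reading of the step, not necessarily the inventors' exact weighting; all names are illustrative.

```python
import numpy as np

def interpolate_normals(q_lo, tau=4):
    """Map low-resolution center normals back to high-resolution rows.
    Each low-res row stands for tau high-res rows (Eq. (8)); rows in
    between take a linear blend of the two neighbouring unit normals
    (an assumed reading of Eq. (9))."""
    q_lo = np.asarray(q_lo, dtype=float)
    unit = q_lo / np.linalg.norm(q_lo, axis=1, keepdims=True)
    q_hi = []
    for i in range(len(unit) - 1):
        for h in range(tau):
            t = h / tau                      # fraction between anchor rows
            q_hi.append((1 - t) * unit[i] + t * unit[i + 1])
    q_hi.append(unit[-1])                    # last anchor row
    return np.array(q_hi)

q_lo = [[1.0, 0.0], [0.0, 1.0]]              # two synthetic anchor normals
q_hi = interpolate_normals(q_lo, tau=4)
```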
3) Fine extraction of the stripe center with the gray centroid method
The stripe is searched along the normal vector of each row center, and the gray centroid of the stripe is computed within the stripe width along the normal:
$$(v_e, u_e) = \frac{\sum_{e=l}^{r} e\, I_{ij}^e}{\sum_{e=l}^{r} I_{ij}^e} \qquad (10)$$
where $(u_e, v_e)$ is the gray-centroid coordinate of the e-th stripe point, e indexes the pixels from the left boundary l to the right boundary r along the normal, and $I_{ij}$ is the gray value at row i, column j.
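Eq. (10) is an intensity-weighted mean of the pixel positions sampled along the normal; a minimal sketch (the names and the symmetric test profile are assumptions):

```python
import numpy as np

def gray_centroid(intensities, coords):
    """Gray centroid of Eq. (10): intensity-weighted mean of the pixel
    coordinates sampled along the stripe normal."""
    intensities = np.asarray(intensities, dtype=float)
    coords = np.asarray(coords, dtype=float)
    return (coords * intensities).sum() / intensities.sum()

# A symmetric intensity profile centred on column 5:
cols = np.arange(3, 8)
gray = np.array([10, 50, 100, 50, 10], dtype=float)
center = gray_centroid(gray, cols)   # sub-pixel column of the stripe centre
```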
Step 3: verification of the stripe extraction accuracy
The extraction accuracy of the stripe is verified through the reconstruction accuracy of a standard cylinder computed from the reconstructed stripe information. First, the gray centroids extracted from the left and right high-resolution images are matched under the epipolar constraint:
$$x_e'^{\mathsf T} F x_e'' = 0 \qquad (11)$$
where $x_e' = (u_e, v_e)$ is a gray-centroid coordinate in the left camera image, $x_e'' = (u_e', v_e')$ is the gray-centroid coordinate in the right image matching $x_e'$, and F is the fundamental matrix between the two cameras, computed by the eight-point algorithm. If two points in the left and right images satisfy formula (11), they are taken as a matched pair of centroid points. The matched center points are reconstructed in three dimensions, a standard cylinder of diameter $R_f$ is fitted to the reconstructed points, and the reconstruction accuracy η is evaluated from the fitted diameter $R_f$ and the standard diameter R:
$$\eta = \frac{R_f - R}{R} \times 100\% \qquad (12)$$
This verifies the validity of the extraction method.
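The epipolar test of Eq. (11) and the error measure of Eq. (12) are straightforward to sketch; the fundamental matrix below is a toy example for a rectified pair translated along x, not a calibrated one, and all names are illustrative.

```python
import numpy as np

def epipolar_residual(x_left, x_right, F):
    """Residual of Eq. (11), x'^T F x'' with homogeneous image points;
    a (near-)zero residual marks the pair as a candidate stereo match."""
    xl = np.append(np.asarray(x_left, dtype=float), 1.0)
    xr = np.append(np.asarray(x_right, dtype=float), 1.0)
    return float(xl @ F @ xr)

def reconstruction_error(r_fit, r_std):
    """Relative reconstruction error eta of Eq. (12), in percent."""
    return (r_fit - r_std) / r_std * 100.0

# Toy F for a rectified pair translated along x: matches share the same row v.
F = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, -1.0], [0.0, 1.0, 0.0]])
res = epipolar_residual((2.0, 3.0), (5.0, 3.0), F)   # same row -> residual 0
eta = reconstruction_error(602.0, 600.0)   # the embodiment's 602 mm vs 600 mm
```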
The beneficial effects of the invention are as follows. The extraction method first compresses the original image and extracts the stripe normal vectors from the low-resolution image; then, in the high-resolution image, the stripe normal vectors are computed by interpolation; finally, the stripe center is computed along each normal vector with the gray centroid method, realizing sub-pixel extraction of the stripe center. The method has high measurement precision, fast computation, and short computing time; it achieves sub-pixel extraction of the stripe and satisfies the requirement of high-precision, fast extraction of stripe centers.
Accompanying drawing explanation
Fig. 1 is a schematic diagram of the high-resolution stripe image, where 1 is the high-resolution image and 2 is the original stripe.
Fig. 2 is the low-resolution image after compression, where 3 is the low-resolution image and 4 is the compressed stripe; the normal vectors of rows 1, 2, and 3 of the compressed low-resolution image are also shown.
Fig. 3 is the flow chart of the sub-pixel center extraction method based on hierarchical processing.
Embodiment
A specific embodiment of the present invention is described in detail below with reference to the technical scheme and the accompanying drawings.
In the embodiment, two cameras, each fitted with a wide-angle lens, capture one stripe image each. The camera model is a Vieworks VC-12MC-M/C 65: resolution 4096 × 3072, CMOS image sensor, full-frame rate up to 64.3 fps, weight 420 g. The wide-angle lens model is an EF 16-35 mm f/2.8L II USM with the following parameters: focal length f = 16-35 mm, APS-equivalent focal length 25.5-52.5 mm, aperture F2.8, lens dimensions 82 × 106 mm. The shooting conditions are: image size 4096 × 3072 pixels, lens focal length 17 mm, object distance 750 mm, and a field of view of about 720 mm × 1300 mm. The measured standard part is a standard cylinder of radius 600 mm and length 1200 mm. First, the laser position is adjusted so that the stripe is projected onto the standard cylinder, and the camera focus is adjusted to collect a clear stripe image.
The extraction method based on hierarchical processing extracts the feature information of the stripe from images at different resolutions to achieve high-precision sub-pixel center extraction of the feature stripe. First, the original high-resolution image is compressed into a low-resolution image, and the stripe normal vectors are extracted from the low-resolution image; then, in the high-resolution image, the stripe normal vectors are computed by interpolation; finally, the stripe center is computed along each normal vector with the gray centroid method, realizing sub-pixel extraction of the stripe center, and the validity of the extraction method is verified by reconstructing a standard stripe.
Fig. 3 is the flow chart of the sub-pixel center extraction method based on hierarchical processing. The whole extraction process is divided into several main steps: low-resolution image compression, stripe-center extraction in the low-resolution image, computation of the stripe-center normal vectors in the low-resolution image, and interpolation of the normal vectors in the high-resolution image. The normal vectors of the high-resolution image are obtained by interpolating the direction vectors extracted from the low-resolution image, the stripe center is obtained with the gray centroid method, and the sub-pixel extraction of the stripe center is thereby completed. The concrete steps of the extraction method are as follows:
Step 1: extraction of stripe normal vectors in the low-resolution image
1) Low-resolution image compression
The feature stripe image collected by the binocular camera is compressed with compression factor τ = 4 according to formula (1); the compressed image is shown in Fig. 2;
2) Stripe-center extraction in the low-resolution image
The low-resolution image is filtered, and the Sobel edge detection operator is then used to extract the stripe boundary according to formula (2);
3) Computation of the stripe-center normal vectors in the low-resolution image
The centers of the 4 rows before and after each stripe geometric center are chosen for curve fitting; the stripe curve is fitted with formula (5) and the stripe normal vectors are computed, as shown in Fig. 2 for the normal vectors of rows 1, 2, and 3 of the compressed low-resolution image;
Step 2: sub-pixel extraction of the stripe center in the high-resolution image
1) Coarse positioning of the stripe center
Fig. 1 is a schematic diagram of the high-resolution stripe image. The high-resolution image is filtered, the stripe boundary in the high-resolution image is extracted with the boundary extraction algorithm used in the low-resolution image, i.e. the Sobel operator of formula (2), and the geometric center of the stripe is computed within the stripe width according to formula (7); when a pixel coordinate is not an integer, it is rounded to an integer;
2) Computation of normal vectors in the high-resolution image
The stripe-center normal vectors solved in the low-resolution image are mapped back into the stripe of the high-resolution image; one row of pixels of the low-resolution image corresponds to 4 rows of pixels of the high-resolution image. From the adjacent normal vectors before and after, the normal vectors of the geometric centers in the high-resolution image are computed by interpolation based on formulas (8) and (9);
3) Fine extraction of the stripe center with the gray centroid method
The stripe is searched along the normal vector of each row center, and the gray centroid of the stripe is computed within the stripe width along the normal;
Step 3: verification of the stripe extraction accuracy
The extraction accuracy of the stripe is verified through the reconstruction accuracy of a standard cylinder computed from the reconstructed stripe information. First, the gray centroids extracted from the left and right high-resolution images are matched under the epipolar constraint, with the match points of the left and right images judged by formula (11). The matched center points are reconstructed in three dimensions, and a standard cylinder diameter is fitted to the reconstructed points. The fitted diameter of 602 mm and the standard diameter of 600 mm give a reconstruction accuracy of 0.33% according to formula (12); relative to pixel-level stripe extraction algorithms, the precision is improved by 74.5%, which verifies the validity of the extraction method.
The present invention first compresses the original image and extracts the stripe normal vectors from the low-resolution image; then, in the high-resolution image, the stripe normal vectors are computed by interpolation; finally, the stripe center is computed along each normal vector with the gray centroid method, realizing sub-pixel extraction of the stripe center. The method has high measurement precision and fast computation; it achieves sub-pixel extraction of the stripe and satisfies the requirement of high-precision, fast extraction of stripe centers.

Claims (1)

1. A sub-pixel center extraction method based on hierarchical processing, characterized in that the method extracts the feature information of the light stripe from images at different resolutions to achieve high-precision sub-pixel center extraction of the feature stripe; first, the original high-resolution image is compressed into a low-resolution image, and the stripe normal vectors are extracted from the low-resolution image; then, in the high-resolution image, the stripe normal vectors are computed by interpolation; finally, the stripe center is computed along each normal vector with the gray centroid method, realizing sub-pixel extraction of the stripe center, and the validity of the extraction method is verified by reconstructing a standard stripe; the concrete steps are as follows:
Step 1: extraction of stripe normal vectors in the low-resolution image
1) Low-resolution image compression
For the feature stripe image collected by the binocular camera, with resolution m × n, define the compression factor τ as an integral multiple of 4, i.e. τ = 4s (s = 1, 2, 3, ...); the image is compressed by accumulating the gray values of each τ × τ block of pixels and averaging them into a single pixel:
$$I^c_{i,j} = \left( \sum_{\sigma=1}^{\tau} \sum_{\delta=1}^{\tau} I_{\tau(i-1)+\delta,\,\tau(j-1)+\sigma} \right) / \tau^2, \quad i = 1, 2, \ldots, \frac{m}{\tau},\ j = 1, 2, \ldots, \frac{n}{\tau} \qquad (1)$$
where $I^c_{i,j}$ is the gray value of the pixel in row i, column j of the compressed image, $I_{\tau(i-1)+\delta,\,\tau(j-1)+\sigma}$ is the gray value of the pixel in row τ(i-1)+δ, column τ(j-1)+σ of the high-resolution image, σ and δ are summation indices, and m and n are the numbers of rows and columns of the high-resolution image;
2) Stripe-center extraction in the low-resolution image
The low-resolution image is filtered, and the Sobel edge detection operator is then used to extract the stripe boundary:
$$g(i,j) = \left\{ d_x^2(i,j) + d_y^2(i,j) \right\}^{1/2}$$
$$d_x = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix} \qquad d_y = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix} \qquad (2)$$
where g(i,j) is the Sobel edge response at image coordinate (i,j), and $d_x$ and $d_y$ are the two convolution masks;
From the extracted boundary image, the stripe boundary is searched row by row in the low-resolution image, and the geometric-center method gives the stripe center coordinate of each row:
$$(v_i^c, u_i^c) = \left( v_i^c,\ \frac{u_{l,i}^c + u_{r,i}^c}{2} \right) \qquad (3)$$
where $u_{l,i}^c$ and $u_{r,i}^c$ are the left and right boundary pixel columns of row i of the image; when the resulting pixel coordinate is not an integer, it is rounded to an integer, giving the pixel coordinate of the geometric center;
3) Computation of the stripe-center normal vectors in the low-resolution image
The centers of the w rows before and after each stripe geometric center are chosen for curve fitting, with at least four points per fit; the chosen sequence of stripe geometric centers is
$$\{P_k \mid k = i-w,\, i-w+1, \ldots, i-1,\, i,\, i+1, \ldots, i+w-1,\, i+w;\ k \ge 1;\ w \ge 3\} \qquad (4)$$
where i is the row index in the low-resolution image; because the local curve segment is short, a quadratic curve is used for fitting:
$$f_i(x) = a_i x^2 + b_i x + c_i \qquad (5)$$
where $a_i$, $b_i$, $c_i$ are the fitting coefficients of the center curve at row i; differentiating the fitted curve gives the normal vector at the center of row i:
$$q_i^c = \frac{\partial f_i(v_i^c)}{\partial x} = 2 a_i v_i^c + b_i \qquad (6)$$
where $\partial f_i(v_i^c)/\partial x$ is the derivative of $f_i(x)$ with respect to x at $x = v_i^c$, which defines the normal direction at that point; the normal direction of every geometric center in the low-resolution image is computed from this formula;
Step 2: sub-pixel extraction of the stripe center in the high-resolution image
1) Coarse positioning of the stripe center
The high-resolution image is filtered, and the stripe boundary in the high-resolution image is extracted with the same boundary extraction algorithm as in the low-resolution image, i.e. the Sobel operator of formula (2); using the geometric-center computation of step 1 within the stripe width, the geometric center of row i of the stripe is
$$(v_i^e, u_i^e) = \left( v_i^e,\ \frac{u_{l,i}^e + u_{r,i}^e}{2} \right) \qquad (7)$$
where $u_{l,i}^e$ and $u_{r,i}^e$ are the left and right boundary pixel columns of row i of the image; when the resulting pixel coordinate is not an integer, it is rounded to an integer, giving the pixel coordinate of the geometric center;
2) Computation of normal vectors in the high-resolution image
The stripe-center normal vectors solved in the low-resolution image are mapped back into the stripe of the high-resolution image; one row of pixels of the low-resolution image corresponds to τ rows of pixels of the high-resolution image, so the stripe normal vector of each row center in the low-resolution image is defined as the normal vector of the middle (τ/2-th) stripe row in the high-resolution image, i.e.
$$q_i^c = q_{\tau(i-1) + (\tau+1)/2} \qquad (8)$$
where $q_{\tau(i-1)+(\tau+1)/2}$ is the normal vector at row τ(i-1)+(τ+1)/2 of the high-resolution image; from two such neighboring stripe normal vectors, the normal vectors of the intermediate rows of the high-resolution image are computed by interpolation:
$$q_{\tau(i-1)+\tau/2+h} = \frac{q_i^c}{\left| q_i^c \right|} + \frac{q_{i+1}^c}{\left| q_{i+1}^c \right|} \cdot \frac{h - 1/2}{\tau/2}, \quad h = 1, 2, 3, \ldots, \tau \qquad (9)$$
3) Fine extraction of the stripe center with the gray centroid method
The stripe is searched along the normal vector of each row center, and the gray centroid of the stripe is computed within the stripe width along the normal:
$$(v_e, u_e) = \frac{\sum_{e=l}^{r} e\, I_{ij}^e}{\sum_{e=l}^{r} I_{ij}^e} \qquad (10)$$
where $(u_e, v_e)$ is the gray-centroid coordinate of the e-th stripe point, e indexes the pixels from the left boundary l to the right boundary r along the normal, and $I_{ij}$ is the gray value at row i, column j;
Step 3: verification of the stripe extraction accuracy
The extraction accuracy of the stripe is verified through the reconstruction accuracy of a standard cylinder computed from the reconstructed stripe information; first, the gray centroids extracted from the left and right high-resolution images are matched under the epipolar constraint:
$$x_e'^{\mathsf T} F x_e'' = 0 \qquad (11)$$
where $x_e' = (u_e, v_e)$ is a gray-centroid coordinate in the left camera image, $x_e'' = (u_e', v_e')$ is the gray-centroid coordinate in the right image matching $x_e'$, and F is the fundamental matrix between the two cameras, computed by the eight-point algorithm; if two points in the left and right images satisfy formula (11), they are taken as a matched pair of centroid points; the matched center points are reconstructed in three dimensions, a standard cylinder of diameter $R_f$ is fitted to the reconstructed points, and the reconstruction accuracy η is evaluated from the fitted diameter $R_f$ and the standard diameter R:
$$\eta = \frac{R_f - R}{R} \times 100\% \qquad (12)$$
This verifies the validity of the extraction method.
CN201510618494.4A 2015-09-25 2015-09-25 Sub-pixel center extraction method based on hierarchical processing Active CN105335988B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510618494.4A CN105335988B (en) 2015-09-25 2015-09-25 Sub-pixel center extraction method based on hierarchical processing


Publications (2)

Publication Number Publication Date
CN105335988A true CN105335988A (en) 2016-02-17
CN105335988B CN105335988B (en) 2017-12-26


Country Status (1)

Country Link
CN (1) CN105335988B (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8483487B2 (en) * 2011-06-28 2013-07-09 Fu Tai Hua Industry (Shenzhen) Co., Ltd. Image processing device and method for capturing object outline
CN103400399A (en) * 2013-08-07 2013-11-20 长春工业大学 Spatial moment based line structured light center extraction method
CN104616325A (en) * 2015-01-21 2015-05-13 大连理工大学 Rapid and high-precision method for extracting light strip center on large surface


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
刘枝梅 et al.: "Extraction algorithm of the light-stripe center in a structured-light measurement ***", Journal of Beijing Institute of Machinery *
江永付 et al.: "Accurate sub-pixel extraction method for the center of a line-structured-light stripe", Laser & Optoelectronics Progress *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107687819A * 2017-08-01 2018-02-13 大连理工大学 Fast and high-precision sub-pixel extraction method for light-stripe centers
CN107578420A * 2017-08-21 2018-01-12 大连理工大学 Adaptive threshold segmentation method for light-stripe images
CN107578420B * 2017-08-21 2019-11-19 大连理工大学 Adaptive threshold segmentation method for light-stripe images
CN108510544A * 2018-03-30 2018-09-07 大连理工大学 Light-stripe positioning method based on feature clustering
CN108510544B * 2018-03-30 2020-01-17 大连理工大学 Light strip positioning method based on feature clustering
CN108550160A * 2018-04-03 2018-09-18 大连理工大学 Non-uniform light-stripe feature-region extraction method based on a light-intensity template
CN108550160B * 2018-04-03 2020-04-07 大连理工大学 Non-uniform light bar characteristic region extraction method based on light intensity template
CN112381807A * 2020-11-18 2021-02-19 北京图知天下科技有限责任公司 Method, system and computer for detecting crystal diameter in Czochralski single crystal production
CN112381807B * 2020-11-18 2023-06-20 北京图知天下科技有限责任公司 Crystal diameter detection method, system and equipment in Czochralski crystal production

Also Published As

Publication number Publication date
CN105335988B (en) 2017-12-26


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant