CN107578420A - An adaptive light-stripe image threshold segmentation method - Google Patents

An adaptive light-stripe image threshold segmentation method

Info

Publication number
CN107578420A
Authority
CN
China
Prior art keywords
striation
image
section
optical strip
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710715098.2A
Other languages
Chinese (zh)
Other versions
CN107578420B (en)
Inventor
刘巍
张致远
叶帆
赵海洋
兰志广
张洋
马建伟
贾振元
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian University of Technology filed Critical Dalian University of Technology
Priority to CN201710715098.2A priority Critical patent/CN107578420B/en
Publication of CN107578420A publication Critical patent/CN107578420A/en
Application granted granted Critical
Publication of CN107578420B publication Critical patent/CN107578420B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)

Abstract

The present invention, an adaptive light-stripe image threshold segmentation method, belongs to the technical field of binocular vision and relates to adaptive threshold segmentation of light-stripe images. The method segments an initial stripe region with a conventional fixed-threshold image segmentation, obtaining the column coordinates of the left and right boundaries of each stripe cross-section; it then establishes an image gray-level distribution evaluation coefficient, computing from the initial threshold segmentation result the cross-section energy intensity of every stripe row and, from the stripe distribution characteristics, the gray level of the ideal stripe cross-section energy intensity; it finally establishes an adaptive threshold segmentation correlation model positively correlated with the gray-level distribution coefficient of the stripe image to determine the adaptive segmentation threshold, so that the stripe region is accurately separated from the background. The method improves the extraction accuracy of light stripes on large aerospace components with freeform surfaces and avoids the extraction difficulty and poor extraction accuracy caused by locally overexposed or overly dark stripes.

Description

An adaptive light-stripe image threshold segmentation method
Technical field
The invention belongs to the technical field of binocular vision and relates to an adaptive light-stripe image threshold segmentation method.
Background art
In vision measurement, accurate extraction of the light-stripe center is the key to high-precision three-dimensional measurement. When large aerospace components are the measurement objects, however, their surfaces are usually freeform, so the projected stripe is modulated by the surface of the workpiece and appears in the captured image as a distorted free-form curve. Moreover, the on-site lighting environment is complex, with local specular reflections and uneven global illumination, so the gray-level distribution of the captured stripe image is severely non-uniform: some stripe regions are overexposed while others are too dark, which seriously degrades the completeness and accuracy of stripe extraction. Current vision-measurement practice usually applies a single binarization threshold to the whole captured image for stripe contour extraction; under field conditions the extracted stripe is then prone to local breaks and locally overexposed, over-wide segments.
A literature search found the master's thesis of Long Jianwu, "Research on Key Technologies of Image Threshold Segmentation", Jilin University, 2014. The thesis extends the two-dimensional minimum-error method to three dimensions and, combining stereogram reconstruction with a dimensionality-reduction idea, proposes a robust minimum-error threshold algorithm. That method effectively solves the segmentation of small-target images under non-uniform illumination, but for large aerospace components, where both the part and the field of view are large, it is of limited use.
Summary of the invention
Aimed at the problem of non-uniform gray-level distribution of stripe images on the surfaces of large aerospace components, the invention provides an adaptive light-stripe image threshold segmentation method. The method segments an initial stripe region with a conventional fixed-threshold image segmentation and obtains the column coordinates of the left and right boundaries of the stripe region; it then establishes an image gray-level distribution evaluation coefficient, defining, from the initial threshold segmentation result, the mean gray level of each stripe cross-section as the cross-section energy intensity of the stripe; it finally establishes an adaptive threshold segmentation correlation model for the stripe image to determine the adaptive segmentation threshold, so that the stripe region is accurately separated from the background. This greatly improves the extraction accuracy of light stripes on large aerospace components with freeform surfaces and avoids the extraction difficulty and poor extraction accuracy caused by locally overexposed or overly dark stripes.
The technical solution adopted by the invention is an adaptive light-stripe image threshold segmentation method, characterized in that the method segments an initial stripe region with a conventional fixed-threshold image segmentation and obtains the column coordinates of the left and right boundaries of each stripe cross-section; then establishes an image gray-level distribution evaluation coefficient: based on the initial threshold segmentation result, the mean gray level of a stripe cross-section is defined as the cross-section energy intensity of the stripe, and the cross-section energy intensity of every stripe row is computed; according to the stripe distribution characteristics, the gray level of the ideal stripe cross-section energy intensity is computed and the gray-level distribution coefficient of the stripe image is obtained; an adaptive threshold segmentation correlation model positively correlated with this coefficient is then established to determine the adaptive segmentation threshold of the stripe image, so that the stripe region is accurately separated from the background. The method comprises the following steps:
Step 1: segment the initial stripe region
According to the gray-level distribution characteristics of a line-laser stripe, the uniformity of the gray-level distribution of the stripe image is determined jointly by the gray level of the stripe cross-sections and by the gray-gradient variation along the stripe direction. The stripe image is first processed with a conventional fixed-threshold image segmentation. Let the input stripe image be f and the output image g, where f(u, v) is the gray value of the input image at pixel (u, v) and g(u, v) is the gray value of the output image at pixel (u, v); the conventional binarization is then

$$g(u,v)=\begin{cases}0, & f(u,v)\le T\\ 1, & f(u,v)> T\end{cases}\tag{1}$$

where T is the image segmentation threshold;
The left and right boundary coordinates p(v) and q(v) of the stripe region are then computed from the binary image, where p(v) and q(v) are the left and right boundary column coordinates of the stripe cross-section in row v of the image;
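As an illustration of this first step, a minimal NumPy sketch is given below; it assumes a grayscale image stored as f[v, u] (row, column), and the function names and the use of -1 to mark rows without stripe pixels are choices of this sketch rather than anything specified in the patent.

```python
import numpy as np

def segment_fixed_threshold(f, T):
    """Conventional fixed-threshold binarization, formula (1): g = 1 where f > T, else 0."""
    return (f > T).astype(np.uint8)

def stripe_row_bounds(g):
    """For each row v of the binary image, return the left/right stripe boundary
    columns p(v), q(v); rows that contain no stripe pixel are marked with (-1, -1)."""
    rows = g.shape[0]
    p = np.full(rows, -1, dtype=int)
    q = np.full(rows, -1, dtype=int)
    for v in range(rows):
        cols = np.flatnonzero(g[v])
        if cols.size:
            p[v], q[v] = cols[0], cols[-1]
    return p, q
```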
Step 2: establish the image gray-level distribution evaluation coefficient
The actual cross-section gray level of the stripe image is computed first. Based on the initial threshold segmentation result, the mean gray level of a stripe cross-section is defined as the cross-section energy intensity of the stripe, formula (2):

$$EICS(v)=\frac{\sum_{u=p(v)}^{q(v)} f(u,v)}{q(v)-p(v)}\tag{2}$$

where EICS(v) is the stripe cross-section energy intensity of row v of the stripe image. EICS mainly characterizes the gray level of the stripe: the larger its value, the brighter the stripe at that cross-section and the easier it is to separate from the background; conversely, the smaller its value, the more likely the stripe at that cross-section is too dark, so its feature information may be lost in the overall extraction;
Then the ideal cross-section gray level of the stripe image is defined and computed. In theory, according to the generation mechanism of a line-laser stripe, the cross-section gray-level distribution of an ideal stripe image captured by the camera follows a Gaussian model; the mean gray level of a stripe cross-section whose peak gray value is saturated and which obeys the ideal Gaussian distribution is therefore defined as the ideal stripe cross-section energy intensity:

$$iEICS(v)=\frac{\sum_{k=-w(v)/2}^{w(v)/2} A\,e^{-\frac{k^{2}}{2\sigma_{w}^{2}}}}{w(v)}\tag{3}$$

where iEICS(v) is the stripe cross-section energy intensity of row v under the ideal condition; w(v) is the stripe width of row v in the actual stripe image, w(v) = q(v) − p(v); and A is the gray peak of the ideal cross-section, defined as the image saturation gray value, so A = 255. In addition, by the 3σ rule of the Gaussian distribution, 99.74% of the stripe energy is concentrated within ±3σ of the mean; since the overwhelming majority of the stripe energy is concentrated within the stripe width, w(v) = 6σ_w is defined, from which the standard deviation σ_w of the standard Gaussian distribution corresponding to each image row is computed;
The ratio of the actual cross-section energy intensity EICS(v) to the ideal cross-section energy intensity iEICS(v) is taken as the gray-level distribution coefficient η_CGDLS of the stripe image:

$$\eta_{CGDLS}(v)=\frac{EICS(v)}{iEICS(v)}\tag{4}$$
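This second step can be sketched as follows, reusing the per-row boundaries p(v), q(v) from the first step; the helper name and the NaN convention for rows without a stripe are assumptions of this sketch, while the quantities computed are formulas (2)-(4) as stated above.

```python
import numpy as np

A = 255.0  # gray peak of the ideal cross-section: the image saturation value

def stripe_profile_coefficients(f, p, q):
    """Per-row EICS, iEICS and eta_CGDLS of formulas (2)-(4).

    f    : grayscale image, f[v, u] is the gray value at column u of row v
    p, q : per-row left/right stripe boundary columns from the initial segmentation
    Rows without a detected stripe (p = q = -1) are left as NaN.
    """
    rows = f.shape[0]
    EICS = np.full(rows, np.nan)
    iEICS = np.full(rows, np.nan)
    for v in range(rows):
        w = q[v] - p[v]
        if w <= 0:
            continue
        # formula (2): mean gray level over the actual cross-section
        EICS[v] = f[v, p[v]:q[v] + 1].sum() / w
        # 3-sigma rule: the stripe width spans +/- 3 sigma of the ideal Gaussian
        sigma_w = w / 6.0
        k = np.arange(-(w // 2), w // 2 + 1)
        # formula (3): mean of an ideal, saturated Gaussian cross-section
        iEICS[v] = (A * np.exp(-k**2 / (2.0 * sigma_w**2))).sum() / w
    # formula (4): ratio of actual to ideal cross-section energy intensity
    return EICS, iEICS, EICS / iEICS
```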
Step 3: the adaptive threshold segmentation correlation model of the stripe image
Using the computed gray-level distribution coefficient η_CGDLS, a mathematical model of adaptive threshold extraction positively correlated with it is established:

$$\lambda(v)=f(\eta_{CGDLS}(v))\tag{5}$$

where λ(v) is the image segmentation threshold adapted to the cross-section gray level of row v of the stripe image;
Using linear regression, the adaptive segmentation threshold of the stripe image is determined by formula (6), where thresh_up and thresh_down are the stripe-image segmentation thresholds corresponding respectively to the maximum η_max and the minimum η_min of the gray-level distribution coefficient; they are obtained from prior knowledge, with the criterion that the stripe region can be accurately separated from the background. In this way, the binarization threshold used for each row is chosen adaptively.
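Since formula (6) itself is not reproduced in this text, the sketch below assumes one plausible realization of the stated linear, positively correlated relation: a linear map from the observed range [η_min, η_max] of the distribution coefficient onto [thresh_down, thresh_up]. The function names and the fallback behavior for rows without a stripe estimate are illustrative, not taken from the patent.

```python
import numpy as np

def adaptive_row_thresholds(eta, thresh_down, thresh_up):
    """Per-row threshold lambda(v): linear in eta_CGDLS, equal to thresh_down at the
    minimum eta and to thresh_up at the maximum eta (both chosen from prior knowledge)."""
    valid = ~np.isnan(eta)
    eta_min, eta_max = eta[valid].min(), eta[valid].max()
    lam = np.full_like(eta, np.nan)
    if eta_max == eta_min:
        lam[valid] = 0.5 * (thresh_down + thresh_up)
    else:
        lam[valid] = thresh_down + (thresh_up - thresh_down) * \
            (eta[valid] - eta_min) / (eta_max - eta_min)
    return lam

def segment_adaptive(f, lam, fallback_T):
    """Re-binarize the image row by row with the adaptive thresholds lambda(v);
    rows without a threshold estimate fall back to a global threshold."""
    g = np.zeros_like(f, dtype=np.uint8)
    for v in range(f.shape[0]):
        T = lam[v] if not np.isnan(lam[v]) else fallback_T
        g[v] = (f[v] > T).astype(np.uint8)
    return g
```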
The beneficial effect of the invention is that, by establishing an image gray-level distribution evaluation coefficient, computing from the initial threshold segmentation result the cross-section energy intensity of every stripe row, and computing from the stripe distribution characteristics the gray level of the ideal cross-section energy intensity, the extraction accuracy of light stripes on the surfaces of large aerospace components is greatly improved, and the extraction difficulty and poor extraction accuracy caused by locally overexposed or overly dark stripes are avoided.
Brief description of the drawings
Fig. 1 is the flow chart of the method.
Fig. 2 shows the variation of the gray-level distribution coefficient of the stripe image along the stripe direction; the abscissa is the cross-section index v and the ordinate is the gray-level distribution coefficient of that cross-section. 2 – overexposed region; 3, 4 – overly dark regions.
Fig. 3 is a schematic of the variation of the stripe cross-section energy intensity along the stripe direction; the abscissa is the cross-section index v and the ordinate is the cross-section energy intensity of that cross-section.
Fig. 4 is a schematic of a locally overexposed stripe, where 1 is the locally overexposed region.
Fig. 5 is the result after adaptive threshold segmentation.
Embodiment
The embodiment and the technical solution of the invention are described in detail below with reference to the accompanying drawings.
In this embodiment, the measured object is a T800 composite panel, onto which a 460 nm blue-violet line laser is projected.
The invention captures the stripe images with a camera fitted with a wide-angle lens. The camera is a Vieworks VC-12MC-M/C 65: resolution 4096 × 3072, CMOS image sensor, frame rate up to 64.3 fps at full frame, weight 420 g. The wide-angle lens is an EF 16-35mm f/2.8L II USM with the following parameters: focal length 16-35 mm (25.5-52.5 mm APS-C equivalent), aperture f/2.8, dimensions 82 × 106 mm. The shooting conditions are: image size 4096 × 3072 pixels, lens focal length 25 mm, object distance 750 mm, field of view approximately 850 mm × 450 mm.
The flow chart of the method is shown in Fig. 1; the steps are as follows:
Step 1: segment the initial stripe region. According to the gray-level distribution characteristics of a line-laser stripe, the uniformity of the gray-level distribution of the stripe image is determined jointly by the gray level of the stripe cross-sections and by the gray-gradient variation along the stripe direction. The stripe image is therefore first processed with a conventional fixed-threshold image segmentation: with the input stripe image f, the output image g, f(u, v) the gray value of the input image at pixel (u, v), and g(u, v) the gray value of the output image at pixel (u, v), the conventional binarization is expressed by formula (1).
The left and right boundary coordinates p(v) and q(v) of the stripe region are then computed from the binary image, where p(v) and q(v) are the left and right boundary column coordinates of the stripe cross-section in row v of the image.
Step 2: establish the image gray-level distribution evaluation coefficient. The actual cross-section gray level of the stripe image is computed: based on the initial threshold segmentation result, the mean gray level of a stripe cross-section is defined as the cross-section energy intensity of the stripe, formula (2).
Here EICS(v) is the stripe cross-section energy intensity of row v of the stripe image. EICS mainly characterizes the gray level of the stripe: the larger its value, the brighter the stripe at that cross-section and the easier it is to separate from the background; conversely, the smaller its value, the more likely the stripe at that cross-section is too dark, as shown in Fig. 2, so its feature information may be lost in the overall extraction.
For the ideal cross-section gray level of the stripe image: in theory, according to the generation mechanism of a line-laser stripe, the cross-section gray-level distribution of an ideal stripe image captured by the camera follows a Gaussian model, so the mean gray level of a stripe cross-section whose peak gray value is saturated and which obeys the ideal Gaussian distribution is defined as the ideal stripe cross-section energy intensity, formula (3). Here iEICS(v) is the stripe cross-section energy intensity of row v under the ideal condition; w(v) is the stripe width of row v in the actual stripe image, w(v) = q(v) − p(v); and A is the gray peak of the ideal cross-section, defined as the image saturation gray value, so A = 255. By the 3σ rule of the Gaussian distribution, 99.74% of the stripe energy is concentrated within ±3σ of the mean; since the overwhelming majority of the stripe energy is concentrated within the stripe width, w(v) = 6σ_w is defined, from which the standard deviation σ_w of the standard Gaussian distribution corresponding to each image row is computed.
The ratio of the actual cross-section energy intensity EICS(v) to the ideal cross-section energy intensity iEICS(v) is taken as the gray-level distribution coefficient η_CGDLS of the stripe image, computed by formula (4); the result is shown in Fig. 3.
Step 3: establish the adaptive threshold segmentation correlation model of the stripe image, using the computed gray-level distribution coefficient η_CGDLS and its variation trend.
A mathematical model of adaptive threshold extraction positively correlated with η_CGDLS is established by formula (5), and the adaptive segmentation threshold of the stripe image is determined by formula (6) using linear regression, where thresh_up and thresh_down are the stripe-image segmentation thresholds corresponding respectively to the maximum η_max and the minimum η_min of the gray-level distribution coefficient, obtained from prior knowledge with the criterion that the stripe region can be accurately separated from the background. The binarization threshold used for each row is thus chosen adaptively, yielding the uniform segmentation result shown in Fig. 5 and avoiding the locally overexposed stripe image shown in Fig. 4.
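A hypothetical end-to-end run of the embodiment, reusing the helper functions sketched earlier; the file name and the numeric values of the initial threshold and of the prior thresholds thresh_down/thresh_up are placeholders rather than values given in the patent.

```python
import cv2
import numpy as np

f = cv2.imread("stripe.png", cv2.IMREAD_GRAYSCALE).astype(np.float64)

g0 = segment_fixed_threshold(f, T=60)                    # step 1: initial fixed-threshold split
p, q = stripe_row_bounds(g0)                             # per-row stripe boundaries p(v), q(v)
EICS, iEICS, eta = stripe_profile_coefficients(f, p, q)  # step 2: eta_CGDLS(v)
lam = adaptive_row_thresholds(eta, thresh_down=40, thresh_up=120)  # step 3: per-row thresholds
g = segment_adaptive(f, lam, fallback_T=60)              # final adaptive segmentation
```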
The method improves the extraction accuracy of light stripes on large aerospace components with freeform surfaces and avoids the extraction difficulty and poor extraction accuracy caused by locally overexposed or overly dark stripes.

Claims (1)

1. An adaptive light-stripe image threshold segmentation method, characterized in that the method segments an initial stripe region with a conventional fixed-threshold image segmentation and obtains the column coordinates of the left and right boundaries of each stripe cross-section; then establishes an image gray-level distribution evaluation coefficient: based on the initial threshold segmentation result, the mean gray level of a stripe cross-section is defined as the cross-section energy intensity of the stripe, and the cross-section energy intensity of every stripe row is computed; according to the stripe distribution characteristics, the gray level of the ideal stripe cross-section energy intensity is computed and the gray-level distribution coefficient of the stripe image is obtained; an adaptive threshold segmentation correlation model positively correlated with this coefficient is then established to determine the adaptive segmentation threshold of the stripe image, so that the stripe region is accurately separated from the background; the method comprises the following steps:
Step 1: segment the initial stripe region
According to the gray-level distribution characteristics of a line-laser stripe, the uniformity of the gray-level distribution of the stripe image is determined jointly by the gray level of the stripe cross-sections and by the gray-gradient variation along the stripe direction;
The stripe image is first processed with a conventional fixed-threshold image segmentation. Let the input stripe image be f and the output image g, where f(u, v) is the gray value of the input image at pixel (u, v) and g(u, v) is the gray value of the output image at pixel (u, v); the conventional binarization is then:

$$g(u,v)=\begin{cases}0, & f(u,v)\le T\\ 1, & f(u,v)> T\end{cases}\tag{1}$$

where T is the image segmentation threshold;
The left and right boundary coordinates p(v) and q(v) of the stripe region are then computed from the binary image, where p(v) and q(v) are the left and right boundary column coordinates of the stripe cross-section in row v of the image (in theory, q(v) > p(v));
Step 2: establish the image gray-level distribution evaluation coefficient
The actual cross-section gray level of the stripe image is computed; based on the initial threshold segmentation result, the mean gray level of a stripe cross-section is defined as the cross-section energy intensity of the stripe, formula (2):

$$EICS(v)=\frac{\sum_{u=p(v)}^{q(v)} f(u,v)}{q(v)-p(v)}\tag{2}$$

where EICS(v) is the stripe cross-section energy intensity of row v of the stripe image;
Then the ideal cross-section gray level of the stripe image is defined and computed; in theory, according to the generation mechanism of a line-laser stripe, the cross-section gray-level distribution of an ideal stripe image captured by the camera follows a Gaussian model, so the mean gray level of a stripe cross-section whose peak gray value is saturated and which obeys the ideal Gaussian distribution is defined as the ideal stripe cross-section energy intensity:

$$iEICS(v)=\frac{\sum_{k=-w(v)/2}^{w(v)/2} A\,e^{-\frac{k^{2}}{2\sigma_{w}^{2}}}}{w(v)}\tag{3}$$

where iEICS(v) is the stripe cross-section energy intensity of row v under the ideal condition; w(v) is the stripe width of row v in the actual stripe image, w(v) = q(v) − p(v); and A is the gray peak of the ideal cross-section, defined as the image saturation gray value, so A = 255; in addition, by the 3σ rule of the Gaussian distribution, 99.74% of the stripe energy is concentrated within ±3σ of the mean, so, since the overwhelming majority of the stripe energy is concentrated within the stripe width, w(v) = 6σ_w is defined, from which the standard deviation σ_w of the standard Gaussian distribution corresponding to each image row is computed;
The ratio of the actual cross-section energy intensity EICS(v) to the ideal cross-section energy intensity iEICS(v) is taken as the gray-level distribution coefficient η_CGDLS of the stripe image:

$$\eta_{CGDLS}(v)=\frac{EICS(v)}{iEICS(v)}\tag{4}$$
Step 3: the adaptive threshold segmentation correlation model of the stripe image
Using the computed gray-level distribution coefficient η_CGDLS, a mathematical model of adaptive threshold extraction positively correlated with it is established:

$$\lambda(v)=f(\eta_{CGDLS}(v))\tag{5}$$

where λ(v) is the image segmentation threshold adapted to the cross-section gray level of row v of the stripe image;
Using linear regression, the adaptive segmentation threshold of the stripe image is determined by formula (6), where thresh_up and thresh_down are the stripe-image segmentation thresholds corresponding respectively to the maximum η_max and the minimum η_min of the gray-level distribution coefficient; in this way, the binarization threshold used for each row is chosen adaptively.
CN201710715098.2A 2017-08-21 2017-08-21 An adaptive light-stripe image threshold segmentation method Expired - Fee Related CN107578420B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710715098.2A CN107578420B (en) 2017-08-21 2017-08-21 An adaptive light-stripe image threshold segmentation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710715098.2A CN107578420B (en) 2017-08-21 2017-08-21 An adaptive light-stripe image threshold segmentation method

Publications (2)

Publication Number Publication Date
CN107578420A true CN107578420A (en) 2018-01-12
CN107578420B CN107578420B (en) 2019-11-19

Family

ID=61034593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710715098.2A Expired - Fee Related CN107578420B (en) 2017-08-21 2017-08-21 An adaptive light-stripe image threshold segmentation method

Country Status (1)

Country Link
CN (1) CN107578420B (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040001195A1 (en) * 2002-06-28 2004-01-01 Fuji Photo Optical Co., Ltd. Method of extracting circular region from fringe image
CN104616325A (en) * 2015-01-21 2015-05-13 大连理工大学 Rapid and high-precision method for extracting light strip center on large surface
CN105300316A (en) * 2015-09-22 2016-02-03 大连理工大学 Light stripe center rapid extraction method based on gray centroid method
CN105335988A (en) * 2015-09-25 2016-02-17 大连理工大学 Hierarchical processing based sub-pixel center extraction method

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108550144A (en) * 2018-04-09 2018-09-18 大连理工大学 Laser stripe sequence image quality evaluation method based on gray-scale reliability
CN108550144B (en) * 2018-04-09 2020-04-07 大连理工大学 Laser light bar sequence image quality evaluation method based on gray scale reliability
CN108629790A (en) * 2018-04-26 2018-10-09 大连理工大学 Light-stripe image threshold segmentation method based on a deep residual network
CN108629790B (en) * 2018-04-26 2020-08-14 大连理工大学 Light bar image threshold segmentation method based on depth residual error network
CN110232709A (en) * 2019-04-19 2019-09-13 武汉大学 Line-structured-light stripe center extraction method with variable threshold segmentation
CN110232709B (en) * 2019-04-19 2022-07-29 武汉大学 Method for extracting line structured light strip center by variable threshold segmentation
CN110210437A (en) * 2019-06-10 2019-09-06 上海联影医疗科技有限公司 Method and system for determining the human-body region in an image
CN112710250A (en) * 2020-11-23 2021-04-27 武汉光谷卓越科技股份有限公司 Three-dimensional measurement method based on line structured light and sensor
CN117437247A (en) * 2023-12-18 2024-01-23 津泰(天津)医疗器械有限公司 Lesion region extraction and segmentation method based on natural cavity image
CN117437247B (en) * 2023-12-18 2024-03-05 津泰(天津)医疗器械有限公司 Lesion region extraction and segmentation method based on natural cavity image

Also Published As

Publication number Publication date
CN107578420B (en) 2019-11-19

Similar Documents

Publication Publication Date Title
CN107578420A (en) A kind of adaptive striation carrying out image threshold segmentation method
CN104680496B (en) A kind of Kinect depth map restorative procedures based on color images
US20180091798A1 (en) System and Method for Generating a Depth Map Using Differential Patterns
CN103530880A (en) Camera calibration method based on projected Gaussian grid pattern
CN108613637B (en) Structured light system dephasing method and system based on reference image
CN110009693B (en) Rapid blind calibration method of light field camera
JP5633058B1 (en) 3D measuring apparatus and 3D measuring method
CN105100771A (en) Single-viewpoint video depth obtaining method based on scene classification and geometric dimension
CN111563952B (en) Method and system for realizing stereo matching based on phase information and spatial texture characteristics
CN105335968A (en) Depth map extraction method based on confidence coefficient propagation algorithm and device
CN108550160A (en) Non-homogeneous striation characteristic area extracting method based on light intensity template
CN101900536A (en) Method for measuring object surface appearance based on digital picture method
CN103993548A (en) Multi-camera stereoscopic shooting based pavement damage crack detection system and method
CN104111038A (en) Method for using phase fusion algorithm to repair phase error caused by saturation
CN108051183B (en) Focus type light-field camera parameter calibration method based on first-order theory
CN107680152A (en) Target surface topography measurement method and apparatus based on image procossing
JP6009206B2 (en) 3D measuring device
CN108010075A (en) A kind of sectional perspective matching process based on multiple features combining
CN103925889A (en) Method for fast recovering surface phase of high-light object based on least square method
JP2006058091A (en) Three-dimensional image measuring device and method
CN106447718A (en) 2D-to-3D depth estimation method
CN109003308A (en) A kind of special areas imaging camera calibration system and method based on phase code
CN111951339A (en) Image processing method for performing parallax calculation by using heterogeneous binocular cameras
CN115330684A (en) Underwater structure apparent defect detection method based on binocular vision and line structured light
CN109218706B (en) Method for generating stereoscopic vision image from single image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20191119
Termination date: 20210821