CN115248187A - Method for calibrating linear structured light vision sensor for simultaneously imaging polarization beam splitting - Google Patents
- Publication number
- CN115248187A (application CN202210541221.4A)
- Authority
- CN
- China
- Prior art keywords
- polarization
- light
- expressed
- image
- structured light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/21—Polarisation-affecting properties
Abstract
The invention provides a method for calibrating a line structured light vision sensor with simultaneous polarization beam splitting and imaging, addressing the problem that specular highlights degrade the extraction accuracy of structured-light stripe centers. Using a polarization beam splitting imaging system, two polarization component images with different polarization directions are obtained in a single measurement, and the polarization component images at four angles are fused by an image fusion algorithm according to the polarization information of the highlight region. The difference in column coordinates between adjacent pixels on the light-stripe centerline is used to compare the smoothness of the line structured light. The proposed calibration method effectively improves the calibration accuracy on strongly reflective targets and can be applied to on-site calibration in the field of industrial manufacturing.
Description
Technical Field
The invention belongs to the field of non-contact visual measurement, and in particular relates to a method for calibrating a line structured light vision sensor with simultaneous polarization beam splitting and imaging, with the aim of achieving high-precision calibration of the system.
Background
As a non-contact measurement technique, a structured light measurement system acquires a large amount of effective data by projecting laser stripes onto the surface of the measured object. It offers a simple structure, low cost and high accuracy, and the process is not disturbed by the texture or color of the object surface. Because the light-stripe center points carry the feature-point information and the spatial relationship among the laser, the camera and the measured object, accurate extraction of the light-stripe center coordinates is essential for system calibration. On the basis of the traditional polarization structured-light calibration method, the invention provides a method for calibrating a line structured light vision sensor with simultaneous polarization beam splitting and imaging.
Disclosure of Invention
Compared with the traditional polarization structured-light calibration method, the proposed calibration method offers a better signal-to-noise ratio, reduces measurement errors caused by machine vibration, suppresses false polarization effects, effectively improves the calibration accuracy on strongly reflective targets, and can be applied to on-site calibration in the field of industrial manufacturing.
A method for calibrating a linear structured light vision sensor for simultaneous imaging of polarization and beam splitting comprises the following steps:
(1) Collect the original polarization component images and apply median filtering to them;
(2) Fuse the polarization component images in four directions, extract the light-stripe region and extract the centerline.
To obtain a good fusion result from the polarization component maps, the camera center is first positioned on the centerline of the calibration checkerboard and their relative positions are kept fixed; the polarization component maps in two directions are obtained in a single measurement, eliminating the influence of specular highlights on the extraction of the target's surface feature information. Second, an image fusion weighting mechanism is established so that the image intensity is not reduced while the strong reflections are removed, avoiding a loss of measurement accuracy caused by a lower signal-to-noise ratio.
The invention uses the Stokes parameters and the polarization angle to describe the polarization state of the incident light, and uses ω1, ω2, ω3 and ω4 to describe the fusion weight of each polarization component image. The polarization fusion process is as follows:
the Stokes matrix of the incident light is
S = [S0, S1, S2, S3]^T (1)
The polarized radiation intensity of an image pixel in the θ direction is expressed as:

I(θ) = (S0 + S1·cos 2θ + S2·sin 2θ) / 2
the polarization angle was calculated using stokes parameters and is expressed as:
the average polarization angle is expressed as:
the incident light is divided by the polarization beam splitter into S light and P light perpendicular to each other in the propagation direction, and the radiation intensity relationship between the P light and the S light can be expressed as:
When the angle of the linear polarizer is θ ± 90°, the polarized radiation intensity of an image pixel in the θ ± 90° direction can be expressed as:

I(θ ± 90°) = (S0 − S1·cos 2θ − S2·sin 2θ) / 2
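The complementarity of the two beam-splitter outputs can be checked numerically: the intensities at θ and θ ± 90° computed from the Stokes parameters always sum to S0, so no radiant energy is lost between the two simultaneously imaged channels. A minimal sketch (the function name is illustrative):

```python
import numpy as np

def intensity_at(theta_deg, s0, s1, s2):
    """Polarized radiation intensity behind a linear polarizer at
    angle theta: I(theta) = (S0 + S1*cos 2t + S2*sin 2t) / 2."""
    t = np.radians(2.0 * theta_deg)
    return 0.5 * (s0 + s1 * np.cos(t) + s2 * np.sin(t))

# The two outputs of the polarization beam splitter are complementary:
# I(theta) + I(theta + 90 deg) = S0 for any theta.
```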
four original images x, y, u, v are fused, and the fused covariance can be expressed as:
Compute the 4×4 covariance matrix C of x, y, u and v, its eigenvalue diagonal matrix D, and the matrix V whose column vectors are the right eigenvectors, so that C × V = V × D, and calculate the weighting coefficients:
the final polarization map is obtained by fusion, and the process of fusion can be expressed as:
where I_f is the final polarization-fused image, and the fusion weights of the four polarization component maps are ω1, ω2, ω3 and ω4, respectively. The invention sets the sum of the four weights to 1.
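The covariance eigendecomposition and weighted fusion described above can be sketched as follows. The patent does not spell out exactly how the weighting coefficients are derived from C, D and V, so taking the dominant eigenvector normalized to unit sum is an assumption in this sketch:

```python
import numpy as np

def fusion_weights(images):
    """Weights for fusing four polarization component images.
    The 4x4 covariance matrix C of the flattened images is
    eigendecomposed (C @ V = V @ D); here the eigenvector of the
    largest eigenvalue, normalized to sum to 1, is used as the
    weights (an assumed choice)."""
    x = np.stack([im.ravel() for im in images])  # 4 x N data matrix
    c = np.cov(x)                                # C, 4x4
    eigvals, eigvecs = np.linalg.eigh(c)         # ascending eigenvalues
    w = np.abs(eigvecs[:, -1])                   # dominant eigenvector
    return w / w.sum()

def fuse(images):
    """Weighted sum of the component images, matching I_f above."""
    w = fusion_weights(images)
    return sum(wi * im for wi, im in zip(w, images))
```

By construction the weights are non-negative and sum to 1, matching the normalization stated in the description.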
For the collected 1200 × 1920 structured-light stripe image, an 8 pixel × 8 pixel region is first defined as a Cell, and the gradient directions inside each Cell are divided into 9 bins, i.e. 9 direction blocks are generated. The gradient-direction count of each pixel point is weighted by M(x, y), so the count Num_i (i = 1, 2, 3, …, 9) of pixels in each Cell falling into direction block i can be obtained statistically from equation (13).
Second, 2 × 2 Cells are set as a Block, and the structured-light stripe image is scanned with a sliding window with a step of 8 pixels. The total number of features in the collected structured-light stripe image is therefore 36 × 149 × 239, and these 1,281,996 features describe the final fusion result of the structured-light stripe.
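The Cell/Block feature counting above follows the HOG scheme. A sketch of the per-Cell, magnitude-weighted orientation histogram is given below; the gradient operator and binning conventions are assumptions, since the patent does not specify them:

```python
import numpy as np

def cell_histograms(img, cell=8, bins=9):
    """Magnitude-weighted gradient-orientation histograms per
    cell x cell region (HOG-style Cell descriptor sketch)."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)                        # M(x, y)
    ang = np.mod(np.degrees(np.arctan2(gy, gx)), 180.0)
    h, w = img.shape
    ch, cw = h // cell, w // cell
    # Clamp so every angle falls into one of the `bins` direction blocks.
    bin_idx = np.minimum((ang / (180.0 / bins)).astype(int), bins - 1)
    hist = np.zeros((ch, cw, bins))
    for i in range(ch):
        for j in range(cw):
            sl = np.s_[i * cell:(i + 1) * cell, j * cell:(j + 1) * cell]
            hist[i, j] = np.bincount(bin_idx[sl].ravel(),
                                     weights=mag[sl].ravel(),
                                     minlength=bins)
    return hist
```

For a 1200 × 1920 image this yields 150 × 240 Cells; with 2 × 2-Cell Blocks and an 8-pixel stride there are 149 × 239 Block positions of 36 features each, i.e. 36 × 149 × 239 = 1,281,996 features as quoted above.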
Pixel intensity enhancement is applied to I_f, and the gray-scale maximum I_max over the whole image is obtained. Considering that the energy of the light stripe is strong, 80% of the gray-scale maximum, determined through multiple experiments, is used as a threshold to segment the light stripe from the background information. The light-stripe region segmented from the background image contains discontinuities, so it is filled with a gray-scale dilation algorithm, i.e. pixels are added to the discontinuous regions along the object boundary in the light-stripe image. The structured-light stripe then becomes a connected region, and the structured-light center stripe can be extracted by skeleton thinning. Before skeleton thinning, the dilated light stripe must be binarized; the processed image is S. The Euclidean distance transform of S is computed as shown in equation (11).
where (x_i^0, y_j^0) (i = 1, 2, …, 1200; j = 1, 2, …, 1920) denotes the Euclidean distance between a non-zero element (x_i, y_j) in the dilated structured-light stripe image and the coordinates of an adjacent zero element. The shortest distance D from any zero element to its adjacent non-zero element, and the position L of that non-zero element, are obtained from the formula and stored as the matrix [D, L]; [D, L] is then negated to obtain the complementary [D, L] matrix. Next, S undergoes s (s = ∞) morphological processing until the image no longer changes. Pixels on the boundary of the light-stripe object are removed without changing the connectivity and the Euler number of the light stripe, and the remaining pixels form the skeleton image Skel. The center coordinates (x_c, y_c) of the light stripe can then be expressed by equation (12).
Mapping (x_c, y_c) back to the original structured-light stripe image completes the extraction of the light-stripe center.
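The segmentation, gap-filling and center-extraction steps above can be sketched as follows. This simplification replaces the distance-transform-guided skeleton thinning with a per-column centroid of the stripe pixels, and the 3 × 3 dilation wraps at the image border, which is acceptable for an illustrative sketch but not for production use:

```python
import numpy as np

def stripe_centers(img, frac=0.8):
    """Simplified light-stripe center extraction: threshold at 80% of
    the gray maximum, fill small gaps with a 3x3 binary dilation,
    then take the per-column centroid of stripe pixels (a stand-in
    for the patent's skeleton thinning). Returns one row coordinate
    per column, NaN where no stripe pixel is found."""
    mask = img >= frac * img.max()       # segment stripe from background
    d = mask.copy()
    for dy in (-1, 0, 1):                # 3x3 dilation via shifted ORs
        for dx in (-1, 0, 1):
            d |= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    rows = np.arange(img.shape[0])[:, None]
    counts = d.sum(axis=0)
    centers = np.where(counts > 0,
                       (rows * d).sum(axis=0) / np.maximum(counts, 1),
                       np.nan)
    return centers
```

On a synthetic horizontal stripe the recovered centers lie exactly on the stripe's middle row, which is the behavior the mapping step above relies on.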
Compared with the traditional calibration method, the proposed method for calibrating a line structured light vision sensor with simultaneous polarization beam splitting and imaging has a better signal-to-noise ratio, reduces the measurement error caused by machine vibration, suppresses the false polarization effect, effectively improves the calibration accuracy on strongly reflective targets, and can be applied to on-site calibration in the field of industrial manufacturing.
Drawings
FIG. 1 is a diagram of an experimental apparatus.
FIG. 2 is a graph of the raw polarization components collected.
FIG. 3 shows the light-stripe centerline extraction for the original polarization component map and for the polarization-fused image.
Claims (4)
1. The invention belongs to the technical field of visual detection and measurement, and in particular relates to a method for calibrating a line structured light vision sensor with simultaneous polarization beam splitting and imaging, comprising the following steps:
and expressing the polarization state of incident light by using a Stokes vector, carrying out image fusion on the original polarization component diagram, and extracting the light stripe central line of the fused image.
2. The method of claim 1, wherein the Stokes vector can represent the polarization state of any light beam; the Stokes parameters and polarization angle of the polarization component map are calculated, and the Stokes matrix of the incident light is:
S = [S0, S1, S2, S3]^T (1)
the polarized radiation intensity of an image pixel in the θ direction is expressed as:

I(θ) = (S0 + S1·cos 2θ + S2·sin 2θ) / 2
the polarization angle is calculated from the Stokes parameters and is expressed as:

θp = (1/2)·arctan(S2 / S1)
the average polarization angle is expressed as:
3. The method of claim 1, wherein the polarization component images exhibit regular differences and strong gray-scale correlation, and the final polarization image is obtained by an image fusion algorithm, the steps being as follows:
the incident light is split by the polarization beam splitter into S light and P light whose polarization directions are mutually perpendicular, and the radiation intensity relationship between the P light and the S light can be expressed as:
when the angle of the linear polarizer is θ ± 90°, the polarized radiation intensity of an image pixel in the θ ± 90° direction can be expressed as:

I(θ ± 90°) = (S0 − S1·cos 2θ − S2·sin 2θ) / 2
four original images x, y, u, v are fused, and the fused covariance can be expressed as:
compute the 4×4 covariance matrix C of x, y, u and v, its eigenvalue diagonal matrix D, and the matrix V whose column vectors are the right eigenvectors, so that C × V = V × D, and calculate the weighting coefficients:
the final polarization map is obtained by fusion, and the process of fusion can be expressed as:
4. The method as claimed in claim 1, wherein the fused light-stripe image is binarized to obtain S, and the Euclidean distance transform of S can be expressed as:
where (x_i^0, y_j^0) (i = 1, 2, …, 1200; j = 1, 2, …, 1920) denotes the Euclidean distance between a non-zero element (x_i, y_j) in the light-stripe image and the coordinates of an adjacent zero element. The shortest distance D from any zero element to its adjacent non-zero element, and the position L of that non-zero element, are obtained from the formula and stored as the matrix [D, L]; [D, L] is then negated to obtain the complementary [D, L] matrix. The center coordinates (x_c, y_c) of the light stripe can then be expressed as:
mapping (x_c, y_c) back to the original structured-light stripe image completes the extraction of the light-stripe center.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210541221.4A CN115248187A (en) | 2022-05-18 | 2022-05-18 | Method for calibrating linear structured light vision sensor for simultaneously imaging polarization beam splitting |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115248187A (en) | 2022-10-28
Family
ID=83697965
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210541221.4A Pending CN115248187A (en) | 2022-05-18 | 2022-05-18 | Method for calibrating linear structured light vision sensor for simultaneously imaging polarization beam splitting |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115248187A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116773457A (en) * | 2023-08-18 | 2023-09-19 | 华东交通大学 | Polarization measurement method, system, equipment and medium based on Stokes parameters |
CN116773457B (en) * | 2023-08-18 | 2024-05-17 | 华东交通大学 | Polarization measurement method, system, equipment and medium based on Stokes parameters |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||