CN109064508B - Laser spot detection method and system - Google Patents

Info

Publication number
CN109064508B
Authority
CN
China
Prior art keywords
spot
light spot
image
value
baseline value
Prior art date
Legal status
Active
Application number
CN201811017716.7A
Other languages
Chinese (zh)
Other versions
CN109064508A (en)
Inventor
景文博
于洪洋
董猛
赵海丽
刘鹏
王彩霞
王晓曼
Current Assignee
Changchun University of Science and Technology
Original Assignee
Changchun University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Changchun University of Science and Technology filed Critical Changchun University of Science and Technology
Priority to CN201811017716.7A priority Critical patent/CN109064508B/en
Publication of CN109064508A publication Critical patent/CN109064508A/en
Application granted granted Critical
Publication of CN109064508B publication Critical patent/CN109064508B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/002 Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/08 Measuring arrangements characterised by the use of optical techniques for measuring diameters
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20216 Image averaging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a laser spot detection method and a laser spot detection system. The method comprises the following steps: acquiring multiple frames of continuous background images and a light spot image; calculating an overall average pixel value and a frame-average image of the background images; calculating a first noise baseline value; determining a first light spot region and a second light spot region of the light spot image; calculating a second noise baseline value and a third noise baseline value; calculating a first centroid coordinate and a first spot radius; calculating a second centroid coordinate and a second spot radius; judging whether the difference between the first spot radius and the second spot radius is larger than a preset error value; if so, changing the sizes of the first light spot region and the second light spot region and returning to the calculation of the second and third noise baseline values; if not, determining the centroid coordinate and the diameter of the light spot image according to the first centroid coordinate, the first spot radius, the second centroid coordinate and the second spot radius. The method and the system can improve the detection accuracy of the laser spot.

Description

Laser spot detection method and system
Technical Field
The invention relates to the technical field of laser beam performance detection, in particular to a laser spot detection method and a laser spot detection system.
Background
Accurate detection of laser spots is widely used in laser transmission, astronomical observation, laser quality evaluation, beam size measurement and the like. Laser spot detection methods mainly include the trepanning method, the knife-edge method, the image-sensor detection method, the photographic method, the slit method, the threshold-time method and the like. The disadvantage of the photographic method is that the exposure time is difficult to control, so overexposure or underexposure occurs easily, which affects the accuracy of spot detection, and the data processing is cumbersome; the disadvantage of the scanning method is that the position and size of the minimum spot cannot be determined simultaneously, and the measurement accuracy is affected by the response frequency and response time of the detector; the disadvantage of the threshold-time method is that the detection accuracy may be affected by the target being burned out through energy accumulation or by a high threshold energy.
In the process of acquiring the light spot image, electrical noise, optical noise and the like strongly affect the detection accuracy of the spot, so the image-sensor detection method currently in use relies on a baseline threshold method and a window method to reduce the influence of noise. The baseline threshold method sets a baseline to improve the signal-to-noise ratio of the spot image, but when the signal and the noise overlap it cannot completely separate them; the window method reduces the influence of noise by changing the size of a detection window, but it can only reduce the influence of noise outside the window on the calculation of the laser spot size and does nothing about the noise inside the detection window.
Existing laser spot detection methods are limited by these various factors, so their detection accuracy is not high.
Disclosure of Invention
Therefore, it is necessary to provide a laser spot detection method and system to improve the detection accuracy of the laser spot.
In order to achieve the purpose, the invention provides the following scheme:
a laser spot detection method, comprising:
acquiring multiple continuous background images; the background image is an image displayed in the image sensor after a laser beam emitted by the laser is blocked by the light blocking plate;
obtaining an overall average pixel value and a frame average image of a background image according to pixel values of multiple continuous background images;
calculating a first noise baseline value according to the overall average pixel value and the frame average image; the first noise baseline value is a noise baseline value of a background image;
acquiring a light spot image; the light spot image is an image displayed by laser beams emitted by the laser device and incident into the image sensor;
determining a first light spot area and a second light spot area of the light spot image; the first light spot area is determined according to a first threshold value, and the second light spot area is determined according to a second threshold value; the first threshold is a first noise baseline value; the second threshold is determined using Otsu;
calculating a second noise baseline value and a third noise baseline value; the second noise baseline value is a noise baseline value of the first light spot region, and the third noise baseline value is a noise baseline value of the second light spot region;
calculating a first centroid coordinate and a first spot radius according to the second noise baseline value; the first centroid coordinate is a centroid coordinate of the first light spot region; the first spot radius is the spot radius of the first spot area;
calculating a second centroid coordinate and a second spot radius according to the third noise baseline value; the second centroid coordinate is a centroid coordinate of the second light spot region; the second spot radius is the spot radius of the second spot region;
judging whether the difference value of the first light spot radius and the second light spot radius is larger than a preset error value or not;
if so, changing the sizes of the first light spot region and the second light spot region, and returning to calculate a second noise baseline value and a third noise baseline value;
if not, determining the centroid coordinate and the diameter of the spot image according to the first centroid coordinate, the first spot radius, the second centroid coordinate and the second spot radius.
Optionally, the obtaining of the overall average pixel value and the frame average image of the background image according to the pixel values of multiple continuous background images specifically includes:
calculating the average pixel value of each frame of background image according to the pixel values of the multiple frames of continuous background images, specifically:
A_k = (1 / (M × N)) · Σ_{i=1..N} Σ_{j=1..M} f_k(i, j),
where A_k represents the average pixel value of the k-th frame background image, f_k(i, j) represents the pixel value of the i-th row and j-th column of the k-th frame background image, M represents the width of the background image, and N represents the height of the background image;
calculating the overall average pixel value of the background images:
Ā = (1 / n) · Σ_{k=1..n} A_k,
where n represents the total number of frames;
obtaining the frame-average image according to the pixel values of the multiple frames of continuous background images:
Q(i, j) = (1 / n) · Σ_{k=1..n} f_k(i, j),
where f_k(i, j) represents the pixel value of the i-th row and j-th column of the k-th frame background image.
Optionally, the calculating a first noise baseline value according to the overall average pixel value and the frame average image specifically includes:
carrying out mean value filtering and smoothing processing on the frame average image to obtain a processed frame average image;
calculating the average pixel value μ of the processed frame-average image:
μ = (1 / (M × N)) · Σ_{i=1..N} Σ_{j=1..M} Q'(i, j),
where Q'(i, j) represents the pixel value of the i-th row and j-th column of the processed frame-average image, M represents the width of the background image, and N represents the height of the background image;
calculating the standard deviation σ of the processed frame-average image:
σ = sqrt( (1 / (M × N)) · Σ_{i=1..N} Σ_{j=1..M} (Q'(i, j) - μ)² );
calculating the first noise baseline value from the overall average pixel value Ā, the standard deviation σ and the size of the mean-filtering template (formula not reproduced in the source), where Ā denotes the overall average pixel value, and mask_x and mask_y denote the width and height, respectively, of the filtering template of the mean filtering.
Optionally, the determining the first spot area and the second spot area of the spot image specifically includes:
determining the pixel value of a pixel point with the pixel value lower than a first threshold value in the light spot image as 0, and determining the pixel value of a pixel point with the pixel value higher than the first threshold value as 1;
generating a binary image corresponding to the first threshold;
processing the binary image corresponding to the first threshold value by adopting an eight-neighborhood connected domain algorithm to obtain a plurality of first connected domains;
determining the first connected domain with the largest area as the first light spot region;
processing the light spot image by using an Otsu method to obtain a second threshold value;
determining the pixel value of the pixel point with the pixel value lower than the second threshold value in the light spot image as 0, and determining the pixel value of the pixel point with the pixel value higher than the second threshold value as 1;
generating a binary image corresponding to the second threshold;
processing the binary image corresponding to the second threshold value by adopting an eight-neighborhood connected domain algorithm to obtain a plurality of second connected domains;
and determining the second connected domain with the largest area as a second light spot region.
Optionally, the calculating the second noise baseline value and the third noise baseline value specifically includes:
determining a noise baseline value of the background image as a second noise baseline value;
the second light spot area is doubled to obtain a light spot effective area;
judging whether a pixel value lower than a noise baseline value of the background image exists in the light spot effective area;
if yes, counting all negative noise values in the light spot effective area; the negative noise value is a pixel value below a noise baseline value of the background image;
determining a third noise baseline value from all negative noise values;
and if not, determining the noise baseline value of the background image as a third noise baseline value.
Optionally, calculating a first centroid coordinate and a first spot radius according to the second noise baseline value specifically includes:
the first centroid coordinate is (C_x1, C_y1), obtained by the moment method after subtracting the noise baseline:
C_x1 = [ Σ_{i=1..m} Σ_{j=1..n} j · (I_1(i, j) - S_1) ] / [ Σ_{i=1..m} Σ_{j=1..n} (I_1(i, j) - S_1) ],
C_y1 = [ Σ_{i=1..m} Σ_{j=1..n} i · (I_1(i, j) - S_1) ] / [ Σ_{i=1..m} Σ_{j=1..n} (I_1(i, j) - S_1) ],
where m and n respectively represent the length and width of the first light spot region, I_1(i, j) is the pixel value of the i-th row and j-th column of the first light spot region, 0 < i < m, 0 < j < n, and S_1 represents the second noise baseline value;
R_x1 is the spot radius of the first light spot region in the x-axis direction and R_y1 is the spot radius of the first light spot region in the y-axis direction, both obtained from the second-order moments of (I_1(i, j) - S_1) about the centroid (formula not reproduced in the source).
Optionally, calculating a second centroid coordinate and a second spot radius according to the third noise baseline value specifically includes:
the second centroid coordinate is (C_x2, C_y2), obtained by the moment method after subtracting the noise baseline:
C_x2 = [ Σ_{i=1..g} Σ_{j=1..h} j · (I_2(i, j) - S_2) ] / [ Σ_{i=1..g} Σ_{j=1..h} (I_2(i, j) - S_2) ],
C_y2 = [ Σ_{i=1..g} Σ_{j=1..h} i · (I_2(i, j) - S_2) ] / [ Σ_{i=1..g} Σ_{j=1..h} (I_2(i, j) - S_2) ],
where g and h respectively represent the length and width of the second light spot region, I_2(i, j) is the pixel value of the i-th row and j-th column of the second light spot region, 0 < i < g, 0 < j < h, and S_2 represents the third noise baseline value;
R_x2 is the spot radius of the second light spot region in the x-axis direction and R_y2 is the spot radius of the second light spot region in the y-axis direction, both obtained from the second-order moments of (I_2(i, j) - S_2) about the centroid (formula not reproduced in the source).
Optionally, if so, changing the sizes of the first light spot region and the second light spot region, specifically:
if R_x1 - R_x2 > ζ and R_y1 - R_y2 > ζ, the first light spot region is reduced in the x-axis direction and in the y-axis direction, and the second light spot region is expanded in the x-axis direction and in the y-axis direction, by amounts determined from the spot diameters D_x1, D_y1, D_x2 and D_y2 (formulas not reproduced in the source); where R_x1 is the spot radius of the first light spot region in the x-axis direction, R_y1 is the spot radius of the first light spot region in the y-axis direction, R_x2 is the spot radius of the second light spot region in the x-axis direction, R_y2 is the spot radius of the second light spot region in the y-axis direction, D_x1 is the spot diameter of the first light spot region in the x-axis direction, D_y1 is the spot diameter of the first light spot region in the y-axis direction, D_x2 is the spot diameter of the second light spot region in the x-axis direction, D_y2 is the spot diameter of the second light spot region in the y-axis direction, and ζ is the preset error value;
if R_x2 - R_x1 > ζ and R_y2 - R_y1 > ζ, the first light spot region is expanded in the x-axis direction and in the y-axis direction, and the second light spot region is reduced in the x-axis direction and in the y-axis direction, by amounts determined in the same way (formulas not reproduced in the source).
Optionally, the determining the centroid coordinate and the diameter of the spot image according to the first centroid coordinate, the first spot radius, the second centroid coordinate and the second spot radius specifically includes:
determining an average value of the first centroid coordinate and the second centroid coordinate as a centroid coordinate of the light spot image;
determining the diameter of the spot image from the average of the first spot radius and the second spot radius.
The invention also provides a laser spot detection system, which comprises: the device comprises a laser, an optical attenuation component, a lens, an attenuation sheet, an image sensor, a processor and a light shielding plate;
the laser is used for emitting laser beams; the optical attenuation component is arranged on an output optical path of the laser; the lens is arranged on an emergent light path of the optical attenuation component; the attenuation sheet is arranged on an emergent light path of the lens; the image sensor is fixedly connected with the attenuation sheet and is used for enabling the light beams passing through the attenuation sheet to form a light spot image; the processor is electrically connected with the image sensor and is used for detecting the centroid coordinate and the diameter of the light spot image; when the processor acquires a background image, a light shielding plate is arranged between the lens and the attenuation sheet;
the processor includes:
the first acquisition module is used for acquiring multiple continuous background images; the background image is an image displayed in the image sensor after a laser beam emitted by the laser is blocked by the light blocking plate;
the first calculation module is used for obtaining the total average pixel value and the frame average image of the background image according to the pixel values of a plurality of continuous background images;
a second calculation module, configured to calculate a first noise baseline value according to the ensemble average pixel value and the frame average image; the first noise baseline value is a noise baseline value of a background image;
the second acquisition module is used for acquiring a light spot image; the light spot image is an image displayed by laser beams emitted by the laser device and incident into the image sensor;
the first determining module is used for determining a first light spot area and a second light spot area of the light spot image; the first light spot area is determined according to a first threshold value, and the second light spot area is determined according to a second threshold value; the first threshold is a first noise baseline value; the second threshold is determined using Otsu;
a third calculation module for calculating a second noise baseline value and a third noise baseline value; the second noise baseline value is a noise baseline value of the first light spot region, and the third noise baseline value is a noise baseline value of the second light spot region;
the fourth calculation module is used for calculating a first centroid coordinate and a first spot radius according to the second noise baseline value; the first centroid coordinate is a centroid coordinate of the first light spot region; the first spot radius is the spot radius of the first spot area;
the fifth calculation module is used for calculating a second centroid coordinate and a second spot radius according to the third noise baseline value; the second centroid coordinate is a centroid coordinate of the second light spot region; the second spot radius is the spot radius of the second spot region;
the judging module is used for judging whether the difference value of the first light spot radius and the second light spot radius is larger than a preset error value or not;
the light spot area changing module is used for changing the sizes of the first light spot area and the second light spot area and returning to the third calculating module if the difference value of the first light spot radius and the second light spot radius is larger than a preset error value;
and the second determining module is used for determining the centroid coordinate and the diameter of the light spot image according to the first centroid coordinate, the first light spot radius, the second centroid coordinate and the second light spot radius if the difference value of the first light spot radius and the second light spot radius is less than or equal to a preset error value.
Compared with the prior art, the invention has the beneficial effects that:
the invention provides a laser spot detection method and a system, which are characterized in that through counting background noise, when calculating the centroid and the diameter of a spot, the noise value is subtracted from the pixel value in the spot area, so that the accuracy is improved, and the influence of the background noise is removed; and the bilateral contraction of the region is carried out by using a region iteration method, the sizes of the first light spot region and the second light spot region are changed once every iteration, the centroid and the size of the light spot are recalculated, and when the difference value of the radiuses of the first light spot region and the second light spot region is smaller than a certain error, the size of the light spot is determined, so that the accuracy of light spot calculation is improved.
Drawings
To describe the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings required in the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a flowchart of a laser spot detection method according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a laser spot detection system according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a processor.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Fig. 1 is a flowchart of a laser spot detection method according to an embodiment of the present invention.
Referring to fig. 1, the laser spot detection method of the embodiment includes:
step S1: acquiring multiple frames of continuous background images.
The background image is an image which is displayed in the image sensor after a laser beam emitted by the laser is blocked by the light blocking plate.
Step S2: and obtaining the total average pixel value and the frame average image of the background image according to the pixel values of a plurality of continuous background images. Specifically, the method comprises the following steps:
calculating the average pixel value of each frame of background image according to the pixel values of the multiple frames of continuous background images, specifically:
A_k = (1 / (M × N)) · Σ_{i=1..N} Σ_{j=1..M} f_k(i, j),
where A_k represents the average pixel value of the k-th frame background image, f_k(i, j) represents the pixel value of the i-th row and j-th column of the k-th frame background image, M represents the width of the background image, and N represents the height of the background image;
calculating the overall average pixel value of the background images:
Ā = (1 / n) · Σ_{k=1..n} A_k,
where n represents the total number of frames;
obtaining the frame-average image according to the pixel values of the multiple frames of continuous background images:
Q(i, j) = (1 / n) · Σ_{k=1..n} f_k(i, j),
where f_k(i, j) represents the pixel value of the i-th row and j-th column of the k-th frame background image.
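As an illustration of step S2, the following sketch (Python with NumPy; the function and variable names are illustrative and not part of the patent) computes the per-frame average pixel values A_k, the overall average pixel value and the frame-average image Q from a stack of continuous background frames:

import numpy as np

def background_statistics(frames):
    """Step S2 sketch: frames is an array of shape (n, N, M) holding n continuous background images."""
    frames = np.asarray(frames, dtype=np.float64)
    per_frame_means = frames.mean(axis=(1, 2))   # A_k: average pixel value of each frame
    overall_mean = per_frame_means.mean()        # overall average pixel value of the background images
    frame_average = frames.mean(axis=0)          # Q(i, j): pixel-wise average over the n frames
    return per_frame_means, overall_mean, frame_average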
Step S3: a first noise baseline value is calculated from the ensemble averaged pixel value and the frame averaged image.
The first noise baseline value is a noise baseline value of a background image. The step S3 specifically includes:
carrying out mean value filtering and smoothing processing on the frame average image to obtain a processed frame average image;
calculating the average pixel value μ of the processed frame-average image:
μ = (1 / (M × N)) · Σ_{i=1..N} Σ_{j=1..M} Q'(i, j),
where Q'(i, j) represents the pixel value of the i-th row and j-th column of the processed frame-average image, M represents the width of the background image, and N represents the height of the background image;
calculating the standard deviation σ of the processed frame-average image:
σ = sqrt( (1 / (M × N)) · Σ_{i=1..N} Σ_{j=1..M} (Q'(i, j) - μ)² );
calculating the first noise baseline value from the overall average pixel value Ā, the standard deviation σ and the size of the mean-filtering template (formula not reproduced in the source), where Ā denotes the overall average pixel value, and mask_x and mask_y denote the width and height, respectively, of the filtering template of the mean filtering.
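A minimal sketch of step S3 is given below, assuming SciPy's uniform_filter for the mean filtering. Because the patent's baseline formula is reproduced only as an image, the final combination of the filtered-image mean, its standard deviation and the template size is an assumed placeholder (mean plus three standard deviations) rather than the patent's actual expression:

import numpy as np
from scipy.ndimage import uniform_filter

def first_noise_baseline(frame_average, mask_x=3, mask_y=3):
    """Step S3 sketch: mean-filter the frame-average image and derive a noise baseline from its statistics."""
    q_prime = uniform_filter(np.asarray(frame_average, dtype=np.float64), size=(mask_y, mask_x))
    q_mean = q_prime.mean()           # average pixel value of the processed frame-average image
    sigma = q_prime.std()             # standard deviation of the processed frame-average image
    baseline = q_mean + 3.0 * sigma   # assumed combination; the patent's formula is not reproduced here
    return baseline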
Step S4: and acquiring a light spot image.
The spot image is an image which is displayed when the laser beam emitted by the laser is incident into the image sensor.
Step S5: and determining a first light spot area and a second light spot area of the light spot image.
The first light spot area is determined according to a first threshold value, and the second light spot area is determined according to a second threshold value; the first threshold is a first noise baseline value; the second threshold is determined using Otsu. Specifically, the method comprises the following steps:
determining the pixel value of a pixel point with the pixel value lower than a first threshold value in the light spot image as 0, and determining the pixel value of a pixel point with the pixel value higher than the first threshold value as 1;
generating a binary image corresponding to the first threshold;
processing the binary image corresponding to the first threshold value by adopting an eight-neighborhood connected domain algorithm to obtain a plurality of first connected domains;
determining the first connected domain with the largest area as the first light spot region;
processing the light spot image by using an Otsu method to obtain a second threshold value;
determining the pixel value of the pixel point with the pixel value lower than the second threshold value in the light spot image as 0, and determining the pixel value of the pixel point with the pixel value higher than the second threshold value as 1;
generating a binary image corresponding to the second threshold;
processing the binary image corresponding to the second threshold value by adopting an eight-neighborhood connected domain algorithm to obtain a plurality of second connected domains;
and determining the second connected domain with the largest area as a second light spot region.
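The region extraction of step S5 can be sketched with OpenCV as below; the helper names and the 8-bit normalization are illustrative assumptions, while the two thresholds and the eight-neighborhood connected-domain labelling follow the text above:

import cv2
import numpy as np

def largest_connected_region(image, threshold):
    """Binarize `image` at `threshold` and return a mask of the largest 8-connected component."""
    binary = (image > threshold).astype(np.uint8)
    num, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
    if num <= 1:                                                 # no foreground component found
        return np.zeros_like(binary)
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))    # label 0 is the background
    return (labels == largest).astype(np.uint8)

def spot_regions(spot_image, first_threshold):
    """Step S5 sketch: first region from the noise baseline, second region from the Otsu threshold."""
    img8 = cv2.normalize(spot_image, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    otsu_threshold, _ = cv2.threshold(img8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    first_region = largest_connected_region(spot_image, first_threshold)
    second_region = largest_connected_region(img8, otsu_threshold)
    return first_region, second_region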
Step S6: a second noise baseline value and a third noise baseline value are calculated.
The second noise baseline value is a noise baseline value of the first light spot region, and the third noise baseline value is a noise baseline value of the second light spot region. The step S6 specifically includes:
determining a noise baseline value of the background image as a second noise baseline value;
the second light spot area is doubled to obtain a light spot effective area;
judging whether a pixel value lower than a noise baseline value of the background image exists in the light spot effective area;
if yes, counting all negative noise values in the light spot effective area; the negative noise value is a pixel value below a noise baseline value of the background image;
determining a third noise baseline value based on all the negative noise values (formula not reproduced in the source), where D_k (k > 0) is a negative noise value, i.e. a pixel value in the light spot effective region that is below the noise baseline value of the background image, and k represents the total number of pixel values in the light spot effective region that are below the noise baseline value of the background image;
and if not, determining the noise baseline value of the background image as a third noise baseline value.
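A sketch of step S6 follows. The doubling of the second light spot region is approximated here by expanding its bounding box, and the third baseline is taken as the mean of the negative noise values; both of these specific choices are assumptions, since the patent gives the corresponding formula only as an image:

import numpy as np

def second_and_third_baselines(spot_image, second_region, background_baseline):
    """Step S6 sketch: returns (second_baseline, third_baseline)."""
    h, w = second_region.shape
    ys, xs = np.nonzero(second_region)
    y0, y1, x0, x1 = ys.min(), ys.max(), xs.min(), xs.max()
    dy, dx = (y1 - y0 + 1) // 2, (x1 - x0 + 1) // 2      # grow the bounding box to roughly twice its size
    effective = np.zeros_like(second_region)
    effective[max(0, y0 - dy):min(h, y1 + dy + 1), max(0, x0 - dx):min(w, x1 + dx + 1)] = 1

    second_baseline = background_baseline                 # noise baseline of the background image
    negatives = spot_image[(effective == 1) & (spot_image < background_baseline)]
    third_baseline = negatives.mean() if negatives.size else background_baseline
    return second_baseline, third_baseline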
Step S7: and calculating a first centroid coordinate and a first spot radius according to the second noise baseline value.
The first centroid coordinate is the centroid coordinate of the first light spot region; the first spot radius is the spot radius of the first light spot region. Specifically, the pixel coordinates in the first light spot region are normalized, it is checked whether each pixel lies within a circular region of radius 1 centered on the center of the first light spot region, and the centroid coordinates and the diameter of the spot are calculated for the pixels inside this circular region using the moment method.
The first centroid coordinate is (C_x1, C_y1):
C_x1 = [ Σ_{i=1..m} Σ_{j=1..n} j · (I_1(i, j) - S_1) ] / [ Σ_{i=1..m} Σ_{j=1..n} (I_1(i, j) - S_1) ],
C_y1 = [ Σ_{i=1..m} Σ_{j=1..n} i · (I_1(i, j) - S_1) ] / [ Σ_{i=1..m} Σ_{j=1..n} (I_1(i, j) - S_1) ],
where m and n respectively represent the length and width of the first light spot region, I_1(i, j) is the pixel value of the i-th row and j-th column of the first light spot region, 0 < i < m, 0 < j < n, and S_1 represents the second noise baseline value;
R_x1 is the spot radius of the first light spot region in the x-axis direction and R_y1 is the spot radius of the first light spot region in the y-axis direction, both obtained from the second-order moments of (I_1(i, j) - S_1) about the centroid (formula not reproduced in the source).
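The centroid and radius calculation of step S7 (and, with the third baseline, of step S8) can be sketched as follows. The first moments for the centroid follow the moment method described above; taking twice the second-moment widths as the spot radii is a conventional choice and is an assumption here, since the patent's radius formula is reproduced only as an image:

import numpy as np

def centroid_and_radius(region_pixels, baseline):
    """Moment-method sketch: baseline-subtracted first moments (centroid) and second moments (radii)."""
    wgt = np.clip(np.asarray(region_pixels, dtype=np.float64) - baseline, 0.0, None)
    total = wgt.sum()
    if total == 0.0:
        return (np.nan, np.nan), (np.nan, np.nan)
    rows, cols = np.indices(wgt.shape)
    cx = (cols * wgt).sum() / total                                 # centroid, x (column) direction
    cy = (rows * wgt).sum() / total                                 # centroid, y (row) direction
    rx = 2.0 * np.sqrt((((cols - cx) ** 2) * wgt).sum() / total)    # assumed second-moment radius, x
    ry = 2.0 * np.sqrt((((rows - cy) ** 2) * wgt).sum() / total)    # assumed second-moment radius, y
    return (cx, cy), (rx, ry)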
Step S8: and calculating a second centroid coordinate and a second spot radius according to the third noise baseline value.
The second centroid coordinate is the centroid coordinate of the second light spot region; the second spot radius is the spot radius of the second light spot region. Specifically, the pixel coordinates in the second light spot region are normalized, it is checked whether each pixel lies within a circular region of radius 1 centered on the center of the second light spot region, and the centroid coordinates and the diameter of the spot are calculated for the pixels inside this circular region using the moment method.
The second centroid coordinate is (C_x2, C_y2):
C_x2 = [ Σ_{i=1..g} Σ_{j=1..h} j · (I_2(i, j) - S_2) ] / [ Σ_{i=1..g} Σ_{j=1..h} (I_2(i, j) - S_2) ],
C_y2 = [ Σ_{i=1..g} Σ_{j=1..h} i · (I_2(i, j) - S_2) ] / [ Σ_{i=1..g} Σ_{j=1..h} (I_2(i, j) - S_2) ],
where g and h respectively represent the length and width of the second light spot region, I_2(i, j) is the pixel value of the i-th row and j-th column of the second light spot region, 0 < i < g, 0 < j < h, and S_2 represents the third noise baseline value;
R_x2 is the spot radius of the second light spot region in the x-axis direction and R_y2 is the spot radius of the second light spot region in the y-axis direction, both obtained from the second-order moments of (I_2(i, j) - S_2) about the centroid (formula not reproduced in the source).
Step S9: and judging whether the difference value of the first light spot radius and the second light spot radius is larger than a preset error value.
If yes, go to step S10; if not, step S11 is executed.
Step S10: the sizes of the first and second spot areas are changed, and the process returns to step S6. The method specifically comprises the following steps:
If R_x1 - R_x2 > ζ and R_y1 - R_y2 > ζ, the first light spot region is reduced in the x-axis direction and in the y-axis direction, and the second light spot region is expanded in the x-axis direction and in the y-axis direction, by amounts determined from the spot diameters D_x1, D_y1, D_x2 and D_y2 (formulas not reproduced in the source); where R_x1 is the spot radius of the first light spot region in the x-axis direction, R_y1 is the spot radius of the first light spot region in the y-axis direction, R_x2 is the spot radius of the second light spot region in the x-axis direction, R_y2 is the spot radius of the second light spot region in the y-axis direction, D_x1 is the spot diameter of the first light spot region in the x-axis direction, D_y1 is the spot diameter of the first light spot region in the y-axis direction, D_x2 is the spot diameter of the second light spot region in the x-axis direction, D_y2 is the spot diameter of the second light spot region in the y-axis direction, and ζ is the preset error value.
If R_x2 - R_x1 > ζ and R_y2 - R_y1 > ζ, the first light spot region is expanded in the x-axis direction and in the y-axis direction, and the second light spot region is reduced in the x-axis direction and in the y-axis direction, by amounts determined in the same way (formulas not reproduced in the source).
Step S11: and determining the centroid coordinate and the diameter of the spot image according to the first centroid coordinate, the first spot radius, the second centroid coordinate and the second spot radius. Specifically, the method comprises the following steps:
determining an average value of the first centroid coordinate and the second centroid coordinate as a centroid coordinate of the light spot image;
determining the diameter of the spot image from the average of the first spot radius and the second spot radius.
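Putting steps S6 to S11 together, the iteration can be sketched as below. `compute_baselines`, `centroid_and_radius` and `adjust_regions` are hypothetical helpers standing in for the operations described above; in particular, `adjust_regions` represents the region resizing rules of step S10, whose exact formulas are not reproduced in the source:

def detect_spot(spot_image, first_region, second_region, background_baseline, zeta,
                compute_baselines, centroid_and_radius, adjust_regions):
    """Sketch of the S6-S11 loop; returns the spot centroid and diameter once the radii agree."""
    while True:
        s1, s2 = compute_baselines(spot_image, second_region, background_baseline)
        (cx1, cy1), (rx1, ry1) = centroid_and_radius(spot_image * first_region, s1)
        (cx2, cy2), (rx2, ry2) = centroid_and_radius(spot_image * second_region, s2)
        if abs(rx1 - rx2) <= zeta and abs(ry1 - ry2) <= zeta:
            centroid = ((cx1 + cx2) / 2.0, (cy1 + cy2) / 2.0)   # average of the two centroids
            diameter = (rx1 + rx2 + ry1 + ry2) / 2.0            # twice the average of the four radii
            return centroid, diameter
        first_region, second_region = adjust_regions(first_region, second_region,
                                                     (rx1, ry1), (rx2, ry2), zeta)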
According to the laser spot detection method, by statistically characterizing the background noise and subtracting the noise value from the pixel values in the spot region when the centroid and the diameter of the spot are calculated, the influence of the background noise is removed and the accuracy is improved. In addition, the region iteration method performs bilateral contraction of the regions: in each iteration the sizes of the first light spot region and the second light spot region are changed and the centroid and the size of the spot are recalculated, and the spot size is determined once the difference between the radii of the first and second light spot regions is smaller than a given error, which further improves the accuracy of the spot calculation.
The invention also provides a laser spot detection system, fig. 2 is a schematic structural diagram of the laser spot detection system according to the embodiment of the invention, and fig. 3 is a schematic structural diagram of a processor.
Referring to fig. 2 and 3, the laser spot detection system of the embodiment includes: a laser 21, an optical attenuation module 22, a lens 23, an attenuation sheet 24, an image sensor 25, a processor 26, and a mask 27.
The laser 21 is used for emitting laser beams; the optical attenuation component 22 is arranged on the output light path of the laser 21; the lens 23 is arranged on an emergent light path of the optical attenuation component 22; the attenuation sheet 24 is arranged on the emergent light path of the lens 23; the image sensor 25 is fixedly connected with the attenuation sheet 24 and is used for enabling the light beams passing through the attenuation sheet 24 to form a light spot image; the processor 26 is electrically connected with the image sensor 25 and is used for detecting the centroid coordinate and the diameter of the light spot image; when the processor 26 acquires a background image, a light shielding plate 27 is further provided between the lens 23 and the attenuation sheet 24.
The processor 26 includes:
a first obtaining module 261, configured to obtain multiple frames of continuous background images; the background image is an image displayed in the image sensor after a laser beam emitted by the laser is blocked by the light blocking plate;
the first calculating module 262 is configured to obtain an overall average pixel value and a frame average image of a background image according to pixel values of multiple frames of continuous background images;
a second calculating module 263, configured to calculate a first noise baseline value according to the overall average pixel value and the frame average image; the first noise baseline value is a noise baseline value of a background image;
a second obtaining module 264, configured to obtain a spot image; the light spot image is an image displayed by laser beams emitted by the laser device and incident into the image sensor;
a first determining module 265, configured to determine a first light spot region and a second light spot region of the light spot image; the first light spot area is determined according to a first threshold value, and the second light spot area is determined according to a second threshold value; the first threshold is a first noise baseline value; the second threshold is determined using Otsu;
a third calculation module 266 for calculating a second noise baseline value and a third noise baseline value; the second noise baseline value is a noise baseline value of the first light spot region, and the third noise baseline value is a noise baseline value of the second light spot region;
a fourth calculating module 267, configured to calculate a first centroid coordinate and a first spot radius according to the second noise baseline value; the first centroid coordinate is a centroid coordinate of the first light spot region; the first spot radius is the spot radius of the first spot area;
a fifth calculating module 268, configured to calculate a second centroid coordinate and a second spot radius according to the third noise baseline value; the second centroid coordinate is a centroid coordinate of the second light spot region; the second spot radius is the spot radius of the second spot region;
a determining module 269, configured to determine whether a difference between the first light spot radius and the second light spot radius is greater than a preset error value;
a light spot area changing module 270, configured to change the sizes of the first light spot area and the second light spot area if the difference between the first light spot radius and the second light spot radius is greater than a preset error value, and return to the third calculating module;
a second determining module 271, configured to determine, if a difference between the first spot radius and the second spot radius is smaller than or equal to a preset error value, a centroid coordinate and a diameter of the spot image according to the first centroid coordinate, the first spot radius, the second centroid coordinate, and the second spot radius.
According to the laser spot detection system, by statistically characterizing the background noise and subtracting the noise value from the pixel values in the spot region when the centroid and the diameter of the spot are calculated, the influence of the background noise is removed and the accuracy is improved. In addition, the region iteration method performs bilateral contraction of the regions: in each iteration the sizes of the first light spot region and the second light spot region are changed and the centroid and the size of the spot are recalculated, and the spot size is determined once the difference between the radii of the first and second light spot regions is smaller than a given error, which further improves the accuracy of the spot calculation.
Since the system disclosed in this embodiment corresponds to the method disclosed in the embodiment, its description is relatively brief; for the relevant details, reference may be made to the description of the method.
The principles and embodiments of the present invention have been described herein using specific examples, which are provided only to help understand the method and the core concept of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, the specific embodiments and the application range may be changed. In view of the above, the present disclosure should not be construed as limiting the invention.

Claims (9)

1. A laser spot detection method, comprising:
acquiring multiple continuous background images; the background image is an image displayed in the image sensor after a laser beam emitted by the laser is blocked by the light blocking plate;
obtaining an overall average pixel value and a frame average image of a background image according to pixel values of multiple continuous background images;
calculating a first noise baseline value according to the overall average pixel value and the frame average image; the first noise baseline value is a noise baseline value of a background image;
acquiring a light spot image; the light spot image is an image displayed by laser beams emitted by the laser device and incident into the image sensor;
determining a first light spot area and a second light spot area of the light spot image; the first light spot area is determined according to a first threshold value, and the second light spot area is determined according to a second threshold value; the first threshold is a first noise baseline value; the second threshold is determined using Otsu;
calculating a second noise baseline value and a third noise baseline value; the second noise baseline value is a noise baseline value of the first light spot region, and the third noise baseline value is a noise baseline value of the second light spot region;
calculating a first centroid coordinate and a first spot radius according to the second noise baseline value; the first centroid coordinate is a centroid coordinate of the first light spot region; the first spot radius is the spot radius of the first spot area;
calculating a second centroid coordinate and a second spot radius according to the third noise baseline value; the second centroid coordinate is a centroid coordinate of the second light spot region; the second spot radius is the spot radius of the second spot region;
judging whether the difference value of the first light spot radius and the second light spot radius is larger than a preset error value or not;
if so, changing the sizes of the first light spot region and the second light spot region, and returning to calculate a second noise baseline value and a third noise baseline value;
if not, determining the centroid coordinate and the diameter of the light spot image according to the first centroid coordinate, the first light spot radius, the second centroid coordinate and the second light spot radius;
calculating a first noise baseline value according to the ensemble average pixel value and the frame average image, specifically including:
carrying out mean value filtering and smoothing processing on the frame average image to obtain a processed frame average image;
calculating the average pixel value μ of the processed frame-average image:
μ = (1 / (M × N)) · Σ_{i=1..N} Σ_{j=1..M} Q'(i, j),
where Q'(i, j) represents the pixel value of the i-th row and j-th column of the processed frame-average image, M represents the width of the background image, and N represents the height of the background image;
calculating the standard deviation σ of the processed frame-average image:
σ = sqrt( (1 / (M × N)) · Σ_{i=1..N} Σ_{j=1..M} (Q'(i, j) - μ)² );
calculating the first noise baseline value from the overall average pixel value Ā, the standard deviation σ and the size of the mean-filtering template (formula not reproduced in the source), where Ā denotes the overall average pixel value, and mask_x and mask_y denote the width and height, respectively, of the filtering template of the mean filtering.
2. The method according to claim 1, wherein the obtaining of the overall average pixel value and the frame average image of the background image according to the pixel values of a plurality of frames of continuous background images specifically comprises:
calculating the average pixel value of each frame of background image according to the pixel values of the multiple frames of continuous background images, specifically:
A_k = (1 / (M × N)) · Σ_{i=1..N} Σ_{j=1..M} f_k(i, j),
where A_k represents the average pixel value of the k-th frame background image, f_k(i, j) represents the pixel value of the i-th row and j-th column of the k-th frame background image, M represents the width of the background image, and N represents the height of the background image;
calculating the overall average pixel value of the background images:
Ā = (1 / n) · Σ_{k=1..n} A_k,
where n represents the total number of frames;
obtaining the frame-average image according to the pixel values of the multiple frames of continuous background images:
Q(i, j) = (1 / n) · Σ_{k=1..n} f_k(i, j),
where f_k(i, j) represents the pixel value of the i-th row and j-th column of the k-th frame background image.
3. The method according to claim 1, wherein the determining the first spot area and the second spot area of the spot image specifically includes:
determining the pixel value of a pixel point with the pixel value lower than a first threshold value in the light spot image as 0, and determining the pixel value of a pixel point with the pixel value higher than the first threshold value as 1;
generating a binary image corresponding to the first threshold;
processing the binary image corresponding to the first threshold value by adopting an eight-neighborhood connected domain algorithm to obtain a plurality of first connected domains;
determining the first connected domain with the largest area as the first light spot region;
processing the light spot image by using an Otsu method to obtain a second threshold value;
determining the pixel value of the pixel point with the pixel value lower than the second threshold value in the light spot image as 0, and determining the pixel value of the pixel point with the pixel value higher than the second threshold value as 1;
generating a binary image corresponding to the second threshold;
processing the binary image corresponding to the second threshold value by adopting an eight-neighborhood connected domain algorithm to obtain a plurality of second connected domains;
and determining the second connected domain with the largest area as a second light spot region.
4. The method according to claim 1, wherein the calculating the second noise baseline value and the third noise baseline value specifically includes:
determining a noise baseline value of the background image as a second noise baseline value;
the second light spot area is doubled to obtain a light spot effective area;
judging whether a pixel value lower than a noise baseline value of the background image exists in the light spot effective area;
if yes, counting all negative noise values in the light spot effective area; the negative noise value is a pixel value below a noise baseline value of the background image;
determining a third noise baseline value from all negative noise values;
and if not, determining the noise baseline value of the background image as a third noise baseline value.
5. The method according to claim 1, wherein the calculating the first centroid coordinate and the first spot radius according to the second noise baseline value specifically comprises:
the first centroid coordinate is (C_x1, C_y1), obtained by the moment method after subtracting the noise baseline:
C_x1 = [ Σ_{i=1..m} Σ_{j=1..n} j · (I_1(i, j) - S_1) ] / [ Σ_{i=1..m} Σ_{j=1..n} (I_1(i, j) - S_1) ],
C_y1 = [ Σ_{i=1..m} Σ_{j=1..n} i · (I_1(i, j) - S_1) ] / [ Σ_{i=1..m} Σ_{j=1..n} (I_1(i, j) - S_1) ],
where m and n respectively represent the length and width of the first light spot region, I_1(i, j) is the pixel value of the i-th row and j-th column of the first light spot region, 0 < i < m, 0 < j < n, and S_1 represents the second noise baseline value;
R_x1 is the spot radius of the first light spot region in the x-axis direction and R_y1 is the spot radius of the first light spot region in the y-axis direction, both obtained from the second-order moments of (I_1(i, j) - S_1) about the centroid (formula not reproduced in the source).
6. The method according to claim 1, wherein the calculating of the second centroid coordinate and the second spot radius according to the third noise baseline value comprises:
the second centroid coordinate is (C_x2, C_y2), obtained by the moment method after subtracting the noise baseline:
C_x2 = [ Σ_{i=1..g} Σ_{j=1..h} j · (I_2(i, j) - S_2) ] / [ Σ_{i=1..g} Σ_{j=1..h} (I_2(i, j) - S_2) ],
C_y2 = [ Σ_{i=1..g} Σ_{j=1..h} i · (I_2(i, j) - S_2) ] / [ Σ_{i=1..g} Σ_{j=1..h} (I_2(i, j) - S_2) ],
where g and h respectively represent the length and width of the second light spot region, I_2(i, j) is the pixel value of the i-th row and j-th column of the second light spot region, 0 < i < g, 0 < j < h, and S_2 represents the third noise baseline value;
R_x2 is the spot radius of the second light spot region in the x-axis direction and R_y2 is the spot radius of the second light spot region in the y-axis direction, both obtained from the second-order moments of (I_2(i, j) - S_2) about the centroid (formula not reproduced in the source).
7. The method according to claim 1, wherein if yes, changing sizes of the first spot area and the second spot area specifically comprises:
if R_x1 - R_x2 > ζ and R_y1 - R_y2 > ζ, the first light spot region is reduced in the x-axis direction and in the y-axis direction, and the second light spot region is expanded in the x-axis direction and in the y-axis direction, by amounts determined from the spot diameters D_x1, D_y1, D_x2 and D_y2 (formulas not reproduced in the source); where R_x1 is the spot radius of the first light spot region in the x-axis direction, R_y1 is the spot radius of the first light spot region in the y-axis direction, R_x2 is the spot radius of the second light spot region in the x-axis direction, R_y2 is the spot radius of the second light spot region in the y-axis direction, D_x1 is the spot diameter of the first light spot region in the x-axis direction, D_y1 is the spot diameter of the first light spot region in the y-axis direction, D_x2 is the spot diameter of the second light spot region in the x-axis direction, D_y2 is the spot diameter of the second light spot region in the y-axis direction, and ζ is the preset error value;
if R_x2 - R_x1 > ζ and R_y2 - R_y1 > ζ, the first light spot region is expanded in the x-axis direction and in the y-axis direction, and the second light spot region is reduced in the x-axis direction and in the y-axis direction, by amounts determined in the same way (formulas not reproduced in the source).
8. The method according to claim 1, wherein the determining the centroid coordinate and the diameter of the spot image according to the first centroid coordinate, the first spot radius, the second centroid coordinate and the second spot radius specifically comprises:
determining an average value of the first centroid coordinate and the second centroid coordinate as a centroid coordinate of the light spot image;
determining the diameter of the spot image from the average of the first spot radius and the second spot radius.
9. A laser spot detection system, the system comprising: the device comprises a laser, an optical attenuation component, a lens, an attenuation sheet, an image sensor, a processor and a light shielding plate;
the laser is used for emitting laser beams; the optical attenuation component is arranged on an output optical path of the laser; the lens is arranged on an emergent light path of the optical attenuation component; the attenuation sheet is arranged on an emergent light path of the lens; the image sensor is fixedly connected with the attenuation sheet and is used for enabling the light beams passing through the attenuation sheet to form a light spot image; the processor is electrically connected with the image sensor and is used for detecting the centroid coordinate and the diameter of the light spot image; when the processor acquires a background image, a light shielding plate is arranged between the lens and the attenuation sheet;
the processor includes:
the first acquisition module is used for acquiring multiple continuous background images; the background image is an image displayed in the image sensor after a laser beam emitted by the laser is blocked by the light blocking plate;
the first calculation module is used for obtaining the total average pixel value and the frame average image of the background image according to the pixel values of a plurality of continuous background images;
a second calculation module, configured to calculate a first noise baseline value according to the ensemble average pixel value and the frame average image; the first noise baseline value is a noise baseline value of a background image;
the second acquisition module is used for acquiring a light spot image; the light spot image is an image displayed by laser beams emitted by the laser device and incident into the image sensor;
the first determining module is used for determining a first light spot area and a second light spot area of the light spot image; the first light spot area is determined according to a first threshold value, and the second light spot area is determined according to a second threshold value; the first threshold is a first noise baseline value; the second threshold is determined using Otsu;
a third calculation module for calculating a second noise baseline value and a third noise baseline value; the second noise baseline value is a noise baseline value of the first light spot region, and the third noise baseline value is a noise baseline value of the second light spot region;
the fourth calculation module is used for calculating a first centroid coordinate and a first spot radius according to the second noise baseline value; the first centroid coordinate is a centroid coordinate of the first light spot region; the first spot radius is the spot radius of the first spot area;
the fifth calculation module is used for calculating a second centroid coordinate and a second spot radius according to the third noise baseline value; the second centroid coordinate is a centroid coordinate of the second light spot region; the second spot radius is the spot radius of the second spot region;
the judging module is used for judging whether the difference value of the first light spot radius and the second light spot radius is larger than a preset error value or not;
the light spot area changing module is used for changing the sizes of the first light spot area and the second light spot area and returning to the third calculating module if the difference value of the first light spot radius and the second light spot radius is larger than a preset error value;
a second determining module, configured to determine a centroid coordinate and a diameter of the spot image according to the first centroid coordinate, the first spot radius, the second centroid coordinate, and the second spot radius if a difference between the first spot radius and the second spot radius is smaller than or equal to a preset error value;
calculating a first noise baseline value according to the ensemble average pixel value and the frame average image, specifically including:
carrying out mean value filtering and smoothing processing on the frame average image to obtain a processed frame average image;
calculating the average pixel value μ of the processed frame-average image:
μ = (1 / (M × N)) · Σ_{i=1..N} Σ_{j=1..M} Q'(i, j),
where Q'(i, j) represents the pixel value of the i-th row and j-th column of the processed frame-average image, M represents the width of the background image, and N represents the height of the background image;
calculating the standard deviation σ of the processed frame-average image:
σ = sqrt( (1 / (M × N)) · Σ_{i=1..N} Σ_{j=1..M} (Q'(i, j) - μ)² );
calculating the first noise baseline value from the overall average pixel value Ā, the standard deviation σ and the size of the mean-filtering template (formula not reproduced in the source), where Ā denotes the overall average pixel value, and mask_x and mask_y denote the width and height, respectively, of the filtering template of the mean filtering.
CN201811017716.7A 2018-09-03 2018-09-03 Laser spot detection method and system Active CN109064508B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811017716.7A CN109064508B (en) 2018-09-03 2018-09-03 Laser spot detection method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811017716.7A CN109064508B (en) 2018-09-03 2018-09-03 Laser spot detection method and system

Publications (2)

Publication Number Publication Date
CN109064508A CN109064508A (en) 2018-12-21
CN109064508B true CN109064508B (en) 2020-10-09

Family

ID=64759321

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811017716.7A Active CN109064508B (en) 2018-09-03 2018-09-03 Laser spot detection method and system

Country Status (1)

Country Link
CN (1) CN109064508B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110260787B (en) * 2019-06-26 2020-12-01 王菲 Laser spot size all-angle evaluation and characterization method
CN110533601B (en) * 2019-07-15 2023-07-18 江苏大学 Method for acquiring center position and contour of laser spot
CN113096059B (en) * 2019-12-19 2023-10-31 合肥君正科技有限公司 Method for eliminating interference shielding detection of night light source by in-vehicle monitoring camera
CN113298762B (en) * 2021-05-07 2022-08-02 威海世高光电子有限公司 flare detection method
CN114563162A (en) * 2022-02-17 2022-05-31 武汉思创精密激光科技有限公司 Fiber laser output light spot diameter testing arrangement

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103455813A (en) * 2013-08-31 2013-12-18 西北工业大学 Method for facula center positioning of CCD image measurement system
CN104966308A (en) * 2015-06-12 2015-10-07 深圳大学 Method for calculating spot size of laser beam
CN107633493A (en) * 2017-09-28 2018-01-26 珠海博明视觉科技有限公司 A kind of method that adaptive background suitable for industrial detection deducts
CN108088427A (en) * 2017-12-30 2018-05-29 浙江维思无线网络技术有限公司 A kind of planar laser beam sending method and device
US10062012B1 (en) * 2014-10-22 2018-08-28 Kla-Tencor Corp. Finding patterns in a design based on the patterns and their surroundings

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10404954B2 (en) * 2016-01-21 2019-09-03 Ricoh Company, Ltd. Optical deflection apparatus, image projector, optical writing unit, and object recognition device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103455813A (en) * 2013-08-31 2013-12-18 西北工业大学 Method for facula center positioning of CCD image measurement system
US10062012B1 (en) * 2014-10-22 2018-08-28 Kla-Tencor Corp. Finding patterns in a design based on the patterns and their surroundings
CN104966308A (en) * 2015-06-12 2015-10-07 深圳大学 Method for calculating spot size of laser beam
CN107633493A (en) * 2017-09-28 2018-01-26 珠海博明视觉科技有限公司 A kind of method that adaptive background suitable for industrial detection deducts
CN108088427A (en) * 2017-12-30 2018-05-29 浙江维思无线网络技术有限公司 A kind of planar laser beam sending method and device

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"An improved algorithm of laser spot center detection in strong noise background";Le Zhang 等;《Proc. SPIE 10616, 2017 International Conference on Optical Instruments and Technology: Optical Systems and Modern Optoelectronic Instruments》;20180110;全文 *
"Estimation and display for far-field energy density distribution of laser spot";Jin Duan 等;《2012 International Conference on Optoelectronics and Microelectronics》;20121031;全文 *
"激光照射光斑检测方法";戴淖敏 等;《长春理工大学学报(自然科学版)》;20150228;第38卷(第1期);第119-123页 *
"自动对焦的多路激光光斑中心定位算法";程马兵;《中国优秀硕士学位论文全文数据库信息科技辑》;20180315(第03期);全文 *

Also Published As

Publication number Publication date
CN109064508A (en) 2018-12-21

Similar Documents

Publication Publication Date Title
CN109064508B (en) Laser spot detection method and system
US10302420B2 (en) Method for reconstructing a surface using spatially structured light and a dynamic vision sensor
CN111308448B (en) External parameter determining method and device for image acquisition equipment and radar
US20050201612A1 (en) Method and apparatus for detecting people using stereo camera
WO2020147639A1 (en) Method for real-time tracking and detection of weld bead trajectory and attitude, electronic device and medium
EP1958158B1 (en) Method for detecting streaks in digital images
US5103105A (en) Apparatus for inspecting solder portion of a circuit board
CN109191513B (en) Power equipment stereo matching method based on global optimization
CN104102069B (en) A kind of focusing method of imaging system and device, imaging system
EP3199914B1 (en) Imaging device
KR101582153B1 (en) Exposure measuring method and apparatus based on the composition for automatic image correction
JPS6345565B2 (en)
JP2013113696A (en) Displacement measuring method and displacement measuring apparatus
CN111680574A (en) Face detection method and device, electronic equipment and storage medium
CN116358417A (en) Device and method for judging object edge position through Fresnel diffraction principle
KR101826711B1 (en) Method for Calibrating Depth Map of ToF camera
CN109448060B (en) Camera calibration parameter optimization method based on bat algorithm
CN109661683B (en) Structured light projection method, depth detection method and structured light projection device based on image content
US20190096079A1 (en) Image measurement device, image measurement method, imaging device
CN113919398B (en) Non-visual field target signal identification method based on deep learning
US9959612B2 (en) Measuring optical turbulence using cell counting algorithms
Nikolova et al. Detecting of Unique Image Features by Using Camera with Controllable Parameters
JPH06180218A (en) Solid shape detection method
KR100269512B1 (en) Method for measuring 3-dimentional shape from image focus using curved window in ccd camera
JP2006106617A (en) Photometric device of camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant