CN116433577B - Light spot detection and light beam quality estimation method and device based on one-dimensional multi-scale convolution feature extraction - Google Patents

Publication number
CN116433577B
Authority
CN
China
Prior art keywords
light
camera
image
dimensional
beam quality
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310062761.9A
Other languages
Chinese (zh)
Other versions
CN116433577A (en)
Inventor
宁鸿章
庹文波
***
武春风
胡黎明
王凯
张倩
罗昱
代弋
张培健
雷景添
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Space Sanjiang Group Co Ltd
Original Assignee
China Space Sanjiang Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Space Sanjiang Group Co Ltd filed Critical China Space Sanjiang Group Co Ltd
Priority to CN202310062761.9A priority Critical patent/CN116433577B/en
Publication of CN116433577A publication Critical patent/CN116433577A/en
Application granted granted Critical
Publication of CN116433577B publication Critical patent/CN116433577B/en

Classifications

    • G06T 7/0002: Image analysis; inspection of images, e.g. flaw detection
    • G06N 3/02, G06N 3/08: Neural networks; learning methods
    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/70: Denoising; smoothing
    • G06V 10/52: Extraction of image or video features; scale-space analysis, e.g. wavelet analysis
    • G06V 10/7715: Feature extraction, e.g. by transforming the feature space; mappings, e.g. subspace methods
    • G06V 10/82: Image or video recognition or understanding using neural networks
    • G06T 2207/10016: Image acquisition modality: video; image sequence
    • G06T 2207/20081: Training; learning
    • G06T 2207/20084: Artificial neural networks [ANN]
    • G06T 2207/20212, G06T 2207/20224: Image combination; image subtraction
    • G06T 2207/30168: Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Quality & Reliability (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a light spot detection and beam quality estimation method and device based on one-dimensional multi-scale convolution feature extraction, the method comprising three parts: noise threshold calibration, spot detection, and beam quality estimation. First, the camera noise threshold is calibrated to eliminate the interference of system noise with spot detection; then the two-dimensional image is compressed into a one-dimensional space, and the size and position of the light spot are detected using multi-scale one-dimensional convolution kernels; finally, the beam quality of the incident beam is estimated from the actual spot size obtained by image detection and the theoretical spot size obtained by calculation. The noise threshold calibration and feature dimension-reduction techniques used to process the image in this way effectively suppress background noise interference and improve the accuracy, robustness, efficiency, and anti-interference capability of spot detection.

Description

Light spot detection and light beam quality estimation method and device based on one-dimensional multi-scale convolution feature extraction
Technical Field
The invention relates to the technical field of optical measurement, in particular to a light spot detection and light beam quality estimation method and device based on one-dimensional multi-scale convolution feature extraction.
Background
Optical measurement is one of the important techniques in modern industrial measurement, offering irreplaceable advantages such as high measurement accuracy and non-contact operation. Spot detection is a common and important basic problem in optical measurement, with wide application in object tracking and positioning, three-dimensional morphology acquisition, camera calibration, laser measurement, and other fields. At present, spot detection methods fall mainly into two categories: positioning methods based on the two-dimensional shape of the spot, such as the moment method, the centroid (gravity-center) method, and fitting or transform methods based on the circle or arc of the spot edge; and surface-fitting methods, such as Gaussian surface fitting. Both categories process the two-dimensional image directly, are easily disturbed by local outliers, involve a large amount of computation, and leave room for improvement in robustness and real-time performance.
In the prior art, the application with application number 202210385803.8, published on July 5, 2022 and entitled "Method for extracting the centroids of multiple sub-light-spots based on edge detection and target tracking", uses edge detection and target tracking algorithms and, by introducing a detection mechanism of multi-target multi-line-of-sight beam co-diffraction imaging, can detect the centroids of multi-target light spots. In that solution, however, an edge-extraction algorithm is used to segment all sub-spots and extract the centroid of each sub-spot, after which a neighborhood-search algorithm automatically matches the different sub-spots with each sub-aperture into multiple arrays, thereby extracting the centroids of the multiple sub-spots.
In view of the foregoing, there is a need for an improved method and apparatus for spot detection and beam quality estimation based on one-dimensional multi-scale convolution feature extraction.
Disclosure of Invention
The invention aims to provide a light spot detection and beam quality estimation method and device based on one-dimensional multi-scale convolution feature extraction. A camera noise threshold calibration technique eliminates the interference of camera system noise with spot detection and improves the accuracy and efficiency of the algorithm; a feature dimension-reduction technique that compresses the two-dimensional image into a one-dimensional space reduces the computational load of image processing, eliminates the interference of local image anomalies with spot detection, and improves the robustness and efficiency of the algorithm; and multi-scale one-dimensional convolution kernels of a specific shape perform feature enhancement and spot extraction on the one-dimensional arrays obtained after image compression, improving the accuracy of the algorithm.
In order to achieve the above object, the present invention provides a method for detecting light spots and estimating the quality of light beams based on one-dimensional multi-scale convolution feature extraction, comprising the following steps:
S1, calibrating the camera noise threshold;
S2, compressing the image into one-dimensional arrays in the horizontal and vertical directions respectively, the compression method being to compute the sum, mean, or variance of each row or column;
S3, spot detection based on multi-scale one-dimensional convolution feature extraction;
S4, estimating the beam quality of the incident beam using the actual spot size obtained from image detection.
Preferably, in step S3, the spot detection includes the steps of:
S31, subtracting the set noise threshold from the spot image detected when the beam is incident, eliminating background noise interference in the image;
S32, compressing the image into one-dimensional arrays in the horizontal and vertical directions;
S33, setting the minimum size of the convolution kernel to min_sz and the maximum size to max_sz, with initial value i = min_sz; the size range of the convolution kernel is either the size of the image or is further calculated from the theoretical spot diameter d_theory;
S34, creating a loop that ends when i > max_sz.
Preferably, in step S4, the calculation formula of the beam quality estimation is β = d_measured / d_theory = d_measured·D / (2.44·λ·f), where β is the beam quality, d_measured is the measured spot diameter, D is the beam diameter, f is the focal length, and λ is the wavelength of light; the expression for d_measured is d_measured = √(w·h), where w is the width of the spot and h is the height of the spot.
Preferably, the theoretical spot diameter d_theory is calculated as d_theory = 2.44·λ·f / D, and the size range of the convolution kernel is a·d_theory ~ b·d_theory, where a < 1, b > 1, D is the beam diameter, f is the focal length, and λ is the wavelength of light.
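As a minimal numerical sketch of this sizing rule: the Airy-disk form d_theory = 2.44·λ·f/D is a reconstruction (the patent's own formula image did not survive extraction), and the values a = 0.5, b = 2 and the pixel pitch are illustrative assumptions, not values from the patent.

```python
def theoretical_spot_diameter(wavelength, focal_length, beam_diameter):
    # Diffraction-limited (Airy) spot diameter: d_theory = 2.44 * lambda * f / D.
    # Assumed reconstruction of the patent's formula.
    return 2.44 * wavelength * focal_length / beam_diameter

def kernel_size_range(d_theory_px, a=0.5, b=2.0):
    # Convolution-kernel sizes searched around the theoretical diameter,
    # with a < 1 < b; the values of a and b here are illustrative.
    return max(1, int(a * d_theory_px)), int(b * d_theory_px)

# Example: 1.064 um laser, f = 0.5 m, D = 50 mm beam, 5 um camera pixels.
d_theory = theoretical_spot_diameter(1.064e-6, 0.5, 0.05)   # metres
d_theory_px = d_theory / 5e-6                                # in pixels
min_sz, max_sz = kernel_size_range(d_theory_px)
```

With these example values the theoretical spot is roughly 26 µm (about 5 pixels), so the kernel search range spans a few pixels around it.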
Preferably, the calibration of the camera noise threshold is performed as follows: first setting the parameters of the camera sensor and the electric-tuning-mirror lens of the spot detection and beam quality estimation device, then collecting multiple frames of images with no beam incident, and finally performing noise analysis on the multi-frame images and setting the noise threshold y_n; under the assumption that the image noise satisfies a normal distribution, the noise threshold y_n is estimated.
Preferably, the expression for the noise threshold y_n is y_n = m + 3·δ, where m is the mean of the noise and δ is the standard deviation; alternatively, the noise threshold y_n is the average of the maximum noise values in the single images.
Preferably, in step S34, the loop proceeds as follows:
S341, creating a one-dimensional convolution kernel with scale size i;
S342, convolving the multi-scale one-dimensional convolution kernel with the horizontal and vertical one-dimensional arrays respectively to obtain response arrays;
S343, obtaining the maximum value of the response array and calculating the ratio of the current maximum to the current convolution kernel size;
S344, judging whether the ratio is the historical maximum and exceeds a set ratio threshold y; if so, setting the current convolution kernel size as the spot size and the coordinate at the maximum as the coordinate of the spot origin, otherwise proceeding directly to step S345; the expression for the ratio threshold y of the response-array maximum to the convolution kernel size is y = m_1 + n·δ_1, where m_1 is the mean of the one-dimensional array after image compression, δ_1 is the standard deviation of the one-dimensional array after image compression, and n is a coefficient;
S345, increasing the value of i by a step number t, where t is a natural number greater than 0.
In particular, a light spot detection and beam quality estimation device based on one-dimensional multi-scale convolution feature extraction is provided, with which the light spot detection and beam quality estimation method based on one-dimensional multi-scale convolution feature extraction according to any one of claims 1-7 is performed. The device comprises an electric tuning mirror for adjusting the input angle of the incident light, and an optical filter wheel and a camera arranged in sequence at the tail end of the electric tuning mirror; the camera is connected to an image processing device, the optical filter wheel communicates with the image processing device through a serial port, and the electric tuning mirror and the optical filter wheel are used to receive and execute control instructions issued by the image processing device.
Preferably, the camera is used to collect a spot image of the incident beam, and the response curve of the camera must match the wavelength range of the incident beam; the camera may be an ordinary camera transmitting images over a gigabit network, or a scientific camera transmitting high-frame-rate images over interfaces such as Camera Link or optical fiber; the image processing device is provided with communication interfaces such as a gigabit network port, Camera Link, optical fiber, and a serial port.
Preferably, the image processing device adopts an architecture such as FPGA-DSP, FPGA-CPU, or FPGA-GPU, in which the FPGA collects and transmits communication information with the camera, the electric tuning mirror, and the optical filter wheel, and the DSP/CPU/GPU performs the calculations for spot detection, beam quality estimation, and so on.
The beneficial effects of the invention are as follows:
1. The invention provides a spot detection and beam quality estimation method based on one-dimensional multi-scale convolution feature extraction, comprising three parts: noise threshold calibration, spot detection, and beam quality estimation. The method first calibrates the camera noise threshold, eliminating the interference of system noise with spot detection; it then compresses the two-dimensional image into a one-dimensional space and detects the size and position of the light spot using multi-scale one-dimensional convolution kernels; finally, it estimates the beam quality of the incident beam from the actual spot size obtained by image detection and the theoretical spot size obtained by calculation. In this way, background noise interference is effectively suppressed, and the accuracy, robustness, efficiency, and anti-interference capability of spot detection are improved.
2. In the spot detection and beam quality estimation method based on one-dimensional multi-scale convolution feature extraction, converting the two-dimensional image into one-dimensional arrays markedly reduces the amount of computation compared with spot detection algorithms operating on the two-dimensional image, improving the efficiency of the spot detection algorithm; computing the sum, mean, or variance along rows and columns during image dimension reduction effectively suppresses the influence of local outliers on the whole, improving the anti-interference capability, accuracy, and robustness of the algorithm. A convolution kernel with a specific distribution of values excites information within the spot interval positively and information in other intervals negatively, which raises the response of the characteristic peak caused by the spot in the one-dimensional array after dimension reduction while suppressing the response of peak-free regions, thereby enhancing the spot features and improving the accuracy of the spot detection algorithm. Collecting multi-frame image data before image processing and analyzing the statistical characteristics of the noise avoids extra computational cost during image processing and improves the accuracy and efficiency of the algorithm.
3. The light spot detection and beam quality estimation method based on one-dimensional multi-scale convolution feature extraction is performed on a spot detection and beam quality estimation device built from an image processing device, an optical filter wheel, an electric tuning mirror, and a camera. It offers high detection efficiency and precision, meets the requirements of highly real-time spot detection and beam quality estimation, adapts to changes in light intensity and to input beams with different incidence angles, has strong environmental adaptability and good stability, and has great engineering application value.
Drawings
FIG. 1 is a flow chart of a spot detection and beam quality estimation method based on one-dimensional multi-scale convolution feature extraction according to the present invention;
FIG. 2 is a schematic flow chart of the noise threshold calibration in FIG. 1;
FIG. 3 is a flow chart of the spot detection algorithm of FIG. 1;
FIG. 4 is a one-dimensional convolution kernel feature map in example 1 of the present invention;
FIG. 5 is a schematic structural diagram of the light spot detection and beam quality estimation device based on one-dimensional multi-scale convolution feature extraction.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail with reference to the accompanying drawings and specific embodiments.
It should be noted that, in order to avoid obscuring the present invention due to unnecessary details, only structures and/or processing steps closely related to aspects of the present invention are shown in the drawings, and other details not greatly related to the present invention are omitted.
In addition, it should be further noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Referring to FIG. 1 to FIG. 4, the method for detecting light spots and estimating light beam quality based on one-dimensional multi-scale convolution feature extraction of the present invention includes the following steps:
S1, calibrating the camera noise threshold:
the camera noise threshold calibration is performed in the following manner: firstly setting parameters of a camera sensor of a light spot detection and light beam quality estimation device and a lens of an electronic tuning mirror, then collecting multiple frames of images when no light beam is incident, finally carrying out noise analysis on the multiple frames of images, and setting a noise threshold y n The noise analysis may assume that the image noise satisfies a normal distribution, and thus for the noise threshold y n Making an estimation, i.e. y n =m+3·δ, where m is the mean of the noise and δ is the standard deviation; noise threshold y n The average value of the maximum noise value in a single image can also be taken;
S2, image feature dimension reduction: the image is compressed into one-dimensional arrays in the horizontal and vertical directions respectively; the compression method may compute the sum, mean, or variance of each row or column;
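The dimension-reduction step can be sketched as follows; the sum/mean/variance options follow the text, while the helper name and the toy image are ours.

```python
import numpy as np

def compress_to_1d(image, mode="sum"):
    # Collapse a 2-D image into a horizontal (per-column) and a vertical
    # (per-row) 1-D array using the sum, mean, or variance.
    op = {"sum": np.sum, "mean": np.mean, "var": np.var}[mode]
    horizontal = op(image, axis=0)   # one value per column
    vertical = op(image, axis=1)     # one value per row
    return horizontal, vertical

img = np.zeros((8, 10))
img[2:5, 3:7] = 1.0                  # a 3-row by 4-column bright "spot"
h, v = compress_to_1d(img, "sum")
```

A local outlier pixel contributes only a small fraction of one row sum and one column sum, which is why this aggregation suppresses local anomalies compared with direct 2-D processing.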
S3, spot detection based on multi-scale one-dimensional convolution feature extraction;
S4, estimating the beam quality of the incident beam using the actual spot size obtained from image detection; the light spot is the representation on the image of a beam emitted by a light source such as a laser or lamp, after imaging through the lens and camera.
Preferably, the spot detection process of step S3 includes the steps of:
S31, subtracting the set noise threshold from the spot image detected when the beam is incident, eliminating background noise interference in the image;
S32, compressing the image into one-dimensional arrays in the horizontal and vertical directions;
S33, setting the minimum size min_sz and the maximum size max_sz of the convolution kernel, with initial value i = min_sz; the size range of the convolution kernel may equal the size of the image (0~w, 0~h), or d_theory may first be calculated as d_theory = 2.44·λ·f / D and the size range of the convolution kernel then set to a·d_theory ~ b·d_theory, where a < 1, b > 1, D is the beam diameter, f is the focal length, and λ is the wavelength of light;
S34, creating a loop that ends when i > max_sz;
The loop proceeds according to the following steps:
S341, creating a one-dimensional convolution kernel with scale size i;
S342, convolving the multi-scale one-dimensional convolution kernel with the horizontal and vertical one-dimensional arrays respectively to obtain response arrays;
S343, obtaining the maximum value of the response array and calculating the ratio of the current maximum to the current convolution kernel size;
S344, judging whether the ratio is the historical maximum and exceeds a set ratio threshold y; if so, setting the current convolution kernel size as the spot size and the coordinate at the maximum as the coordinate of the spot origin, otherwise proceeding directly to step S345; the expression for the ratio threshold y of the response-array maximum to the convolution kernel size is y = m_1 + n·δ_1, where m_1 is the mean of the one-dimensional array after image compression, δ_1 is the standard deviation of the one-dimensional array after image compression, and n is a coefficient;
S345, increasing the value of i by one.
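The loop of steps S341 to S345 can be sketched as below. The exact kernel values appear only in FIG. 4 of the patent, so the kernel here is one plausible shape satisfying the described distribution (positive inside the spot interval, negative outside); the historical-maximum bookkeeping follows the text, while the ratio-threshold test y = m_1 + n·δ_1 is omitted for brevity.

```python
import numpy as np

def make_kernel(i):
    # Positive over a centre of width i (the spot interval), negative on
    # the flanks; one plausible shape for the distribution in FIG. 4.
    flank = max(1, i // 2)
    return np.concatenate([-np.ones(flank), np.ones(i), -np.ones(flank)])

def detect_spot_1d(profile, min_sz, max_sz, step=1):
    best_ratio, best_size, best_pos = -np.inf, None, None
    i = min_sz
    while i <= max_sz:                        # S34: loop ends when i > max_sz
        response = np.convolve(profile, make_kernel(i), mode="same")  # S342
        pos = int(np.argmax(response))        # S343: maximum of the response
        ratio = response[pos] / i             # ratio to current kernel size
        if ratio > best_ratio:                # S344: historical maximum
            best_ratio, best_size, best_pos = ratio, i, pos
        i += step                             # S345: advance by step t
    return best_size, best_pos

# Synthetic compressed profile: a spot of width 7 starting at index 20.
profile = np.zeros(64)
profile[20:27] = 10.0
size, pos = detect_spot_1d(profile, min_sz=3, max_sz=15)
```

The peak response-to-size ratio occurs when the positive part of the kernel matches the spot width, which is what makes the search return both the spot size and its position.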
Preferably, in step S4, the estimate of the beam quality is calculated using the formula β = d_measured / d_theory = d_measured·D / (2.44·λ·f), where β is the beam quality and d_measured is the measured spot diameter; with the spot size detected by the spot detection algorithm, defining the width of the spot as w and the height as h, d_measured = √(w·h); D is the beam diameter, f is the focal length, and λ is the wavelength of light.
In particular, referring to FIG. 5, the apparatus for performing spot detection and beam quality estimation based on one-dimensional multi-scale convolution feature extraction comprises an electric tuning mirror for adjusting the input angle of the incident light, and an optical filter wheel and a camera arranged in sequence at the tail end of the electric tuning mirror. The camera is connected to an image processing device, the optical filter wheel communicates with the image processing device through a serial port, and the electric tuning mirror and the optical filter wheel receive and execute control instructions issued by the image processing device; the specific parameters of each filter gear can be set according to actual conditions.
Specifically, the camera is used to collect a spot image of the incident beam, and the response curve of the camera must match the wavelength range of the incident beam. The optical filter wheel is used to adjust the filter gear, which prevents an incident beam that is too strong from damaging the camera and prevents one that is too weak from impairing camera observation. The electric tuning mirror is used to adjust the input angle of the incident beam, preventing changes in the incidence angle from drifting outside the observation range of the camera. The image processing device can detect the image spot, adjust the orientation of the electric tuning mirror according to the spot position in the image, and automatically switch the optical filter wheel according to the image exposure.
Preferably, the camera can be a common camera for transmitting images by using a gigabit network, or a scientific camera for transmitting high-frame-rate images by using a camera link, an optical fiber interface and the like.
Preferably, the image processing device is provided with communication interfaces such as a gigabit network port, a camera link, an optical fiber, a serial port and the like, and can realize communication and control with a camera, an electronic tuning mirror and an optical filter rotating wheel.
Preferably, the image processing device can adopt architectures such as FPGA-DSP, FPGA-CPU, FPGA-GPU and the like, wherein the FPGA is used for collecting and sending communication information with a camera, an electronic tuning mirror and a light filter, and the DSP/CPU/GPU is used for calculating spot detection, beam quality estimation and the like.
The spot detection and beam quality estimation method and device based on one-dimensional multi-scale convolution feature extraction of the present invention are further described below with reference to specific embodiments:
example 1
The invention provides a light spot detection and beam quality estimation method based on one-dimensional multi-scale convolution feature extraction, which specifically comprises the following steps:
S1, calibrating the camera noise threshold:
the camera noise threshold calibration is performed in the following manner: firstly setting parameters of a camera sensor of a light spot detection and light beam quality estimation device and a lens of an electronic tuning mirror, then collecting multiple frames of images when no light beam is incident, finally carrying out noise analysis on the multiple frames of images, and setting a noise threshold y n The noise analysis may assume that the image noise satisfies a normal distribution, and thus for the noise threshold y n Estimation is performed, wherein y n =m+3·δ, m is the mean of noise, δ is the standard deviation; in other embodiments, the noise threshold y n The average value of the maximum noise value in the single image can also be taken, which is not limited herein;
S2, image feature dimension reduction: the image is compressed into one-dimensional arrays in the horizontal and vertical directions respectively; the compression method may compute the sum, mean, or variance of each row or column;
S3, spot detection based on multi-scale one-dimensional convolution feature extraction, the spot detection process comprising the following steps:
S31, subtracting the set noise threshold from the spot image detected when the beam is incident, eliminating background noise interference in the image;
S32, compressing the image into one-dimensional arrays in the horizontal and vertical directions;
S33, setting the minimum size min_sz and the maximum size max_sz of the convolution kernel, with initial value i = min_sz; the size range of the convolution kernel may equal the size of the image (0~w, 0~h), or d_theory may first be calculated as d_theory = 2.44·λ·f / D and the size range of the convolution kernel then set to a·d_theory ~ b·d_theory, where a < 1, b > 1, D is the beam diameter, f is the focal length, and λ is the wavelength of light;
S34, creating a loop that ends when i > max_sz;
The loop proceeds according to the following steps:
S341, creating a one-dimensional convolution kernel with scale size i, the kernel satisfying the characteristic distribution shown in FIG. 4, that is, convolving the one-dimensional array obtained by image compression with the kernel enhances the characteristic peak of the light spot;
s342, performing convolution operation on the multi-scale one-dimensional convolution kernel and the horizontal one-dimensional array and the vertical one-dimensional array respectively to obtain a response array;
s343, obtaining the maximum value of the response array, and calculating the ratio of the current maximum value to the current convolution kernel size;
S344, judging whether the ratio is the historical maximum and exceeds a set ratio threshold y; if so, setting the current convolution kernel size as the spot size and the coordinate at the maximum as the coordinate of the spot origin; otherwise, proceeding directly to step S345; the ratio threshold y for the maximum of the response array over the convolution kernel size is given by y = m1 + n·δ1, where m1 is the mean of the one-dimensional array after image compression, δ1 is its standard deviation, and n is a coefficient;
s345, adding the value of i by a step number t, wherein t is a natural number greater than 0.
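The loop of steps S33-S345 can be sketched, per compressed profile (horizontal or vertical), as follows. The kernel shape is an assumption: the patent only specifies the qualitative distribution of fig. 4, so a Hann window peaked at its centre is used here as one plausible choice; the function name is likewise illustrative:

```python
import numpy as np

def detect_spot_1d(profile, min_sz, max_sz, step=1, n=3.0):
    # Ratio threshold from step S344: y = m1 + n*delta1, where m1 and
    # delta1 are the mean and standard deviation of the compressed array.
    y = profile.mean() + n * profile.std()
    best_ratio = -np.inf
    spot_size, spot_origin = None, None
    i = min_sz
    while i <= max_sz:                  # S34: loop ends once i > max_sz
        # S341: a kernel peaked at its centre, so that convolution
        # enhances the spot's characteristic peak (Hann window assumed).
        kernel = np.hanning(i + 2)[1:-1]
        # S342: convolve the kernel with the compressed 1-D profile.
        response = np.convolve(profile, kernel, mode="same")
        # S343: ratio of the response maximum to the kernel size.
        ratio = response.max() / i
        # S344: record only a new historical maximum that exceeds y.
        if ratio > best_ratio:
            best_ratio = ratio
            if ratio > y:
                spot_size = i
                spot_origin = int(response.argmax())
        i += step                       # S345: advance i by step t
    return spot_size, spot_origin
```

Running the function on the horizontal and vertical profiles separately yields the spot width/height and the x/y coordinates of the spot origin.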
Preferably, in step S4, the beam quality estimate is calculated with the following formula: β = d_meas/d_theory, where β is the beam quality and d_meas is the measured spot diameter, derived from the spot width w and height h detected by the spot detection algorithm; d_theory = 2.44·λ·f/D, where D is the beam diameter, f is the focal length, and λ is the wavelength of light.
S4, estimating the beam quality of the incident beam using the actual spot size obtained by image detection, wherein the beam quality estimate is calculated with the following formula: β = d_meas/d_theory, where β is the beam quality and d_meas is the measured spot diameter, derived from the spot width w and height h detected by the spot detection algorithm; d_theory = 2.44·λ·f/D, where D is the beam diameter, f is the focal length, and λ is the wavelength of light.
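The estimate can be sketched as below. Note two assumptions: d_theory = 2.44·λ·f/D is the standard far-field diffraction-limited spot diameter, consistent with step S33; the geometric-mean combination of w and h into a single measured diameter is illustrative only, since the patent's exact expression for d_meas is an embedded image not reproduced in the text:

```python
def beam_quality(w, h, D, f, wavelength):
    """Beta = measured spot diameter / theoretical diffraction-limited
    diameter. The sqrt(w*h) combination is an assumed equivalent
    diameter for a w-by-h spot, not the patent's stated formula."""
    d_meas = (w * h) ** 0.5                  # assumed equivalent diameter
    d_theory = 2.44 * wavelength * f / D     # far-field Airy-disk diameter
    return d_meas / d_theory
```

A diffraction-limited spot (d_meas = d_theory) gives β = 1; larger measured spots give β > 1, indicating poorer beam quality.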
Specifically, the process of spot detection and beam quality estimation is carried out by means of a device comprising an electrically tuned mirror for adjusting the input angle of the incident light, a filter wheel, and a camera, the filter wheel and the camera being arranged in sequence at the exit of the electrically tuned mirror. The camera is connected to an image processing device; the filter wheel communicates with the image processing device through a serial port, and receives and executes the control instructions it issues. The specific parameters of each filter position can be set according to the actual situation.
In summary, the spot detection and beam quality estimation method and device based on one-dimensional multi-scale convolution feature extraction provided by the invention comprise three parts: noise threshold calibration, spot detection, and beam quality estimation. The method first calibrates the camera noise threshold, eliminating the interference of system noise with spot detection; it then compresses the two-dimensional image into one-dimensional space and detects the size and position of the light spot with multi-scale one-dimensional convolution kernels; finally, it estimates the beam quality of the incident beam from the actual spot size obtained by image detection and the calculated theoretical spot size. In this way, background noise interference is effectively suppressed, and the accuracy, robustness, efficiency, and anti-interference capability of spot detection are improved.
The above embodiments are only for illustrating the technical solution of the present invention and not for limiting the same, and although the present invention has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications and equivalents may be made thereto without departing from the spirit and scope of the technical solution of the present invention.

Claims (6)

1. The light spot detection and light beam quality estimation method based on one-dimensional multi-scale convolution feature extraction is characterized by comprising the following steps:
s1, calibrating a camera noise threshold;
S2, compressing the image into one-dimensional arrays along the horizontal direction and the vertical direction respectively, wherein the compression method calculates the sum, the mean, or the variance of each row or column;
s3, spot detection based on multi-scale one-dimensional convolution feature extraction;
the light spot detection comprises the following steps:
s31, subtracting a set noise threshold from a spot image detected when the light beam is incident, and eliminating background noise interference in the image;
s32, compressing the image into a one-dimensional array in the horizontal and vertical directions;
S33, setting the minimum size of the convolution kernel as min_sz, the maximum size as max_sz, and the initial value as i = min_sz; the size range of the convolution kernel is set to the size of the image, or is further calculated from the theoretical spot diameter d_theory;
wherein the theoretical spot diameter is calculated as d_theory = 2.44·λ·f/D, and the size range of the convolution kernel is a·d_theory ~ b·d_theory, where a < 1, b > 1, D is the beam diameter, f is the focal length, and λ is the wavelength of light;
S34, creating a loop that ends when i is greater than max_sz;
the cycle proceeds as follows:
s341, creating a one-dimensional convolution kernel with the scale size of i;
s342, performing convolution operation on the multi-scale one-dimensional convolution kernel and the horizontal one-dimensional array and the vertical one-dimensional array respectively to obtain a response array;
s343, obtaining the maximum value of the response array, and calculating the ratio of the current maximum value to the current convolution kernel size;
S344, judging whether the ratio is the historical maximum and exceeds a set ratio threshold y; if so, setting the current convolution kernel size as the spot size and the coordinate at the maximum as the coordinate of the spot origin; otherwise, proceeding directly to step S345; wherein the ratio threshold y for the maximum of the response array over the convolution kernel size is given by y = m1 + n·δ1, where m1 is the mean of the one-dimensional array after image compression, δ1 is the standard deviation of the one-dimensional array after image compression, and n is a coefficient;
s345, adding a step number t to the value of i, wherein t is a natural number greater than 0;
s4, estimating the beam quality of the incident beam by using the actual spot size obtained by image detection;
the calculation formula of the beam quality estimation is: β = d_meas/d_theory, where β is the beam quality, d_meas is the measured spot diameter, d_theory = 2.44·λ·f/D, D is the beam diameter, f is the focal length, and λ is the wavelength of light; d_meas is derived from the spot width w and the spot height h detected by the spot detection algorithm.
2. The method for spot detection and beam quality estimation based on one-dimensional multi-scale convolution feature extraction according to claim 1, wherein the camera noise threshold calibration is performed as follows: first, setting the parameters of the camera sensor and the electrically tuned mirror lens of the spot detection and beam quality estimation device; then, collecting multiple frames of images with no beam incident; and finally, performing noise analysis on the multiple frames and setting the noise threshold y_n, wherein under the assumption that the image noise satisfies a normal distribution, the noise threshold y_n is estimated.
3. The method for spot detection and beam quality estimation based on one-dimensional multi-scale convolution feature extraction according to claim 2, wherein the noise threshold is given by y_n = m + 3·δ, where m is the mean of the noise and δ is the standard deviation; or the noise threshold y_n is taken as the average of the maximum noise values in single images.
4. A spot detection and beam quality estimation device based on one-dimensional multi-scale convolution feature extraction, characterized in that the device performs the spot detection and beam quality estimation method based on one-dimensional multi-scale convolution feature extraction according to any one of claims 1-3, and comprises an electrically tuned mirror for adjusting the input angle of the incident light, a filter wheel, and a camera, the filter wheel and the camera being arranged in sequence at the exit of the electrically tuned mirror; the camera is connected to an image processing device, and the filter wheel communicates with the image processing device through a serial port and receives and executes control instructions issued by the image processing device.
5. The spot detection and beam quality estimation device based on one-dimensional multi-scale convolution feature extraction according to claim 4, wherein the camera is used for collecting the spot image of the incident beam, and the response curve of the camera is required to match the wavelength range of the incident beam; the camera is either a common camera transmitting images over a gigabit network, or a scientific camera transmitting high-frame-rate images over Camera Link and optical-fiber interfaces; and the image processing device is provided with gigabit network, Camera Link, optical-fiber, and serial communication interfaces.
6. The spot detection and beam quality estimation device based on one-dimensional multi-scale convolution feature extraction according to claim 4, wherein the image processing device adopts an FPGA+DSP, FPGA+CPU, or FPGA+GPU architecture, wherein the FPGA is used for acquiring and transmitting communication information with the camera, the electrically tuned mirror, and the filter wheel, and the DSP/CPU/GPU is used for the spot detection and beam quality estimation calculations.
CN202310062761.9A 2023-01-18 2023-01-18 Light spot detection and light beam quality estimation method and device based on one-dimensional multi-scale convolution feature extraction Active CN116433577B (en)
