CN112985269B - Slit width uniformity measuring system, slit width uniformity measuring method and image processing device - Google Patents

Slit width uniformity measuring system, slit width uniformity measuring method and image processing device Download PDF

Info

Publication number
CN112985269B
Authority
CN
China
Prior art keywords
slit
image
detected
target
sum
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110192686.9A
Other languages
Chinese (zh)
Other versions
CN112985269A (en)
Inventor
梁敏勇
崔厚欣
于彩虹
邓家春
冯浩
孙泽宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hebei Sailhero Environmental Protection High Tech Co ltd
Original Assignee
Hebei Sailhero Environmental Protection High Tech Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hebei Sailhero Environmental Protection High Tech Co ltd
Priority to CN202110192686.9A priority Critical patent/CN112985269B/en
Publication of CN112985269A publication Critical patent/CN112985269A/en
Application granted granted Critical
Publication of CN112985269B publication Critical patent/CN112985269B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 - Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention is applicable to the technical field of optics and provides a slit width uniformity measuring system, a slit width uniformity measuring method and an image processing device. The slit width uniformity measurement system includes a light source, a slit to be detected, a first lens group, an image acquisition device and an image processing device, wherein: the light source is used for generating a light beam; the light beam passes through the slit to be detected and then through the first lens group to reach the image acquisition device; the image acquisition device is used for acquiring the image formed by the slit to be detected through the first lens group so as to generate a target image, and for sending the target image to the image processing device; the width direction of the image of the slit to be detected is parallel to the pixel rows or pixel columns of the target image; and the image processing device is used for detecting the target image to obtain the width uniformity coefficient of the slit to be detected. The invention enables quantitative evaluation of slit width uniformity and solves the problem of inaccurate judgment caused by the lack of a quantitative basis in the field of slit width uniformity measurement.

Description

Slit width uniformity measuring system, slit width uniformity measuring method and image processing device
Technical Field
The invention belongs to the technical field of optics, and particularly relates to a slit width uniformity measuring system and method and an image processing device.
Background
An optical slit (hereinafter referred to simply as a slit) is widely used in optical instruments. It is a key component of instruments such as spectrometers, monochromators and push-broom hyperspectral imagers, and serves as a control mechanism for adjusting parameters such as luminous flux, spectral resolution and spatial resolution. The uniformity of the slit width has a significant impact on the performance of the instrument. An ideal slit does not exist in practice: typical slit widths range from a few microns to several hundred microns, and burrs are present at the microscopic scale.
However, the current practice of observing the slit under a microscope yields only a visual impression; it provides no quantitative basis for judging the uniformity of the slit width.
Disclosure of Invention
In view of the above, the present invention provides a slit width uniformity measuring system, a slit width uniformity measuring method, and an image processing apparatus, so as to solve the problem that the prior art lacks a basis for quantitative determination of slit width uniformity.
A first aspect of an embodiment of the present invention provides a slit width uniformity measurement system, including: a light source, a slit to be detected, a first lens group, an image acquisition device and an image processing device, wherein:
The light source is used for generating a light beam; the light beam passes through the slit to be detected and then passes through the first lens group to reach the image acquisition device;
the image acquisition device is used for acquiring an image formed by the slit to be detected through the first lens group so as to generate a target image and sending the target image to the image processing device; the width direction of the image of the slit to be detected is parallel to the pixel row or the pixel column of the target image;
the image processing device is used for detecting the target image to obtain the width uniformity coefficient of the slit to be detected.
Based on the first aspect, in a first implementation manner, the system further includes a second lens group, and the light beam passes through the second lens group before passing through the slit to be detected.
Based on the first aspect, in a second implementation manner, the image processing apparatus is configured to:
determining, for each target pixel row, the sum of the gray values of the pixels occupied by the image of the slit to be detected in that row, wherein a target pixel row is a pixel row in which pixels of the image of the slit to be detected are present;
and determining the width uniformity coefficient of the slit to be detected based on the sum of the gray values corresponding to each target pixel row.
Based on the second possible implementation manner of the first aspect, in a third possible implementation manner, the image processing apparatus is specifically configured to:
normalizing the sum of the gray values corresponding to each row of target pixel rows;
and determining the sum of the gray values corresponding to each target pixel row after normalization processing as the width uniformity coefficient of the slit to be measured.
Based on the third possible implementation manner of the first aspect, in a fourth possible implementation manner, the image processing apparatus is specifically configured to:
determining the reciprocal of the maximum value in the sum of the gray values corresponding to all the target pixel rows as a normalization coefficient;
and multiplying the sum of the gray values corresponding to each row of target pixel rows by the normalization coefficient to obtain the sum of the gray values corresponding to each row of target pixel rows after normalization processing.
Based on the second possible implementation manner of the first aspect, in a fifth possible implementation manner, the number of pixels occupied by the image of the slit to be measured is a preset number;
the ratio of the width of the image of the slit to be detected to the width of the slit to be detected is a preset ratio;
the image processing device is used for determining the width of the image of the slit to be detected based on the preset proportion and the width of the slit to be detected; and determining the preset number based on the width of the image of the slit to be detected and the size parameter of the image acquisition device.
Based on any one possible implementation manner of the first aspect, in a sixth possible implementation manner, the light source includes:
an integrating sphere light source;
the image acquisition device includes:
a CCD camera.
A second aspect of the embodiments of the present invention provides a slit width uniformity measurement method, including:
acquiring a target image, wherein the target image comprises an image of a slit to be detected, and the width direction of the image of the slit to be detected is parallel to a pixel row of the target image;
determining, for each target pixel row, the sum of the gray values of the pixels occupied by the image of the slit to be detected in that row, wherein a target pixel row is a pixel row in which pixels of the image of the slit to be detected are present;
and determining the width uniformity coefficient of the slit to be detected based on the sum of the gray values corresponding to each target pixel row.
Based on the second aspect, in a first implementation manner, determining a width uniformity coefficient of the slit to be measured based on a sum of gray values corresponding to each target pixel row includes:
normalizing the sum of the gray values corresponding to each row of target pixel rows;
and determining the sum of the gray values corresponding to each target pixel row after normalization processing as the width uniformity coefficient of the slit to be measured.
A third aspect of the embodiments of the present invention provides an image processing apparatus, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the slit width uniformity measurement method according to any one of the above-mentioned second aspects.
Compared with the prior art, the invention has the following beneficial effects:
The invention provides a slit width uniformity measuring system, a slit width uniformity measuring method and an image processing device. The slit width uniformity measuring system includes a light source, a slit to be detected, a first lens group, an image acquisition device and an image processing device, wherein: the light source is used for generating a light beam, and the light beam passes through the slit to be detected and then through the first lens group to reach the image acquisition device. The image acquisition device is used for acquiring the image formed by the slit to be detected through the first lens group so as to generate a target image, and for sending the target image to the image processing device, wherein the width direction of the image of the slit to be detected is parallel to a pixel row or a pixel column of the target image. The image processing device is used for detecting the target image to obtain the width uniformity coefficient of the slit to be detected. The invention enables quantitative evaluation of slit width uniformity and solves the problem of inaccurate judgment caused by the lack of a quantitative basis in the field of slit width uniformity measurement.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
FIG. 1 is a schematic diagram of a slit width uniformity measurement system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a slit to be measured;
FIG. 3 is a schematic diagram of a target image of a slit to be measured;
FIG. 4 is a schematic structural diagram of a slit width uniformity measurement system provided in accordance with another embodiment of the present invention;
FIG. 5 is a flowchart illustrating an implementation of a slit width uniformity measurement method according to an embodiment of the present invention;
FIG. 6 is a diagram of an image processing apparatus according to an embodiment of the present invention;
fig. 7 is a schematic diagram of an image processing apparatus according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
In order to make the objects, technical solutions and advantages of the present invention more apparent, the following description is made by way of specific embodiments with reference to the accompanying drawings.
Referring to fig. 1, a schematic structural diagram of a slit width uniformity measuring system provided by an embodiment of the present invention is shown. The slit width uniformity measuring system includes a light source 100, a slit 200 to be measured, a first lens group 300, an image acquisition device 400, and an image processing device 500.
A light source 100 for generating a light beam. The light beam passes through the slit 200 to be measured, and then passes through the first lens group 300 to reach the image capturing device 400.
The image acquisition device 400 is configured to acquire the image formed by the slit 200 to be measured through the first lens group 300 so as to generate a target image, and to send the target image to the image processing device 500, wherein the width direction of the image of the slit 200 to be measured is parallel to a pixel row or a pixel column of the target image.
The image processing device 500 is configured to detect a target image to obtain a width uniformity coefficient of the slit 200 to be detected.
In this embodiment, the light intensity of the light beam reaching the slit 200 to be measured is uniform.
The slit 200 to be measured may be rectangular, or may have another regular shape such as a trapezoid or a rhombus. Fig. 2 is a schematic diagram of a slit 200 to be measured, in which the non-shaded longitudinal region in the middle is the slit 200 to be measured. It can be observed that the two sides of the slit 200 to be measured are not completely parallel; the non-parallel regions are uneven burrs.
Optionally, the slit 200 to be measured is rectangular, 10 mm in height and 50 um in width. It should be noted that if the width of the slit 200 to be measured is small enough, for example less than 10 um, the diffraction of light will significantly interfere with the test result, and the error caused by diffraction needs to be taken into account during the measurement.
The first lens group 300 may include one or more lenses and serves to image the light beam that passes through the slit 200 to be measured onto the image acquisition device 400, so that a complete image of the slit 200 to be measured is formed. The preset ratio at which the slit 200 to be measured is imaged on the image acquisition device 400 can be controlled by selecting different lens groups.
Optionally, when a 1:1 lens group is used, the size of the image formed by the slit 200 to be measured equals the size of the slit 200 to be measured itself. The lenses in the lens group can be common machine-vision industrial lenses with focal lengths between 25 mm and 50 mm.
In this embodiment, the target image generated in the image capturing apparatus 400 may be as shown in fig. 3. The longitudinal white strip in the center of the target image is an image formed by the slit 200 to be detected, the width direction of the white strip is the width direction of the slit, the height direction of the white strip is the height direction of the slit, and the irregular part enclosed by the gray circle in the white strip is an image projected by uneven burrs in the slit 200 to be detected.
When the width direction of the image of the slit 200 to be measured is parallel to the pixel row or the pixel column of the target image, it can be ensured that the acquired pixel data is accurate and complete. The width direction of the image of the slit 200 to be measured shown in fig. 3 is parallel to the pixel row of the target image.
In a first possible implementation manner, the image of the slit 200 to be measured includes an image of a transparent region of the slit and an image projected by uneven burrs in the slit, and the image of the slit 200 to be measured is rectangular.
In a second possible implementation manner, the image of the slit 200 to be measured does not include the projected image of the uneven burrs, and the image of the slit 200 to be measured only includes the image of the transparent region of the slit 200 to be measured.
In an embodiment of the present invention, a slit width uniformity measurement system includes: a light source 100, a slit 200 to be measured, a first lens group 300, an image acquisition device 400 and an image processing device 500, wherein: the light source 100 is configured to generate a light beam, and the light beam passes through the slit 200 to be measured and then through the first lens group 300 to reach the image acquisition device 400. The image acquisition device 400 is configured to acquire the image formed by the slit 200 to be measured through the first lens group so as to generate a target image, and to send the target image to the image processing device 500, where the width direction of the image of the slit 200 to be measured is parallel to a pixel row or a pixel column of the target image. The image processing device 500 is used for detecting the target image to obtain the width uniformity coefficient of the slit 200 to be measured. The invention thus enables quantitative evaluation of slit width uniformity and solves the problem of inaccurate judgment caused by the lack of a quantitative basis in the field of slit width uniformity measurement.
Alternatively, on the basis of the embodiment shown in fig. 1, the image processing apparatus 500 is configured to: determine, for each target pixel row, the sum of the gray values of the pixels occupied by the image of the slit 200 to be measured in that row, wherein a target pixel row is a pixel row in which pixels of the image of the slit 200 to be measured are present; and determine the width uniformity coefficient of the slit 200 to be measured based on the sum of the gray values corresponding to each target pixel row.
If, in the target image acquired by the image acquisition device 400, the width direction of the image of the slit 200 to be measured is parallel to a pixel column of the target image, the image processing device 500 may first rotate the target image by 90 degrees so that the width direction of the image of the slit 200 to be measured becomes parallel to the pixel rows of the target image, and then perform the detection as described above. If the width direction of the image of the slit 200 to be measured in the target image is already parallel to the pixel rows of the target image, the image processing device 500 may perform the detection without rotating the target image.
The determination method of the target pixel row is not limited herein. In a possible implementation manner, the target pixel row may be determined by calculation, for example, the slit width is fixed, and the positions of the slit 200 to be measured, the first lens group 300, and the image acquisition device 400 are fixed, so that the position information of the target image formed after the light beam passes through the slit 200 to be measured may be calculated by an optical method, and then the position of the target pixel row is obtained. Alternatively, determining the target pixel row may also be implemented by using methods such as edge detection in an image recognition technology.
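For reference, a minimal sketch of the edge-detection route is given below, in Python with NumPy. The bright-slit-on-dark-background assumption, the global threshold and the function name are illustrative choices made for this example; the embodiment does not prescribe a particular algorithm.

```python
import numpy as np

def locate_slit_region(target_image, threshold=None):
    """Find the rows and columns of the target image that contain slit pixels
    by simple global thresholding (one possible realization of the
    edge-detection idea; assumptions noted in the text above)."""
    img = np.asarray(target_image, dtype=np.float64)
    if threshold is None:
        # assumed midpoint threshold between background and slit brightness
        threshold = 0.5 * (img.min() + img.max())
    mask = img > threshold
    target_rows = np.flatnonzero(mask.any(axis=1))   # target pixel rows
    slit_columns = np.flatnonzero(mask.any(axis=0))  # columns of the slit image
    return target_rows, slit_columns
```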
Optionally, the image processing apparatus 500 is specifically configured to: normalizing the sum of the gray values corresponding to each row of target pixel rows; and determining the sum of the gray values corresponding to each target pixel row after normalization processing as the width uniformity coefficient of the slit 200 to be measured.
Optionally, the image processing apparatus 500 is specifically configured to: determining the reciprocal of the maximum value in the sum of the gray values corresponding to all the target pixel rows as a normalization coefficient; and multiplying the sum of the gray values corresponding to each line of target pixel lines by a normalization coefficient to obtain the sum of the gray values corresponding to each line of target pixel lines after normalization processing.
The normalization of the sum of the gray values corresponding to each target pixel row is not limited to this method; the normalization coefficient may be chosen differently. For example, the reciprocal of the average of the sums of the gray values over all target pixel rows may be used as the normalization coefficient, which likewise reflects the differences between the gray-value sums of different target pixel rows.
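Written out, with notation introduced here only for readability (g(i,j) for the gray value of the j-th slit pixel in the i-th target pixel row, m for the number of pixels the slit image occupies in the width direction, and n for the number of target pixel rows; none of these symbols appear in the original text), the two variants read:

$$S_i=\sum_{j=1}^{m} g(i,j),\qquad B=\frac{1}{\max_{k} S_k}\ \text{or}\ B=\frac{n}{\sum_{k=1}^{n} S_k},\qquad A_i=B\,S_i,$$

where the coefficients A_1, ..., A_n are the width uniformity coefficients of the slit to be measured.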
Optionally, the number of pixels occupied by the image of the slit 200 to be measured is a preset number, and the ratio between the width of the image of the slit 200 to be measured and the width of the slit 200 to be measured is a preset ratio. The image processing device 500 is configured to determine the width of the image of the slit 200 to be measured based on the preset ratio and the width of the slit 200 to be measured, and determine the preset number based on the width of the image of the slit 200 to be measured and the size parameter of the image acquisition device 400.
For example, the width of the slit 200 to be measured is 50 um, the size parameter of the image acquisition device 400 includes the pixel size, the pixel side length is 5 um, and the preset ratio of the first lens group 300 is 1:1, so the length and width of the image of the slit 200 to be measured equal those of the slit 200 to be measured itself; the number of pixels occupied by the image of the slit 200 to be measured in the width direction is therefore 10. The number of pixels occupied by the image in the height direction can be calculated in the same way.
For another example, the width of the slit 200 to be measured is 50 um, the pixel side length is 5 um, and the preset ratio of the first lens group 300 is 1:5, so the width of the image of the slit 200 to be measured is 250 um and the number of pixels occupied by the image in the width direction is 50. The number of pixels occupied by the image in the height direction can be calculated in the same way.
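The arithmetic of these two examples can be written as a small helper; the function name and interface below are illustrative and not taken from the embodiment.

```python
def preset_pixel_count(slit_width_um, magnification, pixel_pitch_um):
    """Number of pixels spanned by the slit image in the width direction."""
    image_width_um = slit_width_um * magnification
    return round(image_width_um / pixel_pitch_um)

preset_pixel_count(50, 1, 5)  # -> 10 pixels for a 1:1 lens group
preset_pixel_count(50, 5, 5)  # -> 50 pixels for a 1:5 lens group
```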
Optionally, the light source 100 includes an integrating sphere light source. An integrating sphere light source provides a uniform surface beam; the larger the sphere volume, the better, and its exit opening should be more than twice the slit height, so that the emergent light is sufficiently uniform.
Optionally, the image acquisition device 400 includes a CCD (charge-coupled device) camera. When debugging the CCD camera, the region where the image of the slit 200 to be measured is formed must not be overexposed; overexposure distorts the acquired information, and an accurate uniformity coefficient then cannot be obtained.
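As a debugging aid, a simple saturation check of the slit image region could look like the sketch below; it is an illustrative helper rather than part of the embodiment, and the full-scale value of 255 assumes 8-bit camera output and should match the actual bit depth of the CCD camera.

```python
import numpy as np

def is_overexposed(slit_region, full_scale=255):
    """True if any pixel in the slit image region has reached the sensor's
    full-scale gray value; saturated pixels distort the gray-value sums."""
    return bool(np.max(slit_region) >= full_scale)
```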
Fig. 4 is a schematic structural diagram of a slit width uniformity measuring system according to another embodiment of the present invention. Referring to fig. 4, in addition to any of the above embodiments, the slit width uniformity measuring system further includes a second lens group 600, and the light beam passes through the second lens group 600 before passing through the slit 200 to be measured.
The second lens group 600 may include one or more lenses for imaging the light beam generated by the light source 100 onto the plane of the slit 200 to be measured. The predetermined ratio of the light beam generated by the light source 100 to be imaged on the plane of the slit 200 to be measured can be controlled by selecting different lens groups. The second lens group 600 can improve the uniformity of the light intensity when the light beam emitted from the light source 100 reaches the slit 200 to be measured.
Optionally, when a 1:1 lens group is selected, the size of the image formed by the light beam emitted from the light source 100 equals the size of the light beam itself. In addition, although the first lens group 300 and the second lens group 600 also have a certain influence on the light intensity distribution in the slit width uniformity measurement system, their influence coefficients are fixed: they can be measured in advance or looked up in the specifications of the lens groups, and when a slit width uniformity measurement is actually performed, the influence coefficients of the lens groups can be taken into account in the measurement result.
Fig. 5 is a flowchart of an implementation of a slit width uniformity measurement method according to an embodiment of the present invention, where the method includes:
s501, obtaining a target image, wherein the target image comprises an image of the slit to be detected, and the width direction of the image of the slit to be detected is parallel to the pixel line of the target image.
In the present embodiment, the target image is acquired from the image pickup device by the image processing device. The target image comprises the position of the image of the slit to be measured and the gray value of each pixel. The width direction of the image of the slit to be measured is parallel to the pixel rows of the target image, so that the operation on each row of target pixel rows can be realized, and the accuracy of the acquired data can be ensured.
S502, for each target pixel row, determining the sum of the gray values of the pixels occupied by the image of the slit to be measured in that row, wherein a target pixel row is a pixel row in which pixels of the image of the slit to be measured are present.
Optionally, in this embodiment, the target pixel rows may be determined by calculation: for example, since the slit width is fixed and the positions of the slit to be measured, the first lens group and the image acquisition device are fixed, the position of the image formed by the light beam passing through the slit to be measured can be calculated by optical methods, and the positions of the target pixel rows are then obtained.
Alternatively, determining the target pixel row may also be implemented by using methods such as edge detection in an image recognition technology.
After the target pixel row is determined, the sum of the gray values of the pixels occupied by the pixels of the slit to be measured in each row of the target pixel row is determined.
S503, determining the width uniformity coefficient of the slit to be measured based on the sum of the gray values corresponding to each row of target pixel rows.
Optionally, determining a width uniformity coefficient of the slit to be measured based on the sum of the gray values corresponding to each row of the target pixel rows specifically includes:
normalizing the sum of the gray values corresponding to each row of target pixel rows; and determining the sum of the gray values corresponding to each target pixel row after normalization processing as the width uniformity coefficient of the slit to be measured.
Optionally, the process of performing normalization processing on the sum of the gray values corresponding to each row of target pixel rows specifically includes: determining the reciprocal of the maximum value in the sum of the gray values corresponding to all the target pixel rows as a normalization coefficient; and multiplying the sum of the gray values corresponding to each line of target pixel lines by a normalization coefficient to obtain the sum of the gray values corresponding to each line of target pixel lines after normalization processing.
For example, suppose the image of the slit to be measured occupies 1000 rows of pixels in the slit height direction and 10 pixels in the slit width direction. The image is then divided into 1000 equal parts along the slit height, giving 1000 target pixel rows, and the width uniformity coefficients are denoted A1, A2, A3, …, A1000; each coefficient is obtained from the sum of the gray values of the corresponding target pixel row. Let a(i,j) denote the gray value of the pixel in the i-th row and j-th column, and let B denote the normalization coefficient. The uniformity coefficient corresponding to the i-th target pixel row is then Ai = [a(i,1) + a(i,2) + … + a(i,10)] × B. In particular, A1 = [a(1,1) + a(1,2) + … + a(1,10)] × B for the first row and A2 = [a(2,1) + a(2,2) + … + a(2,10)] × B for the second row. The values A1, A2, A3, …, A1000 obtained by processing all the target pixel rows in this way are the width uniformity coefficients of the slit.
More generally, if the image of the slit to be measured is divided into n equal parts in the height direction, the final width uniformity coefficient is a 1 × n one-dimensional array.
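As a concrete sketch of this calculation in Python with NumPy (the function name and the array interface are assumptions made for illustration; only the arithmetic itself comes from the embodiment):

```python
import numpy as np

def width_uniformity_coefficients(target_image, target_rows, slit_columns):
    """Width uniformity coefficients A1..An as described above.

    target_image : 2-D array of gray values, slit width parallel to pixel rows
    target_rows  : indices of the n target pixel rows (1000 in the example)
    slit_columns : indices of the pixels occupied by the slit image in each row
                   (10 in the example: 50 um slit, 1:1 imaging, 5 um pixels)
    Returns a 1 x n one-dimensional array of coefficients.
    """
    img = np.asarray(target_image, dtype=np.float64)
    # sum of the gray values of the slit pixels in every target pixel row
    row_sums = img[np.ix_(target_rows, slit_columns)].sum(axis=1)
    # normalization coefficient B: reciprocal of the maximum row sum
    B = 1.0 / row_sums.max()
    return row_sums * B  # A_i = [a(i,1) + ... + a(i,m)] * B
```

Replacing row_sums.max() with row_sums.mean() gives the mean-based normalization mentioned in the following paragraph.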
Optionally, the normalization of the sum of the gray values corresponding to each target pixel row is not limited to the method above; the normalization coefficient may be chosen differently. For example, the reciprocal of the average of the sums of the gray values over all target pixel rows may be used as the normalization coefficient, which likewise reflects the differences between the gray-value sums of different target pixel rows.
The method includes the steps of firstly obtaining a target image, wherein the target image comprises a slit image to be detected, the width direction of the slit image to be detected is parallel to a pixel row of the target image, then determining the sum of gray values of pixels occupied by the slit image to be detected in the target pixel row aiming at each row of target pixel row, wherein the target pixel row comprises the pixel row of the slit image to be detected, and finally determining a width uniformity coefficient of the slit to be detected based on the sum of the gray values corresponding to the target pixel row of each row. The embodiment of the invention can realize quantitative judgment of the slit width uniformity in the above way, and can solve the problem of inaccurate judgment caused by lack of quantitative judgment basis in the current slit width uniformity measurement field.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
The following are embodiments of the apparatus of the invention, reference being made to the corresponding method embodiments described above for details which are not described in detail therein.
Fig. 6 is a schematic structural diagram of an image processing apparatus 60 according to an embodiment of the present invention, and for convenience of description, only the parts related to the embodiment of the present invention are shown, and detailed as follows:
as shown in fig. 6, the image processing apparatus 60 includes: an acquisition unit 601, a first determination unit 602, a second determination unit 603.
The acquiring unit 601 is configured to acquire a target image, where the target image includes an image of a slit to be measured, and a width direction of the image of the slit to be measured is parallel to a pixel row of the target image.
A first determining unit 602, configured to determine, for each line of target pixel lines, a sum of gray values of pixels occupied by pixels of a slit to be tested in the target pixel line, where the target pixel line is a pixel line of pixels in which the slit to be tested exists.
A second determining unit 603, configured to determine a width uniformity coefficient of the slit to be measured based on a sum of gray values corresponding to each target pixel row.
The method includes the steps of firstly obtaining a target image, wherein the target image comprises a slit image to be detected, the width direction of the slit image to be detected is parallel to a pixel row of the target image, then determining the sum of gray values of pixels occupied by the slit image to be detected in the target pixel row aiming at each row of target pixel row, wherein the target pixel row comprises the pixel row of the slit image to be detected, and finally determining the width uniformity coefficient of the slit to be detected based on the sum of the gray values corresponding to each row of target pixel row. The embodiment of the invention can realize quantitative judgment of the slit width uniformity in the above way, and can solve the problem of inaccurate judgment caused by lack of quantitative judgment basis in the current slit width uniformity measurement field.
Optionally, the second determining unit 603 is specifically configured to:
normalizing the sum of the gray values corresponding to each row of target pixel rows;
and determining the sum of the gray values corresponding to each target pixel row after normalization processing as the width uniformity coefficient of the slit to be measured.
Optionally, the second determining unit 603 is specifically configured to:
determining the reciprocal of the maximum value in the sum of the gray values corresponding to all the target pixel rows as a normalization coefficient;
and multiplying the sum of the gray values corresponding to each line of target pixel lines by a normalization coefficient to obtain the sum of the gray values corresponding to each line of target pixel lines after normalization processing.
The image processing apparatus provided in this embodiment may be used to implement the method embodiments described above, and the implementation principle and technical effect are similar, which are not described herein again.
Fig. 7 is a schematic diagram of an image processing apparatus according to an embodiment of the present invention. As shown in fig. 7, the image processing apparatus 7 of this embodiment includes: a processor 70, a memory 71 and a computer program 72 stored in said memory 71 and executable on said processor 70. The processor 70, when executing the computer program 72, implements the steps in the various slit width uniformity measurement method embodiments described above, such as steps 501-503 shown in fig. 5. Alternatively, the processor 70, when executing the computer program 72, implements the functions of the units in the above-described device embodiments, for example, the functions of the units 601 to 603 shown in fig. 6.
Illustratively, the computer program 72 may be divided into one or more units, which are stored in the memory 71 and executed by the processor 70 to accomplish the present invention. The one or more units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 72 in the image processing apparatus 7. For example, the computer program 72 may be divided into an acquisition unit 601, a first determination unit 602, and a second determination unit 603, and the specific functions of each unit are as follows:
the acquiring unit 601 is configured to acquire a target image, where the target image includes an image of a slit to be measured, and a width direction of the image of the slit to be measured is parallel to a pixel row of the target image.
A first determining unit 602, configured to determine, for each line of target pixel lines, a sum of gray values of pixels occupied by a pixel of a slit to be tested in the target pixel line, where the target pixel line is a pixel line of a pixel of a slit to be tested.
A second determining unit 603, configured to determine a width uniformity coefficient of the slit to be measured based on a sum of gray values corresponding to each target pixel row.
The image processing device 7 may be a desktop computer, a notebook computer, a palm computer, a cloud server, or other computing devices. The image processing device may include, but is not limited to, a processor 70, a memory 71. It will be understood by those skilled in the art that fig. 7 is only an example of the image processing apparatus 7, and does not constitute a limitation to the image processing apparatus 7, and may include more or less components than those shown, or combine some components, or different components, for example, the image processing apparatus may further include an input-output device, a network access device, a bus, and the like.
The Processor 70 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 71 may be an internal storage unit of the image processing apparatus 7, such as a hard disk or a memory of the image processing apparatus 7. The memory 71 may also be an external storage device of the image processing apparatus 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like provided on the image processing apparatus 7. Further, the memory 71 may also include both an internal storage unit and an external storage device of the image processing apparatus 7. The memory 71 is used to store the computer program and other programs and data required by the image processing apparatus. The memory 71 may also be used to temporarily store data that has been output or is to be output.
It will be clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional units is merely illustrated, and in practical applications, the above distribution of functions may be performed by different functional units according to needs, that is, the internal structure of the apparatus may be divided into different functional units to perform all or part of the functions described above. Each functional unit in the embodiments may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units are only used for distinguishing one functional unit from another, and are not used for limiting the protection scope of the application. The specific working process of the units in the system may refer to the corresponding process in the foregoing method embodiment, and is not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed image processing apparatus and method may be implemented in other ways. For example, the above-described embodiments of the image processing apparatus are merely illustrative, and for example, the division of the units is only one logical function division, and there may be other division manners in actual implementation, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments may be implemented. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain other components which may be suitably increased or decreased as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media which may not include electrical carrier signals and telecommunications signals in accordance with legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (9)

1. A slit width uniformity measurement system, comprising: a light source, a slit to be detected, a first lens group, an image acquisition device and an image processing device, wherein:
the light source is used for generating a light beam; the light beam passes through the slit to be detected and then passes through the first lens group to reach the image acquisition device;
the image acquisition device is used for acquiring an image formed by the slit to be detected through the first lens group so as to generate a target image and sending the target image to the image processing device; the width direction of the image of the slit to be detected is parallel to the pixel row or the pixel column of the target image;
The image processing device is used for detecting the target image, determining, for each target pixel row, the sum of the gray values of the pixels occupied by the image of the slit to be detected in that row, and determining the width uniformity coefficient of the slit to be detected based on the sum of the gray values corresponding to each target pixel row, wherein a target pixel row is a pixel row in which pixels of the image of the slit to be detected are present.
2. The slit width uniformity measurement system of claim 1, further comprising a second lens group through which the light beam passes before passing through the slit to be detected.
3. The slit width uniformity measurement system of claim 1, wherein the image processing device is specifically configured to:
normalizing the sum of the gray values corresponding to each row of target pixel rows;
and determining the sum of the gray values corresponding to each target pixel row after normalization processing as the width uniformity coefficient of the slit to be measured.
4. The slit width uniformity measurement system of claim 3, wherein the image processing device is specifically configured to:
determining the reciprocal of the maximum value in the sum of the gray values corresponding to all the target pixel rows as a normalization coefficient;
And multiplying the sum of the gray values corresponding to each row of target pixel rows by the normalization coefficient to obtain the sum of the gray values corresponding to each row of target pixel rows after normalization processing.
5. The slit width uniformity measurement system of claim 1, wherein the number of pixels occupied by the image of the slit to be measured is a preset number;
the ratio of the width of the image of the slit to be detected to the width of the slit to be detected is a preset ratio;
the image processing device is used for determining the width of the image of the slit to be detected based on the preset proportion and the width of the slit to be detected; and determining the preset number based on the width of the image of the slit to be detected and the size parameter of the image acquisition device.
6. The slit width uniformity measurement system of any of claims 1-5, wherein the light source comprises:
an integrating sphere light source;
the image acquisition device includes:
a CCD camera.
7. A slit width uniformity measuring method is characterized by comprising the following steps:
acquiring a target image, wherein the target image comprises an image of a slit to be detected, and the width direction of the image of the slit to be detected is parallel to a pixel row of the target image;
determining, for each target pixel row, the sum of the gray values of the pixels occupied by the image of the slit to be detected in that row, wherein a target pixel row is a pixel row in which pixels of the image of the slit to be detected are present;
and determining the width uniformity coefficient of the slit to be detected based on the sum of the gray values corresponding to each target pixel row.
8. The slit width uniformity measuring method of claim 7, wherein determining the width uniformity coefficient of the slit to be measured based on the sum of the gray values corresponding to each row of the target pixels comprises:
normalizing the sum of the gray values corresponding to each row of target pixel rows;
and determining the sum of the gray values corresponding to each target pixel row after normalization processing as the width uniformity coefficient of the slit to be measured.
9. An image processing apparatus comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the slit width uniformity measuring method according to claim 7 or 8 above when executing the computer program.
CN202110192686.9A 2021-02-20 2021-02-20 Slit width uniformity measuring system, slit width uniformity measuring method and image processing device Active CN112985269B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110192686.9A CN112985269B (en) 2021-02-20 2021-02-20 Slit width uniformity measuring system, slit width uniformity measuring method and image processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110192686.9A CN112985269B (en) 2021-02-20 2021-02-20 Slit width uniformity measuring system, slit width uniformity measuring method and image processing device

Publications (2)

Publication Number Publication Date
CN112985269A CN112985269A (en) 2021-06-18
CN112985269B 2022-09-13

Family

ID=76393667

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110192686.9A Active CN112985269B (en) 2021-02-20 2021-02-20 Slit width uniformity measuring system, slit width uniformity measuring method and image processing device

Country Status (1)

Country Link
CN (1) CN112985269B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111316086A (en) * 2019-04-04 2020-06-19 合刃科技(深圳)有限公司 Optical detection method for surface defects and related device

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4206499C2 (en) * 1992-03-02 1994-03-10 Haeusler Gerd Distance measuring method and device
CN102853777B (en) * 2012-10-15 2016-04-06 山东大学 Based on brake clearance method for measuring width and the equipment of image procossing
CN104535006B (en) * 2015-01-21 2017-10-27 杭州电子科技大学 A kind of bottle cap gap width evaluation method of utilization transmission-type illumination imaging systems
CN106469451A (en) * 2016-08-31 2017-03-01 浙江捷尚视觉科技股份有限公司 Gap detection device and detection method
CN107256689B (en) * 2016-11-09 2020-09-18 长春希达电子技术有限公司 Uniformity repairing method for LED display screen after brightness correction
CN108088381B (en) * 2017-12-13 2020-02-07 湖北汽车工业学院 Non-contact type micro gap width measuring method based on image processing
CN110446019B (en) * 2019-07-17 2022-02-01 成都理想境界科技有限公司 Optical fiber scanning projection system and modulation method thereof
CN110930407B (en) * 2020-02-07 2020-05-15 西南交通大学 Suspension gap visual detection method based on image processing
CN111750789B (en) * 2020-06-08 2021-10-01 北京工业大学 Tooth pitch deviation and tooth profile deviation evaluation method in small module gear vision measurement
CN112291446B (en) * 2020-10-22 2022-03-01 中国科学院长春光学精密机械与物理研究所 Non-uniformity correction method for large-area array CMOS image sensor

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111316086A (en) * 2019-04-04 2020-06-19 合刃科技(深圳)有限公司 Optical detection method for surface defects and related device

Also Published As

Publication number Publication date
CN112985269A (en) 2021-06-18

Similar Documents

Publication Publication Date Title
US7769243B2 (en) Method and apparatus for image inspection
JP6319329B2 (en) Surface attribute estimation using plenoptic camera
US20120013760A1 (en) Characterization of image sensors
Molleda et al. An improved 3D imaging system for dimensional quality inspection of rolled products in the metal industry
US9471984B2 (en) Method for self-calibration of a microscope apparatus
CN109767425B (en) Machine vision light source uniformity evaluation device and method
CN110335204B (en) Thermal imaging image enhancement method
WO2018223267A1 (en) Method and system for hyperspectral light field imaging
CN112233076B (en) Structural vibration displacement measurement method and device based on red round target image processing
CN111369484B (en) Rail profile detection method and device
CN113962876B (en) Pixel distortion correction method, correction device and terminal
CN110879131A (en) Imaging quality testing method and imaging quality testing device for visual optical system, and electronic apparatus
JP7170590B2 (en) IMAGE PROCESSING APPARATUS AND IMAGE PROCESSING METHOD FOR SUPER RESOLUTION MICROSCOPE
CN112985269B (en) Slit width uniformity measuring system, slit width uniformity measuring method and image processing device
US7518712B2 (en) Tilted edge for optical-transfer-function measurement
Meißner et al. Towards standardized evaluation of image quality for airborne camera systems
CN111336938A (en) Robot and object distance detection method and device thereof
US10504802B2 (en) Target location in semiconductor manufacturing
CN112067136B (en) Drift correction method and device for photothermal reflection microscopic thermal imaging
Baer Circular-edge spatial frequency response test
CN113624358A (en) Three-dimensional displacement compensation method and control device for photothermal reflection microscopic thermal imaging
CN113939759A (en) System and method for correcting focal plane variations in a field of view of a microscope objective
Perezyabov et al. Comparative analysis of resolution measurement methods for the optoelectronic systems
JP2723914B2 (en) Lens barrel resolution inspection system
KR20000060731A (en) Calibraion method of high resolution photographing equipment using multiple imaging devices.

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant