CN113902894B - Automatic reading identification method for strip level based on image processing - Google Patents

Automatic reading identification method for strip level based on image processing

Info

Publication number
CN113902894B
CN113902894B · Application CN202111249287.8A
Authority
CN
China
Prior art keywords
image
level
area
bubble
reading
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111249287.8A
Other languages
Chinese (zh)
Other versions
CN113902894A (en)
Inventor
孙晓艳
唐圣金
杨百龙
郭君斌
康凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rocket Force University of Engineering of PLA
Original Assignee
Rocket Force University of Engineering of PLA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Rocket Force University of Engineering of PLA filed Critical Rocket Force University of Engineering of PLA
Priority to CN202111249287.8A priority Critical patent/CN113902894B/en
Publication of CN113902894A publication Critical patent/CN113902894A/en
Application granted granted Critical
Publication of CN113902894B publication Critical patent/CN113902894B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an automatic reading identification method for a strip level based on image processing, comprising the following steps: constructing a level image acquisition system; preprocessing the image to be identified, extracting the reading area of the level and setting it as the region of interest for subsequent level detection; within the region of interest, extracting the graduation lines of the level based on edge, color and area features; detecting the bubble of the level within the region of interest by template matching; and calculating the reading of the level from the positional relationship between the bubble edge and the graduation lines, thereby realizing automatic reading of the level and judging the level state of the system under test. Building on the image acquisition system, the filtering and denoising preprocessing, the positioning of the level detection area, and the extraction of the graduation lines and the bubble target, the whole-graduation reading is determined by an algorithm based on the positional relationship between the bubble and the graduation lines, fine recognition is then performed on that basis, and the accuracy of automatic recognition of level readings is improved.

Description

Automatic reading identification method for strip level based on image processing
Technical Field
The invention relates to a reading identification method for a strip level, and in particular to an automatic reading identification method for a strip level based on image processing.
Background
The strip level is a measuring instrument for detecting flatness, parallelism and perpendicularity; it has a simple structure, is convenient to operate and measures accurately. As a physical instrument, its measurement accuracy is little affected by ambient temperature and humidity, its measurement errors do not accumulate over time, and levels with different graduation values can be used to detect levelness at different accuracies, so the measurement accuracy can be chosen flexibly. In particular, it is inexpensive, which greatly reduces equipment cost and improves practicality.
At present, research by domestic manufacturers and research institutions on high-precision automatic reading for perpendicularity measurement with a strip level is not mature; the level is mainly read by the operator's eye. In industrial applications this is inefficient, and reading errors caused by eye fatigue and similar factors often occur and seriously affect the metering results.
In recent years, replacing manual reading with computer-vision-based reading has become a research hotspot. At present, computer-vision reading is mainly realized with image-processing algorithms, but existing algorithms suffer from low accuracy when recognizing level readings.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides an automatic reading identification method for a strip level: on the basis of image denoising and detection-area positioning, the reading is determined by graduation-line detection using edge features and a bubble-detection algorithm using template matching, and fine recognition is then performed, which effectively improves the accuracy of automatic reading identification of the strip level.
The aim of the invention is realized by the following technical scheme:
An automatic reading identification method of a strip level based on image processing comprises the following steps:
s1, constructing a level meter image acquisition system;
S2, collecting a plurality of images containing the level, preprocessing the images to be identified, and removing image noise;
S3, extracting the reading area of the level from the preprocessed image by a template matching method, and setting it as the region of interest for subsequent level detection;
S4, in the detection region of interest, extracting scale marks of the level based on the characteristics of edges, colors, areas and the like;
s5, detecting bubbles of the level meter based on a template matching method in the detection region of interest;
S6, calculating a reading value of the level according to the position relation between the bubble edge position and the scale mark, so that an automatic reading function of the level is realized, and the level state of the system to be detected is judged.
The invention is further improved in that the specific implementation of the image acquisition system in step S1 is as follows:
Considering image quality and the stability of the detection system, a module camera of model IMX274 is selected as the image acquisition module of the system. A double-link hinged mounting is adopted, with the link bracket formed by hinging two brackets: one end is fixed, and the other end is connected to the camera through a knob assembly, so that the camera direction can be adjusted through 360 degrees and the camera position can be adjusted in the XOZ plane via the double-link hinge.
The invention is further improved in that the specific implementation of the image preprocessing in step S2 is as follows:
S201, collecting a plurality of images containing the level; the original images are RGB images composed of the three primary colors red, green and blue, and they are converted to grayscale by the weighted-average method, expressed as:
Y=0.299×R+0.587×G+0.114×B (1)
S202, denoising with a Gaussian filter: all pixels of the target image are weighted and averaged, i.e. the value of each pixel is calculated as the weighted average of the center pixel and the pixels in its neighborhood.
The invention is further improved in that the specific implementation of extracting the level detection region of interest in step S3 is as follows:
S301, selecting any level image as a sample image, wherein the level image comprises a glass vial that displays the scale and a black housing marked with the manufacturer's characters and the graduation value;
S302, selecting the reading area of the level in the sample image with a rectangular frame, recording the position coordinates of the four vertices of the rectangular frame around the reading area, calculating the width and height of the template as the size parameters for later template normalization, and taking the rectangular reading area of the level in the sample image as the template;
s303, normalizing the size of the template image;
S304, performing template matching calculation on each image to be identified, and traversing each image by adopting a full search method;
S305, calculating the correlation value between the template and the image to be detected: the template pixels are multiplied by the corresponding target-image pixels and the mean of the products is taken as the correlation coefficient; all correlation values calculated in this step are stored. The correlation value is computed as:
R(x,y) = Σ_{x′,y′} ( T′(x′,y′) · I′(x+x′, y+y′) )   (4)
where R is the correlation coefficient; I(x,y) and T(x,y) are the pixel coordinates of the original image and the template image; and w and h are the width and height of the template;
S306, selecting the candidate image area with the largest correlation value, comparing this value with a preset threshold, and judging whether the correlation value is greater than the preset threshold:
if yes, the reading area of the level is considered detected, and the reading image area is cropped from the image to be identified at the candidate reading position with the maximum correlation value, to be used as the detection region of interest in the next step;
if not, the reading area of the level is not recognized, and the method returns to step S1 to preprocess the next image to be identified.
The invention is further improved in that the specific implementation method for extracting the scale marks of the level gauge in the step S4 is as follows:
S401, acquiring an image containing the level, performing image denoising, and feeding the image to the trained region-of-interest extraction model of step S304; the level detection region of interest in the image is obtained according to steps S304-S306 and is cropped and saved;
S402, binarizing the image of the level region of interest: within a fixed region, all pixels of the region of interest are compared with a threshold; pixels greater than the threshold are set to 255, otherwise to 0;
S403, taking the black regions in the binary image as targets, extracting the contours of all target blocks, calculating the key features of each contour region, and saving their feature values;
S404, screening the target-block contours to extract the required graduation-line regions, comparing the feature values with preset thresholds and judging whether the preset conditions are met:
if the contour area is within the threshold range, the contour is retained; if not, the contour region is deleted;
if, in addition, the ordinate of the centroid of the contour block is greater than a preset threshold, the region is saved as a graduation-line region; if not, the contour region is deleted; the feature values of the extracted graduation-line regions are saved;
S405, sorting the graduation lines according to their positional relationship, wherein the graduation lines are numbered from the inside outward and are symmetric left and right.
The invention is further improved in that the specific implementation method for extracting the air bubbles of the level based on the template matching in the step S5 is as follows:
S501, selecting any level image as a sample image and selecting its bubble area as the template; the position coordinates of the four vertices of the rectangular frame around the bubble area are recorded, and the width and height of the template are calculated as the size parameters for later template normalization;
s502, normalizing the size of the template image;
s503, performing template matching calculation on the detected region of interest obtained in the step S4, and traversing the image of the region of interest by adopting a full search method;
s504, calculating a correlation value of the template and the image to be detected;
S505, selecting a candidate image area with the largest correlation value, comparing the candidate image area with a preset threshold value, and judging whether the correlation value is larger than the preset threshold value or not:
if yes, the bubble area is considered to be detected;
if not, the bubble area is not recognized, the threshold is corrected, and the steps S503 to S505 are repeated to perform the next search.
The invention is further improved in that the specific implementation method of the level reading in the step S6 is as follows:
S601, in theory the spacings between adjacent graduation lines are equal; measured with a ruler, the spacing between two graduation lines is 2 mm. The pixel distance between adjacent graduation lines is calculated as the difference of the abscissas of the graduation-line centroids; the difference between two graduation lines is about 28 pixels, with an error of 1 to 2 pixels;
S602, the graduation lines of the level were sorted in order in step S405. From the structure of the level, when the left and right edges of the bubble coincide with the fourth graduation lines the level is at the zero position and is horizontal; the fourth graduation line is therefore set as the zero graduation line, and the upper-left corner coordinate of the rectangular frame represents the column position of the bubble edge. When the bubble is to the left of the zero graduation line, i.e. the bubble abscissa is smaller than that of the zero graduation line, the reading is marked negative; when the bubble is to the right of the zero graduation line, i.e. its abscissa is larger, the reading is marked positive;
S603, detecting the graduation line nearest to the bubble and denoting it A; C denotes the zero graduation line and B the graduation line between A and C. The number of graduations between the nearest graduation line A and the zero graduation line C is the primary gradient; when the bubble lies between the two graduation lines A and B, the distance from the bubble to the nearest graduation line A, expressed as a percentage of the spacing between A and B, is the secondary gradient;
S604, judging the positional relationship between the bubble and the zero graduation line: if the bubble edge is far from the zero graduation line, the secondary gradient is subtracted from the primary gradient; if the bubble edge is adjacent to the zero graduation line, the level reading is the primary gradient plus the secondary gradient;
S605, judging whether the final reading is positive or negative: if the bubble is to the left of the zero graduation line the reading is negative, otherwise it is positive; the final reading result is thus obtained.
The invention has at least the following beneficial technical effects:
The invention provides a method for measuring perpendicularity with a strip level. The strip level is a measuring instrument for detecting levelness; it has a simple structure, is convenient to operate and measures accurately. As a physical instrument, its measurement accuracy is little affected by ambient temperature and humidity, its measurement errors do not accumulate over time, and levels with different graduation values can be used to detect levelness at different accuracies, so the selection is flexible. Most importantly, the strip level is inexpensive, which greatly reduces the cost.
Furthermore, the constructed image acquisition system applies filtering and denoising preprocessing to the image to be identified, which improves image quality; the reading region of interest of the level is extracted by template matching, which effectively reduces interference in the image-processing stage and improves detection efficiency; after the graduation lines and the bubble target are extracted, the graduation reading is determined by an algorithm based on the positional relationship between the bubble and the graduation lines, and fine recognition is then performed on that basis. This effectively improves the accuracy of automatic reading identification of the strip level and has good engineering application value.
Drawings
FIG. 1 is a schematic flow chart of an algorithm of the invention;
FIG. 2 is a graph of the results of an image filtering noise reduction process according to one embodiment of the present invention;
FIG. 3 is a schematic diagram of a detection zone template according to an embodiment of the present invention;
FIG. 4 is a graph of the extraction results of detecting a region of interest according to one embodiment of the present invention;
FIG. 5 is a graph of a segmentation effect according to an embodiment of the present invention;
FIG. 6 is a graph of the profile inspection results according to one embodiment of the present invention;
FIG. 7 is a schematic diagram of contour zoning according to an embodiment of the present invention;
FIG. 8 is a tick mark profile view according to one embodiment of the invention;
FIG. 9 is a tick mark ordering schematic according to one embodiment of the invention;
FIG. 10 is a schematic diagram of a bubble template according to one embodiment of the invention;
FIG. 11 is a graph showing the results of bubble detection in accordance with one embodiment of the present invention;
FIG. 12 is a schematic diagram of the relationship between bubble and tick mark according to an embodiment of the present invention;
FIG. 13 is a flow chart of the level calculation according to one embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other. The invention will be described in detail below with reference to the drawings in connection with embodiments.
As shown in fig. 1, an automatic reading identification method of a strip level based on image processing comprises the following steps:
s1, constructing a level meter image acquisition system;
S2, collecting a plurality of images with a level meter, and preprocessing the images to be identified;
s3, extracting a reading area of the level by adopting a template matching method, and setting the reading area as a detection interest area of the later level;
S4, in the detection region of interest, extracting scale marks of the level based on the characteristics of edges, colors, areas and the like;
s5, detecting bubbles of the level meter based on a template matching method in the detection region of interest;
S6, calculating a reading value of the level according to the position relation between the bubble edge position and the scale mark, so that an automatic reading function of the level is realized, and the level state of the system to be detected is judged.
In an embodiment of the present application, the image acquisition system described in step S1 includes:
Considering image quality and the stability of the detection system, a module camera of model IMX274 is selected as the image acquisition module of the system. A double-link hinged mounting is adopted, with the link bracket formed by hinging two brackets: one end is fixed, for example to a tabletop; the other end is connected to the camera through a knob assembly, so that the camera direction can be adjusted through 360 degrees and the camera position can be adjusted in the XOZ plane via the double-link hinge.
In an embodiment of the present application, the step S2 includes:
S201, collecting a plurality of images containing the level; the original images are RGB images composed of the three primary colors red (R), green (G) and blue (B), and they are converted to grayscale by the weighted-average method, expressed as:
Y=0.299×R+0.587×G+0.114×B (1)
S202, during image acquisition and transmission the captured image is affected by the surrounding environment, so it contains noise and some image features are severely lost or altered, which hinders target detection on the level. Denoising is therefore performed with a Gaussian filter: all pixels of the target image are weighted and averaged, i.e. the value of each pixel is calculated as the weighted average of the center pixel and the pixels in its neighborhood. The denoising result is shown in fig. 2.
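The grayscale conversion and Gaussian filtering above can be sketched as follows (a minimal illustration assuming OpenCV and NumPy; the kernel size, sigma and file name are illustrative choices, not values fixed by the invention):

import cv2

def preprocess(path="level_image.jpg"):
    # Load the image; OpenCV returns channels in B, G, R order
    bgr = cv2.imread(path)
    # Weighted-average grayscale conversion, Y = 0.299 R + 0.587 G + 0.114 B (equation (1));
    # cv2.cvtColor applies these same weights internally
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    # Gaussian filtering: each pixel becomes a weighted average of its neighborhood
    denoised = cv2.GaussianBlur(gray, (5, 5), 1.0)
    return denoised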
In an embodiment of the present application, the step S3 includes:
S301, selecting any level image as a sample image; the level image contains a glass vial that displays the scale and a black housing marked with the manufacturer's characters and the graduation value.
S302, selecting the reading area of the level in the sample image with a rectangular frame, recording the position coordinates of the four vertices of the rectangular frame around the reading area, calculating the width and height of the template as the size parameters for later template normalization, and taking the rectangular reading area of the level in the sample image as the template; a schematic diagram of the template is shown in fig. 3;
s303, normalizing the size of the template image;
S304, performing template matching calculation on each image to be identified, and traversing each image by adopting a full search method;
S305, calculating the correlation value between the template and the image to be detected: the template pixels are multiplied by the corresponding target-image pixels and the mean of the products is taken as the correlation coefficient; all correlation values calculated in this step are stored. The correlation value is computed as:
R(x,y) = Σ_{x′,y′} ( T′(x′,y′) · I′(x+x′, y+y′) )   (4)
where R is the correlation coefficient; I(x,y) and T(x,y) are the pixel coordinates of the original image and the template image; and w and h are the width and height of the template.
S306, selecting the candidate image area with the largest correlation value, comparing this value with a preset threshold, and judging whether the correlation value is greater than the preset threshold:
if yes, the reading area of the level is considered detected, and the reading image area is cropped from the image to be identified at the candidate reading position with the maximum correlation value, to be used as the detection region of interest in the next step;
if not, the reading area of the level is not recognized, and the method returns to step S1 to preprocess the next image to be identified.
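A minimal sketch of steps S301-S306 is given here, assuming OpenCV; cv2.matchTemplate with the TM_CCORR_NORMED measure stands in for the correlation of equation (4), and the threshold value is an illustrative assumption:

import cv2

def find_reading_roi(image, template, thresh=0.90):
    # Full-search template matching: one correlation value per candidate (x, y) position
    h, w = template.shape[:2]
    response = cv2.matchTemplate(image, template, cv2.TM_CCORR_NORMED)
    # Candidate area with the largest correlation value
    _, max_val, _, max_loc = cv2.minMaxLoc(response)
    if max_val > thresh:                    # compare with the preset threshold
        x, y = max_loc
        return image[y:y + h, x:x + w]      # crop the detection region of interest
    return None                             # reading area not recognized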
In an embodiment of the present application, the step S4 includes:
S401, acquiring an image containing the level, performing image denoising, and feeding the image to the trained region-of-interest extraction model of step S304; the level detection region of interest in the image is obtained according to steps S304-S306 and is cropped and saved;
S402, binarizing the image of the level region of interest: within a fixed region, all pixels of the region of interest are compared with a threshold; pixels greater than the threshold are set to 255, otherwise to 0; the resulting segmentation is shown in fig. 5;
S403, taking the black regions in the binary image as targets and extracting the contours of all target blocks (the contour detection result is shown in fig. 6), then calculating the key features of each contour region and saving their feature values;
S404, screening the target-block contours to extract the required graduation-line regions, comparing the feature values with preset thresholds and judging whether the preset conditions are met:
if the contour area is within the threshold range, the contour is retained; if not, the contour region is deleted;
if, in addition, the ordinate of the centroid of the contour block is greater than a preset threshold, the region is saved as a graduation-line region; if not, the contour region is deleted; the feature values of the extracted graduation-line regions are saved.
S405, sorting the graduation lines according to their positional relationship; the graduation lines, numbered 1-9 from the inside outward, are symmetric left and right. The graduation-line contours are shown in fig. 8.
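Steps S401-S405 can be sketched as follows (a minimal sketch assuming OpenCV 4; the binarization threshold, area range, centroid-ordinate limit and the simple left-to-right sort are illustrative assumptions rather than values specified above):

import cv2

def extract_tick_marks(roi, bin_thresh=100, min_area=50, max_area=500, min_cy=20):
    # Binarization: pixels greater than the threshold become 255, the rest 0
    _, binary = cv2.threshold(roi, bin_thresh, 255, cv2.THRESH_BINARY)
    # The black regions are the targets, so invert before contour extraction
    contours, _ = cv2.findContours(255 - binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    ticks = []
    for c in contours:
        area = cv2.contourArea(c)
        if not (min_area <= area <= max_area):      # keep contours whose area is in range
            continue
        m = cv2.moments(c)
        if m["m00"] == 0:
            continue
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        if cy > min_cy:                             # centroid ordinate above the preset threshold
            ticks.append((cx, cy))
    ticks.sort(key=lambda p: p[0])                  # order the graduation lines by abscissa
    return ticks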
In an embodiment of the present application, the step S5 includes:
S501, selecting any level image as a sample image and selecting the area where the bubble is located as the template; the bubble template is shown in fig. 10. The position coordinates of the four vertices of the rectangular frame around the bubble area are recorded, and the width and height of the template are calculated as the size parameters for later template normalization;
s502, normalizing the size of the template image;
s503, performing template matching calculation on the detected region of interest obtained in the step S4, and traversing the image of the region of interest by adopting a full search method;
s504, calculating a correlation value of the template and the image to be detected;
S505, selecting a candidate image area with the largest correlation value, comparing the candidate image area with a preset threshold value, and judging whether the correlation value is larger than the preset threshold value or not:
if yes, the bubble area is considered to be detected;
if not, the bubble area is not recognized, the threshold is corrected, and the steps S503 to S505 are repeated to perform the next search.
The bubble detection result is shown in fig. 11.
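Steps S501-S505 reuse the same correlation measure; a minimal sketch of the matching with threshold correction is given below (assuming OpenCV; the initial threshold, correction step and lower bound are illustrative assumptions, since the text does not fix how the threshold is corrected):

import cv2

def find_bubble(roi, bubble_template, thresh=0.95, step=0.02, min_thresh=0.80):
    h, w = bubble_template.shape[:2]
    response = cv2.matchTemplate(roi, bubble_template, cv2.TM_CCORR_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(response)    # largest correlation value
    while thresh >= min_thresh:
        if max_val > thresh:                            # bubble area detected
            x, y = max_loc
            return (x, y, w, h)                         # rectangle of the bubble region
        thresh -= step                                  # correct the threshold and search again
    return None                                         # bubble area not recognized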
In an embodiment of the present application, the step S6 includes:
S601, a flow chart of the level reading calculation is shown in fig. 13. In theory the spacings between adjacent graduation lines are equal; measured with a ruler, the spacing between two graduation lines is 2 mm. The pixel distance between adjacent graduation lines is calculated as the difference of the abscissas of the graduation-line centroids; the difference between two graduation lines is about 28 pixels, with an error of 1 to 2 pixels.
S602, the graduation lines of the level were sorted in order in step S405. From the structure of the level, when the left and right edges of the bubble coincide with the fourth graduation lines the level is at the zero position and is horizontal. The fourth graduation line is set as the zero graduation line, and the upper-left corner coordinate of the rectangular frame represents the column position of the bubble edge. When the bubble is to the left of the zero graduation line, i.e. the bubble abscissa is smaller than that of the zero graduation line, the reading is marked negative; when the bubble is to the right of the zero graduation line, i.e. its abscissa is larger, the reading is marked positive.
S603, detecting the graduation line nearest to the bubble; the positional relationship between the bubble and the graduation lines is shown in fig. 12. The nearest graduation line is denoted A, C denotes the zero graduation line, and B denotes the graduation line between A and C. The number of graduations between the nearest graduation line A and the zero graduation line C is the primary gradient; when the bubble lies between the two graduation lines A and B, the distance from the bubble to the nearest graduation line A, expressed as a percentage of the spacing between A and B, is the secondary gradient.
S604, judging the positional relationship between the bubble and the zero graduation line: if the bubble edge is far from the zero graduation line, the secondary gradient is subtracted from the primary gradient; if the bubble edge is adjacent to the zero graduation line, the level reading is the primary gradient plus the secondary gradient.
S605, judging whether the final reading is positive or negative: if the bubble is to the left of the zero graduation line C the reading is negative, otherwise it is positive; the final reading result is thus obtained.
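The reading calculation of steps S601-S605 can be sketched as follows (a minimal sketch under stated assumptions: ticks_x holds the graduation-line centroid abscissas in order, the fourth entry is the zero graduation line, bubble_x is the column coordinate of the bubble edge, and the add/subtract branch of S604 is interpreted geometrically so that the reading grows monotonically with the bubble offset):

ZERO_IDX = 3  # the fourth graduation line is the zero graduation line (graduation spacing: 2 mm)

def level_reading(ticks_x, bubble_x):
    zero_x = ticks_x[ZERO_IDX]
    # Nearest graduation line A to the bubble edge
    a_idx = min(range(len(ticks_x)), key=lambda i: abs(ticks_x[i] - bubble_x))
    primary = abs(a_idx - ZERO_IDX)                         # whole graduations between A and the zero line
    pitch = abs(ticks_x[ZERO_IDX + 1] - ticks_x[ZERO_IDX])  # pixel spacing of one graduation (about 28 px)
    secondary = abs(bubble_x - ticks_x[a_idx]) / pitch      # fractional part measured from A
    # Combine primary and secondary gradients depending on which side of A the bubble edge lies
    if abs(bubble_x - zero_x) >= abs(ticks_x[a_idx] - zero_x):
        reading = primary + secondary                       # edge beyond A, away from the zero line
    else:
        reading = primary - secondary                       # edge between A and the zero line
    sign = -1.0 if bubble_x < zero_x else 1.0               # left of the zero line is negative
    return sign * reading                                   # reading in graduations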
While the foregoing description illustrates and describes a preferred embodiment of the present invention, it is to be understood that the invention is not limited to the form disclosed herein and is not to be regarded as excluding other embodiments; it may be used in various other combinations, modifications and environments, and may be changed within the scope of the inventive concept described herein, whether in light of the above teachings or of the skill and knowledge of the relevant art. Modifications and variations that do not depart from the spirit and scope of the invention are intended to fall within the scope of the appended claims.

Claims (5)

1. An automatic reading identification method of a strip level based on image processing is characterized by comprising the following steps:
s1, constructing a level meter image acquisition system;
S2, collecting a plurality of images containing the level, preprocessing the images to be identified, and removing image noise;
S3, extracting the reading area of the level from the preprocessed image by a template matching method, and setting it as the region of interest for subsequent level detection; the specific implementation of extracting the level detection region of interest comprises the following steps:
S301, selecting any level image as a sample image, wherein the level image comprises a glass vial that displays the scale and a black housing marked with the manufacturer's characters and the graduation value;
S302, selecting the reading area of the level in the sample image with a rectangular frame, recording the position coordinates of the four vertices of the rectangular frame around the reading area, calculating the width and height of the template as the size parameters for later template normalization, and taking the rectangular reading area of the level in the sample image as the template;
s303, normalizing the size of the template image;
S304, performing template matching calculation on each image to be identified, and traversing each image by adopting a full search method;
S305, calculating the correlation value between the template and the image to be detected: the template pixels are multiplied by the corresponding target-image pixels and the mean of the products is taken as the correlation coefficient; all correlation values calculated in this step are stored; the correlation value is computed as:
R(x,y) = Σ_{x′,y′} ( T′(x′,y′) · I′(x+x′, y+y′) )   (4)
wherein R is the correlation coefficient; I(x,y) and T(x,y) are the pixel coordinates of the original image and the template image; and w and h are the width and height of the template;
S306, selecting a candidate image area with the largest correlation value, comparing the candidate image area with a preset threshold value, and judging whether the correlation value is larger than the preset threshold value or not:
if yes, the reading area of the level is considered detected, and the reading image area is cropped from the image to be identified at the candidate reading position with the maximum correlation value, to be used as the detection region of interest in the next step;
if not, the reading area of the level is not recognized, and the method returns to step S1 to preprocess the next image to be recognized;
S4, in the detection region of interest, extracting scale marks of the level based on the edge, color and area characteristics; the specific implementation method for extracting the scale marks of the level gauge comprises the following steps:
S401, acquiring an image containing the level, performing image denoising, and feeding the image to the trained region-of-interest extraction model of step S304; the level detection region of interest in the image is obtained according to steps S304-S306 and is cropped and saved;
S402, binarizing the image of the level region of interest: within a fixed region, all pixels of the region of interest are compared with a threshold; pixels greater than the threshold are set to 255, otherwise to 0;
S403, taking the black regions in the binary image as targets, extracting the contours of all target blocks, calculating the key features of each contour region, and saving their feature values;
S404, screening the target-block contours to extract the required graduation-line regions, comparing the feature values with preset thresholds and judging whether the preset conditions are met:
if the contour area is within the threshold range, the contour is retained; if not, the contour region is deleted;
if, in addition, the ordinate of the centroid of the contour block is greater than a preset threshold, the region is saved as a graduation-line region; if not, the contour region is deleted; the feature values of the extracted graduation-line regions are saved;
S405, sorting the graduation lines according to their positional relationship, wherein the graduation lines are numbered from the inside outward and are symmetric left and right;
s5, detecting bubbles of the level meter based on a template matching method in the detection region of interest;
S6, calculating a reading value of the level according to the position relation between the bubble edge position and the scale mark, so that an automatic reading function of the level is realized, and the level state of the system to be detected is judged.
2. The automatic reading identification method of the strip level based on image processing as claimed in claim 1, wherein the specific implementation method of the image acquisition system in step S1 is as follows:
Taking image quality and the stability of the detection system into consideration, a module camera of model IMX274 is selected as the image acquisition module of the system; a double-link hinged mounting is adopted, with the link bracket formed by hinging two brackets: one end is fixed, and the other end is connected to the camera through a knob assembly, so that the camera direction can be adjusted through 360 degrees and the camera position can be adjusted in the XOZ plane via the double-link hinge.
3. The automatic reading identification method of the strip level based on image processing according to claim 1, wherein the specific implementation method of the image preprocessing in step S2 is as follows:
S201, collecting a plurality of images containing the level; the original images are RGB images composed of the three primary colors red, green and blue, and they are converted to grayscale by the weighted-average method, expressed as:
Y=0.299×R+0.587×G+0.114×B (1)
S202, denoising with a Gaussian filter: all pixels of the target image are weighted and averaged, i.e. the value of each pixel is calculated as the weighted average of the center pixel and the pixels in its neighborhood.
4. The automatic reading identification method of the strip type level based on image processing according to claim 1, wherein the specific implementation method of the level bubble extraction based on template matching in step S5 is as follows:
S501, selecting any level image as a sample image and selecting its bubble area as the template; the position coordinates of the four vertices of the rectangular frame around the bubble area are recorded, and the width and height of the template are calculated as the size parameters for later template normalization;
s502, normalizing the size of the template image;
s503, performing template matching calculation on the detected region of interest obtained in the step S4, and traversing the image of the region of interest by adopting a full search method;
s504, calculating a correlation value of the template and the image to be detected;
S505, selecting a candidate image area with the largest correlation value, comparing the candidate image area with a preset threshold value, and judging whether the correlation value is larger than the preset threshold value or not:
if yes, the bubble area is considered to be detected;
if not, the bubble area is not recognized, the threshold is corrected, and the steps S503 to S505 are repeated to perform the next search.
5. The automatic reading identification method for the strip level based on image processing according to claim 4, wherein the specific implementation method for reading the level in step S6 is as follows:
S601, in theory the spacings between adjacent graduation lines are equal; measured with a ruler, the spacing between two graduation lines is 2 mm; the pixel distance between adjacent graduation lines is calculated as the difference of the abscissas of the graduation-line centroids; the difference between two graduation lines is about 28 pixels, with an error of 1 to 2 pixels;
S602, the graduation lines of the level are sorted in order in step S405; from the structure of the level, when the left and right edges of the bubble coincide with the fourth graduation lines the level is at the zero position and is horizontal; the fourth graduation line is therefore set as the zero graduation line, and the upper-left corner coordinate of the rectangular frame represents the column position of the bubble edge; when the bubble is to the left of the zero graduation line, i.e. the bubble abscissa is smaller than that of the zero graduation line, the reading is marked negative; when the bubble is to the right of the zero graduation line, i.e. its abscissa is larger, the reading is marked positive;
S603, detecting the graduation line nearest to the bubble and denoting it A; C denotes the zero graduation line and B the graduation line between A and C; the number of graduations between the nearest graduation line A and the zero graduation line C is the primary gradient; when the bubble lies between the two graduation lines A and B, the distance from the bubble to the nearest graduation line A, expressed as a percentage of the spacing between A and B, is the secondary gradient;
S604, judging the positional relationship between the bubble and the zero graduation line: if the bubble edge is far from the zero graduation line, the secondary gradient is subtracted from the primary gradient; if the bubble edge is adjacent to the zero graduation line, the level reading is the primary gradient plus the secondary gradient;
S605, judging whether the final reading is positive or negative: if the bubble is to the left of the zero graduation line the reading is negative, otherwise it is positive; the final reading result is thus obtained.
CN202111249287.8A 2021-10-26 2021-10-26 Automatic reading identification method for strip level based on image processing Active CN113902894B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111249287.8A CN113902894B (en) 2021-10-26 2021-10-26 Automatic reading identification method for strip level based on image processing

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111249287.8A CN113902894B (en) 2021-10-26 2021-10-26 Automatic reading identification method for strip level based on image processing

Publications (2)

Publication Number Publication Date
CN113902894A CN113902894A (en) 2022-01-07
CN113902894B true CN113902894B (en) 2024-05-31

Family

ID=79026398

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111249287.8A Active CN113902894B (en) 2021-10-26 2021-10-26 Automatic reading identification method for strip level based on image processing

Country Status (1)

Country Link
CN (1) CN113902894B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115100209B (en) * 2022-08-28 2022-11-08 电子科技大学 Camera-based image quality correction method and correction system

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104392206A (en) * 2014-10-24 2015-03-04 南京航空航天大学 Image processing method for automatic pointer-type instrument reading recognition
CN106289325A (en) * 2016-09-23 2017-01-04 浙江大学 A kind of air-bubble level automatic checkout system
CN110599471A (en) * 2019-09-02 2019-12-20 唐山市气象局 Rain gauge horizontal monitoring system based on image processing and detection method thereof
WO2021195873A1 (en) * 2020-03-30 2021-10-07 南昌欧菲光电技术有限公司 Method and device for identifying region of interest in sfr test chart image, and medium
CN112254744A (en) * 2020-10-23 2021-01-22 广州计量检测技术研究院 Bubble level meter calibration method, system, device and storage medium
CN112990179A (en) * 2021-04-20 2021-06-18 成都阿莱夫信息技术有限公司 Single-pointer type dial reading automatic identification method based on picture processing

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on an automatic pressure gauge image recognition ***; Zhang Daode; Wang Zhouxing; Hu Xinyu; Wu Liangyi; Journal of Hubei University of Technology; 2017-02-15 (No. 01); full text *
Design of an online insulator image acquisition and real-time processing ***; Zhang Xizhen; Sun Xiaoyan; Zhang Licai; Electronic Measurement Technology; 2018-08-23 (No. 16); full text *

Also Published As

Publication number Publication date
CN113902894A (en) 2022-01-07

Similar Documents

Publication Publication Date Title
US11551341B2 (en) Method and device for automatically drawing structural cracks and precisely measuring widths thereof
CN112906694B (en) Reading correction system and method for transformer substation inclined pointer instrument image
CN108918526B (en) Notch defect detection method for flexible IC packaging substrate circuit
CN112818988B (en) Automatic identification reading method and system for pointer instrument
CN108764257B (en) Multi-view pointer instrument identification method
CN103207987B (en) A kind of registration recognition methods of pointer instrument
CN108613630B (en) Two-wire tube level bubble offset measurement method based on image processing technology
CN109115800B (en) Method for rapidly detecting burrs of product and accurately measuring length
CN110929710A (en) Method and system for automatically identifying meter pointer reading based on vision
CN110211178B (en) Pointer instrument identification method using projection calculation
CN111507186B (en) Method for recognizing reading of pointer instrument of transformer substation
CN114005108A (en) Pointer instrument degree identification method based on coordinate transformation
CN113902894B (en) Automatic reading identification method for strip level based on image processing
CN114627080A (en) Vehicle stamping accessory defect detection method based on computer vision
CN114926625A (en) Automatic reading method of glass liquid thermometer based on image processing technology
CN111815580B (en) Image edge recognition method and small module gear module detection method
CN113989482B (en) Automatic reading identification method of optical imaging level based on image processing
CN115761468A (en) Water level detection system and method based on image segmentation and target detection technology
CN115424009A (en) Automatic reading method for pointer instrument data based on Yolact network
CN111189826B (en) Intelligent scoring experimental equipment and method for measuring pH value of solution to be measured by pH test paper
CN113989513A (en) Method for recognizing reading of square pointer type instrument
CN114677428A (en) Power transmission line icing thickness detection method based on unmanned aerial vehicle image processing
CN113313122A (en) Pointer type instrument automatic reading identification method based on computer vision
CN109360289B (en) Power meter detection method fusing inspection robot positioning information
CN113269749A (en) Strip position image data collection method and visual detection method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant