CN114998328A - Workpiece spraying defect detection method and system based on machine vision and readable storage medium - Google Patents

Workpiece spraying defect detection method and system based on machine vision and readable storage medium

Info

Publication number
CN114998328A
CN114998328A (application CN202210889416.8A)
Authority
CN
China
Prior art keywords
image
workpiece
point cloud
dimensional
defect detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210889416.8A
Other languages
Chinese (zh)
Inventor
丁峰
张德松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Tiancheng Coating System Co ltd
Original Assignee
Suzhou Tiancheng Coating System Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Tiancheng Coating System Co ltd filed Critical Suzhou Tiancheng Coating System Co ltd
Priority to CN202210889416.8A
Publication of CN114998328A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20024 Filtering details
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20048 Transform domain processing
    • G06T2207/20064 Wavelet transform [DWT]
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computing Systems (AREA)
  • Analytical Chemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Artificial Intelligence (AREA)
  • Biochemistry (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a workpiece spraying defect detection method and system based on machine vision and a readable storage medium. The method comprises the following steps: obtaining a three-dimensional image of a workpiece and preprocessing it; extracting three-dimensional point cloud data from the preprocessed image, standardizing the point cloud data, and extracting point cloud data features; inputting the point cloud data features into a defect detection model to generate defect parameter information; comparing the defect parameter information with preset information to obtain a deviation rate; and judging whether the deviation rate is greater than a preset threshold. If the deviation rate is greater than the threshold, compensation information is generated and the spraying parameters are optimized according to it; if it is smaller, the defect parameter information is displayed in a preset manner. The three-dimensional point cloud data represent the spraying form of the workpiece more faithfully, enable multi-dimensional analysis of the data, and improve the defect detection accuracy.

Description

Workpiece spraying defect detection method and system based on machine vision and readable storage medium
Technical Field
The invention relates to the technical field of industrial vision detection, in particular to a workpiece spraying defect detection method and system based on machine vision and a readable storage medium.
Background
Machine vision is a rapidly developing branch of artificial intelligence in which machines take the place of human eyes for measurement and judgment. A machine vision system converts the observed target into an image signal through a machine vision product (i.e., an image capture device, divided into CMOS (complementary metal oxide semiconductor) and CCD (charge coupled device) products) and transmits the signal to a dedicated image processing system, which obtains the morphological information of the target and converts it into digital signals according to pixel distribution, brightness, color, and other information. The image system then performs various operations on these signals to extract the features of the target and controls the on-site equipment according to the judgment result. Detecting workpiece spraying defects through machine vision therefore has important value and significance.
Traditional defect detection relies on infrared scanning followed by a judgment of spraying defects, and its accuracy is relatively poor. In addition, traditional defect detection judges workpiece spraying defects only from two-dimensional images of the workpiece surface; the data analysis is not accurate enough, the defect judgment result easily deviates from the actual result, and the judgment of defects is affected.
Disclosure of Invention
The invention aims to provide a workpiece spraying defect detection method and system based on machine vision and a readable storage medium, which are simple to operate, high in efficiency and good in universality.
In order to achieve the purpose of the invention, the technical scheme adopted by the invention is as follows: a workpiece spraying defect detection method based on machine vision comprises the following steps:
acquiring a three-dimensional image of a workpiece, and preprocessing the three-dimensional image;
extracting three-dimensional point cloud data from the preprocessed three-dimensional image, standardizing the three-dimensional point cloud data, and extracting point cloud data characteristics;
inputting the point cloud data characteristics into a defect detection model to generate defect parameter information;
comparing the defect parameter information with preset information to obtain a deviation rate;
judging whether the deviation rate is greater than a preset threshold value or not;
if the deviation rate is greater than the preset threshold, generating compensation information and optimizing the spraying parameters with the compensation information;
and if the deviation rate is smaller than the preset threshold, displaying the defect parameter information in a preset manner.
Preferably, acquiring a three-dimensional image of the workpiece, and preprocessing the three-dimensional image, specifically includes:
acquiring an original left image and an original right image of a workpiece;
calculating the parallax between the original left image and the original right image according to a binocular camera calibration principle, and calibrating a binocular camera;
acquiring a new left image and a new right image of the workpiece through the calibrated binocular camera;
and forming a binocular stereoscopic vision image according to the new left image and the new right image.
Preferably, the method further comprises the following steps:
establishing a three-dimensional space coordinate system, and calculating a three-dimensional coordinate of the workpiece in the three-dimensional space according to the spatial geometrical relationship;
and segmenting the image into a plurality of regions with the same size according to the three-dimensional coordinates, and filtering the segmented image.
Preferably, the image filtering processing method is as follows:
carrying out wavelet transform multi-scale decomposition on the image;
removing noise in the image by using the scale coefficient;
the image is reconstructed by inverse wavelet transform.
Preferably, the inverse wavelet transform formula is as follows (the formula appears only as embedded images in the published text):
[equation image]
[equation image]
In the formula, [symbol] represents the inverse wavelet transform function, [symbol] represents the wavelet function, s represents the scale factor, k represents the translation factor, and [symbol] represents the correction factor.
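Since the formulas above survive only as image placeholders, a standard dyadic inverse discrete wavelet reconstruction of this shape, written with the wavelet function ψ, scale factor s, translation factor k, and a correction (normalization) factor C, is given below for reference; this is the textbook form and not necessarily the patent's exact expression:

\[
\psi_{s,k}(t) = 2^{s/2}\,\psi\!\left(2^{s}t-k\right), \qquad
f(t) = C\sum_{s}\sum_{k}\left\langle f,\psi_{s,k}\right\rangle\,\psi_{s,k}(t)
\]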
Preferably, the method for calculating the three-dimensional coordinates of a point Q in space is as follows: the left image and the right image are captured by two cameras respectively, the distance between the projection centers of the two cameras is denoted p, and the coordinate correction coefficient is denoted [symbol]. Assuming that the left image and the right image lie on the same horizontal plane, the Y coordinate of point Q in space is the same in both images, from which it follows that:
[equation image]
[equation image]
[equation image]
In the formulas, the left image coordinates are [symbol], the right image coordinates are [symbol], and the parallax between the left and right images is [symbol]. From these, the three-dimensional coordinates of point Q in space are calculated as [symbol].
Preferably, the three-dimensional point cloud data processing method comprises the following steps:
analyzing each point in a statistical filtering mode, and calculating the distance from the point to an adjacent point;
calculating a mean value m and a standard deviation n to obtain a threshold value range w;
removing point cloud data outside the threshold range;
wherein the threshold range w is (m − a·n, m + a·n), and a represents a constant.
Preferably, the method for extracting the point cloud data features is as follows:
calculating a normal vector according to the geometrical characteristics of the point cloud;
sorting the normal vectors, and judging the offset angle of the normal vectors by using the calculated normal vectors;
judging whether the offset angle is larger than a preset angle or not;
if the offset angle is larger than the preset angle, adjusting the normal vector parameters, extracting the edge features of the point cloud, and performing boundary division on the point cloud data.
The invention also claims a computer device, which comprises a memory, a processor and a computer program stored in the memory and capable of running on the processor, wherein the processor executes the computer program to execute the instructions of the workpiece spraying defect detection method based on machine vision.
The invention also claims a computer readable storage medium storing a computer program which, when executed by a processor, executes instructions of the above-described machine vision-based workpiece spray defect detection method.
Due to the application of the technical scheme, compared with the prior art, the invention has the following advantages:
according to the method and the device, the three-dimensional image and the three-dimensional point cloud data of the workpiece are obtained, the workpiece spraying form can be better displayed through the three-dimensional point cloud data, multi-dimensional analysis of the data is achieved, the defect detection precision is improved, and the defect detection result is closer to an actual value.
Drawings
FIG. 1 is a flow chart of a workpiece spray defect detection method based on machine vision according to the present invention;
FIG. 2 is a flow chart of a three-dimensional image preprocessing method of the present invention;
FIG. 3 is a flow chart of a method for extracting point cloud features according to the present invention.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the present specification, the technical solutions in the embodiments of the present specification will be clearly and completely described below with reference to the drawings in the embodiments of the present specification, and it is obvious that the described embodiments are only a part of the embodiments of the present specification, and not all of the embodiments. All other embodiments obtained by a person skilled in the art based on the embodiments in the present specification without any inventive step should fall within the scope of protection of the present specification.
Example 1
As shown in FIG. 1, the invention discloses a workpiece spraying defect detection method based on machine vision, which comprises the following steps:
s102, acquiring a three-dimensional image of a workpiece, and preprocessing the three-dimensional image;
s104, extracting three-dimensional point cloud data from the preprocessed three-dimensional image, standardizing the three-dimensional point cloud data, and extracting point cloud data characteristics;
s106, inputting the point cloud data characteristics into a defect detection model to generate defect parameter information;
s108, comparing the defect parameter information with preset information to obtain a deviation rate;
s110, judging whether the deviation rate is larger than a preset threshold value or not;
s112, if the value is larger than the preset value, generating compensation information, and optimizing the spraying parameters through the compensation information;
and S114, if the defect parameter information is smaller than the preset defect parameter information, displaying the defect parameter information according to a preset mode.
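The comparison and branching in S108 to S114 can be sketched compactly. The Python fragment below is only an illustration: the deviation-rate formula, the 0.05 threshold, and the way the compensation is derived are assumptions of the sketch, since the patent does not spell them out.

import numpy as np

def deviation_rate(defect_params, preset):
    """Mean relative deviation of the measured defect parameters from the preset values."""
    defect_params = np.asarray(defect_params, dtype=float)
    preset = np.asarray(preset, dtype=float)
    return float(np.mean(np.abs(defect_params - preset) / np.maximum(np.abs(preset), 1e-12)))

def handle_result(defect_params, preset, threshold=0.05):
    dev = deviation_rate(defect_params, preset)
    if dev > threshold:
        # S112: derive a compensation term as the signed gap to the preset values
        compensation = np.asarray(preset, float) - np.asarray(defect_params, float)
        return ("compensate", compensation)
    # S114: report the defect parameters for display
    return ("display", np.asarray(defect_params, float))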
Furthermore, three-dimensional images are mainly represented as voxel, mesh (grid), or point cloud models; point cloud data, i.e., a point cloud, is obtained by a line-scanning device and is distributed in space like a cloud of points.
Fig. 2 shows a flow chart of a three-dimensional image preprocessing method.
Preferably, acquiring a three-dimensional image of the workpiece, and preprocessing the three-dimensional image, specifically includes:
s202, acquiring an original left image and an original right image of the workpiece;
s204, calculating the parallax between the original left image and the original right image according to the calibration principle of the binocular camera, and calibrating the binocular camera;
s206, acquiring a new left image and a new right image of the workpiece through the calibrated binocular camera;
and S208, forming a binocular stereoscopic vision image according to the new left image and the new right image.
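As an illustration of how S202 to S208 are commonly realized in practice, the sketch below uses OpenCV. It assumes the intrinsic matrices K1, K2, the distortion vectors D1, D2, and the rotation and translation R, T between the two cameras have already been obtained by binocular calibration (for example with cv2.stereoCalibrate); it is a minimal sketch, not the patent's specific implementation.

import cv2
import numpy as np

def rectify_and_match(img_l, img_r, K1, D1, K2, D2, R, T):
    """Rectify a calibrated stereo pair and compute a disparity map and dense 3D points."""
    size = (img_l.shape[1], img_l.shape[0])
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, size, R, T)
    map1l, map2l = cv2.initUndistortRectifyMap(K1, D1, R1, P1, size, cv2.CV_32FC1)
    map1r, map2r = cv2.initUndistortRectifyMap(K2, D2, R2, P2, size, cv2.CV_32FC1)
    rect_l = cv2.remap(img_l, map1l, map2l, cv2.INTER_LINEAR)   # "new" left image
    rect_r = cv2.remap(img_r, map1r, map2r, cv2.INTER_LINEAR)   # "new" right image
    # Semi-global block matching on grayscale images (assumes BGR input)
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
    disparity = sgbm.compute(cv2.cvtColor(rect_l, cv2.COLOR_BGR2GRAY),
                             cv2.cvtColor(rect_r, cv2.COLOR_BGR2GRAY)).astype(np.float32) / 16.0
    points_3d = cv2.reprojectImageTo3D(disparity, Q)            # per-pixel XYZ coordinates
    return rect_l, rect_r, disparity, points_3d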
Furthermore, binocular stereo vision fuses the images obtained from two viewpoints and observes the differences between them, which gives an obvious sense of depth; a correspondence between features is established, and the difference between the mapped positions of the same physical point in space in the two images is called the parallax (disparity).
Preferably, the method further comprises the following steps: establishing a three-dimensional space coordinate system, and calculating the three-dimensional coordinates of the workpiece in the three-dimensional space according to the spatial geometrical relationship;
and segmenting the image into a plurality of regions with the same size according to the three-dimensional coordinates, and filtering the segmented image.
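A simple way to cut an image into regions of the same size before per-region filtering is sketched below with NumPy. The patent segments according to the three-dimensional coordinates, so the fixed grid used here is only an illustrative simplification.

import numpy as np

def split_into_blocks(image, rows, cols):
    """Split an image into rows*cols equally sized regions (edges trimmed so all blocks match)."""
    h = (image.shape[0] // rows) * rows
    w = (image.shape[1] // cols) * cols
    trimmed = image[:h, :w]
    return [np.array_split(band, cols, axis=1)
            for band in np.array_split(trimmed, rows, axis=0)]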
FIG. 3 shows a flow chart of the method for extracting point cloud data features.
Further, the method for extracting the point cloud data features comprises the following steps:
s302, calculating a normal vector according to the geometrical characteristics of the point cloud;
s304, sequencing the normal vectors, and judging the offset angles of the normal vectors by using the calculated normal vectors;
s306, judging whether the offset angle is larger than a preset angle or not;
and S308, if the offset angle is larger than the preset angle, adjusting the normal vector parameters, extracting the edge features of the point cloud, and performing boundary division on the point cloud data.
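A minimal sketch of S302 to S308 is given below using Open3D and NumPy. The neighbourhood size and the preset angle are assumed values, and the simple test of each normal against the mean normal of its neighbourhood stands in for the patent's offset-angle judgment.

import numpy as np
import open3d as o3d

def point_cloud_edge_points(xyz, knn=30, angle_deg=30.0):
    """Estimate normals and flag points whose normal deviates from the neighbourhood mean
    normal by more than a preset angle (candidate edge / boundary points)."""
    pcd = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(xyz))
    pcd.estimate_normals(o3d.geometry.KDTreeSearchParamKNN(knn))
    normals = np.asarray(pcd.normals)
    tree = o3d.geometry.KDTreeFlann(pcd)
    edge_mask = np.zeros(len(xyz), dtype=bool)
    cos_thresh = np.cos(np.radians(angle_deg))
    for i, p in enumerate(xyz):
        _, idx, _ = tree.search_knn_vector_3d(p, knn)
        neigh = normals[np.asarray(idx)]
        mean_n = neigh.mean(axis=0)
        mean_n /= np.linalg.norm(mean_n) + 1e-12
        if abs(np.dot(normals[i], mean_n)) < cos_thresh:   # offset angle exceeds preset angle
            edge_mask[i] = True
    return edge_mask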
Preferably, the image filtering processing method is as follows:
carrying out wavelet transformation multi-scale decomposition on the image;
removing noise in the image by using the scale coefficient;
the image is reconstructed by inverse wavelet transform.
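One way to realize these three filtering steps is sketched below with the PyWavelets library; the chosen wavelet, the decomposition level, and the soft-threshold rule are illustrative assumptions rather than the patent's exact parameters.

import numpy as np
import pywt

def wavelet_denoise(image, wavelet="db2", level=2, k=3.0):
    """Multi-scale wavelet decomposition, soft-thresholding of the detail coefficients,
    and reconstruction of the image by the inverse wavelet transform."""
    img = np.asarray(image, dtype=np.float64)
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]
    # Noise scale estimated from the finest diagonal detail band (a common heuristic)
    sigma = np.median(np.abs(details[-1][2])) / 0.6745
    thresh = k * sigma
    denoised = [approx] + [tuple(pywt.threshold(d, thresh, mode="soft") for d in band)
                           for band in details]
    out = pywt.waverec2(denoised, wavelet)
    return out[:img.shape[0], :img.shape[1]]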
Preferably, the inverse wavelet transform is formulated as follows:
[equation image]
[equation image]
In the formula, [symbol] represents the inverse wavelet transform function, [symbol] represents the wavelet function, s represents the scale factor, k represents the translation factor, and [symbol] represents the correction factor.
Further, the method for calculating the three-dimensional coordinates of a point Q in space is as follows: the left image and the right image are captured by two cameras respectively, the distance between the projection centers of the two cameras is denoted p, and the coordinate correction coefficient is denoted [symbol]. Assuming that the left image and the right image lie on the same horizontal plane, the Y coordinate of point Q in space is the same in both images, from which it follows that:
[equation image]
[equation image]
[equation image]
In the formulas, the left image coordinates are [symbol], the right image coordinates are [symbol], and the parallax between the left and right images is [symbol]. From these, the three-dimensional coordinates of point Q in space are calculated as [symbol].
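The equations above survive only as image placeholders. For a rectified binocular pair with baseline p, focal length f, left image point (x_l, y), and right image point (x_r, y), the standard triangulation relations take the following form; the patent's exact expressions, including the role of its coordinate correction coefficient, are shown only in the original images:

\[
d = x_l - x_r, \qquad X = \frac{p\,x_l}{d}, \qquad Y = \frac{p\,y}{d}, \qquad Z = \frac{p\,f}{d}
\]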
Furthermore, the pixel-level parallax of the same workpiece in the binocular camera is related to the baseline and focal length of the binocular camera and to the pixel size. The shorter the baseline and the smaller the focal length, the larger the matching range and the closer the depth that can be detected; the longer the baseline and the larger the focal length, the farther the depth that can be detected. The binocular camera therefore needs to be calibrated before the left image and the right image are acquired.
Preferably, the three-dimensional point cloud data processing method comprises the following steps:
analyzing each point in a statistical filtering mode, and calculating the distance from the point to an adjacent point;
calculating a mean value m and a standard deviation n to obtain a threshold value range w;
removing point cloud data outside the threshold range;
wherein the threshold range w is (m − a·n, m + a·n), and a represents a constant.
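A minimal sketch of this statistical filtering, using SciPy, is given below; the neighbour count k is an assumed parameter, and a follows the preferred value given in the next paragraph.

import numpy as np
from scipy.spatial import cKDTree

def statistical_filter(points, k=10, a=1.25):
    """For each point, compute the mean distance to its k nearest neighbours, then keep only
    points whose mean distance lies inside the threshold range w = (m - a*n, m + a*n)."""
    points = np.asarray(points, dtype=float)
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)       # the first neighbour is the point itself
    mean_dist = dists[:, 1:].mean(axis=1)
    m, n = mean_dist.mean(), mean_dist.std()
    keep = (mean_dist > m - a * n) & (mean_dist < m + a * n)
    return points[keep]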
Further, the value of the constant a ranges from 1 to 1.5, preferably a = 1.25.
Furthermore, a certain amount of noise exists in the point cloud, so the point cloud needs to be denoised. Point cloud denoising detects and removes noise, or points of no interest, from the point cloud according to a filtering principle. Common point cloud filtering methods mainly include the voxel method and the moving least squares method; the moving least squares method eliminates discrete points by surface fitting, where discrete points are points unrelated to the features of the workpiece or object.
Example 2
The present disclosure also provides a computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes the computer program to perform the instructions of the workpiece spray coating defect detection method based on machine vision described in the above embodiments.
The computer device may include one or more processors, such as one or more Central Processing Units (CPUs) or Graphics Processors (GPUs), each of which may implement one or more hardware threads. The computer device may also comprise any memory for storing any kind of information, such as code, settings, data, etc., and in a particular embodiment a computer program on the memory and executable on the processor, which computer program when executed by the processor may perform the instructions of the method of any of the above embodiments. For example, and without limitation, memory may include any one or combination of the following: any type of RAM, any type of ROM, flash memory devices, hard disks, optical disks, etc. More generally, any memory may use any technology to store information. Further, any memory may provide volatile or non-volatile retention of information. Further, any memory may represent fixed or removable components of the computer device. In one case, when the processor executes the associated instructions stored in any memory or combination of memories, the computer device can perform any of the operations of the associated instructions. The computer device also includes one or more drive mechanisms for interacting with any memory, such as a hard disk drive mechanism, an optical disk drive mechanism, and so forth.
The present disclosure also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiments, or may exist separately without being assembled into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement the method described in embodiment 1 or 2 above. Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium which can be used to store information that can be accessed by a computer device. As defined herein, computer-readable media does not include transitory computer-readable media (transitory media) such as modulated data signals and carrier waves.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the system embodiment, since it is substantially similar to the method embodiment, the description is relatively simple, and reference may be made to the partial description of the method embodiment for relevant points. In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of an embodiment of the specification. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Moreover, various embodiments or examples and features of various embodiments or examples described in this specification can be combined and combined by one skilled in the art without being mutually inconsistent.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A workpiece spraying defect detection method based on machine vision is characterized by comprising the following steps,
acquiring a three-dimensional image of the workpiece, preprocessing the three-dimensional image,
extracting three-dimensional point cloud data from the preprocessed three-dimensional image, standardizing the three-dimensional point cloud data, and extracting point cloud data characteristics;
inputting the point cloud data characteristics into a defect detection model to generate defect parameter information;
comparing the defect parameter information with preset information to obtain a deviation rate;
judging whether the deviation rate is greater than a preset threshold value or not;
if the deviation rate is greater than the preset threshold, generating compensation information and optimizing the spraying parameters with the compensation information;
and if the deviation rate is smaller than the preset threshold, displaying the defect parameter information in a preset manner.
2. The workpiece spraying defect detection method based on machine vision as claimed in claim 1, wherein a three-dimensional image of the workpiece is obtained, and the preprocessing of the three-dimensional image specifically comprises:
acquiring an original left image and an original right image of a workpiece;
calculating the parallax between the original left image and the original right image according to a binocular camera calibration principle, and calibrating a binocular camera;
acquiring a new left image and a new right image of the workpiece through the calibrated binocular camera;
and forming a binocular stereoscopic vision image according to the new left image and the new right image.
3. The machine vision-based workpiece spray defect detection method of claim 2, further comprising:
establishing a three-dimensional space coordinate system, and calculating the three-dimensional coordinates of the workpiece in the three-dimensional space according to the spatial geometrical relationship;
and segmenting the image into a plurality of regions with the same size according to the three-dimensional coordinates, and filtering the segmented image.
4. The workpiece spraying defect detection method based on the machine vision as claimed in claim 3, characterized in that the image filtering processing method comprises the following steps:
carrying out wavelet transform multi-scale decomposition on the image;
removing noise in the image by using the scale coefficient;
the image is reconstructed by inverse wavelet transform.
5. The workpiece spray coating defect detection method based on machine vision as claimed in claim 4, characterized in that the wavelet inverse transformation formula is as follows:
[equation image]
[equation image]
in the formula, [symbol] represents the inverse wavelet transform function, [symbol] represents the wavelet function, s represents the scale factor, k represents the translation factor, and [symbol] represents the correction factor.
6. The method for detecting workpiece spraying defects based on machine vision according to claim 2, wherein the method for calculating the three-dimensional coordinates of a point Q in space is as follows: the left image and the right image are captured by two cameras respectively, the distance between the projection centers of the two cameras is denoted p, and the coordinate correction coefficient is denoted [symbol]; assuming that the left image and the right image lie on the same horizontal plane, the Y coordinate of point Q in space is the same in both images, from which it follows that:
[equation image]
[equation image]
[equation image]
in the formulas, the left image coordinates are [symbol], the right image coordinates are [symbol], and the parallax between the left and right images is [symbol]; from these, the three-dimensional coordinates of point Q in space are calculated as [symbol].
7. The workpiece spraying defect detection method based on machine vision as claimed in claim 1, characterized in that the three-dimensional point cloud data processing method is as follows:
analyzing each point in a statistical filtering mode, and calculating the distance from the point to an adjacent point;
calculating a mean value m and a standard deviation n to obtain a threshold value range w;
removing point cloud data outside the threshold range;
wherein the threshold range w is (m − a·n, m + a·n), and a represents a constant.
8. The workpiece spraying defect detection method based on machine vision as claimed in claim 3, characterized in that the method for extracting the point cloud data features is as follows:
calculating a normal vector according to the geometrical characteristics of the point cloud;
sorting the normal vectors, and judging the offset angle of the normal vectors by using the calculated normal vectors;
judging whether the offset angle is larger than a preset angle or not;
if the offset angle is larger than the preset angle, adjusting the normal vector parameters, extracting the edge features of the point cloud, and performing boundary division on the point cloud data.
9. A machine vision based workpiece spray defect detection system comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor executes the computer program to perform the instructions of the machine vision based workpiece spray defect detection method according to any one of claims 1 to 8.
10. A computer-readable storage medium storing a computer program, wherein the computer program when executed by a processor executes instructions of the machine vision based workpiece spray defect detection method of any one of claims 1 to 7.
CN202210889416.8A 2022-07-27 2022-07-27 Workpiece spraying defect detection method and system based on machine vision and readable storage medium Pending CN114998328A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210889416.8A CN114998328A (en) 2022-07-27 2022-07-27 Workpiece spraying defect detection method and system based on machine vision and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210889416.8A CN114998328A (en) 2022-07-27 2022-07-27 Workpiece spraying defect detection method and system based on machine vision and readable storage medium

Publications (1)

Publication Number Publication Date
CN114998328A (en) 2022-09-02

Family

ID=83021998

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210889416.8A Pending CN114998328A (en) 2022-07-27 2022-07-27 Workpiece spraying defect detection method and system based on machine vision and readable storage medium

Country Status (1)

Country Link
CN (1) CN114998328A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115561249A (en) * 2022-11-09 2023-01-03 松乐智能装备(深圳)有限公司 Intelligent monitoring method and system for spraying equipment
CN115932864A (en) * 2023-02-24 2023-04-07 深圳市博铭维技术股份有限公司 Pipeline defect detection method and pipeline defect detection device
CN116124081A (en) * 2023-04-18 2023-05-16 菲特(天津)检测技术有限公司 Non-contact workpiece detection method and device, electronic equipment and medium
CN116990692A (en) * 2023-09-28 2023-11-03 深圳康普盾科技股份有限公司 Lithium battery health condition assessment and residual life prediction method and system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102566291A (en) * 2010-12-29 2012-07-11 中芯国际集成电路制造(上海)有限公司 Test system for projection mask
CN113888531A (en) * 2021-11-02 2022-01-04 中南大学 Concrete surface defect detection method and device, electronic equipment and storage medium
CN114549519A (en) * 2022-04-08 2022-05-27 苏州天成涂装***股份有限公司 Visual detection method and system for automobile spraying production line and readable storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102566291A (en) * 2010-12-29 2012-07-11 中芯国际集成电路制造(上海)有限公司 Test system for projection mask
CN113888531A (en) * 2021-11-02 2022-01-04 中南大学 Concrete surface defect detection method and device, electronic equipment and storage medium
CN114549519A (en) * 2022-04-08 2022-05-27 苏州天成涂装***股份有限公司 Visual detection method and system for automobile spraying production line and readable storage medium

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115561249A (en) * 2022-11-09 2023-01-03 松乐智能装备(深圳)有限公司 Intelligent monitoring method and system for spraying equipment
CN115932864A (en) * 2023-02-24 2023-04-07 深圳市博铭维技术股份有限公司 Pipeline defect detection method and pipeline defect detection device
CN116124081A (en) * 2023-04-18 2023-05-16 菲特(天津)检测技术有限公司 Non-contact workpiece detection method and device, electronic equipment and medium
CN116124081B (en) * 2023-04-18 2023-06-27 菲特(天津)检测技术有限公司 Non-contact workpiece detection method and device, electronic equipment and medium
CN116990692A (en) * 2023-09-28 2023-11-03 深圳康普盾科技股份有限公司 Lithium battery health condition assessment and residual life prediction method and system
CN116990692B (en) * 2023-09-28 2023-12-08 深圳康普盾科技股份有限公司 Lithium battery health condition assessment and residual life prediction method and system

Similar Documents

Publication Publication Date Title
Wolff et al. Point cloud noise and outlier removal for image-based 3D reconstruction
CN109461181B (en) Depth image acquisition method and system based on speckle structured light
CN114998328A (en) Workpiece spraying defect detection method and system based on machine vision and readable storage medium
KR102674646B1 (en) Apparatus and method for obtaining distance information from a view
US9773302B2 (en) Three-dimensional object model tagging
CN107392958B (en) Method and device for determining object volume based on binocular stereo camera
JP6955783B2 (en) Information processing methods, equipment, cloud processing devices and computer program products
CN111582054B (en) Point cloud data processing method and device and obstacle detection method and device
CN107481271B (en) Stereo matching method, system and mobile terminal
EP3488424A1 (en) Systems and methods for improved surface normal estimation
CN112686877A (en) Binocular camera-based three-dimensional house damage model construction and measurement method and system
CN111412842A (en) Method, device and system for measuring cross-sectional dimension of wall surface
CN111553946A (en) Method and device for removing ground point cloud and obstacle detection method and device
CN113077476A (en) Height measurement method, terminal device and computer storage medium
CN115456945A (en) Chip pin defect detection method, detection device and equipment
CN116129037A (en) Visual touch sensor, three-dimensional reconstruction method, system, equipment and storage medium thereof
Chiang et al. Active stereo vision system with rotated structured light patterns and two-step denoising process for improved spatial resolution
Bormann et al. Fast and accurate normal estimation by efficient 3d edge detection
Örnek et al. From 2D to 3D: Re-thinking benchmarking of monocular depth prediction
CN114638891A (en) Target detection positioning method and system based on image and point cloud fusion
US10223803B2 (en) Method for characterising a scene by computing 3D orientation
Brink et al. Indexing Uncoded Stripe Patterns in Structured Light Systems by Maximum Spanning Trees.
Wang et al. LBP-based edge detection method for depth images with low resolutions
CN110969650B (en) Intensity image and texture sequence registration method based on central projection
CN116645418A (en) Screen button detection method and device based on 2D and 3D cameras and relevant medium thereof

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20220902