CN117132592B - Industrial defect detection method based on entropy fusion - Google Patents
- Publication number
- CN117132592B (application CN202311382011.6A)
- Authority
- CN
- China
- Prior art keywords
- image
- entropy
- feature
- defect
- value
- Prior art date
- Legal status: Active (the status listed is an assumption in the record, not a legal conclusion)
Classifications
- G06T 7/0004 — Image analysis; inspection of images; industrial image inspection
- G06N 3/0455 — Neural networks; auto-encoder networks; encoder-decoder networks
- G06N 3/0464 — Neural networks; convolutional networks [CNN, ConvNet]
- G06N 3/08 — Neural networks; learning methods
- G06T 5/40 — Image enhancement or restoration using histogram techniques
- G06T 5/50 — Image enhancement or restoration using two or more images, e.g. averaging or subtraction
- G06T 2207/10024 — Image acquisition modality; color image
- G06T 2207/20081 — Special algorithmic details; training, learning
- G06T 2207/20084 — Special algorithmic details; artificial neural networks [ANN]
- G06T 2207/20221 — Image combination; image fusion, image merging
- G06T 2207/30108 — Subject of image; industrial image inspection
- Y02P 90/30 — Climate change mitigation in production; computing systems specially adapted for manufacturing
Abstract
The invention relates to the technical field of defect detection and discloses an industrial defect detection method based on entropy fusion, comprising the following steps. Step one: acquire an image of the surface of an industrial product, normalize the image, and obtain an entropy diagram using a histogram-based entropy algorithm. Step two: construct a network model that adaptively fuses the entropy diagram with the image, obtain a feature map through an encoder, a defect enhancement module, and a decoder, and restore the feature map to its pre-encoding size. Step three: train the network model on a training data set, then input a test image into the trained model to obtain its defect detection result. By analyzing the entropy of the image, feature information can be extracted more effectively; the self-attention fusion of the entropy diagram and the image improves the robustness of the model; and defect information is obtained by comparing the pixel value distributions of images, improving defect detection precision.
Description
Technical Field
The invention relates to the field of defect detection, and in particular to an industrial defect detection method based on entropy fusion.
Background
Defect detection is an important task in industrial production lines, playing a key role in detecting wear and damage on products. With the rapid development of integrated circuit design, computer processing techniques, and vision models, vision-based defect detection has become practical. Images contain rich texture information, which aids feature learning for industrial products; however, images are highly sensitive to illumination, and in poor conditions such as dark environments many details are lost.
Existing industrial defect detection methods based on entropy fusion have the following problems: (1) they are generally suited to normal, well-lit environments, but in practical applications scenes are complex and variable; for example, in poor conditions such as night or dim light, acquired product images may be unclear and lose detail, degrading perception; (2) when information complementarity is sought by directly fusing the original image with its entropy diagram, the benefit for defect detection is not significant.
Disclosure of Invention
In order to solve the technical problems, the invention provides an industrial defect detection method based on entropy fusion.
In order to solve the technical problems, the invention adopts the following technical scheme:
an industrial defect detection method based on entropy fusion comprises the following steps:
step one: acquiring an image of the surface of an industrial product, normalizing the image, and obtaining an entropy diagram using a histogram-based entropy algorithm; dividing the images and entropy diagrams into a training data set and a test data set;
step two: constructing a network model, using an entropy-attention fusion mode to adaptively fuse the entropy diagram with the image, obtaining a feature map that highlights defect regions through the encoder, defect enhancement module, and decoder of the network model, and restoring the feature map to its pre-encoding size, specifically comprising:
step two A: extracting features from the image and the entropy diagram with one convolution layer each, then fusing them in a manner similar to the self-attention mechanism, as follows: encoding the image information into a query and a value, wherein the value is the image feature and the query is a description of the image feature value; encoding the entropy diagram information into a key feature vector; calculating the affinity scores between the query and the key, passing the scores through a Softmax function to obtain attention weights, and aggregating using the attention weights and the value:
$$F = X + \gamma \cdot \mathrm{Softmax}\!\left(QK^{T}\right)V$$

where $F$ is the fused feature, $\gamma$ is an adjustable parameter learned during training, the query $Q$ and value $V$ form the feature data pair of the image, the key $K$ is the feature vector obtained from the entropy diagram, $X$ is the feature map of the image, and $\mathrm{Softmax}(\cdot)$ denotes the Softmax function;
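The fusion described in step two A can be sketched in NumPy. This is a minimal sketch, not the patented implementation: the query/value projections are replaced by the identity, the $1/\sqrt{d}$ scaling is a common convention added here, and `gamma` is fixed rather than learned.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # numerically stable
    return e / e.sum(axis=axis, keepdims=True)

def entropy_attention_fusion(image_feat, entropy_feat, gamma=0.5):
    """Fuse image features (query/value) with entropy-diagram features (key).

    image_feat, entropy_feat: (N, C) arrays of N feature vectors.
    gamma: adjustable weight (learned during training in the real model).
    """
    Q = V = image_feat                          # query and value come from the image
    K = entropy_feat                            # key comes from the entropy diagram
    affinity = Q @ K.T / np.sqrt(Q.shape[1])    # affinity scores
    weights = softmax(affinity, axis=-1)        # attention weights
    return image_feat + gamma * (weights @ V)   # aggregate and add back the image features
```

With `gamma = 0` the fusion reduces to the plain image features, which makes the residual structure easy to verify.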
step two B: the first two layers of the encoder adopt the first two network layers of a ResNet-50 residual network, loaded with pre-trained weights; the latter two layers of the encoder use convolution layers;
after the fused features pass through the encoder, global and local defect feature enhancement is performed by a defect enhancement module to obtain the feature map: texture operators enhance defect features globally, and strip pooling enhances them locally;
step two C: gradually restoring the feature map obtained in step two B to its pre-encoding size through a decoder composed of four deconvolution layers;
step three: training the network model with the training data set, and inputting a test image from the test data set into the trained network model to obtain its defect detection result, specifically comprising:
step three A: inputting the training data set into the network model in batches, and after each full pass over the training data set, reshuffling the entire set and training again until the network model converges;
step three B: inputting the test image into the trained network model and performing defect detection to obtain the feature map of the segmented test image;
step three C: comparing the feature map obtained in step three B with the feature map of a defect-free template image, calculating the variance ratio $r$ of their pixel value distributions, and judging whether the ratio is smaller than a prescribed threshold: if so, a defect exists in the test image; if not, the test image has no defect;
$$r = \frac{\sigma_1^2}{\sigma_2^2}, \qquad \sigma_1^2 = \frac{1}{N}\sum_{i=1}^{N}\left(x_i - \mu_1\right)^2, \qquad \sigma_2^2 = \frac{1}{N}\sum_{i=1}^{N}\left(y_i - \mu_2\right)^2$$

where $\sigma_1^2$ is the pixel value distribution variance of the test image's feature map, $\sigma_2^2$ is that of the template image's feature map, $x_i$ and $y_i$ are the values of pixel $i$ in the test and template feature maps respectively, $N$ is the number of pixels, and $\mu_1$, $\mu_2$ are the corresponding mean pixel values.
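The decision rule of step three C can be sketched as follows. This is a minimal NumPy sketch under the stated rule; the threshold value itself is not specified in the source and is supplied by the caller.

```python
import numpy as np

def variance_ratio(test_feat, template_feat):
    """Ratio of the pixel-value distribution variances: test over template."""
    var_test = np.mean((test_feat - test_feat.mean()) ** 2)
    var_template = np.mean((template_feat - template_feat.mean()) ** 2)
    return var_test / var_template

def has_defect(test_feat, template_feat, threshold):
    """Per the method: a ratio below the prescribed threshold flags a defect."""
    return variance_ratio(test_feat, template_feat) < threshold
```

For identical feature maps the ratio is exactly 1, so the threshold effectively measures how far the test distribution may deviate from the template before a defect is declared.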
Further, the first step specifically includes:
step one A: setting the distance between the industrial product and the camera to enable the image to contain the surface of the whole industrial product;
step one B: randomly selecting a number of captured images and rotating them by different angles in the range −10° to 10°;
step one C: graying the acquired RGB image to obtain a single-channel image $I$; computing the histogram of the pixel values of $I$, normalizing it to obtain the occurrence probability $\mathrm{hist}(s)$ of each pixel value $s$, and accumulating the normalized histogram for threshold calculation and threshold segmentation:

$$P(t) = \sum_{s=0}^{t} \mathrm{hist}(s)$$

where $P(t)$ is the cumulative histogram sum up to gray level $t$ and $\mathrm{hist}(s)$ is the probability of occurrence of pixel value $s$;
step one D: calculating the entropy of each pixel value with the histogram entropy algorithm and accumulating:

$$H(t) = -\sum_{s=0}^{t} \mathrm{hist}(s)\,\ln \mathrm{hist}(s)$$

where $-\mathrm{hist}(s)\,\ln \mathrm{hist}(s)$ is the entropy contribution of pixel value $s$ and $H(t)$ is the accumulated entropy up to gray level $t$;
solving for the threshold used to obtain the entropy diagram: the value of $t$ that maximizes the function $f(t) = H_b(t) + H_w(t)$ is the threshold, where the functions $H_b$ and $H_w$ are:

$$H_b(t) = \ln P(t) + \frac{H(t)}{P(t)}$$

$$H_w(t) = \ln\left(1 - P(t)\right) + \frac{H(n) - H(t)}{1 - P(t)}$$

with $H(n)$ the total entropy over all gray levels; the single-channel image is then binarized at this threshold to obtain the entropy diagram.
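A runnable sketch of the histogram-entropy thresholding in steps one C and one D. The exact forms of $H_b$ and $H_w$ are reconstructed here as the standard maximum-entropy criterion consistent with the quantities $P(t)$ and $H(t)$ defined above; treat details such as the log base and the `eps` guard as implementation assumptions.

```python
import numpy as np

def entropy_threshold(gray, eps=1e-12):
    """Maximum-entropy threshold for a uint8 single-channel image.

    Normalized histogram -> cumulative P(t) and accumulated entropy H(t)
    -> pick the t that maximizes f(t) = Hb(t) + Hw(t).
    """
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    hist /= hist.sum()                           # hist(s): probability of value s
    P = np.cumsum(hist)                          # cumulative probability P(t)
    H = -np.cumsum(hist * np.log(hist + eps))    # accumulated entropy H(t)
    Hn = H[-1]                                   # total entropy H(n)
    best_t, best_f = 0, -np.inf
    for t in range(256):
        if P[t] < eps or P[t] > 1 - eps:
            continue                             # skip empty fore-/background splits
        Hb = np.log(P[t]) + H[t] / P[t]
        Hw = np.log(1 - P[t]) + (Hn - H[t]) / (1 - P[t])
        if Hb + Hw > best_f:
            best_f, best_t = Hb + Hw, t
    return best_t

def entropy_diagram(gray):
    """Binarize the single-channel image at the entropy threshold."""
    return (gray > entropy_threshold(gray)).astype(np.uint8)
```

On a bimodal image the returned threshold falls between the two modes, so the binarization separates them cleanly.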
Compared with the prior art, the invention has the following beneficial technical effects: by analyzing the entropy of the image, feature information can be extracted more effectively; the self-attention fusion of the entropy diagram and the image improves the robustness of the model; and by calculating the pixel value distribution ratio between images to obtain defect information, detection precision is improved.
Drawings
Fig. 1 is a schematic diagram of a network model according to the present invention.
Detailed Description
A preferred embodiment of the present invention will be described in detail with reference to the accompanying drawings.
As shown in fig. 1, the industrial defect detection method based on entropy fusion in the present embodiment includes:
step one: obtaining an image of the surface of the industrial product with a high-resolution camera, normalizing the image, and obtaining an entropy diagram using the histogram entropy algorithm. This specifically comprises the following steps:
step one A: set the distance between the industrial product and the camera so that the image contains the entire surface of the product; the image size is 2048 × 2048 (height and width both 2048 pixels), which is convenient for subsequent processing.
Step one B: on a practical production line, the camera may have a slight rotation, resulting in a rotation of the captured image; it is necessary to randomly select 30 images to be photographed and rotate them at different angles (rotation angle is-10 deg.).
Step C: graying the acquired RGB channel image to obtain a single-channel imageThe histogram is conveniently obtained and normalized;
map a single channelThe pixel value statistics of (2) is used as a histogram, and the histogram is normalized to obtain each pixel point +.>Probability hist(s) of occurrence, and then carrying out accumulated calculation on the normalized histogram for thresholding and threshold segmentation:
,/>;
wherein the method comprises the steps ofFor the sum of histograms>Is the probability of occurrence of a pixel of value s.
Step D: an entropy algorithm of a histogram with relatively high speed is adopted, the entropy algorithm has little influence on the speed of a network model, and entropy values of pixel values are calculated and accumulated firstly:
,/>;
wherein the method comprises the steps ofIs the entropy value.
Solving for a threshold for obtaining a entropy diagram: calculating a functionThe maximum t value is the threshold value; performing binarization on the single-channel graph according to the threshold value to obtain a mask graph related to entropy; wherein the function->、/>The following are provided:
;
。
step two: construct a network model that uses the entropy-attention fusion mode to adaptively fuse entropy diagrams and images, obtain a feature map that highlights defect regions through the encoder, defect enhancement module, and decoder, and restore the feature map to its original size.
Because the mask image contains some noise points, it is denoised by a morphological opening operation before being used for fusion. Images contain rich texture detail that helps feature extraction, but they are easily affected by illumination, and the loss of detail under poor lighting degrades the network model. Therefore, adaptive fusion of the image and the entropy diagram is used, capturing the dependency between them through the long-range context modeling of attention. This specifically comprises the following steps:
step two A: first extract features from the entropy diagram and the image with a convolution layer, then fuse them in a manner similar to the self-attention mechanism: obtain a query and a value from the image and a key feature vector from the entropy diagram, compute the affinity scores between the query and the key, pass the scores through a Softmax function to obtain attention weights, and aggregate using the attention weights and the value:

$$F = X + \gamma \cdot \mathrm{Softmax}\!\left(QK^{T}\right)V$$

where $F$ is the fused feature, $\gamma$ is an adjustable parameter learned during training, the query $Q$ and value $V$ form the feature data pair of the image, the key $K$ is the feature vector obtained from the entropy diagram, $X$ is the feature map of the image, and $\mathrm{Softmax}(\cdot)$ denotes the Softmax function.
Step two, B: the first two network layers res block of ResNet-50 are adopted in the first two layers of the encoder, and pre-training is loaded; the two later layers use the common core asIs a convolution layer conv of (1);
After the fused features pass through the encoder layers, global and local defect feature enhancement is performed by the defect enhancement module to obtain the feature map: texture operators are used globally and strip pooling is used locally.
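The local branch's strip pooling can be illustrated with a minimal NumPy sketch. Assumption: the real defect enhancement module would combine the strip statistics through learned convolutions and a gating function; here they are simply added back, which is enough to show how strip pooling propagates responses along elongated defects.

```python
import numpy as np

def strip_pooling(feat):
    """Augment a (H, W) feature map with its row-strip and column-strip means."""
    row_pool = feat.mean(axis=1, keepdims=True)  # (H, 1): one mean per horizontal strip
    col_pool = feat.mean(axis=0, keepdims=True)  # (1, W): one mean per vertical strip
    return feat + row_pool + col_pool            # broadcast the strips back over the map
```

A bright horizontal scratch raises its entire row-strip mean, so every pixel along that row is boosted even where the scratch response itself is weak.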
Step two, C: the feature map is gradually restored to the original size by a Decoder composed of four deconvolution layers.
Step three: training the network model, and performing defect detection on the test image by using the trained network to verify the validity of the model. The method comprises the following steps:
step three A: input the training data set into the constructed network model in batches; after each full pass over the training data set, reshuffle the entire set and train again until the network model converges.
Specifically, the parameters of the network model are randomly initialized and optimized with the Adam optimizer. The target number of iterations is set to 100000 / (number of images in the training data set), and the iteration count increases by 1 each time the training images are fully traversed. After iteration finishes, the resulting network model is saved as the final model used for defect detection.
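The batching-and-reshuffling loop of step three A can be sketched in plain Python. `model_step` is a hypothetical stand-in for one optimizer update on a batch; a real implementation would also monitor a convergence criterion rather than only a fixed iteration budget.

```python
import random

def train(model_step, dataset, batch_size, target_epochs):
    """Feed the data in batches; reshuffle the whole set after every full pass."""
    data = list(dataset)
    for _ in range(target_epochs):        # until the iteration budget (or convergence)
        random.shuffle(data)              # re-shuffle the entire training set
        for start in range(0, len(data), batch_size):
            model_step(data[start:start + batch_size])
    return target_epochs
```

Each epoch visits every sample exactly once, so the total number of samples processed is the data-set size times the epoch count.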
Step three B: and putting the shot image into a trained network model, and performing defect detection to obtain a segmented notebook computer surface characteristic image.
Step three C: comparing the feature image obtained in the step III with the feature image of the template image without defects, and calculating the variance ratio of pixel value distribution of the two imagesJudging whether the variance ratio is smaller than a prescribed threshold value; if yes, the defect exists in the test image; if not, no defect exists in the test image;
$$r = \frac{\sigma_1^2}{\sigma_2^2}, \qquad \sigma_1^2 = \frac{1}{N}\sum_{i=1}^{N}\left(x_i - \mu_1\right)^2, \qquad \sigma_2^2 = \frac{1}{N}\sum_{i=1}^{N}\left(y_i - \mu_2\right)^2$$

where $\sigma_1^2$ is the pixel value distribution variance of the test image's feature map, $\sigma_2^2$ is that of the template image's feature map, $x_i$ and $y_i$ are the values of pixel $i$ in the test and template feature maps respectively, $N$ is the number of pixels, and $\mu_1$, $\mu_2$ are the corresponding mean pixel values.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
Furthermore, it should be understood that although this specification describes the embodiments separately, each embodiment is not necessarily an independent technical solution; the description is organized this way only for clarity, and those skilled in the art should treat the specification as a whole, combining the embodiments as appropriate to form other implementations they will understand.
Claims (2)
1. An industrial defect detection method based on entropy fusion, comprising the following steps:
step one: acquiring an image of the surface of an industrial product, normalizing the image, and obtaining an entropy diagram using a histogram-based entropy algorithm; dividing the images and entropy diagrams into a training data set and a test data set;
step two: constructing a network model, using an entropy-attention fusion mode to adaptively fuse the entropy diagram with the image, obtaining a feature map that highlights defect regions through the encoder, defect enhancement module, and decoder of the network model, and restoring the feature map to its pre-encoding size, specifically comprising:
step two A: extracting features from the image and the entropy diagram with one convolution layer each, then fusing them in a manner similar to the self-attention mechanism, as follows: encoding the image information into a query and a value, wherein the value is the image feature and the query is a description of the image feature value; encoding the entropy diagram information into a key feature vector; calculating the affinity scores between the query and the key, passing the scores through a Softmax function to obtain attention weights, and aggregating using the attention weights and the value:
$$F = X + \gamma \cdot \mathrm{Softmax}\!\left(QK^{T}\right)V$$

where $F$ is the fused feature, $\gamma$ is an adjustable parameter learned during training, the query $Q$ and value $V$ form the feature data pair of the image, the key $K$ is the feature vector obtained from the entropy diagram, $X$ is the feature map of the image, and $\mathrm{Softmax}(\cdot)$ denotes the Softmax function;
step two B: the first two layers of the encoder adopt the first two network layers of a ResNet-50 residual network, loaded with pre-trained weights; the latter two layers of the encoder use convolution layers;
after the fused features pass through the encoder, global and local defect feature enhancement is performed by a defect enhancement module to obtain the feature map: texture operators enhance defect features globally, and strip pooling enhances them locally;
step two C: gradually restoring the feature map obtained in step two B to its pre-encoding size through a decoder composed of four deconvolution layers;
step three: training the network model with the training data set, specifically comprising:
inputting the training data set into the network model in batches, and after each full pass over the training data set, reshuffling the entire set and training again until the network model converges;
step four: inputting a test image from the test data set into the trained network model to obtain its defect detection result, specifically comprising:
inputting the test image into the trained network model and performing defect detection to obtain the feature map of the segmented test image;
comparing the obtained feature map with the feature map of a defect-free template image, calculating the variance ratio $r$ of their pixel value distributions, and judging whether the ratio is smaller than a prescribed threshold: if so, a defect exists in the test image; if not, the test image has no defect;
$$r = \frac{\sigma_1^2}{\sigma_2^2}, \qquad \sigma_1^2 = \frac{1}{N}\sum_{i=1}^{N}\left(x_i - \mu_1\right)^2, \qquad \sigma_2^2 = \frac{1}{N}\sum_{i=1}^{N}\left(y_i - \mu_2\right)^2$$

where $\sigma_1^2$ is the pixel value distribution variance of the test image's feature map, $\sigma_2^2$ is that of the template image's feature map, $x_i$ and $y_i$ are the values of pixel $i$ in the test and template feature maps respectively, $N$ is the number of pixels, and $\mu_1$, $\mu_2$ are the corresponding mean pixel values.
2. The industrial defect detection method based on entropy fusion according to claim 1, wherein step one specifically comprises:
step one A: setting the distance between the industrial product and the camera to enable the image to contain the surface of the whole industrial product;
step one B: randomly selecting a number of captured images and rotating them by different angles in the range −10° to 10°;
step one C: graying the acquired RGB image to obtain a single-channel image $I$; computing the histogram of the pixel values of $I$, normalizing it to obtain the occurrence probability $\mathrm{hist}(s)$ of each pixel value $s$, and accumulating the normalized histogram for threshold calculation and threshold segmentation:

$$P(t) = \sum_{s=0}^{t} \mathrm{hist}(s)$$

where $P(t)$ is the cumulative histogram sum up to gray level $t$ and $\mathrm{hist}(s)$ is the probability of occurrence of pixel value $s$;
step one D: calculating the entropy of each pixel value with the histogram entropy algorithm and accumulating:

$$H(t) = -\sum_{s=0}^{t} \mathrm{hist}(s)\,\ln \mathrm{hist}(s)$$

where $-\mathrm{hist}(s)\,\ln \mathrm{hist}(s)$ is the entropy contribution of pixel value $s$ and $H(t)$ is the accumulated entropy;
solving for the threshold used to obtain the entropy diagram: the value of $t$ that maximizes the function $f(t) = H_b(t) + H_w(t)$ is the threshold, where the functions $H_b$ and $H_w$ are:

$$H_b(t) = \ln P(t) + \frac{H(t)}{P(t)}$$

$$H_w(t) = \ln\left(1 - P(t)\right) + \frac{H(n) - H(t)}{1 - P(t)}$$

with $H(n)$ the total entropy over all gray levels;
and binarizing the single-channel image at this threshold to obtain the entropy diagram.
Priority Applications (1)
- CN202311382011.6A (CN117132592B) — priority/filing date 2023-10-24 — Industrial defect detection method based on entropy fusion
Publications (2)
- CN117132592A — published 2023-11-28
- CN117132592B — granted 2024-01-26
Family
- ID: 88860343
- Family application: CN202311382011.6A, filed 2023-10-24 in China, granted as CN117132592B (active)
Citations (7)
- CN110246108A (priority 2018-11-21, published 2019-09-17, 浙江大华技术股份有限公司) — Image processing method, device and computer-readable storage medium
- US10901715B1 (priority 2019-09-26, published 2021-01-26, Jonathan Raiman) — Lazy compilation and kernel fusion in dynamic computation graphs
- CN113240613A (priority 2021-06-07, published 2021-08-10, 北京航空航天大学) — Image restoration method based on edge information reconstruction
- CN113554629A (priority 2021-07-28, published 2021-10-26, 江苏苏桥焊材有限公司) — Strip steel red rust defect detection method based on artificial intelligence
- CN113822885A (priority 2021-11-23, published 2021-12-21, 常州微亿智造科技有限公司) — Workpiece defect detection method and device fusing multiple attention mechanisms
- CN114708189A (priority 2022-02-24, published 2022-07-05, 中北大学) — Deep-learning-based multi-energy X-ray image fusion method and device
- CN116152244A (priority 2023-04-19, published 2023-05-23, 合肥综合性国家科学中心人工智能研究院(安徽省人工智能实验室)) — SMT defect detection method and system
Family Cites Families (1)
- CN114913565B (priority 2021-01-28, granted 2023-11-17, 腾讯科技(深圳)有限公司) — Face image detection method, model training method, device and storage medium
Non-Patent Citations (3)
- Hang Li et al., "Multimodal learning for classroom activity detection", IEEE (full text)
- Tao Wang et al., "Adaptive Diagnosis for Transformer With Unknown Faults Based on Antenna-Augmented RFID Sensor and Deep Learning", IEEE, vol. 23, no. 17 (full text)
- Niu Qian, "Research on surface defect detection technology for diode glass shells", CNKI (full text)
Legal Events
- PB01 — Publication
- SE01 — Entry into force of request for substantive examination
- GR01 — Patent grant