CN111429431A - Element positioning and identifying method based on convolutional neural network


Info

Publication number
CN111429431A
CN111429431A
Authority
CN
China
Prior art keywords
neural network, convolutional neural, candidate, confidence, box
Prior art date
Legal status
Granted
Application number
CN202010212444.7A
Other languages
Chinese (zh)
Other versions
CN111429431B (en)
Inventor
陈玮钰
金智超
Current Assignee
Shenzhen Zhenbang Technology Co ltd
Original Assignee
Shenzhen Zhenbang Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Zhenbang Technology Co ltd
Priority to CN202010212444.7A
Publication of CN111429431A
Application granted
Publication of CN111429431B
Legal status: Active
Anticipated expiration

Classifications

    • G06T 7/0004 Industrial image inspection
    • G06N 3/044 Recurrent networks, e.g. Hopfield networks
    • G06N 3/045 Combinations of networks
    • G06N 3/08 Learning methods
    • G06T 2207/20081 Training; Learning
    • G06T 2207/20084 Artificial neural networks [ANN]
    • G06T 2207/30141 Printed circuit board [PCB]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the field of PCB component defect detection and provides a component positioning and identification method based on a convolutional neural network, comprising the following steps: S1: defining the types of PCB components, drawing a bounding box around the position of each component in the PCB image, labeling its type to produce data samples, and augmenting the data samples to form training samples; S2: constructing a convolutional neural network model using the data samples; S3: training the constructed convolutional neural network with the training samples; S4: analyzing images containing various electronic components with the trained convolutional neural network and locating the positions of the different types of electronic components. Because the constructed convolutional neural network can, after training, recognize the components in a PCB image and confirm their positions, the input PCB image no longer requires manual component boxing; boxing and recognition are performed by the convolutional neural network instead, which reduces manpower and material costs, raises the degree of automation, and improves working efficiency.

Description

Element positioning and identifying method based on convolutional neural network
Technical Field
The invention belongs to the field of PCB element defect detection, and particularly relates to an element positioning and identifying method based on a convolutional neural network.
Background
In the prior art, defect detection of PCB components generally uses AOI technology to locate the components. AOI (Automated Optical Inspection) equipment detects the common defects encountered in soldering production based on optical principles.
An AOI system requires an operator to manually box each component and manually specify its type. Manual boxing is time-consuming and labor-intensive, the component types must be defined by hand, and the system cannot recognize the boxed images on its own, so the AOI system's component detection on PCB images has a low degree of automation, which hurts working efficiency.
Disclosure of Invention
The invention aims to provide a component positioning and identification method based on a convolutional neural network, so as to solve the problem that AOI systems have a low degree of automation when positioning components in a PCB image.
The invention is realized as follows: a component positioning and identification method based on a convolutional neural network comprises the following steps:
S1: defining the types of PCB components, drawing a bounding box around the position of each component in the PCB image, labeling its type to produce data samples, and augmenting the data samples to form training samples;
S2: constructing a convolutional neural network model using the data samples;
S3: training the constructed convolutional neural network with the training samples;
S4: analyzing images containing various electronic components with the trained convolutional neural network and locating the positions of the different types of electronic components.
In a further technical scheme of the invention, the data samples in step S1 include positive and negative samples: the boxed regions of the PCB image are positive samples, and the unboxed regions are negative samples.
In a further technical scheme of the invention, the augmentation methods applied to the data samples in step S1 include image flipping, random cropping and Gaussian noise.
In a further technical scheme of the invention, step S2 includes the following steps:
S21: establishing a backbone network structure consisting of 20 convolutional layers;
S22: convolving the feature layers of the convolutional layers with two different 3×3 convolution kernels to output a classification confidence and regression coordinate values, respectively.
In a further technical scheme of the invention, step S22 includes the following steps:
S22A: the system sets a threshold for the IoU (intersection over union) value;
S22B: generating a series of concentric candidate boxes centered on the center point of the component on the feature layer;
S22C: convolving each candidate box with the first 3×3 convolution kernel to obtain the confidence of each candidate box;
S22D: convolving each candidate box with the second 3×3 convolution kernel to obtain the coordinate values of each candidate box.
In a further technical scheme of the invention, step S2 further includes the following steps:
S23: adding all candidate boxes to a candidate box list and sorting them by confidence value;
S24: computing the IoU value between the candidate box with the highest confidence and each other candidate box, and comparing it with the IoU threshold set in step S22A; if a candidate box's IoU value is greater than the threshold, that box is deleted from the list and the highest-confidence candidate box is added to the output list; if a candidate box's IoU value is not greater than the threshold, it remains in the candidate box list;
S25: when no candidate box whose IoU value is greater than the threshold remains in the candidate box list, returning to step S23;
S26: outputting the confidences and coordinate values of the candidate boxes in the output list.
In a further technical scheme of the invention, step S3 includes the following steps:
S31: the system inputs the training samples into the constructed convolutional neural network;
S32: the convolutional neural network performs convolution calculations on the training samples and updates the confidences, coordinate values and component types of the candidate boxes in the output list of step S26.
In a further technical scheme of the invention, step S4 includes the following steps:
S41: inputting the target PCB image into the constructed convolutional neural network, boxing out each component, and computing the confidence of each boxed region;
S42: comparing the confidence of each boxed region with the confidences of the candidate boxes in the output list of step S32; if a candidate box with the same confidence as the boxed region exists, the target PCB image is judged to contain the component, and the convolutional neural network outputs the component's coordinate values and type; if no candidate box with the same or a similar confidence exists, the target PCB image is judged not to contain the component, and the network produces no output for it.
The beneficial effects of the invention are as follows: because the constructed convolutional neural network can, after training, recognize the components in a PCB image and confirm their positions, the input PCB image no longer needs manual component boxing; boxing and recognition are instead performed by the convolutional neural network, which greatly reduces manpower and material costs, raises the degree of automation, and improves working efficiency.
Drawings
FIG. 1 is a general flow chart of the present invention;
FIG. 2 is a flowchart of step S2 of the present invention;
FIG. 3 is a flowchart of step S22 of the present invention;
FIG. 4 is a flowchart of step S3 of the present invention;
Fig. 5 is a flowchart of step S4 of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be noted that the terms "front," "back," "left," "right," "upper" and "lower" used in the following description refer to directions in the drawings, and the terms "bottom" and "top," "inner" and "outer" refer to directions toward and away from, respectively, the geometric center of a particular component.
Fig. 1 shows the component positioning and identification method based on a convolutional neural network provided by the invention; the method comprises the following steps:
S1: defining the types of PCB components, drawing a bounding box around the position of each component in the PCB image, labeling its type to produce data samples, and augmenting the data samples to form training samples;
S2: constructing a convolutional neural network model using the data samples;
S3: training the constructed convolutional neural network with the training samples;
S4: analyzing images containing various electronic components with the trained convolutional neural network and locating the positions of the different types of electronic components.
Preferably, the data samples in step S1 include positive and negative samples, where the boxed regions of the PCB image are positive samples and the unboxed regions are negative samples.
Preferably, the augmentation methods applied to the data samples in step S1 include image flipping, random cropping and Gaussian noise.
Preferably, step S2 includes the following steps:
S21: establishing a backbone network structure consisting of 20 convolutional layers;
S22: convolving the feature layers of the convolutional layers with two different 3×3 convolution kernels to output a classification confidence and regression coordinate values, respectively.
Preferably, step S22 includes the following steps:
S22A: the system sets a threshold for the IoU (intersection over union) value;
S22B: generating a series of concentric candidate boxes centered on the center point of the component on the feature layer;
S22C: convolving each candidate box with the first 3×3 convolution kernel to obtain the confidence of each candidate box;
S22D: convolving each candidate box with the second 3×3 convolution kernel to obtain the coordinate values of each candidate box.
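Steps S22C and S22D amount to sliding two independent 3×3 kernels over the same feature layer, one head yielding a confidence map and the other coordinate values. The following NumPy sketch illustrates that dual-head convolution under stated assumptions: the patent gives no implementation, so the random kernels and random feature layer merely stand in for trained weights and real activations.

```python
import numpy as np

def conv3x3(feat: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid (no padding) 3x3 convolution of a 2-D feature map with one kernel."""
    h, w = feat.shape
    out = np.empty((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(feat[i:i + 3, j:j + 3] * kernel)
    return out

rng = np.random.default_rng(0)
feature_layer = rng.standard_normal((8, 8))   # stand-in feature layer
k_conf = rng.standard_normal((3, 3))          # first head: classification confidence
k_coord = rng.standard_normal((3, 3))         # second head: regression coordinates

confidence_map = conv3x3(feature_layer, k_conf)
coordinate_map = conv3x3(feature_layer, k_coord)
```

The point of the sketch is only that both heads read the identical feature layer; they differ solely in their learned kernel weights.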
Preferably, step S2 further includes the following steps:
S23: adding all candidate boxes to a candidate box list and sorting them by confidence value;
S24: computing the IoU value between the candidate box with the highest confidence and each other candidate box, and comparing it with the IoU threshold set in step S22A; if a candidate box's IoU value is greater than the threshold, that box is deleted from the list and the highest-confidence candidate box is added to the output list; if a candidate box's IoU value is not greater than the threshold, it remains in the candidate box list;
S25: when no candidate box whose IoU value is greater than the threshold remains in the candidate box list, returning to step S23;
S26: outputting the confidences and coordinate values of the candidate boxes in the output list.
Preferably, step S3 includes the following steps:
S31: the system inputs the training samples into the constructed convolutional neural network;
S32: the convolutional neural network performs convolution calculations on the training samples and updates the confidences, coordinate values and component types of the candidate boxes in the output list of step S26.
Preferably, step S4 includes the following steps:
S41: inputting the target PCB image into the constructed convolutional neural network, boxing out each component, and computing the confidence of each boxed region;
S42: comparing the confidence of each boxed region with the confidences of the candidate boxes in the output list of step S32; if a candidate box with the same confidence as the boxed region exists, the target PCB image is judged to contain the component, and the convolutional neural network outputs the component's coordinate values and type; if no candidate box with the same or a similar confidence exists, the target PCB image is judged not to contain the component, and the network produces no output for it.
A convolutional neural network (CNN) is a class of feed-forward neural networks that contains convolution calculations and has a deep structure, and is one of the representative algorithms of deep learning. A convolutional neural network has feature-learning capability: an operator can train it on a large number of data samples so that it forms a memory of a given pattern in images; during detection and recognition, if the confidence computed for a pattern in the input image matches or is close to the confidence stored in the network's memory, the network can roughly determine whether the image contains that pattern. The invention applies this property of convolutional neural networks to the recognition of components in PCB images.
In the invention, the components in the PCB image are boxed and the type of each component is labeled to produce data samples; the boxed images serve as positive samples and the unboxed images as negative samples. Both take part in training and are used to train a classification detection model, for which at least 1000 PCB images containing components of various types are required. The data samples are then augmented to form training samples; the augmentation methods include image flipping, random cropping and Gaussian noise.
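As a rough illustration of the three augmentation operations named above (flipping, random cropping, Gaussian noise), the following NumPy sketch shows one plausible implementation; the function names, crop size and noise level are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

def flip(img: np.ndarray) -> np.ndarray:
    """Horizontal flip (mirror along the width axis)."""
    return img[:, ::-1].copy()

def random_crop(img: np.ndarray, crop_h: int, crop_w: int,
                rng: np.random.Generator) -> np.ndarray:
    """Cut a random crop_h x crop_w window out of the image."""
    h, w = img.shape[:2]
    top = rng.integers(0, h - crop_h + 1)
    left = rng.integers(0, w - crop_w + 1)
    return img[top:top + crop_h, left:left + crop_w].copy()

def gaussian_noise(img: np.ndarray, sigma: float,
                   rng: np.random.Generator) -> np.ndarray:
    """Add zero-mean Gaussian noise and clip back to the 8-bit range."""
    noisy = img.astype(np.float64) + rng.normal(0.0, sigma, img.shape)
    return np.clip(noisy, 0, 255).astype(img.dtype)

rng = np.random.default_rng(0)
sample = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)  # stand-in PCB patch
augmented = [flip(sample),
             random_crop(sample, 48, 48, rng),
             gaussian_noise(sample, 10.0, rng)]
```

In practice each labeled sample would be expanded into several augmented variants, multiplying the size of the training set without new manual boxing.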
The data samples are then fed into the backbone of the convolutional neural network. The backbone structure contains 20 convolutional layers, of which 5 different feature layers are each convolved with two different 3×3 convolution kernels. Before the convolution operations, the operator sets an IoU threshold in the backbone; the IoU (intersection over union) value is the overlap ratio between a candidate box and the target box, i.e. the ratio of the area of their intersection to the area of their union, and is used later to screen the candidate boxes. The neural network then generates a series of concentric candidate boxes centered on the center point of each component. The first convolution kernel is used to obtain the confidence of each candidate box, and the second to obtain its coordinate values.
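The intersection-over-union ratio just defined can be written out in a few lines; this helper is an illustration of the standard formula rather than code from the patent, and it assumes boxes given as (x1, y1, x2, y2) corner coordinates.

```python
def iou(box_a, box_b):
    """Ratio of intersection area to union area of two
    axis-aligned boxes given as (x1, y1, x2, y2) corners."""
    ix1 = max(box_a[0], box_b[0])          # intersection rectangle corners
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter        # union = A + B - intersection
    return inter / union if union > 0 else 0.0

# Two 2x2 boxes offset by one unit overlap in a 1x1 square,
# so IoU = 1 / (4 + 4 - 1) = 1/7.
overlap = iou((0, 0, 2, 2), (1, 1, 3, 3))
```

An IoU of 1 means identical boxes and 0 means no overlap, which is why a single threshold on this value suffices for the screening step below.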
Next, the candidate boxes are added to a candidate box list and sorted by confidence value. The candidate box with the highest confidence is picked, its IoU value with each of the other candidate boxes is computed, and each IoU value is compared with the threshold the operator set earlier. If a candidate box's IoU value is not greater than the threshold, it remains in the candidate box list; if it is greater than the threshold, that box is deleted and the highest-confidence candidate box is added to the output list, after which the highest-confidence box remaining in the candidate list is selected again and the IoU computation and comparison are repeated. Once these operations have been repeated, the final output list holds the high-confidence candidate box information for each component, including the box's confidence, coordinate values and component type.
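The screening loop described above is essentially greedy non-maximum suppression (NMS). The sketch below illustrates that standard procedure generically rather than reproducing the patent's exact implementation; it assumes each candidate is a (confidence, box) pair, and it restates a small IoU helper so the example is self-contained.

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0

def nms(candidates, iou_threshold):
    """Greedy non-maximum suppression over (confidence, box) pairs.

    Sort by confidence (cf. step S23), then repeatedly move the best
    box to the output list and drop every remaining box whose IoU with
    it exceeds the threshold (cf. steps S24-S26).
    """
    remaining = sorted(candidates, key=lambda c: c[0], reverse=True)
    output = []
    while remaining:
        best = remaining.pop(0)          # highest-confidence box
        output.append(best)              # goes to the output list
        remaining = [c for c in remaining
                     if iou(best[1], c[1]) <= iou_threshold]
    return output
```

With a threshold of, say, 0.5, two heavily overlapping detections of the same component collapse to the single higher-confidence box, while boxes on distinct components survive untouched.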
Once the convolutional neural network has been constructed, training samples can be fed into it. After the training samples pass through the operations above, the candidate box data in the final output list are updated, making the information in the output list more accurate and lowering the error rate of subsequent recognition and detection.
During component recognition and detection, the PCB image is input into the constructed convolutional neural network. The network automatically boxes candidate regions for each component and obtains their confidences through convolution operations, then compares these confidences with those of the candidate boxes in the updated output list. If a candidate box with the same or a similar confidence exists, the network considers it most probable that the component is present in the image and outputs the component's type and coordinate values; otherwise, nothing is output for that component. With this method, after an operator inputs a PCB image to be recognized into the convolutional neural network, the component types and positions are ultimately output on the image, achieving automatic positioning and recognition, saving manpower and material resources, improving recognition accuracy and production efficiency, and shortening editing time.
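The decision rule of step S42 compares a newly computed confidence against the stored output-list confidences. A literal toy rendering of that comparison follows; the tolerance `eps` is an assumption, since the patent says "same or similar" without quantifying it, and the stored values are hypothetical.

```python
def component_present(detected_conf, stored_confs, eps=0.05):
    """Return True if any stored candidate-box confidence matches the
    detected confidence exactly or within the tolerance eps (assumed)."""
    return any(abs(detected_conf - s) <= eps for s in stored_confs)

# Hypothetical confidences from the trained output list.
stored = [0.93, 0.81, 0.77]
```

For example, a boxed region scored at 0.92 would match the stored 0.93 within the assumed tolerance and be reported, while one scored at 0.50 would match nothing and be silently dropped.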
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (8)

1. A component positioning and identification method based on a convolutional neural network, characterized by comprising the following steps:
S1: defining the types of PCB components, drawing a bounding box around the position of each component in the PCB image, labeling its type to produce data samples, and augmenting the data samples to form training samples;
S2: constructing a convolutional neural network model using the data samples;
S3: training the constructed convolutional neural network with the training samples;
S4: analyzing images containing various electronic components with the trained convolutional neural network and locating the positions of the different types of electronic components.
2. The component positioning and identification method based on a convolutional neural network according to claim 1, wherein the data samples in step S1 include positive and negative samples, the boxed regions of the PCB image being positive samples and the unboxed regions negative samples.
3. The component positioning and identification method based on a convolutional neural network according to claim 2, wherein the augmentation methods applied to the data samples in step S1 include image flipping, random cropping and Gaussian noise.
4. The component positioning and identification method based on a convolutional neural network according to claim 3, wherein step S2 comprises the following steps:
S21: establishing a backbone network structure consisting of 20 convolutional layers;
S22: convolving the feature layers of the convolutional layers with two different 3×3 convolution kernels to output a classification confidence and regression coordinate values, respectively.
5. The component positioning and identification method based on a convolutional neural network according to claim 4, wherein step S22 comprises the following steps:
S22A: the system sets a threshold for the IoU (intersection over union) value;
S22B: generating a series of concentric candidate boxes centered on the center point of the component on the feature layer;
S22C: convolving each candidate box with the first 3×3 convolution kernel to obtain the confidence of each candidate box;
S22D: convolving each candidate box with the second 3×3 convolution kernel to obtain the coordinate values of each candidate box.
6. The component positioning and identification method based on a convolutional neural network according to claim 5, wherein step S2 further comprises the following steps:
S23: adding all candidate boxes to a candidate box list and sorting them by confidence value;
S24: computing the IoU value between the candidate box with the highest confidence and each other candidate box, and comparing it with the IoU threshold set in step S22A; if a candidate box's IoU value is greater than the threshold, that box is deleted from the list and the highest-confidence candidate box is added to the output list; if a candidate box's IoU value is not greater than the threshold, it remains in the candidate box list;
S25: when no candidate box whose IoU value is greater than the threshold remains in the candidate box list, returning to step S23;
S26: outputting the confidences and coordinate values of the candidate boxes in the output list.
7. The component positioning and identification method based on a convolutional neural network according to claim 6, wherein step S3 comprises the following steps:
S31: the system inputs the training samples into the constructed convolutional neural network;
S32: the convolutional neural network performs convolution calculations on the training samples and updates the confidences, coordinate values and component types of the candidate boxes in the output list of step S26.
8. The component positioning and identification method based on a convolutional neural network according to claim 7, wherein step S4 comprises the following steps:
S41: inputting the target PCB image into the constructed convolutional neural network, boxing out each component, and computing the confidence of each boxed region;
S42: comparing the confidence of each boxed region with the confidences of the candidate boxes in the output list of step S32; if a candidate box with the same confidence as the boxed region exists, the target PCB image is judged to contain the component, and the convolutional neural network outputs the component's coordinate values and type; if no candidate box with the same or a similar confidence exists, the target PCB image is judged not to contain the component, and the network produces no output for it.
CN202010212444.7A 2020-03-24 2020-03-24 Element positioning and identifying method based on convolutional neural network Active CN111429431B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010212444.7A CN111429431B (en) 2020-03-24 2020-03-24 Element positioning and identifying method based on convolutional neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010212444.7A CN111429431B (en) 2020-03-24 2020-03-24 Element positioning and identifying method based on convolutional neural network

Publications (2)

Publication Number Publication Date
CN111429431A 2020-07-17
CN111429431B 2023-09-19

Family

ID=71555435

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010212444.7A Active CN111429431B (en) 2020-03-24 2020-03-24 Element positioning and identifying method based on convolutional neural network

Country Status (1)

Country Link
CN (1) CN111429431B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112643618A (en) * 2020-12-21 2021-04-13 东风汽车集团有限公司 Intelligent adjusting device and method for flexible engine warehousing tool
CN113030121A (en) * 2021-03-11 2021-06-25 微讯智造(广州)电子有限公司 Automatic optical detection method, system and equipment for circuit board components
CN113674207A (en) * 2021-07-21 2021-11-19 电子科技大学 Automatic PCB component positioning method based on graph convolution neural network

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109285139A (en) * 2018-07-23 2019-01-29 同济大学 A kind of x-ray imaging weld inspection method based on deep learning
CN109344753A (en) * 2018-09-21 2019-02-15 福州大学 A kind of tiny fitting recognition methods of Aerial Images transmission line of electricity based on deep learning
CN109409517A (en) * 2018-09-30 2019-03-01 北京字节跳动网络技术有限公司 The training method and device of object detection network
CN109859207A (en) * 2019-03-06 2019-06-07 华南理工大学 A kind of defect inspection method of high density flexible substrate
CN110717456A (en) * 2019-10-10 2020-01-21 北京百度网讯科技有限公司 Object monitoring method, device, system, electronic equipment and storage medium
CN110889421A (en) * 2018-09-07 2020-03-17 杭州海康威视数字技术股份有限公司 Target detection method and device


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112643618A (en) * 2020-12-21 2021-04-13 东风汽车集团有限公司 Intelligent adjusting device and method for flexible engine warehousing tool
CN113030121A (en) * 2021-03-11 2021-06-25 微讯智造(广州)电子有限公司 Automatic optical detection method, system and equipment for circuit board components
CN113674207A (en) * 2021-07-21 2021-11-19 电子科技大学 Automatic PCB component positioning method based on graph convolution neural network
CN113674207B (en) * 2021-07-21 2023-04-07 电子科技大学 Automatic PCB component positioning method based on graph convolution neural network

Also Published As

Publication number Publication date
CN111429431B (en) 2023-09-19

Similar Documents

Publication Publication Date Title
CN109919934B (en) Liquid crystal panel defect detection method based on multi-source domain deep transfer learning
CN110059694B (en) Intelligent identification method for character data in complex scene of power industry
CN110175982B (en) Defect detection method based on target detection
CN111062915B (en) Real-time steel pipe defect detection method based on improved YOLOv3 model
CN111429431B (en) Element positioning and identifying method based on convolutional neural network
CN109683360B (en) Liquid crystal panel defect detection method and device
CN111783772A (en) Grabbing detection method based on RP-ResNet network
CN109447979B (en) Target detection method based on deep learning and image processing algorithm
CN112365497A (en) High-speed target detection method and system based on Trident Net and Cascade-RCNN structures
CN113222913B (en) Circuit board defect detection positioning method, device and storage medium
CN112365491A (en) Method for detecting welding seam of container, electronic equipment and storage medium
US20220076404A1 (en) Defect management apparatus, method and non-transitory computer readable medium
CN108133235A (en) A kind of pedestrian detection method based on neural network Analysis On Multi-scale Features figure
CN113420619A (en) Remote sensing image building extraction method
CN115147418B (en) Compression training method and device for defect detection model
CN113012153A (en) Aluminum profile flaw detection method
CN114429445A (en) PCB defect detection and identification method based on MAIRNet
CN114359932B (en) Text detection method, text recognition method and device
CN116258175A (en) Weld defect intelligent recognition model evolution method based on active learning
CN115546586A (en) Method and device for detecting infrared dim target, computing equipment and storage medium
CN111539931A (en) Appearance abnormity detection method based on convolutional neural network and boundary limit optimization
CN109615610B (en) Medical band-aid flaw detection method based on YOLO v2-tiny
CN113205511B (en) Electronic component batch information detection method and system based on deep neural network
CN111091534A (en) Target detection-based pcb defect detection and positioning method
CN114299040A (en) Ceramic tile flaw detection method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant