CN113920049B - Template matching method based on fusion of small amount of positive samples - Google Patents

Info

Publication number
CN113920049B
CN113920049B (application CN202010587584.2A)
Authority
CN
China
Prior art keywords
template
target edge
target
gradient
points
Prior art date
Legal status
Active
Application number
CN202010587584.2A
Other languages
Chinese (zh)
Other versions
CN113920049A (en)
Inventor
李思聪
吴清潇
王化明
嵇冠群
张正光
朱枫
Current Assignee
Shenyang Institute of Automation of CAS
Original Assignee
Shenyang Institute of Automation of CAS
Priority date
Filing date
Publication date
Application filed by Shenyang Institute of Automation of CAS
Priority to CN202010587584.2A
Publication of CN113920049A
Application granted
Publication of CN113920049B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/001 Industrial image inspection using an image reference approach
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/253 Fusion techniques of extracted features
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a template matching method based on the fusion of a small number of positive samples. Oriented to the field of image target identification and positioning, and in particular to target identification and positioning by template matching, the method fuses a small number of positive template samples so that a target can be identified and located robustly and rapidly even when it exhibits significant occlusion, deformation, or abrupt changes in gradient direction. The invention comprises four steps: 1. extracting the target edge point set in a template image; 2. registering the target edge point sets under different template images; 3. fusing the target edge points; 4. performing similarity statistics based on the fused point set. The invention solves the template matching problem when the target exhibits significant occlusion, deformation, and abrupt gradient-direction changes, and improves the robustness of target identification and positioning. Applied to industrial machine vision identification and positioning, it provides a solution for identifying and locating parts in industrial production and a perception function for automated industrial production processes.

Description

Template matching method based on fusion of small amount of positive samples
Technical Field
The invention belongs to the field of computer vision, and particularly relates to a template matching method based on fusion of a small number of positive samples.
Background
With the continuous improvement of industrial automation, visual identification and positioning technology is widely applied. Facing the great demand of intelligent manufacturing, machine vision, as the "eye" of industrial automation, provides non-contact measurement and dense, intuitive measurement data for the production process. On an industrial assembly line, the parts passing the same station within the same production period are mostly repeated samples of the same type, so machine vision can be used to rapidly and reliably identify and locate samples of the same type under varying arrangement, illumination, and occlusion conditions; template matching is an identification and positioning technology that can meet this requirement.
Template matching technology generally extracts, from one or more template images, a limited set of features that describe the target and generates a target template; then, during identification and positioning, the template is compared and searched in the image under test according to a traversal rule, under some similarity measure, and matched to potential targets with high similarity. In industrial machine vision, template matching aims to identify and locate specific target workpieces quickly and reliably, but when the target exhibits (1) large occlusion of its edge features, (2) abrupt changes of gradient direction in its edge features, or (3) deformation of its edge features, especially the regular large deformation caused by the production process, existing template matching algorithms generally find it difficult to achieve reliable identification and positioning.
Disclosure of Invention
The invention utilizes a limited number of positive sample images of the target; these samples provide a correct description of the target under occlusion, edge deformation, and abrupt gradient changes. The algorithm first extracts the target edge points and obtains their gradient information, registers the pose of each sample's edge point set, and fuses the point sets to form a new template covering the characteristics of every sample; the matching process is then completed with the new template, realizing reliable and rapid identification and positioning.
The technical scheme adopted by the invention for achieving the purpose is as follows:
a template matching method based on fusion of a small number of positive samples comprises the following steps:
1) Extracting a target edge point set in an industrial part template image;
2) Registering target edge point sets under different industrial part template images;
3) Merging or splitting the target edge points to obtain the merged and split industrial part template feature point set;
4) Performing similarity statistics based on the merged and split industrial part template feature point set, and matching the industrial part template against the detection image.
The step 1) is specifically as follows: extracting the coordinates of a plurality of target edge points in the industrial part template image, and calculating the gradient amplitude and direction of each target edge point by a gradient detection method to form the target edge point set.
The gradient detection method comprises the following steps:
m=mag[gx,gy]
θ=dir[gx,gy]
wherein gx and gy are the values obtained by applying a gradient mask at the target edge point coordinates, m is the discrete gradient amplitude obtained by array indexing, θ is the discrete gradient direction obtained by array indexing, mag and dir are two-dimensional arrays whose two index dimensions are gx and gy respectively, and the index range covers all values from the minimum to the maximum attainable by gx and gy.
The step 2) is specifically as follows: performing a Euclidean transformation on the target edge point sets under the different industrial part template images, so that each transformed target edge point set is aligned with the target pose under a certain industrial part template, namely the reference template image.
The Euclidean transformation is as follows:
p' = T(tx, ty, α) · p,  T(tx, ty, α) = [ cos α  -sin α  tx ; sin α  cos α  ty ; 0  0  1 ]

wherein p = (x, y, 1)^T and p' = (x', y', 1)^T are the homogeneous coordinates of a point before and after the transformation, and (tx, ty) and α together form the relative pose between the two registered point sets. Let the target edge point set of the kth industrial part template image be S_k = { (p_i, θ_i) | i = 1...n_k }, and let S_k' denote this set after the pose transformation that registers it to the bth reference template. The point set S after registering the points of all m templates to the reference template b, b ∈ 1...m, is then:

S = S_1' ∪ S_2' ∪ ... ∪ S_m'

wherein p_i is a target edge point, θ_i is the gradient direction of the target edge point, i is the target edge point index, and n_k is the number of target edge points of the kth template.
The step 3) is specifically as follows: in the registered edge point set, if there are target edge points with the same coordinates and the same gradient direction, these target edge points are merged; if there are target edge points with the same coordinates but different gradient directions, these target edge points are split.
The merging or splitting is specifically as follows:

merge(p_i, p_j) if p_i = p_j and θ_i = θ_j;  split(p_i, p_j) if p_i = p_j and θ_i ≠ θ_j

wherein p_i and p_j are target edge points in the registered point set, θ_i and θ_j are their gradient directions, i and j are target edge point indices, and n_k is the number of target edge points of the kth template. The part template feature point set after merging and splitting is:

S_m = { (p_i, θ_i) | i = 1...n_m }

wherein n_m is the number of edge points after merging and splitting.
The step 4) is specifically as follows: accumulating, over each feature point in the merged and split industrial part template feature point set, the dot product of the most similar gradient direction vector of that point and the gradient direction vector at the corresponding edge point coordinates of the actual image being matched.
The most similar gradient direction is selected as:

θ_i = argmax over θ_j of cos(θ_j - θ), j = 1...l

wherein l is the number of gradient directions stored at a given point of the merged and split template, θ_j ranges over all gradient directions of that template point, θ_i is the direction participating in the similarity statistics, and θ is the gradient direction of the corresponding coordinate point of the detected image.
The similarity statistical formula is:

s = min( (1 / n_km) · Σ_{i=1...n_m} cos(θ_i - θ), 1.0 )

wherein the sum runs over all n_m points of the merged and split template, θ_i is the most similar gradient direction of template point i, θ is the gradient direction of the detected image at the corresponding coordinates, and n_km is the maximum of the point counts n_k over all m templates participating in the merging and splitting.
The invention has the following beneficial effects and advantages:
1. the template matching effect is robust when the target exhibits occlusion, edge deformation, and abrupt gradient changes;
2. rapid template matching is achieved by adopting an offline gradient mapping table and an offline cosine mapping table.
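The offline cosine mapping table mentioned in advantage 2 can be sketched as follows. This is a hypothetical illustration, not the patented implementation: with gradient directions quantized into eight 45° codes, cos(θ_j - θ) takes only eight distinct values and can be read from a precomputed table during matching.

```python
import numpy as np

# Hypothetical sketch of an offline cosine mapping table: directions are
# 45-degree codes 0..7, so the cosine of any code difference is one of
# eight precomputed values.
COS_TAB = np.cos(np.arange(8) * np.pi / 4)

def cos_diff(code_a, code_b):
    """Cosine of the angle between two quantized gradient directions."""
    return COS_TAB[(code_a - code_b) % 8]
```

During similarity statistics this replaces a trigonometric call per template point with a single table read.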
Drawings
FIG. 1 is a schematic diagram of template fusion based on a small number of positive samples;
FIG. 2 is a schematic diagram of a 3×3 Sobel gradient mask;
FIG. 3 is a schematic illustration of registration of different point sets;
FIG. 4 is a schematic diagram of target coincidence point classification.
FIG. 5 is a schematic diagram illustrating merging and splitting at the merging point.
Fig. 6 is a schematic diagram of gradient directions involved in the matching calculation.
Fig. 7 is a flowchart of a template matching method based on a small number of positive sample fusion.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
The template matching method based on the fusion of a small number of positive samples can identify and locate a target robustly and rapidly, even when the target exhibits significant occlusion, deformation, and abrupt gradient-direction changes, by fusing a small number of positive samples. It comprises the following steps:
extracting a target edge point set in a template image: and extracting an edge point set which can better describe the target characteristics in the template image by using a gradient detection method, and obtaining the gradient amplitude and direction of the edge points.
Registering target edge point sets under different template images: performing European transformation on the target edge point sets under different template images, so that the transformed target edge point sets are just aligned with the same target gesture;
fusion of target edge points: the registered edge points are concentrated, and if points with the same coordinates and the same gradient direction exist, the points are combined.
Splitting of target edge points: if there are multiple points with the same coordinates but different gradient directions in the edge point set after registration, the points are split.
Similarity statistics based on the fusion point set: in the image to be detected, the target similarity statistics under a certain coordinate is the dot product accumulation of the most similar gradient direction vector of each point in the template fusion point set and the gradient direction vector under the corresponding coordinate of the image.
As shown in fig. 1, the invention provides a template matching method based on the fusion of a small number of positive samples. Given a limited, small number of positive template samples, which include the occlusion, deformation, and abrupt gradient-direction changes present in the target, the algorithm fuses the template features into a new template, and the fused template is used to identify and locate the target robustly and rapidly.
As shown in fig. 7, the present invention comprises the following parts: 1. extraction of the target edge point set in the template image; 2. registration of target edge point sets under different template images; 3. merging and splitting of target edge points; 4. similarity statistics based on the fused point set.
1. And extracting a target edge point set in the template image.
The template should select a limited set of features that completely describe the target, and these features should have high consistency under the measurement algorithm. Industrial parts generally have good edge characteristics owing to machining processes such as turning, milling, and casting. To express the target edges in the image, gradient detection is chosen to extract the coordinates of the target edge points and to obtain the gradient amplitude and direction of each point; the gradient is calculated as follows:
m' = sqrt(gx² + gy²) (1)
θ' = arctan(gy, gx) (2)

wherein gx and gy are the gradient values in the x and y directions calculated with a gradient mask, m' is the gradient amplitude, and θ' is the calculated gradient direction. In view of detection speed, a 3×3 Sobel mask may be selected in practice, as shown in fig. 2.
Gradient detection is a convolution process with a long processing time; to improve efficiency, a gradient mapping table in the form of a two-dimensional array can be established in advance before detecting gradients on the image, calculated as:

m = mag[gx, gy] (3)
θ = dir[gx, gy] (4)

wherein gx and gy are the values calculated with the gradient mask, m is the gradient amplitude obtained by array indexing, and θ is the gradient direction obtained by array indexing; to save memory, the integer codes [0, 8) are used to represent the angle range [0°, 360°) in steps of 45°.
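As a rough sketch, gradient mapping tables of the form m = mag[gx, gy] and θ = dir[gx, gy] could be built as below. This assumes 8-bit images, for which a 3×3 Sobel response lies in [-1020, 1020]; the table names and the uint8 direction codes are illustrative choices, not details from the patent.

```python
import numpy as np

# Precompute magnitude and quantized direction for every possible pair of
# Sobel responses (gx, gy), so matching replaces per-pixel sqrt/atan2 with
# two array lookups.
OFFSET = 1020  # max |response| of a 3x3 Sobel mask on 8-bit data
rng = np.arange(-OFFSET, OFFSET + 1)
gx_idx, gy_idx = np.meshgrid(rng, rng, indexing="ij")
mag = np.sqrt(gx_idx.astype(np.float64) ** 2 + gy_idx ** 2)  # m = mag[gx, gy]
deg = np.degrees(np.arctan2(gy_idx, gx_idx)) % 360.0
dir_tab = (deg // 45.0).astype(np.uint8)  # [0, 360) quantized to codes 0..7

def lookup(gx, gy):
    """Gradient amplitude and direction code via table lookup."""
    return mag[gx + OFFSET, gy + OFFSET], dir_tab[gx + OFFSET, gy + OFFSET]
```

The tables occupy roughly 37 MB at this size; shrinking the index range or storing the magnitude in a narrower type trades accuracy for memory.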
2. Target edge point set registration under different template images
Because the target poses in the acquired sample images are not consistent, in order to obtain a point set that covers the characteristics of all samples, a pose transformation must be applied to the target edge point sets so that the target point sets of the other samples are uniformly transformed to the target pose of the 1st image. As shown in fig. 3, this realizes the registration of two different target point sets. Let the target edge point set of the kth template image be S_k = { (p_i, θ_i) | i = 1...n_k }; after the Euclidean transformation T(tx, ty, α), the transformed set S_k' is registered to the target pose of the 1st template image. The Euclidean transformation of an edge point is:

p' = T(tx, ty, α) · p,  T(tx, ty, α) = [ cos α  -sin α  tx ; sin α  cos α  ty ; 0  0  1 ]  (5)

where p and p' are the homogeneous coordinates of the point before and after transformation. The point set after pose registration is:

S = S_1' ∪ S_2' ∪ ... ∪ S_m'  (6)
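The registration step can be sketched as follows; this is a minimal illustration assuming the relative pose (tx, ty, α) between a sample and the reference template is already known (estimating it is a separate problem not covered here), and the function names are our own.

```python
import numpy as np

# Apply the Euclidean transformation T(tx, ty, alpha) to a sample's edge
# points in homogeneous coordinates, aligning them with the reference pose.
def register_points(points, tx, ty, alpha):
    """points: (n, 2) array of (x, y) edge coordinates."""
    c, s = np.cos(alpha), np.sin(alpha)
    T = np.array([[c, -s, tx],
                  [s,  c, ty],
                  [0.0, 0.0, 1.0]])
    homo = np.hstack([points, np.ones((len(points), 1))])  # (x, y, 1)
    return (homo @ T.T)[:, :2]

def register_directions(theta, alpha):
    """Gradient directions rotate together with the point set."""
    return (np.asarray(theta) + alpha) % (2 * np.pi)
```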
3. merging and splitting of target edge points.
The registered point set may contain points with identical coordinates, as shown in fig. 4; these are summarized as: (1) repeated edge points; (2) changes of the gradient direction of the same physical feature under different imaging conditions; (3) changes of the edge features across different target samples.
The class (1) edge points are merged, so that repeated statistics in the matching process are avoided;
as shown in fig. 5, a merge-split operation is defined:
merge(p_i, p_j) if p_i = p_j and θ_i = θ_j;  split(p_i, p_j) if p_i = p_j and θ_i ≠ θ_j  (7)

where p_i and p_j are registered edge points and θ_i, θ_j are their gradient directions. After the merging operation, the class (1) points at the same coordinate are merged into one point, while the class (2) and class (3) points still participate in matching as distinct points. The point set after merging and splitting is:

S_m = { (p_i, θ_i) | i = 1...n_m }  (8)
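The merge/split operation can be sketched with a dictionary keyed by coordinates; this is an illustrative sketch, with the data layout (direction codes per coordinate) chosen by us rather than specified in the patent.

```python
# Fuse registered edge points: identical coordinate AND identical direction
# collapse into one entry (class (1)); identical coordinate with differing
# directions stay as separate directions at the same point (classes (2), (3)).
def merge_split(point_sets):
    """point_sets: list of [(x, y, dir_code), ...], one list per sample."""
    fused = {}  # (x, y) -> set of direction codes observed there
    for pts in point_sets:
        for x, y, code in pts:
            fused.setdefault((x, y), set()).add(code)
    return {xy: sorted(codes) for xy, codes in fused.items()}
```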
4. and (5) carrying out similarity statistics based on the fusion point set.
As shown in fig. 6, the merged template point set may hold several gradient directions at one and the same point; for such a point, the stored direction closest to the gradient direction of the detected image is selected, namely:

θ_i = argmax over θ_j of cos(θ_j - θ), j = 1...l (9)

wherein θ_j ranges over all l gradient directions stored at the template point, θ_i is the gradient direction that actually participates in the similarity statistics, and θ is the gradient direction at the corresponding coordinate point of the detected image.
The similarity statistic at a given coordinate position in the image under test is:

s = min( (1 / n_km) · Σ_{i=1...n_m} cos(θ_i - θ), 1.0 )  (10)

Because a growing number of samples increases the total number of template points n_m, the sum is normalized by n_km, the maximum single-template point count among all samples participating in the fusion, so that the matching similarity does not decrease as the template grows; and because n_m can be greater than n_km, the result could exceed 1.0, so it is truncated with an upper limit of 1.0.

Claims (7)

1. The template matching method based on the fusion of a small number of positive samples is characterized by comprising the following steps of:
1) Extracting a target edge point set in an industrial part template image;
2) Registering target edge point sets under different industrial part template images;
3) Combining or splitting the target edge points to obtain a combined and split industrial part template feature point set;
4) Carrying out similarity statistics based on the combined and split characteristic point sets of the industrial part templates, and carrying out matching of the industrial part templates and the detection images;
the step 3) is specifically as follows: if the registered edge points are concentrated, merging the target edge points if the target edge points with the same coordinates and the same gradient direction exist; if there are target edge points with the same coordinates but different gradient directions, splitting the target edge points;
the merging or splitting is specifically as follows:
wherein,target edge points in the target edge point set respectively +.>I, j are target edge point sequence numbers, n k The number of target edge points of the kth template is calculated, and the feature point set of the part template after splitting is combined as follows:
wherein n is m To merge the number of edge points after splitting.
2. The template matching method based on a small number of positive sample fusion according to claim 1, wherein the step 1) specifically comprises: extracting the coordinates of a plurality of target edge points in the industrial part template image, and calculating the gradient amplitude and direction of each target edge point by a gradient detection method to form the target edge point set.
3. The template matching method based on small amount of positive sample fusion according to claim 2, wherein the gradient detection method is as follows:
m=mag[gx,gy]
θ=dir[gx,gy]
wherein gx and gy are the values obtained by applying a gradient mask at the target edge point coordinates, m is the discrete gradient amplitude obtained by array indexing, θ is the discrete gradient direction obtained by array indexing, mag and dir are two-dimensional arrays whose two index dimensions are gx and gy respectively, and the index range covers all values from the minimum to the maximum attainable by gx and gy.
4. The template matching method based on a small number of positive sample fusion according to claim 1, wherein the step 2) specifically comprises: performing a Euclidean transformation on the target edge point sets under different industrial part template images, so that each transformed target edge point set is aligned with the target pose under a certain industrial part template, namely the reference template image.
5. The template matching method based on a small number of positive sample fusion according to claim 4, wherein the Euclidean transformation is as follows:

p' = T(tx, ty, α) · p,  T(tx, ty, α) = [ cos α  -sin α  tx ; sin α  cos α  ty ; 0  0  1 ]

wherein p and p' are the homogeneous coordinates of a point before and after the transformation, and (tx, ty) and α together form the relative pose between the two registered point sets. Let the target edge point set of the kth industrial part template image be S_k = { (p_i, θ_i) | i = 1...n_k }, and let S_k' denote this set after the pose transformation that registers it to the bth reference template; then the point set S after registering the points of all m templates to the reference template b, b ∈ 1...m, is:

S = S_1' ∪ S_2' ∪ ... ∪ S_m'

wherein p_i is a target edge point, θ_i is the gradient direction of the target edge point, i is the target edge point index, and n_k is the number of target edge points of the kth template.
6. The template matching method based on a small number of positive sample fusion according to claim 1, wherein the step 4) specifically comprises: accumulating, over each feature point in the merged and split industrial part template feature point set, the dot product of the most similar gradient direction vector of that point and the gradient direction vector at the corresponding edge point coordinates of the actual image being matched;
the most similar gradient direction is selected as:

θ_i = argmax over θ_j of cos(θ_j - θ), j = 1...l

wherein l is the number of gradient directions stored at a given point of the merged and split template, θ_j ranges over all gradient directions of that template point, θ_i is the direction participating in the similarity statistics, and θ is the gradient direction of the corresponding coordinate point of the detected image.
7. The template matching method based on a small number of positive sample fusion according to claim 1 or 6, wherein the similarity statistical formula is:

s = min( (1 / n_km) · Σ_{i=1...n_m} cos(θ_i - θ), 1.0 )

wherein the sum runs over all n_m points of the merged and split template, θ_i is the most similar gradient direction of template point i, θ is the gradient direction of the detected image at the corresponding coordinates, and n_km is the maximum of the point counts n_k over all m templates participating in the merging and splitting.
CN202010587584.2A 2020-06-24 2020-06-24 Template matching method based on fusion of small amount of positive samples Active CN113920049B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010587584.2A CN113920049B (en) 2020-06-24 2020-06-24 Template matching method based on fusion of small amount of positive samples


Publications (2)

Publication Number Publication Date
CN113920049A CN113920049A (en) 2022-01-11
CN113920049B true CN113920049B (en) 2024-03-22

Family

ID=79231228

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010587584.2A Active CN113920049B (en) 2020-06-24 2020-06-24 Template matching method based on fusion of small amount of positive samples

Country Status (1)

Country Link
CN (1) CN113920049B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4205760B1 (en) * 2007-12-27 2009-01-07 株式会社ファースト Image matching method, program and application apparatus
CN103679636A (en) * 2013-12-23 2014-03-26 江苏物联网研究发展中心 Rapid image splicing method based on point and line features
CN103679702A (en) * 2013-11-20 2014-03-26 华中科技大学 Matching method based on image edge vectors
CN106339707A (en) * 2016-08-19 2017-01-18 亿嘉和科技股份有限公司 Instrument pointer image recognition method based on symmetrical characteristics
CN111079803A (en) * 2019-12-02 2020-04-28 易思维(杭州)科技有限公司 Template matching method based on gradient information


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Fast target detection algorithm based on template matching and SVM model; Li Rui, Zhao Yalin, Chen Jintao, Zhou Shuang, He Zhihao, Tao Qingchuan; Video Engineering (电视技术); 2019-02-05 (03); full text *

Also Published As

Publication number Publication date
CN113920049A (en) 2022-01-11

Similar Documents

Publication Publication Date Title
CN108107444B (en) Transformer substation foreign matter identification method based on laser data
CN105740899B (en) A kind of detection of machine vision image characteristic point and match compound optimization method
CN106251353A (en) Weak texture workpiece and the recognition detection method and system of three-dimensional pose thereof
CN109523505B (en) Method for detecting pattern defects on surface of ceramic tile based on machine vision
CN106683137B (en) Artificial mark based monocular and multiobjective identification and positioning method
CN103727930A (en) Edge-matching-based relative pose calibration method of laser range finder and camera
CN111223133A (en) Registration method of heterogeneous images
CN108492017B (en) Product quality information transmission method based on augmented reality
CN109389625B (en) Three-dimensional image registration method based on multi-scale descriptor screening and mismatching
CN107564006B (en) Circular target detection method utilizing Hough transformation
CN103729631A (en) Vision-based connector surface feature automatically-identifying method
CN108182705A (en) A kind of three-dimensional coordinate localization method based on machine vision
CN112419429A (en) Large-scale workpiece surface defect detection calibration method based on multiple viewing angles
CN104460505A (en) Industrial robot relative pose estimation method
CN109766903B (en) Point cloud model curved surface matching method based on curved surface features
CN107895166B (en) Method for realizing target robust recognition based on feature descriptor by geometric hash method
Zhou et al. Vision-based pose estimation from points with unknown correspondences
CN114358166A (en) Multi-target positioning method based on self-adaptive k-means clustering
CN114494463A (en) Robot sorting method and device based on binocular stereoscopic vision technology
CN108182700B (en) Image registration method based on two-time feature detection
CN113920049B (en) Template matching method based on fusion of small amount of positive samples
CN111179271A (en) Object angle information labeling method based on retrieval matching and electronic equipment
Song et al. Multimodal remote sensing image registration algorithm based on a new edge descriptor
CN114240871A (en) Point cloud data processing method for contour detection in workpiece forming process
CN109272558B (en) Method for calibrating pinhole camera by using common free-pole triangle and circular ring points of separating circles

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant